Sensory processing and world modeling for an active ranging device
NASA Technical Reports Server (NTRS)
Hong, Tsai-Hong; Wu, Angela Y.
1991-01-01
In this project, we studied world modeling and sensory processing for laser range data. World model data representations and operations were defined. Sensory processing algorithms for point processing and linear feature detection were designed and implemented. The interface between world modeling and sensory processing at the Servo and Primitive levels was investigated and implemented. At the Primitive level, linear feature detectors for edges were also implemented, analyzed, and compared. Existing world model representations are surveyed. Also presented is the design and implementation of the Y-frame model, a hierarchical world model. The interfaces between the world model module and the sensory processing module are discussed, as are the linear feature detectors that were designed and implemented.
SLS Navigation Model-Based Design Approach
NASA Technical Reports Server (NTRS)
Oliver, T. Emerson; Anzalone, Evan; Geohagan, Kevin; Bernard, Bill; Park, Thomas
2018-01-01
The SLS Program chose to implement a Model-based Design and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team has been responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for the navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1-B design, the additional GPS Receiver hardware is managed as a DMM at the vehicle design level. This paper provides a discussion of the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the Navigation components. These include composing system requirements, requirements verification, model development, model verification and validation, and modeling and analysis approaches. The Model-based Design and Requirements approach does not reduce the effort associated with the design process versus previous processes used at Marshall Space Flight Center. Instead, the approach takes advantage of overlap between the requirements development and management process, and the design and analysis process by efficiently combining the control (i.e. the requirement) and the design mechanisms. The design mechanism is the representation of the component behavior and performance in design and analysis tools. 
The focus in the early design process shifts from the development and management of design requirements to the development of usable models, model requirements, and model verification and validation efforts. The models themselves are represented in C/C++ code and accompanying data files. Under the idealized process, potential ambiguity in specification is reduced because the model must be implementable, whereas a requirement is not necessarily subject to this constraint. Further, the models are shown to emulate the hardware during validation. For models developed by the Navigation Team, a common interface/standalone environment was developed. The common environment allows for easy implementation in design and analysis tools. Mechanisms such as unit test cases ensure the implementation behaves as the developer intended. The model verification and validation process provides a very high level of component design insight. The origin and implementation of the SLS variant of Model-based Design is described from the perspective of the SLS Navigation Team. The format of the models and the requirements are described. The Model-based Design approach has many benefits but is not without potential complications. Key lessons learned associated with the implementation of the Model-based Design approach and process, from infancy to verification and certification, are discussed.
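The role of unit test cases in confirming that a delivered model behaves as the developer intended can be sketched in miniature. The rate-gyro model below, its parameter names, and its interface are hypothetical illustrations in Python, not the actual SLS DMM interface or C/C++ environment.

```python
# Minimal sketch of a component design math model (DMM) with a
# standalone interface and a unit test, in the spirit of the common
# environment described above. All names and parameters are invented.

class RateGyroModel:
    """Toy rate gyro assembly (RGA) model: output = scale * rate + bias."""

    def __init__(self, scale_factor=1.0005, bias_deg_s=0.01):
        self.scale_factor = scale_factor
        self.bias_deg_s = bias_deg_s

    def measure(self, true_rate_deg_s):
        """Return the sensed angular rate for a given true input rate."""
        return self.scale_factor * true_rate_deg_s + self.bias_deg_s


def unit_test_rate_gyro():
    """Unit test ensuring the implementation matches developer intent."""
    gyro = RateGyroModel(scale_factor=1.0, bias_deg_s=0.0)
    assert gyro.measure(0.0) == 0.0        # ideal gyro: no output at rest
    gyro = RateGyroModel(scale_factor=2.0, bias_deg_s=0.5)
    assert gyro.measure(1.0) == 2.5        # scale and bias both applied
    return True
```

A design or analysis tool that embeds the model can run the same test case to confirm the implementation was wired in as intended.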
A Praxeological Perspective for the Design and Implementation of a Digital Role-Play Game
ERIC Educational Resources Information Center
Sanchez, Eric; Monod-Ansaldi, Réjane; Vincent, Caroline; Safadi-Katouzian, Sina
2017-01-01
This paper draws on an empirical work dedicated to discussing a theoretical model for design-based research. The context of our study is a research project for the design, the implementation and the analysis of Insectophagia, a digital role-play game implemented in secondary schools. The model presented in this paper aims at conceptualizing…
NASA Astrophysics Data System (ADS)
Fuhrmann, Tamar; Schneider, Bertrand; Blikstein, Paulo
2018-05-01
The Bifocal Modelling Framework (BMF) is an approach for science learning which links students' physical experimentation with computer modelling in real time, focusing on the comparison of the two media. In this paper, we explore how a Bifocal Modelling implementation supported learning outcomes related to both content and metamodeling knowledge, focusing on the role of designing models. Our study consisted of three conditions implemented with a total of 69 9th-grade high-school students. The first and second classes were assigned two implementation modes of BMF: with and without a model design module. The third condition, employed as a control, consisted of a class that received instruction in the school's traditional approach. Our results indicate that students participating in both BMF implementations demonstrated improved content knowledge and a better understanding of metamodeling. However, only the 'BMF-with-design' group improved significantly in both content and metamodeling knowledge. Our qualitative analyses indicate that both BMF groups designed detailed models that included scientific explanations. However, only students who engaged in the model design component: (1) completed a detailed model displaying molecular interaction; and (2) developed a critical perspective about models. We discuss the implications of those results for teaching science concepts and metamodeling knowledge.
Sulfur Dioxide Designations and Implementation Modeling Guidance and Assistance
The information and links here are intended to help inform and assist Regional, State, Local, and Tribal modelers with respect to designation and implementation modeling for the 1-hour SO2 NAAQS that became effective on June 2, 2010.
Design study of Software-Implemented Fault-Tolerance (SIFT) computer
NASA Technical Reports Server (NTRS)
Wensley, J. H.; Goldberg, J.; Green, M. W.; Kutz, W. H.; Levitt, K. N.; Mills, M. E.; Shostak, R. E.; Whiting-Okeefe, P. M.; Zeidler, H. M.
1982-01-01
Software-implemented fault tolerant (SIFT) computer design for commercial aviation is reported. A SIFT design concept is addressed. Alternate strategies for physical implementation are considered. Hardware and software design correctness is addressed. System modeling and effectiveness evaluation are considered from a fault-tolerant point of view.
Developing an active implementation model for a chronic disease management program.
Smidth, Margrethe; Christensen, Morten Bondo; Olesen, Frede; Vedsted, Peter
2013-04-01
Introduction and diffusion of new disease management programs in healthcare is usually slow, but active theory-driven implementation seems to outperform other implementation strategies. However, we have only scarce evidence on the feasibility and real effect of such strategies in complex primary care settings where municipalities, general practitioners and hospitals should work together. The Central Denmark Region recently implemented a disease management program for chronic obstructive pulmonary disease (COPD) which presented an opportunity to test an active implementation model against the usual implementation model. The aim of the present paper is to describe the development of an active implementation model using the Medical Research Council's model for complex interventions and the Chronic Care Model. We used the Medical Research Council's five-stage model for developing complex interventions to design an implementation model for a disease management program for COPD. First, literature on implementing change in general practice was scrutinised and empirical knowledge was assessed for suitability. In phase I, the intervention was developed; and in phases II and III, it was tested in a block- and cluster-randomised study. In phase IV, we evaluated the feasibility for others to use our active implementation model. The Chronic Care Model was identified as a model for designing efficient implementation elements. These elements were combined into a multifaceted intervention, and a timeline for the trial in a randomised study was decided upon in accordance with the five stages in the Medical Research Council's model; this was captured in a PaTPlot, which allowed us to focus on the structure and the timing of the intervention. The implementation strategies identified as efficient were use of the Breakthrough Series, academic detailing, provision of patient material and meetings between providers. 
The active implementation model was tested in a randomised trial (results reported elsewhere). The combination of the theoretical model for complex interventions and the Chronic Care Model and the chosen specific implementation strategies proved feasible for a practice-based active implementation model for a chronic disease management program for COPD. Using the Medical Research Council's model added transparency to the design phase, which further facilitated the process of implementing the program. http://www.clinicaltrials.gov/(NCT01228708).
Model reduction methods for control design
NASA Technical Reports Server (NTRS)
Dunipace, K. R.
1988-01-01
Several different model reduction methods are developed and detailed implementation information is provided for those methods. Command files to implement the model reduction methods in a proprietary control law analysis and design package are presented. A comparison and discussion of the various reduction techniques is included.
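One common family of model reduction methods is residualization (singular perturbation), which discards fast states while matching the DC gain of the full model exactly. The two-state example below is a generic illustration with invented numbers, not one of the report's specific methods or command files.

```python
# Residualization of a two-state linear model x' = A x + B u, y = C x:
# the fast state x2 is assumed to reach quasi-steady state (x2' = 0),
# giving a one-state reduced model that preserves the DC gain exactly.
# The numerical values are arbitrary illustrative choices.

A = [[-1.0, 0.5],
     [0.2, -100.0]]   # x2 decays ~100x faster than x1
B = [1.0, 2.0]
C = [3.0, 0.4]

# Reduced (residualized) one-state model: Ar, Br, Cr, Dr
Ar = A[0][0] - A[0][1] * A[1][0] / A[1][1]
Br = B[0] - A[0][1] * B[1] / A[1][1]
Cr = C[0] - C[1] * A[1][0] / A[1][1]
Dr = -C[1] * B[1] / A[1][1]

def dc_gain_full():
    """Steady-state gain -C A^{-1} B of the full two-state model."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    # Solve A x = B via Cramer's rule
    x1 = (B[0] * A[1][1] - A[0][1] * B[1]) / det
    x2 = (A[0][0] * B[1] - A[1][0] * B[0]) / det
    return -(C[0] * x1 + C[1] * x2)

def dc_gain_reduced():
    """Steady-state gain Dr - Cr Ar^{-1} Br of the reduced model."""
    return Dr - Cr * Br / Ar
```

Unlike plain modal truncation, residualization retains the slow state's dynamics while folding the fast state's steady-state contribution into the reduced matrices, so the two gains agree to machine precision.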
An Approach to Verification and Validation of a Reliable Multicasting Protocol
NASA Technical Reports Server (NTRS)
Callahan, John R.; Montgomery, Todd L.
1994-01-01
This paper describes the process of implementing a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally using a combination of formal and informal techniques in an attempt to ensure the correctness of its implementation. Our development process involved three concurrent activities: (1) the initial construction and incremental enhancement of a formal state model of the protocol machine; (2) the initial coding and incremental enhancement of the implementation; and (3) model-based testing of iterative implementations of the protocol. These activities were carried out by two separate teams: a design team and a V&V team. The design team built the first version of RMP with limited functionality to handle only nominal requirements of data delivery. In a series of iterative steps, the design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation. This was done by generating test cases based on suspected errant or off-nominal behaviors predicted by the current model. If the execution of a test was different between the model and implementation, then the differences helped identify inconsistencies between the model and implementation. The dialogue between both teams drove the co-evolution of the model and implementation. Testing served as the vehicle for keeping the model and implementation in fidelity with each other. This paper describes (1) our experiences in developing our process model; and (2) three example problems found during the development of RMP.
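The model-based testing loop described here, driving a formal state model and an implementation with the same event sequence and flagging any divergence, can be sketched generically. The tiny three-event protocol below is invented for illustration; it is not RMP's actual state machine.

```python
# Generic sketch of model-based testing: a formal state model and an
# implementation are run on the same event sequence; any divergence in
# resulting state flags an inconsistency. The protocol is invented.

# Formal state model: (state, event) -> next state
MODEL = {
    ("closed", "open"): "open",
    ("open", "send"): "open",
    ("open", "close"): "closed",
}

class Implementation:
    """Implementation under test; deliberately mishandles one off-nominal case."""
    def __init__(self):
        self.state = "closed"

    def step(self, event):
        if self.state == "closed" and event == "open":
            self.state = "open"
        elif self.state == "open" and event == "close":
            self.state = "closed"
        elif self.state == "open" and event == "send":
            # BUG: a send should keep the session open, but this
            # implementation closes it -- the kind of errant behavior
            # a generated test case would expose.
            self.state = "closed"
        return self.state

def compare(events):
    """Run events through model and implementation; return the index of
    the first divergence, or None if they stay in fidelity."""
    model_state, impl = "closed", Implementation()
    for i, ev in enumerate(events):
        model_state = MODEL.get((model_state, ev), model_state)
        if impl.step(ev) != model_state:
            return i
    return None
```

A nominal sequence like open/close passes, while the off-nominal open/send sequence exposes the seeded inconsistency at its second event.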
An approach to verification and validation of a reliable multicasting protocol
NASA Technical Reports Server (NTRS)
Callahan, John R.; Montgomery, Todd L.
1995-01-01
This paper describes the process of implementing a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally using a combination of formal and informal techniques in an attempt to ensure the correctness of its implementation. Our development process involved three concurrent activities: (1) the initial construction and incremental enhancement of a formal state model of the protocol machine; (2) the initial coding and incremental enhancement of the implementation; and (3) model-based testing of iterative implementations of the protocol. These activities were carried out by two separate teams: a design team and a V&V team. The design team built the first version of RMP with limited functionality to handle only nominal requirements of data delivery. In a series of iterative steps, the design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation. This was done by generating test cases based on suspected errant or off-nominal behaviors predicted by the current model. If the execution of a test was different between the model and implementation, then the differences helped identify inconsistencies between the model and implementation. The dialogue between both teams drove the co-evolution of the model and implementation. Testing served as the vehicle for keeping the model and implementation in fidelity with each other. This paper describes (1) our experiences in developing our process model; and (2) three example problems found during the development of RMP.
ERIC Educational Resources Information Center
Tichnor-Wagner, Ariel; Allen, Danielle; Socol, Allison Rose; Cohen-Vogel, Lora; Rutledge, Stacey A.; Xing, Qi W.
2018-01-01
Background/Context: This study examines the implementation of an academic and social-emotional learning innovation called Personalization for Academic and Social-Emotional Learning, or PASL. The innovation was designed, tested, and implemented using a continuous-improvement model. The model emphasized a top-and-bottom process in which…
ERIC Educational Resources Information Center
Luecht, Richard M.
2013-01-01
Assessment engineering is a new way to design and implement scalable, sustainable and ideally lower-cost solutions to the complexities of designing and developing tests. It represents a merger of sorts between cognitive task modeling and engineering design principles--a merger that requires some new thinking about the nature of score scales, item…
Baranwal, Mayank; Gorugantu, Ram S; Salapaka, Srinivasa M
2015-08-01
This paper aims at control design and its implementation for robust high-bandwidth precision (nanoscale) positioning systems. Even though modern model-based control-theoretic designs for robust broadband high-resolution positioning have enabled orders-of-magnitude improvement in performance over existing model-independent designs, their scope is severely limited by the inefficacies of digital implementation of the control designs. High-order control laws that result from model-based designs typically have to be approximated with reduced-order systems to facilitate digital implementation. Digital systems, even those that have very high sampling frequencies, provide low effective control bandwidth when implementing high-order systems. In this context, field programmable analog arrays (FPAAs) provide a good alternative to the use of digital-logic based processors since they enable very high implementation speeds, moreover with cheaper resources. The superior flexibility of digital systems in terms of the implementable mathematical and logical functions does not give a significant edge over FPAAs when implementing linear dynamic control laws. In this paper, we pose the control design objectives for positioning systems in different configurations as optimal control problems and demonstrate significant improvements in performance when the resulting control laws are applied using FPAAs as opposed to their digital counterparts. An improvement of over 200% in positioning bandwidth is achieved over an earlier digital signal processor (DSP) based implementation for the same system and same control design, even though the DSP-based system's sampling frequency is about 100 times the desired positioning bandwidth.
NASA Astrophysics Data System (ADS)
Kasim, N.; Zainal Abidin, N. A.; Zainal, R.; Sarpin, N.; Rahim, M. H. I. Abd; Saikah, M.
2017-11-01
Implementation of Building Information Modelling (BIM) was expected to bring improvement to current practices in the Malaysian construction industry. In the design phase, there is a lack of a ready pool of skilled workers able to develop a BIM strategic plan and utilise it effectively. These shortfalls constrain BIM adoption in the Malaysian construction industry, specifically in the design phase, and prevent it from achieving best practice. Therefore, the objectives of this research are to investigate the current practices of BIM implementation in the design phase as well as the best-practice factors of BIM implementation in the design phase. A qualitative research approach was carried out through semi-structured interviews with designers from different organisations that adopt BIM in the design phase. The collected data were analysed using a content analysis method. The findings identify best-practice factors for BIM implementation in the design phase, such as incentives for BIM training, a formal approach to monitoring automated Level of Detailing (LOD), running virtual meetings, and improving the Industry Foundation Classes (IFC). These best-practice factors lead to improvements in the design phase of project development, which subsequently improve the implementation of BIM in the design phase of the Malaysian construction industry.
Implementing and Assessing a Flipped Classroom Model for First-Year Engineering Design
ERIC Educational Resources Information Center
Saterbak, Ann; Volz, Tracy; Wettergreen, Matthew
2016-01-01
Faculty at Rice University are creating instructional resources to support teaching first-year engineering design using a flipped classroom model. This implementation of flipped pedagogy is unusual because content-driven, lecture courses are usually targeted for flipping, not project-based design courses that already incorporate an abundance of…
Developing an active implementation model for a chronic disease management program
Smidth, Margrethe; Christensen, Morten Bondo; Olesen, Frede; Vedsted, Peter
2013-01-01
Background Introduction and diffusion of new disease management programs in healthcare is usually slow, but active theory-driven implementation seems to outperform other implementation strategies. However, we have only scarce evidence on the feasibility and real effect of such strategies in complex primary care settings where municipalities, general practitioners and hospitals should work together. The Central Denmark Region recently implemented a disease management program for chronic obstructive pulmonary disease (COPD) which presented an opportunity to test an active implementation model against the usual implementation model. The aim of the present paper is to describe the development of an active implementation model using the Medical Research Council’s model for complex interventions and the Chronic Care Model. Methods We used the Medical Research Council’s five-stage model for developing complex interventions to design an implementation model for a disease management program for COPD. First, literature on implementing change in general practice was scrutinised and empirical knowledge was assessed for suitability. In phase I, the intervention was developed; and in phases II and III, it was tested in a block- and cluster-randomised study. In phase IV, we evaluated the feasibility for others to use our active implementation model. Results The Chronic Care Model was identified as a model for designing efficient implementation elements. These elements were combined into a multifaceted intervention, and a timeline for the trial in a randomised study was decided upon in accordance with the five stages in the Medical Research Council’s model; this was captured in a PaTPlot, which allowed us to focus on the structure and the timing of the intervention. The implementation strategies identified as efficient were use of the Breakthrough Series, academic detailing, provision of patient material and meetings between providers. 
The active implementation model was tested in a randomised trial (results reported elsewhere). Conclusion The combination of the theoretical model for complex interventions and the Chronic Care Model and the chosen specific implementation strategies proved feasible for a practice-based active implementation model for a chronic disease management program for COPD. Using the Medical Research Council’s model added transparency to the design phase, which further facilitated the process of implementing the program. Trial registration: http://www.clinicaltrials.gov/(NCT01228708). PMID:23882169
NASA Technical Reports Server (NTRS)
Lin, Risheng; Afjeh, Abdollah A.
2003-01-01
Crucial to an efficient aircraft simulation-based design is a robust data modeling methodology for both recording the information and providing data transfer readily and reliably. To meet this goal, data modeling issues involved in multidisciplinary aircraft design are first analyzed in this study. Next, an XML-based, extensible data object model for multidisciplinary aircraft design is constructed and implemented. The implementation of the model through data binding allows the design applications to access and manipulate any disciplinary data with a lightweight and easy-to-use API. In addition, language-independent representation of aircraft disciplinary data in the model fosters interoperability amongst heterogeneous systems, thereby facilitating data sharing and exchange between various design tools and systems.
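A minimal illustration of an XML-backed disciplinary data object with a lightweight access API might look as follows. The element names, attributes, and API here are invented for illustration; they are not the paper's actual schema or binding layer.

```python
# Sketch of XML-based disciplinary data access for aircraft design.
# The schema (element and attribute names) is invented for illustration.
import xml.etree.ElementTree as ET

AIRCRAFT_XML = """
<aircraft name="demo">
  <aerodynamics><wing span_m="35.8" area_m2="122.4"/></aerodynamics>
  <propulsion><engine count="2" thrust_kN="120.0"/></propulsion>
</aircraft>
"""

class AircraftData:
    """Lightweight API over the XML document, mimicking data binding."""

    def __init__(self, xml_text):
        self.root = ET.fromstring(xml_text)

    def get(self, path, attr):
        """Read a disciplinary value, e.g. get('aerodynamics/wing', 'span_m')."""
        return float(self.root.find(path).get(attr))

    def set(self, path, attr, value):
        """Update a value; any design tool sharing the document sees the change."""
        self.root.find(path).set(attr, str(value))
```

Because the data lives in plain XML, a propulsion tool and an aerodynamics tool written in different languages can exchange the same document without sharing code.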
Self-conscious robotic system design process--from analysis to implementation.
Chella, Antonio; Cossentino, Massimo; Seidita, Valeria
2011-01-01
Developing robotic systems endowed with self-conscious capabilities means realizing complex sub-systems needing ad-hoc software engineering techniques for their modelling, analysis and implementation. In this chapter, the whole process (from analysis to implementation) for modelling the development of self-conscious robotic systems is presented, and the newly created design process, PASSIC, which supports each part of it, is fully illustrated.
Design and Implementation of 3D Model Data Management System Based on SQL
NASA Astrophysics Data System (ADS)
Li, Shitao; Zhang, Shixin; Zhang, Zhanling; Li, Shiming; Jia, Kun; Hu, Zhongxu; Ping, Liang; Hu, Youming; Li, Yanlei
CAD/CAM technology plays an increasingly important role in the machinery manufacturing industry. As an important means of production, the three-dimensional models accumulated over many years of design work are valuable, so the management of these three-dimensional models is of great significance. This paper gives a detailed explanation of a method to design three-dimensional model databases based on SQL and to implement functions such as insertion, modification, inquiry, and preview.
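The insertion, modification, and inquiry functions of such a model database can be sketched with a few SQL statements. The table and column names below are illustrative assumptions, and SQLite stands in for the paper's SQL server; only model metadata (not geometry) is shown.

```python
# Sketch of a 3D model metadata database with insert / modify / query,
# using SQLite in place of a full SQL server. Schema is invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE models (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    format TEXT,            -- e.g. STEP, IGES
    preview_path TEXT)""")

# Insertion
conn.execute("INSERT INTO models (name, format, preview_path) VALUES (?, ?, ?)",
             ("gearbox_housing", "STEP", "/previews/gearbox.png"))

# Modification
conn.execute("UPDATE models SET format = ? WHERE name = ?",
             ("IGES", "gearbox_housing"))

# Inquiry
row = conn.execute("SELECT name, format FROM models WHERE name = ?",
                   ("gearbox_housing",)).fetchone()
```

In a production system the geometry files themselves would typically live on disk or as BLOBs, with rows like these acting as the searchable index.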
Embracing model-based designs for dose-finding trials
Love, Sharon B; Brown, Sarah; Weir, Christopher J; Harbron, Chris; Yap, Christina; Gaschler-Markefski, Birgit; Matcham, James; Caffrey, Louise; McKevitt, Christopher; Clive, Sally; Craddock, Charlie; Spicer, James; Cornelius, Victoria
2017-01-01
Background: Dose-finding trials are essential to drug development as they establish recommended doses for later-phase testing. We aim to motivate wider use of model-based designs for dose finding, such as the continual reassessment method (CRM). Methods: We carried out a literature review of dose-finding designs and conducted a survey to identify perceived barriers to their implementation. Results: We describe the benefits of model-based designs (flexibility, superior operating characteristics, extended scope), their current uptake, and existing resources. The most prominent barriers to implementation of a model-based design were lack of suitable training, chief investigators’ preference for algorithm-based designs (e.g., 3+3), and limited resources for study design before funding. We use a real-world example to illustrate how these barriers can be overcome. Conclusions: There is overwhelming evidence for the benefits of CRM. Many leading pharmaceutical companies routinely implement model-based designs. Our analysis identified barriers for academic statisticians and clinical academics in mirroring the progress industry has made in trial design. Unified support from funders, regulators, and journal editors could result in more accurate doses for later-phase testing, and increase the efficiency and success of clinical drug development. We give recommendations for increasing the uptake of model-based designs for dose-finding trials in academia. PMID:28664918
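The core of the continual reassessment method can be sketched with a one-parameter power model and a discrete posterior. The skeleton, prior grid, and target rate below are invented illustrative values, not a recommended trial design.

```python
# Minimal CRM sketch: a power model p_i(a) = skeleton_i ** exp(a) with a
# discrete prior on a. After each cohort the posterior is updated and the
# dose whose estimated toxicity is closest to the target is recommended.
# Skeleton, grid, and target are illustrative assumptions only.
import math

SKELETON = [0.05, 0.10, 0.20, 0.35, 0.50]    # prior dose-toxicity guesses
TARGET = 0.20                                # target toxicity probability
A_GRID = [i / 10.0 for i in range(-20, 21)]  # discrete support for a

def posterior_tox(observations):
    """observations: list of (dose_index, had_toxicity) tuples.
    Returns the posterior-mean toxicity estimate at each dose."""
    weights = []
    for a in A_GRID:
        like = 1.0
        for dose, tox in observations:
            p = SKELETON[dose] ** math.exp(a)
            like *= p if tox else (1.0 - p)
        weights.append(like)
    total = sum(weights)
    return [sum((w / total) * SKELETON[d] ** math.exp(a)
                for w, a in zip(weights, A_GRID))
            for d in range(len(SKELETON))]

def recommend(observations):
    """Recommend the dose whose estimated toxicity is closest to TARGET."""
    est = posterior_tox(observations)
    return min(range(len(est)), key=lambda d: abs(est[d] - TARGET))
```

Unlike a 3+3 rule, every observation updates the whole dose-toxicity curve, which is the source of the superior operating characteristics the review describes.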
Design, Implementation, and Verification of the Reliable Multicast Protocol. Thesis
NASA Technical Reports Server (NTRS)
Montgomery, Todd L.
1995-01-01
This document describes the Reliable Multicast Protocol (RMP) design, first implementation, and formal verification. RMP provides a totally ordered, reliable, atomic multicast service on top of an unreliable multicast datagram service. RMP is fully and symmetrically distributed so that no site bears an undue portion of the communications load. RMP provides a wide range of guarantees, from unreliable delivery to totally ordered delivery, to K-resilient, majority resilient, and totally resilient atomic delivery. These guarantees are selectable on a per message basis. RMP provides many communication options, including virtual synchrony, a publisher/subscriber model of message delivery, a client/server model of delivery, mutually exclusive handlers for messages, and mutually exclusive locks. It has been commonly believed that total ordering of messages can only be achieved at great performance expense. RMP discounts this belief. The first implementation of RMP has been shown to provide high throughput performance on Local Area Networks (LAN). For two or more destinations on a single LAN, RMP provides higher throughput than any other protocol that does not use multicast or broadcast technology. The design, implementation, and verification activities of RMP have occurred concurrently. This has allowed the verification to maintain a high fidelity between the design model, implementation model, and verification model. The restrictions of implementation have influenced the design earlier than in normal sequential approaches. The protocol as a whole has matured more smoothly through the inclusion of several different perspectives into the product development.
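Total ordering at modest cost is commonly achieved by having one site assign global sequence numbers that every receiver respects. The sketch below shows only that sequencer idea with invented classes; it is not RMP's actual token-ring algorithm or its resiliency machinery.

```python
# Sketch of totally ordered multicast via a sequencer: a token site
# stamps messages with global sequence numbers, and every receiver
# delivers in sequence order even if messages arrive out of order.
# This illustrates total ordering only, not RMP itself.

class Sequencer:
    """Token site: assigns a global sequence number to each message."""
    def __init__(self):
        self.next_seq = 0

    def stamp(self, msg):
        seq = self.next_seq
        self.next_seq += 1
        return (seq, msg)

class Receiver:
    """Buffers out-of-order messages; delivers in global sequence order."""
    def __init__(self):
        self.expected = 0
        self.buffer = {}
        self.delivered = []

    def on_receive(self, stamped):
        seq, msg = stamped
        self.buffer[seq] = msg
        while self.expected in self.buffer:
            self.delivered.append(self.buffer.pop(self.expected))
            self.expected += 1
```

Because all receivers sort by the same stamps, two sites that receive the same messages in different network orders still deliver them identically, which is the essence of the total-ordering guarantee.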
Äikäs, Antti Hermanni; Pronk, Nicolaas P; Hirvensalo, Mirja Hannele; Absetz, Pilvikki
2017-08-01
The aim of this study was to describe the content of a multiyear market-based workplace health promotion (WHP) program and to evaluate design and implementation processes in a real-world setting. Data was collected from the databases of the employer and the service provider. It was classified using the 4-S (Size, Scope, Scalability, and Sustainability) and PIPE Impact Metric (Penetration, Implementation) models. Data analysis utilized both qualitative and quantitative methods. Program design covered the evidence-informed best practices well, except for a clear path toward sustainability, cooperation with occupational health care, and support from middle-management supervisors. The penetration rate among participants was high (99%) and the majority (81%) of services were implemented as designed. Study findings indicate that the WHP market would benefit from the use of evidence-based design principles and deliberate decisions to anticipate a long-term implementation process already during the planning phase.
Äikäs, Antti Hermanni; Pronk, Nicolaas P.; Hirvensalo, Mirja Hannele; Absetz, Pilvikki
2017-01-01
Objective: The aim of this study was to describe the content of a multiyear market-based workplace health promotion (WHP) program and to evaluate design and implementation processes in a real-world setting. Methods: Data was collected from the databases of the employer and the service provider. It was classified using the 4-S (Size, Scope, Scalability, and Sustainability) and PIPE Impact Metric (Penetration, Implementation) models. Data analysis utilized both qualitative and quantitative methods. Results: Program design covered the evidence-informed best practices well, except for a clear path toward sustainability, cooperation with occupational health care, and support from middle-management supervisors. The penetration rate among participants was high (99%) and the majority (81%) of services were implemented as designed. Conclusion: Study findings indicate that the WHP market would benefit from the use of evidence-based design principles and deliberate decisions to anticipate a long-term implementation process already during the planning phase. PMID:28665839
Establishment of a Re-Entry Model to Reduce Recidivism Among Court Ward Students.
ERIC Educational Resources Information Center
Kammuller, Kenneth C.
The development and implementation of a Re-Entry Model designed to facilitate and monitor the adjustment of high school students returned to public school from a county commitment facility is described. Transition and follow-up procedures implemented through the model with a liaison teacher at the commitment facility school designated as the…
Promoting Retention through the Implementation of Integrated Multicultural Instructional Design
ERIC Educational Resources Information Center
Higbee, Jeanne L.; Goff, Emily; Schultz, Jennifer L.
2013-01-01
This article introduces the guiding principles of integrated multicultural instructional design (IMID), a new pedagogical model created to promote retention by addressing multicultural perspectives and social justice issues across the curriculum. To illustrate the model, specific strategies for implementing IMID in a content-based,…
Evaluating Comprehensive School Reform Models at Scale: Focus on Implementation
ERIC Educational Resources Information Center
Vernez, Georges; Karam, Rita; Mariano, Louis T.; DeMartini, Christine
2006-01-01
This study was designed to fill the "implementation measurement" gap. A methodology to quantitatively measure the level of Comprehensive School Reform (CSR) implementation that can be used across a variety of CSR models was developed, and then applied to measure actual implementation of four different CSR models in a large number of schools. The…
A top-down design methodology and its implementation for VCSEL-based optical links design
NASA Astrophysics Data System (ADS)
Li, Jiguang; Cao, Mingcui; Cai, Zilong
2005-01-01
In order to find the optimal design for a given specification of an optical communication link, an integrated simulation of the electronic, optoelectronic, and optical components of a complete system is required. It is very important to be able to simulate at both the system level and the detailed model level. This kind of model is feasible due to the high potential of the Verilog-AMS language. In this paper, we propose an effective top-down design methodology and employ it in the development of a complete VCSEL-based optical link simulation. The principle of the top-down methodology is that development proceeds from the system level down to the device level. To give the VCSEL-based optical link a hierarchical model, the design framework is organized in three levels of hierarchy. The models are developed and implemented in Verilog-AMS, and the model parameters are then fitted to measured data. A sample transient simulation demonstrates the functioning of our implementation. Suggestions for future directions in the top-down methodology used for optoelectronic systems technology are also presented.
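At the top of such a hierarchy, a behavioral link model can be as simple as a threshold L-I characteristic chained to channel loss and a photodiode responsivity. The sketch below illustrates that system level in Python rather than Verilog-AMS, with invented parameter values; a device-level model would refine each block.

```python
# Top-level behavioral sketch of a VCSEL-based optical link:
# drive current -> VCSEL optical power -> fiber loss -> photocurrent.
# Parameter values are invented; in a top-down flow each function would
# later be replaced by a detailed device-level (e.g. Verilog-AMS) model.

def vcsel_power_mW(i_mA, i_th_mA=1.0, slope_mW_per_mA=0.3):
    """Threshold L-I characteristic: no emission below threshold current."""
    return max(0.0, slope_mW_per_mA * (i_mA - i_th_mA))

def link_photocurrent_mA(i_drive_mA, loss_dB=3.0, responsivity_A_per_W=0.6):
    """Chain the VCSEL, channel attenuation, and photodiode responsivity."""
    p_tx_mW = vcsel_power_mW(i_drive_mA)
    p_rx_mW = p_tx_mW * 10 ** (-loss_dB / 10.0)
    return responsivity_A_per_W * p_rx_mW  # mW * A/W = mA
```

Keeping the system-level interfaces (currents in, currents out) fixed is what lets the device-level models be swapped in later without reworking the link simulation.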
Teacher Perceptions about New Evaluation Model Implementations
ERIC Educational Resources Information Center
Bush, Charles D.
2017-01-01
The challenge of designing and implementing teacher evaluation reform throughout the U.S. has been represented by different policies, teacher evaluation components, and difficulties with implementation. The purpose of this qualitative embedded single case study was to explore teacher perceptions about new evaluation model implementations and how…
ERIC Educational Resources Information Center
Lee, Chia-Jung; Kim, ChanMin
2014-01-01
This study presents a refined technological pedagogical content knowledge (also known as TPACK) based instructional design model, which was revised using findings from the implementation study of a prior model. The refined model was applied in a technology integration course with 38 preservice teachers. A case study approach was used in this…
Design and Analysis Tools for Supersonic Inlets
NASA Technical Reports Server (NTRS)
Slater, John W.; Folk, Thomas C.
2009-01-01
Computational tools are being developed for the design and analysis of supersonic inlets. The objective is to update existing tools and provide design and low-order aerodynamic analysis capability for advanced inlet concepts. The Inlet Tools effort includes aspects of creating an electronic database of inlet design information, a document describing inlet design and analysis methods, a geometry model for describing the shape of inlets, and computer tools that implement the geometry model and methods. The geometry model has a set of basic inlet shapes that include pitot, two-dimensional, axisymmetric, and stream-traced inlet shapes. The inlet model divides the inlet flow field into parts that facilitate the design and analysis methods. The inlet geometry model constructs the inlet surfaces through the generation and transformation of planar entities based on key inlet design factors. Future efforts will focus on developing the inlet geometry model, the inlet design and analysis methods, and a Fortran 95 code to implement the model and methods. Other computational platforms, such as Java, will also be explored.
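The "generation and transformation of planar entities" can be sketched in a few lines: a 2-D profile curve (a planar entity) is revolved about the inlet axis to produce an axisymmetric surface. The profile shape, dimensions, and names below are invented for illustration and are not the paper's geometry model.

```python
import math

# Hypothetical sketch: an axisymmetric inlet surface built by revolving a
# planar profile curve about the inlet axis. Dimensions are invented.

def profile(x, r_lip=0.5, r_throat=0.4, length=1.0):
    """Planar entity: cowl-line radius versus axial station, contracting
    smoothly from lip to throat via a cubic blend."""
    t = min(max(x / length, 0.0), 1.0)
    return r_lip + (r_throat - r_lip) * (3 * t**2 - 2 * t**3)

def revolve(x, theta):
    """Transform the planar profile point into a 3-D surface point."""
    r = profile(x)
    return (x, r * math.cos(theta), r * math.sin(theta))

# Sample a coarse surface mesh: 5 axial stations x 8 circumferential angles
mesh = [revolve(i * 0.25, j * math.pi / 4) for i in range(5) for j in range(8)]
print(len(mesh))  # 40 surface points
```

Stream-traced or two-dimensional shapes would swap in a different transformation while reusing the same planar profile machinery.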
Price, Christine A; Zavotka, Susan L; Teaford, Margaret H
2004-10-01
A collaborative partnership model was used to develop and implement a state-wide community education program on universal design. University faculty, extension professionals, older adult service agencies, service learning students, and a community retail chain made up the original partnership. This collaboration resulted in a five-stage partnership model. The model was used to develop and disseminate a consumer education program to promote aging in place. The five stages include (a) identifying partner strengths and shared learning, (b) program development, (c) implementing the universal design program, (d) facilitating collaborative outreach, and (e) shifting toward sustainable outreach. A lack of knowledge exists among consumers, builders, and health care professionals regarding strategies for aging in place. Collaborations between educators, outreach professionals, students, and a retail partner resulted in increased interest and awareness about universal design changes that enable seniors to age in place.
The openEHR Java reference implementation project.
Chen, Rong; Klein, Gunnar
2007-01-01
The openEHR foundation has developed an innovative design for interoperable and future-proof Electronic Health Record (EHR) systems based on a dual model approach with a stable reference information model complemented by archetypes for specific clinical purposes. A team from Sweden has implemented all the stable specifications in the Java programming language and donated the source code to the openEHR foundation. It was adopted as the openEHR Java Reference Implementation in March 2005 and released under open source licenses. This encourages early EHR implementation projects around the world and a number of groups have already started to use this code. The early Java implementation experience has also led to the publication of the openEHR Java Implementation Technology Specification. A number of design changes to the specifications and important minor corrections have been directly initiated by the implementation project over the last two years. The Java Implementation has been important for the validation and improvement of the openEHR design specifications and provides building blocks for future EHR systems.
ERIC Educational Resources Information Center
Jimoyiannis, Athanassios
2010-01-01
This paper reports on the design and the implementation of Technological Pedagogical Science Knowledge (TPASK), a new model for science teachers' professional development built on an integrated framework determined by the Technological Pedagogical Content Knowledge (TPACK) model and the authentic learning approach. The TPASK curriculum…
ERIC Educational Resources Information Center
Wells, John G.
2016-01-01
The PIRPOSAL model is both a conceptual and pedagogical framework intended for use as a pragmatic guide to classroom implementation of Integrative STEM Education. Designerly questioning prompted by a "need to know" serves as the basis for transitioning student designers within and among multiple phases while they progress toward an…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Novak, J.H.
1984-05-01
Model design, implementation, and quality assurance procedures can have a significant impact on the effectiveness and long-term utility of any modeling approach. The Regional Oxidant Modeling System (ROMS) is exceptionally complex because it treats all chemical and physical processes thought to affect ozone concentration on a regional scale. Thus, to effectively illustrate useful design and implementation techniques, this paper describes the general modeling framework which forms the basis of the ROMS. This framework is flexible enough to allow straightforward update or replacement of the chemical kinetics mechanism and/or any theoretical formulations of the physical processes. Use of the Jackson Structured Programming (JSP) method to implement this modeling framework has not only increased programmer productivity and the quality of the resulting programs, but has also provided standardized program design, dynamic documentation, and easily maintainable and transportable code. A summary of the JSP method is presented to encourage modelers to pursue this technique in their own model development efforts. In addition, since data preparation is such an integral part of a successful modeling system, the ROMS processor network is described with emphasis on the internal quality control techniques.
Efficient design of CMOS TSC checkers
NASA Technical Reports Server (NTRS)
Biddappa, Anita; Shamanna, Manjunath K.; Maki, Gary; Whitaker, Sterling
1990-01-01
This paper considers the design of an efficient, robustly testable, CMOS Totally Self-Checking (TSC) Checker for k-out-of-2k codes. Most existing implementations use primitive gates and assume the single stuck-at fault model. The self-testing property has been found to fail for CMOS TSC checkers under the stuck-open fault model due to timing skews and arbitrary delays in the circuit. A new four level design using CMOS primitive gates (NAND, NOR, INVERTERS) is presented. This design retains its properties under the stuck-open fault model. Additionally, this method offers an impressive reduction (greater than 70 percent) in gate count, gate inputs, and test set size when compared to the existing method. This implementation is easily realizable and is based on Anderson's technique. A thorough comparative study has been made on the proposed implementation and Kundu's implementation and the results indicate that the proposed one is better than Kundu's in all respects for k-out-of-2k codes.
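The defining behavior of a k-out-of-2k checker can be stated in a few lines: inputs with exactly k ones among 2k bits are codewords and must map to a valid two-rail output. The sketch below is a behavioral model only, not the paper's four-level CMOS gate design, and the exhaustive check is over invented input/output conventions.

```python
# Behavioral sketch (not the gate-level design from the paper): a
# k-out-of-2k code checker should produce a complementary two-rail output
# (01 or 10) exactly for codewords, i.e. words with exactly k ones.

def is_codeword(word, k):
    return len(word) == 2 * k and sum(word) == k

def checker(word, k):
    """Two-rail output: complementary pair for codewords, equal pair
    otherwise. A real TSC checker computes this with primitive gates."""
    return (1, 0) if is_codeword(word, k) else (1, 1)

def exhaustive_check(k):
    """Verify the code-detection property over all 2**(2k) inputs."""
    n = 2 * k
    for bits in range(2 ** n):
        word = [(bits >> i) & 1 for i in range(n)]
        valid_output = checker(word, k) in {(0, 1), (1, 0)}
        if valid_output != is_codeword(word, k):
            return False
    return True

print(exhaustive_check(2))  # True for the 2-out-of-4 code
```

The self-testing and fault-secure properties the paper analyzes go further: they require this behavior to survive every modeled fault (including CMOS stuck-open faults), which is what the four-level structure is designed to guarantee.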
Modeling IoT-Based Solutions Using Human-Centric Wireless Sensor Networks
Monares, Álvaro; Ochoa, Sergio F.; Santos, Rodrigo; Orozco, Javier; Meseguer, Roc
2014-01-01
The Internet of Things (IoT) has inspired solutions that are already available for addressing problems in various application scenarios, such as healthcare, security, emergency support and tourism. However, there is no clear approach to modeling these systems and envisioning their capabilities at the design time. Therefore, the process of designing these systems is ad hoc and its real impact is evaluated once the solution is already implemented, which is risky and expensive. This paper proposes a modeling approach that uses human-centric wireless sensor networks to specify and evaluate models of IoT-based systems at the time of design, avoiding the need to spend time and effort on early implementations of immature designs. It allows designers to focus on the system design, leaving the implementation decisions for a next phase. The article illustrates the usefulness of this proposal through a running example, showing the design of an IoT-based solution to support the first responses during medium-sized or large urban incidents. The case study used in the proposal evaluation is based on a real train crash. The proposed modeling approach can be used to design IoT-based systems for other application scenarios, e.g., to support security operatives or monitor chronic patients in their homes. PMID:25157549
Assume-Guarantee Verification of Source Code with Design-Level Assumptions
NASA Technical Reports Server (NTRS)
Giannakopoulou, Dimitra; Pasareanu, Corina S.; Cobleigh, Jamieson M.
2004-01-01
Model checking is an automated technique that can be used to determine whether a system satisfies certain required properties. To address the 'state explosion' problem associated with this technique, we propose to integrate assume-guarantee verification at different phases of system development. During design, developers build abstract behavioral models of the system components and use them to establish key properties of the system. To increase the scalability of model checking at this level, we have developed techniques that automatically decompose the verification task by generating component assumptions for the properties to hold. The design-level artifacts are subsequently used to guide the implementation of the system, but also to enable more efficient reasoning at the source-code level. In particular, we propose to use design-level assumptions to similarly decompose the verification of the actual system implementation. We demonstrate our approach on a significant NASA application, where design-level models were used to identify and correct a safety property violation, and design-level assumptions allowed us to check successfully that the property was preserved by the implementation.
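The decomposition rests on the classic assume-guarantee rule: if M1 satisfies P under assumption A, and M2 satisfies A, then the composition M1 || M2 satisfies P without ever building the full state space. The toy below illustrates the rule with components modeled as finite trace sets; the traces, property, and assumption are invented for illustration.

```python
# Toy illustration of the assume-guarantee rule:
#   <A> M1 <P>  and  <true> M2 <A>  imply  <true> M1 || M2 <P>.
# Components are finite trace sets over a shared alphabet (invented here).

def satisfies(traces, prop):
    """A trace set satisfies a property if every trace is allowed."""
    return all(prop(t) for t in traces)

def compose(m1, m2):
    """Parallel composition over a shared alphabet: traces both accept."""
    return m1 & m2

# Property P: 'ack' never occurs without a preceding 'req'
P = lambda t: 'ack' not in t or ('req' in t and t.index('req') < t.index('ack'))
# Assumption A about the environment: any non-empty trace starts with 'req'
A = lambda t: not t or t[0] == 'req'

M1 = {('req', 'ack'), ('req',)}       # component under analysis
M2 = {('req', 'ack'), ('req',), ()}   # environment component

# Premise 1: M1 satisfies P when restricted to traces permitted by A
premise1 = satisfies({t for t in M1 if A(t)}, P)
# Premise 2: M2 satisfies the assumption A
premise2 = satisfies(M2, A)
# The rule lets us conclude P of the composition; confirm directly here:
conclusion = satisfies(compose(M1, M2), P)
print(premise1, premise2, conclusion)  # True True True
```

The paper's contribution is generating a suitable A automatically at design time and reusing it to decompose verification of the source code.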
ERIC Educational Resources Information Center
Thoms, Brian
2009-01-01
In this dissertation I examine the design, construction and implementation of an online blog ratings and user recommender system for the Claremont Conversation Online (CCO). In line with constructivist learning models and practical information systems (IS) design, I implemented a blog ratings system (a system that can be extended to allow for…
SLS Model Based Design: A Navigation Perspective
NASA Technical Reports Server (NTRS)
Oliver, T. Emerson; Anzalone, Evan; Park, Thomas; Geohagan, Kevin
2018-01-01
The SLS Program has implemented a Model-based Design (MBD) and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team is responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1B design, the additional GPS Receiver hardware model is managed as a DMM at the vehicle design level. This paper describes the models, and discusses the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the navigation components.
Operational concepts and implementation strategies for the design configuration management process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trauth, Sharon Lee
2007-05-01
This report describes operational concepts and implementation strategies for the Design Configuration Management Process (DCMP). It presents a process-based systems engineering model for the successful configuration management of the products generated during the operation of the design organization as a business entity. The DCMP model focuses on Pro/E and associated activities and information. It can serve as the framework for interconnecting all essential aspects of the product design business. A design operation scenario offers a sense of how to do business at a time when DCMP is second nature within the design organization.
Geometric modeling for computer aided design
NASA Technical Reports Server (NTRS)
Schwing, James L.
1988-01-01
Research focused on two major areas. The first effort addressed the design and implementation of a technique that allows for the visualization of the real time variation of physical properties. The second effort focused on the design and implementation of an on-line help system with components designed for both authors and users of help information.
Toward fidelity between specification and implementation
NASA Technical Reports Server (NTRS)
Callahan, John R.; Montgomery, Todd L.; Morrison, Jeff; Wu, Yunqing
1994-01-01
This paper describes the methods used to specify and implement a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally by two complementary teams using a combination of formal and informal techniques in an attempt to ensure the correctness of the protocol implementation. The first team, called the Design team, initially specified protocol requirements using a variant of SCR requirements tables and implemented a prototype solution. The second team, called the V&V team, developed a state model based on the requirements tables and derived test cases from these tables to exercise the implementation. In a series of iterative steps, the Design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation through testing. Test cases derived from state transition paths in the formal model formed the dialogue between teams during development and served as the vehicles for keeping the model and implementation in fidelity with each other. This paper describes our experiences in developing our process model, details of our approach, and some example problems found during the development of RMP.
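The testing dialogue described above can be sketched compactly: test cases are transition paths through the formal state model, replayed against the implementation, with any divergence flagged. The protocol states and events below are invented for illustration and are not RMP's actual state machine.

```python
# Sketch of the model-based testing loop: test cases derived from
# transition paths in a state model are replayed against the
# implementation to keep the two in fidelity. States/events are invented.

MODEL = {  # state -> {event: next_state}
    'idle':   {'join': 'member'},
    'member': {'data': 'member', 'leave': 'idle'},
}

class Implementation:
    """Stand-in for the implementation under test. A real implementation
    is independent code; here it simply follows MODEL so the demo passes."""
    def __init__(self):
        self.state = 'idle'
    def handle(self, event):
        self.state = MODEL[self.state][event]
        return self.state

def run_test(path):
    """Replay one transition path; report where model and impl diverge."""
    impl, state = Implementation(), 'idle'
    for event in path:
        state = MODEL[state][event]        # model's predicted next state
        if impl.handle(event) != state:    # implementation's actual state
            return f'divergence at {event!r}'
    return 'pass'

for path in [('join', 'data', 'leave'), ('join', 'data', 'data')]:
    print(run_test(path))
```

In the process the paper describes, a 'divergence' result triggers the dialogue between teams: either the implementation has a defect or the model must be updated to restore fidelity.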
Kant, Nasir Ali; Dar, Mohamad Rafiq; Khanday, Farooq Ahmad
2015-01-01
The output of every neuron in a neural network is determined by the employed activation function (AF), which therefore forms the heart of the network. For the design of artificial neural networks (ANNs), a hardware approach is preferred over a software one because it promises full utilization of the application potential of ANNs. Besides some arithmetic blocks, designing the AF in hardware is therefore the most important task in designing an ANN. To be compatible with modern Very Large Scale Integration (VLSI) design techniques, the implemented designs should be realized entirely in Metal Oxide Semiconductor (MOS) technology so as to be compatible with digital designs, provide an electronic tunability feature, and be able to operate at ultra-low voltage. Companding is one of the promising circuit design techniques for achieving these goals. In this paper, a 0.5 V design of Liao's AF using the sinh-domain technique is introduced. Furthermore, the function is tested by implementing an inertial neuron model. The performance of the AF and the inertial neuron model has been evaluated through simulation results, using the PSPICE software with the MOS transistor models provided by the 0.18-μm Taiwan Semiconductor Manufacturing Company (TSMC) CMOS process.
A Design Quality Learning Unit in Relational Data Modeling Based on Thriving Systems Properties
ERIC Educational Resources Information Center
Waguespack, Leslie J.
2013-01-01
This paper presents a learning unit that addresses quality design in relational data models. The focus on modeling allows the learning to span analysis, design, and implementation enriching pedagogy across the systems development life cycle. Thriving Systems Theory presents fifteen choice properties that convey design quality in models integrating…
Designing an activity-based costing model for a non-admitted prisoner healthcare setting.
Cai, Xiao; Moore, Elizabeth; McNamara, Martin
2013-09-01
To design and deliver an activity-based costing model within a non-admitted prisoner healthcare setting. Key phases from the NSW Health clinical redesign methodology were utilised: diagnostic, solution design and implementation. The diagnostic phase utilised a range of strategies to identify issues requiring attention in the development of the costing model. The solution design phase conceptualised distinct 'building blocks' of activity and cost based on the speciality of clinicians providing care. These building blocks enabled the classification of activity and comparisons of costs between similar facilities. The implementation phase validated the model. The project generated an activity-based costing model based on actual activity performed, gained acceptability among clinicians and managers, and provided the basis for ongoing efficiency and benchmarking efforts.
ERIC Educational Resources Information Center
Dunst, Carl J.
2015-01-01
A model for designing and implementing evidence-based in-service professional development in early childhood intervention as well as the key features of the model are described. The key features include professional development specialist (PDS) description and demonstration of an intervention practice, active and authentic job-embedded…
ERIC Educational Resources Information Center
Waight, Noemi; Liu, Xiufeng; Gregorius, Roberto Ma.
2015-01-01
This paper examined the nuances of the background process of design and development and follow up classroom implementation of computer-based models for high school chemistry. More specifically, the study examined the knowledge contributions of an interdisciplinary team of experts; points of tensions, negotiations and non-negotiable aspects of…
Neuromorphic Silicon Neuron Circuits
Indiveri, Giacomo; Linares-Barranco, Bernabé; Hamilton, Tara Julia; van Schaik, André; Etienne-Cummings, Ralph; Delbruck, Tobi; Liu, Shih-Chii; Dudek, Piotr; Häfliger, Philipp; Renaud, Sylvie; Schemmel, Johannes; Cauwenberghs, Gert; Arthur, John; Hynna, Kai; Folowosele, Fopefolu; Saighi, Sylvain; Serrano-Gotarredona, Teresa; Wijekoon, Jayawan; Wang, Yingxue; Boahen, Kwabena
2011-01-01
Hardware implementations of spiking neurons can be extremely useful for a large variety of applications, ranging from high-speed modeling of large-scale neural systems to real-time behaving systems, to bidirectional brain–machine interfaces. The specific circuit solutions used to implement silicon neurons depend on the application requirements. In this paper we describe the most common building blocks and techniques used to implement these circuits, and present an overview of a wide range of neuromorphic silicon neurons, which implement different computational models, ranging from biophysically realistic and conductance-based Hodgkin–Huxley models to bi-dimensional generalized adaptive integrate and fire models. We compare the different design methodologies used for each silicon neuron design described, and demonstrate their features with experimental results, measured from a wide range of fabricated VLSI chips. PMID:21747754
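The simplest of the computational models surveyed, the (leaky) integrate-and-fire neuron, can be stated in a few lines of software; silicon implementations realize the same dynamics with subthreshold analog circuits. The parameters below are illustrative only.

```python
# Software sketch of a leaky integrate-and-fire neuron, the simplest of
# the computational models mentioned above. Parameters are illustrative.

def lif_spikes(i_in, steps=1000, dt=1e-4, tau=0.02, r=1e7,
               v_thresh=0.03, v_reset=0.0):
    """Euler integration of tau * dV/dt = -V + R*I;
    emit a spike and reset the membrane potential at threshold."""
    v, spikes = v_reset, 0
    for _ in range(steps):
        v += dt * (-v + r * i_in) / tau
        if v >= v_thresh:
            spikes += 1
            v = v_reset
    return spikes

# Firing rate grows with input current; below rheobase the neuron is silent
print(lif_spikes(1e-9), lif_spikes(5e-9), lif_spikes(10e-9))
```

Biophysically realistic conductance-based models (Hodgkin-Huxley) and adaptive variants replace the single linear leak with nonlinear, state-dependent currents, which is exactly where the surveyed circuit solutions diverge from one another.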
Towards Behavioral Reflexion Models
NASA Technical Reports Server (NTRS)
Ackermann, Christopher; Lindvall, Mikael; Cleaveland, Rance
2009-01-01
Software architecture has become essential in the struggle to manage today's increasingly large and complex systems. Software architecture views are created to capture important system characteristics on an abstract and, thus, comprehensible level. As the system is implemented and later maintained, it often deviates from the original design specification. Such deviations can have implications for the quality of the system, such as reliability, security, and maintainability. Software architecture compliance checking approaches, such as the reflexion model technique, have been proposed to address this issue by comparing the implementation to a model of the system's architecture design. However, architecture compliance checking approaches focus solely on structural characteristics and ignore behavioral conformance. This is especially an issue in Systems-of-Systems (SoS), decompositions of large systems into smaller systems for the sake of flexibility. Deviations of the implementation from its behavioral design often reduce the reliability of the entire SoS. An approach is needed that supports reasoning about behavioral conformance at the architecture level. In order to address this issue, we have developed an approach for comparing the implementation of an SoS to an architecture model of its behavioral design. The approach follows the idea of reflexion models and adapts it to support the compliance checking of behaviors. In this paper, we focus on sequencing properties as they play an important role in many SoS. Sequencing deviations potentially have a severe impact on SoS correctness and qualities. The desired behavioral specification is defined in UML sequence diagram notation and behaviors are extracted from the SoS implementation. The behaviors are then mapped to the model of the desired behavior and the two are compared. Finally, a reflexion model is constructed that shows the deviations between behavioral design and implementation.
This paper discusses the approach and shows how it can be applied to investigate reliability issues in SoS.
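A behavioral reflexion comparison for a sequencing property can be sketched as follows: the expected call order (here reduced from a sequence diagram to a plain ordered list) is matched against call sequences extracted from the implementation, and each call is classified in reflexion-model terms. The call names are invented.

```python
# Illustrative sketch of a behavioral reflexion comparison: a sequencing
# property derived from a UML sequence diagram (simplified here to an
# ordered list of expected calls) versus observed implementation behavior.

EXPECTED = ['open', 'authenticate', 'send', 'close']

def reflexion(observed):
    """Classify each observed call: 'convergent' if it appears in the
    expected order, 'divergent' otherwise; expected calls never matched
    are reported as 'absent'."""
    report, pos = [], 0
    for call in observed:
        if pos < len(EXPECTED) and call == EXPECTED[pos]:
            report.append((call, 'convergent'))
            pos += 1
        else:
            report.append((call, 'divergent'))
    report += [(c, 'absent') for c in EXPECTED[pos:]]
    return report

# 'send' before 'authenticate' is a sequencing deviation
for entry in reflexion(['open', 'send', 'authenticate', 'close']):
    print(entry)
```

A full realization would use automata derived from the sequence diagrams rather than a single linear order, but the convergent/divergent/absent classification is the same idea.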
An approach to verification and validation of a reliable multicasting protocol: Extended Abstract
NASA Technical Reports Server (NTRS)
Callahan, John R.; Montgomery, Todd L.
1995-01-01
This paper describes the process of implementing a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally using a combination of formal and informal techniques in an attempt to ensure the correctness of its implementation. Our development process involved three concurrent activities: (1) the initial construction and incremental enhancement of a formal state model of the protocol machine; (2) the initial coding and incremental enhancement of the implementation; and (3) model-based testing of iterative implementations of the protocol. These activities were carried out by two separate teams: a design team and a V&V team. The design team built the first version of RMP with limited functionality to handle only nominal requirements of data delivery. This initial version did not handle off-nominal cases such as network partitions or site failures. Meanwhile, the V&V team concurrently developed a formal model of the requirements using a variant of SCR-based state tables. Based on these requirements tables, the V&V team developed test cases to exercise the implementation. In a series of iterative steps, the design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation. This was done by generating test cases based on suspected errant or off-nominal behaviors predicted by the current model. If the execution of a test in the model and implementation agreed, then the test either found a potential problem or verified a required behavior. However, if the execution of a test was different in the model and implementation, then the differences helped identify inconsistencies between the model and implementation. In either case, the dialogue between both teams drove the co-evolution of the model and implementation. 
We have found that this interactive, iterative approach to development allows software designers to focus on delivery of nominal functionality while the V&V team can focus on analysis of off-nominal cases. Testing serves as the vehicle for keeping the model and implementation in fidelity with each other. This paper describes (1) our experiences in developing our process model; and (2) three example problems found during the development of RMP. Although RMP has provided our research effort with a rich set of test cases, it also has practical applications within NASA. For example, RMP is being considered for use in the NASA EOSDIS project due to its significant performance benefits in applications that need to replicate large amounts of data to many network sites.
Predicting the Consequences of Workload Management Strategies with Human Performance Modeling
NASA Technical Reports Server (NTRS)
Mitchell, Diane Kuhl; Samma, Charneta
2011-01-01
Human performance modelers at the US Army Research Laboratory have developed an approach for establishing Soldier high workload that can be used for analyses of proposed system designs. Their technique includes three key components. To implement the approach in an experiment, the researcher would create two experimental conditions: a baseline and a design alternative. Next, they would identify a scenario in which the test participants perform all their representative concurrent interactions with the system. This scenario should include any events that would trigger a different set of goals for the human operators. They would collect workload values during both the baseline and alternative design conditions to see if the alternative increased workload and decreased performance. They have successfully implemented this approach for military vehicle designs using the human performance modeling tool, IMPRINT. Although ARL researchers use IMPRINT to implement their approach, it can be applied to any workload analysis. Researchers using other modeling and simulation tools or conducting experiments or field tests can use the same approach.
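The comparison step can be sketched as a simple aggregation: per scenario event, sum the demands of the concurrent tasks under each condition and flag events that exceed a high-workload threshold. The threshold, task names, and demand values below are invented, not IMPRINT output.

```python
# Hedged sketch of the baseline-versus-alternative workload comparison.
# All numbers and names are invented for illustration.

THRESHOLD = 60  # illustrative high-workload cutoff

def overloaded_events(condition):
    """Sum concurrent task demands per scenario event; return the events
    whose total workload exceeds the threshold."""
    return [event for event, demands in condition.items()
            if sum(demands) > THRESHOLD]

baseline    = {'drive': [20, 15], 'detect': [20, 25], 'report': [10, 15]}
alternative = {'drive': [20, 15], 'detect': [30, 35], 'report': [10, 15]}

print(overloaded_events(baseline), overloaded_events(alternative))
# the design alternative pushes the 'detect' event into high workload
```

In the approach above, such a flagged event is exactly the evidence that the design alternative increases workload and risks degraded performance.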
Lee, Heewon; Contento, Isobel R.; Koch, Pamela
2012-01-01
Objective To use and review a conceptual model of process evaluation and to examine the implementation of a nutrition education curriculum, Choice, Control & Change, designed to promote dietary and physical activity behaviors that reduce obesity risk. Design A process evaluation study based on a systematic conceptual model. Setting Five middle schools in New York City. Participants 562 students in 20 classes and their science teachers (n=8). Main Outcome Measures Based on the model, teacher professional development, teacher implementation, and student reception were evaluated. Also measured were teacher characteristics, teachers’ curriculum evaluation, and satisfaction with teaching the curriculum. Analysis Descriptive statistics and Spearman’s Rho Correlation for quantitative analysis and content analysis for qualitative data were used. Results Mean score of the teacher professional development evaluation was 4.75 on a 5-point scale. Average teacher implementation rate was 73%, and student reception rate was 69%. Ongoing teacher support was highly valued by teachers. Teachers’ satisfaction with teaching the curriculum was highly correlated with students’ satisfaction (p <.05). Teachers’ perception of amount of student work was negatively correlated with implementation and with student satisfaction (p<.05). Conclusions and implications Use of a systematic conceptual model and comprehensive process measures improves understanding of the implementation process and helps educators to better implement interventions as designed. PMID:23321021
VIMOS Instrument Control Software Design: an Object Oriented Approach
NASA Astrophysics Data System (ADS)
Brau-Nogué, Sylvie; Lucuix, Christian
2002-12-01
The Franco-Italian VIMOS instrument is a VIsible imaging Multi-Object Spectrograph with outstanding multiplex capabilities, allowing spectra of more than 800 objects to be taken simultaneously, or integral field spectroscopy in a 54x54 arcsec area. VIMOS is being installed at the Nasmyth focus of the third Unit Telescope of the European Southern Observatory Very Large Telescope (VLT) at Mount Paranal in Chile. This paper describes the analysis, design, and implementation of the VIMOS Instrument Control System, using UML notation. Our control group followed an object-oriented software process while keeping in mind the ESO VLT standard control concepts, for which a complete software library is available. Rather than applying a waterfall lifecycle, the ICS project used iterative development, a lifecycle consisting of several iterations. Each iteration consisted of capturing and evaluating requirements, visual modeling for analysis and design, implementation, test, and deployment. Depending on the project phase, iterations focused more or less on specific activities. The result is an object model (the design model), including use-case realizations. An implementation view and a deployment view complement this product. An extract of the VIMOS ICS UML model will be presented and some implementation, integration and test issues will be discussed.
Verification and validation of a reliable multicast protocol
NASA Technical Reports Server (NTRS)
Callahan, John R.; Montgomery, Todd L.
1995-01-01
This paper describes the methods used to specify and implement a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally by two complementary teams using a combination of formal and informal techniques in an attempt to ensure the correctness of the protocol implementation. The first team, called the Design team, initially specified protocol requirements using a variant of SCR requirements tables and implemented a prototype solution. The second team, called the V&V team, developed a state model based on the requirements tables and derived test cases from these tables to exercise the implementation. In a series of iterative steps, the Design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation through testing. Test cases derived from state transition paths in the formal model formed the dialogue between teams during development and served as the vehicles for keeping the model and implementation in fidelity with each other. This paper describes our experiences in developing our process model, details of our approach, and some example problems found during the development of RMP.
Implementation of model predictive control for resistive wall mode stabilization on EXTRAP T2R
NASA Astrophysics Data System (ADS)
Setiadi, A. C.; Brunsell, P. R.; Frassinetti, L.
2015-10-01
A model predictive control (MPC) method for stabilization of the resistive wall mode (RWM) in the EXTRAP T2R reversed-field pinch is presented. A system identification technique is used to obtain a linearized empirical model of EXTRAP T2R. MPC employs the model for prediction and computes optimal control inputs that satisfy a performance criterion. The use of a linearized form of the model allows for a compact formulation of MPC, implemented on a millisecond timescale, that can be used for real-time control. The design allows the user to arbitrarily suppress any selected Fourier mode. The experimental results from EXTRAP T2R show that the designed and implemented MPC successfully stabilizes the RWM.
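The core computation the abstract describes — predicting with a linear model and solving for the optimal input sequence — can be sketched generically. This is an unconstrained finite-horizon linear MPC step in batch form, not the EXTRAP T2R controller; the matrices and weights are placeholders for an identified model:

```python
import numpy as np

# Hedged sketch: a generic unconstrained linear MPC step, not the actual
# EXTRAP T2R implementation. A, B come from system identification;
# Q, R weight predicted state error and control effort.

def mpc_step(A, B, Q, R, x0, horizon):
    """Return the first optimal input of a finite-horizon LQ problem (batch form)."""
    n, m = B.shape
    # Prediction matrices: stacked future states X = Phi @ x0 + Gamma @ U
    Phi = np.vstack([np.linalg.matrix_power(A, k + 1) for k in range(horizon)])
    Gamma = np.zeros((n * horizon, m * horizon))
    for i in range(horizon):
        for j in range(i + 1):
            Gamma[i*n:(i+1)*n, j*m:(j+1)*m] = np.linalg.matrix_power(A, i - j) @ B
    Qbar = np.kron(np.eye(horizon), Q)
    Rbar = np.kron(np.eye(horizon), R)
    # Minimize X'QbarX + U'RbarU: a convex quadratic in U, solved in closed form
    H = Gamma.T @ Qbar @ Gamma + Rbar
    f = Gamma.T @ Qbar @ Phi @ x0
    U = np.linalg.solve(H, -f)
    return U[:m]  # receding horizon: apply only the first input
```

The closed-form solve is what makes a millisecond-timescale implementation plausible: with the model fixed, `H` can even be factorized offline.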
ERIC Educational Resources Information Center
Moallem, Mahnaz
2001-01-01
Provides an overview of the process of designing and developing a Web-based course using instructional design principles and models, including constructivist and objectivist theories. Explains the process of implementing an instructional design model in designing a Web-based undergraduate course and evaluates the model based on course evaluations.…
Propulsion System Models for Rotorcraft Conceptual Design
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2014-01-01
The conceptual design code NDARC (NASA Design and Analysis of Rotorcraft) was initially implemented to model conventional rotorcraft propulsion systems, consisting of turboshaft engines burning jet fuel, connected to one or more rotors through a mechanical transmission. The NDARC propulsion system representation has been extended to cover additional propulsion concepts, including electric motors and generators, rotor reaction drive, turbojet and turbofan engines, fuel cells and solar cells, batteries, and fuel (energy) used without weight change. The paper describes these propulsion system components, the architecture of their implementation in NDARC, and the form of the models for performance and weight. Requirements are defined for improved performance and weight models of the new propulsion system components. With these new propulsion models, NDARC can be used to develop environmentally-friendly rotorcraft designs.
NASA Technical Reports Server (NTRS)
Hale, Mark A.
1996-01-01
Computer applications for design have evolved rapidly over the past several decades, and significant payoffs are being achieved by organizations through reductions in design cycle times. These applications are overwhelmed by the requirements imposed during complex, open engineering systems design. Organizations are faced with a number of different methodologies, numerous legacy disciplinary tools, and a very large amount of data. Yet they are also faced with few interdisciplinary tools for design collaboration or methods for achieving the revolutionary product designs required to maintain a competitive advantage in the future. These organizations are looking for a software infrastructure that integrates current corporate design practices with newer simulation and solution techniques. Such an infrastructure must be robust to changes in both corporate needs and enabling technologies. In addition, this infrastructure must be user-friendly, modular and scalable. This need is the motivation for the research described in this dissertation. The research is focused on the development of an open computing infrastructure that facilitates product and process design. In addition, this research explicitly deals with human interactions during design through a model that focuses on the role of a designer as that of decision-maker. The research perspective here is taken from that of design as a discipline with a focus on Decision-Based Design, Theory of Languages, Information Science, and Integration Technology. Given this background, a Model of IPPD is developed and implemented along the lines of a traditional experimental procedure: with the steps of establishing context, formalizing a theory, building an apparatus, conducting an experiment, reviewing results, and providing recommendations. Based on this Model, Design Processes and Specification can be explored in a structured and implementable architecture. 
An architecture for exploring design called DREAMS (Developing Robust Engineering Analysis Models and Specifications) has been developed which supports the activities of both meta-design and actual design execution. This is accomplished through a systematic process comprising the stages of Formulation, Translation, and Evaluation. During this process, elements from a Design Specification are integrated into Design Processes. In addition, a software infrastructure was developed, called IMAGE (Intelligent Multidisciplinary Aircraft Generation Environment). This represents a virtual apparatus in the Design Experiment conducted in this research. IMAGE is an innovative architecture because it explicitly supports design-related activities. This is accomplished through a GUI-driven and agent-based implementation of DREAMS. An HSCT design has been adopted from the Framework for Interdisciplinary Design Optimization (FIDO) and is implemented in IMAGE. This problem shows how Design Processes and Specification interact in a design system. In addition, the problem utilizes two different solution models concurrently: optimal and satisfying. The satisfying model allows for more design flexibility and allows a designer to maintain design freedom. As a result of following this experimental procedure, this infrastructure is an open system that is robust to changes in both corporate needs and computer technologies. The development of this infrastructure leads to a number of significant intellectual contributions: 1) A new approach to implementing IPPD with the aid of a computer; 2) A formal Design Experiment; 3) A combined Process and Specification architecture that is language-based; 4) An infrastructure for exploring design; 5) An integration strategy for implementing computer resources; and 6) A seamless modeling language. The need for these contributions is emphasized by the demand by industry and government agencies for the development of these technologies.
An Overview of Research and Evaluation Designs for Dissemination and Implementation
Brown, C. Hendricks; Curran, Geoffrey; Palinkas, Lawrence A.; Aarons, Gregory A.; Wells, Kenneth B.; Jones, Loretta; Collins, Linda M.; Duan, Naihua; Mittman, Brian S.; Wallace, Andrea; Tabak, Rachel G.; Ducharme, Lori; Chambers, David; Neta, Gila; Wiley, Tisha; Landsverk, John; Cheung, Ken; Cruden, Gracelyn
2016-01-01
Background: The wide variety of dissemination and implementation designs now being used to evaluate and improve health systems and outcomes warrants a review of the scope, features, and limitations of these designs. Methods: This paper is one product of a design workgroup formed in 2013 by the National Institutes of Health to address dissemination and implementation research, whose members represented diverse methodologic backgrounds, content focus areas, and health sectors. These experts integrated their collective knowledge of dissemination and implementation designs with searches of published evaluation strategies. Results: This paper emphasizes randomized and non-randomized designs for the traditional translational research continuum or pipeline, which builds on existing efficacy and effectiveness trials to examine how one or more evidence-based clinical/prevention interventions are adopted, scaled up, and sustained in community or service delivery systems. We also mention other designs, including hybrid designs that combine effectiveness and implementation research, quality improvement designs for local knowledge, and designs that use simulation modeling. PMID:28384085
Systems engineering principles for the design of biomedical signal processing systems.
Faust, Oliver; Acharya U, Rajendra; Sputh, Bernhard H C; Min, Lim Choo
2011-06-01
Systems engineering aims to produce reliable systems which function according to specification. In this paper we follow a systems engineering approach to design a biomedical signal processing system. We discuss requirements capture, specification definition, implementation, and testing of a classification system. These steps are executed as formally as possible. The requirements, which motivate the system design, are based on diabetes research. The main requirement for the classification system is to be a reliable component of a machine which controls diabetes. Reliability is very important, because uncontrolled diabetes may lead to hyperglycaemia (raised blood sugar) and over a period of time may cause serious damage to many of the body's systems, especially the nerves and blood vessels. In a second step, these requirements are refined into a formal CSP‖B model. The formal model expresses the system functionality in a clear and semantically strong way. Subsequently, the proven system model was translated into an implementation, which was tested with use cases and failure cases. Formal modeling and automated model checking gave us deep insight into the system functionality. This insight enabled us to create a reliable and trustworthy implementation, and with extensive tests we established trust in its reliability. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
MELCOR/CONTAIN LMR Implementation Report. FY14 Progress
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humphries, Larry L; Louie, David L.Y.
2014-10-01
This report describes the preliminary implementation of the sodium thermophysical properties and the design documentation for the sodium models of CONTAIN-LMR to be implemented into MELCOR 2.1. In the past year, the implementation included two separate sets of sodium properties from two different sources. The first source is based on previous work by Idaho National Laboratory, which modified MELCOR to include a liquid lithium equation of state as a working fluid for nuclear fusion safety research. To minimize the impact on MELCOR, the fusion safety database (FSD) was implemented by detecting the data input file as the means of invoking the FSD. The FSD methodology has been adopted for the current work, but it may be subject to modification as the project continues. The second source uses properties generated for the SIMMER code. Preliminary testing and results from this implementation of sodium properties are given. This year, the design document for the CONTAIN-LMR sodium models, such as the two-condensable option, sodium spray fire, and sodium pool fire, is being developed. This design document is intended to serve as a guide for the MELCOR implementation. In addition, the CONTAIN-LMR code used was based on an earlier version of the CONTAIN code, so many physical models developed since that version may not be captured. Although CONTAIN 2, which represents the latest development of CONTAIN, contains some sodium-specific models, these are incomplete; CONTAIN 2 with all sodium models implemented from CONTAIN-LMR should therefore be used as a comparison code for MELCOR. This implementation should be completed early next year, while sodium models from CONTAIN-LMR are being integrated into MELCOR. For testing, CONTAIN decks have been developed for verification and validation use.
Strömberg, Eric A; Nyberg, Joakim; Hooker, Andrew C
2016-12-01
With the increasing popularity of optimal design in drug development it is important to understand how the approximations and implementations of the Fisher information matrix (FIM) affect the resulting optimal designs. The aim of this work was to investigate the impact on design performance when using two common approximations to the population model and the full or block-diagonal FIM implementations for optimization of sampling points. Sampling schedules for two example experiments based on population models were optimized using the FO and FOCE approximations and the full and block-diagonal FIM implementations. The number of support points was compared between the designs for each example experiment. The performance of these designs based on simulations/estimations was investigated by computing bias of the parameters as well as through the use of an empirical D-criterion confidence interval. Simulations were performed when the design was computed with the true parameter values as well as with misspecified parameter values. The FOCE approximation and the full FIM implementation yielded designs with more support points and less clustering of sample points than designs optimized with the FO approximation and the block-diagonal implementation. The D-criterion confidence intervals showed no performance differences between the full and block-diagonal FIM optimal designs when assuming true parameter values. However, the FO-approximated block-diagonal FIM designs had higher bias than the other designs. When assuming parameter misspecification in the design evaluation, the FO full FIM optimal design was superior to the FO block-diagonal FIM design in both of the examples.
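The role of the FIM in ranking candidate sampling schedules can be illustrated on a much simpler problem than the population models in the abstract. A sketch, assuming a one-parameter mono-exponential model with additive error (this is the fixed-effect information only, not the FO/FOCE population FIM studied in the paper):

```python
import numpy as np

# Hedged sketch: information content of a sampling schedule for the toy model
# y = exp(-k t) + eps, eps ~ N(0, sigma^2). Illustrates how a D-type criterion
# ranks candidate designs; the paper's population FO/FOCE FIMs are far richer.

def fim(times, k, sigma2):
    # Sensitivity of the prediction w.r.t. the parameter k at each sample time
    dydk = -times * np.exp(-k * times)
    # For one parameter the FIM is a scalar: sum of squared sensitivities / sigma^2
    return float(dydk @ dydk) / sigma2

def d_criterion(times, k=0.5, sigma2=0.01):
    return fim(np.asarray(times, dtype=float), k, sigma2)

# Clustered early samples vs samples spread over the decay: the spread
# schedule captures more information about k here.
early = d_criterion([0.1, 0.2, 0.3])
spread = d_criterion([0.5, 2.0, 4.0])
```

The same comparison logic, applied with the full versus block-diagonal population FIM, is what drives the support-point differences the abstract reports.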
Digital-flutter-suppression-system investigations for the active flexible wing wind-tunnel model
NASA Technical Reports Server (NTRS)
Perry, Boyd, III; Mukhopadhyay, Vivek; Hoadley, Sherwood Tiffany; Cole, Stanley R.; Buttrill, Carey S.
1990-01-01
Active flutter suppression control laws were designed, implemented, and tested on an aeroelastically-scaled wind-tunnel model in the NASA Langley Transonic Dynamics Tunnel. One of the control laws was successful in stabilizing the model while the dynamic pressure was increased to 24 percent greater than the measured open-loop flutter boundary. Other accomplishments included the design, implementation, and successful operation of a one-of-a-kind digital controller, the design and use of two simulation methods to support the project, and the development and successful use of a methodology for online controller performance evaluation.
Digital-flutter-suppression-system investigations for the active flexible wing wind-tunnel model
NASA Technical Reports Server (NTRS)
Perry, Boyd, III; Mukhopadhyay, Vivek; Hoadley, Sherwood T.; Cole, Stanley R.; Buttrill, Carey S.; Houck, Jacob A.
1990-01-01
Active flutter suppression control laws were designed, implemented, and tested on an aeroelastically-scaled wind tunnel model in the NASA Langley Transonic Dynamics Tunnel. One of the control laws was successful in stabilizing the model while the dynamic pressure was increased to 24 percent greater than the measured open-loop flutter boundary. Other accomplishments included the design, implementation, and successful operation of a one-of-a-kind digital controller, the design and use of two simulation methods to support the project, and the development and successful use of a methodology for on-line controller performance evaluation.
Fuzzy model-based fault detection and diagnosis for a pilot heat exchanger
NASA Astrophysics Data System (ADS)
Habbi, Hacene; Kidouche, Madjid; Kinnaert, Michel; Zelmat, Mimoun
2011-04-01
This article addresses the design and real-time implementation of a fuzzy model-based fault detection and diagnosis (FDD) system for a pilot co-current heat exchanger. The design method is based on a three-step procedure which involves the identification of data-driven fuzzy rule-based models, the design of a fuzzy residual generator and the evaluation of the residuals for fault diagnosis using statistical tests. The fuzzy FDD mechanism has been implemented and validated on the real co-current heat exchanger, and has been proven to be efficient in detecting and isolating process, sensor and actuator faults.
Split-plot designs for robotic serial dilution assays.
Buzas, Jeffrey S; Wager, Carrie G; Lansky, David M
2011-12-01
This article explores effective implementation of split-plot designs in serial dilution bioassay using robots. We show that the shortest path for a robot to fill plate wells for a split-plot design is equivalent to the shortest common supersequence problem in combinatorics. We develop an algorithm for finding the shortest common supersequence, provide an R implementation, and explore the distribution of the number of steps required to implement split-plot designs for bioassay through simulation. We also show how to construct collections of split plots that can be filled in a minimal number of steps, thereby demonstrating that split-plot designs can be implemented with nearly the same effort as strip-plot designs. Finally, we provide guidelines for modeling data that result from these designs. © 2011, The International Biometric Society.
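The abstract's key reduction is that the robot's shortest fill path equals a shortest common supersequence (SCS). For two sequences this is solvable exactly by dynamic programming (for many sequences the problem is NP-hard, which is why the article develops an algorithm and simulations). A sketch of the two-sequence case:

```python
# Hedged sketch: shortest common supersequence of two fill sequences, the
# combinatorial problem the article maps robotic well-filling onto. This is
# the classic two-sequence DP, not the article's algorithm or R implementation.

def scs(a, b):
    """Return one shortest string containing both a and b as subsequences,
    using |SCS| = |a| + |b| - |LCS|."""
    m, n = len(a), len(b)
    # dp[i][j] = length of the longest common subsequence of a[:i] and b[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i-1] == b[j-1]:
                dp[i][j] = dp[i-1][j-1] + 1
            else:
                dp[i][j] = max(dp[i-1][j], dp[i][j-1])
    # Backtrack through the table to assemble one shortest supersequence
    out, i, j = [], m, n
    while i and j:
        if a[i-1] == b[j-1]:
            out.append(a[i-1]); i -= 1; j -= 1
        elif dp[i-1][j] >= dp[i][j-1]:
            out.append(a[i-1]); i -= 1
        else:
            out.append(b[j-1]); j -= 1
    out.extend(reversed(a[:i]))
    out.extend(reversed(b[:j]))
    return ''.join(reversed(out))
```

Interpreting each symbol as a well (or reagent) to fill, the SCS length is the minimum number of robot steps that realizes both required fill orders.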
Parameterized hardware description as object oriented hardware model implementation
NASA Astrophysics Data System (ADS)
Drabik, Pawel K.
2010-09-01
The paper introduces a novel model for the design, visualization, and management of complex, highly adaptive hardware systems. The model establishes a component-oriented environment for both hardware modules and the software application, and is developed from parameterized hardware description research. The establishment of a stable link between hardware and software, as the purpose of the designed and realized work, is presented. A novel programming framework model for the environment, named Graphic-Functional-Components, is presented. The purpose of the paper is to present object-oriented hardware modeling with the mentioned features. A possible model implementation in FPGA chips and its management by object-oriented software in Java is described.
Yap, Christina; Billingham, Lucinda J; Cheung, Ying Kuen; Craddock, Charlie; O'Quigley, John
2017-12-15
The ever-increasing pace of development of novel therapies mandates efficient methodologies for assessment of their tolerability and activity. Evidence increasingly supports the merits of model-based dose-finding designs in identifying the recommended phase II dose compared with conventional rule-based designs such as the 3 + 3, but despite this, their use remains limited. Here, we propose a useful tool, dose transition pathways (DTP), which helps overcome several commonly faced practical and methodologic challenges in the implementation of model-based designs. DTP projects in advance the doses recommended by a model-based design for subsequent patients (stay, escalate, de-escalate, or stop early), using all the accumulated information. After specifying a model with favorable statistical properties, we utilize the DTP to fine-tune the model and tailor it to the trial's specific requirements, reflecting important clinical judgments. In particular, it can help determine how stringent the stopping rules should be if the investigated therapy is too toxic. Its use to design and implement a modified continual reassessment method is illustrated in an acute myeloid leukemia trial. DTP removes the fear of model-based designs as unknown, complex systems and can serve as a handbook, guiding decision-making for each dose update. In the illustrated trial, the seamless, clear transition for each dose recommendation aided the investigators' understanding of the design and facilitated decision-making, enabling finer calibration of a tailored model. We advocate the use of the DTP as an integral procedure in the co-development and successful implementation of practical model-based designs by statisticians and investigators. Clin Cancer Res; 23(24); 7440-7. ©2017 American Association for Cancer Research.
The dynamical analysis of modified two-compartment neuron model and FPGA implementation
NASA Astrophysics Data System (ADS)
Lin, Qianjin; Wang, Jiang; Yang, Shuangming; Yi, Guosheng; Deng, Bin; Wei, Xile; Yu, Haitao
2017-10-01
The complexity of neural models is increasing with the investigation of larger biological neural networks, more varied ionic channels, and more detailed morphologies, and the implementation of a biological neural network is a task with huge computational complexity and power consumption. This paper presents an efficient digital design using piecewise linearization on a field programmable gate array (FPGA) to succinctly implement the reduced two-compartment model, which retains essential features of more complicated models. The design proposes an approximate neuron model composed of a set of piecewise linear equations that can reproduce different dynamical behaviors to depict the mechanisms of a single neuron model. The consistency of the hardware implementation is verified in terms of dynamical behaviors and bifurcation analysis, and the simulation results, including varied ion channel characteristics, coincide with the biological neuron model with high accuracy. Hardware synthesis on FPGA demonstrates that the proposed model has reliable performance and lower hardware resource usage compared with the original two-compartment model. These investigations are conducive to the scalability of biological neural networks in reconfigurable large-scale neuromorphic systems.
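Piecewise linearization replaces an expensive nonlinearity with a lookup of a segment offset and slope, which maps to a compare, a multiply, and an add in hardware. A sketch of the general idea, applied here to a FitzHugh-Nagumo-style cubic (the paper's actual segment choices for its two-compartment equations differ):

```python
import numpy as np

# Hedged sketch: piecewise-linear (PWL) approximation of a model nonlinearity,
# the hardware-friendly substitution the paper applies to its neuron equations.
# The function, range, and segment count below are illustrative only.

def make_pwl(f, lo, hi, segments):
    """Precompute breakpoints and slopes so each evaluation needs only a
    segment lookup, one multiply, and one add -- cheap on an FPGA."""
    xs = np.linspace(lo, hi, segments + 1)
    ys = f(xs)
    slopes = np.diff(ys) / np.diff(xs)
    def pwl(v):
        i = np.clip(np.searchsorted(xs, v) - 1, 0, segments - 1)
        return ys[i] + slopes[i] * (v - xs[i])
    return pwl

cubic = lambda v: v - v**3 / 3.0          # FitzHugh-Nagumo-style nonlinearity
approx = make_pwl(cubic, -2.5, 2.5, 32)   # 32 segments over the operating range
```

Doubling the segment count roughly quarters the approximation error (it scales with the square of the segment width), so accuracy trades directly against lookup-table size.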
Choi, Insook
2018-01-01
Sonification is an open-ended design task to construct sound informing a listener of data. Understanding application context is critical for shaping design requirements for data translation into sound. Sonification requires methodology to maintain reproducibility when data sources exhibit non-linear properties of self-organization and emergent behavior. This research formalizes interactive sonification in an extensible model to support reproducibility when data exhibits emergent behavior. In the absence of sonification theory, extensibility demonstrates relevant methods across case studies. The interactive sonification framework foregrounds three factors: reproducible system implementation for generating sonification; interactive mechanisms enhancing a listener's multisensory observations; and reproducible data from models that characterize emergent behavior. Supramodal attention research suggests interactive exploration with auditory feedback can generate context for recognizing irregular patterns and transient dynamics. The sonification framework provides circular causality as a signal pathway for modeling a listener interacting with emergent behavior. The extensible sonification model adopts a data acquisition pathway to formalize functional symmetry across three subsystems: Experimental Data Source, Sound Generation, and Guided Exploration. To differentiate time criticality and dimensionality of emerging dynamics, tuning functions are applied between subsystems to maintain scale and symmetry of concurrent processes and temporal dynamics. Tuning functions accommodate sonification design strategies that yield order parameter values to render emerging patterns discoverable as well as rehearsable, to reproduce desired instances for clinical listeners. Case studies are implemented with two computational models, Chua's circuit and Swarm Chemistry social agent simulation, generating data in real-time that exhibits emergent behavior. 
Heuristic Listening is introduced as an informal model of a listener's clinical attention to data sonification through multisensory interaction in a context of structured inquiry. Three methods are introduced to assess the proposed sonification framework: Listening Scenario classification, data flow Attunement, and Sonification Design Patterns to classify sound control. Case study implementations are assessed against these methods comparing levels of abstraction between experimental data and sound generation. Outcomes demonstrate the framework performance as a reference model for representing experimental implementations, also for identifying common sonification structures having different experimental implementations, identifying common functions implemented in different subsystems, and comparing impact of affordances across multiple implementations of listening scenarios. PMID:29755311
Feedback Model to Support Designers of Blended-Learning Courses
ERIC Educational Resources Information Center
Hummel, Hans G. K.
2006-01-01
Although extensive research has been carried out, describing the role of feedback in education, and many theoretical models are yet available, procedures and guidelines for actually designing and implementing feedback in practice have remained scarce so far. This explorative study presents a preliminary six-phase design model for feedback…
Design and Implementation of Volitional Control Support in Mathematics Courses
ERIC Educational Resources Information Center
Kim, ChanMin; Bennekin, Kimberly N.
2013-01-01
We designed support for volitional control with four stages for "goal initiation" ("Want it"), "goal formation" ("Plan for it"), "action control" ("Do it"), and "emotion control" ("Finish it") based on theories and models of volition. We implemented the support in…
NASA Astrophysics Data System (ADS)
Wang, S.; Peters-Lidard, C. D.; Mocko, D. M.; Kumar, S.; Nearing, G. S.; Arsenault, K. R.; Geiger, J. V.
2014-12-01
Model integration bridges the data flow between modeling frameworks and models. However, models usually do not fit directly into a particular modeling environment unless designed for it. An example includes implementing different types of models into the NASA Land Information System (LIS), a software framework for land-surface modeling and data assimilation. Model implementation requires scientific knowledge and software expertise, and it may take a developer months to learn LIS and the model's software structure. Debugging and testing of the model implementation are also time-consuming due to incomplete understanding of LIS or the model. This time spent is costly for research and operational projects. To address this issue, an approach has been developed to automate model integration into LIS. With this in mind, a general model interface was designed to retrieve the forcing inputs, parameters, and state variables needed by the model and to provide state variables and outputs back to LIS. Every model can be wrapped to comply with the interface, usually with a FORTRAN 90 subroutine. Development requires only knowledge of the model and basic programming skills. With such wrappers, the logic is the same for implementing all models, and code templates defined for this general model interface can be re-used with any specific model. Therefore, the model implementation can be done automatically. An automated model implementation toolkit was developed with Microsoft Excel and its built-in VBA language. It accepts model specifications in three worksheets and contains FORTRAN 90 code templates in VBA programs. According to the model specification, the toolkit generates data structures and procedures within FORTRAN modules and subroutines, which transfer data between LIS and the model wrapper. Model implementation is standardized, and about 80-90% of the development load is reduced.
In this presentation, the automated model implementation approach is described along with LIS programming interfaces, the general model interface and five case studies, including a regression model, Noah-MP, FASST, SAC-HTET/SNOW-17, and FLake. These different models vary in complexity with software structure. Also, we will describe how these complexities were overcome through using this approach and results of model benchmarks within LIS.
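The general model interface described above — retrieve forcing, parameters, and state; return updated state and outputs — can be sketched as follows. This is rendered in Python rather than the FORTRAN 90 used for LIS, and the structure names and the toy bucket model are illustrative, not the actual LIS API:

```python
# Hedged sketch of the wrapper pattern described in the abstract: a uniform
# data structure passed to every wrapped model. Names are hypothetical.

from dataclasses import dataclass, field

@dataclass
class ModelState:
    forcing: dict = field(default_factory=dict)     # inputs from the framework
    parameters: dict = field(default_factory=dict)  # static model parameters
    state: dict = field(default_factory=dict)       # prognostic variables
    outputs: dict = field(default_factory=dict)     # diagnostics returned to the framework

def wrap_bucket_model(ms: ModelState, dt: float) -> ModelState:
    """Toy 'land model' obeying the interface: a single-bucket water balance.
    Any model wrapped this way can be driven by the same framework loop."""
    precip = ms.forcing.get("precip", 0.0)
    et_rate = ms.parameters.get("et_rate", 0.1)
    storage = ms.state.get("storage", 0.0)
    storage += dt * (precip - et_rate * storage)   # simple mass balance
    ms.state["storage"] = storage
    ms.outputs["runoff"] = max(0.0, storage - ms.parameters.get("capacity", 1.0))
    return ms
```

Because every wrapper shares this one calling convention, the code that marshals data between the framework and the model is identical for all models, which is what makes the template-based code generation in the toolkit possible.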
Streamline Your Project: A Lifecycle Model.
ERIC Educational Resources Information Center
Viren, John
2000-01-01
Discusses one approach to project organization providing a baseline lifecycle model for multimedia/CBT development. This variation of the standard four-phase model of Analysis, Design, Development, and Implementation includes a Pre-Analysis phase, called Definition, and a Post-Implementation phase, known as Maintenance. Each phase is described.…
ERIC Educational Resources Information Center
Klebansky, Anna; Fraser, Sharon P.
2013-01-01
This paper details a conceptual framework that situates curriculum design for information literacy and lifelong learning, through a cohesive developmental information literacy based model for learning, at the core of teacher education courses at UTAS. The implementation of the framework facilitates curriculum design that systematically,…
Geometric modeling for computer aided design
NASA Technical Reports Server (NTRS)
Schwing, James L.
1993-01-01
Over the past several years, it has been the primary goal of this grant to design and implement software to be used in the conceptual design of aerospace vehicles. The work carried out under this grant was performed jointly with members of the Vehicle Analysis Branch (VAB) of NASA LaRC, Computer Sciences Corp., and Vigyan Corp. This has resulted in the development of several packages and design studies. Primary among these are the interactive geometric modeling tool, the Solid Modeling Aerospace Research Tool (SMART), and the integration and execution tools provided by the Environment for Application Software Integration and Execution (EASIE). In addition, it is the purpose of the personnel of this grant to provide consultation in the areas of structural design, algorithm development, and software development and implementation, particularly in the areas of computer aided design, geometric surface representation, and parallel algorithms.
ERIC Educational Resources Information Center
Katz, Jennifer
2015-01-01
Fifty-eight teachers of grades 1-12 in 10 schools located in two rural and three urban school divisions in Manitoba were involved in a study implementing the Three Block Model of Universal Design for Learning and exploring its outcomes for teachers and students. This article reports teachers' perceptions related to the outcomes of the…
Course Design Using an Authentic Studio Model
ERIC Educational Resources Information Center
Wilson, Jay R.
2013-01-01
Educational Technology and Design 879 is a graduate course that introduces students to the basics of video design and production. In an attempt to improve the learning experience for students a redesign of the course was implemented for the summer of 2011 that incorporated an authentic design studio model. The design studio approach is based on…
Collaborative Team Model: Design for Successful Special Education
ERIC Educational Resources Information Center
Bishop, Ellis Norman
2016-01-01
This study examined the academic impact in reading and mathematics when Collaborative, Co-Teaching Team Model of high incidence special education student service delivery implemented in a suburban school district. This study hypothesized that the implementation of an inclusive collaborative co-teaching model of service delivery could possibly…
Young Children's Metarepresentational Competence in Data Modelling
ERIC Educational Resources Information Center
English, Lyn
2012-01-01
This paper reports findings from an activity implemented in the final year of a 3-year longitudinal study of data modelling across grades 1-3. The activity engaged children in designing, implementing, and analysing a survey about their new playground. Data modelling involves investigations of meaningful phenomena, deciding what is worthy of…
Implementation of an Ada real-time executive: A case study
NASA Technical Reports Server (NTRS)
Laird, James D.; Burton, Bruce A.; Koppes, Mary R.
1986-01-01
Current Ada language implementations and runtime environments are immature, unproven, and a key risk area for real-time embedded computer systems (ECS). A test-case environment is provided in which the concerns of the real-time ECS community are addressed. A priority-driven executive was selected to be implemented in the Ada programming language. The model selected is representative of real-time executives tailored for embedded systems used in missile, spacecraft, and avionics applications. An Ada-based design methodology is utilized, and two designs are considered. The first of these designs requires the use of vendor-supplied runtime and tasking support. An alternative high-level design is also considered for an implementation requiring no vendor-supplied runtime or tasking support. The former approach is carried through to implementation.
Augustsson, Hanna; von Thiele Schwarz, Ulrica; Stenfors-Hayes, Terese; Hasson, Henna
2015-06-01
The workplace has been suggested as an important arena for health promotion, but little is known about how the organizational setting influences the implementation of interventions. The aims of this study are to evaluate implementation fidelity in an organizational-level occupational health intervention and to investigate possible explanations for variations in fidelity between intervention units. The intervention consisted of an integration of health promotion, occupational health and safety, and a system for continuous improvements (Kaizen) and was conducted in a quasi-experimental design at a Swedish hospital. Implementation fidelity was evaluated with the Conceptual Framework for Implementation Fidelity and implementation factors used to investigate variations in fidelity with the Framework for Evaluating Organizational-level Interventions. A multi-method approach including interviews, Kaizen notes, and questionnaires was applied. Implementation fidelity differed between units even though the intervention was introduced and supported in the same way. Important differences in all elements proposed in the model for evaluating organizational-level interventions, i.e., context, intervention, and mental models, were found to explain the differences in fidelity. Implementation strategies may need to be adapted depending on the local context. Implementation fidelity, as well as pre-intervention implementation elements, is likely to affect the implementation success and needs to be assessed in intervention research. The high variation in fidelity across the units indicates the need for adjustments to the type of designs used to assess the effects of interventions. Thus, rather than using designs that aim to control variation, it may be necessary to use those that aim at exploring and explaining variation, such as adapted study designs.
Hiding the system from the user: Moving from complex mental models to elegant metaphors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curtis W. Nielsen; David J. Bruemmer
2007-08-01
In previous work, increased complexity of robot behaviors and the accompanying interface design often led to operator confusion and/or a fight for control between the robot and operator. We believe the reason for the conflict was that the design of the interface and interactions presented too much of the underlying robot design model to the operator. Since the design model includes the implementation of sensors, behaviors, and sophisticated algorithms, the result was that the operator's cognitive efforts were focused on understanding the design of the robot system as opposed to focusing on the task at hand. This paper illustrates how this very problem emerged at the INL and how the implementation of new metaphors for interaction has allowed us to hide the design model from the user and allow the user to focus more on the task at hand. Supporting the user's focus on the task rather than on the design model allows increased use of the system and significant performance improvement in a search task with novice users.
ERIC Educational Resources Information Center
Tritrakan, Kasame; Kidrakarn, Pachoen; Asanok, Manit
2016-01-01
The aim of this research is to develop a learning model which blends factors from learning environment and engineering design concept for learning in computer programming course. The usage of the model was also analyzed. This study presents the design, implementation, and evaluation of the model. The research methodology is divided into three…
Formal specification and design techniques for wireless sensor and actuator networks.
Martínez, Diego; González, Apolinar; Blanes, Francisco; Aquino, Raúl; Simo, José; Crespo, Alfons
2011-01-01
A current trend in the development and implementation of industrial applications is to use wireless networks to interconnect the system nodes, mainly to increase application flexibility, reliability, and portability, as well as to reduce implementation cost. However, the nondeterministic and concurrent behavior of distributed systems makes their analysis and design complex, often resulting in less-than-satisfactory performance in simulation and test-bed scenarios; this is caused by using imprecise models to analyze, validate, and design these systems. Moreover, some simulation platforms do not support these models. This paper presents a design and validation method for Wireless Sensor and Actuator Networks (WSAN) which is supported by a minimal set of wireless components represented in Colored Petri Nets (CPN). In summary, the model presented allows users to verify the design properties and structural behavior of the system.
NASA Technical Reports Server (NTRS)
Rauw, Marc O.
1993-01-01
The design of advanced Automatic Aircraft Control Systems (AACS's) can be improved upon considerably if the designer can access all models and tools required for control system design and analysis through a graphical user interface, from within one software environment. This MSc-thesis presents the first step in the development of such an environment, which is currently being done at the Section for Stability and Control of Delft University of Technology, Faculty of Aerospace Engineering. The environment is implemented within the commercially available software package MATLAB/SIMULINK. The report consists of two parts. Part 1 gives a detailed description of the AACS design environment. The heart of this environment is formed by the SIMULINK implementation of a nonlinear aircraft model in block-diagram format. The model has been worked out for the old laboratory aircraft of the Faculty, the DeHavilland DHC-2 'Beaver', but due to its modular structure, it can easily be adapted for other aircraft. Part 1 also describes MATLAB programs which can be applied for finding steady-state trimmed-flight conditions and for linearization of the aircraft model, and it shows how the built-in simulation routines of SIMULINK have been used for open-loop analysis of the aircraft dynamics. Apart from the implementation of the models and tools, a thorough treatment of the theoretical backgrounds is presented. Part 2 of this report presents a part of an autopilot design process for the 'Beaver' aircraft, which clearly demonstrates the power and flexibility of the AACS design environment from part 1. Evaluations of all longitudinal and lateral control laws by means of nonlinear simulations are treated in detail. A floppy disk containing all relevant MATLAB programs and SIMULINK models is provided as a supplement.
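The trim-and-linearize workflow described above (find a steady-state condition, then obtain a linear model for control design) reduces numerically to evaluating a Jacobian of the nonlinear state equations about the trim point. The sketch below uses central differences and a hypothetical two-state short-period-like model, not the actual 'Beaver' dynamics or the MATLAB tools from the thesis.

```python
import math

def jacobian(f, x, eps=1e-6):
    """Central-difference Jacobian of f: R^n -> R^m at point x.
    This mirrors the numerical linearization step: the nonlinear model
    xdot = f(x) is approximated by xdot ~= A (x - x0) about a trimmed
    condition x0 where f(x0) ~= 0."""
    n = len(x)
    m = len(f(x))
    A = [[0.0] * n for _ in range(m)]
    for j in range(n):
        xp, xm = list(x), list(x)
        xp[j] += eps
        xm[j] -= eps
        fp, fm = f(xp), f(xm)
        for i in range(m):
            A[i][j] = (fp[i] - fm[i]) / (2 * eps)
    return A

# Hypothetical short-period-like dynamics in (alpha, q); coefficients invented
def f(x):
    alpha, q = x
    return [-1.2 * alpha + q,
            -4.0 * math.sin(alpha) - 0.8 * q]

A = jacobian(f, [0.0, 0.0])   # linearize about the trim point alpha = q = 0
```

At the trim point the sine term linearizes to its slope, so `A` recovers the familiar constant-coefficient state matrix used for open-loop analysis and control law design.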
Identification and Control of Aircrafts using Multiple Models and Adaptive Critics
NASA Technical Reports Server (NTRS)
Principe, Jose C.
2007-01-01
We compared two possible implementations of local linear models for control. The first approach is based on a self-organizing map (SOM) to cluster the dynamics, followed by a set of linear models operating at each cluster; the gating function is therefore hard (a single local model represents the regional dynamics). This simplifies the controller design, since there is a one-to-one mapping between controllers and local models. The second approach uses a soft gate in a probabilistic framework based on a Gaussian mixture model (GMM, also called a dynamic mixture of experts). In this approach several models may be active at a given time; we can expect a smaller number of models, but the controller design is more involved, with potentially better noise-rejection characteristics. Our experiments showed that the SOM provides the best overall performance at high SNRs, but its performance degrades faster than the GMM's under the same noise conditions. The SOM approach required about an order of magnitude more models than the GMM, so in terms of implementation cost the GMM is preferable. The design of the SOM is straightforward, while the design of the GMM controllers, although still reasonable, is more involved and needs more care in the selection of parameters. Either of these locally linear approaches outperforms global nonlinear controllers based on neural networks, such as the time delay neural network (TDNN). In essence, then, the local model approach warrants practical implementation. To call the attention of the control community to this design methodology, we successfully extended the multiple-model approach to PID controllers (still the most widely used control scheme in industry today) and wrote a paper on this subject. The echo state network (ESN) is a recurrent neural network with the special characteristic that only the output parameters are trained. The recurrent connections are preset according to the problem domain and are fixed.
In a nutshell, the states of the reservoir of recurrent processing elements implement a projection space onto which the desired response is optimally projected. This architecture gains training efficiency at the cost of a large increase in the dimension of the recurrent layer; in exchange, the power of recurrent neural networks can be brought to bear on difficult practical problems. Our goal was to implement an adaptive critic architecture realizing Bellman's approach to optimal control. However, we could only characterize the ESN performance as a critic in value function evaluation, which is just one of the pieces of the overall adaptive critic controller. The results were very convincing, and the simplicity of the implementation was unparalleled.
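The ESN idea above (a fixed random reservoir with only the readout trained) can be sketched in a few lines. This toy version uses a crude norm-based scaling of the reservoir, trains the readout by plain LMS rather than the usual ridge regression, and predicts a sine wave one step ahead; all sizes and rates are illustrative, not from the report.

```python
import math
import random

random.seed(0)

N = 20      # reservoir size
RHO = 0.9   # scaling factor keeping the recurrent map contractive

# Fixed random reservoir and input weights -- never trained, per the ESN idea
W = [[random.uniform(-0.5, 0.5) for _ in range(N)] for _ in range(N)]
scale = max(sum(abs(w) for w in row) for row in W)   # crude norm bound
W = [[RHO * w / scale for w in row] for row in W]
Win = [random.uniform(-1.0, 1.0) for _ in range(N)]

def step(x, u):
    """One reservoir update: x' = tanh(W x + Win u)."""
    return [math.tanh(sum(W[i][j] * x[j] for j in range(N)) + Win[i] * u)
            for i in range(N)]

# Only the readout Wout is adapted; task: one-step-ahead sine prediction
Wout = [0.0] * N
lr = 0.02
x = [0.0] * N
errs = []
for t in range(3000):
    u = math.sin(0.2 * t)
    target = math.sin(0.2 * (t + 1))
    x = step(x, u)
    y = sum(Wout[i] * x[i] for i in range(N))
    err = target - y
    errs.append(abs(err))
    for i in range(N):
        Wout[i] += lr * err * x[i]   # LMS update of the readout only
```

The prediction error shrinks as training proceeds even though the recurrent weights never change, which is exactly the trade the abstract describes: a larger fixed recurrent layer in exchange for a trivially simple training problem.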
NASA Astrophysics Data System (ADS)
Azimi, Ehsan; Behrad, Alireza; Ghaznavi-Ghoushchi, Mohammad Bagher; Shanbehzadeh, Jamshid
2016-11-01
The projective model is an important mapping function for the calculation of global transformation between two images. However, its hardware implementation is challenging because of a large number of coefficients with different required precisions for fixed point representation. A VLSI hardware architecture is proposed for the calculation of a global projective model between input and reference images and refining false matches using random sample consensus (RANSAC) algorithm. To make the hardware implementation feasible, it is proved that the calculation of the projective model can be divided into four submodels comprising two translations, an affine model and a simpler projective mapping. This approach makes the hardware implementation feasible and considerably reduces the required number of bits for fixed point representation of model coefficients and intermediate variables. The proposed hardware architecture for the calculation of a global projective model using the RANSAC algorithm was implemented using Verilog hardware description language and the functionality of the design was validated through several experiments. The proposed architecture was synthesized by using an application-specific integrated circuit digital design flow utilizing 180-nm CMOS technology as well as a Virtex-6 field programmable gate array. Experimental results confirm the efficiency of the proposed hardware architecture in comparison with software implementation.
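The RANSAC loop at the heart of the architecture is easy to state in software. The sketch below estimates only the simplest of the four submodels mentioned (a pure 2-D translation, which needs a single match per hypothesis) rather than the full projective model, and the point data are invented for illustration.

```python
import random

def ransac_translation(matches, iters=200, tol=2.0, seed=1):
    """Minimal RANSAC: estimate a 2-D translation (tx, ty) from putative
    point matches [((x, y), (x2, y2)), ...], rejecting false matches."""
    rng = random.Random(seed)
    best, best_inliers = None, []
    for _ in range(iters):
        (x, y), (xp, yp) = rng.choice(matches)
        tx, ty = xp - x, yp - y              # hypothesis from one match
        inliers = [m for m in matches
                   if abs(m[1][0] - m[0][0] - tx) <= tol
                   and abs(m[1][1] - m[0][1] - ty) <= tol]
        if len(inliers) > len(best_inliers):
            best, best_inliers = (tx, ty), inliers
    # Refine: average the translation over the consensus set
    tx = sum(m[1][0] - m[0][0] for m in best_inliers) / len(best_inliers)
    ty = sum(m[1][1] - m[0][1] for m in best_inliers) / len(best_inliers)
    return (tx, ty), best_inliers

# 8 true matches shifted by (5, -3) plus 2 gross outliers (false matches)
good = [((i, 2 * i), (i + 5, 2 * i - 3)) for i in range(8)]
bad = [((0, 0), (40, 40)), ((1, 1), (-30, 7))]
t, inliers = ransac_translation(good + bad)
```

A full projective (homography) hypothesis needs four matches and a linear solve per iteration; the hardware paper's contribution is precisely that decomposing it into translations, an affine part, and a simpler projective part keeps the fixed-point word lengths manageable.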
Communications, Navigation, and Surveillance Models in ACES: Design Implementation and Capabilities
NASA Technical Reports Server (NTRS)
Kubat, Greg; Vandrei, Don; Satapathy, Goutam; Kumar, Anil; Khanna, Manu
2006-01-01
Presentation objectives include: a) Overview of the ACES/CNS System Models Design and Integration; b) Configuration Capabilities available for Models and Simulations using ACES with CNS Modeling; c) Descriptions of recently added, Enhanced CNS Simulation Capabilities; and d) General Concept Ideas that Utilize CNS Modeling to Enhance Concept Evaluations.
NASA Technical Reports Server (NTRS)
Kubat, Greg; Vandrei, Don
2006-01-01
Project Objectives include: a) CNS Model Development; b) Design/Integration of baseline set of CNS Models into ACES; c) Implement Enhanced Simulation Capabilities in ACES; d) Design and Integration of Enhanced (2nd set) CNS Models; and e) Continue with CNS Model Integration/Concept evaluations.
Gleddie, Doug
2012-03-01
The health-promoting schools approach has gained momentum in the last decade, with many jurisdictions providing guidelines and frameworks for general implementation. Although general agreement exists as to the broad strokes needed for effectiveness, local implementation designs and models are less apparent. The Battle River Project was designed to explore one such local implementation strategy for a provincial (Alberta, Canada) health-promoting schools program. Located in the Battle River School Division, the project featured a partnership between Ever Active Schools, the school division, and the local health authority. A case study was used to come to a greater understanding of how the health-promoting schools approach worked in this particular school authority and model. Three themes emerged: participation, coordination, and integration.
CoLeMo: A Collaborative Learning Environment for UML Modelling
ERIC Educational Resources Information Center
Chen, Weiqin; Pedersen, Roger Heggernes; Pettersen, Oystein
2006-01-01
This paper presents the design, implementation, and evaluation of a distributed collaborative UML modelling environment, CoLeMo. CoLeMo is designed for students studying UML modelling. It can also be used as a platform for collaborative design of software. We conducted formative evaluations and a summative evaluation to improve the environment and…
Minkoff, Kenneth; Cline, Christie A
2004-12-01
This article has described the CCISC model and the process of systemic implementation of co-occurring disorder service enhancements within the context of existing resources. Four projects were described as illustrations of current implementation activities. Clearly, there is a need for improved services for these individuals, and increasing recognition of the need for systemic change models that are effective and efficient. The CCISC model has been recognized by SAMHSA as a consensus best practice for system design, and initial efforts at implementation appear to be promising. The existing toolkit may permit a more formal process of data-driven evaluation of system, program, clinician, and client outcomes, to better measure the effectiveness of this approach. Some projects have begun such formal evaluation processes, but more work is needed, not only with individual projects but also to develop opportunities for multi-system evaluation as more projects come on line.
Investigation of model-based physical design restrictions (Invited Paper)
NASA Astrophysics Data System (ADS)
Lucas, Kevin; Baron, Stanislas; Belledent, Jerome; Boone, Robert; Borjon, Amandine; Couderc, Christophe; Patterson, Kyle; Riviere-Cazaux, Lionel; Rody, Yves; Sundermann, Frank; Toublan, Olivier; Trouiller, Yorick; Urbani, Jean-Christophe; Wimmer, Karl
2005-05-01
As lithography and other patterning processes become more complex and more non-linear with each generation, the task of physical design rules necessarily increases in complexity also. The goal of the physical design rules is to define the boundary between the physical layout structures which will yield well from those which will not. This is essentially a rule-based pre-silicon guarantee of layout correctness. However the rapid increase in design rule requirement complexity has created logistical problems for both the design and process functions. Therefore, similar to the semiconductor industry's transition from rule-based to model-based optical proximity correction (OPC) due to increased patterning complexity, opportunities for improving physical design restrictions by implementing model-based physical design methods are evident. In this paper we analyze the possible need and applications for model-based physical design restrictions (MBPDR). We first analyze the traditional design rule evolution, development and usage methodologies for semiconductor manufacturers. Next we discuss examples of specific design rule challenges requiring new solution methods in the patterning regime of low K1 lithography and highly complex RET. We then evaluate possible working strategies for MBPDR in the process development and product design flows, including examples of recent model-based pre-silicon verification techniques. Finally we summarize with a proposed flow and key considerations for MBPDR implementation.
Earth Observations, Models and Geo-Design in Support of SDG Implementation and Monitoring
NASA Astrophysics Data System (ADS)
Plag, H. P.; Jules-Plag, S.
2016-12-01
Implementation and Monitoring of the United Nations' Sustainable Development Goals (SDGs) requires support from Earth observation and scientific communities. Applying a goal-based approach to determine the data needs of the Targets and Indicators associated with the SDGs demonstrates that integration of environmental with socio-economic and statistical data is required. Large data gaps exist for the built environment. A Geo-Design platform can provide the infrastructure and conceptual model for the data integration. The development of policies and actions to foster the implementation of SDGs in many cases requires research and the development of tools to answer "what if" questions. Here, agent-based models and model webs combined with a Geo-Design platform are promising avenues. This advanced combined infrastructure can also play a crucial role in the necessary capacity building. We will use the example of SDG 5 (Gender Equality) to illustrate these approaches. SDG 11 (Sustainable Cities and Communities) is used to underline the cross-goal linkages and the joint benefits of Earth observations, data integration, and modeling tools for multiple SDGs.
Supporting BPMN choreography with system integration artefacts for enterprise process collaboration
NASA Astrophysics Data System (ADS)
Nie, Hongchao; Lu, Xudong; Duan, Huilong
2014-07-01
Business Process Model and Notation (BPMN) choreography modelling depicts externally visible message exchanges between collaborating processes of enterprise information systems. Implementation of choreography relies on designing system integration solutions to realise message exchanges between independently developed systems. Enterprise integration patterns (EIPs) are widely accepted artefacts to design integration solutions. If the choreography model represents coordination requirements between processes with behaviour mismatches, the integration designer needs to analyse the routing requirements and address these requirements by manually designing EIP message routers. As collaboration scales and complexity increases, manual design becomes inefficient. Thus, the research problem of this paper is to explore a method to automatically identify routing requirements from BPMN choreography model and to accordingly design routing in the integration solution. To achieve this goal, recurring behaviour mismatch scenarios are analysed as patterns, and corresponding solutions are proposed as EIP routers. Using this method, a choreography model can be analysed by computer to identify occurrences of mismatch patterns, leading to corresponding router selection. A case study demonstrates that the proposed method enables computer-assisted integration design to implement choreography. A further experiment reveals that the method is effective to improve the design quality and reduce time cost.
Probabilistic Finite Element Analysis & Design Optimization for Structural Designs
NASA Astrophysics Data System (ADS)
Deivanayagam, Arumugam
This study focuses on incorporating the probabilistic nature of material properties (Kevlar® 49) into the existing deterministic finite element analysis (FEA) of a fabric-based engine containment system through Monte Carlo simulations (MCS), and on implementing probabilistic analysis in engineering design through Reliability Based Design Optimization (RBDO). First, the emphasis is on experimental data analysis focusing on probabilistic distribution models which characterize the randomness associated with the experimental data. The material properties of Kevlar® 49 are modeled using experimental data analysis and implemented along with an existing spiral modeling scheme (SMS) and a user-defined constitutive model (UMAT) for fabric-based engine containment simulations in LS-DYNA. MCS of the model are performed to observe the failure pattern and exit velocities of the models, and the solutions are compared with NASA experimental tests and deterministic results. MCS with probabilistic material data give a better perspective on results than a single deterministic simulation. The next part of the research is to implement the probabilistic material properties in engineering designs. The main aim of structural design is to obtain optimal solutions. However, even though the structures produced by a deterministic optimization problem are cost effective, they become highly unreliable if the uncertainty that may be associated with the system (material properties, loading, etc.) is not represented or considered in the solution process. A reliable and optimal solution can be obtained by performing reliability optimization along with the deterministic optimization, which is RBDO. In the RBDO problem formulation, in addition to structural performance constraints, reliability constraints are also considered.
This part of the research starts with an introduction to reliability analysis, such as first-order and second-order reliability analysis, followed by simulation techniques that are performed to obtain the probability of failure and reliability of structures. Next, a decoupled RBDO procedure is proposed with a new reliability analysis formulation with sensitivity analysis, which is performed to remove the highly reliable constraints in the RBDO, thereby reducing the computational time and function evaluations. Finally, implementations of the reliability analysis concepts and RBDO in finite element 2D truss problems and a planar beam problem are presented and discussed.
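The basic Monte Carlo step underlying both parts of the study is the estimation of a probability of failure from sampled random properties. The sketch below uses a simple limit state g = R - S with hypothetical normal distributions standing in for the Kevlar® 49 data; the means and standard deviations are illustrative only, not from the experiments.

```python
import random

def mcs_failure_probability(n=100_000, seed=42):
    """Monte Carlo estimate of P(failure) = P(R - S < 0) for the limit
    state g = R - S. R (resistance/strength) and S (load effect) are drawn
    from hypothetical normal distributions; the parameters are illustrative."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        R = rng.gauss(mu=100.0, sigma=10.0)   # strength sample
        S = rng.gauss(mu=70.0, sigma=10.0)    # load sample
        if R - S < 0:
            failures += 1
    return failures / n

pf = mcs_failure_probability()
```

For these illustrative parameters, g is normal with mean 30 and standard deviation sqrt(200), so the exact failure probability is about 0.017; the MCS estimate converges to it as n grows. In an RBDO loop, a reliability constraint would bound this quantity (or the equivalent reliability index) at each candidate design.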
An agent-based simulation model to study accountable care organizations.
Liu, Pai; Wu, Shinyi
2016-03-01
Creating accountable care organizations (ACOs) has been widely discussed as a strategy to control rapidly rising healthcare costs and improve quality of care; however, building an effective ACO is a complex process involving multiple stakeholders (payers, providers, patients) with their own interests. Also, implementation of an ACO is costly in terms of time and money. Immature design could cause safety hazards. Therefore, there is a need for analytical model-based decision-support tools that can predict the outcomes of different strategies to facilitate ACO design and implementation. In this study, an agent-based simulation model was developed to study ACOs that considers payers, healthcare providers, and patients as agents under the shared saving payment model of care for congestive heart failure (CHF), one of the most expensive causes of sometimes preventable hospitalizations. The agent-based simulation model has identified the critical determinants for the payment model design that can motivate provider behavior changes to achieve maximum financial and quality outcomes of an ACO. The results show nonlinear provider behavior change patterns corresponding to changes in payment model designs. The outcomes vary by providers with different quality or financial priorities, and are most sensitive to the cost-effectiveness of CHF interventions that an ACO implements. This study demonstrates an increasingly important method to construct a healthcare system analytics model that can help inform health policy and healthcare management decisions. The study also points out that the likely success of an ACO is interdependent with payment model design, provider characteristics, and cost and effectiveness of healthcare interventions.
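The provider-behavior mechanism the model explores (adoption decisions responding nonlinearly to the shared-savings rate) can be illustrated with a deliberately tiny agent sketch. Every number below is invented for illustration and does not reflect the paper's calibrated model.

```python
import random

def simulate_aco(shared_savings_rate, n_providers=50, seed=7):
    """Toy agent-based sketch: each provider agent adopts a (hypothetical)
    CHF intervention only if its expected share of the savings exceeds its
    own adoption cost. All dollar figures are illustrative."""
    rng = random.Random(seed)
    intervention_saving = 1_500.0   # per-patient saving if adopted
    adopters = 0
    total_savings = 0.0
    for _ in range(n_providers):
        adoption_cost = rng.uniform(100.0, 1_000.0)  # heterogeneous providers
        expected_share = shared_savings_rate * intervention_saving
        if expected_share > adoption_cost:
            adopters += 1
            total_savings += intervention_saving
    return adopters, total_savings

low = simulate_aco(0.1)    # weak shared-savings incentive
high = simulate_aco(0.6)   # strong shared-savings incentive
```

Because provider costs are heterogeneous, the adoption count responds to the payment rate through a threshold, which is the simplest source of the nonlinear behavior-change patterns the abstract reports; the real model additionally represents payers and patients as agents.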
Chao, Tian-Jy; Kim, Younghun
2015-02-10
An end-to-end interoperability and workflows from building architecture design to one or more simulations, in one aspect, may comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.
On data modeling for neurological application
NASA Astrophysics Data System (ADS)
Woźniak, Karol; Mulawka, Jan
The aim of this paper is to design and implement an information system containing a large database dedicated to supporting neurological-psychiatric examinations focused on the human brain after stroke. The approach encompasses the following steps: analysis of software requirements, presentation of the problem-solving concept, and design and implementation of the final information system. Certain experiments were performed in order to verify the correctness of the project ideas. The approach can be considered an interdisciplinary venture. Elaboration of the system architecture, the data model, and the tools supporting medical examinations is provided. The achievement of the design goals is demonstrated in the final conclusion.
NASA Technical Reports Server (NTRS)
Rowe, Sidney E.
2010-01-01
In September 2007, the Engineering Directorate at the Marshall Space Flight Center (MSFC) created the Design System Focus Team (DSFT). MSFC was responsible for the in-house design and development of the Ares 1 Upper Stage and the Engineering Directorate was preparing to deploy a new electronic Configuration Management and Data Management System with the Design Data Management System (DDMS) based upon a Commercial Off The Shelf (COTS) Product Data Management (PDM) System. The DSFT was to establish standardized CAD practices and a new data life cycle for design data. Of special interest here, the design teams were to implement Model Based Definition (MBD) in support of the Upper Stage manufacturing contract. It is noted that this MBD does use partially dimensioned drawings for auxiliary information to the model. The design data lifecycle implemented several new release states to be used prior to formal release that allowed the models to move through a flow of progressive maturity. The DSFT identified some 17 Lessons Learned as outcomes of the standards development, pathfinder deployments and initial application to the Upper Stage design completion. Some of the high value examples are reviewed.
Logic Models for Program Design, Implementation, and Evaluation: Workshop Toolkit. REL 2015-057
ERIC Educational Resources Information Center
Shakman, Karen; Rodriguez, Sheila M.
2015-01-01
The Logic Model Workshop Toolkit is designed to help practitioners learn the purpose of logic models, the different elements of a logic model, and the appropriate steps for developing and using a logic model for program evaluation. Topics covered in the sessions include an overview of logic models, the elements of a logic model, an introduction to…
DOT National Transportation Integrated Search
2009-11-01
The development of the Mechanistic-Empirical Pavement Design Guide (MEPDG) under National Cooperative Highway Research Program (NCHRP) projects 1-37A and 1-40D has significantly improved the ability of pavement designers to model and simulate the eff...
A design optimization process for Space Station Freedom
NASA Technical Reports Server (NTRS)
Chamberlain, Robert G.; Fox, George; Duquette, William H.
1990-01-01
The Space Station Freedom Program is used to develop and implement a process for design optimization. Because the relative worth of arbitrary design concepts cannot be assessed directly, comparisons must be based on designs that provide the same performance from the point of view of station users; such designs can be compared in terms of life cycle cost. Since the technology required to produce a space station is widely dispersed, a decentralized optimization process is essential. A formulation of the optimization process is provided and the mathematical models designed to facilitate its implementation are described.
Neural networks and MIMD-multiprocessors
NASA Technical Reports Server (NTRS)
Vanhala, Jukka; Kaski, Kimmo
1990-01-01
Two artificial neural network models are compared. They are the Hopfield Neural Network Model and the Sparse Distributed Memory model. Distributed algorithms for both of them are designed and implemented. The run time characteristics of the algorithms are analyzed theoretically and tested in practice. The storage capacities of the networks are compared. Implementations are done using a distributed multiprocessor system.
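Of the two models compared, the Hopfield network is the easier one to sketch: Hebbian storage of ±1 patterns followed by iterated sign updates that pull a corrupted input back to the nearest stored pattern. The distributed MIMD implementations in the abstract partition exactly this update across processors; here it is shown serially, with an invented 6-bit pattern.

```python
def train_hopfield(patterns):
    """Hebbian weight matrix for a Hopfield net storing +/-1 patterns."""
    n = len(patterns[0])
    W = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:   # no self-connections
                    W[i][j] += p[i] * p[j] / len(patterns)
    return W

def recall(W, state, steps=5):
    """Synchronous sign updates; for small nets this settles on a
    stored pattern when the cue is close enough."""
    n = len(state)
    s = list(state)
    for _ in range(steps):
        s = [1 if sum(W[i][j] * s[j] for j in range(n)) >= 0 else -1
             for i in range(n)]
    return s

stored = [1, -1, 1, -1, 1, -1]
W = train_hopfield([stored])
noisy = [1, -1, -1, -1, 1, -1]     # one bit flipped
print(recall(W, noisy))            # → [1, -1, 1, -1, 1, -1]
```

The inner sum over j is the natural unit of parallel work: each processor can own a block of neurons and exchange states each sweep, which is essentially how a distributed MIMD mapping of the model proceeds.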
A New Design for Airway Management Training with Mixed Reality and High Fidelity Modeling.
Shen, Yunhe; Hananel, David; Zhao, Zichen; Burke, Daniel; Ballas, Crist; Norfleet, Jack; Reihsen, Troy; Sweet, Robert
2016-01-01
Restoring airway function is a vital task in many medical scenarios. Although various simulation tools have been available for learning such skills, recent research indicated that fidelity in simulating airway management deserves further improvement. In this study, we designed and implemented a new prototype for practicing relevant tasks including laryngoscopy, intubation, and cricothyrotomy. A large number of anatomical details and landmarks were meticulously selected and reconstructed from medical scans, and 3D-printed or molded into the airway intervention model. This training model was augmented by virtually and physically presented interactive modules, which are interoperable with motion tracking and sensor data feedback. Implementation results showed that this design is a feasible approach to developing higher-fidelity airway models that can be integrated with mixed reality interfaces.
Innovative Peer Review Model for Rural Physicians: System Design and Implementation
ERIC Educational Resources Information Center
Williams, Josie R.; Mechler, Kathy; Akins, Ralitsa B.
2008-01-01
Context: The peer review process in small rural hospitals is complicated by limited numbers of physicians, conflict of interest, issues related to appropriate utilization of new technology, possibility for conflicting recommendations, and need for external expertise. Purpose: The purpose of this project was to design, test, and implement a virtual…
A systematic approach to embedded biomedical decision making.
Song, Zhe; Ji, Zhongkai; Ma, Jian-Guo; Sputh, Bernhard; Acharya, U Rajendra; Faust, Oliver
2012-11-01
Embedded decision making is a key feature of many biomedical systems. In most cases human life directly depends on correct decisions made by these systems, so they have to work reliably. This paper describes how we applied systems engineering principles to design a high-performance embedded classification system in a systematic and well-structured way. We introduce the structured design approach by discussing requirements capturing, specification refinement, implementation, and testing, following systems engineering principles and executing each of these processes as formally as possible. The requirements, which motivate the system design, describe an automated decision-making system for diagnostic support. These requirements are refined into the implementation of a support vector machine (SVM) algorithm which enables us to integrate automated decision making in embedded systems. With a formal model we establish the functionality, stability, and reliability of the system. Furthermore, we investigated different parallel processing configurations of this computationally complex algorithm and found that, by adding SVM processes, an almost linear speedup is possible. Once we established these system properties, we translated the formal model into an implementation. The resulting implementation was tested on XMOS processors with both normal and failure cases, to build up trust in the implementation. Finally, we demonstrated that our parallel implementation achieves the speedup predicted by the formal model. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
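The near-linear speedup the abstract reports comes from the fact that SVM classification is embarrassingly parallel over samples: the decision function is independent per input, so the stream can be split across SVM processes. The sketch below shows a linear decision function and the data decomposition only; the weights are hypothetical (a real system would load them from an offline training step), and each "process" here is just a function call rather than an XMOS hardware thread.

```python
def svm_decision(w, b, x):
    """Linear SVM decision: sign(w . x + b). The parameters w, b are
    assumed to come from an offline training step (not shown)."""
    s = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if s >= 0 else -1

def classify_in_chunks(w, b, samples, n_workers=4):
    """Split the sample stream across n_workers independent classifier
    'processes' (plain calls here), then reassemble in original order.
    This is the data decomposition behind the near-linear speedup."""
    chunks = [samples[k::n_workers] for k in range(n_workers)]
    results = [[svm_decision(w, b, x) for x in chunk] for chunk in chunks]
    out = [None] * len(samples)
    for k, chunk_res in enumerate(results):
        out[k::n_workers] = chunk_res
    return out

# Hypothetical 2-D classifier separating the half-plane x0 + x1 >= 1
w, b = [1.0, 1.0], -1.0
labels = classify_in_chunks(w, b,
                            [(0.0, 0.0), (2.0, 0.0), (0.3, 0.3), (1.0, 1.0)])
print(labels)   # → [-1, 1, -1, 1]
```

Because each chunk touches disjoint data and shares only the read-only parameters, the formal model can treat the workers as independent processes, which is what makes the speedup prediction tractable.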
Design and Implementation Skills for Social Innovation.
ERIC Educational Resources Information Center
Tornatzky, Louis G.; Fairweather, George W.
New models of research and training combined with dissemination techniques can contribute to relevant social change. The Ecological Psychology Program at Michigan State University, a graduate training program which focuses on model building and implementation research, offers ideas on the plausibility of social programming. The process would…
Supporting Parent Engagement in Programme-Wide Behavioural Intervention Implementation
ERIC Educational Resources Information Center
Cummings, Katrina P.
2017-01-01
Positive behaviour intervention and support (PBIS) models are evolving as an effective means to promote social and emotional competence among young children and address challenging behaviours. This study was designed to gain insights into parental involvement in programme-wide implementation of the "Pyramid" model. Interviews were…
Adaptive User Model for Web-Based Learning Environment.
ERIC Educational Resources Information Center
Garofalakis, John; Sirmakessis, Spiros; Sakkopoulos, Evangelos; Tsakalidis, Athanasios
This paper describes the design of an adaptive user model and its implementation in an advanced Web-based Virtual University environment that encompasses combined and synchronized adaptation between educational material and well-known communication facilities. The Virtual University environment has been implemented to support a postgraduate…
A depictive neural model for the representation of motion verbs.
Rao, Sunil; Aleksander, Igor
2011-11-01
In this paper, we present a depictive neural model for the representation of motion verb semantics in neural models of visual awareness. The problem of modelling motion verb representation is shown to be one of function application, mapping a set of given input variables defining the moving object and the path of motion to a defined output outcome in the motion recognition context. The particular function-applicative implementation and consequent recognition model design presented are seen as arising from a noun-adjective recognition model enabling the recognition of colour adjectives as applied to a set of shapes representing objects to be recognised. The presence of such a function application scheme and a separately implemented position identification and path labelling scheme are accordingly shown to be the primitives required to enable the design and construction of a composite depictive motion verb recognition scheme. Extensions to the presented design to enable the representation of transitive verbs are also discussed.
Formal Specification and Design Techniques for Wireless Sensor and Actuator Networks
Martínez, Diego; González, Apolinar; Blanes, Francisco; Aquino, Raúl; Simo, José; Crespo, Alfons
2011-01-01
A current trend in the development and implementation of industrial applications is to use wireless networks to interconnect the system nodes, mainly to increase application flexibility, reliability and portability, as well as to reduce the implementation cost. However, the nondeterministic and concurrent behavior of distributed systems makes their analysis and design complex, often resulting in less than satisfactory performance in simulation and test bed scenarios, which is caused by using imprecise models to analyze, validate and design these systems. Moreover, there are some simulation platforms that do not support these models. This paper presents a design and validation method for Wireless Sensor and Actuator Networks (WSAN) which is supported by a minimal set of wireless components represented in Colored Petri Nets (CPN). In summary, the model presented allows users to verify the design properties and structural behavior of the system. PMID:22344203
NASA Astrophysics Data System (ADS)
Nakano, Masaru; Kubota, Fumiko; Inamori, Yutaka; Mitsuyuki, Keiji
Manufacturing system designers should concentrate on designing and planning manufacturing systems instead of spending their efforts on creating the simulation models to verify the design. This paper proposes a method and its tool to navigate the designers through the engineering process and generate the simulation model automatically from the design results. The design agent also supports collaborative design projects among different companies or divisions with distributed engineering and distributed simulation techniques. The idea was implemented and applied to a factory planning process.
Implications of Modeling Uncertainty for Water Quality Decision Making
NASA Astrophysics Data System (ADS)
Shabman, L.
2002-05-01
The National Academy of Sciences report "Assessing the TMDL Approach to Water Quality Management" endorsed the watershed-based, ambient-water-quality-focused approach to water quality management called for in the TMDL program. The committee felt that available data and models were adequate to move such a program forward, if the EPA and all stakeholders better understood the nature of the scientific enterprise and its application to the TMDL program. Specifically, the report called for greater acknowledgement of model prediction uncertainty in making and implementing TMDL plans. To assure that such uncertainty was addressed in water quality decision making, the committee called for a commitment to "adaptive implementation" of water quality management plans. The committee found that the number and complexity of the interactions of multiple stressors, combined with model prediction uncertainty, mean that we need to avoid the temptation to make assurances that specific actions will result in attainment of particular water quality standards. Until the work on solving a water quality problem begins, analysts and decision makers cannot be sure what the correct solutions are, or even what water quality goals a community should be seeking. In complex systems we need to act in order to learn; adaptive implementation is a concurrent process of action and learning. Learning requires (1) continued monitoring of the waterbody to determine how it responds to the actions taken and (2) carefully designed experiments in the watershed. If we do not design learning into what we attempt, we are not doing adaptive implementation. Therefore, there needs to be an increased commitment to monitoring and experiments in watersheds that will lead to learning. 
This presentation will (1) explain the logic for adaptive implementation; (2) discuss the ways that water quality modelers could characterize and explain model uncertainty to decision makers; and (3) speculate on the implications of adaptive implementation for the setting of water quality standards, for the design of watershed monitoring programs, and for the regulatory rules governing TMDL program implementation.
NASA Technical Reports Server (NTRS)
Farrara, John D.; Drummond, Leroy A.; Mechoso, Carlos R.; Spahr, Joseph A.
1998-01-01
The design, implementation and performance optimization on the CRAY T3E of an atmospheric general circulation model (AGCM) which includes the transport of, and chemical reactions among, an arbitrary number of constituents is reviewed. The parallel implementation is based on a two-dimensional (longitude and latitude) data domain decomposition. Initial optimization efforts centered on minimizing the impact of substantial static and weakly-dynamic load imbalances among processors through load redistribution schemes. Recent optimization efforts have centered on single-node optimization. Strategies employed include loop unrolling, both manually and through the compiler, the use of an optimized assembler-code library for special function calls, and restructuring of parts of the code to improve data locality. Data exchanges and synchronizations involved in coupling different data-distributed models can account for a significant fraction of the running time. Therefore, the required scattering and gathering of data must be optimized. In systems such as the T3E, there is much more aggregate bandwidth in the total system than in any particular processor. This suggests a distributed design. The design and implementation of such a distributed 'Data Broker' as a means to efficiently couple the components of our climate system model is described.
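The two-dimensional (longitude and latitude) decomposition described above can be sketched as a mapping from processor rank to index ranges. The helper below is illustrative only (the function name, process-grid shape, and grid sizes are assumptions, not the AGCM's actual code):

```python
def decompose_2d(n_lon, n_lat, p_lon, p_lat, rank):
    """Map a processor rank to its (longitude, latitude) index ranges in a
    two-dimensional block decomposition, spreading any remainder cells over
    the first few processors."""
    i, j = rank % p_lon, rank // p_lon  # position in the p_lon x p_lat grid

    def block(n, p, k):
        base, rem = divmod(n, p)
        start = k * base + min(k, rem)
        return start, start + base + (1 if k < rem else 0)

    return block(n_lon, p_lon, i), block(n_lat, p_lat, j)
```

For a 360x180 grid on a 4x2 process grid, rank 0 owns longitudes 0-89 and latitudes 0-89; the eight tiles together cover the grid exactly once.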
NASA Astrophysics Data System (ADS)
Erduran, Sibel
The central problem underlying this dissertation is the design of learning environments that enable the teaching and learning of chemistry through modeling. The significant role of models in chemistry knowledge is highlighted with a shift in emphasis from conceptual to epistemological accounts of models. The research context is the design and implementation of a student-centered Acids & Bases Curriculum, developed as part of Project SEPIA. The qualitative study focused on 3 curriculum activities conducted in one 7th-grade class of 19 students in an urban, public middle school in the eastern United States. The questions guiding the study were: (a) How can learning environments be designed to promote growth of chemistry knowledge through modeling? (b) What epistemological criteria facilitate learning of growth of chemistry knowledge through modeling? Curriculum materials and verbal data from whole-class conversations and student group interviews were analyzed. Group interviews consisted of the same 4 students, selected randomly before curriculum implementation, and were conducted following each activity to investigate students' developing understandings of models. Theoretical categories concerning the definition, properties and kinds of models, as well as educational and chemical models, informed the curriculum design and were redefined as codes in the analysis of verbal data. Results indicate more diversity of codes in student than in teacher talk across all activities. The teacher concentrated on educational and chemical models. A significant finding is that model properties such as 'compositionality' and 'projectability' were not present in teacher talk as expected by the curriculum design. Students did make reference to model properties. Another finding is that students demonstrate an understanding of models characterized by the seventeenth-century Lemery model of acids and bases. 
Two students' developing understandings of models across the curriculum implementation suggest that the curriculum brought about some change in students' understanding of models. The tension between students' everyday knowledge and the teacher's scientific knowledge is highlighted relative to the patterns in codes observed in the data. Implications for theories of learning, curriculum design and teacher education are discussed. It is argued that future educational research should acknowledge and incorporate perspectives from chemical epistemology.
Creating an Electronic Reference and Information Database for Computer-aided ECM Design
NASA Astrophysics Data System (ADS)
Nekhoroshev, M. V.; Pronichev, N. D.; Smirnov, G. V.
2018-01-01
The paper presents a review of electrochemical shaping. An algorithm has been developed to implement a computer shaping model applicable to pulse electrochemical machining. For that purpose, the characteristics of the pulse current occurring in electrochemical machining of aviation materials have been studied. By integrating the experimental results with comprehensive modeling of electrochemical machining process data, a subsystem for computer-aided design of electrochemical machining of gas turbine engine blades has been developed; the subsystem was implemented in the Teamcenter PLM system.
Model-Based Design of Air Traffic Controller-Automation Interaction
NASA Technical Reports Server (NTRS)
Romahn, Stephan; Callantine, Todd J.; Palmer, Everett A.; Null, Cynthia H. (Technical Monitor)
1998-01-01
A model of controller and automation activities was used to design the controller-automation interactions necessary to implement a new terminal area air traffic management concept. The model was then used to design a controller interface that provides the requisite information and functionality. Using data from a preliminary study, the Crew Activity Tracking System (CATS) was used to help validate the model as a computational tool for describing controller performance.
Design and implementation of a random neural network routing engine.
Kocak, T; Seeber, J; Terzioglu, H
2003-01-01
The random neural network (RNN) is an analytically tractable spiked neural network model that has been implemented in software for a wide range of applications for over a decade. This paper presents a hardware implementation of the RNN model. Recently, the cognitive packet network (CPN) was proposed as an alternative packet network architecture with no routing table; instead, RNN-based reinforcement learning is used to route packets. In particular, we describe implementation details for the RNN-based routing engine of a CPN network processor chip: the smart packet processor (SPP). The SPP is a dual-port device that stores, modifies, and interprets the defining characteristics of multiple RNN models. In addition to hardware design improvements over the software implementation, such as the dual-access memory, the output calculation step, and the reduced output calculation module, this paper introduces a major modification to the reinforcement learning algorithm used in the original CPN specification such that the number of weight terms is reduced from 2n^2 to 2n. This not only yields significant memory savings, but also simplifies the calculations for the steady-state probabilities (neuron outputs in the RNN). Simulations have been conducted to confirm proper functionality for the isolated SPP design as well as for multiple SPPs in a networked environment.
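The steady-state probabilities mentioned above satisfy the standard RNN fixed-point equations, q_i = lambda+_i / (r_i + lambda-_i), where the arriving excitatory and inhibitory rates depend on the other neurons' outputs through the weight matrices. A small numpy sketch of that fixed point (illustrative only; it does not reproduce the SPP hardware algorithm or its reduced 2n weight layout):

```python
import numpy as np

def rnn_outputs(W_plus, W_minus, Lam, lam, r, iters=200):
    """Steady-state neuron outputs q of a random neural network via
    fixed-point iteration.  W_plus/W_minus are excitatory/inhibitory
    weights, Lam/lam external excitatory/inhibitory arrival rates,
    r the firing rates."""
    q = np.zeros(len(r))
    for _ in range(iters):
        lam_plus = Lam + q @ W_plus    # total arriving excitatory rate
        lam_minus = lam + q @ W_minus  # total arriving inhibitory rate
        q = np.clip(lam_plus / (r + lam_minus), 0.0, 1.0)  # probabilities stay in [0, 1]
    return q
```

For an isolated neuron with no recurrent weights, the iteration reduces to q = Lam / (r + lam) immediately.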
The implementation of POSTGRES
NASA Technical Reports Server (NTRS)
Stonebraker, Michael; Rowe, Lawrence A.; Hirohama, Michael
1990-01-01
The design and implementation decisions made for the next-generation data manager POSTGRES are discussed. Attention is restricted to the DBMS backend functions. The POSTGRES data model and query language, the rules system, the storage system, the POSTGRES implementation, and the current status and performance are discussed.
AI/OR computational model for integrating qualitative and quantitative design methods
NASA Technical Reports Server (NTRS)
Agogino, Alice M.; Bradley, Stephen R.; Cagan, Jonathan; Jain, Pramod; Michelena, Nestor
1990-01-01
A theoretical framework for integrating qualitative and numerical computational methods for optimally-directed design is described. The theory is presented as a computational model and features of implementations are summarized where appropriate. To demonstrate the versatility of the methodology we focus on four seemingly disparate aspects of the design process and their interaction: (1) conceptual design, (2) qualitative optimal design, (3) design innovation, and (4) numerical global optimization.
Use Of REX Control System For The Ball On Spool Model
NASA Astrophysics Data System (ADS)
Ožana, Štěpán; Pieš, Martin; Hájovský, Radovan; Dočekal, Tomáš
2015-07-01
This paper deals with the design and implementation of linear quadratic controller (LQR) for modeling of Ball on Spool. The paper presents the entire process, starting from mathematical model through control design towards application of controller with the use of given hardware platform. Proposed solution by means of REX Control System provides a high level of user comfort regarding implementation of control loop, diagnostics and automatically generated visualization based on HTML5. It represents an ideal example of a complex nonlinear mechatronic system with a lot of possibilities to apply other types of controllers.
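The LQR design step described above can be illustrated generically: the state-feedback gain K = R^-1 B^T P follows from the solution P of the continuous-time algebraic Riccati equation. This is a sketch of the standard computation only, not the REX Control System implementation, and the ball-on-spool model's actual matrices are not reproduced here:

```python
import numpy as np
from scipy.linalg import solve_continuous_are

def lqr_gain(A, B, Q, R):
    """Continuous-time LQR state-feedback gain K = R^-1 B^T P,
    where P solves the algebraic Riccati equation."""
    P = solve_continuous_are(A, B, Q, R)
    return np.linalg.solve(R, B.T @ P)
```

Applied to a double integrator (A = [[0,1],[0,0]], B = [[0],[1]], Q = I, R = [[1]]), the closed-loop matrix A - B K has all eigenvalues in the left half-plane.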
Design sensitivity analysis and optimization tool (DSO) for sizing design applications
NASA Technical Reports Server (NTRS)
Chang, Kuang-Hua; Choi, Kyung K.; Perng, Jyh-Hwa
1992-01-01
The DSO tool, a structural design software system that provides the designer with a graphics-based menu-driven design environment to perform easy design optimization for general applications, is presented. Three design stages, preprocessing, design sensitivity analysis, and postprocessing, are implemented in the DSO to allow the designer to carry out the design process systematically. A framework, including data base, user interface, foundation class, and remote module, has been designed and implemented to facilitate software development for the DSO. A number of dedicated commercial software/packages have been integrated in the DSO to support the design procedures. Instead of parameterizing an FEM, design parameters are defined on a geometric model associated with physical quantities, and the continuum design sensitivity analysis theory is implemented to compute design sensitivity coefficients using postprocessing data from the analysis codes. A tracked vehicle road wheel is given as a sizing design application to demonstrate the DSO's easy and convenient design optimization process.
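As a rough illustration of the design sensitivity coefficients the DSO computes, a central finite-difference check is the standard way such continuum sensitivity results are verified. The helper below is illustrative only, not the DSO's implementation:

```python
def fd_sensitivity(performance, design, h=1e-6):
    """Central finite-difference design sensitivity coefficients
    d(performance)/d(design_i) for a scalar performance measure of a
    list of design parameters."""
    base = list(design)
    grads = []
    for i in range(len(base)):
        up = base[:]; up[i] += h
        dn = base[:]; dn[i] -= h
        grads.append((performance(up) - performance(dn)) / (2 * h))
    return grads
```

For example, for performance d0^2 + 3*d1 at the design point (2, 1), the coefficients come out close to (4, 3).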
In Search of the Elusive ADDIE Model.
ERIC Educational Resources Information Center
Molenda, Michael
2003-01-01
Discusses the origin of the ADDIE model of instructional design and concludes that the term came into use by word of mouth as a label for the whole family of systematic instructional development models. Examines the underlying ideas behind the acronym analysis, design, development, implementation, and evaluation. (Author/LRW)
Systematic Implementation of a Tier 2 Behavior Intervention
ERIC Educational Resources Information Center
Carter, Deborah Russell; Carter, Gabriel M.; Johnson, Evelyn S.; Pool, Juli L.
2013-01-01
Schools are increasingly adopting tiered models of prevention to meet the needs of diverse populations of students. This article outlines the steps involved in designing and implementing a systematic Tier 2 behavior intervention within a tiered service delivery model. An elementary school example is provided to outline the identification,…
NASA Technical Reports Server (NTRS)
Mielke, R. R.; Tung, L. J.; Carraway, P. I., III
1984-01-01
The feasibility of using reduced order models and reduced order observers with eigenvalue/eigenvector assignment procedures is investigated. A review of spectral assignment synthesis procedures is presented. Then, a reduced order model which retains essential system characteristics is formulated. A constant state feedback matrix which assigns desired closed loop eigenvalues and approximates specified closed loop eigenvectors is calculated for the reduced order model. It is shown that the eigenvalue and eigenvector assignments made in the reduced order system are retained when the feedback matrix is implemented about the full order system. In addition, those modes and associated eigenvectors which are not included in the reduced order model remain unchanged in the closed loop full order system. The full state feedback design is then implemented by using a reduced order observer. It is shown that the eigenvalue and eigenvector assignments of the closed loop full order system remain unchanged when a reduced order observer is used. The design procedure is illustrated by an actual design problem.
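The eigenvalue-assignment step for the reduced order model can be illustrated with a standard pole-placement routine. This is a sketch of eigenvalue assignment only; the paper's eigenvector-approximation procedure and the actual design problem are not reproduced:

```python
import numpy as np
from scipy.signal import place_poles

def reduced_order_feedback(A, B, desired_poles):
    """Constant state feedback gain K such that A - B K has the desired
    closed-loop eigenvalues (illustrative helper name)."""
    return place_poles(A, B, np.asarray(desired_poles)).gain_matrix
```

For a double integrator with desired poles at -1 and -2, the closed-loop eigenvalues of A - B K land at exactly those values.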
NASA Technical Reports Server (NTRS)
Mielke, R. R.; Tung, L. J.; Carraway, P. I., III
1985-01-01
The feasibility of using reduced order models and reduced order observers with eigenvalue/eigenvector assignment procedures is investigated. A review of spectral assignment synthesis procedures is presented. Then, a reduced order model which retains essential system characteristics is formulated. A constant state feedback matrix which assigns desired closed loop eigenvalues and approximates specified closed loop eigenvectors is calculated for the reduced order model. It is shown that the eigenvalue and eigenvector assignments made in the reduced order system are retained when the feedback matrix is implemented about the full order system. In addition, those modes and associated eigenvectors which are not included in the reduced order model remain unchanged in the closed loop full order system. The full state feedback design is then implemented by using a reduced order observer. It is shown that the eigenvalue and eigenvector assignments of the closed loop full order system remain unchanged when a reduced order observer is used. The design procedure is illustrated by an actual design problem.
NASA Technical Reports Server (NTRS)
Ensey, Tyler S.
2013-01-01
During my internship at NASA, I was a model developer for Ground Support Equipment (GSE). The purpose of a model developer is to develop and unit test model component libraries (fluid, electrical, gas, etc.). The models are designed to simulate software for GSE (Ground Special Power, Crew Access Arm, Cryo, Fire and Leak Detection System, Environmental Control System (ECS), etc.) before they are implemented into hardware. These models support verifying local control and remote software for End-Item Software Under Test (SUT). The model simulates the physical behavior (function, state, limits and I/O) of each end-item and its dependencies as defined in the Subsystem Interface Table, Software Requirements & Design Specification (SRDS), Ground Integrated Schematic (GIS), and System Mechanical Schematic (SMS). The software of each specific model component is simulated through MATLAB's Simulink program. The intensive model development life cycle is as follows: identify source documents; identify model scope; update schedule; preliminary design review; develop model requirements; update model scope; update schedule; detailed design review; create/modify library components; implement library component references; implement subsystem components; develop a test script; run the test script; develop a user's guide; send the model out for peer review; the model is sent out for verification/validation; if there is empirical data, a validation data package is generated; if there is not empirical data, a verification package is generated; the test results are then reviewed; and finally, the user requests accreditation, and a statement of accreditation is prepared. Once each component model is reviewed and approved, the models are intertwined into one integrated model. This integrated model is then itself tested, through a test script and autotest, so that it can be concluded that all models work conjointly for a single purpose. 
The component I was assigned, specifically, was a fluid component, a discrete pressure switch. The switch takes a fluid pressure input, and if the pressure is greater than a designated cutoff pressure, the switch would stop fluid flow.
Inverse problems in the design, modeling and testing of engineering systems
NASA Technical Reports Server (NTRS)
Alifanov, Oleg M.
1991-01-01
Formulations, classification, areas of application, and approaches to solving different inverse problems are considered for the design of structures, modeling, and experimental data processing. Problems in the practical implementation of theoretical-experimental methods based on solving inverse problems are analyzed in order to identify mathematical models of physical processes, aid in input data preparation for design parameter optimization, help in design parameter optimization itself, and to model experiments, large-scale tests, and real tests of engineering systems.
Bazos, Dorothy A; Schifferdecker, Karen E; Fedrizzi, Rudolph; Hoebeke, Jaime; Ruggles, Laural; Goldsberry, Yvonne
2013-01-01
Although process elements that define community-based participatory research (CBPR) are well articulated and provide guidance for bringing together researchers and communities, additional models to implement CBPR are needed. One potential model for implementing and monitoring CBPR is Action Learning Collaboratives (ALCs); short term, team-based learning processes that are grounded in quality improvement. Since 2010, the Prevention Research Center at Dartmouth (PRCD) has used ALCs with three communities as a platform to design, implement and evaluate CBPR. The first ALC provided an opportunity for academia and community leadership to strengthen their relationships and knowledge of respective assets through design and evaluation of community-based QI projects. Building on this work, we jointly designed and are implementing a second ALC, a cross-community research project focused on obesity prevention in vulnerable populations. An enhanced community capacity now exists to support CBPR activities with a high degree of sophistication and decreased reliance on external facilitation.
Hierarchical specification of the SIFT fault tolerant flight control system
NASA Technical Reports Server (NTRS)
Melliar-Smith, P. M.; Schwartz, R. L.
1981-01-01
The specification and mechanical verification of the Software Implemented Fault Tolerance (SIFT) flight control system is described. The methodology employed in the verification effort is discussed, and a description of the hierarchical models of the SIFT system is given. To meet the objective of NASA for the reliability of safety critical flight control systems, the SIFT computer must achieve a reliability well beyond the levels at which reliability can be actually measured. The methodology employed to demonstrate rigorously that the SIFT computer meets its reliability requirements is described. The hierarchy of design specifications from very abstract descriptions of system function down to the actual implementation is explained. The most abstract design specifications can be used to verify that the system functions correctly and with the desired reliability, since almost all details of the realization were abstracted out. A succession of lower level models refine these specifications to the level of the actual implementation, and can be used to demonstrate that the implementation has the properties claimed of the abstract design specifications.
Information system and website design to support the automotive manufacture ERP system
NASA Astrophysics Data System (ADS)
Amran, T. G.; Azmi, N.; Surjawati, A. A.
2017-12-01
This research creates an on-time production system design with the Heijunka model so that product diversity across all models can meet time and capacity requirements, provide production flexibility and high quality, meet customers' demands, and remain realistic in production, as well as creating a web-based local-component order information system that supports the Enterprise Resource Planning (ERP) system. The Heijunka leveling model, with heuristic and stochastic components, has been implemented for production runs of up to 3000 units at Suzuki International Manufacturing. Inefficiency in the local order information system demanded a new information system design integrated into the ERP. The kaizen that needs to be done is a supplier network in which all vendors can download and use these data to deliver the components to the company and for vendors' internal use as well. The model design is presumed effective in that it can be used as a solution so that production runs according to schedule, and presumed efficient in that it demonstrates a reduction of lost time and stock.
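Heijunka leveling itself can be sketched with a simple goal-chasing rule: at each production slot, build the model that is furthest behind its ideal cumulative share of the schedule. This is a generic illustration of production leveling, not the paper's heuristic/stochastic model:

```python
def heijunka_sequence(demand):
    """Level (heijunka) a mixed-model production sequence for a
    {model: units} demand dict using a goal-chasing rule."""
    total = sum(demand.values())
    produced = {m: 0 for m in demand}
    seq = []
    for t in range(1, total + 1):
        # shortfall = ideal cumulative output at slot t minus actual output
        m = max(demand, key=lambda k: demand[k] * t / total - produced[k])
        produced[m] += 1
        seq.append(m)
    return seq
```

For equal demand of two models A and B, the rule alternates them (A, B, A, B), spreading each model evenly over the schedule.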
Application of Diagnostic Analysis Tools to the Ares I Thrust Vector Control System
NASA Technical Reports Server (NTRS)
Maul, William A.; Melcher, Kevin J.; Chicatelli, Amy K.; Johnson, Stephen B.
2010-01-01
The NASA Ares I Crew Launch Vehicle is being designed to support missions to the International Space Station (ISS), to the Moon, and beyond. The Ares I is undergoing design and development utilizing commercial-off-the-shelf tools and hardware when applicable, along with cutting-edge launch technologies and state-of-the-art design and development. In support of the vehicle's design and development, the Ares Functional Fault Analysis group was tasked to develop an Ares Vehicle Diagnostic Model (AVDM) and to demonstrate the capability of that model to support failure-related analyses and design integration. One important component of the AVDM is the Upper Stage (US) Thrust Vector Control (TVC) diagnostic model, a representation of the failure space of the US TVC subsystem. This paper first presents an overview of the AVDM, its development approach, and the software used to implement the model and conduct diagnostic analysis. It then uses the US TVC diagnostic model to illustrate details of the development, implementation, analysis, and verification processes. Finally, the paper describes how the AVDM model can impact both design and ground operations, and how some of these impacts are being realized during discussions of US TVC diagnostic analyses with US TVC designers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clausen, Jonathan R.; Brunini, Victor E.; Moffat, Harry K.
We develop a capability to simulate reduction-oxidation (redox) flow batteries in the Sierra Multi-Mechanics code base. Specifically, we focus on all-vanadium redox flow batteries; however, the capability is general in implementation and could be adapted to other chemistries. The electrochemical and porous flow models follow those developed in the recent publication by [28]. We review the model implemented in this work and its assumptions, and we show several verification cases including a binary electrolyte and a battery half-cell. We then compare our model implementation with the experimental results shown in [28], with good agreement seen. Next, a sensitivity study is conducted for the major model parameters, which is beneficial in targeting specific features of the redox flow cell for improvement. Lastly, we simulate a three-dimensional version of the flow cell to determine the impact of plenum channels on the performance of the cell. Such channels are frequently seen in experimental designs where the current collector plates are borrowed from fuel cell designs. These designs use a serpentine channel etched into a solid collector plate.
FPGA-based firmware model for extended measurement systems with data quality monitoring
NASA Astrophysics Data System (ADS)
Wojenski, A.; Pozniak, K. T.; Mazon, D.; Chernyshova, M.
2017-08-01
Modern physics experiments require the construction of advanced, modular measurement systems for data processing and registration purposes. Components are often designed in one of the common mechanical and electrical standards, e.g. VME or uTCA. The paper focuses on measurement systems that use FPGAs as data processing blocks, especially for plasma diagnostics using GEM detectors with data quality monitoring. The article proposes a standardized model of HDL FPGA firmware implementation, for use in a wide range of different measurement systems. Effort was made toward flexible implementation of data quality monitoring along with dynamic selection of source data. The paper discusses a standard measurement system model, followed by a detailed model of FPGA firmware for modular measurement systems. Both functional blocks and data buses are considered. In the summary, the necessary blocks and signal lines are described. Firmware implemented following the presented rules should yield a modular design in which different parts are easy to change. The key benefit is the construction of a universal, modular HDL design that can be applied in different measurement systems with simple adjustments.
Implementing Project Based Learning Approach to Graphic Design Course
ERIC Educational Resources Information Center
Riyanti, Menul Teguh; Erwin, Tuti Nuriah; Suriani, S. H.
2017-01-01
The purpose of this study was to develop a learning model based Commercial Graphic Design Drafting project-based learning approach, was chosen as a strategy in the learning product development research. University students as the target audience of this model are the students of the fifth semester Visual Communications Design Studies Program…
Integrating Technology into Classroom: The Learner-Centered Instructional Design
ERIC Educational Resources Information Center
Sezer, Baris; Karaoglan Yilmaz, Fatma Gizem; Yilmaz, Ramazan
2013-01-01
The aim of this study is to present an instructional model that considers existing models of instructional design (ARCS, ADDIE, ASSURE, Dick and Carey, Seels and Glasgow, Smith and Ragan, etc.) together with the nature of technology-based education, and to reveal the analysis, design, development, implementation, and evaluation levels along with their lower levels of…
Lee, Heewon; Contento, Isobel R; Koch, Pamela
2013-03-01
To use and review a conceptual model of process evaluation and to examine the implementation of a nutrition education curriculum, Choice, Control & Change, designed to promote dietary and physical activity behaviors that reduce obesity risk. A process evaluation study based on a systematic conceptual model. Five middle schools in New York City. Five hundred sixty-two students in 20 classes and their science teachers (n = 8). Based on the model, teacher professional development, teacher implementation, and student reception were evaluated. Also measured were teacher characteristics, teachers' curriculum evaluation, and satisfaction with teaching the curriculum. Descriptive statistics and Spearman ρ correlation for quantitative analysis and content analysis for qualitative data were used. Mean score of the teacher professional development evaluation was 4.75 on a 5-point scale. Average teacher implementation rate was 73%, and the student reception rate was 69%. Ongoing teacher support was highly valued by teachers. Teacher satisfaction with teaching the curriculum was highly correlated with student satisfaction (P < .05). Teacher perception of amount of student work was negatively correlated with implementation and with student satisfaction (P < .05). Use of a systematic conceptual model and comprehensive process measures improves understanding of the implementation process and helps educators to better implement interventions as designed. Copyright © 2013 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.
Three axis electronic flight motion simulator real time control system design and implementation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gao, Zhiyuan; Miao, Zhonghua, E-mail: zhonghua-miao@163.com; Wang, Xiaohua
2014-12-15
A three-axis electronic flight motion simulator is reported in this paper, including the modelling, the controller design, and the hardware implementation. The flight motion simulator can be used to test inertial navigation systems, including high-precision inertial navigation systems, with good dynamic and static performance. A real-time control system was designed, and several implementation problems were solved, including time unification via parallel-port interrupts, a high-speed zero-finding method for the rotary inductosyn, and zero-crossing management during continuous rotation. Tests were carried out to show the effectiveness of the proposed real-time control system.
Randomized Controlled Trials in Music Therapy: Guidelines for Design and Implementation.
Bradt, Joke
2012-01-01
Evidence from randomized controlled trials (RCTs) plays a powerful role in today's healthcare industry. At the same time, it is important that multiple types of evidence contribute to music therapy's knowledge base and that the dialogue of clinical effectiveness in music therapy is not dominated by the biomedical hierarchical model of evidence-based practice. Whether or not one agrees with the hierarchical model of evidence in the current healthcare climate, RCTs can contribute important knowledge to our field. Therefore, it is important that music therapists are prepared to design trials that meet current methodological standards and, equally important, are able to respond appropriately to those design aspects that may not be feasible in music therapy research. The purpose of this article is to provide practical guidelines to music therapy researchers for the design and implementation of RCTs, as well as to enable music therapists to be well-informed consumers of RCT evidence. This article reviews key design aspects of RCTs and discusses how best to implement these standards in music therapy trials. A systematic presentation of basic randomization methods, allocation concealment strategies, issues related to blinding in music therapy trials and strategies for implementation, the use of treatment manuals, types of control groups, outcome selection, and sample size computation is provided. Despite the challenges of meeting all key design demands typical of an RCT, it is possible to design rigorous music therapy RCTs that accurately estimate music therapy treatment benefits.
Design and implementation of ticket price forecasting system
NASA Astrophysics Data System (ADS)
Li, Yuling; Li, Zhichao
2018-05-01
With the growth of the air travel industry, a large number of data mining technologies have been developed over the past two decades to increase profits for airlines. The implementation of digital optimization strategies leads to price discrimination; for example, similar seats on the same flight are sold at different prices depending on the time of purchase, the supplier, and so on. These price fluctuations give ticket price prediction practical value. In this paper, a combination of the ARMA algorithm and the random forest algorithm is proposed to predict ticket prices. The experimental results show that the model is reliable, as demonstrated by comparing the forecasts of each price model with actual prices. The model can help passengers decide when to buy tickets and save money. Based on the proposed model, the ticket price forecasting system is designed and implemented using the Python language and a SQL Server database.
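One plausible way to combine ARMA-style and random forest models, in the spirit of the abstract, is sketched below. The paper does not specify its combination scheme, so this is a hedged illustration: an AR(2) least-squares fit captures the linear dynamics of a synthetic price series, and a random forest models the residuals from an exogenous feature (days before departure, an assumed feature).

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
days_before = np.arange(60, 0, -1)                  # 60 daily observations
price = 300 + 2.0 * (30 - days_before) ** 2 / 30 + rng.normal(0, 5, 60)

# --- AR(2) component fit by least squares ---
X_ar = np.column_stack([np.ones(58), price[1:-1], price[:-2]])  # lags 1, 2
y_ar = price[2:]
coef, *_ = np.linalg.lstsq(X_ar, y_ar, rcond=None)
ar_pred = X_ar @ coef
resid = y_ar - ar_pred

# --- random forest fit on the AR residuals ---
rf = RandomForestRegressor(n_estimators=50, random_state=0)
rf.fit(days_before[2:].reshape(-1, 1), resid)

# combined in-sample prediction: linear dynamics + learned residual pattern
combined = ar_pred + rf.predict(days_before[2:].reshape(-1, 1))
rmse = float(np.sqrt(np.mean((combined - y_ar) ** 2)))
```

A real system would of course evaluate out-of-sample forecasts; the sketch only shows how the two models can be chained.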
ERIC Educational Resources Information Center
Blevins, Samantha; Brill, Jennifer
2017-01-01
Drawing from a design and development research approach, specifically model research, this study investigated the perspectives of higher education faculty and administrators regarding their experiences with a university-wide electronic portfolio implementation initiative. Participants in the study were fifty-two faculty and administrators at a…
ERIC Educational Resources Information Center
Rimando, Marylen; Smalley, K. Bryant; Warren, Jacob C.
2015-01-01
This article describes the design, implementation and lessons learned from a digital storytelling project in a health promotion theory course. From 2011-2012, 195 health promotion majors completed a digital storytelling project at a Midwestern university. The instructor observed students' understanding of theories and models. This article adds to…
ERIC Educational Resources Information Center
Adams, Mark Thomas
2013-01-01
This qualitative study investigated the nature of the relationship between principal leadership and school culture within a school-wide implementation of Professional Crisis Management (PCM). PCM is a comprehensive and fully integrated system designed to manage crisis situations effectively, safely, and with dignity. While designed primarily to…
Improving Treatment Plan Implementation in Schools: A Meta-Analysis of Single Subject Design Studies
ERIC Educational Resources Information Center
Noell, George H.; Gansle, Kristin A.; Mevers, Joanna Lomas; Knox, R. Maria; Mintz, Joslyn Cynkus; Dahir, Amanda
2014-01-01
Twenty-nine peer-reviewed journal articles that analyzed intervention implementation in schools using single-case experimental designs were meta-analyzed. These studies reported 171 separate data paths and provided 3,991 data points. The meta-analysis was accomplished by fitting data extracted from graphs in mixed linear growth models. This…
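The authors fit mixed linear growth models to data points extracted from graphs. As a simplified stand-in (not the authors' estimator), the sketch below fits a linear growth slope per data path by ordinary least squares and pools the slopes with inverse-variance weights, a common two-stage approximation to a mixed model.

```python
import numpy as np

def pooled_slope(paths):
    """paths: list of (time, implementation-level) array pairs, one per data path."""
    slopes, weights = [], []
    for t, y in paths:
        n = len(t)
        b, a = np.polyfit(t, y, 1)                 # OLS slope and intercept
        resid = y - (a + b * t)
        s2 = resid @ resid / (n - 2)               # residual variance
        var_b = s2 / np.sum((t - t.mean()) ** 2)   # sampling variance of slope
        slopes.append(b)
        weights.append(1.0 / max(var_b, 1e-12))    # inverse-variance weight
    slopes, weights = np.array(slopes), np.array(weights)
    return float(np.sum(weights * slopes) / np.sum(weights))

t = np.arange(6.0)
paths = [(t, 80 - 2.0 * t),
         (t, 90 - 4.0 * t + np.array([0.0, 1, -1, 0, 1, -1]))]
overall = pooled_slope(paths)   # negative: implementation decays over time
```

A true mixed model would estimate between-path variance jointly rather than in two stages, but the weighting logic is the same.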
1990-11-01
to design and implement an adaptive intelligent interface for a command-and-control-style domain. The primary functionality of the resulting… technical tasks, as follows: 1. Analysis of Current Interface Technologies; 2. Delineation of User Roles; 3. Development of User Models; 4. Design of Interface… the Federal Emergency Management Agency (FEMA). In the initial version of the prototype, two distinct user models were designed. One type of user modeled by the system is
Design and Implementation of Harmful Algal Bloom Diagnosis System Based on J2EE Platform
NASA Astrophysics Data System (ADS)
Guo, Chunfeng; Zheng, Haiyong; Ji, Guangrong; Lv, Liang
Because traditional HAB (Harmful Algal Bloom) diagnosis by experienced experts using a microscope is time consuming and laborious, various methods and technologies for identifying HAB have emerged, such as microscopic imaging, molecular biology, pigment characteristic analysis, fluorescence spectra, and inherent optical properties. This paper presents the design and implementation of a web-based diagnosis system that integrates the popular HAB identification methods. The system is designed on the J2EE platform following the MVC (Model-View-Controller) pattern, using technologies such as JSP, Servlets, EJB, and JDBC.
Parametric Design within an Atomic Design Process (ADP) applied to Spacecraft Design
NASA Astrophysics Data System (ADS)
Ramos Alarcon, Rafael
This thesis describes research investigating the development of a model for the initial design of complex systems, with application to spacecraft design. The design model is called an atomic design process (ADP) and contains four fundamental stages (specifications, configurations, trade studies and drivers) that constitute the minimum steps of an iterative process that helps designers find a feasible solution. Representative design models from the aerospace industry are reviewed and compared with the proposed model. The design model's relevance, adaptability and scalability features are evaluated through a focused design task exercise with two undergraduate teams and a long-term design exercise performed by a spacecraft payload team. The implementation of the design model is explained in the context in which the model has been researched. This context includes the organization (a student-run research laboratory at the University of Michigan), its culture (academically oriented), members that have used the design model and the description of the information technology elements meant to provide support while using the model. This support includes a custom-built information management system that consolidates relevant information currently used in the organization. The information is divided into three domains: personnel development history, technical knowledge base and laboratory operations. The focused study with teams making use of the design model to complete an engineering design exercise consists of the conceptual design of an autonomous system, including a carrier and a deployable lander that form the payload of a rocket with an altitude range of over 1000 meters. Detailed results from each of the stages of the design process while implementing the model are presented, and an increase in awareness of good design practices in the teams while using the model is explained.
A long-term investigation using the design model, consisting of the successful characterization of an imaging system for a spacecraft, is presented. The spacecraft is designed to take digital color images from low Earth orbit. The dominant drivers from each stage of the design process are indicated as they were identified, with the accompanying hardware development leading to the final configuration that comprises the flight spacecraft.
Smeltzer, Matthew P.; Rugless, Fedoria E.; Jackson, Bianca M.; Berryman, Courtney L.; Faris, Nicholas R.; Ray, Meredith A.; Meadows, Meghan; Patel, Anita A.; Roark, Kristina S.; Kedia, Satish K.; DeBon, Margaret M.; Crossley, Fayre J.; Oliver, Georgia; McHugh, Laura M.; Hastings, Willeen; Osborne, Orion; Osborne, Jackie; Ill, Toni; Ill, Mark; Jones, Wynett; Lee, Hyo K.; Signore, Raymond S.; Fox, Roy C.; Li, Jingshan; Robbins, Edward T.; Ward, Kenneth D.; Klesges, Lisa M.
2018-01-01
Background Responsible for 25% of all US cancer deaths, lung cancer presents complex care-delivery challenges. Adoption of the highly recommended multidisciplinary care model suffers from a dearth of good quality evidence. Leading up to a prospective comparative-effectiveness study of multidisciplinary vs. serial care, we studied the implementation of a rigorously benchmarked multidisciplinary lung cancer clinic. Methods We used a mixed-methods approach to conduct a patient-centered, combined implementation and effectiveness study of a multidisciplinary model of lung cancer care. We established a co-located multidisciplinary clinic to study the implementation of this care-delivery model. We identified and engaged key stakeholders from the onset, used their input to develop the program structure, processes, performance benchmarks, and study endpoints (outcome-related process measures, patient- and caregiver-reported outcomes, survival). In this report, we describe the study design, process of implementation, comparative populations, and how they contrast with patients within the local and regional healthcare system. Trial Registration: ClinicalTrials.gov Identifier: NCT02123797. Results Implementation: the multidisciplinary clinic obtained an overall treatment concordance rate of 90% (target >85%). Satisfaction scores were high, with >95% of patients and caregivers rating themselves as being “very satisfied” with all aspects of care from the multidisciplinary team (patient/caregiver response rate >90%). The Reach of the multidisciplinary clinic included a higher proportion of minority patients, more women, and younger patients than the regional population. Comparative effectiveness: The comparative effectiveness trial conducted in the last phase of the study met the planned enrollment per statistical design, with 178 patients in the multidisciplinary arm and 348 in the serial care arm. 
The multidisciplinary cohort had older age and a higher percentage of racial minorities, with a higher proportion of stage IV patients in the serial care arm. Conclusions This study demonstrates a comprehensive implementation of a multidisciplinary model of lung cancer care, which will advance the science behind implementing this much-advocated clinical care model. PMID:29535915
Patel, Sapana R; Margolies, Paul J; Covell, Nancy H; Lipscomb, Cristine; Dixon, Lisa B
2018-01-01
Implementation science lacks a systematic approach to the development of learning strategies for online training in evidence-based practices (EBPs) that takes the context of real-world practice into account. The field of instructional design offers ecologically valid and systematic processes to develop learning strategies for workforce development and performance support. This report describes the application of an instructional design framework-Analyze, Design, Develop, Implement, and Evaluate (ADDIE) model-in the development and evaluation of e-learning modules as one strategy among a multifaceted approach to the implementation of individual placement and support (IPS), a model of supported employment for community behavioral health treatment programs, in New York State. We applied quantitative and qualitative methods to develop and evaluate three IPS e-learning modules. Throughout the ADDIE process, we conducted formative and summative evaluations and identified determinants of implementation using the Consolidated Framework for Implementation Research (CFIR). Formative evaluations consisted of qualitative feedback received from recipients and providers during early pilot work. The summative evaluation consisted of levels 1 and 2 (reaction to the training, self-reported knowledge, and practice change) quantitative and qualitative data and was guided by the Kirkpatrick model for training evaluation. Formative evaluation with key stakeholders identified a range of learning needs that informed the development of a pilot training program in IPS. Feedback on this pilot training program informed the design document of three e-learning modules on IPS: Introduction to IPS, IPS Job development, and Using the IPS Employment Resource Book . Each module was developed iteratively and provided an assessment of learning needs that informed successive modules. All modules were disseminated and evaluated through a learning management system. 
Summative evaluation revealed that learners rated the modules positively, and self-report of knowledge acquisition was high (mean range: 4.4-4.6 out of 5). About half of learners indicated that they would change their practice after watching the modules (range: 48-51%). All learners who completed the level 1 evaluation demonstrated 80% or better mastery of knowledge on the level 2 evaluation embedded in each module. The CFIR was used to identify implementation barriers and facilitators among the evaluation data which facilitated planning for subsequent implementation support activities in the IPS initiative. Instructional design approaches such as ADDIE may offer implementation scientists and practitioners a flexible and systematic approach for the development of e-learning modules as a single component or one strategy in a multifaceted approach for training in EBPs.
Okeibunor, Joseph; Bump, Jesse; Zouré, Honorat G M; Sékétéli, Azodoga; Godin, Christine; Amazigo, Uche V
2012-01-01
Onchocerciasis is controlled by mass treatment of at-risk populations with ivermectin, delivered through the community-directed treatment with ivermectin (CDTI) approach. A model has been developed to evaluate the sustainability of the approach and has been tested at 35 projects in 10 countries of the African Program for Onchocerciasis Control (APOC). It incorporates quantitative and qualitative data collection and analysis, taking account of two factors identified as crucial to project sustainability: (i) the provision of project performance information to partners, and (ii) evidence-based support for project implementation. The model is designed to provide critical indicators of project performance to implementing, coordinating, and funding partners. Its participatory and flexible nature makes it culturally sensitive and usable by project management. The model is able to analyze the different levels involved in project implementation and arrive at a judgment for the whole project, and it has inbuilt mechanisms for ensuring data reliability and validity. It addresses the complex issue of sustainability with a cross-sectional design focusing on how, and at which operational level of implementation, to strengthen a CDTI project. The unique attributes and limitations of the model for evaluating the sustainability of projects are described. Copyright © 2012 John Wiley & Sons, Ltd.
ERIC Educational Resources Information Center
Nazari, Mohammad Ali; Perrier, Pascal; Payan, Yohan
2013-01-01
Purpose: The authors aimed to design a distributed lambda model (DLM), which is well adapted to implement three-dimensional (3-D), finite-element descriptions of muscles. Method: A muscle element model was designed. Its stress-strain relationships included the active force-length characteristics of the λ model along the muscle fibers, together…
NASA Astrophysics Data System (ADS)
McMackin, Lenore; Herman, Matthew A.; Weston, Tyler
2016-02-01
We present the design of a multi-spectral imager built using the architecture of the single-pixel camera. The architecture is enabled by the novel sampling theory of compressive sensing implemented optically using the Texas Instruments DLP™ micro-mirror array. The array not only implements spatial modulation necessary for compressive imaging but also provides unique diffractive spectral features that result in a multi-spectral, high-spatial resolution imager design. The new camera design provides multi-spectral imagery in a wavelength range that extends from the visible to the shortwave infrared without reduction in spatial resolution. In addition to the compressive imaging spectrometer design, we present a diffractive model of the architecture that allows us to predict a variety of detailed functional spatial and spectral design features. We present modeling results, architectural design and experimental results that prove the concept.
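The compressive-sensing principle underlying the single-pixel architecture can be illustrated with a small numerical sketch: the micro-mirror array applies measurement patterns (rows of a matrix A), and a sparse scene is recovered from far fewer measurements than pixels. Orthogonal matching pursuit is used below purely for simplicity; it is an assumed reconstruction method, not the camera's actual pipeline.

```python
import numpy as np

def omp(A, y, k):
    """Recover a k-sparse x from y = A @ x by orthogonal matching pursuit."""
    support = []
    r = y.copy()
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ r))))  # most correlated atom
        As = A[:, support]
        coef, *_ = np.linalg.lstsq(As, y, rcond=None)    # LS fit on support
        r = y - As @ coef                                # update residual
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(1)
n, m, k = 128, 60, 5                       # scene size, measurements, sparsity
A = rng.normal(size=(m, n)) / np.sqrt(m)   # random measurement patterns
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = 5.0
y = A @ x_true                             # m compressive measurements
x_hat = omp(A, y, k)
err = float(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

Here 60 scalar measurements recover a 128-pixel scene exactly because the scene is 5-sparse; the real imager exploits the same undersampling with optically implemented patterns.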
Hospital information system: reusability, designing, modelling, recommendations for implementing.
Huet, B
1998-01-01
The aims of this paper are to specify some essential conditions for building reusable models for hospital information systems (HIS) and to present an application for hospital clinical laboratories. Reusability is a general trend in software; however, reuse can involve a greater or lesser part of the design, classes, and programs, so a project involving reusability must be precisely defined. The introduction reviews trends in software, the stakes of reusable models for HIS, and the special use case constituted by a HIS. The three main parts of this paper are: 1) designing a reuse model (which objects are common to several information systems?); 2) a reuse model for hospital clinical laboratories (a gen-spec object model is presented for all laboratories: biochemistry, bacteriology, parasitology, pharmacology, ...); 3) recommendations for generating plug-compatible software components (a reuse model can be implemented as a framework; concrete factors that increase reusability are presented). In conclusion, reusability is a subtle exercise for which the project must be defined beforehand and with care.
Model for the design of distributed data bases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ram, S.
This research focuses on developing a model to solve the File Allocation Problem (FAP). The model integrates two major design issues, namely concurrency control and data distribution. A central-node locking mechanism is incorporated in developing a nonlinear integer programming model. Two solution algorithms are proposed, one of which was implemented in FORTRAN V. The allocation of databases and programs is examined using this heuristic, and several decision rules were formulated based on its results. A second, more comprehensive heuristic based on the knapsack problem was proposed; its development and implementation have been left as a topic for future research.
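The abstract's second heuristic is based on the knapsack problem. As a hedged illustration (the paper's exact formulation is not given), suppose each candidate file has a storage cost (weight) and a benefit from local access (value); choosing which files to place at a node under a storage capacity is then a 0/1 knapsack, solvable by dynamic programming:

```python
def allocate(files, capacity):
    """files: list of (name, size, benefit); returns (best_benefit, chosen names)."""
    # dp[c] = (best benefit achievable with capacity c, files chosen)
    dp = [(0, [])] * (capacity + 1)
    for name, size, benefit in files:
        new_dp = dp[:]                      # copy so each file is used at most once
        for c in range(size, capacity + 1):
            cand = dp[c - size][0] + benefit
            if cand > new_dp[c][0]:
                new_dp[c] = (cand, dp[c - size][1] + [name])
        dp = new_dp
    return dp[capacity]

# Hypothetical files at one node with 9 units of storage
files = [("orders", 4, 40), ("inventory", 3, 30), ("logs", 5, 35), ("users", 2, 24)]
best, chosen = allocate(files, capacity=9)   # picks orders + inventory + users
```

A full FAP heuristic would repeat such a step per node while accounting for cross-node query traffic, which is where the nonlinear terms in the paper's programming model arise.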
DOT National Transportation Integrated Search
2014-11-01
The main objective of Part 3 was to locally calibrate and validate the mechanistic-empirical pavement design guide (Pavement-ME) performance models to Michigan conditions. The local calibration of the performance models in the Pavement-ME is a ch...
2014-01-01
Background The HIV/AIDS epidemic continues to disproportionately affect African American communities in the US, particularly those located in urban areas. Despite the fact that HIV is often transmitted from one sexual partner to another, most HIV prevention interventions have focused only on individuals, rather than couples. This five-year study investigates community-based implementation, effectiveness, and sustainability of ‘Eban II,’ an evidence-based risk reduction intervention for African-American heterosexual, serodiscordant couples. Methods/design This hybrid implementation/effectiveness study is guided by organizational change theory as conceptualized in the Texas Christian University Program Change Model (PCM), a model of phased organizational change from exposure to adoption, implementation, and sustainability. The primary implementation aims are to assist 10 community-based organizations (CBOs) to implement and sustain Eban II; specifically, to partner with CBOs to expose providers to the intervention; facilitate its adoption, implementation and sustainment; and to evaluate processes and determinants of implementation, effectiveness, fidelity, and sustainment. The primary effectiveness aim is to evaluate the effect of Eban II on participant (n = 200 couples) outcomes, specifically incidents of protected sex and proportion of condom use. We will also determine the cost-effectiveness of implementation, as measured by implementation costs and potential cost savings. A mixed methods evaluation will examine implementation at the agency level; staff members from the CBOs will complete baseline measures of organizational context and climate, while key stakeholders will be interviewed periodically throughout implementation. Effectiveness of Eban II will be assessed using a randomized delayed enrollment (waitlist) control design to evaluate the impact of treatment on outcomes at posttest and three-month follow-up.
Multi-level hierarchical modeling with a multi-level nested structure will be used to evaluate the effects of agency- and couples-level characteristics on couples-level outcomes (e.g., condom use). Discussion This study will produce important information regarding the value of the Eban II program and a theory-guided implementation process and tools designed for use in implementing Eban II and other evidence-based programs in demographically diverse, resource-constrained treatment settings. Trial registration NCT00644163 PMID:24950708
NASA Astrophysics Data System (ADS)
Lee, Kwangkook; Jeong, Mijin; Kim, Dong Hun
2017-12-01
Because an unmanned semi-submersible is mainly used to carry out dangerous missions at sea, it can work in regions that are difficult to access for safety reasons. In this study, the USV hull design was determined using the Myring hull profile, and the hull was reinforced by designing and implementing inner stiffener members for 3D printing. To simulate sea state 5.0 or higher, which is difficult to reproduce in practice, regular and irregular wave equations were implemented in Matlab/Simulink. Modeling and simulation of the semi-submersible were performed with DMWorks, considering the rolling motion in waves. To verify and correct unpredicted errors, numeric and physical simulation models of the USV were implemented based on the software-in-the-loop (SIL) method. This simulation allows shipbuilders to participate in new value-added markets such as engineering, procurement, construction, installation, commissioning, operation, and maintenance for USVs.
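The Myring hull profile mentioned above is defined by closed-form radius equations commonly used for streamlined AUV/USV hulls: a power-law nose of length a, a parallel midbody of length b at maximum diameter d, and a cubic tail of length c with exit semi-angle theta. The dimensions below are illustrative, not the paper's; the equations follow the standard Myring formulation.

```python
import math

def myring_radius(x, a, b, c, d, n, theta):
    """Hull radius at axial station x (bow at x = 0)."""
    if x < a:                                   # power-law nose section
        return 0.5 * d * (1.0 - ((x - a) / a) ** 2) ** (1.0 / n)
    if x < a + b:                               # parallel midbody
        return 0.5 * d
    xi = x - (a + b)                            # cubic tail section
    return (0.5 * d
            - (3 * d / (2 * c ** 2) - math.tan(theta) / c) * xi ** 2
            + (d / c ** 3 - math.tan(theta) / c ** 2) * xi ** 3)

# Illustrative dimensions (meters / radians)
a, b, c, d, n, theta = 0.2, 0.6, 0.3, 0.15, 2.0, math.radians(25)
r_bow = myring_radius(0.0, a, b, c, d, n, theta)        # 0 at the bow tip
r_mid = myring_radius(0.5, a, b, c, d, n, theta)        # d/2 on the midbody
r_stern = myring_radius(a + b + c, a, b, c, d, n, theta)  # 0 at the stern
```

The exponent n controls nose bluntness (n = 2 gives an ellipse) and theta sets the tail slope, which is why the profile is a convenient two-parameter family for hull trade studies.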
Programming model for distributed intelligent systems
NASA Technical Reports Server (NTRS)
Sztipanovits, J.; Biegl, C.; Karsai, G.; Bogunovic, N.; Purves, B.; Williams, R.; Christiansen, T.
1988-01-01
A programming model and architecture which was developed for the design and implementation of complex, heterogeneous measurement and control systems is described. The Multigraph Architecture integrates artificial intelligence techniques with conventional software technologies, offers a unified framework for distributed and shared memory based parallel computational models and supports multiple programming paradigms. The system can be implemented on different hardware architectures and can be adapted to strongly different applications.
Time-Centric Models For Designing Embedded Cyber-physical Systems
2009-10-09
Eidson, John C.; Lee, Edward A.; Matic, Slobodan; Seshia, Sanjit A.; Zou, Jia
In implementations, such a uniform notion of time cannot be precisely realized. Time-triggered networks [10] and time synchronization [9] can be used to
JETSPIN: A specific-purpose open-source software for simulations of nanofiber electrospinning
NASA Astrophysics Data System (ADS)
Lauricella, Marco; Pontrelli, Giuseppe; Coluzza, Ivan; Pisignano, Dario; Succi, Sauro
2015-12-01
We present the open-source computer program JETSPIN, specifically designed to simulate the electrospinning process of nanofibers. Its capabilities are shown with proper reference to the underlying model, as well as a description of the relevant input variables and associated test-case simulations. The various interactions included in the electrospinning model implemented in JETSPIN are discussed in detail. The code is designed to exploit different computational architectures, from single to parallel processor workstations. This paper provides an overview of JETSPIN, focusing primarily on its structure, parallel implementations, functionality, performance, and availability.
DRFM Cordic Processor and Sea Clutter Modeling for Enhancing Structured False Target Synthesis
2017-09-01
The design achieves phase accuracy of 5.625° and was implemented using the Verilog hardware description language. The second investigation concerns generating sea clutter to impose on the false target…
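The CORDIC technique named in the title can be sketched in a few lines. The Python model below is illustrative only (the thesis's design is in Verilog with fixed-point arithmetic): rotation-mode CORDIC drives the residual angle to zero using only shifts and adds of atan(2^-i) micro-rotations, yielding sine and cosine; the 5.625° figure in the abstract corresponds to a phase quantization of 360/64 degrees.

```python
import math

def cordic_cos_sin(angle_rad, iterations=20):
    """Compute (cos, sin) of an angle in [-pi/2, pi/2] via CORDIC rotations."""
    # Pre-computed elementary rotation angles and the constant CORDIC gain
    angles = [math.atan(2.0 ** -i) for i in range(iterations)]
    gain = 1.0
    for i in range(iterations):
        gain *= 1.0 / math.sqrt(1.0 + 2.0 ** (-2 * i))
    x, y, z = 1.0, 0.0, angle_rad
    for i in range(iterations):
        d = 1.0 if z >= 0 else -1.0          # rotate toward zero residual angle
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * angles[i]
    return x * gain, y * gain                # de-scale by the accumulated gain

c, s = cordic_cos_sin(math.radians(30))
```

In hardware the gain correction is typically folded into the initial value of x, so the iteration loop needs no multipliers at all.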
ERIC Educational Resources Information Center
Shiffman, Catherine Dunn
2015-01-01
This paper proposes a framework for analyzing program design features that seem to matter in implementation. The framework is based on findings from a study conducted by the Consortium for Policy Research in Education (CPRE) between 2004 and 2007 that explored how reform ideas and practices created by five external provider organizations were…
ERIC Educational Resources Information Center
Wooden, Cherie L.; Anderson, Frances R.
2012-01-01
Engaging and supporting parents to provide sexuality education to their children is successful when parents take ownership of the intervention. The purpose of this article is to illustrate the lessons learned from implementing a parent-designed, parent-led sexuality education curriculum for parents of preteens (10-14 year olds). The parents…
Ultra low power CMOS technology
NASA Technical Reports Server (NTRS)
Burr, J.; Peterson, A.
1991-01-01
This paper discusses the motivation, opportunities, and problems associated with implementing digital logic at very low voltages, including the challenge of making use of the available real estate in 3D multichip modules, energy requirements of very large neural networks, energy optimization metrics and their impact on system design, modeling problems, circuit design constraints, possible fabrication process modifications to improve performance, and barriers to practical implementation.
Working Group 1: Software System Design and Implementation for Environmental Modeling
ISCMEM Working Group One presentation, with the purpose of fostering the exchange of information about environmental modeling tools, modeling frameworks, and environmental monitoring databases.
Utilization of building information modeling in infrastructure’s design and construction
NASA Astrophysics Data System (ADS)
Zak, Josef; Macadam, Helen
2017-09-01
Building Information Modeling (BIM) is a concept that has gained its place in the design, construction, and maintenance of buildings in the Czech Republic in recent years. This paper describes the usage, applications, and potential benefits and disadvantages of implementing BIM principles in the preparation and construction of infrastructure projects. Part of the paper describes the status of BIM implementation in the Czech Republic, with a review of several virtual design and construction practices there. Examples of best practice are presented from current infrastructure projects. The paper further summarizes experience with new technologies gained from applying BIM-related workflows, focusing on the use of the BIM model for on-site machine control systems, quality assurance, quality management, and construction management.
A procedural model for planning and evaluating behavioral interventions.
Hyner, G C
2005-01-01
A model for planning, implementing and evaluating health behavior change strategies is proposed. Variables are presented which can be used in the model or serve as examples for how the model is utilized once a theory of health behavior is adopted. Examples of three innovative strategies designed to influence behavior change are presented so that the proposed model can be modified for use following comprehensive screening and baseline measurements. Three measurement priorities: clients, methods and agency are subjected to three phases of assessment: goals, implementation and effects. Lifestyles account for the majority of variability in quality-of-life and premature morbidity and mortality. Interventions designed to influence healthy behavior changes must be driven by theory and carefully planned and evaluated. The proposed model is offered as a useful tool for the behavior change strategist.
Model-Driven Engineering of Machine Executable Code
NASA Astrophysics Data System (ADS)
Eichberg, Michael; Monperrus, Martin; Kloppenburg, Sven; Mezini, Mira
Implementing static analyses of machine-level executable code is labor-intensive and complex. We show how to leverage model-driven engineering to facilitate the design and implementation of programs that perform static analyses. Further, we report important lessons learned on the benefits and drawbacks of the following technologies: using the Scala programming language as the target of code generation, using XML Schema to express a metamodel, and using XSLT to implement (a) transformations and (b) a lint-like tool. Finally, we report on the use of Prolog for writing model transformations.
An ontology-based semantic configuration approach to constructing Data as a Service for enterprises
NASA Astrophysics Data System (ADS)
Cai, Hongming; Xie, Cheng; Jiang, Lihong; Fang, Lu; Huang, Chenxi
2016-03-01
To align business strategies with IT systems, enterprises should rapidly implement new applications based on existing information with complex associations to adapt to the continually changing external business environment. Thus, Data as a Service (DaaS) has become an enabling technology for enterprises through information integration and the configuration of existing distributed enterprise systems and heterogeneous data sources. However, business modelling, system configuration and model alignment face challenges at the design and execution stages. To provide a comprehensive solution to facilitate data-centric application design in a highly complex and large-scale situation, a configurable ontology-based service integrated platform (COSIP) is proposed to support business modelling, system configuration and execution management. First, a meta-resource model is constructed and used to describe and encapsulate information resources by way of multi-view business modelling. Then, based on ontologies, three semantic configuration patterns, namely composite resource configuration, business scene configuration and runtime environment configuration, are designed to systematically connect business goals with executable applications. Finally, a software architecture based on model-view-controller (MVC) is provided and used to assemble components for software implementation. The result of the case study demonstrates that the proposed approach provides a flexible method of implementing data-centric applications.
National Combustion Code: Parallel Implementation and Performance
NASA Technical Reports Server (NTRS)
Quealy, A.; Ryder, R.; Norris, A.; Liu, N.-S.
2000-01-01
The National Combustion Code (NCC) is being developed by an industry-government team for the design and analysis of combustion systems. CORSAIR-CCD is the current baseline reacting flow solver for NCC. This is a parallel, unstructured grid code which uses a distributed memory, message passing model for its parallel implementation. The focus of the present effort has been to improve the performance of the NCC flow solver to meet combustor designer requirements for model accuracy and analysis turnaround time. Improving the performance of this code contributes significantly to the overall reduction in time and cost of the combustor design cycle. This paper describes the parallel implementation of the NCC flow solver and summarizes its current parallel performance on an SGI Origin 2000. Earlier parallel performance results on an IBM SP-2 are also included. The performance improvements which have enabled a turnaround of less than 15 hours for a 1.3 million element fully reacting combustion simulation are described.
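Turnaround-time improvements of the kind reported for the NCC flow solver are commonly reasoned about with Amdahl's law, which bounds the speedup of a code containing a serial fraction. The sketch below is purely illustrative; the serial fraction, processor count, and single-processor run time are hypothetical, not NCC measurements.

```python
# Illustrative sketch (not NCC data): estimating parallel turnaround for
# a message-passing flow solver using Amdahl's law. The serial fraction
# and processor count below are hypothetical.

def speedup(n_procs, serial_fraction):
    """Amdahl's law: S(p) = 1 / (s + (1 - s)/p)."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_procs)

def turnaround_hours(serial_hours, n_procs, serial_fraction):
    """Wall-clock hours for a run taking `serial_hours` on one processor."""
    return serial_hours / speedup(n_procs, serial_fraction)

if __name__ == "__main__":
    # A hypothetical 600-hour single-processor simulation with 2% serial
    # work finishes in under a day on 64 processors:
    print(turnaround_hours(600.0, 64, 0.02))
```

The serial fraction dominates at high processor counts, which is why reducing non-parallelizable work (I/O, global reductions) matters as much as adding processors when chasing a fixed turnaround target.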
Designing Effective Curricula with an Interactive Collaborative Curriculum Design Tool (CCDT)
ERIC Educational Resources Information Center
Khadimally, Seda
2015-01-01
Guided by the principles of the Analysis, Design, Development, Implementation, and Evaluation (ADDIE) instructional design (ID) model, this creative instructional product presents a learning/teaching approach that is fundamentally constructivist. For the purposes of designing effective instruction in an academic preparation course, a…
Teacher and student supports for implementation of the NGSS
NASA Astrophysics Data System (ADS)
Severance, Samuel
Through three articles, this dissertation examines the use of supports for implementing the Next Generation Science Standards (NGSS) within a large urban school district. Article one, titled Organizing for Teacher Agency in Curricular Co-design, examines the need for coherent curriculum materials that teachers had a meaningful role in shaping and how the use of a co-design approach and specific tools and routines can help to address this need. Article two, titled Relevant Learning and Student Agency within a Citizen Science Design Challenge, examines the need for curriculum materials that provide students with learning experiences they find relevant and that expand their sense of agency and how a curriculum centered around a community-based citizen science design challenge can help achieve such an aim. Article three, titled Implementation of a Novel Professional Development Program to Support Teachers' Understanding of Modeling, examines the need for professional development that builds teachers' understanding of and skill in engaging their students in the practice of developing and using models and how a novel professional development program, the Next Generation Science Exemplar, can aid teachers in this regard by providing them with carefully sequenced professional development activities and specific modeling tools for use in the classroom.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klein, Steven Karl; Day, Christy M.; Determan, John C.
LANL has developed a process to generate a progressive family of system models for a fissile solution system. This family includes a dynamic system simulation (DSS) comprised of coupled nonlinear differential equations describing the time evolution of the system. Neutron kinetics, radiolytic gas generation and transport, and core thermal hydraulics are included in the DSS. Extensions to explicit operation of cooling loops and radiolytic gas handling are embedded in these systems, as is a stability model. The DSS may then be converted to an implementation in Visual Studio to provide a design team the ability to rapidly estimate system performance impacts from a variety of design decisions. This provides a method to assist in optimization of the system design. Once the design has been generated in some detail, the C++ version of the system model may then be implemented in a LabVIEW user interface to evaluate operator controls and instrumentation and operator recognition of, and response to, off-normal events. Taken as a set of system models, the DSS, Visual Studio, and LabVIEW progression provides a comprehensive set of design support tools.
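The coupled nonlinear differential equations at the heart of such a dynamic system simulation can be illustrated with the simplest neutron-kinetics component: one-delayed-group point reactor kinetics. The sketch below is not the LANL DSS; the kinetics parameters and integration settings are hypothetical, chosen only to show the structure of such a model.

```python
# Illustrative sketch (not the LANL DSS): forward-Euler integration of
# one-delayed-group point reactor kinetics, the kind of coupled ODEs a
# dynamic system simulation of a fissile solution system builds on.
# All parameter values are hypothetical.

def simulate(rho, beta=0.0065, lam=0.08, Lambda=1e-4, dt=1e-4, steps=10000):
    """Integrate dn/dt = ((rho - beta)/Lambda) n + lam C,
               dC/dt = (beta/Lambda) n - lam C."""
    n = 1.0                        # neutron population (normalized)
    C = beta * n / (lam * Lambda)  # precursor concentration at equilibrium
    for _ in range(steps):
        dn = ((rho - beta) / Lambda) * n + lam * C
        dC = (beta / Lambda) * n - lam * C
        n += dt * dn
        C += dt * dC
    return n

if __name__ == "__main__":
    print(simulate(rho=0.0))    # critical: population stays near 1.0
    print(simulate(rho=0.001))  # positive reactivity: population grows
```

A full DSS would couple equations like these to radiolytic gas and thermal-hydraulic states, with reactivity feedback terms linking the subsystems.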
The Effect of STEM Learning through the Project of Designing Boat Model toward Student STEM Literacy
NASA Astrophysics Data System (ADS)
Tati, T.; Firman, H.; Riandi, R.
2017-09-01
STEM learning focuses on the development of a STEM-literate society, yet research on implementing STEM learning to develop students' STEM literacy is still limited. This study aimed to examine the effect of implementing STEM learning through the project of designing a boat model on students' STEM literacy in the topic of energy. The method was a quasi-experiment with a non-randomized pretest-posttest control group design. Two classes were involved: the experiment class used project-based learning with a STEM approach and the control class used project-based learning without a STEM approach. A STEM literacy test instrument was developed to measure students' STEM literacy, which consists of science literacy, mathematics literacy, and technology-engineering literacy. The analysis showed significant differences in the improvement of science, mathematics, and technology-engineering literacy between the experiment class and the control class, with effect sizes greater than 0.8 (large effect). The difference in STEM literacy improvement between the two classes is attributed to the design-engineering activity, which required students to apply knowledge from every STEM field. The challenge faced in STEM learning through design-engineering activity was how to give students practice in integrating the STEM fields when solving problems. In addition, most students responded positively to the implementation of STEM learning through the boat model design project.
NASA Astrophysics Data System (ADS)
McEver, Jimmie; Davis, Paul K.; Bigelow, James H.
2000-06-01
We have developed and used families of multiresolution and multiple-perspective models (MRM and MRMPM), both in our substantive analytic work for the Department of Defense and to learn more about how such models can be designed and implemented. This paper is a brief case history of our experience with a particular family of models addressing the use of precision fires in interdicting and halting an invading army. Our models were implemented as closed-form analytic solutions, in spreadsheets, and in the more sophisticated Analytica environment. We also drew on an entity-level simulation for data. The paper reviews the importance of certain key attributes of development environments (visual modeling, interactive languages, friendly use of array mathematics, facilities for experimental design and configuration control, statistical analysis tools, graphical visualization tools, interactive post-processing, and relational database tools). These can go a long way towards facilitating MRMPM work, but many of these attributes are not yet widely available (or available at all) in commercial model-development tools--especially for use with personal computers. We conclude with some lessons learned from our experience.
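At its lowest resolution, a member of such a precision-fires "halt" model family can be expressed in closed form. The sketch below is only illustrative of that lowest-resolution style, not one of the paper's actual models; the force size, kill rate, halt criterion, and advance rate are all hypothetical.

```python
# Illustrative sketch (not the paper's models): a minimal closed-form
# "halt problem" calculation of the kind a multiresolution model family
# would include at its coarsest level. All inputs are hypothetical: an
# armored force advances until precision fires destroy a set fraction
# of its vehicles.

def days_to_halt(vehicles, kills_per_day, halt_fraction):
    """Days until the halt criterion (fraction of vehicles killed) is met."""
    return halt_fraction * vehicles / kills_per_day

def penetration_km(vehicles, kills_per_day, halt_fraction, advance_km_per_day):
    """Distance the force penetrates before being halted."""
    return advance_km_per_day * days_to_halt(vehicles, kills_per_day, halt_fraction)

if __name__ == "__main__":
    # 3000 vehicles, 150 kills/day, halt at 50% attrition, 40 km/day:
    print(days_to_halt(3000, 150, 0.5))        # 10.0 days
    print(penetration_km(3000, 150, 0.5, 40))  # 400.0 km
```

Higher-resolution members of the family would replace the constant kill rate with time-varying shooter deployment, weather, and command delays, which is exactly where spreadsheet and Analytica implementations earn their keep.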
Foo, Mathias; Sawlekar, Rucha; Kulkarni, Vishwesh V; Bates, Declan G
2016-08-01
The use of abstract chemical reaction networks (CRNs) as a modelling and design framework for the implementation of computing and control circuits using enzyme-free, entropy driven DNA strand displacement (DSD) reactions is starting to garner widespread attention in the area of synthetic biology. Previous work in this area has demonstrated the theoretical plausibility of using this approach to design biomolecular feedback control systems based on classical proportional-integral (PI) controllers, which may be constructed from CRNs implementing gain, summation and integrator operators. Here, we propose an alternative design approach that utilises the abstract chemical reactions involved in cellular signalling cycles to implement a biomolecular controller - termed a signalling-cycle (SC) controller. We compare the performance of the PI and SC controllers in closed-loop with a nonlinear second-order chemical process. Our results show that the SC controller outperforms the PI controller in terms of both performance and robustness, and also requires fewer abstract chemical reactions to implement, highlighting its potential usefulness in the construction of biomolecular control circuits.
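The conventional baseline the biomolecular designs emulate, a PI controller in closed loop with a nonlinear second-order process, can be sketched in a few lines of ordinary numerical code. This is not the paper's CRN/DSD implementation; the plant dynamics, gains, and integration settings below are hypothetical choices for illustration.

```python
# Illustrative sketch (not the paper's CRN/DSD circuits): a discrete-time
# PI controller in closed loop with a nonlinear second-order process,
# the conventional control structure the biomolecular designs emulate.
# Plant equations and gains are hypothetical.

def simulate_pi(setpoint, kp=2.0, ki=1.0, dt=0.01, t_end=40.0):
    x1 = x2 = 0.0    # plant states; output y = x1
    integral = 0.0   # integrator state of the PI controller
    y = 0.0
    for _ in range(int(t_end / dt)):
        error = setpoint - y
        integral += error * dt
        u = kp * error + ki * integral  # PI control law
        # Nonlinear second-order plant: x1' = x2, x2' = -x1^3 - x2 + u
        x1, x2 = x1 + dt * x2, x2 + dt * (-x1 ** 3 - x2 + u)
        y = x1
    return y

if __name__ == "__main__":
    # Integral action drives the steady-state output to the setpoint:
    print(simulate_pi(1.0))
```

The integral term is what removes steady-state error; implementing that integration with abstract chemical reactions (gain, summation, and integrator operators) is the nontrivial step the paper's PI construction addresses, and the SC controller replaces with signalling-cycle reactions.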
Design Considerations for a Launch Vehicle Development Flight Instrumentation System
NASA Technical Reports Server (NTRS)
Johnson, Martin L.; Crawford, Kevin
2011-01-01
When embarking on the design of a new launch vehicle, engineering models of expected vehicle performance are always generated. While many models are well established and understood, some models contain design features that are only marginally known. Unfortunately, these analytical models produce uncertainties in design margins. The best way to answer these analytical issues is with vehicle-level testing. The National Aeronautics and Space Administration responds to these uncertainties by using a vehicle-level system called the Development Flight Instrumentation, or DFI. This DFI system can be simple to implement, with only a few measurements, or it may be a sophisticated system with hundreds of measurements and video, without a recording capability. From experience with DFI systems, DFI never goes away. The system is renamed and allowed to continue, in most cases. Proper system design can aid the transition to future data requirements. This paper will discuss design features that need to be considered when developing a DFI system for a launch vehicle. It will briefly review the data acquisition units, sensors, multiplexers and recorders, telemetry components and harnessing. It will present a reasonable set of requirements which should be implemented at the beginning of the program in order to start the design. It will discuss a simplistic DFI architecture that could be the basis for the next NASA launch vehicle. This will be followed by a discussion of the "experiences gained" from a past DFI system implementation, such as the very successful Ares I-X test flight. Application of these design considerations may not work for every situation, but they may direct a path toward success or at least make one pause and ask the right questions.
WindPACT Reference Wind Turbines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dykes, Katherine L; Rinker, Jennifer
To fully understand how loads and turbine cost scale with turbine size, it is necessary to have identical turbine models that have been scaled to different rated powers. This report presents the WindPACT baseline models, a series of four baseline models designed to facilitate investigations into the scaling of loads and turbine cost with size. The models have four different rated powers (750 kW, 1.5 MW, 3.0 MW, and 5.0 MW), and each model was designed to its specified rated power using the same design methodology. The models were originally implemented in FAST_AD, the predecessor to NREL's open-source wind turbine simulator FAST, but have yet to be implemented in FAST. This report contains the specifications for all four WindPACT baseline models - including structural, aerodynamic, and control specifications - along with the inherent assumptions and equations that were used to calculate the model parameters. It is hoped that these baseline models will serve as useful resources for investigations into the scaling of costs, loads, or optimization routines.
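The scaling question such baseline families address is often introduced through classic square-cube reasoning: rated power grows with rotor swept area (R²) while component mass, a common cost proxy, grows roughly with volume (R³). The sketch below illustrates that reasoning only; the 750 kW rotor radius and the pure square-cube exponents are hypothetical, not WindPACT specifications.

```python
# Illustrative sketch (not WindPACT specifications): square-cube scaling
# used to reason about loads and cost versus turbine size. Power scales
# with swept area (R^2); mass scales with volume (R^3). The baseline
# radius below is hypothetical.

def scaled_rotor_radius(r_base_m, p_base_kw, p_new_kw):
    """Radius scales with the square root of rated power (P ~ R^2)."""
    return r_base_m * (p_new_kw / p_base_kw) ** 0.5

def scaled_mass(m_base, r_base_m, r_new_m):
    """Mass scales with the cube of the length scale (m ~ R^3)."""
    return m_base * (r_new_m / r_base_m) ** 3

if __name__ == "__main__":
    r750 = 25.0  # hypothetical 750 kW rotor radius, m
    r5000 = scaled_rotor_radius(r750, 750.0, 5000.0)
    # Power grows 6.7x, but mass (and thus cost) grows ~17x:
    print(round(r5000, 1))                       # 64.5
    print(round(scaled_mass(1.0, r750, r5000), 1))  # 17.2
```

Real designs beat the naive R³ exponent through technology improvements, which is precisely why consistently designed baselines at several rated powers are needed to measure the actual scaling.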
Next-generation concurrent engineering: developing models to complement point designs
NASA Technical Reports Server (NTRS)
Morse, Elizabeth; Leavens, Tracy; Cohanim, Babak; Harmon, Corey; Mahr, Eric; Lewis, Brian
2006-01-01
Concurrent Engineering Design (CED) teams have made routine the rapid development of point designs for space missions. The Jet Propulsion Laboratory's Team X is now evolving into a 'next-generation CED': in addition to a point design, the Team develops a model of the local trade space. The process is a balance between the power of model-development tools and the creativity of human experts, enabling the development of a variety of trade models for any space mission. This paper reviews the modeling method and its practical implementation in the CED environment. Example results illustrate the benefit of this approach.
2013-06-03
Program design example: Kenya; a generic model adaptable to local contexts; gaps in evaluation of building sustainable, performance-based programs. This study examines implementation in developing countries by building a framework that will identify key elements in this process and serve as guidance to implementers. This study
de Stampa, Matthieu; Vedel, Isabelle; Mauriat, Claire; Bagaragaza, Emmanuel; Routelous, Christelle; Bergman, Howard; Lapointe, Liette; Cassou, Bernard; Ankri, Joel; Henrard, Jean-Claude
2010-01-01
Purpose To present an innovative bottom-up and pragmatic strategy used to implement a new integrated care model in France for community-dwelling elderly people with complex needs. Context Sustaining integrated care is difficult, in large part because of problems encountered securing the participation of health care and social service professionals and, in particular, general practitioners (GPs). Case description In the first step, a diagnostic study was conducted with face-to-face interviews to gather data on current practices from a sample of health and social stakeholders working with elderly people. In the second step, an integrated care model called Coordination Personnes Agées (COPA) was designed by the same major stakeholders in order to define its detailed characteristics based on the local context. In the third step, the model was implemented in two phases: adoption and maintenance. This strategy was carried out by a continuous and flexible leadership throughout the process, initially with a mixed leadership (clinician and researcher) followed by a double one (clinician and managers of services) in the implementation phase. Conclusions The implementation of this bottom-up and pragmatic strategy relied on establishing a collaborative dynamic among health and social stakeholders. This enhanced their involvement throughout the implementation phase, particularly among the GPs, and allowed them to support the change practices and services arrangements.
NASA Astrophysics Data System (ADS)
Bellerby, Tim
2014-05-01
Model Integration System (MIST) is an open-source environmental modelling programming language that directly incorporates data parallelism. The language is designed to enable straightforward programming structures, such as nested loops and conditional statements, to be directly translated into sequences of whole-array (or more generally whole-data-structure) operations. MIST thus enables the programmer to use well-understood constructs, directly relating to the mathematical structure of the model, without having to explicitly vectorize code or worry about details of parallelization. A range of common modelling operations are supported by dedicated language structures operating on cell neighbourhoods rather than individual cells (e.g. the 3x3 local neighbourhood needed to implement an averaging image filter can be accessed from within a simple loop traversing all image pixels). This facility hides details of inter-process communication behind more mathematically relevant descriptions of model dynamics. The MIST automatic vectorization/parallelization process serves both to distribute work among available nodes and to control storage requirements for intermediate expressions, enabling operations on very large domains for which memory availability may be an issue. MIST is designed to facilitate efficient interpreter-based implementations. A prototype open-source interpreter is available, coded in standard FORTRAN 95, with tools to rapidly integrate existing FORTRAN 77 or 95 code libraries. The language is formally specified and thus not limited to a FORTRAN implementation or to an interpreter-based approach. A MIST-to-FORTRAN compiler is under development, and volunteers are sought to create an ANSI C implementation. Parallel processing is currently implemented using OpenMP; however, the parallelization code is fully modularised and could be replaced with implementations using other libraries. GPU implementation is potentially possible.
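The 3x3 averaging filter mentioned above makes the point concrete. Written out explicitly (here in Python, since MIST syntax is not reproduced in the abstract), it is the kind of per-cell neighbourhood loop that a MIST neighbourhood construct would express as a single whole-array operation, leaving vectorization and work distribution to the language.

```python
# Illustrative sketch (MIST syntax not reproduced): the 3x3
# neighbourhood-averaging image filter from the abstract, written as
# the explicit per-cell loop that a dedicated neighbourhood construct
# would replace with one whole-array operation.

def mean_filter_3x3(grid):
    """Average each interior cell over its 3x3 neighbourhood."""
    rows, cols = len(grid), len(grid[0])
    out = [row[:] for row in grid]  # edge cells copied unchanged
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            neighbourhood = [grid[i + di][j + dj]
                             for di in (-1, 0, 1) for dj in (-1, 0, 1)]
            out[i][j] = sum(neighbourhood) / 9.0
    return out

if __name__ == "__main__":
    grid = [[0.0] * 5 for _ in range(5)]
    grid[2][2] = 9.0  # a single bright pixel
    smoothed = mean_filter_3x3(grid)
    print(smoothed[1][1])  # 1.0 -- the spike spreads over its neighbours
```

Hiding this loop behind a neighbourhood operator is what lets an implementation parallelize it, and manage halo exchange between nodes, without the modeller writing any communication code.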
DOT National Transportation Integrated Search
2012-12-01
This report presents the results of a 16-month project for system development and design of a model for a Travel Management Coordination Center (TMCC) using ITS capabilities. The system was designed as a tool to facilitate the exchange of knowledge a...
PBL and CDIO: Complementary Models for Engineering Education Development
ERIC Educational Resources Information Center
Edström, Kristina; Kolmos, Anette
2014-01-01
This paper compares two models for reforming engineering education, problem/project-based learning (PBL), and conceive-design-implement-operate (CDIO), identifying and explaining similarities and differences. PBL and CDIO are defined and contrasted in terms of their history, community, definitions, curriculum design, relation to disciplines,…
ERIC Educational Resources Information Center
Isman, Aytekin; Abanmy, Fahad AbdulAziz; Hussein, Hisham Barakat; Al Saadany, Mohammed Abdurrahman
2012-01-01
The new instructional design model (Isman, 2011) aims at planning, developing, implementing, evaluating, and organizing full learning activities effectively to ensure competent performance by students. The theoretical foundation of this model comes from behaviorist, cognitivist, and constructivist views, and it is based on active learning. During…
ERIC Educational Resources Information Center
Reider, David; Knestis, Kirk; Malyn-Smith, Joyce
2016-01-01
This article proposes a STEM workforce education logic model, tailored to the particular context of the National Science Foundation's Innovative Technology Experiences for Students and Teachers (ITEST) program. This model aims to help program designers and researchers address challenges particular to designing, implementing, and studying education…
ERIC Educational Resources Information Center
Ketterlin-Geller, Leanne R.
2008-01-01
This article presents a model of assessment development integrating student characteristics with the conceptualization, design, and implementation of standardized achievement tests. The model extends the assessment triangle proposed by the National Research Council (Pellegrino, Chudowsky, & Glaser, 2001) to consider the needs of students with…
Tier 2 Team Processes and Decision-Making in a Comprehensive Three-Tiered Model
ERIC Educational Resources Information Center
Pool, Juli L.; Carter, Deborah Russell; Johnson, Evelyn S.
2013-01-01
Three-tiered models of academic and behavioral support are being increasingly adopted across the nation, and with that adoption has come an increasing message that designing and implementing effective practices alone is not enough. Systems are needed to help staff to collectively implement best practices. These systems, as well as effective…
Intensive Evaluation of Head Start Implementation in the Tucson Early Education Model.
ERIC Educational Resources Information Center
Rentfrow, Robert K.
As part of the national Head Start Planned Variation Study, this study used a relatively small sample in an intensive evaluation of program implementation in one field community using the Tucson Early Education Model (TEEM). A modified Solomon four-group research design formed the organization framework. Evaluation of six TEEM classrooms and two…
Three-Tiered Models of Prevention: Teacher Efficacy and Burnout
ERIC Educational Resources Information Center
Oakes, Wendy Peia; Lane, Kathleen Lynne; Jenkins, Abbie; Booker, Belle B.
2013-01-01
Project Persevere examined teacher efficacy and burnout within Comprehensive, Integrated, Three-tiered (CI3T) models of prevention, as implemented in two middle schools in a southern state. Participating schools completed a year-long training series to design their CI3T plans and were in their first year of implementation as part of regular school…
Implementing a Service Learning Model for Teaching Research Methods and Program Evaluation
ERIC Educational Resources Information Center
Shannon, Patrick; Kim, Wooksoo; Robinson, Adjoa
2012-01-01
In an effort to teach students the basic knowledge of research methods and the realities of conducting research in the context of agencies in the community, faculty developed and implemented a service learning model for teaching research and program evaluation to foundation-year MSW students. A year-long foundation course was designed in which one…
The Effect of Academic Culture on the Implementation of the EFQM Excellence Model in UK Universities
ERIC Educational Resources Information Center
Davies, John; Douglas, Alex; Douglas, Jacqueline
2007-01-01
Purpose: The paper seeks to explore the effect of academic culture on the implementation of the European Foundation for Quality Management's (EFQM) Excellence Model in UK universities. Design/methodology/approach: A literature review reveals several aspects, which collectively define the academic culture in UK universities. These aspects were…
The TESSA OER Experience: Building Sustainable Models of Production and User Implementation
ERIC Educational Resources Information Center
Wolfenden, Freda
2008-01-01
This paper offers a review of the origins, design strategy and implementation plans of the Teacher Education in Sub-Saharan Africa (TESSA) research and development programme. The programme is working to develop new models of teacher education, particularly school based training, including the creation of a programme webspace and an extensive bank…
ERIC Educational Resources Information Center
Wright, Courtney A.; Kaiser, Ann P.
2017-01-01
Measuring treatment fidelity is an essential step in research designed to increase the use of evidence-based practices. For parent-implemented communication interventions, measuring the implementation of the teaching and coaching provided to the parents is as critical as measuring the parents' delivery of the intervention to the child. Both levels…
ERIC Educational Resources Information Center
Rosales, Rocío; Gongola, Leah; Homlitas, Christa
2015-01-01
A multiple baseline design across participants was used to evaluate the effects of video modeling with embedded instructions on training teachers to implement 3 preference assessments. Each assessment was conducted with a confederate learner or a child with autism during generalization probes. All teachers met the predetermined mastery criterion,…
ERIC Educational Resources Information Center
Bulunz, Nermin; Gursoy, Esim; Kesner, John; Baltaci Goktalay, Sehnaz; Salihoglu, Umut M.
2014-01-01
Implementation of the standards established by the Higher Education Council (HEC) has shown great variation between universities, between departments and even between supervisors. A TUBITAK (111K162)-EVRENA project designed to develop a "teaching practice program" using a Clinical Supervision Model (CSM) was conducted. The present study…
ERIC Educational Resources Information Center
Lee, Sunghye; Koszalka, Tiffany A.
2016-01-01
The First Principles of Instruction (FPI) represent ideologies found in most instructional design theories and models. Few attempts, however, have been made to empirically test the relationship of these FPI to instructional outcomes. This study addresses whether the degree to which FPI are implemented in courses makes a difference to student…
[Modeling and implementation method for the automatic biochemistry analyzer control system].
Wang, Dong; Ge, Wan-cheng; Song, Chun-lin; Wang, Yun-guang
2009-03-01
The automatic biochemistry analyzer is a necessary instrument for clinical diagnostics. In this paper, the system structure is first analyzed. The system problem description and the fundamental principles for dispatch are brought forward. The paper then puts emphasis on modeling the automatic biochemistry analyzer control system: the object model and the communications model are put forward. Finally, the implementation method is designed. Results indicate that a system based on this model performs well.
Care Model Design for E-Health: Integration of Point-of-Care Testing at Dutch General Practices.
Verhees, Bart; van Kuijk, Kees; Simonse, Lianne
2017-12-21
Point-of-care testing (POCT)-laboratory tests performed with new mobile devices and online technologies outside of the central laboratory-is rapidly outpacing the traditional laboratory test market, growing at a rate of 12 to 15% each year. POCT impacts the diagnostic process of care providers by yielding high efficiency benefits in terms of turnaround time and related quality improvements in the reduction of errors. However, the implementation of this disruptive eHealth technology requires the integration and transformation of diagnostic services across the boundaries of healthcare organizations. Research has revealed both advantages and barriers of POCT implementations, yet to date, there is no business model for the integration of POCT within general practice. The aim of this article is to contribute with a design for a care model that enables the integration of POCT in primary healthcare. In this research, we used a design modelling toolkit for data collection at five general practices. Through an iterative design process, we modelled the actors and value transactions, and designed an optimized care model for the dynamic integration of POCTs into the GP's network of care delivery. The care model design will have a direct bearing on improving the integration of POCT through the connectivity and norm guidelines between the general practice, the POC technology, and the diagnostic centre.
Transforming revenue management.
Silveria, Richard; Alliegro, Debra; Nudd, Steven
2008-11-01
Healthcare organizations that want to undertake a patient administrative/revenue management transformation should: Define the vision with underlying business objectives and key performance measures. Strategically partner with key vendors for business process development and technology design. Create a program organization and governance infrastructure. Develop a corporate design model that defines the standards for operationalizing the vision. Execute the vision through technology deployment and corporate design model implementation.
IMPLEMENTING PRACTICAL PICO-HYDROPOWER
Deliverables for this proposal will be energy output data modeled from experimental testing of the hydropower unit and monitoring of the stormwater handling infrastructure in the GIS building; along with a design and engineering plan for implementation and building integrat...
ERIC Educational Resources Information Center
Herro, Danielle C.
2015-01-01
This case uses a worked or "working example" model (Gee, 2010), documenting the implementation of a novel game design curriculum in the United States. Created by an Instructional Technology Administrator (ITA) and two classroom teachers, it was subsequently offered to high school students. With an aim of providing in-depth understanding…
ERIC Educational Resources Information Center
Cody, Jeremy A.; Craig, Paul A.; Loudermilk, Adam D.; Yacci, Paul M.; Frisco, Sarah L.; Milillo, Jennifer R.
2012-01-01
A novel stereochemistry lesson was prepared that incorporated both handheld molecular models and embedded virtual three-dimensional (3D) images. The images are fully interactive and eye-catching for the students; methods for preparing 3D molecular images in Adobe Acrobat are included. The lesson was designed and implemented to showcase the 3D…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parker, Andrew; Haves, Philip; Jegi, Subhash
This paper describes a software system for automatically generating a reference (baseline) building energy model from the proposed (as-designed) building energy model. This system is built using the OpenStudio Software Development Kit (SDK) and is designed to operate on building energy models in the OpenStudio file format.
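The generation pass described above can be sketched in miniature. The class names, property names, and baseline values below are hypothetical stand-ins; the actual system operates on OpenStudio-format models through the OpenStudio SDK:

```python
# Hypothetical sketch of baseline-model generation: copy the proposed
# model's geometry but replace performance properties with prescriptive
# baseline values. Names and values are illustrative, not OpenStudio API.
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Surface:
    name: str
    u_value: float  # W/m^2-K, thermal transmittance

@dataclass(frozen=True)
class Model:
    name: str
    surfaces: tuple
    lighting_power_density: float  # W/m^2

BASELINE_U_VALUE = 0.51  # assumed code-minimum envelope value
BASELINE_LPD = 10.8      # assumed code-minimum lighting power

def generate_baseline(proposed: Model) -> Model:
    """Keep geometry, swap performance properties for baseline values."""
    baseline_surfaces = tuple(
        replace(s, u_value=BASELINE_U_VALUE) for s in proposed.surfaces
    )
    return Model(
        name=proposed.name + " (baseline)",
        surfaces=baseline_surfaces,
        lighting_power_density=BASELINE_LPD,
    )

proposed = Model("office", (Surface("wall-1", 0.30),), 8.0)
baseline = generate_baseline(proposed)
```

The key design point mirrored here is that the baseline is derived automatically from the as-designed model rather than built by hand, so the two stay consistent.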
From Petascale to Exascale: Eight Focus Areas of R&D Challenges for HPC Simulation Environments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Springmeyer, R; Still, C; Schulz, M
2011-03-17
Programming models bridge the gap between the underlying hardware architecture and the supporting layers of software available to applications. Programming models are different from both programming languages and application programming interfaces (APIs). Specifically, a programming model is an abstraction of the underlying computer system that allows for the expression of both algorithms and data structures. In comparison, languages and APIs provide implementations of these abstractions and allow the algorithms and data structures to be put into practice - a programming model exists independently of the choice of both the programming language and the supporting APIs. Programming models are typically focused on achieving increased developer productivity, performance, and portability to other system designs. The rapidly changing nature of processor architectures and the complexity of designing an exascale platform provide significant challenges for these goals. Several other factors are likely to impact the design of future programming models. In particular, the representation and management of increasing levels of parallelism, concurrency and memory hierarchies, combined with the ability to maintain a progressive level of interoperability with today's applications are of significant concern. Overall the design of a programming model is inherently tied not only to the underlying hardware architecture, but also to the requirements of applications and libraries including data analysis, visualization, and uncertainty quantification. Furthermore, the successful implementation of a programming model is dependent on exposed features of the runtime software layers and features of the operating system. Successful use of a programming model also requires effective presentation to the software developer within the context of traditional and new software development tools.
Consideration must also be given to the impact of programming models on both languages and the associated compiler infrastructure. Exascale programming models must reflect several, often competing, design goals. These design goals include desirable features such as abstraction and separation of concerns. However, some aspects are unique to large-scale computing. For example, interoperability and composability with existing implementations will prove critical. In particular, performance is the essential underlying goal for large-scale systems. A key evaluation metric for exascale models will be the extent to which they support these goals rather than merely enable them.
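The distinction drawn above between a programming model and its implementations can be illustrated with a toy sketch: the same data-parallel map-reduce abstraction expressed once with sequential built-ins and once with a thread-pool API, yielding identical results:

```python
# One programming model (data-parallel map-then-reduce), two
# implementations. The abstraction is independent of the API chosen.
from concurrent.futures import ThreadPoolExecutor
from functools import reduce

def square(x):
    return x * x

data = list(range(10))

# Implementation 1: sequential built-ins realize the model.
result_seq = reduce(lambda a, b: a + b, map(square, data))

# Implementation 2: a thread-pool API realizes the same model.
with ThreadPoolExecutor(max_workers=4) as pool:
    result_par = reduce(lambda a, b: a + b, pool.map(square, data))
```

The algorithm and data structure (map a function, fold the results) are the model; the choice between `map` and `pool.map` is the implementation detail the abstract insists is separable.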
Implementation strategies for collaborative primary care-mental health models.
Franx, Gerdien; Dixon, Lisa; Wensing, Michel; Pincus, Harold
2013-09-01
Extensive research shows that collaborative primary care-mental health models can improve care and outcomes for patients. These programs are currently being implemented throughout the United States and beyond. The purpose of this study is to review the literature and to generate an overview of strategies currently used to implement such models in daily practice. Six overlapping strategies to implement collaborative primary care-mental health models were described in 18 selected studies. We identified interactive educational strategies, quality improvement change processes, technological support tools, stakeholder engagement in the design and execution of implementation plans, organizational changes in terms of expanding the tasks of nurses, and financial strategies such as additional collaboration fees and pay-for-performance incentives. Considering the overwhelming evidence about the effectiveness of primary care-mental health models, there is a lack of good studies focusing on their implementation strategies. In practice, these strategies are multifaceted and locally defined, as a result of intensive and required stakeholder engagement. Although many barriers still exist, the implementation of collaborative models has a chance to succeed in the United States, where new service delivery and payment models, such as the Patient-Centered Medical Home, the Health Home and the Accountable Care Organization, are being promoted.
Development of a student engagement approach to alcohol prevention: the Pragmatics Project.
Buettner, Cynthia K; Andrews, David W; Glassman, Michael
2009-01-01
Significant involvement of students in the development and implementation of college alcohol prevention strategies is largely untested, despite recommendations by the National Institute of Alcohol Abuse and Alcoholism and others. The purpose of the Pragmatics Project was to test a student engagement model for developing and implementing alcohol intervention strategies. The Pragmatics Project involved 89 undergraduate students on a large Midwestern university campus in the design and implementation of projects focused on reducing harm associated with high-risk drinking and off-campus parties. The engagement model used an innovative course piloted in the Human Development and Family Science department. The course successfully involved both students and the community in addressing local alcohol issues. The course design described would fit well into a Master of Public Health, Community Psychology, Health Psychology, or interdisciplinary curricula as well as the service learning model, and it is applicable in addressing other health risk behaviors.
Component Framework for Loosely Coupled High Performance Integrated Plasma Simulations
NASA Astrophysics Data System (ADS)
Elwasif, W. R.; Bernholdt, D. E.; Shet, A. G.; Batchelor, D. B.; Foley, S.
2010-11-01
We present the design and implementation of a component-based simulation framework for the execution of coupled time-dependent plasma modeling codes. The Integrated Plasma Simulator (IPS) provides a flexible lightweight component model that streamlines the integration of stand-alone codes into coupled simulations. Stand-alone codes are adapted to the IPS component interface specification using a thin wrapping layer implemented in the Python programming language. The framework provides services for inter-component method invocation, configuration, task, and data management, asynchronous event management, simulation monitoring, and checkpoint/restart capabilities. Services are invoked, as needed, by the computational components to coordinate the execution of different aspects of coupled simulations on Massively Parallel Processing (MPP) machines. A common plasma state layer serves as the foundation for inter-component, file-based data exchange. The IPS design principles, implementation details, and execution model will be presented, along with an overview of several use cases.
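As a rough illustration of the thin-wrapper idea, a stand-alone executable can be adapted behind a small component interface that exchanges data through a file. The class and method names below are hypothetical, not the actual IPS interface:

```python
# Hypothetical sketch of wrapping a stand-alone code as a component,
# in the spirit of the IPS design. A file stands in for the common
# plasma state layer; the interface names are illustrative only.
import subprocess
import sys

class Component:
    """Minimal component interface: init / step / finalize."""
    def init(self, config): ...
    def step(self, t): ...
    def finalize(self): ...

class StandaloneCodeWrapper(Component):
    """Thin wrapper: writes state to a file, invokes the executable."""
    def __init__(self, executable):
        self.executable = executable
        self.state_file = "plasma_state.dat"

    def init(self, config):
        with open(self.state_file, "w") as f:
            f.write(config)

    def step(self, t):
        # A real framework would route this through a task-management
        # service; here we invoke the executable directly.
        return subprocess.run(
            [self.executable, self.state_file, str(t)],
            capture_output=True, text=True,
        ).returncode

# Demo: use the Python interpreter itself as the "stand-alone code",
# with a trivial script as the state file.
demo = StandaloneCodeWrapper(sys.executable)
demo.init("print('ok')")
rc = demo.step(0)
```

The point of the pattern is that the coupled-simulation driver only ever sees `init`/`step`/`finalize`, regardless of what language or build system the underlying code uses.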
Heim, Joseph A; Huang, Hao; Zabinsky, Zelda B; Dickerson, Jane; Wellner, Monica; Astion, Michael; Cruz, Doris; Vincent, Jeanne; Jack, Rhona
2015-08-01
Design and implement a concurrent campaign of influenza immunization and tuberculosis (TB) screening for health care workers (HCWs) that can reduce the number of clinic visits for each HCW. A discrete-event simulation model was developed to support resource allocation decisions in the planning and operations phases. The campaign was compressed to 100 days in 2010 and further compressed to 75 days in 2012 and 2013. With more than 5000 HCW arrivals in 2011, 2012 and 2013, the 14-day goal for TB results was achieved each year and reduced to about 4 days in 2012 and 2013. Implementing a concurrent campaign reduces the number of clinic visits, and compressing the campaign length allows earlier immunization. Simulation modelling can provide useful evaluations of different configurations. © 2015 John Wiley & Sons, Ltd.
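A minimal sketch of the discrete-event approach: an event queue of arrivals and departures, a limited pool of nurses, and a measured time-in-clinic per worker. All parameters are illustrative, not the study's:

```python
# Minimal discrete-event simulation of a screening clinic: workers
# arrive at fixed intervals and are served FIFO by a pool of nurses.
# Parameters are illustrative stand-ins for the actual campaign model.
import heapq

def simulate_clinic(n_workers, arrival_gap, service_time, n_nurses):
    """Return the average time each worker spends in the clinic."""
    events = []  # min-heap of (time, seq, kind, worker_id)
    for i in range(n_workers):
        heapq.heappush(events, (i * arrival_gap, i, "arrive", i))
    free_nurses = n_nurses
    queue = []          # worker ids waiting for a nurse
    arrival = {}        # worker id -> arrival time
    total_time = 0.0
    seq = n_workers     # tie-breaker for heap ordering
    while events:
        t, _, kind, wid = heapq.heappop(events)
        if kind == "arrive":
            arrival[wid] = t
            queue.append(wid)
        else:  # a departure frees a nurse and logs time in clinic
            free_nurses += 1
            total_time += t - arrival[wid]
        while free_nurses and queue:  # start service where possible
            free_nurses -= 1
            nxt = queue.pop(0)
            heapq.heappush(events, (t + service_time, seq, "depart", nxt))
            seq += 1
    return total_time / n_workers
```

Sweeping `n_nurses` or `arrival_gap` in such a model is what lets planners compare campaign configurations before committing resources.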
NASA Astrophysics Data System (ADS)
Barlas, Thanasis; Pettas, Vasilis; Gertz, Drew; Madsen, Helge A.
2016-09-01
The application of active trailing edge flaps in an industry-oriented implementation is evaluated in terms of capability of alleviating design extreme loads. A flap system with basic control functionality is implemented and tested in a realistic full Design Load Basis (DLB) for the DTU 10MW Reference Wind Turbine (RWT) model and for an upscaled rotor version in DTU's aeroelastic code HAWC2. The flap system implementation shows considerable potential for reducing extreme loads in components of interest, including the blades, main bearing and tower top, with no influence on fatigue loads and power performance. In addition, an individual flap controller for fatigue load reduction in above-rated power conditions is also implemented and integrated in the general controller architecture. The system is shown to be a technology enabler for rotor upscaling, by combining extreme and fatigue load reduction.
Offering integrated medical equipment management in an application service provider model.
Cruz, Antonio Miguel; Barr, Cameron; Denis, Ernesto Rodríguez
2007-01-01
With the advancement of medical technology and thus the complexity of the equipment under their care, clinical engineering departments (CEDs) must continue to make use of computerized tools in the management of departmental activities. Authors of this paper designed, installed, and implemented an application service provider (ASP) model at the laboratory level to offer value added management tools in an online format to CEDs. The project, designed to investigate how to help meet demands across multiple healthcare organizations and provide a means of access for organizations that otherwise might not be able to take advantage of the benefits of those tools, has been well received. Ten hospitals have requested the service, and five of those are ready to proceed with the implementation of the ASP. With the proposed centralized system architecture, the model has shown promise in reducing network infrastructure labor and equipment costs, benchmarking of equipment performance indicators, and developing avenues for proper and timely problem reporting. The following is a detailed description of the design process from conception to implementation of the five main software modules and supporting system architecture.
Green Revolving Funds: An Introductory Guide to Implementation & Management
ERIC Educational Resources Information Center
Indvik, Joe; Foley, Rob; Orlowski, Mark
2013-01-01
The goal of this introductory implementation guide is to provide practical guidance for designing, implementing, and managing a green revolving fund (GRF) at a college, university, or other institution. The GRF model is widespread in higher education, with at least 79 funds in operation in North America representing over $111 million in committed…
Methodology for object-oriented real-time systems analysis and design: Software engineering
NASA Technical Reports Server (NTRS)
Schoeffler, James D.
1991-01-01
Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly 'seamlessly' from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structuring of the system so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation when the original specification and perhaps high-level design is non-object oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions which emphasizes data and control flows followed by the abstraction of objects where the operations or methods of the objects correspond to processes in the data flow diagrams and then design in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects) each having its own time-behavior defined by a set of states and state-transition rules and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly-connected models which progress from the object-oriented real-time systems analysis and design system analysis logical models through the physical architectural models and the high-level design stages. The methodology is appropriate to the overall specification including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.
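The notion of an entity whose time-behavior is defined entirely by a set of states and state-transition rules can be sketched as a small object (an illustrative toy, not the paper's notation):

```python
# Sketch of a "real-time systems-analysis object": behavior is given
# wholly by states plus transition rules, so the analysis model can be
# carried seamlessly into an object-oriented design.
class AnalysisObject:
    def __init__(self, name, initial, transitions):
        """transitions maps (state, event) -> next state."""
        self.name = name
        self.state = initial
        self.transitions = transitions

    def handle(self, event):
        """Apply a transition rule; unknown events leave state unchanged."""
        self.state = self.transitions.get((self.state, event), self.state)
        return self.state

# Example: a valve controller as a naturally occurring concurrent entity.
valve = AnalysisObject(
    "valve",
    initial="closed",
    transitions={
        ("closed", "open_cmd"): "opening",
        ("opening", "opened"): "open",
        ("open", "close_cmd"): "closing",
        ("closing", "closed_sensed"): "closed",
    },
)
```

Because the same state/transition description serves analysis, high-level design, and implementation, the transition from one phase to the next requires no restructuring, which is the "seamless" property the abstract argues for.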
Job Aid Manuals for Phase II--DESIGN of the Instructional Systems Development Model.
ERIC Educational Resources Information Center
Schulz, Russel E.; Farrell, Jean R.
Designed to supplement the descriptive authoring flowcharts presented in a companion volume, this manual includes specific guidance, examples, and other information referred to in the flowcharts for the implementation of the second phase of the Instructional Systems Development Model (ISD). The introductory section includes definitions;…
An Evaluation Research Model for System-Wide Textbook Selection.
ERIC Educational Resources Information Center
Talmage, Harriet; Walberg, Herbert T.
One component of an evaluation research model for system-wide selection of curriculum materials is reported: implementation of an evaluation design for obtaining data that permits professional and lay persons to base curriculum materials decisions on a "best fit" principle. The design includes teacher characteristics, learning environment…
Campus network security model study
NASA Astrophysics Data System (ADS)
Zhang, Yong-ku; Song, Li-ren
2011-12-01
Campus network security is of growing importance. Designing an effective defense against hacker attacks, viruses, data theft, and internal threats is the focus of this paper. The paper compares firewall- and IDS-based approaches, integrates them into the design of a campus network security model, and details its specific implementation principles.
Model-based design of RNA hybridization networks implemented in living cells
Rodrigo, Guillermo; Prakash, Satya; Shen, Shensi; Majer, Eszter
2017-01-01
Synthetic gene circuits allow the behavior of living cells to be reprogrammed, and non-coding small RNAs (sRNAs) are increasingly being used as programmable regulators of gene expression. However, sRNAs (natural or synthetic) are generally used to regulate single target genes, while complex dynamic behaviors would require networks of sRNAs regulating each other. Here, we report a strategy for implementing such networks that exploits hybridization reactions carried out exclusively by multifaceted sRNAs that are both targets of and triggers for other sRNAs. These networks are ultimately coupled to the control of gene expression. We relied on a thermodynamic model of the different stable conformational states underlying this system at the nucleotide level. To test our model, we designed five different RNA hybridization networks with a linear architecture, and we implemented them in Escherichia coli. We validated the network architecture at the molecular level by native polyacrylamide gel electrophoresis, as well as the network function at the bacterial population and single-cell levels with a fluorescent reporter. Our results suggest that it is possible to engineer complex cellular programs based on RNA from first principles. Because these networks are mainly based on physical interactions, our designs could be expanded to other organisms as portable regulatory resources or to implement biological computations. PMID:28934501
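The thermodynamic modeling step can be illustrated in miniature: competing conformational states are ranked by free energy, and their equilibrium populations follow a Boltzmann distribution. The ΔG values below are invented for illustration; the actual work used a detailed nucleotide-level model:

```python
# Boltzmann-weighted populations of competing conformational states.
# Free energies (kcal/mol) are illustrative, not from the paper.
import math

R = 0.001987  # kcal/(mol*K), gas constant
T = 310.0     # K, physiological temperature (37 C)

def equilibrium_fractions(free_energies):
    """Map each state to its Boltzmann-weighted equilibrium fraction."""
    weights = {s: math.exp(-g / (R * T)) for s, g in free_energies.items()}
    z = sum(weights.values())  # partition function
    return {s: w / z for s, w in weights.items()}

# Illustrative states for one sRNA: free, productive duplex, misfold.
states = {"unbound": 0.0, "duplex": -3.0, "misfolded": 1.5}
fractions = equilibrium_fractions(states)
```

Designing a network then amounts to choosing sequences whose intended hybridized states sit lowest in free energy, so the productive conformation dominates at equilibrium.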
Son, Sanghyun; Baek, Yunju
2015-01-01
As society has developed, the number of vehicles has increased and road conditions have become complicated, increasing the risk of crashes. Therefore, a service that provides safe vehicle control and various types of information to the driver is urgently needed. In this study, we designed and implemented a real-time traffic information system and a smart camera device for smart driver assistance systems. We selected a commercial device for the smart driver assistance systems, and applied a computer vision algorithm to perform image recognition. For application to the dynamic region of interest, dynamic frame skip methods were implemented to perform parallel processing in order to enable real-time operation. In addition, we designed and implemented a model to estimate congestion by analyzing traffic information. The performance of the proposed method was evaluated using images of a real road environment. We found that the processing time improved by 15.4 times when all the proposed methods were applied in the application. Further, we found experimentally that there was little or no change in the recognition accuracy when the proposed method was applied. Using the traffic congestion estimation model, we also found that the average error rate of the proposed model was 5.3%. PMID:26295230
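The dynamic frame-skip idea can be sketched as follows; the scalar change metric and threshold are illustrative stand-ins for the actual computer vision pipeline:

```python
# Sketch of dynamic frame skipping: run the (expensive) recognizer only
# when the frame differs enough from the last processed frame, so a
# static scene costs almost nothing. Metric and threshold are toys.
def process_stream(frames, threshold=10):
    """frames: ints standing in for frame content.
    Returns indices of frames actually processed."""
    processed = [0]       # always process the first frame
    last = frames[0]
    for i, frame in enumerate(frames[1:], start=1):
        if abs(frame - last) > threshold:  # scene changed enough
            processed.append(i)
            last = frame                   # new reference frame
    return processed

indices = process_stream([0, 2, 3, 50, 52, 120])
```

In a real driver-assistance pipeline the "difference" would be an image metric restricted to the dynamic region of interest, but the control flow, skip until change exceeds a threshold, is the same.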
Thinking outside ISD: A management model for instructional design
NASA Astrophysics Data System (ADS)
Taylor, Tony Dewayne
The purpose of this study was to examine the effectiveness of an instructional system management-level model, proposed by the author, designed to orchestrate the efficient development and implementation of customer-requested curriculum. The three phases of the systems-based model, designed to ensure delivery of high-quality and timely instruction, are: (1) the assessment and documentation of organizational training requirements; (2) project management control of curriculum development; and (3) the implementation of relevant instruction by competent instructors. This model also provides (4) measurable and quantifiable course evaluation results to justify return on investment and validate its importance with respect to the customer's organizational strategic objectives. The theoretical approach for this study was systems theory-based due to the nature of the instructional systems design model and the systematic design of the management model. The study was accomplished as a single-case study application of the qualitative style of inquiry described by Patton (2002). Qualitative inquiry was selected to collect and analyze participants' holistic assessments of the effectiveness, relevance, and timeliness of the instructional design management model. Participants for this study included five managers, five subject matter experts, and six students assigned to a military organization responsible for the collection of hydrographic data for the U.S. Navy. Triangulation of data sources within the qualitative framework of the study incorporated the multiple views of the three participant groups (managers, SMEs, and students) on the course development and implementation, to validate the findings and remove researcher bias. Qualitative coding was accomplished by importing transcribed interviews into Microsoft Excel and sorting with Auto-Filter. The coded interviews indicated that each of the three participant groups viewed the model as functioning effectively.
Results from a pre-test/post-test comparative analysis indicated a significant difference between the pre-test and post-test means at the p < .001 level for the six students. Although the subject of the case study was within a military training environment, the proposed instructional systems managerial model can be applied to the design, development, delivery, and assessment of instructional material in any line of study where quantifiable effective learning is the goal.
Alternative fuels and vehicles choice model
DOT National Transportation Integrated Search
1994-10-01
This report describes the theory and implementation of a model of alternative fuel and vehicle choice (AFVC), designed for use with the United States Department of Energy's Alternative Fuels Trade Model (AFTM). The AFTM is a static equilibrium model ...
Implementing effective and sustainable multidisciplinary clinical thoracic oncology programs
Osarogiagbon, Raymond U.; Freeman, Richard K.; Krasna, Mark J.
2015-01-01
Three models of care are described, including two models of multidisciplinary care for thoracic malignancies. The pros and cons of each model are discussed, the evidence supporting each is reviewed, and the need for more (and better) research into care delivery models is highlighted. Key stakeholders in thoracic oncology care delivery outcomes are identified, and the need to consider stakeholder perspectives in designing, validating and implementing multidisciplinary programs as a vehicle for quality improvement in thoracic oncology is emphasized. The importance of reconciling stakeholder perspectives and identifying meaningful stakeholder-relevant benchmarks is also emphasized. Metrics for measuring program implementation and overall success are proposed. PMID:26380186
Digital hardware implementation of a stochastic two-dimensional neuron model.
Grassia, F; Kohno, T; Levi, T
2016-11-01
This study explores the feasibility of stochastic neuron simulation in digital systems (FPGA), which realizes an implementation of a two-dimensional neuron model. The stochasticity is added by a source of current noise in the silicon neuron using an Ornstein-Uhlenbeck process. This approach uses digital computation to emulate individual neuron behavior using fixed point arithmetic operation. The neuron model's computations are performed in arithmetic pipelines. It was designed in VHDL language and simulated prior to mapping in the FPGA. The experimental results confirmed the validity of the developed stochastic FPGA implementation, which makes the implementation of the silicon neuron more biologically plausible for future hybrid experiments. Copyright © 2017 Elsevier Ltd. All rights reserved.
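The Ornstein-Uhlenbeck noise source can be sketched in floating point with an Euler-Maruyama update; the actual design ran in fixed-point arithmetic pipelines on the FPGA, and the parameters here are illustrative:

```python
# Euler-Maruyama simulation of an Ornstein-Uhlenbeck noise current,
# the kind of stochastic source added to silicon neuron models.
# Parameters are illustrative; the real design used fixed-point math.
import math
import random

def ou_process(n_steps, dt=0.1, tau=5.0, mu=0.0, sigma=1.0, seed=42):
    """Sample path of dI = -(I - mu)/tau dt + sigma dW."""
    rng = random.Random(seed)
    path = [mu]
    i = mu
    for _ in range(n_steps):
        dw = rng.gauss(0.0, math.sqrt(dt))  # Wiener increment
        i += -(i - mu) / tau * dt + sigma * dw
        path.append(i)
    return path

path = ou_process(1000)
```

The mean-reverting drift keeps the noise current fluctuating around `mu`, which is what makes the OU process a more biologically plausible noise model than white noise alone.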
NASA Astrophysics Data System (ADS)
Hernández, María Isabel; Couso, Digna; Pintó, Roser
2015-04-01
The study we have carried out aims to characterize 15- to 16-year-old students' learning progressions throughout the implementation of a teaching-learning sequence on the acoustic properties of materials. Our purpose is to better understand students' modeling processes about this topic and to identify how the instructional design and actual enactment influence students' learning progressions. This article presents the design principles which elicit the structure and types of modeling and inquiry activities designed to promote students' development of three conceptual models. Some of these activities are enhanced by the use of ICT such as sound level meters connected to data capture systems, which facilitate the measurement of the intensity level of sound emitted by a sound source and transmitted through different materials. Framing this study within the design-based research paradigm, it consists of the experimentation of the designed teaching sequence with two groups of students (n = 29) in their science classes. The analysis of students' written productions, together with classroom observations of the implementation of the teaching sequence, allowed us to characterize students' development of the conceptual models. Moreover, we found evidence of the influence of different modeling and inquiry activities on students' development of the conceptual models, identifying those that have a major impact on students' modeling processes. Having evidenced different levels of development of each conceptual model, our results have been interpreted in terms of the attributes of each conceptual model, the distance between students' preliminary mental models and the intended conceptual models, and the instructional design and enactment.
Cymatics for the cloaking of flexural vibrations in a structured plate
Misseroni, D.; Colquitt, D. J.; Movchan, A. B.; Movchan, N. V.; Jones, I. S.
2016-01-01
Based on rigorous theoretical findings, we present a proof-of-concept design for a structured square cloak enclosing a void in an elastic lattice. We implement high-precision fabrication and experimental testing of an elastic invisibility cloak for flexural waves in a mechanical lattice. This is accompanied by verifications and numerical modelling performed through finite element simulations. The primary advantage of our square lattice cloak, over other designs, is the straightforward implementation and the ease of construction. The elastic lattice cloak, implemented experimentally, shows high efficiency. PMID:27068339
IRDS prototyping with applications to the representation of EA/RA models
NASA Technical Reports Server (NTRS)
Lekkos, Anthony A.; Greenwood, Bruce
1988-01-01
The requirements and system overview for the Information Resources Dictionary System (IRDS) are described. A formal design specification for a scaled-down IRDS implementation compatible with the proposed FIPS IRDS standard is presented. The major design objectives for this IRDS include a menu-driven user interface, implementation of basic IRDS operations, and PC compatibility. The IRDS was implemented using the Smalltalk/5 object-oriented programming system and an ATT 6300 personal computer running under MS-DOS 3.1. The difficulties encountered in using Smalltalk are discussed.
Standards for detailed clinical models as the basis for medical data exchange and decision support.
Coyle, Joseph F; Mori, Angelo Rossi; Huff, Stanley M
2003-03-01
Detailed clinical models are necessary to exchange medical data between heterogeneous computer systems and to maintain consistency in a longitudinal electronic medical record system. At Intermountain Health Care (IHC), we have a history of designing detailed clinical models. The purpose of this paper is to share our experience and the lessons we have learned over the last 5 years. IHC's newest model is implemented using eXtensible Markup Language (XML) Schema as the formalism, and conforms to the Health Level Seven (HL7) version 3 data types. The centerpiece of the new strategy is the Clinical Event Model, which is a flexible name-value pair data structure that is tightly linked to a coded terminology. We describe IHC's third-generation strategy for representing and implementing detailed clinical models, and discuss the reasons for this design.
A Software Designed For STP Data Plot and Analysis Based on Object-oriented Methodology
NASA Astrophysics Data System (ADS)
Lina, L.; Murata, K.
2006-12-01
In the present study, we design a system named "STARS (Solar-Terrestrial data Analysis and Reference System)". The STARS provides a research environment in which researchers can refer to and analyse a variety of data with a single piece of software. The software design is based on the OMT (Object Modeling Technique), one of the object-oriented techniques, which offers advantages in maintainability, reuse, and long-term development of a system. At the Center for Information Technology, Ehime University, having completed the design of the STARS, we have already started implementing it. The latest version, the STARS5, was released in 2006. Any user can download the system from our WWW site (http://www.infonet.cite.ehime-u.ac.jp/STARS). The present paper is mainly devoted to the design of a data analysis software system. Throughout the design, we took care to keep it flexible and applicable when other developers design software for similar purposes; were our model particular only to our own purpose, it would be useless to other developers. In designing the domain object model, we carefully removed the parts that depend on system resources, e.g. hardware and software, and placed the dependent parts into the application object model. In the present design, therefore, the domain object model and the utility object model are independent of computer resources. This helps other developers construct their own systems based on the present design: they simply modify their own application object models according to their system resources. This division of the design into dependent and independent parts across three object models is one of the advantages of the OMT. If the design of software is done completely along with the OMT, implementation is rather simple and almost automatic: developers simply map their designs onto programs.
If one creates "another STARS" in a different programming language such as Java, the programmer can simply follow the present design as long as the language is object-oriented. Researchers may want to add their own data to STARS; in that case, they simply add their own data class to the domain object model. This works because any satellite data set has properties, such as time and date, that are inherited from the upper class. In this way, the effort required is less than with older methodologies. In the OMT, the description format of the system is rather strictly standardized. When new developers join the STARS project, they have only to understand each model to obtain an overview of STARS, and then follow these designs and documents to implement the system. The OMT thus makes it easy for a newcomer to join a project that is already running.
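The inheritance idea described above, in which a researcher adds a new data set by subclassing an upper class that already carries shared properties such as time and date, can be sketched as follows (a hypothetical illustration; class and field names are invented and are not from the actual STARS code):

```python
# Hypothetical sketch of a STARS-style domain object model: a common
# SatelliteData upper class holds the properties every data set shares,
# and a new data set is added by subclassing it.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class SatelliteData:
    """Upper class: properties inherited by every satellite data set."""
    satellite: str
    start: datetime
    end: datetime

    def covers(self, t: datetime) -> bool:
        # Shared behavior: does this data set cover the requested time?
        return self.start <= t <= self.end

@dataclass
class MagnetometerData(SatelliteData):
    """A new data class: only its own specifics need to be added."""
    sampling_rate_hz: float = 1.0

data = MagnetometerData("GEOTAIL", datetime(2006, 1, 1), datetime(2006, 1, 2))
print(data.covers(datetime(2006, 1, 1, 12)))  # True
```

Because the time and date handling lives in the upper class, the subclass stays small, which is the effort reduction the abstract attributes to the OMT approach.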
Design of a Model Execution Framework: Repetitive Object-Oriented Simulation Environment (ROSE)
NASA Technical Reports Server (NTRS)
Gray, Justin S.; Briggs, Jeffery L.
2008-01-01
The ROSE framework was designed to facilitate complex system analyses. It completely divorces the model execution process from the model itself. By doing so, ROSE frees the modeler to develop a library of standard modeling processes, such as design of experiments, optimizers, parameter studies, and sensitivity studies, which can then be applied to any of their available models. The ROSE framework accomplishes this by means of a well-defined API and object structure. Both the API and the object structure are presented here in enough detail to implement ROSE in any object-oriented language or modeling tool.
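The separation ROSE enforces can be illustrated with a minimal sketch (this is not the actual ROSE API; the interface and names below are invented for illustration): any process from the modeler's library, here a parameter study, runs against any model that satisfies a small execution interface.

```python
# Sketch of divorcing the execution process from the model: the process
# depends only on a narrow interface, never on a specific model.
from typing import Protocol

class Model(Protocol):
    def set_input(self, name: str, value: float) -> None: ...
    def execute(self) -> None: ...
    def get_output(self, name: str) -> float: ...

class ParabolaModel:
    """A trivial model: y = (x - 3)^2."""
    def __init__(self):
        self.x = 0.0
        self.y = 0.0
    def set_input(self, name, value):
        setattr(self, name, value)
    def execute(self):
        self.y = (self.x - 3.0) ** 2
    def get_output(self, name):
        return getattr(self, name)

def parameter_study(model: Model, name: str, values):
    """A reusable process: sweep one input, collect one output."""
    results = []
    for v in values:
        model.set_input(name, v)
        model.execute()
        results.append(model.get_output("y"))
    return results

print(parameter_study(ParabolaModel(), "x", [1.0, 3.0, 5.0]))  # [4.0, 0.0, 4.0]
```

The same `parameter_study` (or an optimizer, or a sensitivity study) could be pointed at any other model implementing the interface, which is the reuse the framework aims for.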
Generating Systems Biology Markup Language Models from the Synthetic Biology Open Language.
Roehner, Nicholas; Zhang, Zhen; Nguyen, Tramy; Myers, Chris J
2015-08-21
In the context of synthetic biology, model generation is the automated process of constructing biochemical models based on genetic designs. This paper discusses the use cases for model generation in genetic design automation (GDA) software tools and introduces the foundational concepts of standards and model annotation that make this process useful. Finally, this paper presents an implementation of model generation in the GDA software tool iBioSim and provides an example of generating a Systems Biology Markup Language (SBML) model from a design of a 4-input AND sensor written in the Synthetic Biology Open Language (SBOL).
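The mapping at the heart of model generation, from parts and interactions in a genetic design to species and reactions in a biochemical model, can be sketched roughly as follows (an illustrative toy, not iBioSim's implementation and not the libSBML API; the function and data layout are invented):

```python
# Toy sketch of model generation: every part in the genetic design becomes
# an SBML-style species, and every regulatory interaction becomes a
# production reaction annotated with the regulator's effect.
def generate_sbml_skeleton(design):
    """design: {'parts': [...], 'interactions': [(regulator, target, sign)]}"""
    species = [f'<species id="{p}" initialAmount="0"/>' for p in design["parts"]]
    reactions = [
        f'<reaction id="prod_{t}"><!-- {r} '
        f'{"activates" if s > 0 else "represses"} {t} --></reaction>'
        for r, t, s in design["interactions"]
    ]
    return species + reactions

# A one-gate design: TetR represses GFP production.
design = {"parts": ["TetR", "GFP"], "interactions": [("TetR", "GFP", -1)]}
for line in generate_sbml_skeleton(design):
    print(line)
```

A real generator would additionally emit kinetic laws and SBOL-derived annotations linking each model element back to the design, which is the traceability the paper's standards discussion concerns.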
An Instructional Design Framework for Fostering Student Engagement in Online Learning Environments
ERIC Educational Resources Information Center
Czerkawski, Betul C.; Lyman, Eugene W.
2016-01-01
Many approaches, models and frameworks exist when designing quality online learning environments. These approaches assist and guide instructional designers through the process of analysis, design, development, implementation and evaluation of instructional processes. Some of these frameworks are concerned with student participation, some with…
Chambers, Andrea; Mustard, Cameron A; Breslin, Curtis; Holness, Linn; Nichol, Kathryn
2013-01-22
Implementation effectiveness models have identified important factors that can promote the successful implementation of an innovation; however, these models have been examined within contexts where innovations are adopted voluntarily and often ignore the socio-political and environmental context. In the field of occupational health and safety, there are circumstances where organizations must adopt innovations to comply with a regulatory standard. Examining how the external environment can facilitate or challenge an organization's change process may add to our understanding of implementation effectiveness. The objective of this study is to describe implementation facilitators and barriers in the context of a regulation designed to promote the uptake of safer engineered medical devices in healthcare. The proposed study will focus on Ontario's safer needle regulation (2007) which requires healthcare organizations to transition to the use of safer engineered medical devices for the prevention of needlestick injuries. A collective case study design will be used to learn from the experiences of three acute care hospitals in the province of Ontario, Canada. Interviews with management and front-line healthcare workers and analysis of supporting documents will be used to describe the implementation experience and examine issues associated with the integration of these devices. The data collection and analysis process will be influenced by a conceptual framework that draws from implementation science and the occupational health and safety literature. The focus of this study in addition to the methodology creates a unique opportunity to contribute to the field of implementation science. First, the study will explore implementation experiences under circumstances where regulatory pressures are influencing the organization's change process. 
Second, the timing of this study provides an opportunity to focus on issues that arise during later stages of implementation, a phase during the implementation cycle that has been understudied. This study also provides the opportunity to examine the relevance and utility of current implementation science models in the field of occupational health where the adoption of an innovation is meant to enhance the health and safety of workers. Previous work has tended to focus almost exclusively on innovations that are designed to enhance an organization's productivity or competitive advantage.
Effective Implementation of Collaborative Care for Depression: What is Needed?
Whitebird, Robin R.; Solberg, Leif I.; Jaeckels, Nancy A.; Pietruszewski, Pamela B.; Hadzic, Senka; Unützer, Jürgen; Ohnsorg, Kris A.; Rossom, Rebecca C.; Beck, Arne; Joslyn, Ken; Rubenstein, Lisa V.
2014-01-01
Objective To identify the care model factors that were key for successful implementation of collaborative depression care in a statewide Minnesota primary care initiative. Study Design We used a mixed-methods design incorporating both qualitative data from clinic site visits and quantitative measures of patient activation and 6-month remission rates. Methods Care model factors identified from the site visits were tested for association with rates of activation into the program and remission rates. Results Nine factors were identified as important for successful implementation of collaborative care by the consultants who had trained and interviewed participating clinic teams. Factors correlated with higher patient activation rates were: strong leadership support (0.63), well-defined and implemented care manager roles (0.62), a strong primary care physician champion (0.60), and an on-site and accessible care manager (0.59). However, remission rates at six months were correlated with: an engaged psychiatrist (0.62), not seeing operating costs as a barrier to participation (0.56), and face-to-face communication (warm handoffs) between the care manager and primary care physician for new patients (0.54). Conclusions The care model factors most important for successful program implementation differ for patient activation into the program versus remission at six months. Knowing which factors matter most for each outcome will be useful for those interested in adopting this evidence-based approach to improve primary care for patients with depression. PMID:25365745
NASA Astrophysics Data System (ADS)
ChePa, Noraziah; Jasin, Noorhayati Md; Bakar, Nur Azzah Abu
2017-10-01
Failure to prevent or control the challenges of Information System (IS) implementation has led to implementation failures. Successful implementation of IS has been a challenging task for any organization, including government hospitals. Governments have invested large amounts of money in IS projects to improve service delivery in healthcare; however, several have failed to be implemented successfully due to various factors. This article proposes a prevention model that incorporates Change Management (CM) concepts to avoid the failure of IS implementation and thereby ensure its success. Challenges of IS implementation in government hospitals were identified through an extensive literature review and in-depth interviews. A prevention model was then designed to address these challenges. The model covers three main phases of implementation (pre-implementation, during implementation, and post-implementation) by adopting the CM practices of Lewin's, Kotter's, and Prosci's CM models. Six elements of CM, comprising thirteen sub-elements adopted from the three CM models, are used to handle critical failure factors (CFFs) related to human and support issues: guiding team, resistance avoidance, IS adoption, enforcement, monitoring, and IS sustainability. Successful practice of the proposed mapping is expected to prevent CFFs from occurring, hence ensuring a successful implementation of IS in the hospitals. The proposed model has been presented to and successfully evaluated by domain experts from the selected hospitals, and is believed to be beneficial for top management, IT practitioners, and medical practitioners in preventing IS implementation failure in government hospitals.
An Implementation of Wireless Body Area Networks for Improving Priority Data Transmission Delay.
Gündoğdu, Köksal; Çalhan, Ali
2016-03-01
The rapid growth of wireless sensor networks has enabled human health monitoring of patients using body sensor nodes that gather and evaluate human body parameters and movements. This study describes both a simulation model and an implementation of a new traffic-sensitive wireless body area network using a non-preemptive priority queue discipline. A wireless body area network employing TDMA is designed with three different priorities of data traffic. In addition, a coordinator node with a non-preemptive priority queue is implemented in this study. We have also developed, modeled, and simulated example network scenarios using the Riverbed Modeler simulation software in order to verify the implementation results. The simulation results obtained under various network load conditions are consistent with the implementation results.
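The coordinator's queueing discipline can be sketched as follows (an assumed minimal illustration, not the Riverbed Modeler implementation): a frame already in service is never interrupted, but at each service completion the highest-priority waiting frame is selected, with FIFO order inside each traffic class.

```python
# Sketch of a non-preemptive priority queue for three data traffic classes.
import heapq
import itertools

HIGH, MEDIUM, LOW = 0, 1, 2   # lower number = higher priority
counter = itertools.count()   # FIFO tie-break within a class

queue = []

def enqueue(priority, frame):
    heapq.heappush(queue, (priority, next(counter), frame))

def serve_next():
    """Called only when the previous frame finishes (non-preemptive)."""
    return heapq.heappop(queue)[2] if queue else None

# Frames arrive out of priority order; service order follows priority.
enqueue(LOW, "temperature")
enqueue(HIGH, "ECG-alarm")
enqueue(MEDIUM, "motion")
print([serve_next() for _ in range(3)])  # ['ECG-alarm', 'motion', 'temperature']
```

Under this discipline, high-priority traffic sees shorter queueing delay at the cost of longer waits for low-priority frames, which is the trade-off the study measures under varying load.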
ERIC Educational Resources Information Center
Lee, Heewon; Contento, Isobel R.; Koch, Pamela
2013-01-01
Objective: To use and review a conceptual model of process evaluation and to examine the implementation of a nutrition education curriculum, "Choice, Control & Change", designed to promote dietary and physical activity behaviors that reduce obesity risk. Design: A process evaluation study based on a systematic conceptual model. Setting: Five…
Harris, Melanie; Jones, Phil; Heartfield, Marie; Allstrom, Mary; Hancock, Janette; Lawn, Sharon; Battersby, Malcolm
2015-01-01
Health services introducing practice changes need effective implementation methods. Within the setting of a community mental health service offering recovery-oriented psychosocial support for people with mental illness, we aimed to: (i) identify a well-founded implementation model; and (ii) assess its practical usefulness in introducing a new programme for recovery-oriented self-management support. We reviewed the literature to identify implementation models applicable to community mental health organisations, and that also had corresponding measurement tools. We used one of these models to inform organisational change strategies. The literature review showed few models with corresponding tools. The Promoting Action on Research Implementation in Health Services (PARIHS) model and the related Organisational Readiness to Change Assessment (ORCA) tool were used. The PARIHS proposes prerequisites for health service change and the ORCA measures the extent to which these prerequisites are present. Application of the ORCA at two time points during implementation of the new programme showed strategy-related gains for some prerequisites but not for others, reflecting observed implementation progress. Additional strategies to address target prerequisites could be drawn from the PARIHS model. The PARIHS model and ORCA tool have potential in designing and monitoring practice change strategies in community mental health organisations. Further practical use and testing of implementation models appears justified in overcoming barriers to change.
Electroacoustic analysis, design, and implementation of a small balanced armature speaker.
Bai, Mingsian R; You, Bo-Cheng; Lo, Yi-Yang
2014-11-01
This paper presents a new design and implementation of a balanced armature speaker (BAS), which is composed of permanent magnetic circuits, a moving armature, and a coil. The armature rocks about a pivot with the coil at one end and the permanent magnet at the other. A magnetic circuit analysis is conducted for the designed BAS to formulate the force factor, which is required for modeling the coupling between the electrical and mechanical systems. In addition, an electromechanoacoustical analogous circuit is established for the BAS, which bears the same structure as that of a moving coil loudspeaker, except that the force factor is different. A hybrid model, which combines the lumped parameter model in the electrical and acoustical domains with a finite element model in the mechanical domain, is developed to capture the high-frequency response arising from the high-order modes of the membrane, the drive rod, and the armature. The electroacoustic analysis is experimentally verified. The results indicate that the sound pressure response simulated using the hybrid model agrees more closely with the measured response than does the response simulated using the lumped parameter model.
ERIC Educational Resources Information Center
Prescott, Stephanie, Ed.; And Others
This resource book is designed to assist teachers in implementing California's history-social science framework at the 10th grade level. The models support implementation at the local level and may be used to plan topics and select resources for professional development and preservice education. This document provides a link between the…
ERIC Educational Resources Information Center
Patton, Michael Quinn
2016-01-01
Fidelity concerns the extent to which a specific evaluation sufficiently incorporates the core characteristics of the overall approach to justify labeling that evaluation by its designated name. Fidelity has traditionally meant implementing a model in exactly the same way each time following the prescribed steps and procedures. The essential…
ERIC Educational Resources Information Center
Gowindasamy, Maniyarasi
2017-01-01
This study was conducted to evaluate the implementation of a reflective development model in improving intercultural competence among business students at Stamford College. The study focuses on local and international students' cultural competencies as developed through globalization subjects. An embedded design of mixed…
Geometrical model for DBMS: an experimental DBMS using IBM solid modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ali, D.E.D.L.
1985-01-01
This research presents a new model for database management systems (DBMS). The new model, Geometrical DBMS, is based on using solid modelling technology in designing and implementing a DBMS, and is implemented using the IBM solid modelling Geometric Design Processor (GDP). Built on computer-graphics concepts, Geometrical DBMS is a unique model. Traditionally, researchers start with one of the existing DBMS models and then put a graphical front end on it. In Geometrical DBMS, the graphical aspect of the model is not an alien concept tailored to the model but is, as a matter of fact, the atom around which the model is designed. The main idea in Geometrical DBMS is to allow the user and the system to refer to and manipulate data items as solid objects in 3D space, and to represent a record as a group of logically related solid objects. In Geometrical DBMS, a hierarchical structure is used to represent the data relations and the user sees the data as a group of arrays; yet, for the user and the system together, the data structure is a multidimensional tree.
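The central idea, treating each data item as a solid object placed in 3D space and a record as a hierarchy of logically related solids, can be sketched as follows (a toy illustration; the GDP-based implementation is not reproduced here, and the classes and fields are invented):

```python
# Toy sketch of the Geometrical DBMS idea: data items are solids with a
# 3D placement, records group related solids, and records form a tree.
from dataclasses import dataclass, field

@dataclass
class Solid:
    label: str
    position: tuple  # (x, y, z) placement in the model space

@dataclass
class Record:
    name: str
    items: list                                    # solids in this record
    children: list = field(default_factory=list)   # hierarchical relations

employee = Record(
    name="employee:42",
    items=[Solid("name", (0, 0, 0)), Solid("salary", (1, 0, 0))],
)
dept = Record(name="dept:sales", items=[Solid("title", (0, 0, 1))],
              children=[employee])

def labels(rec):
    """Traverse the hierarchy as the user would see the grouped solids."""
    out = [s.label for s in rec.items]
    for c in rec.children:
        out += labels(c)
    return out

print(labels(dept))  # ['title', 'name', 'salary']
```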
Sabater-Hernández, Daniel; Tudball, Jacqueline; Ferguson, Caleb; Franco-Trigo, Lucía; Hossain, Lutfun N; Benrimoj, Shalom I
2018-02-27
Community pharmacies provide a suitable setting to promote self-screening programs aimed at enhancing the early detection of atrial fibrillation (AF). Developing and implementing novel community pharmacy services (CPSs) is a complex and acknowledged challenge, which requires comprehensive planning and the participation of relevant stakeholders. Co-design processes are participatory research approaches that can enhance the development, evaluation and implementation of health services. The aim of this study was to co-design a pharmacist-led CPS aimed at enhancing self-monitoring/screening of AF. A 3-step co-design process was conducted using qualitative methods: (1) interviews and a focus group with potential service users (n = 8) to identify key needs and concerns; (2) a focus group with a mixed group of stakeholders (n = 8) to generate a preliminary model of the service; and (3) a focus group with community pharmacy owners and managers (n = 4) to explore the feasibility and appropriateness of the model. Data were analysed qualitatively to identify themes and intersections between themes. The JeMa2 model to conceptualize pharmacy-based health programs was used to build a theoretical model of the service. Stakeholders delineated: a clear target population (i.e., individuals ≥65 years old, with hypertension, with or without previous AF or stroke); the components of the service (i.e., patient education; self-monitoring at home; results evaluation, referral and follow-up); and a set of circumstances that may influence the implementation of the service (e.g., quality of the service, competency of the pharmacist, inter-professional relationships, etc.). A number of strategies were recommended to enable implementation (e.g., endorsement by leading cardiovascular organizations, appropriate communication methods and channels between the pharmacy and the general medical practice settings, etc.).
A novel, preliminary model of a CPS aimed at enhancing the management of AF was generated from this participatory process. This model can be used to inform decision-making processes aimed at adopting and piloting the service. It is expected that the co-designed service has been adapted to suit the existing needs of patients and current care practices, which, in turn, may increase the feasibility and acceptance of the service when it is implemented in a real setting.
ERIC Educational Resources Information Center
Dahm, Kevin; Riddell, William; Constans, Eric; Courtney, Jennifer; Harvey, Roberta; Von Lockette, Paris
2009-01-01
This paper discusses a sophomore-level course that teaches engineering design and technical writing. Historically, the course was taught using semester-long design projects. Most students' overall approach to design problems left considerable room for improvement. Many teams chose a design without investigating alternatives, and important…
A four stage approach for ontology-based health information system design.
Kuziemsky, Craig E; Lau, Francis
2010-11-01
To describe and illustrate a four stage methodological approach to capture user knowledge in a biomedical domain area, use that knowledge to design an ontology, and then implement and evaluate the ontology as a health information system (HIS). A hybrid participatory design-grounded theory (GT-PD) method was used to obtain data and code them for ontology development. Prototyping was used to implement the ontology as a computer-based tool. Usability testing evaluated the computer-based tool. An empirically derived domain ontology and set of three problem-solving approaches were developed as a formalized model of the concepts and categories from the GT coding. The ontology and problem-solving approaches were used to design and implement a HIS that tested favorably in usability testing. The four stage approach illustrated in this paper is useful for designing and implementing an ontology as the basis for a HIS. The approach extends existing ontology development methodologies by providing an empirical basis for theory incorporated into ontology design. Copyright © 2010 Elsevier B.V. All rights reserved.
Computerized Adaptive Testing System Design: Preliminary Design Considerations.
ERIC Educational Resources Information Center
Croll, Paul R.
A functional design model for a computerized adaptive testing (CAT) system was developed and presented through a series of hierarchy plus input-process-output (HIPO) diagrams. System functions were translated into system structure: specifically, into 34 software components. Implementation of the design in a physical system was addressed through…
Reduced complexity structural modeling for automated airframe synthesis
NASA Technical Reports Server (NTRS)
Hajela, Prabhat
1987-01-01
A procedure is developed for the optimum sizing of wing structures based on representing the built-up finite element assembly of the structure by equivalent beam models. The reduced-order beam models are computationally less demanding in an optimum design environment which dictates repetitive analysis of several trial designs. The design procedure is implemented in a computer program requiring geometry and loading information to create the wing finite element model and its equivalent beam model, and providing a rapid estimate of the optimum weight obtained from a fully stressed design approach applied to the beam. The synthesis procedure is demonstrated for representative conventional-cantilever and joined wing configurations.
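The fully stressed design approach applied to the equivalent beam can be sketched with its classic resizing rule (a hedged toy illustration, not the paper's program, which builds the beam model from geometry and loading data): each member's area is scaled by the ratio of its computed stress to the allowable stress, iterated until every member is at the allowable.

```python
# Sketch of fully stressed design (FSD) resizing for a set of members,
# assuming fixed member forces (a statically determinate simplification).
def fully_stressed_design(areas, loads, allowable, iters=50):
    """areas: member areas (m^2); loads: axial force per member (N)."""
    for _ in range(iters):
        stresses = [abs(p) / a for p, a in zip(loads, areas)]
        # Resize each member toward sigma == allowable.
        areas = [a * s / allowable for a, s in zip(areas, stresses)]
    return areas

# Two members carrying 10 kN and 4 kN with a 100 MPa allowable stress:
areas = fully_stressed_design([1e-4, 1e-4], [10e3, 4e3], 100e6)
print([round(a * 1e6, 1) for a in areas])  # areas in mm^2: [100.0, 40.0]
```

In a real wing model the member forces redistribute as stiffness changes, so the analysis of the beam model and the resizing step alternate, which is why the reduced-order model's cheap re-analysis matters in this loop.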
Sebok, Angelia; Wickens, Christopher D
2017-03-01
The objectives were to (a) implement theoretical perspectives regarding human-automation interaction (HAI) into model-based tools to assist designers in developing systems that support effective performance and (b) conduct validations to assess the ability of the models to predict operator performance. Two key concepts in HAI, the lumberjack analogy and black swan events, have been studied extensively. The lumberjack analogy describes the effects of imperfect automation on operator performance: in routine operations, an increased degree of automation supports performance, but in failure conditions, increased automation results in more significantly impaired performance. Black swans are the rare and unexpected failures of imperfect automation. The lumberjack analogy and black swan concepts have been implemented into three model-based tools that predict operator performance in different systems. These tools include a flight management system, a remotely controlled robotic arm, and an environmental process control system. Each modeling effort included a corresponding validation. In one validation, the software tool was used to compare three flight management system designs, which were ranked in the same order as predicted by subject matter experts. The second validation compared model-predicted operator complacency with empirical performance in the same conditions. The third validation compared model-predicted and empirically determined time to detect and repair faults in four automation conditions. The three model-based tools offer useful ways to predict operator performance in complex systems, including the effects of different automation designs on that performance.
Transitioning From Volume to Value: A Strategic Approach to Design and Implementation.
Randazzo, Geralyn; Brown, Zenobia
2016-01-01
As the health care delivery system migrates toward a model based on value rather than volume, nursing leaders play a key role in assisting in the design and implementation of new models of care to support this transition. This article provides an overview of one organization's approach to evolve in the direction of value while gaining the experience needed to scope and scale cross-continuum assets to meet this growing demand. This article outlines the development and deployment of an organizational structure, information technology integration, clinical implementation strategies, and tools and metrics utilized to evaluate the outcomes of value-based programs. Experience in Bundled Payments for Care Improvement program is highlighted. The outcomes and lessons learned are incorporated for those interested in advancing value-based endeavors in their own organizations.
The potential application of the blackboard model of problem solving to multidisciplinary design
NASA Technical Reports Server (NTRS)
Rogers, J. L.
1989-01-01
Problems associated with the sequential approach to multidisciplinary design are discussed. A blackboard model is suggested as a potential tool for implementing the multilevel decomposition approach to overcome these problems. The blackboard model serves as a global database for the solution with each discipline acting as a knowledge source for updating the solution. With this approach, it is possible for engineers to improve the coordination, communication, and cooperation in the conceptual design process, allowing them to achieve a more optimal design from an interdisciplinary standpoint.
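The blackboard architecture described above, a shared solution database updated by discipline knowledge sources under a simple control loop, can be sketched as follows (a toy illustration; the disciplines and update rules are invented, not taken from the report):

```python
# Minimal blackboard sketch: each discipline is a knowledge source that
# contributes to a shared design state when its inputs are available.
class Blackboard:
    def __init__(self):
        self.data = {"span": 30.0}  # shared, evolving design state

def aero_ks(bb):
    """Aerodynamics knowledge source (toy estimate)."""
    if "lift" not in bb.data:
        bb.data["lift"] = bb.data["span"] * 50.0
        return True
    return False

def structures_ks(bb):
    """Structures knowledge source: needs aero's result first."""
    if "lift" in bb.data and "spar_area" not in bb.data:
        bb.data["spar_area"] = bb.data["lift"] / 1000.0
        return True
    return False

bb = Blackboard()
sources = [structures_ks, aero_ks]  # order does not matter: the loop retries
while any(ks(bb) for ks in sources):
    pass  # keep cycling until no source can contribute
print(bb.data)  # {'span': 30.0, 'lift': 1500.0, 'spar_area': 1.5}
```

Because each source fires only when its inputs exist on the blackboard, the disciplines coordinate through the shared state rather than through a fixed sequential ordering, which is the improvement over the sequential approach the abstract criticizes.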
NASA Astrophysics Data System (ADS)
Rosenberg, D. E.
2008-12-01
Designing and implementing a hydro-economic computer model to support or facilitate collaborative decision making among multiple stakeholders or users can be challenging and daunting. Collaborative modeling is distinct from, and more difficult than, non-collaborative efforts because of the large number of users with different backgrounds; disagreement or conflict among stakeholders regarding problem definitions, modeling roles, and analysis methods; and evolving ideas of model scope and scale and needs for information and analysis as stakeholders interact, use the model, and learn about the underlying water system. This presentation reviews the lifecycle of collaborative model making and identifies key design decisions that stakeholders and model developers must make to develop robust and trusted, verifiable and transparent, integrated and flexible, and ultimately useful models. It advances some best practices to implement and program these decisions. Among these best practices are: 1) modular development of data-aware input, storage, manipulation, results-recording, and presentation components, plus ways to couple and link to other models and tools; 2) explicit structuring of both input data and the metadata that describes data sources, who acquired the data, gaps, and modifications or translations made to put the data in a form usable by the model; 3) in-line documentation of model inputs, assumptions, calculations, and results, plus ways for stakeholders to document their own model use and share results with others; and 4) flexible programming with graphical object-oriented properties and elements that allow users or the model maintainers to easily see and modify the spatial, temporal, or analysis scope as the collaborative process moves forward. We draw on examples of these best practices from the existing literature, the author's prior work, and some new applications just underway.
The presentation concludes by identifying some future directions for collaborative modeling including geo-spatial display and analysis, real-time operations, and internet-based tools plus the design and programming needed to implement these capabilities.
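Best practice 2 above, pairing each model input with metadata on its source, acquirer, gaps, and transformations, might be structured as in this sketch (field names and the example values are illustrative, not from the presentation):

```python
# Sketch of a data record that carries its own provenance metadata, so a
# collaborative model stays transparent about where each input came from.
from dataclasses import dataclass, field

@dataclass
class InputRecord:
    name: str
    values: list
    source: str            # where the data came from
    acquired_by: str       # who acquired it
    gaps: list = field(default_factory=list)             # missing periods
    transformations: list = field(default_factory=list)  # edits applied

inflow = InputRecord(
    name="monthly_inflow",
    values=[12.1, 9.8, None, 14.0],   # None marks the documented gap
    source="river gage (illustrative)",
    acquired_by="model team",
    gaps=["month 3"],
    transformations=["converted cfs to m^3/s"],
)
print(inflow.gaps)  # ['month 3']
```

Keeping the provenance next to the values, rather than in a separate report, lets any stakeholder audit an input before trusting a model result built on it.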
Linskell, Jeremy; Bouamrane, Matt-Mouley
2012-09-01
An assisted living space (ALS) is a technology-enabled environment designed to allow people with complex health or social care needs to remain, and live independently, in their own home for longer. However, many challenges remain in delivering usable systems acceptable to a diverse range of stakeholders, including end-users and their families and carers, as well as health and social care services. ALSs need to support activities of daily living while allowing end-users to maintain important social connections. They must be dynamic, flexible and adaptable living environments. In this article, we provide an overview of the technological landscape of assisted-living technology (ALT) and recent policies to promote an increased adoption of ALT in Scotland. We discuss our experiences in implementing technology-supported ALSs and emphasise key lessons. Finally, we propose an iterative and pragmatic user-centred implementation model for delivering ALSs in complex-needs scenarios. This empirical model is derived from our past ALS implementations. The proposed model allows project stakeholders to identify requirements, allocate tasks and responsibilities, and identify appropriate technological solutions for the delivery of functional ALS systems. The model is generic and makes no assumptions about needs or technology solutions, nor about the technical knowledge, skills and experience of the stakeholders involved in the ALS design process.
NASA Technical Reports Server (NTRS)
Frederick, D. K.; Lashmet, P. K.; Sandor, G. N.; Shen, C. N.; Smith, E. V.; Yerazunis, S. W.
1973-01-01
Problems related to the design and control of a mobile planetary vehicle to implement a systematic plan for the exploration of Mars are reported. Problem areas include: vehicle configuration, control, dynamics, systems and propulsion; systems analysis, terrain modeling and path selection; and chemical analysis of specimens. These tasks are summarized: vehicle model design, mathematical model of vehicle dynamics, experimental vehicle dynamics, obstacle negotiation, electrochemical controls, remote control, collapsibility and deployment, construction of a wheel tester, wheel analysis, payload design, system design optimization, effect of design assumptions, accessory optimal design, on-board computer subsystem, laser range measurement, discrete obstacle detection, obstacle detection systems, terrain modeling, path selection system simulation and evaluation, gas chromatograph/mass spectrometer system concepts, and chromatograph model evaluation and improvement.
Software Prototyping: Designing Systems for Users.
ERIC Educational Resources Information Center
Spies, Phyllis Bova
1983-01-01
Reports on major change in computer software development process--the prototype model, i.e., implementation of skeletal system that is enhanced during interaction with users. Expensive and unreliable software, software design errors, traditional development approach, resources required for prototyping, success stories, and systems designer's role…
A finite element model of rigid body structures actuated by dielectric elastomer actuators
NASA Astrophysics Data System (ADS)
Simone, F.; Linnebach, P.; Rizzello, G.; Seelecke, S.
2018-06-01
This paper presents finite element (FE) modeling and simulation of dielectric elastomer actuators (DEAs) coupled with articulated structures. DEAs have proven to be an effective transduction technology for the realization of large-deformation, low-power-consuming, and fast mechatronic actuators. However, the complex dynamic behavior of the material, characterized by nonlinearities and rate-dependent phenomena, makes it difficult to accurately model and design DEA systems. The problem is further complicated when the DEA is used to activate articulated structures, which increase both system complexity and the implementation effort of numerical simulation models. In this paper, we present a model-based tool which makes it possible to effectively implement and simulate complex articulated systems actuated by DEAs. A first prototype of a compact switch actuated by DEA membranes is chosen as the reference study to introduce the methodology. The commercially available FE software COMSOL is used for implementing and coupling a physics-based dynamic model of the DEA with the external structure, i.e., the switch. The model is then experimentally calibrated and validated in both quasi-static and dynamic loading conditions. Finally, preliminary results on how to use the simulation tool to optimize the design are presented.
ERIC Educational Resources Information Center
Kopp, Jason P.; Hulleman, Chris S.; Harackiewicz, Judith M.; Rozek, Chris
2012-01-01
Assessing fidelity of implementation is becoming increasingly important in education research, in particular as a tool for understanding variations in treatment effectiveness. Fidelity of implementation is defined as "the determination of how well an intervention is implemented in comparison with the original program design during an efficacy…
A Four-Stage Model for Planning Computer-Based Instruction.
ERIC Educational Resources Information Center
Morrison, Gary R.; Ross, Steven M.
1988-01-01
Describes a flexible planning process for developing computer based instruction (CBI) in which the CBI design is implemented on paper between the lesson design and the program production. A four-stage model is explained, including (1) an initial flowchart, (2) storyboards, (3) a detailed flowchart, and (4) an evaluation. (16 references)…
A Brief Introduction to Evidence-Centered Design. CSE Report 632
ERIC Educational Resources Information Center
Mislevy, Robert J.; Almond, Russell G.; Lukas, Janice F.
2004-01-01
Evidence-centered assessment design (ECD) is an approach to constructing educational assessments in terms of evidentiary arguments. This paper provides an introduction to the basic ideas of ECD, including some of the terminology and models that have been developed to implement the approach. In particular, it presents the high-level models of …
Feedback and Feed-Forward for Promoting Problem-Based Learning in Online Learning Environments
ERIC Educational Resources Information Center
Webb, Ashley; Moallem, Mahnaz
2016-01-01
Purpose: The study aimed to (1) review the literature to construct conceptual models that could guide instructional designers in developing problem/project-based learning environments while applying effective feedback strategies, (2) use the models to design, develop, and implement an online graduate course, and (3) assess the efficiency of the…
Programmable logic construction kits for hyper-real-time neuronal modeling.
Guerrero-Rivera, Ruben; Morrison, Abigail; Diesmann, Markus; Pearce, Tim C
2006-11-01
Programmable logic designs are presented that achieve exact integration of leaky integrate-and-fire soma and dynamical synapse neuronal models and incorporate spike-time dependent plasticity and axonal delays. Highly accurate numerical performance has been achieved by modifying simpler forward-Euler-based circuitry requiring minimal circuit allocation, which, as we show, behaves equivalently to exact integration. These designs have been implemented and simulated at the behavioral and physical device levels, demonstrating close agreement with both numerical and analytical results. By exploiting finely grained parallelism and single clock cycle numerical iteration, these designs achieve simulation speeds at least five orders of magnitude faster than the nervous system, termed here hyper-real-time operation, when deployed on commercially available field-programmable gate array (FPGA) devices. Taken together, our designs form a programmable logic construction kit of commonly used neuronal model elements that supports the building of large and complex architectures of spiking neuron networks for real-time neuromorphic implementation, neurophysiological interfacing, or efficient parameter space investigations.
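The exact-integration idea behind these designs can be sketched in software: for a leaky integrate-and-fire membrane with input held constant over a time step, the linear ODE has a closed-form update, so a single multiply-add per step reproduces the analytical solution that forward-Euler circuitry only approximates. The parameter values below are illustrative, not taken from the paper.

```python
import math

def lif_exact_step(v, i_in, dt, tau=0.02, r=1e7, v_rest=-0.070):
    """Exact integration of the leaky integrate-and-fire membrane equation
    dV/dt = (-(V - V_rest) + R*I) / tau over one step of length dt,
    assuming the input current is constant during the step."""
    v_inf = v_rest + r * i_in          # steady-state voltage for this input
    return v_inf + (v - v_inf) * math.exp(-dt / tau)

def lif_euler_step(v, i_in, dt, tau=0.02, r=1e7, v_rest=-0.070):
    """Forward-Euler step for the same equation: cheaper circuitry,
    but only approximate for finite dt."""
    return v + dt * (-(v - v_rest) + r * i_in) / tau

# One second at dt = 1 ms with a constant 2 nA input (no spike threshold).
v_exact = v_euler = -0.070
for _ in range(1000):
    v_exact = lif_exact_step(v_exact, 2e-9, 1e-3)
    v_euler = lif_euler_step(v_euler, 2e-9, 1e-3)
```

After many time constants both updates settle to the same steady state; the paper's contribution is a modified Euler-style circuit that matches the exact update at every step, not just at equilibrium.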
Designing an Agent-Based Model for Childhood Obesity Interventions: A Case Study of ChildObesity180.
Hennessy, Erin; Ornstein, Joseph T; Economos, Christina D; Herzog, Julia Bloom; Lynskey, Vanessa; Coffield, Edward; Hammond, Ross A
2016-01-07
Complex systems modeling can provide useful insights when designing and anticipating the impact of public health interventions. We developed an agent-based, or individual-based, computational model (ABM) to aid in evaluating and refining the implementation of behavior-change interventions designed to increase physical activity and healthy eating and reduce unnecessary weight gain among school-aged children. The potential benefits of applying an ABM approach include estimating outcomes despite data gaps, anticipating impact among different populations or scenarios, and exploring how to expand or modify an intervention. The practical challenges inherent in implementing such an approach include data resources, data availability, and the skills and knowledge of ABM among the public health obesity intervention community. The aim of this article was to provide a step-by-step guide on how to develop an ABM to evaluate multifaceted interventions on childhood obesity prevention in multiple settings. We used data from 2 obesity prevention initiatives and public-use resources. The details and goals of the interventions, an overview of the model design process, and the generalizability of this approach for future interventions are discussed.
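The skeleton of such a model can be conveyed in a few lines: agents carry individual state, a stochastic daily event drives weight change, and an intervention shifts the event probability. The sketch below is deliberately minimal; every parameter is an illustrative placeholder, not a value from the ChildObesity180 initiatives.

```python
import random

def run_abm(n_agents=1000, steps=100, intervention=False, seed=1):
    """Minimal agent-based model: each child agent gains a small amount
    of excess weight on an unhealthy day and loses a small amount on a
    healthy-behavior day; the intervention raises the probability of a
    healthy day.  All numbers are illustrative placeholders."""
    rng = random.Random(seed)
    p_healthy = 0.5 if intervention else 0.4
    excess = [0.0] * n_agents
    for _ in range(steps):
        for i in range(n_agents):
            excess[i] += -0.1 if rng.random() < p_healthy else 0.1
    return sum(excess) / n_agents      # mean excess weight gain

# Compare mean outcomes with and without the intervention.
baseline = run_abm(intervention=False)
treated = run_abm(intervention=True)
```

A real model of the kind the article describes would replace the placeholder probabilities with parameters estimated from intervention and public-use data, and the scalar state with richer individual attributes.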
Design, Implementation and Applications of 3d Web-Services in DB4GEO
NASA Astrophysics Data System (ADS)
Breunig, M.; Kuper, P. V.; Dittrich, A.; Wild, P.; Butwilowski, E.; Al-Doori, M.
2013-09-01
The object-oriented database architecture DB4GeO was originally designed to support sub-surface applications in the geo-sciences. This is reflected in DB4GeO's geometric data model as well as in its import and export functions. Initially, these functions were designed for communication with 3D geological modeling and visualization tools such as GOCAD or MeshLab. However, it soon became clear that DB4GeO was suitable for a much wider range of applications. It is therefore natural to move away from a standalone solution and to open access to DB4GeO data through standardized OGC web services. Though REST and OGC services seem incompatible at first sight, the implementation in DB4GeO shows that an OGC-based implementation of web services may reuse parts of the DB4GeO REST implementation. Starting with initial solutions in the history of DB4GeO, this paper introduces the design, adaptation (i.e., model transformation), and first steps in the implementation of OGC Web Feature Services (WFS) and Web Processing Services (WPS) as new interfaces to DB4GeO data and operations. Among its capabilities, DB4GeO can provide data in different formats such as GML, GOCAD, or DB3D XML through a WFS, and can run operations such as a 3D-to-2D service or mesh simplification (Progressive Meshes) through a WPS. We then demonstrate an Android-based mobile 3D augmented reality viewer for DB4GeO that uses the Web Feature Service to visualize 3D geo-database query results. Finally, we explore future research work considering DB4GeO in the framework of the research group "Computer-Aided Collaborative Subway Track Planning in Multi-Scale 3D City and Building Models".
Earth Observing System (EOS) Communication (Ecom) Modeling, Analysis, and Testbed (EMAT) activity
NASA Technical Reports Server (NTRS)
Desai, Vishal
1994-01-01
This paper describes the Earth Observing System (EOS) Communication (Ecom) Modeling, Analysis, and Testbed (EMAT) activity performed by Code 540 in support of the Ecom project. Ecom is the ground-to-ground data transport system for operational EOS traffic. The National Aeronautics and Space Administration (NASA) Communications (Nascom) Division, Code 540, is responsible for implementing Ecom. Ecom interfaces with various systems to transport EOS forward link commands, return link telemetry, and science payload data. To understand the complexities surrounding the design and implementation of Ecom, it is necessary that sufficient testbedding, modeling, and analysis be conducted prior to the design phase. These activities, when grouped, are referred to as the EMAT activity. This paper describes work accomplished to date in each of the three major EMAT activities: modeling, analysis, and testbedding.
ASIC implementation of recursive scaled discrete cosine transform algorithm
NASA Astrophysics Data System (ADS)
On, Bill N.; Narasimhan, Sam; Huang, Victor K.
1994-05-01
A program to implement the Recursive Scaled Discrete Cosine Transform (DCT) algorithm proposed by H. S. Hou has been undertaken at the Institute of Microelectronics. The design was implemented using a top-down methodology with VHDL (VHSIC Hardware Description Language) for chip modeling. Once the VHDL simulation was satisfactorily completed, the design was synthesized into gates using a synthesis tool. The architecture consists of two processing units together with a memory module for data storage and transpose. Each processing unit is composed of four pipelined stages, which allows the internal clock to run at one-eighth (1/8) the speed of the pixel clock. Each stage operates on eight pixels in parallel. As the data flow through each stage, adders and multipliers transform them into the desired coefficients. The Scaled IDCT was implemented in a similar fashion, with the adders and multipliers rearranged to perform the inverse DCT algorithm. The chip has been verified using Field Programmable Gate Array devices, and the design is operational. The combination of fewer required multiplications and a pipelined architecture gives Hou's Recursive Scaled DCT good potential for achieving high performance at low cost in a Very Large Scale Integration implementation.
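For reference, the transform such a chip computes is the 8-point DCT-II. The direct O(N²) definition below is the baseline against which fast factorizations like Hou's recursion save multiplications; the recursion itself is not reproduced here, since its derivation is not given in the abstract.

```python
import math

def dct8(x):
    """Direct 8-point DCT-II (orthonormal form): the O(N^2) reference
    definition of the transform.  Fast factorizations such as Hou's
    recursion compute the same coefficients with far fewer multiplies."""
    n = len(x)
    out = []
    for k in range(n):
        s = sum(x[j] * math.cos(math.pi * (2 * j + 1) * k / (2 * n))
                for j in range(n))
        scale = math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
        out.append(scale * s)
    return out
```

A constant input block produces a single DC coefficient and zeros elsewhere, which is a convenient sanity check for any hardware implementation of the transform.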
Using A Model-Based Systems Engineering Approach For Exploration Medical System Development
NASA Technical Reports Server (NTRS)
Hanson, A.; Mindock, J.; McGuire, K.; Reilly, J.; Cerro, J.; Othon, W.; Rubin, D.; Urbina, M.; Canga, M.
2017-01-01
NASA's Human Research Program's Exploration Medical Capabilities (ExMC) element is defining the medical system needs for exploration class missions. ExMC's Systems Engineering (SE) team will play a critical role in successful design and implementation of the medical system into exploration vehicles. The team's mission is to "Define, develop, validate, and manage the technical system design needed to implement exploration medical capabilities for Mars and test the design in a progression of proving grounds." Development of the medical system is being conducted in parallel with exploration mission architecture and vehicle design development. Successful implementation of the medical system in this environment will require a robust systems engineering approach to enable technical communication across communities to create a common mental model of the emergent engineering and medical systems. Model-Based Systems Engineering (MBSE) improves shared understanding of system needs and constraints between stakeholders and offers a common language for analysis. The ExMC SE team is using MBSE techniques to define operational needs, decompose requirements and architecture, and identify medical capabilities needed to support human exploration. Systems Modeling Language (SysML) is the specific language the SE team is utilizing, within an MBSE approach, to model the medical system functional needs, requirements, and architecture. Modeling methods are being developed through the practice of MBSE within the team, and tools are being selected to support meta-data exchange as integration points to other system models are identified. Use of MBSE is supporting the development of relationships across disciplines and NASA Centers to build trust and enable teamwork, enhance visibility of team goals, foster a culture of unbiased learning and serving, and be responsive to customer needs. 
The MBSE approach to medical system design offers a paradigm shift toward greater integration between vehicle and the medical system and directly supports the transition of Earth-reliant ISS operations to the Earth-independent operations envisioned for Mars. Here, we describe the methods and approach to building this integrated model.
The integration between Business Model Canvas and Manufacturing System Design
NASA Astrophysics Data System (ADS)
Prasetyawan, Y.; Maulida, N.; Lutvitasari, M. R.
2018-04-01
Business Model Canvas (BMC) is an increasingly popular business design tool, especially for start-ups and new business players. In general, BMC seeks a balance among effective working patterns with suppliers, good relations with customers, and the ability to understand and manage internal resources. This balance expedites the implementation of Manufacturing System Design (MSD). At present, BMC and MSD are frequently applied separately at various business levels. A BMC business plan primarily aims to engage customers and explore potential revenue to increase profits, while MSD primarily aims to meet production targets with available resources. The purpose of this research is to provide a roadmap that aligns BMC and MSD. A series of simple modified mathematical models and integration models is created to connect BMC and MSD. Results from several industries (new, developing, and mature) are presented as examples of implementation.
In-Flight System Identification
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.
1998-01-01
A method is proposed and studied whereby the system identification cycle consisting of experiment design and data analysis can be repeatedly implemented aboard a test aircraft in real time. This adaptive in-flight system identification scheme has many advantages, including increased flight test efficiency, adaptability to dynamic characteristics that are imperfectly known a priori, in-flight improvement of data quality through iterative input design, and immediate feedback of the quality of flight test results. The technique uses equation error in the frequency domain with a recursive Fourier transform for the real time data analysis, and simple design methods employing square wave input forms to design the test inputs in flight. Simulation examples are used to demonstrate that the technique produces increasingly accurate model parameter estimates resulting from sequentially designed and implemented flight test maneuvers. The method has reasonable computational requirements, and could be implemented aboard an aircraft in real time.
NASA Technical Reports Server (NTRS)
Ferrell, Bob A.; Lewis, Mark E.; Perotti, Jose M.; Brown, Barbara L.; Oostdyk, Rebecca L.; Goetz, Jesse W.
2010-01-01
This paper's main purpose is to detail issues and lessons learned regarding designing, integrating, and implementing Fault Detection Isolation and Recovery (FDIR) for Constellation Exploration Program (CxP) Ground Operations at Kennedy Space Center (KSC). As part of the overall implementation of the National Aeronautics and Space Administration's (NASA's) CxP, FDIR is being implemented in three main components of the program (Ares, Orion, and Ground Operations/Processing). While not initially part of the design baseline for CxP Ground Operations, NASA considered FDIR important enough to develop that NASA's Exploration Systems Mission Directorate's (ESMD's) Exploration Technology Development Program (ETDP) initiated a task for it under their Integrated System Health Management (ISHM) research area. This task, referred to as the FDIR project, is a multi-year, multi-center effort. The primary purpose of the FDIR project is to develop a prototype and pathway by which Fault Detection and Isolation (FDI) may be transitioned into the Ground Operations baseline. Currently, Qualtech Systems Inc. (QSI) Commercial Off The Shelf (COTS) software products Testability Engineering and Maintenance System (TEAMS) Designer and TEAMS RDS/RT are being utilized in the implementation of FDI within the FDIR project. The TEAMS Designer COTS software product is being utilized to model the system with Functional Fault Models (FFMs). A limited set of systems in Ground Operations is being modeled by the FDIR project, and the entire Ares Launch Vehicle is being modeled under the Functional Fault Analysis (FFA) project at Marshall Space Flight Center (MSFC). Integration of the Ares FFMs and the Ground Processing FFMs is also being done under the FDIR project, utilizing the TEAMS Designer COTS software product. One of the most significant challenges related to integration is ensuring that FFMs developed by different organizations can be integrated easily and without errors.
Software Interface Control Documents (ICDs) for the FFMs and their usage will be addressed as the solution to this issue. In particular, the advantages and disadvantages of these ICDs across physically separate development groups will be delineated.
Hospitalists: a chief nursing officer's perspective.
Olender, Lynda
2005-11-01
The hospitalist "specialty" is sweeping the inpatient setting, with the number of physicians choosing this specialty expected to exceed 20,000 by 2010. Yet little is known about the involvement of nursing in the design, implementation, and evaluation of a hospitalist initiative. The author argues for the chief nursing officer's pivotal role in proactively encouraging the design and implementation of a hospitalist-nurse manager patient-centered care delivery model. The chief nursing officer can create an environment that fosters research designed to identify outcomes from this partnership of hospitalist and clinical (nurse) manager.
Design and Implementation of the MSL Cruise Propulsion Tank Heaters
NASA Technical Reports Server (NTRS)
Krylo, Robert; Mikhaylov, Rebecca; Cucullu, Gordon; Watkins, Brenda
2008-01-01
This slide presentation reviews the design and implementation of the heaters for the Mars Science Laboratory (MSL). The pressurized tanks store hydrazine, which freezes at 2 C; heaters are therefore required to keep the hydrazine and helium at 36 C for the trip to Mars. Heat loss was analyzed using the TMG software, and a thermal model simulating a half-full tank yielded a 13 W heating requirement for each hemisphere. Views of the design and the heater are included.
77 FR 36489 - Agency Information Collection Activities: Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-19
... collection methods, including interviews and research, to inform the design, development, and implementation.... For example, information collected from consumers will help the CFPB to design model forms... used for quantitative information collections that are designed to yield statistically significant...
Optimizing Automatic Deployment Using Non-functional Requirement Annotations
NASA Astrophysics Data System (ADS)
Kugele, Stefan; Haberl, Wolfgang; Tautschnig, Michael; Wechs, Martin
Model-driven development has become common practice in the design of safety-critical real-time systems. High-level modeling constructs help to reduce the overall system complexity apparent to developers. This abstraction leads to fewer implementation errors in the resulting systems. In order to retain correctness of the model down to the software executed on a concrete platform, human faults during implementation must be avoided. This calls for an automatic, unattended deployment process including allocation, scheduling, and platform configuration.
The Three-Block Model of Universal Design for Learning Implementation in a High School
ERIC Educational Resources Information Center
Katz, Jennifer; Sugden, Ron
2013-01-01
The role of the school leader (principal) in supporting educational reform is explored through a case study of one high school implementing the Three Block Model of UDL (Katz, 2012a) in an effort to meet the needs of a diverse student population. This case study is a part of a much larger study exploring outcomes for students and teachers of…
DOE Office of Scientific and Technical Information (OSTI.GOV)
March-Leuba, S.; Jansen, J.F.; Kress, R.L.
1992-08-01
A new program package, Symbolic Manipulator Laboratory (SML), for the automatic generation of both kinematic and static manipulator models in symbolic form is presented. Critical design parameters may be identified and optimized using symbolic models, as shown in the sample application presented for the Future Armor Rearm System (FARS) arm. The computer-aided development of the symbolic models yields equations with reduced numerical complexity. Emphasis was placed on simplifying the closed-form solutions and on user-friendly operation. The main focus of this research is the development of a methodology, implemented in a computer program, capable of generating symbolic kinematic and static-force models of manipulators. That the models are obtained in trigonometrically reduced form is among the most significant results of this work and was the most difficult to implement. Mathematica, a commercial program for symbolic manipulation, is used to implement the package. SML is written so that the user can easily change any of the subroutines or create new ones. To assist the user, on-line help has been written to make SML a user-friendly package. Some sample applications are presented. The design and optimization of the 5-degrees-of-freedom (DOF) FARS manipulator using SML is discussed. Finally, the kinematic and static models of two different 7-DOF manipulators are calculated symbolically.
NASA Astrophysics Data System (ADS)
Takemiya, Tetsushi
In modern aerospace engineering, the physics-based computational design method is becoming more important, as it is more efficient than experiments and more suitable for designing new types of aircraft (e.g., unmanned aerial vehicles or supersonic business jets) than the conventional design method, which relies heavily on historical data. To enhance the reliability of the physics-based computational design method, researchers have made tremendous efforts to improve the fidelity of models. However, high-fidelity models require longer computational time, so the advantage of efficiency is partially lost. This problem has been overcome with the development of variable fidelity optimization (VFO). In VFO, models of different fidelity are employed simultaneously in order to improve the speed and accuracy of convergence in an optimization process. Among the various types of VFO methods, one of the most promising is the approximation management framework (AMF). In the AMF, objective and constraint functions of a low-fidelity model are scaled at a design point so that the scaled functions, referred to as "surrogate functions," match those of a high-fidelity model. Since the scaling functions and the low-fidelity model constitute the surrogate functions, evaluating the surrogate functions is faster than evaluating the high-fidelity model. Therefore, in the optimization process, in which gradient-based optimization is implemented and thus many function calls are required, the surrogate functions are used instead of the high-fidelity model to obtain a new design point. The best feature of the AMF is that it may converge to a local optimum of the high-fidelity model in much less computational time than optimization with the high-fidelity model alone.
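The scaling step can be illustrated in one dimension: choose a linear multiplicative correction beta(x) so the surrogate matches the high-fidelity function's value and slope at the current design point. This is a common first-order scheme for such frameworks; the dissertation's AMF generalizes the idea to vector design variables and constraint functions.

```python
def build_surrogate(f_hi, df_hi, f_lo, df_lo, x0):
    """First-order multiplicative scaling: choose beta(x) = beta0 +
    beta1*(x - x0) so that the surrogate s(x) = beta(x) * f_lo(x)
    matches the high-fidelity value and slope at the design point x0.
    Away from x0, evaluating s costs only a low-fidelity call."""
    beta0 = f_hi(x0) / f_lo(x0)
    beta1 = (df_hi(x0) - beta0 * df_lo(x0)) / f_lo(x0)
    return lambda x: (beta0 + beta1 * (x - x0)) * f_lo(x)
```

First-order consistency at x0 is exactly the condition that lets trust-region theory guarantee convergence to an optimum of the high-fidelity model while spending most function calls on the cheap surrogate.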
However, through literature surveys and implementations of the AMF, the author found that (1) the AMF is very vulnerable when the computational analysis models have numerical noise, which is very common in high-fidelity models, and that (2) the AMF terminates optimization erroneously when the optimization problems have constraints. The first problem is due to inaccuracy in computing derivatives in the AMF, and the second is due to erroneous treatment of the trust region ratio, which sets the size of the domain for an optimization in the AMF. In order to solve the first problem, the automatic differentiation (AD) technique, which reads the code of analysis models and automatically generates new derivative code based on mathematical rules, is applied. Derivatives computed with the generated derivative code are analytical, and the required computational time is independent of the number of design variables, which is very advantageous for realistic aerospace engineering problems. However, if analysis models implement iterative computations such as computational fluid dynamics (CFD), which solves systems of partial differential equations iteratively, computing derivatives through AD requires a massive amount of memory. The author solved this deficiency by modifying the AD approach and developing a more efficient implementation with CFD, and successfully applied AD to general CFD software. In order to solve the second problem, the governing equation of the trust region ratio, which is very strict against the violation of constraints, is modified so that it can accept violations of constraints within some tolerance. By accepting violations of constraints during the optimization process, the AMF can continue optimization without terminating prematurely and eventually find the true optimum design point.
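The principle behind AD can be shown with forward-mode "dual numbers": each value carries its derivative, and every arithmetic operator applies the chain rule, yielding machine-precision derivatives with no finite-difference step-size tuning. This is a toy illustration of the principle only; the dissertation applies source-transformation AD to CFD codes, which raises the memory issues discussed above.

```python
class Dual:
    """Forward-mode AD with dual numbers: each value carries its
    derivative, and each operator applies the chain rule, so results
    are analytical rather than finite-difference estimates.  Only +
    and * are defined here, which is enough for polynomials."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def derivative(f, x):
    """Seed the input with a unit derivative and read off f'(x)."""
    return f(Dual(x, 1.0)).dot
```

Note that one forward pass yields the derivative with respect to one input; the constant-cost-per-gradient property cited in the text comes from the reverse (adjoint) mode of source-transformation AD.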
With these modifications, the AMF is referred to as the "Robust AMF," and it is applied to airfoil and wing aerodynamic design problems using Euler CFD software. The former problem has 21 design variables, and the latter 64. In both problems, derivatives computed with the proposed AD method are first compared with those computed with the finite-differentiation (FD) method; then the Robust AMF is run alongside the sequential quadratic programming (SQP) optimization method using only high-fidelity models. The proposed AD method computes derivatives more accurately and faster than the FD method, and the Robust AMF successfully optimizes the shapes of the airfoil and the wing in much less time than SQP with only high-fidelity models. These results clearly show the effectiveness of the Robust AMF. Finally, the feasibility of further reducing the computational time for calculating derivatives, and the need for an AMF whose optimum design point always remains in the feasible region, are discussed as future work.
Chang, Esther; Hancock, Karen; Hickman, Louise; Glasson, Janet; Davidson, Patricia
2007-09-01
There is a lack of research investigating models of nursing care for older hospitalised patients that address the nursing needs of this group. The objective of this study is to evaluate the efficacy of models of care for acutely ill older patients tailored to two contexts: an aged care specific ward and a medical ward. This is a repeated-measures design. Efficacy of the models was evaluated in terms of: patients' and nurses' satisfaction with care provided; increased activities of daily living; reduced unplanned hospital readmissions; and medication knowledge. The settings were an aged care specific ward and a medical ward in two Sydney teaching hospitals. There were two groups of patients aged 65 years or older who were admitted to hospital for an acute illness: those admitted prior to model implementation (n=232) and those admitted during model implementation (n=116). Patients with moderate or severe dementia were excluded. The two groups of nurses were the pre-model group (n=90), who were working on the medical and aged care wards prior to model implementation, and the post-model group (n=22), who were working on the wards during model implementation. Action research was used to develop the models of care in the two wards: one for the aged care specific ward and another for the general medical ward where older patients were admitted. The models developed were based on empirical data gathered in an earlier phase of this study. The models were successful in both wards in terms of increasing satisfaction levels in patients and nurses (p<0.001), increasing functional independence as measured by activities of daily living (p<0.01), and increasing medication knowledge (p<0.001). Findings indicate that models of care developed by nurses using an evidence-based action research strategy can enhance both satisfaction and health outcomes in older patients.
FPGA implementation of predictive degradation model for engine oil lifetime
NASA Astrophysics Data System (ADS)
Idros, M. F. M.; Razak, A. H. A.; Junid, S. A. M. Al; Suliman, S. I.; Halim, A. K.
2018-03-01
This paper presents the implementation of a linear regression model for degradation prediction in Register Transfer Logic (RTL) using Quartus II. A stationary model was identified in the degradation trend of engine oil in a vehicle using a time-series method. For the RTL implementation, the degradation model is written in Verilog HDL and input data are sampled at fixed times. A clock divider was designed to support the timing sequence of the input data. At every five data points, a regression analysis determines the slope variation and computes the prediction. Only negative slope values are considered for prediction, which reduces the number of logic gates required. The least-squares method is used to obtain the best linear model based on the mean values of the time-series data. The coded algorithm has been implemented on an FPGA for validation purposes. The result is the predicted time at which to change the engine oil.
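The five-point least-squares step can be sketched as a behavioral model: fit a line to the latest window and extrapolate to the threshold crossing whenever the slope is negative. The window contents and threshold below are illustrative; the actual design works in fixed-point Verilog.

```python
def predict_change_time(samples, times, threshold):
    """Least-squares line fit y = a + b*t over the latest window of
    samples (the design uses five), extrapolated to the time at which
    the degradation signal crosses the change-oil threshold.  Only a
    negative slope triggers a prediction, mirroring the RTL design."""
    n = len(samples)
    mt = sum(times) / n
    my = sum(samples) / n
    b = (sum((t - mt) * (y - my) for t, y in zip(times, samples))
         / sum((t - mt) ** 2 for t in times))
    if b >= 0:
        return None                 # no degradation trend yet
    a = my - b * mt
    return (threshold - a) / b

# Oil quality dropping 5 units per interval from 100: crosses 50 at t = 10.
t_pred = predict_change_time([100, 95, 90, 85, 80], [0, 1, 2, 3, 4], 50)
```

Restricting predictions to negative slopes is what lets the hardware omit the logic for the non-degrading case, which is the gate-count saving the abstract mentions.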
Electrooptical adaptive switching network for the hypercube computer
NASA Technical Reports Server (NTRS)
Chow, E.; Peterson, J.
1988-01-01
An all-optical network design for the hyperswitch network using regular free-space interconnects between electronic processor nodes is presented. The adaptive routing model used is described, and an adaptive routing control example is presented. The design demonstrates that existing electrooptical techniques are sufficient for implementing efficient parallel architectures without the need for more complex means of implementing arbitrary interconnection schemes. The electrooptical hyperswitch network significantly improves the communication performance of the hypercube computer.
ERIC Educational Resources Information Center
Alodwan, Talal; Almosa, Mosaab
2018-01-01
The study aimed to assess the effectiveness of a computer program based on the Analysis, Design, Development, Implementation and Evaluation (ADDIE) Model on the achievement of ninth graders' listening and reading comprehension skills in English. The study sample comprised 70 ninth graders during the second semester of the academic year 2016/2017. The…
Strengthening organizations to implement evidence-based clinical practices.
VanDeusen Lukas, Carol; Engle, Ryann L; Holmes, Sally K; Parker, Victoria A; Petzel, Robert A; Nealon Seibert, Marjorie; Shwartz, Michael; Sullivan, Jennifer L
2010-01-01
Despite recognition that implementation of evidence-based clinical practices (EBPs) usually depends on the structure and processes of the larger health care organizational context, the dynamics of implementation are not well understood. This project's aim was to deepen that understanding by implementing and evaluating an organizational model hypothesized to strengthen the ability of health care organizations to facilitate EBPs. CONCEPTUAL MODEL: The model posits that implementation of EBPs will be enhanced through the presence of three interacting components: active leadership commitment to quality, robust clinical process redesign incorporating EBPs into routine operations, and use of management structures and processes to support and align redesign. In a mixed-methods longitudinal comparative case study design, seven medical centers in one network in the Department of Veterans Affairs participated in an intervention to implement the organizational model over 3 years. The network was selected randomly from three interested in using the model. The target EBP was hand-hygiene compliance. Measures included ratings of implementation fidelity, observed hand-hygiene compliance, and factors affecting model implementation drawn from interviews. Analyses support the hypothesis that greater fidelity to the organizational model was associated with higher compliance with hand-hygiene guidelines. High-fidelity sites showed larger effect sizes for improvement in hand-hygiene compliance than lower-fidelity sites. Adherence to the organizational model was in turn affected by factors in three categories: urgency to improve, organizational environment, and improvement climate. Implementation of EBPs, particularly those that cut across multiple processes of care, is a complex process with many possibilities for failure. 
The results provide the basis for a refined understanding of relationships among components of the organizational model and factors in the organizational context affecting them. This understanding suggests practical lessons for future implementation efforts and contributes to theoretical understanding of the dynamics of the implementation of EBPs.
NASA Technical Reports Server (NTRS)
Allen, Cheryl L.
1991-01-01
Enhanced engineering tools can be obtained through the integration of expert system methodologies and existing design software. The application of these methodologies to the Spacecraft Design and Cost Model (SDCM) software provides an improved technique for the selection of hardware for unmanned spacecraft subsystem design. The Knowledge Engineering System (KES) expert system development tool was used to implement a smarter equipment selection algorithm than is currently achievable with a standard database system. The guidance, navigation, and control subsystem of the SDCM software was chosen as the initial subsystem for implementation. The portions of the SDCM code which compute the selection criteria and constraints remain intact, and the expert system equipment selection algorithm is embedded within this existing code. The architecture of this new methodology is described and its implementation is reported. The project background and a brief overview of the expert system are given, and once the details of the design are characterized, an example of its implementation is demonstrated.
Analysis, requirements and development of a collaborative social and medical services data model.
Bobroff, R B; Petermann, C A; Beck, J R; Buffone, G J
1994-01-01
In any medical and social service setting, patient data must be readily shared among multiple providers for delivery of expeditious, quality care. This paper describes the development and implementation of a generalized social and medical services data model for an ambulatory population. The model, part of the Collaborative Social and Medical Services System Project, is based on the data needs of the Baylor College of Medicine Teen Health Clinics and follows the guidelines of the ANSI HISPP/MSDS JWG for a Common Data Model. Design details were determined by informal staff interviews, operational observations, and examination of clinic guidelines and forms. The social and medical services data model is designed using object-oriented data modeling techniques and will be implemented in C++ using an Object-Oriented Database Management System.
Toolkit of Available EPA Green Infrastructure Modeling ...
This webinar will present a toolkit consisting of five EPA green infrastructure models and tools, along with communication material. This toolkit can be used as a teaching and quick reference resource for use by planners and developers when making green infrastructure implementation decisions. It can also be used for low impact development design competitions. Models and tools included: Green Infrastructure Wizard (GIWiz), Watershed Management Optimization Support Tool (WMOST), Visualizing Ecosystem Land Management Assessments (VELMA) Model, Storm Water Management Model (SWMM), and the National Stormwater Calculator (SWC).
Model-based design of RNA hybridization networks implemented in living cells.
Rodrigo, Guillermo; Prakash, Satya; Shen, Shensi; Majer, Eszter; Daròs, José-Antonio; Jaramillo, Alfonso
2017-09-19
Synthetic gene circuits allow the behavior of living cells to be reprogrammed, and non-coding small RNAs (sRNAs) are increasingly being used as programmable regulators of gene expression. However, sRNAs (natural or synthetic) are generally used to regulate single target genes, while complex dynamic behaviors would require networks of sRNAs regulating each other. Here, we report a strategy for implementing such networks that exploits hybridization reactions carried out exclusively by multifaceted sRNAs that are both targets of and triggers for other sRNAs. These networks are ultimately coupled to the control of gene expression. We relied on a thermodynamic model of the different stable conformational states underlying this system at the nucleotide level. To test our model, we designed five different RNA hybridization networks with a linear architecture, and we implemented them in Escherichia coli. We validated the network architecture at the molecular level by native polyacrylamide gel electrophoresis, as well as the network function at the bacterial population and single-cell levels with a fluorescent reporter. Our results suggest that it is possible to engineer complex cellular programs based on RNA from first principles. Because these networks are mainly based on physical interactions, our designs could be expanded to other organisms as portable regulatory resources or to implement biological computations. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
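The thermodynamic reasoning behind such designs can be sketched with a toy two-state calculation. The free-energy value, temperature, and the two-state reduction below are illustrative assumptions, not the paper's nucleotide-level model:

```python
import math

# Toy two-state thermodynamic sketch: the probability that an sRNA pair
# occupies the hybridized conformation follows a Boltzmann weighting of
# the free-energy difference dG (kcal/mol) between bound and unbound states.
R = 1.987e-3   # gas constant, kcal/(mol*K)
T = 310.0      # temperature, K (37 C)

def hybridized_fraction(dG):
    """Fraction of molecules in the bound state for free-energy difference dG."""
    w = math.exp(-dG / (R * T))
    return w / (1.0 + w)

# A strongly favorable interaction is almost fully bound; a neutral one
# sits at 50%.
strong = hybridized_fraction(-10.0)   # dG = -10 kcal/mol (assumed value)
neutral = hybridized_fraction(0.0)
```

In the paper's model each stable conformational state contributes its own free energy; this sketch keeps only two states to show the shape of the calculation.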
From good ideas to actions: a model-driven community collaborative to prevent childhood obesity.
Huberty, Jennifer L; Balluff, Mary; O'Dell, Molly; Peterson, Kerri
2010-01-01
Activate Omaha Kids, a community collaborative, was designed, implemented, and evaluated with the aim of preventing childhood obesity in the Omaha community. Activate Omaha Kids brought together key stakeholders and community leaders to create a community coalition. The coalition's aim was to oversee a long-term sustainable approach to preventing obesity. Following a planning phase, a business plan was developed that prioritized best practices to be implemented in Omaha. The business plan was developed using the Ecological Model, the Health Policy Model, and the Robert Wood Johnson Foundation Active Living by Design 5P model. The three models helped the community identify target populations and activities, which were then combined into a single model for sustainable change. Twenty-four initiatives were identified, over one million dollars in funding was secured, and evaluation strategies were identified. By using the models from the initial steps through evaluation, clear facilitation of the process was possible, and the result was a comprehensive, feasible plan. The use of the models to design a strategic plan was pivotal in building a sustainable coalition that can achieve measurable improvements in the health of children and prove replicable over time.
NASA Technical Reports Server (NTRS)
Briggs, Maxwell H.
2011-01-01
The Fission Power System (FPS) project is developing a Technology Demonstration Unit (TDU) to verify the performance and functionality of a subscale version of the FPS reference concept in a relevant environment, and to verify component and system models. As hardware is developed for the TDU, component and system models must be refined to include the details of specific component designs. This paper describes the development of a Sage-based pseudo-steady-state Stirling convertor model and its implementation into a system-level model of the TDU.
NASA Astrophysics Data System (ADS)
Marion, Giles M.; Kargel, Jeffrey S.
Implementation of the Pitzer approach is through the FREZCHEM (FREEZING CHEMISTRY) model, which is at the core of this work. This model was originally designed to simulate salt chemistries and freezing processes at low temperatures (-54 to 25°C) and 1 atm pressure. Over the years, this model has been broadened to include more chemistries (from 16 to 58 solid phases), a broader temperature range for some chemistries (to 113°C), and incorporation of a pressure dependence (1 to 1000 bars) into the model. Implementation, parameterization, validation, and limitations of the FREZCHEM model are extensively discussed in Chapter 3.
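As a rough illustration of the kind of equilibrium FREZCHEM resolves, the ideal colligative freezing-point-depression law gives a first-order estimate. The constants below are standard, but the ideal law is only a stand-in for the Pitzer activity model the chapter implements:

```python
# Crude sketch of the kind of calculation FREZCHEM performs. This uses the
# ideal freezing-point-depression law, not the Pitzer activity model: the
# freezing point of a dilute salt solution is lowered in proportion to the
# total molality of dissolved particles.
Kf = 1.86      # cryoscopic constant of water, K*kg/mol

def freezing_point(molality, ions_per_formula):
    """Freezing point (degrees C) of a dilute aqueous salt solution."""
    return 0.0 - Kf * ions_per_formula * molality

# 1 mol/kg NaCl (2 ions per formula unit) freezes near -3.7 C in this ideal
# approximation; Pitzer corrections matter at the high concentrations and
# pressures FREZCHEM targets.
t_nacl = freezing_point(1.0, 2)
```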
Implementing secure laptop-based testing in an undergraduate nursing program: a case study.
Tao, Jinyuan; Lorentz, B Chris; Hawes, Stacey; Rugless, Fely; Preston, Janice
2012-07-01
This article presents the implementation of secure laptop-based testing in an undergraduate nursing program. Details on how to design, develop, implement, and secure tests are discussed. The laptop-based testing model is also compared with the computer-laboratory-based testing model. Five elements of the laptop-based testing model are illustrated: (1) it simulates the national board examination, (2) security is achievable, (3) it is convenient for both instructors and students, (4) it provides students with hands-on practice, and (5) continuous technical support is the key.
Damani, Zaheed; MacKean, Gail; Bohm, Eric; Noseworthy, Tom; Wang, Jenney Meng Han; DeMone, Brie; Wright, Brock; Marshall, Deborah A
2018-02-01
Single-entry models (SEMs) in healthcare allow patients to see the next-available provider and have been shown to improve waiting times, access and patient flow for preference-sensitive, scheduled services. The Winnipeg Central Intake Service (WCIS) for hip and knee replacement surgery was implemented to improve access in the Winnipeg Regional Health Authority. This paper describes the system's design and implementation, along with its successes, challenges, and unanticipated consequences. On two occasions, during and following implementation, we interviewed all members of the WCIS project team, including processing engineers, waiting list coordinators, administrators and policy-makers, regarding their experiences. We used semi-structured telephone interviews to collect data and qualitative thematic analysis to analyze and interpret the findings. Respondents indicated that the overarching objectives of the WCIS were being met. Benefits included streamlined processes, greater patient access, and improved measurement and monitoring of outcomes. Challenges included low awareness, change readiness, and initial participation among stakeholders. Unanticipated consequences included workload increases, confusion around stakeholder expectations and under-reporting of data by surgeons' offices. Critical success factors for implementation included clear communication, robust data collection, physician leadership and patience by all, especially implementation teams. Although the WCIS was successfully implemented, key lessons and critical success factors were learned related to change management which, if considered and applied, can reduce unanticipated consequences, improve uptake and benefit new models of care. Copyright © 2017 Elsevier B.V. All rights reserved.
Holm Hansen, Christian; Warner, Pamela; Parker, Richard A; Walker, Brian R; Critchley, Hilary Od; Weir, Christopher J
2017-12-01
It is often unclear what specific adaptive trial design features lead to an efficient design which is also feasible to implement. This article describes the preparatory simulation study for a Bayesian response-adaptive dose-finding trial design. Dexamethasone for Excessive Menstruation aims to assess the efficacy of Dexamethasone in reducing excessive menstrual bleeding and to determine the best dose for further study. To maximise learning about the dose response, patients receive placebo or an active dose with randomisation probabilities adapting based on evidence from patients already recruited. The dose-response relationship is estimated using a flexible Bayesian Normal Dynamic Linear Model. Several competing design options were considered including: number of doses, proportion assigned to placebo, adaptation criterion, and number and timing of adaptations. We performed a fractional factorial study using SAS software to simulate virtual trial data for candidate adaptive designs under a variety of scenarios and to invoke WinBUGS for Bayesian model estimation. We analysed the simulated trial results using Normal linear models to estimate the effects of each design feature on empirical type I error and statistical power. Our readily-implemented approach using widely available statistical software identified a final design which performed robustly across a range of potential trial scenarios.
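A minimal sketch of response-adaptive randomisation of the kind simulated in this study follows. Binary responses with Beta posteriors stand in for the trial's Normal Dynamic Linear Model, and the arm count and response rates are hypothetical:

```python
import random
random.seed(1)

# Simplified sketch of response-adaptive randomisation: allocation
# probabilities track the posterior probability that each arm is best.
# (The DEXFEM design used a Normal Dynamic Linear Model over doses; here
# binary responses with Beta(1,1) priors stand in for it.)
n_arms = 4
successes = [0] * n_arms
failures = [0] * n_arms
true_rates = [0.2, 0.3, 0.5, 0.6]   # hypothetical dose-response

def allocation_probs(n_draws=2000):
    """Estimate P(arm is best) by sampling from each Beta posterior."""
    wins = [0] * n_arms
    for _ in range(n_draws):
        draws = [random.betavariate(1 + successes[a], 1 + failures[a])
                 for a in range(n_arms)]
        wins[draws.index(max(draws))] += 1
    return [w / n_draws for w in wins]

for patient in range(200):
    probs = allocation_probs()
    # randomise the next patient according to the current probabilities
    arm = random.choices(range(n_arms), weights=probs)[0]
    if random.random() < true_rates[arm]:
        successes[arm] += 1
    else:
        failures[arm] += 1

probs = allocation_probs()   # final allocation probabilities
```

In the trial design the number and timing of such adaptations, the placebo proportion, and the adaptation criterion were themselves the design features compared by simulation.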
NASA Technical Reports Server (NTRS)
Dixon, William; Fan, William; Lloyd, Joey; Pham, Nam-Anh; Stevens, Michael
1988-01-01
The design of the Soil Transport Implement (STI) for SKITTER is presented. The purpose of STI is to provide a protective layer of lunar soil for the lunar modules. The objective is to cover the lunar module with a layer of soil approximately two meters thick within a two week period. The amount of soil required to cover the module is roughly 77 dump truck loads or three million earth pounds. A spinning disk is employed to accomplish its task. STI is an autonomous, teleoperated system. The design incorporates the latest advances in composite materials and high strength, light weight alloys to achieve a high strength to weight ratio. The preliminary design should only be used to assess the feasibility of employing a spinning wheel as a soil transport implement. A mathematical model of the spinning wheel was used to evaluate the performance of this design.
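The feasibility question reduces, to first order, to ballistics: soil leaving the rim of a spinning disk at speed v = ωr follows a projectile arc in lunar gravity. The numbers below are illustrative, not the report's design values:

```python
import math

# Ballistic sketch of the spinning-disk soil thrower (illustrative physics,
# not the report's wheel model): soil released from the rim at speed
# v = omega * r and launch angle theta lands at range v^2 * sin(2*theta) / g.
g_moon = 1.62          # lunar surface gravity, m/s^2

def throw_range(omega, radius, theta_deg):
    """Flat-ground range (m) of soil released from the disk rim."""
    v = omega * radius
    return v**2 * math.sin(2 * math.radians(theta_deg)) / g_moon

# A 0.5 m disk spinning at 20 rad/s, throwing at 45 degrees, carries soil
# about 62 m in lunar gravity (assumed parameters).
r = throw_range(20.0, 0.5, 45.0)
```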
NASA Technical Reports Server (NTRS)
Frederick, D. K.; Lashmet, P. K.; Sandor, G. N.; Shen, C. N.; Smith, E. J.; Yerazunis, S. W.
1972-01-01
The problems related to the design and control of a mobile planetary vehicle to implement a systematic plan for the exploration of Mars were investigated. Problem areas receiving attention include: vehicle configuration, control, dynamics, systems and propulsion; systems analysis; navigation, terrain modeling and path selection; and chemical analysis of specimens. The following specific tasks were studied: vehicle model design, mathematical modeling of dynamic vehicle, experimental vehicle dynamics, obstacle negotiation, electromechanical controls, collapsibility and deployment, construction of a wheel tester, wheel analysis, payload design, system design optimization, effect of design assumptions, accessory optimal design, on-board computer subsystem, laser range measurement, discrete obstacle detection, obstacle detection systems, terrain modeling, path selection system simulation and evaluation, gas chromatograph/mass spectrometer system concepts, chromatograph model evaluation and improvement and transport parameter evaluation.
NASA Technical Reports Server (NTRS)
Frederick, D. K.; Lashmet, P. K.; Sandor, G. N.; Shen, C. N.; Smith, E. J.; Yerazunis, S. W.
1972-01-01
Investigation of problems related to the design and control of a mobile planetary vehicle to implement a systematic plan for the exploration of Mars has been undertaken. Problem areas receiving attention include: vehicle configuration, control, dynamics, systems and propulsion; systems analysis; terrain modeling and path selection; and chemical analysis of specimens. The following specific tasks have been under study: vehicle model design, mathematical modeling of a dynamic vehicle, experimental vehicle dynamics, obstacle negotiation, electromechanical controls, collapsibility and deployment, construction of a wheel tester, wheel analysis, payload design, system design optimization, effect of design assumptions, accessory optimal design, on-board computer subsystem, laser range measurement, discrete obstacle detection, obstacle detection systems, terrain modeling, path selection system simulation and evaluation, gas chromatograph/mass spectrometer system concepts, chromatograph model evaluation and improvement.
Hardware Prototyping of Neural Network based Fetal Electrocardiogram Extraction
NASA Astrophysics Data System (ADS)
Hasan, M. A.; Reaz, M. B. I.
2012-01-01
The aim of this paper is to model the algorithm for fetal ECG (FECG) extraction from composite abdominal ECG (AECG) using VHDL (Very High Speed Integrated Circuit Hardware Description Language) for FPGA (Field Programmable Gate Array) implementation. An artificial neural network that provides an efficient and effective way of separating the FECG signal from the composite AECG signal has been designed. The proposed method gives an accuracy of 93.7% for R-peak detection in FHR monitoring. The designed VHDL model is synthesized and fitted into Altera's Stratix II EP2S15F484C3 using the Quartus II version 8.0 Web Edition for FPGA implementation.
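A simplified stand-in for the extraction idea: a single linear neuron trained with the LMS rule cancels the maternal component, leaving a fetal residual. The paper's VHDL design implements a fuller neural network, and the signals below are synthetic sinusoids, not real ECG:

```python
import math

# Sketch of fetal-ECG extraction by adaptive cancellation: one linear
# "neuron" trained with the LMS rule learns the maternal component in the
# abdominal signal from a chest reference; the residual approximates the
# fetal signal.
n = 2000
maternal = [math.sin(2 * math.pi * 1.2 * t / 250) for t in range(n)]        # ~72 bpm
fetal    = [0.2 * math.sin(2 * math.pi * 2.4 * t / 250) for t in range(n)]  # ~144 bpm
abdominal = [2.0 * m + f for m, f in zip(maternal, fetal)]   # mixed recording

w, mu = 0.0, 0.01            # single adaptive weight and learning rate
extracted = []
for x, d in zip(maternal, abdominal):
    y = w * x                # neuron's estimate of the maternal component
    e = d - y                # residual = estimated fetal signal
    w += mu * e * x          # LMS weight update
    extracted.append(e)

# After convergence the weight approaches the true mixing gain of 2.0 and
# the residual tracks the fetal component.
```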
Object-orientated DBMS techniques for time-oriented medical record.
Pinciroli, F; Combi, C; Pozzi, G
1992-01-01
In implementing time-orientated medical record (TOMR) management systems, the relational model has played a major role. Many applications have been developed to extend query and data manipulation languages to temporal aspects of information. Our experience in developing TOMRs revealed some deficiencies in the relational model, such as: (a) abstract data type definition; (b) unified view of data at a programming level; (c) management of temporal data; (d) management of signals and images. We identified initial topics to address through an object-orientated approach to database design. This paper describes the first steps in designing and implementing a TOMR with an object-orientated DBMS.
Design and experimental evaluation of robust controllers for a two-wheeled robot
NASA Astrophysics Data System (ADS)
Kralev, J.; Slavov, Ts.; Petkov, P.
2016-11-01
The paper presents the design and experimental evaluation of two alternative μ-controllers for robust vertical stabilisation of a two-wheeled self-balancing robot. The controllers' design is based on models derived by identification from closed-loop experimental data. In the first design, a signal-based uncertainty representation obtained directly from the identification procedure is used, which leads to a controller of order 29. In the second design, the signal uncertainty is approximated by an input multiplicative uncertainty, which leads to a controller of order 50, subsequently reduced to 30. The performance of the two μ-controllers is compared with that of a conventional linear quadratic controller with a 17th-order Kalman filter. A proportional-integral controller of the rotational motion around the vertical axis is implemented as well. The control code is generated using Simulink® controller models and is embedded in a digital signal processor. Results from the simulation of the closed-loop system as well as experimental results obtained during the real-time implementation of the designed controllers are given. The theoretical investigation and experimental results confirm that the closed-loop system achieves robust performance with respect to the uncertainties related to the identified robot model.
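The conventional linear quadratic baseline can be sketched as a discrete LQR design for a linearised tilt model. The matrices below are an illustrative cart-pendulum stand-in, not the identified robot model from the paper:

```python
import numpy as np

# Discrete LQR sketch for a linearised self-balancing robot stand-in.
# State: [tilt angle, tilt rate]; the upright equilibrium is open-loop
# unstable, and the LQR gain K stabilises it.
dt = 0.01
A = np.array([[1.0, dt],
              [9.81 * dt, 1.0]])   # unstable upright equilibrium (assumed)
B = np.array([[0.0], [dt]])
Q = np.diag([100.0, 1.0])          # penalise tilt heavily
R = np.array([[0.1]])

# Backward Riccati iteration until the cost-to-go matrix P converges.
P = Q.copy()
for _ in range(5000):
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P = Q + A.T @ P @ (A - B @ K)

# With u = -K x, the closed-loop poles must lie inside the unit circle.
poles = np.linalg.eigvals(A - B @ K)
```

The μ-controllers in the paper additionally shape performance against the identified uncertainty; the LQR above is only the nominal-design building block.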
Applying Contamination Modelling to Spacecraft Propulsion Systems Designs and Operations
NASA Technical Reports Server (NTRS)
Chen, Philip T.; Thomson, Shaun; Woronowicz, Michael S.
2000-01-01
Molecular and particulate contaminants generated by the operation of a propulsion system may impinge on spacecraft critical surfaces. Plume depositions or clouds may hinder the spacecraft and instruments from performing normal operations. Firing thrusters will generate both molecular and particulate contaminants, so minimizing the contamination impact from the plume is critical for a successful mission. The resulting effects of molecular and particulate contamination from thruster firing are quite distinct. This paper will discuss the interconnection between spacecraft contamination modeling and propulsion system implementation. The paper will address an innovative contamination engineering approach implemented from spacecraft concept design through manufacturing, integration and test (I&T), launch, and on-orbit operations. This paper will also summarize the implementation on several successful missions. Although there are other contamination sources, only molecular contamination is considered here.
Dubois, Maarten
2012-09-01
Although economic theory supports the use of extended producer responsibility (EPR) to stimulate prevention and recycling of waste, EPR systems implemented in Europe are often criticized as a result of weak incentives for prevention and green product design. Using a stylized economic model, this article evaluates the efficiency of European EPR systems. The model reveals that the introduction of static collection targets creates a gap between theory and implementation. Static targets lead to inefficient market outcomes and weak incentives for prevention and green product design. The minimum collection targets should be complemented with a tax on producers for the non-collected waste fraction. Because such a tax internalizes the cost of waste disposal, more efficient price signals will lead to better incentives for waste management in a complex and dynamic market.
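The article's core argument can be reproduced in a stylized calculation (all numbers illustrative, not the paper's model): with only a static target, the cost-minimising producer collects exactly the target; a tax on the uncollected fraction pushes collection to where marginal cost equals the tax.

```python
# Stylised sketch of the EPR incentive argument: a producer faces a linear
# marginal collection cost mc(c) = a * c on the collected fraction c of its
# waste, plus (optionally) a tax t per unit left uncollected.
a = 10.0          # slope of marginal collection cost (assumed)
target = 0.5      # static minimum collection target, fraction of waste
tax = 8.0         # tax per uncollected unit (assumed)

def chosen_collection(target, tax):
    """Cost-minimising collection rate in [0, 1]."""
    # With a tax, collecting one more unit pays off while mc(c) = a*c < tax,
    # so the interior optimum is c = tax / a; the static target is a floor.
    interior = tax / a
    return min(1.0, max(target, interior))

# With only the static target the producer stops exactly at the target;
# adding the tax pushes collection beyond it.
only_target = chosen_collection(target, 0.0)
with_tax = chosen_collection(target, tax)
```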
ERIC Educational Resources Information Center
Lee, Chia-Jung; Kim, ChanMin
2017-01-01
This paper presents the third version of a technological pedagogical content knowledge (TPACK) based instructional design model that incorporates the distinctive, transformative, and integrative views of TPACK into a comprehensive actionable framework. Strategies of relating TPACK domains to real-life learning experiences, role-playing, and…
Quantifying Aluminum Crystal Size Part 1: The Model-Eliciting Activity
ERIC Educational Resources Information Center
Diefes-Dux, Heidi A.; Hjalmarson, Margret; Zawojewski, Judith S.; Bowman, Keith
2006-01-01
Model-eliciting activities (MEAs), specially designed client-driven, open-ended problems, have been implemented in a first-year engineering course and in secondary schools. The educational goals and settings are different, but the design of an MEA enables it to be versatile. This paper will introduce the reader to the principles that guide MEA…
A review of the solar array manufacturing industry costing standards
NASA Technical Reports Server (NTRS)
1977-01-01
The solar array manufacturing industry costing standards model is designed to compare the cost of producing solar arrays using alternative manufacturing processes. Constructive criticism of the methodology used is intended to enhance its implementation as a practical design tool. Three main elements of the procedure include workbook format and presentation, theoretical model validity and standard financial parameters.
2016-10-01
and implementation of embedded, adaptive feedback and performance assessment. The investigators also initiated work designing a Bayesian Belief Network. Keywords: trauma team training; teamwork; adaptability; adaptive performance; leadership; simulation; modeling; Bayesian belief networks (BBN)
Development of a Pedagogical Model to Help Engineering Faculty Design Interdisciplinary Curricula
ERIC Educational Resources Information Center
Navarro, Maria; Foutz, Timothy; Thompson, Sidney; Singer, Kerri Patrick
2016-01-01
The purpose of this study was to develop a model to help engineering faculty overcome the challenges they face when asked to design and implement interdisciplinary curricula. Researchers at a U.S. University worked with an Interdisciplinary Consultant Team and prepared a steering document with Guiding Principles and Essential Elements for the…
A Design Quality Learning Unit in OO Modeling Bridging the Engineer and the Artist
ERIC Educational Resources Information Center
Waguespack, Leslie J.
2015-01-01
Recent IS curriculum guidelines compress software development pedagogy into smaller and smaller pockets of course syllabi. Where undergraduate IS students once may have practiced modeling in analysis, design, and implementation across six or more courses in a curriculum using a variety of languages and tools they commonly now experience modeling…
A Simple Effect Size Estimator for Single Case Designs Using WinBUGS
ERIC Educational Resources Information Center
Rindskopf, David; Shadish, William; Hedges, Larry V.
2012-01-01
This conference presentation demonstrates a multilevel model for analyzing single case designs. The model is implemented in the Bayesian program WinBUGS. The authors show how it is possible to estimate a d-statistic like the one in Hedges, Pustejovsky and Shadish (2012) in this program. Results are demonstrated on an example.
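The basic building block of such an effect size is a standardised mean difference between phases; the sketch below computes it for hypothetical AB-design data. The actual estimator of Hedges, Pustejovsky and Shadish adds the multilevel and small-sample corrections that the presentation fits in WinBUGS:

```python
import math

# Standardised effect size for a single-case (AB) design:
# d = (treatment mean - baseline mean) / pooled within-phase SD.
baseline  = [3.0, 4.0, 3.0, 5.0, 4.0]   # hypothetical outcome, phase A
treatment = [7.0, 8.0, 6.0, 8.0, 7.0]   # hypothetical outcome, phase B

def phase_stats(xs):
    """Sample mean and (unbiased) variance of one phase."""
    m = sum(xs) / len(xs)
    var = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    return m, var

mA, vA = phase_stats(baseline)
mB, vB = phase_stats(treatment)
pooled_sd = math.sqrt(((len(baseline) - 1) * vA + (len(treatment) - 1) * vB)
                      / (len(baseline) + len(treatment) - 2))
d = (mB - mA) / pooled_sd
```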
School Principals' Decision-Making Behaviour in the Management of Innovation.
ERIC Educational Resources Information Center
McGeown, Vincent
1979-01-01
A rating scale operationalized a model for the adoption and implementation of educational innovation. Phases were designated: creating a climate for change; analyzing antecedent conditions; generating alternatives; initiating change adoption; implementing change; and evaluating change outcomes. Principals' decision-making behavior was the best…
Chessa, Manuela; Bianchi, Valentina; Zampetti, Massimo; Sabatini, Silvio P; Solari, Fabio
2012-01-01
The intrinsic parallelism of visual neural architectures based on distributed hierarchical layers is well suited to implementation on the multi-core architectures of modern graphics cards. We propose design strategies that optimally exploit this parallelism to efficiently map the hierarchy of layers and the canonical neural computations onto the GPU. Specifically, the advantages of a cortical map-like representation of the data are exploited. Moreover, a GPU implementation of a novel neural architecture for the computation of binocular disparity from stereo image pairs, based on populations of binocular energy neurons, is presented. The implemented neural model achieves good performance in terms of reliability of the disparity estimates and near real-time execution speed, thus demonstrating the effectiveness of the devised design strategies. The proposed approach is valid in general, since the neural building blocks we implemented are a common basis for the modeling of visual neural functionalities.
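The population-coding idea can be illustrated in one dimension: a bank of energy units, each tuned to a candidate disparity, is applied to a synthetic left/right pair, and the preferred disparity of the most active unit is the estimate. This is an illustrative NumPy sketch, not the paper's GPU implementation:

```python
import numpy as np

# Minimal 1-D binocular energy model: each unit filters the combined
# left/right input with a quadrature pair of Gabor filters at a candidate
# disparity; the matching disparity maximises the summed energy.
rng = np.random.default_rng(0)
true_disparity = 3
left = rng.standard_normal(256)
right = np.roll(left, true_disparity)   # right image = shifted left image

x = np.arange(-8, 9)
sigma, f = 3.0, 0.25
even = np.exp(-x**2 / (2 * sigma**2)) * np.cos(2 * np.pi * f * x)  # quadrature
odd  = np.exp(-x**2 / (2 * sigma**2)) * np.sin(2 * np.pi * f * x)  # Gabor pair

def energy(d):
    """Summed response of an energy unit tuned to disparity d."""
    r = np.roll(right, -d)              # undo a candidate shift
    e = np.convolve(left + r, even, mode='same')
    o = np.convolve(left + r, odd, mode='same')
    return np.sum(e**2 + o**2)

candidates = range(-6, 7)
estimate = max(candidates, key=energy)  # population readout: most active unit
```

Because every unit's response is independent of the others, the population maps naturally onto the GPU's data-parallel cores, which is the property the paper exploits.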
Model annotation for synthetic biology: automating model to nucleotide sequence conversion
Misirli, Goksel; Hallinan, Jennifer S.; Yu, Tommy; Lawson, James R.; Wimalaratne, Sarala M.; Cooling, Michael T.; Wipat, Anil
2011-01-01
Motivation: The need for the automated computational design of genetic circuits is becoming increasingly apparent with the advent of ever more complex and ambitious synthetic biology projects. Currently, most circuits are designed through the assembly of models of individual parts such as promoters, ribosome binding sites and coding sequences. These low level models are combined to produce a dynamic model of a larger device that exhibits a desired behaviour. The larger model then acts as a blueprint for physical implementation at the DNA level. However, the conversion of models of complex genetic circuits into DNA sequences is a non-trivial undertaking due to the complexity of mapping the model parts to their physical manifestation. Automating this process is further hampered by the lack of computationally tractable information in most models. Results: We describe a method for automatically generating DNA sequences from dynamic models implemented in CellML and Systems Biology Markup Language (SBML). We also identify the metadata needed to annotate models to facilitate automated conversion, and propose and demonstrate a method for the markup of these models using RDF. Our algorithm has been implemented in a software tool called MoSeC. Availability: The software is available from the authors' web site http://research.ncl.ac.uk/synthetic_biology/downloads.html. Contact: anil.wipat@ncl.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:21296753
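The conversion idea can be sketched with a toy part table. The part names and sequences below are hypothetical; MoSeC itself reads RDF annotations attached to CellML/SBML model elements:

```python
# Toy sketch of model-to-sequence conversion: each model element is
# annotated with a part type and a DNA sequence, and the converter
# concatenates the part sequences in the order given by the design.
# (Hypothetical parts; 'N' marks an unspecified spacer.)
parts = {
    "pPromoter":   {"type": "promoter", "seq": "TTGACA" + "N" * 17 + "TATAAT"},
    "rbsStrong":   {"type": "rbs",      "seq": "AGGAGG"},
    "cdsReporter": {"type": "cds",      "seq": "ATG" + "GCT" * 5 + "TAA"},
}

# The annotated model supplies the ordering of the transcription unit.
design = ["pPromoter", "rbsStrong", "cdsReporter"]

def to_sequence(design, parts):
    """Concatenate annotated part sequences in design order."""
    return "".join(parts[name]["seq"] for name in design)

construct = to_sequence(design, parts)
```

The hard part automated by MoSeC is recovering `design` and the per-part sequences from a dynamic model, which is why the paper's RDF metadata markup is needed.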
WISE Design for Knowledge Integration.
ERIC Educational Resources Information Center
Linn, Marcia C.; Clark, Douglas; Slotta, James D.
2003-01-01
Examines the implementation of Web-based Inquiry Science Environment (WISE), which can incorporate modeling tools and hand-held devices. Describes WISE design team practices, features of the WISE learning environment, and patterns of feature use in WISE library projects. (SOE)
Properties of a center/surround retinex. Part 2: Surround design
NASA Technical Reports Server (NTRS)
Jobson, Daniel J.; Woodell, Glenn A.
1995-01-01
The last version of Edwin Land's retinex model for human vision's lightness and color constancy has been implemented. Previous research has established the mathematical foundations of Land's retinex but has not examined specific design issues and their effects on the properties of the retinex operation. We have sought to define a practical implementation of the retinex without particular concern for its validity as a model for human lightness and color perception. Here we describe issues involved in designing the surround function. We find that there is a trade-off between rendition and dynamic range compression that is governed by the surround space constant. Various functional forms for the retinex surround are evaluated and a Gaussian form is found to perform better than the inverse square suggested by Land. Preliminary testing led to the design of a Gaussian surround with a space constant of 80 pixels as a reasonable compromise between dynamic range compression and rendition.
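A single-scale center/surround retinex with a Gaussian surround, as described above, can be sketched in one dimension. A small space constant and a synthetic step-plus-illuminant signal stand in for the 80-pixel surround used on real images:

```python
import numpy as np

# Center/surround retinex sketch: R(x) = log I(x) - log [I * G](x),
# with a Gaussian surround G of space constant sigma.
def gaussian_kernel(sigma, radius):
    x = np.arange(-radius, radius + 1, dtype=float)
    g = np.exp(-x**2 / (2 * sigma**2))
    return g / g.sum()

def retinex_1d(intensity, sigma=5.0):
    """log(center) minus log(Gaussian-weighted surround)."""
    g = gaussian_kernel(sigma, radius=int(3 * sigma))
    surround = np.convolve(intensity, g, mode='same')
    return np.log(intensity) - np.log(surround)

# A reflectance step under a slowly varying illuminant: the retinex output
# compresses the illumination ramp while keeping the reflectance edge.
x = np.arange(200, dtype=float)
reflectance = np.where(x < 100, 1.0, 2.0)
illumination = 1.0 + x / 200.0        # slow ramp
signal = reflectance * illumination
out = retinex_1d(signal)
```

The trade-off noted in the abstract appears directly here: a larger `sigma` improves rendition (less halo near the edge) but weakens dynamic range compression of the illumination ramp.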
Piezoelectric actuator design for MR elastography: implementation and vibration issues.
Tse, Zion Tsz Ho; Chan, Yum Ji; Janssen, Henning; Hamed, Abbi; Young, Ian; Lamperth, Michael
2011-09-01
MR elastography (MRE) is an emerging technique for tumor diagnosis. MRE actuation devices require precise mechanical design and radiofrequency engineering to achieve the required mechanical vibration performance and MR compatibility. A method of designing a general-purpose, compact and inexpensive MRE actuator is presented. It comprises piezoelectric bimorphs arranged in a resonant structure designed to operate at its resonant frequency for maximum vibration amplitude. An analytical model was established to understand the device vibration characteristics. The model-predicted performance was validated in experiments, showing its accuracy in predicting the actuator resonant frequency with an error < 4%. The device MRI compatibility was shown to cause minimal interference to a 1.5 tesla MRI scanner, with maximum signal-to-noise ratio reduction of 7.8% and generated artefact of 7.9 mm in MR images. A piezoelectric MRE actuator is proposed, and its implementation, vibration issues and future work are discussed. Copyright © 2011 John Wiley & Sons, Ltd.
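The lumped-parameter view behind such an analytical model can be illustrated with the elementary resonance formula. The stiffness and mass values are assumed for illustration, not the paper's device parameters:

```python
import math

# Lumped resonance sketch: near resonance the bimorph-plus-load structure
# behaves like a mass-spring system with f0 = (1 / (2*pi)) * sqrt(k / m).
def resonant_frequency(k, m):
    """Natural frequency in Hz of a lumped mass-spring resonator."""
    return math.sqrt(k / m) / (2 * math.pi)

k = 4.0e4   # effective stiffness, N/m (assumed)
m = 0.01    # effective moving mass, kg (assumed)
f0 = resonant_frequency(k, m)

# Driving the actuator at f0 maximises vibration amplitude for MRE; the
# paper's fuller model predicted the measured resonance within 4%.
```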
Rapid Prototyping of High Performance Signal Processing Applications
NASA Astrophysics Data System (ADS)
Sane, Nimish
Advances in embedded systems for digital signal processing (DSP) are enabling many scientific projects and commercial applications. At the same time, these applications are key to driving advances in many important kinds of computing platforms. In this region of high performance DSP, rapid prototyping is critical for faster time-to-market (e.g., in the wireless communications industry) or time-to-science (e.g., in radio astronomy). DSP system architectures have evolved from being based on application specific integrated circuits (ASICs) to incorporate reconfigurable off-the-shelf field programmable gate arrays (FPGAs), the latest multiprocessors such as graphics processing units (GPUs), or heterogeneous combinations of such devices. We thus have a vast design space to explore based on performance trade-offs, expanded by the multitude of possibilities for target platforms. To allow systematic design space exploration, and to develop scalable and portable prototypes, model-based design tools are increasingly used in the design and implementation of embedded systems. These tools allow scalable high-level representations, model-based semantics for analysis and optimization, and portable implementations that can be verified at higher levels of abstraction and targeted toward multiple platforms for implementation. The designer can experiment using such tools at an early stage in the design cycle and employ the latest hardware at later stages. In this thesis, we have focused on dataflow-based approaches for rapid DSP system prototyping. This thesis contributes to various aspects of dataflow-based design flows and tools as follows:
1. We have introduced the concept of topological patterns, which exploits commonly found repetitive patterns in DSP algorithms to allow scalable, concise, and parameterizable representations of large scale dataflow graphs in high-level languages. We have shown how an underlying design tool can systematically exploit a high-level application specification consisting of topological patterns in various aspects of the design flow.
2. We have formulated the core functional dataflow (CFDF) model of computation, which can be used to model a wide variety of deterministic dynamic dataflow behaviors. We have also presented key features of the CFDF model and tools based on these features. These tools provide support for heterogeneous dataflow behaviors, an intuitive and common framework for functional specification, support for functional simulation, portability from several existing dataflow models to CFDF, integrated emphasis on minimally-restricted specification of actor functionality, and support for efficient static, quasi-static, and dynamic scheduling techniques.
3. We have developed a generalized scheduling technique for CFDF graphs based on decomposition of a CFDF graph into static graphs that interact at run-time. Furthermore, we have refined this generalized scheduling technique using a new notion of "mode grouping," which better exposes the underlying static behavior. We have also developed a scheduling technique for a class of dynamic applications that generates parameterized looped schedules (PLSs), which can handle dynamic dataflow behavior without major limitations on compile-time predictability.
4. We have demonstrated the use of dataflow-based approaches for design and implementation of radio astronomy DSP systems using an application example of a tunable digital downconverter (TDD) for spectrometers. Design and implementation of this module has been an integral part of this thesis work. This thesis demonstrates a design flow that consists of a high-level software prototype, analysis, and simulation using the dataflow interchange format (DIF) tool, and integration of this design with the existing tool flow for the target implementation on an FPGA platform, called interconnect break-out board (IBOB).
We have also explored the trade-off between low hardware cost for fixed configurations of digital downconverters and flexibility offered by TDD designs. 5. This thesis has contributed significantly to the development and release of the latest version of a graph package oriented toward models of computation (MoCGraph). Our enhancements to this package include support for tree data structures, and generalized schedule trees (GSTs), which provide a useful data structure for a wide variety of schedule representations. Our extensions to the MoCGraph package provided key support for the CFDF model, and functional simulation capabilities in the DIF package.
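The core functional dataflow (CFDF) model described above splits each actor into modes with fixed token rates, plus an enable/invoke interface that makes dynamic behavior simulable. The sketch below illustrates that idea with a classic dynamic "switch" actor; the class and method names are illustrative stand-ins, not the DIF tool's actual API.

```python
# Illustrative sketch of a CFDF-style actor: each mode has fixed token
# consumption rates, enable() tests fireability for the current mode, and
# invoke() fires one mode and then selects the next mode. (Hypothetical
# API for illustration; not the DIF package's actual interface.)
from collections import deque

class Switch:
    """Dynamic 'switch' actor: a control token routes a data token to out0/out1."""
    MODES = {                      # mode -> tokens consumed from (ctrl, data)
        "control": (1, 0),
        "data": (0, 1),
    }

    def __init__(self):
        self.mode = "control"
        self.route = 0

    def enable(self, ctrl, data):
        need_c, need_d = self.MODES[self.mode]
        return len(ctrl) >= need_c and len(data) >= need_d

    def invoke(self, ctrl, data, out0, out1):
        if self.mode == "control":
            self.route = ctrl.popleft()    # read the routing decision
            self.mode = "data"             # next mode: consume one data token
        else:
            token = data.popleft()
            (out0 if self.route == 0 else out1).append(token)
            self.mode = "control"

# Functional simulation: route "a" to out0, then "b" to out1.
ctrl, data = deque([0, 1]), deque(["a", "b"])
out0, out1 = deque(), deque()
sw = Switch()
while sw.enable(ctrl, data):
    sw.invoke(ctrl, data, out0, out1)
```

Because each mode behaves like a static (SDF-style) rate signature, a scheduler can decompose the graph into static subgraphs that interact at run time, which is the intuition behind the generalized scheduling technique in contribution 3.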
Model Checking Verification and Validation at JPL and the NASA Fairmont IV&V Facility
NASA Technical Reports Server (NTRS)
Schneider, Frank; Easterbrook, Steve; Callahan, Jack; Montgomery, Todd
1999-01-01
We show how a technology transfer effort was carried out. The successful use of model checking on a pilot JPL flight project demonstrates the usefulness and the efficacy of the approach. The pilot project was used to model a complex spacecraft controller. Software design and implementation validation were carried out successfully. To suggest future applications we also show how the implementation validation step can be automated. The effort was followed by the formal introduction of the modeling technique as a part of the JPL Quality Assurance process.
Design and implementation of GaAs HBT circuits with ACME
NASA Technical Reports Server (NTRS)
Hutchings, Brad L.; Carter, Tony M.
1993-01-01
GaAs HBT circuits offer high performance (5-20 GHz) and radiation hardness (500 Mrad) that is attractive for space applications. ACME is a CAD tool specifically developed for HBT circuits. ACME implements a novel physical schematic-capture design technique where designers simultaneously view the structure and physical organization of a circuit. ACME's design interface is similar to schematic capture; however, unlike conventional schematic capture, designers can directly control the physical placement of both function and interconnect at the schematic level. In addition, ACME provides design-time parasitic extraction, complex wire models, and extensions to Multi-Chip Modules (MCMs). A GaAs HBT gate-array and semi-custom circuits have been developed with ACME; several circuits have been fabricated and found to be fully functional.
The purpose of the Interagency Steering Committee on Multimedia Environmental Modeling (ISCMEM) is to foster the exchange of information about environmental modeling tools, modeling frameworks, and environmental monitoring databases that are all in the public domain. It is compos...
Using Live Dual Modeling to Help Preservice Teachers Develop TPACK
ERIC Educational Resources Information Center
Lu, Liangyue; Lei, Jing
2012-01-01
To help preservice teachers learn about teaching with technology--specifically, technological pedagogical content knowledge (TPACK)--the researchers designed and implemented a Live Dual Modeling strategy involving both live behavior modeling and cognitive modeling in this study. Using qualitative research methods, the researchers investigated…
Consumer preference models: fuzzy theory approach
NASA Astrophysics Data System (ADS)
Turksen, I. B.; Wilson, I. A.
1993-12-01
Consumer preference models are widely used in new product design, marketing management, pricing and market segmentation. The purpose of this article is to develop and test a fuzzy set preference model which can represent linguistic variables in individual-level models implemented in parallel with existing conjoint models. The potential improvements in market share prediction and predictive validity can substantially improve management decisions about what to make (product design), for whom to make it (market segmentation) and how much to make (market share prediction).
Use of agents to implement an integrated computing environment
NASA Technical Reports Server (NTRS)
Hale, Mark A.; Craig, James I.
1995-01-01
Integrated Product and Process Development (IPPD) embodies the simultaneous application of both system and quality engineering methods throughout an iterative design process. The use of IPPD results in the time-conscious, cost-saving development of engineering systems. To implement IPPD, a Decision-Based Design perspective is encapsulated in an approach that focuses on the role of the human designer in product development. The approach has two parts and is outlined in this paper. First, an architecture, called DREAMS, is being developed that facilitates design from a decision-based perspective. Second, a supporting computing infrastructure, called IMAGE, is being designed. Agents are used to implement the overall infrastructure on the computer. Successful agent utilization requires that each agent comprise three components: the resource, the model, and the wrap. Current work is focused on the development of generalized agent schemes and associated demonstration projects. When in place, the technology-independent computing infrastructure will aid the designer in systematically generating knowledge used to facilitate decision-making.
A fuzzy set preference model for market share analysis
NASA Technical Reports Server (NTRS)
Turksen, I. B.; Willson, Ian A.
1992-01-01
Consumer preference models are widely used in new product design, marketing management, pricing, and market segmentation. The success of new products depends on accurate market share prediction and design decisions based on consumer preferences. The vague linguistic nature of consumer preferences and product attributes, combined with the substantial differences between individuals, creates a formidable challenge to marketing models. The most widely used methodology is conjoint analysis. Conjoint models, as currently implemented, represent linguistic preferences as ratio or interval-scaled numbers, use only numeric product attributes, and require aggregation of individuals for estimation purposes. It is not surprising that these models are costly to implement, are inflexible, and have a predictive validity that is not substantially better than chance. This affects the accuracy of market share estimates. A fuzzy set preference model can easily represent linguistic variables either in consumer preferences or product attributes with minimal measurement requirements (ordinal scales), while still estimating overall preferences suitable for market share prediction. This approach results in flexible individual-level conjoint models which can provide more accurate market share estimates from a smaller number of more meaningful consumer ratings. Fuzzy sets can be incorporated within existing preference model structures, such as a linear combination, using the techniques developed for conjoint analysis and market share estimation. The purpose of this article is to develop and fully test a fuzzy set preference model which can represent linguistic variables in individual-level models implemented in parallel with existing conjoint models. 
The potential improvements in market share prediction and predictive validity can substantially improve management decisions about what to make (product design), for whom to make it (market segmentation), and how much to make (market share prediction).
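The basic machinery the article builds on, linguistic ratings represented as fuzzy membership functions and combined in a linear-combination (conjoint-style) preference, can be sketched as follows. The term names, scale, and weights are invented for illustration and are not the authors' data.

```python
# Hedged sketch: ordinal linguistic ratings -> triangular fuzzy memberships,
# aggregated by a linear combination, as in individual-level conjoint models.
# All term definitions and weights below are illustrative assumptions.

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x == b:
        return 1.0
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Linguistic terms for an attribute rated on an ordinal 1..7 scale.
TERMS = {
    "low":    (1, 1, 4),
    "medium": (2, 4, 6),
    "high":   (4, 7, 7),
}

def fuzzify(rating):
    """Map one ordinal rating to membership degrees over the linguistic terms."""
    return {t: tri(rating, *abc) for t, abc in TERMS.items()}

def preference(ratings, weights):
    """Linear-combination preference: weighted membership in 'high' per attribute."""
    return sum(w * fuzzify(r)["high"] for r, w in zip(ratings, weights))

# Two hypothetical products rated on (price, quality, brand):
p1 = preference([6, 5, 4], [0.5, 0.3, 0.2])
p2 = preference([3, 7, 7], [0.5, 0.3, 0.2])
```

Because only ordinal ratings and membership functions are needed, a model of this shape can run per individual, which is the flexibility the abstract contrasts with aggregated ratio-scaled conjoint estimation.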
Model-based Systems Engineering: Creation and Implementation of Model Validation Rules for MOS 2.0
NASA Technical Reports Server (NTRS)
Schmidt, Conrad K.
2013-01-01
Model-based Systems Engineering (MBSE) is an emerging modeling approach that is used to enhance the system development process. MBSE allows for the centralization of project and system information that would otherwise be stored in extraneous locations, yielding better communication, expedited document generation and increased knowledge capture. Based on MBSE concepts and the employment of the Systems Modeling Language (SysML), extremely large and complex systems can be modeled from conceptual design through all phases of the system lifecycle. The Operations Revitalization Initiative (OpsRev) seeks to leverage MBSE to modernize the aging Advanced Multi-Mission Operations Systems (AMMOS) into the Mission Operations System 2.0 (MOS 2.0). The MOS 2.0 will be delivered in a series of conceptual and design models and documents built using the modeling tool MagicDraw. To ensure model completeness and cohesiveness, it is imperative that the MOS 2.0 models adhere to the specifications, patterns and profiles of the Mission Service Architecture Framework, thus leading to the use of validation rules. This paper outlines the process by which validation rules are identified, designed, implemented and tested. Ultimately, these rules provide the ability to maintain model correctness and synchronization in a simple, quick and effective manner, thus allowing the continuation of project and system progress.
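A validation rule of the kind described above is essentially a predicate evaluated over model elements, reporting every element that violates it. The toy element schema and rule names below are illustrative only, not the actual MOS 2.0 profiles or MagicDraw's validation API.

```python
# Hedged sketch: validation rules as predicates over a toy in-memory model.
# The schema (id/name/owner) and the two rules are invented for illustration.

def rule_has_owner(elem, model):
    """Every non-root element must be owned by another element in the model."""
    return elem["id"] == "root" or elem.get("owner") in model

def rule_named(elem, model):
    """Every element must carry a non-empty name."""
    return bool(elem.get("name"))

RULES = [rule_has_owner, rule_named]

def validate(model):
    """Return (element id, failed rule name) pairs for every violation."""
    failures = []
    for eid, elem in model.items():
        for rule in RULES:
            if not rule(elem, model):
                failures.append((eid, rule.__name__))
    return failures

model = {
    "root": {"id": "root", "name": "MOS"},
    "b1":   {"id": "b1", "name": "Downlink", "owner": "root"},
    "b2":   {"id": "b2", "name": "", "owner": "ghost"},   # violates both rules
}
issues = validate(model)
```

Running such rules automatically on every model delivery is what keeps a large SysML model synchronized with its framework's patterns without manual review of each element.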
The use of analytical models in human-computer interface design
NASA Technical Reports Server (NTRS)
Gugerty, Leo
1993-01-01
Recently, a large number of human-computer interface (HCI) researchers have investigated building analytical models of the user, which are often implemented as computer models. These models simulate the cognitive processes and task knowledge of the user in ways that allow a researcher or designer to estimate various aspects of an interface's usability, such as when user errors are likely to occur. This information can lead to design improvements. Analytical models can supplement design guidelines by providing designers rigorous ways of analyzing the information-processing requirements of specific tasks (i.e., task analysis). These models offer the potential of improving early designs and replacing some of the early phases of usability testing, thus reducing the cost of interface design. This paper describes some of the many analytical models that are currently being developed and evaluates the usefulness of analytical models for human-computer interface design. This paper will focus on computational, analytical models, such as the GOMS model, rather than less formal, verbal models, because the more exact predictions and task descriptions of computational models may be useful to designers. The paper also discusses some of the practical requirements for using analytical models in complex design organizations such as NASA.
Analytic Guided-Search Model of Human Performance Accuracy in Target- Localization Search Tasks
NASA Technical Reports Server (NTRS)
Eckstein, Miguel P.; Beutter, Brent R.; Stone, Leland S.
2000-01-01
Current models of human visual search have extended the traditional serial/parallel search dichotomy. Two successful models for predicting human visual search are the Guided Search model and the Signal Detection Theory model. Although these models are inherently different, it has been difficult to compare them because the Guided Search model is designed to predict response time, while Signal Detection Theory models are designed to predict performance accuracy. Moreover, current implementations of the Guided Search model require Monte Carlo simulations, which make quantitative fitting of the model's performance to human data computationally time-consuming. We have extended the Guided Search model to predict human accuracy in target-localization search tasks. We have also developed analytic expressions that simplify simulation of the model to the evaluation of a small set of equations using only three free parameters. This new implementation and extension of the Guided Search model will enable direct quantitative comparisons with human performance in target-localization search experiments and with the predictions of Signal Detection Theory and other search accuracy models.
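The accuracy side of this comparison builds on a standard signal-detection-theory result: for localizing one target among n locations under a max rule, P(correct) = ∫ φ(t − d′) Φ(t)^(n−1) dt. The sketch below evaluates that generic SDT expression numerically; it is the textbook formula, not the authors' extended Guided Search model.

```python
# Standard SDT max-rule localization accuracy, integrated numerically.
# phi/Phi are the standard normal density and CDF; d_prime is target strength.
import math

def phi(t):
    return math.exp(-t * t / 2) / math.sqrt(2 * math.pi)

def Phi(t):
    return 0.5 * (1 + math.erf(t / math.sqrt(2)))

def p_correct(d_prime, n_locations, lo=-8.0, hi=8.0, steps=4000):
    """Trapezoidal integration of P(correct) = integral phi(t - d') Phi(t)^(n-1) dt."""
    h = (hi - lo) / steps
    total = 0.0
    for i in range(steps + 1):
        t = lo + i * h
        w = 0.5 if i in (0, steps) else 1.0   # trapezoidal end weights
        total += w * phi(t - d_prime) * Phi(t) ** (n_locations - 1)
    return total * h
```

With d′ = 0 the formula reduces to chance (1/n), and accuracy rises monotonically with d′, which is the kind of closed-form behavior that lets accuracy models be fit without Monte Carlo simulation.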
Rosales, Rocío; Gongola, Leah; Homlitas, Christa
2015-01-01
A multiple baseline design across participants was used to evaluate the effects of video modeling with embedded instructions on training teachers to implement 3 preference assessments. Each assessment was conducted with a confederate learner or a child with autism during generalization probes. All teachers met the predetermined mastery criterion, and 2 of the 3 demonstrated skill maintenance at 1-month follow-up.
Implementing ARFORGEN: Installation Capability and Feasibility Study of Meeting ARFORGEN Guidelines
2007-07-26
aligning troop requirements with the Army’s new strategic mission, the force stabilization element of ARFORGEN was developed to raise the morale of...a discrete event simulation model developed for the project to mirror the reset process. The Unit Reset model is implemented in Java as a discrete...and transportation. Further, the typical installation support staff is manned by a Table of Distribution and Allowance (TDA) designed to
Simulation technique for modeling flow on floodplains and in coastal wetlands
Schaffranek, Raymond W.; Baltzer, Robert A.
1988-01-01
The system design is premised on a proven, areal two-dimensional, finite-difference flow/transport model which is supported by an operational set of computer programs for input data management and model output interpretation. The purposes of the project are (1) to demonstrate the utility of the model for providing useful highway design information, (2) to develop guidelines and procedures for using the simulation system for evaluation, analysis, and optimal design of highway crossings of floodplain and coastal wetland areas, and (3) to identify improvements which can be effected in the simulation system to better serve the needs of highway design engineers. Two case study model implementations, being conducted to demonstrate the simulation system and modeling procedure, are presented and discussed briefly.
Model-based Acceleration Control of Turbofan Engines with a Hammerstein-Wiener Representation
NASA Astrophysics Data System (ADS)
Wang, Jiqiang; Ye, Zhifeng; Hu, Zhongzhi; Wu, Xin; Dimirovsky, Georgi; Yue, Hong
2017-05-01
Acceleration control of turbofan engines is conventionally designed through either a schedule-based or an acceleration-based approach. With the widespread acceptance of model-based design in the aviation industry, it becomes necessary to investigate the issues associated with model-based design for acceleration control. In this paper, the challenges of implementing model-based acceleration control are explained; a novel Hammerstein-Wiener representation of engine models is introduced; and, based on the Hammerstein-Wiener model, a nonlinear generalized minimum-variance optimal control law is derived. A notable feature of the proposed approach is that it does not require the model-inversion operation that usually hampers nonlinear control techniques. The effectiveness of the proposed control design method is validated through a detailed numerical study.
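A Hammerstein-Wiener model sandwiches a linear dynamic block between two static nonlinearities: u → f(u) → linear dynamics → g(·) → y. The sketch below simulates that structure with a first-order linear block; the particular nonlinearities and coefficients are illustrative stand-ins, not the engine model identified in the paper.

```python
# Hammerstein-Wiener simulation sketch (illustrative parameters, not the
# paper's engine model): static input nonlinearity f, first-order linear
# dynamics, static output nonlinearity g.

def hammerstein_wiener(u_seq, f, g, a=0.8, b=0.2):
    """x[k+1] = a*x[k] + b*f(u[k]);  y[k] = g(x[k+1])."""
    x, y_seq = 0.0, []
    for u in u_seq:
        x = a * x + b * f(u)
        y_seq.append(g(x))
    return y_seq

f = lambda u: u ** 2            # hypothetical input nonlinearity (e.g., fuel-flow map)
g = lambda x: x + 0.1 * x ** 3  # hypothetical output nonlinearity (e.g., sensor map)

# Step response to a constant input of 1.0:
y = hammerstein_wiener([1.0] * 50, f, g)
```

Because the dynamics stay linear and only the two static maps are nonlinear, control laws can be derived on the linear core, which is why the structure avoids the inversion step that complicates fully nonlinear designs.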
ERIC Educational Resources Information Center
Fitzgerald, Michael; McKinnon, David H.; Danaia, Lena
2015-01-01
In this paper, we outline the theory behind the educational design used to implement a large-scale high school astronomy education project. This design was created in response to the realization of ineffective educational design in the initial early stages of the project. The new design follows an iterative improvement model where the materials…
An adaptive observer for on-line tool wear estimation in turning, Part I: Theory
NASA Astrophysics Data System (ADS)
Danai, Kourosh; Ulsoy, A. Galip
1987-04-01
On-line sensing of tool wear has been a long-standing goal of the manufacturing engineering community. In the absence of any reliable on-line tool wear sensors, a new model-based approach for tool wear estimation has been proposed. This approach is an adaptive observer, based on force measurement, which uses both parameter and state estimation techniques. The design of the adaptive observer is based upon a dynamic state model of tool wear in turning. This paper (Part I) presents the model, and explains its use as the basis for the adaptive observer design. This model uses flank wear and crater wear as state variables, feed as the input, and the cutting force as the output. The suitability of the model as the basis for adaptive observation is also verified. The implementation of the adaptive observer requires the design of a state observer and a parameter estimator. To obtain the model parameters for tuning the adaptive observer, procedures for linearisation of the non-linear model are specified. The implementation of the adaptive observer in turning and experimental results are presented in a companion paper (Part II).
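The state-observer half of such a scheme can be sketched as a discrete-time Luenberger observer: propagate a model copy of the state and correct it with the measured-output error. The 2-state system matrix, output map, and gains below are illustrative, not the linearised wear model of the paper (which also adds on-line parameter estimation).

```python
# Luenberger-observer sketch with a 2-state linear model (e.g., a wear-like
# state and its rate) and a noisy scalar measurement. All numbers here are
# illustrative assumptions, not the paper's identified model.
import random

A = [[1.0, 0.01], [0.0, 0.95]]   # state transition
C = [1.0, 0.0]                   # measurement depends on the first state
L = [0.5, 0.3]                   # observer correction gain

def step(x):
    return [A[0][0] * x[0] + A[0][1] * x[1],
            A[1][0] * x[0] + A[1][1] * x[1]]

def observe(y_meas, x_hat):
    """One observer update: predict with the model, correct with output error."""
    x_pred = step(x_hat)
    err = y_meas - (C[0] * x_pred[0] + C[1] * x_pred[1])
    return [x_pred[0] + L[0] * err, x_pred[1] + L[1] * err]

random.seed(0)
x, x_hat = [1.0, 0.5], [0.0, 0.0]      # true state vs. initially wrong estimate
for _ in range(200):
    x = step(x)
    y = C[0] * x[0] + C[1] * x[1] + random.gauss(0, 1e-3)  # noisy measurement
    x_hat = observe(y, x_hat)
```

With the gains chosen so that (I − LC)A is stable, the estimate converges to the true state despite the wrong initial guess; the adaptive part of the paper's observer additionally re-estimates the model parameters on line.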
76 FR 13136 - Notice of Submission for OMB Review
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-10
... design; (3) analyze the impact of receipt of RTT funds on student outcomes using an interrupted time series design; and (4) investigate the relationship between STM turnaround models (and strategies within...-performing schools. The evaluation is designed to (1) study the implementation of RTT and SIG; (2) analyze...
76 FR 77503 - Notice of Submission for OMB Review
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-13
... series design; and (4) investigate the relationship between STM turnaround models (and strategies within...-performing schools. The evaluation is designed to (1) study the implementation of RTT and SIG; (2) analyze the impact of SIG- or RTT-funded STMs on student outcomes using a regression discontinuity design; (3...
75 FR 78230 - Notice of Proposed Information Collection Requests
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-15
... discontinuity design; (3) analyze the impact of receipt of RTT funds on student outcomes using an interrupted time series design; and (4) investigate the relationship between STM turnaround models (and strategies... (STMs) in the lowest-performing schools. The evaluation is designed to (1) study the implementation of...
Designing for Productive Adaptations of Curriculum Interventions
ERIC Educational Resources Information Center
Debarger, Angela Haydel; Choppin, Jeffrey; Beauvineau, Yves; Moorthy, Savitha
2013-01-01
Productive adaptations at the classroom level are evidence-based curriculum adaptations that are responsive to the demands of a particular classroom context and still consistent with the core design principles and intentions of a curriculum intervention. The model of design-based implementation research (DBIR) offers insights into complexities and…
DOT National Transportation Integrated Search
2009-11-01
The development of the Mechanistic-Empirical Pavement Design Guide (MEPDG) under National Cooperative Highway Research Program (NCHRP) projects 1-37A and 1-40D has significantly improved the ability of pavement designers to model and simulate the eff...
Designing and Evaluating Representations to Model Pedagogy
ERIC Educational Resources Information Center
Masterman, Elizabeth; Craft, Brock
2013-01-01
This article presents the case for a theory-informed approach to designing and evaluating representations for implementation in digital tools to support Learning Design, using the framework of epistemic efficacy as an example. This framework, which is rooted in the literature of cognitive psychology, is operationalised through dimensions of fit…
Design of Energy Storage Reactors for Dc-To-Dc Converters. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Chen, D. Y.
1975-01-01
Two methodical approaches to the design of energy-storage reactors for a group of widely used dc-to-dc converters are presented. One of these approaches is based on a steady-state time-domain analysis of piecewise-linearized circuit models of the converters, while the other approach is based on an analysis of the same circuit models, but from an energy point of view. The design procedure developed from the first approach includes a search through a stored data file of magnetic core characteristics and results in a list of usable reactor designs which meet a particular converter's requirements. Because of the complexity of this procedure, a digital computer usually is used to implement the design algorithm. The second approach, based on a study of the storage and transfer of energy in the magnetic reactors, leads to a straightforward design procedure which can be implemented with hand calculations. An equation to determine the lower-bound volume of workable cores for given converter design specifications is derived. Using this computed lower-bound volume, a comparative evaluation of various converter configurations is presented.
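The first approach amounts to a constrained search over a core data file. The sketch below filters a hypothetical core table using the standard area-product sizing rule Ap = L·Ipk·Irms / (Bmax·J·Ku); the core entries and material constants are invented for illustration and are not the thesis's data file or its exact design equations.

```python
# Hedged sketch of a core-file search for a dc-to-dc converter reactor.
# Core names/values and sizing constants below are illustrative assumptions.

def required_area_product(L, i_pk, i_rms, b_max=0.3, j=4e6, k_u=0.4):
    """Standard area-product rule, in m^4: Ap = L*Ipk*Irms / (Bmax*J*Ku)."""
    return (L * i_pk * i_rms) / (b_max * j * k_u)

# Hypothetical stored core file: (name, area product in m^4).
CORES = [
    ("EE-25", 1.0e-9),
    ("EE-42", 1.5e-8),
    ("EE-55", 6.0e-8),
]

def usable_cores(L, i_pk, i_rms):
    """List cores whose area product meets the converter's requirement."""
    ap_req = required_area_product(L, i_pk, i_rms)
    return [name for name, ap in CORES if ap >= ap_req]

# Example requirement: 100 uH reactor carrying 5 A peak / 3.5 A rms.
candidates = usable_cores(100e-6, 5.0, 3.5)
```

The second, energy-based approach in the abstract corresponds to solving the same constraint in closed form for the smallest workable core volume, which is why it suits hand calculation.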
Design and implementation of a compliant robot with force feedback and strategy planning software
NASA Technical Reports Server (NTRS)
Premack, T.; Strempek, F. M.; Solis, L. A.; Brodd, S. S.; Cutler, E. P.; Purves, L. R.
1984-01-01
Force-feedback robotics techniques are being developed for automated precision assembly and servicing of NASA space flight equipment. Design and implementation of a prototype robot which provides compliance and monitors forces is in progress. Computer software to specify assembly steps and make force-feedback adjustments during assembly has been coded and tested for three generically different precision mating problems. A model program demonstrates that a suitably autonomous robot can plan its own strategy.
Development and implementation of a Bayesian-based aquifer vulnerability assessment in Florida
Arthur, J.D.; Wood, H.A.R.; Baker, A.E.; Cichon, J.R.; Raines, G.L.
2007-01-01
The Florida Aquifer Vulnerability Assessment (FAVA) was designed to provide a tool for environmental, regulatory, resource management, and planning professionals to facilitate protection of groundwater resources from surface sources of contamination. The FAVA project implements weights-of-evidence (WofE), a data-driven, Bayesian-probabilistic model to generate a series of maps reflecting relative aquifer vulnerability of Florida's principal aquifer systems. The vulnerability assessment process, from project design to map implementation, is described herein in reference to the Floridan aquifer system (FAS). The WofE model calculates weighted relationships between hydrogeologic data layers that influence aquifer vulnerability and ambient groundwater parameters in wells that reflect relative degrees of vulnerability. Statewide model input data layers (evidential themes) include soil hydraulic conductivity, density of karst features, thickness of aquifer confinement, and hydraulic head difference between the FAS and the water table. Wells with median dissolved nitrogen concentrations exceeding statistically established thresholds serve as training points in the WofE model. The resulting vulnerability map (response theme) reflects classified posterior probabilities based on spatial relationships between the evidential themes and training points. The response theme is subjected to extensive sensitivity and validation testing. Among the model validation techniques is calculation of a response theme based on a different water-quality indicator of relative recharge or vulnerability: dissolved oxygen. Successful implementation of the FAVA maps was facilitated by the overall project design, which included a needs assessment and iterative technical advisory committee input and review. Ongoing programs to protect Florida's springsheds have led to development of larger-scale WofE-based vulnerability assessments. 
Additional applications of the maps include land-use planning amendments and prioritization of land purchases to protect groundwater resources. © International Association for Mathematical Geology 2007.
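The weights-of-evidence step can be sketched with its standard formulas: for a binary evidential theme E and training points T, W+ = ln[P(E|T)/P(E|~T)] and W− = ln[P(~E|T)/P(~E|~T)], and a cell's posterior logit is the prior logit plus the weights of the themes present there. The cell and well counts below are invented for illustration, not FAVA data.

```python
# Standard weights-of-evidence calculation (illustrative counts, not FAVA's).
import math

def weights(n_te, n_t, n_e, n_total):
    """W+ and W- from training-point (T) and evidence (E) cell counts."""
    p_e_t = n_te / n_t                          # P(E | T)
    p_e_nt = (n_e - n_te) / (n_total - n_t)     # P(E | ~T)
    w_plus = math.log(p_e_t / p_e_nt)
    w_minus = math.log((1 - p_e_t) / (1 - p_e_nt))
    return w_plus, w_minus

def posterior_probability(prior, theme_weights):
    """Combine the prior probability with one weight (W+ or W-) per theme."""
    logit = math.log(prior / (1 - prior)) + sum(theme_weights)
    return 1 / (1 + math.exp(-logit))

# Hypothetical theme: 'high karst density' covers 2,000 of 10,000 cells
# and 60 of the 100 training wells.
wp, wm = weights(n_te=60, n_t=100, n_e=2000, n_total=10000)
p = posterior_probability(prior=100 / 10000, theme_weights=[wp])
```

A positive W+ (theme present) raises the posterior above the prior and a negative W− lowers it; classifying these posterior probabilities over all cells produces the response theme described above.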
De-Regil, Luz Maria; Peña-Rosas, Juan Pablo; Flores-Ayala, Rafael; del Socorro Jefferds, Maria Elena
2015-01-01
Objective: Nutrition interventions are critical to achieve the Millennium Development Goals; among them, micronutrient interventions are considered cost-effective and programmatically feasible to scale up, but there are limited tools to communicate the programme components and their relationships. The WHO/CDC (Centers for Disease Control and Prevention) logic model for micronutrient interventions in public health programmes is a useful resource for planning, implementation, monitoring and evaluation of these interventions, which depicts the programme theory and expected relationships between inputs and the Millennium Development Goals. Design: The model was developed by applying principles of programme evaluation, public health nutrition theory and programmatic expertise. The multifaceted and iterative structure validation included feedback from potential users and adaptation by national stakeholders involved in public health programmes' design and implementation. Results: In addition to the inputs, main activity domains identified as essential for programme development, implementation and performance include: (i) policy; (ii) products and supply; (iii) delivery systems; (iv) quality control; and (v) behaviour change communication. Outputs encompass the access to and coverage of interventions. Outcomes include knowledge and appropriate use of the intervention, as well as effects on micronutrient intake, nutritional status and health of target populations, for ultimate achievement of the Millennium Development Goals. Conclusions: The WHO/CDC logic model simplifies the process of developing a logic model by providing a tool that has identified high-priority areas and concepts that apply to virtually all public health micronutrient interventions. Countries can adapt it to their context in order to support programme design, implementation, monitoring and evaluation for the successful scale-up of nutrition interventions in public health. PMID:23507463
2012-01-01
Background: Currently, 1 in 88 children is diagnosed with an autism spectrum disorder (ASD), and the estimated cost for treatment services is $126 billion annually. Typically, ASD community providers (ASD-CPs) provide services to children with any severity of ASD symptoms using a combination of various treatment paradigms, some with an evidence base and some without. When evidence-based practices (EBPs) are successfully implemented by ASD-CPs, they can result in positive outcomes. Despite this promise, EBPs are often implemented unsuccessfully, and other treatments used by ASD-CPs lack supportive evidence, especially for school-age children with ASD. While it is not well understood why ASD-CPs are not implementing EBPs, organizational and individual characteristics likely play a role. As a response to this need and to improve the lives of children with ASD and their families, this study aims to develop and test the feasibility and acceptability of the Autism Model of Implementation (AMI) to support the implementation of EBPs by ASD-CPs. Methods/design: An academic-community collaboration, formed to partner with ASD-CPs, will facilitate the development of the AMI, a process specifically for use by ASD community-based agencies. Using a mixed methods approach, the project will assess agency and individual factors likely to facilitate or hinder implementing EBPs in this context; develop the AMI to address identified barriers and facilitators; and pilot test the AMI to examine its feasibility and acceptability using a specific EBP to treat anxiety disorders in school-age children with ASD. Discussion: The AMI will represent a data-informed approach to facilitate implementation of EBPs by ASD-CPs by providing an implementation model specifically developed for this context. This study is designed to address the real-world implications of EBP implementation in ASD community-based agencies. 
In doing so, the AMI will help to provide children with ASD the best and most effective services in their own community. Moreover, the proposed study will positively impact the field of implementation science by providing an empirically supported and tested model of implementation to facilitate the identification, adoption, and use of EBPs. PMID:22963616
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bobrek, Miljko; Albright, Austin P
This paper presents an FPGA implementation of a Reed-Solomon decoder for use in IEEE 802.16 WiMAX systems. The decoder is based on the RS(255,239) code and is additionally shortened and punctured according to the WiMAX specifications. A Simulink model based on the Xilinx System Generator (Sysgen) block library was used for simulation and hardware implementation. Finally, simulation results and hardware implementation performance are presented.
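The arithmetic underlying an RS(255,239) code (16 parity symbols over GF(2^8)) can be sketched in a few dozen lines: build the field tables, form the generator polynomial, encode systematically, and compute syndromes, which are all zero for a valid codeword and nonzero after corruption. Syndrome computation is only the first stage of decoding; the Berlekamp-Massey/Chien/Forney pipeline, the WiMAX shortening/puncturing, and the Sysgen FPGA mapping are beyond this behavioral sketch.

```python
# Minimal GF(2^8) Reed-Solomon encoder + syndrome sketch (behavioral only).
PRIM = 0x11d                      # a standard primitive polynomial for GF(2^8)
GF_EXP, GF_LOG = [0] * 512, [0] * 256
x = 1
for i in range(255):              # build exp/log tables
    GF_EXP[i] = x
    GF_LOG[x] = i
    x <<= 1
    if x & 0x100:
        x ^= PRIM
for i in range(255, 512):
    GF_EXP[i] = GF_EXP[i - 255]

def gf_mul(a, b):
    if a == 0 or b == 0:
        return 0
    return GF_EXP[GF_LOG[a] + GF_LOG[b]]

def rs_generator(nsym):
    """Generator polynomial with roots alpha^0 .. alpha^(nsym-1)."""
    g = [1]
    for i in range(nsym):
        root, ng = GF_EXP[i], [0] * (len(g) + 1)
        for j, c in enumerate(g):       # multiply g(x) by (x + alpha^i)
            ng[j] ^= c
            ng[j + 1] ^= gf_mul(c, root)
        g = ng
    return g

def rs_encode(msg, nsym):
    """Systematic encoding: append the remainder of msg*x^nsym mod g(x)."""
    gen = rs_generator(nsym)
    rem = list(msg) + [0] * nsym
    for i in range(len(msg)):
        coef = rem[i]
        if coef:
            for j in range(1, len(gen)):
                rem[i + j] ^= gf_mul(gen[j], coef)
    return list(msg) + rem[len(msg):]

def syndromes(codeword, nsym):
    """Evaluate the codeword polynomial at alpha^0 .. alpha^(nsym-1) (Horner)."""
    synd = []
    for i in range(nsym):
        s = 0
        for c in codeword:
            s = gf_mul(s, GF_EXP[i]) ^ c
        synd.append(s)
    return synd

msg = list(range(239))            # 239 message symbols, as in RS(255,239)
cw = rs_encode(msg, 16)
synd_ok = syndromes(cw, 16)
bad = list(cw)
bad[10] ^= 0x55                   # corrupt one symbol
synd_bad = syndromes(bad, 16)
```

Nonzero syndromes feed the error-locator computation in a full decoder; with 16 parity symbols, RS(255,239) corrects up to 8 symbol errors.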
Methods for design and evaluation of integrated hardware-software systems for concurrent computation
NASA Technical Reports Server (NTRS)
Pratt, T. W.
1985-01-01
Research activities and publications are briefly summarized. The major tasks reviewed are: (1) VAX implementation of the PISCES parallel programming environment; (2) Apollo workstation network implementation of the PISCES environment; (3) FLEX implementation of the PISCES environment; (4) sparse matrix iterative solver in PISCES Fortran; (5) image processing application of PISCES; and (6) a formal model of concurrent computation being developed.
ERIC Educational Resources Information Center
Ollis, Debbie; Harrison, Lyn
2016-01-01
Purpose: The health promoting school model is rarely implemented in relation to sexuality education. This paper reports on data collected as part of a five-year project designed to implement a health promoting and whole school approach to sexuality education in a five campus year 1-12 college in regional Victoria, Australia. Using a community…
ERIC Educational Resources Information Center
Braham, Hana Manor; Ben-Zvi, Dani
2017-01-01
A fundamental aspect of statistical inference is representation of real-world data using statistical models. This article analyzes students' articulations of statistical models and modeling during their first steps in making informal statistical inferences. An integrated modeling approach (IMA) was designed and implemented to help students…
Mohr, David C; Lyon, Aaron R; Lattie, Emily G; Reddy, Madhu; Schueller, Stephen M
2017-05-10
Mental health problems are common and pose a tremendous societal burden in terms of cost, morbidity, quality of life, and mortality. The great majority of people experience barriers that prevent access to treatment, aggravated by a lack of mental health specialists. Digital mental health is potentially useful in meeting the treatment needs of large numbers of people. A growing number of efficacy trials have shown strong outcomes for digital mental health treatments. Yet despite their positive findings, there are very few examples of successful implementations and many failures. Although the research-to-practice gap is not unique to digital mental health, the inclusion of technology poses unique challenges. We outline some of the reasons for this gap and propose a collection of methods that can result in sustainable digital mental health interventions. These methods draw from human-computer interaction and implementation science and are integrated into an Accelerated Creation-to-Sustainment (ACTS) model. The ACTS model uses an iterative process that includes 2 basic functions (design and evaluate) across 3 general phases (Create, Trial, and Sustain). The ultimate goal in using the ACTS model is to produce a functioning technology-enabled service (TES) that is sustainable in a real-world treatment setting. We emphasize the importance of the service component because evidence from both research and practice has suggested that human touch is a critical ingredient in the most efficacious and used digital mental health treatments. The Create phase results in at least a minimally viable TES and an implementation blueprint. The Trial phase requires evaluation of both effectiveness and implementation while allowing optimization and continuous quality improvement of the TES and implementation plan. Finally, the Sustainment phase involves the withdrawal of research or donor support, while leaving a functioning, continuously improving TES in place. 
The ACTS model is a step toward bringing implementation and sustainment into the design and evaluation of TESs, public health into clinical research, research into clinics, and treatment into the lives of our patients. ©David C. Mohr, Aaron R Lyon, Emily G Lattie, Madhu Reddy, Stephen M Schueller. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 10.05.2017.
DEM Calibration Approach: design of experiment
NASA Astrophysics Data System (ADS)
Boikov, A. V.; Savelev, R. V.; Payor, V. A.
2018-05-01
The problem of calibrating DEM models is considered in this article. It is proposed to divide the models' input parameters into those that require iterative calibration and those that can be measured directly. A new method for model calibration, based on a designed experiment over the iteratively calibrated parameters, is proposed. The experiment is conducted using a specially designed stand, and the results are processed with machine-vision algorithms. Approximating functions are obtained, and the error of the implemented software and hardware complex is estimated. The prospects of the obtained results are discussed.
A software development and evolution model based on decision-making
NASA Technical Reports Server (NTRS)
Wild, J. Christian; Dong, Jinghuan; Maly, Kurt
1991-01-01
Design is a complex activity whose purpose is to construct an artifact that satisfies a set of constraints and requirements; however, the design process itself is not well understood. The software design and evolution process is the focus of interest here, and a three-dimensional software development space organized around a decision-making paradigm is presented. An initial instantiation of this model, called 3DPM(sub p), which has been partly implemented, is presented, and the use of this model in software reuse and process management is discussed.
Flexible Environments for Grand-Challenge Simulation in Climate Science
NASA Astrophysics Data System (ADS)
Pierrehumbert, R.; Tobis, M.; Lin, J.; Dieterich, C.; Caballero, R.
2004-12-01
Current climate models are monolithic codes, generally in Fortran, aimed at high-performance simulation of the modern climate. Though they adequately serve their designated purpose, they present major barriers to application in other problems. Tailoring them to paleoclimate or planetary simulations, for instance, takes months of work. Theoretical studies, where one may want to remove selected processes or break feedback loops, are similarly hindered. Further, current climate models are of little value in education, since the implementation of textbook concepts and equations in the code is obscured by technical detail. The Climate Systems Center at the University of Chicago seeks to overcome these limitations by bringing modern object-oriented design into the business of climate modeling. Our ultimate goal is to produce an end-to-end modeling environment capable of configuring anything from a simple single-column radiative-convective model to a full 3-D coupled climate model using a uniform, flexible interface. Technically, the modeling environment is implemented as a Python-based software component toolkit: key number-crunching procedures are implemented as discrete, compiled-language components 'glued' together and co-ordinated by Python, combining the high performance of compiled languages with the flexibility and extensibility of Python. We are incrementally working towards this final objective following a series of distinct, complementary lines. 
We will present an overview of these activities, including PyOM, a Python-based finite-difference ocean model allowing run-time selection of different Arakawa grids and physical parameterizations; CliMT, an atmospheric modeling toolkit providing a library of 'legacy' radiative, convective and dynamical modules which can be knitted into dynamical models, and PyCCSM, a version of NCAR's Community Climate System Model in which the coupler and run-control architecture are re-implemented in Python, augmenting its flexibility and adaptability.
VLSI circuits implementing computational models of neocortical circuits.
Wijekoon, Jayawan H B; Dudek, Piotr
2012-09-15
This paper overviews the design and implementation of three neuromorphic integrated circuits developed for the COLAMN ("Novel Computing Architecture for Cognitive Systems based on the Laminar Microcircuitry of the Neocortex") project. The circuits are implemented in a standard 0.35 μm CMOS technology and include spiking and bursting neuron models, and synapses with short-term (facilitating/depressing) and long-term (STDP and dopamine-modulated STDP) dynamics. They enable execution of complex nonlinear models in accelerated-time, as compared with biology, and with low power consumption. The neural dynamics are implemented using analogue circuit techniques, with digital asynchronous event-based input and output. The circuits provide configurable hardware blocks that can be used to simulate a variety of neural networks. The paper presents experimental results obtained from the fabricated devices, and discusses the advantages and disadvantages of the analogue circuit approach to computational neural modelling. Copyright © 2012 Elsevier B.V. All rights reserved.
This fact sheet was designed to be used by technical staff responsible for identifying and implementing flow and transport models to support cleanup decisions at hazardous and radioactive waste sites.
This fact sheet summarizes the report by a joint Interagency Environmental Pathway Modeling Working Group. It was designed to be used by technical staff responsible for identifying and implementing flow and transport models to support cleanup decisions.
AIR QUALITY FORECAST DATABASE AND ANALYSIS
In 2003, NOAA and EPA signed a Memorandum of Agreement to collaborate on the design and implementation of a capability to produce daily air quality modeling forecast information for the U.S. NOAA's ETA meteorological model and EPA's Community Multiscale Air Quality (CMAQ) model ...
ERIC Educational Resources Information Center
Hoffman, Elise; And Others
In order to design and implement a plan to integrate human sexuality into the curriculum for associate degree nursing students at Alvin Community College (Texas), levels of knowledge, attitudes and skills necessary in promoting sexual health were defined. Of the four levels in the Mims and Swenson Sexual Health Model (life experiences, basic,…
Design and Impacts of a Youth-Directed Café Scientifique Program
ERIC Educational Resources Information Center
Hall, Michelle K.; Foutz, Susan; Mayhew, Michael A.
2013-01-01
We have modified the popular Cafe Scientifique model for engaging adults in dialog on issues at the nexus of science and society to address the specific needs and interests of high-school age youth. Key elements of the model are Youth Leadership Teams that guide the program design and assist with implementation; a speaker preparation process to…
ERIC Educational Resources Information Center
McLean, Hilary
2012-01-01
The Early Assessment Program (EAP) has emerged as a national model for states seeking to design policies that increase the number of students who leave high school ready for college and careers. In addition, the two national consortia designing new assessments aligned to the Common Core State Standards have recognized the EAP as a model for the…
Design and implementation of ergonomic performance measurement system at a steel plant in India.
Ray, Pradip Kumar; Tewari, V K
2012-01-01
The management of Tata Steel, India's largest private-sector steelmaker, felt the need to develop a framework for determining the levels of ergonomic performance at its different workplaces. The objectives of the study are manifold: to identify and characterize the ergonomic variables of a given worksystem with regard to work efficiency, operator safety, and working conditions, and to design a comprehensive Ergonomic Performance Indicator (EPI) for quantitative determination of the ergonomic status and maturity of a given worksystem. The IIT Kharagpur study team consisted of three faculty members, and the management of Tata Steel formed an eleven-member team to implement the EPI model. To design and develop the EPI model with the full participation and understanding of the Tata Steel personnel concerned, a three-phase action plan for the project was prepared: preparation and data collection, detailed structuring, and validation of the EPI model. Identification of ergonomic performance factors, development of an interaction matrix, design of the assessment tool, and testing and validation of the assessment tool (EPI) in varied situations are the major steps in these phases. The case study discusses the EPI model and its applications in detail.
An aircraft model for the AIAA controls design challenge
NASA Technical Reports Server (NTRS)
Brumbaugh, Randal W.
1991-01-01
A generic, state-of-the-art, high-performance aircraft model, including detailed, full-envelope, nonlinear aerodynamics and full-envelope thrust and first-order engine response data, is described. While this model was primarily developed for the AIAA Controls Design Challenge, the availability of such a model provides a common focus for research in aeronautical control theory and methodology. An implementation of this model using the FORTRAN computer language, associated routines furnished with the aircraft model, and techniques for interfacing these routines to external procedures are also described. Figures showing vehicle geometry, surfaces, and sign conventions are included.
Ng, C M
2013-10-01
The development of a population PK/PD model, an essential component for model-based drug development, is both time- and labor-intensive. Graphical-processing-unit (GPU) computing technology has been proposed and used to accelerate many scientific computations. The objective of this study was to develop a hybrid GPU-CPU implementation of the parallelized Monte Carlo parametric expectation maximization (MCPEM) estimation algorithm for population PK data analysis. A hybrid GPU-CPU implementation of the MCPEM algorithm (MCPEMGPU) and an identical algorithm designed for a single CPU (MCPEMCPU) were developed using MATLAB on a single computer equipped with a dual Xeon 6-Core E5690 CPU and an NVIDIA Tesla C2070 GPU parallel computing card containing 448 stream processors. Two different PK models with rich/sparse sampling design schemes were used to simulate population data in assessing the performance of MCPEMCPU and MCPEMGPU. Results were analyzed by comparing the parameter estimates and model computation times. A speedup factor was used to assess the relative benefit of the parallelized MCPEMGPU over MCPEMCPU in shortening model computation time. The MCPEMGPU consistently achieved shorter computation times than the MCPEMCPU and can offer more than 48-fold speedup using a single GPU card. The novel hybrid GPU-CPU implementation of the parallelized MCPEM algorithm developed in this study holds great promise as the core of the next generation of modeling software for population PK/PD analysis.
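The flavor of an MCPEM iteration can be sketched on a deliberately trivial "population" model: one Gaussian observation per subject, with a normally distributed subject-level parameter. This CPU-only toy (all numbers are illustrative; the paper's PK models, MATLAB code, and GPU kernels are far richer) shows the two halves of the algorithm: a Monte Carlo E-step that computes each subject's posterior moments by importance sampling from the current population model, and an M-step that updates the population mean and variance.

```python
import math
import random

random.seed(1)

# Toy population data: one observation per subject,
# subject parameter theta_i ~ N(mu_true, om_true^2), known residual sd sigma.
mu_true, om_true, sigma = 2.0, 0.5, 0.3
n = 100
y = [random.gauss(random.gauss(mu_true, om_true), sigma) for _ in range(n)]

mu, om = 0.0, 1.0   # current population mean / sd estimates
K = 500             # Monte Carlo samples per subject

for _ in range(20):
    post_mean, post_sq = [], []
    for yi in y:
        # E-step: importance sampling from the current population model;
        # weights are the individual data likelihoods
        cand = [random.gauss(mu, om) for _ in range(K)]
        w = [math.exp(-0.5 * ((yi - c) / sigma) ** 2) for c in cand]
        s = sum(w)
        post_mean.append(sum(wi * c for wi, c in zip(w, cand)) / s)
        post_sq.append(sum(wi * c * c for wi, c in zip(w, cand)) / s)
    # M-step: update population mean and variance from posterior moments
    mu = sum(post_mean) / n
    om = math.sqrt(max(sum(post_sq) / n - mu * mu, 1e-12))
```

The per-subject E-step loops are independent, which is exactly the structure that maps well onto the hundreds of GPU stream processors mentioned in the abstract.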
Kock, Tobias J.; Perry, Russell W.; Monzyk, Fred R.; Pope, Adam C.; Plumb, John M.
2016-12-23
Survival estimates for juvenile salmon and steelhead fry in reservoirs impounded by high-head dams are data coveted by resource managers. However, this information is difficult to obtain because these fish are too small for tagging using conventional methods such as passive integrated transponders or radio or acoustic transmitters. We developed a study design and implementation plan to conduct a pilot evaluation that would assess the performance of two models for estimating fry survival in a field setting. The first model is a staggered-release recovery model that was described by Skalski and others (2009) and Skalski (2016). The second model is a parentage-based tagging N-mixture model that was developed and described in this document. Both models are conceptually and statistically sound, but neither has been evaluated in the field. In this document we provide an overview of a proposed study for 2017 in Lookout Point Reservoir, Oregon, that will evaluate survival of Chinook salmon fry using both models. This approach will allow us to test each model, compare survival estimates, determine model performance, and better understand these study designs using field-collected data.
Marketing Internships: A Planning and Implementation Guide.
ERIC Educational Resources Information Center
Faught, Suzanne G.
This planning and implementation guide is designed to assist marketing educators and others supportive of marketing education. It begins with definitions of vocabulary of related terminology and descriptions of the four models of internships presented in the guide: full-year, rotation-type format; 1-semester, rotation-type format; full-year format…
The theory and implementation of SRT Division. Report No. 230
DOE Office of Scientific and Technical Information (OSTI.GOV)
Atkins, III, Daniel E.
1967-06-01
To a large extent, this report has been directed to the designer faced with the task of implementing digital division. The author also hopes that this report will support exploration into the development of higher-radix quotient selection models, which can select 8 quotient bits in parallel.
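The basic radix-2 scheme underlying the report can be sketched in software as follows. This is a simplified behavioral model, not a hardware design: a real implementation inspects only a few leading bits of a redundant partial remainder instead of doing full comparisons, and the higher-radix selection the report discusses is not shown. Quotient digits come from the redundant set {-1, 0, +1}, chosen by comparing the shifted partial remainder against ±1/2.

```python
def srt_divide(x, d, n=32):
    """Radix-2 SRT division sketch, for x in [0, 1) and d in [0.5, 1).

    Each iteration doubles the partial remainder and picks a quotient
    digit in {-1, 0, +1} from a coarse comparison against +/-0.5 -- the
    key SRT idea that makes digit selection cheap in hardware.
    Returns Q ~= x / (2*d) to n fractional digits (real dividers
    handle the normalizing shifts separately).
    """
    assert 0.0 <= x < 1.0 and 0.5 <= d < 1.0
    r = x / 2.0              # initial partial remainder, |r| <= d
    q = 0.0
    for j in range(1, n + 1):
        r *= 2.0
        if r >= 0.5:
            digit = 1
        elif r < -0.5:
            digit = -1
        else:
            digit = 0        # a zero digit needs no add/subtract at all
        r -= digit * d       # restore the invariant |r| <= d
        q += digit * 2.0 ** (-j)
    return q
```

After n iterations the remaining error is bounded by 2^-n, since the invariant |r| <= d is preserved by every digit choice.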
A verification procedure for MSC/NASTRAN Finite Element Models
NASA Technical Reports Server (NTRS)
Stockwell, Alan E.
1995-01-01
Finite Element Models (FEM's) are used in the design and analysis of aircraft to mathematically describe the airframe structure for such diverse tasks as flutter analysis and actively controlled landing gear design. FEM's are used to model the entire airplane as well as airframe components. The purpose of this document is to describe recommended methods for verifying the quality of the FEM's and to specify a step-by-step procedure for implementing the methods.
ERIC Educational Resources Information Center
Blank, Rolf K.
2004-01-01
The purpose of the three-year CCSSO study was to design, implement, and test the effectiveness of the Data on Enacted Curriculum (DEC) model for improving math and science instruction. The model was tested by measuring its effects with a randomly selected sample of "treatment" schools at the middle grades level as compared to a control group of…
Hybrid network defense model based on fuzzy evaluation.
Cho, Ying-Chiang; Pan, Jen-Yi
2014-01-01
With sustained and rapid developments in the field of information technology, the issue of network security has become increasingly prominent. The theme of this study is network data security, with the test subject being a classified and sensitive network laboratory that belongs to the academic network. The analysis is based on the deficiencies and potential risks of the network's existing defense technology, characteristics of cyber attacks, and network security technologies. Subsequently, a distributed network security architecture using the technology of an intrusion prevention system is designed and implemented. In this paper, first, the overall design approach is presented. This design is used as the basis to establish a network defense model, an improvement over the traditional single-technology model that addresses the latter's inadequacies. Next, a distributed network security architecture is implemented, comprising a hybrid firewall, intrusion detection, virtual honeynet projects, and connectivity and interactivity between these three components. Finally, the proposed security system is tested. A statistical analysis of the test results verifies the feasibility and reliability of the proposed architecture. The findings of this study will potentially provide new ideas and stimuli for future designs of network security architecture.
Anderson, D A; Bankston, K; Stindt, J L; Weybright, D W
2000-09-01
Today's managed care environment is forcing hospitals to seek new and innovative ways to deliver a seamless continuum of high-quality care and services to defined populations at lower costs. Many are striving to achieve this goal through the implementation of shared governance models that support point-of-service decision making, interdisciplinary partnerships, and the integration of work across clinical settings and along the service delivery continuum. The authors describe the key processes and strategies used to facilitate the design and successful implementation of an interdisciplinary shared governance model at The University Hospital, Cincinnati, Ohio. Implementation costs and initial benefits obtained over a 2-year period also are identified.
Twelve tips for implementing whole-task curricula: how to make it work.
Dolmans, Diana H J M; Wolfhagen, Ineke H A P; Van Merriënboer, Jeroen J G
2013-10-01
Whole-task models of learning and instructional design, such as problem-based learning, are very popular nowadays, yet schools regularly encounter large problems when they implement whole-task curricula. The main aim of this article is to provide 12 tips that may help make the implementation of a whole-task curriculum successful. Implementing whole-task curricula fails when the implementation is not well prepared; the requirements that must be met for the implementation of whole-task models to succeed are therefore described as twelve tips, organized in four clusters referring to (1) the infrastructure, (2) the teachers, (3) the students, and (4) the management of the educational organization. Finally, the presented framework is critically discussed, and the importance of shared values and a change of culture is emphasized.
Olaf Kuegler
2015-01-01
The Pacific Northwest Research Station's Forest Inventory and Analysis Unit began remeasurement of permanently located FIA plots under the annualized design in 2011. With remeasurement has come the need to implement the national FIA system for compiling estimates of forest growth, removals, and mortality. The national system requires regional diameter-growth models to...
ERIC Educational Resources Information Center
Lee, Kyungmee; Brett, Clare
2013-01-01
This qualitative case study is the first phase of a large-scale design-based research project to implement a theoretically derived double-layered CoP model within real-world teacher development practices. The main goal of this first iteration is to evaluate the courses and test and refine the CoP model for future implementations. This paper…
Pragmatic User Model Implementation in an Intelligent Help System.
ERIC Educational Resources Information Center
Fernandez-Manjon, Baltasar; Fernandez-Valmayor, Alfredo; Fernandez-Chamizo, Carmen
1998-01-01
Describes Aran, a knowledge-based system designed to help users deal with problems related to Unix operation. Highlights include adaptation to the individual user; user modeling knowledge; stereotypes; content of the individual user model; instantiation, acquisition, and maintenance of the individual model; dynamic acquisition of objective and…
Adaptive Automation Design and Implementation
2015-09-17
Case Study: Space Navigator. Recovered fragments of this report describe a player modeling paradigm, focusing on the response-generation portion of the player model: a real-time, clustering-based framework for modeling and imitating a specific person's task performance in a human-machine system, and the Adaptive Automation System built on it.
The potential application of the blackboard model of problem solving to multidisciplinary design
NASA Technical Reports Server (NTRS)
Rogers, James L.
1989-01-01
The potential application of the blackboard model of problem solving to multidisciplinary design is discussed. Multidisciplinary design problems are complex, poorly structured, and lack a predetermined decision path from the initial starting point to the final solution. The final solution is achieved using data from different engineering disciplines. Ideally, for the final solution to be the optimum solution, there must be a significant amount of communication among the different disciplines plus intradisciplinary and interdisciplinary optimization. In reality, this is not what happens in today's sequential approach to multidisciplinary design. Therefore it is highly unlikely that the final solution is the true optimum solution from an interdisciplinary optimization standpoint. A multilevel decomposition approach is suggested as a technique to overcome the problems associated with the sequential approach, but no tool currently exists with which to fully implement this technique. A system based on the blackboard model of problem solving appears to be an ideal tool for implementing this technique because it offers an incremental problem solving approach that requires no a priori determined reasoning path. Thus it has the potential of finding a more optimum solution for the multidisciplinary design problems found in today's aerospace industries.
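The incremental, opportunistic control cycle the abstract advocates can be sketched minimally as follows. The "disciplines" and numbers are invented placeholders, not anything from the paper: each knowledge source fires only when its inputs have appeared on the blackboard, so no reasoning path from initial state to solution is fixed in advance.

```python
# Minimal blackboard sketch: knowledge sources fire opportunistically when
# their inputs appear on the shared blackboard. The "disciplines" and the
# numbers below are invented placeholders for real disciplinary analyses.
blackboard = {"wing_area": 30.0}

def aero(bb):        # hypothetical aerodynamics knowledge source
    if "wing_area" in bb and "lift" not in bb:
        bb["lift"] = 500.0 * bb["wing_area"]
        return True
    return False

def structures(bb):  # hypothetical structures knowledge source
    if "lift" in bb and "spar_mass" not in bb:
        bb["spar_mass"] = 0.01 * bb["lift"]
        return True
    return False

knowledge_sources = [structures, aero]  # registration order is irrelevant

progress = True
while progress:      # control: re-scan until no source can contribute
    progress = any(ks(blackboard) for ks in knowledge_sources)
```

Even though `structures` is registered first, it cannot act until `aero` has posted `lift`; the control loop discovers the working order at run time, which is the property that makes the architecture attractive for multidisciplinary design.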
E-book recommender system design and implementation based on data mining
NASA Astrophysics Data System (ADS)
Wang, Zongjiang
2011-12-01
In today's era of exploding knowledge and rapidly developing information technology, how to quickly feed back useful information of interest to users is the problem this article addresses. Based on data mining, the paper combines an association-rule model with a classification model to recommend e-books: e-books that interest a target user's neighboring users are recommended to that user. The paper introduces e-book recommendation and its key technologies, the system's implementation algorithms, and the implementation process; experiments show that the system can help users quickly find the e-books they need.
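The neighboring-users idea can be sketched as follows. This is a toy with invented data, and it substitutes a plain Jaccard-similarity neighborhood for the paper's association-rule and classification models: users with overlapping reading histories are "neighbors," and their books that the target user has not read become the recommendations.

```python
from collections import defaultdict

# hypothetical reading histories (user id -> set of e-book ids)
reads = {
    "u1": {"b1", "b2", "b3"},
    "u2": {"b1", "b2", "b4"},
    "u3": {"b2", "b5"},
    "u4": {"b6"},
}

def jaccard(a, b):
    """Overlap of two reading histories, in [0, 1]."""
    return len(a & b) / len(a | b)

def recommend(target, k=2):
    # rank the other users by similarity of reading histories
    others = sorted(((jaccard(reads[target], reads[u]), u)
                     for u in reads if u != target), reverse=True)
    neighbors = [u for _, u in others[:k]]
    # score books the target has not read by how many neighbors read them
    scores = defaultdict(int)
    for u in neighbors:
        for b in reads[u] - reads[target]:
            scores[b] += 1
    return sorted(scores, key=scores.get, reverse=True)
```

For user `u1`, the nearest neighbors are `u2` and `u3`, so the unseen books `b4` and `b5` are recommended while `b6` (read only by the dissimilar `u4`) is not.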
Design of demand side response model in energy internet demonstration park
NASA Astrophysics Data System (ADS)
Zhang, Q.; Liu, D. N.
2017-08-01
The implementation of demand side response can bring many benefits to the power system, users, and society, but there are still many problems in actual operation. This paper first analyses the current situation and problems of demand side response. On this basis, it analyses the advantages of implementing demand side response in an energy Internet demonstration park. Finally, the paper designs three feasible demand side response modes for the energy Internet demonstration park.
Multi-scale Multi-mechanism Toughening of Hydrogels
NASA Astrophysics Data System (ADS)
Zhao, Xuanhe
Hydrogels are widely used as scaffolds for tissue engineering, vehicles for drug delivery, actuators for optics and fluidics, and model extracellular matrices for biological studies. The scope of hydrogel applications, however, is often severely limited by their mechanical properties. Inspired by the mechanics and hierarchical structures of tough biological tissues, we propose that a general principle for the design of tough hydrogels is to implement two mechanisms for dissipating mechanical energy and maintaining high elasticity in hydrogels. A particularly promising strategy for the design is to integrate multiple pairs of mechanisms across multiple length scales into a hydrogel. We develop a multiscale theoretical framework to quantitatively guide the design of tough hydrogels. On the network level, we have developed micro-physical models to characterize the evolution of polymer networks under deformation. On the continuum level, we have implemented constitutive laws formulated from the network-level models into a coupled cohesive-zone and Mullins-effect model to quantitatively predict crack propagation and fracture toughness of hydrogels. Guided by the design principle and quantitative model, we will demonstrate a set of new hydrogels, based on diverse types of polymers, that can achieve extremely high toughness superior to their natural counterparts such as cartilage. The work was supported by NSF (No. CMMI-1253495) and ONR (No. N00014-14-1-0528).
Lunar surface vehicle model competition
NASA Technical Reports Server (NTRS)
1990-01-01
During Fall and Winter quarters, Georgia Tech's School of Mechanical Engineering students designed machines and devices related to Lunar Base construction tasks. These include joint projects with Textile Engineering students. Topics studied included lunar environment simulator via drop tower technology, lunar rated fasteners, lunar habitat shelter, design of a lunar surface trenching machine, lunar support system, lunar worksite illumination (daytime), lunar regolith bagging system, sunlight diffusing tent for lunar worksite, service apparatus for lunar launch vehicles, lunar communication/power cables and teleoperated deployment machine, lunar regolith bag collection and emplacement device, soil stabilization mat for lunar launch/landing site, lunar rated fastening systems for robotic implementation, lunar surface cable/conduit and automated deployment system, lunar regolith bagging system, and lunar rated fasteners and fastening systems. A special topics team of five Spring quarter students designed and constructed a remotely controlled crane implement for the SKITTER model.
A programming language for composable DNA circuits
Phillips, Andrew; Cardelli, Luca
2009-01-01
Recently, a range of information-processing circuits have been implemented in DNA by using strand displacement as their main computational mechanism. Examples include digital logic circuits and catalytic signal amplification circuits that function as efficient molecular detectors. As new paradigms for DNA computation emerge, the development of corresponding languages and tools for these paradigms will help to facilitate the design of DNA circuits and their automatic compilation to nucleotide sequences. We present a programming language for designing and simulating DNA circuits in which strand displacement is the main computational mechanism. The language includes basic elements of sequence domains, toeholds and branch migration, and assumes that strands do not possess any secondary structure. The language is used to model and simulate a variety of circuits, including an entropy-driven catalytic gate, a simple gate motif for synthesizing large-scale circuits and a scheme for implementing an arbitrary system of chemical reactions. The language is a first step towards the design of modelling and simulation tools for DNA strand displacement, which complements the emergence of novel implementation strategies for DNA computing. PMID:19535415
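The kind of behavior such a language must capture can be illustrated with a toy mass-action model of a catalytic gate, one of the circuit motifs the abstract mentions. The species names, rate constant, and concentrations below are illustrative assumptions, not values from the paper; the property demonstrated is catalytic turnover: because the catalyst strand is released after each displacement, the output eventually far exceeds the catalyst concentration.

```python
# Toy mass-action model of a catalytic strand-displacement gate:
#   C + G -> C + P   (catalyst C displaces output P from gate G; C released)
# Rate constant, concentrations, and species names are illustrative
# assumptions, not parameters from the paper.
k = 1e5          # bimolecular rate constant, /M/s (assumed)
dt = 1e-3        # forward-Euler time step, s
C, G, P = 1e-7, 1e-6, 0.0     # molar concentrations
for _ in range(200_000):      # simulate 200 s
    v = k * C * G             # mass-action rate of the displacement step
    G -= v * dt               # gate is consumed...
    P += v * dt               # ...releasing output; C is unchanged
```

After 200 s the output P is several times the catalyst concentration C, the signal-amplification signature that makes such gates useful as molecular detectors. A real simulator compiled from the language would track every intermediate strand and toehold-binding step rather than this single lumped reaction.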
Modeling false positive detections in species occurrence data under different study designs.
Chambert, Thierry; Miller, David A W; Nichols, James D
2015-02-01
The occurrence of false positive detections in presence-absence data, even when they occur infrequently, can lead to severe bias when estimating species occupancy patterns. Building upon previous efforts to account for this source of observational error, we established a general framework to model false positives in occupancy studies and extended existing modeling approaches to encompass a broader range of sampling designs. Specifically, we identified three common sampling designs that are likely to cover most scenarios encountered by researchers. The different designs all included ambiguous detections, as well as some known-truth data, but their modeling differed in the level of the model hierarchy at which the known-truth information was incorporated (site level or observation level). For each model, we provide the likelihood, as well as the R and BUGS code needed for implementation. We also establish a clear terminology and provide guidance to help researchers choose the most appropriate design and modeling approach.
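For the simplest case (ambiguous detections only), the site-level likelihood can be written directly. A minimal Python sketch, with notation assumed from common occupancy-model conventions (`p11` = detection probability at occupied sites, `p10` = false-positive probability at unoccupied sites; the paper itself supplies R and BUGS code):

```python
import math

def site_likelihood(history, psi, p11, p10):
    """Marginal likelihood of one site's detection history (list of
    0/1 visits) under an occupancy model with false positives: the
    site is occupied with probability psi; detections are
    Bernoulli(p11) if occupied, Bernoulli(p10) false positives if not."""
    def bern(p):
        return math.prod(p if y else 1.0 - p for y in history)
    return psi * bern(p11) + (1.0 - psi) * bern(p10)

# Detections on visits 1 and 3 of 3:
L = site_likelihood([1, 0, 1], psi=0.6, p11=0.8, p10=0.1)
```

Setting `p10 = 0` recovers the standard single-season occupancy likelihood, which is why false positives, when present but ignored, bias the occupancy estimate.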
Rapid algorithm prototyping and implementation for power quality measurement
NASA Astrophysics Data System (ADS)
Kołek, Krzysztof; Piątek, Krzysztof
2015-12-01
This article presents a Model-Based Design (MBD) approach to rapidly implement power quality (PQ) metering algorithms. Power supply quality is a very important aspect of modern power systems and will become even more important in future smart grids. In this case, maintaining the PQ parameters at the desired level will require efficient implementation methods for the metering algorithms. Currently, the development of new, advanced PQ metering algorithms requires new hardware with adequate computational capability and time-intensive, cost-ineffective manual implementation. An alternative, considered here, is an MBD approach. The MBD approach focuses on modelling and validation of the model by simulation, which is well supported by Computer-Aided Engineering (CAE) packages. This paper presents two algorithms utilized in modern PQ meters: a phase-locked loop based on an Enhanced Phase-Locked Loop (EPLL), and flicker measurement according to the IEC 61000-4-15 standard. The algorithms were chosen because of their complexity and non-trivial development. They were first modelled in the MATLAB/Simulink package, then tested and validated in a simulation environment. The models, in the form of Simulink diagrams, were next used to automatically generate C code. The code was compiled and executed in real time on the Zynq Xilinx platform, which combines a reconfigurable Field Programmable Gate Array (FPGA) with a dual-core processor. The MBD development of PQ algorithms, automatic code generation, and compilation form a rapid algorithm prototyping and implementation path for PQ measurements. The main advantage of this approach is the ability to focus on the design, validation, and testing stages while skipping over implementation issues. The code generation process renders production-ready code that can be easily used on the target hardware.
This is especially important because standards for PQ measurement are in constant development, and PQ issues in emerging smart grids will require tools for the rapid development and implementation of such algorithms.
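An EPLL of the kind mentioned above can be prototyped in a few lines before any code generation. The forward-Euler discretisation and the gains below are illustrative choices for this sketch, not the values used by the authors:

```python
import math

def epll(u_samples, dt, w0, k_a=100.0, k_w=10000.0, k_p=100.0):
    """Forward-Euler discretisation of an Enhanced PLL: the input is
    tracked as A*sin(phi); A, frequency w, and phase phi adapt to
    drive the tracking error e to zero."""
    A, w, phi = 0.0, w0, 0.0
    for u in u_samples:
        e = u - A * math.sin(phi)
        A   += dt * k_a * e * math.sin(phi)
        w   += dt * k_w * e * math.cos(phi)
        phi += dt * (w + k_p * e * math.cos(phi))
    return A, w

dt, f = 1e-4, 50.0
samples = [math.sin(2 * math.pi * f * n * dt) for n in range(5000)]
A, w = epll(samples, dt, w0=2 * math.pi * f)
# A settles near 1 and w near 2*pi*50 rad/s for a clean 50 Hz input.
```

In the MBD flow described above, the same structure would be drawn as a Simulink diagram and C code generated from it automatically.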
Huynh, Alexis K; Lee, Martin L; Farmer, Melissa M; Rubenstein, Lisa V
2016-10-21
Stepped wedge designs have gained recognition as a method for rigorously assessing implementation of evidence-based quality improvement interventions (QIIs) across multiple healthcare sites. In theory, this design uses random assignment of sites to successive QII implementation start dates based on a timeline determined by evaluators. However, in practice, QII timing is often controlled more by site readiness. We propose an alternate version of the stepped wedge design that does not assume randomized timing of implementation, retaining the method's analytic advantages while applying to a broader set of evaluations. To test the feasibility of a nonrandomized stepped wedge design, we developed simulated data on patient care experiences and on QII implementation that had the structures and features of the expected data from a planned QII. We then applied the design in anticipation of performing an actual QII evaluation. We used simulated data on 108,000 patients to model nonrandomized stepped wedge results from QII implementation across nine primary care sites over 12 quarters. The outcome we simulated was change in a single self-administered question on access to care used by the United States Veterans Health Administration (VA) as part of its quarterly patient ratings of quality of care. Our main predictors were QII exposure and time. Based on study hypotheses, we assigned values of 4 to 11% for improvement in access when sites were first exposed to implementation and 1 to 3% improvement in each ensuing time period in which sites continued with implementation. We included site-level (practice size) and respondent-level (gender, race/ethnicity) characteristics that might account for nonrandomized timing in site implementation of the QII. We analyzed the resulting data as a repeated cross-sectional model using HLM 7 with a three-level hierarchical data structure and an ordinal outcome.
Levels in the data structure included patient ratings, timing of adoption of the QII, and primary care site. We were able to demonstrate a statistically significant improvement in adoption of the QII, as postulated in our simulation. The linear time trend while sites were in the control state was not significant, also as expected in the real life scenario of the example QII. We concluded that the nonrandomized stepped wedge design was feasible within the parameters of our planned QII with its data structure and content. Our statistical approach may be applicable to similar evaluations.
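The simulation logic can be sketched compactly. In the toy below (Python; the adoption schedule, cell sizes, and effect sizes are invented for illustration, echoing the 4-11% on-adoption and 1-3% per-period gains assumed in the study), each site adopts the QII in a different quarter and a binary "good access" rating is generated per patient:

```python
import random

def simulate_stepped_wedge(n_sites=9, n_periods=12, n_per_cell=1000,
                           base=0.50, jump=0.08, slope=0.02, seed=1):
    """Nonrandomized stepped wedge: site i adopts in period i+1,
    gains `jump` on adoption and `slope` per period thereafter."""
    rng = random.Random(seed)
    rows = []                     # (site, period, exposed, proportion)
    for site in range(n_sites):
        start = site + 1          # staggered, nonrandomized adoption
        for t in range(n_periods):
            exposed = t >= start
            p = base + (jump + slope * (t - start) if exposed else 0.0)
            p = min(p, 0.99)
            hits = sum(rng.random() < p for _ in range(n_per_cell))
            rows.append((site, t, exposed, hits / n_per_cell))
    return rows

rows = simulate_stepped_wedge()
exposed = [r[3] for r in rows if r[2]]
control = [r[3] for r in rows if not r[2]]
effect = sum(exposed) / len(exposed) - sum(control) / len(control)
```

A real analysis would fit the hierarchical ordinal model described above rather than this crude exposed-versus-control difference, which ignores the confounding of exposure with time.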
Harvey, Gill; Fitzgerald, Louise; Fielden, Sandra; McBride, Anne; Waterman, Heather; Bamford, David; Kislov, Roman; Boaden, Ruth
2011-08-23
In response to policy recommendations, nine National Institute for Health Research (NIHR) Collaborations for Leadership in Applied Health Research and Care (CLAHRCs) were established in England in 2008, aiming to create closer working between the health service and higher education and narrow the gap between research and its implementation in practice. The Greater Manchester (GM) CLAHRC is a partnership between the University of Manchester and twenty National Health Service (NHS) trusts, with a five-year mission to improve healthcare and reduce health inequalities for people with cardiovascular conditions. This paper outlines the GM CLAHRC approach to designing and evaluating a large-scale, evidence- and theory-informed, context-sensitive implementation programme. The paper makes a case for embedding evaluation within the design of the implementation strategy. Empirical, theoretical, and experiential evidence relating to implementation science and methods has been synthesised to formulate eight core principles of the GM CLAHRC implementation strategy, recognising the multi-faceted nature of evidence, the complexity of the implementation process, and the corresponding need to apply approaches that are situationally relevant, responsive, flexible, and collaborative. In turn, these core principles inform the selection of four interrelated building blocks upon which the GM CLAHRC approach to implementation is founded. These determine the organizational processes, structures, and roles utilised by specific GM CLAHRC implementation projects, as well as the approach to researching implementation, and comprise: the Promoting Action on Research Implementation in Health Services (PARIHS) framework; a modified version of the Model for Improvement; multiprofessional teams with designated roles to lead, facilitate, and support the implementation process; and embedded evaluation and learning. 
Designing and evaluating a large-scale implementation strategy that can cope with and respond to the local complexities of implementing research evidence into practice is itself complex and challenging. We present an argument for adopting an integrative, co-production approach to planning and evaluating the implementation of research into practice, drawing on an eclectic range of evidence sources.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-22
... to the research, then a quasi-experimental evaluation of the program's implementation on the... experimental or quasi-experimental research design, with the evaluation results suggesting effectiveness in... this grant competition, a research design of the highest quality means an experimental design in which...
Performance/price estimates for cortex-scale hardware: a design space exploration.
Zaveri, Mazad S; Hammerstrom, Dan
2011-04-01
In this paper, we revisit the concept of virtualization. Virtualization is useful for understanding and investigating the performance/price and other trade-offs related to the hardware design space. Moreover, it is perhaps the most important aspect of a hardware design space exploration. Such a design space exploration is a necessary part of the study of hardware architectures for large-scale computational models for intelligent computing, including AI, Bayesian, bio-inspired, and neural models. A methodical exploration is needed to identify potentially interesting regions in the design space, and to assess the relative performance/price points of these implementations. As an example, in this paper we investigate the performance/price of (digital and mixed-signal) CMOS and hypothetical CMOL (nanogrid) technology-based hardware implementations of human cortex-scale spiking neural systems. Through this analysis, and the resulting performance/price points, we demonstrate, in general, the importance of virtualization and of doing these kinds of design space explorations. The specific results suggest that a hybrid nanotechnology such as CMOL is a promising candidate for implementing very large-scale spiking neural systems, providing a more efficient utilization of the density and storage benefits of emerging nano-scale technologies. In general, we believe that the study of such hypothetical designs/architectures will guide the neuromorphic hardware community towards building large-scale systems, and help guide research trends in intelligent computing and computer engineering.
Cyber-workstation for computational neuroscience.
Digiovanna, Jack; Rattanatamrong, Prapaporn; Zhao, Ming; Mahmoudi, Babak; Hermer, Linda; Figueiredo, Renato; Principe, Jose C; Fortes, Jose; Sanchez, Justin C
2010-01-01
A Cyber-Workstation (CW) to study in vivo, real-time interactions between computational models and large-scale brain subsystems during behavioral experiments has been designed and implemented. The design philosophy seeks to directly link the in vivo neurophysiology laboratory with scalable computing resources to enable more sophisticated computational neuroscience investigation. The architecture designed here allows scientists to develop new models and integrate them with existing models (e.g. recursive least-squares regressor) by specifying appropriate connections in a block-diagram. Then, adaptive middleware transparently implements these user specifications using the full power of remote grid-computing hardware. In effect, the middleware deploys an on-demand and flexible neuroscience research test-bed to provide the neurophysiology laboratory extensive computational power from an outside source. The CW consolidates distributed software and hardware resources to support time-critical and/or resource-demanding computing during data collection from behaving animals. This power and flexibility is important as experimental and theoretical neuroscience evolves based on insights gained from data-intensive experiments, new technologies and engineering methodologies. This paper describes briefly the computational infrastructure and its most relevant components. Each component is discussed within a systematic process of setting up an in vivo, neuroscience experiment. Furthermore, a co-adaptive brain machine interface is implemented on the CW to illustrate how this integrated computational and experimental platform can be used to study systems neurophysiology and learning in a behavior task. We believe this implementation is also the first remote execution and adaptation of a brain-machine interface.
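One of the stock model blocks named above, a recursive least-squares regressor, is small enough to sketch. This is generic textbook RLS with exponential forgetting, not code from the Cyber-Workstation itself:

```python
import random

class RLS:
    """Textbook recursive least-squares with exponential forgetting."""
    def __init__(self, n, lam=0.99, delta=100.0):
        self.lam = lam
        self.w = [0.0] * n                                   # weights
        self.P = [[delta * (i == j) for j in range(n)] for i in range(n)]

    def update(self, x, d):
        """One step: predict target d from features x, then adapt."""
        n = len(x)
        Px = [sum(self.P[i][j] * x[j] for j in range(n)) for i in range(n)]
        denom = self.lam + sum(x[i] * Px[i] for i in range(n))
        k = [v / denom for v in Px]                          # gain vector
        err = d - sum(wi * xi for wi, xi in zip(self.w, x))  # a priori error
        self.w = [wi + ki * err for wi, ki in zip(self.w, k)]
        self.P = [[(self.P[i][j] - k[i] * Px[j]) / self.lam
                   for j in range(n)] for i in range(n)]
        return err

# Identify d = 2*x0 - x1 from noiseless samples:
rng = random.Random(0)
rls = RLS(2)
for _ in range(300):
    x = [rng.uniform(-1, 1), rng.uniform(-1, 1)]
    rls.update(x, 2.0 * x[0] - 1.0 * x[1])
```

In the CW architecture, a block like this would be wired into the experiment's block diagram and executed remotely by the middleware during data collection.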
Reconfigurable Hardware for Compressing Hyperspectral Image Data
NASA Technical Reports Server (NTRS)
Aranki, Nazeeh; Namkung, Jeffrey; Villapando, Carlos; Kiely, Aaron; Klimesh, Matthew; Xie, Hua
2010-01-01
High-speed, low-power, reconfigurable electronic hardware has been developed to implement ICER-3D, an algorithm for compressing hyperspectral-image data. The algorithm and parts thereof have been the topics of several NASA Tech Briefs articles, including Context Modeler for Wavelet Compression of Hyperspectral Images (NPO-43239) and ICER-3D Hyperspectral Image Compression Software (NPO-43238), which appear elsewhere in this issue of NASA Tech Briefs. As described in more detail in those articles, the algorithm includes three main subalgorithms: one for computing wavelet transforms, one for context modeling, and one for entropy encoding. For the purpose of designing the hardware, these subalgorithms are treated as modules to be implemented efficiently in field-programmable gate arrays (FPGAs). The design takes advantage of industry-standard, commercially available FPGAs. The implementation targets the Xilinx Virtex-II Pro architecture, which has embedded PowerPC processor cores with a flexible on-chip bus architecture. It incorporates an efficient parallel and pipelined architecture to compress the three-dimensional image data. The design provides for internal buffering to minimize intensive input/output operations while making efficient use of off-chip memory. The design is scalable in that the subalgorithms are implemented as independent hardware modules that can be combined in parallel to increase throughput. The on-chip processor manages the overall operation of the compression system, including execution of the top-level control functions as well as scheduling, initiating, and monitoring processes. The design prototype has been demonstrated to be capable of compressing hyperspectral data at a rate of 4.5 megasamples per second at a conservative clock frequency of 50 MHz, with a potential for substantially greater throughput at a higher clock frequency. The power consumption of the prototype is less than 6.5 W.
The reconfigurability (by reprogramming) of the FPGAs makes it possible to alter the design to satisfy different requirements without adding hardware. The implementation could easily be migrated to future FPGA generations and/or to custom application-specific integrated circuits.
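Of the three subalgorithms, the wavelet-transform module is the easiest to illustrate in software. The sketch below performs one 2-D decomposition level using a Haar filter for clarity; ICER-3D itself uses different (integer) wavelet filters, so this is a stand-in for the kind of module the FPGA pipeline implements:

```python
# One 2-D wavelet decomposition level, Haar filter as a stand-in.
# Rows are transformed first, then columns; the top-left quadrant of
# the result is the low-pass (LL) subband.

def haar_1d(row):
    avg = [(row[2 * i] + row[2 * i + 1]) / 2 for i in range(len(row) // 2)]
    diff = [(row[2 * i] - row[2 * i + 1]) / 2 for i in range(len(row) // 2)]
    return avg + diff

def haar_2d_level(img):
    rows = [haar_1d(r) for r in img]              # horizontal pass
    cols = [haar_1d(list(c)) for c in zip(*rows)] # vertical pass
    return [list(r) for r in zip(*cols)]          # back to row-major
```

The separable row/column structure is what makes the transform amenable to the parallel, pipelined hardware modules described above.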
An evaluation of the directed flow graph methodology
NASA Technical Reports Server (NTRS)
Snyder, W. E.; Rajala, S. A.
1984-01-01
The applicability of the Directed Graph Methodology (DGM) to the design and analysis of special-purpose image and signal processing hardware was evaluated. A special-purpose image processing system was designed and described using DGM. The design, suitable for very large scale integration (VLSI), implements a region labeling technique. Two computer chips were designed, both using metal-nitride-oxide-silicon (MNOS) technology, as well as a functional system utilizing those chips to perform real-time region labeling. The system is described in terms of DGM primitives. As it is currently implemented, DGM is inappropriate for describing synchronous, tightly coupled, special-purpose systems. The nature of the DGM formalism lends itself more readily to modeling networks of general-purpose processors.
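The region-labeling operation the chips implement can be sketched in software as the classic two-pass connected-components algorithm (4-connectivity here; the hardware pipelines this, but the outline of the logic is the same):

```python
# Two-pass connected-component ("region") labeling of a binary image
# with union-find to resolve label equivalences.

def label_regions(img):
    h, w = len(img), len(img[0])
    labels = [[0] * w for _ in range(h)]
    parent = {}

    def find(a):
        while parent[a] != a:
            a = parent[a]
        return a

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[max(ra, rb)] = min(ra, rb)

    nxt = 1
    for i in range(h):                    # pass 1: provisional labels
        for j in range(w):
            if not img[i][j]:
                continue
            up = labels[i - 1][j] if i > 0 else 0
            left = labels[i][j - 1] if j > 0 else 0
            if up or left:
                labels[i][j] = up or left
                if up and left:
                    union(up, left)
                    labels[i][j] = min(find(up), find(left))
            else:
                parent[nxt] = nxt         # new region
                labels[i][j] = nxt
                nxt += 1
    for i in range(h):                    # pass 2: resolve equivalences
        for j in range(w):
            if labels[i][j]:
                labels[i][j] = find(labels[i][j])
    return labels
```

The raster-order neighbor accesses in pass 1 are what make the algorithm attractive for a synchronous VLSI implementation.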
RF model of the distribution system as a communication channel, phase 2. Volume 3: Appendices
NASA Technical Reports Server (NTRS)
Rustay, R. C.; Gajjar, J. T.; Rankin, R. W.; Wentz, R. C.; Wooding, R.
1982-01-01
Program documentation concerning the design, implementation, and verification of a computerized model for predicting the steady-state sinusoidal response of radial configured distribution feeders is presented in these appendices.
Modelling brain emergent behaviours through coevolution of neural agents.
Maniadakis, Michail; Trahanias, Panos
2006-06-01
Recently, many research efforts have focused on modelling partial brain areas, with the long-term goal of supporting cognitive abilities of artificial organisms. Existing models usually suffer from heterogeneity, which makes their integration very difficult. The present work introduces a computational framework to address brain modelling tasks, emphasizing the integrative performance of substructures. Moreover, implemented models are embedded in a robotic platform to support its behavioural capabilities. We follow an agent-based approach in the design of substructures to support the autonomy of partial brain structures. Agents are formulated to allow the emergence of a desired behaviour after a certain amount of interaction with the environment. An appropriate collaborative coevolutionary algorithm, able to emphasize both the speciality of brain areas and their cooperative performance, is employed to support the design specification of agent structures. The effectiveness of the proposed approach is illustrated through the implementation of computational models for the motor cortex and hippocampus, which are successfully tested on a simulated mobile robot.
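The collaborative-coevolution loop can be sketched abstractly: each individual is evaluated in partnership with the best current collaborator from the other population, so both speciality and cooperation are rewarded. The toy below (Python; the scalar "agents", fitness function, operators, and parameters are invented and far simpler than evolving neural agents on a robot) shows only the loop structure:

```python
import random

def coevolve(fitness, gens=300, pop_size=20, sigma=0.5, seed=0):
    """Two populations of scalar 'agents'; each individual is scored
    in partnership with the best member of the other population."""
    rng = random.Random(seed)
    pops = [[rng.uniform(-10, 10) for _ in range(pop_size)] for _ in range(2)]
    best = [pops[0][0], pops[1][0]]
    for _ in range(gens):
        for s in range(2):
            pair = (lambda i: (i, best[1])) if s == 0 else (lambda i: (best[0], i))
            scored = sorted(pops[s], key=lambda i: fitness(*pair(i)), reverse=True)
            best[s] = scored[0]
            elite = scored[:pop_size // 2]       # truncation selection
            pops[s] = elite + [e + rng.gauss(0, sigma) for e in elite]
    return best

# Cooperative task: the two outputs must sum to 10 while staying
# equal; the joint optimum is (5, 5).
x, y = coevolve(lambda a, b: -(a + b - 10) ** 2 - (a - b) ** 2)
```

Neither population can score well alone; fitness is defined only for the pair, which is the property the paper exploits to co-design interacting brain-area models.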
DOE Office of Scientific and Technical Information (OSTI.GOV)
March-Leuba, S.; Jansen, J.F.; Kress, R.L.
A new program package, Symbolic Manipulator Laboratory (SML), for the automatic generation of both kinematic and static manipulator models in symbolic form is presented. Critical design parameters may be identified and optimized using symbolic models, as shown in the sample application presented for the Future Armor Rearm System (FARS) arm. The computer-aided development of the symbolic models yields equations with reduced numerical complexity. Important consideration has been placed on the simplification of closed-form solutions and on user-friendly operation. The main emphasis of this research is the development of a methodology, implemented in a computer program, capable of generating symbolic kinematic and static force models of manipulators. The fact that the models are obtained in trigonometrically reduced form is among the most significant results of this work and the most difficult to implement. Mathematica, a commercial program that allows symbolic manipulation, is used to implement the program package. SML is written such that the user can change any of the subroutines or create new ones easily. To assist the user, an on-line help system has been written to make SML a user-friendly package. Some sample applications are presented. The design and optimization of the 5-degrees-of-freedom (DOF) FARS manipulator using SML is discussed. Finally, the kinematic and static models of two different 7-DOF manipulators are calculated symbolically.
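The kind of trigonometrically reduced closed-form model SML emits can be illustrated on a planar 2-link arm. This is a numeric check of such equations in plain Python; the geometry is generic, not the FARS or 7-DOF arms, and SML itself works symbolically in Mathematica:

```python
import math

def fk(q1, q2, l1, l2):
    """Forward kinematics: end-effector position of a planar 2-link arm."""
    x = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
    y = l1 * math.sin(q1) + l2 * math.sin(q1 + q2)
    return x, y

def static_torques(q1, q2, l1, l2, fx, fy):
    """Static model tau = J^T F, with the Jacobian entries written in
    trigonometrically reduced closed form."""
    j11 = -l1 * math.sin(q1) - l2 * math.sin(q1 + q2)
    j12 = -l2 * math.sin(q1 + q2)
    j21 =  l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
    j22 =  l2 * math.cos(q1 + q2)
    return j11 * fx + j21 * fy, j12 * fx + j22 * fy
```

Generating and simplifying exactly these closed-form expressions automatically, for arms with many more joints, is what the package automates.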
NASA Technical Reports Server (NTRS)
Thomas, Stan J.
1993-01-01
KATE (Knowledge-based Autonomous Test Engineer) is a model-based software system developed in the Artificial Intelligence Laboratory at the Kennedy Space Center for monitoring, fault detection, and control of launch vehicles and ground support systems. In order to bring KATE to the level of performance, functionality, and integratability needed for firing room applications, efforts are underway to implement KATE in the C++ programming language using an X-windows interface. Two programs that were designed and added to the collection of tools comprising the KATE toolbox are described. The first tool, called the schematic viewer, gives the KATE user the capability to view digitized schematic drawings in the KATE environment. The second tool, called the model editor, gives the KATE model builder a tool for creating and editing knowledge base files. Design and implementation issues concerning these two tools are discussed. The discussion will be useful to anyone maintaining or extending either the schematic viewer or the model editor.
NASA Technical Reports Server (NTRS)
Dominick, Wayne D. (Editor); Farooq, Mohammad U.
1986-01-01
The definition of proposed research addressing the development and validation of a methodology for the design and evaluation of user interfaces for interactive information systems is given. The major objectives of this research are: the development of a comprehensive, objective, and generalizable methodology for the design and evaluation of user interfaces for information systems; the development of equations and/or analytical models to characterize user behavior and the performance of a designed interface; the design of a prototype system for the development and administration of user interfaces; and the design and use of controlled experiments to support the research and test/validate the proposed methodology. The proposed design methodology views the user interface as a virtual machine composed of three layers: an interactive layer, a dialogue manager layer, and an application interface layer. A command language model of user system interactions is presented because of its inherent simplicity and structured approach based on interaction events. All interaction events have a common structure based on common generic elements necessary for a successful dialogue. It is shown that, using this model, various types of interfaces could be designed and implemented to accommodate various categories of users. The implementation methodology is discussed in terms of how to store and organize the information.
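The three-layer view can be made concrete with a toy: interaction events sharing a common structure (prompt, parse/validate, application action), driven by a dialogue manager that separates the interactive layer from the application interface. Everything here, field names included, is an illustrative invention consistent with the model described, not the prototype system itself:

```python
# Sketch of the interface-as-virtual-machine idea: the interactive
# layer supplies raw input, the dialogue manager steps through
# interaction events, and each event dispatches a validated command
# to the application interface layer.

class InteractionEvent:
    def __init__(self, prompt, parser, action):
        self.prompt, self.parser, self.action = prompt, parser, action

def dialogue_manager(events, inputs):
    """Drive a scripted session: each event parses one raw user
    input and dispatches the validated value to the application."""
    transcript = []
    for event, raw in zip(events, inputs):
        value = event.parser(raw)               # interactive-layer parse
        transcript.append(event.action(value))  # application-layer call
    return transcript

events = [
    InteractionEvent("query> ", str.strip, lambda q: f"searched:{q}"),
    InteractionEvent("limit> ", int, lambda n: f"limited:{n}"),
]
out = dialogue_manager(events, [" cats ", "10"])
```

Because every event has the same generic shape, different interface styles for different user categories amount to swapping prompts and parsers while the application layer is unchanged, which is the point of the layered model.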
Hamilton, Alison B; Mittman, Brian S; Williams, John K; Liu, Honghu H; Eccles, Alicia M; Hutchinson, Craig S; Wyatt, Gail E
2014-06-20
The HIV/AIDS epidemic continues to disproportionately affect African American communities in the US, particularly those located in urban areas. Despite the fact that HIV is often transmitted from one sexual partner to another, most HIV prevention interventions have focused only on individuals, rather than couples. This five-year study investigates community-based implementation, effectiveness, and sustainability of 'Eban II,' an evidence-based risk reduction intervention for African-American heterosexual, serodiscordant couples. This hybrid implementation/effectiveness study is guided by organizational change theory as conceptualized in the Texas Christian University Program Change Model (PCM), a model of phased organizational change from exposure to adoption, implementation, and sustainability. The primary implementation aims are to assist 10 community-based organizations (CBOs) to implement and sustain Eban II; specifically, to partner with CBOs to expose providers to the intervention; to facilitate its adoption, implementation, and sustainment; and to evaluate processes and determinants of implementation, effectiveness, fidelity, and sustainment. The primary effectiveness aim is to evaluate the effect of Eban II on participant (n = 200 couples) outcomes, specifically incidents of protected sex and proportion of condom use. We will also determine the cost-effectiveness of implementation, as measured by implementation costs and potential cost savings. A mixed methods evaluation will examine implementation at the agency level; staff members from the CBOs will complete baseline measures of organizational context and climate, while key stakeholders will be interviewed periodically throughout implementation. Effectiveness of Eban II will be assessed using a randomized delayed enrollment (waitlist) control design to evaluate the impact of treatment on outcomes at posttest and three-month follow-up.
Multi-level hierarchical modeling with a nested structure will be used to evaluate the effects of agency- and couples-level characteristics on couples-level outcomes (e.g., condom use). This study will produce important information regarding the value of the Eban II program, along with a theory-guided implementation process and tools designed for use in implementing Eban II and other evidence-based programs in demographically diverse, resource-constrained treatment settings. NCT00644163.
Impact of science objectives and requirements on probe mission and system design
NASA Technical Reports Server (NTRS)
Ledbetter, K. W.
1974-01-01
Problem areas in probe science technology are discussed that require a solution before probe systems can actually be designed. Considered are the effects of model atmospheres on probe design; the effects of implementing the requirements to locate and measure the clouds; and trade-offs between descent sampling and measurement criteria as they affect probe system design.
NASA Technical Reports Server (NTRS)
Gettman, Chang-Ching LO
1993-01-01
This thesis develops and demonstrates an approach to nonlinear control system design using linearization by state feedback. The design provides improved transient response behavior, allowing faster maneuvering of payloads by the Shuttle Remote Manipulator System (SRMS). Modeling uncertainty is accounted for by using a second feedback loop designed around the feedback-linearized dynamics. A classical feedback loop is developed to provide the easy implementation required for the relatively small on-board computers. Feedback linearization also allows the use of higher-bandwidth model-based compensation in the outer loop, since it helps maintain stability in the presence of the nonlinearities typically neglected in model-based designs.
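The two-loop idea can be sketched on a toy pendulum in place of the SRMS dynamics: the inner loop cancels the nonlinearity exactly, and a classical outer PD loop is then designed for the resulting linear double integrator. All parameters and gains here are illustrative:

```python
import math

# Pendulum: theta'' = -(g/l) sin(theta) + u / (m l^2)
g, l, m = 9.81, 1.0, 1.0
kp, kd = 4.0, 4.0                     # outer-loop PD gains (poles at -2)

def control(theta, dtheta):
    v = -kp * theta - kd * dtheta                        # linear outer loop
    return m * l**2 * (v + (g / l) * math.sin(theta))    # cancel nonlinearity

theta, dtheta, dt = 1.0, 0.0, 1e-3
for _ in range(10000):                # 10 s of explicit-Euler simulation
    u = control(theta, dtheta)
    ddtheta = -(g / l) * math.sin(theta) + u / (m * l**2)
    theta += dt * dtheta
    dtheta += dt * ddtheta
# The closed loop behaves as theta'' = -4*theta - 4*theta', so the
# large initial angle decays without the usual small-angle restriction.
```

The outer loop sees only the linearized double integrator, which is why a simple classical design, and higher-bandwidth model-based compensation, become feasible.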
A Comparison of Multivariable Control Design Techniques for a Turbofan Engine Control
NASA Technical Reports Server (NTRS)
Garg, Sanjay; Watts, Stephen R.
1995-01-01
This paper compares two previously published design procedures for two different multivariable control design techniques, applied to a linear model of a jet engine. The two techniques compared were Linear Quadratic Gaussian with Loop Transfer Recovery (LQG/LTR) and H-Infinity synthesis. The two control design techniques were used with specific previously published design procedures to synthesize controls which would provide equivalent closed-loop frequency response for the primary control loops while assuring adequate loop decoupling. The resulting controllers were then reduced in order to minimize the programming and data storage requirements for a typical implementation. The reduced-order linear controllers designed by each method were combined with the linear model of an advanced turbofan engine, and the system performance was evaluated for the continuous linear system. Included in the performance analysis are the resulting frequency and transient responses as well as actuator usage and rate capability for each design method. The controls were also analyzed for robustness with respect to structured uncertainties in the unmodeled system dynamics. The two controls were then compared for performance capability and hardware implementation issues.
Reuseable Objects Software Environment (ROSE): Introduction to Air Force Software Reuse Workshop
NASA Technical Reports Server (NTRS)
Cottrell, William L.
1994-01-01
The Reusable Objects Software Environment (ROSE) is a common, consistent, consolidated implementation of software functionality using modern object-oriented software engineering, including designed-in reuse and adaptable requirements. ROSE is designed to minimize abstraction and reduce complexity. A planning model for the reverse engineering of selected objects through object-oriented analysis is depicted. Dynamic and functional modeling are used to develop a system design, the object design, the language, and a database management system. The return on investment for a ROSE pilot program and timelines are charted.
Tan, Erwin J.
2014-01-01
Purpose: Experience Corps Baltimore City (EC) is a product of a partnership between the Greater Homewood Community Corporation (GHCC) and the Johns Hopkins Center on Aging and Health (COAH) that began in 1998. EC recruits volunteers aged 55 and older into high-impact mentoring and tutoring roles in public elementary schools that are designed to also benefit the volunteers. We describe the evolution of the GHCC–COAH partnership through the “Courtship Model.” Design and Methods: We describe how community-based participatory research principles, such as shared governance, were applied at the following stages: (1) partner selection, (2) getting serious, (3) commitment, and (4) leaving a legacy. Results: EC could not have achieved its current level of success without academic–community partnership. In early stages of the “Courtship Model,” GHCC and COAH were able to rely on the trust developed between the leadership of the partner organizations. Competing missions from different community and academic funders led to tension in later stages of the “Courtship Model” and necessitated a formal Memorandum of Understanding between the partners as they embarked on a randomized controlled trial. Implications: The GHCC–COAH partnership demonstrates how academic–community partnerships can serve as an engine for social innovation. The partnership could serve as a model for other communities seeking multiple funding sources to implement similar public health interventions that are based on national service models. Unified funding mechanisms would assist the formation of academic–community partnerships that could support the design, implementation, and evaluation of community-based public health interventions. PMID:23887931
Model and controller reduction of large-scale structures based on projection methods
NASA Astrophysics Data System (ADS)
Gildin, Eduardo
The design of low-order controllers for high-order plants is a challenging problem theoretically as well as from a computational point of view. Frequently, robust controller design techniques result in high-order controllers. It is then interesting to achieve reduced-order models and controllers while maintaining robustness properties. Controllers designed for large structures, based on models obtained by finite element techniques, yield large state-space dimensions. In this case, problems related to storage, accuracy and computational speed may arise. Thus, model reduction methods capable of addressing controller reduction problems are of primary importance to allow the practical applicability of advanced controller design methods for high-order systems. A challenging large-scale control problem that has emerged recently is the protection of civil structures, such as high-rise buildings and long-span bridges, from dynamic loadings such as earthquakes, high wind, heavy traffic, and deliberate attacks. Even though significant effort has been spent in the application of control theory to the design of civil structures in order to increase their safety and reliability, several challenging issues remain open problems for real-time implementation. This dissertation addresses the development of methodologies for controller reduction for real-time implementation in seismic protection of civil structures using projection methods. Three classes of schemes are analyzed for model and controller reduction: modal truncation, singular value decomposition methods and Krylov-based methods. A family of benchmark problems for structural control is used as a framework for a comparative study of model and controller reduction techniques.
It is shown that classical model and controller reduction techniques, such as balanced truncation, modal truncation and moment matching by Krylov techniques, yield reduced-order controllers that do not guarantee stability of the closed-loop system, that is, the reduced-order controller implemented with the full-order plant. A controller reduction approach that guarantees closed-loop stability is proposed. It is based on the concept of dissipativity (or positivity) of linear dynamical systems. Utilizing passivity-preserving model reduction together with dissipative-LQG controllers, effective low-order optimal controllers are obtained. Results are shown through simulations.
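The balanced truncation mentioned above can be sketched in a few lines via the square-root (Laub) algorithm: solve the two Lyapunov equations for the Gramians, take an SVD of the cross product of their Cholesky factors, and project. The system matrices and reduction order below are illustrative only, and the dense Kronecker-based Lyapunov solve is suitable only for small models, not the large finite-element systems the abstract concerns:

```python
import numpy as np

def lyap(A, Q):
    """Solve A*P + P*A.T + Q = 0 via Kronecker products (fine for small n)."""
    n = A.shape[0]
    M = np.kron(np.eye(n), A) + np.kron(A, np.eye(n))
    return np.linalg.solve(M, -Q.reshape(-1)).reshape(n, n)

def balanced_truncation(A, B, C, r):
    """Reduce a stable LTI system (A, B, C) to order r by balanced truncation."""
    P = lyap(A, B @ B.T)          # controllability Gramian
    Q = lyap(A.T, C.T @ C)        # observability Gramian
    Lp = np.linalg.cholesky(P)    # P = Lp @ Lp.T
    Lq = np.linalg.cholesky(Q)    # Q = Lq @ Lq.T
    U, s, Vt = np.linalg.svd(Lq.T @ Lp)   # s: Hankel singular values
    Sr = np.diag(s[:r] ** -0.5)
    T = Lp @ Vt[:r].T @ Sr        # right projection matrix
    L = Sr @ U[:, :r].T @ Lq.T    # left projection matrix (L @ T = I_r)
    return L @ A @ T, L @ B, C @ T, s
```

Truncating at a gap in the Hankel singular values keeps the states that are jointly most controllable and observable; as the abstract notes, closed-loop stability with the reduced *controller* is a separate question.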
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chao, Tian-Jy; Kim, Younghun
An end-to-end interoperability and workflows from building architecture design to one or more simulations, in one aspect, may comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.
Living Design Memory: Framework, Implementation, Lessons Learned.
ERIC Educational Resources Information Center
Terveen, Loren G.; And Others
1995-01-01
Discusses large-scale software development and describes the development of the Designer Assistant to improve software development effectiveness. Highlights include the knowledge management problem; related work, including artificial intelligence and expert systems, software process modeling research, and other approaches to organizational memory;…
Xyce parallel electronic simulator users guide, version 6.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keiter, Eric R; Mei, Ting; Russo, Thomas V.
This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: Capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors). This includes support for most popular parallel and serial computers. A differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms. This allows one to develop new types of analysis without requiring the implementation of analysis-specific device models. Device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only). Object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase (a message-passing parallel implementation), which allows it to run efficiently on a wide range of computing platforms. These include serial, shared-memory and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.
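The DAE formulation the manual describes can be illustrated, far from Xyce's actual internals, with the smallest possible case: an implicit (backward-Euler) integration of the residual equation for a series RC circuit driven by a voltage step. The component values, step size, and Newton iteration count below are arbitrary choices for this sketch; the point is that the device "stamp" (the residual) is separate from the time integrator:

```python
import numpy as np

R, Cap, Vsrc = 1e3, 1e-6, 5.0   # 1 kOhm, 1 uF, 5 V step (illustrative values)

def residual(v, v_prev, dt):
    # KCL at the capacitor node: (Vsrc - v)/R - C * dv/dt = 0,
    # with dv/dt discretized by backward Euler.
    return (Vsrc - v) / R - Cap * (v - v_prev) / dt

def backward_euler(v0=0.0, dt=1e-5, steps=500):
    v, out = v0, [v0]
    for _ in range(steps):
        x = v
        for _ in range(20):              # Newton iteration on the residual
            f = residual(x, v, dt)
            dfdx = -1.0 / R - Cap / dt   # analytic Jacobian (scalar case)
            x -= f / dfdx
        v = x
        out.append(v)
    return np.array(out)

v = backward_euler()   # capacitor voltage charging toward Vsrc
```

For this linear circuit Newton converges in one step; the same solve structure carries over unchanged when nonlinear device residuals are plugged in, which is the isolation the DAE formulation buys.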
Xyce parallel electronic simulator users' guide, Version 6.0.1.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keiter, Eric R; Mei, Ting; Russo, Thomas V.
This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: Capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors). This includes support for most popular parallel and serial computers. A differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms. This allows one to develop new types of analysis without requiring the implementation of analysis-specific device models. Device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only). Object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase (a message-passing parallel implementation), which allows it to run efficiently on a wide range of computing platforms. These include serial, shared-memory and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.
Xyce parallel electronic simulator users guide, version 6.0.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keiter, Eric R; Mei, Ting; Russo, Thomas V.
This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: Capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors). This includes support for most popular parallel and serial computers. A differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms. This allows one to develop new types of analysis without requiring the implementation of analysis-specific device models. Device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only). Object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase (a message-passing parallel implementation), which allows it to run efficiently on a wide range of computing platforms. These include serial, shared-memory and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.
The Use of Wireless Laptop Computers for Computer-Assisted Learning in Pharmacokinetics
Munar, Myrna Y.; Singh, Harleen; Belle, Donna; Brackett, Carolyn C.; Earle, Sandra B.
2006-01-01
Objective To implement computer-assisted learning workshops into pharmacokinetics courses in a doctor of pharmacy (PharmD) program. Design Workshops were designed for students to utilize computer software programs on laptop computers to build pharmacokinetic models to predict drug concentrations resulting from various dosage regimens. In addition, students were able to visualize through graphing programs how altering different parameters changed drug concentration-time curves. Surveys were conducted to measure students’ attitudes toward computer technology before and after implementation. Finally, traditional examinations were used to evaluate student learning. Assessment Doctor of pharmacy students responded favorably to the use of wireless laptop computers in problem-based pharmacokinetic workshops. Eighty-eight percent (n = 61/69) and 82% (n = 55/67) of PharmD students completed surveys before and after computer implementation, respectively. Prior to implementation, 95% of students agreed that computers would enhance learning in pharmacokinetics. After implementation, 98% of students strongly agreed (p < 0.05) that computers enhanced learning. Examination results were significantly higher after computer implementation (89% with computers vs. 84% without computers; p = 0.01). Conclusion Implementation of wireless laptop computers in a pharmacokinetic course enabled students to construct their own pharmacokinetic models that could respond to changing parameters. Students had greater comprehension and were better able to interpret results and provide appropriate recommendations. Computer-assisted pharmacokinetic techniques can be powerful tools when making decisions about drug therapy. PMID:17136147
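A workshop exercise of the kind described might have students build a superposition model for repeated dosing and watch the concentration-time curve respond to parameter changes. The one-compartment IV-bolus model below is a standard textbook sketch; the dose, volume of distribution, clearance, and dosing interval in the usage example are illustrative values, not taken from the course:

```python
import numpy as np

def conc_iv_bolus(t, dose, V, CL, tau, n_doses):
    """Plasma concentration after repeated IV bolus doses (one-compartment model).

    Superposition: C(t) = sum over given doses of (dose/V) * exp(-k*(t - i*tau)),
    counting only doses already administered at time t.
    """
    k = CL / V                               # elimination rate constant
    c = np.zeros_like(t, dtype=float)
    for i in range(n_doses):
        dt = t - i * tau
        # clip avoids overflow in exp for not-yet-given doses; where() masks them out
        c += np.where(dt >= 0, (dose / V) * np.exp(-k * np.clip(dt, 0, None)), 0.0)
    return c

# Example regimen: 100 mg IV every 12 h, V = 50 L, CL = 5 L/h
t = np.linspace(0, 48, 481)
c = conc_iv_bolus(t, dose=100.0, V=50.0, CL=5.0, tau=12.0, n_doses=4)
```

Changing CL or tau and replotting makes accumulation and steady-state trough behavior immediately visible, which is the kind of interactive exploration the workshops were built around.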
Implementation of an inter-agency transition model for youth with spina bifida.
Lindsay, S; Cruickshank, H; McPherson, A C; Maxwell, J
2016-03-01
To address gaps in transfer of care and transition support, a paediatric hospital and adult community health care centre partnered to implement an inter-agency transition model for youth with spina bifida. Our objective was to understand the enablers and challenges experienced in the implementation of the model. Using a descriptive, qualitative design, we conducted semi-structured interviews, in-person or over the phone, with 12 clinicians and nine key informants involved in implementing the spina bifida transition model. We recruited all 21 participants from an urban area of Ontario, Canada. Clinicians and key informants experienced several enablers and challenges in implementing the spina bifida transition model. Enablers included dedicated leadership, advocacy, funding, inter-agency partnerships, cross-appointed staff and gaps in co-ordinated care to connect youth to adult services. Challenges included gaps in the availability of adult specialty services, limited geographical catchment of adult services, limited engagement of front-line staff, gaps in communication and role clarity. Although the transition model has realized some initial successes, there are still many challenges to overcome in transferring youth with spina bifida to adult health care and transitioning to adulthood. © 2015 John Wiley & Sons Ltd.
Implementation science approaches for integrating eHealth research into practice and policy.
Glasgow, Russell E; Phillips, Siobhan M; Sanchez, Michael A
2014-07-01
To summarize key issues in the eHealth field from an implementation science perspective and to highlight illustrative processes, examples and key directions to help more rapidly integrate research, policy and practice. We present background on implementation science models and emerging principles; discuss implications for eHealth research; provide examples of practical designs, measures and exemplar studies that address key implementation science issues; and make recommendations for ways to more rapidly develop and test eHealth interventions as well as future research, policy and practice. The pace of eHealth research has generally not kept up with technological advances, and many of our designs, methods and funding mechanisms are incapable of providing the types of rapid and relevant information needed. Although there has been substantial eHealth research conducted with positive short-term results, several key implementation and dissemination issues such as representativeness, cost, unintended consequences, impact on health inequities, and sustainability have not been addressed or reported. Examples of studies in several of these areas are summarized to demonstrate this is possible. eHealth research that is intended to translate into policy and practice should be more contextual, report more on setting factors, employ more responsive and pragmatic designs and report results more transparently on issues important to potential adopting patients, clinicians and organizational decision makers. We outline an alternative development and assessment model, summarize implementation science findings that can help focus attention, and call for different types of more rapid and relevant research and funding mechanisms. Published by Elsevier Ireland Ltd.
ERIC Educational Resources Information Center
Farina, William J., Jr.; Bodzin, Alec M.
2018-01-01
Web-based learning is a growing field in education, yet empirical research into the design of high quality Web-based university science instruction is scarce. A one-week asynchronous online module on the Bohr Model of the atom was developed and implemented guided by the knowledge integration framework. The unit design aligned with three identified…
ERIC Educational Resources Information Center
McNaughton, Stuart; Lai, Mei Kuin
2009-01-01
A model of school change has been designed and implemented in a systematic replication series. Key principles are: that teachers need to be able to act as adaptive experts; that local evidence about teaching and learning is necessary to inform instructional design; that school professional learning communities are vehicles for changing teaching…
Tan, Erwin J; McGill, Sylvia; Tanner, Elizabeth K; Carlson, Michelle C; Rebok, George W; Seeman, Teresa E; Fried, Linda P
2014-04-01
Experience Corps Baltimore City (EC) is a product of a partnership between the Greater Homewood Community Corporation (GHCC) and the Johns Hopkins Center on Aging and Health (COAH) that began in 1998. EC recruits volunteers aged 55 and older into high-impact mentoring and tutoring roles in public elementary schools that are designed to also benefit the volunteers. We describe the evolution of the GHCC-COAH partnership through the "Courtship Model." We describe how community-based participatory research principles, such as shared governance, were applied at the following stages: (1) partner selection, (2) getting serious, (3) commitment, and (4) leaving a legacy. EC could not have achieved its current level of success without academic-community partnership. In early stages of the "Courtship Model," GHCC and COAH were able to rely on the trust developed between the leadership of the partner organizations. Competing missions from different community and academic funders led to tension in later stages of the "Courtship Model" and necessitated a formal Memorandum of Understanding between the partners as they embarked on a randomized controlled trial. The GHCC-COAH partnership demonstrates how academic-community partnerships can serve as an engine for social innovation. The partnership could serve as a model for other communities seeking multiple funding sources to implement similar public health interventions that are based on national service models. Unified funding mechanisms would assist the formation of academic-community partnerships that could support the design, implementation, and the evaluation of community-based public health interventions.
Simulation studies of the application of SEASAT data in weather and state of sea forecasting models
NASA Technical Reports Server (NTRS)
Cardone, V. J.; Greenwood, J. A.
1979-01-01
The design and analysis of SEASAT simulation studies in which the error structure of conventional analyses and forecasts is modeled realistically are presented. The development and computer implementation of a global spectral ocean wave model is described. The design of algorithms for the assimilation of theoretical wind data into computers and for the utilization of real wind data and wave height data in a coupled computer system are presented.
An Open Source modular platform for hydrological model implementation
NASA Astrophysics Data System (ADS)
Kolberg, Sjur; Bruland, Oddbjørn
2010-05-01
An implementation framework for setup and evaluation of spatio-temporal models is developed, forming a highly modularized distributed model system. The ENKI framework allows building space-time models for hydrological or other environmental purposes, from a suite of separately compiled subroutine modules. The approach makes it easy for students, researchers and other model developers to implement, exchange, and test single routines in a fixed framework. The open-source license and modular design of ENKI will also facilitate rapid dissemination of new methods to institutions engaged in operational hydropower forecasting or other water resource management. Written in C++, ENKI uses a plug-in structure to build a complete model from separately compiled subroutine implementations. These modules contain very little code apart from the core process simulation, and are compiled as dynamic-link libraries (dll). A narrow interface allows the main executable to recognise the number and type of the different variables in each routine. The framework then exposes these variables to the user within the proper context, ensuring that time series exist for input variables, initialisation for states, GIS data sets for static map data, manually or automatically calibrated values for parameters, etc. ENKI is designed to meet three different levels of involvement in model construction:
• Model application: running and evaluating a given model; regional calibration against arbitrary data using a rich suite of objective functions, including likelihood and Bayesian estimation; uncertainty analysis directed towards input or parameter uncertainty. Need not: know the model's composition of subroutines, the internal variables in the model, or how method modules are created.
• Model analysis: linking together different process methods, including parallel setup of alternative methods for solving the same task; investigating the effect of different spatial discretization schemes. Need not: write or compile computer code, or handle file IO for each module.
• Routine implementation and testing: implementing new process-simulating methods/equations, specialised objective functions or quality control routines, and testing these in an existing framework. Need not: implement a user or model interface for the new routine, IO handling, administration of model setup and run, calibration and validation routines, etc.
Originally developed for Norway's largest hydropower producer, Statkraft, ENKI is now being turned into an Open Source project. At the time of writing, the licence and the project administration are not established, and the application remains to be ported to other compilers and computer platforms. However, we hope that ENKI will prove useful for both academic and operational users.
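The plug-in idea described above (each routine declares its variables through a narrow interface, and the framework wires time series, states, and parameters to them by name) can be sketched in miniature. ENKI itself is C++ with compiled dlls; this Python sketch only illustrates the pattern, and the routine, variable names, and degree-day melt equation are invented for the example:

```python
class Routine:
    """Narrow interface: a routine declares its variables and provides one step."""
    inputs, states, params = (), (), ()
    def step(self, inp, state, par):
        raise NotImplementedError

class DegreeDayMelt(Routine):
    """Toy snowmelt routine: melt = ddf * max(T - t_melt, 0), bounded by storage."""
    inputs = ("temperature", "precipitation")
    states = ("snow_storage",)
    params = ("ddf", "t_melt")
    def step(self, inp, state, par):
        accum = inp["precipitation"] if inp["temperature"] <= par["t_melt"] else 0.0
        snow = state["snow_storage"] + accum
        melt = min(snow, par["ddf"] * max(inp["temperature"] - par["t_melt"], 0.0))
        return {"snow_storage": snow - melt}, {"melt": melt}

class Framework:
    """The 'main executable': inspects declared variables and feeds each routine."""
    def __init__(self):
        self.routines = []
    def register(self, routine):
        self.routines.append(routine)
    def run(self, forcing, state, params):
        outputs = {}
        for r in self.routines:
            upd, out = r.step({k: forcing[k] for k in r.inputs},
                              {k: state[k] for k in r.states},
                              {k: params[k] for k in r.params})
            state.update(upd)
            outputs.update(out)
        return state, outputs
```

Because the framework only sees declared variable names, a routine author writes the process equation and nothing else, which is the division of labour the three involvement levels rely on.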
The Hindmarsh-Rose neuron model: bifurcation analysis and piecewise-linear approximations.
Storace, Marco; Linaro, Daniele; de Lange, Enno
2008-09-01
This paper provides a global picture of the bifurcation scenario of the Hindmarsh-Rose model. A combination between simulations and numerical continuations is used to unfold the complex bifurcation structure. The bifurcation analysis is carried out by varying two bifurcation parameters and evidence is given that the structure that is found is universal and appears for all combinations of bifurcation parameters. The information about the organizing principles and bifurcation diagrams are then used to compare the dynamics of the model with that of a piecewise-linear approximation, customized for circuit implementation. A good match between the dynamical behaviors of the models is found. These results can be used both to design a circuit implementation of the Hindmarsh-Rose model mimicking the diversity of neural response and as guidelines to predict the behavior of the model as well as its circuit implementation as a function of parameters. (c) 2008 American Institute of Physics.
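A direct simulation of the Hindmarsh-Rose equations is the usual starting point for exploring the bifurcation structure the paper maps. The sketch below uses the commonly cited parameter set (a=1, b=3, c=1, d=5, s=4, x_rest=-1.6, r=0.006) with applied current I=3, which typically produces bursting; the forward-Euler step size and duration are illustrative choices, not the paper's numerical-continuation method:

```python
import numpy as np

def hindmarsh_rose(I=3.0, a=1.0, b=3.0, c=1.0, d=5.0,
                   r=0.006, s=4.0, x_rest=-1.6,
                   dt=0.01, steps=100_000):
    """Forward-Euler integration of the Hindmarsh-Rose neuron model.

    x: membrane potential; y: fast recovery variable; z: slow adaptation.
    Returns the membrane-potential trace.
    """
    x, y, z = x_rest, 0.0, 0.0
    xs = np.empty(steps)
    for i in range(steps):
        dx = y - a * x**3 + b * x**2 - z + I
        dy = c - d * x**2 - y
        dz = r * (s * (x - x_rest) - z)
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        xs[i] = x
    return xs
```

Sweeping I (or another pair of parameters) and classifying the resulting traces is the brute-force counterpart of the numerical continuation used in the paper, and the same sweep can be rerun against a piecewise-linear approximation to check the match.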
Ji, Yuan; Wang, Sue-Jane
2013-01-01
The 3 + 3 design is the most common choice among clinicians for phase I dose-escalation oncology trials. In recent reviews, more than 95% of phase I trials have been based on the 3 + 3 design. Given that it is intuitive and its implementation does not require a computer program, clinicians can conduct 3 + 3 dose escalations in practice with virtually no logistic cost, and trial protocols based on the 3 + 3 design pass institutional review board and biostatistics reviews quickly. However, the performance of the 3 + 3 design has rarely been compared with model-based designs in simulation studies with matched sample sizes. In the vast majority of statistical literature, the 3 + 3 design has been shown to be inferior in identifying true maximum-tolerated doses (MTDs), although the sample size required by the 3 + 3 design is often orders-of-magnitude smaller than model-based designs. In this article, through comparative simulation studies with matched sample sizes, we demonstrate that the 3 + 3 design has higher risks of exposing patients to toxic doses above the MTD than the modified toxicity probability interval (mTPI) design, a newly developed adaptive method. In addition, compared with the mTPI design, the 3 + 3 design does not yield higher probabilities in identifying the correct MTD, even when the sample size is matched. Given that the mTPI design is equally transparent, costless to implement with free software, and more flexible in practical situations, we highly encourage its adoption in early dose-escalation studies whenever the 3 + 3 design is also considered. We provide free software to allow direct comparisons of the 3 + 3 design with other model-based designs in simulation studies with matched sample sizes. PMID:23569307
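The sample-size and escalation behavior discussed above is easy to examine by simulation. The sketch below implements only the basic 3 + 3 escalation rules (0/3 DLTs: escalate; 1/3: expand to 6, escalate on 1/6; otherwise stop with MTD one dose below), without de-escalation or an MTD confirmation cohort, and the toxicity curves in the usage are invented; the mTPI comparator the article describes is not reproduced here:

```python
import random

def three_plus_three(true_tox, seed=0):
    """Simulate one simplified 3+3 dose-escalation trial.

    true_tox: true dose-limiting-toxicity (DLT) probability at each dose level.
    Returns (declared MTD index, or None if no dose is tolerated; total patients).
    """
    rng = random.Random(seed)
    d, n_total = 0, 0
    while True:
        dlt = sum(rng.random() < true_tox[d] for _ in range(3))
        n_total += 3
        if dlt == 1:                       # expand cohort to 6 at this dose
            dlt += sum(rng.random() < true_tox[d] for _ in range(3))
            n_total += 3
            if dlt >= 2:                   # >=2/6: MTD is one dose below
                return (d - 1 if d > 0 else None), n_total
        elif dlt >= 2:                     # >=2/3: MTD is one dose below
            return (d - 1 if d > 0 else None), n_total
        if d == len(true_tox) - 1:         # top dose reached and tolerated
            return d, n_total
        d += 1
```

Running many replicates per scenario and tallying how often the declared MTD matches the dose nearest the target toxicity rate reproduces, in miniature, the matched-sample-size comparisons the article advocates.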
Integrated active and passive control design methodology for the LaRC CSI evolutionary model
NASA Technical Reports Server (NTRS)
Voth, Christopher T.; Richards, Kenneth E., Jr.; Schmitz, Eric; Gehling, Russel N.; Morgenthaler, Daniel R.
1994-01-01
A general design methodology to integrate active control with passive damping was demonstrated on the NASA LaRC CSI Evolutionary Model (CEM), a ground testbed for future large, flexible spacecraft. Vibration suppression controllers designed for Line-of-Sight (LOS) minimization were successfully implemented on the CEM. A frequency-shaped H2 methodology was developed, allowing the designer to specify the roll-off of the MIMO compensator. A closed loop bandwidth of 4 Hz, including the six rigid body modes and the first three dominant elastic modes of the CEM, was achieved. Good agreement was demonstrated between experimental data and analytical predictions for the closed loop frequency response and random tests. Using the Modal Strain Energy (MSE) method, a passive damping treatment consisting of 60 viscoelastically damped struts was designed, fabricated and implemented on the CEM. Damping levels for the targeted modes were more than an order of magnitude larger than for the undamped structure. Using measured loss and stiffness data for the individual damped struts, analytical predictions of the damping levels were very close to the experimental values in the (1-10) Hz frequency range where the open loop model matched the experimental data. An integrated active/passive controller was successfully implemented on the CEM and was evaluated against an active-only controller. A two-fold increase in the effective control bandwidth and further reductions of 30 percent to 50 percent in the LOS RMS outputs were achieved compared to an active-only controller. Superior performance was also obtained compared to a High-Authority/Low-Authority (HAC/LAC) controller.
Effects of Video Modeling on Treatment Integrity of Behavioral Interventions
ERIC Educational Resources Information Center
DiGennaro-Reed, Florence D.; Codding, Robin; Catania, Cynthia N.; Maguire, Helena
2010-01-01
We examined the effects of individualized video modeling on the accurate implementation of behavioral interventions using a multiple baseline design across 3 teachers. During video modeling, treatment integrity improved above baseline levels; however, teacher performance remained variable. The addition of verbal performance feedback increased…
Turbulence model development and application at Lockheed Fort Worth Company
NASA Technical Reports Server (NTRS)
Smith, Brian R.
1995-01-01
This viewgraph presentation demonstrates that computationally efficient k-l and k-kl turbulence models have been developed and implemented at Lockheed Fort Worth Company. Many years of experience have been gained applying two equation turbulence models to complex three-dimensional flows for design and analysis.
Structural Equation Modeling of School Violence Data: Methodological Considerations
ERIC Educational Resources Information Center
Mayer, Matthew J.
2004-01-01
Methodological challenges associated with structural equation modeling (SEM) and structured means modeling (SMM) in research on school violence and related topics in the social and behavioral sciences are examined. Problems associated with multiyear implementations of large-scale surveys are discussed. Complex sample designs, part of any…
NASA Technical Reports Server (NTRS)
Campbell, B. H.
1974-01-01
A methodology developed for the balanced design of spacecraft subsystems, interrelating cost, performance, safety, and schedule considerations, was refined. The methodology is a two-step process: the first step selects all hardware designs that satisfy the given performance and safety requirements; the second estimates the cost and schedule required to design, build, and operate each spacecraft design. Using this methodology to develop a systems cost/performance model allows the user of such a model to establish specific designs and the related costs and schedules. The user is able to determine the sensitivity of designs, costs, and schedules to changes in requirements. The resulting systems cost/performance model is described and implemented as a digital computer program.
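The two-step process can be sketched as a filter-then-estimate pipeline: discard candidate designs that miss the performance or safety requirements, then attach cost and schedule estimates to the survivors. The design records, requirement thresholds, and mass-driven estimating relationships below are invented placeholders, not the model's actual parametrics:

```python
# Step 1 inputs: candidate designs with illustrative attributes.
designs = [
    {"name": "A", "performance": 0.92, "safety": 0.9990, "mass_kg": 140},
    {"name": "B", "performance": 0.85, "safety": 0.9995, "mass_kg": 110},
    {"name": "C", "performance": 0.95, "safety": 0.9900, "mass_kg": 180},
]

def feasible(d, perf_req=0.90, safety_req=0.995):
    """Step 1: keep only designs meeting performance and safety requirements."""
    return d["performance"] >= perf_req and d["safety"] >= safety_req

def cost_and_schedule(d):
    """Step 2: placeholder estimating relationships (e.g. mass-driven cost)."""
    cost_musd = 2.0 + 0.05 * d["mass_kg"]          # $M, invented coefficients
    schedule_months = 12 + 0.02 * d["mass_kg"]     # months, invented coefficients
    return cost_musd, schedule_months

candidates = [dict(d, estimate=cost_and_schedule(d)) for d in designs if feasible(d)]
```

Rerunning the pipeline with tightened thresholds shows directly how the feasible set, costs, and schedules shift with requirements, which is the sensitivity analysis the abstract describes.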
Systems cost/performance analysis; study 2.3. Volume 3: Programmer's manual and user's guide
NASA Technical Reports Server (NTRS)
1975-01-01
The implementation of the entire systems cost/performance model as a digital computer program was studied. A discussion of the operating environment in which the program was written and checked, the program specifications such as discussions of logic and computational flow, the different subsystem models involved in the design of the spacecraft, and routines involved in the nondesign area such as costing and scheduling of the design were covered. Preliminary results for the DSCS-2 design are also included.
Computational Modeling Approaches to Multiscale Design of Icephobic Surfaces
NASA Technical Reports Server (NTRS)
Tallman, Aaron; Wang, Yan; Vargas, Mario
2017-01-01
To aid in the design of surfaces that prevent icing, a model and computational simulation of impact ice formation at the single droplet scale was implemented. The nucleation of a single supercooled droplet impacting on a substrate, in rime ice conditions, was simulated. Open source computational fluid dynamics (CFD) software was used for the simulation. No existing model simulates simultaneous impact and freezing of a single supercooled water droplet. For the 10-week project, a low-fidelity feasibility study was the goal.
Flight elements: Fault detection and fault management
NASA Technical Reports Server (NTRS)
Lum, H.; Patterson-Hine, A.; Edge, J. T.; Lawler, D.
1990-01-01
Fault management for an intelligent computational system must be developed using a top-down integrated engineering approach. The proposed approach includes integrating the overall environment involving sensors and their associated data; design knowledge capture; operations; fault detection, identification, and reconfiguration; testability; causal models, including digraph matrix analysis; and overall performance impacts on the hardware and software architecture. Implementation of the concept, to achieve a real-time intelligent fault detection and management system, will be accomplished through several objectives: development of fault-tolerant/FDIR requirements and specifications from a systems level, carrying through from conceptual design to implementation and mission operations; implementation of monitoring, diagnosis, and reconfiguration at all system levels, providing fault isolation and system integration; optimization of system operations to manage degraded system performance through system integration; and lowering of development and operations costs through the implementation of an intelligent real-time fault detection and fault management system and an information management system.
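The causal-model element can be illustrated with simple reachability over a fault-propagation digraph: edges record "failure of A can cause failure of B", so forward reachability predicts a fault's downstream effects and the reverse question isolates candidate root causes for an observed symptom. This toy sketch is not the digraph matrix analysis tool the abstract refers to, and the component names are invented:

```python
from collections import deque

def reachable(graph, start):
    """All nodes reachable from `start` by BFS over the fault digraph."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def candidate_roots(graph, symptom):
    """Root faults whose propagation set contains the observed symptom."""
    return {n for n in graph if symptom in reachable(graph, n)}

# Invented component digraph: "power" failure can take down two components, etc.
fault_graph = {"power": ["sensor", "estimator"], "sensor": ["filter"],
               "filter": ["estimator"], "estimator": ["controller"]}
```

In a real FDIR setting the same reachability sets would feed fault isolation (intersecting candidate sets across multiple symptoms) and reconfiguration decisions.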
Using implementation science as the core of the doctor of nursing practice inquiry project.
Riner, Mary E
2015-01-01
New knowledge in health care needs to be implemented for continuous practice improvement. Doctor of nursing practice (DNP) programs are designed to increase clinical practice knowledge and leadership skills of graduates. This article describes an implementation science course developed in a DNP program focused on advancing graduates' capacity for health systems leadership. Curriculum and course development are presented, and the course is mapped to depict how the course objectives and assignments were aligned with DNP Essentials. Course modules with rationale are described, and examples of how students implemented assignments are provided. The challenges of integrating this course into the life of the school are discussed, as well as steps taken to develop faculty for this capstone learning experience. This article describes a model of using implementation science to provide DNP students an experience in designing and managing an evidence-based practice change project. Copyright © 2015 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Vinton, Dennis A.; Zachmeyer, Richard F.
This final report presents a description of a 3-year project to develop and implement a model training program (for special education personnel, park and resource management personnel, and parents of disabled children) designed to promote outdoor environmental education for disabled children. The project conducted 22 training workshops (2-5 days)…
DOT National Transportation Integrated Search
2017-02-01
As part of the Federal Highway Administration (FHWA) Traffic Analysis Toolbox (Volume XIII), this guide was designed to help corridor stakeholders implement the Integrated Corridor Management (ICM) Analysis, Modeling, and Simulation (AMS) methodology...
Odhiambo-Otieno, George W; Odero, Wilson W O
2005-03-01
The District Health Management Information Systems (DHMISs) were established by the Ministry of Health (MoH) in Kenya more than two decades ago. Since then, no comprehensive evaluation has been undertaken, which can partly be attributed to the lack of defined criteria for evaluating them. The objective was to propose evaluation criteria for assessing the design, implementation and impact of DHMIS in the management of the District Health System (DHS) in Kenya. A descriptive cross-sectional study was conducted in three DHSs in Kenya: Bungoma, Murang'a and Uasin Gishu districts. Data were collected through focus group discussions, key informant interviews, and document reviews. The respondents, purposively selected from the Ministry of Health headquarters and the three DHS districts, included designers, managers and end-users of the systems. A set of evaluation criteria for DHMISs was identified for each of the three phases of implementation: pre-implementation evaluation criteria (categorised as policy and objectives, technical feasibility, financial viability, political viability and administrative operability) to be applied at the design stage; concurrent implementation evaluation criteria to be applied during implementation of the new system; and post-implementation evaluation criteria (classified as internal, quality of information; external, resources and managerial support; ultimate, systems impact) to be applied after the system has been in use for at least three years. A DHMIS model should have these three sets of evaluation criteria built in and applied in a phased manner: pre-implementation criteria to evaluate the system's viability before more resources are committed to it; concurrent (operational) implementation criteria to monitor the process; and post-implementation criteria to assess the system's effectiveness.
Moving from Theory to Practice: Implementing the Kin Keeper[superscript SM] Cancer Prevention Model
ERIC Educational Resources Information Center
Williams, K. P.; Mullan, P. B.; Todem, D.
2009-01-01
This paper presents the rationale and findings of a feasibility and process study of the Kin Keeper[superscript SM] Cancer Prevention Intervention. An observational cohort study design was implemented with African-American women in synergistic female family relationships. Community health workers (CHWs) from two Michigan public health programs…
ERIC Educational Resources Information Center
Stoyanoff, Dawn Galadriel Pfeiffer
2012-01-01
This study examined the enterprise resource planning (ERP) implementations that utilized a shared services model in higher education. The purpose of this research was to examine the critical success factors which were perceived to contribute to project success. This research employed a quantitative non-experimental correlational design and the…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-11
... the Office of the Commissioner on the implementation of the FDA Safety and Innovation Act, Business Impact of Outsourcing, Supplier Management Models that Work, Implementing Quality by Design (QbD... quality and management through the following topics: Beyond our Borders--Maximizing the Impact of FDA's...
ERIC Educational Resources Information Center
Shire, Stephanie Y.; Chang, Ya-Chih; Shih, Wendy; Bracaglia, Suzanne; Kodjoe, Maria; Kasari, Connie
2017-01-01
Background: Interventions found to be effective in research settings are often not as effective when implemented in community settings. Considering children with autism, studies have rarely examined the efficacy of laboratory-tested interventions on child outcomes in community settings using randomized controlled designs. Methods: One hundred and…
A Culture-Based Model for Strategic Implementation of Virtual Education Delivery
ERIC Educational Resources Information Center
Burn, Janice; Thongprasert, Nalinee
2005-01-01
This study was designed to examine the critical success factors for implementing Virtual Education Delivery (VED) in Thailand, and to identify ways to facilitate such adoption and lead to effective outcomes. The study incorporated an analysis of three specific factors related to Thai culture: high power distance "Bhun Khun", uncertainty…
Using Multimedia to Enhance Knowledge of Service Attitude in the Hospitality Industry
ERIC Educational Resources Information Center
Kuo, Chun Min
2012-01-01
Having used a quasi-experimental research model and the ADDIE (Analyze, Design, Develop, Implement, and Evaluate) calibration method to gather and implement data, the researcher developed an interactive multimedia assisted learning (MAL) program promoting proper service attitudes in the hospitality industry. In order to gauge the MAL program's…
The Impact of Knowledge Conversion Processes on Implementing a Learning Organization Strategy
ERIC Educational Resources Information Center
Al-adaileh, Raid Moh'd; Dahou, Khadra; Hacini, Ishaq
2012-01-01
Purpose: The purpose of this research is to explore the influence of the knowledge conversion processes (KCP) on the success of a learning organization (LO) strategy implementation. Design/methodology/approach: Using a case study approach, the research model examines the impact of the KCP including socialization, externalization, combination and…
An Effective Assessment Model for Implementing Change and Improving Learning
ERIC Educational Resources Information Center
Mince, Rose; Ebersole, Tara
2008-01-01
Assessment at Community College of Baltimore County (CCBC) involves asking the right questions and using data to determine what changes should be implemented to enhance student learning. Guided by a 5-stage design, CCBC's assessment program is faculty-driven, risk-free, and externally validated. Curricular and pedagogical changes have resulted in…
ERIC Educational Resources Information Center
Nichols, James O.
This guide is intended for college and university administrators responsible for designing and implementing a model for assessment of student outcomes and institutional effectiveness. The first chapter explains use of the handbook and introduces the institutional effectiveness paradigm on which it is based. The second chapter explains the model…
A Framework for Implementing Individualized Self-Regulated Learning Strategies in the Classroom
ERIC Educational Resources Information Center
Ness, Bryan M.; Middleton, Michael J.
2012-01-01
Self-regulated learning (SRL) is a conceptual model that can be used to design and implement individualized learning strategies for students with learning disabilities. Students who self-regulate their learning engage in planning, performance, and self-evaluation during academic tasks. This article highlights one approach for teaching SRL skills…
A Qualitative Analysis of an Advanced Practice Nurse-Directed Transitional Care Model Intervention
ERIC Educational Resources Information Center
Bradway, Christine; Trotta, Rebecca; Bixby, M. Brian; McPartland, Ellen; Wollman, M. Catherine; Kapustka, Heidi; McCauley, Kathleen; Naylor, Mary D.
2012-01-01
Purpose: The purpose of this study was to describe barriers and facilitators to implementing a transitional care intervention for cognitively impaired older adults and their caregivers, led by advanced practice nurses (APNs). Design and Methods: APNs implemented an evidence-based protocol to optimize transitions from hospital to home. An…
Implementation Blueprint and Self-Assessment: Positive Behavioral Interventions and Supports
ERIC Educational Resources Information Center
Technical Assistance Center on Positive Behavioral Interventions and Supports, 2010
2010-01-01
A "blueprint" is a guide designed to improve large-scale implementations of a specific systems or organizational approach, like School-Wide Positive Behavior Support (SWPBS). This blueprint is intended to make the conceptual theory, organizational models, and practices of SWPBS more accessible for those involved in enhancing how schools,…
Implementing High School JROTC [Junior Reserve Officers Training Corps] Career Academies.
ERIC Educational Resources Information Center
Hanser, Lawrence M.; Robyn, Abby E.
In 1992, the U.S. Department of Defense and U.S. Department of Education jointly developed the Junior Reserve Officers Training Corps (JROTC) Career Academy model, which provides a framework for implementation of an innovative vocational education program designed to keep dropout-prone students in school. The program, which combines military…
Planetary gear profile modification design based on load sharing modelling
NASA Astrophysics Data System (ADS)
Iglesias, Miguel; Fernández Del Rincón, Alfonso; De-Juan, Ana Magdalena; Garcia, Pablo; Diez, Alberto; Viadero, Fernando
2015-07-01
In order to satisfy the increasing demand for high-performance planetary transmissions, an important line of research is focused on understanding some of the underlying phenomena involved in this mechanical system. Through the development of models capable of reproducing the system behavior, research in this area contributes to improved gear transmission insight, helping to develop better maintenance practices and more efficient design processes. A planetary gear model used for the design of profile modifications based on the levelling of the load sharing ratio is presented. The gear profile geometry definition, following a vectorial approach that mimics the real cutting process of gears, is thoroughly described. Teeth undercutting and hypotrochoid definition are implicitly considered, and a procedure for the incorporation of a rounding arc at the tooth tip in order to deal with corner contacts is described. A procedure for the modeling of profile deviations is presented, which can be used for the introduction of both manufacturing errors and designed profile modifications. An easy and flexible implementation of the profile deviation within the planetary model is accomplished based on the geometric overlapping. The contact force calculation and dynamic implementation used in the model are also introduced, and parameters from a real transmission for agricultural applications are presented for the application example. A set of reliefs is designed based on the levelling of the load sharing ratio for the example transmission, and finally some other important dynamic factors of the transmission are analyzed to assess the changes in the dynamic behavior with respect to the non-modified case. Thus, the main innovative aspect of the proposed planetary transmission model is its capacity to provide a simulated load sharing ratio which serves as a design variable for the calculation of the tooth profile modifications.
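The quantity being levelled above is the load sharing ratio: each planet's share of the total transmitted mesh force. A minimal sketch of that quantity follows (ours, not the authors' model, which couples it to profile geometry, contact forces, and dynamics); the example forces are invented.

```python
def load_sharing_ratio(planet_forces):
    """Each planet's mesh force as a fraction of the total transmitted force.

    An ideal planetary stage with N planets would give 1/N for each; profile
    modifications aim to flatten deviations from that even share.
    """
    total = sum(planet_forces)
    return [f / total for f in planet_forces]

# Three planets with unequal mesh forces (e.g. from position errors);
# an even share would be 1/3 each.
lsr = load_sharing_ratio([1200.0, 1000.0, 800.0])
print([round(x, 3) for x in lsr])  # [0.4, 0.333, 0.267]
```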
Pelletier, Alexandra C; Jethwani, Kamal; Bello, Heather; Kvedar, Joseph; Grant, Richard W
2011-01-01
The practice of outpatient type 2 diabetes management is gradually moving from the traditional visit-based, fee-for-service model to a new, health information communication technology (ICT)-supported model that can enable non-visit-based diabetes care. To date, adoption of innovative health ICT tools for diabetes management has been slowed by numerous barriers, such as capital investment costs, lack of reliable reimbursement mechanisms, design defects that have made some systems time-consuming and inefficient to use, and the need to integrate new ICT tools into a system not primarily designed for their use. Effective implementation of innovative diabetes health ICT interventions must address local practice heterogeneity and the interaction of this heterogeneity with clinical care delivery. The Center for Connected Health at Partners Healthcare has implemented a new ICT intervention, Diabetes Connect (DC), a Web-based glucose home monitoring and clinical messaging system. Using the framework of the diffusion of innovation theory, we review the implementation and examine lessons learned as we continue to deploy DC across the health care network. © 2010 Diabetes Technology Society.
An Application of Artificial Intelligence to the Implementation of Electronic Commerce
NASA Astrophysics Data System (ADS)
Srivastava, Anoop Kumar
In this paper, we present an application of Artificial Intelligence (AI) to the implementation of Electronic Commerce, providing a framework based on multiple autonomous agents. Our agent-based architecture leads to flexible design of a spectrum of multiagent systems (MAS) by distributing computation and by providing a unified interface to data and programs. Autonomous agents provide autonomy, simplicity of communication and computation, and well-developed semantics. The steps of design and implementation are discussed in depth, and the structure of the Electronic Marketplace, an ontology, the agent model, and the interaction patterns between agents are given. We have developed mechanisms for coordination between agents using a language called Virtual Enterprise Modeling Language (VEML), an integration of Java and the Knowledge Query and Manipulation Language (KQML). VEML gives application programmers the ability to develop different kinds of MAS based on their requirements and applications. We have implemented a multi-autonomous-agent system called the VE System, and we demonstrate the efficacy of our system by discussing experimental results and its salient features.
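KQML messages, which VEML builds on, pair a performative with named parameters. A minimal, hypothetical sketch of composing such a message as a string; the agent names, ontology label, and content expression are invented for illustration and VEML's Java integration is not shown.

```python
def kqml_message(performative, sender, receiver, content,
                 language="VEML", ontology="e-marketplace"):
    """Compose a KQML-style message: a performative plus :keyword parameters."""
    return (f"({performative} :sender {sender} :receiver {receiver} "
            f":language {language} :ontology {ontology} "
            f":content \"{content}\")")

# A buyer agent querying a seller agent for a price (hypothetical content syntax).
msg = kqml_message("ask-one", "buyer-agent", "seller-agent", "(price widget ?p)")
print(msg)
```

The receiving agent would typically reply with a `tell` performative carrying the binding for `?p`; coordination protocols are built from sequences of such exchanges.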
2014-01-01
Background Healthcare reform in the United States is encouraging Federally Qualified Health Centers and other primary-care practices to integrate treatment for addiction and other behavioral health conditions into their practices. The potential of mobile health technologies to manage addiction and comorbidities such as HIV in these settings is substantial but largely untested. This paper describes a protocol to evaluate the implementation of an E-Health integrated communication technology delivered via mobile phones, called Seva, into primary-care settings. Seva is an evidence-based system of addiction treatment and recovery support for patients and real-time caseload monitoring for clinicians. Methods/Design Our implementation strategy uses three models of organizational change: the Program Planning Model to promote acceptance and sustainability, the NIATx quality improvement model to create a welcoming environment for change, and Rogers’s diffusion of innovations research, which facilitates adaptations of innovations to maximize their adoption potential. We will implement Seva and conduct an intensive, mixed-methods assessment at three diverse Federally Qualified Healthcare Centers in the United States. Our non-concurrent multiple-baseline design includes three periods — pretest (ending in four months of implementation preparation), active Seva implementation, and maintenance — with implementation staggered at six-month intervals across sites. The first site will serve as a pilot clinic. We will track the timing of intervention elements and assess study outcomes within each dimension of the Reach, Effectiveness, Adoption, Implementation, and Maintenance framework, including effects on clinicians, patients, and practices. 
Our mixed-methods approach will include quantitative (e.g., interrupted time-series analysis of treatment attendance, with clinics as the unit of analysis) and qualitative (e.g., staff interviews regarding adaptations to implementation protocol) methods, and assessment of implementation costs. Discussion If implementation is successful, the field will have a proven technology that helps Federally Qualified Health Centers and affiliated organizations provide addiction treatment and recovery support, as well as a proven strategy for implementing the technology. Seva also has the potential to improve core elements of addiction treatment, such as referral and treatment processes. A mobile technology for addiction treatment and accompanying implementation model could provide a cost-effective means to improve the lives of patients with drug and alcohol problems. Trial registration ClinicalTrials.gov (NCT01963234). PMID:24884976
Johnson, B.L.; Barko, J.W.; Clevenstine, R.; Davis, M.; Galat, D.L.; Lubinski, S.J.; Nestler, J.M.
2010-01-01
The primary purpose of this report is to provide an adaptive management approach for learning more about summer water level reductions (drawdowns) as a management tool, including where and how drawdowns can be applied most effectively within the Upper Mississippi River System. The report reviews previous drawdowns conducted within the system and provides specific recommendations for learning more about the lesser known effects of drawdowns and how the outcomes can be influenced by different implementation strategies and local conditions. The knowledge gained can be used by managers to determine how best to implement drawdowns in different parts of the UMRS to help achieve management goals. The information and recommendations contained in the report are derived from results of previous drawdown projects, insights from regional disciplinary experts, and the experience of the authors in experimental design, modeling, and monitoring. Modeling is a critical part of adaptive management and can involve conceptual models, simulation models, and empirical models. In this report we present conceptual models that express current understanding regarding functioning of the UMRS as related to drawdowns and highlight interactions among key ecological components of the system. The models were developed within the constraints of drawdown timing, magnitude (depth), and spatial differences in effects (longitudinal and lateral) with attention to ecological processes affected by drawdowns. With input from regional experts we focused on the responses of vegetation, fish, mussels, other invertebrates, and birds. The conceptual models reflect current understanding about relations and interactions among system components, the expected strength of those interactions, potential responses of system components to drawdowns, likelihood of the response occurring, and key uncertainties that limit our ability to make accurate predictions of effects (Table 1, Fig. 4-10). 
Based on this current understanding, the main questions still associated with drawdowns include (1) the effects of frequency of drawdowns (from once every few years to multiple years in succession); (2) timing of the beginning of drawdowns (follow the descending arm of the flood pulse versus always beginning in early summer); (3) long-term benefits (greater than 5-6 years), especially as compared to known short-term losses (e.g., mortality of mussels in exposed areas, loss of submersed vegetation in exposed areas, cost of advanced dredging); and (4) the effects in northern (above pool 14) versus southern pools (pool 14 and below, and the Illinois River). An adaptive management design should address these questions to reduce uncertainty in predictions of drawdown effects and help determine whether different implementation strategies are needed in different parts of the system. Given that drawdowns will continue to be used as a management tool on the UMRS, we suggest that some drawdowns be conducted in an adaptive management context that helps meet management objectives, but also provides efficient learning about the questions listed above. We propose two different, but interrelated, experimental designs to address these questions. Both designs call for conducting multiple drawdowns in multiple pools (2-4 pools) to allow direct comparison of results and produce rapid learning. However, the report does not provide a detailed scope of work for carrying out the designs. If managers choose to implement one of the experimental designs, specifics of choosing appropriate pools and developing a monitoring plan will need to be determined through collaboration among managers, researchers, and statisticians. We suggest characteristics to consider in selecting treatment and reference pools (study sites) and also provide guidance for developing a monitoring plan. 
Some aspects of these two designs could be implemented individually, but by implementing individual elements, direct comparisons of some design features
Supporting Teachers Learning Through the Collaborative Design of Technology-Enhanced Science Lessons
NASA Astrophysics Data System (ADS)
Kafyulilo, Ayoub C.; Fisser, Petra; Voogt, Joke
2015-12-01
This study used the Interconnected Model of Professional Growth (Clarke & Hollingsworth in Teaching and Teacher Education, 18, 947-967, 2002) to unravel how science teachers' technology integration knowledge and skills developed in a professional development arrangement. The professional development arrangement used Technological Pedagogical Content Knowledge as a conceptual framework and included collaborative design of technology-enhanced science lessons, implementation of the lessons and reflection on outcomes. Support to facilitate the process was offered in the form of collaboration guidelines, online learning materials, exemplary lessons and the availability of an expert. Twenty teachers participated in the intervention. Pre- and post-intervention results showed improvements in teachers' perceived and demonstrated knowledge and skills in integrating technology in science teaching. Collaboration guidelines helped the teams to understand the design process, while exemplary materials provided a picture of the product they had to design. The availability of relevant online materials simplified the design process. The expert was important in providing technological and pedagogical support during design and implementation, and reflected with teachers on how to cope with problems met during implementation.
Computational statistics using the Bayesian Inference Engine
NASA Astrophysics Data System (ADS)
Weinberg, Martin D.
2013-09-01
This paper introduces the Bayesian Inference Engine (BIE), a general, parallel, optimized software package for parameter inference and model selection. The package is motivated by the analysis needs of modern astronomical surveys and the need to organize and reuse expensive derived data. The BIE is the first platform for computational statistics designed explicitly to enable Bayesian update and model comparison for astronomical problems. Bayesian update is based on the representation of high-dimensional posterior distributions using metric-ball-tree based kernel density estimation. Among its algorithmic offerings, the BIE emphasizes hybrid tempered Markov chain Monte Carlo schemes that robustly sample multimodal posterior distributions in high-dimensional parameter spaces. Moreover, the BIE implements a full persistence or serialization system that stores the full byte-level image of the running inference and previously characterized posterior distributions for later use. Two new algorithms to compute the marginal likelihood from the posterior distribution, developed for and implemented in the BIE, enable model comparison for complex models and data sets. Finally, the BIE was designed to be a collaborative platform for applying Bayesian methodology to astronomy. It includes an extensible, object-oriented framework that implements every aspect of the Bayesian inference. By providing a variety of statistical algorithms for all phases of the inference problem, a scientist may explore a variety of approaches with a single model and data implementation. Additional technical and download details are available from http://www.astro.umass.edu/bie. The BIE is distributed under the GNU General Public License.
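The tempered MCMC machinery the BIE parallelizes ultimately rests on Metropolis-style posterior sampling. A toy single-chain sketch on a one-dimensional Gaussian "posterior" follows; this is our illustration of the underlying algorithm, not BIE code, and the step size and target are arbitrary choices.

```python
import math
import random

random.seed(0)

def log_post(x):
    """Log of an unnormalized N(0, 1) density, standing in for a real posterior."""
    return -0.5 * x * x

def metropolis(n, step=1.0, x=0.0):
    """Random-walk Metropolis: propose, then accept with prob min(1, p'/p)."""
    samples = []
    for _ in range(n):
        prop = x + random.uniform(-step, step)
        if math.log(random.random()) < log_post(prop) - log_post(x):
            x = prop
        samples.append(x)
    return samples

s = metropolis(20000)
mean = sum(s) / len(s)
print(round(mean, 2))  # sample mean should be near 0.0
```

Tempered schemes of the kind the BIE emphasizes run several such chains at flattened versions of the posterior and swap states between them, which is what lets them cross between well-separated modes.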
Synthetic Proxy Infrastructure for Task Evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Junghans, Christoph; Pavel, Robert
The Synthetic Proxy Infrastructure for Task Evaluation is a proxy application designed to support application developers in gauging the performance of various task granularities when determining how best to utilize task-based programming models. The infrastructure is designed to provide examples of common communication patterns with a synthetic workload, intended to provide performance data for evaluating programming-model and platform overheads when determining task granularity for task decomposition purposes. It is presented as a reference implementation of a proxy application with run-time configurable input and output task dependencies, ranging from an embarrassingly parallel scenario to patterns with stencil-like dependencies upon their nearest neighbors. Once all inputs (if any) are satisfied, each task executes a synthetic workload (a simple DGEMM in this case) of varying size and passes all outputs (if any) to the next tasks. The intent is for this reference implementation to be implemented as a proxy app in different programming models so as to provide the same infrastructure, and to allow application developers to simulate their own communication needs to assist in task decomposition under various models on a given platform.
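The pattern described, tasks that fire once their input dependencies are satisfied and then run a synthetic DGEMM-like workload, can be sketched as follows. This is our illustration of the idea, not the reference implementation, and a real task runtime would execute ready tasks concurrently rather than in a serial loop.

```python
def dgemm(a, b):
    """Naive matrix multiply standing in for the synthetic DGEMM workload."""
    n, m, p = len(a), len(b[0]), len(b)
    return [[sum(a[i][k] * b[k][j] for k in range(p)) for j in range(m)]
            for i in range(n)]

def run_tasks(tasks, deps):
    """Execute tasks in dependency order; deps maps task -> set of prerequisites."""
    done, order = set(), []
    while len(done) < len(tasks):
        for t in tasks:
            if t not in done and deps.get(t, set()) <= done:
                order.append(t)
                dgemm([[1.0] * 8 for _ in range(8)],
                      [[1.0] * 8 for _ in range(8)])  # synthetic work per task
                done.add(t)
    return order

# 'a' and 'b' are independent (embarrassingly parallel); 'c' waits on both.
order = run_tasks(["a", "b", "c"], {"c": {"a", "b"}})
print(order)  # ['a', 'b', 'c']
```

Varying the matrix size per task is the knob that lets one probe how runtime overhead compares with useful work at different task granularities.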
Conceptual astronomy: A novel model for teaching postsecondary science courses
NASA Astrophysics Data System (ADS)
Zeilik, Michael; Schau, Candace; Mattern, Nancy; Hall, Shannon; Teague, Kathleen W.; Bisard, Walter
1997-10-01
An innovative, conceptually based instructional model for teaching large undergraduate astronomy courses was designed, implemented, and evaluated in the Fall 1995 semester. This model was based on cognitive and educational theories of knowledge and, we believe, is applicable to other large postsecondary science courses. Major components were: (a) identification of the basic important concepts and their interrelationships that are necessary for connected understanding of astronomy in novice students; (b) use of these concepts and their interrelationships throughout the design, implementation, and evaluation stages of the model; (c) identification of students' prior knowledge and misconceptions; and (d) implementation of varied instructional strategies targeted toward encouraging conceptual understanding in students (i.e., instructional concept maps, cooperative small group work, homework assignments stressing concept application, and a conceptually based student assessment system). Evaluation included the development and use of three measures of conceptual understanding and one of attitudes toward studying astronomy. Over the semester, students showed very large increases in their understanding as assessed by a conceptually based multiple-choice measure of misconceptions, a select-and-fill-in concept map measure, and a relatedness-ratings measure. Attitudes, which were slightly positive before the course, changed slightly in a less favorable direction.
Electronic Education System Model-2
ERIC Educational Resources Information Center
Güllü, Fatih; Kuusik, Rein; Laanpere, Mart
2015-01-01
In this study we presented the new EES Model-2, extended from the EES model for more productive implementation in e-learning process design and modelling in higher education. Most of the updates concerned the uppermost instructional layer: the learning-process object of that layer was updated to adapt the educational process for both young and old learners,…
Community characteristics and implementation factors associated with effective systems of care.
Lunn, Laurel M; Heflinger, Craig Anne; Wang, Wei; Greenbaum, Paul E; Kutash, Krista; Boothroyd, Roger A; Friedman, Robert M
2011-07-01
How are characteristics of communities associated with the implementation of the principles of systems of care (SOC)? This study uses multilevel modeling with a stratified random sample (N = 225) of US counties to explore community-level predictors of the implementation factors of the System of Care Implementation Survey. A model composed of community-level social indicators fits well with 5 of 14 factors identified as relevant for effective SOCs. As hypothesized, community disadvantage was negatively and residential stability positively associated with the implementation of SOC principles. Designation as a mental health professional shortage area was positively related to some implementation scores, as was the percentage of minority residents, while rurality was not significantly associated with any of the factors. Given the limitations of the study, the results should be interpreted with caution, but suggest that further research is merited to clarify these relationships that could inform efforts directed at promoting SOCs.
Where Next for Marine Cloud Brightening Research?
NASA Astrophysics Data System (ADS)
Jenkins, A. K. L.; Forster, P.
2014-12-01
Realistic estimates of geoengineering effectiveness will be central to informed decision-making on its possible role in addressing climate change. Over the last decade, global-scale computer climate modelling of geoengineering has been developing. While these developments have allowed quantitative estimates of geoengineering effectiveness to be produced, the relative coarseness of the grid of these models (tens of kilometres) means that key practical details of the proposed geoengineering are not always realistically captured. This is particularly true for marine cloud brightening (MCB), where neither the clouds nor the tens-of-metres-scale sea-going implementation vessels can be captured in detail. Previous research using cloud-resolving modelling has shown that neglecting such details may lead to MCB effectiveness being overestimated by up to half. Realistic estimation of MCB effectiveness will likely improve with ongoing developments in the understanding and modelling of clouds. We also propose that realism can be increased via more specific improvements (see figure). A readily achievable example would be the reframing of previous MCB effectiveness estimates in light of the cloud-resolving-scale findings. Incorporation of implementation details could also be made, via parameterisation, into future global-scale modelling of MCB. However, as significant unknowns regarding the design of the MCB aerosol production technique remain, resource-intensive cloud-resolving computer modelling of MCB may be premature unless it is of broader benefit to the wider understanding of clouds. One of the most essential recommendations is for enhanced communication between climate scientists and MCB designers. This would facilitate the identification of potentially important design aspects necessary for realistic computer simulations. Such relationships could be mutually beneficial, with computer modelling potentially informing more efficient designs of the MCB implementation technique. 
(Acknowledgment) This work is part of the Integrated Assessment of Geoengineering Proposals (IAGP) project, funded by the Engineering and Physical Sciences Research Council and the Natural Environment Research Council (EP/I014721/1).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Genser, Krzysztof; Hatcher, Robert; Kelsey, Michael
The Geant4 simulation toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models rely on measured cross-sections and phenomenological models with physically motivated parameters that are tuned to cover many application domains. To study the uncertainties associated with the Geant4 physics models, we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g. a flexible run-time configurable workflow, comprehensive bookkeeping, and an easy-to-expand collection of analytical components. The design, implementation technology, and key functionalities of the toolkit are presented in this paper and illustrated with selected results.
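As a loose illustration of the parameter-variation workflow the toolkit automates, the sketch below varies one parameter of a stand-in "physics model", re-runs the simulation for each variant, and takes the spread of an observable as a parameter-induced uncertainty. The function and numbers are hypothetical, not Geant4 code.

```python
import random
import statistics

def simulate_energy_deposit(scale, n_events=2000, seed=0):
    """Toy stand-in for a physics model: the mean energy deposit depends
    on a tunable model parameter `scale` (hypothetical)."""
    rng = random.Random(seed)
    return [rng.expovariate(1.0 / (10.0 * scale)) for _ in range(n_events)]

# Vary one model parameter over plausible values and record the observable.
variants = {scale: statistics.mean(simulate_energy_deposit(scale))
            for scale in (0.9, 1.0, 1.1)}

nominal = variants[1.0]
# Spread of the observable across variants -> uncertainty from model choice.
uncertainty = max(variants.values()) - min(variants.values())
print(nominal, uncertainty)
```

A real study would vary several parameters jointly and compare full distributions, not just means; the fixed seed here keeps the variants directly comparable.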
Virtual Observation System for Earth System Model: An Application to ACME Land Model Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Dali; Yuan, Fengming; Hernandez, Benjamin
2017-01-01
Investigating and evaluating physical-chemical-biological processes within an Earth system model (ESM) can be very challenging due to the complexity of both model design and software implementation. A virtual observation system (VOS) is presented to enable interactive observation of these processes during system simulation. Based on advanced computing technologies, such as compiler-based software analysis, automatic code instrumentation, and high-performance data transport, the VOS provides run-time observation capability, in-situ data analytics for Earth system model simulation, and model behavior adjustment opportunities through simulation steering. A VOS for a terrestrial land model simulation within the Accelerated Climate Modeling for Energy model is also presented to demonstrate the implementation details and system innovations.
PyRosetta: a script-based interface for implementing molecular modeling algorithms using Rosetta.
Chaudhury, Sidhartha; Lyskov, Sergey; Gray, Jeffrey J
2010-03-01
PyRosetta is a stand-alone Python-based implementation of the Rosetta molecular modeling package that allows users to write custom structure prediction and design algorithms using the major Rosetta sampling and scoring functions. PyRosetta contains Python bindings to libraries that define Rosetta functions including those for accessing and manipulating protein structure, calculating energies and running Monte Carlo-based simulations. PyRosetta can be used in two ways: (i) interactively, using iPython and (ii) script-based, using Python scripting. Interactive mode contains a number of help features and is ideal for beginners while script-mode is best suited for algorithm development. PyRosetta has similar computational performance to Rosetta, can be easily scaled up for cluster applications and has been implemented for algorithms demonstrating protein docking, protein folding, loop modeling and design. PyRosetta is a stand-alone package available at http://www.pyrosetta.org under the Rosetta license which is free for academic and non-profit users. A tutorial, user's manual and sample scripts demonstrating usage are also available on the web site.
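The Monte Carlo sampling PyRosetta exposes follows the standard Metropolis criterion: accept a move that lowers the score, otherwise accept with probability exp(-delta/kT). Below is a minimal, library-free sketch of that loop on a toy score function; it is not the PyRosetta API, and the function names are hypothetical.

```python
import math
import random

def metropolis_minimize(score, propose, x0, kT=1.0, steps=5000, seed=1):
    """Generic Metropolis Monte Carlo: accept a proposed move if it lowers
    the score, otherwise accept with probability exp(-delta / kT)."""
    rng = random.Random(seed)
    x, best = x0, x0
    for _ in range(steps):
        cand = propose(x, rng)
        delta = score(cand) - score(x)
        if delta <= 0 or rng.random() < math.exp(-delta / kT):
            x = cand                      # move accepted
            if score(x) < score(best):
                best = x                  # track lowest-scoring state seen
    return best

# Toy 1-D "energy landscape" standing in for a molecular score function.
score = lambda x: (x - 3.0) ** 2
propose = lambda x, rng: x + rng.gauss(0, 0.5)
best = metropolis_minimize(score, propose, x0=0.0)
print(best)  # close to the minimum at 3.0
```

In Rosetta-style sampling the state would be a protein pose and the proposal a structural move, but the accept/reject logic is the same.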
PyRosetta: a script-based interface for implementing molecular modeling algorithms using Rosetta
Chaudhury, Sidhartha; Lyskov, Sergey; Gray, Jeffrey J.
2010-01-01
Summary: PyRosetta is a stand-alone Python-based implementation of the Rosetta molecular modeling package that allows users to write custom structure prediction and design algorithms using the major Rosetta sampling and scoring functions. PyRosetta contains Python bindings to libraries that define Rosetta functions including those for accessing and manipulating protein structure, calculating energies and running Monte Carlo-based simulations. PyRosetta can be used in two ways: (i) interactively, using iPython and (ii) script-based, using Python scripting. Interactive mode contains a number of help features and is ideal for beginners while script-mode is best suited for algorithm development. PyRosetta has similar computational performance to Rosetta, can be easily scaled up for cluster applications and has been implemented for algorithms demonstrating protein docking, protein folding, loop modeling and design. Availability: PyRosetta is a stand-alone package available at http://www.pyrosetta.org under the Rosetta license which is free for academic and non-profit users. A tutorial, user's manual and sample scripts demonstrating usage are also available on the web site. Contact: pyrosetta@graylab.jhu.edu PMID:20061306
Exploiting the chaotic behaviour of atmospheric models with reconfigurable architectures
NASA Astrophysics Data System (ADS)
Russell, Francis P.; Düben, Peter D.; Niu, Xinyu; Luk, Wayne; Palmer, T. N.
2017-12-01
Reconfigurable architectures are becoming mainstream: Amazon, Microsoft and IBM are supporting such architectures in their data centres. The computationally intensive nature of atmospheric modelling is an attractive target for hardware acceleration using reconfigurable computing. Performance of hardware designs can be improved through the use of reduced-precision arithmetic, but maintaining appropriate accuracy is essential. We explore reduced-precision optimisation for simulating chaotic systems, targeting atmospheric modelling, in which even minor changes in arithmetic behaviour will cause simulations to diverge quickly. The possibility of equally valid simulations having differing outcomes means that standard techniques for comparing numerical accuracy are inappropriate. We use the Hellinger distance to compare statistical behaviour between reduced-precision CPU implementations to guide reconfigurable designs of a chaotic system, then analyse accuracy, performance and power efficiency of the resulting implementations. Our results show that with only a limited loss in accuracy corresponding to less than 10% uncertainty in input parameters, the throughput and energy efficiency of a single-precision chaotic system implemented on a Xilinx Virtex-6 SX475T Field Programmable Gate Array (FPGA) can be more than doubled.
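The Hellinger distance used to compare statistical behaviour between precision variants has a short closed form for discrete distributions; a minimal sketch (the histograms here are illustrative, not the paper's diagnostics):

```python
import math

def hellinger(p, q):
    """Hellinger distance between two discrete probability distributions
    (e.g. histograms of a model diagnostic from two precision variants).
    Ranges from 0 (identical) to 1 (disjoint support)."""
    assert abs(sum(p) - 1.0) < 1e-9 and abs(sum(q) - 1.0) < 1e-9
    return math.sqrt(0.5 * sum((math.sqrt(a) - math.sqrt(b)) ** 2
                               for a, b in zip(p, q)))

# Compare a reference histogram with one from a reduced-precision run.
ref = [0.2, 0.5, 0.3]
reduced = [0.25, 0.45, 0.3]
print(hellinger(ref, reduced))
```

A small distance indicates the reduced-precision run reproduces the reference statistics even though individual trajectories diverge.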
Gregório, João; Pizarro, Ângela; Cavaco, Afonso; Wipfli, Rolf; Lovis, Christian; Mira da Silva, Miguel; Lapão, Luís Velez
2015-01-01
Chronic diseases are pressing health systems to introduce reforms, focused on primary care and multidisciplinary models. Community pharmacists have developed a new role, addressing pharmaceutical care and services. Information systems and technologies (IST) will have an important role in shaping future healthcare provision. However, the best way to design and implement an IST for pharmaceutical service provision is still an open research question. In this paper, we present a possible strategy based on the use of Design Science Research Methodology (DSRM). The application of the DSRM six stages is described, from the definition and characterization of the problem to the evaluation of the artefact.
Cognitive Foundry v. 3.0 (OSS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Basilico, Justin; Dixon, Kevin; McClain, Jonathan
2009-11-18
The Cognitive Foundry is a unified collection of tools designed for research and applications that use cognitive modeling, machine learning, or pattern recognition. The software library contains design patterns, interface definitions, and default implementations of reusable software components and algorithms designed to support a wide variety of research and development needs. The library contains three main software packages: the Common package that contains basic utilities and linear algebraic methods, the Cognitive Framework package that contains tools to assist in implementing and analyzing theories of cognition, and the Machine Learning package that provides general algorithms and methods for populating Cognitive Frameworkmore » components from domain-relevant data.« less
Verification of the Icarus Material Response Tool
NASA Technical Reports Server (NTRS)
Schroeder, Olivia; Palmer, Grant; Stern, Eric; Schulz, Joseph; Muppidi, Suman; Martin, Alexandre
2017-01-01
Due to the complex physics encountered during reentry, material response solvers are used for two main purposes: to improve the understanding of the physical phenomena, and to design and size thermal protection systems (TPS). Icarus is a three-dimensional, unstructured material response tool that is intended to be used for design while maintaining the flexibility to easily implement physical models as needed. Because TPS selection and sizing is critical, it is of the utmost importance that design tools be extensively verified and validated before their use. Verification tests aim at ensuring that the numerical schemes and equations are implemented correctly by comparison to analytical solutions and grid convergence tests.
A Geostationary Earth Orbit Satellite Model Using Easy Java Simulation
ERIC Educational Resources Information Center
Wee, Loo Kang; Goh, Giam Hwee
2013-01-01
We develop an Easy Java Simulation (EJS) model for students to visualize geostationary orbits near Earth, modelled using a Java 3D implementation of the EJS 3D library. The simplified physics model is described and simulated using a simple constant angular velocity equation. We discuss four computer model design ideas: (1) a simple and realistic…
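The constant-angular-velocity physics behind such a geostationary model reduces to matching the orbital period to one sidereal day; a short worked computation:

```python
import math

# Constant-angular-velocity model of a geostationary orbit: the satellite's
# period equals one sidereal day, and gravity supplies the centripetal
# acceleration, so GM = omega^2 * r^3.
GM = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
T_SIDEREAL = 86164.1     # sidereal day, s

omega = 2 * math.pi / T_SIDEREAL     # angular velocity, rad/s
r = (GM / omega ** 2) ** (1 / 3)     # orbital radius from Earth's centre, m
v = omega * r                        # orbital speed, m/s

print(r / 1e3)   # roughly 42,164 km
print(v)         # roughly 3,075 m/s
```

The simulation in the article animates exactly this motion in 3D rather than solving the equations of motion numerically.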
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shott, G.; Yucel, V.; Desotell, L.
2006-07-01
The long-term safety of U.S. Department of Energy (DOE) low-level radioactive disposal facilities is assessed by conducting a performance assessment -- a systematic analysis that compares estimated risks to the public and the environment with performance objectives contained in DOE Manual 435.1-1, Radioactive Waste Management Manual. Before site operations, facility design features such as final inventory, waste form characteristics, and closure cover design may be uncertain. Site operators need a modeling tool that can be used throughout the operational life of the disposal site to guide decisions regarding the acceptance of problematic waste streams, new disposal cell design, environmental monitoring program design, and final site closure. In response to these needs the National Nuclear Security Administration Nevada Site Office (NNSA/NSO) has developed a decision support system for the Area 5 Radioactive Waste Management Site in Frenchman Flat on the Nevada Test Site. The core of the system is a probabilistic inventory and performance assessment model implemented in the GoldSim® simulation platform. The modeling platform supports multiple graphic capabilities that allow clear documentation of the model data sources, conceptual model, mathematical implementation, and results. The combined models have the capability to estimate disposal site inventory, contaminant concentrations in environmental media, and radiological doses to members of the public engaged in various activities at multiple locations. The model allows rapid assessment and documentation of the consequences of waste management decisions using the most current site characterization information, radionuclide inventory, and conceptual model.
The model is routinely used to provide annual updates of site performance, evaluate the consequences of disposal of new waste streams, develop waste concentration limits, optimize the design of new disposal cells, and assess the adequacy of environmental monitoring programs. (authors)
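The probabilistic pattern such a performance assessment follows can be sketched as Monte Carlo propagation: sample uncertain inputs, compute a dose for each realization, and summarize the resulting distribution. All distributions and numbers below are illustrative stand-ins, not values from the Area 5 assessment or the GoldSim model.

```python
import math
import random
import statistics

def sample_dose(rng):
    """One Monte Carlo realization: uncertain inventory times an uncertain
    pathway dose factor gives an annual dose (units illustrative)."""
    inventory_bq = rng.lognormvariate(math.log(1e12), 0.5)   # hypothetical
    dose_per_bq = rng.lognormvariate(math.log(1e-13), 0.3)   # hypothetical
    return inventory_bq * dose_per_bq

rng = random.Random(7)
doses = sorted(sample_dose(rng) for _ in range(10000))

mean_dose = statistics.mean(doses)
p95 = doses[int(0.95 * len(doses))]   # 95th percentile for comparison
print(mean_dose, p95)                 # compare against a performance objective
```

A real assessment propagates many correlated parameters through transport and exposure submodels; the summary step (distribution versus objective) is the same.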
van Hout, H P J; Macneil Vroomen, J L; Van Mierlo, L D; Meiland, F J M; Moll van Charante, E P; Joling, K J; van den Dungen, P; Dröes, R M; van der Horst, H E; de Rooij, S E J A
2014-04-01
Dementia care in The Netherlands is shifting from fragmented, ad hoc care to more coordinated and personalized care. Case management contributes to this shift. The linkage model and a combination of intensive case management and joint agency care models were selected based on their emerging prominence in The Netherlands. It is unclear whether these different forms of case management are more effective than usual care in improving or preserving functioning and well-being at the patient and caregiver level, and at what societal cost. The objective of this article is to describe the design of a study comparing these two case management care models against usual care. Clinical and cost outcomes are investigated, while care processes and the facilitators and barriers for implementation of these models are considered. Mixed methods include a prospective, observational, controlled cohort study among persons with dementia and their primary informal caregivers in regions of The Netherlands with and without case management, including a qualitative process evaluation. Community-dwelling individuals with a dementia diagnosis and an informal caregiver are included. The primary outcome measure is the Neuropsychiatric Inventory for the people with dementia and the General Health Questionnaire for their caregivers. Costs are measured from a societal perspective. Semi-structured interviews with stakeholders based on the theoretical model of adaptive implementation are planned. 521 pairs of persons with dementia and their primary informal caregivers were included and are followed over two years. In the linkage model, substantially more impeding factors for implementation were identified than in the combined model. This article describes the design of an evaluation study of two case management models along with clinical and economic data from persons with dementia and caregivers. The impeding and facilitating factors differed substantially between the two models.
Further results on cost-effectiveness are expected by the beginning of 2015. This is a Dutch adaptation of MacNeil Vroomen et al., Comparing Dutch case management care models for people with dementia and their caregivers: The design of the COMPAS study.
The UIC Therapeutic Partnership Project. Final Report.
ERIC Educational Resources Information Center
Lawlor, Mary C.; Cada, Elizabeth A.
This interdisciplinary inservice training project at the University of Illinois at Chicago was designed to improve early childhood occupational and physical therapy services by developing, implementing, evaluating, and disseminating a comprehensive training model. The competency-based program was designed to address the developmental needs of…
Embodied Design: Constructing Means for Constructing Meaning
ERIC Educational Resources Information Center
Abrahamson, Dor
2009-01-01
Design-based research studies are conducted as iterative implementation-analysis-modification cycles, in which emerging theoretical models and pedagogically plausible activities are reciprocally tuned toward each other as a means of investigating conjectures pertaining to mechanisms underlying content teaching and learning. Yet this approach, even…
Evidence-Centered Design: Recommendations for Implementation and Practice
ERIC Educational Resources Information Center
Hendrickson, Amy; Ewing, Maureen; Kaliski, Pamela; Huff, Kristen
2013-01-01
Evidence-centered design (ECD) is an orientation towards assessment development. It differs from conventional practice in several ways and consists of multiple activities. Each of these activities results in a set of useful documentation: domain analysis, domain modeling, construction of the assessment framework, and assessment…
IMPLEMENTATION OF GREEN ROOF SUSTAINABILITY IN ARID CONDITIONS
We successfully designed and fabricated accurately scaled prototypes of a green roof and a conventional white roof and began testing in simulated conditions of 115-70°F with relative humidity of 13%. The design parameters were based on analytical models created through ver...
Design and implementation of an air-conditioning system with storage tank for load shifting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hsu, Y.Y.; Wu, C.J.; Liou, K.L.
1987-11-01
The experience with the design, simulation and implementation of an air-conditioning system with chilled water storage tank is presented in this paper. The system is used to shift air-conditioning load of residential and commercial buildings from on-peak to off-peak period. Demand-side load management can thus be achieved if many buildings are equipped with such storage devices. In the design of this system, a lumped-parameter circuit model is first employed to simulate the heat transfer within the air-conditioned building such that the required capacity of the storage tank can be figured out. Then, a set of desirable parameters for the temperature controller of the system are determined using the parameter plane method and the root locus method. The validity of the proposed mathematical model and design approach is verified by comparing the results obtained from field tests with those from the computer simulations. Cost-benefit analysis of the system is also discussed.
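A lumped-parameter circuit model of the kind used to size the tank treats the building as a single thermal capacitance behind a thermal resistance. The sketch below integrates that one-node model with forward Euler; R, C and the cooling power are illustrative values, not the paper's.

```python
def simulate_room(T0, T_out, cooling_w, R=0.005, C=2.0e6, dt=60.0, hours=4):
    """Forward-Euler integration of the single-node RC thermal model
    C * dT/dt = (T_out - T) / R - cooling_w.
    R in K/W, C in J/K, cooling_w in W, temperatures in degrees C."""
    T = T0
    for _ in range(int(hours * 3600 / dt)):
        T += dt * ((T_out - T) / R - cooling_w) / C
    return T

# On-peak period: the chilled-water tank supplies 4 kW of cooling.
print(simulate_room(T0=30.0, T_out=35.0, cooling_w=4000.0))
```

Running the model over a design day with and without storage discharge is what lets the required tank capacity be estimated.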
Design of adaptive control systems by means of self-adjusting transversal filters
NASA Technical Reports Server (NTRS)
Merhav, S. J.
1986-01-01
The design of closed-loop adaptive control systems based on nonparametric identification was addressed. Implementation is by self-adjusting Least Mean Square (LMS) transversal filters. The design concept is Model Reference Adaptive Control (MRAC). Major issues are to preserve the linearity of the error equations of each LMS filter, and to prevent estimation bias that is due to process or measurement noise, thus providing necessary conditions for the convergence and stability of the control system. The controlled element is assumed to be asymptotically stable and minimum phase. Because of the nonparametric Finite Impulse Response (FIR) estimates provided by the LMS filters, a-priori information on the plant model is needed only in broad terms. Following a survey of control system configurations and filter design considerations, system implementation is shown here in Single Input Single Output (SISO) format which is readily extendable to multivariable forms. In extensive computer simulation studies the controlled element is represented by a second-order system with widely varying damping, natural frequency, and relative degree.
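A self-adjusting LMS transversal filter of the kind described can be sketched in a few lines: the tapped delay line forms the FIR output, and the weight update w <- w + mu * e * x keeps the error equation linear in the weights. The "plant" coefficients and step size below are illustrative.

```python
import random

def lms_identify(x, d, taps=4, mu=0.05):
    """Self-adjusting LMS transversal filter: adapt FIR weights w so the
    filter output tracks the desired signal d (nonparametric FIR estimate)."""
    w = [0.0] * taps
    buf = [0.0] * taps
    for xn, dn in zip(x, d):
        buf = [xn] + buf[:-1]                        # shift the delay line
        y = sum(wi * bi for wi, bi in zip(w, buf))   # filter output
        e = dn - y                                   # estimation error
        w = [wi + mu * e * bi for wi, bi in zip(w, buf)]  # LMS update
    return w

rng = random.Random(0)
plant = [0.8, -0.3, 0.1, 0.0]        # unknown FIR system to identify
x = [rng.uniform(-1, 1) for _ in range(5000)]
d = [sum(h * x[n - k] for k, h in enumerate(plant) if n - k >= 0)
     for n in range(len(x))]
w = lms_identify(x, d)
print(w)   # converges toward the plant coefficients
```

In the noiseless case shown, the weights converge to the plant taps; with process or measurement noise, avoiding estimation bias requires the care discussed in the abstract.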
Wen, Kuang-Yi; Gustafson, David H; Hawkins, Robert P; Brennan, Patricia F; Dinauer, Susan; Johnson, Pauley R; Siegler, Tracy
2010-01-01
To develop and validate the Readiness for Implementation Model (RIM). This model predicts a healthcare organization's potential for success in implementing an interactive health communication system (IHCS). The model consists of seven weighted factors, with each factor containing five to seven elements. Two decision-analytic approaches, self-explicated and conjoint analysis, were used to measure the weights of the RIM with a sample of 410 experts. The RIM model with weights was then validated in a prospective study of 25 IHCS implementation cases. Orthogonal main effects design was used to develop 700 conjoint-analysis profiles, which varied on seven factors. Each of the 410 experts rated the importance and desirability of the factors and their levels, as well as a set of 10 different profiles. For the prospective 25-case validation, three time-repeated measures of the RIM scores were collected for comparison with the implementation outcomes. Two of the seven factors, 'organizational motivation' and 'meeting user needs,' were found to be most important in predicting implementation readiness. No statistically significant difference was found in the predictive validity of the two approaches (self-explicated and conjoint analysis). The RIM was a better predictor for the 1-year implementation outcome than the half-year outcome. The expert sample, the order of the survey tasks, the additive model, and basing the RIM cut-off score on experience are possible limitations of the study. The RIM needs to be empirically evaluated in institutions adopting IHCS and sustaining the system in the long term.
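The RIM's additive structure (weighted factors summed to a readiness score) can be sketched directly. The two heaviest weights go to the factors the study found most important; the remaining factor names and all numeric weights here are hypothetical placeholders, not the elicited RIM values.

```python
def rim_score(weights, ratings):
    """Additive model: readiness = sum of weight_i * factor_rating_i,
    with weights summing to 1 and ratings on a 0-1 scale."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(w * ratings[f] for f, w in weights.items())

# 'organizational motivation' and 'meeting user needs' were found most
# important; the other five factors and every weight are illustrative.
weights = {"organizational motivation": 0.25, "meeting user needs": 0.20,
           "resources": 0.12, "leadership": 0.12, "IT support": 0.11,
           "training": 0.10, "external environment": 0.10}

ratings = {f: 0.5 for f in weights}          # mid-scale baseline
ratings["organizational motivation"] = 0.9   # one strong factor
print(rim_score(weights, ratings))
```

A cut-off on this score is what would separate organizations predicted to succeed from those predicted to struggle.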
Designing a Strategic Plan through an Emerging Knowledge Generation Process: The ATM Experience
ERIC Educational Resources Information Center
Zanotti, Francesco
2012-01-01
Purpose: The aim of this contribution is to describe a new methodology for designing strategic plans and how it was implemented by ATM, a public transportation agency based in Milan, Italy. Design/methodology/approach: This methodology is founded on a new system theory, called "quantum systemics". It is based on models and metaphors both…
Designing a Blended Course: Using ADDIE to Guide Instructional Design
ERIC Educational Resources Information Center
Shibley, Ike; Amaral, Katie E.; Shank, John D.; Shibley, Lisa R.
2011-01-01
The ADDIE (analysis, design, development, implementation, and evaluation) model was applied to help redesign a General Chemistry course to improve student success in the course. A team of six professionals spent 18 months and over 1,000 man-hours in the redesign. The resultant course is a blend of online and face-to-face instruction that utilizes…
Instructional Design to Measure the Efficacy of Interactive E-Books in a High School Setting
ERIC Educational Resources Information Center
Pabrua Batoon, Maria Victoria; Glasserman Morales, Leonardo David; Yanez Figueroa, Jose Antonio
2018-01-01
This article describes a qualitative research analysis on the implementation of interactive ebooks in high school courses using a case study approach. The subjects of the study included seven professors and 16 freshmen who were surveyed and interviewed with a questionnaire designed according to the Kemp Model of Instructional Design. The study…
Using ADDIE and Systems Thinking as the Framework for Developing a MOOC: A Case Study
ERIC Educational Resources Information Center
Croxton, Rebecca A.; Chow, Anthony S.
2015-01-01
This article presents a case study of how systems thinking and the instructional systems design ADDIE (analysis, design, development, implementation, and assessment) model were used to design and develop one of the first MOOCs at a mid-sized university in the southeastern United States. Contemporary issues surrounding MOOCs at both the macro…
ERIC Educational Resources Information Center
Becuwe, Heleen; Roblin, Natalie Pareja; Tondeur, Jo; Thys, Jeroen; Castelein, Els; Voogt, Joke
2017-01-01
Teacher educators often struggle to model effective integration of technology. Several studies suggest that the involvement of teacher educators in collaborative design is effective in developing the competences necessary for integrating information and communication technology (ICT) in teaching. In a teacher educator design team (TeDT), two or…
A Learning Design Ontology Based on the IMS Specification
ERIC Educational Resources Information Center
Amorim, Ricardo R.; Lama, Manuel; Sanchez, Eduardo; Riera, Adolfo; Vila, Xose A.
2006-01-01
In this paper, we present an ontology to represent the semantics of the IMS Learning Design (IMS LD) specification, a meta-language used to describe the main elements of the learning design process. The motivation of this work relies on the expressiveness limitations found on the current XML-Schema implementation of the IMS LD conceptual model. To…
Applications of large-eddy simulation: Synthesis of neutral boundary layer models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ohmstede, W.D.
The object of this report is to describe progress made towards the application of large-eddy simulation (LES), in particular, to the study of the neutral boundary layer (NBL). The broad purpose of the study is to provide support to the LES project currently underway at LLNL. The specific purpose of this study is to lay the groundwork for the simulation of the SBL through the establishment and implementation of model criteria for the simulation of the NBL. The idealistic NBL is never observed in the atmosphere and therefore has little practical significance. However, it is of considerable theoretical interest for several reasons. The report discusses the concept of Rossby-number similarity theory as it applies to the NBL. A particular implementation of the concept is described. Then, the results from prior simulations of the NBL are summarized. Model design criteria for two versions of the Brost LES (BLES) model are discussed. The general guidelines for the development of Version 1 of the Brost model (BV1) were to implement the model with a minimum of modifications which would alter the design criteria as established by Brost. Two major modifications of BLES incorporated into BV1 pertain to the initialization/parameterization of the model and the generalization of the boundary conditions at the air/earth interface. 18 refs., 4 figs.
Low-cost replicable plastic HUD combiner element
NASA Astrophysics Data System (ADS)
Kress, Bernard; Raulot, Victorien; St. Hilaire, Pierre; Meyrueis, Patrick
2009-05-01
We present a novel technique to fabricate low-cost, mass-replicable plastic HUDs for the transportation industry. HUDs are implemented in numerous sectors today (avionics, automobile, military, machinery, ...). Typical implementations include an optical combiner which produces the desired virtual image while leaving the field of view mostly unaffected by the optics. Such combiner optics are usually implemented as cumbersome catadioptric devices in automobiles, and as dichroic-coated curved plates or expensive volume holograms in commercial and military aviation. We propose a novel way to design, model and fabricate combiner masters which can be replicated in mass by UV casting in plastic. We review the various design techniques required for such elements and the novel mastering technology.
The Foundations Framework for Developing and Reporting New Models of Care for Multimorbidity
Stokes, Jonathan; Man, Mei-See; Guthrie, Bruce; Mercer, Stewart W.; Salisbury, Chris; Bower, Peter
2017-01-01
PURPOSE Multimorbidity challenges health systems globally. New models of care are urgently needed to better manage patients with multimorbidity; however, there is no agreed framework for designing and reporting models of care for multimorbidity and their evaluation. METHODS Based on findings from a literature search to identify models of care for multimorbidity, we developed a framework to describe these models. We illustrate the application of the framework by identifying the focus and gaps in current models of care, and by describing the evolution of models over time. RESULTS Our framework describes each model in terms of its theoretical basis and target population (the foundations of the model) and of the elements of care implemented to deliver the model. We categorized elements of care into 3 types: (1) clinical focus, (2) organization of care, and (3) support for model delivery. Application of the framework identified a limited use of theory in model design and a strong focus on some patient groups (elderly, high users) more than others (younger patients, deprived populations). We found changes in elements with time, with a decrease in models implementing home care and an increase in models offering extended appointments. CONCLUSIONS By encouraging greater clarity about the underpinning theory and target population, and by categorizing the wide range of potentially important elements of an intervention to improve care for patients with multimorbidity, the framework may be useful in designing and reporting models of care and help advance the currently limited evidence base. PMID:29133498
The Impact of Model Uncertainty on Spatial Compensation in Structural Acoustic Control
NASA Technical Reports Server (NTRS)
Clark, Robert L.
2005-01-01
Turbulent boundary layer (TBL) noise is considered a primary contribution to the interior noise present in commercial airliners. There are numerous investigations of interior noise control devoted to aircraft panels; however, practical realization is a potential challenge since physical boundary conditions are uncertain at best. In most prior studies, pinned or clamped boundary conditions were assumed; however, realistic panels likely display a range of boundary conditions between these two limits. Uncertainty in boundary conditions is a challenge for control system designers, both in terms of the compensator implemented and the location of transducers required to achieve the desired control. The impact of model uncertainties, specifically uncertain boundaries, on the selection of transducer locations for structural acoustic control is considered herein. The final goal of this work is the design of an aircraft panel structure that can reduce TBL noise transmission through the use of a completely adaptive, single-input, single-output control system. The feasibility of this goal is demonstrated through the creation of a detailed analytical solution, followed by the implementation of a test model in a transmission loss apparatus. Successfully realizing a control system robust to variations in boundary conditions can lead to the design and implementation of practical adaptive structures that could be used to control the transmission of sound to the interior of aircraft. Results from this research effort indicate it is possible to optimize the design of actuator and sensor location and aperture, minimizing the impact of boundary conditions on the desired structural acoustic control.
Constraint-based component-modeling for knowledge-based design
NASA Technical Reports Server (NTRS)
Kolb, Mark A.
1992-01-01
The paper describes the application of various advanced programming techniques derived from artificial intelligence research to the development of flexible design tools for conceptual design. Special attention is given to two techniques which appear to be readily applicable to such design tools: constraint propagation and object-oriented programming. The implementation of these techniques in a prototype computer tool, Rubber Airplane, is described.
Design, Development, and Automated Verification of an Integrity-Protected Hypervisor
2012-07-16
Hypervisors are a popular mechanism for implementing software virtualization. Since hypervisors execute at a very high privilege level, they must be secure. The integrity properties of the XMHF hypervisor were automatically verified using the CBMC model checker: CBMC verified XMHF's implementation, about 4700 lines of C code, in about 80 seconds using less than 2GB of RAM.
Mena, Carlos F; Walsh, Stephen J; Frizzelle, Brian G; Xiaozheng, Yao; Malanson, George P
2011-01-01
This paper describes the design and implementation of an Agent-Based Model (ABM) used to simulate land use change on household farms in the Northern Ecuadorian Amazon (NEA). The ABM simulates decision-making processes at the household level that are examined through a longitudinal, socio-economic and demographic survey conducted in 1990 and 1999. Geographic Information Systems (GIS) are used to establish spatial relationships between farms and their environment, while classified Landsat Thematic Mapper (TM) imagery is used to set initial land use/land cover conditions for the spatial simulation, assess from-to land use/land cover change patterns, and describe trajectories of land use change at the farm and landscape levels. Results from prior studies in the NEA provide insights into the key social and ecological variables, describe human behavioral functions, and examine population-environment interactions that are linked to deforestation and agricultural extensification, population migration, and demographic change. Within the architecture of the model, agents are classified as active or passive. The model comprises four modules (initialization, demography, agriculture, and migration) that operate individually but are linked through key household processes. The main outputs of the model include a spatially-explicit representation of the land use/land cover on survey and non-survey farms and at the landscape level for each annual time-step, as well as simulated socio-economic and demographic characteristics of households and communities. The work describes the design and implementation of the model and how population-environment interactions can be addressed in a frontier setting. The paper contributes to land change science by examining important pattern-process relations, advocating a spatial modeling approach that is capable of synthesizing fundamental relationships at the farm level, and linking people and environment in complex ways.
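The module structure described above can be sketched as a toy annual-time-step ABM. All agents, rates, and thresholds below are illustrative assumptions, not the NEA model's calibrated values.

```python
import random

random.seed(1)

class Household:
    """Toy agent: each year it may grow, clear land for crops, or send a migrant."""
    def __init__(self, members, cleared_ha):
        self.members = members
        self.cleared_ha = cleared_ha

    def demography(self):
        # simple birth process: larger households more likely to add a member
        if random.random() < 0.1 * self.members:
            self.members += 1

    def agriculture(self):
        # labor-constrained extensification: more members -> more land cleared
        self.cleared_ha += 0.2 * self.members

    def migration(self):
        # large households may send one member off-farm
        if self.members > 6 and random.random() < 0.5:
            self.members -= 1

def simulate(households, years):
    for _ in range(years):              # annual time-step
        for h in households:
            h.demography()
            h.agriculture()
            h.migration()
    return sum(h.cleared_ha for h in households)

farms = [Household(members=4, cleared_ha=1.0) for _ in range(10)]
total = simulate(farms, 10)
print(f"total cleared after 10 years: {total:.1f} ha")
```

The point of the sketch is the coupling: demography feeds agriculture through household labor, and migration feeds back on demography, so landscape-level clearing emerges from farm-level decisions rather than from an aggregate projection.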
Output-Feedback Model Predictive Control of a Pasteurization Pilot Plant based on an LPV model
NASA Astrophysics Data System (ADS)
Karimi Pour, Fatemeh; Ocampo-Martinez, Carlos; Puig, Vicenç
2017-01-01
This paper presents a model predictive control (MPC) of a pasteurization pilot plant based on an LPV model. Since not all the states are measured, an observer is also designed, which allows implementing an output-feedback MPC scheme. However, the model of the plant is not completely observable when augmented with the disturbance models. In order to solve this problem, the following strategies are used: (i) the whole system is decoupled into two subsystems, and (ii) an inner state-feedback controller is embedded within the MPC scheme. A real-time example based on the pasteurization pilot plant is simulated as a case study for testing the behavior of the approaches.
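A minimal sketch of the output-feedback idea, assuming a frozen (single-vertex) linear model in place of the full LPV description and illustrative system matrices rather than the pasteurization plant's: a Luenberger observer reconstructs the unmeasured state, and an unconstrained finite-horizon MPC applies the first move of the optimal input sequence at each step.

```python
import numpy as np

# Illustrative discrete-time plant (not the pasteurization plant's model):
A = np.array([[0.9, 0.1], [0.0, 0.8]])
B = np.array([[0.0], [0.1]])
C = np.array([[1.0, 0.0]])           # only the first state is measured
L = np.array([[0.5], [0.3]])         # hand-tuned Luenberger observer gain

N, Q, R = 10, np.eye(2), 0.1 * np.eye(1)

# Stack the horizon predictions X = F x0 + G U
F = np.vstack([np.linalg.matrix_power(A, i + 1) for i in range(N)])
G = np.zeros((2 * N, N))
for i in range(N):
    for j in range(i + 1):
        G[2*i:2*i+2, j:j+1] = np.linalg.matrix_power(A, i - j) @ B
Qb = np.kron(np.eye(N), Q)
Rb = np.kron(np.eye(N), R)

def mpc_input(xhat):
    """Unconstrained finite-horizon LQ solution; return the first move only."""
    U = -np.linalg.solve(G.T @ Qb @ G + Rb, G.T @ Qb @ F @ xhat)
    return U[:1]

x = np.array([[1.0], [0.5]])         # true (partly unmeasured) state
xhat = np.zeros((2, 1))              # observer estimate
for _ in range(30):
    u = mpc_input(xhat)              # controller sees only the estimate
    y = C @ x                        # measured output
    xhat = A @ xhat + B @ u + L @ (y - C @ xhat)   # observer update
    x = A @ x + B @ u                               # plant update

print(np.linalg.norm(x))             # state driven toward the origin
```

With input or state constraints, the same receding-horizon problem becomes a quadratic program rather than a linear solve, but the observer-in-the-loop structure is unchanged.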
High dimensional biological data retrieval optimization with NoSQL technology.
Wang, Shicai; Pandis, Ioannis; Wu, Chao; He, Sijin; Johnson, David; Emam, Ibrahim; Guitton, Florian; Guo, Yike
2014-01-01
High-throughput transcriptomic data generated by microarray experiments is the most abundant and frequently stored kind of data currently used in translational medicine studies. Although microarray data is supported in data warehouses such as tranSMART, queries against relational databases for hundreds of different patient gene expression records perform poorly. Non-relational data models, such as the key-value model implemented in NoSQL databases, hold promise to be more performant solutions. Our motivation is to improve the performance of the tranSMART data warehouse with a view to supporting Next Generation Sequencing data. In this paper we introduce a new data model better suited for high-dimensional data storage and querying, optimized for database scalability and performance. We have designed a key-value pair data model to support faster queries over large-scale microarray data and implemented the model using HBase, an implementation of Google's BigTable storage system. An experimental performance comparison was carried out against the traditional relational data model implemented in both MySQL Cluster and MongoDB, using a large publicly available transcriptomic data set taken from NCBI GEO concerning Multiple Myeloma. Our new key-value data model implemented on HBase exhibits an average 5.24-fold increase in high-dimensional biological data query performance compared to the relational model implemented on MySQL Cluster, and an average 6.47-fold increase on query performance on MongoDB. The performance evaluation found that the new key-value data model, in particular its implementation in HBase, outperforms the relational model currently implemented in tranSMART. We propose that NoSQL technology holds great promise for large-scale data management, in particular for high-dimensional biological data such as that demonstrated in the performance evaluation described in this paper.
We aim to use this new data model as a basis for migrating tranSMART's implementation to a more scalable solution for Big Data.
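The key-value idea can be sketched with a plain dictionary standing in for HBase; the composite row-key layout and all study, gene, and patient identifiers below are illustrative assumptions, not the schema or data actually deployed.

```python
# Illustrative key-value layout for expression data: a composite row key
# "study|gene|patient" lets a single key-prefix scan pull every patient's
# value for one gene, avoiding the relational join over hundreds of
# patient records that makes the equivalent SQL query slow.

store = {}

def put(study, gene, patient, value):
    store[f"{study}|{gene}|{patient}"] = value

def scan_gene(study, gene):
    """Prefix scan: all patients' expression values for one gene."""
    prefix = f"{study}|{gene}|"
    return {k.split("|")[2]: v for k, v in store.items() if k.startswith(prefix)}

put("studyA", "TP53", "patient001", 7.82)
put("studyA", "TP53", "patient002", 6.91)
put("studyA", "BRAF", "patient001", 5.10)

print(scan_gene("studyA", "TP53"))   # both TP53 records, one prefix scan
```

In HBase the same layout works because rows are stored sorted by key, so a prefix scan is a contiguous disk read; the choice of which identifiers lead the key determines which access pattern is fast.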
Théodore, Florence L; Moreno-Saracho, Jessica E; Bonvecchio, Anabelle; Morales-Ruán, María Del Carmen; Tolentino-Mayo, Lizbeth; López-Olmedo, Nancy; Shamah-Levy, Teresa; Rivera, Juan A
2018-01-01
Obesity is a serious problem among children in Mexico. In 2010, the government implemented a national food and physical activity policy in elementary schools to prevent obesity. The goal of this study is to assess the implementation of this policy using a logic model, drawing on a descriptive survey with national representativeness at the elementary school level based on a stratified cluster design. We used a systematic random sampling of schools (n = 122), stratified into public and private. We administered questionnaires to 116 principals, 165 members of the Food and Physical Activity Committees, 132 school food vendors, 119 teachers, and 348 parents. This study evidences a significant deviation in implementation from what had been planned. The lessons learned highlight the importance of basing the design and implementation of the policy on a theoretical framework, making programs appealing to stakeholders, selecting concrete and measurable objectives and goals, and supporting stakeholders during the implementation process.
An Allocation Model for Teaching and Nonteaching Staff in a Decentralized Institution.
ERIC Educational Resources Information Center
Dijkman, Frank G
1985-01-01
An allocation model for teaching and nonteaching staff developed at the University of Utrecht is characterized as highly normative, leading to lump sums to be allocated to academic departments. Details are given regarding the reasons for designing the new model and the process of implementation. (Author/MLW)
DOT National Transportation Integrated Search
2009-10-01
Travel demand modeling, in recent years, has seen a paradigm shift with an emphasis on analyzing travel at the individual level rather than using direct statistical projections of aggregate travel demand as in the trip-based approach. Specificall...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-23
... requirements for SIPs, including emissions inventories, monitoring, and modeling, to assure attainment and... include requirements, such as modeling, monitoring, and emissions inventories, which are designed to... significant deterioration (PSD) and visibility protection. 110(a)(2)(K): Air quality modeling/data. 110(a)(2...
Evidencing Learning Outcomes: A Multi-Level, Multi-Dimensional Course Alignment Model
ERIC Educational Resources Information Center
Sridharan, Bhavani; Leitch, Shona; Watty, Kim
2015-01-01
This conceptual framework proposes a multi-level, multi-dimensional course alignment model to implement a contextualised constructive alignment of rubric design that authentically evidences and assesses learning outcomes. By embedding quality control mechanisms at each level for each dimension, this model facilitates the development of an aligned…
School Nurse Summer Institute: A Model for Professional Development
ERIC Educational Resources Information Center
Neighbors, Marianne; Barta, Kathleen
2004-01-01
The components of a professional development model designed to empower school nurses to become leaders in school health services is described. The model was implemented during a 3-day professional development institute that included clinical and leadership components, especially coalition building, with two follow-up sessions in the fall and…