Tailoring Systems Engineering Projects for Small Satellite Missions
NASA Technical Reports Server (NTRS)
Horan, Stephen; Belvin, Keith
2013-01-01
NASA maintains excellence in its spaceflight systems by utilizing rigorous engineering processes based on over 50 years of experience. The NASA systems engineering process for flight projects described in NPR 7120.5E was initially developed for major flight projects. The design and development of low-cost small satellite systems does not entail the financial and risk consequences traditionally associated with spaceflight projects. Consequently, an approach is offered for tailoring the processes so that small satellite missions benefit from the engineering rigor without overly burdensome overhead. In this paper we outline the approaches to tailoring the standard processes for these small missions and describe how they will be applied in a proposed small satellite mission.
Engineering education as a complex system
NASA Astrophysics Data System (ADS)
Gattie, David K.; Kellam, Nadia N.; Schramski, John R.; Walther, Joachim
2011-12-01
This paper presents a theoretical basis for cultivating engineering education as a complex system that will prepare students to think critically and make decisions with regard to poorly understood, ill-structured issues. Integral to this theoretical basis is a solution space construct developed and presented as a benchmark for evaluating problem-solving orientations that emerge within students' thinking as they progress through an engineering curriculum. It is proposed that the traditional engineering education model, while analytically rigorous, is characterised by properties that, although necessary, are insufficient for preparing students to address complex issues of the twenty-first century. A Synthesis and Design Studio model for engineering education is proposed, which maintains the necessary rigor of analysis within a uniquely complex yet sufficiently structured learning environment.
Advances in knowledge-based software engineering
NASA Technical Reports Server (NTRS)
Truszkowski, Walt
1991-01-01
The underlying hypothesis of this work is that a rigorous and comprehensive software reuse methodology can bring about a more effective and efficient utilization of constrained resources in the development of large-scale software systems by both government and industry. It is also believed that correct use of this type of software engineering methodology can significantly contribute to the higher levels of reliability that will be required of future operational systems. An overview and discussion of current research in the development and application of two systems that support a rigorous reuse paradigm are presented: the Knowledge-Based Software Engineering Environment (KBSEE) and the Knowledge Acquisition for the Preservation of Tradeoffs and Underlying Rationales (KAPTUR) systems. Emphasis is on a presentation of operational scenarios which highlight the major functional capabilities of the two systems.
Freshman Engineering Retention: A Holistic Look
ERIC Educational Resources Information Center
Honken, Nora; Ralston, Patricia A. S.
2013-01-01
The ability to increase the number of engineering graduates depends on many factors including our country's P-16+ educational system, the job market and the engineering professions. Students need to be prepared for the rigorous math and science components of engineering programs, but they also must have interest in engineering as a profession,…
A Novel Approach to Physiology Education for Biomedical Engineering Students
ERIC Educational Resources Information Center
DiCecco, J.; Wu, J.; Kuwasawa, K.; Sun, Y.
2007-01-01
It is challenging for biomedical engineering programs to incorporate an in-depth study of the systemic interdependence of cells, tissues, and organs into the rigorous mathematical curriculum that is the cornerstone of engineering education. To be sure, many biomedical engineering programs require their students to enroll in anatomy and physiology…
Control Engineering, System Theory and Mathematics: The Teacher's Challenge
ERIC Educational Resources Information Center
Zenger, K.
2007-01-01
The principles, difficulties and challenges in control education are discussed and compared to the similar problems in the teaching of mathematics and systems science in general. The difficulties of today's students to appreciate the classical teaching of engineering disciplines, which are based on rigorous and scientifically sound grounds, are…
Pedagogical Training and Research in Engineering Education
ERIC Educational Resources Information Center
Wankat, Phillip C.
2008-01-01
Ferment in engineering has focused increased attention on undergraduate engineering education, and has clarified the need for rigorous research in engineering education. This need has spawned the new research field of Engineering Education and greatly increased interest in earning Ph.D. degrees based on rigorous engineering education research.…
High-Temperature Alloys for Automotive Stirling Engines
NASA Technical Reports Server (NTRS)
Stephens, J. R.; Titran, R. H.
1986-01-01
The Stirling engine is an external-combustion engine that offers fuel economy, low emissions, low noise, and low vibration. One of the most critical areas in engine development concerns material selection for component parts. Alloys CG-27 and XF-818 were identified as capable of withstanding the rigorous requirements of the automotive Stirling engine. The alloys were chosen for availability, performance, and manufacturability. Advanced iron-base alloys have potential for a variety of applications, including stationary solar-power systems.
Principles to Products: Toward Realizing MOS 2.0
NASA Technical Reports Server (NTRS)
Bindschadler, Duane L.; Delp, Christopher L.
2012-01-01
This is a report on the Operations Revitalization Initiative, part of the ongoing NASA-funded Advanced Multi-Mission Operations Systems (AMMOS) program. We are implementing products that significantly improve efficiency and effectiveness of Mission Operations Systems (MOS) for deep-space missions. We take a multi-mission approach, in keeping with our organization's charter to "provide multi-mission tools and services that enable mission customers to operate at a lower total cost to NASA." Focusing first on architectural fundamentals of the MOS, we review the effort's progress. In particular, we note the use of stakeholder interactions and consideration of past lessons learned to motivate a set of Principles that guide the evolution of the AMMOS. Thus guided, we have created essential patterns and connections (detailed in companion papers) that are explicitly modeled and support elaboration at multiple levels of detail (system, sub-system, element...) throughout a MOS. This architecture is realized in design and implementation products that provide lifecycle support to a Mission at the system and subsystem level. The products include adaptable multi-mission engineering documentation that describes essentials such as operational concepts and scenarios, requirements, interfaces and agreements, information models, and mission operations processes. Because we have adopted a model-based system engineering method, these documents and their contents are meaningfully related to one another and to the system model. This means they are both more rigorous and reusable (from mission to mission) than standard system engineering products. The use of models also enables detailed, early (e.g., formulation phase) insight into the impact of changes (e.g., to interfaces or to software) that is rigorous and complete, allowing better decisions on cost or technical trades. Finally, our work provides clear and rigorous specification of operations needs to software developers, further enabling significant gains in productivity.
Use of software engineering techniques in the design of the ALEPH data acquisition system
NASA Astrophysics Data System (ADS)
Charity, T.; McClatchey, R.; Harvey, J.
1987-08-01
The SASD methodology is being used to provide a rigorous design framework for various components of the ALEPH data acquisition system. The Entity-Relationship data model is used to describe the layout and configuration of the control and acquisition systems and detector components. State Transition Diagrams are used to specify control applications such as run control and resource management and Data Flow Diagrams assist in decomposing software tasks and defining interfaces between processes. These techniques encourage rigorous software design leading to enhanced functionality and reliability. Improved documentation and communication ensures continuity over the system life-cycle and simplifies project management.
The Rigors of Aligning Performance
2015-06-01
organization merged its field activities into regional facilities engineering commands (FECs). Today, FECs provide one-stop shopping for NAVFAC clients... many are old and antiquated; sometimes the systems mesh together, other times they do not. Training is lacking on the various systems. Communication
Recent Developments: PKI Square Dish for the Soleras Project
NASA Technical Reports Server (NTRS)
Rogers, W. E.
1984-01-01
The Square Dish solar collectors are subjected to rigorous design attention regarding corrosion at the site and certification of the collector structure. The microprocessor controls and tracking mechanisms are improved in the areas of fail-safe operation, durability, and low parasitic power requirements. Prototype testing demonstrates performance efficiency of approximately 72% at 730 F outlet temperature. Studies are conducted that include developing formal engineering design studies, developing formal engineering design drawings and fabrication details, establishing subcontracts for fabrication of major components, and developing a rigorous quality control system. The improved design is more cost-effective to produce, and the extensive manuals developed for assembly and operation/maintenance result in faster field assembly and ease of operation.
Recent developments: PKI square dish for the Soleras Project
NASA Astrophysics Data System (ADS)
Rogers, W. E.
1984-03-01
The Square Dish solar collectors are subjected to rigorous design attention regarding corrosion at the site and certification of the collector structure. The microprocessor controls and tracking mechanisms are improved in the areas of fail-safe operation, durability, and low parasitic power requirements. Prototype testing demonstrates performance efficiency of approximately 72% at 730 F outlet temperature. Studies are conducted that include developing formal engineering design studies, developing formal engineering design drawings and fabrication details, establishing subcontracts for fabrication of major components, and developing a rigorous quality control system. The improved design is more cost-effective to produce, and the extensive manuals developed for assembly and operation/maintenance result in faster field assembly and ease of operation.
Developments in REDES: The rocket engine design expert system
NASA Technical Reports Server (NTRS)
Davidian, Kenneth O.
1990-01-01
The Rocket Engine Design Expert System (REDES) is being developed at NASA-Lewis to collect, automate, and perpetuate the existing expertise of performing a comprehensive rocket engine analysis and design. Currently, REDES uses the rigorous JANNAF methodology to analyze the performance of the thrust chamber and perform computational studies of liquid rocket engine problems. The following computer codes were included in REDES: a gas properties program named GASP, a nozzle design program named RAO, a regenerative cooling channel performance evaluation code named RTE, and the JANNAF standard liquid rocket engine performance prediction code TDK (including performance evaluation modules ODE, ODK, TDE, TDK, and BLM). Computational analyses are being conducted by REDES to provide solutions to liquid rocket engine thrust chamber problems. REDES is built in the Knowledge Engineering Environment (KEE) expert system shell and runs on a Sun 4/110 computer.
Developments in REDES: The Rocket Engine Design Expert System
NASA Technical Reports Server (NTRS)
Davidian, Kenneth O.
1990-01-01
The Rocket Engine Design Expert System (REDES) was developed at NASA-Lewis to collect, automate, and perpetuate the existing expertise of performing a comprehensive rocket engine analysis and design. Currently, REDES uses the rigorous JANNAF methodology to analyze the performance of the thrust chamber and perform computational studies of liquid rocket engine problems. The following computer codes were included in REDES: a gas properties program named GASP; a nozzle design program named RAO; a regenerative cooling channel performance evaluation code named RTE; and the JANNAF standard liquid rocket engine performance prediction code TDK (including performance evaluation modules ODE, ODK, TDE, TDK, and BLM). Computational analyses are being conducted by REDES to provide solutions to liquid rocket engine thrust chamber problems. REDES was built in the Knowledge Engineering Environment (KEE) expert system shell and runs on a Sun 4/110 computer.
13th Annual Systems Engineering Conference: Tues- Wed
2010-10-28
greater understanding/documentation of lessons learned – Promotes SE within the organization • Justification for continued funding of SE Infrastructure...educational process – Addresses the development of innovative learning tools, strategies, and teacher training • Research and Development – Promotes ...technology, and mathematics • More commitment to engaging young students in science, engineering, technology and mathematics • More rigor in defining
Efficiency versus speed in quantum heat engines: Rigorous constraint from Lieb-Robinson bound
NASA Astrophysics Data System (ADS)
Shiraishi, Naoto; Tajima, Hiroyasu
2017-08-01
A long-standing open problem, whether a heat engine with finite power can achieve the Carnot efficiency, is investigated. We rigorously prove a general trade-off inequality between thermodynamic efficiency and the time interval of a cyclic process for quantum heat engines. As a first step, employing the Lieb-Robinson bound, we establish an inequality on the change in a local observable caused by an operation far from the support of the local observable. This inequality provides a rigorous characterization of the following intuitive picture: most of the energy emitted from the engine to the cold bath remains near the engine when the cyclic process is finished. Using this description, we prove an upper bound on efficiency with the aid of quantum information geometry. Our result generally excludes the possibility of a process with finite speed at the Carnot efficiency in quantum heat engines. In particular, the obtained constraint covers engines evolving under non-Markovian dynamics, which almost all previous studies on this topic fail to address.
Efficiency versus speed in quantum heat engines: Rigorous constraint from Lieb-Robinson bound.
Shiraishi, Naoto; Tajima, Hiroyasu
2017-08-01
A long-standing open problem, whether a heat engine with finite power can achieve the Carnot efficiency, is investigated. We rigorously prove a general trade-off inequality between thermodynamic efficiency and the time interval of a cyclic process for quantum heat engines. As a first step, employing the Lieb-Robinson bound, we establish an inequality on the change in a local observable caused by an operation far from the support of the local observable. This inequality provides a rigorous characterization of the following intuitive picture: most of the energy emitted from the engine to the cold bath remains near the engine when the cyclic process is finished. Using this description, we prove an upper bound on efficiency with the aid of quantum information geometry. Our result generally excludes the possibility of a process with finite speed at the Carnot efficiency in quantum heat engines. In particular, the obtained constraint covers engines evolving under non-Markovian dynamics, which almost all previous studies on this topic fail to address.
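For orientation only, the following is a schematic of the kind of bound involved; the symbols and the exact functional form are chosen here for exposition and are not the inequality proved in the paper. For a cyclic engine of cycle duration \tau operating between hot and cold baths at temperatures T_h > T_c,

    \eta_C = 1 - \frac{T_c}{T_h}, \qquad \eta \;\le\; \eta_C - \frac{C}{\tau},

so a finite cycle time \tau (and hence finite power) keeps the efficiency \eta strictly below the Carnot value \eta_C by a system-dependent margin C/\tau.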
Bayesian Nonlinear Assimilation of Eulerian and Lagrangian Coastal Flow Data
2015-09-30
Lagrangian Coastal Flow Data. Dr. Pierre F.J. Lermusiaux, Department of Mechanical Engineering, Center for Ocean Science and Engineering, Massachusetts... Develop and apply theory, schemes and computational systems for rigorous Bayesian nonlinear assimilation of Eulerian and Lagrangian coastal flow data... coastal ocean fields, both in Eulerian and Lagrangian forms. - Further develop and implement our GMM-DO schemes for robust Bayesian nonlinear estimation
Caya, Teresa; Musuuza, Jackson; Yanke, Eric; Schmitz, Michelle; Anderson, Brooke; Carayon, Pascale; Safdar, Nasia
2015-01-01
We undertook a systems engineering approach to evaluate housewide implementation of daily chlorhexidine bathing. We performed direct observations of the bathing process and conducted provider and patient surveys. The main outcome was compliance with bathing using a checklist. Fifty-seven percent of baths had full compliance with the chlorhexidine bathing protocol. Additional time was the main barrier. Institutions undertaking daily chlorhexidine bathing should perform a rigorous assessment of implementation to optimize the benefits of this intervention.
Sensemaking in a Value Based Context for Large Scale Complex Engineered Systems
NASA Astrophysics Data System (ADS)
Sikkandar Basha, Nazareen
The design and development of Large-Scale Complex Engineered Systems (LSCES) requires the involvement of multiple teams, numerous levels of the organization, and interactions with large numbers of people and interdisciplinary departments. Traditionally, requirements-driven Systems Engineering (SE) is used in the design and development of these LSCES. The requirements are used to capture the preferences of the stakeholder for the LSCES. Due to the complexity of the system, multiple levels of interaction are required to elicit the requirements of the system within the organization. Since LSCES involve people and interactions between teams and interdisciplinary departments, they should be treated as socio-technical in nature. The elicitation of requirements for most large-scale system projects is subject to creep in time and cost due to the uncertainty and ambiguity of requirements during design and development. In an organizational structure, cost and time overruns can occur at any level and iterate back and forth, thus increasing cost and time. To avoid such creep, past research has shown that rigorous approaches such as value-based design can be used to control it. But before rigorous approaches can be used, the decision maker should have a proper understanding of requirements creep and of the state of the system when the creep occurs. Sensemaking is used to understand the state of the system when the creep occurs and to provide guidance to the decision maker. This research proposes the use of the Cynefin framework, a sensemaking framework, in the design and development of LSCES. It can aid in understanding the system and in decision making to minimize the value gap due to requirements creep by eliminating the ambiguity that arises during design and development. A sample hierarchical organization is used to demonstrate the state of the system at the occurrence of requirements creep, in terms of cost and time, using the Cynefin framework. These trials are continued for different requirements and at different sub-system levels. The results obtained show that the Cynefin framework can be used to improve the value of the system and can be used for predictive analysis. Decision makers can use these findings and apply rigorous approaches to improve the design of Large-Scale Complex Engineered Systems.
Lenas, Petros; Moos, Malcolm; Luyten, Frank P
2009-12-01
The field of tissue engineering is moving toward a new concept of "in vitro biomimetics of in vivo tissue development." In Part I of this series, we proposed a theoretical framework integrating the concepts of developmental biology with those of process design to provide the rules for the design of biomimetic processes. We named this methodology "developmental engineering" to emphasize that it is not the tissue but the process of in vitro tissue development that has to be engineered. To formulate the process design rules in a rigorous way that will allow a computational design, we should refer to mathematical methods to model the biological process taking place in vitro. Tissue functions cannot be attributed to individual molecules but rather to complex interactions between the numerous components of a cell and interactions between cells in a tissue that form a network. For tissue engineering to advance to the level of a technologically driven discipline amenable to well-established principles of process engineering, a scientifically rigorous formulation is needed of the general design rules so that the behavior of networks of genes, proteins, or cells that govern the unfolding of developmental processes could be related to the design parameters. Now that sufficient experimental data exist to construct plausible mathematical models of many biological control circuits, explicit hypotheses can be evaluated using computational approaches to facilitate process design. Recent progress in systems biology has shown that the empirical concepts of developmental biology that we used in Part I to extract the rules of biomimetic process design can be expressed in rigorous mathematical terms. This allows the accurate characterization of manufacturing processes in tissue engineering as well as the properties of the artificial tissues themselves. In addition, network science has recently shown that the behavior of biological networks strongly depends on their topology and has developed the necessary concepts and methods to describe it, allowing therefore a deeper understanding of the behavior of networks during biomimetic processes. These advances thus open the door to a transition for tissue engineering from a substantially empirical endeavor to a technology-based discipline comparable to other branches of engineering.
Practical Application of Model-based Programming and State-based Architecture to Space Missions
NASA Technical Reports Server (NTRS)
Horvath, Gregory A.; Ingham, Michel D.; Chung, Seung; Martin, Oliver; Williams, Brian
2006-01-01
Innovative systems and software engineering solutions are required to meet the increasingly challenging demands of deep-space robotic missions. While recent advances in the development of an integrated systems and software engineering approach have begun to address some of these issues, they are still at the core highly manual and, therefore, error-prone. This paper describes a task aimed at infusing MIT's model-based executive, Titan, into JPL's Mission Data System (MDS), a unified state-based architecture, systems engineering process, and supporting software framework. Results of the task are presented, including a discussion of the benefits and challenges associated with integrating mature model-based programming techniques and technologies into a rigorously-defined domain specific architecture.
Visit from JAXA to NASA MSFC: The Engines Element & Ideas for Collaboration
NASA Technical Reports Server (NTRS)
Greene, William D.
2013-01-01
System Design, Development, and Fabrication: Design, develop, and fabricate or procure MB-60 component hardware compliant with the imposed technical requirements and in sufficient quantities to fulfill the overall MB-60 development effort. System Development, Assembly, and Test: Manage the scope of the development, assembly, and test-related activities for MB-60 development. This scope includes engine-level development planning, engine assembly and disassembly, test planning, engine testing, inspection, anomaly resolution, and development of necessary ground support equipment and special test equipment. System Integration: Provide coordinated integration of the engineering, safety, quality, and manufacturing disciplines across the scope of the MB-60 design and associated products development: Safety and Mission Assurance, structural design, fracture control, materials and processes, and thermal analysis. Systems Engineering and Analysis: Manage and perform Systems Engineering and Analysis to provide rigor and structure to the overall design and development effort for the MB-60, including milestone reviews, requirements management, system analysis, and program management support. Program Management: Manage, plan, and coordinate the activities across all portions of the MB-60 work scope by providing direction for program administration, business management, and supplier management.
2014-10-27
a phase-averaged spectral wind-wave generation and transformation model and its interface in the Surface-water Modeling System (SMS). Ambrose... applications of the Boussinesq (BOUSS-2D) wave model that provides more rigorous calculations for design and performance optimization of integrated... navigation systems. Together these wave models provide reliable predictions on regional and local spatial domains and cost-effective engineering solutions
Integrated model development for liquid fueled rocket propulsion systems
NASA Technical Reports Server (NTRS)
Santi, L. Michael
1993-01-01
As detailed in the original statement of work, the objective of phase two of this research effort was to develop a general framework for rocket engine performance prediction that integrates physical principles, a rigorous mathematical formalism, component level test data, system level test data, and theory-observation reconciliation. Specific phase two development tasks are defined.
Timmis, J; Alden, K; Andrews, P; Clark, E; Nellis, A; Naylor, B; Coles, M; Kaye, P
2017-03-01
This tutorial promotes good practice for exploring the rationale of systems pharmacology models. A safety systems engineering inspired notation approach provides much needed rigor and transparency in development and application of models for therapeutic discovery and design of intervention strategies. Structured arguments over a model's development, underpinning biological knowledge, and analyses of model behaviors are constructed to determine the confidence that a model is fit for the purpose for which it will be applied. © 2016 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.
SSME component assembly and life management expert system
NASA Technical Reports Server (NTRS)
Ali, M.; Dietz, W. E.; Ferber, H. J.
1989-01-01
The space shuttle utilizes several rocket engine systems, all of which must function with a high degree of reliability for successful mission completion. The space shuttle main engine (SSME) is by far the most complex of the rocket engine systems and is designed to be reusable. The reusability of spacecraft systems introduces many problems related to testing, reliability, and logistics. Components must be assembled from parts inventories in a manner which will most effectively utilize the available parts. Assembly must be scheduled to efficiently utilize available assembly benches while still maintaining flight schedules. Assembled components must be assigned to as many contiguous flights as possible, to minimize component changes. Each component must undergo a rigorous testing program prior to flight. In addition, testing and assembly of flight engines and components must be done in conjunction with the assembly and testing of developmental engines and components. The development, testing, manufacture, and flight assignments of the engine fleet involve the satisfaction of many logistical and operational requirements, subject to many constraints. The purpose of the SSME Component Assembly and Life Management Expert System (CALMES) is to assist the engine assembly and scheduling process, and to ensure that these activities utilize available resources as efficiently as possible.
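As a rough illustration of the kind of assignment logic such a scheduling aid must automate (a minimal sketch with invented component names and a simplified rule; this is not the CALMES implementation), assembled components can be assigned to contiguous upcoming flights up to their remaining certified life:

    # Minimal sketch: assign assembled engine components to contiguous flights,
    # limited by each component's remaining certified life (in flights).
    # Component names, flight IDs, and the assignment rule are illustrative only.

    def assign_components(components, flights):
        """components: dict name -> remaining_life; flights: ordered list of flight IDs."""
        assignments = {}
        for name, remaining_life in components.items():
            # Give each component the longest contiguous run of upcoming flights
            # it can support, to minimize component changes between flights.
            assignments[name] = flights[:remaining_life]
        return assignments

    if __name__ == "__main__":
        comps = {"HPFTP-unit-07": 3, "MCC-unit-12": 5}
        upcoming = ["FLT-61", "FLT-62", "FLT-63", "FLT-64", "FLT-65"]
        for comp, flts in assign_components(comps, upcoming).items():
            print(comp, "->", flts)

A real scheduler must additionally balance bench availability, test sequencing, and developmental versus flight hardware, which is where the expert-system approach described above comes in.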
Programmable lithography engine (ProLE) grid-type supercomputer and its applications
NASA Astrophysics Data System (ADS)
Petersen, John S.; Maslow, Mark J.; Gerold, David J.; Greenway, Robert T.
2003-06-01
There are many variables that can affect lithography-dependent device yield. Because of this, it is not enough to make optical proximity corrections (OPC) based on the mask type, wavelength, lens, illumination type, and coherence. Resist chemistry and physics along with substrate, exposure, and all post-exposure processing must be considered too. Only a holistic approach to finding imaging solutions will accelerate yield and maximize performance. Since experiments are too costly in both time and money, accomplishing this takes massive amounts of accurate simulation capability. Our solution is to create a workbench that has a set of advanced user applications that utilize best-in-class simulator engines for solving litho-related DFM problems using distributive computing. Our product, ProLE (Programmable Lithography Engine), is an integrated system that combines Petersen Advanced Lithography Inc.'s (PAL's) proprietary applications and cluster management software wrapped around commercial software engines, along with optional commercial hardware and software. It uses the most rigorous lithography simulation engines to solve deep sub-wavelength imaging problems accurately and at speeds that are several orders of magnitude faster than current methods. Specifically, ProLE uses full vector thin-mask aerial image models or, when needed, full across-source 3D electromagnetic field simulation to make accurate aerial image predictions along with calibrated resist models. The ProLE workstation from Petersen Advanced Lithography, Inc., is the first commercial product that makes it possible to do these intensive calculations in a fraction of the time previously required, thus significantly reducing time to market for advanced technology devices. In this work, ProLE is introduced through model comparison to show why vector imaging and rigorous resist models work better than other, less rigorous models; then some applications that use our distributive computing solution are shown. Topics covered describe why ProLE solutions are needed from an economic and technical aspect, a high-level discussion of how the distributive system works, speed benchmarking, and finally a brief survey of applications including advanced aberrations for lens sensitivity and flare studies, optical proximity correction for a bitcell, and an application that allows evaluation of the potential of a design to have systematic failures during fabrication.
From Science To Design: Systems Engineering For The Lsst
NASA Astrophysics Data System (ADS)
Claver, Chuck F.; Axelrod, T.; Fouts, K.; Kantor, J.; Nordby, M.; Sebag, J.; LSST Collaboration
2009-01-01
The LSST is a universal-purpose survey telescope that will address scores of scientific missions. To assist the technical teams in converging to a specific engineering design, the LSST Science Requirements Document (SRD) selects four stressing principal scientific missions: 1) constraining Dark Matter and Dark Energy; 2) taking an inventory of the Solar System; 3) exploring the transient optical sky; and 4) mapping the Milky Way. From these four missions the SRD specifies the needed requirements for single images and the full 10-year survey that enable a wide range of science beyond the four principal missions. Through optical design and analysis, operations simulation, and throughput modeling, the systems engineering effort in the LSST has largely focused on taking the SRD specifications and deriving system functional requirements that define the system design. A Model Based Systems Engineering approach with SysML is used to manage the flow down of requirements from science to system function to sub-system. The rigor of requirements flow and management assists the LSST in keeping the overall scope, hence budget and schedule, under control.
The Tailoring of Traditional Systems Engineering for the Morpheus Project
NASA Technical Reports Server (NTRS)
Devolites, Jennifer L.; Hart, Jeremy J.
2013-01-01
NASA's Morpheus Project has developed and tested a prototype planetary lander capable of vertical takeoff and landing that is designed to serve as a testbed for advanced spacecraft technologies. The lander vehicle, propelled by a LOX/methane engine and sized to carry a 500 kg payload to the lunar surface, provides a platform for bringing technologies from the laboratory into an integrated flight system at relatively low cost. From the beginning, one of the goals for the Morpheus Project was to streamline agency processes and practices. The Morpheus project accepted a challenge to tailor the traditional NASA systems engineering approach in a way that would be appropriate for a lower cost, rapid prototype engineering effort, but retain the essence of the guiding principles. The team has produced innovative ways to create an infrastructure and approach that would challenge existing systems engineering processes while still enabling successful implementation of the current Morpheus Project. This paper describes the tailored systems engineering approach for the Morpheus project, including the processes, tools, and amount of rigor employed over the project's multiple lifecycles since the project began in FY11. Lessons learned from these trials have the potential to be scaled up and improve efficiency on larger projects or programs.
ERIC Educational Resources Information Center
HARDWICK, ARTHUR LEE
At this workshop of industrial representatives and technical educators, a technician was defined as one with broad-based mathematical and scientific training and with competence to support professional systems, engineering, and other scientific personnel. He should receive a rigorous, 2-year, post-secondary education especially designed for his…
Dynamic Gate Product and Artifact Generation from System Models
NASA Technical Reports Server (NTRS)
Jackson, Maddalena; Delp, Christopher; Bindschadler, Duane; Sarrel, Marc; Wollaeger, Ryan; Lam, Doris
2011-01-01
Model Based Systems Engineering (MBSE) is gaining acceptance as a way to formalize systems engineering practice through the use of models. The traditional method of producing and managing a plethora of disjointed documents and presentations ("PowerPoint engineering") has proven both costly and limiting as a means to manage the complex and sophisticated specifications of modern space systems. We have developed a tool and method to produce sophisticated artifacts as views and by-products of integrated models, allowing us to minimize the practice of "PowerPoint engineering" on model-based projects and demonstrate the ability of MBSE to work within and supersede traditional engineering practices. This paper describes how we have created and successfully used model-based document generation techniques to extract paper artifacts from complex SysML and UML models in support of successful project reviews. Use of formal SysML and UML models for architecture and system design enables production of review documents, textual artifacts, and analyses that are consistent with one another and require virtually no labor-intensive maintenance across small-scale design changes and multiple authors. This effort thus enables approaches that focus more on rigorous engineering work and less on "PowerPoint engineering" and the production of paper-based documents or their "office-productivity" file equivalents.
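To illustrate the underlying idea of generating artifacts as views of a model (a minimal sketch under assumed data structures; the element names below are hypothetical and this is neither the JPL tool nor a SysML API), a generator can walk an in-memory model and emit the corresponding document text, so the artifact always reflects the current model state:

    # Illustrative sketch: render a requirements section of a review document
    # directly from a (hypothetical) in-memory system model, so the artifact
    # stays consistent with the model instead of being maintained by hand.

    model = {
        "name": "Sample Instrument",
        "requirements": [
            {"id": "SYS-001", "text": "The system shall downlink science data within 24 hours."},
            {"id": "SYS-002", "text": "The system shall survive launch loads per the assumed environment spec."},
        ],
    }

    def render_requirements(model):
        lines = [f"Requirements for {model['name']}", ""]
        for req in model["requirements"]:
            lines.append(f"{req['id']}: {req['text']}")
        return "\n".join(lines)

    if __name__ == "__main__":
        print(render_requirements(model))

Because the document text is computed from the model, a design change made in the model propagates to every generated artifact without manual editing, which is the consistency property the paper emphasizes.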
On analyticity of linear waves scattered by a layered medium
NASA Astrophysics Data System (ADS)
Nicholls, David P.
2017-10-01
The scattering of linear waves by periodic structures is a crucial phenomenon in many branches of applied physics and engineering. In this paper we establish rigorous analytic results necessary for the proper numerical analysis of a class of High-Order Perturbation of Surfaces methods for simulating such waves. More specifically, we prove a theorem on existence and uniqueness of solutions to a system of partial differential equations which models the interaction of linear waves with a multiply layered periodic structure in three dimensions. This result provides hypotheses under which a rigorous numerical analysis could be conducted for recent generalizations to the methods of Operator Expansions, Field Expansions, and Transformed Field Expansions.
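For context, the governing equations in such a layered-medium problem are of Helmholtz type; one standard quasi-periodic formulation (with notation chosen here for exposition rather than taken from the paper) reads:

    \Delta u_m + k_m^2 u_m = 0 \quad \text{in each layer } \Omega_m,
    u_m = u_{m+1}, \qquad \partial_N u_m = \partial_N u_{m+1} \quad \text{across the interface between layers } m \text{ and } m+1,
    u_m(x + d, y, z) = e^{i\alpha d}\, u_m(x, y, z) \quad \text{(quasi-periodicity in the direction of periodicity)},

together with outgoing (upward- and downward-propagating) radiation conditions in the top and bottom layers. Existence and uniqueness results of the kind stated in the abstract are what legitimize a perturbation-of-surfaces numerical treatment of this system.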
NASA Ares I Crew Launch Vehicle Upper Stage Overview
NASA Technical Reports Server (NTRS)
Davis, Daniel J.; McArthur, J. Craig
2008-01-01
By incorporating rigorous engineering practices, innovative manufacturing processes and test techniques, a unique multi-center government/contractor partnership, and a clean-sheet design developed around the primary requirements for the International Space Station (ISS) and Lunar missions, the Upper Stage Element of NASA's Crew Launch Vehicle (CLV), the "Ares I," is a vital part of the Constellation Program's transportation system.
NASA Ares I Crew Launch Vehicle Upper State Overview
NASA Technical Reports Server (NTRS)
Davis, Daniel J.
2008-01-01
By incorporating rigorous engineering practices, innovative manufacturing processes and test techniques, a unique multi-center government/contractor partnership, and a clean-sheet design developed around the primary requirements for the International Space Station (ISS) and Lunar missions, the Upper Stage Element of NASA's Crew Launch Vehicle (CLV), the "Ares I," is a vital part of the Constellation Program's transportation system.
Navigating Transitions: Challenges for Engineering Students
ERIC Educational Resources Information Center
Moore-Russo, Deborah; Wilsey, Jillian N.; Parthum, Michael J., Sr.; Lewis, Kemper
2017-01-01
As college students enter engineering, they face challenges when they navigate across various transitions. These challenges impact whether a student can successfully adapt to the rigorous curricular requirements of an engineering degree and to the norms and expectations that are particular to engineering. This article focuses on the transitions…
Applying formal methods and object-oriented analysis to existing flight software
NASA Technical Reports Server (NTRS)
Cheng, Betty H. C.; Auernheimer, Brent
1993-01-01
Correctness is paramount for safety-critical software control systems. Critical software failures in medical radiation treatment, communications, and defense are familiar to the public. The significant quantity of software malfunctions regularly reported to the software engineering community, the laws concerning liability, and a recent NRC Aeronautics and Space Engineering Board report additionally motivate the use of error-reducing and defect-detection software development techniques. The benefits of formal methods in requirements-driven software development ('forward engineering') are well documented. One advantage of rigorously engineering software is that formal notations are precise, verifiable, and facilitate automated processing. This paper describes the application of formal methods to reverse engineering, where formal specifications are developed for a portion of the shuttle on-orbit digital autopilot (DAP). Three objectives of the project were to: demonstrate the use of formal methods on a shuttle application, facilitate the incorporation and validation of new requirements for the system, and verify the safety-critical properties to be exhibited by the software.
Enterprise and system of systems capability development life-cycle processes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beck, David Franklin
2014-08-01
This report and set of appendices are a collection of memoranda originally drafted circa 2007-2009 for the purpose of describing and detailing a models-based systems engineering approach for satisfying enterprise and system-of-systems life cycle process requirements. At the time there was interest and support to move from Capability Maturity Model Integration (CMMI) Level One (ad hoc processes) to Level Three. The main thrust of the material presents a rational exposé of a structured enterprise development life cycle that uses the scientific method as a framework, with further rigor added from adapting relevant portions of standard systems engineering processes. While the approach described invokes application of the Department of Defense Architectural Framework (DoDAF), it is suitable for use with other architectural description frameworks.
1987-06-01
redress a growing strategic imbalance and provide an en- test pilots conducted a rigorous flight test during capability to penetrate Soviet program...ment career path for rated officers vidual rotates through assignments in (pilots and navigators) is different from engineering, test and evaluation...pain. Acquiring the B-1B, or any other weapon system for that matter, entails developing, testing and producing new technology. In any high-tech en
Documenting the Engineering Design Process
ERIC Educational Resources Information Center
Hollers, Brent
2017-01-01
Documentation of ideas and the engineering design process is a critical, daily component of a professional engineer's job. While patent protection is often cited as the primary rationale for documentation, it can also benefit the engineer, the team, company, and stakeholders through creating a more rigorously designed and purposeful solution.…
The Applied Mathematics for Power Systems (AMPS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chertkov, Michael
2012-07-24
Increased deployment of new technologies, e.g., renewable generation and electric vehicles, is rapidly transforming electrical power networks by crossing previously distinct spatiotemporal scales and invalidating many traditional approaches for designing, analyzing, and operating power grids. This trend is expected to accelerate over the coming years, bringing the disruptive challenge of complexity, but also opportunities to deliver unprecedented efficiency and reliability. Our Applied Mathematics for Power Systems (AMPS) Center will discover, enable, and solve emerging mathematics challenges arising in power systems and, more generally, in complex engineered networks. We will develop foundational applied mathematics resulting in rigorous algorithms and simulation toolboxes for modern and future engineered networks. The AMPS Center deconstruction/reconstruction approach 'deconstructs' complex networks into sub-problems within non-separable spatiotemporal scales, a missing step in 20th century modeling of engineered networks. These sub-problems are addressed within the appropriate AMPS foundational pillar - complex systems, control theory, and optimization theory - and merged or 'reconstructed' at their boundaries into more general mathematical descriptions of complex engineered networks where important new questions are formulated and attacked. These two steps, iterated multiple times, will bridge the growing chasm between the legacy power grid and its future as a complex engineered network.
A Status Report on the Parachute Development for NASA's Next Manned Spacecraft
NASA Technical Reports Server (NTRS)
Sinclair, Robert
2008-01-01
NASA has determined that the parachute portion of the Landing System for the Crew Exploration Vehicle (CEV) will be Government Furnished Equipment (GFE). The Earth Landing System has been designated the CEV Parachute Assembly System (CPAS). Thus a program team was developed consisting of NASA Johnson Space Center (JSC) and Jacobs Engineering through their Engineering and Science Contract Group (ESCG). Following a rigorous competitive phase, Airborne Systems North America was selected to provide the parachute design, testing, and manufacturing role to support this team. The development program has begun with some early flight testing of a Generation 1 parachute system. Future testing will continue to refine the design and complete a qualification phase prior to manned flight of the spacecraft. The program team will also support early spacecraft system testing, including a Pad Abort Flight Test in the Fall of 2008.
NASA Technical Reports Server (NTRS)
Patterson, Jonathan D.; Breckenridge, Jonathan T.; Johnson, Stephen B.
2013-01-01
Building upon the purpose, theoretical approach, and use of a Goal-Function Tree (GFT) being presented by Dr. Stephen B. Johnson, described in a related Infotech 2013 ISHM abstract titled "Goal-Function Tree Modeling for Systems Engineering and Fault Management", this paper will describe the core framework used to implement the GFT-based systems engineering process using the Systems Modeling Language (SysML). These two papers are ideally accepted and presented together in the same Infotech session. Statement of problem: SysML, as a tool, is currently not capable of implementing the theoretical approach described within the "Goal-Function Tree Modeling for Systems Engineering and Fault Management" paper cited above. More generally, SysML's current capabilities to model functional decompositions in the rigorous manner required in the GFT approach are limited. The GFT is a new Model-Based Systems Engineering (MBSE) approach to the development of goals and requirements, functions, and their linkage to design. Because SysML is a growing standard for systems engineering, it is important to develop methods to implement GFT in SysML. Proposed method of solution: Many of the central concepts of the SysML language are needed to implement a GFT for large complex systems. In the implementation of those central concepts, the following will be described in detail: changes to the nominal SysML process, model view definitions and examples, diagram definitions and examples, and detailed SysML construct and stereotype definitions.
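As a purely illustrative sketch of the kind of structure a GFT captures (class, attribute, and goal names below are invented and are not the SysML stereotypes defined in the paper), goals decompose into sub-goals and functions, with each function tied to the state variable it is intended to control:

    # Minimal illustration of a Goal-Function Tree node structure:
    # goals decompose into sub-goals and functions, and each function is
    # tied to the state variable it is intended to control.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Function:
        name: str
        controlled_state: str  # state variable this function controls

    @dataclass
    class Goal:
        statement: str
        functions: List[Function] = field(default_factory=list)
        subgoals: List["Goal"] = field(default_factory=list)

        def walk(self, depth=0):
            print("  " * depth + "Goal: " + self.statement)
            for f in self.functions:
                print("  " * (depth + 1) + f"Function: {f.name} (controls {f.controlled_state})")
            for g in self.subgoals:
                g.walk(depth + 1)

    if __name__ == "__main__":
        root = Goal("Maintain tank pressure within limits",
                    functions=[Function("Regulate pressurant flow", "tank_pressure")])
        root.subgoals.append(Goal("Detect over-pressure condition",
                                  functions=[Function("Monitor pressure sensor", "sensed_pressure")]))
        root.walk()

The paper's contribution is to express this kind of decomposition, and its traceability to design, with SysML constructs and stereotypes rather than with ad hoc code.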
Teaching Mathematics to Civil Engineers
ERIC Educational Resources Information Center
Sharp, J. J.; Moore, E.
1977-01-01
This paper outlines a technique for teaching a rigorous course in calculus and differential equations which stresses applicability of the mathematics to problems in civil engineering. The method involves integration of subject matter and team teaching. (SD)
NASA's Solar Dynamics Observatory (SDO): A Systems Approach to a Complex Mission
NASA Technical Reports Server (NTRS)
Ruffa, John A.; Ward, David K.; Bartusek, LIsa M.; Bay, Michael; Gonzales, Peter J.; Pesnell, William D.
2012-01-01
The Solar Dynamics Observatory (SDO) includes three advanced instruments, massive science data volume, stringent science data completeness requirements, and a custom ground station to meet mission demands. The strict instrument science requirements imposed a number of challenging drivers on the overall mission system design, leading the SDO team to adopt an integrated systems engineering presence across all aspects of the mission to ensure that mission science requirements would be met. Key strategies were devised to address these system level drivers and mitigate identified threats to mission success. The global systems engineering team approach ensured that key drivers and risk areas were rigorously addressed through all phases of the mission, leading to the successful SDO launch and on-orbit operation. Since launch, SDO's on-orbit performance has met all mission science requirements and enabled groundbreaking science observations, expanding our understanding of the Sun and its dynamic processes.
NASA's Solar Dynamics Observatory (SDO): A Systems Approach to a Complex Mission
NASA Technical Reports Server (NTRS)
Ruffa, John A.; Ward, David K.; Bartusek, Lisa M.; Bay, Michael; Gonzales, Peter J.; Pesnell, William D.
2012-01-01
The Solar Dynamics Observatory (SDO) includes three advanced instruments, massive science data volume, stringent science data completeness requirements, and a custom ground station to meet mission demands. The strict instrument science requirements imposed a number of challenging drivers on the overall mission system design, leading the SDO team to adopt an integrated systems engineering presence across all aspects of the mission to ensure that mission science requirements would be met. Key strategies were devised to address these system level drivers and mitigate identified threats to mission success. The global systems engineering team approach ensured that key drivers and risk areas were rigorously addressed through all phases of the mission, leading to the successful SDO launch and on-orbit operation. Since launch, SDO's on-orbit performance has met all mission science requirements and enabled groundbreaking science observations, expanding our understanding of the Sun and its dynamic processes.
We Don’t Dance Well: Government and Industry Defense Materiel Acquisition
2010-04-01
tive prototyping prior to Milestone B and rigorous system engineering. Those activities are extremely important and critical to successful...for the warfighter and is a critical member of the materiel acquisition team. That point seems to be forgotten by some acquisition workforce members...A healthy and engaging relationship with industry partners is a critical component of any program and will surely impact—positively or negatively
A design and implementation methodology for diagnostic systems
NASA Technical Reports Server (NTRS)
Williams, Linda J. F.
1988-01-01
A methodology for design and implementation of diagnostic systems is presented. Also discussed are the advantages of embedding a diagnostic system in a host system environment. The methodology utilizes an architecture for diagnostic system development that is hierarchical and makes use of object-oriented representation techniques. Additionally, qualitative models are used to describe the host system components and their behavior. The methodology architecture includes a diagnostic engine that utilizes a combination of heuristic knowledge to control the sequence of diagnostic reasoning. The methodology provides an integrated approach to development of diagnostic system requirements that is more rigorous than standard systems engineering techniques. The advantages of using this methodology during various life cycle phases of the host systems (e.g., National Aerospace Plane (NASP)) include: the capability to analyze diagnostic instrumentation requirements during the host system design phase, a ready software architecture for implementation of diagnostics in the host system, and the opportunity to analyze instrumentation for failure coverage in safety critical host system operations.
Validation (not just verification) of Deep Space Missions
NASA Technical Reports Server (NTRS)
Duren, Riley M.
2006-01-01
Verification & Validation (V&V) is a widely recognized and critical systems engineering function. However, the often-used definition 'Verification proves the design is right; validation proves it is the right design' is rather vague. And while Verification is a reasonably well standardized systems engineering process, Validation is a far more abstract concept, and the rigor and scope applied to it vary widely between organizations and individuals. This is reflected in the findings in recent Mishap Reports for several NASA missions, in which shortfalls in Validation (not just Verification) were cited as root or contributing factors in catastrophic mission loss. Furthermore, although there is strong agreement in the community that Test is the preferred method for V&V, many people equate 'V&V' with 'Test', such that Analysis and Modeling aren't given comparable attention. Another strong motivator is a realization that the rapid growth in complexity of deep-space missions (particularly Planetary Landers and Space Observatories, given their inherent unknowns) is placing greater demands on systems engineers to 'get it right' with Validation.
Requirements Development for the NASA Advanced Engineering Environment (AEE)
NASA Technical Reports Server (NTRS)
Rogers, Eric; Hale, Joseph P.; Zook, Keith; Gowda, Sanjay; Salas, Andrea O.
2003-01-01
The requirements development process for the Advanced Engineering Environment (AEE) is presented. This environment has been developed to allow NASA to perform independent analysis and design of space transportation architectures and technologies. Given the highly collaborative and distributed nature of AEE, a variety of organizations are involved in the development, operations, and management of the system. Furthermore, there are additional organizations involved representing external customers and stakeholders. Thorough coordination and effective communication are essential to translate desired expectations of the system into requirements. Functional, verifiable requirements for this (and indeed any) system are necessary to fulfill several roles. Requirements serve as a contractual tool, a configuration management tool, and an engineering tool, sometimes simultaneously. The role of requirements as an engineering tool is particularly important because a stable set of requirements for a system provides a common framework of system scope and characterization among team members. Furthermore, the requirements provide the basis for checking completion of system elements and form the basis for system verification. Requirements are at the core of systems engineering. The AEE Project has undertaken a thorough process to translate the desires and expectations of external customers and stakeholders into functional system-level requirements that are captured with sufficient rigor to allow development planning, resource allocation, and system-level design, development, implementation, and verification. These requirements are maintained in an integrated, relational database that provides traceability to governing Program requirements and also to verification methods and subsystem-level requirements.
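A minimal sketch of the traceability idea follows (the table layout, column names, and sample rows are assumptions for illustration and are not the AEE database schema): each system-level requirement is linked to its governing program requirement and to a verification method, so completeness and verification checks become simple queries.

    # Minimal sketch of a requirements-traceability store using sqlite3.
    # Schema and sample rows are illustrative assumptions, not the AEE schema.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE program_req (id TEXT PRIMARY KEY, text TEXT);
    CREATE TABLE system_req (
        id TEXT PRIMARY KEY,
        text TEXT,
        parent_id TEXT REFERENCES program_req(id),
        verification TEXT CHECK (verification IN ('Test','Analysis','Inspection','Demonstration'))
    );
    """)
    conn.execute("INSERT INTO program_req VALUES ('PRG-1', 'Enable independent architecture analysis.')")
    conn.execute("INSERT INTO system_req VALUES ('SYS-1.1', 'Provide a shared parametric vehicle model.', 'PRG-1', 'Demonstration')")

    # Trace each system requirement back to its governing program requirement.
    for row in conn.execute("""
        SELECT s.id, s.verification, p.id
        FROM system_req s JOIN program_req p ON s.parent_id = p.id
    """):
        print(row)

The value of such a store is exactly the property the abstract describes: traceability upward to program requirements and outward to verification methods and subsystem-level requirements, maintained in one place.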
Unleashing Lessons: Sharing Stories About the Fine Art of Systems Engineering
NASA Technical Reports Server (NTRS)
Singer, Christopher E.
2010-01-01
NASA leaders have a responsibility to share their unique oral histories with junior-level employees on whom NASA's future depends. This presentation will give a few examples of how the imaginative, flexible art of systems engineering is as necessary to mission success as is the rigorous, disciplined side of engineering. Engineering space systems involves many disciplines (propulsion, loads, dynamics, and so forth) that are based on the foundations of scientific principles and methodology and the application of the laws of physics. The term rocket scientist is an apt one, considering that the underlying chemical properties of propellants and the subatomic properties of materials must be understood to harness the powerful energy necessary to escape Earth's gravity in machines that can withstand the stresses and forces to which they are subjected, not to mention the harsh space environments in which they must work. This is a simplistic, yet illustrative, explanation of the scientific side of the engineer's challenge. Bringing together these individual parts into a solid system goes beyond the science of engineering to employ the art of systems engineering. Systems engineers are known for their ability to integrate various solutions to meet or exceed challenging requirements. As the old adage goes, measure twice and cut once. The act of measuring is balancing rigid, inflexible requirements with creative compromises to attain the optimum solution to the challenge of space flight. Then, we cut out those answers that are too risky, expensive, dangerous, and so forth. The process of sharing stories about the little-discussed art of engineering, also known as the art of compromise, will equip the workforce to subjectively judge the best right answer from among the many presented, while objectively integrating the various piece parts into a unified whole.
The FoReVer Methodology: A MBSE Framework for Formal Verification
NASA Astrophysics Data System (ADS)
Baracchi, Laura; Mazzini, Silvia; Cimatti, Alessandro; Tonetta, Stefano; Garcia, Gerald
2013-08-01
The need for a high level of confidence and operational integrity in critical space (software) systems is well recognized in the Space industry and has been addressed so far through rigorous System and Software Development Processes and stringent Verification and Validation regimes. The Model Based Space System Engineering process (MBSSE) derived in the System and Software Functional Requirement Techniques study (SSFRT) focused on the application of model based engineering technologies to support the space system and software development processes, from mission level requirements to software implementation, through model refinements and translations. In this paper we report on our work in the ESA-funded FoReVer project, where we aim at developing methodological, theoretical and technological support for a systematic approach to space avionics system development in phases 0/A/B/C. FoReVer enriches the MBSSE process with contract-based formal verification of properties, at different stages from system to software, through a step-wise refinement approach, with support for a Software Reference Architecture.
2017-08-09
The 8.5-minute test conducted at NASA’s Stennis Space Center is part of a series of tests designed to put the upgraded former space shuttle engines through the rigorous temperature and pressure conditions they will experience during a launch. The tests also support the development of a new controller, or “brain,” for the engine, which monitors engine status and communicates between the rocket and the engine, relaying commands to the engine and transmitting data back to the rocket.
Development of the Functional Flow Block Diagram for the J-2X Rocket Engine System
NASA Technical Reports Server (NTRS)
White, Thomas; Stoller, Sandra L.; Greene, William D.; Christenson, Rick L.; Bowen, Barry C.
2007-01-01
The J-2X program calls for the upgrade of the Apollo-era Rocketdyne J-2 engine to higher power levels, using new materials and manufacturing techniques, and with more restrictive safety and reliability requirements than prior human-rated engines in NASA history. Such requirements demand a comprehensive systems engineering effort to ensure success. Pratt & Whitney Rocketdyne system engineers performed a functional analysis of the engine to establish the functional architecture. J-2X functions were captured in six major operational blocks. Each block was divided into sub-blocks or states. In each sub-block, functions necessary to perform each state were determined. A functional engine schematic consistent with the fidelity of the system model was defined for this analysis. The blocks, sub-blocks, and functions were sequentially numbered to differentiate the states in which the functions were performed and to indicate the sequence of events. The Engine System was functionally partitioned to provide separate and unique functional operators. Establishing unique functional operators as work output of the System Architecture process is novel in Liquid Propulsion Engine design. Each functional operator was described such that its unique functionality was identified. The decomposed functions were then allocated to the functional operators, both of which were the inputs to the subsystem or component performance specifications. PWR also used a novel approach to identify and map the engine functional requirements to customer-specified functions. The final result was a comprehensive Functional Flow Block Diagram (FFBD) for the J-2X Engine System, decomposed to the component level and mapped to all functional requirements. This FFBD greatly facilitates component specification development, providing a well-defined trade space for functional trades at the subsystem and component level. It also provides a framework for function-based failure modes and effects analysis (FMEA), and a rigorous baseline for the functional architecture.
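As a rough illustration of the block / sub-block / function numbering scheme described above, here is a short Python sketch; the class structure and the example names are hypothetical and far simpler than the actual J-2X functional decomposition.

from dataclasses import dataclass, field

# Hypothetical, simplified representation of an FFBD hierarchy in which
# blocks, sub-blocks (states), and functions are numbered sequentially
# so the identifier itself encodes the sequence of events.
@dataclass
class Function:
    number: str
    name: str

@dataclass
class SubBlock:
    number: str
    name: str
    functions: list = field(default_factory=list)

    def add_function(self, name: str) -> Function:
        f = Function(f"{self.number}.{len(self.functions) + 1}", name)
        self.functions.append(f)
        return f

@dataclass
class Block:
    number: str
    name: str
    sub_blocks: list = field(default_factory=list)

    def add_sub_block(self, name: str) -> SubBlock:
        s = SubBlock(f"{self.number}.{len(self.sub_blocks) + 1}", name)
        self.sub_blocks.append(s)
        return s

# Example (block and function names are illustrative, not the actual J-2X decomposition).
start = Block("1.0", "Engine Start")
chill = start.add_sub_block("Propellant Chill-In")
chill.add_function("Open main fuel valve")
chill.add_function("Monitor turbopump inlet temperature")
for sb in start.sub_blocks:
    for fn in sb.functions:
        print(fn.number, fn.name)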
ERIC Educational Resources Information Center
Welch, Karla Conn; Hieb, Jeffrey; Graham, James
2015-01-01
Coursework that instills patterns of rigorous logical thought has long been a hallmark of the engineering curriculum. However, today's engineering students are expected to exhibit a wider range of thinking capabilities both to satisfy ABET requirements and to prepare the students to become successful practitioners. This paper presents the initial…
ERIC Educational Resources Information Center
Jehopio, Peter J.; Wesonga, Ronald
2017-01-01
Background: The main objective of the study was to examine the relevance of engineering mathematics to the emerging industries. The level of abstraction, the standard of rigor, and the depth of theoretical treatment are necessary skills expected of a graduate engineering technician to be derived from mathematical knowledge. The question of whether…
Goals Analysis Procedure Guidelines for Applying the Goals Analysis Process
NASA Technical Reports Server (NTRS)
Motley, Albert E., III
2000-01-01
One of the key elements to successful project management is the establishment of the "right set of requirements", requirements that reflect the true customer needs and are consistent with the strategic goals and objectives of the participating organizations. A viable set of requirements implies that each individual requirement is a necessary element in satisfying the stated goals and that the entire set of requirements, taken as a whole, is sufficient to satisfy the stated goals. Unfortunately, it is the author's experience that during project formulation phases, many of the Systems Engineering customers do not conduct a rigorous analysis of the goals and objectives that drive the system requirements. As a result, the Systems Engineer is often provided with requirements that are vague, incomplete, and internally inconsistent. To complicate matters, most systems development methodologies assume that the customer provides unambiguous, comprehensive and concise requirements. This paper describes the specific steps of a Goals Analysis process applied by Systems Engineers at the NASA Langley Research Center during the formulation of requirements for research projects. The objective of Goals Analysis is to identify and explore all of the influencing factors that ultimately drive the system's requirements.
Spillover, nonlinearity, and flexible structures
NASA Technical Reports Server (NTRS)
Bass, Robert W.; Zes, Dean
1991-01-01
Many systems whose evolution in time is governed by Partial Differential Equations (PDEs) are linearized around a known equilibrium before Computer Aided Control Engineering (CACE) is considered. In this case, there are infinitely many independent vibrational modes, and it is intuitively evident on physical grounds that infinitely many actuators would be needed in order to control all modes. A more precise, general formulation of this grave difficulty (spillover problem) is due to A.V. Balakrishnan. A possible route to circumvention of this difficulty lies in leaving the PDE in its original nonlinear form, and adding the essentially finite dimensional control action prior to linearization. One possibly applicable technique is the Liapunov Schmidt rigorous reduction of singular infinite dimensional implicit function problems to finite dimensional implicit function problems. Omitting details of Banach space rigor, the formalities of this approach are given.
NASA Technical Reports Server (NTRS)
Devolites, Jennifer L.; Olansen, Jon B.
2015-01-01
NASA's Morpheus Project has developed and tested a prototype planetary lander capable of vertical takeoff and landing that is designed to serve as a testbed for advanced spacecraft technologies. The lander vehicle, propelled by a Liquid Oxygen (LOX)/Methane engine and sized to carry a 500kg payload to the lunar surface, provides a platform for bringing technologies from the laboratory into an integrated flight system at relatively low cost. In 2012, Morpheus began integrating the Autonomous Landing and Hazard Avoidance Technology (ALHAT) sensors and software onto the vehicle in order to demonstrate safe, autonomous landing and hazard avoidance. From the beginning, one of the goals for the Morpheus Project was to streamline agency processes and practices. The Morpheus project accepted a challenge to tailor the traditional NASA systems engineering approach in a way that would be appropriate for a lower cost, rapid prototype engineering effort, but retain the essence of the guiding principles. This paper describes the tailored project life cycle and systems engineering approach for the Morpheus project, including the processes, tools, and amount of rigor employed over the project's multiple lifecycles since the project began in fiscal year (FY) 2011.
System engineering and science projects: lessons from MeerKAT
NASA Astrophysics Data System (ADS)
Kapp, Francois
2016-08-01
The Square Kilometre Array (SKA) is a large science project planning to commence construction of the world's largest Radio Telescope after 2018. MeerKAT is one of the precursor projects to the SKA, based on the same site that will host the SKA Mid array in the central Karoo area of South Africa. From the perspective of signal processing hardware development, we analyse the challenges that MeerKAT encountered and extrapolate them to SKA in order to prepare the System Engineering and Project Management methods that could contribute to a successful completion of SKA. Using the MeerKAT Digitiser, Correlator/Beamformer and Time and Frequency Reference Systems as an example, we will trace the risk profile and subtle differences in engineering approaches of these systems over time and show the effects of varying levels of System Engineering rigour on the evolution of their risk profiles. It will be shown that the most rigorous application of System Engineering discipline resulted in the most substantial reduction in risk over time. Since the challenges faced by SKA are not limited to that of MeerKAT, we also look into how that translates to a system development where there is substantial complexity in both the created system as well as the creating system. Since the SKA will be designed and constructed by consortia made up from the ten member countries, there are many additional complexities to the organisation creating the system - a challenge the MeerKAT project did not encounter. Factors outside of engineering, for instance procurement models and political interests, also play a more significant role, and add to the project risks of SKA when compared to MeerKAT.
2016-08-18
The 7.5-minute test conducted at NASA’s Stennis Space Center is part of a series of tests designed to put the upgraded former space shuttle engines through the rigorous temperature and pressure conditions they will experience during a launch. The tests also support the development of a new controller, or “brain,” for the engine, which monitors engine status and communicates between the rocket and the engine, relaying commands to the engine and transmitting data back to the rocket.
ERIC Educational Resources Information Center
Kittur, H.; Shaw, L.; Herrera, W.
2017-01-01
The High School Summer Research Program (HSSRP) is a rigorous eight-week research experience that challenges high school students to address a novel scientific question in an engineering laboratory at the Henry Samueli School of Engineering and Applied Science (HSSEAS) at the University of California, Los Angeles (UCLA). The program collates highly…
Agapakis, Christina M; Silver, Pamela A
2009-07-01
Synthetic biology has been used to describe many biological endeavors over the past thirty years--from designing enzymes and in vitro systems, to manipulating existing metabolisms and gene expression, to creating entirely synthetic replicating life forms. What separates the current incarnation of synthetic biology from the recombinant DNA technology or metabolic engineering of the past is an emphasis on principles from engineering such as modularity, standardization, and rigorously predictive models. As such, synthetic biology represents a new paradigm for learning about and using biological molecules and data, with applications in basic science, biotechnology, and medicine. This review covers the canonical examples as well as some recent advances in synthetic biology in terms of what we know and what we can learn about the networks underlying biology, and how this endeavor may shape our understanding of living systems.
Towards a Unified Theory of Engineering Education
ERIC Educational Resources Information Center
Salcedo Orozco, Oscar H.
2017-01-01
STEM education is an interdisciplinary approach to learning where rigorous academic concepts are coupled with real-world lessons and activities as students apply science, technology, engineering, and mathematics in contexts that make connections between school, community, work, and the global enterprise enabling STEM literacy (Tsupros, Kohler and…
A preliminary design for the GMT-Consortium Large Earth Finder (G-CLEF)
NASA Astrophysics Data System (ADS)
Szentgyorgyi, Andrew; Barnes, Stuart; Bean, Jacob; Bigelow, Bruce; Bouchez, Antonin; Chun, Moo-Young; Crane, Jeffrey D.; Epps, Harland; Evans, Ian; Evans, Janet; Frebel, Anna; Furesz, Gabor; Glenday, Alex; Guzman, Dani; Hare, Tyson; Jang, Bi-Ho; Jang, Jeong-Gyun; Jeong, Ueejong; Jordan, Andres; Kim, Kang-Min; Kim, Jihun; Li, Chih-Hao; Lopez-Morales, Mercedes; McCracken, Kenneth; McLeod, Brian; Mueller, Mark; Nah, Jakyung; Norton, Timothy; Oh, Heeyoung; Oh, Jae Sok; Ordway, Mark; Park, Byeong-Gon; Park, Chan; Park, Sung-Joon; Phillips, David; Plummer, David; Podgorski, William; Rodler, Florian; Seifahrt, Andreas; Tak, Kyung-Mo; Uomoto, Alan; Van Dam, Marcos A.; Walsworth, Ronald; Yu, Young Sam; Yuk, In-Soo
2014-08-01
The GMT-Consortium Large Earth Finder (G-CLEF) is an optical-band echelle spectrograph that has been selected as the first light instrument for the Giant Magellan Telescope (GMT). G-CLEF is a general-purpose, high dispersion spectrograph that is fiber fed and capable of extremely precise radial velocity measurements. The G-CLEF Concept Design (CoD) was selected in Spring 2013. Since then, G-CLEF has undergone science requirements and instrument requirements reviews and will be the subject of a preliminary design review (PDR) in March 2015. Since CoD review (CoDR), the overall G-CLEF design has evolved significantly as we have optimized the constituent designs of the major subsystems, i.e. the fiber system, the telescope interface, the calibration system and the spectrograph itself. These modifications have been made to enhance G-CLEF's capability to address frontier science problems, as well as to respond to the evolution of the GMT itself and developments in the technical landscape. G-CLEF has been designed by applying rigorous systems engineering methodology to flow Level 1 Scientific Objectives to Level 2 Observational Requirements and thence to Level 3 and Level 4. The rigorous systems approach applied to G-CLEF establishes a well defined science requirements framework for the engineering design. By adopting this formalism, we may flexibly update and analyze the capability of G-CLEF to respond to new scientific discoveries as we move toward first light. G-CLEF will exploit numerous technological advances and features of the GMT itself to deliver an efficient, high performance instrument, e.g. exploiting the adaptive optics secondary system to increase both throughput and radial velocity measurement precision.
Efficiency bounds for nonequilibrium heat engines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mehta, Pankaj; Polkovnikov, Anatoli, E-mail: asp@bu.edu
2013-05-15
We analyze the efficiency of thermal engines (either quantum or classical) working with a single heat reservoir like an atmosphere. The engine first gets an energy intake, which can be done in an arbitrary nonequilibrium way e.g. combustion of fuel. Then the engine performs the work and returns to the initial state. We distinguish two general classes of engines where the working body first equilibrates within itself and then performs the work (ergodic engine) or when it performs the work before equilibrating (non-ergodic engine). We show that in both cases the second law of thermodynamics limits their efficiency. For ergodic engines we find a rigorous upper bound for the efficiency, which is strictly smaller than the equivalent Carnot efficiency, i.e. the Carnot efficiency can never be achieved in single reservoir heat engines. For non-ergodic engines the efficiency can be higher and can exceed the equilibrium Carnot bound. By extending the fundamental thermodynamic relation to nonequilibrium processes, we find a rigorous thermodynamic bound for the efficiency of both ergodic and non-ergodic engines and show that it is given by the relative entropy of the nonequilibrium and initial equilibrium distributions. These results suggest a new general strategy for designing more efficient engines. We illustrate our ideas by using simple examples. Highlights: ► Derived efficiency bounds for heat engines working with a single reservoir. ► Analyzed both ergodic and non-ergodic engines. ► Showed that non-ergodic engines can be more efficient. ► Extended fundamental thermodynamic relation to arbitrary nonequilibrium processes.
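For reference, the two quantities the abstract invokes can be written down explicitly; the exact form of the bound derived in the paper is not reproduced here, so the LaTeX lines below only define the standard Carnot efficiency for reservoirs at temperatures T_h > T_c and the relative entropy between a nonequilibrium distribution p and the initial equilibrium distribution p^eq in which the bound is expressed.

\[
\eta_C = 1 - \frac{T_c}{T_h}, \qquad
D\!\left(p \,\middle\|\, p^{\mathrm{eq}}\right) = \sum_i p_i \ln \frac{p_i}{p_i^{\mathrm{eq}}} \;\ge\; 0 .
\]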
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2015-08-01
Since 1990, the National Renewable Energy Laboratory’s (NREL's) National Wind Technology Center (NWTC) has tested more than 150 wind turbine blades. NWTC researchers can test full-scale and subcomponent articles, conduct data analyses, and provide engineering expertise on best design practices. Structural testing of wind turbine blades enables designers, manufacturers, and owners to validate designs and assess structural performance to specific load conditions. Rigorous structural testing can reveal design and manufacturing problems at an early stage of development that can lead to overall improvements in design and increase system reliability.
Researches on direct injection in internal-combustion engines
NASA Technical Reports Server (NTRS)
Tuscher, Jean E
1941-01-01
These researches present a solution for reducing the fatigue of the Diesel engine by permitting the preservation of its components and, at the same time, raising its specific horsepower to a par with that of carburetor engines, while maintaining for the Diesel engine its prerogative of burning heavy fuel under optimum economical conditions. The feeding of Diesel engines by injection pumps actuated by engine compression achieves the required high speeds of injection readily and permits rigorous control of the combustible charge introduced into each cylinder and of the peak pressure in the resultant cycle.
Automated Generation of Fault Management Artifacts from a Simple System Model
NASA Technical Reports Server (NTRS)
Kennedy, Andrew K.; Day, John C.
2013-01-01
Our understanding of off-nominal behavior - failure modes and fault propagation - in complex systems is often based purely on engineering intuition; specific cases are assessed in an ad hoc fashion as a (fallible) fault management engineer sees fit. This work is an attempt to provide a more rigorous approach to this understanding and assessment by automating the creation of a fault management artifact, the Failure Modes and Effects Analysis (FMEA) through querying a representation of the system in a SysML model. This work builds off the previous development of an off-nominal behavior model for the upcoming Soil Moisture Active-Passive (SMAP) mission at the Jet Propulsion Laboratory. We further developed the previous system model to more fully incorporate the ideas of State Analysis, and it was restructured in an organizational hierarchy that models the system as layers of control systems while also incorporating the concept of "design authority". We present software that was developed to traverse the elements and relationships in this model to automatically construct an FMEA spreadsheet. We further discuss extending this model to automatically generate other typical fault management artifacts, such as Fault Trees, to efficiently portray system behavior, and depend less on the intuition of fault management engineers to ensure complete examination of off-nominal behavior.
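A toy sketch of the idea, in Python, is given below: a small dictionary stands in for the queried system model, failure modes are generated as the negation of each component's intended function, and the propagated effects are written out as FMEA rows. The component names and the traversal logic are invented for illustration and bear no relation to the actual SMAP SysML model or the software described in the paper.

import csv

# Toy illustration only: a hand-built dictionary stands in for the system
# model, and each failure mode is treated as the "inverse of intent" of the
# function the component provides.
model = {
    "Battery": {"function": "supply bus power", "downstream": ["Flight Computer"]},
    "Flight Computer": {"function": "execute command sequences", "downstream": ["Radio"]},
    "Radio": {"function": "transmit telemetry to ground", "downstream": []},
}

def local_effect(component):
    return f"loss of ability to {model[component]['function']}"

def system_effect(component, visited=None):
    # Follow downstream dependencies to estimate the end effect of a failure.
    visited = visited or set()
    chain = []
    for dep in model[component]["downstream"]:
        if dep not in visited:
            visited.add(dep)
            chain.append(local_effect(dep))
            chain.extend(system_effect(dep, visited))
    return chain

with open("fmea.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["Item", "Failure Mode", "Local Effect", "System-Level Effect"])
    for name, info in model.items():
        writer.writerow([
            name,
            f"fails to {info['function']}",
            local_effect(name),
            "; ".join(system_effect(name)) or "none identified",
        ])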
Application of State Analysis and Goal-based Operations to a MER Mission Scenario
NASA Technical Reports Server (NTRS)
Morris, John Richard; Ingham, Michel D.; Mishkin, Andrew H.; Rasmussen, Robert D.; Starbird, Thomas W.
2006-01-01
State Analysis is a model-based systems engineering methodology employing a rigorous discovery process which articulates operations concepts and operability needs as an integrated part of system design. The process produces requirements on system and software design in the form of explicit models which describe the system behavior in terms of state variables and the relationships among them. By applying State Analysis to an actual MER flight mission scenario, this study addresses the specific real world challenges of complex space operations and explores technologies that can be brought to bear on future missions. The paper first describes the tools currently used on a daily basis for MER operations planning and provides an in-depth description of the planning process, in the context of a Martian day's worth of rover engineering activities, resource modeling, flight rules, science observations, and more. It then describes how State Analysis allows for the specification of a corresponding goal-based sequence that accomplishes the same objectives, with several important additional benefits.
Decibel: The Relational Dataset Branching System
Maddox, Michael; Goehring, David; Elmore, Aaron J.; Madden, Samuel; Parameswaran, Aditya; Deshpande, Amol
2017-01-01
As scientific endeavors and data analysis become increasingly collaborative, there is a need for data management systems that natively support the versioning or branching of datasets to enable concurrent analysis, cleaning, integration, manipulation, or curation of data across teams of individuals. Common practice for sharing and collaborating on datasets involves creating or storing multiple copies of the dataset, one for each stage of analysis, with no provenance information tracking the relationships between these datasets. This results not only in wasted storage, but also makes it challenging to track and integrate modifications made by different users to the same dataset. In this paper, we introduce the Relational Dataset Branching System, Decibel, a new relational storage system with built-in version control designed to address these shortcomings. We present our initial design for Decibel and provide a thorough evaluation of three versioned storage engine designs that focus on efficient query processing with minimal storage overhead. We also develop an exhaustive benchmark to enable the rigorous testing of these and future versioned storage engine designs. PMID:28149668
Evaluative Assessment for NASA/GSFC Equal Opportunity Programs Office Sponsored Programs
NASA Technical Reports Server (NTRS)
Jarrell, H. Judith
1995-01-01
The purpose of PREP (Pre-College Minority Engineering Program) is to upgrade skills of minority students who have shown an interest in pursuing academic degrees in electrical engineering. The goal is to upgrade skills needed for successful completion of the rigorous curriculum leading to a Bachelor of Science degree in engineering through a comprehensive upgrade of academic, study and interpersonal skills.
Integrated Sensitivity Analysis Workflow
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedman-Hill, Ernest J.; Hoffman, Edward L.; Gibson, Marcus J.
2014-08-01
Sensitivity analysis is a crucial element of rigorous engineering analysis, but performing such an analysis on a complex model is difficult and time consuming. The mission of the DART Workbench team at Sandia National Laboratories is to lower the barriers to adoption of advanced analysis tools through software integration. The integrated environment guides the engineer in the use of these integrated tools and greatly reduces the cycle time for engineering analysis.
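As a reminder of what even the simplest form of such an analysis looks like, here is a short Python sketch of a one-at-a-time parameter sweep; the model, parameter names, and perturbation size are invented for illustration, and this is not the DART Workbench itself.

# Illustrative one-at-a-time sensitivity sweep; just a sketch of the kind
# of study an integrated workbench is meant to orchestrate.
def model(thickness_mm, load_kN, yield_MPa):
    # Hypothetical margin-of-safety response for a simple bracket.
    area_mm2 = thickness_mm * 50.0
    stress_MPa = load_kN * 1000.0 / area_mm2
    return yield_MPa / stress_MPa - 1.0

baseline = {"thickness_mm": 4.0, "load_kN": 12.0, "yield_MPa": 250.0}

for name, value in baseline.items():
    responses = []
    for factor in (0.9, 1.1):  # perturb each input by +/-10%
        perturbed = dict(baseline)
        perturbed[name] = value * factor
        responses.append(model(**perturbed))
    swing = max(responses) - min(responses)
    print(f"{name}: response swing over +/-10% input change = {swing:.3f}")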
Performance of a Laser Ignited Multicylinder Lean Burn Natural Gas Engine
Almansour, Bader; Vasu, Subith; Gupta, Sreenath B.; ...
2017-06-06
Market demands for lower fueling costs and higher specific powers in stationary natural gas engines have engine designs trending towards higher in-cylinder pressures and leaner combustion operation. However, ignition remains the main limiting factor in achieving further performance improvements in these engines. Addressing this concern, while incorporating various recent advances in optics and laser technologies, laser igniters were designed and developed through numerous iterations. Final designs incorporated water-cooled, passively Q-switched, Nd:YAG micro-lasers that were optimized for stable operation under harsh engine conditions. Subsequently, the micro-lasers were installed in the individual cylinders of a lean-burn, 350 kW, inline 6-cylinder, open-chamber, spark ignited engine and tests were conducted. To the best of our knowledge, this is the world’s first demonstration of a laser ignited multi-cylinder natural gas engine. The engine was operated at high-load (298 kW) and rated speed (1800 rpm) conditions. Ignition timing sweeps and excess-air ratio (λ) sweeps were performed while keeping the NOx emissions below the USEPA regulated value (BSNOx < 1.34 g/kW-hr), and while maintaining ignition stability at industry acceptable values (COV_IMEP < 5%). Through such engine tests, the relative merits of (i) the standard electrical ignition system and (ii) the laser ignition system were determined. In conclusion, a rigorous combustion data analysis was performed and the main reasons leading to improved performance in the case of laser ignition were identified.
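To make the two quoted acceptance metrics concrete, a small Python calculation is sketched below; the per-cycle IMEP values and the NOx mass flow are invented numbers, and only the 298 kW brake power and the two limits are taken from the abstract.

import statistics

# Illustrative calculation of the two acceptance metrics quoted above.
imep_bar = [14.2, 13.9, 14.5, 14.1, 13.8, 14.3, 14.0, 14.4]  # per-cycle IMEP (assumed)

cov_imep = 100.0 * statistics.stdev(imep_bar) / statistics.mean(imep_bar)
print(f"COV of IMEP: {cov_imep:.2f}% (industry target quoted: < 5%)")

nox_mass_flow_g_per_hr = 380.0   # measured NOx mass flow, g/hr (assumed)
brake_power_kW = 298.0           # brake power from the abstract

bsnox = nox_mass_flow_g_per_hr / brake_power_kW
print(f"BSNOx: {bsnox:.2f} g/kW-hr (USEPA limit quoted: 1.34 g/kW-hr)")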
Thiel, Kati; Mulaku, Edita; Dandapani, Hariharan; Nagy, Csaba; Aro, Eva-Mari; Kallio, Pauli
2018-03-02
Photosynthetic cyanobacteria have been studied as potential host organisms for direct solar-driven production of different carbon-based chemicals from CO2 and water, as part of the development of sustainable future biotechnological applications. The engineering approaches, however, are still limited by the lack of comprehensive information on most optimal expression strategies and validated species-specific genetic elements which are essential for increasing the intricacy, predictability and efficiency of the systems. This study focused on the systematic evaluation of the key translational control elements, ribosome binding sites (RBS), in the cyanobacterial host Synechocystis sp. PCC 6803, with the objective of expanding the palette of tools for more rigorous engineering approaches. An expression system was established for the comparison of 13 selected RBS sequences in Synechocystis, using several alternative reporter proteins (sYFP2, codon-optimized GFPmut3 and ethylene forming enzyme) as quantitative indicators of the relative translation efficiencies. The set-up was shown to yield highly reproducible expression patterns in independent analytical series with low variation between biological replicates, thus allowing statistical comparison of the activities of the different RBSs in vivo. While the RBSs covered a relatively broad overall expression level range, the downstream gene sequence was demonstrated in a rigorous manner to have a clear impact on the resulting translational profiles. This was expected to reflect interfering sequence-specific mRNA-level interaction between the RBS and the coding region, yet correlation between potential secondary structure formation and observed translation levels could not be resolved with existing in silico prediction tools. The study expands our current understanding on the potential and limitations associated with the regulation of protein expression at translational level in engineered cyanobacteria. The acquired information can be used for selecting appropriate RBSs for optimizing over-expression constructs or multicistronic pathways in Synechocystis, while underlining the complications in predicting the activity due to gene-specific interactions which may reduce the translational efficiency for a given RBS-gene combination. Ultimately, the findings emphasize the need for additional characterized insulator sequence elements to decouple the interaction between the RBS and the coding region for future engineering approaches.
Saparova, D; Belden, J; Williams, J; Richardson, B; Schuster, K
2014-01-01
Federated medical search engines are health information systems that provide a single access point to different types of information. Their efficiency as clinical decision support tools has been demonstrated through numerous evaluations. Despite their rigor, very few of these studies report holistic evaluations of medical search engines and even fewer base their evaluations on existing evaluation frameworks. To evaluate a federated medical search engine, MedSocket, for its potential net benefits in an established clinical setting. This study applied the Human, Organization, and Technology (HOT-fit) evaluation framework in order to evaluate MedSocket. The hierarchical structure of the HOT-factors allowed for identification of a combination of efficiency metrics. Human fit was evaluated through user satisfaction and patterns of system use; technology fit was evaluated through the measurements of time-on-task and the accuracy of the found answers; and organization fit was evaluated from the perspective of system fit to the existing organizational structure. Evaluations produced mixed results and suggested several opportunities for system improvement. On average, participants were satisfied with MedSocket searches and confident in the accuracy of retrieved answers. However, MedSocket did not meet participants' expectations in terms of download speed, access to information, and relevance of the search results. These mixed results made it necessary to conclude that in the case of MedSocket, technology fit had a significant influence on the human and organization fit. Hence, improving technological capabilities of the system is critical before its net benefits can become noticeable. The HOT-fit evaluation framework was instrumental in tailoring the methodology for conducting a comprehensive evaluation of the search engine. Such multidimensional evaluation of the search engine resulted in recommendations for system improvement.
Evaluating a Federated Medical Search Engine
Belden, J.; Williams, J.; Richardson, B.; Schuster, K.
2014-01-01
Background: Federated medical search engines are health information systems that provide a single access point to different types of information. Their efficiency as clinical decision support tools has been demonstrated through numerous evaluations. Despite their rigor, very few of these studies report holistic evaluations of medical search engines and even fewer base their evaluations on existing evaluation frameworks. Objectives: To evaluate a federated medical search engine, MedSocket, for its potential net benefits in an established clinical setting. Methods: This study applied the Human, Organization, and Technology (HOT-fit) evaluation framework in order to evaluate MedSocket. The hierarchical structure of the HOT-factors allowed for identification of a combination of efficiency metrics. Human fit was evaluated through user satisfaction and patterns of system use; technology fit was evaluated through the measurements of time-on-task and the accuracy of the found answers; and organization fit was evaluated from the perspective of system fit to the existing organizational structure. Results: Evaluations produced mixed results and suggested several opportunities for system improvement. On average, participants were satisfied with MedSocket searches and confident in the accuracy of retrieved answers. However, MedSocket did not meet participants’ expectations in terms of download speed, access to information, and relevance of the search results. These mixed results made it necessary to conclude that in the case of MedSocket, technology fit had a significant influence on the human and organization fit. Hence, improving technological capabilities of the system is critical before its net benefits can become noticeable. Conclusions: The HOT-fit evaluation framework was instrumental in tailoring the methodology for conducting a comprehensive evaluation of the search engine. Such multidimensional evaluation of the search engine resulted in recommendations for system improvement. PMID:25298813
Interdisciplinary Interactions During R&D and Early Design of Large Engineered Systems
NASA Technical Reports Server (NTRS)
McGowan, Anna-Maria Rivas
2014-01-01
Designing Large-Scale Complex Engineered Systems (LaCES) such as aircraft and submarines requires the input of thousands of engineers and scientists whose work is proximate in neither time nor space. Comprehensive knowledge of the system is dispersed among specialists whose expertise is in typically one system component or discipline. This study examined the interactive work practices among such specialists seeking to improve engineering practice through a rigorous and theoretical understanding of current practice. This research explored current interdisciplinary practices and perspectives during R&D and early LaCES design and identified why these practices and perspectives prevail and persist. The research design consisted of a three-fold, integrative approach that combined an open-ended survey, semi-structured interviews, and ethnography. Significant empirical data from experienced engineers and scientists in a large engineering organization were obtained and integrated with theories from organization science and engineering. Qualitative analysis was used to obtain a holistic, contextualized understanding. The over-arching finding is that issues related to cognition, organization, and social interrelations mostly dominate interactions across disciplines. Engineering issues, such as the integration of hardware or physics-based models, are not as significant. For example, organization culture is an important underlying factor that guided researchers more toward individual sovereignty over cross-disciplinarity. The organization structure and the engineered system architecture also serve as constraints to the engineering work. Many differences in work practices were observed, including frequency and depth of interactions, definition or co-construction of requirements, clarity or creation of the system architecture, work group proximity, and cognitive challenges. Practitioners are often unaware of these differences resulting in confusion and incorrect assumptions regarding work expectations. Cognitively, the enactment and coconstruction of knowledge are the fundamental tasks of the interdisciplinary interactions. Distributed and collective cognition represent most of the efforts. Argument, ignorance, learning, and creativity are interrelated aspects of the interactions that cause discomfort but yield benefits such as problem mitigation, broader understanding, and improved system design and performance. The quality and quantity of social interrelations are central to all work across disciplines with reciprocity, respectful engagement, and heedful interrelations being significant to the effectiveness of the engineering and scientific work.
Lenas, Petros; Moos, Malcolm; Luyten, Frank P
2009-12-01
Recent advances in developmental biology, systems biology, and network science are converging to poise the heretofore largely empirical field of tissue engineering on the brink of a metamorphosis into a rigorous discipline based on universally accepted engineering principles of quality by design. Failure of more simplistic approaches to the manufacture of cell-based therapies has led to increasing appreciation of the need to imitate, at least to some degree, natural mechanisms that control cell fate and differentiation. The identification of many of these mechanisms, which in general are based on cell signaling pathways, is an important step in this direction. Some well-accepted empirical concepts of developmental biology, such as path-dependence, robustness, modularity, and semiautonomy of intermediate tissue forms, that appear sequentially during tissue development are starting to be incorporated in process design.
An Example-Centric Tool for Context-Driven Design of Biomedical Devices
ERIC Educational Resources Information Center
Dzombak, Rachel; Mehta, Khanjan; Butler, Peter
2015-01-01
Engineering is one of the most global professions, with design teams developing technologies for an increasingly interconnected and borderless world. In order for engineering students to be proficient in creating viable solutions to the challenges faced by diverse populations, they must receive an experiential education in rigorous engineering…
Increasing the reliability of ecological models using modern software engineering techniques
Robert M. Scheller; Brian R. Sturtevant; Eric J. Gustafson; Brendan C. Ward; David J. Mladenoff
2009-01-01
Modern software development techniques are largely unknown to ecologists. Typically, ecological models and other software tools are developed for limited research purposes, and additional capabilities are added later, usually in an ad hoc manner. Modern software engineering techniques can substantially increase scientific rigor and confidence in ecological models and...
Modeling Tools for Propulsion Analysis and Computational Fluid Dynamics on the Internet
NASA Technical Reports Server (NTRS)
Muss, J. A.; Johnson, C. W.; Gotchy, M. B.
2000-01-01
The existing RocketWeb(TradeMark) Internet Analysis System (http://www.johnsonrockets.com/rocketweb) provides an integrated set of advanced analysis tools that can be securely accessed over the Internet. Since these tools consist of both batch and interactive analysis codes, the system includes convenient methods for creating input files and evaluating the resulting data. The RocketWeb(TradeMark) system also contains many features that permit data sharing which, when further developed, will facilitate real-time, geographically diverse, collaborative engineering within a designated work group. Adding work group management functionality while simultaneously extending and integrating the system's set of design and analysis tools will create a system providing rigorous, controlled design development, reducing design cycle time and cost.
Spacecraft Testing Programs: Adding Value to the Systems Engineering Process
NASA Technical Reports Server (NTRS)
Britton, Keith J.; Schaible, Dawn M.
2011-01-01
Testing has long been recognized as a critical component of spacecraft development activities - yet many major systems failures may have been prevented with more rigorous testing programs. The question is why is more testing not being conducted? Given unlimited resources, more testing would likely be included in a spacecraft development program. Striking the right balance between too much testing and not enough has been a long-term challenge for many industries. The objective of this paper is to discuss some of the barriers, enablers, and best practices for developing and sustaining a strong test program and testing team. This paper will also explore the testing decision factors used by managers; the varying attitudes toward testing; methods to develop strong test engineers; and the influence of behavior, culture and processes on testing programs. KEY WORDS: Risk, Integration and Test, Validation, Verification, Test Program Development
Plouchart, Diane; Guizard, Guillaume; Latrille, Eric
2018-01-01
Continuous cultures in chemostats have proven their value in microbiology, microbial ecology, systems biology and bioprocess engineering, among others. In these systems, microbial growth and ecosystem performance can be quantified under stable and defined environmental conditions. This is essential when linking microbial diversity to ecosystem function. Here, a new system to test this link in anaerobic, methanogenic microbial communities is introduced. Rigorously replicated experiments or a suitable experimental design typically require operating several chemostats in parallel. However, this is labor intensive, especially when measuring biogas production. Commercial solutions for multiplying reactors performing continuous anaerobic digestion exist but are expensive and use comparably large reactor volumes, requiring the preparation of substantial amounts of media. Here, a flexible Lab-scale Automated and Multiplexed Anaerobic Chemostat system (LAMACs) with a working volume of 200 mL is introduced. Sterile feeding, biomass wasting and pressure monitoring are automated. One module containing six reactors fits the typical dimensions of a lab bench. Thanks to automation, the time required for reactor operation and maintenance is reduced compared to traditional lab-scale systems. Several modules can be used together, and so far the parallel operation of 30 reactors was demonstrated. The chemostats are autoclavable. Parameters like reactor volume, flow rates and operating temperature can be freely set. The robustness of the system was tested in a two-month long experiment in which three inocula in four replicates, i.e., twelve continuous digesters were monitored. Statistically significant differences in the biogas production between inocula were observed. In anaerobic digestion, biogas production and consequently pressure development in a closed environment is a proxy for ecosystem performance. The precision of the pressure measurement is thus crucial. The measured maximum and minimum rates of gas production could be determined at the same precision. The LAMACs is a tool that enables us to put in practice the often-demanded need for replication and rigorous testing in microbial ecology as well as bioprocess engineering. PMID:29518106
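Since the abstract uses headspace pressure as the proxy for biogas production, a short Python sketch of the standard ideal-gas conversion is included below; the reactor geometry, temperature, and pressure reading are assumed values for illustration, not data from the paper.

# Standard ideal-gas conversion from a measured headspace pressure rise to
# a biogas volume at normal conditions.
R = 8.314            # J / (mol K)
T_reactor = 310.15   # K (37 degC incubation, assumed)
headspace_L = 0.10   # L of gas headspace above the 200 mL working volume (assumed)
delta_p_Pa = 5000.0  # measured pressure rise before venting (assumed)

# Moles of gas produced, n = (dP * V) / (R * T)
n_gas = delta_p_Pa * (headspace_L / 1000.0) / (R * T_reactor)

# Express as a volume at normal conditions (0 degC, 101325 Pa); 1 m^3 = 1e6 mL.
V_norm_mL = n_gas * R * 273.15 / 101325.0 * 1e6
print(f"Biogas produced: {n_gas*1000:.3f} mmol = {V_norm_mL:.1f} mL at normal conditions")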
Done in 60 seconds- See a Massive Rocket Fuel Tank Built in A Minute
2016-08-18
The 7.5-minute test conducted at NASA’s Stennis Space Center is part of a series of tests designed to put the upgraded former space shuttle engines through the rigorous temperature and pressure conditions they will experience during a launch. The tests also support the development of a new controller, or “brain,” for the engine, which monitors engine status and communicates between the rocket and the engine, relaying commands to the engine and transmitting data back to the rocket.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kok, Koen; Widergren, Steve
Secure, Clean and Efficient Energy is one of the great societal challenges of our time. Electricity as a sustainable energy carrier plays a central role in the most effective transition scenarios towards sustainability. To harness this potential, the current electricity infrastructure needs to be rigorously re-engineered into an integrated and intelligent electricity system: the smart grid. Key elements of the smart grid vision are the coordination mechanisms. In such a system, vast numbers of devices, currently just passively connected to the grid, will become actively involved in system-wide and local coordination tasks. In this light, transactive energy (TE) is emerging as a strong contender for orchestrating the coordinated operation of so many devices.
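The coordination idea can be illustrated with a toy price-based (transactive) clearing loop in Python; the device models, parameters, and the bisection search below are invented for illustration and do not describe any specific TE mechanism discussed by the authors.

# Toy sketch of price-based (transactive) coordination: a coordinator
# adjusts a price signal until the aggregate demand bid by flexible
# devices matches the available supply. Device parameters are invented.
def device_demand(price, max_kw, willingness):
    # Each device consumes less as the price rises toward its willingness to pay.
    return max(0.0, max_kw * (1.0 - price / willingness))

devices = [
    {"max_kw": 3.0, "willingness": 0.30},   # e.g. an EV charger
    {"max_kw": 1.5, "willingness": 0.20},   # e.g. a heat pump
    {"max_kw": 0.8, "willingness": 0.50},   # e.g. a must-run appliance
]
supply_kw = 2.5

lo, hi = 0.0, 1.0          # price search interval, $/kWh
for _ in range(50):        # simple bisection on the market-clearing price
    price = 0.5 * (lo + hi)
    demand = sum(device_demand(price, d["max_kw"], d["willingness"]) for d in devices)
    if demand > supply_kw:
        lo = price
    else:
        hi = price

print(f"Clearing price ~ ${price:.3f}/kWh, total demand ~ {demand:.2f} kW")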
Application of State Analysis and Goal-Based Operations to a MER Mission Scenario
NASA Technical Reports Server (NTRS)
Morris, J. Richard; Ingham, Michel D.; Mishkin, Andrew H.; Rasmussen, Robert D.; Starbird, Thomas W.
2006-01-01
State Analysis is a model-based systems engineering methodology employing a rigorous discovery process which articulates operations concepts and operability needs as an integrated part of system design. The process produces requirements on system and software design in the form of explicit models which describe the behavior of states and the relationships among them. By applying State Analysis to an actual MER flight mission scenario, this study addresses the specific real world challenges of complex space operations and explores technologies that can be brought to bear on future missions. The paper describes the tools currently used on a daily basis for MER operations planning and provides an in-depth description of the planning process, in the context of a Martian day's worth of rover engineering activities, resource modeling, flight rules, science observations, and more. It then describes how State Analysis allows for the specification of a corresponding goal-based sequence that accomplishes the same objectives, with several important additional benefits.
Technical Submission Form: Technical Specification of a Wave Energy Farm.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roberts, Jesse D.; Nielsen, Kim; Kennedy, Ben
The Wave-SPARC project developed the Technology Performance Level (TPL) assessment procedure based on a rigorous Systems Engineering exercise. The TPL assessment allows a whole system evaluation of Wave Energy Conversion Technology by measuring it against the requirements determined through the Systems Engineering exercise. The TPL assessment is intended to be useful in technology evaluation; in technology innovation; in allocation of public or private investment; and in making equipment purchasing decisions. This Technical Submission Form (TSF) serves the purpose of collecting relevant and complete information, in a technology agnostic way, to allow TPL assessments to be made by third party assessors. The intended usage of this document is that the organization or people that are performing the role of developers or promoters of a particular technology will use this form to provide the information necessary for the organization or people who are performing the assessor role to use the TPL assessment.
Thermodynamic fingerprints of non-Markovianity in a system of coupled superconducting qubits
NASA Astrophysics Data System (ADS)
Hamedani Raja, Sina; Borrelli, Massimo; Schmidt, Rebecca; Pekola, Jukka P.; Maniscalco, Sabrina
2018-03-01
The exploitation and characterization of memory effects arising from the interaction between system and environment is a key prerequisite for quantum reservoir engineering beyond the standard Markovian limit. In this paper we investigate a prototype of non-Markovian dynamics experimentally implementable with superconducting qubits. We rigorously quantify non-Markovianity, highlighting the effects of the environmental temperature on the Markovian to non-Markovian crossover. We investigate how memory effects influence, and specifically suppress, the ability to perform work on the driven qubit. We show that the average work performed on the qubit can be used as a diagnostic tool to detect the presence or absence of memory effects.
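For readers unfamiliar with how memory effects are typically quantified, one widely used quantifier is the Breuer-Laine-Piilo (BLP) measure based on revivals of the trace distance between two evolving system states; whether this is the exact measure employed in the paper is not stated here, so the LaTeX lines below are offered only as a reference definition.

\[
D(\rho_1,\rho_2) = \tfrac{1}{2}\,\mathrm{Tr}\,|\rho_1-\rho_2| , \qquad
\sigma(t) = \frac{d}{dt}\,D\!\left(\rho_1(t),\rho_2(t)\right) , \qquad
\mathcal{N} = \max_{\rho_{1,2}(0)} \int_{\sigma>0} \sigma(t)\,dt .
\]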
Navigation Ground Data System Engineering for the Cassini/Huygens Mission
NASA Technical Reports Server (NTRS)
Beswick, R. M.; Antreasian, P. G.; Gillam, S. D.; Hahn, Y.; Roth, D. C.; Jones, J. B.
2008-01-01
The launch of the Cassini/Huygens mission on October 15, 1997, began a seven year journey across the solar system that culminated in the entry of the spacecraft into Saturnian orbit on June 30, 2004. Cassini/Huygens Spacecraft Navigation is the result of a complex interplay between several teams within the Cassini Project, performed on the Ground Data System. The work of Spacecraft Navigation involves rigorous requirements for accuracy and completeness carried out often under uncompromising critical time pressures. To support the Navigation function, a fault-tolerant, high-reliability/high-availability computational environment was necessary to support data processing. Configuration Management (CM) was integrated with fault tolerant design and security engineering, according to the cornerstone principles of Confidentiality, Integrity, and Availability. Integrated with this approach are security benchmarks and validation to meet strict confidence levels. In addition, similar approaches to CM were applied in consideration of the staffing and training of the system administration team supporting this effort. As a result, the current configuration of this computational environment incorporates a secure, modular system, that provides for almost no downtime during tour operations.
Modeling Off-Nominal Behavior in SysML
NASA Technical Reports Server (NTRS)
Day, John; Donahue, Kenny; Ingham, Mitch; Kadesch, Alex; Kennedy, Kit; Post, Ethan
2012-01-01
Fault Management is an essential part of the system engineering process that is limited in its effectiveness by the ad hoc nature of the applied approaches and methods. Providing a rigorous way to develop and describe off-nominal behavior is a necessary step in the improvement of fault management, and as a result, will enable safe, reliable and available systems even as system complexity increases. The basic concepts described in this paper provide a foundation to build a larger set of necessary concepts and relationships for precise modeling of off-nominal behavior, and a basis for incorporating these ideas into the overall systems engineering process. The simple FMEA example provided applies the modeling patterns we have developed and illustrates how the information in the model can be used to reason about the system and derive typical fault management artifacts. A key insight from the FMEA work was the utility of defining failure modes as the "inverse of intent", and deriving this from the behavior models. Additional work is planned to extend these ideas and capabilities to other types of relevant information and additional products.
Validating the Use of Performance Risk Indices for System-Level Risk and Maturity Assessments
NASA Astrophysics Data System (ADS)
Holloman, Sherrica S.
With pressure on the U.S. Defense Acquisition System (DAS) to reduce cost overruns and schedule delays, system engineers' performance is only as good as their tools. Recent literature details a need for 1) objective, analytical risk quantification methodologies over traditional subjective qualitative methods, such as expert judgment, and 2) mathematically rigorous system-level maturity assessments. The Mahafza, Componation, and Tippett (2005) Technology Performance Risk Index (TPRI) ties the assessment of technical performance to the quantification of risk of unmet performance; however, it is structured for component-level data as input. This study's aim is to establish a modified TPRI with system-level data as model input, and then validate the modified index with actual system-level data from the Department of Defense's (DoD) Major Defense Acquisition Programs (MDAPs). This work's contribution is the establishment and validation of the System-level Performance Risk Index (SPRI). With the introduction of the SPRI, system-level metrics are better aligned, allowing for better assessment, tradeoff and balance of time, performance and cost constraints. This will allow system engineers and program managers to ultimately make better-informed system-level technical decisions throughout the development phase.
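The TPRI/SPRI formulations themselves are not reproduced in this abstract, so the Python sketch below only illustrates the generic idea of tying a technical performance estimate to a probability of unmet performance; the function, its normality assumption, and the example numbers are hypothetical and are not the published index.

import math

# Hypothetical illustration only: one generic way to turn a technical
# performance estimate and its uncertainty into a "risk of unmet performance".
def risk_of_unmet_performance(current_estimate, requirement, std_dev, higher_is_better=True):
    """Probability that delivered performance misses the requirement,
    assuming the final value is normally distributed about the current estimate."""
    z = (requirement - current_estimate) / std_dev
    p_below_requirement = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return p_below_requirement if higher_is_better else 1.0 - p_below_requirement

# Example: payload mass margin (kg) where larger is better. Values invented.
risk = risk_of_unmet_performance(current_estimate=120.0, requirement=100.0, std_dev=15.0)
print(f"Estimated risk of not meeting the requirement: {risk:.1%}")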
Revisiting classical design in engineering from a perspective of frugality.
Rao, Balkrishna C
2017-05-01
The conservative nature of design in engineering has typically unleashed products fabricated with generous amounts of raw materials. This is epitomized by the factor of safety, whose values higher than unity suggest various uncertainties of design that are tackled through material padding. This effort proposes a new factor of safety called the factor of frugality that could be used in ecodesign and which addresses both rigors of the classical design process and quantification of savings in materials going into a product. An example of frugal shaft design, together with some other cases, has been presented to explain the working of the factor of frugality. Adoption of the frugality factor would entail a change in design philosophy whereby designers would constantly make use of a rigorous design process coupled with material-saving schemes for realizing products that are benign to the environment. Such a change in the foundations of design would abet the stewardship of earth in avoiding planetary boundaries, since engineering influences a significant proportion of human endeavors.
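To give a flavor of the shaft example, here is a short Python sketch of conventional torsional sizing with a factor of safety, followed by a simple material-use ratio standing in for a frugality measure; the torque, material properties, and the ratio itself are assumptions for illustration and are not the paper's actual definition of the factor of frugality.

import math

# Textbook torsional sizing of a solid shaft with a factor of safety,
# followed by a hypothetical "frugality" ratio comparing material use of a
# conservative design against a leaner one.
def shaft_diameter(torque_Nm, yield_shear_MPa, factor_of_safety):
    tau_allow = yield_shear_MPa / factor_of_safety            # MPa
    # Solid circular shaft: tau = 16 T / (pi d^3)  ->  d = (16 T / (pi tau))^(1/3)
    d_m = (16.0 * torque_Nm / (math.pi * tau_allow * 1e6)) ** (1.0 / 3.0)
    return d_m * 1000.0                                        # mm

torque = 500.0          # N*m, assumed service torque
yield_shear = 180.0     # MPa, assumed material shear strength

d_conservative = shaft_diameter(torque, yield_shear, factor_of_safety=3.0)
d_lean = shaft_diameter(torque, yield_shear, factor_of_safety=1.5)

# Material use scales with cross-sectional area (d^2) for equal shaft length.
frugality_ratio = (d_conservative / d_lean) ** 2
print(f"Conservative design: d = {d_conservative:.1f} mm")
print(f"Leaner design:       d = {d_lean:.1f} mm")
print(f"Material saved factor (area ratio): {frugality_ratio:.2f}")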
NASA Technical Reports Server (NTRS)
Henke, Luke
2010-01-01
The ICARE method is a flexible, widely applicable method for systems engineers to solve problems and resolve issues in a complete and comprehensive manner. The method can be tailored by diverse users for direct application to their function (e.g. system integrators, design engineers, technical discipline leads, analysts, etc.). The clever acronym, ICARE, instills the attitude of accountability, safety, technical rigor and engagement in the problem resolution: Identify, Communicate, Assess, Report, Execute (ICARE). This method was developed through observation of the Space Shuttle Propulsion Systems Engineering and Integration (PSE&I) office personnel's approach, in an attempt to succinctly describe the actions of an effective systems engineer. Additionally, it evolved from an effort to make a broadly-defined checklist for a PSE&I worker to perform their responsibilities in an iterative and recursive manner. The National Aeronautics and Space Administration (NASA) Systems Engineering Handbook states that engineering of NASA systems requires a systematic and disciplined set of processes that are applied recursively and iteratively for the design, development, operation, maintenance, and closeout of systems throughout the life cycle of the programs and projects. ICARE is a method that can be applied within the boundaries and requirements of NASA's systems engineering set of processes to provide an elevated sense of duty and responsibility to crew and vehicle safety. The importance of a disciplined set of processes and a safety-conscious mindset increases with the complexity of the system. Moreover, the larger the system and the larger the workforce, the more important it is to encourage the usage of the ICARE method as widely as possible. According to the NASA Systems Engineering Handbook, elements of a system can include people, hardware, software, facilities, policies and documents; all things required to produce system-level results, qualities, properties, characteristics, functions, behavior and performance. The ICARE method can be used to improve all elements of a system and, consequently, the system-level functional, physical and operational performance. Even though ICARE was specifically designed for a systems engineer, any person whose job is to examine another person, product, or process can use the ICARE method to improve effectiveness, implementation, usefulness, value, capability, efficiency, integration, design, and/or marketability. This paper provides the details of the ICARE method, emphasizing the method's application to systems engineering. In addition, a sample of other, non-systems engineering applications is briefly discussed to demonstrate how ICARE can be tailored to a variety of diverse jobs (from project management to parenting).
Characterizing Learning-through-Service Students in Engineering by Gender and Academic Year
ERIC Educational Resources Information Center
Carberry, Adam Robert
2010-01-01
Service is increasingly being viewed as an integral part of education nationwide. Service-based courses and programs are growing in popularity as opportunities for students to learn and experience their discipline. Widespread adoption of learning-through-service (LTS) in engineering is stymied by a lack of a body of rigorous research supporting…
Software Analyzes Complex Systems in Real Time
NASA Technical Reports Server (NTRS)
2008-01-01
Expert system software programs, also known as knowledge-based systems, are computer programs that emulate the knowledge and analytical skills of one or more human experts in a specific subject. SHINE (Spacecraft Health Inference Engine) is one such program: a software inference engine (expert system) designed by NASA for monitoring, analyzing, and diagnosing both real-time and non-real-time systems. It was developed to meet many of the Agency's demanding and rigorous artificial intelligence goals for current and future needs. NASA developed the sophisticated and reusable software based on the experience and requirements of its Jet Propulsion Laboratory's (JPL) Artificial Intelligence Research Group in developing expert systems for space flight operations, specifically the diagnosis of spacecraft health. It was designed to be efficient enough to operate in demanding real-time and limited-hardware environments, and to be usable by non-expert-system applications written in conventional programming languages. The technology is currently used in several ongoing NASA applications, including the Mars Exploration Rovers and the Spacecraft Health Automatic Reasoning Pilot (SHARP) program for the diagnosis of telecommunication anomalies during the Voyager Neptune encounter. It is also finding applications outside of the Space Agency.
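SHINE itself is a NASA-developed inference engine; the hedged sketch below is not its API, but a generic, minimal forward-chaining rule evaluator of the kind such knowledge-based monitors are built on, with invented rule and telemetry names.

```python
# Minimal forward-chaining rule engine (illustrative only; not the SHINE API).
# Facts are a dict of telemetry values; rules map a condition to a derived fact.

def run_rules(facts, rules):
    """Repeatedly apply rules until no new facts can be derived."""
    derived = dict(facts)
    changed = True
    while changed:
        changed = False
        for condition, conclusion in rules:
            name, value = conclusion
            if name not in derived and condition(derived):
                derived[name] = value
                changed = True
    return derived

rules = [
    (lambda f: f.get("battery_temp_c", 0) > 60, ("battery_overtemp", True)),
    (lambda f: f.get("battery_overtemp") and f.get("charge_rate_a", 0) > 1.0,
     ("diagnosis", "charging fault suspected")),
]

telemetry = {"battery_temp_c": 72.5, "charge_rate_a": 1.4}
print(run_rules(telemetry, rules))
```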
Handbook of applied mathematics for engineers and scientists
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurtz, M.
1991-12-31
This book is intended to be a reference for applications of mathematics in a wide range of topics of interest to engineers and scientists. An unusual feature of this book is that it covers a large number of topics, from elementary algebra, trigonometry, and calculus to computer graphics and cybernetics. The level of mathematics ranges from high school through about the junior level of an engineering curriculum at a major university. Throughout, the emphasis is on applications of mathematics rather than on rigorous proofs.
A Thermal Management Systems Model for the NASA GTX RBCC Concept
NASA Technical Reports Server (NTRS)
Traci, Richard M.; Farr, John L., Jr.; Laganelli, Tony; Walker, James (Technical Monitor)
2002-01-01
The Vehicle Integrated Thermal Management Analysis Code (VITMAC) was further developed to aid the analysis, design, and optimization of propellant and thermal management concepts for advanced propulsion systems. The computational tool is based on engineering-level principles and models. A graphical user interface (GUI) provides a simple and straightforward method to assess and evaluate multiple concepts before undertaking more rigorous analysis of candidate systems. The tool incorporates the Chemical Equilibrium with Applications (CEA) program and the RJPA code to permit heat transfer analysis of both rocket and air-breathing propulsion systems. Key parts of the code have been validated with experimental data. The tool was specifically tailored to analyze rocket-based combined-cycle (RBCC) propulsion systems being considered for space transportation applications. This report describes the computational tool and its development and verification for NASA GTX RBCC propulsion system applications.
Bioengineering Solutions for Manufacturing Challenges in CAR T Cells
Piscopo, Nicole J.; Mueller, Katherine P.; Das, Amritava; Hematti, Peiman; Murphy, William L.; Palecek, Sean P.; Capitini, Christian M.
2017-01-01
The next generation of therapeutic products to be approved for the clinic is anticipated to be cell therapies, termed “living drugs” for their capacity to dynamically and temporally respond to changes during their production ex vivo and after their administration in vivo. Genetically engineered chimeric antigen receptor (CAR) T cells have rapidly developed into powerful tools to harness the power of immune system manipulation against cancer. Regulatory agencies are beginning to approve CAR T cell therapies due to their striking efficacy in treating some hematological malignancies. However, the engineering and manufacturing of such cells remains a challenge for widespread adoption of this technology. Bioengineering approaches including biomaterials, synthetic biology, metabolic engineering, process control and automation, and in vitro disease modeling could offer promising methods to overcome some of these challenges. Here, we describe the manufacturing process of CAR T cells, highlighting potential roles for bioengineers to partner with biologists and clinicians to advance the manufacture of these complex cellular products under rigorous regulatory and quality control. PMID:28840981
ERIC Educational Resources Information Center
Mattson, Beverly
2011-01-01
One of the competitive priorities of the U.S. Department of Education's Race to the Top applications addressed science, technology, engineering, and mathematics (STEM). States that applied were required to submit plans that addressed rigorous courses of study, cooperative partnerships to prepare and assist teachers in STEM content, and prepare…
Simscape Modeling Verification in the Simulink Development Environment
NASA Technical Reports Server (NTRS)
Volle, Christopher E. E.
2014-01-01
The purpose of the Simulation Product Group of the Control and Data Systems division of the NASA Engineering branch at Kennedy Space Center is to provide a real-time model and simulation of the Ground Subsystems participating in vehicle launching activities. The simulation software is part of the Spaceport Command and Control System (SCCS) and is designed to support integrated launch operation software verification and console operator training. Using MathWorks Simulink tools, modeling engineers currently build models from custom-built blocks to accurately represent ground hardware. This is time consuming and costly because rigorous testing and peer reviews must be conducted for each custom-built block. Using MathWorks Simscape tools, modeling time can be reduced since no custom code would need to be developed. After careful research, the group concluded that it is feasible to use Simscape blocks in MATLAB's Simulink. My project this fall was to verify the accuracy of the Crew Access Arm model developed using Simscape tools running in the Simulink development environment.
Engineering research, development and technology FY99
DOE Office of Scientific and Technical Information (OSTI.GOV)
Langland, R T
The growth of computer power and connectivity, together with advances in wireless sensing and communication technologies, is transforming the field of complex distributed systems. The ability to deploy large numbers of sensors with a rapid, broadband communication system will enable high-fidelity, near real-time monitoring of complex systems. These technological developments will provide unprecedented insight into the actual performance of engineered and natural environment systems, enable the evolution of many new types of engineered systems for monitoring and detection, and enhance our ability to perform improved and validated large-scale simulations of complex systems. One of the challenges facing engineering is to develop methodologies to exploit the emerging information technologies. Particularly important will be the ability to assimilate measured data into the simulation process in a way which is much more sophisticated than current, primarily ad hoc procedures. The reports contained in this section on the Center for Complex Distributed Systems describe activities related to the integrated engineering of large complex systems. The first three papers describe recent developments for each link of the integrated engineering process for large structural systems. These include (1) the development of model-based signal processing algorithms which will formalize the process of coupling measurements and simulation and provide a rigorous methodology for validation and update of computational models; (2) collaborative efforts with faculty at the University of California at Berkeley on the development of massive simulation models for the earth and large bridge structures; and (3) the development of wireless data acquisition systems which provide a practical means of monitoring large systems like the National Ignition Facility (NIF) optical support structures. These successful developments are coming to a confluence in the next year with applications to NIF structural characterizations and analysis of large bridge structures for the State of California. Initial feasibility investigations into the development of monitoring and detection systems are described in the papers on imaging of underground structures with ground-penetrating radar, and the use of live insects as sensor platforms. These efforts are establishing the basic performance characteristics essential to the decision process for future development of sensor arrays for information gathering related to national security.
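The model-based coupling of measurements and simulation mentioned above is commonly formalized as state estimation; the hedged sketch below is a generic one-dimensional Kalman filter, not the laboratory's specific algorithms, with all parameter values assumed for illustration.

```python
def kalman_1d(measurements, x0=0.0, p0=1.0, process_var=1e-3, meas_var=0.25):
    """Scalar Kalman filter assimilating noisy measurements into a constant-state model."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p += process_var                 # predict: state unchanged, uncertainty grows
        k = p / (p + meas_var)           # Kalman gain
        x += k * (z - x)                 # update with the measurement residual
        p *= (1.0 - k)
        estimates.append(x)
    return estimates

noisy = [1.2, 0.8, 1.1, 0.95, 1.05, 1.0, 0.9, 1.1]
print([round(v, 3) for v in kalman_1d(noisy)])
```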
NASA Technical Reports Server (NTRS)
Cornford, Steven L.; Feather, Martin S.
2016-01-01
This report explores the current state of the art of Safety and Mission Assurance (S&MA) in projects that have shifted towards Model Based Systems Engineering (MBSE). Its goal is to provide insight into how NASA's Office of Safety and Mission Assurance (OSMA) should respond to this shift. In MBSE, systems engineering information is organized and represented in models: rigorous computer-based representations, which collectively make many activities easier to perform, less error prone, and scalable. S&MA practices must shift accordingly. The "Objective Structure Hierarchies" recently developed by OSMA provide the framework for understanding this shift. Although the objectives themselves will remain constant, S&MA practices (activities, processes, tools) to achieve them are subject to change. This report presents insights derived from literature studies and interviews. The literature studies gleaned assurance implications from reports of space-related applications of MBSE. The interviews with knowledgeable S&MA and MBSE personnel discovered concerns and ideas for how assurance may adapt. Preliminary findings and observations are presented on the state of practice of S&MA with respect to MBSE, how it is already changing, and how it is likely to change further. Finally, recommendations are provided on how to foster the evolution of S&MA to best fit with MBSE.
NASA Astrophysics Data System (ADS)
Sokolovskiy, Vladimir; Grünebohm, Anna; Buchelnikov, Vasiliy; Entel, Peter
2014-09-01
This special issue collects contributions from the participants of the "Information in Dynamical Systems and Complex Systems" workshop, which cover a wide range of important problems and new approaches that lie in the intersection of information theory and dynamical systems. The contributions include theoretical characterization and understanding of the different types of information flow and causality in general stochastic processes, inference and identification of coupling structure and parameters of system dynamics, rigorous coarse-grain modeling of network dynamical systems, and exact statistical testing of fundamental information-theoretic quantities such as the mutual information. The collective efforts reported herein reflect a modern perspective of the intimate connection between dynamical systems and information flow, leading to the promise of better understanding and modeling of natural complex systems and better/optimal design of engineering systems.
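As one concrete example of the information-theoretic quantities mentioned, the hedged sketch below estimates mutual information between two discretized signals from plug-in histogram probabilities; it is a generic illustration, not a method from any contribution in the issue.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Plug-in estimate of mutual information (bits) between two 1-D signals."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0                                     # avoid log(0)
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = 0.8 * x + 0.6 * rng.normal(size=5000)            # coupled signal
print(f"MI(x, y) ~ {mutual_information(x, y):.3f} bits")
```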
ERIC Educational Resources Information Center
Francis, Clay
2018-01-01
Historic notions of academic rigor usually follow from critiques of the system--we often define our goals for academically rigorous work through the lens of our shortcomings. This chapter discusses how the Truman Commission in 1947 and the Spellings Commission in 2006 shaped the way we think about academic rigor in today's context.
Software engineering methodologies and tools
NASA Technical Reports Server (NTRS)
Wilcox, Lawrence M.
1993-01-01
Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed, and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has increased only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for a general evaluation of computer-aided software engineering (CASE) tools, based on actual installation of and experimentation with some specific tools.
NASA Technical Reports Server (NTRS)
Menrad, Robert J.; Larson, Wiley J.
2008-01-01
This paper shares the findings of NASA's Integrated Learning and Development Program (ILDP) in its effort to reinvigorate the hands-on practice of space systems engineering and project/program management through focused coursework, training opportunities, on-the-job learning, and special assignments. Prior to March 2005, NASA responsibility for technical workforce development (the program/project manager, systems engineering, discipline engineering, and associated communities) was executed by two parallel organizations. In March 2005 these organizations merged. The resulting program, ILDP, was chartered to implement an integrated competency-based development model capable of enhancing NASA's technical workforce performance as they face the complex challenges of Earth science, space science, aeronautics, and human spaceflight missions. Results developed in collaboration with NASA Field Centers are reported. This work led to the definition of the agency's first integrated technical workforce development model, known as the Requisite Occupation Competence and Knowledge (the ROCK). Critical processes and products are presented, including 'validation' techniques to guide model development, the Design-A-CUrriculuM (DACUM) process, and creation of the agency's first systems engineering body of knowledge. Findings were validated via nine focus groups from industry and government and with over 17 space-related organizations, at an estimated cost exceeding $300,000 (US). Master's-level programs and training programs have evolved to address the needs of these practitioner communities based upon these results. The ROCK reintroduced rigor and depth to practitioners' development in these critical disciplines, enabling them to take mission concepts from imagination to reality.
ERIC Educational Resources Information Center
Sacramento City Unified School District, CA.
The Academy of Math, Science, and Engineering was established at the Luther Burbank High School of Sacramento, California as a rigorous and competitive academic alternative program. This report contains an evaluation of the second year (1984-85) of the program. Program accomplishments are reviewed in the categories of: (1) student enrollment; (2)…
Learning from Science and Sport - How we, Safety, "Engage with Rigor"
NASA Astrophysics Data System (ADS)
Herd, A.
2012-01-01
As the world of spaceflight safety is relatively small and potentially inward-looking, we need to be aware of the "outside world". We should then try to remind ourselves to be open to the possibility that data, knowledge or experience from outside the spaceflight community may provide some constructive alternate perspectives. This paper will assess aspects of two seemingly tangential fields, science and sport, and align these with the world of safety. In doing so it will give some useful insights into the challenges we face and may provide solutions relevant to our everyday work of safety engineering. Sport, particularly a contact sport such as rugby union, requires direct interaction between members of two (opposing) teams: professional, accurately timed and positioned interaction for a desired outcome. These interactions, whilst an essential part of the game, are not without their constraints. The rugby scrum has constraints as to the formation and engagement of the two teams; the controlled engagement provides for an interaction between the two teams in a safe manner, the constraints arising from the reality that an incorrect engagement could cause serious injury to members of either team. In academia, scientific rigor is applied to assure that the arguments provided and the conclusions drawn in academic papers presented for publication are valid, legitimate and credible. The need for rigor may be expressed in the example of achieving a statistically relevant sample size, n, in order to assure the validity of analysis of the data pool; a failure to apply rigor could place the entire study at risk of failing to have the respective paper published. This paper will consider the merits of these two different aspects, scientific rigor and sports engagement, and offer a reflective look at how they may provide a "modus operandi" for safety engineers at any level, whether at their desks (creating or reviewing safety assessments) or in a safety review meeting (providing a verbal critique of the presented safety case).
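As a hedged illustration of the sample-size example mentioned above, the sketch below computes the classical sample size needed to estimate a proportion to a given margin of error at a given confidence level; it is a textbook formula, not taken from the paper.

```python
from math import ceil
from statistics import NormalDist

def sample_size_for_proportion(margin_of_error, confidence=0.95, p=0.5):
    """Classical sample size n = z^2 * p(1-p) / E^2 for estimating a proportion."""
    z = NormalDist().inv_cdf(0.5 + confidence / 2.0)
    return ceil(z ** 2 * p * (1.0 - p) / margin_of_error ** 2)

# e.g. +/-5% margin at 95% confidence, worst-case proportion of 0.5
print(sample_size_for_proportion(0.05))   # about 385 samples
```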
Goal-Function Tree Modeling for Systems Engineering and Fault Management
NASA Technical Reports Server (NTRS)
Johnson, Stephen B.; Breckenridge, Jonathan T.
2013-01-01
This paper describes a new representation that enables rigorous definition and decomposition of both nominal and off-nominal system goals and functions: the Goal-Function Tree (GFT). GFTs extend the concept and process of functional decomposition, utilizing state variables as a key mechanism to ensure physical and logical consistency and completeness of the decomposition of goals (requirements) and functions, and enabling full and complete traceability to the design. The GFT also provides a means to define and represent off-nominal goals and functions that are activated when the system's nominal goals are not met. The physical accuracy of the GFT, and its ability to represent both nominal and off-nominal goals, enable the GFT to be used for various analyses of the system, including assessments of the completeness and traceability of system goals and functions, the coverage of fault management failure detections, and the definition of system failure scenarios.
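The sketch below is a hedged, minimal illustration of a goal tree node tied to state-variable constraints, showing how an off-nominal goal can be attached and activated when a nominal goal is not met; the names and structure are illustrative assumptions, not the paper's formal GFT notation.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Optional

@dataclass
class GoalNode:
    """A goal expressed as a constraint over system state variables."""
    name: str
    constraint: Callable[[Dict[str, float]], bool]    # True if the goal is met
    children: List["GoalNode"] = field(default_factory=list)
    off_nominal: Optional["GoalNode"] = None           # activated when this goal fails

    def evaluate(self, state: Dict[str, float]) -> List[str]:
        """Return the names of violated goals, descending into off-nominal branches."""
        violated = []
        if not self.constraint(state):
            violated.append(self.name)
            if self.off_nominal:
                violated += self.off_nominal.evaluate(state)
        for child in self.children:
            violated += child.evaluate(state)
        return violated

# Illustrative use: keep tank pressure within limits, else meet an abort goal.
abort = GoalNode("vent tank safely", lambda s: s["vent_valve_open"] >= 1.0)
root = GoalNode("maintain tank pressure < 300 kPa",
                lambda s: s["tank_pressure_kpa"] < 300.0, off_nominal=abort)
print(root.evaluate({"tank_pressure_kpa": 340.0, "vent_valve_open": 0.0}))
```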
NASA Technical Reports Server (NTRS)
Thomas, Dale; Smith, Charles; Thomas, Leann; Kittredge, Sheryl
2002-01-01
The overall goal of the 2nd Generation RLV Program is to substantially reduce technical and business risks associated with developing a new class of reusable launch vehicles. NASA's specific goals are to improve the safety of a 2nd-generation system by 2 orders of magnitude - equivalent to a crew risk of 1-in-10,000 missions - and decrease the cost tenfold, to approximately $1,000 per pound of payload launched. Architecture definition is being conducted in parallel with the maturation of key technologies specifically identified to improve safety and reliability, while reducing operational costs. An architecture broadly includes an Earth-to-orbit reusable launch vehicle, on-orbit transfer vehicles and upper stages, mission planning, ground and flight operations, and support infrastructure, both on the ground and in orbit. The systems engineering approach ensures that the technologies developed - such as lightweight structures, long-life rocket engines, reliable crew escape, and robust thermal protection systems - will synergistically integrate into the optimum vehicle. To best direct technology development decisions, analytical models are employed to accurately predict the benefits of each technology toward potential space transportation architectures as well as the risks associated with each technology. Rigorous systems analysis provides the foundation for assessing progress toward safety and cost goals. The systems engineering review process factors in comprehensive budget estimates, detailed project schedules, and business and performance plans, against the goals of safety, reliability, and cost, in addition to overall technical feasibility. This approach forms the basis for investment decisions in the 2nd Generation RLV Program's risk-reduction activities. Through this process, NASA will continually refine its specialized needs and identify where Defense and commercial requirements overlap those of civil missions.
NASA Technical Reports Server (NTRS)
Cole, Bjorn; Chung, Seung
2012-01-01
One of the challenges of systems engineering is working multidisciplinary problems in a cohesive manner. When planning the analysis of these problems, systems engineers must trade time and cost against analysis quality and quantity. The quality often correlates with greater run time in multidisciplinary models, and the quantity is associated with the number of alternatives that can be analyzed. The trade-off is due to the resource-intensive process of creating a cohesive multidisciplinary systems model and analysis. Furthermore, reuse or extension of the models used in one stage of a product life cycle for another is a major challenge. Recent developments have enabled a much less resource-intensive and more rigorous approach than hand-written translation scripts between multidisciplinary models and their analyses. The key is to work from a core systems model defined in a MOF-based language such as SysML, and to leverage the emerging tool ecosystem, such as Query/View/Transformation (QVT), from the OMG community. SysML was designed to model multidisciplinary systems. The QVT standard was designed to transform SysML models into other models, including those leveraged by engineering analyses. The Europa Habitability Mission (EHM) team has begun to exploit these capabilities. In one case, a Matlab/Simulink model is generated on the fly from a system description for power analysis written in SysML. In a more general case, symbolic analysis (supported by Wolfram Mathematica) is coordinated by data objects transformed from the systems model, enabling extremely flexible and powerful design exploration and analytical investigations of expected system performance.
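A minimal sketch of the model-to-analysis idea follows, assuming a toy dictionary-based system description rather than actual SysML/QVT tooling: a declarative model is transformed into the parameter set a downstream analysis would consume, analogous in spirit to generating analysis inputs from a systems model.

```python
# Toy model-to-analysis transformation (illustrative; the real work uses SysML + QVT).
system_model = {
    "blocks": [
        {"name": "solar_array", "power_w": 420.0, "duty_cycle": 0.6},
        {"name": "radio",       "power_w": -35.0, "duty_cycle": 0.2},
        {"name": "avionics",    "power_w": -28.0, "duty_cycle": 1.0},
    ]
}

def to_power_analysis(model):
    """Transform the system description into inputs for a simple power-budget analysis."""
    loads = {b["name"]: b["power_w"] * b["duty_cycle"] for b in model["blocks"]}
    loads["net_average_power_w"] = sum(loads.values())
    return loads

print(to_power_analysis(system_model))
```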
DOE Office of Scientific and Technical Information (OSTI.GOV)
Twitty, A.F.; Handler, B.H.; Duncan, L.D.
Data Systems Engineering Organization (DSEO) personnel are developing a prototype computer aided instruction (CAI) system for the Naval Aviation Logistics Data Analysis (NALDA) system. The objective of this project is to provide a prototype for implementing CAI as an enhancement to existing NALDA training. The CAI prototype project is being performed in phases. The task undertaken in Phase I was to analyze the problem and the alternative solutions and to develop a set of recommendations on how best to proceed. In Phase II a structured design and specification document was completed that will provide the basis for development and implementation of the desired CAI system. Phase III will consist of designing, developing, and testing a user interface which will extend the features of the Phase II prototype. The design of the CAI prototype has followed a rigorous structured analysis based on Yourdon/DeMarco methodology and Information Engineering tools. This document includes data flow diagrams, a data dictionary, process specifications, an entity-relationship diagram, a curriculum description, special function key definitions, and a set of standards developed for the NALDA CAI Prototype.
Wisneski, Andrew D; Huang, Lixia; Hong, Bo; Wang, Xiaoqin
2011-01-01
A model for an international undergraduate biomedical engineering research exchange program is outlined. In 2008, the Johns Hopkins University, in collaboration with Tsinghua University in Beijing, China, established the Tsinghua-Johns Hopkins Joint Center for Biomedical Engineering Research. Undergraduate biomedical engineering students from both universities are offered the opportunity to participate in research at the overseas institution. Programs such as these will not only provide research experiences for undergraduates but valuable cultural exchange and enrichment as well. Currently, strict course scheduling and rigorous curricula in most biomedical engineering programs may present obstacles for students who wish to partake in study abroad opportunities. Universities are encouraged to foster abroad opportunities for undergraduate engineering students, for which this particular program can serve as a model.
Rapid Transit Subways - Guidelines for Engineering New Installations for Reduced Maintenance
DOT National Transportation Integrated Search
1978-01-01
Economic design of new subways requires optimization of installation and maintenance costs of all the major constituent items. A prerequisite for this is an awareness of the rigorous environmental and other conditions imposed on the subway. Changing ...
Ju, Feng; Zhang, Tong
2015-11-03
Recent advances in DNA sequencing technologies have prompted the widespread application of metagenomics for the investigation of novel bioresources (e.g., industrial enzymes and bioactive molecules) and unknown biohazards (e.g., pathogens and antibiotic resistance genes) in natural and engineered microbial systems across multiple disciplines. This review discusses the rigorous experimental design and sample preparation in the context of applying metagenomics in environmental sciences and biotechnology. Moreover, this review summarizes the principles, methodologies, and state-of-the-art bioinformatics procedures, tools and database resources for metagenomics applications and discusses two popular strategies (analysis of unassembled reads versus assembled contigs/draft genomes) for quantitative or qualitative insights of microbial community structure and functions. Overall, this review aims to facilitate more extensive application of metagenomics in the investigation of uncultured microorganisms, novel enzymes, microbe-environment interactions, and biohazards in biotechnological applications where microbial communities are engineered for bioenergy production, wastewater treatment, and bioremediation.
NASA Technical Reports Server (NTRS)
Lee, Taesik; Jeziorek, Peter
2004-01-01
Large complex projects cost large sums of money throughout their life cycle for a variety of reasons and causes. For such large programs, credible estimation of the project cost, a quick assessment of the cost of making changes, and management of the project budget with effective cost reduction determine the viability of the project. Cost engineering that deals with these issues requires a rigorous method and systematic processes. This paper introduces a logical framework to achieve effective cost engineering. The framework is built upon the Axiomatic Design process. The structure of the Axiomatic Design process provides a good foundation for closely tying engineering design and cost information together. The cost framework presented in this paper is a systematic link between the functional domain (FRs), the physical domain (DPs), the cost domain (CUs), and a task/process-based model. The FR-DP map relates a system's functional requirements to design solutions across all levels and branches of the decomposition hierarchy. DPs are mapped into CUs, which provides a means to estimate the cost of design solutions - DPs - from the cost of the physical entities in the system - CUs. The task/process model describes the iterative process of developing each of the CUs, and is used to estimate the cost of CUs. By linking the four domains, this framework provides superior traceability from requirements to cost information.
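A hedged sketch of the FR-to-DP-to-CU traceability idea is shown below; the requirement, design parameter, and cost unit names are invented for illustration, and the roll-up is the simplest possible sum rather than the paper's task/process-based estimation.

```python
# Toy FR -> DP -> CU cost traceability (names and costs are illustrative).
fr_to_dp = {"FR1 provide torque": "DP1 motor", "FR2 sense position": "DP2 encoder"}
dp_to_cus = {"DP1 motor": ["CU motor housing", "CU windings"],
             "DP2 encoder": ["CU optical disk", "CU readout board"]}
cu_cost = {"CU motor housing": 120.0, "CU windings": 80.0,
           "CU optical disk": 40.0, "CU readout board": 60.0}

def cost_of_requirement(fr):
    """Roll the cost of physical cost units back up to a functional requirement."""
    dp = fr_to_dp[fr]
    return sum(cu_cost[cu] for cu in dp_to_cus[dp])

for fr in fr_to_dp:
    print(fr, "->", cost_of_requirement(fr))
```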
NASA Astrophysics Data System (ADS)
Skrzypek, Josef; Mesrobian, Edmond; Gungner, David J.
1989-03-01
The development of autonomous land vehicles (ALV) capable of operating in an unconstrained environment has proven to be a formidable research effort. The unpredictability of events in such an environment calls for the design of a robust perceptual system, an impossible task if it requires programming a system based on the expectation of future, unconstrained events. Hence the need for a "general purpose" machine vision system that is capable of perceiving and understanding images in an unconstrained environment in real time. The research undertaken at the UCLA Machine Perception Laboratory addresses this need by focusing on two specific issues: 1) the long-term goals for machine vision research as a joint effort between the neurosciences and computer science; and 2) a framework for evaluating progress in machine vision. In the past, vision research has been carried out independently within different fields including the neurosciences, psychology, computer science, and electrical engineering. Our interdisciplinary approach to vision research is based on the rigorous combination of computational neuroscience, as derived from neurophysiology and neuropsychology, with computer science and electrical engineering. The primary motivation behind our approach is that the human visual system is the only existing example of a "general purpose" vision system, and, using a neurally based computing substrate, it can complete all necessary visual tasks in real time.
Physical Analytics: An emerging field with real-world applications and impact
NASA Astrophysics Data System (ADS)
Hamann, Hendrik
2015-03-01
In the past, most information on the internet was originated by humans or computers. However, with the emergence of cyber-physical systems, vast amounts of data are now being created by sensors in devices, machines, etc., digitizing the physical world. While cyber-physical systems are the subject of active research around the world, the vast amount of actual data generated from the physical world has so far attracted little attention from the engineering and physics communities. In this presentation we use examples to highlight the opportunities in this new subject of "Physical Analytics" for highly interdisciplinary research (including physics, engineering and computer science), which aims at understanding real-world physical systems by leveraging cyber-physical technologies. More specifically, the convergence of the physical world with the digital domain allows physical principles to be applied to everyday problems in a much more effective and informed way than was possible in the past. Very much as traditional applied physics and engineering have made enormous advances and changed our lives by making detailed measurements to understand the physics of an engineered device, we can now apply the same rigor and principles to understand large-scale physical systems. In the talk we first present a set of "configurable" enabling technologies for Physical Analytics including ultralow-power sensing and communication technologies, physical big data management technologies, numerical modeling for physical systems, machine-learning-based physical model blending, and physical-analytics-based automation and control. Then we discuss in detail several concrete applications of Physical Analytics ranging from energy management in buildings and data centers, environmental sensing and controls, and precision agriculture to renewable energy forecasting and management.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-01
... standards for which EPA has issued the waiver. CARB has as long a history of enforcement of vehicle/engine... program. The history and rigor of CARB's enforcement program lends assurance to California SIP revisions...
The Rigor Mortis of Education: Rigor Is Required in a Dying Educational System
ERIC Educational Resources Information Center
Mixon, Jason; Stuart, Jerry
2009-01-01
In an effort to answer the "Educational Call to Arms", our national public schools have turned to Advanced Placement (AP) courses as the predominate vehicle used to address the lack of academic rigor in our public high schools. Advanced Placement is believed by many to provide students with the rigor and work ethic necessary to…
Efficient shortcut techniques in evanescently coupled waveguides
NASA Astrophysics Data System (ADS)
Paul, Koushik; Sarma, Amarendra K.
2016-10-01
The Shortcut to Adiabatic Passage (SHAPE) technique, in the context of coherent control of atomic systems, has gained considerable attention in the last few years, primarily because of its ability to transfer population among quantum states much faster than adiabatic processes. Two methods in this regard have been explored rigorously, namely transitionless quantum driving and the Lewis-Riesenfeld invariant approach. We have applied these two methods to realize SHAPE in an adiabatic waveguide coupler. Waveguide couplers are integral components of photonic circuits, primarily used as switching devices. Our study shows that with appropriate engineering of the coupling coefficient and propagation constants of the coupler it is possible to achieve efficient and complete power switching. We also observed that the coupler length could be reduced significantly without affecting the coupling efficiency of the system.
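For context, the standard coupled-mode equations for a two-waveguide directional coupler can be integrated numerically as in the hedged sketch below; this is a generic textbook model with assumed parameter profiles, not the specific shortcut-engineered designs of the paper.

```python
import numpy as np

def coupler_powers(kappa, delta, length, steps=2000):
    """Integrate the coupled-mode equations of a two-waveguide coupler:
       da1/dz = -i*(delta(z)*a1 + kappa(z)*a2)
       da2/dz = -i*(kappa(z)*a1 - delta(z)*a2)
    kappa: coupling coefficient, delta: propagation-constant mismatch (callables).
    Returns output powers |a1|^2, |a2|^2 for light launched into waveguide 1."""
    a = np.array([1.0 + 0j, 0.0 + 0j])
    dz = length / steps

    def rhs(z, a):
        k, d = kappa(z), delta(z)
        return -1j * np.array([d * a[0] + k * a[1], k * a[0] - d * a[1]])

    z = 0.0
    for _ in range(steps):                       # classic RK4 stepping
        k1 = rhs(z, a)
        k2 = rhs(z + dz / 2, a + dz / 2 * k1)
        k3 = rhs(z + dz / 2, a + dz / 2 * k2)
        k4 = rhs(z + dz, a + dz * k3)
        a = a + dz / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        z += dz
    return abs(a[0]) ** 2, abs(a[1]) ** 2

# Uniform synchronous coupler: complete power transfer at length = pi/(2*kappa).
kappa0 = 1.0  # assumed coupling strength, 1/mm
print(coupler_powers(lambda z: kappa0, lambda z: 0.0, length=np.pi / (2 * kappa0)))
```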
From Goal-Oriented Requirements to Event-B Specifications
NASA Technical Reports Server (NTRS)
Aziz, Benjamin; Arenas, Alvaro E.; Bicarregui, Juan; Ponsard, Christophe; Massonet, Philippe
2009-01-01
In goal-oriented requirements engineering methodologies, goals are structured into refinement trees from high-level system-wide goals down to fine-grained requirements assigned to specific software/hardware/human agents that can realise them. Functional goals assigned to software agents need to be operationalised into specifications of the services that the agent should provide to realise those requirements. In this paper, we propose an approach for operationalising requirements into specifications expressed in the Event-B formalism. Our approach has the benefit of aiding software designers by bridging the gap between declarative requirements and operational system specifications in a rigorous manner, enabling powerful correctness proofs and allowing further refinements down to the implementation level. Our solution is based on verifying that a consistent Event-B machine exhibits properties corresponding to the requirements.
Pakhomova, A A; Aksel'-Rubinshteĭn, V Z; Mikos, K N; Nikitin, E I
2009-01-01
Analysis of experimental data on the quantitative and qualitative chemical make-up of air in the orbital station Mir and the International Space Station (ISS) showed a permanent presence of silicon. The main source of silicon contaminants appears to be a variety of polymethyl siloxane liquids and the siloxane coating of electronics. The article describes the volatile silicon contaminants detected in space station air. To control concentrations of silicon, the existing air purification system needs to be augmented with carbons whose micropore entrances are larger than the diameters of silicon-containing molecules. It is also important to refine the technology of polymethyl siloxane liquid synthesis so as to reduce the emission of volatile admixtures, and to observe rigorously the pre-flight off-gassing requirements, with special concern for silicon coatings.
MATHEMATICAL METHODS IN MEDICAL IMAGE PROCESSING
ANGENENT, SIGURD; PICHON, ERIC; TANNENBAUM, ALLEN
2013-01-01
In this paper, we describe some central mathematical problems in medical imaging. The subject has been undergoing rapid changes driven by better hardware and software. Much of the software is based on novel methods utilizing geometric partial differential equations in conjunction with standard signal/image processing techniques as well as computer graphics facilitating man/machine interactions. As part of this enterprise, researchers have been trying to base biomedical engineering principles on rigorous mathematical foundations for the development of software methods to be integrated into complete therapy delivery systems. These systems support the more effective delivery of many image-guided procedures such as radiation therapy, biopsy, and minimally invasive surgery. We will show how mathematics may impact some of the main problems in this area, including image enhancement, registration, and segmentation. PMID:23645963
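The geometric PDE methods described in the paper are considerably more sophisticated (e.g. curvature-driven flows); the hedged sketch below only illustrates the simplest PDE-based image smoothing, explicit linear heat-equation diffusion, with assumed step size and iteration count.

```python
import numpy as np

def heat_diffusion(image, dt=0.2, steps=25):
    """Smooth an image by explicit time-stepping of u_t = laplacian(u) (linear diffusion)."""
    u = image.astype(float).copy()
    for _ in range(steps):
        lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
               np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u)
        u += dt * lap            # dt <= 0.25 keeps the explicit scheme stable
    return u

noisy = np.random.default_rng(1).normal(0.0, 0.3, size=(64, 64)) + 1.0
print(float(noisy.std()), float(heat_diffusion(noisy).std()))   # noise level shrinks
```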
Special feature on imaging systems and techniques
NASA Astrophysics Data System (ADS)
Yang, Wuqiang; Giakos, George
2013-07-01
The IEEE International Conference on Imaging Systems and Techniques (IST'2012) was held in Manchester, UK, on 16-17 July 2012. The participants came from 26 countries or regions: Austria, Brazil, Canada, China, Denmark, France, Germany, Greece, India, Iran, Iraq, Italy, Japan, Korea, Latvia, Malaysia, Norway, Poland, Portugal, Sweden, Switzerland, Taiwan, Tunisia, UAE, UK and USA. The technical program of the conference consisted of a series of scientific and technical sessions, exploring physical principles, engineering and applications of new imaging systems and techniques, as reflected by the diversity of the submitted papers. Following a rigorous review process, a total of 123 papers were accepted, and they were organized into 30 oral presentation sessions and a poster session. In addition, six invited keynotes were arranged. The conference not only provided the participants with a unique opportunity to exchange ideas and disseminate research outcomes but also paved a way to establish global collaboration. Following the IST'2012, a total of 55 papers, which were technically extended substantially from their versions in the conference proceeding, were submitted as regular papers to this special feature of Measurement Science and Technology . Following a rigorous reviewing process, 25 papers have been finally accepted for publication in this special feature and they are organized into three categories: (1) industrial tomography, (2) imaging systems and techniques and (3) image processing. These papers not only present the latest developments in the field of imaging systems and techniques but also offer potential solutions to existing problems. We hope that this special feature provides a good reference for researchers who are active in the field and will serve as a catalyst to trigger further research. It has been our great pleasure to be the guest editors of this special feature. We would like to thank the authors for their contributions, without which it would not be possible to have this special feature published. We are grateful to all reviewers, who devoted their time and effort, on a voluntary basis, to ensure that all submissions were reviewed rigorously and fairly. The publishing staff of Measurement Science and Technology are particularly acknowledged for giving us timely advice on guest-editing this special feature.
Semantically-Rigorous Systems Engineering Modeling Using SysML and OWL
NASA Technical Reports Server (NTRS)
Jenkins, J. Steven; Rouquette, Nicolas F.
2012-01-01
The Systems Modeling Language (SysML) has found wide acceptance as a standard graphical notation for the domain of systems engineering. SysML subsets and extends the Unified Modeling Language (UML) to define conventions for expressing structural, behavioral, and analytical elements, and relationships among them. SysML-enabled modeling tools are available from multiple providers, and have been used for diverse projects in military aerospace, scientific exploration, and civil engineering. The Web Ontology Language (OWL) has found wide acceptance as a standard notation for knowledge representation. OWL-enabled modeling tools are available from multiple providers, as well as auxiliary assets such as reasoners and application programming interface libraries, etc. OWL has been applied to diverse projects in a wide array of fields. While the emphasis in SysML is on notation, SysML inherits (from UML) a semantic foundation that provides for limited reasoning and analysis. UML's partial formalization (FUML), however, does not cover the full semantics of SysML, which is a substantial impediment to developing high confidence in the soundness of any conclusions drawn therefrom. OWL, by contrast, was developed from the beginning on formal logical principles, and consequently provides strong support for verification of consistency and satisfiability, extraction of entailments, conjunctive query answering, etc. This emphasis on formal logic is counterbalanced by the absence of any graphical notation conventions in the OWL standards. Consequently, OWL has had only limited adoption in systems engineering. The complementary strengths and weaknesses of SysML and OWL motivate an interest in combining them in such a way that we can benefit from the attractive graphical notation of SysML and the formal reasoning of OWL. This paper describes an approach to achieving that combination.
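As a hedged, minimal illustration of expressing a system element as OWL axioms outside any SysML tool, the sketch below builds a tiny ontology with the rdflib library; the namespace, class, and property names are invented for the example and are not the paper's ontology.

```python
from rdflib import Graph, Literal, Namespace, RDF, RDFS
from rdflib.namespace import OWL

SYS = Namespace("http://example.org/system#")   # hypothetical namespace
g = Graph()
g.bind("sys", SYS)
g.bind("owl", OWL)

# Declare a class for components and a subclass for flight computers.
g.add((SYS.Component, RDF.type, OWL.Class))
g.add((SYS.FlightComputer, RDF.type, OWL.Class))
g.add((SYS.FlightComputer, RDFS.subClassOf, SYS.Component))

# An object property relating components to the subsystems that contain them.
g.add((SYS.partOf, RDF.type, OWL.ObjectProperty))

# One individual, analogous in spirit to a SysML block instance.
g.add((SYS.obc1, RDF.type, SYS.FlightComputer))
g.add((SYS.obc1, RDFS.label, Literal("Onboard computer 1")))

print(g.serialize(format="turtle"))
```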
Bond, William F; Hui, Joshua; Fernandez, Rosemarie
2018-02-01
Over the past decade, emergency medicine (EM) took a lead role in healthcare simulation in part due to its demands for successful interprofessional and multidisciplinary collaboration, along with educational needs in a diverse array of cognitive and procedural skills. Simulation-based methodologies have the capacity to support training and research platforms that model micro-, meso-, and macrosystems of healthcare. To fully capitalize on the potential of simulation-based research to improve emergency healthcare delivery will require the application of rigorous methods from engineering, social science, and basic science disciplines. The Academic Emergency Medicine (AEM) Consensus Conference "Catalyzing System Change Through Healthcare Simulation: Systems, Competency, and Outcome" was conceived to foster discussion among experts in EM, engineering, and social sciences, focusing on key barriers and opportunities in simulation-based research. This executive summary describes the overall rationale for the conference, conference planning, and consensus-building approaches and outlines the focus of the eight breakout sessions. The consensus outcomes from each breakout session are summarized in proceedings papers published in this issue of Academic Emergency Medicine. Each paper provides an overview of methodologic and knowledge gaps in simulation research and identifies future research targets aimed at improving the safety and quality of healthcare. © 2017 by the Society for Academic Emergency Medicine.
Towards a Rigorous Assessment of Systems Biology Models: The DREAM3 Challenges
Prill, Robert J.; Marbach, Daniel; Saez-Rodriguez, Julio; Sorger, Peter K.; Alexopoulos, Leonidas G.; Xue, Xiaowei; Clarke, Neil D.; Altan-Bonnet, Gregoire; Stolovitzky, Gustavo
2010-01-01
Background Systems biology has embraced computational modeling in response to the quantitative nature and increasing scale of contemporary data sets. The onslaught of data is accelerating as molecular profiling technology evolves. The Dialogue for Reverse Engineering Assessments and Methods (DREAM) is a community effort to catalyze discussion about the design, application, and assessment of systems biology models through annual reverse-engineering challenges. Methodology and Principal Findings We describe our assessments of the four challenges associated with the third DREAM conference which came to be known as the DREAM3 challenges: signaling cascade identification, signaling response prediction, gene expression prediction, and the DREAM3 in silico network challenge. The challenges, based on anonymized data sets, tested participants in network inference and prediction of measurements. Forty teams submitted 413 predicted networks and measurement test sets. Overall, a handful of best-performer teams were identified, while a majority of teams made predictions that were equivalent to random. Counterintuitively, combining the predictions of multiple teams (including the weaker teams) can in some cases improve predictive power beyond that of any single method. Conclusions DREAM provides valuable feedback to practitioners of systems biology modeling. Lessons learned from the predictions of the community provide much-needed context for interpreting claims of efficacy of algorithms described in the scientific literature. PMID:20186320
Systemic Planning: An Annotated Bibliography and Literature Guide. Exchange Bibliography No. 91.
ERIC Educational Resources Information Center
Catanese, Anthony James
Systemic planning is an operational approach to using scientific rigor and qualitative judgment in a complementary manner. It integrates rigorous techniques and methods from systems analysis, cybernetics, decision theory, and work programming. The annotated reference sources in this bibliography include those works that have been most influential…
ERIC Educational Resources Information Center
Cassata-Widera, Amy; Century, Jeanne; Kim, Dae Y.
2011-01-01
The practical need for multidimensional measures of fidelity of implementation (FOI) of reform-based science, technology, engineering, and mathematics (STEM) instructional materials, combined with a theoretical need in the field for a shared conceptual framework that could support accumulating knowledge on specific enacted program elements across…
Exploring in Aeronautics. An Introduction to Aeronautical Sciences.
ERIC Educational Resources Information Center
National Aeronautics and Space Administration, Cleveland, OH. Lewis Research Center.
This curriculum guide is based on a year of lectures and projects of a contemporary special-interest Explorer program intended to provide career guidance and motivation for promising students interested in aerospace engineering and scientific professions. The adult-oriented program avoids technicality and rigorous mathematics and stresses real…
Balancing Stakeholders' Interests in Evolving Teacher Education Accreditation Contexts
ERIC Educational Resources Information Center
Elliott, Alison
2008-01-01
While Australian teacher education programs have long had rigorous accreditation pathways at the University level they have not been subject to the same formal public or professional scrutiny typical of professions such as medicine, nursing or engineering. Professional accreditation for teacher preparation programs is relatively new and is linked…
An Educational and Entrepreneurial Ecosystem to Actualize Technology-Based Social Ventures
ERIC Educational Resources Information Center
Mehta, Khanjan; Zappe, Sarah; Brannon, Mary Lynn; Zhao, Yu
2016-01-01
The Humanitarian Engineering and Social Entrepreneurship (HESE) Program engages students and faculty across Penn State in the rigorous research, design, field-testing, and launch of technology-based social enterprises that address global development challenges. HESE ventures are embedded in a series of five courses that integrate learning,…
LeChevallier, Mark W; Gullick, Richard W; Karim, Mohammad R; Friedman, Melinda; Funk, James E
2003-03-01
The potential for public health risks associated with intrusion of contaminants into water supply distribution systems resulting from transient low or negative pressures is assessed. It is shown that transient pressure events occur in distribution systems; that during these negative pressure events pipeline leaks provide a potential portal for entry of groundwater into treated drinking water; and that faecal indicators and culturable human viruses are present in the soil and water exterior to the distribution system. To date, all observed negative pressure events have been related to power outages or other pump shutdowns. Although there are insufficient data to indicate whether pressure transients are a substantial source of risk to water quality in the distribution system, mitigation techniques can be implemented, principally the maintenance of an effective disinfectant residual throughout the distribution system, leak control, redesign of air relief venting, and more rigorous application of existing engineering standards. Use of high-speed pressure data loggers and surge modelling may have some merit, but more research is needed.
Managing Programmatic Risk for Complex Space System Developments
NASA Technical Reports Server (NTRS)
Panetta, Peter V.; Hastings, Daniel; Brumfield, Mark (Technical Monitor)
2001-01-01
Risk management strategies have become an important recent research topic for many aerospace organizations as they prepare to develop the revolutionary complex space systems of the future. Future multi-disciplinary complex space systems will make it absolutely essential for organizations to practice a rigorous, comprehensive risk management process, emphasizing thorough systems engineering principles, to succeed. Project managers must possess strong leadership skills to direct high-quality, cross-disciplinary teams for successfully developing revolutionary space systems that are ever increasing in complexity. Proactive efforts to reduce or eliminate risk throughout a project's lifecycle ideally must be practiced by all technical members of the organization. This paper discusses some of the risk management perspectives that were collected from senior managers and project managers of aerospace and aeronautical organizations by means of interviews and surveys. Some of the programmatic risks which drive the success or failure of projects are revealed. Key findings lead to a number of insights for organizations to consider for proactively approaching the risks which face current and future complex space system projects.
NASA Technical Reports Server (NTRS)
Larman, B. T.
1981-01-01
The development of the Project Galileo Orbiter, with 18 microcomputers and the equivalent of 360K 8-bit bytes of memory contained within two major engineering subsystems and eight science instruments, requires that the key onboard computer system resources be managed in a very rigorous manner. Attention is given to the rationale behind the project policy, the development stage, the preliminary design stage, the design/implementation stage, and the optimization or 'scrubbing' stage. The implementation of the policy is discussed, taking into account the development of the Attitude and Articulation Control Subsystem (AACS) and the Command and Data Subsystem (CDS), the reporting of margin status, and the response to allocation oversubscription.
NASA-STD-7009 Guidance Document for Human Health and Performance Models and Simulations
NASA Technical Reports Server (NTRS)
Walton, Marlei; Mulugeta, Lealem; Nelson, Emily S.; Myers, Jerry G.
2014-01-01
Rigorous verification, validation, and credibility (VVC) processes are imperative to ensure that models and simulations (MS) are sufficiently reliable to address issues within their intended scope. The NASA standard for MS, NASA-STD-7009 (7009) [1], was an outcome of the Columbia Accident Investigation Board (CAIB), intended to ensure that MS are developed, applied, and interpreted appropriately for making decisions that may impact crew or mission safety. Because the focus of 7009 is engineering systems, a NASA-STD-7009 Guidance Document is being developed to augment the 7009 and provide information, tools, and techniques applicable to the probabilistic and deterministic biological MS more prevalent in human health and performance (HHP) and space biomedical research and operations.
What is microbial community ecology?
Konopka, Allan
2009-11-01
The activities of complex communities of microbes affect biogeochemical transformations in natural, managed and engineered ecosystems. Meaningfully defining what constitutes a community of interacting microbial populations is not trivial, but is important for rigorous progress in the field. Important elements of research in microbial community ecology include the analysis of functional pathways for nutrient resource and energy flows, mechanistic understanding of interactions between microbial populations and their environment, and the emergent properties of the complex community. Some emergent properties mirror those analyzed by community ecologists who study plants and animals: biological diversity, functional redundancy and system stability. However, because microbes possess mechanisms for the horizontal transfer of genetic information, the metagenome may also be considered as a community property.
Biotelemetry and computer analysis of sleep processes on earth and in space.
NASA Technical Reports Server (NTRS)
Adey, W. R.
1972-01-01
Developments in biomedical engineering now permit study of states of sleep, wakefulness, and focused attention in man exposed to rigorous environments, including aerospace flight. These new sensing devices, data acquisition systems, and computational methods have also been extensively applied to clinical problems of disordered sleep. A 'library' of EEG data has been prepared for sleep in normal man, and characterized for its group features by computational analysis. Sleep in an astronaut in space flight has been examined for the first and second 'nights' of space flight. Normal 90-min cycles were detected during the second night. Sleep patterns in quadriplegic patients deprived of all sensory inputs below the neck have indicated major deviations.
An Overview of Different Approaches for Battery Lifetime Prediction
NASA Astrophysics Data System (ADS)
Zhang, Peng; Liang, Jun; Zhang, Feng
2017-05-01
With the rapid development of renewable energy and ever-stricter requirements on power supply reliability, battery energy storage technology has been widely used in power systems. Battery degradation is a non-negligible issue when a battery energy storage system participates in system design and in the optimization of operating strategies. The health assessment and remaining-cycle-life estimation of batteries have therefore become a challenge and a research hotspot in many engineering areas. In this paper, battery capacity fade and internal resistance growth are explained on the basis of the chemical reactions inside the battery. The general life prediction models are analysed from several aspects, and their characteristics and application scenarios are discussed. In addition, a novel weighted Ah ageing model incorporating the Ragone curve is proposed to provide a more detailed understanding of the ageing processes. A rigorous proof of the mathematical theory underlying the proposed model is given in the paper.
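As a loose illustration of the weighted Ah-throughput idea (the paper's specific weighting and its Ragone-curve formulation are not reproduced here), a minimal Python sketch might accumulate charge throughput with condition-dependent severity weights; the weight shape and the rated throughput below are hypothetical.

    # Minimal sketch of a weighted Ah-throughput ageing estimate.
    # The severity weights and rated lifetime throughput are hypothetical
    # placeholders, not values from the paper.
    def severity_weight(c_rate, temperature_c):
        """Crude stress weight: higher C-rate and temperature age the cell faster."""
        w = 1.0
        if c_rate > 1.0:
            w *= 1.0 + 0.5 * (c_rate - 1.0)            # penalize high-rate cycling
        if temperature_c > 25.0:
            w *= 1.0 + 0.02 * (temperature_c - 25.0)   # penalize hot operation
        return w

    def capacity_fade(history, rated_throughput_ah=30000.0, end_of_life_fade=0.2):
        """history: iterable of (ah_step, c_rate, temperature_c) operating records."""
        weighted_ah = sum(ah * severity_weight(c, t) for ah, c, t in history)
        return end_of_life_fade * min(weighted_ah / rated_throughput_ah, 1.0)

    if __name__ == "__main__":
        demo = [(5.0, 0.5, 25.0), (5.0, 2.0, 40.0)] * 100
        print(f"estimated capacity fade: {capacity_fade(demo):.3%}")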
A Rigorous Framework for Optimization of Expensive Functions by Surrogates
NASA Technical Reports Server (NTRS)
Booker, Andrew J.; Dennis, J. E., Jr.; Frank, Paul D.; Serafini, David B.; Torczon, Virginia; Trosset, Michael W.
1998-01-01
The goal of the research reported here is to develop rigorous optimization algorithms for engineering design problems in which the application of traditional optimization approaches is not practical. This paper presents and analyzes a framework for generating a sequence of approximations to the objective function and managing the use of these approximations as surrogates for optimization. The result is convergence to a minimizer of an expensive objective function subject to simple constraints. The approach is widely applicable because it does not require, or even explicitly approximate, derivatives of the objective. Numerical results are presented for a 31-variable helicopter rotor blade design example and for a standard optimization test example.
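The framework itself relies on pattern-search safeguards to guarantee convergence; purely as orientation, a stripped-down surrogate-management loop might look like the sketch below, assuming a one-dimensional toy objective and a simple quadratic polynomial surrogate, neither of which comes from the paper.

    import numpy as np

    def expensive_f(x):
        # Stand-in for an expensive simulation; illustrative only.
        return (x - 1.3) ** 2 + 0.3 * np.sin(5.0 * x)

    # Initial design and evaluations of the true objective
    xs = list(np.linspace(-2.0, 2.0, 5))
    ys = [expensive_f(x) for x in xs]

    for it in range(10):
        # Fit a cheap quadratic surrogate to all data seen so far
        surrogate = np.poly1d(np.polyfit(xs, ys, deg=2))
        # Optimize the surrogate on a dense candidate grid (cheap)
        cand = np.linspace(min(xs), max(xs), 1001)
        x_new = float(cand[np.argmin(surrogate(cand))])
        # Evaluate the expensive function only at the surrogate minimizer
        xs.append(x_new)
        ys.append(expensive_f(x_new))

    best_y, best_x = min(zip(ys, xs))
    print(f"best objective {best_y:.4f} at x = {best_x:.4f}")

This sketch omits the convergence machinery (poll steps, trust management) that gives the published framework its rigor.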
NASA Astrophysics Data System (ADS)
Dimitrakopoulos, Panagiotis
2018-03-01
The calculation of polytropic efficiencies is a very important task, especially during the development of new compression units, such as compressor impellers, stages, and stage groups. Such calculations are also crucial for determining the performance of a whole compressor. As processors and computational capacities have improved substantially in recent years, the need has emerged for a new, rigorous, robust, accurate, and at the same time standardized method for computing polytropic efficiencies, especially one based on the thermodynamics of real gases. The proposed method is based on the rigorous definition of the polytropic efficiency. The input consists of pressure and temperature values at the end points of the compression path (suction and discharge) for a given working fluid. The average relative error for the studied cases was 0.536%. This high-accuracy method is therefore proposed for efficiency calculations related to turbocompressors and their compression units, especially when they are operating at high power levels, for example in jet engines and high-power plants.
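For orientation only: under an ideal-gas simplification, the polytropic efficiency between measured suction and discharge states reduces to a closed-form expression, sketched below with assumed gas properties. The paper's method targets real-gas thermodynamics, which this sketch does not attempt.

    import math

    def polytropic_efficiency_ideal_gas(p1, t1, p2, t2, kappa=1.4):
        """Polytropic compression efficiency from end-point states (ideal gas).
        Pressures in any consistent unit, temperatures in kelvin, kappa = cp/cv."""
        n_exp = math.log(t2 / t1) / math.log(p2 / p1)   # (n-1)/n from measured states
        return ((kappa - 1.0) / kappa) / n_exp

    if __name__ == "__main__":
        # Hypothetical suction/discharge measurements
        eta = polytropic_efficiency_ideal_gas(p1=1.0e5, t1=293.15, p2=4.0e5, t2=460.0)
        print(f"polytropic efficiency ~ {eta:.3f}")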
An Overview of Starfish: A Table-Centric Tool for Interactive Synthesis
NASA Technical Reports Server (NTRS)
Tsow, Alex
2008-01-01
Engineering is an interactive process that requires intelligent interaction at many levels. My thesis [1] advances an engineering discipline for high-level synthesis and architectural decomposition that integrates perspicuous representation, designer interaction, and mathematical rigor. Starfish, the software prototype for the design method, implements a table-centric transformation system for reorganizing control-dominated system expressions into high-level architectures. Based on the digital design derivation (DDD) system, a designer-guided synthesis technique that applies correctness-preserving transformations to synchronous data flow specifications expressed as co-recursive stream equations, Starfish enhances user interaction and extends the reachable design space by incorporating four innovations: behavior tables, serialization tables, data refinement, and operator retiming. Behavior tables express systems of co-recursive stream equations as a table of guarded signal updates. Developers and users of the DDD system used manually constructed behavior tables to help them decide which transformations to apply and how to specify them. These design exercises produced several formally constructed hardware implementations: the FM9001 microprocessor, an SECD machine for evaluating LISP, and the SchemEngine, a garbage-collected machine for interpreting a byte-code representation of compiled Scheme programs. Bose and Tuna, two of DDD's developers, have subsequently commercialized the design derivation methodology at Derivation Systems, Inc. (DSI). DSI has formally derived and validated PCI bus interfaces and a Java byte-code processor; they further executed a contract to prototype SPIDER, NASA's ultra-reliable communications bus. To date, most derivations from DDD and DRS have targeted hardware due to its synchronous design paradigm. However, Starfish expressions are independent of the synchronization mechanism; there is no commitment to hardware or globally broadcast clocks. Though software back-ends for design derivation are limited to the DDD stream interpreter, targeting synchronous or real-time software is not substantively different from targeting hardware.
Concise Review: Organ Engineering: Design, Technology, and Integration.
Kaushik, Gaurav; Leijten, Jeroen; Khademhosseini, Ali
2017-01-01
Engineering complex tissues and whole organs has the potential to dramatically impact translational medicine along several avenues. Organ engineering is a discipline that integrates biological knowledge of embryological development, anatomy, physiology, and cellular interactions with enabling technologies including biocompatible biomaterials and biofabrication platforms such as three-dimensional bioprinting. When engineering complex tissues and organs, core design principles must be taken into account, such as the structure-function relationship, biochemical signaling, mechanics, gradients, and spatial constraints. Technological advances in biomaterials, biofabrication, and biomedical imaging allow for in vitro control of these factors to recreate in vivo phenomena. Finally, organ engineering emerges as an integration of biological design and technical rigor. An overall workflow for organ engineering, the role of guiding technology in advancing biology, and a perspective on necessary future iterations in the field are discussed. Stem Cells 2017;35:51-60. © 2016 AlphaMed Press.
Methodology discourses as boundary work in the construction of engineering education.
Beddoes, Kacey
2014-04-01
Engineering education research is a new field that emerged in the social sciences over the past 10 years. This analysis of engineering education research demonstrates that methodology discourses have played a central role in the construction and development of the field of engineering education, and that they have done so primarily through boundary work. This article thus contributes to science and technology studies literature by examining the role of methodology discourses in an emerging social science field. I begin with an overview of engineering education research before situating the case within relevant bodies of literature on methodology discourses and boundary work. I then identify two methodology discourses--rigor and methodological diversity--and discuss how they contribute to the construction and development of engineering education research. The article concludes with a discussion of how the findings relate to prior research on methodology discourses and boundary work and implications for future research.
Technology development life cycle processes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beck, David Franklin
2013-05-01
This report and set of appendices are a collection of memoranda originally drafted in 2009 for the purpose of providing motivation and the necessary background material to support the definition and integration of engineering and management processes related to technology development. At the time there was interest and support to move from Capability Maturity Model Integration (CMMI) Level One (ad hoc processes) to Level Three. As presented herein, the material begins with a survey of open literature perspectives on technology development life cycles, including published data on "what went wrong." The main thrust of the material presents a rational exposé of a structured technology development life cycle that uses the scientific method as a framework, with further rigor added from adapting relevant portions of the systems engineering process. The material concludes with a discussion on the use of multiple measures to assess technology maturity, including consideration of the viewpoint of potential users.
From grand-canonical density functional theory towards rational compound design
NASA Astrophysics Data System (ADS)
von Lilienfeld, Anatole
2008-03-01
The fundamental challenge of rational compound design, i.e., the reverse engineering of chemical compounds with predefined specific properties, originates in the high-dimensional combinatorial nature of chemical space. Chemical space is the hyper-space of a given set of molecular observables that is spanned by the grand-canonical variables (particle densities of electrons and nuclei) which define chemical composition. A brief but rigorous description of chemical space within the molecular grand-canonical ensemble multi-component density functional theory framework will be given [1]. Numerical results will be presented for intermolecular energies as a continuous function of alchemical variations within a neutral and isoelectronic 10 proton system, including CH4, NH3, H2O, and HF, interacting with formic acid [2]. Furthermore, engineering the Fermi level through alchemical generation of boron-nitrogen doped mutants of benzene shall be discussed [3].[1] von Lilienfeld and Tuckerman JCP 125 154104 (2006)[2] von Lilienfeld and Tuckerman JCTC 3 1083 (2007)[3] Marcon et al. JCP 127 064305 (2007)
The Engagement of Engineers in Education and Public Outreach: Beginning the Conversation
NASA Astrophysics Data System (ADS)
Grier, J.; Buxner, S.; Vezino, B.; Shipp, S. S.
2014-12-01
The Next Generation Science Standards (NGSS) are a new set of K-12 science standards that have been developed through a collaborative, state-led process. Based on the National Research Council (NRC) 'Framework for K-12 Education,' the NGSS are designed to provide all students with a coherent education possessing both robust content and rigorous practice. Within these standards is an enhanced emphasis on the intersection between science and engineering. The focus is not only on asking questions and finding answers (science) but also on identifying and designing solutions to problems (engineering). The NASA SMD (Science Mission Directorate) Education and Public Outreach (E/PO) Forums have been working with space scientists for many years to assist with their engagement in E/PO efforts, thus supporting the needs of previous science standards. In order to properly address the needs of NGSS, this conversation is being expanded to include engineers. Our initial efforts include a series of semi-structured interviews with a dozen engineers involved in different aspects of space science and mission development. We will present the responses from the survey and compare this information to our knowledge base about space scientists, their needs, attitudes, and understandings of E/PO. In addition to a new emphasis on engineering in the NGSS, we also consider engineering habits of mind such as systems thinking, creativity, optimism, collaboration, communication, and attention to ethical considerations as described by an NRC policy document for engineering education. Using the overall results, we will consider strategies, further ideas for investigation, and possible steps for going forward with this important aspect of including engineering in education and outreach programming.
Computation of Quasiperiodic Normally Hyperbolic Invariant Tori: Rigorous Results
NASA Astrophysics Data System (ADS)
Canadell, Marta; Haro, Àlex
2017-12-01
The development of efficient methods for detecting quasiperiodic oscillations and computing the corresponding invariant tori is a subject of great importance in dynamical systems and their applications in science and engineering. In this paper, we prove the convergence of a new Newton-like method for computing quasiperiodic normally hyperbolic invariant tori carrying quasiperiodic motion in smooth families of real-analytic dynamical systems. The main result is stated as an a posteriori KAM-like theorem that allows controlling the inner dynamics on the torus with appropriate detuning parameters, in order to obtain a prescribed quasiperiodic motion. The Newton-like method leads to several fast and efficient computational algorithms, which are discussed and tested in a companion paper (Canadell and Haro in J Nonlinear Sci, 2017. doi: 10.1007/s00332-017-9388-z), in which new mechanisms of breakdown are presented.
Kuzma, Jennifer; Najmaie, Pouya; Larson, Joel
2009-01-01
The U.S. oversight system for genetically engineered organisms (GEOs) was evaluated to develop hypotheses and derive lessons for oversight of other emerging technologies, such as nanotechnology. Evaluation was based upon quantitative expert elicitation, semi-standardized interviews, and historical literature analysis. Through an interdisciplinary policy analysis approach, blending legal, ethical, risk analysis, and policy sciences viewpoints, criteria were used to identify strengths and weaknesses of GEOs oversight and explore correlations among its attributes and outcomes. From the three sources of data, hypotheses and broader conclusions for oversight were developed. Our analysis suggests several lessons for oversight of emerging technologies: the importance of reducing complexity and uncertainty in oversight for minimizing financial burdens on small product developers; consolidating multi-agency jurisdictions to avoid gaps and redundancies in safety reviews; consumer benefits for advancing acceptance of GEO products; rigorous and independent pre- and post-market assessment for environmental safety; early public input and transparency for ensuring public confidence; and the positive role of public input in system development, informed consent, capacity, compliance, incentives, and data requirements and stringency in promoting health and environmental safety outcomes, as well as the equitable distribution of health impacts. Our integrated approach is instructive for more comprehensive analyses of oversight systems, developing hypotheses for how features of oversight systems affect outcomes, and formulating policy options for oversight of future technological products, especially nanotechnology products.
NASA Testing the Webb Telescope's MIRI Thermal Shield
2017-12-08
NASA engineer Acey Herrera recently checked out copper test wires inside the thermal shield of the Mid-Infrared Instrument, known as MIRI, that will fly aboard NASA's James Webb Space Telescope. The shield is designed to protect the vital MIRI instrument from excess heat. At the time of the photo, the thermal shield was about to go through rigorous environmental testing to ensure it can perform properly in the extreme cold temperatures that it will encounter in space. Herrera is working in a thermal vacuum chamber at NASA's Goddard Space Flight Center in Greenbelt, Md. As the MIRI shield lead, Herrera along with a thermal engineer and cryo-engineer verify that the shield is ready for testing. On the Webb telescope, the pioneering camera and spectrometer that comprise the MIRI instrument sit inside the Integrated Science Instrument Module flight structure, that holds Webb's four instruments and their electronic systems during launch and operations. Read more: 1.usa.gov/15I0wrS Credit: NASA/Chris Gunn
ERIC Educational Resources Information Center
McFadden, Paula; Taylor, Brian J.; Campbell, Anne; McQuilkin, Janice
2012-01-01
Context: The development of a consolidated knowledge base for social work requires rigorous approaches to identifying relevant research. Method: The quality of 10 databases and a web search engine were appraised by systematically searching for research articles on resilience and burnout in child protection social workers. Results: Applied Social…
Science and Mathematics Advanced Placement Exams: Growth and Achievement over Time
ERIC Educational Resources Information Center
Judson, Eugene
2017-01-01
Rapid growth of Advanced Placement (AP) exams in the last 2 decades has been paralleled by national enthusiasm to promote availability and rigor of science, technology, engineering, and mathematics (STEM). Trends were examined in STEM AP to evaluate and compare growth and achievement. Analysis included individual STEM subjects and disaggregation…
ERIC Educational Resources Information Center
Ashley, Michael; Cooper, Katelyn M.; Cala, Jacqueline M.; Brownell, Sara E.
2017-01-01
Summer bridge programs are designed to help transition students into the college learning environment. Increasingly, bridge programs are being developed in science, technology, engineering, and mathematics (STEM) disciplines because of the rigorous content and lower student persistence in college STEM compared with other disciplines. However, to…
Accuracy and performance of 3D mask models in optical projection lithography
NASA Astrophysics Data System (ADS)
Agudelo, Viviana; Evanschitzky, Peter; Erdmann, Andreas; Fühner, Tim; Shao, Feng; Limmer, Steffen; Fey, Dietmar
2011-04-01
Different mask models have been compared: rigorous electromagnetic field (EMF) modeling, rigorous EMF modeling with decomposition techniques and the thin mask approach (Kirchhoff approach) to simulate optical diffraction from different mask patterns in projection systems for lithography. In addition, each rigorous model was tested for two different formulations for partially coherent imaging: The Hopkins assumption and rigorous simulation of mask diffraction orders for multiple illumination angles. The aim of this work is to closely approximate results of the rigorous EMF method by the thin mask model enhanced with pupil filtering techniques. The validity of this approach for different feature sizes, shapes and illumination conditions is investigated.
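As a toy illustration of the thin mask (Kirchhoff) approach, the diffraction orders of an idealized one-dimensional binary mask are simply the Fourier coefficients of its transmission function. The pitch and CD below are arbitrary, and none of the rigorous EMF effects compared in the paper are captured.

    import numpy as np

    # 1-D binary mask: lines/spaces with a given pitch and CD (illustrative values)
    pitch_nm, cd_nm, samples = 180.0, 90.0, 512
    x = np.linspace(0.0, pitch_nm, samples, endpoint=False)
    transmission = (x < cd_nm).astype(float)   # 1 in the clear region, 0 under the absorber

    # Kirchhoff/thin-mask assumption: diffraction orders = Fourier coefficients
    orders = np.fft.fft(transmission) / samples
    for m in range(-3, 4):
        amp = orders[m]                        # negative indices wrap to negative orders
        print(f"order {m:+d}: |a| = {abs(amp):.3f}")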
Ontology-Driven Information Integration
NASA Technical Reports Server (NTRS)
Tissot, Florence; Menzel, Chris
2005-01-01
Ontology-driven information integration (ODII) is a method of computerized, automated sharing of information among specialists who have expertise in different domains and who are members of subdivisions of a large, complex enterprise (e.g., an engineering project, a government agency, or a business). In ODII, one uses rigorous mathematical techniques to develop computational models of engineering and/or business information and processes. These models are then used to develop software tools that support the reliable processing and exchange of information among the subdivisions of this enterprise or between this enterprise and other enterprises.
NASA Technical Reports Server (NTRS)
Rosner, D. E.; Gokoglu, S. A.; Israel, R.
1982-01-01
A multiparameter correlation approach to the study of particle deposition rates in engineering applications is discussed with reference to two specific examples, one dealing with thermophoretically augmented small particle convective diffusion and the other involving larger particle inertial impaction. The validity of the correlations proposed here is demonstrated through rigorous computations including all relevant phenomena and interactions. Such representations are shown to minimize apparent differences between various geometric, flow, and physicochemical parameters, allowing many apparently different physicochemical situations to be described in a unified way.
Physics for Scientists and Engineers, 5th edition - Volume 1
NASA Astrophysics Data System (ADS)
Tipler, Paul A.; Mosca, Gene P.
For nearly 30 years, Paul Tipler's Physics for Scientists and Engineers has set the standard in the introductory calculus-based physics course for clarity, accuracy, and precision. In this fifth edition, Paul has recruited Gene Mosca to bring his years of teaching experience to bear on the text, to scrutinize every explanation and example from the perspective of the freshman student. The result is a teaching tool that retains its precision and rigor, but offers struggling students the support they need to solve problems strategically and to gain real understanding of physical concepts.
Mechatronics as a technological basis for an innovative learning environment in engineering
NASA Astrophysics Data System (ADS)
Garner, Gavin Thomas
Mechatronic systems that couple mechanical and electrical systems with the help of computer control are forcing a paradigm shift in the design, manufacture, and implementation of mechanical devices. The inherently interdisciplinary nature of these systems generates exciting new opportunities for developing a hands-on, inventive, and creativity-focused educational program while still embracing rigorous scientific fundamentals. The technologies associated with mechatronics are continually evolving (e.g., integrated circuit chips, miniature and new types of sensors, and state-of-the-art actuators). As a result, a mechatronics curriculum must prepare students to adapt along with these rapidly changing technologies---and perhaps even advance these technologies themselves. Such is the inspiring and uncharted new world that is presented for student exploration and experimentation in the University of Virginia's Mechatronics Laboratory. The underlying goal of this research has been to develop a framework for teaching mechatronics that helps students master fundamental concepts and build essential technical and analytical skills. To this end, two courses involving over fifty hours worth of technologically-innovative and educationally-effective laboratory experiments have been developed along with open-ended projects in response to the unique and new challenges associated with teaching mechatronics. These experiments synthesize an unprecedentedly vast array of skills from many different disciplines and enable students to haptically absorb the fundamental concepts involved in designing mechatronic systems. They have been optimized through several iterations to become highly efficient. Perspectives on the development of these courses and on the field of mechatronics in general are included. Furthermore, this dissertation demonstrates the integration of new technologies within a learning environment specifically designed to teach mechatronics to mechanical engineers. For mechanical engineering in particular, mechatronics poses considerable challenges, and necessitates a fundamental evolution in the understanding of the relationship between the various engineering disciplines. Consequently, this dissertation helps to define the role that mechatronics must play in mechanical engineering and presents unique laboratory experiments, creative projects, and modeling and simulation exercises as effective tools for teaching mechatronics to the modern mechanical engineering student.
Systems engineering analysis of five 'as-manufactured' SXI telescopes
NASA Astrophysics Data System (ADS)
Harvey, James E.; Atanassova, Martina; Krywonos, Andrey
2005-09-01
Four flight models and a spare of the Solar X-ray Imager (SXI) telescope mirrors have been fabricated. The first of these is scheduled to be launched on the NOAA GOES- N satellite on July 29, 2005. A complete systems engineering analysis of the "as-manufactured" telescope mirrors has been performed that includes diffraction effects, residual design errors (aberrations), surface scatter effects, and all of the miscellaneous errors in the mirror manufacturer's error budget tree. Finally, a rigorous analysis of mosaic detector effects has been included. SXI is a staring telescope providing full solar disc images at X-ray wavelengths. For wide-field applications such as this, a field-weighted-average measure of resolution has been modeled. Our performance predictions have allowed us to use metrology data to model the "as-manufactured" performance of the X-ray telescopes and to adjust the final focal plane location to optimize the number of spatial resolution elements in a given operational field-of-view (OFOV) for either the aerial image or the detected image. The resulting performance predictions from five separate mirrors allow us to evaluate and quantify the optical fabrication process for producing these very challenging grazing incidence X-ray optics.
NASA Astrophysics Data System (ADS)
Thomas, W. A.; McAnally, W. H., Jr.
1985-07-01
TABS-2 is a generalized numerical modeling system for open-channel flows, sedimentation, and constituent transport. It consists of more than 40 computer programs to perform modeling and related tasks. The major modeling components--RMA-2V, STUDH, and RMA-4--calculate two-dimensional, depth-averaged flows, sedimentation, and dispersive transport, respectively. The other programs in the system perform digitizing, mesh generation, data management, graphical display, output analysis, and model interfacing tasks. Utilities include file management and automatic generation of computer job control instructions. TABS-2 has been applied to a variety of waterways, including rivers, estuaries, bays, and marshes. It is designed for use by engineers and scientists who may not have a rigorous computer background. Use of the various components is described in Appendices A-O. The bound version of the report does not include the appendices. A looseleaf form with Appendices A-O is distributed to system users.
Guilak, Farshid
2017-03-21
We are currently in one of the most exciting times for science and engineering as we witness unprecedented growth in our computational and experimental capabilities to generate new data and models. To facilitate data and model sharing, and to enhance reproducibility and rigor in biomechanics research, the Journal of Biomechanics has introduced a number of tools for Content Innovation to allow presentation, sharing, and archiving of methods, models, and data in our articles. The tools include an Interactive Plot Viewer, 3D Geometric Shape and Model Viewer, Virtual Microscope, Interactive MATLAB Figure Viewer, and Audioslides. Authors are highly encouraged to make use of these in upcoming journal submissions. Copyright © 2017 Elsevier Ltd. All rights reserved.
JACOB: an enterprise framework for computational chemistry.
Waller, Mark P; Dresselhaus, Thomas; Yang, Jack
2013-06-15
Here, we present just a collection of beans (JACOB): an integrated batch-based framework designed for the rapid development of computational chemistry applications. The framework expedites developer productivity by handling the generic infrastructure tier, and can be easily extended by user-specific scientific code. Paradigms from enterprise software engineering were rigorously applied to create a scalable, testable, secure, and robust framework. A centralized web application is used to configure and control the operation of the framework. The application-programming interface provides a set of generic tools for processing large-scale noninteractive jobs (e.g., systematic studies), or for coordinating systems integration (e.g., complex workflows). The code for the JACOB framework is open sourced and is available at: www.wallerlab.org/jacob. Copyright © 2013 Wiley Periodicals, Inc.
Optimal protocol for maximum work extraction in a feedback process with a time-varying potential
NASA Astrophysics Data System (ADS)
Kwon, Chulan
2017-12-01
The nonequilibrium nature of information thermodynamics is characterized by the inequality or non-negativity of the total entropy change of the system, memory, and reservoir. Mutual information change plays a crucial role in the inequality, in particular if work is extracted and the paradox of Maxwell's demon is raised. We consider the Brownian information engine where the protocol set of the harmonic potential is initially chosen by the measurement and varies in time. We confirm the inequality of the total entropy change by calculating, in detail, the entropic terms including the mutual information change. We rigorously find the optimal values of the time-dependent protocol for maximum extraction of work both for the finite-time and the quasi-static process.
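For orientation, the work bound most often quoted in this setting is the Sagawa-Ueda generalized second law for measurement-feedback processes, written below in a common textbook form rather than the authors' notation; W_ext is the extracted work, Delta F the free-energy change of the system, and I the mutual information gained by the measurement.

    \langle W_{\mathrm{ext}} \rangle \;\le\; -\,\Delta F \;+\; k_{B} T \, I

The paper's contribution, the protocol that makes such a bound as tight as possible for a time-varying harmonic potential, is not reproduced here.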
Respectful Modeling: Addressing Uncertainty in Dynamic System Models for Molecular Biology.
Tsigkinopoulou, Areti; Baker, Syed Murtuza; Breitling, Rainer
2017-06-01
Although there is still some skepticism in the biological community regarding the value and significance of quantitative computational modeling, important steps are continually being taken to enhance its accessibility and predictive power. We view these developments as essential components of an emerging 'respectful modeling' framework which has two key aims: (i) respecting the models themselves and facilitating the reproduction and update of modeling results by other scientists, and (ii) respecting the predictions of the models and rigorously quantifying the confidence associated with the modeling results. This respectful attitude will guide the design of higher-quality models and facilitate the use of models in modern applications such as engineering and manipulating microbial metabolism by synthetic biology. Copyright © 2016 Elsevier Ltd. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-14
... the States. Magnus Ericsson and Conny Harlin are part of a team of Volvo engineers and technicians... Swedish CDLs (74 FR 20778). Volvo Application for Exemption Volvo applied for exemption for drivers Magnus... experience and unblemished safety records of Magnus Ericsson and Conny Harlin, and the rigorous training and...
Technological Literacy: The Proper Focus to Educate All Students
ERIC Educational Resources Information Center
Loveland, Thomas; Love, Tyler
2017-01-01
As technology and engineering (T&E) education seeks to survive a shortage of teachers and funding, among other factors, it must proceed with caution. The field should remain true to its hands-on, design-based roots but must also provide rigorous instruction that applies STEM skills and situates it as a valuable stakeholder among the core…
ERIC Educational Resources Information Center
Crutchfield, Orpheus S. L.; Harrison, Christopher D.; Haas, Guy; Garcia, Daniel D.; Humphreys, Sheila M.; Lewis, Colleen M.; Khooshabeh, Peter
2011-01-01
The Berkeley Foundation for Opportunities in Information Technology is a decade-old endeavor to expose pre-college young women and underrepresented racial and ethnic minorities to the fields of computer science and engineering, and prepare them for rigorous, university-level study. We have served more than 150 students, and graduated more than 65…
Hertzian Dipole Radiation over Isotropic Magnetodielectric Substrates
2015-03-01
This report investigates dipole antennas printed on grounded ... engineering of thin planar antennas. Since these materials often require complicated constitutive equations to describe their properties rigorously, the ...
Validation Engine for Observational Protocols. Measures of Effective Teaching (MET) Project
ERIC Educational Resources Information Center
Bill & Melinda Gates Foundation, 2010
2010-01-01
In the fall of 2009, the Bill and Melinda Gates Foundation launched the two-year Measures of Effective Teaching (MET) project to rigorously develop and test multiple measures of teacher effectiveness. As part of the project, partners from more than a dozen reputable academic, non-profit and for-profit organizations collected and analyzed data from…
DOE Office of Scientific and Technical Information (OSTI.GOV)
LaSalle, F.R.; Golbeg, P.R.; Chenault, D.M.
For reactor and nuclear facilities, both Title 10, Code of Federal Regulations, Part 50, and US Department of Energy Order 6430.1A require assessments of the interaction of non-Safety Class 1 piping and equipment with Safety Class 1 piping and equipment during a seismic event to maintain the safety function. The safety class systems of nuclear reactors or nuclear facilities are designed to the applicable American Society of Mechanical Engineers standards and Seismic Category 1 criteria that require rigorous analysis, construction, and quality assurance. Because non-safety class systems are generally designed to lesser standards and seismic criteria, they may become missiles during a safe shutdown earthquake. The resistance of piping, tubing, and equipment to seismically generated missiles is addressed in the paper. Gross plastic and local penetration failures are considered with applicable test verification. Missile types and seismic zones of influence are discussed. Field qualification data are also developed for missile evaluation.
Radioisotope Electric Propulsion (REP): A Near-Term Approach to Nuclear Propulsion
NASA Technical Reports Server (NTRS)
Schmidt, George R.; Manzella, David H.; Kamhawi, Hani; Kremic, Tibor; Oleson, Steven R.; Dankanich, John W.; Dudzinski, Leonard A.
2009-01-01
Studies over the last decade have shown radioisotope-based nuclear electric propulsion to be enhancing and, in some cases, enabling for many potential robotic science missions. Also known as radioisotope electric propulsion (REP), the technology offers the performance advantages of traditional reactor-powered electric propulsion (i.e., high specific impulse propulsion at large distances from the Sun), but with much smaller, affordable spacecraft. Future use of REP requires development of radioisotope power sources with system specific powers well above that of current systems. The US Department of Energy and NASA have developed an advanced Stirling radioisotope generator (ASRG) engineering unit, which was subjected to rigorous flight qualification-level tests in 2008, and began extended lifetime testing later that year. This advancement, along with recent work on small ion thrusters and life extension technology for Hall thrusters, could enable missions using REP sometime during the next decade.
Developing Analogy Cost Estimates for Space Missions
NASA Technical Reports Server (NTRS)
Shishko, Robert
2004-01-01
The analogy approach in cost estimation combines actual cost data from similar existing systems, activities, or items with adjustments for a new project's technical, physical or programmatic differences to derive a cost estimate for the new system. This method is normally used early in a project cycle when there is insufficient design/cost data to use as a basis for (or insufficient time to perform) a detailed engineering cost estimate. The major limitation of this method is that it relies on the judgment and experience of the analyst/estimator. The analyst must ensure that the best analogy or analogies have been selected, and that appropriate adjustments have been made. While analogy costing is common, there is a dearth of advice in the literature on the 'adjustment methodology', especially for hardware projects. This paper discusses some potential approaches that can improve rigor and repeatability in the analogy costing process.
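The abstract notes that the adjustment step is rarely spelled out in the literature; one simple, commonly cited form, offered here only as a hedged illustration and not as the paper's method, scales a reference cost by a size ratio raised to a scaling exponent and then applies multiplicative adjustment factors. All names and numbers below are hypothetical.

    def analogy_cost(reference_cost, reference_mass, new_mass,
                     scaling_exponent=0.7, adjustment_factors=()):
        """Analogy estimate: reference cost scaled by size, adjusted for differences.
        adjustment_factors: multiplicative factors > 1 for added complexity,
        < 1 for simplifications or inherited designs."""
        estimate = reference_cost * (new_mass / reference_mass) ** scaling_exponent
        for factor in adjustment_factors:
            estimate *= factor
        return estimate

    if __name__ == "__main__":
        # Hypothetical: $120M analog instrument, 80 kg -> 110 kg,
        # new detector technology (+25%), inherited avionics (-10%)
        print(f"${analogy_cost(120.0, 80.0, 110.0, adjustment_factors=(1.25, 0.9)):.1f}M")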
Systems Engineering Lessons Learned for Class D Missions
NASA Technical Reports Server (NTRS)
Rojdev, Kristina; Piatek, Irene; Moore, Josh; Calvert, Derek
2015-01-01
One of NASA's goals within human exploration is to determine how to get humans to Mars safely and to live and work on the Martian surface. To accomplish this goal, several smaller missions act as stepping-stones to the larger end goal. NASA uses these smaller missions to develop new technologies and learn about how to survive outside of Low Earth Orbit for long periods. Additionally, keeping a cadence of these missions allows the team to maintain proficiency in the complex art of bringing spacecraft to fruition. Many of these smaller missions are robotic in nature and have smaller timescales, whereas there are others that involve crew and have longer mission timelines. Given the timelines associated with these various missions, different levels of risk and rigor need to be implemented to be more in line with what is appropriate for the mission. Thus, NASA has four different classifications that range from Class A to Class D based on the mission details. One of these projects is the Resource Prospector (RP) Mission, which is a multi-center and multi-institution collaborative project to search for volatiles in the polar regions of the Moon. The RP mission is classified as a Class D mission and as such, has the opportunity to more tightly manage, and therefore accept, greater levels of risk. The requirements for Class D missions were at the forefront of the design and thus presented unique challenges in vehicle development and systems engineering processes. This paper will discuss the systems engineering process at NASA and how that process is tailored for Class D missions, specifically the RP mission.
NASA Technical Reports Server (NTRS)
Cole, Bjorn; Chung, Seung H.
2012-01-01
One of the challenges of systems engineering is working multidisciplinary problems in a cohesive manner. When planning analysis of these problems, system engineers must trade off time and cost against analysis quality and quantity. The quality is associated with the fidelity of the multidisciplinary models and the quantity is associated with the design space that can be analyzed. The tradeoff is due to the resource-intensive process of creating a cohesive multidisciplinary system model and analysis. Furthermore, reuse or extension of the models used in one stage of a product life cycle for another is a major challenge. Recent developments have enabled a much less resource-intensive and more rigorous approach than handwritten translation scripts or codes of multidisciplinary models and their analyses. The key is to work from a core system model defined in a MOF-based language such as SysML and to leverage the emerging tool ecosystem, such as Query-View-Transform (QVT), from the OMG community. SysML was designed to model multidisciplinary systems and analyses. The QVT standard was designed to transform SysML models. The Europa Habitability Mission (EHM) team has begun to exploit these capabilities. In one case, a Matlab/Simulink model is generated on the fly from a system description for power analysis written in SysML. In a more general case, a symbolic mathematical framework (supported by Wolfram Mathematica) is coordinated by data objects transformed from the system model, enabling extremely flexible and powerful tradespace exploration and analytical investigations of expected system performance.
Use of perfusion bioreactors and large animal models for long bone tissue engineering.
Gardel, Leandro S; Serra, Luís A; Reis, Rui L; Gomes, Manuela E
2014-04-01
Tissue engineering and regenerative medicine (TERM) strategies for generation of new bone tissue include the combined use of autologous or heterologous mesenchymal stem cells (MSC) and three-dimensional (3D) scaffold materials serving as structural support for the cells, which develop into tissue-like substitutes under appropriate in vitro culture conditions. This approach is very important due to the limitations and risks associated with the autologous as well as allogenic bone grafting procedures currently used. However, the cultivation of osteoprogenitor cells in 3D scaffolds presents several challenges, such as the efficient transport of nutrients and oxygen and removal of waste products from the cells in the interior of the scaffold. In this context, perfusion bioreactor systems are key components for bone TERM, as many recent studies have shown that such systems can provide dynamic environments with enhanced diffusion of nutrients and that, therefore, perfusion can be used to generate grafts of clinically relevant sizes and shapes. Nevertheless, to determine whether a developed tissue-like substitute conforms to the requirements of biocompatibility, mechanical stability and safety, it must undergo rigorous testing both in vitro and in vivo. Results from in vitro studies can be difficult to extrapolate to the in vivo situation, and for this reason, the use of animal models is often an essential step in the testing of orthopedic implants before clinical use in humans. This review provides an overview of the concepts, advantages, and challenges associated with different types of perfusion bioreactor systems, particularly focusing on systems that may enable the generation of critical-size tissue-engineered constructs. Furthermore, this review discusses some of the most frequently used animal models, such as sheep and goats, to study the in vivo functionality of bone implant materials in critical-size defects.
A Practical Engineering Approach to Predicting Fatigue Crack Growth in Riveted Lap Joints
NASA Technical Reports Server (NTRS)
Harris, Charles E.; Piascik, Robert S.; Newman, James C., Jr.
1999-01-01
An extensive experimental database has been assembled from very detailed teardown examinations of fatigue cracks found in rivet holes of fuselage structural components. Based on this experimental database, a comprehensive analysis methodology was developed to predict the onset of widespread fatigue damage in lap joints of fuselage structure. Several computer codes were developed with specialized capabilities to conduct the various analyses that make up the comprehensive methodology. Over the past several years, the authors have interrogated various aspects of the analysis methods to determine the degree of computational rigor required to produce numerical predictions with acceptable engineering accuracy. This study led to the formulation of a practical engineering approach to predicting fatigue crack growth in riveted lap joints. This paper describes the practical engineering approach and compares predictions with the results from several experimental studies.
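The comprehensive methodology and specialized codes referred to above are far more detailed than any textbook relation; purely for orientation, the simplest crack-growth calculation of this general kind is a cycle-by-cycle Paris-law integration, sketched below with hypothetical material constants and a crude geometry factor.

    import math

    def grow_crack(a0_m, c=1.0e-11, m=3.0, stress_range_mpa=100.0, beta=1.12,
                   a_crit_m=0.01, max_cycles=2_000_000):
        """Integrate the Paris law da/dN = C * (dK)^m cycle by cycle.
        dK in MPa*sqrt(m); beta is a crude geometry factor (hypothetical)."""
        a, n = a0_m, 0
        while a < a_crit_m and n < max_cycles:
            delta_k = beta * stress_range_mpa * math.sqrt(math.pi * a)
            a += c * delta_k ** m
            n += 1
        return n, a

    if __name__ == "__main__":
        cycles, a_final = grow_crack(a0_m=0.0005)
        print(f"{cycles} cycles to grow from 0.5 mm to {a_final * 1000:.1f} mm")

Real lap-joint analyses add crack-closure effects, load interaction, multi-site damage, and hole geometry corrections that this sketch deliberately omits.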
A Practical Engineering Approach to Predicting Fatigue Crack Growth in Riveted Lap Joints
NASA Technical Reports Server (NTRS)
Harris, C. E.; Piascik, R. S.; Newman, J. C., Jr.
2000-01-01
An extensive experimental database has been assembled from very detailed teardown examinations of fatigue cracks found in rivet holes of fuselage structural components. Based on this experimental database, a comprehensive analysis methodology was developed to predict the onset of widespread fatigue damage in lap joints of fuselage structure. Several computer codes were developed with specialized capabilities to conduct the various analyses that make up the comprehensive methodology. Over the past several years, the authors have interrogated various aspects of the analysis methods to determine the degree of computational rigor required to produce numerical predictions with acceptable engineering accuracy. This study led to the formulation of a practical engineering approach to predicting fatigue crack growth in riveted lap joints. This paper describes the practical engineering approach and compares predictions with the results from several experimental studies.
A New Overview of The Trilinos Project
Heroux, Michael A.; Willenbring, James M.
2012-01-01
Since An Overview of the Trilinos Project [ACM Trans. Math. Softw. 31(3) (2005), 397–423] was published in 2005, Trilinos has grown significantly. It now supports the development of a broad collection of libraries for scalable computational science and engineering applications, and a full-featured software infrastructure for rigorous lean/agile software engineering. This growth has created significant opportunities and challenges. This paper focuses on some of the most notable changes to the Trilinos project in the last few years. At the time of the writing of this article, the current release version of Trilinos was 10.12.2.
NASA Astrophysics Data System (ADS)
Claver, Chuck F.; Dubois-Felsmann, G. P.; Delgado, F.; Hascall, P.; Horn, D.; Marshall, S.; Nordby, M.; Schalk, T. L.; Schumacher, G.; Sebag, J.; LSST Project Team
2010-01-01
The LSST is a complete observing system that acquires and archives images, processes and analyzes them, and publishes reduced images and catalogs of sources and objects. The LSST will operate over a ten-year period, producing a survey of 20,000 square degrees over the entire southern sky in 6 filters (ugrizy), with each field having been visited several hundred times, enabling a wide spectrum of science from fast transients to exploration of dark matter and dark energy. The LSST itself is a complex system of systems consisting of the 8.4m three mirror telescope, a 3.2 billion pixel camera, and a peta-scale data management system. The LSST project uses a Model Based Systems Engineering (MBSE) methodology to ensure an integrated approach to system design and rigorous definition of system interfaces and specifications. The MBSE methodology is applied through modeling of the LSST's systems with the System Modeling Language (SysML). The SysML modeling recursively establishes the threefold relationship between requirements, logical & physical functional decomposition and definition, and system and component behavior at successively deeper levels of abstraction and detail. The MBSE approach is applied throughout all stages of the project from design, to validation and verification, through to commissioning.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boggs, Paul T.; Althsuler, Alan; Larzelere, Alex R.
2005-08-01
The Design-through-Analysis Realization Team (DART) is chartered with reducing the time Sandia analysts require to complete the engineering analysis process. The DART system analysis team studied the engineering analysis processes employed by analysts in Centers 9100 and 8700 at Sandia to identify opportunities for reducing overall design-through-analysis process time. The team created and implemented a rigorous analysis methodology based on a generic process flow model parameterized by information obtained from analysts. They also collected data from analysis department managers to quantify the problem type and complexity distribution throughout Sandia's analyst community. They then used this information to develop a community model, which enables a simple characterization of processes that span the analyst community. The results indicate that equal opportunity for reducing analysis process time is available both by reducing the "once-through" time required to complete a process step and by reducing the probability of backward iteration. In addition, reducing the rework fraction (i.e., improving the engineering efficiency of subsequent iterations) offers approximately 40% to 80% of the benefit of reducing the "once-through" time or iteration probability, depending upon the process step being considered. Further, the results indicate that geometry manipulation and meshing is the largest portion of an analyst's effort, especially for structural problems, and offers significant opportunity for overall time reduction. Iteration loops initiated late in the process are more costly than others because they increase "inner loop" iterations. Identifying and correcting problems as early as possible in the process offers significant opportunity for time savings.
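A back-of-the-envelope model shows why the once-through time, the iteration probability, and the rework fraction all matter; the sketch below treats backward iteration as a geometric process, and the numbers are hypothetical rather than Sandia data.

    def expected_process_time(once_through, p_iterate, rework_fraction):
        """Expected total time for a step that is redone with probability p_iterate,
        each redo costing rework_fraction of the once-through time (geometric series)."""
        expected_redos = p_iterate / (1.0 - p_iterate)
        return once_through * (1.0 + expected_redos * rework_fraction)

    if __name__ == "__main__":
        base   = expected_process_time(once_through=10.0, p_iterate=0.4, rework_fraction=0.6)
        faster = expected_process_time(once_through=10.0, p_iterate=0.2, rework_fraction=0.6)
        leaner = expected_process_time(once_through=10.0, p_iterate=0.4, rework_fraction=0.3)
        print(f"baseline {base:.1f}, halved iteration probability {faster:.1f}, "
              f"halved rework fraction {leaner:.1f}")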
ERIC Educational Resources Information Center
Eisenhart, Margaret; Weis, Lois; Allen, Carrie D.; Cipollone, Kristin; Stich, Amy; Dominguez, Rachel
2015-01-01
In response to numerous calls for more rigorous STEM (science, technology, engineering, and mathematics) education to improve US competitiveness and the job prospects of next-generation workers, especially those from low-income and minority groups, a growing number of schools emphasizing STEM have been established in the US over the past decade.…
ERIC Educational Resources Information Center
Gottfried, Michael; Bozick, Robert
2012-01-01
Academic math and science courses have been long shown to increase learning and educational attainment, but are they sufficient on their own to prepare youth for the challenges and rigor of the STEM workforce? Or, are there distinctive benefits to complementing these traditional academic courses with applied ones? Answers to these questions are…
Performance and Safety to NAVSEA Instruction 9310.1A of Lithium-thionyl Chloride Reserve Batteries
NASA Technical Reports Server (NTRS)
Hall, J. C.
1984-01-01
The design, performance, and safety of a fully engineered, self-contained Li/SOCl2 battery as the power source for underwater applications are described. In addition to meeting the performance standards of the end user, this battery was successfully tested under the rigorous safety conditions of NAVSEA Instruction 9310.1A for use on land, aircraft, and surface ships.
Closed loop statistical performance analysis of N-K knock controllers
NASA Astrophysics Data System (ADS)
Peyton Jones, James C.; Shayestehmanesh, Saeed; Frey, Jesse
2017-09-01
The closed loop performance of engine knock controllers cannot be rigorously assessed from single experiments or simulations because knock behaves as a random process, and the closed loop response therefore also follows a distribution. In this work a new method is proposed for computing the distributions and expected values of the closed loop response, both in steady state and in response to disturbances. The method takes as its inputs the control law and the knock propensity characteristic of the engine, which is mapped from open loop steady state tests. The method is applicable to the 'n-k' class of knock controllers, in which the control action is a function only of the number of cycles n since the last control move and the number k of knock events that have occurred in this time. A Cumulative Summation (CumSum) based controller falls within this category, and the method is used to investigate the performance of the controller in a deeper and more rigorous way than has previously been possible. The results are validated using onerous Monte Carlo simulations, which confirm both the validity of the method and its high computational efficiency.
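To make the setting concrete, the sketch below runs a Monte Carlo simulation of a simple 'retard on knock, advance after n quiet cycles' rule, with knock modeled as a Bernoulli event whose probability rises with spark advance. The propensity map, step sizes, and thresholds are hypothetical, and this is a generic n-k style rule rather than the CumSum controller analyzed in the paper.

    import random

    def knock_probability(advance_deg):
        # Hypothetical knock-propensity map: more spark advance -> more knock
        return min(0.5, max(0.001, 0.02 * 2.0 ** (advance_deg - 10.0)))

    def run_controller(cycles=20000, adv_step=0.25, retard_step=1.0, n_quiet=50, seed=1):
        random.seed(seed)
        advance, n_since_knock, history = 8.0, 0, []
        for _ in range(cycles):
            knocked = random.random() < knock_probability(advance)
            if knocked:
                advance -= retard_step            # immediate retard on a knock event
                n_since_knock = 0
            else:
                n_since_knock += 1
                if n_since_knock >= n_quiet:      # advance only after n quiet cycles
                    advance += adv_step
                    n_since_knock = 0
            history.append(advance)
        return sum(history) / len(history)

    if __name__ == "__main__":
        # Averaging many seeded runs approximates the mean of the closed-loop distribution
        means = [run_controller(seed=s) for s in range(20)]
        print(f"mean spark advance ~ {sum(means) / len(means):.2f} deg")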
Na, Hyuntae; Lee, Seung-Yub; Üstündag, Ersan; ...
2013-01-01
This paper introduces a recent development and application of a noncommercial artificial neural network (ANN) simulator with a graphical user interface (GUI) to assist in rapid data modeling and analysis in the engineering diffraction field. The real-time network training/simulation monitoring tool has been customized for the study of constitutive behavior of engineering materials, and it has improved the data mining and forecasting capabilities of neural networks. This software has been used to train on and simulate the finite element modeling (FEM) data for a fiber composite system, both forward and inverse. The forward neural network simulation precisely reproduces FEM results several orders of magnitude faster than the slow original FEM. The inverse simulation is more challenging; yet, material parameters can be meaningfully determined with the aid of parameter sensitivity information. The simulator GUI also reveals that the output node size for material parameters and the input normalization method for strain data are critical training conditions for the inverse network. The successful use of ANN modeling and the simulator GUI has been validated against engineering neutron diffraction experimental data by determining constitutive laws of the real fiber composite materials via a mathematically rigorous and physically meaningful parameter search process, once the networks are successfully trained from the FEM database.
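As a minimal sketch of the forward-surrogate idea (train a small network on simulation input/output pairs, then use it as a fast stand-in for the solver), the example below uses scikit-learn and a synthetic function in place of the FEM database; none of it reproduces the custom GUI simulator described above.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    # Synthetic stand-in for FEM results: a response as a function of two
    # normalized material parameters (purely illustrative)
    rng = np.random.default_rng(0)
    X = rng.uniform(0.0, 1.0, size=(500, 2))
    y = 0.8 * X[:, 0] + 0.3 * np.sin(3.0 * X[:, 1]) + 0.01 * rng.standard_normal(500)

    # Forward surrogate: parameters -> response, far faster to evaluate than the solver
    net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
    net.fit(X, y)
    print("surrogate prediction:", net.predict([[0.5, 0.5]])[0])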
Cutting More than Metal: Breaking the Development Cycle
NASA Technical Reports Server (NTRS)
Singer, Chris
2014-01-01
New technology is changing the way we do business at NASA. The ability to use these new tools is made possible by a learning culture able to embrace innovation, flexibility, and prudent risk tolerance, while retaining the hard-won lessons learned from other successes and failures. Technologies such as 3-D manufacturing and structured light scanning are reshaping the entire product life cycle, from design and analysis, through production, verification, logistics, and operations. New fabrication techniques, verification techniques, integrated analysis, and models that follow the hardware from initial concept through operation are reducing the cost and time of building space hardware. Using these technologies to be more efficient, reliable, and affordable requires that we bring them to a level safe for NASA systems, maintain appropriate rigor in testing and acceptance, and transition new technology. Maximizing these technologies also requires cultural acceptance and understanding, and a balance of rules with creativity. Evolved systems engineering processes at NASA are increasingly flexible, enabling the implementation of new techniques and approaches. This paper provides an overview of NASA Marshall Space Flight Center's new approach to development, as well as examples of how that approach has been incorporated into NASA's Space Launch System (SLS) Program, which counts safety, affordability, and sustainability among its key tenets. The design and testing of various rocket engine components using these 3-D technologies will also be discussed in this paper.
Large Liquid Rocket Testing: Strategies and Challenges
NASA Technical Reports Server (NTRS)
Rahman, Shamim A.; Hebert, Bartt J.
2005-01-01
Rocket propulsion development is enabled by rigorous ground testing in order to mitigate the propulsion systems risks that are inherent in space flight. This is true for virtually all propulsive devices of a space vehicle including liquid and solid rocket propulsion, chemical and non-chemical propulsion, boost stage and in-space propulsion and so forth. In particular, large liquid rocket propulsion development and testing over the past five decades of human and robotic space flight has involved a combination of component-level testing and engine-level testing to first demonstrate that the propulsion devices were designed to meet the specified requirements for the Earth to Orbit launchers that they powered. This was followed by a vigorous test campaign to demonstrate the designed propulsion articles over the required operational envelope, and over robust margins, such that a sufficiently reliable propulsion system is delivered prior to first flight. It is possible that hundreds of tests, and on the order of a hundred thousand test seconds, are needed to achieve a high-reliability, flight-ready, liquid rocket engine system. This paper overviews aspects of earlier and recent experience of liquid rocket propulsion testing at NASA Stennis Space Center, where full scale flight engines and flight stages, as well as a significant amount of development testing has taken place in the past decade. The liquid rocket testing experience discussed includes testing of engine components (gas generators, preburners, thrust chambers, pumps, powerheads), as well as engine systems and complete stages. The number of tests, accumulated test seconds, and years of test stand occupancy needed to meet varying test objectives, will be selectively discussed and compared for the wide variety of ground test work that has been conducted at Stennis for subscale and full scale liquid rocket devices. Since rocket propulsion is a crucial long-lead element of any space system acquisition or development, the appropriate plan and strategy must be put in place at the outset of the development effort. A deferment of this test planning, or inattention to strategy, will compromise the ability of the development program to achieve its systems reliability requirements and/or its development milestones. It is important for the government leadership and support team, as well as the vehicle and propulsion development team, to give early consideration to this aspect of space propulsion and space transportation work.
Systems approach used in the Gas Centrifuge Enrichment Plant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rooks, W.A. Jr.
A requirement exists for effective and efficient transfer of technical knowledge from the design engineering team to the production work force. Performance-Based Training (PBT) is a systematic approach to the design, development, and implementation of technical training. This approach has been successfully used by the US Armed Forces, industry, and other organizations. The advantages of the PBT approach are: cost-effectiveness (lowest life-cycle training cost), learning effectiveness, reduced implementation time, and ease of administration. The PBT process comprises five distinctive and rigorous phases: Analysis of Job Performance, Design of Instructional Strategy, Development of Training Materials and Instructional Media, Validation of Materials and Media, and Implementation of the Instructional Program. Examples from the Gas Centrifuge Enrichment Plant (GCEP) are used to illustrate the application of PBT.
Tracing the Rationale Behind UML Model Change Through Argumentation
NASA Astrophysics Data System (ADS)
Jureta, Ivan J.; Faulkner, Stéphane
Neglecting traceability, i.e., the ability to describe and follow the life of a requirement, is known to entail misunderstanding and miscommunication, leading to the engineering of poor-quality systems. Following the simple principles that (a) changes to UML model instances ought to be justified to the stakeholders, (b) justification should proceed in a structured manner to ensure rigor in discussions, critique, and revisions of model instances, and (c) the concept of argument instantiated in a justification process ought to be well defined and understood, the present paper introduces the UML Traceability through Argumentation Method (UML-TAM) to enable the traceability of design rationale in UML while allowing the appropriateness of model changes to be checked by analysis of the structure of the arguments provided to justify such changes.
Improving formaldehyde consumption drives methanol assimilation in engineered E. coli.
Woolston, Benjamin M; King, Jason R; Reiter, Michael; Van Hove, Bob; Stephanopoulos, Gregory
2018-06-19
Due to volatile sugar prices, the food vs fuel debate, and recent increases in the supply of natural gas, methanol has emerged as a promising feedstock for the bio-based economy. However, attempts to engineer Escherichia coli to metabolize methanol have achieved limited success. Here, we provide a rigorous systematic analysis of several potential pathway bottlenecks. We show that regeneration of ribulose 5-phosphate in E. coli is insufficient to sustain methanol assimilation, and overcome this by activating the sedoheptulose bisphosphatase variant of the ribulose monophosphate pathway. By leveraging the kinetic isotope effect associated with deuterated methanol as a chemical probe, we further demonstrate that under these conditions overall pathway flux is kinetically limited by methanol dehydrogenase. Finally, we identify NADH as a potent kinetic inhibitor of this enzyme. These results provide direction for future engineering strategies to improve methanol utilization, and underscore the value of chemical biology methodologies in metabolic engineering.
The rehabilitation engineering research center for the advancement of cognitive technologies.
Heyn, Patricia Cristine; Cassidy, Joy Lucille; Bodine, Cathy
2015-02-01
Barring a few exceptions, allied health professionals, engineers, manufacturers of assistive technologies (ATs), and consumer product manufacturers have developed few technologies for individuals with cognitive impairments (CIs). In 2004, the National Institute on Disability and Rehabilitation Research (NIDRR) recognized the need to support research in this emergent field. It funded the first Rehabilitation Engineering Research Center for the Advancement of Cognitive Technologies (RERC-ACT). The RERC-ACT has since designed and evaluated existing and emerging technologies through rigorous research, improving upon existing AT devices and creating new technologies for individuals with CIs. The RERC-ACT has contributed to the development and testing of AT products that assist persons with CIs to actively engage in tasks of daily living at home, school, work, and in the community. This article highlights the RERC-ACT's engineering development and research projects and discusses how current research may impact the quality of life for an aging population. © The Author(s) 2014.
On decentralized design: Rationale, dynamics, and effects on decision-making
NASA Astrophysics Data System (ADS)
Chanron, Vincent
The focus of this dissertation is the design of complex systems, including engineering systems such as cars, airplanes, and satellites. Companies that design these systems are under constant pressure to design better products that meet customer expectations, and competition forces them to develop them faster. One of the responses of the industry to these conflicting challenges has been the decentralization of design responsibilities. The current lack of understanding of the dynamics of decentralized design processes is the main motivation for this research and places value on its descriptive base. The dissertation identifies the main reasons and the true benefits for companies to decentralize the design of their products. It also demonstrates the limitations of this approach by listing the relevant issues and problems created by the decentralization of decisions. Based on these observations, a game-theoretic approach to decentralized design is proposed to model the decisions made during the design process. The dynamics are modeled using mathematical formulations inspired by control theory. Building upon this formalism, the issue of convergence in decentralized design is analyzed: the equilibrium points of the design space are identified and convergent and divergent patterns are recognized. This rigorous investigation of the design process provides motivation and support for proposing new approaches to decentralized design problems. Two methods are developed, which aim at improving the design process in two ways: decreasing the product development time and increasing the optimality of the final design. These methods are inspired by eigenstructure decomposition and set-based design, respectively. The value of the research detailed within this dissertation lies in the proposed methods, which are built upon the sound mathematical formalism developed. The contribution of this work is twofold: rigorous investigation of the design process, and practical support to decision-making in decentralized environments.
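The convergence analysis described above can be illustrated with a toy example. The sketch below is not taken from the dissertation and its coupling values are hypothetical; it models two design teams performing simultaneous best-response updates, so the coupled process reduces to a linear iteration whose convergence is governed by a spectral radius.

```python
import numpy as np

# Illustrative sketch (coupling values are hypothetical, not from the
# dissertation): two design teams each control one variable and, at every
# iteration, apply their best response to the other team's last decision.
# For quadratic objectives the best responses are affine, so the coupled
# process is the linear iteration x_{k+1} = A @ x_k + b, and convergence
# is governed by the spectral radius of A.
A = np.array([[0.0, 0.6],
              [0.7, 0.0]])      # off-diagonal entries = coupling between teams
b = np.array([1.0, -0.5])

rho = max(abs(np.linalg.eigvals(A)))
print(f"spectral radius = {rho:.2f} ({'converges' if rho < 1 else 'diverges'})")

x = np.zeros(2)
for _ in range(50):
    x = A @ x + b               # simultaneous best-response update
print("design variables after 50 iterations:", x)
print("equilibrium point:", np.linalg.solve(np.eye(2) - A, b))
```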
Systems engineering for the Kepler Mission: a search for terrestrial planets
NASA Technical Reports Server (NTRS)
Duren, Riley M.; Dragon, Karen; Gunter, Steve Z.; Gautier, Nick; Koch, Dave; Harvey, Adam; Enos, Alan; Borucki, Bill; Sobeck, Charlie; Mayer, Dave;
2004-01-01
The Kepler mission will launch in 2007 and determine the distribution of earth-size planets (0.5 to 10 earth masses) in the habitable zones (HZs) of solar-like stars. The mission will monitor > 100,000 dwarf stars simultaneously for at least 4 years. Precision differential photometry will be used to detect the periodic signals of transiting planets. Kepler will also support asteroseismology by measuring the pressure-mode (p-mode) oscillations of selected stars. Key mission elements include a spacecraft bus and 0.95 meter, wide-field, CCD-based photometer injected into an earth-trailing heliocentric orbit by a 3-stage Delta II launch vehicle as well as a distributed Ground Segment and Follow-up Observing Program. The project is currently preparing for Preliminary Design Review (October 2004) and is proceeding with detailed design and procurement of long-lead components. In order to meet the unprecedented photometric precision requirement and to ensure a statistically significant result, the Kepler mission involves technical challenges in the areas of photometric noise and systematic error reduction, stability, and false-positive rejection. Programmatic and logistical challenges include the collaborative design, modeling, integration, test, and operation of a geographically and functionally distributed project. A very rigorous systems engineering program has evolved to address these challenges. This paper provides an overview of the Kepler systems engineering program, including some examples of our processes and techniques in areas such as requirements synthesis, validation & verification, system robustness design, and end-to-end performance modeling.
Synergizing 13C Metabolic Flux Analysis and Metabolic Engineering for Biochemical Production.
Guo, Weihua; Sheng, Jiayuan; Feng, Xueyang
Metabolic engineering of industrial microorganisms to produce chemicals, fuels, and drugs has attracted increasing interest as it provides an environment-friendly and renewable route that does not depend on depleting petroleum sources. However, the microbial metabolism is so complex that metabolic engineering efforts often have difficulty in achieving a satisfactory yield, titer, or productivity of the target chemical. To overcome this challenge, 13C Metabolic Flux Analysis (13C-MFA) has been developed to investigate rigorously the cell metabolism and quantify the carbon flux distribution in central metabolic pathways. In the past decade, 13C-MFA has been widely used in academic labs and the biotechnology industry to pinpoint the key issues related to microbial-based chemical production and to guide the development of the appropriate metabolic engineering strategies for improving the biochemical production. In this chapter we introduce the basics of 13C-MFA and illustrate how 13C-MFA has been applied to synergize with metabolic engineering to identify and tackle the rate-limiting steps in biochemical production.
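As a concrete illustration of the flux-balance backbone that 13C-MFA builds on, the following sketch uses a toy network with hypothetical measured fluxes; it is not the chapter's software and omits the isotope-labeling data that distinguishes 13C-MFA, estimating fluxes only from stoichiometric balances and measured exchange rates by least squares.

```python
import numpy as np

# Minimal illustrative sketch (not the chapter's software): classical
# metabolic flux analysis balances each intracellular metabolite at steady
# state, S @ v = 0, and combines those balances with measured exchange
# fluxes to estimate the remaining fluxes by least squares. Real 13C-MFA
# additionally fits isotope-labeling patterns; that part is omitted here.
#
# Toy branch-point network:
#   v1: substrate -> A,  v2: A -> B,  v3: A -> C,  v4: B -> out,  v5: C -> out
S = np.array([
    [1, -1, -1,  0,  0],   # balance on A
    [0,  1,  0, -1,  0],   # balance on B
    [0,  0,  1,  0, -1],   # balance on C
], dtype=float)

# Measured exchange fluxes (hypothetical values): uptake v1 and secretion v4.
M = np.array([
    [1, 0, 0, 0, 0],
    [0, 0, 0, 1, 0],
], dtype=float)
measured = np.array([10.0, 6.2])

A = np.vstack([S, M])
b = np.concatenate([np.zeros(3), measured])
v, *_ = np.linalg.lstsq(A, b, rcond=None)
for name, flux in zip(["v1", "v2", "v3", "v4", "v5"], v):
    print(f"{name} = {flux:5.2f}")
```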
NASA Astrophysics Data System (ADS)
Uchrin, Christoph; Krogmann, Uta; Gimenez, Daniel
2010-05-01
It is becoming increasingly apparent that environmental problems have become extremely complex, requiring inter- and multidisciplinary expertise. Furthermore, the nature of environmental episodes requires practitioners who are flexible in designing appropriate solution approaches. As a result, there is a high demand for environmental engineering graduates in the professional sector as well as in graduate schools. At Rutgers University, we have designed and are now delivering an undergraduate curriculum that melds a strong background in basic and applied sciences with a rigorous sequence of design-oriented engineering courses, all focused on producing graduates who view the environment in a holistic sense rather than a narrow, medium-oriented manner. Since the implementation of the program in 2004, student numbers have doubled and half of the students graduate with honors. The undergraduate program is complemented by the new Environmental Engineering option of the Graduate Program in Environmental Sciences. The undergraduate program and the graduate option are served by a highly committed faculty of seven full-time members and one part-time member.
A Disturbance Rejection Framework for the Study of Traditional Chinese Medicine
Sun, Yan
2014-01-01
Traditional Chinese medicine (TCM) is explained in the language of engineering cybernetics (EC), an engineering science with a tradition of rigor and a long history of practice. The inherent connection between EC, as a science of interrelations, and the Chinese conception of Wuxing is articulated. The combined cybernetic model of Wuxing appears to have significant explanatory power for TCM and could potentially facilitate better communication of its insights to the West. Disturbance rejection, an engineering concept, is found to be a powerful metaphor for how TCM is practiced, using liver cancer pathogenesis and treatment as a case study. The results from a series of experimental studies seem to lend support to the cybernetic model of Wuxing and the principles of disturbance rejection. PMID:24995034
Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness
NASA Technical Reports Server (NTRS)
Staats, Matt; Whalen, Michael W.; Heimdahl, Mats P. E.; Rajan, Ajitha
2010-01-01
In black-box testing, the tester creates a set of tests to exercise a system under test without regard to the internal structure of the system. Generally, no objective metric is used to measure the adequacy of black-box tests. In recent work, we have proposed three requirements coverage metrics, allowing testers to objectively measure the adequacy of a black-box test suite with respect to a set of requirements formalized as Linear Temporal Logic (LTL) properties. In this report, we evaluate the effectiveness of these coverage metrics with respect to fault finding. Specifically, we conduct an empirical study to investigate two questions: (1) do test suites satisfying a requirements coverage metric provide better fault finding than randomly generated test suites of approximately the same size?, and (2) do test suites satisfying a more rigorous requirements coverage metric provide better fault finding than test suites satisfying a less rigorous requirements coverage metric? Our results indicate (1) only one coverage metric proposed -- Unique First Cause (UFC) coverage -- is sufficiently rigorous to ensure test suites satisfying the metric outperform randomly generated test suites of similar size and (2) that test suites satisfying more rigorous coverage metrics provide better fault finding than test suites satisfying less rigorous coverage metrics.
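To make the idea of requirements coverage concrete, the sketch below is illustrative only and uses simplified finite-trace semantics rather than the report's formal LTL machinery; it checks the property G(request -> F grant) over randomly generated traces and counts how many of them actually exercise the requirement's antecedent.

```python
import random

# Illustrative sketch (not the report's tool chain): a requirement
# formalized as the LTL property G(request -> F grant), checked here with
# simplified finite-trace semantics. Counting how often random traces
# exercise the antecedent hints at why structureless random suites can
# under-test a requirement.
def satisfies_g_req_implies_f_grant(trace):
    """True if every 'request' step is eventually followed by 'grant'."""
    for i, (request, _) in enumerate(trace):
        if request and not any(grant for _, grant in trace[i:]):
            return False
    return True

def antecedent_exercised(trace):
    """A trace only meaningfully tests the requirement if 'request' occurs."""
    return any(request for request, _ in trace)

random.seed(0)
suite = [[(random.random() < 0.1, random.random() < 0.3) for _ in range(20)]
         for _ in range(200)]
exercised = [t for t in suite if antecedent_exercised(t)]
print(f"{len(exercised)}/{len(suite)} random traces exercise the requirement")
print(f"{sum(map(satisfies_g_req_implies_f_grant, exercised))} of those satisfy it")
```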
Development of a dynamic computational model of social cognitive theory.
Riley, William T; Martin, Cesar A; Rivera, Daniel E; Hekler, Eric B; Adams, Marc A; Buman, Matthew P; Pavel, Misha; King, Abby C
2016-12-01
Social cognitive theory (SCT) is among the most influential theories of behavior change and has been used as the conceptual basis of health behavior interventions for smoking cessation, weight management, and other health behaviors. SCT and other behavior theories were developed primarily to explain differences between individuals, but explanatory theories of within-person behavioral variability are increasingly needed as new technologies allow for intensive longitudinal measures and interventions adapted from these inputs. These within-person explanatory theoretical applications can be modeled as dynamical systems. SCT constructs, such as reciprocal determinism, are inherently dynamical in nature, but SCT has not been modeled as a dynamical system. This paper describes the development of a dynamical system model of SCT using fluid analogies and control systems principles drawn from engineering. Simulations of this model were performed to assess if the model performed as predicted based on theory and empirical studies of SCT. This initial model generates precise and testable quantitative predictions for future intensive longitudinal research. Dynamic modeling approaches provide a rigorous method for advancing health behavior theory development and refinement and for guiding the development of more potent and efficient interventions.
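The fluid-analogy idea can be sketched with a much-reduced example; the code below is illustrative only, is not the authors' published model, and uses arbitrary parameter values. It treats one SCT loop as two coupled first-order "reservoirs" driven by an intervention pulse.

```python
import numpy as np

# Illustrative sketch, not the authors' published model: a fluid-analogy
# (inventory) view of one SCT loop, in which self-efficacy acts as a
# reservoir filled by an intervention input and drained over time, and
# behavior tracks self-efficacy with its own time constant. Parameter
# values are arbitrary.
dt, T = 0.1, 60.0
t = np.arange(0.0, T, dt)
intervention = (t > 10) & (t < 30)        # a pulse of external cues/support

tau_se, tau_b, gain = 5.0, 8.0, 1.5       # hypothetical time constants / gain
self_efficacy = np.zeros_like(t)
behavior = np.zeros_like(t)
for k in range(1, len(t)):
    # first-order "tank" dynamics integrated with forward Euler
    d_se = (-self_efficacy[k-1] + gain * intervention[k-1]) / tau_se
    d_b = (-behavior[k-1] + self_efficacy[k-1]) / tau_b
    self_efficacy[k] = self_efficacy[k-1] + dt * d_se
    behavior[k] = behavior[k-1] + dt * d_b

print(f"peak self-efficacy: {self_efficacy.max():.2f}")
print(f"peak behavior:      {behavior.max():.2f} (lags the intervention pulse)")
```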
Implanted component faults and their effects on gas turbine engine performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacLeod, J.D.; Taylor, V.; Laflamme, J.C.G.
Under the sponsorship of the Canadian Department of National Defence, the Engine Laboratory of the National Research Council of Canada (NRCC) has established a program for the evaluation of component deterioration on gas turbine engine performance. The effort is aimed at investigating the effects of typical in-service faults on the performance characteristics of each individual engine component. The objective of the program is the development of a generalized fault library, which will be used with fault identification techniques in the field, to reduce unscheduled maintenance. To evaluate the effects of implanted faults on the performance of a single spool engine, such as an Allison T56 turboprop engine, a series of faulted parts was installed. For this paper the following faults were analyzed: (a) first-stage turbine nozzle erosion damage; (b) first-stage turbine rotor blade untwist; (c) compressor seal wear; (d) first- and second-stage compressor blade tip clearance increase. This paper describes the project objectives, the experimental installation, and the results of the fault implantation on engine performance. Discussed are performance variations in both engine and component characteristics. As the performance changes were significant, a rigorous measurement uncertainty analysis is included.
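The kind of measurement uncertainty roll-up mentioned at the end of the abstract typically combines the relative uncertainties of the measured inputs through their sensitivity coefficients. The sketch below uses hypothetical values and is not the NRCC analysis; it applies the root-sum-square rule to a corrected-power parameter.

```python
import numpy as np

# Illustrative sketch of a measurement-uncertainty roll-up (hypothetical
# values, not the NRCC analysis): a derived performance parameter such as
# corrected shaft power P_corr = P / (delta * sqrt(theta)) inherits
# uncertainty from each measured input. With independent inputs, the
# relative uncertainties combine root-sum-square via the sensitivity
# (partial-derivative) coefficients.
inputs = {                       # nominal value, 1-sigma uncertainty
    "power_kW": (2800.0, 14.0),
    "delta":    (0.95, 0.002),   # inlet pressure ratio to standard day
    "theta":    (1.02, 0.003),   # inlet temperature ratio to standard day
}

P, dP = inputs["power_kW"]
d, dd = inputs["delta"]
th, dth = inputs["theta"]

p_corr = P / (d * np.sqrt(th))
# relative sensitivities: dP_corr/P_corr = dP/P - dd/d - 0.5*dth/th
rel_u = np.sqrt((dP / P) ** 2 + (dd / d) ** 2 + (0.5 * dth / th) ** 2)
print(f"corrected power = {p_corr:.0f} kW  +/- {100 * rel_u:.2f} % (1-sigma)")
```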
Brandon, Catherine J; Holody, Michael; Inch, Geoffrey; Kabcenell, Michael; Schowalter, Diane; Mullan, Patricia B
2012-01-01
The aim of this study was to evaluate the feasibility of partnering with engineering students and critically examining the merit of the problem identification and analyses students generated in identifying sources impeding effective turnaround in a large university department of diagnostic radiology. Turnaround involves the time and activities beginning when a patient enters the magnetic resonance scanner room until the patient leaves, minus the time the scanner is conducting the protocol. A prospective observational study was conducted, in which four senior undergraduate industrial and operations engineering students interviewed magnetic resonance staff members and observed all shifts. On the basis of 150 hours of observation, the engineering students identified 11 process steps (eg, changing coils). They charted machine use for all shifts, providing a breakdown of turnaround time between appropriate process and non-value-added time. To evaluate the processes occurring in the scanning room, the students used a work-sampling schedule in which a beeper sounded 2.5 times per hour, signaling the technologist to identify which of 11 process steps was occurring. This generated 2147 random observations over a 3-week period. The breakdown of machine use over 105 individual studies showed that non-value-added time accounted for 62% of turnaround time. Analysis of 2147 random samples of work showed that scanners were empty and waiting for patients 15% of the total time. Analyses showed that poor communication delayed the arrival of patients and that no one had responsibility for communicating when scanning was done. Engineering students used rigorous study design and sampling methods to conduct interviews and observations. This led to data-driven definition of problems and potential solutions to guide systems-based improvement. Copyright © 2012 AUR. Published by Elsevier Inc. All rights reserved.
Fast synthesis of topographic mask effects based on rigorous solutions
NASA Astrophysics Data System (ADS)
Yan, Qiliang; Deng, Zhijie; Shiely, James
2007-10-01
Topographic mask effects can no longer be ignored at technology nodes of 45 nm, 32 nm, and beyond. As feature sizes become comparable to the mask topographic dimensions and the exposure wavelength, the popular thin mask model breaks down, because the mask transmission no longer follows the layout. A reliable mask transmission function has to be derived from Maxwell's equations. Unfortunately, rigorous solutions of Maxwell's equations are only manageable for limited field sizes and are impractical for full-chip optical proximity correction (OPC) due to the prohibitive runtime. Approximation algorithms are in demand to achieve a balance between acceptable computation time and tolerable errors. In this paper, a fast algorithm is proposed and demonstrated to model topographic mask effects for OPC applications. The ProGen Topographic Mask (POTOMAC) model synthesizes the mask transmission functions out of small-sized Maxwell solutions from a finite-difference time-domain (FDTD) engine, an industry-leading rigorous simulator of topographic mask effects from SOLID-E. The integral framework presents a seamless solution to the end user. Preliminary results indicate the overhead introduced by POTOMAC is contained within the same order of magnitude in comparison to the thin mask approach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, Rachel M; Tfaily, Malak M
These data are provided in support of the Commentary, Advanced molecular techniques provide a rigorous method for characterizing organic matter quality in complex systems, Wilson and Tfaily (2018). Measurement results demonstrate that optical characterization of peatland dissolved organic matter (DOM) may not fully capture classically identified chemical characteristics and may, therefore, not be the best measure of organic matter quality.
Monsen, Karen A; Finn, Robert S; Fleming, Thea E; Garner, Erin J; LaValla, Amy J; Riemer, Judith G
2016-01-01
Rigor in clinical knowledge representation is a necessary foundation for meaningful interoperability, exchange, and reuse of electronic health record (EHR) data. It is critical for clinicians to understand the principles and implications of using clinical standards for knowledge representation within EHRs. The objectives were to educate clinicians and students about knowledge representation and to evaluate their success in applying the manual lookup method for assigning Systematized Nomenclature of Medicine Clinical Terms (SNOMED CT) concept identifiers using formally mapped concepts from the Omaha System interface terminology. Clinicians who were students in a doctoral nursing program conducted 21 lookups for Omaha System terms in publicly available SNOMED CT browsers. Lookups were deemed successful if results matched exactly with the corresponding code from the January 2013 SNOMED CT-Omaha System terminology cross-map. Of the 21 manual lookups attempted, 12 (57.1%) were successful. Errors were due to semantic gaps, differences in granularity and synonymy, or partial term matching. Achieving rigor in clinical knowledge representation across settings, vendors, and health systems is a globally recognized challenge. Cross-maps have the potential to improve rigor in SNOMED CT encoding of clinical data. Further research is needed to evaluate outcomes of using terminology cross-maps to encode clinical terms with SNOMED CT concept identifiers based on interface terminologies.
Analysis of small crack behavior for airframe applications
NASA Technical Reports Server (NTRS)
Mcclung, R. C.; Chan, K. S.; Hudak, S. J., Jr.; Davidson, D. L.
1994-01-01
The small fatigue crack problem is critically reviewed from the perspective of airframe applications. Different types of small cracks (microstructural, mechanical, and chemical) are carefully defined and the relevant mechanisms identified. Appropriate analysis techniques, including both rigorous scientific and practical engineering treatments, are briefly described. Important materials data issues are addressed, including increased scatter in small crack data and recommended small crack test methods. Key problems requiring further study are highlighted.
(Low-level radioactive waste management techniques)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Hoesen, S.D.; Kennerly, J.M.; Williams, L.C.
1988-08-08
The US team, consisting of representatives of Oak Ridge National Laboratory (ORNL), Savannah River Plant (SRP), Idaho National Engineering Laboratory (INEL), and the Department of Energy Oak Ridge Operations, participated in a training program on French low-level radioactive waste (LLW) management techniques. Training in the rigorous waste characterization, acceptance, and certification procedures required in France was provided at the Agence Nationale pour la Gestion des Déchets Radioactifs (ANDRA) offices in Paris.
Using EPIC to Find Conflicts, Inconsistencies, and Gaps in Department of Defense Policies
2013-01-01
documentation; or deliver preliminary findings. All RAND reports undergo rigorous peer review to ensure that they meet high standards for research quality...responsibilities and the products that result from their execution. Once the high-level framework was defined, successive lower layers were developed to further...Lead or Chief Engineer, Component Acquisition Executive (CAE), Managers, Configuration Steering Board, Materiel developer, Contractor, Milestone Decision
Parallel labeling experiments for pathway elucidation and (13)C metabolic flux analysis.
Antoniewicz, Maciek R
2015-12-01
Metabolic pathway models provide the foundation for quantitative studies of cellular physiology through the measurement of intracellular metabolic fluxes. For model organisms metabolic models are well established, with many manually curated genome-scale model reconstructions, gene knockout studies and stable-isotope tracing studies. However, for non-model organisms a similar level of knowledge is often lacking. Compartmentation of cellular metabolism in eukaryotic systems also presents significant challenges for quantitative (13)C-metabolic flux analysis ((13)C-MFA). Recently, innovative (13)C-MFA approaches have been developed based on parallel labeling experiments, the use of multiple isotopic tracers and integrated data analysis, that allow more rigorous validation of pathway models and improved quantification of metabolic fluxes. Applications of these approaches open new research directions in metabolic engineering, biotechnology and medicine. Copyright © 2015 Elsevier Ltd. All rights reserved.
Double Dutch: A Tool for Designing Combinatorial Libraries of Biological Systems.
Roehner, Nicholas; Young, Eric M; Voigt, Christopher A; Gordon, D Benjamin; Densmore, Douglas
2016-06-17
Recently, semirational approaches that rely on combinatorial assembly of characterized DNA components have been used to engineer biosynthetic pathways. In practice, however, it is not practical to assemble and test millions of pathway variants in order to elucidate how different DNA components affect the behavior of a pathway. To address this challenge, we apply a rigorous mathematical approach known as design of experiments (DOE) that can be used to construct empirical models of system behavior without testing all variants. To support this approach, we have developed a tool named Double Dutch, which uses a formal grammar and heuristic algorithms to automate the process of DOE library design. Compared to designing by hand, Double Dutch enables users to more efficiently and scalably design libraries of pathway variants that can be used in a DOE framework and uniquely provides a means to flexibly balance design considerations of statistical analysis, construction cost, and risk of homologous recombination, thereby demonstrating the utility of automating decision making when faced with complex design trade-offs.
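A minimal sketch of the DOE idea follows; this is not the Double Dutch tool and the part names are hypothetical. Rather than building the full factorial library, a Latin-square fraction keeps every factor level equally represented and covers every pairwise level combination exactly once, which suffices for fitting main-effect models of pathway behavior.

```python
from itertools import product

# Illustrative sketch in the spirit of DOE-driven library design (not the
# Double Dutch tool; part names are hypothetical): a Latin-square fraction
# of 9 constructs replaces the 27-member full factorial while keeping every
# promoter-RBS, promoter-CDS, and RBS-CDS pair covered exactly once.
promoters = ["pLow", "pMed", "pHigh"]
rbss      = ["rbsA", "rbsB", "rbsC"]
cdss      = ["cds1", "cds2", "cds3"]

full_factorial = list(product(promoters, rbss, cdss))
fraction = [(promoters[i], rbss[j], cdss[(i + j) % 3])
            for i in range(3) for j in range(3)]

print(f"full factorial: {len(full_factorial)} constructs")
print(f"Latin-square fraction: {len(fraction)} constructs")
for construct in fraction:
    print("  ", "-".join(construct))
```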
Modal-pushover-based ground-motion scaling procedure
Kalkan, Erol; Chopra, Anil K.
2011-01-01
Earthquake engineering is increasingly using nonlinear response history analysis (RHA) to demonstrate the performance of structures. This rigorous method of analysis requires selection and scaling of ground motions appropriate to design hazard levels. This paper presents a modal-pushover-based scaling (MPS) procedure to scale ground motions for use in nonlinear RHA of buildings. In the MPS method, the ground motions are scaled to match, to a specified tolerance, a target value of the inelastic deformation of the first-mode inelastic single-degree-of-freedom (SDF) system whose properties are determined by first-mode pushover analysis. Appropriate for first-mode-dominated structures, this approach is extended to structures with significant contributions of higher modes by considering elastic deformation of second-mode SDF systems in selecting a subset of the scaled ground motions. Based on results presented for three actual buildings (4-, 6-, and 13-story), the accuracy and efficiency of the MPS procedure are established and its superiority over the ASCE/SEI 7-05 scaling procedure is demonstrated.
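The scaling step can be illustrated with an elastic simplification; the full MPS procedure uses an inelastic first-mode SDF system derived from pushover analysis, which this sketch does not attempt, and the record and target below are synthetic. The idea is to integrate an SDF oscillator under a ground motion and scale the record so the peak deformation matches a target value.

```python
import numpy as np

# Elastic simplification of the scaling step (the full MPS procedure uses an
# inelastic first-mode SDF system from pushover analysis, omitted here).
# Ground-motion record and target deformation are synthetic/hypothetical.
def sdf_peak_deformation(ag, dt, period=1.0, damping=0.05):
    """Peak deformation of a linear SDF oscillator under ground acceleration ag,
    integrated with the average-acceleration Newmark method (gamma=1/2, beta=1/4)."""
    omega = 2.0 * np.pi / period
    m, c, k = 1.0, 2.0 * damping * omega, omega ** 2
    u = v = 0.0
    a = -ag[0]                                   # initial acceleration from equilibrium
    k_hat = k + 2.0 * c / dt + 4.0 * m / dt ** 2
    peak = 0.0
    for ag_next in ag[1:]:
        p_hat = (-m * ag_next
                 + m * (4.0 * u / dt ** 2 + 4.0 * v / dt + a)
                 + c * (2.0 * u / dt + v))
        u_new = p_hat / k_hat
        v_new = 2.0 * (u_new - u) / dt - v
        a_new = 4.0 * (u_new - u) / dt ** 2 - 4.0 * v / dt - a
        u, v, a = u_new, v_new, a_new
        peak = max(peak, abs(u))
    return peak

dt = 0.02
t = np.arange(0.0, 20.0, dt)
record = 0.3 * 9.81 * np.sin(2 * np.pi * 1.3 * t) * np.exp(-0.15 * t)  # synthetic record

target = 0.05                       # target peak deformation of the SDF system, m
peak = sdf_peak_deformation(record, dt)
scale = target / peak               # linear scaling is exact only for elastic response
print(f"unscaled peak deformation: {peak:.3f} m, scale factor: {scale:.2f}")
```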
Point-of-Care Technologies for Precision Cardiovascular Care and Clinical Research
King, Kevin; Grazette, Luanda P.; Paltoo, Dina N.; McDevitt, John T.; Sia, Samuel K.; Barrett, Paddy M.; Apple, Fred S.; Gurbel, Paul A.; Weissleder, Ralph; Leeds, Hilary; Iturriaga, Erin J.; Rao, Anupama; Adhikari, Bishow; Desvigne-Nickens, Patrice; Galis, Zorina S.; Libby, Peter
2016-01-01
Point-of-care technologies (POC or POCT) are enabling innovative cardiovascular diagnostics that promise to improve patient care across diverse clinical settings. The National Heart, Lung, and Blood Institute convened a working group to discuss POCT in cardiovascular medicine. The multidisciplinary working group, which included clinicians, scientists, engineers, device manufacturers, regulatory officials, and program staff, reviewed the state of the POCT field; discussed opportunities for POCT to improve cardiovascular care, realize the promise of precision medicine, and advance the clinical research enterprise; and identified barriers facing translation and integration of POCT with existing clinical systems. A POCT development roadmap emerged to guide multidisciplinary teams of biomarker scientists, technologists, health care providers, and clinical trialists as they: 1) formulate needs assessments; 2) define device design specifications; 3) develop component technologies and integrated systems; 4) perform iterative pilot testing; and 5) conduct rigorous prospective clinical testing to ensure that POCT solutions have substantial effects on cardiovascular care. PMID:26977455
The 1991 natural gas vehicle challenge: Developing dedicated natural gas vehicle technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larsen, R.; Rimkus, W.; Davies, J.
1992-01-01
An engineering research and design competition to develop and demonstrate dedicated natural gas-powered light-duty trucks, the Natural Gas Vehicle (NGV) Challenge, was held June 6--11, 1991, in Oklahoma. Sponsored by the US Department of Energy (DOE), Energy, Mines, and Resources -- Canada (EMR), the Society of Automotive Engineers (SAE), and General Motors Corporation (GM), the competition consisted of rigorous vehicle testing of exhaust emissions, fuel economy, performance parameters, and vehicle design. Using Sierra 2500 pickup trucks donated by GM, 24 teams of college and university engineers from the US and Canada participated in the event. A gasoline-powered control vehicle was tested as a reference. This paper discusses the results of the event, summarizes the technologies employed, and makes observations on the state of natural gas vehicle technology.
A rigorous and simpler method of image charges
NASA Astrophysics Data System (ADS)
Ladera, C. L.; Donoso, G.
2016-07-01
The method of image charges relies on the proven uniqueness of the solution of the Laplace differential equation for an electrostatic potential which satisfies some specified boundary conditions. Granted by that uniqueness, the method of images is rightly described as nothing but shrewdly guessing which and where image charges are to be placed to solve the given electrostatics problem. Here we present an alternative image charges method that is based not on guessing but on rigorous and simpler theoretical grounds, namely the constant potential inside any conductor and the application of powerful geometric symmetries. The aforementioned required uniqueness and, more importantly, guessing are therefore both altogether dispensed with. Our two new theoretical fundaments also allow the image charges method to be introduced in earlier physics courses for engineering and sciences students, instead of its present and usual introduction in electromagnetic theory courses that demand familiarity with the Laplace differential equation and its boundary conditions.
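For reference, the standard textbook case that any image-charge method must reproduce is a point charge above an infinite grounded conducting plane; the result below is the classical one and is not quoted from the paper.

```latex
% Point charge q at height d above an infinite grounded conducting plane (z = 0).
% Replacing the plane by an image charge -q at z = -d gives, for z >= 0,
\begin{equation}
  V(x,y,z) = \frac{q}{4\pi\varepsilon_0}
    \left[ \frac{1}{\sqrt{x^{2}+y^{2}+(z-d)^{2}}}
         - \frac{1}{\sqrt{x^{2}+y^{2}+(z+d)^{2}}} \right],
\end{equation}
% which satisfies Laplace's equation above the plane and vanishes on it. The paper's
% point is that the placement of the image can be justified from the constant conductor
% potential and mirror symmetry, rather than guessed and checked via uniqueness.
```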
NASA Technical Reports Server (NTRS)
Sankararaman, Shankar
2016-01-01
This paper presents a computational framework for uncertainty characterization and propagation, and sensitivity analysis in the presence of aleatory and epistemic uncertainty, and develops a rigorous methodology for efficient refinement of epistemic uncertainty by identifying important epistemic variables that significantly affect the overall performance of an engineering system. The proposed methodology is illustrated using the NASA Langley Uncertainty Quantification Challenge (NASA-LUQC) problem that deals with uncertainty analysis of a generic transport model (GTM). First, Bayesian inference is used to infer subsystem-level epistemic quantities using the subsystem-level model and corresponding data. Second, tools of variance-based global sensitivity analysis are used to identify four important epistemic variables (this limitation specified in the NASA-LUQC is reflective of practical engineering situations where not all epistemic variables can be refined due to time/budget constraints) that significantly affect system-level performance. The most significant contribution of this paper is the development of the sequential refinement methodology, where epistemic variables for refinement are not identified all at once. Instead, only one variable is first identified, and then Bayesian inference and global sensitivity calculations are repeated to identify the next important variable. This procedure is continued until all four variables are identified and the refinement in the system-level performance is computed. The advantages of the proposed sequential refinement methodology over the all-at-once uncertainty refinement approach are explained, and the methodology is then applied to the NASA Langley Uncertainty Quantification Challenge problem.
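The variance-based sensitivity step can be sketched with a toy response function; the code below is illustrative, not the NASA framework, and uses a hypothetical model. First-order Sobol indices estimated with the pick-and-freeze (Saltelli) scheme rank the inputs so that only the most influential epistemic variables would be selected for refinement.

```python
import numpy as np

# Illustrative sketch of variance-based global sensitivity analysis (not the
# NASA framework): first-order Sobol indices estimated with the
# pick-and-freeze (Saltelli) scheme for a toy response function.
rng = np.random.default_rng(1)

def model(x):                      # hypothetical system-level response
    return x[:, 0] + 2.0 * x[:, 1] + 0.5 * x[:, 2]

n, d = 20000, 3
A, B = rng.random((n, d)), rng.random((n, d))
fA, fB = model(A), model(B)
var = np.var(np.concatenate([fA, fB]))

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]            # freeze all inputs except x_i
    s_i = np.mean(fB * (model(ABi) - fA)) / var
    print(f"first-order Sobol index for x{i + 1}: {s_i:.2f}")
```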
Self-reconfigurable ship fluid-network modeling for simulation-based design
NASA Astrophysics Data System (ADS)
Moon, Kyungjin
Our world is filled with large-scale engineering systems, which provide various services and conveniences in our daily life. A distinctive trend in the development of today's large-scale engineering systems is the extensive and aggressive adoption of automation and autonomy that enable the significant improvement of systems' robustness, efficiency, and performance, with considerably reduced manning and maintenance costs, and the U.S. Navy's DD(X), the next-generation destroyer program, is considered as an extreme example of such a trend. This thesis pursues a modeling solution for performing simulation-based analysis in the conceptual or preliminary design stage of an intelligent, self-reconfigurable ship fluid system, which is one of the concepts of DD(X) engineering plant development. Through the investigations on the Navy's approach for designing a more survivable ship system, it is found that the current naval simulation-based analysis environment is limited by the capability gaps in damage modeling, dynamic model reconfiguration, and simulation speed of the domain specific models, especially fluid network models. As enablers of filling these gaps, two essential elements were identified in the formulation of the modeling method. The first one is the graph-based topological modeling method, which will be employed for rapid model reconstruction and damage modeling, and the second one is the recurrent neural network-based, component-level surrogate modeling method, which will be used to improve the affordability and efficiency of the modeling and simulation (M&S) computations. The integration of the two methods can deliver computationally efficient, flexible, and automation-friendly M&S which will create an environment for more rigorous damage analysis and exploration of design alternatives. As a demonstration for evaluating the developed method, a simulation model of a notional ship fluid system was created, and a damage analysis was performed. Next, the models representing different design configurations of the fluid system were created, and damage analyses were performed with them in order to find an optimal design configuration for system survivability. Finally, the benefits and drawbacks of the developed method were discussed based on the result of the demonstration.
Rost, Christina M.; Sachet, Edward; Borman, Trent; Moballegh, Ali; Dickey, Elizabeth C.; Hou, Dong; Jones, Jacob L.; Curtarolo, Stefano; Maria, Jon-Paul
2015-01-01
Configurational disorder can be compositionally engineered into mixed oxides by populating a single sublattice with many distinct cations. The formulations promote novel and entropy-stabilized forms of crystalline matter where metal cations are incorporated in new ways. Here, through rigorous experiments, a simple thermodynamic model, and a five-component oxide formulation, we demonstrate beyond reasonable doubt that entropy predominates the thermodynamic landscape, and drives a reversible solid-state transformation between a multiphase and single-phase state. In the latter, cation distributions are proven to be random and homogeneous. The findings validate the hypothesis that deliberate configurational disorder provides an orthogonal strategy to imagine and discover new phases of crystalline matter and untapped opportunities for property engineering. PMID:26415623
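The entropic driving force invoked here is the ideal configurational entropy of mixing on the shared cation sublattice, a standard solution-thermodynamics expression (not quoted from the paper):

```latex
% Ideal configurational entropy of mixing for N cation species sharing one sublattice:
\begin{equation}
  S_{\mathrm{config}} = -R \sum_{i=1}^{N} x_i \ln x_i ,
\end{equation}
% maximized by equimolar occupation, x_i = 1/N. For a five-component oxide this
% upper bound is R ln 5 (approximately 1.61 R) per mole of cations, the term that
% competes with mixing enthalpy to stabilize the single-phase state.
```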
Near Identifiability of Dynamical Systems
NASA Technical Reports Server (NTRS)
Hadaegh, F. Y.; Bekey, G. A.
1987-01-01
Concepts regarding approximate mathematical models treated rigorously. Paper presents new results in analysis of structural identifiability, equivalence, and near equivalence between mathematical models and physical processes they represent. Helps establish rigorous mathematical basis for concepts related to structural identifiability and equivalence revealing fundamental requirements, tacit assumptions, and sources of error. "Structural identifiability," as used by workers in this field, loosely translates as meaning ability to specify unique mathematical model and set of model parameters that accurately predict behavior of corresponding physical system.
A Theoretical Framework for Lagrangian Descriptors
NASA Astrophysics Data System (ADS)
Lopesino, C.; Balibrea-Iniesta, F.; García-Garrido, V. J.; Wiggins, S.; Mancho, A. M.
This paper provides a theoretical background for Lagrangian Descriptors (LDs). The goal of achieving rigorous proofs that justify the ability of LDs to detect invariant manifolds is simplified by introducing an alternative definition for LDs. The definition is stated for n-dimensional systems with general time dependence; however, we rigorously prove that this method reveals the stable and unstable manifolds of hyperbolic points in four particular 2D cases: a hyperbolic saddle point for linear autonomous systems, a hyperbolic saddle point for nonlinear autonomous systems, a hyperbolic saddle point for linear nonautonomous systems, and a hyperbolic saddle point for nonlinear nonautonomous systems. We also discuss further rigorous results which show the ability of LDs to highlight additional invariant sets, such as n-tori. These results are a simple extension of ergodic partition theory, which we illustrate by applying the methodology to well-known examples, such as the planar field of the harmonic oscillator and the 3D ABC flow. Finally, we provide a thorough discussion on the requirement of the objectivity (frame-invariance) property for tools designed to reveal phase space structures and their implications for Lagrangian descriptors.
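A minimal numerical sketch of the idea, using a common p = 1 arc-length-style variant of the descriptor rather than necessarily the paper's exact definition, evaluates the descriptor field for the linear saddle dx/dt = x, dy/dt = -y; the invariant manifolds show up as the low-valued features along the coordinate axes.

```python
import numpy as np

# Illustrative sketch (a common p = 1 variant of the descriptor, not
# necessarily the paper's exact definition): accumulate |dx/dt| + |dy/dt|
# along trajectories forward and backward over a window [-tau, tau] for the
# linear saddle dx/dt = x, dy/dt = -y. Low values along x = 0 and y = 0
# mark the stable and unstable manifolds of the origin.
def lagrangian_descriptor(x0, y0, tau=2.0, dt=0.01):
    total = 0.0
    for direction in (+1.0, -1.0):          # forward, then backward in time
        x, y = x0, y0
        for _ in range(int(round(tau / dt))):
            fx, fy = x, -y                   # saddle vector field
            total += (abs(fx) + abs(fy)) * dt
            x += direction * fx * dt         # explicit Euler step
            y += direction * fy * dt
    return total

grid = np.linspace(-1.0, 1.0, 5)
for y0 in grid[::-1]:
    row = [f"{lagrangian_descriptor(x0, y0):6.2f}" for x0 in grid]
    print("  ".join(row))
# The smallest values appear along the axes, i.e. on the invariant manifolds.
```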
Gatica, M C; Monti, G E; Knowles, T G; Gallo, C B
2010-01-09
Two systems for transporting live salmon (Salmo salar) were compared in terms of their effects on blood variables, muscle pH and rigor index: an 'open system' well-boat with recirculated sea water at 13.5 degrees C and a stocking density of 107 kg/m3 during an eight-hour journey, and a 'closed system' well-boat with water chilled from 16.7 to 2.1 degrees C and a stocking density of 243.7 kg/m3 during a seven-hour journey. Groups of 10 fish were sampled at each of four stages: in cages at the farm, in the well-boat after loading, in the well-boat after the journey and before unloading, and in the processing plant after they were pumped from the resting cages. At each sampling, the fish were stunned and bled by gill cutting. Blood samples were taken to measure lactate, osmolality, chloride, sodium, cortisol and glucose, and their muscle pH and rigor index were measured at death and three hours later. In the open system well-boat, the initial muscle pH of the fish decreased at each successive stage, and at the final stage they had a significantly lower initial muscle pH and more rapid onset of rigor than the fish transported on the closed system well-boat. At the final stage all the blood variables except glucose were significantly affected in the fish transported on both types of well-boat.
2010-09-01
estimation of total exposure at any toxicological endpoint in the body. This effort is a significant contribution as it highlights future research needs...rigorous modeling of the nanoparticle transport by including physico-chemical properties of engineered particles. Similarly, toxicological dose-response...exposure risks as compared to larger sized particles of the same material. Although the toxicology of a base material may be thoroughly defined, the
A liquid metal-based structurally embedded vascular antenna: I. Concept and multiphysical modeling
NASA Astrophysics Data System (ADS)
Hartl, D. J.; Frank, G. J.; Huff, G. H.; Baur, J. W.
2017-02-01
This work proposes a new concept for a reconfigurable structurally embedded vascular antenna (SEVA). The work builds on ongoing research of structurally embedded microvascular systems in laminated structures for thermal transport and self-healing and on studies of non-toxic liquid metals for reconfigurable electronics. In the example design, liquid metal-filled channels in a laminated composite act as radiating elements for a high-power planar zig-zag wire log periodic dipole antenna. Flow of liquid metal through the channels is used to limit the temperature of the composite in which the antenna is embedded. A multiphysics engineering model of the transmitting antenna is formulated that couples the electromagnetic, fluid, thermal, and mechanical responses. In part 1 of this two-part work, it is shown that the liquid metal antenna is highly reconfigurable in terms of its electromagnetic response and that dissipated thermal energy generated during high power operation can be offset by the action of circulating or cyclically replacing the liquid metal such that heat is continuously removed from the system. In fact, the SEVA can potentially outperform traditional copper-based antennas in high-power operational configurations. The coupled engineering model is implemented in an automated framework and a design of experiment study is performed to quantify first-order design trade-offs in this multifunctional structure. More rigorous design optimization is addressed in part 2.
Visually Coupled Systems (VCS): The Virtual Panoramic Display (VPD) System
NASA Technical Reports Server (NTRS)
Kocian, Dean F.
1992-01-01
The development and impact is described of new visually coupled system (VCS) equipment designed to support engineering and human factors research in the military aircraft cockpit environment. VCS represents an advanced man-machine interface (MMI). Its potential to improve aircrew situational awareness seems enormous, but its superiority over the conventional cockpit MMI has not been established in a conclusive and rigorous fashion. What has been missing is a 'systems' approach to technology advancement that is comprehensive enough to produce conclusive results concerning the operational viability of the VCS concept and verify any risk factors that might be involved with its general use in the cockpit. The advanced VCS configuration described here, was ruggedized for use in military aircraft environments and was dubbed the Virtual Panoramic Display (VPD). It was designed to answer the VCS portion of the systems problem, and is implemented as a modular system whose performance can be tailored to specific application requirements. The overall system concept and the design of the two most important electronic subsystems that support the helmet mounted parts, a new militarized version of the magnetic helmet mounted sight and correspondingly similar helmet display electronics, are discussed in detail. Significant emphasis is given to illustrating how particular design features in the hardware improve overall system performance and support research activities.
Reverse engineering validation using a benchmark synthetic gene circuit in human cells.
Kang, Taek; White, Jacob T; Xie, Zhen; Benenson, Yaakov; Sontag, Eduardo; Bleris, Leonidas
2013-05-17
Multicomponent biological networks are often understood incompletely, in large part due to the lack of reliable and robust methodologies for network reverse engineering and characterization. As a consequence, developing automated and rigorously validated methodologies for unraveling the complexity of biomolecular networks in human cells remains a central challenge to life scientists and engineers. Today, when it comes to experimental and analytical requirements, there exists a great deal of diversity in reverse engineering methods, which renders the independent validation and comparison of their predictive capabilities difficult. In this work we introduce an experimental platform customized for the development and verification of reverse engineering and pathway characterization algorithms in mammalian cells. Specifically, we stably integrate a synthetic gene network in human kidney cells and use it as a benchmark for validating reverse engineering methodologies. The network, which is orthogonal to endogenous cellular signaling, contains a small set of regulatory interactions that can be used to quantify the reconstruction performance. By performing successive perturbations to each modular component of the network and comparing protein and RNA measurements, we study the conditions under which we can reliably reconstruct the causal relationships of the integrated synthetic network.
NASA's J-2X Engine Builds on the Apollo Program for Lunar Return Missions
NASA Technical Reports Server (NTRS)
Snoddy, Jimmy R.
2006-01-01
In January 2006, NASA streamlined its U.S. Vision for Space Exploration hardware development approach for replacing the Space Shuttle after it is retired in 2010. The revised CLV upper stage will use the J-2X engine, a derivative of NASA's Apollo Program Saturn V's S-II and S-IVB main propulsion, which will also serve as the Earth Departure Stage (EDS) engine. This paper gives details of how the J-2X engine effort mitigates risk by building on the Apollo Program and other lessons learned to deliver a human-rated engine that is on an aggressive development schedule, with first demonstration flight in 2010 and human test flights in 2012. It is well documented that propulsion is historically a high-risk area. NASA's risk reduction strategy for the J-2X engine design, development, test, and evaluation is to build upon heritage hardware and apply valuable experience gained from past development efforts. In addition, NASA and its industry partner, Rocketdyne, which originally built the J-2, have tapped into their extensive databases and are applying lessons conveyed firsthand by Apollo-era veterans of America's first round of Moon missions in the 1960s and 1970s. NASA's development approach for the J-2X engine includes early requirements definition and management; designing-in lessons learned from the J-2 heritage programs; initiating long-lead procurement items before Preliminary Design Review; incorporating design features for anticipated EDS requirements; identifying facilities for sea-level and altitude testing; and starting ground support equipment and logistics planning at an early stage. Other risk reduction strategies include utilizing a proven gas generator cycle with recent development experience; utilizing existing turbomachinery; applying current and recent main combustion chamber (Integrated Powerhead Demonstrator) and channel wall nozzle (COBRA) advances; and performing rigorous development, qualification, and certification testing of the engine system, with a philosophy of "test what you fly, and fly what you test". These and other active risk management strategies are in place to deliver the J-2X engine for LEO and lunar return missions as outlined in the U.S. Vision for Space Exploration.
Bringing a military approach to teaching.
Baillie, Jonathan
2015-03-01
Despite having only established the company nine years ago, the founders of Kidderminster-based Avensys Medical believe the company now offers not only one of the UK's most comprehensive maintenance, repair, consultancy, and equipment audit services for medical and dental equipment, but also one of the most tailored training portfolios for electro-biomedical (EBME) engineers working in healthcare settings to enable them to get the best out of such equipment, improve patient safety, optimise service life, and save both the NHS and private sector money. As HEJ editor, Jonathan Baillie, discovered on meeting one of the two co-founders, ex-Royal Electrical and Mechanical Engineers (REME) artificer sergeant-major (ASM) and MoD engineering trainer, Robert Strange, many of the company's key trainers have a strong military background, and it is the rigorous and disciplined approach this enables them to bring to their training that he believes singles the company out.
A Novel Estimator for the Rate of Information Transfer by Continuous Signals
Takalo, Jouni; Ignatova, Irina; Weckström, Matti; Vähäsöyrinki, Mikko
2011-01-01
The information transfer rate provides an objective and rigorous way to quantify how much information is being transmitted through a communications channel whose input and output consist of time-varying signals. However, current estimators of information content in continuous signals are typically based on assumptions about the system's linearity and signal statistics, or they require prohibitive amounts of data. Here we present a novel information rate estimator without these limitations that is also optimized for computational efficiency. We validate the method with a simulated Gaussian information channel and demonstrate its performance with two example applications. Information transfer between the input and output signals of a nonlinear system is analyzed using a sensory receptor neuron as the model system. Then, a climate data set is analyzed to demonstrate that the method can be applied to a system based on two outputs generated by interrelated random processes. These analyses also demonstrate that the new method offers consistent performance in situations where classical methods fail. In addition to these examples, the method is applicable to a wide range of continuous time series commonly observed in the natural sciences, economics and engineering. PMID:21494562
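As a baseline for what such an estimator must improve upon, the classical Gaussian-channel expression computes the information rate from the magnitude-squared coherence between input and output. The sketch below uses synthetic signals and the standard coherence-based formula; it is not the authors' new estimator.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.signal import coherence

# Classical Gaussian-channel baseline (standard formula, synthetic signals;
# not the authors' new estimator): for jointly Gaussian input/output, the
# information rate follows from the magnitude-squared coherence gamma^2(f),
#   R = -integral log2(1 - gamma^2(f)) df.
rng = np.random.default_rng(0)
fs, n = 1000.0, 2 ** 16
x = rng.standard_normal(n)                                  # input signal
y = np.convolve(x, np.ones(5) / 5.0, mode="same") \
    + 0.5 * rng.standard_normal(n)                          # filtered input + noise

f, gamma2 = coherence(x, y, fs=fs, nperseg=1024)
rate = -trapezoid(np.log2(1.0 - gamma2), f)                 # bits per second
print(f"estimated Gaussian information rate: {rate:.1f} bits/s")
```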
Shaping Ability of Single-file Systems with Different Movements: A Micro-computed Tomographic Study.
Santa-Rosa, Joedy; de Sousa-Neto, Manoel Damião; Versiani, Marco Aurelio; Nevares, Giselle; Xavier, Felipe; Romeiro, Kaline; Cassimiro, Marcely; Leoni, Graziela Bianchi; de Menezes, Rebeca Ferraz; Albuquerque, Diana
2016-01-01
This study aimed to perform rigorous sample standardization and to evaluate the preparation of mesiobuccal (MB) root canals of maxillary molars with severe curvatures using two single-file engine-driven systems (WaveOne, with reciprocating motion, and OneShape, with rotary movement), as assessed by micro-computed tomography (micro-CT). Ten MB roots with single canals were included, uniformly distributed into two groups (n=5). The samples were prepared with WaveOne or OneShape files. The shaping ability and amount of canal transportation were assessed by a comparison of the pre- and post-instrumentation micro-CT scans. The Kolmogorov-Smirnov and t-tests were used for statistical analysis. The level of significance was set at 0.05. Instrumentation of the canals increased their surface area and volume. Canal transportation occurred in the coronal, middle, and apical thirds, and no statistical difference was observed between the two systems (P>0.05). In the apical third, significant differences were found between groups in canal roundness (at the 3 mm level) and perimeter (at the 3 and 4 mm levels) (P<0.05). The WaveOne and OneShape single-file systems were able to shape curved root canals, producing minor changes in the canal curvature.
The growth and breakdown of a vortex-pair in a stably stratified fluid
NASA Astrophysics Data System (ADS)
Advaith, S.; Tinaikar, Aashay; Manu, K. V.; Basu, Saptarshi
2017-11-01
Vortex interaction with density stratification is ubiquitous in nature and relevant to various engineering applications. The present study characterizes the spatial and temporal dynamics of the interaction between a vortex and a density-stratified interface. The work is prompted by our research on single-tank thermal energy storage (TES) systems used in concentrated solar power (CSP) plants, where hot and cold fluids are separated by means of density stratification. Rigorous qualitative (high-speed shadowgraph) and quantitative (high-speed PIV) studies provide a detailed picture of vortex formation, propagation, and interaction dynamics with the density-stratified interface, as well as the characteristics of the resulting plume. The interaction phenomena are categorized into three cases according to their nature: non-penetrative, partially penetrative, and extensively penetrative. In addition, a regime map is proposed in terms of non-dimensional parameters (Reynolds, Richardson, and Atwood numbers) that predicts which of these cases occurs.
Comprehensive modeling of a liquid rocket combustion chamber
NASA Technical Reports Server (NTRS)
Liang, P.-Y.; Fisher, S.; Chang, Y. M.
1985-01-01
An analytical model for the simulation of detailed three-phase combustion flows inside a liquid rocket combustion chamber is presented. The three phases involved are: a multispecies gaseous phase, an incompressible liquid phase, and a particulate droplet phase. The gas and liquid phases are treated as continua and described in an Eulerian fashion. A two-phase solution capability for these continuum media is obtained through a marriage of the Implicit Continuous Eulerian (ICE) technique and the fractional Volume of Fluid (VOF) free surface description method. On the other hand, the particulate phase is given a discrete treatment and described in a Lagrangian fashion. All three phases are hence treated rigorously. Semi-empirical physical models are used to describe all interphase coupling terms as well as the chemistry among gaseous components. Sample calculations using the model are given. The results show promising application to truly comprehensive modeling of complex liquid-fueled engine systems.
Promoting a Culture of Tailoring for Systems Engineering Policy Expectations
NASA Technical Reports Server (NTRS)
Blankenship, Van A.
2016-01-01
NASA's Marshall Space Flight Center (MSFC) has developed an integrated systems engineering approach to promote a culture of tailoring for program and project policy requirements. MSFC's culture encourages and supports tailoring, with an emphasis on risk-based decision making, for enhanced affordability and efficiency. MSFC's policy structure integrates the various Agency requirements into a single, streamlined implementation approach which serves as a "one-stop-shop" for our programs and projects to follow. The engineers gain an enhanced understanding of policy and technical expectations, as well as lessons learned from MSFC's history of spaceflight and science missions, to enable them to make appropriate, risk-based tailoring recommendations. The tailoring approach utilizes a standard methodology to classify projects into predefined levels using selected mission and programmatic scaling factors related to risk tolerance. Policy requirements are then selectively applied and tailored, with appropriate rationale, and approved by the governing authorities, to support risk-informed decisions to achieve the desired cost and schedule efficiencies. The policy is further augmented by implementation tools and lifecycle planning aids which help promote and support the cultural shift toward more tailoring. The MSFC Customization Tool is an integrated spreadsheet that ties together everything that projects need to understand, navigate, and tailor the policy. It helps them classify their project, understand the intent of the requirements, determine their tailoring approach, and document the necessary governance approvals. It also helps them plan for and conduct technical reviews throughout the lifecycle. Policy tailoring is thus established as a normal part of project execution, with the tools provided to facilitate and enable the tailoring process. MSFC's approach to changing the culture emphasizes risk-based tailoring of policy to achieve increased flexibility, efficiency, and effectiveness in project execution, while maintaining appropriate rigor to ensure mission success.
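To make the level-classification idea above concrete, the following minimal Python sketch maps a few hypothetical scaling factors to a project level. The factor names, weights, and cutoffs are invented for illustration; the actual MSFC Customization Tool is a spreadsheet whose factors and thresholds are not reproduced here.

```python
# Hypothetical sketch of risk-based project classification, loosely modeled on the
# approach described above. Factor names, weights, and level cutoffs are invented
# for illustration; the real MSFC Customization Tool defines its own.
from dataclasses import dataclass

@dataclass
class ProjectProfile:
    lifecycle_cost_score: int   # 1 (low) .. 5 (high)
    criticality_score: int      # consequence of failure
    complexity_score: int       # technical/integration complexity
    risk_tolerance_score: int   # 1 = risk averse .. 5 = risk tolerant

def classify(profile: ProjectProfile) -> str:
    """Map scaling factors to a predefined project level (hypothetical thresholds)."""
    weighted = (
        2 * profile.lifecycle_cost_score
        + 2 * profile.criticality_score
        + profile.complexity_score
        - profile.risk_tolerance_score
    )
    if weighted >= 18:
        return "Level 1 (full policy set, minimal tailoring)"
    if weighted >= 12:
        return "Level 2 (selective tailoring with rationale)"
    return "Level 3 (streamlined policy set, extensive tailoring)"

print(classify(ProjectProfile(2, 2, 3, 4)))  # -> Level 3: streamlined, heavily tailored
```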
NASA Astrophysics Data System (ADS)
Rohde, Mitchell M.; Crawford, Justin; Toschlog, Matthew; Iagnemma, Karl D.; Kewlani, Guarav; Cummins, Christopher L.; Jones, Randolph A.; Horner, David A.
2009-05-01
It is widely recognized that simulation is pivotal to vehicle development, whether manned or unmanned. There are few dedicated choices, however, for those wishing to perform realistic, end-to-end simulations of unmanned ground vehicles (UGVs). The Virtual Autonomous Navigation Environment (VANE), under development by US Army Engineer Research and Development Center (ERDC), provides such capabilities but utilizes a High Performance Computing (HPC) Computational Testbed (CTB) and is not intended for on-line, real-time performance. A product of the VANE HPC research is a real-time desktop simulation application under development by the authors that provides a portal into the HPC environment as well as interaction with wider-scope semi-automated force simulations (e.g. OneSAF). This VANE desktop application, dubbed the Autonomous Navigation Virtual Environment Laboratory (ANVEL), enables analysis and testing of autonomous vehicle dynamics and terrain/obstacle interaction in real-time with the capability to interact within the HPC constructive geo-environmental CTB for high fidelity sensor evaluations. ANVEL leverages rigorous physics-based vehicle and vehicle-terrain interaction models in conjunction with high-quality, multimedia visualization techniques to form an intuitive, accurate engineering tool. The system provides an adaptable and customizable simulation platform that allows developers a controlled, repeatable testbed for advanced simulations. ANVEL leverages several key technologies not common to traditional engineering simulators, including techniques from the commercial video-game industry. These enable ANVEL to run on inexpensive commercial, off-the-shelf (COTS) hardware. In this paper, the authors describe key aspects of ANVEL and its development, as well as several initial applications of the system.
Goal-Function Tree Modeling for Systems Engineering and Fault Management
NASA Technical Reports Server (NTRS)
Patterson, Jonathan D.; Johnson, Stephen B.
2013-01-01
The draft NASA Fault Management (FM) Handbook (2012) states that Fault Management (FM) is a "part of systems engineering", and that it "demands a system-level perspective" (NASA-HDBK-1002, 7). What, exactly, is the relationship between systems engineering and FM? To NASA, systems engineering (SE) is "the art and science of developing an operable system capable of meeting requirements within often opposed constraints" (NASA/SP-2007-6105, 3). Systems engineering starts with the elucidation and development of requirements, which set the goals that the system is to achieve. To achieve these goals, the systems engineer typically defines functions, and the functions in turn are the basis for design trades to determine the best means to perform the functions. System Health Management (SHM), by contrast, defines "the capabilities of a system that preserve the system's ability to function as intended" (Johnson et al., 2011, 3). Fault Management, in turn, is the operational subset of SHM, which detects current or future failures, and takes operational measures to prevent or respond to these failures. Failure, in turn, is the "unacceptable performance of intended function." (Johnson 2011, 605) Thus the relationship of SE to FM is that SE defines the functions and the design to perform those functions to meet system goals and requirements, while FM detects the inability to perform those functions and takes action. SHM and FM are in essence "the dark side" of SE. For every function to be performed (SE), there is the possibility that it is not successfully performed (SHM); FM defines the means to operationally detect and respond to this lack of success. We can also describe this in terms of goals: for every goal to be achieved, there is the possibility that it is not achieved; FM defines the means to operationally detect and respond to this inability to achieve the goal. This brief description of relationships between SE, SHM, and FM provides hints to a modeling approach to provide formal connectivity between the nominal (SE) and off-nominal (SHM and FM) aspects of functions and designs. This paper describes a formal modeling approach to the initial phases of the development process that integrates the nominal and off-nominal perspectives in a model that unites SE goals and functions with the failure to achieve goals and functions (SHM/FM). This methodology and corresponding model, known as a Goal-Function Tree (GFT), provides a means to represent, decompose, and elaborate system goals and functions in a rigorous manner that connects directly to design through use of state variables that translate natural language requirements and goals into logical-physical state language. The state variable-based approach also provides the means to directly connect FM to the design, by specifying the range in which state variables must be controlled to achieve goals, and conversely, the failures that exist if system behavior goes out of range. This in turn allows the systems engineers and SHM/FM engineers to determine which state variables to monitor, and what action(s) to take should the system fail to achieve that goal. In sum, the GFT representation provides a unified approach to early-phase SE and FM development. This representation and methodology has been successfully developed and implemented using Systems Modeling Language (SysML) on the NASA Space Launch System (SLS) Program.
It enabled early design trade studies of failure detection coverage to ensure complete detection coverage of all crew-threatening failures. The representation maps directly both to FM algorithm designs, and to failure scenario definitions needed for design analysis and testing. The GFT representation provided the basis for mapping of abort triggers into scenarios, both needed for initial, and successful quantitative analyses of abort effectiveness (detection and response to crew-threatening events).
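A minimal Python sketch of the state-variable idea behind a GFT is shown below: each goal constrains a state variable to a range, and fault management detects failure when a monitored value leaves that range. The goal names, variables, and limits are invented for illustration and are not drawn from the SLS model.

```python
# Hypothetical sketch of goal/state-variable checking in the spirit of a Goal-Function
# Tree: each goal constrains a state variable to a range; failure detection flags any
# goal whose monitored value is missing or out of range. Names and numbers are
# illustrative only, not the SLS GFT model.
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class Goal:
    name: str
    state_variable: str
    low: float
    high: float
    subgoals: List["Goal"] = field(default_factory=list)

    def check(self, measurements: Dict[str, float]) -> List[str]:
        """Return this goal and any subgoals whose state variable is out of range."""
        failures = []
        value: Optional[float] = measurements.get(self.state_variable)
        if value is None or not (self.low <= value <= self.high):
            failures.append(self.name)
        for g in self.subgoals:
            failures.extend(g.check(measurements))
        return failures

ascent = Goal("Deliver payload to orbit", "orbital_insertion_error_km", 0.0, 5.0, subgoals=[
    Goal("Maintain chamber pressure", "chamber_pressure_MPa", 9.0, 11.0),
    Goal("Maintain attitude", "attitude_error_deg", 0.0, 2.0),
])
print(ascent.check({"chamber_pressure_MPa": 7.5, "attitude_error_deg": 0.4,
                    "orbital_insertion_error_km": 1.0}))
# -> ['Maintain chamber pressure']
```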
Dennehy, Tara C; Dasgupta, Nilanjana
2017-06-06
Scientific and engineering innovation is vital for American competitiveness, quality of life, and national security. However, too few American students, especially women, pursue these fields. Although this problem has attracted enormous attention, rigorously tested interventions outside artificial laboratory settings are quite rare. To address this gap, we conducted a longitudinal field experiment investigating the effect of peer mentoring on women's experiences and retention in engineering during college transition, assessing its impact for 1 y while mentoring was active, and an additional 1 y after mentoring had ended. Incoming women engineering students ( n = 150) were randomly assigned to female or male peer mentors or no mentors for 1 y. Their experiences were assessed multiple times during the intervention year and 1-y postintervention. Female (but not male) mentors protected women's belonging in engineering, self-efficacy, motivation, retention in engineering majors, and postcollege engineering aspirations. Counter to common assumptions, better engineering grades were not associated with more retention or career aspirations in engineering in the first year of college. Notably, increased belonging and self-efficacy were significantly associated with more retention and career aspirations. The benefits of peer mentoring endured long after the intervention had ended, inoculating women for the first 2 y of college-the window of greatest attrition from science, technology, engineering, and mathematics (STEM) majors. Thus, same-gender peer mentoring for a short period during developmental transition points promotes women's success and retention in engineering, yielding dividends over time.
NASA Technical Reports Server (NTRS)
Sohus, Anita M.; Wessen, Alice S.
2004-01-01
In communicating science to the public, just the facts can leave the public baffled, bewildered, and bored. In communicating science to the public, we need to learn to tell the story, not just the facts. Science and engineering is serious business, requiring precise language and rigorous reporting of "just the facts." Yet, we believe this very code of integrity has contributed to a public image, at best, of scientists as eccentrics and engineers as geeks, and at worst, as elitist snobs who speak in secret codes. The very heart of the science process - open discussion and disagreement - often leaves the public with the impression that scientists don't know which way is up.
Quality in physical therapist clinical education: a systematic review.
McCallum, Christine A; Mosher, Peter D; Jacobson, Peri J; Gallivan, Sean P; Giuffre, Suzanne M
2013-10-01
Many factors affect student learning throughout the clinical education (CE) component of professional (entry-level) physical therapist education curricula. Physical therapist education programs (PTEPs) manage CE, yet the material and human resources required to provide CE are generally overseen by community-based physical therapist practices. The purposes of this systematic review were: (1) to examine how the construct of quality is defined in CE literature and (2) to determine the methodological rigor of the available evidence on quality in physical therapist CE. This study was a systematic review of English-language journals using the American Physical Therapy Association's Open Door Portal to Evidence-Based Practice as the computer search engine. The search was categorized using terms for physical therapy and quality and for CE pedagogy and models or roles. Summary findings were characterized by 5 primary themes and 14 subthemes using a qualitative-directed content analysis. Fifty-four articles were included in the study. The primary quality themes were: CE framework, CE sites, structure of CE, assessment in CE, and CE faculty. The methodological rigor of the studies was critically appraised using a binary system based on the McMaster appraisal tools. Scores ranged from 3 to 14. Publication bias and outcome reporting bias may be inherent limitations to the results. The review found inconclusive evidence about what constitutes quality or best practice for physical therapist CE. Five key constructs of CE were identified that, when aggregated, could construe quality.
Bilitchenko, Lesia; Liu, Adam; Cheung, Sherine; Weeding, Emma; Xia, Bing; Leguia, Mariana; Anderson, J Christopher; Densmore, Douglas
2011-04-29
Synthetic biological systems are currently created by an ad-hoc, iterative process of specification, design, and assembly. These systems would greatly benefit from a more formalized and rigorous specification of the desired system components as well as constraints on their composition. Therefore, the creation of robust and efficient design flows and tools is imperative. We present a human readable language (Eugene) that allows for the specification of synthetic biological designs based on biological parts, as well as provides a very expressive constraint system to drive the automatic creation of composite Parts (Devices) from a collection of individual Parts. We illustrate Eugene's capabilities in three different areas: Device specification, design space exploration, and assembly and simulation integration. These results highlight Eugene's ability to create combinatorial design spaces and prune these spaces for simulation or physical assembly. Eugene creates functional designs quickly and cost-effectively. Eugene is intended for forward engineering of DNA-based devices, and through its data types and execution semantics, reflects the desired abstraction hierarchy in synthetic biology. Eugene provides a powerful constraint system which can be used to drive the creation of new devices at runtime. It accomplishes all of this while being part of a larger tool chain which includes support for design, simulation, and physical device assembly.
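The sketch below illustrates, in plain Python rather than Eugene syntax, the general idea of enumerating composite Devices from Part collections and pruning the combinatorial space with composition rules; the part names and rules are invented and are not drawn from Eugene's actual constraint engine.

```python
# Minimal sketch of constraint-driven combinatorial design: enumerate candidate
# Devices from Part collections, then prune with composition rules. Part names and
# rules are invented; this is not Eugene syntax.
from itertools import product

promoters   = ["pTet", "pBad"]
rbss        = ["rbs1", "rbs2"]
genes       = ["gfp", "rfp"]
terminators = ["T1"]

def satisfies(device):
    promoter, rbs, gene, term = device
    # Example rules: pBad may only drive rfp; gfp requires the strong RBS rbs1.
    if promoter == "pBad" and gene != "rfp":
        return False
    if gene == "gfp" and rbs != "rbs1":
        return False
    return True

design_space = product(promoters, rbss, genes, terminators)
devices = [d for d in design_space if satisfies(d)]
for d in devices:
    print(" - ".join(d))
# 2*2*2*1 = 8 candidate devices, pruned to the 5 that meet the rules.
```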
NASA Technical Reports Server (NTRS)
Dumbacher, Daniel L.
2006-01-01
The United States (US) Vision for Space Exploration, announced in January 2004, outlines the National Aeronautics and Space Administration's (NASA) strategic goals and objectives, including retiring the Space Shuttle and replacing it with new space transportation systems for missions to the Moon, Mars, and beyond. The Crew Exploration Vehicle (CEV) that the new human-rated Crew Launch Vehicle (CLV) lofts into space early next decade will initially ferry astronauts to the International Space Station (ISS). Toward the end of the next decade, a heavy-lift Cargo Launch Vehicle (CaLV) will deliver the Earth Departure Stage (EDS) carrying the Lunar Surface Access Module (LSAM) to low-Earth orbit (LEO), where it will rendezvous with the CEV launched on the CLV and return astronauts to the Moon for the first time in over 30 years. This paper outlines how NASA is building these new space transportation systems on a foundation of legacy technical and management knowledge, using extensive experience gained from past and ongoing launch vehicle programs to maximize its design and development approach, with the objective of reducing total life cycle costs through operational efficiencies such as hardware commonality. For example, the CLV in-line configuration is composed of a 5-segment Reusable Solid Rocket Booster (RSRB), which is an upgrade of the current Space Shuttle 4-segment RSRB, and a new upper stage powered by the liquid oxygen/liquid hydrogen (LOX/LH2) J-2X engine, which is an evolution of the J-2 engine that powered the Apollo Program's Saturn V second and third stages in the 1960s and 1970s. The CaLV configuration consists of a propulsion system composed of two 5-segment RSRBs and a 33-foot core stage that will provide the LOX/LH2 needed for five commercially available RS-68 main engines. The J-2X also will power the EDS. The Exploration Launch Projects, managed by the Exploration Launch Office located at NASA's Marshall Space Flight Center, is leading the design, development, testing, and operations planning for these new space transportation systems. Utilizing a foundation of heritage hardware and management lessons learned mitigates both technical and programmatic risk. Project engineers and managers work closely with the Space Shuttle Program to transition hardware, infrastructure, and workforce assets to the new launch systems, leveraging a wealth of knowledge from Shuttle operations. In addition, NASA and its industry partners have tapped into valuable Apollo databases and are applying corporate wisdom conveyed firsthand by Apollo-era veterans of America's original Moon missions. Learning from its successes and failures, NASA employs rigorous systems engineering and systems management processes and principles in a disciplined, integrated fashion to further improve the probability of mission success.
Image synthesis for SAR system, calibration and processor design
NASA Technical Reports Server (NTRS)
Holtzman, J. C.; Abbott, J. L.; Kaupp, V. H.; Frost, V. S.
1978-01-01
The Point Scattering Method of simulating radar imagery rigorously models all aspects of the imaging radar phenomena. Its computational algorithms operate on a symbolic representation of the terrain test site to calculate such parameters as range, angle of incidence, resolution cell size, etc. Empirical backscatter data and elevation data are utilized to model the terrain. Additionally, the important geometrical/propagation effects such as shadow, foreshortening, layover, and local angle of incidence are rigorously treated. Applications of radar image simulation to a proposed calibrated SAR system are highlighted: soil moisture detection and vegetation discrimination.
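As a rough illustration of the geometric bookkeeping such a simulation must perform, the Python sketch below computes a local angle of incidence and a parallel-ray shadow mask along a single range line of an elevation profile. The geometry, angles, and terrain values are simplified placeholders rather than the Point Scattering Method's actual algorithms.

```python
# Illustrative sketch (not the Point Scattering Method itself) of two geometric
# quantities a radar image simulation must compute from terrain elevation data:
# the local angle of incidence and radar shadowing along a range line, assuming
# parallel rays from a distant radar off to the left.
import numpy as np

def local_incidence_and_shadow(elev, dx, look_angle_deg):
    """elev: heights along a range line (m); dx: sample spacing (m);
    look_angle_deg: nominal incidence angle from vertical over flat terrain."""
    slope = np.gradient(elev, dx)              # terrain slope toward increasing range
    look = np.deg2rad(look_angle_deg)
    local_incidence = look - np.arctan(slope)  # slopes facing the radar reduce incidence
    depression = np.pi / 2 - look              # grazing angle of the (parallel) rays
    tan_d = np.tan(depression)
    shadowed = np.zeros(len(elev), dtype=bool)
    shadow_height = -np.inf
    for i, z in enumerate(elev):
        if i > 0:
            shadow_height -= tan_d * dx        # the blocking ray drops with distance from the radar
        shadowed[i] = z < shadow_height        # terrain below the ray is not illuminated
        shadow_height = max(shadow_height, z)
    return np.rad2deg(local_incidence), shadowed

elev = np.array([0.0, 5.0, 40.0, 10.0, 8.0, 6.0, 30.0])
inc, shadow = local_incidence_and_shadow(elev, dx=25.0, look_angle_deg=80.0)
print(np.round(inc, 1))
print(shadow)  # cells behind the 40 m ridge fall in radar shadow
```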
NASA's future plans for lunar astronomy and astrophysics
NASA Technical Reports Server (NTRS)
Stachnik, Robert V.; Kaplan, Michael S.
1994-01-01
An expanding scientific interest in astronomical observations from the Moon has led the National Aeronautics and Space Administration (NASA) to develop a two-part strategy for lunar-astrophysics planning. The strategy emphasizes a systematic review process involving both the external scientific community and internal NASA engineering teams, coupled with the rigorous exclusion of projects inappropriate to lunar emplacement. Five major candidate lunar-astronomy projects are described, together with a modest derivative of one of them that could be implemented early in the establishment of a lunar base.
2014-09-01
The NATO Science and Technology Organization Science & Technology (S&T) in the NATO context is defined as the selective and rigorous...generation and application of state-of-the-art, validated knowledge for defence and security purposes. S&T activities embrace scientific research...engineering, operational research and analysis, synthesis, integration and validation of knowledge derived through the scientific method. In NATO, S&T is
NASA Astrophysics Data System (ADS)
Sivapalan, Murugesu
2017-04-01
Hydrologic science has undergone almost transformative changes over the past 50 years. Huge strides have been made in the transition from early empirical approaches to rigorous approaches based on the fluid mechanics of water movement on and below the land surface. However, further progress has been hampered by problems posed by the presence of heterogeneity, especially subsurface heterogeneity, at all scales. The inability to measure or map subsurface heterogeneity everywhere prevented further development of balance equations and associated closure relations at the scales of interest, and has led to the virtual impasse we are presently in, in terms of development of physically based models needed for hydrologic predictions. An alternative to the mapping of subsurface heterogeneity everywhere is a new earth system science view, which sees the heterogeneity as the end result of co-evolutionary hydrological, geomorphological, ecological and pedological processes, each operating at a different rate, which have helped to shape the landscapes that we see in nature, including the heterogeneity below that we do not see. The expectation is that instead of specifying exact details of the heterogeneity in our models, we can replace it, without loss of information, with the ecosystem function they perform. Guided by this new earth system science perspective, development of hydrologic science is now guided by altogether new questions and new approaches to address them, compared to the purely physical, fluid mechanics based approaches that we inherited from the past. In the emergent Anthropocene, the co-evolutionary view is expanded further to involve interactions and feedbacks with human-social processes as well. In this lecture, I will present key milestones in the transformation of hydrologic science from Engineering Hydrology to Earth System Science, and what this means for hydrologic observations, theory development and predictions.
Expert system for computer-assisted annotation of MS/MS spectra.
Neuhauser, Nadin; Michalski, Annette; Cox, Jürgen; Mann, Matthias
2012-11-01
An important step in mass spectrometry (MS)-based proteomics is the identification of peptides by their fragment spectra. Regardless of the identification score achieved, almost all tandem-MS (MS/MS) spectra contain remaining peaks that are not assigned by the search engine. These peaks may be explainable by human experts but the scale of modern proteomics experiments makes this impractical. In computer science, Expert Systems are a mature technology to implement a list of rules generated by interviews with practitioners. We here develop such an Expert System, making use of literature knowledge as well as a large body of high mass accuracy and pure fragmentation spectra. Interestingly, we find that even with high mass accuracy data, rule sets can quickly become too complex, leading to over-annotation. Therefore we establish a rigorous false discovery rate, calculated by random insertion of peaks from a large collection of other MS/MS spectra, and use it to develop an optimized knowledge base. This rule set correctly annotates almost all peaks of medium or high abundance. For high resolution HCD data, median intensity coverage of fragment peaks in MS/MS spectra increases from 58% by search engine annotation alone to 86%. The resulting annotation performance surpasses a human expert, especially on complex spectra such as those of larger phosphorylated peptides. Our system is also applicable to high resolution collision-induced dissociation data. It is available both as a part of MaxQuant and via a webserver that only requires an MS/MS spectrum and the corresponding peptide sequence, and which outputs publication-quality, annotated MS/MS spectra (www.biochem.mpg.de/mann/tools/). It provides expert knowledge to beginners in the field of MS-based proteomics and helps advanced users to focus on unusual and possibly novel types of fragment ions.
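The following Python sketch illustrates the decoy-based false-discovery-rate idea described above; the toy annotation "rule" (matching peaks to an expected fragment list within a tolerance) and all numbers are invented and far simpler than the actual Expert System.

```python
# Hypothetical sketch of decoy-based FDR estimation: inject peaks drawn from unrelated
# spectra into a spectrum, re-run the annotation rules, and count how often an inserted
# (decoy) peak is wrongly annotated. The toy rule set and tolerances are invented.
import random

def annotate(peaks_mz, expected_mz, tol=0.01):
    """Return indices of peaks matching any expected fragment m/z within tol."""
    return {i for i, mz in enumerate(peaks_mz)
            if any(abs(mz - e) <= tol for e in expected_mz)}

def decoy_fdr(expected_mz, decoy_pool, n_decoys=50, trials=200, tol=0.01):
    """Estimate the chance that a randomly inserted peak gets (wrongly) annotated."""
    hits = total = 0
    for _ in range(trials):
        decoys = random.sample(decoy_pool, n_decoys)
        hits += len(annotate(decoys, expected_mz, tol))
        total += n_decoys
    return hits / total

random.seed(0)
expected = [175.119, 262.151, 333.188, 430.241]               # toy fragment ladder
spectrum = [175.12, 200.10, 333.19, 500.25]
decoy_pool = [random.uniform(100, 600) for _ in range(5000)]  # peaks from "other" spectra
print(annotate(spectrum, expected))                           # {0, 2}
print(round(decoy_fdr(expected, decoy_pool), 4))              # small, tolerance-dependent rate
```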
Mekios, Constantinos
2016-04-01
Twentieth-century theoretical efforts towards the articulation of general system properties fell short of having the significant impact on biological practice that their proponents envisioned. Although the latter did arrive at preliminary mathematical formulations of such properties, they had little success in showing how these could be productively incorporated into the research agenda of biologists. Consequently, the gap that kept system-theoretic principles cut off from biological experimentation persisted. More recently, however, simple theoretical tools have proved readily applicable within the context of systems biology. In particular, examples reviewed in this paper suggest that rigorous mathematical expressions of design principles, imported primarily from engineering, could produce experimentally confirmable predictions of the regulatory properties of small biological networks. But this is not enough for contemporary systems biologists who adopt the holistic aspirations of early systemologists, seeking high-level organizing principles that could provide insights into problems of biological complexity at the whole-system level. While the presented evidence is not conclusive about whether this strategy could lead to the realization of the lofty goal of a comprehensive explanatory integration, it suggests that the ongoing quest for organizing principles is pragmatically advantageous for systems biologists. The formalisms postulated in the course of this process can serve as bridges between system-theoretic concepts and the results of molecular experimentation: they constitute theoretical tools for generalizing molecular data, thus producing increasingly accurate explanations of system-wide phenomena.
Phytoremediation of hazardous wastes. Technical report, 23--26 July 1995
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCutcheon, S.C.; Wolfe, N.L.; Carreria, L.H.
1995-07-26
A new and innovative approach to phytoremediation (the use of plants to degrade hazardous contaminants) was developed. The new approach to phytoremediation involves rigorous pathway analyses, mass balance determinations, and identification of specific enzymes that break down trinitrotoluene (TNT), other explosives (RDX and HMX), nitrobenzene, and chlorinated solvents (e.g., TCE and PCE) (EPA 1994). As a good example, TNT is completely and rapidly degraded by nitroreductase and laccase enzymes. The aromatic ring is broken and the carbon in the ring fragments is incorporated into new plant fiber, as part of the natural lignification process. Half-lives for TNT degradation approach 1 hr or less under ideal laboratory conditions. Continuous-flow pilot studies indicate that scale-up residence times in created wetlands may be two to three times longer than in laboratory batch studies. The use of created wetlands and land farming techniques guided by rigorous field biochemistry and ecology promises to be a vital part of a newly evolving field, ecological engineering.
Sharing Research Models: Using Software Engineering Practices for Facilitation
Bryant, Stephanie P.; Solano, Eric; Cantor, Susanna; Cooley, Philip C.; Wagener, Diane K.
2011-01-01
Increasingly, researchers are turning to computational models to understand the interplay of important variables on systems’ behaviors. Although researchers may develop models that meet the needs of their investigation, application limitations—such as nonintuitive user interface features and data input specifications—may limit the sharing of these tools with other research groups. By removing these barriers, other research groups that perform related work can leverage these work products to expedite their own investigations. The use of software engineering practices can enable managed application production and shared research artifacts among multiple research groups by promoting consistent models, reducing redundant effort, encouraging rigorous peer review, and facilitating research collaborations that are supported by a common toolset. This report discusses three established software engineering practices— the iterative software development process, object-oriented methodology, and Unified Modeling Language—and the applicability of these practices to computational model development. Our efforts to modify the MIDAS TranStat application to make it more user-friendly are presented as an example of how computational models that are based on research and developed using software engineering practices can benefit a broader audience of researchers. PMID:21687780
N+3 Aircraft Concept Designs and Trade Studies. Volume 1
NASA Technical Reports Server (NTRS)
Greitzer, E. M.; Bonnefoy, P. A.; DelaRosaBlanco, E.; Dorbian, C. S.; Drela, M.; Hall, D. K.; Hansman, R. J.; Hileman, J. I.; Liebeck, R. H.; Levegren, J.;
2010-01-01
MIT, Aerodyne Research, Aurora Flight Sciences, and Pratt & Whitney have collaborated to address NASA's desire to pursue revolutionary conceptual designs for a subsonic commercial transport that could enter service in the 2035 timeframe. The MIT team brings together multidisciplinary expertise and cutting-edge technologies to determine, in a rigorous and objective manner, the potential for improvements in noise, emissions, and performance for subsonic fixed wing transport aircraft. The collaboration incorporates assessment of the trade space in aerodynamics, propulsion, operations, and structures to ensure that the full spectrum of improvements is identified. Although the analysis focuses on these key areas, the team has taken a system-level approach to find the integrated solutions that offer the best balance in performance enhancements. Based on the trade space analyses and system-level assessment, two aircraft have been identified and carried through conceptual design to show both the in-depth engineering that underpins the benefits envisioned and also the technology paths that need to be followed to enable, within the next 25 years, the development of aircraft three generations ahead in capabilities from those flying today.
Automatic Debugging Support for UML Designs
NASA Technical Reports Server (NTRS)
Schumann, Johann; Swanson, Keith (Technical Monitor)
2001-01-01
Design of large software systems requires rigorous application of software engineering methods covering all phases of the software process. Debugging during the early design phases is extremely important, because late bug-fixes are expensive. In this paper, we describe an approach which facilitates debugging of UML requirements and designs. The Unified Modeling Language (UML) is a set of notations for object-oriented design of a software system. We have developed an algorithm which translates requirement specifications in the form of annotated sequence diagrams into structured statecharts. This algorithm detects conflicts between sequence diagrams and inconsistencies in the domain knowledge. After synthesizing statecharts from sequence diagrams, these statecharts usually are subject to manual modification and refinement. By using the "backward" direction of our synthesis algorithm, we are able to map modifications made to the statechart back into the requirements (sequence diagrams) and check for conflicts there. Conflicts detected by our algorithm are fed back to the user and form the basis for deductive-based debugging of requirements and domain theory in very early development stages. Our approach allows us to generate explanations of why there is a conflict and which parts of the specifications are affected.
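One simple kind of conflict such an algorithm can flag is two sequence diagrams that order the same pair of messages in opposite ways. The Python sketch below checks for exactly that; the message names and the pairwise-ordering check are invented for illustration, and the actual synthesis algorithm is considerably richer.

```python
# Illustrative sketch of ordering-conflict detection between sequence diagrams:
# report any pair of messages that one diagram orders one way and another diagram
# orders the opposite way. Message names are invented for illustration.
from itertools import combinations

def orderings(trace):
    """All ordered pairs (a before b) implied by one diagram's message trace."""
    return {(a, b) for i, a in enumerate(trace) for b in trace[i + 1:]}

def conflicts(traces):
    pairs = [orderings(t) for t in traces]
    found = []
    for (i, p), (j, q) in combinations(enumerate(pairs), 2):
        reversed_pairs = {(b, a) for (a, b) in q}
        for clash in p & reversed_pairs:
            found.append((i, j, clash))
    return found

sd1 = ["login", "openSession", "query"]
sd2 = ["openSession", "login", "query"]
print(conflicts([sd1, sd2]))   # [(0, 1, ('login', 'openSession'))]
```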
An ORCID based synchronization framework for a national CRIS ecosystem.
Mendes Moreira, João; Cunha, Alcino; Macedo, Nuno
2015-01-01
PTCRIS (Portuguese Current Research Information System) is a program aiming at the creation and sustained development of a national integrated information ecosystem, to support research management according to the best international standards and practices. This paper reports on the experience of designing and prototyping a synchronization framework for PTCRIS based on ORCID (Open Researcher and Contributor ID). This framework embraces the "input once, re-use often" principle, and will enable a substantial reduction of the research output management burden by allowing automatic information exchange between the various national systems. The design of the framework followed best practices in rigorous software engineering, namely well-established principles in the research field of consistency management, and relied on formal analysis techniques and tools for its validation and verification. The notion of consistency between the services was formally specified and discussed with the stakeholders before the technical aspects on how to preserve said consistency were explored. Formal specification languages and automated verification tools were used to analyze the specifications and generate usage scenarios, useful for validation with the stakeholder and essential to certificate compliant services.
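A toy Python sketch of the "input once, re-use often" synchronization idea follows: compare works registered locally with those in an ORCID profile (here matched by DOI) and compute what must be pushed or pulled to restore consistency. The record structures are invented, and this is neither the PTCRIS framework nor the ORCID API.

```python
# Toy consistency check between a local CRIS record set and an ORCID profile,
# matching works by DOI. Structures and field names are invented for illustration.
def sync_plan(local_works, orcid_works):
    local = {w["doi"]: w for w in local_works if w.get("doi")}
    remote = {w["doi"]: w for w in orcid_works if w.get("doi")}
    return {
        "push_to_orcid": sorted(local.keys() - remote.keys()),
        "pull_to_local": sorted(remote.keys() - local.keys()),
    }

local_works = [{"doi": "10.1000/a"}, {"doi": "10.1000/b"}]
orcid_works = [{"doi": "10.1000/b"}, {"doi": "10.1000/c"}]
print(sync_plan(local_works, orcid_works))
# {'push_to_orcid': ['10.1000/a'], 'pull_to_local': ['10.1000/c']}
```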
Dynamics of two-phase interfaces and surface tensions: A density-functional theory perspective
NASA Astrophysics Data System (ADS)
Yatsyshin, Petr; Sibley, David N.; Duran-Olivencia, Miguel A.; Kalliadasis, Serafim
2016-11-01
Classical density functional theory (DFT) is a statistical mechanical framework for the description of fluids at the nanoscale, where the inhomogeneity of the fluid structure needs to be carefully accounted for. By expressing the grand free-energy of the fluid as a functional of the one-body density, DFT offers a theoretically consistent and computationally accessible way to obtain two-phase interfaces and respective interfacial tensions in a ternary solid-liquid-gas system. The dynamic version of DFT (DDFT) can be rigorously derived from the Smoluchowsky picture of the dynamics of colloidal particles in a solvent. It is generally agreed that DDFT can capture the diffusion-driven evolution of many soft-matter systems. In this context, we use DDFT to investigate the dynamic behaviour of two-phase interfaces in both equilibrium and dynamic wetting and discuss the possibility of defining a time-dependent surface tension, which still remains in debate. We acknowledge financial support from the European Research Council via Advanced Grant No. 247031 and from the Engineering and Physical Sciences Research Council of the UK via Grants No. EP/L027186 and EP/L020564.
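For reference, the standard textbook forms of the quantities named above (not equations quoted from this paper) are the grand potential functional of classical DFT and the deterministic DDFT evolution equation:

```latex
% Standard classical DFT / DDFT relations (textbook forms, not reproduced from the paper)
\Omega[\rho] \;=\; F[\rho] \;+\; \int \mathrm{d}\mathbf{r}\,\rho(\mathbf{r})\,\bigl[V_{\mathrm{ext}}(\mathbf{r}) - \mu\bigr],
\qquad
\frac{\partial \rho(\mathbf{r},t)}{\partial t} \;=\; \Gamma\,\nabla\!\cdot\!\left[\rho(\mathbf{r},t)\,\nabla \frac{\delta F[\rho]}{\delta \rho(\mathbf{r},t)}\right],
```

where F[ρ] is the intrinsic Helmholtz free-energy functional, V_ext the external potential, μ the chemical potential, and Γ a mobility coefficient; equilibrium density profiles, and hence interfaces and surface tensions, follow from δΩ/δρ = 0.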
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eccleston, C.H.
1997-09-05
The National Environmental Policy Act (NEPA) of 1969 was established by Congress more than a quarter of a century ago, yet there is a surprising lack of specific tools, techniques, and methodologies for effectively implementing these regulatory requirements. Lack of professionally accepted techniques is a principal factor responsible for many inefficiencies. Often, decision makers do not fully appreciate or capitalize on the true potential which NEPA provides as a platform for planning future actions. New approaches and modern management tools must be adopted to fully achieve NEPA's mandate. A new strategy, referred to as Total Federal Planning, is proposed for unifying large-scale federal planning efforts under a single, systematic, structured, and holistic process. Under this approach, the NEPA planning process provides a unifying framework for integrating all early environmental and nonenvironmental decision-making factors into a single comprehensive planning process. To promote effectiveness and efficiency, modern tools and principles from the disciplines of Value Engineering, Systems Engineering, and Total Quality Management are incorporated. Properly integrated and implemented, these planning tools provide the rigorous, structured, and disciplined framework essential in achieving effective planning. Ultimately, the goal of a Total Federal Planning strategy is to construct a unified and interdisciplinary framework that substantially improves decision-making, while reducing the time, cost, redundancy, and effort necessary to comply with environmental and other planning requirements. At a time when Congress is striving to re-engineer the governmental framework, apparatus, and process, a Total Federal Planning philosophy offers a systematic approach for uniting the disjointed and often convoluted planning process currently used by most federal agencies. Potentially this approach has widespread implications in the way federal planning is approached.
NASA Astrophysics Data System (ADS)
Abdul-Aziz, Ali; Woike, Mark R.; Clem, Michelle; Baaklini, George
2015-03-01
Efforts to update and improve turbine engine components to meet flight safety and durability requirements are commitments that engine manufacturers continuously strive to fulfill. Most of their concerns and development energies focus on rotating components such as rotor disks. These components typically operate under rigorous conditions and experience high centrifugal loadings that expose them to various failure mechanisms. Developing highly advanced health monitoring technology to screen their condition and performance is therefore essential to prolonged service life and operational success. Nondestructive evaluation techniques are among the many screening methods presently used to detect hidden flaws and minute cracks before failure. Most of these methods, however, are confined to evaluating material discontinuities and other defects that have matured to a point where failure is imminent. Hence, development of more robust techniques to predict faults in these components before any catastrophic event is vital. This paper presents ongoing research activities at the NASA Glenn Research Center (GRC) rotor dynamics laboratory in support of developing a fault detection system for key critical turbine engine components. Data obtained from spin test experiments of a rotor disk are presented, covering the behavior of blade tip clearance, tip timing, and shaft displacement based on measurements acquired from eddy-current, capacitive, and microwave sensors. Additional results linking the test data with finite element modeling to characterize the structural durability of a cracked rotor, as it relates to the experimental tests and findings, are also presented. A clear difference in vibration response is shown between the notched and the baseline (un-notched) rotor disks, indicating the presence of an irregularity.
Analysis of the Impact of Introductory Physics on Engineering Students at Texas A&M University
NASA Astrophysics Data System (ADS)
Perry, Jonathan; Bassichis, William
Introductory physics forms a major part of the foundational knowledge of engineering majors, independent of discipline and institution. While the content of introductory physics courses is consistent from institution to institution, the manner in which it is taught can vary greatly due to professor, textbook, instructional method, and overall course design. This work attempts to examine variations in student success, as measured by overall academic performance in an engineering major, and matriculation rates, based on the type of introductory physics a student took while enrolled in an engineering degree at Texas A&M University. Specific options for introductory physics at Texas A&M University include two calculus based physics courses, one traditional (UP), and one more mathematically rigorous (DP), transfer credit, and high school (AP or dual) credit. In order to examine the impact of introductory physics on a student's degree progression, data mining analyses are performed on a data set of relatively comprehensive academic records for all students enrolled as an engineering major for a minimum of one academic term. Student data has been collected for years of entering freshman beginning in 1990 and ending in 2010. Correlations will be examined between freshman level courses, including introductory physics, and follow on engineering courses, matriculation rates, and time to graduation.
The National Ignition Facility: alignment from construction to shot operations
NASA Astrophysics Data System (ADS)
Burkhart, S. C.; Bliss, E.; Di Nicola, P.; Kalantar, D.; Lowe-Webb, R.; McCarville, T.; Nelson, D.; Salmon, T.; Schindler, T.; Villanueva, J.; Wilhelmsen, K.
2010-08-01
The National Ignition Facility in Livermore, California, completed its commissioning milestone on March 10, 2009 when it fired all 192 beams at a combined energy of 1.1 MJ at 351 nm. Subsequently, a target shot series from August through December of 2009 culminated in scale ignition target design experiments up to 1.2 MJ in the National Ignition Campaign. Preparations are underway through the first half of 2010 leading to DT ignition and gain experiments in the fall of 2010 into 2011. The top level requirement for beam pointing to target of 50 μm rms is the culmination of 15 years of engineering design of a stable facility, commissioning of precision alignment, and precise shot operations controls. Key design documents which guided this project were published in the mid-1990s, driving systems designs. Precision survey methods were used throughout construction, commissioning, and operations for precision placement. Rigorous commissioning processes were used to ensure and validate placement and alignment throughout commissioning and in present-day operations. Accurate and rapid system alignment during operations is accomplished by an impressive controls system to align and validate alignment readiness, assuring machine safety and productive experiments.
Evaluating Emulation-based Models of Distributed Computing Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, Stephen T.; Gabert, Kasimir G.; Tarman, Thomas D.
Emulation-based models of distributed computing systems are collections of virtual machines, virtual networks, and other emulation components configured to stand in for operational systems when performing experimental science, training, analysis of design alternatives, test and evaluation, or idea generation. As with any tool, we should carefully evaluate whether our uses of emulation-based models are appropriate and justified. Otherwise, we run the risk of using a model incorrectly and creating meaningless results. The variety of uses of emulation-based models each have their own goals and deserve thoughtful evaluation. In this paper, we enumerate some of these uses and describe approaches that one can take to build an evidence-based case that a use of an emulation-based model is credible. Predictive uses of emulation-based models, where we expect a model to tell us something true about the real world, set the bar especially high, and the principal evaluation method, called validation, is commensurately rigorous. We spend the majority of our time describing and demonstrating the validation of a simple predictive model using a well-established methodology inherited from decades of development in the computational science and engineering community.
NASA Technical Reports Server (NTRS)
Hale, Joseph P.
2006-01-01
Models and simulations (M&S) are critical resources in the exploration of space. They support program management, systems engineering, integration, analysis, test, and operations and provide critical information and data supporting key analyses and decisions (technical, cost and schedule). Consequently, there is a clear need to establish a solid understanding of M&S strengths and weaknesses, and the bounds within which they can credibly support decision-making. Their usage requires the implementation of a rigorous approach to verification, validation and accreditation (VV&A) and establishment of formal process and practices associated with their application. To ensure decision-making is suitably supported by information (data, models, test beds) from activities (studies, exercises) from M&S applications that are understood and characterized, ESMD is establishing formal, tailored VV&A processes and practices. In addition, to ensure the successful application of M&S within ESMD, a formal process for the certification of analysts that use M&S is being implemented. This presentation will highlight NASA's Exploration Systems Mission Directorate (ESMD) management approach for M&S VV&A to ensure decision-makers receive timely information on the model's fidelity, credibility, and quality.
Computing Generalized Matrix Inverse on Spiking Neural Substrate.
Shukla, Rohit; Khoram, Soroosh; Jorgensen, Erik; Li, Jing; Lipasti, Mikko; Wright, Stephen
2018-01-01
Emerging neural hardware substrates, such as IBM's TrueNorth Neurosynaptic System, can provide an appealing platform for deploying numerical algorithms. For example, a recurrent Hopfield neural network can be used to find the Moore-Penrose generalized inverse of a matrix, thus enabling a broad class of linear optimizations to be solved efficiently, at low energy cost. However, deploying numerical algorithms on hardware platforms that severely limit the range and precision of representation for numeric quantities can be quite challenging. This paper discusses these challenges and proposes a rigorous mathematical framework for reasoning about range and precision on such substrates. The paper derives techniques for normalizing inputs and properly quantizing synaptic weights originating from arbitrary systems of linear equations, so that solvers for those systems can be implemented in a provably correct manner on hardware-constrained neural substrates. The analytical model is empirically validated on the IBM TrueNorth platform, and results show that the guarantees provided by the framework for range and precision hold under experimental conditions. Experiments with optical flow demonstrate the energy benefits of deploying a reduced-precision and energy-efficient generalized matrix inverse engine on the IBM TrueNorth platform, reflecting 10× to 100× improvement over FPGA and ARM core baselines.
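As a rough sketch of the numerical problem being mapped onto the neural substrate (not the TrueNorth implementation itself), the following NumPy code finds the Moore-Penrose pseudoinverse by gradient descent on the Frobenius objective ||AX - I||², with an input normalization step in the spirit of the scaling the paper describes for limited-precision hardware.

```python
# Minimal NumPy sketch of the optimization underlying a Hopfield-style generalized
# inverse solver: gradient descent on 0.5 * ||A X - I||_F^2, whose minimizer for a
# full-column-rank A is the Moore-Penrose pseudoinverse. The input scaling only mimics,
# in spirit, the normalization needed before quantizing weights for constrained hardware.
import numpy as np

def pseudoinverse_gd(A, iters=20000):
    m, n = A.shape
    scale = np.linalg.norm(A, 2)          # spectral norm: normalize so the iteration is stable
    As = A / scale
    step = 1.0 / np.linalg.norm(As.T @ As, 2)
    X = np.zeros((n, m))
    I = np.eye(m)
    for _ in range(iters):
        X -= step * As.T @ (As @ X - I)   # gradient of 0.5*||As X - I||_F^2
    return X / scale                      # undo the input scaling

A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
X = pseudoinverse_gd(A)
print(np.allclose(X, np.linalg.pinv(A), atol=1e-6))  # True
```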
Trinh, Cong T.; Wlaschin, Aaron; Srienc, Friedrich
2010-01-01
Elementary Mode Analysis is a useful Metabolic Pathway Analysis tool to identify the structure of a metabolic network that links the cellular phenotype to the corresponding genotype. The analysis can decompose the intricate metabolic network comprised of highly interconnected reactions into uniquely organized pathways. These pathways consisting of a minimal set of enzymes that can support steady state operation of cellular metabolism represent independent cellular physiological states. Such pathway definition provides a rigorous basis to systematically characterize cellular phenotypes, metabolic network regulation, robustness, and fragility that facilitate understanding of cell physiology and implementation of metabolic engineering strategies. This mini-review aims to overview the development and application of elementary mode analysis as a metabolic pathway analysis tool in studying cell physiology and as a basis of metabolic engineering. PMID:19015845
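To make the steady-state idea concrete, the sketch below builds a toy stoichiometric matrix and extracts a null-space basis, the space in which all steady-state flux distributions (and hence all elementary modes) must lie. The network is invented, and real elementary-mode computation additionally enforces irreversibility and support-minimality constraints that dedicated tools handle.

```python
# Small sketch of the linear-algebra core of metabolic pathway analysis: steady-state
# flux vectors v satisfy S v = 0 for the internal-metabolite stoichiometric matrix S.
# The toy network is invented; this is only the starting point for elementary modes.
import numpy as np
from scipy.linalg import null_space

# Rows: internal metabolites A, B; columns: reactions R1..R4
# R1: -> A,  R2: A -> B,  R3: B -> ,  R4: A ->
S = np.array([
    [1, -1,  0, -1],   # A
    [0,  1, -1,  0],   # B
])

N = null_space(S)             # basis for all steady-state flux distributions
print(N.shape)                # (4, 2): a 2-dimensional steady-state flux space
v = N[:, 0]
print(np.allclose(S @ v, 0))  # True: no net accumulation of internal metabolites
```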
NASA Technical Reports Server (NTRS)
Eisley, Joe T.
1990-01-01
The declining pool of graduates, the lack of rigorous preparation in science and mathematics, and the declining interest in science and engineering careers at the precollege level promises a shortage of technically educated personnel at the college level for industry, government, and the universities in the next several decades. The educational process, which starts out with a large number of students at the elementary level, but with an ever smaller number preparing for science and engineering at each more advanced educational level, is in a state of crisis. These pipeline issues, so called because the educational process is likened to a series of ever smaller constrictions in a pipe, were examined in a workshop at the Space Grant Conference and a summary of the presentations and the results of the discussion, and the conclusions of the workshop participants are reported.
Designer BHAs reduce costs on Andrew/Cyrus
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klein, R.; Todd, S.; Clark, G.
1997-10-01
A key mechanism to development of Andrew/Cyrus has been the development of alliances between BP Exploration, its field partners and various contractors. One such joint venture was the well engineering alliance between BP, Schlumberger, Baker Hughes INTEQ and Transocean, created to deliver five predrilled horizontal wells in the two fields prior to Andrew platform installation. This alliance provided the opportunity to radically change usual ways of doing business in the well engineering arena. One specific aspect involved a rigorous learning process within the team. Rapid learning before and during the operation led to several substantial improvements, one of which was the development of designer bottomhole assembly (BHA) configurations. Continual development of these assemblies has optimized well placement, bringing enhanced value while reducing costs and increasing potential alliance gainshare.
Geldenhuys, Greta; Muller, Nina; Frylinck, Lorinda; Hoffman, Louwrens C
2016-01-15
Baseline research on the toughness of Egyptian goose meat is required. This study therefore investigates the post mortem pH and temperature decline (15 min-4 h 15 min post mortem) in the pectoralis muscle (breast portion) of this gamebird species. It also explores the enzyme activity of the Ca(2+)-dependent protease (calpain system) and the lysosomal cathepsins during the rigor mortis period. No differences were found for any of the variables between genders. The pH decline in the pectoralis muscle occurs quite rapidly (c = -0.806; ultimate pH ∼ 5.86) compared with other species and it is speculated that the high rigor temperature (>20 °C) may contribute to the increased toughness. No calpain I was found in Egyptian goose meat and the µ/m-calpain activity remained constant during the rigor period, while a decrease in calpastatin activity was observed. The cathepsin B, B & L and H activity increased over the rigor period. Further research into the connective tissue content and myofibrillar breakdown during aging is required in order to know if the proteolytic enzymes do in actual fact contribute to tenderisation. © 2015 Society of Chemical Industry.
Modern Management Principles Come to the Dental School.
Wataha, John C; Mouradian, Wendy E; Slayton, Rebecca L; Sorensen, John A; Berg, Joel H
2016-04-01
The University of Washington School of Dentistry may be the first dental school in the nation to apply lean process management principles as a primary tool to re-engineer its operations and curriculum to produce the dentist of the future. The efficiencies realized through re-engineering will better enable the school to remain competitive and viable as a national leader of dental education. Several task forces conducted rigorous value stream analyses in a highly collaborative environment led by the dean of the school. The four areas undergoing evaluation and re-engineering were organizational infrastructure, organizational processes, curriculum, and clinic operations. The new educational model was derived by thoroughly analyzing the current state of dental education in order to design and achieve the closest possible ideal state. As well, the school's goal was to create a lean, sustainable operational model. This model aims to ensure continued excellence in restorative dental instruction and to serve as a blueprint for other public dental schools seeking financial stability in this era of shrinking state support and rising costs.
NASA Astrophysics Data System (ADS)
Claver, Chuck F.; Debois-Felsmann, G. P.; Delgado, F.; Hascall, P.; Marshall, S.; Nordby, M.; Schumacher, G.; Sebag, J.; LSST Collaboration
2011-01-01
The Large Synoptic Survey Telescope (LSST) is a complete observing system that acquires and archives images, processes and analyzes them, and publishes reduced images and catalogs of sources and objects. The LSST will operate over a ten year period producing a survey of 20,000 square degrees over the entire [Southern] sky in 6 filters (ugrizy) with each field having been visited several hundred times, enabling a wide spectrum of science from fast transients to exploration of dark matter and dark energy. The LSST itself is a complex system of systems consisting of the 8.4m 3-mirror telescope, a 3.2 billion pixel camera, and a peta-scale data management system. The LSST project uses a Model Based Systems Engineering (MBSE) methodology to ensure an integrated approach to system design and rigorous definition of system interfaces and specifications. The MBSE methodology is applied through modeling of the LSST's systems with the System Modeling Language (SysML). The SysML modeling recursively establishes the threefold relationship between requirements, logical & physical functional decomposition and definition, and system and component behavior at successively deeper levels of abstraction and detail. The LSST modeling includes analyzing and documenting the flow of command and control information and data between the suite of systems in the LSST observatory that are needed to carry out the activities of the survey. The MBSE approach is applied throughout all stages of the project from design, to validation and verification, through to commissioning.
NASA Astrophysics Data System (ADS)
Smith, Denise A.; Mendez, B.; Shipp, S.; Schwerin, T.; Stockman, S.; Cooper, L. P.; Sharma, M.
2010-01-01
Scientists, engineers, educators, and public outreach professionals have a rich history of creatively using NASA's pioneering scientific discoveries and technology to engage and educate youth and adults nationwide in core science, technology, engineering, and mathematics topics. We introduce four new Science Education and Public Outreach Forums that will work in partnership with the community and NASA's Science Mission Directorate (SMD) to ensure that current and future SMD-funded education and public outreach (E/PO) activities form a seamless whole, with easy entry points for the general public, students, K-12 formal and informal science educators, faculty, scientists, engineers, and E/PO professionals alike. The new Science Education and Public Outreach Forums support the astrophysics, heliophysics, planetary and Earth science divisions of NASA SMD in three core areas: 1) E/PO community engagement and development activities will provide clear paths of involvement for scientists and engineers interested - or potentially interested - in participating in SMD-funded E/PO activities. Collaborations with scientists and engineers are vital for infusing current, accurate SMD mission and research findings into educational products and activities. Forum activities will also yield readily accessible information on effective E/PO strategies, resources, and expertise; context for individual E/PO activities; and opportunities for collaboration. 2) A rigorous analysis of SMD-funded K-12 formal, informal, and higher education products and activities will help the community and SMD to understand how the existing collection supports education standards and audience needs, and to strategically identify areas of opportunity for new materials and activities. 3) Finally, a newly convened Coordinating Committee will work across the four SMD science divisions to address systemic issues and integrate related activities. By supporting the NASA E/PO community and facilitating coordination of E/PO activities, the NASA-SEPOF partnerships will lead to more effective, sustainable, and efficient utilization of NASA science discoveries and learning experiences.
Mechanical properties of frog skeletal muscles in iodoacetic acid rigor.
Mulvany, M J
1975-01-01
1. Methods have been developed for describing the length:tension characteristics of frog skeletal muscles which go into rigor at 4 degrees C following iodoacetic acid poisoning either in the presence of Ca2+ (Ca-rigor) or its absence (Ca-free-rigor). 2. Such rigor muscles showed less resistance to slow stretch (slow rigor resistance) than to fast stretch (fast rigor resistance). The slow and fast rigor resistances of Ca-free-rigor muscles were much lower than those of Ca-rigor muscles. 3. The slow rigor resistance of Ca-rigor muscles was proportional to the amount of overlap between the contractile filaments present when the muscles were put into rigor. 4. Withdrawing Ca2+ from Ca-rigor muscles (induced-Ca-free rigor) reduced their slow and fast rigor resistances. Readdition of Ca2+ (but not Mg2+, Mn2+ or Sr2+) reversed the effect. 5. The slow and fast rigor resistances of Ca-rigor muscles (but not of Ca-free-rigor muscles) decreased with time. 6. The sarcomere structure of Ca-rigor and induced-Ca-free rigor muscles stretched by 0.2 lo was destroyed in proportion to the amount of stretch, but the lengths of the remaining intact sarcomeres were essentially unchanged. This suggests that there had been a successive yielding of the weakest sarcomeres. 7. The difference between the slow and fast rigor resistance and the effect of calcium on these resistances are discussed in relation to possible variations in the strength of crossbridges between the thick and thin filaments. PMID:1082023
Improved key-rate bounds for practical decoy-state quantum-key-distribution systems
NASA Astrophysics Data System (ADS)
Zhang, Zhen; Zhao, Qi; Razavi, Mohsen; Ma, Xiongfeng
2017-01-01
The decoy-state scheme is the most widely implemented quantum-key-distribution protocol in practice. In order to account for the finite-size key effects on the achievable secret key generation rate, a rigorous statistical fluctuation analysis is required. Originally, a heuristic Gaussian-approximation technique was used for this purpose, which, despite its analytical convenience, was not sufficiently rigorous. The fluctuation analysis has recently been made rigorous by using the Chernoff bound. There is a considerable gap, however, between the key-rate bounds obtained from these techniques and that obtained from the Gaussian assumption. Here we develop a tighter bound for the decoy-state method, which yields a smaller failure probability. This improvement results in a higher key rate and increases the maximum distance over which secure key exchange is possible. By optimizing the system parameters, our simulation results show that our method almost closes the gap between the two previously proposed techniques and achieves a performance similar to that of conventional Gaussian approximations.
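To make the contrast concrete, the sketch below compares a Gaussian-approximation confidence interval with a Chernoff-style interval for a single observed count at a given failure probability. The binomial model, the particular Chernoff tail forms, and all numbers are illustrative assumptions; this is not the paper's improved key-rate bound, only a demonstration of why Chernoff-based intervals are wider (more conservative) than Gaussian ones.

```python
# Sketch: finite-size interval for an observed count n with failure probability eps.
# Gaussian approximation: n +/- z_eps * sqrt(n).
# Chernoff-style bounds: solve exp(-d^2 n / (2 + d)) = eps (upper tail)
# and exp(-d^2 n / 2) = eps (lower tail) for the relative deviation d.
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

def gaussian_interval(n, eps):
    z = norm.ppf(1.0 - eps)
    return n - z * np.sqrt(n), n + z * np.sqrt(n)

def chernoff_interval(n, eps):
    d_up = brentq(lambda d: np.exp(-d * d * n / (2.0 + d)) - eps, 1e-9, 1.0)
    d_lo = brentq(lambda d: np.exp(-d * d * n / 2.0) - eps, 1e-9, 1.0)
    return n * (1.0 - d_lo), n * (1.0 + d_up)

n, eps = 10_000, 1e-10
print("Gaussian:", gaussian_interval(n, eps))
print("Chernoff:", chernoff_interval(n, eps))
```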
Bilitchenko, Lesia; Liu, Adam; Cheung, Sherine; Weeding, Emma; Xia, Bing; Leguia, Mariana; Anderson, J. Christopher; Densmore, Douglas
2011-01-01
Background: Synthetic biological systems are currently created by an ad-hoc, iterative process of specification, design, and assembly. These systems would greatly benefit from a more formalized and rigorous specification of the desired system components as well as constraints on their composition. Therefore, the creation of robust and efficient design flows and tools is imperative. We present a human readable language (Eugene) that allows for the specification of synthetic biological designs based on biological parts, as well as provides a very expressive constraint system to drive the automatic creation of composite Parts (Devices) from a collection of individual Parts. Results: We illustrate Eugene's capabilities in three different areas: Device specification, design space exploration, and assembly and simulation integration. These results highlight Eugene's ability to create combinatorial design spaces and prune these spaces for simulation or physical assembly. Eugene creates functional designs quickly and cost-effectively. Conclusions: Eugene is intended for forward engineering of DNA-based devices, and through its data types and execution semantics, reflects the desired abstraction hierarchy in synthetic biology. Eugene provides a powerful constraint system which can be used to drive the creation of new devices at runtime. It accomplishes all of this while being part of a larger tool chain which includes support for design, simulation, and physical device assembly. PMID:21559524
NASA Astrophysics Data System (ADS)
Akilan, A.; Nagasubramanian, V.; Chaudhry, A.; Reddy, D. Rajesh; Sudheer Reddy, D.; Usha Devi, R.; Tirupati, T.; Radhadevi, P. V.; Varadan, G.
2014-11-01
Block Adjustment is a technique for large area mapping with images obtained from different remote sensing satellites. The challenge in this process is to handle a huge number of satellite images from different sources with different resolutions and accuracies at the system level. This paper explains a system with various tools and techniques to effectively handle the end-to-end chain in large area mapping and production with a good level of automation and provisions for intuitive analysis of final results in 3D and 2D environments. In addition, the interface for using open source ortho and DEM references, viz. ETM, SRTM, etc., and for displaying ESRI shapes for the image footprints is explained. Rigorous theory, mathematical modelling, workflow automation and sophisticated software engineering tools are included to ensure high photogrammetric accuracy and productivity. Major building blocks like Georeferencing, Geo-capturing and Geo-Modelling tools included in the block adjustment solution are explained in this paper. To provide an optimal bundle block adjustment solution with high precision results, the system has been optimized in many stages to exploit the full utilization of hardware resources. The robustness of the system is ensured by handling failures in the automatic procedure and saving the process state at every stage for subsequent restoration from the point of interruption. The results obtained from various stages of the system are presented in the paper.
Phases, phase equilibria, and phase rules in low-dimensional systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frolov, T., E-mail: timfrol@berkeley.edu; Mishin, Y., E-mail: ymishin@gmu.edu
2015-07-28
We present a unified approach to thermodynamic description of one, two, and three dimensional phases and phase transformations among them. The approach is based on a rigorous definition of a phase applicable to thermodynamic systems of any dimensionality. Within this approach, the same thermodynamic formalism can be applied for the description of phase transformations in bulk systems, interfaces, and line defects separating interface phases. For both lines and interfaces, we rigorously derive an adsorption equation, the phase coexistence equations, and other thermodynamic relations expressed in terms of generalized line and interface excess quantities. As a generalization of the Gibbs phase rule for bulk phases, we derive phase rules for lines and interfaces and predict the maximum number of phases that may coexist in systems of the respective dimensionality.
User Guidelines and Best Practices for CASL VUQ Analysis Using Dakota
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Coleman, Kayla; Gilkey, Lindsay N.
Sandia’s Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers. This enables them to enhance understanding of risk, improve products, and assess simulation credibility. In its simplest mode, Dakota can automate typical parameter variation studies through a generic interface to a physics-based computational model. This can lend efficiency and rigor to manual parameter perturbation studies already being conducted by analysts. However, Dakota also delivers advanced parametric analysis techniques enabling design exploration, optimization, model calibration, risk analysis, and quantification of margins and uncertainty with such models. It directly supports verification and validation activities. Dakota algorithms enrich complex science and engineering models, enabling an analyst to answer crucial questions of - Sensitivity: Which are the most important input factors or parameters entering the simulation, and how do they influence key outputs?; Uncertainty: What is the uncertainty or variability in simulation output, given uncertainties in input parameters? How safe, reliable, robust, or variable is my system? (Quantification of margins and uncertainty, QMU); Optimization: What parameter values yield the best performing design or operating condition, given constraints? Calibration: What models and/or parameters best match experimental data? In general, Dakota is the Consortium for Advanced Simulation of Light Water Reactors (CASL) delivery vehicle for verification, validation, and uncertainty quantification (VUQ) algorithms. It permits ready application of the VUQ methods described above to simulation codes by CASL researchers, code developers, and application engineers.
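Dakota's own input syntax is not reproduced here; as a rough illustration of the kind of ensemble study it automates, the sketch below runs a Latin-hypercube sampling loop over a stand-in model and reports simple correlation-based sensitivities. The model function, parameter ranges, and sample size are illustrative assumptions, not a real CASL simulation interface.

```python
# Sketch of a Latin-hypercube uncertainty/sensitivity study (the kind of loop
# Dakota automates around a physics code). The "simulation" is a stand-in function.
import numpy as np
from scipy.stats import qmc, pearsonr

def simulation(x1, x2):
    # placeholder for a physics-based code invoked once per sample
    return x1 ** 2 + 0.5 * x2 + 0.1 * x1 * x2

sampler = qmc.LatinHypercube(d=2, seed=0)
unit = sampler.random(n=200)
samples = qmc.scale(unit, l_bounds=[0.0, 0.0], u_bounds=[1.0, 2.0])

outputs = np.array([simulation(x1, x2) for x1, x2 in samples])
print("output mean/std:", outputs.mean(), outputs.std())
for i, name in enumerate(["x1", "x2"]):
    r, _ = pearsonr(samples[:, i], outputs)
    print(f"correlation of {name} with output: {r:+.2f}")
```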
The benefits and costs of new fuels and engines for light-duty vehicles in the United States.
Keefe, Ryan; Griffin, James P; Graham, John D
2008-10-01
Rising oil prices and concerns about energy security and climate change are spurring reconsideration of both automobile propulsion systems and the fuels that supply energy to them. In addition to the gasoline internal combustion engine, recent years have seen alternatives develop in the automotive marketplace. Currently, hybrid-electric vehicles, advanced diesels, and flex-fuel vehicles running on a high percentage mixture of ethanol and gasoline (E85) are appearing at auto shows and in driveways. We conduct a rigorous benefit-cost analysis from both the private and societal perspective of the marginal benefits and costs of each technology--using the conventional gasoline engine as a baseline. The private perspective considers only those factors that influence the decisions of individual consumers, while the societal perspective accounts for environmental, energy, and congestion externalities as well. Our analysis illustrates that both hybrids and diesels show promise for particular light-duty applications (sport utility vehicles and pickup trucks), but that vehicles running continuously on E85 consistently have greater costs than benefits. The results for diesels were particularly robust over a wide range of sensitivity analyses. The results from the societal analysis are qualitatively similar to the private analysis, demonstrating that the most relevant factors to the benefit-cost calculations are the factors that drive the individual consumer's decision. We conclude with a brief discussion of marketplace and public policy trends that will both illustrate and influence the relative adoption of these alternative technologies in the United States in the coming decade.
Development of Autonomous Aerobraking - Phase 2
NASA Technical Reports Server (NTRS)
Murri, Daniel G.
2013-01-01
Phase 1 of the Development of Autonomous Aerobraking (AA) Assessment investigated the technical capability of transferring the processes of aerobraking maneuver (ABM) decision-making (currently performed on the ground by an extensive workforce and communicated to the spacecraft via the deep space network) to an efficient flight software algorithm onboard the spacecraft. This document describes Phase 2 of this study, which was a 12-month effort to improve and rigorously test the AA Development Software developed in Phase 1. Aerobraking maneuver; Autonomous Aerobraking; Autonomous Aerobraking Development Software; Deep Space Network; NASA Engineering and Safety Center
Space Suit Technologies Protect Deep-Sea Divers
NASA Technical Reports Server (NTRS)
2008-01-01
Working on NASA missions allows engineers and scientists to hone their skills. Creating devices for the high-stress rigors of space travel pushes designers to their limits, and the results often far exceed the original concepts. The technologies developed for the extreme environment of space are often applicable here on Earth. Some of these NASA technologies, for example, have been applied to the breathing apparatuses worn by firefighters, the fire-resistant suits worn by racecar crews, and, most recently, the deep-sea gear worn by U.S. Navy divers.
Editorial: Special Issue on Experimental Vibration Analysis
NASA Astrophysics Data System (ADS)
Serra, Roger
2018-04-01
Vibration analysis is now present in virtually every field of industry, from aeronautics to manufacturing and from machining and maintenance to civil engineering, to mention only a few areas, which makes this special issue a real need. The International Journal of Mechanics & Industry compiles a Special Issue on Experimental Vibration Analysis. More than thirty manuscripts were received by the international scientific committee of the 6th congress AVE2016, and only eight papers were selected for the Special Issue after a careful and rigorous peer-review process; these are briefly summarized below.
Numerical parametric studies of spray combustion instability
NASA Technical Reports Server (NTRS)
Pindera, M. Z.
1993-01-01
A coupled numerical algorithm has been developed for studies of combustion instabilities in spray-driven liquid rocket engines. The model couples gas and liquid phase physics using the method of fractional steps. Also introduced is a novel, efficient methodology for accounting for spray formation through direct solution of liquid phase equations. Preliminary parametric studies show marked sensitivity of spray penetration and geometry to droplet diameter, considerations of liquid core, and acoustic interactions. Less sensitivity was shown to the combustion model type although more rigorous (multi-step) formulations may be needed for the differences to become apparent.
A criterion for establishing life limits. [for Space Shuttle Main Engine service
NASA Technical Reports Server (NTRS)
Skopp, G. H.; Porter, A. A.
1990-01-01
The development of a rigorous statistical method that would utilize hardware-demonstrated reliability to evaluate hardware capability and provide ground rules for safe flight margin is discussed. A statistical-based method using the Weibull/Weibayes cumulative distribution function is described. Its advantages and inadequacies are pointed out. Another, more advanced procedure, Single Flight Reliability (SFR), determines a life limit which ensures that the reliability of any single flight is never less than a stipulated value at a stipulated confidence level. Application of the SFR method is illustrated.
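The abstract does not spell out the SFR computation; as a hedged illustration, the sketch below uses an assumed Weibull life model to evaluate the conditional reliability of one additional flight given survival to date, and to locate the largest prior usage at which that single-flight reliability still meets a target. The shape, scale, and target values are placeholders, and the confidence-level treatment described in the paper is not modeled.

```python
# Sketch: single-flight conditional reliability from a Weibull life model.
# R(t) = exp(-(t/eta)^beta);  R(one more flight | survived t) = R(t + 1) / R(t).
# beta, eta, and the reliability target are illustrative assumptions.
import numpy as np

beta, eta = 3.0, 200.0          # Weibull shape and characteristic life (in flights)
target = 0.999                  # required single-flight reliability

def R(t):
    return np.exp(-(t / eta) ** beta)

def single_flight_reliability(t):
    return R(t + 1.0) / R(t)

# life limit: largest accumulated usage t at which one more flight still meets the target
t = np.arange(0.0, eta)
ok = single_flight_reliability(t) >= target
life_limit = t[ok][-1] if ok.any() else 0.0
print("single-flight reliability at 50 flights:", single_flight_reliability(50.0))
print("life limit (flights):", life_limit)
```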
LIHE Spectral Dynamics and Jaguar Data Acquisition System Measurement Assurance Results 2014.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Covert, Timothy T.; Willis, Michael David; Radtke, Gregg Arthur
2015-06-01
The Light Initiated High Explosive (LIHE) facility performs high rigor, high consequence impulse testing for the nuclear weapons (NW) community. To support the facility mission, LIHE's extensive data acquisition system (DAS) comprises several discrete components as well as a fully integrated system. Due to the high consequence and high rigor of the testing performed at LIHE, a measurement assurance plan (MAP) was developed in collaboration with NW system customers to meet their data quality needs and to provide assurance of the robustness of the LIHE DAS. While individual components of the DAS have been calibrated by the SNL Primary Standards Laboratory (PSL), the integrated nature of this complex system requires verification of the complete system, from end-to-end. This MAP report documents the results of verification and validation procedures used to ensure that the data quality meets customer requirements.
Undergraduate Research in Physics as a course for Engineering and Computer Science Majors
NASA Astrophysics Data System (ADS)
O'Brien, James; Rueckert, Franz; Sirokman, Greg
2017-01-01
Undergraduate research has become more and more integral to the functioning of higher educational institutions. At many institutions undergraduate research is conducted as capstone projects in the pure sciences; however, science faculty at some schools (including the authors') face the challenge of not having science majors. Even at these institutions, a select population of high-achieving engineering students will often express a keen interest in conducting pure science research. Since a foray into science research provides the student with full exposure to the scientific method and scientific collaboration, the experience can be quite rewarding and beneficial to the development of the student as a professional. To this end, the authors have been working to find new contexts in which to offer research experiences to non-science majors, including a new undergraduate research class conducted by physics and chemistry faculty. An added benefit is that these courses are inherently interdisciplinary. Students in the engineering and computer science fields step into physics and chemistry labs to solve science problems, often invoking their own relevant expertise. In this paper we start by discussing the common themes and outcomes of the course. We then discuss three particular projects that were conducted with engineering students and focus on how the undergraduate research experience enhanced their already rigorous engineering curriculum.
Space radiator simulation system analysis
NASA Technical Reports Server (NTRS)
Black, W. Z.; Wulff, W.
1972-01-01
A transient heat transfer analysis was carried out on a space radiator heat rejection system exposed to an arbitrarily prescribed combination of aerodynamic heating, solar, albedo, and planetary radiation. A rigorous analysis was carried out for the radiation panel and tubes lying in one plane and an approximate analysis was used to extend the rigorous analysis to the case of a curved panel. The analysis permits the consideration of both gaseous and liquid coolant fluids, including liquid metals, under prescribed, time dependent inlet conditions. The analysis provided a method for predicting: (1) transient and steady-state, two dimensional temperature profiles, (2) local and total heat rejection rates, (3) coolant flow pressure in the flow channel, and (4) total system weight and protection layer thickness.
Space radiator simulation manual for computer code
NASA Technical Reports Server (NTRS)
Black, W. Z.; Wulff, W.
1972-01-01
A computer program that simulates the performance of a space radiator is presented. The program basically consists of a rigorous analysis which analyzes a symmetrical fin panel and an approximate analysis that predicts system characteristics for cases of non-symmetrical operation. The rigorous analysis accounts for both transient and steady state performance including aerodynamic and radiant heating of the radiator system. The approximate analysis considers only steady state operation with no aerodynamic heating. A description of the radiator system and instructions to the user for program operation are included. The input required for the execution of all program options is described. Several examples of program output are also provided; sample output includes the radiator performance during ascent, reentry and orbit.
Foundations of planetary quarantine.
NASA Technical Reports Server (NTRS)
Hall, L. B.; Lyle, R. G.
1971-01-01
Discussion of some of the problems in microbiology and engineering involved in the implementation of planetary quarantine. It is shown that the solutions require new knowledge in both disciplines for success at low cost in terms of both monetary outlay and man's further exploration of the planets. A related problem exists in that engineers are not accustomed to the wide variation of biological data and microbiologists must learn to work and think in more exact terms. Those responsible for formulating or influencing national and international policies must walk a tightrope with delicate balance between unnecessarily stringent requirements for planetary quarantine on the one hand and prevention of contamination on the other. The success of planetary quarantine measures can be assured only by rigorous measures, each checked, rechecked, and triple-checked to make sure that no errors have been made and that no factor has been overlooked.
Model of dissolution in the framework of tissue engineering and drug delivery.
Sanz-Herrera, J A; Soria, L; Reina-Romo, E; Torres, Y; Boccaccini, A R
2018-05-22
Dissolution phenomena are ubiquitous in biomaterials across many different fields. Despite the advantages of simulation-based design of biomaterials in medical applications, additional efforts are needed to derive reliable models which describe the process of dissolution. A phenomenologically based model, available for simulation of dissolution in biomaterials, is introduced in this paper. The model reduces to a set of reaction-diffusion equations, implemented in a finite element numerical framework. First, a parametric analysis is conducted in order to explore the role of model parameters on the overall dissolution process. Then, the model is calibrated and validated against a straightforward but rigorous experimental setup. Results show that the mathematical model macroscopically reproduces the main physicochemical phenomena that take place in the tests, corroborating its usefulness for design of biomaterials in the tissue engineering and drug delivery research areas.
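The abstract does not state the governing equations; a generic reaction-diffusion form often used for dissolution of a solid phase into a surrounding medium is sketched below as an assumed illustration, with c the dissolved concentration, φ the remaining solid fraction, D the diffusivity, k a dissolution rate constant, c_s the saturation concentration, and α a stoichiometric conversion factor.

```latex
% Generic reaction-diffusion form for dissolution (illustrative assumption,
% not the authors' exact model).
\begin{align}
  \frac{\partial c}{\partial t}    &= \nabla \cdot \bigl(D\,\nabla c\bigr) + k\,\phi\,(c_s - c),\\
  \frac{\partial \phi}{\partial t} &= -\,\alpha\,k\,\phi\,(c_s - c).
\end{align}
```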
King, Gary; Pan, Jennifer; Roberts, Margaret E
2014-08-22
Existing research on the extensive Chinese censorship organization uses observational methods with well-known limitations. We conducted the first large-scale experimental study of censorship by creating accounts on numerous social media sites, randomly submitting different texts, and observing from a worldwide network of computers which texts were censored and which were not. We also supplemented interviews with confidential sources by creating our own social media site, contracting with Chinese firms to install the same censoring technologies as existing sites, and--with their software, documentation, and even customer support--reverse-engineering how it all works. Our results offer rigorous support for the recent hypothesis that criticisms of the state, its leaders, and their policies are published, whereas posts about real-world events with collective action potential are censored. Copyright © 2014, American Association for the Advancement of Science.
On-ground characterization of the Euclid's CCD273-based readout chain
NASA Astrophysics Data System (ADS)
Szafraniec, Magdalena; Azzollini, R.; Cropper, M.; Pottinger, S.; Khalil, A.; Hailey, M.; Hu, D.; Plana, C.; Cutts, A.; Hunt, T.; Kohley, R.; Walton, D.; Theobald, C.; Sharples, R.; Schmoll, J.; Ferrando, P.
2016-07-01
Euclid is a medium class European Space Agency mission scheduled for launch in 2020. The goal of the survey is to examine the nature of Dark Matter and Dark Energy in the Universe. One of the cosmological probes used to analyze Euclid's data, the weak lensing technique, measures the distortions of galaxy shapes and this requires very accurate knowledge of the system point spread function (PSF). Therefore, to ensure that the galaxy shape is not affected, the detector chain of the telescope's VISible Instrument (VIS) needs to meet specific performance requirements. Each of the 12 VIS readout chains, consisting of 3 CCDs, readout electronics (ROE) and a power supply unit (RPSU), will undergo rigorous on-ground testing to ensure that these requirements are met. This paper reports on the current status of the warm and cold testing of the VIS Engineering Model readout chain. Additionally, an early insight into the commissioning of the Flight Model calibration facility and program is provided.
Quantum metrology with a single spin-3/2 defect in silicon carbide
NASA Astrophysics Data System (ADS)
Soykal, Oney O.; Reinecke, Thomas L.
We show that implementations for quantum sensing with exceptional sensitivity and spatial resolution can be made using the novel features of semiconductor high half-spin multiplet defects with easy-to-implement optical detection protocols. To achieve this, we use the spin-3/2 silicon monovacancy deep center in hexagonal silicon carbide based on our rigorous derivation of this defect's ground state and of its electronic and optical properties. For a single VSi- defect, we obtain magnetic field sensitivities capable of detecting individual nuclear magnetic moments. We also show that its zero-field splitting has an exceptional strain and temperature sensitivity within the technologically desirable near-infrared window of biological systems. Other point defects, i.e. 3d transition metal or rare-earth impurities in semiconductors, may also provide similar opportunities in quantum sensing due to their similar high spin (S ≥ 3/2) configurations. This work was supported in part by ONR and by the Office of Secretary of Defense, Quantum Science and Engineering Program.
Matrix methods applied to engineering rigid body mechanics
NASA Astrophysics Data System (ADS)
Crouch, T.
The purpose of this book is to present the solution of a range of rigid body mechanics problems using a matrix formulation of vector algebra. Essential theory concerning kinematics and dynamics is formulated in terms of matrix algebra. The solution of kinematics and dynamics problems is discussed, taking into account the velocity and acceleration of a point moving in a circular path, the velocity and acceleration determination for a linkage, the angular velocity and angular acceleration of a roller in a taper-roller thrust race, Euler's theorem on the motion of rigid bodies, an automotive differential, a rotating epicyclic, the motion of a high speed rotor mounted in gimbals, and the vibration of a spinning projectile. Attention is given to the activity of a force, the work done by a conservative force, the work and potential in a conservative system, the equilibrium of a mechanism, bearing forces due to rotor misalignment, and the frequency of vibrations of a constrained rod.
Experience report: Using formal methods for requirements analysis of critical spacecraft software
NASA Technical Reports Server (NTRS)
Lutz, Robyn R.; Ampo, Yoko
1994-01-01
Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.
Separating intrinsic from extrinsic fluctuations in dynamic biological systems.
Hilfinger, Andreas; Paulsson, Johan
2011-07-19
From molecules in cells to organisms in ecosystems, biological populations fluctuate due to the intrinsic randomness of individual events and the extrinsic influence of changing environments. The combined effect is often too complex for effective analysis, and many studies therefore make simplifying assumptions, for example ignoring either intrinsic or extrinsic effects to reduce the number of model assumptions. Here we mathematically demonstrate how two identical and independent reporters embedded in a shared fluctuating environment can be used to identify intrinsic and extrinsic noise terms, but also how these contributions are qualitatively and quantitatively different from what has been previously reported. Furthermore, we show for which classes of biological systems the noise contributions identified by dual-reporter methods correspond to the noise contributions predicted by correct stochastic models of either intrinsic or extrinsic mechanisms. We find that for broad classes of systems, the extrinsic noise from the dual-reporter method can be rigorously analyzed using models that ignore intrinsic stochasticity. In contrast, the intrinsic noise can be rigorously analyzed using models that ignore extrinsic stochasticity only under very special conditions that rarely hold in biology. Testing whether the conditions are met is rarely possible and the dual-reporter method may thus produce flawed conclusions about the properties of the system, particularly about the intrinsic noise. Our results contribute toward establishing a rigorous framework to analyze dynamically fluctuating biological systems.
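For context, the conventional dual-reporter estimators that this analysis scrutinizes can be computed in a few lines; the sketch below applies the classic decomposition to synthetic paired-reporter data. The data model and the estimator forms are assumptions drawn from the standard dual-reporter literature, not the corrected treatment derived in the paper, whose point is precisely that the intrinsic term must be interpreted with care.

```python
# Sketch: conventional dual-reporter decomposition (classic estimators).
# x1, x2 are matched measurements of two identical, independent reporters per cell.
# The paper's caution: these estimators need not coincide with model-based
# intrinsic/extrinsic noise except under special conditions.
import numpy as np

rng = np.random.default_rng(1)
n = 5000
extrinsic = rng.gamma(shape=20.0, scale=5.0, size=n)      # shared environment per cell
x1 = rng.poisson(extrinsic)                                # reporter 1
x2 = rng.poisson(extrinsic)                                # reporter 2

m1, m2 = x1.mean(), x2.mean()
eta_int2 = np.mean((x1 - x2) ** 2) / (2.0 * m1 * m2)       # "intrinsic" noise (CV^2)
eta_ext2 = (np.mean(x1 * x2) - m1 * m2) / (m1 * m2)        # "extrinsic" noise (CV^2)
eta_tot2 = eta_int2 + eta_ext2
print(f"intrinsic^2={eta_int2:.4f}  extrinsic^2={eta_ext2:.4f}  total^2={eta_tot2:.4f}")
```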
Chain representations of Open Quantum Systems and Lieb-Robinson like bounds for the dynamics
NASA Astrophysics Data System (ADS)
Woods, Mischa
2013-03-01
This talk is concerned with the mapping of the Hamiltonian of open quantum systems onto chain representations, which forms the basis for a rigorous theory of the interaction of a system with its environment. This mapping progresses as an interaction which gives rise to a sequence of residual spectral densities of the system. The rigorous mathematical properties of this mapping have been unknown so far. Here we develop the theory of secondary measures to derive an analytic expression for the sequence solely in terms of the initial measure and its associated orthogonal polynomials of the first and second kind. These mappings can be thought of as taking a highly nonlocal Hamiltonian to a local Hamiltonian. In the latter, a Lieb-Robinson like bound for the dynamics of the open quantum system makes sense. We develop analytical bounds on the error to observables of the system as a function of time when the semi-infinite chain is truncated at some finite length. The fact that this is possible shows that there is a finite "speed of sound" in these chain representations. This has many implications for the simulatability of open quantum systems of this type and demonstrates that a truncated chain can faithfully reproduce the dynamics at shorter times. These results make a significant and mathematically rigorous contribution to the understanding of the theory of open quantum systems, and pave the way towards the efficient simulation of these systems, which, within standard methods, is often an intractable problem. EPSRC CDT in Controlled Quantum Dynamics, EU STREP project and Alexander von Humboldt Foundation
Guo, Weihua; Sheng, Jiayuan; Feng, Xueyang
2015-01-01
Metabolic engineering of various industrial microorganisms to produce chemicals, fuels, and drugs has raised interest since it is environmentally friendly, sustainable, and independent of nonrenewable resources. However, microbial metabolism is so complex that only a few metabolic engineering efforts have been able to achieve a satisfactory yield, titer or productivity of the target chemicals for industrial commercialization. In order to overcome this challenge, 13C Metabolic Flux Analysis (13C-MFA) has been continuously developed and widely applied to rigorously investigate cell metabolism and quantify the carbon flux distribution in central metabolic pathways. In the past decade, many 13C-MFA studies have been performed in academic labs and biotechnology industries to pinpoint key issues related to microbe-based chemical production. Insightful information about the metabolic rewiring has been provided to guide the development of appropriate metabolic engineering strategies for improving biochemical production. In this review, we will introduce the basics of 13C-MFA and illustrate how 13C-MFA has been applied, via integration with metabolic engineering, to identify and tackle the rate-limiting steps in biochemical production for various host microorganisms. PMID:28952565
NASA Astrophysics Data System (ADS)
Gottwald, Georg; Melbourne, Ian
2013-04-01
Whereas diffusion limits of stochastic multi-scale systems have a long and successful history, the case of constructing stochastic parametrizations of chaotic deterministic systems has been much less studied. We present rigorous results on the convergence of a chaotic slow-fast system to a stochastic differential equation with multiplicative noise. Furthermore we present rigorous results for chaotic slow-fast maps, occurring as numerical discretizations of continuous time systems. This raises the issue of how to interpret certain stochastic integrals; surprisingly, the resulting integrals of the stochastic limit system are generically neither of Stratonovich nor of Ito type in the case of maps. It is shown that the limit system of a numerical discretisation is different from that of the associated continuous time system. This has important consequences when interpreting the statistics of long time simulations of multi-scale systems - they may be very different from those of the original continuous time system which we set out to study.
NASA Astrophysics Data System (ADS)
Sandini, Giulio; Morasso, Pietro
2018-03-01
In engineering cybernetics, observability is a measure of how well internal states of a system can be inferred from knowledge of its external outputs. Moreover, observability and controllability of a system are mathematically inter-related properties in the sense that it does not matter to have access to hidden states if this knowledge is not exploited for achieving a goal. While such issues can be well posed in the engineering field, in cognitive neuroscience it is quite difficult to restrict the analysis in such a way to isolate direct perception from other cognitive processes, named as "inferences" by the authors [1], without losing a great part of the action (unless one trivializes the meaning of "direct" by stating that "all perception is direct": Gallagher and Zahavi [6]). In other words, in spite of the elegance and scientific rigor of the proposed experimental strategy, in our opinion it misses the fact that in real human-human interactions "direct perception" and "inference" are two faces of the same coin and mental states in a social context are, in a general sense, accessible on the basis of directly perceived sensory signals (here and now) tuned by expectations. In the following, we elaborate this opinion with reference to a competitive interaction paradigm, namely the attempt of a goalkeeper to save a soccer penalty kick by "reading the mind" of his opponent.
Identifying incompatible combinations of concrete materials: volume II, test protocol.
DOT National Transportation Integrated Search
2006-08-01
Unexpected interactions between otherwise acceptable ingredients in portland cement concrete are becoming increasingly common as cementitious systems become more complex and demands on the systems are more rigorous. Examples of incompatibilities ...
NASA Technical Reports Server (NTRS)
Weise, Timothy M
2012-01-01
NASA's Dawn mission to the asteroid Vesta and dwarf planet Ceres launched on September 27, 2007, and arrived at Vesta in July of 2011. This mission uses ion propulsion to achieve the necessary delta-V to reach and maneuver at Vesta and Ceres. This paper will show how the evolution of ground system automation and process improvement allowed a relatively small engineering team to transition from cruise operations to asteroid operations while maintaining robust processes. The cruise-to-Vesta phase lasted almost 4 years and consisted of activities that were built with software tools, but each tool was open loop and required engineers to review the output to ensure consistency. Additionally, this same time period was characterized by the evolution from manually retrieved and reviewed data products to automatically generated data products and data value checking. Furthermore, the team originally took about three to four weeks to design and build about four weeks of spacecraft activities, with spacecraft contacts only once a week. Operations around the asteroid Vesta increased the tempo dramatically by transitioning from one contact a week to three or four contacts a week, to fourteen contacts a week (every 12 hours). This was accompanied by a similar increase in activity complexity as well as very fast turn-around activity design and build cycles. The design process became more automated and the tools became closed loop, allowing the team to build more activities without sacrificing rigor. Additionally, these activities were dependent on the results of flight system performance, so more automation was added to analyze the flight data and provide results in a timely fashion to feed the design cycle. All of this automation and process improvement enabled the engineers to focus on other aspects of spacecraft operations, including spacecraft health monitoring and anomaly resolution.
Spin zero Hawking radiation for non-zero-angular momentum mode
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ngampitipan, Tritos; Bonserm, Petarpa; Visser, Matt
2015-05-15
Black hole greybody factors carry some quantum black hole information. Studying greybody factors may lead to understanding the quantum nature of black holes. However, solving for exact greybody factors in many black hole systems is impossible. One way to deal with this problem is to place rigorous analytic bounds on the greybody factors. In this paper, we calculate rigorous bounds on the greybody factors of spin-zero Hawking radiation in the non-zero angular momentum modes of Kerr-Newman black holes.
Hybrid Theory of Electron-Hydrogenic Systems Elastic Scattering
NASA Technical Reports Server (NTRS)
Bhatia, A. K.
2007-01-01
Accurate electron-hydrogen and electron-hydrogenic cross sections are required to interpret fusion experiments, laboratory plasma physics and properties of the solar and astrophysical plasmas. We have developed a method in which the short-range and long-range correlations can be included at the same time in the scattering equations. The phase shifts have rigorous lower bounds and the scattering lengths have rigorous upper bounds. The phase shifts in the resonance region can be used to calculate very accurately the resonance parameters.
Solar Energy Enhancement Using Down-converting Particles: A Rigorous Approach
2011-06-06
Abrams, Ze'ev R.; Niv, Avi; Zhang, Xiang. Journal of Applied Physics 109, 114905 (2011).
Pearce, N
1985-10-01
This paper describes in broad terms, the fire testing programme we carried out on whole bed assemblies in 1984. It should be clear that the tests were carried out in a thoroughly rigorous scientific manner. As always there is more to be done. The immediate task of finding the so called 'safe' bed assembly is proceeding with the search this year for safer pillows. Softer barrier foams are now being produced and it may be that the NHS could use full depth foam mattresses rather than a barrier foam wrap. On the engineering side I have explained the false alarm problem, and I have reviewed some of the research we are doing to see that new technology is used to give us better systems in future. Life safety sprinkler systems give the possibility of truly active fire protection in patient areas. They will enhance fire safety but at the moment no trade-offs can be offered in other areas of fire protection--either active or passive. My final point is that although I have considered the Department's fire research by looking separately at specific projects, the fire safety of a hospital must always be considered as a total package. To be effective, individual components of fire safety must not be considered in isolation but as part of the overall fire safety system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ascough, II, James Clifford
1992-05-01
The capability to objectively evaluate the design performance of shallow landfill burial (SLB) systems is of great interest to diverse disciplines, including hydrologists, engineers, environmental scientists, and SLB regulators. The goal of this work was to develop and validate a procedure for the nonsubjective evaluation of SLB designs under actual or simulated environmental conditions. A multiobjective decision module (MDM) based on scoring functions (Wymore, 1988) was implemented to evaluate SLB design performance. Input values to the MDM are provided by hydrologic models. The MDM assigns a total score to each SLB design alternative, thereby allowing for rapid and repeatable design performance evaluation. The MDM was validated for a wide range of SLB designs under different climatic conditions. Rigorous assessment of SLB performance also requires incorporation of hydrologic probabilistic analysis and hydrologic risk into the overall design. This was accomplished through the development of a frequency analysis module. The frequency analysis module allows SLB design event magnitudes to be calculated based on the hydrologic return period. The multiobjective decision and frequency analysis modules were integrated in a decision support system (DSS) framework, SLEUTH (Shallow Landfill Evaluation Using Transport and Hydrology). SLEUTH is a Microsoft Windows (trademark) application, and is written in the Knowledge Pro Windows (Knowledge Garden, Inc., 1991) development language.
Computing Generalized Matrix Inverse on Spiking Neural Substrate
Shukla, Rohit; Khoram, Soroosh; Jorgensen, Erik; Li, Jing; Lipasti, Mikko; Wright, Stephen
2018-01-01
Emerging neural hardware substrates, such as IBM's TrueNorth Neurosynaptic System, can provide an appealing platform for deploying numerical algorithms. For example, a recurrent Hopfield neural network can be used to find the Moore-Penrose generalized inverse of a matrix, thus enabling a broad class of linear optimizations to be solved efficiently, at low energy cost. However, deploying numerical algorithms on hardware platforms that severely limit the range and precision of representation for numeric quantities can be quite challenging. This paper discusses these challenges and proposes a rigorous mathematical framework for reasoning about range and precision on such substrates. The paper derives techniques for normalizing inputs and properly quantizing synaptic weights originating from arbitrary systems of linear equations, so that solvers for those systems can be implemented in a provably correct manner on hardware-constrained neural substrates. The analytical model is empirically validated on the IBM TrueNorth platform, and results show that the guarantees provided by the framework for range and precision hold under experimental conditions. Experiments with optical flow demonstrate the energy benefits of deploying a reduced-precision and energy-efficient generalized matrix inverse engine on the IBM TrueNorth platform, reflecting 10× to 100× improvement over FPGA and ARM core baselines. PMID:29593483
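The TrueNorth deployment itself relies on a recurrent Hopfield formulation with carefully quantized synaptic weights; as a floating-point reference point only, the sketch below shows one standard iterative scheme for the Moore-Penrose pseudoinverse (the Newton/hyperpower iteration), not the authors' spiking implementation.

```python
# Sketch: Newton (hyperpower) iteration for the Moore-Penrose pseudoinverse,
# X_{k+1} = X_k (2I - A X_k), with the convergent starting guess
# X_0 = A^T / (||A||_1 * ||A||_inf). Floating-point reference, not the spiking version.
import numpy as np

def pinv_newton(A, iters=50):
    X = A.T / (np.linalg.norm(A, 1) * np.linalg.norm(A, np.inf))
    I = np.eye(A.shape[0])
    for _ in range(iters):
        X = X @ (2.0 * I - A @ X)
    return X

A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])    # tall matrix, rank 2
X = pinv_newton(A)
print(np.allclose(X, np.linalg.pinv(A), atol=1e-8))    # True
```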
NASA Astrophysics Data System (ADS)
Scott, Catherine Elizabeth
This study examined the characteristics of 10 science, technology, engineering, and mathematics (STEM) focused high schools. A comparative case design was used to identify key components of STEM school designs. Schools were selected from various regions across the United States. Data collected included websites, a national statistics database, standardized test scores, interviews, and published articles. Results from this study indicate that there is a variety of STEM high school programs designed to increase students' ability to pursue college degrees in STEM fields. The school mission statements influence the overall school design. Students must submit an application to be admitted to STEM high schools. Half of the STEM high schools used a lottery system to select students. STEM high schools have a higher population of black students and a lower population of white and Hispanic students than most schools in the United States. They serve about the same number of economically disadvantaged students. The academic programs at STEM high schools are more rigorous, with electives focused on STEM content. In addition to coursework requirements, students must also complete internships and/or a capstone project. Teachers who teach in STEM schools are provided regularly scheduled professional development activities that focus on STEM content and pedagogy. Teachers provide leadership in the development and delivery of the professional development activities.
A Comparison of Single-Cycle Versus Multiple-Cycle Proof Testing Strategies
NASA Technical Reports Server (NTRS)
McClung, R. C.; Chell, G. G.; Millwater, H. R.; Russell, D. A.; Millwater, H. R.
1999-01-01
Single-cycle and multiple-cycle proof testing (SCPT and MCPT) strategies for reusable aerospace propulsion system components are critically evaluated and compared from a rigorous elastic-plastic fracture mechanics perspective. Earlier MCPT studies are briefly reviewed. New J-integral estimation methods for semielliptical surface cracks and cracks at notches are derived and validated. Engineering methods are developed to characterize crack growth rates during elastic-plastic fatigue crack growth (FCG) and the tear-fatigue interaction near instability. Surface crack growth experiments are conducted with Inconel 718 to characterize tearing resistance, FCG under small-scale yielding and elastic-plastic conditions, and crack growth during simulated MCPT. Fractography and acoustic emission studies provide additional insight. The relative merits of SCPT and MCPT are directly compared using a probabilistic analysis linked with an elastic-plastic crack growth computer code. The conditional probability of failure in service is computed for a population of components that have survived a previous proof test, based on an assumed distribution of initial crack depths. Parameter studies investigate the influence of proof factor, tearing resistance, crack shape, initial crack depth distribution, and notches on the MCPT versus SCPT comparison. The parameter studies provide a rational basis to formulate conclusions about the relative advantages and disadvantages of SCPT and MCPT. Practical engineering guidelines are proposed to help select the optimum proof test protocol in a given application.
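The probabilistic comparison can be illustrated, under strong simplifications, with a Monte Carlo sketch: initial crack depths are drawn from an assumed distribution, a component "survives" the proof test if its crack stays below a proof-critical depth, and the conditional probability of service failure is estimated over the surviving population. The distribution, growth rule, and thresholds below are placeholders, not the paper's calibrated elastic-plastic crack-growth model.

```python
# Sketch: conditional probability of service failure given proof-test survival.
# All numbers and the "growth" rule are illustrative placeholders; the paper uses
# an elastic-plastic crack-growth code instead of this toy model.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
a0 = rng.lognormal(mean=np.log(0.2), sigma=0.6, size=n)   # initial crack depth, mm

a_crit_proof   = 1.0    # depth that fails during the proof cycle(s)
growth_service = 0.3    # simplistic crack extension over the service life, mm
a_crit_service = 1.2    # depth that fails in service

survive_proof = a0 < a_crit_proof
fail_service  = (a0 + growth_service) > a_crit_service

p_cond = np.mean(fail_service & survive_proof) / np.mean(survive_proof)
print(f"P(fail in service | survived proof) ~ {p_cond:.2e}")
```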
A Comparison of Single-Cycle Versus Multiple-Cycle Proof Testing Strategies
NASA Technical Reports Server (NTRS)
McClung, R. C.; Chell, G. G.; Millwater, H. R.; Russell, D. A.; Orient, G. E.
1996-01-01
Single-cycle and multiple-cycle proof testing (SCPT and MCPT) strategies for reusable aerospace propulsion system components are critically evaluated and compared from a rigorous elastic-plastic fracture mechanics perspective. Earlier MCPT studies are briefly reviewed. New J-integral estimation methods for semi-elliptical surface cracks and cracks at notches are derived and validated. Engineering methods are developed to characterize crack growth rates during elastic-plastic fatigue crack growth (FCG) and the tear-fatigue interaction near instability. Surface crack growth experiments are conducted with Inconel 718 to characterize tearing resistance, FCG under small-scale yielding and elastic-plastic conditions, and crack growth during simulated MCPT. Fractography and acoustic emission studies provide additional insight. The relative merits of SCPT and MCPT are directly compared using a probabilistic analysis linked with an elastic-plastic crack growth computer code. The conditional probability of failure in service is computed for a population of components that have survived a previous proof test, based on an assumed distribution of initial crack depths. Parameter studies investigate the influence of proof factor, tearing resistance, crack shape, initial crack depth distribution, and notches on the MCPT vs. SCPT comparison. The parameter studies provide a rational basis to formulate conclusions about the relative advantages and disadvantages of SCPT and MCPT. Practical engineering guidelines are proposed to help select the optimum proof test protocol in a given application.
Kalkan, Erol; Chopra, Anil K.
2010-01-01
Earthquake engineering practice is increasingly using nonlinear response history analysis (RHA) to demonstrate performance of structures. This rigorous method of analysis requires selection and scaling of ground motions appropriate to design hazard levels. Presented herein is a modal-pushover-based scaling (MPS) method to scale ground motions for use in nonlinear RHA of buildings and bridges. In the MPS method, the ground motions are scaled to match (to a specified tolerance) a target value of the inelastic deformation of the first-'mode' inelastic single-degree-of-freedom (SDF) system whose properties are determined by first-'mode' pushover analysis. Appropriate for first-'mode' dominated structures, this approach is extended for structures with significant contributions of higher modes by considering elastic deformation of the second-'mode' SDF system in selecting a subset of the scaled ground motions. Based on results presented for two bridges, covering single- and multi-span 'ordinary standard' bridge types, and six buildings, covering low-, mid-, and tall building types in California, the accuracy and efficiency of the MPS procedure are established and its superiority over the ASCE/SEI 7-05 scaling procedure is demonstrated.
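For orientation, the sketch below scales a record so that the peak deformation of an elastic SDF oscillator (a stand-in for the first-'mode' system) matches a target value; for an elastic SDF the response scales linearly, so one analysis suffices, whereas the MPS method proper works with the inelastic first-'mode' SDF system and iterates to a tolerance. The record, period, damping, and target below are illustrative assumptions.

```python
# Sketch: scale a ground motion so an elastic SDF oscillator's peak deformation
# matches a target (Newmark constant-average-acceleration integration, unit mass).
import numpy as np

def sdf_peak_deformation(ag, dt, T, zeta=0.05):
    w = 2.0 * np.pi / T
    m, c, k = 1.0, 2.0 * zeta * w, w * w
    u, v, acc = 0.0, 0.0, -ag[0]
    kh = k + 2.0 * c / dt + 4.0 * m / dt**2
    peak = 0.0
    for i in range(len(ag) - 1):
        dp = -m * (ag[i + 1] - ag[i])
        du = (dp + (2.0 * c + 4.0 * m / dt) * v + 2.0 * m * acc) / kh
        dv = 2.0 * du / dt - 2.0 * v
        dacc = 4.0 * du / dt**2 - 4.0 * v / dt - 2.0 * acc
        u, v, acc = u + du, v + dv, acc + dacc
        peak = max(peak, abs(u))
    return peak

# synthetic record and target first-'mode' deformation (both illustrative)
dt = 0.01
t = np.arange(0.0, 20.0, dt)
ag = 0.3 * 9.81 * np.sin(2 * np.pi * 1.5 * t) * np.exp(-0.15 * t)
target = 0.05                                    # target deformation, m
peak = sdf_peak_deformation(ag, dt, T=1.0)
scale = target / peak                            # exact only for an elastic SDF
print(f"unscaled peak = {peak:.3f} m, scale factor = {scale:.2f}")
```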
Cellular Signaling Networks Function as Generalized Wiener-Kolmogorov Filters to Suppress Noise
NASA Astrophysics Data System (ADS)
Hinczewski, Michael; Thirumalai, D.
2014-10-01
Cellular signaling involves the transmission of environmental information through cascades of stochastic biochemical reactions, inevitably introducing noise that compromises signal fidelity. Each stage of the cascade often takes the form of a kinase-phosphatase push-pull network, a basic unit of signaling pathways whose malfunction is linked with a host of cancers. We show that this ubiquitous enzymatic network motif effectively behaves as a Wiener-Kolmogorov optimal noise filter. Using concepts from umbral calculus, we generalize the linear Wiener-Kolmogorov theory, originally introduced in the context of communication and control engineering, to take nonlinear signal transduction and discrete molecule populations into account. This allows us to derive rigorous constraints for efficient noise reduction in this biochemical system. Our mathematical formalism yields bounds on filter performance in cases important to cellular function—such as ultrasensitive response to stimuli. We highlight features of the system relevant for optimizing filter efficiency, encoded in a single, measurable, dimensionless parameter. Our theory, which describes noise control in a large class of signal transduction networks, is also useful both for the design of synthetic biochemical signaling pathways and the manipulation of pathways through experimental probes such as oscillatory input.
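For reference, the classic linear, noncausal Wiener-Kolmogorov result that the paper generalizes can be stated compactly: for estimating a signal s(t) from x(t) = s(t) + n(t) with uncorrelated noise, the optimal filter and its minimum mean-squared error are as below. This is the textbook engineering form, not the paper's nonlinear, discrete-population generalization.

```latex
% Noncausal Wiener-Kolmogorov filter; P_ss and P_nn are power spectral densities
% of the signal and noise.
\begin{equation}
  H_{\mathrm{opt}}(\omega) = \frac{P_{ss}(\omega)}{P_{ss}(\omega) + P_{nn}(\omega)},
  \qquad
  \mathrm{MMSE} = \int_{-\infty}^{\infty}
    \frac{P_{ss}(\omega)\,P_{nn}(\omega)}{P_{ss}(\omega) + P_{nn}(\omega)}
    \,\frac{d\omega}{2\pi}.
\end{equation}
```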
NASA Astrophysics Data System (ADS)
Lapin, Alexei; Klann, Michael; Reuss, Matthias
Agent-based models are rigorous tools for simulating the interactions of individual entities, such as organisms or molecules within cells, and assessing their effects on the dynamic behavior of the system as a whole. In the context of bioprocess and biosystems engineering there are several interesting and important applications. This contribution aims at introducing this strategy with the aid of two examples characterized by striking distinctions in the scale of the individual entities and the mode of their interactions. In the first example a structured-segregated model is applied to travel along the lifelines of single cells in the environment of a three-dimensional turbulent field of a stirred bioreactor. The modeling approach is based on an Euler-Lagrange formulation of the system. The strategy permits one to account for the heterogeneity present in real reactors in both the fluid and cellular phases. The individual response of the cells to local variations in the extracellular concentrations is captured by a dynamically structured model of the key reactions of the central metabolism. The approach permits analysis of the lifelines of individual cells in space and time.
Validation of Bioreactor and Human-on-a-Chip Devices for Chemical Safety Assessment.
Rebelo, Sofia P; Dehne, Eva-Maria; Brito, Catarina; Horland, Reyk; Alves, Paula M; Marx, Uwe
2016-01-01
Equipment and device qualification and test assay validation in the field of tissue engineered human organs for substance assessment remain formidable tasks with only a few successful examples so far. The hurdles seem to increase with the growing complexity of the biological systems emulated by the respective models. Controlled single tissue or organ culture in bioreactors improves the organ-specific functions and maintains their phenotypic stability for longer periods of time. The reproducibility attained with bioreactor operations is, per se, an advantage for the validation of safety assessment. Regulatory agencies have gradually altered the validation concept from exhaustive "product" to rigorous and detailed process characterization, valuing reproducibility as a standard for validation. "Human-on-a-chip" technologies applying micro-physiological systems to the in vitro combination of miniaturized human organ equivalents into functional human micro-organisms are thought to be the most elaborate solution created to date. They target the replacement of the current most complex models: laboratory animals. Therefore, we provide here a road map towards the validation of such "human-on-a-chip" models and qualification of their respective bioreactor and microchip equipment along a path currently used for the respective animal models.
Optimization of Wireless Power Transfer Systems Enhanced by Passive Elements and Metasurfaces
NASA Astrophysics Data System (ADS)
Lang, Hans-Dieter; Sarris, Costas D.
2017-10-01
This paper presents a rigorous optimization technique for wireless power transfer (WPT) systems enhanced by passive elements, ranging from simple reflectors and intermediate relays all the way to general electromagnetic guiding and focusing structures, such as metasurfaces and metamaterials. At its core is a convex semidefinite relaxation formulation of the otherwise nonconvex optimization problem, whose tightness and optimality can be confirmed by a simple test of its solutions. The resulting method is rigorous, versatile, and general: it does not rely on any assumptions. As shown in various examples, it is able to efficiently and reliably optimize such WPT systems in order to find their physical limitations on performance and optimal operating parameters, and to inspect their working principles, even for a large number of active transmitters and passive elements.
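The general shape of such a semidefinite relaxation can be illustrated with a generic toy problem rather than the paper's actual formulation: maximize a received-power quadratic form subject to a supplied-power budget, lift the unknown excitation vector x into a matrix X = x x^T, and drop the rank-one constraint. The sketch below assumes the cvxpy package and uses random placeholder matrices; the final check mirrors the simple tightness test mentioned in the abstract (the relaxation is exact when the optimal X is numerically rank one).

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(1)
n = 5                                         # ports: transmitters plus passive elements

# placeholder quadratic forms: received power x^T A x, supplied power x^T B x
M = rng.normal(size=(n, n)); A = 0.1 * M @ M.T
M = rng.normal(size=(n, n)); B = M @ M.T + np.eye(n)
P_max = 1.0

# semidefinite relaxation: X stands for x x^T with the rank-1 constraint dropped
X = cp.Variable((n, n), PSD=True)
prob = cp.Problem(cp.Maximize(cp.trace(A @ X)),
                  [cp.trace(B @ X) <= P_max])
prob.solve()

eigvals = np.linalg.eigvalsh(X.value)
print("relaxed optimum :", prob.value)
print("rank-1 (tight)? :", eigvals[-1] / max(eigvals.sum(), 1e-12) > 0.999)
```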
Wisniewski, Janna M; Yeager, Valerie A; Diana, Mark L; Hotchkiss, David R
2016-10-01
The number of health systems strengthening (HSS) programs has increased in the last decade. However, a limited number of studies providing robust evidence for the value and impact of these programs are available. This study aims to identify knowledge gaps and challenges that impede rigorous monitoring and evaluation (M&E) of HSS, and to ascertain the extent to which these efforts are informed by existing technical guidance. Interviews were conducted with HSS advisors at United States Agency for International Development-funded missions as well as senior M&E advisors at implementing partner and multilateral organizations. Findings showed that mission staff do not use existing technical resources, either because they do not know about them or do not find them useful. Barriers to rigorous M&E included a lack of suitable indicators, data limitations, difficulty in demonstrating an impact on health, and insufficient funding and resources. Consensus and collaboration between international health partners and local governments may mitigate these challenges. Copyright © 2016 John Wiley & Sons, Ltd.
Identifying incompatible combinations of concrete materials : volume I, final report.
DOT National Transportation Integrated Search
2006-08-01
Unexpected interactions between otherwise acceptable ingredients in portland cement concrete are becoming increasingly common as cementitious systems become more and more complex and demands on the systems are more rigorous. Such incompatibilities ar...
Space Shuttle GN and C Development History and Evolution
NASA Technical Reports Server (NTRS)
Zimpfer, Douglas; Hattis, Phil; Ruppert, John; Gavert, Don
2011-01-01
Completion of the final Space Shuttle flight marks the end of a significant era in Human Spaceflight. Developed in the 1970s and first launched in 1981, the Space Shuttle embodies many significant engineering achievements. One of these is the development and operation of the first extensive fly-by-wire human space transportation Guidance, Navigation and Control (GN&C) System. Development of the Space Shuttle GN&C represented the first-time inclusion of modern techniques for electronics, software, algorithms, systems and management in a complex system. Numerous technical design trades and lessons learned continue to drive current vehicle development. For example, the Space Shuttle GN&C system incorporated redundant systems, complex algorithms and flight software rigorously verified through integrated vehicle simulations and avionics integration testing techniques. Over the past thirty years, the Shuttle GN&C continued to go through a series of upgrades to improve safety and performance and to enable the complex flight operations required for assembly of the International Space Station. Upgrades to the GN&C ranged from the addition of nose wheel steering to modifications that extend capabilities to control of the large flexible configurations while docked to the Space Station. This paper provides a history of the development and evolution of the Space Shuttle GN&C system. Emphasis is placed on key architecture decisions, design trades and the lessons learned for future complex space transportation system developments. Finally, some of the interesting flight operations experience is provided to inform future developers of flight experiences.
Development of rigor mortis is not affected by muscle volume.
Kobayashi, M; Ikegaya, H; Takase, I; Hatanaka, K; Sakurada, K; Iwase, H
2001-04-01
There is a hypothesis suggesting that rigor mortis progresses more rapidly in small muscles than in large muscles. We measured rigor mortis as tension determined isometrically in rat musculus erector spinae that had been cut into muscle bundles of various volumes. The muscle volume did not influence either the progress or the resolution of rigor mortis, which contradicts the hypothesis. Differences in pre-rigor load on the muscles influenced the onset and resolution of rigor mortis in a few pairs of samples, but did not influence the time taken for rigor mortis to reach its full extent after death. Moreover, the progress of rigor mortis in this muscle was biphasic; this may reflect the early rigor of red muscle fibres and the late rigor of white muscle fibres.
Hughes, Brianna H; Greenberg, Neil J; Yang, Tom C; Skonberg, Denise I
2015-01-01
High-pressure processing (HPP) is used to increase meat safety and shelf-life, with conflicting quality effects depending on rigor status during HPP. In the seafood industry, HPP is used to shuck and pasteurize oysters, but its use on abalones has only been minimally evaluated and the effect of rigor status during HPP on abalone quality has not been reported. Farm-raised abalones (Haliotis rufescens) were divided into 12 HPP treatments and 1 unprocessed control treatment. Treatments were processed pre-rigor or post-rigor at 2 pressures (100 and 300 MPa) and 3 processing times (1, 3, and 5 min). The control was analyzed post-rigor. Uniform plugs were cut from adductor and foot meat for texture profile analysis, shear force, and color analysis. Subsamples were used for scanning electron microscopy of muscle ultrastructure. Texture profile analysis revealed that post-rigor processed abalone was significantly (P < 0.05) less firm and chewy than pre-rigor processed irrespective of muscle type, processing time, or pressure. L values increased with pressure to 68.9 at 300 MPa for pre-rigor processed foot, 73.8 for post-rigor processed foot, 90.9 for pre-rigor processed adductor, and 89.0 for post-rigor processed adductor. Scanning electron microscopy images showed fraying of collagen fibers in processed adductor, but did not show pressure-induced compaction of the foot myofibrils. Post-rigor processed abalone meat was more tender than pre-rigor processed meat, and post-rigor processed foot meat was lighter in color than pre-rigor processed foot meat, suggesting that waiting for rigor to resolve prior to processing abalones may improve consumer perceptions of quality and market value. © 2014 Institute of Food Technologists®
Spray combustion experiments and numerical predictions
NASA Technical Reports Server (NTRS)
Mularz, Edward J.; Bulzan, Daniel L.; Chen, Kuo-Huey
1993-01-01
The next generation of commercial aircraft will include turbofan engines with performance significantly better than those in the current fleet. Control of particulate and gaseous emissions will also be an integral part of the engine design criteria. These performance and emission requirements present a technical challenge for the combustor: control of the fuel and air mixing and control of the local stoichiometry will have to be maintained much more rigorously than with combustors in current production. A better understanding of the flow physics of liquid fuel spray combustion is necessary. This paper describes recent experiments on spray combustion where detailed measurements of the spray characteristics were made, including local drop-size distributions and velocities. Also, an advanced combustor CFD code has been under development and predictions from this code are compared with experimental results. Studies such as these will provide information to the advanced combustor designer on fuel spray quality and mixing effectiveness. Validation of new fast, robust, and efficient CFD codes will also enable the combustor designer to use them as additional design tools for optimization of combustor concepts for the next generation of aircraft engines.
Engineering Exosomes for Cancer Therapy.
Gilligan, Katie E; Dwyer, Róisín M
2017-05-24
There remains an urgent need for novel therapeutic strategies to treat metastatic cancer, which results in over 8 million deaths annually worldwide. Following secretion, exosomes are naturally taken up by cells and are capable of the stable transfer of drugs, therapeutic microRNAs and proteins. As knowledge of the biogenesis, release and uptake of exosomes continues to evolve, so too has interest grown in these extracellular vesicles as potential tumor-targeted vehicles for cancer therapy. The ability to engineer exosome content and migratory itinerary holds tremendous promise. Studies to date have employed viral and non-viral methods to engineer the parent cells to secrete modified exosomes, or alternatively, to directly manipulate exosome content following secretion. The majority of studies have demonstrated promising results, with decreased tumor cell invasion, migration and proliferation, along with enhanced immune response, cell death, and sensitivity to chemotherapy observed. The studies outlined in this review highlight the exciting potential for exosomes as therapeutic vehicles for cancer treatment. Successful implementation in the clinical setting will be dependent upon establishment of rigorous standards for exosome manipulation, isolation, and characterisation.
Issues Involved in Developing Ada Real-Time Systems
1989-02-15
expensive modifications to the compiler or Ada runtime system to fit a particular application. Whether we can solve the problems of programming real-time systems in...lock in solutions to problems that are not yet well understood in standards as rigorous as the Ada language. Moreover, real-time systems typically have
Modelling the Shuttle Remote Manipulator System: Another flexible model
NASA Technical Reports Server (NTRS)
Barhorst, Alan A.
1993-01-01
High fidelity elastic system modeling algorithms are discussed. The particular system studied is the Space Shuttle Remote Manipulator System (RMS) undergoing full articulated motion. The model incorporates flexibility via a methodology the author has been developing. The technique is based in variational principles, so rigorous boundary condition generation and weak formulations for the associated partial differential equations are realized, yet the analyst need not integrate by parts. The methodology is formulated using vector-dyad notation with minimal use of tensor notation, so the technique is believed to be accessible to practicing engineers. The objectives of this work are as follows: (1) determine the efficacy of the modeling method; and (2) determine if the method affords an analyst advantages in the overall modeling and simulation task. Generated out of necessity were Mathematica algorithms that quasi-automate the modeling procedure and simulation development. The project was divided into sections as follows: (1) model development of a simplified manipulator; (2) model development of the full-freedom RMS including a flexible movable base on a six degree of freedom orbiter (a rigid body is attached to the manipulator end-effector); (3) simulation development for item 2; and (4) comparison to the currently used model of the flexible RMS in the Structures and Mechanics Division of NASA JSC. At the time of the writing of this report, items 3 and 4 above were not complete.
Imaging the Gouy phase shift in photonic jets with a wavefront sensor.
Bon, Pierre; Rolly, Brice; Bonod, Nicolas; Wenger, Jérôme; Stout, Brian; Monneret, Serge; Rigneault, Hervé
2012-09-01
A wavefront sensor is used as a direct observation tool to image the Gouy phase shift in photonic nanojets created by micrometer-sized dielectric spheres. The amplitude and phase distributions of light are found in good agreement with a rigorous electromagnetic computation. Interestingly, the observed phase shift when travelling through the photonic jet is a combination of the expected π Gouy shift and a phase shift induced by the bead refraction. Such direct spatial phase shift observation using wavefront sensors would find applications in microscopy, diffractive optics, optical trapping, and point spread function engineering.
Coastal wetlands of Chesapeake Bay
Baldwin, Andrew H.; Kangas, Patrick J.; Megonigal, J. Patrick; Perry, Matthew C.; Whigham, Dennis F.; Batzer, Darold P.; Batzer, Darold P.; Baldwin, Andrew H.
2012-01-01
Wetlands are prominent landscapes throughout North America. The general characteristics of wetlands are controversial; thus there has not been a systematic assessment of different types of wetlands in different parts of North America, or a compendium of the threats to their conservation. Wetland Habitats of North America adopts a geographic and habitat approach, in which experts familiar with wetlands from across North America provide analyses and syntheses of their particular region of study. Addressing a broad audience of students, scientists, engineers, environmental managers, and policy makers, this book reviews recent, scientifically rigorous literature directly relevant to understanding, managing, protecting, and restoring wetland ecosystems of North America.
2016-01-01
Information is a precise concept that can be defined mathematically, but its relationship to what we call ‘knowledge’ is not always made clear. Furthermore, the concepts ‘entropy’ and ‘information’, while deeply related, are distinct and must be used with care, something that is not always achieved in the literature. In this elementary introduction, the concepts of entropy and information are laid out one by one, explained intuitively, but defined rigorously. I argue that a proper understanding of information in terms of prediction is key to a number of disciplines beyond engineering, such as physics and biology. PMID:26857663
Fast Response Shape Memory Effect Titanium Nickel (TiNi) Foam Torque Tubes
NASA Technical Reports Server (NTRS)
Jardine, Peter
2014-01-01
Shape Change Technologies has developed a process to manufacture net-shaped TiNi foam torque tubes that demonstrate the shape memory effect. The torque tubes dramatically reduce response time by a factor of 10. This Phase II project matured the actuator technology by rigorously characterizing the process to optimize the quality of the TiNi and developing a set of metrics to provide ISO 9002 quality assurance. A laboratory virtual instrument engineering workbench (LabVIEW(TM))-based, real-time control of the torsional actuators was developed. These actuators were developed with The Boeing Company for aerospace applications.
A Rigorous Theory of Many-Body Prethermalization for Periodically Driven and Closed Quantum Systems
NASA Astrophysics Data System (ADS)
Abanin, Dmitry; De Roeck, Wojciech; Ho, Wen Wei; Huveneers, François
2017-09-01
Prethermalization refers to the transient phenomenon where a system thermalizes according to a Hamiltonian that is not the generator of its evolution. We provide here a rigorous framework for quantum spin systems where prethermalization is exhibited for very long times. First, we consider quantum spin systems under periodic driving at high frequency ν. We prove that up to a quasi-exponential time τ* ∼ exp(c ν / log³ ν), the system barely absorbs energy. Instead, there is an effective local Hamiltonian D̂ that governs the time evolution up to τ*, and hence this effective Hamiltonian is a conserved quantity up to τ*. Next, we consider systems without driving, but with a separation of energy scales in the Hamiltonian. A prime example is the Fermi-Hubbard model where the interaction U is much larger than the hopping J. Also here we prove the emergence of an effective conserved quantity, different from the Hamiltonian, up to a time τ* that is (almost) exponential in U/J.
NASA Astrophysics Data System (ADS)
Lumban Gaol, Ford; Rizwan Hussain, Raja; Pandiangan, Tumpal; Desai, Amit
2013-06-01
The 2013 International Conference on Manufacturing, Optimization, Industrial and Material Engineering (MOIME 2013) was held at the Grand Royal Panghegar Hotel, Bandung, Indonesia, from 9-10 March 2013. The MOIME 2013 conference brought together researchers, engineers and scientists in the field from around the world. MOIME 2013 aimed to promote interaction between the theoretical, experimental, and applied communities, so that a high level exchange was achieved in new and emerging areas within Material Engineering, Industrial Engineering and all areas that related to Optimization. We would like to express our sincere gratitude to all in the Technical Program Committee who reviewed the papers and developed a very interesting Conference Program as well as the invited and plenary speakers. This year, we received 103 papers and after rigorous review, 45 papers were accepted. The participants came from 16 countries. There were six Plenary and Invited Speakers. It is an honour to present this volume of IOP Conference Series: Materials Science and Engineering (MSE) and we deeply thank the authors for their enthusiastic and high-grade contribution. Finally, we would like to thank the conference chairmen, the members of the steering committee, the organizing committee, the organizing secretariat and the conference sponsors for the financial support that contributed to the success of MOIME 2013. The Editors of the MOIME 2013 Dr Ford Lumban Gaol Dr Raja Rizwan Hussain Tumpal Pandiangan Dr Amit Desai The PDF contains the abstracts from the plenary and invited articles and the workshop.
2014 International Conference on Manufacturing, Optimization, Industrial and Material Engineering
NASA Astrophysics Data System (ADS)
Lumban Gaol, Ford; Webb, Jeff; Ding, Jun
2014-06-01
The 2nd International Conference on Manufacturing, Optimization, Industrial and Material Engineering 2014 (MOIME 2014) was held at the Grand Mercure Harmoni, Opal Room 3rd Floor, Jakarta, Indonesia, during 29-30 March 2014. The MOIME 2014 conference was designed to bring together researchers, engineers and scientists in the domain of interest from around the world. MOIME 2014 focused on promoting interaction between the theoretical, experimental, and applied communities, so that a high level exchange is achieved in new and emerging areas within Material Engineering, Industrial Engineering and all areas that relate to Optimization. We would like to express our sincere gratitude to all in the Technical Program Committee who have reviewed the papers and developed a very interesting Conference Program as well as the invited and plenary speakers. This year, we received 97 papers and after rigorous review, 24 papers were accepted. The participants came from 7 countries. There were four parallel sessions, two invited speakers and one workshop. It is an honour to present this volume of IOP Conference Series: Materials Science and Engineering (MSE) and we deeply thank the authors for their enthusiastic and high-grade contributions. Finally, we would like to thank the conference chairmen, the members of the steering committee, the organizing committee, the organizing secretariat and the financial support from the conference sponsors that allowed the success of MOIME 2014. The Editors of the MOIME 2014 Proceedings Dr Ford Lumban Gaol Jeff Webb, PhD Professor Jun Ding, PhD
NASA Astrophysics Data System (ADS)
Lumban Gaol, Ford; Webb, Jeff; Ding, Jun
2015-05-01
The 3rd International Conference on Manufacturing, Optimization, Industrial and Material Engineering (MOIME 2015) was held at the Sheraton Kuta, Bali, Indonesia, from 28-29 March 2015. The MOIME 2015 conference aimed to bring together researchers, engineers and scientists in the domain of interest from around the world. MOIME 2015 focused on promoting interaction between the theoretical, experimental, and applied communities, so that a high level exchange is achieved in new and emerging areas within Material Engineering, Industrial Engineering and all areas that relate to Optimization. We would like to express our sincere gratitude to all in the Technical Program Committee who have reviewed the papers and developed a very interesting Conference Program, as well as the invited and plenary speakers. This year, we received 99 papers and after rigorous review, 24 papers were accepted. The participants came from eight countries. There were four parallel sessions and two invited speakers. It is an honour to present this volume of IOP Conference Series: Materials Science and Engineering (MSE) and we deeply thank the authors for their enthusiastic and high-grade contributions. Finally, we would like to thank the conference chairmen, the members of the steering committee, the organizing committee, the organizing secretariat and the financial support from the conference sponsors that allowed the success of MOIME 2015. The Editors of the MOIME 2015 Proceedings Dr. Ford Lumban Gaol Jeff Webb, Ph.D Prof. Jun DING, Ph.D
Preserving pre-rigor meat functionality for beef patty production.
Claus, J R; Sørheim, O
2006-06-01
Three methods were examined for preserving pre-rigor meat functionality in beef patties. Hot-boned semimembranosus muscles were processed as follows: (1) pre-rigor ground, salted, patties immediately cooked; (2) pre-rigor ground, salted and stored overnight; (3) pre-rigor injected with brine; and (4) post-rigor ground and salted. Raw patties contained 60% lean beef, 19.7% beef fat trim, 1.7% NaCl, 3.6% starch, and 15% water. Pre-rigor processing occurred at 3-3.5h postmortem. Patties made from pre-rigor ground meat had higher pH values; greater protein solubility; firmer, more cohesive, and chewier texture; and substantially lower cooking losses than the other treatments. Addition of salt was sufficient to reduce the rate and extent of glycolysis. Brine injection of intact pre-rigor muscles resulted in some preservation of the functional properties but not as pronounced as with salt addition to pre-rigor ground meat.
NASA Astrophysics Data System (ADS)
Sivapalan, Murugesu
2018-03-01
Hydrology has undergone almost transformative changes over the past 50 years. Huge strides have been made in the transition from early empirical approaches to rigorous approaches based on the fluid mechanics of water movement on and below the land surface. However, progress has been hampered by problems posed by the presence of heterogeneity, including subsurface heterogeneity present at all scales. The inability to measure or map the heterogeneity everywhere prevented the development of balance equations and associated closure relations at the scales of interest, and has led to the virtual impasse we are presently in, in terms of development of physically based models needed for hydrologic predictions. An alternative to the mapping of heterogeneity everywhere is a new Earth system science view, which sees the heterogeneity as the end result of co-evolutionary hydrological, geomorphological, ecological, and pedological processes, each operating at a different rate, which help to shape the landscapes that we find in nature, including the heterogeneity that we do not readily see. The expectation is that instead of specifying exact details of the heterogeneity in our models, we can replace it (without loss of information) with the ecosystem function that they perform. Guided by this new Earth system science perspective, development of hydrologic science is now addressing new questions using novel holistic co-evolutionary approaches as opposed to the physical, fluid mechanics based reductionist approaches that we inherited from the recent past. In the emergent Anthropocene, the co-evolutionary view has expanded further to involve interactions and feedbacks with human-social processes as well. In this paper, I present my own perspective of key milestones in the transformation of hydrologic science from engineering hydrology to Earth system science, drawn from the work of several students and colleagues of mine, and discuss their implications for hydrologic observations, theory development, and predictions.
NASA Astrophysics Data System (ADS)
Dykas, Brian; Harris, James
2017-09-01
Acoustic emission sensing techniques have been applied in recent years to dynamic machinery with varying degrees of success in diagnosing various component faults and distinguishing between operating conditions. This work explores basic properties of acoustic emission signals measured on a small single cylinder diesel engine in a laboratory setting. As reported in other works in the open literature, the measured acoustic emission on the engine is mostly continuous mode and individual burst events are generally not readily identifiable. Therefore, the AE signal is processed into its local (instantaneous) root mean square (rms) value, which is averaged over many cycles to obtain a mean rms AE in the crank angle domain. A crank-resolved spectral representation of the AE is also given, but rigorous investigation of the AE spectral qualities is left to future study. Cycle-to-cycle statistical dispersion of the AE signal is considered to highlight highly variable engine processes. Engine speed was held constant but load conditions were varied to investigate AE signal sensitivity to operating condition. Furthermore, during the course of testing the fuel injector developed a fault; acoustic emission signals were captured and several signal attributes were successful in distinguishing this altered condition. The sampling and use of the instantaneous rms acoustic emission signal demonstrated promise for non-intrusive and economical change detection of engine injection, combustion and valve events.
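The crank-angle-domain rms processing described above can be sketched in a few lines. The function below is a minimal illustration, not the authors' processing chain: it assumes essentially constant speed within each cycle (as in the test described), a list of sample indices marking cycle reference points, and a four-stroke 0-720 degree cycle; the window length and bin count are placeholder choices.

```python
import numpy as np

def crank_resolved_rms(ae, cycle_starts, n_bins=720, win=64):
    """Mean and dispersion of the local rms AE signal in the crank-angle domain.
    ae           : raw acoustic-emission samples
    cycle_starts : sample indices of successive cycle reference points (e.g. TDC)
    Assumes near-constant speed within each cycle, so samples map linearly to angle."""
    # local (instantaneous) rms via a short moving window
    power = np.convolve(ae.astype(float) ** 2, np.ones(win) / win, mode="same")
    local_rms = np.sqrt(power)

    angles = np.linspace(0.0, 720.0, n_bins, endpoint=False)     # one 4-stroke cycle
    per_cycle = []
    for s0, s1 in zip(cycle_starts[:-1], cycle_starts[1:]):
        cyc = local_rms[s0:s1]
        theta = np.linspace(0.0, 720.0, len(cyc), endpoint=False)
        per_cycle.append(np.interp(angles, theta, cyc))           # resample onto angle grid
    per_cycle = np.asarray(per_cycle)

    # cycle-averaged rms and cycle-to-cycle dispersion
    return angles, per_cycle.mean(axis=0), per_cycle.std(axis=0)
```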
A Greenhouse-Gas Information System: Monitoring and Validating Emissions Reporting and Mitigation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jonietz, Karl K.; Dimotakis, Paul E.; Rotman, Douglas A.
2011-09-26
This study and report focus on attributes of a greenhouse-gas information system (GHGIS) needed to support MRV&V needs. These needs set the function of such a system apart from scientific/research monitoring of GHGs and carbon-cycle systems, and include (not exclusively): the need for a GHGIS that is operational, as required for decision-support; the need for a system that meets specifications derived from imposed requirements; the need for rigorous calibration, verification, and validation (CV&V) standards, processes, and records for all measurement and modeling/data-inversion data; the need to develop and adopt an uncertainty-quantification (UQ) regimen for all measurement and modeling data; and the requirement that GHGIS products can be subjected to third-party questioning and scientific scrutiny. This report examines and assesses presently available capabilities that could contribute to a future GHGIS. These capabilities include sensors and measurement technologies; data analysis and data uncertainty quantification (UQ) practices and methods; and model-based data-inversion practices, methods, and their associated UQ. The report further examines the need for traceable calibration, verification, and validation processes and attached metadata; differences between present science-/research-oriented needs and those that would be required for an operational GHGIS; the development, operation, and maintenance of a GHGIS missions-operations center (GMOC); and the complex systems engineering and integration that would be required to develop, operate, and evolve a future GHGIS.
Araújo, Luciano V; Malkowski, Simon; Braghetto, Kelly R; Passos-Bueno, Maria R; Zatz, Mayana; Pu, Calton; Ferreira, João E
2011-12-22
Recent medical and biological technology advances have stimulated the development of new testing systems that have been providing huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of a genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to a human genome laboratory. We introduced the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control flow specifications based on process algebra (ACP). The main difference between our approach and related works is that we were able to join two important aspects: 1) process scalability achieved through relational database implementation, and 2) correctness of processes using process algebra. Furthermore, the software allows end users to define genetic testing without requiring any knowledge about business process notation or process algebra. This paper presents the CEGH information system that is a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have proved the feasibility and showed usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing using easy end user interfaces.
NASA Astrophysics Data System (ADS)
Osman, Sharifah; Mohammad, Shahrin; Abu, Mohd Salleh
2015-05-01
Mathematics and engineering are inexorably and significantly linked, and both are essentially required for analysis and sound judgment when dealing with complex and varied engineering problems. A study within the current engineering education curriculum exploring how critical thinking and mathematical thinking relate to one another is therefore timely and crucial. Unfortunately, there is not much information available explicating the link. This paper aims to report findings of a critical review as well as to provide a brief description of an on-going research project aimed at investigating the dispositions of critical thinking and the relationship and integration between critical thinking and mathematical thinking during the execution of civil engineering tasks. The first part of the paper reports an in-depth review on these matters based on rather limited resources. The review showed a considerable form of congruency between these two perspectives of thinking, with some prevalent trends of engineering workplace tasks, problems and challenges. The second part describes the on-going research to be conducted by the researcher to investigate rigorously the relationship and integration between these two types of thinking within the perspective of civil engineering tasks. Reasonably close non-participant observations and semi-structured interviews will be conducted for the pilot and main stages of the study. The data will be analyzed using constant comparative analysis, in which the grounded theory methodology will be adopted. The findings will serve as a useful grounding for constructing a substantive theory revealing the integral relationship between critical thinking and mathematical thinking in the real civil engineering practice context. The substantive theory is expected to contribute additional useful information to engineering program outcomes and engineering education instruction, in line with the expectations of engineering program outcomes set by the Engineering Accreditation Council.
Multi-Partner Experiment to Test Volcanic-Ash Ingestion by a Jet Engine
NASA Technical Reports Server (NTRS)
Lekki, John; Lyall, Eric; Guffanti, Marianne; Fisher, John; Erlund, Beth; Clarkson, Rory; van de Wall, Allan
2013-01-01
A research team of U.S. Government agencies and engine manufacturers is designing an experiment to test volcanic-ash ingestion by a NASA-owned F117 engine that was donated by the U.S. Air Force. The experiment is being conducted under the auspices of NASA's Vehicle Integrated Propulsion Research (VIPR) Program and will take place in early 2014 at Edwards AFB in California as an on-ground, on-wing test. The primary objectives are to determine the effect on the engine of several hours of exposure to low to moderate ash concentrations, currently proposed at 1 and 10 mg/m3, and to evaluate the capability of engine health management technologies for detecting these effects. A natural volcanic ash will be used that is representative of distal ash clouds many hundreds to approximately 1000 km from a volcanic source, i.e., the ash should be composed of fresh glassy particles a few tens of microns in size. The glassy ash particles are expected to soften and become less viscous when exposed to the high temperatures of the combustion chamber, then stick to the nozzle guide vanes of the high-pressure turbine. Numerous observations and measurements of the engine's performance and degradation will be made during the course of the experiment, including borescope and tear-down inspections. While not intended to be sufficient for rigorous certification of engine performance when ash is ingested, the experiment should provide useful information to aircraft manufacturers, airline operators, and military and civil regulators in their efforts to evaluate the range of risks that ash hazards pose to aviation.
Repair systems for deteriorated bridge piles : final report.
DOT National Transportation Integrated Search
2017-04-01
The objective of this research project is to develop a durable repair system for deteriorated steel bridge piles that : can be implemented without the need for dewatering. A rigorous survey of the relevant practice nationwide was : conducted to infor...
Information Models, Data Requirements, and Agile Data Curation
NASA Astrophysics Data System (ADS)
Hughes, John S.; Crichton, Dan; Ritschel, Bernd; Hardman, Sean; Joyner, Ron
2015-04-01
The Planetary Data System's next generation system, PDS4, is an example of the successful use of an ontology-based Information Model (IM) to drive the development and operations of a data system. In traditional systems engineering, requirements or statements about what is necessary for the system are collected and analyzed for input into the design stage of systems development. With the advent of big data, the requirements associated with data have begun to dominate, and an ontology-based information model can be used to provide a formalized and rigorous set of data requirements. These requirements address not only the usual issues of data quantity, quality, and disposition but also data representation, integrity, provenance, context, and semantics. In addition, the use of these data requirements during system development has many characteristics of Agile Curation as proposed by Young et al. [Taking Another Look at the Data Management Life Cycle: Deconstruction, Agile, and Community, AGU 2014], namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. For example, customers can be satisfied through early and continuous delivery of system software and services that are configured directly from the information model. This presentation will describe the PDS4 architecture and its three principal parts: the ontology-based Information Model (IM), the federated registries and repositories, and the REST-based service layer for search, retrieval, and distribution. The development of the IM will be highlighted with special emphasis on knowledge acquisition, the impact of the IM on development and operations, and the use of shared ontologies at multiple governance levels to promote system interoperability and data correlation.
Calculating life? Duelling discourses in interdisciplinary systems biology.
Calvert, Jane; Fujimura, Joan H
2011-06-01
A high profile context in which physics and biology meet today is in the new field of systems biology. Systems biology is a fascinating subject for sociological investigation because the demands of interdisciplinary collaboration have brought epistemological issues and debates front and centre in discussions amongst systems biologists in conference settings, in publications, and in laboratory coffee rooms. One could argue that systems biologists are conducting their own philosophy of science. This paper explores the epistemic aspirations of the field by drawing on interviews with scientists working in systems biology, attendance at systems biology conferences and workshops, and visits to systems biology laboratories. It examines the discourses of systems biologists, looking at how they position their work in relation to previous types of biological inquiry, particularly molecular biology. For example, they raise the issue of reductionism to distinguish systems biology from molecular biology. This comparison with molecular biology leads to discussions about the goals and aspirations of systems biology, including epistemic commitments to quantification, rigor and predictability. Some systems biologists aspire to make biology more similar to physics and engineering by making living systems calculable, modelable and ultimately predictable, a research programme that is perhaps taken to its most extreme form in systems biology's sister discipline: synthetic biology. Other systems biologists, however, do not think that the standards of the physical sciences are the standards by which we should measure the achievements of systems biology, and doubt whether such standards will ever be applicable to 'dirty, unruly living systems'. This paper explores these epistemic tensions and reflects on their sociological dimensions and their consequences for future work in the life sciences. Copyright © 2010 Elsevier Ltd. All rights reserved.
Engineering the on-axis intensity of Bessel beam by a feedback tuning loop
NASA Astrophysics Data System (ADS)
Li, Runze; Yu, Xianghua; Yang, Yanlong; Peng, Tong; Yao, Baoli; Zhang, Chunmin; Ye, Tong
2018-02-01
The Bessel beam belongs to a typical class of non-diffractive optical fields that are characterized by their invariant focal profiles along the propagation direction. However, ideal Bessel beams only rigorously exist in theory; Bessel beams generated in the lab are quasi-Bessel beams with finite focal extensions and varying intensity profiles along the propagation axis. The ability to engineer the on-axis intensity profile to the desired shape is essential for many applications. Here we demonstrate an iterative optimization-based approach to engineering the on-axis intensity of Bessel beams. The genetic algorithm is used to demonstrate this approach. Starting with a traditional axicon phase mask, in the design process, the computed on-axis beam profile is fed into a feedback tuning loop of an iterative optimization process, which searches for an optimal radial phase distribution that can generate a generalized Bessel beam with the desired on-axis intensity profile. The experimental implementation involves a fine-tuning process that adjusts the originally targeted profile so that the optimization process can optimize the phase mask to yield an improved on-axis profile. Our proposed method has been demonstrated in engineering several zeroth-order Bessel beams with customized on-axis profiles. High accuracy and high energy throughput merit its use in many applications.
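A minimal version of such a feedback tuning loop is sketched below under stated assumptions: the on-axis field is obtained from the scalar Fresnel diffraction integral for a radially symmetric phase mask, the radial phase is parameterized by a handful of interpolated node values, and a very small genetic algorithm (truncation selection, uniform crossover, Gaussian mutation) adjusts the nodes toward a flat on-axis profile. Wavelength, aperture, node count, and GA settings are placeholders, and this is not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(2)

lam = 532e-9; k = 2 * np.pi / lam
R = 2e-3                                       # mask aperture radius [m]
r = np.linspace(0.0, R, 800)                   # radial grid across the mask
z = np.linspace(5e-2, 25e-2, 80)               # on-axis range of interest [m]
target = np.ones_like(z)                       # desired (flat) on-axis intensity

def on_axis_intensity(nodes):
    """|E(0,z)|^2 from the Fresnel integral for a radially symmetric phase mask,
    with the phase given at a few radial nodes and interpolated in between."""
    phi = np.interp(r, np.linspace(0.0, R, nodes.size), nodes)
    integrand = np.exp(1j * (phi[None, :] + k * r[None, :] ** 2 / (2 * z[:, None]))) * r[None, :]
    field = np.trapz(integrand, r, axis=1) * k / z
    I = np.abs(field) ** 2
    return I / I.max()

def fitness(nodes):
    return -np.mean((on_axis_intensity(nodes) - target) ** 2)

n_nodes, pop_size, n_gen = 16, 30, 60
base = -k * np.linspace(0.0, R, n_nodes) * 5e-3           # axicon-like starting phase ramp
pop = base + rng.normal(scale=5.0, size=(pop_size, n_nodes))

for _ in range(n_gen):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[-pop_size // 2:]]     # keep the better half
    children = []
    while len(children) < pop_size - len(parents):
        a, b = parents[rng.integers(len(parents), size=2)]
        mask = rng.random(n_nodes) < 0.5
        children.append(np.where(mask, a, b) + rng.normal(scale=0.2, size=n_nodes))
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(p) for p in pop])]
print("best phase nodes [rad]:", np.round(best, 2))
```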
The Hazard of Volcanic Ash Ingestion
NASA Technical Reports Server (NTRS)
Lekki, John
2017-01-01
A research team of U.S. Government agencies and engine manufacturers conducted an experiment to test volcanic-ash ingestion by a NASA-owned engine in the same family as the PW 2000 that was donated by the U.S. Air Force. The experiment, called the Vehicle Integrated Propulsion Research (VIPR) test, was conducted under the auspices of NASA's Convergent Aeronautics Solutions (CAS) Program and took place in the summer of 2015 at Edwards AFB in California as an on-ground, on-wing test. The primary objectives of the volcanic ash test were to determine the effect on the engine of several hours of exposure to low to moderate ash concentrations and to evaluate the capability of engine health management technologies for detecting these effects. The target concentrations of volcanic ash tested were 1 and 10 mg/m3. A natural volcanic ash was used that is representative of distal ash clouds many hundreds to 1000 km from a volcanic source. The glassy ash particles were expected to soften and become less viscous when exposed to the high temperatures of the combustion chamber, then stick to the nozzle guide vanes of the high-pressure turbine, and this was observed. Numerous observations and measurements of the engine's performance and degradation were made during the course of the experiment, including borescope inspections after each test run. The engine has been disassembled so that detailed inspections of the engine effects could be made. A summary of the test methodology and execution is given along with results from the test. While not intended to be sufficient for rigorous certification of engine performance when ash is ingested, the experiment should provide useful information to aircraft manufacturers, airline operators, and military and civil regulators in their efforts to evaluate the range of risks that ash hazards pose to aviation.
Rigorous mathematical modelling for a Fast Corrector Power Supply in TPS
NASA Astrophysics Data System (ADS)
Liu, K.-B.; Liu, C.-Y.; Chien, Y.-C.; Wang, B.-S.; Wong, Y. S.
2017-04-01
To enhance the stability of the beam orbit, a Fast Orbit Feedback System (FOFB) eliminating undesired disturbances was installed and tested in the 3rd-generation synchrotron light source of Taiwan Photon Source (TPS) at the National Synchrotron Radiation Research Center (NSRRC). The effectiveness of the FOFB greatly depends on the output performance of the Fast Corrector Power Supply (FCPS); therefore, the design and implementation of an accurate FCPS is essential. Rigorous mathematical modelling is very useful for shortening the design time and improving the design performance of a FCPS. A rigorous mathematical model of a full-bridge FCPS for the FOFB of TPS, derived by the state-space averaging method, is therefore proposed in this paper. The MATLAB/SIMULINK software is used to construct the proposed mathematical model and to conduct simulations of the FCPS. The effects of different ADC resolutions on the output accuracy of the FCPS are investigated through simulation. A FCPS prototype was realized to demonstrate the effectiveness of the proposed rigorous mathematical model for the FCPS. Simulation and experimental results show that the proposed mathematical model is helpful for selecting the appropriate components to meet the accuracy requirements of a FCPS.
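The abstract does not give the converter parameters, so the sketch below only illustrates the kind of state-space-averaged model it refers to: a full-bridge stage whose averaged output voltage is (2d-1)·Vdc, a corrector magnet modelled as a series R-L load, a digital PI current loop, and ADC quantization of the feedback signal so that the influence of converter resolution on output accuracy can be probed. All numerical values are assumptions, not the FCPS design values.

```python
import numpy as np

# placeholder parameters (not the actual TPS FCPS values)
Vdc, L, R = 50.0, 10e-3, 0.5            # DC bus [V], magnet inductance [H], resistance [ohm]
fs_ctrl = 20e3                           # control / PWM update rate [Hz]
dt = 1.0 / fs_ctrl
adc_bits, i_fullscale = 16, 10.0         # ADC resolution under study
lsb = 2 * i_fullscale / 2 ** adc_bits

kp, ki = 2.0, 2000.0                     # illustrative PI current-loop gains
i_ref = 5.0                              # commanded corrector current [A]

i, integ, hist = 0.0, 0.0, []
for _ in range(2000):
    i_meas = round(i / lsb) * lsb        # ADC quantization of the current feedback
    err = i_ref - i_meas
    integ += err * dt
    d = 0.5 + (kp * err + ki * integ) / (2 * Vdc)   # duty ratio; d = 0.5 gives zero average volts
    d = min(max(d, 0.0), 1.0)
    v_avg = (2 * d - 1) * Vdc            # state-space-averaged full-bridge output voltage
    i += dt * (v_avg - R * i) / L        # averaged R-L load dynamics (forward Euler)
    hist.append(i)

print("steady-state error [A]:", i_ref - np.mean(hist[-200:]))
```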
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burgard, K.G.
This Configuration Management Implementation Plan was developed to assist in the management of systems, structures, and components, to facilitate the effective control and statusing of changes to systems, structures, and components; and to ensure technical consistency between design, performance, and operational requirements. Its purpose is to describe the approach Project W-464 will take in implementing a configuration management control, to determine the rigor of control, and to identify the mechanisms for imposing that control.
Management Information System Based on the Balanced Scorecard
ERIC Educational Resources Information Center
Kettunen, Juha; Kantola, Ismo
2005-01-01
Purpose: This study seeks to describe the planning and implementation in Finland of a campus-wide management information system using a rigorous planning methodology. Design/methodology/approach: The structure of the management information system is planned on the basis of the management process, where strategic management and the balanced…
Three Views of Systems Theories and Their Implications for Sustainability Education
ERIC Educational Resources Information Center
Porter, Terry; Cordoba, Jose
2009-01-01
Worldwide, there is an emerging interest in sustainability and sustainability education. A popular and promising approach is the use of systems thinking. However, the systems approach to sustainability has neither been clearly defined nor has its practical application followed any systematic rigor, resulting in confounded and underspecified…
Projects for People: An International Exchange Focused on Drinking Water Quality in Rural Peru
NASA Astrophysics Data System (ADS)
Weathers, T. S.; Tarazona Vasquez, F.; Bailey, E.; Duong, V.; Gonzales Vera, R.; LaPorte, D.; Rojas Cala, B.; Torres Atencia, S.; Vasquez Auqui, J.
2016-12-01
The integration of human-centered design with technical engineering in a classroom setting can be challenging but immensely rewarding if coupled with a community-focused experience. Undergraduate students participated in an international exchange to address drinking water quality in the community of Huamancaca, located in the Junin region of Peru. Technical research and experimentation often come easily to students in undergraduate engineering programs; however, implementation within a community requires a social license to operate. The objectives of this study were to address the technical challenges of designing a sustainable and effective water filtration system while also ensuring community support and education, coupled with user ownership of the process. In tandem with filter media experimentation with biochar and activated carbon produced using locally available agricultural waste from potatoes and carrots, we visited the people of Huamancaca to understand their needs and concerns. This direct communication with the community was invaluable; we observed that many of the residents' water quality problems could be solved with education. For example, proper sanitation techniques and appropriate addition of bleach or sufficient boiling time may make up for inconsistent water quality provided by the local distribution system. An education plan may also be developed for water treatment plant operators covering chlorine dosage for effective residual treatment within the distribution network in addition to filtration. Upon site visitation and sample collection, we realized that open communication with city officials, operators, business owners, and residents in both technical and social settings is essential for continued collaboration within this community. Solving a tangible problem or designing a product that can be effectively adopted is not a concept that is rigorously addressed in undergraduate education; however, the setbacks, challenges, and triumphs experienced when interacting with a community can provide valuable lessons for career development.
Rigorous Combination of GNSS and VLBI: How it Improves Earth Orientation and Reference Frames
NASA Astrophysics Data System (ADS)
Lambert, S. B.; Richard, J. Y.; Bizouard, C.; Becker, O.
2017-12-01
Current reference series (C04) of the International Earth Rotation and Reference Systems Service (IERS) are produced by a weighted combination of Earth orientation parameter (EOP) time series built up by the combination centers of each technique (VLBI, GNSS, laser ranging, DORIS). In the future, we plan to derive EOP from a rigorous combination of the normal equation systems of the four techniques. We present here the results of a rigorous combination of VLBI and GNSS pre-reduced, constraint-free normal equations with the DYNAMO geodetic analysis software package developed and maintained by the French GRGS (Groupe de Recherche en Géodésie Spatiale). The normal equations used are those produced separately by the IVS and IGS combination centers, to which we apply our own minimal constraints. We address the usefulness of such a method with respect to the classical, a posteriori combination method, and we show whether EOP determinations are improved. In particular, we implement external validations of the EOP series based on comparison with geophysical excitation and examination of the covariance matrices. Finally, we address the potential of the technique for the next generation of celestial reference frames, which are currently determined by VLBI only.
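At the normal-equation level, the combination itself reduces to summing the individual systems on a common parameter vector and solving once. The sketch below shows only that skeleton and is not the DYNAMO processing: it assumes the two systems have already been reduced to identical parameterizations, applies equal inter-technique weights, and substitutes a plain Tikhonov term for the proper minimal (datum) constraints that a real GNSS+VLBI combination would impose.

```python
import numpy as np

def combine_normal_equations(systems, eps=1e-8):
    """Combine pre-reduced, constraint-free normal equations (N, b) defined on a
    common parameter vector; `eps` regularizes the datum defect as a stand-in
    for proper minimal constraints."""
    n = systems[0][0].shape[0]
    N_tot, b_tot = np.zeros((n, n)), np.zeros(n)
    for N, b in systems:               # equal inter-technique weights assumed here
        N_tot += N
        b_tot += b
    N_tot += eps * np.eye(n)
    x = np.linalg.solve(N_tot, b_tot)
    cov = np.linalg.inv(N_tot)         # combined covariance of the estimated parameters
    return x, cov

# usage with two small synthetic systems standing in for the VLBI and GNSS contributions
rng = np.random.default_rng(4)
A1, A2 = rng.normal(size=(30, 6)), rng.normal(size=(40, 6))
x_true = rng.normal(size=6)
y1 = A1 @ x_true + rng.normal(scale=0.01, size=30)
y2 = A2 @ x_true + rng.normal(scale=0.01, size=40)
systems = [(A1.T @ A1, A1.T @ y1), (A2.T @ A2, A2.T @ y2)]
x_hat, cov = combine_normal_equations(systems)
print(np.round(x_hat - x_true, 4))
```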
Abraham, Adane
2013-01-01
On September 9, 2009, Ethiopia enacted a highly restrictive biosafety law firmly based on precautionary principles as a foundation for its GMO regulation system. Its drafting process, led by the country's Environmental Protection Authority, was judged as biased, focusing only on protecting the environment from perceived risks and giving little attention to potential benefits of GMOs. Many of its provisions are very stringent, exceeding those of the Cartagena Protocol on Biosafety, while others cannot be fulfilled by applicants, collectively rendering the resulting biosafety system unworkable. These provisions include requirements for advance informed agreement and rigorous socioeconomic assessment in risk evaluation for all GMO transactions, including contained research use (which requires the head of the competent national authority of the exporting country to take full responsibility for the GMO-related information provided), as well as stringent labeling, insurance and monitoring requirements for all GMO activities. Furthermore, there is no provision to establish an independent national biosafety decision-making body or bodies. As a result, foreign technology owners that provide highly demanded technologies like Bt cotton declined to work with Ethiopia. There is a fear that the biosafety system might also continue to suppress domestic genetic engineering research and development. Thus, to benefit from GMOs, Ethiopia has to revise its biosafety system, primarily by making changes to some provisions of the law in a way that balances its diverse interests of conserving biodiversity, protecting the environment and enhancing competition in agricultural and other economic sectors.
Space environmental effects observed on the Hubble Space Telescope
NASA Technical Reports Server (NTRS)
Edelman, Joel E.; Mason, James B.
1995-01-01
The Hubble Space Telescope (HST) Repair Mission of December, 1993, was first and foremost a mission to improve the performance of the observatory. But for a specialized segment of the aerospace industry, the primary interest is in the return to Earth of numerous pieces of the HST hardware, pieces which have been replaced, repaired, improved, or superseded. The returned hardware is of interest because of the information it potentially carries about the effects of exposure to the space environment for three and a half years. Like the LDEF retrieval mission four years ago, the HST repair mission is of interest to many engineering disciplines, including all of the disciplines represented by the LDEF Special Investigation Groups (SIG's). There is particular interest in the evaluation of specific materials and systems in the returned components. Some coated surfaces have been processed with materials which are newer and still in use by, or under consideration for, other spacecraft in a variety of stages of development. Several of the systems are being returned because a specific failure or anomaly has been observed and thus there is, at the outset, a specific investigative trail that needs to be followed. These systems are much more complex than those flown on LDEF and, in two instances, comprised state-of-the-art science instruments. Further, the parts used in these systems generally were characterized more rigorously prior to flight than were those in the LDEF systems, and thus post flight testing may yield more significant results.
Rigorous RG Algorithms and Area Laws for Low Energy Eigenstates in 1D
NASA Astrophysics Data System (ADS)
Arad, Itai; Landau, Zeph; Vazirani, Umesh; Vidick, Thomas
2017-11-01
One of the central challenges in the study of quantum many-body systems is the complexity of simulating them on a classical computer. A recent advance (Landau et al. in Nat Phys, 2015) gave a polynomial time algorithm to compute a succinct classical description for unique ground states of gapped 1D quantum systems. Despite this progress many questions remained unsolved, including whether there exist efficient algorithms when the ground space is degenerate (and of polynomial dimension in the system size), or for the polynomially many lowest energy states, or even whether such states admit succinct classical descriptions or area laws. In this paper we give a new algorithm, based on a rigorously justified RG type transformation, for finding low energy states for 1D Hamiltonians acting on a chain of n particles. In the process we resolve some of the aforementioned open questions, including giving a polynomial time algorithm for poly(n) degenerate ground spaces and an n^{O(log n)} algorithm for the poly(n) lowest energy states (under a mild density condition). For these classes of systems the existence of a succinct classical description and area laws were not rigorously proved before this work. The algorithms are natural and efficient, and for the case of finding unique ground states for frustration-free Hamiltonians the running time is Õ(n·M(n)), where M(n) is the time required to multiply two n × n matrices.
Concrete ensemble Kalman filters with rigorous catastrophic filter divergence
Kelly, David; Majda, Andrew J.; Tong, Xin T.
2015-01-01
The ensemble Kalman filter and ensemble square root filters are data assimilation methods used to combine high-dimensional, nonlinear dynamical models with observed data. Ensemble methods are indispensable tools in science and engineering and have enjoyed great success in geophysical sciences, because they allow for computationally cheap low-ensemble-state approximation for extremely high-dimensional turbulent forecast models. From a theoretical perspective, the dynamical properties of these methods are poorly understood. One of the central mysteries is the numerical phenomenon known as catastrophic filter divergence, whereby ensemble-state estimates explode to machine infinity, despite the true state remaining in a bounded region. In this article we provide a breakthrough insight into the phenomenon, by introducing a simple and natural forecast model that transparently exhibits catastrophic filter divergence under all ensemble methods and a large set of initializations. For this model, catastrophic filter divergence is not an artifact of numerical instability, but rather a true dynamical property of the filter. The divergence is not only validated numerically but also proven rigorously. The model cleanly illustrates mechanisms that give rise to catastrophic divergence and confirms intuitive accounts of the phenomena given in past literature. PMID:26261335
Concrete ensemble Kalman filters with rigorous catastrophic filter divergence.
Kelly, David; Majda, Andrew J; Tong, Xin T
2015-08-25
The ensemble Kalman filter and ensemble square root filters are data assimilation methods used to combine high-dimensional, nonlinear dynamical models with observed data. Ensemble methods are indispensable tools in science and engineering and have enjoyed great success in geophysical sciences, because they allow for computationally cheap low-ensemble-state approximation for extremely high-dimensional turbulent forecast models. From a theoretical perspective, the dynamical properties of these methods are poorly understood. One of the central mysteries is the numerical phenomenon known as catastrophic filter divergence, whereby ensemble-state estimates explode to machine infinity, despite the true state remaining in a bounded region. In this article we provide a breakthrough insight into the phenomenon, by introducing a simple and natural forecast model that transparently exhibits catastrophic filter divergence under all ensemble methods and a large set of initializations. For this model, catastrophic filter divergence is not an artifact of numerical instability, but rather a true dynamical property of the filter. The divergence is not only validated numerically but also proven rigorously. The model cleanly illustrates mechanisms that give rise to catastrophic divergence and confirms intuitive accounts of the phenomena given in past literature.
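For readers unfamiliar with the ensemble methods discussed above, the following Python sketch shows a standard perturbed-observation ensemble Kalman analysis step. It is a generic textbook formulation, not the specific forecast model or divergence mechanism studied in the paper; the state dimension, observation operator, and ensemble size are arbitrary illustrative choices.

```python
import numpy as np

def enkf_analysis(ensemble, obs, H, R, rng):
    """Perturbed-observation EnKF analysis step.

    ensemble : (N, d) forecast ensemble members
    obs      : (m,) observation vector
    H        : (m, d) linear observation operator
    R        : (m, m) observation error covariance
    """
    N, d = ensemble.shape
    x_mean = ensemble.mean(axis=0)
    X = ensemble - x_mean                               # anomalies, (N, d)
    P = X.T @ X / (N - 1)                               # sample forecast covariance
    S = H @ P @ H.T + R                                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                      # Kalman gain
    # perturb the observations so the analysis ensemble keeps the right spread
    perturbed = obs + rng.multivariate_normal(np.zeros(len(obs)), R, size=N)
    return ensemble + (perturbed - ensemble @ H.T) @ K.T

rng = np.random.default_rng(0)
ens = rng.normal(size=(20, 3))              # 20 members, 3-dimensional state
H = np.array([[1.0, 0.0, 0.0]])             # observe only the first component
R = np.array([[0.1]])
updated = enkf_analysis(ens, np.array([0.5]), H, R, rng)
print(updated.mean(axis=0))
```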
Unmet Need: Improving mHealth Evaluation Rigor to Build the Evidence Base.
Mookherji, Sangeeta; Mehl, Garrett; Kaonga, Nadi; Mechael, Patricia
2015-01-01
mHealth, the use of mobile technologies for health, is a growing element of health system activity globally, but evaluation of those activities remains scant and is an important knowledge gap for advancing mHealth activities. In 2010, the World Health Organization and Columbia University implemented a small-scale survey to generate preliminary data on evaluation activities used by mHealth initiatives. The authors describe self-reported data from 69 projects in 29 countries. The majority (74%) reported some sort of evaluation activity, primarily nonexperimental in design (62%). The authors developed a 6-point scale of evaluation rigor comprising information on use of comparison groups, sample size calculation, data collection timing, and randomization. The mean score was low (2.4); nearly half (47%) were conducting evaluations that met a minimum threshold of rigor (a score of 4 or higher), indicating use of a comparison group, while less than 20% had randomized the mHealth intervention. The authors were unable to assess whether the rigor score was appropriate for the type of mHealth activity being evaluated. What was clear was that although most data came from mHealth pilot projects aimed at scale-up, few had designed evaluations that would support crucial decisions on whether to scale up and how. Whether the mHealth activity is a strategy to improve health or a tool for achieving intermediate outcomes that should lead to better health, mHealth evaluations must be improved to generate robust evidence for cost-effectiveness assessment and to allow for accurate identification of the contribution of mHealth initiatives to health systems strengthening and the impact on actual health outcomes.
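To make the idea of a composite rigor score concrete, here is a minimal Python sketch of a 0-6 scoring function built from the four criteria the abstract names. The point weights and argument names are assumptions for illustration only; they are not the authors' published scale.

```python
def rigor_score(has_comparison_group, did_sample_size_calc,
                collects_baseline_and_endline, randomized_assignment):
    """Illustrative 0-6 evaluation-rigor score loosely following the criteria
    named in the abstract (comparison group, sample-size calculation,
    data-collection timing, randomization). Weights here are assumptions."""
    score = 0
    score += 2 if has_comparison_group else 0
    score += 1 if did_sample_size_calc else 0
    score += 1 if collects_baseline_and_endline else 0
    score += 2 if randomized_assignment else 0
    return score

# a project with a comparison group and baseline/endline data, but no
# sample-size calculation and no randomization
print(rigor_score(True, False, True, False))   # -> 3
```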
High-order computer-assisted estimates of topological entropy
NASA Astrophysics Data System (ADS)
Grote, Johannes
The concept of Taylor Models is introduced, which offers highly accurate C0-estimates for the enclosures of functional dependencies, combining high-order Taylor polynomial approximation of functions and rigorous estimates of the truncation error, performed using verified interval arithmetic. The focus of this work is on the application of Taylor Models in algorithms for strongly nonlinear dynamical systems. A method to obtain sharp rigorous enclosures of Poincaré maps for certain types of flows and surfaces is developed and numerical examples are presented. Differential algebraic techniques allow the efficient and accurate computation of polynomial approximations for invariant curves of certain planar maps around hyperbolic fixed points. Subsequently we introduce a procedure to extend these polynomial curves to verified Taylor Model enclosures of local invariant manifolds with C0-errors of size 10^-10 to 10^-14, and proceed to generate the global invariant manifold tangle up to comparable accuracy through iteration in Taylor Model arithmetic. Knowledge of the global manifold structure up to finite iterations of the local manifold pieces enables us to find all homoclinic and heteroclinic intersections in the generated manifold tangle. Combined with the mapping properties of the homoclinic points and their ordering we are able to construct a subshift of finite type as a topological factor of the original planar system to obtain rigorous lower bounds for its topological entropy. This construction is fully automatic and yields homoclinic tangles with several hundred homoclinic points. As an example rigorous lower bounds for the topological entropy of the Hénon map are computed, which to the best knowledge of the authors yield the largest such estimates published so far.
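The Taylor Model idea described above, a high-order polynomial plus a rigorous remainder bound, can be illustrated in a few lines. The sketch below encloses exp on an interval using a midpoint Taylor polynomial evaluated in naive interval arithmetic plus a Lagrange remainder bound; unlike a verified implementation it ignores floating-point rounding, so it conveys the structure of the method rather than a formal guarantee.

```python
import math
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __mul__(self, other):
        ps = [self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi]
        return Interval(min(ps), max(ps))

def enclose_exp(x: Interval, order: int = 6) -> Interval:
    """Enclose exp over x: Taylor polynomial at the midpoint, evaluated in
    interval arithmetic (Horner form), plus a crude Lagrange remainder bound."""
    c = 0.5 * (x.lo + x.hi)
    h = Interval(x.lo - c, x.hi - c)
    acc = Interval(0.0, 0.0)
    for k in range(order, -1, -1):
        coef = math.exp(c) / math.factorial(k)
        acc = acc * h + Interval(coef, coef)
    # remainder: |exp^(n+1)| <= exp(x.hi) on the interval
    half_width = max(abs(x.lo - c), abs(x.hi - c))
    r = math.exp(x.hi) * half_width ** (order + 1) / math.factorial(order + 1)
    return Interval(acc.lo - r, acc.hi + r)

print(enclose_exp(Interval(0.0, 0.5)))   # contains [1, e^0.5]
```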
A Novel Decision Support Tool to Develop Link Driving Schedules for Moves.
DOT National Transportation Integrated Search
2015-01-01
A system or user level strategy that aims to reduce emissions from transportation networks requires a rigorous assessment of emissions inventory for the system to justify its effectiveness. It is important to estimate the total emissions for a transp...
Skill Assessment for Coupled Biological/Physical Models of Marine Systems.
Stow, Craig A; Jolliff, Jason; McGillicuddy, Dennis J; Doney, Scott C; Allen, J Icarus; Friedrichs, Marjorie A M; Rose, Kenneth A; Wallhead, Philip
2009-02-20
Coupled biological/physical models of marine systems serve many purposes including the synthesis of information, hypothesis generation, and as a tool for numerical experimentation. However, marine system models are increasingly used for prediction to support high-stakes decision-making. In such applications it is imperative that a rigorous model skill assessment is conducted so that the model's capabilities are tested and understood. Herein, we review several metrics and approaches useful to evaluate model skill. The definition of skill and the determination of the skill level necessary for a given application is context specific and no single metric is likely to reveal all aspects of model skill. Thus, we recommend the use of several metrics, in concert, to provide a more thorough appraisal. The routine application and presentation of rigorous skill assessment metrics will also serve the broader interests of the modeling community, ultimately resulting in improved forecasting abilities as well as helping us recognize our limitations.
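As a concrete illustration of quantitative skill assessment, the sketch below computes a few widely used metrics for comparing model output with observations. The particular set (RMSE, bias, correlation, model efficiency) is a common choice offered only as an example; the review itself surveys a broader range of metrics.

```python
import numpy as np

def skill_metrics(obs, model):
    """A small set of common model-data skill metrics (illustrative choice)."""
    obs, model = np.asarray(obs, float), np.asarray(model, float)
    resid = model - obs
    rmse = np.sqrt(np.mean(resid ** 2))
    bias = np.mean(resid)
    corr = np.corrcoef(obs, model)[0, 1]
    # model efficiency (Nash-Sutcliffe): 1 is perfect, < 0 is worse than the mean
    mef = 1.0 - np.sum(resid ** 2) / np.sum((obs - obs.mean()) ** 2)
    return {"rmse": rmse, "bias": bias, "r": corr, "model_efficiency": mef}

print(skill_metrics([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.3, 3.8]))
```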
McCaig, Chris; Begon, Mike; Norman, Rachel; Shankland, Carron
2011-03-01
Changing scale, for example, the ability to move seamlessly from an individual-based model to a population-based model, is an important problem in many fields. In this paper, we introduce process algebra as a novel solution to this problem in the context of models of infectious disease spread. Process algebra, a technique from computer science, allows us to describe a system in terms of the stochastic behaviour of individuals. We review the use of process algebra in biological systems, and the variety of quantitative and qualitative analysis techniques available. The analysis illustrated here solves the changing scale problem: from the individual behaviour we can rigorously derive equations to describe the mean behaviour of the system at the level of the population. The biological problem investigated is the transmission of infection, and how this relates to individual interactions.
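The individual-to-population scaling question can be illustrated with a toy epidemic: the sketch below runs a Gillespie-style stochastic SIS simulation over individuals, whose large-population mean behaviour is described by the familiar SIS ordinary differential equation dI/dt = beta*S*I/n - gamma*I. This is a generic analogue of the scale change discussed above, not the process-algebra machinery used in the paper; parameter values are arbitrary.

```python
import numpy as np

def stochastic_sis(n=200, beta=0.3, gamma=0.1, i0=10, t_max=100.0, seed=1):
    """Gillespie simulation of an SIS epidemic among n individuals."""
    rng = np.random.default_rng(seed)
    t, i = 0.0, i0
    times, infected = [t], [i]
    while t < t_max and i > 0:
        rate_inf = beta * (n - i) * i / n      # infection events
        rate_rec = gamma * i                   # recovery events
        total = rate_inf + rate_rec
        t += rng.exponential(1.0 / total)      # time to next event
        i += 1 if rng.random() < rate_inf / total else -1
        times.append(t)
        infected.append(i)
    return np.array(times), np.array(infected)

t, i = stochastic_sis()
print(f"final prevalence: {i[-1] / 200:.2f}")
```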
Eslick, John C.; Ng, Brenda; Gao, Qianwen; ...
2014-12-31
Under the auspices of the U.S. Department of Energy’s Carbon Capture Simulation Initiative (CCSI), a Framework for Optimization and Quantification of Uncertainty and Sensitivity (FOQUS) has been developed. This tool enables carbon capture systems to be rapidly synthesized and rigorously optimized, in an environment that accounts for and propagates uncertainties in parameters and models. FOQUS currently enables (1) the development of surrogate algebraic models utilizing the ALAMO algorithm, which can be used for superstructure optimization to identify optimal process configurations, (2) simulation-based optimization utilizing derivative free optimization (DFO) algorithms with detailed black-box process models, and (3) rigorous uncertainty quantification through PSUADE. FOQUS utilizes another CCSI technology, the Turbine Science Gateway, to manage the thousands of simulated runs necessary for optimization and UQ. Thus, this computational framework has been demonstrated for the design and analysis of a solid sorbent based carbon capture system.
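As a small illustration of simulation-based, derivative-free optimization of the kind FOQUS orchestrates, the sketch below applies a Nelder-Mead simplex search (via SciPy, assumed available) to a stand-in black-box cost function. The variable names and the toy cost function are assumptions; a real workflow would wrap a detailed process simulation instead.

```python
from scipy.optimize import minimize

def capture_cost(x):
    """Stand-in for an expensive black-box process simulation; a real FOQUS
    workflow would dispatch detailed flowsheet runs rather than this toy."""
    sorbent_flow, regen_temp = x
    return (sorbent_flow - 3.0) ** 2 + 0.5 * (regen_temp - 420.0) ** 2 + 10.0

# derivative-free simplex search, appropriate when gradients of the
# simulation are unavailable or unreliable
result = minimize(capture_cost, x0=[1.0, 400.0], method="Nelder-Mead",
                  options={"xatol": 1e-4, "fatol": 1e-6})
print(result.x, result.fun)
```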
Rigorous high-precision enclosures of fixed points and their invariant manifolds
NASA Astrophysics Data System (ADS)
Wittig, Alexander N.
The well-established concept of Taylor Models is introduced; these offer highly accurate C0 enclosures of functional dependencies, combining high-order polynomial approximation of functions and rigorous estimates of the truncation error, performed using verified arithmetic. The focus of this work is on the application of Taylor Models in algorithms for strongly non-linear dynamical systems. A method is proposed to extend the existing implementation of Taylor Models in COSY INFINITY from double precision coefficients to arbitrary precision coefficients. Great care is taken to maintain the highest efficiency possible by adaptively adjusting the precision of higher order coefficients in the polynomial expansion. High precision operations are based on clever combinations of elementary floating point operations yielding exact values for round-off errors. An experimental high precision interval data type is developed and implemented. Algorithms for the verified computation of intrinsic functions based on the High Precision Interval datatype are developed and described in detail. The application of these operations in the implementation of High Precision Taylor Models is discussed. An application of Taylor Model methods to the verification of fixed points is presented by verifying the existence of a period 15 fixed point in a near-standard Hénon map. Verification is performed using different verified methods such as double precision Taylor Models, High Precision intervals and High Precision Taylor Models. Results and performance of each method are compared. An automated rigorous fixed point finder is implemented, allowing the fully automated search for all fixed points of a function within a given domain. It returns a list of verified enclosures of each fixed point, optionally verifying uniqueness within these enclosures. An application of the fixed point finder to the rigorous analysis of beam transfer maps in accelerator physics is presented. Previous work done by Johannes Grote is extended to compute very accurate polynomial approximations to invariant manifolds of discrete maps of arbitrary dimension around hyperbolic fixed points. The algorithm presented allows for automatic removal of resonances occurring during construction. A method for the rigorous enclosure of invariant manifolds of continuous systems is introduced. Using methods developed for discrete maps, polynomial approximations of invariant manifolds of hyperbolic fixed points of ODEs are obtained. These approximations are outfitted with a sharp error bound which is verified to rigorously contain the manifolds. While we focus on the three dimensional case, verification in higher dimensions is possible using similar techniques. Integrating the resulting enclosures using the verified COSY VI integrator, the initial manifold enclosures are expanded to yield sharp enclosures of large parts of the stable and unstable manifolds. To demonstrate the effectiveness of this method, we construct enclosures of the invariant manifolds of the Lorenz system and show pictures of the resulting manifold enclosures. To the best of our knowledge, these enclosures are the largest verified enclosures of manifolds in the Lorenz system in existence.
Near-field plasmonic beam engineering with complex amplitude modulation based on metasurface
NASA Astrophysics Data System (ADS)
Song, Xu; Huang, Lingling; Sun, Lin; Zhang, Xiaomeng; Zhao, Ruizhe; Li, Xiaowei; Wang, Jia; Bai, Benfeng; Wang, Yongtian
2018-02-01
Metasurfaces have recently attracted extensive interest due to their ability to locally manipulate electromagnetic waves, which provides great feasibility for tailoring both propagating waves and surface plasmon polaritons (SPPs). Manipulation of SPPs with arbitrary complex fields is an important issue in integrated nanophotonics due to their capability of guiding waves with subwavelength footprints. Here, an approach based on metasurfaces composed of nanoaperture arrays is proposed and experimentally demonstrated which can effectively manipulate the complex amplitude of SPPs in the near-field regime. By tailoring the azimuthal angles of individual nanoapertures and simultaneously tuning their geometric parameters, the phase and amplitude are controlled through the Pancharatnam-Berry phases and the individual transmission coefficients. To verify the concept, Airy plasmons and axisymmetric Airy-SPPs are generated. The results of numerical simulations and near-field imaging are consistent with each other. In addition to the rigorous simulations, we apply a 2D dipole analysis as a complementary check. This strategy of complex amplitude manipulation with metasurfaces can be used for potential applications in plasmonic beam shaping, integrated optoelectronic systems, and surface wave holography.
Modeling driver behavior in a cognitive architecture.
Salvucci, Dario D
2006-01-01
This paper explores the development of a rigorous computational model of driver behavior in a cognitive architecture--a computational framework with underlying psychological theories that incorporate basic properties and limitations of the human system. Computational modeling has emerged as a powerful tool for studying the complex task of driving, allowing researchers to simulate driver behavior and explore the parameters and constraints of this behavior. An integrated driver model developed in the ACT-R (Adaptive Control of Thought-Rational) cognitive architecture is described that focuses on the component processes of control, monitoring, and decision making in a multilane highway environment. This model accounts for the steering profiles, lateral position profiles, and gaze distributions of human drivers during lane keeping, curve negotiation, and lane changing. The model demonstrates how cognitive architectures facilitate understanding of driver behavior in the context of general human abilities and constraints and how the driving domain benefits cognitive architectures by pushing model development toward more complex, realistic tasks. The model can also serve as a core computational engine for practical applications that predict and recognize driver behavior and distraction.
Swartz, R. Andrew
2013-01-01
This paper investigates the time series representation methods and similarity measures for sensor data feature extraction and structural damage pattern recognition. Both model-based time series representation and dimensionality reduction methods are studied to compare the effectiveness of feature extraction for damage pattern recognition. The evaluation of feature extraction methods is performed by examining the separation of feature vectors among different damage patterns and the pattern recognition success rate. In addition, the impact of similarity measures on the pattern recognition success rate and the metrics for damage localization are also investigated. The test data used in this study are from the System Identification to Monitor Civil Engineering Structures (SIMCES) Z24 Bridge damage detection tests, a rigorous instrumentation campaign that recorded the dynamic performance of a concrete box-girder bridge under progressively increasing damage scenarios. A number of progressive damage test case datasets and damage test data with different damage modalities are used. The simulation results show that both time series representation methods and similarity measures have significant impact on the pattern recognition success rate. PMID:24191136
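A minimal example of model-based time series feature extraction with a simple similarity measure is sketched below: autoregressive coefficients are fitted by least squares and a test signal is classified by its nearest reference feature vector. The signals and labels are synthetic stand-ins, not the Z24 Bridge data, and the AR order is an arbitrary choice.

```python
import numpy as np

def ar_features(signal, order=4):
    """Fit an AR(order) model by least squares and return the coefficients
    as a damage-sensitive feature vector (one simple model-based representation)."""
    x = np.asarray(signal, float)
    X = np.column_stack([x[order - k - 1 : len(x) - k - 1] for k in range(order)])
    y = x[order:]
    coefs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coefs

def classify(feature, references):
    """Nearest-neighbor pattern recognition with a Euclidean similarity measure."""
    return min(references, key=lambda lbl: np.linalg.norm(feature - references[lbl]))

rng = np.random.default_rng(0)
healthy = ar_features(np.sin(0.20 * np.arange(500)) + 0.05 * rng.normal(size=500))
damaged = ar_features(np.sin(0.35 * np.arange(500)) + 0.05 * rng.normal(size=500))
test = ar_features(np.sin(0.34 * np.arange(500)) + 0.05 * rng.normal(size=500))
print(classify(test, {"healthy": healthy, "damaged": damaged}))
```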
Hard Constraints in Optimization Under Uncertainty
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Giesy, Daniel P.; Kenny, Sean P.
2008-01-01
This paper proposes a methodology for the analysis and design of systems subject to parametric uncertainty where design requirements are specified via hard inequality constraints. Hard constraints are those that must be satisfied for all parameter realizations within a given uncertainty model. Uncertainty models given by norm-bounded perturbations from a nominal parameter value, i.e., hyper-spheres, and by sets of independently bounded uncertain variables, i.e., hyper-rectangles, are the focus of this paper. These models, which are also quite practical, allow for a rigorous mathematical treatment within the proposed framework. Hard constraint feasibility is determined by sizing the largest uncertainty set for which the design requirements are satisfied. Analytically verifiable assessments of robustness are attained by comparing this set with the actual uncertainty model. Strategies that enable the comparison of the robustness characteristics of competing design alternatives, the description and approximation of the robust design space, and the systematic search for designs with improved robustness are also proposed. Since the problem formulation is generic and the tools derived only require standard optimization algorithms for their implementation, this methodology is applicable to a broad range of engineering problems.
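The core computation, sizing the largest uncertainty set for which a hard constraint holds everywhere, can be sketched for a hyper-rectangular uncertainty model as a bisection on a scale factor. The example below assumes an affine requirement so that the worst case over the box occurs at a vertex; the constraint, nominal values, and bisection bracket are illustrative, and the paper's framework is considerably more general.

```python
import itertools
import numpy as np

def worst_case_over_box(g, center, half_widths):
    """Max of g over the hyper-rectangle center ± half_widths, evaluated at the
    vertices. Exact for affine constraints; a general constraint would need a
    rigorous global bound instead."""
    best = -np.inf
    for signs in itertools.product((-1.0, 1.0), repeat=len(center)):
        p = center + np.array(signs) * half_widths
        best = max(best, g(p))
    return best

def largest_feasible_scaling(g, center, nominal_half_widths, tol=1e-4):
    """Bisection on alpha: the largest alpha such that g(p) <= 0 holds
    for all p in center ± alpha * nominal_half_widths."""
    lo, hi = 0.0, 4.0                      # assumed bracket for this toy problem
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if worst_case_over_box(g, center, mid * nominal_half_widths) <= 0.0:
            lo = mid
        else:
            hi = mid
    return lo

# affine requirement 2*p1 + 3*p2 - 5 <= 0, nominal parameters p = (1, 0.5)
g = lambda p: 2.0 * p[0] + 3.0 * p[1] - 5.0
alpha = largest_feasible_scaling(g, np.array([1.0, 0.5]), np.array([0.2, 0.2]))
print(f"largest admissible scaling ~ {alpha:.3f}")   # expect about 1.5
```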
NASA Technical Reports Server (NTRS)
Watson, Robert A.
1991-01-01
Approximate solutions of static and dynamic beam problems by the p-version of the finite element method are investigated. Within a hierarchy of engineering beam idealizations, rigorous formulations of the strain and kinetic energies for straight and circular beam elements are presented. These formulations include rotating coordinate system effects and geometric nonlinearities to allow for the evaluation of vertical axis wind turbines, the motivating problem for this research. Hierarchic finite element spaces, based on extensions of the polynomial orders used to approximate the displacement variables, are constructed. The developed models are implemented into a general purpose computer program for evaluation. Quality control procedures are examined for a diverse set of sample problems. These procedures include estimating discretization errors in energy norm and natural frequencies, performing static and dynamic equilibrium checks, observing convergence for qualities of interest, and comparison with more exacting theories and experimental data. It is demonstrated that p-extensions produce exponential rates of convergence in the approximation of strain energy and natural frequencies for the class of problems investigated.
NASA Astrophysics Data System (ADS)
Lee, Hyomin; Jung, Yeonsu; Park, Sungmin; Kim, Ho-Young; Kim, Sung Jae
2016-11-01
Generally, an ion depletion region near a permselective medium is induced by a predominant ion flux through the medium. External electric fields or hydraulic pressure have been reported as the driving forces. Among these driving forces, imbibition through the nanoporous medium was chosen as the mechanism to spontaneously generate the ion depletion region. The water-absorbing process leads to the predominant ion flux, so that spontaneous formation of the ion depletion zone is expected even if there are no additional driving forces beyond the inherent capillary action. In this presentation, we derive analytical solutions for the spontaneous phenomenon using a perturbation method and asymptotic analysis. Using the analysis, we found that there is also a spontaneous accumulation regime depending on the mobility of the dissolved electrolytic species. Therefore, the rigorous analysis of the spontaneous ion depletion and accumulation phenomena would provide a key perspective for the control of ion transport in nanofluidic systems such as desalinators, preconcentrators, and energy harvesting devices. Samsung Research Funding Center of Samsung Electronics (SRFC-MA1301-02) and BK21 plus program of Creative Research Engineer Development IT, Seoul National University.
ECUT (Energy Conversion and Utilization Technologies) program: Biocatalysis project
NASA Technical Reports Server (NTRS)
Baresi, Larry
1989-01-01
The Annual Report presents the fiscal year (FY) 1988 research activities and accomplishments, for the Biocatalysis Project of the U.S. Department of Energy, Energy Conversion and Utilization Technologies (ECUT) Division. The ECUT Biocatalysis Project is managed by the Jet Propulsion Laboratory, California Institute of Technology. The Biocatalysis Project is a mission-oriented, applied research and exploratory development activity directed toward resolution of the major generic technical barriers that impede the development of biologically catalyzed commercial chemical production. The approach toward achieving project objectives involves an integrated participation of universities, industrial companies and government research laboratories. The Project's technical activities were organized into three work elements: (1) The Molecular Modeling and Applied Genetics work element includes research on modeling of biological systems, developing rigorous methods for the prediction of three-dimensional (tertiary) protein structure from the amino acid sequence (primary structure) for designing new biocatalysis, defining kinetic models of biocatalyst reactivity, and developing genetically engineered solutions to the generic technical barriers that preclude widespread application of biocatalysis. (2) The Bioprocess Engineering work element supports efforts in novel bioreactor concepts that are likely to lead to substantially higher levels of reactor productivity, product yields and lower separation energetics. Results of work within this work element will be used to establish the technical feasibility of critical bioprocess monitoring and control subsystems. (3) The Bioprocess Design and Assessment work element attempts to develop procedures (via user-friendly computer software) for assessing the energy-economics of biocatalyzed chemical production processes, and initiation of technology transfer for advanced bioprocesses.
ECUT (Energy Conversion and Utilization Technologies) program: Biocatalysis project
NASA Astrophysics Data System (ADS)
Baresi, Larry
1989-03-01
The Annual Report presents the fiscal year (FY) 1988 research activities and accomplishments, for the Biocatalysis Project of the U.S. Department of Energy, Energy Conversion and Utilization Technologies (ECUT) Division. The ECUT Biocatalysis Project is managed by the Jet Propulsion Laboratory, California Institute of Technology. The Biocatalysis Project is a mission-oriented, applied research and exploratory development activity directed toward resolution of the major generic technical barriers that impede the development of biologically catalyzed commercial chemical production. The approach toward achieving project objectives involves an integrated participation of universities, industrial companies and government research laboratories. The Project's technical activities were organized into three work elements: (1) The Molecular Modeling and Applied Genetics work element includes research on modeling of biological systems, developing rigorous methods for the prediction of three-dimensional (tertiary) protein structure from the amino acid sequence (primary structure) for designing new biocatalysis, defining kinetic models of biocatalyst reactivity, and developing genetically engineered solutions to the generic technical barriers that preclude widespread application of biocatalysis. (2) The Bioprocess Engineering work element supports efforts in novel bioreactor concepts that are likely to lead to substantially higher levels of reactor productivity, product yields and lower separation energetics. Results of work within this work element will be used to establish the technical feasibility of critical bioprocess monitoring and control subsystems. (3) The Bioprocess Design and Assessment work element attempts to develop procedures (via user-friendly computer software) for assessing the energy-economics of biocatalyzed chemical production processes, and initiation of technology transfer for advanced bioprocesses.
Krompecher, T
1981-01-01
Objective measurements were carried out to study the evolution of rigor mortis in rats at various temperatures. Our experiments showed that: (1) at 6 degrees C rigor mortis reaches full development between 48 and 60 hours post mortem, and is resolved at 168 hours post mortem; (2) at 24 degrees C rigor mortis reaches full development at 5 hours post mortem, and is resolved at 16 hours post mortem; (3) at 37 degrees C rigor mortis reaches full development at 3 hours post mortem, and is resolved at 6 hours post mortem; (4) the intensity of rigor mortis grows with increase in temperature (difference between values obtained at 24 degrees C and 37 degrees C); and (5) at 6 degrees C a "cold rigidity" was found, in addition to and independent of rigor mortis.
Miniaturized pulsed laser source for time-domain diffuse optics routes to wearable devices.
Di Sieno, Laura; Nissinen, Jan; Hallman, Lauri; Martinenghi, Edoardo; Contini, Davide; Pifferi, Antonio; Kostamovaara, Juha; Mora, Alberto Dalla
2017-08-01
We validate a miniaturized pulsed laser source for use in time-domain (TD) diffuse optics, following rigorous and shared protocols for performance assessment of this class of devices. This compact source (12 × 6 mm²) was previously developed for range-finding applications and is able to provide short, high-energy (∼100 ps, ∼0.5 nJ) optical pulses at up to 1 MHz repetition rate. Here, we start with a basic laser characterization and an analysis of the suitability of this laser for the diffuse optics application. Then, we present a TD optical system using this source and its performance both in recovering the optical properties of tissue-mimicking homogeneous phantoms and in detecting localized absorption perturbations. Finally, as a proof of concept of in vivo application, we demonstrate that the system is able to detect hemodynamic changes occurring in the arm of healthy volunteers during a venous occlusion. Squeezing the laser source into a small footprint removes a key technological bottleneck that has so far hampered the realization of a miniaturized TD diffuse optics system able to compete with already established continuous-wave devices in terms of size and cost, but with wider performance potential, as demonstrated by research over the last two decades. (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
Elf, Johan
2016-04-27
A new, game-changing approach makes it possible to rigorously disprove models without making assumptions about the unknown parts of the biological system. Copyright © 2016 Elsevier Inc. All rights reserved.
Towards Identifying and Reducing the Bias of Disease Information Extracted from Search Engine Data
Huang, Da-Cang; Wang, Jin-Feng; Huang, Ji-Xia; Sui, Daniel Z.; Zhang, Hong-Yan; Hu, Mao-Gui; Xu, Cheng-Dong
2016-01-01
The estimation of disease prevalence in online search engine data (e.g., Google Flu Trends (GFT)) has received a considerable amount of scholarly and public attention in recent years. While the utility of search engine data for disease surveillance has been demonstrated, the scientific community still seeks ways to identify and reduce biases that are embedded in search engine data. The primary goal of this study is to explore new ways of improving the accuracy of disease prevalence estimations by combining traditional disease data with search engine data. A novel method, Biased Sentinel Hospital-based Area Disease Estimation (B-SHADE), is introduced to reduce search engine data bias from a geographical perspective. To monitor search trends on Hand, Foot and Mouth Disease (HFMD) in Guangdong Province, China, we tested our approach by selecting 11 keywords from the Baidu index platform, a Chinese big data analyst similar to GFT. The correlation between the number of real cases and the composite index was 0.8. After decomposing the composite index at the city level, we found that only 10 cities presented a correlation of close to 0.8 or higher. These cities were found to be more stable with respect to search volume, and they were selected as sample cities in order to estimate the search volume of the entire province. After the estimation, the correlation improved from 0.8 to 0.864. After fitting the revised search volume with historical cases, the mean absolute error was 11.19% lower than it was when the original search volume and historical cases were combined. To our knowledge, this is the first study to reduce search engine data bias levels through the use of rigorous spatial sampling strategies. PMID:27271698
Towards Identifying and Reducing the Bias of Disease Information Extracted from Search Engine Data.
Huang, Da-Cang; Wang, Jin-Feng; Huang, Ji-Xia; Sui, Daniel Z; Zhang, Hong-Yan; Hu, Mao-Gui; Xu, Cheng-Dong
2016-06-01
The estimation of disease prevalence in online search engine data (e.g., Google Flu Trends (GFT)) has received a considerable amount of scholarly and public attention in recent years. While the utility of search engine data for disease surveillance has been demonstrated, the scientific community still seeks ways to identify and reduce biases that are embedded in search engine data. The primary goal of this study is to explore new ways of improving the accuracy of disease prevalence estimations by combining traditional disease data with search engine data. A novel method, Biased Sentinel Hospital-based Area Disease Estimation (B-SHADE), is introduced to reduce search engine data bias from a geographical perspective. To monitor search trends on Hand, Foot and Mouth Disease (HFMD) in Guangdong Province, China, we tested our approach by selecting 11 keywords from the Baidu index platform, a Chinese big data analyst similar to GFT. The correlation between the number of real cases and the composite index was 0.8. After decomposing the composite index at the city level, we found that only 10 cities presented a correlation of close to 0.8 or higher. These cities were found to be more stable with respect to search volume, and they were selected as sample cities in order to estimate the search volume of the entire province. After the estimation, the correlation improved from 0.8 to 0.864. After fitting the revised search volume with historical cases, the mean absolute error was 11.19% lower than it was when the original search volume and historical cases were combined. To our knowledge, this is the first study to reduce search engine data bias levels through the use of rigorous spatial sampling strategies.
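The city-screening step described above can be sketched as follows: keep only cities whose search-volume series correlates with reported cases above a threshold, then use them as sentinels. The data below are synthetic and this is only the selection step; the full B-SHADE estimator additionally applies spatially informed bias-correction weights.

```python
import numpy as np

def select_sentinel_cities(search_by_city, cases, threshold=0.8):
    """Keep cities whose search-volume series correlates with reported cases at
    or above the threshold (the screening step described in the abstract)."""
    keep = {}
    for city, series in search_by_city.items():
        r = np.corrcoef(series, cases)[0, 1]
        if r >= threshold:
            keep[city] = series
    return keep

rng = np.random.default_rng(0)
weeks = 52
cases = 100 + 30 * np.sin(np.linspace(0, 4 * np.pi, weeks)) + rng.normal(0, 5, weeks)
search = {
    "city_A": cases * 1.8 + rng.normal(0, 10, weeks),   # tracks cases closely
    "city_B": rng.normal(200, 40, weeks),                # noisy, unrelated series
}
sentinels = select_sentinel_cities(search, cases)
print(sorted(sentinels))   # likely ['city_A']
```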
NASA Technical Reports Server (NTRS)
Horvath, Thomas; Splinter, Scott; Daryabeigi, Kamran; Wood, William; Schwartz, Richard; Ross, Martin
2008-01-01
High resolution calibrated infrared imagery of vehicles during hypervelocity atmospheric entry or sustained hypersonic cruise has the potential to provide flight data on the distribution of surface temperature and the state of the airflow over the vehicle. In the early 1980s NASA sought to obtain high spatial resolution infrared imagery of the Shuttle during entry. Despite mission execution with a technically rigorous pre-planning capability, the single airborne optical system for this attempt was considered developmental and the scientific return was marginal. In 2005 the Space Shuttle Program again sponsored an effort to obtain imagery of the Orbiter. Imaging requirements were targeted towards Shuttle ascent; companion requirements for entry did not exist. The engineering community was allowed to define observation goals and incrementally demonstrate key elements of a quantitative spatially resolved measurement capability over a series of flights. These imaging opportunities were extremely beneficial and clearly demonstrated capability to capture infrared imagery with mature and operational assets of the US Navy and the Missile Defense Agency. While successful, the usefulness of the imagery was, from an engineering perspective, limited. These limitations were mainly associated with uncertainties regarding operational aspects of data acquisition. These uncertainties, in turn, came about because of limited pre-flight mission planning capability, a poor understanding of several factors including the infrared signature of the Shuttle, optical hardware limitations, atmospheric effects and detector response characteristics. Operational details of sensor configuration such as detector integration time and tracking system algorithms were carried out ad hoc (best practices) which led to low probability of target acquisition and detector saturation. Leveraging from the qualified success during Return-to-Flight, the NASA Engineering and Safety Center sponsored an assessment study focused on increasing the probability of returning spatially resolved scientific/engineering thermal imagery. This paper provides an overview of the assessment task and the systematic approach designed to establish confidence in the ability of existing assets to reliably acquire, track and return global quantitative surface temperatures of the Shuttle during entry. A discussion of capability demonstration in support of a potential Shuttle boundary layer transition flight test is presented. Successful demonstration of a quantitative, spatially resolved, global temperature measurement on the proposed Shuttle boundary layer transition flight test could lead to potential future applications with hypersonic flight test programs within the USAF and DARPA along with flight test opportunities supporting NASA's project Constellation.
DOT National Transportation Integrated Search
2000-11-01
In many intelligent transportation systems (ITS) implementations, the telecommunications solution was arrived at without the kind of rigorous examination that would have accompanied similarly significant and complex technical/business choices. The pu...
NASA Technical Reports Server (NTRS)
Manford, J. S.; Bennett, G. R.
1985-01-01
The Space Station Program will incorporate analysis of operations constraints and considerations in the early design phases to avoid the need for later modifications to the Space Station for operations. The application of modern tools and administrative techniques to minimize the cost of performing effective orbital operations planning and design analysis in the preliminary design phase of the Space Station Program is discussed. Tools and techniques discussed include: approach for rigorous analysis of operations functions, use of the resources of a large computer network, and providing for efficient research and access to information.
NASA Astrophysics Data System (ADS)
Al-Ajmi, R. M.; Abou-Ziyan, H. Z.; Mahmoud, M. A.
2012-01-01
This paper reports the results of a comprehensive study aimed at identifying the best neural network architecture and parameters to predict the subcooled boiling characteristics of engine oils. A total of 57 different neural networks (NNs), derived from 14 different NN architectures, were evaluated for four different prediction cases. The NNs were trained on experimental datasets obtained from five engine oils of different chemical compositions. The performance of each NN was evaluated using a rigorous statistical analysis as well as careful examination of the smoothness of the predicted boiling curves. One NN, out of the 57 evaluated, correctly predicted the boiling curves for all cases considered, either for individual oils or for all oils taken together. It was found that the pattern selection and weight update techniques strongly affect the performance of the NNs. It was also revealed that the use of descriptive statistical analysis such as R2, mean error, standard deviation, and T and slope tests is a necessary but not sufficient condition for evaluating NN performance. The performance criteria should also include inspection of the smoothness of the predicted curves, either visually or by plotting the slopes of these curves.
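The kind of descriptive statistical screening the study applies to each network's predictions can be sketched as below. The metric set mirrors the quantities named in the abstract (R², mean error, standard deviation, slope of predicted versus measured); the acceptance thresholds and the smoothness inspection are left to the analyst, and the example numbers are arbitrary.

```python
import numpy as np

def prediction_stats(y_true, y_pred):
    """Descriptive statistics for predicted vs. measured values; a necessary
    but not sufficient screen, since curve smoothness must also be inspected."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    resid = y_pred - y_true
    ss_res = np.sum(resid ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    slope, intercept = np.polyfit(y_true, y_pred, 1)   # predicted vs. measured fit
    return {
        "r2": 1.0 - ss_res / ss_tot,
        "mean_error": resid.mean(),
        "std_error": resid.std(ddof=1),
        "slope": slope,
    }

print(prediction_stats([10, 20, 30, 40], [11, 19, 32, 39]))
```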
Using Approximations to Accelerate Engineering Design Optimization
NASA Technical Reports Server (NTRS)
Torczon, Virginia; Trosset, Michael W.
1998-01-01
Optimization problems that arise in engineering design are often characterized by several features that hinder the use of standard nonlinear optimization techniques. Foremost among these features is that the functions used to define the engineering optimization problem often are computationally intensive. Within a standard nonlinear optimization algorithm, the computational expense of evaluating the functions that define the problem would necessarily be incurred for each iteration of the optimization algorithm. Faced with such prohibitive computational costs, an attractive alternative is to make use of surrogates within an optimization context since surrogates can be chosen or constructed so that they are typically much less expensive to compute. For the purposes of this paper, we will focus on the use of algebraic approximations as surrogates for the objective. In this paper we introduce the use of so-called merit functions that explicitly recognize the desirability of improving the current approximation to the objective during the course of the optimization. We define and experiment with the use of merit functions chosen to simultaneously improve both the solution to the optimization problem (the objective) and the quality of the approximation. Our goal is to further improve the effectiveness of our general approach without sacrificing any of its rigor.
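The idea of a merit function that rewards both a low predicted objective and an improvement of the approximation can be sketched as a surrogate value penalized by the distance to already-sampled points, so that exploration of poorly modeled regions is encouraged. The weighting and distance term below are illustrative choices, not the specific merit functions defined in the paper.

```python
import numpy as np

def merit(x, surrogate, sampled_points, rho=0.5):
    """Merit = surrogate prediction minus rho times distance to the nearest
    sampled point: low values favor either a good predicted objective or a
    point that would improve the approximation."""
    d = min(np.linalg.norm(np.asarray(x) - np.asarray(p)) for p in sampled_points)
    return surrogate(x) - rho * d

surrogate = lambda x: (x[0] - 1.0) ** 2 + (x[1] + 0.5) ** 2   # cheap quadratic model
sampled = [(0.0, 0.0), (2.0, 1.0)]                            # points already evaluated
# the candidate with the lowest merit would be evaluated with the true objective next
candidates = [(1.0, -0.5), (0.1, 0.1), (3.0, -2.0)]
best = min(candidates, key=lambda c: merit(c, surrogate, sampled))
print(best)
```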
Luo, Yuehao; Yuan, Lu; Li, Jianhua; Wang, Jianshe
2015-12-01
Nature has supplied inexhaustible resources for mankind and, at the same time, has progressively become a school for scientists and engineers. Through more than four billion years of rigorous and stringent evolution, different creatures in nature have gradually developed their own special and fascinating biological functional surfaces. For example, sharkskin has a potential drag-reducing effect in turbulence, the lotus leaf possesses self-cleaning and anti-fouling functions, gecko feet have controllable super-adhesion surfaces, and the flexible skin of the dolphin can increase its swimming speed. Substantial benefits from applying biological functional surfaces in daily life, industry, transportation and agriculture have been achieved so far, and this field has attracted much attention from all over the world. In this overview, the bio-inspired drag-reducing mechanism derived from sharkskin is explained and explored comprehensively from different aspects, and the main applications in different areas of fluid engineering are then briefly demonstrated. This overview should improve comprehension of the drag-reduction mechanism of the sharkskin surface and of its recent applications in fluid engineering. Copyright © 2015 Elsevier Ltd. All rights reserved.
Tailoring optical metamaterials to tune the atom-surface Casimir-Polder interaction.
Chan, Eng Aik; Aljunid, Syed Abdullah; Adamo, Giorgio; Laliotis, Athanasios; Ducloy, Martial; Wilkowski, David
2018-02-01
Metamaterials are fascinating tools that can structure not only surface plasmons and electromagnetic waves but also electromagnetic vacuum fluctuations. The possibility of shaping the quantum vacuum is a powerful concept that ultimately allows engineering the interaction between macroscopic surfaces and quantum emitters such as atoms, molecules, or quantum dots. The long-range atom-surface interaction, known as Casimir-Polder interaction, is of fundamental importance in quantum electrodynamics but also attracts a significant interest for platforms that interface atoms with nanophotonic devices. We perform a spectroscopic selective reflection measurement of the Casimir-Polder interaction between a Cs(6P 3/2 ) atom and a nanostructured metallic planar metamaterial. We show that by engineering the near-field plasmonic resonances of the metamaterial, we can successfully tune the Casimir-Polder interaction, demonstrating both a strong enhancement and reduction with respect to its nonresonant value. We also show an enhancement of the atomic spontaneous emission rate due to its coupling with the evanescent modes of the nanostructure. Probing excited-state atoms next to nontrivial tailored surfaces is a rigorous test of quantum electrodynamics. Engineering Casimir-Polder interactions represents a significant step toward atom trapping in the extreme near field, possibly without the use of external fields.
Beaudrie, Christian E H; Kandlikar, Milind; Satterfield, Terre
2013-06-04
Engineered nanomaterials (ENMs) promise great benefits for society, yet our knowledge of potential risks and best practices for regulation are still in their infancy. Toward the end of better practices, this paper analyzes U.S. federal environmental, health, and safety (EHS) regulations using a life cycle framework. It evaluates their adequacy as applied to ENMs to identify gaps through which emerging nanomaterials may escape regulation from initial production to end-of-life. High scientific uncertainty, a lack of EHS and product data, inappropriately designed exemptions and thresholds, and limited agency resources are a challenge to both the applicability and adequacy of current regulations. The result is that some forms of engineered nanomaterials may escape federal oversight and rigorous risk review at one or more stages along their life cycle, with the largest gaps occurring at the postmarket stages, and at points of ENM release to the environment. Oversight can be improved through pending regulatory reforms, increased research and development for the monitoring, control, and analysis of environmental and end-of-life releases, introduction of periodic re-evaluation of ENM risks, and fostering a "bottom-up" stewardship approach to the responsible management of risks from engineered nanomaterials.
Emergency cricothyrotomy for trismus caused by instantaneous rigor in cardiac arrest patients.
Lee, Jae Hee; Jung, Koo Young
2012-07-01
Instantaneous rigor as muscle stiffening occurring in the moment of death (or cardiac arrest) can be confused with rigor mortis. If trismus is caused by instantaneous rigor, orotracheal intubation is impossible and a surgical airway should be secured. Here, we report 2 patients who had emergency cricothyrotomy for trismus caused by instantaneous rigor. This case report aims to help physicians understand instantaneous rigor and to emphasize the importance of securing a surgical airway quickly on the occurrence of trismus. Copyright © 2012 Elsevier Inc. All rights reserved.
A method to select human-system interfaces for nuclear power plants
Hugo, Jacques Victor; Gertman, David Ira
2015-10-19
The new generation of nuclear power plants (NPPs) will likely make use of state-of-the-art technologies in many areas of the plant. The analysis, design, and selection of advanced human–system interfaces (HSIs) constitute an important part of power plant engineering. Designers need to consider the new capabilities afforded by these technologies in the context of current regulations and new operational concepts, which is why they need a more rigorous method by which to plan the introduction of advanced HSIs in NPP work areas. Much of current human factors research stops at the user interface and fails to provide a definitive process for integration of end user devices with instrumentation and control (I&C) and operational concepts. The current lack of a clear definition of HSI technology, including the process for integration, makes characterization and implementation of new and advanced HSIs difficult. This paper describes how new design concepts in the nuclear industry can be analyzed and how HSI technologies associated with new industrial processes might be considered. Furthermore, it also describes a basis for an understanding of human as well as technology characteristics that could be incorporated into a prioritization scheme for technology selection and deployment plans.
Nanosystem self-assembly pathways discovered via all-atom multiscale analysis.
Pankavich, Stephen D; Ortoleva, Peter J
2012-07-26
We consider the self-assembly of composite structures from a group of nanocomponents, each consisting of particles within an N-atom system. Self-assembly pathways and rates for nanocomposites are derived via a multiscale analysis of the classical Liouville equation. From a reduced statistical framework, rigorous stochastic equations for population levels of beginning, intermediate, and final aggregates are also derived. It is shown that the definition of an assembly type is a self-consistency criterion that must strike a balance between precision and the need for population levels to be slowly varying relative to the time scale of atomic motion. The deductive multiscale approach is complemented by a qualitative notion of multicomponent association and the ensemble of exact atomic-level configurations consistent with them. In processes such as viral self-assembly from proteins and RNA or DNA, there are many possible intermediates, so that it is usually difficult to predict the most efficient assembly pathway. However, in the current study, rates of assembly of each possible intermediate can be predicted. This avoids the need, as in a phenomenological approach, for recalibration with each new application. The method accounts for the feedback across scales in space and time that is fundamental to nanosystem self-assembly. The theory has applications to bionanostructures, geomaterials, engineered composites, and nanocapsule therapeutic delivery systems.
NASA Astrophysics Data System (ADS)
Marx, K. D.; Edwards, C. F.
1992-12-01
The effect of the single-particle constraint on the response of phase-Doppler instruments is determined for particle flows which are spatially nonuniform and time-dependent. Poisson statistics are applied to particle positions and arrival times within the phase-Doppler probe volume to determine the probability that a particle is measured successfully. It is shown that the single-particle constraint can be viewed as applying spatial and temporal filters to the particle flow. These filters have the same meaning as those that were defined previously for uniform, steady-state sprays, but in space- and time-dependent form. Criteria are developed for determining when a fully inhomogeneous analysis of a flow is required and when a quasi-steady analysis will suffice. A new bias due to particle arrival time displacement is identified and the conditions under which it must be considered are established. The present work provides the means to rigorously investigate the response of phase-Doppler measurement systems to transient sprays such as those which occur in diesel engines. To this end, the results are applied to a numerical simulation of a diesel spray. The calculated hypothetical response of the ideal instrument provides a quantitative demonstration of the regimes within which measurements can accurately be made in such sprays.
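Under the homogeneous, steady-state idealization, the single-particle constraint reduces to a simple Poisson statement: a detected particle yields a valid measurement only if no other particle occupies the probe volume. The sketch below evaluates that probability; the number density and probe volume are arbitrary illustrative values, and the space- and time-dependent filters developed in the paper generalize this quantity to transient sprays.

```python
import math

def single_particle_probability(number_density, probe_volume):
    """Probability that a detected particle has the probe volume to itself,
    assuming spatially Poisson-distributed particles: the chance that no other
    particle occupies the volume is exp(-n*V) (steady, homogeneous idealization)."""
    return math.exp(-number_density * probe_volume)

# e.g. 1e11 droplets per m^3 and a 1e-12 m^3 probe volume
print(single_particle_probability(1e11, 1e-12))   # about 0.905
```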
Making a Reliable Actuator Faster and More Affordable
NASA Technical Reports Server (NTRS)
2005-01-01
Before any rocket is allowed to fly and be used for a manned mission, it is first test-fired on a static test stand to verify its flight readiness. NASA's Stennis Space Center provides testing of Space Shuttle Main Engines, rocket propulsion systems, and related components with several test facilities. It has been NASA's test-launch site since 1961. The testing stations age with time and repeated use; with aging comes maintenance, and with maintenance comes expense. NASA has been seeking ways to lower the cost of maintaining the stations, and has aided in the development of an improved, reliable linear actuator that arrives onsite quickly and costs less money than other actuators. In general terms, a linear actuator is a servomechanism that supplies a measured amount of energy for the operation of another mechanical system. Accuracy, reliability, and speed of the actuator are critical to performance of the entire system, and these actuators are critical components of the engine test stands. An actuator was developed as part of a Dual-Use Cooperative Agreement between BAFCO, Inc., of Warminster, Pennsylvania, and Stennis. BAFCO identified four suppliers that manufactured actuator components meeting the rigorous testing standards imposed by the Space Agency and then modified these components for application on the rocket test stands. In partnership with BAFCO, the existing commercial product's size and weight were reworked, reducing cost and delivery time. Previously, these parts would cost between $20,000 and $22,000, but with the new process, they now run between $11,000 and $13,000, a substantial savings, considering NASA has already purchased over 120 of the units. Delivery time of the cost-saving actuators has also been cut from over 20 to 22 weeks to within 8 to 10 weeks. The redesigned actuator is commercially available, and the company is successfully supplying it to customers other than NASA.
Characterizing learning-through-service students in engineering by gender and academic year
NASA Astrophysics Data System (ADS)
Carberry, Adam Robert
Service is increasingly being viewed as an integral part of education nationwide. Service-based courses and programs are growing in popularity as opportunities for students to learn and experience their discipline. Widespread adoption of learning-through-service (LTS) in engineering is stymied by a lack of a body of rigorous research supporting the effectiveness of these experiences. In this study, I examine learning-through-service through a nationwide survey of engineering undergraduate and graduate students participating in a variety of LTS experiences. Students (N = 322) participating in some form of service -- service-learning courses or extra-curricular service programs -- from eighty-seven different institutions across the United States completed a survey measuring demographic information (institution, gender, academic year, age, major, and grade point average), self-perceived sources of learning (service and traditional coursework), engineering epistemological beliefs, personality traits, and self-concepts (self-efficacy, motivation, expectancy, and anxiety) toward engineering design. Responses to the survey were used to characterize engineering LTS students and identify differences in these variables in terms of gender and academic year. The overall findings were that LTS students perceived their service experience to be a beneficial source for learning professional skills and, to a lesser degree, technical skills, held moderately sophisticated engineering epistemological beliefs, and were generally outgoing, compassionate, and adventurous. Self-perceived sources of learning, epistemological beliefs, and personality traits were shown to be poor predictors of student engineering achievement. Self-efficacy, motivation, and outcome expectancy toward engineering design were generally high for all LTS students; most possessed rather low anxiety levels toward engineering design. These trends were generally consistent between genders and across the five academic years (first-year, sophomores, juniors, seniors, and graduate students) surveyed. Females had significantly more sophisticated epistemological beliefs, greater perceptions of service as a source of learning professional and technical skills, and higher anxiety toward engineering design. They also were significantly more extroverted and agreeable. Males had higher confidence, motivation, and expectancy for success toward engineering design. Across academic year it was seen that students varied in their engineering design self-concepts, except for motivation.
Software service history report
DOT National Transportation Integrated Search
2002-01-01
The safe and reliable operation of software within civil aviation systems and equipment has historically been assured through the application of rigorous design assurance applied during the software development process. Increasingly, manufacturers ar...
Birkeland, S; Akse, L
2010-01-01
Improved slaughtering procedures in the salmon industry have caused a delayed onset of rigor mortis and, thus, a potential for pre-rigor secondary processing. The aim of this study was to investigate the effect of rigor status at the time of processing on quality traits (color, texture, sensory, and microbiological) in injection-salted, cold-smoked Atlantic salmon (Salmo salar). Injection of pre-rigor fillets caused a significant (P<0.001) contraction (-7.9% ± 0.9%) on the caudal-cranial axis. No significant differences in instrumental color (a*, b*, C*, or h*), texture (hardness), or sensory traits (aroma, color, taste, and texture) were observed between pre- and post-rigor processed fillets; however, post-rigor fillets (1477 ± 38 g) had significantly (P>0.05) higher fracturability than pre-rigor fillets (1369 ± 71 g). Pre-rigor fillets were significantly (P<0.01) lighter (L*, 39.7 ± 1.0) than post-rigor fillets (37.8 ± 0.8) and had a significantly lower (P<0.05) aerobic plate count (APC), 1.4 ± 0.4 log CFU/g against 2.6 ± 0.6 log CFU/g, and psychrotrophic count (PC), 2.1 ± 0.2 log CFU/g against 3.0 ± 0.5 log CFU/g, than post-rigor processed fillets. This study showed that similar quality characteristics can be obtained in cold-smoked products processed either pre- or post-rigor when using suitable injection salting protocols and smoking techniques. © 2010 Institute of Food Technologists®
Horseshoes in a Chaotic System with Only One Stable Equilibrium
NASA Astrophysics Data System (ADS)
Huan, Songmei; Li, Qingdu; Yang, Xiao-Song
To confirm the numerically demonstrated chaotic behavior in a chaotic system with only one stable equilibrium reported by Wang and Chen, we resort to the Poincaré map technique and present a rigorous computer-assisted verification of horseshoe chaos by virtue of the topological horseshoe theory.
A Point System Approach to Secondary Classroom Management
ERIC Educational Resources Information Center
Xenos, Anthony J.
2012-01-01
This article presents guiding principles governing the design, implementation, and management of a point system to promote discipline and academic rigor in a secondary classroom. Four considerations are discussed: (1) assigning appropriate point values to integral classroom behaviors and tasks; (2) determining the relationship among consequences,…
Accountability and Virginia Public Schools, 2016-2017 School Year
ERIC Educational Resources Information Center
Virginia Department of Education, 2017
2017-01-01
This document offers a brief guide to understanding Virginia's system for holding schools accountable for raising student achievement. Virginia's accountability system supports teaching and learning by setting rigorous academic standards--known as the Standards of Learning (SOL)--and through annual statewide assessments of student achievement.…
The bacteriorhodopsin model membrane system as a prototype molecular computing element.
Hong, F T
1986-01-01
The quest for more sophisticated integrated circuits to overcome the limitation of currently available silicon integrated circuits has led to the proposal of using biological molecules as computational elements by computer scientists and engineers. While the theoretical aspect of this possibility has been pursued by computer scientists, the research and development of experimental prototypes have not been pursued with an equal intensity. In this survey, we make an attempt to examine model membrane systems that incorporate the protein pigment bacteriorhodopsin which is found in Halobacterium halobium. This system was chosen for several reasons. The pigment/membrane system is sufficiently simple and stable for rigorous quantitative study, yet at the same time sufficiently complex in molecular structure to permit alteration of this structure in an attempt to manipulate the photosignal. Several methods of forming the pigment/membrane assembly are described and the potential application to biochip design is discussed. Experimental data using these membranes and measured by a tunable voltage clamp method are presented along with a theoretical analysis based on the Gouy-Chapman diffuse double layer theory to illustrate the usefulness of this approach. It is shown that detailed layouts of the pigment/membrane assembly as well as external loading conditions can modify the time course of the photosignal in a predictable manner. Some problems that may arise in the actual implementation and manufacturing, as well as the use of existing technology in protein chemistry, immunology, and recombinant DNA technology are discussed.
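The Gouy-Chapman analysis mentioned above relates surface charge, surface potential, and electrolyte concentration in the diffuse double layer. As an illustration only, and not the authors' photosignal model, the short Python sketch below evaluates two standard double-layer quantities, the Debye screening length and the Grahame charge-potential relation; all numerical inputs are hypothetical.

```python
import numpy as np

# Illustrative Gouy-Chapman (diffuse double layer) relations at 25 C.
# All numerical inputs below are hypothetical, chosen only to show the calculation.
EPS0 = 8.854e-12      # vacuum permittivity, F/m
KB   = 1.381e-23      # Boltzmann constant, J/K
E    = 1.602e-19      # elementary charge, C
NA   = 6.022e23       # Avogadro's number, 1/mol

def debye_length(c_molar, eps_r=78.5, T=298.15, z=1):
    """Debye screening length (m) for a symmetric z:z electrolyte."""
    n0 = c_molar * 1e3 * NA                      # ions per m^3
    kappa = np.sqrt(2 * n0 * (z * E) ** 2 / (eps_r * EPS0 * KB * T))
    return 1.0 / kappa

def grahame_charge(psi0_volt, c_molar, eps_r=78.5, T=298.15, z=1):
    """Grahame equation: surface charge density (C/m^2) for surface potential psi0."""
    n0 = c_molar * 1e3 * NA
    return np.sqrt(8 * n0 * eps_r * EPS0 * KB * T) * np.sinh(z * E * psi0_volt / (2 * KB * T))

print(f"Debye length at 0.1 M: {debye_length(0.1)*1e9:.2f} nm")
print(f"Charge at -50 mV, 0.1 M: {grahame_charge(-0.05, 0.1)*1e3:.2f} mC/m^2")
```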
NASA Astrophysics Data System (ADS)
Cerroni, D.; Manservisi, S.; Pozzetti, G.
2015-11-01
In this work we investigate the potential of multi-scale engineering techniques to approach complex problems related to the biomedical and biological fields. In particular we study the interaction between blood and the blood vessel, focusing on the presence of an aneurysm. The study of each component of the cardiovascular system is very difficult because the movement of the fluid and solid is determined by the rest of the system through dynamical boundary conditions. The use of multi-scale techniques allows us to investigate the effect of the whole loop on the aneurysm dynamics. A three-dimensional fluid-structure interaction model for the aneurysm is developed and coupled to a mono-dimensional one for the remaining part of the cardiovascular system, where a zero-dimensional (point) model for the heart is provided. In this manner it is possible to achieve rigorous and quantitative investigations of the cardiovascular disease without losing the system dynamics. In order to study this biomedical problem we use a monolithic fluid-structure interaction (FSI) model where the fluid and solid equations are solved together. The use of a monolithic solver allows us to handle the convergence issues caused by large deformations. With this monolithic approach, different solid and fluid regions are treated as a single continuum and the interface conditions are automatically taken into account. In this way the iterative process characteristic of the commonly used segregated approach is no longer needed.
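To give a concrete sense of the kind of reduced-order (zero-dimensional) circulation element that can close such a multi-scale loop, the sketch below integrates a two-element Windkessel model with explicit Euler time stepping. The inflow waveform and parameter values are made up for illustration and are not taken from the work described.

```python
import numpy as np

# Minimal two-element Windkessel (0D) element: C dP/dt = Q_in(t) - P/R.
# Parameter values are illustrative only, not taken from the cited work.
R = 1.0e8      # peripheral resistance, Pa*s/m^3
C = 1.0e-8     # arterial compliance, m^3/Pa
T_beat = 0.8   # cardiac period, s

def q_in(t):
    """Toy half-sine inflow during systole (first 0.3 s of each beat), m^3/s."""
    tau = t % T_beat
    return 4.0e-4 * np.sin(np.pi * tau / 0.3) if tau < 0.3 else 0.0

dt = 1e-4
n_steps = int(5 * T_beat / dt)
P = 1.0e4  # initial pressure, Pa
for k in range(n_steps):                 # explicit Euler time stepping
    t = k * dt
    P += dt * (q_in(t) - P / R) / C

print(f"Pressure after 5 beats: {P/133.32:.1f} mmHg")
```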
PROBABILISTIC RISK ANALYSIS OF RADIOACTIVE WASTE DISPOSALS - a case study
NASA Astrophysics Data System (ADS)
Trinchero, P.; Delos, A.; Tartakovsky, D. M.; Fernandez-Garcia, D.; Bolster, D.; Dentz, M.; Sanchez-Vila, X.; Molinero, J.
2009-12-01
The storage of contaminant material in superficial or sub-superficial repositories, such as tailing piles for mine waste or disposal sites for low- and intermediate-level nuclear waste, poses a potential threat to the surrounding biosphere. These risks can be minimized by supporting decision-makers with quantitative tools capable of incorporating all sources of uncertainty within a rigorous probabilistic framework. A case study is presented in which we assess the risks associated with the superficial storage of hazardous waste close to a populated area. The intrinsic complexity of the problem, involving many events with different spatial and time scales and many uncertain parameters, is overcome by using a formal PRA (probabilistic risk assessment) procedure that allows decomposing the system into a number of key events. Hence, the failure of the system is directly linked to the potential contamination of one of the three main receptors: the underlying karst aquifer, a superficial stream that flows near the storage piles, and a protection area surrounding a number of wells used for water supply. The minimal cut sets leading to the failure of the system are obtained by defining a fault tree that incorporates different events, including the failure of the engineered system (e.g. cover of the piles) and the failure of the geological barrier (e.g. the clay layer that separates the bottom of the pile from the karst formation). Finally, the probability of failure is quantitatively assessed by combining individual independent or conditional probabilities that are computed numerically or borrowed from reliability databases.
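As an illustration of the fault-tree step described, and not the study's actual events or probabilities, the sketch below combines hypothetical minimal cut sets into a top-event probability by inclusion-exclusion, assuming independent basic events, and compares the result with the common rare-event approximation.

```python
from itertools import combinations

# Hypothetical basic-event probabilities (annual failure probabilities).
p = {"cover_failure": 0.02, "clay_layer_breach": 0.005,
     "stream_flood": 0.01, "well_capture": 0.003}

# Hypothetical minimal cut sets: the top event (receptor contamination) occurs
# if every basic event in at least one cut set occurs.
cut_sets = [{"cover_failure", "clay_layer_breach"},
            {"cover_failure", "stream_flood"},
            {"well_capture"}]

def joint_prob(events):
    """Probability that all listed basic events occur (independence assumed)."""
    prob = 1.0
    for event in events:
        prob *= p[event]
    return prob

# Exact top-event probability by inclusion-exclusion over the cut sets.
top = 0.0
for k in range(1, len(cut_sets) + 1):
    for combo in combinations(cut_sets, k):
        union = set().union(*combo)          # events needed for all cut sets in combo
        top += (-1) ** (k + 1) * joint_prob(union)

rare_event = sum(joint_prob(cs) for cs in cut_sets)   # first-order approximation
print(f"P(top event) = {top:.6f}  (rare-event approx. {rare_event:.6f})")
```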
NASA Technical Reports Server (NTRS)
Callier, Frank M.; Desoer, Charles A.
1991-01-01
The aim of this book is to provide systematic and rigorous access to the main topics of linear state-space system theory, in both the continuous-time and discrete-time cases, and to the I/O description of linear systems. The main thrusts of the work are the analysis of system descriptions and derivations of their properties, LQ-optimal control, state feedback and state estimation, and MIMO unity-feedback systems.
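A minimal numerical companion to the state-space description covered by the book, with arbitrary example matrices rather than anything from the text: simulate x[k+1] = A x[k] + B u[k] under a hand-picked state-feedback gain and check the closed-loop eigenvalues.

```python
import numpy as np

# Arbitrary example system (a sampled double integrator), illustrating the
# discrete-time description x[k+1] = A x[k] + B u[k],  y[k] = C x[k].
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
B = np.array([[0.005],
              [0.1]])
C = np.array([[1.0, 0.0]])

K = np.array([[16.0, 5.6]])      # state-feedback gain (chosen by hand, not optimal)

x = np.array([[1.0],             # initial position
              [0.0]])            # initial velocity
for k in range(50):
    u = -K @ x                   # state feedback u[k] = -K x[k]
    x = A @ x + B @ u
    y = C @ x

print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
print("state after 50 steps:", x.ravel())
```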
Kim, Hyun-Wook; Hwang, Ko-Eun; Song, Dong-Heon; Kim, Yong-Jae; Ham, Youn-Kyung; Yeo, Eui-Joo; Jeong, Tae-Jun; Choi, Yun-Sang; Kim, Cheon-Jei
2015-01-01
This study was conducted to evaluate the effect of pre-rigor salting level (0-4% NaCl concentration) on physicochemical and textural properties of pre-rigor chicken breast muscles. The pre-rigor chicken breast muscles were de-boned 10 min post-mortem and salted within 25 min post-mortem. An increase in pre-rigor salting level led to the formation of a high ultimate pH of chicken breast muscles at post-mortem 24 h. The addition of a minimum of 2% NaCl significantly improved water holding capacity, cooking loss, protein solubility, and hardness when compared to the non-salted chicken breast muscle (p<0.05). On the other hand, the increase in pre-rigor salting level caused the inhibition of myofibrillar protein degradation and the acceleration of lipid oxidation. However, NaCl concentrations of 3% and 4% showed no great differences in physicochemical and textural properties attributable to pre-rigor salting effects (p>0.05). Therefore, our study confirmed the pre-rigor salting effect of chicken breast muscle salted with 2% NaCl when compared to post-rigor muscle salted with an equal NaCl concentration, and suggests that a 2% NaCl concentration is the minimum required to ensure a definite pre-rigor salting effect on chicken breast muscle.
Hardy, Micael; Zielonka, Jacek; Karoui, Hakim; Sikora, Adam; Michalski, Radosław; Podsiadły, Radosław; Lopez, Marcos; Vasquez-Vivar, Jeannette; Kalyanaraman, Balaraman; Ouari, Olivier
2018-05-20
Since the discovery of the superoxide dismutase enzyme, the generation and fate of short-lived oxidizing, nitrosating, nitrating, and halogenating species in biological systems has been of great interest. Despite the significance of reactive oxygen species (ROS) and reactive nitrogen species (RNS) in numerous diseases and intracellular signaling, the rigorous detection of ROS and RNS has remained a challenge. Recent Advances: Chemical characterization of the reactions of selected ROS and RNS with electron paramagnetic resonance (EPR) spin traps and fluorescent probes led to the establishment of species-specific products, which can be used for specific detection of several forms of ROS and RNS in cell-free systems and in cultured cells in vitro and in animals in vivo. Profiling oxidation products from the ROS and RNS probes provides a rigorous method for detection of those species in biological systems. Formation and detection of species-specific products from the probes enables accurate characterization of the oxidative environment in cells. Measurement of the total signal (fluorescence, chemiluminescence, etc.) intensity does not allow for identification of the ROS/RNS formed. It is critical to identify the products formed by using chromatographic or other rigorous techniques. Product analyses should be accompanied by monitoring of the intracellular probe level, another factor controlling the yield of the product(s) formed. More work is required to characterize the chemical reactivity of the ROS/RNS probes, and to develop new probes/detection approaches enabling real-time, selective monitoring of the specific products formed from the probes. Antioxid. Redox Signal. 28, 1416-1432.
Simic, Vladimir
2016-06-01
As the number of end-of-life vehicles (ELVs) is estimated to increase to 79.3 million units per year by 2020 (e.g., 40 million units were generated in 2010), there is strong motivation to effectively manage this fast-growing waste flow. Intensive work on the management of ELVs is necessary in order to more successfully tackle this important environmental challenge. This paper proposes an interval-parameter chance-constraint programming model for end-of-life vehicle management under rigorous environmental regulations. The proposed model can incorporate various types of uncertain information in the modeling process. The complex relationships between different ELV management sub-systems are successfully addressed. In particular, the formulated model can help identify optimal patterns of procurement from multiple sources of ELV supply, production and inventory planning in multiple vehicle recycling factories, and allocation of sorted material flows to multiple final destinations under rigorous environmental regulations. A case study is conducted in order to demonstrate the potential and applicability of the proposed model. Various constraint-violation probability levels are examined in detail. Influences of parameter uncertainty on model solutions are thoroughly investigated. Useful solutions for the management of ELVs are obtained under different probabilities of violating system constraints. The formulated model is able to tackle a hard ELV management problem under uncertainty. The presented model has advantages in providing a basis for determining long-term ELV management plans with desired compromises between the economic efficiency of the vehicle recycling system and system-reliability considerations. The results are helpful for supporting the generation and improvement of ELV management plans. Copyright © 2016 Elsevier Ltd. All rights reserved.
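To illustrate the chance-constraint idea in the simplest possible setting (a toy example, not the paper's interval-parameter ELV model), the sketch below converts a single linear chance constraint with a normally distributed right-hand side into its deterministic equivalent and solves the resulting linear program for several allowed violation probabilities.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import linprog

# Toy chance-constrained allocation: maximize profit c @ x subject to
#   Pr( a @ x <= b ) >= 1 - alpha,  with b ~ Normal(mu_b, sigma_b).
# Under normality this is equivalent to the deterministic constraint
#   a @ x <= mu_b + sigma_b * norm.ppf(alpha).
# All numbers below are made up for illustration, not from the cited model.
c = np.array([-40.0, -25.0])     # negated profits: linprog minimizes
a = np.array([3.0, 2.0])         # processing hours per vehicle type
mu_b, sigma_b = 120.0, 10.0      # uncertain capacity (hours)

for alpha in (0.5, 0.1, 0.01):   # allowed probability of violating the constraint
    rhs = mu_b + sigma_b * norm.ppf(alpha)
    res = linprog(c, A_ub=[a], b_ub=[rhs], bounds=[(0, None), (0, None)])
    print(f"alpha={alpha}: capacity rhs={rhs:6.1f}, x={res.x.round(2)}, profit={-res.fun:.1f}")
```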
Experiment for validation of fluid-structure interaction models and algorithms.
Hessenthaler, A; Gaddum, N R; Holub, O; Sinkus, R; Röhrle, O; Nordsletten, D
2017-09-01
In this paper a fluid-structure interaction (FSI) experiment is presented. The aim of this experiment is to provide a challenging yet easy-to-setup FSI test case that addresses the need for rigorous testing of FSI algorithms and modeling frameworks. Steady-state and periodic steady-state test cases with constant and periodic inflow were established. Focus of the experiment is on biomedical engineering applications with flow being in the laminar regime with Reynolds numbers 1283 and 651. Flow and solid domains were defined using computer-aided design (CAD) tools. The experimental design aimed at providing a straightforward boundary condition definition. Material parameters and mechanical response of a moderately viscous Newtonian fluid and a nonlinear incompressible solid were experimentally determined. A comprehensive data set was acquired by using magnetic resonance imaging to record the interaction between the fluid and the solid, quantifying flow and solid motion. Copyright © 2016 The Authors. International Journal for Numerical Methods in Biomedical Engineering published by John Wiley & Sons Ltd.
Bioprinting: an assessment based on manufacturing readiness levels.
Wu, Changsheng; Wang, Ben; Zhang, Chuck; Wysk, Richard A; Chen, Yi-Wen
2017-05-01
Over the last decade, bioprinting has emerged as a promising technology in the fields of tissue engineering and regenerative medicine. With recent advances in additive manufacturing, bioprinting is poised to provide patient-specific therapies and new approaches for tissue and organ studies, drug discoveries and even food manufacturing. Manufacturing Readiness Level (MRL) is a method that has been applied to assess manufacturing maturity and to identify risks and gaps in technology-manufacturing transitions. Technology Readiness Level (TRL) is used to evaluate the maturity of a technology. This paper reviews recent advances in bioprinting following the MRL scheme and addresses corresponding MRL levels of engineering challenges and gaps associated with the translation of bioprinting from lab-bench experiments to ultimate full-scale manufacturing of tissues and organs. According to our step-by-step TRL and MRL assessment, after years of rigorous investigation by the biotechnology community, bioprinting is on the cusp of entering the translational phase where laboratory research practices can be scaled up into manufacturing products specifically designed for individual patients.
Sustainability of algae derived biodiesel: a mass balance approach.
Pfromm, Peter H; Amanor-Boadu, Vincent; Nelson, Richard
2011-01-01
A rigorous chemical engineering mass balance/unit operations approach is applied here to biodiesel from algae mass culture. The target is an equivalent of 50,000,000 gallons per year (0.006002 m3/s) of petroleum-based Number 2 fuel oil (US diesel for compression-ignition engines, about 0.1% of annual US consumption) from oleaginous algae. According to this analysis, methyl algaeate and ethyl algaeate diesel can conceptually be produced largely in a technologically sustainable way, albeit at a lower available diesel yield. About 11 square miles of algae ponds would be needed under optimistic assumptions of 50 g biomass yield per day per m2 of pond area. CO2 to foster algae growth should be supplied from a sustainable source such as biomass-based ethanol production. Reliance on fossil-based CO2 from power plants or fertilizer production renders algae diesel non-sustainable in the long term. Copyright © 2010 Elsevier Ltd. All rights reserved.
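A back-of-envelope check of the quoted ~11 square miles is sketched below; the fuel density and overall biomass-to-diesel yield are explicitly assumed values (they are not stated in the abstract), so this is only a plausibility check, not the paper's detailed mass balance.

```python
# Back-of-envelope check of the quoted pond area (all conversion factors are
# assumptions for illustration, not the paper's detailed mass balance).
GAL_TO_M3 = 3.7854e-3
diesel_volume = 50e6 * GAL_TO_M3                       # m^3 of diesel-equivalent per year
diesel_density = 850.0                                 # kg/m^3 (assumed for No. 2 fuel oil)
diesel_mass = diesel_volume * diesel_density           # kg/yr

oil_to_diesel_yield = 0.30                             # assumed kg diesel per kg dry biomass
biomass_needed = diesel_mass / oil_to_diesel_yield     # kg dry biomass per year

areal_productivity = 0.050 * 365                       # kg dry biomass per m^2 per year (50 g/m^2/day)
pond_area_m2 = biomass_needed / areal_productivity
print(f"Pond area: {pond_area_m2/2.59e6:.1f} square miles")   # roughly 11 mi^2
```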
A framework for evolutionary systems biology
Loewe, Laurence
2009-01-01
Background Many difficult problems in evolutionary genomics are related to mutations that have weak effects on fitness, as the consequences of mutations with large effects are often simple to predict. Current systems biology has accumulated much data on mutations with large effects and can predict the properties of knockout mutants in some systems. However experimental methods are too insensitive to observe small effects. Results Here I propose a novel framework that brings together evolutionary theory and current systems biology approaches in order to quantify small effects of mutations and their epistatic interactions in silico. Central to this approach is the definition of fitness correlates that can be computed in some current systems biology models employing the rigorous algorithms that are at the core of much work in computational systems biology. The framework exploits synergies between the realism of such models and the need to understand real systems in evolutionary theory. This framework can address many longstanding topics in evolutionary biology by defining various 'levels' of the adaptive landscape. Addressed topics include the distribution of mutational effects on fitness, as well as the nature of advantageous mutations, epistasis and robustness. Combining corresponding parameter estimates with population genetics models raises the possibility of testing evolutionary hypotheses at a new level of realism. Conclusion EvoSysBio is expected to lead to a more detailed understanding of the fundamental principles of life by combining knowledge about well-known biological systems from several disciplines. This will benefit both evolutionary theory and current systems biology. Understanding robustness by analysing distributions of mutational effects and epistasis is pivotal for drug design, cancer research, responsible genetic engineering in synthetic biology and many other practical applications. PMID:19239699
Soft network materials with isotropic negative Poisson's ratios over large strains.
Liu, Jianxing; Zhang, Yihui
2018-01-31
Auxetic materials with negative Poisson's ratios have important applications across a broad range of engineering areas, such as biomedical devices, aerospace engineering and automotive engineering. A variety of design strategies have been developed to achieve artificial auxetic materials with controllable responses in the Poisson's ratio. The development of designs that can offer isotropic negative Poisson's ratios over large strains can open up new opportunities in emerging biomedical applications, which, however, remains a challenge. Here, we introduce deterministic routes to soft architected materials that can be tailored precisely to yield the values of Poisson's ratio in the range from -1 to 1, in an isotropic manner, with a tunable strain range from 0% to ∼90%. The designs rely on a network construction in a periodic lattice topology, which incorporates zigzag microstructures as building blocks to connect lattice nodes. Combined experimental and theoretical studies on broad classes of network topologies illustrate the wide-ranging utility of these concepts. Quantitative mechanics modeling under both infinitesimal and finite deformations allows the development of a rigorous design algorithm that determines the necessary network geometries to yield target Poisson ratios over desired strain ranges. Demonstrative examples in artificial skin with both the negative Poisson's ratio and the nonlinear stress-strain curve precisely matching those of the cat's skin and in unusual cylindrical structures with engineered Poisson effect and shape memory effect suggest potential applications of these network materials.
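For readers unfamiliar with the quantity being tailored, the tiny sketch below extracts an incremental (tangent) Poisson's ratio from a strain history. The data are synthetic and the auxetic response is idealized; nothing here is taken from the paper's designs.

```python
import numpy as np

# Synthetic strain history for illustration: a material idealized so that the
# transverse strain tracks -nu * axial strain with nu = -0.5 (auxetic) up to 60%.
axial = np.linspace(0.0, 0.60, 61)            # engineering strain
transverse = 0.5 * axial                      # expands laterally when stretched

# Incremental (tangent) Poisson's ratio: nu = -d(eps_trans)/d(eps_axial)
nu_tangent = -np.gradient(transverse, axial)
print("tangent Poisson's ratio over the strain range:",
      nu_tangent.min().round(3), "to", nu_tangent.max().round(3))
```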
NASA Astrophysics Data System (ADS)
Kannape, Oliver Alan; Lenggenhager, Bigna
2016-03-01
From brain-computer interfaces to wearable robotics and bionic prostheses - intelligent assistive devices have already become indispensable in the therapy of people living with reduced sensorimotor functioning of their physical body, be it due to spinal cord injury, amputation or brain lesions [1]. Rapid technological advances will continue to fuel this field for years to come. As Pazzaglia and Molinari [2] rightly point out, progress in this domain should not solely be driven by engineering prowess, but utilize the increasing psychological and neuroscientific understanding of cortical body-representations and their plasticity [3]. We argue that a core concept for such an integrated embodiment framework was introduced with the formalization of the forward model for sensorimotor control [4]. The application of engineering concepts to human movement control paved the way for rigorous computational and neuroscientific analysis. The forward model has successfully been adapted to investigate principles underlying aspects of bodily awareness such as the sense of agency in the comparator framework [5]. At the example of recent advances in lower limb prostheses, we propose a cross-disciplinary, integrated embodiment framework to investigate the sense of agency and the related sense of body ownership for such devices. The main onus now is on the engineers and cognitive scientists to embed such an approach into the design of assistive technology and its evaluation battery.
Specifying the behavior of concurrent systems
NASA Technical Reports Server (NTRS)
Furtek, F. C.
1984-01-01
A framework for rigorously specifying the behavior of concurrent systems is proposed. It is based on the view of a concurrent system as a collection of interacting processes but no assumptions are made about the mechanisms for process synchronization and communication. A formal language is described that permits the expression of a broad range of logical and timing dependencies.
ERIC Educational Resources Information Center
Vineberg, Robert; Joyner, John N.
Instructional System Development (ISD) methodologies and practices were examined in the Army, Navy, Marine Corps, and Air Force, each of which prescribes the ISD system involving rigorous derivation of training requirements from job requirements, selection of instructional strategies to maximize training efficiency, and revision of instruction…
ERIC Educational Resources Information Center
Reed, Eileen; Scull, Janie; Slicker, Gerilyn; Winkler, Amber M.
2012-01-01
Rigorous standards and aligned assessments are vital tools for boosting education outcomes but they have little traction without strong accountability systems that attach consequences to performance. In this pilot study, Eileen Reed, Janie Scull, Gerilyn Slicker, and Amber Winkler lay out the essential features of such accountability systems,…
Krompecher, T; Bergerioux, C; Brandt-Casadevall, C; Gujer, H R
1983-07-01
The evolution of rigor mortis was studied in cases of nitrogen asphyxia, drowning and strangulation, as well as in fatal intoxications due to strychnine, carbon monoxide and curariform drugs, using a modified method of measurement. Our experiments demonstrated that: (1) Strychnine intoxication hastens the onset and passing of rigor mortis. (2) CO intoxication delays the resolution of rigor mortis. (3) The intensity of rigor may vary depending upon the cause of death. (4) If the stage of rigidity is to be used to estimate the time of death, it is necessary: (a) to perform a succession of objective measurements of rigor mortis intensity; and (b) to check for the possible presence of factors that could modify its development.
Rigorous electromagnetic simulation applied to alignment systems
NASA Astrophysics Data System (ADS)
Deng, Yunfei; Pistor, Thomas V.; Neureuther, Andrew R.
2001-09-01
Rigorous electromagnetic simulation with TEMPEST is used to provide benchmark data and understanding of key parameters in the design of topographical features of alignment marks. Periodic large silicon trenches are analyzed as a function of wavelength (530-800 nm), duty cycle, depth, slope and angle of incidence. The signals are well behaved except when the trench width becomes about 1 micrometer or smaller. Segmentation of the trenches to form 3D marks shows that a segmentation period of 2-5 wavelengths makes the diffraction in the (1,1) direction about 1/3 to 1/2 of that in the main first order (1,0). Transmission alignment marks for nanoimprint lithography, using the difference between the +1 and -1 reflected orders, showed a sensitivity of the difference signal to misalignment of 0.7%/nm for rigorous simulation and 0.5%/nm for simple ray-tracing. The sensitivity to a slanted substrate indentation was 10 nm of offset per degree of tilt from horizontal.
Predicting Observer Training Satisfaction and Certification
ERIC Educational Resources Information Center
Bell, Courtney A.; Jones, Nathan D.; Lewis, Jennifer M.; Liu, Shuangshuang
2013-01-01
The last decade produced numerous studies that show that students learn more from high-quality teachers than they do from lower quality teachers. If instruction is to improve through the use of more rigorous teacher evaluation systems, the implementation of these systems must provide consistent and interpretable information about which aspects of…
Anticipating and Incorporating Stakeholder Feedback When Developing Value-Added Models
ERIC Educational Resources Information Center
Balch, Ryan; Koedel, Cory
2014-01-01
State and local education agencies across the United States are increasingly adopting rigorous teacher evaluation systems. Most systems formally incorporate teacher performance as measured by student test-score growth, sometimes by state mandate. An important consideration that will influence the long-term persistence and efficacy of these systems…
Create a College Access Contract
ERIC Educational Resources Information Center
Dannenberg, Michael
2007-01-01
America's financial aid system provides too much taxpayer support to banks making college loans, demands too little of students assuming them, and burdens families with too much debt. The system fails to reward rigorous college-preparatory work in high school and penalizes students who hold jobs while in college. Lenders make extraordinary…
RIGOR MORTIS AND THE INFLUENCE OF CALCIUM AND MAGNESIUM SALTS UPON ITS DEVELOPMENT.
Meltzer, S J; Auer, J
1908-01-01
Calcium salts hasten and magnesium salts retard the development of rigor mortis, that is, when these salts are administered subcutaneously or intravenously. When injected intra-arterially, concentrated solutions of both kinds of salts cause nearly an immediate onset of a strong stiffness of the muscles which is apparently a contraction, brought on by a stimulation caused by these salts and due to osmosis. This contraction, if strong, passes over without a relaxation into a real rigor. This form of rigor may be classed as work-rigor (Arbeitsstarre). In animals, at least in frogs, with intact cords, the early contraction and the following rigor are stronger than in animals with destroyed cord. If M/8 solutions (nearly equimolecular to "physiological" solutions of sodium chloride) are used, even when injected intra-arterially, calcium salts hasten and magnesium salts retard the onset of rigor. The hastening and retardation in this case, as well as in the cases of subcutaneous and intravenous injections, are ion effects and essentially due to the cations, calcium and magnesium. In the rigor hastened by calcium the effects of the extensor muscles mostly prevail; in the rigor following magnesium injection, on the other hand, either the flexor muscles prevail or the muscles become stiff in the original position of the animal at death. There seems to be no difference in the degree of stiffness in the final rigor; only the onset and development of the rigor is hastened in the case of the one salt and retarded in the other. Calcium also hastens the development of heat rigor. No positive facts were obtained with regard to the effect of magnesium upon heat rigor. Calcium also hastens and magnesium retards the onset of rigor in the left ventricle of the heart. No definite data were gathered with regard to the effects of these salts upon the right ventricle.
Manufacturing of microcomponents in a research institute under DIN EN ISO 9001
NASA Astrophysics Data System (ADS)
Maas, Dieter; Karl, Bernhard; Saile, Volker; Schulz, Joachim
2000-08-01
The Institute for Microstructure Technology at Forschungszentrum Karlsruhe has implemented a rigorous quality management system and was certified according to the DIN EN ISO 9001 standard in January 2000.
The Facility Registry System (FRS) is a centrally managed database that identifies facilities, sites or places subject to environmental regulations or of environmental interest. FRS creates high-quality, accurate, and authoritative facility identification records through rigorous...
The MINERVA Software Development Process
NASA Technical Reports Server (NTRS)
Narkawicz, Anthony; Munoz, Cesar A.; Dutle, Aaron M.
2017-01-01
This paper presents a software development process for safety-critical software components of cyber-physical systems. The process is called MINERVA, which stands for Mirrored Implementation Numerically Evaluated against Rigorously Verified Algorithms. The process relies on formal methods for rigorously validating code against its requirements. The software development process uses: (1) a formal specification language for describing the algorithms and their functional requirements, (2) an interactive theorem prover for formally verifying the correctness of the algorithms, (3) test cases that stress the code, and (4) numerical evaluation on these test cases of both the algorithm specifications and their implementations in code. The MINERVA process is illustrated in this paper with an application to geo-containment algorithms for unmanned aircraft systems. These algorithms ensure that the position of an aircraft never leaves a predetermined polygon region and provide recovery maneuvers when the region is inadvertently exited.
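The formally verified MINERVA algorithms themselves are not reproduced here. As a rough illustration of the geo-containment check described (the aircraft position must remain inside a polygon region), the sketch below implements a standard ray-casting point-in-polygon test with a hypothetical operating area; it is not the verified code.

```python
def point_in_polygon(px, py, polygon):
    """Standard ray-casting test: is (px, py) inside the closed polygon?

    `polygon` is a list of (x, y) vertices in order. This is an illustrative
    sketch of a geo-containment check, not the formally verified MINERVA code.
    """
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count crossings of a horizontal ray extending to the right of the point.
        if (y1 > py) != (y2 > py):
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < x_cross:
                inside = not inside
    return inside

region = [(0.0, 0.0), (10.0, 0.0), (10.0, 6.0), (0.0, 6.0)]   # hypothetical operating area
print(point_in_polygon(3.0, 2.0, region))   # True: position inside the region
print(point_in_polygon(12.0, 2.0, region))  # False: outside, recovery maneuver needed
```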
Testability of evolutionary game dynamics based on experimental economics data
NASA Astrophysics Data System (ADS)
Wang, Yijia; Chen, Xiaojie; Wang, Zhijian
2017-11-01
Understanding the dynamic processes of a real game system requires an appropriate dynamics model, and rigorously testing a dynamics model is nontrivial. In our methodological research, we develop an approach to testing the validity of game dynamics models that considers the dynamic patterns of angular momentum and speed as measurement variables. Using Rock-Paper-Scissors (RPS) games as an example, we illustrate the geometric patterns in the experiment data. We then derive the related theoretical patterns from a series of typical dynamics models. By testing the goodness-of-fit between the experimental and theoretical patterns, we show that the validity of these models can be evaluated quantitatively. Our approach establishes a link between dynamics models and experimental systems, which is, to the best of our knowledge, the most effective and rigorous strategy for ascertaining the testability of evolutionary game dynamics models.
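As a sketch of the kind of measurement variable described, the code below computes a discrete angular momentum of a synthetic social-state trajectory about the simplex centroid; the exact definitions used by the authors may differ, and the cycling data are generated purely to exercise the calculation.

```python
import numpy as np

# Synthetic social-state trajectory on the RPS simplex (fractions playing R, P, S).
# The cycling data are synthetic; real data would come from the experiments.
t = np.arange(200)
theta = 2 * np.pi * t / 50.0
x = np.stack([1/3 + 0.1 * np.cos(theta),
              1/3 + 0.1 * np.cos(theta - 2*np.pi/3),
              1/3 + 0.1 * np.cos(theta - 4*np.pi/3)], axis=1)

# Project onto the 2-D plane of the simplex and measure rotation about the centroid.
centroid = np.array([1/3, 1/3, 1/3])
basis = np.stack([np.array([1.0, -1.0, 0.0]) / np.sqrt(2),
                  np.array([1.0, 1.0, -2.0]) / np.sqrt(6)], axis=1)   # in-plane basis
r = (x - centroid) @ basis                                             # 2-D coordinates

# Discrete angular momentum per step: cross product of position and displacement.
dr = np.diff(r, axis=0)
L = r[:-1, 0] * dr[:, 1] - r[:-1, 1] * dr[:, 0]
print(f"mean angular momentum per step: {L.mean():.5f}  (nonzero => persistent cycling)")
```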
Long persistence of rigor mortis at constant low temperature.
Varetto, Lorenzo; Curto, Ombretta
2005-01-06
We studied the persistence of rigor mortis by using physical manipulation. We tested the mobility of the knee on 146 corpses kept under refrigeration at Torino's city mortuary at a constant temperature of +4 degrees C. We found a persistence of complete rigor lasting for 10 days in all the cadavers we kept under observation; in one case, rigor lasted for 16 days. Between the 11th and the 17th days, a progressively increasing number of corpses showed a change from complete into partial rigor (characterized by partial bending of the articulation). After the 17th day, all the remaining corpses showed partial rigor, and in the two cadavers that were kept under observation "à outrance" we found that the complete resolution of rigor mortis occurred on the 28th day. Our results prove that it is possible to find a persistence of rigor mortis that is much longer than expected when environmental conditions resemble average outdoor winter temperatures in temperate zones. Therefore, this finding must be considered when a corpse is found in such environmental conditions, so that when estimating the time of death we are not misled by the long persistence of rigor mortis.
Rigor Made Easy: Getting Started
ERIC Educational Resources Information Center
Blackburn, Barbara R.
2012-01-01
Bestselling author and noted rigor expert Barbara Blackburn shares the secrets to getting started, maintaining momentum, and reaching your goals. Learn what rigor looks like in the classroom, understand what it means for your students, and get the keys to successful implementation. Learn how to use rigor to raise expectations, provide appropriate…
Close Early Learning Gaps with Rigorous DAP
ERIC Educational Resources Information Center
Brown, Christopher P.; Mowry, Brian
2015-01-01
Rigorous DAP (developmentally appropriate practices) is a set of 11 principles of instruction intended to help close early childhood learning gaps. Academically rigorous learning environments create the conditions for children to learn at high levels. While academic rigor focuses on one dimension of education--academic--DAP considers the whole…
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Vos, Winnok H., E-mail: winnok.devos@uantwerpen.be; Cell Systems and Imaging Research Group, Department of Molecular Biotechnology, Ghent University, Ghent; Beghuin, Didier
As commercial space flights have become feasible and long-term extraterrestrial missions are planned, it is imperative that the impact of space travel and the space environment on human physiology be thoroughly characterized. Scrutinizing the effects of potentially detrimental factors such as ionizing radiation and microgravity at the cellular and tissue level demands adequate visualization technology. Advanced light microscopy (ALM) is the leading tool for non-destructive structural and functional investigation of static as well as dynamic biological systems. In recent years, technological developments and advances in photochemistry and genetic engineering have boosted all aspects of resolution, readout and throughput, rendering ALM ideally suited for biological space research. While various microscopy-based studies have addressed cellular response to space-related environmental stressors, biological endpoints have typically been determined only after the mission, leaving an experimental gap that is prone to bias results. An on-board, real-time microscopical monitoring device can bridge this gap. Breadboards and even fully operational microscope setups have been conceived, but they need to be rendered more compact and versatile. Most importantly, they must allow addressing the impact of gravity, or the lack thereof, on physiologically relevant biological systems in space and in ground-based simulations. In order to delineate the essential functionalities for such a system, we have reviewed the pending questions in space science, the relevant biological model systems, and the state-of-the art in ALM. Based on a rigorous trade-off, in which we recognize the relevance of multi-cellular systems and the cellular microenvironment, we propose a compact, but flexible concept for space-related cell biological research that is based on light sheet microscopy.
Analysis of In-Space Assembly of Modular Systems
NASA Technical Reports Server (NTRS)
Moses, Robert W.; VanLaak, James; Johnson, Spencer L.; Chytka, Trina M.; Reeves, John D.; Todd, B. Keith; Moe, Rud V.; Stambolian, Damon B.
2005-01-01
Early system-level life cycle assessments facilitate cost effective optimization of system architectures to enable implementation of both modularity and in-space assembly, two key Exploration Systems Research & Technology (ESR&T) Strategic Challenges. Experiences with the International Space Station (ISS) demonstrate that the absence of this rigorous analysis can result in increased cost and operational risk. An effort is underway, called Analysis of In-Space Assembly of Modular Systems, to produce an innovative analytical methodology, including an evolved analysis toolset and proven processes in a collaborative engineering environment, to support the design and evaluation of proposed concepts. The unique aspect of this work is that it will produce the toolset, techniques and initial products to analyze and compare the detailed, life cycle costs and performance of different implementations of modularity for in-space assembly. A multi-Center team consisting of experienced personnel from the Langley Research Center, Johnson Space Center, Kennedy Space Center, and the Goddard Space Flight Center has been formed to bring their resources and experience to this development. At the end of this 30-month effort, the toolset will be ready to support the Exploration Program with an integrated assessment strategy that embodies all life-cycle aspects of the mission from design and manufacturing through operations to enable early and timely selection of an optimum solution among many competing alternatives. Already there are many different designs for crewed missions to the Moon that present competing views of modularity requiring some in-space assembly. The purpose of this paper is to highlight the approach for scoring competing designs.
NASA Astrophysics Data System (ADS)
Marulcu, Ismail
This mixed method study examined the impact of a LEGO-based, engineering-oriented curriculum compared to an inquiry-based curriculum on fifth graders' content learning of simple machines. This study takes a social constructivist theoretical stance that science learning involves learning scientific concepts and their relations to each other. From this perspective, students are active participants, and they construct their conceptual understanding through the guidance of their teacher. With the goal of better understanding the use of engineering education materials in classrooms, the National Academy of Engineering and National Research Council, in the book "Engineering in K-12 Education," conducted an in-depth review of the potential benefits of including engineering in K-12 schools, namely (a) improved learning and achievement in science and mathematics, (b) increased awareness of engineering and the work of engineers, (c) understanding of and the ability to engage in engineering design, (d) interest in pursuing engineering as a career, and (e) increased technological literacy (Katehi, Pearson, & Feder, 2009). However, they also noted a lack of reliable data and rigorous research to support these assertions. Data sources included identical written tests and interviews, classroom observations and videos, teacher interviews, and classroom artifacts. To investigate the impact of the design-based simple machines curriculum compared to the scientific inquiry-based simple machines curriculum on student learning outcomes, I compared the control and the experimental groups' scores on the tests and interviews by using ANCOVA. To analyze and characterize the classroom observation videotapes, I used Jordan and Henderson's (1995) method and divided them into episodes. My analyses revealed that the design-based Design a People Mover: Simple Machines unit was at least as successful as, if not better than, the inquiry-based FOSS Levers and Pulleys unit in terms of students' content learning. I also found that students in the engineering group outperformed students in the control group in their ability to answer open-ended questions when interviewed. Implications for students' science content learning and teachers' professional development are discussed.
Rigorous Science: a How-To Guide.
Casadevall, Arturo; Fang, Ferric C
2016-11-08
Proposals to improve the reproducibility of biomedical research have emphasized scientific rigor. Although the word "rigor" is widely used, there has been little specific discussion as to what it means and how it can be achieved. We suggest that scientific rigor combines elements of mathematics, logic, philosophy, and ethics. We propose a framework for rigor that includes redundant experimental design, sound statistical analysis, recognition of error, avoidance of logical fallacies, and intellectual honesty. These elements lead to five actionable recommendations for research education. Copyright © 2016 Casadevall and Fang.
Rigorous vector wave propagation for arbitrary flat media
NASA Astrophysics Data System (ADS)
Bos, Steven P.; Haffert, Sebastiaan Y.; Keller, Christoph U.
2017-08-01
Precise modelling of the (off-axis) point spread function (PSF) to identify geometrical and polarization aberrations is important for many optical systems. In order to characterise the PSF of the system in all Stokes parameters, an end-to-end simulation of the system has to be performed in which Maxwell's equations are rigorously solved. We present the first results of a Python code that we are developing to perform multiscale end-to-end wave propagation simulations that include all relevant physics. Currently we can handle plane-parallel near- and far-field vector diffraction effects of propagating waves in homogeneous isotropic and anisotropic materials, refraction and reflection of flat parallel surfaces, interference effects in thin films and unpolarized light. We show that the code has a numerical precision on the order of 10^-16 for non-absorbing isotropic and anisotropic materials. For absorbing materials the precision is on the order of 10^-8. The capabilities of the code are demonstrated by simulating a converging beam reflecting from a flat aluminium mirror at normal incidence.
Hanning, Brian; Predl, Nicolle
2015-09-01
Traditional overnight rehabilitation payment models in the private sector are not based on a rigorous classification system and vary greatly between contracts with no consideration of patient complexity. The payment rates are not based on relative cost and the length-of-stay (LOS) point at which a reduced rate applies (step downs) varies markedly. The rehabilitation Australian National Sub-Acute and Non-Acute Patient (AN-SNAP) model (RAM), which has been in place for over 2 years in some private hospitals, bases payment on a rigorous classification system, relative cost and industry LOS. RAM is in the process of being rolled out more widely. This paper compares and contrasts RAM with traditional overnight rehabilitation payment models. It considers the advantages of RAM for hospitals and Australian Health Service Alliance. It also considers payment model changes in the context of maintaining industry consistency with Electronic Claims Lodgement and Information Processing System Environment (ECLIPSE) and health reform generally.
Nodal surfaces and interdimensional degeneracies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loos, Pierre-François, E-mail: pf.loos@anu.edu.au; Bressanini, Dario, E-mail: dario.bressanini@uninsubria.it
2015-06-07
The aim of this paper is to shed light on the topology and properties of the nodes (i.e., the zeros of the wave function) in electronic systems. Using the “electrons on a sphere” model, we study the nodes of two-, three-, and four-electron systems in various ferromagnetic configurations (sp, p², sd, pd, p³, sp², and sp³). In some particular cases (sp, p², sd, pd, and p³), we rigorously prove that the non-interacting wave function has the same nodes as the exact (yet unknown) wave function. The number of atomic and molecular systems for which the exact nodes are known analytically is very limited and we show here that this peculiar feature can be attributed to interdimensional degeneracies. Although we have not been able to prove it rigorously, we conjecture that the nodes of the non-interacting wave function for the sp³ configuration are exact.
Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions
NASA Astrophysics Data System (ADS)
De Risi, Raffaele; Goda, Katsuichiro
2017-08-01
Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast, or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area, for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve that is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve, which is based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, a reduction of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.
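The empirical hazard-curve step can be illustrated in a few lines of code: given simulated intensity measures and the annual occurrence rates of the stochastic events that produced them, the mean annual rate of exceedance follows by summation. The catalogue below is synthetic and the numbers are arbitrary, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic catalogue: each stochastic event has an annual occurrence rate and a
# simulated inundation depth at the site (values made up for illustration).
n_events = 5000
annual_rate = np.full(n_events, 0.05 / n_events)             # total rate: one event per 20 years
depth = rng.lognormal(mean=0.0, sigma=1.0, size=n_events)    # metres

# Empirical hazard curve: mean annual rate of exceeding each depth threshold.
thresholds = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
exceedance_rate = np.array([annual_rate[depth > h].sum() for h in thresholds])

for h, lam in zip(thresholds, exceedance_rate):
    # Poisson assumption converts a rate into a probability over a time window.
    p50 = 1.0 - np.exp(-lam * 50.0)
    print(f"depth > {h:4.1f} m: rate {lam:.2e} /yr, P(exceedance in 50 yr) = {p50:.4f}")
```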
Krompecher, T; Bergerioux, C
1988-01-01
The influence of electrocution on the evolution of rigor mortis was studied on rats. Our experiments showed that: (1) Electrocution hastens the onset of rigor mortis. After an electrocution of 90 s, a complete rigor develops already 1 h post-mortem (p.m.), compared to 5 h p.m. for the controls. (2) Electrocution hastens the passing of rigor mortis. After an electrocution of 90 s, the first significant decrease occurs at 3 h p.m. (8 h p.m. in the controls). (3) These modifications in rigor mortis evolution are less pronounced in the limbs not directly touched by the electric current. (4) In the case of post-mortem electrocution, the changes are slightly less pronounced, the resistance is higher and the absorbed energy is lower as compared with the ante-mortem electrocution cases. The results are complemented by two practical observations on human electrocution cases.
ERIC Educational Resources Information Center
Colbert, Peta; Wyatt-Smith, Claire; Klenowski, Val
2012-01-01
This article considers the conditions that are necessary at system and local levels for teacher assessment to be valid, reliable and rigorous. With sustainable assessment cultures as a goal, the article examines how education systems can support local-level efforts for quality learning and dependable teacher assessment. This is achieved through…
ERIC Educational Resources Information Center
Herlihy, Corinne; Karger, Ezra; Pollard, Cynthia; Hill, Heather C.; Kraft, Matthew A.; Williams, Megan; Howard, Sarah
2014-01-01
Context: In the past two years, states have implemented sweeping reforms to their teacher evaluation systems in response to Race to the Top legislation and, more recently, NCLB waivers. With these new systems, policymakers hope to make teacher evaluation both more rigorous and more grounded in specific job performance domains such as teaching…
Rigorous Schools and Classrooms: Leading the Way
ERIC Educational Resources Information Center
Williamson, Ronald; Blackburn, Barbara R.
2010-01-01
Turn your school into a student-centered learning environment, where rigor is at the heart of instruction in every classroom. From the bestselling author of "Rigor is Not a Four-Letter Word," Barbara Blackburn, and award-winning educator Ronald Williamson, this comprehensive guide to establishing a schoolwide culture of rigor is for principals and…
Rigor Revisited: Scaffolding College Student Learning by Incorporating Their Lived Experiences
ERIC Educational Resources Information Center
Castillo-Montoya, Milagros
2018-01-01
This chapter explores how students' lived experiences contribute to the rigor of their thinking. Insights from research indicate faculty can enhance rigor by accounting for the many ways it may surface in the classroom. However, to see this type of rigor, we must revisit the way we conceptualize it for higher education.
Mungure, Tanyaradzwa E; Bekhit, Alaa El-Din A; Birch, E John; Stewart, Ian
2016-04-01
The effects of rigor temperature (5, 15, 20 and 25°C), ageing (3, 7, 14, and 21 days) and display time on meat quality and lipid oxidative stability of hot boned beef M. Semimembranosus (SM) muscle were investigated. Ultimate pH (pH(u)) was rapidly attained at higher rigor temperatures. Electrical conductivity increased with rigor temperature (p<0.001). Tenderness, purge and cooking losses were not affected by rigor temperature; however purge loss and tenderness increased with ageing (p<0.01). Lightness (L*) and redness (a*) of the SM increased as rigor temperature increased (p<0.01). Lipid oxidation was assessed using (1)H NMR where changes in aliphatic to olefinic (R(ao)) and diallylmethylene (R(ad)) proton ratios can be rapidly monitored. R(ad), R(ao), PUFA and TBARS were not affected by rigor temperature, however ageing and display increased lipid oxidation (p<0.05). This study shows that rigor temperature manipulation of hot boned beef SM muscle does not have adverse effects on lipid oxidation. Copyright © 2016 Elsevier Ltd. All rights reserved.
Misimi, E; Erikson, U; Digre, H; Skavhaug, A; Mathiassen, J R
2008-03-01
The present study describes the possibilities for using computer vision-based methods for the detection and monitoring of transient 2D and 3D changes in the geometry of a given product. The rigor contractions of unstressed and stressed fillets of Atlantic salmon (Salmo salar) and Atlantic cod (Gadus morhua) were used as a model system. Gradual changes in fillet shape and size (area, length, width, and roundness) were recorded for 7 and 3 d, respectively. Also, changes in fillet area and height (cross-section profiles) were tracked using a laser beam and a 3D digital camera. Another goal was to compare rigor developments of the 2 species of farmed fish, and whether perimortem stress affected the appearance of the fillets. Some significant changes in fillet size and shape were found (length, width, area, roundness, height) between unstressed and stressed fish during the course of rigor mortis as well as after ice storage (postrigor). However, the observed irreversible stress-related changes were small and would hardly mean anything for postrigor fish processors or consumers. The cod were less stressed (as defined by muscle biochemistry) than the salmon after the 2 species had been subjected to similar stress bouts. Consequently, the difference between the rigor courses of unstressed and stressed fish was more extreme in the case of salmon. However, the maximal whole fish rigor strength was judged to be about the same for both species. Moreover, the reductions in fillet area and length, as well as the increases in width, were basically of similar magnitude for both species. In fact, the increases in fillet roundness and cross-section height were larger for the cod. We conclude that the computer vision method can be used effectively for automated monitoring of changes in 2D and 3D shape and size of fish fillets during rigor mortis and ice storage. In addition, it can be used for grading of fillets according to uniformity in size and shape, as well as measurement of fillet yield measured in thickness. The methods are accurate, rapid, nondestructive, and contact-free and can therefore be regarded as suitable for industrial purposes.
Van Driest transformation and compressible wall-bounded flows
NASA Technical Reports Server (NTRS)
Huang, P. G.; Coleman, G. N.
1994-01-01
The validity of the Van Driest transformation was investigated using data from direct numerical simulations (DNS) of supersonic, isothermal cold-wall channel flow. The DNS results covered a wide range of parameters and were well suited to examining the generality of the transformation. The Van Driest law of the wall can be obtained from inner-layer similarity arguments. It was demonstrated that the Van Driest transformation cannot be used to collapse the sublayer and log-layer velocity profiles simultaneously. Velocity and temperature predictions based on a composite mixing-length model were presented. Despite satisfactory agreement with the DNS data, the model must be regarded as an engineering guide and not as a rigorous analysis.
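For reference, the Van Driest transformation itself is u_vd = integral from 0 to u+ of sqrt(rho/rho_w) du+, evaluated along the wall-normal profile. The sketch below applies it numerically by trapezoidal integration; the velocity and density profiles are placeholders for illustration, not the DNS data discussed.

```python
import numpy as np

def van_driest_transform(u_plus, rho_over_rho_wall):
    """Van Driest transformed velocity: u_vd = integral_0^u+ sqrt(rho/rho_w) du+.

    Both inputs are 1-D profiles sampled at the same wall-normal locations,
    ordered from the wall outward. Trapezoidal integration in u+.
    """
    integrand = np.sqrt(rho_over_rho_wall)
    u_vd = np.concatenate(([0.0],
        np.cumsum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(u_plus))))
    return u_vd

# Placeholder profiles (not DNS data): a composite linear/log-law velocity and a
# density that falls away from a cold wall, just to exercise the transform.
y_plus = np.geomspace(1.0, 1000.0, 200)
u_plus = np.minimum(y_plus, np.log(y_plus) / 0.41 + 5.0)
rho_ratio = 1.0 / (1.0 + 0.4 * (1.0 - np.exp(-y_plus / 100.0)))   # made-up variation

u_vd = van_driest_transform(u_plus, rho_ratio)
print(f"u+ at channel edge: {u_plus[-1]:.2f}, Van Driest u+ at edge: {u_vd[-1]:.2f}")
```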
1999-04-21
University of Alabama engineer Stacey Giles briefs NASA astronaut Dr. Bonnie Dunbar about the design and capabilities of the X-ray Crystallography Facility under development at the Center for Macromolecular Crystallography of the University of Alabama at Birmingham, AL, April 21, 1999. The X-ray Crystallography Facility is designed to speed the collection of protein structure information from crystals grown aboard the International Space Station. By measuring and mapping the protein crystal structure in space, researchers will avoid exposing the delicate crystals to the rigors of space travel and make important research data available to scientists much faster. The X-ray Crystallography facility is being designed and developed by the Center for Macromolecular Crystallography of the University of Alabama at Birmingham, a NASA Commercial Space Center.
1999-04-21
University of Alabama engineer Lance Weiss briefs NASA astronaut Dr. Bonnie Dunbar about the design and capabilities of the X-ray Crystallography Facility under development at the Center for Macromolecular Crystallography of the University of Alabama at Birmingham, AL, April 21, 1999. The X-ray Crystallography Facility is designed to speed the collection of protein structure information from crystals grown aboard the International Space Station. By measuring and mapping the protein crystal structure in space, researchers will avoid exposing the delicate crystals to the rigors of space travel and make important research data available to scientists much faster. The X-ray Crystallography facility is being designed and developed by the Center for Macromolecular Crystallography of the University of Alabama at Birmingham, a NASA Commercial Space Center.
First-Order Frameworks for Managing Models in Engineering Optimization
NASA Technical Reports Server (NTRS)
Alexandrov, Natalia M.; Lewis, Robert Michael
2000-01-01
Approximation/model management optimization (AMMO) is a rigorous methodology for attaining solutions of high-fidelity optimization problems with minimal expense in high-fidelity function and derivative evaluation. First-order AMMO frameworks allow for a wide variety of models and underlying optimization algorithms. Recent demonstrations with aerodynamic optimization achieved three-fold savings in terms of high-fidelity function and derivative evaluation in the case of variable-resolution models and five-fold savings in the case of variable-fidelity physics models. The savings are problem dependent but certain trends are beginning to emerge. We give an overview of the first-order frameworks, current computational results, and an idea of the scope of applicability of the first-order frameworks.
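First-order frameworks of this kind require the managed lower-fidelity model to agree with the high-fidelity function value and gradient at the current iterate. The sketch below shows one common way to enforce that consistency, an additive zeroth- and first-order correction; it is illustrative only, and the function names and demonstration models are assumptions rather than the AMMO implementation:

```python
# Hedged sketch: additive correction giving first-order consistency at x0.
import numpy as np

def corrected_model(f_lo, grad_lo, f_hi_x0, grad_hi_x0, x0):
    """Return a(x) = f_lo(x) + correction so that a(x0) = f_hi(x0)
    and grad a(x0) = grad f_hi(x0)."""
    x0 = np.asarray(x0, dtype=float)
    c0 = f_hi_x0 - f_lo(x0)                      # value mismatch at x0
    c1 = np.asarray(grad_hi_x0) - grad_lo(x0)    # gradient mismatch at x0
    return lambda x: f_lo(np.asarray(x)) + c0 + c1 @ (np.asarray(x) - x0)

# Tiny demonstration with made-up one-variable models
f_hi = lambda x: (x[0] - 1.0) ** 2               # "high-fidelity" function
f_lo = lambda x: x[0] ** 2                       # cheap surrogate
grad_lo = lambda x: np.array([2.0 * x[0]])
x0 = np.array([2.0])
a = corrected_model(f_lo, grad_lo, f_hi(x0), np.array([2.0 * (x0[0] - 1.0)]), x0)
print(a(x0), f_hi(x0))                           # corrected model matches f_hi at x0
```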
High and low rigor temperature effects on sheep meat tenderness and ageing.
Devine, Carrick E; Payne, Steven R; Peachey, Bridget M; Lowe, Timothy E; Ingram, John R; Cook, Christian J
2002-02-01
Immediately after electrical stimulation, the paired m. longissimus thoracis et lumborum (LT) of 40 sheep were boned out and wrapped tightly with a polyethylene cling film. One of the paired LTs was chilled in 15°C air to reach a rigor mortis (rigor) temperature of 18°C and the other side was placed in a water bath at 35°C and achieved rigor at this temperature. Wrapping reduced rigor shortening and mimicked meat left on the carcass. After rigor, the meat was aged at 15°C for 0, 8, 26 and 72 h and then frozen. The frozen meat was cooked to 75°C in an 85°C water bath and shear force values were obtained from a 1×1 cm cross-section. The shear force values of meat for 18 and 35°C rigor were similar at zero ageing, but as ageing progressed, the 18°C rigor meat aged faster and became more tender than meat that went into rigor at 35°C (P<0.001). The mean sarcomere length values of meat samples for 18 and 35°C rigor at each ageing time were significantly different (P<0.001), the samples at 35°C being shorter. When the short sarcomere length values and corresponding shear force values were removed for further data analysis, the shear force values for the 35°C rigor were still significantly greater. Thus the toughness of 35°C meat was not a consequence of muscle shortening and appears to be due to both a faster rate of tenderisation and a greater extent of tenderisation at the lower temperature. The cook loss at 35°C rigor (30.5%) was greater than that at 18°C rigor (28.4%) (P<0.01) and the colour Hunter L values were higher at 35°C (P<0.01) compared with 18°C, but there were no significant differences in a or b values.
Failure to Learn from Failure: Evaluating Computer Systems in Medicine
Grann, Richard P.
1980-01-01
Evaluation of ADP systems in medicine frequently becomes mired in problems of tenuous cost measurement, of proving illusory cost savings, of false precision, and of dubious discounting methods, while giving only superficial treatment to non-dollar benefits. It would frequently be more advantageous to study non-dollar impacts with greater care and rigor.
Vocabulary Intervention Discourse in Special Education Classroom: What Word?
ERIC Educational Resources Information Center
Kim, Joyce Junghee
2017-01-01
The Every Student Succeeds Act (ESSA, 2015), which replaced the federal government's education policy called the No Child Left Behind Act (NCLB, 2002), takes full effect in the 2017-2018 school year with renewed focus on accountability systems established by each state. States must have for their middle schools rigorous accountability systems in…
Learning Transfer--Validation of the Learning Transfer System Inventory in Portugal
ERIC Educational Resources Information Center
Velada, Raquel; Caetano, Antonio; Bates, Reid; Holton, Ed
2009-01-01
Purpose: The purpose of this paper is to analyze the construct validity of learning transfer system inventory (LTSI) for use in Portugal. Furthermore, it also aims to analyze whether LTSI dimensions differ across individual variables such as gender, age, educational level and job tenure. Design/methodology/approach: After a rigorous translation…
The Mauritian Education System: Was There a Will to Anglicize it?
ERIC Educational Resources Information Center
Tirvassen, Rada
2007-01-01
Clive Whitehead (2005: 315-329) makes an indisputable claim that British colonial education is a controversial topic in the history of education. The macro study of educational systems undertaken within a framework that guarantees a systematic and rigorous approach can offer answers to many disputed issues, but researchers should not underestimate…
Approximation Methods for Inverse Problems Governed by Nonlinear Parabolic Systems
1999-12-17
We present a rigorous theoretical framework for approximation of nonlinear parabolic systems with delays in the context of inverse least squares... Numerical results demonstrating the convergence are given for a model of dioxin uptake and elimination in a distributed liver model that is a special case of the general theoretical framework.
The Impact of Performance Ratings on Job Satisfaction for Public School Teachers
ERIC Educational Resources Information Center
Koedel, Cory; Li, Jiaxi; Springer, Matthew G.; Tan, Li
2017-01-01
Spurred by the federal Race to the Top competition, the state of Tennessee implemented a comprehensive statewide educator evaluation system in 2011. The new system is designed to increase the rigor of evaluations and better differentiate teachers based on performance. The use of more differentiated ratings represents a significant shift in…
ERIC Educational Resources Information Center
Steinberg, Matthew P.; Garrett, Rachel
2016-01-01
As states and districts implement more rigorous teacher evaluation systems, measures of teacher performance are increasingly being used to support instruction and inform retention decisions. Classroom observations take a central role in these systems, accounting for the majority of teacher ratings upon which accountability decisions are based.…
Multi-Disciplinary Knowledge Synthesis for Human Health Assessment on Earth and in Space
NASA Astrophysics Data System (ADS)
Christakos, G.
We discuss methodological developments in multi-disciplinary knowledge synthesis (KS) of human health assessment. A theoretical KS framework can provide the rational means for the assimilation of various information bases (general, site-specific etc.) that are relevant to the life system of interest. KS-based techniques produce a realistic representation of the system, provide a rigorous assessment of the uncertainty sources, and generate informative health state predictions across space-time. The underlying epistemic cognition methodology is based on teleologic criteria and stochastic logic principles. The mathematics of KS involves a powerful and versatile spatiotemporal random field model that accounts rigorously for the uncertainty features of the life system and imposes no restriction on the shape of the probability distributions or the form of the predictors. KS theory is instrumental in understanding natural heterogeneities, assessing crucial human exposure correlations and laws of physical change, and explaining toxicokinetic mechanisms and dependencies in a spatiotemporal life system domain. It is hoped that a better understanding of KS fundamentals would generate multi-disciplinary models that are useful for the maintenance of human health on Earth and in Space.
An automated dose tracking system for adaptive radiation therapy.
Liu, Chang; Kim, Jinkoo; Kumarasiri, Akila; Mayyas, Essa; Brown, Stephen L; Wen, Ning; Siddiqui, Farzan; Chetty, Indrin J
2018-02-01
The implementation of adaptive radiation therapy (ART) into routine clinical practice is technically challenging and requires significant resources to perform and validate each process step. The objective of this report is to identify the key components of ART, to illustrate how a specific automated procedure improves efficiency, and to facilitate the routine clinical application of ART. Patient image data were exported from a clinical database and converted to an intermediate format for point-wise dose tracking and accumulation. The process was automated using in-house developed software containing three modularized components: an ART engine, user interactive tools, and integration tools. The ART engine conducts computing tasks using the following modules: data importing, image pre-processing, dose mapping, dose accumulation, and reporting. In addition, custom graphical user interfaces (GUIs) were developed to allow user interaction with select processes such as deformable image registration (DIR). A commercial scripting application programming interface was used to incorporate automated dose calculation for application in routine treatment planning. Each module was considered an independent program, written in C++ or C#, running in a distributed Windows environment, scheduled and monitored by integration tools. The automated tracking system was retrospectively evaluated for 20 patients with prostate cancer and 96 patients with head and neck cancer, under institutional review board (IRB) approval. In addition, the system was evaluated prospectively using 4 patients with head and neck cancer. Altogether, 780 prostate dose fractions and 2586 head and neck cancer dose fractions were processed, including DIR and dose mapping. On average, daily cumulative dose was computed in 3 h and the manual work was limited to 13 min per case, with approximately 10% of cases requiring an additional 10 min for image registration refinement. An efficient and convenient dose tracking system for ART in the clinical setting is presented. The software and automated processes were rigorously evaluated and validated using patient image datasets. Automation of the various procedures has improved efficiency significantly, allowing for the routine clinical application of ART for improving radiation therapy effectiveness. Copyright © 2017 Elsevier B.V. All rights reserved.
Exploring Student Perceptions of Rigor Online: Toward a Definition of Rigorous Learning
ERIC Educational Resources Information Center
Duncan, Heather E.; Range, Bret; Hvidston, David
2013-01-01
Technological advances in the last decade have impacted delivery methods of university courses. More and more courses are offered in a variety of formats. While academic rigor is a term often used, its definition is less clear. This mixed-methods study explored graduate student conceptions of rigor in the online learning environment embedded…
Methodological rigor and citation frequency in patient compliance literature.
Bruer, J T
1982-01-01
An exhaustive bibliography which assesses the methodological rigor of the patient compliance literature, and citation data from the Science Citation Index (SCI) are combined to determine if methodologically rigorous papers are used with greater frequency than substandard articles by compliance investigators. There are low, but statistically significant, correlations between methodological rigor and citation indicators for 138 patient compliance papers published in SCI source journals during 1975 and 1976. The correlation is not strong enough to warrant use of citation measures as indicators of rigor on a paper-by-paper basis. The data do suggest that citation measures might be developed as crude indicators of methodological rigor. There is no evidence that randomized trials are cited more frequently than studies that employ other experimental designs. PMID:7114334
76 FR 71445 - American Education Week, 2011
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-17
... rigorous and lasting investments in our education system so the American dream remains within reach of each... promise to give our children the chance to achieve their dreams and to write the next proud chapter in the...
Property-Based Software Engineering Measurement
NASA Technical Reports Server (NTRS)
Briand, Lionel; Morasca, Sandro; Basili, Victor R.
1995-01-01
Little theory exists in the field of software system measurement. Concepts such as complexity, coupling, cohesion or even size are very often subject to interpretation and appear to have inconsistent definitions in the literature. As a consequence, there is little guidance provided to the analyst attempting to define proper measures for specific problems. Many controversies in the literature are simply misunderstandings and stem from the fact that some people talk about different measurement concepts under the same label (complexity is the most common case). There is a need to define unambiguously the most important measurement concepts used in the measurement of software products. One way of doing so is to define precisely what mathematical properties characterize these concepts regardless of the specific software artifacts to which these concepts are applied. Such a mathematical framework could generate a consensus in the software engineering community and provide a means for better communication among researchers, better guidelines for analysis, and better evaluation methods for commercial static analyzers for practitioners. In this paper, we propose a mathematical framework which is generic, because it is not specific to any particular software artifact, and rigorous, because it is based on precise mathematical concepts. This framework defines several important measurement concepts (size, length, complexity, cohesion, coupling). It is not intended to be complete or fully objective; other frameworks could have been proposed and different choices could have been made. However, we believe that the formalism and properties we introduce are convenient and intuitive. In addition, we have reviewed the literature on this subject and compared it with our work. This framework contributes constructively to a firmer theoretical ground of software measurement.
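As a toy illustration of the property-based idea (entirely an assumption made here for exposition; the paper's artifacts and properties are abstract and language-independent), one can encode a candidate size measure and check it against properties such as nonnegativity and additivity over disjoint modules:

```python
# Toy sketch: a candidate "size" measure over modules represented as sets,
# checked against two properties a size measure is expected to obey.
def size(module):
    return len(module)                            # toy measure: count of elements

m1, m2 = {"a", "b", "c"}, {"d", "e"}              # disjoint modules (hypothetical)
assert size(m1) >= 0 and size(m2) >= 0            # nonnegativity
assert size(m1 | m2) == size(m1) + size(m2)       # additivity for disjoint modules
print("candidate measure satisfies the checked size properties")
```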
Property-Based Software Engineering Measurement
NASA Technical Reports Server (NTRS)
Briand, Lionel C.; Morasca, Sandro; Basili, Victor R.
1997-01-01
Little theory exists in the field of software system measurement. Concepts such as complexity, coupling, cohesion or even size are very often subject to interpretation and appear to have inconsistent definitions in the literature. As a consequence, there is little guidance provided to the analyst attempting to define proper measures for specific problems. Many controversies in the literature are simply misunderstandings and stem from the fact that some people talk about different measurement concepts under the same label (complexity is the most common case). There is a need to define unambiguously the most important measurement concepts used in the measurement of software products. One way of doing so is to define precisely what mathematical properties characterize these concepts, regardless of the specific software artifacts to which these concepts are applied. Such a mathematical framework could generate a consensus in the software engineering community and provide a means for better communication among researchers, better guidelines for analysts, and better evaluation methods for commercial static analyzers for practitioners. In this paper, we propose a mathematical framework which is generic, because it is not specific to any particular software artifact, and rigorous, because it is based on precise mathematical concepts. We use this framework to propose definitions of several important measurement concepts (size, length, complexity, cohesion, coupling). It is not intended to be complete or fully objective; other frameworks could have been proposed and different choices could have been made. However, we believe that the formalisms and properties we introduce are convenient and intuitive. This framework contributes constructively to a firmer theoretical ground of software measurement.
Ju, Feng; Lee, Hyo Kyung; Yu, Xinhua; Faris, Nicholas R; Rugless, Fedoria; Jiang, Shan; Li, Jingshan; Osarogiagbon, Raymond U
2017-12-01
The process of lung cancer care from initial lesion detection to treatment is complex, involving multiple steps, each introducing the potential for substantial delays. Identifying the steps with the greatest delays enables a focused effort to improve the timeliness of care-delivery, without sacrificing quality. We retrospectively reviewed clinical events from initial detection, through histologic diagnosis, radiologic and invasive staging, and medical clearance, to surgery for all patients who had an attempted resection of a suspected lung cancer in a community healthcare system. We used a computer process modeling approach to evaluate delays in care delivery, in order to identify potential 'bottlenecks' in waiting time, the reduction of which could produce greater care efficiency. We also conducted 'what-if' analyses to predict the relative impact of simulated changes in the care delivery process to determine the most efficient pathways to surgery. The waiting time between radiologic lesion detection and diagnostic biopsy, and the waiting time from radiologic staging to surgery, were the two most critical bottlenecks impeding efficient care delivery (their reduction having more than 3 times the impact of reducing other waiting times). Additionally, instituting surgical consultation prior to cardiac consultation for medical clearance and decreasing the waiting time between CT scans and diagnostic biopsies were potentially the most impactful measures to reduce care delays before surgery. Rigorous computer simulation modeling, using clinical data, can provide useful information to identify areas for improving the efficiency of care delivery by process engineering, for patients who receive surgery for lung cancer.
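The authors' process model is not reproduced in the abstract; purely as an illustration of the 'what-if' idea, the toy sketch below treats the time to surgery as a sum of stage waiting times and compares a baseline against a scenario that halves the detection-to-biopsy wait (all stage names and mean waits are invented placeholders):

```python
# Toy "what-if" sketch: total time to surgery as a sum of stage waits.
import numpy as np

rng = np.random.default_rng(0)
mean_wait_days = {"detect_to_biopsy": 20, "biopsy_to_staging": 10,
                  "staging_to_clearance": 7, "clearance_to_surgery": 15}

def simulate(waits, n=10_000):
    """Mean total time to surgery, sampling each stage wait as exponential."""
    total = sum(rng.exponential(m, n) for m in waits.values())
    return total.mean()

baseline = simulate(mean_wait_days)
what_if = simulate({**mean_wait_days, "detect_to_biopsy": 10})  # halve one bottleneck
print(f"baseline {baseline:.1f} d -> what-if {what_if:.1f} d")
```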
Effectiveness of Culturally Appropriate Adaptations to Juvenile Justice Services
Vergara, Andrew T.; Kathuria, Parul; Woodmass, Kyler; Janke, Robert; Wells, Susan J.
2017-01-01
Despite efforts to increase cultural competence of services within juvenile justice systems, disproportional minority contact (DMC) persists throughout Canada and the United States. Commonly cited approaches to decreasing DMC include large-scale systemic changes as well as enhancement of the cultural relevance and responsiveness of services delivered. Cultural adaptations to service delivery focus on prevention, decision-making, and treatment services to reduce initial contact, minimize unnecessary restraint, and reduce recidivism. Though locating rigorous testing of these approaches compared to standard interventions is difficult, this paper identifies and reports on such research. The Cochrane guidelines for systematic literature reviews and meta-analyses served as a foundation for study methodology. Databases such as Legal Periodicals and Books were searched through June 2015. Three studies were sufficiently rigorous to identify the effect of the cultural adaptations, and three studies that are making potentially important contributions to the field were also reviewed. PMID:29468092
NASA Astrophysics Data System (ADS)
Sarkar, Biplab; Adhikari, Satrajit
If a coupled three-state electronic manifold forms a sub-Hilbert space, it is possible to express the non-adiabatic coupling (NAC) elements in terms of adiabatic-diabatic transformation (ADT) angles. Consequently, we demonstrate: (a) Those explicit forms of the NAC terms satisfy the Curl conditions with non-zero Divergences; (b) The formulation of an extended Born-Oppenheimer (EBO) equation for any three-state BO system is possible only when there exists a coordinate-independent ratio of the gradients for each pair of ADT angles, leading to zero Curls at and around the conical intersection(s). With these analytic advancements, we formulate a rigorous EBO equation and explore its validity as well as necessity with respect to the approximate one (Sarkar and Adhikari, J Chem Phys 2006, 124, 074101) by performing numerical calculations on two different models constructed with different chosen forms of the NAC elements.
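For context, the Curl condition referred to here is the standard one from the ADT literature (stated from general background knowledge, not taken from this abstract; sign conventions differ between authors): for the matrix of NAC terms along nuclear coordinates p and q,

$$ \frac{\partial \tau_{q}}{\partial p} \;-\; \frac{\partial \tau_{p}}{\partial q} \;=\; \left[\tau_{q},\,\tau_{p}\right], $$

where the commutator term vanishes identically for two coupled states but is generally non-zero for three, which is the setting this work addresses.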
Aerodynamic Analysis of Simulated Heat Shield Recession for the Orion Command Module
NASA Technical Reports Server (NTRS)
Bibb, Karen L.; Alter, Stephen J.; Mcdaniel, Ryan D.
2008-01-01
The aerodynamic effects of the recession of the ablative thermal protection system for the Orion Command Module of the Crew Exploration Vehicle are important for the vehicle guidance. At the present time, the aerodynamic effects of recession are handled indirectly within the Orion aerodynamic database, with an additional safety factor placed on the uncertainty bounds. This study is an initial attempt to quantify the effects for a particular set of recessed geometry shapes, in order to provide a more rigorous analysis for managing recession effects within the aerodynamic database. The aerodynamic forces and moments for the baseline and recessed geometries were computed at several trajectory points using multiple CFD codes, both viscous and inviscid. The resulting aerodynamics for the baseline and recessed geometries were compared. The forces (lift, drag) show negligible differences between baseline and recessed geometries. Generally, the moments show a difference between baseline and recessed geometries that correlates with the maximum amount of recession of the geometry. The difference between the pitching moments for the baseline and recessed geometries increases as Mach number decreases (and the recession is greater), and reaches a value of -0.0026 for the lowest Mach number. The change in trim angle of attack increases from approx. 0.5deg at M = 28.7 to approx. 1.3deg at M = 6, and is consistent with a previous analysis with a lower fidelity engineering tool. This correlation of the present results with the engineering tool results supports the continued use of the engineering tool for future work. The present analysis suggests there does not need to be an uncertainty due to recession in the Orion aerodynamic database for the force quantities. The magnitude of the change in pitching moment due to recession is large enough to warrant inclusion in the aerodynamic database. An increment in the uncertainty for pitching moment could be calculated from these results and included in the development of the aerodynamic database uncertainty for pitching moment.
NASA Astrophysics Data System (ADS)
Sharma, Mangala; Smith, D.; Mendez, B.; Shipp, S.; Schwerin, T.; Stockman, S.; Cooper, L.
2010-03-01
The AAS-HEAD community has a rich history of involvement in education and public outreach (E/PO). HEAD members have been using NASA science and educational resources to engage and educate youth and adults nationwide in science, technology, engineering, and mathematics topics. Four new Science Education and Public Outreach Forums ("Forums") funded by NASA Science Mission Directorate (SMD) are working in partnership with the research and education community to ensure that current and future SMD-funded E/PO activities form a seamless whole, with easy entry points for scientists, engineers, faculty, students, K-12 formal and informal science educators, general public, and E/PO professionals alike. These Forums support the astrophysics, heliophysics, planetary and Earth science divisions of NASA SMD in three core areas: 1) E/PO community engagement and development to facilitate clear paths of involvement for scientists, engineers and others interested - or potentially interested - in participating in SMD-funded E/PO activities. Collaborations with science professionals are vital for infusing current, accurate SMD mission and research findings into educational products and activities. Forum activities will yield readily accessible information on effective E/PO strategies, resources, and expertise; context for individual E/PO activities; and opportunities for collaboration. 2) A rigorous analysis of SMD-funded E/PO products and activities to help understand how the existing collection supports education standards and audience needs and to identify areas of opportunity for new materials and activities. K-12 formal, informal, and higher education products and activities are included in this analysis. 3) Addressing E/PO-related systemic issues and coordinating related activities across the four SMD science divisions. By supporting the NASA E/PO community and facilitating coordination of E/PO activities within and across disciplines, the SMD-Forum partnerships will lead to more effective, sustainable, and efficient utilization of NASA science discoveries and learning experiences.
The progenitors of extended emission gamma-ray bursts
NASA Astrophysics Data System (ADS)
Gompertz, B. P.
2015-06-01
Gamma-ray bursts (GRBs) are the most luminous transient events in the Universe, and as such are associated with some of the most extreme processes in nature. They come in two types: long and short, nominally separated on either side of a two-second divide in gamma-ray emission duration. The short class (those with durations of less than two seconds) is believed to be due to the merger of two compact objects, most likely neutron stars. Within this population, a small subsection exhibits an apparent extra high-energy emission feature, which rises to prominence several seconds after the initial emission event. These are the extended emission (EE) bursts. This thesis investigates the progenitors of the EE sample, including what drives them, and where they fit in the broader context of short GRBs. The science chapters outline a rigorous test of the magnetar model, in which the compact object merger results in a massive, rapidly-rotating neutron star with an extremely strong magnetic field. The motivation for this central engine is the late-time plateaux seen in some short and EE GRBs, which can be interpreted as energy injection from a long-lived central engine, in this case from the magnetar as it loses angular momentum along open field lines. Chapter 2 addresses the energy budget of such a system, including whether the EE component is consistent with the rotational energy reservoir of a millisecond neutron star, and the implications the model has for the physical properties of the underlying magnetar. Chapter 3 proposes a potential mechanism by which EE may arise, and how both classes may be born within the framework of a single central engine. Chapter 4 addresses the broadband signature of both short and EE GRBs, and provides some observational tests that can be used to either support or contradict the model.
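As an order-of-magnitude check of the rotational energy reservoir invoked here (the fiducial values are assumed by the editor, not taken from the thesis), a millisecond neutron star with moment of inertia I of roughly 10^45 g cm^2 and spin period P of roughly 1 ms has

$$ E_{\mathrm{rot}} = \tfrac{1}{2} I \Omega^{2} = \tfrac{1}{2} I \left(\frac{2\pi}{P}\right)^{2} \approx \tfrac{1}{2}\,(10^{45}\ \mathrm{g\,cm^{2}})\,(6.3\times 10^{3}\ \mathrm{s^{-1}})^{2} \approx 2\times 10^{52}\ \mathrm{erg}, $$

which sets the scale against which the EE and plateau energetics are compared.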
NASA Astrophysics Data System (ADS)
Dahms, Rainer N.
2016-04-01
A generalized framework for multi-component liquid injections is presented to understand and predict the breakdown of classic two-phase theory and spray atomization at engine-relevant conditions. The analysis focuses on the thermodynamic structure and the immiscibility state of representative gas-liquid interfaces. The most modern form of Helmholtz energy mixture state equation is utilized which exhibits a unique and physically consistent behavior over the entire two-phase regime of fluid densities. It is combined with generalized models for non-linear gradient theory and for liquid injections to quantify multi-component two-phase interface structures in global thermal equilibrium. Then, the Helmholtz free energy is minimized which determines the interfacial species distribution as a consequence. This minimal free energy state is demonstrated to validate the underlying assumptions of classic two-phase theory and spray atomization. However, under certain engine-relevant conditions for which corroborating experimental data are presented, this requirement for interfacial thermal equilibrium becomes unsustainable. A rigorously derived probability density function quantifies the ability of the interface to develop internal spatial temperature gradients in the presence of significant temperature differences between injected liquid and ambient gas. Then, the interface can no longer be viewed as an isolated system at minimal free energy. Instead, the interfacial dynamics become intimately connected to those of the separated homogeneous phases. Hence, the interface transitions toward a state in local equilibrium whereupon it becomes a dense-fluid mixing layer. A new conceptual view of a transitional liquid injection process emerges from a transition time scale analysis. Close to the nozzle exit, the two-phase interface still remains largely intact and more classic two-phase processes prevail as a consequence. Further downstream, however, the transition to dense-fluid mixing generally occurs before the liquid length is reached. The significance of the presented modeling expressions is established by a direct comparison to a reduced model, which utilizes widely applied approximations but fundamentally fails to capture the physical complexity discussed in this paper.
Release of genetically engineered insects: a framework to identify potential ecological effects
David, Aaron S; Kaser, Joe M; Morey, Amy C; Roth, Alexander M; Andow, David A
2013-01-01
Genetically engineered (GE) insects have the potential to radically change pest management worldwide. With recent approvals of GE insect releases, there is a need for a synthesized framework to evaluate their potential ecological and evolutionary effects. The effects may occur in two phases: a transitory phase when the focal population changes in density, and a steady state phase when it reaches a new, constant density. We review potential effects of a rapid change in insect density related to population outbreaks, biological control, invasive species, and other GE organisms to identify a comprehensive list of potential ecological and evolutionary effects of GE insect releases. We apply this framework to the Anopheles gambiae mosquito – a malaria vector being engineered to suppress the wild mosquito population – to identify effects that may occur during the transitory and steady state phases after release. Our methodology reveals many potential effects in each phase, perhaps most notably those dealing with immunity in the transitory phase, and with pathogen and vector evolution in the steady state phase. Importantly, this framework identifies knowledge gaps in mosquito ecology. Identifying effects in the transitory and steady state phases allows more rigorous identification of the potential ecological effects of GE insect release. PMID:24198955
FDA 2011 process validation guidance: lifecycle compliance model.
Campbell, Cliff
2014-01-01
This article has been written as a contribution to the industry's efforts in migrating from a document-driven to a data-driven compliance mindset. A combination of target product profile, control engineering, and general sum principle techniques is presented as the basis of a simple but scalable lifecycle compliance model in support of modernized process validation. Unit operations and significant variables occupy pole position within the model, documentation requirements being treated as a derivative or consequence of the modeling process. The quality system is repositioned as a subordinate of system quality, this being defined as the integral of related "system qualities". The article represents a structured interpretation of the U.S. Food and Drug Administration's 2011 Guidance for Industry on Process Validation and is based on the author's educational background and his manufacturing/consulting experience in the validation field. The U.S. Food and Drug Administration's Guidance for Industry on Process Validation (2011) provides a wide-ranging and rigorous outline of compliant drug manufacturing requirements relative to its 20th century predecessor (1987). Its declared focus is patient safety, and it identifies three inter-related (and obvious) stages of the compliance lifecycle. Firstly, processes must be designed, both from a technical and quality perspective. Secondly, processes must be qualified, providing evidence that the manufacturing facility is fully "roadworthy" and fit for its intended purpose. Thirdly, processes must be verified, meaning that commercial batches must be monitored to ensure that processes remain in a state of control throughout their lifetime.
Sukumaran, Anuraj T; Holtcamp, Alexander J; Campbell, Yan L; Burnett, Derris; Schilling, Mark W; Dinh, Thu T N
2018-06-07
The objective of this study was to determine the effects of deboning time (pre- and post-rigor), processing steps (grinding - GB; salting - SB; batter formulation - BB), and storage time on the quality of raw beef mixtures and vacuum-packaged cooked sausage, produced using a commercial formulation with 0.25% phosphate. The pH was greater in pre-rigor GB and SB than in post-rigor GB and SB (P < .001). However, deboning time had no effect on metmyoglobin reducing activity, cooking loss, and color of raw beef mixtures. Protein solubility of pre-rigor beef mixtures (124.26 mg/kg) was greater than that of post-rigor beef (113.93 mg/kg; P = .071). TBARS were increased in BB but decreased during vacuum storage of cooked sausage (P ≤ .018). Except for chewiness and saltiness being 52.9 N-mm and 0.3 points greater in post-rigor sausage (P = .040 and 0.054, respectively), texture profile analysis and trained panelists detected no difference in texture between pre- and post-rigor sausage. Published by Elsevier Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hugo, Jacques Victor; Gertman, David Ira
The new generation of nuclear power plants (NPPs) will likely make use of state-of-the-art technologies in many areas of the plant. The analysis, design, and selection of advanced human–system interfaces (HSIs) constitute an important part of power plant engineering. Designers need to consider the new capabilities afforded by these technologies in the context of current regulations and new operational concepts, which is why they need a more rigorous method by which to plan the introduction of advanced HSIs in NPP work areas. Much of current human factors research stops at the user interface and fails to provide a definitive process for integration of end user devices with instrumentation and control (I&C) and operational concepts. The current lack of a clear definition of HSI technology, including the process for integration, makes characterization and implementation of new and advanced HSIs difficult. This paper describes how new design concepts in the nuclear industry can be analyzed and how HSI technologies associated with new industrial processes might be considered. Furthermore, it also describes a basis for an understanding of human as well as technology characteristics that could be incorporated into a prioritization scheme for technology selection and deployment plans.
NASA Astrophysics Data System (ADS)
Ipsen, Andreas; Ebbels, Timothy M. D.
2014-10-01
In a recent article, we derived a probability distribution that was shown to closely approximate that of the data produced by liquid chromatography time-of-flight mass spectrometry (LC/TOFMS) instruments employing time-to-digital converters (TDCs) as part of their detection system. The approach of formulating detailed and highly accurate mathematical models of LC/MS data via probability distributions that are parameterized by quantities of analytical interest does not appear to have been fully explored before. However, we believe it could lead to a statistically rigorous framework for addressing many of the data analytical problems that arise in LC/MS studies. In this article, we present new procedures for correcting for TDC saturation using such an approach and demonstrate that there is potential for significant improvements in the effective dynamic range of TDC-based mass spectrometers, which could make them much more competitive with the alternative analog-to-digital converters (ADCs). The degree of improvement depends on our ability to generate mass and chromatographic peaks that conform to known mathematical functions and our ability to accurately describe the state of the detector dead time—tasks that may be best addressed through engineering efforts.
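The specific probabilistic procedure developed in the article is not reproduced here; as background, a widely used simpler correction for TDC saturation (assuming Poisson ion arrivals and at most one recorded ion per bin per extraction; all names are illustrative) inverts the observed count fraction as follows:

```python
# Hedged sketch: textbook single-ion-per-extraction TDC saturation correction.
import numpy as np

def tdc_saturation_correction(counts, n_extractions):
    """counts: recorded events per bin over n_extractions TOF extractions.
    Returns the estimated true ion count per bin, -N * ln(1 - k/N)."""
    frac = np.asarray(counts, dtype=float) / n_extractions
    frac = np.clip(frac, 0.0, 1.0 - 1e-12)     # guard against log(0) at full saturation
    return -np.log1p(-frac) * n_extractions
```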
Development of High Fidelity, Fuel-Like Thermal Simulators for Non-Nuclear Testing
NASA Technical Reports Server (NTRS)
Bragg-Sitton, S. M.; Farmer, J.; Dixon, D.; Kapernick, R.; Dickens, R.; Adams, M.
2007-01-01
Non-nuclear testing can be a valuable tool in development of a space nuclear power or propulsion system. In a non-nuclear test bed, electric heaters are used to simulate the heat from nuclear fuel. Work at the NASA Marshall Space Flight Center seeks to develop high fidelity thermal simulators that not only match the static power profile that would be observed in an operating, fueled nuclear reactor, but also match the dynamic fuel pin performance during feasible transients. Comparison between the fuel pins and thermal simulators is made at the fuel clad surface, which corresponds to the sheath surface in the thermal simulator. Static and dynamic fuel pin performance was determined using SINDA-FLUINT analysis, and the performance of conceptual thermal simulator designs was compared to the expected nuclear performance. Through a series of iterative analyses, a conceptual high fidelity design will be developed, followed by engineering design, fabrication, and testing to validate the overall design process. Although the resulting thermal simulator will be designed for a specific reactor concept, establishing this rigorous design process will assist in streamlining the thermal simulator development for other reactor concepts.
Physics Education in a Multidisciplinary Materials Research Environment
NASA Astrophysics Data System (ADS)
Doyle, W. D.
1997-03-01
The MINT Center, an NSF Materials Research Science and Engineering Center, is a multidisciplinary research program focusing on materials for information storage. It involves 17 faculty, 10 post-doctoral fellows and 25 graduate students from six academic programs including Physics, Chemistry, Materials Science, Metallurgical and Materials Engineering, Electrical Engineering and Chemical Engineering, whose research is supported by university, federal and industrial funds. The research facilities (15,000 ft^2), which include faculty and student offices, are located in one building and are maintained by the university and the Center at no cost to participating faculty. The academic requirements for the students are determined by the individual departments along relatively rigid, traditional grounds, although several materials and device courses are offered for students from all departments. Within the Center, participants work in teams, assigning responsibilities and sharing results at regularly scheduled meetings. Bi-weekly research seminars for all participants provide excellent opportunities for students to improve their communication skills and to receive critical input from a large, diverse audience. Strong collaboration with industrial partners in the storage industry, supported by workshops, research reviews, internships, industrial visitors and participation in industry consortia, gives students broader criteria for self-evaluation, higher motivation and excellent career opportunities. Physics students, because of their rigorous basic training, are an important element in a strong materials sciences program, but they often are deficient in their knowledge of the behavior and characterization of real materials. The curriculum for physics students should be broadened to prepare them fully for a rewarding career in this emerging discipline.
Texas M-E flexible pavement design system: literature review and proposed framework.
DOT National Transportation Integrated Search
2012-04-01
Recent developments over last several decades have offered an opportunity for more rational and rigorous pavement design procedures. Substantial work has already been completed in Texas, nationally, and internationally, in all aspects of modeling, ma...
Nuclear Safety via Commercial Grade Dedication - Hitting the Right Target - 12163
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kindred, Greg
2012-07-01
S.A.Technology has developed and implemented a highly effective Commercial Grade Dedication program that has been used to qualify a variety of equipment to the rigorous requirements of ASTM NQA-1. In specific cases, S.A.Technology personnel have worked closely with clients to develop complex Commercial Grade Dedication plans that satisfy the scrutiny of the US Department of Energy. These projects have been as simple as Passive Mechanical systems, and as complicated as Active Mechanical and Electrical systems. S.A.Technology's Commercial Grade Dedication plans have even been used as presentation materials to a client's internal departments encompassing Engineering, Quality and Procurement. This is themore » new target of today's CGD: exposing the reasoning behind the dedication process. Previously, only test and inspection results were expected. Today's CGD now needs to show how the decisions presented are the right decisions to make. We must be willing to undergo the process of learning how each new piece of equipment is affected by the system it is placed into, as well as understanding how that equipment can affect the system itself. It is a much more complicated and time-consuming endeavor to undertake. On top of it all, we must be able to voice those discoveries and rationalizations in a clear and concise manner. Unless we effectively communicate our intentions to the reader, we will not be understood. If researched correctly and presented properly, today's Commercial Grade Dedication plans will answer the appropriate questions before they are asked. (authors)« less
NASA Ares I Crew Launch Vehicle Upper Stage Overview
NASA Technical Reports Server (NTRS)
Davis, Daniel J.
2008-01-01
By incorporating rigorous engineering practices, innovative manufacturing processes and test techniques, a unique multi-center government/contractor partnership, and a clean-sheet design developed around the primary requirements for the International Space Station (ISS) and Lunar missions, the Upper Stage Element of NASA's Crew Launch Vehicle (CLV), the "Ares I," is a vital part of the Constellation Program's transportation system. Constellation's exploration missions will include Ares I and Ares V launch vehicles required to place crew and cargo in low-Earth orbit (LEO), crew and cargo transportation systems required for human space travel, and transportation systems and scientific equipment required for human exploration of the Moon and Mars. Early Ares I configurations will support ISS re-supply missions. A self-supporting cylindrical structure, the Ares I Upper Stage will be approximately 84' long and 18' in diameter. The Upper Stage Element is being designed for increased supportability and increased reliability to meet human-rating requirements imposed by NASA standards. The design also incorporates state-of-the-art materials, hardware, design, and integrated logistics planning, thus facilitating a supportable, reliable, and operable system. With NASA retiring the Space Shuttle fleet in 2010, the success of the Ares I Project is essential to America's continued leadership in space. The first Ares I test flight, called Ares 1-X, is scheduled for 2009. Subsequent test flights will continue thereafter, with the first crewed flight of the Crew Exploration Vehicle (CEV), "Orion," planned for no later than 2015. Crew transportation to the ISS will follow within the same decade, and the first Lunar excursion is scheduled for the 2020 timeframe.
NASA Ares I Crew Launch Vehicle Upper Stage Overview
NASA Technical Reports Server (NTRS)
McArthur, J. Craig
2008-01-01
By incorporating rigorous engineering practices, innovative manufacturing processes and test techniques, a unique multi-center government/contractor partnership, and a clean-sheet design developed around the primary requirements for the International Space Station (ISS) and Lunar missions, the Upper Stage Element of NASA's Crew Launch Vehicle (CLV), the "Ares I," is a vital part of the Constellation Program's transportation system. Constellation's exploration missions will include Ares I and Ares V launch vehicles required to place crew and cargo in low-Earth orbit (LEO), crew and cargo transportation systems required for human space travel, and transportation systems and scientific equipment required for human exploration of the Moon and Mars. Early Ares I configurations will support ISS re-supply missions. A self-supporting cylindrical structure, the Ares I Upper Stage will be approximately 84' long and 18' in diameter. The Upper Stage Element is being designed for increased supportability and increased reliability to meet human-rating requirements imposed by NASA standards. The design also incorporates state-of-the-art materials, hardware, design, and integrated logistics planning, thus facilitating a supportable, reliable, and operable system. With NASA retiring the Space Shuttle fleet in 2010, the success of the Ares I Project is essential to America's continued leadership in space. The first Ares I test flight, called Ares I-X, is scheduled for 2009. Subsequent test flights will continue thereafter, with the first crewed flight of the Crew Exploration Vehicle (CEV), "Orion," planned for no later than 2015. Crew transportation to the ISS will follow within the same decade, and the first Lunar excursion is scheduled for the 2020 timeframe.
Advanced computational techniques for incompressible/compressible fluid-structure interactions
NASA Astrophysics Data System (ADS)
Kumar, Vinod
2005-07-01
Fluid-Structure Interaction (FSI) problems are of great importance to many fields of engineering and pose tremendous challenges to numerical analysts. This thesis addresses some of the hurdles faced for both 2D and 3D real life time-dependent FSI problems with particular emphasis on parachute systems. The techniques developed here would help improve the design of parachutes and are of direct relevance to several other FSI problems. The fluid system is solved using the Deforming-Spatial-Domain/Stabilized Space-Time (DSD/SST) finite element formulation for the Navier-Stokes equations of incompressible and compressible flows. The structural dynamics solver is based on a total Lagrangian finite element formulation. The Newton-Raphson method is employed to linearize the otherwise nonlinear system resulting from the fluid and structure formulations. The fluid and structural systems are solved in a decoupled fashion at each nonlinear iteration. While rigorous coupling methods are desirable for FSI simulations, the decoupled solution techniques provide sufficient convergence in the time-dependent problems considered here. In this thesis, common problems in the FSI simulations of parachutes are discussed and possible remedies for a few of them are presented. Further, the effects of the porosity model on the aerodynamic forces of round parachutes are analyzed. Techniques for solving compressible FSI problems are also discussed. Subsequently, a better stabilization technique is proposed to efficiently capture and accurately predict the shocks in supersonic flows. The numerical examples simulated here require high performance computing. Therefore, numerical tools using distributed memory supercomputers with message passing interface (MPI) libraries were developed.
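To make the decoupled solution strategy concrete, here is a toy, scalar stand-in for a staggered fluid-structure iteration; the "fluid" and "structure" solves are placeholder algebraic models invented for illustration, not routines from the thesis. Fluid and structure are solved in sequence and the interface displacement is repeated until it stops changing:

```python
# Toy sketch of a decoupled (staggered) fluid-structure iteration.
def fluid_load(d):            # stand-in fluid solve: load depends on interface displacement
    return 1.0 - 0.3 * d

def structure_disp(f):        # stand-in structural solve: linear spring d = f / k
    return f / 2.0

def staggered_step(d=0.0, tol=1e-10, max_iters=50):
    for _ in range(max_iters):
        d_new = structure_disp(fluid_load(d))   # fluid then structure, in sequence
        if abs(d_new - d) < tol:
            break
        d = d_new
    return d

print(staggered_step())       # converged interface displacement of the toy problem
```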
Ribosome flow model with positive feedback
Margaliot, Michael; Tuller, Tamir
2013-01-01
Eukaryotic mRNAs usually form a circular structure; thus, ribosomes that terminate translation at the 3′ end can diffuse with increased probability to the 5′ end of the transcript, initiating another cycle of translation. This phenomenon describes ribosomal flow with positive feedback—an increase in the flow of ribosomes terminating translation of the open reading frame increases the ribosomal initiation rate. The aim of this paper is to model and rigorously analyse translation with feedback. We suggest a modified version of the ribosome flow model, called the ribosome flow model with input and output. In this model, the input is the initiation rate and the output is the translation rate. We analyse this model after closing the loop with a positive linear feedback. We show that the closed-loop system admits a unique globally asymptotically stable equilibrium point. From a biophysical point of view, this means that there exists a unique steady state of ribosome distributions along the mRNA, and thus a unique steady-state translation rate. The solution from any initial distribution will converge to this steady state. The steady-state distribution demonstrates a decrease in ribosome density along the coding sequence. For the case of constant elongation rates, we obtain expressions relating the model parameters to the equilibrium point. These results may perhaps be used to re-engineer the biological system in order to obtain a desired translation rate. PMID:23720534
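As a purely numerical illustration of the closed-loop setup (the site count, elongation rates, and feedback coefficients below are invented; the paper's results are analytical), one can integrate the ribosome flow model with the initiation rate tied to the output rate by a positive linear feedback and watch it settle to a steady state:

```python
# Hedged sketch: ribosome flow model closed by positive linear feedback.
import numpy as np

lam = np.array([1.0, 0.8, 0.9, 0.7, 1.1])   # elongation rates lambda_1..lambda_n (made up)
a, b = 0.2, 0.5                             # feedback law: lambda_0 = a + b * output

def rfm_feedback(x, dt=0.01, steps=50_000):
    """Forward-Euler integration of site occupancies x_1..x_n."""
    for _ in range(steps):
        out = lam[-1] * x[-1]                        # translation (output) rate
        lam0 = a + b * out                           # positive linear feedback on initiation
        between = lam[:-1] * x[:-1] * (1 - x[1:])    # flow between consecutive sites
        flow_in = np.concatenate(([lam0 * (1 - x[0])], between))
        flow_out = np.concatenate((between, [out]))
        x = x + dt * (flow_in - flow_out)
    return x

x_ss = rfm_feedback(np.zeros(5))
print("site densities:", x_ss, "steady-state translation rate:", lam[-1] * x_ss[-1])
```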
Evaluating Manufacturing and Assembly Errors in Rotating Machinery to Enhance Component Performance
NASA Technical Reports Server (NTRS)
Tumer, Irem Y.; Huff, Edward M.; Swanson, Keith (Technical Monitor)
2001-01-01
Manufacturing and assembly phases play a crucial role in providing products that meet the strict functional specifications associated with rotating machinery components. The errors resulting during the manufacturing and assembly of such components are correlated with the vibration and noise emanating from the final system during its operational lifetime. Vibration and noise are especially unacceptable elements in high-risk systems such as helicopters, resulting in premature component degradation and an unsafe flying environment. In such applications, individual components often are subject to 100% inspection prior to assembly, as well as during operation through rigorous maintenance, resulting in increased product development cycles and high production and operation costs. In this work, we focus on providing designers and manufacturing engineers with a technique to evaluate vibration modes and levels for each component or subsystem prior to putting them into operation. This paper presents a preliminary investigation of the correlation between vibrations and manufacturing and assembly errors using an experimental test rig, which simulates a simple bearing and shaft arrangement. A factorial design is used to study the effects of: 1) different manufacturing instances; 2) different assembly instances; and, 3) varying shaft speeds. The results indicate a correlation between manufacturing or assembly errors and vibrations measured from accelerometers. Challenges in developing a tool for DFM are identified, followed by a discussion of future work, including a real-world application to helicopter transmission vibrations.
A preliminary and qualitative study of resource ratio theory to nitrifying lab-scale bioreactors
Bellucci, Micol; Ofiţeru, Irina D; Beneduce, Luciano; Graham, David W; Head, Ian M; Curtis, Thomas P
2015-01-01
The incorporation of microbial diversity in design would ideally require predictive theory that would relate operational parameters to the numbers and distribution of taxa. Resource-ratio theory (RRT) might be one such theory. Based on Monod kinetics, it explains diversity as a function of resource ratio and richness. However, to be usable in engineered biological systems, the growth parameters of all the bacteria under consideration and the resource supply and diffusion parameters for all the relevant nutrients should be determined. This is challenging, but plausible, at least for low diversity groups with simple resource requirements like the ammonia oxidizing bacteria (AOB). One of the major successes of RRT was its ability to explain the ‘paradox of enrichment’ which states that diversity first increases and then decreases with resource richness. Here, we demonstrate that this pattern can be seen in lab-scale activated sludge reactors and parallel simulations that incorporate the principles of RRT in a floc-based system. High and low ammonia and oxygen were supplied to continuous flow bioreactors with resource conditions correlating with the composition and diversity of resident AOB communities based on AOB 16S rDNA clone libraries. Neither the experimental work nor the simulations are definitive proof for the application of RRT in this context. However, it is sufficient evidence that such an approach might work and justifies a more rigorous investigation. PMID:25874592
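The kinetic ingredient RRT builds on is Monod growth limited by the scarcer of two resources (here ammonia and oxygen for AOB). The sketch below is illustrative only, with placeholder parameter values rather than growth parameters estimated in the study:

```python
# Hedged sketch: Monod growth on two resources with Liebig's law of the minimum.
import numpy as np

def mu(S_nh3, S_o2, mu_max=0.8, K_nh3=0.5, K_o2=0.3):
    """Specific growth rate (1/d) limited by the scarcer of the two resources."""
    return mu_max * min(S_nh3 / (K_nh3 + S_nh3), S_o2 / (K_o2 + S_o2))

# Growth rate across a gradient of resource ratios at fixed total supply
for f in np.linspace(0.1, 0.9, 5):
    print(f"NH3 fraction {f:.1f}: mu = {mu(10 * f, 10 * (1 - f)):.3f} 1/d")
```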
Good pharmacovigilance practices: technology enabled.
Nelson, Robert C; Palsulich, Bruce; Gogolak, Victor
2002-01-01
The assessment of spontaneous reports is most effective when it is conducted within a defined and rigorous process. The framework for good pharmacovigilance process (GPVP) is proposed as a subset of good postmarketing surveillance process (GPMSP), a functional structure for both a public health and corporate risk management strategy. GPVP has good practices that implement each step within a defined process. These practices are designed to efficiently and effectively detect and alert the drug safety professional to new and potentially important information on drug-associated adverse reactions. These practices are enabled by applied technology designed specifically for the review and assessment of spontaneous reports. Specific practices include rules-based triage, active query prompts for severe organ insults, contextual single case evaluation, statistical proportionality and correlational checks, case-series analyses, and templates for signal work-up and interpretation. These practices and the overall GPVP are supported by state-of-the-art web-based systems with powerful analytical engines, workflow and audit trails to allow validated systems support for valid drug safety signalling efforts. It is also important to understand that a process has a defined set of steps and any one cannot stand independently. Specifically, advanced use of technical alerting methods in isolation can mislead and allow one to misunderstand priorities and relative value. In the end, pharmacovigilance is a clinical art and a component process to the science of pharmacoepidemiology and risk management.
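One widely used form of the "statistical proportionality" check mentioned here is the proportional reporting ratio (PRR), computed from a 2x2 table of spontaneous reports. The sketch below uses invented counts and is background material rather than the article's method:

```python
# Hedged sketch: proportional reporting ratio from a 2x2 report table.
def prr(a, b, c, d):
    """a: drug & event, b: drug & other events, c: other drugs & event, d: the rest."""
    return (a / (a + b)) / (c / (c + d))

print(f"PRR = {prr(a=20, b=480, c=100, d=49_400):.1f}")   # values above ~2 are often flagged
```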
NASA Technical Reports Server (NTRS)
Phfarr, Barbara B.; So, Maria M.; Lamb, Caroline Twomey; Rhodes, Donna H.
2009-01-01
Experienced systems engineers are adept at more than implementing systems engineering processes: they utilize systems thinking to solve complex engineering problems. Within the space industry, demographics and economic pressures are reducing the number of experienced systems engineers who will be available in the future. Collaborative systems thinking within systems engineering teams is proposed as a way to integrate systems engineers of various experience levels to handle complex systems engineering challenges. This paper uses the GOES-R Program Systems Engineering team to illustrate the enablers and barriers to team-level systems thinking and to identify ways in which performance could be improved. Ways NASA could expand its engineering training to promote team-level systems thinking are proposed.
National Aeronautics and Space Administration Exploration Systems Interim Strategy
NASA Technical Reports Server (NTRS)
2004-01-01
Contents include the following: 1. The Exploration Systems Mission Directorate within NASA. Enabling the Vision for Space Exploration. The Role of the Directorate. 2. Strategic Context and Approach. Corporate Focus. Focused, Prioritized Requirements. Spiral Transformation. Management Rigor. 3. Achieving Directorate Objectives. Strategy to Task Process. Capability Development. Research and Technology Development. 4. Beyond the Horizon. Appendices.
Binding and Scope Dependencies with "Floating Quantifiers" in Japanese
ERIC Educational Resources Information Center
Mukai, Emi
2012-01-01
The primary concern of this thesis is how we can achieve rigorous testability when we set the properties of the Computational System (hypothesized to be at the center of the language faculty) as our object of inquiry and informant judgments as a tool to construct and/or evaluate our hypotheses concerning the properties of the Computational System.…
ERIC Educational Resources Information Center
Collins-Camargo, Crystal; Shackelford, Kim; Kelly, Michael; Martin-Galijatovic, Ramie
2011-01-01
Expansion of the child welfare evidence base is a major challenge. The field must establish how organizational systems and practice techniques yield outcomes for children and families. Needed research must be grounded in practice and must engage practitioners and administrators via participatory evaluation. The extent to which successful practices…
ERIC Educational Resources Information Center
Lacireno-Paquet, Natalie; Bocala, Candice; Bailey, Jessica
2016-01-01
Recent changes in the policy environment have led states and districts in the Regional Educational Laboratory (REL) Northeast & Islands Region to increase the rigor of their teacher evaluation systems by including more frequent observations or student test score data. States and districts nationwide began reforming their evaluation systems as…
ERIC Educational Resources Information Center
Babb, Jeffry S.; Abdullat, Amjad
2014-01-01
Undergraduate programs in Information Systems are challenged to offer a curriculum that is both rigorous and relevant. Specialized college-level accreditation, such as AACSB, and program-level accreditation, such as ABET, offer an opportunity to signal quality in academics while also remaining relevant to local stakeholders and constituents.…
Krompecher, T; Fryc, O
1978-01-01
The use of new methods and an appropriate apparatus allowed us to make successive measurements of rigor mortis and to study its evolution in the rat. By comparative examination of the front and hind limbs, we determined the following: (1) The muscular mass of the hind limbs is 2.89 times greater than that of the front limbs. (2) In the initial phase, rigor mortis is more pronounced in the front limbs. (3) The front and hind limbs reach maximum rigor mortis at the same time, and this state is maintained for 2 hours. (4) Resolution of rigor mortis proceeds faster in the front limbs during the initial phase, but both front and hind limbs reach complete resolution at the same time.
Onset of rigor mortis is earlier in red muscle than in white muscle.
Kobayashi, M; Takatori, T; Nakajima, M; Sakurada, K; Hatanaka, K; Ikegaya, H; Matsuda, Y; Iwase, H
2000-01-01
Rigor mortis is thought to be related to falling ATP levels in muscle postmortem. We measured rigor mortis as tension determined isometrically in three rat leg muscles kept in liquid paraffin at 37 degrees C or 25 degrees C: two red muscles, the red gastrocnemius (RG) and the soleus (SO), and one white muscle, the white gastrocnemius (WG). Onset, half and full rigor mortis occurred earlier in RG and SO than in WG at both 37 degrees C and 25 degrees C, even though RG and WG are portions of the same muscle. This suggests that rigor mortis directly reflects the postmortem intramuscular ATP level, which decreases more rapidly in red muscle than in white muscle after death. Rigor mortis was more retarded at 25 degrees C than at 37 degrees C in each type of muscle.
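A simple way to extract onset, half and full rigor landmarks from an isometric tension trace is to take the first crossing of fixed fractions of the plateau tension. The sketch below assumes those fractions (onset_frac, half_frac, full_frac) and a synthetic sigmoid tension curve, neither of which is taken from the study:

import numpy as np

def rigor_landmarks(t, tension, onset_frac=0.1, half_frac=0.5, full_frac=0.9):
    """Estimate onset, half and full rigor times from an isometric tension trace.

    Landmarks are taken as the first times tension exceeds the given fractions
    of the plateau (maximum) tension; the thresholds are assumptions, not the
    criteria used in the study.
    """
    peak = tension.max()
    times = {}
    for name, frac in [("onset", onset_frac), ("half", half_frac), ("full", full_frac)]:
        idx = np.argmax(tension >= frac * peak)   # first index crossing the threshold
        times[name] = t[idx]
    return times

# Synthetic tension curve (arbitrary units) for illustration only.
t = np.linspace(0, 10, 200)                       # hours postmortem
tension = 1.0 / (1.0 + np.exp(-(t - 4.0) * 2.0))  # sigmoid rise toward a plateau
print(rigor_landmarks(t, tension))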
Warriss, P D; Brown, S N; Knowles, T G
2003-12-13
The degree of development of rigor mortis in the carcases of slaughter pigs was assessed subjectively on a three-point scale 35 minutes after they were exsanguinated, and related to the levels of cortisol, lactate and creatine kinase in blood collected at exsanguination. Earlier rigor development was associated with higher concentrations of these stress indicators in the blood. This relationship suggests that the mean rigor score, and the frequency distribution of carcases that had or had not entered rigor, could be used as an index of the degree of stress to which the pigs had been subjected.
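A sketch of how such an index might be quantified, using simulated data only; the lactate values, the 1-3 scoring rule, and the use of a Spearman rank correlation are assumptions for illustration and not the study's method:

import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Simulated data for illustration only (not the study's measurements).
n = 80
lactate = rng.gamma(shape=4.0, scale=2.0, size=n)   # blood lactate at exsanguination
# Higher lactate pushes carcases toward higher rigor scores on a 1-3 scale.
rigor_score = np.clip(np.round(1 + lactate / lactate.max() * 2 + rng.normal(0, 0.3, n)), 1, 3)

print("mean rigor score:", rigor_score.mean().round(2))
print("carcases already in rigor (score >= 2):", int((rigor_score >= 2).sum()))
rho, p = spearmanr(rigor_score, lactate)
print(f"Spearman rho = {rho:.2f}, p = {p:.3g}")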