Center for the Built Environment: UFAD Cooling Load Design Tool
Energy Publications. Project title: Underfloor Air Distribution (UFAD) Cooling Load Design Tool. Webster, 2010: Development of a simplified cooling load design tool for underfloor air distribution.
Simplified tools for evaluating domestic ventilation systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maansson, L.G.; Orme, M.
1999-07-01
Within an International Energy Agency (IEA) project, Annex 27, experts from 8 countries (Canada, France, Italy, Japan, The Netherlands, Sweden, UK and USA) have developed simplified tools for evaluating domestic ventilation systems during the heating season. Tools for building and user aspects, thermal comfort, noise, energy, life cycle cost, reliability and indoor air quality (IAQ) have been devised. The results can be used both for dwellings at the design stage and after construction. The tools lead to immediate answers and indications about the consequences of different choices that may arise during discussion with clients. This paper presents an introduction to these tools. Example applications of the indoor air quality and energy simplified tools are also provided. The IAQ tool accounts for constant emission sources, CO2, cooking products, tobacco smoke, condensation risks, humidity levels (i.e., for judging the risk of mould and house dust mites), and pressure difference (for identifying the risk of radon or landfill spillage entering the dwelling, or problems with indoor combustion appliances). An elaborated set of design parameters was worked out that resulted in about 17,000 combinations. By using multivariate analysis it was possible to reduce this to 174 combinations for IAQ. In addition, a sensitivity analysis was made using 990 combinations. The results from all the runs were used to develop a simplified tool, as well as quantifying equations relying on the design parameters. A computerized energy tool has also been developed within this project, which takes into account air tightness, climate, window airing pattern, outdoor air flow rate and heat exchange efficiency.
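As a flavor of the kind of screening calculation a simplified IAQ tool performs, the sketch below evaluates a steady-state CO2 balance for a single well-mixed zone in Python. This is a generic textbook mass balance, not the Annex 27 tool itself; the generation and ventilation figures are illustrative assumptions.

```python
# Minimal sketch of a steady-state, well-mixed-zone CO2 balance of the kind a
# simplified IAQ screening tool might use. NOT the Annex 27 tool itself; the
# generation rate and ventilation values below are illustrative assumptions.

def steady_state_co2_ppm(outdoor_ppm, generation_m3_per_h, ventilation_m3_per_h):
    """C = C_out + G/Q for a single well-mixed zone (result in ppm)."""
    return outdoor_ppm + (generation_m3_per_h / ventilation_m3_per_h) * 1e6

# Example: two adults (~0.015 m3/h CO2 each) in a room ventilated at 54 m3/h.
if __name__ == "__main__":
    c = steady_state_co2_ppm(400.0, 2 * 0.015, 54.0)
    print(f"steady-state CO2 ~ {c:.0f} ppm")
```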
Center for the Built Environment: Research on Building HVAC Systems
Underfloor Air Distribution (UFAD) Cooling Airflow Design Tool: developing simplified design tools for optimization of underfloor systems.
Guidelines and Metrics for Assessing Space System Cost Estimates
2008-01-01
…analysis time; reuse (tooling, models, mechanical ground-support equipment [MGSE]); high mass margin (simplifying assumptions used to bound solution) … engineering environment changes; high reuse of architecture, design, tools, code, test scripts, and commercial real-time operating systems; simplified life… [Acronym glossary fragments: TWTA, traveling wave tube amplifier; USAF, U.S. Air Force; USCM, Unmanned Space Vehicle Cost Model; USN, U.S. Navy; UV, ultraviolet; UVOT, UV…]
SMOKE TOOL FOR MODELS-3 VERSION 4.1 STRUCTURE AND OPERATION DOCUMENTATION
The SMOKE Tool is a part of the Models-3 system, a flexible software system designed to simplify the development and use of air quality models and other environmental decision support tools. The SMOKE Tool is an input processor for SMOKE, (Sparse Matrix Operator Kernel Emissio...
Nose-to-tail analysis of an airbreathing hypersonic vehicle using an in-house simplified tool
NASA Astrophysics Data System (ADS)
Piscitelli, Filomena; Cutrone, Luigi; Pezzella, Giuseppe; Roncioni, Pietro; Marini, Marco
2017-07-01
SPREAD (Scramjet PREliminary Aerothermodynamic Design) is a simplified, in-house method developed by CIRA (Italian Aerospace Research Centre), able to provide a preliminary estimation of the performance of engine/aeroshape for airbreathing configurations. It is especially useful for scramjet engines, for which the strong coupling between the aerothermodynamic (external) and propulsive (internal) flow fields requires real-time screening of several engine/aeroshape configurations and the identification of the most promising one/s with respect to user-defined constraints and requirements. The outcome of this tool defines the base-line configuration for further design analyses with more accurate tools, e.g., CFD simulations and wind tunnel testing. SPREAD tool has been used to perform the nose-to-tail analysis of the LAPCAT-II Mach 8 MR2.4 vehicle configuration. The numerical results demonstrate SPREAD capability to quickly predict reliable values of aero-propulsive balance (i.e., net-thrust) and aerodynamic efficiency in a pre-design phase.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bono, M J; Hibbard, R L
2005-12-05
A tool holder was designed to facilitate the machining of precision meso-scale components with complex three-dimensional shapes with sub-µm accuracy on a four-axis lathe. A four-axis lathe incorporates a rotary table that allows the cutting tool to swivel with respect to the workpiece to enable the machining of complex workpiece forms, and accurately machining complex meso-scale parts often requires that the cutting tool be aligned precisely along the axis of rotation of the rotary table. The tool holder designed in this study has greatly simplified the process of setting the tool in the correct location with sub-µm precision. The tool holder adjusts the tool position using flexures that were designed using finite element analyses. Two flexures adjust the lateral position of the tool to align the center of the nose of the tool with the axis of rotation of the B-axis, and another flexure adjusts the height of the tool. The flexures are driven by manual micrometer adjusters, each of which provides a minimum increment of motion of 20 nm. This tool holder has simplified the process of setting a tool with sub-µm accuracy, and it has significantly reduced the time required to set a tool.
Simulation in the Service of Design - Asking the Right Questions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Donn, Michael; Selkowitz, Stephen; Bordass, Bill
2009-03-01
This paper proposes an approach to the creation of design tools that address the real information needs of designers in the early stages of design of nonresidential buildings. Traditional simplified design tools are typically too limited to be of much use, even in conceptual design. The proposal is to provide access to the power of detailed simulation tools, at a stage in design when little is known about the final building, but at a stage also when the freedom to explore options is greatest. The proposed approach to tool design has been derived from consultation with design analysis teams as part of the COMFEN tool development. The paper explores how tools like COMFEN have been shaped by this consultation and how requests from these teams for real-world relevance might shape such tools in the future, drawing into the simulation process the lessons from Post Occupancy Evaluation (POE) of buildings.
DOT National Transportation Integrated Search
1980-06-01
The purpose of this report is to provide the tunneling profession with improved practical tools in the technical or design area, which provide more accurate representations of the ground-structure interaction in tunneling. The design methods range fr...
Some Novel Design Principles for Collective Behaviors in Mobile Robots
DOE Office of Scientific and Technical Information (OSTI.GOV)
OSBOURN, GORDON C.
2002-09-01
We present a set of novel design principles to aid in the development of complex collective behaviors in fleets of mobile robots. The key elements are: the use of a graph algorithm that we have created, with certain proven properties, that guarantee scalable local communications for fleets of arbitrary size; the use of artificial forces to simplify the design of motion control; the use of certain proximity values in the graph algorithm to simplify the sharing of robust navigation and sensor information among the robots. We describe these design elements and present a computer simulation that illustrates the behaviors readily achievable with these design tools.
Simplified, inverse, ejector design tool
NASA Technical Reports Server (NTRS)
Dechant, Lawrence J.
1993-01-01
A simple lumped-parameter-based inverse design tool has been developed which provides flow path geometry and entrainment estimates subject to operational, acoustic, and design constraints. These constraints are manifested through specification of primary mass flow rate or ejector thrust, fully-mixed exit velocity, and static pressure matching. Fundamentally, integral forms of the conservation equations coupled with the specified design constraints are combined to yield an easily invertible linear system in terms of the flow path cross-sectional areas. Entrainment is computed by back substitution. Initial comparisons with experimental and analogous one-dimensional methods show good agreement. Thus, this simple inverse design code provides an analytically based, preliminary design tool with direct application to High Speed Civil Transport (HSCT) design studies.
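The following Python sketch illustrates the general flavor of such a lumped-parameter inverse sizing: with static pressure assumed matched so pressure terms drop out, the one-dimensional momentum balance yields the entrained flow, and continuity then fixes each cross-sectional area by back substitution. This is a generic incompressible analogue, not Dechant's exact formulation; all numbers are illustrative assumptions.

```python
# Hedged, generic 1-D incompressible ejector sizing sketch. Static pressure is
# assumed matched at the mixing-duct inlet and exit, so pressure terms drop out
# of the momentum balance. Values are illustrative, not from the NASA report.

RHO = 1.2  # air density, kg/m^3 (assumed)

def size_ejector(mdot_p, v_p, v_s, v_e):
    """Given primary mass flow mdot_p [kg/s] and primary/secondary/mixed-exit
    velocities [m/s], return (mdot_s, A_p, A_s, A_e)."""
    # Momentum balance with matched static pressure:
    #   mdot_p*v_p + mdot_s*v_s = (mdot_p + mdot_s)*v_e
    mdot_s = mdot_p * (v_p - v_e) / (v_e - v_s)  # entrained secondary flow
    # Continuity then fixes each cross-sectional area (back substitution):
    a_p = mdot_p / (RHO * v_p)
    a_s = mdot_s / (RHO * v_s)
    a_e = (mdot_p + mdot_s) / (RHO * v_e)
    return mdot_s, a_p, a_s, a_e

if __name__ == "__main__":
    mdot_s, a_p, a_s, a_e = size_ejector(mdot_p=10.0, v_p=600.0, v_s=100.0, v_e=250.0)
    print(f"entrainment ratio = {mdot_s / 10.0:.2f}; areas [m^2]: "
          f"primary={a_p:.3f}, secondary={a_s:.3f}, exit={a_e:.3f}")
```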
Geometric modeling for computer aided design
NASA Technical Reports Server (NTRS)
Schwing, James L.
1992-01-01
The goal was the design and implementation of software to be used in the conceptual design of aerospace vehicles. Several packages and design studies were completed, including two software tools currently used in the conceptual-level design of aerospace vehicles. These tools are the Solid Modeling Aerospace Research Tool (SMART) and the Environment for Software Integration and Execution (EASIE). SMART provides conceptual designers with a rapid prototyping capability and additionally provides initial mass property analysis. EASIE provides a set of interactive utilities that simplify the task of building and executing computer aided design systems consisting of diverse, stand-alone analysis codes, streamlining the exchange of data between programs, reducing errors, and improving efficiency.
Pressure distribution under flexible polishing tools. II - Cylindrical (conical) optics
NASA Astrophysics Data System (ADS)
Mehta, Pravin K.
1990-10-01
A previously developed eigenvalue model is extended to determine polishing pressure distribution by rectangular tools with unequal stiffness in two directions on cylindrical optics. Tool misfit is divided into two simplified one-dimensional problems and one simplified two-dimensional problem. Tools with nonuniform cross-sections are treated with a new one-dimensional eigenvalue algorithm, permitting evaluation of tool designs where the edge is more flexible than the interior. This maintains edge pressure variations within acceptable parameters. Finite element modeling is employed to resolve upper bounds, which handle pressure changes in the two-dimensional misfit element. Paraboloids and hyperboloids from the NASA AXAF system are treated with the AXAFPOD software for this method, and are verified with NASTRAN finite element analyses. The maximum deviation from the one-dimensional azimuthal pressure variation is predicted to be 10 percent and 20 percent for paraboloids and hyperboloids, respectively.
Development of a simplified urban water balance model (WABILA).
Henrichs, M; Langner, J; Uhl, M
2016-01-01
During the last decade, water sensitive urban design (WSUD) has become more and more accepted. However, no simple tool is available to evaluate the influence of these measures on the local water balance. To counteract the impact of new settlements, planners focus on mitigating increases in runoff through the installation of infiltration systems. This leads to increasing non-natural groundwater recharge and decreased evapotranspiration. Simple software tools that evaluate or simulate the effect of WSUD on the local water balance are still needed. The authors developed a tool named WABILA (Wasserbilanz) to support planners in optimal WSUD. WABILA is an easy-to-use planning tool based on simplified regression functions for established measures and land covers. Results show that WSUD has to be site-specific, based on climate conditions and the natural water balance.
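A minimal sketch of the kind of area-weighted site water balance such a tool automates is shown below. Each land cover splits annual precipitation into runoff (a), evapotranspiration (v) and infiltration (g) fractions with a + v + g = 1; the coefficients are invented placeholders, not WABILA's fitted regression values.

```python
# Minimal sketch of an area-weighted site water balance. The per-cover
# fractions below are illustrative assumptions, not WABILA's regressions.

COVER_FRACTIONS = {          # (runoff a, evapotranspiration v, infiltration g)
    "roof":       (0.85, 0.10, 0.05),
    "paved":      (0.70, 0.20, 0.10),
    "green_roof": (0.30, 0.60, 0.10),
    "lawn":       (0.05, 0.60, 0.35),
}

def site_water_balance(areas_m2):
    """Return area-weighted (a, v, g) for a dict {cover: area_m2}."""
    total = sum(areas_m2.values())
    a, v, g = (sum(COVER_FRACTIONS[c][k] * area for c, area in areas_m2.items()) / total
               for k in range(3))
    return a, v, g

if __name__ == "__main__":
    a, v, g = site_water_balance({"roof": 400, "paved": 300, "lawn": 300})
    print(f"runoff={a:.2f}, evapotranspiration={v:.2f}, infiltration={g:.2f}")
```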
USER MANUAL FOR THE EPA THIRD-GENERATION AIR QUALITY MODELING SYSTEM (MODELS-3 VERSION 3.0)
Models-3 is a flexible software system designed to simplify the development and use of environmental assessment and other decision support tools. It is designed for applications ranging from regulatory and policy analysis to understanding the complex interactions of atmospheri...
26 CFR 1.263A-0 - Outline of regulations under section 263A.
Code of Federal Regulations, 2010 CFR
2010-04-01
...) Utilities. (O) Repairs and maintenance. (P) Engineering and design costs. (Q) Spoilage. (R) Tools and...) Engineering and design services. (F) Safety engineering services. (v) Accounting method change. (h) Simplified... mixed service costs. (7) Costs allocable to more than one business. (8) De minimis rule. (9) Separate...
26 CFR 1.263A-0 - Outline of regulations under section 263A.
Code of Federal Regulations, 2012 CFR
2012-04-01
...) Utilities. (O) Repairs and maintenance. (P) Engineering and design costs. (Q) Spoilage. (R) Tools and...) Engineering and design services. (F) Safety engineering services. (v) Accounting method change. (h) Simplified... mixed service costs. (7) Costs allocable to more than one business. (8) De minimis rule. (9) Separate...
26 CFR 1.263A-0 - Outline of regulations under section 263A.
Code of Federal Regulations, 2011 CFR
2011-04-01
...) Utilities. (O) Repairs and maintenance. (P) Engineering and design costs. (Q) Spoilage. (R) Tools and...) Engineering and design services. (F) Safety engineering services. (v) Accounting method change. (h) Simplified... mixed service costs. (7) Costs allocable to more than one business. (8) De minimis rule. (9) Separate...
26 CFR 1.263A-0 - Outline of regulations under section 263A.
Code of Federal Regulations, 2013 CFR
2013-04-01
...) Utilities. (O) Repairs and maintenance. (P) Engineering and design costs. (Q) Spoilage. (R) Tools and...) Engineering and design services. (F) Safety engineering services. (v) Accounting method change. (h) Simplified... mixed service costs. (7) Costs allocable to more than one business. (8) De minimis rule. (9) Separate...
MODELS-3 INSTALLATION PROCEDURES FOR A PC WITH AN NT OPERATING SYSTEM (MODELS-3 VERSION 4.0)
Models-3 is a flexible software system designed to simplify the development and use of air quality models and other environmental decision support tools. It is designed for applications ranging from regulatory and policy analysis to understanding the complex interactions of at...
Curriculum Guide for Art in the Secondary Schools.
ERIC Educational Resources Information Center
Chicago Board of Education, IL.
This secondary school curriculum guide is written in outline form to simplify the planning of a design-oriented art program. For each of 15 design units, a step-by-step set of instructions is given. Each unit is presented in three stages, each of which is a complete lesson in design. Materials and tools necessary for lesson preparation, motivation…
Automated Sensitivity Analysis of Interplanetary Trajectories for Optimal Mission Design
NASA Technical Reports Server (NTRS)
Knittel, Jeremy; Hughes, Kyle; Englander, Jacob; Sarli, Bruno
2017-01-01
This work describes a suite of Python tools known as the Python EMTG Automated Trade Study Application (PEATSA). PEATSA was written to automate the operation of trajectory optimization software, simplify the process of performing sensitivity analysis, and was ultimately found to out-perform a human trajectory designer in unexpected ways. These benefits will be discussed and demonstrated on sample mission designs.
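A hedged sketch of the kind of automated trade study PEATSA performs is given below: sweep mission parameters, re-run a trajectory optimization for each case, and rank the results. The function run_emtg_case is a hypothetical stand-in for driving the real EMTG executable, not PEATSA's actual API.

```python
# Hedged sketch of an automated trajectory trade study in the PEATSA spirit.
# `run_emtg_case` is a hypothetical stub, not PEATSA's or EMTG's real API.

import itertools

def run_emtg_case(launch_year, flyby_body):
    """Hypothetical optimizer stub returning delivered mass [kg].
    In PEATSA this would write an EMTG input file, launch the optimizer,
    and parse the resulting trajectory; here we fake a smooth response."""
    return 1000.0 - 5.0 * abs(launch_year - 2031) + (50.0 if flyby_body == "Venus" else 0.0)

def trade_study():
    cases = itertools.product(range(2028, 2035), ["Venus", "Earth", "none"])
    results = [(run_emtg_case(y, b), y, b) for y, b in cases]
    results.sort(reverse=True)  # best delivered mass first
    return results

if __name__ == "__main__":
    best_mass, year, body = trade_study()[0]
    print(f"best case: launch {year}, flyby {body}, delivered mass {best_mass:.0f} kg")
```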
Kimmel, Lara A; Holland, Anne E; Simpson, Pam M; Edwards, Elton R; Gabbe, Belinda J
2014-07-01
Early, accurate prediction of discharge destination from the acute hospital assists individual patients and the wider hospital system. The Trauma Rehabilitation and Prediction Tool (TRaPT), developed using registry data, determines the probability of inpatient rehabilitation discharge for patients with isolated lower limb fractures. The aims of this study were: (1) to prospectively validate the TRaPT, (2) to assess whether its performance could be improved by adding additional demographic data, and (3) to simplify it for use as a bedside tool. This was a cohort, measurement-focused study. Patients with isolated lower limb fractures (N=114) who were admitted to a major trauma center in Melbourne, Australia, were included. The participants' TRaPT scores were calculated from admission data. Performance of the TRaPT score alone, and in combination with frailty, weight-bearing status, and home supports, was assessed using measures of discrimination and calibration. A simplified TRaPT was developed by rounding the coefficients of variables in the original model and grouping age into 8 categories. Simplified TRaPT performance measures, including specificity, sensitivity, and positive and negative predictive values, were evaluated. Prospective validation of the TRaPT showed excellent discrimination (C-statistic=0.90 [95% confidence interval=0.82, 0.97]), a sensitivity of 80%, and a specificity of 94%. All participants able to weight bear were discharged directly home. Simplified TRaPT scores had a sensitivity of 80% and a specificity of 88%. Generalizability may be limited given the compensation system that exists in Australia, but the methods used will assist in designing a similar tool in any population. The TRaPT accurately predicted discharge destination for 80% of patients and may form a useful aid for discharge decision making, with the simplified version facilitating its use as a bedside tool.
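The sketch below illustrates, in Python, how a regression model can be reduced to a bedside score in the way the abstract describes: coefficients are rounded to integer points and age is binned into categories. The cut-points and coefficients are hypothetical, not the published TRaPT values.

```python
# Hedged sketch of simplifying a regression model into a bedside score:
# round coefficients to integer points and bin age into categories.
# All cut-points and weights below are hypothetical, not TRaPT's.

import math

AGE_POINTS = [(30, 0), (40, 1), (50, 2), (60, 3), (70, 4), (80, 5), (90, 6), (200, 7)]

def age_category_points(age):
    for upper, pts in AGE_POINTS:
        if age < upper:
            return pts
    return AGE_POINTS[-1][1]

def simplified_score(age, compensable, fracture_severity):
    """Integer score from rounded (hypothetical) model coefficients."""
    return age_category_points(age) + 2 * int(compensable) + int(fracture_severity)

def rehab_probability(score, intercept=-4.0, slope=0.8):
    """Map the score back to a probability with a logistic link (illustrative)."""
    return 1.0 / (1.0 + math.exp(-(intercept + slope * score)))

if __name__ == "__main__":
    s = simplified_score(age=67, compensable=True, fracture_severity=1)
    print(f"score={s}, P(inpatient rehab) ~ {rehab_probability(s):.2f}")
```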
Automated Sensitivity Analysis of Interplanetary Trajectories
NASA Technical Reports Server (NTRS)
Knittel, Jeremy; Hughes, Kyle; Englander, Jacob; Sarli, Bruno
2017-01-01
This work describes a suite of Python tools known as the Python EMTG Automated Trade Study Application (PEATSA). PEATSA was written to automate the operation of trajectory optimization software, simplify the process of performing sensitivity analysis, and was ultimately found to out-perform a human trajectory designer in unexpected ways. These benefits will be discussed and demonstrated on sample mission designs.
AUTOMATIC CALIBRATION OF A STOCHASTIC-LAGRANGIAN TRANSPORT MODEL (SLAM)
Numerical models are a useful tool in evaluating and designing NAPL remediation systems. Traditional constitutive finite difference and finite element models are complex and expensive to apply. For this reason, this paper presents the application of a simplified stochastic-Lagran...
NASA Astrophysics Data System (ADS)
Misceo, Monica; Buonamici, Roberto; Buttol, Patrizia; Naldesi, Luciano; Grimaldi, Filomena; Rinaldi, Caterina
2004-12-01
TESPI (Tool for Environmentally Sound Product Innovation) is the prototype of a software tool developed within the framework of the "eLCA" project. The project (www.elca.enea.it), financed by the European Commission, is realising "On line green tools and services for Small and Medium sized Enterprises (SMEs)". The implementation by SMEs of environmental product innovation (as fostered by the European Integrated Product Policy, IPP) needs specific adaptation to their economic model, their knowledge of production and management processes, and their relationships with innovation and the environment. In particular, quality and costs are the main driving forces of innovation in European SMEs, and well-known barriers exist to the adoption of an environmental approach in product design. Starting from these considerations, the TESPI tool has been developed to support the first steps of product design, taking into account both quality and the environment. Two main issues have been considered: (i) classic Quality Function Deployment (QFD) can hardly be proposed to SMEs; (ii) the environmental aspects of the product life cycle need to be integrated with the quality approach. TESPI is a user-friendly web-based tool, has a training approach, and applies to modular products. Users are guided through the investigation of the quality aspects of their product (fulfilment of customer's needs and requirements) and the identification of the key environmental aspects in the product's life cycle. A simplified checklist allows the environmental performance of the product to be analyzed. Help is available for a better understanding of the analysis criteria. As a result, the significant aspects for the redesign of the product are identified.
Toward a mathematical formalism of performance, task difficulty, and activation
NASA Technical Reports Server (NTRS)
Samaras, George M.
1988-01-01
The rudiments of a mathematical formalism for handling operational, physiological, and psychological concepts are developed for use by the man-machine system design engineer. The formalism provides a framework for developing a structured, systematic approach to the interface design problem, using existing mathematical tools, and simplifying the problem of telling a machine how to measure and use performance.
IGA: A Simplified Introduction and Implementation Details for Finite Element Users
NASA Astrophysics Data System (ADS)
Agrawal, Vishal; Gautam, Sachin S.
2018-05-01
Isogeometric analysis (IGA) is a recently introduced technique that employs the Computer Aided Design (CAD) concept of Non-uniform Rational B-splines (NURBS) to bridge the substantial bottleneck between the CAD and finite element analysis (FEA) fields. The simplified transition of exact CAD models into the analysis alleviates the issues originating from geometrical discontinuities and thus significantly reduces the design-to-analysis time in comparison to the traditional FEA technique. Since its origination, research in the field of IGA has accelerated, and the method has been applied to various problems. However, the employment of CAD tools in the area of FEA requires adapting the existing implementation procedure to the framework of IGA. Also, the usage of IGA requires in-depth knowledge of both the CAD and FEA fields, which can be overwhelming for a beginner in IGA. Hence, in this paper, a simplified introduction and implementation details for the incorporation of the NURBS-based IGA technique within an existing FEA code are presented. It is shown that with little modification, the available standard code structure of FEA can be adapted for IGA. For a clear and concise explanation of these modifications, a step-by-step implementation of a benchmark plate with a circular hole under the action of in-plane tension is included.
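As a taste of the CAD side that IGA borrows, the sketch below evaluates B-spline basis functions with the standard Cox-de Boor recursion, the building block of NURBS. It is a didactic fragment, not the paper's implementation; rational weights and finite element assembly are omitted.

```python
# Cox-de Boor evaluation of B-spline basis functions (the non-rational core of
# NURBS). Didactic sketch only; rational weights and FE assembly are omitted.

def bspline_basis(i, p, u, knots):
    """Value of the i-th degree-p B-spline basis function at parameter u.
    Uses half-open support [knots[i], knots[i+1]); u at the final knot
    would need special casing."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    den1 = knots[i + p] - knots[i]
    if den1 > 0.0:
        left = (u - knots[i]) / den1 * bspline_basis(i, p - 1, u, knots)
    den2 = knots[i + p + 1] - knots[i + 1]
    if den2 > 0.0:
        right = (knots[i + p + 1] - u) / den2 * bspline_basis(i + 1, p - 1, u, knots)
    return left + right

if __name__ == "__main__":
    # Open knot vector for a quadratic curve with 4 control points.
    kv = [0.0, 0.0, 0.0, 0.5, 1.0, 1.0, 1.0]
    vals = [bspline_basis(i, 2, 0.25, kv) for i in range(4)]
    print(vals, "sum =", sum(vals))  # partition of unity: sum == 1
```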
Development of a Software Tool to Automate ADCO Flight Controller Console Planning Tasks
NASA Technical Reports Server (NTRS)
Anderson, Mark G.
2011-01-01
This independent study project covers the development of the International Space Station (ISS) Attitude Determination and Control Officer (ADCO) Planning Exchange APEX Tool. The primary goal of the tool is to streamline existing manual and time-intensive planning tools into a more automated, user-friendly application that interfaces with existing products and allows the ADCO to produce accurate products and timelines more effectively. This paper will survey the current ISS attitude planning process and its associated requirements, goals, documentation and software tools and how a software tool could simplify and automate many of the planning actions which occur at the ADCO console. The project will be covered from inception through the initial prototype delivery in November 2011 and will include development of design requirements and software as well as design verification and testing.
NASA Technical Reports Server (NTRS)
Melton, John E.
1994-01-01
EGADS is a comprehensive preliminary design tool for estimating the performance of light, single-engine general aviation aircraft. The software runs on the Apple Macintosh series of personal computers and assists amateur designers and aeronautical engineering students in performing the many repetitive calculations required in the aircraft design process. The program makes full use of the mouse and standard Macintosh interface techniques to simplify the input of various design parameters. Extensive graphics, plotting, and text output capabilities are also included.
The Achiever. Volume 6, Number 5
ERIC Educational Resources Information Center
Ashby, Nicole, Ed.
2007-01-01
"The Achiever" is a monthly newsletter designed expressly for parents and community leaders. Each issue contains news and information about and from public and private organizations about school improvement in the United States. Highlights of this issue include: (1) New Online Tool Simplifies Financial Aid Process; (2) Rigor in K-6:…
Uas for Archaeology - New Perspectives on Aerial Documentation
NASA Astrophysics Data System (ADS)
Fallavollita, P.; Balsi, M.; Esposito, S.; Melis, M. G.; Milanese, M.; Zappino, L.
2013-08-01
In this work some Unmanned Aerial Systems applications are discussed and applied to archaeological site survey and 3D model reconstruction. Interesting results are shown for three important sites of different ages in northern Sardinia (Italy). An easy, simplified procedure is proposed that permits the adoption of multi-rotor aircraft for daily archaeological survey during excavation and documentation, drawing on the state of the art in UAS design, flight control systems, high-definition sensor cameras and innovative photogrammetric software tools. Very high quality 3D model results are shown and discussed, along with how they have simplified the archaeologists' work and decisions.
Computer Aided Self-Forging Fragment Design,
1978-06-01
This value is reached so quickly that HEMP solutions using work hardening and those using only elastic-perfectly plastic formulations are quite … Elastic-Plastic Flow, UCRL-7322, Lawrence Radiation Laboratory, Livermore, California (1969). 4. Giroux, E. D., HEMP Users Manual, UCRL-51079 … Laboratory, the HEMP computer code has been developed to serve as an effective design tool to simplify this task considerably. Using this code, warheads …
A knowledge based search tool for performance measures in health care systems.
Beyan, Oya D; Baykal, Nazife
2012-02-01
Performance measurement is vital for improving health care systems. However, we are still far from having accepted performance measurement models. Researchers and developers are seeking comparable performance indicators. We developed an intelligent search tool to identify appropriate measures for specific requirements by matching diverse care settings. We reviewed the literature and analyzed 229 performance measurement studies published after 2000. These studies were evaluated with an original theoretical framework and stored in a database. A semantic network was designed for representing domain knowledge and supporting reasoning. We applied knowledge-based decision support techniques to cope with uncertainty problems. As a result, we designed a tool that simplifies the performance indicator search process and provides the most relevant indicators by employing knowledge-based systems.
Architectural evaluation of dynamic and partial reconfigurable systems designed with DREAMS tool
NASA Astrophysics Data System (ADS)
Otero, Andrés.; Gallego, Ángel; de la Torre, Eduardo; Riesgo, Teresa
2013-05-01
Benefits of dynamic and partial reconfigurable systems are increasingly accepted by industry. For this reason, SRAM-based FPGA manufacturers have improved, or even included for the first time, the support they offer for the design of this kind of system. However, commercial tools still offer poor flexibility, which leads to limited efficiency. This is witnessed by the overhead introduced by the communication primitives, as well as by the inability to relocate reconfigurable modules, among other limitations. For this reason, the authors have proposed an academic design tool called DREAMS, which targets the design of dynamically reconfigurable systems. In this paper, the main features offered by DREAMS are described and compared with those of existing commercial and academic tools. Moreover, a graphical user interface (GUI) is originally described in this work, with the aim of simplifying the design process and hiding the low-level, device-dependent details from the system designer. The overall goal is to increase designer productivity. Using the graphical interface, different reconfigurable architectures are provided as design examples. Among them, both conventional slot-based architectures and mesh-type designs are included.
Shuttle's 160 hour ground turnaround - A design driver
NASA Technical Reports Server (NTRS)
Widick, F.
1977-01-01
Turnaround analysis added a new dimension to the Space Program with the advent of the Space Shuttle. The requirement to turn the flight hardware around in 160 working hours from landing to launch was a significant design driver and a useful tool in forcing the integration of flight and ground systems design to permit an efficient ground operation. Although there was concern that time constraints might increase program costs, the result of the analysis was to minimize facility requirements and simplify operations with resultant cost savings.
Oak Ridge Spallation Neutron Source (ORSNS) target station design integration
DOE Office of Scientific and Technical Information (OSTI.GOV)
McManamy, T.; Booth, R.; Cleaves, J.
1996-06-01
The conceptual design for a 1- to 3-MW short pulse spallation source with a liquid mercury target has been started recently. The design tools and methods being developed to define requirements, integrate the work, and provide early cost guidance will be presented with a summary of the current target station design status. The initial design point was selected with performance and cost estimate projections by a systems code. This code was developed recently using cost estimates from the Brookhaven Pulsed Spallation Neutron Source study and experience from the Advanced Neutron Source Project's conceptual design. It will be updated and improved as the design develops. Performance was characterized by a simplified figure of merit based on a ratio of neutron production to costs. A work breakdown structure was developed, with simplified systems diagrams used to define interfaces and system responsibilities. A risk assessment method was used to identify potential problems, to identify required research and development (R&D), and to aid contingency development. Preliminary 3-D models of the target station are being used to develop remote maintenance concepts and to estimate costs.
SCOUT: A Fast Monte-Carlo Modeling Tool of Scintillation Camera Output
Hunter, William C. J.; Barrett, Harrison H.; Lewellen, Thomas K.; Miyaoka, Robert S.; Muzi, John P.; Li, Xiaoli; McDougald, Wendy; MacDonald, Lawrence R.
2011-01-01
We have developed a Monte-Carlo photon-tracking and readout simulator called SCOUT to study the stochastic behavior of signals output from a simplified rectangular scintillation-camera design. SCOUT models the salient processes affecting signal generation, transport, and readout. Presently, we compare output signal statistics from SCOUT to experimental results for both a discrete and a monolithic camera. We also benchmark the speed of this simulation tool and compare it to existing simulation tools. We find this modeling tool to be relatively fast and predictive of experimental results. Depending on the modeled camera geometry, we found SCOUT to be 4 to 140 times faster than other modeling tools. PMID:22072297
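A grossly simplified Monte-Carlo photon-counting sketch in the spirit of SCOUT is shown below: photons emitted isotropically at a scintillation event travel in straight lines to a pixelated readout plane, and per-pixel counts form the output signal. SCOUT itself models far more physics (optical scattering, reflections, sensor response); every parameter here is an assumption.

```python
# Grossly simplified Monte-Carlo photon-counting sketch in the spirit of
# SCOUT: straight-line transport only, no reflections or sensor response.
# All geometry and photon-budget parameters are assumed.

import math
import random

def simulate_event(x0=0.0, depth_mm=5.0, n_photons=2000,
                   pixel_pitch_mm=3.0, n_pixels=8, seed=1):
    rng = random.Random(seed)
    counts = [0] * n_pixels
    for _ in range(n_photons):
        # Isotropic emission; keep only photons headed toward the readout.
        cos_t = rng.uniform(-1.0, 1.0)
        if cos_t <= 0.0:
            continue
        phi = rng.uniform(0.0, 2.0 * math.pi)
        # Lateral displacement of the straight ray at the readout plane.
        r = depth_mm * math.sqrt(1.0 - cos_t**2) / cos_t
        x = x0 + r * math.cos(phi)
        pix = math.floor(x / pixel_pitch_mm + n_pixels / 2)
        if 0 <= pix < n_pixels:
            counts[pix] += 1
    return counts

if __name__ == "__main__":
    print(simulate_event())  # per-pixel light spread for one event
```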
SCOUT: a fast Monte-Carlo modeling tool of scintillation camera output†
Hunter, William C J; Barrett, Harrison H.; Muzi, John P.; McDougald, Wendy; MacDonald, Lawrence R.; Miyaoka, Robert S.; Lewellen, Thomas K.
2013-01-01
We have developed a Monte-Carlo photon-tracking and readout simulator called SCOUT to study the stochastic behavior of signals output from a simplified rectangular scintillation-camera design. SCOUT models the salient processes affecting signal generation, transport, and readout of a scintillation camera. Presently, we compare output signal statistics from SCOUT to experimental results for both a discrete and a monolithic camera. We also benchmark the speed of this simulation tool and compare it to existing simulation tools. We find this modeling tool to be relatively fast and predictive of experimental results. Depending on the modeled camera geometry, we found SCOUT to be 4 to 140 times faster than other modeling tools. PMID:23640136
DOE Office of Scientific and Technical Information (OSTI.GOV)
Damiani, Rick
This manual summarizes the theory and preliminary verifications of the JacketSE module, which is an offshore jacket sizing tool that is part of the Wind-Plant Integrated System Design & Engineering Model toolbox. JacketSE is based on a finite-element formulation and on user-prescribed inputs and design standards' criteria (constraints). The physics are highly simplified, with a primary focus on satisfying ultimate limit states and modal performance requirements. Preliminary validation work included comparing industry data and verification against ANSYS, a commercial finite-element analysis package. The results are encouraging, and future improvements to the code are recommended in this manual.
Failure mode and effects analysis: a comparison of two common risk prioritisation methods.
McElroy, Lisa M; Khorzad, Rebeca; Nannicelli, Anna P; Brown, Alexandra R; Ladner, Daniela P; Holl, Jane L
2016-05-01
Failure mode and effects analysis (FMEA) is a method of risk assessment increasingly used in healthcare over the past decade. The traditional method, however, can require substantial time and training resources. The goal of this study is to compare a simplified scoring method with the traditional scoring method to determine the degree of congruence in identifying high-risk failures. An FMEA of the operating room (OR) to intensive care unit (ICU) handoff was conducted. Failures were scored and ranked using both the traditional risk priority number (RPN) and criticality-based method, and a simplified method, which designates failures as 'high', 'medium' or 'low' risk. The degree of congruence was determined by first identifying those failures determined to be critical by the traditional method (RPN≥300), and then calculating the per cent congruence with those failures designated critical by the simplified method (high risk). In total, 79 process failures among 37 individual steps in the OR to ICU handoff process were identified. The traditional method yielded Criticality Indices (CIs) ranging from 18 to 72 and RPNs ranging from 80 to 504. The simplified method ranked 11 failures as 'low' risk, 30 as 'medium' risk and 22 as 'high' risk. The traditional method yielded 24 failures with an RPN≥300, of which 22 were identified as high risk by the simplified method (92% agreement). The top 20% of CIs (≥60) included 12 failures, of which six were designated as high risk by the simplified method (50% agreement). These results suggest that the simplified method of scoring and ranking failures identified by an FMEA can be a useful tool for healthcare organisations with limited access to FMEA expertise. However, the simplified method does not result in the same degree of discrimination in the ranking of failures offered by the traditional method.
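The arithmetic being compared is easy to make concrete. The sketch below computes the traditional RPN (severity x occurrence x detectability, critical at RPN ≥ 300, as in the study) alongside a simplified high/medium/low rating, and reports the per cent congruence; the example failure data are invented.

```python
# Traditional RPN scoring (critical when RPN >= 300, as in the study) versus a
# simplified high/medium/low rating; congruence is the share of traditional
# critical failures also rated high risk. Example data are invented.

def rpn(severity, occurrence, detectability):
    return severity * occurrence * detectability

def congruence(failures):
    """failures: list of (severity, occurrence, detectability, simple_rating)."""
    critical = [f for f in failures if rpn(*f[:3]) >= 300]
    if not critical:
        return None
    agree = sum(1 for f in critical if f[3] == "high")
    return 100.0 * agree / len(critical)

if __name__ == "__main__":
    data = [
        (9, 7, 8, "high"),    # RPN 504 -> critical, simplified agrees
        (8, 5, 8, "high"),    # RPN 320 -> critical, simplified agrees
        (7, 6, 8, "medium"),  # RPN 336 -> critical, simplified disagrees
        (4, 4, 5, "low"),     # RPN 80  -> not critical
    ]
    print(f"congruence = {congruence(data):.0f}%")  # 2 of 3 -> 67%
```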
Tools and Techniques for Simplifying the Analysis of Captured Packet Data
ERIC Educational Resources Information Center
Cavaiani, Thomas P.
2008-01-01
Students acquire an understanding of the differences between TCP and UDP (connection-oriented vs. connection-less) data transfers as they analyze network packet data collected during one of a series of labs designed for an introductory network essentials course taught at Boise State University. The learning emphasis of the lab is not on the…
Development and Implementation of a Simplified Tool Measuring System
NASA Astrophysics Data System (ADS)
Chen, Jenn-Yih; Lee, Bean-Yin; Lee, Kuang-Chyi; Chen, Zhao-Kai
2010-01-01
This paper presents a simplified system for measuring geometric profiles of end mills. Firstly, a CCD camera was used to capture images of cutting tools. Then, an image acquisition card with an encoding function was adopted to convert the image source to a USB port of a PC, so the image could be shown on a monitor. In addition, two linear scales were mounted on the X-Y table for positioning and measuring purposes. The signals of the linear scales were transmitted into a 4-axis quadrature encoder with a 4-channel counter card for position monitoring. C++ Builder was utilized to design the user-friendly human-machine interface of the tool measuring system. A cross line on the image of the interface provides a coordinate for the position measurement. Finally, a well-known tool measuring and inspection machine was employed as the measuring standard. This study compares the measuring results obtained with that machine and with the proposed system. Experimental results show that the percentage measuring error is acceptable for some geometric parameters of square or ball nose end mills. Therefore, the results demonstrate the effectiveness of the presented approach.
Assessment of the GECKO-A Modeling Tool and Simplified 3D Model Parameterizations for SOA Formation
NASA Astrophysics Data System (ADS)
Aumont, B.; Hodzic, A.; La, S.; Camredon, M.; Lannuque, V.; Lee-Taylor, J. M.; Madronich, S.
2014-12-01
Explicit chemical mechanisms aim to embody the current knowledge of the transformations occurring in the atmosphere during the oxidation of organic matter. These explicit mechanisms are therefore useful tools to explore the fate of organic matter during its tropospheric oxidation and to examine how these chemical processes shape the composition and properties of the gaseous and condensed phases. Furthermore, explicit mechanisms provide powerful benchmarks to design and assess simplified parameterizations to be included in 3D models. Nevertheless, the explicit mechanism describing the oxidation of hydrocarbons with backbones larger than a few carbon atoms involves millions of secondary organic compounds, far exceeding the size of chemical mechanisms that can be written manually. Data processing tools can, however, be designed to overcome these difficulties and automatically generate consistent and comprehensive chemical mechanisms on a systematic basis. The Generator for Explicit Chemistry and Kinetics of Organics in the Atmosphere (GECKO-A) has been developed for the automatic writing of explicit chemical schemes of organic species and their partitioning between the gas and condensed phases. GECKO-A can be viewed as an expert system that mimics the steps by which chemists might develop chemical schemes. GECKO-A generates chemical schemes according to a prescribed protocol assigning reaction pathways and kinetics data on the basis of experimental data and structure-activity relationships. In its current version, GECKO-A can generate the full atmospheric oxidation scheme for most linear, branched and cyclic precursors, including alkanes and alkenes up to C25. Assessments of the GECKO-A modeling tool based on chamber SOA observations will be presented. GECKO-A was recently used to design a parameterization for SOA formation based on a Volatility Basis Set (VBS) approach. First results will be presented.
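To make the VBS idea concrete, the sketch below implements the standard volatility-basis-set partitioning: each bin with saturation concentration C* partitions to the particle phase with fraction 1/(1 + C*/C_OA), and the organic aerosol mass C_OA is found by fixed-point iteration. The bin masses are illustrative, not GECKO-A output.

```python
# Standard Volatility Basis Set (VBS) partitioning sketch: particle-phase
# fraction per bin is 1/(1 + C*/C_OA); C_OA found by fixed-point iteration.
# Bin totals are illustrative assumptions, not GECKO-A-derived values.

C_STAR = [0.01, 0.1, 1.0, 10.0, 100.0, 1000.0]   # ug/m3, volatility bins
TOTALS = [0.2, 0.3, 0.6, 1.0, 2.0, 4.0]          # ug/m3, total mass per bin

def solve_coa(seed=0.1, background=0.0, iters=200):
    coa = seed
    for _ in range(iters):
        coa = background + sum(
            tot / (1.0 + cstar / coa) for cstar, tot in zip(C_STAR, TOTALS))
    return coa

if __name__ == "__main__":
    coa = solve_coa()
    fracs = [1.0 / (1.0 + c / coa) for c in C_STAR]
    print(f"C_OA = {coa:.2f} ug/m3; particle fraction per bin:",
          ["%.2f" % f for f in fracs])
```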
Cognitive ergonomics of operational tools
NASA Astrophysics Data System (ADS)
Lüdeke, A.
2012-10-01
Control systems have become increasingly more powerful over the past decades. The availability of high data throughput and sophisticated graphical interactions has opened a variety of new possibilities. But has this helped to provide intuitive, easy to use applications to simplify the operation of modern large scale accelerator facilities? We will discuss what makes an application useful to operation and what is necessary to make a tool easy to use. We will show that even the implementation of a small number of simple application design rules can help to create ergonomic operational tools. The author is convinced that such tools do indeed help to achieve higher beam availability and better beam performance at accelerator facilities.
Computational tools for multi-linked flexible structures
NASA Technical Reports Server (NTRS)
Lee, Gordon K. F.; Brubaker, Thomas A.; Shults, James R.
1990-01-01
A software module which designs and tests controllers and filters in Kalman estimator form, based on a polynomial state-space model, is discussed. The user-friendly program employs an interactive graphics approach to simplify the design process. A variety of input methods are provided to test the effectiveness of the estimator. Utilities are provided that address important issues in filter design such as graphical analysis, statistical analysis, and calculation time. The program also provides the user with the ability to save filter parameters, inputs, and outputs for future use.
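A minimal, generic discrete-time Kalman estimator of the kind such a module designs is sketched below for a state-space model x(k+1) = A x(k) + w, y = C x + v. This is textbook material, not the grant's software; the matrices are assumed.

```python
# Generic textbook discrete-time Kalman filter (predict/update). Not the
# grant's software; the model matrices and measurements are assumed.

import numpy as np

def kalman_step(x, P, y, A, C, Q, R):
    """One predict/update cycle; returns updated state estimate and covariance."""
    # Predict
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # Update
    S = C @ P_pred @ C.T + R              # innovation covariance
    K = P_pred @ C.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ (y - C @ x_pred)
    P_new = (np.eye(len(x)) - K @ C) @ P_pred
    return x_new, P_new

if __name__ == "__main__":
    A = np.array([[1.0, 0.1], [0.0, 1.0]])   # constant-velocity model
    C = np.array([[1.0, 0.0]])               # position measured only
    Q, R = 1e-4 * np.eye(2), np.array([[0.01]])
    x, P = np.zeros(2), np.eye(2)
    for y in [0.11, 0.22, 0.28, 0.41]:       # noisy position measurements
        x, P = kalman_step(x, P, np.array([y]), A, C, Q, R)
    print("estimated position/velocity:", x)
```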
The design and construction of a cost-efficient confocal laser scanning microscope
NASA Astrophysics Data System (ADS)
Xi, Peng; Rajwa, Bartlomiej; Jones, James T.; Robinson, J. Paul
2007-03-01
The optical sectioning ability of confocal microscopy makes it a powerful tool for studying biological materials. However, the cost and complexity of confocal scanning laser microscopy hinder its wide application in education. We describe the construction of a simplified confocal scanning laser microscope and demonstrate three-dimensional projection based on cost-efficient commercial hardware, together with available open source software.
Jonnalagadda, Siddhartha; Gonzalez, Graciela
2010-11-13
BioSimplify is an open source tool written in Java that introduces and facilitates the use of a novel model for sentence simplification tuned for automatic discourse analysis and information extraction (as opposed to sentence simplification for improving human readability). The model is based on a "shot-gun" approach that produces many different (simpler) versions of the original sentence by combining variants of its constituent elements. This tool is optimized for processing biomedical scientific literature such as the abstracts indexed in PubMed. We tested our tool on its impact on the task of protein-protein interaction (PPI) extraction, and it improved the f-score of the PPI tool by around 7%, with an improvement in recall of around 20%. The BioSimplify tool and test corpus can be downloaded from https://biosimplify.sourceforge.net.
Geometric modeling for computer aided design
NASA Technical Reports Server (NTRS)
Schwing, James L.; Olariu, Stephen
1995-01-01
The primary goal of this grant has been the design and implementation of software to be used in the conceptual design of aerospace vehicles, particularly focused on the elements of geometric design, graphical user interfaces, and the interaction of the multitude of software typically used in this engineering environment. This has resulted in the development of several analysis packages and design studies. These include two major software systems currently used in the conceptual-level design of aerospace vehicles. These tools are SMART, the Solid Modeling Aerospace Research Tool, and EASIE, the Environment for Software Integration and Execution. Additional software tools were designed and implemented to address the needs of the engineer working in the conceptual design environment. SMART provides conceptual designers with a rapid prototyping capability and several engineering analysis capabilities. In addition, SMART has a carefully engineered user interface that makes it easy to learn and use. Finally, a number of specialty characteristics have been built into SMART which allow it to be used efficiently as a front end geometry processor for other analysis packages. EASIE provides a set of interactive utilities that simplify the task of building and executing computer aided design systems consisting of diverse, stand-alone analysis codes, resulting in a streamlined exchange of data between programs, reduced errors, and improved efficiency. EASIE provides both a methodology and a collection of software tools to ease the task of coordinating engineering design and analysis codes.
A simplified method of evaluating the stress wave environment of internal equipment
NASA Technical Reports Server (NTRS)
Colton, J. D.; Desmond, T. P.
1979-01-01
A simplified method called the transfer function technique (TFT) was devised for evaluating the stress wave environment in a structure containing internal equipment. The TFT consists of following the initial in-plane stress wave that propagates through a structure subjected to a dynamic load and characterizing how the wave is altered as it is transmitted through intersections of structural members. As a basis for evaluating the TFT, impact experiments and detailed stress wave analyses were performed for structures with two, three, or more members. Transfer functions that relate the wave transmitted through an intersection to the incident wave were deduced from the predicted wave response. By sequentially applying these transfer functions to a structure with several intersections, it was found that the environment produced by the initial stress wave propagating through the structure can be approximated well. The TFT can be used as a design tool or as an analytical tool to determine whether a more detailed wave analysis is warranted.
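The sequential application the abstract describes amounts to multiplying the incident amplitude by one transfer function per intersection along the load path, as in the Python sketch below. The joint coefficients are invented placeholders, not the paper's measured transfer functions.

```python
# Hedged sketch of the transfer function technique (TFT) idea: each joint
# scales the transmitted stress amplitude by its transfer function. The
# coefficients below are invented placeholders, not the paper's values.

JOINT_TRANSFER = {"tee": 0.55, "cross": 0.40, "elbow": 0.70}  # assumed

def transmitted_stress(incident_stress, path):
    """Apply joint transfer functions sequentially along a load path."""
    stress = incident_stress
    for joint in path:
        stress *= JOINT_TRANSFER[joint]
    return stress

if __name__ == "__main__":
    # Wave passes an elbow, then a tee, before reaching the equipment mount.
    print(f"{transmitted_stress(100.0, ['elbow', 'tee']):.1f} MPa at equipment")
```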
2010-12-01
[Figure list residue: "Simulation of Free-Field Blast"; "(a) Peak Incident Pressure and (b) …"] …several types of problems involving blast propagation. Mastin et al. (1995) compared CTH simulations to free-field incident pressure as predicted by … a measure of accuracy and efficiency. To provide this direct comparison, a series of 2D-axisymmetric free-field air blast simulations were …
Study of a direct visualization display tool for space applications
NASA Astrophysics Data System (ADS)
Pereira do Carmo, J.; Gordo, P. R.; Martins, M.; Rodrigues, F.; Teodoro, P.
2017-11-01
The study of a Direct Visualization Display Tool (DVDT) for space applications is reported. The review of novel technologies for a compact display tool is described. Several applications for this tool have been identified with the support of ESA astronauts and are presented. A baseline design is proposed. It consists mainly of OLEDs as the image source; a specially designed optical prism as relay optics; a Personal Digital Assistant (PDA), with data acquisition card, as the control unit; and voice control and a simplified keyboard as interfaces. Optical analysis and the final estimated performance are reported. The system is able to display information (text, pictures and/or video) with SVGA resolution directly to the astronaut using a Field of View (FOV) of 20x14.5 degrees. The image delivery system is a monocular Head Mounted Display (HMD) that weighs less than 100 g. The HMD optical system has an eye pupil of 7 mm and an eye relief distance of 30 mm.
Techniques for designing rotorcraft control systems
NASA Technical Reports Server (NTRS)
Levine, William S.; Barlow, Jewel
1993-01-01
This report summarizes the work that was done on the project from 1 Apr. 1992 to 31 Mar. 1993. The main goal of this research is to develop a practical tool for rotorcraft control system design based on interactive optimization tools (CONSOL-OPTCAD) and classical rotorcraft design considerations (ADOCS). This approach enables the designer to combine engineering intuition and experience with parametric optimization. The combination should make it possible to produce a better design faster than would be possible using either pure optimization or pure intuition and experience. We emphasize that the goal of this project is not to develop an algorithm. It is to develop a tool. We want to keep the human designer in the design process to take advantage of his or her experience and creativity. The role of the computer is to perform the calculation necessary to improve and to display the performance of the nominal design. Briefly, during the first year we have connected CONSOL-OPTCAD, an existing software package for optimizing parameters with respect to multiple performance criteria, to a simplified nonlinear simulation of the UH-60 rotorcraft. We have also created mathematical approximations to the Mil-specs for rotorcraft handling qualities and input them into CONSOL-OPTCAD. Finally, we have developed the additional software necessary to use CONSOL-OPTCAD for the design of rotorcraft controllers.
ERIC Educational Resources Information Center
Dwyer, Christopher P.; Hogan, Michael J.; Stewart, Ian
2010-01-01
The current study compared the effects on comprehension and memory of learning via text versus learning via argument map. Argument mapping is a method of diagrammatic representation of arguments designed to simplify the reading of an argument structure and allow for easy assimilation of core propositions and relations. In the current study, 400…
ERIC Educational Resources Information Center
Ronan, Michael W.
This manual, the Spanish translation of a guide on accounting for microbusinesses, is designed as a tool for development workers to use in teaching the MICRON accounting system to persons in developing areas. (Developed by a Peace Corps volunteer in Colombia, MICRON is a simplified accounting system that is intended for use in small businesses.)…
Integrating Flight Dynamics & Control Analysis and Simulation in Rotorcraft Conceptual Design
NASA Technical Reports Server (NTRS)
Lawrence, Ben; Berger, Tom; Tischler, Mark B.; Theodore, Colin R; Elmore, Josh; Gallaher, Andrew; Tobias, Eric L.
2016-01-01
The development of a toolset, SIMPLI-FLYD ('SIMPLIfied FLight dynamics for conceptual Design'), is described. SIMPLI-FLYD is a collection of tools that perform flight dynamics and control modeling and analysis of rotorcraft conceptual designs, including a capability to evaluate the designs in an X-Plane-based real-time simulation. The establishment of this framework is now facilitating the exploration of this new capability, in terms of modeling fidelity and data requirements, and the investigation of which stability and control and handling qualities requirements are appropriate for conceptual design. Illustrative design variation studies for single main rotor and tiltrotor vehicle configurations show the sensitivity of the stability and control characteristics and demonstrate an approach to highlighting potential weight savings by identifying over-design.
The dynamic analysis of drum roll lathe for machining of rollers
NASA Astrophysics Data System (ADS)
Qiao, Zheng; Wu, Dongxu; Wang, Bo; Li, Guo; Wang, Huiming; Ding, Fei
2014-08-01
An ultra-precision machine tool for machining rollers has been designed and assembled. Because the dynamic characteristics of the machine tool have an obvious impact on the quality of the microstructures on the roller surface, this paper analyzes the dynamic characteristics of the existing machine tool, as well as the influence of fixing a large-scale, slender roller in the machine on those characteristics. First, a finite element model of the machine tool is built and simplified, and on that basis the paper carries out a finite element modal analysis and obtains the natural frequencies and mode shapes of the first four modes of the machine tool. According to the modal analysis results, the weak-stiffness subsystems of the machine tool can be further improved and a reasonable bandwidth for the machine tool's control system can be designed. Finally, considering the shock that frequent fast positioning of the Z axis imposes on the feeding system and cutting tool, a transient analysis is conducted in ANSYS. Based on the results of the transient analysis, the vibration behavior of key components of the machine tool and its impact on the cutting process are explored.
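The modal-analysis step mentioned above boils down to the generalized eigenproblem K*phi = omega^2*M*phi on the finite element matrices; the sketch below solves a drastically reduced 3-DOF stand-in with SciPy. The stiffness and mass values are illustrative, not the machine tool's.

```python
# Modal analysis reduces to the generalized eigenproblem K v = omega^2 M v.
# The 3-DOF stiffness/mass values below are illustrative assumptions, not the
# actual machine tool model.

import numpy as np
from scipy.linalg import eigh

K = np.array([[ 2.0e7, -1.0e7,  0.0   ],
              [-1.0e7,  2.0e7, -1.0e7 ],
              [ 0.0,   -1.0e7,  1.0e7 ]])   # N/m, fixed-free spring chain
M = np.diag([50.0, 50.0, 30.0])             # kg

eigvals, modes = eigh(K, M)                  # solves K v = lambda M v
freqs_hz = np.sqrt(eigvals) / (2.0 * np.pi)  # natural frequencies
print("natural frequencies [Hz]:", np.round(freqs_hz, 1))
```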
Oxygen Mass Transport in Stented Coronary Arteries.
Murphy, Eoin A; Dunne, Adrian S; Martin, David M; Boyle, Fergal J
2016-02-01
Oxygen deficiency, known as hypoxia, in arterial walls has been linked to increased intimal hyperplasia, which is the main adverse biological process causing in-stent restenosis. Stent implantation has significant effects on the oxygen transport into the arterial wall. Elucidating these effects is critical to optimizing future stent designs. In this study the most advanced oxygen transport model developed to date was assessed in two test cases and used to compare three coronary stent designs. Additionally, the predicted results from four simplified blood oxygen transport models are compared in the two test cases. The advanced model showed good agreement with experimental measurements within the mass-transfer boundary layer and at the luminal surface; however, more work is needed in predicting the oxygen transport within the arterial wall. Simplifying the oxygen transport model within the blood flow produces significant errors in predicting the oxygen transport in arteries. This study can be used as a guide for all future numerical studies in this area and the advanced model could provide a powerful tool in aiding design of stents and other cardiovascular devices.
An Ultrasonic Compactor for Oil and Gas Exploration
NASA Astrophysics Data System (ADS)
Feeney, Andrew; Sikaneta, Sakalima; Harkness, Patrick; Lucas, Margaret
The Badger Explorer is a rig-less oil and gas exploration tool which drills into the subsea environment to collect geological data. Drill spoil is transported from the front end of the system to the rear, where the material is compacted. Motivated by the need to develop a highly efficient compaction system, an ultrasonic compactor for application with granular geological materials encountered in subsea environments is designed and fabricated as part of this study. The finite element method is used to design a compactor configuration suitable for subsea exploration, consisting of a vibrating ultrasonic horn called a resonant compactor head, which operates in a longitudinal mode at 20 kHz, driven by a Langevin piezoelectric transducer. A simplified version of the compactor is also designed, due to its ease of incorporating in a lab-based experimental rig, in order to demonstrate enhanced compaction using ultrasonics. Numerical analysis of this simplified compactor system is supported with experimental characterisation using laser Doppler vibrometry. Compaction testing is then conducted on granular geological material, showing that compaction can be enhanced through the use of an ultrasonic compactor.
Development of the ICD-10 simplified version and field test.
Paoin, Wansa; Yuenyongsuwan, Maliwan; Yokobori, Yukiko; Endo, Hiroyoshi; Kim, Sukil
2018-05-01
The International Statistical Classification of Diseases and Related Health Problems, 10th Revision (ICD-10) has been used in various Asia-Pacific countries for more than 20 years. Although ICD-10 is a powerful tool, clinical coding processes are complex; therefore, many developing countries have not been able to implement ICD-10-based health statistics (WHO-FIC APN, 2007). This study aimed to simplify ICD-10 clinical coding processes, to modify index terms to facilitate computer searching, and to provide a simplified version of ICD-10 for use in developing countries. The World Health Organization Family of International Classifications Asia-Pacific Network (APN) developed a simplified version of ICD-10 and conducted field testing in Cambodia during February and March 2016. Ten hospitals were selected to participate. Each hospital sent a team to join a training workshop before using the ICD-10 simplified version to code 100 cases. All hospitals subsequently sent their coded records to the researchers. Overall, there were 1038 coded records with a total of 1099 ICD clinical codes assigned. The average accuracy rate was 80.71% (66.67-93.41%). Three types of clinical coding errors were found: errors attributable to the coder (14.56%), errors resulting from physician documentation (1.27%) and system errors (3.46%). The field trial results demonstrated that the APN ICD-10 simplified version is feasible to implement as an effective tool for ICD-10 clinical coding in hospitals. Developing countries may consider adopting the APN ICD-10 simplified version for ICD-10 code assignment in hospitals and health care centres. The simplified version can be viewed as an introductory tool leading to implementation of the full ICD-10, and it may support subsequent ICD-11 adoption.
A Module Language for Typing by Contracts
NASA Technical Reports Server (NTRS)
Glouche, Yann; Talpin, Jean-Pierre; LeGuernic, Paul; Gautier, Thierry
2009-01-01
Assume-guarantee reasoning is a popular and expressive paradigm for the modular and compositional specification of programs. It is becoming a fundamental concept in some computer-aided design tools for embedded systems. In this paper, we develop foundations for contract-based embedded system design by proposing a general-purpose module language, based on a Boolean algebra, in which contracts can be defined. In this framework, contracts are used to negotiate the correctness of the assumptions a component makes at the point where it is used and of the guarantees it provides to its environment. We illustrate this presentation with the specification of a simplified 4-stroke engine model.
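As a minimal sketch of the idea (a hypothetical Python stand-in; the paper defines its own module language, not this API), a contract can be represented as an assumption/guarantee pair of Boolean-valued predicates, with parallel composition taking conjunctions on both sides:

```python
# Sketch only: contracts as (assumption, guarantee) pairs of predicates.
from dataclasses import dataclass
from typing import Callable

State = dict  # hypothetical component state: variable name -> value

@dataclass
class Contract:
    assumption: Callable[[State], bool]   # what the component expects
    guarantee: Callable[[State], bool]    # what it promises in return

    def satisfied_by(self, state: State) -> bool:
        # A contract holds when the assumption implies the guarantee.
        return (not self.assumption(state)) or self.guarantee(state)

def compose(c1: Contract, c2: Contract) -> Contract:
    # Simplified parallel composition: conjoin assumptions and guarantees.
    return Contract(
        assumption=lambda s: c1.assumption(s) and c2.assumption(s),
        guarantee=lambda s: c1.guarantee(s) and c2.guarantee(s),
    )

# Example: a throttle component assumes rpm is in range and guarantees
# a bounded fuel command.
throttle = Contract(
    assumption=lambda s: 0 <= s["rpm"] <= 8000,
    guarantee=lambda s: 0.0 <= s["fuel"] <= 1.0,
)
print(throttle.satisfied_by({"rpm": 3000, "fuel": 0.4}))  # True
```

Refinement checks and the full Boolean algebra described in the paper would be built on top of primitives of this kind.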
Precision tool holder with flexure-adjustable, three degrees of freedom for a four-axis lathe
Bono, Matthew J [Pleasanton, CA]; Hibbard, Robin L [Livermore, CA]
2008-03-04
A precision tool holder for precisely positioning a single point cutting tool on a four-axis lathe, such that the center of the radius of the tool nose is aligned with the B-axis of the machine tool, so as to facilitate the machining of precision meso-scale components with complex three-dimensional shapes with sub-μm accuracy. The device is designed to fit on a commercial diamond turning machine and can adjust the cutting tool position in three orthogonal directions with sub-micrometer resolution. In particular, the tool holder adjusts the tool position using three flexure-based mechanisms, with two flexure mechanisms adjusting the lateral position of the tool to align the tool with the B-axis, and a third flexure mechanism adjusting the height of the tool. Preferably, the flexures are driven by manual micrometer adjusters. In this manner, this tool holder simplifies the process of setting a tool with sub-μm accuracy, substantially reducing the time required to set the tool.
NASA Astrophysics Data System (ADS)
Schafhirt, S.; Kaufer, D.; Cheng, P. W.
2014-12-01
In recent years many advanced load simulation tools, allowing aero-servo-hydroelastic analyses of an entire offshore wind turbine, have been developed and verified. Nowadays, even an offshore wind turbine with a complex support structure such as a jacket can be analysed. However, the computational effort rises significantly with an increasing level of detail. This holds especially for offshore wind turbines with lattice support structures, since those models naturally have a higher number of nodes and elements than simpler monopile structures. During the design process multiple load simulations are required to obtain an optimal solution. For pre-design tasks it is crucial to apply load simulations that keep simulation quality and computational effort in balance. This paper introduces a reference wind turbine model consisting of the REpower 5M wind turbine and a highly detailed jacket support structure. In total, twelve variations of this reference model are derived and presented; the main focus is on simplifying the models of the support structure and the foundation. The reference model and the simplified models are simulated with the coupled simulation tool Flex5-Poseidon and analysed with respect to frequencies, fatigue loads, and ultimate loads. A model has been found that achieves an adequate increase in simulation speed while keeping the results in an acceptable range compared to the reference results.
NASA Technical Reports Server (NTRS)
Szczur, Martha R.
1991-01-01
The Transportable Applications Environment (TAE) Plus, developed at NASA's Goddard Space Flight Center, is an advanced portable user interface development environment which simplifies the process of creating and managing complex application graphical user interfaces (GUIs), supports prototyping, allows applications to be ported easily between different platforms, and encourages appropriate levels of user interface consistency between applications. This paper discusses the capabilities of the TAE Plus tool, and how it makes the job of designing and developing GUIs easier for application developers. The paper also explains how tools like TAE Plus provide for reusability and ensure reliability of UI software components, as well as how they aid in the reduction of development and maintenance costs.
Streamlining genomes: toward the generation of simplified and stabilized microbial systems.
Leprince, Audrey; van Passel, Mark W J; dos Santos, Vitor A P Martins
2012-10-01
At the junction between systems and synthetic biology, genome streamlining provides a solid foundation both for an increased understanding of cellular circuitry and for the tailoring of microbial chassis towards innovative biotechnological applications. Iterative genomic deletions (targeted and random) help to generate simplified, stabilized and predictable genomes, whereas multiplexed genome engineering reveals a broad functional genetic diversity. The decrease in oligo and gene synthesis costs promises effective combinatorial tools for the generation of chassis based on streamlined and tractable genomes. Here we review recent progress in streamlining genomes through recombineering techniques, aiming to generate insights into cellular mechanisms and responses towards the design and assembly of streamlined genome chassis together with new cellular modules for diverse biotechnological applications. Copyright © 2012 Elsevier Ltd. All rights reserved.
Block, Bruce; Brennan, J.A.
1987-01-01
A successful health maintenance program requires physicians interested in and knowledgeable about the appropriate health surveillance actions to pursue. But even well-informed physicians need help transforming good intentions into effective health surveillance. An automated health surveillance system was designed and implemented to simplify documentation of health maintenance and remind physicians when actions were overdue. The system has increased insight into the complex process of health promotion and promises to be an important clinical, educational, and research tool.
Mapping healthcare systems: a policy relevant analytic tool
Sekhri Feachem, Neelam; Afshar, Ariana; Pruett, Cristina; Avanceña, Anton L.V.
2017-01-01
Background: In the past decade, an international consensus on the value of well-functioning systems has driven considerable health systems research. This research falls into two broad categories. The first provides conceptual frameworks that take complex healthcare systems and create simplified constructs of interactions and functions. The second focuses on granular inputs and outputs. This paper presents a novel translational mapping tool, the University of California, San Francisco mapping tool (the Tool), which bridges the gap between these two areas of research, creating a platform for multi-country comparative analysis. Methods: Using the Murray-Frenk framework, we create a macro-level representation of a country's structure, focusing on how it finances and delivers healthcare. The map visually depicts the fundamental policy questions in healthcare system design: funding sources and amount spent through each source, purchasers, populations covered, provider categories, and the relationship between these entities. Results: We use the Tool to provide a macro-level comparative analysis of the structure of India's and Thailand's healthcare systems. Conclusions: As part of the systems strengthening arsenal, the Tool can stimulate debate about the merits and consequences of different healthcare system structural designs, using a common framework that fosters multi-country comparative analyses. PMID:28541518
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ravishankar, Udhay; Manic, Milos
2013-08-01
This paper presents a micro-grid simulator tool (SGridSim) useful for implementing and testing multi-agent controllers. As a common engineering practice, it is important to have a tool that simplifies the modeling of the salient features of a desired system. In electric micro-grids, these salient features are the voltage and power distributions within the micro-grid. Current simplified electric power grid simulator tools such as PowerWorld, PowerSim, Gridlab, etc., model only the power distribution features of a desired micro-grid. Other power grid simulators such as Simulink, Modelica, etc., use detailed modeling to accommodate the voltage distribution features. The SGridSim micro-grid simulator tool presented here simplifies the modeling of both the voltage and power distribution features of a desired micro-grid. The SGridSim tool accomplishes this simplified modeling by using Effective Node-to-Node Complex Impedance (EN2NCI) models of the components that typically make up a micro-grid. The term EN2NCI model means that an impedance-based component of a micro-grid is modeled as a single impedance tied between its respective voltage nodes on the micro-grid. Hence the benefits of the presented SGridSim tool are: 1) simulation of a micro-grid is performed strictly in the complex domain; 2) simulation of a micro-grid is faster because detailed transients are not simulated. An example micro-grid model was built using the SGridSim tool and tested to simulate both the voltage and power distribution features with a total absolute relative error of less than 6%.
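A minimal sketch of the EN2NCI idea in Python (illustrative only; the node layout, impedances and loads below are hypothetical, and this is not SGridSim's API): each component becomes a single complex impedance between voltage nodes, and the steady state is obtained directly in the complex domain by nodal analysis:

```python
import numpy as np

# Hypothetical 3-node micro-grid: node 0 is the slack source held at 1.0 pu.
Z = {  # branch impedances between node pairs (pu, complex)
    (0, 1): 0.02 + 0.06j,
    (1, 2): 0.03 + 0.08j,
}
I_load = np.array([0.0, -0.5 + 0.1j, -0.3 + 0.05j])  # load current injections (pu)

n = 3
Y = np.zeros((n, n), dtype=complex)  # nodal admittance matrix
for (i, j), z in Z.items():
    y = 1.0 / z
    Y[i, i] += y; Y[j, j] += y
    Y[i, j] -= y; Y[j, i] -= y

# Hold the slack voltage and solve Y V = I for the remaining nodes.
V = np.ones(n, dtype=complex)
V[1:] = np.linalg.solve(Y[1:, 1:], I_load[1:] - Y[1:, 0] * V[0])

S = V * np.conj(Y @ V)  # complex power injected at each node (pu)
print(np.abs(V))        # voltage magnitudes
print(S.real)           # real power: positive at the source, negative at loads
```

Because the network is solved algebraically in the phasor domain, no transient time-stepping is needed, which is the speed advantage the abstract claims.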
NASA Astrophysics Data System (ADS)
Pedersen, N. L.
2015-06-01
The strength of a gear is typically defined relative to durability (pitting) and load capacity (tooth breakage). Tooth breakage is controlled by the root shape, and this part of the gear can be designed freely because there is no contact between the mating gears there. The shape of gears is generally defined by standards, the ISO standard probably being the most common. Gears are manufactured using two fundamentally different tools: rack tools and gear tools. In this work, the bending stress of involute teeth is minimized by shape optimization performed directly on the final gear. The optimized shape is then used to find the cutting tool (the gear envelope) that can create this optimized gear shape. A simple but sufficiently flexible root parameterization is applied, and emphasis is put on the importance of separating the shape parameterization from the finite element analysis of stresses. Large improvements in the stress level are found.
NASA Technical Reports Server (NTRS)
Szczur, Martha R.
1993-01-01
The Transportable Applications Environment (TAE) Plus, developed at NASA's Goddard Space Flight Center, is an advanced portable user interface development environment which simplifies the process of creating and managing complex application graphical user interfaces (GUI's). TAE Plus supports the rapid prototyping of GUI's and allows applications to be ported easily between different platforms. This paper will discuss the capabilities of the TAE Plus tool, and how it makes the job of designing and developing GUI's easier for application developers. TAE Plus is being applied to many types of applications, and this paper discusses how it has been used both within and outside NASA.
Software for Secondary-School Learning About Robotics
NASA Technical Reports Server (NTRS)
Shelton, Robert O.; Smith, Stephanie L.; Truong, Dat; Hodgson, Terry R.
2005-01-01
The ROVer Ranch is an interactive computer program designed to help secondary-school students learn about space-program robotics and related basic scientific concepts by involving the students in simplified design and programming tasks that exercise skills in mathematics and science. The tasks involve building simulated robots and then observing how they behave. The program furnishes (1) programming tools that a student can use to assemble and program a simulated robot and (2) a virtual three-dimensional mission simulator for testing the robot. First, the ROVer Ranch presents fundamental information about robotics, mission goals, and facts about the mission environment. On the basis of this information, and using the aforementioned tools, the student builds a simulated robot to accomplish its mission by selecting parts from such subsystems as propulsion, navigation, and scientific instruments. Once the robot is built, it is programmed and then placed in a three-dimensional simulated environment. Success or failure in the simulation depends on the planning and design of the robot. Data and results of the mission are available in a summary log once the mission is concluded.
A Design Tool for Liquid Rocket Engine Injectors
NASA Technical Reports Server (NTRS)
Farmer, R.; Cheng, G.; Trinh, H.; Tucker, K.
2000-01-01
A practical design tool which emphasizes the analysis of flowfields near the injector face of liquid rocket engines has been developed and used to simulate preliminary configurations of NASA's Fastrac and vortex engines. This computational design tool is sufficiently detailed to predict the interactive effects of injector element impingement angles and points and the momenta of the individual orifice flows and the combusting flow which results. In order to simulate a significant number of individual orifices, a homogeneous computational fluid dynamics model was developed. To describe sub- and supercritical liquid and vapor flows, the model utilized thermal and caloric equations of state which were valid over a wide range of pressures and temperatures. The model was constructed such that the local quality of the flow was determined directly. Since both the Fastrac and vortex engines utilize RP-1/LOX propellants, a simplified hydrocarbon combustion model was devised in order to accomplish three-dimensional, multiphase flow simulations. Such a model does not identify drops or their distribution, but it does allow the recirculating flow along the injector face and into the acoustic cavity and the film coolant flow to be accurately predicted.
Light aircraft crash safety program
NASA Technical Reports Server (NTRS)
Thomson, R. G.; Hayduk, R. J.
1974-01-01
NASA is embarked upon research and development tasks aimed at providing the general aviation industry with a reliable crashworthy airframe design technology. The goals of the NASA program are: reliable analytical techniques for predicting the nonlinear behavior of structures; significant design improvements of airframes; and simulated full-scale crash test data. The analytical tools will include both simplified procedures for estimating energy absorption characteristics and more complex computer programs for analysis of general airframe structures under crash loading conditions. The analytical techniques being developed both in-house and under contract are described, and a comparison of some analytical predictions with experimental results is shown.
Rocket/launcher structural dynamics
NASA Technical Reports Server (NTRS)
Ferragut, N. J.
1976-01-01
The equations of motion describing the interactions between a rocket and a launcher were derived using Lagrange's Equation. A rocket launching was simulated. The motions of both the rocket and the launcher can be considered in detail. The model contains flexible elements and rigid elements. The rigid elements (masses) were judiciously utilized to simplify the derivation of the equations. The advantages of simultaneous shoe release were illustrated. Also, the loading history of the interstage structure of a boosted configuration was determined. The equations shown in this analysis could be used as a design tool during the modification of old launchers and the design of new launchers.
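For reference (the standard general form; the abstract does not give the specific rocket-launcher coordinates), Lagrange's equation for each generalized coordinate q_k, with Lagrangian L = T - V and generalized force Q_k, reads

```latex
\frac{d}{dt}\!\left(\frac{\partial L}{\partial \dot{q}_k}\right)
  - \frac{\partial L}{\partial q_k} = Q_k
```

In a model of this kind, the flexible launcher elements would contribute to the kinetic and potential energies T and V, while shoe release and other constraint or contact forces would enter through the generalized forces Q_k.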
Tire-rim interface pressure of a commercial vehicle wheel under radial loads: theory and experiment
NASA Astrophysics Data System (ADS)
Wan, Xiaofei; Shan, Yingchun; Liu, Xiandong; He, Tian; Wang, Jiegong
2017-11-01
The simulation of the radial fatigue test of a wheel has become a necessary tool for improving wheel design and calculating wheel fatigue life. A simulation model that includes the strong nonlinearity of the tire structure and material may produce accurate results, but it often fails to converge. Thus, a simplified simulation model in which the complicated tire model is replaced with a tire-wheel contact pressure model is used extensively in industry. In this paper, a simplified tire-rim interface pressure model of a wheel under a radial load is established, and the pressure of the wheel under different radial loads is measured. The tire-rim contact behavior as affected by the radial load is studied and analyzed according to the test results, and the tire-rim interface pressure extracted from the test results is used to evaluate the simplified pressure model and the traditional cosine function model. The results show that the proposed model may provide a more accurate prediction of the wheel radial fatigue life than the traditional cosine function model.
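As a sketch of the traditional cosine model mentioned above (a common textbook form with illustrative numbers; the paper's exact parameters are not given in the abstract): the radial load W is distributed over the bead seats as a cosine bump over a half-angle theta0, with the peak chosen so that the vertical resultant equals W:

```python
import numpy as np
from scipy.integrate import quad

def cosine_pressure_model(W, theta0, r, b):
    """p(theta) = p0 * cos(pi*theta/(2*theta0)) on [-theta0, theta0], 0 elsewhere.

    W: radial load (N), theta0: half-angle of the loaded arc (rad),
    r: bead-seat radius (m), b: bead-seat width (m).
    """
    # Vertical resultant per unit p0: integrate p*cos(theta) over the arc.
    integrand = lambda t: np.cos(np.pi * t / (2 * theta0)) * np.cos(t) * r * b
    resultant_per_p0, _ = quad(integrand, -theta0, theta0)
    p0 = W / resultant_per_p0  # peak pressure that balances the load
    return lambda t: p0 * np.cos(np.pi * t / (2 * theta0)) * (abs(t) <= theta0)

# Illustrative values only: 25 kN load over a 40-degree half-angle.
p = cosine_pressure_model(W=25_000.0, theta0=np.radians(40), r=0.19, b=0.04)
print(p(0.0))  # peak contact pressure in Pa
```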
Data management in an object-oriented distributed aircraft conceptual design environment
NASA Astrophysics Data System (ADS)
Lu, Zhijie
In the competitive global marketplace, aerospace companies are forced to deliver the right products to the right market, at the right cost, and at the right time. However, the rapid development of technologies and new business opportunities, such as mergers, acquisitions, supply chain management, etc., have dramatically increased the complexity of designing an aircraft. Therefore, the pressure to reduce design cycle time and cost is enormous. One way to solve such a dilemma is to develop and apply advanced engineering environments (AEEs), which are distributed collaborative virtual design environments linking researchers, technologists, designers, etc., together by incorporating application tools and advanced computational, communications, and networking facilities. Aircraft conceptual design, as the first design stage, provides a major opportunity to compress design cycle time and is the cheapest place for making design changes. However, traditional aircraft conceptual design programs, which are monolithic programs, cannot provide satisfactory functionality to meet new design requirements due to the lack of domain flexibility and analysis scalability. Therefore, we are in need of the next generation aircraft conceptual design environment (NextADE). To build the NextADE, the framework and the data management problem are two major problems that need to be addressed at the forefront. Solving these two problems, particularly the data management problem, is the focus of this research. In this dissertation, in light of AEEs, a distributed object-oriented framework is first formulated and tested for the NextADE. In order to improve interoperability and simplify the integration of heterogeneous application tools, data management is one of the major problems that need to be tackled. To solve this problem, taking into account the characteristics of aircraft conceptual design data, a robust, extensible object-oriented data model is then proposed according to the distributed object-oriented framework. By overcoming the shortcomings of the traditional approach of modeling aircraft conceptual design data, this data model makes it possible to capture specific detailed information of aircraft conceptual design without sacrificing generality, which is one of the most desired features of a data model for aircraft conceptual design. Based upon this data model, a prototype of the data management system, which is one of the fundamental building blocks of the NextADE, is implemented utilizing state-of-the-art information technologies. Using a general-purpose integration software package to demonstrate the efficacy of the proposed framework and the data management system, the NextADE is initially implemented by integrating the prototype of the data management system with other building blocks of the design environment, such as disciplinary analysis programs and mission analysis programs. As experiments, two case studies are conducted in the integrated design environments. One is based upon a simplified conceptual design of a notional conventional aircraft; the other is a simplified conceptual design of an unconventional aircraft. As a result of the experiments, the proposed framework and the data management approach are shown to be feasible solutions to the research problems.
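A minimal sketch (hypothetical class and attribute names, not the dissertation's actual schema) of an extensible object-oriented design-data model: generic, nestable design objects capture component-specific detail without hard-coding a particular aircraft configuration:

```python
from dataclasses import dataclass, field

@dataclass
class DesignObject:
    name: str
    attributes: dict = field(default_factory=dict)   # e.g. {"span_m": 35.8}
    children: list = field(default_factory=list)     # nested components

    def add(self, child: "DesignObject") -> "DesignObject":
        self.children.append(child)
        return child

    def find(self, name: str):
        # Depth-first search so analysis tools can locate parts generically.
        if self.name == name:
            return self
        for c in self.children:
            hit = c.find(name)
            if hit:
                return hit
        return None

aircraft = DesignObject("notional-aircraft", {"mtow_kg": 70_000})
wing = aircraft.add(DesignObject("wing", {"span_m": 35.8, "area_m2": 124.0}))
wing.add(DesignObject("flap", {"chord_ratio": 0.3}))
print(aircraft.find("flap").attributes)
```

A real implementation would add typed attributes, versioning and distribution, but the genericity illustrated here is what lets such a model capture detail without sacrificing generality.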
Research on AutoCAD secondary development and function expansion based on VBA technology
NASA Astrophysics Data System (ADS)
Zhang, Runmei; Gu, Yehuan
2017-06-01
AutoCAD is the most widely used drawing tool among design drawing products. In the process of producing different types of design drawings of the same product, there is a great deal of repetitive, monotonous work, and producing such drawings manually in AutoCAD is inefficient, error-prone and costly. To solve these problems, a parametric drawing system for hot-rolled I-beam (steel beam) cross-sections is designed using the VBA secondary development tool together with the Access database software for large-capacity data storage, and the functional extension of plane drawing and parametric drawing design is analyzed in this paper. With this secondary development of AutoCAD functions, drawing work is simplified and work efficiency is greatly improved. Introducing parametric design into an AutoCAD drawing system can promote the mass production of standard products, such as hot-rolled I-beams, and the economic growth of related industries.
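A minimal sketch of the parametric idea (in Python rather than VBA, with illustrative dimensions; the paper's actual macros and section tables are not given in the abstract): the cross-section outline is computed from a few tabulated parameters and could then be fed to a CAD API as a closed polyline:

```python
def i_beam_outline(h, b, tw, tf):
    """Vertices of a simplified I-beam section, centered at the origin.

    h: depth, b: flange width, tw: web thickness, tf: flange thickness.
    Returns a closed polygon traced counter-clockwise.
    """
    return [
        (-b/2, -h/2), (b/2, -h/2), (b/2, -h/2 + tf), (tw/2, -h/2 + tf),
        (tw/2, h/2 - tf), (b/2, h/2 - tf), (b/2, h/2), (-b/2, h/2),
        (-b/2, h/2 - tf), (-tw/2, h/2 - tf), (-tw/2, -h/2 + tf),
        (-b/2, -h/2 + tf),
    ]

# Illustrative values only: a nominal 200 x 100 section in mm.
for x, y in i_beam_outline(h=200, b=100, tw=7, tf=11):
    print(f"{x:7.1f} {y:7.1f}")
```

In the VBA setting described above, the same vertex list would be passed to AutoCAD's drawing entities, with the dimensions looked up from the Access database by section designation.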
Dynamic response of a collidant impacting a low pressure airbag
NASA Astrophysics Data System (ADS)
Dreher, Peter A.
There are many uses of low pressure airbags, both military and commercial, but many of these applications have been hampered by inadequate and inaccurate modeling tools. This dissertation contains the derivation of a four degree-of-freedom system of differential equations from the physical laws of mass and energy conservation, force equilibrium, and the ideal gas law. Kinematic equations were derived to model a cylindrical airbag as a single control volume impacted by a parallelepiped collidant. An efficient numerical procedure was devised to solve the simplified system of equations in a manner amenable to discovering design trends. The largest public airbag experiment, in both scale and scope, was designed and built to collect data on low-pressure airbag responses otherwise unavailable in the literature. The experimental results were compared to computational simulations to validate the simplified numerical model. Experimental response trends are presented that will aid airbag designers. Two feasibility objectives were met: (1) using a low pressure airbag to accelerate a munition to a velocity of 15 feet per second from a bomb bay, and (2) using one to decelerate humans hitting trucks below the human tolerance level of 50 G's.
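A minimal sketch of this kind of model (a one degree-of-freedom reduction with illustrative numbers and an isothermal gas assumption, not the dissertation's 4-DOF system): a rigid mass compresses a vented bag, bag pressure follows the ideal gas law, and vent outflow follows a simple orifice law:

```python
import numpy as np
from scipy.integrate import solve_ivp

m, A, g = 100.0, 1.0, 9.81        # collidant mass (kg), bag contact area (m^2)
P_atm, V0 = 101_325.0, 1.0        # ambient pressure (Pa), initial bag volume (m^3)
Cd_Av = 0.6 * 0.01                # discharge coefficient * vent area (m^2)
rho = 1.2                         # gas density (kg/m^3), treated as constant
mgas0 = rho * V0
RT = P_atm * V0 / mgas0           # from the initial state (isothermal assumption)

def rhs(t, y):
    x, v, mgas = y                # crush depth, velocity, gas mass in the bag
    V = max(V0 - A * x, 1e-4)     # remaining bag volume
    dP = max(mgas * RT / V - P_atm, 0.0)       # bag overpressure (ideal gas)
    mdot = Cd_Av * np.sqrt(2 * rho * dP)       # orifice venting outflow
    a = g - dP * A / m            # overpressure decelerates the collidant
    return [v, a, -mdot]

sol = solve_ivp(rhs, (0.0, 0.5), [0.0, 6.0, mgas0], max_step=1e-3)
decel = -np.gradient(sol.y[1], sol.t) / 9.81   # deceleration history in g's
print(f"peak deceleration: {decel.max():.1f} g")
```

Sweeping vent area and initial volume in such a model is what "discovering design trends" amounts to in practice.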
CASTEAUR: a simple tool to assess the transfer of radionuclides in waterways.
Beaugelin-Seiller, K; Boyer, P; Garnier-Laplace, J; Adam, C
2002-10-01
The CASTEAUR project proposes a simplified tool to assess the transfer of radionuclides between and within the main biotic and abiotic components of the freshwater ecosystem. Various simplifying hypotheses, applied to phenomenological models, reduce the transfer equations to a form that, programmed in Excel, can be readily distributed and used. CASTEAUR can be used as an assessment tool for impact studies of accidental as well as routine releases. The code is currently being tested on the Rhone River, downstream from a nuclear reprocessing plant. The first results are reported to illustrate the possibilities offered by CASTEAUR.
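A minimal sketch of such a simplified transfer model (hypothetical first-order rate constants chosen for illustration; these are not CASTEAUR's equations or parameter values): radionuclide activity moves between water, sediment and biota compartments by first-order exchange, with radioactive decay acting everywhere:

```python
import numpy as np
from scipy.integrate import solve_ivp

lam = np.log(2) / (30.1 * 365 * 86400)   # decay constant for Cs-137 (1/s)
k_ws, k_sw = 2e-6, 5e-8                  # water <-> sediment transfer (1/s)
k_wf, k_f = 1e-6, 1e-7                   # water -> fish uptake, fish elimination

def rhs(t, y):
    water, sed, fish = y                 # activity in each compartment (Bq)
    return [
        -(k_ws + k_wf + lam) * water + k_sw * sed + k_f * fish,
        k_ws * water - (k_sw + lam) * sed,
        k_wf * water - (k_f + lam) * fish,
    ]

# Accidental release scenario: 1e9 Bq into the water column at t = 0.
sol = solve_ivp(rhs, (0, 86400 * 365), [1e9, 0.0, 0.0], rtol=1e-8)
print(sol.y[:, -1])  # water, sediment and fish activities after one year
```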
Pyviko: an automated Python tool to design gene knockouts in complex viruses with overlapping genes.
Taylor, Louis J; Strebel, Klaus
2017-01-07
Gene knockouts are a common tool used to study gene function in various organisms. However, designing gene knockouts is complicated in viruses, which frequently contain sequences that code for multiple overlapping genes. Designing mutants that can be traced by the creation of new, or elimination of existing, restriction sites further compounds the difficulty of designing knockouts of overlapping genes. While software is available to rapidly identify restriction sites in a given nucleotide sequence, no existing software addresses the design of mutations involving multiple overlapping amino acid sequences when generating gene knockouts. Pyviko performed well on a test set of over 240,000 gene pairs collected from viral genomes deposited in the National Center for Biotechnology Information Nucleotide database, identifying a point mutation that added a premature stop codon within the first 20 codons of the target gene in 93.2% of all tested overprinted gene pairs. This shows that Pyviko can be used successfully in a wide variety of contexts to facilitate the molecular cloning and study of viral overprinted genes. Pyviko is an extensible and intuitive Python tool for designing knockouts of overlapping genes. Freely available as both a Python package and a web-based interface ( http://louiejtaylor.github.io/pyViKO/ ), Pyviko simplifies the experimental design of gene knockouts in complex viruses with overlapping genes.
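A minimal sketch of the underlying search (illustrative code using Biopython, not Pyviko's actual API): scan single-base substitutions for one that introduces a premature stop codon early in the target reading frame while leaving the overlapping gene's protein unchanged:

```python
from Bio.Seq import Seq  # Biopython

def protein(seq: str, start: int) -> str:
    # Translate from the given frame start, truncated to whole codons.
    frame = seq[start: start + 3 * ((len(seq) - start) // 3)]
    return str(Seq(frame).translate())

def knockout_candidates(seq: str, target_start: int, overlap_start: int):
    wild_overlap = protein(seq, overlap_start)
    hits = []
    for pos in range(len(seq)):
        for base in "ACGT":
            if base == seq[pos]:
                continue
            mut = seq[:pos] + base + seq[pos + 1:]
            # Premature stop within the first 20 codons of the target gene?
            if "*" not in protein(mut, target_start)[:20]:
                continue
            # The overlapping gene's protein must be unchanged.
            if protein(mut, overlap_start) == wild_overlap:
                hits.append((pos, seq[pos], base))
    return hits

# Hypothetical usage: overlapping reading frames starting at offsets 0 and 2.
# print(knockout_candidates("ATGAAATGCCCGGGTTTACCATGA", 0, 2))
```

A production tool would additionally track created or destroyed restriction sites so that the knockout can be verified by digestion, which is the complication the abstract highlights.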
FY17 Status Report on NEAMS Neutronics Activities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, C. H.; Jung, Y. S.; Smith, M. A.
2017-09-30
Under the U.S. DOE NEAMS program, a high-fidelity neutronics code system has been developed to support the multiphysics modeling and simulation capability named SHARP. The neutronics code system includes the high-fidelity neutronics code PROTEUS, the cross section library and preprocessing tools, the multigroup cross section generation code MC2-3, an in-house mesh generation tool, the perturbation and sensitivity analysis code PERSENT, and post-processing tools. The main objectives of the NEAMS neutronics activities in FY17 are to continue development of an advanced nodal solver in PROTEUS for use in nuclear reactor design and analysis projects, implement a simplified sub-channel-based thermal-hydraulic (T/H) capability into PROTEUS to efficiently compute the thermal feedback, improve the performance of PROTEUS-MOCEX using numerical acceleration and code optimization, improve the cross section generation tools including MC2-3, and continue to perform verification and validation tests for PROTEUS.
A multi-fidelity framework for physics based rotor blade simulation and optimization
NASA Astrophysics Data System (ADS)
Collins, Kyle Brian
New helicopter rotor designs are desired that offer increased efficiency, reduced vibration, and reduced noise. Rotor designers in industry need methods that allow them to use the most accurate simulation tools available to search for these optimal designs. Computer-based rotor analysis and optimization have been advanced by the development of industry-standard codes known as "comprehensive" rotorcraft analysis tools. These tools typically use table look-up aerodynamics and simplified inflow models and perform aeroelastic analysis using computational structural dynamics (CSD). Due to the simplified aerodynamics, most design studies vary structure-related design variables such as sectional mass and stiffness. The optimization of shape-related variables in forward flight using these tools is complicated, and the results are viewed with skepticism because rotor blade loads are not accurately predicted. The most accurate methods of rotor simulation utilize computational fluid dynamics (CFD) but have historically been considered too computationally intensive to be used in computer-based optimization, where numerous simulations are required. An approach is needed in which high-fidelity CFD rotor analysis can be utilized in a shape-variable optimization problem with multiple objectives, and it should be capable of working in forward flight in addition to hover. An alternative is proposed here, founded on the idea that efficient hybrid CFD methods of rotor analysis are ready to be used in preliminary design. In addition, the proposed approach recognizes the usefulness of lower-fidelity physics-based analysis and surrogate modeling. Together, they are used with high-fidelity analysis in an intelligent process of building surrogate models of parameters in the high-fidelity domain. Closing the loop between high- and low-fidelity analysis is a key aspect of the proposed approach; this is done by using information from higher-fidelity analysis to improve predictions made with lower-fidelity models. This thesis documents the development of automated low- and high-fidelity physics-based rotor simulation frameworks. The low-fidelity framework uses a comprehensive code with simplified aerodynamics; the high-fidelity framework uses a parallel-processor-capable CFD/CSD methodology. Both frameworks include an aeroacoustic simulation for the prediction of noise. A synergistic process is developed that uses both frameworks together to build approximate models of important high-fidelity metrics as functions of certain design variables. To test the process, a 4-bladed hingeless rotor model is used as a baseline. The design variables investigated include tip geometry and spanwise twist distribution. Approximation models are built for metrics related to rotor efficiency and vibration using the results from 60+ high-fidelity (CFD/CSD) experiments and 400+ low-fidelity experiments. Optimization using the approximation models found the Pareto frontier anchor points, that is, the design having maximum rotor efficiency and the design having minimum vibration. Various Pareto generation methods are used to find designs on the frontier between these two anchor designs. When tested in the high-fidelity framework, the Pareto anchor designs are shown to be very good designs when compared with other designs from the high-fidelity database, providing evidence that the proposed process has merit.
Ultimately, this process can be utilized by industry rotor designers with their existing tools to bring high-fidelity analysis into the preliminary design stage of rotors. In conclusion, the methods developed and documented in this thesis make several novel contributions. First, an automated high-fidelity CFD-based forward flight simulation framework has been built for use in preliminary design optimization, built around an integrated, parallel-processor-capable CFD/CSD/AA process. Second, a novel method of building approximate models of high-fidelity parameters has been developed, using a combination of low- and high-fidelity results and combining design of experiments, statistical effects analysis, and aspects of approximation model management. And third, rotor blade shape variables have been determined through optimization using CFD-based analysis in forward flight, employing the high-fidelity CFD/CSD/AA framework and method mentioned above. While the low- and high-fidelity prediction methods used in this work still have inaccuracies that can affect the absolute levels of the results, a framework has been successfully developed and demonstrated that allows for an efficient process to improve rotor blade designs in terms of a selected choice of objective function(s). Using engineering judgment, this methodology could be applied today to investigate opportunities to improve existing designs. With the improvements in the low- and high-fidelity prediction components that will certainly occur, this framework could become a powerful tool for future rotorcraft design work. (Abstract shortened by UMI.)
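A minimal sketch of the multi-fidelity surrogate idea (toy one-dimensional functions standing in for the comprehensive-code and CFD/CSD metrics; nothing here reproduces the thesis's actual models): fit a surrogate to many cheap low-fidelity runs, then learn a low-order additive correction from a few expensive high-fidelity runs:

```python
import numpy as np

rng = np.random.default_rng(0)

def lofi(x):   # stand-in for the comprehensive-code metric (cheap)
    return np.sin(3 * x) + 0.4 * x

def hifi(x):   # stand-in for the CFD/CSD metric (expensive)
    return np.sin(3 * x) + 0.5 * x + 0.15

x_lo = rng.uniform(0, 2, 400)          # cheap: many samples
x_hi = np.linspace(0, 2, 8)            # expensive: few samples

# Polynomial surrogate of the low-fidelity response...
surro_lo = np.polynomial.Polynomial.fit(x_lo, lofi(x_lo), deg=7)
# ...plus a low-order bridge fitted to the high-fidelity discrepancy.
bridge = np.polynomial.Polynomial.fit(x_hi, hifi(x_hi) - surro_lo(x_hi), deg=2)

surrogate = lambda x: surro_lo(x) + bridge(x)
x_test = np.linspace(0, 2, 5)
print(np.max(np.abs(surrogate(x_test) - hifi(x_test))))  # small error
```

Optimizing on the corrected surrogate, then re-checking the winners in the high-fidelity framework, mirrors the loop-closing step described in the abstract.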
Easy Handling of Sensors and Actuators over TCP/IP Networks by Open Source Hardware/Software
Mejías, Andrés; Herrera, Reyes S.; Márquez, Marco A.; Calderón, Antonio José; González, Isaías; Andújar, José Manuel
2017-01-01
There are several specific solutions for accessing sensors and actuators present in any process or system through a TCP/IP network, either local or a wide area type like the Internet. The usage of sensors and actuators of different nature and diverse interfaces (SPI, I2C, analogue, etc.) makes access to them from a network in a homogeneous and secure way more complex. A framework, including both software and hardware resources, is necessary to simplify and unify networked access to these devices. In this paper, a set of open-source software tools, specifically designed to cover the different issues concerning the access to sensors and actuators, and two proposed low-cost hardware architectures to operate with the abovementioned software tools are presented. They allow integrated and easy access to local or remote sensors and actuators. The software tools, integrated in the free authoring tool Easy Java and Javascript Simulations (EJS) solve the interaction issues between the subsystem that integrates sensors and actuators into the network, called convergence subsystem in this paper, and the Human Machine Interface (HMI)—this one designed using the intuitive graphical system of EJS—located on the user’s computer. The proposed hardware architectures and software tools are described and experimental implementations with the proposed tools are presented. PMID:28067801
Hua, Xijin; Wang, Ling; Al-Hajjar, Mazen; Jin, Zhongmin; Wilcox, Ruth K; Fisher, John
2014-07-01
Finite element models are becoming increasingly useful tools to conduct parametric analysis, design optimisation and pre-clinical testing for hip joint replacements. However, the verification of the finite element model is critically important. The purposes of this study were to develop a three-dimensional anatomic finite element model for a modular metal-on-polyethylene total hip replacement for predicting its contact mechanics and to conduct experimental validation for a simple finite element model which was simplified from the anatomic finite element model. An anatomic modular metal-on-polyethylene total hip replacement model (anatomic model) was first developed and then simplified with reasonable accuracy to a simple modular total hip replacement model (simplified model) for validation. The contact areas on the articulating surface of three polyethylene liners of modular metal-on-polyethylene total hip replacement bearings with different clearances were measured experimentally in the Leeds ProSim hip joint simulator under a series of loading conditions and different cup inclination angles. The contact areas predicted from the simplified model were then compared with that measured experimentally under the same conditions. The results showed that the simplification made for the anatomic model did not change the predictions of contact mechanics of the modular metal-on-polyethylene total hip replacement substantially (less than 12% for contact stresses and contact areas). Good agreements of contact areas between the finite element predictions from the simplified model and experimental measurements were obtained, with maximum difference of 14% across all conditions considered. This indicated that the simplification and assumptions made in the anatomic model were reasonable and the finite element predictions from the simplified model were valid. © IMechE 2014.
Semiautomated Device for Batch Extraction of Metabolites from Tissue Samples
2012-01-01
Metabolomics has become a mainstream analytical strategy for investigating metabolism. The quality of data derived from these studies is proportional to the consistency of the sample preparation. Although considerable research has been devoted to finding optimal extraction protocols, most of the established methods require extensive sample handling. Manual sample preparation can be highly effective in the hands of skilled technicians, but an automated tool for purifying metabolites from complex biological tissues would be of obvious utility to the field. Here, we introduce the semiautomated metabolite batch extraction device (SAMBED), a new tool designed to simplify metabolomics sample preparation. We discuss SAMBED’s design and show that SAMBED-based extractions are of comparable quality to extracts produced through traditional methods (13% mean coefficient of variation from SAMBED versus 16% from manual extractions). Moreover, we show that aqueous SAMBED-based methods can be completed in less than a quarter of the time required for manual extractions. PMID:22292466
A spectral Poisson solver for kinetic plasma simulation
NASA Astrophysics Data System (ADS)
Szeremley, Daniel; Obberath, Jens; Brinkmann, Ralf
2011-10-01
Plasma resonance spectroscopy is a well established plasma diagnostic method, realized in several designs. One of these designs is the multipole resonance probe (MRP). In its idealized, geometrically simplified version it consists of two dielectrically shielded, hemispherical electrodes to which an RF signal is applied. A numerical tool is under development which is capable of simulating the dynamics of the plasma surrounding the MRP in the electrostatic approximation. In this contribution we concentrate on the specialized Poisson solver for that tool. The plasma is represented by an ensemble of point charges. By expanding both the charge density and the potential into spherical harmonics, a largely analytical solution of the Poisson problem can be employed. For a practical implementation, the expansion must be appropriately truncated. With this spectral solver we are able to efficiently solve the Poisson equation in a kinetic plasma simulation without the need to introduce a spatial discretization.
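For reference, the standard free-space form of such a spherical-harmonic Poisson solution, truncated at l_max (a generic textbook result; the probe's dielectric shielding would modify the radial factors through boundary conditions):

```latex
% \rho_{lm}(r) = \int Y_{lm}^{*}(\theta,\varphi)\,\rho(r,\theta,\varphi)\,d\Omega
\Phi(r,\theta,\varphi) = \frac{1}{\varepsilon_0}
  \sum_{l=0}^{l_{\max}}\sum_{m=-l}^{l}
  \frac{Y_{lm}(\theta,\varphi)}{2l+1}
  \left[\frac{1}{r^{\,l+1}}\int_0^{r}\rho_{lm}(r')\,r'^{\,l+2}\,dr'
      + r^{\,l}\int_r^{\infty}\rho_{lm}(r')\,r'^{\,1-l}\,dr'\right]
```

Only the radial integrals need numerical evaluation, which is why the expansion makes the solver largely analytical.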
Cunningham, Lucas J.; Lingley, Jessica K.; Haines, Lee R.; Ndung’u, Joseph M.; Torr, Stephen J.; Adams, Emily R.
2016-01-01
Background: As the reality of eliminating human African trypanosomiasis (HAT) by 2020 draws closer, the need to detect and identify the remaining areas of transmission increases. Here, we have explored the feasibility of using commercially available LAMP kits, designed to detect the Trypanozoon group of trypanosomes, as a xenomonitoring tool to screen tsetse flies for trypanosomes to be used in future epidemiological surveys. Methods and Findings: The DNA extraction method was simplified and worked with the LAMP kits to detect a single positive fly when pooled with 19 negative flies, and the absolute lowest limit of detection that the kits were able to work at was the equivalent of 0.1 trypanosome per ml. The DNA from Trypanosoma brucei brucei could be detected six days after the fly had taken a blood meal containing dead trypanosomes, and when confronted with a range of non-target species, from both laboratory-reared flies and wild-caught flies, the kits showed no evidence of cross-reacting. Conclusion: We have shown that it is possible to use a simplified DNA extraction method in conjunction with the pooling of tsetse flies to decrease the time it would take to screen large numbers of flies for the presence of Trypanozoon trypanosomes. The use of commercially-available LAMP kits provides a reliable and highly sensitive tool for xenomonitoring and identifying potential sleeping sickness transmission sites. PMID:26890882
GeneSCF: a real-time based functional enrichment tool with support for multiple organisms.
Subhash, Santhilal; Kanduri, Chandrasekhar
2016-09-13
High-throughput technologies such as ChIP-sequencing, RNA-sequencing, DNA sequencing and quantitative metabolomics generate huge volumes of data. Researchers often rely on functional enrichment tools to interpret the biological significance of the affected genes from these high-throughput studies. However, currently available functional enrichment tools need to be updated frequently to adapt to new entries in the functional database repositories. Hence there is a need for a simplified tool that can perform functional enrichment analysis using updated information directly from source databases such as KEGG, Reactome or Gene Ontology. In this study, we focused on designing a command-line tool called GeneSCF (Gene Set Clustering based on Functional annotations) that can predict functionally relevant biological information for a set of genes in a real-time updated manner. It is designed to handle information for more than 4000 organisms from freely available prominent functional databases such as KEGG, Reactome and Gene Ontology. We successfully employed our tool on two published datasets to predict the biologically relevant functional information. The core features of this tool were tested on Linux machines without the need to install additional dependencies. GeneSCF is more reliable than other enrichment tools because of its ability to use reference functional databases in real time to perform enrichment analysis. It is an easy-to-integrate tool with other pipelines available for downstream analysis of high-throughput data. More importantly, GeneSCF can run multiple gene lists simultaneously on different organisms, thereby saving time for users. Since the tool is designed to be ready-to-use, there is no need for any complex compilation and installation procedures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dana, Scott; Van Dam, Jeroen J; Damiani, Rick R
As part of an ongoing effort to improve the modeling and prediction of small wind turbine dynamics, the National Renewable Energy Laboratory (NREL) tested a small horizontal-axis wind turbine in the field at the National Wind Technology Center. The test turbine was a 2.1-kW downwind machine mounted on an 18-m multi-section fiberglass composite tower. The tower was instrumented and monitored for approximately 6 months. The collected data were analyzed to assess the turbine and tower loads and to further validate the simplified loads equations from the International Electrotechnical Commission (IEC) 61400-2 design standard. Field-measured loads were also compared to the output of an aeroelastic model of the turbine. In particular, we compared fatigue loads as measured in the field, predicted by the aeroelastic model, and calculated using the simplified design equations. Ultimate loads at the tower base were assessed using both the simplified design equations and the aeroelastic model output. The simplified design equations in IEC 61400-2 do not accurately model fatigue loads, and this shortcoming is discussed.
Emerging CFD technologies and aerospace vehicle design
NASA Technical Reports Server (NTRS)
Aftosmis, Michael J.
1995-01-01
With the recent focus on the needs of design and applications CFD, research groups have begun to address the traditional bottlenecks of grid generation and surface modeling. Now, a host of emerging technologies promise to shortcut or dramatically simplify the simulation process. This paper discusses the current status of these emerging technologies. It argues that some tools are already available which can have a positive impact on portions of the design cycle. However, in most cases, these tools need to be integrated into specific engineering systems and process cycles to be used effectively. The rapidly maturing status of unstructured and Cartesian approaches for inviscid simulations suggests the possibility of highly automated Euler-boundary layer simulations with application to loads estimation and even preliminary design. Similarly, technology is available to link block-structured mesh generation algorithms with topology libraries to avoid tedious re-meshing of topologically similar configurations. Work in algorithm-based auto-blocking suggests that domain decomposition and point placement operations in multi-block mesh generation may be properly posed as problems in computational geometry, and following this approach may lead to robust algorithmic processes for automatic mesh generation.
Automated Design Space Exploration with Aspen
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spafford, Kyle L.; Vetter, Jeffrey S.
Architects and applications scientists often use performance models to explore a multidimensional design space of architectural characteristics, algorithm designs, and application parameters. With traditional performance modeling tools, these explorations forced users to first develop a performance model and then repeatedly evaluate and analyze the model manually. These manual investigations proved laborious and error prone. More importantly, the complexity of this traditional process often forced users to simplify their investigations. To address this challenge of design space exploration, we extend our Aspen (Abstract Scalable Performance Engineering Notation) language with three new language constructs: user-defined resources, parameter ranges, and a collection of costs in the abstract machine model. Then, we use these constructs to enable automated design space exploration via a nonlinear optimization solver. We show how four interesting classes of design space exploration scenarios can be derived from Aspen models and formulated as pure nonlinear programs. The analysis tools are demonstrated using examples based on Aspen models for a three-dimensional Fast Fourier Transform, the CoMD molecular dynamics proxy application, and the DARPA Streaming Sensor Challenge Problem. Our results show that this approach can compose and solve arbitrary performance modeling questions quickly and rigorously when compared to the traditional manual approach.
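A minimal sketch of posing such a design-space question as a nonlinear program (a toy model in Python with scipy, not Aspen's notation or its solver): choose core count and cache size to minimize a modeled runtime under a power budget:

```python
from scipy.optimize import minimize

def runtime(x):
    cores, cache_mb = x
    return 100.0 / cores + 5.0 / cache_mb     # toy performance model

def power(x):
    cores, cache_mb = x
    return 10.0 * cores + 0.5 * cache_mb      # toy power model (W)

res = minimize(
    runtime,
    x0=[4.0, 8.0],
    bounds=[(1, 64), (1, 128)],               # feasible design ranges
    constraints=[{"type": "ineq", "fun": lambda x: 150.0 - power(x)}],
)
print(res.x, runtime(res.x))   # best (cores, cache) under a 150 W budget
```

In Aspen's setting the objective and constraints would come from the model's resources and costs rather than hand-written functions, but the formulation as a nonlinear program is the same.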
DynaSim: A MATLAB Toolbox for Neural Modeling and Simulation
Sherfey, Jason S.; Soplata, Austin E.; Ardid, Salva; Roberts, Erik A.; Stanley, David A.; Pittman-Polletta, Benjamin R.; Kopell, Nancy J.
2018-01-01
DynaSim is an open-source MATLAB/GNU Octave toolbox for rapid prototyping of neural models and batch simulation management. It is designed to speed up and simplify the process of generating, sharing, and exploring network models of neurons with one or more compartments. Models can be specified by equations directly (similar to XPP or the Brian simulator) or by lists of predefined or custom model components. The higher-level specification supports arbitrarily complex population models and networks of interconnected populations. DynaSim also includes a large set of features that simplify exploring model dynamics over parameter spaces, running simulations in parallel using both multicore processors and high-performance computer clusters, and analyzing and plotting large numbers of simulated data sets in parallel. It also includes a graphical user interface (DynaSim GUI) that supports full functionality without requiring user programming. The software has been implemented in MATLAB to enable advanced neural modeling using MATLAB, given its popularity and a growing interest in modeling neural systems. The design of DynaSim incorporates a novel schema for model specification to facilitate future interoperability with other specifications (e.g., NeuroML, SBML), simulators (e.g., NEURON, Brian, NEST), and web-based applications (e.g., Geppetto) outside MATLAB. DynaSim is freely available at http://dynasimtoolbox.org. This tool promises to reduce barriers for investigating dynamics in large neural models, facilitate collaborative modeling, and complement other tools being developed in the neuroinformatics community. PMID:29599715
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robinson, P.
2014-09-23
GRAPE is a tool for managing software project workflows for the Git version control system. It provides a suite of tools to simplify and configure branch based development, integration with a project's testing suite, and integration with the Atlassian Stash repository hosting tool.
Simplified method for calculating shear deflections of beams.
I. Orosz
1970-01-01
When one designs with wood, shear deflections can become substantial compared to deflections due to moments, because the modulus of elasticity in bending differs from that in shear by a large amount. This report presents a simplified energy method to calculate shear deflections in bending members. This simplified approach should help designers decide whether or not...
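For context, a standard energy-method result of the kind the report simplifies (a generic textbook form, not the report's own equations): the shear strain energy and, for a cantilever of length L with tip load P, the resulting shear deflection are

```latex
U_s = \int_0^L \frac{k\,V(x)^2}{2\,G\,A}\,dx, \qquad
\delta_s = \frac{\partial U_s}{\partial P} = \frac{k\,P\,L}{G\,A}
```

where V is the shear force, A the cross-sectional area, G the shear modulus, and k the section shear coefficient (6/5 for a rectangular section). Comparing this term against the bending deflection shows when shear matters for wood, where G is small relative to the bending modulus.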
Simple design of slanted grating with simplified modal method.
Li, Shubin; Zhou, Changhe; Cao, Hongchao; Wu, Jun
2014-02-15
A simplified modal method (SMM) is presented that offers a clear physical picture for subwavelength slanted gratings. The diffraction characteristics of a slanted grating under the Littrow configuration are revealed by the SMM to be equivalent to those of a rectangular grating, in good agreement with rigorous coupled-wave analysis. Based on this equivalence, we obtain an effective analytic solution that simplifies the design and optimization of slanted gratings. It offers a new approach to slanted grating design; e.g., a 1×2 beam splitter can be easily designed. This method should be helpful for designing various new slanted grating devices.
NASA Technical Reports Server (NTRS)
Ables, Brett
2014-01-01
Multi-stage launch vehicles with solid rocket motors (SRMs) face design optimization challenges, especially when the mission scope changes frequently. Significant performance benefits can be realized if the solid rocket motors are optimized to the changing requirements. While SRMs represent a fixed performance at launch, rapid design iterations enable flexibility at design time, yielding significant performance gains. The streamlining and integration of SRM design and analysis can be achieved with improved analysis tools. While powerful and versatile, the Solid Performance Program (SPP) is not conducive to rapid design iteration. Performing a design iteration with SPP and a trajectory solver is a labor intensive process. To enable a better workflow, SPP, the Program to Optimize Simulated Trajectories (POST), and the interfaces between them have been improved and automated, and a graphical user interface (GUI) has been developed. The GUI enables real-time visual feedback of grain and nozzle design inputs, enforces parameter dependencies, removes redundancies, and simplifies manipulation of SPP and POST's numerous options. Automating the analysis also simplifies batch analyses and trade studies. Finally, the GUI provides post-processing, visualization, and comparison of results. Wrapping legacy high-fidelity analysis codes with modern software provides the improved interface necessary to enable rapid coupled SRM ballistics and vehicle trajectory analysis. Low cost trade studies demonstrate the sensitivities of flight performance metrics to propulsion characteristics. Incorporating high fidelity analysis from SPP into vehicle design reduces performance margins and improves reliability. By flying an SRM designed with the same assumptions as the rest of the vehicle, accurate comparisons can be made between competing architectures. In summary, this flexible workflow is a critical component to designing a versatile launch vehicle model that can accommodate a volatile mission scope.
Proof of Concept Study of Trade Space Configuration Tool for Spacecraft Design
NASA Technical Reports Server (NTRS)
Glidden, Geoffrey L.
2009-01-01
Spacecraft design is a very difficult and time-consuming process because requirements and criteria are often changed or modified as the design is refined. Accounting for these adjustments in the design constraints plays a significant role in furthering the overall progress. There are numerous aspects and variables that hold significant influence over various characteristics of the design. This can be especially frustrating when attempting to conduct rapid trade space analysis on system configurations. Currently, the data and designs considered for trade space evaluations can only be displayed using the traditional interfaces of Excel spreadsheets or CAD (Computer Aided Design) models. While helpful, these methods of analyzing the data from a systems engineering approach can be rather complicated and overwhelming. As a result, a proof of concept was conducted on dynamic data visualization software called Thinkmap SDK (Software Developer Kit) to allow for better organization and understanding of the relationships between the various aspects that make up an entire design. The Orion Crew Module Aft Bay Subsystem was used as the test case for this study because the design and layout of many of the subsystem components will be significant in ensuring the overall center of gravity of the capsule is correct. A simplified model of this subsystem was created and programmed using Thinkmap SDK to create a preliminary prototype application of a Trade Space Configuration Tool. The completed application confirms that the core requirements for the tool can be met. Further development is strongly suggested to produce a full prototype application to allow final evaluations and recommendations of the software capabilities.
Design and implementation of the mobility assessment tool: software description.
Barnard, Ryan T; Marsh, Anthony P; Rejeski, Walter Jack; Pecorella, Anthony; Ip, Edward H
2013-07-23
In previous work, we described the development of an 81-item video-animated tool for assessing mobility. In response to criticism levied during a pilot study of this tool, we sought to develop a new version built upon a flexible framework for designing and administering the instrument. Rather than constructing a self-contained software application with a hard-coded instrument, we designed an XML schema capable of describing a variety of psychometric instruments. The new version of our video-animated assessment tool was then defined fully within the context of a compliant XML document. Two software applications, one written in Java and the other in Objective-C for the Apple iPad, were then built to present the instrument described in the XML document and collect participants' responses. Separating the instrument's definition from the software application implementing it allowed for rapid iteration and easy, reliable definition of variations. Defining instruments in a software-independent XML document simplifies the process of defining instruments and variations and allows a single instrument to be deployed on as many platforms as there are software applications capable of interpreting the instrument, thereby broadening the potential target audience for the instrument. Continued work will be done to further specify and refine this type of instrument specification, with a focus on spurring adoption by researchers in gerontology and geriatric medicine.
Design and implementation of the mobility assessment tool: software description
2013-01-01
Background In previous work, we described the development of an 81-item video-animated tool for assessing mobility. In response to criticism levied during a pilot study of this tool, we sought to develop a new version built upon a flexible framework for designing and administering the instrument. Results Rather than constructing a self-contained software application with a hard-coded instrument, we designed an XML schema capable of describing a variety of psychometric instruments. The new version of our video-animated assessment tool was then defined fully within the context of a compliant XML document. Two software applications, one written in Java and the other in Objective-C for the Apple iPad, were then built to present the instrument described in the XML document and collect participants' responses. Separating the instrument's definition from the software application implementing it allowed for rapid iteration and easy, reliable definition of variations. Conclusions Defining instruments in a software-independent XML document simplifies the process of defining instruments and variations and allows a single instrument to be deployed on as many platforms as there are software applications capable of interpreting the instrument, thereby broadening the potential target audience for the instrument. Continued work will be done to further specify and refine this type of instrument specification, with a focus on spurring adoption by researchers in gerontology and geriatric medicine.
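As an illustration of the separation the authors describe, here is a hypothetical, minimal XML-defined instrument together with a generic Python reader. All element and attribute names are invented; the actual schema is not reproduced in the abstract.

    # Hypothetical sketch: an instrument defined entirely in XML and read by a
    # generic player application. Element/attribute names are invented.
    import xml.etree.ElementTree as ET

    INSTRUMENT_XML = """
    <instrument name="mobility-demo">
      <item id="q1" prompt="Can you walk a quarter mile without resting?">
        <response value="1" label="Yes"/>
        <response value="0" label="No"/>
      </item>
    </instrument>
    """

    def load_items(xml_text):
        """Yield (id, prompt, responses) for each item in the XML definition."""
        root = ET.fromstring(xml_text)
        for item in root.findall("item"):
            responses = [(r.get("value"), r.get("label")) for r in item.findall("response")]
            yield item.get("id"), item.get("prompt"), responses

    for item_id, prompt, responses in load_items(INSTRUMENT_XML):
        print(item_id, prompt, responses)

Any application that can interpret the document (here, a few lines of Python; in the paper, Java and Objective-C players) presents the same instrument, which is the portability argument made above.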
A special purpose silicon compiler for designing supercomputing VLSI systems
NASA Technical Reports Server (NTRS)
Venkateswaran, N.; Murugavel, P.; Kamakoti, V.; Shankarraman, M. J.; Rangarajan, S.; Mallikarjun, M.; Karthikeyan, B.; Prabhakar, T. S.; Satish, V.; Venkatasubramaniam, P. R.
1991-01-01
Design of general/special purpose supercomputing VLSI systems for numeric algorithm execution involves tackling two important aspects, namely their computational and communication complexities. Development of software tools for designing such systems itself becomes complex, hence a novel design methodology has to be developed. For designing such complex systems, a special purpose silicon compiler is needed in which the computational and communication structures of different numeric algorithms are taken into account to simplify the compiler design, the approach is macrocell based, and the software tools at different levels (from the algorithm down to the VLSI circuit layout) are integrated. In this paper a special purpose silicon (SPS) compiler based on PACUBE macrocell VLSI arrays for designing supercomputing VLSI systems is presented. It is shown that turn-around time and silicon real estate are reduced compared with silicon compilers based on PLAs, SLAs, and gate arrays. The first two characteristics mentioned above enable the SPS compiler to perform systolic mapping (at the macrocell level) of algorithms whose computational structures are of GIPOP (generalized inner product outer product) form. Direct systolic mapping on PLAs, SLAs, and gate arrays is very difficult as they are microcell based. A novel GIPOP processor is under development using this special purpose silicon compiler.
A Methodology for Developing Army Acquisition Strategies for an Uncertain Future
2007-01-01
For example, they employ the Automated Cost Estimating Integrated Tools (ACEIT) to simplify life-cycle cost estimates; other tools are...
Simplified models for dark matter face their consistent completions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gonçalves, Dorival; Machado, Pedro A. N.; No, Jose Miguel
Simplified dark matter models have recently been advocated as a powerful tool to exploit the complementarity between dark matter direct detection, indirect detection and LHC experimental probes. Focusing on pseudoscalar mediators between the dark and visible sectors, we show that the simplified dark matter model phenomenology departs significantly from that of consistent $SU(2)_{\mathrm{L}} \times U(1)_{\mathrm{Y}}$ gauge-invariant completions. We discuss the key physics the simplified models fail to capture, and its impact on LHC searches. Notably, we show that resonant mono-Z searches provide sensitivities competitive with standard mono-jet analyses at the 13 TeV LHC.
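For orientation, a generic pseudoscalar simplified model couples the mediator a to dark matter χ and to quarks roughly as follows (a common parametrization, up to sign and normalization conventions, and not necessarily the exact one used in the paper):

$$ \mathcal{L} \supset -\,i\,g_\chi\, a\,\bar{\chi}\gamma_5\chi \;-\; i\,g_q \sum_q \frac{y_q}{\sqrt{2}}\, a\,\bar{q}\gamma_5 q, \qquad y_q = \frac{\sqrt{2}\,m_q}{v} $$

Taken alone, such a Lagrangian is not $SU(2)_{\mathrm{L}} \times U(1)_{\mathrm{Y}}$ invariant; restoring gauge invariance brings in additional states and couplings, which is the source of the phenomenological differences the paper examines.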
Modular chassis simplifies packaging and interconnecting of circuit boards
NASA Technical Reports Server (NTRS)
Arens, W. E.; Boline, K. G.
1964-01-01
A system of modular chassis structures has simplified the design for mounting a number of printed circuit boards. This design is structurally adaptable to computer and industrial control system applications.
Expert systems for space power supply - Design, analysis, and evaluation
NASA Technical Reports Server (NTRS)
Cooper, Ralph S.; Thomson, M. Kemer; Hoshor, Alan
1987-01-01
The feasibility of applying expert systems to the conceptual design, analysis, and evaluation of space power supplies in particular, and complex systems in general, is evaluated. To do this, the space power supply design process and its associated knowledge base were analyzed and characterized in a form suitable for computer emulation of a human expert. The existing expert system tools and the results achieved with them were evaluated to assess their applicability to power system design. Some new concepts for combining program architectures (modular expert systems and algorithms) with information about the domain were applied to create a 'deep' system for handling the complex design problem. NOVICE, a code that solves a simplified version of a scoping study of a wide variety of power supply types for a broad range of missions, has been developed, programmed, and tested as a concrete feasibility demonstration.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-16
... NUCLEAR REGULATORY COMMISSION [NRC-2011-0055] Economic Simplified Boiling Water Reactor Standard Design: GE Hitachi Nuclear Energy; Issuance of Final Design Approval The U.S. Nuclear Regulatory Commission has issued a final design approval (FDA) to GE Hitachi Nuclear Energy (GEH) for the economic...
Human factors model concerning the man-machine interface of mining crewstations
NASA Technical Reports Server (NTRS)
Rider, James P.; Unger, Richard L.
1989-01-01
The U.S. Bureau of Mines is developing a computer model to analyze the human factors aspects of mining machine operator compartments. The model will be used as a research tool and as a design aid. It will have the capability to perform the following: simulated anthropometric or reach assessment, visibility analysis, illumination analysis, structural analysis of the protective canopy, operator fatigue analysis, and computation of an ingress-egress rating. The model will make extensive use of graphics to simplify data input and output. Two-dimensional orthographic projections of the machine and its operator compartment are digitized and the data rebuilt into a three-dimensional representation of the mining machine. Anthropometric data from either an individual or any size population may be used. The model is intended for use by equipment manufacturers and mining companies during initial design work on new machines. In addition to its use in machine design, the model should prove helpful as an accident investigation tool and for determining the effects of machine modifications made in the field on the critical areas of visibility and control reachability.
An architecture for genomics analysis in a clinical setting using Galaxy and Docker
Digan, W; Countouris, H; Barritault, M; Baudoin, D; Laurent-Puig, P; Blons, H; Burgun, A
2017-01-01
Next-generation sequencing is used on a daily basis to perform molecular analyses that determine subtypes of disease (e.g., in cancer) and assist in the selection of the optimal treatment. Clinical bioinformatics handles the manipulation of the data generated by the sequencer, from generation through analysis and interpretation. Reproducibility and traceability are crucial issues in a clinical setting. We have designed an approach based on Docker container technology and Galaxy, the popular open-source bioinformatics analysis platform. Our solution simplifies the deployment of a small-size analytical platform and simplifies the process for the clinician. From the technical point of view, the tools embedded in the platform are isolated and versioned through Docker images. Alongside the Galaxy platform, we also introduce the AnalysisManager, a solution that allows single-click analysis for biologists and leverages standardized bioinformatics application programming interfaces. We added a Shiny/R interactive environment to ease the visualization of the outputs. The platform relies on containers and ensures data traceability by recording analytical actions and by associating the inputs and outputs of the tools with the EDAM ontology through ReGaTe. The source code is freely available on GitHub at https://github.com/CARPEM/GalaxyDocker.
An architecture for genomics analysis in a clinical setting using Galaxy and Docker.
Digan, W; Countouris, H; Barritault, M; Baudoin, D; Laurent-Puig, P; Blons, H; Burgun, A; Rance, B
2017-11-01
Next-generation sequencing is used on a daily basis to perform molecular analyses that determine subtypes of disease (e.g., in cancer) and assist in the selection of the optimal treatment. Clinical bioinformatics handles the manipulation of the data generated by the sequencer, from generation through analysis and interpretation. Reproducibility and traceability are crucial issues in a clinical setting. We have designed an approach based on Docker container technology and Galaxy, the popular open-source bioinformatics analysis platform. Our solution simplifies the deployment of a small-size analytical platform and simplifies the process for the clinician. From the technical point of view, the tools embedded in the platform are isolated and versioned through Docker images. Alongside the Galaxy platform, we also introduce the AnalysisManager, a solution that allows single-click analysis for biologists and leverages standardized bioinformatics application programming interfaces. We added a Shiny/R interactive environment to ease the visualization of the outputs. The platform relies on containers and ensures data traceability by recording analytical actions and by associating the inputs and outputs of the tools with the EDAM ontology through ReGaTe. The source code is freely available on GitHub at https://github.com/CARPEM/GalaxyDocker.
Simplified filtered Smith predictor for MIMO processes with multiple time delays.
Santos, Tito L M; Torrico, Bismark C; Normey-Rico, Julio E
2016-11-01
This paper proposes a simplified tuning strategy for the multivariable filtered Smith predictor. It is shown that offset-free control can be achieved with step references and disturbances regardless of the poles of the primary controller, i.e., integral action is not explicitly required. This strategy reduces the number of design parameters and simplifies the tuning procedure because the implicit integrative poles are not considered for design purposes. The simplified approach can be used to design continuous-time or discrete-time controllers. Three case studies illustrate the advantages of the proposed strategy compared with the standard approach, which is based on explicit integral action.
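For orientation, one common discrete-time statement of the filtered Smith predictor structure, with nominal delay-free model $G_n(z)$, nominal dead time $d$ and predictor filter $F_r(z)$ (notation assumed here; the paper's multivariable formulation generalizes this):

$$ \hat{y}(k+d\,|\,k) = G_n(z)\,u(k) + F_r(z)\left[\,y(k) - G_n(z)\,z^{-d}\,u(k)\,\right] $$

The primary controller then acts on the predicted, delay-free output; the tuning simplification claimed above comes from how $F_r(z)$ is chosen.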
Designing and Developing Web-Based Administrative Tools for Program Management
NASA Technical Reports Server (NTRS)
Gutensohn, Michael
2017-01-01
The task assigned for this internship was to develop a new tool for tracking projects, their subsystems, the leads, backups, and other employees assigned to them, as well as all the relevant information related to each employee (WBS (time charge) codes, time distribution, certifications, and assignments). Currently, this data is tracked manually by a number of different people using a number of different spreadsheets and other tools simultaneously; some of these documents are then merged into one large document. This often leads to inconsistencies and loss of data due to human error. By simplifying the process of tracking this data and aggregating it into a single tool, it is possible to significantly decrease the potential for human error and the time spent collecting and checking this information. The main objective of this internship was therefore to develop a web-based tool using Ruby on Rails to serve as a method of easily tracking projects, subsystems, and points of contact, along with employees, their assignments, time distribution, certifications, and contact information. Additionally, this tool must be capable of generating a number of different reports based on the data collected. It was also important that this tool deliver all of this information through a readable and intuitive interface.
Synthesis of research on work zone delays and simplified application of QuickZone analysis tool.
DOT National Transportation Integrated Search
2010-03-01
The objectives of this project were to synthesize the latest information on work zone safety and management and identify case studies in which FHWA's decision support tool QuickZone or other appropriate analysis tools could be applied. The results ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dana, S.; Damiani, R.; vanDam, J.
As part of an ongoing effort to improve the modeling and prediction of small wind turbine dynamics, NREL tested a small horizontal axis wind turbine in the field at the National Wind Technology Center (NWTC). The test turbine was a 2.1-kW downwind machine mounted on an 18-meter multi-section fiberglass composite tower. The tower was instrumented and monitored for approximately 6 months. The collected data were analyzed to assess the turbine and tower loads and further validate the simplified loads equations from the International Electrotechnical Commission (IEC) 61400-2 design standards. Field-measured loads were also compared to the output of an aeroelastic model of the turbine. Ultimate loads at the tower base were assessed using both the simplified design equations and the aeroelastic model output. The simplified design equations in IEC 61400-2 do not accurately model fatigue loads. In this project, we compared fatigue loads as measured in the field, as predicted by the aeroelastic model, and as calculated using the simplified design equations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hong, Tianzhen; Yan, Da; D'Oca, Simona
Occupant behavior has significant impacts on building energy performance and occupant comfort. However, occupant behavior is not well understood and is often oversimplified in the building life cycle, due to its stochastic, diverse, complex, and interdisciplinary nature. The use of simplified methods or tools to quantify the impacts of occupant behavior in building performance simulations significantly contributes to performance gaps between simulated models and actual building energy consumption. Therefore, it is crucial to understand occupant behavior in a comprehensive way, integrating qualitative approaches and data- and model-driven quantitative approaches, and employing appropriate tools to guide the design and operation of low-energy residential and commercial buildings that integrate technological and human dimensions. This paper presents ten questions, highlighting some of the most important issues regarding concepts, applications, and methodologies in occupant behavior research. The proposed questions and answers aim to provide insights into occupant behavior for current and future researchers, designers, and policy makers, and most importantly, to inspire innovative research and applications to increase energy efficiency and reduce energy use in buildings.
The Reactome Pathway Knowledgebase
Jupe, Steven; Matthews, Lisa; Sidiropoulos, Konstantinos; Gillespie, Marc; Garapati, Phani; Haw, Robin; Jassal, Bijay; Korninger, Florian; May, Bruce; Milacic, Marija; Roca, Corina Duenas; Rothfels, Karen; Sevilla, Cristoffer; Shamovsky, Veronica; Shorser, Solomon; Varusai, Thawfeek; Viteri, Guilherme; Weiser, Joel
2018-01-01
The Reactome Knowledgebase (https://reactome.org) provides molecular details of signal transduction, transport, DNA replication, metabolism, and other cellular processes as an ordered network of molecular transformations—an extended version of a classic metabolic map, in a single consistent data model. Reactome functions both as an archive of biological processes and as a tool for discovering unexpected functional relationships in data such as gene expression profiles or somatic mutation catalogues from tumor cells. To support the continued brisk growth in the size and complexity of Reactome, we have implemented a graph database, improved performance of data analysis tools, and designed new data structures and strategies to boost diagram viewer performance. To make our website more accessible to human users, we have improved pathway display and navigation by implementing interactive Enhanced High Level Diagrams (EHLDs) with an associated icon library, and subpathway highlighting and zooming, in a simplified and reorganized web site with adaptive design. To encourage re-use of our content, we have enabled export of pathway diagrams as 'PowerPoint' files.
NASA Astrophysics Data System (ADS)
Naldesi, Luciano; Buttol, Patrizia; Masoni, Paolo; Misceo, Monica; Sára, Balázs
2004-12-01
"eLCA" is a European Commission financed project aimed at realising "On line green tools and services for Small and Medium-sized Enterprises (SMEs)". Knowledge and use of Life Cycle Assessment (LCA) by SMEs are strategic to introduce the Integrated Product Policy (IPP) in Europe, but methodology simplification is needed. LCA requires a large amount of validated general and sector specific data. Since their availability and cost can be insuperable barriers for SMEs, pre-elaborated data/meta-data, use of standards and low cost solutions are required. Within the framework of the eLCA project an LCA software - eVerdEE - based on a simplified methodology and specialised for SMEs has been developed. eVerdEE is a web-based tool with some innovative features. Its main feature is the adaptation of ISO 14040 requirements to offer easy-to-handle functions with solid scientific bases. Complex methodological problems, such as the system boundaries definition, the data quality estimation and documentation, the choice of impact categories, are simplified according to the SMEs" needs. Predefined "Goal and Scope definition" and "Inventory" forms, a user-friendly and well structured procedure are time and cost-effective. The tool is supported by a database containing pre-elaborated environmental indicators of substances and processes for different impact categories. The impact assessment is calculated automatically by using the user"s input and the database values. The results have different levels of interpretation in order to identify the life cycle critical points and the improvement options. The use of a target plot allows the direct comparison of different design alternatives.
NASA Astrophysics Data System (ADS)
Fasel, Markus
2016-10-01
High-Performance Computing Systems are powerful tools tailored to support large-scale applications that rely on low-latency inter-process communications to run efficiently. By design, these systems often impose constraints on application workflows, such as limited external network connectivity and whole node scheduling, that make more general-purpose computing tasks, such as those commonly found in high-energy nuclear physics applications, more difficult to carry out. In this work, we present a tool designed to simplify access to such complicated environments by handling the common tasks of job submission, software management, and local data management, in a framework that is easily adaptable to the specific requirements of various computing systems. The tool, initially constructed to process stand-alone ALICE simulations for detector and software development, was successfully deployed on the NERSC computing systems, Carver, Hopper and Edison, and is being configured to provide access to the next generation NERSC system, Cori. In this report, we describe the tool and discuss our experience running ALICE applications on NERSC HPC systems. The discussion will include our initial benchmarks of Cori compared to other systems and our attempts to leverage the new capabilities offered with Cori to support data-intensive applications, with a future goal of full integration of such systems into ALICE grid operations.
ERIC Educational Resources Information Center
Walsh, John P.; Sun, Jerry Chih-Yuan; Riconscente, Michelle
2011-01-01
Digital technologies can improve student interest and knowledge in science. However, researching the vast number of websites devoted to science education and integrating them into undergraduate curricula is time-consuming. We developed an Adobe ColdFusion- and Adobe Flash-based system for simplifying the construction, use, and delivery of…
De Amorim, Joana D C G; Travnik, Isadora; De Sousa, Bernadete M
2015-03-01
Lizards' caudal autotomy is a complex and widely employed antipredator mechanism, with thorough anatomic adaptations involved. Due to its small size and intricate structures, caudal vertebral anatomy is hard to convey clearly to students and researchers from other areas. Three-dimensional models are powerful tools for unveiling anatomical nuances. Some of the techniques used to create them can produce irregular and complicated forms, which, despite being very accurate, lack didactic uniformity and simplicity. Since both are considered fundamental characteristics for comprehension, a simplified model could be the key to improved learning. The model presented here depicts the caudal osteology of Tropidurus itambere, and was designed to be concise, so as to be easily assimilated, yet complete, so as not to compromise the informative aspect. The creation process requires only basic skills in manipulating polygons in 3D modeling software, in addition to appropriate knowledge of the structure to be modeled. As reference for the modeling, we used microscopic observation and a photographic database of the caudal structures. This way, no advanced laboratory equipment was needed and all biological materials were preserved for future research. Therefore, we propose a wider usage of simplified 3D models both in the classroom and as illustrations for scientific publications.
Capelli, Claudio; Biglino, Giovanni; Petrini, Lorenza; Migliavacca, Francesco; Cosentino, Daria; Bonhoeffer, Philipp; Taylor, Andrew M; Schievano, Silvia
2012-12-01
Finite element (FE) modelling can be a very resourceful tool in the field of cardiovascular devices. To ensure result reliability, FE models must be validated experimentally against physical data. Their clinical application (e.g., patients' suitability, morphological evaluation) also requires fast simulation process and access to results, while engineering applications need highly accurate results. This study shows how FE models with different mesh discretisations can suit clinical and engineering requirements for studying a novel device designed for percutaneous valve implantation. Following sensitivity analysis and experimental characterisation of the materials, the stent-graft was first studied in a simplified geometry (i.e., compliant cylinder) and validated against in vitro data, and then in a patient-specific implantation site (i.e., distensible right ventricular outflow tract). Different meshing strategies using solid, beam and shell elements were tested. Results showed excellent agreement between computational and experimental data in the simplified implantation site. Beam elements were found to be convenient for clinical applications, providing reliable results in less than one hour in a patient-specific anatomical model. Solid elements remain the FE choice for engineering applications, albeit more computationally expensive (>100 times). This work also showed how information on device mechanical behaviour differs when acquired in a simplified model as opposed to a patient-specific model.
Declarative language design for interactive visualization.
Heer, Jeffrey; Bostock, Michael
2010-01-01
We investigate the design of declarative, domain-specific languages for constructing interactive visualizations. By separating specification from execution, declarative languages can simplify development, enable unobtrusive optimization, and support retargeting across platforms. We describe the design of the Protovis specification language and its implementation within an object-oriented, statically-typed programming language (Java). We demonstrate how to support rich visualizations without requiring a toolkit-specific data model and extend Protovis to enable declarative specification of animated transitions. To support cross-platform deployment, we introduce rendering and event-handling infrastructures decoupled from the runtime platform, letting designers retarget visualization specifications (e.g., from desktop to mobile phone) with reduced effort. We also explore optimizations such as runtime compilation of visualization specifications, parallelized execution, and hardware-accelerated rendering. We present benchmark studies measuring the performance gains provided by these optimizations and compare performance to existing Java-based visualization tools, demonstrating scalability improvements exceeding an order of magnitude.
Visible high-power laser sources for today and beyond
NASA Astrophysics Data System (ADS)
Smolka, Gregory L.
1995-04-01
The diversity and proliferation of 'real-world' laser applications continue to put increasing demands on laser technology. New system constraints, often dictated by the operating environment, stretch the capabilities of conventional laboratory lasers. As the applications proliferate, so too do the users. Today's laser user is often not a laser engineer, but rather views the laser simply as a tool to help perform a job. For lasers to reach their true market potential, laser designers must respond to these user-mandated requirements with simple, compact, rugged devices. Traditional commercial lasers are far too large, bulky and complex for many of these new applications. Design techniques for shrinking, simplifying and ruggedizing solid-state lasers for today's applications will be discussed.
TNSPackage: A Fortran2003 library designed for tensor network state methods
NASA Astrophysics Data System (ADS)
Dong, Shao-Jun; Liu, Wen-Yuan; Wang, Chao; Han, Yongjian; Guo, G.-C.; He, Lixin
2018-07-01
Recently, tensor network states (TNS) methods have proven to be very powerful tools for investigating strongly correlated many-particle physics in one and two dimensions. The implementation of TNS methods depends heavily on operations on tensors, including contraction, permutation, reshaping, SVD and so on. Unfortunately, the most popular computer languages for scientific computation, such as Fortran and C/C++, do not have a standard library for such operations, which makes coding TNS methods very tedious. We have developed a Fortran2003 package that includes all kinds of basic tensor operations designed for TNS. It is user-friendly and flexible enough for different forms of TNS, and therefore greatly simplifies the coding work for TNS methods.
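The primitive operations listed above are the ones a TNS code would otherwise hand-roll; in NumPy terms they look roughly as follows (illustrative Python equivalents only, not the Fortran2003 API):

    # Illustrative equivalents of the basic tensor operations named above:
    # contraction, permutation, reshaping and SVD.
    import numpy as np

    A = np.random.rand(4, 5, 6)
    B = np.random.rand(6, 5, 3)

    C = np.tensordot(A, B, axes=([1, 2], [1, 0]))     # contract shared legs -> shape (4, 3)
    P = A.transpose(2, 0, 1)                          # permute tensor indices
    M = A.reshape(4 * 5, 6)                           # fuse legs: tensor -> matrix
    U, S, Vh = np.linalg.svd(M, full_matrices=False)  # SVD, e.g. to truncate bond dimensions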
Resilient Monitoring Systems: Architecture, Design, and Application to Boiler/Turbine Plant
Garcia, Humberto E.; Lin, Wen-Chiao; Meerkov, Semyon M.; ...
2014-11-01
Resilient monitoring systems, considered in this paper, are sensor networks that degrade gracefully under malicious attacks on their sensors, causing them to project misleading information. The goal of this work is to design, analyze, and evaluate the performance of a resilient monitoring system intended to monitor plant conditions (normal or anomalous). The architecture developed consists of four layers: data quality assessment, process variable assessment, plant condition assessment, and sensor network adaptation. Each of these layers is analyzed by either analytical or numerical tools. The performance of the overall system is evaluated using a simplified boiler/turbine plant. The measure of resiliency is quantified using the Kullback-Leibler divergence, and is shown to be sufficiently high in all scenarios considered.
Resilient monitoring systems: architecture, design, and application to boiler/turbine plant.
Garcia, Humberto E; Lin, Wen-Chiao; Meerkov, Semyon M; Ravichandran, Maruthi T
2014-11-01
Resilient monitoring systems, considered in this paper, are sensor networks that degrade gracefully under malicious attacks on their sensors, causing them to project misleading information. The goal of this paper is to design, analyze, and evaluate the performance of a resilient monitoring system intended to monitor plant conditions (normal or anomalous). The architecture developed consists of four layers: data quality assessment, process variable assessment, plant condition assessment, and sensor network adaptation. Each of these layers is analyzed by either analytical or numerical tools. The performance of the overall system is evaluated using a simplified boiler/turbine plant. The measure of resiliency is quantified based on the Kullback-Leibler divergence and shown to be sufficiently high in all scenarios considered.
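For reference, the Kullback-Leibler divergence used as the resiliency measure is, for discrete distributions P and Q over the assessed plant conditions:

$$ D_{\mathrm{KL}}(P\,\|\,Q) = \sum_i p_i \ln\frac{p_i}{q_i} $$

This is the standard definition; exactly how the divergence between assessed and actual condition distributions is mapped onto the resiliency score is specific to the papers' architecture.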
Jinghao Li; John F. Hunt; Shaoqin Gong; Zhiyong Cai
2016-01-01
This paper presents a simplified analytical model and a balanced design approach for modeling lightweight wood-based structural panels in bending. Because finite element analysis (FEA) requires many input design parameters during the preliminary design and optimization process, an equivalent method was developed to analyze the mechanical...
NASA Astrophysics Data System (ADS)
Segalini, Andrea; Ferrero, Anna Maria; Brighenti, Roberto
2013-04-01
A channelized debris flow is usually represented by a mixture of solid particles of various sizes and water, flowing along a laterally confined, inclined channel-shaped region up to an unconfined area where it slows down and spreads out into a flat-shaped mass. The study of these phenomena is very difficult due to their short duration and unpredictability, the lack of historical data for a given basin and the complexity of the mechanical phenomena involved. Post-event surveys allow the identification of some depositional features and provide indications about the maximum flow height; however, they lack information about the development of the phenomena over time. For this purpose, the monitoring of recursive events has been carried out by several authors. Most of the studies aimed at determining the characteristic features of a debris flow were carried out in artificial channels, where the main variables involved were measured and others were controlled during the tests; however, some uncertainties remained, and other scaled models were developed to simulate the deposition mechanics as well as to analyze the transport mechanics and the energy dissipation. The assessment of the mechanical behavior of protection structures upon impact with the flow, as well as the energy associated with it, is necessary for the proper design of such structures, which, in densely populated areas, can avoid casualties and limit the destructive effects of such a phenomenon. In this work a simplified structural model, developed by the authors for the safety assessment of retention barriers against channelized debris flows, is presented, and some parametric cases are interpreted through the proposed approach; this model is intended as a simplified and efficient tool for the verification of the supporting cables and foundations of a flexible debris flow barrier. The present analytical and numerically based approach has a different aim from a FEM model. Computational experience with FEM modeling of this kind of structure has shown that a large amount of time is needed both for the geometrical setup of the model and for its computation. The big effort required by FEM for this class of problems limits the practical possibility of investigating different geometrical configurations, load schemes, etc.; it is suitable for representing a specific configuration, but it does not allow the influence of parameter changes to be investigated. On the other hand, parametric analyses are common practice in geotechnical design for the reasons quoted above. Consequently, the authors felt the need to develop a simplified method (not yet available, to our knowledge) that allows several parametric analyses to be performed in a limited time. It should be noted that, in this paper, no considerations regarding the mechanical and physical behavior of debris flows are made; the proposed model requires the input of parameters that must be acquired through a preliminary characterization of the design event. However, adopting the proposed tool, the designer will be able to perform sensitivity analyses that will help quantify the influence of parameter variability, as commonly occurs in geotechnical design.
NASA Technical Reports Server (NTRS)
2012-01-01
Topics include: Bioreactors Drive Advances in Tissue Engineering; Tooling Techniques Enhance Medical Imaging; Ventilator Technologies Sustain Critically Injured Patients; Protein Innovations Advance Drug Treatments, Skin Care; Mass Analyzers Facilitate Research on Addiction; Frameworks Coordinate Scientific Data Management; Cameras Improve Navigation for Pilots, Drivers; Integrated Design Tools Reduce Risk, Cost; Advisory Systems Save Time, Fuel for Airlines; Modeling Programs Increase Aircraft Design Safety; Fly-by-Wire Systems Enable Safer, More Efficient Flight; Modified Fittings Enhance Industrial Safety; Simulation Tools Model Icing for Aircraft Design; Information Systems Coordinate Emergency Management; Imaging Systems Provide Maps for U.S. Soldiers; High-Pressure Systems Suppress Fires in Seconds; Alloy-Enhanced Fans Maintain Fresh Air in Tunnels; Control Algorithms Charge Batteries Faster; Software Programs Derive Measurements from Photographs; Retrofits Convert Gas Vehicles into Hybrids; NASA Missions Inspire Online Video Games; Monitors Track Vital Signs for Fitness and Safety; Thermal Components Boost Performance of HVAC Systems; World Wind Tools Reveal Environmental Change; Analyzers Measure Greenhouse Gasses, Airborne Pollutants; Remediation Technologies Eliminate Contaminants; Receivers Gather Data for Climate, Weather Prediction; Coating Processes Boost Performance of Solar Cells; Analyzers Provide Water Security in Space and on Earth; Catalyst Substrates Remove Contaminants, Produce Fuel; Rocket Engine Innovations Advance Clean Energy; Technologies Render Views of Earth for Virtual Navigation; Content Platforms Meet Data Storage, Retrieval Needs; Tools Ensure Reliability of Critical Software; Electronic Handbooks Simplify Process Management; Software Innovations Speed Scientific Computing; Controller Chips Preserve Microprocessor Function; Nanotube Production Devices Expand Research Capabilities; Custom Machines Advance Composite Manufacturing; Polyimide Foams Offer Superior Insulation; Beam Steering Devices Reduce Payload Weight; Models Support Energy-Saving Microwave Technologies; Materials Advance Chemical Propulsion Technology; and High-Temperature Coatings Offer Energy Savings.
Assefa, Yibeltal; Worku, Alemayehu; Wouters, Edwin; Koole, Olivier; Haile Mariam, Damen; Van Damme, Wim
2012-01-01
Patient retention in care is a critical challenge for antiretroviral treatment (ART) programs, mainly because retention in care is related to adherence to treatment and patient survival. It is therefore imperative that health facilities and programs measure patient retention in care. However, the currently available tools for measuring retention in care, such as the Kaplan-Meier estimator, have many practical limitations. The objective of this study was to develop simplified tools for measuring retention in care. Retrospective cohort data were collected from patient registers in nine health facilities in Ethiopia. Retention in care was the primary outcome for the study. Tools were developed to measure "current retention" in care during a specific period of time for a specific "ART-age group" and "cohort retention" in care among patients who were followed for the last "Y" number of years on ART. "Probability of retention" based on the tool for "cohort retention" in care was compared with "probability of retention" based on the Kaplan-Meier estimator. We found that the new tools make it possible to measure both "current retention" and "cohort retention" in care. We also found that the tools were easy to use and did not require advanced statistical skills. Both "current retention" and "cohort retention" are lower among patients in the first two "ART-age groups" and "ART-age cohorts" than in subsequent ones. The "probability of retention" based on the new tools was found to be similar to the "probability of retention" based on Kaplan-Meier. The simplified tools for "current retention" and "cohort retention" will enable practitioners and program managers to measure and monitor rates of retention in care easily and appropriately. We therefore recommend that health facilities and programs start to use these tools in their efforts to improve retention in care and patient outcomes.
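A minimal sketch of the register arithmetic such simplified tools build on, using a deliberately naive retention definition and invented counts (the paper's exact operational definitions are not reproduced in the abstract):

    # Naive illustration: retention per "ART-age group" (time since ART start)
    # as patients still in care divided by patients who started. Counts invented.
    COHORTS = {  # ART-age group -> (started, still_in_care)
        "0-6 months":  (200, 170),
        "6-12 months": (180, 160),
        "1-2 years":   (150, 140),
    }

    for group, (started, in_care) in COHORTS.items():
        print(f"{group}: retention = {in_care / started:.0%}")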
Multidisciplinary Optimization Methods for Aircraft Preliminary Design
NASA Technical Reports Server (NTRS)
Kroo, Ilan; Altus, Steve; Braun, Robert; Gage, Peter; Sobieski, Ian
1994-01-01
This paper describes a research program aimed at improved methods for multidisciplinary design and optimization of large-scale aeronautical systems. The research involves new approaches to system decomposition, interdisciplinary communication, and methods of exploiting coarse-grained parallelism for analysis and optimization. A new architecture, that involves a tight coupling between optimization and analysis, is intended to improve efficiency while simplifying the structure of multidisciplinary, computation-intensive design problems involving many analysis disciplines and perhaps hundreds of design variables. Work in two areas is described here: system decomposition using compatibility constraints to simplify the analysis structure and take advantage of coarse-grained parallelism; and collaborative optimization, a decomposition of the optimization process to permit parallel design and to simplify interdisciplinary communication requirements.
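Schematically, the collaborative optimization architecture mentioned above splits the problem into a system-level optimization over shared targets z and discipline-level subproblems that minimize their discrepancy from those targets (a loose statement of the standard CO formulation, not necessarily the paper's exact notation):

$$ \min_{z} f(z) \quad \text{s.t.} \quad J_i^{\ast}(z) = 0,\; i=1,\dots,N, \qquad J_i^{\ast}(z) = \min_{x_i \in \mathcal{X}_i} \sum_{j} \left( x_{ij} - z_j \right)^2 $$

Each discipline i optimizes its own variables $x_i$ within its own feasibility set $\mathcal{X}_i$; the compatibility constraints $J_i^{\ast}=0$ force the disciplines to agree on the shared quantities, which is what permits the parallel, loosely coupled design process described above.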
Simplified gas sensor model based on AlGaN/GaN heterostructure Schottky diode
DOE Office of Scientific and Technical Information (OSTI.GOV)
Das, Subhashis, E-mail: subhashis.ds@gmail.com; Majumdar, S.; Kumar, R.
2015-08-28
Physics based modeling of AlGaN/GaN heterostructure Schottky diode gas sensor has been investigated for high sensitivity and linearity of the device. Here the surface and heterointerface properties are greatly exploited. The dependence of two dimensional electron gas (2DEG) upon the surface charges is mainly utilized. The simulation of Schottky diode has been done in Technology Computer Aided Design (TCAD) tool and I-V curves are generated, from the I-V curves 76% response has been recorded in presence of 500 ppm gas at a biasing voltage of 0.95 Volt.
Solid motor aft closure insulation erosion. [heat flux correlation for rate analysis
NASA Technical Reports Server (NTRS)
Stampfl, E.; Landsbaum, E. M.
1973-01-01
The erosion rate of aft closure insulation in a number of large solid propellant motors was empirically analyzed by correlating the average ablation rate with a number of variables that had previously been demonstrated to affect heat flux. The main correlating parameter was a heat flux based on the simplified Bartz heat transfer coefficient corrected for two-dimensional effects. A multiplying group contained terms related to port-to-throat ratio, local wall angle, grain geometry and nozzle cant angle. The resulting equation gave a good correlation and is a useful design tool.
On a computational model of building thermal dynamic response
NASA Astrophysics Data System (ADS)
Jarošová, Petra; Vala, Jiří
2016-07-01
Development and exploitation of advanced materials, structures and technologies in civil engineering, both for buildings with carefully controlled interior temperature and for common residential houses, together with new European and national directives and technical standards, stimulate the development of computational tools that are complex and robust, yet sufficiently simple and inexpensive, supporting design and the optimization of energy consumption. This paper demonstrates the possibility of reconciling such seemingly contradictory requirements, using a simplified non-stationary thermal model of a building motivated by the analogy with the analysis of electric circuits; certain semi-analytical forms of the solution come from the method of lines.
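A minimal sketch of the electric-circuit analogy: a single thermal node with resistance R to the exterior and capacitance C, stepped explicitly in time (values invented; the model in the paper resolves the building in more detail):

    # One-node RC thermal model: C dT/dt = (T_out - T)/R + Q
    # (circuit analogy: temperature ~ voltage, heat flow ~ current).
    R = 0.005    # K/W  envelope thermal resistance (invented value)
    C = 5.0e7    # J/K  effective thermal capacitance (invented value)
    Q = 2000.0   # W    internal heat gains (invented value)
    T, T_out, dt = 20.0, 0.0, 60.0   # degC, degC, time step in s

    for _ in range(24 * 60):         # one day in one-minute steps
        T += dt / C * ((T_out - T) / R + Q)

    print(f"Indoor temperature after 24 h: {T:.1f} degC")  # relaxes toward T_out + Q*R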
A Simplified Shuttle Payload Thermal Analyzer /SSPTA/ program
NASA Technical Reports Server (NTRS)
Bartoszek, J. T.; Huckins, B.; Coyle, M.
1979-01-01
A simple thermal analysis program for Space Shuttle payloads has been developed to accommodate the user who requires an easily understood but dependable analytical tool. The thermal analysis program includes several thermal subprograms traditionally employed in spacecraft thermal studies, a data management system for data generated by the subprograms, and a master program to coordinate the data files and thermal subprograms. The language and logic used to run the thermal analysis program are designed for the small user. In addition, analytical and storage techniques which conserve computer time and minimize core requirements are incorporated into the program.
SAVANT: Solar Array Verification and Analysis Tool Demonstrated
NASA Technical Reports Server (NTRS)
Chock, Ricaurte
2000-01-01
The photovoltaics (PV) industry is now being held to strict specifications, such as end-of-life power requirements, that force them to overengineer their products to avoid contractual penalties. Such overengineering has been the only reliable way to meet such specifications. Unfortunately, it also results in a more costly process than is probably necessary. In our conversations with the PV industry, the issue of cost has been raised again and again. Consequently, the Photovoltaics and Space Environment Effects branch at the NASA Glenn Research Center at Lewis Field has been developing a software tool to address this problem. SAVANT, Glenn's tool for solar array verification and analysis, is in the technology demonstration phase. Ongoing work has proven that more efficient and less costly PV designs should be possible by using SAVANT to predict the on-orbit life-cycle performance. The ultimate goal of the SAVANT project is to provide a user-friendly computer tool to predict PV on-orbit life-cycle performance. This should greatly simplify the tasks of scaling and designing the PV power component of any given flight or mission. By being able to predict how a particular PV article will perform, designers will be able to balance mission power requirements (both beginning-of-life and end-of-life) with survivability concerns such as power degradation due to radiation and/or contamination. Recent comparisons with actual flight data from the Photovoltaic Array Space Power Plus Diagnostics (PASP Plus) mission validate this approach.
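The sizing trade described above is, at its crudest, bookkeeping between beginning-of-life power and cumulative degradation. A deliberately generic sketch (this is not SAVANT's environment-driven degradation model; the numbers are invented):

    # Generic end-of-life (EOL) power estimate from a flat annual degradation
    # rate -- a placeholder for the radiation/contamination-driven degradation
    # that a tool like SAVANT actually predicts.
    BOL_POWER_W = 5000.0         # beginning-of-life array power (invented)
    ANNUAL_DEGRADATION = 0.025   # fractional power loss per year (invented)
    MISSION_YEARS = 10

    eol_power = BOL_POWER_W * (1.0 - ANNUAL_DEGRADATION) ** MISSION_YEARS
    print(f"EOL power: {eol_power:.0f} W")  # ~3881 W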
Installation/Removal Tool for Screw-Mounted Components
NASA Technical Reports Server (NTRS)
Ash, J. P.
1984-01-01
Tweezerlike tool simplifies installation of screws in places reached only through narrow openings. With changes in size and shape, basic tool concept applicable to mounting and dismounting of transformers, sockets, terminal strips and mechanical parts. Inexpensive tool fabricated as needed by bending two pieces of steel wire. Exact size and shape selected to suit part manipulated and nature of inaccessible mounting space.
NASA Astrophysics Data System (ADS)
Koval, Viacheslav
The seismic design provisions of the CSA-S6 Canadian Highway Bridge Design Code and the AASHTO LRFD Seismic Bridge Design Specifications have been developed primarily based on historical earthquake events along the west coast of North America. For the design of seismic isolation systems, these codes include simplified analysis and design methods. The appropriateness and range of application of these methods are investigated through extensive parametric nonlinear time history analyses in this thesis. It was found that existing design guidelines need to be adjusted to better capture the expected nonlinear response of isolated bridges. For isolated bridges located in eastern North America, new damping coefficients are proposed. The applicability limits of the code-based simplified methods have been redefined to ensure that the modified method will lead to conservative results and that a wider range of seismically isolated bridges can be covered by this method. The possibility of further improving current simplified code methods was also examined. By transforming the quantity of allocated energy into a displacement contribution, an idealized analytical solution is proposed as a new simplified design method. This method realistically reflects the effects of ground-motion and system design parameters, including the effects of a drifted oscillation center. The proposed method is therefore more appropriate than the existing simplified methods and is applicable to isolation systems exhibiting a wider range of properties. A multi-level-hazard performance matrix has been adopted by different seismic provisions worldwide and will be incorporated into the new edition of the Canadian CSA-S6-14 Bridge Design Code. However, the combined effect and optimal use of isolation and supplemental damping devices in bridges have not yet been fully exploited to achieve enhanced performance under different levels of seismic hazard. A novel Dual-Level Seismic Protection (DLSP) concept is proposed and developed in this thesis which permits optimum seismic performance to be achieved with combined isolation and supplemental damping devices in bridges. This concept is shown to represent an attractive design approach both for the upgrade of existing seismically deficient bridges and for the design of new isolated bridges.
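The simplified code methods at issue rest on equivalent linearization of the isolator response; schematically (standard definitions, not the thesis's modified coefficients):

$$ k_{\mathrm{eff}} = \frac{F_{\max}}{d_{\max}}, \qquad \xi_{\mathrm{eff}} = \frac{E_D}{2\pi\,k_{\mathrm{eff}}\,d_{\max}^2}, \qquad T_{\mathrm{eff}} = 2\pi\sqrt{\frac{m}{k_{\mathrm{eff}}}} $$

where $E_D$ is the energy dissipated per cycle. The spectral displacement demand is then divided by a damping coefficient $B(\xi_{\mathrm{eff}})$, and it is this coefficient that the thesis re-derives for eastern North American ground motions.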
NASA Astrophysics Data System (ADS)
Kempf, Scott; Schäfer, Frank K.; Cardone, Tiziana; Ferreira, Ivo; Gerené, Sam; Destefanis, Roberto; Grassi, Lilith
2016-12-01
During recent years, the state-of-the-art risk assessment of the threat posed to spacecraft by micrometeoroids and space debris has been expanded to the analysis of failure modes of internal spacecraft components. This method can now be used to perform risk analyses for satellites to assess various failure levels, from failure of specific sub-systems to catastrophic break-up. This new assessment methodology is based on triple-wall ballistic limit equations (BLEs), specifically the Schäfer-Ryan-Lambert (SRL) BLE, which is applicable for describing failure threshold levels for satellite components following a hypervelocity impact. The methodology is implemented in the form of the software tool Particle Impact Risk and Vulnerability Analysis Tool (PIRAT). During a recent European Space Agency (ESA) funded study, the PIRAT functionality was expanded in order to provide an interface to ESA's Concurrent Design Facility (CDF). The additions include a geometry importer and an OCDT (Open Concurrent Design Tool) interface. The new interface provides both the expanded geometrical flexibility offered by external computer aided design (CAD) modelling and an easy import of existing data without the need for extensive preparation of the model. The reduced effort required to perform vulnerability analyses makes it feasible to apply them during the early design phase, at which point modifications to the satellite design can be undertaken with relatively little extra effort. The integration of PIRAT in the CDF means that, for the first time, vulnerability analyses can be performed in-session in ESA's CDF and comprehensive vulnerability studies can be applied cost-effectively in the early design phase in general.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seppala, L G
2000-09-15
A glass-choice strategy, based on separately designing an achromatic lens before progressing to an apochromatic lens, simplified my approach to solving the International Optical Design Conference (IODC) 1998 lens design problem. The glasses that are needed to make the lens apochromatic are combined into triplet correctors with two 'buried' surfaces. By applying this strategy, I reached successful solutions that used only six glasses: three for the achromatic design and three additional glasses for the apochromatic design.
Simplified Metadata Curation via the Metadata Management Tool
NASA Astrophysics Data System (ADS)
Shum, D.; Pilone, D.
2015-12-01
The Metadata Management Tool (MMT) is the newest capability developed as part of NASA Earth Observing System Data and Information System's (EOSDIS) efforts to simplify metadata creation and improve metadata quality. The MMT was developed via an agile methodology, taking into account inputs from GCMD's science coordinators and other end-users. In its initial release, the MMT uses the Unified Metadata Model for Collections (UMM-C) to allow metadata providers to easily create and update collection records in the ISO-19115 format. Through a simplified UI experience, metadata curators can create and edit collections without full knowledge of the NASA Best Practices implementation of ISO-19115 format, while still generating compliant metadata. More experienced users are also able to access raw metadata to build more complex records as needed. In future releases, the MMT will build upon recent work done in the community to assess metadata quality and compliance with a variety of standards through application of metadata rubrics. The tool will provide users with clear guidance as to how to easily change their metadata in order to improve their quality and compliance. Through these features, the MMT allows data providers to create and maintain compliant and high quality metadata in a short amount of time.
Hypersonic Vehicle Propulsion System Simplified Model Development
NASA Technical Reports Server (NTRS)
Stueber, Thomas J.; Raitano, Paul; Le, Dzu K.; Ouzts, Peter
2007-01-01
This document addresses the modeling task plan for the hypersonic GN&C GRC team members. The overall propulsion system modeling task plan is a multi-step process, and the task plan identified in this document addresses the first steps (short-term modeling goals). The procedures and tools produced from this effort will be useful for creating simplified dynamic models applicable to a hypersonic vehicle propulsion system. The document continues with the GRC short-term modeling goal. Next, a general description of the desired simplified model is presented along with simulations that are available to varying degrees. The simulations may be available in electronic form (Fortran, CFD, MATLAB, ...) or in paper form in published documents. Finally, roadmaps outlining possible avenues towards realizing the simplified model are presented.
National Facilities Study. Volume 1: Facilities Inventory
NASA Technical Reports Server (NTRS)
1994-01-01
The inventory activity was initiated to address the critical need for a single source of site-specific descriptive and parametric data on major public and privately held aeronautics- and aerospace-related facilities. This was a challenging undertaking due to the scope of the effort and the short lead time in which to assemble the inventory and have it available to support the task group study needs. The inventory remains dynamic, as sites are being added and the data is accessed and refined as the study progresses. The inventory activity also included the design and implementation of a computer database and analytical tools to simplify access to the data. This volume describes the steps which were taken to define the data requirements, select sites, and solicit and acquire data from them. A discussion of the inventory structure and analytical tools is also provided.
Information systems as a quality management tool in clinical laboratories
NASA Astrophysics Data System (ADS)
Schmitz, Vanessa; Rosecler Bez el Boukhari, Marta
2007-11-01
This article describes information systems as a quality management tool in clinical laboratories. The quality of laboratory analyses is of fundamental importance for health professionals in aiding appropriate diagnosis and treatment. Information systems allow the automation of internal quality management processes, using standard sample tests, Levey-Jennings charts and Westgard multirule analysis. This simplifies evaluation and interpretation of quality tests and reduces the possibility of human error. This study proposes the development of an information system with appropriate functions and costs for the automation of internal quality control in small and medium-sized clinical laboratories. To this end, it evaluates the functions and usability of two commercial software products designed for this purpose, identifying the positive features of each, so that these can be taken into account during the development of the proposed system.
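For instance, two of the classic Westgard multirules reduce to simple checks on control-sample z-scores (deviation from the target mean in SD units); a minimal sketch of that logic, with the rule set abbreviated to 1-3s and 2-2s:

    # Westgard multirule checks on QC z-scores: 1-3s rejects a single point
    # beyond +/-3 SD; 2-2s rejects two consecutive points beyond 2 SD on the
    # same side of the mean.
    def westgard_violations(z_scores):
        violations = []
        for i, z in enumerate(z_scores):
            if abs(z) > 3:
                violations.append((i, "1-3s"))
            if i > 0 and (min(z, z_scores[i - 1]) > 2 or max(z, z_scores[i - 1]) < -2):
                violations.append((i, "2-2s"))
        return violations

    print(westgard_violations([0.5, 2.1, 2.4, -0.3, 3.2]))  # [(2, '2-2s'), (4, '1-3s')]

Automating checks of this kind is precisely what removes the manual chart reading, and the associated risk of human error, that the article discusses.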
NASA Technical Reports Server (NTRS)
1989-01-01
001 is an integrated tool suite for automatically developing ultrareliable models, simulations and software systems. Developed and marketed by Hamilton Technologies, Inc. (HTI), it has been applied in engineering, manufacturing, banking and software tools development. The software provides the ability to simplify the complex. A system developed with 001 can be a prototype or fully developed with production quality code. It is free of interface errors, consistent, logically complete and has no data or control flow errors. Systems can be designed, developed and maintained with maximum productivity. Margaret Hamilton, President of Hamilton Technologies, also directed the research and development of USE.IT, an earlier product which was the first computer-aided software engineering product in the industry to concentrate on automatically supporting the development of an ultrareliable system throughout its life cycle. Both products originated in NASA technology developed under a Johnson Space Center contract.
NASA Astrophysics Data System (ADS)
Gómez-Varela, A. I.; Salvado-Vara, F.; Bao-Varela, C.
2014-07-01
Nowadays, new technologies strongly influence our lives and how we access information. The new generations have never known a world without them and use these technologies in practically all facets of their day-to-day lives. Education systems have also evolved rapidly and frequently employ learning strategies based on interactive tools. In this work we created a graphical user interface with GUIDE, a development environment of MATLAB, to show, in a simple way, how the eye works. This interactive program is aimed at the first years of secondary education and designed to introduce students to the basic concepts of the normal refractive condition of the eye and the most common refractive errors, such as myopia and hyperopia. The graphical interface uses the simplified model of the eye, in which the optical system of the visual organ is represented by a converging lens (cornea and crystalline lens) and a screen (retina). Emmetropic, myopic and hyperopic eye operation is shown graphically to the students, as well as how these focusing errors can be corrected with a diverging or a converging lens, respectively. This teaching tool was used during the current academic year at the Colegio Hogar de Santa Margarita (A Coruña) to help students better understand this subject and to draw their attention to the world of Optics and its importance.
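The simplified eye model described here reduces to the thin-lens vergence relation, which is easy to reproduce outside MATLAB. A minimal Python sketch under assumed, illustrative values (the lens powers and the 22 mm axial length are not taken from the program):

```python
def image_distance(power_diopters, object_distance_m=float("inf")):
    """Thin-lens equation in vergence form: V_image = V_object + P,
    with vergences in diopters (1/m); a distant object has V_object = 0."""
    v_obj = 0.0 if object_distance_m == float("inf") else -1.0 / object_distance_m
    return 1.0 / (v_obj + power_diopters)  # image distance in meters

axial_length = 0.022  # the 'retina' screen sits 22 mm behind the lens
for label, power in [("emmetropic", 1 / 0.022), ("myopic", 48.0), ("hyperopic", 43.0)]:
    focus = image_distance(power)
    where = ("on" if abs(focus - axial_length) < 1e-4
             else "in front of" if focus < axial_length else "behind")
    print(f"{label}: distant object focuses {where} the retina ({focus * 1000:.1f} mm)")
```

A myopic eye in this toy model is corrected by adding a diverging lens (negative diopters) to the total power, a hyperopic eye by adding a converging one, exactly as the GUI demonstrates graphically.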
Canbay, Ferhat; Levent, Vecdi Emre; Serbes, Gorkem; Ugurdag, H. Fatih; Goren, Sezer; Aydin, Nizamettin
2016-01-01
The authors aimed to develop an application for producing different architectures that implement the dual tree complex wavelet transform (DTCWT), which has a near shift-invariance property. To obtain a low-cost and portable solution for implementing the DTCWT in multi-channel real-time applications, various embedded-system approaches were realised. For comparison, the DTCWT was implemented in C on a personal computer and on a PIC microcontroller; however, the former lacks portability and the latter cannot achieve the desired speed. Hence, implementation of the DTCWT on a reconfigurable platform such as a field programmable gate array, which provides portable, low-cost, low-power, and high-performance computing, is considered the most feasible solution. At first, the authors used Xilinx's System Generator DSP design tool for algorithm design; however, designs implemented with such tools are not optimised in terms of area and power. To overcome these drawbacks, they implemented the DTCWT algorithm in Verilog Hardware Description Language, which has its own difficulties. To overcome those difficulties and to simplify the use of the proposed algorithms and the adaptation procedures, a code generator program that can produce different architectures is proposed. PMID:27733925
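As a software reference point for such hardware implementations, the DTCWT can be computed with the open-source `dtcwt` Python package; the class and attribute names below are assumptions based on recent releases of that package and are not part of the authors' toolchain:

```python
# Minimal software reference for a 1-D DTCWT (pip install dtcwt).
# API names (Transform1d, Pyramid.lowpass/.highpasses) are assumptions
# from recent releases of the open-source `dtcwt` package.
import numpy as np
import dtcwt

x = np.random.rand(256)                        # single-channel test signal
transform = dtcwt.Transform1d()
pyramid = transform.forward(x, nlevels=4)      # dual-tree analysis
print(pyramid.lowpass.shape)                   # coarse approximation
print([h.shape for h in pyramid.highpasses])   # complex subbands per level
y = transform.inverse(pyramid)                 # near-perfect reconstruction
print(np.allclose(x, np.asarray(y).ravel()))
```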
Modeling Complex Workflow in Molecular Diagnostics
Gomah, Mohamed E.; Turley, James P.; Lu, Huimin; Jones, Dan
2010-01-01
One of the hurdles to achieving personalized medicine has been implementing the laboratory processes for performing and reporting complex molecular tests. The rapidly changing test rosters and complex analysis platforms in molecular diagnostics have meant that many clinical laboratories still use labor-intensive manual processing and testing without the level of automation seen in high-volume chemistry and hematology testing. We provide here a discussion of design requirements and the results of implementation of a suite of lab management tools that incorporate the many elements required for use of molecular diagnostics in personalized medicine, particularly in cancer. These applications provide the functionality required for sample accessioning and tracking, material generation, and testing that are particular to the evolving needs of individualized molecular diagnostics. On implementation, the applications described here resulted in improvements in the turn-around time for reporting of more complex molecular test sets, and significant changes in the workflow. Therefore, careful mapping of workflow can permit design of software applications that simplify even the complex demands of specialized molecular testing. By incorporating design features for order review, software tools can permit a more personalized approach to sample handling and test selection without compromising efficiency. PMID:20007844
Ten Steps to Create Virtual Smile Design Templates With Adobe Photoshop® CS6.
Sundar, Manoj Kumar; Chelliah, Venkataraman
2018-03-01
Computer design software has become a primary tool for communication among the dentist, patient, and ceramist. Virtual smile design can be carried out using various software programs, most of which use assorted forms of teeth templates that are made based on the concept of "golden proportion." Despite current advances in 3-dimensional imaging and smile designing, many clinicians still employ conventional design methods and analog (ie, man-made) mock-ups in assessing and establishing esthetic makeovers. To simplify virtual smile designing, the teeth templates should be readily available. No literature has provided details as to how to create these templates. This article explains a technique for creating different forms of teeth templates using Adobe Photoshop® CS6 that eventually can be used for smile design purposes, either in Photoshop or Microsoft Powerpoint. Clinically speaking, various smile design templates created using set proportions in Adobe Photoshop CS6 can be used in virtual smile designing, a valuable resource in diagnosis, treatment planning, and communicating with patients and ceramists, thus providing a platform for a successful esthetic rehabilitation.
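The "golden proportion" behind such templates is just a geometric series: viewed frontally, each tooth's apparent width is roughly 0.618 times that of its mesial neighbour. A tiny illustrative calculation (the 8 mm starting width is a hypothetical value, not a clinical recommendation):

```python
# Golden-proportion widths for a smile design template (illustrative only).
PHI_INV = 0.618    # reciprocal of the golden ratio
width = 8.0        # assumed apparent width of the central incisor, in mm
for tooth in ("central incisor", "lateral incisor", "canine"):
    print(f"{tooth}: {width:.2f} mm apparent width")
    width *= PHI_INV  # each tooth appears ~0.618x as wide as its neighbour
```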
NREL's EVI-Pro Lite Tool Paves the Way for Future Electric Vehicle Infrastructure Planning
To assist state and local governments anticipating this type of growth in electric vehicle charging, NREL released EVI-Pro Lite, a simplified version of the Electric Vehicle Infrastructure Projection Tool (EVI-Pro) model.
Wilkinson, Mark D; Vandervalk, Benjamin; McCarthy, Luke
2011-10-24
The complexity and inter-related nature of biological data poses a difficult challenge for data and tool integration. There has been a proliferation of interoperability standards and projects over the past decade, none of which has been widely adopted by the bioinformatics community. Recent attempts have focused on the use of semantics to assist integration, and Semantic Web technologies are being welcomed by this community. SADI - Semantic Automated Discovery and Integration - is a lightweight set of fully standards-compliant Semantic Web service design patterns that simplify the publication of services of the type commonly found in bioinformatics and other scientific domains. Using Semantic Web technologies at every level of the Web services "stack", SADI services consume and produce instances of OWL Classes following a small number of very straightforward best-practices. In addition, we provide codebases that support these best-practices, and plug-in tools to popular developer and client software that dramatically simplify deployment of services by providers, and the discovery and utilization of those services by their consumers. SADI Services are fully compliant with, and utilize only foundational Web standards; are simple to create and maintain for service providers; and can be discovered and utilized in a very intuitive way by biologist end-users. In addition, the SADI design patterns significantly improve the ability of software to automatically discover appropriate services based on user-needs, and automatically chain these into complex analytical workflows. We show that, when resources are exposed through SADI, data compliant with a given ontological model can be automatically gathered, or generated, from these distributed, non-coordinating resources - a behaviour we have not observed in any other Semantic system. Finally, we show that, using SADI, data dynamically generated from Web services can be explored in a manner very similar to data housed in static triple-stores, thus facilitating the intersection of Web services and Semantic Web technologies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Womeldorff, Geoffrey Alan; Payne, Joshua Estes; Bergen, Benjamin Karl
These are slides for a presentation on PARTISN research and FleCSI updates. The following topics are covered: SNAP vs. PARTISN, background research, production code (structural design and changes, kernel design and implementation, lessons learned), NuT IMC proxy, FleCSI update (design and lessons learned). It can all be summarized in the following manner: Kokkos was shown to be effective in FY15 in implementing a C++ version of SNAP's kernel. This same methodology was applied to a production IC code, PARTISN. This was a much more complex endeavour than in FY15 for many reasons: a C++ kernel embedded in Fortran, overloading Fortran memory allocations, general language interoperability, and a fully fleshed-out production code versus a simplified proxy code. Lessons learned are legion. In no particular order: interoperability between Fortran and C++ was really not that hard, and a useful engineering effort. Tracking down all necessary memory allocations for a kernel in a production code is pretty hard. Modifying a production code to work for more than a handful of use cases is also pretty hard. Figuring out the toolchain that will allow a successful implementation of design decisions is quite hard, if making use of "bleeding edge" design choices. In terms of performance, production code concurrency architecture can be a virtual showstopper, being too complex to easily rewrite and test in a short period of time, or depending on tool features which do not exist yet. Ultimately, while the tools used in this work were not successful in speeding up the production code, they helped to identify how work would be done, and provided requirements to tools.
Design and implementation of a sigma delta technology based pulse oximeter's acquisition stage
NASA Astrophysics Data System (ADS)
Rossi, E. E.; Peñalva, A.; Schaumburg, F.
2011-12-01
Pulse oximetry is a widely used tool in medical practice for estimating a patient's fraction of hemoglobin bonded to oxygen. Conventional oximetry has limitations when the baseline changes or the signals involved have low amplitude. The aim of this paper is to solve both constraints simultaneously, and to simplify the circuitry needed, by using ΣΔ technology. For this purpose, a board for the acquisition of the needed signals was developed, together with PC software that controls it and displays and processes the acquired information in real time. Laboratory and field tests were also designed and executed to verify the performance of this equipment in adverse situations. A simple, robust and economical instrument was achieved, capable of obtaining signals even in situations where conventional oximetry fails.
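Although the paper's contribution is the ΣΔ acquisition stage, the downstream estimate in conventional pulse oximetry is the "ratio of ratios" of the pulsatile (AC) and baseline (DC) components of the red and infrared signals. A hedged Python sketch of that standard computation (the linear calibration constants are illustrative, not device values):

```python
import numpy as np

def spo2_ratio_of_ratios(red, ir, a=110.0, b=25.0):
    """Standard ratio-of-ratios SpO2 estimate from red/infrared PPG traces:
    R = (AC_red/DC_red) / (AC_ir/DC_ir), SpO2 ~ a - b*R, where a and b come
    from empirical calibration (the defaults here are illustrative only)."""
    red, ir = np.asarray(red, float), np.asarray(ir, float)
    r = (red.std() / red.mean()) / (ir.std() / ir.mean())
    return a - b * r

# Synthetic example: small pulsatile components riding on DC baselines
t = np.linspace(0.0, 5.0, 500)
red = 2.0 + 0.02 * np.sin(2 * np.pi * 1.2 * t)   # ~72 bpm pulse
ir = 1.5 + 0.03 * np.sin(2 * np.pi * 1.2 * t)
print(f"SpO2 ~ {spo2_ratio_of_ratios(red, ir):.1f}%")   # ~97.5%
```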
Development of electron beam ion source for nanoprocess using highly charged ions
NASA Astrophysics Data System (ADS)
Sakurai, Makoto; Nakajima, Fumiharu; Fukumoto, Takunori; Nakamura, Nobuyuki; Ohtani, Shunsuke; Mashiko, Shinro; Sakaue, Hiroyuki
2005-07-01
Highly charged ions are useful for producing nanostructures on various materials and are a key tool for realizing single-ion implantation techniques. To meet such demands for nanotechnology applications, we have designed an electron beam ion source. The design emphasizes the volume of the drift tubes, where highly charged ions are confined, and the efficiency of ion extraction from the drift tube through the collector electrode, in order to obtain as intense an ion beam as possible. The ion source uses a discrete superconducting magnet cooled by a closed-cycle refrigerator to reduce running costs and simplify operating procedures. The electrodes of the electron gun, drift tubes, and collector are enclosed in an ultrahigh-vacuum tube that is inserted into the bore of the magnet system.
NASA Astrophysics Data System (ADS)
Farjoud, Alireza; Taylor, Russell; Schumann, Eric; Schlangen, Timothy
2014-02-01
This paper is focused on modelling, design, and testing of semi-active magneto-rheological (MR) engine and transmission mounts used in the automotive industry. The purpose is to develop a complete analysis, synthesis, design, and tuning tool that reduces the need for expensive and time-consuming laboratory and field tests. A detailed mathematical model of such devices is developed using multi-physics modelling techniques for physical systems with various energy domains. The model includes all major features of an MR mount including fluid dynamics, fluid track, elastic components, decoupler, rate-dip, gas-charged chamber, MR fluid rheology, magnetic circuit, electronic driver, and control algorithm. Conventional passive hydraulic mounts can also be studied using the same mathematical model. The model is validated using standard experimental procedures. It is used for design and parametric study of mounts; effects of various geometric and material parameters on dynamic response of mounts can be studied. Additionally, this model can be used to test various control strategies to obtain best vibration isolation performance by tuning control parameters. Another benefit of this work is that nonlinear interactions between sub-components of the mount can be observed and investigated. This is not possible by using simplified linear models currently available.
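To make the modelling idea concrete at its simplest, a drastically reduced stand-in for such a mount is a single-degree-of-freedom mass on a spring with passive damping plus a bounded, switchable MR force. The sketch below is a toy model under assumed parameters, not the multi-physics model of the paper:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy 1-DOF stand-in for an MR mount: mass on a spring with passive damping
# c0 plus a bounded, switchable MR force f_mr. Parameter values are
# illustrative; base motion couples through the spring only.
m, k, c0, f_mr = 30.0, 6.0e4, 300.0, 150.0

def rhs(t, state):
    x, v = state
    base = 0.001 * np.sin(2 * np.pi * 15 * t)     # 1 mm, 15 Hz base excitation
    u = f_mr * np.sign(v) if x * v > 0 else 0.0   # crude on/off control law
    return [v, (-k * (x - base) - c0 * v - u) / m]

sol = solve_ivp(rhs, (0.0, 1.0), [0.0, 0.0], max_step=1e-3)
print(f"peak displacement: {np.max(np.abs(sol.y[0])) * 1000:.3f} mm")
```

Setting `u = 0.0` reproduces the passive hydraulic mount case the abstract mentions, which is one reason a single parameterized model is convenient.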
A computational platform to maintain and migrate manual functional annotations for BioCyc databases.
Walsh, Jesse R; Sen, Taner Z; Dickerson, Julie A
2014-10-12
BioCyc databases are an important resource for information on biological pathways and genomic data. Such databases represent the accumulation of biological data, some of which has been manually curated from the literature. An essential feature of these databases is continuing data integration as new knowledge is discovered. As functional annotations are improved, scalable methods are needed for curators to manage annotations without detailed knowledge of the specific design of the BioCyc database. We have developed CycTools, a software tool which allows curators to maintain functional annotations in a model organism database. This tool builds on existing software to simplify the import of user-provided annotation data into BioCyc databases. Additionally, CycTools automatically resolves synonyms and alternate identifiers contained within the database into the appropriate internal identifiers. Automating steps in the manual data entry process can improve curation efforts for major biological databases. The functionality of CycTools is demonstrated by transferring GO term annotations from MaizeCyc to matching proteins in CornCyc, both maize metabolic pathway databases available at MaizeGDB, and by creating strain-specific databases for metabolic engineering.
29 CFR 2200.206 - Disclosure of information.
Code of Federal Regulations, 2010 CFR
2010-07-01
... working days after a case is designated for Simplified Proceedings, the Secretary shall provide the employer, free of charge, copies of the narrative (Form OSHA 1-A) and the worksheet (Form OSHA 1-B), or their equivalents. (2) Within 30 calendar days after a case is designated for Simplified Proceedings...
Simplify Web Development for Faculty and Promote Instructional Design.
ERIC Educational Resources Information Center
Pedersen, David C.
Faculty members are often overwhelmed with the prospect of implementing Web-based instruction. In an effort to simplify the process and incorporate some basic instructional design elements, the Educational Technology Team at Embry Riddle Aeronautical University created a course template for WebCT. Utilizing rapid prototyping, the template…
Pydna: a simulation and documentation tool for DNA assembly strategies using python.
Pereira, Filipa; Azevedo, Flávio; Carvalho, Ângela; Ribeiro, Gabriela F; Budde, Mark W; Johansson, Björn
2015-05-02
Recent advances in synthetic biology have provided tools to efficiently construct complex DNA molecules which are an important part of many molecular biology and biotechnology projects. The planning of such constructs has traditionally been done manually using a DNA sequence editor which becomes error-prone as scale and complexity of the construction increase. A human-readable formal description of cloning and assembly strategies, which also allows for automatic computer simulation and verification, would therefore be a valuable tool. We have developed pydna, an extensible, free and open source Python library for simulating basic molecular biology DNA unit operations such as restriction digestion, ligation, PCR, primer design, Gibson assembly and homologous recombination. A cloning strategy expressed as a pydna script provides a description that is complete, unambiguous and stable. Execution of the script automatically yields the sequence of the final molecule(s) and that of any intermediate constructs. Pydna has been designed to be understandable for biologists with limited programming skills by providing interfaces that are semantically similar to the description of molecular biology unit operations found in literature. Pydna simplifies both the planning and sharing of cloning strategies and is especially useful for complex or combinatorial DNA molecule construction. An important difference compared to existing tools with similar goals is the use of Python instead of a specifically constructed language, providing a simulation environment that is more flexible and extensible by the user.
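A minimal example of the kind of pydna script described above, simulating a PCR step; the module and function names follow recent pydna releases and should be read as assumptions rather than a fixed API:

```python
# A pydna-style cloning-strategy sketch (module/function names are
# assumptions based on recent pydna releases).
from pydna.dseqrecord import Dseqrecord
from pydna.amplify import pcr

template = Dseqrecord("atgaccatgattacgccaagcttgcatgcctgcaggtcgactctagaggatcc")
fwd = str(template.seq[:18])                        # anneals at the 5' end
rev = str(template.seq[-18:].reverse_complement())  # anneals at the 3' end

# pcr() simulates the reaction and raises if the primers do not anneal,
# which is how such a script doubles as verifiable documentation.
product = pcr(fwd, rev, template)
print(len(product), product.seq)
```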
OpenMDAO: Framework for Flexible Multidisciplinary Design, Analysis and Optimization Methods
NASA Technical Reports Server (NTRS)
Heath, Christopher M.; Gray, Justin S.
2012-01-01
The OpenMDAO project is underway at NASA to develop a framework which simplifies the implementation of state-of-the-art tools and methods for multidisciplinary design, analysis and optimization. Foremost, OpenMDAO has been designed to handle variable problem formulations, encourage reconfigurability, and promote model reuse. This work demonstrates the concept of iteration hierarchies in OpenMDAO to achieve a flexible environment for supporting advanced optimization methods which include adaptive sampling and surrogate modeling techniques. In this effort, two efficient global optimization methods were applied to solve a constrained, single-objective and a constrained, multiobjective version of a joint aircraft/engine sizing problem. The aircraft model, NASA's next-generation advanced single-aisle civil transport, is being studied as part of the Subsonic Fixed Wing project to help meet simultaneous program goals for reduced fuel burn, emissions, and noise. This analysis serves as a realistic test problem to demonstrate the flexibility and reconfigurability offered by OpenMDAO.
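For readers unfamiliar with the framework, the canonical OpenMDAO pattern is to assemble components into a model, declare design variables and an objective, and attach a driver. The sketch below uses the modern `openmdao.api` interface, which postdates the 2012-era API this abstract describes:

```python
# Unconstrained paraboloid minimization, the canonical OpenMDAO example.
import openmdao.api as om

prob = om.Problem()
prob.model.add_subsystem(
    "parab", om.ExecComp("f = (x - 3.0)**2 + x*y + (y + 4.0)**2 - 3.0"),
    promotes=["*"])

prob.driver = om.ScipyOptimizeDriver()
prob.driver.options["optimizer"] = "SLSQP"
prob.model.add_design_var("x", lower=-50.0, upper=50.0)
prob.model.add_design_var("y", lower=-50.0, upper=50.0)
prob.model.add_objective("f")

prob.setup()
prob.set_val("x", 3.0)
prob.set_val("y", -4.0)
prob.run_driver()
print(prob.get_val("x"), prob.get_val("y"), prob.get_val("f"))  # ~ (6.67, -7.33, -27.33)
```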
Flowers, Natalie L
2010-01-01
CodeSlinger is a desktop application that was developed to aid medical professionals in the intertranslation, exploration, and use of biomedical coding schemes. The application was designed to provide a highly intuitive, easy-to-use interface that simplifies a complex business problem: a set of time-consuming, laborious tasks that were regularly performed by a group of medical professionals involving manually searching coding books, searching the Internet, and checking documentation references. A workplace observation session with a target user revealed the details of the current process and a clear understanding of the business goals of the target user group. These goals drove the design of the application's interface, which centers on searches for medical conditions and displays the codes found in the application's database that represent those conditions. The interface also allows the exploration of complex conceptual relationships across multiple coding schemes.
Developing Web-based Tools for Collaborative Science and Public Outreach
NASA Astrophysics Data System (ADS)
Friedman, A.; Pizarro, O.; Williams, S. B.
2016-02-01
With the advances in high-bandwidth communications and the proliferation of social media tools, education and outreach activities have become commonplace on ocean-bound research cruises. In parallel, advances in underwater robotics and other data-collecting platforms have made it possible to collect copious amounts of oceanographic data. This data then typically undergoes laborious, manual processing to transform it into quantitative information, which normally occurs post-cruise, resulting in significant lags between collecting data and using it for scientific discovery. This presentation discusses how appropriately designed software systems can be used to fulfill multiple objectives and attempt to leverage public engagement in order to complement science goals. We present two software platforms: the first is a web-browser-based tool that was developed for real-time tracking of multiple underwater robots and ships. It was designed to allow anyone on board to view or control it on any device with a web browser. It opens up the possibility of remote teleoperation and engagement, and was easily adapted to enable live streaming over the internet for public outreach. While the tracking system provided context and engaged people in real time, it also directed interested participants to Squidle, another online system. Developed for scientists, Squidle supports data management, exploration and analysis, and enables direct access to survey data, reducing the lag in data processing. It provides a user-friendly, streamlined interface that integrates advanced data management and online annotation tools. This system was adapted to provide a simplified user interface, tutorial instructions and a gamified ranking system to encourage "citizen science" participation. These examples show that through a flexible design approach, it is possible to leverage the development effort of creating science tools to facilitate outreach goals, opening up the possibility of acquiring large volumes of crowd-sourced data without compromising science objectives.
Optical chirp z-transform processor with a simplified architecture.
Ngo, Nam Quoc
2014-12-29
Using a simplified chirp z-transform (CZT) algorithm based on the discrete-time convolution method, this paper presents the synthesis of a simplified architecture of a reconfigurable optical chirp z-transform (OCZT) processor based on silica-based planar lightwave circuit (PLC) technology. In the simplified architecture of the reconfigurable OCZT, the required number of optical components is small and there are no waveguide crossings, which makes fabrication easy. The design of a novel type of optical discrete Fourier transform (ODFT) processor, as a special case of the synthesized OCZT, is then presented to demonstrate its effectiveness. The designed ODFT can potentially be used as an optical demultiplexer at the receiver of an optical fiber orthogonal frequency division multiplexing (OFDM) transmission system.
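The discrete-time convolution method referred to here rests on Bluestein's identity nk = [n² + k² − (k−n)²]/2, which factors the CZT into a chirp pre-multiplication, a linear convolution, and a chirp post-multiplication, the same three-stage structure the optical architecture implements. A NumPy sketch of the digital counterpart:

```python
import numpy as np

def czt(x, M, W, A=1.0):
    """Chirp z-transform via Bluestein's convolution method:
    X[k] = sum_n x[n] * A**(-n) * W**(n*k), for k = 0..M-1."""
    N = len(x)
    n, k = np.arange(N), np.arange(M)
    y = x * A ** (-n) * W ** (n ** 2 / 2.0)     # chirp pre-multiplication
    m = np.arange(-(N - 1), M)                  # kernel support
    v = W ** (-(m ** 2) / 2.0)                  # chirp convolution kernel
    L = 1
    while L < N + M - 1:                        # FFT length avoiding aliasing
        L *= 2
    g = np.fft.ifft(np.fft.fft(y, L) * np.fft.fft(v, L))[N - 1:N - 1 + M]
    return g * W ** (k ** 2 / 2.0)              # chirp post-multiplication

# The DFT is the special case A = 1, W = exp(-2j*pi/N), M = N:
x = np.random.rand(8)
assert np.allclose(czt(x, 8, np.exp(-2j * np.pi / 8)), np.fft.fft(x))
```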
Ten questions concerning occupant behavior in buildings: The big picture
Hong, Tianzhen; Yan, Da; D'Oca, Simona; ...
2016-12-27
Occupant behavior has significant impacts on building energy performance and occupant comfort. However, occupant behavior is not well understood and is often oversimplified in the building life cycle, due to its stochastic, diverse, complex, and interdisciplinary nature. The use of simplified methods or tools to quantify the impacts of occupant behavior in building performance simulations significantly contributes to performance gaps between simulated models and actual building energy consumption. Therefore, it is crucial to understand occupant behavior in a comprehensive way, integrating qualitative approaches and data- and model-driven quantitative approaches, and employing appropriate tools to guide the design and operation of low-energy residential and commercial buildings that integrate technological and human dimensions. This paper presents ten questions, highlighting some of the most important issues regarding concepts, applications, and methodologies in occupant behavior research. The proposed questions and answers aim to provide insights into occupant behavior for current and future researchers, designers, and policy makers, and most importantly, to inspire innovative research and applications to increase energy efficiency and reduce energy use in buildings.
Sound field simulation and acoustic animation in urban squares
NASA Astrophysics Data System (ADS)
Kang, Jian; Meng, Yan
2005-04-01
Urban squares are important components of cities, and the acoustic environment is important for their usability. While models and formulae for predicting the sound field in urban squares are important for their soundscape design and improvement, acoustic animation tools would be of great importance for designers as well as for the public participation process, given that below a certain sound level, soundscape evaluation depends mainly on the type of sounds rather than on loudness. This paper first briefly introduces acoustic simulation models developed for urban squares, as well as empirical formulae derived from a series of simulations. It then presents an acoustic animation tool currently being developed. In urban squares there are multiple dynamic sound sources, so computation time becomes a main concern. Nevertheless, the requirements for acoustic animation in urban squares are relatively low compared to auditoria. As a result, it is important to simplify the simulation process and algorithms. Based on a series of subjective tests in a virtual reality environment with various simulation parameters, a fast simulation method with acceptable accuracy has been explored. [Work supported by the European Commission.]
Schomann, Carsten; Giebel, Ole; Nachreiner, Friedhelm
2006-01-01
BASS 4, a computer program for the design and evaluation of working hours, is an example of an ergonomics-based software tool that can be used by safety practitioners at the shop floor with regard to legal, ergonomic, and economic criteria. Based on experiences with this computer program, a less sophisticated Working-Hours Risk Index for assessing the quality of work schedules (including flexible work hours) to indicate risks to health and wellbeing has been developed, to provide a quick and easily applicable tool for legally required risk assessments. The results of a validation study show that this risk index seems to be a promising indicator for predicting risks to health and wellbeing. The purpose of the Risk Index is to simplify the evaluation process at the shop floor and provide some more general information about the quality of a work schedule that can be used for triggering preventive interventions. Such a risk index complies with practitioners' expectations and requests for easy, useful, and valid instruments.
Simplified LCA and matrix methods in identifying the environmental aspects of a product system.
Hur, Tak; Lee, Jiyong; Ryu, Jiyeon; Kwon, Eunsun
2005-05-01
In order to effectively integrate environmental attributes into the product design and development processes, it is crucial to identify the significant environmental aspects related to a product system within a relatively short period of time. In this study, the usefulness of life cycle assessment (LCA) and a matrix method as tools for identifying the key environmental issues of a product system were examined. For this, a simplified LCA (SLCA) method that can be applied to Electrical and Electronic Equipment (EEE) was developed to efficiently identify their significant environmental aspects for eco-design, since a full scale LCA study is usually very detailed, expensive and time-consuming. The environmentally responsible product assessment (ERPA) method, which is one of the matrix methods, was also analyzed. Then, the usefulness of each method in eco-design processes was evaluated and compared using the case studies of the cellular phone and vacuum cleaner systems. It was found that the SLCA and the ERPA methods provided different information but they complemented each other to some extent. The SLCA method generated more information on the inherent environmental characteristics of a product system so that it might be useful for new design/eco-innovation when developing a completely new product or method where environmental considerations play a major role from the beginning. On the other hand, the ERPA method gave more information on the potential for improving a product so that it could be effectively used in eco-redesign which intends to alleviate environmental impacts of an existing product or process.
Simplified power processing for ion-thruster subsystems
NASA Technical Reports Server (NTRS)
Wessel, F. J.; Hancock, D. J.
1983-01-01
A design for a greatly simplified power-processing unit (SPPU) for the 8-cm diameter mercury-ion-thruster subsystem is discussed. This SPPU design will provide a tenfold reduction in parts count, a decrease in system mass and cost, and an increase in system reliability compared to the existing power-processing unit (PPU) used in the Hughes/NASA Lewis Research Center Ion Auxiliary Propulsion Subsystem. The simplifications achieved in this design will greatly increase the attractiveness of ion propulsion in near-term and future spacecraft propulsion applications. A description of a typical ion-thruster subsystem is given, along with an overview of the thruster/power-processor interface requirements. Simplified thruster power processing is then discussed.
Analysis of Different Cost Functions in the Geosect Airspace Partitioning Tool
NASA Technical Reports Server (NTRS)
Wong, Gregory L.
2010-01-01
A new cost function representing air traffic controller workload is implemented in the Geosect airspace partitioning tool. Geosect currently uses a combination of aircraft count and dwell time to select optimal airspace partitions that balance controller workload; this is referred to as the aircraft count/dwell time hybrid cost function. The new cost function is based on Simplified Dynamic Density, a measure of different aspects of air traffic controller workload. Three sectorizations are compared: the current sectorization, Geosect's sectorization based on the aircraft count/dwell time hybrid cost function, and Geosect's sectorization based on the Simplified Dynamic Density cost function. Each sectorization is evaluated for maximum and average workload along with workload balance, using Simplified Dynamic Density as the workload measure. In addition, the Airspace Concept Evaluation System, a nationwide air traffic simulator, is used to determine the capacity and delay incurred by each sectorization. The sectorization resulting from the Simplified Dynamic Density cost function had a lower maximum workload measure than the other sectorizations, and the sectorization based on the combination of aircraft count and dwell time did a better job of balancing workload and capacity. However, the current sectorization had the lowest average workload, highest sector capacity, and least system delay.
NASA Technical Reports Server (NTRS)
Arnold, William R., Sr.
2015-01-01
Since last year, a number of expanded capabilities have been added to the modeler. To support integration with thermal modeling, the program can now produce simplified thermal models with the same geometric parameters as the more detailed dynamic and even more refined stress models. The local mesh refinement and mesh improvement tools have been expanded and made more user-friendly. The goal is to provide a means of evaluating both monolithic and segmented mirrors to the same level of fidelity and loading conditions with reasonable manpower effort. The paper will demonstrate most of these new capabilities.
Advances in the production of freeform optical surfaces
NASA Astrophysics Data System (ADS)
Tohme, Yazid E.; Luniya, Suneet S.
2007-05-01
Recent market demands for free-form optics have challenged the industry to find new methods and techniques to manufacture free-form optical surfaces with a high level of accuracy and reliability. Production techniques are becoming a mix of multi-axis single-point diamond machining centers or deterministic ultra-precision grinding centers, coupled with capable measurement systems to accomplish the task. It has been determined that a complex software tool is required to seamlessly integrate all aspects of the manufacturing process chain. Advances in computational power and improved performance of computer-controlled precision machinery have driven the use of such software programs to measure, visualize, analyze, produce and re-validate the 3D free-form design, thus making the manufacture of such complex surfaces a viable task. Consolidation of the entire production cycle in a comprehensive software tool that can interact with all systems in the design, production and measurement phases will enable manufacturers to solve these complex challenges, providing improved product quality, simplified processes, and enhanced performance. The work presented describes the latest advancements in developing such a software package for the entire fabrication process chain for aspheric and free-form shapes. It applies a rational B-spline based kernel to transform an optical design in the form of a parametric definition (optical equation), a standard CAD format, or a cloud of points into a central format that drives the simulation. This software tool creates a closed loop for the fabrication process chain, integrating surface analysis and compensation, tool path generation, and measurement analysis in one package.
Using stable isotopes and models to explore estuarine linkages at multiple scales
Estuarine managers need tools to respond to dynamic stressors that occur in three linked environments – coastal ocean, estuaries and watersheds. Models have been the tool of choice for examining these dynamic systems because they simplify processes and integrate over multiple sc...
Evaluating Uncertainty in Integrated Environmental Models: A Review of Concepts and Tools
This paper reviews concepts for evaluating integrated environmental models and discusses a list of relevant software-based tools. A simplified taxonomy for sources of uncertainty and a glossary of key terms with standard definitions are provided in the context of integrated appro...
Moeeni, Vesal; Walls, Tony; Day, Andrew S
2014-12-01
Hospitalised children have higher rates of undernutrition. Early detection of at-risk patients could lead to prompt preventative or corrective interventions. Several nutritional risk screening tools are available for screening hospitalised children including the STRONGkids tool. This study was designed to assess the usefulness of STRONGkids when applied by nurses rather than a paediatrician. The STRONGkids questionnaire was simplified to enhance clarity with nursing staff. Trained nursing staff were asked to apply the tool to children, aged 1 month to 17 years, admitted to the Christchurch Hospital, New Zealand. Each patient was also assessed by a paediatrician. In addition, the current nutritional state of each patient was defined by measuring their weight and height. Of the 162 children enrolled, 11.7% were undernourished and 13% overnourished. STRONGkids recognised 84% of undernourished children when the tool was applied by nurses and 90% when the tool was applied by a paediatrician, indicating substantial agreement (kappa = 0.65). A minor simplification to the questionnaire improved its utility. STRONGkids successfully recognised at-risk children, when applied by either nurses or a paediatrician. It was suitable and feasible for nursing staff to use it to screen for children at risk of nutritional deterioration. ©2014 Foundation Acta Paediatrica. Published by John Wiley & Sons Ltd.
2010-01-01
Background The development of DNA microarrays has facilitated the generation of hundreds of thousands of transcriptomic datasets. The use of a common reference microarray design allows existing transcriptomic data to be readily compared and re-analysed in the light of new data, and the combination of this design with large datasets is ideal for 'systems'-level analyses. One issue is that these datasets are typically collected over many years and may be heterogeneous in nature, containing different microarray file formats and gene array layouts, dye-swaps, and showing varying scales of log2-ratios of expression between microarrays. Excellent software exists for the normalisation and analysis of microarray data, but many data have yet to be analysed as existing methods struggle with heterogeneous datasets; options include normalising microarrays on an individual or experimental-group basis. Our solution was to develop the Batch Anti-Banana Algorithm in R (BABAR), an algorithm and software package which uses cyclic loess to normalise across the complete dataset. We have already used BABAR to analyse the function of Salmonella genes involved in the process of infection of mammalian cells. Results The only input required by BABAR is unprocessed GenePix or BlueFuse microarray data files. BABAR provides a combination of 'within' and 'between' microarray normalisation steps and diagnostic boxplots. When applied to a real heterogeneous dataset, BABAR normalised the dataset to produce a comparable scaling between the microarrays, with the microarray data in excellent agreement with RT-PCR analysis. When applied to a real non-heterogeneous dataset and a simulated dataset, BABAR's performance in identifying differentially expressed genes showed some benefits over standard techniques. Conclusions BABAR is an easy-to-use software tool, simplifying the simultaneous normalisation of heterogeneous two-colour common reference design cDNA microarray-based transcriptomic datasets. We show BABAR transforms real and simulated datasets to allow for the correct interpretation of these data, and is the ideal tool to facilitate the identification of differentially expressed genes or network inference analysis from transcriptomic datasets. PMID:20128918
Andrew, Stefanie Frances; Rothemeyer, Sally; Balchin, Ross
2017-01-01
The Western Cape Province of South Africa has a great shortage of diagnostic expertise, rehabilitative infrastructure, and support services for patients with traumatic brain injury (TBI). The neurosurgical outpatient setting is busy and often chaotic, and patients are frequently lost to follow-up. This study sought to continue with the design and development of a comprehensive, yet brief tool to aid patient referrals and ensure that no consequence of TBI is left unidentified and unaddressed. There were 47 patients with TBI (mean age, 35 years; range, 18-75 years) assessed. The study was designed in 3 distinct phases, each representing a different stage in the tool's development. The Groote Schuur Traumatic Brain Injury Evaluation was shortened and simplified. Overall, 81% of the participants indicated cognitive dysfunction. There was a high prevalence of psychological/psychiatric sequelae, with 85% of participants reporting at least 1 such problem. The findings further highlight the prevalence of the cognitive, behavioral, and psychological consequences of TBI and shed additional light on the particular types of problems that patients with TBI face. Following the identified changes, the questionnaire and algorithm combination are now ready to be validated in the neurosurgical clinical setting. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Crosby, Robert H.
1992-01-01
The Integrated Receiver/Decoder (IRD) currently used on the Space Shuttle was designed in the 1980s and earlier. Over the past 12 years, several parts have become obsolete or difficult to obtain. As directed by the Marshall Space Flight Center, a primary objective is to investigate updating the IRD design using the latest technology subsystems. To take advantage of experience with the current designs, an analysis of failures and a review of discrepancy reports, material review board actions, scrap, etc. are given. A recommended new design, designated the Advanced Receiver/Decoder (ARD), is presented. This design uses the latest technology components to simplify circuits, improve performance, reduce size and cost, and improve reliability. A self-test command is recommended that can improve and simplify operational procedures. Here, the new design is contrasted with the old. Possible simplification of the total Range Safety System is discussed, as is a single-step crypto technique that can improve and simplify operational procedures.
The SeaHorn Verification Framework
NASA Technical Reports Server (NTRS)
Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.
2015-01-01
In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
Development of Fuel Shuffling Module for PHISICS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allan Mabe; Andrea Alfonsi; Cristian Rabiti
2013-06-01
PHISICS (Parallel and Highly Innovative Simulation for the INL Code System) [4] is a code toolkit under development at the Idaho National Laboratory. This package is intended to provide a modern analysis tool for reactor physics investigation. It is designed to maximize accuracy for a given availability of computational resources and to give state-of-the-art tools to the modern nuclear engineer. This is obtained by implementing several different algorithms and meshing approaches among which the user can choose, in order to optimize computational resources and accuracy needs. The software is completely modular in order to simplify the independent development of modules by different teams and future maintenance. The package is coupled with the thermal-hydraulic code RELAP5-3D [3]. In the following, the structure of the different PHISICS modules is briefly recalled, focusing on the new shuffling module (SHUFFLE), the object of this paper.
Simplification rules for birdtrack operators
NASA Astrophysics Data System (ADS)
Alcock-Zeilinger, J.; Weigert, H.
2017-05-01
This paper derives a set of easy-to-use tools designed to simplify calculations with birdtrack operators comprised of symmetrizers and antisymmetrizers. In particular, we present cancellation rules allowing one to shorten the birdtrack expressions of operators, and propagation rules identifying the circumstances under which it is possible to propagate symmetrizers past antisymmetrizers and vice versa. We exhibit the power of these simplification rules by means of a short example in which we apply the tools derived in this paper to a typical operator that can be encountered in the representation theory of SU(N) over the product space V⊗m. These rules form the basis for the construction of compact Hermitian Young projection operators and their transition operators addressed in companion papers [J. Alcock-Zeilinger and H. Weigert, "Compact Hermitian Young projection operators," e-print arXiv:1610.10088 [math-ph] and J. Alcock-Zeilinger and H. Weigert, "Transition operators," e-print arXiv:1610.08802 [math-ph]].
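One representative cancellation rule of this kind can be stated without birdtrack notation: a symmetrizer and an antisymmetrizer annihilate whenever they share at least two index lines. In LaTeX form, with a one-line proof sketch:

```latex
% Cancellation rule: S_M symmetrizes the index set M, A_N antisymmetrizes N.
\[
  S_M \, A_N = 0 \qquad \text{whenever } |M \cap N| \ge 2 ,
\]
% since for a transposition \tau of two shared lines, S_M \tau = S_M while
% \tau A_N = -A_N, hence S_M A_N = (S_M \tau) A_N = S_M (\tau A_N) = -S_M A_N.
```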
AMP: a science-driven web-based application for the TeraGrid
NASA Astrophysics Data System (ADS)
Woitaszek, M.; Metcalfe, T.; Shorrock, I.
The Asteroseismic Modeling Portal (AMP) provides a web-based interface for astronomers to run and view simulations that derive the properties of Sun-like stars from observations of their pulsation frequencies. In this paper, we describe the architecture and implementation of AMP, highlighting the lightweight design principles and tools used to produce a functional fully-custom web-based science application in less than a year. Targeted as a TeraGrid science gateway, AMP's architecture and implementation are intended to simplify its orchestration of TeraGrid computational resources. AMP's web-based interface was developed as a traditional standalone database-backed web application using the Python-based Django web development framework, allowing us to leverage the Django framework's capabilities while cleanly separating the user interface development from the grid interface development. We have found this combination of tools flexible and effective for rapid gateway development and deployment.
Spacecraft transformer and inductor design
NASA Technical Reports Server (NTRS)
Mclyman, W. T.
1977-01-01
The conversion process in spacecraft power electronics requires the use of magnetic components which frequently are the heaviest and bulkiest items in the conversion circuit. This handbook pertains to magnetic material selection, transformer and inductor design tradeoffs, transformer design, iron core dc inductor design, toroidal power core inductor design, window utilization factors, regulation, and temperature rise. Relationships are given which simplify and standardize the design of transformers and the analysis of the circuits in which they are used. The interactions of the various design parameters are also presented in simplified form so that tradeoffs and optimizations may easily be made.
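A representative example of the simplified relationships such a handbook standardizes is the area-product equation, which collapses transformer sizing to a single number; the form quoted below follows the usual McLyman-style treatment, and the exact constants should be checked against the handbook itself:

```latex
% Area product: the window area W_a times the core cross-section A_c.
\[
  A_p = W_a A_c = \frac{P_t \times 10^{4}}{K_f \, K_u \, B_m \, J \, f}
  \quad [\mathrm{cm}^4],
\]
% P_t: apparent power (W); K_f: waveform factor (4.44 for a sine wave);
% K_u: window utilization factor; B_m: operating flux density (T);
% J: current density (A/cm^2); f: frequency (Hz).
```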
Tool simplifies machining of pipe ends for precision welding
NASA Technical Reports Server (NTRS)
Matus, S. T.
1969-01-01
Single tool prepares a pipe end for precision welding by simultaneously performing internal machining, end facing, and bevel cutting to specification standards. The machining operation requires only one milling adjustment, can be performed quickly, and produces the high quality pipe-end configurations required to ensure precision-welded joints.
ERIC Educational Resources Information Center
Becker, Bernd W.
2010-01-01
The author has discussed the Multimedia Educational Resource for Teaching and Online Learning site, MERLOT, in a recent Electronic Roundup column. In this article, he discusses an entirely new Web page development tool that MERLOT has added for its members. The new tool is called the MERLOT Content Builder and is directly integrated into the…
Ichikawa, Satoshi
2016-06-01
It is important to pursue function-oriented synthesis (FOS), a strategy for the design of less structurally complex targets with comparable or superior activity that can be made in a practical manner, because compared to synthetic drugs, many biologically relevant natural products possess large and complex chemical structures that may restrict chemical modifications in a structure-activity relationship study. In this account, we describe recent efforts to simplify complex nucleoside natural products including caprazamycins. Considering the structure-activity relationship study with several truncated analogues, three types of simplified derivatives, namely, oxazolidine, isoxazolidine, and lactam-fused isoxazolidine-containing uridine derivatives, were designed and efficiently synthesized. These simplified derivatives have exhibited promising antibacterial activities. A significant feature of our studies is the rational and drastic simplification of the molecular architecture of caprazamycins. This study provides a novel strategy for the development of a new type of antibacterial agent effective against drug-resistant bacteria. © 2016 The Chemical Society of Japan & Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
ERIC Educational Resources Information Center
Yang, Allen H. J.; Dimiduk, Kathryn; Daniel, Susan
2011-01-01
We present a simplified human alcohol metabolism model for a mass balance team project. Students explore aspects of engineering in biotechnology: designing/modeling biological systems, testing the design/model, evaluating new conditions, and exploring cutting-edge "lab-on-a-chip" research. This project highlights chemical engineering's impact on…
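A minimal version of such a mass balance is a one-compartment model with first-order absorption from the gut and saturable (Michaelis-Menten) elimination, which reproduces ethanol's nearly zero-order clearance at typical concentrations. All parameter values below are illustrative placeholders, not the project's numbers:

```python
import numpy as np
from scipy.integrate import solve_ivp

Vd = 40.0    # distribution volume, L (illustrative)
Vmax = 8.0   # maximum elimination rate, g/h (saturable enzyme kinetics)
Km = 0.1     # Michaelis constant, g/L
ka = 6.0     # first-order absorption rate from the gut, 1/h
dose = 20.0  # grams of ethanol ingested at t = 0

def rhs(t, y):
    gut, blood = y                            # g in gut, g/L in blood
    absorb = ka * gut
    eliminate = Vmax * blood / (Km + blood)   # ~zero-order when blood >> Km
    return [-absorb, (absorb - eliminate) / Vd]

sol = solve_ivp(rhs, (0.0, 8.0), [dose, 0.0], dense_output=True)
for ti in range(9):
    print(f"t = {ti} h   BAC ~ {sol.sol(ti)[1] / 10:.3f} g/100 mL")
```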
NASA Astrophysics Data System (ADS)
Radziszewski, Kacper
2017-10-01
The following paper presents the results of research in the field of machine learning, investigating the scope of application of artificial neural network algorithms as a tool in architectural design. The computational experiment used the backward propagation of errors method to train an artificial neural network on the geometry of the details of the Roman Corinthian order capital. As an input training data set, a combination of five local geometry parameters gave the best results: Theta, Phi, and Rho in a spherical coordinate system based on the capital volume centroid, followed by the Z value of the Cartesian coordinate system and a distance from vertical planes created based on the capital symmetry. The experiment also identified an optimal count and structure of the artificial neural network's hidden layers, giving errors below 0.2% for the input parameters mentioned above. Once successfully trained, the artificial network was able to mimic the composition of the details on any other geometry it was given. Despite calculating the transformed geometry locally and separately for each of thousands of surface points, the system could create visually attractive and diverse complex patterns. The designed tool, based on the supervised learning method of machine learning, makes it possible to generate new architectural forms free of the bounds of the designer's imagination. Implementing the infinitely broad computational methods of machine learning, or Artificial Intelligence in general, could not only accelerate and simplify the design process, but also give an opportunity to explore never-before-seen, unpredictable forms or everyday architectural practice solutions.
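A sketch of this training setup, using scikit-learn's MLPRegressor (which trains by backpropagation) as a stand-in: the five input features follow the paper, while the data, layer sizes, and target function are placeholders rather than the Corinthian-capital geometry:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# Five features per surface point, mirroring the paper's input set:
# (theta, phi, rho) about the capital's centroid, the Cartesian z value,
# and a distance to the symmetry planes. Here they are random placeholders.
X = rng.uniform(-1.0, 1.0, size=(5000, 5))
y = np.sin(3.0 * X[:, 0]) * X[:, 2] + 0.1 * X[:, 3]   # surrogate "detail offset"

net = MLPRegressor(hidden_layer_sizes=(64, 64), activation="tanh",
                   solver="adam", max_iter=2000, random_state=0)
net.fit(X[:4000], y[:4000])
err = np.abs(net.predict(X[4000:]) - y[4000:]).mean()
print(f"held-out mean absolute error: {err:.4f}")
```

Once fitted, the same network can be evaluated point by point on any other surface, which is the transfer step the abstract describes.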
Duane, B G; Humphris, G; Richards, D; Okeefe, E J; Gordon, K; Freeman, R
2014-12-01
To assess the use of the WCMT in two Scottish health boards and to consider the impact of simplifying the tool to improve efficient use. A retrospective analysis of routine WCMT data (47,276 cases). Public Dental Service (PDS) within NHS Lothian and Highland. The WCMT consists of six criteria; each criterion is measured independently on a four-point scale to assess patient complexity and the dental care for the disabled/impaired patient. Psychometric analyses of the dataset were conducted. Conventional internal consistency coefficients were calculated. Latent variable modelling was performed to assess the 'fit' of the raw data to a pre-specified measurement model. A Confirmatory Factor Analysis (CFA) was used to test three potential changes to the existing WCMT: the removal of the oral risk factor question, the removal of the original weightings for scoring the tool, and collapsing the 4-point rating scale to three categories. The removal of the oral risk factor question had little impact on the reliability of the proposed simplified CMT to discriminate between levels of patient complexity. The removal of weighting and collapsing each item's rating scale to three categories had limited impact on the reliability of the revised tool. The CFA analysis provided strong evidence that the new, proposed simplified Case Mix Tool (sCMT) would operate closely to the pre-specified measurement model (the WCMT). A modified sCMT can thus provide a useful measure of the complexity of patient care without reduced reliability. The proposed sCMT may be implemented within primary care dentistry to record patient complexity as part of an oral health assessment.
Constructing and Modifying Sequence Statistics for relevent Using informR in 𝖱
Marcum, Christopher Steven; Butts, Carter T.
2015-01-01
The informR package greatly simplifies the analysis of complex event histories in 𝖱 by providing user-friendly tools to build sufficient statistics for the relevent package. Historically, building sufficient statistics to model event sequences (of the form a→b) using the egocentric generalization of Butts' (2008) relational event framework for modeling social action has been cumbersome. The informR package simplifies the construction of the complex list of arrays needed by the rem() model-fitting routine for a variety of cases involving egocentric event data, multiple event types, and/or support constraints. This paper introduces these tools using examples from real data extracted from the American Time Use Survey. PMID:26185488
Simplifier: a web tool to eliminate redundant NGS contigs.
Ramos, Rommel Thiago Jucá; Carneiro, Adriana Ribeiro; Azevedo, Vasco; Schneider, Maria Paula; Barh, Debmalya; Silva, Artur
2012-01-01
Modern genomic sequencing technologies produce a large amount of data at a reduced cost per base; however, these data consist of short reads. This reduction in read size, compared to previous methodologies, presents new challenges, including the need for efficient algorithms for assembling genomes from short reads and for resolving repetitions. Additionally, after ab initio assembly, curation of the hundreds or thousands of contigs generated by assemblers demands considerable time and computational resources. We developed Simplifier, a stand-alone software tool that selectively eliminates redundant sequences from the collection of contigs generated by ab initio genome assembly. Application of Simplifier to data generated by assembly of the genome of Corynebacterium pseudotuberculosis strain 258 reduced the number of contigs generated by ab initio methods from 8,004 to 5,272, a reduction of 34.14%; in addition, N50 increased from 1 kb to 1.5 kb. Processing the contigs of Escherichia coli DH10B with Simplifier reduced the mate-paired library by 17.47% and the fragment library by 23.91%. Simplifier removed redundant sequences from datasets produced by assemblers, thereby reducing the effort required for finalization of genome assembly in tests with data from prokaryotic organisms. Simplifier is available at http://www.genoma.ufpa.br/rramos/softwares/simplifier.xhtml; it requires Sun JDK 6 or higher.
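A naive illustration of the idea of redundancy removal: a contig fully contained in a longer one is dropped. This is a conceptual sketch only, not Simplifier's actual algorithm, and the quadratic substring scan would not scale to real assemblies.

```python
def remove_redundant(contigs):
    kept = []
    for c in sorted(contigs, key=len, reverse=True):   # longest first
        if not any(c in k for k in kept):              # substring containment test
            kept.append(c)
    return kept

contigs = ["ATCGGATCGA", "GGATC", "ATCGGATCGA", "TTAGC"]
print(remove_redundant(contigs))   # ['ATCGGATCGA', 'TTAGC']
```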
A Formal Methodology to Design and Deploy Dependable Wireless Sensor Networks
Testa, Alessandro; Cinque, Marcello; Coronato, Antonio; Augusto, Juan Carlos
2016-01-01
Wireless Sensor Networks (WSNs) are being increasingly adopted in critical applications, where verifying the correct operation of sensor nodes is a major concern. Undesired events may undermine the mission of the WSN; hence, their effects need to be properly assessed before deployment, to obtain a good level of expected performance, and during operation, to avoid dangerous unexpected results. In this paper, we propose a methodology that aims at assessing and improving the dependability level of WSNs by means of an event-based formal verification technique. The methodology includes a process to guide designers towards the realization of a dependable WSN and a tool ("ADVISES") to simplify its adoption. The tool is applicable to homogeneous WSNs with static routing topologies. It allows the automatic generation of formal specifications used to check correctness properties and evaluate dependability metrics, at design time and at runtime, for WSNs in which an acceptable percentage of faults can be defined. At runtime, we can check the behavior of the WSN against the results obtained at design time and detect sudden and unexpected failures, in order to trigger recovery procedures. The effectiveness of the methodology is shown in the context of two proof-of-concept case studies illustrating how the tool helps drive design choices and check the correctness properties of the WSN at runtime. Although the method scales up to very large WSNs, applicability may be limited by state-space explosion in the reasoning model, which must be addressed by partitioning large topologies into sub-topologies. PMID:28025568
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lawson, M.; Yu, Y. H.; Nelessen, A.
2014-05-01
Wave energy converters (WECs) are commonly designed and analyzed using numerical models that combine multi-body dynamics with hydrodynamic models based on the Cummins equation and linearized hydrodynamic coefficients. These modeling methods are attractive design tools because they are computationally inexpensive and do not require the high-performance computing resources needed by high-fidelity methods such as Navier-Stokes computational fluid dynamics. Modeling hydrodynamics using linear coefficients assumes that the device undergoes small motions and that the wetted surface area of the device is approximately constant. WEC devices, however, are typically designed to undergo large motions in order to maximize power extraction, calling into question the validity of assuming that linear hydrodynamic models accurately capture the relevant fluid-structure interactions. In this paper, we study how calculating buoyancy and Froude-Krylov forces from the instantaneous position of a WEC device (referred to hereafter as instantaneous buoyancy and Froude-Krylov forces) changes WEC simulation results compared to simulations that use linear hydrodynamic coefficients. First, we describe the WEC-Sim tool used to perform simulations and how the ability to model instantaneous forces was incorporated into it. We then use a simplified one-body WEC device to validate the model and to demonstrate how accounting for these instantaneously calculated forces affects the accuracy of simulation results, such as device motions, hydrodynamic forces, and power generation.
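The sketch below contrasts the two buoyancy treatments for an assumed heaving vertical cylinder (a stand-in one-body device, not WEC-Sim code): the linear model keeps the waterplane area constant, while the instantaneous model evaluates the actual submerged volume at each heave displacement.

```python
import numpy as np

rho, g = 1025.0, 9.81             # seawater density (kg/m^3), gravity (m/s^2)
R, draft, length = 2.0, 4.0, 8.0  # radius, equilibrium draft, total height (m)
Awp = np.pi * R**2                # waterplane area
V0 = Awp * draft                  # equilibrium displaced volume

def buoyancy_linear(z):           # linearized restoring force about equilibrium
    return -rho * g * Awp * z

def buoyancy_instantaneous(z):    # from the instantaneous wetted geometry
    submerged_depth = np.clip(draft - z, 0.0, length)   # emergence/submergence caps
    return rho * g * (Awp * submerged_depth - V0)

for z in (-5.0, -1.0, 0.0, 1.0, 5.0):   # heave displacement, positive up (m)
    print(f"z={z:+.1f} m  linear={buoyancy_linear(z):+.3e} N"
          f"  instantaneous={buoyancy_instantaneous(z):+.3e} N")
# For this wall-sided cylinder the two agree until the body nears full
# emergence or submergence; for flared hulls they differ at any large motion.
```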
New Criterion and Tool for Caltrans Seismic Hazard Characterization
NASA Astrophysics Data System (ADS)
Shantz, T.; Merriam, M.; Turner, L.; Chiou, B.; Liu, X.
2008-12-01
Caltrans recently adopted new procedures for the development of response spectra for structure design. These procedures incorporate both deterministic and probabilistic criteria. The Next Generation Attenuation (NGA) models (2008) are used for deterministic assessment (using a revised late-Quaternary-age fault database), and the USGS 2008 5%-in-50-year hazard maps are used for probabilistic assessment. A minimum deterministic spectrum based on a M6.5 earthquake at 12 km is also included. These spectra are enveloped, and the largest values are used. A new, publicly available, web-based design tool will be used to calculate the design spectrum. The tool is built on a Windows-Apache-MySQL-PHP (WAMP) platform and integrates Google Maps for increased flexibility in its use. Links to Caltrans data, such as pre-construction logs of test borings, assist in estimating the Vs30 values used in the new procedures. Basin effects based on new models developed for the CFM, for the San Francisco Bay area by the USGS, and by Thurber (2008) are also incorporated. It is anticipated that additional layers, such as CGS Seismic Hazard Zone maps, will be added in the future. Application of the new criterion will result in expected higher levels of ground motion at many bridges west of the Coast Ranges. In eastern California, use of the NGA relationships for strike-slip faulting (the dominant sense of motion in California) will often result in slightly lower expected values for bridges. The expected result is a more realistic prediction of ground motions at bridges, in keeping with motions developed for other large-scale and important structures. The tool is based on a simplified fault map of California, so it will not be used for more detailed evaluations such as surface-rupture determination. Announcements regarding tool availability (expected in early 2009) are at http://www.dot.ca.gov/research/index.htm
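A minimal sketch of the enveloping step: at each spectral period, the design value is the largest of the probabilistic, deterministic, and minimum-deterministic spectra. The arrays below are placeholder values, not Caltrans data.

```python
import numpy as np

periods = np.array([0.1, 0.2, 0.5, 1.0, 2.0])          # s
sa_prob = np.array([0.9, 1.1, 0.8, 0.5, 0.3])          # 5%-in-50-yr hazard (g)
sa_det  = np.array([0.8, 1.2, 0.9, 0.4, 0.2])          # NGA deterministic (g)
sa_min  = np.array([0.6, 0.7, 0.5, 0.3, 0.15])         # M6.5 @ 12 km floor (g)

design = np.maximum.reduce([sa_prob, sa_det, sa_min])  # period-by-period envelope
print(dict(zip(periods, design)))
```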
He, W; Zhao, S; Liu, X; Dong, S; Lv, J; Liu, D; Wang, J; Meng, Z
2013-12-04
Large-scale next-generation sequencing (NGS)-based resequencing detects sequence variations, reconstructs evolutionary histories, and identifies phenotype-related genotypes. However, NGS-based resequencing studies generate extraordinarily large amounts of data, making computation difficult; effective use and analysis of these data remains a challenge for individual researchers. Here, we introduce ReSeqTools, a full-featured toolkit for NGS (Illumina sequencing)-based resequencing analysis, which processes raw data, interprets mapping results, and identifies and annotates sequence variations. ReSeqTools provides scalable functions for routine resequencing analysis, organized in modules to facilitate customization of the analysis pipeline. It is designed to use compressed data files as input and output to save storage space, and it supports faster, more computationally efficient large-scale resequencing studies in a user-friendly manner. The toolkit generates useful statistics throughout the analysis pipeline, which significantly simplifies resequencing analysis, and its integrated algorithms and sub-functions provide a solid foundation for the special demands of resequencing projects. Users can combine these functions to construct their own pipelines for other purposes.
Method and apparatus for manufacturing high-accuracy radio telescope reflector panels
NASA Astrophysics Data System (ADS)
Bosma, Marinus B.
1998-07-01
This article covers the manufacturing of aluminum reflector panels for submillimeter radio astronomy. The first part describes the general construction and application of a machine custom designed and built for this purpose. The second discusses the software and the execution of the method used to produce the reflectors for the Smithsonian Astrophysical Observatory's Submillimeter Array (SMA). The reflective surface of each panel is contoured both radially and circumferentially by oscillating a platen supporting the panel about a fixed axis relative to a tool that is fixed during platen oscillation. The tool is repositionable between oscillations along an x axis to achieve the radial contour and along a z axis to achieve the desired parabolic or spherical contour. Contrary to the normal contouring of such a surface with a 5-axis CNC machine, tool positioning along either axis is independent of tool location along the other axis, simplifying the machine structure as well as its computerized operation. A unique hinge restrains the platen in the radial direction while allowing it to float on an air cushion during oscillation. These techniques and the equipment are documented in U.S. Patent No. 5,477,602.
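A geometry sketch of why the two tool axes can be decoupled: for a parabolic surface z = r^2 / (4f), the tool's z setting depends only on its radial x setting, so each axis can be positioned independently between platen oscillations. The focal length below is an illustrative value, not the SMA's.

```python
import numpy as np

focal_length = 4.8                        # m (assumed)
radial_steps = np.linspace(0.0, 3.0, 7)   # tool x positions across the panel (m)

for r in radial_steps:
    z = r**2 / (4.0 * focal_length)       # sagitta of the paraboloid at radius r
    print(f"x = {r:4.2f} m  ->  z = {z:6.4f} m")
```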
User Interface Design in Medical Distributed Web Applications.
Serban, Alexandru; Crisan-Vida, Mihaela; Mada, Leonard; Stoicu-Tivadar, Lacramioara
2016-01-01
User interfaces are important for easy learning and operation of an IT application, especially in the medical world. An easy-to-use interface has to be simple and to accommodate the user's needs and mode of operation; the technology in the background is an important tool to accomplish this. The present work aims to create a web interface using a specific technology (HTML table design combined with CSS3) to provide an optimized, responsive interface for a complex web application. In the first phase, the current icMED web medical application layout is analyzed and its structure is designed using specific tools, based on source files. In the second phase, a new graphical interface adaptable to different mobile terminals is proposed (using HTML table design and CSS3) that uses no source files, just lines of code for layout design, improving the interaction in terms of speed and simplicity. For a complex medical software application, a new prototype layout was designed and developed using HTML tables. The method uses CSS classes applied to one or multiple HTML table elements, instead of CSS styles that can be applied to just one DIV tag at a time. The technique has the advantage of simplified CSS code and better adaptability to different media resolutions compared to the DIV-CSS style method. The presented work is proof that adaptive web interfaces can be developed just by using and combining different types of design methods and technologies, such as HTML table design, resulting in an interface that is simpler to learn and use, and suitable for healthcare services.
A Fast Procedure for Optimizing Thermal Protection Systems of Re-Entry Vehicles
NASA Astrophysics Data System (ADS)
Ferraiuolo, M.; Riccio, A.; Tescione, D.; Gigliotti, M.
The aim of the present work is to introduce a fast procedure for optimizing thermal protection systems of re-entry vehicles subjected to high thermal loads. A simplified one-dimensional optimization process, performed to find the optimum design variables (lengths, sections, etc.), is the first step of the proposed design procedure. Simultaneously, the most suitable materials able to sustain high temperatures and meet the weight requirements are selected and positioned within the design layout. In this stage of the design procedure, simplified (generalized plane strain) FEM models are used when boundary and geometrical conditions allow the reduction of the degrees of freedom. These simplified local FEM models are useful because they are time-saving and very simple to build; they are essentially one-dimensional and can be used in optimization processes to determine the optimum configuration with regard to weight, temperature, and stresses. A triple-layer and a double-layer body, subjected to the same aero-thermal loads, have been optimized to minimize the overall weight. Full two- and three-dimensional analyses are performed to validate the simplified models. Thermal-structural analyses and optimizations are executed using the Ansys FEM code.
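A toy one-dimensional sizing loop in the spirit of the simplified models described above: steady conduction through a two-layer stack, minimizing areal mass subject to a back-face temperature limit. All materials, temperatures, and limits are invented, and the paper's models are transient FEM analyses, not this closed-form resistance chain.

```python
import numpy as np

T_hot, T_sink = 1200.0, 300.0   # hot-wall and sink temperatures, K (assumed)
T_back_max = 450.0              # allowable back-face temperature, K (assumed)
h_back = 50.0                   # back-face film coefficient, W/m^2/K (assumed)
k = (0.5, 30.0)                 # conductivities: insulator, structure (W/m/K)
rho = (300.0, 2700.0)           # densities (kg/m^3)

best = None
for t1 in np.linspace(0.005, 0.10, 96):       # insulator thickness (m)
    for t2 in np.linspace(0.001, 0.02, 20):   # structural thickness (m)
        R_tot = t1 / k[0] + t2 / k[1] + 1.0 / h_back   # series thermal resistance
        q = (T_hot - T_sink) / R_tot                   # steady heat flux, W/m^2
        T_back = T_sink + q / h_back                   # back-face temperature
        if T_back <= T_back_max:
            mass = rho[0] * t1 + rho[1] * t2           # areal mass, kg/m^2
            if best is None or mass < best[0]:
                best = (mass, t1, t2)
print("min areal mass (kg/m^2), t_insulator, t_structure:", best)
```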
A knowledge authoring tool for clinical decision support.
Dunsmuir, Dustin; Daniels, Jeremy; Brouse, Christopher; Ford, Simon; Ansermino, J Mark
2008-06-01
Anesthesiologists in the operating room are unable to constantly monitor all data generated by physiological monitors, and they are further distracted by clinical and educational tasks. An expert system would ideally provide assistance to the anesthesiologist in this data-rich environment. Clinical monitoring expert systems have not been widely adopted, as traditional methods of knowledge encoding require both expert medical and programming skills, making knowledge acquisition difficult. A software application was developed for use as a knowledge authoring tool for physiological monitoring. This application enables clinicians to create knowledge rules without the need for a knowledge engineer or programmer. These rules are designed to provide clinical diagnoses, explanations, and treatment advice for optimal patient care to the clinician in real time. By intelligently combining data from physiological monitors and demographic data sources, the expert system can use these rules to assist in monitoring the patient. The knowledge authoring process is simplified by limiting connective relationships between rules. The application is designed to allow open collaboration among communities of clinicians to build a library of rules for clinical use. This design provides clinicians with a system for parameter surveillance and expert advice with a transparent pathway of reasoning. A usability evaluation demonstrated that anesthesiologists can rapidly develop useful rules for use in a predefined clinical scenario.
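A hypothetical sketch of what a rule with limited connectives might look like when evaluated against a monitor-data snapshot. The rule content, field names, and thresholds are invented for illustration, not taken from the actual application.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    condition: Callable[[dict], bool]   # over current monitor + demographic data
    advice: str

rules = [
    Rule("possible hypovolemia",
         lambda d: d["heart_rate"] > 100 and d["mean_arterial_pressure"] < 60,
         "Check volume status; consider fluid bolus."),
]

snapshot = {"heart_rate": 112, "mean_arterial_pressure": 55}
for r in rules:
    if r.condition(snapshot):
        print(f"{r.name}: {r.advice}")
```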
Simplified hydraulic model of French vertical-flow constructed wetlands.
Arias, Luis; Bertrand-Krajewski, Jean-Luc; Molle, Pascal
2014-01-01
Designing vertical-flow constructed wetlands (VFCWs) to treat both rain events and dry-weather flow is a complex task due to the stochastic nature of rain events. Dynamic models can help to improve design, but they usually prove difficult for designers to handle. This study focuses on the development of a simplified hydraulic model of French VFCWs using an empirical infiltration coefficient, the infiltration capacity parameter (ICP). The model was fitted using 60-second-step data collected on two experimental French VFCW systems and compared with the Hydrus 1D software. The model revealed a season-by-season evolution of the ICP that could be explained by the mechanical role of reeds. This simplified model makes it possible to define time-course shifts in ponding time and outlet flows. As ponding time hinders oxygen renewal, thereby impacting nitrification and organic matter degradation, ponding-time limits can be used to set a reliable design for treating both dry-weather and rain-event flows.
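A conceptual ponding/infiltration bucket in the spirit of the simplified model: the outlet flux is capped by an infiltration capacity, and inflow in excess of that capacity accumulates as ponding. The ICP value, time step, and inflow series are invented; the paper's model (including its seasonal ICP) is calibrated against field data, unlike this sketch.

```python
def simulate(inflow_series, icp, dt=60.0):
    """inflow_series: m/s per unit filter area; icp: max infiltration flux, m/s."""
    pond, ponding_time, outflow = 0.0, 0.0, []
    for q_in in inflow_series:
        available = q_in + pond / dt            # arriving water + ponded store
        q_out = min(available, icp)             # infiltration-limited outlet flux
        pond = max(0.0, pond + (q_in - q_out) * dt)
        ponding_time += dt if pond > 0 else 0.0
        outflow.append(q_out)
    return outflow, ponding_time

out, t_pond = simulate([2e-5] * 30 + [0.0] * 30, icp=1.5e-5)
print(t_pond / 60.0, "minutes of ponding")   # 40.0 for this synthetic event
```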
NASA Astrophysics Data System (ADS)
Reder, Leonard J.; Booth, Andrew; Hsieh, Jonathan; Summers, Kellee R.
2004-09-01
This paper presents a discussion of the evolution of a sequencer from a simple Experimental Physics and Industrial Control System (EPICS) based sequencer into a complex implementation designed utilizing UML (Unified Modeling Language) methodologies and a Computer Aided Software Engineering (CASE) tool approach. The main purpose of the Interferometer Sequencer (called the IF Sequencer) is to provide overall control of the Keck Interferometer to enable science operations to be carried out by a single operator (and/or observer). The interferometer links the two 10m telescopes of the W. M. Keck Observatory at Mauna Kea, Hawaii. The IF Sequencer is a high-level, multi-threaded, Harel finite state machine software program designed to orchestrate several lower-level hardware and software hard real-time subsystems that must perform their work in a specific and sequential order. The sequencing need not be done in hard real-time. Each state machine thread commands either a high-speed real-time multiple mode embedded controller via CORBA, or slower controllers via EPICS Channel Access interfaces. The overall operation of the system is simplified by the automation. The UML is discussed and our use of it to implement the sequencer is presented. The decision to use the Rhapsody product as our CASE tool is explained and reflected upon. Most importantly, a section on lessons learned is presented and the difficulty of integrating CASE tool automatically generated C++ code into a large control system consisting of multiple infrastructures is presented.
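A minimal state-machine thread sketch of the sequencing pattern described above: each state issues a command to a (stubbed) subsystem and advances only on completion, so the hard-real-time work stays inside the subsystems. The states and commands are illustrative, not the IF Sequencer's code.

```python
import threading

SEQUENCE = [("slew", "telescope"), ("acquire", "tracker"), ("record", "backend")]

def command(subsystem, action):
    """Stand-in for a CORBA or EPICS Channel Access call to a subsystem."""
    print(f"{subsystem}: {action} ... done")
    return True

def sequencer():
    for action, subsystem in SEQUENCE:   # strictly sequential state machine
        if not command(subsystem, action):
            print(f"abort in state '{action}'")
            return
    print("sequence complete")

t = threading.Thread(target=sequencer)
t.start(); t.join()
```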
Dynamic Response of Structures under Wind Loads (Réponse dynamique des structures sous charges de vent)
NASA Astrophysics Data System (ADS)
Gani, Ferawati
The main purpose of this research is to assemble numerical tools that allow realistic dynamic studies of structures under wind loading. The availability of such tools is becoming more important for industry, following past experiences of structural damage after extreme wind events. The methodology involves two main steps: (i) preparing the wind loading according to its spatial and temporal correlations, using either digitally generated or real measured wind; (ii) preparing a numerical model that captures the characteristics of the real structure and respects all the numerical requirements of transient dynamic analysis. The thesis is presented as an ensemble of four articles written for refereed journals and conferences that showcase the contributions of the study to the transient dynamic analysis of structures under wind loading, on the wind model itself (the first article) and on its application to complex structures (the next three articles). The articles are as follows: (a) an evaluation of the three-dimensional correlations of wind, an important issue for more precise prediction of wind loading on flexible, line-like structures; the results help design engineers choose more suitable models for defining three-dimensional wind loading; (b) a refinement of the design of a utility-scale solar photovoltaic concentrator-tracker structure, addressing concerns related to strict operational criteria and fatigue under wind load for a large parabolic truss; (c) a study of guyed towers for transmission lines (TLs), questioning the applicability of the static-equivalent method of current industry documents to this type of flexible TL support and proposing a simplified method to improve wind design; (d) an evaluation of the fundamental issue of nonlinear behaviour of single-degree-of-freedom systems under extreme wind loading, in which the use of real measured hurricane and winter storm records highlights the potential interest of accounting for ductility in extreme-wind design. The project has shown the versatility of the developed wind study methodology in addressing concerns related to different types of complex structures, and it proposes simplified methods useful to practising engineers facing similar problems. Key words: nonlinear, dynamic, wind, guyed tower, parabolic structure, ductility.
ProteoWizard: open source software for rapid proteomics tools development.
Kessner, Darren; Chambers, Matt; Burke, Robert; Agus, David; Mallick, Parag
2008-11-01
The ProteoWizard software project provides a modular and extensible set of open-source, cross-platform tools and libraries. The tools perform proteomics data analyses; the libraries enable rapid tool creation by providing a robust, pluggable development framework that simplifies and unifies data file access and performs standard proteomics and LCMS dataset computations. The library contains readers and writers for the mzML data format, has been written using modern C++ techniques and design principles, and supports a variety of platforms with native compilers. The software has been released under the Apache v2 license specifically to ensure it can be used in both academic and commercial projects. In addition to the library, we also introduce a rapidly growing set of companion tools whose implementation helps to illustrate the simplicity of developing applications on top of the ProteoWizard library. Cross-platform software that compiles using native compilers (i.e., GCC on Linux, MSVC on Windows, and XCode on OSX) is available for download free of charge at http://proteowizard.sourceforge.net, which also provides code examples and documentation. It is our hope that the ProteoWizard project will become a standard platform for proteomics development; consequently, code use, contribution, and further development are strongly encouraged.
Fundamental Study of Material Flow in Friction Stir Welds
NASA Technical Reports Server (NTRS)
Reynolds, Anthony P.
1999-01-01
The presented research project consists of two major parts. First, the material flow in solid-state friction stir butt welds has been investigated using a marker insert technique. Changes in material flow due to variations in welding parameters as well as tool geometry have been examined for different materials. The method provides a semi-quantitative, three-dimensional view of the material transport in the welded zone. Second, an FSW process model has been developed. The fully coupled model is based on fluid mechanics; the solid-state material transport during welding is treated as laminar, viscous flow of a non-Newtonian fluid past a rotating circular cylinder. The heat necessary for the material softening is generated by deformation of the material. As a first step, a two-dimensional model, which contains only the pin of the FSW tool, was created to test the suitability of the modeling approach and to perform parametric studies of the boundary conditions. The material-flow visualization experiments agree very well with the predicted flow field. Accordingly, material within the pin diameter is transported only in the rotation direction around the pin. Due to the simplifying assumptions inherent in the 2-D model, other experimental data such as forces on the pin, torque, and weld energy cannot be directly used for validation; however, the 2-D model predicts the same trends as shown in the experiments. The model also predicts a deviation from the "normal" material flow at certain combinations of welding parameters, suggesting a possible mechanism for the occurrence of some typical FSW defects. The next step was the development of a three-dimensional process model. The simplified FSW tool was designed as a flat shoulder rotating on top of the workpiece and a rotating cylindrical pin that extends through the total height of the flow domain. The thermal boundary conditions at the tool and at the contact area with the backing plate were varied to fit experimental data such as temperature profiles, torque, and tool forces. General aspects of the experimentally visualized material flow pattern are confirmed by the 3-D model.
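A sketch of a shear-thinning (power-law) viscosity of the kind often used when FSW is modeled as laminar non-Newtonian flow past a rotating cylinder. The consistency index, exponent, and shear-rate range are invented values, not those of the paper's fully coupled model.

```python
import numpy as np

K, n = 2.0e6, 0.2                      # consistency (Pa.s^n) and power-law index
shear_rates = np.logspace(-1, 3, 5)    # 0.1 to 1000 1/s

mu = K * shear_rates**(n - 1.0)        # apparent viscosity, Pa.s
for g, m in zip(shear_rates, mu):
    print(f"gamma_dot = {g:8.1f} 1/s  ->  mu = {m:10.3e} Pa.s")
# Near the rotating pin the shear rate is high, so the apparent viscosity
# drops sharply, which localizes the deforming zone around the tool.
```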
NASA Technical Reports Server (NTRS)
Perrell, Eric R.
2005-01-01
The recent bold initiatives to expand the human presence in space require innovative approaches to the design of propulsion systems whose underlying technology is not yet mature. The space propulsion community has identified a number of candidate concepts; a short list includes solar sails, high-energy-density chemical propellants, electric and electromagnetic accelerators, and solar-thermal and nuclear-thermal expanders. For each of these, the underlying physics are relatively well understood. One could easily cite authoritative texts addressing both the governing equations and practical solution methods for, e.g., electromagnetic fields, heat transfer, radiation, thermophysics, structural dynamics, particulate kinematics, nuclear energy, power conversion, and fluid dynamics. One could also easily cite scholarly works in which complete equation sets for any one of these physical processes have been accurately solved for complex engineered systems. The Advanced Concepts and Analysis Office (ACAO), Space Transportation Directorate, NASA Marshall Space Flight Center, has recently released the first alpha version of a set of computer utilities for performing the applicable physical analyses for candidate deep-space propulsion systems such as those listed above. PARSEC, Preliminary Analysis of Revolutionary in-Space Engineering Concepts, enables rapid iterative calculations using several physics tools developed in-house; a complete cycle of the entire tool set takes about twenty minutes. PARSEC is a level-zero/level-one design tool. For PARSEC's proof-of-concept and preliminary design decision-making, assumptions that significantly simplify the governing equation sets are necessary. To proceed to level two, one wishes to retain modeling of the underlying physics as close as practical to known applicable first principles. This report describes the results of a collaboration between ACAO and Embry-Riddle Aeronautical University (ERAU) to begin building a set of level-two design tools for PARSEC. The "CFD Multiphysics Tool" will be the propulsive element of the tool set; the name acknowledges that space propulsion performance assessment is primarily a fluid mechanics problem. At the core of the CFD Multiphysics Tool is an open-source CFD code, HYP, under development at ERAU. ERAU is renowned for its undergraduate degree program in Aerospace Engineering, the largest in the nation. The strength of the program is its applications-oriented curriculum, which culminates in one of three two-course Engineering Design sequences: Aerospace Propulsion, Spacecraft, or Aircraft. This same philosophy applies to the HYP project, albeit with fluid physics modeling commensurate with graduate research. HYP's purpose, like the Multiphysics Tool's, is to enable calculations of real (three-dimensional, geometrically complex, intended for hardware development) applications of high-speed and propulsive fluid flows.
Propellant Chemistry for CFD Applications
NASA Technical Reports Server (NTRS)
Farmer, R. C.; Anderson, P. G.; Cheng, Gary C.
1996-01-01
Current concepts for reusable launch vehicle design have created renewed interest in the use of RP-1 fuels for high-pressure and tri-propellant propulsion systems. Such designs require an analytical technology that accurately accounts for the effects of real fluid properties, combustion of large hydrocarbon fuel molecules, and the possibility of soot formation. These effects are inadequately treated in current computational fluid dynamics (CFD) codes used for propulsion system analyses. The objective of this investigation is to provide an accurate analytical description of hydrocarbon combustion thermodynamics and kinetics that is sufficiently computationally efficient to be a practical design tool when used with CFD codes such as the FDNS code. A rigorous description of real fluid properties for RP-1 and its combustion products will be derived from the literature and from experiments conducted in this investigation. Once such a description is established, the fluid description will be simplified using the minimum of empiricism necessary to maintain accurate combustion analyses, and the resulting empirical models will be incorporated into an appropriate CFD code. An additional benefit of this approach is that the real-fluid-properties analysis simplifies the introduction of droplet-spray effects into the combustion model. Typical species compositions of RP-1 have been identified, surrogate fuels have been established for analyses, and combustion and sooting reaction kinetics models have been developed. Methods for predicting the necessary real fluid properties have been developed and essential experiments have been designed. Verification studies are in progress, and preliminary results from these studies will be presented. The approach has been determined to be feasible, and upon its completion the required methodology for accurate performance and heat-transfer CFD analyses of high-pressure, tri-propellant propulsion systems will be available.
Automatic Design of Digital Synthetic Gene Circuits
Marchisio, Mario A.; Stelling, Jörg
2011-01-01
De novo computational design of synthetic gene circuits that achieve well-defined target functions is a hard task. Existing, brute-force approaches run optimization algorithms on the structure and on the kinetic parameter values of the network. However, more direct rational methods for automatic circuit design are lacking. Focusing on digital synthetic gene circuits, we developed a methodology and a corresponding tool for in silico automatic design. For a given truth table that specifies a circuit's input–output relations, our algorithm generates and ranks several possible circuit schemes without the need for any optimization. Logic behavior is reproduced by the action of regulatory factors and chemicals on the promoters and on the ribosome binding sites of biological Boolean gates. Simulations of circuits with up to four inputs show a faithful and unequivocal truth table representation, even under parametric perturbations and stochastic noise. A comparison with already implemented circuits, in addition, reveals the potential for simpler designs with the same function. Therefore, we expect the method to help both in devising new circuits and in simplifying existing solutions. PMID:21399700
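A brute-force sketch of optimization-free circuit generation: enumerate small compositions of Boolean primitives and keep those matching a target truth table. Real gene-circuit design maps gates onto promoters and ribosome binding sites with many additional constraints; this only illustrates the generate-and-filter idea.

```python
from itertools import product

GATES = {
    "AND":  lambda a, b: a and b,
    "OR":   lambda a, b: a or b,
    "NAND": lambda a, b: not (a and b),
    "NOR":  lambda a, b: not (a or b),
}

target = {(0, 0): 1, (0, 1): 0, (1, 0): 0, (1, 1): 1}  # XNOR of two inputs

solutions = []
for g1, g2, g3 in product(GATES, repeat=3):            # scheme: (a g1 b) g3 (a g2 b)
    f = lambda a, b: GATES[g3](GATES[g1](a, b), GATES[g2](a, b))
    if all(bool(f(a, b)) == bool(v) for (a, b), v in target.items()):
        solutions.append((g1, g2, g3))
print(solutions)   # every three-gate scheme realizing the target table
```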
Design Environment for Multifidelity and Multidisciplinary Components
NASA Technical Reports Server (NTRS)
Platt, Michael
2014-01-01
One of the greatest challenges when developing propulsion systems is predicting the interacting effects between the fluid loads, thermal loads, and structural deflection. The interactions between technical disciplines often are not fully analyzed, and the analysis in one discipline often uses a simplified representation of other disciplines as an input or boundary condition. For example, the fluid forces in an engine generate static and dynamic rotor deflection, but the forces themselves are dependent on the rotor position and its orbit. It is important to consider the interaction between the physical phenomena where the outcome of each analysis is heavily dependent on the inputs (e.g., changes in flow due to deflection, changes in deflection due to fluid forces). A rigid design process also lacks the flexibility to employ multiple levels of fidelity in the analysis of each of the components. This project developed and validated an innovative design environment that has the flexibility to simultaneously analyze multiple disciplines and multiple components with multiple levels of model fidelity. Using NASA's open-source multidisciplinary design analysis and optimization (OpenMDAO) framework, this multifaceted system will provide substantially superior capabilities to current design tools.
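A toy fixed-point (Gauss-Seidel) coupling of two disciplines like those in the text: the fluid force depends on deflection and the deflection depends on the fluid force. The linear models and constants are invented; frameworks such as OpenMDAO manage this kind of coupling, plus fidelity swapping, in a general way.

```python
def fluid_force(deflection):           # low-fidelity aero model (assumed)
    return 1000.0 + 400.0 * deflection

def deflection(force):                 # low-fidelity structural model (assumed)
    return force / 2.0e4

f, d = 1000.0, 0.0
for it in range(50):
    d_new = deflection(f)              # structure responds to current force
    f_new = fluid_force(d_new)         # fluid responds to updated deflection
    if abs(d_new - d) < 1e-10 and abs(f_new - f) < 1e-6:
        break
    f, d = f_new, d_new
print(f"converged after {it} iterations: force={f:.2f} N, deflection={d:.5f} m")
```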
Amanzi: An Open-Source Multi-process Simulator for Environmental Applications
NASA Astrophysics Data System (ADS)
Moulton, J. D.; Molins, S.; Johnson, J. N.; Coon, E.; Lipnikov, K.; Day, M.; Barker, E.
2014-12-01
The Advanced Simulation Capability for Environmental Management (ASCEM) program is developing an approach and an open-source tool suite for standardized risk and performance assessments at legacy nuclear waste sites. These assessments begin with simplified models and add geometric and geologic complexity as understanding is gained. The platform toolset (Akuna) generates the conceptual models, and Amanzi provides the computational engine to perform the simulations, returning the results for analysis and visualization. In this presentation we highlight key elements of the design, algorithms, and implementations used in Amanzi. In particular, the hierarchical and modular design is aligned with the coupled processes being simulated and naturally supports a wide range of model complexity. This design leverages a dynamic data manager and the synergy of two graphs (one from the high-level perspective of the models, the other from the dependencies of the variables in the model) to enable flexible model configuration at run time. Moreover, to model sites with complex hydrostratigraphy, as well as engineered systems, we are developing a dual unstructured/structured capability. Recently, these capabilities have been collected in a framework named Arcos, and efforts have begun to improve interoperability between the unstructured and structured AMR approaches in Amanzi. To leverage a range of biogeochemistry capabilities from the community (e.g., CrunchFlow, PFLOTRAN, etc.), a biogeochemistry interface library called Alquimia was developed. To ensure that Amanzi is truly an open-source community code, we require a completely open-source tool chain for its development. We will comment on elements of this tool chain, including testing and documentation tools such as docutils and Sphinx. Finally, we will show simulation results from our phased demonstrations, including the geochemically complex Savannah River F-Area seepage basins.
Generic framework for mining cellular automata models on protein-folding simulations.
Diaz, N; Tischer, I
2016-05-13
Cellular automata model identification is an important way of building simplified simulation models. In this study, we describe a generic architectural framework that eases the development of new metaheuristic-based algorithms for cellular automata model identification in protein-folding trajectories. The framework was developed following a methodology based on design patterns, which improves the experience of developing new algorithms. Its usefulness is demonstrated by the implementation of four algorithms able to obtain extremely precise cellular automata models of the protein-folding process with a protein contact map representation. Dynamic rules obtained by the proposed approach are discussed, and future uses for the new tool are outlined.
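A minimal one-dimensional cellular automaton step, the kind of simplified update rule that such identification algorithms try to recover from folding trajectories. The rule choice and binary-state encoding below are illustrative only.

```python
def ca_step(state, rule):
    """state: tuple of 0/1 cells; rule: dict mapping 3-cell neighborhoods to 0/1."""
    n = len(state)
    return tuple(rule[(state[(i - 1) % n], state[i], state[(i + 1) % n])]
                 for i in range(n))

# Elementary rule 110 written out as a lookup table.
rule110 = {(1,1,1): 0, (1,1,0): 1, (1,0,1): 1, (1,0,0): 0,
           (0,1,1): 1, (0,1,0): 1, (0,0,1): 1, (0,0,0): 0}

s = (0, 0, 0, 1, 0, 0, 0, 0)
for _ in range(4):
    print(s)
    s = ca_step(s, rule110)
```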
Nikolian, Vahagn C; Ibrahim, Andrew M
2017-09-01
Journals fill several important roles within academic medicine, including building knowledge, validating the quality of methods, and communicating research. This section provides an overview of these roles and highlights innovative approaches journals have taken to enhance the dissemination of research. As journals move away from print formats and embrace web-based content, design-centered thinking will allow engagement of a larger audience. Examples of recent efforts in this realm are provided, as well as simplified strategies for developing visual abstracts to improve dissemination via social media. Finally, we home in on the principles of learning and education that have driven these advances in multimedia-based communication of scientific research.
The Value of SysML Modeling During System Operations: A Case Study
NASA Technical Reports Server (NTRS)
Dutenhoffer, Chelsea; Tirona, Joseph
2013-01-01
System models are often touted as engineering tools that promote better understanding of systems, but these models are typically created during system design. The Ground Data System (GDS) team for the Dawn spacecraft took on a case study to see whether benefits could be achieved by starting a model of a system already in operations. This paper focuses on the four steps the team undertook in modeling the Dawn GDS: defining a model structure, populating model elements, verifying that the model represented reality, and using the model to answer system-level questions and simplify day-to-day tasks. Throughout the paper the team outlines its thought processes and the system insights the model provided.
Boron-selective reactions as powerful tools for modular synthesis of diverse complex molecules.
Xu, Liang; Zhang, Shuai; Li, Pengfei
2015-12-21
In the context of modular and rapid construction of molecular diversity and complexity for applications in organic synthesis, biomedical and materials sciences, a generally useful strategy has emerged based on boron-selective chemical transformations. In the last decade, these types of reactions have evolved from proof-of-concept to some advanced applications in the efficient preparation of complex natural products and even automated precise manufacturing on the molecular level. These advances have shown the great potential of boron-selective reactions in simplifying synthetic design and experimental operations, and should inspire new developments in related chemical and technological areas. This tutorial review will highlight the original contributions and representative advances in this emerging field.
Constraint Force Equation Methodology for Modeling Multi-Body Stage Separation Dynamics
NASA Technical Reports Server (NTRS)
Toniolo, Matthew D.; Tartabini, Paul V.; Pamadi, Bandu N.; Hotchko, Nathaniel
2008-01-01
This paper discusses a generalized approach to the multi-body separation problems in a launch vehicle staging environment based on constraint force methodology and its implementation into the Program to Optimize Simulated Trajectories II (POST2), a widely used trajectory design and optimization tool. This development facilitates the inclusion of stage separation analysis into POST2 for seamless end-to-end simulations of launch vehicle trajectories, thus simplifying the overall implementation and providing a range of modeling and optimization capabilities that are standard features in POST2. Analysis and results are presented for two test cases that validate the constraint force equation methodology in a stand-alone mode and its implementation in POST2.
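A worked numerical sketch of the constraint-force idea: solve the augmented system M*qdd = F + J^T*lam, J*qdd = c for accelerations and constraint forces at one instant. The two-body, single-constraint numbers are invented for illustration and are not from POST2.

```python
import numpy as np

M = np.diag([1200.0, 300.0])       # stage masses (kg)
F = np.array([5.0e4, 0.0])         # applied forces (N): thrust on stage 1 only
J = np.array([[1.0, -1.0]])        # constraint: equal accelerations while mated
c = np.array([0.0])                # J * qdd = 0

# Assemble the KKT system [[M, -J^T], [J, 0]] [qdd; lam] = [F; c]
n, m = M.shape[0], J.shape[0]
A = np.block([[M, -J.T], [J, np.zeros((m, m))]])
b = np.concatenate([F, c])
sol = np.linalg.solve(A, b)
qdd, lam = sol[:n], sol[n:]
print("accelerations:", qdd, "constraint force:", lam)
# qdd -> [33.33, 33.33] m/s^2; lam ~ -1.0e4 N is the mating interface force
# (its sign depends on the constraint convention).
```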
compomics-utilities: an open-source Java library for computational proteomics.
Barsnes, Harald; Vaudel, Marc; Colaert, Niklaas; Helsens, Kenny; Sickmann, Albert; Berven, Frode S; Martens, Lennart
2011-03-08
The growing interest in the field of proteomics has increased the demand for software tools and applications that process and analyze the resulting data. Even though the purposes of these tools vary significantly, they usually share a basic set of features, including the handling of protein and peptide sequences, the visualization of (and interaction with) spectra and chromatograms, and the parsing of results from various proteomics search engines. Developers typically spend considerable time and effort implementing these support structures, which detracts from working on the novel aspects of their tool. In order to simplify the development of proteomics tools, we have implemented an open-source support library for computational proteomics, called compomics-utilities. The library contains a broad set of features required for reading, parsing, and analyzing proteomics data. compomics-utilities is already used by a long list of existing software, ensuring library stability and continued support and development. As a user-friendly, well-documented, and open-source library, compomics-utilities greatly simplifies the implementation of the basic features needed in most proteomics tools. Implemented in 100% Java, compomics-utilities is fully portable across platforms and architectures. Our library thus allows developers to focus on the novel aspects of their tools rather than on basic functions, which can contribute substantially to faster development and better tools for proteomics.
Space Fabrication Demonstration System
NASA Technical Reports Server (NTRS)
1977-01-01
Progress on the fabrication facility (beam builder) support structure control, clamp/weld block, and welding and truss cut-off is discussed. The brace attachment design was changed and the design of the weld mechanism was modified, which achieved the following system benefits: (1) extended weld electrode life; (2) reduced weld power requirements; and (3) simplified brace attachment mechanisms. Static and fatigue characteristics of spot-welded 2024-T3 aluminum joints are evaluated.
Mercury ion thruster technology
NASA Technical Reports Server (NTRS)
Beattie, J. R.; Matossian, J. N.
1989-01-01
The Mercury Ion Thruster Technology program was an investigation aimed at improving the understanding of state-of-the-art mercury ion thrusters. Emphasis was placed on optimizing the performance and simplifying the design of the 30-cm-diameter ring-cusp discharge chamber. Thruster performance was improved considerably; the baseline beam-ion production cost of the optimized configuration was reduced to approximately 130 eV/ion. At a discharge propellant-utilization efficiency of 95 percent, the beam-ion production cost was reduced to about 155 eV/ion, a reduction of about 40 eV/ion from the corresponding value for the 30-cm-diameter J-series thruster. Comprehensive Langmuir-probe surveys were obtained and compared with similar measurements for a J-series thruster. A volume-averaging scheme was successfully developed to correlate thruster performance with the dominant plasma processes that prevail in the two thruster designs. The average Maxwellian electron temperature in the optimized ring-cusp design is as much as 1 eV higher than in the J-series thruster. Advances in ion-extraction electrode fabrication technology were made by improving materials selection criteria, hydroforming and stress-relieving tooling, and fabrication procedures. An ion-extraction performance study was conducted to assess the effect of screen aperture size on ion-optics performance and to verify the effectiveness of a beam-vectoring model for three-grid ion optics. An assessment of the technology readiness of the J-series thruster was completed, and operation of an 8-cm IAPS thruster using a simplified power processor was demonstrated.
NASA Astrophysics Data System (ADS)
Şahin, Rıdvan; Liu, Peide
2017-07-01
The simplified neutrosophic set (SNS) is an appropriate tool for expressing the incompleteness, indeterminacy, and uncertainty of evaluation objects in a decision-making process. In this study, we define the concept of a possibility SNS that includes two types of information: the neutrosophic performance provided by the evaluation objects and its possibility degree, expressed as a value between zero and one. Then, by extending the existing neutrosophic information aggregation models for SNSs, which cannot effectively fuse the two different types of information described above, we propose two novel neutrosophic aggregation operators that account for possibility, named the possibility-induced simplified neutrosophic weighted arithmetic averaging operator and the possibility-induced simplified neutrosophic weighted geometric averaging operator, and discuss their properties. Moreover, we develop a method based on the proposed aggregation operators for solving multi-criteria group decision-making problems with possibility simplified neutrosophic information, in which the weights of the decision-makers and decision criteria are calculated using an entropy measure. Finally, a practical example is utilised to show the practicality and effectiveness of the proposed method.
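For orientation, a sketch of the standard simplified-neutrosophic weighted arithmetic averaging operator on numbers (T, I, F), without the possibility extension the paper introduces; the weights and values below are illustrative.

```python
def snwaa(values, weights):
    """values: list of (truth, indeterminacy, falsity) in [0, 1]; weights sum to 1."""
    t, i, f = 1.0, 1.0, 1.0
    for (T, I, F), w in zip(values, weights):
        t *= (1.0 - T) ** w     # truth aggregates as 1 - prod(1 - T_j)^w_j
        i *= I ** w             # indeterminacy aggregates as prod(I_j)^w_j
        f *= F ** w             # falsity aggregates as prod(F_j)^w_j
    return (1.0 - t, i, f)

print(snwaa([(0.7, 0.2, 0.1), (0.5, 0.4, 0.3)], [0.6, 0.4]))
```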
Conservation of a molecular target across species can be used as a line-of-evidence to predict the likelihood of chemical susceptibility. The web-based Sequence Alignment to Predict Across Species Susceptibility (SeqAPASS) tool was developed to simplify, streamline, and quantitat...
CoMET: Cost and Mass Evaluation Tool for Spacecraft and Mission Design
NASA Technical Reports Server (NTRS)
Bieber, Ben S.
2005-01-01
New technology in space exploration is often developed without complete knowledge of its impact. While the immediate benefits of a new technology are obvious, it is harder to understand its indirect consequences, which ripple through the entire system. CoMET is a technology evaluation tool designed to illuminate how specific technology choices affect a mission at each system level. CoMET uses simplified models for mass, power, and cost to analyze performance parameters of technologies of interest. The sensitivity analysis that CoMET provides shows whether developing a certain technology will greatly benefit the project or not. CoMET is an ongoing project approaching a web-based implementation phase. This year, development focused on models for planetary daughter craft, such as atmospheric probes, blimps and balloons, and landers. These models were developed through research into historical data, well-established rules of thumb, and the engineering judgment of experts at JPL, and they are validated by corroboration with JPL advanced mission studies. Other enhancements to CoMET include adding launch vehicle analysis and integrating an updated cost model. When completed, CoMET will allow technological development to be focused on areas that will most drastically improve spacecraft performance.
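A toy rollup in the spirit of the simplified models described above: a technology choice changes an instrument mass, which ripples into power, total mass, and cost, and a finite difference exposes the sensitivity. All coefficients are invented; the real tool's models are far richer.

```python
def mission_cost(instrument_mass_kg):
    power_w = 4.0 * instrument_mass_kg + 20.0            # instrument power model
    bus_mass = 0.8 * instrument_mass_kg + 0.1 * power_w  # structure + power sizing
    total_mass = instrument_mass_kg + bus_mass
    return 0.5 * total_mass + 0.02 * power_w             # $M, mass/power-driven CER

base = mission_cost(10.0)
bumped = mission_cost(11.0)                              # +1 kg technology change
print(f"cost sensitivity: {bumped - base:.3f} $M per kg of instrument mass")
```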
Application of indoor noise prediction in the real world
NASA Astrophysics Data System (ADS)
Lewis, David N.
2002-11-01
Predicting indoor noise in industrial workrooms is an important part of the process of designing industrial plants. Predicted levels are used in the design process to determine compliance with occupational-noise regulations and to estimate levels inside the walls in order to predict community noise radiated from the building. Once predicted levels are known, noise-control strategies can be developed. In this paper, an overview is given of over 20 years of experience with the use of various prediction approaches to manage noise in Unilever plants. This work has applied empirical and ray-tracing approaches, separately and in combination, to design various packaging and production plants and other facilities. The advantages of prediction methods in general, and of the various approaches in particular, will be discussed. A case-study application of prediction methods to the optimization of noise-control measures in a food-packaging plant will be presented. Plans to acquire a simplified prediction model for use as a company noise-screening tool will be discussed.
Horvath, Monica M.; Winfield, Stephanie; Evans, Steve; Slopek, Steve; Shang, Howard; Ferranti, Jeffrey
2011-01-01
In many healthcare organizations, comparative effectiveness research and quality improvement (QI) investigations are hampered by a lack of access to data created as a byproduct of patient care. Data collection often hinges upon either manual chart review or ad hoc requests to technical experts who support legacy clinical systems. In order to facilitate this needed capacity for data exploration at our institution (Duke University Health System), we have designed and deployed a robust Web application for cohort identification and data extraction: the Duke Enterprise Data Unified Content Explorer (DEDUCE). DEDUCE is envisioned as a simple, web-based environment that allows investigators access to administrative, financial, and clinical information generated during patient care. By using business intelligence tools to create a view into Duke Medicine's enterprise data warehouse, DEDUCE provides a guided query functionality using a wizard-like interface that lets users filter through millions of clinical records, explore aggregate reports, and export extracts. Researchers and QI specialists can obtain detailed patient- and observation-level extracts without needing to understand structured query language or the underlying database model. Developers designing such tools must devote sufficient training and develop application safeguards to ensure that patient-centered clinical researchers understand when observation-level extracts should be used. This may mitigate the risk of data being misunderstood and consequently used in an improper fashion. PMID:21130181
Environmental analysis Waste Isolation Pilot Plant (WIPP) cost reduction proposals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The Waste Isolation Pilot Plant (WIPP) is a research and development facility to demonstrate the safe disposal of radioactive wastes resulting from the defense activities and programs of the United States government. The facility is planned to be developed in bedded salt at the Los Medanos site in southeastern New Mexico. The environmental consequences of construction and operation of the WIPP facility are documented in the ''Final Environmental Impact Statement, Waste Isolation Pilot Plant''. The proposed action addressed by this environmental analysis is to simplify and reduce the scope of the WIPP facility as it is currently designed. The proposed changes to the existing WIPP design are: limit the waste storage rate to 500,000 cubic feet per year; eliminate one shaft and revise the underground ventilation system; eliminate the underground conveyor system; combine the Administration Building, the Underground Personnel Building, and the Waste Handling Building office area; simplify the central monitoring system; simplify the security control systems; modify the Waste Handling Building; simplify the storage exhaust system; modify the above-ground salt handling logistics; simplify the power system; reduce overall site features; simplify the Warehouse/Shops Building and eliminate the Vehicle Maintenance Building; and allow resource recovery in Control Zone IV.
The Atlas of Physiology and Pathophysiology: Web-based multimedia enabled interactive simulations.
Kofranek, Jiri; Matousek, Stanislav; Rusz, Jan; Stodulka, Petr; Privitzer, Pavol; Matejak, Marek; Tribula, Martin
2011-11-01
The paper is a presentation of the current state of development for the Atlas of Physiology and Pathophysiology (Atlas). Our main aim is to provide a novel interactive multimedia application that can be used for biomedical education where (a) simulations are combined with tutorials and (b) the presentation layer is simplified while the underlying complexity of the model is retained. The development of the Atlas required the cooperation of many professionals including teachers, system analysts, artists, and programmers. During the design of the Atlas, tools were developed that allow for component-based creation of simulation models, creation of interactive multimedia and their final coordination into a compact unit based on the given design. The Atlas is a freely available online application, which can help to explain the function of individual physiological systems and the causes and symptoms of their disorders.
Sciutto, Giorgia; Zangheri, Martina; Anfossi, Laura; Guardigli, Massimo; Prati, Silvia; Mirasoli, Mara; Di Nardo, Fabio; Baggiani, Claudio; Mazzeo, Rocco; Roda, Aldo
2018-06-18
The point-of-care testing concept has been exploited to design and develop portable and cheap bioanalytical systems that can be used on-site by conservators. These systems employ lateral flow immunoassays to simultaneously detect two proteins (ovalbumin and collagen) in artworks. For an in-depth study on the application of these portable biosensors, both chemiluminescent and colorimetric detections were developed and compared in terms of sensitivity and feasibility. The chemiluminescent system displayed the best analytical performance (that is, two orders of magnitude lower limits of detection than the colorimetric system). To simplify its use, a disposable cartridge was designed ad hoc for this specific application. These results highlight the enormous potential of these inexpensive, easy-to-use, and minimally invasive diagnostic tools for conservators in the cultural heritage field. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Calibration of ultra-high frequency (UHF) partial discharge sensors using FDTD method
NASA Astrophysics Data System (ADS)
Ishak, Asnor Mazuan; Ishak, Mohd Taufiq
2018-02-01
Ultra-high frequency (UHF) partial discharge (PD) sensors are widely used for condition monitoring and defect location in the insulation systems of high-voltage equipment. Designing sensors for specific applications often requires an iterative process of manufacturing, testing, and mechanical modification. This paper demonstrates the use of the finite-difference time-domain (FDTD) technique as a tool to predict the frequency response of UHF PD sensors. Using this approach, the design process can be simplified and parametric studies can be conducted to assess the influence of component dimensions and material properties on the sensor response. The modelling approach is validated using a gigahertz transverse electromagnetic (GTEM) calibration system. The use of a transient excitation source is particularly suitable for FDTD modelling, which can simulate the step-response output voltage of the sensor, from which the frequency response is obtained using the same post-processing applied to the physical measurement.
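A minimal sketch of the simulation idea described above, in one dimension with normalized units: excite the grid with a Gaussian pulse, record the time response at a "sensor" cell, and Fourier-transform it to estimate the frequency response. Grid sizes, source placement, and the Courant factor of 0.5 are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Minimal 1-D FDTD sketch (normalized units, Courant number 0.5):
# excite with a Gaussian pulse, record the "sensor" output, then
# FFT the transient response to estimate the frequency response.
nz, nt = 400, 4000
ez = np.zeros(nz)          # electric field
hy = np.zeros(nz)          # magnetic field
probe = np.zeros(nt)       # time history at the sensor location

for n in range(nt):
    hy[:-1] += 0.5 * (ez[1:] - ez[:-1])         # update H from curl E
    ez[1:]  += 0.5 * (hy[1:] - hy[:-1])         # update E from curl H
    ez[50]  += np.exp(-((n - 60) / 20.0) ** 2)  # soft Gaussian source
    probe[n] = ez[300]                          # sample the sensor cell

spectrum = np.abs(np.fft.rfft(probe))           # frequency-response estimate
```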
Urban sound energy reduction by means of sound barriers
NASA Astrophysics Data System (ADS)
Iordache, Vlad; Ionita, Mihai Vlad
2018-02-01
In the urban environment, various heating, ventilation and air conditioning appliances designed to maintain indoor comfort become vectors of urban acoustic pollution due to the sound energy they produce. Acoustic barriers are the recommended method for reducing sound energy in the urban environment. The current sizing method for these barriers is laborious and impractical for arbitrary 3D positions of the noisy equipment and the reception point. In this study we develop, on the basis of the same method, a new simplified tool for acoustic barrier sizing that retains the precision of the classical method. Abacuses for acoustic barrier sizing are built that can be used for different 3D locations of the source and reception points, for several frequencies, and for several barrier heights. The case study presented in the article confirms the rapidity and ease of use of these abacuses in the design of acoustic barriers.
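The abstract does not reproduce the classical equations behind such abacuses; a common basis for barrier sizing is Maekawa's correlation of insertion loss with the Fresnel number N = 2δ/λ, where δ is the path-length difference between the diffracted and direct paths computed from the 3D geometry. A minimal sketch under that assumption:

```python
import numpy as np

def barrier_insertion_loss(delta_m: float, freq_hz: float, c: float = 343.0) -> float:
    """Approximate barrier insertion loss (dB) from the path-length
    difference delta (m) via the Fresnel number, using Maekawa's
    classic chart approximation IL ~ 10*log10(3 + 20*N)."""
    wavelength = c / freq_hz
    fresnel_n = 2.0 * delta_m / wavelength
    return 10.0 * np.log10(3.0 + 20.0 * fresnel_n)

# Example: 0.5 m path-length difference at 1 kHz -> roughly 18 dB
print(round(barrier_insertion_loss(0.5, 1000.0), 1))
```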
A Micromechanics-Based Method for Multiscale Fatigue Prediction
NASA Astrophysics Data System (ADS)
Moore, John Allan
An estimated 80% of all structural failures are due to mechanical fatigue, often resulting in catastrophic, dangerous, and costly failure events. However, an accurate model to predict fatigue remains an elusive goal. One of the major challenges is that fatigue is intrinsically a multiscale process, dependent on a structure's geometric design as well as its material's microscale morphology. The following work begins with a microscale study of fatigue nucleation around non-metallic inclusions. Based on this analysis, a novel multiscale method for fatigue prediction is developed. This method simulates macroscale geometries explicitly while concurrently calculating the simplified response of microscale inclusions, thus providing adequate detail on multiple scales for accurate fatigue life predictions. The methods herein provide insight into the multiscale nature of fatigue, while also developing a tool to aid in geometric design and material optimization for fatigue-critical devices such as biomedical stents and artificial heart valves.
Epiviz: a view inside the design of an integrated visual analysis software for genomics
2015-01-01
Background Computational and visual data analysis for genomics has traditionally involved a combination of tools and resources, of which the most ubiquitous consist of genome browsers, focused mainly on integrative visualization of large numbers of big datasets, and computational environments, focused on data modeling of a small number of moderately sized datasets. Workflows that involve the integration and exploration of multiple heterogeneous data sources, small and large, public and user-specific, have been poorly addressed by these tools. In our previous work, we introduced Epiviz, which bridges the gap between the two types of tools, simplifying these workflows. Results In this paper we expand on the design decisions behind Epiviz and introduce a series of new advanced features that further support the type of interactive exploratory workflow we have targeted. We discuss three ways in which Epiviz advances the field of genomic data analysis: 1) it brings code to interactive visualizations at various levels; 2) it takes the first steps toward collaborative data analysis by incorporating user plugins from source control providers, as well as by allowing analysis states to be shared among the scientific community; 3) it combines established analysis features that have never before been available simultaneously in a genome browser. In our discussion section, we present security implications of the current design, as well as a series of limitations and future research steps. Conclusions Since many of the design choices of Epiviz are novel in genomics data analysis, this paper serves both as a document of our own approaches, with lessons learned, and as a starting point for future efforts in the same direction for the genomics community. PMID:26328750
Industrial Adoption of Model-Based Systems Engineering: Challenges and Strategies
NASA Astrophysics Data System (ADS)
Maheshwari, Apoorv
As design teams become more globally integrated, one of the biggest challenges is to communicate efficiently across the team. The increasing complexity and multidisciplinary nature of products also make it difficult to keep track of all the information generated during the design process by these global team members. Systems engineers have identified Model-Based Systems Engineering (MBSE) as a possible solution, in which the emphasis is placed on applying visual modeling methods and best practices to systems engineering (SE) activities from the beginning of the conceptual design phases through the end of the product lifecycle. Despite several advantages, multiple challenges restrict the adoption of MBSE by industry. We mainly consider the following two: a) industry perceives MBSE as just a diagramming tool and sees little value in it; b) industrial adopters are skeptical that products developed using an MBSE approach will be accepted by regulatory bodies. To provide counter-evidence to the former challenge, we developed a generic framework for translation from an MBSE tool (Systems Modeling Language, SysML) to an analysis tool (Agent-Based Modeling, ABM). The translation is demonstrated using a simplified air traffic management problem and provides an example of a potentially significant value: the ability to use MBSE representations directly in an analysis setting. For the latter challenge, we are developing a reference model that uses SysML to represent a generic infusion pump and the SE process for planning, developing, and obtaining regulatory approval of a medical device. This reference model demonstrates how regulatory requirements can be captured effectively through model-based representations. We conclude with a further case study in which the knowledge gained from both case studies is applied to a UAV design problem.
Johny, Shajahan; Kyei-Poku, George
2014-10-01
The emerald ash borer (EAB) is an invasive species from Asia. Beauveria bassiana strain L49-1AA is being tested for the control of emerald ash borer in Canada, using an autocontamination trapping system. We have developed a simplified allele discrimination polymerase chain reaction (PCR) assay to distinguish B. bassiana strain L49-1AA from other Beauveria species by targeting the inter-strain genetic differences in the 5' end of the EF1-α gene of the genus Beauveria. A single nucleotide polymorphism (SNP) site, T→C, was identified only in L49-1AA and was used to develop a simplified allele discrimination PCR assay based on a modified allelic inhibition of displacement activity (AIDA) approach for distinguishing B. bassiana L49-1AA from all background Beauveria isolates. The SNP site was used to design inner primers, but with a deliberate mismatch introduced at the 3' antepenultimate position from the mutation site in order to maximize specificity and detection efficiency. Amplification was specific to L49-1AA, without cross-reaction with DNA from other Beauveria strains. In addition, the designed primers were tested against environmental samples in plots where L49-1AA had been released and were observed to be highly efficient in detecting and discriminating the target strain from both pure and crude DNA samples. This new method can potentially allow for more discriminatory tracking and monitoring of released L49-1AA in our autocontamination and dissemination projects for managing EAB populations. Additionally, the modified-AIDA format has potential as a tool for simultaneously identifying and differentiating closely related Beauveria species, strains, and isolates, as well as for general classification of other pathogens or organisms. Crown Copyright © 2014. Published by Elsevier Inc. All rights reserved.
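The primer-design trick described above can be sketched as follows; the template sequence and primer length are hypothetical, not the actual EF1-α locus.

```python
# Sketch of the AIDA-style inner-primer trick described above: the 3'
# end of the primer sits on the SNP, and a deliberate mismatch is
# introduced at the antepenultimate (third-from-3'-end) position.
# The sequence below is hypothetical, not the actual EF1-alpha locus.
COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def aida_inner_primer(template: str, snp_index: int, length: int = 20) -> str:
    """Build a forward inner primer ending exactly on the SNP site,
    then flip the base at the 3' antepenultimate position."""
    primer = list(template[snp_index - length + 1 : snp_index + 1])
    primer[-3] = COMPLEMENT[primer[-3]]   # deliberate destabilizing mismatch
    return "".join(primer)

template = "ATGGCTACGATCGTTAGCCTGAACGTTACCGGATCAAGTC"  # hypothetical
print(aida_inner_primer(template, snp_index=30))
```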
An improved pulse coupled neural network with spectral residual for infrared pedestrian segmentation
NASA Astrophysics Data System (ADS)
He, Fuliang; Guo, Yongcai; Gao, Chao
2017-12-01
The pulse coupled neural network (PCNN) has become a significant tool for infrared pedestrian segmentation, and a variety of relevant methods have been developed. However, existing models commonly suffer from poor adaptability to infrared noise, inaccurate segmentation results, and fairly complex parameter determination. This paper presents an improved PCNN model that integrates a simplified framework and spectral residual saliency to alleviate these problems. In this model, firstly, the weight matrix of the feeding input field is designed using anisotropic Gaussian kernels (ANGKs) in order to suppress infrared noise effectively. Secondly, the normalized spectral residual saliency is introduced as the linking coefficient to markedly enhance the edges and structural characteristics of segmented pedestrians. Finally, an improved dynamic threshold based on the average gray value of the iterative segmentation is employed to simplify the original PCNN model. Experiments on the IEEE OTCBVS benchmark and on an infrared pedestrian image database built by our laboratory demonstrate the superiority of our model in both subjective visual effects and objective quantitative evaluations (information differences and segmentation errors), compared with other classic segmentation methods.
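A minimal sketch of the simplified PCNN iteration such models build on, with a generic 3x3 linking kernel and a constant linking coefficient standing in for the paper's ANGK weights and spectral-residual saliency; all parameter values are illustrative.

```python
import numpy as np
from scipy.ndimage import convolve

def simplified_pcnn(img: np.ndarray, beta=0.2, alpha=0.7, V=20.0, iters=10):
    """Minimal simplified-PCNN segmentation sketch: feeding input F is
    the normalized image, linking L is a local sum of last-round pulses,
    and neurons fire when internal activity U exceeds a decaying dynamic
    threshold. Parameters are illustrative, not the paper's values."""
    S = img.astype(float) / img.max()          # normalized stimulus (F)
    kernel = np.array([[0.5, 1, 0.5], [1, 0, 1], [0.5, 1, 0.5]])
    Y = np.zeros_like(S)                       # pulse output
    theta = np.ones_like(S)                    # dynamic threshold
    for _ in range(iters):
        L = convolve(Y, kernel, mode="constant")
        U = S * (1.0 + beta * L)               # modulated internal activity
        Y = (U > theta).astype(float)          # neurons that fire this pass
        theta = theta * np.exp(-alpha) + V * Y # raise threshold where fired
    return Y
```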
NASA Technical Reports Server (NTRS)
Ungar, Eugene K.; Richards, W. Lance
2015-01-01
The aircraft-based Stratospheric Observatory for Infrared Astronomy (SOFIA) is a platform for multiple infrared astronomical observation experiments. These experiments carry sensors cooled to liquid helium temperatures. The liquid helium supply is contained in large (i.e., 10 liters or more) vacuum-insulated dewars. Should the dewar vacuum insulation fail, the inrushing air will condense and freeze on the dewar wall, resulting in a large heat flux on the dewar's contents. The heat flux results in a rise in pressure and the actuation of the dewar pressure relief system. A previous NASA Engineering and Safety Center (NESC) assessment provided recommendations for the wall heat flux that would be expected from a loss of vacuum and detailed an appropriate method for calculating the maximum pressure that would occur in a loss-of-vacuum event. This method involved building a detailed supercritical-helium compressible-flow thermal/fluid model of the vent stack and exercising the model over the appropriate range of parameters. The experimenters designing science instruments for SOFIA are not experts in compressible supercritical flows and do not generally have access to the thermal/fluid modeling packages required to build detailed models of the vent stacks. Therefore, the SOFIA Program engaged the NESC to develop a simplified methodology to estimate the maximum pressure in a liquid helium dewar after the loss of vacuum insulation. The method allows the university-based science instrument development teams to conservatively determine the cryostat's vent neck sizing during preliminary design of new SOFIA science instruments. This report details the development of the simplified method, the method itself, and the limits of its applicability. The simplified methodology provides an estimate of the dewar pressure after a loss of vacuum insulation that can be used for the initial design of the liquid helium dewar vent stacks. However, since it is not an exact tool, final verification of the dewar pressure vessel design requires a complete, detailed real-fluid compressible-flow model of the vent stack. The wall heat flux resulting from a loss of vacuum insulation increases the dewar pressure, which actuates the pressure relief mechanism and results in high-speed flow through the dewar vent stack. At high pressures, the flow can be choked at the vent stack inlet, at the exit, or at an intermediate transition or restriction. During previous SOFIA analyses, it was observed that there was generally a readily identifiable section of the vent stack that would limit the flow, e.g., a small-diameter entrance or an orifice. It was also found that when the supercritical helium was approximated as an ideal gas at the dewar condition, the calculated mass flow rate based on choking at the limiting entrance or transition was less than the mass flow rate calculated using the detailed real-fluid model. Using this lower mass flow rate yields a conservative prediction of the dewar's wall heat flux capability. The simplified method of the current work was developed by building on this observation.
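A worked sketch of the ideal-gas choked-flow estimate the method builds on, using the standard compressible-flow relation; the discharge coefficient, vent geometry, and dewar conditions below are illustrative assumptions.

```python
import math

def choked_mass_flow(p0_pa, t0_k, area_m2, gamma=5.0/3.0, R=2077.0, cd=1.0):
    """Ideal-gas choked mass flow through the limiting restriction:
    m_dot = Cd*A*p0*sqrt(gamma/(R*T0))*(2/(gamma+1))**((gamma+1)/(2*(gamma-1))).
    Defaults are for helium; Cd, p0, T0, and A are problem-specific."""
    expo = (gamma + 1.0) / (2.0 * (gamma - 1.0))
    return (cd * area_m2 * p0_pa * math.sqrt(gamma / (R * t0_k))
            * (2.0 / (gamma + 1.0)) ** expo)

# Example: 10-mm-diameter vent inlet at 2 bar and 6 K dewar conditions
area = math.pi * (0.010 / 2) ** 2
print(choked_mass_flow(2.0e5, 6.0, area))   # kg/s, conservative estimate
```

As the abstract notes, treating the supercritical helium as an ideal gas underpredicts the real-fluid mass flow, so the resulting heat-flux capability estimate is conservative.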
Antibiogramj: A tool for analysing images from disk diffusion tests.
Alonso, C A; Domínguez, C; Heras, J; Mata, E; Pascual, V; Torres, C; Zarazaga, M
2017-05-01
Disk diffusion testing, known as the antibiogram, is widely applied in microbiology to determine the antimicrobial susceptibility of microorganisms. Measuring the diameter of the zone of growth inhibition around the antimicrobial disks is frequently performed manually by specialists using a ruler. This is a time-consuming and error-prone task that might be simplified using automated or semi-automated inhibition zone readers. However, most readers are expensive instruments with embedded software that require significant changes in laboratory design and workflow. Based on the workflow employed by specialists to determine antimicrobial susceptibility, we have designed a software tool that semi-automatises the process from images of disk diffusion tests. Standard computer vision techniques are employed to achieve this automatisation. We present AntibiogramJ, a user-friendly, open-source software tool to semi-automatically determine, measure, and categorise inhibition zones in images of disk diffusion tests. AntibiogramJ is implemented in Java and deals with images captured by any device that incorporates a camera, including digital cameras and mobile phones. The fully automatic procedure of AntibiogramJ achieves an overall agreement of 87% with an expert microbiologist; moreover, AntibiogramJ includes features to easily detect when the automatic reading is incorrect and to fix it manually. AntibiogramJ is a user-friendly, platform-independent, open-source, free tool that, to the best of our knowledge, is the most complete software tool for antibiogram analysis without requiring any investment in new equipment or changes in the laboratory. Copyright © 2017 Elsevier B.V. All rights reserved.
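AntibiogramJ itself is a Java/ImageJ-based tool; the Python/OpenCV fragment below only sketches the general computer-vision approach (circle detection on a plate image), with the file name and Hough parameters as illustrative assumptions.

```python
import cv2
import numpy as np

# Rough sketch of semi-automated inhibition-zone reading with standard
# computer-vision tools; this is an illustration of the approach, not
# AntibiogramJ's actual implementation.
img = cv2.imread("antibiogram_plate.jpg")          # hypothetical file
gray = cv2.medianBlur(cv2.cvtColor(img, cv2.COLOR_BGR2GRAY), 5)

circles = cv2.HoughCircles(
    gray, cv2.HOUGH_GRADIENT, dp=1, minDist=60,
    param1=100, param2=30, minRadius=10, maxRadius=120)

if circles is not None:
    for x, y, r in np.round(circles[0]).astype(int):
        diameter_px = 2 * r                        # convert via a known scale
        print(f"zone at ({x},{y}): {diameter_px} px across")
```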
Bhargava, Puneet; Lackey, Amanda E; Dhand, Sabeen; Moshiri, Mariam; Jambhekar, Kedar; Pandey, Tarun
2013-03-01
We are in the midst of an evolving educational revolution. Use of digital devices such as smartphones and tablet computers is rapidly increasing among radiologists, who now regularly use them for medical, technical, and administrative tasks. These devices provide a wide array of new tools to radiologists, allowing faster, simpler, and more widespread distribution of educational material. The utility, future potential, and limitations of some of these powerful tools are discussed in this article. Published by Elsevier Inc.
Experiences Building an Object-Oriented System in C++
NASA Technical Reports Server (NTRS)
Madany, Peter W.; Campbell, Roy H.; Kougiouris, Panagiotis
1991-01-01
This paper describes tools that we built to support the construction of an object-oriented operating system in C++. The tools provide the automatic deletion of unwanted objects, first-class classes, dynamically loadable classes, and class-oriented debugging. As a consequence of our experience building Choices, we advocate these features as useful, simplifying and unifying many aspects of system programming.
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Nemeth, Michael P.; Hilburger, Mark W.
2004-01-01
A technology review and assessment of modeling and analysis efforts underway in support of a safe return to flight of the thermal protection system (TPS) for the Space Shuttle external tank (ET) are summarized. This review and assessment effort focuses on the structural modeling and analysis practices employed for ET TPS foam design and analysis and on identifying analysis capabilities needed in the short term and long term. The current understanding of the relationship between complex flight environments and ET TPS foam failure modes is reviewed as it relates to modeling and analysis. A literature review on modeling and analysis of TPS foam material systems is also presented. Finally, a review of modeling and analysis tools employed in the Space Shuttle Program is presented for the ET TPS acreage and close-out foam regions. This review includes existing simplified engineering analysis tools as well as finite element analysis procedures.
BEASTling: A software tool for linguistic phylogenetics using BEAST 2.
Maurits, Luke; Forkel, Robert; Kaiping, Gereon A; Atkinson, Quentin D
2017-01-01
We present a new open source software tool called BEASTling, designed to simplify the preparation of Bayesian phylogenetic analyses of linguistic data using the BEAST 2 platform. BEASTling transforms comparatively short and human-readable configuration files into the XML files used by BEAST to specify analyses. By taking advantage of Creative Commons-licensed data from the Glottolog language catalog, BEASTling allows the user to conveniently filter datasets using names for recognised language families, to impose monophyly constraints so that inferred language trees are backward compatible with Glottolog classifications, or to assign geographic location data to languages for phylogeographic analyses. Support for the emerging cross-linguistic linked data format (CLDF) permits easy incorporation of data published in cross-linguistic linked databases into analyses. BEASTling is intended to make the power of Bayesian analysis more accessible to historical linguists without strong programming backgrounds, in the hopes of encouraging communication and collaboration between those developing computational models of language evolution (who are typically not linguists) and relevant domain experts.
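For orientation, a configuration of the short, human-readable kind described above might look as follows; the section and key names are assumptions for illustration, not a verified schema, so the BEASTling documentation should be consulted for the actual format.

```python
# Hypothetical INI-style configuration of the kind BEASTling consumes;
# section and key names here are illustrative assumptions, not a
# verified schema; consult the BEASTling documentation for the actual
# format.
EXAMPLE_CONFIG = """
[admin]
basename = austronesian_analysis

[languages]
families = Austronesian

[model cognates]
model = covarion
data = cognates.csv
"""

with open("austronesian.conf", "w") as fh:
    fh.write(EXAMPLE_CONFIG)
# BEASTling would then compile such a file into BEAST 2 XML, e.g.:
#   beastling austronesian.conf
```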
The Technical Work Plan Tracking Tool
NASA Technical Reports Server (NTRS)
Chullen, Cinda; Leighton, Adele; Weller, Richard A.; Woodfill, Jared; Parkman, William E.; Ellis, Glenn L.; Wilson, Marilyn M.
2003-01-01
The Technical Work Plan Tracking Tool is a web-based application that enables interactive communication and approval of contract requirements that pertain to the administration of the Science, Engineering, Analysis, and Test (SEAT) contract at Johnson Space Center. The implementation of the application has (1) shortened the Technical Work Plan approval process, (2) facilitated writing and documenting requirements in a performance-based environment with associated surveillance plans, (3) simplified the contractor's estimate of the cost for the required work, and (4) allowed the contractor to document how they plan to accomplish the work. The application is accessible to over 300 designated NASA and contractor employees via two Web sites. For each employee, the application regulates access according to the employee's authority to enter, view, and/or print out diverse information, including reports, work plans, purchase orders, and financial data. Advanced features of this application include on-line approval capability, automatic e-mail notifications requesting review by subsequent approvers, and security inside and outside the firewall.
How to Boost Engineering Support Via Web 2.0 - Seeds for the Ares Project...and/or Yours?
NASA Technical Reports Server (NTRS)
Scott, David W.
2010-01-01
The Mission Operations Laboratory (MOL) at Marshall Space Flight Center (MSFC) is responsible for the Engineering Support capability for NASA's Ares launch system development. In pursuit of this, MOL is building the Ares Engineering and Operations Network (AEON), a web-based portal intended to provide a seamless interface to support and simplify two critical activities: a) access and analyze Ares manufacturing, test, and flight performance data, with access to Shuttle data for comparison; and b) provide archive storage for engineering instrumentation data to support engineering design, development, and test. A mix of NASA-written and COTS software provides engineering analysis tools. A by-product of using a data portal to access and display data is access to the collaborative tools inherent in a Web 2.0 environment. This paper discusses how Web 2.0 techniques, particularly social media, might be applied to the traditionally conservative and formal engineering support arena. A related paper by the author [1] considers use
Remote visual analysis of large turbulence databases at multiple scales
Pulido, Jesus; Livescu, Daniel; Kanov, Kalin; ...
2018-06-15
The remote analysis and visualization of raw large turbulence datasets is challenging. Current accurate direct numerical simulations (DNS) of turbulent flows generate datasets with billions of points per time-step and several thousand time-steps per simulation. Until recently, the analysis and visualization of such datasets was restricted to scientists with access to large supercomputers. The public Johns Hopkins Turbulence database simplifies access to multi-terabyte turbulence datasets and facilitates the computation of statistics and extraction of features through the use of commodity hardware. In this paper, we present a framework designed around wavelet-based compression for high-speed visualization of large datasets and methods supporting multi-resolution analysis of turbulence. By integrating common technologies, this framework enables remote access to tools available on supercomputers and over 230 terabytes of DNS data over the Web. Finally, the database toolset is expanded by providing access to exploratory data analysis tools, such as wavelet decomposition capabilities and coherent feature extraction.
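A minimal sketch of the wavelet-compression idea underlying such a framework: decompose a field slice, keep only the largest coefficients, and reconstruct a reduced approximation. The wavelet family, decomposition level, and threshold are illustrative assumptions.

```python
import numpy as np
import pywt

# Sketch of wavelet-based compression for a 2-D slice of a turbulence
# field: decompose, zero out small coefficients, reconstruct.
field = np.random.rand(256, 256)              # stand-in for a DNS slice

coeffs = pywt.wavedec2(field, "db4", level=3)
arr, slices = pywt.coeffs_to_array(coeffs)

keep = np.quantile(np.abs(arr), 0.95)         # retain top 5% of coefficients
arr[np.abs(arr) < keep] = 0.0

approx = pywt.waverec2(
    pywt.array_to_coeffs(arr, slices, output_format="wavedec2"), "db4")
```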
Probabilistic risk analysis of building contamination.
Bolster, D T; Tartakovsky, D M
2008-10-01
We present a general framework for probabilistic risk assessment (PRA) of building contamination. PRA provides a powerful tool for the rigorous quantification of risk in contamination of building spaces. A typical PRA starts by identifying relevant components of a system (e.g. ventilation system components, potential sources of contaminants, remediation methods) and proceeds by using available information and statistical inference to estimate the probabilities of their failure. These probabilities are then combined by means of fault-tree analyses to yield probabilistic estimates of the risk of system failure (e.g. building contamination). A sensitivity study of PRAs can identify features and potential problems that need to be addressed with the most urgency. Often PRAs are amenable to approximations, which can significantly simplify the approach. All these features of PRA are presented in this paper via a simple illustrative example, which can be built upon in further studies. The tool presented here can be used to design and maintain adequate ventilation systems to minimize exposure of occupants to contaminants.
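The fault-tree combination step described above reduces, for independent components, to simple product rules; a toy sketch with made-up probabilities:

```python
import math

def p_and(probs):
    """AND gate: all independent components must fail."""
    return math.prod(probs)

def p_or(probs):
    """OR gate: at least one independent component fails."""
    return 1.0 - math.prod(1.0 - p for p in probs)

# Toy example: contamination requires a source release AND failure of
# either the filter or the ventilation control (numbers are made up).
p_release, p_filter, p_control = 0.01, 0.05, 0.02
print(p_and([p_release, p_or([p_filter, p_control])]))   # ~6.9e-4
```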
Villodre, Celia; Rebasa, Pere; Estrada, José Luís; Zaragoza, Carmen; Zapater, Pedro; Mena, Luís; Lluís, Félix
2016-11-01
In a previous study, we found that Physiological and Operative Severity Score for the enUmeration of Mortality and Morbidity (POSSUM) overpredicts morbidity risk in emergency gastrointestinal surgery. Our aim was to find a POSSUM equation adjustment. A prospective observational study was performed on 2,361 patients presenting with a community-acquired gastrointestinal surgical emergency. The first 1,000 surgeries constituted the development cohort, the second 1,000 events were the first validation intramural cohort, and the remaining 361 cases belonged to a second validation extramural cohort. (1) A modified POSSUM equation was obtained. (2) Logistic regression was used to yield a statistically significant equation that included age, hemoglobin, white cell count, sodium and operative severity. (3) A chi-square automatic interaction detector decision tree analysis yielded a statistically significant equation with 4 variables, namely cardiac failure, sodium, operative severity, and peritoneal soiling. A modified POSSUM equation and a simplified scoring system (aLicante sUrgical Community Emergencies New Tool for the enUmeration of Morbidities [LUCENTUM]) are described. Both tools significantly improve prediction of surgical morbidity in community-acquired gastrointestinal surgical emergencies. Copyright © 2016 Elsevier Inc. All rights reserved.
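POSSUM-family scores are logistic-regression equations; a generic sketch of the prediction step follows, with coefficients that are purely illustrative, not the published POSSUM or LUCENTUM values.

```python
import math

def morbidity_risk(intercept: float, coefs: dict, values: dict) -> float:
    """Logistic morbidity prediction of the POSSUM family:
    risk = 1 / (1 + exp(-(b0 + sum(b_i * x_i)))).
    Coefficients here are illustrative placeholders only."""
    lp = intercept + sum(coefs[k] * values[k] for k in coefs)
    return 1.0 / (1.0 + math.exp(-lp))

risk = morbidity_risk(
    intercept=-4.0,
    coefs={"age_decades": 0.30, "wcc_score": 0.25, "op_severity": 0.60},
    values={"age_decades": 7, "wcc_score": 2, "op_severity": 2})
print(round(risk, 3))   # ~0.45 with these made-up inputs
```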
GoPros™ as an underwater photogrammetry tool for citizen science.
Raoult, Vincent; David, Peter A; Dupont, Sally F; Mathewson, Ciaran P; O'Neill, Samuel J; Powell, Nicholas N; Williamson, Jane E
2016-01-01
Citizen science can increase the scope of research in the marine environment; however, it suffers from necessitating specialized training and simplified methodologies that reduce research output. This paper presents a simplified, novel survey methodology for citizen scientists, which combines GoPro imagery and structure from motion to construct an ortho-corrected 3D model of habitats for analysis. Results using a coral reef habitat were compared to surveys conducted with traditional snorkelling methods for benthic cover, holothurian counts, and coral health. Results were comparable between the two methods, and structure from motion allows the results to be analysed off-site for any chosen visual analysis. The GoPro method outlined in this study is thus an effective tool for citizen science in the marine environment, especially for comparing changes in coral cover or volume over time.
Building Efficiency Evaluation and Uncertainty Analysis with DOE's Asset Score Preview
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2016-08-12
Building Energy Asset Score Tool, developed by the U.S. Department of Energy (DOE), is a program to encourage energy efficiency improvement by helping building owners and managers assess a building's energy-related systems independent of operations and maintenance. Asset Score Tool uses a simplified EnergyPlus model to provide an assessment of building systems through minimal user inputs of basic building characteristics. Asset Score Preview is a newly developed option that allows users to assess their building's systems and the potential value of a more in-depth analysis via an even more simplified approach. This methodology provides a preliminary approach to estimating a building's energy efficiency and potential for improvement. This paper provides an overview of the methodology used for the development of Asset Score Preview and the scoring methodology.
Simplified predictive models for CO2 sequestration performance assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mishra, Srikanta; Ganesh, Priya; Schuetter, Jared
CO2 sequestration in deep saline formations is increasingly being considered as a viable strategy for the mitigation of greenhouse gas emissions from anthropogenic sources. In this context, detailed numerical simulation based models are routinely used to understand key processes and parameters affecting pressure propagation and buoyant plume migration following CO2 injection into the subsurface. As these models are data and computation intensive, the development of computationally-efficient alternatives to conventional numerical simulators has become an active area of research. Such simplified models can be valuable assets during preliminary CO2 injection project screening, serve as a key element of probabilistic system assessment modeling tools, and assist regulators in quickly evaluating geological storage projects. We present three strategies for the development and validation of simplified modeling approaches for CO2 sequestration in deep saline formations: (1) simplified physics-based modeling, (2) statistical-learning-based modeling, and (3) reduced-order-method-based modeling. In the first category, a set of full-physics compositional simulations is used to develop correlations for dimensionless injectivity as a function of the slope of the CO2 fractional-flow curve, the variance of layer permeability values, and the nature of the vertical permeability arrangement. The same variables, along with a modified gravity number, can be used to develop a correlation for the total storage efficiency within the CO2 plume footprint. Furthermore, the dimensionless average pressure buildup after the onset of boundary effects can be correlated to dimensionless time, CO2 plume footprint, and the storativity contrast between the reservoir and caprock. In the second category, statistical "proxy models" are developed using the simulation domain described previously with two approaches: (a) a classical Box-Behnken experimental design with a quadratic response surface, and (b) a maximin Latin Hypercube sampling (LHS) based design with a multidimensional kriging metamodel fit. For roughly the same number of simulations, the LHS-based metamodel yields a more robust predictive model, as verified by a k-fold cross-validation approach (with data split into training and test sets) as well as by validation with an independent dataset. In the third category, a reduced-order modeling procedure is utilized that combines proper orthogonal decomposition (POD), for reducing problem dimensionality, with trajectory piecewise linearization (TPWL), in order to represent the system response at new control settings from a limited number of training runs. Significant savings in computational time are observed with reasonable accuracy from the POD-TPWL reduced-order model for both vertical and horizontal well problems, which could be important in the context of history matching, uncertainty quantification, and optimization problems. The simplified-physics and statistical-learning based models are also validated using an uncertainty analysis framework. Reference cumulative distribution functions (CDFs) of key model outcomes (i.e., plume radius and reservoir pressure buildup) generated using a 97-run full-physics simulation are successfully validated against the CDFs from 10,000-sample probabilistic simulations using the simplified models.
The main contribution of this research project is the development and validation of a portfolio of simplified modeling approaches that will enable rapid feasibility and risk assessment for CO2 sequestration in deep saline formations.
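A minimal sketch of the second (statistical-learning) strategy: sample the input space with a Latin Hypercube design, run the simulator at those points, and fit a Gaussian-process "kriging" metamodel as the proxy. The toy simulator and sample sizes are illustrative assumptions.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor

def fake_simulator(x):                     # stand-in for a full-physics run
    return x[:, 0] ** 2 + np.sin(3 * x[:, 1]) + 0.5 * x[:, 2]

sampler = qmc.LatinHypercube(d=3, seed=0)
X = sampler.random(n=60)                   # 60 training "simulations"
y = fake_simulator(X)

proxy = GaussianProcessRegressor(normalize_y=True).fit(X, y)
X_new = sampler.random(n=5)
print(proxy.predict(X_new))                # near-instant proxy predictions
```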
A simplified genetic design for mammalian enamel
Snead, ML; Zhu, D; Lei, YP; Luo, W; Bringas, P.; Sucov, H.; Rauth, RJ; Paine, ML; White, SN
2011-01-01
A biomimetic replacement for tooth enamel is urgently needed because dental caries is the most prevalent infectious disease to affect man. Here, design specifications for an enamel replacement material inspired by Nature are deployed for testing in an animal model. Using genetic engineering we created a simplified enamel protein matrix precursor where only one, rather than dozens of amelogenin isoforms, contributed to enamel formation. Enamel function and architecture were unaltered, but the balance between the competing materials properties of hardness and toughness was modulated. While the other amelogenin isoforms make a modest contribution to optimal biomechanical design, the enamel made with only one amelogenin isoform served as a functional substitute. Where enamel has been lost to caries or trauma a suitable biomimetic replacement material could be fabricated using only one amelogenin isoform, thereby simplifying the protein matrix parameters by one order of magnitude. PMID:21295848
NASA Astrophysics Data System (ADS)
Destefanis, Stefano; Tracino, Emanuele; Giraudo, Martina
2014-06-01
During a mission involving a spacecraft using nuclear power sources (NPS), the consequences to the population induced by an accident have to be taken into account carefully. Part of the study (led by AREVA, with TAS-I as one of the involved parties) was devoted to "Worst Case Scenario Consolidation". In particular, one of the activities carried out by TAS-I had the aim of characterizing the accidental environment (explosion on the launch pad or during launch) and consolidating the requirements given as input to the study. The resulting requirements became inputs for the nuclear power source container design. To do so, TAS-I first carried out an overview of the available technical literature (mostly developed in the frame of the NASA Mercury/Apollo programs) to identify the key parameters to be used for analytical assessment (blast pressure wave; fragment size, speed, and distribution; TNT equivalent of liquid propellant). Then, a simplified Radioss model was set up to verify both the cards needed for blast/fragment impact analysis and the consistency between preliminary results and the available technical literature (Radioss is commonly used to design mine-resistant vehicles by simulating the effect of blasts on structural elements, and it is used in TAS-I for several types of analysis, including land impact, water impact, and fluid-structure interaction). The obtained results (albeit produced by a very simplified model) are encouraging, showing that the analytical tool and the selected key parameters represent a step in the right direction.
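A worked sketch of the TNT-equivalence scaling typically used in such screening analyses, via the Hopkinson-Cranz scaled distance Z = R / W^(1/3); the equivalence factor below is an assumed value for illustration, not one from the study.

```python
# TNT-equivalence scaling sketch: convert propellant mass to an
# equivalent TNT mass, then compute the scaled distance against which
# empirical blast-parameter curves are read. Factor is illustrative.
def scaled_distance(range_m: float, propellant_kg: float,
                    tnt_factor: float = 0.6) -> float:
    """Z in m/kg^(1/3); tnt_factor is an assumed liquid-propellant
    TNT-equivalence fraction, not a value from the study."""
    w_tnt = tnt_factor * propellant_kg
    return range_m / w_tnt ** (1.0 / 3.0)

print(scaled_distance(range_m=100.0, propellant_kg=500000.0))  # ~1.5
```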
NASA Astrophysics Data System (ADS)
van Daal-Rombouts, Petra; Sun, Siao; Langeveld, Jeroen; Bertrand-Krajewski, Jean-Luc; Clemens, François
2016-07-01
Optimisation or real time control (RTC) studies in wastewater systems increasingly require rapid simulations of sewer systems in extensive catchments. To reduce the simulation time, calibrated simplified models are applied, with performance generally judged by the goodness of fit of the calibration. In this research, the performance of three simplified models and a full hydrodynamic (FH) model is compared for two catchments, based on the correct determination of CSO event occurrences and of the total volumes discharged to the surface water. Simplified model M1 consists of a rainfall runoff outflow (RRO) model only. M2 combines the RRO model with a static reservoir model for the sewer behaviour. M3 comprises the RRO model and a dynamic reservoir model. The dynamic reservoir characteristics were derived from FH model simulations. It was found that M2 and M3 are able to describe the sewer behaviour of the catchments, contrary to M1. The preferred model structure depends on the quality of the information (geometrical database and monitoring data) available for the design and calibration of the model. Finally, calibrated simplified models are shown to be preferable to uncalibrated FH models when performing optimisation or RTC studies.
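A minimal sketch of an M2-type static reservoir model: lumped storage filled by runoff inflow, emptied by a fixed pump capacity, and spilling to CSO when full. All volumes and rates below are illustrative.

```python
import numpy as np

def static_reservoir_cso(q_in, storage_max, q_pump, dt=60.0):
    """Static-reservoir sketch: inflow fills a lumped storage, a fixed
    pump capacity empties it, and any excess spills as CSO volume.
    Units: m3 and m3/s; all values are illustrative."""
    storage, cso = 0.0, np.zeros(len(q_in))
    for t, q in enumerate(q_in):
        storage += (q - q_pump) * dt
        storage = max(storage, 0.0)             # pump cannot over-empty
        if storage > storage_max:               # spill the surplus
            cso[t] = storage - storage_max
            storage = storage_max
    return cso

rain_inflow = np.r_[np.full(30, 0.8), np.full(60, 3.0), np.full(30, 0.5)]
events = static_reservoir_cso(rain_inflow, storage_max=5000.0, q_pump=1.0)
print(f"{events.sum():.0f} m3 spilled over {np.count_nonzero(events)} steps")
```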
Linear regression metamodeling as a tool to summarize and present simulation model results.
Jalal, Hawre; Dowd, Bryan; Sainfort, François; Kuntz, Karen M
2013-10-01
Modelers lack a tool to systematically and clearly present complex model results, including those from sensitivity analyses. The objective was to propose linear regression metamodeling as a tool to increase transparency of decision analytic models and better communicate their results. We used a simplified cancer cure model to demonstrate our approach. The model computed the lifetime cost and benefit of 3 treatment options for cancer patients. We simulated 10,000 cohorts in a probabilistic sensitivity analysis (PSA) and regressed the model outcomes on the standardized input parameter values in a set of regression analyses. We used the regression coefficients to describe measures of sensitivity analyses, including threshold and parameter sensitivity analyses. We also compared the results of the PSA to deterministic full-factorial and one-factor-at-a-time designs. The regression intercept represented the estimated base-case outcome, and the other coefficients described the relative parameter uncertainty in the model. We defined simple relationships that compute the average and incremental net benefit of each intervention. Metamodeling produced outputs similar to traditional deterministic 1-way or 2-way sensitivity analyses but was more reliable since it used all parameter values. Linear regression metamodeling is a simple, yet powerful, tool that can assist modelers in communicating model characteristics and sensitivity analyses.
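A minimal sketch of the metamodeling step on synthetic PSA output: regress the outcome on standardized inputs, so that the intercept estimates the base case and each coefficient estimates the outcome change per one-standard-deviation move in a parameter.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic stand-in for PSA output: 10,000 draws of 3 standardized
# parameters and a noisy outcome (e.g., net benefit).
rng = np.random.default_rng(0)
params = rng.normal(size=(10_000, 3))          # standardized PSA draws
outcome = (5.0 + 2.0 * params[:, 0] - 0.7 * params[:, 1]
           + 0.1 * params[:, 2] + rng.normal(scale=0.5, size=10_000))

meta = LinearRegression().fit(params, outcome)
print("base case ~", meta.intercept_)          # ~5.0
print("1-SD sensitivities:", meta.coef_)       # ~[2.0, -0.7, 0.1]
```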
The principle of commonality and its application to the Space Station Freedom Program
NASA Technical Reports Server (NTRS)
Hopson, George D.; Thomas, L. Dale; Daniel, Charles C.
1989-01-01
The principle of commonality has achieved wide application in the communication, automotive, and aircraft industries. By the use of commonality, component development costs are minimized, logistics are simplified, and the investment costs of spares inventory are reduced. With space systems, which must be maintained and repaired in orbit, the advantages of commonality are compounded. Transportation of spares is expensive, on-board storage volume for spares is limited, and crew training and special tools needed for maintenance and repair are significant considerations. This paper addresses the techniques being formulated to realize the benefits of commonality in the design of the systems and elements of the Space Station Freedom Program, and include the criteria for determining the extent of commonality to be implemented.
Controlling molecular transport in minimal emulsions
NASA Astrophysics Data System (ADS)
Gruner, Philipp; Riechers, Birte; Semin, Benoît; Lim, Jiseok; Johnston, Abigail; Short, Kathleen; Baret, Jean-Christophe
2016-01-01
Emulsions are metastable dispersions in which molecular transport is a major mechanism driving the system towards its state of minimal energy. Determining the underlying mechanisms of molecular transport between droplets is challenging due to the complexity of a typical emulsion system. Here we introduce the concept of `minimal emulsions', which are controlled emulsions produced using microfluidic tools, simplifying an emulsion down to its minimal set of relevant parameters. We use these minimal emulsions to unravel the fundamentals of transport of small organic molecules in water-in-fluorinated-oil emulsions, a system of great interest for biotechnological applications. Our results are of practical relevance to guarantee a sustainable compartmentalization of compounds in droplet microreactors and to design new strategies for the dynamic control of droplet compositions.
Computerized engineering logic for procurement and dedication processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tulay, M.P.
1996-12-31
This paper summarizes the work performed for designing the system and especially for calculating on-line expected performance, and gives some significant results. In an attempt to better meet the needs of operations and maintenance organizations, many nuclear utility procurement engineering groups have simplified their procedures, developed on-line tools for performing the specification of replacement items, and developed relational databases containing part-level information necessary to automate the procurement process. Although these improvements have helped to reduce the engineering necessary to properly specify and accept/dedicate items for nuclear safety-related applications, a number of utilities have recognized that additional long-term savings can be realized by integrating a computerized logic to assist technical procurement engineering personnel.
ProForma: A Standard Proteoform Notation
DOE Office of Scientific and Technical Information (OSTI.GOV)
LeDuc, Richard D.; Schwämmle, Veit; Shortreed, Michael R.
The Consortium for Top-Down Proteomics (CTDP) proposes a standardized notation, ProForma, for writing the sequence of fully characterized proteoforms. ProForma provides a means to communicate any proteoform by writing the amino acid sequence using standard one-letter notation and specifying modifications or unidentified mass shifts within brackets following certain amino acids. The notation is unambiguous, human readable, and can easily be parsed and written by bioinformatic tools. This system uses seven rules and supports a wide range of possible use cases, ensuring compatibility and reproducibility of proteoform annotations. Standardizing proteoform sequences will simplify storage, comparison, and reanalysis of proteomic studies, and the Consortium welcomes input and contributions from the research community on the continued design and maintenance of this standard.
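A toy parser for the bracket notation described above (one-letter residues, each optionally followed by a bracketed modification or mass shift); the example string is hypothetical, and the full ProForma specification has additional rules, e.g., for terminal modifications.

```python
import re

# Toy parser for the bracket notation: one-letter amino acids, each
# optionally followed by [modification] or [mass shift]. Illustrative
# only; not a complete implementation of the seven ProForma rules.
TOKEN = re.compile(r"([A-Z])(?:\[([^\]]+)\])?")

def parse_proforma(seq: str):
    """Return a list of (residue, modification-or-None) pairs."""
    return [(aa, mod or None) for aa, mod in TOKEN.findall(seq)]

print(parse_proforma("EMS[+79.9663]EVEEK"))
# [('E', None), ('M', None), ('S', '+79.9663'), ('E', None), ...]
```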
Generation of single- and two-mode multiphoton states in waveguide QED
NASA Astrophysics Data System (ADS)
Paulisch, V.; Kimble, H. J.; Cirac, J. I.; González-Tudela, A.
2018-05-01
Single- and two-mode multiphoton states are the cornerstone of many quantum technologies, e.g., metrology. In the optical regime, these states are generally obtained combining heralded single photons with linear optics tools and post-selection, leading to inherent low success probabilities. In a recent paper [A. González-Tudela et al., Phys. Rev. Lett. 118, 213601 (2017), 10.1103/PhysRevLett.118.213601], we design several protocols that harness the long-range atomic interactions induced in waveguide QED to improve fidelities and protocols of single-mode multiphoton emission. Here, we give full details of these protocols, revisit them to simplify some of their requirements, and also extend them to generate two-mode multiphoton states, such as Yurke or NOON states.
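For reference, the two-mode NOON state mentioned above has the standard form (a worked definition, not a formula taken from the paper's protocols):

```latex
\[
  |\mathrm{NOON}\rangle \;=\; \frac{1}{\sqrt{2}}
  \Bigl( |N\rangle_a |0\rangle_b \;+\; e^{i\phi}\, |0\rangle_a |N\rangle_b \Bigr).
\]
```

Its interferometric phase sensitivity scales as Δφ ∝ 1/N (the Heisenberg limit), compared with 1/√N for uncorrelated photons, which is why such states matter for metrology.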
Dyslexia and Fonts: Is a Specific Font Useful?
Bachmann, Christina; Mengheri, Lauro
2018-01-01
Nowadays, several books published in different fonts advertised as being particularly suitable for dyslexics are available on the market. Our research aimed to assess the significance of a specific reading font especially designed for dyslexia, called EasyReading™. The performances of good readers and dyslexics were compared. Fourth grade primary school students (533 students in total) were assessed based on reading tasks presented with two different layouts: the popular Times New Roman and EasyReading™, in order to investigate whether children’s performances were influenced by the fonts used. The results of the study were both statistically and clinically significant, proving that EasyReading™ can be considered a compensating tool for readers with dyslexia, and a simplifying font for all categories of readers. PMID:29757944
Gromita: a fully integrated graphical user interface to gromacs 4.
Sellis, Diamantis; Vlachakis, Dimitrios; Vlassi, Metaxia
2009-09-07
Gromita is a fully integrated and efficient graphical user interface (GUI) to the recently updated molecular dynamics suite Gromacs, version 4. Gromita is a cross-platform, perl/tcl-tk based, interactive front end designed to break the command line barrier and introduce a new user-friendly environment to run molecular dynamics simulations through Gromacs. Our GUI features a novel workflow interface that guides the user through each logical step of the molecular dynamics setup process, making it accessible to both advanced and novice users. This tool provides a seamless interface to the Gromacs package, while providing enhanced functionality by speeding up and simplifying the task of setting up molecular dynamics simulations of biological systems. Gromita can be freely downloaded from http://bio.demokritos.gr/gromita/.
Simplified power processing for ion-thruster subsystems
NASA Technical Reports Server (NTRS)
Wessel, F. J.; Hancock, D. J.
1983-01-01
Compared to chemical propulsion, ion propulsion offers distinct payload-mass increases for many future low-thrust earth-orbital and deep-space missions. Despite this advantage, the high initial cost and complexity of ion-propulsion subsystems reduce their attractiveness for most present and near-term spacecraft missions. Investigations have, therefore, been conducted with the objective to attempt to simplify the power-processing unit (PPU), which is the single most complex and expensive component in the thruster subsystem. The present investigation is concerned with a program to simplify the design of the PPU employed in a 8-cm mercury-ion-thruster subsystem. In this program a dramatic simplification in the design of the PPU could be achieved, while retaining essential thruster control and subsystem operational flexibility.
NASA Astrophysics Data System (ADS)
Şahin, Rıdvan; Zhang, Hong-yu
2018-03-01
The induced Choquet integral is a powerful tool for dealing with imprecise or uncertain information. This study proposes a combination of the induced Choquet integral and neutrosophic information. We first give the operational properties of simplified neutrosophic numbers (SNNs). Then, we develop some new information aggregation operators, including an induced simplified neutrosophic correlated averaging (I-SNCA) operator and an induced simplified neutrosophic correlated geometric (I-SNCG) operator. These operators not only consider the importance of elements or their ordered positions, but also take into account the interaction phenomena among decision criteria or their ordered positions under multiple decision-makers. Moreover, we present a detailed analysis of the I-SNCA and I-SNCG operators, including the properties of idempotency, commutativity, and monotonicity, and study the relationships among the proposed operators and existing simplified neutrosophic aggregation operators. In order to handle multi-criteria group decision-making (MCGDM) situations where the weights of criteria and decision-makers are usually correlative and the criterion values are given as SNNs, an approach is established based on the I-SNCA operator. Finally, a numerical example is presented to demonstrate the proposed approach and to verify its effectiveness and practicality.
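For reference, the standard discrete Choquet integral that underlies such correlated aggregation operators (the SNN-specific formulas in the paper extend it) is:

```latex
\[
  C_\mu(f) \;=\; \sum_{i=1}^{n} \bigl[ f(x_{(i)}) - f(x_{(i-1)}) \bigr]\,
  \mu\!\bigl(A_{(i)}\bigr), \qquad f(x_{(0)}) := 0,
\]
```

where (·) is a permutation ordering the arguments so that f(x_(1)) ≤ ... ≤ f(x_(n)), A_(i) = {x_(i), ..., x_(n)}, and μ is the fuzzy measure that captures the interactions among criteria.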
Murphy, D A; O'Keefe, Z H; Kaufman, A H
1999-10-01
A simplified version of the prototype HIV vaccine material was developed through (a) reducing reading grade level, (b) restructuring of the organization and categorization of the material, (c) adding pictures designed to emphasize key concepts, and (d) obtaining feedback on the simplified version through focus groups with the target population. Low-income women at risk for HIV (N = 141) recruited from a primary care clinic were randomly assigned to be presented the standard or the simplified version. There were no significant differences between the groups in terms of education or Vocabulary, Block Design, and Passage Comprehension scores. Women who received the simplified version had significantly higher comprehension scores immediately following presentation of the material than did women who received the standard version and were also significantly more likely to recall study benefits and risks. These findings were maintained at 3-month follow-up. Implications for informed consent are discussed.
Garrett, Sarah B; Murphy, Marie; Wiley, James; Dohan, Daniel
2017-12-01
Replacing standard consent materials with simplified materials is a promising intervention to improve patient comprehension, but there is little evidence on its real-world implementation. We employed a sequential two-arm design to compare the effect of standard versus simplified consent materials on potential donors' understanding of biobank processes and their accrual to an active biobanking program. Participants were female patients of a California breast health clinic. Subjects from the simplified arm answered more items correctly (p = .064), reported "don't know" for fewer items (p = .077), and consented to donate to the biobank at higher rates (p = .025) than those from the standard arm. Replacing an extant consent form with a simplified version is feasible and may benefit patient comprehension and study accrual.
Statistical Issues for Uncontrolled Reentry Hazards
NASA Technical Reports Server (NTRS)
Matney, Mark
2008-01-01
A number of statistical tools have been developed over the years for assessing the risk of reentering objects to human populations. These tools make use of the characteristics (e.g., mass, shape, size) of debris that are predicted by aerothermal models to survive reentry. The statistical tools use this information to compute the probability that one or more of the surviving debris might hit a person on the ground and cause one or more casualties. The statistical portion of the analysis relies on a number of assumptions about how the debris footprint and the human population are distributed in latitude and longitude, and how to use that information to arrive at realistic risk numbers. This inevitably involves assumptions that simplify the problem and make it tractable, but it is often difficult to test the accuracy and applicability of these assumptions. This paper looks at a number of these theoretical assumptions, examining the mathematical basis for the hazard calculations and outlining the conditions under which the simplifying assumptions hold. In addition, this paper outlines some new tools for assessing ground hazard risk in useful ways. This study is also able to make use of a database of known uncontrolled reentry locations measured by the United States Department of Defense. By using data from objects that were in orbit more than 30 days before reentry, sufficient time is allowed for the orbital parameters to become randomized in the manner the models assume. The predicted ground footprint distributions of these objects are based on the theory that their orbits behave essentially like simple Kepler orbits. However, there are a number of factors, including the effects of gravitational harmonics, the effects of the Earth's equatorial bulge on the atmosphere, and the rotation of the Earth and atmosphere, that could cause them to diverge from simple Kepler orbit behavior and change the ground footprints. The measured latitude and longitude distributions of these objects provide data that can be directly compared with the predicted distributions, providing a fundamental empirical test of the model assumptions.
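As a worked example of the simple-Kepler assumption under test: for a circular orbit of inclination i, the fraction of time the sub-satellite point spends at latitude φ follows the standard density

```latex
\[
  p(\varphi) \;=\; \frac{\cos\varphi}{\pi\,\sqrt{\sin^{2} i - \sin^{2}\varphi}},
  \qquad |\varphi| < i,
\]
```

which diverges (integrably) as φ approaches ±i, concentrating expected reentry locations near the extreme latitudes. Binned empirical reentry latitudes can be compared directly against this curve to test the randomization assumption.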
Development of a graphical user interface for sgRNAcas9 and its application.
Zhao, Chang-zhi; Zhang, Yi; Li, Guang-lei; Chen, Ji-liang; Li, Jing-Jin; Ren, Rui-min; Ni, Pan; Zhao, Shu-hong; Xie, Sheng-song
2015-10-01
The CRISPR/Cas9 genome editing technique is a powerful tool for researchers. However, off-target activity of the Cas9 nuclease is a recurrent concern with the CRISPR system, so designing sgRNAs (single guide RNAs) with minimal off-target effects is very important. sgRNAcas9 is a software package that can be used to design sgRNAs and to evaluate potential off-target cleavage sites. In this study, a graphical user interface for sgRNAcas9 was developed using the Java programming language. In addition, the off-target effect of sgRNAs was evaluated according to the number of mismatches and "seed sequence" specification. Moreover, the sgRNAcas9 software was used to design 34,124 sgRNAs targeting 4,691 microRNA (miRNA) precursors from human, mouse, rat, pig, and chicken. In particular, the off-target effect of an sgRNA targeting the human miR-206 precursor was analyzed, and the on/off-target activity of this sgRNA was validated by T7E1 assay in vitro. Taken together, these data show that the interface simplifies the use of the sgRNAcas9 program, which can be used to design sgRNAs for the majority of miRNA precursors. We also found that the GC content of those sgRNAs ranged from 40% to 60%. In summary, the sgRNAcas9 software can easily be used to design sgRNAs with minimal off-target effects for any species. The software can be downloaded from the BiooTools website (http://www.biootools.com/).
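The GC-content observation above corresponds to a simple screen that can be sketched in a few lines of Python; the toy spacer sequences are invented, and this is not sgRNAcas9 code:

```python
# A minimal sketch (not part of sgRNAcas9) of a 40-60% GC screen
# for candidate 20-nt sgRNA spacers.
def gc_percent(seq: str) -> float:
    seq = seq.upper()
    return 100.0 * (seq.count("G") + seq.count("C")) / len(seq)

candidates = ["GACGTTACCGGATTACCGGT", "AAAATTTTAAAATTTTAAAA"]  # toy spacers
keep = [s for s in candidates if 40.0 <= gc_percent(s) <= 60.0]
print(keep)  # only the first spacer (55% GC) passes
```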
Online Remote Sensing Interface
NASA Technical Reports Server (NTRS)
Lawhead, Joel
2007-01-01
BasinTools Module 1 processes remotely sensed raster data, including multi- and hyper-spectral data products, via a Web site with no downloads and no plug-ins required. The interface provides standardized algorithms designed so that a user with little or no remote-sensing experience can use the site. This Web-based approach reduces the amount of software, hardware, and computing power necessary to perform the specified analyses. Access to imagery and derived products is enterprise-level and controlled. Because the user never takes possession of the imagery, the licensing of the data is greatly simplified. BasinTools takes the "just-in-time" inventory control model from commercial manufacturing and applies it to remotely-sensed data. Products are created and delivered on-the-fly with no human intervention, even for casual users. Well-defined procedures can be combined in different ways to extend verified and validated methods in order to derive new remote-sensing products, which improves efficiency in any well-defined geospatial domain. Remote-sensing products produced in BasinTools are self-documenting, allowing procedures to be independently verified or peer-reviewed. The software can be used enterprise-wide to conduct low-level remote sensing, viewing, sharing, and manipulating of image data without the need for desktop applications.
Financial analysis of community-based forest enterprises with the Green Value tool
S. Humphries; Tom Holmes
2016-01-01
The Green Value tool was developed in response to the need for simplified procedures that could be used in the field to conduct financial analysis for community-based forest enterprises (CFEs). Initially our efforts focused on a set of worksheets that could be used by both researchers and CFEs to monitor and analyze costs and income for one production period. The...
NASA Astrophysics Data System (ADS)
Peckham, S. D.; Kelbert, A.; Rudan, S.; Stoica, M.
2016-12-01
Standardized metadata for models is the key to reliable and greatly simplified coupling in model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System). This model metadata also helps model users to understand the important details that underpin computational models and to compare the capabilities of different models. These details include simplifying assumptions about the physics, the governing equations and the numerical methods used to solve them, the discretization of space (the grid) and time (the time-stepping scheme), state variables (input or output), and model configuration parameters. This kind of metadata provides a "deep description" of a computational model that goes well beyond other types of metadata (e.g. author, purpose, scientific domain, programming language, digital rights, provenance, execution) and captures the science that underpins a model. While having this kind of standardized metadata for each model in a repository opens up a wide range of exciting possibilities, it is difficult to collect this information, and a carefully conceived "data model" or schema is needed to store it. Automated harvesting and scraping methods can provide some useful information, but they often result in metadata that is inaccurate or incomplete, which is not sufficient to enable the desired capabilities. In order to address this problem, we have developed a browser-based tool called the MCM Tool (Model Component Metadata) which runs on laptops, tablets and smartphones. This tool was partially inspired by the TurboTax software, which greatly simplifies the necessary task of preparing tax documents. It allows a model developer or advanced user to provide a standardized, deep description of a computational geoscience model, including hydrologic models. Under the hood, the tool uses a new ontology for models built on the CSDMS Standard Names, expressed as a collection of RDF (Resource Description Framework) files. This ontology is based on core concepts such as variables, objects, quantities, operations, processes and assumptions. The purpose of this talk is to present details of the new ontology and then demonstrate the MCM Tool on several hydrologic models.
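To make the "deep description" idea concrete, here is a minimal sketch using the rdflib Python library. The namespace, property names, and model name are hypothetical stand-ins loosely inspired by the CSDMS Standard Names, not the MCM Tool's actual ontology:

```python
from rdflib import Graph, Literal, Namespace, RDF

# Hypothetical "deep" model metadata expressed as RDF triples.
MM = Namespace("http://example.org/model-metadata#")
g = Graph()
g.bind("mm", MM)

model = MM["ExampleHydroModel"]
g.add((model, RDF.type, MM.ComputationalModel))
g.add((model, MM.timeSteppingScheme, Literal("explicit, fixed daily step")))
g.add((model, MM.spatialGrid, Literal("lumped (single cell)")))
g.add((model, MM.assumption, Literal("snowmelt follows a degree-day relation")))
g.add((model, MM.outputVariable,
       Literal("land_surface_water__runoff_volume_flux")))

print(g.serialize(format="turtle"))
```

Because the triples are machine-readable, a coupling framework could query them to check, for example, that two models to be coupled agree on grids and time steps before any code runs.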
Wender, Paul A; Staveness, Daryl
2014-10-03
Bryostatin 1, in clinical trials or preclinical development for cancer, Alzheimer's disease, and a first-of-its-kind strategy for HIV/AIDS eradication, is neither readily available nor optimally suited for clinical use. In preceding work, we disclosed a new class of simplified bryostatin analogs designed for ease of access and tunable activity. Here we describe a final step diversification strategy that provides, in only 25 synthetic steps, simplified and tunable analogs with bryostatin-like PKC modulatory activities.
Design of simplified maximum-likelihood receivers for multiuser CPM systems.
Bing, Li; Bai, Baoming
2014-01-01
A class of simplified maximum-likelihood receivers designed for continuous phase modulation based multiuser systems is proposed. The presented receiver is built upon a front end employing mismatched filters and a maximum-likelihood detector defined in a low-dimensional signal space. The performance of the proposed receivers is analyzed and compared to some existing receivers. Some schemes are designed to implement the proposed receivers and to reveal the roles of different system parameters. Analysis and numerical results show that the proposed receivers can approach the optimum multiuser receivers with significantly (even exponentially in some cases) reduced complexity and marginal performance degradation.
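For orientation, the brute-force joint maximum-likelihood decision that such receivers approximate can be sketched in a few lines. The example below uses a generic linear signal model with an arbitrary signature matrix and BPSK symbols, not CPM; it also makes plain why complexity grows exponentially with the number of users, which is what the proposed simplified receivers avoid:

```python
import itertools
import numpy as np

# Toy joint ML multiuser detection: enumerate all joint symbol hypotheses
# and pick the one closest to the received vector (2^K hypotheses for K users).
rng = np.random.default_rng(1)
A = rng.normal(size=(8, 3))                # signatures of 3 users, 8-dim space
s_true = np.array([1, -1, 1])              # transmitted BPSK symbols
r = A @ s_true + 0.3 * rng.normal(size=8)  # received vector with noise

best = min(itertools.product([-1, 1], repeat=3),
           key=lambda s: np.sum((r - A @ np.array(s)) ** 2))
print("ML decision:", best)                # with high probability (1, -1, 1)
```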
Mohanty, Sujata; Dabas, Jitender; Gupta, Rekha
2015-01-01
Transport distraction is gaining enormous popularity and is becoming a promising option for the reconstruction of mandibular defects. However, the vast number of distraction device designs creates considerable confusion for the clinician in choosing the right one. Considering these complex and costly designs, the authors sought a simplified way of combining a modified conventional reconstruction plate with a monofocal distraction device so that together they can act as a transport distraction device for bridging bony defects. A case treated with this technique and device is presented, along with a description of the device design.
Rapid Prototyping of Hydrologic Model Interfaces with IPython
NASA Astrophysics Data System (ADS)
Farthing, M. W.; Winters, K. D.; Ahmadia, A. J.; Hesser, T.; Howington, S. E.; Johnson, B. D.; Tate, J.; Kees, C. E.
2014-12-01
A significant gulf still exists between the state of practice and the state of the art in hydrologic modeling. Part of this gulf is due to the lack of adequate pre- and post-processing tools for newly developed computational models. The development of user interfaces has traditionally lagged several years behind the development of a particular computational model or suite of models. As a result, models with mature interfaces often lack key advancements in model formulation, solution methods, and/or software design and technology. Part of the problem has been a focus on developing monolithic tools to provide comprehensive interfaces for the entire suite of model capabilities. Such efforts require expertise in software libraries and frameworks for creating user interfaces (e.g., Tcl/Tk, Qt, and MFC). These tools are complex and require significant investment in project resources (time and/or money) to use. Moreover, providing the required features for the entire range of possible applications and analyses creates a cumbersome interface. For a particular site or application, the modeling requirements may be simplified or at least narrowed, which can greatly reduce the number and complexity of options that need to be accessible to the user. However, monolithic tools usually are not adept at dynamically exposing specific workflows. Our approach is to deliver highly tailored interfaces to users. These interfaces may be site- and/or process-specific. As a result, we end up with many customized interfaces rather than a single, general-use tool. For this approach to be successful, it must be efficient to create these tailored interfaces. We need technology for creating quality user interfaces that is accessible and has a low barrier for integration into model development efforts. Here, we present efforts to leverage IPython notebooks as tools for rapid prototyping of site- and application-specific user interfaces. We provide specific examples from applications in near-shore environments as well as levee analysis. We discuss our design decisions and methodology for developing customized interfaces, strategies for delivery of the interfaces to users in various computing environments, as well as implications for the design/implementation of simulation models.
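A minimal sketch of the notebook-interface idea using ipywidgets, to be run in an IPython/Jupyter notebook; the model function and the two parameters it exposes are hypothetical placeholders for a site-specific interface:

```python
# Tailored notebook interface: expose only the parameters relevant
# to a particular site, wired to sliders via ipywidgets.
from ipywidgets import FloatSlider, interact

def run_site_model(roughness=0.03, slope=0.001):
    # Stand-in for a call into the real computational model.
    peak_flow = 42.0 * slope ** 0.5 / roughness  # toy relation, not physics
    print(f"simulated peak flow: {peak_flow:.1f} m^3/s")

interact(run_site_model,
         roughness=FloatSlider(min=0.01, max=0.10, step=0.005, value=0.03),
         slope=FloatSlider(min=0.0001, max=0.01, step=0.0001, value=0.001))
```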
Hand-Held Electronic Gap-Measuring Tools
NASA Technical Reports Server (NTRS)
Sugg, F. E.; Thompson, F. W.; Aragon, L. A.; Harrington, D. B.
1985-01-01
Repetitive measurements simplified by tool based on LVDT operation. With fingers in open position, Gap-measuring tool rests on digital readout instrument. With fingers inserted in gap, separation alters inductance of linear variable-differential transformer in plastic handle. Originally developed for measuring gaps between surface tiles of Space Shuttle orbiter, tool reduces measurement time from 20 minutes per tile to 2 minutes. Also reduces possibility of damage to tiles during measurement. Tool has potential applications in mass production; helps ensure proper gap dimensions in assembly of refrigerator and car doors and also used to measure dimensions of components and to verify positional accuracy of components during progressive assembly operations.
Walsh, John P.; Chih-Yuan Sun, Jerry; Riconscente, Michelle
2011-01-01
Digital technologies can improve student interest and knowledge in science. However, researching the vast number of websites devoted to science education and integrating them into undergraduate curricula is time-consuming. We developed an Adobe ColdFusion– and Adobe Flash–based system for simplifying the construction, use, and delivery of electronic educational materials in science. The Online Multimedia Teaching Tool (OMTT) in Neuroscience was constructed from a ColdFusion-based online interface, which reduced the need for programming skills and the time for curriculum development. The OMTT in Neuroscience was used by faculty to enhance their lectures in existing curricula. Students had unlimited online access to encourage user-centered exploration. We found the OMTT was rapidly adapted by multiple professors, and its use by undergraduate students was consistent with the interpretation that the OMTT improved performance on exams and increased interest in the field of neuroscience. PMID:21885826
Simplifying operations with an uplink/downlink integration toolkit
NASA Technical Reports Server (NTRS)
Murphy, Susan C.; Miller, Kevin J.; Guerrero, Ana Maria; Joe, Chester; Louie, John J.; Aguilera, Christine
1994-01-01
The Operations Engineering Lab (OEL) at JPL has developed a simple, generic toolkit to integrate the uplink/downlink processes, (often called closing the loop), in JPL's Multimission Ground Data System. This toolkit provides capabilities for integrating telemetry verification points with predicted spacecraft commands and ground events in the Mission Sequence Of Events (SOE) document. In the JPL ground data system, the uplink processing functions and the downlink processing functions are separate subsystems that are not well integrated because of the nature of planetary missions with large one-way light times for spacecraft-to-ground communication. Our new closed-loop monitoring tool allows an analyst or mission controller to view and save uplink commands and ground events with their corresponding downlinked telemetry values regardless of the delay in downlink telemetry and without requiring real-time intervention by the user. An SOE document is a time-ordered list of all the planned ground and spacecraft events, including all commands, sequence loads, ground events, significant mission activities, spacecraft status, and resource allocations. The SOE document is generated by expansion and integration of spacecraft sequence files, ground station allocations, navigation files, and other ground event files. This SOE generation process has been automated within the OEL and includes a graphical, object-oriented SOE editor and real-time viewing tool running under X/Motif. The SOE toolkit was used as the framework for the integrated implementation. The SOE is used by flight engineers to coordinate their operations tasks, serving as a predict data set in ground operations and mission control. The closed-loop SOE toolkit allows simple, automated integration of predicted uplink events with correlated telemetry points in a single SOE document for on-screen viewing and archiving. It automatically interfaces with existing real-time or non real-time sources of information, to display actual values from the telemetry data stream. This toolkit was designed to greatly simplify the user's ability to access and view telemetry data, and also provide a means to view this data in the context of the commands and ground events that are used to interpret it. A closed-loop system can prove especially useful in small missions with limited resources requiring automated monitoring tools. This paper will discuss the toolkit implementation, including design trade-offs and future plans for enhancing the automated capabilities.
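The closed-loop merge at the heart of this toolkit can be sketched schematically: predicted SOE entries are keyed by verification point, and downlinked telemetry is attached whenever it arrives, regardless of light-time delay. The Python sketch below is a toy illustration with invented field names, not the OEL implementation:

```python
# Toy closed-loop SOE: predicted events keyed by verification point,
# with actual telemetry values filled in as packets arrive (possibly late).
soe = {
    "CMD_0042": {"time": "2024-001T10:00:00", "event": "HGA slew start",
                 "predicted": "ON", "actual": None},
    "CMD_0043": {"time": "2024-001T10:05:00", "event": "heater enable",
                 "predicted": "ON", "actual": None},
}

def ingest_telemetry(soe, packets):
    """Attach downlinked values to their SOE verification points."""
    for vp_id, value in packets:
        if vp_id in soe:
            soe[vp_id]["actual"] = value

ingest_telemetry(soe, [("CMD_0043", "ON")])  # telemetry may arrive hours late
for vp, row in soe.items():
    status = "ok" if row["actual"] == row["predicted"] else "pending/mismatch"
    print(vp, row["event"], "->", status)
```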
NASA Technical Reports Server (NTRS)
2006-01-01
Topics covered include: Airport Remote Tower Sensor Systems; Implantable Wireless MEMS Sensors for Medical Uses; Embedded Sensors for Measuring Surface Regression; Coordinating an Autonomous Earth-Observing Sensorweb; Range-Measuring Video Sensors; Stability Enhancement of Polymeric Sensing Films Using Fillers; Sensors for Using Times of Flight to Measure Flow Velocities; Receiver Would Control Phasing of a Phased-Array Antenna; Modern Design of Resonant Edge-Slot Array Antennas; Carbon-Nanotube Schottky Diodes; Simplified Optics and Controls for Laser Communications; Coherent Detection of High-Rate Optical PPM Signals; Multichannel Phase and Power Detector; Using Satellite Data in Weather Forecasting: I; Using Dissimilarity Metrics to Identify Interesting Designs; X-Windows PVT Widget Class; Shuttle Data Center File-Processing Tool in Java; Statistical Evaluation of Utilization of the ISS; Nanotube Dispersions Made With Charged Surfactant; Aerogels for Thermal Insulation of Thermoelectric Devices; Low-Density, Creep-Resistant Single-Crystal Superalloys; Excitations for Rapidly Estimating Flight-Control Parameters; Estimation of Stability and Control Derivatives of an F-15; Tool for Coupling a Torque Wrench to a Round Cable Connector; Ultrasonically Actuated Tools for Abrading Rock Surfaces; Active Struts With Variable Spring Stiffness and Damping; Multiaxis, Lightweight, Computer-Controlled Exercise System; Dehydrating and Sterilizing Wastes Using Supercritical CO2; Alpha-Voltaic Sources Using Liquid Ga as Conversion Medium; Ice-Borehole Probe; Alpha-Voltaic Sources Using Diamond as Conversion Medium; White-Light Whispering-Gallery-Mode Optical Resonators; Controlling Attitude of a Solar-Sail Spacecraft Using Vanes; and Wire-Mesh-Based Sorber for Removing Contaminants from Air.
Bim and Gis: when Parametric Modeling Meets Geospatial Data
NASA Astrophysics Data System (ADS)
Barazzetti, L.; Banfi, F.
2017-12-01
Geospatial data play a crucial role in many projects related to infrastructure and land management. GIS software can perform advanced geospatial analyses but lacks many of the parametric modelling tools typical of BIM. At the same time, BIM software designed for buildings has limited tools for handling geospatial data. At present, BIM and GIS appear to be complementary solutions, although research is under way to ensure a better level of interoperability, especially at the scale of the building. The transition from the local (building) scale to the infrastructure scale (where geospatial data cannot be neglected) has already demonstrated that parametric modelling integrated with geoinformation is a powerful tool to simplify and speed up some phases of the design workflow. This paper reviews such mixed approaches with both simulated and real examples, demonstrating that integration is already a reality at specific scales that are not dominated by "pure" GIS or BIM. The paper also demonstrates that some traditional GIS operations are available in parametric modelling software for BIM, such as transformation between reference systems, DEM generation, feature extraction, and geospatial queries. A real case study is illustrated and discussed to show the advantage of a combined use of both technologies. BIM and GIS integration can foster greater use of geospatial data in the AECOO (Architecture, Engineering, Construction, Owner and Operator) industry, as well as new solutions for parametric modelling with additional geoinformation.
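One of the traditional GIS operations listed above, transformation between reference systems, can be sketched with the pyproj library; the coordinates and the UTM zone below are illustrative assumptions:

```python
# Transform building coordinates from WGS84 geographic coordinates
# to UTM zone 32N (a plausible zone for a northern-Italy case study).
from pyproj import Transformer

to_utm = Transformer.from_crs("EPSG:4326", "EPSG:32632", always_xy=True)
lon, lat = 9.19, 45.46            # example point near Milan
x, y = to_utm.transform(lon, lat)
print(f"UTM 32N: E={x:.1f} m, N={y:.1f} m")
```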
Next-generation confirmatory disease diagnostics
NASA Astrophysics Data System (ADS)
Lin, Robert; Gerver, Rachel; Karns, Kelly; Apori, Akwasi A.; Denisin, Aleksandra K.; Herr, Amy E.
2014-06-01
Microfluidic tools are advancing capabilities in screening diagnostics for use in near-patient settings. Here, we review three case studies to illustrate the flexibility and analytical power offered by microanalytical tools. We first overview a near-patient tool for detection of protein markers found in cerebrospinal fluid (CSF), as a means to identify the presence of cerebrospinal fluid in nasal mucus, an indication that CSF is leaking into the nasal cavity. Microfluidic design allowed integration of several upstream preparatory steps and rapid, specific completion of the human CSF protein assay. Second, we overview a tear-fluid-based assay for lactoferrin, a protein produced in the lacrimal gland and then secreted into tear fluid. Tear Lf is a putative biomarker for primary Sjögren's syndrome (SS). A critical contribution of this and related work is the measurement of Lf, even in light of well-known and significant matrix interactions and losses during tear fluid collection and preparation. Lastly, we review a microfluidic barcode platform that enables rapid measurement of multiple infectious disease biomarkers in human sera. The assay presents a new approach to multiplexed biomarker detection, yet in a simple straight microchannel, thus providing a streamlined, simplified microanalytical platform, as is relevant to robust operation in diagnostic settings. We view microfluidic design and analytical chemistry as the basis for emerging, sophisticated assays that will advance not just screening diagnostic technology, but confirmatory assays, sample preparation and handling, and thus the introduction and utilization of new biomarkers and assay formats.
Simplified models for Higgs physics: singlet scalar and vector-like quark phenomenology
Dolan, Matthew J.; Hewett, J. L.; Krämer, M.; ...
2016-07-08
Simplified models provide a useful tool to conduct the search and exploration of physics beyond the Standard Model in a model-independent fashion. In this study, we consider the complementarity of indirect searches for new physics in Higgs couplings and distributions with direct searches for new particles, using a simplified model which includes a new singlet scalar resonance and vector-like fermions that can mix with the SM top quark. We fit this model to the combined ATLAS and CMS 125 GeV Higgs production and coupling measurements and other precision electroweak constraints, and explore in detail the effects of the new matter content upon Higgs production and kinematics. Finally, we highlight some novel features and decay modes of the top partner phenomenology, and discuss prospects for Run II.
SIRTF Science Operations System Design
NASA Technical Reports Server (NTRS)
Green, William
1999-01-01
The Space Infrared Telescope Facility (SIRTF) will be launched in December 2001, and perform an extended series of science observations at wavelengths ranging from 20 to 160 microns for five years or more. The California Institute of Technology has been selected as the home for the SIRTF Science Center (SSC). The SSC will be responsible for evaluating and selecting observation proposals, providing technical support to the science community, performing mission planning and science observation scheduling activities, instrument calibration during operations and instrument health monitoring, production of archival quality data products, and management of science research grants. The science payload consists of three instruments delivered by instrument Principal Investigators located at University of Arizona, Cornell, and Harvard Smithsonian Astrophysical Observatory. The SSC is responsible for design, development, and operation of the Science Operations System (SOS) which will support the functions assigned to the SSC by NASA. The SIRTF spacecraft, mission profile, and science instrument design have undergone almost ten years of refinement. SIRTF development and operations activities are highly cost constrained. The cost constraints have impacted the design of the SOS in several ways. The Science Operations System has been designed to incorporate a set of efficient, easy to use tools which will make it possible for scientists to propose observation sequences in a rapid and automated manner. The use of highly automated tools for requesting observations will simplify the long range observatory scheduling process, and the short term scheduling of science observations. Pipeline data processing will be highly automated and data-driven, utilizing a variety of tools developed at JPL, the instrument development teams, and Space Telescope Science Institute to automate processing. An incremental ground data system development approach has been adopted, featuring periodic deliveries that are validated with the flight hardware throughout the various phases of system level development and testing. This approach minimizes development time and decreases operations risk. This paper will describe the top level architecture of the SOS and the basic design concepts. A summary of the incremental development approach will be presented. Examples of the unique science user tools now under final development prior to the first proposal call scheduled for mid-2000 will be shown.
Soli, Sigfrid D; Zheng, Yun; Meng, Zhaoli; Li, Gang
2012-09-01
The purpose of this study was to develop a practical means for clinical evaluation of early pediatric language development by establishing developmental trajectories for receptive and expressive vocabulary growth in children between 6 and 32 months of age using a simple, time-efficient assessment tool. Simplified short-form versions of the Words and Gestures and Words and Sentences vocabulary inventories in the Mandarin Communicative Development Inventory [1] were developed and used to assess early language development in developmentally normal children from 6 to 32 months of age during routine health checks. Developmental trajectories characterizing the rate of receptive and expressive vocabulary growth between 6 and 32 months of age are reported. These trajectories allow the equivalent age corresponding to a score to be determined after a brief structured interview with the child's parents that can be conducted in a busy clinical setting. The simplified short forms of the Mandarin Communicative Development Inventories can serve as a clinically useful tool to assess early child language development, providing a practical means of objectively assessing early language development following early interventions to treat young children with hearing impairment as well as speech and language delays. Objective evidence of language development is essential for achieving effective (re)habilitation outcomes.
NASA Technical Reports Server (NTRS)
Carpenter, James R.; Berry, Kevin; Gregpru. Late; Speckman, Keith; Hur-Diaz, Sun; Surka, Derek; Gaylor, Dave
2010-01-01
The Orbit Determination Toolbox is an orbit determination (OD) analysis tool based on MATLAB and Java that provides a flexible way to do early mission analysis. The toolbox is primarily intended for advanced mission analysis such as might be performed in concept exploration, proposal, early design phase, or rapid design center environments. The emphasis is on flexibility, but it has enough fidelity to produce credible results. Insight into all flight dynamics source code is provided. MATLAB is the primary user interface and is used for piecing together measurement and dynamic models. The Java Astrodynamics Toolbox is used as an engine for things that might be slow or inefficient in MATLAB, such as high-fidelity trajectory propagation, lunar and planetary ephemeris look-ups, precession, nutation, polar motion calculations, ephemeris file parsing, and the like. The primary analysis functions are sequential filter/smoother and batch least-squares commands that incorporate Monte-Carlo data simulation, linear covariance analysis, measurement processing, and plotting capabilities at the generic level. These functions have a user interface that is based on that of the MATLAB ODE suite. To perform a specific analysis, users write MATLAB functions that implement truth and design system models. The user provides his or her models as inputs to the filter commands. The software provides a capability to publish and subscribe to a software bus that is compliant with the NASA Goddard Mission Services Evolution Center (GMSEC) standards, to exchange data with other flight dynamics tools to simplify the flight dynamics design cycle. Using the publish and subscribe approach allows for analysts in a rapid design center environment to seamlessly incorporate changes in spacecraft and mission design into navigation analysis and vice versa.
Simplified method for the transverse bending analysis of twin celled concrete box girder bridges
NASA Astrophysics Data System (ADS)
Chithra, J.; Nagarajan, Praveen; S, Sajith A.
2018-03-01
Box girder bridges are one of the best options for spans greater than 25 m. For the study of these bridges, three-dimensional finite element analysis is the best-suited method. However, performing three-dimensional analysis for routine design is difficult and time consuming, and the software used for it is expensive. Hence designers resort to simplified analyses for predicting longitudinal and transverse bending moments. Among the many analytical methods used to find transverse bending moments, simplified frame analysis (SFA) is the simplest and the most widely used in design offices. Results from SFA can be used for the preliminary analysis of concrete box girder bridges. From a review of the literature, it is found that the majority of work using SFA is restricted to the analysis of single-cell box girder bridges; little has been done on multi-cell concrete box girder bridges. In the present study, a double-cell concrete box girder bridge is chosen. The bridge is modelled using three-dimensional finite element software and the results are compared with those from SFA. The study focuses on establishing correction factors for transverse bending moment values obtained from SFA.
Toward a Scalable Visualization System for Network Traffic Monitoring
NASA Astrophysics Data System (ADS)
Malécot, Erwan Le; Kohara, Masayoshi; Hori, Yoshiaki; Sakurai, Kouichi
With the multiplication of attacks against computer networks, system administrators must carefully monitor the traffic exchanged by the networks they manage. That monitoring task is increasingly laborious because of the growing amount of data to analyze, a trend that will intensify with the explosion in the number of devices connected to computer networks and the global rise in available network bandwidth. System administrators therefore rely heavily on automated tools to assist them and to simplify the analysis of the data. Yet these tools provide limited support and, most of the time, require highly skilled operators. Recently, some research teams have begun to study the application of visualization techniques to the analysis of network traffic data. We believe this approach can also help system administrators deal with the large amount of data they must process. In this paper, we introduce a tool for network traffic monitoring using visualization techniques, developed to assist the system administrators of our corporate network. We explain how we designed the tool and the choices we made regarding the visualization techniques. The resulting tool offers two linked representations of network traffic and activity, one in 2D and the other in 3D. Because 2D and 3D visualization techniques have different strengths, we combined them in our tool to take advantage of their complementarity. We finally tested the tool in order to evaluate the accuracy of our approach.
A Simplified Decision Support Approach for Evaluating Wetlands Ecosystem Services NABS11
State-level managers and environmental advocates often must justify their restoration actions in terms of tangible beneficial outcomes. Wetlands functional assessment tools (e.g, Wetland Evaluation Technique (WET), Habitat Evaluation Procedures (HEP), Hydrogeomorphic Method (HGM)...
NASA Astrophysics Data System (ADS)
Schuster, J. C.
2017-08-01
The tablet-based software docu-tools digitizes the documentation of buildings, simplifying construction and facility management as well as data analysis in building and construction-history research. The software is plan-based: `pins' can be set on a plan to record data (images, audio, text, etc.), with each data point carrying a time and date stamp. Once a pin is set and information recorded, it can never be deleted from the system, creating clear, uncontestable documentation. Reports on any or all recorded data can be generated immediately through various templates in order to share, document, analyze, and archive the information gathered. The software both digitizes building condition assessment and simplifies the fully documented management and solving of problems and the monitoring of a building. Used both in the construction industry and for documenting and analyzing historic buildings, docu-tools is a versatile and flexible tool that has become integral to my work as a building historian engaged in the conservation and curation of the historic built environment in Europe. I used the software at Boughton House, Northamptonshire, UK, during a one-year research project into the construction history of the building. The details of how docu-tools was used during this project are discussed in this paper.
Kralisch, Dana; Streckmann, Ina; Ott, Denise; Krtschil, Ulich; Santacesaria, Elio; Di Serio, Martino; Russo, Vincenzo; De Carlo, Lucrezia; Linhart, Walter; Christian, Engelbert; Cortese, Bruno; de Croon, Mart H J M; Hessel, Volker
2012-02-13
The simple transfer of established chemical production processes from batch to flow chemistry does not automatically result in more sustainable ones. Detailed process understanding and the motivation to scrutinize known process conditions are necessary factors for success. Although the focus is usually "only" on intensifying transport phenomena to operate under intrinsic kinetics, there is also a large intensification potential in chemistry under harsh conditions and in the specific design of flow processes. Such an understanding and proposed processes are required at an early stage of process design, because decisions on the best-suited tools and parameters required to convert green engineering concepts into practice (typically with little chance of substantial changes later) are made during this period. Herein, we present a holistic and interdisciplinary process design approach that combines the concept of novel process windows with process modeling, simulation, and simplified cost and lifecycle assessment for the deliberate development of a cost-competitive and environmentally sustainable alternative to an existing production process for epoxidized soybean oil.
SModelS v1.1 user manual: Improving simplified model constraints with efficiency maps
NASA Astrophysics Data System (ADS)
Ambrogi, Federico; Kraml, Sabine; Kulkarni, Suchita; Laa, Ursula; Lessa, Andre; Magerl, Veronika; Sonneveld, Jory; Traub, Michael; Waltenberger, Wolfgang
2018-06-01
SModelS is an automated tool for the interpretation of simplified model results from the LHC. It decomposes models of new physics obeying a Z2 symmetry into simplified model components and compares these against a large database of experimental results. The first release of SModelS, v1.0, used only the cross section upper limit maps provided by the experimental collaborations. In this new release, v1.1, we extend the functionality of SModelS to efficiency maps. This increases the constraining power of the software, as efficiency maps allow contributions to the same signal region from different simplified models to be combined. Other new features of version 1.1 include likelihood and χ2 calculations, extended information on topology coverage, an extended database of experimental results, and major speed upgrades for both the code and the database. We describe in detail the concepts and procedures used in SModelS v1.1, explaining in particular how upper limit and efficiency map results are handled in parallel. Detailed instructions for code usage are also provided.
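The gain from efficiency maps can be illustrated schematically: contributions of several simplified-model topologies to the same signal region are summed before comparison with the limit. The numbers below are invented for illustration, and this is plain Python rather than the SModelS API:

```python
# Schematic efficiency-map combination: sum signal contributions from
# several simplified-model topologies in one signal region, then compare
# with the observed upper limit on signal events.
lumi = 36.0  # integrated luminosity [fb^-1]

# (cross section [fb], signal-region efficiency) per contributing topology
contributions = [(12.0, 0.020), (5.0, 0.045)]

signal = sum(xsec * eff * lumi for xsec, eff in contributions)
upper_limit_events = 12.5  # hypothetical 95% CL limit on signal events
r = signal / upper_limit_events
print(f"signal = {signal:.2f} events, r = {r:.2f}  (r > 1 => excluded)")
```

Neither topology alone would necessarily yield r > 1 here; it is the combination that produces the exclusion, which is precisely the added constraining power described above.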
DOE Office of Scientific and Technical Information (OSTI.GOV)
Myronakis, M; Cai, W; Dhou, S
Purpose: To design a comprehensive open-source, publicly available, graphical user interface (GUI) to facilitate the configuration, generation, processing and use of the 4D Extended Cardiac-Torso (XCAT) phantom. Methods: The XCAT phantom includes over 9000 anatomical objects as well as respiratory, cardiac and tumor motion. It is widely used for research studies in medical imaging and radiotherapy. The phantom generation process involves the configuration of a text script to parameterize the geometry, motion, and composition of the whole body and objects within it, and to generate simulated PET or CT images. To avoid the need for manual editing or script writing, our MATLAB-based GUI uses slider controls, drop-down lists, buttons and graphical text input to parameterize and process the phantom. Results: Our GUI can be used to: a) generate parameter files; b) generate the voxelized phantom; c) combine the phantom with a lesion; d) display the phantom; e) produce average and maximum intensity images from the phantom output files; f) incorporate irregular patient breathing patterns; and g) generate DICOM files containing phantom images. The GUI provides local help information using tool-tip strings on the currently selected phantom, minimizing the need for external documentation. The DICOM generation feature is intended to simplify the process of importing the phantom images into radiotherapy treatment planning systems or other clinical software. Conclusion: The GUI simplifies and automates the use of the XCAT phantom for imaging-based research projects in medical imaging or radiotherapy. This has the potential to accelerate research conducted with the XCAT phantom, or to ease the learning curve for new users. This tool does not include the XCAT phantom software itself. We would like to acknowledge funding from MRA, Varian Medical Systems Inc.
Azpilicueta, Leire; López-Iturri, Peio; Aguirre, Erik; Mateo, Ignacio; Astrain, José Javier; Villadangos, Jesús; Falcone, Francisco
2014-12-10
The use of wireless networks has grown exponentially thanks to improvements in battery life and the low power consumption of the devices. However, a radio propagation analysis is essential before deploying a wireless sensor network. Such studies are needed to estimate range coverage and so optimize the distance between devices in an actual network deployment. In this work, the radio channel for ISM 2.4 GHz Wireless Sensor Networks (WSNs) in an inhomogeneous vegetation environment has been characterized. This analysis supports the design of environment monitoring tools based on ZigBee and WiFi in which WSNs and smartphones cooperate, providing rich, customized monitoring information to users in a friendly manner. The impact of the topology and morphology of the environment is assessed by means of an in-house 3D Ray Launching code that emulates realistic operation within the scenario. Experimental results gathered from a measurement campaign with a deployed ZigBee Wireless Sensor Network are analyzed and compared with simulations. The scenario where this network is intended to operate combines buildings with diverse vegetation species. To gain insight into radio propagation effects, a simplified vegetation model has been developed, with material parameters and simplified geometry embedded in the simulation scenario. An initial location-based application has been implemented in a real, context-aware scenario to test the functionality. Deterministic tools can help quantify the topological influence on the deployment of an optimal Wireless Sensor Network in terms of capacity, coverage, and energy consumption, making these systems attractive for multiple applications in inhomogeneous vegetation environments.
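For a first-order feel for the range-coverage estimation mentioned above, a log-distance path-loss model with a lumped vegetation loss term can be sketched as follows; all parameter values are assumptions for illustration, not outputs of the 3D Ray Launching code:

```python
import numpy as np

# Log-distance path loss (reference distance d0 = 1 m) plus a lumped
# vegetation loss, used to check link budget against receiver sensitivity.
def received_power_dbm(d_m, p_tx_dbm=0.0, pl0_db=40.0, n=2.9, l_veg_db=8.0):
    """Received power [dBm] at distance d_m from the transmitter."""
    return p_tx_dbm - (pl0_db + 10.0 * n * np.log10(d_m) + l_veg_db)

sensitivity_dbm = -95.0  # typical ZigBee receiver sensitivity
for d in (10, 30, 60, 100):
    p = received_power_dbm(d)
    ok = "OK" if p > sensitivity_dbm else "FAIL"
    print(f"d = {d:3d} m  Prx = {p:6.1f} dBm  link {ok}")
```

With these assumed values the link closes at 30 m but not at 60 m, which is the kind of spacing estimate a deployment study needs before placing nodes.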
A Risk Score Model for Evaluation and Management of Patients with Thyroid Nodules.
Zhang, Yongwen; Meng, Fanrong; Hong, Lianqing; Chu, Lanfang
2018-06-12
This study aimed to establish a simplified and practical tool for analyzing thyroid nodules. A novel risk score model was designed: risk factors including patient history, patient characteristics, physical examination, symptoms of compression, thyroid function, and ultrasonography (US) of the thyroid and cervical lymph nodes were evaluated and classified into high, intermediate, and low risk factors. A total of 243 thyroid nodules in 162 patients were assessed with the risk score system and the Thyroid Imaging-Reporting and Data System (TI-RADS), and the diagnostic performance of the two was compared. Diagnostic accuracy was 89.3% for the risk score system and 74.9% for TI-RADS. The specificity, accuracy, and positive predictive value (PPV) of the risk score system were significantly higher than those of TI-RADS (χ² = 26.287, 17.151, 11.983; p < 0.05); no statistically significant differences were observed in sensitivity or negative predictive value (NPV) between the two systems (χ² = 1.276, 0.290; p > 0.05). The area under the curve (AUC) was 0.963 for the risk score system (standard error 0.014; 95% confidence interval (CI) = 0.934-0.991) and 0.912 for TI-RADS (standard error 0.021; 95% CI = 0.871-0.953); the difference was significant (Z = 2.02; p < 0.05). The risk score model is a reliable, simplified, and cost-effective diagnostic tool for thyroid cancer: the higher the score, the higher the risk of malignancy.
Representing Operational Modes for Situation Awareness
NASA Astrophysics Data System (ADS)
Kirchhübel, Denis; Lind, Morten; Ravn, Ole
2017-01-01
Operating complex plants is an increasingly demanding task for human operators. Diagnosis of and reaction to on-line events requires the interpretation of real-time data. Vast amounts of sensor data, as well as operational knowledge about the state and design of the plant, are necessary to deduce reasonable reactions to abnormal situations. Intelligent computational support tools can make the operator's task easier, but they require knowledge about the overall system in the form of a model. While tools used for fault-tolerant control design based on physical principles and relations are valuable for designing robust systems, the models become too complex when interactions at the plant-wide level are considered. Alarm systems meant to support human operators in diagnosing the plant-wide situation, on the other hand, regularly fail when these system interactions produce many related alarms, overloading the operator with alarm floods. Functional modelling can provide a middle way, reducing the complexity of plant-wide models by abstracting from physical details to more general functions and behaviours. Based on functional models, the propagation of failures through the interconnected systems can be inferred, and alarm floods can potentially be reduced to their root cause. However, the desired behaviour of a complex system changes under operating procedures that require more than one physical and functional configuration. In this paper a consistent representation of possible configurations is deduced from the functional-model analysis of an exemplary start-up procedure. The proposed interpretation of the modelling concepts simplifies the functional modelling of distinct modes. The analysis further reveals relevant links between quantitative sensor data and the qualitative perspective of a diagnostics tool based on functional models. This will form the basis for the ongoing development of a novel real-time diagnostics system based on on-line adaptation of the underlying MFM model.
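The root-cause reduction of an alarm flood via a causal (functional) model can be illustrated with a toy graph: among the currently alarmed functions, root causes are those with no alarmed upstream ancestor. A minimal Python sketch with an invented plant fragment (not an MFM model):

```python
import networkx as nx

# Causal graph of plant functions: an edge u -> v means a failure of u
# can propagate to v. Alarm set is invented for illustration.
G = nx.DiGraph([
    ("pump_flow", "heat_removal"),
    ("heat_removal", "tank_temperature"),
    ("tank_temperature", "pressure"),
])
alarms = {"pump_flow", "tank_temperature", "pressure"}

# Root causes: alarmed nodes with no alarmed ancestor in the causal graph.
roots = [a for a in alarms if not (nx.ancestors(G, a) & alarms)]
print("alarm flood:", sorted(alarms))
print("root cause(s):", roots)   # ['pump_flow']
```

Three related alarms collapse to a single root cause, which is the flood-reduction behaviour described above.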
ERIC Educational Resources Information Center
Andraos, John
2015-01-01
This paper presents a simplified approach for the application of material efficiency metrics to linear and convergent synthesis plans encountered in organic synthesis courses. Computations are facilitated and automated using intuitively designed Microsoft Excel spreadsheets without invoking abstract mathematical formulas. The merits of this…
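One computation such spreadsheets automate, overall yield along linear versus convergent plans, can be sketched in a few lines; the step yields below are invented examples:

```python
from math import prod

# Overall yield: product of step yields along the longest linear sequence.
def overall_yield(step_yields):
    return prod(step_yields)

linear = [0.85] * 8                       # 8 linear steps at 85% each
longest_branch = [0.85] * 4               # convergent plan: 4-step branch...
convergent = prod(longest_branch) * 0.85  # ...plus the final coupling step

print(f"linear 8-step plan: {overall_yield(linear):.1%}")   # ~27.2%
print(f"convergent plan:    {convergent:.1%}")              # ~44.4%
```

With identical step yields, the convergent plan's shorter longest sequence gives a markedly higher overall yield, which is the intuition such metrics make quantitative.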
EcoFlex: A Multifunctional MoClo Kit for E. coli Synthetic Biology.
Moore, Simon J; Lai, Hung-En; Kelwick, Richard J R; Chee, Soo Mei; Bell, David J; Polizzi, Karen Marie; Freemont, Paul S
2016-10-21
Golden Gate cloning is a prominent DNA assembly tool in synthetic biology for the assembly of plasmid constructs often used in combinatorial pathway optimization, with a number of assembly kits developed specifically for yeast- and plant-based expression. However, its use for synthetic biology in commonly used bacterial systems such as Escherichia coli has surprisingly been overlooked. Here, we introduce EcoFlex, a simplified, modular package of DNA parts for a variety of applications in E. coli: cell-free protein synthesis, protein purification, and hierarchical assembly of transcription units based on the MoClo assembly standard. The kit features a library of constitutive promoters, T7 expression, RBS strength variants, synthetic terminators, protein purification tags and fluorescent proteins. We validate EcoFlex by assembling a 31 kb plasmid containing 68 parts (20 genes), characterize library parts in vivo and in vitro, and perform combinatorial pathway assembly using pooled libraries of either fluorescent proteins or the biosynthetic genes for the antimicrobial pigment violacein as a proof of concept. To minimize pathway screening, we also introduce a secondary module design site to simplify MoClo pathway optimization. In summary, EcoFlex provides a standardized and multifunctional kit for a variety of applications in E. coli synthetic biology.
An Educational Model for Hands-On Hydrology Education
NASA Astrophysics Data System (ADS)
AghaKouchak, A.; Nakhjiri, N.; Habib, E. H.
2014-12-01
This presentation provides an overview of a hands-on modeling tool developed for students in civil engineering and earth science disciplines to help them learn the fundamentals of hydrologic processes, model calibration, sensitivity analysis, and uncertainty assessment, and to practice conceptual thinking in solving engineering problems. The toolbox includes two simplified hydrologic models, namely HBV-EDU and HBV-Ensemble, designed as a complement to theoretical hydrology lectures. The models provide an interdisciplinary, application-oriented learning environment that introduces hydrologic phenomena through the use of a simplified conceptual hydrologic model. The toolbox can be used for in-class lab practices and homework assignments, and for assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insight into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching more advanced topics, including uncertainty analysis and ensemble simulation. Both models have been administered in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of hydrology.
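A minimal sketch of an HBV-type soil-moisture routine of the kind such a teaching toolbox wraps; the parameter values are illustrative assumptions, not HBV-EDU defaults:

```python
# HBV-type soil-moisture accounting: a fraction (SM/FC)^beta of each day's
# rain becomes runoff, the rest recharges storage, and evapotranspiration
# is scaled down as the soil dries.
FC, BETA, PET = 150.0, 2.0, 3.0      # field capacity [mm], shape, daily PET [mm]
sm = 75.0                            # initial soil moisture [mm]
rain = [0, 12, 25, 0, 5, 40, 0]      # daily precipitation [mm]

for day, p in enumerate(rain):
    runoff = p * (sm / FC) ** BETA          # runoff-generation fraction
    et = PET * min(1.0, sm / (0.7 * FC))    # actual ET, limited by dry soil
    sm = max(0.0, sm + p - runoff - et)
    print(f"day {day}: runoff = {runoff:5.2f} mm, soil moisture = {sm:6.1f} mm")
```

Even this few-line loop exposes the interconnections the abstract mentions: wetter soil routes more of each storm to runoff and sustains higher evapotranspiration.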
Simplified Predictive Models for CO2 Sequestration Performance Assessment
NASA Astrophysics Data System (ADS)
Mishra, Srikanta; RaviGanesh, Priya; Schuetter, Jared; Mooney, Douglas; He, Jincong; Durlofsky, Louis
2014-05-01
We present results from an ongoing research project that seeks to develop and validate a portfolio of simplified modeling approaches that will enable rapid feasibility and risk assessment for CO2 sequestration in deep saline formations. The overall research goal is to provide tools for predicting: (a) injection well and formation pressure buildup, and (b) lateral and vertical CO2 plume migration. Simplified modeling approaches that are being developed in this research fall under three categories: (1) simplified physics-based modeling (SPM), where only the most relevant physical processes are modeled, (2) statistical-learning based modeling (SLM), where the simulator is replaced with a "response surface", and (3) reduced-order method based modeling (RMM), where mathematical approximations reduce the computational burden. The system of interest is a single vertical well injecting supercritical CO2 into a 2-D layered reservoir-caprock system with variable layer permeabilities. In the first category (SPM), we use a set of well-designed full-physics compositional simulations to understand key processes and parameters affecting pressure propagation and buoyant plume migration. Based on these simulations, we have developed correlations for dimensionless injectivity as a function of the slope of the fractional-flow curve, the variance of layer permeability values, and the nature of the vertical permeability arrangement. The same variables, along with a modified gravity number, can be used to develop a correlation for the total storage efficiency within the CO2 plume footprint. In the second category (SLM), we develop statistical "proxy models" using the simulation domain described previously with two different approaches: (a) classical Box-Behnken experimental design with a quadratic response surface fit, and (b) maximin Latin hypercube sampling (LHS) based design with a Kriging metamodel fit using a quadratic trend and Gaussian correlation structure. For roughly the same number of simulations, the LHS-based metamodel yields a more robust predictive model, as verified by a k-fold cross-validation approach. In the third category (RMM), we use a reduced-order modeling procedure that combines proper orthogonal decomposition (POD), for reducing problem dimensionality, with trajectory piecewise linearization (TPWL), for extrapolating the system response at new control points from a limited number of trial runs ("snapshots"). We observe significant savings in computational time with very good accuracy from the POD-TPWL reduced-order model, which could be important in the context of history matching, uncertainty quantification and optimization problems. The paper will present results from our ongoing investigations, and also discuss future research directions and likely outcomes. This work was supported by U.S. Department of Energy National Energy Technology Laboratory award DE-FE0009051 and Ohio Department of Development grant D-13-02.
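The SLM workflow described above, space-filling sampling followed by a response-surface fit, can be sketched in a few lines of Python; the "simulator" here is a stand-in function, not a reservoir model:

```python
import numpy as np
from scipy.stats import qmc

# Stand-in for an expensive simulator response (e.g., pressure buildup).
def simulator(x1, x2):
    return 1.0 + 2.0 * x1 - 0.5 * x2 + 0.8 * x1 * x2

# Latin hypercube sample of two normalized input parameters.
sampler = qmc.LatinHypercube(d=2, seed=0)
X = qmc.scale(sampler.random(n=20), [0, 0], [1, 1])
y = simulator(X[:, 0], X[:, 1])

# Full quadratic response surface fit by least squares.
x1, x2 = X[:, 0], X[:, 1]
D = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(D, y, rcond=None)
print("fitted coefficients:", np.round(coef, 3))
```

Once fitted, the surrogate can be evaluated millions of times for risk assessment at negligible cost, which is the point of replacing the simulator with a response surface.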
Conceptual Design Oriented Wing Structural Analysis and Optimization
NASA Technical Reports Server (NTRS)
Lau, May Yuen
1996-01-01
Airplane optimization has always been the goal of airplane designers. In the conceptual design phase, a designer's goal could be tradeoffs between maximum structural integrity, minimum aerodynamic drag, or maximum stability and control, many times pursued separately. Bringing all of these factors into an iterative preliminary design procedure was time consuming, tedious, and not always accurate. For example, the final weight estimate would often be based upon statistical data from past airplanes. The new design would be classified based on gross characteristics, such as number of engines, wingspan, etc., to see which airplanes of the past most closely resembled the new design. This procedure works well for conventional airplane designs, but not very well for new innovative designs. With the computing power of today, new methods are emerging for the conceptual design phase of airplanes. Using finite element methods, computational fluid dynamics, and other computer techniques, designers can make very accurate disciplinary analyses of an airplane design. These tools are computationally intensive, and when used repeatedly, they consume a great deal of computing time. In order to reduce the time required to analyze a design and still bring together all of the disciplines (such as structures, aerodynamics, and controls) into the analysis, simplified design computer analyses are linked together into one computer program. These design codes are very efficient for conceptual design. The work in this thesis is focused on a finite-element-based, conceptual-design-oriented structural synthesis capability (CDOSS) tailored to be linked into ACSYNT.
Planetary Geologic Mapping Python Toolbox: A Suite of Tools to Support Mapping Workflows
NASA Astrophysics Data System (ADS)
Hunter, M. A.; Skinner, J. A.; Hare, T. M.; Fortezzo, C. M.
2017-06-01
The collective focus of the Planetary Geologic Mapping Python Toolbox is to provide researchers with additional means to migrate legacy GIS data, assess the quality of data and analysis results, and simplify common mapping tasks.
CLASSIFICATION FRAMEWORK FOR DIAGNOSTICS RESEARCH
The goal of Diagnostics Research is to provide tools to simplify diagnosis of the causes of biological impairment, in support of State and Tribe 303(d) impaired waters lists. The Diagnostics Workgroup has developed conceptual models for four major aquatic stressors that cause im...
Simplified Parallel Domain Traversal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erickson III, David J
2011-01-01
Many data-intensive scientific analysis techniques require global domain traversal, which over the years has been a bottleneck for efficient parallelization across distributed-memory architectures. Inspired by MapReduce and other simplified parallel programming approaches, we have designed DStep, a flexible system that greatly simplifies efficient parallelization of domain traversal techniques at scale. In order to deliver both simplicity to users as well as scalability on HPC platforms, we introduce a novel two-tiered communication architecture for managing and exploiting asynchronous communication loads. We also integrate our design with advanced parallel I/O techniques that operate directly on native simulation output. We demonstrate DStep by performing teleconnection analysis across ensemble runs of terascale atmospheric CO{sub 2} and climate data, and we show scalability results on up to 65,536 IBM BlueGene/P cores.
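The following sketch illustrates only the programming-model flavor of the abstract, not DStep itself (which is an HPC system with asynchronous communication): a domain is split into blocks, a map step advances work items in parallel, and items that cross block boundaries are re-keyed to their new owner. The block count, velocity field, and function names are invented for the example.

```python
# Illustrative MapReduce-style domain traversal: map (advance particles), then
# shuffle (re-key each particle to the block that now owns it), repeated.
from multiprocessing import Pool
from collections import defaultdict

BLOCKS = 4  # 1-D domain [0, 4) split into unit-width blocks

def advect(item):
    """Map: advance a particle one step; emit (owning_block, particle)."""
    block, x = item
    x_new = (x + 0.3) % BLOCKS  # toy constant velocity field, periodic domain
    return (int(x_new), x_new)

def traverse(particles, steps=5):
    items = [(int(x) % BLOCKS, x) for x in particles]
    with Pool(BLOCKS) as pool:
        for _ in range(steps):
            mapped = pool.map(advect, items)   # embarrassingly parallel map
            shuffled = defaultdict(list)       # group by destination block
            for block, x in mapped:
                shuffled[block].append(x)
            items = [(b, x) for b, xs in shuffled.items() for x in xs]
    return sorted(x for _, x in items)

if __name__ == "__main__":
    print(traverse([0.1, 1.5, 2.9, 3.7]))
```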
Heuristics in Managing Complex Clinical Decision Tasks in Experts’ Decision Making
Islam, Roosan; Weir, Charlene; Del Fiol, Guilherme
2016-01-01
Background Clinical decision support is a tool to help experts make optimal and efficient decisions. However, little is known about the high-level abstractions in experts' thinking processes. Objective The objective of the study is to understand how clinicians manage complexity while dealing with complex clinical decision tasks. Method After approval from the Institutional Review Board (IRB), three clinical experts were interviewed, and the transcripts from these interviews were analyzed. Results We found five broad categories of strategies used by experts to manage complex clinical decision tasks: decision conflict, mental projection, decision trade-offs, managing uncertainty, and generating rules of thumb. Conclusion Complexity is created by decision conflicts, mental projection, limited options, and treatment uncertainty. Experts cope with complexity in a variety of ways, including using efficient and fast decision strategies to simplify complex decision tasks, mentally simulating outcomes, and focusing on only the most relevant information. Application Understanding complex decision-making processes can help inform task allocation based on task complexity in clinical decision support design. PMID:27275019
Cloud Environment Automation: from infrastructure deployment to application monitoring
NASA Astrophysics Data System (ADS)
Aiftimiei, C.; Costantini, A.; Bucchi, R.; Italiano, A.; Michelotto, D.; Panella, M.; Pergolesi, M.; Saletta, M.; Traldi, S.; Vistoli, C.; Zizzi, G.; Salomoni, D.
2017-10-01
The potential offered by the cloud paradigm is often limited by technical issues, rules, and regulations. In particular, the activities related to the design and deployment of the Infrastructure as a Service (IaaS) cloud layer can be difficult to apply and time-consuming for infrastructure maintainers. This paper presents the research activity, carried out during the Open City Platform (OCP) research project [1], aimed at designing and developing an automatic tool for cloud-based IaaS deployment. Open City Platform is an industrial research project funded by the Italian Ministry of University and Research (MIUR), started in 2014. It aims to research, develop, and test new open, interoperable, on-demand technological solutions in the field of Cloud Computing, along with new sustainable organizational models that can be deployed for and adopted by Public Administrations (PA). The presented work and related outcomes are aimed at simplifying the deployment and maintenance of a complete IaaS cloud-based infrastructure.
Integrating Reconfigurable Hardware-Based Grid for High Performance Computing
Dondo Gazzano, Julio; Sanchez Molina, Francisco; Rincon, Fernando; López, Juan Carlos
2015-01-01
FPGAs have shown several characteristics that make them very attractive for high performance computing (HPC): the impressive speed-up factors they can achieve, their reduced power consumption, and the ease and flexibility of a design process with fast iterations between consecutive versions. However, some difficulties in using reconfigurable platforms as accelerators still need to be addressed: the need for an in-depth application study to identify potential acceleration, the lack of tools for deploying computational problems on distributed hardware platforms, and the low portability of components, among others. This work proposes a complete grid infrastructure for distributed high performance computing based on dynamically reconfigurable FPGAs. In addition, a set of services designed to facilitate application deployment is described. An example application and a comparison with other hardware and software implementations are shown. Experimental results show that the proposed architecture offers encouraging advantages for the deployment of high performance distributed applications while simplifying the development process. PMID:25874241
Practical modeling approaches for geological storage of carbon dioxide.
Celia, Michael A; Nordbotten, Jan M
2009-01-01
The relentless increase of anthropogenic carbon dioxide emissions and the associated concerns about climate change have motivated new ideas about carbon-constrained energy production. One technological approach to control carbon dioxide emissions is carbon capture and storage, or CCS. The underlying idea of CCS is to capture the carbon before it is emitted to the atmosphere and store it somewhere other than the atmosphere. Currently, the most attractive option for large-scale storage is in deep geological formations, including deep saline aquifers. Many physical and chemical processes can affect the fate of the injected CO2, and the overall mathematical description of the complete system becomes very complex. Our approach to the problem has been to reduce complexity as much as possible, so that we can focus on the few truly important questions about the injected CO2, most of which involve leakage out of the injection formation. Toward this end, we have established a set of simplifying assumptions that allow us to derive simplified models, which can be solved numerically or, for the most simplified cases, analytically. These simplified models allow calculation of solutions to large-scale injection and leakage problems in ways that traditional multicomponent multiphase simulators cannot. Such simplified models provide important tools for system analysis, screening calculations, and overall risk-assessment calculations. We believe this is a practical and important approach to modeling geological storage of carbon dioxide. It also serves as an example of how complex systems can be simplified while retaining the essential physics of the problem.
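The abstract does not spell out the authors' equations; as a flavor of this model family, the sketch below evaluates the classical single-phase line-source solution for pressure buildup around an injection well. Real CO2 storage models add two-phase flow and leakage pathways, and every parameter value here is illustrative only.

```python
# Minimal analytical pressure-buildup example (classical line-source solution),
# a simple stand-in for the simplified injection models discussed above.
import numpy as np
from scipy.special import exp1  # exponential integral E1

def pressure_buildup(r, t, Q=0.05, mu=5e-4, k=1e-13, b=20.0, phi=0.15, ct=1e-9):
    """Delta-p [Pa] at radius r [m] and time t [s] for volumetric rate Q [m^3/s],
    viscosity mu [Pa.s], permeability k [m^2], thickness b [m],
    porosity phi [-], and total compressibility ct [1/Pa] (all assumed values)."""
    u = (r ** 2) * phi * mu * ct / (4.0 * k * t)
    return Q * mu / (4.0 * np.pi * k * b) * exp1(u)

# Pressure buildup 100 m from the well after 30 days of injection:
print(pressure_buildup(r=100.0, t=30 * 86400.0) / 1e5, "bar")
```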
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, Y.; Edwards, R.M.; Lee, K.Y.
1997-03-01
In this paper, a simplified model with a lower order is first developed for a nuclear steam generator system and verified against some realistic environments. Based on this simplified model, a hybrid multi-input, multi-output (MIMO) control system, consisting of feedforward control (FFC) and feedback control (FBC), is designed for wide-range conditions using the genetic algorithm (GA) technique. The FFC control, obtained by the GA optimization method, injects an a priori command input into the system to achieve optimal performance for the designed system, while the GA-based FBC control provides the necessary compensation for any disturbances or uncertainties in a real steam generator. The FBC control is an optimal design of a PI-based control system, which would be more acceptable for industrial practice and power plant control system upgrades. The designed hybrid MIMO FFC/FBC control system is first applied to the simplified model and then to a more complicated, higher-order model, which is used as a substitute for the real system to test the efficacy of the designed control system. Results from computer simulations show that the designed GA-based hybrid MIMO FFC/FBC control can achieve good responses and robust performance. Hence, it can be considered a viable alternative to the current control system upgrade.
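The GA-tuned feedback idea can be sketched on a toy problem. The first-order plant, GA settings, and gain ranges below are assumptions standing in for the paper's steam generator model; the point is only to show a genetic algorithm evolving PI gains against a simulated cost.

```python
# Toy GA tuning of PI gains: minimize integrated absolute error (IAE) of a
# first-order plant tracking a unit step. Plant and GA parameters are assumed.
import random

def simulate(kp, ki, dt=0.05, steps=400, tau=5.0):
    y, integ, iae = 0.0, 0.0, 0.0
    for _ in range(steps):
        e = 1.0 - y                      # unit-step setpoint error
        integ += e * dt
        u = kp * e + ki * integ          # PI control law
        y += dt * (u - y) / tau          # first-order plant dy/dt = (u - y)/tau
        iae += abs(e) * dt
    return iae

def ga(pop_size=30, gens=40):
    pop = [(random.uniform(0, 20), random.uniform(0, 5)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda g: simulate(*g))       # rank by fitness (lower IAE)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)       # average crossover + mutation
            kp = max((a[0] + b[0]) / 2 + random.gauss(0, 0.5), 0.0)
            ki = max((a[1] + b[1]) / 2 + random.gauss(0, 0.1), 0.0)
            children.append((kp, ki))
        pop = parents + children
    return pop[0]

print("best (Kp, Ki):", ga())
```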
Pánek, J; Vohradský, J
1997-06-01
The principal motivation was to design an environment for the development of image-analysis applications that would allow the integration of independent modules into one framework and provide tools for their construction, running, management, and mutual communication. The system was designed as modular, consisting of a core and work modules. The system core handles overall management and provides a library of classes for building the work modules, their user interfaces, and data communication. The work modules carry the practical implementation of algorithms and data structures for the solution of a particular problem, and were implemented as dynamic-link libraries. They are mutually independent and run as individual threads, communicating with each other via a unified mechanism. The environment was designed to simplify the development and testing of new algorithms and applications. An example implementation for the particular problem of analyzing two-dimensional (2D) gel electrophoretograms is presented. The environment was designed for the Windows NT operating system using the Microsoft Foundation Class Library and the C++ programming language. Available on request from the authors.
Development and weighting of a life cycle assessment screening model
NASA Astrophysics Data System (ADS)
Bates, Wayne E.; O'Shaughnessy, James; Johnson, Sharon A.; Sisson, Richard
2004-02-01
Nearly all life cycle assessment tools available today are high-priced, comprehensive, quantitative models requiring a significant amount of data collection and input. In addition, most of the available software packages require a great deal of training time to learn to operate the model software. Even after this time investment, results are not guaranteed because of the number of estimations and assumptions often necessary to run the model. As a result, product development and design teams and environmental specialists need a simplified tool that allows for the qualitative evaluation and "screening" of various design options. This paper presents the development and design of a generic, qualitative life cycle screening model and demonstrates its applicability and ease of use. The model uses qualitative environmental, health, and safety factors, based on site- or product-specific issues, to sensitize the overall results for a given set of conditions. The paper also evaluates the impact of different population input ranking values on model output. The final analysis is based on site- or product-specific variables. The user can then evaluate various design changes and their apparent impact on the environment, health and safety, compliance cost, and overall corporate liability. Major input parameters can be varied, and factors such as materials use, pollution prevention, waste minimization, worker safety, product life, environmental impacts, return on investment, and recycling are evaluated. The flexibility of the model format is discussed in order to demonstrate its applicability and usefulness within nearly any industry sector. Finally, an example using audience input value scores is compared to other population input results.
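A qualitative screening matrix of this kind reduces to weighted scoring. The sketch below shows the mechanics; the factor names, weights, and 1-5 scores are hypothetical, not the paper's actual inputs.

```python
# Hedged sketch of a qualitative LCA screening matrix: each design option gets
# a 1-5 score per factor, weighted by user-supplied rankings (all values assumed).
weights = {"materials use": 0.25, "pollution prevention": 0.20,
           "worker safety": 0.25, "product life": 0.15, "recyclability": 0.15}

options = {
    "design A": {"materials use": 3, "pollution prevention": 4,
                 "worker safety": 2, "product life": 5, "recyclability": 3},
    "design B": {"materials use": 4, "pollution prevention": 3,
                 "worker safety": 4, "product life": 3, "recyclability": 4},
}

for name, scores in options.items():
    total = sum(weights[f] * s for f, s in scores.items())
    print(f"{name}: weighted score {total:.2f}")
```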
NASA Astrophysics Data System (ADS)
Faqih, A.
2017-03-01
Providing information regarding future climate scenarios is very important in climate change studies. Climate scenarios can be used as basic information to support adaptation and mitigation studies. In order to deliver future climate scenarios for a specific region, baseline and projection data from the outputs of global climate models (GCMs) are needed. However, due to their coarse resolution, the data have to be downscaled and bias-corrected in order to obtain scenario data with better spatial resolution that match the characteristics of the observed data. Generating these downscaled data is difficult for scientists who do not have a specific background, experience, and skill in dealing with the complex data from GCM outputs. In this regard, it is necessary to develop a tool that simplifies the downscaling process in order to help scientists, especially in Indonesia, generate future climate scenario data for their climate change-related studies. In this paper, we introduce a tool called "Statistical Bias Correction for Climate Scenarios (SiBiaS)". The tool is specially designed to facilitate the use of CMIP5 GCM data outputs and to process their statistical bias corrections relative to reference observational data. It was prepared to support capacity building in climate modeling in Indonesia as part of the Indonesia 3rd National Communication (TNC) project activities.
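The abstract does not spell out SiBiaS's algorithm; empirical quantile mapping is a common statistical bias correction for GCM output and is sketched here as an illustrative stand-in, with synthetic gamma-distributed "rainfall" in place of real model and station data.

```python
# Hedged sketch of empirical quantile mapping: map each future model value
# through the model-climatology quantiles onto the observed climatology.
import numpy as np

def quantile_map(model_hist, obs, model_future):
    quantiles = np.linspace(0.0, 1.0, 101)
    mh = np.quantile(model_hist, quantiles)   # model baseline quantiles
    ob = np.quantile(obs, quantiles)          # observed baseline quantiles
    # Locate each future value's quantile in the model climatology, then read
    # off the observed value at that same quantile.
    q = np.interp(model_future, mh, quantiles)
    return np.interp(q, quantiles, ob)

rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 5.0, 1000)               # "observed" rainfall (synthetic)
model_hist = rng.gamma(2.0, 7.0, 1000)        # biased GCM baseline (synthetic)
model_future = rng.gamma(2.2, 7.0, 1000)      # biased GCM projection (synthetic)
corrected = quantile_map(model_hist, obs, model_future)
print(obs.mean(), model_future.mean(), corrected.mean())
```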
Bohren, Meghan A; Oladapo, Olufemi T; Tunçalp, Özge; Wendland, Melanie; Vogel, Joshua P; Tikkanen, Mari; Fawole, Bukola; Mugerwa, Kidza; Souza, João Paulo; Bahl, Rajiv; Gülmezoglu, A Metin
2015-05-26
Most complications during labour and childbirth could be averted with timely interventions by skilled healthcare providers. Yet the quality and outcomes of childbirth care remain suboptimal in many health facilities in low-resource settings. To accelerate the reduction of childbirth-related maternal, fetal and newborn mortality and morbidity, the World Health Organization has initiated the "Better Outcomes in Labour Difficulty" (BOLD) project to address weaknesses in labour care processes and better connect health systems and communities. The project seeks to develop a "Simplified, Effective, Labour Monitoring-to-Action" tool (SELMA) to assist healthcare providers in monitoring labour and taking decisive actions more efficiently, and to develop an innovative set of service prototypes and/or tools, termed "Passport to Safer Birth", designed with communities and healthcare providers to promote access to quality care for women during childbirth. This protocol describes the formative research activities supporting the development of these tools. We will employ qualitative research and service design methodologies in eight health facilities and their catchment communities in Nigeria and Uganda. In the health facilities, focus group discussions (FGDs) and in-depth interviews (IDIs) will be conducted among different cadres of healthcare providers and facility administrators. In the communities, FGDs and IDIs will be conducted among women who have delivered in a health facility. We will use service design methods to explore women's journey to access and receive childbirth care in order to innovate and design services around the needs and expectations of women, within the context of the health system. This formative research will serve several roles. First, it will provide an in-depth understanding of healthcare provider and health system issues to be accounted for in the final design and implementation of SELMA. Second, it will help to identify key moments ("touch points") where women's experiences of childbirth care are shaped, and where the overall experience of quality care could be improved. The synthesis of findings from the qualitative and service design activities will help identify potential areas for behaviour change related to the provision and experience of childbirth care, and serve as the basis for the development of Passport to Safer Birth. Please see related articles 'http://dx.doi.org/10.1186/s12978-015-0027-6' and 'http://dx.doi.org/10.1186/s12978-015-0029-4'.
Iba, Toshiaki; Di Nisio, Marcello; Thachil, Jecko; Wada, Hideo; Asakura, Hidesaku; Sato, Koichi; Saitoh, Daizoh
2018-04-01
Sepsis-associated disseminated intravascular coagulation (DIC) carries a high risk of death; thus, a simple tool to quickly establish a DIC diagnosis is required. The purpose of this study was to introduce a simple and reliable tool for the prediction of outcome in patients with sepsis complicated by coagulopathy. We investigated the performance of simplified Japanese Society on Thrombosis and Hemostasis (JSTH) DIC diagnostic criteria in a retrospective, multicenter survey of 107 general emergency and critical care centers in secondary and tertiary care hospitals. A total of 918 patients with sepsis-associated coagulopathy who underwent antithrombin supplementation were examined. The relationships between patient mortality and each of the baseline (ie, before treatment) JSTH-DIC diagnostic criteria were examined. A reduced platelet count, increased prothrombin time (PT) ratio, and lower antithrombin activity were correlated with 28-day mortality, while fibrinogen and fibrin degradation product (FDP) levels were not. Thus, the number of points assigned to FDP levels (above 20 μg/mL) was reduced from 3 to 1. The simplified JSTH diagnostic criteria combining platelet count, PT ratio, antithrombin activity, and FDP level (with a reduced maximum score) strongly predicted 28-day mortality and allowed us to diagnose a larger or similar number of patients with DIC compared to the original JSTH-DIC criteria. The simplified JSTH-DIC diagnostic criteria show performance similar to the JSTH-DIC criteria in patients with septic coagulopathy. The smaller number of laboratory markers used in the simplified JSTH-DIC score may increase its applicability and routine use in emergency and critical care settings.
RchyOptimyx: Cellular Hierarchy Optimization for Flow Cytometry
Aghaeepour, Nima; Jalali, Adrin; O’Neill, Kieran; Chattopadhyay, Pratip K.; Roederer, Mario; Hoos, Holger H.; Brinkman, Ryan R.
2013-01-01
Analysis of high-dimensional flow cytometry datasets can reveal novel cell populations with poorly understood biology. Following discovery, characterization of these populations in terms of the critical markers involved is an important step, as this can help to both better understand the biology of these populations and aid in designing simpler marker panels to identify them on simpler instruments and with fewer reagents (i.e., in resource-poor or highly regulated clinical settings). However, current tools for panel design work exclusively from technical parameters (e.g., instrument configurations, spectral overlap, and reagent availability) rather than the biological characteristics of the target cell populations. To address this shortcoming, we developed RchyOptimyx (cellular hieraRCHY OPTIMization), a computational tool that constructs cellular hierarchies by combining automated gating with dynamic programming and graph theory to provide the best gating strategies to identify a target population to a desired level of purity or correlation with a clinical outcome, using the simplest possible marker panels. RchyOptimyx can assess and graphically present the trade-offs between marker choice and population specificity in high-dimensional flow or mass cytometry datasets. We present three proof-of-concept use cases for RchyOptimyx that involve 1) designing a panel of surface markers for identification of rare populations that are primarily characterized using their intracellular signature; 2) simplifying the gating strategy for identification of a target cell population; and 3) identifying a non-redundant marker set for a target cell population. PMID:23044634
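The purity-versus-panel-size trade-off at the core of the abstract can be shown in miniature. The sketch below uses brute-force enumeration as a stand-in for RchyOptimyx's dynamic program over the gating hierarchy, and the marker names and purity values are entirely made up.

```python
# Toy sketch (not RchyOptimyx itself): for each panel size k, find the gate set
# with the best purity, exposing the purity/panel-size trade-off curve.
from itertools import combinations

markers = ["CD4", "CD8", "CCR7", "CD45RA"]
purity = {  # hypothetical purity of the target population for each gate set
    frozenset({"CD4"}): 0.20, frozenset({"CD8"}): 0.10,
    frozenset({"CD4", "CCR7"}): 0.55, frozenset({"CD4", "CD45RA"}): 0.40,
    frozenset({"CD4", "CCR7", "CD45RA"}): 0.90,
}

for k in range(1, len(markers) + 1):
    # Only gate sets that were actually scored participate at size k.
    candidates = [(frozenset(c), purity[frozenset(c)])
                  for c in combinations(markers, k) if frozenset(c) in purity]
    if candidates:
        gates, p = max(candidates, key=lambda t: t[1])
        print(f"{k} markers: {sorted(gates)} -> purity {p:.2f}")
```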
miRiadne: a web tool for consistent integration of miRNA nomenclature.
Bonnal, Raoul J P; Rossi, Riccardo L; Carpi, Donatella; Ranzani, Valeria; Abrignani, Sergio; Pagani, Massimiliano
2015-07-01
The miRBase is the official miRNA repository, which keeps annotation updated as new miRNAs are discovered; it is also used as a reference for the design of miRNA profiling platforms. Nomenclature ambiguities generated by loosely updated platforms and design errors lead to incompatibilities among platforms, even from the same vendor. Published miRNA lists are thus generated with different profiling platforms that refer to diverse and outdated annotations. This greatly compromises searches, comparisons, and analyses that rely on miRNA names alone without taking into account the mature sequences, which is particularly critical when such analyses are carried out automatically. In this paper we introduce miRiadne, a web tool to harmonize miRNA nomenclature, which takes into account the original miRBase versions 10 through 21 and the annotations of 40 common profiling platforms from nine brands that we manually curated. miRiadne uses the miRNA mature sequence to link miRBase versions and/or platforms and so prevent nomenclature ambiguities. miRiadne was designed to simplify and support biologists and bioinformaticians in re-annotating their own miRNA lists and/or datasets. As Ariadne helped Theseus escape the mythological maze, miRiadne will help the miRNA researcher escape the nomenclature maze. miRiadne is freely accessible at http://www.miriadne.org. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
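The harmonization principle is simple to demonstrate: the mature sequence, not the name, is the stable key across annotation versions. The dictionary below is an illustrative two-version toy, not miRiadne's actual data model.

```python
# Minimal sketch of sequence-keyed name harmonization across miRBase versions.
annotations = {  # version -> {mature sequence: name}; entries are illustrative
    "v17": {"UGAGGUAGUAGGUUGUAUAGUU": "hsa-let-7a"},
    "v21": {"UGAGGUAGUAGGUUGUAUAGUU": "hsa-let-7a-5p"},
}

def harmonize(name, source_version, target_version):
    """Translate a miRNA name between versions via its mature sequence."""
    seq_by_name = {v: k for k, v in annotations[source_version].items()}
    seq = seq_by_name.get(name)
    if seq is None:
        return None  # name unknown in the source annotation
    return annotations[target_version].get(seq)

print(harmonize("hsa-let-7a", "v17", "v21"))  # -> hsa-let-7a-5p
```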
A simplified gis-based model for large wood recruitment and connectivity in mountain basins
NASA Astrophysics Data System (ADS)
Franceschi, Silvia; Antonello, Andrea; Vela, Ana Lucia; Cavalli, Marco; Crema, Stefano; Comiti, Francesco; Tonon, Giustino
2015-04-01
During the last 50 years in the Alps, the decline of the rural and forest economy and the depopulation of mountain areas have caused the progressive abandonment of the land in general, and of the riparian zones in particular, with a consequent increase in vegetation extent. On the one hand, wood increases the availability of organic matter and has positive effects on mountain river systems. During flooding events, however, large wood that reaches the stream can clog bridges and increase flood hazard. Evaluating the availability of large wood during flooding events is still a challenge. Models exist that simulate the propagation of logs downstream, but the evaluation of the trees that can reach the stream is still done using simplified GIS procedures. These procedures are the basis for our research, which will include LiDAR-derived information on vegetation to evaluate large wood recruitment during extreme events. Within the last Google Summer of Code (2014) we developed a set of tools to evaluate large wood recruitment and propagation along the channel network, based on a simplified methodology for monitoring and modeling large wood recruitment and transport in mountain basins implemented by Lucía et al. (2014). These tools are integrated in the JGrassTools project as a dedicated section of the Hydro-Geomorphology library. The section LWRecruitment contains 10 simple modules that allow the user to start from very simple information on geomorphology, flooding areas, and vegetation cover and obtain a map of the most probable critical sections on the streams. The tools cover the two main aspects of the interaction of large wood with rivers: the recruitment mechanisms and the downstream propagation. While the propagation tool is very simple and does not consider the hydrodynamics of the problem, the recruitment algorithms are more specific and consider the influence of hillslope stability and flooding extent. The modules are available for download at www.jgrasstools.org. A simple and easy-to-use graphical interface to run the models is available at https://github.com/moovida/STAGE/releases.
A simplified close range photogrammetry method for soil erosion assessment
USDA-ARS?s Scientific Manuscript database
With the increased affordability of consumer grade cameras and the development of powerful image processing software, digital photogrammetry offers a competitive advantage as a tool for soil erosion estimation compared to other technologies. One bottleneck of digital photogrammetry is its dependency...
Formative Research on the Simplifying Conditions Method (SCM) for Task Analysis and Sequencing.
ERIC Educational Resources Information Center
Kim, YoungHwan; Reigeluth, Charles M.
The Simplifying Conditions Method (SCM) is a set of guidelines for task analysis and sequencing of instructional content under the Elaboration Theory (ET). This article introduces the fundamentals of SCM and presents the findings from a formative research study on SCM. It was conducted in two distinct phases: design and instruction. In the first…
A Manual of Simplified Laboratory Methods for Operators of Wastewater Treatment Facilities.
ERIC Educational Resources Information Center
Westerhold, Arnold F., Ed.; Bennett, Ernest C., Ed.
This manual is designed to provide the small wastewater treatment plant operator, as well as the new or inexperienced operator, with simplified methods for laboratory analysis of water and wastewater. It is emphasized that this manual is not a replacement for standard methods but a guide for plants with insufficient equipment to perform analyses…
Simplified Recipes for Day Care Centers.
ERIC Educational Resources Information Center
Asmussen, Patricia D.
The spiral-bound collection of 156 simplified recipes is designed to help those who prepare food for groups of children at day care centers. The recipes provide for 25 child-size servings to meet the nutritional needs and appetites of children from 2 to 6 years of age. The first section gives general information on ladle and scoop sizes, weights…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-23
... Simplify the $1 Strike Price Interval Program November 17, 2011. Pursuant to Section 19(b)(1) of the... by the Exchange. The Exchange has designated the proposed rule change as constituting a non...) of the Rules of the Boston Options Exchange Group, LLC (``BOX'') to simplify the $1 Strike Price...
Measurement of erosion in helicon plasma thrusters using the VASIMR® VX-CR device
NASA Astrophysics Data System (ADS)
Del Valle Gamboa, Juan Ignacio; Castro-Nieto, Jose; Squire, Jared; Carter, Mark; Chang-Diaz, Franklin
2015-09-01
The helicon plasma source is one of the principal stages of the high-power VASIMR® electric propulsion system. The VASIMR® VX-CR experiment focuses solely on this stage, exploring the erosion and long-term operation effects of the VASIMR helicon source. We report on the design and operational parameters of the VX-CR experiment, and the development of modeling tools and characterization techniques allowing the study of erosion phenomena in helicon plasma sources in general, and stand-alone helicon plasma thrusters (HPTs) in particular. A thorough understanding of the erosion phenomena within HPTs will enable better predictions of their behavior as well as more accurate estimations of their expected lifetime. We present a simplified model of the plasma-wall interactions within HPTs based on current models of the plasma density distributions in helicon discharges. Results from this modeling tool are used to predict the erosion within the plasma-facing components of the VX-CR device. Experimental techniques to measure actual erosion, including the use of coordinate-measuring machines and microscopy, will be discussed.
Fastener Capture Plate Technology to Contain On-Orbit Debris
NASA Technical Reports Server (NTRS)
Eisenhower, Kevin
2010-01-01
The Fastener Capture Plate technology was developed to solve the problem of capturing loose hardware and small fasteners, items that were not originally intended to be disengaged in microgravity, thus preventing them from becoming space debris. This technology was incorporated into astronaut tools designed and successfully used on NASA's Hubble Space Telescope Servicing Mission #4. The technology's ultimate benefit is that it allows a very time-efficient method for disengaging fasteners and removing hardware while minimizing the chances of losing parts or generating debris. The technology aims to simplify the manual labor required of the operator. It does so by optimizing visibility and access to the work site and minimizing the operator's need to be concerned with debris while performing the operations. It has a range of unique features that were developed to minimize task time, as well as maximize the ease and confidence of the astronaut operator. This paper describes the technology and the astronaut tools developed specifically for a complicated on-orbit repair, and it includes photographs of the hardware being used in outer space.
Accurate Behavioral Simulator of All-Digital Time-Domain Smart Temperature Sensors by Using SIMULINK
Chen, Chun-Chi; Chen, Chao-Lieh; Lin, You-Ting
2016-01-01
This study proposes a new behavioral simulator, built in SIMULINK, for rapid and accurate simulation of all-digital CMOS time-domain smart temperature sensors (TDSTSs). Inverter-based TDSTSs, which offer low cost and a simple structure for temperature-to-digital conversion, have been developed previously. Typically, electronic design automation tools, such as HSPICE, are used to simulate TDSTSs for performance evaluation. However, such tools require extremely long simulation times and complex procedures to analyze the results and generate figures. In this paper, we organize simple but accurate equations into a temperature-dependent model (TDM) by which the TDSTSs evaluate temperature behavior. Furthermore, temperature-sensing models of a single CMOS NOT gate were devised using HSPICE simulations. Using the TDM and these temperature-sensing models, a novel simulator was developed in the SIMULINK environment to substantially accelerate simulation and simplify the evaluation procedures. Experiments demonstrate that the results of the proposed simulator agree favorably with those obtained from HSPICE simulations, showing that the proposed simulator functions successfully. This is the first behavioral simulator addressing the rapid simulation of TDSTSs. PMID:27509507
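The behavioral-modeling idea translates to a few lines of code. The sketch below is not the paper's SIMULINK TDM; it assumes a linear temperature dependence of gate delay (coefficients invented) and digitizes a delay line against a reference clock, which is the general operating principle of time-domain sensors.

```python
# Hedged behavioral sketch of a time-domain temperature sensor: a delay line
# whose per-gate delay drifts with temperature, digitized by a counter.
def gate_delay(T, d0=1.0e-9, tc=2.0e-12):
    """Single-gate delay [s] with an assumed linear temperature dependence."""
    return d0 + tc * (T - 25.0)

def tdc_output(T, stages=512, ref_period=0.5e-9):
    """Counter value for the delay-line propagation time vs a reference clock."""
    line_delay = stages * gate_delay(T)
    return int(line_delay / ref_period)  # digital output code

for T in (0.0, 25.0, 50.0, 75.0):
    print(f"{T:5.1f} C -> code {tdc_output(T)}")
```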
Doyle, Thomas W.; Chivoiu, Bogdan; Enwright, Nicholas M.
2015-08-24
Global sea level is rising and may accelerate with continued fossil fuel consumption from industrial and population growth. In 2012, the U.S. Geological Survey conducted more than 30 training and feedback sessions with Federal, State, and nongovernmental organization (NGO) coastal managers and planners across the northern Gulf of Mexico coast to evaluate user needs, potential benefits, current scientific understanding, and utilization of resource aids and modeling tools focused on sea-level rise. In response to the findings from the sessions, this sea-level rise modeling handbook has been designed as a guide to the science and simulation models for understanding the dynamics and impacts of sea-level rise on coastal ecosystems. The review herein of decision-support tools and predictive models was compiled from the training sessions, from online research, and from publications. The purpose of this guide is to describe and categorize the suite of data, methods, and models and their design, structure, and application for hindcasting and forecasting the potential impacts of sea-level rise in coastal ecosystems. The data and models cover a broad spectrum of disciplines involving different designs and scales of spatial and temporal complexity for predicting environmental change and ecosystem response. These data and models have not heretofore been synthesized, nor have appraisals been made of their utility or limitations. Some models are demonstration tools for non-experts, whereas others require more expert capacity to apply for any given park, refuge, or regional application. A simplified tabular context has been developed to list and contrast a host of decision-support tools and models from the ecological, geological, and hydrological perspectives. Criteria were established to distinguish the source, scale, and quality of information input and geographic datasets; physical and biological constraints and relations; datum characteristics of water and land components; utility options for setting sea-level rise and climate change scenarios; and ease or difficulty of storing, displaying, or interpreting model output. Coastal land managers, engineers, and scientists can benefit from this synthesis of tools and models that have been developed for projecting causes and consequences of sea-level change on the landscape and seascape.
High fidelity simulations of infrared imagery with animated characters
NASA Astrophysics Data System (ADS)
Näsström, F.; Persson, A.; Bergström, D.; Berggren, J.; Hedström, J.; Allvar, J.; Karlsson, M.
2012-06-01
High fidelity simulations of IR signatures and imagery tend to be slow and do not offer effective support for animated characters. Simplified rendering methods based on computer graphics techniques can be used to overcome these limitations. This paper presents a method to combine these tools and produce simulated high fidelity thermal IR data of animated people in terrain. Infrared signatures for human characters have been calculated using RadThermIR. To handle multiple character models, these calculations use a simplified material model for the anatomy and clothing. Weather and temperature conditions match the IR texture used in the terrain model. The calculated signatures are applied to the animated 3D characters that, together with the terrain model, are used to produce high fidelity IR imagery of people or crowds. For high-level animation control and crowd simulation, HLAS (High Level Animation System) has been developed. Tools are available to create and visualize skeleton-based animations, but tools that allow control of the animated characters at a higher level, e.g. for crowd simulation, are usually expensive and closed source. We need the flexibility of HLAS to add animation into an HLA-enabled sensor system simulation framework.
Unsteady Turbopump Flow Simulations
NASA Technical Reports Server (NTRS)
Kiris, Cetin C.; Kwak, Dochan
2001-01-01
The objective of the current effort is two-fold: 1) to provide a computational framework for design and analysis of the entire fuel supply system of a liquid rocket engine; and 2) to provide high-fidelity unsteady turbopump flow analysis capability to support the design of pump sub-systems for advanced space transportation vehicles. Since space launch systems in the near future are likely to involve liquid propulsion systems, increasing the efficiency and reliability of the turbopump components is an important task. To date, computational tools for design and analysis of turbopump flow have been based on relatively low-fidelity methods. An unsteady, three-dimensional viscous flow analysis tool involving stationary and rotational components for the entire turbopump assembly has not been available, at least for real-world engineering applications. The present effort is an attempt to provide this capability so that vehicle developers can extract information such as transient flow phenomena at start-up, the impact of non-uniform inflow, and system vibration and its impact on the structure. These quantities are not readily available from simplified design tools. In this presentation, the progress being made toward a complete turbopump simulation capability for a liquid rocket engine is reported. The Space Shuttle Main Engine (SSME) turbopump is used as a test case for the performance evaluation of the hybrid MPI/OpenMP and MLP versions of the INS3D code. Relative motion of the grid system for rotor-stator interaction was obtained by employing overset grid techniques. Time-accuracy of the scheme has been evaluated using simple test cases. Unsteady computations for the SSME turbopump, which contains 106 zones with 34.5 million grid points, are currently underway on Origin 2000 systems at NASA Ames Research Center. Results from these time-accurate simulations with moving boundary capability, and the performance of the parallel versions of the code, will be presented.
Using Kepler for Tool Integration in Microarray Analysis Workflows.
Gan, Zhuohui; Stowe, Jennifer C; Altintas, Ilkay; McCulloch, Andrew D; Zambon, Alexander C
Increasing numbers of genomic technologies are producing massive amounts of genomic data, all of which require complex analysis. More and more bioinformatics analysis tools are being developed by scientists to simplify these analyses. However, different pipelines have been developed using different software environments, which makes integration of these diverse bioinformatics tools difficult. Kepler provides an open source environment to integrate these disparate packages. Using Kepler, we integrated several external tools, including Bioconductor packages, AltAnalyze (a Python-based open source tool), and an R-based comparison tool, to build an automated workflow to meta-analyze both online and local microarray data. The automated workflow connects the integrated tools seamlessly, delivers data flow between the tools smoothly, and hence improves the efficiency and accuracy of complex data analyses. Our workflow exemplifies the use of Kepler as a scientific workflow platform for bioinformatics pipelines.
Telfer, Scott; Erdemir, Ahmet; Woodburn, James; Cavanagh, Peter R
2016-01-25
Integration of patient-specific biomechanical measurements into the design of therapeutic footwear has been shown to improve clinical outcomes in patients with diabetic foot disease. The addition of numerical simulations intended to optimise intervention design may help to build on these advances, however at present the time and labour required to generate and run personalised models of foot anatomy restrict their routine clinical utility. In this study we developed second-generation personalised simple finite element (FE) models of the forefoot with varying geometric fidelities. Plantar pressure predictions from barefoot, shod, and shod with insole simulations using simplified models were compared to those obtained from CT-based FE models incorporating more detailed representations of bone and tissue geometry. A simplified model including representations of metatarsals based on simple geometric shapes, embedded within a contoured soft tissue block with outer geometry acquired from a 3D surface scan, was found to provide pressure predictions closest to the more complex model, with mean differences of 13.3 kPa (SD 13.4), 12.52 kPa (SD 11.9) and 9.6 kPa (SD 9.3) for barefoot, shod, and insole conditions respectively. The simplified model design could be produced in <1 h compared to >3 h in the case of the more detailed model, and solved on average 24% faster. FE models of the forefoot based on simplified geometric representations of the metatarsal bones and soft tissue surface geometry from 3D surface scans may potentially provide a simulation approach with improved clinical utility, however further validity testing around a range of therapeutic footwear types is required. Copyright © 2015 Elsevier Ltd. All rights reserved.
Falotico, Egidio; Vannucci, Lorenzo; Ambrosano, Alessandro; Albanese, Ugo; Ulbrich, Stefan; Vasquez Tieck, Juan Camilo; Hinkel, Georg; Kaiser, Jacques; Peric, Igor; Denninger, Oliver; Cauli, Nino; Kirtay, Murat; Roennau, Arne; Klinker, Gudrun; Von Arnim, Axel; Guyot, Luc; Peppicelli, Daniel; Martínez-Cañada, Pablo; Ros, Eduardo; Maier, Patrick; Weber, Sandro; Huber, Manuel; Plecher, David; Röhrbein, Florian; Deser, Stefan; Roitberg, Alina; van der Smagt, Patrick; Dillman, Rüdiger; Levi, Paul; Laschi, Cecilia; Knoll, Alois C.; Gewaltig, Marc-Oliver
2017-01-01
Combined efforts in the fields of neuroscience, computer science, and biology have made it possible to design biologically realistic models of the brain based on spiking neural networks. For a proper validation of these models, an embodiment in a dynamic and rich sensory environment, where the model is exposed to a realistic sensory-motor task, is needed. Due to the complexity of these brain models, which at the current stage cannot deal with real-time constraints, it is not possible to embed them in a real-world task; rather, the embodiment has to be simulated as well. While adequate tools exist to simulate either complex neural networks or robots and their environments, there is so far no tool that allows communication between brain and body models to be easily established. The Neurorobotics Platform is a new web-based environment that aims to fill this gap by offering scientists and technology developers a software infrastructure that allows them to connect brain models to detailed simulations of robot bodies and environments and to use the resulting neurorobotic systems for in silico experimentation. In order to simplify the workflow and reduce the level of programming skill required, the platform provides editors for the specification of experimental sequences and conditions, environments, robots, and brain-body connectors. In addition, a variety of existing robots and environments are provided. This work presents the architecture of the first release of the Neurorobotics Platform developed in subproject 10 "Neurorobotics" of the Human Brain Project (HBP). At the current state, the Neurorobotics Platform allows researchers to design and run basic experiments in neurorobotics using simulated robots and simulated environments linked to simplified versions of brain models. We illustrate the capabilities of the platform with three example experiments: a Braitenberg task implemented on a mobile robot, a sensory-motor learning task based on a robotic controller, and a visual tracking task embedding a retina model on the iCub humanoid robot. These use cases allow us to assess the applicability of the Neurorobotics Platform to robotic tasks as well as neuroscientific experiments. PMID:28179882
Simplified Interface to Complex Memory Hierarchies 1.x
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lang, Michael; Ionkov, Latchesar; Williams, Sean
2017-02-21
Memory systems are expected to get evermore complicated in the coming years, and it isn't clear exactly what form that complexity will take. On the software side, a simple, flexible way of identifying and working with memory pools is needed. Additionally, most developers seek code portability and do not want to learn the intricacies of complex memory. Hence, we believe that a library for interacting with complex memory systems should expose two kinds of abstraction: first, a low-level, mechanism-based interface designed for the runtime or advanced user that wants complete control, with its focus on simplified representation but with all decisions left to the caller; second, a high-level, policy-based interface designed for ease of use for the application developer, in which we aim for best-practice decisions based on application intent. We have developed such a library, called SICM: Simplified Interface to Complex Memory.
NASA's Cryogenic Fluid Management Technology Project
NASA Technical Reports Server (NTRS)
Tramel, Terri L.; Motil, Susan M.
2008-01-01
The Cryogenic Fluid Management (CFM) Project's primary objective is to develop storage, transfer, and handling technologies for cryogens that will support the enabling of high performance cryogenic propulsion systems, lunar surface systems and economical ground operations. Such technologies can significantly reduce propellant launch mass and required on-orbit margins, reduce or even eliminate propellant tank fluid boil-off losses for long term missions, and simplify vehicle operations. This paper will present the status of the specific technologies that the CFM Project is developing. The two main areas of concentration are analysis model development and CFM hardware development. The project develops analysis tools and models based on thermodynamics, hydrodynamics, and existing flight/test data. These tools assist in the development of pressure/thermal control devices (such as the Thermodynamic Vent System (TVS) and multi-layer insulation), with the ultimate goal of developing a mature set of tools and models that can characterize the performance of the pressure/thermal control devices incorporated in the design of an entire CFM system with minimal cryogen loss. The project does hardware development and testing to verify our understanding of the physical principles involved, and to validate the performance of CFM components, subsystems and systems. This database provides information to anchor our analytical models. This paper describes some of the current activities of NASA's Cryogenic Fluid Management Project.
MassCascade: Visual Programming for LC-MS Data Processing in Metabolomics.
Beisken, Stephan; Earll, Mark; Portwood, David; Seymour, Mark; Steinbeck, Christoph
2014-04-01
Liquid chromatography coupled to mass spectrometry (LC-MS) is commonly applied to investigate the small molecule complement of organisms. Several software tools are typically joined in custom pipelines to semi-automatically process and analyse the resulting data. General workflow environments like the Konstanz Information Miner (KNIME) offer the potential of an all-in-one solution to process LC-MS data by allowing easy integration of different tools and scripts. We describe MassCascade and its workflow plug-in for processing LC-MS data. The Java library integrates frequently used algorithms in a modular fashion, thus enabling it to serve as back-end for graphical front-ends. The functions available in MassCascade have been encapsulated in a plug-in for the workflow environment KNIME, allowing combined use with e.g. statistical workflow nodes from other providers and making the tool intuitive to use without knowledge of programming. The design of the software guarantees a high level of modularity where processing functions can be quickly replaced or concatenated. MassCascade is an open-source library for LC-MS data processing in metabolomics. It embraces the concept of visual programming through its KNIME plug-in, simplifying the process of building complex workflows. The library was validated using open data.
Mobile, Virtual Enhancements for Rehabilitation (MOVER)
2015-05-31
The patient uses COTS input devices, such as the Microsoft Kinect and the Wii Balance Board, to perform therapeutic exercises that are mapped to controls... In place of having an exercise creation tool for the therapists, we have simplified the process by hardcoding specific, commonly used balance
Thermodynamics--A Practical Subject.
ERIC Educational Resources Information Center
Jones, Hugh G.
1984-01-01
Provides a simplified, synoptic overview of the area of thermodynamics, enumerating and explaining the four basic laws, and introducing the mathematics involved in a stepwise fashion. Discusses such basic tools of thermodynamics as enthalpy, entropy, Helmholtz free energy, and Gibbs free energy, and their uses in problem solving. (JM)
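For reference alongside this abstract, the standard textbook definitions of the potentials it lists (facts not specific to the article) are:

```latex
H = U + PV \qquad \text{(enthalpy)}
A = U - TS \qquad \text{(Helmholtz free energy)}
G = H - TS = U + PV - TS \qquad \text{(Gibbs free energy)}
```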
ClusterCAD: a computational platform for type I modular polyketide synthase design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eng, Clara H.; Backman, Tyler W H; Bailey, Constance B.
2017-10-11
Here, we present ClusterCAD, a web-based toolkit designed to leverage the collinear structure and deterministic logic of type I modular polyketide synthases (PKSs) for synthetic biology applications. The unique organization of these megasynthases, combined with the diversity of their catalytic domain building blocks, has fueled an interest in harnessing the biosynthetic potential of PKSs for the microbial production of both novel natural product analogs and industrially relevant small molecules. However, a limited theoretical understanding of the determinants of PKS fold and function poses a substantial barrier to the design of active variants, and identifying strategies to reliably construct functional PKS chimeras remains an active area of research. In this work, we formalize a paradigm for the design of PKS chimeras and introduce ClusterCAD as a computational platform to streamline and simplify the process of designing experiments to test strategies for engineering PKS variants. ClusterCAD provides chemical structures with stereochemistry for the intermediates generated by each PKS module, as well as sequence- and structure-based search tools that allow users to identify modules based either on amino acid sequence or on the chemical structure of the cognate polyketide intermediate. ClusterCAD can be accessed at https://clustercad.jbei.org and at http://clustercad.igb.uci.edu.
Mojo Hand, a TALEN design tool for genome editing applications.
Neff, Kevin L; Argue, David P; Ma, Alvin C; Lee, Han B; Clark, Karl J; Ekker, Stephen C
2013-01-16
Recent studies of transcription activator-like (TAL) effector domains fused to nucleases (TALENs) demonstrate enormous potential for genome editing. Effective design of TALENs requires a combination of selecting appropriate genetic features, finding pairs of binding sites based on a consensus sequence, and, in some cases, identifying endogenous restriction sites for downstream molecular genetic applications. We present the web-based program Mojo Hand for designing TAL and TALEN constructs for genome editing applications (http://www.talendesign.org). We describe the algorithm and its implementation. The features of Mojo Hand include (1) automatic download of genomic data from the National Center for Biotechnology Information, (2) analysis of any DNA sequence to reveal pairs of binding sites based on a user-defined template, (3) selection of restriction-enzyme recognition sites in the spacer between the TAL monomer binding sites, including options for the selection of restriction enzyme suppliers, and (4) output files designed for subsequent TALEN construction using the Golden Gate assembly method. Mojo Hand enables the rapid identification of TAL binding sites for use in TALEN design. The assembly of TALEN constructs is also simplified by using the TAL-site prediction program in conjunction with a spreadsheet aid for managing reagent concentrations and TALEN formulation. Mojo Hand enables scientists to deploy TALENs more rapidly for genome editing applications.
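The pair-finding step can be illustrated without Mojo Hand itself. The scanner below is only a sketch of the general idea: TAL binding sites conventionally begin with a 5' T, so a TALEN pair is two T-anchored arms on opposite strands flanking a spacer. The arm and spacer lengths are typical published ranges used here as assumptions, and the demo sequence is arbitrary.

```python
# Illustrative TALEN pair-site scanner (not Mojo Hand's actual algorithm).
def find_talen_pairs(seq, arm=15, spacer=(14, 18)):
    seq = seq.upper()
    hits = []
    for i in range(len(seq) - 2 * arm - spacer[0]):
        if seq[i] != "T":                      # left arm starts with a 5' T
            continue
        for s in range(spacer[0], spacer[1] + 1):
            j = i + arm + s                    # start of right arm (fwd coords)
            if j + arm > len(seq):
                break
            # The right arm binds the reverse strand, so its 5' T appears as
            # an 'A' at the last forward-strand position of the arm.
            if seq[j + arm - 1] == "A":
                hits.append((i, seq[i:i + arm], seq[i + arm:j], seq[j:j + arm]))
    return hits

demo = "TGACCTGAAGTCCAGTAGCTGACTGACTAGCTAGCATCAGATTACGATCGA"
for pos, left, spc, right in find_talen_pairs(demo):
    print(pos, left, spc, right)
```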
MPI, HPF or OpenMP: A Study with the NAS Benchmarks
NASA Technical Reports Server (NTRS)
Jin, Hao-Qiang; Frumkin, Michael; Hribar, Michelle; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)
1999-01-01
Porting applications to new high performance parallel and distributed platforms is a challenging task. Writing parallel code by hand is time consuming and costly, but the task can be simplified by high level languages and, better still, automated by parallelizing tools and compilers. The definition of the HPF (High Performance Fortran, based on the data parallel model) and OpenMP (based on the shared memory parallel model) standards has offered great opportunity in this respect. Both provide simple and clear interfaces to languages like FORTRAN and simplify many tedious tasks encountered in writing message passing programs. In our study we implemented the parallel versions of the NAS Benchmarks with HPF and OpenMP directives. Comparison of their performance with the MPI implementation, and the pros and cons of the different approaches, will be discussed along with experience in using computer-aided tools to help parallelize these benchmarks. Based on the study, the potential of applying some of these techniques to realistic aerospace applications will be presented.
NASA Astrophysics Data System (ADS)
Tonini, Roberto; Selva, Jacopo; Costa, Antonio; Sandri, Laura
2014-05-01
Probabilistic Hazard Assessment (PHA) is becoming an essential tool for risk mitigation policies, since it quantifies the hazard posed by hazardous phenomena and, unlike the deterministic approach, accounts for both aleatory and epistemic uncertainties. On the other hand, one of the main disadvantages of PHA methods is that their results are not easy to understand and interpret by people who are not specialists in probabilistic tools. For scientists, this raises the issue of providing tools that can be easily used and understood by decision makers (i.e., risk managers or local authorities). The work presented here addresses the problem of simplifying the transfer of scientific knowledge into land protection policies by providing an interface between scientists, who produce PHA results, and decision makers, who use those results for risk analyses. In this framework we present pyPHaz, an open tool developed and designed to visualize and analyze PHA results due to one or more phenomena affecting a specific area of interest. The software has been fully developed with the free and open-source Python programming language and several Python-based libraries and modules. The pyPHaz tool allows the user to visualize the Hazard Curves (HC) calculated in a selected target area together with different levels of uncertainty (mean and percentiles) on maps that can be interactively created and modified, thanks to a dedicated Graphical User Interface (GUI). Moreover, the tool can be used to compare the results of different PHA models and to merge them into ensemble models. The pyPHaz software stores and accesses all data through a MySQL database and can read as input the XML-based standard file formats defined in the frame of GEM (Global Earthquake Model). This format model is easy to extend to any other kind of hazard, as shown in the example applications of pyPHaz, which focus on a Probabilistic Volcanic Hazard Assessment (PVHA) for tephra dispersal and fallout applied to the municipality of Naples.
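The central plotting task pyPHaz automates, a hazard curve with uncertainty bands, is easy to sketch. The snippet below is a generic matplotlib illustration using synthetic data, not pyPHaz's actual API.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical ensemble of hazard curves: probability of exceeding a tephra
# load threshold (kg/m^2) from several alternative PVHA models.
thresholds = np.logspace(-1, 3, 50)            # intensity measure levels
rng = np.random.default_rng(0)
ensemble = np.exp(-np.outer(rng.uniform(0.5, 2.0, 20), np.log10(thresholds) + 1))

mean = ensemble.mean(axis=0)
p16, p84 = np.percentile(ensemble, [16, 84], axis=0)

plt.fill_between(thresholds, p16, p84, alpha=0.3, label="16th-84th percentile")
plt.plot(thresholds, mean, "k-", label="mean hazard curve")
plt.xscale("log")
plt.xlabel("tephra load (kg/m$^2$)")
plt.ylabel("probability of exceedance")
plt.legend()
plt.show()
```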
Nonstandard and Higher-Order Finite-Difference Methods for Electromagnetics
2009-10-26
[Extraction residue from the report's front matter; the recoverable fragments describe a simplified fuselage model filled with 90 passengers, expanded polystyrene passenger supports cut from 1-inch-thick sheet to keep the passengers upright and in their designated locations, and measured S11 of the exterior antenna of the simplified fuselage.]
VEST: Abstract Vector Calculus Simplification in Mathematica
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. Squire, J. Burby and H. Qin
2013-03-12
We present a new package, VEST (Vector Einstein Summation Tools), that performs abstract vector calculus computations in Mathematica. Through the use of index notation, VEST is able to reduce scalar and vector expressions of a very general type using a systematic canonicalization procedure. In addition, utilizing properties of the Levi-Civita symbol, the program can derive types of multi-term vector identities that are not recognized by canonicalization, subsequently applying these to simplify large expressions. In a companion paper [1], we employ VEST in the automation of the calculation of Lagrangians for the single particle guiding center system in plasma physics, a computation which illustrates its ability to handle very large expressions. VEST has been designed to be simple and intuitive to use, both for basic checking of work and more involved computations.
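VEST itself is a Mathematica package, but the Levi-Civita machinery it exploits can be checked in any computer algebra system. Here is a small SymPy sketch verifying the epsilon-delta contraction identity, the workhorse behind multi-term vector identities:

```python
from sympy import KroneckerDelta, LeviCivita

# Verify the epsilon-delta contraction identity (sum over i):
#   e_ijk e_imn = d_jm d_kn - d_jn d_km
for j in range(3):
    for k in range(3):
        for m in range(3):
            for n in range(3):
                lhs = sum(LeviCivita(i, j, k) * LeviCivita(i, m, n) for i in range(3))
                rhs = (KroneckerDelta(j, m) * KroneckerDelta(k, n)
                       - KroneckerDelta(j, n) * KroneckerDelta(k, m))
                assert lhs == rhs
print("epsilon-delta identity verified over all index values")
```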
Fabrication of PDMS-Based Microfluidic Devices: Application for Synthesis of Magnetic Nanoparticles
NASA Astrophysics Data System (ADS)
Thu, Vu Thi; Mai, An Ngoc; Le The Tam; Van Trung, Hoang; Thu, Phung Thi; Tien, Bui Quang; Thuat, Nguyen Tran; Lam, Tran Dai
2016-05-01
In this work, we have developed a convenient approach to synthesizing magnetic nanoparticles with relatively high magnetization and controllable sizes. This was realized by combining the traditional co-precipitation method with microfluidic techniques inside microfluidic devices. The device was first designed and then fabricated using simplified soft-lithography techniques, and was used to synthesize magnetite nanoparticles. The synthesized nanomaterials were thoroughly characterized using field emission scanning electron microscopy and a vibrating sample magnetometer. The results demonstrate that the as-prepared device can serve as a simple and effective tool for synthesizing magnetic nanoparticles with sizes below 10 nm and magnetizations above 50 emu/g. The development of these devices opens new strategies for synthesizing nanomaterials with more precise dimensions, narrow size distributions and controllable behaviors.
Allergen screening bioassays: recent developments in lab-on-a-chip and lab-on-a-disc systems.
Ho, Ho-pui; Lau, Pui-man; Kwok, Ho-chin; Wu, Shu-yuen; Gao, Minghui; Cheung, Anthony Ka-lun; Chen, Qiulan; Wang, Guanghui; Kwan, Yiu-wa; Wong, Chun-kwok; Kong, Siu-kai
2014-01-01
Allergies occur when a person's immune system mounts an abnormal response, with or without IgE, to a normally harmless substance called an allergen. The standard skin-prick test introduces suspected allergens into the skin with lancets in order to trigger allergic reactions. This test is unpleasant and sometimes life-threatening. New tools such as lab-on-a-chip and lab-on-a-disc, which rely on microfabrication, are designed for allergy testing. These systems provide benefits such as short analysis times, enhanced sensitivity, simplified procedures, minimal consumption of sample and reagents, and low cost. This article gives a summary of these systems. In particular, a cell-based assay detecting both IgE- and non-IgE-type triggers through the study of degranulation in a centrifugal microfluidic system is highlighted.
Tools for Physiology Labs: Inexpensive Equipment for Physiological Stimulation
Land, Bruce R.; Johnson, Bruce R.; Wyttenbach, Robert A.; Hoy, Ronald R.
2004-01-01
We describe the design of inexpensive equipment and software for physiological stimulation in the neurobiology teaching laboratory. The core component is a stimulus isolation unit (SIU) that uses DC-DC converters, rather than expensive high-voltage batteries, to generate isolated power at high voltage. The SIU has no offset when inactive and produces pulses up to 100 V with moderately fast (50 μs) rise times. We also describe two methods of stimulus timing control. The first is a simplified conventional, stand-alone analog pulse generator. The second uses a digital microcontroller interfaced with a personal computer. The SIU has performed well and withstood intensive use in our undergraduate physiology laboratory. This project is part of our ongoing effort to make reliable low-cost physiology equipment available for both student teaching and faculty research laboratories. PMID:23493817
NASA Astrophysics Data System (ADS)
Miyake, Hiroshi; Masuzawa, Hideaki
A medical consultation system has been developed that encompasses knowledge of various specialties. The system is designed to be used by general practitioners and by inhabitants themselves. Its characteristics are: (1) the input of complaints is simplified by the use of multiple-choice questionnaires; (2) the system advises the person whether to seek medical help and, if so, the degree of urgency and what type of practitioner or specialist to consult; (3) it supplies the doctor with information regarding essential symptoms and possible diagnoses; (4) it offers specialists easy tools for building medical consultation systems themselves. This system is intended as an answer to the common problem of uncertainty, on the part of both inhabitants and doctors, as to the area of medical specialty that applies to a given disease.
A high burnup model developed for the DIONISIO code
NASA Astrophysics Data System (ADS)
Soba, A.; Denis, A.; Romero, L.; Villarino, E.; Sardella, F.
2013-02-01
A group of subroutines, designed to extend the application range of the fuel performance code DIONISIO to high burnup, has recently been included in the code. The new calculation tools, which are tuned for UO2 fuels under LWR conditions, predict the radial distribution of power density, burnup, and concentration of diverse nuclides within the pellet. The balance equations of all the isotopes involved in the fission process are solved in a simplified manner, and the one-group effective cross sections of all of them are obtained as functions of the radial position in the pellet, burnup, and enrichment in 235U. In this work, the subroutines are described and the results of simulations performed with DIONISIO are presented. They agree well with the data provided in the FUMEX II/III NEA data bank.
Parallel stochastic simulation of macroscopic calcium currents.
González-Vélez, Virginia; González-Vélez, Horacio
2007-06-01
This work introduces MACACO, a macroscopic calcium currents simulator. It provides a parameter-sweep framework which computes macroscopic Ca(2+) currents from the individual aggregation of unitary currents, using a stochastic model for L-type Ca(2+) channels. MACACO uses a simplified 3-state Markov model to simulate the response of each Ca(2+) channel to different voltage inputs to the cell. In order to provide an accurate systematic view for the stochastic nature of the calcium channels, MACACO is composed of an experiment generator, a central simulation engine and a post-processing script component. Due to the computational complexity of the problem and the dimensions of the parameter space, the MACACO simulation engine employs a grid-enabled task farm. Having been designed as a computational biology tool, MACACO heavily borrows from the way cell physiologists conduct and report their experimental work.
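The aggregation step can be sketched as follows. This Python fragment simulates an ensemble of independent channels, each following a 3-state Markov chain, and sums the unitary currents of the open channels; the transition probabilities and unitary current are illustrative assumptions, not MACACO's fitted, voltage-dependent rates.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative 3-state channel: 0 = closed, 1 = open, 2 = inactivated.
# Per-step transition probabilities (assumed values; in MACACO these
# would depend on the membrane voltage input).
P = np.array([[0.90, 0.10, 0.00],
              [0.05, 0.85, 0.10],
              [0.02, 0.00, 0.98]])

n_channels, n_steps = 1000, 500
i_unitary = -0.3                      # unitary current when open (pA), assumed
states = np.zeros(n_channels, dtype=int)
macroscopic = np.empty(n_steps)

for t in range(n_steps):
    # Advance every channel one step through the Markov chain
    # by inverse-CDF sampling of its transition row.
    u = rng.random(n_channels)
    cum = P[states].cumsum(axis=1)
    states = (u[:, None] < cum).argmax(axis=1)
    macroscopic[t] = i_unitary * (states == 1).sum()

print("peak macroscopic current: %.1f pA" % macroscopic.min())
```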
VEST: Abstract vector calculus simplification in Mathematica
NASA Astrophysics Data System (ADS)
Squire, J.; Burby, J.; Qin, H.
2014-01-01
We present a new package, VEST (Vector Einstein Summation Tools), that performs abstract vector calculus computations in Mathematica. Through the use of index notation, VEST is able to reduce three-dimensional scalar and vector expressions of a very general type to a well defined standard form. In addition, utilizing properties of the Levi-Civita symbol, the program can derive types of multi-term vector identities that are not recognized by reduction, subsequently applying these to simplify large expressions. In a companion paper Burby et al. (2013) [12], we employ VEST in the automation of the calculation of high-order Lagrangians for the single particle guiding center system in plasma physics, a computation which illustrates its ability to handle very large expressions. VEST has been designed to be simple and intuitive to use, both for basic checking of work and more involved computations.
Developments in Geometric Metadata and Tools at the PDS Ring-Moon Systems Node
NASA Astrophysics Data System (ADS)
Showalter, M. R.; Ballard, L.; French, R. S.; Gordon, M. K.; Tiscareno, M. S.
2018-04-01
Object-Oriented Python/SPICE (OOPS) is an overlay on the SPICE toolkit that vastly simplifies and speeds up geometry calculations for planetary data products. This toolkit is the basis for much of the development at the PDS Ring-Moon Systems Node.
Skyline: an open source document editor for creating and analyzing targeted proteomics experiments.
MacLean, Brendan; Tomazela, Daniela M; Shulman, Nicholas; Chambers, Matthew; Finney, Gregory L; Frewen, Barbara; Kern, Randall; Tabb, David L; Liebler, Daniel C; MacCoss, Michael J
2010-04-01
Skyline is a Windows client application for targeted proteomics method creation and quantitative data analysis. It is open source and freely available for academic and commercial use. The Skyline user interface simplifies the development of mass spectrometer methods and the analysis of data from targeted proteomics experiments performed using selected reaction monitoring (SRM). Skyline supports using and creating MS/MS spectral libraries from a wide variety of sources to choose SRM filters and verify results based on previously observed ion trap data. Skyline exports transition lists to and imports the native output files from Agilent, Applied Biosystems, Thermo Fisher Scientific and Waters triple quadrupole instruments, seamlessly connecting mass spectrometer output back to the experimental design document. The fast and compact Skyline file format is easily shared, even for experiments requiring many sample injections. A rich array of graphs displays results and provides powerful tools for inspecting data integrity as data are acquired, helping instrument operators to identify problems early. The Skyline dynamic report designer exports tabular data from the Skyline document model for in-depth analysis with common statistical tools. Single-click, self-updating web installation is available at http://proteome.gs.washington.edu/software/skyline. This web site also provides access to instructional videos, a support board, an issues list and a link to the source code project.
NASA Astrophysics Data System (ADS)
Adrich, Przemysław
2016-05-01
In Part I of this work, existing methods and problems in dual foil electron beam forming system design are presented. On this basis, a new method of designing these systems is introduced. The motivation behind this work is to eliminate the shortcomings of the existing design methods and improve the overall efficiency of the dual foil design process. The existing methods are based on approximate analytical models applied in an unrealistically simplified geometry. Designing a dual foil system with these methods is a rather labor-intensive task, as corrections to account for effects not included in the analytical models have to be calculated separately and applied in an iterative procedure. To eliminate these drawbacks, the new design method is based entirely on Monte Carlo modeling in a realistic geometry, using physics models that include all relevant processes. In our approach, an optimal configuration of the dual foil system is found by means of a systematic, automated scan of system performance as a function of the foil parameters. The new method, while computationally intensive, minimizes the involvement of the designer and considerably shortens the overall design time. The results are of high quality, as all the relevant physics and geometry details are naturally accounted for. To demonstrate the feasibility of practical implementation of the new method, specialized software tools were developed and applied to solve a real-life design problem, as described in Part II of this work.
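The scan itself is conceptually simple; the expense lies in the Monte Carlo evaluations. Below is a schematic Python version with a stand-in figure of merit (a real implementation would call a particle transport code for each foil pair):

```python
import itertools
import numpy as np

def beam_uniformity(t1_mm, t2_mm):
    """Stand-in for the Monte Carlo figure of merit (e.g. dose uniformity
    at the exit window). A real implementation would run a transport
    simulation for each (primary foil, secondary foil) thickness pair."""
    return -((t1_mm - 0.7) ** 2 + 0.5 * (t2_mm - 1.4) ** 2)  # toy response surface

primary = np.linspace(0.1, 1.5, 15)    # primary foil thickness grid (mm), assumed
secondary = np.linspace(0.5, 3.0, 26)  # secondary foil thickness grid (mm), assumed

best = max(itertools.product(primary, secondary),
           key=lambda pair: beam_uniformity(*pair))
print("optimal foil pair: primary %.2f mm, secondary %.2f mm" % best)
```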
NASA Technical Reports Server (NTRS)
DeChant, Lawrence J.
1997-01-01
In spite of the rapid advances in both scalar and parallel computational tools, the large number and breadth of variables involved in aerodynamic systems make the use of parabolized or even boundary layer fluid flow models impractical for both preliminary design and inverse design problems. Given this restriction, we have concluded that reduced or approximate models are an important family of tools for design purposes. This study of a combined perturbation/numerical modeling methodology, with an application to ejector-mixer nozzles, is nearing completion. The work is being funded by a grant from the NASA Lewis Research Center to Texas A&M University. These ejector-mixer nozzle models are designed to be of use to the High Speed Civil Transport Program and may be adopted by both NASA and industry. A computer code incorporating the ejector-mixer models is under development. This code, the Differential Reduced Ejector/Mixer Analysis (DREA), can be run fast enough to be used as a subroutine or to be called by a design optimization routine. Simplified conservation equations--x-momentum, energy, and mass conservation--are used to define the model. Unlike other preliminary design models, DREA requires minimal empirical input and includes vortical mixing and a fully compressible formulation among other features. DREA is being validated by comparing it with results obtained from the open literature and proprietary industry data. Preliminary results have been obtained for a subsonic ejector and a supersonic ejector. In addition, dedicated experiments have been performed at Texas A&M. These experiments use a hydraulic/gas flow analog to provide information about the inviscid mixing interface structure. Final validation and documentation of this work are expected by May of 1997; preliminary versions of DREA can be expected in early 1997. In summary, DREA provides a sufficiently detailed and realistic ejector-mixer nozzle model at a computational cost compatible with preliminary design applications.
Oscillating water column structural model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Copeland, Guild; Bull, Diana L; Jepsen, Richard Alan
2014-09-01
An oscillating water column (OWC) wave energy converter is a structure with an opening to the ocean below the free surface, i.e. a structure with a moonpool. Two structural models for a non-axisymmetric terminator design OWC, the Backward Bent Duct Buoy (BBDB), are discussed in this report. The results of this structural model design study are intended to inform experiments and modeling underway in support of the U.S. Department of Energy (DOE) initiated Reference Model Project (RMP). A detailed design developed by Re Vision Consulting used stiffeners and girders to stabilize the structure against the hydrostatic loads experienced by a BBDB device. Additional support plates were added to this structure to account for loads arising from the mooring line attachment points. A simplified structure was designed in a modular fashion. This simplified design allows easy alterations to the buoyancy chambers and uncomplicated analysis of resulting changes in buoyancy.
Design considerations for eye-safe single-aperture laser radars
NASA Astrophysics Data System (ADS)
Starodubov, D.; McCormick, K.; Volfson, L.
2015-05-01
The design considerations for low cost, shock resistant, compact and efficient laser radars and ranging systems are discussed. The reviewed single-aperture approach reduces the size, weight and power of the system. Additional design benefits include improved stability, reliability and rigidity of the overall system. The proposed modular architecture provides a simple way of varying the performance parameters of the range-finder product family by selecting specific sets of illumination and detection modules. Operational performance challenges are presented. The implementation of non-reciprocal optical elements is considered. Cross talk between the illumination and detection channels in a single-aperture design is reviewed, and 3D imaging capability for ranging applications is considered. A simplified assembly and testing process that allows single-aperture range finders to be mass produced is discussed. The eye safety of range finder operation is summarized.
Development of integrated control system for smart factory in the injection molding process
NASA Astrophysics Data System (ADS)
Chung, M. J.; Kim, C. Y.
2018-03-01
In this study, we propose an integrated control system for automating the injection molding process, a prerequisite for building a smart factory. The injection molding process consists of heating, tool close, injection, cooling, tool open, and take-out. A take-out robot controller, an image processing module, and a process data acquisition interface module were developed and assembled into the integrated control system. By adopting the integrated control system, the injection molding process can be simplified and the cost of building a smart factory reduced.
Elaborate SMART MCNP Modelling Using ANSYS and Its Applications
NASA Astrophysics Data System (ADS)
Song, Jaehoon; Surh, Han-bum; Kim, Seung-jin; Koo, Bonsueng
2017-09-01
An MCNP 3-dimensional model can be widely used to evaluate various design parameters such as a core design or shielding design. Conventionally, a simplified 3-dimensional MCNP model is applied to calculate these parameters because of the cumbersomeness of modelling by hand. ANSYS provides a function for converting the CAD 'stp' format into the geometry part of an MCNP input. Using ANSYS and a 3-dimensional CAD file, a very detailed and sophisticated MCNP 3-dimensional model can be generated. The MCNP model is applied to evaluate the assembly weighting factors at the ex-core detector of SMART, and the results are compared with those of a simplified MCNP SMART model and with assembly weighting factors calculated by DORT, a deterministic Sn code.
Effects of Simplifying Choice Tasks on Estimates of Taste Heterogeneity in Stated-Choice Surveys
Johnson, F. Reed; Ozdemir, Semra; Phillips, Kathryn A
2011-01-01
Researchers usually employ orthogonal arrays or D-optimal designs with little or no attribute overlap in stated-choice surveys. The challenge is to balance statistical efficiency and respondent burden to minimize the overall error in the survey responses. This study examined whether simplifying the choice task, by using a design with more overlap, provides advantages over standard minimum-overlap methods. We administered two designs for eliciting HIV test preferences to split samples. Surveys were undertaken at four HIV testing locations in San Francisco, California. Personal characteristics had different effects on willingness to pay for the two treatments, and gains in statistical efficiency in the minimal-overlap version more than compensated for possible imprecision from increased measurement error. PMID:19880234
[Simplified identification and filter device of carbon dioxide].
Mei, Xue-qin; Zhang, Yi-ping
2009-11-01
This paper presents the design and implementation of a simplified device to identify and filter carbon dioxide. The gas passes through a test interface containing wet litmus paper before entering the abdominal cavity. Carbon dioxide dissolves in water to form an acidic solution, changing the color of the litmus paper and thereby identifying the gas; this prevents the wrong gas from being connected during endoscopic surgery.
NASA Technical Reports Server (NTRS)
Deal, J. H.
1975-01-01
One approach to simplifying complex nonlinear filtering algorithms is to use stratified probability approximations, in which the continuous probability density functions of certain random variables are represented by discrete mass approximations. This technique is developed in this paper and used to simplify the filtering algorithms developed for the optimum receiver for signals corrupted by both additive and multiplicative noise.
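The idea of a discrete mass approximation can be illustrated in a few lines: replace a continuous density with weighted point masses on a grid and check that low-order moments are preserved. The grid and weighting below are our assumptions, not the paper's construction.

```python
import numpy as np
from scipy.stats import norm

# Represent a standard normal density by discrete point masses on a grid
# (a stratified approximation in the spirit of the paper).
points = np.linspace(-4, 4, 9)
weights = norm.pdf(points)
weights /= weights.sum()          # normalize to a probability mass function

mean = (weights * points).sum()
var = (weights * (points - mean) ** 2).sum()
print("discrete mean %.3f, variance %.3f (targets: 0, 1)" % (mean, var))
```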
'PACLIMS': a component LIM system for high-throughput functional genomic analysis.
Donofrio, Nicole; Rajagopalon, Ravi; Brown, Douglas; Diener, Stephen; Windham, Donald; Nolin, Shelly; Floyd, Anna; Mitchell, Thomas; Galadima, Natalia; Tucker, Sara; Orbach, Marc J; Patel, Gayatri; Farman, Mark; Pampanwar, Vishal; Soderlund, Cari; Lee, Yong-Hwan; Dean, Ralph A
2005-04-12
Recent advances in sequencing techniques leading to cost reduction have resulted in the generation of a growing number of sequenced eukaryotic genomes. Computational tools greatly assist in defining open reading frames and assigning tentative annotations. However, gene functions cannot be asserted without biological support through, among other things, mutational analysis. In taking a genome-wide approach to functionally annotating an entire organism (in this application, the approximately 11,000 predicted genes in the rice blast fungus Magnaporthe grisea), an effective platform was required for tracking and storing both the biological materials created and the data produced across several participating institutions. The platform designed, named PACLIMS, was built to support our high throughput pipeline for generating 50,000 random insertion mutants of Magnaporthe grisea. To be a useful tool for materials and data tracking and storage, PACLIMS was designed to be simple to use, modifiable to accommodate refinement of research protocols, and cost-efficient. Data entry into PACLIMS was simplified through the use of barcodes and scanners, thus reducing potential human error, time constraints, and labor. This platform was designed in concert with our experimental protocol so that it leads the researchers through each step of the process from mutant generation through phenotypic assays, thus ensuring that every mutant produced is handled in an identical manner and all necessary data is captured. Many sequenced eukaryotes have reached the point where computational analyses are no longer sufficient and require biological support for their predicted genes. Consequently, there is an increasing need for platforms that support high throughput genome-wide mutational analyses. While PACLIMS was designed specifically for this project, the source and ideas present in its implementation can be used as a model for other high throughput mutational endeavors.
CHEMICAL MARKERS OF HUMAN WASTE CONTAMINATION IN SOURCE WATERS: A SIMPLIFIED ANALYTICAL APPROACH
Giving public water authorities a tool to monitor and measure levels of human waste contamination of waters simply and rapidly would enhance public protection. This methodology, using both urobilin and azithromycin (or any other human-use pharmaceutical) could be used to give pub...
The Complexity of Developmental Predictions from Dual Process Models
ERIC Educational Resources Information Center
Stanovich, Keith E.; West, Richard F.; Toplak, Maggie E.
2011-01-01
Drawing developmental predictions from dual-process theories is more complex than is commonly realized. Overly simplified predictions drawn from such models may lead to premature rejection of the dual process approach as one of many tools for understanding cognitive development. Misleading predictions can be avoided by paying attention to several…
Upper-Division Student Difficulties with the Dirac Delta Function
ERIC Educational Resources Information Center
Wilcox, Bethany R.; Pollock, Steven J.
2015-01-01
The Dirac delta function is a standard mathematical tool that appears repeatedly in the undergraduate physics curriculum in multiple topical areas including electrostatics, and quantum mechanics. While Dirac delta functions are often introduced in order to simplify a problem mathematically, students still struggle to manipulate and interpret them.…
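For reference, the simplification the delta function buys comes from its sifting property, which collapses integrals to point evaluations:

```latex
% Sifting property of the Dirac delta
\int_{-\infty}^{\infty} f(x)\,\delta(x - a)\,dx = f(a)
% e.g. a point charge q at x_0 enters Gauss's law via \rho(x) = q\,\delta(x - x_0)
```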
Assessment of optional sediment transport functions via the complex watershed simulation model SWAT
USDA-ARS's Scientific Manuscript database
The Soil and Water Assessment Tool 2012 (SWAT2012) offers four sediment routing methods as optional alternatives to the default simplified Bagnold method. Previous studies compared only one of these alternative sediment routing methods with the default method. The proposed study evaluated the impac...
USDA-ARS's Scientific Manuscript database
Advanced Land Surface Models (LSM) offer a powerful tool for studying hydrological variability. Highly managed systems, however, present a challenge for these models, which typically have simplified or incomplete representations of human water use. Here we examine recent groundwater declines in the ...
PD5: a general purpose library for primer design software.
Riley, Michael C; Aubrey, Wayne; Young, Michael; Clare, Amanda
2013-01-01
Complex PCR applications for large genome-scale projects require fast, reliable and often highly sophisticated primer design software applications. Presently, such applications use pipelining methods to utilise many third-party applications, and this involves file parsing, interfacing and data conversion, which is slow and prone to error. A fully integrated suite of software tools for primer design would considerably improve the development time, the processing speed, and the reliability of bespoke primer design software applications. The PD5 software library is an open-source collection of classes and utilities, providing a complete collection of software building blocks for primer design and analysis. It is written in object-oriented C++ with an emphasis on classes suitable for efficient and rapid development of bespoke primer design programs. The modular design of the software library simplifies the development of specific applications and also integration with existing third-party software where necessary. We demonstrate several applications created using this software library that have already proved to be effective, but we view the project as a dynamic environment for building primer design software, and it is open for future development by the bioinformatics community. The PD5 software library is therefore published under the terms of the GNU General Public License, which guarantees access to source code and allows redistribution and modification. The PD5 software library is downloadable from Google Code, and the accompanying Wiki includes instructions and examples: http://code.google.com/p/primer-design.
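PD5 is a C++ library; to convey the kind of building blocks such a library provides, here is a hypothetical Python sketch of two classic primer checks, GC content and the Wallace-rule melting temperature. These are illustrative helpers, not PD5's API, and real libraries use more sophisticated nearest-neighbor thermodynamics.

```python
def gc_content(primer):
    """Fraction of G/C bases in a primer sequence."""
    p = primer.upper()
    return (p.count("G") + p.count("C")) / len(p)

def wallace_tm(primer):
    """Rough melting temperature by the Wallace rule: 2(A+T) + 4(G+C) degC.
    Illustrative only; production primer design uses nearest-neighbor models."""
    p = primer.upper()
    at = p.count("A") + p.count("T")
    gc = p.count("G") + p.count("C")
    return 2 * at + 4 * gc

primer = "ATGCGTACCTGAAGTC"  # hypothetical primer
print("GC %.0f%%, Tm ~%d degC" % (100 * gc_content(primer), wallace_tm(primer)))
```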
van Roekel, Hendrik W H; Rosier, Bas J H M; Meijer, Lenny H H; Hilbers, Peter A J; Markvoort, Albert J; Huck, Wilhelm T S; de Greef, Tom F A
2015-11-07
Living cells are able to produce a wide variety of biological responses when subjected to biochemical stimuli. It has become apparent that these biological responses are regulated by complex chemical reaction networks (CRNs). Unravelling the function of these circuits is a key topic of both systems biology and synthetic biology. Recent progress at the interface of chemistry and biology together with the realisation that current experimental tools are insufficient to quantitatively understand the molecular logic of pathways inside living cells has triggered renewed interest in the bottom-up development of CRNs. This builds upon earlier work of physical chemists who extensively studied inorganic CRNs and showed how a system of chemical reactions can give rise to complex spatiotemporal responses such as oscillations and pattern formation. Using purified biochemical components, in vitro synthetic biologists have started to engineer simplified model systems with the goal of mimicking biological responses of intracellular circuits. Emulation and reconstruction of system-level properties of intracellular networks using simplified circuits are able to reveal key design principles and molecular programs that underlie the biological function of interest. In this Tutorial Review, we present an accessible overview of this emerging field starting with key studies on inorganic CRNs followed by a discussion of recent work involving purified biochemical components. Finally, we review recent work showing the versatility of programmable biochemical reaction networks (BRNs) in analytical and diagnostic applications.
Teaching neurology to medical students with a simplified version of team-based learning.
Brich, Jochen; Jost, Meike; Brüstle, Peter; Giesler, Marianne; Rijntjes, Michel
2017-08-08
Our objective was to compare the effect of a simplified version of team-based learning (sTBL), an active learning/small group instructional strategy, with that of the traditionally used small group interactive seminars on the acquisition of knowledge and clinical reasoning (CR) skills. Third- and fourth-year medical students (n = 122) were randomly distributed into 2 groups. A crossover design was used in which 2 neurologic topics were taught by sTBL and 2 by small group interactive seminars. Knowledge was assessed with a multiple-choice question examination (MCQE), and CR skills with a key feature problem examination (KFPE). Questionnaires were used for further methodologic evaluation. No group differences were found in the MCQE results. sTBL instruction on the topic "acute altered mental status" was associated with significantly better student performance on the KFPE (p = 0.008), with no differences in the other 3 topics covered. Although both teaching methods were highly rated by the students, a clear majority voted for sTBL as their preferred future teaching method. sTBL served as an equivalent alternative to small group interactive seminars for imparting knowledge and teaching CR skills, and was particularly advantageous for teaching CR in the setting of a complex neurologic topic. Furthermore, students reported a strong preference for the sTBL approach, making it a promising tool for effectively teaching neurology. © 2017 American Academy of Neurology.
NASA Astrophysics Data System (ADS)
Peckham, S. D.
2017-12-01
Standardized, deep descriptions of digital resources (e.g. data sets, computational models, software tools and publications) make it possible to develop user-friendly software systems that assist scientists with the discovery and appropriate use of these resources. Semantic metadata makes it possible for machines to take actions on behalf of humans, such as automatically identifying the resources needed to solve a given problem, retrieving them and then automatically connecting them (despite their heterogeneity) into a functioning workflow. Standardized model metadata also helps model users to understand the important details that underpin computational models and to compare the capabilities of different models. These details include simplifying assumptions on the physics, governing equations and the numerical methods used to solve them, discretization of space (the grid) and time (the time-stepping scheme), state variables (input or output), and model configuration parameters. This kind of metadata provides a "deep description" of a computational model that goes well beyond other types of metadata (e.g. author, purpose, scientific domain, programming language, digital rights, provenance, execution) and captures the science that underpins a model. A carefully constructed, unambiguous, rules-based schema that addresses this problem, the Geoscience Standard Names ontology, will be presented; it utilizes Semantic Web best practices and technologies, is designed to work across science domains, and is readable by both humans and machines.
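A "deep description" of this kind is easiest to picture as a machine-readable record. The Python sketch below is illustrative only; the field names and variable names are ours, not actual Geoscience Standard Names terms.

```python
# A machine-readable "deep description" of a hydrologic model
# (all names here are hypothetical, for illustration).
model_metadata = {
    "model_name": "example_rainfall_runoff",
    "governing_equations": ["mass conservation", "kinematic wave"],
    "simplifying_assumptions": ["1D flow", "uniform rainfall per cell"],
    "numerical_method": "explicit finite difference",
    "space_discretization": {"grid": "uniform rectangular", "cell_size_m": 100},
    "time_stepping": {"scheme": "fixed step", "dt_s": 60},
    "inputs": [{"name": "atmosphere_water__precipitation_flux", "units": "mm hr-1"}],
    "outputs": [{"name": "channel_water__discharge", "units": "m3 s-1"}],
}

def compatible(upstream, downstream):
    """Toy matcher: which downstream inputs can upstream outputs supply by name?
    Standardized names are what make this automatic coupling possible."""
    out = {v["name"] for v in upstream["outputs"]}
    return [v["name"] for v in downstream["inputs"] if v["name"] in out]

print(compatible(model_metadata, model_metadata))  # -> [] (no self-coupling here)
```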
Service management at CERN with Service-Now
NASA Astrophysics Data System (ADS)
Toteva, Z.; Alvarez Alonso, R.; Alvarez Granda, E.; Cheimariou, M.-E.; Fedorko, I.; Hefferman, J.; Lemaitre, S.; Clavo, D. Martin; Martinez Pedreira, P.; Pera Mira, O.
2012-12-01
The Information Technology (IT) and the General Services (GS) departments at CERN have decided to combine their extensive experience in support for IT and non-IT services towards a common goal - to bring the services closer to the end user based on Information Technology Infrastructure Library (ITIL) best practice. The collaborative efforts have so far produced definitions for the incident and the request fulfilment processes which are based on a unique two-dimensional service catalogue that combines both the user and the support team views of all services. After an extensive evaluation of the available industrial solutions, Service-now was selected as the tool to implement the CERN Service-Management processes. The initial release of the tool provided an attractive web portal for the users and successfully implemented two basic ITIL processes; the incident management and the request fulfilment processes. It also integrated with the CERN personnel databases and the LHC GRID ticketing system. Subsequent releases continued to integrate with other third-party tools like the facility management systems of CERN as well as to implement new processes such as change management. Independently from those new development activities it was decided to simplify the request fulfilment process in order to achieve easier acceptance by the CERN user community. We believe that due to the high modularity of the Service-now tool, the parallel design of ITIL processes e.g., event management and non-ITIL processes, e.g., computer centre hardware management, will be easily achieved. This presentation will describe the experience that we have acquired and the techniques that were followed to achieve the CERN customization of the Service-Now tool.
BIRCH: a user-oriented, locally-customizable, bioinformatics system.
Fristensky, Brian
2007-02-09
Molecular biologists need sophisticated analytical tools which often demand extensive computational resources. While finding, installing, and using these tools can be challenging, pipelining data from one program to the next is particularly awkward, especially when using web-based programs. At the same time, system administrators tasked with maintaining these tools do not always appreciate the needs of research biologists. BIRCH (Biological Research Computing Hierarchy) is an organizational framework for delivering bioinformatics resources to a user group, scaling from a single lab to a large institution. The BIRCH core distribution includes many popular bioinformatics programs, unified within the GDE (Genetic Data Environment) graphic interface. Of equal importance, BIRCH provides the system administrator with tools that simplify the job of managing a multiuser bioinformatics system across different platforms and operating systems. These include tools for integrating locally-installed programs and databases into BIRCH, and for customizing the local BIRCH system to meet the needs of the user base. BIRCH can also act as a front end to provide a unified view of already-existing collections of bioinformatics software. Documentation for the BIRCH and locally-added programs is merged in a hierarchical set of web pages. In addition to manual pages for individual programs, BIRCH tutorials employ step by step examples, with screen shots and sample files, to illustrate both the important theoretical and practical considerations behind complex analytical tasks. BIRCH provides a versatile organizational framework for managing software and databases, and making these accessible to a user base. Because of its network-centric design, BIRCH makes it possible for any user to do any task from anywhere.
Modeling of Damage Initiation and Progression in a SiC/SiC Woven Ceramic Matrix Composite
NASA Technical Reports Server (NTRS)
Mital, Subodh K.; Goldberg, Robert K.; Bonacuse, Peter J.
2012-01-01
The goal of an ongoing project at NASA Glenn is to investigate the effects of the complex microstructure of a woven ceramic matrix composite, and its variability, on the effective properties and durability of the material. Detailed analysis of these complex microstructures may provide clues regarding damage initiation and damage propagation for the material scientists who 'design the material' and for the structural analysts and designers who 'design with the material'. A model material system, specifically a five-harness satin weave architecture CVI SiC/SiC composite composed of Sylramic-iBN fibers and a SiC matrix, has been analyzed. Specimens of the material were serially sectioned and polished to capture detailed images of the fiber tows, matrix and porosity. Open-source analysis tools were used to isolate the various constituents, and finite element models were then generated from simplified versions of those images. Detailed finite element analyses were performed to examine how variability in the local microstructure affects the macroscopic behavior as well as local damage initiation and progression. Results indicate that the locations where damage initiates and propagates are linked to specific microstructural features.
A simplified lumped model for the optimization of post-buckled beam architecture wideband generator
NASA Astrophysics Data System (ADS)
Liu, Weiqun; Formosa, Fabien; Badel, Adrien; Hu, Guangdi
2017-11-01
Buckled beam structures are a classical kind of bistable energy harvester, attracting growing interest because of their capability to scavenge energy over a large frequency band in comparison with linear generators. The usual modeling approach uses Galerkin mode discretization, which is relatively complex, while the single-mode simplification lacks accuracy. Design rests on optimizing the features of the energy potential, which in turn define the physical and geometrical parameters. Therefore, in this paper, a simple lumped model with an explicit relationship between the potential shape and the parameters is proposed to allow efficient design of bistable beam-based generators. The accuracy of the approximate model is studied and its range of applicability analyzed. Moreover, an important fact is found: once the cross-sectional area is fixed, the bending stiffness has little influence on the potential shape at low buckling levels. This feature extends the applicable range of the model to designs with a high moment of inertia. Numerical investigations demonstrate that the proposed model is a simple and reliable design tool. An optimization example using the proposed model is presented, with satisfactory performance.
Corporate ergonomics programme at automobiles Peugeot-Sochaux.
Moreau, M
2003-01-01
An ergonomic assessment tool for design procedures, exclusive to Peugeot-Citroën and called ECM, was developed and applied at the design stage by methods technicians in the 1990s. It generates data, which are followed up by the project leader of a new model and by ergonomists until two years before each launch. During this time, the vehicle design is subject to modification to adapt to ergonomic demands. Simplified methods (DACORS and METEO) were also developed to assess workstations on the shop floor in trim and final assembly plants. Assessments were used to grade the workstations into four profiles linked to physical and static requirements. Production technicians are responsible for applying these local methods on the shop floor. Management aimed to reduce the risk of musculoskeletal disorders by reducing the number of workstations with heavy profiles. New cases of musculoskeletal disorders, surveyed by the company doctor among workers on the assembly lines, had decreased since 1996. In 1999, the incidence increased again, despite the continued use of ergonomic methods. This increase was above all linked to a major reorganisation of work conditions, including a reduction in the cycle time on the assembly line, and to a move into a new workshop.
From LCAs to simplified models: a generic methodology applied to wind power electricity.
Padey, Pierryves; Girard, Robin; le Boulch, Denis; Blanc, Isabelle
2013-02-05
This study presents a generic methodology for producing simplified models able to provide a comprehensive life cycle impact assessment of energy pathways. The methodology relies on the application of global sensitivity analysis to identify the key parameters that explain the impact variability of systems over their life cycle. Simplified models are built upon the identification of such key parameters. The methodology is applied to one energy pathway: onshore wind turbines of medium size, considering a large sample of possible configurations representative of European conditions. Among several technological, geographical, and methodological parameters, we identified the turbine load factor and the wind turbine lifetime as the most influential parameters. Greenhouse gas (GHG) performance has been plotted as a function of these identified key parameters. Using these curves, the GHG performance of a specific wind turbine can be estimated, thus avoiding the undertaking of an extensive Life Cycle Assessment (LCA). This methodology should be useful for decision makers, providing them a robust but simple support tool for assessing the environmental performance of energy systems.
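The resulting simplified model has a simple structure: life-cycle impacts divided by lifetime electricity production, parameterized by the two key parameters. A hypothetical Python rendering follows; the embodied-emissions and rated-power figures are placeholders, not values from the study.

```python
def ghg_intensity(load_factor, lifetime_yr, embodied_kgco2eq=1.5e6, rated_kw=2000):
    """Simplified life-cycle GHG performance of an onshore wind turbine,
    in g CO2-eq per kWh. Structure follows the paper's idea (impacts spread
    over lifetime production); the default embodied emissions and rated
    power are placeholder assumptions."""
    lifetime_kwh = rated_kw * load_factor * lifetime_yr * 8760
    return embodied_kgco2eq * 1000 / lifetime_kwh

for lf in (0.2, 0.3, 0.4):
    print("load factor %.0f%%: %.1f gCO2eq/kWh" % (100 * lf, ghg_intensity(lf, 20)))
```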
ObspyDMT: a Python toolbox for retrieving and processing large seismological data sets
NASA Astrophysics Data System (ADS)
Hosseini, Kasra; Sigloch, Karin
2017-10-01
We present obspyDMT, a free, open-source software toolbox for the query, retrieval, processing and management of seismological data sets, including very large, heterogeneous and/or dynamically growing ones. ObspyDMT simplifies and speeds up user interaction with data centers, in more versatile ways than existing tools. The user is shielded from the complexities of interacting with different data centers and data exchange protocols and is provided with powerful diagnostic and plotting tools to check the retrieved data and metadata. While primarily a productivity tool for research seismologists and observatories, easy-to-use syntax and plotting functionality also make obspyDMT an effective teaching aid. Written in the Python programming language, it can be used as a stand-alone command-line tool (requiring no knowledge of Python) or can be integrated as a module with other Python codes. It facilitates data archiving, preprocessing, instrument correction and quality control - routine but nontrivial tasks that can consume much user time. We describe obspyDMT's functionality, design and technical implementation, accompanied by an overview of its use cases. As an example of a typical problem encountered in seismogram preprocessing, we show how to check for inconsistencies in response files of two example stations. We also demonstrate the fully automated request, remote computation and retrieval of synthetic seismograms from the Synthetics Engine (Syngine) web service of the Data Management Center (DMC) at the Incorporated Research Institutions for Seismology (IRIS).
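Rather than guessing obspyDMT's command-line flags, the sketch below uses the ObsPy FDSN client that obspyDMT builds upon, to show the kind of request, preprocessing and instrument correction the toolbox automates in bulk (network connectivity and IRIS data availability are assumed):

```python
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

# One request of the sort obspyDMT issues and manages in bulk: fetch an
# hour of vertical-component data from the IRIS data center.
client = Client("IRIS")
t0 = UTCDateTime("2014-01-01T00:00:00")
st = client.get_waveforms(network="IU", station="ANMO", location="00",
                          channel="BHZ", starttime=t0, endtime=t0 + 3600)
st.detrend("demean")                    # simple preprocessing step
inv = client.get_stations(network="IU", station="ANMO", level="response")
st.remove_response(inventory=inv)       # instrument correction
print(st)
```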
NASA Technical Reports Server (NTRS)
Nguyen, Lac; Kenney, Patrick J.
1993-01-01
Development of interactive virtual environments (VE) has typically consisted of three primary activities: model (object) development, model relationship tree development, and environment behavior definition and coding. The model and relationship tree development activities are accomplished with a variety of well-established graphic library (GL) based programs - most utilizing graphical user interfaces (GUI) with point-and-click interactions. Because of this GUI format, little programming expertise on the part of the developer is necessary to create the 3D graphical models or to establish interrelationships between the models. However, the third VE development activity, environment behavior definition and coding, has generally required the greatest amount of time and programmer expertise. Behaviors, characteristics, and interactions between objects and the user within a VE must be defined via command-line C coding prior to rendering the environment scenes. In an effort to simplify this environment behavior definition phase for non-programmers, and to provide easy access to model and tree tools, a graphical interface and development tool has been created. The principal thrust of this research is to effect rapid development and prototyping of virtual environments. This presentation will discuss the 'Visual Interface for Virtual Interaction Development' (VIVID) tool: an X-Windows based system employing drop-down menus for user selection of program access, models and trees, behavior editing, and code generation. Examples of these selections will be highlighted in this presentation, as will the currently available program interfaces. The functionality of this tool allows non-programming users access to all facets of VE development while providing experienced programmers with a collection of pre-coded behaviors. Future development plans for VIVID, in conjunction with its existing interfaces and predefined suite of behaviors, will be described. These include incorporation of dual-user virtual environment enhancements, tool expansion, and additional behaviors.
A simplified conjoint recognition paradigm for the measurement of gist and verbatim memory.
Stahl, Christoph; Klauer, Karl Christoph
2008-05-01
The distinction between verbatim and gist memory traces has furthered the understanding of numerous phenomena in various fields, such as false memory research, research on reasoning and decision making, and cognitive development. To measure verbatim and gist memory empirically, an experimental paradigm and multinomial measurement model has been proposed but rarely applied. In the present article, a simplified conjoint recognition paradigm and multinomial model is introduced and validated as a measurement tool for the separate assessment of verbatim and gist memory processes. A Bayesian metacognitive framework is applied to validate guessing processes. Extensions of the model toward incorporating the processes of phantom recollection and erroneous recollection rejection are discussed.
Got Graphs? An Assessment of Data Visualization Tools
NASA Technical Reports Server (NTRS)
Schaefer, C. M.; Foy, M.
2015-01-01
Graphs are powerful tools for simplifying complex data. They are useful for quickly assessing patterns and relationships among one or more variables from a dataset. As the amount of data increases, it becomes more difficult to visualize potential associations. Lifetime Surveillance of Astronaut Health (LSAH) was charged with assessing its current visualization tools along with others on the market to determine whether new tools would be useful for supporting NASA's occupational surveillance effort. It was concluded by members of LSAH that the current tools hindered their ability to provide quick results to researchers working with the department. Due to the high volume of data requests and the many iterations of visualizations requested by researchers, software with a better ability to replicate graphs and edit quickly could improve LSAH's efficiency and lead to faster research results.
A toolbox and record for scientific models
NASA Technical Reports Server (NTRS)
Ellman, Thomas
1994-01-01
Computational science presents a host of challenges for the field of knowledge-based software design. Scientific computation models are difficult to construct. Models constructed by one scientist are easily misapplied by other scientists to problems for which they are not well-suited. Finally, models constructed by one scientist are difficult for others to modify or extend to handle new types of problems. Construction of scientific models actually involves much more than the mechanics of building a single computational model. In the course of developing a model, a scientist will often test a candidate model against experimental data or against a priori expectations. Test results often lead to revisions of the model and a consequent need for additional testing. During a single model development session, a scientist typically examines a whole series of alternative models, each using different simplifying assumptions or modeling techniques. A useful scientific software design tool must support these aspects of the model development process as well. In particular, it should propose and carry out tests of candidate models. It should analyze test results and identify models and parts of models that must be changed. It should determine what types of changes can potentially cure a given negative test result. It should organize candidate models, test data, and test results into a coherent record of the development process. Finally, it should exploit the development record for two purposes: (1) automatically determining the applicability of a scientific model to a given problem; (2) supporting revision of a scientific model to handle a new type of problem. Existing knowledge-based software design tools must be extended in order to provide these facilities.
NASA Astrophysics Data System (ADS)
Vines, Aleksander; Hansen, Morten W.; Korosov, Anton
2017-04-01
Existing international and Norwegian infrastructure projects, e.g., NorDataNet, NMDC and NORMAP, provide open data access through the OPeNDAP protocol following the conventions for CF (Climate and Forecast) metadata, designed to promote the processing and sharing of files created with the NetCDF application programming interface (API). This approach is now also being implemented in the Norwegian Sentinel Data Hub (satellittdata.no) to provide satellite EO data to the user community. Simultaneously with providing simplified and unified data access, these projects also seek to use and establish common standards for use and discovery metadata. This then allows development of standardized tools for data search and (subset) streaming over the internet to perform actual scientific analysis. A combination of software tools, which we call a Scientific Platform as a Service (SPaaS), will take advantage of these opportunities to harmonize and streamline the search, retrieval and analysis of integrated satellite and auxiliary observations of the oceans in a seamless system. The SPaaS is a cloud solution for integration of analysis tools with scientific datasets via an API. The core part of the SPaaS is a distributed metadata catalog that stores granular metadata describing the structure, location and content of available satellite, model, and in situ datasets. The analysis tools include software for visualization (also online), interactive in-depth analysis, and server-based processing chains. The API conveys search requests between system nodes (i.e., interactive and server tools) and provides easy access to the metadata catalog, data repositories, and the tools. The SPaaS components are integrated in virtual machines, whose provisioning and deployment are automated using existing state-of-the-art open-source tools (e.g., Vagrant, Ansible, Docker). The open-source code for scientific tools and virtual machine configurations is under version control at https://github.com/nansencenter/, and is coupled to an online continuous integration system (e.g., Travis CI).
A New Strategy in Observer Modeling for Greenhouse Cucumber Seedling Growth
Qiu, Quan; Zheng, Chenfei; Wang, Wenping; Qiao, Xiaojun; Bai, He; Yu, Jingquan; Shi, Kai
2017-01-01
A state observer is an essential component in computerized control loops for greenhouse-crop systems. However, current accomplishments in observer modeling for greenhouse-crop systems mainly focus on mass/energy balance, ignoring the physiological responses of crops. As a result, state observers for crop physiological responses are rarely developed, and control operations are typically made based on experience rather than actual crop requirements. In addition, existing observer models require a large number of parameters, leading to heavy computational load and poor application feasibility. To address these problems, we present a new state observer modeling strategy that takes both environmental information and crop physiological responses into consideration during the observer modeling process. Using greenhouse cucumber seedlings as an example, we sample 10 physiological parameters of cucumber seedlings at different time points during the exponential growth stage, and employ them to build growth state observers together with 8 environmental parameters. A support vector machine (SVM) acts as the mathematical tool for observer modeling. Canonical correlation analysis (CCA) is used to select the dominant environmental and physiological parameters in the modeling process. With the dominant parameters, simplified observer models are built and tested. We conduct contrast experiments with different input parameter combinations on simplified and un-simplified observers. Experimental results indicate that physiological information can improve the prediction accuracies of the growth state observers. Furthermore, the simplified observer models can give equivalent or even better performance than the un-simplified ones, which verifies the feasibility of CCA. The current study enables state observers to reflect crop requirements and makes them feasible for applications in simplified forms, which is significant for developing intelligent greenhouse control systems for modern greenhouse production. PMID:28848565
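As a rough sketch of the strategy this abstract describes (CCA to select dominant parameters, then an SVM observer on the reduced inputs), the following Python fragment uses scikit-learn on purely synthetic data; the parameter counts match the abstract, but the data, the loading-based ranking, and the SVR settings are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.cross_decomposition import CCA
from sklearn.svm import SVR

rng = np.random.default_rng(0)
env = rng.normal(size=(120, 8))    # 8 environmental parameters (synthetic)
phys = rng.normal(size=(120, 10))  # 10 physiological parameters (synthetic)
growth = env[:, 0] + phys[:, 0] + rng.normal(scale=0.1, size=120)  # synthetic growth state

# CCA: rank inputs by their loadings on the first canonical pair and keep
# only the dominant few (a stand-in for the paper's selection step)
cca = CCA(n_components=2).fit(env, phys)
env_rank = np.argsort(-np.abs(cca.x_loadings_[:, 0]))
phys_rank = np.argsort(-np.abs(cca.y_loadings_[:, 0]))
X_simplified = np.hstack([env[:, env_rank[:3]], phys[:, phys_rank[:3]]])

# SVM regression as the simplified growth-state observer
observer = SVR(kernel="rbf", C=10.0).fit(X_simplified, growth)
print("training R^2:", observer.score(X_simplified, growth))
```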
NASA Astrophysics Data System (ADS)
Nguyen, Tien M.; Guillen, Andy T.; Hant, James J.; Kizer, Justin R.; Min, Inki A.; Siedlak, Dennis J. L.; Yoh, James
2017-05-01
The U.S. Air Force (USAF) has recognized the need to own the program and technical knowledge within the Air Force concerning the systems being acquired in order to ensure success. This paper extends the authors' previous work [1-2] on the "Resilient Program Technical Baseline Framework for Future Space Systems" and "Portfolio Decision Support Tool (PDST)" to the development and implementation of the Program and Technical Baseline (PTB) Tracking Tool (PTBTL) for the DOD acquisition life cycle. The paper describes the "simplified" PTB tracking model with a focus on the pre-award phases and discusses how to implement this model in PDST.
Bestelmeyer, Brandon T.; Williamson, Jeb C.; Talbot, Curtis J.; Cates, Greg W.; Duniway, Michael C.; Brown, Joel R.
2016-01-01
State-and-transition models (STMs) are useful tools for management, but they can be difficult to use and have limited content. STMs created for groups of related ecological sites could simplify and improve their utility. The amount of information linked to models can be increased using tables that communicate management interpretations and important within-group variability. We created a new web-based information system (the Ecosystem Dynamics Interpretive Tool) to house STMs, associated tabular information, and other ecological site data and descriptors. Fewer, more informative, better organized, and easily accessible STMs should increase the accessibility of science information.
Using DSP technology to simplify deep space ranging
NASA Technical Reports Server (NTRS)
Bryant, S.
2000-01-01
Commercially available Digital Signal Processing (DSP) technology has enabled a new spacecraft ranging design. The new design reduces overall size, parts count, and complexity. The design implementation will also meet the Jet Propulsion Laboratory (JPL) requirements for both near-Earth and deep space ranging.
Shielding analyses of an AB-BNCT facility using Monte Carlo simulations and simplified methods
NASA Astrophysics Data System (ADS)
Lai, Bo-Lun; Sheu, Rong-Jiun
2017-09-01
Accurate Monte Carlo simulations and simplified methods were used to investigate the shielding requirements of a hypothetical accelerator-based boron neutron capture therapy (AB-BNCT) facility that included an accelerator room and a patient treatment room. The epithermal neutron beam for BNCT purposes was generated by coupling a neutron production target with a specially designed beam shaping assembly (BSA), which was embedded in the partition wall between the two rooms. Neutrons were produced from a beryllium target bombarded by 1-mA 30-MeV protons. The MCNP6-generated surface sources around all the exterior surfaces of the BSA were established to facilitate repeated Monte Carlo shielding calculations. In addition, three simplified models based on a point-source line-of-sight approximation were developed and their predictions were compared with the reference Monte Carlo results. The comparison determined which model resulted in better dose estimation, forming the basis of future design activities for the first AB-BNCT facility in Taiwan.
Souza, João Paulo; Oladapo, Olufemi T; Bohren, Meghan A; Mugerwa, Kidza; Fawole, Bukola; Moscovici, Leonardo; Alves, Domingos; Perdona, Gleici; Oliveira-Ciabati, Livia; Vogel, Joshua P; Tunçalp, Özge; Zhang, Jim; Hofmeyr, Justus; Bahl, Rajiv; Gülmezoglu, A Metin
2015-05-26
The partograph is currently the main tool available to support decision-making of health professionals during labour. However, the rate of appropriate use of the partograph is disappointingly low. Apart from limitations that are associated with partograph use, evidence of positive impact on labour-related health outcomes is lacking. The main goal of this study is to develop a Simplified, Effective, Labour Monitoring-to-Action (SELMA) tool. The primary objectives are: to identify the essential elements of intrapartum monitoring that trigger the decision to use interventions aimed at preventing poor labour outcomes; to develop a simplified, monitoring-to-action algorithm for labour management; and to compare the diagnostic performance of SELMA and partograph algorithms as tools to identify women who are likely to develop poor labour-related outcomes. A prospective cohort study will be conducted in eight health facilities in Nigeria and Uganda (four facilities from each country). All women admitted for vaginal birth will comprise the study population (estimated sample size: 7,812 women). Data will be collected on maternal characteristics on admission, labour events and pregnancy outcomes by trained research assistants at the participating health facilities. Prediction models will be developed to identify women at risk of intrapartum-related perinatal death or morbidity (primary outcomes) throughout the course of labour. These prediction models will be used to assemble a decision-support tool that will be able to suggest the best course of action to avert adverse outcomes during the course of labour. To develop this set of prediction models, we will use up-to-date techniques of prognostic research, including identification of important predictors, assignment of relative weights to each predictor, estimation of the predictive performance of the model through calibration and discrimination, and determination of its potential for application using internal validation techniques. This research offers an opportunity to revisit the theoretical basis of the partograph. It is envisioned that the final product would help providers overcome the challenging tasks of promptly interpreting complex labour information and deriving appropriate clinical actions, and thus increase efficiency of the care process, enhance providers' competence and ultimately improve labour outcomes. Please see related articles ' http://dx.doi.org/10.1186/s12978-015-0027-6 ' and ' http://dx.doi.org/10.1186/s12978-015-0028-5 '.
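The prognostic-modeling steps named above (weighting predictors, then checking discrimination and calibration) can be sketched generically; this Python fragment uses scikit-learn on synthetic data and is not the SELMA algorithm itself: predictors, outcome, and model choice are all assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.calibration import calibration_curve

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 4))  # hypothetical admission and labour predictors
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=500) > 1).astype(int)  # adverse outcome

model = LogisticRegression().fit(X, y)  # relative weights assigned to each predictor
risk = model.predict_proba(X)[:, 1]

# discrimination: how well the model separates women with and without the outcome
print("AUC:", roc_auc_score(y, risk))
# calibration: agreement between predicted risk and observed outcome frequency
observed, predicted = calibration_curve(y, risk, n_bins=5)
print(np.column_stack([predicted, observed]))
```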
A drill-soil system modelization for future Mars exploration
NASA Astrophysics Data System (ADS)
Finzi, A. E.; Lavagna, M.; Rocchitelli, G.
2004-01-01
This paper presents a first approach to the problem of modeling a drilling process to be carried out in the space environment by a dedicated payload. Systems designed to work in space face very strict requirements in many different fields, such as thermal response, electric power demand and reliability. Thus, models devoted to simulating operational behaviour are a fundamental help in the design phase and greatly improve the quality of the final product. As the required power is the crucial constraint within drilling devices, the tool-soil interaction modelization and simulation are finalized to the computation of the power demand as a function of both the drill and the soil parameters. An accurate study of the tool and the soil was first carried out separately, and their interaction was then analyzed. The Dee-Dri system, designed by Tecnospazio and to be part of the lander components in NASA's Mars Sample Return Mission, has been taken as the reference tool. It is a complex rotary tool devoted to soil perforation and sample collection; it has to operate in a Martian zone made of rocks similar to terrestrial basalt, so the modelization is restricted to the interaction analysis between the tool and materials belonging to the rock set. The tool geometric modelization has been faced by a finite element approach with a Lagrangian formulation: for the static analysis a refined model is assumed, considering both the actual geometry of the head and the rod screws; a simplified model has been used for the dynamic analysis. The soil representation is based on the Mohr-Coulomb crack criterion, and an Eulerian approach was initially selected to model it. However, software limitations in dealing with the tool-soil interface definition required assuming a Lagrangian formulation for the soil too. The interaction between the soil and the tool has been modeled by extending Nishimatsu's two-dimensional theory of rock cutting to rotating perforation tools. A detailed analysis of the finite element choice for each part of the tool is presented together with the static analysis results. The dynamic analysis results are limited to the first impact phenomenon between the rock and the tool head. The validity of both the theoretical and numerical models is confirmed by the good agreement between simulation results and data from experiments done within the Tecnospazio facilities.
Barbour, P S; Stone, M H; Fisher, J
1999-01-01
In some designs of hip joint simulator, the cost of building a highly complex machine has been offset against the requirement for a large number of test stations. The applicability of the wear results generated by these machines depends on their ability to reproduce physiological wear rates and processes. In this study a hip joint simulator has been shown to reproduce physiological wear using only one load vector and two degrees of motion with simplified input cycles. The actual paths of points on the femoral head relative to the acetabular cup were calculated and compared for physiological and simplified input cycles. The in vitro wear rates were found to be highly dependent on the shape of these paths, and similarities could be drawn between the shape of the physiological paths and the simplified elliptical paths.
NASA Technical Reports Server (NTRS)
Pease, R. Adam
1995-01-01
MIDAS is a set of tools that allows a designer to specify the physical and functional characteristics of a complex system, such as an aircraft cockpit, and analyze the system with regard to human performance. MIDAS allows for a number of static analyses such as military-standard reach and fit analysis, display legibility analysis, and vision polars. It also supports dynamic simulation of mission segments with 3D visualization. MIDAS development has incorporated several models of human planning behavior. The CaseMIDAS effort has been to provide a simplified and unified approach to modeling task selection behavior. Except for highly practiced, routine procedures, a human operator exhibits cognitive effort while determining what step to take next in the accomplishment of mission tasks. Current versions of MIDAS do not model this effort in a consistent and inclusive manner; CaseMIDAS also attempts to address this issue. The CaseMIDAS project has yielded an easy-to-use software module for case creation and execution which is integrated with existing MIDAS simulation components.
Digi Island: A Serious Game for Teaching and Learning Digital Circuit Optimization
NASA Technical Reports Server (NTRS)
Harper, Michael; Miller, Joseph; Shen, Yuzhong
2011-01-01
Karnaugh maps, also known as K-maps, are a tool used to optimize or simplify digital logic circuits. A K-map is a graphical display of a logic circuit, and K-map optimization is essentially the process of finding a minimum number of maximal aggregations of K-map cells with values of 1, according to a set of rules. Digi Island is a serious game designed to aid students in learning K-map optimization. The game takes place on an exotic island (called Digi Island) in the Pacific Ocean. The player is an adventurer on Digi Island who will transform it into a tourist attraction by developing real estate, such as amusement parks and hotels. The Digi Island game elegantly converts the 1s and 0s of digital circuits into usable and unusable spaces on a beautiful island, transforming K-map optimization into real estate development, an activity with which many students are familiar and in which they are interested. This paper discusses the design, development, and some preliminary results of the Digi Island game.
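For readers unfamiliar with K-map optimization, the minimization a K-map performs by hand can be reproduced programmatically; a minimal sketch using sympy's SOPform, with arbitrary example minterms:

```python
from sympy import symbols
from sympy.logic import SOPform

w, x, y, z = symbols("w x y z")
# minterms: the input rows where the circuit output is 1 (the 1-cells of the K-map)
minterms = [[0, 0, 0, 1], [0, 0, 1, 1], [0, 1, 1, 1],
            [1, 0, 1, 1], [1, 1, 1, 1]]
# SOPform returns a minimal sum-of-products cover, i.e. the maximal
# aggregations of 1-cells that one would group by hand on the map
print(SOPform([w, x, y, z], minterms))
```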
Simulating multiprimary LCDs on standard tri-stimulus LC displays
NASA Astrophysics Data System (ADS)
Lebowsky, Fritz; Vonneilich, Katrin; Bonse, Thomas
2008-01-01
Large-scale, direct-view TV screens, in particular those based on liquid crystal technology, are beginning to use subpixel structures with more than three subpixels to implement a multi-primary display with up to six primaries. Since their input color space is likely to remain tri-stimulus RGB, we first focus on some fundamental constraints. Among them, we elaborate simplified gamut mapping architectures as well as color filter geometry, transparency, and chromaticity coordinates in color space. Based on a 'display centric' RGB color space tetrahedrization combined with linear interpolation, we describe a simulation framework which enables optimization for up to 7 primaries. We evaluated the performance by mapping the multi-primary design back onto an RGB LC display gamut without building a prototype multi-primary display. As long as we kept the RGB-equivalent output signal within the display gamut, we could analyze all desirable multi-primary configurations with regard to colorimetric variance and visually perceived quality. Not only does our simulation tool enable us to verify a novel concept, it also demonstrates how carefully one needs to design a multi-primary display for LCD TV applications.
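Tetrahedrization plus linear interpolation reduces, within each tetrahedron, to barycentric weighting of values stored at its four vertices; a minimal Python sketch with a hypothetical tetrahedron of the RGB cube and made-up 6-primary drive values:

```python
import numpy as np

def barycentric_weights(tet, p):
    """Weights of point p inside tetrahedron tet (4x3 array of vertices)."""
    T = np.vstack([tet.T, np.ones(4)])            # homogeneous 4x4 system
    return np.linalg.solve(T, np.append(p, 1.0))  # weights sum to 1

# one hypothetical tetrahedron of an RGB-cube tetrahedrization, with
# pre-optimized 6-primary drive values stored at its four vertices
tet_rgb = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [1, 1, 1]], float)
drive = np.array([[0.0] * 6,
                  [0.9, 0.1, 0.0, 0.0, 0.0, 0.0],
                  [0.8, 0.7, 0.0, 0.1, 0.0, 0.0],
                  [1.0] * 6])

w = barycentric_weights(tet_rgb, np.array([0.7, 0.4, 0.2]))
print(w @ drive)  # linearly interpolated 6-primary output signal
```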
Solar dynamic power for the Space Station
NASA Technical Reports Server (NTRS)
Archer, J. S.; Diamant, E. S.
1986-01-01
This paper describes a computer code which provides a significant advance in the systems analysis capabilities of solar dynamic power modules. While the code can be used to advantage in the preliminary analysis of terrestrial solar dynamic modules, its real value lies in the adaptations which make it particularly useful for the conceptualization of optimized power modules for space applications. In particular, as illustrated in the paper, the code can be used to establish optimum values of concentrator diameter, concentrator surface roughness, concentrator rim angle and receiver aperture corresponding to the main heat cycle options - Organic Rankine and Brayton - and for certain receiver design options. The code can also be used to establish system sizing margins to account for the loss of reflectivity in orbit or the seasonal variation of insolation. By simulating the interactions among the major components of a solar dynamic module and through simplified formulations of the major thermal-optic-thermodynamic interactions, the code adds a powerful, efficient and economic analytical tool to the repertory of techniques available for the design of advanced space power systems.
Passive vs. Parachute System Architecture for Robotic Sample Return Vehicles
NASA Technical Reports Server (NTRS)
Maddock, Robert W.; Henning, Allen B.; Samareh, Jamshid A.
2016-01-01
The Multi-Mission Earth Entry Vehicle (MMEEV) is a flexible vehicle concept based on the Mars Sample Return (MSR) EEV design which can be used in the preliminary sample return mission study phase to parametrically investigate any trade space of interest to determine the best entry vehicle design approach for that particular mission concept. In addition to the trade space dimensions often considered (e.g. entry conditions, payload size and mass, vehicle size, etc.), the MMEEV trade space considers whether it might be more beneficial for the vehicle to utilize a parachute system during descent/landing or to be fully passive (i.e. not use a parachute). In order to evaluate this trade space dimension, a simplified parachute system model has been developed based on inputs such as vehicle size/mass, payload size/mass and landing requirements. This model works in conjunction with analytical approximations of a mission trade space dataset provided by the MMEEV System Analysis for Planetary EDL (M-SAPE) tool to help quantify the differences between an active (with parachute) and a passive (no parachute) vehicle concept.
NASA Technical Reports Server (NTRS)
1979-01-01
The home shown at right is specially designed to accommodate solar heating units; it has roof planes in four directions, allowing placement of solar collectors for best exposure to the sun. Plans (bottom) and complete working blueprints for the solar-heated house are being marketed by Home Building Plan Service, Portland, Oregon. The company also offers an inexpensive schematic (center) showing how a homeowner only moderately skilled in the use of tools can build his own solar energy system, applicable to new or existing structures. The schematic is based upon the design of a low-cost solar home heating system built and tested by NASA's Langley Research Center; used to supplement a warm-air heating system, it can save the homeowner about 40 percent of his annual heating bill for a modest investment in materials and components. Home Building Plan Service saved considerable research time by obtaining a NASA technical report which details the Langley work. The resulting schematic includes construction plans and simplified explanations of solar heat collection, collectors and other components, passive heat factors, domestic hot water supply and how to work with local heating engineers.
NASA Astrophysics Data System (ADS)
Pradhan, Bikram; Delchambre, Ludovic; Hickson, Paul; Akhunov, Talat; Bartczak, Przemyslaw; Kumar, Brajesh; Surdej, Jean
2018-04-01
The 4-m International Liquid Mirror Telescope (ILMT), located at the ARIES Observatory (Devasthal, India), has been designed to scan, in Time Delayed Integration (TDI) mode, a band of sky about half a degree wide at a latitude of +29° 22' 26". A dedicated data-reduction and analysis pipeline is therefore needed to process online the large amount of optical data produced. This requirement has led to the development of the 4-m ILMT data reduction pipeline, a new software package built with Python to simplify a large number of tasks aimed at the reduction of the acquired TDI images. This software provides astronomers with specially designed data reduction functions and astrometry and photometry calibration tools. In this paper we discuss the various reduction and calibration steps followed to reduce TDI images obtained in May 2015 with the Devasthal 1.3-m telescope. We report the detection and characterization of nine space-debris objects present in the TDI frames.
Developing Formal Correctness Properties from Natural Language Requirements
NASA Technical Reports Server (NTRS)
Nikora, Allen P.
2006-01-01
This viewgraph presentation reviews the rationale of the program to transform natural language specifications into formal notation, specifically to automate the generation of Linear Temporal Logic (LTL) correctness properties from natural language temporal specifications. There are several reasons for this approach: (1) model-based techniques are becoming more widely accepted; (2) analytical verification techniques (e.g., model checking, theorem proving) are significantly more effective at detecting certain types of specification design errors (e.g., race conditions, deadlock) than manual inspection; (3) many requirements are still written in natural language, while specification languages and their associated tools carry a high learning curve, and schedule and budget pressure on projects reduces training opportunities for engineers; and (4) formulation of correctness properties for system models can be a difficult problem. This is relevant to NASA in that it would simplify the development of formal correctness properties, lead to more widespread use of model-based specification and design techniques, assist in earlier identification of defects, and reduce residual defect content for space mission software systems. The presentation also discusses potential applications, accomplishments, technological transfer potential, and next steps.
Gotor, Raúl; Ashokkumar, Pichandi; Hecht, Mandy; Keil, Karin; Rurack, Knut
2017-08-15
In this work, a family of pH-responsive fluorescent probes has been designed in a rational manner with the aid of quantum chemistry tools, covering the entire pH range from 0 to 14. Relying on the boron-dipyrromethene (BODIPY) core, all the probes as well as selected reference dyes display very similar spectroscopic properties with ON-OFF fluorescence switching responses, facilitating optical readout in simple devices used for detection and analysis. Embedding the probes and reference dyes into hydrogel spots on a plastic strip yielded a test strip that reversibly indicates pH with a considerably small uncertainty of ∼0.1 pH units. These strips are not only reusable but, combined with a 3D-printed case that can be attached to a smartphone, the USB port of which drives the integrated LED used for excitation, allow for autonomous operation in on-site or in-the-field applications; the accompanying Android application software ("app") further simplifies operation for unskilled users.
Collisionless spectral-kinetic Simulation of the Multipole Resonance Probe
NASA Astrophysics Data System (ADS)
Dobrygin, Wladislaw; Szeremley, Daniel; Schilling, Christian; Oberrath, Jens; Eremin, Denis; Mussenbrock, Thomas; Brinkmann, Ralf Peter
2012-10-01
Plasma resonance spectroscopy is a well-established plasma diagnostic method realized in several designs. One of these designs is the multipole resonance probe (MRP). In its idealized - geometrically simplified - version it consists of two dielectrically shielded, hemispherical electrodes to which an RF signal is applied. A numerical tool is under development which is capable of simulating the dynamics of the plasma surrounding the MRP in electrostatic approximation. In the simulation the potential is separated into an inner potential and a vacuum potential. The inner potential is influenced by the charged particles and is calculated by a specialized Poisson solver. The vacuum potential satisfies Laplace's equation, with the applied probe voltage entering as a boundary condition. Both potentials are expanded in spherical harmonics. For a practical particle-pusher implementation, the expansion must be appropriately truncated. Compared to a PIC simulation, no grid is needed to calculate the force on the particles. The purpose of this work is a collisionless kinetic simulation which can be used to investigate kinetic effects on the resonance behavior of the MRP. [1] M. Lapke et al., Appl. Phys. Lett. 93, 051502 (2008).
A microkernel design for component-based parallel numerical software systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balay, S.
1999-01-13
What is the minimal software infrastructure, and what type of conventions are needed, to simplify development of sophisticated parallel numerical application codes using a variety of software components that are not necessarily available as source code? We propose an opaque object-based model where the objects are dynamically loadable from the file system or network. The microkernel required to manage such a system needs to include, at most: (1) a few basic services, namely a mechanism for loading objects at run time via dynamic link libraries, and consistent schemes for error handling and memory management; and (2) selected methods that all objects share, to deal with object life (destruction, reference counting, relationships), and object observation (viewing, profiling, tracing). We are experimenting with these ideas in the context of extensible numerical software within the ALICE (Advanced Large-scale Integrated Computational Environment) project, where we are building the microkernel to manage the interoperability among various tools for large-scale scientific simulations. This paper presents some preliminary observations and conclusions from our work with microkernel design.
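A toy analogue of the proposed microkernel services, with Python's importlib standing in for dynamic link libraries; class and function names here are illustrative, not the ALICE API:

```python
import importlib

class KernelObject:
    """Opaque base object carrying the shared life-cycle and observation methods."""
    def __init__(self):
        self._refcount = 1

    def reference(self):              # object life: reference counting
        self._refcount += 1

    def destroy(self):
        self._refcount -= 1
        if self._refcount == 0:
            print(f"freeing {type(self).__name__}")

    def view(self):                   # object observation: viewing
        print(type(self).__name__, vars(self))

def load_component(module_name, class_name, *args):
    """Basic service: load an object implementation at run time
    (importlib standing in for dynamic-link-library loading)."""
    cls = getattr(importlib.import_module(module_name), class_name)
    return cls(*args)

# e.g. solver = load_component("my_solvers", "KrylovSolver")  # hypothetical module
```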
Modeling Remineralization of Desalinated Water by Micronized Calcite Dissolution.
Hasson, David; Fine, Larissa; Sagiv, Abraham; Semiat, Raphael; Shemer, Hilla
2017-11-07
A widely used process for remineralization of desalinated water consists of dissolution of calcite particles by flow of acidified desalinated water through a bed packed with millimeter-size calcite particles. An alternative process consists of calcite dissolution by slurry flow of micron-size calcite particles with acidified desalinated water. The objective of this investigation is to provide theoretical models enabling design of remineralization by calcite slurry dissolution with carbonic and sulfuric acids. Extensive experimental results are presented displaying the effects of acid concentration, slurry feed concentration, and dissolution contact time. The experimental data agree to within 10% with theoretical predictions based on the simplifying assumption that the slurry consists of uniform particles represented by the surface mean diameter of the powder. Agreement between theory and experiment is improved by 1-8% by taking the powder size distribution into account. Apart from the practical value of this work in providing a hitherto lacking design tool for a novel technology, the paper has the merit of being among the very few publications providing experimental confirmation of the theory describing reaction kinetics in a segregated flow system.
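Assuming "surface mean diameter" refers to the usual Sauter mean d32, the uniform-particle simplification reduces to a one-line computation; the size distribution below is invented for illustration:

```python
import numpy as np

def surface_mean_diameter(d, n):
    """Sauter (surface) mean diameter: d32 = sum(n*d^3) / sum(n*d^2)."""
    d, n = np.asarray(d, float), np.asarray(n, float)
    return (n * d**3).sum() / (n * d**2).sum()

# hypothetical micronized-calcite size distribution
# (d = class diameters in microns, n = particle counts per class)
print(surface_mean_diameter(d=[1, 3, 5, 10], n=[500, 300, 150, 50]))
```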
Survey Says? A Primer on Web-based Survey Design and Distribution
Oppenheimer, Adam J.; Pannucci, Christopher J.; Kasten, Steven J.; Haase, Steven C.
2011-01-01
The internet has changed the way in which we gather and interpret information. While books were once the exclusive bearers of data, knowledge is now only a keystroke away. The internet has also facilitated the synthesis of new knowledge. Specifically, it has become a tool through which medical research is conducted. A review of the literature reveals that in the past year, over one hundred medical publications have been based on web-based survey data alone. Due to emerging internet technologies, web-based surveys can now be launched with little computer knowledge. They may also be self-administered, eliminating personnel requirements. Ultimately, an investigator may build, implement, and analyze survey results with speed and efficiency, obviating the need for mass mailings and data processing. All of these qualities have rendered telephone and mail-based surveys virtually obsolete. Despite these capabilities, web-based survey techniques are not without their limitations, namely recall and response biases. When used properly, however, web-based surveys can greatly simplify the research process. This article discusses the implications of web-based surveys and provides guidelines for their effective design and distribution. PMID:21701347
DOT National Transportation Integrated Search
2017-01-01
The New York State Department of Transportation (NYSDOT) has used the AASHTO 1993 Design Guide for the design of new flexible pavement structures for more than two decades. The AASHTO 1993 Guide is based on the empirical design equations developed fr...
Design and Strength Check of Large Blow Molding Machine Rack
NASA Astrophysics Data System (ADS)
Fei-fei, GU; Zhi-song, ZHU; Xiao-zhao, YAN; Yi-min, ZHU
The design procedure for a large blow molding machine rack is discussed in this article, and a strength-checking method is presented. Finite element analysis is conducted in the design procedure with ANSYS software, and the actual load-bearing situation of the rack is fully considered. The necessary simplifications of the model are made: the three-dimensional linear beam element Beam188 is used for the analysis, and MESH200 is used for meshing. This simplifies the analysis process and improves computational efficiency. The maximum deformation of the rack is 8.037 mm, occurring at the position of the accumulator head. This meets the national standard, which requires a deflection of not more than 0.3% of the total channel length, and the strength requirement is also met, the maximum stress being 54.112 MPa.
psRNATarget: a plant small RNA target analysis server
Dai, Xinbin; Zhao, Patrick Xuechun
2011-01-01
Plant endogenous non-coding short small RNAs (20–24 nt), including microRNAs (miRNAs) and a subset of small interfering RNAs (ta-siRNAs), play important roles in gene expression regulatory networks (GRNs). For example, many transcription factors and development-related genes have been reported as targets of these regulatory small RNAs. Although a number of miRNA target prediction algorithms and programs have been developed, most of them were designed for animal miRNAs, which are significantly different from plant miRNAs in the target recognition process. These differences demand the development of separate plant miRNA (and ta-siRNA) target analysis tool(s). We present psRNATarget, a plant small RNA target analysis server, which features two important analysis functions: (i) reverse complementary matching between small RNA and target transcript using a proven scoring schema, and (ii) target-site accessibility evaluation by calculating unpaired energy (UPE) required to 'open' secondary structure around the small RNA's target site on the mRNA. psRNATarget incorporates recent discoveries in plant miRNA target recognition, e.g. it distinguishes translational and post-transcriptional inhibition, and it reports the number of small RNA/target site pairs that may affect small RNA binding activity to the target transcript. The psRNATarget server is designed for high-throughput analysis of next-generation data with an efficient distributed computing back-end pipeline that runs on a Linux cluster. The server front-end integrates three simplified user-friendly interfaces to accept user-submitted or preloaded small RNAs and transcript sequences, and outputs a comprehensive list of small RNA/target pairs along with online tools for batch downloading, keyword searching and result sorting. The psRNATarget server is freely available at http://plantgrn.noble.org/psRNATarget/. PMID:21622958
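A schematic of reverse complementary matching with a penalty score; this is a toy, not the server's actual scoring schema: the costs, the assumed seed region, and its doubled weighting are all illustrative choices.

```python
def pairing_penalty(srna, target_site, seed=range(1, 13)):
    """Toy complementarity score (lower is better). Mismatch costs 1,
    G:U wobble 0.5, and penalties are doubled in an assumed seed region.
    srna and target_site are both given 5'->3'."""
    wobble = {("G", "U"), ("U", "G")}
    pair = {"A": "U", "U": "A", "G": "C", "C": "G"}
    score = 0.0
    # reverse the target site so its 3' end aligns with the sRNA's 5' end
    for i, (a, b) in enumerate(zip(srna, reversed(target_site))):
        if pair[a] == b:
            cost = 0.0
        elif (a, b) in wobble:
            cost = 0.5
        else:
            cost = 1.0
        score += 2 * cost if i in seed else cost
    return score

# perfectly complementary site -> penalty 0.0
print(pairing_penalty("UGACAGAAGAGAGUGAGCAC", "GUGCUCACUCUCUUCUGUCA"))
```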
Towards a metadata scheme for the description of materials - the description of microstructures
NASA Astrophysics Data System (ADS)
Schmitz, Georg J.; Böttger, Bernd; Apel, Markus; Eiken, Janin; Laschet, Gottfried; Altenfeld, Ralph; Berger, Ralf; Boussinot, Guillaume; Viardin, Alexandre
2016-01-01
The property of any material is essentially determined by its microstructure. Numerical models are increasingly the focus of modern engineering as helpful tools for tailoring and optimization of custom-designed microstructures by suitable processing and alloy design. A huge variety of software tools is available to predict various microstructural aspects for different materials. In the general frame of an integrated computational materials engineering (ICME) approach, these microstructure models provide the link between models operating at the atomistic or electronic scales and models operating on the macroscopic scale of the component and its processing. In view of improved interoperability of all these different tools, it is highly desirable to establish a standardized nomenclature and methodology for the exchange of microstructure data. The scope of this article is to provide a comprehensive system of metadata descriptors for the description of a 3D microstructure. The presented descriptors are limited to a mere geometric description of a static microstructure and have to be complemented by further descriptors, e.g. for properties, numerical representations, kinetic data, and others, in the future. Further attributes of each descriptor, e.g. on data origin, data uncertainty, and data validity range, are being defined in ongoing work. The proposed descriptors are intended to be independent of any specific numerical representation. The descriptors defined in this article may serve as a first basis for standardization and will simplify the data exchange between different numerical models, as well as promote the integration of experimental data into numerical models of microstructures. An HDF5 template data file for a simple three-phase Al-Cu microstructure, based on the defined descriptors, complements this article.
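A minimal sketch of what such an HDF5 template might look like from Python (h5py); the descriptor names and values here are illustrative placeholders, not the scheme's authoritative vocabulary:

```python
import h5py

# illustrative layout only: groups for phases, attributes for descriptors
with h5py.File("AlCu_microstructure_template.h5", "w") as f:
    f.attrs["description"] = "simple three-phase Al-Cu microstructure"
    f.attrs["spatial_dimensions"] = (100, 100, 100)  # voxel grid (assumed)
    f.attrs["voxel_size_um"] = 0.5                   # assumed resolution
    for name, fraction in [("fcc_Al", 0.82), ("theta_Al2Cu", 0.15), ("liquid", 0.03)]:
        grp = f.create_group(f"phases/{name}")       # creates nested groups
        grp.attrs["volume_fraction"] = fraction
```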
Essential core of the Hawking–Ellis types
NASA Astrophysics Data System (ADS)
Martín-Moruno, Prado; Visser, Matt
2018-06-01
The Hawking–Ellis (Segre–Plebański) classification of possible stress–energy tensors is an essential tool in analyzing the implications of the Einstein field equations in a more-or-less model-independent manner. In the current article the basic idea is to simplify the Hawking–Ellis type I, II, III, and IV classification by isolating the 'essential core' of the type II, type III, and type IV stress–energy tensors; this is done by subtracting (special cases of) type I to simplify the (Lorentz invariant) eigenvalue structure as much as possible without disturbing the eigenvector structure. We denote these 'simplified cores' type II0, type III0, and type IV0. These 'simplified cores' have very nice and simple algebraic properties. Furthermore, types I and II0 have very simple classical interpretations, while type IV0 is known to arise semi-classically (in renormalized expectation values of standard stress–energy tensors). In contrast, type III0 stands out in that it has neither a simple classical interpretation nor even a simple semi-classical interpretation. We also consider the robustness of this classification by examining the stability of the different Hawking–Ellis types under perturbations. We argue that types II and III are definitively unstable, whereas types I and IV are stable.
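To make the subtraction idea concrete for type II (in assumed but standard notation, with ℓ the null eigenvector):

```latex
% Schematic of the 'essential core' subtraction for type II
% (notation assumed; \ell^a is the null eigenvector):
T^{ab}_{\mathrm{II}} \;=\; T^{ab}_{\mathrm{I}} \;+\; f\,\ell^{a}\ell^{b},
\qquad \ell^{a}\ell_{a} = 0 .
```

The leftover piece, f ℓ^a ℓ^b, is then the type II0 core, classically interpretable as null dust, which is consistent with the 'very simple classical interpretations' noted above.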
Policy Compliance of Queries for Private Information Retrieval
2010-11-01
SPARQL, unfortunately, is not in RDF and so we had to develop tools to translate SPARQL queries into RDF to be used by our policy compliance prototype...policy-assurance/sparql2n3.py) that accepts SPARQL queries and returns the translated query in our simplified ontology. An example of a translated
Single-Case Time Series with Bayesian Analysis: A Practitioner's Guide.
ERIC Educational Resources Information Center
Jones, W. Paul
2003-01-01
This article illustrates a simplified time series analysis for use by the counseling researcher practitioner in single-case baseline plus intervention studies with a Bayesian probability analysis to integrate findings from replications. The C statistic is recommended as a primary analysis tool with particular relevance in the context of actual…
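Assuming the usual definition of the C statistic for single-case time series, it is a one-line computation; the series below is invented, and the standard-error formula used is the commonly published one:

```python
import numpy as np

def c_statistic(series):
    """C statistic (assuming the usual definition):
    C = 1 - sum((x[i+1] - x[i])**2) / (2 * sum((x - mean)**2))."""
    x = np.asarray(series, float)
    n = len(x)
    c = 1.0 - np.sum(np.diff(x) ** 2) / (2.0 * np.sum((x - x.mean()) ** 2))
    se = np.sqrt((n - 2) / ((n - 1) * (n + 1)))  # commonly published standard error
    return c, c / se  # z = C/SE as an approximate test for trend

print(c_statistic([3, 4, 4, 5, 6, 8, 9, 9, 10, 12]))  # toy baseline-plus-intervention series
```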
USDA-ARS's Scientific Manuscript database
Cellular automata (CA) is a powerful tool in modeling the evolution of macroscopic scale phenomena as it couples time, space, and variable together while remaining in a simplified form. However, such application has remained challenging in landscape-level chronic forest insect epidemics due to the h...
The Use of Google Scholar for Research and Research Dissemination
ERIC Educational Resources Information Center
Zientek, Linda R.; Werner, Jon M.; Campuzano, Mariela V.; Nimon, Kim
2018-01-01
The abundance of technological and Internet resources can both simplify and complicate a researcher's world. Such innovations place a burden on researchers to stay current with advances in technology and then discern the best technology tools to utilize. We first discuss benefits that Google Scholar can provide in the preparation of the literature…
Text Readability and Intuitive Simplification: A Comparison of Readability Formulas
ERIC Educational Resources Information Center
Crossley, Scott A.; Allen, David B.; McNamara, Danielle S.
2011-01-01
Texts are routinely simplified for language learners with authors relying on a variety of approaches and materials to assist them in making the texts more comprehensible. Readability measures are one such tool that authors can use when evaluating text comprehensibility. This study compares the Coh-Metrix Second Language (L2) Reading Index, a…
Multipath Very-Simplified Estimate of Adversary Sequence Interruption v. 2.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Snell, Mark K.
2017-10-10
MP VEASI is a training tool that models physical protection systems for fixed sites using Adversary Sequence Diagrams (ASDs) and then uses the ASD to find most-vulnerable adversary paths through the ASD. The identified paths have the lowest Probability of Interruption among all the paths through the ASD.
Text Simplification and Comprehensible Input: A Case for an Intuitive Approach
ERIC Educational Resources Information Center
Crossley, Scott A.; Allen, David; McNamara, Danielle S.
2012-01-01
Texts are routinely simplified to make them more comprehensible for second language learners. However, the effects of simplification upon the linguistic features of texts remain largely unexplored. Here we examine the effects of one type of text simplification: intuitive text simplification. We use the computational tool, Coh-Metrix, to examine…
Collaborative Learning through Formative Peer Review: Pedagogy, Programs and Potential
ERIC Educational Resources Information Center
Sondergaard, Harald; Mulder, Raoul A.
2012-01-01
We examine student peer review, with an emphasis on formative practice and collaborative learning, rather than peer grading. Opportunities to engage students in such formative peer assessment are growing, as a range of online tools become available to manage and simplify the process of administering student peer review. We consider whether…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williamson, Richard L.; Kochunas, Brendan; Adams, Brian M.
The Virtual Environment for Reactor Applications components included in this distribution comprise selected computational tools and supporting infrastructure that solve neutronics, thermal-hydraulics, fuel performance, and coupled neutronics-thermal-hydraulics problems. The infrastructure components provide a simplified common user input capability and provide for the physics integration with data transfer and coupled-physics iterative solution algorithms.
A Comparison of Simplified-Visually Rich and Traditional Presentation Styles
ERIC Educational Resources Information Center
Johnson, Douglas A.; Christensen, Jack
2011-01-01
Microsoft PowerPoint and similar presentation tools have become commonplace in higher education, yet there is very little research on the effectiveness of different PowerPoint formats for implementing this software. This study compared two PowerPoint presentation techniques: a more traditional format employing heavy use of bullet points with text…
NASA Astrophysics Data System (ADS)
Macher, H.; Grussenmeyer, P.; Kraemer, C.; Guillemin, S.
2015-08-01
In this paper, the 3D documentation of the full structure of the Romanesque church of Dugny-sur-Meuse is discussed. In 2012 and 2013, a 3D recording project was carried out under the supervision of the Photogrammetry and Geomatics Research Group from INSA Strasbourg (France) in cooperation with C. Kraemer, archaeologist from Nancy (France). The goal of the project was, on the one hand, to propose new solutions and tools to the archaeologists in charge of the project, especially for stone-by-stone measurements; on the other hand, a simplified 3D model was required by the local authorities for communication purposes. To achieve these goals, several techniques were applied, namely GNSS measurements and accurate traverse networks, photogrammetric recordings, and terrestrial laser scanning acquisitions. The various acquired data are presented in this paper. Based on these data, several deliverables are also proposed. The generation of orthoimages from planar as well as cylindrical surfaces is considered. Moreover, the workflow for the creation of a simplified 3D model is also presented.
Large Angle Transient Dynamics (LATDYN) user's manual
NASA Technical Reports Server (NTRS)
Abrahamson, A. Louis; Chang, Che-Wei; Powell, Michael G.; Wu, Shih-Chin; Bingel, Bradford D.; Theophilos, Paula M.
1991-01-01
A computer code for modeling the large angle transient dynamics (LATDYN) of structures was developed to investigate techniques for analyzing flexible deformation and control/structure interaction problems associated with large angular motions of spacecraft. This type of analysis is beyond the routine capability of conventional analytical tools without simplifying assumptions. In some instances, the motion may be sufficiently slow and the spacecraft (or component) sufficiently rigid to simplify analyses of dynamics and controls by making pseudo-static and/or rigid body assumptions. The LATDYN introduces a new approach to the problem by combining finite element structural analysis, multi-body dynamics, and control system analysis in a single tool. It includes a type of finite element that can deform and rotate through large angles at the same time, and which can be connected to other finite elements either rigidly or through mechanical joints. The LATDYN also provides symbolic capabilities for modeling control systems which are interfaced directly with the finite element structural model. Thus, the nonlinear equations representing the structural model are integrated along with the equations representing sensors, processing, and controls as a coupled system.
Lightweight genome viewer: portable software for browsing genomics data in its chromosomal context
Faith, Jeremiah J; Olson, Andrew J; Gardner, Timothy S; Sachidanandam, Ravi
2007-01-01
Background Lightweight genome viewer (lwgv) is a web-based tool for visualization of sequence annotations in their chromosomal context. It performs most of the functions of larger genome browsers, while relying on standard flat-file formats and bypassing the database needs of most visualization tools. Visualization as an aid to discovery requires display of novel data in conjunction with static annotations in their chromosomal context. With database-based systems, displaying dynamic results requires temporary tables that need to be tracked for removal. Results lwgv simplifies the visualization of user-generated results on a local computer. The dynamic results of these analyses are written to transient files, which can import static content from a more permanent file. lwgv is currently used in many different applications, from whole genome browsers to single-gene RNAi design visualization, demonstrating its applicability in a large variety of contexts and scales. Conclusion lwgv provides a lightweight alternative to large genome browsers for visualizing biological annotations and dynamic analyses in their chromosomal context. It is particularly suited for applications ranging from short sequences to medium-sized genomes when the creation and maintenance of a large software and database infrastructure is not necessary or desired. PMID:17877794
Pleated and Creased Structures
NASA Astrophysics Data System (ADS)
Dudte, Levi; Wei, Zhiyan; Mahadevan, L.
2012-02-01
The strategic placement of curved folds on a paper annulus produces saddle-shaped origami. These exotic geometries, resulting from simple design processes, motivate our development of a computational tool to simulate the stretching, bending and folding of thin sheets of material. We seek to understand the shape of the curved origami figure by applying the computational tool to simulate a thin annulus with single or multiple folds. We aim to quantify the static geometry of this simplified model in order to delineate methods for actuation and control of similar developable structures with curved folds. The Miura-ori pattern is a periodic pleated structure defined in terms of 2 angles and 2 lengths. The unit cell embodies the basic element in all non-trivial pleated structures - the mountain or valley folds, wherein four folds come together at a single vertex. The ability of this structure to pack and unpack with a few degrees of freedom leads to its use in deployable structures such as solar sails and maps, just as this feature is exploited in insect wings, plant leaves and flowers. We probe the qualitative and quantitative aspects of the mechanical behavior of these structures with a view to optimizing material performance.
LibHalfSpace: A C++ object-oriented library to study deformation and stress in elastic half-spaces
NASA Astrophysics Data System (ADS)
Ferrari, Claudio; Bonafede, Maurizio; Belardinelli, Maria Elina
2016-11-01
The study of deformation processes in elastic half-spaces is widely employed for many purposes (e.g. didactic, scientific investigation of real processes, inversion of geodetic data, etc.). We present a coherent programming interface containing a set of tools designed to make easier and faster the study of processes in an elastic half-space. LibHalfSpace is presented in the form of an object-oriented library. A set of well known and frequently used source models (Mogi source, penny shaped horizontal crack, inflating spheroid, Okada rectangular dislocation, etc.) are implemented to describe the potential usage and the versatility of the library. The common interface given to library tools enables us to switch easily among the effects produced by different deformation sources that can be monitored at the free surface. Furthermore, the library also offers an interface which simplifies the creation of new source models exploiting the features of object-oriented programming (OOP). These source models can be built as distributions of rectangular boundary elements. In order to better explain how new models can be deployed some examples are included in the library.
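For flavor, one of the source models named above in standalone form: the classical Mogi point-source surface-displacement formulas, written here as a Python sketch rather than the library's C++ API; the depth, volume change, and Poisson ratio below are arbitrary example values.

```python
import numpy as np

def mogi_surface_displacement(r, depth, dV, nu=0.25):
    """Surface displacement of a Mogi point source in an elastic half-space
    (standard closed form): returns (u_radial, u_vertical).
    r: radial distance(s) [m], depth: source depth [m], dV: volume change [m^3]."""
    R3 = (r**2 + depth**2) ** 1.5
    ur = (1 - nu) * dV / np.pi * r / R3      # horizontal (radial) displacement
    uz = (1 - nu) * dV / np.pi * depth / R3  # vertical uplift
    return ur, uz

# inflation of 1e6 m^3 at 3 km depth, profile out to 10 km from the source axis
r = np.linspace(0.0, 10e3, 5)
print(mogi_surface_displacement(r, depth=3e3, dV=1e6))
```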
Maintaining consistency in distributed systems
NASA Technical Reports Server (NTRS)
Birman, Kenneth P.
1991-01-01
In systems designed as assemblies of independently developed components, concurrent access to data or data structures normally arises within individual programs, and is controlled using mutual exclusion constructs, such as semaphores and monitors. Where data is persistent and/or sets of operation are related to one another, transactions or linearizability may be more appropriate. Systems that incorporate cooperative styles of distributed execution often replicate or distribute data within groups of components. In these cases, group oriented consistency properties must be maintained, and tools based on the virtual synchrony execution model greatly simplify the task confronting an application developer. All three styles of distributed computing are likely to be seen in future systems - often, within the same application. This leads us to propose an integrated approach that permits applications that use virtual synchrony with concurrent objects that respect a linearizability constraint, and vice versa. Transactional subsystems are treated as a special case of linearizability.
Sergi, Pier Nicola; Cavalcanti-Adam, Elisabetta Ada
2017-03-28
Topographical and chemical cues drive migration, outgrowth and regeneration of neurons in different and crucial biological conditions. In the natural extracellular matrix, their influences are so closely coupled that they result in complex cellular responses. As a consequence, engineered biomaterials are widely used to simplify in vitro conditions, disentangling intricate in vivo behaviours and narrowing the investigation to particular emergent responses. Nevertheless, how topographical and chemical cues affect the emergent response of neural cells is still unclear, so in silico models are used as additional tools to reproduce and investigate the interactions between cells and engineered biomaterials. This work aims to present the synergistic use of biomaterials-based experiments and computation as a strategic way to promote the discovery of complex neural responses, as well as to allow the interactions between cells and biomaterials to be quantitatively investigated, fostering a rational design of experiments.
BGen: A UML Behavior Network Generator Tool
NASA Technical Reports Server (NTRS)
Huntsberger, Terry; Reder, Leonard J.; Balian, Harry
2010-01-01
BGen software was designed for autogeneration of code based on a graphical representation of a behavior network used for controlling autonomous vehicles. A common format for describing a behavior network, such as that used in the JPL-developed behavior-based control system CARACaS ["Control Architecture for Robotic Agent Command and Sensing" (NPO-43635), NASA Tech Briefs, Vol. 32, No. 10 (October 2008), page 40], includes a graph with sensory inputs flowing through the behaviors in order to generate the signals for the actuators that drive and steer the vehicle. A computer program that translates Unified Modeling Language (UML) Freeform Implementation Diagrams into a legacy C implementation of a behavior network has been developed to simplify the development of C code for behavior-based control systems. UML is a popular standard developed by the Object Management Group (OMG) to model software architectures graphically. The C implementation of a behavior network functions as a decision tree.
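The following is a hypothetical Python sketch of the behavior-network idea, sensory inputs flowing through prioritized behaviors to produce actuator signals, evaluated like a decision tree; the behavior names and signal fields are invented, and BGen itself emits C, not Python.

```python
class Behavior:
    """A node that maps sensor inputs to an actuator recommendation."""
    def __init__(self, name, condition, action):
        self.name, self.condition, self.action = name, condition, action

    def evaluate(self, sensors):
        return self.action(sensors) if self.condition(sensors) else None

def run_network(behaviors, sensors):
    # First applicable behavior wins, as in a prioritized decision tree.
    for b in behaviors:
        command = b.evaluate(sensors)
        if command is not None:
            return b.name, command
    return "idle", {"throttle": 0.0, "steer": 0.0}

network = [
    Behavior("avoid", lambda s: s["range_m"] < 2.0,
             lambda s: {"throttle": 0.0, "steer": 0.8}),
    Behavior("cruise", lambda s: True,
             lambda s: {"throttle": 0.5, "steer": 0.0}),
]
print(run_network(network, {"range_m": 1.4}))   # -> ('avoid', ...)
```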
Inclusion of Structural Flexibility in Design Load Analysis for Wave Energy Converters: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guo, Yi; Yu, Yi-Hsiang; van Rij, Jennifer A
2017-08-14
Hydroelastic interactions, caused by ocean wave loading on wave energy devices with deformable structures, are studied in the time domain. A midfidelity, hybrid modeling approach of rigid-body and flexible-body dynamics is developed and implemented in an open-source simulation tool for wave energy converters (WEC-Sim) to simulate the dynamic responses of wave energy converter component structural deformations under wave loading. A generalized coordinate system, including degrees of freedom associated with rigid bodies, structural modes, and constraints connecting multiple bodies, is utilized. A simplified method of calculating stress loads and sectional bending moments is implemented, with the purpose of sizing and designing wave energy converters. Results calculated using the method presented are verified with those of high-fidelity fluid-structure interaction simulations, as well as low-fidelity, frequency-domain, boundary element method analysis.
Corlette, Sabrina; Downs, David; Monahan, Christine H; Yondorf, Barbara
2013-02-01
Value-based insurance is a relatively new approach to health insurance in which financial barriers, such as copayments, are lowered for clinical services that are considered high value, while consumer cost sharing may be increased for services considered to be of uncertain value. Such plans are complex and do not easily fit into the simplified, consumer-friendly comparison tools that many state health insurance exchanges are formulating for use in 2014. Nevertheless some states and plans are attempting to strike the right balance between a streamlined health exchange shopping experience and innovative, albeit complex, benefit design that promotes value. For example, agencies administering exchanges in Vermont and Oregon are contemplating offering value-based insurance plans as an option in addition to a set of standardized plans. In the postreform environment, policy makers must find ways to present complex value-based insurance plans in a way that consumers and employers can more readily understand.
Gutman, David A.; Dunn, William D.; Cobb, Jake; Stoner, Richard M.; Kalpathy-Cramer, Jayashree; Erickson, Bradley
2014-01-01
Advances in web technologies now allow direct visualization of imaging data sets without necessitating the download of large file sets or the installation of software. This allows centralization of file storage and facilitates image review and analysis. XNATView is a lightweight framework recently developed in our lab to visualize DICOM images stored in The Extensible Neuroimaging Archive Toolkit (XNAT). It consists of a PyXNAT-based framework to wrap around the REST application programming interface (API) and query the data in XNAT. XNATView was developed to simplify quality assurance, help organize imaging data, and facilitate data sharing for intra- and inter-laboratory collaborations. Its zero-footprint design allows the user to connect to XNAT from a web browser, navigate through projects, experiments, and subjects, and view DICOM images with accompanying metadata all within a single viewing instance. PMID:24904399
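As a rough illustration of the REST-based access pattern such tools build on, the sketch below queries an XNAT server with Python's requests library; the server URL and credentials are placeholders, and the /data/projects endpoint and ResultSet JSON layout reflect common XNAT REST conventions rather than anything specific to XNATView.

```python
import requests

XNAT = "https://xnat.example.org"          # placeholder server
session = requests.Session()
session.auth = ("user", "password")        # placeholder credentials

# List projects as JSON (XNAT REST convention: /data/projects?format=json)
resp = session.get(f"{XNAT}/data/projects", params={"format": "json"})
resp.raise_for_status()
for row in resp.json()["ResultSet"]["Result"]:
    print(row["ID"], row.get("name", ""))
```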
Nowak, Derek B; Lawrence, A J; Sánchez, Erik J
2010-12-10
We present the development of a versatile spectroscopic imaging tool to allow for imaging with single-molecule sensitivity and high spatial resolution. The microscope allows for near-field and subdiffraction-limited far-field imaging by integrating a shear-force microscope on top of a custom inverted microscope design. The instrument has the ability to image in ambient conditions with optical resolutions on the order of tens of nanometers in the near field. A single low-cost computer controls the microscope with a field programmable gate array data acquisition card. High spatial resolution imaging is achieved with an inexpensive CW multiphoton excitation source, using an apertureless probe and simplified optical pathways. The high-resolution, combined with high collection efficiency and single-molecule sensitive optical capabilities of the microscope, are demonstrated with a low-cost CW laser source as well as a mode-locked laser source.
The WLCG Messaging Service and its Future
NASA Astrophysics Data System (ADS)
Cons, Lionel; Paladin, Massimo
2012-12-01
Enterprise messaging is seen as an attractive mechanism to simplify and extend several portions of the Grid middleware, from low-level monitoring to experiment dashboards. The production messaging service currently used by WLCG includes four tightly coupled brokers operated by EGI (running Apache ActiveMQ and designed to host the Grid operational tools such as SAM) as well as two dedicated services for ATLAS-DDM and experiment dashboards (currently also running Apache ActiveMQ). In the future, this service is expected to grow in the number of supported applications, brokers, and technologies. The WLCG Messaging Roadmap identified three areas with room for improvement (security, scalability and availability/reliability) as well as ten practical recommendations to address them. This paper describes a messaging service architecture that is in line with these recommendations as well as a software architecture based on reusable components that ease interactions with the messaging service. These two architectures will support the growth of the WLCG messaging service.
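A minimal sketch of the broker interaction, assuming the third-party stomp.py client and a placeholder broker address; the destination name is invented and does not reflect the actual WLCG topic layout.

```python
import stomp   # third-party STOMP client (stomp.py)

conn = stomp.Connection([("broker.example.org", 61613)])  # placeholder broker
conn.connect("user", "password", wait=True)

# Publish a monitoring record to a topic, as Grid tools do with ActiveMQ
conn.send(destination="/topic/grid.probe.results",
          body='{"site": "EXAMPLE-SITE", "status": "OK"}',
          headers={"content-type": "application/json"})
conn.disconnect()
```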
A decade of innovation with laser speckle metrology
NASA Astrophysics Data System (ADS)
Ettemeyer, Andreas
2003-05-01
Speckle Pattern Interferometry has emerged from being an experimental substitute for holographic interferometry to become a powerful problem-solving tool in research and industry. The rapid development of computer and digital imaging techniques, in combination with miniaturization of the optical equipment, has led to new applications that had not been anticipated before. While classical holographic interferometry had always required careful consideration of environmental conditions such as vibration, noise, and ambient light, and could generally only be performed in the optical laboratory, it is now state of the art to operate portable speckle measuring equipment at almost any location. During the last decade, the change in design and technique has dramatically influenced the range of applications of speckle metrology and opened new markets. The integration of recent research results into speckle measuring equipment has led to handy instruments, simplified operation, and high-quality data output.
Automation of Data Traffic Control on DSM Architecture
NASA Technical Reports Server (NTRS)
Frumkin, Michael; Jin, Hao-Qiang; Yan, Jerry
2001-01-01
The design of distributed shared memory (DSM) computers liberates users from the duty to distribute data across processors and allows for the incremental development of parallel programs using, for example, OpenMP or Java threads. DSM architecture greatly simplifies the development of parallel programs having good performance on a few processors. However, achieving good program scalability on DSM computers requires that the user understand the data flow in the application and apply various techniques to avoid data traffic congestion. In this paper we discuss a number of such techniques, including data blocking, data placement, data transposition, and page size control, and evaluate their efficiency on the NAS (NASA Advanced Supercomputing) Parallel Benchmarks. We also present a tool that automates the detection of constructs causing data congestion in Fortran array-oriented codes and advises the user on code transformations for improving data traffic in the application.
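As a small illustration of the data-blocking technique mentioned above, the sketch below transposes a matrix tile by tile; it is a generic cache-blocking example in Python/NumPy, not the paper's Fortran tooling.

```python
import numpy as np

def blocked_transpose(a, block=64):
    """Transpose in cache-friendly tiles instead of long strided sweeps."""
    n, m = a.shape
    out = np.empty((m, n), dtype=a.dtype)
    for i in range(0, n, block):
        for j in range(0, m, block):
            # Each tile is small enough to stay resident in cache
            # (or in local memory on a DSM node) while it is worked on.
            out[j:j+block, i:i+block] = a[i:i+block, j:j+block].T
    return out

a = np.arange(1024 * 1024, dtype=np.float64).reshape(1024, 1024)
assert np.array_equal(blocked_transpose(a), a.T)
```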
JCMT observatory control system
NASA Astrophysics Data System (ADS)
Rees, Nicholas P.; Economou, Frossie; Jenness, Tim; Kackley, Russell D.; Walther, Craig A.; Dent, William R. F.; Folger, Martin; Gao, Xiaofeng; Kelly, Dennis; Lightfoot, John F.; Pain, Ian; Hovey, Gary J.; Redman, Russell O.
2002-12-01
The JCMT, the world's largest sub-mm telescope, has had essentially the same VAX/VMS based control system since it was commissioned. For the next generation of instrumentation we are implementing a new Unix/VxWorks based system, based on the successful ORAC system that was recently released on UKIRT. The system is now entering the integration and testing phase. This paper gives a broad overview of the system architecture and includes some discussion on the choices made. (Other papers in this conference cover some areas in more detail). The basic philosophy is to control the sub-systems with a small and simple set of commands, but passing detailed XML configuration descriptions along with the commands to give the flexibility required. The XML files can be passed between various layers in the system without interpretation, and so simplify the design enormously. This has all been made possible by the adoption of an Observation Preparation Tool, which essentially serves as an intelligent XML editor.
Open Source Clinical NLP - More than Any Single System.
Masanz, James; Pakhomov, Serguei V; Xu, Hua; Wu, Stephen T; Chute, Christopher G; Liu, Hongfang
2014-01-01
The number of Natural Language Processing (NLP) tools and systems for processing clinical free-text has grown as interest and processing capability have surged. Unfortunately any two systems typically cannot simply interoperate, even when both are built upon a framework designed to facilitate the creation of pluggable components. We present two ongoing activities promoting open source clinical NLP. The Open Health Natural Language Processing (OHNLP) Consortium was originally founded to foster a collaborative community around clinical NLP, releasing UIMA-based open source software. OHNLP's mission currently includes maintaining a catalog of clinical NLP software and providing interfaces to simplify the interaction of NLP systems. Meanwhile, Apache cTAKES aims to integrate best-of-breed annotators, providing a world-class NLP system for accessing clinical information within free-text. These two activities are complementary. OHNLP promotes open source clinical NLP activities in the research community and Apache cTAKES bridges research to the health information technology (HIT) practice.
NeuroManager: a workflow analysis based simulation management engine for computational neuroscience
Stockton, David B.; Santamaria, Fidel
2015-01-01
We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to super computer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college Biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in 22 stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project. PMID:26528175
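A minimal sketch of the staged-workflow idea, ordered stages with automatic labeling and time-stamping, in Python; the stage names and file layout are invented, and NeuroManager itself is implemented in MATLAB.

```python
import datetime, pathlib

STAGES = ["collect_parameters", "stage_input_files", "upload_model",
          "submit_job", "poll_status", "download_results", "archive"]

def run_simulation(job_id, workdir="runs"):
    """Walk a simulation job through ordered workflow stages, recording
    a time-stamped provenance line for each, in the spirit of the engine."""
    log = pathlib.Path(workdir, f"{job_id}.log")
    log.parent.mkdir(exist_ok=True)
    with log.open("a") as f:
        for stage in STAGES:
            stamp = datetime.datetime.now().isoformat(timespec="seconds")
            f.write(f"{stamp}  {job_id}  {stage}\n")   # provenance record
    return log

print(run_simulation("purkinje_cell_run_001"))
```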
NASA Astrophysics Data System (ADS)
Matha, Denis; Sandner, Frank; Schlipf, David
2014-12-01
Design verification of wind turbines is performed by simulation of design load cases (DLC) defined in the IEC 61400-1 and -3 standards or equivalent guidelines. Due to the resulting large number of necessary load simulations, a method is presented here to significantly reduce the computational effort for DLC simulations by introducing a reduced nonlinear model with simplified hydro- and aerodynamics. The advantage of the formulation is that the nonlinear ODE system contains only basic mathematical operations and no iterations or internal loops, which makes it very computationally efficient. Global turbine extreme and fatigue loads such as rotor thrust, tower base bending moment and mooring line tension, as well as platform motions, are outputs of the model. They can be used to identify critical and less critical load situations to be analysed afterwards with a higher-fidelity tool, and so speed up the design process. Results from these reduced-model DLC simulations are presented and compared to higher-fidelity models. Results in the frequency and time domains, as well as extreme and fatigue load predictions, demonstrate that good agreement between the reduced and advanced models is achieved, making it possible to efficiently exclude less critical DLC simulations and to identify the most critical subset of cases for a given design. Additionally, the model is applicable to brute-force optimization of floater control system parameters.
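To illustrate the flavor of such a reduced model, the sketch below integrates a single-degree-of-freedom surge equation with simplified aero- and hydrodynamic forcing; all parameters and forcing forms are invented placeholders, not the paper's model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters (not from the paper)
M, C, K = 8.0e6, 1.0e5, 4.0e5         # mass, damping, mooring stiffness
rho, A, Ct = 1.225, 1.2e4, 0.8        # air density, rotor area, thrust coeff.

def rhs(t, y, U10=12.0, Hs=3.0, Tp=10.0):
    x, v = y
    wave = 0.5 * Hs * np.sin(2 * np.pi * t / Tp)            # simplified wave
    thrust = 0.5 * rho * A * Ct * (U10 - v) * abs(U10 - v)  # simplified aero
    return [v, (thrust + 2.0e5 * wave - C * v - K * x) / M]

sol = solve_ivp(rhs, (0, 600), [0.0, 0.0], max_step=0.1)
surge = sol.y[0]
print(f"extreme surge: {surge.max():.2f} m")   # extreme-load proxy per DLC
```

Note that the right-hand side uses only basic arithmetic, mirroring the abstract's point that the reduced ODE system needs no iterations or internal loops.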
Model Based Optimal Sensor Network Design for Condition Monitoring in an IGCC Plant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kumar, Rajeeva; Kumar, Aditya; Dai, Dan
2012-12-31
This report summarizes the achievements and final results of this program. The objective of this program is to develop a general model-based sensor network design methodology and tools to address key issues in the design of an optimal sensor network configuration: the type, location, and number of sensors used in a network for online condition monitoring. In particular, the focus of this work is to develop software tools for optimal sensor placement (OSP) and to use these tools to design optimal sensor network configurations for online condition monitoring of gasifier refractory wear and radiant syngas cooler (RSC) fouling. The methodology developed will be applicable to sensing system design for online condition monitoring in a broad range of applications. The overall approach consists of (i) defining condition monitoring requirements in terms of OSP and mapping these requirements into mathematical terms for the OSP algorithm, (ii) analyzing the trade-offs of alternative OSP algorithms, down-selecting the most relevant ones, and developing them for IGCC applications, (iii) enhancing the gasifier and RSC models as required by the OSP algorithms, and (iv) applying the developed OSP algorithms to design the optimal sensor network required for condition monitoring of IGCC gasifier refractory wear and RSC fouling. Two key requirements for OSP for condition monitoring are the desired precision for the monitoring variables (e.g., refractory wear) and the reliability of the proposed sensor network in the presence of expected sensor failures. The OSP problem is naturally posed within a Kalman filtering approach as an integer programming problem where the key requirements of precision and reliability are imposed as constraints, and the optimization is performed over the overall network cost. Based on an extensive literature survey, two formulations were identified as being relevant to OSP for condition monitoring: one based on an LMI formulation and the other a standard INLP formulation. Various algorithms to solve these two formulations were developed and validated. For a given OSP problem the computational efficiency largely depends on the "size" of the problem, so initially a simplified 1-D gasifier model assuming axial and azimuthal symmetry was used to test the various OSP algorithms. These algorithms were then used to design the optimal sensor network for condition monitoring of IGCC gasifier refractory wear and RSC fouling. The sensor types and locations obtained as the solution to the OSP problem were validated using a model-based sensing approach. The OSP algorithm has been developed in modular form and packaged as a software tool for OSP design, in which a designer can explore various OSP design algorithms in a user-friendly way. The OSP software tool is implemented in-house in Matlab/Simulink. The tool also uses a few optimization routines that are freely available on the World Wide Web. In addition, a modular Extended Kalman Filter (EKF) block has been developed in Matlab/Simulink that can be utilized for model-based sensing of important process variables that are not directly measured, by combining the online sensors with model-based estimation once the hardware sensors and their locations have been finalized. The OSP algorithm details and the results of applying these algorithms to obtain optimal sensor locations for condition monitoring of gasifier refractory wear and the RSC fouling profile are summarized in this final report.
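As a rough sketch of the precision-constrained OSP idea in a linear-Gaussian (Kalman) setting, the code below greedily adds the sensor that most reduces the trace of the posterior covariance until a precision target is met; the greedy heuristic and all numbers are illustrative, not the report's LMI/INLP formulations.

```python
import numpy as np

def posterior_trace(H, P0, R_diag):
    """Trace of the posterior covariance for linear measurements y = Hx."""
    Rinv = np.diag(1.0 / R_diag)
    P = np.linalg.inv(np.linalg.inv(P0) + H.T @ Rinv @ H)
    return np.trace(P)

def greedy_osp(H_all, P0, R_all, precision_target):
    """Add the sensor that most reduces posterior uncertainty until the
    precision constraint is met (cost here is simply the sensor count)."""
    chosen, remaining = [], list(range(H_all.shape[0]))
    while remaining:
        best = min(remaining, key=lambda i: posterior_trace(
            H_all[chosen + [i]], P0, R_all[chosen + [i]]))
        chosen.append(best)
        remaining.remove(best)
        if posterior_trace(H_all[chosen], P0, R_all[chosen]) <= precision_target:
            break
    return chosen

rng = np.random.default_rng(0)
H_all = rng.normal(size=(12, 4))       # 12 candidate sensors, 4 states
R_all = np.full(12, 0.05)              # sensor noise variances
print(greedy_osp(H_all, np.eye(4), R_all, precision_target=0.2))
```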
NASA Astrophysics Data System (ADS)
Ivanov, Stanislav; Kamzolkin, Vladimir; Konilov, Aleksandr; Aleshin, Igor
2014-05-01
There are many methods of assessing the conditions of rock formation based on determining the composition of the constituent minerals. Our objective was to create a universal tool for processing the results of mineral chemical analyses and solving geothermobarometry problems by creating a database of existing sensors and providing a user-friendly standard interface. Similar computer-assisted tools based upon large collections of sensors (geothermometers and geobarometers) are known, for example the TPF project (Konilov A.N., 1999), a text-based sensor collection tool written in PASCAL. That application contained more than 350 different sensors and has been used widely in petrochemical studies (see A.N. Konilov, A.A. Grafchikov, V.I. Fonarev 2010 for a review). Our prototype uses the TPF project concept and is designed with modern application development techniques, which allows better flexibility. The main components of the designed system are three connected datasets: the sensor collection (geothermometers, geobarometers, oxygen geobarometers, etc.), petrochemical data, and modeling results. All data are maintained by special management and visualization tools and reside in an SQL database. System utilities allow the user to import and export data in various file formats, edit records, and plot graphs. The sensor database contains up-to-date collections of known methods; new sensors may be added by the user, while the measurement database is filled in by the researcher. The user-friendly interface allows access to all available data and sensors, automates routine work, reduces the risk of common user mistakes, and simplifies information exchange between research groups. We used the prototype to evaluate peak pressure during the formation of garnet-amphibolite apoeclogites, gneisses, and schists of the Blybsky metamorphic complex of the Front Range of the Northern Caucasus. In particular, our estimate of the formation pressure range (18 ± 4 kbar) agrees with independent research results. The reported study was partially supported by RFBR, research project No. 14-05-00615.
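A minimal sketch of the three-dataset layout using Python's sqlite3; the table and column names are hypothetical, and the sensor entry is only an example row.

```python
import sqlite3

con = sqlite3.connect("geothermobarometry.db")
con.executescript("""
CREATE TABLE IF NOT EXISTS sensors(        -- geothermometers/geobarometers
    id INTEGER PRIMARY KEY, name TEXT, kind TEXT, reference TEXT);
CREATE TABLE IF NOT EXISTS analyses(       -- measured mineral compositions
    id INTEGER PRIMARY KEY, sample TEXT, mineral TEXT,
    oxide TEXT, wt_percent REAL);
CREATE TABLE IF NOT EXISTS results(        -- modelling results
    id INTEGER PRIMARY KEY, sensor_id INTEGER, sample TEXT,
    t_celsius REAL, p_kbar REAL,
    FOREIGN KEY(sensor_id) REFERENCES sensors(id));
""")
con.execute("INSERT INTO sensors(name, kind, reference) VALUES (?,?,?)",
            ("grt-hbl thermometer", "geothermometer", "example reference"))
con.commit()
```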
VASSAR: Value assessment of system architectures using rules
NASA Astrophysics Data System (ADS)
Selva, D.; Crawley, E. F.
A key step of the mission development process is the selection of a system architecture, i.e., the layout of the major high-level system design decisions. This step typically involves the identification of a set of candidate architectures and a cost-benefit analysis to compare them. Computational tools have been used in the past to bring rigor and consistency into this process. These tools can automatically generate architectures by enumerating different combinations of decisions and options. They can also evaluate these architectures by applying cost models and simplified performance models. Current performance models are purely quantitative tools that are best suited to evaluating the technical performance of a mission design. However, assessing the relative merit of a system architecture is a much more holistic task than evaluating the performance of a mission design. Indeed, the merit of a system architecture comes from satisfying a variety of stakeholder needs, some of which are easy to quantify and some of which are harder to quantify (e.g., elegance, scientific value, political robustness, flexibility). Moreover, assessing the merit of a system architecture at these very early stages of design often requires dealing with a mix of (a) quantitative and semi-qualitative data and (b) objective and subjective information. Current computational tools are poorly suited to these purposes. In this paper, we propose a general methodology that can be used to assess the relative merit of several candidate system architectures in the presence of objective, subjective, quantitative, and qualitative stakeholder needs. The methodology is called VASSAR (Value ASsessment for System Architectures using Rules). Its major underlying assumption is that the merit of a system architecture can be assessed by comparing the capabilities of the architecture with the stakeholder requirements; for example, a candidate architecture that fully satisfies all critical stakeholder requirements is a good architecture. The assessment process is thus fundamentally seen as a pattern-matching process where capabilities match requirements, which motivates the use of rule-based expert systems (RBES). This paper describes the VASSAR methodology and shows how it can be applied to a large complex space system, namely an Earth observation satellite system. Companion papers show its applicability to the NASA space communications and navigation program and the joint NOAA-DoD NPOESS program.
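To make the pattern-matching idea concrete, here is a toy sketch in which each stakeholder requirement is a rule that fires when the architecture's capabilities satisfy it; it is a stand-in for a real rule-based expert system, and the requirements, weights, and capability fields are invented.

```python
# Hypothetical rules: (requirement, weight, predicate over capabilities).
REQUIREMENTS = [
    ("global soil-moisture map",  2.0, lambda c: "L-band radiometer" in c),
    ("daily revisit",             1.0, lambda c: c.get("revisit_h", 99) <= 24),
    ("affordable launch",         1.0, lambda c: c.get("mass_kg", 1e9) < 1500),
]

def merit(capabilities):
    """Weighted fraction of requirements whose rules fire for this design."""
    satisfied = sum(w for _, w, rule in REQUIREMENTS if rule(capabilities))
    return satisfied / sum(w for _, w, _ in REQUIREMENTS)

arch = {"L-band radiometer": True, "revisit_h": 12, "mass_kg": 900}
print(f"merit = {merit(arch):.2f}")   # 1.00: all rules fire
```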
Software design for a compact interferometer
NASA Astrophysics Data System (ADS)
Vogel, Andreas
1993-01-01
Experience shows that very often a lot of similar elements have to be tested by the optician. Only a small number of input parameters are changed in a well defined manner. So it is useful to develop simplified software for special applications. The software is used in a compact phase shifting interferometer. Up to five interferometers can be controlled by a single PC-AT computer. Modular programming simplifies the software modification for new applications.
Li, D H; Wang, W; Li, X; Gao, Y L; Liu, D H; Liu, D L; Xu, W D
2017-01-01
The International Hip Outcome Tool (iHOT-33) is a questionnaire designed for young, active patients with hip disorders. It has proven to be a highly reliable and valid questionnaire. The main purpose of our study was to adapt the iHOT-33 questionnaire into simplified Chinese and to assess its psychometric properties in Chinese patients. The iHOT-33 was cross-culturally adapted into Chinese, and 138 patients completed the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC), the EuroQol-5D (EQ-5D), and the Chinese version of the iHOT-33 (SC-iHOT-33) pre- or postoperatively within 6 months' follow-up. Cronbach's alpha, the intraclass correlation coefficient (ICC), Pearson's correlation coefficient (r), effect size (ES), and standardized response mean (SRM) were calculated to assess the reliability, validity, and responsiveness of the SC-iHOT-33. Total Cronbach's alpha was 0.965, representing excellent internal consistency of the SC-iHOT-33. The ICC ranged from 0.866 to 0.929, showing excellent test-retest reliability. The subscales of the SC-iHOT-33 had the highest correlation coefficient (r = 0.812) with the physical function subscales of the WOMAC, as well as good correlation between the social/emotional subscale of the SC-iHOT-33 and the EQ-5D (r = 0.740, r = 0.743). No floor or ceiling effects were found. The ES and SRM values indicated good responsiveness of 2.44 and 2.67, respectively. The SC-iHOT-33 questionnaire is reliable, valid, and responsive for the evaluation of young, Chinese, active patients with hip disorders. Copyright © 2016 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.
Development of a Comprehensive Community Nitrogen Oxide Emissions Reduction Toolkit (CCNERT)
NASA Astrophysics Data System (ADS)
Sung, Yong Hoon
The main objective of this study is to research and develop a simplified tool to estimate energy use in a community and its associated effects on air pollution. This tool is intended to predict the impacts of selected energy conservation options and efficiency programs on emission reduction. It is intended to help local government and their residents understand and manage information collection and the procedures to be used. This study presents a broad overview of the community-wide energy use and NOx emissions inventory process. It also presents various simplified procedures to estimate each sector's energy use. In an effort to better understand community-wide energy use and its associated NOx emissions, the City of College Station, Texas, was selected as a case study community for this research. While one community might successfully reduce the production of NOx emissions by adopting electricity efficiency programs in its buildings, another community might be equally successful by changing the mix of fuel sources used to generate electricity, which is consumed by the community. In yet a third community low NOx automobiles may be mandated. Unfortunately, the impact and cost of one strategy over another changes over time as major sources of pollution are reduced. Therefore, this research proposes to help community planners answer these questions and to assist local communities with their NOx emission reduction plans by developing a Comprehensive Community NOx Emissions Reduction Toolkit (CCNERT). The proposed simplified tool could have a substantial impact on reducing NOx emission by providing decision-makers with a preliminary understanding about the impacts of various energy efficiency programs on emissions reductions. To help decision makers, this study has addressed these issues by providing a general framework for examining how a community's non-renewable energy use leads to NOx emissions, by quantifying each end-user's energy usage and its associated NOx emissions, and by evaluating the environmental benefits of various types of energy saving options.
The Sky is for Everyone — Outreach and Education with the Virtual Observatory
NASA Astrophysics Data System (ADS)
Freistetter, F.; Iafrate, G.; Ramella, M.; Aida-Wp5 Team
2010-12-01
The Virtual Observatory (VO) is an international project to collect astronomical data (images, spectra, simulations, mission-logs, etc.), organise them and develop tools that let astronomers access this huge amount of information. The VO not only simplifies the work of professional astronomers, it is also a valuable tool for education and public outreach. For teachers and astronomers who actively promote astronomy to the public, the VO is a great opportunity to access and use real astronomical data, and have a taste of the daily life of astronomers.
Geocoded data structures and their applications to Earth science investigations
NASA Technical Reports Server (NTRS)
Goldberg, M.
1984-01-01
A geocoded data structure is a means for digitally representing a geographically referenced map or image. The characteristics of representative cellular, linked, and hybrid geocoded data structures are reviewed. The data processing requirements of Earth science projects at the Goddard Space Flight Center and the basic tools of geographic data processing are described. Specific ways that new geocoded data structures can be used to adapt these tools to scientists' needs are presented. These include: expanding analysis and modeling capabilities; simplifying the merging of data sets from diverse sources; and saving computer storage space.
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Boerschlein, David P.
1993-01-01
The Fault-Tree Compiler (FTC) program is a software tool used to calculate the probability of the top event in a fault tree. Gates of five different types are allowed in a fault tree: AND, OR, EXCLUSIVE OR, INVERT, and M OF N. The high-level input language is easy to understand and use. In addition, the program supports a hierarchical fault-tree definition feature, which simplifies the tree-description process and reduces execution time. A set of programs was created forming the basis for a reliability-analysis workstation: SURE, ASSIST, PAWS/STEM, and the FTC fault-tree tool (LAR-14586). Written in PASCAL, ANSI-compliant C, and FORTRAN 77. Other versions are available upon request.
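A small sketch of the underlying gate arithmetic, assuming statistically independent basic events (and identical inputs for the M-of-N gate); the example tree and probabilities are invented, and FTC's own input language and algorithms are not reproduced here.

```python
from math import comb, prod

# The five gate types, for independent basic-event probabilities:
def p_and(ps):       return prod(ps)
def p_or(ps):        return 1.0 - prod(1.0 - p for p in ps)
def p_xor(p1, p2):   return p1 * (1 - p2) + p2 * (1 - p1)
def p_invert(p):     return 1.0 - p
def p_m_of_n(m, n, p):   # n identical, independent inputs
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(m, n + 1))

# Top = OR( AND(a, b), 2-of-3 identical pumps failing )
a, b, pump = 1e-3, 2e-3, 5e-2
top = p_or([p_and([a, b]), p_m_of_n(2, 3, pump)])
print(f"P(top event) = {top:.3e}")
```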
3D Feature Extraction for Unstructured Grids
NASA Technical Reports Server (NTRS)
Silver, Deborah
1996-01-01
Visualization techniques provide tools that help scientists identify observed phenomena in scientific simulation. To be useful, these tools must allow the user to extract regions, classify and visualize them, abstract them for simplified representations, and track their evolution. Object Segmentation provides a technique to extract and quantify regions of interest within these massive datasets. This article explores basic algorithms to extract coherent amorphous regions from two-dimensional and three-dimensional scalar unstructured grids. The techniques are applied to datasets from Computational Fluid Dynamics and those from Finite Element Analysis.
Development of a design methodology for asphalt treated mixtures.
DOT National Transportation Integrated Search
2013-12-01
This report summarizes the results of a study that was conducted to develop a simplified design methodology for asphalt treated mixtures that are durable, stable, constructible, and cost effective through the examination of the performance of mix...
Georgetown University Photovoltaic Higher Education National Exemplar Facility (PHENEF)
NASA Technical Reports Server (NTRS)
Marshall, N.
1984-01-01
Several photographs of this facility using photovoltaic (PV) cells are shown. An outline is given of the systems requirements, system design and wiring topology, a simplified block design, module electrical characteristics, PV module and PV module matching.
ERIC Educational Resources Information Center
Dolan, Thomas G.
2002-01-01
Describes Clark County, Nevada's use of prototype school designs to respond to its rapidly growing school population. The purpose of the prototypes is to simplify designs so that schools can be built quickly and minimize the time and expense that comes with variations. (EV)
Argilés, Josep M.; Betancourt, Angelica; Guàrdia-Olmos, Joan; Peró-Cebollero, Maribel; López-Soriano, Francisco J.; Madeddu, Clelia; Serpe, Roberto; Busquets, Sílvia
2017-01-01
The CAchexia SCOre (CASCO) was described as a tool for the staging of cachectic cancer patients. The aim of this study is to show the metric properties of CASCO in order to classify cachectic cancer patients into three different groups, each associated with a numerical scoring range. The final aim was to clinically validate CASCO for use in the classification of cachectic cancer patients in clinical practice. We carried out a case-control study that prospectively enrolled 186 cancer patients and 95 age-matched controls. The score includes five components: (1) body weight loss and composition, (2) inflammation/metabolic disturbances/immunosuppression, (3) physical performance, (4) anorexia, and (5) quality of life. The present study provides clinical validation for the use of the score. To show the metric properties of CASCO, three different groups of cachectic cancer patients were established according to the results obtained with the statistical approach used: mild cachexia (15 ≤ x ≤ 28), moderate cachexia (29 ≤ x ≤ 46), and severe cachexia (47 ≤ x ≤ 100). In addition, a simplified version of CASCO, MiniCASCO (MCASCO), was also presented; it constitutes a valid and easy-to-use tool for cachexia staging. Statistically significant correlations were found between CASCO and other validated indexes such as the Eastern Cooperative Oncology Group (ECOG) score and the subjective diagnosis of cachexia by specialized oncologists. A highly significant estimated correlation between CASCO and MCASCO was found, suggesting that MCASCO might constitute an easy and valid tool for the staging of cachectic cancer patients. CASCO and MCASCO provide a new tool for the quantitative staging of cachectic cancer patients with a clear advantage over previous classifications. PMID:28261113
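The reported cut-offs translate directly into a staging function; in the sketch below, labeling scores under 15 as non-cachectic is an assumption, since the abstract only defines the three cachexia bands.

```python
def casco_stage(score):
    """Stage a patient from a CASCO score (0-100) using the study's cut-offs."""
    if score < 15:
        return "non-cachectic"      # assumption: below the mild band
    if score <= 28:
        return "mild cachexia"      # 15 <= score <= 28
    if score <= 46:
        return "moderate cachexia"  # 29 <= score <= 46
    return "severe cachexia"        # 47 <= score <= 100

for s in (10, 20, 35, 60):
    print(s, casco_stage(s))
```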
Probabilistic distance-based quantizer design for distributed estimation
NASA Astrophysics Data System (ADS)
Kim, Yoon Hak
2016-12-01
We consider the iterative design of independently operating local quantizers at nodes that must cooperate without interaction to achieve application objectives in distributed estimation systems. As a new cost function we suggest a probabilistic distance between the posterior distribution and its quantized version, expressed as the Kullback-Leibler (KL) divergence. We first present analysis showing that minimizing the KL divergence in the cyclic generalized Lloyd design framework is equivalent to maximizing the logarithm of the quantized posterior distribution on average, which can be further reduced computationally in our iterative design. We propose an iterative design algorithm that seeks to maximize this simplified version of the quantized posterior distribution, and we argue that the algorithm converges to a global optimum, due to the convexity of the cost function, and generates the most informative quantized measurements. We also provide an independent encoding technique that enables minimization of the cost function and can be efficiently simplified for practical use in power-constrained nodes. We finally demonstrate through extensive experiments a clear advantage in estimation performance over typical designs and previously published design techniques.
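For context, the sketch below implements the classic squared-error generalized Lloyd iteration that the proposed design modifies; the paper's contribution is to replace this distortion measure with the KL divergence between posteriors, which is not reproduced here.

```python
import numpy as np

def lloyd(samples, levels, iters=100):
    """Classic generalized Lloyd design with squared-error distortion."""
    reps = np.quantile(samples, np.linspace(0.1, 0.9, levels))  # init
    for _ in range(iters):
        # Encoder step: nearest-representative partition
        idx = np.argmin(np.abs(samples[:, None] - reps[None, :]), axis=1)
        # Decoder step: centroid (conditional mean) update
        for j in range(levels):
            if np.any(idx == j):
                reps[j] = samples[idx == j].mean()
    return np.sort(reps)

rng = np.random.default_rng(1)
x = rng.normal(size=20_000)
print(lloyd(x, 4))   # ~ +/-0.45 and +/-1.51 for a 4-level Gaussian quantizer
```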
Design of RISC Processor Using VHDL and Cadence
NASA Astrophysics Data System (ADS)
Moslehpour, Saeid; Puliroju, Chandrasekhar; Abu-Aisheh, Akram
The project deals with the development of a basic RISC processor. The processor is designed with a basic architecture consisting of internal modules such as a clock generator, memory, program counter, instruction register, accumulator, arithmetic and logic unit, and decoder. This processor is mainly intended for simple general-purpose tasks such as arithmetic operations and can be further developed into a general-purpose processor by increasing the size of the instruction register. The processor is designed in VHDL using Xilinx 8.1i. The present project also serves as an application of the knowledge gained from past studies of the PSPICE program; the study shows how PSPICE can be used to simplify massive complex circuits designed in VHDL synthesis. The purpose of the project is to explore the designed RISC model piece by piece, examine and understand the input/output pins, and show how the VHDL synthesis code can be converted to a simplified PSPICE model. The project also serves as a collection of various research materials about the pieces of the circuit.
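As a language-neutral illustration of the described architecture, the following Python sketch mimics the fetch-decode-execute loop over a program counter, instruction register, accumulator, and ALU; the instruction set is invented and far simpler than the VHDL design.

```python
# Hypothetical accumulator machine mirroring the listed internal modules.
def run(program, memory):
    acc, pc = 0, 0                        # accumulator, program counter
    while pc < len(program):
        ir = program[pc]                  # instruction register (fetch)
        op, arg = ir                      # decoder splits opcode/operand
        if   op == "LOAD":  acc = memory[arg]
        elif op == "ADD":   acc += memory[arg]        # ALU operation
        elif op == "SUB":   acc -= memory[arg]        # ALU operation
        elif op == "STORE": memory[arg] = acc
        elif op == "HALT":  break
        pc += 1                           # clocked program-counter update
    return acc, memory

prog = [("LOAD", 0), ("ADD", 1), ("STORE", 2), ("HALT", None)]
print(run(prog, {0: 7, 1: 5, 2: 0}))      # -> (12, {0: 7, 1: 5, 2: 12})
```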
Pan, Qiaosheng; Miao, Enming; Wu, Bingxuan; Chen, Weikang; Lei, Xiujun; He, Liangguo
2017-07-01
A novel, bio-inspired, single-phase driven piezoelectric linear motor (PLM) using an asymmetric stator was designed, fabricated, and tested to avoid mode degeneracy and to simplify the drive mechanism of a piezoelectric motor. A piezoelectric transducer composed of two piezoelectric stacks and a displacement amplifier was used as the driving element of the PLM. Two simple and specially designed claws performed elliptical motion. A numerical simulation was performed to design the stator and determine the feasibility of the design mechanism of the PLM. Moreover, an experimental setup was built to validate the working principles, as well as to evaluate the performance, of the PLM. The prototype motor outputs a no-load speed of 233.7 mm/s at a voltage of 180 V p-p and a maximum thrust force of 2.3 N under a preload of 10 N. This study verified the feasibility of the proposed design and provided a method to simplify the driving harmonic signal and structure of PLMs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Richards, Elizabeth H.; Schindel, Kay; Bosiljevac, Tom
2011-12-01
Structural Considerations for Solar Installers provides a comprehensive outline of structural considerations associated with simplified solar installations and recommends a set of best practices installers can follow when assessing such considerations. Information in the manual comes from engineering and solar experts as well as case studies. The objectives of the manual are to ensure safety and structural durability for rooftop solar installations and to potentially accelerate the permitting process by identifying and remedying structural issues prior to installation. The purpose of this document is to provide tools and guidelines for installers to help ensure that residential photovoltaic (PV) power systems are properly specified and installed with respect to the continuing structural integrity of the building.
77 FR 76588 - Request for Proposal Platform Pilot
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-28
...The Small Business Administration (SBA) is announcing a pilot where federal agencies will test a new request for proposal (RFP) platform (RFP-EZ) to streamline the process through which the government buys web design and related technology services from small businesses for acquisitions valued at or below the simplified acquisition threshold (SAT). RFP-EZ is one of five projects sponsored by the Office of Science and Technology Policy's Presidential Innovation Fellows Program, which leverages the ingenuity of leading problem solvers from across America together with federal innovators to tackle projects that aim to fuel job creation, save taxpayers money, and significantly improve how the federal government serves the American people. Under the RFP-EZ pilot, which will initially run from December 28, 2012 through May 1, 2013, agencies will identify individual procurements valued at or below the simplified acquisition threshold that can be set aside for small businesses to test a suite of functional tools for: (1) Simplifying the development of statements of work, (2) improving agency access to information about small businesses, (3) enabling small businesses to submit quotes, bids or proposals (collectively referred to as proposals) electronically in response to a solicitation posted on Federal Business Opportunities (FedBizOpps); (4) enhancing efficiencies for evaluating proposals, and (5) improving how information (including prices paid by federal agencies) is captured and stored. The pilot will be conducted in accordance with existing laws and regulations. Interested parties are encouraged to review and comment on the functionality of RFP-EZ, as described at www.sba.gov/rfpez and highlighted in this notice. Responses to this notice will be considered for possible refinements to the RFP-EZ platform during the pilot and as part of the evaluation of the benefits and costs of making RFP-EZ a permanent platform fully integrated with FedBizOpps, the System for Award Management and agency contract writing systems.
DOT National Transportation Integrated Search
2017-01-01
This report summarizes the local calibration of the distress models for the Northeast (NE) region of the United States and the development of new design tables for new flexible pavement structures. Design, performance, and traffic data collected on t...
NASA Astrophysics Data System (ADS)
Bianco, C.; Tosco, T.; Sethi, R.
2017-12-01
Nanoremediation is a promising in-situ technology for the reclamation of contaminated aquifers. It consists of the subsurface injection of a reactive colloidal suspension for the in-situ treatment of pollutants. The overall success of this technology at the field scale is strictly related to the achievement of an effective and efficient emplacement of the nanoparticles (NP) inside the contaminated area. Mathematical models can be used to support the design of nanotechnology-based remediation by effectively assessing the expected NP mobility at the field scale. Several analytical and numerical tools have been developed in recent years to model the transport of NPs in simplified geometries and boundary conditions. The numerical tool MNMs was developed by the authors of this work to simulate colloidal transport in 1D Cartesian and radial coordinates. A new modelling tool, MNM3D (Micro and Nanoparticle transport Model in 3D geometries), was also proposed for the simulation of injection and transport of NP suspensions in generic complex scenarios. MNM3D accounts for the simultaneous dependency of NP transport on water ionic strength and velocity. The software was developed to predict NP mobility at different stages of a nanoremediation application, from the design stage to the prediction of the long-term fate after injection. In this work an integrated experimental-modelling procedure is applied to support the design of a field-scale injection of goethite NPs carried out in the framework of the H2020 European project Reground. Column tests are performed at different injection flow rates using natural sand collected at the contaminated site as the porous medium. The tests are interpreted using MNMs to characterize the NP mobility and derive the constitutive equations describing the suspension behavior in the natural porous medium. MNM3D is then used to predict NP behavior during the field-scale injection and to assess the long-term mobility of the injected slurry. Finally, different injection scenarios were simulated to get a reliable estimation of several operating parameters, e.g. particle distribution around the injection well, radius of influence, and number of required wells.
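To illustrate the kind of constitutive model such tools solve, the sketch below steps a 1D advection-dispersion equation with first-order particle attachment using an explicit finite-difference scheme; all parameters are illustrative, and this is not MNMs/MNM3D code.

```python
import numpy as np

# 1D advection-dispersion with first-order attachment (explicit scheme)
L, nx = 0.5, 101                 # column length [m], grid points
v, D, katt = 1e-4, 1e-7, 1e-4    # velocity [m/s], dispersion, attachment [1/s]
dx = L / (nx - 1)
dt = 0.25 * min(dx / v, dx**2 / D)         # conservative stability limit

C = np.zeros(nx)
for _ in range(int(4 * 3600 / dt)):        # four hours of injection
    C[0] = 1.0                             # normalized inlet concentration
    adv  = -v * (C[1:-1] - C[:-2]) / dx            # upwind advection
    disp = D * (C[2:] - 2 * C[1:-1] + C[:-2]) / dx**2
    C[1:-1] += dt * (adv + disp - katt * C[1:-1])  # attachment sink
    C[-1] = C[-2]                          # free-outflow boundary

print(f"breakthrough C/C0 at outlet after 4 h: {C[-1]:.3f}")
```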
Skyline: an open source document editor for creating and analyzing targeted proteomics experiments
MacLean, Brendan; Tomazela, Daniela M.; Shulman, Nicholas; Chambers, Matthew; Finney, Gregory L.; Frewen, Barbara; Kern, Randall; Tabb, David L.; Liebler, Daniel C.; MacCoss, Michael J.
2010-01-01
Summary: Skyline is a Windows client application for targeted proteomics method creation and quantitative data analysis. It is open source and freely available for academic and commercial use. The Skyline user interface simplifies the development of mass spectrometer methods and the analysis of data from targeted proteomics experiments performed using selected reaction monitoring (SRM). Skyline supports using and creating MS/MS spectral libraries from a wide variety of sources to choose SRM filters and verify results based on previously observed ion trap data. Skyline exports transition lists to and imports the native output files from Agilent, Applied Biosystems, Thermo Fisher Scientific and Waters triple quadrupole instruments, seamlessly connecting mass spectrometer output back to the experimental design document. The fast and compact Skyline file format is easily shared, even for experiments requiring many sample injections. A rich array of graphs displays results and provides powerful tools for inspecting data integrity as data are acquired, helping instrument operators to identify problems early. The Skyline dynamic report designer exports tabular data from the Skyline document model for in-depth analysis with common statistical tools. Availability: Single-click, self-updating web installation is available at http://proteome.gs.washington.edu/software/skyline. This web site also provides access to instructional videos, a support board, an issues list and a link to the source code project. Contact: brendanx@u.washington.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20147306
Ultra-low noise combs in the palm of your hand
NASA Astrophysics Data System (ADS)
Schibli, Thomas R.
Mode-locked lasers are attractive tools for precision measurements and for photonic microwave generation. The technology around these lasers has rapidly evolved, and with the invention of optical frequency combs, fs-technology has become a ubiquitous tool in science and engineering. At first, most of these combs were generated by bulky and delicate Kerr-lens mode-locked Ti:sapphire systems, but these have now been mostly replaced by much more robust and compact fiber lasers. However, the move from table-top solid-state lasers to fully self-contained fiber systems came at a price: the optical phase noise performance degraded due to design constraints. While this is of no concern for most spectroscopic applications, it poses a challenge for applications that require excellent short-term phase noise performance, such as photonic microwave generation. While much of this has been improved by ingenious laser designs, it remains a challenge to obtain ultra-low phase-noise combs from high-repetition-rate fiber lasers. Here we present a new approach consisting of a monolithic cavity design, in which the laser light is fully confined inside an optical material. Thanks to this monolithic design, these solid-state lasers are inherently robust against environmental perturbations such as acoustics, vibrations, air pressure, and humidity. As opposed to the omnipresent mode-locked fiber lasers, these monolithic lasers exhibit very low round-trip loss, dispersion, and nonlinearities. As a result, they produce highly stable pulse trains, with free-running relative linewidths of the order of a few Hz in the optical domain, despite their moderately high fundamental repetition rates of 1 GHz. The compact design further simplifies integration into complex systems and eliminates the need for an optics bench or a vibration-isolated platform. These lasers produce less than 0.2 W of heat and are fully turn-key. This work was supported by the DARPA PULSE program with a Grant from AMRDEC and by the NSF Early Career Award.
NASA Astrophysics Data System (ADS)
Williams, C. A.; Dicaprio, C.; Simons, M.
2003-12-01
With the advent of projects such as the Plate Boundary Observatory and future InSAR missions, spatially dense geodetic data of high quality will provide an increasingly detailed picture of the movement of the earth's surface. To interpret such information, powerful and easily accessible modeling tools are required. We are presently developing such a tool that we feel will meet many of the needs for evaluating quasi-static earth deformation. As a starting point, we begin with a modified version of the finite element code TECTON, which has been specifically designed to solve tectonic problems involving faulting and viscoelastic/plastic earth behavior. As our first priority, we are integrating the code into the GeoFramework, which is an extension of the Python-based Pyre modeling framework. The goal of this framework is to provide simplified user interfaces for powerful modeling codes, to provide easy access to utilities such as meshers and visualization tools, and to provide a tight integration between different modeling tools so they can interact with each other. The initial integration of the code into this framework is essentially complete, and a more thorough integration, where Python-based drivers control the entire solution, will be completed in the near future. We have an evolving set of priorities that we expect to solidify as we receive more input from the modeling community. Current priorities include the development of linear and quadratic tetrahedral elements, the development of a parallelized version of the code using the PETSc libraries, the addition of more complex rheologies, realistic fault friction models, adaptive time stepping, and spherical geometries. In this presentation we describe current progress toward our various priorities, briefly describe the structure of the code within the GeoFramework, and demonstrate some sample applications.
NASA Technical Reports Server (NTRS)
2010-01-01
Topics covered include: Burnishing Techniques Strengthen Hip Implants; Signal Processing Methods Monitor Cranial Pressure; Ultraviolet-Blocking Lenses Protect, Enhance Vision; Hyperspectral Systems Increase Imaging Capabilities; Programs Model the Future of Air Traffic Management; Tail Rotor Airfoils Stabilize Helicopters, Reduce Noise; Personal Aircraft Point to the Future of Transportation; Ducted Fan Designs Lead to Potential New Vehicles; Winglets Save Billions of Dollars in Fuel Costs; Sensor Systems Collect Critical Aerodynamics Data; Coatings Extend Life of Engines and Infrastructure; Radiometers Optimize Local Weather Prediction; Energy-Efficient Systems Eliminate Icing Danger for UAVs; Rocket-Powered Parachutes Rescue Entire Planes; Technologies Advance UAVs for Science, Military; Inflatable Antennas Support Emergency Communication; Smart Sensors Assess Structural Health; Hand-Held Devices Detect Explosives and Chemical Agents; Terahertz Tools Advance Imaging for Security, Industry; LED Systems Target Plant Growth; Aerogels Insulate Against Extreme Temperatures; Image Sensors Enhance Camera Technologies; Lightweight Material Patches Allow for Quick Repairs; Nanomaterials Transform Hairstyling Tools; Do-It-Yourself Additives Recharge Auto Air Conditioning; Systems Analyze Water Quality in Real Time; Compact Radiometers Expand Climate Knowledge; Energy Servers Deliver Clean, Affordable Power; Solutions Remediate Contaminated Groundwater; Bacteria Provide Cleanup of Oil Spills, Wastewater; Reflective Coatings Protect People and Animals; Innovative Techniques Simplify Vibration Analysis; Modeling Tools Predict Flow in Fluid Dynamics; Verification Tools Secure Online Shopping, Banking; Toolsets Maintain Health of Complex Systems; Framework Resources Multiply Computing Power; Tools Automate Spacecraft Testing, Operation; GPS Software Packages Deliver Positioning Solutions; Solid-State Recorders Enhance Scientific Data Collection; Computer Models Simulate Fine Particle Dispersion; Composite Sandwich Technologies Lighten Components; Cameras Reveal Elements in the Short Wave Infrared; Deformable Mirrors Correct Optical Distortions; Stitching Techniques Advance Optics Manufacturing; Compact, Robust Chips Integrate Optical Functions; Fuel Cell Stations Automate Processes, Catalyst Testing; Onboard Systems Record Unique Videos of Space Missions; Space Research Results Purify Semiconductor Materials; and Toolkits Control Motion of Complex Robotics.
Direct conversion of infrared radiant energy for space power applications
NASA Technical Reports Server (NTRS)
Finke, R. C.
1982-01-01
A proposed technology to convert the earth's radiant energy (infrared albedo) into spacecraft power is presented. The resultant system would eliminate energy storage requirements and simplify spacecraft design. The design and performance of an infrared rectenna are discussed.
Simplifying Chemical Reactor Design by using Molar Quantities Instead of Fractional Conversion.
ERIC Educational Resources Information Center
Brown, Lee F.; Falconer, John L.
1987-01-01
Explains the advantages of using molar quantities in chemical reactor design. Advocates the use of differential versions of reactor mass balances rather than the integrated forms. Provides specific examples and cases to illustrate the principles. (ML)
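To make the point concrete, here is a minimal sketch, assuming a first-order liquid-phase reaction A -> B in an isothermal plug-flow reactor with invented rate and flow values: the mass balance is written and integrated directly in molar flow rates, and fractional conversion is recovered only at the end if desired.

# Minimal sketch: PFR design in molar flow rates rather than conversion.
# Assumed example: first-order A -> B at constant volumetric flow v0.
from scipy.integrate import solve_ivp

k = 0.5       # rate constant, 1/s (assumed)
v0 = 1.0e-3   # volumetric flow rate, m^3/s (assumed)
FA0 = 0.1     # inlet molar flow of A, mol/s (assumed)

def mole_balance(V, F):
    FA, FB = F
    CA = FA / v0          # concentration recovered from molar flow
    rA = -k * CA          # first-order rate law
    return [rA, -rA]      # dFA/dV, dFB/dV

sol = solve_ivp(mole_balance, (0.0, 2.0e-3), [FA0, 0.0])
FA_out = sol.y[0, -1]
print(f"Outlet FA = {FA_out:.4f} mol/s")
print(f"Conversion X = {(FA0 - FA_out) / FA0:.3f}")  # computed afterwards, not integrated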
DOT National Transportation Integrated Search
2014-11-15
The simplified procedure in design codes for determining earthquake response spectra involves estimating site coefficients to adjust available rock accelerations to site accelerations. Several investigators have noted concerns with the site coeff...
NASA Technical Reports Server (NTRS)
Johnson, Paul W.
2008-01-01
ePORT (electronic Project Online Risk Tool) provides a systematic approach to using an electronic database program to manage program/project risk management processes. This presentation briefly covers standard risk management procedures, then covers NASA's risk management tool, ePORT, in depth. ePORT is a web-based risk management program that provides a common framework for capturing and managing risks, independent of a program's or project's size and budget. By providing standardized evaluation criteria for common management reporting, ePORT improves Product Line, Center, and Corporate Management insight, simplifies program/project manager reporting, and maintains an archive of data for historical reference.
Verifying the error bound of numerical computation implemented in computer systems
Sawada, Jun
2013-03-12
A verification tool receives a finite-precision definition for an approximation of an infinite-precision numerical function implemented in a processor, in the form of a polynomial of bounded functions. The tool receives a domain for verifying outputs of segments associated with the function, splits the domain into at least two non-overlapping segments, and converts, for each segment, the polynomial of bounded functions into a simplified formula comprising a polynomial, an inequality, and a constant. The tool then calculates upper bounds of the polynomial for the segments, beginning with a selected segment, and reports the segments that violate the bounding condition.
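As a toy illustration of the segment-splitting idea, the sketch below checks a degree-5 polynomial approximation of sin(x) against an assumed error budget on non-overlapping segments; dense sampling stands in for the tool's formal bound calculation, and all numbers are invented for illustration.

# Toy sketch of segment splitting and bound checking (illustration only;
# the actual tool computes formally verified bounds, not sampled ones).
import math
import numpy as np

def poly_sin(x):
    # Degree-5 Taylor polynomial of sin(x), an assumed stand-in for the
    # finite-precision approximation under verification.
    return x - x**3 / 6.0 + x**5 / 120.0

def max_error(lo, hi, n=10000):
    xs = np.linspace(lo, hi, n)
    return float(np.max(np.abs(poly_sin(xs) - np.sin(xs))))

bound = 1e-3                                # assumed error budget
edges = np.linspace(0.0, math.pi / 2, 5)    # four non-overlapping segments

for lo, hi in zip(edges[:-1], edges[1:]):
    err = max_error(lo, hi)
    status = "ok" if err <= bound else "violates bound"
    print(f"[{lo:.3f}, {hi:.3f}]  max error ~ {err:.2e}  ({status})")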
Simplify to survive: prescriptive layouts ensure profitable scaling to 32nm and beyond
NASA Astrophysics Data System (ADS)
Liebmann, Lars; Pileggi, Larry; Hibbeler, Jason; Rovner, Vyacheslav; Jhaveri, Tejas; Northrop, Greg
2009-03-01
The time-to-market-driven need to maintain concurrent process-design co-development, even in the face of discontinuous patterning, process, and device innovation, is reiterated. The escalating design-rule complexity resulting from increasing layout sensitivities in physical and electrical yield, and the consequent risk to profitable technology scaling, is reviewed. Shortcomings in traditional Design for Manufacturability (DfM) solutions are identified and contrasted with the highly successful integrated design-technology co-optimization used for SRAM and other memory arrays. The feasibility of extending memory-style design-technology co-optimization, based on a highly simplified layout environment, to logic chips is demonstrated. Layout density benefits, modeled patterning and electrical yield improvements, and substantially improved layout simplicity are quantified in a conventional versus template-based design comparison on a 65nm IBM PowerPC 405 microprocessor core. The adaptability of this highly regularized template-based design solution to different yield concerns and design styles is shown in the extension of this work to 32nm, with an increased focus on interconnect redundancy. In closing, work not covered in this paper, focused on the process side of the integrated process-design co-optimization, is introduced.
Dual Telecentric Lens System For Projection Onto Tilted Toroidal Screen
NASA Technical Reports Server (NTRS)
Gold, Ronald S.; Hudyma, Russell M.
1995-01-01
System of two optical assemblies for projecting image onto tilted toroidal screen. One projection lens optimized for red and green spectral region; other for blue. Dual-channel approach offers several advantages which include: simplified color filtering, simplified chromatic aberration corrections, less complex polarizing prism arrangement, and increased throughput of blue light energy. Used in conjunction with any source of imagery, designed especially to project images formed by reflection of light from liquid-crystal light valve (LCLV).
COVD-QOL questionnaire: An adaptation for school vision screening using Rasch analysis
Abu Bakar, Nurul Farhana; Ai Hong, Chen; Pik Pin, Goh
2012-01-01
Purpose: To adapt the College of Optometrists in Vision Development (COVD-QOL) questionnaire as a vision screening tool for primary school children. Methods: An interview regarding visual symptoms was conducted with children, teachers, or guardians of 88 children (45 from special education classes and 43 from mainstream classes) in government primary schools. Data were assessed for response categories, item fit (infit/outfit: 0.6–1.4), and separation reliability (item/person: 0.80). Questionnaire results were compared with a vision assessment identifying three categories of vision disorders: reduced visual acuity, accommodative response anomaly, and convergence insufficiency. Screening performance of the simplified questionnaire was evaluated with receiver-operating-characteristic analysis for detection of any target condition in both types of classes. Predictive validity was assessed with a Spearman rank correlation (criterion >0.3). Results: Two response categories were underutilized and were therefore collapsed into adjacent categories, and the items were reduced to 14. Item separation reliability for the simplified questionnaire was acceptable (0.86), but person separation reliability was inadequate for both special education classes (0.79) and mainstream classes (0.78). Discriminant cut-off scores of 9 (mainstream classes) and 3 (special education classes) on the 14 items yielded sensitivity and specificity of 65% and 54%, and 78% and 80%, with Spearman rank correlations of 0.16 and 0.40, respectively. Conclusion: The simplified 14-item COVD-QOL questionnaire performs adequately among children in special education classes, suggesting its suitability as a vision screening tool.
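For readers unfamiliar with cut-off mechanics, the sketch below uses invented scores, not the study's data, to show how a discriminant cut-off on a symptom questionnaire yields sensitivity and specificity against a reference vision assessment.

# Toy sketch: sensitivity/specificity at a questionnaire cut-off score.
# Scores and reference outcomes are invented for illustration.
import numpy as np

scores = np.array([2, 5, 9, 12, 3, 7, 10, 1, 8, 11])           # symptom scores
has_disorder = np.array([0, 0, 1, 1, 0, 1, 1, 0, 0, 1], bool)  # reference exam

cutoff = 7                     # refer children scoring at or above the cut-off
flagged = scores >= cutoff
sensitivity = (flagged & has_disorder).sum() / has_disorder.sum()
specificity = (~flagged & ~has_disorder).sum() / (~has_disorder).sum()
print(f"cut-off {cutoff}: sensitivity {sensitivity:.0%}, specificity {specificity:.0%}")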
A User's Guide to Topological Data Analysis
ERIC Educational Resources Information Center
Munch, Elizabeth
2017-01-01
Topological data analysis (TDA) is a collection of powerful tools that can quantify shape and structure in data in order to answer questions from the data's domain. This is done by representing some aspect of the structure of the data in a simplified topological signature. In this article, we introduce two of the most commonly used topological…
eCAF: A New Tool for the Conversational Analysis of Electronic Communication
ERIC Educational Resources Information Center
Duncan-Howell, Jennifer
2009-01-01
Electronic communication is characteristically concerned with "the message" (eM), those who send them (S), and those who receive and read them (R). This relationship could be simplified into the equation eM = S + R. When this simple equation is applied to electronic communication, several elements are added that make this straightforward act of…
Health Literacy: An Opportunity to Improve Individual, Community, and Global Health
ERIC Educational Resources Information Center
Pleasant, Andrew
2011-01-01
Over the past decade, the field of health literacy has advanced from providing limited tools for simplifying language into the basis for a viable theory of the complex relationship between knowledge, attitudes, behavior, and health outcomes, ranging from the individual to the societal level. While roughly a decade passed between what seem to be…
Equity Audits: A Practical Leadership Tool for Developing Equitable and Excellent Schools
ERIC Educational Resources Information Center
Skrla, Linda; Scheurich, James Joseph; Garcia, Juanita; Nolly, Glenn
2004-01-01
Persistent achievement gaps by race and class in U.S. public schools are educationally and ethically deplorable and thus need to be eliminated. Based on their research on schools and districts that have narrowed these gaps, the authors have developed a simplified reconceptualization of equity auditing, a concept with a respected history in civil…
Operating tool for a distributed data and information management system
NASA Astrophysics Data System (ADS)
Reck, C.; Mikusch, E.; Kiemle, S.; Wolfmüller, M.; Böttcher, M.
2002-07-01
The German Remote Sensing Data Center has developed the Data Information and Management System (DIMS), which provides multi-mission ground-system services for earth observation product processing, archiving, ordering, and delivery. DIMS successfully uses the newest technologies within its services. This paper presents the solution adopted to simplify operation tasks for this large, distributed system.
Simplifying microbial electrosynthesis reactor design.
Giddings, Cloelle G S; Nevin, Kelly P; Woodward, Trevor; Lovley, Derek R; Butler, Caitlyn S
2015-01-01
Microbial electrosynthesis, an artificial form of photosynthesis, can efficiently convert carbon dioxide into organic commodities; however, this process has previously been demonstrated only in reactors with features likely to be a barrier to scale-up. Therefore, the possibility of simplifying reactor design by both eliminating potentiostatic control of the cathode and removing the membrane separating the anode and cathode was investigated with biofilms of Sporomusa ovata, which reduces carbon dioxide to acetate and acts as the microbial catalyst, with plain graphite stick cathodes serving as the electron donor. In traditional 'H-cell' reactors, in which the anode and cathode chambers were separated by a proton-selective membrane, the rates and coulombic efficiencies of microbial electrosynthesis remained high when electron delivery at the cathode was powered with a direct-current power source rather than the potentiostat-poised cathode used in previous studies. A membrane-less reactor with a direct-current power source, with the cathode and anode positioned to avoid oxygen exposure at the cathode, retained high rates of acetate production as well as high coulombic and energetic efficiencies. The finding that microbial electrosynthesis is feasible without a membrane separating the anode from the cathode, coupled with a direct-current power source supplying the energy for electron delivery, is expected to greatly simplify future reactor design and lower construction costs.
Can a More User-Friendly Medicare Plan Finder Improve Consumers' Selection of Medicare Plans?
Martino, Steven C; Kanouse, David E; Miranda, David J; Elliott, Marc N
2017-10-01
To evaluate the efficacy for consumers of two potential enhancements to the Medicare Plan Finder (MPF): a simplified data display and a "quick links" home page designed to match the specific tasks that users seek to accomplish on the MPF. Participants (N = 641) were seniors and adult caregivers of seniors recruited from a national online panel. Participants browsed a simulated version of the MPF, made a hypothetical plan choice, and reported on their experience. Participants were randomly assigned to one of eight conditions in a fully factorial design: 2 home pages (quick links, current MPF home page) × 2 data displays (simplified, current MPF display) × 2 plan types (stand-alone prescription drug plan [PDP], Medicare Advantage plan with prescription drug coverage [MA-PD]). The quick-links page resulted in more favorable perceptions of the MPF, improved users' understanding of the information, and increased the probability of choosing the objectively best plan. The simplified data display resulted in a more favorable evaluation of the website, better comprehension of the displayed information, and, among those choosing a PDP only, an increased probability of choosing the best plan. Design enhancements could markedly improve average website users' understanding, ability to use, and experience of using the MPF. © Health Research and Educational Trust.
An Integrated Multivariable Visualization Tool for Marine Sanctuary Climate Assessments
NASA Astrophysics Data System (ADS)
Shein, K. A.; Johnston, S.; Stachniewicz, J.; Duncan, B.; Cecil, D.; Ansari, S.; Urzen, M.
2012-12-01
The comprehensive development and use of ecological climate impact assessments by ecosystem managers can be limited by data access and visualization methods that require a priori knowledge of the various large and complex climate data products underlying those assessments. In addition, it can be difficult to integrate climate and ecological data geographically and temporally to fully characterize climate-driven ecological impacts. To address these considerations, we have enhanced and extended the functionality of the NOAA National Climatic Data Center's Weather and Climate Toolkit (WCT). The WCT is a freely available Java-based tool designed to access and display NCDC's georeferenced climate data products (e.g., satellite, radar, and reanalysis gridded data). However, the WCT requires that users already know how to obtain the data products, which products are preferred for a given variable, and which products are most relevant to their needs. Developed in cooperation with research and management customers at the Gulf of the Farallones National Marine Sanctuary, the Integrated Marine Protected Area Climate Tools (IMPACT) modification to the WCT simplifies or eliminates these requirements while adding core analytical functionality to the tool. Designed for use by marine ecosystem managers, WCT-IMPACT accesses a suite of data products identified as relevant to marine ecosystem climate impact assessments, such as NOAA's Climate Data Records. WCT-IMPACT regularly crops these products to the geographic boundaries of each included marine protected area (MPA), and the clipped regions are processed to produce MPA-specific analytics. The tool retrieves the most appropriate data files based on the user's selection of MPA, environmental variable(s), and time frame. Once the data are loaded, they may be visualized, explored, analyzed, and exported to other formats (e.g., Google KML). Multiple variables may be visualized simultaneously using a four-panel display and compared via a variety of statistics such as difference, probability, or correlation maps. [Figure caption: NCDC's Weather and Climate Toolkit image of NARR-A non-convective cloud cover (%) over the Pacific Coast on June 17, 2012 at 09:00 GMT.]
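The clip-and-analyze step can be pictured with a small sketch using synthetic arrays and an illustrative bounding box (not the WCT-IMPACT code): crop a gridded field to an MPA's extent and compute a simple difference statistic.

# Illustrative sketch: crop a gridded climate field to an MPA bounding box
# and compute a difference map. Grids, values, and the box are invented.
import numpy as np

lats = np.linspace(30.0, 45.0, 61)      # toy 0.25-degree grid
lons = np.linspace(-130.0, -115.0, 61)
rng = np.random.default_rng(1)
sst_current = 15.0 + rng.normal(0.0, 0.5, (61, 61))
sst_climatology = 14.5 + rng.normal(0.0, 0.5, (61, 61))

# Bounding box loosely around the Gulf of the Farallones (illustrative values)
in_lat = (lats >= 37.0) & (lats <= 38.5)
in_lon = (lons >= -124.0) & (lons <= -122.0)

diff = sst_current[np.ix_(in_lat, in_lon)] - sst_climatology[np.ix_(in_lat, in_lon)]
print(f"MPA mean anomaly: {diff.mean():+.2f} degC over {diff.size} grid cells")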
Using simple agent-based modeling to inform and enhance neighborhood walkability.
Badland, Hannah; White, Marcus; Macaulay, Gus; Eagleson, Serryn; Mavoa, Suzanne; Pettit, Christopher; Giles-Corti, Billie
2013-12-11
Pedestrian-friendly neighborhoods with proximal destinations and services encourage walking and decrease car dependence, thereby contributing to more active and healthier communities. Proximity to key destinations and services is an important aspect of the urban design decision making process, particularly in areas adopting a transit-oriented development (TOD) approach to urban planning, whereby densification occurs within walking distance of transit nodes. Modeling destination access within neighborhoods has been limited to circular catchment buffers or more sophisticated network-buffers generated using geoprocessing routines within geographical information systems (GIS). Both circular and network-buffer catchment methods are problematic. Circular catchment models do not account for street networks, thus do not allow exploratory 'what-if' scenario modeling; and network-buffering functionality typically exists within proprietary GIS software, which can be costly and requires a high level of expertise to operate. This study sought to overcome these limitations by developing an open-source simple agent-based walkable catchment tool that can be used by researchers, urban designers, planners, and policy makers to test scenarios for improving neighborhood walkable catchments. A simplified version of an agent-based model was ported to a vector-based open source GIS web tool using data derived from the Australian Urban Research Infrastructure Network (AURIN). The tool was developed and tested with end-user stakeholder working group input. The resulting model has proven to be effective and flexible, allowing stakeholders to assess and optimize the walkability of neighborhood catchments around actual or potential nodes of interest (e.g., schools, public transport stops). Users can derive a range of metrics to compare different scenarios modeled. These include: catchment area versus circular buffer ratios; mean number of streets crossed; and modeling of different walking speeds and wait time at intersections. The tool has the capacity to influence planning and public health advocacy and practice, and by using open-access source software, it is available for use locally and internationally. There is also scope to extend this version of the tool from a simple to a complex model, which includes agents (i.e., simulated pedestrians) 'learning' and incorporating other environmental attributes that enhance walkability (e.g., residential density, mixed land use, traffic volume).
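A minimal sketch of the underlying catchment metric, assuming a toy grid street network in place of the tool's vector GIS data, computes the network-reachable area within a walking-distance budget and its ratio to the circular buffer (the catchment-to-buffer, or 'ped-shed', ratio).

# Minimal sketch: network walkable catchment vs. circular buffer.
# The street network, block size, and cut-off are assumed for illustration.
import math
import networkx as nx

BLOCK = 100.0                              # grid spacing, metres (assumed)
G = nx.grid_2d_graph(21, 21)               # toy street network
for u, v in G.edges:
    G.edges[u, v]["length"] = BLOCK

origin = (10, 10)                          # node of interest, e.g. a transit stop
cutoff = 800.0                             # walking-distance budget, metres

dist = nx.single_source_dijkstra_path_length(G, origin, cutoff=cutoff, weight="length")

# Crude area estimate: one block cell per reachable intersection.
network_area = len(dist) * BLOCK**2
buffer_area = math.pi * cutoff**2
print(f"Reachable intersections: {len(dist)}")
print(f"Ped-shed ratio ~ {network_area / buffer_area:.2f}")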
Variation simulation for compliant sheet metal assemblies with applications
NASA Astrophysics Data System (ADS)
Long, Yufeng
Sheet metals are widely used in discrete products, such as automobiles, aircraft, furniture, and electronic appliances, due to their good manufacturability and low cost. A typical automotive body assembly consists of more than 300 parts welded together in more than 200 assembly fixture stations. Such an assembly system is usually quite complex and takes a long time to develop. As automotive customers demand products of increasing quality in shorter times, engineers in the automotive industry turn to computer-aided engineering (CAE) tools for help. Computers are an invaluable resource for engineers, not only to simplify and automate the design process, but also to share design specifications with manufacturing groups so that production systems can be tooled up quickly and efficiently. It is therefore beneficial to develop computerized simulation and evaluation tools for the development of automotive body assembly systems. It is well known that assembly architectures (joints, fixtures, and assembly lines) have a profound impact on the dimensional quality of compliant sheet metal assemblies. To evaluate sheet metal assembly architectures, a special dimensional analysis tool needs to be developed for predicting dimensional variation of the assembly; corresponding systematic tools can then be established to help engineers select assembly architectures. In this dissertation, a unified variation model is developed to predict variation in compliant sheet metal assemblies by considering fixture-induced rigid-body motion, deformation, and springback. Based on the unified variation model, variation propagation models are established for multiple assembly stations with various configurations. To evaluate the dimensional capability of assembly architectures, quantitative indices are proposed based on the sensitivity matrix; these indices are independent of the variation level of the process. Examples demonstrate their application in selecting robust assembly architectures, and useful guidelines for the selection of assembly architectures are summarized. In addition, to enhance fault diagnosis, a systematic methodology is proposed for the selection of measurement configurations: principles involved in selecting measurements are generalized first; corresponding quantitative indices are then developed to evaluate measurement configurations; and finally, examples are presented.
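The sensitivity-based evaluation behind such indices can be sketched briefly with illustrative matrices and variances (not the dissertation's model): for a linearized assembly response d_out = S d_in, source covariance propagates as Cov_out = S Cov_in S^T, and a quantity such as the largest singular value of S rates an architecture independently of the input variation level.

# Hedged sketch: covariance propagation through an assumed sensitivity matrix.
import numpy as np

rng = np.random.default_rng(0)
S = 0.5 * rng.normal(size=(6, 4))   # assumed sensitivities: 4 sources -> 6 key points
cov_in = np.diag([0.05, 0.02, 0.04, 0.03]) ** 2   # source variances, mm^2 (assumed)

cov_out = S @ cov_in @ S.T          # propagated assembly covariance
print("Key-point std devs (mm):", np.round(np.sqrt(np.diag(cov_out)), 4))

# Process-independent robustness index: worst-case amplification of input
# variation, given here by the largest singular value of S.
print("Max amplification:", round(float(np.linalg.svd(S, compute_uv=False)[0]), 3))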