A Discussion of Issues in Integrity Constraint Monitoring
NASA Technical Reports Server (NTRS)
Fernandez, Francisco G.; Gates, Ann Q.; Cooke, Daniel E.
1998-01-01
In the development of large-scale software systems, analysts, designers, and programmers identify properties of data objects in the system. The ability to check those assertions during runtime is desirable as a means of verifying the integrity of the program. Typically, programmers ensure the satisfaction of such properties through some form of manually embedded assertion check. The disadvantage of this approach is that these assertions become entangled within the program code. The goal of the research is to develop an integrity constraint monitoring mechanism whereby software system properties (called integrity constraints), held in a central repository, are automatically inserted into the program by the mechanism to check for incorrect program behaviors. Such a mechanism would overcome many of the deficiencies of manually embedded assertion checks. This paper gives an overview of the preliminary work performed toward this goal. The manual instrumentation of constraint checking on a series of test programs is discussed. This review is then used as the basis for a discussion of issues to be considered in developing an automated integrity constraint monitor.
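As a rough illustration of the contrast drawn here, the sketch below shows one way a centralized repository of integrity constraints could check program state at runtime, instead of scattering assert statements through the code. It is a minimal Python sketch; the class name, API, and example constraints are invented for illustration, not taken from the paper.

    # Minimal sketch of a centralized integrity-constraint monitor, assuming a
    # repository of named predicates checked against program state at runtime.
    # All names (ConstraintRepository, check_all, etc.) are illustrative.

    class ConstraintRepository:
        def __init__(self):
            self._constraints = []  # (description, predicate) pairs

        def register(self, description, predicate):
            """Store a system property as a predicate over program state."""
            self._constraints.append((description, predicate))

        def check_all(self, state):
            """Return descriptions of all constraints violated by `state`."""
            return [desc for desc, pred in self._constraints if not pred(state)]

    repo = ConstraintRepository()
    repo.register("tank pressure within limits",
                  lambda s: 0.0 <= s["pressure"] <= 350.0)
    repo.register("valve closed implies zero flow",
                  lambda s: s["valve_open"] or s["flow_rate"] == 0.0)

    state = {"pressure": 120.0, "valve_open": False, "flow_rate": 3.2}
    print(repo.check_all(state))  # -> ['valve closed implies zero flow']

The point of the repository is that the properties live in one place and can be checked against the program mechanically, which is what the proposed monitor automates.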
DOE Office of Scientific and Technical Information (OSTI.GOV)
Delgoshaei, Parastoo; Austin, Mark A.; Pertzborn, Amanda J.
State-of-the-art building simulation control methods incorporate physical constraints into their mathematical models, but omit implicit constraints associated with policies of operation and dependency relationships among rules representing those constraints. To overcome these shortcomings, there is a recent trend toward enabling control strategies with inference-based rule checking capabilities. One solution is to exploit semantic web technologies in building simulation control. Such approaches provide the tools for semantic modeling of domains, and the ability to deduce new information from the models through use of Description Logic (DL). In a step toward enabling this capability, this paper presents a cross-disciplinary data-driven control strategy for building energy management simulation that integrates semantic modeling and formal rule checking mechanisms into a Model Predictive Control (MPC) formulation. The results show that MPC provides superior levels of performance when initial conditions and inputs are derived from inference-based rules.
Development of an expert planning system for OSSA
NASA Technical Reports Server (NTRS)
Groundwater, B.; Lembeck, M. F.; Sarsfield, L.; Diaz, Alphonso
1988-01-01
This paper presents concepts related to preliminary work for the development of an expert planning system for NASA's Office of Space Science and Applications (OSSA). The expert system will function as a planner's decision aid in preparing mission plans encompassing sets of proposed OSSA space science initiatives. These plans in turn will be checked against budgetary and technical constraints and tested for constraint violations. Appropriate advice will be generated by the system for making modifications to the plans to bring them in line with the constraints. The OSSA Planning Expert System (OPES) has been designed to function as an integral part of the OSSA mission planning process. It will be able to suggest a best plan, accept and check a user-suggested strawman plan, and provide quick responses to user requests and actions. OPES will be written in the C programming language and have a transparent user interface running under Windows 386 on a Compaq 386/20 machine. The system's stored knowledge and inference procedures will model the expertise of human planners familiar with the OSSA planning domain. Given mission priorities and budget guidelines, the system first sets the launch dates for each mission. It will check to make sure that planetary launch windows and precursor mission relationships are not violated. Additional levels of constraints will then be considered, checking such things as the availability of a suitable launch vehicle, total mission launch mass required vs. the identified launch mass capability, and the total power required by the payload at its destination vs. the actual power available. System output will be in the form of Gantt charts, spreadsheet hardcopy, and other presentation-quality materials detailing the resulting OSSA mission plan.
NASA Technical Reports Server (NTRS)
Horvath, Joan C.; Alkalaj, Leon J.; Schneider, Karl M.; Amador, Arthur V.; Spitale, Joseph N.
1993-01-01
Robotic spacecraft are controlled by sets of commands called 'sequences.' These sequences must be checked against mission constraints. Making our existing constraint checking program faster would enable new capabilities in our uplink process. Therefore, we are rewriting this program to run on a parallel computer. To do so, we had to determine how to run constraint-checking algorithms in parallel and create a new method of specifying spacecraft models and constraints. This new specification gives us a means of representing flight systems and their predicted response to commands which could be used in a variety of applications throughout the command process, particularly during anomaly or high-activity operations. This commonality could reduce operations cost and risk for future complex missions. Lessons learned in applying some parts of this system to the TOPEX/Poseidon mission will be described.
Check Calibration of the NASA Glenn 10- by 10-Foot Supersonic Wind Tunnel (2014 Test Entry)
NASA Technical Reports Server (NTRS)
Johnson, Aaron; Pastor-Barsi, Christine; Arrington, E. Allen
2016-01-01
A check calibration of the 10- by 10-Foot Supersonic Wind Tunnel (SWT) was conducted in May/June 2014 using an array of five supersonic wedge probes to verify the 1999 calibration. This check calibration was necessary following a control systems upgrade and an integrated systems test (IST), and was required to verify that the tunnel flow quality was unchanged by the upgrade before the next test customer began their test entry. The previous check calibration of the tunnel occurred in 2007, prior to the Mars Science Laboratory test program. Secondary objectives of this test entry included the validation of the new Cobra data acquisition system (DAS) against the current Escort DAS and the creation of statistical process control (SPC) charts through the collection of a series of repeated test points at certain predetermined tunnel parameters. The SPC chart secondary objective was not completed due to schedule constraints. It is hoped that this effort will be readdressed and completed in the near future.
Integrity Constraint Monitoring in Software Development: Proposed Architectures
NASA Technical Reports Server (NTRS)
Fernandez, Francisco G.
1997-01-01
In the development of complex software systems, designers are required to obtain from many sources and manage vast amounts of knowledge of the system being built, and to communicate this information to personnel with a variety of backgrounds. Knowledge concerning the properties of the system, including the structure of, relationships between, and limitations of the data objects in the system, becomes increasingly vital as the complexity of the system and the number of knowledge sources increase. Ensuring that violations of these properties do not occur becomes steadily more challenging. One approach toward managing the enforcement of system properties, called context monitoring, uses a centralized repository of integrity constraints and a constraint satisfiability mechanism for dynamic verification of property enforcement during program execution. The focus of this paper is to describe possible software architectures that define a mechanism for dynamically checking the satisfiability of a set of constraints on a program. The next section describes the context monitoring approach in general. Section 3 gives an overview of the work currently being done toward the addition of an integrity constraint satisfiability mechanism to a high-level programming language, SequenceL, and demonstrates how this model is being examined to develop a general software architecture. Section 4 describes possible architectures for a general constraint satisfiability mechanism, as well as an alternative approach that uses embedded database queries in lieu of an external monitor. The paper concludes with a brief summary outlining the current state of the research and future work.
Time management displays for shuttle countdown
NASA Technical Reports Server (NTRS)
Beller, Arthur E.; Hadaller, H. Greg; Ricci, Mark J.
1992-01-01
The Intelligent Launch Decision Support System project is developing a Time Management System (TMS) for the NASA Test Director (NTD) to use for time management during Shuttle terminal countdown. TMS is being developed in three phases: an information phase; a tool phase; and an advisor phase. The information phase is an integrated display (TMID) of firing room clocks, of graphic timelines with Ground Launch Sequencer events, and of constraints. The tool phase is a what-if spreadsheet (TMWI) for devising plans for resuming from unplanned hold situations. It is tied to information in TMID, propagates constraints forward and backward to complete unspecified values, and checks the plan against constraints. The advisor phase is a situation advisor (TMSA), which proactively suggests tactics. A concept prototype for TMSA is under development. The TMID is currently undergoing field testing. Displays for TMID and TMWI are described. Descriptions include organization, rationale for organization, implementation choices and constraints, and use by NTD.
NASA Technical Reports Server (NTRS)
Izygon, Michel
1992-01-01
This report summarizes the findings and lessons learned from the development of an intelligent user interface for a space flight planning simulation program, in the specific area of constraint checking. The different functionalities of the graphical user interface part and of the rule-based part of the system have been identified, and their respective domains of applicability for error prevention and error checking have been specified.
Computing Systems Configuration for Highly Integrated Guidance and Control Systems
1988-06-01
communication between the industrial partners involved in a project. This is made possible, among other things, by the adoption of a common working methodology, by... computed graph results to data processors for post processing, or communicating with system I/O modules. The ESU PI-Bus interface logic includes extra... the extra constraint checking helps to find more problems at compile time), and it is especially well-suited for large software systems written by a
Greenwood, Daniel; Davids, Keith; Renshaw, Ian
2014-01-01
Coordination of dynamic interceptive movements is predicated on cyclical relations between an individual's actions and information sources from the performance environment. To identify dynamic informational constraints, which are interwoven with individual and task constraints, coaches' experiential knowledge provides a complementary source to support empirical understanding of performance in sport. In this study, 15 expert coaches from 3 sports (track and field, gymnastics and cricket) participated in a semi-structured interview process to identify potential informational constraints which they perceived to regulate action during run-up performance. Expert coaches' experiential knowledge revealed multiple information sources which may constrain performance adaptations in such locomotor pointing tasks. In addition to the locomotor pointing target, coaches' knowledge highlighted two other key informational constraints: vertical reference points located near the locomotor pointing target and a check mark located prior to the locomotor pointing target. This study highlights opportunities for broadening the understanding of perception and action coupling processes, and the identified information sources warrant further empirical investigation as potential constraints on athletic performance. Integration of experiential knowledge of expert coaches with theoretically driven empirical knowledge represents a promising avenue to drive future applied science research and pedagogical practice.
NASA Technical Reports Server (NTRS)
Bosworth, John T.
2009-01-01
Adaptive control should be integrated with a baseline controller and only used when necessary (5 responses). Implementation as an emergency system: immediately re-stabilize and return to controlled flight. Forced perturbation (excitation) for fine-tuning the system: a) check margins; b) develop requirements for amplitude of excitation. An adaptive system can improve performance by eating into margin constraints imposed on the non-adaptive system. Nonlinear effects due to multi-string voting.
Development of advanced avionics systems applicable to terminal-configured vehicles
NASA Technical Reports Server (NTRS)
Heimbold, R. L.; Lee, H. P.; Leffler, M. F.
1980-01-01
A technique to add a time constraint to the automatic descent feature of the existing L-1011 aircraft Flight Management System (FMS) was developed. Software modifications were incorporated in the FMS computer program and the results checked by lab simulation and on a series of eleven test flights. An arrival time dispersion (2 sigma) of 19 seconds was achieved. The 4-D descent technique can be integrated with the time-based metering method of air traffic control. Substantial reductions in delays at today's busy airports should result.
Proceedings of the First NASA Formal Methods Symposium
NASA Technical Reports Server (NTRS)
Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)
2009-01-01
Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.
Cai, Wenyu; Zhang, Meiyan; Zheng, Yahong Rosa
2017-07-11
This paper investigates the task assignment and path planning problem for multiple AUVs in three-dimensional (3D) underwater wireless sensor networks, where nonholonomic motion constraints of underwater AUVs in 3D space are considered. The multi-target task assignment and path planning problem is modeled by the Multiple Traveling Sales Person (MTSP) problem, and the Genetic Algorithm (GA) is used to solve the MTSP problem with Euclidean distance as the cost function and the Tour Hop Balance (THB) or Tour Length Balance (TLB) constraints as the stop criterion. The resulting tour sequences are mapped to 2D Dubins curves in the X-Y plane, and then interpolated linearly to obtain the Z coordinates. We demonstrate that the linear interpolation fails to achieve G1 continuity in the 3D Dubins path for multiple targets. Therefore, the interpolated 3D Dubins curves are checked against the AUV dynamics constraint and the ones satisfying the constraint are accepted to finalize the 3D Dubins curve selection. Simulation results demonstrate that the integration of the 3D Dubins curve with the MTSP model is successful and effective for solving the 3D target assignment and path planning problem.
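The acceptance step described above (checking each interpolated 3D curve against the AUV dynamics constraint) can be pictured with a small sketch. Assuming, purely for illustration, that the dynamics constraint is a maximum pitch angle, linear Z interpolation implies a constant climb rate over the horizontal tour length, so the check reduces to one inequality; the limit and numbers below are invented.

    import math

    MAX_PITCH_RAD = math.radians(25.0)  # assumed AUV pitch limit

    def pitch_feasible(path_xy_lengths, z_start, z_end):
        """path_xy_lengths: horizontal arc lengths of consecutive path pieces."""
        total = sum(path_xy_lengths)
        if total == 0.0:
            return z_start == z_end
        # Linear Z interpolation gives a constant climb rate along the path.
        pitch = math.atan2(abs(z_end - z_start), total)
        return pitch <= MAX_PITCH_RAD

    print(pitch_feasible([40.0, 25.0, 35.0], z_start=-10.0, z_end=-55.0))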
User-friendly design approach for analog layout design
NASA Astrophysics Data System (ADS)
Li, Yongfu; Lee, Zhao Chuan; Tripathi, Vikas; Perez, Valerio; Ong, Yoong Seang; Hui, Chiu Wing
2017-03-01
Analog circuits are sensitive to changes in layout environment conditions, manufacturing processes, and variations. This paper presents an analog verification flow with five types of analog-focused layout constraint checks to assist engineers in identifying potential device mismatches and layout drawing mistakes. Compared to other solutions, our approach only requires the layout design, which is sufficient to recognize all matched devices. Our approach simplifies data preparation and allows seamless integration into the layout environment with minimal disruption to the custom layout flow. Our user-friendly analog verification flow gives engineers more confidence in the quality of their layouts.
A Roadmap for using Agile Development in a Traditional System
NASA Technical Reports Server (NTRS)
Streiffert, Barbara; Starbird, Thomas
2006-01-01
I. Ensemble Development Group: a) Produces activity planning software for spacecraft; b) Built on the Eclipse Rich Client Platform (open source development and runtime software); c) Funded by multiple sources including the Mars Technology Program; d) Incorporated the use of Agile Development. II. Next Generation Uplink Planning System: a) Researches the Activity Planning and Sequencing Subsystem (APSS) for Mars Science Laboratory; b) APSS includes Ensemble, activity modeling, constraint checking, command editing and sequencing tools, plus other uplink generation utilities; c) Funded by the Mars Technology Program; d) Integrates all of the tools for APSS.
Universal Quantification in a Constraint-Based Planner
NASA Technical Reports Server (NTRS)
Golden, Keith; Frank, Jeremy; Clancy, Daniel (Technical Monitor)
2002-01-01
Constraints and universal quantification are both useful in planning, but handling universally quantified constraints presents some novel challenges. We present a general approach to proving the validity of universally quantified constraints. The approach essentially consists of checking that the constraint is not violated for any member of the universe. We show that this approach can sometimes be applied even when variable domains are infinite, and we present some useful special cases where this can be done efficiently.
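For a finite universe, the validity check described here reduces to exhaustive enumeration, as in this minimal Python sketch (the data and predicate are invented; the paper's contribution is extending the idea to infinite domains and efficient special cases).

    def forall_holds(universe, constraint):
        """Check 'for all x in universe: constraint(x)' by exhaustive search."""
        return all(constraint(x) for x in universe)

    instruments = [{"name": "cam1", "powered": True},
                   {"name": "cam2", "powered": True}]
    print(forall_holds(instruments, lambda i: i["powered"]))  # True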
NASA Technical Reports Server (NTRS)
Turpin, Jason B.
2004-01-01
One-dimensional water-hammer modeling involves the solution of two coupled non-linear hyperbolic partial differential equations (PDEs). These equations result from applying the principles of conservation of mass and momentum to flow through a pipe, usually with the assumption that the speed at which pressure waves propagate through the pipe is constant. In order to solve these equations for the quantities of interest (i.e., pressures and flow rates), they must first be converted to a system of ordinary differential equations (ODEs), either by approximating the spatial derivative terms with numerical techniques or by using the Method of Characteristics (MOC). The MOC approach is ideal in that no numerical approximation errors are introduced in converting the original system of PDEs into an equivalent system of ODEs. Unfortunately, the resulting system of ODEs is bound by a time step constraint, so that when integrating the equations the solution can only be obtained at fixed time intervals. If the fluid system to be modeled also contains dynamic components (i.e., components that are best modeled by a system of ODEs), it may be necessary to take extremely small time steps during certain points of the model simulation in order to achieve stability and/or accuracy in the solution. Coupled together, the fixed time step constraint invoked by the MOC and the occasional need for extremely small time steps to obtain stability and/or accuracy can greatly increase simulation run times. As one solution to this problem, a method for combining variable step integration (VSI) algorithms with the MOC was developed for modeling water-hammer in systems with highly dynamic components. A case study is presented in which reverse flow through a dual-flapper check valve introduces a water-hammer event. The predicted pressure responses upstream of the check valve are compared with test data.
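The fixed-interval restriction mentioned above comes from the MOC grid itself: the time step is tied to the spatial step by the wave speed. A textbook frictionless interior-point update, written as a Python sketch under assumed values, shows the structure; the paper's actual model and its VSI coupling are more elaborate.

    # The grid forces dt = dx / a; each interior node is updated from its
    # left/right neighbors along the C+ and C- characteristics.
    A = 1200.0          # wave speed a [m/s] (assumed)
    G = 9.81            # gravitational acceleration [m/s^2]
    DX = 10.0           # spatial step [m]
    DT = DX / A         # time step is fixed by the MOC grid

    def interior_update(H, V, i):
        """New head/velocity at node i from the previous time level."""
        HA, VA = H[i - 1], V[i - 1]   # foot of the C+ characteristic
        HB, VB = H[i + 1], V[i + 1]   # foot of the C- characteristic
        Hp = 0.5 * (HA + HB) + (A / (2.0 * G)) * (VA - VB)
        Vp = 0.5 * (VA + VB) + (G / (2.0 * A)) * (HA - HB)
        return Hp, Vp

    H = [100.0, 100.0, 100.0]
    V = [1.5, 1.5, 0.0]               # e.g. a closing valve at the right end
    print(interior_update(H, V, 1))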
MOM: A meteorological data checking expert system in CLIPS
NASA Technical Reports Server (NTRS)
Odonnell, Richard
1990-01-01
Meteorologists have long faced the problem of verifying the data they use. Experience shows that there is a sizable number of errors in the data reported by meteorological observers. This is unacceptable for computer forecast models, which depend on accurate data for accurate results. Most errors that occur in meteorological data are obvious to the meteorologist, but time constraints prevent hand-checking. For this reason, it is necessary to have a 'front end' to the computer model to ensure the accuracy of input. Various approaches to automatic data quality control have been developed by several groups. MOM is a rule-based system implemented in CLIPS and utilizing 'consistency checks' and 'range checks'. The system is generic in the sense that it knows some meteorological principles, regardless of specific station characteristics. Specific constraints kept as CLIPS facts in a separate file provide for system flexibility. Preliminary results show that the expert system has detected some inconsistencies not noticed by a local expert.
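The two rule classes named above are easy to picture. The Python sketch below mimics a range check and a consistency check on a single observation; the bounds, field names, and the dew-point rule are standard meteorological examples chosen for illustration, not MOM's actual CLIPS rule base.

    def range_errors(obs):
        errors = []
        if not -90.0 <= obs["temperature_c"] <= 60.0:
            errors.append("temperature out of range")
        if not 0.0 <= obs["relative_humidity"] <= 100.0:
            errors.append("relative humidity out of range")
        return errors

    def consistency_errors(obs):
        errors = []
        # Dew point can never exceed the air temperature.
        if obs["dew_point_c"] > obs["temperature_c"]:
            errors.append("dew point exceeds temperature")
        return errors

    obs = {"temperature_c": 12.0, "relative_humidity": 85.0, "dew_point_c": 14.5}
    print(range_errors(obs) + consistency_errors(obs))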
'Constraint consistency' at all orders in cosmological perturbation theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nandi, Debottam; Shankaranarayanan, S., E-mail: debottam@iisertvm.ac.in, E-mail: shanki@iisertvm.ac.in
2015-08-01
We study the equivalence of two approaches to cosmological perturbation theory (order-by-order Einstein's equations and the reduced action) at all orders for different models of inflation. We point out a crucial consistency check, which we refer to as the 'constraint consistency' condition, that needs to be satisfied in order for the two approaches to lead to an identical single-variable equation of motion. The method we propose here is a quick and efficient way to check consistency for any model, including modified gravity models. Our analysis points out an important feature which is crucial for inflationary model building: all 'constraint' inconsistent models have higher-order Ostrogradsky instabilities, but the reverse is not true. In other words, one can have models with constrained Lapse function and Shift vector that nevertheless have Ostrogradsky instabilities. We also obtain the single-variable equation for a non-canonical scalar field in the limit of power-law inflation for the second-order perturbed variables.
Virasoro constraints for D2n+1-, E6-, E7-, E8-type minimal models coupled to 2D gravity
NASA Astrophysics Data System (ADS)
Yen, Tim
1990-12-01
We find Virasoro constraints for D2n+1-, E6-, E7-, E8-type models analogous to the recently discovered Virasoro constraints for An-type models by Fukuma et al. and Dijkgraaf et al. We verify that the proposed Virasoro constraints give operator scaling dimensions identical to those found by Kostov. We check that these Virasoro constraints and, more generally, W-algebra constraints can be used to express correlation functions with a non-primary operator in terms of correlation functions of primary operators only.
NASA Astrophysics Data System (ADS)
Onoyama, Takashi; Maekawa, Takuya; Kubota, Sen; Tsuruta, Setuso; Komoda, Norihisa
To build a cooperative logistics network covering multiple enterprises, a planning method that can build a long-distance transportation network is required. Many strict constraints are imposed on this type of problem. To solve these strict-constraint problems, a selfish constraint satisfaction genetic algorithm (GA) is proposed. In this GA, each gene of an individual satisfies only its constraint selfishly, disregarding the constraints of other genes in the same individuals. Moreover, a constraint pre-checking method is also applied to improve the GA convergence speed. The experimental result shows the proposed method can obtain an accurate solution in a practical response time.
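The 'selfish' repair step can be sketched as follows: after the usual GA operators run, each gene is made feasible with respect to its own constraint only, ignoring the rest of the individual. The Python sketch below is an invented toy (domains, constraints, and the repair policy are assumptions), meant only to show the mechanism.

    import random

    def selfish_repair(individual, gene_constraints, gene_domains):
        """Replace each infeasible gene with a random feasible value for it,
        disregarding the constraints of the other genes."""
        for i, ok in enumerate(gene_constraints):
            if not ok(individual[i]):
                feasible = [v for v in gene_domains[i] if ok(v)]
                individual[i] = random.choice(feasible)
        return individual

    domains = [range(0, 24)] * 3                  # e.g. departure hours
    constraints = [lambda h: 6 <= h <= 10,        # leg 1 must leave morning
                   lambda h: 12 <= h <= 16,       # leg 2 early afternoon
                   lambda h: h >= 18]             # leg 3 evening
    print(selfish_repair([3, 14, 5], constraints, domains))

Constraint pre-checking in the paper plays a complementary role: candidate values are screened before they enter an individual, which speeds up convergence.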
Project Report: Automatic Sequence Processor Software Analysis
NASA Technical Reports Server (NTRS)
Benjamin, Brandon
2011-01-01
The Mission Planning and Sequencing (MPS) element of Multi-Mission Ground System and Services (MGSS) provides space missions with multi-purpose software to plan spacecraft activities, sequence spacecraft commands, and then integrate these products and execute them on the spacecraft. The Jet Propulsion Laboratory (JPL) is currently flying many missions. The processes for building, integrating, and testing the multi-mission uplink software need to be improved to meet the needs of the missions and the operations teams that command the spacecraft. The Multi-Mission Sequencing Team is responsible for collecting and processing the observations, experiments, and engineering activities that are to be performed on a selected spacecraft. The collection of these activities is called a sequence, and ultimately a sequence becomes a sequence of spacecraft commands. The operations teams check the sequence to make sure that no constraints are violated. The workflow process involves sending a program start command, which activates the Automatic Sequence Processor (ASP). The ASP is currently a file-based system that is comprised of scripts written in Perl, C shell, and awk. Once this start process is complete, the system checks for errors and aborts if there are any; otherwise the system converts the commands to binary, and then sends the resultant information to be radiated to the spacecraft.
Timing analysis by model checking
NASA Technical Reports Server (NTRS)
Naydich, Dimitri; Guaspari, David
2000-01-01
The safety of modern avionics relies on high-integrity software that can be verified to meet hard real-time requirements. The limits of verification technology therefore determine acceptable engineering practice. To simplify verification problems, safety-critical systems are commonly implemented under the severe constraints of a cyclic executive, which make design an expensive trial-and-error process highly intolerant of change. Important advances in analysis techniques, such as rate monotonic analysis (RMA), have provided a theoretical and practical basis for easing these onerous restrictions. But RMA and its kindred have two limitations: they apply only to verifying the requirement of schedulability (that tasks meet their deadlines), and they cannot be applied to many common programming paradigms. We address both these limitations by applying model checking, a technique with successful industrial applications in hardware design. Model checking algorithms analyze finite state machines, either by explicit state enumeration or by symbolic manipulation. Since quantitative timing properties involve a potentially unbounded state variable (a clock), our first problem is to construct a finite approximation that is conservative for the properties being analyzed: if the approximation satisfies the properties of interest, so does the infinite model. To reduce the potential for state space explosion we must further optimize this finite model. Experiments with some simple optimizations have yielded a hundred-fold efficiency improvement over published techniques.
Spacecraft command verification: The AI solution
NASA Technical Reports Server (NTRS)
Fesq, Lorraine M.; Stephan, Amy; Smith, Brian K.
1990-01-01
Recently, a knowledge-based approach was used to develop a system called the Command Constraint Checker (CCC) for TRW. CCC was created to automate the process of verifying spacecraft command sequences. Checking command files by hand for timing and sequencing errors is a time-consuming and error-prone task. Conventional software solutions were rejected when it was estimated that it would require 36 man-months to build an automated tool to check constraints by conventional methods. Using a rule-based representation to model the various timing and sequencing constraints of the spacecraft, CCC was developed and tested in only three months. By applying artificial intelligence techniques, CCC designers were able to demonstrate the viability of AI as a tool to transform difficult problems into easily managed tasks. The design considerations used in developing CCC are discussed and the potential impact of this system on future satellite programs is examined.
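A flavor of what a rule in such a checker might look like, written as a Python sketch: a timing rule scans the command sequence and reports violations. The commands, the separation rule, and all numbers are hypothetical; TRW's actual constraints are not given in the abstract.

    def min_separation_rule(seq, cmd_a, cmd_b, min_dt):
        """cmd_b must not follow cmd_a within min_dt seconds."""
        violations = []
        last_a = None
        for t, cmd in seq:
            if cmd == cmd_a:
                last_a = t
            elif cmd == cmd_b and last_a is not None and t - last_a < min_dt:
                violations.append((t, f"{cmd_b} only {t - last_a}s after {cmd_a}"))
        return violations

    sequence = [(0, "HEATER_ON"), (5, "CAMERA_ON"), (6, "SLEW_START")]
    print(min_separation_rule(sequence, "CAMERA_ON", "SLEW_START", min_dt=10))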
Outage maintenance checks on large generator windings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nindra, B.; Jeney, S.I.; Slobodinsky, Y.
In the present days of austerity, more constraints and pressures are being placed on maintenance engineers to certify generators for reliability and life extension. The outages are shorter and the intervals between outages are becoming longer. Annual outages were very common when utilities had no regulatory constraints and also had standby capacity. Furthermore, due to lean and mean budgets, outage maintenance programs are being pursued more aggressively, so that longer-interval outages can be achieved while ensuring peak generator performance. This paper will discuss various visual checks, electrical tests, and recommended fixes to achieve the above-mentioned objectives, in case any deficiencies are found.
1994-05-04
word puzzles. In Proceedings of the Eighth National Conference on Artificial Intelligence... generate fewer than c clauses so the boundary of the grid... checking. (2) (FC-NI), forward-checking with the advantage of neighborhood interchangeability: search branches are bundled and visited once. If a dead-end occurs
OntoCheck: verifying ontology naming conventions and metadata completeness in Protégé 4.
Schober, Daniel; Tudose, Ilinca; Svatek, Vojtech; Boeker, Martin
2012-09-21
Although policy providers have outlined minimal metadata guidelines and naming conventions, ontologies of today still display inter- and intra-ontology heterogeneities in class labelling schemes and metadata completeness. This fact is at least partially due to missing or inappropriate tools. Software support can ease this situation and contribute to overall ontology consistency and quality by helping to enforce such conventions. We provide a plugin for the Protégé Ontology editor to allow for easy checks on compliance with ontology naming conventions and metadata completeness, as well as curation in case of found violations. In a requirement analysis, derived from a prior standardization approach carried out within the OBO Foundry, we investigate the needed capabilities for software tools to check, curate and maintain class naming conventions. A Protégé tab plugin was implemented accordingly using the Protégé 4.1 libraries. The plugin was tested on six different ontologies. Based on these test results, the plugin could be refined, also by the integration of new functionalities. The new Protégé plugin, OntoCheck, allows for ontology tests to be carried out on OWL ontologies. In particular the OntoCheck plugin helps to clean up an ontology with regard to lexical heterogeneity, i.e. enforcing naming conventions and metadata completeness, meeting most of the requirements outlined for such a tool. Found test violations can be corrected to foster consistency in entity naming and meta-annotation within an artefact. Once specified, check constraints like name patterns can be stored and exchanged for later re-use. Here we describe a first version of the software, illustrate its capabilities and use within running ontology development efforts and briefly outline improvements resulting from its application. Further, we discuss OntoCheck's capabilities in the context of related tools and highlight potential future expansions. The OntoCheck plugin facilitates labelling error detection and curation, contributing to lexical quality assurance in OWL ontologies. Ultimately, we hope this Protégé extension will ease ontology alignments as well as lexical post-processing of annotated data and hence can increase overall secondary data usage by humans and computers.
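A naming-convention check of the kind OntoCheck performs can be pictured as a pattern test plus a metadata-completeness test over the ontology's classes. In this Python sketch the label pattern, metadata key, and class records are assumptions for illustration, not OntoCheck's real configuration.

    import re

    LABEL_PATTERN = re.compile(r"^[a-z][a-z0-9 \-]*$")  # assumed convention

    def audit(classes):
        findings = []
        for cls in classes:
            if not LABEL_PATTERN.match(cls.get("label", "")):
                findings.append((cls["id"], "label violates naming convention"))
            if not cls.get("definition"):
                findings.append((cls["id"], "missing textual definition"))
        return findings

    classes = [{"id": "EX:0001", "label": "heart valve", "definition": "..."},
               {"id": "EX:0002", "label": "HeartValves", "definition": ""}]
    print(audit(classes))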
Symbolic discrete event system specification
NASA Technical Reports Server (NTRS)
Zeigler, Bernard P.; Chi, Sungdo
1992-01-01
Extending discrete event modeling formalisms to facilitate greater symbol manipulation capabilities is important to further their use in intelligent control and design of high autonomy systems. An extension to the DEVS formalism that facilitates symbolic expression of event times by extending the time base from the real numbers to the field of linear polynomials over the reals is defined. A simulation algorithm is developed to generate the branching trajectories resulting from the underlying nondeterminism. To efficiently manage symbolic constraints, a consistency checking algorithm for linear polynomial constraints based on feasibility checking algorithms borrowed from linear programming has been developed. The extended formalism offers a convenient means to conduct multiple, simultaneous explorations of model behaviors. Examples of application are given with concentration on fault model analysis.
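The consistency check borrowed from linear programming amounts to a feasibility test: a conjunction of linear constraints over symbolic event times is consistent iff the corresponding polytope is nonempty. A minimal Python sketch using scipy's linprog with a zero objective (the constraint values are invented):

    from scipy.optimize import linprog

    def feasible(A_ub, b_ub):
        """True iff {x : A_ub @ x <= b_ub, x >= 0} is nonempty."""
        n = len(A_ub[0])
        res = linprog(c=[0.0] * n, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(0, None)] * n)
        return res.status == 0  # 0: a feasible point was found

    # Symbolic event times t1, t2 with constraints t1 + 1 <= t2 and t2 <= 5:
    print(feasible([[1.0, -1.0], [0.0, 1.0]], [-1.0, 5.0]))  # True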
Symbolic PathFinder: Symbolic Execution of Java Bytecode
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.; Rungta, Neha
2010-01-01
Symbolic PathFinder (SPF) combines symbolic execution with model checking and constraint solving for automated test case generation and error detection in Java programs with unspecified inputs. In this tool, programs are executed on symbolic inputs representing multiple concrete inputs. Values of variables are represented as constraints generated from the analysis of Java bytecode. The constraints are solved using off-the-shelf solvers to generate test inputs guaranteed to achieve complex coverage criteria. SPF has been used successfully at NASA, in academia, and in industry.
Safe, Multiphase Bounds Check Elimination in Java
2010-01-28
production of mobile code from source code, JIT compilation in the virtual machine, and application code execution. The code producer uses... invariants, and inequality constraint analysis) to identify and prove redundancy of bounds checks. During class-loading and JIT compilation, the virtual... unoptimized code if the speculated invariants do not hold. The combined effect of the multiple phases is to shift the effort associated with bounds
Verification and Planning Based on Coinductive Logic Programming
NASA Technical Reports Server (NTRS)
Bansal, Ajay; Min, Richard; Simon, Luke; Mallya, Ajay; Gupta, Gopal
2008-01-01
Coinduction is a powerful technique for reasoning about unfounded sets, unbounded structures, infinite automata, and interactive computations [6]. Where induction corresponds to least fixed point semantics, coinduction corresponds to greatest fixed point semantics. Recently coinduction has been incorporated into logic programming and an elegant operational semantics developed for it [11, 12]. This operational semantics is the greatest fixed point counterpart of SLD resolution (SLD resolution imparts operational semantics to least fixed point based computations) and is termed co-SLD resolution. In co-SLD resolution, a predicate goal p(t) succeeds if it unifies with one of its ancestor calls. In addition, rational infinite terms are allowed as arguments of predicates. Infinite terms are represented as solutions to unification equations, and the occurs check is omitted during the unification process. Coinductive Logic Programming (Co-LP) and co-SLD resolution can be used to elegantly perform model checking and planning. A combined SLD and co-SLD resolution based LP system forms the common basis for planning, scheduling, verification, model checking, and constraint solving [9, 4]. This is achieved by amalgamating SLD resolution, co-SLD resolution, and constraint logic programming [13] in a single logic programming system. Given that parallelism in logic programs can be implicitly exploited [8], complex, compute-intensive applications (planning, scheduling, model checking, etc.) can be executed in parallel on multi-core machines. Parallel execution can result in speed-ups as well as in larger instances of the problems being solved. In the remainder we elaborate on (i) how planning can be elegantly and efficiently performed under real-time constraints, (ii) how real-time systems can be elegantly and efficiently model-checked, and (iii) how hybrid systems can be verified in a combined system with both co-SLD and SLD resolution. Implementations of co-SLD resolution as well as preliminary implementations of the planning and verification applications have been developed [4]. Co-LP and Model Checking: The vast majority of properties to be verified can be classified into safety properties and liveness properties. It is well known within model checking that safety properties can be verified by reachability analysis, i.e., if a counter-example to the property exists, it can be finitely determined by enumerating all the reachable states of the Kripke structure.
Large Deployable Reflector (LDR) Requirements for Space Station Accommodations
NASA Technical Reports Server (NTRS)
Crowe, D. A.; Clayton, M. J.; Runge, F. C.
1985-01-01
Top level requirements for assembly and integration of the Large Deployable Reflector (LDR) Observatory at the Space Station are examined. Concepts are currently under study for LDR which will provide a sequel to the Infrared Astronomy Satellite and the Space Infrared Telescope Facility. LDR will provide a spectacular capability over a very broad spectral range. The Space Station will provide an essential facility for the initial assembly and check out of LDR, as well as a necessary base for refurbishment, repair and modification. By providing a manned platform, the Space Station will remove the time constraint on assembly associated with use of the Shuttle alone. Personnel safety during necessary EVA is enhanced by the presence of the manned facility.
NASA Technical Reports Server (NTRS)
Chien, Steve A.; Tran, Daniel Q.; Rabideau, Gregg R.; Schaffer, Steven R.
2011-01-01
Software has been designed to schedule remote sensing with the Earth Observing One spacecraft. The software attempts to satisfy as many observation requests as possible, checking each against spacecraft operation constraints such as data volume, thermal limits, pointing maneuvers, and others. More complex constraints such as temperature are approximated to enable efficient reasoning while keeping the spacecraft within safe limits. Other constraints are checked using an external software library. For example, an attitude control library is used to determine the feasibility of maneuvering between pairs of observations. This innovation can deal with a wide range of spacecraft constraints and solve large-scale scheduling problems involving hundreds of observations and thousands of combinations of observation sequences.
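A greedy caricature of this kind of scheduler, in Python: each requested observation is accepted only if every constraint check passes. The two checks below (recorder volume and a minimum gap standing in for maneuver feasibility) and all numbers are assumptions; the real system reasons about thermal, pointing, and other constraints as described above.

    DATA_CAPACITY_GB = 40.0
    MIN_SLEW_GAP_S = 300.0   # stand-in for the attitude-control feasibility check

    def schedule(requests):
        """requests: list of (start_time_s, data_gb)."""
        accepted, used_gb, last_start = [], 0.0, None
        for start, data in sorted(requests):
            if used_gb + data > DATA_CAPACITY_GB:
                continue  # recorder would overflow
            if last_start is not None and start - last_start < MIN_SLEW_GAP_S:
                continue  # not enough time to maneuver between targets
            accepted.append((start, data))
            used_gb += data
            last_start = start
        return accepted

    print(schedule([(0, 10.0), (120, 5.0), (600, 8.0), (1200, 30.0)]))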
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lazkoz, Ruth; Escamilla-Rivera, Celia; Salzano, Vincenzo
Cosmography provides a model-independent way to map the expansion history of the Universe. In this paper we simulate a Euclid-like survey and explore cosmographic constraints from future Baryonic Acoustic Oscillations (BAO) observations. We derive general expressions for the BAO transverse and radial modes and discuss the optimal order of the cosmographic expansion that provides reliable cosmological constraints. Through constraints on the deceleration and jerk parameters, we show that future BAO data have the potential to provide a model-independent check of the cosmic acceleration as well as a discrimination between the standard ΛCDM model and alternative mechanisms of cosmic acceleration.
XMI2USE: A Tool for Transforming XMI to USE Specifications
NASA Astrophysics Data System (ADS)
Sun, Wuliang; Song, Eunjee; Grabow, Paul C.; Simmonds, Devon M.
The UML-based Specification Environment (USE) tool supports syntactic analysis, type checking, consistency checking, and dynamic validation of invariants and pre-/post conditions specified in the Object Constraint Language (OCL). Due to its animation and analysis power, it is useful when checking critical non-functional properties such as security policies. However, the USE tool requires one to specify (i.e., "write") a model using its own textual language and does not allow one to import any model specification files created by other UML modeling tools. Hence, to make the best use of existing UML tools, we often create a model with OCL constraints using a modeling tool such as the IBM Rational Software Architect (RSA) and then use the USE tool for model validation. This approach, however, requires a manual transformation between the specifications of two different tool formats, which is error-prone and diminishes the benefit of automated model-level validations. In this paper, we describe our own implementation of a specification transformation engine that is based on the Model Driven Architecture (MDA) framework and currently supports automatic tool-level transformations from RSA to USE.
NASA Astrophysics Data System (ADS)
Banda, Gourinath; Gallagher, John P.
Abstract interpretation provides a practical approach to verifying properties of infinite-state systems. We apply the framework of abstract interpretation to derive an abstract semantic function for the modal μ-calculus, which is the basis for abstract model checking. The abstract semantic function is constructed directly from the standard concrete semantics together with a Galois connection between the concrete state-space and an abstract domain. There is no need for mixed or modal transition systems to abstract arbitrary temporal properties, as in previous work in the area of abstract model checking. Using the modal μ-calculus to implement CTL, the abstract semantics gives an over-approximation of the set of states in which an arbitrary CTL formula holds. We then show that this leads directly to an effective implementation of an abstract model checking algorithm for CTL using abstract domains based on linear constraints. The implementation of the abstract semantic function makes use of an SMT solver. We describe an implemented system for proving properties of linear hybrid automata and give some experimental results.
NASA Technical Reports Server (NTRS)
Maldague, Pierre; Page, Dennis; Chase, Adam
2005-01-01
Activity Plan Generator (APGEN), now at version 5.0, is a computer program that assists in generating an integrated plan of activities for a spacecraft mission that does not oversubscribe spacecraft and ground resources. APGEN generates an interactive display, through which the user can easily create or modify the plan. The display summarizes the plan by means of a time line, whereon each activity is represented by a bar stretched between its beginning and ending times. Activities can be added, deleted, and modified via simple mouse and keyboard actions. The use of resources can be viewed on resource graphs. Resource and activity constraints can be checked. Types of activities, resources, and constraints are defined by simple text files, which the user can modify. In one of two modes of operation, APGEN acts as a planning expert assistant, displaying the plan and identifying problems in the plan. The user is in charge of creating and modifying the plan. In the other mode, APGEN automatically creates a plan that does not oversubscribe resources. The user can then manually modify the plan. APGEN is designed to interact with other software that generates sequences of timed commands for implementing details of planned activities.
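The resource-checking idea (activities as bars on a timeline, resources with capacities) can be sketched briefly: sum the usage of all activities overlapping each boundary point and flag oversubscription. Names and numbers below are invented, not APGEN's actual model files.

    def oversubscribed(activities, capacity):
        """activities: (start, end, usage) bars on the timeline; returns the
        time points (activity boundaries) where usage exceeds capacity."""
        points = sorted({t for s, e, _ in activities for t in (s, e)})
        bad = []
        for t in points:
            load = sum(u for s, e, u in activities if s <= t < e)
            if load > capacity:
                bad.append((t, load))
        return bad

    plan = [(0, 50, 30.0), (20, 60, 25.0), (55, 80, 10.0)]  # e.g. watts
    print(oversubscribed(plan, capacity=50.0))  # -> [(20, 55.0)]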
Dynamic Response of Functionally Graded Carbon Nanotube Reinforced Sandwich Plate
NASA Astrophysics Data System (ADS)
Mehar, Kulmani; Panda, Subrata Kumar
2018-03-01
In this article, the dynamic response of the carbon nanotube-reinforced functionally graded sandwich composite plate has been studied numerically with the help of the finite element method. The face sheets of the sandwich composite plate are made of carbon nanotube-reinforced composite with two different grading patterns, whereas the core phase is taken as an isotropic material. The final properties of the structure are calculated using the rule of mixtures. The geometrical model of the sandwich plate is developed and discretized suitably with the help of an available shell element in the ANSYS library. Subsequently, the corresponding numerical dynamic responses are computed via the batch input technique (parametric design language code in ANSYS), including Newmark's integration scheme. The stability of the sandwich structural numerical model is established through a proper convergence study. Further, the reliability of the sandwich model is checked by a comparison study between the present results and those available in the references. Finally, some numerical problems have been solved to examine the effect of different design constraints (carbon nanotube distribution pattern, core-to-face thickness ratio, volume fraction of the nanotube, length-to-thickness ratio, aspect ratio and constraints at edges) on the time responses of the sandwich plate.
Dual deep modeling: multi-level modeling with dual potencies and its formalization in F-Logic.
Neumayr, Bernd; Schuetz, Christoph G; Jeusfeld, Manfred A; Schrefl, Michael
2018-01-01
An enterprise database contains a global, integrated, and consistent representation of a company's data. Multi-level modeling facilitates the definition and maintenance of such an integrated conceptual data model in a dynamic environment of changing data requirements of diverse applications. Multi-level models transcend the traditional separation of class and object with clabjects as the central modeling primitive, which allows for a more flexible and natural representation of many real-world use cases. In deep instantiation, the number of instantiation levels of a clabject or property is indicated by a single potency. Dual deep modeling (DDM) differentiates between source potency and target potency of a property or association and supports the flexible instantiation and refinement of the property by statements connecting clabjects at different modeling levels. DDM comes with multiple generalization of clabjects, subsetting/specialization of properties, and multi-level cardinality constraints. Examples are presented using a UML-style notation for DDM together with UML class and object diagrams for the representation of two-level user views derived from the multi-level model. Syntax and semantics of DDM are formalized and implemented in F-Logic, supporting the modeler with integrity checks and rich query facilities.
Robust and Accurate Image-Based Georeferencing Exploiting Relative Orientation Constraints
NASA Astrophysics Data System (ADS)
Cavegn, S.; Blaser, S.; Nebiker, S.; Haala, N.
2018-05-01
Urban environments with extended areas of poor GNSS coverage as well as indoor spaces that often rely on real-time SLAM algorithms for camera pose estimation require sophisticated georeferencing in order to fulfill our high requirements of a few centimeters for absolute 3D point measurement accuracies. Since we focus on image-based mobile mapping, we extended the structure-from-motion pipeline COLMAP with georeferencing capabilities by integrating exterior orientation parameters from direct sensor orientation or SLAM as well as ground control points into bundle adjustment. Furthermore, we exploit constraints for relative orientation parameters among all cameras in bundle adjustment, which leads to a significant robustness and accuracy increase especially by incorporating highly redundant multi-view image sequences. We evaluated our integrated georeferencing approach on two data sets, one captured outdoors by a vehicle-based multi-stereo mobile mapping system and the other captured indoors by a portable panoramic mobile mapping system. We obtained mean RMSE values for check point residuals between image-based georeferencing and tachymetry of 2 cm in an indoor area, and 3 cm in an urban environment where the measurement distances are a multiple compared to indoors. Moreover, in comparison to a solely image-based procedure, our integrated georeferencing approach showed a consistent accuracy increase by a factor of 2-3 at our outdoor test site. Due to pre-calibrated relative orientation parameters, images of all camera heads were oriented correctly in our challenging indoor environment. By performing self-calibration of relative orientation parameters among respective cameras of our vehicle-based mobile mapping system, remaining inaccuracies from suboptimal test field calibration were successfully compensated.
Comment on id-based remote data integrity checking with data privacy preserving
NASA Astrophysics Data System (ADS)
Zhang, Jianhong; Meng, Hongxin
2017-09-01
Recently, an ID-based remote data integrity checking protocol with perfect data privacy preserving (IEEE Transactions on Information Forensics and Security, doi: 10.1109/TIFS.2016.2615853) was proposed to achieve data privacy protection and integrity checking. Unfortunately, in this letter, we demonstrate that their protocol is insecure. An active hacker can modify the stored data without being detected by the verifier in the auditing. We also show that a malicious cloud server can convince the verifier that the stored data are kept intact after the outsourced data blocks are deleted. Finally, the reasons such attacks are possible are given.
SU8 diaphragm micropump with monolithically integrated cantilever check valves.
Ezkerra, Aitor; Fernández, Luis José; Mayora, Kepa; Ruano-López, Jesús Miguel
2011-10-07
This paper presents a SU8 unidirectional diaphragm micropump with embedded out-of-plane cantilever check valves. The device represents a reliable and low-cost solution for integration of microfluidic control in lab-on-a-chip devices. Its planar architecture allows monolithic definition of its components in a single step and potential integration with previously reported PCR, electrophoresis and flow-sensing SU8 microdevices. Pneumatic actuation is applied on a PDMS diaphragm, which is bonded to the SU8 body at wafer level, further enhancing its integration and mass production capabilities. The cantilever check valves move synchronously with the diaphragm, feature fast response (10 ms), low dead volume (86 nl) and a 94% flow blockage up to 300 kPa. The micropump achieves a maximum flow rate of 177 μl min−1 at 6 Hz and 200 kPa with an effective area of 10 mm². The device is reliable, self-priming and tolerant to particles and big bubbles. To the knowledge of the authors, this is the first micropump in SU8 with monolithically integrated cantilever check valves.
Standard model effective field theory: Integrating out neutralinos and charginos in the MSSM
NASA Astrophysics Data System (ADS)
Han, Huayong; Huo, Ran; Jiang, Minyuan; Shu, Jing
2018-05-01
We apply the covariant derivative expansion method to integrate out the neutralinos and charginos in the minimal supersymmetric Standard Model. The results are presented as a set of pure bosonic dimension-six operators in the Standard Model effective field theory. Nontrivial chirality dependence in fermionic covariant derivative expansion is discussed carefully. The results are checked by computing the hγγ effective coupling and the electroweak oblique parameters using the Standard Model effective field theory with our effective operators and by direct loop calculation. In global fitting with the proposed lepton collider constraint projections, special phenomenological emphasis is placed on the gaugino mass unification scenario (M2 ≃ 2M1) and the anomaly mediation scenario (M1 ≃ 3.3M2). These results show that precision measurement experiments in future lepton colliders will play a very useful complementary role in probing the electroweakino sector, in particular filling the gap of the soft lepton plus missing ET channel search left by traditional colliders, where the neutralino as the lightest supersymmetric particle is very degenerate with the next-to-lightest chargino/neutralino.
Incremental checking of Master Data Management model based on contextual graphs
NASA Astrophysics Data System (ADS)
Lamolle, Myriam; Menet, Ludovic; Le Duc, Chan
2015-10-01
The validation of models is a crucial step in distributed heterogeneous systems. In this paper, an incremental validation method is proposed in the scope of a Model Driven Engineering (MDE) approach, which is used to develop a Master Data Management (MDM) field represented by XML Schema models. The MDE approach presented in this paper is based on the definition of an abstraction layer using UML class diagrams. The validation method aims to minimise the model errors and to optimise the process of model checking. Therefore, the notion of validation contexts is introduced, allowing the verification of data model views. Description logics specify constraints that the models have to check. An experimentation of the approach is presented through an application developed in the ArgoUML IDE.
Rate-Compatible Protograph LDPC Codes
NASA Technical Reports Server (NTRS)
Nguyen, Thuy V. (Inventor); Nosratinia, Aria (Inventor); Divsalar, Dariush (Inventor)
2014-01-01
Digital communication coding methods resulting in rate-compatible low density parity-check (LDPC) codes built from protographs. Described digital coding methods start with a desired code rate and a selection of the numbers of variable nodes and check nodes to be used in the protograph. Constraints are set to satisfy a linear minimum distance growth property for the protograph. All possible edges in the graph are searched for the minimum iterative decoding threshold and the protograph with the lowest iterative decoding threshold is selected. Protographs designed in this manner are used in decode and forward relay channels.
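To make the search procedure concrete, the following minimal Python sketch enumerates small protograph base matrices and keeps the one with the best score. The scoring function here is a crude placeholder of our own: a real design would compute the iterative decoding threshold by density evolution and enforce the linear minimum distance growth condition, neither of which is implemented below.

```python
# Hedged sketch of the protograph search loop. The code rate is fixed by
# the chosen node counts: R = 1 - n_chk/n_var = 1/2 here.
import itertools
import numpy as np

def proxy_threshold(base_matrix: np.ndarray) -> float:
    # Placeholder score only: favors higher average variable-node degree
    # and penalizes degree-1 variable nodes. Real code would run density
    # evolution over the protograph to get the decoding threshold.
    col_deg = base_matrix.sum(axis=0)
    if (col_deg == 0).any():
        return np.inf
    return float(1.0 / col_deg.mean() + (col_deg == 1).sum())

def search_protographs(n_var=4, n_chk=2, max_edges=2):
    # Enumerate all base matrices with entries 0..max_edges (protographs
    # may carry parallel edges) and keep the best-scoring one.
    best, best_score = None, np.inf
    for entries in itertools.product(range(max_edges + 1), repeat=n_var * n_chk):
        B = np.array(entries).reshape(n_chk, n_var)
        score = proxy_threshold(B)
        if score < best_score:
            best, best_score = B, score
    return best, best_score

if __name__ == "__main__":
    B, s = search_protographs()
    print("best base matrix:\n", B, "\nscore:", s)
```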
Litho hotspots fixing using model based algorithm
NASA Astrophysics Data System (ADS)
Zhang, Meili; Yu, Shirui; Mao, Zhibiao; Shafee, Marwa; Madkour, Kareem; ElManhawy, Wael; Kwan, Joe; Hu, Xinyi; Wan, Qijian; Du, Chunshan
2017-04-01
As technology advances, IC designs are getting more sophisticated, thus it becomes more critical and challenging to fix printability issues in the design flow. Running lithography checks before tapeout is now mandatory for designers, which creates a need for more advanced and easy-to-use techniques for fixing hotspots found after lithographic simulation without creating a new design rule checking (DRC) violation or generating a new hotspot. This paper presents a new methodology for fixing hotspots on layouts while using the same engine currently used to detect the hotspots. The fix is achieved by applying minimum movement of edges causing the hotspot, with consideration of DRC constraints. The fix is internally simulated by the lithographic simulation engine to verify that the hotspot is eliminated and that no new hotspot is generated by the new edge locations. Hotspot fix checking is enhanced by adding DRC checks to the litho-friendly design (LFD) rule file to guarantee that any fix options that violate DRC checks are removed from the output hint file. This extra checking eliminates the need to re-run both DRC and LFD checks to ensure the change successfully fixed the hotspot, which saves time and simplifies the designer's workflow. This methodology is demonstrated on industrial designs, where the fixing rate of single and dual layer hotspots is reported.
Experimental Evaluation of a Planning Language Suitable for Formal Verification
NASA Technical Reports Server (NTRS)
Butler, Rick W.; Munoz, Cesar A.; Siminiceanu, Radu I.
2008-01-01
The marriage of model checking and planning faces two seemingly diverging alternatives: the need for a planning language expressive enough to capture the complexity of real-life applications, as opposed to a language simple, yet robust enough to be amenable to exhaustive verification and validation techniques. In an attempt to reconcile these differences, we have designed an abstract plan description language, ANMLite, inspired from the Action Notation Modeling Language (ANML) [17]. We present the basic concepts of the ANMLite language as well as an automatic translator from ANMLite to the model checker SAL (Symbolic Analysis Laboratory) [7]. We discuss various aspects of specifying a plan in terms of constraints and explore the implications of choosing a robust logic behind the specification of constraints, rather than simply propose a new planning language. Additionally, we provide an initial assessment of the efficiency of model checking to search for solutions of planning problems. To this end, we design a basic test benchmark and study the scalability of the generated SAL models in terms of plan complexity.
Using LDPC Code Constraints to Aid Recovery of Symbol Timing
NASA Technical Reports Server (NTRS)
Jones, Christopher; Villasenor, John; Lee, Dong-U; Vales, Esteban
2008-01-01
A method of utilizing information available in the constraints imposed by a low-density parity-check (LDPC) code has been proposed as a means of aiding the recovery of symbol timing in the reception of a binary-phase-shift-keying (BPSK) signal representing such a code in the presence of noise, timing error, and/or Doppler shift between the transmitter and the receiver. This method and the receiver architecture in which it would be implemented belong to a class of timing-recovery methods and corresponding receiver architectures characterized as pilotless in that they do not require transmission and reception of pilot signals. Acquisition and tracking of a signal of the type described above have traditionally been performed upstream of, and independently of, decoding and have typically involved utilization of a phase-locked loop (PLL). However, the LDPC decoding process, which is iterative, provides information that can be fed back to the timing-recovery receiver circuits to improve performance significantly over that attainable in the absence of such feedback. Prior methods of coupling LDPC decoding with timing recovery had focused on the use of output code words produced as the iterations progress. In contrast, in the present method, one exploits the information available from the metrics computed for the constraint nodes of an LDPC code during the decoding process. In addition, the method involves the use of a waveform model that captures, better than do the waveform models of the prior methods, distortions introduced by receiver timing errors and transmitter/receiver motions. An LDPC code is commonly represented by use of a bipartite graph containing two sets of nodes. In the graph corresponding to an (n,k) code, the n variable nodes correspond to the code word symbols and the n-k constraint nodes represent the constraints that the code places on the variable nodes in order for them to form a valid code word. The decoding procedure involves iterative computation of values associated with these nodes. A constraint node represents a parity-check equation using a set of variable nodes as inputs. A valid decoded code word is obtained if all parity-check equations are satisfied. After each iteration, the metrics associated with each constraint node can be evaluated to determine the status of the associated parity check. Heretofore, these metrics would normally be utilized only within the LDPC decoding process to assess whether or not the variable nodes had converged to a code word. In the present method, it is recognized that these metrics can be used to determine the accuracy of the timing estimates used in acquiring the sampled data that constitute the input to the LDPC decoder. In fact, the number of constraints that are satisfied exhibits a peak near the optimal timing estimate. Coarse timing estimation (or first-stage estimation as described below) is found via a parametric search for this peak. The present method calls for a two-stage receiver architecture illustrated in the figure. The first stage would correct large time delays and frequency offsets; the second stage would track random walks and correct residual time and frequency offsets. In the first stage, constraint-node feedback from the LDPC decoder would be employed in a search algorithm in which the searches would be performed in successively narrower windows to find the correct time delay and/or frequency offset.
The second stage would include a conventional first-order PLL with a decision-aided timing-error detector that would utilize, as its decision aid, decoded symbols from the LDPC decoder. The method has been tested by means of computational simulations in cases involving various timing and frequency errors. The results of the simulations showed performance approaching that obtained in the ideal case of perfect timing in the receiver.
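The coarse-timing idea, counting satisfied parity checks as a function of a candidate timing offset and searching for the peak, can be illustrated with a toy sketch. The code below is our own illustration, using a small (7,4) Hamming code in place of an LDPC code and an integer symbol delay in place of the full waveform model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy (7,4) Hamming code standing in for an LDPC code.
A = np.array([[1, 1, 0, 1],
              [1, 0, 1, 1],
              [0, 1, 1, 1]])
H = np.hstack([A, np.eye(3, dtype=int)])          # parity-check matrix

def encode(data):
    return np.concatenate([data, (A @ data) % 2])  # systematic codeword

# R random codewords, BPSK-modulated, with an unknown symbol delay.
R, true_delay = 40, 11
stream = np.concatenate([encode(rng.integers(0, 2, 4)) for _ in range(R)])
bits_tx = np.concatenate([rng.integers(0, 2, true_delay), stream,
                          rng.integers(0, 2, 30)])
waveform = 1.0 - 2.0 * bits_tx + 0.3 * rng.standard_normal(bits_tx.size)

def satisfied_checks(offset):
    # Hard-decide 7*R symbols starting at the candidate offset and count
    # how many of the R*3 parity checks are satisfied.
    bits = (waveform[offset:offset + 7 * R] < 0).astype(int).reshape(R, 7)
    return int(np.sum((bits @ H.T) % 2 == 0))

# Parametric search: the satisfied-check count peaks at the true delay,
# since misaligned blocks straddle codeword boundaries and look random.
scores = [satisfied_checks(o) for o in range(30)]
print("estimated delay:", int(np.argmax(scores)), "| true delay:", true_delay)
```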
Spectral Analysis of Vector Magnetic Field Profiles
NASA Technical Reports Server (NTRS)
Parker, Robert L.; O'Brien, Michael S.
1997-01-01
We investigate the power spectra and cross spectra derived from the three components of the vector magnetic field measured on a straight horizontal path above a statistically stationary source. All of these spectra, which can be estimated from the recorded time series, are related to a single two-dimensional power spectral density via integrals that run in the across-track direction in the wavenumber domain. Thus the measured spectra must obey a number of strong constraints: for example, the sum of the two power spectral densities of the two horizontal field components equals the power spectral density of the vertical component at every wavenumber and the phase spectrum between the vertical and along-track components is always pi/2. These constraints provide powerful checks on the quality of the measured data; if they are violated, measurement or environmental noise should be suspected. The noise due to errors of orientation has a clear characteristic; both the power and phase spectra of the components differ from those of crustal signals, which makes orientation noise easy to detect and to quantify. The spectra of the crustal signals can be inverted to obtain information about the cross-track structure of the field. We illustrate these ideas using a high-altitude Project Magnet profile flown in the southeastern Pacific Ocean.
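The two spectral constraints lend themselves to a direct numerical check. The sketch below is a hedged illustration on synthetic data: the three components are independent noise rather than fields from a crustal source, so the test flags exactly the kind of inconsistency it is designed to catch. Component names and parameters are placeholders, not the Project Magnet processing chain.

```python
import numpy as np
from scipy.signal import welch, csd

rng = np.random.default_rng(1)
n, fs = 4096, 1.0
# Synthetic stand-in components (real data would come from the survey).
bx, by, bz = (rng.standard_normal(n) for _ in range(3))

f, p_x = welch(bx, fs=fs, nperseg=512)
_, p_y = welch(by, fs=fs, nperseg=512)
_, p_z = welch(bz, fs=fs, nperseg=512)

# Constraint 1: P_x + P_y should equal P_z at every wavenumber for a
# crustal source. Independent noise violates it, and the ratio shows it.
ratio = (p_x + p_y) / p_z
print("median (Px+Py)/Pz:", np.round(np.median(ratio), 2))

# Constraint 2: the phase between vertical and along-track components
# should sit at pi/2 (about 1.57 rad) for crustal signals.
_, c_zx = csd(bz, bx, fs=fs, nperseg=512)
phase = np.angle(c_zx)
print("median |phase| (rad):", np.round(np.median(np.abs(phase)), 2))
```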
Formal Verification Toolkit for Requirements and Early Design Stages
NASA Technical Reports Server (NTRS)
Badger, Julia M.; Miller, Sheena Judson
2011-01-01
Efficient flight software development from natural language requirements needs an effective way to test designs earlier in the software design cycle. A method to automatically derive logical safety constraints and the design state space from natural language requirements is described. The constraints can then be checked using a logical consistency checker and also be used in a symbolic model checker to verify the early design of the system. This method was used to verify a hybrid control design for the suit ports on NASA Johnson Space Center's Space Exploration Vehicle against safety requirements.
NASA Astrophysics Data System (ADS)
Dzuba, Sergei A.
2016-08-01
Pulsed double electron-electron resonance technique (DEER, or PELDOR) is applied to study conformations and aggregation of peptides, proteins, nucleic acids, and other macromolecules. For a pair of spin labels, experimental data allows for the determination of their distance distribution function, P(r). P(r) is derived as a solution of a first-kind Fredholm integral equation, which is an ill-posed problem. Here, we suggest regularization by increasing the distance discretization length to its upper limit where numerical integration still provides agreement with experiment. This upper limit is found to be well above the lower limit for which the solution instability appears because of the ill-posed nature of the problem. For solving the integral equation, Monte Carlo trials of P(r) functions are employed; this method has an obvious advantage of the fulfillment of the non-negativity constraint for P(r). The regularization by the increasing of distance discretization length for the case of overlapping broad and narrow distributions may be employed selectively, with this length being different for different distance ranges. The approach is checked for model distance distributions and for experimental data taken from literature for doubly spin-labeled DNA and peptide antibiotics.
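A minimal sketch of the Monte Carlo strategy follows: random non-negative trial perturbations of a discretized P(r) are accepted whenever they reduce the misfit to the signal. The kernel here is a schematic stand-in, not the full orientation-averaged DEER kernel, and the grid spacing plays the role of the regularizing discretization length.

```python
import numpy as np

rng = np.random.default_rng(2)

r = np.linspace(1.5, 8.0, 40)        # nm, coarse discretization grid
t = np.linspace(0.0, 3.0, 120)       # microseconds

# Placeholder kernel K(t, r); the real DEER kernel integrates the
# dipolar oscillation (frequency ~ 52.04/r^3 MHz) over orientations.
K = np.cos(np.outer(t, 52.04 / r**3))

p_true = np.exp(-0.5 * ((r - 4.0) / 0.4) ** 2)
p_true /= p_true.sum()
signal = K @ p_true + 0.01 * rng.standard_normal(t.size)

p = np.full(r.size, 1.0 / r.size)    # flat non-negative starting guess
best = np.sum((K @ p - signal) ** 2)
for _ in range(20000):
    trial = p + 0.01 * rng.standard_normal(r.size)
    trial = np.clip(trial, 0.0, None)          # non-negativity for free
    s = trial.sum()
    if s == 0.0:
        continue
    trial /= s                                  # keep P(r) normalized
    err = np.sum((K @ trial - signal) ** 2)
    if err < best:                              # accept improving trials
        p, best = trial, err

print("final residual:", round(float(best), 5))
```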
Proposal of Constraints Analysis Method Based on Network Model for Task Planning
NASA Astrophysics Data System (ADS)
Tomiyama, Tomoe; Sato, Tatsuhiro; Morita, Toyohisa; Sasaki, Toshiro
Deregulation has been accelerating several activities toward reengineering business processes, such as railway through-service and modal shift in logistics. To make those activities successful, business entities have to formulate new business rules or know-how (we call them ‘constraints’). According to the new constraints, they need to manage business resources such as instruments, materials, workers and so on. In this paper, we propose a constraint analysis method to define constraints for task planning of the new business processes. To visualize each constraint's influence on planning, we propose a network model which represents allocation relations between tasks and resources. The network can also represent task ordering relations and resource grouping relations. The proposed method formalizes the way of defining constraints manually as repeatedly checking the network structure and finding conflicts between constraints. Application to crew scheduling problems shows that the method can adequately represent and define constraints of some task planning problems with the following fundamental features: (1) specifying a work pattern for some resources, (2) restricting the number of resources for some works, (3) requiring multiple resources for some works, (4) prior allocation of some resources to some works and (5) considering the workload balance between resources.
The Advent of WDM and the All-Optical Network: A Reality Check.
ERIC Educational Resources Information Center
Lutkowitz, Mark
1998-01-01
Discussion of the telecommunications industry focuses on WDM (wavelength division multiplexing) as a solution for dealing with capacity constraints. Highlights include fiber optic technology; cross-connecting and switching wavelengths; SONET (Synchronous Optical Network) and wavelength networking; and optical TDM (Time Division Multiplexing). (LRW)
Consistency Check for the Bin Packing Constraint Revisited
NASA Astrophysics Data System (ADS)
Dupuis, Julien; Schaus, Pierre; Deville, Yves
The bin packing problem (BP) consists in finding the minimum number of bins necessary to pack a set of items so that the total size of the items in each bin does not exceed the bin capacity C. The bin capacity is common for all the bins.
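As a point of reference for what a consistency check must decide, the sketch below (our own illustration, not the revisited filtering algorithm of the paper) compares a simple lower bound on the number of bins against a first-fit-decreasing packing; the chosen instance also shows that the lower bound alone can be optimistic.

```python
import math

def bins_lower_bound(items, capacity):
    # Total-size bound: at least ceil(sum/capacity) bins are needed.
    return math.ceil(sum(items) / capacity)

def first_fit_decreasing(items, capacity):
    # Classic FFD heuristic: place each item (largest first) into the
    # first bin where it fits, opening a new bin when none does.
    bins = []
    for item in sorted(items, reverse=True):
        for b in bins:
            if sum(b) + item <= capacity:
                b.append(item)
                break
        else:
            bins.append([item])
    return bins

items, capacity, available = [5, 7, 5, 2, 4, 2, 5], 10, 3
lb = bins_lower_bound(items, capacity)
packing = first_fit_decreasing(items, capacity)
print("lower bound:", lb, "| FFD bins used:", len(packing))
# Here lb == 3 but no packing into 3 bins exists, so a check based on
# the bound alone would wrongly report the constraint as consistent.
print("fits in", available, "bins (by FFD):", len(packing) <= available)
```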
Probing primordial non-Gaussianity via iSW measurements with SKA continuum surveys
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raccanelli, Alvise; Doré, Olivier, E-mail: alvise@jhu.edu, E-mail: olivier.dore@caltech.edu; Bacon, David J.
The Planck CMB experiment has delivered the best constraints so far on primordial non-Gaussianity, ruling out early-Universe models of inflation that generate large non-Gaussianity. Although small improvements in the CMB constraints are expected, the next frontier of precision will come from future large-scale surveys of the galaxy distribution. The advantage of such surveys is that they can measure many more modes than the CMB; in particular, forthcoming radio surveys with the Square Kilometre Array will cover huge volumes. Radio continuum surveys deliver the largest volumes, but with the disadvantage of no redshift information. In order to mitigate this, we use two additional observables. First, the integrated Sachs-Wolfe effect, the cross-correlation of the radio number counts with the CMB temperature anisotropies, helps to reduce systematics on the large scales that are sensitive to non-Gaussianity. Second, optical data allows for cross-identification in order to gain some redshift information. We show that, while the single redshift bin case can provide a σ(f_NL) ∼ 20, and is therefore not competitive with current and future constraints on non-Gaussianity, a tomographic analysis could improve the constraints by an order of magnitude, even with only two redshift bins. A huge improvement is provided by the addition of high-redshift sources, so having cross-ID for high-z galaxies and an even higher-z radio tail is key to enabling very precise measurements of f_NL. We use Fisher matrix forecasts to predict the constraining power in the case of no redshift information and the case where cross-ID allows a tomographic analysis, and we show that the constraints do not improve much with 3 or more bins. Our results show that SKA continuum surveys could provide constraints competitive with CMB and forthcoming optical surveys, potentially allowing a measurement of σ(f_NL) ∼ 1 to be made. Moreover, these measurements would act as a useful check of results obtained with other probes at other redshift ranges with other methods.
Code of Federal Regulations, 2010 CFR
2010-01-01
... detection and surveillance of unauthorized penetration or activities, (3) Monitor with an intrusion alarm or... acknowledges the specified mode of transport, (iii) Check the integrity of the container and locks or seals... material of moderate strategic significance shall: (i) Check the integrity of the containers and seals upon...
Code of Federal Regulations, 2013 CFR
2013-01-01
... detection and surveillance of unauthorized penetration or activities, (3) Monitor with an intrusion alarm or... acknowledges the specified mode of transport, (iii) Check the integrity of the container and locks or seals... material of moderate strategic significance shall: (i) Check the integrity of the containers and seals upon...
Code of Federal Regulations, 2014 CFR
2014-01-01
... detection and surveillance of unauthorized penetration or activities, (3) Monitor with an intrusion alarm or... acknowledges the specified mode of transport, (iii) Check the integrity of the container and locks or seals... material of moderate strategic significance shall: (i) Check the integrity of the containers and seals upon...
Code of Federal Regulations, 2012 CFR
2012-01-01
... detection and surveillance of unauthorized penetration or activities, (3) Monitor with an intrusion alarm or... acknowledges the specified mode of transport, (iii) Check the integrity of the container and locks or seals... material of moderate strategic significance shall: (i) Check the integrity of the containers and seals upon...
Code of Federal Regulations, 2011 CFR
2011-01-01
... detection and surveillance of unauthorized penetration or activities, (3) Monitor with an intrusion alarm or... acknowledges the specified mode of transport, (iii) Check the integrity of the container and locks or seals... material of moderate strategic significance shall: (i) Check the integrity of the containers and seals upon...
Rotational-path decomposition based recursive planning for spacecraft attitude reorientation
NASA Astrophysics Data System (ADS)
Xu, Rui; Wang, Hui; Xu, Wenming; Cui, Pingyuan; Zhu, Shengying
2018-02-01
The spacecraft reorientation is a common task in many space missions. With multiple pointing constraints, it is greatly difficult to solve the constrained spacecraft reorientation planning problem. To deal with this problem, an efficient rotational-path decomposition based recursive planning (RDRP) method is proposed in this paper. The uniform pointing-constraint-ignored attitude rotation planning process is designed to solve all rotations without considering pointing constraints. Then the whole path is checked node by node. If any pointing constraint is violated, the nearest critical increment approach will be used to generate feasible alternative nodes in the process of rotational-path decomposition. As the planning path of each subdivision may still violate pointing constraints, multiple decomposition is needed and the reorientation planning is designed as a recursive manner. Simulation results demonstrate the effectiveness of the proposed method. The proposed method has been successfully applied in two SPARK microsatellites to solve onboard constrained attitude reorientation planning problem, which were developed by the Shanghai Engineering Center for Microsatellites and launched on 22 December 2016.
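The decompose-and-recurse structure can be sketched compactly. The toy below is our own schematic: boresight directions are unit vectors, a single keep-out cone stands in for the multiple pointing constraints, and the "nearest critical increment" is approximated by pushing an infeasible midpoint away from the keep-out axis.

```python
import numpy as np

SUN = np.array([1.0, 0.0, 0.0])      # keep-out axis (illustrative)
KEEP_OUT_DEG = 20.0

def feasible(v):
    # v is a unit vector; feasible when outside the keep-out cone.
    ang = np.degrees(np.arccos(np.clip(float(v @ SUN), -1.0, 1.0)))
    return ang > KEEP_OUT_DEG

def slerp(a, b, s):
    # Great-circle interpolation between unit vectors a and b.
    ang = np.arccos(np.clip(float(a @ b), -1.0, 1.0))
    if ang < 1e-9:
        return a
    return (np.sin((1 - s) * ang) * a + np.sin(s * ang) * b) / np.sin(ang)

def plan(a, b, depth=0, max_depth=8):
    """Nodes of a rotation path from a to b avoiding the keep-out cone."""
    if depth >= max_depth:
        return [a, b]
    if all(feasible(slerp(a, b, s)) for s in np.linspace(0.05, 0.95, 19)):
        return [a, b]                # whole arc clear: keep this segment
    # Violation found: create a feasible alternative node near the
    # midpoint by pushing it off the keep-out axis, then recurse on
    # both sub-paths (the recursive decomposition step).
    mid = slerp(a, b, 0.5)
    detour = mid - SUN * float(mid @ SUN)
    detour /= np.linalg.norm(detour)  # assumes mid is not on the axis
    return plan(a, detour, depth + 1)[:-1] + plan(detour, b, depth + 1)

def unit(v):
    return v / np.linalg.norm(v)

path = plan(unit(np.array([0.9, 0.3, -0.4])), unit(np.array([0.9, -0.1, 0.4])))
print(len(path), "nodes; all feasible:", all(feasible(p) for p in path))
```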
Consistent Query Answering of Conjunctive Queries under Primary Key Constraints
ERIC Educational Resources Information Center
Pema, Enela
2014-01-01
An inconsistent database is a database that violates one or more of its integrity constraints. In reality, violations of integrity constraints arise frequently under several different circumstances. Inconsistent databases have long posed the challenge of developing suitable tools for meaningful query answering. A principled approach for querying…
Virasoro constraints and polynomial recursion for the linear Hodge integrals
NASA Astrophysics Data System (ADS)
Guo, Shuai; Wang, Gehao
2017-04-01
The Hodge tau-function is a generating function for the linear Hodge integrals. It is also a tau-function of the KP hierarchy. In this paper, we first present the Virasoro constraints for the Hodge tau-function in the explicit form of the Virasoro equations. The expression of our Virasoro constraints is simply a linear combination of the Virasoro operators, where the coefficients are restored from a power series for the Lambert W function. Then, using this result, we deduce a simple version of the Virasoro constraints for the linear Hodge partition function, where the coefficients are restored from the Gamma function. Finally, we establish the equivalence relation between the Virasoro constraints and polynomial recursion formula for the linear Hodge integrals.
Calibration and characterisation of the Gaia Red Clump
NASA Astrophysics Data System (ADS)
Ruiz-Dern, L.; Babusiaux, C.; Arenou, F.; Danielski, C.; Turon, C.; Sartoretti, P.
2018-04-01
We present new empirical Colour-Colour and Effective Temperature-Colour Gaia Red Clump calibrations. The selected sample takes into account high photometric quality, good spectrometric metallicity, homogeneous effective temperatures and low interstellar extinctions. From those calibrations we developed a method to derive the absolute magnitude, temperature and extinction of the Gaia RC. We tested our colour and extinction estimates on stars with measured spectroscopic effective temperatures and Diffuse Interstellar Band (DIB) constraints. Within the Gaia Validation team these calibrations are also being used, together with asteroseismic constraints, to check the parallax zero-point with Red Clump stars.
Conceptual design of ACB-CP for ITER cryogenic system
NASA Astrophysics Data System (ADS)
Jiang, Yongcheng; Xiong, Lianyou; Peng, Nan; Tang, Jiancheng; Liu, Liqiang; Zhang, Liang
2012-06-01
ACB-CP (Auxiliary Cold Box for Cryopumps) is used to supply the cryopump system with the necessary cryogen in the ITER (International Thermonuclear Experimental Reactor) cryogenic distribution system. The conceptual design of the ACB-CP comprises thermo-hydraulic analysis, 3D structure design and strength checking. Through the thermo-hydraulic analysis, the main specifications of the process valves, pressure safety valves, pipes and heat exchangers can be decided. During the 3D structure design process, vacuum requirements, adiabatic requirements, assembly constraints and maintenance requirements have been considered in arranging the pipes, valves and other components. The strength checking has been performed to verify that the 3D design meets the strength requirements for the ACB-CP.
1991-12-01
materials. TABLE I (DRMO Market Prices): Paper, $45/ton; Canvas, $0.024/lb; Aluminum, $0.26/lb; Tires*, $0.02/lb; Corrugated, $63/ton; Silver Reclaimed...quality control check in accordance with their permit requirements. They pull samples and do a fingerprint analysis. If during that analysis they find that
Modeling Regular Replacement for String Constraint Solving
NASA Technical Reports Server (NTRS)
Fu, Xiang; Li, Chung-Chih
2010-01-01
Bugs in user input sanitization of software systems often lead to vulnerabilities. Among them, many are caused by improper use of regular replacement. This paper presents a precise modeling of various semantics of regular substitution, such as the declarative, finite, greedy, and reluctant, using finite state transducers (FST). By projecting an FST to its input/output tapes, we are able to solve atomic string constraints, which can be applied to both forward and backward image computation in model checking and symbolic execution of text processing programs. We report several interesting discoveries, e.g., certain fragments of the general problem can be handled using less expressive deterministic FST. A compact representation of FST is implemented in SUSHI, a string constraint solver. It is applied to detecting vulnerabilities in web applications.
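The semantic distinctions being modeled are easy to see in any regex engine. The snippet below uses Python's re module (not SUSHI's FST machinery) to show how greedy and reluctant replacement produce different images for the same pattern and input:

```python
import re

s = "<b>bold</b> and <i>italic</i>"

greedy = re.sub(r"<.*>", "", s)      # .* spans from first '<' to last '>'
reluctant = re.sub(r"<.*?>", "", s)  # .*? stops at the first '>'

print(repr(greedy))     # '' -- the single greedy match eats everything
print(repr(reluctant))  # 'bold and italic' -- only the tags are removed
```

A sanitizer written with the greedy pattern in mind behaves very differently from the reluctant one on the same input, which is precisely why a solver must model each replacement semantics separately.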
NASA Astrophysics Data System (ADS)
Kang, Jidong; Gianetto, James A.; Tyson, William R.
2018-03-01
Fracture toughness measurement is an integral part of structural integrity assessment of pipelines. Traditionally, a single-edge-notched bend (SE(B)) specimen with a deep crack is recommended in many existing pipeline structural integrity assessment procedures. Such a test provides high constraint and therefore conservative fracture toughness results. However, for girth welds in service, defects are usually subjected to primarily tensile loading, where the constraint is usually much lower than in the three-point bend case. Moreover, there is increasing use of strain-based design of pipelines that allows applied strains above yield. Low-constraint toughness tests represent more realistic loading conditions for girth weld defects, and the corresponding increased toughness can minimize unnecessary conservatism in assessments. In this review, we present recent developments in low-constraint fracture toughness testing, specifically using single-edge-notched tension specimens, SENT or SE(T). We focus our review on the test procedure development and automation, round-robin test results and some common concerns such as the effect of crack tip, crack size monitoring techniques, and testing at low temperatures. Examples are also given of the integration of fracture toughness data from SE(T) tests into structural integrity assessment.
NASA Astrophysics Data System (ADS)
Tang, Qiuhua; Li, Zixiang; Zhang, Liping; Floudas, C. A.; Cao, Xiaojun
2015-09-01
Due to the NP-hardness of the two-sided assembly line balancing (TALB) problem, multiple constraints existing in real applications are less studied, especially when one task is involved with several constraints. In this paper, an effective hybrid algorithm is proposed to address the TALB problem with multiple constraints (TALB-MC). Considering the discrete attribute of TALB-MC and the continuous attribute of the standard teaching-learning-based optimization (TLBO) algorithm, the random-keys method is employed for task permutation representation, for the purpose of bridging the gap between them. Subsequently, a special mechanism for handling multiple constraints is developed. In the mechanism, the direction constraint of each task is ensured by direction check and adjustment. The zoning constraints and the synchronism constraints are satisfied by teasing out the hidden correlations among constraints. The positional constraint is allowed to be violated to some extent in decoding and is punished in the cost function. Finally, with the TLBO seeking the global optimum, the variable neighborhood search (VNS) is further hybridized to extend the local search space. The experimental results show that the proposed hybrid algorithm outperforms the late acceptance hill-climbing algorithm (LAHC) for TALB-MC in most cases, especially for large-size problems with multiple constraints, and demonstrates a good balance between exploration and exploitation. This research proposes an effective and efficient algorithm for solving the TALB-MC problem by hybridizing the TLBO and VNS.
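The random-keys encoding mentioned above is simple to state: each task carries a continuous key that TLBO manipulates, and sorting on the keys recovers a discrete task sequence. A minimal sketch follows (precedence repair and the constraint-handling mechanism are omitted):

```python
import numpy as np

rng = np.random.default_rng(3)
n_tasks = 8
keys = rng.random(n_tasks)           # continuous individual from TLBO
permutation = np.argsort(keys)       # decode: tasks in ascending-key order
print("keys:", np.round(keys, 2))
print("decoded task sequence:", permutation.tolist())
```

Because any real-valued vector decodes to a valid permutation, the continuous TLBO update rules can be applied unchanged, which is the gap-bridging role the paper assigns to random keys.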
CheckMATE 2: From the model to the limit
NASA Astrophysics Data System (ADS)
Dercks, Daniel; Desai, Nishita; Kim, Jong Soo; Rolbiecki, Krzysztof; Tattersall, Jamie; Weber, Torsten
2017-12-01
We present the latest developments to the CheckMATE program that allows models of new physics to be easily tested against the recent LHC data. To achieve this goal, the core of CheckMATE now contains over 60 LHC analyses of which 12 are from the 13 TeV run. The main new feature is that CheckMATE 2 now integrates the Monte Carlo event generation via MadGraph5_aMC@NLO and Pythia 8. This allows users to go directly from a SLHA file or UFO model to the result of whether a model is allowed or not. In addition, the integration of the event generation leads to a significant increase in the speed of the program. Many other improvements have also been made, including the possibility to now combine signal regions to give a total likelihood for a model.
Dasgupta, Amitava; Chughtai, Omar; Hannah, Christina; Davis, Bonnette; Wells, Alice
2004-10-01
Several adulterants are used to mask tests for abused drugs in urine. Adulterants such as "Klear" and "Whizzies" contain potassium nitrite, while "Urine Luck" contains pyridinium chlorochromate (PCC). The presence of these adulterants cannot be detected by routine specimen integrity checks (pH, specific gravity, creatinine and temperature). We previously reported the development of rapid spot tests to detect the presence of these adulterants. AdultaCheck 6 and Intect 7 urine test strips are commercially available for detecting the presence of these adulterants along with specific gravity, creatinine and pH in urine. The performance of these two test strips for detecting adulterants was compared with the results obtained by spot tests. Both AdultaCheck 6 and Intect 7 effectively detected the presence of nitrite and pyridinium chlorochromate in urine. Moreover, both test strips successfully detected the presence of glutaraldehyde, for which no spot test is currently available. High amounts of glucose and ascorbic acid did not cause any false-positive results with AdultaCheck 6 or Intect 7. Both AdultaCheck 6 and Intect 7 can be used for checking the integrity of a urine specimen submitted for drugs-of-abuse testing.
Shear flow simulations of biaxial nematic liquid crystals
NASA Astrophysics Data System (ADS)
Sarman, Sten
1997-08-01
We have calculated the viscosities of a biaxial nematic liquid crystal phase of a variant of the Gay-Berne fluid [J. G. Gay and B. J. Berne, J. Chem. Phys. 74, 3316 (1981)] by performing molecular dynamics simulations. The equations of motion have been augmented by a director constraint torque that fixes the orientation of the directors. This makes it possible to fix them at different angles relative to the stream lines in shear flow simulations. In equilibrium simulations the constraints generate a new ensemble. One finds that the Green-Kubo relations for the viscosities become linear combinations of time correlation function integrals in this ensemble whereas they are complicated rational functions in the conventional canonical ensemble. We have evaluated these Green-Kubo relations for all the shear viscosities and all the twist viscosities. We have also calculated the alignment angles, which are functions of the viscosity coefficients. We find that there are three real alignment angles but a linear stability analysis shows that only one of them corresponds to a stable director orientation. The Green-Kubo results have been cross checked by nonequilibrium shear flow simulations. The results from the different methods agree very well. Finally, we have evaluated the Miesowicz viscosities [D. Baalss, Z. Naturforsch. Teil A 45, 7 (1990)]. They vary by more than 2 orders of magnitude. The viscosity is consequently highly orientation dependent.
The Software Design for the Wide-Field Infrared Explorer Attitude Control System
NASA Technical Reports Server (NTRS)
Anderson, Mark O.; Barnes, Kenneth C.; Melhorn, Charles M.; Phillips, Tom
1998-01-01
The Wide-Field Infrared Explorer (WIRE), currently scheduled for launch in September 1998, is the fifth of five spacecraft in the NASA/Goddard Small Explorer (SMEX) series. This paper presents the design of WIRE's Attitude Control System flight software (ACS FSW). WIRE is a momentum-biased, three-axis stabilized stellar pointer which provides high-accuracy pointing and autonomous acquisition for eight to ten stellar targets per orbit. WIRE's short mission life and limited cryogen supply motivate requirements for Sun and Earth avoidance constraints which are designed to prevent catastrophic instrument damage and to minimize the heat load on the cryostat. The FSW implements autonomous fault detection and handling (FDH) to enforce these instrument constraints and to perform several other checks which ensure the safety of the spacecraft. The ACS FSW implements modules for sensor data processing, attitude determination, attitude control, guide star acquisition, actuator command generation, command/telemetry processing, and FDH. These software components are integrated with a hierarchical control mode managing module that dictates which software components are currently active. The lowest mode in the hierarchy is the 'safest' one, in the sense that it utilizes a minimal complement of sensors and actuators to keep the spacecraft in a stable configuration (power and pointing constraints are maintained). As higher modes in the hierarchy are achieved, the various software functions are activated by the mode manager, and an increasing level of attitude control accuracy is provided. If FDH detects a constraint violation or other anomaly, it triggers a safing transition to a lower control mode. The WIRE ACS FSW satisfies all target acquisition and pointing accuracy requirements, enforces all pointing constraints, provides the ground with a simple means for reconfiguring the system via table load, and meets all the demands of its real-time embedded environment (16 MHz Intel 80386 processor with 80387 coprocessor running under the VRTX operating system). The mode manager organizes and controls all the software modules used to accomplish these goals, and in particular, the FDH module is tightly coupled with the mode manager.
ERIC Educational Resources Information Center
Makwana, Alpesh P.
2009-01-01
"Pre-Trip Inspection" of the truck and trailer is one of the components of the current Commercial Driver's License (CDL) test. This part of the CDL test checks the ability of the student to identify the important parts of the commercial vehicle and their potential defects. The "Virtual Check Ride System" (VCRS), a…
Generalization of the event-based Carnevale-Hines integration scheme for integrate-and-fire models.
van Elburg, Ronald A J; van Ooyen, Arjen
2009-07-01
An event-based integration scheme for an integrate-and-fire neuron model with exponentially decaying excitatory synaptic currents and double exponential inhibitory synaptic currents has been introduced by Carnevale and Hines. However, the integration scheme imposes nonphysiological constraints on the time constants of the synaptic currents, which hamper its general applicability. This letter addresses this problem in two ways. First, we provide physical arguments demonstrating why these constraints on the time constants can be relaxed. Second, we give a formal proof showing which constraints can be abolished. As part of our formal proof, we introduce the generalized Carnevale-Hines lemma, a new tool for comparing double exponentials as they naturally occur in many cascaded decay systems, including receptor-neurotransmitter dissociation followed by channel closing. Through repeated application of the generalized lemma, we lift most of the original constraints on the time constants. Thus, we show that the Carnevale-Hines integration scheme for the integrate-and-fire model can be employed for simulating a much wider range of neuron and synapse types than was previously thought.
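The essence of event-based integration is that, between synaptic events, the membrane and current equations have closed-form solutions, so the state can be advanced exactly from event to event instead of on a fixed time grid. The sketch below is a simplified single-exponential illustration with made-up parameters; the Carnevale-Hines scheme additionally covers the double-exponential inhibitory currents analyzed in the letter.

```python
import math

tau_m, tau_s = 20.0, 5.0        # membrane and synaptic time constants (ms)

def advance(v, g, dt):
    """Exact state update over an event-free interval dt.

    Solves dv/dt = (-v + g)/tau_m, dg/dt = -g/tau_s (tau_m != tau_s):
    v(dt) = v0*exp(-dt/tau_m) + a*g0*(exp(-dt/tau_s) - exp(-dt/tau_m))
    with a = tau_s / (tau_s - tau_m).
    """
    g_new = g * math.exp(-dt / tau_s)
    a = tau_s / (tau_s - tau_m)
    v_new = (v - a * g) * math.exp(-dt / tau_m) + a * g_new
    return v_new, g_new

v, g, t = 0.0, 0.0, 0.0
events = [(5.0, 1.2), (7.5, 0.8), (20.0, 1.0)]   # (time in ms, weight)
for t_ev, w in events:
    v, g = advance(v, g, t_ev - t)   # jump exactly to the next event
    g += w                           # synaptic conductance jump
    t = t_ev
    print(f"t={t:5.1f} ms  v={v:6.4f}  g={g:6.4f}")
```

Note the closed form above degenerates when tau_m equals tau_s, one instance of the time-constant constraints the letter examines and relaxes.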
NASA Technical Reports Server (NTRS)
Lee, Jeh Won
1990-01-01
The objective is the theoretical analysis and the experimental verification of the dynamics and control of a two-link flexible manipulator with a flexible parallel link mechanism. Nonlinear equations of motion of the lightweight manipulator are derived by the Lagrangian method in symbolic form to better understand the structure of the dynamic model. The resulting equations of motion have a structure which is useful for reducing the number of terms calculated, for checking correctness, or for extending the model to higher order. A manipulator with a flexible parallel link mechanism is a constrained dynamic system whose equations are sensitive to numerical integration error. This constrained system is solved using singular value decomposition of the constraint Jacobian matrix. Elastic motion is expressed by the assumed mode method. Mode shape functions of each link are chosen using the load-interfaced component mode synthesis. The discrepancies between the analytical model and the experiment are explained using a simplified and a detailed finite element model.
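The SVD-based treatment of the constraint Jacobian can be sketched on a toy system: the null space of the Jacobian parametrizes the accelerations compatible with the constraint, and the remaining free coordinates are fixed by a least-squares solve. The matrices below are small placeholders, and the least-squares choice is one simple option rather than the formulation used in the paper.

```python
import numpy as np

M = np.diag([2.0, 1.0, 1.5])             # mass matrix (placeholder)
Q = np.array([1.0, -0.5, 0.3])           # generalized forces
J = np.array([[1.0, 1.0, 0.0]])          # constraint Jacobian (1 x 3)
c = np.array([0.0])                      # constraint: J @ qdd = c

# SVD of J: rows of Vt beyond the rank span the null space of J.
U, s, Vt = np.linalg.svd(J)
rank = int(np.sum(s > 1e-10))
N = Vt[rank:].T                          # orthonormal null-space basis

qdd_p = np.linalg.pinv(J) @ c            # particular solution of J qdd = c
# Fix the free coordinates z by minimizing ||M(qdd_p + N z) - Q||.
z, *_ = np.linalg.lstsq(M @ N, Q - M @ qdd_p, rcond=None)
qdd = qdd_p + N @ z

# The constraint is satisfied to numerical precision by construction.
print("qdd:", np.round(qdd, 4), "| J @ qdd:", np.round(J @ qdd, 12))
```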
Effect of Subject Types on the Production of Auxiliary "Is" in Young English-Speaking Children
ERIC Educational Resources Information Center
Guo, Ling-Yu; Owen, Amanda J.; Tomblin, J. Bruce
2010-01-01
Purpose: In this study, the authors tested the unique checking constraint (UCC) hypothesis and the usage-based approach concerning why young children variably use tense and agreement morphemes in obligatory contexts by examining the effect of subject types on the production of auxiliary "is". Method: Twenty typically developing 3-year-olds were…
Provenance based data integrity checking and verification in cloud environments.
Imran, Muhammad; Hlavacs, Helmut; Haq, Inam Ul; Jan, Bilal; Khan, Fakhri Alam; Ahmad, Awais
2017-01-01
Cloud computing is a recent tendency in IT that moves computing and data away from desktop and hand-held devices into large scale processing hubs and data centers respectively. It has been proposed as an effective solution for data outsourcing and on demand computing to control the rising cost of IT setups and management in enterprises. However, with Cloud platforms user's data is moved into remotely located storages such that users lose control over their data. This unique feature of the Cloud is facing many security and privacy challenges which need to be clearly understood and resolved. One of the important concerns that needs to be addressed is to provide the proof of data integrity, i.e., correctness of the user's data stored in the Cloud storage. The data in Clouds is physically not accessible to the users. Therefore, a mechanism is required where users can check if the integrity of their valuable data is maintained or compromised. For this purpose some methods are proposed like mirroring, checksumming and using third party auditors amongst others. However, these methods use extra storage space by maintaining multiple copies of data or the presence of a third party verifier is required. In this paper, we address the problem of proving data integrity in Cloud computing by proposing a scheme through which users are able to check the integrity of their data stored in Clouds. In addition, users can track the violation of data integrity if occurred. For this purpose, we utilize a relatively new concept in the Cloud computing called "Data Provenance". Our scheme is capable to reduce the need of any third party services, additional hardware support and the replication of data items on client side for integrity checking.
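A hash-chained provenance log conveys the core idea in a few lines. This is an illustrative sketch of chained integrity records, not the scheme proposed in the paper: each record's digest covers the data and the previous digest, so any later modification of stored data breaks verification at a traceable point.

```python
import hashlib

def h(*parts: bytes) -> bytes:
    return hashlib.sha256(b"|".join(parts)).digest()

def append_record(chain, data: bytes, actor: str):
    # Digest chains over the previous record, the actor, and the data.
    prev = chain[-1]["digest"] if chain else b"genesis"
    chain.append({"actor": actor, "data": data,
                  "digest": h(prev, actor.encode(), data)})

def verify(chain) -> bool:
    # Recompute every digest; the first mismatch localizes the violation.
    prev = b"genesis"
    for rec in chain:
        if rec["digest"] != h(prev, rec["actor"].encode(), rec["data"]):
            return False
        prev = rec["digest"]
    return True

chain = []
append_record(chain, b"v1 of the outsourced file", "owner")
append_record(chain, b"v2 after authorized edit", "owner")
print("intact:", verify(chain))          # True

chain[0]["data"] = b"tampered content"   # cloud-side modification
print("after tampering:", verify(chain)) # False: violation is detected
```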
A Model of Object-Identities and Values
1990-02-23
The formalism includes the expression of integrity constraints in its construct, which provides the natural integration of the logical database model and the object-oriented database model. The two portions are integrated by a simple commutative diagram of modeling functions.
Physical constraints on biological integral control design for homeostasis and sensory adaptation.
Ang, Jordan; McMillen, David R
2013-01-22
Synthetic biology includes an effort to use design-based approaches to create novel controllers, biological systems aimed at regulating the output of other biological processes. The design of such controllers can be guided by results from control theory, including the strategy of integral feedback control, which is central to regulation, sensory adaptation, and long-term robustness. Realization of integral control in a synthetic network is an attractive prospect, but the nature of biochemical networks can make the implementation of even basic control structures challenging. Here we present a study of the general challenges and important constraints that will arise in efforts to engineer biological integral feedback controllers or to analyze existing natural systems. Constraints arise from the need to identify target output values that the combined process-plus-controller system can reach, and to ensure that the controller implements a good approximation of integral feedback control. These constraints depend on mild assumptions about the shape of input-output relationships in the biological components, and thus will apply to a variety of biochemical systems. We summarize our results as a set of variable constraints intended to provide guidance for the design or analysis of a working biological integral feedback controller. Copyright © 2013 Biophysical Society. Published by Elsevier Inc. All rights reserved.
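A toy simulation makes the reachability constraint concrete: an integral controller can only regulate to targets inside the plant's achievable output range, and outside that range the integrator winds up without converging. All numbers below are illustrative, not drawn from the paper's biochemical models.

```python
def simulate(target, steps=4000, ki=0.01):
    """Discrete integral feedback on a saturating first-order plant."""
    y, u_int = 0.0, 0.0
    for _ in range(steps):
        err = target - y
        u_int += ki * err                  # integral action accumulates
        u = min(max(u_int, 0.0), 1.0)      # bounded actuator (0..1)
        y += 0.1 * (2.0 * u - y)           # plant: steady state y = 2u <= 2
    return y

print("reachable target 1.5  ->", round(simulate(1.5), 3))   # converges
print("unreachable target 3.0 ->", round(simulate(3.0), 3))  # stuck near 2
```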
Integrated system for automated financial document processing
NASA Astrophysics Data System (ADS)
Hassanein, Khaled S.; Wesolkowski, Slawo; Higgins, Ray; Crabtree, Ralph; Peng, Antai
1997-02-01
A system was developed that integrates intelligent document analysis with multiple character/numeral recognition engines in order to achieve high accuracy automated financial document processing. In this system, images are accepted in both their grayscale and binary formats. A document analysis module starts by extracting essential features from the document to help identify its type (e.g. personal check, business check, etc.). These features are also utilized to conduct a full analysis of the image to determine the location of interesting zones such as the courtesy amount and the legal amount. These fields are then made available to several recognition knowledge sources such as courtesy amount recognition engines and legal amount recognition engines through a blackboard architecture. This architecture allows all the available knowledge sources to contribute incrementally and opportunistically to the solution of the given recognition query. Performance results on a test set of machine printed business checks using the integrated system are also reported.
Trajectory Design to Mitigate Risk on the Transiting Exoplanet Survey Satellite (TESS) Mission
NASA Technical Reports Server (NTRS)
Dichmann, Donald
2016-01-01
The Transiting Exoplanet Survey Satellite (TESS) will employ a highly eccentric Earth orbit, in 2:1 lunar resonance, reached with a lunar flyby preceded by 3.5 phasing loops. The TESS mission has limited propellant and several orbit constraints. Based on analysis and simulation, we have designed the phasing loops to reduce delta-V and to mitigate risk due to maneuver execution errors. We have automated the trajectory design process and use distributed processing to generate and to optimize nominal trajectories, check constraint satisfaction, and finally model the effects of maneuver errors to identify trajectories that best meet the mission requirements.
Immediate effects of form-class constraints on spoken word recognition
Magnuson, James S.; Tanenhaus, Michael K.; Aslin, Richard N.
2008-01-01
In many domains of cognitive processing there is strong support for bottom-up priority and delayed top-down (contextual) integration. We ask whether this applies to supra-lexical context that could potentially constrain lexical access. Previous findings of early context integration in word recognition have typically used constraints that can be linked to pair-wise conceptual relations between words. Using an artificial lexicon, we found immediate integration of syntactic expectations based on pragmatic constraints linked to syntactic categories rather than words: phonologically similar “nouns” and “adjectives” did not compete when a combination of syntactic and visual information strongly predicted form class. These results suggest that predictive context is integrated continuously, and that previous findings supporting delayed context integration stem from weak contexts rather than delayed integration. PMID:18675408
Verification of Java Programs using Symbolic Execution and Invariant Generation
NASA Technical Reports Server (NTRS)
Pasareanu, Corina; Visser, Willem
2004-01-01
Software verification is recognized as an important and difficult problem. We present a novel framework, based on symbolic execution, for the automated verification of software. The framework uses annotations in the form of method specifications and loop invariants. We present a novel iterative technique that uses invariant strengthening and approximation for discovering these loop invariants automatically. The technique handles different types of data (e.g. boolean and numeric constraints, dynamically allocated structures and arrays) and it allows for checking universally quantified formulas. Our framework is built on top of the Java PathFinder model checking toolset and it was used for the verification of several non-trivial Java programs.
Reopen parameter regions in two-Higgs doublet models
NASA Astrophysics Data System (ADS)
Staub, Florian
2018-01-01
The stability of the electroweak potential is a very important constraint on models of new physics. At the moment, it is standard practice for two-Higgs-doublet models (THDM) and for singlet or triplet extensions of the standard model to perform these checks at tree level. However, these models are often studied in the presence of very large couplings. Therefore, it can be expected that radiative corrections to the potential are important. We study these effects for the example of the THDM type-II and find that loop corrections can revive more than 50% of the phenomenologically viable points which are ruled out by the tree-level vacuum stability checks. Similar effects are expected for other extensions of the standard model.
Specification and Enforcement of Semantic Integrity Constraints in Microsoft Access
ERIC Educational Resources Information Center
Dadashzadeh, Mohammad
2007-01-01
Semantic integrity constraints are business-specific rules that limit the permissible values in a database. For example, a university rule dictating that an "incomplete" grade cannot be changed to an A constrains the possible states of the database. To maintain database integrity, business rules should be identified in the course of database…
A Hybrid Constraint Representation and Reasoning Framework
NASA Technical Reports Server (NTRS)
Golden, Keith; Pang, Wan-Lin
2003-01-01
This paper introduces JNET, a novel constraint representation and reasoning framework that supports procedural constraints and constraint attachments, providing a flexible way of integrating the constraint reasoner with a run- time software environment. Attachments in JNET are constraints over arbitrary Java objects, which are defined using Java code, at runtime, with no changes to the JNET source code.
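In Python rather than Java, and with invented names rather than JNET's actual API, the attachment idea looks roughly like this: constraints over arbitrary runtime objects are plain procedures registered with the reasoner and evaluated against a variable binding.

```python
class ConstraintNet:
    """Minimal stand-in for a constraint network with procedural attachments."""

    def __init__(self):
        self.constraints = []   # (name, predicate over a variable binding)

    def attach(self, name, predicate):
        # A procedural attachment: arbitrary code registered at runtime.
        self.constraints.append((name, predicate))

    def check(self, binding):
        # Report the names of all constraints the binding violates.
        return [name for name, pred in self.constraints if not pred(binding)]

net = ConstraintNet()
net.attach("path-under-tmp", lambda b: b["path"].startswith("/tmp/"))
net.attach("size-bound", lambda b: len(b["items"]) <= b["max_items"])

violations = net.check({"path": "/etc/passwd", "items": [1, 2, 3],
                        "max_items": 2})
print("violated constraints:", violations)   # both constraints fail
```

The design point this mirrors is that the predicates are ordinary code, so the constraint reasoner can be coupled to a run-time software environment without a fixed constraint vocabulary.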
The Single Soldier Quality of Life Initiative: Great Expectations of Privacy
1995-04-01
without regard to their marital status and to hold them accountable to established standards. To many "old soldiers," some of the ideas contained...Family Housing Office: assign and terminate quarters, conduct check-in and check-out inspections, maintain accountability of SQ furniture, follow up on...integrity is a second priority." Further hindering unit integrity is that the smoking preference of the soldiers must be taken into account when making
Implications of water constraints for electricity capacity expansion in the United States
NASA Astrophysics Data System (ADS)
Liu, L.; Hejazi, M. I.; Iyer, G.; Forman, B. A.
2017-12-01
U.S. electricity generation is vulnerable to water supply since water is required for cooling. Constraints on the availability of water will therefore necessitate adaptive planning by the power generation sector. Hence, it is important to integrate restrictions in water availability in electricity capacity planning in order to better understand the economic viability of alternative capacity planning options. The study of the implications of water constraints for the U.S. power generation system is limited in terms of scale and robustness. We extend previous studies by including physical water constraints in a state-level model of the U.S. energy system embedded within a global integrated assessment model (GCAM-USA). We focus on the implications of such constraints for the U.S. electricity capacity expansion, integrating both supply and demand effects under a consistent framework. Constraints on the availability of water have two general effects across the U.S. First, water availability constraints increase the cost of electricity generation, resulting in reduced electrification of end-use sectors. Second, water availability constraints result in forced retirements of water-intensive technologies such as thermoelectric coal- and gas- fired technologies before the end of their natural lifetimes. The demand for electricity is then met by an increase in investments in less water-dependent technologies such as wind and solar photovoltaic. Our results show that the regional patterns of the above effects are heterogeneous across the U.S. In general, the impacts of water constraints on electricity capacity expansion are more pronounced in the West than in the East. This is largely because of lower water availability in the West compared to the East due to lower precipitation in the Western states. Constraints on the availability of water might also have important implications for U.S. electricity trade. For example, under severe constraints on the availability of water, some states flip from being net exporters of electricity to becoming net importers and vice versa. Our study demonstrates the impacts of water availability constraints on electricity capacity expansion in the U.S. and highlights the need to integrate such constraints into decision-making so as to better understand state-level challenges.
Hall-Andersen, Lene Bjerg; Neumann, Patrick; Broberg, Ole
2016-10-17
The integration of ergonomics knowledge into engineering projects leads to both healthier and more efficient workplaces. There is a lack of knowledge about integrating ergonomics knowledge into the design practice of engineering consultancies. This study explores how organizational resources can pose constraints on the integration of ergonomics knowledge into engineering design projects in a business-driven setting, and how ergonomists cope with these resource constraints. An exploratory case study in an engineering consultancy was conducted. A total of 27 participants were interviewed. Data were collected through semi-structured interviews, observations, and documentary studies. Interviews were transcribed, coded, and categorized into themes. From the analysis, five overall themes emerged as major constituents of resource constraints: 1) maximizing project revenue, 2) payment for ergonomics services, 3) value of ergonomic services, 4) role of the client, and 5) coping strategies to overcome resource constraints. We hypothesize that resource constraints arose from sub-optimization of costs in design projects: the economic contribution of ergonomics measures was not evaluated over the entire life cycle of a designed workplace. Coping strategies included teaming up with engineering designers in the sales process or creating an alliance with ergonomists in the client organization.
Probing satellite galaxies in the Local Group by using FAST
NASA Astrophysics Data System (ADS)
Li, Jing; Wang, You-Gang; Kong, Min-Zhi; Wang, Jie; Chen, Xuelei; Guo, Rui
2018-01-01
The abundance of neutral hydrogen (HI) in satellite galaxies in the Local Group is important for studying the formation history of our Local Group. In this work, we generated mock HI satellite galaxies in the Local Group using the high mass-resolution hydrodynamic APOSTLE simulation. The simulated HI mass function agrees with the ALFALFA survey very well above 10^6 M_⊙, although there is a discrepancy below this scale because of the observed flux limit. After carefully checking various systematic effects in the observations, including fitting of line width, sky coverage, integration time and frequency drift due to uncertainty in a galaxy's distance, we predicted the abundance of HI in galaxies in a future survey that will be conducted by FAST. FAST has a larger aperture and higher sensitivity than the Arecibo telescope. We found that the HI mass function could be estimated well around 10^5 M_⊙ if the integration time is 40 minutes. Our results indicate that there are 61 HI satellites in the Local Group and 36 in the FAST field above 10^5 M_⊙. This estimation is one order of magnitude better than the current data and will put a strong constraint on the formation history of the Local Group. Also, more high-resolution simulated samples are needed to achieve this target.
Rate-gyro-integral constraint for ambiguity resolution in GNSS attitude determination applications.
Zhu, Jiancheng; Li, Tao; Wang, Jinling; Hu, Xiaoping; Wu, Meiping
2013-06-21
In the field of Global Navigation Satellite System (GNSS) attitude determination, the constraints usually play a critical role in resolving the unknown ambiguities quickly and correctly. Many constraints such as the baseline length, the geometry of multi-baselines and the horizontal attitude angles have been used extensively to improve the performance of ambiguity resolution. In the GNSS/Inertial Navigation System (INS) integrated attitude determination systems using low grade Inertial Measurement Unit (IMU), the initial heading parameters of the vehicle are usually worked out by the GNSS subsystem instead of by the IMU sensors independently. However, when a rotation occurs, the angle at which vehicle has turned within a short time span can be measured accurately by the IMU. This measurement will be treated as a constraint, namely the rate-gyro-integral constraint, which can aid the GNSS ambiguity resolution. We will use this constraint to filter the candidates in the ambiguity search stage. The ambiguity search space shrinks significantly with this constraint imposed during the rotation, thus it is helpful to speeding up the initialization of attitude parameters under dynamic circumstances. This paper will only study the applications of this new constraint to land vehicles. The impacts of measurement errors on the effect of this new constraint will be assessed for different grades of IMU and current average precision level of GNSS receivers. Simulations and experiments in urban areas have demonstrated the validity and efficacy of the new constraint in aiding GNSS attitude determinations.
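Schematically, the constraint acts as a filter over ambiguity candidates: each candidate implies a heading time series, and candidates whose heading change over the rotation disagrees with the gyro-integrated angle are discarded. The numbers and data layout below are illustrative placeholders, not the paper's algorithm.

```python
import numpy as np

gyro_delta_heading = 24.7            # deg, integrated from IMU rate gyros
tolerance = 1.0                      # deg, reflects gyro + GNSS noise

# Candidate integer ambiguities with the headings they imply at the
# start and end of the rotation (stand-ins for a real search tree).
candidates = [
    {"N": (1, -2), "head_t0": 310.2, "head_t1": 334.8},
    {"N": (2, -2), "head_t0": 301.5, "head_t1": 348.9},
    {"N": (1, -1), "head_t0": 310.1, "head_t1": 329.3},
]

def heading_change(c):
    # Wrap the difference into (-180, 180] degrees.
    return (c["head_t1"] - c["head_t0"] + 180.0) % 360.0 - 180.0

# The rate-gyro-integral constraint: keep only candidates whose implied
# heading change matches the gyro-integrated rotation angle.
survivors = [c for c in candidates
             if abs(heading_change(c) - gyro_delta_heading) < tolerance]
print("surviving candidates:", [c["N"] for c in survivors])
```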
Integrated Analysis of Airport Capacity and Environmental Constraints
NASA Technical Reports Server (NTRS)
Hasan, Shahab; Long, Dou; Hart, George; Eckhause, Jeremy; Hemm, Robert; Busick, Andrew; Graham, Michael; Thompson, Terry; Murphy, Charles; Poage, James
2010-01-01
LMI conducted an integrated analysis of airport capacity and environmental constraints, identifying and ranking the key factors limiting achievement of NextGen capacity goals. The primary metric used was projected throughput, which was estimated for the years 2015 and 2025 based on the unconstrained demand forecast from the Federal Aviation Administration and on planned improvements, including those proposed in the NextGen plan. A set of 310 critical airports was identified, collectively accounting for more than 99 percent of domestic air traffic volume; a one-off analytical approach was used to isolate the constraint being assessed. The study considered three capacity constraints (runway, taxiway, and gate) and three environmental constraints (fuel, NOx emissions, and noise). For the ten busiest airports, runway and noise are the primary and secondary constraints in both 2015 and 2025. For the OEP 35 airports, and overall for the remaining airports, the most binding constraint is noise. Six of the 10 busiest airports will face runway constraints in 2025, and 95 will face gate constraints. Nearly every airport will be subject to constraints on fuel and NOx emissions. Runway and taxiway constraints are more concentrated at the large airports; environmental constraints are present at almost every airport regardless of size.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carlsson, Mats; Johansson, Mikael; Larson, Jeffrey
Previous approaches for scheduling a league with round-robin and divisional tournaments involved decomposing the problem into easier subproblems. This approach, used to schedule the top Swedish handball league Elitserien, reduces the problem complexity but can result in suboptimal schedules. This paper presents an integrated constraint programming model that performs the scheduling in a single step. Particular attention is given to identifying implied and symmetry-breaking constraints that significantly reduce the computational complexity. The experimental evaluation shows that the integrated approach takes considerably less computational effort than the previous decomposition-based approach.
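For illustration, a single round robin can be generated directly with the classic circle method, and the "every pair meets exactly once" property that an integrated constraint model would enforce declaratively can be checked procedurally. This is only a toy sketch of the scheduling structure, not the paper's constraint programming model; the 14-team instance size mirrors a league of Elitserien's scale.

    def round_robin(n_teams):
        """Single round robin via the circle method: team 0 stays fixed while
        the others rotate, giving n-1 rounds for an even number of teams."""
        assert n_teams % 2 == 0
        teams = list(range(n_teams))
        rounds = []
        for _ in range(n_teams - 1):
            rounds.append([(teams[i], teams[n_teams - 1 - i])
                           for i in range(n_teams // 2)])
            teams = [teams[0]] + [teams[-1]] + teams[1:-1]   # rotate
        return rounds

    def check_meet_once(rounds, n_teams):
        """The 'every pair meets exactly once' constraint, checked afterwards."""
        seen = set()
        for pairs in rounds:
            for a, b in pairs:
                pair = frozenset((a, b))
                assert pair not in seen, "a pair meets twice"
                seen.add(pair)
        assert len(seen) == n_teams * (n_teams - 1) // 2

    schedule = round_robin(14)              # an Elitserien-sized league
    check_meet_once(schedule, 14)
    print(len(schedule), "rounds,", sum(len(r) for r in schedule), "pairings")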
Taylor, Sally; Allsop, Matthew J; Bekker, Hilary L; Bennett, Michael I; Bewick, Bridgette M
2017-07-01
Poor pain assessment is a barrier to effective pain control. There is growing interest internationally in the development and implementation of remote monitoring technologies to enhance assessment in cancer and chronic disease contexts. Findings describe the development and testing of pain monitoring systems, but research identifying the needs of health professionals to implement routine monitoring systems within clinical practice is limited. Aim: To inform the development and implementation strategy of an electronic pain monitoring system, PainCheck, by understanding palliative care professionals' needs when integrating PainCheck into routine clinical practice. Design: Qualitative study using face-to-face interviews. Data were analysed using framework analysis. Setting/participants: A purposive sample of health professionals managing the palliative care of patients living in the community. Results: A total of 15 interviews with health professionals took place. Three meta-themes emerged from the data: (1) uncertainties about integration of PainCheck and changes to current practice, (2) appraisal of current practice and (3) pain management is everybody's responsibility. Conclusion: Even the most sceptical of health professionals could see the potential benefits of implementing an electronic patient-reported pain monitoring system. Health professionals have reservations about how PainCheck would work in practice. For optimal use, PainCheck needs embedding within existing electronic health records. Electronic pain monitoring systems have the potential to enable professionals to support patients' pain management more effectively but only when barriers to implementation are appropriately identified and addressed.
42 CFR 455.19 - Provider's statement on check.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 42 Public Health 4 2010-10-01 Provider's statement on check. 455.19 Section 455.19 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS PROGRAM INTEGRITY: MEDICAID Medicaid Agency Fraud Detection and...
6 CFR 37.45 - Background checks for covered employees.
Code of Federal Regulations, 2012 CFR
2012-01-01
..., the validation of references from prior employment, a name-based and fingerprint-based criminal.... States must conduct a name-based and fingerprint-based criminal history records check (CHRC) using, at a minimum, the FBI's National Crime Information Center (NCIC) and the Integrated Automated Fingerprint...
6 CFR 37.45 - Background checks for covered employees.
Code of Federal Regulations, 2010 CFR
2010-01-01
..., the validation of references from prior employment, a name-based and fingerprint-based criminal.... States must conduct a name-based and fingerprint-based criminal history records check (CHRC) using, at a minimum, the FBI's National Crime Information Center (NCIC) and the Integrated Automated Fingerprint...
6 CFR 37.45 - Background checks for covered employees.
Code of Federal Regulations, 2013 CFR
2013-01-01
..., the validation of references from prior employment, a name-based and fingerprint-based criminal.... States must conduct a name-based and fingerprint-based criminal history records check (CHRC) using, at a minimum, the FBI's National Crime Information Center (NCIC) and the Integrated Automated Fingerprint...
6 CFR 37.45 - Background checks for covered employees.
Code of Federal Regulations, 2014 CFR
2014-01-01
..., the validation of references from prior employment, a name-based and fingerprint-based criminal.... States must conduct a name-based and fingerprint-based criminal history records check (CHRC) using, at a minimum, the FBI's National Crime Information Center (NCIC) and the Integrated Automated Fingerprint...
6 CFR 37.45 - Background checks for covered employees.
Code of Federal Regulations, 2011 CFR
2011-01-01
..., the validation of references from prior employment, a name-based and fingerprint-based criminal.... States must conduct a name-based and fingerprint-based criminal history records check (CHRC) using, at a minimum, the FBI's National Crime Information Center (NCIC) and the Integrated Automated Fingerprint...
SME2EM: Smart mobile end-to-end monitoring architecture for life-long diseases.
Serhani, Mohamed Adel; Menshawy, Mohamed El; Benharref, Abdelghani
2016-01-01
Monitoring life-long diseases requires continuous measurements and recording of physical vital signs. Most of these diseases are manifested through unexpected and non-uniform occurrences and behaviors. It is impractical to keep patients in hospitals, health-care institutions, or even at home for long periods of time. Monitoring solutions based on smartphones combined with mobile sensors and wireless communication technologies are a potential candidate to support complete mobility-freedom, not only for patients, but also for physicians. However, existing monitoring architectures based on smartphones and modern communication technologies are not suitable to address some challenging issues, such as intensive and big data, resource constraints, data integration, and context awareness in an integrated framework. This manuscript provides a novel mobile-based end-to-end architecture for live monitoring and visualization of life-long diseases. The proposed architecture provides smartness features to cope with continuous monitoring, data explosion, dynamic adaptation, unlimited mobility, and constrained device resources. The integration of the architecture's components provides information about disease recurrences as soon as they occur to expedite taking necessary actions, and thus prevent severe consequences. Our architecture is formally model-checked to automatically verify its correctness against designers' desirable properties at design time. Its components are fully implemented as Web services, in keeping with the SOA architecture, so as to be easy to deploy and integrate, and are supported by Cloud infrastructure and services to allow high scalability and availability of the processes and data being stored and exchanged. The architecture's applicability is evaluated through concrete experimental scenarios on monitoring and visualizing states of epileptic diseases. The obtained theoretical and experimental results are very promising and efficiently satisfy the proposed architecture's objectives, including resource awareness, smart data integration and visualization, cost reduction, and performance guarantee. Copyright © 2015 Elsevier Ltd. All rights reserved.
Trajectory Design Enhancements to Mitigate Risk for the Transiting Exoplanet Survey Satellite (TESS)
NASA Technical Reports Server (NTRS)
Dichmann, Donald; Parker, Joel; Nickel, Craig; Lutz, Stephen
2016-01-01
The Transiting Exoplanet Survey Satellite (TESS) will employ a highly eccentric Earth orbit, in 2:1 lunar resonance, which will be reached with a lunar flyby preceded by 3.5 phasing loops. The TESS mission has limited propellant and several constraints on the science orbit and on the phasing loops. Based on analysis and simulation, we have designed the phasing loops to reduce delta-V (DV) and to mitigate risk due to maneuver execution errors. We have automated the trajectory design process and use distributed processing to generate and optimize nominal trajectories, to check constraint satisfaction, and finally to model the effects of maneuver errors to identify trajectories that best meet the mission requirements.
The ANMLite Language and Logic for Specifying Planning Problems
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Siminiceanu, Radu I.; Munoz, Cesar A.
2007-01-01
We present the basic concepts of the ANMLite planning language. We discuss various aspects of specifying a plan in terms of constraints and checking the existence of a solution with the help of a model checker. The constructs of the ANMLite language have been kept as simple as possible in order to reduce complexity and simplify the verification problem. We illustrate the language with a specification of the space shuttle crew activity model that was constructed under the Spacecraft Autonomy for Vehicles and Habitats (SAVH) project. The main purpose of this study was to explore the implications of choosing a robust logic behind the specification of constraints, rather than simply proposing a new planning language.
NASA Technical Reports Server (NTRS)
Johnson, Charles S.
1986-01-01
Physical quantities using various units of measurement can be well represented in Ada by the use of abstract types. Computation involving these quantities (electric potential, mass, volume) can also automatically invoke the computation and checking of some of the implicitly associable attributes of measurements. Quantities can be held internally in SI units, transparently to the user, with automatic conversion. Through dimensional analysis, the type of the derived quantity resulting from a computation is known, thereby allowing dynamic checks of the equations used. The impact of the possible implementation of these techniques in integration and test applications is discussed. The overhead of computing and transporting measurement attributes is weighed against the advantages gained by their use. The construction of a run time interpreter using physical quantities in equations can be aided by the dynamic equation checks provided by dimensional analysis. The effects of high levels of abstraction on the generation and maintenance of software used in integration and test applications are also discussed.
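A rough Python analogue of the idea (the paper's implementation is in Ada, so everything below is an illustrative assumption): quantities carry a vector of SI dimension exponents, addition dynamically rejects incommensurate operands, and multiplication and division derive the dimensions of the result, which is the dimensional-analysis check described above.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Quantity:
        """A measurement carrying SI dimension exponents (m, kg, s, A)."""
        value: float
        dims: tuple

        def __add__(self, other):
            # addition is defined only for commensurate quantities
            if self.dims != other.dims:
                raise TypeError(f"incompatible dimensions {self.dims} vs {other.dims}")
            return Quantity(self.value + other.value, self.dims)

        def __mul__(self, other):
            # dimensional analysis: the result's type is derived automatically
            return Quantity(self.value * other.value,
                            tuple(a + b for a, b in zip(self.dims, other.dims)))

        def __truediv__(self, other):
            return Quantity(self.value / other.value,
                            tuple(a - b for a, b in zip(self.dims, other.dims)))

    length = Quantity(3.0, (1, 0, 0, 0))    # metres
    time = Quantity(2.0, (0, 0, 1, 0))      # seconds
    mass = Quantity(5.0, (0, 1, 0, 0))      # kilograms

    momentum = mass * (length / time)
    print(momentum.dims)                    # (1, 1, -1, 0), i.e. kg*m/s
    try:
        length + mass                       # dynamically rejected
    except TypeError as err:
        print("caught:", err)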
Active Diagnosis of Navy Machinery Rev 2.0
2016-10-01
electrical distribution and potable water supply systems. Because of these dependencies, ship auxiliary system failures can cause combat load failure...buildup generally causes a pipe to disconnect from a junction, causing water to leak. This limits the faults that are testable, since many of the faults...pipes, junctions, pumps, flow meters, thermal loads, check valves, and water tanks. Each agent is responsible for maintaining its constraints locally
NASA Technical Reports Server (NTRS)
Dunkley, J.; Komatsu, E.; Nolta, M.R.; Spergel, D.N.; Larson, D.; Hinshaw, G.; Page, L.; Bennett, C.L.; Gold, B.; Jarosik, N.;
2008-01-01
The Wilkinson Microwave Anisotropy Probe (WMAP), launched in 2001, has mapped out the Cosmic Microwave Background with unprecedented accuracy over the whole sky. Its observations have led to the establishment of a simple concordance cosmological model for the contents and evolution of the universe, consistent with virtually all other astronomical measurements. The WMAP first-year and three-year data have allowed us to place strong constraints on the parameters describing the ΛCDM model, a flat universe filled with baryons, cold dark matter, neutrinos, and a cosmological constant, with initial fluctuations described by nearly scale-invariant power law fluctuations, as well as placing limits on extensions to this simple model (Spergel et al. 2003, 2007). With all-sky measurements of the polarization anisotropy (Kogut et al. 2003; Page et al. 2007), two orders of magnitude smaller than the intensity fluctuations, WMAP has not only given us an additional picture of the universe as it transitioned from ionized to neutral at redshift z approx. 1100, but also an observation of the later reionization of the universe by the first stars. In this paper we present cosmological constraints from WMAP alone, for both the ΛCDM model and a set of possible extensions. We also consider the consistency of WMAP constraints with other recent astronomical observations. This is one of seven five-year WMAP papers. Hinshaw et al. (2008) describe the data processing and basic results. Hill et al. (2008) present new beam models and window functions, Gold et al. (2008) describe the emission from Galactic foregrounds, and Wright et al. (2008) the emission from extra-Galactic point sources. The angular power spectra are described in Nolta et al. (2008), and Komatsu et al. (2008) present and interpret cosmological constraints based on combining WMAP with other data. WMAP observations are used to produce full-sky maps of the CMB in five frequency bands centered at 23, 33, 41, 61, and 94 GHz (Hinshaw et al. 2008). With five years of data, we are now able to place better limits on the ΛCDM model, as well as to move beyond it to test the composition of the universe, details of reionization, sub-dominant components, characteristics of inflation, and primordial fluctuations. We have more than doubled the amount of polarized data used for cosmological analysis, allowing a better measure of the large-scale E-mode signal (Nolta et al. 2008). To this end we describe an alternative way to remove Galactic foregrounds from low resolution polarization maps in which Galactic emission is marginalized over, providing a cross-check of our results. With longer integration we also better probe the second and third acoustic peaks in the temperature angular power spectrum, and have many more year-to-year difference maps available for cross-checking systematic effects (Hinshaw et al. 2008).
NASA Technical Reports Server (NTRS)
Mitra, Debasis; Thomas, Ajai; Hemminger, Joseph; Sakowski, Barbara
2001-01-01
In this research we have developed an algorithm for constraint processing that utilizes relational algebraic operators. Van Beek and others have previously investigated this type of constraint processing from within a relational algebraic framework, producing some unique results. Apart from providing new theoretical angles, this approach also gives the opportunity to use existing efficient implementations of relational database management systems as the underlying data structures for any relevant algorithm. Our algorithm enhances that framework. The algorithm is quite general in its current form. Weak heuristics (like forward checking) developed within the constraint satisfaction problem (CSP) area could also easily be plugged into this algorithm for further efficiency enhancements. The algorithm as developed here is targeted toward a component-oriented modeling problem that we are currently working on, namely the problem of interactive modeling for batch-simulation of engineering systems (IMBSES). However, it could be adapted for many other CSP problems as well. The research addresses the algorithm and many aspects of the IMBSES problem that we are currently handling.
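As a concrete illustration of the forward-checking heuristic mentioned above (a generic CSP sketch, not the paper's relational-algebraic algorithm): after each tentative assignment, the domains of unassigned variables are pruned against the binary constraints, and the search backtracks as soon as a domain empties.

    def solve(domains, constraints):
        """Backtracking search with forward checking for a binary CSP.

        domains     -- dict: variable -> list of candidate values
        constraints -- dict: (x, y) -> predicate(value_of_x, value_of_y)
        Returns a complete assignment dict, or None if none exists."""
        def consistent(x, vx, y, vy):
            for (a, b), pred in constraints.items():
                if (a, b) == (x, y) and not pred(vx, vy):
                    return False
                if (a, b) == (y, x) and not pred(vy, vx):
                    return False
            return True

        def backtrack(assignment, domains):
            if len(assignment) == len(domains):
                return dict(assignment)
            var = next(v for v in domains if v not in assignment)
            for value in domains[var]:
                assignment[var] = value
                # forward checking: prune unassigned variables' domains
                # against the assignment just made
                pruned = {v: [w for w in dom
                              if v in assignment or consistent(var, value, v, w)]
                          for v, dom in domains.items()}
                if all(pruned[v] for v in pruned if v not in assignment):
                    result = backtrack(assignment, pruned)
                    if result is not None:
                        return result
                del assignment[var]
            return None

        return backtrack({}, domains)

    # Toy use: colour a triangle with three colours, adjacent vertices differ.
    neq = lambda a, b: a != b
    print(solve({"A": [1, 2, 3], "B": [1, 2, 3], "C": [1, 2, 3]},
                {("A", "B"): neq, ("B", "C"): neq, ("A", "C"): neq}))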
NASA Astrophysics Data System (ADS)
Manukure, Solomon
2018-04-01
We construct finite-dimensional Hamiltonian systems by means of symmetry constraints from the Lax pairs and adjoint Lax pairs of a bi-Hamiltonian hierarchy of soliton equations associated with the 3-dimensional special linear Lie algebra, and discuss the Liouville integrability of these systems based on the existence of sufficiently many integrals of motion.
TARA: Tool Assisted Requirements Analysis
1988-05-01
provided during the project and to aid tool integration. Chapter 6 provides a brief discussion of the experience of specifying the ASET case study in CORE...set of Prolog clauses. This includes the context-free grammar rules depicted in Figure 2.1, integrity constraints such as those defining the binding...Jeremaes (1986). This was developed originally for specifying database management semantics (for example, the preservation of integrity constraints
Open groups of constraints. Integrating arbitrary involutions
NASA Astrophysics Data System (ADS)
Batalin, Igor; Marnelius, Robert
1998-11-01
A new type of quantum master equation is presented which is expressed in terms of a recently introduced quantum antibracket. The equation involves only two operators: an extended nilpotent BFV-BRST charge and an extended ghost charge. It is proposed to determine the generalized quantum Maurer-Cartan equations for arbitrary open groups. These groups are the integration of constraints in arbitrary involutions. The only condition for this is that the constraint operators may be embedded in an odd nilpotent operator, the BFV-BRST charge. The proposal is verified at the quasigroup level. The integration formulas are also used to construct a generating operator for quantum antibrackets of operators in arbitrary involutions.
Nurses' barriers to learning: an integrative review.
Santos, Marion C
2012-07-01
This integrative review of the literature describes nurses' barriers to learning. Five major themes emerged: time constraints, financial constraints, workplace culture, access/relevance, and competency in accessing electronic evidence-based practice literature. The nurse educator must address these barriers for the staff to achieve learning and competency.
42 CFR 424.518 - Screening levels for Medicare providers and suppliers.
Code of Federal Regulations, 2012 CFR
2012-10-01
... this section. (ii)(A) Requires the submission of a set of fingerprints for a national background check... provider or supplier; and (B) Conducts a fingerprint-based criminal history record check of the Federal Bureau of Investigation's Integrated Automated Fingerprint Identification System on all individuals who...
42 CFR 424.518 - Screening levels for Medicare providers and suppliers.
Code of Federal Regulations, 2014 CFR
2014-10-01
... this section. (ii)(A) Requires the submission of a set of fingerprints for a national background check... provider or supplier; and (B) Conducts a fingerprint-based criminal history record check of the Federal Bureau of Investigation's Integrated Automated Fingerprint Identification System on all individuals who...
42 CFR 424.518 - Screening levels for Medicare providers and suppliers.
Code of Federal Regulations, 2011 CFR
2011-10-01
...) Requires the submission of a set of fingerprints for a national background check from all individuals who...) Conducts a fingerprint-based criminal history record check of the Federal Bureau of Investigation's Integrated Automated Fingerprint Identification System on all individuals who maintain a 5 percent or greater...
42 CFR 424.518 - Screening levels for Medicare providers and suppliers.
Code of Federal Regulations, 2013 CFR
2013-10-01
... this section. (ii)(A) Requires the submission of a set of fingerprints for a national background check... provider or supplier; and (B) Conducts a fingerprint-based criminal history record check of the Federal Bureau of Investigation's Integrated Automated Fingerprint Identification System on all individuals who...
OPTIMAL NETWORK TOPOLOGY DESIGN
NASA Technical Reports Server (NTRS)
Yuen, J. H.
1994-01-01
This program was developed as part of a research study on the topology design and performance analysis for the Space Station Information System (SSIS) network. It uses an efficient algorithm to generate candidate network designs (consisting of subsets of the set of all network components) in increasing order of their total costs, and checks each design to see if it forms an acceptable network. This technique gives the true cost-optimal network, and is particularly useful when the network has many constraints and not too many components. It is intended that this new design technique consider all important performance measures explicitly and take into account the constraints due to various technical feasibilities. In the current program, technical constraints are taken care of by the user properly forming the starting set of candidate components (e.g. nonfeasible links are not included). As subsets are generated, they are tested to see if they form an acceptable network by checking that all requirements are satisfied. Thus the first acceptable subset encountered gives the cost-optimal topology satisfying all given constraints. The user must sort the set of "feasible" link elements in increasing order of their costs. The program prompts the user for the following information for each link: 1) cost, 2) connectivity (number of stations connected by the link), and 3) the stations connected by that link. Unless instructed to stop, the program generates all possible acceptable networks in increasing order of their total costs. The program is written only to generate topologies that are simply connected. Tests on reliability, delay, and other performance measures are discussed in the documentation, but have not been incorporated into the program. This program is written in PASCAL for interactive execution and has been implemented on an IBM PC series computer operating under PC DOS. The disk contains source code only. This program was developed in 1985.
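A compact sketch of the same generate-and-test strategy (in Python rather than the program's PASCAL, with invented link data): subsets of links are enumerated in increasing total cost using a heap over sorted costs, and the first subset that connects all stations is, by construction, the cost-optimal acceptable topology.

    import heapq

    def first_acceptable(links, stations):
        """Enumerate link subsets in increasing total cost and return the first
        that connects all stations, i.e. the cost-optimal topology.

        links -- list of (cost, station_a, station_b)"""
        links = sorted(links)               # user-sorted in the original program

        def connects_all(subset):
            parent = {s: s for s in stations}
            def find(x):
                while parent[x] != x:
                    parent[x] = parent[parent[x]]
                    x = parent[x]
                return x
            for i in subset:
                _, a, b = links[i]
                parent[find(a)] = find(b)
            return len({find(s) for s in stations}) == 1

        heap = [(links[0][0], (0,))]        # (total cost, chosen link indices)
        while heap:
            cost, subset = heapq.heappop(heap)
            if connects_all(subset):
                return cost, [links[i] for i in subset]
            last = subset[-1]
            if last + 1 < len(links):
                # extend the subset, or swap its last link for the next one
                heapq.heappush(heap, (cost + links[last + 1][0],
                                      subset + (last + 1,)))
                heapq.heappush(heap, (cost - links[last][0] + links[last + 1][0],
                                      subset[:-1] + (last + 1,)))
        return None

    links = [(1, "A", "B"), (2, "B", "C"), (2, "A", "C"), (5, "C", "D")]
    print(first_acceptable(links, {"A", "B", "C", "D"}))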
Testing Instrument for Flight-Simulator Displays
NASA Technical Reports Server (NTRS)
Haines, Richard F.
1987-01-01
Displays for flight-training simulators rapidly aligned with aid of integrated optical instrument. Calibrations and tests such as aligning boresight of display with respect to user's eyes, checking and adjusting display horizon, checking image sharpness, measuring illuminance of displayed scenes, and measuring distance of optical focus of scene performed with single unit. New instrument combines all measurement devices in single, compact, integrated unit. Requires just one initial setup. Employs laser and produces narrow, collimated beam for greater measurement accuracy. Uses only one moving part, double right prism, to position laser beam.
Blakes, Jonathan; Twycross, Jamie; Romero-Campero, Francisco Jose; Krasnogor, Natalio
2011-12-01
The Infobiotics Workbench is an integrated software suite incorporating model specification, simulation, parameter optimization and model checking for Systems and Synthetic Biology. A modular model specification allows for straightforward creation of large-scale models containing many compartments and reactions. Models are simulated either using stochastic simulation or numerical integration, and visualized in time and space. Model parameters and structure can be optimized with evolutionary algorithms, and model properties calculated using probabilistic model checking. Source code and binaries for Linux, Mac and Windows are available at http://www.infobiotics.org/infobiotics-workbench/; released under the GNU General Public License (GPL) version 3. Natalio.Krasnogor@nottingham.ac.uk.
Catuzzo, P; Zenone, F; Aimonetto, S; Peruzzo, A; Casanova Borca, V; Pasquino, M; Franco, P; La Porta, M R; Ricardi, U; Tofani, S
2012-07-01
To investigate the feasibility of implementing a novel approach for patient-specific QA of TomoDirect(TM) whole breast treatment. The most commonly used TomoTherapy DQA method, consisting of the verification of the 2D dose distribution in a coronal or sagittal plane of the Cheese Phantom by means of gafchromic films, was compared with an alternative approach based on the use of two commercially available diode arrays, MapCHECK2(TM) and ArcCHECK(TM). The TomoDirect(TM) plans of twenty patients with a primary unilateral breast cancer were applied to a CT scan of the Cheese Phantom and a MVCT dataset of the diode arrays. Measurements of the 2D dose distribution were then performed and compared with the calculated ones using the gamma analysis method with different sets of DTA and DD criteria (3%-3 mm, 3%-2 mm). The sensitivity of the diode arrays in detecting delivery and setup errors was also investigated. The measured dose distributions showed excellent agreement with the TPS calculations for each detector, with averaged fractions of passed Γ values greater than 95%. The percentage of points satisfying the constraint Γ < 1 was significantly higher for MapCHECK2(TM) than for ArcCHECK(TM) and gafchromic films using both the 3%-3 mm and 3%-2 mm gamma criteria. Both diode arrays show a good sensitivity to delivery and setup errors using a 3%-2 mm gamma criterion. MapCHECK2(TM) and ArcCHECK(TM) may fulfill the demands of an adequate system for TomoDirect(TM) patient-specific QA.
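For readers unfamiliar with the Γ (gamma) criterion used above, a minimal one-dimensional sketch follows; clinical gamma analysis is performed in 2D/3D with interpolation, so this is only an illustration of the metric, with assumed profile data. Each measured point receives the minimum combined dose-difference/distance-to-agreement score against the reference, and the pass rate counts points with γ < 1.

    import math

    def gamma_index(ref_pos, ref_dose, meas_pos, meas_dose, dd=0.03, dta=3.0):
        """1-D global gamma analysis: dd is the dose-difference criterion as a
        fraction of the reference maximum; dta is distance-to-agreement in mm."""
        d_max = max(ref_dose)
        return [min(math.sqrt(((xr - xm) / dta) ** 2
                              + ((dr - dm) / (dd * d_max)) ** 2)
                    for xr, dr in zip(ref_pos, ref_dose))
                for xm, dm in zip(meas_pos, meas_dose)]

    def pass_rate(gammas):
        """Fraction of points satisfying the usual constraint gamma < 1."""
        return sum(g < 1.0 for g in gammas) / len(gammas)

    pos = [0.5 * i for i in range(21)]                       # 0..10 mm grid
    ref = [math.exp(-((x - 5.0) ** 2) / 8.0) for x in pos]   # reference profile
    meas = [1.02 * d for d in ref]                           # 2% hot delivery
    print(f"pass rate (3%/3 mm): {pass_rate(gamma_index(pos, ref, pos, meas)):.2f}")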
The Development and Implementation of Outdoor-Based Secondary School Integrated Programs
ERIC Educational Resources Information Center
Comishin, Kelly; Dyment, Janet E.; Potter, Tom G.; Russell, Constance L.
2004-01-01
Four teachers share the challenges they faced when creating and running outdoor-focused secondary school integrated programs in British Columbia, Canada. The five most common challenges were funding constraints, insufficient support from administrators and colleagues, time constraints, liability and risk management, and inadequate skills and…
Autonomic Recovery: HyperCheck: A Hardware-Assisted Integrity Monitor
2013-08-01
system (OS). HyperCheck leverages the CPU System Management Mode (SMM), present in x86 systems, to securely generate and transmit the full state of the...HyperCheck harnesses the CPU System Management Mode (SMM), which is present in all x86 commodity systems, to create a snapshot view of the current state of the...protect the software above it. Our assumptions are that the attacker does not have physical access to the machine and that the SMM BIOS is locked and
Including Overweight or Obese Students in Physical Education: A Social Ecological Constraint Model
ERIC Educational Resources Information Center
Li, Weidong; Rukavina, Paul
2012-01-01
In this review, we propose a social ecological constraint model to study inclusion of overweight or obese students in physical education by integrating key concepts and assumptions from ecological constraint theory in motor development and social ecological models in health promotion and behavior. The social ecological constraint model proposes…
Zhang, Wei; Zhang, Gengxin; Dong, Feihong; Xie, Zhidong; Bian, Dongming
2015-01-01
This article investigates the capacity problem of an integrated remote wireless sensor and satellite network (IWSSN) in emergency scenarios. We formulate a general model to evaluate the remote sensor and satellite network capacity. Compared to most existing works for ground networks, the proposed model is time varying and space oriented. To capture the characteristics of a practical network, we sift through major capacity-impacting constraints and analyze the influence of these constraints. Specifically, we combine the geometric satellite orbit model and satellite tool kit (STK) engineering software to quantify the trends of the capacity constraints. Our objective in analyzing these trends is to provide insights and design guidelines for optimizing the integrated remote wireless sensor and satellite network schedules. Simulation results validate the theoretical analysis of capacity trends and show the optimization opportunities of the IWSSN. PMID:26593919
Zhang, Wei; Zhang, Gengxin; Dong, Feihong; Xie, Zhidong; Bian, Dongming
2015-11-17
This article investigates the capacity problem of an integrated remote wireless sensor and satellite network (IWSSN) in emergency scenarios. We formulate a general model to evaluate the remote sensor and satellite network capacity. Compared to most existing works for ground networks, the proposed model is time varying and space oriented. To capture the characteristics of a practical network, we sift through major capacity-impacting constraints and analyze the influence of these constraints. Specifically, we combine the geometric satellite orbit model and satellite tool kit (STK) engineering software to quantify the trends of the capacity constraints. Our objective in analyzing these trends is to provide insights and design guidelines for optimizing the integrated remote wireless sensor and satellite network schedules. Simulation results validate the theoretical analysis of capacity trends and show the optimization opportunities of the IWSSN.
Lambrecht, Maarten; Eekers, Daniëlle B P; Alapetite, Claire; Burnet, Neil G; Calugaru, Valentin; Coremans, Ida E M; Fossati, Piero; Høyer, Morten; Langendijk, Johannes A; Romero, Alejandra Méndez; Paulsen, Frank; Perpar, Ana; Renard, Laurette; de Ruysscher, Dirk; Timmermann, Beate; Vitek, Pavel; Weber, Damien C; van der Weide, Hiske L; Whitfield, Gillian A; Wiggenraad, Ruud; Roelofs, Erik; Nyström, Petra Witt; Troost, Esther G C
2018-05-17
For unbiased comparison of different radiation modalities and techniques, consensus on the delineation of radiation sensitive organs at risk (OARs) and on their dose constraints is warranted. Following the publication of a digital, online atlas for OAR delineation in neuro-oncology by the same group, we assessed the brain OAR dose constraints in a follow-up study. We performed a comprehensive search to identify the current papers on OAR dose constraints for normofractionated photon and particle therapy in PubMed, Ovid Medline, Cochrane Library, Embase and Web of Science. Moreover, the included articles' reference lists were cross-checked for potential studies that met the inclusion criteria. Consensus was reached among 20 radiation oncology experts in the field of neuro-oncology. For the OARs published in the neuro-oncology literature, we summarized the available literature and recommended dose constraints associated with certain levels of normal tissue complication probability (NTCP) according to the recent ICRU recommendations. For those OARs for which NTCP data are lacking or insufficient, a proposal for effective and efficient data collection is given. The use of the European Particle Therapy Network consensus OAR dose constraints summarized in this article is recommended for the model-based approach comparing photon and proton beam irradiation as well as for prospective clinical trials including novel radiation techniques and/or modalities. Copyright © 2018 Elsevier B.V. All rights reserved.
Discrete Event Simulation-Based Resource Modelling in Health Technology Assessment.
Salleh, Syed; Thokala, Praveen; Brennan, Alan; Hughes, Ruby; Dixon, Simon
2017-10-01
The objective of this article was to conduct a systematic review of published research on the use of discrete event simulation (DES) for resource modelling (RM) in health technology assessment (HTA). RM is broadly defined as incorporating and measuring the effects of constraints on physical resources (e.g. beds, doctors, nurses) in HTA models. Systematic literature searches were conducted in academic databases (JSTOR, SAGE, SPRINGER, SCOPUS, IEEE, Science Direct, PubMed, EMBASE) and grey literature (Google Scholar, NHS journal library), enhanced by manual searches (i.e. reference list checking, citation searching and hand-searching techniques). The search strategy yielded 4117 potentially relevant citations. Following the screening and manual searches, ten articles were included. Reviewing these articles provided insights into the applications of RM: firstly, different types of economic analyses, model settings, RM and cost-effectiveness analysis (CEA) outcomes were identified. Secondly, variation in the characteristics of the constraints, such as the types and nature of the constraints and the sources of data for the constraints, was identified. Thirdly, it was found that including the effects of constraints caused the CEA results to change in these articles. The review found that DES proved to be an effective technique for RM, but there were only a small number of studies applied in HTA. However, these studies showed the important consequences of modelling physical constraints and point to the need for a framework to be developed to guide future applications of this approach.
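A minimal example of DES-based resource modelling in this spirit (illustrative parameters, not drawn from any reviewed study): patients queue for a fixed number of beds, and the mean waiting time induced by the capacity constraint falls out of the event-by-event simulation.

    import heapq, random

    def simulate(n_patients, n_beds, mean_interarrival=1.0, mean_stay=3.0, seed=1):
        """Minimal DES of a bed-constrained ward: FIFO patients wait for one of
        n_beds; returns the mean waiting time caused by the capacity constraint."""
        rng = random.Random(seed)
        arrivals, t = [], 0.0
        for _ in range(n_patients):
            t += rng.expovariate(1.0 / mean_interarrival)
            arrivals.append(t)
        free_at = [0.0] * n_beds                # next time each bed becomes free
        heapq.heapify(free_at)
        waits = []
        for arrive in arrivals:
            bed_free = heapq.heappop(free_at)   # earliest-available bed
            start = max(arrive, bed_free)       # wait whenever no bed is free
            waits.append(start - arrive)
            heapq.heappush(free_at, start + rng.expovariate(1.0 / mean_stay))
        return sum(waits) / len(waits)

    print(simulate(n_patients=10000, n_beds=4))   # vary n_beds to see the effect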
Opto-Electronically Efficient Conjugated Polymers by Stress-Induced Molecular Constraints
2012-07-15
TEM, JEOL JEM-2010) and checked by weight losses obtained from the thermogravimetric scans (TGA, Perkin-Elmer).[49-55] Scheme 1. Grafting P3HT...further analysis of the conduction pathways, e.g., the linear resistance networks,[40] but even without it, the jump frequency is predicted to...Nanocomposites: CNT Surface grafting, p-p interactions, and Gold Nanoparticles adsorption effect, Mater Thesis, Department of Materials Science and Engineering
Landfill site selection by using geographic information systems
NASA Astrophysics Data System (ADS)
Şener, Başak; Süzen, M. Lütfi; Doyuran, Vedat
2006-01-01
One of the serious and growing potential problems in most large urban areas is the shortage of land for waste disposal. Although there are some efforts to reduce and recover the waste, disposal in landfills is still the most common method for waste destination. An inappropriate landfill site may have negative environmental, economic and ecological impacts. Therefore, it should be selected carefully, considering both regulations and constraints on other sources. In this study, candidate sites for an appropriate landfill area in the vicinity of Ankara are determined using the integration of geographic information systems (GIS) and multicriteria decision analysis (MCDA). For this purpose, 16 input map layers are prepared, including topography, settlements (urban centers and villages), roads (Highway E90 and village roads), railways, airport, wetlands, infrastructure (pipelines and power lines), slope, geology, land use, floodplains, aquifers and surface water, and two different MCDA methods (simple additive weighting and analytic hierarchy process) are implemented within the GIS. Comparison of the maps produced by the two methods shows that both yield conformable results. Field checks also confirm that the candidate sites agree well with the selected criteria.
Astronaut Joseph Tanner checks gloves during launch/entry training
NASA Technical Reports Server (NTRS)
1994-01-01
Astronaut Joseph R. Tanner, mission specialist, checks his gloves during a rehearsal for the launch and entry phases of the scheduled November 1994 flight of STS-66. This rehearsal, held in the crew compartment trainer (CCT) of JSC's Shuttle mockup and integration laboratory, was followed by a training session on emergency egress procedures.
Ice hockey shoulder pad design and the effect on head response during shoulder-to-head impacts.
Richards, Darrin; Ivarsson, B Johan; Scher, Irving; Hoover, Ryan; Rodowicz, Kathleen; Cripton, Peter
2016-11-01
Ice hockey body checks involving direct shoulder-to-head contact frequently result in head injury. In the current study, we examined the effect of shoulder pad style on the likelihood of head injury from a shoulder-to-head check. Shoulder-to-head body checks were simulated by swinging a modified Hybrid-III anthropomorphic test device (ATD) with and without shoulder pads into a stationary Hybrid-III ATD at 21 km/h. Tests were conducted with three different styles of shoulder pads (traditional, integrated and tethered) and without shoulder pads for the purpose of control. Head response kinematics for the stationary ATD were measured. Compared to the case of no shoulder pads, the three different pad styles significantly (p < 0.05) reduced peak resultant linear head accelerations of the stationary ATD by 35-56%. The integrated shoulder pads reduced linear head accelerations by an additional 18-21% beyond the other two styles of shoulder pads. The data presented here suggest that shoulder pads can be designed to help protect the head of the struck player in a shoulder-to-head check.
Yang, Su
2005-02-01
A new descriptor for symbol recognition is proposed. 1) A histogram is constructed for every pixel to capture the distribution of the constraints between that pixel and the other pixels. 2) All the histograms are statistically integrated to form a feature vector of fixed dimension. The descriptor's robustness and invariance were experimentally confirmed.
Thermal-Aware Test Access Mechanism and Wrapper Design Optimization for System-on-Chips
NASA Astrophysics Data System (ADS)
Yu, Thomas Edison; Yoneda, Tomokazu; Chakrabarty, Krishnendu; Fujiwara, Hideo
Rapid advances in semiconductor manufacturing technology have led to higher chip power densities, which places greater emphasis on packaging and temperature control during testing. For system-on-chips, peak power-based scheduling algorithms have been used to optimize tests under specified power constraints. However, imposing power constraints does not always solve the problem of overheating due to the non-uniform distribution of power across the chip. This paper presents a TAM/Wrapper co-design methodology for system-on-chips that ensures thermal safety while still optimizing the test schedule. The method combines a simplified thermal-cost model with a traditional bin-packing algorithm to minimize test time while satisfying temperature constraints. Furthermore, for temperature checking, thermal simulation is done using cycle-accurate power profiles for more realistic results. Experiments show that even a minimal sacrifice in test time can yield a considerable decrease in test temperature as well as the possibility of further lowering temperatures beyond those achieved using traditional power-based test scheduling.
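A simplified sketch of the bin-packing view described above (the paper's co-design algorithm and thermal-cost model are more elaborate; the names and the scalar thermal limit are assumptions): core tests are grouped into concurrent sessions so that each session's summed thermal cost stays under the limit, with the session count standing in for test time.

    def thermal_aware_schedule(tests, thermal_limit):
        """First-fit-decreasing sketch: group core tests into concurrent
        sessions so each session's summed thermal cost stays under the limit;
        fewer sessions is the stand-in for shorter total test time."""
        sessions = []                       # each entry: [thermal load, names]
        for name, cost in sorted(tests, key=lambda t: -t[1]):
            for session in sessions:
                if session[0] + cost <= thermal_limit:
                    session[0] += cost
                    session[1].append(name)
                    break
            else:
                sessions.append([cost, [name]])
        return [names for _, names in sessions]

    tests = [("core_a", 4.0), ("core_b", 3.5), ("core_c", 2.0), ("core_d", 1.5)]
    print(thermal_aware_schedule(tests, thermal_limit=5.0))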
Wilhelm, Leonie; Hartmann, Andrea S; Becker, Julia C; Kişi, Melahat; Waldorf, Manuel; Vocks, Silja
2018-02-21
Although Islam is the fastest growing religion worldwide, only a few studies have investigated body image in Muslim women, and no study has investigated body checking. Therefore, the present study examined whether body image, body checking, and disordered eating differ between veiled and unveiled Muslim women, Christian women, and atheist women. While the groups did not differ regarding body dissatisfaction, unveiled Muslim women reported more checking than veiled Muslim and Christian women, and higher bulimia scores than Christian women. Thus, prevention of eating disorders should include all women, irrespective of religious affiliation or veiling, with a particular focus on unveiled Muslim women.
Sehr, Christiana; Kremling, Andreas; Marin-Sanguino, Alberto
2015-10-16
During the last 10 years, systems biology has matured from a fuzzy concept combining omics, mathematical modeling and computers into a scientific field in its own right. In spite of its incredible potential, the multilevel complexity of its objects of study makes it very difficult to establish a reliable connection between data and models. The great number of degrees of freedom often results in situations where many different models can explain/fit all available datasets. This has resulted in a shift of paradigm from the initially dominant, maybe naive, idea of inferring the system from a number of datasets to the application of different techniques that reduce the degrees of freedom before any dataset is analyzed. There is a wide variety of techniques available, each of which can contribute a piece of the puzzle and include different kinds of experimental information. But the challenge that remains is their meaningful integration. Here we show some theoretical results that enable some of the main modeling approaches to be applied sequentially in a complementary manner, and how this workflow can benefit from evolutionary reasoning to keep the complexity of the problem in check. As a proof of concept, we show how the synergies between these modeling techniques can provide insight into some well studied problems: ammonia assimilation in bacteria and an unbranched linear pathway with end-product inhibition.
Small Autonomous Aircraft Servo Health Monitoring
NASA Technical Reports Server (NTRS)
Quintero, Steven
2008-01-01
Small air vehicles offer challenging power, weight, and volume constraints when considering implementation of system health monitoring technologies. In order to develop a testbed for monitoring the health and integrity of control surface servos and linkages, the Autonomous Aircraft Servo Health Monitoring system has been designed for small Uninhabited Aerial Vehicle (UAV) platforms to detect problematic behavior from servos and the aircraft structures they control. This system will serve to verify the structural integrity of an aircraft's servos and linkages and thereby, through early detection of a problematic situation, minimize the chances of an aircraft accident. Embry-Riddle Aeronautical University's rotary-winged UAV has an Airborne Power Management unit that is responsible for regulating, distributing, and monitoring the power supplied to the UAV's avionics. The current sensing technology utilized by the Airborne Power Management system is also the basis for the Servo Health system. The Servo Health system measures the current draw of the servos while the servos are in motion in order to quantify servo health. During a preflight check, deviations from a known baseline behavior can be logged and their causes found upon closer inspection of the aircraft. The erratic behavior may include binding as a result of dirt buildup or backlash caused by looseness in the mechanical linkages. Moreover, the Servo Health system will allow elusive problems to be identified and preventative measures taken to avoid unnecessary hazardous conditions in small autonomous aircraft.
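A minimal sketch of the baseline-comparison step (hypothetical traces and tolerance, not the flight code): the in-motion current draw is compared sample-by-sample against a known-good preflight baseline, and deviations beyond a relative tolerance are flagged for inspection.

    def servo_health_check(measured, baseline, rel_tol=0.15):
        """Flag in-motion current samples deviating from a known-good preflight
        baseline by more than rel_tol (binding raises draw, backlash lowers it).
        Traces are assumed time-aligned and equally sampled."""
        return [(i, m, b)
                for i, (m, b) in enumerate(zip(measured, baseline))
                if abs(m - b) > rel_tol * max(abs(b), 1e-9)]

    baseline = [0.20, 0.35, 0.50, 0.35, 0.20]   # amps over a sweep (illustrative)
    measured = [0.21, 0.34, 0.71, 0.36, 0.19]   # mid-sweep spike suggests binding
    print(servo_health_check(measured, baseline))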
Use of standard vocabulary services in validation of water resources data
NASA Astrophysics Data System (ADS)
Yu, Jonathan; Cox, Simon; Ratcliffe, David
2010-05-01
Ontology repositories are increasingly being exposed through vocabulary and concept services. Primarily this is in support of resource discovery. Thesaurus functionality and even more sophisticated reasoning offer the possibility of overcoming the limitations of the simple text-matching and tagging on which most search is based. However, controlled vocabularies have other important roles in distributed systems, in particular in constraining content validity. A national water information system established by the Australian Bureau of Meteorology ('the Bureau') includes a system for ingestion of data from multiple providers. This uses an http interface onto separately maintained vocabulary services as part of the quality assurance chain. With over 200 data providers potentially transferring data to the Bureau, a standard XML-based Water Data Transfer Format (WDTF) was developed for receipt of data into an integrated national water information system. The WDTF schema was built upon standards from the Open Geospatial Consortium (OGC). The structure and syntax specified by a W3C XML Schema are complemented by additional constraints described using Schematron. These implement important content requirements and business rules, including:
• Restricted cardinality: optional elements and attributes inherited from the base standards become mandatory in the application, or repeatable elements or attributes are limited to one or omitted. For example, the sampledFeature element from O&M is optional but is mandatory for a samplingPoint element in WDTF.
• Vocabulary checking: WDTF data use seventeen vocabularies or code lists derived from Regulations under the Commonwealth Water Act 2007. Examples of code lists are the Australian Water Regulations list, the observed property vocabulary, and units of measure.
• Contextual constraints: in many places, the permissible value depends on the value of another field. For example, within observations the unit of measure must be commensurate with the observed property type.
Validation of data submitted in WDTF uses a two-pass approach. First, syntax and structural validation is performed by standard XML Schema validation tools. Second, validation of contextual constraints and code-list checking is performed using a hybrid method combining context-sensitive rule-based validation (allowing the rules to be expressed within a given context) and semantic vocabulary services. Schematron allows rules to incorporate assertions of XPath expressions to access and constrain element content, therefore enabling contextual constraints. Schematron is also used to perform element cardinality checking. The vocabularies or code lists are formalized in SKOS (Simple Knowledge Organization System), an RDF-based language. SKOS provides mechanisms to define concepts, associate them with (multi-lingual) labels or terms, and record thesaurus-like relationships between them. The vocabularies are managed in an RDF database, or semantic triple store. Querying is implemented as a semantic vocabulary service, with an http-based API that allows queries to be issued from rules written in Schematron. WDTF has required the development and deployment of some ontologies whose scope is much more general than this application, in particular covering 'observed properties' and 'units of measure', which have to be related to each other and consistent with dimensional analysis.
Separation of the two validation passes reflects the separate governance and stability of the structural and content rules, and allows an organisation's business rules to be moved out of the XML schema definition so that the XML schema can be reused by other organisations with their own specific rules. With the general approach proven, harmonization opportunities with more generic services are being explored, such as the GEMET API for SKOS, developed by the European Environment Agency. Acknowledgements: The authors would like to thank the AUSCOPE team for their development and support of the vocabulary services.
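The two-pass idea can be sketched as follows, assuming the lxml package and purely hypothetical file names, namespace, and vocabulary subset; in the deployed system the second pass issues http queries to the SKOS vocabulary service and the content rules live in Schematron rather than Python.

    from lxml import etree

    # Pass 1: structural and syntactic validation against the XML Schema.
    schema = etree.XMLSchema(etree.parse("wdtf.xsd"))       # hypothetical schema file
    doc = etree.parse("observation.xml")                    # hypothetical instance
    if not schema.validate(doc):
        raise SystemExit(schema.error_log)

    # Pass 2: content rules. The deployed system queries the SKOS vocabulary
    # service over http from Schematron rules; a local set stands in here.
    UNIT_CONCEPTS = {"m", "mm", "ML", "ML/d"}               # illustrative subset
    NS = "{http://www.bom.gov.au/wdtf}"                     # assumed namespace
    for obs in doc.iter(NS + "observation"):
        unit = obs.get("unit")
        if unit not in UNIT_CONCEPTS:
            print("vocabulary violation:", unit)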
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schaly, B; Hoover, D; Mitchell, S
2014-08-15
During volumetric modulated arc therapy (VMAT) of head and neck cancer, some patients lose weight, which may result in anatomical deviations from the initial plan. If these deviations are substantial, a new treatment plan can be designed for the remainder of treatment (i.e., adaptive planning). Since the adaptive treatment process is resource intensive, one possible approach to streamlining the quality assurance (QA) process is to use the electronic portal imaging device (EPID) to measure the integrated fluence for the adapted plans instead of the currently-used ArcCHECK device (Sun Nuclear). Although ArcCHECK is recognized as the clinical standard for patient-specific VMAT plan QA, it has limited length (20 cm) for most head and neck field apertures and has coarser detector spacing than the EPID (10 mm vs. 0.39 mm). In this work we compared measurement of the integrated fluence using the EPID with corresponding measurements from the ArcCHECK device. In the past year nine patients required an adapted plan. Each of the plans (the original and adapted) is composed of two arcs. Routine clinical QA was performed using the ArcCHECK device, and the same plans were delivered to the EPID (individual arcs) in integrated mode. The dose difference between the initial plan and adapted plan was compared for ArcCHECK and EPID. In most cases, it was found that the EPID is more sensitive in detecting plan differences. Therefore, we conclude that EPID provides a viable alternative for QA of the adapted head and neck plans and should be further explored.
Science Opportunity Analyzer (SOA) Version 8
NASA Technical Reports Server (NTRS)
Witoff, Robert J.; Polanskey, Carol A.; Aguinaldo, Anna Marie A.; Liu, Ning; Hofstadter, Mark D.
2013-01-01
SOA allows scientists to plan spacecraft observations. It facilitates the identification of geometrically interesting times in a spacecraft's orbit that a user can use to plan observations or instrument-driven spacecraft maneuvers. These observations can then be visualized multiple ways in both two- and three-dimensional views. When observations have been optimized within a spacecraft's flight rules, the resulting plans can be output for use by other JPL uplink tools. Now in its eighth major version, SOA improves on these capabilities in a modern and integrated fashion. SOA consists of five major functions: Opportunity Search, Visualization, Observation Design, Constraint Checking, and Data Output. Opportunity Search is a GUI-driven interface to existing search engines that can be used to identify times when a spacecraft is in a specific geometrical relationship with other bodies in the solar system. This function can be used for advanced mission planning as well as for making last-minute adjustments to mission sequences in response to trajectory modifications. Visualization is a key aspect of SOA. The user can view observation opportunities in either a 3D representation or as a 2D map projection. Observation Design allows the user to orient the spacecraft and visualize the projection of the instrument field of view for that orientation using the same views as Opportunity Search. Constraint Checking is provided to validate various geometrical and physical aspects of an observation design. The user has the ability to easily create custom rules or to use official project-generated flight rules. This capability may also allow scientists to easily assess the cost to science if flight rule changes occur. Data Output allows the user to compute ancillary data related to an observation or to a given position of the spacecraft along its trajectory. The data can be saved as a tab-delimited text file or viewed as a graph. SOA combines science planning functionality unique to both JPL and the sponsoring spacecraft. SOA is able to ingest JPL SPICE kernels that are used to drive the tool and its computations. A Percy search engine is then included that identifies interesting time periods for the user to build observations. When observations are then built, flight-like orientation algorithms replicate spacecraft dynamics to closely simulate the flight spacecraft's dynamics. SOA v8 represents large steps forward from SOA v7 in terms of quality, reliability, maintainability, efficiency, and user experience. A tailored agile development environment has been built around SOA that provides automated unit testing, continuous build and integration, a consolidated Web-based code and documentation storage environment, modern Java enhancements, and a focus on usability.
STS-97 Mission Specialist Tanner during pre-pack and fit check
NASA Technical Reports Server (NTRS)
2000-01-01
STS-97 Mission Specialist Joseph Tanner gets help with his boots from suit technician Erin Canlon during pre-pack and fit check. Mission STS-97 is the sixth construction flight to the International Space Station. Its payload includes the P6 Integrated Truss Structure and a photovoltaic (PV) module, with giant solar arrays that will provide power to the Station. The mission includes two spacewalks to complete the solar array connections. STS-97 is scheduled to launch Nov. 30 at about 10:06 p.m. EST.
ExoMars Raman Laser Spectrometer scientific required performances check with a Breadboard
NASA Astrophysics Data System (ADS)
Moral, A.; Díaz, E.; Ramos, G.; Rodríguez Prieto, J. A.; Pérez Canora, C.; Díaz, C.; Canchal, R.; Gallego, P.; Santamaría, P.; Colombo, M.
2013-09-01
The Raman Laser Spectrometer (RLS) is one of the Pasteur Payload instruments on the ExoMars mission, within ESA's Aurora Exploration Programme. To verify that the scientific objectives of the instrument can be achieved, a breadboard campaign was carried out to bring the instrument to TRL5. Within the instrument TRL5 plan, each unit was required to develop its own unit breadboard, demonstrate its own TRL5, and then deliver it to the system team, where the breadboards were integrated and tested to finally check the instrument performances.
Proceedings of the Second NASA Formal Methods Symposium
NASA Technical Reports Server (NTRS)
Munoz, Cesar (Editor)
2010-01-01
This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; and Data-flow based Model Analysis.
Sleep underpins the plasticity of language production.
Gaskell, M Gareth; Warker, Jill; Lindsay, Shane; Frost, Rebecca; Guest, James; Snowdon, Reza; Stackhouse, Abigail
2014-07-01
The constraints that govern acceptable phoneme combinations in speech perception and production have considerable plasticity. We addressed whether sleep influences the acquisition of new constraints and their integration into the speech-production system. Participants repeated sequences of syllables in which two phonemes were artificially restricted to syllable onset or syllable coda, depending on the vowel in that sequence. After 48 sequences, participants either had a 90-min nap or remained awake. Participants then repeated 96 sequences so implicit constraint learning could be examined, and then were tested for constraint generalization in a forced-choice task. The sleep group, but not the wake group, produced speech errors at test that were consistent with restrictions on the placement of phonemes in training. Furthermore, only the sleep group generalized their learning to new materials. Polysomnography data showed that implicit constraint learning was associated with slow-wave sleep. These results show that sleep facilitates the integration of new linguistic knowledge with existing production constraints. These data have relevance for systems-consolidation models of sleep. © The Author(s) 2014.
Flight Test Results of a Synthetic Vision Elevation Database Integrity Monitor
NASA Technical Reports Server (NTRS)
deHaag, Maarten Uijt; Sayre, Jonathon; Campbell, Jacob; Young, Steve; Gray, Robert
2001-01-01
This paper discusses the flight test results of a real-time Digital Elevation Model (DEM) integrity monitor for civil aviation applications. Providing pilots with Synthetic Vision (SV) displays containing terrain information has the potential to improve flight safety by improving situational awareness and thereby reducing the likelihood of Controlled Flight Into Terrain (CFIT). Utilization of DEMs, such as the digital terrain elevation data (DTED), requires a DEM integrity check and timely integrity alerts to the pilots when used for flight-critical terrain displays; otherwise the DEM may provide hazardous, misleading terrain information. The discussed integrity monitor checks the consistency between a terrain elevation profile synthesized from sensor information and the profile given in the DEM. The synthesized profile is derived from DGPS and radar altimeter measurements. DEMs of various spatial resolutions are used to illustrate the dependency of the integrity monitor's performance on the DEM's spatial resolution. The paper gives a description of the proposed integrity algorithms, the flight test setup, and the results of a flight test performed at the Ohio University airport and in the vicinity of Asheville, NC.
Combining constraint satisfaction and local improvement algorithms to construct anaesthetists' rotas
NASA Technical Reports Server (NTRS)
Smith, Barbara M.; Bennett, Sean
1992-01-01
A system is described which was built to compile weekly rotas for the anaesthetists in a large hospital. The rota compilation problem is an optimization problem (the number of tasks which cannot be assigned to an anaesthetist must be minimized) and was formulated as a constraint satisfaction problem (CSP). The forward checking algorithm is used to find a feasible rota, but because of the size of the problem, it cannot find an optimal (or even a good enough) solution in an acceptable time. Instead, an algorithm was devised which makes local improvements to a feasible solution. The algorithm makes use of the constraints as expressed in the CSP to ensure that feasibility is maintained, and produces very good rotas which are being used by the hospital involved in the project. It is argued that formulation as a constraint satisfaction problem may be a good approach to solving discrete optimization problems, even if the resulting CSP is too large to be solved exactly in an acceptable time. A CSP algorithm may be able to produce a feasible solution which can then be improved, giving a good, if not provably optimal, solution.
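The two-phase pattern described above (constraint satisfaction to reach feasibility, then local improvement) is easy to make concrete. The sketch below is a minimal, generic illustration, not the authors' rota system: the tasks, the toy compatibility relation, and the load-balancing cost function are all illustrative assumptions.

```python
# Minimal sketch of forward-checking search followed by feasibility-preserving
# local improvement. All names (tasks, doctors, cost) are illustrative.

def forward_check(future_domains, var, value, compatible):
    """Prune future domains after assigning var=value; None if any empties."""
    pruned = {}
    for other, dom in future_domains.items():
        new_dom = [v for v in dom if compatible(var, value, other, v)]
        if not new_dom:
            return None                      # dead end detected early
        pruned[other] = new_dom
    return pruned

def solve(variables, domains, compatible, assignment=None):
    """Backtracking search with forward checking; returns a feasible assignment."""
    assignment = assignment or {}
    if len(assignment) == len(variables):
        return assignment
    var = next(v for v in variables if v not in assignment)
    for value in domains[var]:
        future = {v: d for v, d in domains.items()
                  if v != var and v not in assignment}
        pruned = forward_check(future, var, value, compatible)
        if pruned is not None:
            result = solve(variables, {**domains, **pruned}, compatible,
                           {**assignment, var: value})
            if result:
                return result
    return None

def improve(assignment, domains, compatible, cost):
    """Greedy local moves that keep feasibility, as in the second phase."""
    improved = True
    while improved:
        improved = False
        for var in list(assignment):
            for value in domains[var]:
                cand = {**assignment, var: value}
                feasible = all(compatible(a, cand[a], b, cand[b])
                               for a in cand for b in cand if a != b)
                if feasible and cost(cand) < cost(assignment):
                    assignment, improved = cand, True
    return assignment

tasks = ["list_a", "list_b", "on_call"]
doms = {t: ["dr_x", "dr_y"] for t in tasks}
overlap = {frozenset(("list_a", "list_b"))}   # overlapping tasks: distinct staff
compat = lambda t1, p1, t2, p2: frozenset((t1, t2)) not in overlap or p1 != p2
cost = lambda a: sum(p == "dr_x" for p in a.values())   # toy: balance the load
rota = solve(tasks, doms, compat)
print(improve(rota, doms, compat, cost))
```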
PSAMM: A Portable System for the Analysis of Metabolic Models
Steffensen, Jon Lund; Dufault-Thompson, Keith; Zhang, Ying
2016-01-01
The genome-scale models of metabolic networks have been broadly applied in phenotype prediction, evolutionary reconstruction, community functional analysis, and metabolic engineering. Despite the development of tools that support individual steps along the modeling procedure, it is still difficult to associate mathematical simulation results with the annotation and biological interpretation of metabolic models. In order to solve this problem, here we developed a Portable System for the Analysis of Metabolic Models (PSAMM), a new open-source software package that supports the integration of heterogeneous metadata in model annotations and provides a user-friendly interface for the analysis of metabolic models. PSAMM is independent of paid software environments like MATLAB, and all its dependencies are freely available for academic users. Compared to existing tools, PSAMM significantly reduced the running time of constraint-based analysis and enabled flexible settings of simulation parameters using simple one-line commands. The integration of heterogeneous, model-specific annotation information in PSAMM is achieved with a novel format of YAML-based model representation, which has several advantages, such as providing a modular organization of model components and simulation settings, enabling model version tracking, and permitting the integration of multiple simulation problems. PSAMM also includes a number of quality checking procedures to examine stoichiometric balance and to identify blocked reactions. Applying PSAMM to 57 models collected from current literature, we demonstrated how the software can be used for managing and simulating metabolic models. We identified a number of common inconsistencies in existing models and constructed an updated model repository to document the resolution of these inconsistencies. PMID:26828591
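As one concrete example of the kind of quality check mentioned above, stoichiometric balance of a reaction can be verified by comparing element counts on both sides. The sketch below is independent of PSAMM's actual API; the formula parser and example reaction are illustrative.

```python
import re
from collections import Counter

def parse_formula(formula):
    """Count atoms in a flat formula like 'C6H12O6' (no nested parentheses)."""
    counts = Counter()
    for element, number in re.findall(r"([A-Z][a-z]?)(\d*)", formula):
        counts[element] += int(number) if number else 1
    return counts

def is_balanced(reactants, products):
    """Each side is a list of (stoichiometric coefficient, formula) pairs."""
    def total(side):
        atoms = Counter()
        for coeff, formula in side:
            for element, n in parse_formula(formula).items():
                atoms[element] += coeff * n
        return atoms
    return total(reactants) == total(products)

# Glucose fermentation: C6H12O6 -> 2 C2H5OH + 2 CO2 (balanced)
print(is_balanced([(1, "C6H12O6")], [(2, "C2H5OH"), (2, "CO2")]))  # True
```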
Microspheres as resistive elements in a check valve for low pressure and low flow rate conditions.
Ou, Kevin; Jackson, John; Burt, Helen; Chiao, Mu
2012-11-07
In this paper we describe a microsphere-based check valve integrated with a micropump. The check valve uses Ø20 μm polystyrene microspheres to rectify flow in low pressure and low flow rate applications (Re < 1). The microspheres form a porous medium in the check valve increasing fluidic resistance based on the direction of flow. Three check valve designs were fabricated and characterized to study the microspheres' effectiveness as resistive elements. A maximum diodicity (ratio of flow in the forward and reverse direction) of 18 was achieved. The pumping system can deliver a minimum flow volume of 0.25 μL and a maximum flow volume of 1.26 μL under an applied pressure of 0.2 kPa and 1 kPa, respectively. A proof-of-concept study was conducted using a pharmaceutical agent, docetaxel (DTX), as a sample drug showing the microsphere check valve's ability to limit diffusion from the micropump. The proposed check valve and pumping concept shows strong potential for implantable drug delivery applications with low flow rate requirements.
Stabilization of computational procedures for constrained dynamical systems
NASA Technical Reports Server (NTRS)
Park, K. C.; Chiou, J. C.
1988-01-01
A new stabilization method of treating constraints in multibody dynamical systems is presented. By tailoring a penalty form of the constraint equations, the method achieves stabilization without artificial damping and yields a companion matrix differential equation for the constraint forces; hence, the constraint forces are obtained by integrating the companion differential equation for the constraint forces in time. A principal feature of the method is that the errors committed in each constraint condition decay with its corresponding characteristic time scale associated with its constraint force. Numerical experiments indicate that the method yields a marked improvement over existing techniques.
A Hybrid Constraint Representation and Reasoning Framework
NASA Technical Reports Server (NTRS)
Golden, Keith; Pang, Wanlin
2004-01-01
In this paper, we introduce JNET, a novel constraint representation and reasoning framework that supports procedural constraints and constraint attachments, providing a flexible way of integrating the constraint system with a runtime software environment and improving its applicability. We describe how JNET is applied to a real-world problem - NASA's Earth-science data processing domain, and demonstrate how JNET can be extended, without any knowledge of how it is implemented, to meet the growing demands of real-world applications.
Wang, Yanchao; Sunderraman, Rajshekhar
2006-01-01
In this paper, we propose two architectures for curating PDB data to improve its quality. The first, the PDB Data Curation System, is developed by adding two parts, a Checking Filter and a Curation Engine, between the User Interface and the Database. This architecture supports basic PDB data curation. The second, the PDB Data Curation System with XCML, is designed for further curation and adds four more parts (PDB-XML, PDB, OODB, and Protein-OODB) to the first. This architecture uses the XCML language to automatically check PDB data for errors, making the data more consistent and accurate. These two tools can be used for cleaning existing PDB files and creating new PDB files. We also present some ideas on how to add constraints and assertions with XCML to obtain better data. In addition, we discuss data provenance issues that may affect data accuracy and consistency.
Cooperative runtime monitoring
NASA Astrophysics Data System (ADS)
Hallé, Sylvain
2013-11-01
Requirements on message-based interactions can be formalised as an interface contract that specifies constraints on the sequence of possible messages that can be exchanged by multiple parties. At runtime, each peer can monitor incoming messages and check that the contract is correctly being followed by their respective senders. We introduce cooperative runtime monitoring, where a recipient 'delegates' its monitoring task to the sender, which is required to provide evidence that the message it sends complies with the contract. In turn, this evidence can be quickly checked by the recipient, which is then assured of the sender's compliance with the contract without performing the monitoring computation itself. A particular application of this concept is shown on web services, where service providers can monitor and enforce contract compliance of third-party clients at a small cost on the server side, while avoiding the need to certify or digitally sign them.
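To make the delegation idea concrete, consider a contract given as a finite-state machine. In the hedged sketch below (illustrative, not the authors' protocol or web-service implementation), the sender attaches its claimed pre- and post-states as evidence, and the recipient validates a single transition per message instead of running the monitor itself.

```python
# The contract is a toy finite-state machine over message types. The sender
# runs the monitor and attaches evidence (claimed pre/post states); the
# recipient verifies one transition in O(1) rather than monitoring itself.
CONTRACT = {  # (state, message) -> next state
    ("idle", "login"): "ready",
    ("ready", "query"): "ready",
    ("ready", "logout"): "idle",
}

def sender_annotate(state, message):
    """Sender-side monitoring: compute the post-state and attach evidence."""
    post = CONTRACT[(state, message)]       # KeyError = contract violation
    return {"msg": message, "pre": state, "post": post}, post

def recipient_check(expected, envelope):
    """Recipient-side check: one table lookup instead of a full monitor run."""
    ok = (envelope["pre"] == expected and
          CONTRACT.get((envelope["pre"], envelope["msg"])) == envelope["post"])
    return ok, envelope["post"] if ok else expected

s_state = r_state = "idle"
for msg in ["login", "query", "logout"]:
    envelope, s_state = sender_annotate(s_state, msg)
    ok, r_state = recipient_check(r_state, envelope)
    print(msg, "accepted" if ok else "rejected")
```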
A voice-actuated wind tunnel model leak checking system
NASA Technical Reports Server (NTRS)
Larson, William E.
1989-01-01
A computer program has been developed that improves the efficiency of wind tunnel model leak checking. The program uses a voice recognition unit to relay a technician's commands to the computer. The computer, after receiving a command, can respond to the technician via a voice response unit. Information about the model pressure orifice being checked is displayed on a gas-plasma terminal. On command, the program records up to 30 seconds of pressure data. After the recording is complete, the raw data and a straight line fit of the data are plotted on the terminal. This allows the technician to make a decision on the integrity of the orifice being checked. All results of the leak check program are stored in a database file that can be listed on the line printer for record keeping purposes or displayed on the terminal to help the technician find unchecked orifices. This program allows one technician to check a model for leaks instead of the two or three previously required.
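The straight-line fit mentioned above is an ordinary least-squares fit of pressure against time, with the slope serving as a leak-rate indicator. A minimal sketch follows; the units, noise level, and pass/fail threshold are illustrative assumptions, not values from the NASA system.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 30.0, 300)                        # 30 s of samples
pressure = 101.3 - 0.02 * t + 0.01 * rng.standard_normal(t.size)  # kPa

slope, intercept = np.polyfit(t, pressure, 1)          # least-squares line
print(f"fit: p(t) = {intercept:.2f} {slope:+.4f}*t kPa")
if abs(slope) > 0.005:                                 # hypothetical threshold
    print("orifice fails the leak check")
```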
NASA Astrophysics Data System (ADS)
Su, Yi; Wang, Feifeng; Lu, Yufeng; Huang, Huimin; Xia, Xiaofei
2017-09-01
Starting from an affine-function formulation of the grid and the OPF problem, this paper discusses the equivalent optimization of certain inequality-constraint variables. We then propose an injected-current model and construct a constraint sensitivity index with affine characteristics. The index can be used to identify the central point voltage and the effective inequality constraints of the system automatically, indicating where reactive power compensation should be applied at the corresponding generator nodes to control the voltage and ensure the quality of the system voltage. When checking the effective inequalities, we introduce a cross-solving method for the power flow, which provides a different approach to the power flow solution. Results on the IEEE 5-node example illustrate the validity and practicality of the proposed method.
Herman, Gabor T; Chen, Wei
2008-03-01
The goal of Intensity-Modulated Radiation Therapy (IMRT) is to deliver sufficient doses to tumors to kill them, but without causing irreparable damage to critical organs. This requirement can be formulated as a linear feasibility problem. The sequential (i.e., iteratively treating the constraints one after another in a cyclic fashion) algorithm ART3 is known to find a solution to such problems in a finite number of steps, provided that the feasible region is full dimensional. We present a faster algorithm called ART3+. The idea of ART3+ is to avoid unnecessary checks on constraints that are likely to be satisfied. The superior performance of the new algorithm is demonstrated by mathematical experiments inspired by the IMRT application.
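The idea behind ART3+ can be illustrated with a simplified sequential projection scheme for linear feasibility. The sketch below is not the authors' exact algorithm (ART3 works on interval constraints and has a more refined control sequence); it only shows the skipping heuristic: constraints satisfied without a projection are dropped from the working list, and a full verification pass runs once the list empties.

```python
import numpy as np

def art3plus_like(A, b, x, sweeps=200):
    """Find x with A @ x <= b by cyclic projections, skipping settled rows."""
    active = list(range(len(b)))
    for _ in range(sweeps):
        keep = []
        for i in active:
            violation = A[i] @ x - b[i]
            if violation > 1e-12:                      # violated: project onto
                x = x - violation * A[i] / (A[i] @ A[i])  # hyperplane a_i.x = b_i
                keep.append(i)                         # still worth checking
            # else: satisfied without projection, drop it from the sweep
        if not keep:                                   # working list empty:
            if np.all(A @ x <= b + 1e-9):              # verify all constraints
                return x
            keep = list(range(len(b)))                 # otherwise restart full
        active = keep
    return x

A = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])
b = np.array([1.0, 1.0, -1.0])                         # x<=1, y<=1, x+y>=1
print(art3plus_like(A, b, np.array([5.0, -3.0])))      # lands near (1, 0)
```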
NASA Astrophysics Data System (ADS)
Alimohammadi, Shahrouz; Cavaglieri, Daniele; Beyhaghi, Pooriya; Bewley, Thomas R.
2016-11-01
This work applies a recently developed Derivative-free optimization algorithm to derive a new mixed implicit-explicit (IMEX) time integration scheme for Computational Fluid Dynamics (CFD) simulations. This algorithm allows imposing a specified order of accuracy for the time integration and other important stability properties in the form of nonlinear constraints within the optimization problem. In this procedure, the coefficients of the IMEX scheme should satisfy a set of constraints simultaneously. Therefore, the optimization process, at each iteration, estimates the location of the optimal coefficients using a set of global surrogates, for both the objective and constraint functions, as well as a model of the uncertainty function of these surrogates based on the concept of Delaunay triangulation. This procedure has been proven to converge to the global minimum of the constrained optimization problem provided the constraints and objective functions are twice differentiable. As a result, a new third-order, low-storage IMEX Runge-Kutta time integration scheme is obtained with remarkably fast convergence. Numerical tests are then performed leveraging the turbulent channel flow simulations to validate the theoretical order of accuracy and stability properties of the new scheme.
Embedding Temporal Constraints For Coordinated Execution in Habitat Automation
NASA Technical Reports Server (NTRS)
Morris, Paul; Schwabacher, Mark; Dalal, Michael; Fry, Charles
2013-01-01
Future NASA plans call for long-duration deep space missions with human crews. Because of light-time delay and other considerations, increased autonomy will be needed. This will necessitate integration of tools in such areas as anomaly detection, diagnosis, planning, and execution. In this paper we investigate an approach that integrates planning and execution by embedding planner-derived temporal constraints in an execution procedure. To avoid the need for propagation, we convert the temporal constraints to dispatchable form. We handle some uncertainty in the durations without it affecting the execution; larger variations may cause activities to be skipped.
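One standard step behind converting temporal constraints to a form that avoids runtime propagation is tightening the simple-temporal-network distance graph to its all-pairs minimal form, from which execution-time windows can be read off directly. The sketch below shows that step only (full dispatchability filtering and the flight-procedure details are beyond it); the events and bounds are illustrative.

```python
INF = float("inf")

def minimal_network(n, constraints):
    """constraints: (i, j, lo, hi) meaning lo <= t_j - t_i <= hi."""
    d = [[0 if i == j else INF for j in range(n)] for i in range(n)]
    for i, j, lo, hi in constraints:
        d[i][j] = min(d[i][j], hi)      # encodes t_j - t_i <= hi
        d[j][i] = min(d[j][i], -lo)     # encodes t_i - t_j <= -lo
    for k in range(n):                  # Floyd-Warshall tightening
        for i in range(n):
            for j in range(n):
                d[i][j] = min(d[i][j], d[i][k] + d[k][j])
    if any(d[i][i] < 0 for i in range(n)):
        raise ValueError("inconsistent temporal constraints")
    return d

# Illustrative events: 0 = start, 1 = heater on, 2 = valve open.
d = minimal_network(3, [(0, 1, 5, 10),    # heater on 5-10 min after start
                        (1, 2, 2, 4),     # valve opens 2-4 min after heater
                        (0, 2, 0, 12)])   # valve within 12 min of start
print("valve-open window after start: [%g, %g]" % (-d[2][0], d[0][2]))  # [7, 12]
```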
An Integrated Environment for Efficient Formal Design and Verification
NASA Technical Reports Server (NTRS)
1998-01-01
The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.
28 CFR 25.5 - Validation and data integrity of records in the system.
Code of Federal Regulations, 2010 CFR
2010-07-01
... INFORMATION SYSTEMS The National Instant Criminal Background Check System § 25.5 Validation and data integrity... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Validation and data integrity of records... verify that the information provided to the NICS Index remains valid and correct. (b) Each data source...
28 CFR 25.5 - Validation and data integrity of records in the system.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 28 Judicial Administration 1 2011-07-01 2011-07-01 false Validation and data integrity of records in the system. 25.5 Section 25.5 Judicial Administration DEPARTMENT OF JUSTICE DEPARTMENT OF JUSTICE INFORMATION SYSTEMS The National Instant Criminal Background Check System § 25.5 Validation and data integrity...
28 CFR 25.5 - Validation and data integrity of records in the system.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 28 Judicial Administration 1 2013-07-01 2013-07-01 false Validation and data integrity of records in the system. 25.5 Section 25.5 Judicial Administration DEPARTMENT OF JUSTICE DEPARTMENT OF JUSTICE INFORMATION SYSTEMS The National Instant Criminal Background Check System § 25.5 Validation and data integrity...
28 CFR 25.5 - Validation and data integrity of records in the system.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 28 Judicial Administration 1 2012-07-01 2012-07-01 false Validation and data integrity of records in the system. 25.5 Section 25.5 Judicial Administration DEPARTMENT OF JUSTICE DEPARTMENT OF JUSTICE INFORMATION SYSTEMS The National Instant Criminal Background Check System § 25.5 Validation and data integrity...
28 CFR 25.5 - Validation and data integrity of records in the system.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 28 Judicial Administration 1 2014-07-01 2014-07-01 false Validation and data integrity of records in the system. 25.5 Section 25.5 Judicial Administration DEPARTMENT OF JUSTICE DEPARTMENT OF JUSTICE INFORMATION SYSTEMS The National Instant Criminal Background Check System § 25.5 Validation and data integrity...
40 CFR 85.2233 - Steady state test equipment calibrations, adjustments, and quality control-EPA 91.
Code of Federal Regulations, 2013 CFR
2013-07-01
... tolerance range. The pressure in the sample cell must be the same with the calibration gas flowing during... this chapter. The check is done at 30 mph (48 kph), and a power absorption load setting to generate a... in § 85.2225(c)(1) are not met. (2) Leak checks. Each time the sample line integrity is broken, a...
40 CFR 85.2233 - Steady state test equipment calibrations, adjustments, and quality control-EPA 91.
Code of Federal Regulations, 2011 CFR
2011-07-01
... tolerance range. The pressure in the sample cell must be the same with the calibration gas flowing during... this chapter. The check is done at 30 mph (48 kph), and a power absorption load setting to generate a... in § 85.2225(c)(1) are not met. (2) Leak checks. Each time the sample line integrity is broken, a...
40 CFR 85.2233 - Steady state test equipment calibrations, adjustments, and quality control-EPA 91.
Code of Federal Regulations, 2012 CFR
2012-07-01
... tolerance range. The pressure in the sample cell must be the same with the calibration gas flowing during... this chapter. The check is done at 30 mph (48 kph), and a power absorption load setting to generate a... in § 85.2225(c)(1) are not met. (2) Leak checks. Each time the sample line integrity is broken, a...
Integrated Control Using the SOFFT Control Structure
NASA Technical Reports Server (NTRS)
Halyo, Nesim
1996-01-01
The need for integrated/constrained control systems has become clearer as advanced aircraft introduced new coupled subsystems such as new propulsion subsystems with thrust vectoring and new aerodynamic designs. In this study, we develop an integrated control design methodology which accommodates constraints among subsystem variables while using the Stochastic Optimal Feedforward/Feedback Control Technique (SOFFT), thus maintaining all the advantages of the SOFFT approach. The Integrated SOFFT Control methodology uses a centralized feedforward control and a constrained feedback control law. The control thus takes advantage of the known coupling among the subsystems while maintaining the identity of subsystems for validation purposes and the simplicity of the feedback law needed to understand the system response in complicated nonlinear scenarios. The Variable-Gain Output Feedback Control methodology (including constant gain output feedback) is extended to accommodate equality constraints. A gain computation algorithm is developed. The designer can set the cross-gains between two variables or subsystems to zero or another value and optimize the remaining gains subject to the constraint. An integrated control law is designed for a modified F-15 SMTD aircraft model with coupled airframe and propulsion subsystems using the Integrated SOFFT Control methodology to produce a set of desired flying qualities.
Enforcement of entailment constraints in distributed service-based business processes.
Hummer, Waldemar; Gaubatz, Patrick; Strembeck, Mark; Zdun, Uwe; Dustdar, Schahram
2013-11-01
A distributed business process is executed in a distributed computing environment. The service-oriented architecture (SOA) paradigm is a popular option for the integration of software services and execution of distributed business processes. Entailment constraints, such as mutual exclusion and binding constraints, are important means to control process execution. Mutually exclusive tasks result from the division of powerful rights and responsibilities to prevent fraud and abuse. In contrast, binding constraints define that a subject who performed one task must also perform the corresponding bound task(s). We aim to provide a model-driven approach for the specification and enforcement of task-based entailment constraints in distributed service-based business processes. Based on a generic metamodel, we define a domain-specific language (DSL) that maps the different modeling-level artifacts to the implementation level. The DSL integrates elements from role-based access control (RBAC) with the tasks that are performed in a business process. Process definitions are annotated using the DSL, and our software platform uses automated model transformations to produce executable WS-BPEL specifications which enforce the entailment constraints. We evaluate the impact of constraint enforcement on runtime performance for five selected service-based processes from existing literature. Our evaluation demonstrates that the approach correctly enforces task-based entailment constraints at runtime. The performance experiments illustrate that the runtime enforcement operates with an overhead that scales well up to the order of several tens of thousands of logged invocations. Using our DSL annotations, the user-defined process definition remains declarative and clean of security enforcement code. Our approach decouples the concerns of (non-technical) domain experts from technical details of entailment constraint enforcement. The developed framework integrates seamlessly with WS-BPEL and the Web services technology stack. Our prototype implementation shows the feasibility of the approach, and the evaluation points to future work and further performance optimizations.
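The two constraint types are straightforward to enforce at runtime against an execution history. The sketch below is a minimal illustration of the semantics only, not the authors' DSL, RBAC metamodel, or WS-BPEL platform; the task and subject names are made up.

```python
# Constraint tables: unordered task pairs. Names are made up for illustration.
MUTEX = {frozenset(p) for p in [("request_payment", "approve_payment")]}
BIND = {frozenset(p) for p in [("negotiate_contract", "sign_contract")]}

def check_entailment(history, task, subject):
    """history: list of (task, subject) pairs already performed in the process."""
    for done_task, done_subject in history:
        pair = frozenset((task, done_task))
        if pair in MUTEX and done_subject == subject:
            return False, "mutual exclusion with " + done_task
        if pair in BIND and done_subject != subject:
            return False, "binding: must match performer of " + done_task
    return True, "ok"

history = [("request_payment", "alice"), ("negotiate_contract", "bob")]
print(check_entailment(history, "approve_payment", "alice"))  # rejected
print(check_entailment(history, "sign_contract", "bob"))      # accepted
```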
NASA Astrophysics Data System (ADS)
Yinghao, Cui; He, Xue; Lingyan, Zhao
2017-12-01
It is important to obtain an accurate stress corrosion cracking (SCC) growth rate for quantitative life prediction of components in nuclear power plants. Engineering practice shows, however, that the crack tip constraint effect has a great influence on the mechanical properties and the SCC growth rate at the crack tip. To study the influence of specimen thickness on the crack tip mechanical properties of SCC, the stress, strain, and C integral at the creep crack tip are analyzed for different specimen thicknesses. Results show that the cracked specimen is less likely to crack owing to the effect of crack tip constraint. When the thickness ratio B/W is larger than 0.1, the crack tip constraint is almost ineffective. The value of the C integral is largest when B/W is 0.25; beyond this, specimen thickness has little effect on the C integral, and its effect is less significant at higher thickness ratios.
Sci-Fin: Visual Mining Spatial and Temporal Behavior Features from Social Media
Pu, Jiansu; Teng, Zhiyao; Gong, Rui; Wen, Changjiang; Xu, Yang
2016-01-01
Check-in records are usually available in social services, which offer us the opportunity to capture and analyze users’ spatial and temporal behaviors. Mining such behavior features is essential to social analysis and business intelligence. However, the complexity and incompleteness of check-in records bring challenges to achieve such a task. Different from the previous work on social behavior analysis, in this paper, we present a visual analytics system, Social Check-in Fingerprinting (Sci-Fin), to facilitate the analysis and visualization of social check-in data. We focus on three major components of user check-in data: location, activity, and profile. Visual fingerprints for location, activity, and profile are designed to intuitively represent the high-dimensional attributes. To visually mine and demonstrate the behavior features, we integrate WorldMapper and Voronoi Treemap into our glyph-like designs. Such visual fingerprint designs offer us the opportunity to summarize the interesting features and patterns from different check-in locations, activities and users (groups). We demonstrate the effectiveness and usability of our system by conducting extensive case studies on real check-in data collected from a popular microblogging service. Interesting findings are reported and discussed at last. PMID:27999398
Gratton, D G; Kwon, S R; Blanchette, D R; Aquilino, S A
2017-11-01
Proper integration of newly emerging digital assessment tools is a central issue in dental education in an effort to provide more accurate and objective feedback to students. The study examined how the outcomes of students' tooth preparation were correlated when evaluated using traditional faculty assessment and two types of digital assessment approaches. Specifically, incorporation of the Romexis Compare 2.0 (Compare) and Sirona prepCheck 1.1 (prepCheck) systems was evaluated. Additionally, satisfaction of students based on the type of software was evaluated through a survey. Students in a second-year pre-clinical prosthodontics course were allocated to either Compare (n = 42) or prepCheck (n = 37) systems. All students received conventional instruction and used their assigned digital system as an additional evaluation tool to aid in assessing their work. Examinations assessed crown preparations of the maxillary right central incisor (#8) and the mandibular left first molar (#19). All submissions were graded by faculty, Compare and prepCheck. Technical scores did not differ between student groups for any of the assessment approaches. Compare and prepCheck had modest, statistically significant correlations with faculty scores with a minimum correlation of 0.3944 (P = 0.0011) and strong, statistically significant correlations with each other with a minimum correlation of 0.8203 (P < 0.0001). A post-course student survey found that 55.26% of the students felt unfavourably about learning the digital evaluation protocols. A total of 62.31% felt favourably about the integration of these digital tools into the curriculum. Comparison of Compare and prepCheck showed no evidence of significant difference in students' prosthodontics technical performance and perception.
Cycle time reduction by Html report in mask checking flow
NASA Astrophysics Data System (ADS)
Chen, Jian-Cheng; Lu, Min-Ying; Fang, Xiang; Shen, Ming-Feng; Ma, Shou-Yuan; Yang, Chuen-Huei; Tsai, Joe; Lee, Rachel; Deng, Erwin; Lin, Ling-Chieh; Liao, Hung-Yueh; Tsai, Jenny; Bowhill, Amanda; Vu, Hien; Russell, Gordon
2017-07-01
The Mask Data Correctness Check (MDCC) is a reticle-level, multi-layer DRC-like check evolved from mask rule check (MRC). The MDCC uses extended job deck (EJB) to achieve mask composition and to perform a detailed check for positioning and integrity of each component of the reticle. Different design patterns on the mask will be mapped to different layers. Therefore, users may be able to review the whole reticle and check the interactions between different designs before the final mask pattern file is available. However, many types of MDCC check results, such as errors from overlapping patterns, usually have very large and complex-shaped highlighted areas covering the boundary of the design. Users have to load the result OASIS file, overlay it on the original database that was assembled in the MDCC process in a layout viewer, and then search for the details of the check results. We introduce a quick result-reviewing method based on an HTML report generated by Calibre® RVE. In the report generation process, we analyze and extract the essential part of the result OASIS file to a result database (RDB) file by standard verification rule format (SVRF) commands. Calibre® RVE automatically loads the assembled reticle pattern and generates screen shots of these check results. All the processes are automatically triggered just after the MDCC process finishes. Users just have to open the HTML report to get the information they need: for example, the check summary, captured images of results and their coordinates.
Low Density Parity Check Codes Based on Finite Geometries: A Rediscovery and More
NASA Technical Reports Server (NTRS)
Kou, Yu; Lin, Shu; Fossorier, Marc
1999-01-01
Low density parity check (LDPC) codes with iterative decoding based on belief propagation achieve astonishing error performance close to the Shannon limit. No algebraic or geometric method for constructing these codes has been reported, and they are largely generated by computer search. As a result, encoding of long LDPC codes is in general very complex. This paper presents two classes of high rate LDPC codes whose constructions are based on finite Euclidean and projective geometries, respectively. These classes of codes are cyclic and have good constraint parameters and minimum distances. The cyclic structure allows the use of linear feedback shift registers for encoding. These finite geometry LDPC codes achieve very good error performance with either soft-decision iterative decoding based on belief propagation or Gallager's hard-decision bit flipping algorithm. These codes can be punctured or extended to obtain other good LDPC codes. A generalization of these codes is also presented.
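Gallager's hard-decision bit-flipping decoding mentioned above admits a very small implementation: count, for each bit, how many failed parity checks it participates in, and flip the worst offenders. The sketch below uses a toy parity-check matrix rather than a finite-geometry code.

```python
import numpy as np

def bit_flip_decode(H, y, max_iters=50):
    """Gallager-style hard-decision decoding on a binary parity-check matrix."""
    x = y.copy()
    for _ in range(max_iters):
        syndrome = H @ x % 2                   # which parity checks fail
        if not syndrome.any():
            return x                           # valid codeword reached
        failures = H.T @ syndrome              # failed checks touching each bit
        x[failures == failures.max()] ^= 1     # flip the worst offenders
    return x

H = np.array([[1, 1, 0, 1, 0, 0],              # toy (6,3) parity-check matrix
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])
codeword = np.array([1, 0, 1, 1, 1, 0])        # satisfies H @ c % 2 == 0
received = codeword.copy()
received[0] ^= 1                               # inject a single bit error
print(bit_flip_decode(H, received))            # recovers the codeword
```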
Consideration and Checkboxes: Incorporating Ethics and Science into the 3Rs
Landi, Margaret S; Shriver, Adam J; Mueller, Anne
2015-01-01
Members of the research community aim to both produce high-quality research and ensure that harm is minimized in animals. The primary means of ensuring these goals are both met is the 3Rs framework of replacement, reduction, and refinement. However, some approaches to the 3Rs may result in a ‘check box mentality’ in which IACUC members, researchers, administrators, and caretakers check off a list of tasks to evaluate a protocol. We provide reasons for thinking that the 3Rs approach could be enhanced with more explicit discussion of the ethical assumptions used to arrive at an approved research protocol during IACUC review. Here we suggest that the notion of moral considerability, and all of the related issues it gives rise to, should be incorporated into IACUC discussions of 3Rs deliberations during protocol review to ensure that animal wellbeing is enhanced within the constraints of scientific investigation. PMID:25836970
Modelling of influential parameters on a continuous evaporation process by Doehlert shells
Porte, Catherine; Havet, Jean-Louis; Daguet, David
2003-01-01
The modelling of the parameters that influence the continuous evaporation of an alcoholic extract was considered using Doehlert matrices. The work was performed with a wiped falling film evaporator that allowed us to study the influence of the pressure, temperature, feed flow and dry matter of the feed solution on the dry matter contents of the resulting concentrate, and the productivity of the process. The Doehlert shells were used to model the influential parameters. The model obtained from the experimental results was checked and revealed some dysfunction in the unit. The evaporator was modified and a new model applied; the experimental results were then in agreement with the equations. The model was finally determined and successfully checked in order to obtain an 8% dry matter concentrate with the best productivity; the results fit in with the industrial constraints of subsequent processes. PMID:18924887
Performance of Low-Density Parity-Check Coded Modulation
NASA Astrophysics Data System (ADS)
Hamkins, J.
2011-02-01
This article presents the simulated performance of a family of nine AR4JA low-density parity-check (LDPC) codes when used with each of five modulations. In each case, the decoder inputs are codebit log-likelihood ratios computed from the received (noisy) modulation symbols using a general formula which applies to arbitrary modulations. Suboptimal soft-decision and hard-decision demodulators are also explored. Bit-interleaving and various mappings of bits to modulation symbols are considered. A number of subtle decoder algorithm details are shown to affect performance, especially in the error floor region. Among these are quantization dynamic range and step size, clipping degree-one variable nodes, "Jones clipping" of variable nodes, approximations of the min* function, and partial hard-limiting messages from check nodes. Using these decoder optimizations, all coded modulations simulated here are free of error floors down to codeword error rates below 10^{-6}. The purpose of generating this performance data is to aid system engineers in determining an appropriate code and modulation to use under specific power and bandwidth constraints, and to provide information needed to design a variable/adaptive coded modulation (VCM/ACM) system using the AR4JA codes.
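The general LLR formula referred to above is, for an AWGN channel, a marginalization of the Gaussian likelihood over the constellation points whose bit label is 0 versus 1. The sketch below states that standard form (it is not copied from the article); the Gray-labeled QPSK constellation and noise variance are illustrative.

```python
import numpy as np

def bit_llrs(r, points, labels, nbits, sigma2):
    """LLR_k = logsum over {s: bit k = 0} p(r|s) minus the same for bit k = 1."""
    metric = -np.abs(r - points) ** 2 / (2 * sigma2)  # per-symbol log-likelihood
    llrs = []
    for k in range(nbits):
        bit = (labels >> k) & 1
        llrs.append(np.logaddexp.reduce(metric[bit == 0]) -
                    np.logaddexp.reduce(metric[bit == 1]))
    return np.array(llrs)

# Gray-labeled QPSK with unit symbol energy; sigma2 is the noise variance.
points = np.array([1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j]) / np.sqrt(2)
labels = np.array([0b00, 0b01, 0b11, 0b10])
print(bit_llrs(0.9 + 0.2j, points, labels, nbits=2, sigma2=0.5))
```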
Checking the Dark Matter Origin of a 3.53 keV Line with the Milky Way Center.
Boyarsky, A; Franse, J; Iakubovskyi, D; Ruchayskiy, O
2015-10-16
We detect a line at 3.539±0.011 keV in the deep exposure data set of the Galactic center region, observed with the X-ray Multi-Mirror Mission (XMM-Newton). The dark matter interpretation of the signal observed in the Perseus galaxy cluster, the Andromeda galaxy [A. Boyarsky et al., Phys. Rev. Lett. 113, 251301 (2014)], and in the stacked spectra of galaxy clusters [E. Bulbul et al., Astrophys. J. 789, 13 (2014)], together with nonobservation of the line in blank-sky data, puts both lower and upper limits on the possible intensity of the line in the Galactic center data. Our result is consistent with these constraints for a class of Milky Way mass models presented previously by observers, and would correspond to a radiative decay dark matter lifetime τ_DM ∼ (6-8)×10^27 s. Although it is hard to exclude an astrophysical origin of this line based on the Galactic center data alone, this is an important consistency check of the hypothesis that encourages us to check it with more observational data that are expected by the end of 2015.
Redundancy of constraints in the classical and quantum theories of gravitation.
NASA Technical Reports Server (NTRS)
Moncrief, V.
1972-01-01
It is shown that in Dirac's version of the quantum theory of gravitation, the Hamiltonian constraints are greatly redundant. If the Hamiltonian constraint condition is satisfied at one point on the underlying, closed three-dimensional manifold, then it is automatically satisfied at every point, provided only that the momentum constraints are everywhere satisfied. This permits one to replace the usual infinity of Hamiltonian constraints by a single condition which may be taken in the form of an integral over the manifold. Analogous theorems are given for the classical Einstein Hamilton-Jacobi equations.
Analyzing Planck and low redshift data sets with advanced statistical methods
NASA Astrophysics Data System (ADS)
Eifler, Tim
The recent ESA/NASA Planck mission has provided a key data set to constrain cosmology that is most sensitive to physics of the early Universe, such as inflation and primordial non-Gaussianity (Planck 2015 results XIII). In combination with cosmological probes of the Large-Scale Structure (LSS), the Planck data set is a powerful source of information to investigate late-time phenomena (Planck 2015 results XIV), e.g. the accelerated expansion of the Universe, the impact of baryonic physics on the growth of structure, and the alignment of galaxies in their dark matter halos. It is the main objective of this proposal to re-analyze the archival Planck data, 1) with different, more recently developed statistical methods for cosmological parameter inference, and 2) to combine Planck and ground-based observations in an innovative way. We will make the corresponding analysis framework publicly available and believe that it will set a new standard for future CMB-LSS analyses. Advanced statistical methods, such as the Gibbs sampler (Jewell et al 2004, Wandelt et al 2004), have been critical in the analysis of Planck data. More recently, Approximate Bayesian Computation (ABC, see Weyant et al 2012, Akeret et al 2015, Ishida et al 2015, for cosmological applications) has matured into an interesting tool for cosmological likelihood analyses. It circumvents several assumptions that enter the standard Planck (and most LSS) likelihood analyses, most importantly the assumption that the functional form of the likelihood of the CMB observables is a multivariate Gaussian. Beyond applying new statistical methods to Planck data in order to cross-check and validate existing constraints, we plan to combine Planck and DES data in a new and innovative way and run multi-probe likelihood analyses of CMB and LSS observables. The complexity of multi-probe likelihood analyses scales (non-linearly) with the level of correlations amongst the individual probes that are included. For the multi-probe analysis proposed here we will use the existing CosmoLike software, a computationally efficient analysis framework that is unique in its integrated ansatz of jointly analyzing probes of the large-scale structure (LSS) of the Universe. We plan to combine CosmoLike with publicly available CMB analysis software (Camb, CLASS) to include modeling capabilities for CMB temperature, polarization, and lensing measurements. The resulting analysis framework will be capable of independently and jointly analyzing data from the CMB and from various probes of the LSS of the Universe. After completion we will utilize this framework to check for consistency amongst the individual probes and subsequently run a joint likelihood analysis of probes that are not in tension. The inclusion of Planck information in a joint likelihood analysis substantially reduces DES uncertainties in cosmological parameters, and allows for unprecedented constraints on parameters that describe astrophysics. In their recent review "Observational Probes of Cosmic Acceleration" (Weinberg et al 2013), the authors emphasize the value of a balanced program that employs several of the most powerful methods in combination, both to cross-check systematic uncertainties and to take advantage of complementary information. The work we propose follows exactly this idea: 1) cross-checking existing Planck results with alternative methods in the data analysis, 2) checking for consistency of Planck and DES data, and 3) running a joint analysis to constrain cosmology and astrophysics.
It is now expedient to develop and refine multi-probe analysis strategies that allow the comparison and inclusion of information from disparate probes to optimally obtain cosmology and astrophysics. Analyzing Planck and DES data offers an ideal opportunity for this purpose, and the corresponding lessons will be of great value for the science preparation of Euclid and WFIRST.
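For readers unfamiliar with ABC, the rejection-sampling variant mentioned in the proposal is easily sketched: draw parameters from the prior, simulate data, and accept draws whose simulated summary statistic lands within a tolerance of the observed one, with no likelihood evaluation at all. The toy Gaussian-mean problem below stands in for the cosmological application.

```python
import numpy as np

rng = np.random.default_rng(0)
observed = rng.normal(1.5, 1.0, size=200)       # "data" with unknown mean 1.5
s_obs = observed.mean()                         # summary statistic

posterior = []
for _ in range(20000):
    theta = rng.uniform(-5, 5)                  # draw from the prior
    simulated = rng.normal(theta, 1.0, size=observed.size)
    if abs(simulated.mean() - s_obs) < 0.05:    # tolerance epsilon
        posterior.append(theta)                 # accept: no likelihood needed

print(f"accepted {len(posterior)} draws, posterior mean ~ {np.mean(posterior):.2f}")
```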
Pilotless Frame Synchronization Using LDPC Code Constraints
NASA Technical Reports Server (NTRS)
Jones, Christopher; Villasenor, John
2009-01-01
A method of pilotless frame synchronization has been devised for low-density parity-check (LDPC) codes. In pilotless frame synchronization, there are no pilot symbols; instead, the offset is estimated by exploiting selected aspects of the structure of the code. The advantage of pilotless frame synchronization is that the bandwidth of the signal is reduced by an amount associated with elimination of the pilot symbols. The disadvantage is an increase in the amount of receiver data processing needed for frame synchronization.
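One simple way to exploit code structure for synchronization, shown in the hedged sketch below (illustrative, not the authors' estimator), is to slide a candidate frame offset over hard decisions and score each offset by how many parity checks it satisfies; only the aligned frames look like codewords.

```python
import numpy as np

def estimate_offset(H, stream, n, frames=3):
    """Score each candidate offset by parity checks satisfied over a few frames."""
    scores = []
    for offset in range(n):
        chunks = (stream[offset + f * n: offset + (f + 1) * n]
                  for f in range(frames))
        scores.append(sum(np.sum((H @ c % 2) == 0) for c in chunks))
    return int(np.argmax(scores))

H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])
codeword = np.array([1, 1, 0, 0, 1, 1])          # a codeword of H
rng = np.random.default_rng(1)
stream = np.concatenate([rng.integers(0, 2, 4), np.tile(codeword, 4)])
print(estimate_offset(H, stream, n=6))           # true frame boundary: offset 4
```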
NASA Astrophysics Data System (ADS)
Hasegawa, K.; Lim, C. S.; Ogure, K.
2003-09-01
We propose a two-zero-texture general Zee model, compatible with the large mixing angle Mikheyev-Smirnov-Wolfenstein solution. The washing out of the baryon number does not occur in this model for an adequate parameter range. We check the consistency of the model with the constraints coming from flavor-changing neutral-current processes, the recent cosmic microwave background observation, and the Z-burst scenario.
Retrospective review of Contura HDR breast cases to improve our standardized procedure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iftimia, Ileana, E-mail: Ileana.n.iftimia@lahey.org; Cirino, Eileen T.; Ladd, Ron
2013-07-01
To retrospectively review our first 20 Contura high dose rate breast cases to improve and refine our standardized procedure and checklists. We prepared in advance checklists for all steps, developed an in-house Excel spreadsheet for second checking the plan, and generated a procedure for efficient contouring and a set of optimization constraints to meet the dose volume histogram criteria. Templates were created in our treatment planning system for structures, isodose levels, optimization constraints, and plan report. This study reviews our first 20 high dose rate Contura breast treatment plans. We followed our standardized procedure for contouring, planning, and second checking. The established dose volume histogram criteria were successfully met for all plans. For the cases studied here, the balloon-skin and balloon-ribs distances ranged between 5 and 43 mm and 1 and 33 mm, respectively; air/seroma volume/PTV_Eval volume ≤ 5.5% (allowed ≤ 10%); asymmetry < 1.2 mm (goal ≤ 2 mm); PTV_Eval V90% ≥ 97.6%; PTV_Eval V95% ≥ 94.9%; skin max dose ≤ 98%Rx; ribs max dose ≤ 137%Rx; V150% ≤ 29.8 cc; V200% ≤ 7.8 cc; the total dwell time range was 225.4 to 401.9 seconds; and the second check agreement was within 3%. Based on this analysis, more appropriate ranges for the total dwell time and balloon diameter tolerance were found. Three major problems were encountered: balloon migration toward the skin for small balloon-to-skin distances, lumen obstruction, and length change for the flexible balloon. Solutions were found for these issues and our standardized procedure and checklists were updated accordingly. Based on our review of these cases, the use of checklists resulted in consistent results, indicating good coverage for the target without sacrificing the critical structures. This review helped us to refine our standardized procedure and update our checklists.
Advances in thermal control and performance of the MMT M1 mirror
NASA Astrophysics Data System (ADS)
Gibson, J. D.; Williams, G. G.; Callahan, S.; Comisso, B.; Ortiz, R.; Williams, J. T.
2010-07-01
Strategies for thermal control of the 6.5-meter diameter borosilicate honeycomb primary (M1) mirror at the MMT Observatory have included: 1) direct control of ventilation system chiller setpoints by the telescope operator, 2) semiautomated control of chiller setpoints, using a fixed offset from the ambient temperature, and 3) most recently, an automated temperature controller for conditioned air. Details of this automated controller, including the integration of multiple chillers, heat exchangers, and temperature/dew point sensors, are presented here. Constraints and sanity checks for thermal control are also discussed, including: 1) mirror and hardware safety, 2) aluminum coating preservation, and 3) optimization of M1 thermal conditions for science acquisition by minimizing both air-to-glass temperature differences, which cause mirror seeing, and internal glass temperature gradients, which cause wavefront errors. Consideration is given to special operating conditions, such as high dew and frost points. Precise temperature control of conditioned ventilation air as delivered to the M1 mirror cell is also discussed. The performance of the new automated controller is assessed and compared to previous control strategies. Finally, suggestions are made for further refinement of the M1 mirror thermal control system and related algorithms.
AdS/CFT and local renormalization group with gauge fields
NASA Astrophysics Data System (ADS)
Kikuchi, Ken; Sakai, Tadakatsu
2016-03-01
We revisit a study of local renormalization group (RG) with background gauge fields incorporated using the AdS/CFT correspondence. Starting with a (d+1)-dimensional bulk gravity coupled to scalars and gauge fields, we derive a local RG equation from a flow equation by working in the Hamilton-Jacobi formulation of the bulk theory. The Gauss's law constraint associated with gauge symmetry plays an important role. RG flows of the background gauge fields are governed by vector β-functions, and some of their interesting properties are known to follow. We give a systematic rederivation of them on the basis of the flow equation. Fixing an ambiguity of local counterterms in such a manner that is natural from the viewpoint of the flow equation, we determine all the coefficients uniquely appearing in the trace of the stress tensor for d=4. A relation between a choice of schemes and a virial current is discussed. As a consistency check, these are found to satisfy the integrability conditions of local RG transformations. From these results, we are led to a proof of a holographic c-theorem by determining a full family of schemes where a trace anomaly coefficient is related with a holographic c-function.
ERIC Educational Resources Information Center
Mastin, David F.; Peszka, Jennifer; Lilly, Deborah R.
2009-01-01
Psychology students completed a task with reinforcement for successful performance. We tested academic integrity under randomly assigned conditions of check mark acknowledgment of an honor pledge, typed honor pledge, or no pledge. Across all conditions, 14.1% of students inflated their self-reported performance (i.e., cheated). We found no…
NASA Astrophysics Data System (ADS)
Elbaz, Reouven; Torres, Lionel; Sassatelli, Gilles; Guillemin, Pierre; Bardouillet, Michel; Martinez, Albert
The bus between the System on Chip (SoC) and the external memory is one of the weakest points of computer systems: an adversary can easily probe this bus in order to read private data (data confidentiality concern) or to inject data (data integrity concern). The conventional way to protect data against such attacks and to ensure data confidentiality and integrity is to implement two dedicated engines: one performing data encryption and another data authentication. This approach, while secure, prevents parallelizability of the underlying computations. In this paper, we introduce the concept of Block-Level Added Redundancy Explicit Authentication (BL-AREA) and we describe a Parallelized Encryption and Integrity Checking Engine (PE-ICE) based on this concept. BL-AREA and PE-ICE have been designed to provide an effective solution to ensure both security services while allowing for full parallelization on processor read and write operations and optimizing the hardware resources. Compared to standard encryption which ensures only confidentiality, we show that PE-ICE additionally guarantees code and data integrity for less than 4% of run-time performance overhead.
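The BL-AREA idea can be sketched conceptually: pack the payload and a per-block tag into a single cipher block before encryption, and on a read, decrypt and compare the tag against a trusted reference copy; block-cipher diffusion makes a tampered ciphertext scramble the tag. PE-ICE itself is a hardware engine, so the cipher choice, block split, and tag width below are purely illustrative.

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

KEY = os.urandom(16)

def seal(payload8: bytes, tag8: bytes) -> bytes:
    """Encrypt an 8-byte payload packed with an 8-byte tag in one AES block."""
    enc = Cipher(algorithms.AES(KEY), modes.ECB()).encryptor()
    return enc.update(payload8 + tag8) + enc.finalize()

def open_checked(block16: bytes, tag8: bytes) -> bytes:
    """Decrypt and verify the embedded tag against the trusted reference copy."""
    dec = Cipher(algorithms.AES(KEY), modes.ECB()).decryptor()
    plain = dec.update(block16) + dec.finalize()
    if plain[8:] != tag8:
        raise ValueError("integrity check failed")   # corrupted or spoofed block
    return plain[:8]

tag = os.urandom(8)                   # reference copy would stay on-chip
stored = seal(b"payload!", tag)       # ciphertext goes to untrusted memory
print(open_checked(stored, tag))      # b'payload!'
try:                                  # flip one ciphertext bit: diffusion
    open_checked(bytes([stored[0] ^ 1]) + stored[1:], tag)  # scrambles the tag
except ValueError as e:
    print(e)
```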
NASA Technical Reports Server (NTRS)
Moncrief, V.; Teitelboim, C.
1972-01-01
It is shown that if the Hamiltonian constraint of general relativity is imposed as a restriction on the Hamilton principal functional in the classical theory, or on the state functional in the quantum theory, then the momentum constraints are automatically satisfied. This result holds both for closed and open spaces and it means that the full content of the theory is summarized by a single functional equation of the Tomonaga-Schwinger type.
Multiconstrained gene clustering based on generalized projections
2010-01-01
Background: Gene clustering for annotating gene functions is one of the fundamental issues in bioinformatics. The best clustering solution is often regularized by multiple constraints such as gene expressions, Gene Ontology (GO) annotations and gene network structures. How to integrate multiple constraints for an optimal clustering solution still remains an unsolved problem. Results: We propose a novel multiconstrained gene clustering (MGC) method within the generalized projection onto convex sets (POCS) framework used widely in image reconstruction. Each constraint is formulated as a corresponding set. The generalized projector iteratively projects the clustering solution onto these sets in order to find a consistent solution included in the intersection set that satisfies all constraints. Compared with previous MGC methods, POCS can integrate multiple constraints of different natures without distorting the original constraints. To evaluate the clustering solution, we also propose a new performance measure referred to as Gene Log Likelihood (GLL) that considers genes having more than one function and hence belonging to more than one cluster. Comparative experimental results show that our POCS-based gene clustering method outperforms current state-of-the-art MGC methods. Conclusions: The POCS-based MGC method can successfully combine multiple constraints of different natures for gene clustering. Also, the proposed GLL is an effective performance measure for soft clustering solutions. PMID:20356386
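The POCS iteration itself is compact: project the current solution onto each constraint set in turn until it settles in the intersection. In the hedged sketch below, two toy convex sets (a box and a hyperplane) stand in for the paper's expression, GO-annotation, and network constraints.

```python
import numpy as np

def project_box(x, lo, hi):              # constraint set 1: a box
    return np.clip(x, lo, hi)

def project_hyperplane(x, a, b):         # constraint set 2: a @ x = b
    return x - (a @ x - b) / (a @ a) * a

x = np.array([3.0, -2.0, 0.5])           # initial (infeasible) solution
a, b = np.ones(3), 1.5
for _ in range(100):                     # cyclic POCS iteration
    x = project_hyperplane(project_box(x, 0.0, 1.0), a, b)

print(x, "sum:", x.sum())                # settles in the intersection
```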
A scanning Hartmann focus test for the EUVI telescopes aboard STEREO
NASA Astrophysics Data System (ADS)
Ohl, R., IV; Antonille, S.; Aronstein, D.; Dean, B.; Delmont, M.; d'Entremont, J.; Eichhorn, W.; Frey, B.; Hynes, S.; Janssen, D.; Kubalak, D.; Redman, K.; Shiri, R.; Smith, J. S.; Thompson, P.; Wilson, M.
2007-09-01
The Solar TErrestrial RElations Observatory (STEREO), the third mission in NASA's Solar Terrestrial Probes program, was launched in 2006 on a two year mission to study solar phenomena. STEREO consists of two nearly identical satellites, each carrying an Extreme Ultraviolet Imager (EUVI) telescope as part of the Sun Earth Connection Coronal and Heliospheric Investigation instrument suite. EUVI is a normal incidence, 98mm diameter, Ritchey-Chrétien telescope designed to obtain wide field of view images of the Sun at short wavelengths (17.1-30.4nm) using a CCD detector. The telescope entrance aperture is divided into four quadrants by a mask near the secondary mirror spider veins. A mechanism that rotates another mask allows only one of these sub-apertures to accept light over an exposure. The EUVI contains no focus mechanism. Mechanical models predict a difference in telescope focus between ambient integration conditions and on-orbit operation. We describe an independent check of the ambient, ultraviolet, absolute focus setting of the EUVI telescopes after they were integrated with their respective spacecraft. A scanning Hartmann-like test design resulted from constraints imposed by the EUVI aperture select mechanism. This inexpensive test was simultaneously coordinated with other integration and test activities in a high-vibration, clean room environment. The total focus test error was required to be better than +/-0.05mm. We cover the alignment and test procedure, sources of statistical and systematic error, data reduction and analysis, and results using various algorithms for determining focus. The results are consistent with other tests of instrument focus alignment and indicate that the EUVI telescopes meet the ambient focus offset requirements. STEREO and the EUVI telescopes are functioning well on-orbit.
A Scanning Hartmann Focus Test for the EUVI Telescopes aboard STEREO
NASA Technical Reports Server (NTRS)
Ohl, Ray; Antonille, Scott; Aronstein, Dave; Dean, Bruce; Eichhorn, Bil; Frey, Brad; Kubalak, Dave; Shiri, Ron; Smith, Scott; Wilson, Mark;
2007-01-01
The Solar TErrestrial RElations Observatory (STEREO), the third mission in NASA's Solar Terrestrial Probes program, was launched in 2006 on a two year mission to study solar phenomena. STEREO consists of two nearly identical satellites, each carrying an Extreme Ultraviolet Imager (EUVI) telescope as part of the Sun Earth Connection Coronal and Heliospheric Investigation instrument suite. EUVI is a normal incidence, 98mm diameter, Ritchey-Chretien telescope designed to obtain wide field of view images of the Sun at short wavelengths (17.1-30.4nm) using a CCD detector. The telescope entrance aperture is divided into four quadrants by a mask near the secondary mirror spider veins. A mechanism that rotates another mask allows only one of these sub-apertures to accept light over an exposure. The EUVI contains no focus mechanism. Mechanical models predict a difference in telescope focus between ambient integration conditions and on-orbit operation. We describe an independent check of the ambient, ultraviolet, absolute focus setting of the EUVI telescopes after they were integrated with their respective spacecraft. A scanning Hartmann-like test design resulted from constraints implied by the EUVI aperture select mechanism. This inexpensive test was simultaneously coordinated with other NASA integration and test activities in a high-vibration, clean room environment. The total focus test error was required to be better than +/-0.05 mm. We describe the alignment and test procedure, sources of statistical and systematic error, and then the focus determination results using various algorithms. The results are consistent with other tests of focus alignment and indicate that the EUVI telescopes meet the ambient focus offset requirements. STEREO is functioning well on-orbit and the EUVI telescopes meet their on-orbit image quality requirements.
Šupak-Smolčić, Vesna; Šimundić, Ana-Maria
2013-01-01
In February 2013, Biochemia Medica joined CrossRef, which enabled us to implement the CrossCheck plagiarism detection service. Therefore, all manuscripts submitted to Biochemia Medica are now first assigned to a Research integrity editor (RIE) before being sent for peer review. The RIE submits the text to CrossCheck analysis and is responsible for reviewing the results of the text similarity analysis. Based on the CrossCheck analysis results, the RIE subsequently provides a recommendation to the Editor-in-chief (EIC) on whether the manuscript should be forwarded to peer review, corrected for suspected parts prior to peer review or immediately rejected. The final decision on the manuscript, however, rests with the EIC. We hope that our new policy and manuscript processing algorithm will help us to further increase the overall quality of our Journal. PMID:23894858
NASA Technical Reports Server (NTRS)
McNamara, Luke W.; Braun, Robert D.
2014-01-01
One of the key design objectives of NASA's Orion Exploration Mission 1 (EM-1) is to execute a guided entry trajectory demonstrating GN&C capability. The focus of this paper is defining the flyable entry corridor for EM-1, taking into account multiple subsystem constraints such as complex aerothermal heating constraints, aerothermal heating objectives, landing accuracy constraints, structural load limits, Human-System-Integration-Requirements, Service Module debris disposal limits, and other flight test objectives. During EM-1 Design Analysis Cycle 1, design challenges arose that made defining the flyable entry corridor for the EM-1 mission critical to mission success. This document details the optimization techniques that were explored for use with the 6-DOF ANTARES simulation to assist in defining the design entry interface state and entry corridor with respect to key flight test constraints and objectives.
Delprato, Marcos; Akyeampong, Kwame
Early age of marriage is a barrier to mothers' health care around pregnancy and to children's health outcomes. We provide evidence on the health benefits of postponing early marriage among young wives (from age 10-14 to age 15-17) for women's health care and children's health in sub-Saharan Africa (SSA) and Southwest Asia (SWA). We use data for 39 countries from the Demographic and Health Surveys to estimate the effects of postponing early marriage on women's health care and children's health outcomes and immunization using matching techniques. We also assess whether women's health empowerment and health constraints are additional barriers. We found that in SSA, delaying the age of marriage from age 10-14 to age 15-17 and from age 15-17 to age 18 or older leads to an increase in maternal neotetanus vaccinations of 2.4% and 3.2%, respectively; gains in the likelihood of postnatal checks are larger for delayed marriage among the youngest wives (aged 10-14). In SWA, the number of antenatal visits increases by 34% and the likelihood of having a skilled birth attendant rises by 4.1% if young wives postpone marriage. In SSA, the probability of children receiving basic vaccinations is twice as large, and the reduction in neonatal mortality is nearly double, if their mothers married at ages 15-17 instead of at ages 10-14. The extent of these benefits is also shaped by supply constraints and cultural factors. For instance, we found that weak bargaining power over health decisions for young wives leads to 11% fewer antenatal visits (SWA) and a 13% lower chance of attending postnatal checks (SSA). Delaying the age of marriage among young wives can lead to considerable gains in health care utilization and children's health in SSA and SWA if supported by policies that lessen supply constraints and raise women's health empowerment. Copyright © 2017 Icahn School of Medicine at Mount Sinai. Published by Elsevier Inc. All rights reserved.
Constraint treatment techniques and parallel algorithms for multibody dynamic analysis. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Chiou, Jin-Chern
1990-01-01
Computational procedures for kinematic and dynamic analysis of three-dimensional multibody dynamic (MBD) systems are developed from the differential-algebraic equations (DAEs) viewpoint. Constraint violations during the time integration process are minimized, and penalty constraint stabilization techniques and partitioning schemes are developed. To treat the governing equations of motion, a two-stage staggered explicit-implicit numerical algorithm that takes advantage of a partitioned solution procedure is employed. A robust and parallelizable integration algorithm is developed. This algorithm uses a two-stage staggered central difference scheme to integrate the translational coordinates and the angular velocities. The angular orientations of bodies in MBD systems are then obtained by using an implicit algorithm via the kinematic relationship between Euler parameters and angular velocities. It is shown that the combination of the present solution procedures yields a computationally more accurate solution. To speed up the computational procedures, parallel implementation of the present constraint treatment techniques and the two-stage staggered explicit-implicit numerical algorithm was efficiently carried out. The DAEs and the constraint treatment techniques were transformed into arrowhead matrices, from which a Schur complement form was derived. By fully exploiting sparse matrix structural analysis techniques, a parallel preconditioned conjugate gradient numerical algorithm is used to solve the system equations written in Schur complement form. A software testbed was designed and implemented on both sequential and parallel computers. This testbed was used to demonstrate the robustness and efficiency of the constraint treatment techniques, the accuracy of the two-stage staggered explicit-implicit numerical algorithm, and the speedup of the Schur-complement-based parallel preconditioned conjugate gradient algorithm on a parallel computer.
SU-E-P-20: Personnel Lead Apparel Integrity Inspection: Where We Are and What We Need?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, S; Zhang, J; Anaskevich, L
Purpose: In recent years, tremendous efforts have been devoted to radiation dose reduction, especially for patients who are directly exposed to primary radiation or receive radiopharmaceuticals. Limited efforts have been focused on those personnel who are exposed to secondary radiation while fulfilling their work responsibilities associated with diagnostic imaging and image-guided interventions. Occupational exposure is compounded in daily practice and can lead to a significant radiation dose over time. Personnel lead apparel is a well-accepted engineering control to protect healthcare workers when radiation is inevitable. The question is, do we have a nationally established program to protect personnel? This study investigates lead apparel inspection programs across the USA. Methods: A series of surveys of state regulations, the University Health System Consortium, federal regulations, and regulations determined by accrediting bodies was conducted. The surveys were used to determine the current status of lead apparel programs regarding integrity inspections. Based on the survey results, a thorough program was proposed. Results: Of 50 states, seventeen states and Washington D.C. require lead apparel integrity inspections within their state regulations. Eleven of these states specify that the inspection is required on an annual basis. Two of these states require lead apron integrity checks to be performed semi-annually. Eleven of the two hundred academic medical centers surveyed responded. The results show that the method (visual vs. fluoroscopic) used to conduct lead apparel integrity checks differs greatly among healthcare organizations. The FDA, EPA, CRCPD and NCRP require lead apparel integrity checks; however, the level of these policies differs. A standard program is not well established, and there is clearly a lack of standardization. Conclusion: A program led by legislation (state or federal government) and with specific frequency, methods, tracking and criteria is needed to ensure the integrity of personnel lead apparel.
NASA Astrophysics Data System (ADS)
Housh, M.; Ng, T.; Cai, X.
2012-12-01
Environmental impact is one of the major concerns in biofuel development. While many other studies have examined the impact of biofuel expansion on stream flow and water quality, this study examines the problem from the other side - whether and how a biofuel production target will be affected by given environmental constraints. For this purpose, an integrated model comprising sub-systems for biofuel refineries, transportation, agriculture, water resources, and the crops/ethanol market has been developed. The sub-systems are integrated into one large-scale model to guide the optimal development plan considering the interdependency between the sub-systems. The optimal development plan includes biofuel refinery locations and capacities, refinery operation, land allocation between biofuel and food crops, and the corresponding stream flow and nitrate load in the watershed. The watershed is modeled as a network flow, in which the nodes represent sub-watersheds and the arcs are defined as the linkages between the sub-watersheds. The runoff contribution of each sub-watershed is determined based on the land cover and the water uses in that sub-watershed. Thus, decisions of other sub-systems, such as the land allocation in the land use sub-system and the water use in the refinery sub-system, define the sources and the sinks of the network. Environmental policies are addressed in the integrated model by imposing stream flow and nitrate load constraints. These constraints can be specified by location and time in the watershed to reflect the spatial and temporal variation of the regulations. Preliminary results show that imposing monthly water flow constraints and yearly nitrate load constraints will change the biofuel development plan dramatically. Sensitivity analysis is performed to examine how the environmental constraints and their spatial and temporal distribution influence the overall biofuel development plan and the performance of each of the sub-systems. Additional scenarios are analyzed to show the synergies of crop pattern choice (first versus second generation of biofuel crops), refinery technology adaptation (particularly on water use), refinery plant distribution, and economic incentives in terms of balanced environmental protection and bioenergy development objectives.
Integrated Risk Management Within NASA Programs/Projects
NASA Technical Reports Server (NTRS)
Connley, Warren; Rad, Adrian; Botzum, Stephen
2004-01-01
As NASA Project Risk Management activities continue to evolve, the need to successfully integrate risk management processes across the life cycle, between functional disciplines, stakeholders, and various management policies, and within cost, schedule and performance requirements/constraints becomes more evident and important. Today's programs and projects are complex undertakings that include a myriad of processes, tools, techniques, management arrangements and other variables, all of which must function together in order to achieve mission success. The perception and impact of risk may vary significantly among stakeholders and may influence decisions that have unintended consequences on the project during a future phase of the life cycle. In these cases, risks may be unintentionally and/or arbitrarily transferred to others without the benefit of a comprehensive systemic risk assessment. Integrating risk across people, processes, and project requirements/constraints serves to enhance decisions, strengthen communication pathways, and reinforce the ability of the project team to identify and manage risks across the broad spectrum of project management responsibilities. The ability to identify risks in all areas of project management increases the likelihood that a project will identify significant issues before they become problems and allows projects to make effective and efficient use of shrinking resources. A total-team integrated risk effort, a disciplined and rigorous process, and an understanding of project requirements/constraints together provide the opportunity for more effective risk management. Applying an integrated approach to risk management makes it possible to do a better job of balancing safety, cost, schedule, operational performance and other elements of risk. This paper will examine how people, processes, and project requirements/constraints can be integrated across the project life cycle for better risk management and ultimately improve the chances for mission success.
A jazz-based approach for optimal setting of pressure reducing valves in water distribution networks
NASA Astrophysics Data System (ADS)
De Paola, Francesco; Galdiero, Enzo; Giugni, Maurizio
2016-05-01
This study presents a model for valve setting in water distribution networks (WDNs), with the aim of reducing the level of leakage. The approach is based on the harmony search (HS) optimization algorithm. The HS mimics a jazz improvisation process able to find the best solutions, in this case corresponding to valve settings in a WDN. The model also interfaces with the improved version of a popular hydraulic simulator, EPANET 2.0, to check the hydraulic constraints and to evaluate the performances of the solutions. Penalties are introduced in the objective function in case of violation of the hydraulic constraints. The model is applied to two case studies, and the obtained results in terms of pressure reductions are comparable with those of competitive metaheuristic algorithms (e.g. genetic algorithms). The results demonstrate the suitability of the HS algorithm for water network management and optimization.
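To make the search procedure concrete, here is a minimal harmony search sketch. It is illustrative only, not the authors' implementation: the leakage objective and pressure-violation penalty below are invented placeholders, whereas in the actual model the hydraulic constraints are evaluated by EPANET 2.0.

```python
import random

def harmony_search(objective, bounds, hms=20, hmcr=0.9, par=0.3, iters=2000):
    """Minimize objective over box bounds with a basic harmony search loop."""
    dim = len(bounds)
    # Initialize the harmony memory with random candidate valve settings.
    memory = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    scores = [objective(h) for h in memory]
    for _ in range(iters):
        new = []
        for j, (lo, hi) in enumerate(bounds):
            if random.random() < hmcr:            # draw a value from memory...
                x = random.choice(memory)[j]
                if random.random() < par:         # ...with optional pitch adjustment
                    x += random.uniform(-1, 1) * 0.01 * (hi - lo)
            else:                                 # or improvise a fresh value
                x = random.uniform(lo, hi)
            new.append(min(max(x, lo), hi))
        score = objective(new)
        worst = max(range(hms), key=lambda i: scores[i])
        if score < scores[worst]:                 # replace the worst harmony
            memory[worst], scores[worst] = new, score
    best = min(range(hms), key=lambda i: scores[i])
    return memory[best], scores[best]

def leakage_with_penalty(settings):
    """Hypothetical objective: placeholder leakage model plus a penalty for
    violating a minimum-pressure constraint (a hydraulic solver such as
    EPANET would perform this check in the real model)."""
    leakage = sum((s - 0.5) ** 2 for s in settings)
    violation = max(0.0, 0.4 - min(settings))
    return leakage + 1e3 * violation
```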
Toward inflation models compatible with the no-boundary proposal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hwang, Dong-il; Yeom, Dong-han, E-mail: dongil.j.hwang@gmail.com, E-mail: innocent.yeom@gmail.com
2014-06-01
In this paper, we investigate various inflation models in the context of the no-boundary proposal. We propose that a good inflation model should satisfy three conditions: observational constraints, plausible initial conditions, and naturalness of the model. For various inflation models, we assign the probability to each initial condition using the no-boundary proposal and define a quantitative standard, typicality, to check whether the model satisfies the observational constraints with probable initial conditions. There are three possible ways to satisfy the typicality criterion: there was pre-inflation near the high energy scale, the potential is finely tuned or the inflationary field space is unbounded, or there is a sufficient number of fields that contribute to inflation. The no-boundary proposal rejects some naive inflation models, explains some traditional doubts about inflation, and, possibly, can have observational consequences.
Self-adaptive predictor-corrector algorithm for static nonlinear structural analysis
NASA Technical Reports Server (NTRS)
Padovan, J.
1981-01-01
A multiphase self-adaptive predictor-corrector type algorithm was developed. This algorithm enables the solution of highly nonlinear structural responses including kinematic, kinetic and material effects as well as pre/post-buckling behavior. The strategy involves three main phases: (1) the use of a warpable hyperelliptic constraint surface which serves to upper-bound dependent iterate excursions during successive incremental Newton-Raphson (INR) type iterations; (2) the use of an energy constraint to scale the generation of successive iterates so as to maintain the appropriate form of local convergence behavior; (3) the use of quality-of-convergence checks which enable various self-adaptive modifications of the algorithmic structure when necessary. The restructuring is achieved by tightening various conditioning parameters as well as switching to different algorithmic levels to improve the convergence process. The capabilities of the procedure to handle various types of static nonlinear structural behavior are illustrated.
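As a rough illustration of the strategy (not the paper's algorithm), the sketch below runs a generic incremental Newton-Raphson loop in which iterate excursions are upper-bounded by a simple step-norm cap, a crude stand-in for the warpable hyperelliptic constraint surface, and a quality-of-convergence check tightens that cap when the residual grows. All function names and parameters are hypothetical.

```python
import numpy as np

def incremental_newton(residual, jacobian, u0, load_steps,
                       max_norm=1.0, tol=1e-8, max_iter=25):
    """INR-type sketch: outer load increments, inner Newton iterations
    with bounded excursions and a self-adaptive step-bound tightening."""
    u = np.asarray(u0, dtype=float)
    for lam in load_steps:                        # incremental loading
        prev_norm = np.inf
        for _ in range(max_iter):
            r = residual(u, lam)
            r_norm = np.linalg.norm(r)
            if r_norm < tol:
                break
            if r_norm > prev_norm:                # quality-of-convergence check:
                max_norm *= 0.5                   # tighten the conditioning parameter
            du = np.linalg.solve(jacobian(u, lam), -r)
            n = np.linalg.norm(du)
            if n > max_norm:                      # upper-bound the iterate excursion
                du *= max_norm / n
            u = u + du
            prev_norm = r_norm
    return u
```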
Primordial black holes for the LIGO events in the axionlike curvaton model
NASA Astrophysics Data System (ADS)
Ando, Kenta; Inomata, Keisuke; Kawasaki, Masahiro; Mukaida, Kyohei; Yanagida, Tsutomu T.
2018-06-01
We review primordial black hole (PBH) formation in the axionlike curvaton model and investigate whether PBHs formed in this model can be the origin of the gravitational wave (GW) signals detected by the Advanced LIGO. In this model, small-scale curvature perturbations with large amplitude are generated, which is essential for PBH formation. On the other hand, large curvature perturbations also become a source of primordial GWs through their second-order effects. Severe constraints are imposed on such GWs by pulsar timing array (PTA) experiments. We also check the consistency of the model with these constraints. In this analysis, it is important to take into account the effect of non-Gaussianity, which is generated easily in the curvaton model. We see that, in the presence of non-Gaussianity, a fixed amount of PBHs can be produced with a smaller amplitude of the primordial power spectrum.
General relativity in two dimensions: A Hamilton-Jacobi analysis
NASA Astrophysics Data System (ADS)
Bertin, M. C.; Pimentel, B. M.; Pompeia, P. J.
2010-11-01
We analyzed the constraint structure of the Einstein-Hilbert first-order action in two dimensions using the Hamilton-Jacobi approach. We were able to find a set of involutive, as well as a set of non-involutive constraints. Using generalized brackets we showed how to assure integrability of the theory, to eliminate the set of non-involutive constraints and how to build the field equations.
NASA Technical Reports Server (NTRS)
1981-01-01
The modified CG2000 crystal grower construction, installation, and machine check-out were completed. The process development check-out proceeded with several dry runs and one growth run. Several machine calibration and functional problems were discovered and corrected. Several exhaust gas analysis system alternatives were evaluated, and an integrated system was approved and ordered. A contract presentation was made at the Project Integration Meeting at JPL, including cost projections using contract-projected throughput and machine parameters. Several growth runs on a development CG200 RC grower show that complete neck, crown, and body automated growth can be achieved with only one operator input. Work continued on melt level, melt temperature, and diameter sensor development.
Integrating digital topology in image-processing libraries.
Lamy, Julien
2007-01-01
This paper describes a method to integrate digital topology information in image-processing libraries. This additional information allows a library user to write algorithms respecting topological constraints, for example, a seed fill or a skeletonization algorithm. As digital topology is absent from most image-processing libraries, such constraints otherwise cannot be fulfilled. We describe and give code samples for all the structures necessary for this integration, and show a use case in the form of a homotopic thinning filter inside ITK. The obtained filter can be up to a hundred times as fast as ITK's thinning filter and works for any image dimension. This paper mainly deals with integration within ITK, but the method can be adapted with only minor modifications to other image-processing libraries.
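To give a flavor of the topological constraint involved, here is a hedged, self-contained 2D sketch (not the paper's ITK code): a crossing-number test for simple points, and a naive sequential thinning loop that deletes only points whose removal preserves topology under (8,4) connectivity.

```python
import numpy as np

def is_simple_2d(img, r, c):
    """Crossing-number test: a foreground pixel whose ordered 8-neighbourhood
    shows exactly one 0->1 transition can be removed without changing the
    image topology (assuming 8-connected foreground, 4-connected background)."""
    offs = [(-1, 0), (-1, 1), (0, 1), (1, 1),
            (1, 0), (1, -1), (0, -1), (-1, -1)]   # clockwise neighbours
    n = [int(img[r + dr, c + dc]) for dr, dc in offs]
    transitions = sum(1 for a, b in zip(n, n[1:] + n[:1]) if a == 0 and b == 1)
    return img[r, c] == 1 and 1 <= sum(n) <= 6 and transitions == 1

def homotopic_thinning(img):
    """Naive, unoptimized sequential thinning: repeatedly delete simple
    points until stability; each deletion preserves topology by the test above."""
    img = img.copy()
    changed = True
    while changed:
        changed = False
        for r in range(1, img.shape[0] - 1):
            for c in range(1, img.shape[1] - 1):
                if is_simple_2d(img, r, c):
                    img[r, c] = 0
                    changed = True
    return img
```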
Low-cost and high-speed optical mark reader based on an intelligent line camera
NASA Astrophysics Data System (ADS)
Hussmann, Stephan; Chan, Leona; Fung, Celine; Albrecht, Martin
2003-08-01
Optical Mark Recognition (OMR) is thoroughly reliable and highly efficient provided that high standards are maintained at both the planning and implementation stages. It is necessary to ensure that OMR forms are designed with due attention to data integrity checks, that the best use is made of features built into the OMR, that data integrity is checked before the data is processed, and that data is validated before processing. This paper describes the design and implementation of an OMR prototype system for marking multiple-choice tests automatically. Parameter testing was carried out before the platform and the multiple-choice answer sheet were designed. Position recognition and position verification methods have been developed and implemented in an intelligent line scan camera. The position recognition process is implemented in a Field Programmable Gate Array (FPGA), whereas the verification process is implemented in a micro-controller. The verified results are then sent to the Graphical User Interface (GUI) for answer checking and statistical analysis. At the end of the paper the proposed OMR system is compared with commercially available systems on the market.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mendes, Albert C.R., E-mail: albert@fisica.ufjf.br; Takakura, Flavio I., E-mail: takakura@fisica.ufjf.br; Abreu, Everton M.C., E-mail: evertonabreu@ufrrj.br
In this work we have obtained a higher-derivative Lagrangian for a charged fluid coupled with the electromagnetic field, and Dirac's constraint analysis was discussed. A set of first-class constraints fixed by a noncovariant gauge condition was obtained. The path integral formalism was used to obtain the partition function for the corresponding higher-derivative Hamiltonian, and the Faddeev–Popov ansatz was used to construct an effective Lagrangian. Through the partition function, a Stefan–Boltzmann type law was obtained. - Highlights: • Higher-derivative Lagrangian for a charged fluid. • Electromagnetic coupling and Dirac's constraint analysis. • Partition function through path integral formalism. • Stefan–Boltzmann type law through the partition function.
1991-09-01
putting all tasks directed towards achieving an outcome in sequence. The tasks can be viewed as steps in the process (39:2.3). Using this... improvement opportunity is investigated. A plan is developed, root causes are identified, and solutions are tested and implemented. The process is... solutions, check for actual improvement, and integrate the successful improvements into the process. Step 7. Check Improvement Performance. Finally, the
Roybal, H; Baxendale, S J; Gupta, M
1999-01-01
Activity-based costing and the theory of constraints have been applied successfully in many manufacturing organizations. Recently, those concepts have been applied in service organizations. This article describes the application of activity-based costing and the theory of constraints in a managed care mental health and substance abuse organization. One of the unique aspects of this particular application was the integration of activity-based costing and the theory of constraints to guide process improvement efforts. This article describes the activity-based costing model and the application of the theory of constraints' focusing steps, with an emphasis on unused capacities of activities in the organization.
AN ENERGY SYSTEMS PERSPECTIVE OF ECOLOGICAL INTEGRITY AND ECOSYSTEM HEALTH
The integrity and health of society's life-supporting ecosystems establishes a fundamental constraint on economic growth and development. Energy Systems Theory provides a theoretical basis for defining, measuring and interpreting the concepts of ecological integrity and ecosystem health.
Energy conditions in f(T, TG) gravity
NASA Astrophysics Data System (ADS)
Jawad, Abdul
2015-05-01
This paper is devoted to studying the energy conditions in f(T, TG) gravity for the FRW universe with a perfect fluid, where T is the torsion scalar and TG is the quartic torsion scalar. We construct the energy conditions in this theory and discuss them for two specific f(T, TG) models that represent viability through some cosmological scenarios. We consider cosmographic parameters to simplify the energy condition expressions. The present-day values of these parameters are assumed in order to check the constraints on model parameters through the energy condition inequalities.
Consistency of anisotropic inflation during rapid oscillations with Planck 2015 data
NASA Astrophysics Data System (ADS)
Saleem, Rabia
2018-07-01
This paper aims to study the compelling issue of cosmic inflation during rapid oscillations using the framework of non-minimal derivative coupling. To this end, an anisotropic and homogeneous Bianchi I background is considered. In this context, I developed the formalism of anisotropic oscillatory inflation and found some constraints for the existence of inflation. In this era, the parameters related to cosmological perturbations are evaluated; further, their graphical trajectories are presented to check the compatibility of the model with the observational data (Planck 2015 probe).
Checking the Grammar Checker: Integrating Grammar Instruction with Writing.
ERIC Educational Resources Information Center
McAlexander, Patricia J.
2000-01-01
Notes Rei Noguchi's recommendation of integrating grammar instruction with writing instruction and teaching only the most vital terms and the most frequently made errors. Presents a project that provides a review of the grammar lessons, applies many grammar rules specifically to the students' writing, and teaches students the effective use of the…
Conceptual Integration of Chemical Equilibrium by Prospective Physical Sciences Teachers
ERIC Educational Resources Information Center
Ganaras, Kostas; Dumon, Alain; Larcher, Claudine
2008-01-01
This article describes an empirical study concerning the mastering of the chemical equilibrium concept by prospective physical sciences teachers. The main objective was to check whether the concept of chemical equilibrium had become an integrating and unifying concept for them, that is to say an operational and functional knowledge to explain and…
Automated Derivation of Complex System Constraints from User Requirements
NASA Technical Reports Server (NTRS)
Muery, Kim; Foshee, Mark; Marsh, Angela
2006-01-01
International Space Station (ISS) payload developers submit their payload science requirements for the development of on-board execution timelines. The ISS systems required to execute the payload science operations must be represented as constraints for the execution timeline. Payload developers use a software application, User Requirements Collection (URC), to submit their requirements by selecting a simplified representation of ISS system constraints. To fully represent the complex ISS systems, the constraints require a level of detail that is beyond the insight of the payload developer. To provide the complex representation of the ISS system constraints, HOSC operations personnel, specifically the Payload Activity Requirements Coordinators (PARCs), manually translate the payload developers' simplified constraints into detailed ISS system constraints used for scheduling the payload activities in the Consolidated Planning System (CPS). This paper describes the implementation of a software application, User Requirements Integration (URI), developed to automate the manual ISS constraint translation process.
Kang, Hyunchul
2015-01-01
We investigate the in-network processing of an iceberg join query in wireless sensor networks (WSNs). An iceberg join is a special type of join where only those joined tuples whose cardinality exceeds a certain threshold (called iceberg threshold) are qualified for the result. Processing such a join involves the value matching for the join predicate as well as the checking of the cardinality constraint for the iceberg threshold. In the previous scheme, the value matching is carried out as the main task for filtering non-joinable tuples while the iceberg threshold is treated as an additional constraint. We take an alternative approach, meeting the cardinality constraint first and matching values next. In this approach, with a logical fragmentation of the join operand relations on the aggregate counts of the joining attribute values, the optimal sequence of 2-way fragment semijoins is generated, where each fragment semijoin employs a Bloom filter as a synopsis of the joining attribute values. This sequence filters non-joinable tuples in an energy-efficient way in WSNs. Through implementation and a set of detailed experiments, we show that our alternative approach considerably outperforms the previous one. PMID:25774710
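A minimal sketch of one fragment-semijoin step may help illustrate the "cardinality first, value matching second" ordering. The Bloom filter, tuple layout, and per-value counts below are illustrative assumptions, not the paper's actual protocol.

```python
import hashlib

class BloomFilter:
    """Tiny Bloom filter used as a synopsis of joining-attribute values."""
    def __init__(self, m=1024, k=3):
        self.m, self.k, self.bits = m, k, 0

    def _hashes(self, value):
        for i in range(self.k):
            h = hashlib.sha1(f"{i}:{value}".encode()).hexdigest()
            yield int(h, 16) % self.m

    def add(self, value):
        for pos in self._hashes(value):
            self.bits |= 1 << pos

    def might_contain(self, value):
        return all((self.bits >> pos) & 1 for pos in self._hashes(value))

def fragment_semijoin(fragment_r, counts_s, bloom_s, threshold):
    """One 2-way fragment semijoin step: keep R-tuples whose join value could
    still reach the iceberg threshold (cardinality checked first) and might
    appear in the S fragment (value matching via the Bloom synopsis)."""
    out = []
    for t in fragment_r:
        v = t["join_key"]
        # Cardinality constraint first: prune values that cannot reach the
        # threshold even if every candidate tuple matches.
        if counts_s.get(v, 0) < threshold:
            continue
        if bloom_s.might_contain(v):
            out.append(t)
    return out
```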
Pey, Jon; Rubio, Angel; Theodoropoulos, Constantinos; Cascante, Marta; Planes, Francisco J
2012-07-01
Constraints-based modeling is an emergent area in Systems Biology that includes an increasing set of methods for the analysis of metabolic networks. In order to refine its predictions, the development of novel methods integrating high-throughput experimental data is currently a key challenge in the field. In this paper, we present a novel set of constraints that integrate tracer-based metabolomics data from Isotope Labeling Experiments and metabolic fluxes in a linear fashion. These constraints are based on Elementary Carbon Modes (ECMs), a recently developed concept that generalizes Elementary Flux Modes at the carbon level. To illustrate the effect of our ECMs-based constraints, a Flux Variability Analysis approach was applied to a previously published metabolic network involving the main pathways in the metabolism of glucose. The addition of our ECMs-based constraints substantially reduced the under-determination resulting from a standard application of Flux Variability Analysis, which shows a clear progress over the state of the art. In addition, our approach is adjusted to deal with combinatorial explosion of ECMs in genome-scale metabolic networks. This extension was applied to infer the maximum biosynthetic capacity of non-essential amino acids in human metabolism. Finally, as linearity is the hallmark of our approach, its importance is discussed at a methodological, computational and theoretical level and illustrated with a practical application in the field of Isotope Labeling Experiments. Copyright © 2012 Elsevier Inc. All rights reserved.
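As an illustration of how extra linear constraints tighten Flux Variability Analysis, the sketch below runs a generic FVA with optional inequality constraints A v <= b standing in for the ECM-based constraints. The stoichiometric matrix and coefficients are placeholders, not the paper's model, and the LP is assumed feasible.

```python
import numpy as np
from scipy.optimize import linprog

def flux_variability(S, bounds, extra_A=None, extra_b=None):
    """For each reaction j, minimize and maximize flux v_j subject to the
    steady-state balance S v = 0, box bounds, and optional extra linear
    constraints A v <= b (e.g., ECM-derived constraints)."""
    n = S.shape[1]
    ranges = []
    for j in range(n):
        c = np.zeros(n)
        c[j] = 1.0
        lo = linprog(c, A_ub=extra_A, b_ub=extra_b,
                     A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds)
        hi = linprog(-c, A_ub=extra_A, b_ub=extra_b,
                     A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds)
        ranges.append((lo.fun, -hi.fun))   # (min flux, max flux)
    return ranges
```

Adding rows to extra_A/extra_b can only shrink the feasible region, so the returned flux ranges narrow, which is the under-determination reduction the abstract describes.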
Metrology Laboratory | Energy Systems Integration Facility | NREL
Capabilities include measurement of light sources (natural and artificial); spectral reflectance and transmission of materials (functional check only); and calibration of pyrheliometers, pyranometers, and pyrgeometers. The Metrology Laboratory provides National Institute of Standards and Technology traceable calibrations.
[Design and implementation of data checking system for Chinese materia medica resources survey].
Wang, Hui; Zhang, Xiao-Bo; Ge, Xiao-Guang; Jin, Yan; Jing, Zhi-Xian; Qi, Yuan-Hua; Wang, Ling; Zhao, Yu-Ping; Wang, Wei; Guo, Lan-Ping; Huang, Lu-Qi
2017-11-01
The Chinese materia medica resources (CMMR) national survey information management system has collected a large amount of data. To help with data rechecking, reduce the internal workload, and improve the rechecking of survey data at the provincial and county levels, the National Resource Center for Chinese Materia Medica has designed a data checking system for the Chinese materia medica resources survey based on J2EE technology, the Java language, and an Oracle database, in accordance with the SOA framework. It includes single data checks, check scoring, content management, and both manual and automatic checking of the survey data, covering nine aspects (census implementation plans, key research information, general survey information, cultivation of medicinal materials, germplasm resources, medicinal material information, market research, traditional knowledge, and specimen information) across 20 classes and 175 indicators, in the two dimensions of quantity and quality. The established system assists in ensuring data consistency and accuracy and prompts the county survey teams to complete data entry and arrangement in a timely manner, so as to improve the integrity, consistency and accuracy of the survey data and ensure effective and available data. This lays a foundation for providing accurate data support for the national CMMR survey results summary, display, and sharing. Copyright© by the Chinese Pharmaceutical Association.
[Advanced information technologies for financial services industry]. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The project scope is to develop an advanced user interface utilizing speech and/or handwriting recognition technology that will improve the accuracy and speed of recording transactions in the dynamic environment of a foreign exchange (FX) trading floor. The project's desired result is to improve the base technology for traders' workstations on FX trading floors. Improved workstation effectiveness will allow vast amounts of complex information and events to be presented and analyzed, thus increasing the volume of money and other assets to be exchanged at an accelerated rate. The project scope is to develop and demonstrate technologies that advance interbank check imaging and paper check truncation. The following describes the tasks to be completed: (1) Identify the economic value case, the legal and regulatory issues, the business practices that are affected, and the effects upon settlement. (2) Familiarization with existing imaging technology. Develop requirements for image quality, security, and interoperability. Adapt existing technologies to meet requirements. (3) Define requirements for the imaging laboratory and design its architecture. Integrate and test technology from task 2 with equipment in the laboratory. (4) Develop and/or integrate and test remaining components, including security, storage, and communications. (5) Build a prototype system and test in a laboratory. Install and run in two or more banks. Develop documentation. Conduct training. The project's desired result is to enable a proof-of-concept trial in which multiple banks will exchange check images, exhibiting the operating conditions a check experiences as it travels through the payments/clearing system. The trial should demonstrate the adequacy of digital check images instead of paper checks.
Cotten, Cameron; Reed, Jennifer L
2013-01-30
Constraint-based modeling uses mass balances, flux capacity, and reaction directionality constraints to predict fluxes through metabolism. Although transcriptional regulation and thermodynamic constraints have been integrated into constraint-based modeling, kinetic rate laws have not been extensively used. In this study, an in vivo kinetic parameter estimation problem was formulated and solved using multi-omic data sets for Escherichia coli. To narrow the confidence intervals for kinetic parameters, a series of kinetic model simplifications were made, resulting in fewer kinetic parameters than the full kinetic model. These new parameter values are able to account for flux and concentration data from 20 different experimental conditions used in our training dataset. Concentration estimates from the simplified kinetic model were within one standard deviation for 92.7% of the 790 experimental measurements in the training set. Gibbs free energy changes of reaction were calculated to identify reactions that were often operating close to or far from equilibrium. In addition, enzymes whose activities were positively or negatively influenced by metabolite concentrations were also identified. The kinetic model was then used to calculate the maximum and minimum possible flux values for individual reactions from independent metabolite and enzyme concentration data that were not used to estimate parameter values. Incorporating these kinetically-derived flux limits into the constraint-based metabolic model improved predictions for uptake and secretion rates and intracellular fluxes in constraint-based models of central metabolism. This study has produced a method for in vivo kinetic parameter estimation and identified strategies and outcomes of kinetic model simplification. We also have illustrated how kinetic constraints can be used to improve constraint-based model predictions for intracellular fluxes and biomass yield and identify potential metabolic limitations through the integrated analysis of multi-omics datasets.
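One way to picture the kinetically-derived flux limits is the hedged sketch below: a Michaelis-Menten rate law evaluated at measured enzyme and substrate concentrations (with an assumed relative uncertainty) brackets the feasible flux of a reaction, and the resulting interval can replace the default box bounds of that reaction in a constraint-based model. All parameter names are hypothetical.

```python
def michaelis_menten_limits(e_conc, kcat, s_conc, km, rel_err=0.2):
    """Bracket the feasible flux through a reaction from measured enzyme (e)
    and substrate (s) concentrations with relative uncertainty rel_err;
    kcat and km are estimated kinetic parameters. The rate is monotonically
    increasing in both e and s, so the extremes bound the flux."""
    def rate(e, s):
        return kcat * e * s / (km + s)
    v_lo = rate(e_conc * (1 - rel_err), s_conc * (1 - rel_err))
    v_hi = rate(e_conc * (1 + rel_err), s_conc * (1 + rel_err))
    return v_lo, v_hi

# The interval (v_lo, v_hi) can then tighten the corresponding reaction's
# bounds in a flux balance analysis model before optimizing biomass.
```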
Button, C; Croft, J L
2017-12-01
In the lead article of this special issue, Paul Glazier proposes that Newell's constraints model has the potential to contribute to a grand unified theory of sports performance in that it can help to integrate the disciplinary silos that have typically operated in isolation in sports and exercise science. With a few caveats discussed in this commentary, we agree with Glazier's proposal. However, his ideas suggest that there is a need to demonstrate explicitly how such an integration might occur within applied scientific research. To help fill this perceived 'gap' and thereby illustrate the value of adopting a constraints-led approach, we offer an example of our own interdisciplinary research programme. We believe our research on water safety is ideally suited to this task due to the diverse range of interacting constraints present and as such provides a tangible example of how this approach can unify different disciplinary perspectives examining an important aspect of sport performance. Copyright © 2017 Elsevier B.V. All rights reserved.
NDEC: A NEA platform for nuclear data testing, verification and benchmarking
NASA Astrophysics Data System (ADS)
Díez, C. J.; Michel-Sendis, F.; Cabellos, O.; Bossant, M.; Soppera, N.
2017-09-01
The selection, testing, verification and benchmarking of evaluated nuclear data consists, in practice, of putting an evaluated file through a number of checking steps where different computational codes verify that the file and the data it contains comply with different requirements. These requirements range from format compliance to good performance in application cases, while at the same time physical constraints and the agreement with experimental data are verified. At NEA, the NDEC (Nuclear Data Evaluation Cycle) platform aims at providing, in a user-friendly interface, a thorough diagnosis of the quality of a submitted evaluated nuclear data file. Such a diagnosis is based on the results of different computational codes and routines which carry out the mentioned verifications, tests and checks. NDEC also seeks synergies with other existing NEA tools and databases, such as JANIS, DICE or NDaST, including them in its working scheme. Hence, this paper presents NDEC, its current development status and its usage in the JEFF nuclear data project.
CCM-C, Collins checks the middeck experiment
1999-07-24
S93-E-5016 (23 July 1999) --- Astronaut Eileen M. Collins, mission commander, checks on an experiment on Columbia's middeck during Flight Day 1 activity. The experiment is called the Cell Culture Model, Configuration C. Objectives of it are to validate cell culture models for muscle, bone and endothelial cell biochemical and functional loss induced by microgravity stress; to evaluate cytoskeleton, metabolism, membrane integrity and protease activity in target cells; and to test tissue loss pharmaceuticals for efficacy. The photo was recorded with an electronic still camera (ESC).
2000-11-28
STS-97 Mission Specialist Joseph Tanner gets help with his boots from suit technician Erin Canlon during pre-pack and fit check. Mission STS-97 is the sixth construction flight to the International Space Station. Its payload includes the P6 Integrated Truss Structure and a photovoltaic (PV) module, with giant solar arrays that will provide power to the Station. The mission includes two spacewalks to complete the solar array connections. STS-97 is scheduled to launch Nov. 30 at about 10:06 p.m. EST
2008-01-24
NASA Dryden technicians work on a fit-check mockup in preparation for systems installation work on an Orion boilerplate crew capsule for launch abort testing. A mockup Orion crew module has been constructed by NASA Dryden Flight Research Center's Fabrication Branch. The mockup is being used to develop integration procedures for avionics and instrumentation in advance of the arrival of the first abort flight test article.
2008-01-24
NASA Dryden technicians take measurements inside a fit-check mockup prior to systems installation on a boilerplate Orion launch abort test crew capsule. A mockup Orion crew module has been constructed by NASA Dryden Flight Research Center's Fabrication Branch. The mockup is being used to develop integration procedures for avionics and instrumentation in advance of the arrival of the first abort flight test article.
The Application of Lidar to Synthetic Vision System Integrity
NASA Technical Reports Server (NTRS)
Campbell, Jacob L.; UijtdeHaag, Maarten; Vadlamani, Ananth; Young, Steve
2003-01-01
One goal in the development of a Synthetic Vision System (SVS) is to create a system that can be certified by the Federal Aviation Administration (FAA) for use at various flight criticality levels. As part of NASA's Aviation Safety Program, Ohio University and NASA Langley have been involved in the research and development of real-time terrain database integrity monitors for SVS. Integrity monitors based on a consistency check with onboard sensors may be required if the inherent terrain database integrity is not sufficient for a particular operation. Sensors such as the radar altimeter and weather radar, which are available on most commercial aircraft, are currently being investigated for use in a real-time terrain database integrity monitor. This paper introduces the concept of using a Light Detection And Ranging (LiDAR) sensor as part of a real-time terrain database integrity monitor. A LiDAR system consists of a scanning laser ranger, an inertial measurement unit (IMU), and a Global Positioning System (GPS) receiver. Information from these three sensors can be combined to generate synthesized terrain models (profiles), which can then be compared to the stored SVS terrain model. This paper discusses an initial performance evaluation of the LiDAR-based terrain database integrity monitor using LiDAR data collected over Reno, Nevada. The paper will address the consistency checking mechanism and test statistic, sensitivity to position errors, and a comparison of the LiDAR-based integrity monitor to a radar altimeter-based integrity monitor.
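A simplified version of such a consistency check might look like the following sketch. The disparity statistic and threshold rule are illustrative assumptions for zero-mean Gaussian sensor noise, not the certified monitor design.

```python
import numpy as np

def terrain_disparity_check(lidar_profile, dbase_profile, sigma, k=5.0):
    """Compare a LiDAR-synthesized terrain profile against the stored SVS
    database profile; flag the database when the mean absolute disparity
    exceeds its noise-only expectation plus a margin."""
    disparity = np.asarray(lidar_profile) - np.asarray(dbase_profile)
    stat = np.mean(np.abs(disparity))
    # Expected mean-absolute value of zero-mean Gaussian noise: sigma*sqrt(2/pi).
    expected = sigma * np.sqrt(2.0 / np.pi)
    threshold = expected + k * sigma / np.sqrt(disparity.size)
    return stat, stat > threshold   # (test statistic, integrity alert?)
```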
Kukona, Anuenue; Cho, Pyeong Whan; Magnuson, James S.; Tabor, Whitney
2014-01-01
Psycholinguistic research spanning a number of decades has produced diverging results with regard to the nature of constraint integration in online sentence processing. For example, evidence that language users anticipatorily fixate likely upcoming referents in advance of evidence in the speech signal supports rapid context integration. By contrast, evidence that language users activate representations that conflict with contextual constraints, or only indirectly satisfy them, supports non-integration or late integration. Here, we report on a self-organizing neural network framework that addresses one aspect of constraint integration: the integration of incoming lexical information (i.e., an incoming word) with sentence context information (i.e., from preceding words in an unfolding utterance). In two simulations, we show that the framework predicts both classic results concerned with lexical ambiguity resolution (Swinney, 1979; Tanenhaus, Leiman, & Seidenberg, 1979), which suggest late context integration, and results demonstrating anticipatory eye movements (e.g., Altmann & Kamide, 1999), which support rapid context integration. We also report two experiments using the visual world paradigm that confirm a new prediction of the framework. Listeners heard sentences like “The boy will eat the white…,” while viewing visual displays with objects like a white cake (i.e., a predictable direct object of “eat”), white car (i.e., an object not predicted by “eat,” but consistent with “white”), and distractors. Consistent with our simulation predictions, we found that while listeners fixated white cake most, they also fixated white car more than unrelated distractors in this highly constraining sentence (and visual) context. PMID:24245535
Communication Satellite Payload Special Check out Equipment (SCOE) for Satellite Testing
NASA Astrophysics Data System (ADS)
Subhani, Noman
2016-07-01
This paper presents Payload Special Check out Equipment (SCOE) for the test and measurement of a communication satellite payload at subsystem and system level. The main emphasis of this paper is to demonstrate the principal test equipment, instruments and the payload test matrix for automatic test control. Electrical Ground Support Equipment (EGSE)/Special Check out Equipment (SCOE) requirements, functions and architecture for C-band and Ku-band payloads are presented in detail along with their interface with the satellite during different phases of satellite testing. It provides a test setup in a single rack cabinet that can easily be moved from the payload assembly and integration environment to the thermal vacuum chamber all the way to the launch site (for pre-launch test and verification).
COBRApy: COnstraints-Based Reconstruction and Analysis for Python.
Ebrahim, Ali; Lerman, Joshua A; Palsson, Bernhard O; Hyduke, Daniel R
2013-08-08
COnstraint-Based Reconstruction and Analysis (COBRA) methods are widely used for genome-scale modeling of metabolic networks in both prokaryotes and eukaryotes. Due to the successes with metabolism, there is an increasing effort to apply COBRA methods to reconstruct and analyze integrated models of cellular processes. The COBRA Toolbox for MATLAB is a leading software package for genome-scale analysis of metabolism; however, it was not designed to elegantly capture the complexity inherent in integrated biological networks and lacks an integration framework for the multiomics data used in systems biology. The openCOBRA Project is a community effort to promote constraints-based research through the distribution of freely available software. Here, we describe COBRA for Python (COBRApy), a Python package that provides support for basic COBRA methods. COBRApy is designed in an object-oriented fashion that facilitates the representation of the complex biological processes of metabolism and gene expression. COBRApy does not require MATLAB to function; however, it includes an interface to the COBRA Toolbox for MATLAB to facilitate use of legacy codes. For improved performance, COBRApy includes parallel processing support for computationally intensive processes. COBRApy is an object-oriented framework designed to meet the computational challenges associated with the next generation of stoichiometric constraint-based models and high-density omics data sets. http://opencobra.sourceforge.net/
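A minimal usage sketch, based on COBRApy's documented object model, shows the object-oriented style the abstract describes. The toy network and identifiers are invented, and defaults such as reaction bounds may differ between package versions.

```python
from cobra import Model, Metabolite, Reaction

# Build a toy three-reaction network: supply A, convert A -> B, drain B.
model = Model("toy")
a = Metabolite("A_c", compartment="c")
b = Metabolite("B_c", compartment="c")

source = Reaction("EX_A")            # supplies metabolite A
source.add_metabolites({a: 1.0})
convert = Reaction("A_to_B")
convert.add_metabolites({a: -1.0, b: 1.0})
sink = Reaction("EX_B")              # drains metabolite B
sink.add_metabolites({b: -1.0})

model.add_reactions([source, convert, sink])
model.objective = "EX_B"             # maximize flux through the B sink

solution = model.optimize()          # solves the underlying LP
print(solution.objective_value, solution.fluxes["A_to_B"])
```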
Neighboring extremals of dynamic optimization problems with path equality constraints
NASA Technical Reports Server (NTRS)
Lee, A. Y.
1988-01-01
Neighboring extremals of dynamic optimization problems with path equality constraints and with an unknown parameter vector are considered in this paper. With some simplifications, the problem is reduced to solving a linear, time-varying two-point boundary-value problem with integral path equality constraints. A modified backward sweep method is used to solve this problem. Two example problems are solved to illustrate the validity and usefulness of the solution technique.
NASA Astrophysics Data System (ADS)
Thibes, Ronaldo
2017-02-01
We perform the canonical and path integral quantizations of a lower-order derivatives model describing Podolsky's generalized electrodynamics. The physical content of the model shows an auxiliary massive vector field coupled to the usual electromagnetic field. The equivalence with Podolsky's original model is studied at classical and quantum levels. Concerning the dynamical time evolution, we obtain a theory with two first-class and two second-class constraints in phase space. We calculate explicitly the corresponding Dirac brackets involving both vector fields. We use the Senjanovic procedure to implement the second-class constraints and the Batalin-Fradkin-Vilkovisky path integral quantization scheme to deal with the symmetries generated by the first-class constraints. The physical interpretation of the results turns out to be simpler due to the reduced derivatives order permeating the equations of motion, Dirac brackets and effective action.
NASA Technical Reports Server (NTRS)
Kim, Chang-Soo; Lee, Cae-Hyang; Fiering, Jason O.; Ufer, Stefan; Scarantino, Charles W.; Nagle, H. Troy
2004-01-01
Biochemical sensors for continuous monitoring require dependable periodic self-diagnosis with acceptable simplicity to check their functionality during operation. An in situ self-diagnostic technique for a dissolved oxygen microsensor is proposed in an effort to devise an intelligent microsensor system with an integrated electrochemical actuation electrode. With a built-in platinum microelectrode that surrounds the microsensor, two kinds of microenvironments, called the oxygen-saturated or oxygen-depleted phases, can be created by water electrolysis depending on the polarity. The functionality of the microsensor can be checked during these microenvironment phases. The polarographic oxygen microsensor is fabricated on a flexible polyimide substrate (Kapton) and the feasibility of the proposed concept is demonstrated in a physiological solution. The sensor responds properly during the oxygen-generating and oxygen-depleting phases. The use of these microenvironments for in situ self-calibration is discussed to achieve functional integration as well as structural integration of the microsensor system.
Path integral measure, constraints and ghosts for massive gravitons with a cosmological constant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Metaxas, Dimitrios
2009-12-15
For massive gravity in a de Sitter background one encounters problems of stability when the curvature is larger than the graviton mass. I analyze this situation from the path integral point of view and show that it is related to the conformal factor problem of Euclidean quantum (massless) gravity. When a constraint for massive gravity is incorporated and the proper treatment of the path integral measure is taken into account, one finds that, for particular choices of the DeWitt metric on the space of metrics (in fact, the same choices as in the massless case), one obtains the opposite bound on the graviton mass.
NASA Technical Reports Server (NTRS)
Takacs, Lawrence L.
1988-01-01
The nature and effect of using a posteriori adjustments to nonconservative finite-difference schemes to enforce integral invariants of the corresponding analytic system are examined. The method of a posteriori integral constraint restoration is analyzed for the case of linear advection, and the harmonic response associated with the a posteriori adjustments is examined in detail. The conservative properties of the shallow-water system are reviewed, and the constraint restoration algorithm applied to the shallow-water equations is described. A comparison is made between forecasts obtained using implicit and a posteriori methods for the conservation of mass, energy, and potential enstrophy in the complete nonlinear shallow-water system.
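To make the restoration mechanism concrete, here is a toy sketch under stated assumptions (1D periodic advection, first-order upwind, and a quadratic "energy" invariant); it illustrates the a posteriori adjustment, not the paper's shallow-water code. The upwind step conserves mass exactly but damps the quadratic invariant, so after each step the deviations from the mean are rescaled to restore it.

```python
import numpy as np

def advect_restore_energy(q, u, dx, dt, steps):
    """Advance 1D linear advection with periodic upwind differencing, then
    rescale deviations after each step so the discrete quadratic invariant
    (the 'energy' sum of squared anomalies) matches its initial value."""
    q = q.astype(float)
    e0 = np.sum((q - q.mean()) ** 2)       # invariant to be enforced
    c = u * dt / dx                        # Courant number
    for _ in range(steps):
        q = q - c * (q - np.roll(q, 1))    # upwind step: mass-conserving, dissipative
        anomaly = q - q.mean()
        e = np.sum(anomaly ** 2)
        if e > 0.0:
            q = q.mean() + anomaly * np.sqrt(e0 / e)   # a posteriori restoration
    return q
```

Because the rescaling acts only on the anomalies, the mean (and hence total mass) is untouched while the quadratic invariant is restored each step.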
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Laurence; Yurkovich, James T.; Lloyd, Colton J.
Integrating omics data to refine or make context-specific models is an active field of constraint-based modeling. Proteomics now cover over 95% of the Escherichia coli proteome by mass. Genome-scale models of Metabolism and macromolecular Expression (ME) compute proteome allocation linked to metabolism and fitness. Using proteomics data, we formulated allocation constraints for key proteome sectors in the ME model. The resulting calibrated model effectively computed the “generalist” (wild-type) E. coli proteome and phenotype across diverse growth environments. Across 15 growth conditions, prediction errors for growth rate and metabolic fluxes were 69% and 14% lower, respectively. The sector-constrained ME model thus represents a generalist ME model reflecting both growth rate maximization and “hedging” against uncertain environments and stresses, as indicated by significant enrichment of these sectors for the general stress response sigma factor σS. Finally, the sector constraints represent a general formalism for integrating omics data from any experimental condition into constraint-based ME models. The constraints can be fine-grained (individual proteins) or coarse-grained (functionally related protein groups) as demonstrated here. Furthermore, this flexible formalism provides an accessible approach for narrowing the gap between the complexity captured by omics data and the governing principles of proteome allocation described by systems-level models.
NASA Astrophysics Data System (ADS)
Nielsen, N. K.; Quaade, U. J.
1995-07-01
The physical phase space of the relativistic top, as defined by Hansson and Regge, is expressed in terms of canonical coordinates of the Poincaré group manifold. The system is described in the Hamiltonian formalism by the mass-shell condition and constraints that reduce the number of spin degrees of freedom. The constraints are second class and are modified into a set of first class constraints by adding combinations of gauge-fixing functions. The Batalin-Fradkin-Vilkovisky method is then applied to quantize the system in the path integral formalism in Hamiltonian form. It is finally shown that different gauge choices produce different equivalent forms of the constraints.
Intelligent Data Visualization for Cross-Checking Spacecraft System Diagnosis
NASA Technical Reports Server (NTRS)
Ong, James C.; Remolina, Emilio; Breeden, David; Stroozas, Brett A.; Mohammed, John L.
2012-01-01
Any reasoning system is fallible, so crew members and flight controllers must be able to cross-check automated diagnoses of spacecraft or habitat problems by considering alternate diagnoses and analyzing related evidence. Cross-checking improves diagnostic accuracy because people can apply information processing heuristics, pattern recognition techniques, and reasoning methods that the automated diagnostic system may not possess. Over time, cross-checking also enables crew members to become comfortable with how the diagnostic reasoning system performs, so the system can earn the crew's trust. We developed intelligent data visualization software that helps users cross-check automated diagnoses of system faults more effectively. The user interface displays scrollable arrays of timelines and time-series graphs, which are tightly integrated with an interactive, color-coded system schematic to show important spatial-temporal data patterns. Signal processing and rule-based diagnostic reasoning automatically identify alternate hypotheses and data patterns that support or rebut the original and alternate diagnoses. A color-coded matrix display summarizes the supporting or rebutting evidence for each diagnosis, and a drill-down capability enables crew members to quickly view graphs and timelines of the underlying data. This system demonstrates that modest amounts of diagnostic reasoning, combined with interactive, information-dense data visualizations, can accelerate system diagnosis and cross-checking.
Fang, Yilin; Scheibe, Timothy D; Mahadevan, Radhakrishnan; Garg, Srinath; Long, Philip E; Lovley, Derek R
2011-03-25
The activity of microorganisms often plays an important role in dynamic natural attenuation or engineered bioremediation of subsurface contaminants, such as chlorinated solvents, metals, and radionuclides. To evaluate and/or design bioremediated systems, quantitative reactive transport models are needed. State-of-the-art reactive transport models often ignore microbial effects or simulate them with a static growth yield and constant reaction rate parameters over the simulated conditions, while in reality microorganisms can dynamically modify their functionality (such as utilization of alternative respiratory pathways) in response to spatial and temporal variations in environmental conditions. Constraint-based genome-scale microbial in silico models, using genomic data and multiple-pathway reaction networks, have been shown to be able to simulate transient metabolism of some well-studied microorganisms and identify growth rate, substrate uptake rates, and byproduct rates under different growth conditions. These rates can be identified and used to replace specific microbially-mediated reaction rates in a reactive transport model using local geochemical conditions as constraints. We previously demonstrated the potential utility of integrating a constraint-based microbial metabolism model with a reactive transport simulator as applied to bioremediation of uranium in groundwater. However, that work relied on an indirect coupling approach that was effective for initial demonstration but may not be extensible to more complex problems of significant interest (e.g., communities of microbial species and multiple constraining variables). Here, we extend that work by presenting and demonstrating a method of directly integrating a reactive transport model (FORTRAN code) with constraint-based in silico models solved with the IBM ILOG CPLEX linear optimizer base system (C library). The models were integrated with BABEL, a language interoperability tool. The modeling system is designed in such a way that constraint-based models targeting different microorganisms or competing organism communities can be easily plugged into the system. Constraint-based modeling is computationally costly given the size of a genome-scale reaction network. To save computation time, a binary tree is traversed to examine the concentration and solution pool generated during the simulation in order to decide whether the constraint-based model should be called. We also show preliminary results from the integrated model, including a comparison of the direct and indirect coupling approaches, and evaluate the ability of the approach to simulate a field experiment. Published by Elsevier B.V.
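To make the coupling step concrete, the following minimal Python sketch illustrates the reuse idea behind the binary-tree lookup described above: cache genome-scale flux solutions keyed by discretized local geochemical conditions, and invoke the expensive constraint-based solve only when conditions are new. All names (solve_flux_balance, the species list) are illustrative placeholders, not the paper's actual interfaces.

```python
# Sketch: avoid redundant genome-scale (constraint-based) model solves by
# caching flux solutions keyed by discretized local geochemical conditions.
# The paper traverses a binary tree over prior solutions; a dict keyed on
# rounded concentrations illustrates the same reuse idea.

def discretize(concentrations, tol=1e-3):
    """Round each concentration to the tolerance grid to form a cache key."""
    return tuple(round(c / tol) for c in concentrations)

class MetabolismCoupler:
    def __init__(self, solve_flux_balance, tol=1e-3):
        self.solve = solve_flux_balance   # expensive LP over the reaction network
        self.tol = tol
        self.cache = {}

    def rates(self, concentrations):
        key = discretize(concentrations, self.tol)
        if key not in self.cache:         # only call the LP when conditions are new
            self.cache[key] = self.solve(concentrations)
        return self.cache[key]

# Usage inside a transport time step (hypothetical solver and species):
# coupler = MetabolismCoupler(my_cplex_fba_solver)
# growth_rate, uptake_rates = coupler.rates([acetate, u_vi, fe_iii])
```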
Cuba: Multidimensional numerical integration library
NASA Astrophysics Data System (ADS)
Hahn, Thomas
2016-08-01
The Cuba library offers four independent routines for multidimensional numerical integration: Vegas, Suave, Divonne, and Cuhre. The four algorithms work by very different methods, can integrate vector integrands, and have very similar Fortran, C/C++, and Mathematica interfaces. Their invocation is nearly identical, making it easy to cross-check results by substituting one method for another. For further safeguarding, the output is supplemented by a chi-square probability which quantifies the reliability of the error estimate.
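The cross-check workflow the library encourages is easy to illustrate. Cuba's own interfaces are Fortran, C/C++, and Mathematica; the Python sketch below uses SciPy routines purely as stand-ins to show the practice of integrating the same integrand with two independent methods and comparing the results against their combined error estimates.

```python
# Sketch of the cross-check practice: integrate the same integrand with two
# independent methods and flag disagreement beyond the error estimates.
import numpy as np
from scipy import integrate

f = lambda x, y: np.exp(-(x**2 + y**2))              # integrand on the unit square

det, det_err = integrate.nquad(f, [[0, 1], [0, 1]])  # deterministic quadrature

rng = np.random.default_rng(0)                       # plain Monte Carlo estimate
pts = rng.random((200_000, 2))
vals = f(pts[:, 0], pts[:, 1])
mc, mc_err = vals.mean(), vals.std(ddof=1) / np.sqrt(len(vals))

assert abs(det - mc) < 3 * (det_err + mc_err), "methods disagree: investigate"
```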
NASA Astrophysics Data System (ADS)
Servilla, M. S.; O'Brien, M.; Costa, D.
2013-12-01
Considerable ecological research performed today occurs through the analysis of data downloaded from various repositories and archives, often resulting in derived or synthetic products generated by automated workflows. These data are only meaningful for research if they are well documented by metadata; otherwise semantic or data-type errors may occur in interpretation or processing. The Long Term Ecological Research (LTER) Network now screens all data packages entering its long-term archive to ensure that each package contains metadata that is complete, of high quality, and accurately describes the structure of its associated data entity, and that the data are structurally congruent with the metadata. Screening occurs prior to the upload of a data package into the Provenance Aware Synthesis Tracking Architecture (PASTA) data management system through a series of quality checks, thus preventing ambiguously or incorrectly documented data packages from entering the system. The quality checks within PASTA are designed to work specifically with the Ecological Metadata Language (EML), the metadata standard adopted by the LTER Network to describe data generated by their 26 research sites. Each quality check is codified in Java as part of the ecological community-supported Data Manager Library, which is a resource of the EML specification and used as a component of the PASTA software stack. Quality checks test for metadata quality, data integrity, or metadata-data congruence. Quality checks are further classified as either conditional or informational. Conditional checks issue a 'valid', 'warning' or 'error' response. Only an 'error' response blocks the data package from upload into PASTA. Informational checks only provide descriptive content pertaining to a particular facet of the data package. Quality checks are designed by a group of LTER information managers and reviewed by the LTER community before being deployed into PASTA. A total of 32 quality checks have been deployed to date. Quality checks can be customized through a configurable template, which includes turning checks 'on' or 'off' and setting the severity of conditional checks. This feature is important to other potential users of the Data Manager Library who wish to configure its quality checks in accordance with the standards of their community. Executing the complete set of quality checks produces a report that describes the result of each check. The report is an XML document that is stored by PASTA for future reference.
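A minimal sketch of the conditional-versus-informational distinction follows, in Python rather than the Data Manager Library's Java; the check name, report shape, and blocking rule are illustrative of the described behavior (only an 'error' from a conditional check blocks upload), not the actual PASTA API.

```python
# Sketch of conditional vs. informational quality checks and a report whose
# 'error' results block upload. Names and structures are illustrative.
from dataclasses import dataclass

@dataclass
class CheckResult:
    name: str
    kind: str      # "conditional" or "informational"
    status: str    # "valid" | "warning" | "error" | "info"
    message: str

def check_congruence(metadata, data_rows):
    """Conditional check: column count in the data must match the metadata."""
    declared = len(metadata["attributes"])
    observed = len(data_rows[0])
    ok = declared == observed
    return CheckResult("attributeCongruence", "conditional",
                       "valid" if ok else "error",
                       f"declared {declared} columns, found {observed}")

def run_checks(checks, metadata, data_rows):
    report = [c(metadata, data_rows) for c in checks]
    blocked = any(r.kind == "conditional" and r.status == "error" for r in report)
    return report, blocked     # only an 'error' blocks the data package

report, blocked = run_checks([check_congruence],
                             {"attributes": ["site", "date", "temp"]},
                             [("A", "2013-01-01", 4.2)])
```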
Department of Defense Travel Reengineering Pilot Report to Congress
1997-06-01
Electronic Commerce/Electronic Data Interchange (EC/EDI) capabilities to integrate functions, automate edit checks for internal controls, and create user-friendly management tools at all levels of the process.
Pasacreta, Jeannie V; Kenefick, Amy L; McCorkle, Ruth
2008-01-01
The American Psychosocial Oncology Society and the Individual Cancer Assistance Network have launched the online continuing education accredited program "ICAN: Distress Management for Oncology Nursing" to address the ability of oncology nurses to assess, treat, and refer patients with a range of psychosocial problems. An important goal of the program is to reduce traditional barriers to psychosocial oncology education by providing the oncology nursing community with easy access to information from experts in the field. There are 4 Internet webcasts: Nurse's Role in Recognizing Distress in Patients and Caregivers; Assessment Recommendations; Treatment Strategies; and Principles and Guidelines for Psychotherapy and Referral. The program examines the prevalence and dimensions of patient distress and offers instruction on how to effectively integrate screening tools, such as the Distress Thermometer and Problem Check List, into clinical practice. It provides details on relevant interventions and referral algorithms based on the National Comprehensive Cancer Network Guidelines for Distress Management. It explores the devastating impact of psychological distress on quality of life, and the unique position of nurses in busy inpatient settings, outpatient clinics, and offices to detect, intervene, and refer to appropriate services. Providing information over the Internet addresses common barriers to learning, including schedule and time constraints.
Topology design and performance analysis of an integrated communication network
NASA Technical Reports Server (NTRS)
Li, V. O. K.; Lam, Y. F.; Hou, T. C.; Yuen, J. H.
1985-01-01
A research study on topology design and performance analysis for the Space Station Information System (SSIS) network is conducted. It begins with a survey of existing research efforts in network topology design. A new approach to topology design is then presented: an efficient algorithm generates candidate network designs (subsets of the set of all network components) in increasing order of their total costs, and each design is checked to see if it forms an acceptable network. This technique gives the true cost-optimal network, and is particularly useful when the network has many constraints and not too many components. The algorithm for generating subsets is described in detail, and various aspects of the overall design procedure are discussed. Two more efficient versions of this algorithm (applicable in specific situations) are also given. Next, two important aspects of network performance analysis are discussed: network reliability and message delays. A new model is introduced to study the reliability of a network with dependent failures. For message delays, a collection of formulas from existing research results is given to compute or estimate the delays of messages in a communication network without making the independence assumption. The design algorithm, coded in PASCAL, is included as an appendix.
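The core of the design approach, generating candidate component subsets in nondecreasing total cost and returning the first acceptable network, can be sketched as follows. The heap-based successor scheme (extend a subset with the next-cheapest component, or swap its most expensive member for the next one) is a standard way to stream subsets of a sorted cost list in cost order; the feasibility predicate stands in for the paper's network acceptability check.

```python
# Sketch: enumerate component subsets in nondecreasing total cost and
# return the first one that passes the feasibility check.
import heapq

def cheapest_feasible(costs, is_acceptable_network):
    order = sorted(range(len(costs)), key=costs.__getitem__)
    c = [costs[i] for i in order]
    heap = [(c[0], 0, (order[0],))]          # (total cost, last index, subset)
    while heap:
        total, last, subset = heapq.heappop(heap)
        if is_acceptable_network(set(subset)):
            return total, set(subset)        # true cost-optimal design
        if last + 1 < len(c):
            # extend the subset with the next-cheapest component ...
            heapq.heappush(heap, (total + c[last + 1], last + 1,
                                  subset + (order[last + 1],)))
            # ... or swap its most expensive member for the next one
            heapq.heappush(heap, (total - c[last] + c[last + 1], last + 1,
                                  subset[:-1] + (order[last + 1],)))
    return None

# Example: components priced 3,1,4,2; accept any design containing {1, 3}.
print(cheapest_feasible([3, 1, 4, 2], lambda s: {1, 3} <= s))  # (3, {1, 3})
```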
Liu, Yan-Jun; Tong, Shaocheng; Chen, C L Philip; Li, Dong-Juan
2017-11-01
A neural network (NN) adaptive control design problem is addressed for a class of uncertain multi-input-multi-output (MIMO) nonlinear systems in block-triangular form. The considered systems contain uncertain dynamics, their states are subject to bounded constraints, and couplings among the various inputs and outputs appear in each subsystem. To stabilize this class of systems, a novel adaptive control strategy is constructed using the backstepping design technique and NNs. Novel integral barrier Lyapunov functionals (BLFs) are employed to prevent violation of the full state constraints. The proposed strategy not only guarantees the boundedness of the closed-loop system and drives the outputs to follow the reference signals, but also ensures that all states remain in the predefined compact sets. Moreover, earlier BLF-based designs impose transformed constraints on the errors and therefore require the bounds of the virtual controllers to be determined explicitly; the proposed design relaxes these conservative limitations of traditional BLF-based controls for full state constraints and is, to the authors' knowledge, the first to control this class of MIMO systems under full state constraints. The performance of the proposed control strategy is verified through a simulation example.
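For orientation, a widely used log-type barrier Lyapunov function for keeping an error z within a bound |z| < k_b grows without bound as the error approaches the constraint; the paper employs integral BLFs, a related construction, so the form below is illustrative rather than the paper's exact functional:

```latex
% Standard log-type barrier Lyapunov function for the constraint |z| < k_b
V(z) = \tfrac{1}{2}\,\ln\!\frac{k_b^{2}}{k_b^{2}-z^{2}},
\qquad |z| < k_b,
\qquad V(z)\to\infty \text{ as } |z|\to k_b .
```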
NASA Astrophysics Data System (ADS)
Chen, Yang; Wang, Huasheng; Xia, Jixia; Cai, Guobiao; Zhang, Zhenpeng
2017-04-01
For the pressure-reducing regulator and check valve double-valve combined test system in an integral bipropellant propulsion system, a system model is established from modular models of typical components. The whole working process of a 9 MPa test, from startup to rated working condition and finally to shutdown, is simulated. Comparison of simulation results with test data shows that five working conditions (standby, startup, rated pressurization, shutdown, and halt) and nine stages of the combined test system are comprehensively disclosed, and that valve-spool opening and closing details of the regulator and the two check valves are accurately revealed. The simulation also explains two phenomena that the test data alone cannot: the critical opening state, in which the check valve spools slightly open and close alternately about their fully closed positions, and the marked flow-field temperature drop and rise in the pipeline network as helium flows. Moreover, simulation results that account for component wall heat transfer are closer to the test data than those under the adiabatic-wall assumption, and better reveal the dynamic characteristics of the system in the various working stages.
Probability theory, not the very guide of life.
Juslin, Peter; Nilsson, Håkan; Winman, Anders
2009-10-01
Probability theory has long been taken as the self-evident norm against which to evaluate inductive reasoning, and classical demonstrations of violations of this norm include the conjunction error and base-rate neglect. Many of these phenomena require multiplicative probability integration, whereas people seem more inclined to linear additive integration, in part, at least, because of well-known capacity constraints on controlled thought. In this article, the authors show with computer simulations that when based on approximate knowledge of probabilities, as is routinely the case in natural environments, linear additive integration can yield as accurate estimates, and as good average decision returns, as estimates based on probability theory. It is proposed that in natural environments people have little opportunity or incentive to induce the normative rules of probability theory and, given their cognitive constraints, linear additive integration may often offer superior bounded rationality.
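The authors' simulation argument can be reproduced in miniature: when component probabilities are perturbed by estimation noise, a linear additive combination can estimate a conjunction about as accurately as the normative product. The noise level and additive weights below are illustrative assumptions, not the article's exact parameterization.

```python
# Sketch: compare multiplicative (normative) vs. linear additive integration
# of two noisy probability estimates for the conjunction p*q.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
p, q = rng.random(n), rng.random(n)
truth = p * q                                    # true conjunction probability

noise = lambda x: np.clip(x + rng.normal(0, 0.15, n), 0, 1)   # noisy estimates
p_hat, q_hat = noise(p), noise(q)

mult = p_hat * q_hat                             # normative: multiply
add = 0.5 * p_hat + 0.5 * q_hat - 0.25           # linear additive with offset

print("RMSE multiplicative:", np.sqrt(np.mean((mult - truth) ** 2)))
print("RMSE linear additive:", np.sqrt(np.mean((add - truth) ** 2)))
# With this noise level the two errors come out comparable in magnitude.
```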
Kiesel, Andrea; Kunde, Wilfried; Pohl, Carsten; Berner, Michael P; Hoffmann, Joachim
2009-01-01
Expertise in a certain stimulus domain enhances perceptual capabilities. In the present article, the authors investigate whether expertise improves perceptual processing to an extent that allows complex visual stimuli to bias behavior unconsciously. Expert chess players judged whether a target chess configuration entailed a checking configuration. These displays were preceded by masked prime configurations that either represented a checking or a nonchecking configuration. Chess experts, but not novice chess players, revealed a subliminal response priming effect, that is, faster responding when prime and target displays were congruent (both checking or both nonchecking) rather than incongruent. Priming generalized to displays that were not used as targets, ruling out simple repetition priming effects. Thus, chess experts were able to judge unconsciously presented chess configurations as checking or nonchecking. A 2nd experiment demonstrated that experts' priming does not occur for simpler but uncommon chess configurations. The authors conclude that long-term practice prompts the acquisition of visual memories of chess configurations with integrated form-location conjunctions. These perceptual chunks enable complex visual processing outside of conscious awareness.
Awasthi, Jay Prakash; Saha, Bedabrata; Regon, Preetom; Sahoo, Smita; Chowra, Umakanta; Pradhan, Amit; Roy, Anupam; Panda, Sanjib Kumar
2017-01-01
Aluminum (Al) is the third most abundant metal in the earth's crust, and its chemical form depends mainly on soil pH. The form most toxic to plants is Al3+, which exists at soil pH < 5. Acidic soil significantly limits crop production worldwide, mainly due to Al3+ toxicity, impacting approximately 50% of the world's arable land (in North-Eastern India, 80% of soils are acidic). Al3+ toxicity in plants causes root growth inhibition, leading to reduced nutrient and water uptake and impacting crop productivity as a whole. Rice, one of the chief grains, constitutes the staple food of two-thirds of the world population, including India, and is not untouched by Al3+ toxicity. Al contamination is a critical constraint to plant production in the agricultural soils of North East India. Twenty-four indigenous Indica rice varieties (including Badshahbhog as the tolerant check and Mashuri as the sensitive check) were screened for Al stress tolerance in a hydroponic plant growth system. Results show marked differences in the growth parameters (relative growth rate, root tolerance index, fresh and dry weight of root) of rice seedlings due to Al (100 μM) toxicity. Al3+ uptake and lipid peroxidation levels also increased concomitantly under Al treatment. Histochemical assays were performed to elucidate aluminum uptake, loss of membrane integrity, and lipid peroxidation, all of which were greater in sensitive genotypes at higher Al concentration. This study revealed that aluminum toxicity is a serious problem for rice crop productivity in acid soil. Based on the various parameters studied, it is concluded that Disang is a comparatively tolerant variety whereas Joymati is a sensitive one. Western blot hybridization further strengthened this claim, demonstrating greater accumulation of glutathione reductase (GR) protein in the Disang variety than in Joymati under stressed conditions. The study also observed that lethal toxic symptoms emerge only after 48 h, irrespective of the dose used. PMID:28448589
Privacy-preserving public auditing for data integrity in cloud
NASA Astrophysics Data System (ADS)
Shaik Saleem, M.; Murali, M.
2018-04-01
Cloud computing, which has attracted extensive attention from the research community and industry, provides a large pool of computing resources shared through virtualization, including storage, processing power, applications, and services. Cloud users are vended resources on demand. Because an outsourced file is stored in a third-party service provider's database, it can easily be tampered with, and the user retains no direct control over the data; since cloud servers accept no responsibility for data loss, providing security assurance for user data has become a primary concern for cloud service providers. Remote data integrity checking (RDIC) allows a data storage server to prove that it is truthfully storing an owner's data. RDIC comprises a security model and ID-based RDIC, which is responsible for the security of each server and preserves the privacy of cloud users' data against the third-party verifier. In general, clients can check the integrity of their cloud data themselves by running a two-party RDIC protocol, but in the two-party scenario the verification result, coming from either the data holder or the cloud server, may be considered biased. The public verifiability feature of RDIC gives all users the ability to verify whether the original data have been modified. To ensure the transparency of publicly verifiable RDIC protocols, a third-party auditor (TPA) with the knowledge and efficiency to perform the verification is assumed.
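A minimal challenge-response round-trip conveys the core of RDIC. Real protocols use homomorphic authenticators so the server can answer without shipping whole blocks and so a TPA can verify without the secret key; the keyed-hash version below is only meant to make the owner/server/verifier roles concrete.

```python
# Sketch of a challenge-response remote data integrity check: the owner keeps
# only per-block digests; the verifier challenges random blocks and compares
# the server's response against the stored digests.
import hashlib, hmac, secrets

def digest(block: bytes, key: bytes) -> bytes:
    return hmac.new(key, block, hashlib.sha256).digest()

# Owner side: compute digests before outsourcing, keep them locally.
key = secrets.token_bytes(32)
blocks = [b"block-0 ...", b"block-1 ...", b"block-2 ..."]
tags = [digest(b, key) for b in blocks]

# Server side stores `blocks`; verifier challenges a random block index.
challenge = secrets.randbelow(len(blocks))
response = blocks[challenge]                   # server returns the block

assert hmac.compare_digest(digest(response, key), tags[challenge]), \
    "integrity violated: block was modified"
```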
A Novel Face-on-Face Contact Method for Nonlinear Solid Mechanics
NASA Astrophysics Data System (ADS)
Wopschall, Steven Robert
The implicit solution to contact problems in nonlinear solid mechanics poses many difficulties. Traditional node-to-segment methods may suffer from locking and experience contact force chatter in the presence of sliding. More recent developments include mortar-based methods, which resolve local contact interactions over face-pairs and feature a kinematic constraint in integral form that smooths contact behavior, especially in the presence of sliding. These methods have been shown to perform well in the presence of geometric nonlinearities and are demonstrably more robust than node-to-segment methods. These methods are typically biased, however, interpolating contact tractions and gap equations on a designated non-mortar face, which leads to an asymmetry in the formulation. Another challenge is constraint enforcement. The general selection of the active set of constraints is fraught with difficulty, often leading to non-physical solutions and easily resulting in missed face-pair interactions. Details on reliable constraint enforcement methods are lacking in the greater contact literature. This work presents an unbiased contact formulation utilizing a median-plane methodology. Up to linear polynomials are used for the discrete pressure representation, and integral gap constraints are enforced using a novel subcycling procedure. This procedure reliably determines the active set of contact constraints, leading to physical and kinematically admissible solutions free of heuristics and user intervention. The contact method presented herein successfully solves difficult quasi-static contact problems in the implicit computational setting. These problems feature finite deformations, material nonlinearity, and complex interface geometries, all of which are challenging characteristics for contact implementations and constraint enforcement algorithms. The subcycling procedure is a key feature of this method, handling active constraint selection for complex interfaces and mesh geometries.
Simple Check Valves for Microfluidic Devices
NASA Technical Reports Server (NTRS)
Willis, Peter A.; Greer, Harold F.; Smith, J. Anthony
2010-01-01
A simple design concept for check valves has been adopted for microfluidic devices that consist mostly of (1) deformable fluorocarbon polymer membranes sandwiched between (2) borosilicate float glass wafers into which channels, valve seats, and holes have been etched. The first microfluidic devices in which these check valves are intended to be used are micro-capillary electrophoresis (microCE) devices undergoing development for use on Mars in detecting compounds indicative of life. In this application, it will be necessary to store some liquid samples in reservoirs in the devices for subsequent laboratory analysis, and check valves are needed to prevent cross-contamination of the samples. The simple check-valve design concept is also applicable to other microfluidic devices and to fluidic devices in general. These check valves are simplified microscopic versions of conventional rubber- flap check valves that are parts of numerous industrial and consumer products. These check valves are fabricated, not as separate components, but as integral parts of microfluidic devices. A check valve according to this concept consists of suitably shaped portions of a deformable membrane and the two glass wafers between which the membrane is sandwiched (see figure). The valve flap is formed by making an approximately semicircular cut in the membrane. The flap is centered over a hole in the lower glass wafer, through which hole the liquid in question is intended to flow upward into a wider hole, channel, or reservoir in the upper glass wafer. The radius of the cut exceeds the radius of the hole by an amount large enough to prevent settling of the flap into the hole. As in a conventional rubber-flap check valve, back pressure in the liquid pushes the flap against the valve seat (in this case, the valve seat is the adjacent surface of the lower glass wafer), thereby forming a seal that prevents backflow.
Reactor design and integration into a nuclear electric spacecraft
NASA Technical Reports Server (NTRS)
Phillips, W. M.; Koenig, D. R.
1978-01-01
One of the well-defined applications for nuclear power in space is nuclear electric propulsion (NEP). Mission studies have identified the optimum power level (400 kWe). A single Shuttle launch requirement and science-package integration have added additional constraints to the design. A reactor design which will meet these constraints has been studied. The reactor employs 90 fuel elements, each heat pipe cooled. Reactor control is obtained with BeO/B4C drums in a BeO reflector. The balance of the spacecraft is shielded from the reactor with LiH. Power conditioning and reactor control drum drives are located behind the LiH shield. Launch safety, mechanical design and integration with the power conversion subsystem are discussed.
NASA Astrophysics Data System (ADS)
Kurdhi, N. A.; Nurhayati, R. A.; Wiyono, S. B.; Handajani, S. S.; Martini, T. S.
2017-01-01
In this paper, we develop an integrated inventory model considering imperfect-quality items, inspection error, controllable lead time, and a budget capacity constraint. The imperfect items are uniformly distributed and detected during the screening process, which is subject to two types of inspection error: type I, in which a non-defective item is classified as defective, and type II, in which a defective item is classified as non-defective. The demand during the lead time is unknown and follows a normal distribution. The lead time can be shortened by paying a crashing cost, and the budget capacity constraint arises from the limited purchasing budget. The purposes of this research are to modify the integrated vendor-buyer inventory model, to establish the optimal solution using the Kuhn-Tucker conditions, and to apply the models. The application and sensitivity analysis show that the integrated model attains a lower total inventory cost than separate optimization.
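Numerically, the constrained optimum can be found with any solver that satisfies the Kuhn-Tucker conditions at its solution. The sketch below uses SciPy's SLSQP on an illustrative cost function (all coefficients are made-up placeholders, not the paper's model); here the budget constraint binds, capping the order quantity below its unconstrained optimum.

```python
# Sketch: budget-constrained inventory cost minimization. SLSQP returns a
# point satisfying the KKT conditions of this illustrative problem.
from scipy.optimize import minimize

def total_cost(x):
    Q, L = x                       # order quantity, lead time (weeks)
    holding, setup, demand = 2.0, 50.0, 1000.0
    crashing = 30.0 / L            # shortening the lead time costs more
    return setup * demand / Q + holding * Q / 2 + crashing

# Budget cap: unit price 2.5 per item, purchasing budget 300 => 2.5*Q <= 300.
res = minimize(total_cost, x0=[100.0, 2.0],
               bounds=[(1.0, None), (0.5, 8.0)],
               constraints=[{"type": "ineq",
                             "fun": lambda x: 300.0 - 2.5 * x[0]}],
               method="SLSQP")
print(res.x, res.fun)              # budget constraint binds at Q = 120
```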
Integrated Transportation-land Use Model For Indiana
DOT National Transportation Integrated Search
1997-01-01
Despite the recent research interest in integrating land use and transportation models inspired by federal legislation, no product had met the data, budget, and personnel constraints faced by the metropolitan planning organizations in Indiana. Conseq...
Arik, Sabri
2005-05-01
This paper presents a sufficient condition for the existence, uniqueness and global asymptotic stability of the equilibrium point for bidirectional associative memory (BAM) neural networks with distributed time delays. The results impose constraint conditions on the network parameters of neural system independently of the delay parameter, and they are applicable to all continuous nonmonotonic neuron activation functions. It is shown that in some special cases of the results, the stability criteria can be easily checked. Some examples are also given to compare the results with the previous results derived in the literature.
Specifying and Verifying Organizational Security Properties in First-Order Logic
NASA Astrophysics Data System (ADS)
Brandt, Christoph; Otten, Jens; Kreitz, Christoph; Bibel, Wolfgang
In certain critical cases the data flow between business departments in banking organizations has to respect security policies known as Chinese Wall or Bell-La Padula. We show that these policies can be represented by formal requirements and constraints in first-order logic. By additionally providing a formal model for the flow of data between business departments we demonstrate how security policies can be applied to a concrete organizational setting and checked with a first-order theorem prover. Our approach can be applied without deep formal expertise and therefore promises high usability in business practice.
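As an illustration of the kind of first-order requirement involved, the Chinese Wall policy admits a compact rendering (the paper's exact formalization may differ): once a subject has accessed an object, it may not access any object in a conflicting dataset.

```latex
% A standard first-order rendering of the Chinese Wall policy
\forall s\,\forall o_1\,\forall o_2\;
  \bigl(\mathit{access}(s,o_1) \wedge \mathit{conflict}(o_1,o_2)\bigr)
  \rightarrow \neg\,\mathit{access}(s,o_2)
```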
STS-97 Mission Specialist Noriega during pre-pack and fit check
NASA Technical Reports Server (NTRS)
2000-01-01
STS-97 Mission Specialist Carlos Noriega gets help with his boots from suit technician Shelly Grick-Agrella during pre-pack and fit check. Mission STS-97 is the sixth construction flight to the International Space Station. Its payload includes the P6 Integrated Truss Structure and a photovoltaic (PV) module, with giant solar arrays that will provide power to the Station. The mission includes two spacewalks to complete the solar array connections. STS-97 is scheduled to launch Nov. 30 at about 10:06 p.m. EST.
2001-01-17
Workers in the Payload Changeout Room check the U.S. Lab Destiny as it moves from Atlantis’ payload bay into the PCR. Destiny will remain in the PCR while Atlantis rolls back to the Vehicle Assembly Building to allow workers to conduct inspections, continuity checks and X-ray analysis on the 36 solid rocket booster cables located inside each booster’s system tunnel. An extensive evaluation of NASA’s SRB cable inventory revealed conductor damage in four (of about 200) cables on the shelf. Shuttle managers decided to prove the integrity of the system tunnel cables already on Atlantis.
NASA Technical Reports Server (NTRS)
1981-01-01
The modified CG2000 crystal grower construction, installation, and machine checkout were completed. Process development checkout proceeded with several dry runs and one growth run. Several machine calibration and functional problems were discovered and corrected. Exhaust gas analysis system alternatives were evaluated, and an integrated system was approved and ordered. Several growth runs on a development CG2000 RC grower show that complete neck, crown, and body automated growth can be achieved with only one operator input.
Reducing software security risk through an integrated approach
NASA Technical Reports Server (NTRS)
Gilliam, D.; Powell, J.; Kelly, J.; Bishop, M.
2001-01-01
The fourth quarter delivery, FY'01, for this RTOP is a Property-Based Testing (PBT) 'Tester's Assistant' (TA). The TA tool is used to check compiled and pre-compiled code for potential security weaknesses that could be exploited by hackers. The TA Instrumenter, implemented mostly in C++ (with a small part in Java), parses two types of files: Java and TASPEC. Security properties to be checked are written in TASPEC. The Instrumenter is used in conjunction with the Tester's Assistant Specification (TASPEC) execution monitor to verify the security properties of a given program.
Novel optimization technique of isolated microgrid with hydrogen energy storage.
Beshr, Eman Hassan; Abdelghany, Hazem; Eteiba, Mahmoud
2018-01-01
This paper presents a novel optimization technique for energy management studies of an isolated microgrid. The system is supplied by various Distributed Energy Resources (DERs): a Diesel Generator (DG), a Wind Turbine Generator (WTG), and Photovoltaic (PV) arrays, supported by a fuel cell/electrolyzer hydrogen storage system for short-term storage. Multi-objective optimization through a non-dominated sorting genetic algorithm is used to meet the load requirements under the given constraints, and a novel multi-objective flower pollination algorithm is utilized to check the results. The pros and cons of the two optimization techniques are compared and evaluated. An isolated microgrid is modelled in the MATLAB software package, and dispatch of active/reactive power and optimal load flow analysis with slack bus selection are carried out to minimize fuel cost and line losses under realistic constraints. The performance of the system is studied and analyzed during both summer and winter conditions, and three case studies are presented for each condition. The modified IEEE 15-bus system is used to validate the proposed algorithm.
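At the core of both non-dominated sorting and multi-objective comparison is the Pareto dominance test: a candidate survives only if no other candidate is at least as good in every objective and strictly better in one. A minimal sketch (minimization of illustrative fuel-cost and line-loss objectives; the numbers are made up):

```python
# Sketch: Pareto dominance test and non-dominated filtering (minimization).
def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Objectives: (fuel cost, line losses) for four candidate dispatches.
candidates = [(120.0, 4.1), (115.0, 4.6), (130.0, 3.8), (118.0, 4.8)]
print(pareto_front(candidates))   # (118.0, 4.8) is dominated by (115.0, 4.6)
```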
Single-polymer dynamics under constraints: scaling theory and computer experiment.
Milchev, Andrey
2011-03-16
The relaxation, diffusion and translocation dynamics of single linear polymer chains in confinement is briefly reviewed with emphasis on the comparison between theoretical scaling predictions and observations from experiment or, most frequently, from computer simulations. Besides cylindrical, spherical and slit-like constraints, related problems such as the chain dynamics in a random medium and the translocation dynamics through a nanopore are also considered. Another particular kind of confinement is imposed by polymer adsorption on attractive surfaces or selective interfaces--a short overview of single-chain dynamics is also contained in this survey. While both theory and numerical experiments consider predominantly coarse-grained models of self-avoiding linear chain molecules with typically Rouse dynamics, we also note some recent studies which examine the impact of hydrodynamic interactions on polymer dynamics in confinement. In all of the aforementioned cases we focus mainly on the consequences of imposed geometric restrictions on single-chain dynamics and try to check our degree of understanding by assessing the agreement between theoretical predictions and observations.
Novel optimization technique of isolated microgrid with hydrogen energy storage
Abdelghany, Hazem; Eteiba, Mahmoud
2018-01-01
This paper presents a novel optimization technique for energy management studies of an isolated microgrid. The system is supplied by various Distributed Energy Resources (DERs): a Diesel Generator (DG), a Wind Turbine Generator (WTG), and Photovoltaic (PV) arrays, supported by a fuel cell/electrolyzer hydrogen storage system for short-term storage. Multi-objective optimization through a non-dominated sorting genetic algorithm is used to meet the load requirements under the given constraints, and a novel multi-objective flower pollination algorithm is utilized to check the results. The pros and cons of the two optimization techniques are compared and evaluated. An isolated microgrid is modelled in the MATLAB software package, and dispatch of active/reactive power and optimal load flow analysis with slack bus selection are carried out to minimize fuel cost and line losses under realistic constraints. The performance of the system is studied and analyzed during both summer and winter conditions, and three case studies are presented for each condition. The modified IEEE 15-bus system is used to validate the proposed algorithm. PMID:29466433
Convolutional encoding of self-dual codes
NASA Technical Reports Server (NTRS)
Solomon, G.
1994-01-01
There exist almost complete convolutional encodings of self-dual codes, i.e., block codes of rate 1/2 with weights w, w = 0 mod 4. The codes are of length 8m with the convolutional portion of length 8m-2 and the nonsystematic information of length 4m-1. The last two bits are parity checks on the two (4m-1) length parity sequences. The final information bit complements one of the extended parity sequences of length 4m. Solomon and van Tilborg have developed algorithms to generate these for the Quadratic Residue (QR) Codes of lengths 48 and beyond. For these codes and reasonable constraint lengths, there are sequential decodings for both hard and soft decisions. There are also possible Viterbi-type decodings that may be simple, as in a convolutional encoding/decoding of the extended Golay Code. In addition, the previously found constraint length K = 9 for the QR (48, 24;12) Code is lowered here to K = 8.
Low-energy effective field theory below the electroweak scale: operators and matching
NASA Astrophysics Data System (ADS)
Jenkins, Elizabeth E.; Manohar, Aneesh V.; Stoffer, Peter
2018-03-01
The gauge-invariant operators up to dimension six in the low-energy effective field theory below the electroweak scale are classified. There are 70 Hermitian dimension-five and 3631 Hermitian dimension-six operators that conserve baryon and lepton number, as well as ΔB = ±ΔL = ±1, ΔL = ±2, and ΔL = ±4 operators. The matching onto these operators from the Standard Model Effective Field Theory (SMEFT) up to order 1/Λ² is computed at tree level. SMEFT imposes constraints on the coefficients of the low-energy effective theory, which can be checked experimentally to determine whether the electroweak gauge symmetry is broken by a single fundamental scalar doublet as in SMEFT. Our results, when combined with the one-loop anomalous dimensions of the low-energy theory and the one-loop anomalous dimensions of SMEFT, allow one to compute the low-energy implications of new physics to leading-log accuracy, and combine them consistently with high-energy LHC constraints.
Inferring Metadata for a Semantic Web Peer-to-Peer Environment
ERIC Educational Resources Information Center
Brase, Jan; Painter, Mark
2004-01-01
Learning Objects Metadata (LOM) aims at describing educational resources in order to allow better reusability and retrieval. In this article we show how additional inference rules allows us to derive additional metadata from existing ones. Additionally, using these rules as integrity constraints helps us to define the constraints on LOM elements,…
Finite BRST-BFV transformations for dynamical systems with second-class constraints
NASA Astrophysics Data System (ADS)
Batalin, Igor A.; Lavrov, Peter M.; Tyutin, Igor V.
2015-06-01
We study finite field-dependent BRST-BFV transformations for dynamical systems with first- and second-class constraints within the generalized Hamiltonian formalism. We find explicitly their Jacobians and the form of a solution to the compensation equation necessary for generating an arbitrary finite change of gauge-fixing functionals in the path integral.
Scheduling double round-robin tournaments with divisional play using constraint programming
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carlsson, Mats; Johansson, Mikael; Larson, Jeffrey
We study a tournament format that extends a traditional double round-robin format with divisional single round-robin tournaments. Elitserien, the top Swedish handball league, uses such a format for its league schedule. We present a constraint programming model that characterizes the general double round-robin plus divisional single round-robin format. This integrated model allows scheduling to be performed in a single step, as opposed to common multistep approaches that decompose scheduling into smaller problems and possibly miss optimal solutions. In addition to general constraints, we introduce Elitserien-specific requirements for its tournament. These general and league-specific constraints allow us to identify implicit and symmetry-breaking properties that reduce the time to solution from hours to seconds. A scalability study of the number of teams shows that our approach is reasonably fast for even larger league sizes. The experimental evaluation of the integrated approach takes considerably less computational effort to schedule Elitserien than does the previous decomposed approach. (C) 2016 Elsevier B.V. All rights reserved.
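The building block the constraint model extends is the single round-robin. Below is a sketch of the classic circle method for generating one; the double round-robin plus divisional format layers a second full round-robin, divisional rounds, and the league-specific constraints on top of schedules like this.

```python
# Sketch of the circle method for a single round-robin among n teams:
# fix the first team, rotate the rest, and pair off opposite positions.
def single_round_robin(teams):
    teams = list(teams)
    if len(teams) % 2:
        teams.append(None)                     # odd team count: add a bye
    n, rounds = len(teams), []
    for _ in range(n - 1):
        pairs = [(teams[i], teams[n - 1 - i]) for i in range(n // 2)
                 if None not in (teams[i], teams[n - 1 - i])]
        rounds.append(pairs)
        teams = [teams[0]] + [teams[-1]] + teams[1:-1]   # rotate all but first
    return rounds

for rnd, games in enumerate(single_round_robin("ABCDEF"), 1):
    print(rnd, games)    # 5 rounds; every team plays once per round
```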
3-D model-based Bayesian classification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soenneland, L.; Tenneboe, P.; Gehrmann, T.
1994-12-31
The challenging task of the interpreter is to integrate different pieces of information and combine them into an earth model. The sophistication of this earth model might vary from the simplest geometrical description to the most complex set of reservoir parameters related to the geometrical description. Obviously the sophistication level also depends on the completeness of the available information. The authors describe the interpreter's task as a mapping between the observation space and the model space. The information available to the interpreter exists in observation space, and the task is to infer a model in model space. It is well known that this inversion problem is non-unique, so any attempt to find a solution depends on constraints being added in some manner. The solution will obviously depend on which constraints are introduced, and it would be desirable to allow the interpreter to modify the constraints in a problem-dependent manner. The authors present a probabilistic framework that gives the interpreter the tools to integrate the different types of information and produce constrained solutions. The constraints can be adapted to the problem at hand.
Stationary properties of maximum-entropy random walks.
Dixit, Purushottam D
2015-10-01
Maximum-entropy (ME) inference of state probabilities using state-dependent constraints is popular in the study of complex systems. In stochastic systems, how state space topology and path-dependent constraints affect ME-inferred state probabilities remains unknown. To that end, we derive the transition probabilities and the stationary distribution of a maximum path entropy Markov process subject to state- and path-dependent constraints. A main finding is that the stationary distribution over states differs significantly from the Boltzmann distribution and reflects a competition between path multiplicity and imposed constraints. We illustrate our results with particle diffusion on a two-dimensional landscape. Connections with the path integral approach to diffusion are discussed.
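In the unconstrained base case, the maximum-entropy random walk has a closed form in terms of the Perron eigenpair (λ, ψ) of the adjacency matrix: P_ij = A_ij ψ_j / (λ ψ_i), with stationary distribution π_i ∝ ψ_i². The numerical sketch below verifies these standard relations; the paper's state- and path-dependent constraints generalize this case.

```python
# Numerical sketch of the (unconstrained) maximum-entropy random walk:
# transitions follow the principal eigenvector of the adjacency matrix, and
# the stationary distribution is psi_i**2, visibly different from the
# degree-proportional distribution of an ordinary random walk.
import numpy as np

A = np.array([[0, 1, 1, 0],                   # adjacency matrix of a small graph
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], float)

lam, psi = np.linalg.eigh(A)
lam, psi = lam[-1], np.abs(psi[:, -1])        # Perron eigenpair

P = A * psi[None, :] / (lam * psi[:, None])   # P_ij = A_ij psi_j / (lam psi_i)
pi = psi**2 / np.sum(psi**2)                  # MERW stationary distribution

assert np.allclose(P.sum(axis=1), 1) and np.allclose(pi @ P, pi)
print(np.round(pi, 3))
```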
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, S; Gardner, S; Doemer, A
Purpose: Investigate the use of standardized non-coplanar arcs to improve plan quality in lung Stereotactic Body Radiation Therapy (SBRT) VMAT planning. Methods: VMAT planning was performed for 9 patients previously treated with SBRT for peripheral lung tumors (tumor size: 12.7 cc to 32.5 cc). For each patient, 7 VMAT plans (couch rotation values: 0, 5, 10, 15, 20, 25, and 30 deg) were generated; the coplanar plans were pushed to meet the RTOG 0915 constraints, and each non-coplanar plan utilized the same optimization constraints. The following plan dose metrics were used (taken from RTOG 0915): D-2cm, the maximum dose at 2 cm from the PTV; conformality index (CI); gradient index (GI); lung volume receiving 5 Gy (V5); and lung volume receiving 20 Gy (V20). Couch collision clearance was checked for each plan through a dry run using the couch position from the patient’s treatment. Results: Of the 9 cases, one coplanar plan failed to meet two protocol guidelines (both the gradient index and the D-2cm parameter), and an additional plan failed the D-2cm parameter. When introducing at least a 5 degree couch rotation, all plans met the protocol guidelines. The largest feasible couch angle was 15 to 20 degrees due to gantry collision issues. Non-coplanar plans resulted in the following average (standard deviation) reductions: GI by 7.3% (3.7%); lung V20 by 11.1% (3.2%); D-2cm by 12.7% (3.9%). The CI was unchanged (−0.3%±0.6%), and lung V5 increased (3.8%±8.2%). Conclusion: The use of couch rotations as small as 5 degrees allows for plan quality that meets RTOG 0915 constraints while reducing D-2cm, GI, and lung V20. Using default couch rotations while planning SBRT cases will allow for more efficient planning with the stated goal of meeting RTOG 0915 criteria for all clinical cases. Gantry clearance checks in the treatment room may be necessary to ensure safe treatments for larger couch rotation values.
Principles of proteome allocation are revealed using proteomic data and genome-scale models
Yang, Laurence; Yurkovich, James T.; Lloyd, Colton J.; Ebrahim, Ali; Saunders, Michael A.; Palsson, Bernhard O.
2016-01-01
Integrating omics data to refine or make context-specific models is an active field of constraint-based modeling. Proteomics now cover over 95% of the Escherichia coli proteome by mass. Genome-scale models of Metabolism and macromolecular Expression (ME) compute proteome allocation linked to metabolism and fitness. Using proteomics data, we formulated allocation constraints for key proteome sectors in the ME model. The resulting calibrated model effectively computed the “generalist” (wild-type) E. coli proteome and phenotype across diverse growth environments. Across 15 growth conditions, prediction errors for growth rate and metabolic fluxes were 69% and 14% lower, respectively. The sector-constrained ME model thus represents a generalist ME model reflecting both growth rate maximization and “hedging” against uncertain environments and stresses, as indicated by significant enrichment of these sectors for the general stress response sigma factor σS. Finally, the sector constraints represent a general formalism for integrating omics data from any experimental condition into constraint-based ME models. The constraints can be fine-grained (individual proteins) or coarse-grained (functionally-related protein groups) as demonstrated here. This flexible formalism provides an accessible approach for narrowing the gap between the complexity captured by omics data and governing principles of proteome allocation described by systems-level models. PMID:27857205
Principles of proteome allocation are revealed using proteomic data and genome-scale models
Yang, Laurence; Yurkovich, James T.; Lloyd, Colton J.; ...
2016-11-18
Integrating omics data to refine or make context-specific models is an active field of constraint-based modeling. Proteomics now cover over 95% of the Escherichia coli proteome by mass. Genome-scale models of Metabolism and macromolecular Expression (ME) compute proteome allocation linked to metabolism and fitness. Using proteomics data, we formulated allocation constraints for key proteome sectors in the ME model. The resulting calibrated model effectively computed the “generalist” (wild-type) E. coli proteome and phenotype across diverse growth environments. Across 15 growth conditions, prediction errors for growth rate and metabolic fluxes were 69% and 14% lower, respectively. The sector-constrained ME model thus represents a generalist ME model reflecting both growth rate maximization and “hedging” against uncertain environments and stresses, as indicated by significant enrichment of these sectors for the general stress response sigma factor σS. Finally, the sector constraints represent a general formalism for integrating omics data from any experimental condition into constraint-based ME models. The constraints can be fine-grained (individual proteins) or coarse-grained (functionally-related protein groups) as demonstrated here. Furthermore, this flexible formalism provides an accessible approach for narrowing the gap between the complexity captured by omics data and governing principles of proteome allocation described by systems-level models.
Controlling Bed Bugs Using Integrated Pest Management (IPM)
Several non-chemical methods can help control an infestation, such as heat treatment or freezing, or mattress and box spring encasements. When using a pesticide, follow label directions carefully and check for EPA registration.
Practical Cleanroom Operations Constraints
NASA Technical Reports Server (NTRS)
Hughes, David; Ginyard, Amani
2007-01-01
This viewgraph presentation reviews the GSFC cleanroom facility, i.e., the Spacecraft Systems Development and Integration Facility (SSDIF), with particular interest in its use during the development of the Wide Field Camera 3 (WFC3). The SSDIF is described and a diagram of it is shown. A constraint table was created for consistency within the Contamination Control Team; this table is shown. Another table shows the activities allowed during integration for a given WFC3 condition and activity location. Three decision trees are shown for different phases of the work: (1) Hardware Relocation, (2) Hardware Work, and (3) Contamination Control Operations.
Throughput and latency programmable optical transceiver by using DSP and FEC control.
Tanimura, Takahito; Hoshida, Takeshi; Kato, Tomoyuki; Watanabe, Shigeki; Suzuki, Makoto; Morikawa, Hiroyuki
2017-05-15
We propose and experimentally demonstrate a proof-of-concept programmable optical transceiver that enables simultaneous optimization of multiple programmable parameters (modulation format, symbol rate, power allocation, and FEC) to satisfy throughput, signal-quality, and latency requirements. The proposed optical transceiver also accommodates multiple sub-channels that can transport different optical signals with different requirements. The parameters' many degrees of freedom often make it difficult to find the optimum combination because the number of combinations explodes. The proposed optical transceiver reduces the number of combinations and finds feasible sets of programmable parameters by combining constraints on the parameters with a precise analytical model. For precise BER prediction with a specified set of parameters, we model the sub-channel BER as a function of OSNR, modulation format, symbol rate, and the power difference between sub-channels. Next, we formulate simple constraints on the parameters and combine them with the analytical model to seek feasible sets of programmable parameters. Finally, we experimentally demonstrate end-to-end operation of the proposed optical transceiver in an offline manner, including low-density parity-check (LDPC) FEC encoding and decoding, for a specific use case with a latency-sensitive application and 40-km transmission.
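The feasibility-pruning idea can be sketched as a small parameter search: enumerate (format, symbol-rate) combinations, discard those that miss the throughput target or whose predicted BER exceeds the FEC limit, and keep the lowest-latency survivor. The BER and latency models below are toy stand-ins for the paper's fitted analytical model.

```python
# Sketch: prune transceiver parameter combinations with simple constraints,
# then pick the lowest-latency feasible configuration.
from itertools import product

formats = {"QPSK": 2, "16QAM": 4, "64QAM": 6}      # bits per symbol
symbol_rates = [16e9, 32e9]                        # baud

def predicted_ber(bits, rate):
    # Placeholder monotone model: denser formats / faster rates hurt BER.
    return 1e-9 * (10 ** (bits - 2)) * (rate / 16e9)

feasible = []
for (fmt, bits), rate in product(formats.items(), symbol_rates):
    throughput = bits * rate
    if throughput < 100e9:                   # constraint: requested throughput
        continue
    if predicted_ber(bits, rate) > 1e-3:     # constraint: within the FEC limit
        continue
    latency = 1.0 if fmt == "QPSK" else 2.0  # toy model: heavier FEC adds delay
    feasible.append((latency, fmt, rate, throughput))

print(min(feasible))    # lowest-latency feasible configuration
```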
Recent Advances in Stellarator Optimization
NASA Astrophysics Data System (ADS)
Gates, David; Brown, T.; Breslau, J.; Landreman, M.; Lazerson, S. A.; Mynick, H.; Neilson, G. H.; Pomphrey, N.
2016-10-01
Computational optimization has revolutionized the field of stellarator design. To date, optimizations have focused primarily on neoclassical confinement and ideal MHD stability, although limited optimization of other parameters has also been performed. One criticism that has been levelled at this method of design is the complexity of the resultant field coils. Recently, a new coil optimization code, COILOPT++, was written and included in the STELLOPT suite of codes. The advantage of this method is that it allows the addition of real-space constraints on the locations of the coils. As an initial exercise, a constraint that the windings be vertical was placed on the large-major-radius half of the non-planar coils. Further constraints were imposed to guarantee that sector blanket modules could be removed from between the coils, enabling a sector maintenance scheme. Results of this exercise will be presented. We have also explored possibilities for generating an experimental database to check whether the reduction in turbulent transport predicted by GENE as a function of local shear is consistent with experiments. To this end, a series of equilibria that can be made in the now latent QUASAR experiment have been identified. This work was supported by U.S. DoE Contract #DE-AC02-09CH11466.
NASA Astrophysics Data System (ADS)
González, M. R.; Torres, F.; Yoldi, V.; Arcega, F.; Plaza, I.
2012-04-01
An integrated management model for an organization is proposed. The model is based on the continuous-improvement Plan-Do-Check-Act cycle and intends to integrate environmental, risk-prevention, and ethical aspects, as well as research, development, and innovation project management, into the general quality management structure proposed by ISO 9001:2008. It aims to fulfill the standards ISO 9001, ISO 14001, OHSAS 18001, SGE 21, and UNE 166002.
Díaz-Castillo, Carlos; Xia, Xiao-Qin; Ranz, José M.
2012-01-01
Why gene order is conserved over long evolutionary timespans remains elusive. A common interpretation is that gene order conservation might reflect the existence of functional constraints that are important for organismal performance; alteration of the integrity of genomic regions, and therefore of those constraints, would result in detrimental effects. This notion seems especially plausible in genomes that can easily accommodate gene reshuffling via chromosomal inversions, since genomic regions free of constraints are likely to have been disrupted in one or more lineages. Nevertheless, no empirical test of this notion has been performed. Here, we disrupt one of the largest conserved genomic regions of the Drosophila genome by chromosome engineering and examine the phenotypic consequences of such disruption. The targeted region exhibits multiple patterns of functional enrichment suggestive of the presence of constraints. The carriers of the disrupted collinear block show no defects in their viability, fertility, and parameters of general homeostasis, although their odorant perception is altered. This change in odorant perception does not correlate with modifications of the level of expression and sex bias of the genes within the disrupted genomic region. Our results indicate that even in highly rearranged genomes, like those of Diptera, unusually high levels of gene order conservation cannot be systematically attributed to functional constraints, which raises the possibility that other mechanisms are in place and that the underpinnings of the maintenance of gene organization might be more diverse than previously thought. PMID:22319453
GraDit: graph-based data repair algorithm for multiple data edits rule violations
NASA Astrophysics Data System (ADS)
Ode Zuhayeni Madjida, Wa; Gusti Bagus Baskara Nugraha, I.
2018-03-01
Constraint-based data cleaning captures violations of a set of rules called data quality rules. The rules consist of integrity constraints and data edits. Structurally they are similar: each rule contains a left-hand side and a right-hand side. Previous research proposed a data repair algorithm for integrity constraint violations that uses an undirected hypergraph to represent rule violations. Nevertheless, this algorithm cannot be applied to data edits because the rules' characteristics differ. This study proposes GraDit, a repair algorithm for data edit rules. First, we use a bipartite directed hypergraph to represent the full set of defined rules; this representation captures the interaction between violated and clean rules. We then propose an undirected graph as the violation representation. Our experiments show that the algorithm using an undirected-graph violation representation yields better data quality than the one using an undirected hypergraph.
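One plausible reading of the violation representation, records that jointly violate an edit rule connected in an undirected graph so repairs can be propagated per connected component, can be sketched as follows; the rule and the graph construction are illustrative, and the paper's actual construction may differ in detail.

```python
# Sketch: build an undirected violation graph from records that violate an
# illustrative edit rule ("if age < 16 then married must be 'no'").
import networkx as nx

records = {1: {"age": 14, "married": "yes"},
           2: {"age": 14, "married": "yes"},
           3: {"age": 30, "married": "yes"}}

violates = lambda r: r["age"] < 16 and r["married"] != "no"

G = nx.Graph()
G.add_nodes_from(records)
bad = [rid for rid, r in records.items() if violates(r)]
G.add_edges_from((a, b) for i, a in enumerate(bad) for b in bad[i + 1:])

print(list(nx.connected_components(G)))   # {1, 2} share a violation; {3} is clean
```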
Option pricing, stochastic volatility, singular dynamics and constrained path integrals
NASA Astrophysics Data System (ADS)
Contreras, Mauricio; Hojman, Sergio A.
2014-01-01
Stochastic volatility models have been widely studied and used in the financial world. The Heston model (Heston, 1993) [7] is one of the best known models to deal with this issue. These stochastic volatility models are characterized by the fact that they explicitly depend on a correlation parameter ρ which relates the two Brownian motions that drive the stochastic dynamics associated to the volatility and the underlying asset. Solutions to the Heston model in the context of option pricing, using a path integral approach, are found in Lemmens et al. (2008) [21] while in Baaquie (2007,1997) [12,13] propagators for different stochastic volatility models are constructed. In all previous cases, the propagator is not defined for extreme cases ρ=±1. It is therefore necessary to obtain a solution for these extreme cases and also to understand the origin of the divergence of the propagator. In this paper we study in detail a general class of stochastic volatility models for extreme values ρ=±1 and show that in these two cases, the associated classical dynamics corresponds to a system with second class constraints, which must be dealt with using Dirac’s method for constrained systems (Dirac, 1958,1967) [22,23] in order to properly obtain the propagator in the form of a Euclidean Hamiltonian path integral (Henneaux and Teitelboim, 1992) [25]. After integrating over momenta, one gets an Euclidean Lagrangian path integral without constraints, which in the case of the Heston model corresponds to a path integral of a repulsive radial harmonic oscillator. In all the cases studied, the price of the underlying asset is completely determined by one of the second class constraints in terms of volatility and plays no active role in the path integral.
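For reference, the Heston dynamics referred to above, with the correlation parameter ρ coupling the two Brownian motions; the singular cases analyzed in the paper are ρ = ±1:

```latex
% Heston model: asset price S_t and variance v_t with correlated noise
dS_t = \mu S_t\,dt + \sqrt{v_t}\,S_t\,dW^S_t, \qquad
dv_t = \kappa(\theta - v_t)\,dt + \sigma\sqrt{v_t}\,dW^v_t, \qquad
dW^S_t\,dW^v_t = \rho\,dt
```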
Watson-Jones, Deborah; Lees, Shelley; Mwanga, Joseph; Neke, Nyasule; Changalucha, John; Broutet, Nathalie; Maduhu, Ibrahim; Kapiga, Saidi; Chandra-Mouli, Venkatraman; Bloem, Paul; Ross, David A
2016-01-01
Background: Human papillomavirus (HPV) vaccination offers an opportunity to strengthen provision of adolescent health interventions (AHI). We explored the feasibility of integrating other AHI with HPV vaccination in Tanzania. Methods: A desk review of 39 policy documents was preceded by a stakeholder meeting with 38 policy makers and partners. Eighteen key informant interviews (KIIs) with health and education policy makers and district officials were conducted to further explore perceptions of current programs, priorities and AHI that might be suitable for integration with HPV vaccination. Results: Fourteen school health interventions (SHI) or AHI are currently being implemented by the Government of Tanzania. Most are delivered as vertical programmes. Coverage of current programs is not universal, and is limited by financial, human resource and logistic constraints. Limited community engagement, rumours, and lack of strategic advocacy has affected uptake of some interventions, e.g. tetanus toxoid (TT) immunization. Stakeholder and KI perceptions and opinions were limited by a lack of experience with integrated delivery and AHI that were outside an individual’s area of expertise and experience. Deworming and educational sessions including reproductive health education were the most frequently mentioned interventions that respondents considered suitable for integrated delivery with HPV vaccine. Conclusions: Given programme constraints, limited experience with integrated delivery and concern about real or perceived side-effects being attributed to the vaccine, it will be very important to pilot-test integration of AHI/SHI with HPV vaccination. Selected interventions will need to be simple and quick to deliver since health workers are likely to face significant logistic and time constraints during vaccination visits. PMID:26768827
Genetic constraints predict evolutionary divergence in Dalechampia blossoms.
Bolstad, Geir H; Hansen, Thomas F; Pélabon, Christophe; Falahati-Anbaran, Mohsen; Pérez-Barrales, Rocío; Armbruster, W Scott
2014-08-19
If genetic constraints are important, then rates and direction of evolution should be related to trait evolvability. Here we use recently developed measures of evolvability to test the genetic constraint hypothesis with quantitative genetic data on floral morphology from the Neotropical vine Dalechampia scandens (Euphorbiaceae). These measures were compared against rates of evolution and patterns of divergence among 24 populations in two species in the D. scandens species complex. We found clear evidence for genetic constraints, particularly among traits that were tightly phenotypically integrated. This relationship between evolvability and evolutionary divergence is puzzling, because the estimated evolvabilities seem too large to constitute real constraints. We suggest that this paradox can be explained by a combination of weak stabilizing selection around moving adaptive optima and small realized evolvabilities relative to the observed additive genetic variance.
Experiences in integrating auto-translated state-chart designs for model checking
NASA Technical Reports Server (NTRS)
Pingree, P. J.; Benowitz, E. G.
2003-01-01
In the complex environment of JPL's flight missions with increasing dependency on advanced software designs, traditional software validation methods of simulation and testing are being stretched to adequately cover the needs of software development.
AutoLock: a semiautomated system for radiotherapy treatment plan quality control
Dewhurst, Joseph M.; Lowe, Matthew; Hardy, Mark J.; Boylan, Christopher J.; Whitehurst, Philip; Rowbottom, Carl G.
2015-01-01
A semiautomated system for radiotherapy treatment plan quality control (QC), named AutoLock, is presented. AutoLock is designed to augment treatment plan QC by automatically checking aspects of treatment plans that are well suited to computational evaluation, whilst summarizing more subjective aspects in the form of a checklist. The treatment plan must pass all automated checks and all checklist items must be acknowledged by the planner as correct before the plan is finalized. Thus AutoLock uniquely integrates automated treatment plan QC, an electronic checklist, and plan finalization. In addition to reducing the potential for the propagation of errors, the integration of AutoLock into the plan finalization workflow has improved efficiency at our center. Detailed audit data are presented, demonstrating that the treatment plan QC rejection rate fell by around a third following the clinical introduction of AutoLock. PACS number: 87.55.Qr PMID:26103498
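The gating logic described here, automated checks plus planner-acknowledged checklist items with both required before finalization, can be sketched as follows (the check names and data model are hypothetical illustrations, not taken from the paper):

```python
from dataclasses import dataclass, field

@dataclass
class Plan:
    prescription_dose: float          # Gy
    fractions: int
    max_mu_per_beam: float
    checklist_acks: set = field(default_factory=set)

# Automated checks: aspects well suited to computational evaluation.
AUTOMATED_CHECKS = {
    "dose_per_fraction_sane": lambda p: 0.5 <= p.prescription_dose / p.fractions <= 20.0,
    "mu_within_limit": lambda p: p.max_mu_per_beam <= 999.0,
}

# Checklist: subjective aspects the planner must explicitly acknowledge.
CHECKLIST_ITEMS = {"contours_reviewed", "plan_matches_intent"}

def can_finalize(plan: Plan) -> bool:
    """The plan is finalized only if every automated check passes and
    every checklist item has been acknowledged by the planner."""
    auto_ok = all(check(plan) for check in AUTOMATED_CHECKS.values())
    acks_ok = CHECKLIST_ITEMS <= plan.checklist_acks
    return auto_ok and acks_ok

plan = Plan(prescription_dose=60.0, fractions=30, max_mu_per_beam=450.0,
            checklist_acks={"contours_reviewed", "plan_matches_intent"})
print(can_finalize(plan))  # True
```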
Bayesian tomography and integrated data analysis in fusion diagnostics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Dong, E-mail: lid@swip.ac.cn; Dong, Y. B.; Deng, Wei
2016-11-15
In this article, a Bayesian tomography method using a non-stationary Gaussian process as a prior is introduced. The Bayesian formalism allows quantities which bear uncertainty to be expressed in probabilistic form, so that the uncertainty of a final solution can be fully resolved from the confidence interval of a posterior probability. Moreover, a consistency check of that solution can be performed by checking whether the misfits between predicted and measured data are reasonably within the assumed data error. In particular, the accuracy of reconstructions is significantly improved by using the non-stationary Gaussian process, which can adapt to the varying smoothness of the emission distribution. The implementation of this method for a soft X-ray diagnostic on HL-2A has been used to explore relevant physics in equilibrium and MHD instability modes. This project is carried out within a large-scale inference framework, aiming at an integrated analysis of heterogeneous diagnostics.
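The consistency check described here, verifying that misfits between predicted and measured data fall within the assumed error, amounts to a reduced chi-squared test on the posterior solution. A schematic version, with illustrative variable names:

```python
import numpy as np

def consistency_check(d_measured, d_predicted, sigma):
    """Check that residuals are compatible with the assumed data error.

    A reduced chi-squared near 1 indicates the reconstruction neither
    over-fits (<< 1) nor under-fits (>> 1) the tomographic data.
    """
    residuals = (d_measured - d_predicted) / sigma
    return np.sum(residuals**2) / len(d_measured)

rng = np.random.default_rng(1)
truth = np.sin(np.linspace(0, np.pi, 50))   # stand-in for predicted data
sigma = 0.05
data = truth + sigma * rng.standard_normal(50)
print("reduced chi^2:", consistency_check(data, truth, sigma))  # ~1
```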
ERIC Educational Resources Information Center
Monroe, Brian M.; Read, Stephen J.
2008-01-01
A localist, parallel constraint satisfaction, artificial neural network model is presented that accounts for a broad collection of attitude and attitude-change phenomena. The network represents the attitude object and cognitions and beliefs related to the attitude, as well as how to integrate a persuasive message into this network. Short-term…
A Partial Test of Agnew's General Theory of Crime and Delinquency
ERIC Educational Resources Information Center
Zhang, Yan; Day, George; Cao, Liqun
2012-01-01
In 2005, Agnew introduced a new integrated theory, which he labels a general theory of crime and delinquency. He proposes that delinquency is more likely to occur when constraints against delinquency are low and motivations for delinquency are high. In addition, he argues that constraints and motivations are influenced by variables in five life…
Chen, Guang-Pei; Ahunbay, Ergun; Li, X Allen
2016-04-01
To develop an integrated quality assurance (QA) software tool for online replanning capable of efficiently and automatically checking radiation treatment (RT) planning parameters and gross plan quality, verifying treatment plan data transfer from the treatment planning system (TPS) to the record-and-verify (R&V) system, performing a secondary monitor unit (MU) calculation with or without the presence of a magnetic field from an MR-Linac, and validating the consistency of the delivery record with the plan. The software tool, named ArtQA, was developed to obtain and compare plan and treatment parameters from both the TPS and the R&V system database. The TPS data are accessed via direct file reading and the R&V data are retrieved via open database connectivity and structured query language. Plan quality is evaluated with both the logical consistency of planning parameters and the achieved dose-volume histograms. Beams in the TPS and R&V system are matched based on geometry configurations. To account for the effect of a 1.5 T transverse magnetic field from the MR-Linac in the secondary MU calculation, a method based on a modified Clarkson integration algorithm was developed and tested for a series of clinical situations. ArtQA has been used in the authors' clinic and can quickly detect inconsistencies and deviations in the entire RT planning process. With the use of the ArtQA tool, the efficiency of plan checking, including plan quality, data transfer, and delivery checks, can be improved by at least 60%. The newly developed independent MU calculation tool for the MR-Linac reduces the difference between the planned and calculated MUs by 10%. The software tool ArtQA can be used to perform a comprehensive QA check from planning to delivery with a conventional Linac or MR-Linac and is an essential tool for online replanning, where the QA check needs to be performed rapidly.
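The data-transfer verification step, matching beams between the TPS and R&V system and flagging parameter deviations, reduces to a field-by-field comparison. A toy sketch with hypothetical parameter names (the actual database schema is not described in the abstract):

```python
def compare_beam(tps_beam: dict, rv_beam: dict, tol: float = 1e-3):
    """Return the parameters that disagree between the treatment
    planning system (TPS) and record-and-verify (R&V) records."""
    mismatches = []
    for key in tps_beam.keys() & rv_beam.keys():
        a, b = tps_beam[key], rv_beam[key]
        if isinstance(a, float):
            if abs(a - b) > tol:          # numeric fields compared to tolerance
                mismatches.append((key, a, b))
        elif a != b:                      # categorical fields compared exactly
            mismatches.append((key, a, b))
    return mismatches

tps = {"gantry_angle": 180.0, "mu": 123.4, "energy": "6MV"}
rv  = {"gantry_angle": 180.0, "mu": 123.9, "energy": "6MV"}
print(compare_beam(tps, rv))  # [('mu', 123.4, 123.9)]
```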
Transit spectroscopy with JWST: Systematics, starspots and stitching
NASA Astrophysics Data System (ADS)
Barstow, Joanna K.; Aigrain, Suzanne; Irwin, Patrick; Kendrew, Sarah; Fletcher, Leigh N.
2014-11-01
We explore the capabilities of JWST to obtain transit and eclipse spectra of a variety of exoplanets to provide constraints on their atmospheric properties, starting with hot Jupiters and Neptunes. We use the NEMESIS spectral modelling and inversion tool (Irwin et al. 2008) to calculate synthetic spectra for a range of exoplanet types from simple atmospheric models. Using a similar approach to Barstow et al. (2013), we add estimated noise levels to the synthetics before inverting the noisy spectra to check whether we accurately recover the input atmospheric state. The 5-12 μm MIRI LRS instrument on JWST (R ≈ 100) probes an important part of the infrared spectrum, especially for thermal emission; since a secondary eclipse spectrum relies on the contrast between the exoplanet and its host star, the signal is maximised at longer wavelengths. However, the spectral coverage of the LRS is insufficient to break degeneracies between temperature and composition without some constraint from other instruments. The ideal scenario would be to stitch together observations from NIRSpec (R ≈ 100, 0.6-5 μm), MIRI LRS, and MIRI MRS (an integral field unit providing R ≈ 3000 spectra across 12 gratings between 5 and 27 μm) to form a complete spectrum with broad wavelength coverage, which would enable the temperature structure and atmospheric composition to be uniquely retrieved. This means that our analysis must consider systematic offsets between parts of the spectrum obtained with different instruments. These offsets may have an astrophysical origin - for example due to stellar activity or temporal variability of the planet itself - or they may be instrumental effects. We explore the influence of these noise sources on our ability to accurately retrieve the true atmospheric state from transit and eclipse spectra. We can simultaneously retrieve the temperature structure and H2O, CO2, and CH4 abundances of a hot Jupiter orbiting a Sun-like star at 250 parsecs if a single eclipse each is observed with NIRSpec and MIRI LRS.
A Late Cenozoic Kinematic Model for Deformation Within the Greater Cascadia Subduction System
NASA Astrophysics Data System (ADS)
Wilson, D. S.; McCrory, P. A.
2016-12-01
Relatively low fault slip rates have complicated efforts to characterize seismic hazards associated with the diffuse subduction boundary between North America and offshore oceanic plates in the Pacific Northwest region. A kinematic forward model that encompasses a broader region, and incorporates seismologic and geodetic as well as geologic and paleomagnetic constraints offers a tool for constraining fault rupture chronologies—all within a framework tracking relative motion of the Juan de Fuca, Pacific, and North American plates during late Cenozoic time. Our kinematic model tracks motions as a system of rigid microplates, bounded by the more important mapped faults of the region or zones of distributed deformation. Though our emphasis is on Washington and Oregon, the scope of the model extends eastward to the rigid craton in Montana and Wyoming, and southward to the Sierra Nevada block of California to provide important checks on its internal consistency. The model reproduces observed geodetic velocities [e.g., McCaffrey et al., 2013, JGR], for 6 Ma to present, with only minor reorganization for 12-6 Ma. Constraints for the older deformation history are based on paleomagnetic rotations within the Columbia River Basalt Group, and geologic details of fault offsets. Since 17 Ma, our model includes 50 km of N-S shortening across the central Yakima fold and thrust belt, substantial NW-SE right-lateral strike slip distributed among faults in the Washington Cascade Range, 90 km of shortening on thrusts of Puget Lowland, and substantial oroclinal bending of the Crescent Formation basement surrounding the Olympic Peninsula. This kinematic reconstruction provides an integrated, quantitative framework with which to investigate the motions of various PNW forearc and backarc blocks during late Cenozoic time, an essential tool for characterizing the seismic risk associated with the Puget Sound and Portland urban areas, hydroelectric dams, and other critical infrastructure.
The NASA Lewis integrated propulsion and flight control simulator
NASA Technical Reports Server (NTRS)
Bright, Michelle M.; Simon, Donald L.
1991-01-01
A new flight simulation facility has been developed at NASA Lewis to allow integrated propulsion-control and flight-control algorithm development and evaluation in real time. As a preliminary check of the simulator facility and the correct integration of its components, the control design and physics models for an STOVL fighter aircraft model have been demonstrated, with their associated system integration and architecture, pilot vehicle interfaces, and display symbology. The results show that this fixed-based flight simulator can provide real-time feedback and display of both airframe and propulsion variables for validation of integrated systems and testing of control design methodologies and cockpit mechanizations.
NASA Astrophysics Data System (ADS)
Romero, C.; McWilliam, M.; Macías-Pérez, J.-F.; Adam, R.; Ade, P.; André, P.; Aussel, H.; Beelen, A.; Benoît, A.; Bideaud, A.; Billot, N.; Bourrion, O.; Calvo, M.; Catalano, A.; Coiffard, G.; Comis, B.; de Petris, M.; Désert, F.-X.; Doyle, S.; Goupy, J.; Kramer, C.; Lagache, G.; Leclercq, S.; Lestrade, J.-F.; Mauskopf, P.; Mayet, F.; Monfardini, A.; Pascale, E.; Perotto, L.; Pisano, G.; Ponthieu, N.; Revéret, V.; Ritacco, A.; Roussel, H.; Ruppin, F.; Schuster, K.; Sievers, A.; Triqueneaux, S.; Tucker, C.; Zylka, R.
2018-04-01
Context: In the past decade, sensitive, resolved Sunyaev-Zel'dovich (SZ) studies of galaxy clusters have become common. Whereas many previous SZ studies have parameterized the pressure profiles of galaxy clusters, non-parametric reconstructions provide insight into the thermodynamic state of the intracluster medium. Aims: We seek to recover the non-parametric pressure profiles of the high-redshift (z = 0.89) galaxy cluster CLJ 1226.9+3332 as inferred from SZ data from the MUSTANG, NIKA, Bolocam, and Planck instruments, which all probe different angular scales. Methods: Our non-parametric algorithm makes use of logarithmic interpolation, which under the assumption of ellipsoidal symmetry is analytically integrable. For MUSTANG, NIKA, and Bolocam we derive a non-parametric pressure profile independently and find good agreement among the instruments. In particular, we find that the non-parametric profiles are consistent with a fitted generalized Navarro-Frenk-White (gNFW) profile. Given the ability of Planck to constrain the total signal, we include a prior on the integrated Compton Y parameter as determined by Planck. Results: For a given instrument, constraints on the pressure profile diminish rapidly beyond the field of view. The overlap in spatial scales probed by these four datasets is therefore critical in checking for consistency between instruments. By using multiple instruments, our analysis of CLJ 1226.9+3332 covers a large radial range, from the central regions to the cluster outskirts: 0.05 R500 < r < 1.1 R500. This is a wider range of spatial scales than is typically recovered by SZ instruments. Similar analyses will be possible with the new generation of SZ instruments such as NIKA2 and MUSTANG2.
Constraint processing in our extensible language for cooperative imaging system
NASA Astrophysics Data System (ADS)
Aoki, Minoru; Murao, Yo; Enomoto, Hajime
1996-02-01
The extensible WELL (Window-based Elaboration Language) has been developed using the concept of a common platform, where client and server can communicate with each other with support from a communication manager. This extensible language is based on an object-oriented design that introduces constraint processing. Every kind of service in the extensible language, including imaging, is controlled by constraints. Interactive functions between client and server are extended by introducing agent functions, including a request-respond relation. Necessary service integrations are satisfied by cooperative processes that use constraints. Constraints are treated similarly to data, because the system should have flexibility in the execution of many kinds of services. A similar control process is defined using intensional logic. There are two kinds of constraints, temporal and modal constraints. When constraints are rendered, the predicate format, as a relation between attribute values, can warrant the validity of entities as data. As an imaging example, a processing procedure for interaction between multiple objects is shown as an image application for the extensible system. This paper describes how the procedure proceeds in the system and how the constraints work in generating moving pictures.
The sense of body ownership relaxes temporal constraints for multisensory integration.
Maselli, Antonella; Kilteni, Konstantina; López-Moliner, Joan; Slater, Mel
2016-08-03
Experimental work on body ownership illusions has shown how simple multisensory manipulation can generate the illusory experience of an artificial limb as being part of one's own body. This work highlighted how own-body perception relies on a plastic brain representation emerging from multisensory integration. The flexibility of this representation is reflected in the short-term modulations of physiological states and perceptual processing observed during these illusions. Here, we explore the impact of ownership illusions on the temporal dimension of multisensory integration. We show that, during the illusion, the temporal window for integrating touch on the physical body with touch seen on a virtual body representation increases with respect to integration with visual events seen close to, but separated from, the virtual body. We show that this effect is mediated by the ownership illusion. Crucially, the temporal window for visuotactile integration was positively correlated with participants' scores rating the illusory experience of owning the virtual body and touching the object seen in contact with it. Our results corroborate the recently proposed causal inference mechanism for illusory body ownership. As a novelty, they show that the ensuing illusory causal binding between stimuli from the real and fake body relaxes constraints for the integration of bodily signals.
Hiring a Pest Management Professional for Bed Bugs
If you hire someone to treat your bed bug infestation, make sure they use Integrated Pest Management (IPM) techniques, check their credentials, and know that they may need to make multiple visits, take apart furniture, and use vacuums, heat, and pesticides.
Runtime Analysis of Linear Temporal Logic Specifications
NASA Technical Reports Server (NTRS)
Giannakopoulou, Dimitra; Havelund, Klaus
2001-01-01
This report presents an approach to checking a running program against its Linear Temporal Logic (LTL) specifications. LTL is a widely used logic for expressing properties of programs viewed as sets of executions. Our approach consists of translating LTL formulae to finite-state automata, which are used as observers of the program behavior. The translation algorithm we propose modifies standard LTL to Büchi automata conversion techniques to generate automata that check finite program traces. The algorithm has been implemented in a tool, which has been integrated with the generic JPaX framework for runtime analysis of Java programs.
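A minimal illustration of finite-trace checking (not the JPaX algorithm itself): monitoring the response property G(p -> F q) over a finite trace by tracking outstanding obligations, which is what a finite-trace observer automaton for this formula effectively does.

```python
def check_response(trace, p="p", q="q"):
    """Finite-trace semantics for G(p -> F q): every occurrence of p
    must be answered by an occurrence of q before the trace ends.
    Each state in the trace is the set of propositions true there."""
    pending = False              # an unanswered p has been seen
    for state in trace:
        if q in state:
            pending = False      # obligation discharged
        if p in state and q not in state:
            pending = True       # new obligation opened
    return not pending           # no obligation may be open at trace end

print(check_response([{"p"}, {}, {"q"}]))   # True
print(check_response([{"p"}, {}, {}]))      # False: p never answered
```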
NASA Astrophysics Data System (ADS)
Lee, Kyu J.; Kunii, T. L.; Noma, T.
1993-01-01
In this paper, we propose a syntactic pattern recognition method for non-schematic drawings, based on a new attributed graph grammar with flexible embedding. In our graph grammar, the embedding rule permits the nodes of a guest graph to be arbitrarily connected with the nodes of a host graph. The ambiguity caused by this flexible embedding is controlled by evaluating synthesized attributes and checking context sensitivity. To integrate parsing with the synthesized attribute evaluation and the context sensitivity check, we also develop a bottom-up parsing algorithm.
STS-97 Mission Specialist Garneau with full launch and entry suit during pre-pack and fit check
NASA Technical Reports Server (NTRS)
2000-01-01
During pre-pack and fit check in the Operations and Checkout Building, STS-97 Commander Brent Jett gets help with his gloves from suit technician Bill Todd. Mission STS-97 is the sixth construction flight to the International Space Station. Its payload includes the P6 Integrated Truss Structure and a photovoltaic (PV) module, with giant solar arrays that will provide power to the Station. The mission includes two spacewalks to complete the solar array connections. STS-97 is scheduled to launch Nov. 30 at about 10:06 p.m. EST.
STS-97 Mission Specialist Garneau during pre-pack and fit check
NASA Technical Reports Server (NTRS)
2000-01-01
STS-97 Mission Specialist Marc Garneau gets help with his boots from suit technician Tommy McDonald during pre-pack and fit check. Garneau is with the Canadian Space Agency. Mission STS-97 is the sixth construction flight to the International Space Station. Its payload includes the P6 Integrated Truss Structure and a photovoltaic (PV) module, with giant solar arrays that will provide power to the Station. The mission includes two spacewalks to complete the solar array connections. STS-97 is scheduled to launch Nov. 30 at about 10:06 p.m. EST.
Automata-Based Verification of Temporal Properties on Running Programs
NASA Technical Reports Server (NTRS)
Giannakopoulou, Dimitra; Havelund, Klaus; Lan, Sonie (Technical Monitor)
2001-01-01
This paper presents an approach to checking a running program against its Linear Temporal Logic (LTL) specifications. LTL is a widely used logic for expressing properties of programs viewed as sets of executions. Our approach consists of translating LTL formulae to finite-state automata, which are used as observers of the program behavior. The translation algorithm we propose modifies standard LTL to Büchi automata conversion techniques to generate automata that check finite program traces. The algorithm has been implemented in a tool, which has been integrated with the generic JPaX framework for runtime analysis of Java programs.
2001-01-17
Workers in the Payload Changeout Room check the Payload Ground Handling Mechanism that will move the U.S. Lab Destiny out of Atlantis’ payload bay and into the PCR. After the move, Atlantis will roll back to the Vehicle Assembly Building to allow workers to conduct inspections, continuity checks and X-ray analysis on the 36 solid rocket booster cables located inside each booster’s system tunnel. An extensive evaluation of NASA’s SRB cable inventory revealed conductor damage in four (of about 200) cables on the shelf. Shuttle managers decided to prove the integrity of the system tunnel cables already installed on Atlantis.
The Z1 truss is placed in stand to check weight and balance
NASA Technical Reports Server (NTRS)
2000-01-01
In the Space Station Processing Facility, the Integrated Truss Structure Z1 rests in the workstand to check its weight and balance. The Z1 truss is the first of 10 trusses that will become the backbone of the International Space Station, eventually stretching the length of a football field. Along with its companion payload, the third Pressurized Mating Adapter, the Z1 is scheduled to be launched aboard Space Shuttle Discovery Oct. 5 at 9:38 p.m. EDT. The launch will be the 100th in the Shuttle program.
The Z1 truss is lowered to stand to check weight and balance
NASA Technical Reports Server (NTRS)
2000-01-01
In the Space Station Processing Facility, an overhead crane lowers the Integrated Truss Structure Z1 onto a workstand to check its weight and balance. The Z1 truss is the first of 10 trusses that will become the backbone of the International Space Station, eventually stretching the length of a football field. Along with its companion payload, the third Pressurized Mating Adapter, the Z1 is scheduled to be launched aboard Space Shuttle Discovery Oct. 5 at 9:38 p.m. EDT. The launch will be the 100th in the Shuttle program.
The Z1 truss is moved to check weight and balance
NASA Technical Reports Server (NTRS)
2000-01-01
In the Space Station Processing Facility, the Integrated Truss Structure Z1, an element of the International Space Station, is moved to another stand to check its weight and balance. The Z1 truss is the first of 10 trusses that will become the backbone of the Space Station, eventually stretching the length of a football field. Along with its companion payload, the third Pressurized Mating Adapter, the Z1 is scheduled to be launched aboard Space Shuttle Discovery Oct. 5 at 9:38 p.m. EDT. The launch will be the 100th in the Shuttle program.
The Z1 truss is moved to check weight and balance
NASA Technical Reports Server (NTRS)
2000-01-01
In the Space Station Processing Facility, the Integrated Truss Structure Z1, an element of the International Space Station, is lifted for moving to another stand to check its weight and balance. The Z1 truss is the first of 10 trusses that will become the backbone of the Space Station, eventually stretching the length of a football field. Along with its companion payload, the third Pressurized Mating Adapter, the Z1 is scheduled to be launched aboard Space Shuttle Discovery Oct. 5 at 9:38 p.m. EDT. The launch will be the 100th in the Shuttle program.
Astronaut Joseph Tanner checks gloves during launch/entry training
1994-06-23
S94-40082 (23 June 1994) --- Astronaut Joseph R. Tanner, mission specialist, checks his glove during a rehearsal for launch and entry phases of the scheduled November flight of STS-66. This rehearsal, held in the Crew Compartment Trainer (CCT) of the Johnson Space Center's (JSC) Shuttle Mockup and Integration Laboratory, was followed by a training session on emergency egress procedures. In November, Tanner will join four other NASA astronauts and a European mission specialist for a week and a half aboard the Space Shuttle Atlantis in Earth-orbit in support of the Atmospheric Laboratory for Applications and Science (ATLAS-3).
Nonplanar on-shell diagrams and leading singularities of scattering amplitudes
NASA Astrophysics Data System (ADS)
Chen, Baoyi; Chen, Gang; Cheung, Yeuk-Kwan E.; Li, Yunxuan; Xie, Ruofei; Xin, Yuan
2017-02-01
Bipartite on-shell diagrams are the latest tool for constructing scattering amplitudes. In this paper we prove that a Britto-Cachazo-Feng-Witten (BCFW) decomposable on-shell diagram possesses a rational top form if and only if the algebraic ideal comprising the geometrical constraints is shifted linearly during successive BCFW integrations. With a proper geometric interpretation of the constraints in the Grassmannian manifold, the rational top-form integration contours can thus be obtained, and understood, in a straightforward way. The rational top-form integrands of the leading singularities at arbitrarily high loop order can therefore be derived recursively, as long as the corresponding on-shell diagram is BCFW decomposable.
The capability and constraint model of recoverability: An integrated theory of continuity planning.
Lindstedt, David
2017-01-01
While there are best practices, good practices, regulations and standards for continuity planning, there is no single model to collate and sort their various recommended activities. To address this deficit, this paper presents the capability and constraint model of recoverability - a new model to provide an integrated foundation for business continuity planning. The model is non-linear in both construct and practice, thus allowing practitioners to remain adaptive in its application. The paper presents each facet of the model, outlines the model's use in both theory and practice, suggests a subsequent approach that arises from the model, and discusses some possible ramifications to the industry.
Integrated Analytic and Linearized Inverse Kinematics for Precise Full Body Interactions
NASA Astrophysics Data System (ADS)
Boulic, Ronan; Raunhardt, Daniel
Despite the great success of games grounded in movement-based interactions, the current state of full-body motion capture technologies still prevents the exploitation of precise interactions with complex environments. This paper focuses on ensuring a precise spatial correspondence between the user and the avatar. We build upon our past effort in human postural control with a Prioritized Inverse Kinematics framework. One of its key advantages is that it eases the dynamic combination of postural and collision-avoidance constraints. However, its reliance on a linearized approximation of the problem makes it vulnerable to the well-known full-extension singularity of the limbs. In such contexts the tracking performance is reduced and/or less believable intermediate postural solutions are produced. We address this issue by introducing a new type of analytic constraint that smoothly integrates within the Prioritized Inverse Kinematics framework. The paper first recalls the background of full-body 3D interactions and the advantages and drawbacks of the linearized IK solution. Then the Flexion-EXTension constraint (FLEXT for short) is introduced for the partial position control of limb-like articulated structures. Comparative results illustrate the benefits of this new type of integrated analytic and linearized IK control.
Vivek-Ananth, R P; Samal, Areejit
2016-09-01
A major goal of systems biology is to build predictive computational models of cellular metabolism. The availability of complete genome sequences and a wealth of legacy biochemical information have led to the reconstruction of genome-scale metabolic networks over the last 15 years for several organisms across the three domains of life. Due to the paucity of information on the kinetic parameters associated with metabolic reactions, the constraint-based modelling approach, flux balance analysis (FBA), has proved to be a vital alternative for investigating the capabilities of reconstructed metabolic networks. In parallel, the advent of high-throughput technologies has led to the generation of massive amounts of omics data on transcriptional regulation, comprising mRNA transcript levels and genome-wide binding profiles of transcriptional regulators. A frontier area in metabolic systems biology has been the development of methods to integrate the available transcriptional regulatory information into constraint-based models of reconstructed metabolic networks, in order to increase the predictive capabilities of computational models and understand the regulation of cellular metabolism. Here, we review the existing methods for integrating transcriptional regulatory information into constraint-based models of metabolic networks.
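A toy flux balance analysis, sketched under the usual FBA assumptions (steady state S·v = 0, flux bounds, maximize a biomass-like objective); the three-reaction network is invented for illustration:

```python
import numpy as np
from scipy.optimize import linprog

# Toy network with one metabolite A and three reactions:
#   r0: uptake -> A      r1: A -> biomass      r2: A -> byproduct
S = np.array([[1.0, -1.0, -1.0]])          # stoichiometric matrix (1 x 3)
bounds = [(0, 10), (0, None), (0, None)]   # uptake flux capped at 10

# linprog minimizes, so negate the biomass coefficient to maximize r1.
c = np.array([0.0, -1.0, 0.0])
res = linprog(c, A_eq=S, b_eq=np.zeros(1), bounds=bounds, method="highs")
print("optimal biomass flux:", res.x[1])   # 10.0: all uptake routed to biomass
```

Regulatory integration methods then act on this skeleton, for example by tightening the flux bounds of reactions whose genes are transcriptionally inactive.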
Integrated Attitude Control Strategy for the Asteroid Redirect Mission
NASA Technical Reports Server (NTRS)
Lopez, Pedro, Jr.; Price, Hoppy; San Martin, Miguel
2014-01-01
A deep-space mission has been proposed to redirect an asteroid to a distant retrograde orbit around the moon using a robotic vehicle, the Asteroid Redirect Vehicle (ARV). In this orbit, astronauts will rendezvous with the ARV using the Orion spacecraft. The integrated attitude control concept that Orion will use for approach and docking and for mated operations will be described. Details of the ARV's attitude control system and its associated constraints for redirecting the asteroid to the distant retrograde orbit around the moon will be provided. Once Orion is docked to the ARV, an overall description of the mated stack attitude during all phases of the mission will be presented using a coordinate system that was developed for this mission. Next, the thermal and power constraints of both the ARV and Orion will be discussed as well as how they are used to define the optimal integrated stack attitude. Lastly, the lighting and communications constraints necessary for the crew's extravehicular activity planned to retrieve samples from the asteroid will be examined. Similarly, the joint attitude control strategy that employs both the Orion and the ARV attitude control assets prior, during, and after each extravehicular activity will also be thoroughly discussed.
Integration deficiencies associated with continuous limb movement sequences in Parkinson's disease.
Park, Jin-Hoon; Stelmach, George E
2009-11-01
The present study examined the extent to which Parkinson's disease (PD) influences the integration of continuous limb movement sequences. Eight patients with idiopathic PD and 8 age-matched normal subjects were instructed to perform repetitive sequential aiming movements to specified targets under three accuracy constraints: 1) low accuracy (W = 7 cm) - minimal accuracy constraint, 2) high accuracy (W = 0.64 cm) - maximum accuracy constraint, and 3) mixed accuracy constraint - one target of high accuracy and another target of low accuracy. In both groups, the sequential movements were mostly cyclical in the low accuracy condition and discrete in the high accuracy condition. When the accuracy constraint was mixed, the sequential movements were executed by assembling discrete and cyclical movements in both groups, suggesting that the capability of PD patients to combine discrete and cyclical movements to meet a task requirement appears to be intact. However, this functional linkage was not as pronounced as it was in normal subjects. Close examination of movement in the mixed accuracy condition revealed marked movement hesitations in the vicinity of the large target in PD patients, resulting in a bias toward discrete movement. These results suggest that PD patients may have deficits in the ongoing planning and organizing processes during movement execution when a task requires assembling various accuracy requirements into more complex movement sequences.
ERIC Educational Resources Information Center
Huynh, Minh; Pinto, Ivan
2010-01-01
For years, there has been a need for teaching students about business process integration. The use of ERP systems has been proposed as a mechanism to meet this need. Yet, in the midst of a recent economic crisis, it is difficult to find funding for the acquisition and implementation of an ERP system for teaching purposes. While it is recognized…
Carver, Charles S
2005-01-01
A behavioral dimension of impulse versus constraint has long been observed by personality psychologists. This article begins by reviewing processes underlying this dimension from the perspectives of several personality theories. Some cases of constraint reflect inhibition due to anxiety, but some theories suggest other roots for constraint. Theories from developmental psychology accommodate both possibilities by positing 2 sorts of control over action. These modes of influence strongly resemble those predicated in some personality theories and also 2 modes of function that are asserted by some cognitive and social psychological theories. Several further literatures are considered, to which 2-mode models seem to contribute meaningfully. The article closes by addressing questions raised by these ideas, including whether the issue of impulse versus constraint applies to avoidance as well as to approach.
NASA Astrophysics Data System (ADS)
Fortugno, Diego; Zema, Demetrio Antonio; Bombino, Giuseppe; Tamburino, Vincenzo; Quinonero Rubio, Juan Manuel; Boix-Fayos, Carolina
2016-04-01
In Mediterranean semi-arid conditions, the geomorphic effects of land-use changes and check dam installation on active channel headwater morphology are not completely understood. In such environments, specific studies that monitor channel adjustments in response to reforestation and check dams over representative observation periods could help develop new management strategies and erosion control measures. This investigation is an integrated approach assessing the adjustments of channel morphology in a typical torrent (Sant'Agata, Calabria, Southern Italy) after land-use changes (e.g. fire, reforestation, land abandonment) and check dam construction over a period of about 60 years (1955-2012). A statistical analysis of historical rainfall records, an analysis of land-use change in the catchment area and geomorphological mapping of channel adjustments were carried out and combined with field surveys of bed surface grain size over a 5-km reach including 14 check dams. The analysis of the historical rainfall records showed a slight decrease in the amount and erosivity of precipitation. Mapping of land-use changes highlighted a general increase of vegetation cover on the slopes adjacent to the monitored reaches. Together with the installation of the check dam network, this increase could have induced a reduction in water and sediment supply. The different erosional and depositional forms and adjustments showed a general narrowing between consecutive check dams, together with local modifications detected upstream (bed aggradation and cross-section expansion, together with low-flow realignments) and downstream (local incision) of the installed check dams. Changes in the torrent bends were also detected as a response to erosional and depositional processes of different intensities. The study highlighted: (i) the efficiency of check dams against the disrupting power of the most intense floods, by stabilising the active channel; and (ii) the influence of reforestation in increasing hillslope protection from erosion and the disconnectivity of water and sediment flows towards the active channel. The residual sediment deficit circulating in the watershed suggests the need for modest management interventions, for instance the conversion of the existing check dams into open structures, allowing lasting channel and coast stability.
Integrating Sociological Practice into Traditional Sociology Courses.
ERIC Educational Resources Information Center
Basirico, Laurence A.
1990-01-01
Outlines a model of instruction that uses Marvin Olsen's reconceptualization of sociology as "sociological practice" to integrate sociological practice into traditional courses. States that this approach helps students gain a critical perspective and overcome personal and cultural ideological constraints in dealing with real issues…
Multiply scaled constrained nonlinear equation solvers. [for nonlinear heat conduction problems
NASA Technical Reports Server (NTRS)
Padovan, Joe; Krishna, Lala
1986-01-01
To improve the numerical stability of nonlinear equation solvers, a partitioned multiply scaled constraint scheme is developed. This scheme enables hierarchical levels of control for nonlinear equation solvers. To complement the procedure, partitioned convergence checks are established along with self-adaptive partitioning schemes. Overall, such procedures greatly enhance the numerical stability of the original solvers. To demonstrate and motivate the development of the scheme, the problem of nonlinear heat conduction is considered. In this context the main emphasis is given to successive substitution-type schemes. To verify the improved numerical characteristics associated with partitioned multiply scaled solvers, results are presented for several benchmark examples.
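A schematic of the idea (not the authors' exact scheme): successive substitution on a partitioned nonlinear system, with a separate convergence check and relaxation (scaling) factor per partition so each block can be controlled hierarchically.

```python
import numpy as np

def partitioned_successive_substitution(g_blocks, x0_blocks, omega,
                                        tol=1e-10, max_iter=500):
    """Blockwise fixed-point iteration x_i <- (1-w_i) x_i + w_i g_i(x).
    Each partition i has its own relaxation factor w_i and its own
    convergence check, giving hierarchical control over the solve."""
    x = [np.asarray(x0, dtype=float) for x0 in x0_blocks]
    for _ in range(max_iter):
        converged = True
        for i, g in enumerate(g_blocks):
            new = (1 - omega[i]) * x[i] + omega[i] * g(x)
            if np.linalg.norm(new - x[i]) > tol * (1 + np.linalg.norm(new)):
                converged = False
            x[i] = new
        if converged:
            break
    return x

# Two coupled scalar equations: x = cos(y), y = 0.5*sin(x)
sol = partitioned_successive_substitution(
    [lambda x: np.cos(x[1]), lambda x: 0.5 * np.sin(x[0])],
    [np.array([1.0]), np.array([0.0])], omega=[0.8, 0.8])
print([float(v[0]) for v in sol])   # approx [0.92, 0.40]
```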
LOI/SOHO constraints on oblique rotation of the solar core
NASA Astrophysics Data System (ADS)
Gizon, L.; Appourchaux, T.; Gough, D. O.
The Sun is usually assumed to rotate about a single axis, tilted with respect to the ecliptic normal by an angle of 7.25 degrees. Although we have an excellent knowledge of the direction of the rotation axis of the photospheric layers, we cannot exclude a priori that the direction of the rotation axis could vary as a function of radius. We have tried to check whether the assumption of rotation about a unique axis is consistent with helioseismic data. We report on an attempt to measure the directions of the pulsation axes of several low-degree modes of oscillation in the LOI/SOHO Fourier spectra.
NASA Technical Reports Server (NTRS)
Mock, W. D.; Latham, R. A.
1982-01-01
The NASTRAN model plan for the fairing structure was expanded in detail to generate the NASTRAN model of this substructure. The grid point coordinates, element definitions, material properties, and sizing data for each element were specified. The fairing model was thoroughly checked out for continuity, connectivity, and constraints. The substructure was processed for structural influence coefficients (SIC) point loadings to determine the deflection characteristics of the fairing model. Finally, a demonstration and validation processing of this substructure was accomplished using the NASTRAN finite element program. The bulk data deck, stiffness matrices, and SIC output data were delivered.
Integral approximations to classical diffusion and smoothed particle hydrodynamics
Du, Qiang; Lehoucq, R. B.; Tartakovsky, A. M.
2014-12-31
The contribution of the paper is the approximation of a classical diffusion operator by an integral equation with a volume constraint. A particular focus is on classical diffusion problems associated with Neumann boundary conditions. By exploiting this approximation, we can also approximate other quantities such as the flux out of a domain. Our analysis of the model equation on the continuum level is closely related to recent work on nonlocal diffusion and peridynamic mechanics. In particular, we elucidate the role of a volumetric constraint as an approximation to a classical Neumann boundary condition in the presence of a physical boundary. The volume-constrained integral equation then provides the basis for accurate and robust discretization methods. As a result, an immediate application is to the understanding and improvement of the Smoothed Particle Hydrodynamics (SPH) method.
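A sketch of the kind of operator discussed, using a generic 1D discretization rather than the authors' formulation: the integral operator averages differences over a finite horizon δ, and the boundary condition is imposed on a volume of thickness δ rather than at a point.

```python
import numpy as np

def nonlocal_laplacian(u, dx, horizon):
    """Discrete nonlocal diffusion operator
        L u(x_i) ~ c * sum_{0 < |j-i| <= m} (u_j - u_i) * dx
    over a horizon of m = horizon/dx neighbours; with c = 3/horizon^3
    it recovers the classical u'' as the horizon shrinks (1D scaling)."""
    m = int(round(horizon / dx))
    c = 3.0 / horizon**3
    Lu = np.zeros(len(u))
    for i in range(m, len(u) - m):       # interior nodes; the outer strips
        for j in range(i - m, i + m + 1):  # of width m carry the volume constraint
            if j != i:
                w = 0.5 if abs(j - i) == m else 1.0   # trapezoidal weights
                Lu[i] += c * w * (u[j] - u[i]) * dx
    return Lu

x = np.linspace(0, 1, 201)
u = x**2                                  # u'' = 2 everywhere
Lu = nonlocal_laplacian(u, dx=x[1] - x[0], horizon=0.05)
print(Lu[100])                            # close to 2 in the interior
```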
Report: Plans to Migrate Data to the New EPA Acquisition System Need Improvement
Report #10-P-0071, February 24, 2010. EPA’s plans for migrating data from ICMS to EAS lack sufficient incorporation of data integrity and quality checks to ensure the complete and accurate transfer of procurement data.
STS-114: Discovery L-2 Countdown Status Briefing
NASA Technical Reports Server (NTRS)
2005-01-01
George Diller of NASA Public Affairs hosted this briefing. Pete Nickolenko, NASA Test Director; Scott Higgenbotham, STS-114 Payload Mission Manager; and Cathy Winters, Shuttle Weather Officer, were present. Pete reported that his team had completed the avionics system checkups, that servicing of the cryogenic tanks would take about seven hours that day, and that engine system checks and pad closeouts would be performed in the evening. Pete also summarized other standard closeout activities: checks of the Orbiter and ground communications network, rotating service structure retraction, and external tank load (ETL). Pete reported that the mission would last 12 days plus two weather contingency days, with the end-of-mission landing scheduled at Kennedy Space Center (KSC) at approximately 11:00 a.m. Eastern time on July 25th. Scott briefly reported that all hardware was on board Discovery, closed out, and ready to fly. Cathy reported that hurricane Dennis had moved to the north, which was favorable for launch. She mentioned that a new hurricane was looming, to be named Emily, noted some crosswinds that would migrate to the west, and gave a 30% probability of weather prohibiting launch. Cathy then presented the current weather forecast, supported with charts: the launch forecast, tanking forecast, SRB (Shuttle Solid Rocket Booster) forecast, and CONUS and TAL launch site forecasts, along with the 24-hour and 48-hour turnaround plans. Launch constraints, weather, crosswinds, cloud cover, the ground imagery system, the launch countdown, launch crews, mission management simulations, and launch team simulations were topics covered with the news media.
An Event Driven Hybrid Identity Management Approach to Privacy Enhanced e-Health
Sánchez-Guerrero, Rosa; Almenárez, Florina; Díaz-Sánchez, Daniel; Marín, Andrés; Arias, Patricia; Sanvido, Fabio
2012-01-01
Credential-based authorization offers interesting advantages for ubiquitous scenarios involving limited devices such as sensors and personal mobile equipment: verification can be done locally; it has a lower computational cost than its competitors for issuing, storing, and verification; and it naturally supports rights delegation. The main drawback is the revocation of rights. Revocation requires handling potentially large revocation lists, or using protocols to check the revocation status, bringing extra communication costs not acceptable for sensors and other limited devices. Moreover, effective revocation of consent—considered a privacy rule in sensitive scenarios—has not been fully addressed. This paper proposes an event-based mechanism empowering a new concept, the sleepyhead credentials, which allows time constraints and explicit revocation to be replaced by activating and deactivating authorization rights according to events. Our approach is to integrate this concept into IdM systems in a hybrid model supporting delegation, which can be an interesting alternative for scenarios where revocation of consent and user privacy are critical. The delegation includes a SAML-compliant protocol, which we have validated through a proof-of-concept implementation. This article also explains the mathematical model describing the event-based approach and offers estimations of the overhead introduced by the system. The paper focuses on health care scenarios, where we show the flexibility of the proposed event-based user consent revocation mechanism. PMID:22778634
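A schematic of the event-driven activation idea, with all event and right names invented for illustration: rights are toggled by events rather than expiring on a clock or appearing on a revocation list.

```python
class SleepyheadCredential:
    """Credential whose rights are activated and deactivated by events
    instead of time constraints or explicit revocation lists."""

    def __init__(self, holder, rights):
        self.holder = holder
        self.rights = set(rights)
        self.active = set()            # rights currently usable

    def on_event(self, event):
        # Hypothetical policy: an admission event wakes the rights up,
        # a discharge or consent-withdrawal event puts them to sleep.
        if event == "patient_admitted":
            self.active |= self.rights
        elif event in ("patient_discharged", "consent_withdrawn"):
            self.active.clear()

    def authorize(self, right):
        return right in self.active

cred = SleepyheadCredential("dr_smith", {"read_ehr", "prescribe"})
print(cred.authorize("read_ehr"))      # False: not yet activated
cred.on_event("patient_admitted")
print(cred.authorize("read_ehr"))      # True
cred.on_event("consent_withdrawn")
print(cred.authorize("read_ehr"))      # False: consent revoked by event
```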
An inventory of bispectrum estimators for redshift space distortions
NASA Astrophysics Data System (ADS)
Regan, Donough
2017-12-01
In order to best improve constraints on cosmological parameters and on models of modified gravity using current and future galaxy surveys it is necessary maximally exploit the available data. As redshift-space distortions mean statistical translation invariance is broken for galaxy observations, this will require measurement of the monopole, quadrupole and hexadecapole of not just the galaxy power spectrum, but also the galaxy bispectrum. A recent (2015) paper by Scoccimarro demonstrated how the standard bispectrum estimator may be expressed in terms of Fast Fourier Transforms (FFTs) to afford an extremely efficient algorithm, allowing the bispectrum multipoles on all scales and triangle shapes to be measured in comparable time to those of the power spectrum. In this paper we present a suite of alternative proxies to measure the three-point correlation multipoles. In particular, we describe a modal (or plane wave) decomposition to capture the information in each multipole in a series of basis coefficients, and also describe three compressed estimators formed using the skew-spectrum, the line correlation function and the integrated bispectrum, respectively. As well as each of the estimators offering a different measurement channel, and thereby a robustness check, it is expected that some (especially the modal estimator) will offer a vast data compression, and so a much reduced covariance matrix. This compression may be vital to reduce the computational load involved in extracting the available three-point information.
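The FFT trick is easiest to see in one dimension (this is a generic illustration, not Scoccimarro's redshift-space estimator): the bispectrum B(k1, k2) = ⟨F(k1) F(k2) F*(k1+k2)⟩ can be accumulated for whole bins at once by inverse-transforming shell-restricted Fourier fields and multiplying them in configuration space.

```python
import numpy as np

def binned_bispectrum(delta, k_bins):
    """Estimate the (1D, unnormalized) bispectrum for every triangle of
    k-bins by the FFT method: inverse-transform each shell-restricted
    field and sum the triple product over configuration space."""
    n = len(delta)
    dk = np.fft.fft(delta)
    k = np.abs(np.fft.fftfreq(n, d=1.0 / n))   # integer wavenumbers
    shells = []
    for lo, hi in k_bins:
        mask = (k >= lo) & (k < hi)
        shells.append(np.fft.ifft(dk * mask))  # field restricted to the shell
    B = np.zeros((len(k_bins),) * 3)
    for a, Fa in enumerate(shells):
        for b, Fb in enumerate(shells):
            for c, Fc in enumerate(shells):
                # sum_x Fa(x) Fb(x) Fc(x) collects all closed triangles
                # k1 + k2 + k3 = 0 with |k_i| inside the chosen bins
                B[a, b, c] = np.real(np.sum(Fa * Fb * Fc))
    return B

rng = np.random.default_rng(0)
delta = rng.standard_normal(256)
print(binned_bispectrum(delta, [(1, 9), (9, 17)]).shape)  # (2, 2, 2)
```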
Massive Cloud-Based Big Data Processing for Ocean Sensor Networks and Remote Sensing
NASA Astrophysics Data System (ADS)
Schwehr, K. D.
2017-12-01
Until recently, the work required to integrate and analyze data for global-scale environmental issues was prohibitive in both cost and availability. Traditional desktop processing systems are not able to effectively store and process all the data, and supercomputer solutions are financially out of the reach of most people. The availability of large-scale cloud computing has created tools that are usable by small groups and individuals regardless of financial or locally available computational resources. These systems give scientists and policymakers the ability to see how critical resources are being used across the globe with little or no barrier to entry. Google Earth Engine has the Moderate Resolution Imaging Spectroradiometer (MODIS) Terra, MODIS Aqua, and Global Land Data Assimilation System (GLDAS) data catalogs available live online. Here we use these data to calculate the correlation between lagged chlorophyll and rainfall to identify areas of eutrophication, matching these events to ocean currents from datasets such as the HYbrid Coordinate Ocean Model (HYCOM) to check whether they are constrained by the oceanographic configuration. The system can provide additional ground truth with observations from sensor networks such as the International Comprehensive Ocean-Atmosphere Data Set / Voluntary Observing Ship (ICOADS/VOS) and Argo floats. This presentation is intended to introduce users to the datasets, programming idioms, and functionality of Earth Engine for large-scale, data-driven oceanography.
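Per pixel, the lagged chlorophyll-rainfall correlation is just a Pearson correlation between one time series and a shifted copy of the other; a minimal sketch with synthetic series (Earth Engine performs the analogous computation per pixel across image collections):

```python
import numpy as np

def lagged_corr(rain, chl, lag):
    """Pearson correlation between rainfall and chlorophyll `lag` time
    steps later; a strong positive value at lag > 0 is consistent with
    runoff-driven eutrophication."""
    r, c = rain[:len(rain) - lag], chl[lag:]
    return np.corrcoef(r, c)[0, 1]

rng = np.random.default_rng(42)
rain = rng.gamma(2.0, 1.0, 120)                    # monthly rainfall proxy
chl = 0.6 * np.roll(rain, 2) + 0.4 * rng.standard_normal(120)
best = max(range(6), key=lambda l: lagged_corr(rain, chl, l))
print(best)                                        # 2: the injected lag
```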
Space Shuttle Day-of-Launch Trajectory Design and Verification
NASA Technical Reports Server (NTRS)
Harrington, Brian E.
2010-01-01
A top priority of any launch vehicle is to insert as much mass into the desired orbit as possible. This requirement must be traded against vehicle capability in terms of dynamic control, thermal constraints, and structural margins. The vehicle is certified to a specific structural envelope, which will yield certain performance characteristics of mass to orbit. Some envelopes cannot be certified generically and must be checked with each mission design. The most sensitive envelopes require an assessment on the day of launch. To further minimize vehicle loads while maximizing vehicle performance, a day-of-launch trajectory can be designed. This design is optimized according to that day's wind and atmospheric conditions, which increases the probability of launch. The day-of-launch trajectory verification is critical to the vehicle's safety. The Day-Of-Launch I-Load Uplink (DOLILU) is the process by which the Space Shuttle Program redesigns the vehicle steering commands to fit that day's environmental conditions and then rigorously verifies the integrated vehicle trajectory's loads, controls, and performance. The Shuttle methodology is very similar to that of other United States unmanned launch vehicles. By extension, this method would be similar to the methods employed for any future NASA launch vehicles. This presentation will provide an overview of the Shuttle's day-of-launch trajectory optimization and verification as an example of a more generic application of day-of-launch design and validation.
NASA Astrophysics Data System (ADS)
Pezzi, M.; Favaro, M.; Gregori, D.; Ricci, P. P.; Sapunenko, V.
2014-06-01
In large computing centers, such as the INFN CNAF Tier1 [1], it is essential to be able to configure all the machines, depending on use, in an automated way. For several years the Tier1 has used Quattor [2], a server provisioning tool, which is currently used in production. Nevertheless, we have recently started a comparison study involving other tools able to provide specific server installation and configuration features and also offer a properly customizable solution as an alternative to Quattor. Our choice at the moment fell on the integration of two tools: Cobbler [3] for the installation phase and Puppet [4] for the server provisioning and management operations. The tool should provide the following properties in order to replicate and gradually improve the current system features: implement a system check for storage-specific constraints, such as a kernel module blacklist at boot time to avoid undesired SAN (Storage Area Network) access during disk partitioning; a simple and effective mechanism for kernel upgrade and downgrade; the ability to set the package provider using yum, rpm or apt; easy-to-use virtual machine installation support, including bonding and specific Ethernet configuration; and scalability for managing thousands of nodes and parallel installations. This paper describes the results of the comparison and the tests carried out to verify the requirements and the new system's suitability in the INFN-T1 environment.
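As a rough illustration of the first requirement, a provisioning-time check that the required kernel modules are blacklisted might look like the following Python sketch (the module names are hypothetical placeholders, not the CNAF configuration):

```python
import glob
import re

# Hypothetical Fibre Channel driver names that must stay off at boot to
# prevent SAN access during disk partitioning.
REQUIRED_BLACKLIST = {"qla2xxx", "lpfc"}

def blacklisted_modules(conf_dir="/etc/modprobe.d"):
    """Collect module names blacklisted in modprobe configuration files."""
    found = set()
    for path in glob.glob(f"{conf_dir}/*.conf"):
        with open(path) as fh:
            for line in fh:
                m = re.match(r"\s*blacklist\s+(\S+)", line)
                if m:
                    found.add(m.group(1))
    return found

missing = REQUIRED_BLACKLIST - blacklisted_modules()
if missing:
    raise SystemExit(f"SAN safety check failed, not blacklisted: {sorted(missing)}")
```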
Hard and Soft Constraints in Reliability-Based Design Optimization
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Giesy, Daniel P.; Kenny, Sean P.
2006-01-01
This paper proposes a framework for the analysis and design optimization of models subject to parametric uncertainty where design requirements in the form of inequality constraints are present. Emphasis is given to uncertainty models prescribed by norm bounded perturbations from a nominal parameter value and by sets of componentwise bounded uncertain variables. These models, which often arise in engineering problems, allow for sharp mathematical manipulation. Constraints can be implemented in the hard sense, i.e., constraints must be satisfied for all parameter realizations in the uncertainty model, and in the soft sense, i.e., constraints can be violated by some realizations of the uncertain parameter. In regard to hard constraints, this methodology allows one (i) to determine if a hard constraint can be satisfied for a given uncertainty model and constraint structure, (ii) to generate conclusive, formally verifiable reliability assessments that allow for unprejudiced comparisons of competing design alternatives, and (iii) to identify the critical combination of uncertain parameters leading to constraint violations. In regard to soft constraints, the methodology allows the designer (i) to use probabilistic uncertainty models, (ii) to calculate upper bounds to the probability of constraint violation, and (iii) to efficiently estimate failure probabilities via a hybrid method. This method integrates the upper bounds, for which closed form expressions are derived, along with conditional sampling. In addition, an l∞ formulation for the efficient manipulation of hyper-rectangular sets is also proposed.
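A minimal sketch of the two readings of a constraint, assuming a hypothetical requirement g(p) <= 0 over a hyper-rectangular uncertainty set; plain Monte Carlo sampling stands in here for the paper's analytic bounds and hybrid method:

```python
import numpy as np

def g(p1, p2):
    """Hypothetical inequality requirement g(p) <= 0."""
    return p1**2 + 0.5 * p2 - 1.0

# Componentwise bounded uncertain variables: p in the box [lo, hi].
lo, hi = np.array([-0.5, -1.0]), np.array([0.9, 1.0])

rng = np.random.default_rng(1)
samples = rng.uniform(lo, hi, size=(100_000, 2))
violations = g(samples[:, 0], samples[:, 1]) > 0

# Soft-constraint reading: estimated probability of violating the requirement.
p_fail = violations.mean()

# Hard-constraint reading: the requirement must hold for every realization.
print(f"P(violation) ~ {p_fail:.4f}; hard constraint holds: {not violations.any()}")
```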
NASA Technical Reports Server (NTRS)
Pendergrass, J. R.; Walsh, R. L.
1975-01-01
An examination of the factors which modify the simulation of a constraint in the motion of the aft attach points of the orbiter and external tank during separation has been made. The factors considered were both internal (spring and damper constants) and external (friction coefficient and dynamic pressure). The results show that an acceptable choice of spring/damper constant combinations exists over the expected range of the external factors and that the choice is consistent with a practical integration interval. The constraint model is shown to produce about a 10 percent increase in the relative body pitch angles over the unconstrained case, whereas the MDC-STL constraint model is shown to produce about a 38 percent increase.
Deep Neural Networks for Speech Separation With Application to Robust Speech Recognition
… acoustic-phonetic features. The second objective is integration of spectrotemporal context for improved separation performance. Conditional random fields … will be used to encode contextual constraints. The third objective is to achieve robust ASR in the DNN framework through integrated acoustic modeling …
Fully integrated aerodynamic/dynamic optimization of helicopter rotor blades
NASA Technical Reports Server (NTRS)
Walsh, Joanne L.; Lamarsh, William J., II; Adelman, Howard M.
1992-01-01
This paper describes a fully integrated aerodynamic/dynamic optimization procedure for helicopter rotor blades. The procedure combines performance and dynamics analyses with a general purpose optimizer. The procedure minimizes a linear combination of power required (in hover, forward flight, and maneuver) and vibratory hub shear. The design variables include pretwist, taper initiation, taper ratio, root chord, blade stiffnesses, tuning masses, and tuning mass locations. Aerodynamic constraints consist of limits on power required in hover, forward flight and maneuver; airfoil section stall; drag divergence Mach number; minimum tip chord; and trim. Dynamic constraints are on frequencies, minimum autorotational inertia, and maximum blade weight. The procedure is demonstrated for two cases. In the first case the objective function involves power required (in hover, forward flight, and maneuver) and dynamics. The second case involves only hover power and dynamics. The designs from the integrated procedure are compared with designs from a sequential optimization approach in which the blade is first optimized for performance and then for dynamics. In both cases, the integrated approach is superior.
Acceleration constraints in modeling and control of nonholonomic systems
NASA Astrophysics Data System (ADS)
Bajodah, Abdulrahman H.
2003-10-01
Acceleration constraints are used to enhance modeling techniques for dynamical systems. In particular, Kane's equations of motion subjected to bilateral constraints, unilateral constraints, and servo-constraints are modified by utilizing acceleration constraints for the purpose of simplifying the equations and increasing their applicability. The tangential properties of Kane's method provide relationships between the holonomic and the nonholonomic partial velocities, and hence allow one to describe nonholonomic generalized active and inertia forces in terms of their holonomic counterparts, i.e., those which correspond to the system without constraints. Therefore, based on the modeling process objectives, the holonomic and the nonholonomic vector entities in Kane's approach are used interchangeably to model holonomic and nonholonomic systems. When the holonomic partial velocities are used to model nonholonomic systems, the resulting models are full-order (also called nonminimal or unreduced) and separated in accelerations. As a consequence, they are readily integrable and can be used for generic system analysis. Other related topics are constraint forces, numerical stability of the nonminimal equations of motion, and numerical constraint stabilization. Two types of unilateral constraints considered are impulsive and friction constraints. Impulsive constraints are modeled by means of continuous-in-velocities and impulse-momentum approaches. In controlled motion, the acceleration form of constraints is utilized with the Moore-Penrose generalized inverse of the corresponding constraint matrix to solve for the inverse dynamics of servo-constraints, and for the redundancy resolution of overactuated manipulators. If control variables are involved in the algebraic constraint equations, then these tools are used to modify the controlled equations of motion in order to facilitate control system design. An illustrative example of spacecraft stabilization is presented.
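Schematically, the acceleration form of the constraints and the role of the Moore-Penrose inverse can be written as follows (our notation; a sketch rather than the full Kane's-equations development):

```latex
% Acceleration form of the constraints and its Moore-Penrose solution (schematic):
% A is the constraint matrix, b collects velocity-dependent terms.
\[
  A(q,\dot q,t)\,\ddot q \;=\; b(q,\dot q,t),
  \qquad
  \ddot q \;=\; A^{+}\,b \;+\; \bigl(I - A^{+}A\bigr)\,w .
\]
% Here A^{+} is the Moore-Penrose generalized inverse: A^{+} b is the
% minimum-norm acceleration consistent with the constraints, while the
% null-space term (I - A^{+}A) w carries the remaining freedom exploited
% for servo-constraint inverse dynamics and for redundancy resolution of
% overactuated manipulators.
```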
James Webb Space Telescope: Frequently Asked Questions for Scientists and Engineers
NASA Technical Reports Server (NTRS)
Gardner, Jonathan P.
2008-01-01
JWST will be tested incrementally during its construction, starting with individual mirrors and instruments (including cameras and spectrometers) and building up to the full observatory. JWST's mirrors and the telescope structure are first each tested individually, including optical testing of the mirrors and alignment testing of the structure inside a cold thermal-vacuum chamber. The mirrors are then installed on the telescope structure in a clean room at Goddard Space Flight Center (GSFC). In parallel to the telescope assembly and alignment, the instruments are being built and tested, again first individually, and then as part of an integrated instrument assembly. The integrated instrument assembly will be tested in a thermal-vacuum chamber at GSFC using an optical simulator of the telescope. This testing makes sure the instruments are properly aligned relative to each other and also provides an independent check of the individual tests. After both the telescope and the integrated instrument module are successfully assembled, the integrated instrument module will be installed onto the telescope, and the combined system will be sent to Johnson Space Center (JSC), where it will be optically tested in one of the JSC chambers. The process includes testing the 18 primary mirror segments acting as a single primary mirror, and testing the end-to-end system. The final system test will assure that the combined telescope and instruments are focused and aligned properly, and that the alignment, once in space, will be within the range of the actively controlled optics. In general, the individual optical tests of instruments and mirrors are the most accurate. The final system tests provide a cost-effective check that no major problem has occurred during assembly. In addition, independent optical checks of earlier tests will be made as the full system is assembled, providing confidence that there are no major problems.
A programming system for research and applications in structural optimization
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, J.; Rogers, J. L., Jr.
1981-01-01
The flexibility necessary for such diverse utilizations is achieved by combining, in a modular manner, a state-of-the-art optimization program, a production level structural analysis program, and user supplied and problem dependent interface programs. Standard utility capabilities in modern computer operating systems are used to integrate these programs. This approach results in flexibility of the optimization procedure organization and versatility in the formulation of constraints and design variables. Features shown in numerical examples include: variability of structural layout and overall shape geometry, static strength and stiffness constraints, local buckling failure, and vibration constraints.
Integrated optimization of nonlinear R/C frames with reliability constraints
NASA Technical Reports Server (NTRS)
Soeiro, Alfredo; Hoit, Marc
1989-01-01
A structural optimization algorithm was developed that includes global displacements as decision variables. The algorithm was applied to planar reinforced concrete frames with nonlinear material behavior submitted to static loading. The flexural performance of the elements was evaluated as a function of the actual stress-strain diagrams of the materials. Formation of rotational hinges with strain hardening was allowed, and the equilibrium constraints were updated accordingly. The adequacy of the frames was guaranteed by imposing as constraints required reliability indices for the members, maximum global displacements for the structure, and a maximum system probability of failure.
Integrated Heat Switch/Oxide Sorption Compressor
NASA Technical Reports Server (NTRS)
Bard, Steven
1989-01-01
Thermally-driven, nonmechanical compressor uses container filled with compressed praseodymium cerium oxide powder (PrCeOx) to provide high-pressure flow of oxygen gas for driving closed-cycle Joule-Thomson-expansion refrigeration unit. Integrated heat switch/oxide sorption compressor has no moving parts except check valves, which control flow of oxygen gas between compressor and closed-cycle Joule-Thomson refrigeration system. Oxygen expelled from sorbent at high pressure by evacuating heat-switch gap and turning on heater.
ERIC Educational Resources Information Center
Mbach, Florence; Oboka, Wycliffe; Simiyu, Ruth; Wakhungu, Jacob
2016-01-01
Education was identified as the critical means of achieving behaviour change in and out of the classroom in order to prevent and mitigate the spread of HIV and AIDS among the youth. This study sought to investigate the constraints during HIV and AIDS curriculum implementation; the study was guided by social cognitive approach theories, survey and…
Staggered solution procedures for multibody dynamics simulation
NASA Technical Reports Server (NTRS)
Park, K. C.; Chiou, J. C.; Downer, J. D.
1990-01-01
The numerical solution procedure for multibody dynamics (MBD) systems is termed a staggered MBD solution procedure that solves the generalized coordinates in a separate module from that for the constraint force. This requires a reformulation of the constraint conditions so that the constraint forces can also be integrated in time. A major advantage of such a partitioned solution procedure is that additional analysis capabilities such as active controller and design optimization modules can be easily interfaced without embedding them into a monolithic program. After introducing the basic equations of motion for MBD systems in the second section, Section 3 briefly reviews some constraint handling techniques and introduces the staggered stabilized technique for the solution of the constraint forces as independent variables. The numerical direct time integration of the equations of motion is described in Section 4. As accurate damping treatment is important for the dynamics of space structures, we have employed the central difference method and the mid-point form of the trapezoidal rule since they engender no numerical damping. This is in contrast to the current practice in dynamic simulations of ground vehicles, which employs a set of backward difference formulas. First, the equations of motion are partitioned according to the translational and the rotational coordinates. This sets the stage for an efficient treatment of the rotational motions via the singularity-free Euler parameters. The resulting partitioned equations of motion are then integrated via a two-stage explicit stabilized algorithm for updating both the translational coordinates and angular velocities. Once the angular velocities are obtained, the angular orientations are updated via the mid-point implicit formula employing the Euler parameters. When the two algorithms, namely, the two-stage explicit algorithm for the generalized coordinates and the implicit staggered procedure for the constraint Lagrange multipliers, are brought together in a staggered manner, they constitute a staggered explicit-implicit procedure which is summarized in Section 5. Section 6 presents some example problems, and discussions concerning several salient features of the staggered MBD solution procedure are offered in Section 7.
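Stripped to a toy problem, the staggered idea is to solve for the constraint force first, then advance the coordinates explicitly. A minimal Python sketch for a particle on a circular constraint follows (our simplification; not the paper's structural algorithm with Euler parameters and stabilization):

```python
import numpy as np

def step(q, v, h, m=1.0, f=np.array([0.0, -9.81])):
    """One staggered step for a point mass on the circle |q| = const:
    solve the Lagrange multiplier from the acceleration-level constraint,
    then advance coordinates with an explicit update."""
    G = q                                    # Jacobian of phi = (|q|^2 - r^2)/2
    # Acceleration-level constraint G·a = -|v|^2 gives the multiplier first.
    lam = (-v @ v - G @ (f / m)) / (G @ G / m)
    a = (f + lam * G) / m                    # m*a = f + lambda * G^T
    v_new = v + h * a
    q_new = q + h * v_new
    return q_new, v_new

q, v = np.array([1.0, 0.0]), np.array([0.0, 0.0])
for _ in range(1000):
    q, v = step(q, v, 1e-3)
print(np.hypot(*q))   # constraint drift from radius 1 stays small over the run
```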
Rational first integrals of geodesic equations and generalised hidden symmetries
NASA Astrophysics Data System (ADS)
Aoki, Arata; Houri, Tsuyoshi; Tomoda, Kentaro
2016-10-01
We discuss novel generalisations of Killing tensors, which are introduced by considering rational first integrals of geodesic equations. We introduce the notion of inconstructible generalised Killing tensors, which cannot be constructed from ordinary Killing tensors. Moreover, we introduce inconstructible rational first integrals, which are constructed from inconstructible generalised Killing tensors, and provide a method for checking the inconstructibility of a rational first integral. Using the method, we show that the rational first integral of the Collinson-O’Donnell solution is not inconstructible. We also provide several examples of metrics admitting an inconstructible rational first integral in two and four dimensions, by using the Maciejewski-Przybylska system. Furthermore, we attempt to generalise other hidden symmetries such as Killing-Yano tensors.
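Schematically, a rational first integral of the geodesic flow takes the following form (our notation; the coefficient tensors are what generalise ordinary Killing tensors):

```latex
% Rational first integral of the geodesic flow with Hamiltonian
% H = (1/2) g^{ab} p_a p_b (schematic, our notation):
\[
  F(x,p) \;=\;
  \frac{A^{a_1 \cdots a_p}(x)\, p_{a_1} \cdots p_{a_p}}
       {B^{b_1 \cdots b_q}(x)\, p_{b_1} \cdots p_{b_q}},
  \qquad \{F, H\} = 0 .
\]
% For a polynomial first integral, {F,H} = 0 forces the coefficient tensor to
% be an ordinary Killing tensor, \nabla^{(a} K^{a_1 \cdots a_p)} = 0; for a
% ratio, the numerator and denominator tensors need only satisfy weaker,
% coupled conditions, which the generalised Killing tensors above capture.
```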
The Proposal of the Model for Developing Dispatch System for Nationwide One-Day Integrative Planning
NASA Astrophysics Data System (ADS)
Kim, Hyun Soo; Choi, Hyung Rim; Park, Byung Kwon; Jung, Jae Un; Lee, Jin Wook
The problems of dispatch planning for container trucks are classified as pickup and delivery problems, which are highly complex and involve various real-world constraints. At present, however, planning is handled by the control system, so an automated planning system is required from the viewpoint of nationwide integrative planning. Therefore, the purpose of this study is to suggest a model for developing an automated dispatch system based on the constraint satisfaction problem and a meta-heuristic algorithm. In further work, a practical system will be developed and evaluated with respect to various results. This study proposes a model that addresses the increased complexity of the problem by considering various constraints that were not considered in earlier studies. It is also suggested that a real-time monitoring function for vehicles and cargo, based on information technology, should be added.
NASA Astrophysics Data System (ADS)
Arfawi Kurdhi, Nughthoh; Adi Diwiryo, Toray; Sutanto
2016-02-01
This paper presents an integrated single-vendor two-buyer production-inventory model with stochastic demand and service level constraints. Shortages are permitted in the model and are partially backordered, with the remainder treated as lost sales. The lead time demand is assumed to follow a normal distribution, and the lead time can be reduced by adding crashing cost. The lead time and ordering cost reductions are interdependent, with a logarithmic function relationship. A service level constraint policy corresponding to each buyer is considered in the model in order to limit the level of inventory shortages. The purpose of this research is to minimize the joint total cost of the inventory model by finding the optimal order quantity, safety stock, lead time, and the number of lots delivered in one production run. The optimal production-inventory policy obtained by the Lagrange method is shaped to account for the service level restrictions. Finally, a numerical example and effects of the key parameters are presented to illustrate the results of the proposed model.
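In its simplest normal-demand form, the service-level mechanism in such models reduces to choosing a safety factor from a normal quantile. A small sketch with illustrative numbers (our notation, not the paper's full joint optimization over lot size, lead time and shipments):

```python
import math
from scipy.stats import norm

# Illustrative parameters: annual demand rate, demand std per unit time,
# lead time in years, and target cycle service level.
D, sigma, L, SL = 600.0, 7.0, 0.25, 0.95

# With lead-time demand ~ Normal(D*L, sigma*sqrt(L)), setting the reorder
# point r = D*L + k*sigma*sqrt(L) caps the per-cycle stockout probability.
k = norm.ppf(SL)                      # safety factor from the normal quantile
safety_stock = k * sigma * math.sqrt(L)
reorder_point = D * L + safety_stock
print(f"k = {k:.3f}, safety stock = {safety_stock:.2f}, r = {reorder_point:.2f}")
```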
Beyond mechanistic interaction: value-based constraints on meaning in language.
Rączaszek-Leonardi, Joanna; Nomikou, Iris
2015-01-01
According to situated, embodied, and distributed approaches to cognition, language is a crucial means for structuring social interactions. Recent approaches that emphasize this coordinative function treat language as a system of replicable constraints on individual and interactive dynamics. In this paper, we argue that the integration of the replicable-constraints approach to language with the ecological view on values allows for a deeper insight into processes of meaning creation in interaction. Such a synthesis of these frameworks draws attention to important sources of structuring interactions beyond the sheer efficiency of a collective system in its current task situation. Most importantly, the workings of linguistic constraints will be shown as embedded in more general fields of values, which are realized on multiple timescales. Because the ontogenetic timescale offers a convenient window into the emergence of linguistic constraints, we present illustrations of concrete mechanisms through which values may become embodied in language use in development.
Emotional Intelligence in Secondary Education Students in Multicultural Contexts
ERIC Educational Resources Information Center
Pegalajar-Palomino, Ma. del Carmen; Colmenero-Ruiz, Ma. Jesus
2014-01-01
Introduction: The study analyzes the level of development in emotional intelligence of Secondary Education students. It also checks for statistically significant differences in educational level between Spanish and immigrant students, under the integration program "Intercultural Open Classrooms". Method: 94 students of Secondary…
Thinking inside the (lock)box: using banking technology to improve the revenue cycle.
D'Eramo, Michael; Umbreit, Lynda
2005-08-01
An integrated, image-based lockbox solution has allowed Columbus, Ohio-based MaternOhio to automate payment posting, reconcilement, and billing; store check and remittance images electronically; and automatically update its in-house patient accounting system and medical records.
Wang, Shiyao; Deng, Zhidong; Yin, Gang
2016-01-01
A high-performance differential global positioning system (GPS) receiver with real time kinematics provides absolute localization for driverless cars. However, it is not only susceptible to multipath effects but also unable to effectively fulfill precise error correction over a wide range of driving areas. This paper proposes an accurate GPS–inertial measurement unit (IMU)/dead reckoning (DR) data fusion method based on a set of predictive models and occupancy grid constraints. First, we employ a set of autoregressive and moving average (ARMA) equations that have different structural parameters to build maximum likelihood models of raw navigation. Second, both grid constraints and spatial consensus checks on all predictive results and current measurements are required to remove outliers. Navigation data that satisfy a stationary stochastic process are further fused to achieve accurate localization results. Third, the standard deviation of multimodal data fusion can be pre-specified by grid size. Finally, we performed extensive field tests on a variety of real urban scenarios. The experimental results demonstrate that the method can significantly smooth small jumps in bias and considerably reduce accumulated position errors due to DR. With low computational complexity, the position accuracy of our method surpasses existing state-of-the-art methods on the same dataset, and the new data fusion method is practically applied in our driverless car. PMID:26927108
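For one navigation channel, the predictive-model filtering step can be approximated by an ARMA one-step forecast with a residual gate. A simplified stand-in for the paper's bank of models, on synthetic data:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

def is_outlier(history, measurement, order=(1, 1, 1), n_sigma=3.0):
    """Flag a GPS fix as an outlier when it disagrees with an ARMA-style
    one-step forecast by more than n_sigma residual standard deviations.
    A single-channel simplification of the paper's predictive models."""
    fit = ARIMA(history, order=order).fit()
    forecast = fit.forecast(steps=1)[0]
    resid_std = np.std(fit.resid)
    return abs(measurement - forecast) > n_sigma * resid_std

rng = np.random.default_rng(2)
east = np.cumsum(rng.normal(0.5, 0.05, 200))   # smooth easting track (metres)
print(is_outlier(east, east[-1] + 0.55))       # plausible next fix -> False
print(is_outlier(east, east[-1] + 8.0))        # multipath-style jump -> True
```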
KiDS-450: the tomographic weak lensing power spectrum and constraints on cosmological parameters
NASA Astrophysics Data System (ADS)
Köhlinger, F.; Viola, M.; Joachimi, B.; Hoekstra, H.; van Uitert, E.; Hildebrandt, H.; Choi, A.; Erben, T.; Heymans, C.; Joudaki, S.; Klaes, D.; Kuijken, K.; Merten, J.; Miller, L.; Schneider, P.; Valentijn, E. A.
2017-11-01
We present measurements of the weak gravitational lensing shear power spectrum based on 450 deg^2 of imaging data from the Kilo Degree Survey. We employ a quadratic estimator in two and three redshift bins and extract band powers of redshift autocorrelation and cross-correlation spectra in the multipole range 76 ≤ ℓ ≤ 1310. The cosmological interpretation of the measured shear power spectra is performed in a Bayesian framework assuming a ΛCDM model with spatially flat geometry, while accounting for small residual uncertainties in the shear calibration and redshift distributions as well as marginalizing over intrinsic alignments, baryon feedback and an excess-noise power model. Moreover, massive neutrinos are included in the modelling. The cosmological main result is expressed in terms of the parameter combination S_8 ≡ σ_8 √(Ω_m/0.3), yielding S_8 = 0.651 ± 0.058 (three z-bins), confirming the recently reported tension in this parameter with constraints from Planck at 3.2σ (three z-bins). We cross-check the results of the three z-bin analysis with the weaker constraints from the two z-bin analysis and find them to be consistent. The high-level data products of this analysis, such as the band power measurements, covariance matrices, redshift distributions and likelihood evaluation chains, are available at http://kids.strw.leidenuniv.nl.
Integrated design of the CSI evolutionary structure: A verification of the design methodology
NASA Technical Reports Server (NTRS)
Maghami, Peiman G.; Joshi, S. M.; Elliott, Kenny B.; Walz, J. E.
1993-01-01
One of the main objectives of the Controls-Structures Interaction (CSI) program is to develop and evaluate integrated controls-structures design methodology for flexible space structures. Thus far, integrated design methodologies for a class of flexible spacecraft, which require fine attitude pointing and vibration suppression with no payload articulation, have been extensively investigated. Various integrated design optimization approaches, such as single-objective optimization, and multi-objective optimization, have been implemented with an array of different objectives and constraints involving performance and cost measures such as total mass, actuator mass, steady-state pointing performance, transient performance, control power, and many more. These studies have been performed using an integrated design software tool (CSI-DESIGN CODE) which is under development by the CSI-ADM team at the NASA Langley Research Center. To date, all of these studies, irrespective of the type of integrated optimization posed or objectives and constraints used, have indicated that integrated controls-structures design results in an overall spacecraft design which is considerably superior to designs obtained through a conventional sequential approach. Consequently, it is believed that validation of some of these results through fabrication and testing of a structure which is designed through an integrated design approach is warranted. The objective of this paper is to present and discuss the efforts that have been taken thus far for the validation of the integrated design methodology.
The Integrated Farm System Model: A Tool for Whole Farm Nutrient Management Analysis
USDA-ARS?s Scientific Manuscript database
With tighter profit margins and increasing environmental constraints, strategic planning of farm production systems is becoming both more important and more difficult. This is especially true for integrated crop and animal production systems. Animal production is complex with a number of interacting...
Automated Planning and Scheduling for Space Mission Operations
NASA Technical Reports Server (NTRS)
Chien, Steve; Jonsson, Ari; Knight, Russell
2005-01-01
Research Trends: a) Finite-capacity scheduling under more complex constraints and increased problem dimensionality (subcontracting, overtime, lot splitting, inventory, etc.). b) Integrated planning and scheduling. c) Mixed-initiative frameworks. d) Management of uncertainty (proactive and reactive). e) Autonomous agent architectures and distributed production management. f) Integration of machine learning capabilities. g) Wider scope of applications: 1) analysis of supplier/buyer protocols & tradeoffs; 2) integration of strategic & tactical decision-making; and 3) enterprise integration.
V-band integrated quadriphase modulator
NASA Technical Reports Server (NTRS)
Grote, A.; Chang, K.
1983-01-01
A V-band integrated circuit quadriphase shift keyed modulator/exciter for space communications systems was developed. Intersatellite communications systems require direct modulation at 60 GHz to enhance signal processing capability. For most systems, particularly space applications, small and lightweight components are essential to alleviate severe system design constraints. Thus to achieve wideband, high data rate systems, direct modulation techniques at millimeter waves using solid state integrated circuit technology are an integral part of the overall technology developments.
Revisiting the Procedures for the Vector Data Quality Assurance in Practice
NASA Astrophysics Data System (ADS)
Erdoğan, M.; Torun, A.; Boyacı, D.
2012-07-01
Immense use of topographical data in spatial data visualization, business GIS (Geographic Information Systems) solutions and applications, and mobile and location-based services has forced topo-data providers to create standard, up-to-date and complete data sets in a sustainable frame. Data quality has been studied and researched for more than two decades. There have been countless references on its semantics, its conceptual and logical representations, and many applications on spatial databases and GIS. However, there is a gap between research and practice in the sense of spatial data quality, which increases the costs and decreases the efficiency of data production. Spatial data quality is well known to academia and industry, but usually in different contexts. The research on spatial data quality has stated several issues of practical use, such as descriptive information, metadata, fulfillment of spatial relationships among data, integrity measures, geometric constraints, etc. The industry and data producers realize them in three stages: pre-, co-, and post-data capturing. The pre-data capturing stage covers semantic modelling, data definition, cataloguing, modelling, data dictionary and schema creation processes. The co-data capturing stage covers general rules of spatial relationships, data- and model-specific rules such as topologic and model building relationships, geometric thresholds, data extraction guidelines, and object-object, object-belonging class, object-non-belonging class, and class-class relationships to be taken into account during data capturing. The post-data capturing stage covers specified QC (quality check) benchmarks and checking compliance with general and specific rules. The vector data quality criteria differ between the views of producers and users, but these criteria are generally driven by the needs, expectations and feedback of the users. This paper presents a practical method which closes the gap between theory and practice. Turning spatial data quality concepts into developments and applications requires the existence of a conceptual, logical and, most importantly, physical data model, rules, and knowledge of realization in the form of geo-spatial data. The applicable metrics and thresholds are determined on this concrete base. This study discusses the application of geo-spatial data quality issues and QA (quality assurance) and QC procedures in topographic data production. First we introduce the MGCP (Multinational Geospatial Co-production Program) data profile of the NATO (North Atlantic Treaty Organization) DFDD (DGIWG Feature Data Dictionary), the requirements of the data owner, and the view of data producers for both data capturing and QC, and finally QA to fulfil user needs. Then our practical new approach, which divides quality into three phases, is introduced. Finally, the implementation of our approach to accomplish the metrics, measures and thresholds of the quality definitions is discussed. In this paper, especially the geometric and semantic quality, and the quality control procedures that can be performed by the producers, are discussed. Some applicable best practices that we experienced, on techniques of quality control and regulations that define the objectives and data production procedures, are given in the final remarks.
These quality control procedures should include visual checks of the source data, captured vector data and printouts, some automatic checks that can be performed by software, and some semi-automatic checks performed in interaction with quality control personnel. Finally, these quality control procedures should ensure the geometric, semantic, attribution and metadata quality of the vector data.
Yang, Liang; Ge, Meng; Jin, Di; He, Dongxiao; Fu, Huazhu; Wang, Jing; Cao, Xiaochun
2017-01-01
Due to the demand for performance improvement and the existence of prior information, semi-supervised community detection with pairwise constraints has become a hot topic. Most existing methods successfully encode the must-link constraints, but neglect the opposite ones, i.e., the cannot-link constraints, which can force the exclusion between nodes. In this paper, we are interested in understanding the role of cannot-link constraints and effectively encoding pairwise constraints. Towards these goals, we define an integral generative process jointly considering the network topology, must-link and cannot-link constraints. We propose to characterize this process as a Multi-variance Mixed Gaussian Generative (MMGG) Model to address the diverse degrees of confidence that exist in network topology and pairwise constraints, and formulate it as a weighted nonnegative matrix factorization problem. The experiments on artificial and real-world networks not only illustrate the superiority of our proposed MMGG, but also, most importantly, reveal the roles of pairwise constraints. That is, though the must-link is more important than the cannot-link when either of them is available, both must-link and cannot-link are equally important when both of them are available. To the best of our knowledge, this is the first work on discovering and exploring the importance of cannot-link constraints in semi-supervised community detection.
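A much-simplified analogue of constraint encoding (not the authors' MMGG model) is to fold must-link and cannot-link pairs into the weighted network and then factorize it; a small symmetric-NMF sketch:

```python
import numpy as np

def constrained_nmf_communities(A, must, cannot, k, alpha=1.0, iters=300, seed=0):
    """Simplified constraint-encoded community detection: must-link pairs
    reinforce edges, cannot-link pairs suppress them, then symmetric NMF
    W ~ H H^T yields soft community memberships."""
    W = A.astype(float).copy()
    for i, j in must:
        W[i, j] = W[j, i] = W[i, j] + alpha
    for i, j in cannot:
        W[i, j] = W[j, i] = 0.0
    rng = np.random.default_rng(seed)
    H = rng.random((W.shape[0], k)) + 1e-3
    for _ in range(iters):                    # multiplicative updates keep H >= 0
        H *= (W @ H) / np.maximum(H @ (H.T @ H), 1e-12)
    return H.argmax(axis=1)                   # hard assignment per node

# Two noisy 4-node blocks; the cannot-link severs node 3's tie to the first block.
A = np.array([[0,1,1,1,0,0,0,0],
              [1,0,1,0,0,0,0,0],
              [1,1,0,0,0,0,0,0],
              [1,0,0,0,1,1,0,0],
              [0,0,0,1,0,1,1,0],
              [0,0,0,1,1,0,1,1],
              [0,0,0,0,1,1,0,1],
              [0,0,0,0,0,1,1,0]])
print(constrained_nmf_communities(A, must=[(4, 7)], cannot=[(0, 3)], k=2))
# Expected: two groups {0,1,2} and {3..7}; the label ids themselves are arbitrary.
```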
STS-98 U.S. Lab Destiny is moved out of Atlantis' payload bay
NASA Technical Reports Server (NTRS)
2001-01-01
KENNEDY SPACE CENTER, Fla. -- Workers in the Payload Changeout Room check the U.S. Lab Destiny as it moves from Atlantis' payload bay into the PCR. Destiny will remain in the PCR while Atlantis rolls back to the Vehicle Assembly Building to allow workers to conduct inspections, continuity checks and X-ray analysis on the 36 solid rocket booster cables located inside each booster's system tunnel. An extensive evaluation of NASA's SRB cable inventory revealed conductor damage in four (of about 200) cables on the shelf. Shuttle managers decided to prove the integrity of the system tunnel cables already on Atlantis.
The Z1 truss is moved to check weight and balance
NASA Technical Reports Server (NTRS)
2000-01-01
In the Space Station Processing Facility, workers watch as the Integrated Truss Structure Z1, an element of the International Space Station, is moved to another stand to check its weight and balance. The Z1 truss is the first of 10 trusses that will become the backbone of the Space Station, eventually stretching the length of a football field. Along with its companion payload, the third Pressurized Mating Adapter, the Z1 is scheduled to be launched aboard Space Shuttle Discovery Oct. 5 at 9:38 p.m. EDT. The launch will be the 100th in the Shuttle program.
CSRQ: Communication-Efficient Secure Range Queries in Two-Tiered Sensor Networks
Dai, Hua; Ye, Qingqun; Yang, Geng; Xu, Jia; He, Ruiliang
2016-01-01
In recent years, we have seen many applications of secure query in two-tiered wireless sensor networks. Storage nodes are responsible for storing data from nearby sensor nodes and answering queries from Sink. It is critical to protect data security from a compromised storage node. In this paper, the Communication-efficient Secure Range Query (CSRQ)—a privacy and integrity preserving range query protocol—is proposed to prevent attackers from gaining information of both data collected by sensor nodes and queries issued by Sink. To preserve privacy and integrity, in addition to employing the encoding mechanisms, a novel data structure called encrypted constraint chain is proposed, which embeds the information of integrity verification. Sink can use this encrypted constraint chain to verify the query result. The performance evaluation shows that CSRQ has lower communication cost than the current range query protocols. PMID:26907293
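The chaining idea can be sketched as follows: adjacent sorted readings are bound together with an authentication tag, so the Sink can detect any dropped in-range value. A schematic Python sketch (real CSRQ additionally encrypts the values and uses its own encoding; the key and bounds here are hypothetical):

```python
import hmac, hashlib

KEY = b"sensor-node-shared-key"    # shared between sensor node and Sink (hypothetical)

def tag(lo, hi):
    """Authentication tag binding two adjacent sorted readings into one chain link."""
    return hmac.new(KEY, f"{lo}|{hi}".encode(), hashlib.sha256).hexdigest()

def build_chain(readings, lo_bound=-10**9, hi_bound=10**9):
    """Chain sorted readings between boundary sentinels so omissions are detectable."""
    vals = [lo_bound] + sorted(readings) + [hi_bound]
    return [(vals[i], vals[i + 1], tag(vals[i], vals[i + 1])) for i in range(len(vals) - 1)]

def verify(links):
    """Sink-side integrity check: every link must carry a valid tag and the links
    must join end-to-end, so no in-range reading can be silently dropped."""
    ok_tags = all(hmac.compare_digest(t, tag(lo, hi)) for lo, hi, t in links)
    joined = all(links[i][1] == links[i + 1][0] for i in range(len(links) - 1))
    return ok_tags and joined

chain = build_chain([17, 4, 9])
print(verify(chain), verify(chain[:1] + chain[2:]))   # True, False (a link was dropped)
```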
Kinematics of velocity and vorticity correlations in turbulent flow
NASA Technical Reports Server (NTRS)
Bernard, P. S.
1983-01-01
The kinematic problem of calculating second-order velocity moments from given values of the vorticity covariance is examined. Integral representation formulas for second-order velocity moments in terms of the two-point vorticity correlation tensor are derived. The special relationships existing between velocity moments in isotropic turbulence are expressed in terms of the integral formulas yielding several kinematic constraints on the two-point vorticity correlation tensor in isotropic turbulence. Numerical evaluation of these constraints suggests that a Gaussian curve may be the only form of the longitudinal velocity correlation coefficient which is consistent with the requirement of isotropy. It is shown that if this is the case, then a family of exact solutions to the decay of isotropic turbulence may be obtained which contains Batchelor's final period solution as a special case. In addition, the computed results suggest a method of approximating the integral representation formulas in general turbulent shear flows.
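For reference, the standard kinematic relation for incompressible isotropic turbulence and the Gaussian form singled out by the analysis can be written as follows (schematic; λ is our notation for a longitudinal correlation length):

```latex
% Transverse correlation g(r) in terms of the longitudinal one f(r) for
% incompressible isotropic turbulence (the von Karman continuity relation):
\[
  g(r) \;=\; f(r) + \frac{r}{2}\,\frac{\partial f}{\partial r},
\]
% and the Gaussian longitudinal form consistent with the isotropy constraints:
\[
  f(r) \;=\; \exp\!\left(-\,\frac{r^{2}}{2\lambda^{2}}\right).
\]
```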
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nanstad, Randy K; Sokolov, Mikhail A; Merkle, John Graham
2007-01-01
To enable determination of the fracture toughness reference temperature, T0, with reactor pressure vessel surveillance specimens, the precracked Charpy (PCVN) three-point bend, SE(B), specimen is of interest. Compared with the 25-mm (1 in.) thick compact, 1TC(T), specimen, tests with the PCVN specimen (10x10x55 mm) have resulted in T0 temperatures as much as 40 °C lower (a so-called specimen bias effect). The Heavy-Section Steel Irradiation (HSSI) Program at Oak Ridge National Laboratory developed a two-part project to evaluate the C(T) versus PCVN differences: (1) calibration experiments concentrating on test practices, and (2) a matrix of transition range tests with various specimen geometries and sizes, including 1T SE(B) and 1TC(T). The test material selected was a plate of A533 grade B class 1 steel. The calibration experiments included assessment of the computational validity of J-integral determinations, while the constraint characteristics of various specimen types and sizes were evaluated using key curves and notch strength determinations. The results indicate that J-integral solutions for the small PCVN specimen are comparable in terms of J-integral validity with 1T bend specimens. Regarding constraint evaluations, Phase I deformation is defined where plastic deformation is confined to crack tip plastic zone development, whereas Phase II deformation is defined where plastic hinging deformation develops. In Phase II deformation, the 0.5T SE(B) B × B specimen (slightly larger than the PCVN specimen) consistently showed the highest constraint of all SE(B) specimens evaluated for constraint comparisons. The PCVN specimen begins the Phase II type of deformation at relatively low KR levels, with the result that KJc values above about 70 MPa√m from precracked Charpy specimens are under extensive plastic hinging deformation.
Goswami, Anjali; Randau, Marcela; Polly, P David; Weisbecker, Vera; Bennett, C Verity; Hautier, Lionel; Sánchez-Villagra, Marcelo R
2016-09-01
Developmental constraints can have significant influence on the magnitude and direction of evolutionary change, and many studies have demonstrated that these effects are manifested on macroevolutionary scales. Phenotypic integration, or the strong interactions among traits, has been similarly invoked as a major influence on morphological variation, and many studies have demonstrated that trait integration changes through ontogeny, in many cases decreasing with age. Here, we unify these perspectives in a case study of the ontogeny of the mammalian cranium, focusing on a comparison between marsupials and placentals. Marsupials are born at an extremely altricial state, requiring, in most cases, the use of the forelimbs to climb to the pouch, and, in all cases, an extended period of continuous suckling, during which most of their development occurs. Previous work has shown that marsupials are less disparate in adult cranial form than are placentals, particularly in the oral apparatus, and in forelimb ontogeny and adult morphology, presumably due to functional selection pressures on these two systems during early postnatal development. Using phenotypic trajectory analysis to quantify prenatal and early postnatal cranial ontogeny in 10 species of therian mammals, we demonstrate that this pattern of limited variation is also apparent in the development of the oral apparatus of marsupials, relative to placentals, but not in the skull more generally. Combined with the observation that marsupials show extremely high integration of the oral apparatus in early postnatal ontogeny, while other cranial regions show similar levels of integration to that observed in placentals, we suggest that high integration may compound the effects of the functional constraints for continuous suckling to ultimately limit the ontogenetic and adult disparity of the marsupial oral apparatus throughout their evolutionary history.
NASA Technical Reports Server (NTRS)
Baxter, G. I.
1976-01-01
Contoured-stiffened 63 by 337 inch 2124 aluminum alloy panels are machined in-the-flat to make integral, tapered T-capped stringers, parallel with longitudinal centerline. Aging fixture, which includes net contour formers made from lofted contour templates, has eggcrate-like structure for use in forming and checking panels.
NASA Astrophysics Data System (ADS)
Vinogradov, Vasiliy Y.; Morozov, Oleg G.; Nureev, Ilnur I.; Kuznetzov, Artem A.
2015-03-01
In this paper we consider an integrated approach to developing aero-acoustic methods for diagnostics of aircraft gas-turbine engine flow-through passages, using passive fiber-optic and location technologies as the base.
Confused about Fusion? Weed Your Science Collection with a Pro.
ERIC Educational Resources Information Center
O'Dell, Charli
1998-01-01
Provides guidelines on weeding science collections in junior high/high school libraries. Highlights include checking copyright dates, online sources, 13 science subject areas that deserve special consideration (plate tectonics, fission, fusion, radioactive dating, weather/climate, astronomy/space science, elements, integrated science,…
Membrane oxygenator heat exchanger failure detected by unique blood gas findings.
Hawkins, Justin L
2014-03-01
Failure of components integrated into the cardiopulmonary bypass circuit, although rare, can bring about catastrophic results. One of these components is the heat exchanger of the membrane oxygenator. In this compartment, unsterile water from the heater-cooler device is separated from the sterile blood by stainless steel, aluminum, or polyurethane. These areas are glued or welded to keep the two compartments separate, maintaining the sterility of the blood. Although quality control testing is performed by the manufacturer at the factory level, transport presents a real possibility of damage. Because of this, each manufacturer has included in the instructions for use a procedure for testing the integrity of the heat exchanger component. Water is circulated through the heat exchanger before priming and a visual check of the oxygenator bundle is made for leaks. If none are apparent, then priming of the oxygenator is performed. In this particular case, this procedure was not useful in detecting communication between the water and blood chambers of the oxygenator.
NASA Technical Reports Server (NTRS)
Epp, L. W.; Stanton, P. H.
1993-01-01
In order to add the capability of an X-band uplink onto the 70-m antenna, a new dichroic plate is needed to replace the Pyle-guide-shaped dichroic plate currently in use. The replacement dichroic plate must exhibit an additional passband at the new uplink frequency of 7.165 GHz, while still maintaining a passband at the existing downlink frequency of 8.425 GHz. Because of the wide frequency separation of these two passbands, conventional methods of designing air-filled dichroic plates exhibit grating lobe problems. A new method of solving this problem by using a dichroic plate with cross-shaped holes is presented and verified experimentally. Two checks of the integral equation solution are described. One is the comparison to a modal analysis for the limiting cross shape of a square hole. As a final check, a prototype dichroic plate with cross-shaped holes was built and measured.
Comparison of Integrated Testlet and Constructed-Response Question Formats
ERIC Educational Resources Information Center
Slepkov, Aaron D.; Shiell, Ralph C.
2014-01-01
Constructed-response (CR) questions are a mainstay of introductory physics textbooks and exams. However, because of the time, cost, and scoring reliability constraints associated with this format, CR questions are being increasingly replaced by multiple-choice (MC) questions in formal exams. The integrated testlet (IT) is a recently developed…
Gestalt Therapy and Feminist Therapy: A Proposed Integration.
ERIC Educational Resources Information Center
Enns, Carolyn Zerbe
1987-01-01
Offers a proposal for integrating the Gestalt goals of self-responsibility with a feminist perspective that places value on the web of relationships in women's lives and focuses attention on the environmental constraints and socialization that affect women's choices. Discusses Gestalt techniques for enhancing women's growth and examines…
Constraint-Driven Software Design: An Escape from the Waterfall Model.
ERIC Educational Resources Information Center
de Hoog, Robert; And Others
1994-01-01
Presents the principles of a development methodology for software design based on a nonlinear, product-driven approach that integrates quality aspects. Two examples are given to show that the flexibility needed for building high quality systems leads to integrated development environments in which methodology, product, and tools are closely…
NASA Astrophysics Data System (ADS)
Zolnierczyk, Joanna Asia
The integration of mathematics and science in secondary schools in the 21st century continues to be an important topic of practice and research. The purpose of my research study, which builds on studies by Frykholm and Glasson (2005) and Berlin and White (2010), is to explore the potential constraints and benefits of integrating mathematics and science in Ontario secondary schools based on the perspectives of in-service and pre-service teachers with various math and/or science backgrounds. A qualitative and quantitative research design with an exploratory approach was used. The qualitative data were collected from a sample of 12 in-service teachers with various math and/or science backgrounds recruited from two school boards in Eastern Ontario. The quantitative and some qualitative data were collected from a sample of 81 pre-service teachers from the Queen's University Bachelor of Education (B.Ed) program. Semi-structured interviews were conducted with the in-service teachers, while a survey and a focus group were conducted with the pre-service teachers. Once the data were collected, the qualitative data were abductively analyzed. For the quantitative data, descriptive and inferential statistics (one-way ANOVAs and Pearson Chi Square analyses) were calculated to examine the perspectives of teachers regardless of teaching background and to compare groups of teachers based on teaching background. The findings of this study suggest that in-service and pre-service teachers have a positive attitude towards the integration of math and science and view it as valuable to student learning and success. The pre-service teachers viewed the integration as easy and did not express concerns about it. On the other hand, the in-service teachers highlighted concerns and challenges such as resources, scheduling, and time constraints. My results illustrate when teachers perceive it is valuable to integrate math and science and which aspects of the classroom benefit most from the integration. Furthermore, the results highlight barriers and possible solutions to improve the integration of math and science. In addition to the benefits and constraints of integration, my results illustrate why some teachers may opt out of integrating math and science and the different strategies teachers have incorporated to integrate math and science in their classrooms.
Gluing Ladder Feynman Diagrams into Fishnets
Basso, Benjamin; Dixon, Lance J.
2017-08-14
We use integrability at weak coupling to compute fishnet diagrams for four-point correlation functions in planar Φ^4 theory. Our results are always multilinear combinations of ladder integrals, which are in turn built out of classical polylogarithms. The Steinmann relations provide a powerful constraint on such linear combinations, which leads to a natural conjecture for any fishnet diagram as the determinant of a matrix of ladder integrals.
A Dynamic Non Energy Storing Guidance Constraint with Motion Redirection for Robot Assisted Surgery
2016-12-01
Haptically enabled hands-on or tele-operated surgical robotic systems provide a unique opportunity to integrate pre- and intra… robot-assisted surgical systems aim at improving and extending human capabilities, by exploiting the advantages of robotic systems while keeping the… move during the operation. Robot-assisted beating heart surgery is an example of procedures that can benefit from dynamic constraints. Their…
1984-12-01
only four transistors [5]. Each year since that time, the semiconductor industry has consistently improved the quality of the fabrication techniques… rarely took place at universities and was almost exclusively confined to industry. IC design techniques were developed, tested, and taught only in the… community, it is not uncommon for industry to borrow ideas and even particular programs from these university-designed tools. The Very Large Scale Integration…
AWARE@HOME: PROFITABLY INTEGRATING CONSERVATION INTO THE AMERICAN HOME
While American households are the most resource consuming in the world, they are unlikely to become more efficient users of public utilities because of: 1) large time delays between utility use and the receipt of utility bills; 2) the inconvenience of personally checking and ...
Integrative Lifecourse and Genetic Analysis of Military Working Dogs
2012-10-01
Recognition), ICR (Intelligent Character Recognition) and HWR (Handwriting Recognition). A number of software packages were evaluated and we have… the third-party software is able to recognize check-boxes and columns and do a reasonable job with handwriting – which it does. This workflow will…
Managing Mission-Critical Infrastructure
ERIC Educational Resources Information Center
Breeding, Marshall
2012-01-01
In the library context, libraries depend on sophisticated business applications specifically designed to support their work. This infrastructure consists of such components as integrated library systems, their associated online catalogs or discovery services, and self-check equipment, as well as a Web site and the various online tools and services…
Read, S J; Vanman, E J; Miller, L C
1997-01-01
We argue that recent work in connectionist modeling, in particular the parallel constraint satisfaction processes that are central to many of these models, has great importance for understanding issues of both historical and current concern for social psychologists. We first provide a brief description of connectionist modeling, with particular emphasis on parallel constraint satisfaction processes. Second, we examine the tremendous similarities between parallel constraint satisfaction processes and the Gestalt principles that were the foundation for much of modern social psychology. We propose that parallel constraint satisfaction processes provide a computational implementation of the principles of Gestalt psychology that were central to the work of such seminal social psychologists as Asch, Festinger, Heider, and Lewin. Third, we then describe how parallel constraint satisfaction processes have been applied to three areas that were key to the beginnings of modern social psychology and remain central today: impression formation and causal reasoning, cognitive consistency (balance and cognitive dissonance), and goal-directed behavior. We conclude by discussing implications of parallel constraint satisfaction principles for a number of broader issues in social psychology, such as the dynamics of social thought and the integration of social information within the narrow time frame of social interaction.
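A minimal sketch of parallel constraint satisfaction, in the Hopfield-style relaxation form that such models resemble (toy weights and traits of our own devising, not any published model's parameters):

```python
import numpy as np

def settle(weights, clamp, iters=50):
    """Parallel constraint satisfaction: units repeatedly take on the activation
    that best satisfies the weighted constraints from their neighbours."""
    a = clamp.copy()
    free = np.isnan(clamp)                 # NaN marks units left free to settle
    a[free] = 0.0
    for _ in range(iters):
        net = weights @ a                  # summed constraint input to each unit
        a[free] = np.tanh(net[free])       # graded activations in [-1, 1]
    return a

# Toy impression formation: traits 0 and 1 support each other; trait 2 conflicts.
W = np.array([[ 0.0,  0.8, -0.6],
              [ 0.8,  0.0, -0.6],
              [-0.6, -0.6,  0.0]])
evidence = np.array([1.0, np.nan, np.nan])  # observe trait 0; infer the rest
print(settle(W, evidence))                  # trait 1 settles positive, trait 2 negative
```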
On the BRST Quantization of the Massless Bosonic Particle in Twistor-Like Formulation
NASA Astrophysics Data System (ADS)
Bandos, Igor; Maznytsia, Alexey; Rudychev, Igor; Sorokin, Dmitri
We study some features of bosonic-particle path-integral quantization in a twistor-like approach by the use of the BRST-BFV-quantization prescription. In the course of the Hamiltonian analysis we observe links between various formulations of the twistor-like particle by performing a conversion of the Hamiltonian constraints of one formulation to another. A particular feature of the conversion procedure applied to turn the second-class constraints into first-class constraints is that the simplest Lorentz-covariant way to do this is to convert a full mixed set of the initial first- and second-class constraints rather than explicitly extracting and converting only the second-class constraints. Another novel feature of the conversion procedure applied below is that in the case of the D = 4 and D = 6 twistor-like particle the number of new auxiliary Lorentz-covariant coordinates, which one introduces to get a system of first-class constraints in an extended phase space, exceeds the number of independent second-class constraints of the original dynamical system. We calculate the twistor-like particle propagator in D = 3,4,6 space-time dimensions and show that it coincides with that of a conventional massless bosonic particle.
Why don't zebras have machine guns? Adaptation, selection, and constraints in evolutionary theory.
Shanahan, Timothy
2008-03-01
In an influential paper, Stephen Jay Gould and Richard Lewontin (1979) contrasted selection-driven adaptation with phylogenetic, architectural, and developmental constraints as distinct causes of phenotypic evolution. In subsequent publications Gould (e.g., 1997a,b, 2002) has elaborated this distinction into one between a narrow "Darwinian Fundamentalist" emphasis on "external functionalist" processes, and a more inclusive "pluralist" emphasis on "internal structuralist" principles. Although theoretical integration of functionalist and structuralist explanations is the ultimate aim, natural selection and internal constraints are treated as distinct causes of evolutionary change. This distinction is now routinely taken for granted in the literature in evolutionary biology. I argue that this distinction is problematic because the effects attributed to non-selective constraints are more parsimoniously explained as the ordinary effects of selection itself. Although it may still be a useful shorthand to speak of phylogenetic, architectural, and developmental constraints on phenotypic evolution, it is important to understand that such "constraints" do not constitute an alternative set of causes of evolutionary change. The result of this analysis is a clearer understanding of the relationship between adaptation, selection and constraints as explanatory concepts in evolutionary theory.
From r-spin intersection numbers to Hodge integrals
NASA Astrophysics Data System (ADS)
Ding, Xiang-Mao; Li, Yuping; Meng, Lingxian
2016-01-01
Generalized Kontsevich Matrix Model (GKMM) with a certain given potential is the partition function of r-spin intersection numbers. We represent this GKMM in terms of fermions, expand it in terms of the Schur polynomials by the boson-fermion correspondence, and link it with a Hurwitz partition function and a Hodge partition function by operators in a $\widehat{GL}(\infty)$ group. Then, from a $W_{1+\infty}$ constraint on the partition function of r-spin intersection numbers, we get a $W_{1+\infty}$ constraint for the Hodge partition function. The $W_{1+\infty}$ constraint completely determines the Schur polynomial expansion of the Hodge partition function.
Model-based control strategies for systems with constraints of the program type
NASA Astrophysics Data System (ADS)
Jarzębowska, Elżbieta
2006-08-01
The paper presents a model-based tracking control strategy for constrained mechanical systems. The constraints we consider can be material or non-material ones, the latter referred to as program constraints. The program constraint equations represent tasks put upon system motions; they can be differential equations of order higher than one or two, and they can be non-integrable. The tracking control strategy relies upon two dynamic models: a reference model, which is a dynamic model of a system with arbitrary-order differential constraints, and a dynamic control model. The reference model serves as a motion planner, which generates inputs to the dynamic control model. It is based upon the generalized program motion equations (GPME) method. The method makes it possible to combine material and program constraints and merge them both into the motion equations. Lagrange's equations with multipliers are a special case of the GPME, since they apply only to systems with first-order constraints. Our tracking strategy, referred to as a model reference program motion tracking control strategy, enables tracking of any program motion predefined by the program constraints. It extends "trajectory tracking" to "program motion tracking". We also demonstrate that our tracking strategy can be extended to hybrid program motion/force tracking.
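As a loose illustration of the two-model idea (not the authors' GPME formulation), the sketch below integrates a hypothetical third-order program constraint as the reference motion planner for a one-degree-of-freedom plant, which a computed-torque PD law then tracks; the constraint and all gains are invented for the example.

```python
# A minimal sketch of the reference-model / control-model split, under
# strong simplifying assumptions (1-DOF unit-mass plant, hypothetical
# gains). The "program constraint" is the third-order condition
# dddq_r = -q_r, integrated forward to generate the reference motion.
dt, T = 1e-3, 5.0

q_r, dq_r, ddq_r = 1.0, 0.0, 0.0   # reference model state
q, dq = 0.0, 0.0                   # plant state: ddq = u
kp, kd = 100.0, 20.0               # hypothetical tracking gains

for _ in range(int(T / dt)):
    dddq_r = -q_r                  # the program constraint itself
    ddq_r += dt * dddq_r           # integrate the reference model
    dq_r  += dt * ddq_r
    q_r   += dt * dq_r
    u = ddq_r + kp * (q_r - q) + kd * (dq_r - dq)   # computed-torque PD
    dq += dt * u                   # integrate the controlled plant
    q  += dt * dq

print(f"tracking error at t={T}: {q_r - q:.2e}")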
NASA Technical Reports Server (NTRS)
Mock, W. D.; Latham, R. A.
1982-01-01
The NASTRAN model plan for the wing structure was expanded in detail to generate the NASTRAN model for this substructure. The grid point coordinates were coded for each element. The material properties and sizing data for each element were specified. The wing substructure model was thoroughly checked out for continuity, connectivity, and constraints. This substructure was processed for structural influence coefficients (SIC) point loadings and the deflections were compared to those computed for the aircraft detail model. Finally, a demonstration and validation processing of this substructure was accomplished using the NASTRAN finite element program. The bulk data deck, stiffness matrices, and SIC output data were delivered.
NASA Technical Reports Server (NTRS)
Mock, W. D.; Latham, R. A.; Tisher, E. D.
1982-01-01
The NASTRAN model plans for the horizontal stabilizer, vertical stabilizer, and nacelle structure were expanded in detail to generate the NASTRAN model for each of these substructures. The grid point coordinates were coded for each element. The material properties and sizing data for each element were specified. Each substructure model was thoroughly checked out for continuity, connectivity, and constraints. These substructures were processed for structural influence coefficients (SIC) point loadings and the deflections were compared to those computed for the aircraft detail models. Finally, a demonstration and validation processing of these substructures was accomplished using the NASTRAN finite element program installed at NASA/DFRC facility.
NASA Technical Reports Server (NTRS)
Mock, W. D.; Latham, R. A.
1982-01-01
The NASTRAN model plan for the fuselage structure was expanded in detail to generate the NASTRAN model for this substructure. The grid point coordinates were coded for each element. The material properties and sizing data for each element were specified. The fuselage substructure model was thoroughly checked out for continuity, connectivity, and constraints. This substructure was processed for structural influence coefficients (SIC) point loadings and the deflections were compared to those computed for the aircraft detail model. Finally, a demonstration and validation processing of this substructure was accomplished using the NASTRAN finite element program. The bulk data deck, stiffness matrices, and SIC output data were delivered.
Brain evolution and development: adaptation, allometry and constraint
Barton, Robert A.
2016-01-01
Phenotypic traits are products of two processes: evolution and development. But how do these processes combine to produce integrated phenotypes? Comparative studies identify consistent patterns of covariation, or allometries, between brain and body size, and between brain components, indicating the presence of significant constraints limiting independent evolution of separate parts. These constraints are poorly understood, but in principle could be either developmental or functional. The developmental constraints hypothesis suggests that individual components (brain and body size, or individual brain components) tend to evolve together because natural selection operates on relatively simple developmental mechanisms that affect the growth of all parts in a concerted manner. The functional constraints hypothesis suggests that correlated change reflects the action of selection on distributed functional systems connecting the different sub-components, predicting more complex patterns of mosaic change at the level of the functional systems and more complex genetic and developmental mechanisms. These hypotheses are not mutually exclusive but make different predictions. We review recent genetic and neurodevelopmental evidence, concluding that functional rather than developmental constraints are the main cause of the observed patterns. PMID:27629025
SEQ-REVIEW: A tool for reviewing and checking spacecraft sequences
NASA Astrophysics Data System (ADS)
Maldague, Pierre F.; El-Boushi, Mekki; Starbird, Thomas J.; Zawacki, Steven J.
1994-11-01
A key component of JPL's strategy to make space missions faster, better and cheaper is the Advanced Multi-Mission Operations System (AMMOS), a ground-software-intensive system currently in use and in further development. AMMOS intends to eliminate the cost of re-engineering a ground system for each new JPL mission. This paper discusses SEQ-REVIEW, a component of AMMOS that was designed to facilitate and automate the task of reviewing and checking spacecraft sequences. SEQ-REVIEW is a smart browser for inspecting files created by other sequence generation tools in the AMMOS system. It can parse sequence-related files according to a computer-readable version of a 'Software Interface Specification' (SIS), which is a standard document for defining file formats. It lets users display one or several linked files and check simple constraints using a Basic-like 'Little Language'. SEQ-REVIEW represents the first application of the Quality Function Deployment (QFD) method to sequence software development at JPL. The paper will show how the requirements for SEQ-REVIEW were defined and converted into a design based on object-oriented principles. The process starts with interviews of potential users, a small but diverse group that spans multiple disciplines and 'cultures'. It continues with the development of QFD matrices that relate product functions and characteristics to user-demanded qualities. These matrices are then turned into a formal Software Requirements Document (SRD). The process concludes with the design phase, in which the CRC (Class, Responsibility, Collaboration) approach was used to convert requirements into a blueprint for the final product.
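A "Little Language" constraint check of this kind can be illustrated in a few lines; the record fields, rule syntax, and example constraint below are hypothetical stand-ins, not SEQ-REVIEW's actual interface.

```python
# A minimal sketch of a little-language constraint check over parsed
# sequence records, assuming each record is a dict of fields. The rule
# is a boolean expression over field names; eval with empty builtins
# limits it to those fields (illustration only, not hardened).
records = [
    {"cmd": "TURN", "t": 10.0, "power_w": 40.0},
    {"cmd": "DOWNLINK", "t": 12.5, "power_w": 95.0},
]

def check(rule, recs):
    """Evaluate one boolean rule per record; report any violations."""
    for i, r in enumerate(recs):
        if not eval(rule, {"__builtins__": {}}, dict(r)):
            print(f"record {i} violates rule {rule!r}: {r}")

check("power_w <= 90.0", records)   # flags the DOWNLINK record
```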
ERIC Educational Resources Information Center
Miller, Fred; Mangold, W. Glynn; Holmes, Terry
2006-01-01
Although the value of geographic information systems (GIS) technologies is recognized by practitioners and educators alike, GIS instruction has yet to make significant inroads into business curricula. In this article, the authors discuss the constraints of integrating GIS tools into business education. They develop a prototype module for…
Integrating Guided Inquiry into a Traditional Chemistry Curricular Framework
ERIC Educational Resources Information Center
Smithenry, Dennis William
2010-01-01
The case study presented in this paper examines the work of one high school chemistry teacher who has integrated guided inquiry into a yearlong, traditional curricular framework in ways that take into account the constraints and realities of her classroom. The study's findings suggest (1) the extent and frequency to which teachers can…
Integrating CALL into an Iranian EAP Course: Constraints and Affordances
ERIC Educational Resources Information Center
Mehran, Parisa; Alizadeh, Mehrasa
2015-01-01
Iranian universities have recently displayed a growing interest in integrating Computer-Assisted Language Learning (CALL) into teaching/learning English. The English for Academic Purposes (EAP) context, however, is not keeping pace with the current changes since EAP courses are strictly text-based and exam-oriented, and little research has thus…
Integrating Service and Experience: When Education Meets Admissions
ERIC Educational Resources Information Center
Stafne, Marcos
2010-01-01
In the past five years the Rubin Museum of Art has had significant shifts in the organizational structure and interrelation of visitor services and education due to various financial and administrative changes. Though varying levels of integration have existed in the institution's history, due to budget constraints in early 2009, the two separate…
Organizational Support of Technology Integration in One School in Lebanon
ERIC Educational Resources Information Center
Zgheib, Rosine S.
2013-01-01
Technology has been at the center of heated debates in educational settings, driving schools to compete for the best technological equipment. However, in Lebanon there is a lag in technology integration relative to twenty-first-century advances. Several barriers related to teacher attitudes, lack of technical skills and organizational constraints to…
NASA Technical Reports Server (NTRS)
Kerstman, Eric; Minard, Charles G.; Saile, Lynn; FreiredeCarvalho, Mary; Myers, Jerry; Walton, Marlei; Butler, Douglas; Lopez, Vilma
2010-01-01
The Integrated Medical Model (IMM) is a decision support tool that is useful to space flight mission planners and medical system designers in assessing risks and optimizing medical systems. The IMM employs an evidence-based, probabilistic risk assessment (PRA) approach within the operational constraints of space flight.
Model selection and assessment for multi-species occupancy models
Broms, Kristin M.; Hooten, Mevin B.; Fitzpatrick, Ryan M.
2016-01-01
While multi-species occupancy models (MSOMs) are emerging as a popular method for analyzing biodiversity data, formal checking and validation approaches for this class of models have lagged behind. Concurrent with the rise in application of MSOMs among ecologists, a quiet regime shift is occurring in Bayesian statistics where predictive model comparison approaches are experiencing a resurgence. Unlike single-species occupancy models that use integrated likelihoods, MSOMs are usually couched in a Bayesian framework and contain multiple levels. Standard model checking and selection methods are often unreliable in this setting and there is only limited guidance in the ecological literature for this class of models. We examined several different contemporary Bayesian hierarchical approaches for checking and validating MSOMs and applied these methods to a freshwater aquatic study system in Colorado, USA, to better understand the diversity and distributions of plains fishes. Our findings indicated distinct differences among model selection approaches, with cross-validation techniques performing the best in terms of prediction.
1999-03-01
aerodynamics to affect load motions. The effects include a load trail angle in proportion to the drag specific force, and modification of the load pendulum...equations algorithm for flight data filtering and data consistency checking; and SCIDNT, an output-error identification...accelerations at the seven sensor locations. When system identification is performed, the order of the identified system is proportional to the number of flexible modes
NASA Technical Reports Server (NTRS)
Duncan, L. M.; Reddell, J. P.; Schoonmaker, P. B.
1975-01-01
Techniques and support software for the efficient performance of simulation validation are discussed. Overall validation software structure, the performance of validation at various levels of simulation integration, guidelines for check case formulation, methods for real-time acquisition and formatting of data from an all-up operational simulator, and methods and criteria for comparison and evaluation of simulation data are included. Vehicle subsystem modules, module integration, special test requirements, and reference data formats are also described.
40 CFR 63.7740 - What are my monitoring requirements?
Code of Federal Regulations, 2010 CFR
2010-07-01
... a bag leak detection system according to the requirements in § 63.7741(b). (c) For each baghouse... the proper functioning of removal mechanisms. (3) Check the compressed air supply for pulse-jet... integrity of the baghouse through quarterly visual inspections of the baghouse interior for air leaks. (8...
Towards Quantifying Programmable Logic Controller Resilience Against Intentional Exploits
2012-03-22
may improve the SCADA system's resilience against DoS and man-in-the-middle (MITM) attacks. DoS attacks may be mitigated by using the redundant...paths available on the network links. MITM attacks may be mitigated by the data integrity checks associated with the middleware. Figure 4 illustrates
Does It Work? 555-Timer Checker Leaves No Doubt
ERIC Educational Resources Information Center
Harman, Charles
2009-01-01
This article details the construction and use of the 555-timer checker. The 555-timer checker allows the user to dynamically check a 555-timer, an integrated circuit device. Of its many applications, it provides timing applications that unijunction transistors once performed. (Contains 4 figures and 3 photos.)
NASA Technical Reports Server (NTRS)
Panontin, Tina L.; Sheppard, Sheri D.
1994-01-01
The use of small laboratory specimens to predict the integrity of large, complex structures relies on the validity of single parameter fracture mechanics. Unfortunately, the constraint loss associated with large scale yielding, whether in a laboratory specimen because of its small size or in a structure because it contains shallow flaws loaded in tension, can cause the breakdown of classical fracture mechanics and the loss of transferability of critical, global fracture parameters. Although the issue of constraint loss can be eliminated by testing actual structural configurations, such an approach can be prohibitively costly. Hence, a methodology that can correct global fracture parameters for constraint effects is desirable. This research uses micromechanical analyses to define the relationship between global, ductile fracture initiation parameters and constraint in two specimen geometries (SECT and SECB with varying a/w ratios) and one structural geometry (circumferentially cracked pipe). Two local fracture criteria corresponding to ductile fracture micromechanisms are evaluated: a constraint-modified, critical strain criterion for void coalescence proposed by Hancock and Cowling and a critical void ratio criterion for void growth based on the Rice and Tracey model. Crack initiation is assumed to occur when the critical value in each case is reached over some critical length. The primary material of interest is A516-70, a high-hardening pressure vessel steel sensitive to constraint; however, a low-hardening structural steel that is less sensitive to constraint is also being studied. Critical values of local fracture parameters are obtained by numerical analysis and experimental testing of circumferentially notched tensile specimens of varying constraint (e.g., notch radius). These parameters are then used in conjunction with large strain, large deformation, two- and three-dimensional finite element analyses of the geometries listed above to predict crack initiation loads and to calculate the associated (critical) global fracture parameters. The loads are verified experimentally, and microscopy is used to measure pre-crack length, crack tip opening displacement (CTOD), and the amount of stable crack growth. Results for A516-70 steel indicate that the constraint-modified, critical strain criterion with a critical length approximately equal to the grain size (0.0025 inch) provides accurate predictions of crack initiation. The critical void growth criterion is shown to considerably underpredict crack initiation loads with the same critical length. The relationship between the critical value of the J-integral for ductile crack initiation and crack depth for SECT and SECB specimens has been determined using the constraint-modified, critical strain criterion, demonstrating that this micromechanical model can be used to correct in-plane constraint effects due to crack depth and bending vs. tension loading. Finally, the relationship developed for the SECT specimens is used to predict the behavior of circumferentially cracked pipe specimens.
Web-based software tool for constraint-based design specification of synthetic biological systems.
Oberortner, Ernst; Densmore, Douglas
2015-06-19
miniEugene provides computational support for solving combinatorial design problems, enabling users to specify and enumerate designs for novel biological systems based on sets of biological constraints. This technical note presents a brief tutorial for biologists and software engineers in the field of synthetic biology on how to use miniEugene. After reading this technical note, users should know which biological constraints are available in miniEugene, understand the syntax and semantics of these constraints, and be able to follow a step-by-step guide to specify the design of a classical synthetic biological system, the genetic toggle switch [1]. We also provide links and references to more information on the miniEugene web application and the integration of the miniEugene software library into sophisticated Computer-Aided Design (CAD) tools for synthetic biology (www.eugenecad.org).
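The flavor of constraint-based design specification and enumeration can be sketched as follows; the part names and constraint vocabulary are hypothetical, not miniEugene's actual rule set.

```python
# A minimal sketch of constraint-based design enumeration, assuming a
# design is an ordering of named parts and a constraint is a predicate
# over that ordering (all names and rules are invented for illustration).
from itertools import permutations

parts = ["pLac", "gfp", "term", "pTet"]

def before(a, b):
    return lambda d: d.index(a) < d.index(b)

def ends_with(a):
    return lambda d: d[-1] == a

constraints = [before("pLac", "gfp"), ends_with("term")]

# Enumerate every ordering that satisfies all constraints.
designs = [d for d in permutations(parts)
           if all(c(d) for c in constraints)]
for d in designs:
    print(" -> ".join(d))
```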
High performance techniques for space mission scheduling
NASA Technical Reports Server (NTRS)
Smith, Stephen F.
1994-01-01
In this paper, we summarize current research at Carnegie Mellon University aimed at development of high performance techniques and tools for space mission scheduling. Similar to prior research in opportunistic scheduling, our approach assumes the use of dynamic analysis of problem constraints as a basis for heuristic focusing of problem solving search. This methodology, however, is grounded in representational assumptions more akin to those adopted in recent temporal planning research, and in a problem solving framework which similarly emphasizes constraint posting in an explicitly maintained solution constraint network. These more general representational assumptions are necessitated by the predominance of state-dependent constraints in space mission planning domains, and the consequent need to integrate resource allocation and plan synthesis processes. First, we review the space mission problems we have considered to date and indicate the results obtained in these application domains. Next, we summarize recent work in constraint posting scheduling procedures, which offer the promise of better future solutions to this class of problems.
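Constraint posting in an explicitly maintained network can be illustrated with a simple temporal network, where a posted bound t_j - t_i <= w is accepted only if it introduces no negative cycle; the activities and bounds below are hypothetical, and this is not the authors' scheduler.

```python
# A minimal sketch of constraint posting in a simple temporal network
# (STN): D[i, j] holds the tightest known bound on t_j - t_i, and a new
# constraint is committed only if shortest paths stay free of negative
# cycles (the standard STN consistency test).
import numpy as np

INF = 1e18
n = 3                                # time points: start0, end0, end1
D = np.full((n, n), INF)
np.fill_diagonal(D, 0.0)

def post(i, j, w):
    """Post t_j - t_i <= w; roll back and reject if inconsistent."""
    global D
    snapshot = D.copy()
    D[i, j] = min(D[i, j], w)
    for a in range(n):               # incremental Floyd-Warshall pass
        for b in range(n):
            D[a, b] = min(D[a, b], D[a, i] + D[i, j] + D[j, b])
    if any(D[k, k] < 0 for k in range(n)):
        D = snapshot                 # backtrack the violating constraint
        return False
    return True

print(post(0, 1, 10))    # activity lasts at most 10        -> True
print(post(1, 0, -4))    # ...and at least 4                -> True
print(post(1, 2, 0))     # a successor ends no later        -> True
print(post(2, 1, -20))   # ...but at least 20 after: reject -> False
```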
The Z1 truss is moved to check weight and balance
NASA Technical Reports Server (NTRS)
2000-01-01
In the Space Station Processing Facility, photographers focus on the Integrated Truss Structure Z1, an element of the International Space Station, suspended by a crane overhead. The truss is being moved to another stand to check its weight and balance. The Z1 truss is the first of 10 trusses that will become the backbone of the Space Station, eventually stretching the length of a football field. Along with its companion payload, the third Pressurized Mating Adapter, the Z1 is scheduled to be launched aboard Space Shuttle Discovery Oct. 5 at 9:38 p.m. EDT. The launch will be the 100th in the Shuttle program.
STS-98 U.S. Lab Destiny is moved out of Atlantis' payload bay
NASA Technical Reports Server (NTRS)
2001-01-01
KENNEDY SPACE CENTER, Fla. -- Workers in the Payload Changeout Room check the Payload Ground Handling Mechanism that will move the U.S. Lab Destiny out of Atlantis' payload bay and into the PCR. After the move, Atlantis will roll back to the Vehicle Assembly Building to allow workers to conduct inspections, continuity checks and X-ray analysis on the 36 solid rocket booster cables located inside each booster's system tunnel. An extensive evaluation of NASA's SRB cable inventory revealed conductor damage in four (of about 200) cables on the shelf. Shuttle managers decided to prove the integrity of the system tunnel cables already on Atlantis.
STS-92 crew takes part in a Leak Seal Kit Fit Check in the SSPF
NASA Technical Reports Server (NTRS)
1999-01-01
STS-92 crew members discuss results of a Leak Seal Kit Fit Check on the Pressurized Mating Adapter -3, part of their mission payload, with JSC and Boeing representatives. From left are Mission Specialists Michael E. Lopez-Alegria; Koichi Wakata, who represents the National Space Development Agency of Japan (NASDA); (standing) Peter J.K. 'Jeff' Wisoff (Ph.D.) and William Surles 'Bill' McArthur Jr.; (seated) Pilot Pamela A. Melroy; Dave Moore (behind Melroy), with Boeing; Mission Specialist Leroy Chiao (Ph.D.); Brian Warkentine, with JSC; and Commander Brian Duffy. The mission payload also includes an integrated truss structure (Z-1 truss). Launch of STS-92 is scheduled for Feb. 24, 2000.
Comparing the Correlation Length of Grain Markets in China and France
NASA Astrophysics Data System (ADS)
Roehner, Bertrand M.; Shiue, Carol H.
In economics, comparative analysis plays the same role as experimental research in physics. In this paper, we closely examine several methodological problems related to comparative analysis by investigating the specific example of grain markets in China and France respectively. This enables us to answer a question in economic history which has so far remained open, namely whether or not market integration progressed in the 18th century. In economics as in physics, before any new result is accepted, it has to be checked and re-checked by different researchers. This is what we call the replication and comparison procedures. We show how these procedures should (and can) be implemented.
Evaluation of atomic pressure in the multiple time-step integration algorithm.
Andoh, Yoshimichi; Yoshii, Noriyuki; Yamada, Atsushi; Okazaki, Susumu
2017-04-15
In molecular dynamics (MD) calculations, reduction in calculation time per MD loop is essential. A multiple time-step (MTS) integration algorithm, the RESPA (Tuckerman and Berne, J. Chem. Phys. 1992, 97, 1990-2001), enables reductions in calculation time by decreasing the frequency of time-consuming long-range interaction calculations. However, the RESPA MTS algorithm involves uncertainties in evaluating the atomic interaction-based pressure (i.e., atomic pressure) of systems with and without holonomic constraints. It is not clear which intermediate forces and constraint forces in the MTS integration procedure should be used to calculate the atomic pressure. In this article, we propose a series of equations to evaluate the atomic pressure in the RESPA MTS integration procedure on the basis of its equivalence to the Velocity-Verlet integration procedure with a single time step (STS). The equations guarantee time-reversibility even for systems with holonomic constraints. Furthermore, we generalize the equations to both (i) an arbitrary number of inner time steps and (ii) an arbitrary number of force components (RESPA levels). The atomic pressure calculated by our equations with the MTS integration shows excellent agreement with the reference value with the STS, whereas pressures calculated using the conventional ad hoc equations deviated from it. Our equations can be extended straightforwardly to the MTS integration algorithm for the isothermal NVT and isothermal-isobaric NPT ensembles. © 2017 Wiley Periodicals, Inc.
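For orientation, the RESPA pattern the paper builds on can be sketched in one dimension: slow forces enter as half-kicks on the outer step, and fast forces are integrated by inner velocity-Verlet substeps. The force splitting and parameters below are hypothetical; the paper's actual contribution, the pressure evaluation, is not reproduced here.

```python
# A minimal 1-D sketch of the RESPA multiple time-step integrator:
# slow force as outer half-kicks, fast force via n_inner velocity-Verlet
# substeps. The stiff/soft force split is a hypothetical stand-in.
def f_fast(x): return -100.0 * x        # stiff bond-like force
def f_slow(x): return -1.0 * x          # soft long-range-like force

def respa_step(x, v, dt, n_inner, m=1.0):
    v += 0.5 * dt * f_slow(x) / m       # outer half-kick (slow force)
    h = dt / n_inner
    for _ in range(n_inner):            # inner velocity-Verlet loop
        v += 0.5 * h * f_fast(x) / m
        x += h * v
        v += 0.5 * h * f_fast(x) / m
    v += 0.5 * dt * f_slow(x) / m       # outer half-kick (slow force)
    return x, v

x, v = 1.0, 0.0
for _ in range(1000):
    x, v = respa_step(x, v, dt=0.01, n_inner=4)
print(x, v)                             # bounded oscillation, near-conserved energy
```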
Xu, Lang; Lu, Yuhua; Liu, Qian
2018-02-01
We propose a novel method to simulate soft tissue deformation for virtual surgery applications. The method considers the mechanical properties of soft tissue, such as its viscoelasticity, nonlinearity and incompressibility; its speed, stability and accuracy also meet the requirements for a surgery simulator. Modifying the traditional equation for mass spring dampers (MSD) introduces nonlinearity and viscoelasticity into the calculation of elastic force. Then, the elastic force is used in the constraint projection step for naturally reducing constraint potential. The node position is enforced by the combined spring force and constraint conservative force through Newton's second law. We conduct a comparison study of conventional MSD and position-based dynamics for our new integrating method. Our approach enables stable, fast and large step simulation by freely controlling visual effects based on nonlinearity, viscoelasticity and incompressibility. We implement a laparoscopic cholecystectomy simulator to demonstrate the practicality of our method, in which liver and gallbladder deformation can be simulated in real time. Our method is an appropriate choice for the development of real-time virtual surgery applications.
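The combination of spring forces with constraint projection can be sketched in a position-based-dynamics style; the two-particle system, the cubic stiffening term, and all material values below are hypothetical reductions of the paper's approach.

```python
# A minimal sketch of a nonlinear spring force feeding a position-based
# constraint projection step: forces update velocities, predicted
# positions are projected onto the distance constraint, and velocities
# are recomputed from the projected positions.
import numpy as np

x = np.array([[0.0, 0.0], [1.2, 0.0]])    # particle positions
v = np.zeros_like(x)
rest, k, c, m, dt = 1.0, 50.0, 2.0, 1.0, 0.01
gravity = np.array([0.0, -9.8])

for _ in range(200):
    d = x[1] - x[0]
    L = np.linalg.norm(d)
    # nonlinear (cubic-stiffened) spring force along the link
    f = (k * (L - rest) ** 3 / rest ** 2 + k * (L - rest)) * d / L
    v[0] += dt * (f / m + gravity)
    v[1] += dt * (-f / m + gravity)
    v *= np.exp(-c * dt / m)              # simple viscous damping
    p = x + dt * v                        # predicted positions
    # constraint projection: enforce |p1 - p0| = rest exactly
    d = p[1] - p[0]
    L = np.linalg.norm(d)
    corr = 0.5 * (L - rest) * d / L
    p[0] += corr
    p[1] -= corr
    v = (p - x) / dt                      # velocity update from projection
    x = p

print(np.linalg.norm(x[1] - x[0]))        # link length held at rest value
```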
A multi-objective constraint-based approach for modeling genome-scale microbial ecosystems.
Budinich, Marko; Bourdon, Jérémie; Larhlimi, Abdelhalim; Eveillard, Damien
2017-01-01
Interplay within microbial communities impacts ecosystems on several scales, and elucidation of the consequent effects is a difficult task in ecology. In particular, the integration of genome-scale data within quantitative models of microbial ecosystems remains elusive. This study advocates the use of constraint-based modeling to build predictive models from recent high-resolution -omics datasets. Following recent studies that have demonstrated the accuracy of constraint-based models (CBMs) for simulating single-strain metabolic networks, we sought to study microbial ecosystems as a combination of single-strain metabolic networks that exchange nutrients. This study presents two multi-objective extensions of CBMs for modeling communities: multi-objective flux balance analysis (MO-FBA) and multi-objective flux variability analysis (MO-FVA). Both methods were applied to a hot spring mat model ecosystem. As a result, multiple trade-offs between nutrients and growth rates, as well as thermodynamically favorable relative abundances at community level, were emphasized. We expect this approach to be used for integrating genomic information in microbial ecosystems. Following models will provide insights about behaviors (including diversity) that take place at the ecosystem scale.
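The single-objective building block that MO-FBA extends, flux balance analysis, is a linear program: maximize a biomass flux subject to the steady-state condition S·v = 0 and flux bounds. A minimal sketch on a hypothetical three-reaction network:

```python
# A minimal flux balance analysis sketch: linprog minimizes, so the
# biomass flux is negated. The toy network (uptake A, convert A -> B,
# biomass drain on B) and its bounds are hypothetical.
import numpy as np
from scipy.optimize import linprog

S = np.array([[ 1.0, -1.0,  0.0],    # metabolite A balance
              [ 0.0,  1.0, -1.0]])   # metabolite B balance
bounds = [(0, 10), (0, 10), (0, 10)]
c = np.array([0.0, 0.0, -1.0])       # maximize v2 (the biomass drain)

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print("fluxes:", res.x, "growth:", -res.fun)
```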
Using surface integrals for checking Archimedes' law of buoyancy
NASA Astrophysics Data System (ADS)
Lima, F. M. S.
2012-01-01
A mathematical derivation of the force exerted by an inhomogeneous (i.e. compressible) fluid on the surface of an arbitrarily shaped body immersed in it is not found in the literature, which may be attributed to our trust in Archimedes' law of buoyancy. However, this law, also known as Archimedes' principle (AP), does not yield the force observed when the body is in contact with the container walls, as is more evident in the case of a block immersed in a liquid and in contact with the bottom, in which a downward force that increases with depth is observed. In this work, by taking into account the surface integral of the pressure force exerted by a fluid over the surface of a body, the general validity of AP is checked. For a body fully surrounded by a fluid, homogeneous or not, a gradient version of the divergence theorem applies, yielding a volume integral that simplifies to an upward force which agrees with the force predicted by AP, as long as the fluid density is a continuous function of depth. For the bottom case, this approach yields a downward force that increases with depth, which contrasts with AP but is in agreement with experiments. It also yields a formula for this force which shows that it increases with the area of contact.
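The core of the argument can be stated compactly (standard notation, z measured upward, body fully surrounded by fluid):

```latex
% Surface-integral form of the buoyant force, with outward normal \hat{n}.
\[
  \mathbf{F}
  \;=\; -\oint_{S} p\,\hat{\mathbf{n}}\,\mathrm{d}A
  \;=\; -\int_{V} \nabla p\,\mathrm{d}V ,
  \qquad
  \frac{\mathrm{d}p}{\mathrm{d}z} = -\rho(z)\,g
  \quad\Longrightarrow\quad
  \mathbf{F} = \Bigl(\int_{V} \rho(z)\,g\,\mathrm{d}V\Bigr)\hat{\mathbf{z}} .
\]
```

For a uniform fluid this reduces to Archimedes' F = ρ_f g V; for a block in contact with the bottom, the missing lower surface breaks the closed-surface argument, and the integral over the remaining wetted faces gives a net downward force that grows with depth, as the abstract notes.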
Quantum mechanics of hyperbolic orbits in the Kepler problem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rauh, Alexander; Parisi, Juergen
2011-04-15
The problem of deriving macroscopic properties from the Hamiltonian of the hydrogen atom is resumed by extending previous results in the literature, which predicted elliptic orbits, into the region of hyperbolic orbits. As a main tool, coherent states of the harmonic oscillator are used, which are continued to imaginary frequencies. The Kustaanheimo-Stiefel (KS) map is applied to transform the original configuration space into the product space of four harmonic oscillators with a constraint. The relation derived between real time and oscillator (pseudo) time includes quantum corrections. In the limit ℏ → 0, the time-dependent mean values of position and velocity describe the classical motion on a hyperbola and a circular hodograph, respectively. Moreover, the connection between pseudotime and real time comes out in analogy to Kepler's equation for elliptic orbits. The mean-square-root deviations of position and velocity components behave similarly in time to the corresponding ones of a spreading Gaussian wave packet in free space. To check the approximate treatment of the constraint, its contribution to the mean energy is determined, with the result that it is negligible except for energy values close to the parabolic orbit with eccentricity equal to 1. It is inevitable to introduce a suitable scalar product in R⁴ which makes both the transformed Hamiltonian and the velocity operators Hermitian. An elementary necessary criterion is given for the energy interval where the constraint can be approximated by averaging.
Protograph LDPC Codes with Node Degrees at Least 3
NASA Technical Reports Server (NTRS)
Divsalar, Dariush; Jones, Christopher
2006-01-01
In this paper we present protograph codes with a small number of degree-3 nodes and one high-degree node. The iterative decoding thresholds for the proposed rate-1/2 codes are lower, by about 0.2 dB, than those of the best known irregular LDPC codes with degree at least 3. The main motivation is to gain linear minimum distance in order to achieve a low error floor, and to construct rate-compatible protograph-based LDPC codes for fixed block length that simultaneously achieve a low iterative decoding threshold and linear minimum distance. We start with a rate-1/2 protograph LDPC code with degree-3 nodes and one high-degree node. Higher-rate codes are obtained by connecting check nodes with degree-2 non-transmitted nodes. This is equivalent to constraint combining in the protograph. The case where all constraints are combined corresponds to the highest-rate code. This combined constraint must be connected to nodes of degree at least three for the graph to have linear minimum distance. Thus having node degree at least 3 at rate 1/2 guarantees that the linear minimum distance property is preserved for higher rates. Through examples we show that iterative decoding thresholds as low as 0.544 dB can be achieved for small protographs with node degrees at least three. A family of low- to high-rate codes with minimum distance linearly increasing in block size and with capacity-approaching performance thresholds is presented. FPGA simulation results for a few example codes show that the proposed codes perform as predicted.
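The protograph construction itself can be sketched numerically: each base-matrix entry b is replaced by a sum of b distinct Z×Z circulant permutations ("lifting"). The small base matrix and random shifts below are hypothetical, far smaller than the paper's designs.

```python
# A minimal sketch of lifting a protograph base matrix to a binary
# parity-check matrix H: entry b of the base matrix becomes the sum of
# b distinct Z x Z circulant permutation matrices (b > 1 encodes
# parallel edges in the protograph).
import numpy as np

def circ(Z, s):
    """Z x Z circulant permutation: identity with columns rolled by s."""
    return np.roll(np.eye(Z, dtype=int), s, axis=1)

B = np.array([[1, 1, 1, 3],          # hypothetical base (proto) matrix:
              [1, 1, 1, 1]])         # degree-3 variables plus one high-degree node
Z = 5                                 # lifting (copy) factor
rng = np.random.default_rng(0)

blocks = []
for row in B:
    blocks.append([sum(circ(Z, s)
                       for s in rng.choice(Z, size=b, replace=False))
                   if b else np.zeros((Z, Z), dtype=int)
                   for b in row])
H = np.block(blocks)
print(H.shape, "design rate ~", 1 - B.shape[0] / B.shape[1])
```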
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hajian, Amir; Bond, J. Richard; Battaglia, Nicholas
We measure a significant correlation between the thermal Sunyaev-Zel'dovich effect in the Planck and WMAP maps and an X-ray cluster map based on ROSAT. We use the 100, 143 and 353 GHz Planck maps and the WMAP 94 GHz map to obtain this cluster cross spectrum. We check our measurements for contamination from dusty galaxies using the cross correlations with the 217, 545 and 857 GHz maps from Planck. Our measurement yields a direct characterization of the cluster power spectrum over a wide range of angular scales that is consistent with large cosmological simulations. The amplitude of this signal depends on cosmological parameters that determine the growth of structure (σ₈ and Ω_M) and scales as σ₈^7.4 and Ω_M^1.9 around multipole ℓ ∼ 1000. We constrain σ₈ and Ω_M from the cross-power spectrum to be σ₈(Ω_M/0.30)^0.26 = 0.8 ± 0.02. Since this cross spectrum produces a tight constraint in the σ₈-Ω_M plane, the errors on a σ₈ constraint will be mostly limited by the uncertainties from external constraints. Future cluster catalogs, like those from eRosita and LSST, and pointed multi-wavelength observations of clusters will improve the constraining power of this cross-spectrum measurement. In principle this analysis can be extended beyond σ₈ and Ω_M to constrain dark energy or the sum of the neutrino masses.
Eulerian Formulation of Spatially Constrained Elastic Rods
NASA Astrophysics Data System (ADS)
Huynen, Alexandre
Slender elastic rods are ubiquitous in nature and technology. For a vast majority of applications, the rod deflection is restricted by an external constraint and a significant part of the elastic body is in contact with a stiff constraining surface. The research work presented in this doctoral dissertation formulates a computational model for the solution of elastic rods constrained inside or around frictionless tube-like surfaces. The segmentation strategy adopted to cope with this complex class of problems consists in sequencing the global problem into, comparatively simpler, elementary problems either in continuous contact with the constraint or contact-free between their extremities. Within the conventional Lagrangian formulation of elastic rods, this approach is however associated with two major drawbacks. First, the boundary conditions specifying the locations of the rod centerline at both extremities of each elementary problem lead to the establishment of isoperimetric constraints, i.e., integral constraints on the unknown length of the rod. Second, the assessment of the unilateral contact condition requires, in principle, the comparison of two curves parametrized by distinct curvilinear coordinates, viz. the rod centerline and the constraint axis. Both conspire to burden the computations associated with the method. To streamline the solution along the elementary problems and rationalize the assessment of the unilateral contact condition, the rod governing equations are reformulated within the Eulerian framework of the constraint. The methodical exploration of both types of elementary problems leads to specific formulations of the rod governing equations that stress the profound connection between the mechanics of the rod and the geometry of the constraint surface. The proposed Eulerian reformulation, which restates the rod local equilibrium in terms of the curvilinear coordinate associated with the constraint axis, describes the rod deformed configuration by means of either its relative position with respect to the constraint axis (contact-free segments) or its angular position on the constraint surface (continuous contacts.) This formulation circumvents both drawbacks that afflict the conventional Lagrangian approach associated with the segmentation strategy. As the a priori unknown domain, viz. the rod length, is substituted for the known constraint axis, the free boundary problem and the associated isoperimetric constraints are converted into a classical two-point boundary value problem. Additionally, the description of the rod deflection by means of its eccentricity with respect to the constraint axis trivializes the assessment of the unilateral contact condition. Along continuous contacts, this formulation expresses the strain variables, measuring the rod change of shape, in terms of the geometric invariants of the constraint surface, and emphasizes the influence of the constraint local geometry on the reaction pressure. Formalizing the segmentation strategy, a computational model that exploits the Eulerian formulation of the rod governing equations is devised. To solve the quasi-static deflection of elastic rods constrained inside or around a tube-like surface, this computational model identifies the number of contacts, their nature (either discrete or continuous), and the rod configuration at the connections that satisfies the unilateral contact condition and preserves the rod integrity along the sequence of elementary problems.
Space Shuttle Day-of-Launch Trajectory Design Operations
NASA Technical Reports Server (NTRS)
Harrington, Brian E.
2011-01-01
A top priority of any launch vehicle is to insert as much mass into the desired orbit as possible. This requirement must be traded against vehicle capability in terms of dynamic control, thermal constraints, and structural margins. The vehicle is certified to specific structural limits which will yield certain performance characteristics of mass to orbit. Some limits cannot be certified generically and must be checked with each mission design. The most sensitive limits require an assessment on the day of launch. To further minimize vehicle loads while maximizing vehicle performance, a day-of-launch trajectory can be designed. This design is optimized according to that day's wind and atmospheric conditions, which increases the probability of launch. The day-of-launch trajectory design and verification process is critical to the vehicle's safety. The Day-Of-Launch I-Load Update (DOLILU) is the process by which the National Aeronautics and Space Administration's (NASA) Space Shuttle Program tailors the vehicle steering commands to fit that day's environmental conditions and then rigorously verifies the integrated vehicle trajectory's loads, controls, and performance. This process has been successfully used for almost twenty years and shares many of the same elements with other launch vehicles that execute a day-of-launch trajectory design or day-of-launch trajectory verification. Weather balloon data is gathered at the launch site and transmitted to the Johnson Space Center's Mission Control. The vehicle's first-stage trajectory is then adjusted to the measured wind and atmosphere data. The resultant trajectory must satisfy loads and controls constraints. Additionally, these assessments statistically protect for non-observed dispersions. One such dispersion is the change in the wind from the last measured balloon to launch time. This process is started in the hours before launch and is repeated several times as the launch count proceeds. Should the trajectory design not meet all constraint criteria, Shuttle would be No-Go for launch. This Shuttle methodology is very similar to that of other unmanned launch vehicles. By extension, this method would likely be employed for any future NASA launch vehicle. This paper will review the Shuttle's day-of-launch trajectory optimization and verification operations as an example of a more generic application of day-of-launch design and validation. With Shuttle's retirement, it is fitting to document the current state of this critical process and capture lessons learned to benefit current and future launch vehicle endeavors.
Depth from Edge and Intensity Based Stereo.
1982-09-01
a Mars Viking vehicle, and a random-dotted coffee jar. Assessment of the algorithm is a bit difficult: it uses a fairly simple control structure with...correspondences. This use of an evaluation function estimator allowed the introduction of the extensive pruning of a branch-and-bound algorithm. Even with it...Figure 3-6). This is the edge reversal constraint, and was integral to the pruning. As it happens, this same constraint is the key to the use of the
Space station payload operations scheduling with ESP2
NASA Technical Reports Server (NTRS)
Stacy, Kenneth L.; Jaap, John P.
1988-01-01
The Mission Analysis Division of the Systems Analysis and Integration Laboratory at the Marshall Space Flight Center is developing a system of programs to handle all aspects of scheduling payload operations for Space Station. The Expert Scheduling Program (ESP2) is the heart of this system. The task of payload operations scheduling can be simply stated as positioning the payload activities in a mission so that they collect their desired data without interfering with other activities or violating mission constraints. ESP2 is an advanced version of the Experiment Scheduling Program (ESP) which was developed by the Mission Integration Branch beginning in 1979 to schedule Spacelab payload activities. The automatic scheduler in ESP2 is an expert system that embodies the rules that expert planners would use to schedule payload operations by hand. This scheduler uses depth-first searching, backtracking, and forward chaining techniques to place an activity so that constraints (such as crew, resources, and orbit opportunities) are not violated. It has an explanation facility to show why an activity was or was not scheduled at a certain time. The ESP2 user can also place the activities in the schedule manually. The program offers graphical assistance to the user and will advise when constraints are being violated. ESP2 also has an option to identify conflict introduced into an existing schedule by changes to payload requirements, mission constraints, and orbit opportunities.
Henderson, Amanda; Harrison, Penny; Rowe, Jennifer; Edwards, Sam; Barnes, Margaret; Henderson, Simon; Henderson, Amanda
2018-04-10
To prepare graduate nurses for practice, the curriculum and pedagogy need to facilitate student engagement, active learning and the development of self-efficacy. This pilot project describes and explores an initiative, the Check-in and Check-out process, that aims to engage students as active partners in their learning and teaching in their clinical preparation for practice. Three interdependent elements make up the process: a check-in (briefing) part; a clinical practice part, which supports students as they engage in their learning and practise clinical skills; and a check-out (debriefing) part. A student evaluation of this initiative confirmed the value of the process, which has subsequently been embedded in the preparation for practice and work-integrated learning courses in the undergraduate nursing programs at the participating university. The introduction of a singular learning process provides consistency in the learning approach used across clinical learning spaces, irrespective of their location or focus. A consistent learning process, including a common language that easily transfers across all clinical courses and clinical settings, arguably enhances the students' learning experience, helps them to actively manage their preparation for clinical practice and to develop self-efficacy. Copyright © 2018. Published by Elsevier Ltd.
Stateless and stateful implementations of faithful execution
Pierson, Lyndon G; Witzke, Edward L; Tarman, Thomas D; Robertson, Perry J; Eldridge, John M; Campbell, Philip L
2014-12-16
A faithful execution system includes system memory, a target processor, and protection engine. The system memory stores a ciphertext including value fields and integrity fields. The value fields each include an encrypted executable instruction and the integrity fields each include an encrypted integrity value for determining whether a corresponding one of the value fields has been modified. The target processor executes plaintext instructions decoded from the ciphertext while the protection engine is coupled between the system memory and the target processor. The protection engine includes logic to retrieve the ciphertext from the system memory, decrypt the value fields into the plaintext instructions, perform an integrity check based on the integrity fields to determine whether any of the corresponding value fields have been modified, and provide the plaintext instructions to the target processor for execution.
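The fetch-decrypt-verify loop can be sketched with stand-in primitives: HMAC-SHA256 as the integrity value and a keyed-hash XOR stream as a placeholder cipher, chosen only to keep the sketch standard-library; a real protection engine would use a vetted block cipher.

```python
# A minimal sketch of value fields plus integrity fields: the integrity
# value binds the ciphertext to its address, so tampering or relocation
# is caught before the plaintext instruction reaches the processor.
# (Toy primitives; fields assumed <= 32 bytes for this keystream.)
import hashlib, hmac

KEY = b"demo-key"

def keystream(addr, n):
    return hashlib.sha256(KEY + addr.to_bytes(8, "big")).digest()[:n]

def protect(addr, instr):                      # producer side
    ct = bytes(a ^ b for a, b in zip(instr, keystream(addr, len(instr))))
    tag = hmac.new(KEY, ct + addr.to_bytes(8, "big"), "sha256").digest()
    return ct, tag                             # value field, integrity field

def fetch_and_execute(addr, ct, tag):          # protection engine side
    good = hmac.new(KEY, ct + addr.to_bytes(8, "big"), "sha256").digest()
    if not hmac.compare_digest(tag, good):
        raise RuntimeError(f"integrity failure at {addr:#x}: halting")
    pt = bytes(a ^ b for a, b in zip(ct, keystream(addr, len(ct))))
    print(f"dispatch plaintext instruction {pt!r} to target processor")

ct, tag = protect(0x1000, b"ADD R1,R2")
fetch_and_execute(0x1000, ct, tag)             # verifies and dispatches
try:
    fetch_and_execute(0x1000, ct[:-1] + b"\x00", tag)  # tampered value field
except RuntimeError as e:
    print(e)
```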
NASA Technical Reports Server (NTRS)
Pinckney, John
2010-01-01
With the advent of high-speed computing, Monte Carlo ray-tracing techniques have become the preferred method for evaluating spacecraft orbital heating. Monte Carlo has its greatest advantage where there are many interacting surfaces. However, Monte Carlo programs are specialized tools that suffer from some inaccuracy, long calculation times, and high purchase cost. A general orbital heating integral is presented here that is accurate, fast, and runs on MathCad, a generally available engineering mathematics program. The integral is easy to read, understand, and alter. The integral can be applied to unshaded primitive surfaces at any orientation. The method is limited to direct heating calculations. This integral formulation can be used for quick orbit evaluations and for spot-checking Monte Carlo results.
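A direct-heating integral of this kind is easy to reproduce outside MathCad; the sketch below computes the orbit-average absorbed solar flux on an unshaded flat plate, ignoring eclipse and reflected/planetary terms, with hypothetical orbit and attitude parameters.

```python
# A minimal direct solar heating integral for an unshaded flat plate:
# rotate the plate normal around the orbit and average the absorbed
# flux q = alpha * S * max(cos(theta), 0) over one revolution.
import numpy as np

S = 1367.0                           # solar constant, W/m^2
alpha = 0.3                          # surface absorptivity (hypothetical)
n_hat = np.array([1.0, 0.0, 0.0])    # plate normal, inertial frame
beta = np.radians(20.0)              # hypothetical sun angle in orbit plane
s_hat = np.array([np.cos(beta), np.sin(beta), 0.0])

anomaly = np.linspace(0.0, 2.0 * np.pi, 3600, endpoint=False)
q = np.empty_like(anomaly)
for i, t in enumerate(anomaly):
    R = np.array([[np.cos(t), -np.sin(t), 0.0],   # rotation with the orbit
                  [np.sin(t),  np.cos(t), 0.0],
                  [0.0,        0.0,       1.0]])
    q[i] = alpha * S * max((R @ n_hat) @ s_hat, 0.0)

print("orbit-average absorbed flux:", q.mean(), "W/m^2")
```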
Finite element implementation of state variable-based viscoplasticity models
NASA Technical Reports Server (NTRS)
Iskovitz, I.; Chang, T. Y. P.; Saleeb, A. F.
1991-01-01
The implementation of state variable-based viscoplasticity models is carried out in a general-purpose finite element code for structural applications of metals deformed at elevated temperatures. Two constitutive models, Walker's and Robinson's, are studied in conjunction with two implicit integration methods: the trapezoidal rule with Newton-Raphson iterations and an asymptotic integration algorithm. A comparison is made between the two integration methods, and the latter appears to be computationally more appealing in terms of numerical accuracy and CPU time. However, in order to make the asymptotic algorithm robust, it is necessary to include a self-adaptive scheme with subincremental step control and error checking of the Jacobian matrix at the integration points. Three examples are given to illustrate the numerical aspects of the integration methods tested.
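The first of the two integration methods, the trapezoidal rule with Newton-Raphson iterations, can be sketched on a one-state toy model (a Perzyna-like relaxation law, not Walker's or Robinson's model; all constants are invented):

```python
# A minimal implicit trapezoidal step with Newton-Raphson iterations for
# a one-state viscoplastic relaxation test: x is the inelastic strain,
# total strain is held fixed, and stress relaxes as x grows.
import math

E, A, D, eps_tot = 200e3, 1e-4, 50.0, 0.002   # MPa, 1/s, MPa, - (hypothetical)

def f(x):                                      # inelastic strain rate
    return A * math.sinh(E * (eps_tot - x) / D)

def df(x):
    return -A * math.cosh(E * (eps_tot - x) / D) * E / D

def trapezoidal_step(x_n, dt, tol=1e-12):
    x = x_n                                    # initial Newton guess
    for _ in range(50):
        g  = x - x_n - 0.5 * dt * (f(x_n) + f(x))   # residual
        dg = 1.0 - 0.5 * dt * df(x)                 # Jacobian (scalar)
        dx = -g / dg
        x += dx
        if abs(dx) < tol:
            return x
    raise RuntimeError("Newton failed to converge")

x, dt = 0.0, 0.05
for _ in range(40):
    x = trapezoidal_step(x, dt)
print("stress after relaxation:", E * (eps_tot - x), "MPa")
```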
Simplified, inverse, ejector design tool
NASA Technical Reports Server (NTRS)
Dechant, Lawrence J.
1993-01-01
A simple lumped-parameter inverse design tool has been developed which provides flow-path geometry and entrainment estimates subject to operational, acoustic, and design constraints. These constraints are manifested through specification of primary mass flow rate or ejector thrust, fully-mixed exit velocity, and static pressure matching. Fundamentally, integral forms of the conservation equations coupled with the specified design constraints are combined to yield an easily invertible linear system in terms of the flow-path cross-sectional areas. Entrainment is computed by back substitution. Initial comparison with experimental data and analogous one-dimensional methods shows good agreement. Thus, this simple inverse design code provides an analytically based preliminary design tool with direct application to High Speed Civil Transport (HSCT) design studies.
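The solution pattern, a direct linear solve for the areas followed by back substitution for entrainment, can be sketched as follows; the coefficient matrix, right-hand side, and flow values are placeholders, not the report's actual equations.

```python
# A minimal sketch of the inverse-design pattern: the integral
# conservation laws plus design constraints reduce to A x = b in the
# cross-sectional areas; entrainment follows by back substitution.
# All numbers below are hypothetical placeholders.
import numpy as np

A = np.array([[1.0, -1.0,  0.0],     # stand-in continuity relation
              [0.4,  1.0, -1.0],     # stand-in momentum relation
              [0.0,  0.3,  1.0]])    # stand-in static-pressure match
b = np.array([0.02, 0.05, 0.10])     # from specified thrust / exit velocity

areas = np.linalg.solve(A, b)        # inlet, mixing, diffuser areas (m^2)
rho, V_exit, mdot_primary = 1.2, 150.0, 3.0
entrainment = rho * V_exit * areas[2] / mdot_primary - 1.0  # back substitution
print("areas:", areas, "entrainment ratio:", entrainment)
```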
NASA Technical Reports Server (NTRS)
Dolvin, Douglas J.
1992-01-01
The superior survivability of a multirole fighter is dependent upon balanced integration of technologies for reduced vulnerability and susceptibility. The objective is to develop a methodology for structural design optimization with survivability-dependent constraints. The design criterion for optimization is survivability in a tactical laser environment. The following analyses are studied to establish a dependent design relationship between structural weight and survivability: (1) develop a physically linked global design model of survivability variables; and (2) apply conventional constraints to quantify survivability-dependent design. It was not possible to develop an exact approach which would include all aspects of survivability-dependent design; therefore, guidelines are offered for solving similar problems.
NASA Astrophysics Data System (ADS)
Mardirossian, Narbe; Head-Gordon, Martin
2015-02-01
A meta-generalized gradient approximation density functional paired with the VV10 nonlocal correlation functional is presented. The functional form is selected from more than 10^10 choices carved out of a functional space of almost 10^40 possibilities. Raw data come from training a vast number of candidate functional forms on a comprehensive training set of 1095 data points and testing the resulting fits on a comprehensive primary test set of 1153 data points. Functional forms are ranked based on their ability to reproduce the data in both the training and primary test sets with minimum empiricism, and filtered based on a set of physical constraints and an often-overlooked condition of satisfactory numerical precision with medium-sized integration grids. The resulting optimal functional form has 4 linear exchange parameters, 4 linear same-spin correlation parameters, and 4 linear opposite-spin correlation parameters, for a total of 12 fitted parameters. The final density functional, B97M-V, is further assessed on a secondary test set of 212 data points, applied to several large systems including the coronene dimer and water clusters, tested for the accurate prediction of intramolecular and intermolecular geometries, verified to have a readily attainable basis set limit, and checked for grid sensitivity. Compared to existing density functionals, B97M-V is remarkably accurate for non-bonded interactions and very satisfactory for thermochemical quantities such as atomization energies, but inherits the demonstrable limitations of existing local density functionals for barrier heights.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berger, Edmond L.; Giddings, Steven B.; Wang, Haichen
2014-10-10
Here, the LHC phenomenology of a low-scale gauged flavor symmetry model with inverted hierarchy is studied, through introduction of a simplified model of broken flavor symmetry. A new scalar (a flavon) and a new neutral top-philic massive gauge boson emerge with mass in the TeV range, along with a new heavy fermion associated with the standard model top quark. After checking constraints from electroweak precision observables, we investigate the influence of the model on Higgs boson physics, notably on its production cross section and decay branching fractions. Limits on the flavon φ from heavy Higgs boson searches at the LHC at 7 and 8 TeV are presented. The branching fractions of the flavon are computed as a function of the flavon mass and the Higgs-flavon mixing angle. We also explore possible discovery of the flavon at 14 TeV, particularly via the φ → Z⁰Z⁰ decay channel in the 2ℓ2ℓ' final state, and through standard model Higgs boson pair production φ → hh in the b b̄ γγ final state. We conclude that the flavon mass range up to 500 GeV could be probed down to quite small values of the Higgs-flavon mixing angle with 100 fb⁻¹ of integrated luminosity at 14 TeV.
Jiao, Wan; Hagler, Gayle S W; Williams, Ronald W; Sharpe, Robert N; Weinstock, Lewis; Rice, Joann
2015-05-19
Continuous, long-term, and time-resolved measurement of outdoor air pollution has been limited by logistical hurdles and resource constraints. Measuring air pollution in more places is desired to address community concerns regarding local air quality impacts related to proximate sources, to provide data in areas lacking regional air monitoring altogether, or to support environmental awareness and education. This study integrated commercially available technologies to create the Village Green Project (VGP), a durable, solar-powered air monitoring park bench that measures real-time ozone, PM2.5, and meteorological parameters. The data are wirelessly transmitted via cellular modem to a server, where automated quality checks take place before data are provided to the public nearly instantaneously. Over 5500 h of data were successfully collected during the first ten months of pilot testing in Durham, North Carolina, with about 13 days (5.5%) of downtime because of low battery power. Additional data loss (4-14%, depending on the measurement) was caused by infrequent wireless communication interruptions and instrument maintenance. The 94.5% operational time via solar power was within 1.5% of engineering calculations using historical solar data for the location. The performance of the VGP was evaluated by comparing its data to nearby air monitoring stations operating federal equivalent methods (FEM); the VGP exhibited good agreement with the nearest benchmark FEMs for hourly ozone (r² = 0.79) and PM2.5 (r² = 0.76).
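The agreement metric reported above is straightforward to reproduce. The sketch below, with hypothetical range bounds and synthetic series standing in for the VGP and FEM data, shows an automated range check followed by an hourly r² computation.

```python
import numpy as np

def quality_check(values, lo, hi):
    """Flag values outside a plausible physical range (bounds assumed)."""
    v = np.asarray(values, dtype=float).copy()
    v[(v < lo) | (v > hi)] = np.nan
    return v

def hourly_r2(sensor, reference):
    """Coefficient of determination between paired hourly series,
    ignoring hours lost to outages or failed quality checks."""
    s, r = np.asarray(sensor), np.asarray(reference)
    ok = ~np.isnan(s) & ~np.isnan(r)
    return float(np.corrcoef(s[ok], r[ok])[0, 1] ** 2)

# Synthetic example: one week of hourly PM2.5 with ~5% downtime.
rng = np.random.default_rng(1)
ref = 10 + 5 * rng.random(168)              # FEM benchmark, ug/m3
sensor = ref + rng.normal(0, 1.5, size=168) # co-located bench sensor
sensor[rng.random(168) < 0.05] = np.nan     # battery/comm outages
sensor = quality_check(sensor, lo=0.0, hi=500.0)

print(f"hourly r^2 = {hourly_r2(sensor, ref):.2f}")
```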
Polymer diffusion in quenched disorder: A renormalization group approach
NASA Astrophysics Data System (ADS)
Ebert, Ute
1996-01-01
We study the diffusion of polymers through quenched short-range correlated random media by renormalization group (RG) methods, which allow us to derive universal predictions in the limit of long chains and weak disorder. We take local quenched random potentials with second moment v and the excluded-volume interaction u of the chain segments into account. We show that our model contains the relevant features of polymer diffusion in random media in the RG sense if we focus on the local entropic effects rather than on the topological constraints of a quenched random medium. The dynamic generating functional and the general structure of its perturbation expansion in u and v are derived. The distribution functions for the center-of-mass motion and the internal modes of one chain and for the correlation of the center-of-mass motions of two chains are calculated to one-loop order. The results allow for sufficient cross-checks to have trust in the one-loop renormalizability of the model. The general structure as well as the one-loop results of the integrated RG flow of the parameters are discussed. Universal results can be found for the effective static interaction w ≔ u - v ≥ 0 and for a small effective disorder coupling v̄(l) on the intermediate length scale l. As a first physical prediction from our analysis, we determine the general nonlinear scaling form of the chain diffusion constant and evaluate it explicitly for v̄(l) ≪ 1 [explicit expression given as a figure, not reproducible here; see full text].
ERIC Educational Resources Information Center
Shepard, Suzanne
The assessment process can be integrated with treatment and evaluation for helping teenage suicide attempters and families in short term psychiatric hospitalization programs. The method is an extremely efficient way for the therapist to work within a given time constraint. During family assessment sufficient information can be gathered to…
Digital Technologies in Mathematics Classrooms: Barriers, Lessons and Focus on Teachers
ERIC Educational Resources Information Center
Sacristán, Ana Isabel
2017-01-01
In this paper, drawing from data from several experiences and studies in which I have been involved in Mexico, I reflect on the constraints and inertia of classroom cultures, and the barriers to successful, meaningful and transformative technology integration in mathematics classroom. I focus on teachers as key players for this integration,…
Shuttle program. STS-7 feasibility assessment: IUS/TDRS-A
NASA Technical Reports Server (NTRS)
1979-01-01
This Space Transportation System 7 (STS-7) Flight Feasibility Assessment (FFA) provides a base from which the various design, operation, and integration elements associated with Tracking and Data Relay Satellite-A can perform mission planning and analysis. The STS-7 FFA identifies conflicts, issues, and concerns associated with the integrated flight design requirements and constraints.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Basavatia, A; Kalnicki, S; Garg, M
Purpose: To implement a clinically useful palm vein pattern recognition biometric system to deliver the correct treatment plan to the correct patient each and every time and to check the patient into the department to access the correct medical record. Methods: A commercially available hand vein scanning system was paired with Aria and utilized an ADT interface from the hospital electronic health system. Integration took place at two points in Aria, version 11 MR2: first at the appointment tracker screen for front desk medical record access, and second at the queue screen on the 4D treatment console for patient daily time-out. A test patient was utilized to check accuracy of identification as well as to check that no unintended interactions take place between the 4D treatment console and the hand vein scanning system. This system has been in clinical use since December 2013. Results: Since implementation, 445 patients have been enrolled into our biometric system. 95% of patients learn the correct methodology of hand placement on the scanner on the first try. We have had two instances of a patient not being found because of a bad initial scan; we simply erased the scanned metric and re-enrolled the patient in those cases. The accuracy of the match is 100% for each patient; we have not had one patient misidentified. We can state this because we still use patient photo and date of birth as identifiers. A QA test patient is run monthly to check the integrity of the system. Conclusion: By utilizing palm vein scans along with the date of birth and patient photo, another means of patient identification now exists. This work indicates the successful implementation of technology in the area of patient safety by closing the gap of treating the wrong plan to a patient in radiation oncology. FOJP Service Corporation covered some of the costs of the hardware and software of the palm vein pattern recognition biometric system.
4d N = 1 quiver gauge theories and the An Bailey lemma
NASA Astrophysics Data System (ADS)
Brünner, Frederic; Spiridonov, Vyacheslav P.
2018-03-01
We study the integral Bailey lemma associated with the An-root system and identities for elliptic hypergeometric integrals generated thereby. Interpreting integrals as superconformal indices of four-dimensional N = 1 quiver gauge theories with the gauge groups being products of SU(n + 1), we provide evidence for various new dualities. Further confirmation is achieved by explicitly checking that the 't Hooft anomaly matching conditions hold. We discuss a flavour symmetry breaking phenomenon for supersymmetric quantum chromodynamics (SQCD), and by making use of the Bailey lemma we indicate its manifestation in a web of linear quivers dual to SQCD that exhibits full s-confinement.
Yangian Symmetry and Integrability of Planar N=4 Supersymmetric Yang-Mills Theory.
Beisert, Niklas; Garus, Aleksander; Rosso, Matteo
2017-04-07
In this Letter, we establish Yangian symmetry of planar N=4 supersymmetric Yang-Mills theory. We prove that the classical equations of motion of the model close onto themselves under the action of Yangian generators. Moreover, we propose an off-shell extension of our statement, which is equivalent to the invariance of the action and prove that it is exactly satisfied. We assert that our relationship serves as a criterion for integrability in planar gauge theories by explicitly checking that it applies to the integrable Aharony-Bergman-Jafferis-Maldacena theory but not to the nonintegrable N=1 supersymmetric Yang-Mills theory.
NASA Technical Reports Server (NTRS)
Panek, Joseph W.
2001-01-01
The proper operation of the Electronically Scanned Pressure (ESP) System is critical to accomplishing the following goals: acquisition of highly accurate pressure data for the development of aerospace and commercial aviation systems, and continuous confirmation of data quality to avoid costly, unplanned, repeat wind tunnel or turbine testing. Standard automated setup and checkout routines are necessary to accomplish these goals. Data verification and integrity checks occur at three distinct stages: pretest pressure-tubing and system checkouts, daily system validation, and in-test confirmation of critical system parameters. This paper will give an overview of the existing hardware, software and methods used to validate data integrity.
[Filariasis control: entry point for other helminthiasis control programs?].
Boussinesq, M
2006-08-01
Filariasis control programs are based on a decentralized drug distribution strategy known as "community-directed". This strategy could also be applied to the control of schistosomiasis and intestinal nematode infections. Integration of these control programs could be highly cost-effective. However, as a prerequisite for integration, it would be necessary to identify zones where these helminthic infections co-exist, specify the population categories that should receive each medication (ivermectin, albendazole, mebendazole, and praziquantel), check that combined administration of these drugs is safe and ensure that an integrated program would have no detrimental effect on the health care system and on the efficacy of ongoing programs.
Liu, S X; Zou, M S
2018-03-01
The radiation loading on a vibratory finite cylindrical shell is conventionally evaluated through the direct numerical integration (DNI) method. An alternative strategy via the fast Fourier transform algorithm is put forward in this work based on the general expression of radiation impedance. To check the feasibility and efficiency of the proposed method, a comparison with DNI is presented through numerical cases. The results obtained using the present method agree well with those calculated by DNI. More importantly, the proposed calculating strategy can significantly save the time cost compared with the conventional approach of straightforward numerical integration.
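The speed-up comes from recognizing the impedance expression as a convolution-type integral, which the FFT evaluates in O(n log n) rather than the O(n²) of direct quadrature. The sketch below demonstrates the idea on an illustrative smooth kernel; it is not the actual radiation-impedance kernel of a finite cylindrical shell.

```python
import numpy as np

# Illustrative convolution-type integral I(u) = \int k(u - v) f(v) dv,
# standing in for the radiation-impedance integral (the real kernel is
# more involved; only the computational strategy is shown here).
n = 2048
dx = 20.0 / n
x = -10.0 + dx * np.arange(n)        # uniform grid; x[n//2] == 0 exactly
f = np.exp(-x**2)                    # source distribution (illustrative)
k = np.exp(-0.5 * x**2)              # smooth, rapidly decaying kernel

# Direct numerical integration: O(n^2) nested quadrature.
direct = np.array([np.sum(np.exp(-0.5 * (xi - x)**2) * f) * dx for xi in x])

# FFT evaluation: O(n log n) via the convolution theorem; ifftshift
# aligns the kernel's center (x = 0) with index 0 before transforming.
via_fft = np.fft.ifft(np.fft.fft(f) * np.fft.fft(np.fft.ifftshift(k))).real * dx

print("max |direct - FFT| =", np.max(np.abs(direct - via_fft)))
```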
NASA Astrophysics Data System (ADS)
Svenšek, Daniel; Podgornik, Rudolf
2015-09-01
We present and analyze correlation functions of a main-chain polymer nematic in a continuum worm-like chain description for two types of constraints formalized by the tensorial and vectorial conservation laws, both originating in the microscopic chain integrity, i.e., the connectivity of the polymer chains. In particular, our aim is to identify the features of the correlation functions that are most susceptible to the differences between the two constraints. Besides the density and director autocorrelations in both the tensorial and vectorial cases, we calculate also the density-director correlation functions, the latter being a direct signature of the presence of a specific constraint. Its amplitude is connected to the strength of the constraint and is zero if none of the constraints are present, i.e., for a standard non-polymeric nematic. Generally, the correlation functions with the constraints differ substantially from the correlation functions in the non-polymeric case, if the constraints are strong which in practice requires long chains. Moreover, for the tensorial conservation law to be well distinguishable from the vectorial one, the chain persistence length should be much smaller than the total length of the chain, so that hairpins (chain backfolding) are numerous and the polar order is small.
A Forensically Sound Adversary Model for Mobile Devices.
Do, Quang; Martini, Ben; Choo, Kim-Kwang Raymond
2015-01-01
In this paper, we propose an adversary model to facilitate forensic investigations of mobile devices (e.g. Android, iOS and Windows smartphones) that can be readily adapted to the latest mobile device technologies. This is essential given the ongoing and rapidly changing nature of mobile device technologies. An integral principle and significant constraint upon forensic practitioners is that of forensic soundness. Our adversary model specifically considers and integrates the constraints of forensic soundness on the adversary, in our case, a forensic practitioner. One construction of the adversary model is an evidence collection and analysis methodology for Android devices. Using the methodology with six popular cloud apps, we were successful in extracting various information of forensic interest in both the external and internal storage of the mobile device.
Comparative study of flare control laws. [optimal control of b-737 aircraft approach and landing
NASA Technical Reports Server (NTRS)
Nadkarni, A. A.; Breedlove, W. J., Jr.
1979-01-01
A digital 3-D automatic control law was developed to achieve an optimal transition of a B-737 aircraft between various initial glide slope conditions and the desired final touchdown condition. A discrete, time-invariant, optimal, closed-loop control law, presented for a linear regulator problem, was extended to include a system acted upon by a constant disturbance. Two forms of control laws were derived to solve this problem. One method utilized feedback of appropriately defined integral states augmented with the original system equations. The second method formulated the problem as a control variable constraint, and the control variables were augmented with the original system. The control-variable-constraint law yielded better performance than the integral-state feedback law.
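The first method, augmenting the plant with integral states so that a constant disturbance is rejected, can be sketched on a toy system. The scalar plant, weights, and disturbance below are invented for illustration, not the B-737 landing model.

```python
import numpy as np
from scipy.linalg import solve_discrete_are

# Minimal sketch: reject a constant disturbance by augmenting the plant
# with the integral of the state, then solving the discrete LQR problem
# for the augmented system.
A, B = np.array([[1.0]]), np.array([[0.1]])   # x[k+1] = A x + B u + d
d = np.array([0.5])                           # constant disturbance

# Augmented dynamics: integral state z[k+1] = z[k] + x[k].
Aa = np.block([[A, np.zeros((1, 1))], [np.eye(1), np.eye(1)]])
Ba = np.vstack([B, np.zeros((1, 1))])
Q = np.diag([1.0, 0.5])                       # weight x and its integral
R = np.array([[0.1]])

P = solve_discrete_are(Aa, Ba, Q, R)
K = np.linalg.solve(R + Ba.T @ P @ Ba, Ba.T @ P @ Aa)  # u = -K [x; z]

x, z = np.array([1.0]), np.array([0.0])
for _ in range(200):
    u = -K @ np.concatenate([x, z])
    x, z = A @ x + B @ u + d, z + x           # z integrates the old x
print("steady-state x with integral action ~", x)  # driven near zero
```

The integral state forces the only possible equilibrium to be x = 0, so the constant disturbance is absorbed by a steady control offset rather than a steady-state error.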
Constraints and spandrels of interareal connectomes.
Rubinov, Mikail
2016-12-07
Interareal connectomes are whole-brain wiring diagrams of white-matter pathways. Recent studies have identified modules, hubs, module hierarchies and rich clubs as structural hallmarks of these wiring diagrams. An influential current theory postulates that connectome modules are adequately explained by evolutionary pressures for wiring economy, but that the other hallmarks are not explained by such pressures and are therefore less trivial. Here, we use constraint network models to test these postulates in current gold-standard vertebrate and invertebrate interareal-connectome reconstructions. We show that empirical wiring-cost constraints inadequately explain connectome module organization, and that simultaneous module and hub constraints induce the structural byproducts of hierarchies and rich clubs. These byproducts, known as spandrels in evolutionary biology, include the structural substrate of the default-mode network. Our results imply that currently standard connectome characterizations are based on circular analyses or double dipping, and we emphasize an integrative approach to future connectome analyses for avoiding such pitfalls.
NASA Astrophysics Data System (ADS)
Jung-Woon Yoo, John
2016-06-01
Since customer preferences change rapidly, there is a need for design processes with shorter product development cycles. Modularization plays a key role in achieving mass customization, which is crucial in today's competitive global market environments. Standardized interfaces among modularized parts have facilitated computational product design. To incorporate product size and weight constraints during computational design procedures, a mixed integer programming formulation is presented in this article. Product size and weight are two of the most important design parameters, as evidenced by recent smartphone products. This article focuses on the integration of geometric, weight and interface constraints into the proposed mathematical formulation. The formulation generates the optimal selection of components for a target product, which satisfies geometric, weight and interface constraints. The formulation is verified through a case study and experiments are performed to demonstrate the performance of the formulation.
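The structure of such a formulation is easy to illustrate. In the sketch below, exhaustive enumeration stands in for a mixed-integer solver's branch-and-bound, and the component catalogue, limits, and single shared-interface rule are invented for illustration.

```python
import itertools

# Hypothetical catalogue: exactly one option must be chosen per module.
# Each option: (name, volume_cm3, weight_g, interface_type, cost)
catalogue = {
    "battery":   [("B1", 12.0, 45.0, "I2C", 4.0), ("B2", 9.0, 38.0, "SPI", 6.0)],
    "display":   [("D1", 20.0, 30.0, "SPI", 9.0), ("D2", 25.0, 35.0, "I2C", 7.0)],
    "mainboard": [("M1", 15.0, 25.0, "SPI", 12.0), ("M2", 14.0, 22.0, "I2C", 15.0)],
}
MAX_VOLUME, MAX_WEIGHT = 50.0, 100.0   # product size and weight limits

best = None
# Enumeration stands in for a real MIP solver; the constraint structure
# (geometric, weight, interface compatibility) is the same.
for combo in itertools.product(*catalogue.values()):
    volume = sum(c[1] for c in combo)
    weight = sum(c[2] for c in combo)
    same_interface = len({c[3] for c in combo}) == 1   # interface constraint
    if volume <= MAX_VOLUME and weight <= MAX_WEIGHT and same_interface:
        cost = sum(c[4] for c in combo)
        if best is None or cost < best[0]:
            best = (cost, [c[0] for c in combo])

print("optimal selection (cost, components):", best)
```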
A tool for efficient, model-independent management optimization under uncertainty
White, Jeremy; Fienen, Michael N.; Barlow, Paul M.; Welter, Dave E.
2018-01-01
To fill a need for risk-based environmental management optimization, we have developed PESTPP-OPT, a model-independent tool for resource management optimization under uncertainty. PESTPP-OPT solves a sequential linear programming (SLP) problem and optionally implements efficient, “on-the-fly” (without user intervention) first-order, second-moment (FOSM) uncertainty techniques to estimate model-derived constraint uncertainty. Combined with a user-specified risk value, the constraint uncertainty estimates are used to form chance constraints for the SLP solution process, so that any optimal solution includes contributions from model input and observation uncertainty. In this way, a “single answer” that includes uncertainty is yielded from the modeling analysis. PESTPP-OPT uses the familiar PEST/PEST++ model interface protocols, which makes it widely applicable to many modeling analyses. The use of PESTPP-OPT is demonstrated with a synthetic, integrated surface-water/groundwater model. The function and implications of chance constraints for this synthetic model are discussed.
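The essence of a FOSM-based chance constraint is that the user-specified risk value converts constraint uncertainty into a deterministic tightening of the right-hand side. Below is a minimal sketch with invented response coefficients and an assumed FOSM standard deviation; it is not the PESTPP-OPT implementation.

```python
import numpy as np
from scipy.optimize import linprog
from scipy.stats import norm

# Toy problem: maximize total pumping c.x subject to a model-derived
# drawdown constraint a.x <= b whose value carries FOSM-estimated
# uncertainty sigma (all numbers hypothetical).
c = np.array([-1.0, -1.0])       # linprog minimizes, so negate to maximize
a = np.array([[0.8, 1.2]])       # unit drawdown responses per well
b_nominal = 10.0                 # allowed drawdown
sigma = 1.5                      # FOSM-style constraint std dev (assumed)

for risk in (0.50, 0.95):        # required probability the constraint holds
    # Chance constraint P(a.x <= b_nominal) >= risk becomes the
    # deterministic inequality a.x <= b_nominal - z(risk) * sigma.
    b_eff = b_nominal - norm.ppf(risk) * sigma
    res = linprog(c, A_ub=a, b_ub=[b_eff], bounds=[(0, 8), (0, 8)])
    print(f"risk={risk:.2f}: total rate = {-res.fun:.2f}, x = {res.x}")
```

At risk = 0.50 the tightening vanishes (z = 0) and the nominal optimum is recovered; higher risk values trade pumping rate for reliability.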
Quantization of Simple Parametrized Systems
NASA Astrophysics Data System (ADS)
Ruffini, Giulio
1995-01-01
I study the canonical formulation and quantization of some simple parametrized systems using Dirac's formalism and the Becchi-Rouet-Stora-Tyutin (BRST) extended phase space method. These systems include the parametrized particle and minisuperspace. Using Dirac's formalism I first analyze for each case the construction of the classical reduced phase space. There are two separate features of these systems that may make this construction difficult: (a) Because of the boundary conditions used, the actions are not gauge invariant at the boundaries. (b) The constraints may have a disconnected solution space. The relativistic particle and minisuperspace have such complicated constraints, while the non-relativistic particle displays only the first feature. I first show that a change of gauge fixing is equivalent to a canonical transformation in the reduced phase space, thus resolving the problems associated with the first feature above. Then I consider the quantization of these systems using several approaches: Dirac's method, Dirac-Fock quantization, and the BRST formalism. In the cases of the relativistic particle and minisuperspace I consider first the quantization of one branch of the constraint at a time and then discuss the backgrounds in which it is possible to quantize both branches simultaneously. I motivate and define the inner product, and obtain, for example, the Klein-Gordon inner product for the relativistic case. Then I show how to construct phase space path integral representations for amplitudes in these approaches -- the Batalin-Fradkin-Vilkovisky (BFV) and the Faddeev path integrals -- from which one can then derive the path integrals in coordinate space -- the Faddeev-Popov path integral and the geometric path integral. In particular I establish the connection between the Hilbert space representation and the range of the lapse in the path integrals. I also examine the class of paths that contribute in the path integrals and how they affect space-time covariance, concluding that it is consistent to take paths that move forward in time only when there is no electric field. The key elements in this analysis are the space-like paths and the behavior of the action under the non-trivial (Z_2) element of the reparametrization group.
40 CFR 75.59 - Certification, quality assurance, and quality control record provisions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... and the run average); (B) The raw data and results for all required pre-test, post-test, pre-run and...-day calibration error tests, all daily system integrity checks (Hg monitors, only), and all off-line calibration demonstrations, including any follow-up tests after corrective action: (i) Component-system...
Improving NAVFAC's total quality management of construction drawings with CLIPS
NASA Technical Reports Server (NTRS)
Antelman, Albert
1991-01-01
A diagnostic expert system to improve the quality of Naval Facilities Engineering Command (NAVFAC) construction drawings and specifications is described. The C Language Integrated Production System (CLIPS) and computer-aided design layering standards are used in an expert system to check and coordinate construction drawings and specifications to eliminate errors and omissions.
Foursquare: A Health Education Specialist Checks-In--A Commentary
ERIC Educational Resources Information Center
Haithcox-Dennis, Melissa
2011-01-01
More and more, health education specialists are integrating technology into their work. Whereas most are familiar with social media sites like Facebook, Twitter and LinkedIn, one relatively new form of social media, location based services (LBS), may be less familiar. Developed in 2000, LBS are software applications that are accessible from a…
HPC Software Stack Testing Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garvey, Cormac
The HPC Software stack testing framework (hpcswtest) is used in the INL Scientific Computing Department to test the basic sanity and integrity of the HPC software stack (compilers, MPI, numerical libraries and applications) and to quickly discover hard failures; as a by-product, it indirectly checks the HPC infrastructure (network, PBS and licensing servers).
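In the same spirit, a minimal sanity-check harness can probe whether core stack components exist and respond. The tool list below is hypothetical; a real framework such as hpcswtest also compiles and runs MPI and library smoke tests.

```python
import shutil
import subprocess

# Hypothetical checks: each entry is (label, command). A real stack
# tester would cover compilers, MPI launchers, and licensed tools.
CHECKS = [
    ("gcc", ["gcc", "--version"]),
    ("python", ["python3", "--version"]),
]

def run_check(name, cmd):
    """Return (name, status): MISSING if absent, FAIL on error/timeout."""
    if shutil.which(cmd[0]) is None:
        return name, "MISSING"
    try:
        subprocess.run(cmd, capture_output=True, timeout=30, check=True)
        return name, "OK"
    except (subprocess.CalledProcessError, subprocess.TimeoutExpired):
        return name, "FAIL"

for name, cmd in CHECKS:
    print("{:10s} {}".format(*run_check(name, cmd)))
```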
Think Inside the Box. Integrating Math in Your Classroom
ERIC Educational Resources Information Center
Naylor, Michael
2005-01-01
This brief article describes a few entertaining math "puzzles" that are easy to use with students at any grade level and with any operation. Not only do these puzzles help provide practice with facts and operations, they are also self-checking and may lead to some interesting big ideas in algebra.
Efficiency in the Community College Sector: Stochastic Frontier Analysis
ERIC Educational Resources Information Center
Agasisti, Tommaso; Belfield, Clive
2017-01-01
This paper estimates technical efficiency scores across the community college sector in the United States. Using stochastic frontier analysis and data from the Integrated Postsecondary Education Data System for 2003-2010, we estimate efficiency scores for 950 community colleges and perform a series of sensitivity tests to check for robustness. We…
Diagrammar in classical scalar field theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cattaruzza, E., E-mail: Enrico.Cattaruzza@gmail.com; Gozzi, E., E-mail: gozzi@ts.infn.it; INFN, Sezione di Trieste
2011-09-15
In this paper we analyze perturbatively a gφ⁴ classical field theory with and without temperature. In order to do that, we make use of a path-integral approach developed some time ago for classical theories. It turns out that the diagrams appearing at the classical level are many more than at the quantum level due to the presence of extra auxiliary fields in the classical formalism. We shall show that a universal supersymmetry present in the classical path-integral mentioned above is responsible for the cancelation of various diagrams. The same supersymmetry allows the introduction of super-fields and super-diagrams which considerably simplify the calculations and make the classical perturbative calculations almost 'identical' formally to the quantum ones. Using the super-diagrams technique, we develop the classical perturbation theory up to third order. We conclude the paper with a perturbative check of the fluctuation-dissipation theorem. Highlights: We provide the Feynman diagrams of perturbation theory for a classical field theory. We give a super-formalism which links the quantum diagrams to the classical ones. We check perturbatively the fluctuation-dissipation theorem.
Capturing, using, and managing quality assurance knowledge for shuttle post-MECO flight design
NASA Technical Reports Server (NTRS)
Peters, H. L.; Fussell, L. R.; Goodwin, M. A.; Schultz, Roger D.
1991-01-01
Ascent initialization values used by the Shuttle's onboard computer for nominal and abort mission scenarios are verified by a six degrees of freedom computer simulation. The procedure that the Ascent Post Main Engine Cutoff (Post-MECO) group uses to perform quality assurance (QA) of the simulation is time consuming. Also, the QA data (checklists and associated rationale), though known by the group members, are not sufficiently documented, hindering transfer of knowledge and problem resolution. A new QA procedure which retains the current high level of integrity while reducing the time required to perform QA is needed to support the increasing Shuttle flight rate. Documenting the knowledge is also needed to increase its availability for training and problem resolution. To meet these needs, a knowledge capture process, embedded into the group activities, was initiated to verify the existing QA checks, define new ones, and document all rationale. The resulting checks were automated in a conventional software program to achieve the desired standardization, integrity, and time reduction. A prototype electronic knowledge base was developed with HyperCard on the Macintosh to serve as a knowledge capture tool and data storage.
Galileo: The Added Value for Integrity in Harsh Environments.
Borio, Daniele; Gioia, Ciro
2016-01-16
Global navigation satellite system (GNSS)-based navigation is a challenging task in signal-degraded environments where GNSS signals are distorted by multipath and attenuated by fading effects: the navigation solution may be inaccurate or unavailable. A possible approach to improve accuracy and availability is the joint use of measurements from different GNSSs and quality check algorithms; this approach is investigated here using live GPS and Galileo signals. A modified receiver autonomous integrity monitoring (RAIM) algorithm, including geometry and separability checks, is proposed to detect and exclude erroneous measurements: the multi-constellation approach provides redundant measurements, and RAIM exploits them to exclude distorted observations. The synergy between combined GPS/Galileo navigation and RAIM is analyzed using live data; the performance is compared to the accuracy and availability of a GPS-only solution. The tests performed demonstrate that the methods developed are effective techniques for GNSS-based navigation in signal-degraded environments. The joint use of the multi-constellation approach and of modified RAIM algorithms improves the performance of the navigation system in terms of both accuracy and availability.
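A basic residual-based RAIM cycle (weighted least squares, a global chi-square test on the residuals, then exclusion of the most suspect measurement) can be sketched as follows. The geometry, noise levels, and thresholds are synthetic, and the paper's modified algorithm adds geometry and separability checks beyond this simplification.

```python
import numpy as np
from scipy.stats import chi2

# Synthetic redundant multi-GNSS geometry with one injected fault.
rng = np.random.default_rng(2)
n_meas, n_states = 9, 4
G = rng.normal(size=(n_meas, n_states))       # linearized geometry matrix
x_true = np.array([3.0, -1.0, 2.0, 0.5])
y = G @ x_true + rng.normal(0, 0.5, n_meas)   # pseudorange residual vector
y[4] += 8.0                                   # faulty measurement (distorted)

def lsq_with_test(G, y, sigma=0.5, alpha=0.01):
    """Least squares solution plus a global chi-square residual test."""
    x, *_ = np.linalg.lstsq(G, y, rcond=None)
    r = y - G @ x
    t = r @ r / sigma**2
    dof = G.shape[0] - G.shape[1]
    return x, r, t > chi2.ppf(1 - alpha, dof)  # True => fault detected

x, r, fault = lsq_with_test(G, y)
if fault:
    worst = int(np.argmax(np.abs(r)))          # separability check simplified
    keep = [i for i in range(len(y)) if i != worst]
    x, r, fault = lsq_with_test(G[keep], y[keep])
    print(f"excluded measurement {worst}; fault after exclusion: {fault}")
print("estimated state:", np.round(x, 2))
```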
Replica approach to mean-variance portfolio optimization
NASA Astrophysics Data System (ADS)
Varga-Haszonits, Istvan; Caccioli, Fabio; Kondor, Imre
2016-12-01
We consider the problem of mean-variance portfolio optimization for a generic covariance matrix subject to the budget constraint and the constraint for the expected return, with the application of the replica method borrowed from the statistical physics of disordered systems. We find that the replica symmetry of the solution does not need to be assumed, but emerges as the unique solution of the optimization problem. We also check the stability of this solution and find that the eigenvalues of the Hessian are positive for r = N/T < 1, where N is the dimension of the portfolio and T the length of the time series used to estimate the covariance matrix. At the critical point r = 1 a phase transition takes place. The out-of-sample estimation error blows up at this point as 1/(1 - r), independently of the covariance matrix or the expected return, displaying the universality not only of the critical exponent but also of the critical point. As a conspicuous illustration of the dangers of in-sample estimates, the optimal in-sample variance is found to vanish at the critical point, inversely proportionally to the divergent estimation error.
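The divergence of the out-of-sample error as r → 1 is easy to observe numerically. With a true covariance equal to the identity, the estimated minimum-variance portfolio under the budget constraint has the closed form w ∝ Σ̂⁻¹1, and its true variance blows up as the sample covariance becomes ill-conditioned. A small sketch:

```python
import numpy as np

# Minimum-variance portfolio under the budget constraint, estimated
# from T samples of N i.i.d. standard-normal returns (true Sigma = I,
# so the true optimum is the equal-weight portfolio).
rng = np.random.default_rng(3)
N = 50
for T in (500, 100, 60):                  # r = N/T approaches 1
    X = rng.normal(size=(T, N))
    S = np.cov(X, rowvar=False)           # sample covariance
    ones = np.ones(N)
    w = np.linalg.solve(S, ones)
    w /= ones @ w                         # budget constraint: sum(w) = 1
    in_sample = w @ S @ w                 # optimistic in-sample variance
    true_var = w @ w                      # actual variance (Sigma = I)
    print(f"r={N/T:.2f}: in-sample var={in_sample:.4f}, "
          f"true var={true_var:.4f}, ratio={true_var/in_sample:.1f}")
```

As r grows, the in-sample variance shrinks toward zero while the true variance of the estimated portfolio grows, reproducing the advertised divergence.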
VARED: Verification and Analysis of Requirements and Early Designs
NASA Technical Reports Server (NTRS)
Badger, Julia; Throop, David; Claunch, Charles
2014-01-01
Requirements are a part of every project life cycle; everything going forward in a project depends on them. Good requirements are hard to write; there are few useful tools to test, verify, or check them, and it is difficult to properly marry them to the subsequent design, especially if the requirements are written in natural language. In fact, the inconsistencies and errors in the requirements, along with the difficulty in finding these errors, contribute greatly to the cost of the testing and verification stage of flight software projects [1]. Large projects tend to have several thousand requirements written at various levels by different groups of people. The design process is distributed, and a lack of widely accepted standards for requirements often results in a product that varies widely in style and quality. A simple way to improve this would be to standardize the design process using a set of tools and widely accepted requirements design constraints. The difficulty with this approach is finding the appropriate constraints and tools. Common complaints about the available tools concern their ease of use, functionality, and feature sets. Also, although preferable, it is rare that these tools are capable of testing the quality of the requirements.
This map service contains data from aerial radiological surveys of 41 potential uranium mining areas (1,144 square miles) within the Navajo Nation that were conducted during the period from October 1994 through October 1999. The US Environmental Protection Agency (USEPA) Region 9 funded the surveys and the US Department of Energy (USDOE) Remote Sensing Laboratory (RSL) in Las Vegas, Nevada conducted the aerial surveys. The aerial survey data were used to characterize the overall radioactivity and excess Bismuth 214 levels within the surveyed areas. This US EPA Region 9 web service contains the following map layers: Total Terrestrial Gamma Activity Polygons, Total Terrestrial Gamma Activity Contours, Excess Bismuth 214 Contours, Excess Bismuth 214 Polygons, Flight Areas. Full FGDC metadata records for each layer can be found by clicking the layer name at the web service endpoint and viewing the layer description. Security Classification: Public. Access Constraints: None. Use Constraints: None. Please check sources, scale, accuracy, currentness and other available information. Please confirm that you are using the most recent copy of both data and metadata. Acknowledgement of the EPA would be appreciated.
Optimal guidance law development for an advanced launch system
NASA Technical Reports Server (NTRS)
Calise, Anthony J.; Hodges, Dewey H.
1990-01-01
A regular perturbation analysis is presented. Closed-loop simulations were performed with a first order correction including all of the atmospheric terms. In addition, a method was developed for independently checking the accuracy of the analysis and the rather extensive programming required to implement the complete first order correction with all of the aerodynamic effects included. This amounted to developing an equivalent Hamiltonian computed from the first order analysis. A second order correction was also completed for the neglected spherical Earth and back-pressure effects. Finally, an analysis was begun on a method for dealing with control inequality constraints. The results on including higher order corrections do show some improvement for this application; however, it is not known at this stage if significant improvement will result when the aerodynamic forces are included. The weak formulation for solving optimal problems was extended in order to account for state inequality constraints. The formulation was tested on three example problems and numerical results were compared to the exact solutions. Development of a general purpose computational environment for the solution of a large class of optimal control problems is under way. An example, along with the necessary input and the output, is given.
Application of Sequential Quadratic Programming to Minimize Smart Active Flap Rotor Hub Loads
NASA Technical Reports Server (NTRS)
Kottapalli, Sesi; Leyland, Jane
2014-01-01
In an analytical study, SMART active flap rotor hub loads have been minimized using nonlinear programming constrained optimization methodology. The recently developed NLPQLP system (Schittkowski, 2010) that employs Sequential Quadratic Programming (SQP) as its core algorithm was embedded into a driver code (NLP10x10) specifically designed to minimize active flap rotor hub loads (Leyland, 2014). Three types of practical constraints on the flap deflections have been considered. To validate the current application, two other optimization methods have been used: i) the standard, linear unconstrained method, and ii) the nonlinear Generalized Reduced Gradient (GRG) method with constraints. The new software code NLP10x10 has been systematically checked out. It has been verified that NLP10x10 is functioning as desired. The following are briefly covered in this paper: relevant optimization theory; implementation of the capability of minimizing a metric of all, or a subset, of the hub loads as well as the capability of using all, or a subset, of the flap harmonics; and finally, solutions for the SMART rotor. The eventual goal is to implement NLP10x10 in a real-time wind tunnel environment.
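The shape of such a problem (a quadratic hub-load metric in the flap-harmonic amplitudes, minimized by an SQP method under deflection limits) can be sketched with SciPy's SLSQP solver. The sensitivity matrix, baseline loads, and bounds below are invented; this is not NLP10x10, NLPQLP, or the SMART rotor model.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
n = 6                                     # flap-harmonic amplitudes
M = rng.normal(size=(8, n))               # load sensitivities (invented)
h0 = rng.normal(size=8) * 3.0             # baseline hub loads (invented)

def hub_load_metric(x):
    """Quadratic metric of the residual hub loads after flap actuation."""
    return float(np.sum((h0 + M @ x) ** 2))

# One practical constraint type: bound each harmonic amplitude,
# standing in for limits on the physical flap deflections.
res = minimize(hub_load_metric, x0=np.zeros(n), method="SLSQP",
               bounds=[(-1.0, 1.0)] * n)
print("optimal harmonics:", np.round(res.x, 3))
print(f"metric: {hub_load_metric(np.zeros(n)):.2f} -> {res.fun:.2f}")
```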
Barriers to oral health care amongst different social classes in India.
Garcha, V; Shetiya, S H; Kakodkar, P
2010-09-01
To investigate and compare the influence of social and cultural factors as access barriers to oral health care amongst people from various social classes. A cross-sectional survey in Pimpri was conducted using a pilot-tested 15-item structured, close-ended and self-administered questionnaire. Two hundred and fifty people aged 35-45 years (50 participants each in five social classes as per the British Registrar General's classification of occupation) were selected. The chi-square test was applied to check statistical differences between social classes at the 5% level of significance. Overall, it was observed that irrespective of the social class difference, 88% of participants wished to seek only expert/professional advice for dental treatment. Unavailability of services on Sunday (63%), going to the dentist only when in pain (57%), trying self care or home remedies (54%), inadequate government policies (50%), and budgetary constraints (40%) were among the major access barriers. Statistically significant differences in the access barriers among the social classes were found related to: inadequate government policies, budgetary constraints, appointment schedules, far-off located clinics, and myths and fear about dental treatment. Social and cultural factors act as access barriers to oral health care, and social class differences have a significant influence on the access barriers.
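The statistical test used here is a standard chi-square test of independence on a class-by-response contingency table. A sketch with invented counts (the study had 50 participants per class):

```python
from scipy.stats import chi2_contingency

# Hypothetical 5x2 contingency table: counts of participants in each
# social class (rows) reporting / not reporting budgetary constraints
# as an access barrier.
table = [
    [12, 38],   # class I
    [15, 35],   # class II
    [18, 32],   # class III
    [26, 24],   # class IV
    [29, 21],   # class V
]
chi2_stat, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2_stat:.2f}, dof = {dof}, p = {p_value:.4f}")
print("significant at the 5% level:", p_value < 0.05)
```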
Societal constraints related to environmental remediation and decommissioning programmes.
Perko, Tanja; Monken-Fernandes, Horst; Martell, Meritxell; Zeleznik, Nadja; O'Sullivan, Patrick
2017-06-20
The decisions related to decommissioning or environmental remediation projects (D/ER) cannot be isolated from the socio-political and cultural environment. Experiences of the IAEA Member States point out the importance of giving due attention to the societal aspects in project planning and implementation. The purpose of this paper is threefold: i) to systematically review societal constraints that some organisations in different IAEA Member States encounter when implementing D/ER programmes, ii) to identify different approaches to overcome these constraints and iii) to collect examples of existing practices related to the integration of societal aspects in D/ER programmes worldwide. The research was conducted in the context of the IAEA project Constraints to Decommissioning and Environmental Remediation (CIDER). The research results show that societal constraints arise mostly as a result of the different perceptions, attitudes, opinions and concerns of stakeholders towards the risks and benefits of D/ER programmes and due to the lack of stakeholder involvement in planning. There are different approaches to address these constraints, however all approaches have common points: early involvement, respect for different views, mutual understanding and learning. These results are relevant for all on-going and planned D/ER programmes. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Heremans, Stien; Suykens, Johan A. K.; Van Orshoven, Jos
2016-02-01
To be physically interpretable, sub-pixel land cover fractions or abundances should fulfill two constraints, the Abundance Non-negativity Constraint (ANC) and the Abundance Sum-to-one Constraint (ASC). This paper focuses on the effect of imposing these constraints onto the MultiLayer Perceptron (MLP) for a multi-class sub-pixel land cover classification of a time series of low resolution MODIS-images covering the northern part of Belgium. Two constraining modes were compared, (i) an in-training approach that uses 'softmax' as the transfer function in the MLP's output layer and (ii) a post-training approach that linearly rescales the outputs of the unconstrained MLP. Our results demonstrate that the pixel-level prediction accuracy is markedly increased by the explicit enforcement, both in-training and post-training, of the ANC and the ASC. For aggregations of pixels (municipalities), the constrained perceptrons perform at least as well as their unconstrained counterparts. Although the difference in performance between the in-training and post-training approach is small, we recommend the former for integrating the fractional abundance constraints into MLPs meant for sub-pixel land cover estimation, regardless of the targeted level of spatial aggregation.
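The two constraining modes can be written down compactly: softmax enforces the ANC and ASC by construction during training, while the post-training mode repairs unconstrained outputs afterwards. In the sketch below, the clip-and-normalize step is one plausible reading of the paper's linear rescaling, not its exact procedure.

```python
import numpy as np

def softmax(z):
    """In-training mode: a softmax output layer satisfies the ANC
    (non-negativity) and the ASC (sum-to-one) by construction."""
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def post_rescale(raw, eps=1e-12):
    """Post-training mode: clip negative abundances, then linearly
    rescale each pixel's fractions so they sum to one (an assumed
    reading of the paper's linear rescaling)."""
    f = np.clip(raw, 0.0, None)
    return f / np.maximum(f.sum(axis=-1, keepdims=True), eps)

raw = np.array([[0.6, 0.5, -0.1],   # unconstrained MLP outputs for
                [0.2, 0.2, 0.1]])   # two pixels, three cover classes
print("softmax:\n", np.round(softmax(raw), 3))
print("post-rescaled:\n", np.round(post_rescale(raw), 3))
```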
Merits and limitations of optimality criteria method for structural optimization
NASA Technical Reports Server (NTRS)
Patnaik, Surya N.; Guptill, James D.; Berke, Laszlo
1993-01-01
The merits and limitations of the optimality criteria (OC) method for the minimum weight design of structures subjected to multiple load conditions under stress, displacement, and frequency constraints were investigated by examining several numerical examples. The examples were solved utilizing the Optimality Criteria Design Code that was developed for this purpose at NASA Lewis Research Center. This OC code incorporates OC methods available in the literature with generalizations for stress constraints, fully utilized design concepts, and hybrid methods that combine both techniques. Salient features of the code include multiple choices for Lagrange multiplier and design variable update methods, design strategies for several constraint types, variable linking, displacement and integrated force method analyzers, and analytical and numerical sensitivities. The performance of the OC method, on the basis of the examples solved, was found to be satisfactory for problems with few active constraints or with small numbers of design variables. For problems with large numbers of behavior constraints and design variables, the OC method appears to follow a subset of active constraints that can result in a heavier design. The computational efficiency of OC methods appears to be similar to some mathematical programming techniques.
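At the core of OC methods for stress constraints is the stress-ratio resizing rule: scale each member area by the ratio of its stress to the allowable stress. The two-member, statically determinate example below is illustrative only; there the member forces are fixed, so the rule converges immediately, whereas redundant structures require reanalysis on every iteration.

```python
import numpy as np

# Stress-ratio (fully stressed design) resizing on a two-member,
# statically determinate truss; all numbers are invented.
forces = np.array([1.0e4, 2.5e4])     # member axial forces, N
allowable = 150.0e6                   # allowable stress, Pa
lengths = np.array([1.0, 1.5])        # member lengths, m
rho = 2700.0                          # aluminum density, kg/m^3
areas = np.array([1.0e-4, 1.0e-4])    # initial cross-section areas, m^2

for _ in range(5):
    stress = forces / areas
    areas = areas * (stress / allowable)   # OC resizing rule

print("final areas [m^2]:", areas)
print("final stresses [Pa]:", forces / areas)   # fully stressed
print(f"weight = {rho * np.sum(areas * lengths):.3f} kg")
```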
NASA Astrophysics Data System (ADS)
Parkyn, Stephanie M.; Smith, Brian J.
2011-09-01
Biodiversity goals are becoming increasingly important in stream restoration. Typical models of stream restoration are based on the assumption that if habitat is restored then species will return and ecological processes will re-establish. However, a range of constraints at different scales can affect restoration success. Much of the research in stream restoration ecology has focused on habitat constraints, namely the in-stream and riparian conditions required to restore biota. Dispersal constraints are also integral to determining the timescales, trajectory and potential endpoints of a restored ecosystem. Dispersal is both a means of organism recolonization of restored sites and a vital ecological process that maintains viable populations. We review knowledge of dispersal pathways and explore the factors influencing stream invertebrate dispersal. From empirical and modeling studies of restoration in warm-temperate zones of New Zealand, we make predictions about the timescales of stream ecological restoration under differing levels of dispersal constraints. This process of constraints identification and timescale prediction is proposed as a practical step for resource managers to prioritize and appropriately monitor restoration sites and highlights that in some instances, natural recolonization and achievement of biodiversity goals may not occur.
Rivera, Claudia
2014-01-01
This paper analyses the perceptions of disaster risk reduction (DRR) practitioners concerning the on-going integration of climate change adaptation (CCA) into their practices in urban contexts in Nicaragua. Understanding their perceptions is important as this will provide information on how this integration can be improved. Exploring the perceptions of practitioners in Nicaragua is important as the country has a long history of disasters, and practitioners have been developing the current DRR planning framework for more than a decade. The analysis is based on semi-structured interviews designed to collect information about practitioners’ understanding of: (a) CCA, (b) the current level of integration of CCA into DRR and urban planning, (c) the opportunities and constraints of this integration, and (d) the potential to adapt cities to climate change. The results revealed that practitioners’ perception is that the integration of CCA into their practice is at an early stage, and that they need to improve their understanding of CCA in terms of a development issue. Three main constraints on improved integration were identified: (a) a recognized lack of understanding of CCA, (b) insufficient guidance on how to integrate it, and (c) the limited opportunities to integrate it into urban planning due to a lack of instruments and capacity in this field. Three opportunities were also identified: (a) practitioners’ awareness of the need to integrate CCA into their practices, (b) the robust structure of the DRR planning framework in the country, which provides a suitable channel for facilitating integration, and (c) the fact that CCA is receiving more attention and financial and technical support from the international community.
48 CFR 1401.603-3 - Appointment.
Code of Federal Regulations, 2010 CFR
2010-10-01
... the bureau/office procedures within the constraints of DOI Integrated Charge Card Program Policy Manual located at http://www.doi.gov/pam/chargecard. Additional guidance is available in the GSA Smart...
48 CFR 1401.603-3 - Appointment.
Code of Federal Regulations, 2011 CFR
2011-10-01
... the bureau/office procedures within the constraints of DOI Integrated Charge Card Program Policy Manual located at http://www.doi.gov/pam/chargecard. Additional guidance is available in the GSA Smart...
Geographical determination of an optimal network of landing sites for Hermes
NASA Astrophysics Data System (ADS)
Goester, J. F.
Once its mission is done, Hermes will perform a deorbit burn, then will pilot towards a specially equipped landing site. Since the atmospheric re-entry corridor is limited (the maximum cross-range is 1500 km), Hermes must be on orbits passing near the runway. For safety reasons, one return opportunity per revolution is required, so it may be necessary to consider several landing sites and to equip them. The proposed method makes it possible to find, easily and quickly, the geographic areas yielding optimal solutions in terms of the number of runways; among these solutions, already existing sites can be chosen, subject to additional meteorological, political, and economic constraints.
NASA Astrophysics Data System (ADS)
Gao, Xiang; Zhang, Shi-Bin; Chang, Yan; Yang, Fan; Zhang, Yan
2018-02-01
Recently, Li et al. (Int. J. Theor. Phys. 55, 1710-1718, 2016) proposed a Quantum Private Comparison (QPC) protocol based on entanglement swapping between a three-particle W-class state and a Bell state. Two parties can check whether their secret information is equal or not with the help of a semi-honest third party (TP). However, in this paper we point out that this kind of semi-honest TP is unreasonable. If we relax the constraint of the semi-honest TP, then by using a fake signal attack, TP can learn the whole secret information illegally. Finally, we give our improvement, which makes this protocol more secure.
NASA Astrophysics Data System (ADS)
Adem, Abdullahi Rashid; Moawad, Salah M.
2018-05-01
In this paper, the steady-state equations of ideal magnetohydrodynamic incompressible flows in axisymmetric domains are investigated. These flows are governed by a second-order elliptic partial differential equation as a type of generalized Grad-Shafranov equation. The problem of finding exact equilibria to the full governing equations in the presence of incompressible mass flows is considered. Two different types of constraints on position variables are presented to construct exact solution classes for several nonlinear cases of the governing equations. Some of the obtained results are checked for their applications to magnetic confinement plasma. Besides, they cover many previous configurations and include new considerations about the nonlinearity of magnetic flux stream variables.
Towards Timed Automata and Multi-Agent Systems
NASA Technical Reports Server (NTRS)
Hutzler, G.; Klaudel, H.; Wang, D. Y.
2004-01-01
The design of reactive systems must comply with logical correctness (the system does what it is supposed to do) and timeliness (the system has to satisfy a set of temporal constraints) criteria. In this paper, we propose a global approach for the design of adaptive reactive systems, i.e., systems that dynamically adapt their architecture depending on the context. We use the timed automata formalism for the design of the agents' behavior. This allows the properties of the system (regarding logical correctness and timeliness) to be evaluated beforehand, thanks to model-checking and simulation techniques. This model is enhanced with tools that we developed for the automatic generation of code, making it possible to produce very quickly a running multi-agent prototype satisfying the properties of the model.
Integrable Rosochatius deformations of the restricted soliton flows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou Ruguang
2007-10-15
A method to construct integrable Rosochatius deformations of the restricted soliton flows in the setup of Lax formulation is presented. The integrable Rosochatius deformations of the restricted soliton flows such as the restricted Ablowitz-Kaup-Newell-Segur flow, the restricted Tu-Meng flow, the restricted Tu flow with Neumann-type constraints, and the restricted modified Korteweg-de Vries flow, together with their Lax representations, are presented. In addition, a Lax representation of the Jacobi-Rosochatius system is obtained.
2013-07-22
HOUSTON - JSC2013e068344 - NASA astronaut Randy Bresnik gets into position in The Boeing Company's CST-100 spacecraft for a fit check evaluation at the company's Houston Product Support Center. Bresnik's fit check will help evaluate a crew's maneuverability in the spacecraft and test communications. Boeing's CST-100 is being designed to transport crew members or a mix of crew and cargo to low-Earth-orbit destinations. The evaluation is part of the ongoing work supporting Boeing's funded Space Act Agreement with NASA's Commercial Crew Program, or CCP, during the agency's Commercial Crew Integrated Capability, or CCiCap, initiative. CCiCap is intended to make commercial human spaceflight services available for government and commercial customers. To learn more about CCP, visit http://www.nasa.gov/commercialcrew. Photo credit: NASA/Robert Markowitz
2013-07-22
HOUSTON - JSC2013e068317 - NASA astronaut Serena Aunon exits The Boeing Company's CST-100 spacecraft following a fit check evaluation at the company's Houston Product Support Center. Aunon's fit check will help evaluate a crew's maneuverability in the spacecraft and test communications. Boeing's CST-100 is being designed to transport crew members or a mix of crew and cargo to low-Earth-orbit destinations. The evaluation is part of the ongoing work supporting Boeing's funded Space Act Agreement with NASA's Commercial Crew Program, or CCP, during the agency's Commercial Crew Integrated Capability, or CCiCap, initiative. CCiCap is intended to make commercial human spaceflight services available for government and commercial customers. To learn more about CCP, visit http://www.nasa.gov/commercialcrew. Photo credit: NASA/Robert Markowitz
2013-07-22
HOUSTON - JSC2013e068269 - NASA astronaut Serena Aunon prepares to enter The Boeing Company's CST-100 spacecraft for a fit check evaluation at the company's Houston Product Support Center. Aunon's fit check will help evaluate a crew's maneuverability in the spacecraft and test communications. Boeing's CST-100 is being designed to transport crew members or a mix of crew and cargo to low-Earth-orbit destinations. The evaluation is part of the ongoing work supporting Boeing's funded Space Act Agreement with NASA's Commercial Crew Program, or CCP, during the agency's Commercial Crew Integrated Capability, or CCiCap, initiative. CCiCap is intended to make commercial human spaceflight services available for government and commercial customers. To learn more about CCP, visit http://www.nasa.gov/commercialcrew. Photo credit: NASA/Robert Markowitz
2013-07-22
HOUSTON - JSC2013e068333 - NASA astronaut Randy Bresnik prepares to enter The Boeing Company's CST-100 spacecraft for a fit check evaluation at the company's Houston Product Support Center. Bresnik's fit check will help evaluate a crew's maneuverability in the spacecraft and test communications. Boeing's CST-100 is being designed to transport crew members or a mix of crew and cargo to low-Earth-orbit destinations. The evaluation is part of the ongoing work supporting Boeing's funded Space Act Agreement with NASA's Commercial Crew Program, or CCP, during the agency's Commercial Crew Integrated Capability, or CCiCap, initiative. CCiCap is intended to make commercial human spaceflight services available for government and commercial customers. To learn more about CCP, visit http://www.nasa.gov/commercialcrew. Photo credit: NASA/Robert Markowitz
2013-07-22
HOUSTON - JSC2013e068260 - NASA astronaut Serena Aunon suits up for a fit check evaluation of The Boeing Company's CST-100 spacecraft at the company's Houston Product Support Center. Aunon's fit check will help evaluate a crew's maneuverability in the spacecraft and test communications. Boeing's CST-100 is being designed to transport crew members or a mix of crew and cargo to low-Earth-orbit destinations. The evaluation is part of the ongoing work supporting Boeing's funded Space Act Agreement with NASA's Commercial Crew Program, or CCP, during the agency's Commercial Crew Integrated Capability, or CCiCap, initiative. CCiCap is intended to make commercial human spaceflight services available for government and commercial customers. To learn more about CCP, visit http://www.nasa.gov/commercialcrew. Photo credit: NASA/Robert Markowitz
Prototype data terminal: Multiplexer/demultiplexer
NASA Technical Reports Server (NTRS)
Leck, D. E.; Goodwin, J. E.
1972-01-01
The design and operation of a quad redundant data terminal and a multiplexer/demultiplexer (MDU) are described. The most unique feature is the design of the quad redundant data terminal; this is one of the few designs where the unit is fail/op, fail/op, fail/safe. Laboratory tests confirm that the unit will operate satisfactorily with the failure of three out of four channels. Although the design utilizes state-of-the-art technology, the waveform error checks, the voting techniques, and the parity bit checks are believed to be used in unique configurations. Correct word selection routines are also novel, if not unique. The MDU design, while not redundant, utilizes the latest state-of-the-art advantages of light couplers and integrated circuit amplifiers.
STS-92 crew takes part in a Leak Seal Kit Fit Check in the SSPF
NASA Technical Reports Server (NTRS)
1999-01-01
STS-92 Mission Specialist Koichi Wakata, with the National Space Development Agency of Japan (NASDA), and Pilot Pamela A. Melroy take a break during a Leak Seal Kit Fit Check of the Pressurized Mating Adapter-3 in the Space Station Processing Facility. Also participating are the other crew members Commander Brian Duffy and Mission Specialists Leroy Chiao (Ph.D.), Peter J.K. 'Jeff' Wisoff (Ph.D.), Michael E. Lopez-Alegria and William Surles 'Bill' McArthur Jr. STS-92 is the fourth U.S. flight for construction of the International Space Station. The mission payload also includes an integrated truss structure (Z-1 truss). Launch of STS-92 is scheduled for Feb. 24, 2000.
STS-92 crew takes part in a Leak Seal Kit Fit Check in the SSPF
NASA Technical Reports Server (NTRS)
1999-01-01
In the Space Station Processing Facility, STS-92 crew members take part in a Leak Seal Kit Fit Check in connection with the Pressurized Mating Adapter-3 in the background. From left are Mission Specialist Peter J.K. 'Jeff' Wisoff (Ph.D.), Pilot Pamela A. Melroy, Commander Brian Duffy, Mission Specialist Koichi Wakata, who represents the National Space Development Agency of Japan (NASDA), Brian Warkentine, with JSC, and a Boeing worker at right. Also participating are other crew members Mission Specialists Leroy Chiao (Ph.D.), Michael E. Lopez-Alegria and William Surles 'Bill' McArthur Jr. The mission payload also includes an integrated truss structure (Z-1 truss). Launch of STS-92 is scheduled for Feb. 24, 2000.