Symbolic PathFinder: Symbolic Execution of Java Bytecode
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.; Rungta, Neha
2010-01-01
Symbolic PathFinder (SPF) combines symbolic execution with model checking and constraint solving for automated test case generation and error detection in Java programs with unspecified inputs. In this tool, programs are executed on symbolic inputs representing multiple concrete inputs. Values of variables are represented as constraints generated from the analysis of Java bytecode. The constraints are solved using off-the-shelf solvers to generate test inputs guaranteed to achieve complex coverage criteria. SPF has been used successfully at NASA, in academia, and in industry.
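The core idea described above can be sketched in a few lines. This is an illustrative toy, not SPF itself: SPF analyzes Java bytecode and calls off-the-shelf SMT solvers, whereas this sketch collects path constraints as Python predicates and "solves" them by brute force over a small integer domain.

```python
def classify(x, y):
    """Program under test: two branch decisions give three paths."""
    if x > y:
        return "gt" if x > 10 else "gt_small"
    return "le"

# Path constraints written as predicates -- what a symbolic executor
# would collect while interpreting the branches of the program.
paths = {
    "gt":       lambda x, y: x > y and x > 10,
    "gt_small": lambda x, y: x > y and not x > 10,
    "le":       lambda x, y: not x > y,
}

def solve(constraint, domain=range(-20, 21)):
    """Brute-force 'constraint solver' standing in for a real solver."""
    for x in domain:
        for y in domain:
            if constraint(x, y):
                return (x, y)
    return None

# One concrete test input per path: guaranteed path coverage.
tests = {label: solve(c) for label, c in paths.items()}
```

Each solved input, when run concretely, exercises exactly the path whose constraint produced it, which is the coverage guarantee the abstract refers to.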
Precision orbit raising trajectories [solar electric propulsion orbital transfer program]
NASA Technical Reports Server (NTRS)
Flanagan, P. F.; Horsewood, J. L.; Pines, S.
1975-01-01
A precision trajectory program has been developed to serve as a test bed for geocentric orbit raising steering laws. The steering laws to be evaluated have been developed using optimization methods employing averaging techniques. This program provides the capability of testing the steering laws in a precision simulation. The principal system models incorporated in the program are described, including the radiation environment, the solar array model, the thrusters and power processors, the geopotential, and the solar system. Steering and array orientation constraints are discussed, and the impact of these constraints on program design is considered.
Natural environment application for NASP-X-30 design and mission planning
NASA Technical Reports Server (NTRS)
Johnson, D. L.; Hill, C. K.; Brown, S. C.; Batts, G. W.
1993-01-01
The NASA/MSFC Mission Analysis Program has recently been utilized in various National Aero-Space Plane (NASP) mission and operational planning scenarios. This paper focuses on presenting various atmospheric constraint statistics based on assumed NASP mission phases, using established natural environment design, parametric, and threshold values. Probabilities of no-go are calculated using atmospheric parameters such as temperature, humidity, density altitude, peak/steady-state winds, cloud cover/ceiling, thunderstorms, and precipitation. The program, although developed to evaluate test or operational missions after flight constraints have been established, can provide valuable information in the design phase of the NASP X-30 program. Given the design values as flight constraints, the Mission Analysis Program returns the probability of no-go, or launch delay, by hour and by month. This output tells the X-30 program manager whether the design values are stringent enough to meet the required test flight schedules.
Choi, Mi-Ri; Jeon, Sang-Wan; Yi, Eun-Surk
2018-04-01
The purpose of this study is to analyze differences among hospitalized cancer patients in their perception of exercise and physical activity constraints based on their medical history. The study used a questionnaire survey as the measurement tool for 194 cancer patients (male or female, aged 20 or older) living in the Seoul metropolitan area (Seoul, Gyeonggi, Incheon). The collected data were analyzed using frequency analysis, exploratory factor analysis, reliability analysis, t-test, and one-way analysis of variance with the statistical program SPSS 18.0. The following results were obtained. First, there was no statistically significant difference between cancer stage and exercise recognition/physical activity constraint. Second, there was a significant difference between cancer stage and sociocultural constraint/facility constraint/program constraint. Third, there was a significant difference between cancer operation history and physical/sociocultural/facility/program constraint. Fourth, there was a significant difference between cancer operation history and negative perception/facility/program constraint. Fifth, there was a significant difference between ancillary cancer treatment method and negative perception/facility/program constraint. Sixth, there was a significant difference between hospitalization period and positive perception/negative perception/physical constraint/cognitive constraint. In conclusion, this study provides information necessary to create a patient-centered healthcare service system by analyzing the exercise recognition of hospitalized cancer patients based on their medical history and by investigating the constraint factors that prevent patients from actually making efforts to exercise.
Constraint Logic Programming approach to protein structure prediction.
Dal Palù, Alessandro; Dovier, Agostino; Fogolari, Federico
2004-11-30
The protein structure prediction problem is one of the most challenging problems in the biological sciences. Many approaches have been proposed using database information and/or simplified protein models. The protein structure prediction problem can be cast in the form of an optimization problem. Notwithstanding its importance, the problem has very seldom been tackled by Constraint Logic Programming, a declarative programming paradigm suitable for solving combinatorial optimization problems. Constraint Logic Programming techniques have been applied to the protein structure prediction problem on the face-centered cube lattice model. Molecular dynamics techniques, endowed with the notion of constraint, have also been exploited. Even using a very simplified model, Constraint Logic Programming on the face-centered cube lattice model allowed us to obtain acceptable results for a few small proteins. In the test implementation, their (known) secondary structure and the presence of disulfide bridges are used as constraints. Simplified structures obtained in this way have been converted to all-atom models with plausible structure. Results have been compared with a similar approach using a well-established technique such as molecular dynamics. The results obtained on small proteins show that Constraint Logic Programming techniques can be employed for studying simplified protein models, which can be converted into realistic all-atom models. The advantage of Constraint Logic Programming over other, much more explored, methodologies resides in rapid software prototyping, in the easy encoding of heuristics, and in exploiting all the advances made in this research area, e.g. in constraint propagation and its use for pruning the huge search space.
ERIC Educational Resources Information Center
van der Linden, Wim J.; Boekkooi-Timminga, Ellen
A "maximin" model for item response theory based test design is proposed. In this model only the relative shape of the target test information function is specified. It serves as a constraint subject to which a linear programming algorithm maximizes the information in the test. In the practice of test construction there may be several…
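As a rough illustration of the maximin idea sketched above: pick a fixed-length test from an item bank so that test information is maximized subject to matching the relative shape of a target information function. The item bank and numbers below are invented, and exhaustive search stands in for the linear programming algorithm the model actually uses.

```python
from itertools import combinations

# Hypothetical item bank: information each item contributes at three
# ability levels theta = (-1, 0, 1).
bank = {
    "i1": (0.2, 0.5, 0.2), "i2": (0.4, 0.3, 0.1), "i3": (0.1, 0.3, 0.4),
    "i4": (0.3, 0.4, 0.3), "i5": (0.1, 0.2, 0.5), "i6": (0.5, 0.2, 0.1),
}
target_shape = (1.0, 1.0, 1.0)   # only the *relative* shape is specified

def assemble(length=3):
    """Maximin assembly: maximize y s.t. info_k >= y * target_k for all k."""
    best, best_y = None, -1.0
    for items in combinations(bank, length):
        info = [sum(bank[i][k] for i in items) for k in range(3)]
        y = min(info[k] / target_shape[k] for k in range(3))
        if y > best_y:
            best, best_y = items, y
    return best, best_y
```

The selected test's information function then dominates the target shape by the largest possible factor y, which is exactly the maximin objective.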
A Discussion of Issues in Integrity Constraint Monitoring
NASA Technical Reports Server (NTRS)
Fernandez, Francisco G.; Gates, Ann Q.; Cooke, Daniel E.
1998-01-01
In the development of large-scale software systems, analysts, designers, and programmers identify properties of data objects in the system. The ability to check those assertions during runtime is desirable as a means of verifying the integrity of the program. Typically, programmers ensure the satisfaction of such properties through some form of manually embedded assertion check. The disadvantage of this approach is that these assertions become entangled within the program code. The goal of the research is to develop an integrity constraint monitoring mechanism whereby a repository of software system properties (called integrity constraints) is used to automatically insert checks into the program for incorrect program behaviors. Such a mechanism would overcome many of the deficiencies of manually embedded assertion checks. This paper gives an overview of the preliminary work performed toward this goal. The manual instrumentation of constraint checking on a series of test programs is discussed. This review is then used as the basis for a discussion of issues to be considered in developing an automated integrity constraint monitor.
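The contrast between manually embedded assertions and a repository-driven monitor can be sketched as follows. This is a hypothetical Python analogue (the paper's mechanism instruments the program itself); the constraint repository and names are invented for illustration.

```python
import functools

# Repository of named integrity constraints, kept outside the program code.
constraints = {
    "balance_nonneg": lambda acct: acct["balance"] >= 0,
}

def monitored(*names):
    """Automatically weave the named integrity checks around a function,
    instead of entangling assert statements in its body."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            result = fn(*args, **kwargs)
            for n in names:
                if not constraints[n](result):
                    raise AssertionError(f"integrity constraint {n!r} violated")
            return result
        return inner
    return wrap

@monitored("balance_nonneg")
def withdraw(acct, amount):
    acct["balance"] -= amount
    return acct
```

The program body stays free of assertion code; changing or adding a constraint touches only the repository, which is the separation the paper argues for.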
SIMCAT 1.0: A SAS Computer Program for Simulating Computer Adaptive Testing
ERIC Educational Resources Information Center
Raiche, Gilles; Blais, Jean-Guy
2006-01-01
Monte Carlo methodologies are frequently applied to study the sampling distribution of the estimated proficiency level in adaptive testing. These methods eliminate real situational constraints. However, these Monte Carlo methodologies are not currently supported by the available software programs, and when these programs are available, their…
Symbolic Execution Enhanced System Testing
NASA Technical Reports Server (NTRS)
Davies, Misty D.; Pasareanu, Corina S.; Raman, Vishwanath
2012-01-01
We describe a testing technique that uses information computed by symbolic execution of a program unit to guide the generation of inputs to the system containing the unit, in such a way that the unit's, and hence the system's, coverage is increased. The symbolic execution computes unit constraints at run-time, along program paths obtained by system simulations. We use machine learning techniques (treatment learning and function fitting) to approximate the system input constraints that will lead to the satisfaction of the unit constraints. Execution of system input predictions either uncovers new code regions in the unit under analysis or provides information that can be used to improve the approximation. We have implemented the technique and demonstrated its effectiveness on several examples, including one from the aerospace domain.
Construct exploit constraint in crash analysis by bypassing canary
NASA Astrophysics Data System (ADS)
Huang, Ning; Huang, Shuguang; Huang, Hui; Chang, Chao
2017-08-01
Selective symbolic execution is a common program testing technology. Crash analysis systems built on it, such as CRAX, test the fragility of a program by constructing exploit constraints. From a study of crash analysis based on symbolic execution, this paper finds that the technology cannot bypass the canary stack protection mechanism. The paper improves on this by using an API hook in Linux. Experimental results show that the use of the API hook effectively solves the problem that crash analysis cannot bypass canary protection.
Developing a Drug Testing Policy at a Public University: Participant Perspectives.
ERIC Educational Resources Information Center
Griffin, Stephen O.; Keller, Adrienne; Cohn, Alan
2001-01-01
Although employee drug testing is widespread among private employers, the development of programs in the public sector has been slower due to constitutional law constraints. A qualitative approach presenting various participant perspectives may aid in developing an employee drug testing program. (Contains 41 references/notes.) (JOW)
jFuzz: A Concolic Whitebox Fuzzer for Java
NASA Technical Reports Server (NTRS)
Jayaraman, Karthick; Harvison, David; Ganesh, Vijay; Kiezun, Adam
2009-01-01
We present jFuzz, an automatic testing tool for Java programs. jFuzz is a concolic whitebox fuzzer built on the NASA Java PathFinder, an explicit-state Java model checker and framework for developing reliability and analysis tools for Java. Starting from a seed input, jFuzz automatically and systematically generates inputs that exercise new program paths. jFuzz uses a combination of concrete and symbolic execution and constraint solving. Time spent on solving constraints can be significant. We implemented several well-known optimizations as well as name-independent caching, which aggressively normalizes the constraints to reduce the number of calls to the constraint solver. We present preliminary results from the optimizations and demonstrate the effectiveness of jFuzz in creating good test inputs. jFuzz is intended to be a research testbed for investigating new testing and analysis techniques based on concrete and symbolic execution. The source code of jFuzz is available as part of the NASA Java PathFinder.
Shock Mitigating Seat Single Impact Program
2014-04-24
…new seats from Shockwave, SHOXS and Zodiac were tested during the third and fourth phases of the final test program and these were conducted between…test program to the four single jockey style seats from Shockwave, SHOXS, Ullman and Zodiac because of budget and time constraints. The program…along with the Zodiac jockey pod seat that replaced the Ullman seat.
ERIC Educational Resources Information Center
Huitzing, Hiddo A.
2004-01-01
This article shows how set covering with item sampling (SCIS) methods can be used in the analysis and preanalysis of linear programming models for test assembly (LPTA). LPTA models can construct tests, fulfilling a set of constraints set by the test assembler. Sometimes, no solution to the LPTA model exists. The model is then said to be…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pytel, K.; Mieleszczenko, W.; Lechniak, J.
2010-03-01
The presented paper contains neutronic and thermal-hydraulic calculation results (for steady and unsteady states) prepared to support an annex to the Safety Analysis Report for the MARIA reactor, in order to obtain approval for a program of testing low-enriched uranium (LEU) lead test fuel assemblies (LTFA) manufactured by CERCA. This includes presentation of the limits and operational constraints to be in effect during the fuel testing investigations. The scope of the testing program (which began in August 2009), including additional measurements and monitoring procedures, is also described.
Benko, Matúš; Gfrerer, Helmut
2018-01-01
In this paper, we consider a sufficiently broad class of non-linear mathematical programs with disjunctive constraints, which, e.g., include mathematical programs with complementarity/vanishing constraints. We present an extension of the concept of [Formula: see text]-stationarity which can be easily combined with the well-known notion of M-stationarity to obtain the stronger property of so-called [Formula: see text]-stationarity. We show how the property of [Formula: see text]-stationarity (and thus also of M-stationarity) can be efficiently verified for the considered problem class by computing [Formula: see text]-stationary solutions of a certain quadratic program. We further consider the situation in which the point to be tested for [Formula: see text]-stationarity is not known exactly but is approximated by some convergent sequence, as is usually the case when applying a numerical method.
Sustaining Arts Programs in Public Education
ERIC Educational Resources Information Center
Dunstan, David
2016-01-01
The purpose of this qualitative research case study was to investigate leadership and funding decisions that determine key factors responsible for sustaining arts programs in public schools. While the educational climate, financial constraints, and standardized testing continue to impact arts programs in public education, Eastland High School, the…
Water-resources optimization model for Santa Barbara, California
Nishikawa, Tracy
1998-01-01
A simulation-optimization model has been developed for the optimal management of the city of Santa Barbara's water resources during a drought. The model, which links groundwater simulation with linear programming, has a planning horizon of 5 years. The objective is to minimize the cost of water supply subject to water demand constraints, hydraulic head constraints to control seawater intrusion, and water capacity constraints. The decision variables are monthly water deliveries from surface water and groundwater. The state variables are hydraulic heads. The drought of 1947-51 is the city's worst drought on record, and simulated surface-water supplies for this period were used as a basis for testing optimal management of current water resources under drought conditions. The simulation-optimization model was applied using three reservoir operation rules. In addition, the model's sensitivity to demand, carryover [the storage of water in one year for use in a later year], head constraints, and capacity constraints was tested.
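A two-variable toy version of such a cost-minimization model can be solved by enumerating constraint-boundary intersections (the optimum of a feasible LP lies at a vertex). All coefficients below are invented for illustration; the real model spans a 5-year horizon with many monthly decision variables and a linked groundwater simulation.

```python
from itertools import combinations

cost = (3.0, 1.0)                # $ per unit: surface water s, groundwater g
# Constraints in the form a*s + b*g >= c:
cons = [
    ( 1.0,  1.0, 100.0),         # demand: s + g >= 100
    (-1.0,  0.0, -80.0),         # surface capacity: s <= 80
    ( 0.0, -1.0, -60.0),         # head constraint (seawater intrusion): g <= 60
    ( 1.0,  0.0,   0.0),         # s >= 0
    ( 0.0,  1.0,   0.0),         # g >= 0
]

def solve_lp():
    """Enumerate intersections of constraint boundaries, keep feasible
    ones, and return the minimum-cost vertex."""
    best = None
    for (a1, b1, c1), (a2, b2, c2) in combinations(cons, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < 1e-12:
            continue                      # parallel boundaries
        s = (c1 * b2 - c2 * b1) / det
        g = (a1 * c2 - a2 * c1) / det
        if all(a * s + b * g >= c - 1e-9 for a, b, c in cons):
            val = cost[0] * s + cost[1] * g
            if best is None or val < best[0]:
                best = (val, s, g)
    return best
```

Here the head constraint caps the cheap groundwater at 60 units, so the demand is met by 40 units of the more expensive surface water, mirroring how the head constraint shapes the optimal deliveries in the abstract.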
Apollo experience report: Very high frequency ranging system
NASA Technical Reports Server (NTRS)
Panter, W. C.; Shores, P. W.
1972-01-01
The history of the Apollo very-high-frequency ranging system development program is presented from the program-planning stage through the final-test and flight-evaluation stages. Block diagrams of the equipment are presented, and a description of the theory of operation is outlined. A sample of the distribution of errors measured in the aircraft-flight test program is included. The report is concluded with guidelines or recommendations for the management of development programs having the same general constraints.
About the mechanism of ERP-system pilot test
NASA Astrophysics Data System (ADS)
Mitkov, V. V.; Zimin, V. V.
2018-05-01
In this paper, the mathematical problem of defining the scope of a pilot test is stated as a quadratic programming task. The solution procedure uses the method of network programming, based on a structurally similar network representation of the criterion and constraints, which reduces the original problem to a sequence of simpler evaluation tasks. The evaluation tasks are solved by the method of dichotomous programming.
Giant Panda Maternal Care: A Test of the Experience Constraint Hypothesis.
Snyder, Rebecca J; Perdue, Bonnie M; Zhang, Zhihe; Maple, Terry L; Charlton, Benjamin D
2016-06-07
The body condition constraint and the experience constraint hypotheses have both been proposed to account for differences in reproductive success between multiparous (experienced) and primiparous (first-time) mothers. However, because primiparous mothers are typically characterized by both inferior body condition and lack of experience when compared to multiparous mothers, interpreting experience-related differences in maternal care as support for either the body condition constraint hypothesis or the experience constraint hypothesis is extremely difficult. Here, we examined maternal behaviour in captive giant pandas, allowing us to simultaneously control for body condition and provide a rigorous test of the experience constraint hypothesis in this endangered animal. We found that multiparous mothers spent more time engaged in key maternal behaviours (nursing, grooming, and holding cubs) and had significantly less vocal cubs than primiparous mothers. This study provides the first evidence supporting the experience constraint hypothesis in the order Carnivora, and may have utility for captive breeding programs in which it is important to monitor the welfare of this species' highly altricial cubs, whose survival is almost entirely dependent on receiving adequate maternal care during the first few weeks of life.
ERIC Educational Resources Information Center
Li, Yuan H.; Yang, Yu N.; Tompkins, Leroy J.; Modarresi, Shahpar
2005-01-01
The statistical technique, "Zero-One Linear Programming," that has successfully been used to create multiple tests with similar characteristics (e.g., item difficulties, test information and test specifications) in the area of educational measurement, was deemed to be a suitable method for creating multiple sets of matched samples to be…
OPTIMAL NETWORK TOPOLOGY DESIGN
NASA Technical Reports Server (NTRS)
Yuen, J. H.
1994-01-01
This program was developed as part of a research study on the topology design and performance analysis for the Space Station Information System (SSIS) network. It uses an efficient algorithm to generate candidate network designs (consisting of subsets of the set of all network components) in increasing order of their total costs, and checks each design to see if it forms an acceptable network. This technique gives the true cost-optimal network, and is particularly useful when the network has many constraints and not too many components. It is intended that this new design technique consider all important performance measures explicitly and take into account the constraints due to various technical feasibilities. In the current program, technical constraints are taken care of by the user properly forming the starting set of candidate components (e.g. nonfeasible links are not included). As subsets are generated, they are tested to see if they form an acceptable network by checking that all requirements are satisfied. Thus the first acceptable subset encountered gives the cost-optimal topology satisfying all given constraints. The user must sort the set of "feasible" link elements in increasing order of their costs. The program prompts the user for the following information for each link: 1) cost, 2) connectivity (number of stations connected by the link), and 3) the stations connected by that link. Unless instructed to stop, the program generates all possible acceptable networks in increasing order of their total costs. The program is written only to generate topologies that are simply connected. Tests on reliability, delay, and other performance measures are discussed in the documentation, but have not been incorporated into the program. This program is written in PASCAL for interactive execution and has been implemented on an IBM PC series computer operating under PC DOS. The disk contains source code only. This program was developed in 1985.
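The generate-and-test strategy described above can be sketched as follows. The link data are made up (the original program is interactive PASCAL), but the logic is the same: enumerate candidate subsets of the pre-sorted feasible links in increasing order of total cost, and the first subset that passes the acceptance test is the cost-optimal topology.

```python
from itertools import combinations

stations = {"A", "B", "C", "D"}
links = [  # (cost, endpoints) -- sorted by increasing cost, as required
    (1, ("A", "B")), (2, ("B", "C")), (3, ("C", "D")),
    (4, ("A", "C")), (5, ("B", "D")),
]

def connects_all(subset):
    """Acceptance test: do the chosen links connect every station?"""
    reached, frontier = set(), {"A"}
    while frontier:
        n = frontier.pop()
        reached.add(n)
        for _, (u, v) in subset:
            if n == u and v not in reached:
                frontier.add(v)
            if n == v and u not in reached:
                frontier.add(u)
    return reached == stations

def optimal_topology():
    """Candidate designs in increasing order of total cost; the first
    acceptable subset encountered is the cost-optimal network."""
    candidates = sorted(
        (sum(c for c, _ in sub), sub)
        for r in range(1, len(links) + 1)
        for sub in combinations(links, r)
    )
    for total, sub in candidates:
        if connects_all(sub):
            return total, sub
    return None
```

As in the program, cheaper but unacceptable subsets are generated and rejected first, so no acceptable design cheaper than the returned one can exist.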
Observed-Score Equating as a Test Assembly Problem.
ERIC Educational Resources Information Center
van der Linden, Wim J.; Luecht, Richard M.
1998-01-01
Derives a set of linear conditions of item-response functions that guarantees identical observed-score distributions on two test forms. The conditions can be added as constraints to a linear programming model for test assembly. An example illustrates the use of the model for an item pool from the Law School Admissions Test (LSAT). (SLD)
Application of CFE/POST2 for Simulation of Launch Vehicle Stage Separation
NASA Technical Reports Server (NTRS)
Pamadi, Bandu N.; Tartabini, Paul V.; Toniolo, Matthew D.; Roithmayr, Carlos M.; Karlgaard, Christopher D.; Samareh, Jamshid A.
2009-01-01
The constraint force equation (CFE) methodology provides a framework for modeling constraint forces and moments acting at joints that connect multiple vehicles. With implementation in Program to Optimize Simulated Trajectories II (POST 2), the CFE provides a capability to simulate end-to-end trajectories of launch vehicles, including stage separation. In this paper, the CFE/POST2 methodology is applied to the Shuttle-SRB separation problem as a test and validation case. The CFE/POST2 results are compared with STS-1 flight test data.
Brief Lags in Interrupted Sequential Performance: Evaluating a Model and Model Evaluation Method
2015-01-05
rehearsal mechanism in the model. To evaluate the model we developed a simple new goodness-of-fit test based on analysis of variance that offers an…repeated step). Sequential constraints are common in medicine, equipment maintenance, computer programming and technical support, data analysis, legal analysis, accounting, and many other home and workplace environments. Sequential constraints also play a role in such basic cognitive processes
A depth-first search algorithm to compute elementary flux modes by linear programming.
Quek, Lake-Ee; Nielsen, Lars K
2014-07-30
The decomposition of complex metabolic networks into elementary flux modes (EFMs) provides a useful framework for exploring reaction interactions systematically. Generating a complete set of EFMs for large-scale models, however, is near impossible. Even for moderately-sized models (<400 reactions), existing approaches based on the Double Description method must iterate through a large number of combinatorial candidates, thus imposing an immense processor and memory demand. Based on an alternative elementarity test, we developed a depth-first search algorithm using linear programming (LP) to enumerate EFMs in an exhaustive fashion. Constraints can be introduced to directly generate a subset of EFMs satisfying the set of constraints. The depth-first search algorithm has a constant memory overhead. Using flux constraints, a large LP problem can be massively divided and parallelized into independent sub-jobs for deployment into computing clusters. Since the sub-jobs do not overlap, the approach scales to utilize all available computing nodes with minimal coordination overhead or memory limitations. The speed of the algorithm was comparable to efmtool, a mainstream Double Description method, when enumerating all EFMs; the attrition power gained from performing flux feasibility tests offsets the increased computational demand of running an LP solver. Unlike the Double Description method, the algorithm enables accelerated enumeration of all EFMs satisfying a set of constraints.
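The flavor of the approach can be illustrated on a toy network. The brute-force feasibility check below stands in for the LP elementarity test, and this sketch only tests complete supports at the leaves, whereas the real algorithm prunes partial assignments during the depth-first search; the network and bounds are invented for illustration.

```python
from itertools import product

# Stoichiometric matrix: rows = internal metabolites A, B, C;
# columns = reactions: in(->A), r1(A->B), r2(B->C), r3(A->C), out(C->)
S = [
    [1, -1,  0, -1,  0],   # A
    [0,  1, -1,  0,  0],   # B
    [0,  0,  1,  1, -1],   # C
]
n = 5

def feasible(support):
    """Does some strictly positive flux on exactly this support satisfy
    S v = 0? (Small-integer brute force; real code solves an LP.)"""
    if not support:
        return False
    for v in product(range(1, 4), repeat=len(support)):
        flux = [0] * n
        for idx, val in zip(support, v):
            flux[idx] = val
        if all(sum(S[m][j] * flux[j] for j in range(n)) == 0 for m in range(3)):
            return True
    return False

def efms():
    """Depth-first walk over include/exclude decisions for each reaction,
    collecting feasible supports; the minimal ones are the EFMs."""
    feas = []
    def dfs(j, support):
        if j == n:
            if feasible(support):
                feas.append(frozenset(support))
            return
        dfs(j + 1, support + [j])   # include reaction j
        dfs(j + 1, support)         # exclude reaction j
    dfs(0, [])
    return sorted(tuple(sorted(s)) for s in feas
                  if not any(t < s for t in feas))
```

For this network the two elementary modes are the two routes from the input to the output: in/r1/r2/out and in/r3/out; their combination is feasible but not minimal, so it is filtered out.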
Engineering calculations for communications satellite systems planning
NASA Technical Reports Server (NTRS)
Reilly, C. H.; Levis, C. A.; Mount-Campbell, C.; Gonsalvez, D. J.; Wang, C. W.; Yamamura, Y.
1985-01-01
Computer-based techniques for optimizing communications-satellite orbit and frequency assignments are discussed. A gradient-search code was tested against a BSS scenario derived from the RARC-83 data. Improvement was obtained, but each iteration requires about 50 minutes of IBM-3081 CPU time. Gradient-search experiments on a small FSS test problem, consisting of a single service area served by 8 satellites, showed quickest convergence when the satellites were all initially placed near the center of the available orbital arc with moderate spacing. A transformation technique is proposed for investigating the surface topography of the objective function used in the gradient-search method. A new synthesis approach is based on transforming single-entry interference constraints into corresponding constraints on satellite spacings. These constraints are used with linear objective functions to formulate the co-channel orbital assignment task as a linear-programming (LP) problem or mixed integer programming (MIP) problem. Globally optimal solutions are always found with the MIP problems, but not necessarily with the LP problems. The MIP solutions can be used to evaluate the quality of the LP solutions. The initial results are very encouraging.
Model-based control strategies for systems with constraints of the program type
NASA Astrophysics Data System (ADS)
Jarzębowska, Elżbieta
2006-08-01
The paper presents a model-based tracking control strategy for constrained mechanical systems. The constraints we consider can be material or non-material; the latter are referred to as program constraints. The program constraint equations represent tasks put upon system motions; they can be differential equations of order higher than one or two, and can be non-integrable. The tracking control strategy relies upon two dynamic models: a reference model, which is a dynamic model of a system with arbitrary-order differential constraints, and a dynamic control model. The reference model serves as a motion planner, which generates inputs to the dynamic control model. It is based upon the generalized program motion equations (GPME) method. The method makes it possible to combine material and program constraints and merge them both into the motion equations. Lagrange's equations with multipliers are a special case of the GPME, since they apply to systems with first-order constraints. Our tracking strategy, referred to as a model reference program motion tracking control strategy, enables tracking of any program motion predefined by the program constraints. It extends "trajectory tracking" to "program motion tracking". We also demonstrate that our tracking strategy can be extended to hybrid program motion/force tracking.
An Analysis of the Effects of Varying Male and Female Force Levels. Appendices 1, 2, 3, and 4.
1985-03-01
facility constraints. …the first coeducational commissioning program in any…Carolina, Ohio State, and Drake) on a test basis. In the Spring of 1970, the decision was made to adopt coeducational AFROTC and the program was
An Airborne Parachute Compartment Test Bed for the Orion Parachute Test Program
NASA Technical Reports Server (NTRS)
Moore, James W.; Romero, Leah M.
2013-01-01
The test program developing parachutes for the Orion/MPCV includes drop tests with parachutes deployed from an Orion-like parachute compartment at a wide range of dynamic pressures. Aircraft and altitude constraints precluded the use of an Orion boilerplate capsule for several test points. Therefore, a dart-shaped test vehicle with a high-fidelity mock-up of the Orion parachute compartment has been developed. The available aircraft options imposed constraints on the test vehicle development and concept of operations. Delivery of this test vehicle to the desired velocity, altitude, and orientation required for the test is a difficult problem involving multiple engineering disciplines. This paper describes the development of the test technique. The engineering challenges include extraction from an aircraft, repositioning of the extraction parachute, and mid-air separation of two vehicles, neither of which has an active attitude control system. The desired separation behavior is achieved by precisely controlling the release point using on-board monitoring of the motion. The design of the test vehicle is also described. The trajectory simulations and other analyses used to develop this technique and predict the behavior of the test vehicle are reviewed in detail. The application of the technique on several successful drop tests is summarized.
Reliability of programs specified with equational specifications
NASA Astrophysics Data System (ADS)
Nikolik, Borislav
Ultrareliability is desirable (and sometimes a demand of regulatory authorities) for safety-critical applications such as commercial flight-control programs, medical applications, and nuclear reactor control programs. A method called the Term Redundancy Method (TRM) is proposed for obtaining ultrareliable programs through specification-based testing. Current specification-based testing schemes need a prohibitively large number of test cases for estimating ultrareliability; they assume the availability of an accurate program-usage distribution prior to testing, and they assume the availability of a test oracle. It is shown how to obtain ultrareliable programs (probability of failure near zero) with a practical number of test cases, without an accurate usage distribution, and without a test oracle. TRM applies to the class of decision Abstract Data Type (ADT) programs specified with unconditional equational specifications. TRM is restricted to programs that do not exceed certain efficiency constraints in generating test cases. The effectiveness of TRM in failure detection and recovery is demonstrated on formulas from the aircraft collision avoidance system TCAS.
A depth-first search algorithm to compute elementary flux modes by linear programming
2014-01-01
Background The decomposition of complex metabolic networks into elementary flux modes (EFMs) provides a useful framework for exploring reaction interactions systematically. Generating a complete set of EFMs for large-scale models, however, is near impossible. Even for moderately-sized models (<400 reactions), existing approaches based on the Double Description method must iterate through a large number of combinatorial candidates, thus imposing an immense processor and memory demand. Results Based on an alternative elementarity test, we developed a depth-first search algorithm using linear programming (LP) to enumerate EFMs in an exhaustive fashion. Constraints can be introduced to directly generate a subset of EFMs satisfying the set of constraints. The depth-first search algorithm has a constant memory overhead. Using flux constraints, a large LP problem can be massively divided and parallelized into independent sub-jobs for deployment into computing clusters. Since the sub-jobs do not overlap, the approach scales to utilize all available computing nodes with minimal coordination overhead or memory limitations. Conclusions The speed of the algorithm was comparable to efmtool, a mainstream Double Description method, when enumerating all EFMs; the attrition power gained from performing flux feasibility tests offsets the increased computational demand of running an LP solver. Unlike the Double Description method, the algorithm enables accelerated enumeration of all EFMs satisfying a set of constraints. PMID:25074068
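The core of the approach described above is an LP-based elementarity/feasibility test that the depth-first search calls at each branch. The following is a minimal sketch (not the paper's implementation) of such a test using SciPy: given a stoichiometric matrix and a set of reactions forced to zero by the current search branch, it asks whether any nonzero steady-state flux remains. The normalization constraint `sum(v) = 1` is a standard device, assumed here, to exclude the trivial zero flux.

```python
import numpy as np
from scipy.optimize import linprog

def flux_exists(S, zeroed=()):
    """LP feasibility test driving a DFS over flux modes: does a nonzero
    steady-state flux v >= 0 with S @ v = 0 exist while every reaction
    index in `zeroed` carries no flux?  The row sum(v) = 1 rules out the
    trivial solution v = 0."""
    S = np.asarray(S, dtype=float)
    m, n = S.shape
    A_eq = np.vstack([S, np.ones(n)])          # S v = 0  and  sum(v) = 1
    b_eq = np.append(np.zeros(m), 1.0)
    bounds = [(0.0, 0.0) if j in zeroed else (0.0, None) for j in range(n)]
    res = linprog(np.zeros(n), A_eq=A_eq, b_eq=b_eq, bounds=bounds,
                  method="highs")
    return res.status == 0                     # 0 = feasible optimum found
```

A depth-first search would branch by adding one reaction at a time to `zeroed` and pruning any branch for which `flux_exists` returns `False`; because branches share no state, they can be farmed out to independent cluster jobs, as the abstract describes.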
A comparison of Heuristic method and Llewellyn’s rules for identification of redundant constraints
NASA Astrophysics Data System (ADS)
Estiningsih, Y.; Farikhin; Tjahjana, R. H.
2018-03-01
Modelling and solving practical optimization problems are important techniques in linear programming. Redundant constraints are considered for their effects on general linear programming problems. Identifying and removing redundant constraints avoids the unnecessary calculations involved in solving the associated linear programming problem. Many methods have been proposed for the identification of redundant constraints. This paper presents a comparison of the Heuristic method and Llewellyn's rules for the identification of redundant constraints.
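A standard LP-based redundancy check (one of several techniques the compared methods approximate, sketched here as an assumption rather than the paper's own procedure) is: constraint i of the system Ax <= b is redundant if maximizing a_i . x over the remaining constraints cannot exceed b_i.

```python
import numpy as np
from scipy.optimize import linprog

def is_redundant(A, b, i, tol=1e-9):
    """Constraint i of A x <= b is redundant iff, with row i removed,
    max a_i . x over the remaining feasible region is still <= b_i."""
    A, b = np.asarray(A, float), np.asarray(b, float)
    mask = np.arange(len(b)) != i
    # linprog minimizes, so negate the objective to maximize a_i . x
    res = linprog(-A[i], A_ub=A[mask], b_ub=b[mask],
                  bounds=[(None, None)] * A.shape[1], method="highs")
    # an unbounded or infeasible subproblem means the row is not redundant
    return res.status == 0 and -res.fun <= b[i] + tol
```

Heuristic methods and Llewellyn's rules aim to reach the same verdict with far less work than solving one LP per constraint, which is what makes the comparison worthwhile.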
Modeling damaged wings: Element selection and constraint specification
NASA Technical Reports Server (NTRS)
Stronge, W. J.
1975-01-01
The NASTRAN analytical program was used for structural design, and no problems were anticipated in applying this program to a damaged structure as long as the deformations were small and the strains remained within the elastic range. In this context, NASTRAN was used to test three-dimensional analytical models of a damaged aircraft wing under static loads. A comparison was made of calculated and experimentally measured strains on primary structural components of an RF-84F wing. This comparison brought out two sensitive areas in modeling semimonocoque structures. The calculated strains were strongly affected by the type of elements used adjacent to the damaged region and by the choice of multipoint constraints sets on the damaged boundary.
Alternative Test Methods for Electronic Parts
NASA Technical Reports Server (NTRS)
Plante, Jeannette
2004-01-01
It is common practice within NASA to test electronic parts at the manufacturing lot level to demonstrate, statistically, that parts from the lot tested will not fail in service using generic application conditions. The test methods and the generic application conditions used have been developed over the years through cooperation between NASA, DoD, and industry in order to establish a common set of standard practices. These common practices, found in MIL-STD-883, MIL-STD-750, military part specifications, EEE-INST-002, and other guidelines are preferred because they are considered to be effective and repeatable and their results are usually straightforward to interpret. These practices can sometimes be unavailable to some NASA projects due to special application conditions that must be addressed, such as schedule constraints, cost constraints, logistical constraints, or advances in the technology that make the historical standards an inappropriate choice for establishing part performance and reliability. Alternate methods have begun to emerge and to be used by NASA programs to test parts individually or as part of a system, especially when standard lot tests cannot be applied. Four alternate screening methods will be discussed in this paper: Highly accelerated life test (HALT), forward voltage drop tests for evaluating wire-bond integrity, burn-in options during or after highly accelerated stress test (HAST), and board-level qualification.
Zörnig, Peter
2015-08-01
We present integer programming models for some variants of the farthest string problem. The number of variables and constraints is substantially less than that of the integer linear programming models known in the literature. Moreover, the solution of the linear programming-relaxation contains only a small proportion of noninteger values, which considerably simplifies the rounding process. Numerical tests have shown excellent results, especially when a small set of long sequences is given.
A Representative Shuttle Environmental Control System
NASA Technical Reports Server (NTRS)
Brose, H. F.; Stanley, M. D.; Leblanc, J. C.
1977-01-01
The Representative Shuttle Environmental Control System (RSECS) provides a ground test bed to be used in the early accumulation of component and system operating data, the evaluation of potential system improvements, and possibly the analysis of Shuttle Orbiter test and flight anomalies. Selected components are being subjected to long term tests to determine endurance and corrosion resistance capability prior to Orbiter vehicle experience. Component and system level tests in several cases are being used to support flight certification of Orbiter hardware. These activities are conducted as a development program to allow for timeliness, flexibility, and cost effectiveness not possible in a program burdened by flight documentation and monitoring constraints.
Constraint Force Equation Methodology for Modeling Multi-Body Stage Separation Dynamics
NASA Technical Reports Server (NTRS)
Toniolo, Matthew D.; Tartabini, Paul V.; Pamadi, Bandu N.; Hotchko, Nathaniel
2008-01-01
This paper discusses a generalized approach to the multi-body separation problems in a launch vehicle staging environment based on constraint force methodology and its implementation into the Program to Optimize Simulated Trajectories II (POST2), a widely used trajectory design and optimization tool. This development facilitates the inclusion of stage separation analysis into POST2 for seamless end-to-end simulations of launch vehicle trajectories, thus simplifying the overall implementation and providing a range of modeling and optimization capabilities that are standard features in POST2. Analysis and results are presented for two test cases that validate the constraint force equation methodology in a stand-alone mode and its implementation in POST2.
DOT National Transportation Integrated Search
2000-12-01
This paper brings together the findings of activities that addressed the impacts of nontechnical barriers and constraints that might impede the progress of Intelligent Transportation Systems (ITS) programs. It discusses how the planning and deploymen...
Pollution reduction technology program for small jet aircraft engines: Class T1
NASA Technical Reports Server (NTRS)
Bruce, T. W.; Davis, F. G.; Mongia, H. C.
1977-01-01
Small jet aircraft engines (EPA class T1, turbojet and turbofan engines of less than 35.6 kN thrust) were evaluated with the objective of attaining emissions reduction consistent with performance constraints. Configurations employing the technological advances were screened and developed through full scale rig testing. The most promising approaches in full-scale engine testing were evaluated.
Constraints in Genetic Programming
NASA Technical Reports Server (NTRS)
Janikow, Cezary Z.
1996-01-01
Genetic programming refers to a class of genetic algorithms utilizing generic representation in the form of program trees. For a particular application, one needs to provide the set of functions, whose compositions determine the space of program structures being evolved, and the set of terminals, which determine the space of specific instances of those programs. The algorithm searches the space for the best program for a given problem, applying evolutionary mechanisms borrowed from nature. Genetic algorithms have shown great capabilities in approximately solving optimization problems which could not be approximated or solved with other methods. Genetic programming extends their capabilities to deal with a broader variety of problems. However, it also extends the size of the search space, which often becomes too large to be effectively searched even by evolutionary methods. Therefore, our objective is to utilize problem constraints, if such can be identified, to restrict this space. In this publication, we propose a generic constraint specification language, powerful enough for a broad class of problem constraints. This language has two elements -- one reduces only the number of program instances, the other reduces both the space of program structures as well as their instances. With this language, we define the minimal set of complete constraints, and a set of operators guaranteeing offspring validity from valid parents. We also show that these operators are not less efficient than the standard genetic programming operators if one preprocesses the constraints - the necessary mechanisms are identified.
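The constraint idea above can be illustrated with a small sketch (an illustrative construction, not the paper's specification language): each function lists which symbols may appear as its children, so tree generation only ever produces structurally valid programs, and a checker confirms validity. The function set and the particular structural rule (no `neg` under `mul`) are hypothetical examples.

```python
import random

FUNCTIONS = {"add": 2, "mul": 2, "neg": 1}   # name -> arity
TERMINALS = ["x", "1"]
# structural constraints: which symbols may appear under each function
ALLOWED = {"add": ["add", "mul", "neg"] + TERMINALS,
           "mul": ["add", "mul"] + TERMINALS,   # example rule: no neg under mul
           "neg": ["add", "mul"] + TERMINALS}   # example rule: no neg(neg(...))

def grow(symbol="add", depth=3, rng=random):
    """Generate a random program tree that respects ALLOWED by construction."""
    if symbol in TERMINALS or depth == 0:
        return symbol if symbol in TERMINALS else rng.choice(TERMINALS)
    kids = [grow(rng.choice(ALLOWED[symbol]), depth - 1, rng)
            for _ in range(FUNCTIONS[symbol])]
    return (symbol, *kids)

def valid(tree):
    """Check that every parent-child pair satisfies the constraints."""
    if isinstance(tree, str):
        return True
    op, *kids = tree
    return all((k if isinstance(k, str) else k[0]) in ALLOWED[op] and valid(k)
               for k in kids)
```

Crossover and mutation operators that swap only subtrees rooted at mutually allowed symbols preserve validity in the same way, which is the closure property the abstract's operators guarantee.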
12 CFR 195.21 - Performance tests, standards, and ratings, in general.
Code of Federal Regulations, 2014 CFR
2014-01-01
... association or obtained from community organizations, state, local, and tribal governments, economic... constraints, including the size and financial condition of the savings association, the economic climate... state or local education loan program), originated by the savings association for a student at an...
12 CFR 195.21 - Performance tests, standards, and ratings, in general.
Code of Federal Regulations, 2012 CFR
2012-01-01
... association or obtained from community organizations, state, local, and tribal governments, economic... constraints, including the size and financial condition of the savings association, the economic climate... state or local education loan program), originated by the savings association for a student at an...
12 CFR 195.21 - Performance tests, standards, and ratings, in general.
Code of Federal Regulations, 2013 CFR
2013-01-01
... association or obtained from community organizations, state, local, and tribal governments, economic... constraints, including the size and financial condition of the savings association, the economic climate... state or local education loan program), originated by the savings association for a student at an...
Transportation impacts of the Chicago River closure to prevent an asian carp infestation.
DOT National Transportation Integrated Search
2012-07-01
This project develops a simple linear programming model of the Upper Midwest region's rail transportation network to test whether a closure of the Chicago River to freight traffic would impact the capacity constraint of the rail system. The result ...
Multi-Constraint Multi-Variable Optimization of Source-Driven Nuclear Systems
NASA Astrophysics Data System (ADS)
Watkins, Edward Francis
1995-01-01
A novel approach to the search for optimal designs of source-driven nuclear systems is investigated. Such systems include radiation shields, fusion reactor blankets and various neutron spectrum-shaping assemblies. The novel approach involves the replacement of the steepest-descents optimization algorithm incorporated in the code SWAN by a significantly more general and efficient sequential quadratic programming optimization algorithm provided by the code NPSOL. The resulting SWAN/NPSOL code system can be applied to more general, multi-variable, multi-constraint shield optimization problems. The constraints it accounts for may include simple bounds on variables, linear constraints, and smooth nonlinear constraints. It may also be applied to unconstrained, bound-constrained and linearly constrained optimization. The shield optimization capabilities of the SWAN/NPSOL code system are tested and verified in a variety of optimization problems: dose minimization at constant cost, cost minimization at constant dose, and multiple-nonlinear constraint optimization. The replacement of the optimization part of SWAN with NPSOL is found feasible and substantially expands the complexity of optimization problems that can be handled efficiently.
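The constraint classes listed above (simple bounds, linear constraints, smooth nonlinear constraints) map directly onto the inputs of a sequential quadratic programming solver. A minimal sketch with SciPy's SLSQP method, using a toy objective standing in for a shield cost function (the problem itself is invented for illustration):

```python
from scipy.optimize import minimize

# toy stand-in for a design cost, e.g. shield mass as a function of
# two thickness variables
cost = lambda x: x[0]**2 + x[1]**2

res = minimize(
    cost, x0=[2.0, 2.0], method="SLSQP",
    bounds=[(0.1, 5.0), (0.1, 5.0)],                       # simple bounds
    constraints=[
        {"type": "ineq", "fun": lambda x: x[0] + x[1] - 1.0},  # linear: x0 + x1 >= 1
        {"type": "ineq", "fun": lambda x: x[0] * x[1] - 0.2},  # smooth nonlinear
    ])
```

For this toy problem the optimum lies on the linear constraint at x = (0.5, 0.5); in the SWAN/NPSOL setting the objective and constraint functions would instead be dose and cost responses evaluated by the transport code.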
Fixed Base Modal Survey of the MPCV Orion European Service Module Structural Test Article
NASA Technical Reports Server (NTRS)
Winkel, James P.; Akers, J. C.; Suarez, Vicente J.; Staab, Lucas D.; Napolitano, Kevin L.
2017-01-01
Recently, the MPCV Orion European Service Module Structural Test Article (E-STA) underwent sine vibration testing using the multi-axis shaker system at NASA GRC Plum Brook Station Mechanical Vibration Facility (MVF). An innovative approach using measured constraint shapes at the interface of E-STA to the MVF allowed high-quality fixed base modal parameters of the E-STA to be extracted, which have been used to update the E-STA finite element model (FEM), without the need for a traditional fixed base modal survey. This innovative approach provided considerable program cost and test schedule savings. This paper documents this modal survey, which includes the modal pretest analysis sensor selection, the fixed base methodology using measured constraint shapes as virtual references and measured frequency response functions, and post-survey comparison between measured and analysis fixed base modal parameters.
Verification of a Constraint Force Equation Methodology for Modeling Multi-Body Stage Separation
NASA Technical Reports Server (NTRS)
Tartabini, Paul V.; Roithmayr, Carlos; Toniolo, Matthew D.; Karlgaard, Christopher; Pamadi, Bandu N.
2008-01-01
This paper discusses the verification of the Constraint Force Equation (CFE) methodology and its implementation in the Program to Optimize Simulated Trajectories II (POST2) for multibody separation problems using three specially designed test cases. The first test case involves two rigid bodies connected by a fixed joint; the second case involves two rigid bodies connected with a universal joint; and the third test case is that of Mach 7 separation of the Hyper-X vehicle. For the first two cases, the POST2/CFE solutions compared well with those obtained using industry standard benchmark codes, namely AUTOLEV and ADAMS. For the Hyper-X case, the POST2/CFE solutions were in reasonable agreement with the flight test data. The CFE implementation in POST2 facilitates the analysis and simulation of stage separation as an integral part of POST2 for seamless end-to-end simulations of launch vehicle trajectories.
Aero/structural tailoring of engine blades (AERO/STAEBL)
NASA Technical Reports Server (NTRS)
Brown, K. W.
1988-01-01
This report describes the Aero/Structural Tailoring of Engine Blades (AERO/STAEBL) program, which is a computer code used to perform engine fan and compressor blade aero/structural numerical optimizations. These optimizations seek a blade design of minimum operating cost that satisfies realistic blade design constraints. This report documents the overall program (i.e., input, optimization procedures, approximate analyses) and also provides a detailed description of the validation test cases.
ERIC Educational Resources Information Center
Kratochvil, Kathie R.
2009-01-01
This research study presents one in-depth case study that investigates the successes, challenges, and processes of developing and enacting arts education programming at the elementary school level given the time limitations and other constraints associated with the high stakes testing environment that currently characterizes many of California's…
Cohen-Holzer, Marilyn; Sorek, Gilad; Schless, Simon; Kerem, Julie; Katz-Leurer, Michal
2016-01-01
To assess the influence of an intensive combined constraint and bimanual upper extremity (UE) training program, using a variety of modalities including the fitness room and pool, on UE functions, as well as the effects of the program on gait parameters, among children with hemiparetic cerebral palsy. Ten children ages 6-10 years participated in the program for 2 weeks, 5 days per week, for 6 hr each day. Data from the Assisting Hand Assessment (AHA) for bimanual function, the Jebsen-Taylor Test of Hand Function (JTTHF) for unimanual function, the six-minute walk test (6MWT), and the temporal-spatial aspects of gait using the GAITRite walkway were collected prior to, immediately post and 3 months post-intervention. A significant improvement was noted in both unimanual and bimanual UE performance. A significant improvement in the 6MWT was noted, from a median of 442 meters [range: 294-558] at baseline to 466 [432-592] post-intervention and 528 [425-609] after 3 months (p = .03). Combining intensive practice in a variety of modalities, although targeted at the UE, is associated with substantial improvement in both upper and lower extremity function.
Power Distribution System Planning with GIS Consideration
NASA Astrophysics Data System (ADS)
Wattanasophon, Sirichai; Eua-Arporn, Bundhit
This paper proposes a method for solving radial distribution system planning problems taking into account geographical information. The proposed method can automatically determine the appropriate location and size of a substation, routing of feeders, and sizes of conductors while satisfying all constraints, i.e. technical constraints (voltage drop and thermal limit) and geographical constraints (obstacles, existing infrastructure, and high-cost passages). Sequential quadratic programming (SQP) and a minimum path algorithm (MPA) are applied to solve the planning problem based on net present value (NPV) consideration. In addition, this method integrates the planner's experience and the optimization process to achieve an appropriate practical solution. The proposed method has been tested with an actual distribution system, from which the results indicate that it can provide satisfactory plans.
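The minimum path component of such a planner can be sketched as Dijkstra's algorithm over a grid of traversal costs, where high-cost passages get large weights and obstacles are impassable. This is an illustrative stand-in for the paper's MPA, not its actual formulation:

```python
import heapq

def min_cost_path(cost, start, goal):
    """Dijkstra over a grid; cost[r][c] is the price of routing a feeder
    through cell (r, c), and None marks an obstacle (e.g. a building).
    Returns the minimum total cost from start to goal, or None."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: 0}
    pq = [(0, start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue                      # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and cost[nr][nc] is not None:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return None
```

Raising the cost of cells along a river crossing or an existing right-of-way is how geographical preferences steer the routing without forbidding it outright.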
12 CFR 563e.21 - Performance tests, standards, and ratings, in general.
Code of Federal Regulations, 2013 CFR
2013-01-01
... association or obtained from community organizations, state, local, and tribal governments, economic... constraints, including the size and financial condition of the savings association, the economic climate... state or local education loan program), originated by the savings association for a student at an...
12 CFR 563e.21 - Performance tests, standards, and ratings, in general.
Code of Federal Regulations, 2014 CFR
2014-01-01
... association or obtained from community organizations, state, local, and tribal governments, economic... constraints, including the size and financial condition of the savings association, the economic climate... state or local education loan program), originated by the savings association for a student at an...
12 CFR 563e.21 - Performance tests, standards, and ratings, in general.
Code of Federal Regulations, 2011 CFR
2011-01-01
... association or obtained from community organizations, state, local, and tribal governments, economic... constraints, including the size and financial condition of the savings association, the economic climate... state or local education loan program), originated by the savings association for a student at an...
12 CFR 563e.21 - Performance tests, standards, and ratings, in general.
Code of Federal Regulations, 2012 CFR
2012-01-01
... association or obtained from community organizations, state, local, and tribal governments, economic... constraints, including the size and financial condition of the savings association, the economic climate... state or local education loan program), originated by the savings association for a student at an...
Shuttle/Agena study. Annex A: Ascent agena configuration
NASA Technical Reports Server (NTRS)
1972-01-01
Details are presented on the Agena rocket vehicle description, vehicle interfaces, environmental constraints and test requirements, software programs, and ground support equipment. The basic design concept for the Ascent Agena is identified as optimization of reliability, flexibility, performance capabilities, and economy through the use of tested and flight-proven hardware. The development history of the Agenas A, B, and D is outlined and space applications are described.
Damage Tolerance Characteristics of Composite Sandwich Structures
2000-02-01
... requirements impose strict constraints on the design of composite aircraft ... a test program is devised and carried out, with hundreds of tests ... A particular effort was dedicated to the study of delamination ... growth under ... partition according to the fundamental modes ... design methodologies, as well as static and fatigue strength ... given the theoretical tools, the industries are more or less forced ... design of primary composite structures
Solar Sail Models and Test Measurements Correspondence for Validation Requirements Definition
NASA Technical Reports Server (NTRS)
Ewing, Anthony; Adams, Charles
2004-01-01
Solar sails are being developed as a mission-enabling technology in support of future NASA science missions. Current efforts have advanced solar sail technology sufficient to justify a flight validation program. A primary objective of this activity is to test and validate solar sail models that are currently under development so that they may be used with confidence in future science mission development (e.g., scalable to larger sails). Both system and model validation requirements must be defined early in the program to guide design cycles and to ensure that relevant and sufficient test data will be obtained to conduct model validation to the level required. A process of model identification, model input/output documentation, model sensitivity analyses, and test measurement correspondence is required so that decisions can be made to satisfy validation requirements within program constraints.
Bruhn, Peter; Geyer-Schulz, Andreas
2002-01-01
In this paper, we introduce genetic programming over context-free languages with linear constraints for combinatorial optimization, apply this method to several variants of the multidimensional knapsack problem, and discuss its performance relative to Michalewicz's genetic algorithm with penalty functions. With respect to Michalewicz's approach, we demonstrate that genetic programming over context-free languages with linear constraints improves convergence. A final result is that genetic programming over context-free languages with linear constraints is ideally suited to modeling complementarities between items in a knapsack problem: The more complementarities in the problem, the stronger the performance in comparison to its competitors.
Experimental study of transient liquid motion in orbiting spacecraft
NASA Technical Reports Server (NTRS)
Berry, R. L.; Tegart, J. R.
1975-01-01
The results are presented of a twofold study of transient liquid motion such as that which will be experienced during orbital maneuvers by space tug. A test program was conducted in a low-g test facility involving twenty-two drops. Biaxial, low-g accelerations were applied to an instrumented, model propellant tank during free-fall testing, and forces exerted during liquid reorientation were measured and recorded. Photographic records of the liquid reorientation were also made. The test data were used to verify a mechanical analog which portrays the liquid as a point mass moving on an ellipsoidal constraint surface. The mechanical analog was coded into a FORTRAN IV digital computer program: LAMPS, Large AMPlitude Slosh. Test/analytical correlation indicates that the mechanical analog is capable of predicting the overall force trends measured during testing.
Onsite 40-kilowatt fuel cell power plant manufacturing and field test program
NASA Technical Reports Server (NTRS)
1985-01-01
A joint Gas Research Institute and U.S. Department of Energy Program was initiated in 1982 to evaluate the use of fuel cell power systems for on-site energy service. Forty-six 40 kW fuel cell power plants were manufactured at the United Technologies Corporation facility in South Windsor, Connecticut, and are being delivered to host utilities and other program participants in the United States and Japan for field testing. The construction of the 46 fully-integrated power plants was completed in January 1985 within the constraints of the contract plan. The program has provided significant experience in the manufacture, acceptance testing, deployment, and support of on-site fuel cell systems. Initial field test results also show that these experimental power plants meet the performance and environmental requirements of a commercial specification. This Interim Report encompasses the design and manufacturing phases of the 40 kW Power Plant Manufacturing and Field Test program. The contract between UTC and NASA also provides UTC field engineering support to the host utilities, training programs and associated manuals for utility operating and maintenance personnel, spare parts support for a defined test period, and testing at UTC of a power plant made available from a preceding program phase. These activities are ongoing and will be reported subsequently.
The Challenge Course Experience Questionnaire: A Facilitator's Assessment Tool
ERIC Educational Resources Information Center
Schary, David P.; Waldron, Alexis L.
2017-01-01
Challenge course programs influence a variety of psychological, social, and educational outcomes. Yet, many challenges exist when measuring challenge course outcomes like logistical constraints and a lack of specific assessment tools. This study piloted and tested an assessment tool designed for facilitators to measure participant outcomes in…
Time management situation assessment (TMSA)
NASA Technical Reports Server (NTRS)
Richardson, Michael B.; Ricci, Mark J.
1992-01-01
TMSA is a concept prototype developed to support NASA Test Directors (NTDs) in schedule execution monitoring during the later stages of a Shuttle countdown. The program detects qualitative and quantitative constraint violations in near real-time. The next version will support incremental rescheduling and reason over a substantially larger number of scheduled events.
Practical input optimization for aircraft parameter estimation experiments. Ph.D. Thesis, 1990
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.
1993-01-01
The object of this research was to develop an algorithm for the design of practical, optimal flight test inputs for aircraft parameter estimation experiments. A general, single pass technique was developed which allows global optimization of the flight test input design for parameter estimation using the principles of dynamic programming with the input forms limited to square waves only. Provision was made for practical constraints on the input, including amplitude constraints, control system dynamics, and selected input frequency range exclusions. In addition, the input design was accomplished while imposing output amplitude constraints required by model validity and considerations of safety during the flight test. The algorithm has multiple input design capability, with optional inclusion of a constraint that only one control move at a time, so that a human pilot can implement the inputs. It is shown that the technique can be used to design experiments for estimation of open loop model parameters from closed loop flight test data. The report includes a new formulation of the optimal input design problem, a description of a new approach to the solution, and a summary of the characteristics of the algorithm, followed by three example applications of the new technique which demonstrate the quality and expanded capabilities of the input designs produced by the new technique. In all cases, the new input design approach showed significant improvement over previous input design methods in terms of achievable parameter accuracies.
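The structure of the input design problem above (square-wave inputs, amplitude constraints on both input and output, an information-based objective) can be illustrated on a toy scalar system. The sketch below uses brute-force enumeration rather than the dynamic programming of the actual algorithm, and the system, proxy objective, and limits are all invented for illustration:

```python
import itertools

def best_square_wave(a=0.9, b=1.0, n=8, u_amp=1.0, x_max=3.0):
    """Search over +/-u_amp square-wave inputs for the one maximizing a
    Fisher-information proxy (sum of squared sensitivities dx/da) for the
    toy system x[k+1] = a*x[k] + b*u[k], subject to the output amplitude
    constraint |x| <= x_max.  Exhaustive search stands in for the
    dynamic-programming solution of the actual algorithm."""
    best, best_J = None, -1.0
    for u in itertools.product((u_amp, -u_amp), repeat=n):
        x, s, J, ok = 0.0, 0.0, 0.0, True
        for uk in u:
            s = x + a * s              # sensitivity propagation: ds/da
            x = a * x + b * uk         # state propagation
            if abs(x) > x_max:         # output amplitude constraint violated
                ok = False
                break
            J += s * s                 # accumulate information proxy
        if ok and J > best_J:
            best, best_J = u, J
    return best, best_J
```

Dynamic programming replaces the exponential enumeration with a stagewise recursion over switch decisions, which is what makes the single-pass global optimization of the thesis practical for multi-input, multi-parameter designs.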
NASA Technical Reports Server (NTRS)
Clark, R. T.; Mccallister, R. D.
1982-01-01
The particular coding option identified as providing the best level of coding gain performance in an LSI-efficient implementation was the optimal constraint length five, rate one-half convolutional code. To determine the specific set of design parameters which optimally matches this decoder to the LSI constraints, a breadboard MCD (maximum-likelihood convolutional decoder) was fabricated and used to generate detailed performance trade-off data. The extensive performance testing data gathered during this design tradeoff study are summarized, and the functional and physical MCD chip characteristics are presented.
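A constraint length five, rate one-half convolutional code and its maximum-likelihood (Viterbi) decoder can be sketched compactly. The generator pair (23, 35 octal) is one standard choice for K = 5 and is assumed here; the abstract does not state which generators the MCD used.

```python
K = 5
GENS = (0o23, 0o35)        # assumed standard K=5, rate-1/2 generator pair
NSTATES = 1 << (K - 1)

def parity(x):
    return bin(x).count("1") & 1

def encode(bits):
    """Rate-1/2 convolutional encoder: two output bits per input bit."""
    state, out = 0, []
    for b in bits:
        reg = (b << (K - 1)) | state          # newest bit in the MSB
        out.extend(parity(reg & g) for g in GENS)
        state = reg >> 1
    return out

def viterbi(received, nbits):
    """Hard-decision maximum-likelihood decode over the code trellis."""
    INF = float("inf")
    metric = [0.0] + [INF] * (NSTATES - 1)    # start in the all-zero state
    paths = [[] for _ in range(NSTATES)]
    for k in range(nbits):
        r = received[2 * k:2 * k + 2]
        new_metric = [INF] * NSTATES
        new_paths = [None] * NSTATES
        for s in range(NSTATES):
            if metric[s] == INF:
                continue
            for b in (0, 1):                  # try both branch inputs
                reg = (b << (K - 1)) | s
                ns = reg >> 1
                expected = [parity(reg & g) for g in GENS]
                m = metric[s] + sum(e != x for e, x in zip(expected, r))
                if m < new_metric[ns]:        # keep the survivor path
                    new_metric[ns] = m
                    new_paths[ns] = paths[s] + [b]
        metric, paths = new_metric, new_paths
    best = min(range(NSTATES), key=lambda s: metric[s])
    return paths[best]
```

The decoder's work per bit is NSTATES x 2 branch metrics, which is exactly the kind of regular structure that made a constraint-length-five decoder attractive for an LSI implementation.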
Variable-Metric Algorithm For Constrained Optimization
NASA Technical Reports Server (NTRS)
Frick, James D.
1989-01-01
Variable Metric Algorithm for Constrained Optimization (VMACO) is a nonlinear computer program developed to calculate the least value of a function of n variables subject to general constraints, both equality and inequality. The first set of constraints is equalities and the remaining constraints are inequalities. The program utilizes an iterative method in seeking the optimal solution. Written in ANSI Standard FORTRAN 77.
PrismTech Data Distribution Service Java API Evaluation
NASA Technical Reports Server (NTRS)
Riggs, Cortney
2008-01-01
My internship duties with Launch Control Systems required me to start performance testing of the Object Management Group's (OMG) Data Distribution Service (DDS) specification implementation by PrismTech Limited through the Java programming language application programming interface (API). DDS is a networking middleware for real-time data distribution. The performance testing involves latency, redundant publishers, extended duration, redundant failover, and read performance. Time constraints allowed only for a data throughput test. I have designed the testing applications to perform all performance tests when time allows. Performance evaluation data such as megabits per second and central processing unit (CPU) time consumption were not easily attainable through the Java programming language; they required new methods and classes created in the test applications. Evaluation of this product showed the rate at which data can be sent across the network. Performance rates are better on Linux platforms than on AIX and Sun platforms. Compared to the previous C++ programming language API, the performance evaluation also shows the language differences for the implementation. The Java API of the DDS has a lower throughput performance than the C++ API.
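The throughput measurement described above reduces to timing a batch of sends and converting bytes to megabits per second, alongside CPU time. A language-neutral sketch of that harness (the transport callable and message sizes are placeholders, not the DDS API):

```python
import time

def measure_throughput(send, n_msgs=10_000, payload=b"x" * 1024):
    """Time n_msgs sends of `payload` through the supplied transport
    callable; return (wall-clock megabits per second, CPU seconds used).
    `send` stands in for a publisher write call."""
    cpu0, t0 = time.process_time(), time.perf_counter()
    for _ in range(n_msgs):
        send(payload)
    elapsed = time.perf_counter() - t0
    mbits = n_msgs * len(payload) * 8 / 1e6
    return mbits / elapsed, time.process_time() - cpu0
```

Separating wall-clock rate from CPU time is what lets a test distinguish a slow network path from a middleware binding that burns CPU on serialization, which is the kind of Java-versus-C++ difference the evaluation reports.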
Loading tests of a wing structure for a hypersonic aircraft
NASA Technical Reports Server (NTRS)
Fields, R. A.; Reardon, L. F.; Siegel, W. H.
1980-01-01
Room-temperature loading tests were conducted on a wing structure designed with a beaded panel concept for a Mach 8 hypersonic research airplane. Strain, stress, and deflection data were compared with the results of three finite-element structural analysis computer programs and with design data. The test program data were used to evaluate the structural concept and the methods of analysis used in the design. A force stiffness technique was utilized in conjunction with load conditions which produced various combinations of panel shear and compression loading to determine the failure envelope of the buckling-critical beaded panels. The force-stiffness data did not result in any predictions of buckling failure. It was, therefore, concluded that the panels were conservatively designed as a result of design constraints and assumptions of panel eccentricities. The analysis programs calculated strains and stresses competently. Comparisons between calculated and measured structural deflections showed good agreement. The test program offered a positive demonstration of the beaded panel concept subjected to room-temperature load conditions.
A programming system for research and applications in structural optimization
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, J.; Rogers, J. L., Jr.
1981-01-01
The flexibility necessary for such diverse utilizations is achieved by combining, in a modular manner, a state-of-the-art optimization program, a production level structural analysis program, and user supplied and problem dependent interface programs. Standard utility capabilities in modern computer operating systems are used to integrate these programs. This approach results in flexibility of the optimization procedure organization and versatility in the formulation of constraints and design variables. Features shown in numerical examples include: variability of structural layout and overall shape geometry, static strength and stiffness constraints, local buckling failure, and vibration constraints.
An Analysis of the Distinction between Deep and Shallow Expert Systems
1989-08-01
will also be important as we discuss in later sections of this paper. The following test has been proposed by which one may decide whether one program...which is explicitly represented or computed in M"’ (Klein and Finin, 1987). The authors of this test acknowledge that notions such as implicit and...constraints in (DeKleer and Brown, 1984) is that the if-then form of production rules "falsely implies the passage of time" between the test and action
Software for Planning Scientific Activities on Mars
NASA Technical Reports Server (NTRS)
Ai-Chang, Mitchell; Bresina, John; Jonsson, Ari; Hsu, Jennifer; Kanefsky, Bob; Morris, Paul; Rajan, Kanna; Yglesias, Jeffrey; Charest, Len; Maldague, Pierre
2003-01-01
Mixed-Initiative Activity Plan Generator (MAPGEN) is a ground-based computer program for planning and scheduling the scientific activities of instrumented exploratory robotic vehicles, within the limitations of available resources onboard the vehicle. MAPGEN is a combination of two prior software systems: (1) an activity-planning program, APGEN, developed at NASA's Jet Propulsion Laboratory and (2) the Europa planner/scheduler from NASA Ames Research Center. MAPGEN performs all of the following functions: Automatic generation of plans and schedules for scientific and engineering activities; Testing of hypotheses (or what-if analyses of various scenarios); Editing of plans; Computation and analysis of resources; and Enforcement and maintenance of constraints, including resolution of temporal and resource conflicts among planned activities. MAPGEN can be used in either of two modes: one in which the planner/scheduler is turned off and only the basic APGEN functionality is utilized, or one in which both component programs are used to obtain the full planning, scheduling, and constraint-maintenance functionality.
Feedback-Driven Dynamic Invariant Discovery
NASA Technical Reports Server (NTRS)
Zhang, Lingming; Yang, Guowei; Rungta, Neha S.; Person, Suzette; Khurshid, Sarfraz
2014-01-01
Program invariants can help software developers identify program properties that must be preserved as the software evolves, however, formulating correct invariants can be challenging. In this work, we introduce iDiscovery, a technique which leverages symbolic execution to improve the quality of dynamically discovered invariants computed by Daikon. Candidate invariants generated by Daikon are synthesized into assertions and instrumented onto the program. The instrumented code is executed symbolically to generate new test cases that are fed back to Daikon to help further refine the set of candidate invariants. This feedback loop is executed until a fix-point is reached. To mitigate the cost of symbolic execution, we present optimizations to prune the symbolic state space and to reduce the complexity of the generated path conditions. We also leverage recent advances in constraint solution reuse techniques to avoid computing results for the same constraints across iterations. Experimental results show that iDiscovery converges to a set of higher quality invariants compared to the initial set of candidate invariants in a small number of iterations.
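The feedback loop to a fix-point can be sketched abstractly. The toy Python rendition below uses invented stand-in functions in place of Daikon (`infer`) and the symbolic executor (`symbolic_tests`); the candidate invariants and inputs are illustrative only:

```python
def idiscovery(initial_tests, infer, symbolic_tests):
    """Iterate invariant inference and symbolic test generation
    until the candidate invariant set reaches a fix-point."""
    tests = set(initial_tests)
    invariants = infer(tests)
    while True:
        new_tests = symbolic_tests(invariants) - tests
        if not new_tests:
            return invariants          # fix-point: no new tests generated
        tests |= new_tests
        refined = infer(tests)
        if refined == invariants:
            return invariants          # fix-point: invariants unchanged
        invariants = refined

# Toy instance over f(x) = abs(x): one true and one false candidate.
CANDIDATES = {"out>=0": lambda x: abs(x) >= 0,
              "out==x": lambda x: abs(x) == x}

def infer(tests):
    """Stand-in for Daikon: keep candidates that hold on every test."""
    return frozenset(n for n, p in CANDIDATES.items()
                     if all(p(t) for t in tests))

def symbolic_tests(invs):
    """Stand-in for symbolic execution: return a refuting input, if any."""
    return {-3} if "out==x" in invs else set()

invs = idiscovery({1, 2}, infer, symbolic_tests)
```

Starting from positive inputs, the spurious candidate `out==x` survives inference; the symbolic stand-in refutes it with a negative input, and the loop converges to the true invariant.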
Key issues in the thermal design of spaceborne cryogenic infrared instruments
NASA Astrophysics Data System (ADS)
Schember, Helene R.; Rapp, Donald
1992-12-01
Thermal design and analysis play an integral role in the development of spaceborne cryogenic infrared (IR) instruments. From conceptual sketches to final testing, both direct and derived thermal requirements place significant constraints on the instrument design. Although in practice these thermal requirements are interdependent, the sources of most thermal constraints may be grouped into six distinct categories. These are: (1) Detector temperatures, (2) Optics temperatures, (3) Pointing or alignment stability, (4) Mission lifetime, (5) Orbit, and (6) Test and Integration. In this paper, we discuss these six sources of thermal requirements with particular regard to development of instrument packages for low background infrared astronomical observatories. In the end, the thermal performance of these instruments must meet a set of thermal requirements. The development of these requirements is typically an ongoing and interactive process, however, and the thermal design must maintain flexibility and robustness throughout the process. The thermal (or cryogenic) engineer must understand the constraints imposed by the science requirements, the specific hardware, the observing environment, the mission design, and the testing program. By balancing these often competing factors, the system-oriented thermal engineer can work together with the experiment team to produce an effective overall design of the instrument.
Development and testing of airfoils for high-altitude aircraft
NASA Technical Reports Server (NTRS)
Drela, Mark (Principal Investigator)
1996-01-01
Specific tasks included airfoil design; study of airfoil constraints on the pullout maneuver; selection of tail airfoils; examination of wing twist; test section instrumentation and layout; and integrated airfoil/heat-exchanger tests. In the course of designing the airfoil, specifically for the APEX test vehicle, extensive studies were made over the Mach and Reynolds number ranges of interest. The airfoil is intended to be representative of those required for lightweight aircraft operating at extreme altitudes, which is the primary research objective of the APEX program. Also considered were thickness, pitching moment, and off-design behavior. The maximum ceiling parameter M(exp 2)C(sub L) value achievable by the Apex-16 airfoil was found to be a strong constraint on the pullout maneuver. The NACA 1410 and 2410 airfoils (inverted) were identified as good candidates for the tail, with predictable behavior at low Reynolds numbers and good tolerance to flap deflections. With regard to wing twist, it was decided that a simple flat wing was a reasonable compromise. The test section instrumentation consisted of surface pressure taps, wake rakes, surface-mounted microphones, and skin-friction gauges. Also, a modest wind tunnel test was performed for an integrated airfoil/heat-exchanger configuration, which is currently on Aurora's 'Theseus' aircraft. Although not directly related to the APEX tests, the aerodynamics of heat exchangers has been identified as a crucial aspect of designing high-altitude aircraft and hence is relevant to the ERAST program.
Komar, Alyssa; Ashley, Kelsey; Hanna, Kelly; Lavallee, Julia; Woodhouse, Janet; Bernstein, Janet; Andres, Matthew; Reed, Nick
2016-01-01
A pretest-posttest retrospective design was used to evaluate the impact of a group-based modified constraint-induced movement therapy (mCIMT) program on upper extremity function and occupational performance. Twenty children ages 3 to 18 years with hemiplegia following an acquired brain injury participated in a 2-week group mCIMT program. Upper extremity function was measured with the Assisting Hand Assessment (AHA) and subtests from the Quality of Upper Extremity Skills Test (QUEST). Occupational performance and satisfaction were assessed using the Canadian Occupational Performance Measure (COPM). Data were analyzed using a Wilcoxon signed-ranks test. Group-based analysis revealed statistically significant improvements in upper extremity function and occupational performance from pre- to postintervention on all outcome measures (AHA: Z = -3.63, p < .001; QUEST Grasps: Z = -3.10, p = .002; QUEST Dissociated Movement: Z = -2.51, p = .012; COPM Performance: Z = -3.64, p < .001; COPM Satisfaction: Z = -3.64, p < .001). Across individuals, clinically significant improvements were found in 65% of participants' AHA scores. Eighty percent of COPM Performance scores and 70% of COPM Satisfaction scores demonstrated clinically significant improvements in at least one identified goal. This study is an initial step in evaluating and providing preliminary evidence supporting the effectiveness of a group-based mCIMT program for children with hemiplegia following an acquired brain injury.
Maillot, Matthieu; Ferguson, Elaine L; Drewnowski, Adam; Darmon, Nicole
2008-06-01
Nutrient profiling ranks foods based on their nutrient content, and may help identify foods with a good nutritional quality for their price. This hypothesis was tested using diet modeling with linear programming. Analyses were undertaken using food intake data from the nationally representative French INCA (enquête Individuelle et Nationale sur les Consommations Alimentaires) survey and its associated food composition and price database. For each food, a nutrient profile score was defined as the ratio between the previously published nutrient density score (NDS) and the limited nutrient score (LIM); a nutritional quality for price indicator was developed and calculated from the relationship between its NDS:LIM and energy cost (in euro/100 kcal). We developed linear programming models to design diets that fulfilled increasing levels of nutritional constraints at a minimal cost. The median NDS:LIM values of foods selected in modeled diets increased as the levels of nutritional constraints increased (P = 0.005). In addition, the proportion of foods with a good nutritional quality for price indicator was higher (P < 0.0001) among foods selected (81%) than among foods not selected (39%) in modeled diets. This agreement between the linear programming and the nutrient profiling approaches indicates that nutrient profiling can help identify foods of good nutritional quality for their price. Linear programming is a useful tool for testing nutrient profiling systems and validating the concept of nutrient profiling.
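The NDS:LIM score and the quality-for-price indicator can be illustrated in miniature. The foods, scores, and cutoffs below are invented for illustration and are not drawn from the INCA database:

```python
# Hypothetical foods: (NDS, LIM, energy cost in euro/100 kcal); all invented.
foods = {
    "lentils": (45.0,  5.0, 0.10),
    "pastry":  (10.0, 40.0, 0.15),
    "yogurt":  (30.0, 10.0, 0.25),
}

def nds_lim(food):
    """Nutrient profile score: nutrient density over limited nutrients."""
    nds, lim, _ = foods[food]
    return nds / lim

def good_value(food, ratio_cutoff=2.0, cost_cutoff=0.20):
    """Good nutritional quality for the price: high NDS:LIM, low energy cost."""
    _, _, cost = foods[food]
    return nds_lim(food) >= ratio_cutoff and cost <= cost_cutoff
```

In the study, a linear program selects a minimum-cost diet subject to nutritional constraints; here the indicator alone shows how a high NDS:LIM food at low energy cost ("lentils") qualifies while a nutrient-poor or expensive one does not.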
ERIC Educational Resources Information Center
Bowen, Shelly-Ann K.; Saunders, Ruth P.; Richter, Donna L.; Hussey, Jim; Elder, Keith; Lindley, Lisa
2010-01-01
Most HIV-prevention funding agencies require the use of evidence-based behavioral interventions, tested and proven to be effective through outcome evaluation. Adaptation of programs during implementation is common and may be influenced by many factors, including agency mission, time constraints, and funding streams. There are few theoretical…
Warid, Warid; Hizam, Hashim; Mariun, Norman; Abdul-Wahab, Noor Izzri
2016-01-01
This paper proposes a new formulation for the multi-objective optimal power flow (MOOPF) problem for meshed power networks considering distributed generation. An efficacious multi-objective fuzzy linear programming optimization (MFLP) algorithm is proposed to solve the aforementioned problem with and without considering the distributed generation (DG) effect. A variant combination of objectives is considered for simultaneous optimization, including power loss, voltage stability, and shunt capacitors MVAR reserve. Fuzzy membership functions for these objectives are designed with extreme targets, whereas the inequality constraints are treated as hard constraints. The multi-objective fuzzy optimal power flow (OPF) formulation was converted into a crisp OPF in a successive linear programming (SLP) framework and solved using an efficient interior point method (IPM). To test the efficacy of the proposed approach, simulations are performed on the IEEE 30-bus and IEEE 118-bus test systems. The MFLP optimization is solved for several optimization cases. The obtained results are compared with those presented in the literature. A unique solution with a high satisfaction for the assigned targets is gained. Results demonstrate the effectiveness of the proposed MFLP technique in terms of solution optimality and rapid convergence. Moreover, the results indicate that using the optimal DG location with the MFLP algorithm provides the solution with the highest quality.
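The max-min (fuzzy) satisfaction idea behind such formulations can be sketched in one dimension. The objective functions, targets, and grid search below are invented stand-ins for the paper's SLP/interior-point machinery:

```python
def membership(value, worst, best):
    """Linear fuzzy membership: 0 at the worst target, 1 at the best."""
    m = (value - worst) / (best - worst)
    return max(0.0, min(1.0, m))

# Toy trade-off in a single decision variable x: lowering the power loss
# (minimize) also lowers the MVAR reserve (maximize), so the two
# objectives conflict and the max-min compromise sits in between.
def satisfaction(x):
    loss = 10 - 4 * x        # MW, smaller is better
    reserve = 8 - 3 * x      # MVAR, larger is better
    return min(membership(loss, worst=10, best=2),
               membership(reserve, worst=2, best=8))

# Grid search stands in for the successive-linear-programming solve.
best_x = max((i / 50 for i in range(101)), key=satisfaction)
```

Maximizing the minimum membership yields the compromise point where both satisfactions are equal, which is the "high satisfaction for the assigned targets" property the abstract describes.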
Development of a prototype commonality analysis tool for use in space programs
NASA Technical Reports Server (NTRS)
Yeager, Dorian P.
1988-01-01
A software tool to aid in performing commonality analyses, called the Commonality Analysis Problem Solver (CAPS), was designed, and a prototype version (CAPS 1.0) was implemented and tested. CAPS 1.0 runs in an MS-DOS or IBM PC-DOS environment. CAPS is designed around a simple input language that provides a natural syntax for the description of feasibility constraints. It provides its users with the ability to load a database representing a set of design items, describe the feasibility constraints on items in that database, and do a comprehensive cost analysis to find the most economical substitution pattern.
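CAPS's input language is not reproduced in the abstract, but the underlying idea of checking feasibility constraints over an item database and picking the most economical substitution can be sketched. The item names, capacities, and costs below are invented:

```python
# Hypothetical design-item database: name -> (capacity, unit cost).
items = {"tank_A": (100, 5.0), "tank_B": (120, 5.5), "tank_C": (80, 4.0)}

def feasible_substitutes(required_capacity):
    """Items satisfying the feasibility constraint for one requirement."""
    return {n for n, (cap, _) in items.items() if cap >= required_capacity}

def cheapest_common_item(requirements):
    """Most economical single item usable for every requirement,
    i.e. the commonality substitution with the lowest unit cost."""
    common = set.intersection(*(feasible_substitutes(r) for r in requirements))
    return min(common, key=lambda n: items[n][1]) if common else None
```

For two requirements of 90 and 70 capacity units, both tanks A and B are feasible everywhere, and the cheaper one is chosen as the common substitute.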
Thermal/Structural Tailoring of Engine Blades (T/STAEBL) User's manual
NASA Technical Reports Server (NTRS)
Brown, K. W.
1994-01-01
The Thermal/Structural Tailoring of Engine Blades (T/STAEBL) system is a computer code that is able to perform numerical optimizations of cooled jet engine turbine blades and vanes. These optimizations seek an airfoil design of minimum operating cost that satisfies realistic design constraints. This report documents the organization of the T/STAEBL computer program, its design and analysis procedure, its optimization procedure, and provides an overview of the input required to run the program, as well as the computer resources required for its effective use. Additionally, usage of the program is demonstrated through a validation test case.
A Boilerplate Capsule Test Technique for the Orion Parachute Test Program
NASA Technical Reports Server (NTRS)
Moore, James W.; Fraire, Usbaldo, Jr.
2013-01-01
The test program developing parachutes for the Orion/MPCV includes drop tests of a Parachute Test Vehicle designed to emulate the wake of the Orion capsule. Delivery of this test vehicle to the initial velocity, altitude, and orientation required for the test is a difficult problem involving multiple engineering disciplines. The available aircraft delivery options imposed constraints on the test vehicle development and concept of operations. This paper describes the development of this test technique. The engineering challenges include the extraction from an aircraft and separation of two aerodynamically unstable vehicles, one of which will be delivered to a specific orientation with reasonably small rates. The desired attitude is achieved by precisely targeting the separation point using on-board monitoring of the motion. The design of the test vehicle is described. The trajectory simulations and other analyses used to develop this technique and predict the behavior of the test article are reviewed in detail. The application of the technique on several successful drop tests is summarized.
Requirements Analysis for Large Ada Programs: Lessons Learned on CCPDS- R
1989-12-01
when the design had matured and This approach was not optimal from the formal the SRS role was to be the tester’s contract, implemen- testing and...on the software development CPU processing load. These constraints primar- process is the necessity to include sufficient testing ily affect algorithm...allocations and timing requirements are by-products of the software design process when multiple CSCls are a P R StrR eSOFTWARE ENGINEERING executed within
Program manual for ASTOP, an Arbitrary space trajectory optimization program
NASA Technical Reports Server (NTRS)
Horsewood, J. L.
1974-01-01
The ASTOP program (an Arbitrary Space Trajectory Optimization Program) designed to generate optimum low-thrust trajectories in an N-body field while satisfying selected hardware and operational constraints is presented. The trajectory is divided into a number of segments or arcs over which the control is held constant. This constant control over each arc is optimized using a parameter optimization scheme based on gradient techniques. A modified Encke formulation of the equations of motion is employed. The program provides a wide range of constraint, end conditions, and performance index options. The basic approach is conducive to future expansion of features such as the incorporation of new constraints and the addition of new end conditions.
It Takes a Village: Network Effects on Rural Education in Afghanistan. PRGS Dissertation
ERIC Educational Resources Information Center
Hoover, Matthew Amos
2014-01-01
Often, development organizations confront a tradeoff between program priorities and operational constraints. These constraints may be financial, capacity, or logistical; regardless, the tradeoff often requires sacrificing portions of a program. This work is concerned with figuring out how, when constrained, an organization or program manager can…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nanstad, Randy K; Sokolov, Mikhail A; Merkle, John Graham
2007-01-01
To enable determination of the fracture toughness reference temperature, T0, with reactor pressure vessel surveillance specimens, the precracked Charpy (PCVN) three-point bend, SE(B), specimen is of interest. Compared with the 25-mm (1 in.) thick compact, 1TC(T), specimen, tests with the PCVN specimen (10x10x55 mm) have resulted in T0 temperatures as much as 40 °C lower (a so-called specimen bias effect). The Heavy-Section Steel Irradiation (HSSI) Program at Oak Ridge National Laboratory developed a two-part project to evaluate the C(T) versus PCVN differences: (1) calibration experiments concentrating on test practices, and (2) a matrix of transition range tests with various specimen geometries and sizes, including 1T SE(B) and 1TC(T). The test material selected was a plate of A533 grade B class 1 steel. The calibration experiments included assessment of the computational validity of J-integral determinations, while the constraint characteristics of various specimen types and sizes were evaluated using key curves and notch strength determinations. The results indicate that J-integral solutions for the small PCVN specimen are comparable in terms of J-integral validity with 1T bend specimens. Regarding constraint evaluations, Phase I deformation is defined where plastic deformation is confined to crack tip plastic zone development, whereas Phase II deformation is defined where plastic hinging deformation develops. In Phase II deformation, the 0.5T SE(B) specimen (slightly larger than the PCVN specimen) consistently showed the highest constraint of all SE(B) specimens evaluated for constraint comparisons. The PCVN specimen begins the Phase II type of deformation at relatively low KR levels, with the result that KJc values above about 70 MPa√m from precracked Charpy specimens are under extensive plastic hinging deformation.
Liang, Bin; Li, Yongbao; Wei, Ran; Guo, Bin; Xu, Xuang; Liu, Bo; Li, Jiafeng; Wu, Qiuwen; Zhou, Fugen
2018-01-05
With robot-controlled linac positioning, robotic radiotherapy systems such as CyberKnife significantly increase freedom of radiation beam placement, but also impose more challenges on treatment plan optimization. The resampling mechanism in the vendor-supplied treatment planning system (MultiPlan) cannot fully explore the increased beam direction search space. Besides, a sparse treatment plan (using fewer beams) is desired to improve treatment efficiency. This study proposes a singular value decomposition linear programming (SVDLP) optimization technique for circular collimator based robotic radiotherapy. The SVDLP approach initializes the input beams by simulating the process of covering the entire target volume with equivalent beam tapers. The requirements on dosimetry distribution are modeled as hard and soft constraints, and the sparsity of the treatment plan is achieved by compressive sensing. The proposed linear programming (LP) model optimizes beam weights by minimizing the deviation of soft constraints subject to hard constraints, with a constraint on the l1 norm of the beam weight. A singular value decomposition (SVD) based acceleration technique was developed for the LP model. Based on the degeneracy of the influence matrix, the model is first compressed into lower dimension for optimization, and then back-projected to reconstruct the beam weight. After beam weight optimization, the number of beams is reduced by removing the beams with low weight, and optimizing the weights of the remaining beams using the same model. This beam reduction technique is further validated by a mixed integer programming (MIP) model. The SVDLP approach was tested on a lung case. The results demonstrate that the SVD acceleration technique speeds up the optimization by a factor of 4.8. Furthermore, the beam reduction achieves a similar plan quality to the globally optimal plan obtained by the MIP model, but is one to two orders of magnitude faster.
Furthermore, the SVDLP approach is tested and compared with MultiPlan on three clinical cases of varying complexities. In general, the plans generated by the SVDLP achieve steeper dose gradient, better conformity and less damage to normal tissues. In conclusion, the SVDLP approach effectively improves the quality of treatment plan due to the use of the complete beam search space. This challenging optimization problem with the complete beam search space is effectively handled by the proposed SVD acceleration.
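The SVD compression step can be illustrated independently of the LP. The NumPy sketch below uses invented matrix sizes and a plain least-squares solve in place of the paper's weighted LP with its l1 and nonnegativity constraints; it compresses a degenerate influence matrix, solves in the low-dimensional space, and back-projects the beam weights:

```python
import numpy as np

rng = np.random.default_rng(0)
# Influence matrix D (voxels x beams) with strong column degeneracy:
# many candidate beams are combinations of a few basic beam tapers.
basis = rng.random((200, 5))
D = basis @ rng.random((5, 40))        # rank <= 5 despite 40 beams
d_target = rng.random(200)             # prescribed dose per voxel

# Compress: keep only singular directions above a numerical tolerance.
U, s, Vt = np.linalg.svd(D, full_matrices=False)
k = int(np.sum(s > 1e-8 * s[0]))       # numerical rank (5 here)
Uk, sk, Vtk = U[:, :k], s[:k], Vt[:k]

# Solve in the k-dimensional space (least squares stands in for the LP),
# then back-project to recover a weight for each of the 40 beams.
y = (Uk.T @ d_target) / sk
w = Vtk.T @ y
```

The optimization variable shrinks from 40 beam weights to k coefficients, which is the source of the reported speed-up; the back-projection `Vtk.T @ y` reconstructs full beam weights whose dose `D @ w` matches the best achievable fit in the beam span.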
Cohen-Holzer, Marilyn; Sorek, Gilad; Schweizer, Maayan; Katz-Leurer, Michal
2017-01-01
An intensive hybrid program, combining constraint with bimanual training, improves upper extremity function as well as walking endurance of children with unilateral cerebral palsy (UCP). Endurance improvement may be associated with adaptation of the cardiac autonomic regulation system (CARS), known to be impaired among these children. The aim was to examine the influence of an intensive hybrid program on CARS, walking endurance, and the correlation with upper extremity function of children with UCP. Twenty-four children aged 6-10 years with UCP participated in a hybrid program, 10 days, 6 hours per day. Data were collected pre-, post- and 3-months post-intervention. Main outcome measures included the Polar RS800CX for heart rate (HR) and heart rate variability (HRV) data, the 6-Minute Walk Test (6MWT) for endurance, and the Assisting Hand Assessment (AHA) and Jebsen-Taylor Test of Hand Function (JTTHF) for bimanual and unimanual function. A significant reduction in HR and an increase in HRV at post- and 3-month post-intervention was noted (χ2(2) = 8.3, p = 0.016), along with a significant increase in the 6MWT, with a median increase of 81 meters (χ2(2) = 11.0, p = 0.004) over the same interval. A significant improvement was noted in unimanual and bimanual performance following the intervention. An intensive hybrid program effectively improved CARS function as well as walking endurance and upper extremity function in children with UCP.
Optimization of Water Resources and Agricultural Activities for Economic Benefit in Colorado
NASA Astrophysics Data System (ADS)
LIM, J.; Lall, U.
2017-12-01
The limited water resources available for irrigation are a key constraint for the important agricultural sector of Colorado's economy. As climate change and groundwater depletion reshape these resources, it is essential to understand the economic potential of water resources under different agricultural production practices. This study uses a linear programming optimization at the county spatial scale and annual temporal scale to study the optimal allocation of water withdrawal and crop choices. The model, AWASH, reflects streamflow constraints between different extraction points, six field crops, and a distinct irrigation decision for maize and wheat. The optimized decision variables, under different environmental, social, economic, and physical constraints, provide long-term solutions for ground and surface water distribution and for land use decisions so that the state can generate the maximum net revenue. Colorado, one of the largest agricultural producers, is examined as a case study, and the sensitivity to water price and to climate variability is explored.
Examining the effects of postural constraints on estimating reach.
Gabbard, Carl; Cordova, Alberto; Lee, Sunghan
2007-07-01
The tendency to overestimate has consistently been reported in studies of reachability estimation. According to one of the more prominent explanations, the postural stability hypothesis, the perceived reaching limit depends on the individual's perceived postural constraints. To test that proposition, the authors compared estimates of reachability of 38 adults (a) in the seated posture (P1) and (b) in the more demanding posture of standing on one foot and leaning forward (P2). Although there was no difference between conditions for total error, results for the distribution and direction of error indicated that participants overestimated in the P1 condition and underestimated in the P2 condition. It therefore appears that perceived postural constraints could be a factor in judgments of reachability. When participants in the present study perceived greater postural demands, they may have elected to program a more conservative strategy that resulted in underestimation.
Spacecraft command verification: The AI solution
NASA Technical Reports Server (NTRS)
Fesq, Lorraine M.; Stephan, Amy; Smith, Brian K.
1990-01-01
Recently, a knowledge-based approach was used to develop a system called the Command Constraint Checker (CCC) for TRW. CCC was created to automate the process of verifying spacecraft command sequences. To check command files by hand for timing and sequencing errors is a time-consuming and error-prone task. Conventional software solutions were rejected when it was estimated that it would require 36 man-months to build an automated tool to check constraints by conventional methods. Using rule-based representation to model the various timing and sequencing constraints of the spacecraft, CCC was developed and tested in only three months. By applying artificial intelligence techniques, CCC designers were able to demonstrate the viability of AI as a tool to transform difficult problems into easily managed tasks. The design considerations used in developing CCC are discussed and the potential impact of this system on future satellite programs is examined.
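CCC's actual rule base belongs to the TRW project, but the rule-based approach to command sequence verification can be sketched. The commands, timing limits, and rules below are invented for illustration:

```python
# Hypothetical command sequence: (time in seconds, command) pairs.
sequence = [(0, "HEATER_ON"), (5, "CAMERA_ON"), (6, "CAMERA_SNAP")]

def warmup_rule(seq):
    """Timing constraint: CAMERA_SNAP at least 2 s after CAMERA_ON."""
    times = {cmd: t for t, cmd in seq}
    if "CAMERA_SNAP" in times and (
            "CAMERA_ON" not in times
            or times["CAMERA_SNAP"] - times["CAMERA_ON"] < 2):
        return ["CAMERA_SNAP issued before camera warm-up complete"]
    return []

def ordering_rule(seq):
    """Sequencing constraint: timestamps must be non-decreasing."""
    stamps = [t for t, _ in seq]
    return [] if stamps == sorted(stamps) else ["commands out of time order"]

def check(seq, rules=(warmup_rule, ordering_rule)):
    """Run every rule against the sequence; collect all violations."""
    return [v for rule in rules for v in rule(seq)]
```

Each constraint lives in its own rule, so adding a new spacecraft constraint means adding a rule rather than modifying checking logic, which is the maintainability property that made the rule-based representation attractive.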
NASA Technical Reports Server (NTRS)
Roschke, E. J.; Coulbert, C. D.
1979-01-01
The five relative energy release criteria (RERC), a first step toward formulating a unified concept applicable to the development of fires in enclosures, place upper bounds on the rate and amount of energy released during a fire. They are independent, readily calculated, and applicable to any enclosure regardless of size, which makes them useful in pretest planning and in interpreting experimental data. Data from several specific fire test programs were examined to evaluate the potential use of the RERC to provide test planning guidelines, and the RERC were compared with experimental data obtained in full-scale enclosures. These results confirm that, in general, the RERC identify the proper limiting constraints on enclosure fire development and determine the bounds of the fire development envelope. Plotting actual fire data against the RERC reveals new insights into fire behavior and the controlling constraints in fire development. The RERC were calculated and plotted for several descriptions of full-scale fires in various aircraft compartments.
Joint Chance-Constrained Dynamic Programming
NASA Technical Reports Server (NTRS)
Ono, Masahiro; Kuwata, Yoshiaki; Balaram, J. Bob
2012-01-01
This paper presents a novel dynamic programming algorithm with a joint chance constraint, which explicitly bounds the risk of failure in order to maintain the state within a specified feasible region. A joint chance constraint cannot be handled by existing constrained dynamic programming approaches, since their application is limited to constraints in the same form as the cost function, that is, an expectation over a sum of one-stage costs. We overcome this challenge by reformulating the joint chance constraint into a constraint on an expectation over a sum of indicator functions, which can be incorporated into the cost function by dualizing the optimization problem. As a result, the primal variables can be optimized by standard dynamic programming, while the dual variable is optimized by a root-finding algorithm that converges exponentially. Error bounds on the primal and dual objective values are rigorously derived. We demonstrate the algorithm on a path planning problem, as well as an optimal control problem for Mars entry, descent, and landing. The simulations are conducted using real Mars terrain data, with four million discrete states at each time step.
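The key reformulation, bounding the joint chance constraint by an expectation over a sum of indicator functions, rests on Boole's inequality and can be checked by Monte Carlo simulation. A minimal sketch (the random-walk dynamics, threshold, and sample sizes are illustrative assumptions, not the paper's model):

```python
import random

def estimate(n_steps, threshold, n_trials, seed=0):
    """Estimate P(exists t: |x_t| > threshold) for a Gaussian random walk,
    together with the union-bound surrogate E[sum_t 1{|x_t| > threshold}]."""
    rng = random.Random(seed)
    joint_failures = 0
    indicator_sum = 0
    for _ in range(n_trials):
        x, failed = 0.0, False
        for _ in range(n_steps):
            x += rng.gauss(0.0, 1.0)
            if abs(x) > threshold:
                failed = True
                indicator_sum += 1   # one-stage indicator feeds the surrogate
        joint_failures += failed
    return joint_failures / n_trials, indicator_sum / n_trials

p_joint, e_sum = estimate(n_steps=10, threshold=5.0, n_trials=20000)
# Boole's inequality: the dualizable surrogate upper-bounds the true joint risk
assert 0.0 < p_joint <= e_sum
```

Constraining the surrogate below a risk bound therefore conservatively enforces the joint chance constraint, and the surrogate has exactly the sum-of-stage-costs form that dynamic programming handles.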
King, Laurie A; Horak, Fay B
2009-01-01
This article introduces a new framework for therapists to develop an exercise program to delay mobility disability in people with Parkinson disease (PD). Mobility, or the ability to efficiently navigate and function in a variety of environments, requires balance, agility, and flexibility, all of which are affected by PD. This article summarizes recent research identifying how constraints on mobility specific to PD, such as rigidity, bradykinesia, freezing, poor sensory integration, inflexible program selection, and impaired cognitive processing, limit mobility in people with PD. Based on these constraints, a conceptual framework for exercises to maintain and improve mobility is presented. An example of a constraint-focused agility exercise program, incorporating movement principles from tai chi, kayaking, boxing, lunges, agility training, and Pilates exercises, is presented. This new constraint-focused agility exercise program is based on a strong scientific framework and includes progressive levels of sensorimotor, resistance, and coordination challenges that can be customized for each patient while maintaining fidelity. Principles for improving mobility presented here can be incorporated into an ongoing or long-term exercise program for people with PD. PMID:19228832
A heuristic constraint programmed planner for deep space exploration problems
NASA Astrophysics Data System (ADS)
Jiang, Xiao; Xu, Rui; Cui, Pingyuan
2017-10-01
In recent years, the increasing number of scientific payloads and growing constraints on the probe have made constraint processing technology a hotspot in the deep space planning field. In the planning procedure, the ordering of variables and values plays a vital role. In this paper we present two heuristic ordering methods, one for variables and one for values, and on this basis propose a graphplan-like constraint-programmed planner. In the planner we convert the traditional constraint satisfaction problem to a time-tagged form with different levels. Inspired by the most-constrained-first principle in constraint satisfaction problems (CSPs), the variable heuristic is based on the number of unassigned variables in a constraint, and the value heuristic on the completion degree of the support set. Simulation experiments show that the proposed planner is effective and that its performance is competitive with other kinds of planners.
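The most-constrained-first principle the heuristics draw on can be illustrated with a generic backtracking CSP solver that always branches on the variable with the fewest surviving values (a sketch of the general principle only, not the planner itself; the map-colouring instance and all names are illustrative):

```python
def solve(variables, domains, neighbors):
    """Backtracking CSP solver with most-constrained-first variable ordering."""
    assignment = {}

    def consistent(var, val):
        return all(assignment.get(n) != val for n in neighbors[var])

    def backtrack():
        if len(assignment) == len(variables):
            return True
        # most constrained first: fewest values consistent with the assignment
        var = min((v for v in variables if v not in assignment),
                  key=lambda v: sum(consistent(v, d) for d in domains[v]))
        for val in domains[var]:
            if consistent(var, val):
                assignment[var] = val
                if backtrack():
                    return True
                del assignment[var]
        return False

    return assignment if backtrack() else None

# 3-colour the Australia map, a standard CSP textbook example
neighbors = {"WA": ["NT", "SA"], "NT": ["WA", "SA", "Q"],
             "SA": ["WA", "NT", "Q", "NSW", "V"], "Q": ["NT", "SA", "NSW"],
             "NSW": ["Q", "SA", "V"], "V": ["SA", "NSW"], "T": []}
domains = {v: ["red", "green", "blue"] for v in neighbors}
sol = solve(list(neighbors), domains, neighbors)
assert sol and all(sol[a] != sol[b] for a in neighbors for b in neighbors[a])
```

Branching on the tightest variable first tends to expose dead ends early, which is the same motivation behind the planner's variable heuristic.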
ERIC Educational Resources Information Center
Murphy, Mark Patrick
2012-01-01
Although much has been researched and written about the value of extracurricular programs in U.S. public schools, few studies have addressed the combined effect that school reform initiatives, including myriad standardized tests, accountability measures, and massive financial crisis which have become more commonplace during periods of economic…
Mathematical Modeling for Optimal System Testing under Fixed-cost Constraint
2009-04-22
Outage maintenance checks on large generator windings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nindra, B.; Jeney, S.I.; Slobodinsky, Y.
In the present days of austerity, more constraints and pressures are being brought on maintenance engineers to certify generators for reliability and life extension. Outages are shorter, and the intervals between outages are becoming longer. Annual outages were very common when utilities had no regulatory constraints and also had standby capacity. Furthermore, due to lean budgets, outage maintenance programs are being pursued more aggressively so that longer outage intervals can be achieved while ensuring peak generator performance. This paper discusses various visual checks, electrical tests, and recommended fixes to achieve the above-mentioned objectives in case any deficiencies are found.
Research Breathes New Life Into Senior Travel Program.
ERIC Educational Resources Information Center
Blazey, Michael
1986-01-01
A survey of older citizens concerning travel interests revealed constraints to participation in a travel program. A description is given of how research on attitudes and life styles indicated ways in which these constraints could be lessened. (JD)
Deducing protein structures using logic programming: exploiting minimum data of diverse types.
Sibbald, P R
1995-04-21
The extent to which a protein can be modeled from constraint data depends on the amount and quality of the data. This report quantifies a relationship between the amount of data and the achievable model resolution. In an information-theoretic framework the number of bits of information per residue needed to constrain a solution was calculated. The number of bits provided by different kinds of constraints was estimated from a tetrahedral lattice where all unique molecules of 6, 9, ..., 21 atoms were enumerated. Subsets of these molecules consistent with different constraint sets were then chosen, counted, and the root-mean-square distance between them calculated. This provided the desired relations. In a discrete system the number of possible models can be severely limited with relatively few constraints. An expert system that can model a protein from data of different types was built to illustrate the principle and was tested using known proteins as examples. C-alpha resolutions of 5 A are obtainable from 5 bits of information per amino acid and, in principle, from data that could be rapidly collected using standard biophysical techniques.
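The information-theoretic bookkeeping can be reproduced on a toy lattice. A rough sketch assuming square-lattice self-avoiding walks in place of the paper's tetrahedral-lattice enumeration (the numbers apply only to this toy model): with N possible chain models, log2(N) bits of constraint data are needed to single one out.

```python
import math

def count_saws(n_steps):
    """Count self-avoiding walks of n_steps on the square lattice,
    starting from the origin (each walk is one candidate 'model')."""
    def extend(path, left):
        if left == 0:
            return 1
        r, c = path[-1]
        return sum(extend(path + [(nr, nc)], left - 1)
                   for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1))
                   if (nr, nc) not in path)
    return extend([(0, 0)], n_steps)

n = 6
models = count_saws(n)               # 780 distinct 6-step walks
bits_per_step = math.log2(models) / n
print(models, round(bits_per_step, 2))
```

Each constraint set that is consistent with only a subset of the walks reduces this bit count, which is the relationship the paper quantifies between data volume and achievable model resolution.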
Automatic data partitioning on distributed memory multicomputers. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Gupta, Manish
1992-01-01
Distributed-memory parallel computers are increasingly being used to provide high levels of performance for scientific applications. Unfortunately, such machines are not very easy to program. A number of research efforts seek to alleviate this problem by developing compilers that take over the task of generating communication. The communication overheads and the extent of parallelism exploited in the resulting target program are determined largely by the manner in which data is partitioned across different processors of the machine. Most of the compilers provide no assistance to the programmer in the crucial task of determining a good data partitioning scheme. A novel approach is presented, the constraints-based approach, to the problem of automatic data partitioning for numeric programs. In this approach, the compiler identifies some desirable requirements on the distribution of various arrays being referenced in each statement, based on performance considerations. These desirable requirements are referred to as constraints. For each constraint, the compiler determines a quality measure that captures its importance with respect to the performance of the program. The quality measure is obtained through static performance estimation, without actually generating the target data-parallel program with explicit communication. Each data distribution decision is taken by combining all the relevant constraints. The compiler attempts to resolve any conflicts between constraints such that the overall execution time of the parallel program is minimized. This approach has been implemented as part of a compiler called Paradigm, that accepts Fortran 77 programs, and specifies the partitioning scheme to be used for each array in the program. We have obtained results on some programs taken from the Linpack and Eispack libraries, and the Perfect Benchmarks. 
These results are quite promising, and demonstrate the feasibility of automatic data partitioning for a significant class of scientific application programs with regular computations.
NASA Technical Reports Server (NTRS)
Coppolino, R. N.
1974-01-01
Details are presented of the implementation of the new formulation into NASTRAN including descriptions of the DMAP statements required for conversion of the program and details pertaining to problem definition and bulk data considerations. Details of the current 1/8-scale space shuttle external tank mathematical model, numerical results and analysis/test comparisons are also presented. The appendices include a description and listing of a FORTRAN program used to develop harmonic transformation bulk data (multipoint constraint statements) and sample bulk data information for a number of hydroelastic problems.
Modeling global macroclimatic constraints on ectotherm energy budgets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grant, B.W.; Porter, W.P.
1992-12-31
The authors describe a mechanistic, individual-based model of how global macroclimatic constraints affect the energy budgets of ectothermic animals. The model uses macroclimatic and biophysical characteristics of the habitat and organism, together with the tenets of heat transfer theory, to calculate hourly temperature availabilities over a year. Data on the temperature dependence of activity rate, metabolism, food consumption, and food processing capacity are used to estimate the net rate of resource assimilation, which is then integrated over time. They present a new test of this model in which the predicted energy budget sizes for 11 populations of the lizard Sceloporus undulatus are in close agreement with observed results from previous field studies. This demonstrates that model tests are feasible and the results are reasonable. Further, since the model represents an upper bound on the size of the energy budget, the observed residual deviations yield explicit predictions about the effects of environmental constraints on the bioenergetics of the study lizards within each site, predictions that may be tested by future field and laboratory studies. Three major new improvements to the modeling are discussed. First, the authors present a means to estimate microclimate thermal heterogeneity more realistically and to include its effects on field rates of individual activity and food consumption. Second, they describe an improved model of digestive function involving batch processing of consumed food. Third, they show how optimality methods (specifically, stochastic dynamic programming) may be used to model the fitness consequences of energy-allocation decisions subject to the food consumption and processing constraints predicted by the microclimate and physiological modeling.
Duality in non-linear programming
NASA Astrophysics Data System (ADS)
Jeyalakshmi, K.
2018-04-01
In this paper we consider duality and converse duality for a programming problem involving convex objective and constraint functions with finite dimensional range. We do not assume any constraint qualification. The dual is presented by reducing the problem to a standard Lagrange multiplier problem.
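The duality relations in question can be checked numerically on a one-variable convex program (an illustrative example, not one from the paper): minimize x² subject to 1 − x ≤ 0, whose Lagrange dual is q(λ) = λ − λ²/4.

```python
def dual(lam):
    """Lagrange dual of: minimize x**2 subject to 1 - x <= 0.
    L(x, lam) = x**2 + lam*(1 - x) is minimised at x = lam/2."""
    x = lam / 2.0
    return x ** 2 + lam * (1.0 - x)   # equals lam - lam**2/4

primal_opt = 1.0                                   # attained at x = 1
grid = [l / 100.0 for l in range(501)]             # lam in [0, 5]
assert all(dual(l) <= primal_opt + 1e-12 for l in grid)   # weak duality
dual_opt = max(dual(l) for l in grid)
assert abs(dual_opt - primal_opt) < 1e-9           # strong duality (Slater holds)
```

Weak duality holds for any problem; the zero duality gap here reflects convexity, the situation the paper analyzes without assuming a constraint qualification.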
Balancing Flexible Constraints and Measurement Precision in Computerized Adaptive Testing
ERIC Educational Resources Information Center
Moyer, Eric L.; Galindo, Jennifer L.; Dodd, Barbara G.
2012-01-01
Managing test specifications--both multiple nonstatistical constraints and flexibly defined constraints--has become an important part of designing item selection procedures for computerized adaptive tests (CATs) in achievement testing. This study compared the effectiveness of three procedures: constrained CAT, flexible modified constrained CAT,…
Teaching People to Manage Constraints: Effects on Creative Problem-Solving
ERIC Educational Resources Information Center
Peterson, David R.; Barrett, Jamie D.; Hester, Kimberly S.; Robledo, Issac C.; Hougen, Dean F.; Day, Eric A.; Mumford, Michael D.
2013-01-01
Constraints often inhibit creative problem-solving. This study examined the impact of training strategies for managing constraints on creative problem-solving. Undergraduates, 218 in all, were asked to work through 1 to 4 self-paced instructional programs focused on constraint management strategies. The quality, originality, and elegance of…
Energy Efficient Engine exhaust mixer model technology report addendum; phase 3 test program
NASA Technical Reports Server (NTRS)
Larkin, M. J.; Blatt, J. R.
1984-01-01
The Phase 3 exhaust mixer test program was conducted to explore the trends established during Phases 1 and 2. Combinations of mixer design parameters were tested. Phase 3 testing showed that the best performance achievable within tailpipe length and diameter constraints is 2.55 percent better than an optimized separate-flow baseline. A reduced-penetration design achieved about the same overall performance level at a substantially lower level of excess pressure loss, but with a small reduction in mixing. To improve reliability of the data, the hot and cold flow thrust coefficient analysis used in Phases 1 and 2 was augmented by calculating percent mixing from traverse data. Relative change in percent mixing between configurations was determined from thrust and flow coefficient increments. The calculation procedure developed was found to be a useful tool in assessing mixer performance. Detailed flow field data were obtained to facilitate calibration of computer codes.
ERIC Educational Resources Information Center
Elsherif, Entisar
2017-01-01
This adaptive methodological inquiry explored the affordances and constraints of one TESOL teacher education program in Libya as a conflict zone. Data was collected through seven documents and 33 questionnaires. Questionnaires were gathered from the investigated program's teacher-educators, student-teachers, and graduates, who were in-service…
Modified Fully Utilized Design (MFUD) Method for Stress and Displacement Constraints
NASA Technical Reports Server (NTRS)
Patnaik, Surya; Gendy, Atef; Berke, Laszlo; Hopkins, Dale
1997-01-01
The traditional fully stressed method performs satisfactorily for stress-limited structural design. When this method is extended to include displacement limitations in addition to stress constraints, it is known as the fully utilized design (FUD). Typically, the FUD produces an overdesign, which is the primary limitation of this otherwise elegant method. We have modified FUD in an attempt to alleviate the limitation. This new method, called the modified fully utilized design (MFUD) method, has been tested successfully on a number of designs that were subjected to multiple loads and had both stress and displacement constraints. The solutions obtained with MFUD compare favorably with the optimum results that can be generated by using nonlinear mathematical programming techniques. The MFUD method appears to have alleviated the overdesign condition and offers the simplicity of a direct, fully stressed type of design method that is distinctly different from optimization and optimality criteria formulations. The MFUD method is being developed for practicing engineers who favor traditional design methods rather than methods based on advanced calculus and nonlinear mathematical programming techniques. The Integrated Force Method (IFM) was found to be the appropriate analysis tool in the development of the MFUD method. In this paper, the MFUD method and its optimality are presented along with a number of illustrative examples.
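The fully stressed baseline that FUD and MFUD build on can be sketched for a statically determinate truss, where member forces do not change with the areas, so the stress-ratio resize A = |F| / σ_allow reaches the fully stressed state in a single pass (the forces and allowable stress below are hypothetical values for illustration):

```python
# Hypothetical axial member forces (N) and allowable stress (Pa)
forces = [50e3, -30e3, 12e3]
sigma_allow = 150e6

# Stress-ratio resizing: start from arbitrary areas, update A <- A * |sigma| / sigma_allow
areas = [1e-3] * len(forces)     # initial guess, m^2
for _ in range(2):               # one pass suffices for a determinate truss
    stresses = [abs(F) / A for F, A in zip(forces, areas)]
    areas = [A * s / sigma_allow for A, s in zip(areas, stresses)]

stresses = [abs(F) / A for F, A in zip(forces, areas)]
assert all(abs(s - sigma_allow) < 1e-6 * sigma_allow for s in stresses)
```

Displacement constraints break this simplicity, since they couple the members through the global stiffness; that coupling is the source of FUD's overdesign and the target of the MFUD correction.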
NASA Astrophysics Data System (ADS)
Hashimoto, Hiroyuki; Takaguchi, Yusuke; Nakamura, Shizuka
Instability of the calculation process and growth in calculation time with increasing problem size remain the major issues to be solved before continuous optimization techniques can be applied to practical industrial systems. This paper proposes an enhanced quadratic programming algorithm based on the interior point method, aimed mainly at improving calculation stability. The proposed method has a dynamic estimation mechanism for active constraints on variables, which fixes variables that approach their upper or lower limits and afterwards releases them as needed during the optimization process. It can be viewed as an algorithm-level integration of the active-set method's solution strategy into the interior point method framework. We describe numerical results on the commonly used benchmark problems known as "CUTEr" to show the effectiveness of the proposed method. Furthermore, test results on a large-sized ELD problem (the Economic Load Dispatching problem in electric power supply scheduling) are also described as a practical industrial application.
Speech production gains following constraint-induced movement therapy in children with hemiparesis.
Allison, Kristen M; Reidy, Teressa Garcia; Boyle, Mary; Naber, Erin; Carney, Joan; Pidcock, Frank S
2017-01-01
The purpose of this study was to investigate changes in speech skills of children who have hemiparesis and speech impairment after participation in a constraint-induced movement therapy (CIMT) program. While case studies have reported collateral speech gains following CIMT, the effect of CIMT on speech production has not previously been directly investigated to the knowledge of these investigators. Eighteen children with hemiparesis and co-occurring speech impairment participated in a 21-day clinical CIMT program. The Goldman-Fristoe Test of Articulation-2 (GFTA-2) was used to assess children's articulation of speech sounds before and after the intervention. Changes in percent of consonants correct (PCC) on the GFTA-2 were used as a measure of change in speech production. Children made significant gains in PCC following CIMT. Gains were similar in children with left and right-sided hemiparesis, and across age groups. This study reports significant collateral gains in speech production following CIMT and suggests benefits of CIMT may also spread to speech motor domains.
Park, JuHyung; Lee, NaYun; Cho, YongHo; Yang, YeongAe
2015-03-01
[Purpose] The purpose of this study was to investigate the impact of modified constraint-induced movement therapy on upper extremity function and daily life in chronic stroke patients. [Subjects and Methods] Modified constraint-induced movement therapy was conducted for 2 stroke patients with hemiplegia. It was performed 5 days a week for 2 weeks, and the participants performed their daily living activities wearing mittens for 6 hours a day, including the 2 hours of the therapy program. The assessment was conducted 5 times in the 3 weeks before and after the intervention. Upper extremity function was measured using the box and block test and a dynamometer, and performance of daily living activities was assessed using the modified Barthel index. The results were analyzed using a scatterplot and linear regression. [Results] The upper extremity functions of both participants improved after the modified constraint-induced movement therapy. Performance of daily living activities showed no change for participant 1 but improved for participant 2 after the intervention. [Conclusion] These results indicate that modified constraint-induced movement therapy is effective at improving upper extremity function and the performance of daily living activities in chronic stroke patients.
Speededness and Adaptive Testing
ERIC Educational Resources Information Center
van der Linden, Wim J.; Xiong, Xinhui
2013-01-01
Two simple constraints on the item parameters in a response--time model are proposed to control the speededness of an adaptive test. As the constraints are additive, they can easily be included in the constraint set for a shadow-test approach (STA) to adaptive testing. Alternatively, a simple heuristic is presented to control speededness in plain…
Dal Palù, Alessandro; Pontelli, Enrico; He, Jing; Lu, Yonggang
2007-01-01
The paper describes a novel framework, constructed using Constraint Logic Programming (CLP) and parallelism, to determine the association between parts of the primary sequence of a protein and alpha-helices extracted from 3D low-resolution descriptions of large protein complexes. The association is determined by extracting constraints from the 3D information, regarding length, relative position and connectivity of helices, and solving these constraints with the guidance of a secondary structure prediction algorithm. Parallelism is employed to enhance performance on large proteins. The framework provides a fast, inexpensive alternative to determine the exact tertiary structure of unknown proteins.
NASA Technical Reports Server (NTRS)
Young, Katherine C.; Sobieszczanski-Sobieski, Jaroslaw
1988-01-01
This project has two objectives. The first is to determine whether linear programming techniques can improve performance when handling design optimization problems with a large number of design variables and constraints relative to the feasible directions algorithm. The second purpose is to determine whether using the Kreisselmeier-Steinhauser (KS) function to replace the constraints with one constraint will reduce the cost of total optimization. Comparisons are made using solutions obtained with linear and non-linear methods. The results indicate that there is no cost saving using the linear method or in using the KS function to replace constraints.
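The KS function referred to here aggregates m constraint values g_j into the single smooth envelope KS(g, ρ) = (1/ρ) ln Σ_j exp(ρ g_j), which overestimates the true maximum by at most ln(m)/ρ. A small sketch with illustrative values:

```python
import math

def ks(g, rho):
    """Kreisselmeier-Steinhauser aggregate; the max is factored out
    for numerical stability at large rho."""
    gmax = max(g)
    return gmax + math.log(sum(math.exp(rho * (gi - gmax)) for gi in g)) / rho

g = [0.30, -0.20, 0.25, -1.00]   # illustrative constraint values (g_j <= 0 is feasible)
for rho in (10.0, 50.0, 200.0):
    val = ks(g, rho)
    # envelope property: max(g) <= KS(g, rho) <= max(g) + ln(m)/rho
    assert max(g) <= val <= max(g) + math.log(len(g)) / rho
```

Driving KS(g, ρ) ≤ 0 thus enforces all m constraints at once, at the cost of a small conservatism controlled by ρ, which is the trade-off behind replacing many constraints with one.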
Constraint Programming to Solve Maximal Density Still Life
NASA Astrophysics Data System (ADS)
Chu, Geoffrey; Petrie, Karen Elizabeth; Yorke-Smith, Neil
The Maximum Density Still Life problem fills a finite Game of Life board with a stable pattern of cells that has as many live cells as possible. Although simple to state, this problem is computationally challenging for any but the smallest sizes of board. Especially difficult is to prove that the maximum number of live cells has been found. Various approaches have been employed. The most successful are approaches based on Constraint Programming (CP). We describe the Maximum Density Still Life problem, introduce the concept of constraint programming, give an overview on how the problem can be modelled and solved with CP, and report on best-known results for the problem.
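For the smallest boards the problem can even be solved by exhaustive search rather than CP, which makes the stability conditions concrete: live cells need 2 or 3 live neighbours, and dead cells, including those just outside the board, must not have exactly 3. A brute-force sketch:

```python
from itertools import product

def neighbours(r, c):
    return [(r + dr, c + dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)
            if (dr, dc) != (0, 0)]

def is_still_life(live, n):
    # Check stability on the board and the one-cell dead border around it;
    # cells further out can never see three live neighbours.
    for r in range(-1, n + 1):
        for c in range(-1, n + 1):
            count = sum(nb in live for nb in neighbours(r, c))
            if (r, c) in live and count not in (2, 3):
                return False        # a live cell would die
            if (r, c) not in live and count == 3:
                return False        # a dead cell would be born
    return True

def max_still_life(n):
    cells = [(r, c) for r in range(n) for c in range(n)]
    best = 0
    for bits in product((0, 1), repeat=n * n):
        live = {cell for cell, b in zip(cells, bits) if b}
        if len(live) > best and is_still_life(live, n):
            best = len(live)
    return best

assert max_still_life(2) == 4   # the 2x2 block
assert max_still_life(3) == 6   # e.g. the "ship"
```

Enumeration over 2^(n²) patterns collapses immediately beyond tiny boards, which is exactly why CP models of these stability constraints, with bounding arguments, are needed to prove optimality at realistic sizes.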
A Mars Exploration Discovery Program
NASA Astrophysics Data System (ADS)
Hansen, C. J.; Paige, D. A.
2000-07-01
The Mars Exploration Program should consider following the Discovery Program model. In the Discovery Program a team of scientists led by a PI develop the science goals of their mission, decide what payload achieves the necessary measurements most effectively, and then choose a spacecraft with the capabilities needed to carry the payload to the desired target body. The primary constraints associated with the Discovery missions are time and money. The proposer must convince reviewers that their mission has scientific merit and is feasible. Every Announcement of Opportunity has resulted in a collection of creative ideas that fit within advertised constraints. Following this model, a "Mars Discovery Program" would issue an Announcement of Opportunity for each launch opportunity with schedule constraints dictated by the launch window and fiscal constraints in accord with the program budget. All else would be left to the proposer to choose, based on the science the team wants to accomplish, consistent with the program theme of "Life, Climate and Resources". A proposer could propose a lander, an orbiter, a fleet of SCOUT vehicles or penetrators, an airplane, a balloon mission, a large rover, a small rover, etc. depending on what made the most sense for the science investigation and payload. As in the Discovery program, overall feasibility relative to cost, schedule and technology readiness would be evaluated and be part of the selection process.
Pegasus air-launched space booster flight test program
NASA Astrophysics Data System (ADS)
Elias, Antonio L.; Knutson, Martin A.
1995-03-01
Pegasus is a satellite-launching space rocket dropped from a B-52 carrier aircraft instead of being launched vertically from a ground pad. Its three-year, privately funded, accelerated development was carried out under a demanding design-to-nonrecurring-cost methodology, which imposed unique requirements on its flight test program, such as the decision not to drop an inert model from the carrier aircraft; the number and type of captive and free-flight tests; the extent of envelope exploration; and the decision to combine test and operational orbital flights. The authors believe that Pegasus may be the first vehicle where constraints on the number and type of flight tests actually influenced the design of the vehicle. During the period November 1989 to February 1990, a total of three captive flight tests were conducted, starting with a flutter-clearance flight and culminating in a complete drop rehearsal. Starting on April 5, 1990, two combined test/operational flights were conducted. A unique aspect of the program was the degree of involvement of flight test personnel in the early design of the vehicle and, conversely, of the design team in flight testing and early flight operations. Various lessons learned as a result of this process are discussed throughout this paper.
The Development and Implementation of Outdoor-Based Secondary School Integrated Programs
ERIC Educational Resources Information Center
Comishin, Kelly; Dyment, Janet E.; Potter, Tom G.; Russell, Constance L.
2004-01-01
Four teachers share the challenges they faced when creating and running outdoor-focused secondary school integrated programs in British Columbia, Canada. The five most common challenges were funding constraints, insufficient support from administrators and colleagues, time constraints, liability and risk management, and inadequate skills and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hempling, Scott; Elefant, Carolyn; Cory, Karlynn
2010-01-01
This report details how state feed-in tariff (FIT) programs can be legally implemented and how they can comply with federal requirements. The report describes the federal constraints on FIT programs and identifies legal methods that are free of those constraints.
The Nature of Credit Constraints and Human Capital. NBER Working Paper No. 13912
ERIC Educational Resources Information Center
Lochner, Lance J.; Monge-Naranjo, Alexander
2008-01-01
This paper studies the nature and impact of credit constraints in the market for human capital. We derive endogenous constraints from the design of government student loan programs and from the limited repayment incentives in private lending markets. These constraints imply cross-sectional patterns for schooling, ability, and family income that…
Boruch, R F; Mcsweeny, A J; Soderstrom, E J
1978-11-01
This bibliography lists references to over 300 field experiments undertaken in schools, hospitals, prisons, and other social settings, mainly in the U.S. The list is divided into 10 major categories corresponding to the type of program under examination: criminal and civil justice, mental health, training and education, mass media, information collection and utilization, commerce and industry, welfare, health, and family planning. The main purpose of the bibliography is to provide evidence on the feasibility and scope of randomized field tests, since, despite their advantages, it is not always clear, given managerial, political, and other constraints on research, that they can be mounted. Publication dates range from 1944 to 1978.
Do older adults perceive postural constraints for reach estimation?
Cordova, Alberto; Gabbard, Carl
2014-01-01
BACKGROUND/STUDY CONTEXT: Recent evidence indicates that older persons have difficulty mentally representing intended movements. Furthermore, in an estimation-of-reach paradigm using motor imagery, a form of mental representation, older persons significantly overestimated their ability compared with young adults. The authors tested the notion that older adults may also have difficulty perceiving the postural constraints associated with reach estimation. They compared young (Mage = 22 years) and older (Mage = 67 years) adults on reach estimation while seated and in a more posturally demanding standing-and-leaning-forward position. The expectation was a significant postural effect in the standing condition, evidenced by reduced overestimation. There was no difference between groups in the seated condition (both overestimated), but in the standing condition older adults underestimated whereas the younger group once again overestimated. From one perspective, these results show that older adults do perceive postural constraints in light of their own physical capabilities: that group perceived greater postural demands in the standing posture and elected to program a more conservative strategy, resulting in underestimation.
A fuzzy goal programming model for biodiesel production
NASA Astrophysics Data System (ADS)
Lutero, D. S.; Pangue, EMU; Tubay, J. M.; Lubag, S. P.
2016-02-01
A fuzzy goal programming (FGP) model for biodiesel production in the Philippines was formulated with coconut (Cocos nucifera) and jatropha (Jatropha curcas) as sources of biodiesel. The objectives were maximization of feedstock production and overall revenue, and minimization of the energy used in production and of the working capital for farming, subject to biodiesel and non-biodiesel requirements and the availability of land, labor, water, and machine time. All objectives and constraints were assumed to be fuzzy. The model was tested for different sets of weights; all sets of weights yielded the same optimal allocation. Coconut alone can satisfy the biodiesel requirement of 2% by volume.
Structural Tailoring of Advanced Turboprops (STAT)
NASA Technical Reports Server (NTRS)
Brown, Kenneth W.
1988-01-01
This interim report describes the progress achieved in the Structural Tailoring of Advanced Turboprops (STAT) program, which was developed to perform numerical optimizations on highly swept propfan blades. The optimization procedure seeks to minimize an objective function, defined as either direct operating cost or aeroelastic differences between a blade and its scaled model, by tuning internal and external geometry variables that must satisfy realistic blade design constraints. This report provides a detailed description of the input, optimization procedures, approximate analyses and refined analyses, as well as validation test cases for the STAT program. In addition, conclusions and recommendations are summarized.
Zhao, Yingfeng; Liu, Sanyang
2016-01-01
We present a practical branch-and-bound algorithm for globally solving the generalized linear multiplicative programming problem with multiplicative constraints. To solve the problem, a relaxation problem that is equivalent to a linear program is constructed using a new two-phase relaxation technique. In the algorithm, lower and upper bounds are obtained simultaneously by solving a sequence of linear relaxation problems. Global convergence is proved, and results on sample examples and a small random experiment show that the proposed algorithm is feasible and efficient.
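The relaxation-plus-branching loop the abstract describes can be shown on a toy instance. The sketch below is a minimal illustration, not the paper's two-phase algorithm: it lower-bounds the product of two linear factors by the product of their interval minima over a box, prunes boxes whose bound cannot beat the incumbent, and bisects the widest coordinate. The objective, box, and tolerance are invented.

```python
import numpy as np

def factor_bounds(c, lo, up):
    # Min and max of c @ x over the box [lo, up]: pick a corner per sign of c.
    return (np.where(c >= 0, c * lo, c * up).sum(),
            np.where(c >= 0, c * up, c * lo).sum())

def branch_and_bound(c1, c2, lo0, up0, tol=1e-6):
    # Globally minimize (c1 @ x) * (c2 @ x) over the box [lo0, up0],
    # assuming both linear factors stay positive on the box.
    best_val, best_x = np.inf, None
    stack = [(np.asarray(lo0, float), np.asarray(up0, float))]
    while stack:
        lo, up = stack.pop()
        # Relaxation: the product of the factor minima lower-bounds the product.
        lower = factor_bounds(c1, lo, up)[0] * factor_bounds(c2, lo, up)[0]
        if lower >= best_val - tol:
            continue  # prune this box
        # Candidate points: box midpoint and each factor's minimizing corner.
        for cand in ((lo + up) / 2,
                     np.where(c1 >= 0, lo, up),
                     np.where(c2 >= 0, lo, up)):
            val = (c1 @ cand) * (c2 @ cand)
            if val < best_val:
                best_val, best_x = val, cand
        # Branch: split the widest coordinate in half.
        j = int(np.argmax(up - lo))
        if up[j] - lo[j] > tol:
            mid = (lo[j] + up[j]) / 2
            up1 = up.copy(); up1[j] = mid
            lo2 = lo.copy(); lo2[j] = mid
            stack.extend([(lo, up1), (lo2, up)])
    return best_val, best_x
```

On the instance (x1 + x2)(x1 + 2*x2) over [1, 2]^2, the minimum 6 at (1, 1) is found at the root via the corner candidate, after which every subbox is pruned.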
NASA Technical Reports Server (NTRS)
Haddad, Michael E.
2008-01-01
An On-Orbit Constraints Test (OOCT) refers to mating flight hardware together on the ground before it is mated on-orbit. The concept seems simple, but such operations can be difficult to perform on the ground when the flight hardware is designed to be mated on-orbit in the zero-g and/or vacuum environment of space. Moreover, some items are manufactured years apart, which raises the question of how mating tasks can be performed when one piece is already on-orbit before its mating piece has been built. Both the Intra-Vehicular Activity (IVA) and Extra-Vehicular Activity (EVA) OOCTs performed at Kennedy Space Center are presented in this paper. Details include how OOCTs should mimic on-orbit operational scenarios; a series of photographs taken during OOCTs performed on International Space Station (ISS) flight elements; and lessons learned as a result of the OOCTs. The paper concludes with possible applications to the Moon and Mars surface operations planned for the Constellation Program.
Krypton-85 Powered Lights for Airfield Application.
1981-11-01
Department of Energy (DOE), and eight lights were fabricated for testing by actual observation under airfield conditions. Light is produced in the units...concepts of radionuclide-powered lights, the R&D program carried out, and fabrication constraints involved in the production of the experimental...visible light has been known for many years. Early use of radium mixed with zinc sulfide phosphors provided self-illuminated clock dials. The military has
Bayes Factor Approaches for Testing Interval Null Hypotheses
ERIC Educational Resources Information Center
Morey, Richard D.; Rouder, Jeffrey N.
2011-01-01
Psychological theories are statements of constraint. The role of hypothesis testing in psychology is to test whether specific theoretical constraints hold in data. Bayesian statistics is well suited to the task of finding supporting evidence for constraint, because it allows the evidence for two hypotheses to be compared against one another. One issue…
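For a concrete sense of how evidence for an interval null can be weighed against its complement, the sketch below computes a Bayes factor by numerical integration. The normal prior, interval bounds, and observed effect are invented for illustration and are not taken from the article.

```python
# Bayes factor for an interval null H0: theta in [-0.1, 0.1] versus
# H1: theta outside it, with a N(0, 1) prior on the effect theta and a
# normal likelihood for an assumed observation y with standard error se.
from scipy import integrate, stats

y, se = 0.5, 0.2          # hypothetical observed effect and its standard error
lo, hi = -0.1, 0.1        # interval null

prior = stats.norm(0, 1)
def joint(theta):          # likelihood times prior density
    return stats.norm.pdf(y, loc=theta, scale=se) * prior.pdf(theta)

p_in = prior.cdf(hi) - prior.cdf(lo)   # prior mass inside the interval
# Marginal likelihood under each hypothesis = likelihood averaged over the
# prior restricted to that hypothesis's region.
m0 = integrate.quad(joint, lo, hi)[0] / p_in
m1 = (integrate.quad(joint, -10, lo)[0] +
      integrate.quad(joint, hi, 10)[0]) / (1 - p_in)
bf01 = m0 / m1   # < 1 here: an effect of 0.5 favors the alternative
```

A Bayes factor above 1 would instead support the interval null, which is exactly the kind of evidence *for* a constraint that a point-null test cannot provide.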
Multi-Objective Programming for Lot-Sizing with Quantity Discount
NASA Astrophysics Data System (ADS)
Kang, He-Yau; Lee, Amy H. I.; Lai, Chun-Mei; Kang, Mei-Sung
2011-11-01
Multi-objective programming (MOP) is a popular method for decision making in complex environments. In a MOP, decision makers try to optimize two or more objectives simultaneously under various constraints. A complete optimal solution seldom exists, so a Pareto-optimal solution is usually sought. Some methods, such as the weighting method, which assigns priorities to the objectives and sets aspiration levels for them, are used to derive a compromise solution. The ε-constraint method is a modification of the weighting method: one objective function is optimized while the other objective functions are treated as constraints and incorporated into the constraint part of the model. This research considers a stochastic lot-sizing problem with multiple suppliers and quantity discounts. The model is then transformed into a mixed integer programming (MIP) model based on the ε-constraint method. An example illustrates the practicality of the proposed model. The results demonstrate that the model is an effective and accurate tool for determining a manufacturer's replenishment from multiple suppliers over multiple periods.
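The ε-constraint idea can be shown in a few lines: keep one objective, demote the other to a constraint f2 ≤ ε, and sweep ε to trace the Pareto front. The two-variable LP below is a made-up example, not the paper's lot-sizing model, solved with scipy.

```python
from scipy.optimize import linprog

def best_profit(eps):
    # Maximize f1 = 3*x1 + 2*x2 (linprog minimizes, so negate it), treating
    # the second objective as a constraint f2 = x1 + 2*x2 <= eps, with a
    # shared resource limit x1 + x2 <= 4 and x >= 0.
    res = linprog(c=[-3.0, -2.0],
                  A_ub=[[1.0, 2.0], [1.0, 1.0]],
                  b_ub=[eps, 4.0],
                  bounds=[(0, None), (0, None)])
    return -res.fun

# Sweeping eps traces the Pareto front: profit 6 at eps = 2, 12 once eps >= 4.
front = {eps: best_profit(eps) for eps in (2.0, 4.0, 6.0)}
```

Loosening the budget ε can only improve the optimized objective, so the sweep exposes the trade-off curve between the two objectives one point at a time.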
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahrens, J.P.; Shapiro, L.G.; Tanimoto, S.L.
1997-04-01
This paper describes a computing environment which supports computer-based scientific research work. Key features include support for automatic distributed scheduling and execution and for computer-based scientific experimentation. A new flexible and extensible scheduling technique that is responsive to a user's scheduling constraints, such as the ordering of program results and the specification of task assignments and processor utilization levels, is presented. An easy-to-use constraint language for specifying scheduling constraints, based on the relational database query language SQL, is described, along with a search-based algorithm for fulfilling these constraints. A set of performance studies shows that the environment can schedule and execute program graphs on a network of workstations as the user requests. A method for automatically generating computer-based scientific experiments is also described. Experiments provide a concise method of specifying a large collection of parameterized program executions. The environment achieved significant speedups when executing experiments; for a large collection of scientific experiments, an average speedup of 3.4 on an average of 5.5 scheduled processors was obtained.
Lan, Yihua; Li, Cunhua; Ren, Haozheng; Zhang, Yong; Min, Zhifang
2012-10-21
A new heuristic algorithm based on a geometric distance sorting technique is proposed for solving the fluence map optimization problem with dose-volume constraints, one of the most essential tasks in inverse planning for IMRT. The framework of the proposed method is an iterative process which begins with a simple linearly constrained quadratic optimization model without any dose-volume constraints; dose constraints for the voxels violating the dose-volume constraints are then gradually added to the quadratic optimization model, step by step, until all dose-volume constraints are satisfied. In each iteration step, an interior point method is used to solve the new linearly constrained quadratic program. To choose suitable candidate voxels for the current constraint-adding step, a geometric distance, defined in the transformed standard quadratic form of the fluence map optimization model, guides the selection of voxels. The geometric distance sorting technique largely avoids the unexpected increase in the objective function value that constraint adding inevitably causes, and it can be regarded as an upgrade of the traditional dose sorting technique. A geometric explanation of the proposed method is given, and a proposition is proved to support the heuristic idea. In addition, a smart constraint adding/deleting strategy is designed to ensure stable convergence of the iteration. The new algorithm was tested on four cases (head-and-neck, prostate, lung, and oropharyngeal) and compared with an algorithm based on the traditional dose sorting technique. Experimental results showed that the proposed method is better suited to guiding the selection of new constraints, and somewhat more efficient at choosing them, than the traditional dose sorting method, especially for cases whose target regions have non-convex shapes. By integrating the smart constraint adding/deleting scheme into the iteration framework, the new technique yields an improved algorithm for solving the fluence map optimization problem with dose-volume constraints.
Programming languages for circuit design.
Pedersen, Michael; Yordanov, Boyan
2015-01-01
This chapter provides an overview of a programming language for Genetic Engineering of Cells (GEC). A GEC program specifies a genetic circuit at a high level of abstraction through constraints on otherwise unspecified DNA parts. The GEC compiler then selects parts which satisfy the constraints from a given parts database. GEC further provides more conventional programming language constructs for abstraction, e.g., through modularity. The GEC language and compiler are available through a Web tool which also provides functionality, e.g., for simulation of designed circuits.
The feasibility of a Paleolithic diet for low-income consumers.
Metzgar, Matthew; Rideout, Todd C; Fontes-Villalba, Maelan; Kuipers, Remko S
2011-06-01
Many low-income consumers face a limited budget for food purchases. The United States Department of Agriculture (USDA) developed the Thrifty Food Plan to address the problem of consuming a healthy diet under a budget constraint; this dietary optimization program uses common food choices to build a suitable diet. In this article, USDA data sets are used to test the feasibility of consuming a Paleolithic diet on a limited budget. The Paleolithic diet is described as the diet to which humans are genetically adapted, containing only the preagricultural food groups of meat, seafood, fruits, vegetables, and nuts. Constraints were applied to the diet optimization model to restrict grains, dairy, and certain other food categories, along with constraints on macronutrients, micronutrients, and long-chain polyunsaturated fatty acids. The results show that it is possible to consume a Paleolithic diet given these constraints, although the diet falls short of the daily recommended intakes for certain micronutrients. A 9.3% increase in income is needed to consume a Paleolithic diet that meets all daily recommended intakes except calcium.
NASA Technical Reports Server (NTRS)
Horvath, Joan C.; Alkalaj, Leon J.; Schneider, Karl M.; Amador, Arthur V.; Spitale, Joseph N.
1993-01-01
Robotic spacecraft are controlled by sets of commands called 'sequences.' These sequences must be checked against mission constraints. Making our existing constraint checking program faster would enable new capabilities in our uplink process. Therefore, we are rewriting this program to run on a parallel computer. To do so, we had to determine how to run constraint-checking algorithms in parallel and create a new method of specifying spacecraft models and constraints. This new specification gives us a means of representing flight systems and their predicted response to commands which could be used in a variety of applications throughout the command process, particularly during anomaly or high-activity operations. This commonality could reduce operations cost and risk for future complex missions. Lessons learned in applying some parts of this system to the TOPEX/Poseidon mission will be described.
WIDOWAC (Wing Design Optimization With Aeroelastic Constraints): Program manual
NASA Technical Reports Server (NTRS)
Haftka, R. T.; Starnes, J. H., Jr.
1974-01-01
User and programmer documentation for the WIDOWAC programs is given. WIDOWAC may be used for the design of minimum-mass wing structures subject to flutter, strength, and minimum-gage constraints. The wing structure is modeled by finite elements, flutter conditions may be both subsonic and supersonic, and mathematical programming methods are used for the optimization procedure. The user documentation gives general directions on how the programs may be used and describes their limitations; in addition, program input and output are described, and example problems are presented. A discussion of computational algorithms and flow charts of the WIDOWAC programs and major subroutines is also given.
Advanced Computational Methods for Security Constrained Financial Transmission Rights
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kalsi, Karanjit; Elbert, Stephen T.; Vlachopoulou, Maria
Financial Transmission Rights (FTRs) are financial insurance tools that help power market participants reduce the price risks associated with transmission congestion. FTRs are issued by solving a constrained optimization problem whose objective is to maximize FTR social welfare under power flow security constraints. Security constraints for different FTR categories (monthly, seasonal, or annual) are usually coupled, and the number of constraints increases exponentially with the number of categories. Commercial software for FTR calculation can provide only limited categories of FTRs because of these inherent computational challenges. In this paper, an innovative mathematical reformulation of the FTR problem is first presented which dramatically improves the computational efficiency of the optimization problem. A novel non-linear dynamic system (NDS) approach is then proposed to solve the reformulated problem. The new formulation and the performance of the NDS solver are benchmarked against widely used linear programming (LP) solvers such as CPLEX™ and tested on both standard IEEE test systems and large-scale systems using data from the Western Electricity Coordinating Council (WECC). The performance of the NDS is comparable to, and in some cases better than, that of the widely used CPLEX algorithms. The proposed formulation and NDS-based solver are also easily parallelizable, enabling further computational improvement.
The application of Newman crack-closure model to predicting fatigue crack growth
NASA Astrophysics Data System (ADS)
Si, Erjian
1994-09-01
The Newman crack-closure model and the associated crack growth program were applied to the analysis of crack growth under constant-amplitude and aircraft spectrum loading for a number of aluminum alloys. The analysis was performed on available test data for 2219-T851, 2024-T3, 2024-T351, 7075-T651, 2324-T39, and 7150-T651 aluminum. The results showed that the constraint factor plays a significant role in the method; its determination is discussed. For constant-amplitude loading, satisfactory crack growth lives could be predicted: for the above aluminum specimens, the ratio of predicted to experimental lives, Np/Nt, ranged from 0.74 to 1.36, with a mean of 0.97. For a specified complex spectrum loading, the predicted crack growth lives are not in very good agreement with the test data. Further effort is needed to correctly simulate the transition between plane-strain and plane-stress conditions near the crack tip.
De Carvalho, Irene Stuart Torrié; Granfeldt, Yvonne; Dejmek, Petr; Håkansson, Andreas
2015-03-01
Linear programming has been used extensively as a tool for nutritional recommendations. Extending the methodology to food formulation presents new challenges, since not all combinations of nutritious ingredients produce an acceptable food; accounting for acceptability would help in implementation and in ensuring the feasibility of the suggested recommendations. The objective was to extend the previously used linear programming methodology from diet optimization to food formulation by adding consistency constraints, and to exemplify its use in formulating a porridge mix for emergency situations in rural Mozambique. The linear programming method was extended with a consistency constraint based on previously published empirical studies of starch swelling in soft porridges. The new method was exemplified by formulating a nutritious, minimum-cost porridge mix for children aged 1 to 2 years for use as a complete relief food, based primarily on local ingredients, in rural Mozambique. A nutritious porridge fulfilling the consistency constraints was found; however, the minimum cost was unfeasible with local ingredients only, illustrating the challenge of formulating nutritious yet economically feasible foods from local ingredients. The high cost was driven by the cost of mineral-rich foods. A nutritious, low-cost porridge fulfilling the consistency constraints was obtained by including zinc and calcium salt supplements as ingredients. The optimizations fulfilled all constraints and provided a feasible porridge, showing that the extended constrained linear programming methodology is a systematic tool for designing nutritious foods.
NASA Technical Reports Server (NTRS)
Kiusalaas, J.; Reddy, G. B.
1977-01-01
A finite element program is presented for computer-automated, minimum-weight design of elastic structures with constraints on stresses (including local instability criteria) and displacements. Volume 1 of the report contains the theory and the user's manual for the program; sample problems and the program listing are included in Volumes 2 and 3. The element subroutines are organized to facilitate additions and changes by the user. As a result, a relatively minor programming effort would be required to turn DESAP 1 into a special-purpose program handling the user's specific design requirements and failure criteria.
NASA Astrophysics Data System (ADS)
Wu, Dongjun
Network industries have technologies characterized by a spatial hierarchy, the "network," with capital-intensive interconnections and time-dependent, capacity-limited flows of products and services through the network to customers. This dissertation studies service pricing, investment, and business operating strategies for the electric power network. First-best solutions for a variety of pricing and investment problems are studied. Genetic algorithms (GAs), methods based on the idea of natural evolution, are evaluated as a primary means of solving complicated network problems, with respect to pricing as well as investment and other operating decisions. New constraint-handling techniques in GAs are studied and tested, and their application to practical non-linear optimization problems is demonstrated on several complex network design problems with encouraging initial results. Genetic algorithms provide solutions that are feasible and close to optimal when the optimal solution is known; in some instances, the near-optimal solutions found by the proposed GA approach for small problems can only be verified by pushing the limits of currently available non-linear optimization software. The performance is far better than that of several commercially available GA programs, which are generally inadequate for solving any of the problems studied in this dissertation, primarily because of their poor handling of constraints. Genetic algorithms, if carefully designed, appear very promising for solving difficult problems that are intractable by traditional analytic methods.
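One common constraint-handling technique of the kind discussed above is an exterior penalty: constraint violation is added to the objective so that selection pushes the population toward the feasible region. The sketch below is a generic real-coded GA on an invented two-variable problem, not the dissertation's algorithm; the penalty weight, operators, and parameters are illustrative choices.

```python
# Penalty-based constraint handling in a simple real-coded GA:
# minimize x1^2 + x2^2 subject to x1 + x2 >= 1 (optimum at (0.5, 0.5)).
import numpy as np

rng = np.random.default_rng(0)

def penalized_fitness(pop, penalty=100.0):
    obj = (pop ** 2).sum(axis=1)                    # raw objective
    viol = np.maximum(0.0, 1.0 - pop.sum(axis=1))   # violation of x1 + x2 >= 1
    return obj + penalty * viol ** 2                # quadratic exterior penalty

pop = rng.uniform(-2.0, 2.0, size=(60, 2))
for _ in range(200):
    fit = penalized_fitness(pop)
    # Binary tournament selection: keep the better of two random individuals.
    i, j = rng.integers(0, len(pop), size=(2, len(pop)))
    parents = np.where((fit[i] < fit[j])[:, None], pop[i], pop[j])
    # Blend crossover with a shuffled partner, then Gaussian mutation.
    partners = parents[rng.permutation(len(parents))]
    alpha = rng.uniform(0.0, 1.0, size=(len(parents), 1))
    pop = alpha * parents + (1 - alpha) * partners
    pop += rng.normal(0.0, 0.05, size=pop.shape)

best = pop[np.argmin(penalized_fitness(pop))]  # should sit near (0.5, 0.5)
```

The penalty makes infeasible points uncompetitive without excluding them outright, which is one reason carefully designed GAs outperform off-the-shelf ones that simply discard infeasible offspring.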
Modeling Multibody Stage Separation Dynamics Using Constraint Force Equation Methodology
NASA Technical Reports Server (NTRS)
Tartabini, Paul V.; Roithmayr, Carlos M.; Toniolo, Matthew D.; Karlgaard, Christopher D.; Pamadi, Bandu N.
2011-01-01
This paper discusses the application of the constraint force equation methodology and its implementation for multibody separation problems using three specially designed test cases. The first test case involves two rigid bodies connected by a fixed joint, the second case involves two rigid bodies connected with a universal joint, and the third test case is that of Mach 7 separation of the X-43A vehicle. For the first two cases, the solutions obtained using the constraint force equation method compare well with those obtained using industry-standard benchmark codes. For the X-43A case, the constraint force equation solutions show reasonable agreement with the flight-test data. Use of the constraint force equation method facilitates the analysis of stage separation in end-to-end simulations of launch vehicle trajectories.
Educational Policy and Literacy Learning in an ESL Classroom: Constraints and Opportunities
ERIC Educational Resources Information Center
Ricklefs, Mariana Alvayero
2012-01-01
This dissertation was a qualitative case study of an educational program for English Language Learners (ELL) at an elementary school in a small city in the Midwest. This case study investigated how language ideologies influence the constraints and opportunities for the planning and execution of this educational program. The findings evidenced that…
Darmon, Nicole; Ferguson, Elaine L; Briend, André
2002-12-01
Economic constraints may contribute to the unhealthy food choices observed among low socioeconomic groups in industrialized countries. The objective of the present study was to predict the food choices a rational individual would make to reduce his or her food budget while retaining a diet as close as possible to the average population diet. Isoenergetic diets were modeled by linear programming. To ensure these diets were consistent with habitual food consumption patterns, departure from the average French diet was minimized, and constraints that limited portion size and the amount of energy from food groups were introduced into the models. A cost constraint was introduced and progressively strengthened to assess the effect of cost on the selection of foods by the program. Strengthening the cost constraint reduced the proportion of energy contributed by fruits and vegetables, meat, and dairy products and increased the proportion from cereals, sweets, and added fats, a pattern similar to that observed among low socioeconomic groups. This decreased the nutritional quality of the modeled diets; notably, the lowest-cost linear programming diets had vitamin C and beta-carotene densities below 25% and 10%, respectively, of those of the mean French adult diet. These results indicate that a simple cost constraint can decrease the nutrient densities of diets and influence food selection in ways that reproduce the food intake patterns observed among low socioeconomic groups. They suggest that economic measures will be needed to effectively improve the nutritional quality of diets consumed by these populations.
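The modeling pattern described here — minimize departure from the average diet subject to an energy requirement and a cost ceiling, then tighten the ceiling — can be sketched with a tiny invented food set; the real models use full food composition databases and many more constraints.

```python
# Minimize the L1 departure from an "average diet" a subject to a fixed
# energy intake and a cost ceiling, using slack variables t_i >= |x_i - a_i|.
import numpy as np
from scipy.optimize import linprog

cost   = np.array([4.0, 1.0, 2.0])   # invented prices per unit of 3 foods
energy = np.array([1.0, 3.0, 9.0])   # invented energy content per unit
a      = np.array([2.0, 2.0, 1.0])   # average-diet quantities
n = len(cost)

def model_diet(budget):
    # Variables: [x (food quantities), t (slacks encoding |x - a|)].
    c = np.concatenate([np.zeros(n), np.ones(n)])   # minimize sum(t)
    A_ub = np.block([
        [np.eye(n), -np.eye(n)],                    #  x - t <= a
        [-np.eye(n), -np.eye(n)],                   # -x - t <= -a
        [cost.reshape(1, -1), np.zeros((1, n))],    # cost @ x <= budget
    ])
    b_ub = np.concatenate([a, -a, [budget]])
    A_eq = [np.concatenate([energy, np.zeros(n)])]  # energy intake held fixed
    b_eq = [energy @ a]
    return linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                   bounds=[(0, None)] * (2 * n))

loose = model_diet(budget=20.0)  # average diet (cost 12) is affordable
tight = model_diet(budget=10.0)  # forces a departure from the average pattern
```

With the loose budget the optimum is the average diet itself (zero departure); tightening the budget below the average diet's cost forces substitution toward the cheap, energy-dense foods, mirroring the pattern the study reports.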
Status of SFR Codes and Methods QA Implementation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brunett, Acacia J.; Briggs, Laural L.; Fanning, Thomas H.
2017-01-31
This report details development of the SAS4A/SASSYS-1 SQA Program and describes the initial stages of Program implementation planning. The provisional Program structure, which is largely focused on the establishment of compliant SQA documentation, is outlined in detail, and Program compliance with the appropriate SQA requirements is highlighted. Additional program activities, such as improvements to testing methods and Program surveillance, are also described. Given that the programmatic resources currently granted to development of the SAS4A/SASSYS-1 SQA Program framework are not sufficient to address all SQA requirements (e.g., NQA-1, NUREG/BR-0167), this report also provides an overview of the gaps that remain in the SQA program and highlights recommendations on a path toward resolving these issues. One key finding of this effort is the need for an SQA program sustainable over multiple years within DOE annual R&D funding constraints.
Summary of Technical Meeting To Compare US/French Approaches for Physical Protection Test Beds
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mack, Thomas Kimball; Martinez, Ruben; Thomas, Gerald
In September 2015, representatives of the US Department of Energy/National Nuclear Security Administration, including test bed professionals from Sandia National Laboratories, and representatives of the French Alternative Energies and Atomic Energy Commission participated in a one-week workshop to share best practices in the design, organization, operation, utilization, improvement, and performance testing of physical protection test beds. The intended workshop outcomes were to (1) share methods of improving the respective test bed methodologies and programs and (2) prepare recommendations for standards on creating and operating testing facilities for nations new to nuclear operations. At the workshop, the French and American subject matter experts compared best practices developed at their respective test bed sites, discussed access delay test bed considerations, and presented the limitations and constraints of physical protection test beds.
Mathematical programming for the efficient allocation of health care resources.
Stinnett, A A; Paltiel, A D
1996-10-01
Previous discussions of methods for the efficient allocation of health care resources subject to a budget constraint have relied on unnecessarily restrictive assumptions. This paper makes use of established optimization techniques to demonstrate that a general mathematical programming framework can accommodate much more complex information regarding returns to scale, partial and complete indivisibility and program interdependence. Methods are also presented for incorporating ethical constraints into the resource allocation process, including explicit identification of the cost of equity.
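As a small illustration of why a general programming framework matters, consider complete indivisibility: programs must be funded in full or not at all, which breaks simple cost-effectiveness ranking. A classic 0/1 knapsack dynamic program (invented numbers, not from the paper) finds the exact allocation:

```python
# Choose whole programs (0/1) maximizing total health benefit under a budget.
def allocate(costs, benefits, budget):
    n = len(costs)
    # best[i][r] = max benefit using the first i programs with budget r.
    best = [[0] * (budget + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        c, b = costs[i - 1], benefits[i - 1]
        for r in range(budget + 1):
            best[i][r] = best[i - 1][r]
            if c <= r:
                best[i][r] = max(best[i][r], best[i - 1][r - c] + b)
    # Backtrack to recover the chosen set of programs.
    chosen, r = [], budget
    for i in range(n, 0, -1):
        if best[i][r] != best[i - 1][r]:
            chosen.append(i - 1)
            r -= costs[i - 1]
    return best[n][budget], sorted(chosen)

value, picks = allocate(costs=[5, 4, 3], benefits=[11, 7, 6], budget=7)
```

Here a greedy benefit-cost ranking would fund program 0 first (ratio 2.2) and strand the remaining budget, yielding benefit 11, while the exact solution funds programs 1 and 2 for benefit 13 — the kind of result that restrictive ranking assumptions miss.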
A 30 MW, 200 MHz Inductive Output Tube for RF Accelerators
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. Lawrence Ives; Michael Read
2008-06-19
This program investigated development of a multiple-beam inductive output tube (IOT) to produce 30 MW pulses at 200 MHz. The program successfully demonstrated the feasibility of developing a source that achieves the desired power in microsecond pulses with 70% efficiency. The predicted gain of the device is 24 dB; consequently, a 200 kW driver would be required for the RF input, at an estimated cost of approximately $1.25M. Given the estimated IOT development cost of approximately $750K and the requirement for a test set that would significantly increase the cost, it was determined that development could not be achieved within the funding constraints of a Phase II program.
Risk-Constrained Dynamic Programming for Optimal Mars Entry, Descent, and Landing
NASA Technical Reports Server (NTRS)
Ono, Masahiro; Kuwata, Yoshiaki
2013-01-01
A chance-constrained dynamic programming algorithm was developed that is capable of making optimal sequential decisions within a user-specified risk bound. This work handles stochastic uncertainties over multiple stages in the CEMAT (Combined EDL-Mobility Analyses Tool) framework and was demonstrated in a simulation of Mars entry, descent, and landing (EDL) using real landscape data obtained from the Mars Reconnaissance Orbiter. Although standard dynamic programming (DP) provides a general framework for optimal sequential decision making under uncertainty, it typically achieves risk aversion by imposing an arbitrary penalty on failure states; such a penalty-based approach cannot explicitly bound the probability of mission failure. A key idea behind the new approach, called risk allocation, decomposes a joint chance constraint into a set of individual chance constraints and distributes risk over them. The joint chance constraint is reformulated as a constraint on the expectation of a sum of indicator functions, which can be incorporated into the cost function by dualizing the optimization problem. As a result, the chance-constrained optimization problem becomes an unconstrained optimization over a Lagrangian, which can be solved efficiently using a standard DP approach.
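The risk-allocation step can be illustrated with the union bound: splitting the mission-level risk Δ into per-stage budgets δ_i guarantees the joint chance constraint however the stage failures are coupled, and the split itself can be optimized. The per-stage cost model below is invented for illustration, not taken from the paper.

```python
# Risk allocation under a joint chance constraint P(mission failure) <= Delta.
# If each stage i satisfies P(fail_i) <= delta_i and sum(delta_i) = Delta,
# the union bound gives P(any failure) <= sum(delta_i) = Delta.
import numpy as np

Delta = 0.01                     # mission-level risk bound (1%)
k = np.array([1.0, 2.0, 0.5])    # invented stage cost factors: cost_i = k_i / delta_i

# Naive split: give every stage the same risk budget.
uniform = np.full(3, Delta / 3)

# Optimized split: minimizing sum(k_i / delta_i) s.t. sum(delta_i) = Delta
# via the Lagrangian condition k_i / delta_i**2 = lambda gives
# delta_i proportional to sqrt(k_i).
optimal = Delta * np.sqrt(k) / np.sqrt(k).sum()

cost_uniform = (k / uniform).sum()
cost_optimal = (k / optimal).sum()   # cheaper, with the same joint risk bound
```

Both allocations certify the same mission-level bound, but distributing more risk to the stages where risk is expensive to avoid lowers the total cost — the core benefit of treating the allocation itself as a decision variable.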
Constraint programming based biomarker optimization.
Zhou, Manli; Luo, Youxi; Sun, Guoquan; Mai, Guoqin; Zhou, Fengfeng
2015-01-01
Efficient and intuitive characterization of biological big data is becoming a major challenge for modern bio-OMIC based scientists, and interactive visualization and exploration of big data has proven to be one of the successful solutions. Most existing feature selection algorithms do not allow interactive user input during the optimization process of feature selection. This study investigates fixing a few user-input features in the final selected feature subset, formulating these user-input features as constraints in a programming model. The proposed algorithm, fsCoP (feature selection based on constrained programming), performs comparably to or much better than existing feature selection algorithms, even with constraints from both the literature and the existing algorithms. An fsCoP biomarker may be intriguing for further wet-lab validation, since it satisfies both the classification optimization function and the biomedical knowledge. fsCoP may also be used for interactive exploration of bio-OMIC big data by interactively adding user-defined constraints to the model.
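The idea of pinning user-chosen features can be mimicked with a plain greedy forward selector: seed the chosen set with the user's features, then repeatedly add the feature most correlated with the current residual. This toy sketch uses synthetic data and is not the fsCoP programming model itself.

```python
# Greedy forward feature selection with user-pinned "constraint" features.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 6))
y = 2 * X[:, 0] + X[:, 3] + rng.normal(scale=0.1, size=200)  # only 0 and 3 matter

def select(X, y, k, fixed=()):
    chosen = list(fixed)                  # user-input features, kept no matter what
    while len(chosen) < k:
        if chosen:
            # Explain away the chosen features; score candidates on the residual.
            beta, *_ = np.linalg.lstsq(X[:, chosen], y, rcond=None)
            resid = y - X[:, chosen] @ beta
        else:
            resid = y
        scores = [-1.0 if j in chosen else
                  abs(np.corrcoef(X[:, j], resid)[0, 1])
                  for j in range(X.shape[1])]
        chosen.append(int(np.argmax(scores)))
    return sorted(chosen)
```

Unconstrained, the selector recovers the informative features 0 and 3; pinning the uninformative feature 5 keeps it in the subset while the remaining slot still goes to the strongest predictor, which is the behavior the fsCoP constraints formalize inside an optimization model.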
Program Predicts Time Courses of Human/Computer Interactions
NASA Technical Reports Server (NTRS)
Vera, Alonso; Howes, Andrew
2005-01-01
CPM X is a computer program that predicts sequences of, and amounts of time taken by, routine actions performed by a skilled person performing a task. Unlike programs that simulate the interaction of the person with the task environment, CPM X predicts the time course of events as consequences of encoded constraints on human behavior. The constraints determine which cognitive and environmental processes can occur simultaneously and which have sequential dependencies. The input to CPM X comprises (1) a description of a task and strategy in a hierarchical description language and (2) a description of architectural constraints in the form of rules governing interactions of fundamental cognitive, perceptual, and motor operations. The output of CPM X is a Program Evaluation Review Technique (PERT) chart that presents a schedule of predicted cognitive, motor, and perceptual operators interacting with a task environment. The CPM X program allows direct, a priori prediction of skilled user performance on complex human-machine systems, providing a way to assess critical interfaces before they are deployed in mission contexts.
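The constraint-driven scheduling that produces a PERT chart can be sketched as a longest-path computation over the dependency DAG of operators: an operator's earliest start is the latest finish among its prerequisites. The operator names, durations (notionally in milliseconds), and dependencies below are illustrative assumptions, not CPM X data.

```python
def earliest_starts(duration, deps):
    """Earliest start time of each operator, given durations and the
    precedence constraints in deps (operator -> list of prerequisites)."""
    start = {}
    def es(op):
        if op not in start:
            start[op] = max((es(d) + duration[d] for d in deps.get(op, [])),
                            default=0.0)
        return start[op]
    for op in duration:
        es(op)
    return start

duration = {"perceive": 100, "decide": 50, "move-hand": 200, "click": 30}
deps = {"decide": ["perceive"],
        "move-hand": ["perceive"],   # motor preparation overlaps cognition
        "click": ["decide", "move-hand"]}
```

The schedule makes the two kinds of constraints visible: "decide" and "move-hand" run simultaneously because no dependency links them, while "click" is sequentially dependent on both and cannot start before the slower one finishes.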
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ungun, B; Stanford University School of Medicine, Stanford, CA; Fu, A
2016-06-15
Purpose: To develop a procedure for including dose constraints in convex programming-based approaches to treatment planning, and to support dynamic modification of such constraints during planning. Methods: We present a mathematical approach that allows mean dose, maximum dose, minimum dose and dose volume (i.e., percentile) constraints to be appended to any convex formulation of an inverse planning problem. The first three constraint types are convex and readily incorporated. Dose volume constraints are not convex, however, so we introduce a convex restriction that is related to CVaR-based approaches previously proposed in the literature. To compensate for the conservatism of this restriction, we propose a new two-pass algorithm that solves the restricted problem on a first pass and uses this solution to form exact constraints on a second pass. In another variant, we introduce slack variables for each dose constraint to prevent the problem from becoming infeasible when the user specifies an incompatible set of constraints. We implement the proposed methods in Python using the convex programming package cvxpy in conjunction with the open source convex solvers SCS and ECOS. Results: We show, for several cases taken from the clinic, that our proposed method meets specified constraints (often with margin) when they are feasible. Constraints are met exactly when we use the two-pass method, and infeasible constraints are replaced with the nearest feasible constraint when slacks are used. Finally, we introduce ConRad, a Python-embedded free software package for convex radiation therapy planning. ConRad implements the methods described above and offers a simple interface for specifying prescriptions and dose constraints. Conclusion: This work demonstrates the feasibility of using modifiable dose constraints in a convex formulation, making it practical to guide the treatment planning process with interactively specified dose constraints.
This work was supported by the Stanford BioX Graduate Fellowship and NIH Grant 5R01CA176553.
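The conservatism of the CVaR-style restriction mentioned above can be seen in a few lines. This sketch is illustrative, not ConRad code: it checks a dose-volume constraint directly as a percentile condition, and via the convex tail-average bound. Satisfying the CVaR bound implies the dose-volume constraint (if more than the allowed fraction of voxels exceeded the limit, the mean of the worst fraction would too), but not conversely, which is exactly the gap the two-pass algorithm closes.

```python
def dose_volume_ok(dose, limit, frac):
    """Percentile (non-convex) condition: at most frac of voxels above limit."""
    over = sum(1 for d in dose if d > limit)
    return over <= frac * len(dose)

def cvar_ok(dose, limit, frac):
    """Convex restriction: mean of the worst frac of voxels is below limit."""
    k = max(1, int(frac * len(dose)))
    worst = sorted(dose, reverse=True)[:k]
    return sum(worst) / len(worst) <= limit
```

A dose vector like [10, 20, 30, 60, 70] with limit 50 and frac 0.4 passes the dose-volume check (exactly 2 of 5 voxels exceed 50) yet fails the CVaR bound (the worst two average 65), illustrating why the first pass can be too conservative.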
Recognizing and optimizing flight opportunities with hardware and life sciences limitations.
Luttges, M W
1992-01-01
The availability of orbital space flight opportunities to conduct life sciences research has been limited. It is possible to use parabolic flight and sounding rocket programs to conduct some kinds of experiments during short episodes (seconds to minutes) of reduced gravity, but there are constraints and limitations to these programs. Orbital flight opportunities are major undertakings, and the potential science achievable is often a function of the flight hardware available. A variety of generic types of flight hardware have been developed and tested, and show great promise for use during NSTS flights. One such payload configuration is described which has already flown.
Engineering and simulation of life science Spacelab experiments
NASA Technical Reports Server (NTRS)
Bush, B.; Rummel, J.; Johnston, R. S.
1977-01-01
Approaches to the planning and realization of Spacelab life sciences experiments, which may involve as many as 16 Space Shuttle missions and 100 tests, are discussed. In particular, a Spacelab simulation program, designed to evaluate problems associated with the use of live animal specimens, the constraints imposed by zero gravity on equipment operation, training of investigators and data management, is described. The simulated facility approximates the hardware and support systems of a current European Space Agency Spacelab model. Preparations necessary for the experimental program, such as crew activity plans, payload documentation and inflight experimental procedures are developed; health problems of the crew, including human/animal microbial contamination, are also assessed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carlsson, Mats; Johansson, Mikael; Larson, Jeffrey
Previous approaches for scheduling a league with round-robin and divisional tournaments involved decomposing the problem into easier subproblems. This approach, used to schedule the top Swedish handball league Elitserien, reduces the problem complexity but can result in suboptimal schedules. This paper presents an integrated constraint programming model that performs the scheduling in a single step. Particular attention is given to identifying implied and symmetry-breaking constraints that significantly reduce the computational complexity. Experimental evaluation shows that the integrated approach takes considerably less computational effort than the previous decomposition-based approach.
Thermoviscoelastic characterization and prediction of Kevlar/epoxy composite laminates
NASA Technical Reports Server (NTRS)
Gramoll, K. C.; Dillard, D. A.; Brinson, H. F.
1990-01-01
The thermoviscoelastic characterization of Kevlar 49/Fiberite 7714A epoxy composite lamina and the development of a numerical procedure to predict the viscoelastic response of any general laminate constructed from the same material were studied. The four orthotropic material properties, S_11, S_12, S_22, and S_66, were characterized by 20-minute static creep tests on unidirectional (0)_8, (10)_8, and (90)_16 lamina specimens. The Time-Temperature Superposition Principle (TTSP) was used successfully to accelerate the characterization process. A nonlinear constitutive model was developed to describe the stress-dependent viscoelastic response for each of the material properties. A numerical procedure to predict long-term laminate properties from lamina properties (obtained experimentally) was developed. Numerical instabilities and time constraints associated with viscoelastic numerical techniques were discussed and resolved. The numerical procedure was incorporated into a user-friendly microcomputer program called the Viscoelastic Composite Analysis Program (VCAP), which is available for IBM PC type computers. The program was designed for ease of use. The final phase involved testing actual laminates constructed from the characterized material, Kevlar/epoxy, at various temperatures and load levels for 4 to 5 weeks. These results were compared with the VCAP predictions to verify the testing procedure and to check the numerical procedure used in the program. The tests and predictions agreed for all test cases, which included 1, 2, 3, and 4 fiber-direction laminates.
Use of Heritage Hardware on Orion MPCV Exploration Flight Test One
NASA Technical Reports Server (NTRS)
Rains, George Edward; Cross, Cynthia D.
2012-01-01
Due to an aggressive schedule for the first space flight of an unmanned Orion capsule, currently known as Exploration Flight Test One (EFT1), combined with severe programmatic funding constraints, an effort was made within the Orion Program to identify heritage hardware, i.e., already existing, flight-certified components from previous manned space programs, which might be available for use on EFT1. With the end of the Space Shuttle Program, no current means exists to launch Multi-Purpose Logistics Modules (MPLMs) to the International Space Station (ISS), and so the inventory of many flight-certified Shuttle and MPLM components are available for other purposes. Two of these items are the MPLM cabin Positive Pressure Relief Assembly (PPRA), and the Shuttle Ground Support Equipment Heat Exchanger (GSE HX). In preparation for the utilization of these components by the Orion Program, analyses and testing of the hardware were performed. The PPRA had to be analyzed to determine its susceptibility to pyrotechnic shock, and vibration testing had to be performed, since those environments are predicted to be more severe during an Orion mission than those the hardware was originally designed to accommodate. The GSE HX had to be tested for performance with the Orion thermal working fluids, which are different from those used by the Space Shuttle. This paper summarizes the activities required in order to utilize heritage hardware for EFT1.
ERIC Educational Resources Information Center
Le, Nguyen-Thinh; Menzel, Wolfgang
2009-01-01
In this paper, we introduce logic programming as a domain that exhibits some characteristics of being ill-defined. In order to diagnose student errors in such a domain, we need a means to hypothesise the student's intention, that is the strategy underlying her solution. This is achieved by weighting constraints, so that hypotheses about solution…
AP Studio Art as an Enabling Constraint for Secondary Art Education
ERIC Educational Resources Information Center
Graham, Mark A.
2009-01-01
Advanced Placement (AP) Studio Art is an influential force in secondary art education as is evident in the 31,800 portfolios submitted for review in 2008. From the perspectives of a high school educator and AP Reader, this author has observed how the constraints of the AP program can be used to generate support for high school art programs and…
NASA Astrophysics Data System (ADS)
Li, Hong; Zhang, Li; Jiao, Yong-Chang
2016-07-01
This paper presents an interactive approach based on a discrete differential evolution algorithm to solve a class of integer bilevel programming problems, in which integer decision variables are controlled by an upper-level decision maker and real-valued (continuous) decision variables are controlled by a lower-level decision maker. Using the Karush-Kuhn-Tucker optimality conditions of the lower-level programming problem, the original discrete bilevel formulation can be converted into a discrete single-level nonlinear programming problem with complementarity constraints, and a smoothing technique is then applied to deal with the complementarity constraints. Finally, a discrete single-level nonlinear programming problem is obtained and solved by an interactive approach. In each iteration, for each given upper-level discrete variable, a system of nonlinear equations including the lower-level variables and Lagrange multipliers is solved first, and then a discrete nonlinear programming problem with only inequality constraints is handled by using a discrete differential evolution algorithm. Simulation results show the effectiveness of the proposed approach.
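The upper-level search can be sketched with a minimal discrete differential evolution loop: mutant vectors are formed from scaled differences of population members, then rounded back onto the integer lattice and clipped to bounds. This is an illustrative sketch under assumed parameters (population size, F, CR), with a stand-in separable objective rather than the paper's bilevel problem.

```python
import random

def discrete_de(obj, bounds, pop_size=20, F=0.5, CR=0.9, gens=100, seed=1):
    """Minimize obj over integer vectors within bounds using a simple
    discrete differential evolution scheme (rand/1 mutation + rounding)."""
    rng = random.Random(seed)
    clip = lambda v: [min(max(int(round(x)), lo), hi)
                      for x, (lo, hi) in zip(v, bounds)]
    pop = [clip([rng.uniform(lo, hi) for lo, hi in bounds])
           for _ in range(pop_size)]
    for _ in range(gens):
        for i, x in enumerate(pop):
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            mutant = clip([ai + F * (bi - ci) for ai, bi, ci in zip(a, b, c)])
            trial = [m if rng.random() < CR else xi
                     for m, xi in zip(mutant, x)]
            if obj(trial) <= obj(x):          # greedy one-to-one selection
                pop[i] = trial
    return min(pop, key=obj)
```

In the paper's setting, evaluating obj for a candidate upper-level integer vector would itself involve solving the lower-level system; here a closed-form quadratic stands in for that inner solve.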
Thermal-environmental testing of a 30-cm engineering model thruster
NASA Technical Reports Server (NTRS)
Mirtich, M. J.
1976-01-01
An experimental test program was carried out to document all 30-cm electron-bombardment Hg ion thruster functions and characteristics over the thermal environments of several proposed missions. An engineering model thruster was placed in a thermal test facility equipped with -196 C walls and solar simulation. The thruster was cold soaked and exposed to simulated eclipses lasting from 17 to 72 minutes. The thruster was operated at quarter to full beam power in various thermal configurations which simulated multiple thruster operation, and was also exposed to 1 and 2 suns solar simulation. Thruster control characteristics and constraints; performance, including thrust magnitude and direction; and structural integrity were evaluated over the range of thermal environments tested.
Minimum weight design of helicopter rotor blades with frequency constraints
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Walsh, Joanne L.
1989-01-01
The minimum weight design of helicopter rotor blades subject to constraints on fundamental coupled flap-lag natural frequencies is studied in this paper. A constraint is also imposed on the minimum value of the blade autorotational inertia to ensure that the blade has sufficient inertia to autorotate in case of an engine failure. The program CAMRAD is used for the blade modal analysis and the program CONMIN for the optimization. In addition, a linear approximation analysis involving Taylor series expansion is used to reduce the analysis effort. The procedure contains a sensitivity analysis which consists of analytical derivatives of the objective function and the autorotational inertia constraint, and central finite difference derivatives of the frequency constraints. Optimum designs were obtained for blades in vacuum with both rectangular and tapered box beam structures. Design variables include taper ratio, nonstructural segment weights, and box beam dimensions. The paper shows that even when starting with an acceptable baseline design, a significant amount of weight reduction is possible while satisfying all the constraints for blades with rectangular and tapered box beams.
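The mixed sensitivity analysis above pairs analytical derivatives with central finite differences for the frequency constraints. A generic central-difference helper makes the second part concrete; the function, design variables, and step size here are illustrative, not CAMRAD/CONMIN code.

```python
def central_diff(f, x, i, h=1e-6):
    """Central finite-difference derivative of f with respect to design
    variable x[i]: (f(x + h*e_i) - f(x - h*e_i)) / (2h)."""
    xp, xm = list(x), list(x)
    xp[i] += h
    xm[i] -= h
    return (f(xp) - f(xm)) / (2 * h)
```

Each frequency constraint would be differentiated this way with respect to each design variable (taper ratio, segment weights, box beam dimensions), at the cost of two constraint evaluations per variable.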
CPAS Preflight Drop Test Analysis Process
NASA Technical Reports Server (NTRS)
Englert, Megan E.; Bledsoe, Kristin J.; Romero, Leah M.
2015-01-01
Throughout the Capsule Parachute Assembly System (CPAS) drop test program, the CPAS Analysis Team has developed a simulation and analysis process to support drop test planning and execution. This process includes multiple phases focused on developing test simulations and communicating results to all groups involved in the drop test. CPAS Engineering Development Unit (EDU) series drop test planning begins with the development of a basic operational concept for each test. Trajectory simulation tools include the Flight Analysis and Simulation Tool (FAST) for single bodies, and the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) simulation for the mated vehicle. Results are communicated to the team at the Test Configuration Review (TCR) and Test Readiness Review (TRR), as well as at Analysis Integrated Product Team (IPT) meetings in earlier and intermediate phases of the pre-test planning. The ability to plan and communicate efficiently with rapidly changing objectives and tight schedule constraints is a necessity for safe and successful drop tests.
Automotive fuel economy and emissions program
NASA Technical Reports Server (NTRS)
Dowdy, M. W.; Baisley, R. L.
1978-01-01
Experimental data were generated to support an assessment of the relationship between automobile fuel economy and emissions control systems. Tests were made at both the engine and vehicle levels. Detailed investigations were made on cold-start emissions devices, exhaust gas recirculation systems, and air injection reactor systems. Based on the results of engine tests, an alternative emission control system and modified control strategy were implemented and tested in the vehicle. With the same fuel economy and NOx emissions as the stock vehicle, the modified vehicle reduced HC and CO emissions by about 20 percent. By removing the NOx emissions constraint, the modified vehicle demonstrated about 12 percent better fuel economy than the stock vehicle.
Satellite Test Assistant Robot (STAR)
NASA Technical Reports Server (NTRS)
Mcaffee, D. A.; Kerrisk, D. J.; Johnson, K. R.
1993-01-01
A three-year, three-phase program to demonstrate the applicability of telerobotic technology to the testing of satellites and other spacecraft has been initiated. Specifically, the objectives are to design, fabricate, and install into the JPL 25-ft. Space Simulator (SS) a system that will provide the capability to view test articles from all directions in both the visible and infrared (IR) spectral regions, to automatically map the solar flux intensity over the entire work volume of the chamber, and to provide the capability for leak detection. The first year's work, which provides a vertically mobile viewing platform equipped with stereo cameras, will be discussed. Design constraints and system implementation approaches mandated by the requirements of thermal vacuum operation will be emphasized.
NASA Technical Reports Server (NTRS)
Kolb, Mark A.
1990-01-01
Originally, computer programs for engineering design focused on detailed geometric design. Later, computer programs for algorithmically performing the preliminary design of specific well-defined classes of objects became commonplace. However, due to the need for extreme flexibility, it appears unlikely that conventional programming techniques will prove fruitful in developing computer aids for engineering conceptual design. Symbolic processing techniques, such as object-oriented programming and constraint propagation, facilitate such flexibility. Object-oriented programming allows programs to be organized around the objects and behavior to be simulated, rather than around fixed sequences of function and subroutine calls. Constraint propagation allows declarative statements to be understood as designating multi-directional mathematical relationships among all the variables of an equation, rather than as unidirectional assignments to the variable on the left-hand side of the equation, as in conventional computer programs. The research has concentrated on applying these two techniques to the development of a general-purpose computer aid for engineering conceptual design. Object-oriented programming techniques are utilized to implement a user-extensible database of design components. The mathematical relationships which model both the geometry and the physics of these components are managed via constraint propagation. In addition to this component-based hierarchy, special-purpose data structures are provided for describing component interactions and supporting state-dependent parameters. In order to investigate the utility of this approach, a number of sample design problems from the field of aerospace engineering were implemented using the prototype design tool, Rubber Airplane.
The additional level of organizational structure obtained by representing design knowledge in terms of components is observed to provide greater convenience to the program user, and to result in a database of engineering information which is easier both to maintain and to extend.
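The multi-directional constraint propagation described above can be sketched with a tiny cell-and-propagator network. This is in the spirit of Rubber Airplane but not its code; the Product relation and the wing-geometry variable names are illustrative.

```python
class Cell:
    """A design variable: holds a value once known and notifies constraints."""
    def __init__(self):
        self.value, self.constraints = None, []
    def set(self, v):
        if self.value is None:
            self.value = v
            for c in self.constraints:
                c.propagate()

class Product:
    """Enforces a * b == c in every direction: setting any two cells
    determines the third, regardless of which side of the equation it is on."""
    def __init__(self, a, b, c):
        self.a, self.b, self.c = a, b, c
        for cell in (a, b, c):
            cell.constraints.append(self)
    def propagate(self):
        a, b, c = self.a.value, self.b.value, self.c.value
        if a is not None and b is not None and c is None:
            self.c.set(a * b)
        elif a is not None and c is not None and b is None:
            self.b.set(c / a)
        elif b is not None and c is not None and a is None:
            self.a.set(c / b)
```

For example, relating wing span, chord, and area through one Product constraint lets a user fix area and span and have chord inferred, or fix span and chord and have area inferred, with no rewritten assignment statements.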
The Swift Project Contamination Control Program: A Case study of Balancing Cost, Schedule and Risk
NASA Technical Reports Server (NTRS)
Hansen, Patricia A.; Day, Diane; Secunda, Mark
2003-01-01
The Swift Observatory will be launched in early 2004 to examine the dynamic process of gamma ray burst (GRB) events. The multi-wavelength Observatory will study the GRB afterglow characteristics, which will help to answer fundamental questions about both the structure and the evolution of the universe. The Swift Observatory Contamination Control Program has been developed to aid in ensuring the success of the on-orbit performance of two of the primary instruments: the Ultraviolet and Optical Telescope (UVOT) and the X-Ray Telescope (XRT). During the design phase of the Observatory, the contamination control program evolved and trade studies were performed to assess the risk of contaminating the sensitive UVOT and XRT optics during both pre-launch testing and on-orbit operations, within the constraints of the overall program cost and schedule.
The Swift Project Contamination Control Program: A Case Study of Balancing Cost, Schedule and Risk
NASA Technical Reports Server (NTRS)
Hansen, Patricia A.; Day, Diane T.; Secunda, Mark S.; Rosecrans, Glenn P.
2004-01-01
The Swift Observatory will be launched in early 2004 to examine the dynamic process of gamma ray burst (GRB) events. The multi-wavelength Observatory will study the GRB afterglow characteristics, which will help to answer fundamental questions about both the structure and the evolution of the universe. The Swift Observatory Contamination Control Program has been developed to aid in ensuring the success of the on-orbit performance of two of the primary instruments: the Ultraviolet and Optical Telescope (UVOT) and the X-Ray Telescope (XRT). During the design phase of the Observatory, the contamination control program evolved and trade studies were performed to assess the risk of contaminating the sensitive UVOT and XRT optics during both pre-launch testing and on-orbit operations, within the constraints of the overall program cost and schedule.
NASA Astrophysics Data System (ADS)
Welch, Jonathan
This case study focused on obsolescence management constraints that occur during development of sustainment-dominated systems. Obsolescence management constraints were explored in systems expected to last 20 years or more and that tend to use commercial off-the-shelf products. The field of obsolescence has received little study, but obsolescence has a large cost for military systems. Because developing complex systems takes an average of 3 to 8 years, and commercial off-the-shelf components are typically obsolete within 3 to 5 years, military systems are often deployed with obsolescence issues that are transferred to the sustainment community to resolve. The main problem addressed in the study was to identify the constraints that have caused 70% of military systems under development to be obsolete when they are delivered. The purpose of the study was to use a qualitative case study to identify constraints that interfered with obsolescence management during the development stages of a program. The participants of this case study were managers, subordinates, and end-users who were logistics and obsolescence experts. Researchers largely agree that proactive obsolescence management is a lower cost solution for sustainment-dominated systems. Program managers must understand the constraints and the impact of not implementing proactive solutions early in the development program lifecycle. The study identified several constraints that prevented the development program from adopting obsolescence management theories, specifically proactive ones, early on. There were three major themes: (a) management commitment, (b) lack of detail in the statement of work, and (c) vendor management. Each major theme includes several subthemes.
It is recommended that future researchers explore two areas: (a) comparing the cost of managing obsolescence early in the development process with the cost of managing it later, and (b) exploring the costs and value of establishing a centralized obsolescence group at each major defense contractor location.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elbert, Stephen T.; Kalsi, Karanjit; Vlachopoulou, Maria
Financial Transmission Rights (FTRs) help power market participants reduce price risks associated with transmission congestion. FTRs are issued based on a process of solving a constrained optimization problem with the objective to maximize the FTR social welfare under power flow security constraints. Security constraints for different FTR categories (monthly, seasonal or annual) are usually coupled and the number of constraints increases exponentially with the number of categories. Commercial software for FTR calculation can only provide limited categories of FTRs due to the inherent computational challenges mentioned above. In this paper, a novel non-linear dynamical system (NDS) approach is proposed to solve the optimization problem. The new formulation and performance of the NDS solver is benchmarked against widely used linear programming (LP) solvers like CPLEX™ and tested on large-scale systems using data from the Western Electricity Coordinating Council (WECC). The NDS is demonstrated to outperform the widely used CPLEX algorithms while exhibiting superior scalability. Furthermore, the NDS based solver can be easily parallelized which results in significant computational improvement.
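The dynamical-system idea can be illustrated on a toy problem. This sketch is not the paper's NDS formulation: it integrates a projected gradient flow with forward Euler steps, maximizing a stand-in concave welfare function under a single halfspace "flow limit" constraint a·x <= b; all names and numbers are assumptions.

```python
def nds_solve(grad, a, b, x0, dt=0.01, steps=5000):
    """Follow the gradient field of the (concave) objective, projecting each
    Euler step back onto the halfspace a.x <= b; the trajectory settles at
    the constrained maximizer."""
    x = list(x0)
    for _ in range(steps):
        g = grad(x)
        x = [xi + dt * gi for xi, gi in zip(x, g)]      # follow the vector field
        viol = sum(ai * xi for ai, xi in zip(a, x)) - b
        if viol > 0:                                    # project back onto a.x <= b
            norm2 = sum(ai * ai for ai in a)
            x = [xi - viol * ai / norm2 for xi, ai in zip(x, a)]
    return x

# maximize -(x1 - 2)^2 - (x2 - 2)^2  subject to  x1 + x2 <= 2; optimum at (1, 1)
grad = lambda x: [-2 * (x[0] - 2.0), -2 * (x[1] - 2.0)]
```

Unlike a simplex-style LP solver, each step here is a cheap local update over the full variable vector, which is the property that makes such dynamics easy to parallelize.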
Carpenter, G.B.; Cardinell, A.P.; Francois, D.K.; Good, L.K.; Lewis, R.L.; Stiles, N.T.
1982-01-01
Analysis of high-resolution geophysical data collected over 540 blocks tentatively selected for leasing in proposed OCS Oil and Gas Lease Sale 52 (Georges Bank) revealed a number of potential geologic hazards to oil and gas exploration and development activities: evidence of mass movements and shallow gas deposits on the continental slope. No potential hazards were observed on the continental shelf or rise. Other geology-related problems, termed constraints because they pose a relatively low degree of risk and can be routinely dealt with using existing technology, have been observed on the continental shelf. Constraints identified in the proposed sale area are erosion, sand waves, filled channels, and deep faults. Piston cores were collected for geotechnical analysis at selected locations on the continental slope in the proposed lease sale area. The core locations were selected to provide information on slope stability and to establish the general geotechnical properties of the sediments. Preliminary results of a testing program suggest that the surficial sediment cover is stable with respect to mass movement.
NASA Astrophysics Data System (ADS)
Kang, Jidong; Gianetto, James A.; Tyson, William R.
2018-03-01
Fracture toughness measurement is an integral part of structural integrity assessment of pipelines. Traditionally, a single-edge-notched bend (SE(B)) specimen with a deep crack is recommended in many existing pipeline structural integrity assessment procedures. Such a test provides high constraint and therefore conservative fracture toughness results. However, for girth welds in service, defects are usually subjected to primarily tensile loading, where the constraint is usually much lower than in the three-point bend case. Moreover, there is increasing use of strain-based design of pipelines, which allows applied strains above yield. Low-constraint toughness tests represent more realistic loading conditions for girth weld defects, and the corresponding increased toughness can minimize unnecessary conservatism in assessments. In this review, we present recent developments in low-constraint fracture toughness testing, specifically using single-edge-notched tension specimens, SENT or SE(T). We focus our review on test procedure development and automation, round-robin test results, and some common concerns such as crack-tip effects, crack size monitoring techniques, and testing at low temperatures. Examples are also given of the integration of fracture toughness data from SE(T) tests into structural integrity assessment.
Leverage Between the Buffering Effect and the Bystander Effect in Social Networking.
Chiu, Yu-Ping; Chang, Shu-Chen
2015-08-01
This study examined encouraged and inhibited social feedback behaviors based on the theories of the buffering effect and the bystander effect. A system program was used to collect personal data and social feedback from a Facebook data set to test the research model. The results revealed that the buffering effect induced a positive relationship between social network size and feedback gained from friends when people's social network size was under a certain cognitive constraint. For people with a social network size that exceeds this cognitive constraint, the bystander effect may occur, in which having more friends may inhibit social feedback. In this study, two social psychological theories were applied to explain social feedback behavior on Facebook, and it was determined that social network size and social feedback exhibited no consistent linear relationship.
Constraint-based component-modeling for knowledge-based design
NASA Technical Reports Server (NTRS)
Kolb, Mark A.
1992-01-01
The paper describes the application of various advanced programming techniques derived from artificial intelligence research to the development of flexible design tools for conceptual design. Special attention is given to two techniques which appear readily applicable to such design tools: constraint propagation and object-oriented programming. The implementation of these techniques in a prototype computer tool, Rubber Airplane, is described.
Path Following in the Exact Penalty Method of Convex Programming.
Zhou, Hua; Lange, Kenneth
2015-07-01
Classical penalty methods solve a sequence of unconstrained problems that put greater and greater stress on meeting the constraints. In the limit as the penalty constant tends to ∞, one recovers the constrained solution. In the exact penalty method, squared penalties are replaced by absolute value penalties, and the solution is recovered for a finite value of the penalty constant. In practice, the kinks in the penalty and the unknown magnitude of the penalty constant prevent wide application of the exact penalty method in nonlinear programming. In this article, we examine a strategy of path following consistent with the exact penalty method. Instead of performing optimization at a single penalty constant, we trace the solution as a continuous function of the penalty constant. Thus, path following starts at the unconstrained solution and follows the solution path as the penalty constant increases. In the process, the solution path hits, slides along, and exits from the various constraints. For quadratic programming, the solution path is piecewise linear and takes large jumps from constraint to constraint. For a general convex program, the solution path is piecewise smooth, and path following operates by numerically solving an ordinary differential equation segment by segment. Our diverse applications to a) projection onto a convex set, b) nonnegative least squares, c) quadratically constrained quadratic programming, d) geometric programming, and e) semidefinite programming illustrate the mechanics and potential of path following. The final detour to image denoising demonstrates the relevance of path following to regularized estimation in inverse problems. In regularized estimation, one follows the solution path as the penalty constant decreases from a large value.
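The one-dimensional mechanics of the exact penalty are easy to reproduce. In this sketch (a toy example, not from the article), minimizing f(x) = (x - 2)^2 subject to x <= 1 via the absolute-value penalty traces the solution path x = 2 - rho/2 as the penalty constant rho grows, reaching the constrained solution x = 1 at the finite value rho = 2; a squared penalty would only reach it as rho tends to infinity.

```python
def penalized_min(rho, lo=-1.0, hi=3.0, n=40001):
    """Minimizer of (x - 2)^2 + rho * max(0, x - 1) on a fine grid.
    The max(0, .) term is the exact (absolute-value) penalty for x <= 1."""
    xs = [lo + (hi - lo) * i / (n - 1) for i in range(n)]
    F = lambda x: (x - 2.0) ** 2 + rho * max(0.0, x - 1.0)
    return min(xs, key=F)
```

Sweeping rho upward from 0 traces exactly the kind of path the article follows: the unconstrained solution at rho = 0 slides continuously toward the constraint and sticks to it once rho is large enough.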
A Monte Carlo Approach for Adaptive Testing with Content Constraints
ERIC Educational Resources Information Center
Belov, Dmitry I.; Armstrong, Ronald D.; Weissman, Alexander
2008-01-01
This article presents a new algorithm for computerized adaptive testing (CAT) when content constraints are present. The algorithm is based on shadow CAT methodology to meet content constraints but applies Monte Carlo methods and provides the following advantages over shadow CAT: (a) lower maximum item exposure rates, (b) higher utilization of the…
Test development for the thermionic system evaluation test (TSET) project
NASA Astrophysics Data System (ADS)
Morris, D. Brent; Standley, Vaughn H.; Schuller, Michael J.
1992-01-01
The arrival of a Soviet TOPAZ-II space nuclear reactor affords the US space nuclear power (SNP) community the opportunity to study an assembled thermionic conversion power system. The TOPAZ-II will be studied via the Thermionic System Evaluation Test (TSET) Project. This paper is devoted to the discussion of TSET test development as related to the objectives contained in the TSET Project Plan (Standley et al. 1991). The objectives contained in the Project Plan are the foundation for scheduled TSET tests on TOPAZ-II and are derived from the needs of the Air Force Thermionic SNP program. Our ability to meet the objectives is bounded by unique constraints, such as procurement requirements, operational limitations, and the necessary interaction between US and Soviet scientists and engineers. The fulfillment of the test objectives involves a thorough methodology of test scheduling and data management. The overall goals for the TSET program are gaining technical understanding of a thermionic SNP system and demonstrating the capabilities and limitations of such a system while assisting in the training of US scientists and engineers in preparation for US SNP system testing. Tests presently scheduled as part of TSET include setup, demonstration, and verification tests; normal and off-normal operating tests; and system and component performance tests.
NASA Technical Reports Server (NTRS)
Head, J. W. (Editor)
1978-01-01
Developments reported at a meeting of principal investigators for NASA's planetary geology program are summarized. Topics covered include: constraints on solar system formation; asteroids, comets, and satellites; constraints on planetary interiors; volatiles and regoliths; instrument development techniques; planetary cartography; geological and geochemical constraints on planetary evolution; fluvial processes and channel formation; volcanic processes; eolian processes; radar studies of planetary surfaces; cratering as a process, landform, and dating method; and the Tharsis region of Mars. Activities at a planetary geology field conference on eolian processes are reported, and techniques recommended for the presentation and analysis of crater size-frequency data are included.
Integrity Constraint Monitoring in Software Development: Proposed Architectures
NASA Technical Reports Server (NTRS)
Fernandez, Francisco G.
1997-01-01
In the development of complex software systems, designers are required to obtain from many sources and manage vast amounts of knowledge of the system being built and communicate this information to personnel with a variety of backgrounds. Knowledge concerning the properties of the system, including the structure of, relationships between, and limitations of the data objects in the system, becomes increasingly more vital as the complexity of the system and the number of knowledge sources increase. Ensuring that violations of these properties do not occur becomes steadily more challenging. One approach toward managing the enforcement of system properties, called context monitoring, uses a centralized repository of integrity constraints and a constraint satisfiability mechanism for dynamic verification of property enforcement during program execution. The focus of this paper is to describe possible software architectures that define a mechanism for dynamically checking the satisfiability of a set of constraints on a program. The next section describes the context monitoring approach in general. Section 3 gives an overview of the work currently being done toward the addition of an integrity constraint satisfiability mechanism to a high-level programming language, SequenceL, and demonstrates how this model is being examined to develop a general software architecture. Section 4 describes possible architectures for a general constraint satisfiability mechanism, as well as an alternative approach that uses embedded database queries in lieu of an external monitor. The paper concludes with a brief summary outlining the current state of the research and future work.
Dal Palú, Alessandro; Spyrakis, Francesca; Cozzini, Pietro
2012-03-01
We describe the potential of a novel method, based on Constraint Logic Programming (CLP), developed for an exhaustive sampling of protein conformational space. The CLP framework proposed here has been tested and applied to the estrogen receptor, whose activity and function are strictly related to its intrinsic, and well known, dynamics. We have investigated in particular the flexibility of H12, focusing on the pathways followed by the helix when moving from one stable crystallographic conformation to the others. Millions of geometrically feasible conformations were generated and selected, and the traces connecting the different forms were determined by using a shortest path algorithm. The preliminary analyses showed a marked agreement between the crystallographic agonist-like, antagonist-like and hypothetical apo forms, and the corresponding conformations identified by the CLP framework. These promising results, together with the short computational time required to perform the analyses, make this constraint-based approach a valuable tool for the study of protein folding prediction. The CLP framework enables one to consider various structural and energetic scenarios without changing the core algorithm. To show the feasibility of the method, we intentionally chose a pure geometric setting, neglecting the energetic evaluation of the poses, in order to be independent from a specific force field and to provide the possibility of comparing different behaviours associated with various energy models. Copyright © 2012 Elsevier Masson SAS. All rights reserved.
NASA Technical Reports Server (NTRS)
Watts, G.
1992-01-01
A programming technique to eliminate computational instability in multibody simulations that use the Lagrange multiplier is presented. The computational instability occurs when the attached bodies drift apart and violate the constraints. The programming technique uses the constraint equation, instead of integration, to determine the coordinates that are not independent. Although the equations of motion are unchanged, a complete derivation of the incorporation of the Lagrange multiplier into the equation of motion for two bodies is presented. A listing of a digital computer program which uses the programming technique to eliminate computational instability is also presented. The computer program simulates a solid rocket booster and parachute connected by a frictionless swivel.
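The Lagrange-multiplier formulation of constrained multibody motion that the report derives has the standard form (notation assumed here, not quoted from the report):

```latex
M(q)\,\ddot{q} \;=\; F(q,\dot{q},t) \;+\; \Phi_q^{\mathsf{T}}\lambda,
\qquad \Phi(q) \;=\; 0
```

where Φ(q) = 0 collects the attachment constraints (here, the frictionless swivel) and Φ_q is its Jacobian. Numerical integration enforces only the differentiated (acceleration-level) form of the constraint, so integration error lets Φ(q) drift away from zero; the technique described instead solves Φ(q) = 0 algebraically for the dependent coordinates at each step, which is why the bodies cannot drift apart.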
Interactive two-stage stochastic fuzzy programming for water resources management.
Wang, S; Huang, G H
2011-08-01
In this study, an interactive two-stage stochastic fuzzy programming (ITSFP) approach has been developed through incorporating an interactive fuzzy resolution (IFR) method within an inexact two-stage stochastic programming (ITSP) framework. ITSFP can not only tackle dual uncertainties presented as fuzzy boundary intervals that exist in the objective function and the left- and right-hand sides of constraints, but also permit in-depth analyses of various policy scenarios that are associated with different levels of economic penalties when the promised policy targets are violated. A management problem in terms of water resources allocation has been studied to illustrate applicability of the proposed approach. The results indicate that a set of solutions under different feasibility degrees has been generated for planning the water resources allocation. They can help the decision makers (DMs) to conduct in-depth analyses of tradeoffs between economic efficiency and constraint-violation risk, as well as enable them to identify, in an interactive way, a desired compromise between satisfaction degree of the goal and feasibility of the constraints (i.e., risk of constraint violation). Copyright © 2011 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Zhao, Dang-Jun; Song, Zheng-Yu
2017-08-01
This study proposes a multiphase convex programming approach for rapid reentry trajectory generation that satisfies path, waypoint and no-fly zone (NFZ) constraints on Common Aerial Vehicles (CAVs). Because the time when the vehicle reaches the waypoint is unknown, the trajectory of the vehicle is divided into several phases according to the prescribed waypoints, rendering a multiphase optimization problem with free final time. Due to the requirement of rapidity, the minimum flight time of each phase is preferred as the performance index over other indices in this research. Sequential linearization is used to approximate the nonlinear dynamics of the vehicle as well as the nonlinear concave path constraints on the heat rate, dynamic pressure, and normal load; meanwhile, convexification techniques are proposed to relax the concave constraints on control variables. Next, the original multiphase optimization problem is reformulated as a standard second-order convex programming problem. Theoretical analysis is conducted to show that the original problem and the converted problem have the same solution. Numerical results are presented to demonstrate that the proposed approach is efficient and effective.
DOE Office of Scientific and Technical Information (OSTI.GOV)
John, C.J.; Maciasz, G.; Harder, B.J.
The US Department of Energy established a geopressured-geothermal energy program in the mid-1970s as one response to America's need to develop alternate energy resources in view of the increasing dependence on imported fossil fuel energy. This program continued for 17 years and approximately two hundred million dollars were expended for various types of research and well testing to thoroughly investigate this alternative energy source. This volume describes the following studies: Geopressured-geothermal resource description; Resource origin and sediment type; Gulf Coast resource extent; Resource estimates; Project history; Authorizing legislation; Program objectives; Perceived constraints; Program activities and structure; Well testing; Program management; Program cost summary; Funding history; Resource characterization; Wells of opportunity; Edna Delcambre No. 1 well; Edna Delcambre well recompletion; Fairfax Foster Sutter No. 2 well; Beulah Simon No. 2 well; P.E. Girouard No. 1 well; Prairie Canal No. 1 well; Crown Zellerbach No. 2 well; Alice C. Plantation No. 2 well; Tenneco Fee N No. 1 well; Pauline Kraft No. 1 well; Saldana well No. 2; G.M. Koelemay well No. 1; Willis Hulin No. 1 well; Investigations of other wells of opportunity; Clovis A. Kennedy No. 1 well; Watkins-Miller No. 1 well; Lucien J. Richard et al No. 1 well; and the C and K-Frank A. Godchaux, III, well No. 1.
Real-Time MENTAT programming language and architecture
NASA Technical Reports Server (NTRS)
Grimshaw, Andrew S.; Silberman, Ami; Liu, Jane W. S.
1989-01-01
Real-time MENTAT, a programming environment designed to simplify the task of programming real-time applications in distributed and parallel environments, is described. It is based on the same data-driven computation model and object-oriented programming paradigm as MENTAT. It provides an easy-to-use mechanism to exploit parallelism, language constructs for the expression and enforcement of timing constraints, and run-time support for scheduling and executing real-time programs. The real-time MENTAT programming language is an extended C++. The extensions are added to facilitate automatic detection of data flow and generation of data flow graphs, to express the timing constraints of individual granules of computation, and to provide scheduling directives for the runtime system. A high-level view of the real-time MENTAT system architecture and programming language constructs is provided.
NASA Technical Reports Server (NTRS)
Kolb, Mark A.
1988-01-01
The Rubber Airplane program, which combines two symbolic processing techniques with a component-based database of design knowledge, is proposed as a computer aid for conceptual design. Using object-oriented programming, programs are organized around the objects and behavior to be simulated, and using constraint propagation, declarative statements designate mathematical relationships among all the equation variables. It is found that the additional level of organizational structure resulting from the arrangement of the design information in terms of design components provides greater flexibility and convenience.
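Constraint propagation of the kind described can be illustrated with a minimal sketch (illustrative only; the Rubber Airplane program's actual implementation, classes, and names are not shown in the abstract): a single declarative relation c = a·b fills in whichever variable is still unknown, so values flow in any direction among the equation variables.

```python
class Constraint:
    """Minimal constraint-propagation sketch (hypothetical, not the
    Rubber Airplane implementation).  A product constraint c = a * b
    fills in whichever variable is still unknown."""

    def __init__(self, cells):
        # Each cell is a mutable holder: {'value': number or None}.
        self.a, self.b, self.c = cells

    def propagate(self):
        a, b, c = (cell['value'] for cell in (self.a, self.b, self.c))
        if a is not None and b is not None:
            self.c['value'] = a * b          # forward: c from a, b
        elif c is not None and a not in (None, 0):
            self.b['value'] = c / a          # backward: b from c, a
        elif c is not None and b not in (None, 0):
            self.a['value'] = c / b          # backward: a from c, b

# F = m * a declared once; the same declaration solves for any variable.
m, a, F = {'value': 2.0}, {'value': None}, {'value': 10.0}
Constraint((m, a, F)).propagate()
print(a['value'])  # 5.0
```

A real system would chain many such constraints and re-run propagation until no cell changes; the point here is only that the relationship is stated declaratively, not as a one-way assignment.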
[Constraints and opportunities for inter-sector health promotion initiatives: a case study].
Magalhães, Rosana
2015-07-01
This article analyzes the implementation of inter-sector initiatives linked to the Family Grant, Family Health, and School Health Programs in the Manguinhos neighborhood in the North Zone of Rio de Janeiro, Brazil. The study was conducted in 2010 and 2011 and included document review, local observation, and 25 interviews with program managers, professionals, and staff. This was an exploratory case study using a qualitative approach that identified constraints and opportunities for inter-sector health experiences, contributing to the debate on the effectiveness of health promotion and poverty relief programs.
Object matching using a locally affine invariant and linear programming techniques.
Li, Hongsheng; Huang, Xiaolei; He, Lei
2013-02-01
In this paper, we introduce a new matching method based on a novel locally affine-invariant geometric constraint and linear programming techniques. To model and solve the matching problem in a linear programming formulation, all geometric constraints should be able to be exactly or approximately reformulated into a linear form. This is a major difficulty for this kind of matching algorithm. We propose a novel locally affine-invariant constraint which can be exactly linearized and requires far fewer auxiliary variables than other linear programming-based methods do. The key idea behind it is that each point in the template point set can be exactly represented by an affine combination of its neighboring points, whose weights can be solved easily by least squares. Errors of reconstructing each matched point using such weights are used to penalize the disagreement of geometric relationships between the template points and the matched points. The resulting overall objective function can be solved efficiently by linear programming techniques. Our experimental results on both rigid and nonrigid object matching show the effectiveness of the proposed algorithm.
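The key idea, reconstructing each template point as an affine combination of its neighbors with weights found by least squares, can be sketched as follows (a minimal illustration, not the authors' code; NumPy is assumed to be available):

```python
import numpy as np

def affine_weights(point, neighbors):
    """Weights w with sum(w) == 1 that reconstruct `point` as an
    affine combination of `neighbors` (one neighbor per row),
    found by least squares."""
    # Stack coordinates with a row of ones to enforce sum(w) == 1.
    A = np.vstack([neighbors.T, np.ones(len(neighbors))])
    b = np.append(point, 1.0)
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    return w

# Because the weights sum to one, they are preserved under any affine
# map of the points, which is what makes the constraint affine-invariant;
# applying the template weights to candidate matched points and measuring
# the reconstruction error penalizes geometric disagreement.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
p = np.array([0.25, 0.25])
w = affine_weights(p, pts)
print(np.allclose(w @ pts, p))  # True: exact affine reconstruction
```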
WIPP waste characterization program sampling and analysis guidance manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1991-01-01
The Waste Isolation Pilot Plant (WIPP) Waste Characterization Program Sampling and Analysis Guidance Manual (Guidance Manual) provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Quality Assurance Program Plan (QAPP) for the WIPP Experimental-Waste Characterization Program (the Program). This Guidance Manual includes all of the sampling and testing methodologies accepted by the WIPP Project Office (DOE/WPO) for use in implementing the Program requirements specified in the QAPP. This includes methods for characterizing representative samples of transuranic (TRU) wastes at DOE generator sites with respect to the gas generation controlling variables defined in the WIPP bin-scale and alcove test plans, as well as waste container headspace gas sampling and analytical procedures to support waste characterization requirements under the WIPP test program and the Resource Conservation and Recovery Act (RCRA). The procedures in this Guidance Manual are comprehensive and detailed and are designed to provide the necessary guidance for the preparation of site specific procedures. The use of these procedures is intended to provide the necessary sensitivity, specificity, precision, and comparability of analyses and test results. The solutions to achieving specific program objectives will depend upon facility constraints, compliance with DOE Orders and DOE facilities' operating contractor requirements, and the knowledge and experience of the TRU waste handlers and analysts. With some analytical methods, such as gas chromatography/mass spectrometry, the Guidance Manual procedures may be used directly. With other methods, such as nondestructive/destructive characterization, the Guidance Manual provides guidance rather than a step-by-step procedure.
Liles, Elizabeth G; Schneider, Jennifer L; Feldstein, Adrianne C; Mosen, David M; Perrin, Nancy; Rosales, Ana Gabriela; Smith, David H
2015-03-29
Few studies describe system-level challenges or facilitators to implementing population-based colorectal cancer (CRC) screening outreach programs. Our qualitative study explored viewpoints of multilevel stakeholders before, during, and after implementation of a centralized outreach program. Program implementation was part of a broader quality-improvement initiative. During 2008-2010, we conducted semi-structured, open-ended individual interviews and focus groups at Kaiser Permanente Northwest (KPNW), a not-for-profit group model health maintenance organization using the practical robust implementation and sustainability model to explore external and internal barriers to CRC screening. We interviewed 55 stakeholders: 8 health plan leaders, 20 primary care providers, 4 program managers, and 23 endoscopy specialists (15 gastroenterologists, 8 general surgeons), and analyzed interview transcripts to identify common as well as divergent opinions expressed by stakeholders. The majority of stakeholders at various levels consistently reported that an automated telephone-reminder system to contact patients and coordinate mailing fecal tests alleviated organizational constraints on staff's time and resources. Changing to a single-sample fecal immunochemical test (FIT) lessened patient and provider concerns about feasibility and accuracy of fecal testing. The centralized telephonic outreach program did, however, result in some screening duplication and overuse. Higher rates of FIT completion and a higher proportion of positive results with FIT required more colonoscopies. Addressing barriers at multiple levels of a health system by changing the delivery system design to add a centralized outreach program, switching to a more accurate and easier-to-use fecal test, and providing educational and electronic support had both benefits and problematic consequences. Other health care organizations can use our results to understand the complexities of implementing centralized screening programs.
NASA Astrophysics Data System (ADS)
Huseyin Turan, Hasan; Kasap, Nihat; Savran, Huseyin
2014-03-01
Nowadays, every firm uses telecommunication networks in different amounts and ways in order to complete their daily operations. In this article, we investigate an optimisation problem that a firm faces when acquiring network capacity from a market in which there exist several network providers offering different pricing and quality of service (QoS) schemes. The QoS level guaranteed by network providers and the minimum quality level of service, which is needed for accomplishing the operations are denoted as fuzzy numbers in order to handle the non-deterministic nature of the telecommunication network environment. Interestingly, the mathematical formulation of the aforementioned problem leads to the special case of a well-known two-dimensional bin packing problem, which is famous for its computational complexity. We propose two different heuristic solution procedures that have the capability of solving the resulting nonlinear mixed integer programming model with fuzzy constraints. In conclusion, the efficiency of each algorithm is tested in several test instances to demonstrate the applicability of the methodology.
2010-12-01
computers in 1953. HIL motion simulators were also built for the dynamic testing of vehicle components (e.g. suspensions, bodies) with hydraulic or...complex, comprehensive mechanical systems can be simulated in real-time by parallel computers; examples include multibody systems, brake systems...hard constraints in a multivariable control framework. And the third aspect is the ability to perform online optimization. These aspects result in
Design, testing, and damage tolerance study of bonded stiffened composite wing cover panels
NASA Technical Reports Server (NTRS)
Madan, Ram C.; Sutton, Jason O.
1988-01-01
Results are presented from the application of damage tolerance criteria for composite panels to multistringer composite wing cover panels developed under NASA's Composite Transport Wing Technology Development contract. This conceptual wing design integrated aeroelastic stiffness constraints with an enhanced damage tolerance material system, in order to yield optimized producibility and structural performance. Damage tolerance was demonstrated in a test program using full-sized cover panel subcomponents; panel skins were impacted at midbay between stiffeners, directly over a stiffener, and over the stiffener flange edge. None of the impacts produced visible damage. NASTRAN analyses were performed to simulate NDI-detected invisible damage.
A Comparison of Trajectory Optimization Methods for the Impulsive Minimum Fuel Rendezvous Problem
NASA Technical Reports Server (NTRS)
Hughes, Steven P.; Mailhe, Laurie M.; Guzman, Jose J.
2002-01-01
In this paper we present a comparison of optimization approaches to the minimum fuel rendezvous problem. Both indirect and direct methods are compared for a variety of test cases. The indirect approach is based on primer vector theory. The direct approaches are implemented numerically and include Sequential Quadratic Programming (SQP), Quasi-Newton, Simplex, Genetic Algorithms, and Simulated Annealing. Each method is applied to a variety of test cases including circular-to-circular coplanar orbits, LEO to GEO, and orbit phasing in highly elliptic orbits. We also compare different constrained optimization routines on complex orbit rendezvous problems with complicated, highly nonlinear constraints.
NASA Technical Reports Server (NTRS)
Arneson, Heather M.; Dousse, Nicholas; Langbort, Cedric
2014-01-01
We consider control design for positive compartmental systems in which each compartment's outflow rate is described by a concave function of the amount of material in the compartment. We address the problem of determining the routing of material between compartments to satisfy time-varying state constraints while ensuring that material reaches its intended destination over a finite time horizon. We give sufficient conditions for the existence of a time-varying state-dependent routing strategy which ensures that the closed-loop system satisfies basic network properties of positivity, conservation and interconnection while ensuring that capacity constraints are satisfied, when possible, or adjusted if a solution cannot be found. These conditions are formulated as a linear programming problem. Instances of this linear programming problem can be solved iteratively to generate a solution to the finite horizon routing problem. Results are given for the application of this control design method to an example problem. Key words: linear programming; control of networks; positive systems; controller constraints and structure.
The Definition and Implementation of a Computer Programming Language Based on Constraints.
1980-08-01
though not quite reached, is a complete programming system which will implicitly support the constraint paradigm to the same extent that LISP, say...and detecting and resolving conflicts, just as LISP provides certain services such as automatic storage management, which records given data in a...defined- it permits the statement of equalities and some simple arithmetic relationships. An implementation representation is chosen, and LISP code for a
Safety and environmental constraints on space applications of fusion energy
NASA Technical Reports Server (NTRS)
Roth, J. Reece
1990-01-01
Constraints are examined on fusion reactions, plasma confinement systems, and fusion reactors intended for such space-related missions as manned or unmanned operations in near-earth orbit, interplanetary missions, or requirements of the SDI program. Of the many constraints on space power and propulsion systems, those arising from safety and environmental considerations are emphasized, since these considerations place severe constraints on some fusion systems and have not been adequately treated in previous studies.
2011-12-28
specify collaboration constraints that occur in Java and XML frameworks and that the collaboration constraints from these frameworks matter in practice. (a...programming language boundaries, and Chapter 6 and Appendix A demonstrate that Fusion can specify constraints across both Java and XML in practice. (c...designed JUnit, Josh Bloch designed Java Collections, and Krzysztof Cwalina designed the .NET Framework APIs. While all of these frameworks are very
Darmon, Nicole; Ferguson, Elaine L; Briend, André
2006-01-01
To predict, for French women, the impact of a cost constraint on the food choices required to provide a nutritionally adequate diet. Isocaloric daily diets fulfilling both palatability and nutritional constraints were modeled in linear programming, using different cost constraint levels. For each modeled diet, total departure from an observed French population's average food group pattern ("mean observed diet") was minimized. To achieve the nutritional recommendations without a cost constraint, the modeled diet provided more energy from fish, fresh fruits and green vegetables and less energy from animal fats and cheese than the "mean observed diet." Introducing and strengthening a cost constraint decreased the energy provided by meat, fresh vegetables, fresh fruits, vegetable fat, and yogurts and increased the energy from processed meat, eggs, offal, and milk. For the lowest cost diet (ie, 3.18 euros/d), marked changes from the "mean observed diet" were required, including a marked reduction in the amount of energy from fresh fruits (-85%) and green vegetables (-70%), and an increase in the amount of energy from nuts, dried fruits, roots, legumes, and fruit juices. Nutrition education for low-income French women must emphasize these affordable food choices.
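The linear program sketched in the abstract has the following general shape (symbols assumed here for illustration): choose food-group quantities x_f to minimize total departure from the observed diet x_f^obs, subject to energy, nutrient, and cost constraints:

```latex
\min_{x \ge 0} \; \sum_{f} \bigl| x_f - x_f^{\mathrm{obs}} \bigr|
\quad \text{s.t.} \quad
\sum_{f} e_f x_f = E, \qquad
\sum_{f} a_{nf} x_f \ge r_n \;\; \forall n, \qquad
\sum_{f} c_f x_f \le C
```

where e_f, a_nf, and c_f are the energy, nutrient, and cost coefficients of food group f, r_n are the nutritional recommendations, and C is the cost-constraint level that is progressively tightened (down to 3.18 euros/day in the study). The absolute values are linearized in the usual way with auxiliary variables so the problem remains a linear program.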
An algorithm for the solution of dynamic linear programs
NASA Technical Reports Server (NTRS)
Psiaki, Mark L.
1989-01-01
The algorithm's objective is to efficiently solve Dynamic Linear Programs (DLP) by taking advantage of their special staircase structure. This algorithm constitutes a stepping stone to an improved algorithm for solving Dynamic Quadratic Programs, which, in turn, would make the nonlinear programming method of Successive Quadratic Programs more practical for solving trajectory optimization problems. The ultimate goal is to bring trajectory optimization solution speeds into the realm of real-time control. The algorithm exploits the staircase nature of the large constraint matrix of the equality-constrained DLPs encountered when solving inequality-constrained DLPs by an active set approach. A numerically stable, staircase QL factorization of the staircase constraint matrix is carried out starting from its last rows and columns. The resulting recursion is like the time-varying Riccati equation from multi-stage LQR theory. The resulting factorization increases the efficiency of all of the typical LP solution operations over that of a dense matrix LP code. At the same time numerical stability is ensured. The algorithm also takes advantage of dynamic programming ideas about the cost-to-go by relaxing active pseudo constraints in a backwards sweeping process. This further decreases the cost per update of the LP rank-1 updating procedure, although it may result in more changes of the active set than if pseudo constraints were relaxed in a non-stagewise fashion. The usual stability of closed-loop Linear/Quadratic optimally-controlled systems, if it carries over to strictly linear cost functions, implies that the saving due to reduced factor update effort may outweigh the cost of an increased number of updates. An aerospace example is presented in which a ground-to-ground rocket's distance is maximized. This example demonstrates the applicability of this class of algorithms to aerospace guidance. It also sheds light on the efficacy of the proposed pseudo constraint relaxation scheme.
Use of Heritage Hardware on MPCV Exploration Flight Test One
NASA Technical Reports Server (NTRS)
Rains, George Edward; Cross, Cynthia D.
2011-01-01
Due to an aggressive schedule for the first orbital test flight of an unmanned Orion capsule, known as Exploration Flight Test One (EFT1), combined with severe programmatic funding constraints, an effort was made to identify heritage hardware, i.e., already existing, flight-certified components from previous manned space programs, which might be available for use on EFT1. With the end of the Space Shuttle Program, no current means exists to launch Multi Purpose Logistics Modules (MPLMs) to the International Space Station (ISS), and so the inventory of many flight-certified Shuttle and MPLM components are available for other purposes. Two of these items are the Shuttle Ground Support Equipment Heat Exchanger (GSE Hx) and the MPLM cabin Positive Pressure Relief Assembly (PPRA). In preparation for the utilization of these components by the Orion Program, analyses and testing of the hardware were performed. The PPRA had to be analyzed to determine its susceptibility to pyrotechnic shock, and vibration testing had to be performed, since those environments are predicted to be significantly more severe during an Orion mission than those the hardware was originally designed to accommodate. The GSE Hx had to be tested for performance with the Orion thermal working fluids, which are different from those used by the Space Shuttle. This paper summarizes the certification of the use of heritage hardware for EFT1.
Engineering risk reduction in satellite programs
NASA Technical Reports Server (NTRS)
Dean, E. S., Jr.
1979-01-01
Methods developed in planning and executing system safety engineering programs for Lockheed satellite integration contracts are presented. These procedures establish the applicable safety design criteria, document design compliance and assess the residual risks where non-compliant design is proposed, and provide for hazard analysis of system level test, handling and launch preparations. Operations hazard analysis identifies product protection and product liability hazards prior to the preparation of operational procedures and provides safety requirements for inclusion in them. The method developed for documenting all residual hazards for the attention of program management assures an acceptable minimum level of risk prior to program deployment. The results are significant for persons responsible for managing or engineering the deployment and production of complex, high-cost equipment who, under current product liability law and cost/time constraints, have a responsibility to minimize the possibility of an accident and should have documentation to provide a defense in a product liability suit.
Juggling Act: Re-Planning and Building on Observatory...Simultaneously!
NASA Technical Reports Server (NTRS)
Zavala, Eddie; Daws, Patricia
2011-01-01
SOFIA (Stratospheric Observatory for Infrared Astronomy) is a major SMD program that has been required to meet several requirements and implement major planning and business initiatives over the past 1 1/2 years, in the midst of system development and flight test phases. The program was required to implement JCL and EVM simultaneously, as well as undergo a major replan and a Standing Review Board, all without impacting technical schedule progress. The team developed innovative processes that met all the requirements and improved Program Management process toolsets. The SOFIA team, being subject to all the typical budget constraints, found ways to leverage existing roles in new ways to meet the requirements without creating unmanageable overhead. The team developed strategies and value-added processes, such as improved risk identification, structured reserves management, and cost/risk integration, so that the effort expended resulted in a positive return to the program.
Post optimization paradigm in maximum 3-satisfiability logic programming
NASA Astrophysics Data System (ADS)
Mansor, Mohd. Asyraf; Sathasivam, Saratha; Kasihmuddin, Mohd Shareduwan Mohd
2017-08-01
Maximum 3-Satisfiability (MAX-3SAT) is a counterpart of the Boolean satisfiability problem that can be treated as a constraint optimization problem. It deals with the problem of searching for the maximum number of satisfied clauses in a particular 3-SAT formula. This paper presents the implementation of an enhanced Hopfield network for hastening Maximum 3-Satisfiability (MAX-3SAT) logic programming. Four post-optimization techniques are investigated: the Elliot symmetric activation function, the Gaussian activation function, the Wavelet activation function, and the hyperbolic tangent activation function. The performance of these post-optimization techniques in accelerating MAX-3SAT logic programming is discussed in terms of the ratio of maximum satisfied clauses, Hamming distance, and computation time. Dev-C++ was used as the platform for training, testing, and validating the proposed techniques. The results show that the hyperbolic tangent activation function and the Elliot symmetric activation function can be used for MAX-3SAT logic programming.
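The MAX-3SAT objective itself is easy to state in code. The sketch below (an invented toy instance solved by brute force, not the Hopfield-network approach the paper studies) counts satisfied clauses and finds the maximum by exhaustive search:

```python
def satisfied_clauses(assignment, clauses):
    """Count clauses with at least one true literal.
    A literal is an int: +i means variable i, -i means its negation."""
    return sum(
        any((lit > 0) == assignment[abs(lit)] for lit in clause)
        for clause in clauses
    )

# An invented 3-SAT instance over variables 1..3.
clauses = [(1, 2, 3), (-1, 2, -3), (1, -2, 3), (-1, -2, -3)]

# Exhaustive search for the maximum number of satisfied clauses.
best = 0
for bits in range(2 ** 3):
    assignment = {v: bool(bits >> (v - 1) & 1) for v in (1, 2, 3)}
    best = max(best, satisfied_clauses(assignment, clauses))

print(best / len(clauses))  # the paper's "ratio of maximum satisfied clauses"
```

Exhaustive search is exponential, which is precisely why the paper turns to neural-network heuristics for larger formulas.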
NASA Technical Reports Server (NTRS)
Knox, C. E.
1983-01-01
A simplified flight-management descent algorithm, programmed on a small programmable calculator, was developed and flight tested. It was designed to aid the pilot in planning and executing a fuel-conservative descent to arrive at a metering fix at a time designated by the air traffic control system. The algorithm may also be used for planning fuel-conservative descents when time is not a consideration. The descent path was calculated for a constant Mach/airspeed schedule from linear approximations of airplane performance with considerations given for gross weight, wind, and nonstandard temperature effects. The flight-management descent algorithm is described. The results of flight tests flown with a T-39A (Sabreliner) airplane are presented.
Coker-Bolt, Patty; Downey, Ryan J; Connolly, Jacqueline; Hoover, Reagin; Shelton, Daniel; Seo, Na Jin
2017-01-01
The aim of this pilot study was to determine the feasibility of using accelerometers before, during, and after a camp-based constraint-induced movement therapy (CIMT) program for children with hemiplegic cerebral palsy. A pre-test post-test design was used for 12 children with CP (mean age = 4.9 years) who completed a 30-hour camp-based CIMT program. The accelerometer data were collected using the ActiGraph GT9X Link. Children wore accelerometers on both wrists one day before and after the camp and on the affected limb during each camp day. Three developmental assessments were administered pre- and post-CIMT program. Accelerometers were successfully worn before, during, and directly after the CIMT program to collect upper limb data. Affected upper limb accelerometer activity significantly increased during the CIMT camp compared to baseline (p < 0.05). Significant improvements were seen in all twelve children on all assessments of affected upper limb function (p < 0.05) measuring capacity and quality of affected upper limb functioning. Accelerometers can be worn during high-intensity pediatric CIMT programs to collect data about affected upper limb function. Further study is required to determine the relationship between accelerometer data, measures of motor capacity, and real-world performance post-CIMT.
Space Shuttle stability and control flight test techniques
NASA Technical Reports Server (NTRS)
Cooke, D. R.
1980-01-01
A unique approach for obtaining vehicle aerodynamic characteristics during entry has been developed for the Space Shuttle. This is due to the high cost of Shuttle testing, the need to open constraints for operational flights, and the fact that all flight regimes are flown starting with the first flight. Because of uncertainties associated with predicted aerodynamic coefficients, nine flight conditions have been identified at which control problems could occur. A detailed test plan has been developed for testing at these conditions and is presented. Due to limited testing, precise computer-initiated maneuvers are implemented. These maneuvers are designed to optimize the vehicle motion for determining aerodynamic coefficients. Special sensors and atmospheric measurements are required to provide stability and control flight data during an entire entry. The techniques employed in data reduction are proven programs developed and used at NASA/DFRC.
Ranking Forestry Investments With Parametric Linear Programming
Paul A. Murphy
1976-01-01
Parametric linear programming is introduced as a technique for ranking forestry investments under multiple constraints; it combines the advantages of simple ranking and linear programming as capital budgeting tools.
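The parametric idea can be illustrated with a small LP whose budget constraint is swept over a range of values, revealing the order in which investments enter the optimal plan. The investment figures below are invented for illustration and are not Murphy's data:

```python
from scipy.optimize import linprog

# Hypothetical forestry investments: net present value and cost per project.
npv  = [120.0, 95.0, 60.0]   # objective coefficients (to maximize)
cost = [40.0, 25.0, 10.0]    # single budget constraint

# Sweep the budget parameter and watch which investments enter the plan.
for budget in (10, 25, 50, 75):
    res = linprog(c=[-v for v in npv],        # linprog minimizes, so negate
                  A_ub=[cost], b_ub=[budget],
                  bounds=[(0, 1)] * 3)        # fractional participation allowed
    print(budget, [round(x, 2) for x in res.x], round(-res.fun, 1))
```

With one budget constraint, the LP admits projects in decreasing order of NPV per dollar, which is exactly the "simple ranking" the abstract mentions; parametric LP generalizes this ranking to multiple simultaneous constraints.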
Declarative Programming with Temporal Constraints, in the Language CG.
Negreanu, Lorina
2015-01-01
Specifying and interpreting temporal constraints are key elements of knowledge representation and reasoning, with applications in temporal databases, agent programming, and ambient intelligence. We present and formally characterize the language CG, which tackles this issue. In CG, users are able to develop time-dependent programs, in a flexible and straightforward manner. Such programs can, in turn, be coupled with evolving environments, thus empowering users to control the environment's evolution. CG relies on a structure for storing temporal information, together with a dedicated query mechanism. Hence, we explore the computational complexity of our query satisfaction problem. We discuss previous implementation attempts of CG and introduce a novel prototype which relies on logic programming. Finally, we address the issue of consistency and correctness of CG program execution, using the Event-B modeling approach.
NASA Technical Reports Server (NTRS)
Tapia, R. A.; Vanrooy, D. L.
1976-01-01
A quasi-Newton method is presented for minimizing a nonlinear function while constraining the variables to be nonnegative and sum to one. The nonnegativity constraints were eliminated by working with the squares of the variables and the resulting problem was solved using Tapia's general theory of quasi-Newton methods for constrained optimization. A user's guide for a computer program implementing this algorithm is provided.
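A minimal sketch of the squaring substitution might look as follows. It uses a generic quasi-Newton routine from SciPy rather than Tapia's algorithm, an invented objective, and folds the sum-to-one constraint into the parameterization by normalizing (the paper instead keeps the sum constraint explicit):

```python
import numpy as np
from scipy.optimize import minimize

def f(x):
    """Example objective on the simplex: squared distance to a target mixture."""
    target = np.array([0.5, 0.3, 0.2])
    return np.sum((x - target) ** 2)

def to_simplex(y):
    """Squaring enforces nonnegativity; normalizing enforces sum-to-one."""
    s = y ** 2
    return s / s.sum()

g = lambda y: f(to_simplex(y))                 # unconstrained problem in y
res = minimize(g, x0=np.ones(3), method="BFGS")  # BFGS is a quasi-Newton method
x = to_simplex(res.x)
print(np.round(x, 3))
```

The substitution trades an inequality-constrained problem for a smooth unconstrained one, which is what makes standard quasi-Newton theory applicable.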
RSM 1.0 user's guide: A resupply scheduler using integer optimization
NASA Technical Reports Server (NTRS)
Viterna, Larry A.; Green, Robert D.; Reed, David M.
1991-01-01
The Resupply Scheduling Model (RSM) is a PC-based, fully menu-driven computer program. It uses integer programming techniques to determine an optimum schedule to replace components on or before a fixed replacement period, subject to user-defined constraints such as transportation mass and volume limits or available repair crew time. Principal input for RSM includes properties such as mass and volume and an assembly sequence. Resource constraints are entered for each period corresponding to the component properties. Though written to analyze the electrical power system on the Space Station Freedom, RSM is quite general and can be used to model the resupply of almost any system subject to user-defined resource constraints. Presented here is a step-by-step procedure for preparing the input, performing the analysis, and interpreting the results. Instructions for installing the program and information on the algorithms are given.
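As an illustrative stand-in for RSM's formulation (the components, masses, due periods, and costs below are invented), a small integer program with the same structure can be posed directly: ship each component on or before its due period, subject to a per-period mass limit, at minimum transport cost:

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

mass = np.array([4.0, 3.0, 2.0])   # component masses
due = [2, 1, 2]                    # latest allowed resupply period (1 or 2)
rate = [1.0, 1.5]                  # transport cost per unit mass, by period
CAP = 6.0                          # transportation mass limit per period

# Binary variables x[c, p] = 1 if component c ships in period p,
# flattened as [x00, x01, x10, x11, x20, x21].
cost = np.array([mass[i] * rate[p] for i in range(3) for p in range(2)])

ship_once = LinearConstraint(np.kron(np.eye(3), np.ones(2)), 1, 1)
mass_cap = LinearConstraint(np.tile(np.eye(2), 3) * np.repeat(mass, 2), 0, CAP)

ub = np.ones(6)
for i, d in enumerate(due):        # forbid shipping after the due period
    for p in range(2):
        if p + 1 > d:
            ub[i * 2 + p] = 0

res = milp(c=cost, constraints=[ship_once, mass_cap],
           integrality=np.ones(6), bounds=Bounds(0, ub))
print(res.x.round().astype(int), res.fun)
```

Here the mass cap forces the heaviest component into the more expensive later period, which is the kind of trade-off an optimal resupply schedule must resolve.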
Developing high-quality educational software.
Johnson, Lynn A; Schleyer, Titus K L
2003-11-01
The development of effective educational software requires a systematic process executed by a skilled development team. This article describes the core skills required of the development team members for the six phases of successful educational software development. During analysis, the foundation of product development is laid including defining the audience and program goals, determining hardware and software constraints, identifying content resources, and developing management tools. The design phase creates the specifications that describe the user interface, the sequence of events, and the details of the content to be displayed. During development, the pieces of the educational program are assembled. Graphics and other media are created, video and audio scripts written and recorded, the program code created, and support documentation produced. Extensive testing by the development team (alpha testing) and with students (beta testing) is conducted. Carefully planned implementation is most likely to result in a flawless delivery of the educational software and maintenance ensures up-to-date content and software. Due to the importance of the sixth phase, evaluation, we have written a companion article on it that follows this one. The development of a CD-ROM product is described including the development team, a detailed description of the development phases, and the lessons learned from the project.
ERIC Educational Resources Information Center
van der Linden, Wim J.
2011-01-01
A critical component of test speededness is the distribution of the test taker's total time on the test. A simple set of constraints on the item parameters in the lognormal model for response times is derived that can be used to control the distribution when assembling a new test form. As the constraints are linear in the item parameters, they can…
Building flexible real-time systems using the Flex language
NASA Technical Reports Server (NTRS)
Kenny, Kevin B.; Lin, Kwei-Jay
1991-01-01
The design and implementation of a real-time programming language called Flex, which is a derivative of C++, are presented. It is shown how different types of timing requirements might be expressed and enforced in Flex, how they might be fulfilled in a flexible way using different program models, and how the programming environment can help in making binding and scheduling decisions. The timing constraint primitives in Flex are easy to use yet powerful enough to define both independent and relative timing constraints. Program models like imprecise computation and performance polymorphism can carry out flexible real-time programs. In addition, programmers can use a performance measurement tool that produces statistically correct timing models to predict the expected execution time of a program and to help make binding decisions. A real-time programming environment is also presented.
A programing system for research and applications in structural optimization
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, J.; Rogers, J. L., Jr.
1981-01-01
The paper describes a computer programming system designed to be used for methodology research as well as applications in structural optimization. The flexibility necessary for such diverse utilizations is achieved by combining, in a modular manner, a state-of-the-art optimization program, a production-level structural analysis program, and user-supplied, problem-dependent interface programs. Standard utility capabilities existing in modern computer operating systems are used to integrate these programs. This approach results in flexibility of the optimization procedure organization and versatility in the formulation of constraints and design variables. Features shown in numerical examples include: (1) variability of structural layout and overall shape geometry, (2) static strength and stiffness constraints, (3) local buckling failure, and (4) vibration constraints. The paper concludes with a review of the further development trends of this programing system.
Andrykowski, Michael A.; Pavlik, Edward J.
2009-01-01
All cancer screening tests produce a proportion of abnormal results requiring follow-up. Consequently, the cancer screening setting is a natural laboratory for examining psychological and behavioral response to a threatening health-related event. This study tested hypotheses derived from the Social Cognitive Processing and Cognitive-Social Health Information Processing models in trying to understand response to an abnormal ovarian cancer (OC) screening test result. Women (n=278) receiving an abnormal screening test result a mean of 7 weeks earlier were assessed prior to a repeat screening test intended to clarify their previous abnormal result. Measures of disposition (optimism, informational coping style), social environment (social support and constraint), emotional processing, distress, and benefit finding were obtained. Regression analyses indicated greater distress was associated with greater social constraint and emotional processing and a monitoring coping style in women with a family history of OC. Distress was unrelated to social support. Greater benefit finding was associated with both greater social constraint and support and greater distress. The primacy of social constraint in accounting for both benefit-finding and distress was noteworthy and warrants further research on the role of social constraint in adaptation to stressful events. PMID:20419561
Low Velocity Airdrop Tests of an X-38 Backup Parachute Design
NASA Technical Reports Server (NTRS)
Stein, Jenny M.; Machin, Ricardo A.; Wolf, Dean F.; Hillebrandt, F. David
2007-01-01
The NASA Johnson Space Center's X-38 program designed a new backup parachute system to recover the 25,000 lb X-38 prototype for the Crew Return Vehicle spacecraft. Due to weight and cost constraints, the main backup parachute design incorporated rapid and low-cost fabrication techniques using off-the-shelf materials. Near the vent, the canopy was constructed of continuous ribbons to provide more damage tolerance. The remainder of the canopy was constructed with a continuous ringslot design. After cancellation of the X-38 program, the parachute design was resized, built, and drop tested for the Natick Soldiers Center's Low Velocity Air Drop (LVAD) program to deliver cargo loads up to 22,000 lbs from altitudes as low as 500 feet above the ground. Drop test results showed that the 500-foot LVAD parachute deployment conditions cause severe skirt inversion and inflation problems for large parachutes. The bag strip occurred at a high angle of attack, causing skirt inversion before the parachute could inflate. The addition of a short reefing line prevented the skirt inversion. Using a lower porosity in the vent area than is normally used in large parachutes improved inflation. The drop testing demonstrated that the parachute design could be refined to meet the requirements for the 500-foot LVAD mission.
The optimization of wireless power transmission: design and realization.
Jia, Zhiwei; Yan, Guozheng; Liu, Hua; Wang, Zhiwu; Jiang, Pingping; Shi, Yu
2012-09-01
A wireless power transmission system is regarded as a practical way of solving power-shortage problems in multifunctional active capsule endoscopes. The uniformity of magnetic flux density, frequency stability, and orientation stability are used to evaluate power transmission stability, taking into consideration size and safety constraints. Magnetic field safety and temperature rise are also considered. Test benches are designed to measure the relevant parameters. Finally, a mathematical programming model in which these constraints are considered is proposed to improve transmission efficiency. To verify the feasibility of the proposed method, various systems for a wireless active capsule endoscope are designed and evaluated. The optimal power transmission system has the capability to supply continuously at least 500 mW of power with a transmission efficiency of 4.08%. The example validates the feasibility of the proposed method. Introduction of novel designs enables further improvement of this method. Copyright © 2012 John Wiley & Sons, Ltd.
The Feasibility and Acceptability of "Arise": An Online Substance Abuse Relapse Prevention Program.
Sanchez, Rebecca Polley; Bartel, Chelsea M
2015-04-01
The purpose of this study was to test the feasibility and acceptability of a novel online adolescent substance abuse relapse prevention tool, "Arise" (3C Institute, Cary, NC). The program uses an innovative platform including interactive instructional segments and skill-building games to help adolescents learn and practice coping skills training strategies. We conducted a pilot test with nine adolescents in substance abuse treatment (44 percent female) and a feasibility test with treatment providers (n=8; 50 percent female). Adolescents interacted with the program via a secure Web site for approximately 30 minutes for each of two instructional units. Treatment providers reviewed the same material at their own pace. All participants completed a questionnaire with items assessing usability, acceptability, understanding, and subjective experience of the program. Regarding feasibility, recruitment of this population within the study constraints proved challenging, but participant retention in the trial was high (no attrition). Adolescents and treatment providers completed the program with no reported problems, and overall we were able to collect data as planned. Regarding acceptability, the program received strong ratings from both adolescents and providers, who found the prototype informative, engaging, and appealing. Both groups strongly recommended continuing development. We were able to deliver the intervention as intended, and acceptability ratings were high, demonstrating the feasibility and acceptability of online delivery of engaging interactive interventions. This study contributes to our understanding of how interactive technologies, including games, can be used to modify behavior in substance abuse treatment and other health areas.
A Framework for Dynamic Constraint Reasoning Using Procedural Constraints
NASA Technical Reports Server (NTRS)
Jonsson, Ari K.; Frank, Jeremy D.
1999-01-01
Many complex real-world decision and control problems contain an underlying constraint reasoning problem. This is particularly evident in a recently developed approach to planning, where almost all planning decisions are represented by constrained variables. This translates a significant part of the planning problem into a constraint network whose consistency determines the validity of the plan candidate. Since higher-level choices about control actions can add or remove variables and constraints, the underlying constraint network is invariably highly dynamic. Arbitrary domain-dependent constraints may be added to the constraint network and the constraint reasoning mechanism must be able to handle such constraints effectively. Additionally, real problems often require handling constraints over continuous variables. These requirements present a number of significant challenges for a constraint reasoning mechanism. In this paper, we introduce a general framework for handling dynamic constraint networks with real-valued variables, by using procedures to represent and effectively reason about general constraints. The framework is based on a sound theoretical foundation, and can be proven to be sound and complete under well-defined conditions. Furthermore, the framework provides hybrid reasoning capabilities, as alternative solution methods like mathematical programming can be incorporated into the framework, in the form of procedures.
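A toy sketch of the core idea, that constraints over real-valued variables can be represented as procedures, is given below. It is invented for illustration and far simpler than the paper's framework: each procedure narrows interval domains, and propagation runs the procedures to a fixpoint:

```python
# Domains are (lo, hi) intervals over real-valued variables.
# Each constraint is a procedure that narrows the domains it touches.

def sum_eq(x, y, z):
    """Procedure enforcing x + y = z by interval narrowing."""
    def proc(d):
        (xl, xh), (yl, yh), (zl, zh) = d[x], d[y], d[z]
        d[z] = (max(zl, xl + yl), min(zh, xh + yh))
        d[x] = (max(xl, d[z][0] - yh), min(xh, d[z][1] - yl))
        d[y] = (max(yl, d[z][0] - xh), min(yh, d[z][1] - xl))
    return proc

def propagate(domains, procs):
    """Run procedures to a fixpoint; an empty interval signals inconsistency."""
    changed = True
    while changed:
        before = dict(domains)
        for p in procs:
            p(domains)
        changed = domains != before
    return all(lo <= hi for lo, hi in domains.values())

d = {"a": (0.0, 10.0), "b": (2.0, 3.0), "c": (4.0, 5.0)}
ok = propagate(d, [sum_eq("a", "b", "c")])
print(ok, d["a"])  # "a" is narrowed to the values consistent with a + b = c
```

Because constraints are opaque procedures, new domain-dependent constraints can be added without changing the propagation loop, which is the flexibility the framework is after.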
Cimolin, Veronica; Beretta, Elena; Piccinini, Luigi; Turconi, Anna Carla; Locatelli, Federica; Galli, Manuela; Strazzer, Sandra
2012-01-01
The aims of this study are to quantify the movement limitation of upper limbs in hemiplegic children with traumatic brain injury (TBI) by using a clinical-functional scale and upper limb kinematics and to evaluate the effectiveness of constraint-induced movement therapy (CIMT) on upper limbs. Pre-post study. Clinical rehabilitation research laboratory. Ten children with TBI. The participants were evaluated by clinical examinations (Gross Motor Function Measure, Besta scale, Quality of Upper Extremities Skills Test, and Manual Ability Classification System) and 3D kinematic movement analysis of the upper limb before the CIMT program (pretest: 0.7 years after the injury) and at the end of the program (posttest: 10 weeks later). After the CIMT, most of the clinical measures improved significantly. Some significant improvements were present in terms of kinematics, in particular in the movement duration and the velocity of movement execution of both tasks; the index of curvature and the average jerk improved, respectively, during the reaching and hand-to-mouth tasks, while the adjusting sway parameter decreased during the two movements. Significant improvements were found in upper limb joint excursion after the rehabilitative program too. Our results suggest that the CIMT program can improve movement efficiency and upper limb function in children after TBI. The integration of the clinical outcomes and upper limb kinematics proved to be crucial in detecting the effects of the CIMT program.
NASA Technical Reports Server (NTRS)
Fadel, G. M.
1991-01-01
The two-point exponential approximation method was introduced by Fadel et al. (Fadel, 1990) and tested on structural optimization problems with stress and displacement constraints. Results reported in earlier papers were promising, and the method, which consists of correcting Taylor series approximations using previous design history, is tested in this paper on optimization problems with frequency constraints. The aim of the research is to verify the robustness and speed of convergence of the two-point exponential approximation method when highly non-linear constraints are used.
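In one common form of the two-point exponential approximation, the exponent of the intervening variable is chosen so that the gradient observed at the previous design point is reproduced. A 1-D sketch follows (illustrative only; the example function is invented, and reciprocal-type behavior is typical of stress versus sizing variables):

```python
import math

def two_point_exponent(x0, g0, x1, g1):
    """Exponent p chosen so the approximation built at x1 also
    matches the gradient g0 observed at the previous point x0."""
    return 1.0 + math.log(g0 / g1) / math.log(x0 / x1)

def exp_approx(x, x1, f1, g1, p):
    """Exponential (intervening-variable) approximation about x1."""
    return f1 + g1 * x1 ** (1.0 - p) * (x ** p - x1 ** p) / p

f = lambda x: 1.0 / x          # example behavior function
g = lambda x: -1.0 / x ** 2    # its exact gradient

x0, x1 = 1.0, 2.0              # previous and current design points
p = two_point_exponent(x0, g(x0), x1, g(x1))
print(p)                                             # p = -1: reciprocal form
print(exp_approx(3.0, x1, f(x1), g(x1), p), f(3.0))  # exact for 1/x
```

Because the exponent adapts to the design history, the approximation recovers the reciprocal form exactly here; for general constraints it interpolates between linear (p = 1) and reciprocal (p = -1) behavior.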
ERIC Educational Resources Information Center
Taylor, Conrad F.; Houghton, George
2005-01-01
G. S. Dell, K. D. Reed, D. R. Adams, and A. S. Meyer (2000) proposed a "breadth-of-constraint" continuum on phoneme errors, using artificial experiment-wide constraints to investigate a putative middle ground between local and language-wide constraints. The authors report 5 experiments that test the idea of the continuum and the location of the…
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, J.; Bhat, R. B.
1979-01-01
A finite element program is linked with a general purpose optimization program in a 'programing system' which includes user supplied codes that contain problem dependent formulations of the design variables, objective function and constraints. The result is a system adaptable to a wide spectrum of structural optimization problems. In a sample of numerical examples, the design variables are the cross-sectional dimensions and the parameters of overall shape geometry, constraints are applied to stresses, displacements, buckling and vibration characteristics, and structural mass is the objective function. Thin-walled, built-up structures and frameworks are included in the sample. Details of the system organization and characteristics of the component programs are given.
JWST Operations and the Phase I and II Process
NASA Astrophysics Data System (ADS)
Beck, Tracy L.
2010-07-01
The JWST operations and Phase I and Phase II process will build upon our knowledge of the current system in use for HST. The primary observing overheads associated with JWST observations, both direct and indirect, are summarized. While some key operations constraints for JWST may cause deviations from the HST model for proposal planning, the overall interface to JWST planning will use the APT and will appear similar to the HST interface. The requirement is to have a proposal planning model similar to HST, where proposals submitted to the TAC must have at least the minimum amount of information necessary for assessment of the strength of the science. However, a goal of the JWST planning process is to have the submitted Phase I proposal in executable form, and as complete as possible for many programs. JWST will have significant constraints on the spacecraft pointing and orientation, so it is beneficial for the planning process to have these scheduling constraints on programs defined as early as possible. The guide field of JWST is also much smaller than the HST guide field, so searches for available guide stars for JWST science programs must be done at the Phase I deadline. The long-range observing plan for each JWST cycle will be generated initially from the TAC-accepted programs at the Phase I deadline, and the LRP will be refined after the Phase II deadline when all scheduling constraints are defined.
Space station payload operations scheduling with ESP2
NASA Technical Reports Server (NTRS)
Stacy, Kenneth L.; Jaap, John P.
1988-01-01
The Mission Analysis Division of the Systems Analysis and Integration Laboratory at the Marshall Space Flight Center is developing a system of programs to handle all aspects of scheduling payload operations for Space Station. The Expert Scheduling Program (ESP2) is the heart of this system. The task of payload operations scheduling can be simply stated as positioning the payload activities in a mission so that they collect their desired data without interfering with other activities or violating mission constraints. ESP2 is an advanced version of the Experiment Scheduling Program (ESP), which was developed by the Mission Integration Branch beginning in 1979 to schedule Spacelab payload activities. The automatic scheduler in ESP2 is an expert system that embodies the rules that expert planners would use to schedule payload operations by hand. This scheduler uses depth-first searching, backtracking, and forward chaining techniques to place an activity so that constraints (such as crew, resources, and orbit opportunities) are not violated. It has an explanation facility to show why an activity was or was not scheduled at a certain time. The ESP2 user can also place the activities in the schedule manually. The program offers graphical assistance to the user and will advise when constraints are being violated. ESP2 also has an option to identify conflicts introduced into an existing schedule by changes to payload requirements, mission constraints, and orbit opportunities.
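A greatly simplified sketch of depth-first placement with backtracking is shown below. The activities and the single crew-count constraint are invented, and this is nothing like ESP2's real rule base; it only illustrates the search pattern the abstract describes:

```python
HORIZON = 8          # time slots in a toy "mission"
CREW_LIMIT = 2       # crew members available in any slot

activities = [       # (name, duration, crew required)
    ("survey", 3, 2),
    ("sample", 2, 1),
    ("downlink", 2, 1),
]

def fits(schedule, start, duration, crew):
    """Check the crew constraint over the candidate activity's span."""
    usage = [0] * HORIZON
    for (_, s, d, c) in schedule:
        for t in range(s, s + d):
            usage[t] += c
    return all(usage[t] + crew <= CREW_LIMIT
               for t in range(start, start + duration))

def place(schedule, remaining):
    """Depth-first search: try start times in order, backtrack on failure."""
    if not remaining:
        return schedule
    name, dur, crew = remaining[0]
    for start in range(HORIZON - dur + 1):
        if fits(schedule, start, dur, crew):
            result = place(schedule + [(name, start, dur, crew)], remaining[1:])
            if result is not None:
                return result        # success down this branch
            # otherwise backtrack and try the next start time
    return None

plan = place([], activities)
print(plan)
```

Real schedulers like ESP2 layer many more constraint types (resources, orbit opportunities) onto this same search skeleton and add forward chaining and explanation facilities.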
Strategies for Validation Testing of Ground Systems
NASA Technical Reports Server (NTRS)
Annis, Tammy; Sowards, Stephanie
2009-01-01
In order to accomplish the full Vision for Space Exploration announced by former President George W. Bush in 2004, NASA will have to develop a new space transportation system and supporting infrastructure. The main portion of this supporting infrastructure will reside at the Kennedy Space Center (KSC) in Florida and will either be newly developed or a modification of existing vehicle processing and launch facilities, including Ground Support Equipment (GSE). This type of large-scale launch site development is unprecedented since the time of the Apollo Program. In order to accomplish this successfully within the limited budget and schedule constraints a combination of traditional and innovative strategies for Verification and Validation (V&V) have been developed. The core of these strategies consists of a building-block approach to V&V, starting with component V&V and ending with a comprehensive end-to-end validation test of the complete launch site, called a Ground Element Integration Test (GEIT). This paper will outline these strategies and provide the high level planning for meeting the challenges of implementing V&V on a large-scale development program. KEY WORDS: Systems, Elements, Subsystem, Integration Test, Ground Systems, Ground Support Equipment, Component, End Item, Test and Verification Requirements (TVR), Verification Requirements (VR)
Development and Testing of Control Laws for the Active Aeroelastic Wing Program
NASA Technical Reports Server (NTRS)
Dibley, Ryan P.; Allen, Michael J.; Clarke, Robert; Gera, Joseph; Hodgkinson, John
2005-01-01
The Active Aeroelastic Wing research program was a joint program between the U.S. Air Force Research Laboratory and NASA established to investigate the characteristics of an aeroelastic wing and the technique of using wing twist for roll control. The flight test program employed the use of an F/A-18 aircraft modified by reducing the wing torsional stiffness and adding a custom research flight control system. The research flight control system was optimized to maximize roll rate using only wing surfaces to twist the wing while simultaneously maintaining design load limits, stability margins, and handling qualities. NASA Dryden Flight Research Center developed control laws using the software design tool called CONDUIT, which employs a multi-objective function optimization to tune selected control system design parameters. Modifications were made to the Active Aeroelastic Wing implementation in this new software design tool to incorporate the NASA Dryden Flight Research Center nonlinear F/A-18 simulation for time history analysis. This paper describes the design process, including how the control law requirements were incorporated into constraints for the optimization of this specific software design tool. Predicted performance is also compared to results from flight.
Continuous Improvement in Battery Testing at the NASA/JSC Energy System Test Area
NASA Technical Reports Server (NTRS)
Boyd, William; Cook, Joseph
2003-01-01
The Energy Systems Test Area (ESTA) at the Lyndon B. Johnson Space Center in Houston, Texas conducts development and qualification tests to fulfill Energy Systems Division responsibilities relevant to NASA programs and projects. ESTA has historically called upon a variety of fluid, mechanical, electrical, environmental, and data system capabilities spread amongst five full-service facilities to test human and human-supported spacecraft in the areas of propulsion systems, fluid systems, pyrotechnics, power generation, and power distribution and control systems. Improvements at ESTA are being made in earnest to offer NASA project offices an option to choose a thorough test regime that is balanced with cost and schedule constraints. In order to continue testing of enabling power-related technologies utilized by the Energy Systems Division, an especially proactive effort has been made to increase the cost effectiveness and schedule responsiveness of battery testing. This paper describes the continuous improvement in battery testing at the Energy Systems Test Area being made through consolidation, streamlining, and standardization.
Partitioning problems in parallel, pipelined and distributed computing
NASA Technical Reports Server (NTRS)
Bokhari, S.
1985-01-01
The problem of optimally assigning the modules of a parallel program over the processors of a multiple computer system is addressed. A Sum-Bottleneck path algorithm is developed that permits the efficient solution of many variants of this problem under some constraints on the structure of the partitions. In particular, the following problems are solved optimally for a single-host, multiple satellite system: partitioning multiple chain structured parallel programs, multiple arbitrarily structured serial programs and single tree structured parallel programs. In addition, the problems of partitioning chain structured parallel programs across chain connected systems and across shared memory (or shared bus) systems are also solved under certain constraints. All solutions for parallel programs are equally applicable to pipelined programs. These results extend prior research in this area by explicitly taking concurrency into account and permit the efficient utilization of multiple computer architectures for a wide range of problems of practical interest.
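For the single-chain case, the bottleneck objective can be illustrated with a small dynamic program. This is an O(kn²) sketch with invented module weights, not Bokhari's Sum-Bottleneck path algorithm, but it computes the same quantity for one chain:

```python
def min_bottleneck(weights, k):
    """Smallest possible maximum block load when a chain of module
    weights is cut into k contiguous blocks (one block per processor)."""
    n = len(weights)
    INF = float("inf")
    prefix = [0.0]
    for w in weights:
        prefix.append(prefix[-1] + w)
    # dp[j][i]: best bottleneck for the first i modules on j processors
    dp = [[INF] * (n + 1) for _ in range(k + 1)]
    dp[0][0] = 0.0
    for j in range(1, k + 1):
        for i in range(1, n + 1):
            for split in range(i):        # last block = modules split..i-1
                load = prefix[i] - prefix[split]
                dp[j][i] = min(dp[j][i], max(dp[j - 1][split], load))
    return dp[k][n]

# A chain-structured program with six modules on three processors.
print(min_bottleneck([4, 2, 7, 1, 3, 5], 3))
```

Minimizing the bottleneck (heaviest processor load) rather than the sum is what makes the pipelined interpretation work: throughput is limited by the slowest stage.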
Fuzzy robust credibility-constrained programming for environmental management and planning.
Zhang, Yimei; Huang, Guohe
2010-06-01
In this study, a fuzzy robust credibility-constrained programming (FRCCP) method is developed and applied to the planning of waste management systems. It incorporates the concepts of credibility-based chance-constrained programming and robust programming within an optimization framework. The developed method can reflect uncertainties presented as possibility distributions through fuzzy membership functions. Fuzzy credibility constraints are transformed to crisp equivalents at different credibility levels, and ordinary fuzzy inclusion constraints are replaced by their robust deterministic equivalents by setting α-cut levels. The FRCCP method can provide different system costs under different credibility levels (lambda). Sensitivity analyses show that the operating cost of the landfill is a critical parameter; any factor that could induce cost fluctuations during landfill operation deserves close observation and analysis. With FRCCP, useful solutions can be obtained to provide decision-making support for long-term planning of solid waste management systems. It could be further enhanced by incorporating methods of inexact analysis into its framework. It can also be applied to other environmental management problems.
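In credibility-based chance-constrained programming, a fuzzy constraint of the form Cr{ξ ≤ x} ≥ λ is replaced by a crisp equivalent. As a hedged sketch (following standard credibility theory, not the paper's specific formulation), the credibility measure for a triangular fuzzy variable ξ = (a, b, c) can be computed as the average of possibility and necessity:

```python
def credibility_leq(a, b, c, x):
    """Cr{xi <= x} for a triangular fuzzy variable xi = (a, b, c),
    defined as the average of possibility and necessity."""
    # possibility that xi <= x
    if x < a:
        pos = 0.0
    elif x < b:
        pos = (x - a) / (b - a)
    else:
        pos = 1.0
    # necessity that xi <= x  (1 minus the possibility that xi > x)
    if x < b:
        nec = 0.0
    elif x < c:
        nec = (x - b) / (c - b)
    else:
        nec = 1.0
    return 0.5 * (pos + nec)
```

A constraint Cr{ξ ≤ x} ≥ λ then becomes a crisp inequality in x obtained by inverting this piecewise-linear function at level λ.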
An innovative approach to compensator design
NASA Technical Reports Server (NTRS)
Mitchell, J. R.; Mcdaniel, W. L., Jr.
1973-01-01
The computer-aided design of a compensator for a control system is considered from a frequency domain point of view. The design technique developed is based on describing the open loop frequency response by n discrete frequency points which result in n functions of the compensator coefficients. Several of these functions are chosen so that the system specifications are properly portrayed; then mathematical programming is used to improve all of these functions which have values below minimum standards. To do this, several definitions in regard to measuring the performance of a system in the frequency domain are given, e.g., relative stability, relative attenuation, proper phasing, etc. Next, theorems which govern the number of compensator coefficients necessary to make improvements in a certain number of functions are proved. After this a mathematical programming tool for aiding in the solution of the problem is developed. This tool is called the constraint improvement algorithm. Then, for applying the constraint improvement algorithm, generalized gradients for the constraints are derived. Finally, the necessary theory is incorporated in a computer program called CIP (Compensator Improvement Program). The practical usefulness of CIP is demonstrated by two large system examples.
Watson-Jones, Deborah; Lees, Shelley; Mwanga, Joseph; Neke, Nyasule; Changalucha, John; Broutet, Nathalie; Maduhu, Ibrahim; Kapiga, Saidi; Chandra-Mouli, Venkatraman; Bloem, Paul; Ross, David A
2016-01-01
Background: Human papillomavirus (HPV) vaccination offers an opportunity to strengthen provision of adolescent health interventions (AHI). We explored the feasibility of integrating other AHI with HPV vaccination in Tanzania. Methods: A desk review of 39 policy documents was preceded by a stakeholder meeting with 38 policy makers and partners. Eighteen key informant interviews (KIIs) with health and education policy makers and district officials were conducted to further explore perceptions of current programs, priorities and AHI that might be suitable for integration with HPV vaccination. Results: Fourteen school health interventions (SHI) or AHI are currently being implemented by the Government of Tanzania. Most are delivered as vertical programmes. Coverage of current programs is not universal, and is limited by financial, human resource and logistic constraints. Limited community engagement, rumours, and lack of strategic advocacy have affected uptake of some interventions, e.g. tetanus toxoid (TT) immunization. Stakeholder and KI perceptions and opinions were limited by a lack of experience with integrated delivery and AHI that were outside an individual’s area of expertise and experience. Deworming and educational sessions including reproductive health education were the most frequently mentioned interventions that respondents considered suitable for integrated delivery with HPV vaccine. Conclusions: Given programme constraints, limited experience with integrated delivery and concern about real or perceived side-effects being attributed to the vaccine, it will be very important to pilot-test integration of AHI/SHI with HPV vaccination. Selected interventions will need to be simple and quick to deliver since health workers are likely to face significant logistic and time constraints during vaccination visits. PMID:26768827
A System for Automatically Generating Scheduling Heuristics
NASA Technical Reports Server (NTRS)
Morris, Robert
1996-01-01
The goal of this research is to improve the performance of automated schedulers by designing and implementing an algorithm that automatically generates heuristics for selecting a schedule. The particular application selected for applying this method is the problem of scheduling telescope observations, addressed by a system called the Associate Principal Astronomer (APA). The input to the APA scheduler is a set of observation requests submitted by one or more astronomers. Each observation request specifies an observation program as well as scheduling constraints and preferences associated with the program. The scheduler employs greedy heuristic search to synthesize a schedule that satisfies all hard constraints of the domain and achieves a good score with respect to soft constraints, expressed as an objective function established by an astronomer-user.
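The greedy synthesis loop described above (enforce hard constraints, score soft preferences) can be sketched as follows; the function and parameter names are hypothetical, not taken from the APA system.

```python
def greedy_schedule(requests, slots, feasible, score):
    """Greedy scheduler sketch: at each step, commit the request/slot
    pair that passes all hard constraints (`feasible`) and maximizes
    the soft-constraint objective (`score`).  Illustrative only."""
    schedule = {}
    pending = list(requests)
    free = set(slots)
    while pending:
        best = None
        for r in pending:
            for s in free:
                if feasible(r, s, schedule):  # hard constraints
                    v = score(r, s, schedule)  # soft-constraint objective
                    if best is None or v > best[0]:
                        best = (v, r, s)
        if best is None:
            break  # no feasible assignment remains
        _, r, s = best
        schedule[r] = s
        pending.remove(r)
        free.discard(s)
    return schedule
```

Because each commitment is final, the result is a locally good schedule, not a globally optimal one; the quality depends on the heuristics embodied in `score`.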
Optimum structural design with plate bending elements - A survey
NASA Technical Reports Server (NTRS)
Haftka, R. T.; Prasad, B.
1981-01-01
A survey is presented of recently published papers in the field of optimum structural design of plates, largely with respect to the minimum-weight design of plates subject to such constraints as fundamental frequency maximization. It is shown that, due to the availability of powerful computers, the trend in optimum plate design is away from methods tailored to specific geometry and loads and toward methods that can be easily programmed for any kind of plate, such as finite element methods. A corresponding shift is seen in optimization from variational techniques to numerical optimization algorithms. Among the topics covered are fully stressed design and optimality criteria, mathematical programming, smooth and ribbed designs, design against plastic collapse, buckling constraints, and vibration constraints.
A Structural Evaluation of a Large-Scale Quasi-Experimental Microfinance Initiative
Kaboski, Joseph P.; Townsend, Robert M.
2010-01-01
This paper uses a structural model to understand, predict, and evaluate the impact of an exogenous microcredit intervention program, the Thai Million Baht Village Fund program. We model household decisions in the face of borrowing constraints, income uncertainty, and high-yield indivisible investment opportunities. After estimation of parameters using pre-program data, we evaluate the model’s ability to predict and interpret the impact of the village fund intervention. Simulations from the model mirror the data in yielding a greater increase in consumption than credit, which is interpreted as evidence of credit constraints. A cost-benefit analysis using the model indicates that some households value the program much more than its per household cost, but overall the program costs 20 percent more than the sum of these benefits. PMID:22162594
A Structural Evaluation of a Large-Scale Quasi-Experimental Microfinance Initiative.
Kaboski, Joseph P; Townsend, Robert M
2011-09-01
This paper uses a structural model to understand, predict, and evaluate the impact of an exogenous microcredit intervention program, the Thai Million Baht Village Fund program. We model household decisions in the face of borrowing constraints, income uncertainty, and high-yield indivisible investment opportunities. After estimation of parameters using pre-program data, we evaluate the model's ability to predict and interpret the impact of the village fund intervention. Simulations from the model mirror the data in yielding a greater increase in consumption than credit, which is interpreted as evidence of credit constraints. A cost-benefit analysis using the model indicates that some households value the program much more than its per household cost, but overall the program costs 20 percent more than the sum of these benefits.
FloPSy - Search-Based Floating Point Constraint Solving for Symbolic Execution
NASA Astrophysics Data System (ADS)
Lakhotia, Kiran; Tillmann, Nikolai; Harman, Mark; de Halleux, Jonathan
Recently there has been an upsurge of interest in both Search-Based Software Testing (SBST) and Dynamic Symbolic Execution (DSE). Each of these two approaches has complementary strengths and weaknesses, making it a natural choice to explore the degree to which the strengths of one can be exploited to offset the weaknesses of the other. This paper introduces an augmented version of DSE that uses an SBST-based approach to handle floating point computations, which are known to be problematic for vanilla DSE. The approach has been implemented as a plug-in for the Microsoft Pex DSE testing tool. The paper presents results from both standard evaluation benchmarks and two open source programs.
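The SBST side of this combination can be sketched as a local search that minimizes a branch-distance fitness function until a floating point branch predicate is satisfied. This is a generic illustration of the idea, not FloPSy's or Pex's actual algorithm, and the target constraint below is hypothetical.

```python
def solve_fp_constraint(predicate, distance, x0, step=1.0, iters=1000):
    """Pattern search for an input satisfying a floating point branch
    predicate, guided by a branch-distance fitness (smaller = closer
    to taking the branch).  Halves the step when no neighbor improves."""
    x, best = x0, distance(x0)
    for _ in range(iters):
        if predicate(x):
            return x
        improved = False
        for nx in (x - step, x + step):
            d = distance(nx)
            if d < best:
                x, best, improved = nx, d, True
        if not improved:
            step /= 2.0
    return None

# Hypothetical target branch: find x with x*x close to 2
x = solve_fp_constraint(
    predicate=lambda v: abs(v * v - 2.0) < 1e-6,
    distance=lambda v: abs(v * v - 2.0),
    x0=0.0)
```

Constraint solvers over rationals struggle with such branches because of rounding; the search works directly in floating point arithmetic, so any input it returns really does drive execution down the target branch.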
Design optimization studies using COSMIC NASTRAN
NASA Technical Reports Server (NTRS)
Pitrof, Stephen M.; Bharatram, G.; Venkayya, Vipperla B.
1993-01-01
The purpose of this study is to create, test and document a procedure to integrate mathematical optimization algorithms with COSMIC NASTRAN. This procedure is very important to structural design engineers who wish to capitalize on optimization methods to ensure that their design is optimized for its intended application. The OPTNAST computer program was created to link NASTRAN and design optimization codes into one package. This implementation was tested using two truss structure models and optimizing their designs for minimum weight, subject to multiple loading conditions and displacement and stress constraints. However, the process is generalized so that an engineer could design other types of elements by adding to or modifying some parts of the code.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berglin, E.J.
1996-09-17
Westinghouse Hanford Company (WHC) is exploring commercial methods for retrieving waste from the underground storage tanks at the Hanford site in south central Washington state. WHC needs data on commercial retrieval systems equipment in order to make programmatic decisions for waste retrieval. Full system testing of retrieval processes is to be demonstrated in phases through September 1997 in support of programs aimed to Acquire Commercial Technology for Retrieval (ACTR) and at the Hanford Tanks Initiative (HTI). One of the important parts of the integrated testing will be the deployment of retrieval tools using manipulator-based systems. WHC requires an assessment of a number of commercial deployment systems that have been identified by the ACTR program as good candidates to be included in an integrated testing effort. Included in this assessment should be an independent evaluation of manipulator tests performed to date, so that WHC can construct an integrated test based on these systems. The objectives of this document are to provide a description of the need, requirements, and constraints for a manipulator-based retrieval system; to evaluate manipulator-based concepts and testing performed to date by a number of commercial organizations; and to identify issues to be resolved through testing and/or analysis for each concept.
NASA Astrophysics Data System (ADS)
Stoitsov, M. V.; Schunck, N.; Kortelainen, M.; Michel, N.; Nam, H.; Olsen, E.; Sarich, J.; Wild, S.
2013-06-01
We describe the new version 2.00d of the code HFBTHO that solves the nuclear Skyrme-Hartree-Fock (HF) or Skyrme-Hartree-Fock-Bogoliubov (HFB) problem by using the cylindrical transformed deformed harmonic oscillator basis. In the new version, we have implemented the following features: (i) the modified Broyden method for non-linear problems, (ii) optional breaking of reflection symmetry, (iii) calculation of axial multipole moments, (iv) finite temperature formalism for the HFB method, (v) linear constraint method based on the approximation of the Random Phase Approximation (RPA) matrix for multi-constraint calculations, (vi) blocking of quasi-particles in the Equal Filling Approximation (EFA), (vii) framework for generalized energy density with arbitrary density-dependences, and (viii) shared memory parallelism via OpenMP pragmas. Program summary: Program title: HFBTHO v2.00d Catalog identifier: ADUI_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADUI_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License version 3 No. of lines in distributed program, including test data, etc.: 167228 No. of bytes in distributed program, including test data, etc.: 2672156 Distribution format: tar.gz Programming language: FORTRAN-95. Computer: Intel Pentium-III, Intel Xeon, AMD-Athlon, AMD-Opteron, Cray XT5, Cray XE6. Operating system: UNIX, LINUX, WindowsXP. RAM: 200 Mwords Word size: 8 bits Classification: 17.22. Does the new version supersede the previous version?: Yes Catalog identifier of previous version: ADUI_v1_0 Journal reference of previous version: Comput. Phys. Comm. 167 (2005) 43 Nature of problem: The solution of self-consistent mean-field equations for weakly-bound paired nuclei requires a correct description of the asymptotic properties of nuclear quasi-particle wave functions. 
In the present implementation, this is achieved by using the single-particle wave functions of the transformed harmonic oscillator, which allows for an accurate description of deformation effects and pairing correlations in nuclei arbitrarily close to the particle drip lines. Solution method: The program uses the axial Transformed Harmonic Oscillator (THO) single- particle basis to expand quasi-particle wave functions. It iteratively diagonalizes the Hartree-Fock-Bogoliubov Hamiltonian based on generalized Skyrme-like energy densities and zero-range pairing interactions until a self-consistent solution is found. A previous version of the program was presented in: M.V. Stoitsov, J. Dobaczewski, W. Nazarewicz, P. Ring, Comput. Phys. Commun. 167 (2005) 43-63. Reasons for new version: Version 2.00d of HFBTHO provides a number of new options such as the optional breaking of reflection symmetry, the calculation of axial multipole moments, the finite temperature formalism for the HFB method, optimized multi-constraint calculations, the treatment of odd-even and odd-odd nuclei in the blocking approximation, and the framework for generalized energy density with arbitrary density-dependences. It is also the first version of HFBTHO to contain threading capabilities. Summary of revisions: The modified Broyden method has been implemented, Optional breaking of reflection symmetry has been implemented, The calculation of all axial multipole moments up to λ=8 has been implemented, The finite temperature formalism for the HFB method has been implemented, The linear constraint method based on the approximation of the Random Phase Approximation (RPA) matrix for multi-constraint calculations has been implemented, The blocking of quasi-particles in the Equal Filling Approximation (EFA) has been implemented, The framework for generalized energy density functionals with arbitrary density-dependence has been implemented, Shared memory parallelism via OpenMP pragmas has been implemented. 
Restrictions: Axial- and time-reversal symmetries are assumed. Unusual features: The user must have access to the LAPACK subroutines DSYEVD, DSYTRF and DSYTRI, and their dependences, which compute eigenvalues and eigenfunctions of real symmetric matrices, the LAPACK subroutines DGETRI and DGETRF, which invert arbitrary real matrices, and the BLAS routines DCOPY, DSCAL, DGEMM and DGEMV for double-precision linear algebra (or provide another set of subroutines that can perform such tasks). The BLAS and LAPACK subroutines can be obtained from the Netlib Repository at the University of Tennessee, Knoxville: http://netlib2.cs.utk.edu/. Running time: Highly variable, as it depends on the nucleus, size of the basis, requested accuracy, requested configuration, compiler and libraries, and hardware architecture. An order of magnitude would be a few seconds for ground-state configurations in small bases N≈8-12, to a few minutes in very deformed configuration of a heavy nucleus with a large basis N>20.
Guidelines for development of structured FORTRAN programs
NASA Technical Reports Server (NTRS)
Earnest, B. M.
1984-01-01
Computer programming and coding standards were compiled to serve as guidelines for the uniform writing of FORTRAN 77 programs at NASA Langley. Software development philosophy, documentation, general coding conventions, and specific FORTRAN coding constraints are discussed.
A survey of methods of feasible directions for the solution of optimal control problems
NASA Technical Reports Server (NTRS)
Polak, E.
1972-01-01
Three methods of feasible directions for optimal control are reviewed. These methods are an extension of the Frank-Wolfe method, a dual method devised by Pironneau and Polak, and a Zoutendijk method. The categories of continuous optimal control problems are shown as: (1) fixed time problems with fixed initial state, free terminal state, and simple constraints on the control; (2) fixed time problems with inequality constraints on both the initial and the terminal state and no control constraints; (3) free time problems with inequality constraints on the initial and terminal states and simple constraints on the control; and (4) fixed time problems with inequality state space constraints and constraints on the control. The nonlinear programming algorithms are derived for each of the methods in its associated category.
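Of the three methods, the Frank-Wolfe idea is the simplest to sketch: each iteration linearizes the objective at the current feasible point, solves the resulting linear program over the feasible set, and steps toward that solution, so every iterate stays feasible. On a box constraint set the linear subproblem has a coordinate-wise closed form. This is a generic finite-dimensional sketch under assumed names, not the optimal-control extension reviewed in the paper.

```python
def frank_wolfe_box(grad, lo, hi, x0, iters=100):
    """Frank-Wolfe (conditional gradient) method on a box {lo <= x <= hi}.
    The linear subproblem min <g, s> over the box is solved by picking,
    in each coordinate, the bound opposite the gradient sign."""
    x = list(x0)
    for k in range(iters):
        g = grad(x)
        s = [lo[i] if g[i] > 0 else hi[i] for i in range(len(x))]
        gamma = 2.0 / (k + 2.0)  # standard diminishing step size
        x = [(1 - gamma) * x[i] + gamma * s[i] for i in range(len(x))]
    return x
```

Because iterates are convex combinations of feasible points, the method is a "feasible directions" scheme: no projection step is ever needed.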
A Kind of Nonlinear Programming Problem Based on Mixed Fuzzy Relation Equations Constraints
NASA Astrophysics Data System (ADS)
Li, Jinquan; Feng, Shuang; Mi, Honghai
In this work, a kind of nonlinear programming problem with a non-differentiable objective function, under constraints expressed by a system of mixed fuzzy relation equations, is investigated. First, some properties of this kind of optimization problem are obtained. Then, a polynomial-time algorithm for this kind of optimization problem is proposed based on these properties. Furthermore, we show that this algorithm is optimal for the considered optimization problem in this paper. Finally, numerical examples are provided to illustrate our algorithm.
Waveform stimulus subsystem: An advanced technology multifunction subsystem on a card
NASA Astrophysics Data System (ADS)
Pritchard, David J.
The F-15 TISS ATE (automatic test equipment) requires subsystem-on-a-card technology to achieve the required functionality within the space constraints. The waveform stimulus subsystem (WSS), an example of this advanced technology, is considered. The WSS circuit card consists of two 40-MHz pulse generators and an 80-MHz arbitrary waveform generator. Each generator is independently programmed and is available simultaneously to the user. The implementation of this highly integrated multifunction subsystem on a card is described, and the benefits to performance and maintainability are highlighted.
Development of advanced avionics systems applicable to terminal-configured vehicles
NASA Technical Reports Server (NTRS)
Heimbold, R. L.; Lee, H. P.; Leffler, M. F.
1980-01-01
A technique to add the time constraint to the automatic descent feature of the existing L-1011 aircraft Flight Management System (FMS) was developed. Software modifications were incorporated in the FMS computer program and the results checked by lab simulation and on a series of eleven test flights. An arrival time dispersion (2 sigma) of 19 seconds was achieved. The 4-D descent technique can be integrated with the time-based metering method of air traffic control. Substantial reductions in delays at today's busy airports should result.
Constrained spacecraft reorientation using mixed integer convex programming
NASA Astrophysics Data System (ADS)
Tam, Margaret; Glenn Lightsey, E.
2016-10-01
A constrained attitude guidance (CAG) system is developed using convex optimization to autonomously achieve spacecraft pointing objectives while meeting the constraints imposed by on-board hardware. These constraints include bounds on the control input and slew rate, as well as pointing constraints imposed by the sensors. The pointing constraints consist of inclusion and exclusion cones that dictate permissible orientations of the spacecraft in order to keep objects in or out of the field of view of the sensors. The optimization scheme drives a body vector towards a target inertial vector along a trajectory that consists solely of permissible orientations in order to achieve the desired attitude for a given mission mode. The non-convex rotational kinematics are handled by discretization, which also ensures that the quaternion stays unity norm. In order to guarantee an admissible path, the pointing constraints are relaxed. Depending on how strict the pointing constraints are, the degree of relaxation is tuneable. The use of binary variables permits the inclusion of logical expressions in the pointing constraints in the case that a set of sensors has redundancies. The resulting mixed integer convex programming (MICP) formulation generates a steering law that can be easily integrated into an attitude determination and control (ADC) system. A sample simulation of the system is performed for the Bevo-2 satellite, including disturbance torques and actuator dynamics which are not modeled by the controller. Simulation results demonstrate the robustness of the system to disturbances while meeting the mission requirements with desirable performance characteristics.
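The inclusion and exclusion cone constraints described above reduce to simple dot-product tests against each cone's half-angle. The following sketch (names and interface assumed, not from the paper) checks whether a candidate orientation of a body vector is permissible:

```python
import math

def in_cone(v, axis, half_angle_deg):
    """True if unit vector v lies within the cone about unit `axis`
    with the given half-angle in degrees."""
    dot = sum(a * b for a, b in zip(v, axis))
    return dot >= math.cos(math.radians(half_angle_deg))

def orientation_permissible(body_vec, inclusion, exclusion):
    """Inclusion cones must all contain the vector; exclusion cones
    (e.g. keeping the Sun out of a star tracker's field of view)
    must not.  Each cone is an (axis, half_angle_deg) pair."""
    ok_in = all(in_cone(body_vec, ax, ang) for ax, ang in inclusion)
    ok_out = all(not in_cone(body_vec, ax, ang) for ax, ang in exclusion)
    return ok_in and ok_out
```

In the MICP formulation, such cone conditions become linear constraints on the discretized attitude trajectory, with binary variables encoding the logical combinations for redundant sensors.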
Hydropower, an energy source whose time has come again
NASA Astrophysics Data System (ADS)
1980-01-01
Recent price increases in imported oil demonstrate the urgency for the U.S. to rapidly develop its renewable resources. One such renewable resource for which technology is available now is hydropower. Studies indicate that hydropower potential, particularly at existing dam sites, can save the country hundreds of thousands of barrels of oil per day. But problems and constraints (economic, environmental, institutional, and operational) limit its full potential. Federal programs have had little impact on helping to bring hydro projects on line. Specifically, the Department of Energy's Small Hydro Program could do more to overcome hydro constraints and problems through an effective outreach program and more emphasis on demonstration projects.
NASA Technical Reports Server (NTRS)
Kolb, Mark A.
1990-01-01
Viewgraphs on Rubber Airplane: Constraint-based Component-Modeling for Knowledge Representation in Computer Aided Conceptual Design are presented. Topics covered include: computer aided design; object oriented programming; airfoil design; surveillance aircraft; commercial aircraft; aircraft design; and launch vehicles.
Continuous Optimization on Constraint Manifolds
NASA Technical Reports Server (NTRS)
Dean, Edwin B.
1988-01-01
This paper demonstrates continuous optimization on the differentiable manifold formed by continuous constraint functions. The first order tensor geodesic differential equation is solved on the manifold in both numerical and closed analytic form for simple nonlinear programs. Advantages and disadvantages with respect to conventional optimization techniques are discussed.
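A common simplification of optimization on a constraint manifold, shown here only as a hedged illustration (the paper itself integrates geodesic differential equations rather than projecting), is projected gradient descent: take a Euclidean gradient step, then map the iterate back onto the manifold. For the unit sphere the projection is just normalization:

```python
import math

def projected_gradient_sphere(grad, x0, lr=0.1, steps=200):
    """Minimize a function over the unit-sphere constraint manifold by
    taking a Euclidean gradient step, then projecting (normalizing)
    back onto the manifold.  A common simplification; the paper
    follows geodesics on the manifold instead of projecting."""
    x = list(x0)
    for _ in range(steps):
        g = grad(x)
        x = [xi - lr * gi for xi, gi in zip(x, g)]
        n = math.sqrt(sum(xi * xi for xi in x))
        x = [xi / n for xi in x]  # project back onto the sphere
    return x
```

For a linear objective f(x) = c·x, the minimizer on the sphere is -c/|c|, which the iteration approaches geometrically.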
A numerical differentiation library exploiting parallel architectures
NASA Astrophysics Data System (ADS)
Voglis, C.; Hadjidoukas, P. E.; Lagaris, I. E.; Papageorgiou, D. G.
2009-08-01
We present a software library for numerically estimating first and second order partial derivatives of a function by finite differencing. Various truncation schemes are offered resulting in corresponding formulas that are accurate to order O(h), O(h^2), and O(h^4), h being the differencing step. The derivatives are calculated via forward, backward and central differences. Care has been taken that only feasible points are used in the case where bound constraints are imposed on the variables. The Hessian may be approximated either from function or from gradient values. There are three versions of the software: a sequential version, an OpenMP version for shared memory architectures and an MPI version for distributed systems (clusters). The parallel versions exploit the multiprocessing capability offered by computer clusters, as well as modern multi-core systems and due to the independent character of the derivative computation, the speedup scales almost linearly with the number of available processors/cores. Program summary: Program title: NDL (Numerical Differentiation Library) Catalogue identifier: AEDG_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEDG_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 73 030 No. 
of bytes in distributed program, including test data, etc.: 630 876 Distribution format: tar.gz Programming language: ANSI FORTRAN-77, ANSI C, MPI, OPENMP Computer: Distributed systems (clusters), shared memory systems Operating system: Linux, Solaris Has the code been vectorised or parallelized?: Yes RAM: The library uses O(N) internal storage, N being the dimension of the problem Classification: 4.9, 4.14, 6.5 Nature of problem: The numerical estimation of derivatives at several accuracy levels is a common requirement in many computational tasks, such as optimization, solution of nonlinear systems, etc. The parallel implementation that exploits systems with multiple CPUs is very important for large scale and computationally expensive problems. Solution method: Finite differencing is used with carefully chosen step that minimizes the sum of the truncation and round-off errors. The parallel versions employ both OpenMP and MPI libraries. Restrictions: The library uses only double precision arithmetic. Unusual features: The software takes into account bound constraints, in the sense that only feasible points are used to evaluate the derivatives, and given the level of the desired accuracy, the proper formula is automatically employed. Running time: Running time depends on the function's complexity. The test run took 15 ms for the serial distribution, 0.6 s for the OpenMP and 4.2 s for the MPI parallel distribution on 2 processors.
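The library's stated feature of using only feasible points under bound constraints can be sketched for a first derivative: use the O(h^2) central difference when both neighbors are in bounds, and fall back to an O(h) one-sided difference at a bound. This is a simplified illustration with assumed names, not NDL's actual interface.

```python
def first_derivative(f, x, h=1e-6, lo=None, hi=None):
    """Estimate f'(x) by finite differences, using only feasible points.
    Central difference, O(h^2), when both x-h and x+h lie in [lo, hi];
    otherwise a one-sided forward/backward difference, O(h)."""
    can_minus = lo is None or x - h >= lo
    can_plus = hi is None or x + h <= hi
    if can_minus and can_plus:
        return (f(x + h) - f(x - h)) / (2 * h)   # central, O(h^2)
    if can_plus:
        return (f(x + h) - f(x)) / h             # forward, O(h)
    return (f(x) - f(x - h)) / h                 # backward, O(h)
```

Since each function evaluation is independent, the calls to f can be distributed across threads or MPI ranks, which is why the library's speedup scales almost linearly with the processor count.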
Feed Forward Neural Network and Optimal Control Problem with Control and State Constraints
NASA Astrophysics Data System (ADS)
Kmet', Tibor; Kmet'ová, Mária
2009-09-01
A feed forward neural network based optimal control synthesis is presented for solving optimal control problems with control and state constraints. The paper extends adaptive critic neural network architecture proposed by [5] to the optimal control problems with control and state constraints. The optimal control problem is transcribed into a nonlinear programming problem which is implemented with adaptive critic neural network. The proposed simulation method is illustrated by the optimal control problem of nitrogen transformation cycle model. Results show that adaptive critic based systematic approach holds promise for obtaining the optimal control with control and state constraints.
Constraint-induced aphasia therapy versus intensive semantic treatment in fluent aphasia.
Wilssens, Ineke; Vandenborre, Dorien; van Dun, Kim; Verhoeven, Jo; Visch-Brink, Evy; Mariën, Peter
2015-05-01
The authors compared the effectiveness of 2 intensive therapy methods: Constraint-Induced Aphasia Therapy (CIAT; Pulvermüller et al., 2001) and semantic therapy (BOX; Visch-Brink & Bajema, 2001). Nine patients with chronic fluent aphasia participated in a therapy program to establish behavioral treatment outcomes. Participants were randomly assigned to one of two groups (CIAT or BOX). Intensive therapy significantly improved verbal communication. However, BOX treatment showed a more pronounced improvement on two communication measures, namely a standardized assessment for verbal communication, the Amsterdam Nijmegen Everyday Language Test (Blomert, Koster, & Kean, 1995), and a subjective rating scale, the Communicative Effectiveness Index (Lomas et al., 1989). All participants significantly improved on one (or more) subtests of the Aachen Aphasia Test (Graetz, de Bleser, & Willmes, 1992), an impairment-focused assessment. There was a treatment-specific effect. BOX treatment had a significant effect on language comprehension and semantics, whereas CIAT treatment affected language production and phonology. The findings indicate that in patients with fluent aphasia, (a) intensive treatment has a significant effect on language and verbal communication, (b) intensive therapy results in selective treatment effects, and (c) an intensive semantic treatment shows a more striking mean improvement on verbal communication in comparison with communication-based CIAT treatment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gomes, C.
This report describes a successful project for transference of advanced AI technology into the domain of planning of outages of nuclear power plants as part of DOD's dual-use program. ROMAN (Rome Lab Outage Manager) is the prototype system that was developed as a result of this project. ROMAN's main innovation compared to the current state-of-the-art of outage management tools is its capability to automatically enforce safety constraints during the planning and scheduling phase. Another innovative aspect of ROMAN is the generation of more robust schedules that are feasible over time windows. In other words, ROMAN generates a family of schedules by assigning time intervals as start times to activities rather than single start times, without affecting the overall duration of the project. ROMAN uses a constraint satisfaction paradigm combining a global search tactic with constraint propagation. The derivation of very specialized representations for the constraints to perform efficient propagation is a key aspect for the generation of very fast schedules - constraints are compiled into the code, which is a novel aspect of our work using an automatic programming system, KIDS.
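ROMAN's idea of schedules that remain feasible over time windows can be illustrated with a minimal constraint-propagation sketch: precedence constraints are propagated to a fixed point, tightening each activity's [earliest, latest] start window. All names here are hypothetical; ROMAN's compiled constraint representations are far more specialized.

```python
def propagate_windows(windows, durations, precedences):
    """Tighten [earliest, latest] start-time windows under precedence
    constraints (a, b), meaning activity b may not start before a
    finishes.  Iterates to a fixed point; returns None if infeasible."""
    w = {k: list(v) for k, v in windows.items()}
    changed = True
    while changed:
        changed = False
        for a, b in precedences:
            # b cannot start before a's earliest finish
            ef = w[a][0] + durations[a]
            if w[b][0] < ef:
                w[b][0] = ef
                changed = True
            # a must be able to finish before b's latest start
            ls = w[b][1] - durations[a]
            if w[a][1] > ls:
                w[a][1] = ls
                changed = True
        if any(lo > hi for lo, hi in w.values()):
            return None  # some window collapsed: no feasible schedule
    return w
```

Any choice of start times inside the tightened windows can be checked cheaply, which is what makes a window-based schedule more robust than a single fixed timetable.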
Figlewski, Krystian; Blicher, Jakob Udby; Mortensen, Jesper; Severinsen, Kåre Eg; Nielsen, Jørgen Feldbæk; Andersen, Henning
2017-01-01
Transcranial direct current stimulation may enhance effect of rehabilitation in patients with chronic stroke. The objective was to evaluate the efficacy of anodal transcranial direct current stimulation combined with constraint-induced movement therapy of the paretic upper limb. A total of 44 patients with stroke were randomly allocated to receive 2 weeks of constraint-induced movement therapy with either anodal or sham transcranial direct current stimulation. The primary outcome measure, Wolf Motor Function Test, was assessed at baseline and after the intervention by blinded investigators. Both groups improved significantly on all Wolf Motor Function Test scores. Group comparison showed improvement on Wolf Motor Function Test in the anodal group compared with the sham group. Anodal transcranial direct current stimulation combined with constraint-induced movement therapy resulted in improvement of functional ability of the paretic upper limb compared with constraint-induced movement therapy alone. URL: http://www.clinicaltrials.gov. Unique identifier: NCT01983319. © 2016 American Heart Association, Inc.
Campos, Nicole G.; Castle, Philip E.; Wright, Thomas C.; Kim, Jane J.
2016-01-01
As cervical cancer screening programs are implemented in low-resource settings, protocols are needed to maximize health benefits under operational constraints. Our objective was to develop a framework for examining health and economic tradeoffs between screening test sensitivity, population coverage, and follow-up of screen-positive women, to help decision makers identify where program investments yield the greatest value. As an illustrative example, we used an individual-based Monte Carlo simulation model of the natural history of human papillomavirus (HPV) and cervical cancer calibrated to epidemiologic data from Uganda. We assumed once in a lifetime screening at age 35 with two-visit HPV DNA testing or one-visit visual inspection with acetic acid (VIA). We assessed the health and economic tradeoffs that arise between 1) test sensitivity and screening coverage; 2) test sensitivity and loss to follow-up (LTFU) of screen-positive women; and 3) test sensitivity, screening coverage, and LTFU simultaneously. The decline in health benefits associated with sacrificing HPV DNA test sensitivity by 20% (e.g., shifting from provider- to self-collection of specimens) could be offset by gains in coverage if coverage increased by at least 20%. When LTFU was 10%, two-visit HPV DNA testing with 80-90% sensitivity was more effective and more cost-effective than one-visit VIA with 40% sensitivity, and yielded greater health benefits than VIA even as VIA sensitivity increased to 60% and HPV test sensitivity declined to 70%. As LTFU increased, two-visit HPV DNA testing became more costly and less effective than one-visit VIA. Setting-specific data on achievable test sensitivity, coverage, follow-up rates, and programmatic costs are needed to guide programmatic decision making for cervical cancer screening. PMID:25943074
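The coverage-versus-sensitivity tradeoff in the abstract can be illustrated with a deliberately crude multiplicative model of the fraction of cases detected (coverage x test sensitivity x follow-up completion). This toy model, with made-up coverage numbers, needs a 25% relative coverage gain (1/0.8) to offset a 20% sensitivity loss; the paper's calibrated microsimulation, being nonlinear, reports a threshold of roughly 20%.

```python
# Toy back-of-envelope, not the paper's Monte Carlo model.
def detected_fraction(coverage, sensitivity, ltfu):
    """Fraction of prevalent cases detected: screened, test-positive, and retained."""
    return coverage * sensitivity * (1.0 - ltfu)

base = detected_fraction(0.50, 0.90, 0.10)          # provider-collected HPV testing
self_collect = detected_fraction(0.625, 0.72, 0.10)  # -20% sensitivity, +25% coverage
```

With these numbers the two strategies break even, showing why expanded coverage can compensate for a less sensitive specimen-collection method.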
ERIC Educational Resources Information Center
Beeble, Marisa L.; Bybee, Deborah; Sullivan, Cris M.
2010-01-01
This study examined the impact of resource constraints on the psychological well-being of survivors of intimate partner violence (IPV), testing whether resource constraints are one mechanism that partially mediates the relationship between IPV and women's well-being. Although within-woman changes in resource constraints did not mediate the…
Outcomes Assessment in Dental Hygiene Programs.
ERIC Educational Resources Information Center
Grimes, Ellen B.
1999-01-01
A survey of 22 dental-hygiene-program directors found that programs routinely and effectively assess student outcomes and use the information for program improvements and to demonstrate accountability. Both policy and faculty/administrative support were deemed important to implementation. Time constraints were a major barrier. Outcomes-assessment…
The Feasibility and Acceptability of “Arise”: An Online Substance Abuse Relapse Prevention Program
Bartel, Chelsea M.
2015-01-01
Abstract Objective: The purpose of this study was to test the feasibility and acceptability of a novel online adolescent substance abuse relapse prevention tool, “Arise” (3C Institute, Cary, NC). The program uses an innovative platform including interactive instructional segments and skill-building games to help adolescents learn and practice coping skills training strategies. Materials and Methods: We conducted a pilot test with nine adolescents in substance abuse treatment (44 percent female) and a feasibility test with treatment providers (n=8; 50 percent female). Adolescents interacted with the program via a secure Web site for approximately 30 minutes for each of two instructional units. Treatment providers reviewed the same material at their own pace. All participants completed a questionnaire with items assessing usability, acceptability, understanding, and subjective experience of the program. Results: Regarding feasibility, recruitment of this population within the study constraints proved challenging, but participant retention in the trial was high (no attrition). Adolescents and treatment providers completed the program with no reported problems, and overall we were able to collect data as planned. Regarding acceptability, the program received strong ratings from both adolescents and providers, who found the prototype informative, engaging, and appealing. Both groups strongly recommended continuing development. Conclusions: We were able to deliver the intervention as intended, and acceptability ratings were high, demonstrating the feasibility and acceptability of online delivery of engaging interactive interventions. This study contributes to our understanding of how interactive technologies, including games, can be used to modify behavior in substance abuse treatment and other health areas. PMID:26181807
Three-dimensional modelling and inversion in gravity and electrical prospecting
NASA Astrophysics Data System (ADS)
Boulanger, Olivier
The aim of this thesis is the application of gravity and resistivity methods to mining prospecting. The objectives of the present study are: (1) to build a fast gravity inversion method to interpret surface data; (2) to develop a tool for modelling the electrical potential acquired at the surface and in boreholes when the resistivity distribution is heterogeneous; and (3) to define and implement a stochastic inversion scheme allowing the estimation of subsurface resistivity from electrical data. The first technique concerns the elaboration of a three-dimensional (3D) inversion program allowing the interpretation of gravity data using a selection of constraints such as minimum distance, flatness, smoothness, and compactness. These constraints are integrated in a Lagrangian formulation. A multi-grid technique is also implemented to resolve long and short gravity wavelengths separately. The subsurface in the survey area is divided into juxtaposed rectangular prismatic blocks. The problem is solved by calculating the model parameters, i.e., the densities of each block. Weights are given to each block depending on depth, a priori information on density, and the density range allowed for the region under investigation. The code is tested on synthetic data. The advantages and behaviour of each method are compared in the 3D reconstruction. Recovery of the geometry (depth, size) and density distribution of the original model depends on the set of constraints used. The best combination of constraints tested for multiple bodies appears to be flatness together with minimum volume. The inversion method is also tested on real gravity data. The second tool developed in this thesis is a three-dimensional electrical resistivity modelling code to interpret surface and subsurface data. Based on the integral equation, it calculates the charge density caused by conductivity gradients at each interface of the mesh, allowing an exact estimation of the potential.
Modelling generates a huge matrix made of Green's functions, which is stored using a method of pyramidal compression. The third method interprets electrical potential measurements using a non-linear geostatistical approach that includes new constraints. This method estimates an analytical covariance model for the resistivity parameters from the potential data. (Abstract shortened by UMI.)
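One ingredient of such constrained inversions, the least-squares fit with a flatness (smoothness) regularizer over a row of density blocks, can be sketched in a few lines. This is a generic 1-D illustration, not the thesis code: the sensitivity kernel `G` below is a made-up distance-decay function, not a real gravity kernel.

```python
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_blocks = 12, 8
x_obs = np.linspace(0.0, 1.0, n_obs)
x_blk = np.linspace(0.0, 1.0, n_blocks)
# Hypothetical smoothing kernel standing in for the gravity forward operator.
G = 1.0 / (1.0 + 25.0 * (x_obs[:, None] - x_blk[None, :]) ** 2)

m_true = np.zeros(n_blocks)
m_true[3:5] = 1.0                                  # a compact dense body
d = G @ m_true + 0.01 * rng.standard_normal(n_obs)  # observed data with small noise

# First-difference operator: penalizing ||L m||^2 enforces flatness.
L = np.diff(np.eye(n_blocks), axis=0)
lam = 0.1
m_est = np.linalg.solve(G.T @ G + lam * L.T @ L, G.T @ d)
```

The recovered model peaks over the true body; swapping `L.T @ L` for other penalty matrices gives the smoothness, minimum-distance, or compactness behaviors the abstract compares.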
NASA Astrophysics Data System (ADS)
Risser, V. V.
1982-06-01
In 1977 the New Mexico State Energy Research and Development (R & D) Program provided $25,000 to the New Mexico Solar Energy Institute to be used in conjunction with US Department of Energy (DOE) funding for design, engineering, and installation of a proposed 150-kilowatt peak photovoltaic (PV) system in Lovington, New Mexico. An additional $75,000 was also committed contingent on award of a contract for construction, test, and evaluation of the system. This award was made in 1979 and the PV system was completed in 1981. Even though budget constraints dictated reduction of the plant size to 100-kilowatts peak, this system has produced more energy than any other flat-plate PV system in the world. The utilization of the R & D Program funding in contributing to the success of this important New Mexico energy project is detailed.
Neon reduction program on Cymer ArF light sources
NASA Astrophysics Data System (ADS)
Kanawade, Dinesh; Roman, Yzzer; Cacouris, Ted; Thornes, Josh; O'Brien, Kevin
2016-03-01
In response to significant neon supply constraints, Cymer has developed a multi-part plan to support its customers. Cymer's primary objective is to ensure that reliable system performance is maintained while minimizing gas consumption. Gas algorithms were optimized to ensure stable performance across all operating conditions. The Cymer neon support plan contains four elements: (1) a gas reduction program to reduce neon by >50% while maintaining existing performance levels and availability; (2) short-term containment solutions for immediate relief; (3) qualification of additional gas suppliers; and (4) a long-term recycling/reclaim opportunity. The Cymer neon reduction program has shown excellent results, as demonstrated by comparing standard gas use against the new >50%-reduced-neon performance for ArF immersion light sources. Testing included stressful conditions such as repetition rate, duty cycle, and energy target changes. No performance degradation has been observed over typical gas lives.
Alternative Constraint Handling Technique for Four-Bar Linkage Path Generation
NASA Astrophysics Data System (ADS)
Sleesongsom, S.; Bureerat, S.
2018-03-01
This paper extends a new concept for path generation from our previous work by adding a new constraint handling technique. The proposed technique was initially designed for problems without prescribed timing: the timing constraint is avoided, while the remaining constraints are handled with a new constraint handling technique, a kind of penalty technique. In the comparative study, path generation optimisation problems are solved using self-adaptive population size teaching-learning based optimization (SAP-TLBO) and the original TLBO. Two traditional path generation test problems are used to evaluate the proposed technique. The results show that the new technique can be applied to path generation problems without prescribed timing and gives better results than the previous technique. Furthermore, SAP-TLBO outperforms the original TLBO.
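The penalty idea underlying such constraint handling can be sketched generically (this is the textbook quadratic-penalty scheme, not the authors' specific variant; the demo objective and bound are made up): violations of g_i(x) <= 0 are squared, scaled, and added to the objective, so the unconstrained minimizer is pushed back into the feasible region.

```python
def penalized(f, constraints, rho):
    """Return F(x) = f(x) + rho * sum of squared constraint violations."""
    def F(x):
        violation = sum(max(0.0, g(x)) ** 2 for g in constraints)
        return f(x) + rho * violation
    return F

# Tiny demo: minimize (x-3)^2 subject to x <= 2; the constrained optimum is x = 2.
f = lambda x: (x - 3.0) ** 2
g = lambda x: x - 2.0            # g(x) <= 0 encodes x <= 2
F = penalized(f, [g], rho=1e4)
best_x = min((i * 1e-3 for i in range(0, 4001)), key=F)  # crude grid search
```

Any optimizer (here a grid search; in the paper, TLBO variants) can then minimize F without explicit constraint logic.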
Optimizing Environmental Flow Operation Rules based on Explicit IHA Constraints
NASA Astrophysics Data System (ADS)
Dongnan, L.; Wan, W.; Zhao, J.
2017-12-01
Multi-objective reservoir operation is increasingly asked to consider environmental flow to support ecosystem health. Indicators of Hydrologic Alteration (IHA) are widely used to describe environmental flow regimes, but few studies have explicitly formulated them into optimization models, making it difficult to direct reservoir release. In an attempt to weigh the benefit of environmental flow against economic achievement, a two-objective reservoir optimization model is developed in which all 33 hydrologic parameters of IHA are explicitly formulated as constraints. The economic benefit is defined by Hydropower Production (HP), while the environmental flow benefit is expressed as an Eco-Index (EI) combining 5 of the 33 IHA parameters chosen by the principal component analysis method. Five scenarios (A to E) with different constraints are tested and solved by nonlinear programming. The case study of the Jing Hong reservoir, located in the upper Mekong basin, China, shows: 1. A Pareto frontier is formed by maximizing only the HP objective in scenario A and only the EI objective in scenario B. 2. Scenario D, using IHA parameters as constraints, obtains the best combined economic and ecological benefits. 3. A sensitive weight coefficient is found in scenario E, but the trade-offs between the HP and EI objectives are not within the Pareto frontier. 4. When the fraction of reservoir utilizable capacity reaches 0.8, both HP and EI achieve acceptable values. Finally, to make this model more convenient for everyday practice, a simplified operation rule curve is extracted.
Project Portal User-Centered Design and Engineering Report
2016-06-01
design. Further information on this round of testing is in Appendix A. 4.2 APRIL TEST Wireframe usability tests ... testing on other areas of the design, but due to schedule constraints from management, and personnel constraints in the development team, this became ... just move on. That's super normal when we test early on like this. I also may ask you to do things we actually haven't created designs for
A new look at the simultaneous analysis and design of structures
NASA Technical Reports Server (NTRS)
Striz, Alfred G.
1994-01-01
The minimum weight optimization of structural systems, subject to strength and displacement constraints as well as size side constraints, was investigated by the Simultaneous ANalysis and Design (SAND) approach. As the optimizer, the code NPSOL was used, which is based on a sequential quadratic programming (SQP) algorithm. The structures were modeled by the finite element method. The finite element related input to NPSOL was automatically generated from the input decks of such standard FEM/optimization codes as NASTRAN or ASTROS, with the stiffness matrices, at present, extracted from the FEM code ANALYZE. In order to avoid the ill-conditioned matrices that can be encountered when the global stiffness equations are used as additional nonlinear equality constraints in the SAND approach (with the displacements as additional variables), the matrix displacement method was applied. In this approach, the element stiffness equations are used as constraints instead of the global stiffness equations, in conjunction with the nodal force equilibrium equations. This approach adds the element forces as variables to the system. Since, for complex structures and the associated large and very sparse matrices, the execution times of the optimization code became excessive due to the large number of required constraint gradient evaluations, the Kreisselmeier-Steinhauser function approach was used to decrease the computational effort by reducing the nonlinear equality constraint system to essentially a single combined constraint equation. As the linear equality and inequality constraints require much less computational effort to evaluate, they were kept in their previous form to limit the complexity of the KS function evaluation. To date, the standard three-bar, ten-bar, and 72-bar trusses have been tested. For the standard SAND approach, correct results were obtained for all three trusses, although convergence became slower for the 72-bar truss.
When the matrix displacement method was used, correct results were still obtained, but the execution times became excessive due to the large number of constraint gradient evaluations required. Using the KS function, the computational effort dropped, but the optimization seemed to become less robust. The investigation of this phenomenon is continuing. As an alternate approach, the code MINOS for the optimization of sparse matrices can be applied to the problem in lieu of the Kreisselmeier-Steinhauser function. This investigation is underway.
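The Kreisselmeier-Steinhauser aggregation used above folds many constraints g_i(x) <= 0 into one smooth, conservative envelope: KS(g; rho) = (1/rho) ln(sum_i exp(rho g_i)). It always overestimates max(g_i) and approaches it as rho grows. A minimal sketch (standard definition, with made-up sample constraint values):

```python
import math

def ks(gs, rho=50.0):
    """Kreisselmeier-Steinhauser aggregate of constraint values gs."""
    g_max = max(gs)  # shift by the max for numerical stability of exp()
    return g_max + math.log(sum(math.exp(rho * (g - g_max)) for g in gs)) / rho

gs = [-0.30, -0.05, 0.10]  # hypothetical constraint values; 0.10 is violated
```

Because a single smooth KS value replaces the whole constraint set, the optimizer needs only one constraint gradient per iteration, which is exactly the computational saving the abstract reports.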
A simulation exercise of a cavity-type solar receiver using the HEAP program
NASA Technical Reports Server (NTRS)
Lansing, F. L.
1979-01-01
A computer program has been developed at JPL to support advanced studies of solar receivers in high-concentration solar-thermal-electric power plants. This work briefly presents the program methodology, required input data, expected output results, capabilities, and limitations. The program was used to simulate an existing 5 kWt experimental receiver of the cavity type. The receiver is located at the focus of a paraboloid dish and is connected to a Stirling engine. Both steady-state and transient performance simulations are given. Details of the receiver modeling are also presented to illustrate the procedure followed. Simulated temperature patterns were found to be in good agreement with test data obtained from high-temperature thermocouples. The simulated receiver performance was extrapolated to various operating conditions not attained experimentally. The results of the parameterization study were fitted to a general performance expression to determine the receiver characteristic constraints. The latter were used to optimize the receiver operating conditions to obtain the highest overall conversion efficiency.
User's guide to the Fault Inferring Nonlinear Detection System (FINDS) computer program
NASA Technical Reports Server (NTRS)
Caglayan, A. K.; Godiwala, P. M.; Satz, H. S.
1988-01-01
Described are the operation and internal structure of the computer program FINDS (Fault Inferring Nonlinear Detection System). The FINDS algorithm is designed to provide reliable estimates for aircraft position, velocity, attitude, and horizontal winds to be used for guidance and control laws in the presence of possible failures in the avionics sensors. The FINDS algorithm was developed with the use of a digital simulation of a commercial transport aircraft and tested with flight recorded data. The algorithm was then modified to meet the size constraints and real-time execution requirements on a flight computer. For the real-time operation, a multi-rate implementation of the FINDS algorithm has been partitioned to execute on a dual parallel processor configuration: one based on the translational dynamics and the other on the rotational kinematics. The report presents an overview of the FINDS algorithm, the implemented equations, the flow charts for the key subprograms, the input and output files, program variable indexing convention, subprogram descriptions, and the common block descriptions used in the program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waddell, Lucas; Muldoon, Frank; Henry, Stephen Michael
In order to effectively plan the management and modernization of their large and diverse fleets of vehicles, Program Executive Office Ground Combat Systems (PEO GCS) and Program Executive Office Combat Support and Combat Service Support (PEO CS&CSS) commissioned the development of a large-scale portfolio planning optimization tool. This software, the Capability Portfolio Analysis Tool (CPAT), creates a detailed schedule that optimally prioritizes the modernization or replacement of vehicles within the fleet - respecting numerous business rules associated with fleet structure, budgets, industrial base, research and testing, etc., while maximizing overall fleet performance through time. This paper contains a thorough documentation of the terminology, parameters, variables, and constraints that comprise the fleet management mixed integer linear programming (MILP) mathematical formulation. This paper, which is an update to the original CPAT formulation document published in 2015 (SAND2015-3487), covers the formulation of important new CPAT features.
ERIC Educational Resources Information Center
Mao, Xiuzhen; Xin, Tao
2013-01-01
The Monte Carlo approach which has previously been implemented in traditional computerized adaptive testing (CAT) is applied here to cognitive diagnostic CAT to test the ability of this approach to address multiple content constraints. The performance of the Monte Carlo approach is compared with the performance of the modified maximum global…
Check Calibration of the NASA Glenn 10- by 10-Foot Supersonic Wind Tunnel (2014 Test Entry)
NASA Technical Reports Server (NTRS)
Johnson, Aaron; Pastor-Barsi, Christine; Arrington, E. Allen
2016-01-01
A check calibration of the 10- by 10-Foot Supersonic Wind Tunnel (SWT) was conducted in May/June 2014 using an array of five supersonic wedge probes to verify the 1999 calibration. This check calibration was necessary following a control systems upgrade and an integrated systems test (IST), and was required to verify that the tunnel flow quality was unchanged by the control systems upgrade before the next test customer began their test entry. The previous check calibration of the tunnel occurred in 2007, prior to the Mars Science Laboratory test program. Secondary objectives of this test entry included the validation of the new Cobra data acquisition system (DAS) against the current Escort DAS and the creation of statistical process control (SPC) charts through the collection of series of repeated test points at certain predetermined tunnel parameters. The SPC-chart secondary objective was not completed due to schedule constraints. It is hoped that this effort will be readdressed and completed in the near future.
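The SPC charts in the secondary objective amount to computing control limits from repeated test points at a fixed tunnel condition. A generic individuals-chart sketch (mean +/- 3 sigma; the Mach readings below are made-up numbers, not tunnel data):

```python
def control_limits(samples):
    """Return (lower control limit, center line, upper control limit)."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / (n - 1)  # sample variance
    sigma = var ** 0.5
    return mean - 3 * sigma, mean, mean + 3 * sigma

mach = [2.498, 2.502, 2.501, 2.499, 2.500, 2.503, 2.497]  # hypothetical repeats
lcl, center, ucl = control_limits(mach)
out_of_control = [m for m in mach if not lcl <= m <= ucl]
```

Points falling outside the limits on later entries would flag a drift in tunnel flow quality, which is the purpose such charts would serve between calibrations.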
Passive shimming of a superconducting magnet using the L1-norm regularized least square algorithm.
Kong, Xia; Zhu, Minhua; Xia, Ling; Wang, Qiuliang; Li, Yi; Zhu, Xuchen; Liu, Feng; Crozier, Stuart
2016-02-01
The uniformity of the static magnetic field B0 is of prime importance for an MRI system. The passive shimming technique is usually applied to improve the uniformity of the static field by optimizing the layout of a series of steel shims. The steel pieces are fixed in the drawers in the inner bore of the superconducting magnet, and produce a magnetizing field in the imaging region to compensate for the inhomogeneity of the B0 field. In practice, the total mass of steel used for shimming should be minimized, in addition to the field uniformity requirement. This is because the presence of steel shims may introduce a thermal stability problem. The passive shimming procedure is typically realized using the linear programming (LP) method. The LP approach however, is generally slow and also has difficulty balancing the field quality and the total amount of steel for shimming. In this paper, we have developed a new algorithm that is better able to balance the dual constraints of field uniformity and the total mass of the shims. The least square method is used to minimize the magnetic field inhomogeneity over the imaging surface with the total mass of steel being controlled by an L1-norm based constraint. The proposed algorithm has been tested with practical field data, and the results show that, with similar computational cost and mass of shim material, the new algorithm achieves superior field uniformity (43% better for the test case) compared with the conventional linear programming approach. Copyright © 2016 Elsevier Inc. All rights reserved.
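The core idea, least-squares field fitting with an L1 penalty that drives most shim weights to zero, can be sketched with plain iterative soft-thresholding (ISTA). This is a generic stand-in for the paper's algorithm, with a synthetic random matrix in place of real field-to-shim sensitivities.

```python
import numpy as np

def soft(v, t):
    """Soft-thresholding: the proximal operator of the L1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def l1_least_squares(A, b, lam, iters=2000):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by iterative soft-thresholding."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = soft(x - step * A.T @ (A @ x - b), step * lam)
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((40, 20))            # synthetic sensitivity matrix
x_true = np.zeros(20)
x_true[[2, 11]] = [1.5, -2.0]                # only two "shims" are actually needed
b = A @ x_true
x_hat = l1_least_squares(A, b, lam=0.1)
```

The L1 term keeps the recovered weight vector sparse, mirroring the paper's goal of minimizing total shim mass while matching the field, in contrast to an unpenalized least-squares fit that would spread mass over every slot.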
Space Operations Center system analysis study extension. Volume 2: Programmatics and cost
NASA Technical Reports Server (NTRS)
1982-01-01
A summary of Space Operations Center (SOC) orbital space station costs, program options and program recommendations is presented. Program structure, hardware commonality, schedules and program phasing are considered. Program options are analyzed with respect to mission needs, design and technology options, and anticipated funding constraints. Design and system options are discussed.
Scheduling the resident 80-hour work week: an operations research algorithm.
Day, T Eugene; Napoli, Joseph T; Kuo, Paul C
2006-01-01
The resident 80-hour work week requires that programs now schedule duty hours. Typically, scheduling is performed in an empirical "trial-and-error" fashion. However, this is a classic "scheduling" problem from the field of operations research (OR). It is similar to scheduling issues that airlines must face with pilots and planes routing through various airports at various times. The authors hypothesized that an OR approach using iterative computer algorithms could provide a rational scheduling solution. Institution-specific constraints of the residency problem were formulated. A total of 56 residents are rotating through 4 hospitals. Additional constraints were dictated by the Residency Review Committee (RRC) rules or the specific surgical service. For example, at Hospital 1, during the weekday hours between 6 am and 6 pm, there will be a PGY4 or PGY5 and a PGY2 or PGY3 on-duty to cover Service "A." A series of equations and logic statements was generated to satisfy all constraints and requirements. These were restated in the Optimization Programming Language used by the ILOG software suite for solving mixed integer programming problems. An integer programming solution was generated to this resource-constrained assignment problem. A total of 30,900 variables and 12,443 constraints were required. A total of man-hours of programming were used; computer run-time was 25.9 hours. A weekly schedule was generated for each resident that satisfied the RRC regulations while fulfilling all stated surgical service requirements. Each required between 64 and 80 weekly resident duty hours. The authors conclude that OR is a viable approach to schedule resident work hours. This technique is sufficiently robust to accommodate changes in resident numbers, service requirements, and service and hospital rotations.
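The resource-constrained assignment problem can be illustrated at toy scale. This sketch (hypothetical numbers; brute force instead of the ILOG mixed-integer solver the authors used) assigns ten 12-hour shifts among three residents, enforcing the 80-hour cap and preferring balanced workloads.

```python
import itertools

residents = ["PGY2", "PGY4", "PGY5"]      # hypothetical roster
n_shifts, shift_len, cap = 10, 12, 80     # RRC-style 80-hour weekly cap

best, best_spread = None, None
for assign in itertools.product(range(len(residents)), repeat=n_shifts):
    hours = [shift_len * assign.count(r) for r in range(len(residents))]
    if max(hours) > cap:
        continue                          # violates the 80-hour rule
    spread = max(hours) - min(hours)      # objective: balance the workload
    if best_spread is None or spread < best_spread:
        best, best_spread = assign, spread

hours = [shift_len * best.count(r) for r in range(len(residents))]
```

Exhaustive search is viable only at this toy size (3^10 assignments); the study's 30,900 variables and 12,443 constraints are exactly why an integer programming solver is needed at institutional scale.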
Solving Fractional Programming Problems based on Swarm Intelligence
NASA Astrophysics Data System (ADS)
Raouf, Osama Abdel; Hezam, Ibrahim M.
2014-04-01
This paper presents a new approach to solving Fractional Programming Problems (FPPs) based on two different Swarm Intelligence (SI) algorithms: Particle Swarm Optimization and the Firefly Algorithm. The two algorithms are tested using several FPP benchmark examples and two selected industrial applications. The tests aim to demonstrate the capability of the SI algorithms to solve any type of FPP. The solution results employing the SI algorithms are compared with a number of exact and metaheuristic solution methods used for handling FPPs. Swarm Intelligence can be regarded as an effective technique for solving linear or nonlinear, non-differentiable fractional objective functions. Problems with an optimal solution at a finite point and an unbounded constraint set can be solved using the proposed approach. Numerical examples are given to show the feasibility, effectiveness, and robustness of the proposed algorithm. The results obtained using the two SI algorithms reveal the superiority of the proposed technique over others in computational time, and remarkably better accuracy was observed in the solution results of the industrial application problems.
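A minimal particle swarm on a one-dimensional linear-fractional objective gives the flavor of the approach (a generic sketch with a hypothetical objective, not the paper's benchmark set or its Firefly variant). Here f(x) = (2x+3)/(x+2) is increasing on [0, 5], so the known minimum is x = 0 with f = 1.5.

```python
import random

def f(x):
    return (2 * x + 3) / (x + 2)   # linear-fractional objective to minimize

random.seed(7)
n, lo, hi = 20, 0.0, 5.0
xs = [random.uniform(lo, hi) for _ in range(n)]   # particle positions
vs = [0.0] * n                                    # particle velocities
pbest = xs[:]                                     # personal bests
gbest = min(xs, key=f)                            # global best
for _ in range(200):
    for i in range(n):
        r1, r2 = random.random(), random.random()
        # Inertia + cognitive pull (pbest) + social pull (gbest).
        vs[i] = 0.7 * vs[i] + 1.5 * r1 * (pbest[i] - xs[i]) + 1.5 * r2 * (gbest - xs[i])
        xs[i] = min(hi, max(lo, xs[i] + vs[i]))   # clamp to the feasible box
        if f(xs[i]) < f(pbest[i]):
            pbest[i] = xs[i]
    gbest = min(pbest, key=f)
```

Note that the swarm needs no derivative of the ratio, which is the practical appeal for non-differentiable fractional objectives.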
On the linear programming bound for linear Lee codes.
Astola, Helena; Tabus, Ioan
2016-01-01
Based on an invariance-type property of the Lee-compositions of a linear Lee code, additional equality constraints can be introduced into the linear programming problem for linear Lee codes. In this paper, we formulate this property in terms of an action of the multiplicative group of the field [Formula: see text] on the set of Lee-compositions. We show some useful properties of certain sums of Lee-numbers, which are the eigenvalues of the Lee association scheme, appearing in the linear programming problem for linear Lee codes. Using the additional equality constraints, we formulate the linear programming problem in a very compact form, leading to fast execution, which allows the bounds for large parameter values of the linear codes to be computed efficiently.
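The Lee weight and Lee-composition underlying that formulation can be computed directly from their standard definitions (this is background arithmetic, not the paper's LP machinery): a symbol x in Z_q has Lee weight min(x, q-x), and a composition counts how many symbols of a word take each Lee value.

```python
def lee_weight(word, q):
    """Sum of per-symbol Lee weights min(x, q-x) over the word (symbols in Z_q)."""
    return sum(min(x % q, q - x % q) for x in word)

def lee_composition(word, q):
    """comp[w] = number of symbols in the word with Lee weight w."""
    comp = [0] * (q // 2 + 1)   # Lee weights range over 0 .. floor(q/2)
    for x in word:
        comp[min(x % q, q - x % q)] += 1
    return comp
```

For example, over Z_7 the word (1, 6, 3) has Lee weight 1 + 1 + 3 = 5; the LP bound of the paper constrains distributions of such compositions rather than individual words.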
NASA Technical Reports Server (NTRS)
Pamadi, Bandu N.; Toniolo, Matthew D.; Tartabini, Paul V.; Roithmayr, Carlos M.; Albertson, Cindy W.; Karlgaard, Christopher D.
2016-01-01
The objective of this report is to develop and implement a physics based method for analysis and simulation of multi-body dynamics including launch vehicle stage separation. The constraint force equation (CFE) methodology discussed in this report provides such a framework for modeling constraint forces and moments acting at joints when the vehicles are still connected. Several stand-alone test cases involving various types of joints were developed to validate the CFE methodology. The results were compared with ADAMS(Registered Trademark) and Autolev, two different industry standard benchmark codes for multi-body dynamic analysis and simulations. However, these two codes are not designed for aerospace flight trajectory simulations. After this validation exercise, the CFE algorithm was implemented in Program to Optimize Simulated Trajectories II (POST2) to provide a capability to simulate end-to-end trajectories of launch vehicles including stage separation. The POST2/CFE methodology was applied to the STS-1 Space Shuttle solid rocket booster (SRB) separation and Hyper-X Research Vehicle (HXRV) separation from the Pegasus booster as a further test and validation for its application to launch vehicle stage separation problems. Finally, to demonstrate end-to-end simulation capability, POST2/CFE was applied to the ascent, orbit insertion, and booster return of a reusable two-stage-to-orbit (TSTO) vehicle concept. With these validation exercises, POST2/CFE software can be used for performing conceptual level end-to-end simulations, including launch vehicle stage separation, for problems similar to those discussed in this report.
Interior point techniques for LP and NLP
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evtushenko, Y.
By using a surjective mapping, the initial constrained optimization problem is transformed into a problem in a new space with only equality constraints. For the numerical solution of the latter problem we use the generalized gradient-projection method and Newton's method. After the inverse transformation to the initial space we obtain a family of numerical methods for solving optimization problems with equality and inequality constraints. In the linear programming case, after some simplification, we obtain Dikin's algorithm, the affine scaling algorithm, and the generalized primal-dual interior point linear programming algorithm.
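Dikin's affine scaling algorithm mentioned above admits a compact sketch (the generic textbook version, applied to a made-up two-variable LP): each iterate rescales coordinates by X = diag(x) and steps along the projected negative gradient, staying strictly inside the positive orthant.

```python
import numpy as np

def affine_scaling(A, b, c, x, alpha=0.5, iters=50):
    """Approximately minimize c@x subject to A@x = b, x > 0, from interior point x."""
    for _ in range(iters):
        X2 = np.diag(x ** 2)
        w = np.linalg.solve(A @ X2 @ A.T, A @ X2 @ c)   # dual estimate
        d = -X2 @ (c - A.T @ w)                         # descent direction; A@d = 0
        if np.all(d >= 0):
            break                                       # no blocking coordinate
        # Damped step: go a fraction alpha of the way to the nearest boundary.
        t = alpha * min(-x[i] / d[i] for i in range(len(x)) if d[i] < 0)
        x = x + t * d
    return x

A = np.array([[1.0, 1.0]])       # feasible set: x1 + x2 = 1, x > 0
b = np.array([1.0])
c = np.array([1.0, 0.0])         # optimum at the vertex x = (0, 1)
x = affine_scaling(A, b, c, np.array([0.5, 0.5]))
```

The direction satisfies A@d = 0, so every iterate remains feasible; the diagonal rescaling is what keeps steps short near the boundary, which is the defining feature of interior point methods.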
Compensator improvement for multivariable control systems
NASA Technical Reports Server (NTRS)
Mitchell, J. R.; Mcdaniel, W. L., Jr.; Gresham, L. L.
1977-01-01
A theory and the associated numerical technique are developed for an iterative design improvement of the compensation for linear, time-invariant control systems with multiple inputs and multiple outputs. A strict constraint algorithm is used in obtaining a solution of the specified constraints of the control design. The result of the research effort is the multiple input, multiple output Compensator Improvement Program (CIP). The objective of the Compensator Improvement Program is to modify in an iterative manner the free parameters of the dynamic compensation matrix so that the system satisfies frequency domain specifications. In this exposition, the underlying principles of the multivariable CIP algorithm are presented and the practical utility of the program is illustrated with space vehicle related examples.
Creative Funding Opportunities for Interscholastic Athletic Programs
ERIC Educational Resources Information Center
Forester, Brooke E.
2015-01-01
Athletic programs nationwide are facing budget constraints like never before. Pay-to-play programs are becoming commonplace. School districts are providing less and less funding for athletics. Still worse, many high school athletic programs are being cut entirely from the scholastic school setting. Coaches and athletic directors are being forced…
Cluster functions and scattering amplitudes for six and seven points
Harrington, Thomas; Spradlin, Marcus
2017-07-05
Scattering amplitudes in planar super-Yang-Mills theory satisfy several basic physical and mathematical constraints, including physical constraints on their branch cut structure and various empirically discovered connections to the mathematics of cluster algebras. The power of the bootstrap program for amplitudes is inversely proportional to the size of the intersection between these physical and mathematical constraints: ideally we would like a list of constraints which determine scattering amplitudes uniquely. Here, we explore this intersection quantitatively for two-loop six- and seven-point amplitudes by providing a complete taxonomy of the Gr(4, 6) and Gr(4, 7) cluster polylogarithm functions of [15] at weight 4.
Evaluation of Fast-Time Wake Vortex Models using Wake Encounter Flight Test Data
NASA Technical Reports Server (NTRS)
Ahmad, Nashat N.; VanValkenburg, Randal L.; Bowles, Roland L.; Limon Duparcmeur, Fanny M.; Gloudesman, Thijs; van Lochem, Sander; Ras, Eelco
2014-01-01
This paper describes a methodology for the integration and evaluation of fast-time wake models with flight data. The National Aeronautics and Space Administration conducted detailed flight tests in 1995 and 1997 under the Aircraft Vortex Spacing System Program to characterize wake vortex decay and wake encounter dynamics. In this study, data collected during Flight 705 were used to evaluate NASA's fast-time wake transport and decay models. Deterministic and Monte-Carlo simulations were conducted to define wake hazard bounds behind the wake generator. The methodology described in this paper can be used for further validation of fast-time wake models using en-route flight data, and for determining wake turbulence constraints in the design of air traffic management concepts.
Laser-based firing systems for prompt initiation of secondary explosives
NASA Technical Reports Server (NTRS)
Meeks, Kent D.; Setchell, Robert E.
1993-01-01
Motivated by issues of weapon safety and security, laser-based firing systems for promptly initiating secondary explosives have been under active development at Sandia National Laboratories for more than four years. Such a firing system consists of a miniaturized, Q-switched, solid-state laser; optical detonators; optical safety switches; and elements for splitting, coupling, and transmitting the laser output. Potential system applications pose significant challenges in terms of severe mechanical and thermal environments and packaging constraints, while requiring clear demonstration of safety enhancements. The Direct Optical Initiation (DOI) Program at Sandia is addressing these challenges through progressive development phases in which the design, fabrication, and testing of prototype hardware are aimed at increasingly difficult application requirements. A brief history of the development program and a summary of current and planned activities are presented.
Watson-Jones, Deborah; Lees, Shelley; Mwanga, Joseph; Neke, Nyasule; Changalucha, John; Broutet, Nathalie; Maduhu, Ibrahim; Kapiga, Saidi; Chandra-Mouli, Venkatraman; Bloem, Paul; Ross, David A
2016-07-01
Human papillomavirus (HPV) vaccination offers an opportunity to strengthen provision of adolescent health interventions (AHI). We explored the feasibility of integrating other AHI with HPV vaccination in Tanzania. A desk review of 39 policy documents was preceded by a stakeholder meeting with 38 policy makers and partners. Eighteen key informant interviews (KIIs) with health and education policy makers and district officials were conducted to further explore perceptions of current programs, priorities and AHI that might be suitable for integration with HPV vaccination. Fourteen school health interventions (SHI) or AHI are currently being implemented by the Government of Tanzania. Most are delivered as vertical programmes. Coverage of current programs is not universal, and is limited by financial, human resource and logistic constraints. Limited community engagement, rumours, and lack of strategic advocacy have affected uptake of some interventions, e.g. tetanus toxoid (TT) immunization. Stakeholder and KI perceptions and opinions were limited by a lack of experience with integrated delivery and with AHI outside an individual's area of expertise and experience. Deworming and educational sessions including reproductive health education were the most frequently mentioned interventions that respondents considered suitable for integrated delivery with HPV vaccine. Given programme constraints, limited experience with integrated delivery and concern about real or perceived side-effects being attributed to the vaccine, it will be very important to pilot-test integration of AHI/SHI with HPV vaccination. Selected interventions will need to be simple and quick to deliver since health workers are likely to face significant logistic and time constraints during vaccination visits. © The Author 2016. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine.
Why do mothers favor girls and fathers, boys? : A hypothesis and a test of investment disparity.
Godoy, Ricardo; Reyes-García, Victoria; McDade, Thomas; Tanner, Susan; Leonard, William R; Huanca, Tomás; Vadez, Vincent; Patel, Karishma
2006-06-01
Growing evidence suggests mothers invest more in girls than boys and fathers more in boys than girls. We develop a hypothesis that predicts preference for girls by the parent facing more resource constraints and preference for boys by the parent facing fewer constraints. We test the hypothesis with panel data from the Tsimane', a foraging-farming society in the Bolivian Amazon. Tsimane' mothers face more resource constraints than fathers. As predicted, mother's wealth protected girls' BMI, but father's wealth had weak effects on boys' BMI. Numerous tests yielded robust results, including those that controlled for fixed effects of child and household.
Sleep underpins the plasticity of language production.
Gaskell, M Gareth; Warker, Jill; Lindsay, Shane; Frost, Rebecca; Guest, James; Snowdon, Reza; Stackhouse, Abigail
2014-07-01
The constraints that govern acceptable phoneme combinations in speech perception and production have considerable plasticity. We addressed whether sleep influences the acquisition of new constraints and their integration into the speech-production system. Participants repeated sequences of syllables in which two phonemes were artificially restricted to syllable onset or syllable coda, depending on the vowel in that sequence. After 48 sequences, participants either had a 90-min nap or remained awake. Participants then repeated 96 sequences so implicit constraint learning could be examined, and then were tested for constraint generalization in a forced-choice task. The sleep group, but not the wake group, produced speech errors at test that were consistent with restrictions on the placement of phonemes in training. Furthermore, only the sleep group generalized their learning to new materials. Polysomnography data showed that implicit constraint learning was associated with slow-wave sleep. These results show that sleep facilitates the integration of new linguistic knowledge with existing production constraints. These data have relevance for systems-consolidation models of sleep. © The Author(s) 2014.
Multi-Time Step Service Restoration for Advanced Distribution Systems and Microgrids
Chen, Bo; Chen, Chen; Wang, Jianhui; ...
2017-07-07
Modern power systems are facing increased risk of disasters that can cause extended outages. The presence of remote control switches (RCSs), distributed generators (DGs), and energy storage systems (ESS) provides both challenges and opportunities for developing post-fault service restoration methodologies. Inter-temporal constraints of DGs, ESS, and loads under cold load pickup (CLPU) conditions impose extra complexity on problem formulation and solution. In this paper, a multi-time step service restoration methodology is proposed to optimally generate a sequence of control actions for controllable switches, ESSs, and dispatchable DGs to assist the system operator with decision making. The restoration sequence is determined to minimize the unserved customers by energizing the system step by step without violating operational constraints at each time step. The proposed methodology is formulated as a mixed-integer linear programming (MILP) model and can adapt to various operation conditions. Furthermore, the proposed method is validated through several case studies that are performed on modified IEEE 13-node and IEEE 123-node test feeders.
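The stepwise energization idea above can be sketched in miniature. The following is a brute-force stand-in for the paper's MILP, with hypothetical loads, DG capacity, and step count; it illustrates the inter-temporal constraint that an energized load stays energized while respecting capacity at each step.

```python
from itertools import permutations

# Toy restoration instance (all names and values illustrative, not from
# the paper). Each load has a demand in kW; a single DG supplies at most
# CAP kW; one switch closure is allowed per time step.
loads = {"L1": 40, "L2": 35, "L3": 25, "L4": 20}
CAP = 80
STEPS = 3

def served_energy(order):
    """Load-steps served when loads are energized in `order`, skipping
    any closure that would violate the DG capacity constraint. Once a
    load is picked up it stays energized (inter-temporal constraint)."""
    supplied = total = 0
    for t in range(STEPS):
        if t < len(order) and supplied + loads[order[t]] <= CAP:
            supplied += loads[order[t]]
        total += supplied
    return total

# Enumerate switching sequences in place of solving the MILP.
best = max(permutations(loads, STEPS), key=served_energy)
print(best, served_energy(best))
```

A real instance would add network topology, CLPU demand inflation, and ESS dispatch as further constraints, which is why the authors use a MILP solver rather than enumeration.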
GRADSPMHD: A parallel MHD code based on the SPH formalism
NASA Astrophysics Data System (ADS)
Vanaverbeke, S.; Keppens, R.; Poedts, S.
2014-03-01
We present GRADSPMHD, a completely Lagrangian parallel magnetohydrodynamics code based on the SPH formalism. The implementation of the equations of SPMHD in the “GRAD-h” formalism assembles known results, including the derivation of the discretized MHD equations from a variational principle, the inclusion of time-dependent artificial viscosity, resistivity and conductivity terms, as well as the inclusion of a mixed hyperbolic/parabolic correction scheme for satisfying the ∇·B = 0 constraint on the magnetic field. The code uses a tree-based formalism for neighbor finding and can optionally use the tree code for computing the self-gravity of the plasma. The structure of the code closely follows the framework of our parallel GRADSPH FORTRAN 90 code which we added previously to the CPC program library. We demonstrate the capabilities of GRADSPMHD by running 1-, 2-, and 3-dimensional standard benchmark tests and we find good agreement with previous work done by other researchers. The code is also applied to the problem of simulating the magnetorotational instability in 2.5D shearing box tests as well as in global simulations of magnetized accretion disks. We find good agreement with available results on this subject in the literature. Finally, we discuss the performance of the code on a parallel supercomputer with distributed memory architecture. Catalogue identifier: AERP_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERP_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 620503 No. of bytes in distributed program, including test data, etc.: 19837671 Distribution format: tar.gz Programming language: FORTRAN 90/MPI. Computer: HPC cluster. Operating system: Unix. Has the code been vectorized or parallelized?: Yes, parallelized using MPI.
RAM: ˜30 MB for a Sedov test including 15625 particles on a single CPU. Classification: 12. Nature of problem: Evolution of a plasma in the ideal MHD approximation. Solution method: The equations of magnetohydrodynamics are solved using the SPH method. Running time: The test provided takes approximately 20 min using 4 processors.
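The solenoidal constraint ∇·B = 0 that the hyperbolic/parabolic cleaning scheme maintains can be checked numerically on an analytic field. This sketch uses an illustrative 2D field and grid of my own choosing, not anything from GRADSPMHD:

```python
# Verify that B = (-y, x) is divergence-free via central differences.
# Field, grid, and step size are assumptions for illustration only.

def B(x, y):
    return (-y, x)   # analytically divergence-free 2D field

def div_B(x, y, h=1e-4):
    """Central-difference estimate of div(B) at (x, y)."""
    return ((B(x + h, y)[0] - B(x - h, y)[0]) / (2 * h)
            + (B(x, y + h)[1] - B(x, y - h)[1]) / (2 * h))

# Sample a small grid; the worst-case divergence should vanish
# up to floating-point rounding.
worst = max(abs(div_B(0.1 * i, 0.1 * j))
            for i in range(-5, 6) for j in range(-5, 6))
print(worst)
```

In an SPH discretization the particle estimate of ∇·B does not vanish exactly, which is precisely why a cleaning scheme is needed.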
Combining computer adaptive testing technology with cognitively diagnostic assessment.
McGlohen, Meghan; Chang, Hua-Hua
2008-08-01
A major advantage of computerized adaptive testing (CAT) is that it allows the test to home in on an examinee's ability level in an interactive manner. The aim of the new area of cognitive diagnosis is to provide information about specific content areas in which an examinee needs help. The goal of this study was to combine the benefit of specific feedback from cognitively diagnostic assessment with the advantages of CAT. In this study, three approaches to combining these were investigated: (1) item selection based on the traditional ability level estimate (theta), (2) item selection based on the attribute mastery feedback provided by cognitively diagnostic assessment (alpha), and (3) item selection based on both the traditional ability level estimate (theta) and the attribute mastery feedback provided by cognitively diagnostic assessment (alpha). The results from these three approaches were compared for theta estimation accuracy, attribute mastery estimation accuracy, and item exposure control. The theta- and alpha-based condition outperformed the alpha-based condition regarding theta estimation, attribute mastery pattern estimation, and item exposure control. Both the theta-based condition and the theta- and alpha-based condition performed similarly with regard to theta estimation, attribute mastery estimation, and item exposure control, but the theta- and alpha-based condition has an additional advantage in that it uses the shadow test method. This method allows the administrator to incorporate additional constraints in the item selection process (such as content balancing and item-type constraints) and to select items on the basis of both the current theta and alpha estimates, and it can be built on top of existing 3PL testing programs.
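The theta-based selection step in a 3PL CAT can be sketched concretely: pick the item with maximum Fisher information at the current ability estimate. The item bank below is hypothetical; the information formula is the standard 3PL one.

```python
import math

def p3pl(theta, a, b, c):
    """3PL probability of a correct response."""
    return c + (1 - c) / (1 + math.exp(-a * (theta - b)))

def info(theta, a, b, c):
    """Fisher information of a 3PL item at ability theta:
    I = a^2 * (q/p) * ((p - c) / (1 - c))^2."""
    p = p3pl(theta, a, b, c)
    return a * a * ((1 - p) / p) * ((p - c) / (1 - c)) ** 2

# Hypothetical item bank: (discrimination a, difficulty b, guessing c).
bank = [(1.2, -1.0, 0.2), (0.8, 0.0, 0.2), (1.5, 0.5, 0.2), (1.0, 2.0, 0.2)]

def pick(theta, bank):
    """Plain maximum-information selection at the current theta estimate."""
    return max(bank, key=lambda item: info(theta, *item))

print(pick(0.5, bank))
```

The shadow-test variant discussed in the abstract would replace `pick` with a constrained optimization over a full feasible test form rather than a greedy per-item choice.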
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun Wei; Huang, Guo H., E-mail: huang@iseis.org; Institute for Energy, Environment and Sustainable Communities, University of Regina, Regina, Saskatchewan, S4S 0A2
2012-06-15
Highlights: • Inexact piecewise-linearization-based fuzzy flexible programming is proposed. • It is the first application to waste management under multiple complexities. • It tackles nonlinear economies-of-scale effects in interval-parameter constraints. • It estimates costs more accurately than the linear-regression-based model. • Uncertainties are decreased and more satisfactory interval solutions are obtained. - Abstract: To tackle nonlinear economies-of-scale (EOS) effects in interval-parameter constraints for a representative waste management problem, an inexact piecewise-linearization-based fuzzy flexible programming (IPFP) model is developed. In IPFP, interval parameters for waste amounts and transportation/operation costs can be quantified; aspiration levels for net system costs, as well as tolerance intervals for both capacities of waste treatment facilities and waste generation rates, can be reflected; and the nonlinear EOS effects transformed from the objective function to constraints can be approximated. An interactive algorithm is proposed for solving the IPFP model, which in nature is an interval-parameter mixed-integer quadratically constrained programming model. To demonstrate the IPFP's advantages, two alternative models are developed to compare their performances. One is a conventional linear-regression-based inexact fuzzy programming model (IPFP2) and the other is an IPFP model with all right-hand sides of fuzzy constraints being the corresponding interval numbers (IPFP3). The comparison results between IPFP and IPFP2 indicate that the optimized waste amounts would have similar patterns in both models. However, when dealing with EOS effects in constraints, the IPFP2 may underestimate the net system costs while the IPFP can estimate the costs more accurately.
The comparison results between IPFP and IPFP3 indicate that their solutions would be significantly different. The decreased system uncertainties in IPFP's solutions demonstrate its effectiveness for providing more satisfactory interval solutions than IPFP3. Following its first application to waste management, the IPFP can be potentially applied to other environmental problems under multiple complexities.
Duckworth, Renée A
2015-12-01
Personality traits are behaviors that show limited flexibility over time and across contexts, and thus understanding their origin requires an understanding of what limits behavioral flexibility. Here, I suggest that insight into the evolutionary origin of personality traits requires determining the relative importance of selection and constraint in producing limits to behavioral flexibility. Natural selection as the primary cause of limits to behavioral flexibility assumes that the default state of behavior is one of high flexibility and predicts that personality variation arises through evolution of buffering mechanisms to stabilize behavioral expression, whereas the constraint hypothesis assumes that the default state is one of limited flexibility and predicts that the neuroendocrine components that underlie personality variation are those most constrained in flexibility. Using recent work on the neurobiology of sensitive periods and maternal programming of offspring behavior, I show that some of the most stable aspects of the neuroendocrine system are structural components and maternally induced epigenetic effects. Evidence of numerous constraints to changes in structural features of the neuroendocrine system and far fewer constraints to flexibility of epigenetic systems suggests that structural constraints play a primary role in the origin of behavioral stability and that epigenetic programming may be more important in generating adaptive variation among individuals. © 2015 New York Academy of Sciences.
Level-Set Topology Optimization with Aeroelastic Constraints
NASA Technical Reports Server (NTRS)
Dunning, Peter D.; Stanford, Bret K.; Kim, H. Alicia
2015-01-01
Level-set topology optimization is used to design a wing considering skin buckling under static aeroelastic trim loading, as well as dynamic aeroelastic stability (flutter). The level-set function is defined over the entire 3D volume of a transport aircraft wing box. Therefore, the approach is not limited by any predefined structure and can explore novel configurations. The Sequential Linear Programming (SLP) level-set method is used to solve the constrained optimization problems. The proposed method is demonstrated using three problems with mass, linear buckling, and flutter objectives and/or constraints. A constraint aggregation method is used to handle multiple buckling constraints in the wing skins. A continuous flutter constraint formulation is used to handle difficulties arising from discontinuities in the design space caused by a switching of the critical flutter mode.
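Constraint aggregation of the kind mentioned above is commonly done with a Kreisselmeier-Steinhauser (KS) envelope, which replaces many buckling constraints g_i ≤ 0 with one smooth conservative bound. The abstract does not name the specific aggregation function used, so treat this as an assumed, representative choice:

```python
import math

def ks_aggregate(g, rho=50.0):
    """Kreisselmeier-Steinhauser aggregate of constraints g_i <= 0:
    KS = g_max + (1/rho) * ln(sum_i exp(rho * (g_i - g_max))).
    Smooth, differentiable, and a conservative upper bound on max(g),
    so many panel-buckling margins collapse into a single constraint."""
    gmax = max(g)
    return gmax + math.log(sum(math.exp(rho * (gi - gmax)) for gi in g)) / rho

g = [-0.30, -0.05, -0.12, -0.40]   # illustrative buckling margins
print(ks_aggregate(g))
```

The shift by `gmax` inside the exponential is the standard numerically stable form; raising `rho` tightens the bound toward max(g) at the cost of a stiffer gradient.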
Affordances and Constraints of a Blended Course in a Teacher Professional Development Program
ERIC Educational Resources Information Center
Bakir, Nesrin; Devers, Christopher; Hug, Barbara
2016-01-01
Using a descriptive research design approach, this study investigated the affordances and constraints of a graduate level blended course focused on science teaching and learning. Data were gathered from 24 in-service teacher interviews and surveys. Identified affordances included the structure and implementation of the course, the flexibility of…
General Constraints on Sampling Wildlife on FIA Plots
Larissa L. Bailey; John R. Sauer; James D. Nichols; Paul H. Geissler
2005-01-01
This paper reviews the constraints on sampling wildlife populations at FIA points. Wildlife sampling programs must have well-defined goals and provide information adequate to meet those goals. Investigators should choose a state variable based on information needs and the spatial sampling scale. We discuss estimation-based methods for three state variables: species...
Scheduling double round-robin tournaments with divisional play using constraint programming
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carlsson, Mats; Johansson, Mikael; Larson, Jeffrey
We study a tournament format that extends a traditional double round-robin format with divisional single round-robin tournaments. Elitserien, the top Swedish handball league, uses such a format for its league schedule. We present a constraint programming model that characterizes the general double round-robin plus divisional single round-robin format. This integrated model allows scheduling to be performed in a single step, as opposed to common multistep approaches that decompose scheduling into smaller problems and possibly miss optimal solutions. In addition to general constraints, we introduce Elitserien-specific requirements for its tournament. These general and league-specific constraints allow us to identify implicit and symmetry-breaking properties that reduce the time to solution from hours to seconds. A scalability study of the number of teams shows that our approach is reasonably fast for even larger league sizes. The experimental evaluation of the integrated approach takes considerably less computational effort to schedule Elitserien than does the previous decomposed approach. © 2016 Elsevier B.V. All rights reserved.
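The double round-robin backbone of such schedules can be generated with the classical circle method; the CP model in the paper layers league-specific constraints on top of this structure. The sketch below is generic, not the Elitserien model itself:

```python
def single_round_robin(teams):
    """Circle-method schedule: a list of rounds, each a list of pairings,
    in which every pair of teams meets exactly once."""
    ts = list(teams)
    if len(ts) % 2:
        ts.append(None)                      # dummy team = bye
    n = len(ts)
    rounds = []
    for _ in range(n - 1):
        rounds.append([(ts[i], ts[n - 1 - i]) for i in range(n // 2)
                       if ts[i] is not None and ts[n - 1 - i] is not None])
        ts = [ts[0]] + [ts[-1]] + ts[1:-1]   # rotate all but the first
    return rounds

def double_round_robin(teams):
    """Mirror the single round robin with home/away reversed."""
    first = single_round_robin(teams)
    return first + [[(b, a) for (a, b) in rnd] for rnd in first]

sched = double_round_robin(range(6))
print(len(sched), "rounds")
```

A constraint solver would then impose requirements the circle method cannot express directly, such as limits on consecutive away games or divisional phases.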
Improving the Held and Karp Approach with Constraint Programming
NASA Astrophysics Data System (ADS)
Benchimol, Pascal; Régin, Jean-Charles; Rousseau, Louis-Martin; Rueher, Michel; van Hoeve, Willem-Jan
Held and Karp have proposed, in the early 1970s, a relaxation for the Traveling Salesman Problem (TSP) as well as a branch-and-bound procedure that can solve small to modest-size instances to optimality [4, 5]. It has been shown that the Held-Karp relaxation produces very tight bounds in practice, and this relaxation is therefore applied in TSP solvers such as Concorde [1]. In this short paper we show that the Held-Karp approach can benefit from well-known techniques in Constraint Programming (CP) such as domain filtering and constraint propagation. Namely, we show that filtering algorithms developed for the weighted spanning tree constraint [3, 8] can be adapted to the context of the Held and Karp procedure. In addition to the adaptation of existing algorithms, we introduce a special-purpose filtering algorithm based on the underlying mechanisms used in Prim's algorithm [7]. Finally, we explored two different branching schemes to close the integrality gap. Our initial experimental results indicate that the addition of the CP techniques to the Held-Karp method can be very effective.
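The core of the Held-Karp relaxation is the 1-tree bound: a minimum spanning tree over all cities but one, plus the two cheapest edges back to the excluded city, which lower-bounds any tour. This sketch shows the bound (without the node-penalty subgradient iteration or the CP filtering the paper adds) on a made-up 5-city instance:

```python
from itertools import permutations

# Symmetric distance matrix for a small illustrative TSP instance.
D = [[0, 2, 9, 10, 7],
     [2, 0, 6, 4, 3],
     [9, 6, 0, 8, 5],
     [10, 4, 8, 0, 6],
     [7, 3, 5, 6, 0]]
n = len(D)

def prim_cost(nodes):
    """Weight of a minimum spanning tree over `nodes` (Prim's algorithm)."""
    nodes = list(nodes)
    in_tree, cost = {nodes[0]}, 0
    while len(in_tree) < len(nodes):
        w, v = min((D[u][x], x) for u in in_tree for x in nodes
                   if x not in in_tree)
        in_tree.add(v)
        cost += w
    return cost

def one_tree_bound():
    """Held-Karp 1-tree relaxation: MST on nodes 1..n-1 plus the two
    cheapest edges incident to node 0. A lower bound on any tour."""
    return prim_cost(range(1, n)) + sum(sorted(D[0][j] for j in range(1, n))[:2])

def optimal_tour():
    """Brute-force optimum, for comparison on this tiny instance."""
    return min(D[0][p[0]] + sum(D[p[i]][p[i + 1]] for i in range(len(p) - 1))
               + D[p[-1]][0] for p in permutations(range(1, n)))

print(one_tree_bound(), optimal_tour())
```

The filtering idea in the paper exploits exactly this structure: edges that cannot appear in any near-minimal 1-tree can be pruned from the variable domains.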
NASA Astrophysics Data System (ADS)
Sun, Jingliang; Liu, Chunsheng
2018-01-01
In this paper, the problem of intercepting a manoeuvring target within a fixed final time is posed in a non-linear constrained zero-sum differential game framework. The Nash equilibrium solution is found by solving the finite-horizon constrained differential game problem via adaptive dynamic programming technique. Besides, a suitable non-quadratic functional is utilised to encode the control constraints into a differential game problem. The single critic network with constant weights and time-varying activation functions is constructed to approximate the solution of associated time-varying Hamilton-Jacobi-Isaacs equation online. To properly satisfy the terminal constraint, an additional error term is incorporated in a novel weight-updating law such that the terminal constraint error is also minimised over time. By utilising Lyapunov's direct method, the closed-loop differential game system and the estimation weight error of the critic network are proved to be uniformly ultimately bounded. Finally, the effectiveness of the proposed method is demonstrated by using a simple non-linear system and a non-linear missile-target interception system, assuming first-order dynamics for the interceptor and target.
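A common choice for the non-quadratic functional that encodes a control bound |u| ≤ λ is W(u) = 2∫₀ᵘ λ·tanh⁻¹(v/λ) dv, which grows without bound as u approaches λ. The abstract does not state its exact functional, so the form below is an assumed, standard one; the snippet checks the closed form against direct quadrature:

```python
import math

lam = 2.0   # assumed control bound |u| <= lam

def w_closed(u):
    """Closed form of W(u) = 2 * integral_0^u lam * atanh(v/lam) dv,
    namely 2*lam*u*atanh(u/lam) + lam^2 * ln(1 - (u/lam)^2).
    Finite only for |u| < lam, which encodes the control constraint."""
    return 2 * lam * u * math.atanh(u / lam) + lam**2 * math.log(1 - (u / lam)**2)

def w_numeric(u, steps=200000):
    """Midpoint-rule quadrature of the same integral."""
    h = u / steps
    return sum(2 * lam * math.atanh((i + 0.5) * h / lam)
               for i in range(steps)) * h

u = 1.5
print(w_closed(u), w_numeric(u))
```

Minimizing a cost with this integrand yields the saturated optimal control u = -λ·tanh(·), which is how the constraint enters the Hamilton-Jacobi-Isaacs equation without an explicit projection step.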
Motion coordination and programmable teleoperation between two industrial robots
NASA Technical Reports Server (NTRS)
Luh, J. Y. S.; Zheng, Y. F.
1987-01-01
Tasks for two coordinated industrial robots always bring the robots in contact with the same object. The motion coordination among the robots and the object must be maintained at all times. To plan the coordinated tasks, only one robot's motion is planned according to the required motion of the object. The motion of the second robot is to follow the first one as specified by a set of holonomic equality constraints at every time instant. If any modification of the object's motion is needed in real time, only the first robot's motion has to be modified accordingly; the modification for the second robot is done implicitly through the constraint conditions. Thus the operation is simplified. If the object is physically removed, the second robot still continually follows the first one through the constraint conditions. If the first robot is maneuvered through either the teach pendant or the keyboard, the second one moves accordingly, forming a teleoperation linked through software programming. Obviously, the second robot does not need to duplicate the first robot's motion; the programming of the constraints specifies their relative motions.
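The holonomic-constraint following described above amounts to composing the leader's pose with a fixed relative transform at every instant. This planar sketch uses an invented relative offset; the paper's constraints are general holonomic equalities, not necessarily a rigid offset:

```python
import math

def compose(p, q):
    """Compose planar poses (x, y, theta): apply pose q in p's frame."""
    x, y, th = p
    qx, qy, qth = q
    return (x + qx * math.cos(th) - qy * math.sin(th),
            y + qx * math.sin(th) + qy * math.cos(th),
            th + qth)

# Illustrative holonomic constraint: the follower is held a fixed 0.8 m
# behind the leader, at the same orientation.
REL = (-0.8, 0.0, 0.0)

def follower_pose(leader):
    return compose(leader, REL)

# Modify only the leader's trajectory; the follower tracks implicitly.
traj = [(0.1 * t, 0.05 * t, 0.02 * t) for t in range(50)]
for leader in traj:
    f = follower_pose(leader)
    gap = math.hypot(f[0] - leader[0], f[1] - leader[1])
    assert abs(gap - 0.8) < 1e-12   # constraint holds at every instant
```

Replanning the leader automatically replans the follower, which is the simplification the abstract emphasizes.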
2012-01-01
Background The global initiative ‘Treatment 2.0’ calls for expanding the evidence base of optimal HIV service delivery models to maximize HIV case detection and retention in care. However, limited systematic assessment has been conducted in countries with concentrated HIV epidemics. We aimed to assess HIV service availability and service connectedness in Vietnam. Methods We developed a new analytical framework of the continuum of prevention and care (COPC). Using the framework, we examined HIV service delivery in Vietnam. Specifically, we analyzed HIV service availability including geographical distribution and decentralization and service connectedness across multiple services and dimensions. We then identified system-related strengths and constraints in improving HIV case detection and retention in care. This was accomplished by reviewing related published and unpublished documents including existing service delivery data. Results Identified strengths included: decentralized HIV outpatient clinics that offer comprehensive care at the district level particularly in high HIV burden provinces; functional chronic care management for antiretroviral treatment (ART) with the involvement of people living with HIV and the links to community- and home-based care; HIV testing and counseling integrated into tuberculosis and antenatal care services in districts supported by donor-funded projects, and extensive peer outreach networks that reduce barriers for the most-at-risk populations to access services.
Constraints included: fragmented local coordination mechanisms for HIV-related health services; lack of systems to monitor the expansion of HIV outpatient clinics that offer comprehensive care; underdevelopment of pre-ART care; insufficient linkage from HIV testing and counseling to pre-ART care; inadequate access to HIV-related services in districts not supported by donor-funded projects particularly in middle and low burden provinces and in mountainous remote areas; and no systematic monitoring of referral services. Conclusions Our COPC analytical framework was instrumental in identifying system-related strengths and constraints that contribute to HIV case detection and retention in care. The national HIV program plans to strengthen provincial programming by re-defining various service linkages and accelerate the transition from project-based approach to integrated service delivery in line with the ‘Treatment 2.0’ initiative. PMID:23272730
Fujita, Masami; Poudel, Krishna C; Do, Thi Nhan; Bui, Duc Duong; Nguyen, Van Kinh; Green, Kimberly; Nguyen, Thi Minh Thu; Kato, Masaya; Jacka, David; Cao, Thi Thanh Thuy; Nguyen, Thanh Long; Jimba, Masamine
2012-12-29
The global initiative 'Treatment 2.0' calls for expanding the evidence base of optimal HIV service delivery models to maximize HIV case detection and retention in care. However, limited systematic assessment has been conducted in countries with concentrated HIV epidemics. We aimed to assess HIV service availability and service connectedness in Vietnam. We developed a new analytical framework of the continuum of prevention and care (COPC). Using the framework, we examined HIV service delivery in Vietnam. Specifically, we analyzed HIV service availability including geographical distribution and decentralization and service connectedness across multiple services and dimensions. We then identified system-related strengths and constraints in improving HIV case detection and retention in care. This was accomplished by reviewing related published and unpublished documents including existing service delivery data. Identified strengths included: decentralized HIV outpatient clinics that offer comprehensive care at the district level particularly in high HIV burden provinces; functional chronic care management for antiretroviral treatment (ART) with the involvement of people living with HIV and the links to community- and home-based care; HIV testing and counseling integrated into tuberculosis and antenatal care services in districts supported by donor-funded projects, and extensive peer outreach networks that reduce barriers for the most-at-risk populations to access services.
Constraints included: fragmented local coordination mechanisms for HIV-related health services; lack of systems to monitor the expansion of HIV outpatient clinics that offer comprehensive care; underdevelopment of pre-ART care; insufficient linkage from HIV testing and counseling to pre-ART care; inadequate access to HIV-related services in districts not supported by donor-funded projects particularly in middle and low burden provinces and in mountainous remote areas; and no systematic monitoring of referral services. Our COPC analytical framework was instrumental in identifying system-related strengths and constraints that contribute to HIV case detection and retention in care. The national HIV program plans to strengthen provincial programming by re-defining various service linkages and accelerate the transition from project-based approach to integrated service delivery in line with the 'Treatment 2.0' initiative.
Sun, Wei; Huang, Guo H; Lv, Ying; Li, Gongchen
2012-06-01
To tackle nonlinear economies-of-scale (EOS) effects in interval-parameter constraints for a representative waste management problem, an inexact piecewise-linearization-based fuzzy flexible programming (IPFP) model is developed. In IPFP, interval parameters for waste amounts and transportation/operation costs can be quantified; aspiration levels for net system costs, as well as tolerance intervals for both capacities of waste treatment facilities and waste generation rates, can be reflected; and the nonlinear EOS effects transformed from the objective function to constraints can be approximated. An interactive algorithm is proposed for solving the IPFP model, which in nature is an interval-parameter mixed-integer quadratically constrained programming model. To demonstrate the IPFP's advantages, two alternative models are developed to compare their performances. One is a conventional linear-regression-based inexact fuzzy programming model (IPFP2) and the other is an IPFP model with all right-hand sides of fuzzy constraints being the corresponding interval numbers (IPFP3). The comparison results between IPFP and IPFP2 indicate that the optimized waste amounts would have similar patterns in both models. However, when dealing with EOS effects in constraints, the IPFP2 may underestimate the net system costs while the IPFP can estimate the costs more accurately. The comparison results between IPFP and IPFP3 indicate that their solutions would be significantly different. The decreased system uncertainties in IPFP's solutions demonstrate its effectiveness for providing more satisfactory interval solutions than IPFP3. Following its first application to waste management, the IPFP can be potentially applied to other environmental problems under multiple complexities. Copyright © 2012 Elsevier Ltd. All rights reserved.
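The piecewise-linearization step can be illustrated on a concave economies-of-scale cost curve: chords over several segments approximate the curve far better than one global linear fit, and a chord under a concave curve underestimates cost, consistent with the IPFP2 behavior noted above. Curve, exponent, and breakpoints here are illustrative assumptions, not the paper's data:

```python
# Illustrative EOS cost curve c(Q) = 5 * Q**0.6 (concave), replaced by
# chords over segments so a linear/MILP model can carry it in constraints.

def cost(Q):
    return 5.0 * Q ** 0.6

def piecewise(Q, breaks):
    """Evaluate the chord approximation of cost() on the segment
    containing Q."""
    for lo, hi in zip(breaks, breaks[1:]):
        if lo <= Q <= hi:
            slope = (cost(hi) - cost(lo)) / (hi - lo)
            return cost(lo) + slope * (Q - lo)
    raise ValueError("Q outside linearization range")

breaks4 = [1, 25, 50, 75, 100]   # four segments
breaks1 = [1, 100]               # one chord = crude linear surrogate
Q = 40.0
err4 = abs(piecewise(Q, breaks4) - cost(Q))
err1 = abs(piecewise(Q, breaks1) - cost(Q))
print(err4, err1)                # finer linearization is closer
```

In the full IPFP model each segment also carries a binary selection variable, which is what makes the result a mixed-integer program.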
Abo, Masahiro; Kakuda, Wataru; Momosaki, Ryo; Harashima, Hiroaki; Kojima, Miki; Watanabe, Shigeto; Sato, Toshihiro; Yokoi, Aki; Umemori, Takuma; Sasanuma, Jinichi
2014-07-01
Many poststroke patients suffer functional motor limitation of the affected upper limb, which is associated with diminished health-related quality of life. The aim of this study is to conduct a randomized, multicenter, comparative study of low-frequency repetitive transcranial magnetic stimulation combined with intensive occupational therapy, NEURO (NovEl intervention Using Repetitive TMS and intensive Occupational therapy), versus constraint-induced movement therapy in poststroke patients with upper limb hemiparesis. In this randomized controlled study of NEURO and constraint-induced movement therapy, 66 poststroke patients with upper limb hemiparesis were randomly assigned at a 2:1 ratio to low-frequency repetitive transcranial magnetic stimulation plus occupational therapy (NEURO group) or constraint-induced movement therapy (constraint-induced movement therapy group) for 15 days. The Fugl-Meyer Assessment, the Wolf Motor Function Test, and the Functional Ability Score of the Wolf Motor Function Test were used for assessment. No differences in patients' characteristics were found between the two groups at baseline. The Fugl-Meyer Assessment score was significantly higher in both groups after the 15-day treatment compared with the baseline. Changes in Fugl-Meyer Assessment scores and the Functional Ability Score of the Wolf Motor Function Test were significantly higher in the NEURO group than in the constraint-induced movement therapy group, whereas the decrease in the Wolf Motor Function Test log performance time was comparable between the two groups (changes in Fugl-Meyer Assessment score, NEURO: 5·39 ± 4·28, constraint-induced movement therapy: 3·09 ± 4·50 points; mean ± standard error of the mean; P < 0·05) (changes in Functional Ability Score of the Wolf Motor Function Test, NEURO: 3·98 ± 2·99, constraint-induced movement therapy: 2·09 ± 2·96 points; P < 0·05).
The results of the 15-day rehabilitative protocol showed the superiority of NEURO relative to constraint-induced movement therapy; NEURO improved the motion of the whole upper limb and resulted in functional improvement in activities of daily living. © 2013 The Authors. International Journal of Stroke © 2013 World Stroke Organization.
RNAiFold2T: Constraint Programming design of thermo-IRES switches.
Garcia-Martin, Juan Antonio; Dotu, Ivan; Fernandez-Chamorro, Javier; Lozano, Gloria; Ramajo, Jorge; Martinez-Salas, Encarnacion; Clote, Peter
2016-06-15
RNA thermometers (RNATs) are cis-regulatory elements that change secondary structure upon temperature shift. Often involved in the regulation of heat shock, cold shock and virulence genes, RNATs constitute an interesting potential resource in synthetic biology, where engineered RNATs could prove to be useful tools in biosensors and conditional gene regulation. Solving the 2-temperature inverse folding problem is critical for RNAT engineering. Here we introduce RNAiFold2T, the first Constraint Programming (CP) and Large Neighborhood Search (LNS) algorithms to solve this problem. Benchmarking tests of RNAiFold2T against existing inverse folding programs (adaptive walk and genetic algorithm) show that our software generates two orders of magnitude more solutions, thus allowing ample exploration of the space of solutions. Subsequently, solutions can be prioritized by computing various measures, including probability of target structure in the ensemble, melting temperature, etc. Using this strategy, we rationally designed two thermosensor internal ribosome entry site (thermo-IRES) elements, whose normalized cap-independent translation efficiency is approximately 50% greater at 42 °C than at 30 °C, when tested in reticulocyte lysates. Translation efficiency is lower than that of the wild-type IRES element, which on the other hand is fully resistant to temperature shift-up. This appears to be the first purely computational design of functional RNA thermoswitches, and certainly the first purely computational design of functional thermo-IRES elements. RNAiFold2T is publicly available as part of the new release RNAiFold3.0 at https://github.com/clotelab/RNAiFold and http://bioinformatics.bc.edu/clotelab/RNAiFold, the latter of which also provides a web server. The software is written in C++ and uses the OR-Tools CP search engine. clote@bc.edu Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
Incorporation of physical constraints in optimal surface search for renal cortex segmentation
NASA Astrophysics Data System (ADS)
Li, Xiuli; Chen, Xinjian; Yao, Jianhua; Zhang, Xing; Tian, Jie
2012-02-01
In this paper, we propose a novel approach for multiple surfaces segmentation based on the incorporation of physical constraints in optimal surface searching. We apply our new approach to solve the renal cortex segmentation problem, an important but not sufficiently researched issue. In this study, in order to better handle the intensity proximity of the renal cortex and renal column, we extend the optimal surface search approach to allow for varying sampling distance and physical separation constraints, instead of the traditional fixed sampling distance and numerical separation constraints. The sampling distance of each vertex-column is computed according to the sparsity of the local triangular mesh. Then the physical constraint learned from a priori renal cortex thickness is applied to the inter-surface arcs as the separation constraints. Appropriate varying sampling distance and separation constraints were learned from 6 clinical CT images. After training, the proposed approach was tested on a test set of 10 images. The manual segmentation of renal cortex was used as the reference standard. Quantitative analysis of the segmented renal cortex indicates that overall segmentation accuracy was increased after introducing the varying sampling distance and physical separation constraints (the average true positive volume fraction (TPVF) and false positive volume fraction (FPVF) were 83.96% and 2.80%, respectively, by using varying sampling distance and physical separation constraints compared to 74.10% and 0.18%, respectively, by using fixed sampling distance and numerical separation constraints). The experimental results demonstrated the effectiveness of the proposed approach.
Thermal-Aware Test Access Mechanism and Wrapper Design Optimization for System-on-Chips
NASA Astrophysics Data System (ADS)
Yu, Thomas Edison; Yoneda, Tomokazu; Chakrabarty, Krishnendu; Fujiwara, Hideo
Rapid advances in semiconductor manufacturing technology have led to higher chip power densities, which place greater emphasis on packaging and temperature control during testing. For system-on-chips, peak power-based scheduling algorithms have been used to optimize tests under specified power constraints. However, imposing power constraints does not always solve the problem of overheating due to the non-uniform distribution of power across the chip. This paper presents a TAM/Wrapper co-design methodology for system-on-chips that ensures thermal safety while still optimizing the test schedule. The method combines a simplified thermal-cost model with a traditional bin-packing algorithm to minimize test time while satisfying temperature constraints. Furthermore, for temperature checking, thermal simulation is done using cycle-accurate power profiles for more realistic results. Experiments show that even a minimal sacrifice in test time can yield a considerable decrease in test temperature as well as the possibility of further lowering temperatures beyond those achieved using traditional power-based test scheduling.
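The bin-packing step can be illustrated with a minimal first-fit-decreasing sketch that groups core tests into parallel sessions under a peak-power cap. This is a simplification of the paper's TAM/wrapper co-design (it ignores the thermal-cost model and wrapper configuration entirely), and all test names and numbers below are invented.

```python
def schedule_tests(tests, power_cap):
    """First-fit-decreasing sketch: pack (name, time, power) core tests
    into parallel sessions so the summed power of each session never
    exceeds power_cap."""
    sessions = []  # each entry: [used_power, list_of_tests]
    for name, time, power in sorted(tests, key=lambda t: -t[2]):
        for session in sessions:
            if session[0] + power <= power_cap:
                session[0] += power
                session[1].append((name, time, power))
                break
        else:  # no existing session has room: open a new one
            sessions.append([power, [(name, time, power)]])
    return [s[1] for s in sessions]

plan = schedule_tests(
    [("core_a", 90, 6), ("core_b", 70, 5), ("ram_bist", 40, 4), ("io_scan", 30, 3)],
    power_cap=10)
```

A thermally aware scheduler would additionally reject packings whose simulated temperature exceeds a limit, trading a slightly longer schedule for lower peak temperature, which is the paper's central point.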
NASA Astrophysics Data System (ADS)
Gutin, Gregory; Kim, Eun Jung; Soleimanfallah, Arezou; Szeider, Stefan; Yeo, Anders
The NP-hard general factor problem asks, given a graph and for each vertex a list of integers, whether the graph has a spanning subgraph where each vertex has a degree that belongs to its assigned list. The problem remains NP-hard even if the given graph is bipartite with partition U ⊎ V, and each vertex in U is assigned the list {1}; this subproblem appears in the context of constraint programming as the consistency problem for the extended global cardinality constraint. We show that this subproblem is fixed-parameter tractable when parameterized by the size of the second partite set V. More generally, we show that the general factor problem for bipartite graphs, parameterized by |V |, is fixed-parameter tractable as long as all vertices in U are assigned lists of length 1, but becomes W[1]-hard if vertices in U are assigned lists of length at most 2. We establish fixed-parameter tractability by reducing the problem instance to a bounded number of acyclic instances, each of which can be solved in polynomial time by dynamic programming.
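For concreteness, the general factor problem itself can be stated as an exponential-time brute-force check over edge subsets (illustration of the problem definition only; the fixed-parameter algorithm in the paper is far more refined):

```python
from itertools import chain, combinations

def has_general_factor(edges, lists):
    """Exponential brute force: does some spanning subgraph give every
    vertex v a degree contained in lists[v]?"""
    edges = list(edges)
    subsets = chain.from_iterable(
        combinations(edges, r) for r in range(len(edges) + 1))
    for subset in subsets:
        degree = {v: 0 for v in lists}
        for u, v in subset:
            degree[u] += 1
            degree[v] += 1
        if all(degree[v] in lists[v] for v in lists):
            return True
    return False

# Star with center "c": feasible if c may have degree 2 and each leaf 0 or 1.
star = [("c", 1), ("c", 2), ("c", 3)]
```

In the global cardinality setting described above, one side of the bipartition plays the role of value vertices and the {1}-lists encode that every variable takes exactly one value.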
ERIC Educational Resources Information Center
Grossman, Michael; Schortgen, Francis
2016-01-01
This article offers insights into the overall program development process and--institutional obstacles and constraints notwithstanding--successful introduction of a new national security program at a small liberal arts university at a time of growing institutional prioritization of science, technology, engineering, and mathematics (STEM) programs.…
Ji, Eun-Kyu; Lee, Sang-Heon
2016-11-01
[Purpose] The purpose of this study was to investigate the effects of virtual reality training combined with modified constraint-induced movement therapy on upper extremity motor function recovery in acute stage stroke patients. [Subjects and Methods] Four acute stage stroke patients participated in the study. A multiple baseline single subject experimental design was utilized. Modified constraint-induced movement therapy was used according to the EXplaining PLastICITy after stroke protocol during baseline sessions. Virtual reality training with modified constraint-induced movement therapy was applied during treatment sessions. The Manual Function Test and the Box and Block Test were used to measure upper extremity function before every session. [Results] The subjects' upper extremity function improved during the intervention period. [Conclusion] Virtual reality training combined with modified constraint-induced movement is effective for upper extremity function recovery in acute stroke patients.
Unifying Model-Based and Reactive Programming within a Model-Based Executive
NASA Technical Reports Server (NTRS)
Williams, Brian C.; Gupta, Vineet; Norvig, Peter (Technical Monitor)
1999-01-01
Real-time, model-based, deduction has recently emerged as a vital component in AI's tool box for developing highly autonomous reactive systems. Yet one of the current hurdles towards developing model-based reactive systems is the number of methods simultaneously employed, and their corresponding melange of programming and modeling languages. This paper offers an important step towards unification. We introduce RMPL, a rich modeling language that combines probabilistic, constraint-based modeling with reactive programming constructs, while offering a simple semantics in terms of hidden state Markov processes. We introduce probabilistic, hierarchical constraint automata (PHCA), which allow Markov processes to be expressed in a compact representation that preserves the modularity of RMPL programs. Finally, a model-based executive, called Reactive Burton, is described that exploits this compact encoding to perform efficient simulation, belief state update and control sequence generation.
Mission Possible: BioMedical Experiments on the Space Shuttle
NASA Technical Reports Server (NTRS)
Bopp, E.; Kreutzberg, K.
2011-01-01
Biomedical research, both applied and basic, was conducted on every Shuttle mission from 1981 to 2011. The Space Shuttle Program enabled NASA investigators and researchers from around the world to address fundamental issues concerning living and working effectively in space. Operationally focused occupational health investigations and tests were given priority by the Shuttle crew and Shuttle Program management for the resolution of acute health issues caused by the rigors of spaceflight. The challenges of research on the Shuttle included: limited upmass and return mass, limited power, limited crew time, and requirements for containment of hazards. The sheer capacity of the Shuttle for crew and equipment was unsurpassed by any other launch and entry vehicle, and the Shuttle Program provided more opportunity for human research than any program before or since. To take advantage of this opportunity, life sciences research programs learned how to: streamline the complicated process of integrating experiments aboard the Shuttle, design experiments and hardware within operational constraints, and integrate requirements between different experiments and with operational countermeasures. We learned how to take advantage of commercial-off-the-shelf hardware and developed a hardware certification process with the flexibility to allow for design changes between flights. We learned the importance of end-to-end testing for experiment hardware with humans-in-the-loop. Most importantly, we learned that the Shuttle Program provided an excellent platform for conducting human research and for developing the systems that are now used to optimize research on the International Space Station. This presentation will include a review of the types of experiments and medical tests flown on the Shuttle and the processes that were used to manifest and conduct the experiments.
Learning Objective: This paper provides a description of the challenges related to launching and implementing biomedical experiments aboard the Space Shuttle.
Marriageable Women: A Focus on Participants in a Community Healthy Marriage Program
Manning, Wendy D.; Trella, Deanna; Lyons, Heidi; Toit, Nola Cora Du
2012-01-01
Although disadvantaged women are the targets of marriage programs, little attention has been paid to women's marriage constraints and their views of marriage. Drawing on an exchange framework and using qualitative data collected from single women participating in a marriage initiative, we introduce the concept of marriageable women—the notion that certain limitations may make women poor marriage partners. We find that, like their male counterparts, women also possess qualities that are not considered assets in the marriage market, such as economic constraints, mental and physical health issues, substance use, multiple partner fertility, and gender distrust. We also consider how women participating in a marriage program frame their marriage options, while a few opt out of the marriage market altogether. PMID:23258947
An Integer Programming Model for the Management of a Forest in the North of Portugal
NASA Astrophysics Data System (ADS)
Cerveira, Adelaide; Fonseca, Teresa; Mota, Artur; Martins, Isabel
2011-09-01
This study aims to develop an approach for the management of a forest of maritime pine located in the north region of Portugal. The forest is classified into five public lands, the so-called baldios, extending over 4432 ha. These baldios are co-managed by the Official Forest Services and the local communities mainly for timber production purposes. The forest planning involves non-spatial and spatial constraints. Spatial constraints dictate a maximum clearcut area and an exclusion time. An integer programming model is presented and the computational results are discussed.
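The spatial side of such a model can be illustrated with a toy exhaustive search: each stand is assigned a harvest period (or none), and plans in which two adjacent stands are clearcut in the same period with a combined area over the maximum are rejected. The paper formulates this as an integer program solved properly; the stand data below are invented for illustration.

```python
from itertools import product

def best_harvest_plan(areas, adjacency, volumes, periods, max_clearcut):
    """Exhaustive sketch: assign each stand a harvest period (or None),
    maximizing harvested volume while rejecting plans in which two
    adjacent stands clearcut in the same period jointly exceed
    max_clearcut hectares."""
    stands = list(areas)
    index = {s: k for k, s in enumerate(stands)}
    best_vol, best_plan = 0.0, None
    for plan in product([None] + list(range(periods)), repeat=len(stands)):
        ok = True
        for i, j in adjacency:
            pi, pj = plan[index[i]], plan[index[j]]
            if pi is not None and pi == pj and areas[i] + areas[j] > max_clearcut:
                ok = False
                break
        if ok:
            vol = sum(volumes[s] for s, p in zip(stands, plan) if p is not None)
            if vol > best_vol:
                best_vol, best_plan = vol, dict(zip(stands, plan))
    return best_vol, best_plan

# Hypothetical data: stands "a" and "b" are adjacent and together exceed
# the 50 ha clearcut limit, so they must be harvested in different periods.
vol, plan = best_harvest_plan({"a": 30, "b": 30, "c": 10}, [("a", "b")],
                              {"a": 5.0, "b": 4.0, "c": 1.0},
                              periods=2, max_clearcut=50)
```

Enumeration is exponential in the number of stands, which is exactly why real instances over thousands of hectares require the integer programming formulation the study presents.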
Redshift drift constraints on holographic dark energy
NASA Astrophysics Data System (ADS)
He, Dong-Ze; Zhang, Jing-Fei; Zhang, Xin
2017-03-01
The Sandage-Loeb (SL) test is a promising method for probing dark energy because it measures the redshift drift in the spectra of the Lyman-α forest of distant quasars, covering the "redshift desert" of 2 ≲ z ≲ 5, which is not covered by existing cosmological observations. Therefore, it could provide an important supplement to current cosmological observations. In this paper, we explore the impact of the SL test on the precision of cosmological constraints for two typical holographic dark energy models, i.e., the original holographic dark energy (HDE) model and the Ricci holographic dark energy (RDE) model. To avoid data inconsistency, we use the best-fit models based on current combined observational data as the fiducial models to simulate 30 mock SL test data points. The results show that the SL test can effectively break the existing strong degeneracy between the present-day matter density Ωm0 and the Hubble constant H0 in other cosmological observations. For the two dark energy models considered, not only can a 30-year observation of the SL test improve the constraint precision of Ωm0 and h dramatically, but it can also enhance the constraint precision of the model parameters c and α significantly.
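The SL signal itself has a simple closed form: for a source at redshift z observed over a span Δt, the drift is Δz = [(1+z)H0 − H(z)]Δt, observed spectroscopically as a velocity shift Δv = cΔz/(1+z). A sketch for flat ΛCDM follows; the parameter values are illustrative round numbers, not the paper's best fits for the HDE or RDE models.

```python
import math

C_CM_S = 2.99792458e10  # speed of light in cm/s

def hubble(z, h=0.7, omega_m=0.3):
    """H(z) in km/s/Mpc for flat LambdaCDM (illustrative parameters)."""
    return 100.0 * h * math.sqrt(omega_m * (1 + z) ** 3 + 1 - omega_m)

def sl_velocity_drift(z, years=30.0, h=0.7, omega_m=0.3):
    """Spectroscopic velocity drift dv = c*dz/(1+z) in cm/s accumulated
    over `years`, using dz/dt0 = (1+z)H0 - H(z)."""
    km_per_mpc = 3.0857e19
    seconds = years * 3.1557e7
    dz_dt = (100.0 * h * (1 + z) - hubble(z, h, omega_m)) / km_per_mpc  # 1/s
    return C_CM_S * dz_dt * seconds / (1 + z)
```

In ΛCDM the drift is positive at low redshift and turns negative in the matter-dominated "redshift desert", with magnitudes of only a few cm/s over 30 years, which is why ultra-stable spectrographs are needed.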
Slush Hydrogen Technology Program
NASA Technical Reports Server (NTRS)
Cady, Edwin C.
1994-01-01
A slush hydrogen (SH2) technology facility (STF) was designed, fabricated, and assembled by a contractor team of McDonnell Douglas Aerospace (MDA), Martin Marietta Aerospace Group (MMAG), and Air Products and Chemicals, Inc. (APCI). The STF consists of a slush generator which uses the freeze-thaw production process, a vacuum subsystem, a test tank which simulates the NASP vehicle, a triple point hydrogen receiver tank, a transfer subsystem, a sample bottle, a pressurization system, and a complete instrumentation and control subsystem. The STF was fabricated, checked out, and made ready for testing under this contract. The actual SH2 testing was performed under the NASP consortium following NASP teaming. Pre-STF testing verified SH2 production methods, validated special SH2 instrumentation, and performed limited SH2 pressurization and expulsion tests which demonstrated the need for gaseous helium pre-pressurization of SH2 to control pressure collapse. The STF represents cutting-edge technology development by an effective Government-Industry team under very tight cost and schedule constraints.
Torabian, Kian; Lezzar, Dalia; Piety, Nathaniel Z; George, Alex; Shevkoplyas, Sergey S
2017-09-20
Sickle cell anemia (SCA) is a genetic blood disorder that is particularly lethal in early childhood. Universal newborn screening programs and subsequent early treatment are known to drastically reduce under-five SCA mortality. However, in resource-limited settings, cost and infrastructure constraints limit the effectiveness of laboratory-based SCA screening programs. To address this limitation, our laboratory previously developed a low-cost, equipment-free, point-of-care, paper-based SCA test. Here, we improved the stability and performance of the test by replacing sodium hydrosulfite (HS), a key reducing agent in the hemoglobin solubility buffer which is not stable in aqueous solutions, with sodium metabisulfite (MS). The MS formulation of the test was compared to the HS formulation in a laboratory setting by inexperienced users (n = 3) to determine visual limit of detection (LOD), readout time, diagnostic accuracy, intra- and inter-observer agreement, and shelf life. The MS test was found to have a 10% sickle hemoglobin LOD, 21-min readout time, 97.3% sensitivity and 99.5% specificity for SCA, almost perfect intra- and inter-observer agreement, at least 24 weeks of shelf stability at room temperature, and could be packaged into self-contained, distributable test kits composed of off-the-shelf disposable components and food-grade reagents with a total cost of only $0.21 (USD).
Investment portfolio of a pension fund: Stochastic model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bosch-Princep, M.; Fontanals-Albiol, H.
1994-12-31
This paper presents a stochastic programming model that aims at getting the optimal investment portfolio of a pension fund. The model has been designed bearing in mind the liabilities of the fund to its members. The essential characteristic of the objective function and the constraints is the randomness of the coefficients and the right-hand side of the constraints, so it is necessary to use techniques of stochastic mathematical programming to get information about the amount of money that should be assigned to each sort of investment. It is important to know the risk attitude of the decision maker towards running risks. The model incorporates the relation between the different coefficients of the objective function and constraints of each period of the temporal horizon, through linear and discrete random processes. Likewise, it includes hypotheses related to Spanish law concerning Pension Funds.
IESIP - AN IMPROVED EXPLORATORY SEARCH TECHNIQUE FOR PURE INTEGER LINEAR PROGRAMMING PROBLEMS
NASA Technical Reports Server (NTRS)
Fogle, F. R.
1994-01-01
IESIP, an Improved Exploratory Search Technique for Pure Integer Linear Programming Problems, addresses the problem of optimizing an objective function of one or more variables subject to a set of confining functions or constraints by a method called discrete optimization or integer programming. Integer programming is based on a specific form of the general linear programming problem in which all variables in the objective function and all variables in the constraints are integers. While more difficult, integer programming is required for accuracy when modeling systems with small numbers of components such as the distribution of goods, machine scheduling, and production scheduling. IESIP establishes a new methodology for solving pure integer programming problems by utilizing a modified version of the univariate exploratory move developed by Robert Hooke and T.A. Jeeves. IESIP also takes some of its technique from the greedy procedure and the idea of unit neighborhoods. A rounding scheme uses the continuous solution found by traditional methods (simplex or other suitable technique) and creates a feasible integer starting point. The Hooke and Jeeves exploratory search is modified to accommodate integers and constraints and is then employed to determine an optimal integer solution from the feasible starting solution. The user-friendly IESIP allows for rapid solution of problems up to 10 variables in size (limited by DOS allocation). Sample problems compare IESIP solutions with the traditional branch-and-bound approach. IESIP is written in Borland's TURBO Pascal for IBM PC series computers and compatibles running DOS. Source code and an executable are provided. The main memory requirement for execution is 25K. This program is available on a 5.25 inch 360K MS DOS format diskette. IESIP was developed in 1990. IBM is a trademark of International Business Machines. TURBO Pascal is registered by Borland International.
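The core idea, starting from a feasible integer point and making unit-step exploratory moves in the spirit of Hooke and Jeeves, can be sketched as follows. This is a simplified coordinate-wise variant in Python (the actual program is in TURBO Pascal), with a hypothetical objective and feasibility test, not IESIP's full rounding-plus-pattern-move procedure.

```python
def integer_exploratory_search(f, x0, feasible, max_iter=1000):
    """Unit-neighborhood exploratory search: from integer-feasible x0,
    repeatedly try +1/-1 steps in each coordinate, keeping any feasible
    improving move, until no single step improves f."""
    x = list(x0)
    best = f(x)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for step in (1, -1):
                trial = x[:]
                trial[i] += step
                if feasible(trial) and f(trial) < best:
                    x, best = trial, f(trial)
                    improved = True
                    break
        if not improved:
            break  # local optimum over the unit neighborhood
    return x, best

# Hypothetical instance: minimize (x-3)^2 + (y-5)^2 with x, y >= 0 and
# x + y <= 7; the unconstrained optimum (3, 5) is infeasible.
obj = lambda v: (v[0] - 3) ** 2 + (v[1] - 5) ** 2
feas = lambda v: v[0] >= 0 and v[1] >= 0 and v[0] + v[1] <= 7
x, best = integer_exploratory_search(obj, [0, 0], feas)
```

Like IESIP, such a search only guarantees a local optimum over the unit neighborhood; branch-and-bound remains the standard for certified global optima.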
Heuristic Constraint Management Methods in Multidimensional Adaptive Testing
ERIC Educational Resources Information Center
Born, Sebastian; Frey, Andreas
2017-01-01
Although multidimensional adaptive testing (MAT) has been proven to be highly advantageous with regard to measurement efficiency when several highly correlated dimensions are measured, there are few operational assessments that use MAT. This may be due to issues of constraint management, which is more complex in MAT than it is in unidimensional…
Marco A. Contreras; Woodam Chung; Greg Jones
2008-01-01
Forest transportation planning problems (FTPP) have evolved from considering only the financial aspects of timber management to more holistic problems that also consider the environmental impacts of roads. These additional requirements have introduced side constraints, making FTPP larger and more complex. Mixed-integer programming (MIP) has been used to solve FTPP, but...
Management as the enabling technology for space exploration
NASA Technical Reports Server (NTRS)
Mandell, Humboldt C., Jr.; Griffin, Michael D.
1992-01-01
This paper addresses the dilemma which NASA faces in starting a major new initiative within the constraints of the current national budget. It addresses the fact that, unlike previous NASA programs, the major mission constraints come from management factors as opposed to technologies. An action plan is presented, along with some results from early management simplification processes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morris, F. A.; Sawyer, C. H.; Maxwell, J. H.
1979-10-01
The Regional Assessments Division in the US Department of Energy (DOE) has undertaken a program to assess the probable consequences of various national energy policies in regions of the United States and to evaluate the constraints on national energy policy imposed by conditions in these regions. The program is referred to as the Regional Issues Identification and Assessment (RIIA) Program. Currently the RIIA Program is evaluating the Trendlong Mid-Mid scenario, a pattern of energy development for 1985 and 1990 derived from the Project Independence Evaluation System (PIES) model. This scenario assumes a medium annual growth rate in both the national demand for and national supply of energy. It has been disaggregated to specify the generating capacity to be supplied by each energy source in each state. Pacific Northwest Laboratory (PNL) has the responsibility for evaluating the scenario for the Federal Region 10, consisting of Alaska, Idaho, Oregon, and Washington. PNL is identifying impacts and constraints associated with realizing the scenario in a variety of categories, including air and water quality impacts, health and safety effects, and socioeconomic impacts. This report summarizes the analysis of one such category: institutional constraints - defined to include legal, organizational, and political barriers to the achievement of the scenario in the Northwest.
Advanced Concepts Research for Flywheel Technology Applications
NASA Technical Reports Server (NTRS)
Keith, Theo G., Jr.; Wagner, Robert
2004-01-01
The Missile Defense Agency (MDA) (formerly the Ballistic Missile Defense Organization) is embarking on a program to employ the use of High Altitude Airships (HAAs) for surveillance of coastal areas as a part of homeland defense. It is envisioned that these HAAs will fly at 70,000 feet continuously for at least a year, therefore requiring a regenerative electric power system. As part of a program to entice the MDA to utilize the NASA GRC expertise in electric power and propulsion as a means of risk reduction, an internal study program was performed to examine possible configurations that may be employed on a HAA to meet a theoretical surveillance need. This entailed the development of a set of program requirements which were flowed down to system and subsystem level requirements as well as the identification of environmental and infrastructure constraints. Such infrastructure constraints include the ability to construct a reasonably sized HAA within existing airship hangars, as the size of such vehicles could reach in excess of 600 ft. The issues regarding environments at this altitude are similar to those that would be imposed on a satellite in low Earth orbit. Additionally, operational constraints due to high winds at certain times of the year were also examined to determine options that could allow year-round coverage of the US coast.
Design of nucleic acid sequences for DNA computing based on a thermodynamic approach
Tanaka, Fumiaki; Kameda, Atsushi; Yamamoto, Masahito; Ohuchi, Azuma
2005-01-01
We have developed an algorithm for designing multiple sequences of nucleic acids that have a uniform melting temperature between the sequence and its complement and that do not hybridize non-specifically with each other based on the minimum free energy (ΔGmin). Sequences that satisfy these constraints can be utilized in computations, various engineering applications such as microarrays, and nano-fabrications. Our algorithm is a random generate-and-test algorithm: it generates a candidate sequence randomly and tests whether the sequence satisfies the constraints. The novelty of our algorithm is that the filtering method uses a greedy search to calculate ΔGmin. This effectively excludes inappropriate sequences before ΔGmin is calculated, thereby reducing computation time drastically when compared with an algorithm without the filtering. Experimental results in silico showed the superiority of the greedy search over the traditional approach based on the Hamming distance. In addition, experimental results in vitro demonstrated that the experimental free energy (ΔGexp) of 126 sequences correlated better with ΔGmin (|R| = 0.90) than with the Hamming distance (|R| = 0.80). These results validate the rationality of a thermodynamic approach. We implemented our algorithm in a graphical user interface-based program written in Java. PMID:15701762
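The random generate-and-test scheme can be sketched with a toy version that uses GC content as a crude stand-in for a uniform melting temperature and the Hamming-distance filter the paper compares against. The actual algorithm filters on ΔGmin via a greedy search, which is not reproduced here; all parameter values below are illustrative.

```python
import random

def design_sequences(n, length, gc_target, min_hamming, seed=0):
    """Random generate-and-test: keep a candidate only if its GC count
    equals gc_target (a crude melting-temperature surrogate) and it is
    at Hamming distance >= min_hamming from every accepted sequence."""
    rng = random.Random(seed)

    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))

    accepted = []
    while len(accepted) < n:
        cand = "".join(rng.choice("ACGT") for _ in range(length))
        if cand.count("G") + cand.count("C") != gc_target:
            continue  # fails the uniform-Tm surrogate constraint
        if all(hamming(cand, s) >= min_hamming for s in accepted):
            accepted.append(cand)
    return accepted

seqs = design_sequences(5, 12, 6, 4)
```

The paper's point is precisely that such distance-based surrogates correlate with hybridization behavior less well than a thermodynamic ΔGmin filter does.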
What Sensing Tells Us: Towards a Formal Theory of Testing for Dynamical Systems
NASA Technical Reports Server (NTRS)
McIlraith, Sheila; Scherl, Richard
2005-01-01
Just as actions can have indirect effects on the state of the world, so too can sensing actions have indirect effects on an agent's state of knowledge. In this paper, we investigate "what sensing actions tell us", i.e., what an agent comes to know indirectly from the outcome of a sensing action, given knowledge of its actions and state constraints that hold in the world. To this end, we propose a formalization of the notion of testing within a dialect of the situation calculus that includes knowledge and sensing actions. Realizing this formalization requires addressing the ramification problem for sensing actions. We formalize simple tests as sensing actions. Complex tests are expressed in the logic programming language Golog. We examine what it means to perform a test, and how the outcome of a test affects an agent's state of knowledge. Finally, we propose automated reasoning techniques for test generation and complex-test verification, under certain restrictions. The work presented in this paper is relevant to a number of application domains including diagnostic problem solving, natural language understanding, plan recognition, and active vision.
Han, Feifei
2017-01-01
While some first language (L1) reading models suggest that inefficient word recognition and small working memory tend to inhibit higher-level comprehension processes, the Compensatory Encoding Model maintains that slow word recognition and small working memory do not normally hinder reading comprehension, as readers are able to use metacognitive strategies to compensate for inefficient word recognition and working memory limitations as long as they process a reading task without time constraint. Although empirical evidence has accumulated in support of the Compensatory Encoding Model in L1 reading, there is a lack of research testing the Compensatory Encoding Model in foreign language (FL) reading. This research empirically tested the Compensatory Encoding Model in English reading among Chinese college English language learners (ELLs). Two studies were conducted. Study one tested whether reading conditions varying in time constraint affect the relationship between word recognition, working memory, and reading comprehension. Students were tested on a computerized English word recognition test, a computerized Operation Span task, and reading comprehension under time-constraint and non-time-constraint reading conditions. The correlation and regression analyses showed that the association between word recognition, working memory, and reading comprehension was much stronger in the time-constraint condition than in the non-time-constraint condition. Study two examined whether FL readers were able to use metacognitive reading strategies to compensate for inefficient word recognition and working memory limitations in non-time-constraint reading. The participants were tested on the same computerized English word recognition test and Operation Span test. They were required to think aloud while reading and to complete the comprehension questions.
The think-aloud protocols were coded for concurrent use of reading strategies, classified into language-oriented strategies, content-oriented strategies, re-reading, pausing, and meta-comment. The correlation analyses showed that while word recognition and working memory were only significantly related to frequency of language-oriented strategies, re-reading, and pausing, but not with reading comprehension. Jointly viewed, the results of the two studies, complimenting each other, supported the applicability of the Compensatory Encoding Model in FL reading with Chinese college ELLs. PMID:28522984
Carroll, Raymond J; Delaigle, Aurore; Hall, Peter
2011-03-01
In many applications we can expect that, or are interested to know if, a density function or a regression curve satisfies some specific shape constraints. For example, when the explanatory variable, X, represents the value taken by a treatment or dosage, the conditional mean of the response, Y, is often anticipated to be a monotone function of X. Indeed, if this regression mean is not monotone (in the appropriate direction) then the medical or commercial value of the treatment is likely to be significantly curtailed, at least for values of X that lie beyond the point at which monotonicity fails. In the case of a density, common shape constraints include log-concavity and unimodality. If we can correctly guess the shape of a curve, then nonparametric estimators can be improved by taking this information into account. Addressing such problems requires a method for testing the hypothesis that the curve of interest satisfies a shape constraint, and, if the conclusion of the test is positive, a technique for estimating the curve subject to the constraint. Nonparametric methodology for solving these problems already exists, but only in cases where the covariates are observed precisely. However, in many problems, data can only be observed with measurement errors, and the methods employed in the error-free case typically do not carry over to this error context. In this paper we develop a novel approach to hypothesis testing and function estimation under shape constraints, which is valid in the context of measurement errors. Our method is based on tilting an estimator of the density or the regression mean until it satisfies the shape constraint, and we take as our test statistic the distance through which it is tilted. Bootstrap methods are used to calibrate the test. The constrained curve estimators that we develop are also based on tilting, and in that context our work has points of contact with methodology in the error-free case.
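The tilting idea can be illustrated in miniature with a monotonicity constraint: project an unconstrained fit onto the monotone cone and use the squared distance moved as the test statistic. This is only a sketch under simplifying assumptions; the paper tilts density and regression estimators through data weights and calibrates the test with the bootstrap, both of which are omitted here.

```python
def pava(y):
    # Pool-adjacent-violators: L2 projection of a sequence onto the
    # nondecreasing (monotone) cone.
    means, weights = [], []
    for v in y:
        means.append(float(v))
        weights.append(1)
        while len(means) > 1 and means[-2] > means[-1]:
            m2, w2 = means.pop(), weights.pop()
            m1, w1 = means.pop(), weights.pop()
            means.append((m1 * w1 + m2 * w2) / (w1 + w2))
            weights.append(w1 + w2)
    fit = []
    for m, w in zip(means, weights):
        fit.extend([m] * w)
    return fit

def tilt_distance(y):
    # Test statistic: how far the unconstrained fit must move ("tilt")
    # before it satisfies the monotonicity constraint.
    return sum((a - b) ** 2 for a, b in zip(pava(y), y))

# A nearly monotone fitted curve needs only a small tilt:
stat = tilt_distance([1.0, 3.0, 2.0, 4.0])
```

For a nearly monotone fit the statistic is small, so a bootstrap-calibrated test would be unlikely to reject the shape constraint.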
A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing
NASA Technical Reports Server (NTRS)
Takaki, Mitsuo; Cavalcanti, Diego; Gheyi, Rohit; Iyoda, Juliano; dAmorim, Marcelo; Prudencio, Ricardo
2009-01-01
The complexity of constraints is a major obstacle for constraint-based software verification. Automatic constraint solvers are fundamentally incomplete: input constraints often build on some undecidable theory or some theory the solver does not support. This paper proposes and evaluates several randomized solvers to address this issue. We compare the effectiveness of a symbolic solver (CVC3), a random solver, three hybrid solvers (i.e., mixes of random and symbolic), and two heuristic search solvers. We evaluate the solvers on two benchmarks: one consisting of manually generated constraints and another generated with a concolic execution of 8 subjects. In addition to fully decidable constraints, the benchmarks include constraints with non-linear integer arithmetic, integer modulo and division, bitwise arithmetic, and floating-point arithmetic. As expected, symbolic solving (in particular, CVC3) subsumes the other solvers for the concolic execution of subjects that only generate decidable constraints. For the remaining subjects, the solvers are complementary.
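The flavor of a purely random solver, the weakest baseline in such a comparison, can be sketched as follows. The constraint set, variable ranges, and retry budget here are illustrative assumptions, not the tool's actual interface.

```python
import random

def random_solver(constraints, variables, lo=-20, hi=20, max_tries=10_000, seed=0):
    # Sample integer assignments uniformly at random until every constraint
    # predicate holds, or the retry budget is exhausted.
    rng = random.Random(seed)
    for _ in range(max_tries):
        env = {v: rng.randint(lo, hi) for v in variables}
        if all(c(env) for c in constraints):
            return env
    return None

# A constraint mixing non-linear integer arithmetic and modulo, the kind of
# formula a purely symbolic solver may fail to decide:
constraints = [
    lambda e: e["x"] * e["y"] == 24,
    lambda e: e["x"] % 5 == 3,
]
solution = random_solver(constraints, ["x", "y"])
```

A hybrid solver, in the same spirit, would first fix some variables randomly and hand the simplified residue of the constraint to the symbolic solver.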
Chen, Pei-Hua
2017-05-01
This rejoinder responds to the commentary by van der Linden and Li entitled "Comment on Three-Element Item Selection Procedures for Multiple Forms Assembly: An Item Matching Approach" on the article "Three-Element Item Selection Procedures for Multiple Forms Assembly: An Item Matching Approach" by Chen. Van der Linden and Li made a strong statement calling for the cessation of test assembly heuristics development, and instead encouraged embracing mixed integer programming (MIP). This article points out the nondeterministic polynomial (NP)-hard nature of MIP problems and how solutions found using heuristics could be useful in an MIP context. Although van der Linden and Li provided several practical examples of test assembly supporting their view, the examples ignore the cases in which a slight change of constraints or item pool data might mean it would not be possible to obtain solutions as quickly as before. The article illustrates the use of heuristic solutions to improve both the performance of MIP solvers and the quality of solutions. Additional responses to the commentary by van der Linden and Li are included.
Panel discussion: prescribed burning in the 21st century
Jerry Hurley; Ishmael Messer; Stephen J. Botti; Jay Perkins; L. Dean Clark
1995-01-01
Even though many legal, social, and organizational constraints affect prescribed fire programs, the ecological and social benefits of such programs encourage their continued existence (with or without modification). The form of these programs in the next 10 to 50 years is pure speculation, but we must speculate and project the programs, as well as associated benefits...
Efficient pairwise RNA structure prediction using probabilistic alignment constraints in Dynalign
2007-01-01
Background: Joint alignment and secondary structure prediction of two RNA sequences can significantly improve the accuracy of the structural predictions. Methods addressing this problem, however, are forced to employ constraints that reduce computation by restricting the alignments and/or structures (i.e. folds) that are permissible. In this paper, a new methodology is presented for the purpose of establishing alignment constraints based on nucleotide alignment and insertion posterior probabilities. Using a hidden Markov model, posterior probabilities of alignment and insertion are computed for all possible pairings of nucleotide positions from the two sequences. These alignment and insertion posterior probabilities are additively combined to obtain probabilities of co-incidence for nucleotide position pairs. A suitable alignment constraint is obtained by thresholding the co-incidence probabilities. The constraint is integrated with Dynalign, a free energy minimization algorithm for joint alignment and secondary structure prediction. The resulting method is benchmarked against the previous version of Dynalign and against other programs for pairwise RNA structure prediction. Results: The proposed technique eliminates manual parameter selection in Dynalign and provides significant computational time savings in comparison to prior constraints in Dynalign while simultaneously providing a small improvement in the structural prediction accuracy. Savings are also realized in memory. In experiments over a 5S RNA dataset with average sequence length of approximately 120 nucleotides, the method reduces computation by a factor of 2. The method performs favorably in comparison to other programs for pairwise RNA structure prediction: yielding better accuracy, on average, and requiring significantly fewer computational resources.
Conclusion: Probabilistic analysis can be utilized to automate the determination of alignment constraints for pairwise RNA structure prediction methods in a principled fashion. These constraints can reduce the computational and memory requirements of these methods while maintaining or improving their accuracy of structural prediction. This extends the practical reach of these methods to longer sequences. The revised Dynalign code is freely available for download. PMID:17445273
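The constraint construction described above can be sketched in a few lines; the posterior values below are made-up toy numbers, not HMM output, and the threshold is arbitrary.

```python
def coincidence_mask(p_align, p_ins1, p_ins2, threshold):
    # Additively combine alignment and insertion posteriors into
    # co-incidence probabilities for each position pair (i, j), then
    # threshold to obtain the boolean alignment constraint.
    n, m = len(p_align), len(p_align[0])
    return [[p_align[i][j] + p_ins1[i][j] + p_ins2[i][j] >= threshold
             for j in range(m)] for i in range(n)]

# Toy 3x3 posteriors over pairings of positions from the two sequences:
p_align = [[0.9, 0.0, 0.0],
           [0.0, 0.8, 0.1],
           [0.0, 0.1, 0.7]]
zeros = [[0.0] * 3 for _ in range(3)]
mask = coincidence_mask(p_align, zeros, zeros, threshold=0.05)
```

Dynalign then restricts its dynamic programming to position pairs whose mask entry is true, which is the source of the reported time and memory savings.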
Estimating free-body modal parameters from tests of a constrained structure
NASA Technical Reports Server (NTRS)
Cooley, Victor M.
1993-01-01
Hardware advances in suspension technology for ground tests of large space structures provide near on-orbit boundary conditions for modal testing. Further advances in determining free-body modal properties of constrained large space structures have been made, on the analysis side, by using time domain parameter estimation and perturbing the stiffness of the constraints over multiple sub-tests. In this manner, passive suspension constraint forces, which are fully correlated and therefore not usable for spectral averaging techniques, are made effectively uncorrelated. The technique is demonstrated with simulated test data.
Evaluation Strategies in Financial Education: Evaluation with Imperfect Instruments
ERIC Educational Resources Information Center
Robinson, Lauren; Dudensing, Rebekka; Granovsky, Nancy L.
2016-01-01
Program evaluation often suffers due to time constraints, imperfect instruments, incomplete data, and the need to report standardized metrics. This article about the evaluation process for the Wi$eUp financial education program showcases the difficulties inherent in evaluation and suggests best practices for assessing program effectiveness. We…
A Case Study of the Development of an Early Retirement Program for University Faculty.
ERIC Educational Resources Information Center
Chronister, Jay L.; Trainer, Aileen
1985-01-01
In response to declining enrollments, financial constraints, younger faculties, and high tenure ratios, some institutions are considering early retirement programs to facilitate faculty turnover. A University of Virginia faculty committee reviewed several early retirement options and selected a cost-effective bridging program with ample incentives and…
Control system software, simulation, and robotic applications
NASA Technical Reports Server (NTRS)
Frisch, Harold P.
1991-01-01
All essential existing capabilities needed to create a man-machine interaction dynamics and performance (MMIDAP) capability are reviewed. The multibody system dynamics software program Order N DISCOS will be used for machine and musculo-skeletal dynamics modeling. The program JACK will be used for estimating and animating whole-body human response to given loading situations and motion constraints. The basic elements of performance (BEP) task decomposition methodologies associated with the Human Performance Institute database will be used for performance assessment. Techniques for resolving the statically indeterminate muscular load-sharing problem will be used for a detailed understanding of potential musculotendon or ligamentous fatigue, pain, discomfort, and trauma. The envisioned capability is to be used for mechanical system design, human performance assessment, extrapolation of man/machine interaction test data, biomedical engineering, and soft prototyping within a concurrent engineering (CE) system.
Real time data acquisition for expert systems in Unix workstations at Space Shuttle Mission Control
NASA Technical Reports Server (NTRS)
Muratore, John F.; Heindel, Troy A.; Murphy, Terri B.; Rasmussen, Arthur N.; Gnabasik, Mark; Mcfarland, Robert Z.; Bailey, Samuel A.
1990-01-01
A distributed system of proprietary engineering-class workstations is incorporated into NASA's Space Shuttle Mission Control Center to increase the automation of mission control. The Real-Time Data System (RTDS) allows the operator to utilize expert knowledge in the display program for system modeling and evaluation. RTDS applications are reviewed, including: (1) telemetry-animated communications schematics; (2) workstation displays of systems such as the Space Shuttle remote manipulator; and (3) a workstation emulation of shuttle flight instrumentation. The hard and soft real-time constraints, including those on computer data acquisition, are described; the support techniques for the real-time expert systems include major-frame buffers for logging and distribution as well as noise filtering. The incorporation of the workstations allows smaller programming teams to implement real-time telemetry systems that can improve operations and flight testing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knox, C.E.
A simplified flight-management descent algorithm, programmed on a small programmable calculator, was developed and flight tested. It was designed to aid the pilot in planning and executing a fuel-conservative descent to arrive at a metering fix at a time designated by the air traffic control system. The algorithm may also be used for planning fuel-conservative descents when time is not a consideration. The descent path was calculated for a constant Mach/airspeed schedule from linear approximations of airplane performance with considerations given for gross weight, wind, and nonstandard temperature effects. The flight-management descent algorithm is described. The results of flight tests flown with a T-39A (Sabreliner) airplane are presented.
Integrated Avionics System (IAS)
NASA Technical Reports Server (NTRS)
Hunter, D. J.
2001-01-01
As spacecraft designs converge toward miniaturization, and with the volumetric and mass constraints placed on avionics, programs will continue to advance the state of the art in spacecraft systems development, facing new challenges to reduce power, mass, and volume. Although new technologies have improved packaging densities, a total system packaging architecture is required that not only reduces spacecraft volume and mass budgets but also increases integration efficiency and provides the modularity and scalability to accommodate multiple missions. With these challenges in mind, a novel packaging approach incorporates solutions that provide broader environmental applications, more flexible system interconnectivity, scalability, and simplified assembly, test, and integration schemes. This paper describes the fundamental elements of the Integrated Avionics System (IAS), the Horizontally Mounted Cube (HMC) hardware design, and system and environmental test results. Additional information is contained in the original extended abstract.
A space standards application to university-class microsatellites: The UNISAT experience
NASA Astrophysics Data System (ADS)
Graziani, Filippo; Piergentili, Fabrizio; Santoni, Fabio
2010-05-01
Hands-on education is recognized as an invaluable tool to improve students' skills, to stimulate their enthusiasm and to educate them to teamwork. University-class satellite programs should be developed keeping in mind that education is the main goal and that university satellites are a unique opportunity to make the students involved familiar with all the phases of space missions. Moreover, university budgets for education programs are much lower than for industrial satellite programs. Therefore two main constraints must be respected: a time schedule fitting the student course duration and a low economic budget. These have an impact on the standards which can be followed in university-class satellite programs. In this paper university-class satellite standardization is discussed on the basis of the UNISAT program experience, reporting successful project achievements and lessons learned through unsuccessful experiences. The UNISAT program was established at the Scuola di Ingegneria Aerospaziale by the Group of Astrodynamics of the University of Rome "La Sapienza" (GAUSS) as a research and education program in which Ph.D. and graduate students have the opportunity to gain hands-on experience on small space missions. Four university satellites (UNISAT, UNISAT-2, UNISAT-3, UNISAT-4), weighing about 10 kg, have been designed, manufactured, tested and launched every two years since 2000 in the framework of this program. In the paper, after a brief overview of new GAUSS programs, an analysis of the UNISAT satellites' ground test campaign is carried out, identifying the most critical procedures and requirements to be fulfilled. Moreover, a device for low-cost satellite end-of-life disposal in low Earth orbit is presented; this system (SIRDARIA) complies with the international guidelines on space debris.
Large-scale linear programs in planning and prediction.
DOT National Transportation Integrated Search
2017-06-01
Large-scale linear programs are at the core of many traffic-related optimization problems in both planning and prediction. Moreover, many of these involve significant uncertainty, and hence are modeled using either chance constraints, or robust optim...
The detection of planetary systems from Space Station - A star observation strategy
NASA Technical Reports Server (NTRS)
Mascy, Alfred C.; Nishioka, Ken; Jorgensen, Helen; Swenson, Byron L.
1987-01-01
A 10-20-yr star-observation program for the Space Station Astrometric Telescope Facility (ATF) is proposed and evaluated by means of computer simulations. The primary aim of the program is to detect stars with planetary systems by precise determination of their motion relative to reference stars. The designs proposed for the ATF are described and illustrated; the basic parameters of the 127 stars selected for the program are listed in a table; spacecraft and science constraints, telescope slewing rates, and the possibility of limiting the program sample to stars near the Galactic equator are discussed; and the effects of these constraints are investigated by simulating 1 yr of ATF operation. Viewing all sky regions, the ATF would have 81-percent active viewing time, observing each star about 200 times (56 h) per yr; only small decrements in this performance would result from limiting the viewing field.
NASA Technical Reports Server (NTRS)
Butler, R.; Williams, F. W.
1992-01-01
A computer program for obtaining the optimum (least mass) dimensions of the kind of prismatic assemblies of laminated, composite plates which occur in advanced aerospace construction is described. Rigorous buckling analysis (derived from exact member theory) and a tailored design procedure are used to produce designs which satisfy buckling and material strength constraints and configurational requirements. Analysis is two to three orders of magnitude quicker than FEM, keeps track of all the governing modes of failure and is efficiently adapted to give sensitivities and to maintain feasibility. Tailoring encourages convergence in fewer sizing cycles than competing programs and permits start designs which are a long way from feasible and/or optimum. Comparisons with its predecessor, PASCO, show that the program is more likely to produce an optimum, will do so more quickly in some cases, and remains accurate for a wider range of problems.
Geomagnetic main field modeling using magnetohydrodynamic constraints
NASA Technical Reports Server (NTRS)
Estes, R. H.
1985-01-01
The influence of physical constraints that may be approximately satisfied by the Earth's liquid core on models of the geomagnetic main field and its secular variation is investigated. A previous report describes the methodology used to incorporate nonlinear equations of constraint into the main field model. The application of that methodology to the GSFC 12/83 field model, to test the frozen-flux hypothesis and the usefulness of incorporating magnetohydrodynamic constraints for obtaining improved geomagnetic field models, is described.
Testing deformation hypotheses by constraints on a time series of geodetic observations
NASA Astrophysics Data System (ADS)
Velsink, Hiddo
2018-01-01
In geodetic deformation analysis observations are used to identify form and size changes of a geodetic network, representing objects on the earth's surface. The network points are monitored, often continuously, because of suspected deformations. A deformation may affect many points during many epochs. The problem is that the best description of the deformation is, in general, unknown. To find it, different hypothesised deformation models have to be tested systematically for agreement with the observations. The tests have to be capable of stating with a certain probability the size of detectable deformations, and have to be datum invariant. A statistical criterion is needed to find the best deformation model. Existing methods do not fulfil these requirements. Here we propose a method that formulates the different hypotheses as sets of constraints on the parameters of a least-squares adjustment model. The constraints can relate to subsets of epochs and to subsets of points, thus combining time series analysis and congruence model analysis. The constraints are formulated as nonstochastic observations in an adjustment model of observation equations. This gives an easy way to test the constraints and to obtain a quality description. The proposed method aims to provide good discriminating power for finding the best description of a deformation and is expected to improve the quality of geodetic deformation analysis. We demonstrate the method with an elaborate example.
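A minimal sketch of the core device, constraints entered as nonstochastic observations with (near-)zero variance, is given below; the epoch data and the "no trend" hypothesis are invented for illustration.

```python
import numpy as np

def adjust_with_constraints(A, y, C, c, sigma2=1.0, eps=1e-10):
    # Formulate the hypothesis C x = c as "nonstochastic observations":
    # extra observation rows with near-zero variance, i.e. very large
    # weight, appended to the adjustment model of observation equations.
    A_aug = np.vstack([A, C])
    y_aug = np.concatenate([y, c])
    w = np.concatenate([np.full(len(y), 1.0 / sigma2),
                        np.full(len(c), 1.0 / eps)])
    N = A_aug.T @ (w[:, None] * A_aug)          # weighted normal matrix
    return np.linalg.solve(N, A_aug.T @ (w * y_aug))

# Toy deformation test: point height h = a + b*t over four epochs, under
# the hypothesised "no deformation" model b = 0.
t = np.array([0.0, 1.0, 2.0, 3.0])
h = np.array([1.00, 1.10, 0.90, 1.00])
A = np.column_stack([np.ones_like(t), t])
x_hat = adjust_with_constraints(A, h, C=np.array([[0.0, 1.0]]), c=np.array([0.0]))
```

Comparing the weighted residuals of this constrained adjustment with those of the unconstrained one then yields a test statistic for the hypothesised deformation model.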
Thermodynamic Constraints Improve Metabolic Networks.
Krumholz, Elias W; Libourel, Igor G L
2017-08-08
In pursuit of establishing a realistic metabolic phenotypic space, the reversibility of reactions is thermodynamically constrained in modern metabolic networks. The reversibility constraints follow from heuristic thermodynamic poise approximations that take anticipated cellular metabolite concentration ranges into account. Because constraints reduce the feasible space, draft metabolic network reconstructions may need more extensive reconciliation, and a larger number of genes may become essential. Notwithstanding ubiquitous application, the effect of reversibility constraints on the predictive capabilities of metabolic networks has not been investigated in detail. Instead, work has focused on the implementation and validation of the thermodynamic poise calculation itself. With the advance of fast linear programming-based network reconciliation, the effects of reversibility constraints on network reconciliation and gene essentiality predictions have become feasible to investigate and are the subject of this study. Networks with thermodynamically informed reversibility constraints yielded better gene essentiality predictions than networks constrained with randomly shuffled constraints. Unconstrained networks predicted gene essentiality as accurately as thermodynamically constrained networks, but predicted substantially fewer essential genes. Networks that were reconciled with sequence similarity data and strongly enforced reversibility constraints outperformed all other networks. We conclude that metabolic network analysis confirmed the validity of the thermodynamic constraints, and that thermodynamic poise information is actionable during network reconciliation. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
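The effect the study measures can be seen in a deliberately tiny example: for a linear pathway at steady state, flux bounds encode reversibility, and a wrongly directed constraint (as random shuffling would produce) collapses the feasible flux. This toy computation stands in for flux balance analysis on a real network, and the bounds are invented.

```python
def chain_max_flux(bounds):
    # In a linear pathway at steady state every reaction carries the same
    # flux, so the maximal achievable flux is the upper end of the
    # intersection of all per-reaction bounds (None if that intersection
    # is empty).
    lo = max(b[0] for b in bounds)
    hi = min(b[1] for b in bounds)
    return hi if hi >= lo else None

# Thermodynamically informed: all three reactions irreversible, lower bound 0.
informed = [(0, 10), (0, 10), (0, 10)]
# A shuffled constraint wrongly forces the middle reaction backwards:
shuffled = [(0, 10), (-10, 0), (0, 10)]
```

With the informed bounds the pathway carries flux; with the shuffled bound it carries none, so downstream functions appear blocked, which is one way mis-assigned reversibility degrades essentiality predictions.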
A Simulation-Optimization Model for the Management of Seawater Intrusion
NASA Astrophysics Data System (ADS)
Stanko, Z.; Nishikawa, T.
2012-12-01
Seawater intrusion is a common problem in coastal aquifers where excessive groundwater pumping can lead to chloride contamination of a freshwater resource. Simulation-optimization techniques have been developed to determine optimal management strategies while mitigating seawater intrusion. The simulation models are often density-independent groundwater-flow models that may assume a sharp interface and/or use equivalent freshwater heads. The optimization methods are often linear-programming (LP) based techniques that require simplifications of the real-world system. However, seawater intrusion is a highly nonlinear, density-dependent flow and transport problem, which requires the use of nonlinear-programming (NLP) or global-optimization (GO) techniques. NLP approaches are difficult because of the need for gradient information; therefore, we have chosen a GO technique for this study. Specifically, we have coupled a multi-objective genetic algorithm (GA) with a density-dependent groundwater-flow and transport model to simulate and identify strategies that optimally manage seawater intrusion. GA is a heuristic approach, often chosen when seeking optimal solutions to highly complex and nonlinear problems where LP or NLP methods cannot be applied. The GA utilized in this study is the Epsilon-Nondominated Sorted Genetic Algorithm II (ɛ-NSGAII), which can approximate a Pareto-optimal front between competing objectives. This algorithm has several key features: real and/or binary variable capabilities; an efficient sorting scheme; preservation and diversity of good solutions; dynamic population sizing; constraint handling; parallelizable implementation; and user controlled precision for each objective. The simulation model is SEAWAT, the USGS model that couples MODFLOW with MT3DMS for variable-density flow and transport. ɛ-NSGAII and SEAWAT were efficiently linked together through a C-Fortran interface.
The simulation-optimization model was first tested by using a published density-independent flow model test case that was originally solved using a sequential LP method with the USGS's Ground-Water Management Process (GWM). For the problem formulation, the objective is to maximize net groundwater extraction, subject to head and head-gradient constraints. The decision variables are pumping rates at fixed wells and the system's state is represented with freshwater hydraulic head. The results of the proposed algorithm were similar to the published results (within 1%); discrepancies may be attributed to differences in the simulators and inherent differences between LP and GA. The GWM test case was then extended to a density-dependent flow and transport version. As formulated, the optimization problem is infeasible because of the density effects on hydraulic head. Therefore, the sum of the squared constraint violation (SSC) was used as a second objective. The result is a pareto curve showing optimal pumping rates versus the SSC. Analysis of this curve indicates that a similar net-extraction rate to the test case can be obtained with a minor violation in vertical head-gradient constraints. This study shows that a coupled ɛ-NSGAII/SEAWAT model can be used for the management of groundwater seawater intrusion. In the future, the proposed methodology will be applied to a real-world seawater intrusion and resource management problem for Santa Barbara, CA.
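The Pareto curve the authors report can be illustrated with a plain nondominated filter; this omits the ε-boxes, genetic operators, and SEAWAT coupling of the actual algorithm, and the objective values below are hypothetical.

```python
def dominates(a, b):
    # a dominates b if it is no worse in every objective and strictly
    # better in at least one (both objectives minimized).
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    # Keep only the nondominated points: the first front of a
    # nondominated sort.
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Objectives per candidate pumping plan: (negated net extraction, so that
# minimizing maximizes extraction; sum of squared constraint violations, SSC).
plans = [(-100.0, 0.0), (-120.0, 0.5), (-110.0, 0.8), (-90.0, 0.2)]
front = pareto_front(plans)
```

The surviving points trade extra extraction against small constraint violations, mirroring the paper's observation that a minor head-gradient violation buys a comparable net-extraction rate.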
Aspect-object alignment with Integer Linear Programming in opinion mining.
Zhao, Yanyan; Qin, Bing; Liu, Ting; Yang, Wei
2015-01-01
Target extraction is an important task in opinion mining. In this task, a complete target consists of an aspect and its corresponding object. However, previous work has always simply regarded the aspect as the target itself and has ignored the important "object" element. Thus, these studies have addressed incomplete targets, which are of limited use for practical applications. This paper proposes a novel and important sentiment analysis task, termed aspect-object alignment, to solve the "object neglect" problem. The objective of this task is to obtain the correct corresponding object for each aspect. We design a two-step framework for this task. We first provide an aspect-object alignment classifier that incorporates three sets of features, namely, the basic, relational, and special target features. However, the objects that are assigned to aspects in a sentence often contradict each other and possess many complicated features that are difficult to incorporate into a classifier. To resolve these conflicts, we impose two types of constraints in the second step: intra-sentence constraints and inter-sentence constraints. These constraints are encoded as linear formulations, and Integer Linear Programming (ILP) is used as an inference procedure to obtain a final global decision that is consistent with the constraints. Experiments on a corpus in the camera domain demonstrate that the three feature sets used in the aspect-object alignment classifier are effective in improving its performance. Moreover, the classifier with ILP inference performs better than the classifier without it, thereby illustrating that the two types of constraints that we impose are beneficial.
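The two-step design can be miniaturized as follows. The classifier scores and candidate objects are fabricated, exhaustive search stands in for a real ILP solver, and only the "one object per aspect" constraint is encoded, whereas the paper uses richer intra- and inter-sentence constraints.

```python
from itertools import product

def ilp_inference(scores, objects):
    # Brute-force stand-in for ILP inference: choose one object per aspect
    # (the encoded constraint) so that the summed classifier scores are
    # maximal. A real system would express this with 0/1 variables and
    # call an ILP solver.
    aspects = list(scores)
    best, best_val = None, float("-inf")
    for choice in product(objects, repeat=len(aspects)):
        val = sum(scores[a][o] for a, o in zip(aspects, choice))
        if val > best_val:
            best, best_val = dict(zip(aspects, choice)), val
    return best

# Hypothetical aspect-object alignment scores in the camera domain:
scores = {"zoom": {"camera": 0.9, "case": 0.2},
          "strap": {"camera": 0.3, "case": 0.8}}
assignment = ilp_inference(scores, ["camera", "case"])
```

With no cross-aspect constraints the jointly best assignment simply pairs each aspect with its highest-scoring object; adding conflicting intra- or inter-sentence constraints is what lets the ILP step override individual classifier preferences.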
Contracts and Management Services FY 1996 Site Support Program Plan: WBS 6.10.14. Revision 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knoll, J.M. Jr.
1995-09-01
This is the Contracts and Management Services site support program plan for the US DOE Hanford site. The topics addressed in the program plan include a mission statement, program objectives, planning assumptions, program constraints, work breakdown structure, milestone list, milestone description sheets, and activity detail including cost accounting narrative summary, approved funding budget, and activity detailed description.
Software For Integer Programming
NASA Technical Reports Server (NTRS)
Fogle, F. R.
1992-01-01
Improved Exploratory Search Technique for Pure Integer Linear Programming Problems (IESIP) program optimizes objective function of variables subject to confining functions or constraints, using discrete optimization or integer programming. Enables rapid solution of problems up to 10 variables in size. Integer programming required for accuracy in modeling systems containing small number of components, distribution of goods, scheduling operations on machine tools, and scheduling production in general. Written in Borland's TURBO Pascal.
NASA Technical Reports Server (NTRS)
Freedman, Wendy L.; Madore, Barry F.; Scowcroft, Vicky; Monson, Andy; Persson, S. E.; Rigby, Jane; Sturch, Laura; Stetson, Peter
2011-01-01
We present an overview of and preliminary results from an ongoing comprehensive program that has a goal of determining the Hubble constant to a systematic accuracy of 2%. As part of this program, we are currently obtaining 3.6 micron data using the Infrared Array Camera (IRAC) on Spitzer, and the program is designed to include JWST in the future. We demonstrate that the mid-infrared period-luminosity relation for Cepheids at 3.6 microns is the most accurate means of measuring Cepheid distances to date. At 3.6 microns, it is possible to minimize the known remaining systematic uncertainties in the Cepheid extragalactic distance scale. We discuss the advantages of 3.6 micron observations in minimizing systematic effects in the Cepheid calibration of the Hubble constant including the absolute zero point, extinction corrections, and the effects of metallicity on the colors and magnitudes of Cepheids. We are undertaking three independent tests of the sensitivity of the mid-IR Cepheid Leavitt Law to metallicity, which when combined will allow a robust constraint on the effect. Finally, we are providing a new mid-IR Tully-Fisher relation for spiral galaxies.
Structural optimization with approximate sensitivities
NASA Technical Reports Server (NTRS)
Patnaik, S. N.; Hopkins, D. A.; Coroneos, R.
1994-01-01
Computational efficiency in structural optimization can be enhanced if the intensive computations associated with the calculation of the sensitivities, that is, the gradients of the behavior constraints, are reduced. An approximation to the gradients of the behavior constraints that can be generated with a small amount of numerical calculation is proposed. Structural optimization with these approximate sensitivities produced the correct optimum solution. The approximate gradients performed well for different nonlinear programming methods, such as the sequence of unconstrained minimization technique, the method of feasible directions, sequential quadratic programming, and sequential linear programming. Structural optimization with approximate gradients can reduce by one-third the CPU time that would otherwise be required to solve the problem with explicit closed-form gradients. The proposed gradient approximation shows potential to reduce the intensive computation traditionally associated with structural optimization.
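The abstract does not reproduce the approximation itself, but the cheapest sensitivity estimate of this general kind is a one-sided finite difference on each design variable; the sketch below is a generic stand-in for the paper's scheme (the constraint function and design point are invented):

```python
def approx_grad(f, x, h=1e-6):
    """Forward-difference approximation to the gradient of a behavior
    constraint f at design point x: one extra function evaluation per
    design variable. A generic sketch of an inexpensive sensitivity,
    not the specific approximation proposed in the report."""
    fx = f(x)
    g = []
    for i in range(len(x)):
        xp = list(x)
        xp[i] += h
        g.append((f(xp) - fx) / h)
    return g

# hypothetical stress-like constraint g(x) = x0^2 + 3*x1;
# the exact gradient at (2, 1) is (4, 3)
grad = approx_grad(lambda x: x[0] ** 2 + 3 * x[1], [2.0, 1.0])
print(grad)
```

The trade-off the report quantifies is exactly this: each approximate gradient costs a handful of function evaluations instead of a full closed-form sensitivity analysis.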
Use of RORA for Complex Ground-Water Flow Conditions
Rutledge, A.T.
2004-01-01
The RORA computer program for estimating recharge is based on a condition in which ground water flows perpendicular to the nearest stream that receives ground-water discharge. The method, therefore, does not explicitly account for the ground-water-flow component that is parallel to the stream. Hypothetical finite-difference simulations are used to demonstrate effects of complex flow conditions that consist of two components: one that is perpendicular to the stream and one that is parallel to the stream. Results of the simulations indicate that the RORA program can be used if certain constraints are applied in the estimation of the recession index, an input variable to the program. These constraints apply to a mathematical formulation based on aquifer properties, recession of ground-water levels, and recession of streamflow.
A technique for locating function roots and for satisfying equality constraints in optimization
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, Jaroslaw
1991-01-01
A new technique for locating simultaneous roots of a set of functions is described. The technique is based on the property of the Kreisselmeier-Steinhauser function which descends to a minimum at each root location. It is shown that the ensuing algorithm may be merged into any nonlinear programming method for solving optimization problems with equality constraints.
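A minimal sketch of the Kreisselmeier-Steinhauser envelope, plus one plausible reading of the root-location property described above: feeding each function and its negative into the KS envelope yields a smooth surrogate for max over i of |f_i(x)|, which descends toward a minimum at a simultaneous root. This is a hedged reconstruction from the abstract, not the paper's exact formulation:

```python
import math

def ks(values, rho=50.0):
    """Kreisselmeier-Steinhauser envelope: a smooth, conservative
    overestimate of max(values). Shifting by the max keeps exp() from
    overflowing."""
    m = max(values)
    return m + math.log(sum(math.exp(rho * (v - m)) for v in values)) / rho

def ks_root_surrogate(fs, x, rho=50.0):
    """Smooth surrogate for max_i |f_i(x)|, built by aggregating each
    function and its negative with the KS envelope. It dips toward a
    minimum where all f_i vanish simultaneously (our reading of the
    abstract; a sketch, not the paper's algorithm)."""
    vals = []
    for f in fs:
        v = f(x)
        vals.extend([v, -v])
    return ks(vals, rho)

# two functions sharing a root at x = 1: the surrogate is lower there
fs = [lambda x: x - 1.0, lambda x: x * x - 1.0]
print(ks_root_surrogate(fs, 1.0) < ks_root_surrogate(fs, 0.5))  # True
```

Because the surrogate is smooth, a standard unconstrained minimizer can drive it down, which is what lets the technique merge into a nonlinear programming method for equality constraints.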
A technique for locating function roots and for satisfying equality constraints in optimization
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, J.
1992-01-01
A new technique for locating simultaneous roots of a set of functions is described. The technique is based on the property of the Kreisselmeier-Steinhauser function which descends to a minimum at each root location. It is shown that the ensuing algorithm may be merged into any nonlinear programming method for solving optimization problems with equality constraints.
ERIC Educational Resources Information Center
Carney, Joan
2012-01-01
This study investigated the efficacy of constraint-induced movement therapy (CI therapy) on activities important to school participation in children with hemiparesis. Four children, ages 4-0 to 7-10 participated in an intensive CI therapy program in a clinical setting. Constraining casts were worn 24 hours daily. Therapy was delivered 6 hours…
Kumyaito, Nattapon; Yupapin, Preecha; Tamee, Kreangsak
2018-01-08
An effective training plan is an important factor in sports training to enhance athletic performance. A poorly considered training plan may result in injury to the athlete and in overtraining. Good training plans normally require expert input, which may cost more than many athletes, particularly amateur athletes, can afford. The objective of this research was to create a practical cycling training plan that substantially improves athletic performance while satisfying essential physiological constraints. Adaptive Particle Swarm Optimization with ɛ-constraint methods was used to formulate such a plan and simulate the likely performance outcomes. The physiological constraints considered in this study were monotony, chronic training load ramp rate, and daily training impulse. A comparison of results from our simulations against a training plan from British Cycling, which we used as our standard, showed that our training plan outperformed the benchmark in terms of both athletic performance and satisfying all physiological constraints.
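The ɛ-constraint handling named above can be illustrated with a generic comparison rule for candidate plans: when two candidates both violate the constraints by no more than ɛ, rank them by objective; otherwise prefer the smaller violation. The operator below is a textbook sketch under that assumption; the paper's exact operator, constraint functions, and ɛ level are not given in the abstract:

```python
def violation(x, constraints):
    """Total constraint violation: sum of positive parts of g_i(x) <= 0."""
    return sum(max(0.0, g(x)) for g in constraints)

def better(a, b, objective, constraints, eps=0.1):
    """Epsilon-level comparison between candidate plans a and b, as used to
    rank particles in constrained PSO: if both violations are within eps,
    compare objectives (minimization); otherwise prefer the smaller
    violation. eps=0.1 is illustrative only."""
    va, vb = violation(a, constraints), violation(b, constraints)
    if va <= eps and vb <= eps:
        return objective(a) < objective(b)
    return va < vb

# hypothetical: maximize performance == minimize its negative,
# with a single cap-style constraint x <= 1
obj = lambda x: -x[0]
cons = [lambda x: x[0] - 1.0]
print(better([0.9], [0.5], obj, cons))  # True: both feasible, 0.9 performs better
```

In a full PSO loop this operator replaces the plain objective comparison when updating personal and global bests.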
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-10
... accurate population estimates possible given the constraints of time, money, and available statistical... related to changes in an area's housing stock, such as data on demolitions, building permits, or mobile...
A Health Education Program School Systems Can Afford
ERIC Educational Resources Information Center
Henke, Lorraine J.
1977-01-01
The Prince George's County Public School System has found that implementing the health education curriculum in conjunction with the life science program at the seventh-grade level is a satisfactory solution to the problem of budgetary constraints. (MB)
Atmospheric Infrared Sounder (AIRS) thermal test program
NASA Astrophysics Data System (ADS)
Coda, Roger C.; Green, Kenneth E.; McKay, Thomas; Overoye, Kenneth; Wickman-Boisvert, Heather A.
1999-12-01
The Atmospheric Infrared Sounder (AIRS) has been developed for the NASA Earth Observing System (EOS) program with a scheduled launch on the first post meridian (PM-1) platform in December 2000. AIRS is designed to provide both new and more accurate data about the atmosphere, land and oceans for application to climate studies and weather predictions. Among the important parameters to be derived from AIRS observations are atmospheric temperature profiles with an average accuracy of 1 K in 1 kilometer (km) layers in the troposphere and surface temperatures with an average accuracy of 0.5 K. The AIRS measurement technique is based on passive infrared remote sensing using a precisely calibrated, high spectral resolution grating spectrometer providing high sensitivity operation over the 3.7 micrometer - 15.4 micrometer region. To meet the challenge of high performance over this broad wavelength range, the spectrometer is cooled to 155 K using a passive two-stage radiative cooler and the HgCdTe focal plane is cooled to 58 K using a state-of-the-art long life, low vibration Stirling/pulse tube cryocooler. Electronics waste heat is removed through a spacecraft provided heat rejection system based on heat pipe technology. All of these functions combine to make AIRS thermal management a key aspect of the overall instrument design. Additionally, the thermal operating constraints place challenging requirements on the test program in terms of proper simulation of the space environment and the logistic issues attendant with testing cryogenic instruments. The AIRS instrument has been fully integrated and thermal vacuum performance testing is underway. This paper provides an overview of the AIRS thermal system design, the test methodologies and the key results from the thermal vacuum tests, which have been completed at the time of this publication.
Relating constrained motion to force through Newton's second law
NASA Astrophysics Data System (ADS)
Roithmayr, Carlos M.
When a mechanical system is subject to constraints its motion is in some way restricted. In accordance with Newton's second law, motion is a direct result of forces acting on a system; hence, constraint is inextricably linked to force. The presence of a constraint implies the application of particular forces needed to compel motion in accordance with the constraint; absence of a constraint implies the absence of such forces. The objective of this thesis is to formulate a comprehensive, consistent, and concise method for identifying a set of forces needed to constrain the behavior of a mechanical system modeled as a set of particles and rigid bodies. The goal is accomplished in large part by expressing constraint equations in vector form rather than entirely in terms of scalars. The method developed here can be applied whenever constraints can be described at the acceleration level by a set of independent equations that are linear in acceleration. Hence, the range of applicability extends to servo-constraints or program constraints described at the velocity level with relationships that are nonlinear in velocity. All configuration constraints, and an important class of classical motion constraints, can be expressed at the velocity level by using equations that are linear in velocity; therefore, the associated constraint equations are linear in acceleration when written at the acceleration level. Two new approaches are presented for deriving equations governing motion of a system subject to constraints expressed at the velocity level with equations that are nonlinear in velocity. By using partial accelerations instead of the partial velocities normally employed with Kane's method, it is possible to form dynamical equations that either do or do not contain evidence of the constraint forces, depending on the analyst's interests.
An Analysis of the Neighborhood Impacts of a Mortgage Assistance Program: A Spatial Hedonic Model
ERIC Educational Resources Information Center
Di, Wenhua; Ma, Jielai; Murdoch, James C.
2010-01-01
Down payment or closing cost assistance is an effective program in addressing the wealth constraints of low-and moderate-income homebuyers. However, the spillover effect of such programs on the neighborhood is unknown. This paper estimates the impact of the City of Dallas Mortgage Assistance Program (MAP) on nearby home values using a hedonic…
Topological control of life and death in non-proliferative epithelia.
Martinand-Mari, Camille; Maury, Benoit; Rousset, François; Sahuquet, Alain; Mennessier, Gérard; Rochal, Sergei; Lorman, Vladimir; Mangeat, Paul; Baghdiguian, Stephen
2009-01-01
Programmed cell death is one of the most fascinating demonstrations of the plasticity of biological systems. It is classically described to act upstream of and govern major developmental patterning processes (e.g. inter-digitations in vertebrates, ommatidia in Drosophila). We show here the first evidence that massive apoptosis can also be controlled and coordinated by a pre-established pattern of a specific 'master cell' population. This new concept is supported by the development and validation of an original model of cell patterning. Ciona intestinalis eggs are surrounded by a three-layered follicular organization composed of 60 elongated floating extensions made of as many outer and inner cells, and indirectly spread through an extracellular matrix over 1200 test cells. Experimental and selective ablation of outer and inner cells results in the abrogation of apoptosis in respective remaining neighbouring test cells. In addition incubation of outer/inner follicular cell-depleted eggs with a soluble extract of apoptotic outer/inner cells partially restores apoptosis to apoptotic-defective test cells. The 60 inner follicular cells were thus identified as 'apoptotic master' cells which collectively are induction sites for programmed cell death of the underlying test cells. The position of apoptotic master cells is controlled by topological constraints exhibiting a tetrahedral symmetry, and each cell spreads over and can control the destiny of 20 smaller test cells, which leads to optimized apoptosis signalling.
Advanced launch system trajectory optimization using suboptimal control
NASA Technical Reports Server (NTRS)
Shaver, Douglas A.; Hull, David G.
1993-01-01
The maximum-final mass trajectory of a proposed configuration of the Advanced Launch System is presented. A model for the two-stage rocket is given; the optimal control problem is formulated as a parameter optimization problem; and the optimal trajectory is computed using a nonlinear programming code called VF02AD. Numerical results are presented for the controls (angle of attack and velocity roll angle) and the states. After the initial rotation, the angle of attack goes to a positive value to keep the trajectory as high as possible, returns to near zero to pass through the transonic regime and satisfy the dynamic pressure constraint, returns to a positive value to keep the trajectory high and to take advantage of minimum drag at positive angle of attack due to aerodynamic shading of the booster, and then rolls off to negative values to satisfy the constraints. Because the engines cannot be throttled, the maximum dynamic pressure occurs at a single point; there is no maximum dynamic pressure subarc. To test approximations for obtaining analytical solutions for guidance, two additional optimal trajectories are computed: one using untrimmed aerodynamics and one using no atmospheric effects except for the dynamic pressure constraint. It is concluded that untrimmed aerodynamics has a negligible effect on the optimal trajectory and that approximate optimal controls should be able to be obtained by treating atmospheric effects as perturbations.
Microgrid optimal scheduling considering impact of high penetration wind generation
NASA Astrophysics Data System (ADS)
Alanazi, Abdulaziz
The objective of this thesis is to study the impact of high-penetration wind energy on the economic and reliable operation of microgrids. Wind power is variable, i.e., constantly changing, and nondispatchable, i.e., it cannot be controlled by the microgrid controller. Accurate forecasting of wind power is therefore an essential task in studying its impacts on microgrid operation. Two commonly used forecasting methods, Autoregressive Integrated Moving Average (ARIMA) and Artificial Neural Network (ANN), have been used in this thesis to improve wind power forecasting. The forecasting error is calculated using the Mean Absolute Percentage Error (MAPE) and is reduced using the ANN. The wind forecast is further used in the microgrid optimal scheduling problem. The microgrid optimal scheduling is performed by developing a viable model for security-constrained unit commitment (SCUC) based on the mixed-integer linear programming (MILP) method. The proposed SCUC is solved for various wind penetration levels, and the relationship between the total cost and the wind power penetration is found. In order to reduce microgrid power transfer fluctuations, an additional constraint is proposed and added to the SCUC formulation. The new constraint controls the time-based fluctuations. The impact of the constraint on microgrid SCUC results is tested and validated with numerical analysis. Finally, the applicability of the proposed models is demonstrated through numerical simulations.
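The MAPE metric named above is simple enough to state exactly; the hourly wind-power values in the example are invented:

```python
def mape(actual, forecast):
    """Mean Absolute Percentage Error over paired actual/forecast values,
    the forecast-error metric cited in the abstract. Zero actuals are
    excluded to avoid division by zero (one common convention; the thesis
    may handle them differently)."""
    pairs = [(a, f) for a, f in zip(actual, forecast) if a != 0]
    return 100.0 * sum(abs((a - f) / a) for a, f in pairs) / len(pairs)

# errors of 10%, 10%, and 0% average to about 6.67%
print(mape([100, 200, 400], [110, 180, 400]))
```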
Flight control with adaptive critic neural network
NASA Astrophysics Data System (ADS)
Han, Dongchen
2001-10-01
In this dissertation, the adaptive critic neural network technique is applied to solve complex nonlinear system control problems. Based on dynamic programming, the adaptive critic neural network can embed the optimal solution into a neural network. Though trained off-line, the neural network forms a real-time feedback controller. Because of its general interpolation properties, the neurocontroller has inherent robustness. The problems solved here are agile missile control for the U.S. Air Force and a midcourse guidance law for the U.S. Navy. In the first three papers, the neural network was used to control an air-to-air agile missile to implement a minimum-time heading reversal in a vertical plane under the following conditions: a system without constraint, a system with a control inequality constraint, and a system with a state inequality constraint. While the agile missile is a one-dimensional problem, the midcourse guidance law is the first test bed for a multiple-dimensional problem. In the fourth paper, the neurocontroller is synthesized to guide a surface-to-air missile to a fixed final condition, and to a flexible final condition from a variable initial condition. In order to evaluate the adaptive critic neural network approach, the numerical solutions for these cases are also obtained by solving the two-point boundary value problem with a shooting method. All of the results showed that the adaptive critic neural network can solve complex nonlinear system control problems.
Structural Optimization for Reliability Using Nonlinear Goal Programming
NASA Technical Reports Server (NTRS)
El-Sayed, Mohamed E.
1999-01-01
This report details the development of a reliability-based multi-objective design tool for solving structural optimization problems. Based on two different optimization techniques, namely sequential unconstrained minimization and nonlinear goal programming, the developed design method has the capability to take into account the effects of variability on the proposed design through a user-specified reliability design criterion. In its sequential unconstrained minimization mode, the developed design tool uses a composite objective function, in conjunction with weight-ordered design objectives, in order to take into account conflicting and multiple design criteria. The multiple design criteria of interest include structural weight, load-induced stress and deflection, and mechanical reliability. The nonlinear goal programming mode, on the other hand, provides a design method that eliminates the difficulty of having to define an objective function and constraints, while at the same time having the capability of handling rank-ordered design objectives or goals. For simulation purposes, the design of a pressure vessel cover plate was undertaken as a test bed for the newly developed design tool. The formulation of this structural optimization problem into sequential unconstrained minimization and goal programming form is presented. The resulting optimization problem was solved using: (i) the linear extended interior penalty function method algorithm; and (ii) Powell's conjugate directions method. Both single- and multi-objective numerical test cases are included, demonstrating the design tool's capabilities as they apply to this design problem.
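The linear extended interior penalty function named in item (i) has a standard form: a reciprocal barrier -1/g inside the feasible region, extended linearly past a transition value g0 < 0 so the penalty remains defined and C1-continuous for infeasible points. A sketch for a single constraint g(x) <= 0 (the transition value is illustrative, not the report's):

```python
def ext_interior_penalty(g, g0=-0.1):
    """Linear extended interior penalty for a constraint g(x) <= 0:
    reciprocal barrier -1/g for g <= g0, linear extension beyond g0,
    matched in value and slope at g = g0. g0 = -0.1 is illustrative."""
    if g <= g0:
        return -1.0 / g
    return -(2.0 * g0 - g) / (g0 * g0)

# deep inside the feasible region the penalty is mild ...
print(ext_interior_penalty(-1.0))   # 1.0
# ... and it grows smoothly as the constraint boundary is approached
print(ext_interior_penalty(-0.05) > ext_interior_penalty(-0.1))  # True
```

The sequential unconstrained minimization mode then minimizes the composite of the true objective plus a multiplier times the summed penalties, tightening the multiplier between cycles.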
NASA Technical Reports Server (NTRS)
Antonille, Scott R.; Miskey, Cherie L.; Ohl, Raymond G.; Rohrbach, Scott O.; Aronstein, David L.; Bartoszyk, Andrew E.; Bowers, Charles W.; Cofie, Emmanuel; Collins, Nicholas R.; Comber, Brian J.;
2016-01-01
NASA's James Webb Space Telescope (JWST) is a 6.5m diameter, segmented, deployable telescope for cryogenic IR space astronomy (40K). The JWST Observatory includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM) that contains four science instruments (SI) and the fine guider. The SIs are mounted to a composite metering structure. The SI and guider units were integrated to the ISIM structure and optically tested at the NASA Goddard Space Flight Center as a suite using the Optical Telescope Element SIMulator (OSIM). OSIM is a full field, cryogenic JWST telescope simulator. SI performance, including alignment and wave front error, were evaluated using OSIM. We describe test and analysis methods for optical performance verification of the ISIM Element, with an emphasis on the processes used to plan and execute the test. The complexity of ISIM and OSIM drove us to develop a software tool for test planning that allows for configuration control of observations, associated scripts, and management of hardware and software limits and constraints, as well as tools for rapid data evaluation, and flexible re-planning in response to the unexpected. As examples of our test and analysis approach, we discuss how factors such as the ground test thermal environment are compensated in alignment. We describe how these innovative methods for test planning and execution and post-test analysis were instrumental in the verification program for the ISIM element, with enough information to allow the reader to consider these innovations and lessons learned in this successful effort in their future testing for other programs.
NASA Astrophysics Data System (ADS)
Antonille, Scott R.; Miskey, Cherie L.; Ohl, Raymond G.; Rohrbach, Scott O.; Aronstein, David L.; Bartoszyk, Andrew E.; Bowers, Charles W.; Cofie, Emmanuel; Collins, Nicholas R.; Comber, Brian J.; Eichhorn, William L.; Glasse, Alistair C.; Gracey, Renee; Hartig, George F.; Howard, Joseph M.; Kelly, Douglas M.; Kimble, Randy A.; Kirk, Jeffrey R.; Kubalak, David A.; Landsman, Wayne B.; Lindler, Don J.; Malumuth, Eliot M.; Maszkiewicz, Michael; Rieke, Marcia J.; Rowlands, Neil; Sabatke, Derek S.; Smith, Corbett T.; Smith, J. Scott; Sullivan, Joseph F.; Telfer, Randal C.; Te Plate, Maurice; Vila, M. Begoña.; Warner, Gerry D.; Wright, David; Wright, Raymond H.; Zhou, Julia; Zielinski, Thomas P.
2016-09-01
NASA's James Webb Space Telescope (JWST) is a 6.5m diameter, segmented, deployable telescope for cryogenic IR space astronomy. The JWST Observatory includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM), that contains four science instruments (SI) and the Fine Guidance Sensor (FGS). The SIs are mounted to a composite metering structure. The SIs and FGS were integrated to the ISIM structure and optically tested at NASA's Goddard Space Flight Center using the Optical Telescope Element SIMulator (OSIM). OSIM is a full-field, cryogenic JWST telescope simulator. SI performance, including alignment and wavefront error, was evaluated using OSIM. We describe test and analysis methods for optical performance verification of the ISIM Element, with an emphasis on the processes used to plan and execute the test. The complexity of ISIM and OSIM drove us to develop a software tool for test planning that allows for configuration control of observations, implementation of associated scripts, and management of hardware and software limits and constraints, as well as tools for rapid data evaluation, and flexible re-planning in response to the unexpected. As examples of our test and analysis approach, we discuss how factors such as the ground test thermal environment are compensated in alignment. We describe how these innovative methods for test planning and execution and post-test analysis were instrumental in the verification program for the ISIM element, with enough information to allow the reader to consider these innovations and lessons learned in this successful effort in their future testing for other programs.
A robust optimization methodology for preliminary aircraft design
NASA Astrophysics Data System (ADS)
Prigent, S.; Maréchal, P.; Rondepierre, A.; Druot, T.; Belleville, M.
2016-05-01
This article focuses on a robust optimization of an aircraft preliminary design under operational constraints. According to engineers' know-how, the aircraft preliminary design problem can be modelled as an uncertain optimization problem whose objective (the cost or the fuel consumption) is almost affine, and whose constraints are convex. It is shown that this uncertain optimization problem can be approximated in a conservative manner by an uncertain linear optimization program, which enables the use of the techniques of robust linear programming of Ben-Tal, El Ghaoui, and Nemirovski [Robust Optimization, Princeton University Press, 2009]. This methodology is then applied to two real cases of aircraft design and numerical results are presented.
Optimization techniques applied to spectrum management for communications satellites
NASA Astrophysics Data System (ADS)
Ottey, H. R.; Sullivan, T. M.; Zusman, F. S.
This paper describes user requirements, algorithms and software design features for the application of optimization techniques to the management of the geostationary orbit/spectrum resource. Relevant problems include parameter sensitivity analyses, frequency and orbit position assignment coordination, and orbit position allotment planning. It is shown how integer and nonlinear programming as well as heuristic search techniques can be used to solve these problems. Formalized mathematical objective functions that define the problems are presented. Constraint functions that impart the necessary solution bounds are described. A versatile program structure is outlined, which would allow problems to be solved in stages while varying the problem space, solution resolution, objective function and constraints.
Structural design using equilibrium programming formulations
NASA Technical Reports Server (NTRS)
Scotti, Stephen J.
1995-01-01
Solutions to increasingly larger structural optimization problems are desired. However, computational resources are strained to meet this need. New methods will be required to solve increasingly larger problems. The present approaches to solving large-scale problems involve approximations for the constraints of structural optimization problems and/or decomposition of the problem into multiple subproblems that can be solved in parallel. An area of game theory, equilibrium programming (also known as noncooperative game theory), can be used to unify these existing approaches from a theoretical point of view (considering the existence and optimality of solutions), and be used as a framework for the development of new methods for solving large-scale optimization problems. Equilibrium programming theory is described, and existing design techniques such as fully stressed design and constraint approximations are shown to fit within its framework. Two new structural design formulations are also derived. The first new formulation is another approximation technique which is a general updating scheme for the sensitivity derivatives of design constraints. The second new formulation uses a substructure-based decomposition of the structure for analysis and sensitivity calculations. Significant computational benefits of the new formulations compared with a conventional method are demonstrated.
Xie, Y L; Li, Y P; Huang, G H; Li, Y F; Chen, L R
2011-04-15
In this study, an inexact-chance-constrained water quality management (ICC-WQM) model is developed for planning regional environmental management under uncertainty. This method is based on an integration of interval linear programming (ILP) and chance-constrained programming (CCP) techniques. ICC-WQM allows uncertainties presented as both probability distributions and interval values to be incorporated within a general optimization framework. Complexities in environmental management systems can be systematically reflected, thus applicability of the modeling process can be highly enhanced. The developed method is applied to planning chemical-industry development in Binhai New Area of Tianjin, China. Interval solutions associated with different risk levels of constraint violation have been obtained. They can be used for generating decision alternatives and thus help decision makers identify desired policies under various system-reliability constraints of water environmental capacity of pollutant. Tradeoffs between system benefits and constraint-violation risks can also be tackled. They are helpful for supporting (a) decision of wastewater discharge and government investment, (b) formulation of local policies regarding water consumption, economic development and industry structure, and (c) analysis of interactions among economic benefits, system reliability and pollutant discharges. Copyright © 2011 Elsevier B.V. All rights reserved.
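The core CCP step, converting a probabilistic constraint into a deterministic one, is standard and can be shown compactly. Assuming a normally distributed right-hand side (the paper's actual distributions and water-environmental-capacity data are not reproduced here), Pr(a.x <= b) >= 1 - alpha with b ~ Normal(mu, sigma) becomes a.x <= mu + sigma * Phi^{-1}(alpha):

```python
from statistics import NormalDist

def deterministic_rhs(mu, sigma, alpha):
    """Deterministic equivalent of the chance constraint
    Pr(a.x <= b) >= 1 - alpha with b ~ Normal(mu, sigma): the
    right-hand side tightens to mu + sigma * Phi^{-1}(alpha).
    Textbook CCP conversion; the values below are illustrative."""
    return mu + sigma * NormalDist().inv_cdf(alpha)

# a mean discharge capacity of 100 with sigma = 10, enforced at
# 95% reliability (alpha = 0.05), tightens to roughly 83.55
print(deterministic_rhs(100.0, 10.0, 0.05))
```

Solving the model at several alpha values is what produces the family of interval solutions, and the tightening above is the tradeoff between system benefit and constraint-violation risk the abstract describes.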
Enhancements on the Convex Programming Based Powered Descent Guidance Algorithm for Mars Landing
NASA Technical Reports Server (NTRS)
Acikmese, Behcet; Blackmore, Lars; Scharf, Daniel P.; Wolf, Aron
2008-01-01
In this paper, we present enhancements on the powered descent guidance algorithm developed for Mars pinpoint landing. The guidance algorithm solves the powered descent minimum fuel trajectory optimization problem via a direct numerical method. Our main contribution is to formulate the trajectory optimization problem, which has nonconvex control constraints, as a finite dimensional convex optimization problem, specifically as a finite dimensional second order cone programming (SOCP) problem. SOCP is a subclass of convex programming, and there are efficient SOCP solvers with deterministic convergence properties. Hence, the resulting guidance algorithm can potentially be implemented onboard a spacecraft for real-time applications. Particularly, this paper discusses the algorithmic improvements obtained by: (i) using an efficient approach to choose the optimal time-of-flight; (ii) using a computationally inexpensive way to detect the feasibility/infeasibility of the problem due to the thrust-to-weight constraint; (iii) incorporating the rotation rate of the planet into the problem formulation; (iv) developing additional constraints on the position and velocity to guarantee no subsurface flight between the time samples of the temporal discretization; (v) developing a fuel-limited targeting algorithm; and (vi) presenting initial results on an onboard table lookup method to obtain almost fuel-optimal solutions in real time.
Mars Science Laboratory Rover Mobility Bushing Development
NASA Technical Reports Server (NTRS)
Riggs, Benjamin
2008-01-01
NASA's Mars Science Laboratory (MSL) Project will send a six-wheeled rover to Mars in 2009. The rover will carry a scientific payload designed to search for organic molecules on the Martian surface during its primary mission. This paper describes the development and testing of a bonded-film-lubricated bushing system to be used in the mobility system of the rover. The MSL Rover Mobility System contains several pivots that are tightly constrained with respect to mass and volume. These pivots are also exposed to relatively low temperatures (-135 C) during operation. The combination of these constraints led the mobility team to consider the use of solid-film-lubricated metallic bushings and dry-running polymeric bushings in several flight pivot applications. A test program was developed to mitigate the risk associated with using these materials in critical pivots on the MSL vehicle. The program was designed to characterize bushing friction and wear performance over the expected operational temperature range (-135 C to +70 C). Seven different bushing material/lubricant combinations were evaluated to aid in the selection of the final flight pivot bushing material/lubricant combination.
The Husky Byte Program: Delivering Nutrition Education One Sound Byte at a Time
ERIC Educational Resources Information Center
Pierce, Michelle B.; Hudson, Kerrian A.; Lora, Karina R.; Havens, Erin K.; Ferris, Ann M.
2011-01-01
The Husky Byte program uses interactive displays to deliver quick sound bytes of nutrition information to adults in frequented community settings. This innovative program accounts for time constraints, adult learning theory, and diverse learning styles, and is easily accessible to adults. Both process and impact evaluations have demonstrated positive…
Enacting the Independent Public Schools Program in Western Australia
ERIC Educational Resources Information Center
Gobby, Brad
2013-01-01
The Independent Public Schools (IPS) program began to be implemented in some Western Australian schools in 2010. The IPS program devolves a number of responsibilities to principals and is part of the political objective of removing the constraints of the education bureaucracy by fostering school-level decision-making, problem-solving and…
Demonstration of Mer-Cure Technology for Enhanced Mercury Control
DOE Office of Scientific and Technical Information (OSTI.GOV)
John Marion; Dave O'Neill; Kevin Taugher
2008-06-01
Alstom Power Inc. has completed a DOE/NETL-sponsored program (under DOE Cooperative Agreement No. DE-FC26-07NT42776) to demonstrate Mer-Cure™, one of Alstom's mercury control technologies for coal-fired boilers. The Mer-Cure™ system utilizes a small amount of Mer-Clean™ sorbent that is injected into the flue gas stream for oxidation and adsorption of gaseous mercury. Mer-Clean™ sorbents are carbon-based and prepared with chemical additives that promote oxidation and capture of mercury. The Mer-Cure™ system is unique in that the sorbent is injected into an environment where the mercury capture kinetics are accelerated. The full-scale demonstration program originally included test campaigns at two host sites: LCRA's 480-MWe Fayette Unit No. 3 and Reliant Energy's 190-MWe Shawville Unit No. 3. Due to budget constraints, the only demonstration tests actually performed were the short-term tests at LCRA. This report gives a summary of the demonstration testing at Fayette Unit No. 3. The goals for this Mercury Round 3 program, established by DOE/NETL under the original solicitation, were to reduce uncontrolled mercury emissions by 90% at a cost significantly less than 50% of the previous target of $60,000/lb mercury removed. The results indicated that Mer-Cure™ technology could achieve mercury removal of 90% based on uncontrolled stack emissions. The estimated costs for 90% mercury control, at sorbent costs of $0.75 to $2.00/lb respectively, were $13,400 to $18,700/lb Hg removed. In summary, the results from demonstration testing show that the goals established by DOE/NETL were met during this test program. The goal of 90% mercury reduction was achieved. Estimated mercury removal costs were 69-78% lower than the benchmark of $60,000/lb mercury removed, significantly less than 50% of the baseline removal cost.
NASA Technical Reports Server (NTRS)
Park, Sang C.; Brinckerhoff, Pamela; Franck, Randy; Schweickart, Rusty; Thomson, Shaun; Burt, Bill; Ousley, Wes
2016-01-01
The James Webb Space Telescope (JWST) Optical Telescope Element (OTE) assembly is the largest optically stable infrared-optimized telescope currently being manufactured and assembled, and it is scheduled for launch in 2018. The JWST OTE, including the primary mirrors, secondary mirror, and the Aft Optics Subsystem (AOS), is designed to be passively cooled and operate near 45 K. Due to the size of its large sunshield in relation to existing test facilities, JWST cannot be optically or thermally tested as a complete observatory-level system at flight temperatures. As a result, the telescope portion along with its instrument complement will be tested as a single unit very late in the program, on the program schedule critical path. To mitigate schedule risks, a set of 'pathfinder' cryogenic tests will be performed to reduce program risk by demonstrating the optical testing capabilities of the facility, characterizing telescope thermal performance, and allowing project personnel to learn valuable testing lessons off-line. This paper describes the 'pathfinder' cryogenic test program, focusing on the recently completed second test in the series, called the Optical Ground Support Equipment 2 (OGSE2) test. The JWST OGSE2 test was successfully completed within the allocated project schedule despite numerous conflicting thermal requirements during cool-down to the final cryogenic operational temperatures and during warm-up after the cryo-stable optical tests. The challenges included developing pre-test cool-down and warm-up profiles without a reliable method to predict the thermal behavior in a rarefied helium environment, and managing test article hardware safety driven by the project Limits and Constraints (L&Cs). Furthermore, the OGSE2 test included the time-critical Aft Optics Subsystem (AOS), a part of the flight Optical Telescope Element that needed to be placed back into the overall telescope assembly integration flow.
The OGSE2 test requirements included strict adherence to project contamination controls due to the presence of the contamination-sensitive flight optical elements. The test operations required close coordination of numerous personnel while they were being exposed to, and trained for, the 'final' combined OTE and instrument cryo-test in 2017. This paper also encompasses the OGSE2 thermal data look-back review.
Passive Wireless SAW Sensors for IVHM
NASA Technical Reports Server (NTRS)
Wilson, William C.; Perey, Daniel F.; Atkinson, Gary M.; Barclay, Rebecca O.
2008-01-01
NASA aeronautical programs require integrated vehicle health monitoring (IVHM) to ensure the safety of the crew and the vehicles. Future IVHM sensors need to be small, lightweight, inexpensive, and wireless. Surface acoustic wave (SAW) technology meets all of these constraints. In addition, it operates in harsh environments and over wide temperature ranges, and it is inherently radiation hardened. This paper presents a survey of research opportunities for universities and industry to develop new sensors that address anticipated IVHM needs for aerospace vehicles. Potential applications of passive wireless SAW sensors from ground testing to high-altitude aircraft operations are presented, along with some of the challenges and issues of the technology.
Helicopter rotor and engine sizing for preliminary performance estimation
NASA Technical Reports Server (NTRS)
Talbot, P. D.; Bowles, J. V.; Lee, H. C.
1986-01-01
Methods are presented for estimating some of the more fundamental design variables of single-rotor helicopters (tip speed, blade area, disk loading, and installed power) based on design requirements (speed, weight, fuselage drag, and design hover ceiling). The well-known constraints of advancing-blade compressibility and retreating-blade stall are incorporated into the estimation process, based on an empirical interpretation of rotor performance data from large-scale wind-tunnel tests. Engine performance data are presented and correlated with a simple model usable for preliminary design. When approximate results are required quickly, these methods may be more convenient to use and provide more insight than large digital computer programs.
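The link between disk loading and installed power in such sizing estimates rests on momentum theory, which gives the ideal induced hover power as P = T^(3/2)/sqrt(2ρA). A minimal sketch, assuming a figure-of-merit correction for real-rotor losses (the function names and default values are illustrative, not the report's correlations):

```python
import math

def ideal_hover_power_w(thrust_n, disk_area_m2, rho_kg_m3=1.225):
    # Momentum theory: ideal induced power P = T^(3/2) / sqrt(2 * rho * A)
    return thrust_n ** 1.5 / math.sqrt(2.0 * rho_kg_m3 * disk_area_m2)

def hover_power_w(thrust_n, disk_area_m2, figure_of_merit=0.7, rho_kg_m3=1.225):
    # Actual rotor power exceeds the ideal; divide by figure of merit FM < 1
    return ideal_hover_power_w(thrust_n, disk_area_m2, rho_kg_m3) / figure_of_merit
```

The expression makes the design trade explicit: for fixed thrust, halving the disk loading T/A (doubling the disk area) cuts the ideal hover power by a factor of sqrt(2).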
Optimization of Stochastic Response Surfaces Subject to Constraints with Linear Programming
1992-03-01
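The abstract of this report did not survive digitization (only OCR fragments of its FORTRAN listing remain), but its subject, optimizing under constraints with linear programming, can be illustrated generically. The sketch below solves a tiny two-variable LP by enumerating vertices of the feasible region; all names and the example problem are assumptions, unrelated to the report's actual code:

```python
from itertools import combinations

def solve_lp_2d(c, A, b):
    """Maximize c.x subject to A x <= b in two variables by enumerating
    vertices: intersections of pairs of constraint boundaries. Returns
    (optimal value, optimal point), or None if the problem is infeasible."""
    best = None
    for i, j in combinations(range(len(A)), 2):
        a1, a2 = A[i], A[j]
        det = a1[0] * a2[1] - a1[1] * a2[0]
        if abs(det) < 1e-12:
            continue  # parallel boundaries: no vertex
        # Cramer's rule for the 2x2 system a1.x = b[i], a2.x = b[j]
        x = (b[i] * a2[1] - a1[1] * b[j]) / det
        y = (a1[0] * b[j] - b[i] * a2[0]) / det
        # keep the vertex only if it satisfies every constraint
        if all(A[k][0] * x + A[k][1] * y <= b[k] + 1e-9 for k in range(len(A))):
            val = c[0] * x + c[1] * y
            if best is None or val > best[0]:
                best = (val, (x, y))
    return best
```

For instance, maximizing 3x + 2y subject to x + y <= 4, x <= 2, x >= 0, y >= 0 (the last two written as -x <= 0, -y <= 0) yields the vertex (2, 2) with objective value 10. Production codes use the simplex method rather than vertex enumeration, which scales combinatorially.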
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vienna, John D.; Kim, Dong-Sang; Skorski, Daniel C.
2013-07-01
Recent glass formulation and melter testing data have suggested that significant increases in waste loading in HLW and LAW glasses are possible over current system planning estimates. The data (although limited in some cases) were evaluated to determine a set of constraints and models that could be used to estimate the maximum loading of specific waste compositions in glass. It is recommended that these models and constraints be used to estimate the likely HLW and LAW glass volumes that would result if the current glass formulation studies are successfully completed. It is recognized that some of the models are preliminary in nature and will change in the coming years. In addition, the models do not currently address the prediction uncertainties that would be needed before they could be used in plant operations. The models and constraints are only meant to give an indication of rough glass volumes and are not intended to be used in plant operation or waste form qualification activities. A current research program is in place to develop the data, models, and uncertainty descriptions for that purpose. A fundamental tenet underlying the research reported in this document is to try to be less conservative than previous studies when developing constraints for estimating the glass to be produced by implementing current advanced glass formulation efforts. The less conservative approach documented herein should allow for the estimate of glass masses that may be realized if the current efforts in advanced glass formulations are completed over the coming years and are as successful as early indications suggest they may be. Because of this approach, there is an unquantifiable uncertainty in the ultimate glass volume projections due to model prediction uncertainties, which must be considered along with other system uncertainties such as waste compositions and amounts to be immobilized, split factors between LAW and HLW, etc.
Parameter estimation with Sandage-Loeb test
DOE Office of Scientific and Technical Information (OSTI.GOV)
Geng, Jia-Jia; Zhang, Jing-Fei; Zhang, Xin, E-mail: gengjiajia163@163.com, E-mail: jfzhang@mail.neu.edu.cn, E-mail: zhangxin@mail.neu.edu.cn
2014-12-01
The Sandage-Loeb (SL) test directly measures the expansion rate of the universe in the redshift range 2 ≲ z ≲ 5 by detecting the redshift drift in the spectra of the Lyman-α forest of distant quasars. We discuss the impact of future SL test data on parameter estimation for the ΛCDM, wCDM, and w_0w_aCDM models. To avoid potential inconsistency with other observational data, we take the best-fitting dark energy model constrained by current observations as the fiducial model to produce 30 mock SL test data points. The SL test data provide an important supplement to the other dark energy probes, since they are extremely helpful in breaking the existing parameter degeneracies. We show that the strong degeneracy between Ω_m and H_0 in all three dark energy models is well broken by the SL test. Compared to the current combined data of type Ia supernovae, baryon acoustic oscillation, cosmic microwave background, and the Hubble constant, a 30-yr observation of the SL test could improve the constraints on Ω_m and H_0 by more than 60% for all three models. However, the SL test can only moderately improve the constraint on the equation of state of dark energy. We show that a 30-yr observation of the SL test could help improve the constraint on constant w by about 25%, and improve the constraints on w_0 and w_a by about 20% and 15%, respectively. We also quantify the constraining power of the SL test in future high-precision joint geometric constraints on dark energy. The mock future supernova and baryon acoustic oscillation data are simulated based on the space-based project JDEM. We find that a 30-yr observation of the SL test would help improve the measurement precision of Ω_m, H_0, and w_a by more than 70%, 20%, and 60%, respectively, for the w_0w_aCDM model.
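The drift signal underlying the SL test follows from ż = (1+z)H_0 − H(z), or, expressed as a spectroscopic velocity shift, Δv = cH_0Δt[1 − E(z)/(1+z)] with E(z) = H(z)/H_0. A minimal sketch for flat ΛCDM (the parameter values and function names are illustrative assumptions):

```python
import math

def E(z, omega_m=0.3):
    # dimensionless Hubble rate E(z) = H(z)/H0 for flat LambdaCDM
    return math.sqrt(omega_m * (1.0 + z) ** 3 + (1.0 - omega_m))

def velocity_drift_cm_s(z, omega_m=0.3, h0_km_s_mpc=70.0, delta_t_yr=30.0):
    """Velocity drift dv = c * H0 * dt * (1 - E(z)/(1+z)) accumulated
    over delta_t_yr years of monitoring, in cm/s."""
    c_cm_s = 2.99792458e10
    mpc_cm = 3.0856775814913673e24
    sec_per_yr = 3.15576e7
    h0_per_s = h0_km_s_mpc * 1.0e5 / mpc_cm  # (km/s)/Mpc -> 1/s
    return c_cm_s * h0_per_s * (delta_t_yr * sec_per_yr) * (1.0 - E(z, omega_m) / (1.0 + z))
```

With these parameters the 30-yr drift at z ≈ 4 comes out to roughly −15 cm/s, which conveys why the measurement demands extremely stable high-resolution spectrographs.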
NASA Astrophysics Data System (ADS)
Li, Yanhua; Lin, Jianping
2015-08-01
Tailor-welded blanks (TWBs) have been considered a productive sheet forming method in automotive industries. However, the formability of TWBs is reduced by the differing properties or thicknesses of the joined blanks, posing a challenge for manufacturing designers. The plastic capacity of TWBs is decreased even when the material and thickness are the same. The constraint effect of the laser weld (including the weld and heat-affected zone) material in the forming process of similar TWBs is a key problem to be solved in the research, development and application of thin-sheet TWBs. In this paper, uniaxial tensile tests with full-field strain measurement by digital image correlation, together with Erichsen tests, are performed to investigate the constraint effect on deformation behavior and explore the mechanism of decreasing formability of similar TWBs. In addition, finite element models are developed in ABAQUS to further examine the constraint effect. The results of the base material and welded blanks are compared to characterize the differences. Furthermore, in order to better understand this mechanism, theoretical and numerical investigations are employed and compared to interpret the constraint effect of the laser weld on the deformation behavior of TWBs. An index is proposed to quantify the constraint effect. Results show that the constraint effect of the laser weld appears in both stretch forming and drawing of TWBs. Strain paths approach the plane strain condition, as compared to the monolithic blank, due to the constraint effect. The constraint effect is a major factor affecting the formability of TWBs when failure occurs away from the weld seam.
Development and Application of a Portable Health Algorithms Test System
NASA Technical Reports Server (NTRS)
Melcher, Kevin J.; Fulton, Christopher E.; Maul, William A.; Sowers, T. Shane
2007-01-01
This paper describes the development and initial demonstration of a Portable Health Algorithms Test (PHALT) System that is being developed by researchers at the NASA Glenn Research Center (GRC). The PHALT System was conceived as a means of evolving the maturity and credibility of algorithms developed to assess the health of aerospace systems. Comprising an integrated hardware-software environment, the PHALT System allows systems health management algorithms to be developed in a graphical programming environment; to be tested and refined using system simulation or test data playback; and finally, to be evaluated in a real-time hardware-in-the-loop mode with a live test article. In this paper, PHALT System development is described through the presentation of a functional architecture, followed by the selection and integration of hardware and software. Also described is an initial real-time hardware-in-the-loop demonstration that used sensor data qualification algorithms to diagnose and isolate simulated sensor failures in a prototype Power Distribution Unit test-bed. Success of the initial demonstration is highlighted by the correct detection of all sensor failures and the absence of any real-time constraint violations.
Davidov, Ori; Rosen, Sophia
2011-04-01
In medical studies, endpoints are often measured for each patient longitudinally. The mixed-effects model has been a useful tool for the analysis of such data. There are situations in which the parameters of the model are subject to some restrictions or constraints. For example, in hearing loss studies, we expect hearing to deteriorate with time. This means that hearing thresholds, which reflect hearing acuity, will, on average, increase over time. Therefore, the regression coefficients associated with the mean effect of time on hearing ability will be constrained. Such constraints should be accounted for in the analysis. We propose maximum likelihood estimation procedures, based on the expectation-conditional maximization either (ECME) algorithm, to estimate the parameters of the model while accounting for the constraints on them. The proposed methods improve, in terms of mean square error, on the unconstrained estimators. In some settings, the improvement may be substantial. Hypothesis testing procedures that incorporate the constraints are developed. Specifically, likelihood ratio, Wald, and score tests are proposed and investigated. Their empirical significance levels and power are studied using simulations. It is shown that incorporating the constraints improves the mean squared error of the estimates and the power of the tests. These improvements may be substantial. The methodology is used to analyze a hearing loss study.
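A one-parameter analogue conveys the constrained-estimation idea: if hearing thresholds are expected to rise, the time slope is constrained to be nonnegative, and the constrained estimate projects the unconstrained one onto that region. A minimal sketch, with simple linear regression standing in for the mixed-effects model (names and the projection shortcut are illustrative assumptions, not the paper's ECME procedure):

```python
def ols_slope(x, y):
    # ordinary least-squares slope of y on x
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return sxy / sxx

def constrained_slope(x, y):
    # project the estimate onto the constraint slope >= 0
    # (hearing thresholds are expected to rise with time)
    return max(0.0, ols_slope(x, y))
```

When noise drives the unconstrained slope slightly negative, the constrained estimate clips it to zero, which is exactly the mechanism by which constraints can reduce mean squared error.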
Constraints and Approach for Selecting the Mars Surveyor '01 Landing Site
NASA Technical Reports Server (NTRS)
Golombek, M.; Bridges, N.; Gilmore, M.; Haldemann, A.; Parker, T.; Saunders, R.; Spencer, D.; Smith, J.; Weitz, C.
1999-01-01
There are many similarities between the Mars Surveyor '01 (MS '01) landing site selection process and that of Mars Pathfinder. The selection process includes two parallel activities in which engineers define and refine the capabilities of the spacecraft through design, testing and modeling and scientists define a set of landing site constraints based on the spacecraft design and landing scenario. As for Pathfinder, the safety of the site is without question the single most important factor, for the simple reason that failure to land safely yields no science and exposes the mission and program to considerable risk. The selection process must be thorough, defensible, and capable of surviving multiple withering reviews similar to the Pathfinder decision. On Pathfinder, this was accomplished by attempting to understand the surface properties of sites using available remote sensing data sets and models based on them. Science objectives are factored into the selection process only after the safety of the site is validated. Finally, as for Pathfinder, the selection process is being done in an open environment with multiple opportunities for community involvement including open workshops, with education and outreach opportunities.
Constraints, Approach and Present Status for Selecting the Mars Surveyor 2001 Landing Site
NASA Technical Reports Server (NTRS)
Golombek, M.; Anderson, F.; Bridges, N.; Briggs, G.; Gilmore, M.; Gulick, V.; Haldemann, A.; Parker, T.; Saunders, R.; Spencer, D.;
1999-01-01
There are many similarities between the Mars Surveyor '01 (MS '01) landing site selection process and that of Mars Pathfinder. The selection process includes two parallel activities in which engineers define and refine the capabilities of the spacecraft through design, testing and modeling and scientists define a set of landing site constraints based on the spacecraft design and landing scenario. As for Pathfinder, the safety of the site is without question the single most important factor, for the simple reason that failure to land safely yields no science and exposes the mission and program to considerable risk. The selection process must be thorough, defensible and capable of surviving multiple withering reviews similar to the Pathfinder decision. On Pathfinder, this was accomplished by attempting to understand the surface properties of sites using available remote sensing data sets and models based on them. Science objectives are factored into the selection process only after the safety of the site is validated. Finally, as for Pathfinder, the selection process is being done in an open environment with multiple opportunities for community involvement including open workshops, with education and outreach opportunities.
Specimen size effects on ductile-brittle transition temperature in Charpy impact testing
NASA Astrophysics Data System (ADS)
Kurishita, H.; Yamamoto, T.; Narui, M.; Suwarno, H.; Yoshitake, T.; Yano, Y.; Yamazaki, M.; Matsui, H.
2004-08-01
One key issue for small specimen test techniques is to clarify specimen size effects on test results. In consideration of size effects on determining the ductile-to-brittle transition temperature (DBTT) in Charpy impact testing, a method to evaluate the plastic constraint loss for differently sized Charpy V-notch (CVN) specimens is proposed and applied to a ferritic-martensitic steel, 2WFK, developed by JNC. In the method, a constraint factor, α, that is an index of the plastic constraint is defined as α = σ*/σ_y*. Here, σ* is the critical cleavage fracture stress, which is a material constant, and σ_y* is the uniaxial yield stress at the DBTT at the strain rate generated in the Charpy impact test. The procedures for evaluating each of σ* and σ_y* are described, and a result for σ* and σ_y*, and thus the value of α, is presented for different types of miniaturized and full-sized CVN specimens of 2WFK.
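The constraint factor defined above reduces to a ratio of two measured stresses; a minimal sketch (the numerical values in the example are hypothetical, not 2WFK data):

```python
def constraint_factor(sigma_star_mpa, sigma_y_star_mpa):
    """alpha = sigma* / sigma_y*: critical cleavage fracture stress (a
    material constant) divided by the uniaxial yield stress measured at
    the DBTT and at the Charpy-impact strain rate. Comparing alpha
    across specimen sizes indicates plastic constraint loss."""
    return sigma_star_mpa / sigma_y_star_mpa
```

Because σ* is a material constant, differences in α between miniaturized and full-sized specimens trace entirely to the yield stress at the (size-dependent) DBTT.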
Test of the ecological-constraints model on ursine colobus monkeys (Colobus vellerosus) in Ghana.
Teichroeb, Julie A; Sicotte, Pascale
2009-01-01
For group-living mammals, the ecological-constraints model predicts that within-group feeding competition will increase as group size increases, necessitating more daily travel to find food and thereby constraining group size. It provides a useful tool for detecting scramble competition any time it is difficult to determine whether or not food is limiting. We tested the ecological-constraints model on highly folivorous ursine colobus monkeys (Colobus vellerosus) at the Boabeng-Fiema Monkey Sanctuary in Ghana. Three differently sized groups were followed for 13 months and two others were followed for 6 months each in 2004-2005 using focal-animal sampling and ranging scans; ecological plots and phenology surveys were used to determine home-range quality and food availability. There was relatively little difference in home-range quality, monthly food availability, diet, adult female ingestion rates, and rate of travel within food patches between the groups. However, home-range size, day-range length, and percent of time spent feeding all increased with group size. We performed a single large test of the ecological-constraints model by combining several separate Spearman correlations, each testing different predictions under the model, using Fisher's log-likelihood method. It showed that the ecological-constraints model was supported in this study; scramble competition in this population is manifesting in increased ranging and time spent feeding. How costly this increased energy expenditure is for individuals in larger groups remains to be determined. (c) 2008 Wiley-Liss, Inc.
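The combination of several Spearman correlations via Fisher's log-likelihood method uses the statistic X² = −2 Σ ln pᵢ, referred to a chi-square distribution with 2k degrees of freedom. A minimal sketch (the closed-form survival function applies because the degrees of freedom are always even here; function names are assumptions):

```python
import math

def fisher_combined_statistic(p_values):
    # Fisher's method: X^2 = -2 * sum(ln p_i), df = 2k
    return -2.0 * sum(math.log(p) for p in p_values)

def chi2_sf_even_df(x, df):
    # survival function P(X > x) for chi-square with EVEN df:
    # exp(-x/2) * sum_{i=0}^{k-1} (x/2)^i / i!, where k = df/2
    k = df // 2
    term = 1.0
    total = 1.0
    for i in range(1, k):
        term *= (x / 2.0) / i
        total += term
    return math.exp(-x / 2.0) * total

def fisher_combined_pvalue(p_values):
    x = fisher_combined_statistic(p_values)
    return chi2_sf_even_df(x, 2 * len(p_values))
```

A sanity check on the construction: combining a single p-value returns that p-value unchanged, while combining several independent moderate p-values can yield a clearly significant overall result.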
Constraints as a destriping tool for Hires images
NASA Technical Reports Server (NTRS)
Cao, Yu; Prince, Thomas A.
1994-01-01
Images produced from the Maximum Correlation Method (MCM) sometimes suffer from visible striping artifacts, especially in areas of extended sources. Possible causes are different baseline levels and calibration errors in the detectors. We incorporated these factors into the MCM algorithm and tested the effects of different constraints on the output image. The result shows significant visual improvement over the standard MCM method. In some areas the new images show intelligible structures that are otherwise corrupted by striping artifacts, and the removal of these artifacts could enhance the performance of object classification algorithms. The constraints were also tested on low surface brightness areas and were found to be effective in reducing the noise level.
Development of an expert planning system for OSSA
NASA Technical Reports Server (NTRS)
Groundwater, B.; Lembeck, M. F.; Sarsfield, L.; Diaz, Alphonso
1988-01-01
This paper presents concepts related to preliminary work for the development of an expert planning system for NASA's Office of Space Science and Applications (OSSA). The expert system will function as a planner's decision aid in preparing mission plans encompassing sets of proposed OSSA space science initiatives. These plans in turn will be checked against budgetary and technical constraints and tested for constraint violations. Appropriate advice will be generated by the system for making modifications to the plans to bring them in line with the constraints. The OSSA Planning Expert System (OPES) has been designed to function as an integral part of the OSSA mission planning process. It will be able to suggest a best plan, accept and check a user-suggested strawman plan, and provide a quick response to user requests and actions. OPES will be written in the C programming language and have a transparent user interface running under Windows 386 on a Compaq 386/20 machine. The system's stored knowledge and inference procedures will model the expertise of human planners familiar with the OSSA planning domain. Given mission priorities and budget guidelines, the system first sets the launch dates for each mission. It will check to make sure that planetary launch windows and precursor mission relationships are not violated. Additional levels of constraints will then be considered, checking such things as the availability of a suitable launch vehicle, total mission launch mass required vs. the identified launch mass capability, and the total power required by the payload at its destination vs. the actual power available. System output will be in the form of Gantt charts, spreadsheet hardcopy, and other presentation-quality materials detailing the resulting OSSA mission plan.
NASA Astrophysics Data System (ADS)
Liao, Haitao; Wu, Wenwang; Fang, Daining
2018-07-01
A coupled approach combining the reduced space Sequential Quadratic Programming (SQP) method with the harmonic balance condensation technique for finding the worst resonance response is developed. The nonlinear equality constraints of the optimization problem are imposed on the condensed harmonic balance equations. Making use of the null space decomposition technique, the original optimization formulation in the full space is mathematically simplified, and solved in the reduced space by means of the reduced SQP method. The transformation matrix that maps the full space to the null space of the constrained optimization problem is constructed via the coordinate basis scheme. The removal of the nonlinear equality constraints is accomplished, resulting in a simple optimization problem subject to bound constraints. Moreover, second order correction technique is introduced to overcome Maratos effect. The combination application of the reduced SQP method and condensation technique permits a large reduction of the computational cost. Finally, the effectiveness and applicability of the proposed methodology is demonstrated by two numerical examples.
Spacecraft rendezvous operational considerations affecting vehicle systems design and configuration
NASA Astrophysics Data System (ADS)
Prust, Ellen E.
One lesson learned from Orbiting Maneuvering Vehicle (OMV) program experience is that Design Reference Missions must include an appropriate balance of operations and performance inputs to effectively drive vehicle systems design and configuration. Rendezvous trajectory design is based on vehicle characteristics (e.g., mass, propellant tank size, and mission duration capability) and operational requirements, which have evolved through the Gemini, Apollo, and STS programs. Operational constraints affecting the rendezvous final approach are summarized. The two major objectives of operational rendezvous design are vehicle/crew safety and mission success. Operational requirements on the final approach which support these objectives include: tracking/targeting/communications; trajectory dispersion and navigation uncertainty handling; contingency protection; favorable sunlight conditions; acceptable relative state for proximity operations handover; and compliance with target vehicle constraints. A discussion of the ways each of these requirements may constrain the rendezvous trajectory follows. Although the constraints discussed apply to all rendezvous, the trajectory presented in 'Cargo Transfer Vehicle Preliminary Reference Definition' (MSFC, May 1991) was used as the basis for the comments below.
NASA Technical Reports Server (NTRS)
1972-01-01
Current research is reported on precise and accurate descriptions of the earth's surface and gravitational field and on time variations of geophysical parameters. A new computer program was written in connection with the adjustment of the BC-4 worldwide geometric satellite triangulation net. The possibility that an increment to accuracy could be transferred from a super-control net to the basic geodetic (first-order triangulation) was investigated. Coordinates of the NA9 solution were computed and were transformed to the NAD datum, based on GEOS 1 observations. Normal equations from observational data of several different systems and constraint equations were added and a single solution was obtained for the combined systems. Transformation parameters with constraints were determined, and the impact of computers on surveying and mapping is discussed.
Programming Capital Improvements. Coping With Growth.
ERIC Educational Resources Information Center
Meyer, Neil L.
Capital improvements programming is one financial management technique for providing public services within the constraints of limited financial resources, a particular problem for communities experiencing rapid population growth. Long-range planning and improvement of public facilities for water supply, sewage treatment, parks and recreation,…
Mass transit : FTA could relieve New Starts program funding constraints
DOT National Transportation Integrated Search
2001-08-01
The Transportation Equity Act for the 21st Century (TEA-21) authorized $6 billion in "guaranteed" funding for the New Starts program (full funding grant agreements to help pay for certain rail, bus, and trolley projects) through fiscal year 2003. The Fed...
Albuquerque Principals Have ESP.
ERIC Educational Resources Information Center
Weingartner, Carl J.
2001-01-01
In the mid-nineties, Albuquerque Public Schools developed an Extra Support for Principals initiative, a voluntary support program that respects participants' time constraints and schedules only three activities during the year. Both experienced and mentored principals value the program, which keeps more beginning administrators on the job. (MLH)
Standardized development of computer software. Part 1: Methods
NASA Technical Reports Server (NTRS)
Tausworthe, R. C.
1976-01-01
This work is a two-volume set on standards for modern software engineering methodology. This volume presents a tutorial and practical guide to the efficient development of reliable computer software, a unified and coordinated discipline for design, coding, testing, documentation, and project organization and management. The aim of the monograph is to provide formal disciplines for increasing the probability of securing software that is characterized by high degrees of initial correctness, readability, and maintainability, and to promote practices which aid in the consistent and orderly development of a total software system within schedule and budgetary constraints. These disciplines are set forth as a set of rules to be applied during software development to drastically reduce the time traditionally spent in debugging, to increase documentation quality, to foster understandability among those who must come in contact with it, and to facilitate operations and alterations of the program as requirements on the program environment change.
Performance optimization of helicopter rotor blades
NASA Technical Reports Server (NTRS)
Walsh, Joanne L.
1991-01-01
As part of a center-wide activity at NASA Langley Research Center to develop multidisciplinary design procedures by accounting for discipline interactions, a performance design optimization procedure is developed. The procedure optimizes the aerodynamic performance of rotor blades by selecting the point of taper initiation, root chord, taper ratio, and maximum twist which minimize hover horsepower while not degrading forward flight performance. The procedure uses HOVT (a strip theory momentum analysis) to compute the horsepower required for hover and the comprehensive helicopter analysis program CAMRAD to compute the horsepower required for forward flight and maneuver. The optimization algorithm consists of the general-purpose optimization program CONMIN and approximate analyses. Sensitivity analyses consisting of derivatives of the objective function and constraints are carried out by forward finite differences. The procedure is applied to a test problem which is an analytical model of a wind tunnel model of a utility rotor blade.
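The forward-finite-difference sensitivity analysis described above can be sketched generically; the function name and step size are assumptions, and CONMIN itself is not reproduced:

```python
def forward_difference_gradient(f, x, h=1e-6):
    """Approximate the gradient of f at point x by forward finite
    differences: df/dx_i ~= (f(x + h*e_i) - f(x)) / h. This is the kind
    of derivative of objective and constraint functions fed to a
    gradient-based optimizer such as CONMIN."""
    f0 = f(x)
    grad = []
    for i in range(len(x)):
        xp = list(x)
        xp[i] += h  # perturb one design variable at a time
        grad.append((f(xp) - f0) / h)
    return grad
```

Forward differences need only one extra function evaluation per design variable (n + 1 total), which matters when each evaluation is a full hover or forward-flight analysis; the price is O(h) truncation error versus O(h²) for central differences.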
NASA Technical Reports Server (NTRS)
Wagenknecht, C. D.; Bediako, E. D.
1985-01-01
Advanced Supersonic Transport jet noise may be reduced to Federal Aviation Regulation limits if recommended refinements to a recently developed ejector shroud exhaust system are successfully carried out. A two-part program, consisting of a design study and a subscale model wind tunnel test effort, conducted to define an acoustically treated ejector shroud exhaust system for supersonic transport application is described. Coannular, 20-chute, and ejector shroud exhaust systems were evaluated. Program results were used in a mission analysis study to determine aircraft takeoff gross weight to perform a nominal design mission, under Federal Aviation Regulation (1969), Part 36, Stage 3 noise constraints. Mission trade study results confirmed that the ejector shroud was the best of the three exhaust systems studied, with a significant takeoff gross weight advantage over the 20-chute suppressor nozzle, which was the second best.
Veterinary medical considerations for the use of nonhuman primates in space research
NASA Technical Reports Server (NTRS)
Simmonds, R. C.
1977-01-01
The validity of biomedical research using animal subjects is highly dependent on the use of 'normal' and healthy animals. The current costs of research programs dictate that a minimum number of animals and test replicates be used to obtain the desired data. The use of healthy and standardized animals increases the probability of obtaining valid data while also permitting greater economy by reducing the between-individual variation, thus allowing the use of fewer animals. Areas of concern when planning animal payloads include constraints of the flight on candidate species selection, screening for physiological and psychological normalcy, procedures for routine care and quarantine of new animals and those returning from space, ground-based studies to determine experimental protocol, selection of instrumentation, stress during transportation for flight operations, housing and care facilities at launch and recovery sites, and the overall veterinary program.
NCMS PWB Surface Finishes Team project summary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kokas, J.; DeSantis, C.; Wenger, G.
1996-04-01
The NCMS PWB Surface Finishes Consortium is just about at the end of its five-year program. Dozens of projects related to surface finishes and PWB solderability were performed by the team throughout the program, and many of them are listed in this paper with a cross reference to where and when a technical paper was presented describing the results of the research. However, due to time and space constraints, this paper can summarize the details of only three of the major research projects accomplished by the team. The first project described is an "Evaluation of PWB Surface Finishes." It describes the solderability, reliability, and wire bondability of numerous surface finishes. The second project outlined is an "Evaluation of PWB Solderability Test Methods." The third project outlined is the "Development and Evaluation of Organic Solderability Preservatives."
ERIC Educational Resources Information Center
Alhassan, Munkaila; Habib, Abdallah Mohammed
2016-01-01
Polytechnics in Ghana view Competency Based Training (CBT) as a major intervention to address the perennial constraints confronting their education and training. On this basis, and by government policy, a pilot programme of CBT was instituted in all 10 polytechnics of Ghana and was pilot tested in at least one department. Agricultural…
Fabrication of MgF2 and LiF windows for the Hubble Space Telescope Imaging Spectrograph
NASA Technical Reports Server (NTRS)
Gormley, Daphne; Bottema, Murk; Darnell, Barbara; Fowler, Walter; Medenica, Walter
1988-01-01
Two prototype test windows (MgF2 and LiF) to be used on the 75-mm UV MAMA detector tubes for the Hubble Space Telescope Imaging Spectrograph are described. The spatial and optical constraints of this instrument dictate that the thickness of the window materials be no greater than 2-3 mm to achieve a minimum 50-percent transmission at hydrogen Lyman alpha (121.6 nm), and that the window must be domed to minimize optical aberrations and provide structural strength. The detector window has an input diameter of about 100 mm with a radius-of-curvature of 70 mm. The manufacturing processes involved in the fabrication of these windows are discussed, as well as the test programs (optical and structural) to be performed at Goddard Space Flight Center.
Accomplishments and economic evaluations of the Forestry Incentives Program: A review
Deborah A. Gaddis; Barry D. New; Fredrick W. Cubbage; Robert C. Abt; Robert J. Moulton
1995-01-01
The Forestry Incentives Program (FIP) is a federal financial cost-share program that is intended to increase the nation's timber supply by increasing tree planting and timber stand improvement on nonindustrial private forest lands. Timber harvest reductions on public lands in the West, environmental constraints on private lands throughout the U.S., and increased...
Planning the Fire Program for the Third Millennium
Richard A. Chase
1987-01-01
The fire program planner faces an increasingly complex task as diverse--and often contradictory--messages about objectives and constraints are received from political, administrative, budgetary, and social processes. Our principal challenge as we move into the 21st century is not one of looking for flashier technology to include in the planned fire program. Rather, we...
Solving intuitionistic fuzzy multi-objective nonlinear programming problem
NASA Astrophysics Data System (ADS)
Anuradha, D.; Sobana, V. E.
2017-11-01
This paper presents an intuitionistic fuzzy multi-objective nonlinear programming problem (IFMONLPP). All the coefficients of the multi-objective nonlinear programming problem (MONLPP) and the constraints are taken to be intuitionistic fuzzy numbers (IFN). The IFMONLPP is transformed into a crisp one and solved using the Kuhn-Tucker conditions. A numerical example is provided to illustrate the approach.
Experimental constraints on metric and non-metric theories of gravity
NASA Technical Reports Server (NTRS)
Will, Clifford M.
1989-01-01
Experimental constraints on metric and non-metric theories of gravitation are reviewed. Tests of the Einstein Equivalence Principle indicate that only metric theories of gravity are likely to be viable. Solar system experiments constrain the parameters of the weak field, post-Newtonian limit to be close to the values predicted by general relativity. Future space experiments will provide further constraints on post-Newtonian gravity.
HOROPLAN: computer-assisted nurse scheduling using constraint-based programming.
Darmoni, S J; Fajner, A; Mahé, N; Leforestier, A; Vondracek, M; Stelian, O; Baldenweck, M
1995-01-01
Nurse scheduling is a difficult and time-consuming task. The schedule has to determine the day-to-day shift assignments of each nurse for a specified period of time in a way that satisfies the given requirements as much as possible, taking into account the wishes of nurses as closely as possible. This paper presents a constraint-based, artificial intelligence approach by describing a prototype implementation developed with the Charme language and the first results of its use in the Rouen University Hospital. Horoplan implements a non-cyclical constraint-based scheduling, using some heuristics. Four levels of constraints were defined to give a maximum of flexibility: French level (e.g. number of worked hours in a year), hospital level (e.g. specific day-off), department level (e.g. specific shift) and care unit level (e.g. specific pattern for week-ends). Some constraints must always be verified and cannot be overruled, while others can be overruled at a certain cost. Rescheduling is possible at any time, especially in the case of an unscheduled absence.
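The distinction between constraints that must always hold and constraints that can be overruled at a cost can be sketched as follows (a hypothetical Python illustration, not the Charme implementation; the example constraints and cost are invented):

```python
# A schedule is checked against hard constraints (must hold) and soft
# constraints, each carrying a cost if overruled; the total overrule
# cost scores the schedule, and None marks infeasibility.
def evaluate(schedule, hard, soft):
    for check in hard:
        if not check(schedule):
            return None          # a hard constraint is violated: infeasible
    return sum(cost for check, cost in soft if not check(schedule))

# Toy week for one nurse: at most 5 worked shifts is a hard (legal-level)
# constraint; a free weekend is a soft (care-unit-level) preference.
week = ["day", "day", "off", "night", "day", "day", "off"]
hard = [lambda s: sum(d != "off" for d in s) <= 5]
soft = [(lambda s: s[5] == "off" and s[6] == "off", 10)]
score = evaluate(week, hard, soft)  # 5 shifts OK, weekend not free: cost 10
```

A scheduler then searches over assignments to minimize this cost, rather than rejecting every imperfect schedule outright.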
Minimum weight design of rectangular and tapered helicopter rotor blades with frequency constraints
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Walsh, Joanne L.
1988-01-01
The minimum weight design of a helicopter rotor blade subject to constraints on coupled flap-lag natural frequencies has been studied. A constraint has also been imposed on the minimum value of the autorotational inertia of the blade in order to ensure that it has sufficient inertia to autorotate in the case of engine failure. The program CAMRAD is used for the blade modal analysis and CONMIN is used for the optimization. In addition, a linear approximation analysis involving Taylor series expansion has been used to reduce the analysis effort. The procedure contains a sensitivity analysis which consists of analytical derivatives of the objective function and the autorotational inertia constraint and central finite difference derivatives of the frequency constraints. Optimum designs have been obtained for both rectangular and tapered blades. Design variables include taper ratio, segment weights, and box beam dimensions. It is shown that even when starting with an acceptable baseline design, a significant amount of weight reduction is possible while satisfying all the constraints for both rectangular and tapered blades.
A set-covering based heuristic algorithm for the periodic vehicle routing problem.
Cacchiani, V; Hemmelmayr, V C; Tricoire, F
2014-01-30
We present a hybrid optimization algorithm for mixed-integer linear programming, embedding both heuristic and exact components. In order to validate it we use the periodic vehicle routing problem (PVRP) as a case study. This problem consists of determining a set of minimum cost routes for each day of a given planning horizon, with the constraints that each customer must be visited a required number of times (chosen among a set of valid day combinations), must receive every time the required quantity of product, and that the number of routes per day (each respecting the capacity of the vehicle) does not exceed the total number of available vehicles. This is a generalization of the well-known vehicle routing problem (VRP). Our algorithm is based on the linear programming (LP) relaxation of a set-covering-like integer linear programming formulation of the problem, with additional constraints. The LP-relaxation is solved by column generation, where columns are generated heuristically by an iterated local search algorithm. The whole solution method takes advantage of the LP-solution and applies techniques of fixing and releasing of the columns as a local search, making use of a tabu list to avoid cycling. We show the results of the proposed algorithm on benchmark instances from the literature and compare them to the state-of-the-art algorithms, showing the effectiveness of our approach in producing good quality solutions. In addition, we report the results on realistic instances of the PVRP introduced in Pacheco et al. (2011) [24] and on benchmark instances of the periodic traveling salesman problem (PTSP), showing the efficacy of the proposed algorithm on these as well. Finally, we report the new best known solutions found for all the tested problems.
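To make the set-covering view concrete, here is a toy instance solved by brute force; in the paper the columns are vehicle routes priced out by column generation, which this sketch does not attempt (the routes and costs are invented):

```python
from itertools import combinations

# Tiny set-covering problem: pick a minimum-cost subset of columns
# (cost, set-of-covered-customers) whose union covers every customer.
def min_cost_cover(universe, columns):
    best = None
    for r in range(1, len(columns) + 1):
        for combo in combinations(columns, r):
            if set().union(*(s for _, s in combo)) >= universe:
                cost = sum(c for c, _ in combo)
                if best is None or cost < best:
                    best = cost
    return best

# Customers {1, 2, 3}; three candidate routes with costs.
routes = [(4, frozenset({1, 2})), (3, frozenset({2, 3})), (2, frozenset({1}))]
best = min_cost_cover({1, 2, 3}, routes)  # second and third routes: 3 + 2 = 5
```

Brute force is exponential in the number of columns; the LP relaxation plus column generation used in the paper is what makes the same formulation workable at realistic scale.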
Image-optimized Coronal Magnetic Field Models
NASA Astrophysics Data System (ADS)
Jones, Shaela I.; Uritsky, Vadim; Davila, Joseph M.
2017-08-01
We have reported previously on a new method we are developing for using image-based information to improve global coronal magnetic field models. In that work, we presented early tests of the method, which proved its capability to improve global models based on flawed synoptic magnetograms, given excellent constraints on the field in the model volume. In this follow-up paper, we present the results of similar tests given field constraints of a nature that could realistically be obtained from quality white-light coronagraph images of the lower corona. We pay particular attention to difficulties associated with the line-of-sight projection of features outside of the assumed coronagraph image plane and the effect on the outcome of the optimization of errors in the localization of constraints. We find that substantial improvement in the model field can be achieved with these types of constraints, even when magnetic features in the images are located outside of the image plane.
NASA Technical Reports Server (NTRS)
McNamara, Luke W.; Braun, Robert D.
2014-01-01
One of the key design objectives of NASA's Orion Exploration Mission 1 (EM-1) is to execute a guided entry trajectory demonstrating GN&C capability. The focus of this paper is defining the flyable entry corridor for EM-1, taking into account multiple subsystem constraints such as complex aerothermal heating constraints, aerothermal heating objectives, landing accuracy constraints, structural load limits, Human-System Integration Requirements, Service Module debris disposal limits, and other flight test objectives. During EM-1 Design Analysis Cycle 1, design challenges arose that made defining the flyable entry corridor critical to mission success. This document details the optimization techniques that were explored with the 6-DOF ANTARES simulation to assist in defining the design entry interface state and entry corridor with respect to key flight test constraints and objectives.
Time management displays for shuttle countdown
NASA Technical Reports Server (NTRS)
Beller, Arthur E.; Hadaller, H. Greg; Ricci, Mark J.
1992-01-01
The Intelligent Launch Decision Support System project is developing a Time Management System (TMS) for the NASA Test Director (NTD) to use for time management during Shuttle terminal countdown. TMS is being developed in three phases: an information phase; a tool phase; and an advisor phase. The information phase is an integrated display (TMID) of firing room clocks, of graphic timelines with Ground Launch Sequencer events, and of constraints. The tool phase is a what-if spreadsheet (TMWI) for devising plans for resuming from unplanned hold situations. It is tied to information in TMID, propagates constraints forward and backward to complete unspecified values, and checks the plan against constraints. The advisor phase is a situation advisor (TMSA), which proactively suggests tactics. A concept prototype for TMSA is under development. The TMID is currently undergoing field testing. Displays for TMID and TMWI are described. Descriptions include organization, rationale for organization, implementation choices and constraints, and use by NTD.
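The forward and backward constraint propagation that TMWI performs can be sketched as follows (a hypothetical simplification: countdown milestones linked by fixed durations, with unknown clock values filled in from any known one):

```python
# times[i] is the known seconds-to-go for milestone i, or None if unspecified;
# durations[i] is the fixed interval between milestones i and i+1.
# Propagation completes the timeline in both directions from known values.
def propagate(times, durations):
    n = len(times)
    for i in range(n - 1):                      # forward pass
        if times[i] is not None and times[i + 1] is None:
            times[i + 1] = times[i] - durations[i]
    for i in range(n - 1, 0, -1):               # backward pass
        if times[i] is not None and times[i - 1] is None:
            times[i - 1] = times[i] + durations[i - 1]
    return times

# Only the middle milestone (T-9:00, i.e. 540 s) is fixed; the rest follow.
t = propagate([None, 540, None], [60, 240])  # -> [600, 540, 300]
```

In a real what-if spreadsheet each filled-in value would then be checked against hold and window constraints, flagging plans that violate them.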
NASA Technical Reports Server (NTRS)
Short, Nick, Jr.; Bedet, Jean-Jacques; Bodden, Lee; Boddy, Mark; White, Jim; Beane, John
1994-01-01
The Goddard Space Flight Center (GSFC) Distributed Active Archive Center (DAAC) has been operational since October 1, 1993. Its mission is to support the Earth Observing System (EOS) by providing rapid access to EOS data and analysis products, and to test Earth Observing System Data and Information System (EOSDIS) design concepts. One of the challenges is to ensure quick and easy retrieval of any data archived within the DAAC's Data Archive and Distributed System (DADS). Over the 15-year life of the EOS project, an estimated several petabytes (10^15 bytes) of data will be permanently stored. Accessing that amount of information is a formidable task that will require innovative approaches. As a precursor of the full EOS system, the GSFC DAAC, with a few terabits of storage, has implemented a prototype of a constraint-based task and resource scheduler to improve the performance of the DADS. This Honeywell Task and Resource Scheduler (HTRS), developed by the Honeywell Technology Center in cooperation with the Information Science and Technology Branch/935, the Code X Operations Technology Program, and the GSFC DAAC, makes better use of limited resources, prevents backlogs of data, and provides information about resource bottlenecks and performance characteristics. The prototype, which is developed concurrently with the GSFC Version 0 (V0) DADS, models DADS activities such as ingestion and distribution with priority, precedence, resource requirements (disk and network bandwidth) and temporal constraints. HTRS supports schedule updates, insertions, and retrieval of task information via an Application Program Interface (API). The prototype has demonstrated, with a few examples, the substantial advantages of using HTRS over scheduling algorithms such as a First In First Out (FIFO) queue.
The kernel scheduling engine for HTRS, called Kronos, has been successfully applied to several other domains such as space shuttle mission scheduling, demand flow manufacturing, and avionics communications scheduling.
A chance-constrained stochastic approach to intermodal container routing problems.
Zhao, Yi; Liu, Ronghui; Zhang, Xi; Whiteing, Anthony
2018-01-01
We consider a container routing problem with stochastic time variables in a sea-rail intermodal transportation system. The problem is formulated as a binary integer chance-constrained programming model including stochastic travel times and stochastic transfer time, with the objective of minimising the expected total cost. Two chance constraints are proposed to ensure that the container service satisfies ship fulfilment and cargo on-time delivery with pre-specified probabilities. A hybrid heuristic algorithm is employed to solve the binary integer chance-constrained programming model. Two case studies are conducted to demonstrate the feasibility of the proposed model and to analyse the impact of stochastic variables and chance constraints on the optimal solution and total cost.
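A chance constraint of this kind, P(delivery time <= deadline) >= alpha, can be checked by Monte Carlo sampling (an illustrative sketch with an assumed travel-time distribution, not the paper's hybrid heuristic):

```python
import random

# Estimate P(travel_time <= deadline) by sampling and compare it to the
# required service level alpha; a fixed seed keeps the check reproducible.
def satisfies_chance_constraint(sample_time, deadline, alpha, n=20000, seed=7):
    rng = random.Random(seed)
    on_time = sum(sample_time(rng) <= deadline for _ in range(n))
    return on_time / n >= alpha

# Assumed route: 40 h fixed sailing + normally distributed delay (mean 5, sd 2).
ok = satisfies_chance_constraint(
    lambda rng: 40 + rng.gauss(5, 2), deadline=50, alpha=0.95)
```

Tightening the deadline to 45 h drops the on-time probability to about 0.5, so the same constraint would reject the route.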
NASA Technical Reports Server (NTRS)
Izygon, Michel
1992-01-01
This report summarizes the findings and lessons learned from the development of an intelligent user interface for a space flight planning simulation program, in the specific area related to constraint-checking. The different functionalities of the Graphical User Interface part and of the rule-based part of the system have been identified. Their respective domains of applicability for error prevention and error checking have been specified.
Cochran, John K
2017-08-01
Recently, Robert Agnew introduced a new general theory of crime and delinquency in which he attempted to corral the vast array of theoretical "causes" of criminal conduct into a more parsimonious statement organized around five life domains: self, family, peers, school, and work, as well as constraints against crime and motivations for it. These domains are depicted as the source of constraints and motivations, and their effects are, in part, mediated by these constraints and motivations. Based on self-report data on academic dishonesty from a sample of college students, the present study attempts to test this general theory. While several of the life domain variables had significant effects on cheating in the baseline model, all of these effects were fully mediated by constraints and motivations. In the final model, academic dishonesty was observed to be most significantly affected by the perceived severity of formal sanction threats, the number of credit hours enrolled, the frequency of skipping classes, and pressure from friends.
Statistical learning of novel graphotactic constraints in children and adults.
Samara, Anna; Caravolas, Markéta
2014-05-01
The current study explored statistical learning processes in the acquisition of orthographic knowledge in school-aged children and skilled adults. Learning of novel graphotactic constraints on the position and context of letter distributions was induced by means of a two-phase learning task adapted from Onishi, Chambers, and Fisher (Cognition, 83 (2002) B13-B23). Following incidental exposure to pattern-embedding stimuli in Phase 1, participants' learning generalization was tested in Phase 2 with legality judgments about novel conforming/nonconforming word-like strings. Test phase performance was above chance, suggesting that both types of constraints were reliably learned even after relatively brief exposure. As hypothesized, signal detection theory d' analyses confirmed that learning permissible letter positions (d'=0.97) was easier than permissible neighboring letter contexts (d'=0.19). Adults were more accurate than children in all but a strict analysis of the contextual constraints condition. Consistent with the statistical learning perspective in literacy, our results suggest that statistical learning mechanisms contribute to children's and adults' acquisition of knowledge about graphotactic constraints similar to those existing in their orthography. Copyright © 2013 Elsevier Inc. All rights reserved.
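The d' measure used to compare the two constraint types comes from signal detection theory; a minimal computation (with invented trial counts, not the study's data) looks like this:

```python
from statistics import NormalDist

# d' = z(hit rate) - z(false-alarm rate), where z is the inverse of the
# standard normal CDF; it separates sensitivity from response bias.
def d_prime(hits, misses, false_alarms, correct_rejections):
    z = NormalDist().inv_cdf
    hit_rate = hits / (hits + misses)
    fa_rate = false_alarms / (false_alarms + correct_rejections)
    return z(hit_rate) - z(fa_rate)

# Illustrative legality-judgment counts: 69% hits vs. 31% false alarms.
d = d_prime(69, 31, 31, 69)  # close to 1, i.e. clearly above chance
```

A d' near 0 indicates chance performance, which is why the positional constraints (d' = 0.97) were learned far more reliably than the contextual ones (d' = 0.19).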
Conservation of Mass and Preservation of Positivity with Ensemble-Type Kalman Filter Algorithms
NASA Technical Reports Server (NTRS)
Janjic, Tijana; Mclaughlin, Dennis; Cohn, Stephen E.; Verlaan, Martin
2014-01-01
This paper considers the incorporation of constraints to enforce physically based conservation laws in the ensemble Kalman filter. In particular, constraints are used to ensure that the ensemble members and the ensemble mean conserve mass and remain nonnegative through measurement updates. In certain situations filtering algorithms such as the ensemble Kalman filter (EnKF) and ensemble transform Kalman filter (ETKF) yield updated ensembles that conserve mass but are negative, even though the actual states must be nonnegative. In such situations if negative values are set to zero, or a log transform is introduced, the total mass will not be conserved. In this study, mass and positivity are both preserved by formulating the filter update as a set of quadratic programming problems that incorporate non-negativity constraints. Simple numerical experiments indicate that this approach can have a significant positive impact on the posterior ensemble distribution, giving results that are more physically plausible both for individual ensemble members and for the ensemble mean. In two examples, an update that includes a non-negativity constraint is able to properly describe the transport of a sharp feature (e.g., a triangle or cone). A number of implementation questions still need to be addressed, particularly the need to develop a computationally efficient quadratic programming update for large ensembles.
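The simplest instance of such a constrained update is the Euclidean projection of a state vector onto the set of nonnegative vectors with a fixed total mass; this particular quadratic program has a closed-form solution via sorting (a sketch of that special case only, not the full filter update):

```python
# Project v onto {x : x >= 0, sum(x) = m}: subtract a common threshold theta
# from every component and clip at zero, choosing theta so the mass is m.
def project_mass_nonneg(v, m):
    u = sorted(v, reverse=True)
    css, theta = 0.0, 0.0
    for i, ui in enumerate(u, start=1):
        css += ui
        t = (css - m) / i
        if ui - t > 0:
            theta = t
    return [max(x - theta, 0.0) for x in v]

# A mass-conserving but negative ensemble member, repaired: the result
# still sums to 1.0 and every component is nonnegative.
x = project_mass_nonneg([1.2, -0.4, 0.2], m=1.0)
```

Simply zeroing the negative entry would have raised the total mass to 1.4, which is exactly the failure mode the constrained update avoids.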
Linear programming: an alternative approach for developing formulations for emergency food products.
Sheibani, Ershad; Dabbagh Moghaddam, Arasb; Sharifan, Anousheh; Afshari, Zahra
2018-03-01
To minimize the mortality rates of individuals affected by disasters, providing high-quality food relief during the initial stages of an emergency is crucial. The goal of this study was to develop a formulation for a high-energy, nutrient-dense prototype using a linear programming (LP) model as a novel method for developing formulations for food products. The model consisted of the objective function and the decision variables, which were the formulation costs and weights of the selected commodities, respectively. The LP constraints were the Institute of Medicine and the World Health Organization specifications of the content of nutrients in the product. Other constraints related to the product's sensory properties were also introduced to the model. Nonlinear constraints for energy ratios of nutrients were linearized to allow their use in the LP. Three focus group studies were conducted to evaluate the palatability and other aspects of the optimized formulation. New constraints were introduced to the LP model based on the focus group evaluations to improve the formulation. LP is an appropriate tool for designing formulations of food products to meet a set of nutritional requirements. This method is an excellent alternative to the traditional 'trial and error' method in designing formulations. © 2017 Society of Chemical Industry.
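A two-ingredient toy version of such an LP (invented costs and nutrient coefficients, not the study's commodity data) can be solved by enumerating constraint-boundary intersections, since an LP optimum always lies at a vertex of the feasible region:

```python
from itertools import combinations

# Minimise cost subject to nutrient floors: each constraint (a1, a2, b)
# means a1*x + a2*y >= b, with ingredient weights x, y >= 0.
def solve_2var_lp(cost, constraints):
    lines = constraints + [(1, 0, 0), (0, 1, 0)]     # axes as boundaries
    best = None
    for (a1, a2, b), (c1, c2, d) in combinations(lines, 2):
        det = a1 * c2 - a2 * c1
        if abs(det) < 1e-12:
            continue                                  # parallel boundaries
        x = (b * c2 - a2 * d) / det                   # Cramer's rule
        y = (a1 * d - b * c1) / det
        if x >= -1e-9 and y >= -1e-9 and all(
                p * x + q * y >= r - 1e-9 for p, q, r in constraints):
            value = cost[0] * x + cost[1] * y
            if best is None or value < best:
                best = value
    return best

# min 2x + 3y  s.t.  x + 2y >= 4 (energy floor),  3x + y >= 6 (protein floor)
best = solve_2var_lp((2, 3), [(1, 2, 4), (3, 1, 6)])  # optimum at x=1.6, y=1.2
```

Real formulation problems have many commodities and constraints, so a simplex or interior-point solver replaces this enumeration, but the structure is the same.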
An innovative approach to compensator design
NASA Technical Reports Server (NTRS)
Mitchell, J. R.
1972-01-01
The primary goal is to present a computer-aided compensator design technique for a control system from a frequency domain point of view. The thesis underlying this technique is to describe the open-loop frequency response by n discrete frequency points, which result in n functions of the compensator coefficients. Several of these functions are chosen so that the system specifications are properly portrayed; then mathematical programming is used to improve all of those functions whose values fall below minimum standards. To do this, several definitions regarding the measurement of system performance in the frequency domain are given. Next, theorems which govern the number of compensator coefficients necessary to make improvements in a certain number of functions are proved. After this, a mathematical programming tool for aiding in the solution of the problem is developed. Then, for applying the constraint improvement algorithm, generalized gradients for the constraints are derived. Finally, the necessary theory is incorporated in a computer program called CIP (compensator improvement program).
Dynamic visualization techniques for high consequence software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pollock, G.M.
1998-02-01
This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification. The prototype tool is described along with the requirements constraint language after a brief literature review is presented. Examples of how the tool can be used are also presented. In conclusion, the most significant advantage of this tool is to provide a first step in evaluating specification completeness, and to provide a more productive method for program comprehension and debugging. The expected payoff is increased software surety confidence, increased program comprehension, and reduced development and debugging time.
NASA Astrophysics Data System (ADS)
Kassa, Semu Mitiku; Tsegay, Teklay Hailay
2017-08-01
Tri-level optimization problems are optimization problems with three nested hierarchical structures, where in most cases conflicting objectives are set at each level of the hierarchy. Such problems are common in management, engineering design and in decision-making situations in general, and are known to be strongly NP-hard. Existing solution methods lack universality in solving these types of problems. In this paper, we investigate a tri-level programming problem with quadratic fractional objective functions at each of the three levels. A solution algorithm has been proposed by applying a fuzzy goal programming approach and by reformulating the fractional constraints to equivalent but non-fractional non-linear constraints. Based on the transformed formulation, an iterative procedure is developed that can yield a satisfactory solution to the tri-level problem. The numerical results on various illustrative examples demonstrated that the proposed algorithm is very promising and can also be used to solve larger-sized as well as n-level problems of similar structure.
Solar electric geocentric transfer with attitude constraints: Analysis
NASA Technical Reports Server (NTRS)
Sackett, L. L.; Malchow, H. L.; Delbaum, T. N.
1975-01-01
A time optimal or nearly time optimal trajectory program was developed for solar electric geocentric transfer with or without attitude constraints and with an optional initial high thrust stage. The method of averaging reduces computation time. A nonsingular set of orbital elements is used. The constraints, which are those of one of the SERT-C designs, introduce complexities into the analysis, and the solution yields possible discontinuous changes in thrust direction. The power degradation due to Van Allen radiation is modeled analytically. A wide range of solar cell characteristics is assumed. Effects such as oblateness and shadowing are included. The analysis and the results of many example runs are included.
Artificial bee colony algorithm for constrained possibilistic portfolio optimization problem
NASA Astrophysics Data System (ADS)
Chen, Wei
2015-07-01
In this paper, we discuss the portfolio optimization problem with real-world constraints under the assumption that the returns of risky assets are fuzzy numbers. A new possibilistic mean-semiabsolute deviation model is proposed, in which transaction costs, cardinality and quantity constraints are considered. Due to such constraints the proposed model becomes a mixed integer nonlinear programming problem and traditional optimization methods fail to find the optimal solution efficiently. Thus, a modified artificial bee colony (MABC) algorithm is developed to solve the corresponding optimization problem. Finally, a numerical example is given to illustrate the effectiveness of the proposed model and the corresponding algorithm.
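A minimal sketch of the bee-colony idea, with invented data and a drastically simplified mean-risk objective standing in for the paper's possibilistic mean-semiabsolute deviation model: candidate portfolios are repaired to satisfy the cardinality and budget constraints, and an artificial-bee-colony-style loop of employed, onlooker, and scout phases improves them. All asset figures and parameters here are illustrative.

```python
import random

# Toy data: expected returns and a diagonal "risk" proxy for four assets.
RETURNS = [0.10, 0.12, 0.07, 0.09]
RISKS   = [0.08, 0.12, 0.03, 0.05]
K = 2            # cardinality: at most K assets held
LAMBDA = 2.0     # risk-aversion weight

def repair(w):
    # Enforce cardinality (keep the K largest weights) and normalize to sum 1.
    w = [max(0.0, x) for x in w]
    idx = set(sorted(range(len(w)), key=lambda i: -w[i])[:K])
    v = [w[i] if i in idx else 0.0 for i in range(len(w))]
    s = sum(v)
    if s > 0:
        return [x / s for x in v]
    return [1.0 / K if i in idx else 0.0 for i in range(len(w))]

def cost(w):
    ret = sum(wi * ri for wi, ri in zip(w, RETURNS))
    risk = sum(wi * si for wi, si in zip(w, RISKS))
    return LAMBDA * risk - ret    # minimize risk minus return

def abc_portfolio(n_bees=10, iters=200, seed=1):
    rng = random.Random(seed)
    food = [repair([rng.random() for _ in RETURNS]) for _ in range(n_bees)]
    food[0] = repair([1.0, 0.0, 1.0, 0.0])   # one deterministic source for reproducibility
    trials = [0] * n_bees
    best = min(food, key=cost)
    for _ in range(iters):
        for i in range(n_bees):          # employed + onlooker phases (merged sketch)
            cand = repair([w + rng.uniform(-0.1, 0.1) for w in food[i]])
            if cost(cand) < cost(food[i]):
                food[i], trials[i] = cand, 0
            else:
                trials[i] += 1
            if trials[i] > 20:           # scout phase: abandon a stagnant source
                food[i] = repair([rng.random() for _ in RETURNS])
                trials[i] = 0
        best = min([best] + food, key=cost)
    return best

w = abc_portfolio()
```

The repair step is one common way to keep a population method inside a mixed-integer feasible region; the paper's MABC uses richer constraint handling.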
A novel approach based on preference-based index for interval bilevel linear programming problem.
Ren, Aihong; Wang, Yuping; Xue, Xingsi
2017-01-01
This paper proposes a new methodology for solving the interval bilevel linear programming problem in which all coefficients of both objective functions and constraints are considered as interval numbers. In order to keep as much uncertainty of the original constraint region as possible, the original problem is first converted into an interval bilevel programming problem with interval coefficients in both objective functions only through normal variation of interval number and chance-constrained programming. With the consideration of different preferences of different decision makers, the concept of the preference level that the interval objective function is preferred to a target interval is defined based on the preference-based index. Then a preference-based deterministic bilevel programming problem is constructed in terms of the preference level and the order relation [Formula: see text]. Furthermore, the concept of a preference δ-optimal solution is given. Subsequently, the constructed deterministic nonlinear bilevel problem is solved with the help of estimation of distribution algorithm. Finally, several numerical examples are provided to demonstrate the effectiveness of the proposed approach.
An implementation of the distributed programming structural synthesis system (PROSSS)
NASA Technical Reports Server (NTRS)
Rogers, J. L., Jr.
1981-01-01
A method is described for implementing a flexible software system that combines large, complex programs with small, user-supplied, problem-dependent programs and that distributes their execution between a mainframe and a minicomputer. The Programming Structural Synthesis System (PROSSS) was the specific software system considered. The results of such distributed implementation are flexibility of the optimization procedure organization and versatility of the formulation of constraints and design variables.
Mechanical properties of metal-ceramic nanolaminates: Effect of constraint and temperature
Yang, Ling Wei; Mayer, Carl; Li, Nan; ...
2017-09-21
Al/SiC nanolaminates with equal nominal thicknesses of the Al and SiC layers (10, 25, 50 and 100 nm) were manufactured by magnetron sputtering. The mechanical properties were measured at 25 °C and 100 °C by means of nanoindentation and micropillar compression tests and the deformation mechanisms were analyzed by in situ micropillar compression tests in the transmission electron microscope. In addition, finite element simulations of both tests were carried out to ascertain the role played by the strength of the Al layers and by the elastic constraint of the ceramic layers on the plastic flow of Al in the mechanical response. It was found that the mechanical response was mainly controlled by the constraint during nanoindentation or micropillar compression tests of very thin layered (≈10 nm) laminates, while the influence of the strength of Al layers was not as critical. This behavior was reversed, however, for thick layered laminates (100 nm). Here, these mechanisms point to the different effects of layer thickness during nanoindentation and micropillar compression, at both temperatures, and showed the critical role played by constraint on the mechanical response of nanolaminates made of materials with a very large difference in the elasto-plastic properties.
A Hybrid alldifferent-Tabu Search Algorithm for Solving Sudoku Puzzles.
Soto, Ricardo; Crawford, Broderick; Galleguillos, Cristian; Paredes, Fernando; Norero, Enrique
2015-01-01
The Sudoku problem is a well-known logic-based puzzle of combinatorial number-placement. It consists in filling an n² × n² grid, composed of n columns, n rows, and n subgrids, each one containing distinct integers from 1 to n². Such a puzzle belongs to the NP-complete collection of problems, for which there exist diverse exact and approximate solution methods. In this paper, we propose a new hybrid algorithm that smartly combines a classic tabu search procedure with the alldifferent global constraint from the constraint programming world. The alldifferent constraint is known to be efficient for domain filtering in the presence of constraints that must be pairwise different, which are exactly the kind of constraints that Sudokus own. This ability clearly alleviates the work of the tabu search, resulting in a faster and more robust approach for solving Sudokus. We illustrate interesting experimental results where our proposed algorithm outperforms the best results previously reported by hybrid and approximate methods.
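The domain-filtering effect that alldifferent contributes can be illustrated on a 4 × 4 instance (n = 2). The sketch below propagates singleton domains only — a weaker filter than the global constraint itself, but enough to show how pruning shrinks the search space the tabu search must explore. The puzzle is an invented example.

```python
# Sketch of alldifferent-style domain filtering on a 4x4 Sudoku (n = 2).
# 0 denotes an empty cell; repeated propagation of singleton domains
# mimics the pruning that lightens the tabu search's workload.

def peers(r, c):
    ps = {(r, j) for j in range(4)} | {(i, c) for i in range(4)}
    br, bc = 2 * (r // 2), 2 * (c // 2)
    ps |= {(br + i, bc + j) for i in range(2) for j in range(2)}
    ps.discard((r, c))
    return ps

def propagate(grid):
    domains = {(r, c): {grid[r][c]} if grid[r][c] else {1, 2, 3, 4}
               for r in range(4) for c in range(4)}
    changed = True
    while changed:
        changed = False
        for cell, dom in domains.items():
            if len(dom) == 1:
                v = next(iter(dom))
                for p in peers(*cell):
                    if v in domains[p] and len(domains[p]) > 1:
                        domains[p].discard(v)
                        changed = True
    return domains

puzzle = [[1, 2, 3, 0],
          [3, 0, 1, 2],
          [0, 1, 4, 3],
          [4, 3, 0, 1]]
doms = propagate(puzzle)
solved = all(len(d) == 1 for d in doms.values())
```

On this easy instance propagation alone fixes every cell; on harder instances it leaves reduced domains, which is exactly where the tabu search takes over.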
Ren, Shaogang; Zeng, Bo; Qian, Xiaoning
2013-01-01
Optimization procedures to identify gene knockouts for targeted biochemical overproduction have been widely used in modern metabolic engineering. The flux balance analysis (FBA) framework has provided conceptual simplifications for genome-scale dynamic analysis at steady states. Based on FBA, many current optimization methods for targeted bio-productions have been developed under the maximum cell growth assumption. The optimization problem to derive gene knockout strategies has recently been formulated as a bi-level programming problem in OptKnock for maximum targeted bio-productions with maximum growth rates. However, it has been shown that knockout mutants in fact reach steady states with the minimization of metabolic adjustment (MOMA) from the corresponding wild-type strains, instead of having maximal growth rates after genetic or metabolic intervention. In this work, we propose a new bi-level computational framework--MOMAKnock--which can derive robust knockout strategies under the MOMA flux distribution approximation. In this new bi-level optimization framework, we aim to maximize the production of targeted chemicals by identifying candidate knockout genes or reactions under phenotypic constraints approximated by the MOMA assumption. Hence, the targeted chemical production is the primary objective of MOMAKnock, while the MOMA assumption is formulated as the inner problem of constraining the knockout metabolic flux to be as close as possible to the steady-state phenotypes of wild-type strains. As this inner problem becomes a quadratic programming problem, a novel adaptive piecewise linearization algorithm is developed in this paper to obtain the exact optimal solution to the resulting bi-level integer quadratic programming problem for MOMAKnock. Our new MOMAKnock model and the adaptive piecewise linearization solution algorithm are tested with a small E. coli core metabolic network and a large-scale iAF1260 E. coli metabolic network. 
The derived knockout strategies are compared with those from OptKnock. Our preliminary experimental results show that MOMAKnock can provide improved targeted productions with more robust knockout strategies.
Packing Boxes into Multiple Containers Using Genetic Algorithm
NASA Astrophysics Data System (ADS)
Menghani, Deepak; Guha, Anirban
2016-07-01
Container loading problems have been studied extensively in the literature and various analytical, heuristic and metaheuristic methods have been proposed. This paper presents two different variants of a genetic algorithm framework for the three-dimensional container loading problem for optimally loading boxes into multiple containers with constraints. The algorithms are designed so that it is easy to incorporate various constraints found in real life problems. The algorithms are tested on data of standard test cases from literature and are found to compare well with the benchmark algorithms in terms of utilization of containers. This, along with the ability to easily incorporate a wide range of practical constraints, makes them attractive for implementation in real life scenarios.
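A heavily simplified, one-dimensional sketch of the approach: a permutation chromosome is decoded by first-fit into capacity-limited containers, and a small GA with order crossover and swap mutation searches over packing orders. The box volumes, capacity, and GA settings are invented; the paper's variants handle full three-dimensional placement and practical constraints.

```python
import random

# One-dimensional stand-in: box volumes packed first-fit, in chromosome
# order, into containers of fixed capacity; fitness = containers used.
VOLUMES = [7, 5, 4, 4, 3, 3, 2, 2]   # illustrative box volumes
CAPACITY = 10

def containers_used(order):
    loads = []
    for v in (VOLUMES[i] for i in order):
        for k in range(len(loads)):
            if loads[k] + v <= CAPACITY:
                loads[k] += v
                break
        else:
            loads.append(v)
    return len(loads)

def crossover(a, b, rng):
    # Order crossover: keep a slice of parent a, fill the rest in b's order.
    i, j = sorted(rng.sample(range(len(a)), 2))
    mid = a[i:j]
    rest = [g for g in b if g not in mid]
    return rest[:i] + mid + rest[i:]

def ga_pack(pop_size=20, gens=60, seed=3):
    rng = random.Random(seed)
    base = list(range(len(VOLUMES)))
    pop = [base[:]] + [rng.sample(base, len(base)) for _ in range(pop_size - 1)]
    for _ in range(gens):
        pop.sort(key=containers_used)
        nxt = pop[:4]                      # elitism: keep the best four
        while len(nxt) < pop_size:
            a, b = rng.sample(pop[:10], 2) # select among the better half
            child = crossover(a, b, rng)
            if rng.random() < 0.3:         # swap mutation
                p, q = rng.sample(range(len(child)), 2)
                child[p], child[q] = child[q], child[p]
            nxt.append(child)
        pop = nxt
    best = min(pop, key=containers_used)
    return best, containers_used(best)

order, n_containers = ga_pack()
```

Decoding a permutation with a deterministic loading heuristic is a common way to keep every chromosome feasible, which is also how practical constraints can be absorbed into the decoder rather than the genome.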
Launch vehicle selection model
NASA Technical Reports Server (NTRS)
Montoya, Alex J.
1990-01-01
Over the next 50 years, humans will be heading for the Moon and Mars to build scientific bases to gain further knowledge about the universe and to develop rewarding space activities. These large scale projects will last many years and will require large amounts of mass to be delivered to Low Earth Orbit (LEO). It will take a great deal of planning to complete these missions in an efficient manner. The planning of a future Heavy Lift Launch Vehicle (HLLV) will significantly impact the overall multi-year launching cost for the vehicle fleet depending upon when the HLLV will be ready for use. It is desirable to develop a model in which many trade studies can be performed. In one sample multi-year space program analysis, the total launch vehicle cost of implementing the program was reduced from 50 percent to 25 percent. This indicates how critical it is to reduce space logistics costs. A linear programming model has been developed to answer such questions. The model is now in its second phase of development, and this paper will address the capabilities of the model and its intended uses. The main emphasis over the past year was to make the model user-friendly and to incorporate additional realistic constraints that are difficult to represent mathematically. We have developed a methodology in which the user has to be knowledgeable about the mission model and the requirements of the payloads. We have found a representation that will cut down the solution space of the problem by inserting some preliminary tests to eliminate some infeasible vehicle solutions. The paper will address the handling of these additional constraints and the methodology for incorporating new costing information utilizing learning curve theory. The paper will review several test cases that will explore the preferred vehicle characteristics and the preferred period of construction, i.e., within the next decade, or in the first decade of the next century. 
Finally, the paper will explore the interaction between the primary mission model (all payloads going from Earth to Low Earth Orbit (LEO)) and the secondary mission model (all payloads from LEO to Lunar and LEO to Mars and return).
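A toy version of the vehicle-selection question can be solved by brute-force enumeration over integer launch counts; serious models, like the linear program described above, scale this up with many vehicle types, payload schedules, and cost constraints. The vehicle figures below are illustrative, not from the paper.

```python
# Choose integer launch counts for two hypothetical vehicles so that the
# mass delivered to LEO meets the requirement at minimum total cost.
VEHICLES = {"HLLV": {"payload_t": 100, "cost_m": 400},
            "MLV":  {"payload_t": 20,  "cost_m": 120}}
REQUIRED_T = 260   # tonnes to LEO (illustrative requirement)

def best_fleet(max_each=10):
    h, m = VEHICLES["HLLV"], VEHICLES["MLV"]
    best = None
    for n_h in range(max_each + 1):
        for n_m in range(max_each + 1):
            mass = n_h * h["payload_t"] + n_m * m["payload_t"]
            if mass < REQUIRED_T:
                continue    # infeasible: mass requirement not met
            cost = n_h * h["cost_m"] + n_m * m["cost_m"]
            if best is None or cost < best[0]:
                best = (cost, n_h, n_m)
    return best

cost, n_hllv, n_mlv = best_fleet()
```

Even this tiny instance shows the non-obvious trade: the cheaper-per-tonne heavy vehicle is not used exclusively, because the requirement does not divide evenly into its payload.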
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mckinly, J.B.
The impact of the Federal Aviation Regulations (FARs) on fuel conservation in the air-transportation system is examined. To date there exist over 89 identifiable fuel-conservation program and research areas. Operational constraints in the areas of FARs and Air Traffic Control (ATC), which hinder further fuel savings in any of the 89 program and research areas, are identified. The nature of this investigation presents an update of analyses from previous FAA, DOE, and NASA publications from a DOE viewpoint. The short duration and cost constraints of this study did not allow an assessment of safety, social, or any of the broader impacts of the regulations. However, this study was not intended to solve all of the regulatory problems. Rather, this was a cursory review of the FARs intended to pinpoint those fuel-inefficient regulations which could be changed to improve the overall fuel-conservation effort in the air-transportation industry. The program and research areas identified as being negatively impacted by FARs were analyzed to quantify the fuel savings available through revision or removal of those constraints. A recommended list of new R and D initiatives is proposed in order to improve the fuel efficiency of the FARs in the air-transportation industry.
Ryan, Jason C; Banerjee, Ashis Gopal; Cummings, Mary L; Roy, Nicholas
2014-06-01
Planning operations across a number of domains can be considered as resource allocation problems with timing constraints. An unexplored instance of such a problem domain is the aircraft carrier flight deck, where, in current operations, replanning is done without the aid of any computerized decision support. Rather, veteran operators employ a set of experience-based heuristics to quickly generate new operating schedules. These expert user heuristics are neither codified nor evaluated by the United States Navy; they have grown solely from the convergent experiences of supervisory staff. As unmanned aerial vehicles (UAVs) are introduced in the aircraft carrier domain, these heuristics may require alterations due to differing capabilities. The inclusion of UAVs also allows for new opportunities for on-line planning and control, providing an alternative to the current heuristic-based replanning methodology. To investigate these issues formally, we have developed a decision support system for flight deck operations that utilizes a conventional integer linear program-based planning algorithm. In this system, a human operator sets both the goals and constraints for the algorithm, which then returns a proposed schedule for operator approval. As a part of validating this system, the performance of this collaborative human-automation planner was compared with that of the expert user heuristics over a set of test scenarios. The resulting analysis shows that human heuristics often outperform the plans produced by an optimization algorithm, but are also often more conservative.
Automatic Constraint Detection for 2D Layout Regularization.
Jiang, Haiyong; Nan, Liangliang; Yan, Dong-Ming; Dong, Weiming; Zhang, Xiaopeng; Wonka, Peter
2016-08-01
In this paper, we address the problem of constraint detection for layout regularization. The layout we consider is a set of two-dimensional elements where each element is represented by its bounding box. Layout regularization is important in digitizing plans or images, such as floor plans and facade images, and in the improvement of user-created contents, such as architectural drawings and slide layouts. To regularize a layout, we aim to improve the input by detecting and subsequently enforcing alignment, size, and distance constraints between layout elements. Similar to previous work, we formulate layout regularization as a quadratic programming problem. In addition, we propose a novel optimization algorithm that automatically detects constraints. We evaluate the proposed framework using a variety of input layouts from different applications. Our results demonstrate that our method has superior performance to the state of the art.
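The detect-then-enforce idea can be sketched in one dimension: coordinates closer than a tolerance are detected as aligned, and the equality constraint is enforced by the least-squares solution, which for minimizing the sum of squared displacements under within-group equality is simply the group mean. The paper solves a richer quadratic program over alignment, size, and distance constraints; the names and tolerance below are ours.

```python
# Minimal 1D sketch of layout regularization: detect near-equal
# x-coordinates, then snap each detected group to its least-squares
# position (the group mean).

TOL = 2.0   # detection tolerance (illustrative)

def detect_groups(xs):
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    groups, cur = [], [order[0]]
    for i in order[1:]:
        if xs[i] - xs[cur[-1]] <= TOL:
            cur.append(i)       # chains of nearby coordinates form a group
        else:
            groups.append(cur)
            cur = [i]
    groups.append(cur)
    return groups

def regularize(xs):
    out = list(xs)
    for g in detect_groups(xs):
        mean = sum(xs[i] for i in g) / len(g)
        for i in g:
            out[i] = mean       # enforce the equality constraint
    return out

observed = [10.0, 11.0, 10.5, 40.0, 41.5]
snapped = regularize(observed)
```

The hard part the paper addresses is precisely the detection step: choosing which constraints to enforce so that the regularized layout stays faithful to the input.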
Testing for new physics: neutrinos and the primordial power spectrum
DOE Office of Scientific and Technical Information (OSTI.GOV)
Canac, Nicolas; Abazajian, Kevork N.; Aslanyan, Grigor
2016-09-01
We test the sensitivity of neutrino parameter constraints from combinations of CMB and LSS data sets to the assumed form of the primordial power spectrum (PPS) using Bayesian model selection. Significantly, none of the tested combinations, including recent high-precision local measurements of H₀ and cluster abundances, indicate a signal for massive neutrinos or extra relativistic degrees of freedom. For PPS models with a large, but fixed number of degrees of freedom, neutrino parameter constraints do not change significantly if the location of any features in the PPS are allowed to vary, although neutrino constraints are more sensitive to PPS features if they are known a priori to exist at fixed intervals in log k. Although there is no support for a non-standard neutrino sector from constraints on both neutrino mass and relativistic energy density, we see surprisingly strong evidence for features in the PPS when it is constrained with data from Planck 2015, SZ cluster counts, and recent high-precision local measurements of H₀. Conversely, combining Planck with matter power spectrum and BAO measurements yields a much weaker constraint. Given that this result is sensitive to the choice of data, this tension between SZ cluster counts, Planck, and H₀ measurements is likely an indication of unmodeled systematic bias that mimics PPS features, rather than new physics in the PPS or neutrino sector.
ERIC Educational Resources Information Center
Vos, Lynn
2013-01-01
This article looks at the curriculum redesign of a master's-level program in international marketing from a UK perspective. In order to ensure that the program would be more fit-for-purpose for future managers working under conditions of complexity, uncertainty, and within regimes often very different from the home market, the team began the…
ERIC Educational Resources Information Center
Leff, H. Stephen; Turner, Ralph R.
This report focuses on the use of linear programming models to address the issues of how vocational rehabilitation (VR) resources should be allocated in order to maximize program efficiency within given resource constraints. A general introduction to linear programming models is first presented that describes the major types of models available,…
The NASA firefighter's breathing system program
NASA Technical Reports Server (NTRS)
Mclaughlan, P. B.; Carson, M. A.
1974-01-01
Research on the development of a firefighter's breathing system (FBS) that satisfies the operational requirements of fire departments while remaining within their cost constraints is reported. System definition for the FBS is discussed, and the program status is reported. It is concluded that the most difficult problem in the FBS Program is the achievement of widespread fire department acceptance of the system.
Using Blended Learning as an Innovative Delivery Model for an In-House Language Program
ERIC Educational Resources Information Center
Gadbois, Manon; Quildon, Denise
2013-01-01
This paper reports on the development and implementation in 2012 of McGill University's French at Work program for McGill employees, using a blended learning model. The program is an example of how a reduction in face-to-face teaching presents one solution to employees' scheduling constraints and how this model might offer suggestions for the…
ERIC Educational Resources Information Center
Toh, Swee-Hin; And Others
This paper draws upon the experiential and theoretical insights gained from 5 years of developing a peace education program at Notre Dame University in the Philippines. The critical reflections on that experience encompass the processes, relationships, and structures embodied in the program, and its achievements, constraints, difficulties, and…
No-signaling quantum key distribution: solution by linear programming
NASA Astrophysics Data System (ADS)
Hwang, Won-Young; Bae, Joonwoo; Killoran, Nathan
2015-02-01
We outline a straightforward approach for obtaining a secret key rate using only no-signaling constraints and linear programming. Assuming an individual attack, we consider all possible joint probabilities. Initially, we study only the case where Eve has binary outcomes, and we impose constraints due to the no-signaling principle and given measurement outcomes. Within the remaining space of joint probabilities, by using linear programming, we obtain a bound on the probability of Eve correctly guessing Bob's bit. We then make use of an inequality that relates this guessing probability to the mutual information between Bob and a more general Eve, who is not binary-restricted. Putting our computed bound together with the Csiszár-Körner formula, we obtain a positive key generation rate. The optimal value of this rate agrees with known results, but was calculated in a more straightforward way, offering the potential of generalization to different scenarios.
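The shape of the computation — bound an adversary's guessing probability by maximizing a linear objective over a constrained space of joint distributions — can be sketched with a toy LP, assuming SciPy's `linprog` is available. The statistics constraining the joint distribution below are invented for illustration; the paper's constraints come from no-signaling and the observed measurement outcomes.

```python
from scipy.optimize import linprog

# Toy LP: maximize Eve's probability of guessing Bob's bit over all joint
# distributions p(b, e) consistent with fixed linear statistics.
# Variables: x = [p(0,0), p(0,1), p(1,0), p(1,1)], entries p(b, e).

c = [-1.0, 0.0, 0.0, -1.0]              # maximize p(0,0) + p(1,1)
A_eq = [[1, 1, 1, 1],                   # normalization
        [1, 1, 0, 0]]                   # Bob's marginal p(b=0) = 0.5
b_eq = [1.0, 0.5]
A_ub = [[1, 0, 0, 0]]                   # an assumed statistic: p(0,0) <= 0.3
b_ub = [0.3]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, 1)] * 4)
guess_prob = -res.fun                   # maximum guessing probability
```

Because the feasible set is a polytope and the objective is linear, the bound is attained at a vertex, which is what makes this style of security analysis tractable.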
Smart-Grid Backbone Network Real-Time Delay Reduction via Integer Programming.
Pagadrai, Sasikanth; Yilmaz, Muhittin; Valluri, Pratyush
2016-08-01
This research investigates an optimal delay-based virtual topology design using integer linear programming (ILP), which is applied to the current backbone networks such as smart-grid real-time communication systems. A network traffic matrix is applied and the corresponding virtual topology problem is solved using the ILP formulations that include a network delay-dependent objective function and lightpath routing, wavelength assignment, wavelength continuity, flow routing, and traffic loss constraints. The proposed optimization approach provides an efficient deterministic integration of intelligent sensing and decision making, and network learning features for superior smart grid operations by adaptively responding the time-varying network traffic data as well as operational constraints to maintain optimal virtual topologies. A representative optical backbone network has been utilized to demonstrate the proposed optimization framework whose simulation results indicate that superior smart-grid network performance can be achieved using commercial networks and integer programming.
An investigation of new methods for estimating parameter sensitivities
NASA Technical Reports Server (NTRS)
Beltracchi, Todd J.; Gabriele, Gary A.
1988-01-01
Parameter sensitivity is defined as the estimation of changes in the modeling functions and the design variables due to small changes in the fixed parameters of the formulation. The existing methods for estimating parameter sensitivities either require difficult-to-obtain second-order information or do not return reliable estimates for the derivatives. Additionally, all of these methods assume that the set of active constraints does not change in a neighborhood of the estimation point. If the active set does in fact change, then any extrapolations based on these derivatives may be in error. The objective here is to investigate more efficient new methods for estimating parameter sensitivities when the active set changes. The new method is based on the recursive quadratic programming (RQP) method, used in conjunction with a differencing formula to produce estimates of the sensitivities. This is compared to existing methods and is shown to be very competitive in terms of the number of function evaluations required. In terms of accuracy, the method is shown to be equivalent to a modified version of the Kuhn-Tucker method, where the Hessian of the Lagrangian is estimated using the BFS method employed by the RQP algorithm. Initial testing on a test set with known sensitivities demonstrates that the method can accurately calculate the parameter sensitivity. To handle changes in the active set, a deflection algorithm is proposed for those cases where the new set of active constraints remains linearly independent. For those cases where dependencies occur, a directional derivative is proposed. A few simple examples are included for the algorithm, but extensive testing has not yet been performed.
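The active-set difficulty described above can be reproduced on a toy problem of our own: minimize (x − p)² subject to x ≤ 1, whose optimum x*(p) = min(p, 1) has sensitivity dx*/dp = 1 while the constraint is inactive and 0 once it becomes active. A central difference that straddles the change point returns a misleading in-between value.

```python
# Finite-difference parameter sensitivity on a toy problem where the
# active set changes: minimize (x - p)^2 subject to x <= 1, so that
# x*(p) = min(p, 1).

def x_star(p):
    return min(p, 1.0)

def central_diff(f, p, h=1e-4):
    return (f(p + h) - f(p - h)) / (2 * h)

s_inactive = central_diff(x_star, 0.5)   # constraint inactive: slope 1
s_active = central_diff(x_star, 2.0)     # constraint active: slope 0
s_kink = central_diff(x_star, 1.0)       # active set changes at p = 1
```

The value at the kink is an average of the two one-sided slopes, which is exactly the failure mode that motivates deflection algorithms and directional derivatives.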
[AT THE CROSSROADS: THE ROLE OF LABORATORY MEDICINE IN THE PATIENT CARE PROCESS].
Geffen, Yuval; Zaidise, Itzhak
2017-06-01
In recent decades, the laboratory medicine profession has undergone significant changes due to both technological developments and economic constraints. Technological innovations support automation, provide faster and more accurate equipment, and allow increased efficiency through the use of commercial test kits. These changes, combined with budgetary constraints, have led to mergers and centralization of medical laboratories to optimize work and cut costs. While this centralization may be a business necessity, it leads to a disconnection between the laboratory and the clinical context. In addition, laboratory tests are treated as a commodity, which places emphasis on price only, rather than quality. In this article, we review the developments and changes that medical laboratories and the laboratory medicine profession have undergone in recent decades. We focus on technological and structural challenges affecting the functioning of medical laboratories and the relations between laboratory workers and medical teams. We then introduce vocational education changes required for the laboratory medicine profession. We propose defining the role of medical laboratory directors in terms of their basic training as medical doctors or doctors of science. We suggest that laboratory employees should become a reliable source of information regarding selection of appropriate test methods, processing data and presenting the results to the medical staff. Laboratory workers must deepen their clinical knowledge and become an integral part of the patient care process, along with medical and nursing staff. Special training programs for medical laboratory workers and directors must be developed in order to match the complex activities currently being conducted in laboratories.
Determination of optimum values for maximizing the profit in bread production: Daily bakery Sdn Bhd
NASA Astrophysics Data System (ADS)
Muda, Nora; Sim, Raymond
2015-02-01
An integer programming problem is a mathematical optimization or feasibility problem in which some or all of the variables are restricted to be integers. In many settings the term refers to integer linear programming (ILP), in which the objective function and the constraints (other than the integrality constraints) are linear. ILP has many applications in industrial production, including job-shop modelling. A typical objective is to maximize total production without exceeding the available resources. In some cases this can be expressed as a linear program, but the variables must be constrained to be integers. ILP is concerned with the optimization of a linear function while satisfying a set of linear equality and inequality constraints and restrictions, and has been used to solve optimization problems in many industries, such as banking, nutrition, agriculture, and bakery. The main purpose of this study is to formulate the best combination of ingredients for producing different types of bread at Daily Bakery in order to gain maximum profit. This study also focuses on sensitivity analysis due to changes in the profit and the cost of each ingredient. The optimum result obtained from the QM software is RM 65,377.29 per day. This study will benefit Daily Bakery and other similar industries. By formulating the combination of ingredients, they can easily determine their total profit from producing bread every day.
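A scaled-down, invented instance of this kind of bakery formulation — integer batch counts of two bread types, maximized profit, flour and sugar limits — can be solved by brute-force enumeration. All figures below are illustrative, not from the study, which used the QM software on a larger model.

```python
# Toy ILP: maximize profit = 30*w + 50*m subject to flour and sugar
# availability, with integer batch counts w (white) and m (wholemeal).
PROFIT = {"white": 30, "wholemeal": 50}     # RM per batch (illustrative)
FLOUR  = {"white": 5,  "wholemeal": 8}      # kg per batch
SUGAR  = {"white": 2,  "wholemeal": 1}      # kg per batch
FLOUR_AVAIL, SUGAR_AVAIL = 80, 25

def best_plan():
    best = (0, 0, 0)                        # (profit, n_white, n_wholemeal)
    for w in range(FLOUR_AVAIL // FLOUR["white"] + 1):
        for m in range(FLOUR_AVAIL // FLOUR["wholemeal"] + 1):
            flour = w * FLOUR["white"] + m * FLOUR["wholemeal"]
            sugar = w * SUGAR["white"] + m * SUGAR["wholemeal"]
            if flour <= FLOUR_AVAIL and sugar <= SUGAR_AVAIL:
                profit = w * PROFIT["white"] + m * PROFIT["wholemeal"]
                best = max(best, (profit, w, m))
    return best

profit, n_white, n_wholemeal = best_plan()
```

Enumeration works only at toy scale; with many products and ingredients one would hand the same objective and constraints to an ILP solver, as the study does.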
Test of Special Relativity Using a Fiber Network of Optical Clocks.
Delva, P; Lodewyck, J; Bilicki, S; Bookjans, E; Vallet, G; Le Targat, R; Pottie, P-E; Guerlin, C; Meynadier, F; Le Poncin-Lafitte, C; Lopez, O; Amy-Klein, A; Lee, W-K; Quintin, N; Lisdat, C; Al-Masoudi, A; Dörscher, S; Grebing, C; Grosche, G; Kuhl, A; Raupach, S; Sterr, U; Hill, I R; Hobson, R; Bowden, W; Kronjäger, J; Marra, G; Rolland, A; Baynes, F N; Margolis, H S; Gill, P
2017-06-02
Phase compensated optical fiber links enable high accuracy atomic clocks separated by thousands of kilometers to be compared with unprecedented statistical resolution. By searching for a daily variation of the frequency difference between four strontium optical lattice clocks in different locations throughout Europe connected by such links, we improve upon previous tests of time dilation predicted by special relativity. We obtain a constraint on the Robertson-Mansouri-Sexl parameter |α|≲1.1×10^{-8}, quantifying a violation of time dilation, thus improving by a factor of around 2 the best known constraint obtained with Ives-Stilwell type experiments, and by 2 orders of magnitude the best constraint obtained by comparing atomic clocks. This work is the first of a new generation of tests of fundamental physics using optical clocks and fiber links. As clocks improve, and as fiber links are routinely operated, we expect that the tests initiated in this Letter will improve by orders of magnitude in the near future.
Kalman Filtering with Inequality Constraints for Turbofan Engine Health Estimation
NASA Technical Reports Server (NTRS)
Simon, Dan; Simon, Donald L.
2003-01-01
Kalman filters are often used to estimate the state variables of a dynamic system. However, in the application of Kalman filters some known signal information is often either ignored or dealt with heuristically. For instance, state variable constraints (which may be based on physical considerations) are often neglected because they do not fit easily into the structure of the Kalman filter. This paper develops two analytic methods of incorporating state variable inequality constraints in the Kalman filter. The first method is a general technique of using hard constraints to enforce inequalities on the state variable estimates. The resultant filter is a combination of a standard Kalman filter and a quadratic programming problem. The second method uses soft constraints to estimate state variables that are known to vary slowly with time. (Soft constraints are constraints that are required to be approximately satisfied rather than exactly satisfied.) The incorporation of state variable constraints increases the computational effort of the filter but significantly improves its estimation accuracy. The improvement is proven theoretically and shown via simulation results. The use of the algorithm is demonstrated on a linearized simulation of a turbofan engine to estimate health parameters. The turbofan engine model contains 16 state variables, 12 measurements, and 8 component health parameters. It is shown that the new algorithms provide improved performance in this example over unconstrained Kalman filtering.
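The hard-constraint step described above can be sketched for a scalar state, where projecting the unconstrained Kalman estimate onto an inequality constraint (the quadratic programming step) reduces to a clip. The system and noise values below are illustrative, not the paper's turbofan model:

```python
import random

random.seed(0)

# Scalar system: x_{k+1} = x_k + w, measurement z = x + v.
# Physical knowledge: the state is known to satisfy x >= 0.
q, r = 0.01, 1.0          # process and measurement noise variances
x_true = 0.2
x_est, p = 5.0, 10.0      # deliberately poor initial estimate

for _ in range(50):
    x_true += random.gauss(0.0, q ** 0.5)
    x_true = max(x_true, 0.0)          # truth respects the constraint
    z = x_true + random.gauss(0.0, r ** 0.5)

    # Standard Kalman predict/update.
    p += q
    k = p / (p + r)
    x_est += k * (z - x_est)
    p *= (1.0 - k)

    # Constraint step: project the estimate onto {x >= 0}.  For a
    # scalar state, argmin (x - x_est)^2 subject to x >= 0 is a clip.
    x_est = max(x_est, 0.0)

print(round(x_est, 3))
```

For vector states the projection becomes a genuine quadratic program, as in the paper, but the structure (filter update followed by constraint enforcement) is the same.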
A reduced order, test verified component mode synthesis approach for system modeling applications
NASA Astrophysics Data System (ADS)
Butland, Adam; Avitabile, Peter
2010-05-01
Component mode synthesis (CMS) is a very common approach used for the generation of large system models. In general, these modeling techniques can be separated into two categories: those utilizing a combination of constraint modes and fixed interface normal modes and those based on a combination of free interface normal modes and residual flexibility terms. The major limitation of the methods utilizing constraint modes and fixed interface normal modes is the inability to easily obtain the required information from testing; the result of this limitation is that constraint mode-based techniques are primarily used with numerical models. An alternate approach is proposed which utilizes frequency and shape information acquired from modal testing to update reduced order finite element models using exact analytical model improvement techniques. The connection degrees of freedom are then rigidly constrained in the test verified, reduced order model to provide the boundary conditions necessary for constraint modes and fixed interface normal modes. The CMS approach is then used with this test verified, reduced order model to generate the system model for further analysis. A laboratory structure is used to show the application of the technique, with both numerical and simulated experimental components used to describe the system and validate the proposed approach. Actual test data are then used in the proposed approach. Because typical measurement contaminants are present in any test, the measured data are further processed to remove them before being used in the proposed approach. The final case, using improved data with the reduced order, test verified components, is shown to produce very acceptable results from the Craig-Bampton component mode synthesis approach. The use of the technique, together with its strengths and weaknesses, is discussed.
UTM Safely Enabling UAS Operations in Low-Altitude Airspace
NASA Technical Reports Server (NTRS)
Kopardekar, Parimal
2017-01-01
Conduct research, development, and testing to identify airspace operations requirements that enable large-scale visual and beyond-visual-line-of-sight UAS operations in low-altitude airspace. Use a build-a-little-test-a-little strategy, from remote areas to urban areas. Low density: no traffic management required, but an understanding of airspace constraints. Cooperative traffic management: understanding of airspace constraints and other operations. Manned and unmanned traffic management: scalable and heterogeneous operations. The UTM construct is consistent with the FAA's risk-based strategy. The UTM research platform is used for simulations and tests. UTM offers a path towards scalability.
UTM Safely Enabling UAS Operations in Low-Altitude Airspace
NASA Technical Reports Server (NTRS)
Kopardekar, Parimal H.
2016-01-01
Conduct research, development, and testing to identify airspace operations requirements that enable large-scale visual and beyond-visual-line-of-sight UAS operations in low-altitude airspace. Use a build-a-little-test-a-little strategy, from remote areas to urban areas. Low density: no traffic management required, but an understanding of airspace constraints. Cooperative traffic management: understanding of airspace constraints and other operations. Manned and unmanned traffic management: scalable and heterogeneous operations. The UTM construct is consistent with the FAA's risk-based strategy. The UTM research platform is used for simulations and tests. UTM offers a path towards scalability.
Obstacle avoidance handling and mixed integer predictive control for space robots
NASA Astrophysics Data System (ADS)
Zong, Lijun; Luo, Jianjun; Wang, Mingming; Yuan, Jianping
2018-04-01
This paper presents a novel obstacle avoidance constraint and a mixed integer predictive control (MIPC) method for space robots avoiding obstacles and satisfying physical limits while performing tasks. First, a novel obstacle avoidance constraint for space robots, which relies on the assumption that the manipulator links and the obstacles can be represented by convex bodies, is proposed by limiting the relative velocity between the two closest points on the manipulator and the obstacle, respectively. Furthermore, logical variables are introduced into the obstacle avoidance constraint so that the constraint form changes automatically to satisfy different obstacle avoidance requirements in different distance intervals between the space robot and the obstacle. Afterwards, the obstacle avoidance constraint and other physical limits of the system, such as joint angle ranges and the amplitude bounds of joint velocities and joint torques, are described as inequality constraints of a quadratic programming (QP) problem using the model predictive control (MPC) method. To guarantee the feasibility of the resulting multi-constraint QP problem, the constraints are treated as soft constraints and assigned levels of priority based on propositional logic theory, so that the constraints with lower priorities are always violated first to recover the feasibility of the QP problem. Since logical variables are introduced, the optimization problem, which includes obstacle avoidance and system physical limits as prioritized inequality constraints, is termed the MIPC method for space robots, and its computational complexity, as well as possible strategies for reducing the computational load, are analyzed.
Simulations of the space robot unfolding its manipulator and tracking the end-effector's desired trajectories in the presence of obstacles and physical limits are presented to demonstrate the effectiveness of the proposed obstacle avoidance strategy and MIPC control method.
Infrared Submillimeter and Radio Astronomy Research and Analysis Program
NASA Technical Reports Server (NTRS)
Traub, Wesley A.
2000-01-01
This program entitled "Infrared Submillimeter and Radio Astronomy Research and Analysis Program" with NASA-Ames Research Center (ARC) was proposed by the Smithsonian Astrophysical Observatory (SAO) to cover three years. Due to funding constraints only the first year installment of $18,436 was funded, but this funding was spread out over two years to maximize the benefit to the program. During the tenure of this contract, the investigators at the SAO, Drs. Wesley A. Traub and Nathaniel P. Carleton, worked with the investigators at ARC, Drs. Jesse Bregman and Fred Witteborn, on the following three main areas: 1. Rapid scanning. SAO and ARC collaborated on purchasing and constructing a Rapid Scan Platform for the delay arm of the Infrared-Optical Telescope Array (IOTA) interferometer on Mt. Hopkins, Arizona. The Rapid Scan Platform was tested and improved by the addition of stiffening plates, which eliminated a very small but noticeable bending of the metal platform at the micrometer level. 2. Star tracking. Bregman and Witteborn conducted a study of the IOTA CCD-based star tracker system, by constructing a device to simulate star motion having a specified frequency and amplitude of motion, and by examining the response of the tracker to this simulated star input. 3. Fringe tracking. ARC, and in particular Dr. Robert Mah, developed a fringe-packet tracking algorithm based on data that Bregman and Witteborn obtained on IOTA. The algorithm was tested in the laboratory at ARC and found to work well for both strong and weak fringes.
NASA Astrophysics Data System (ADS)
Amallynda, I.; Santosa, B.
2017-11-01
This paper proposes a new generalization of the distributed parallel machine and assembly scheduling problem (DPMASP) with eligibility constraints, referred to as the modified distributed parallel machine and assembly scheduling problem (MDPMASP) with eligibility constraints. Within this generalization, we assume that there is a set of non-identical factories or production lines, each with a set of unrelated parallel machines of different speeds, arranged in series with a single assembly machine. A set of different products is manufactured through an assembly program from a set of components (jobs) according to the requested demand. Each product requires several kinds of jobs of different sizes. Besides that, we also consider the multi-objective problem (MOP) of minimizing mean flow time and the number of tardy products simultaneously. This problem is known to be NP-hard and is important in practice, as these criteria reflect the customer's demand and the manufacturer's perspective. Because this is a realistic and complex problem with a wide range of possible solutions, we propose four simple heuristics and two metaheuristics to solve it. The various parameters of the proposed metaheuristic algorithms are discussed and calibrated by means of the Taguchi technique. All proposed algorithms are tested in MATLAB. Our computational experiments indicate that the proposed problem and the four proposed algorithms can be implemented and used to solve moderately sized instances, giving efficient solutions that are close to optimal in most cases.
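The abstract does not specify the paper's four heuristics; as a hedged illustration of list scheduling on parallel machines with different speeds, a generic earliest-completion-time rule might look like the following (instance data hypothetical):

```python
# Hypothetical instance (not from the paper): processing time of job j
# on machine m is base_time[j] / speed[m].  Unrelated machines would
# have an arbitrary time matrix; uniform speeds suffice for the sketch.
base_time = [4.0, 2.0, 6.0, 3.0, 5.0]
speed = [1.0, 2.0]

machine_free = [0.0 for _ in speed]
completion = {}

# Greedy earliest-completion-time (ECT) rule: assign each job, shortest
# first, to the machine on which it would finish soonest.  Scheduling
# short jobs early tends to reduce mean flow time.
for job in sorted(range(len(base_time)), key=lambda j: base_time[j]):
    finish = [machine_free[m] + base_time[job] / speed[m]
              for m in range(len(speed))]
    m_best = min(range(len(speed)), key=lambda m: finish[m])
    machine_free[m_best] = finish[m_best]
    completion[job] = finish[m_best]

mean_flow_time = sum(completion.values()) / len(completion)
print(mean_flow_time)
```

A full MDPMASP heuristic would additionally respect eligibility constraints and the downstream assembly stage; this sketch shows only the machine-assignment core.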
Distribution-dependent robust linear optimization with applications to inventory control
Kang, Seong-Cheol; Brisimi, Theodora S.
2014-01-01
This paper tackles linear programming problems with data uncertainty and applies it to an important inventory control problem. Each element of the constraint matrix is subject to uncertainty and is modeled as a random variable with a bounded support. The classical robust optimization approach to this problem yields a solution with guaranteed feasibility. As this approach tends to be too conservative when applications can tolerate a small chance of infeasibility, one would be interested in obtaining a less conservative solution with a certain probabilistic guarantee of feasibility. A robust formulation in the literature produces such a solution, but it does not use any distributional information on the uncertain data. In this work, we show that the use of distributional information leads to an equally robust solution (i.e., under the same probabilistic guarantee of feasibility) but with a better objective value. In particular, by exploiting distributional information, we establish stronger upper bounds on the constraint violation probability of a solution. These bounds enable us to “inject” less conservatism into the formulation, which in turn yields a more cost-effective solution (by 50% or more in some numerical instances). To illustrate the effectiveness of our methodology, we consider a discrete-time stochastic inventory control problem with certain quality of service constraints. Numerical tests demonstrate that the use of distributional information in the robust optimization of the inventory control problem results in 36%–54% cost savings, compared to the case where such information is not used. PMID:26347579
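The idea that distributional information tightens bounds on the constraint violation probability can be illustrated with two standard concentration inequalities. This is a generic sketch, not the paper's specific bounds:

```python
import math

# A constraint sum_i a_i x_i <= b with each uncertain coefficient a_i in
# [mean_i - d_i, mean_i + d_i].  Hoeffding's inequality bounds the
# probability that the realized left-hand side exceeds its mean by t
# using only the supports; a Bernstein-type bound also uses the
# variances (distributional information) and is tighter when they are
# small relative to d_i^2.
def hoeffding_bound(devs, t):
    # P(sum of deviations >= t) <= exp(-t^2 / (2 * sum d_i^2))
    return math.exp(-t * t / (2.0 * sum(d * d for d in devs)))

def bernstein_bound(variances, devs, t):
    # P(sum of deviations >= t) <= exp(-t^2 / (2*(sum var_i + M*t/3)))
    m = max(devs)  # M bounds each centered term
    return math.exp(-t * t / (2.0 * (sum(variances) + m * t / 3.0)))

devs = [1.0] * 20                # half-widths of the supports
variances = [0.1] * 20           # actual variances, well below d_i^2
t = 6.0
print(hoeffding_bound(devs, t), bernstein_bound(variances, devs, t))
```

A smaller violation bound lets the robust formulation protect against a smaller uncertainty set for the same probabilistic guarantee, which is the mechanism behind the cost savings reported in the abstract.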
Elastic plastic fracture mechanics methodology for surface cracks
NASA Technical Reports Server (NTRS)
Ernst, Hugo A.; Lambert, D. M.
1994-01-01
The Elastic Plastic Fracture Mechanics Methodology has evolved significantly in the last several years. Nevertheless, some of these concepts need to be extended further before the whole methodology can be safely applied to structural parts. Specifically, there is a need to include the effect of constraint in the characterization of material resistance to crack growth and also to extend these methods to the case of 3D defects. As a consequence, this project was started as a 36 month research program with the general objective of developing an elastic plastic fracture mechanics methodology to assess the structural reliability of pressure vessels and other parts of interest to NASA which may contain flaws. The project is divided into three tasks that deal with (1) constraint and thickness effects, (2) three-dimensional cracks, and (3) the Leak-Before-Burst (LBB) criterion. This reporting period (March 1994 to August 1994) continues attempts to characterize three dimensional aspects of fracture present in 'two dimensional' or planar configuration specimens, especially the determination and use of crack face separation data. Also included are a variety of fracture resistance testing results (J(m)R-curve format) and a discussion of two materials of NASA interest (6061-T651 aluminum alloy and IN718-STA1 nickel-base superalloy), involving a basis for like constraint in terms of ligament dimensions and a comparison to the resulting J(m)R-curves (Chapter Two).
Constraints on the Grueneisen Theory
2007-02-01
Constraints on the Grüneisen Theory. Author: Steven B. Segletes. Project number: AH80. Performing organization: U.S. Army Research Laboratory, ATTN: AMSRD-ARL-WM-TD, Aberdeen Proving Ground, MD 21005-5069. Report number: ARL-TR-4041.
Effects of the oceans on polar motion: Extended investigations
NASA Technical Reports Server (NTRS)
Dickman, Steven R.
1987-01-01
Matrix formulation of the tide equations (pole tide in nonglobal oceans); matrix formulation of the associated boundary conditions (constraints on the tide velocity at coastlines); and FORTRAN encoding of the tide equations excluding boundary conditions were completed. The need for supercomputer facilities was evident. Large versions of the programs were successfully run on the CYBER, submitting the jobs from SUNY through the BITNET network. The code was also restructured to include boundary constraints.
NASA Technical Reports Server (NTRS)
Guman, W. J. (Editor)
1972-01-01
Design details are presented of the solid propellant pulsed plasma microthruster which was analyzed during the Task 1 effort. The design details presented show that the inherent functional simplicity underlying the flight proven LES-6 design can be maintained in the SMS systems design even with minimum weight constraints imposed. A 1293 hour uninterrupted vacuum test with the engineering thermal model, simulating an 18.8 to 33 g environment for the propellant, its feed system, and electrode assembly, revealed that program thruster performance requirements could be met. This latter g environment is more severe than will ever be encountered in the SMS spacecraft.
Optimization of cutting parameters for machining time in turning process
NASA Astrophysics Data System (ADS)
Mavliutov, A. R.; Zlotnikov, E. G.
2018-03-01
This paper describes the most effective methods for nonlinear constrained optimization of cutting parameters in the turning process. Among them are the linearization programming method with a dual-simplex algorithm, the interior point method, and the augmented Lagrangian genetic algorithm (ALGA). Each of them is tested on an actual example: the minimization of machining time in the turning process. The computation was conducted in the MATLAB environment. The comparative results obtained from applying these methods show that the optimal values of the linearized objective and the original function are the same, and that ALGA gives sufficiently accurate values; however, when the algorithm uses the hybrid function with the interior point algorithm, the resulting values have the maximal accuracy.
NASA Technical Reports Server (NTRS)
Graves, M. E.; Perlmutter, M.
1974-01-01
To aid the planning of the Apollo Soyuz Test Program (ASTP), certain natural environment statistical relationships are presented, based on Markov theory and empirical counts. The practical results are in terms of conditional probability of favorable and unfavorable launch conditions at Kennedy Space Center (KSC). They are based upon 15 years of recorded weather data which are analyzed under a set of natural environmental launch constraints. Three specific forecasting problems were treated: (1) the length of record of past weather which is useful to a prediction; (2) the effect of persistence in runs of favorable and unfavorable conditions; and (3) the forecasting of future weather in probabilistic terms.
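The kind of persistence and conditional-probability computation described above can be sketched with a two-state Markov chain. The transition probabilities below are hypothetical, not the KSC weather statistics:

```python
# Hypothetical transition probabilities (not KSC data): rows are today's
# condition, columns tomorrow's; state 0 = favorable, 1 = unfavorable.
P = [[0.8, 0.2],
     [0.4, 0.6]]

def run_probability(state, length):
    """Probability the chain stays in `state` for `length` more steps."""
    return P[state][state] ** length

def n_step(P, n):
    """n-step transition matrix by repeated 2x2 multiplication."""
    out = [[1.0, 0.0], [0.0, 1.0]]
    for _ in range(n):
        out = [[sum(out[i][k] * P[k][j] for k in range(2))
                for j in range(2)] for i in range(2)]
    return out

# Persistence: given favorable conditions today, the chance of a run of
# three more favorable days.
print(run_probability(0, 3))
# The n-step probabilities approach the stationary distribution, which
# is why only a limited length of past record is useful for prediction.
print(n_step(P, 10)[0][0])
```

The decay of the n-step matrix toward the stationary distribution mirrors the study's first question: how much past weather record usefully informs a forecast.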
Speech recognition: Acoustic phonetic and lexical knowledge representation
NASA Astrophysics Data System (ADS)
Zue, V. W.
1983-02-01
The purpose of this program is to develop a speech data base facility under which the acoustic characteristics of speech sounds in various contexts can be studied conveniently; investigate the phonological properties of a large lexicon of, say, 10,000 words, and determine to what extent the phonotactic constraints can be utilized in speech recognition; study the acoustic cues that are used to mark word boundaries; develop a test bed in the form of a large-vocabulary, IWR system to study the interactions of acoustic, phonetic and lexical knowledge; and develop a limited continuous speech recognition system with the goal of recognizing any English word from its spelling in order to assess the interactions of higher-level knowledge sources.
Ultramicrowave communications system, phase 2
NASA Technical Reports Server (NTRS)
1980-01-01
Communications system design was completed and reviewed. Minor changes were made in order to make it more cost effective and to increase design flexibility. System design activities identified the techniques and procedures to generate and monitor high data rate test signals. Differential bi-phase demodulation is the proposed method for this system. The mockup and packaging designs were performed, and component layout and interconnection constraints were determined, as well as design drawings for dummy parts of the system. The possibility of adding a low cost option to the transceiver system was studied. The communications program has the advantage that new technology signal processing devices can be readily interfaced with the existing radio frequency subsystem to produce a short range radar.
Solving Constraint-Satisfaction Problems with Distributed Neocortical-Like Neuronal Networks.
Rutishauser, Ueli; Slotine, Jean-Jacques; Douglas, Rodney J
2018-05-01
Finding actions that satisfy the constraints imposed by both external inputs and internal representations is central to decision making. We demonstrate that some important classes of constraint satisfaction problems (CSPs) can be solved by networks composed of homogeneous cooperative-competitive modules that have connectivity similar to motifs observed in the superficial layers of neocortex. The winner-take-all modules are sparsely coupled by programming neurons that embed the constraints onto the otherwise homogeneous modular computational substrate. We show rules that embed any instance of the CSPs planar four-color graph coloring, maximum independent set, and sudoku on this substrate, and provide mathematical proofs that guarantee these graph coloring problems will converge to a solution. The network is composed of nonsaturating linear threshold neurons. Their lack of right saturation allows the overall network to explore the problem space, driven by the unstable dynamics generated by recurrent excitation. The direction of exploration is steered by the constraint neurons. While many problems can be solved using only linear inhibitory constraints, network performance on hard problems benefits significantly when these negative constraints are implemented by nonlinear multiplicative inhibition. Overall, our results demonstrate the importance of instability rather than stability in network computation and offer insight into the computational role of dual inhibitory mechanisms in neural circuits.
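For contrast with the paper's neural winner-take-all substrate, the planar four-coloring CSP itself can be stated and solved classically in a few lines. The graph instance is hypothetical and the brute-force search is deliberately naive, shown only to make the constraint structure concrete:

```python
from itertools import product

# A small planar graph (hypothetical instance, not from the paper),
# given as a list of edges over vertices 0..4.
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3), (3, 4), (2, 4)]
n = 5

# Brute-force CSP search: try every assignment of 4 colors to the 5
# vertices and keep the first one violating no edge constraint.  The
# paper embeds the same constraints into coupled winner-take-all
# modules, one per vertex, with constraint neurons enforcing the edges.
solution = next(
    assign
    for assign in product(range(4), repeat=n)
    if all(assign[a] != assign[b] for a, b in edges)
)
print(solution)
```

Exhaustive search scales as 4^n; the point of the paper is that a recurrent network with unstable exploratory dynamics can search the same space without enumerating it.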
Low-rank regularization for learning gene expression programs.
Ye, Guibo; Tang, Mengfan; Cai, Jian-Feng; Nie, Qing; Xie, Xiaohui
2013-01-01
Learning gene expression programs directly from a set of observations is challenging due to the complexity of gene regulation, high noise of experimental measurements, and insufficient number of experimental measurements. Imposing additional constraints with strong and biologically motivated regularizations is critical in developing reliable and effective algorithms for inferring gene expression programs. Here we propose a new form of regularization that constrains the number of independent connectivity patterns between regulators and targets, motivated by the modular design of gene regulatory programs and the belief that the total number of independent regulatory modules should be small. We formulate a multi-target linear regression framework to incorporate this type of regularization, in which the number of independent connectivity patterns is expressed as the rank of the connectivity matrix between regulators and targets. We then generalize the linear framework to nonlinear cases, and prove that the generalized low-rank regularization model is still convex. Efficient algorithms are derived to solve both the linear and nonlinear low-rank regularized problems. Finally, we test the algorithms on three gene expression datasets, and show that the low-rank regularization improves the accuracy of gene expression prediction in these three datasets.
Apollo 14 visibility tests: Visibility of lunar surface features and lunar landing
NASA Technical Reports Server (NTRS)
Ziedman, K.
1972-01-01
An in-flight visibility test conducted on the Apollo 14 mission is discussed. The need for obtaining experimental data on lunar feature visibility arose from visibility problems associated with various aspects of the Apollo missions; and especially from anticipated difficulties of recognizing lunar surface features at the time of descent and landing under certain illumination conditions. Although visibility problems have influenced many other aspects of the Apollo mission, they have been particularly important for descent operations, due to the criticality of this mission phase and the crew's guidance and control role for landing site recognition and touchdown point selection. A series of analytical and photographic studies were conducted during the Apollo program (prior to as well as after the initial manned lunar operations) to delineate constraints imposed on landing operations by visibility limitations. The purpose of the visibility test conducted on Apollo 14 was to obtain data to reduce uncertainties and to extend the analytical models of visibility in the lunar environment.
A method for the dynamic management of genetic variability in dairy cattle
Colleau, Jean-Jacques; Moureaux, Sophie; Briend, Michèle; Bechu, Jérôme
2004-01-01
According to the general approach developed in this paper, dynamic management of genetic variability in selected populations of dairy cattle is carried out for three simultaneous purposes: procreation of young bulls to be further progeny-tested, use of service bulls already selected, and approval of recently progeny-tested bulls for use. At each step, the objective is to minimize the average pairwise relationship coefficient between the future population born from programmed matings and the existing population. As a common constraint, the average estimated breeding value of the new population, for a selection goal including many important traits, is set to a desired value. For the procreation of young bulls, breeding costs are additionally constrained. Optimization is fully analytical and directly considers matings. The corresponding algorithms are presented in detail. The efficiency of these procedures was tested on the current Norman population. Comparisons between optimized and real matings clearly showed that optimization would have saved substantial genetic variability without reducing short-term genetic gains. PMID:15231230
Graeve, Catherine; McGovern, Patricia; Nachreiner, Nancy M; Ayers, Lynn
2014-01-01
Occupational health nurses use their knowledge and skills to improve the health and safety of the working population; however, companies increasingly face budget constraints and may eliminate health and safety programs. Occupational health nurses must be prepared to document their services and outcomes, and use quantitative tools to demonstrate their value to employers. The aim of this project was to create and pilot test a quantitative tool for occupational health nurses to track their activities and potential cost savings for on-site occupational health nursing services. Tool development included a pilot test in which semi-structured interviews with occupational health and safety leaders were conducted to identify current issues and products used for estimating the value of occupational health nursing services. The outcome was the creation of a tool that estimates the economic value of occupational health nursing services. The feasibility and potential value of this tool are described.
Modeling and Analysis of Power Processing Systems (MAPPS). Volume 2: Appendices
NASA Technical Reports Server (NTRS)
Lee, F. C.; Radman, S.; Carter, R. A.; Wu, C. H.; Yu, Y.; Chang, R.
1980-01-01
The computer programs and derivations generated in support of the modeling and design optimization program are presented. Programs for the buck regulator, boost regulator, and buck-boost regulator are described. The computer program for the design optimization calculations is presented. Constraints for the boost and buck-boost converter were derived. Derivations of state-space equations and transfer functions are presented. Computer lists for the converters are presented, and the input parameters justified.
CONORBIT: constrained optimization by radial basis function interpolation in trust regions
Regis, Rommel G.; Wild, Stefan M.
2016-09-26
This paper presents CONORBIT (CONstrained Optimization by Radial Basis function Interpolation in Trust regions), a derivative-free algorithm for constrained black-box optimization where the objective and constraint functions are computationally expensive. CONORBIT employs a trust-region framework that uses interpolating radial basis function (RBF) models for the objective and constraint functions, and is an extension of the ORBIT algorithm. It uses a small margin for the RBF constraint models to facilitate the generation of feasible iterates, and extensive numerical tests confirm that such a margin is helpful in improving performance. CONORBIT is compared with other algorithms on 27 test problems, a chemical process optimization problem, and an automotive application. Numerical results show that CONORBIT performs better than COBYLA, a sequential penalty derivative-free method, an augmented Lagrangian method, a direct search method, and another RBF-based algorithm on the test problems and on the automotive application.
Business as Usual or Brave New World? A College President's Perspective.
ERIC Educational Resources Information Center
Keohane, Nannerl O.
1986-01-01
The Sloan Foundation's New Liberal Arts Program aims to make a fundamental transformation in the liberal arts curriculum, by infusing applied mathematics and technological literacy. The program is examined by the president of Wellesley College in the context of current philosophical and practical constraints in higher education. (MSE)
Using Social-Impact Borrowing to Expand Preschool-to-Third Grade Programs in Urban Schools
ERIC Educational Resources Information Center
Temple, Judy A.; Reynolds, Arthur J.
2015-01-01
Budget constraints and difficulty raising taxes limit school districts from expanding education programming, even when research shows that additional expenditures would generate economic benefits that are greater than costs. Recently, coalitions of private investors, philanthropists, education practitioners, and government finance analysts have…
ERIC Educational Resources Information Center
Ryan, Margaret Vail
2011-01-01
Prominent challenges facing contemporary community colleges are enhancing leadership capacity and serving their diverse student populations. While doctoral education constitutes a mainstay strategy for developing community college leaders, community college professionals face constraints accessing doctoral programs. The innovation of an…
Remote sensing support for national forest inventories
Ronald E. McRoberts; Erkki O. Tomppo
2007-01-01
National forest inventory programs are tasked to produce timely and accurate estimates for a wide range of forest resource variables for a variety of users and applications. Time, cost, and precision constraints cause these programs to seek technological innovations that contribute to measurement and estimation efficiencies and that facilitate the production and...
The STEM Initiative: Constraints and Challenges
ERIC Educational Resources Information Center
Herschbach, Dennis R.
2011-01-01
There is considerable national interest in STEM initiatives, but yet there is little discussion concerning what STEM means in terms of a curriculum concept to be applied to school programming. This article focuses on STEM as a curriculum concept. First, STEM programming is discussed in terms of separate subjects, correlated and broad fields…
Lingua and Erasmus: Circumventing the Constraints.
ERIC Educational Resources Information Center
Chambers, Gary
1994-01-01
Discusses the development and implementation of a student exchange program between the University of Leeds in England and the Institut fur Praxis der Theorie der Schule in Kiel, Germany. Ten Leeds students participated in the program, which was designed to give students an appreciation of German culture, language, and vocational teaching methods.…
Have Less? Do More! Marketing University Counseling Center Services.
ERIC Educational Resources Information Center
Schreier, Barry A.
Many university and college counseling centers are experiencing increased financial constraints and a growing lack of general institutional support. This paper suggests that psycho-educational programming may be one solution for reaching more students while spending less in financial and staff hour resources. Although educational programming may…
An Introduction to the Safe Schools/Healthy Students Initiative
ERIC Educational Resources Information Center
Modzeleski, William; Mathews-Younes, Anne; Arroyo, Carmen G.; Mannix, Danyelle; Wells, Michael E.; Hill, Gary; Yu, Ping; Murray, Stephen
2012-01-01
The Safe Schools/Healthy Students (SS/HS) Initiative offers a unique opportunity to conduct large-scale, multisite, multilevel program evaluation in the context of a federal environment that places many requirements and constraints on how the grants are conducted and managed. Federal programs stress performance-based outcomes, valid and reliable…
Evolutionary Scheduler for the Deep Space Network
NASA Technical Reports Server (NTRS)
Guillaume, Alexandre; Lee, Seungwon; Wang, Yeou-Fang; Zheng, Hua; Chau, Savio; Tung, Yu-Wen; Terrile, Richard J.; Hovden, Robert
2010-01-01
A computer program assists human schedulers in satisfying, to the maximum extent possible, competing demands from multiple spacecraft missions for utilization of the transmitting/receiving Earth stations of NASA's Deep Space Network. The program embodies a concept of optimal scheduling to attain multiple objectives in the presence of multiple constraints.
PATRAN-STAGS translator (PATSTAGS)
NASA Technical Reports Server (NTRS)
Otte, Neil
1990-01-01
A computer program used to translate PATRAN finite element model data into Structural Analysis of General Shells (STAGS) input data is presented. The program supports translation of nodal, nodal-constraint, element, force, and pressure data. The subroutine UPRESS, required for reading live pressure data into STAGS, is also presented.
Guevara, V R
2004-02-01
A nonlinear programming optimization model was developed to maximize margin over feed cost in broiler feed formulation and is described in this paper. The model identifies the optimal feed mix that maximizes profit margin. Optimum metabolizable energy level and performance were found by using Excel Solver nonlinear programming. Data from an energy density study with broilers were fitted to quadratic equations to express weight gain, feed consumption, and the objective function, income over feed cost, in terms of energy density. Nutrient:energy ratio constraints were transformed into equivalent linear constraints. National Research Council nutrient requirements and feeding program were used for examining changes in variables. The nonlinear programming feed formulation method was used to illustrate the effects of changes in different variables on the optimum energy density, performance, and profitability and was compared with conventional linear programming. To demonstrate the capabilities of the model, I determined the impact of variation in prices. Prices for broiler, corn, fish meal, and soybean meal were increased and decreased by 25%. Formulations were identical in all other respects. Energy density, margin, and diet cost changed relative to the conventional linear programming formulation. This study suggests that nonlinear programming can be more useful than conventional linear programming to optimize performance response to energy density in broiler feed formulation because an energy level does not need to be set.
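The approach described — fitting quadratic response curves in energy density and then maximizing income over feed cost rather than fixing the energy level in advance — can be sketched as follows. Every coefficient, price, and range below is a hypothetical stand-in, not a value from the study, which used Excel Solver on real broiler data.

```python
# Sketch of nonlinear-programming feed formulation: weight gain and feed
# intake are quadratic/linear fits in dietary energy density E (kcal ME/kg),
# and we search for the E that maximizes income over feed cost.
# All coefficients and prices are invented for illustration.

def gain(E):
    """Live-weight gain (kg/bird), hypothetical quadratic fit in E."""
    return -9.0 + 0.008 * E - 1.35e-6 * E ** 2

def feed_intake(E):
    """Feed consumed (kg/bird); intake declines as the diet gets denser."""
    return 9.0 - 0.002 * E

def diet_cost(E):
    """Feed price ($/kg); energy-dense diets cost more."""
    return 0.05 + 1e-4 * E

BROILER_PRICE = 1.2  # $/kg live weight (hypothetical)

def margin(E):
    """Objective: income over feed cost, $/bird."""
    return BROILER_PRICE * gain(E) - diet_cost(E) * feed_intake(E)

# Grid search over a plausible energy-density range; a real formulation
# would solve this jointly with linear nutrient-ratio constraints.
best_E = max(range(2800, 3401), key=margin)
```

The optimum falls in the interior of the range rather than at a preset energy level, which is exactly the flexibility the abstract credits nonlinear programming with over conventional linear formulation.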
φ-evo: A program to evolve phenotypic models of biological networks.
Henry, Adrien; Hemery, Mathieu; François, Paul
2018-06-01
Molecular networks are at the core of most cellular decisions, but are often difficult to comprehend. Reverse engineering of network architecture from their functions has proved fruitful to classify and predict the structure and function of molecular networks, suggesting new experimental tests and biological predictions. We present φ-evo, an open-source program to evolve in silico phenotypic networks performing a given biological function. We include implementations for evolution of biochemical adaptation, adaptive sorting for immune recognition, metazoan development (somitogenesis, hox patterning), as well as Pareto evolution. We detail the program architecture based on C, Python 3, and a Jupyter interface for project configuration and network analysis. We illustrate the predictive power of φ-evo by first recovering the asymmetrical structure of the lac operon regulation from an objective function with symmetrical constraints. Second, we use the problem of hox-like embryonic patterning to show how a single effective fitness can emerge from multi-objective (Pareto) evolution. φ-evo provides an efficient approach and user-friendly interface for the phenotypic prediction of networks and the numerical study of evolution itself.
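The Pareto selection underlying the multi-objective mode can be illustrated in a few lines: a candidate network survives unless some other candidate is at least as good on every objective and strictly better on one. The objective names and scores below are invented; φ-evo's actual implementation is in C and Python 3.

```python
# Minimal Pareto-front sketch for multi-objective evolution.
# Each candidate is scored on two minimized objectives, e.g.
# (patterning error, network size) -- values hypothetical.

def dominates(a, b):
    """a dominates b when a is <= b on every objective and < on at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(scores):
    """Keep only candidates not dominated by any other candidate."""
    return [s for s in scores if not any(dominates(o, s) for o in scores)]

candidates = [(0.9, 3), (0.5, 5), (0.2, 8), (0.6, 5), (0.2, 9)]
front = pareto_front(candidates)  # nondominated trade-offs survive
```

Selection pressure then acts only along the surviving trade-off surface, which is how a single effective fitness can emerge from several competing objectives over the course of evolution.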