Wilk, Szymon; Michalowski, Martin; Michalowski, Wojtek; Hing, Marisela Mainegra; Farion, Ken
2011-01-01
This paper describes a new methodological approach to reconciling adverse and contradictory activities (called points of contention) that occur when a patient is managed according to two or more concurrently used clinical practice guidelines (CPGs). The need to address these inconsistencies arises when a patient with more than one disease, each of which is a comorbid condition, has to be managed according to different treatment regimens. We propose an automatic procedure that constructs a mathematical guideline model using the Constraint Logic Programming (CLP) methodology, uses this model to identify and mitigate encountered points of contention, and revises the considered CPGs accordingly. The proposed procedure is used as an alerting mechanism and, coupled with a guideline execution engine, warns the physician about potential problems with the concurrent application of two or more guidelines. We illustrate the operation of our procedure in a clinical scenario describing the simultaneous use of CPGs for duodenal ulcer and transient ischemic attack.
NASA Astrophysics Data System (ADS)
Long, Kai; Wang, Xuan; Gu, Xianguang
2017-09-01
The present work introduces a novel concurrent optimization formulation to meet the requirements of lightweight design and various constraints simultaneously. Nodal displacement of the macrostructure and effective thermal conductivity of the microstructure are regarded as the constraint functions, which means taking into account both the load-carrying capability and the thermal insulation property. The effective properties of the porous material, derived from numerical homogenization, are used for macrostructural analysis. Meanwhile, displacement vectors of the macrostructure from the original and adjoint load cases are used for sensitivity analysis of the microstructure. Design variables in the form of reciprocal functions of the relative densities are introduced and used to linearize the constraint functions. The objective function, the total mass, is approximated by a second-order Taylor series expansion. The proposed concurrent optimization problem is then solved with a sequential quadratic programming algorithm, by splitting it into a series of sub-problems in the form of quadratic programs. Finally, several numerical examples are presented to validate the effectiveness of the proposed optimization method. The effects of the initial design, the prescribed limits on nodal displacement, and the effective thermal conductivity on the optimized designs are also investigated. A number of optimized macrostructures and their corresponding microstructures are obtained.
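For readers unfamiliar with this style of formulation, a schematic form of the per-iteration subproblem described above (linearized constraints in reciprocal-density variables and a second-order Taylor approximation of the mass) might look as follows; the notation is ours, not the paper's.

```latex
% Schematic SQP subproblem at iterate x^{(k)}, with reciprocal-density design
% variables x_i = 1/\rho_i; notation is ours, not the authors'.
\begin{aligned}
  \min_{\Delta x}\quad
    & m(x^{(k)}) + \nabla m^{\mathsf T}\Delta x
      + \tfrac{1}{2}\,\Delta x^{\mathsf T} H\,\Delta x
    && \text{(second-order Taylor expansion of total mass)}\\
  \text{s.t.}\quad
    & u_j(x^{(k)}) + \nabla u_j^{\mathsf T}\Delta x \le \bar{u}_j,
    && \text{(linearized nodal-displacement constraints)}\\
    & \kappa(x^{(k)}) + \nabla \kappa^{\mathsf T}\Delta x \ge \bar{\kappa}.
    && \text{(linearized effective-thermal-conductivity constraint)}
\end{aligned}
```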
Cooperating knowledge-based systems
NASA Technical Reports Server (NTRS)
Feigenbaum, Edward A.; Buchanan, Bruce G.
1988-01-01
This final report covers work performed under Contract NCC2-220 between NASA Ames Research Center and the Knowledge Systems Laboratory, Stanford University. The period of research was from March 1, 1987 to February 29, 1988. Topics covered were as follows: (1) concurrent architectures for knowledge-based systems; (2) methods for the solution of geometric constraint satisfaction problems, and (3) reasoning under uncertainty. The research in concurrent architectures was co-funded by DARPA, as part of that agency's Strategic Computing Program. The research has been in progress since 1985, under DARPA and NASA sponsorship. The research in geometric constraint satisfaction has been done in the context of a particular application, that of determining the 3-D structure of complex protein molecules, using the constraints inferred from NMR measurements.
From non-preemptive to preemptive scheduling using synchronization synthesis.
Černý, Pavol; Clarke, Edmund M; Henzinger, Thomas A; Radhakrishna, Arjun; Ryzhyk, Leonid; Samanta, Roopsha; Tarrach, Thorsten
2017-01-01
We present a computer-aided programming approach to concurrency. The approach allows programmers to program assuming a friendly, non-preemptive scheduler, and our synthesis procedure inserts synchronization to ensure that the final program works even with a preemptive scheduler. The correctness specification is implicit, inferred from the non-preemptive behavior. Let us consider sequences of calls that the program makes to an external interface. The specification requires that any such sequence produced under a preemptive scheduler should be included in the set of sequences produced under a non-preemptive scheduler. We guarantee that our synthesis does not introduce deadlocks and that the synchronization inserted is optimal w.r.t. a given objective function. The solution is based on a finitary abstraction, an algorithm for bounded language inclusion modulo an independence relation, and generation of a set of global constraints over synchronization placements. Each model of the global constraints set corresponds to a correctness-ensuring synchronization placement. The placement that is optimal w.r.t. the given objective function is chosen as the synchronization solution. We apply the approach to device-driver programming, where the driver threads call the software interface of the device and the API provided by the operating system. Our experiments demonstrate that our synthesis method is precise and efficient. The implicit specification helped us find one concurrency bug previously missed when model-checking using an explicit, user-provided specification. We implemented objective functions for coarse-grained and fine-grained locking and observed that different synchronization placements are produced for our experiments, favoring a minimal number of synchronization operations or maximum concurrency, respectively.
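To make the implicit specification concrete, here is a minimal Python sketch (not the paper's synthesis algorithm) of the intended outcome: two driver threads call a stand-in external interface, and an inserted lock ensures that even under preemption the observable call sequences are ones a non-preemptive scheduler could have produced. The interface, thread names, and lock placement are illustrative assumptions.

```python
import threading

# Hypothetical external interface: we record the sequence of calls made to it.
trace = []
trace_lock = threading.Lock()

def device_write(tag):
    # Stand-in for a call to an external device API.
    with trace_lock:
        trace.append(tag)

# "Synthesized" synchronization: one lock protecting the call sequence that a
# non-preemptive scheduler would have kept contiguous.
synth_lock = threading.Lock()

def driver_thread(name):
    with synth_lock:                      # inserted so preemptive runs still
        device_write((name, "prepare"))   # produce only call sequences that are
        device_write((name, "commit"))    # observable under a non-preemptive scheduler

threads = [threading.Thread(target=driver_thread, args=(n,)) for n in ("A", "B")]
for t in threads: t.start()
for t in threads: t.join()

# Each thread's prepare/commit pair stays adjacent, as it would when the thread
# runs to completion under a friendly, non-preemptive scheduler.
print(trace)
```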
Detecting Potential Synchronization Constraint Deadlocks from Formal System Specifications
1992-03-01
family of languages, consisting of the Larch Shared Language and a series of Larch interface languages, specific to particular programming languages...specify sequential (non-concurrent) programs, and explicitly does not include the ability to specify atomic actions (Guttag, 1985). Larch is therefore...synchronized communication between two such agents is considered as a single action. The transitions in CCS trees are labelled to show how they are
Partitioning problems in parallel, pipelined and distributed computing
NASA Technical Reports Server (NTRS)
Bokhari, S.
1985-01-01
The problem of optimally assigning the modules of a parallel program over the processors of a multiple computer system is addressed. A Sum-Bottleneck path algorithm is developed that permits the efficient solution of many variants of this problem under some constraints on the structure of the partitions. In particular, the following problems are solved optimally for a single-host, multiple satellite system: partitioning multiple chain structured parallel programs, multiple arbitrarily structured serial programs and single tree structured parallel programs. In addition, the problems of partitioning chain structured parallel programs across chain connected systems and across shared memory (or shared bus) systems are also solved under certain constraints. All solutions for parallel programs are equally applicable to pipelined programs. These results extend prior research in this area by explicitly taking concurrency into account and permit the efficient utilization of multiple computer architectures for a wide range of problems of practical interest.
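As a simplified illustration of the chain-partitioning flavor of problem addressed above (not Bokhari's Sum-Bottleneck path algorithm, and ignoring communication costs), the sketch below splits a chain of module weights into contiguous blocks over a fixed number of processors so as to minimize the bottleneck load, using binary search with a greedy feasibility check.

```python
def partition_chain(weights, processors):
    """Split a chain of module weights into `processors` contiguous blocks,
    minimizing the bottleneck (heaviest block). Binary search on the
    bottleneck value with a greedy feasibility check."""
    def feasible(limit):
        blocks, current = 1, 0
        for w in weights:
            if w > limit:
                return False
            if current + w > limit:
                blocks += 1
                current = w
            else:
                current += w
        return blocks <= processors

    lo, hi = max(weights), sum(weights)
    while lo < hi:
        mid = (lo + hi) // 2
        if feasible(mid):
            hi = mid
        else:
            lo = mid + 1
    return lo

# Example: 8 chain-structured modules mapped onto 3 processors.
print(partition_chain([4, 2, 7, 1, 3, 6, 2, 5], 3))  # minimal bottleneck load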
RACER: Effective Race Detection Using AspectJ
NASA Technical Reports Server (NTRS)
Bodden, Eric; Havelund, Klaus
2008-01-01
Programming errors occur frequently in large software systems, and even more so if these systems are concurrent. In the past, researchers have developed specialized programs to aid programmers in detecting concurrent programming errors such as deadlocks, livelocks, starvation, and data races. In this work we propose a language extension to the aspect-oriented programming language AspectJ, in the form of three new built-in pointcuts, lock(), unlock(), and maybeShared(), which allow programmers to monitor program events where locks are granted or handed back, and where values are accessed that may be shared among multiple Java threads. We decide thread-locality using a static thread-local objects analysis developed by others. Using the three new primitive pointcuts, researchers can directly implement efficient monitoring algorithms to detect concurrent programming errors online. As an example, we present a new algorithm which we call RACER, an adaptation of the well-known ERASER algorithm to the memory model of Java. We implemented the new pointcuts as an extension to the AspectBench Compiler, implemented the RACER algorithm using this language extension, and then applied the algorithm to the NASA K9 Rover Executive. Our experiments showed the implementation to be very effective: in the Rover Executive, RACER finds 70 data races, only one of which was previously known. We further applied the algorithm to two other multi-threaded programs written by computer science researchers, in which we found races as well.
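The lockset idea underlying ERASER-style detectors can be sketched in a few lines. The following Python fragment is an illustration of that general idea only, not the AspectJ instrumentation or RACER's Java-memory-model refinements; field names and lock names are hypothetical.

```python
# Minimal lockset ("Eraser"-style) bookkeeping: for every shared field we keep
# the intersection of the locks held at each access; an empty intersection
# means no single lock consistently protects the field -- a potential race.
candidate_locks = {}          # field -> set of lock names still "protecting" it
reported = set()

def on_access(field, locks_held):
    held = set(locks_held)
    if field not in candidate_locks:
        candidate_locks[field] = held          # first access initializes the lockset
    else:
        candidate_locks[field] &= held         # refine by intersection
    if not candidate_locks[field] and field not in reported:
        reported.add(field)
        print(f"potential data race on {field!r}")

# Simulated access events (field name, locks held at the access site):
on_access("rover.position", {"stateLock"})
on_access("rover.position", {"stateLock"})     # consistently protected: fine
on_access("rover.mode", {"stateLock"})
on_access("rover.mode", {"cmdLock"})           # no common lock -> reported
```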
Wilk, Szymon; Michalowski, Wojtek; Michalowski, Martin; Farion, Ken; Hing, Marisela Mainegra; Mohapatra, Subhra
2013-04-01
We propose a new method to mitigate (identify and address) adverse interactions (drug-drug or drug-disease) that occur when a patient with comorbid diseases is managed according to two concurrently applied clinical practice guidelines (CPGs). A lack of methods to facilitate the concurrent application of CPGs severely limits their use in clinical practice and the development of such methods is one of the grand challenges for clinical decision support. The proposed method responds to this challenge. We introduce and formally define logical models of CPGs and other related concepts, and develop the mitigation algorithm that operates on these concepts. In the algorithm we combine domain knowledge encoded as interaction and revision operators using the constraint logic programming (CLP) paradigm. The operators characterize adverse interactions and describe revisions to logical models required to address these interactions, while CLP allows us to efficiently solve the logical models - a solution represents a feasible therapy that may be safely applied to a patient. The mitigation algorithm accepts two CPGs and available (likely incomplete) patient information. It reports whether mitigation has been successful or not, and on success it gives a feasible therapy and points at identified interactions (if any) together with the revisions that address them. Thus, we consider the mitigation algorithm as an alerting tool to support a physician in the concurrent application of CPGs that can be implemented as a component of a clinical decision support system. We illustrate our method in the context of two clinical scenarios involving a patient with duodenal ulcer who experiences an episode of transient ischemic attack. Copyright © 2013 Elsevier Inc. All rights reserved.
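At a very high level, "solving the logical models" amounts to finding a combined therapy that satisfies every guideline step while avoiding adverse interaction pairs. The toy Python sketch below conveys that idea only; the guideline steps, drug names, and interaction pair are made-up assumptions, and a real implementation would use a CLP solver rather than brute-force enumeration.

```python
from itertools import product

# Toy stand-in for solving the combined logical models of two CPGs: pick one
# action per guideline step such that no adverse interaction pair appears.
# Step names, drug names, and the interaction below are hypothetical.
ulcer_cpg = {"eradication": ["PPI+antibiotics"], "maintenance": ["PPI", "H2-blocker"]}
tia_cpg = {"antiplatelet": ["aspirin", "clopidogrel"]}

adverse_pairs = {("PPI", "clopidogrel")}       # assumed drug-drug interaction

def feasible_therapies(*guidelines):
    steps = [(step, options) for g in guidelines for step, options in g.items()]
    names = [s for s, _ in steps]
    for choice in product(*(opts for _, opts in steps)):
        drugs = set(choice)
        if not any((a, b) in adverse_pairs or (b, a) in adverse_pairs
                   for a in drugs for b in drugs if a != b):
            yield dict(zip(names, choice))     # a feasible combined therapy

for therapy in feasible_therapies(ulcer_cpg, tia_cpg):
    print(therapy)
```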
EASAMS' Ariane 5 on-board software experience
NASA Astrophysics Data System (ADS)
Birnie, Steven Andrew
The design and development of the prototype flight software for the Ariane 5 satellite launch vehicle is considered. This was specified as being representative of the eventual real flight program in terms of timing constraints and target computer loading. The usability of HOOD (Hierarchical Object Oriented Design) and Ada for development of such preemptive multitasking computer programs was verified. Features of the prototype development included: design methods supplementary to HOOD for representation of concurrency aspects; visibility of Ada enumerated type literals across HOOD parent-child interfaces; deterministic timings achieved by modification of Ada delays; and linking of interrupts to Ada task entries.
An Analysis of the Concurrent Certification Program at Fleet Readiness Center Southwest
2009-12-01
Mapping 5S Methodology Kanban Poka-yoke A3 Problem Solving Single Point Lesson Plans (SPLP) Total Productive Maintenance (TPM) What...the actual demand of the customers. Kanban is a demand signal which immediately propagates through the supply chain. D. POKA-YOKE: A Japanese...term that means "fail-safing" or "mistake-proofing." Avoiding (yokeru) inadvertent errors (poka) is a behavior-shaping constraint, or a method of
Concurrent schedules: Effects of time- and response-allocation constraints
Davison, Michael
1991-01-01
Five pigeons were trained on concurrent variable-interval schedules arranged on two keys. In Part 1 of the experiment, the subjects responded under no constraints, and the ratios of reinforcers obtainable were varied over five levels. In Part 2, the conditions of the experiment were changed such that the time spent responding on the left key before a subsequent changeover to the right key determined the minimum time that must be spent responding on the right key before a changeover to the left key could occur. When the left key provided a higher reinforcer rate than the right key, this procedure ensured that the time allocated to the two keys was approximately equal. The data showed that such a time-allocation constraint only marginally constrained response allocation. In Part 3, the numbers of responses emitted on the left key before a changeover to the right key determined the minimum number of responses that had to be emitted on the right key before a changeover to the left key could occur. This response constraint completely constrained time allocation. These data are consistent with the view that response allocation is a fundamental process (and time allocation a derivative process), or that response and time allocation are independently controlled, in concurrent-schedule performance. PMID:16812632
Maintaining consistency in distributed systems
NASA Technical Reports Server (NTRS)
Birman, Kenneth P.
1991-01-01
In systems designed as assemblies of independently developed components, concurrent access to data or data structures normally arises within individual programs, and is controlled using mutual exclusion constructs, such as semaphores and monitors. Where data is persistent and/or sets of operations are related to one another, transactions or linearizability may be more appropriate. Systems that incorporate cooperative styles of distributed execution often replicate or distribute data within groups of components. In these cases, group-oriented consistency properties must be maintained, and tools based on the virtual synchrony execution model greatly simplify the task confronting an application developer. All three styles of distributed computing are likely to be seen in future systems, often within the same application. This leads us to propose an integrated approach that permits applications that use virtual synchrony with concurrent objects that respect a linearizability constraint, and vice versa. Transactional subsystems are treated as a special case of linearizability.
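For the first of the three styles mentioned, a minimal Python sketch of mutual exclusion within a single program is shown below (a threading.Semaphore(1) would behave the same as the Lock); it is only an illustration of the construct, not of the transactional or virtual-synchrony styles.

```python
import threading

# Concurrent access to a shared structure inside one program, controlled with
# a mutual-exclusion construct (here a Lock).
counter = 0
mutex = threading.Lock()

def worker(increments):
    global counter
    for _ in range(increments):
        with mutex:            # critical section: the read-modify-write is atomic
            counter += 1

threads = [threading.Thread(target=worker, args=(10_000,)) for _ in range(4)]
for t in threads: t.start()
for t in threads: t.join()
print(counter)                 # 40000, regardless of interleaving
```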
Concurrent design of composite materials and structures considering thermal conductivity constraints
NASA Astrophysics Data System (ADS)
Jia, J.; Cheng, W.; Long, K.
2017-08-01
This article introduces thermal conductivity constraints into concurrent design. The influence of thermal conductivity on macrostructure and orthotropic composite material is extensively investigated using the minimum mean compliance as the objective function. To simultaneously control the amounts of different phase materials, a given mass fraction is applied in the optimization algorithm. Two phase materials are assumed to compete with each other to be distributed during the process of maximizing stiffness and thermal conductivity when the mass fraction constraint is small, where phase 1 has superior stiffness and thermal conductivity whereas phase 2 has a superior ratio of stiffness to density. The effective properties of the material microstructure are computed by a numerical homogenization technique, in which the effective elasticity matrix is applied to macrostructural analyses and the effective thermal conductivity matrix is applied to the thermal conductivity constraint. To validate the effectiveness of the proposed optimization algorithm, several three-dimensional illustrative examples are provided and the features under different boundary conditions are analysed.
The Influence of Task Instruction on Action Coding: Constraint Setting or Direct Coding?
ERIC Educational Resources Information Center
Wenke, Dorit; Frensch, Peter A.
2005-01-01
In 3 experiments, the authors manipulated response instructions for 2 concurrently performed tasks. Specifically, the authors' instructions described left and right keypresses on a manual task either as left versus right or as blue versus green keypresses and required either "left" versus "right" or "blue" versus "green" concurrent verbalizations.…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shusharina, N; Khan, F; Sharp, G
Purpose: To determine the dose level and timing of the boost in locally advanced lung cancer patients with confirmed tumor recurrence by comparing the impact of different dose escalation strategies on improvement of the therapeutic ratio. Methods: We selected eighteen patients with advanced NSCLC and confirmed recurrence. For each patient, a base IMRT plan to 60 Gy prescribed to the PTV was created. Then we compared three dose escalation strategies: a uniform escalation to the original PTV, and an escalation to a PET-defined target planned sequentially and concurrently. The PET-defined targets were delineated by biologically-weighed regions on a pre-treatment 18F-FDG PET. The maximal achievable dose, without violating the OAR constraints, was identified for each boosting method. The EUD for the target, spinal cord, combined lung, and esophagus was compared for each plan. Results: The average prescribed dose was 70.4±13.9 Gy for the uniform boost, 88.5±15.9 Gy for the sequential boost and 89.1±16.5 Gy for the concurrent boost. The size of the boost planning volume was 12.8% (range: 1.4 – 27.9%) of the PTV. The most prescription-limiting dose constraint was the V70 of the esophagus. The EUD within the target increased by 10.6 Gy for the uniform boost, by 31.4 Gy for the sequential boost and by 38.2 Gy for the concurrent boost. The EUD for OARs increased by the following amounts: spinal cord, 3.1 Gy for uniform boost, 2.8 Gy for sequential boost, 5.8 Gy for concurrent boost; combined lung, 1.6 Gy for uniform, 1.1 Gy for sequential, 2.8 Gy for concurrent; esophagus, 4.2 Gy for uniform, 1.3 Gy for sequential, 5.6 Gy for concurrent. Conclusion: Dose escalation to a biologically-weighed gross tumor volume defined on a pre-treatment 18F-FDG PET may provide an improved therapeutic ratio without breaching predefined OAR constraints. The sequential boost provides better sparing of OARs as compared with the concurrent boost.
Control system software, simulation, and robotic applications
NASA Technical Reports Server (NTRS)
Frisch, Harold P.
1991-01-01
All essential existing capabilities needed to create a man-machine interaction dynamics and performance (MMIDAP) capability are reviewed. The multibody system dynamics software program Order N DISCOS will be used for machine and musculo-skeletal dynamics modeling. The program JACK will be used for estimating and animating whole body human response to given loading situations and motion constraints. The basic elements of performance (BEP) task decomposition methodologies associated with the Human Performance Institute database will be used for performance assessment. Techniques for resolving the statically indeterminant muscular load sharing problem will be used for a detailed understanding of potential musculotendon or ligamentous fatigue, pain, discomfort, and trauma. The envisioned capacity is to be used for mechanical system design, human performance assessment, extrapolation of man/machine interaction test data, biomedical engineering, and soft prototyping within a concurrent engineering (CE) system.
Language and System Support for Concurrent Programming
1990-04-01
language. We give suggestions on how to avoid polling programs, and suggest changes to the rendezvous facilities to eliminate the polling bias. The...concerned with support for concurrent programming provided to the application programmer by operating systems and programming ...of concurrent programming has widened from "pure" operating system applications to a multitude of real-time and distributed programs. Since
NASA Technical Reports Server (NTRS)
Chien, Andrew A.; Karamcheti, Vijay; Plevyak, John; Sahrawat, Deepak
1993-01-01
Concurrent object-oriented languages, particularly fine-grained approaches, reduce the difficulty of large scale concurrent programming by providing modularity through encapsulation while exposing large degrees of concurrency. Despite these programmability advantages, such languages have historically suffered from poor efficiency. This paper describes the Concert project whose goal is to develop portable, efficient implementations of fine-grained concurrent object-oriented languages. Our approach incorporates aggressive program analysis and program transformation with careful information management at every stage from the compiler to the runtime system. The paper discusses the basic elements of the Concert approach along with a description of the potential payoffs. Initial performance results and specific plans for system development are also detailed.
A Concurrent Distributed System for Aircraft Tactical Decision Generation
NASA Technical Reports Server (NTRS)
McManus, John W.
1990-01-01
A research program investigating the use of artificial intelligence (AI) techniques to aid in the development of a Tactical Decision Generator (TDG) for Within Visual Range (WVR) air combat engagements is discussed. The application of AI programming and problem solving methods in the development and implementation of a concurrent version of the Computerized Logic For Air-to-Air Warfare Simulations (CLAWS) program, a second generation TDG, is presented. Concurrent computing environments and programming approaches are discussed and the design and performance of a prototype concurrent TDG system are presented.
Constraint-based Attribute and Interval Planning
NASA Technical Reports Server (NTRS)
Jonsson, Ari; Frank, Jeremy
2013-01-01
In this paper we describe Constraint-based Attribute and Interval Planning (CAIP), a paradigm for representing and reasoning about plans. The paradigm enables the description of planning domains with time, resources, concurrent activities, mutual exclusions among sets of activities, disjunctive preconditions and conditional effects. We provide a theoretical foundation for the paradigm, based on temporal intervals and attributes. We then show how the plans are naturally expressed by networks of constraints, and show that the process of planning maps directly to dynamic constraint reasoning. In addition, we define compatibilities, a compact mechanism for describing planning domains. We describe how this framework can incorporate the use of constraint reasoning technology to improve planning. Finally, we describe EUROPA, an implementation of the CAIP framework.
A Novel Joint Problem of Routing, Scheduling, and Variable-Width Channel Allocation in WMNs
Liu, Wan-Yu; Chou, Chun-Hung
2014-01-01
This paper investigates a novel joint problem of routing, scheduling, and channel allocation for single-radio multichannel wireless mesh networks, in which multiple channel widths can be adjusted dynamically through a new software technology so that more concurrent transmissions and less overlapping-channel interference can be achieved. Although previous works have studied this joint problem, their linear programming models did not incorporate some delicate constraints. As a result, this paper first constructs a linear programming model with more practical concerns and then proposes a simulated annealing approach with a novel encoding mechanism, in which the configurations of multiple time slots are devised to characterize the dynamic transmission process. Experimental results show that our approach can find the same or similar solutions as the optimal solutions for smaller-scale problems and can efficiently find good-quality solutions for a variety of larger-scale problems. PMID:24982990
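A generic simulated-annealing skeleton of the kind the abstract builds on is sketched below; the state encoding (one channel index per link) and the toy interference-counting objective are placeholders of ours, not the authors' routing/scheduling model.

```python
import math, random

def anneal(initial, objective, neighbour, t0=10.0, cooling=0.995, steps=5000):
    # Standard simulated-annealing loop: accept improving moves always,
    # worsening moves with probability exp(-delta / temperature).
    state = best = initial
    t = t0
    for _ in range(steps):
        cand = neighbour(state)
        delta = objective(cand) - objective(state)
        if delta < 0 or random.random() < math.exp(-delta / max(t, 1e-9)):
            state = cand
            if objective(state) < objective(best):
                best = state
        t *= cooling
    return best

LINKS, CHANNELS = 6, 3

def objective(assign):                 # count pairs of links sharing a channel
    return sum(1 for i in range(LINKS) for j in range(i + 1, LINKS)
               if assign[i] == assign[j])

def neighbour(assign):                 # re-assign one randomly chosen link
    assign = list(assign)
    assign[random.randrange(LINKS)] = random.randrange(CHANNELS)
    return assign

print(anneal([0] * LINKS, objective, neighbour))
```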
ERIC Educational Resources Information Center
Haag, Patricia W.
2015-01-01
Career and technical education concurrent enrollment may pose unique challenges in programming and enrollment for program administrators, and this chapter describes the experiences and challenges of a CTE concurrent enrollment administrator.
C formal verification with unix communication and concurrency
NASA Technical Reports Server (NTRS)
Hoover, Doug N.
1990-01-01
The results of a NASA SBIR project are presented in which CSP-Ariel, a verification system for C programs that use Unix system calls for concurrent programming, interprocess communication, and file input and output, was developed. This project builds on ORA's Ariel C verification system by using the formalism of Hoare's book, Communicating Sequential Processes, to model concurrency and communication. The system runs in ORA's Clio theorem proving environment. The use of CSP to model Unix concurrency is outlined and the CSP semantics of a simple concurrent program is sketched. Plans for further development of CSP-Ariel are discussed. This paper is presented in viewgraph form.
The Caltech Concurrent Computation Program - Project description
NASA Technical Reports Server (NTRS)
Fox, G.; Otto, S.; Lyzenga, G.; Rogstad, D.
1985-01-01
The Caltech Concurrent Computation Program, which studies basic issues in computational science, is described. The research builds on initial work where novel concurrent hardware, the necessary systems software to use it, and twenty significant scientific implementations running on the initial 32, 64, and 128 node hypercube machines have been constructed. A major goal of the program will be to extend this work into new disciplines and more complex algorithms, including general packages that decompose arbitrary problems in major application areas. New high-performance concurrent processors with up to 1024 nodes, over a gigabyte of memory and multigigaflop performance are being constructed. The implementations cover a wide range of problems in areas such as high energy and astrophysics, condensed matter, chemical reactions, plasma physics, applied mathematics, geophysics, simulation, CAD for VLSI, graphics and image processing. The products of the research program include the concurrent algorithms, hardware, systems software, and complete program implementations.
NASA Technical Reports Server (NTRS)
Burns, K. Lee; Altino, Karen
2008-01-01
The Marshall Space Flight Center Natural Environments Branch has a long history of expertise in the modeling and computation of statistical launch availabilities with respect to weather conditions. Its existing data analysis product, the Atmospheric Parametric Risk Assessment (APRA) tool, computes launch availability given an input set of vehicle hardware and/or operational weather constraints by calculating the climatological probability of exceeding the specified constraint limits. APRA has been used extensively to provide the Space Shuttle program the ability to estimate the impacts that various proposed design modifications would have on overall launch availability. The model accounts for both seasonal and diurnal variability at a single geographic location and provides output probabilities for a single arbitrary launch attempt. Recently, the Shuttle program has shown interest in having additional capabilities added to the APRA model, including analysis of humidity parameters, inclusion of landing site weather to produce landing availability, and concurrent analysis of multiple sites, to assist in operational landing site selection. In addition, the Constellation program has also expressed interest in the APRA tool and has requested several additional capabilities to address some Constellation-specific issues, both in the specification and verification of design requirements and in the development of operations concepts. The combined scope of the requested capability enhancements suggests an evolution of the model beyond a simple revision process. Development has begun on a new data analysis tool that will satisfy the requests of both programs. This new tool, Probabilities of Atmospheric Conditions and Environmental Risk (PACER), will provide greater flexibility and significantly enhanced functionality compared to the currently existing tool.
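The core calculation described (the climatological probability of exceeding a constraint limit) can be illustrated with a short Python sketch; the wind variable, the constraint limit, and the synthetic "climatology" below are all assumptions for illustration, not APRA's actual data or method.

```python
import random

# Simplified illustration of an APRA-style computation: estimate the
# climatological probability that observed conditions exceed a vehicle
# constraint limit, from historical samples (synthetic data here).
random.seed(1)
peak_wind_kts = [random.gauss(15, 6) for _ in range(10_000)]   # stand-in climatology

def exceedance_probability(samples, limit):
    return sum(s > limit for s in samples) / len(samples)

wind_limit = 24.0                                              # hypothetical constraint
p_violate = exceedance_probability(peak_wind_kts, wind_limit)
print(f"launch availability ~ {1 - p_violate:.2%} for a single attempt")
```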
An Evaluation of One-Sided and Two-Sided Communication Paradigms on Relaxed-Ordering Interconnect
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ibrahim, Khaled Z.; Hargrove, Paul H.; Iancu, Costin
The Cray Gemini interconnect hardware provides multiple transfer mechanisms and out-of-order message delivery to improve communication throughput. In this paper we quantify the performance of one-sided and two-sided communication paradigms with respect to: 1) the optimal available hardware transfer mechanism, 2) message ordering constraints, 3) per node and per core message concurrency. In addition to using Cray native communication APIs, we use UPC and MPI micro-benchmarks to capture one- and two-sided semantics respectively. Our results indicate that relaxing the message delivery order can improve performance up to 4.6x when compared with strict ordering. When hardware allows it, high-level one-sided programming models can already take advantage of message reordering. Enforcing the ordering semantics of two-sided communication comes with a performance penalty. Furthermore, we argue that exposing out-of-order delivery at the application level is required for the next-generation programming models. Any ordering constraints in the language specifications reduce communication performance for small messages and increase the number of active cores required for peak throughput.
NASA Astrophysics Data System (ADS)
Hahn, T.
2016-10-01
The parallel version of the multidimensional numerical integration package Cuba is presented and achievable speed-ups discussed. The parallelization is based on the fork/wait POSIX functions, needs no extra software installed, imposes almost no constraints on the integrand function, and works largely automatically.
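The fork/wait approach mentioned above can be sketched in Python for readers who have not used it; this is an illustrative, POSIX-only sketch of the general technique (fork worker processes, collect partial results, wait for the children), not Cuba's implementation, and the integrand is a placeholder.

```python
import os, struct

def integrand(x):
    return x * x                      # placeholder integrand

def integrate(lo, hi, n=100_000):
    # Simple midpoint rule on [lo, hi].
    h = (hi - lo) / n
    return sum(integrand(lo + (i + 0.5) * h) for i in range(n)) * h

def parallel_integrate(workers=4):
    # Each forked child integrates one sub-range of [0, 1] and reports its
    # partial sum back to the parent through a pipe; the parent then waits.
    pipes, pids = [], []
    for w in range(workers):
        r, wfd = os.pipe()
        pid = os.fork()
        if pid == 0:                  # child: compute, write result, exit
            os.close(r)
            part = integrate(w / workers, (w + 1) / workers)
            os.write(wfd, struct.pack("d", part))
            os._exit(0)
        os.close(wfd)
        pipes.append(r)
        pids.append(pid)
    total = 0.0
    for r, pid in zip(pipes, pids):
        total += struct.unpack("d", os.read(r, 8))[0]
        os.close(r)
        os.waitpid(pid, 0)
    return total

print(parallel_integrate())           # ~1/3 for the x^2 placeholder
```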
strates Investigation of actinomycetales occurring in the marine environment Concurrent related mycological research program Systematics of pelagic fungi Biology and ecology of marine yeasts Concurrent bacteriological research programs
Michalowski, Martin; Wilk, Szymon; Tan, Xing; Michalowski, Wojtek
2014-01-01
Clinical practice guidelines (CPGs) implement evidence-based medicine designed to help generate a therapy for a patient suffering from a single disease. When applied to a comorbid patient, the concurrent combination of treatment steps from multiple CPGs is susceptible to adverse interactions in the resulting combined therapy (i.e., a therapy established according to all considered CPGs). This inability to concurrently apply CPGs has been shown to be one of the key shortcomings of CPG uptake in a clinical setting [1]. Several research efforts are underway to address this issue, such as the K4CARE [2] and GuideLine INteraction Detection Assistant (GLINDA) [3] projects and our previous research on applying constraint logic programming to developing a consistent combined therapy for a comorbid patient [4]. However, there is no generalized framework for mitigation that effectively captures general characteristics of the problem while handling nuances such as time and ordering requirements imposed by specific CPGs. In this paper we propose a first-order logic-based (FOL) approach for developing a generalized framework of mitigation. This approach uses a meta-algorithm and entailment properties to mitigate (i.e., identify and address) adverse interactions introduced by concurrently applied CPGs. We use an illustrative case study of a patient suffering from type 2 diabetes being treated for an onset of severe rheumatoid arthritis to show the expressiveness and robustness of our proposed FOL-based approach, and we discuss its appropriateness as the basis for the generalized theory.
Concurrent simulation of a parallel jaw end effector
NASA Technical Reports Server (NTRS)
Bynum, Bill
1985-01-01
A system of programs developed to aid in the design and development of the command/response protocol between a parallel jaw end effector and the strategic planner program controlling it is presented. The system executes concurrently with the LISP controlling program to generate a graphical image of the end effector that moves in approximately real time in response to commands sent from the controlling program. Concurrent execution of the simulation program is useful for revealing flaws in the communication command structure arising from the asynchronous nature of the message traffic between the end effector and the strategic planner. Software simulation helps to minimize the number of hardware changes to the microprocessor driving the end effector that become necessary because of changes in the communication protocol. The simulation of other actuator devices can be easily incorporated into the system of programs by using the underlying support that was developed for the concurrent execution of the simulation process and the communication between it and the controlling program.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 20 Employees' Benefits 4 2013-04-01 2013-04-01 false May youth participate in both youth and adult... Concurrent Enrollment § 664.500 May youth participate in both youth and adult/dislocated worker programs concurrently? (a) Yes, under the Act, eligible youth are 14 through 21 years of age. Adults are defined in the...
Code of Federal Regulations, 2014 CFR
2014-04-01
... 20 Employees' Benefits 4 2014-04-01 2014-04-01 false May youth participate in both youth and adult... Concurrent Enrollment § 664.500 May youth participate in both youth and adult/dislocated worker programs concurrently? (a) Yes, under the Act, eligible youth are 14 through 21 years of age. Adults are defined in the...
Code of Federal Regulations, 2012 CFR
2012-04-01
... 20 Employees' Benefits 4 2012-04-01 2012-04-01 false May youth participate in both youth and adult... Concurrent Enrollment § 664.500 May youth participate in both youth and adult/dislocated worker programs concurrently? (a) Yes, under the Act, eligible youth are 14 through 21 years of age. Adults are defined in the...
CONCURRENT WORK-EDUCATION (PROGRAMS IN THE 50 STATES 1965-66). INITIAL DRAFT.
ERIC Educational Resources Information Center
SCHILL, WILLIAM JOHN
A DESCRIPTIVE REPORT OF THE CONDUCT OR STATUS OF CONCURRENT WORK-EDUCATION PROGRAMS IN EACH OF THE 50 STATES IS PRESENTED. DATA ARE REPORTED FOR TWO DISTINCT PROGRAMS--(1) COOPERATIVE EDUCATION, A PROGRAM IN WHICH THE STUDENTS WORK PART-TIME AND STUDY IN A FORMAL CLASSROOM SETTING PART-TIME, AND (2) WORK-STUDY, A PROGRAM IN WHICH STUDENTS IN…
Optics simulations: a Python workshop
NASA Astrophysics Data System (ADS)
Ghalila, H.; Ammar, A.; Varadharajan, S.; Majdi, Y.; Zghal, M.; Lahmar, S.; Lakshminarayanan, V.
2017-08-01
Numerical simulations allow teachers and students to indirectly perform sophisticated experiments that would not otherwise be realizable due to cost and other constraints. During the past few decades there has been an explosion in the development of numerical tools, concurrently with open source environments such as the Python software ecosystem. This availability of open source software offers an incredible opportunity for advancing teaching methodologies as well as research. More specifically, it is possible to correlate theoretical knowledge with experimental measurements using "virtual" experiments. We have been working on the development of numerical simulation tools using the Python program package, and we have concentrated on geometric and physical optics simulations. The advantage of doing hands-on numerical experiments is that it allows the student learner to be an active participant in the pedagogical/learning process rather than playing a passive role as in the traditional lecture format. Even in laboratory classes, because of constraints of space, lack of equipment, and often-large numbers of students, many students play a passive role since they work in groups of 3 or more students. Furthermore, these new tools help students get a handle on numerical methods as well as simulations and impart a "feel" for the physics under investigation.
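A minimal geometric-optics example of the kind such a Python workshop might use is thin-lens imaging, 1/f = 1/d_o + 1/d_i with magnification m = -d_i/d_o; the sketch below is our own illustration, not material from the workshop itself.

```python
def thin_lens_image(focal_length_mm, object_distance_mm):
    # Thin-lens equation: 1/f = 1/d_o + 1/d_i; returns image distance and
    # transverse magnification (None, None if the image is at infinity).
    if object_distance_mm == focal_length_mm:
        return None, None
    d_i = 1.0 / (1.0 / focal_length_mm - 1.0 / object_distance_mm)
    magnification = -d_i / object_distance_mm
    return d_i, magnification

for d_o in (1000, 200, 75):
    d_i, m = thin_lens_image(50, d_o)
    print(f"object at {d_o} mm -> image at {d_i:.1f} mm, magnification {m:.3f}")
```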
Lessons from a Concurrent Evaluation of Eight Antibullying Programs Used in Sweden
ERIC Educational Resources Information Center
Flygare, Erik; Gill, Peter Edward; Johansson, Bjorn
2013-01-01
Sweden has a low prevalence of bullying and Swedish schools are legally obliged to have anti-bullying policies. Many commercial programs are available. A mixed methods, quasi-experimental, concurrent evaluation of 8 programs, chosen from a pool of 21 widely used anti-bullying programs, was planned. Preliminary data, based on 835 stakeholder…
Toward a Multidimensional Perspective on Teacher-Coach Role Conflict
ERIC Educational Resources Information Center
Richards, K. Andrew R.; Templin, Thomas J.
2012-01-01
Research grounded in role theory and occupational socialization theory points to the potential consequences of occupying the roles of physical education teacher and athletic coach concurrently. Specifically, time constraints and inconsistencies in role requirements, organization, rewards, and modes of accountability in teaching and coaching create…
Synchronization in Scratch: A Case Study with Education Science Students
ERIC Educational Resources Information Center
Nikolos, Dimitris; Komis, Vassilis
2015-01-01
The Scratch programming language is an introductory programming language for students. It is also a visual concurrent programming language, where multiple threads are executed simultaneously. Synchronization in concurrent languages is a complex task for novices to understand. Our research is focused on strategies and methods applied by novice…
How Formal Dynamic Verification Tools Facilitate Novel Concurrency Visualizations
NASA Astrophysics Data System (ADS)
Aananthakrishnan, Sriram; Delisi, Michael; Vakkalanka, Sarvani; Vo, Anh; Gopalakrishnan, Ganesh; Kirby, Robert M.; Thakur, Rajeev
With the exploding scale of concurrency, presenting valuable pieces of information collected by formal verification tools intuitively and graphically can greatly enhance concurrent system debugging. Traditional MPI program debuggers present trace views of MPI program executions. Such views are redundant, often containing equivalent traces that permute independent MPI calls. In our ISP formal dynamic verifier for MPI programs, we present a collection of alternate views made possible by the use of formal dynamic verification. Some of ISP's views help pinpoint errors, some facilitate discerning errors by eliminating redundancy, while others help understand the program better by displaying concurrent event orderings that must be respected by all MPI implementations, in the form of completes-before graphs. In this paper, we describe ISP's graphical user interface (GUI) capabilities in all these areas, which are currently supported by a portable Java-based GUI, a Microsoft Visual Studio GUI, and an Eclipse-based GUI whose development is in progress.
Legal Implications of Physical Examinations
Felton, Jean Spencer
1978-01-01
With the new national emphasis on the prevention of occupationally incurred disease, legislative constraints have been placed in connection with the medical examination of employed persons at health risk. Concurrently, there is mandated a system of communication to the worker of the significant clinical findings encountered on his physical and laboratory inventories. PMID:636417
Examining Stress in Graduate Assistants: Combining Qualitative and Quantitative Survey Methods
ERIC Educational Resources Information Center
Mazzola, Joseph J.; Walker, Erin J.; Shockley, Kristen M.; Spector, Paul E.
2011-01-01
The aim of this study was to employ qualitative and quantitative survey methods in a concurrent mixed model design to assess stressors and strains in graduate assistants. The stressors most frequently reported qualitatively were work overload, interpersonal conflict, and organizational constraints; the most frequently reported psychological…
Programming Models for Concurrency and Real-Time
NASA Astrophysics Data System (ADS)
Vitek, Jan
Modern real-time applications are increasingly large, complex and concurrent systems which must meet stringent performance and predictability requirements. Programming those systems requires fundamental advances in programming languages and runtime systems. This talk presents our work on Flexotasks, a programming model for concurrent, real-time systems inspired by stream-processing and concurrent active objects. Among the key innovations in Flexotasks is support for both real-time garbage collection and region-based memory with an ownership type system for static safety. Communication between tasks is performed by channels with a linear type discipline to avoid copying messages, and by a non-blocking transactional memory facility. We have evaluated our model empirically with two distinct implementations, one based on Purdue's Ovm research virtual machine framework and the other on WebSphere, IBM's production real-time virtual machine. We have written a number of small programs, as well as a 30 KLOC avionics collision detector application. We show that Flexotasks are capable of executing periodic threads at 10 KHz with a standard deviation of 1.2 us and have performance competitive with hand-coded C programs.
Three Views on Concurrent Enrollment. Feature on Research and Leadership. Vol. 1, No. 2
ERIC Educational Resources Information Center
Scheffel, Kent
2016-01-01
In this brief, Kent Scheffel offers a unique combination of expertise on dual credit and concurrent enrollment as he reviews questions of quality, program accreditation, and education policy for concurrent enrollment offerings from a national (National Alliance of Concurrent Enrollment Partnerships (NACEP), local (Lewis and Clark Community…
Lessons learned while implementing an HIV/AIDS care and treatment program in rural Mozambique.
Moon, Troy D; Burlison, Janeen R; Sidat, Mohsin; Pires, Paulo; Silva, Wilson; Solis, Manuel; Rocha, Michele; Arregui, Chiqui; Manders, Eric J; Vergara, Alfredo E; Vermund, Sten H
2010-04-23
Mozambique has severe resource constraints, yet with international partnerships, the nation has placed over 145,000 HIV-infected persons on antiretroviral therapies (ART) through May-2009. HIV clinical services are provided at > 215 clinical venues in all 11 of Mozambique's provinces. Friends in Global Health (FGH) , affiliated with Vanderbilt University in the United States (US), is a locally licensed non-governmental organization (NGO) working exclusively in small city and rural venues in Zambézia Province whose population reaches approximately 4 million persons. Our approach to clinical capacity building is based on: 1) technical assistance to national health system facilities to implement ART clinical services at the district level, 2) human capacity development, and 3) health system strengthening. Challenges in this setting are daunting, including: 1) human resource constraints, 2) infrastructure limitations, 3) centralized care for large populations spread out over large distances, 4) continued high social stigma related to HIV, 5) limited livelihood options in rural areas and 6) limited educational opportunities in rural areas. Sustainability in rural Mozambique will depend on transitioning services from emergency foreign partners to local authorities and continued funding. It will also require "wrap-around" programs that help build economic capacity with agricultural, educational, and commercial initiatives. Sustainability is undermined by serious health manpower and infrastructure limitations. Recent U.S. government pronouncements suggest that the U.S. President's Emergency Plan for AIDS Relief will support concurrent community and business development. FGH, with its Mozambican government counterparts, see the evolution of an emergency response to a sustainable chronic disease management program as an essential and logical step. We have presented six key challenges that are essential to address in rural Mozambique.
Visualization of Concurrent Program Executions
NASA Technical Reports Server (NTRS)
Artho, Cyrille; Havelund, Klaus; Honiden, Shinichi
2007-01-01
Various program analysis techniques are efficient at discovering failures and properties. However, it is often difficult to evaluate results, such as program traces. This calls for abstraction and visualization tools. We propose an approach based on UML sequence diagrams, addressing shortcomings of such diagrams for concurrency. The resulting visualization is expressive and provides all the necessary information at a glance.
ERIC Educational Resources Information Center
Aucoin, Jennifer Mangrum
2013-01-01
The purpose of this mixed methods concurrent triangulation study was to examine the program evaluation practices of high school counselors. A total of 294 high school counselors in Texas were assessed using a mixed methods concurrent triangulation design. A researcher-developed survey, the School Counseling Program Evaluation Questionnaire…
Model-based control strategies for systems with constraints of the program type
NASA Astrophysics Data System (ADS)
Jarzębowska, Elżbieta
2006-08-01
The paper presents a model-based tracking control strategy for constrained mechanical systems. The constraints we consider can be material constraints or non-material ones referred to as program constraints. The program constraint equations represent tasks imposed on system motions; they can be differential equations of order higher than one or two, and they can be non-integrable. The tracking control strategy relies upon two dynamic models: a reference model, which is a dynamic model of a system with arbitrary-order differential constraints, and a dynamic control model. The reference model serves as a motion planner, which generates inputs to the dynamic control model. It is based upon the generalized program motion equations (GPME) method. The method enables material and program constraints to be combined and merged into the motion equations. Lagrange's equations with multipliers are a special case of the GPME, since they apply to systems with first-order constraints. Our tracking strategy, referred to as a model reference program motion tracking control strategy, enables tracking of any program motion predefined by the program constraints. It extends "trajectory tracking" to "program motion tracking". We also demonstrate that our tracking strategy can be extended to hybrid program motion/force tracking.
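As a rough illustration of the distinction drawn above (our notation, not the paper's GPME derivation), the classical first-order constrained case cited as a special case can be written as follows, whereas a program constraint may be of arbitrary order and non-integrable.

```latex
% Constrained equations of motion with multipliers (first-order constraints),
% the special case of the GPME mentioned above; notation is ours.
\begin{aligned}
  M(q)\,\ddot{q} + C(q,\dot{q})\,\dot{q} + g(q) &= Q + J^{\mathsf T}(q)\,\lambda,\\
  J(q)\,\dot{q} &= 0,
\end{aligned}
\qquad\text{vs.}\qquad
G\bigl(t,\, q,\, \dot{q},\, \ldots,\, q^{(p)}\bigr) = 0,\quad p \ge 1 .
```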
NASA Astrophysics Data System (ADS)
Robertson, Randolph B.
This study investigates the impact of concurrent design on the cost growth and schedule growth of US Department of Defense Major Defense Acquisition Systems (MDAPs). It is motivated by the question of whether employment of concurrent design in the development of a major weapon system will produce better results in terms of cost and schedule than traditional serial development methods. Selected Acquisition Reports were used to determine the cost and schedule growth of MDAPs as well as the degree of concurrency employed. Two simple linear regression analyses were used to determine the degree to which cost growth and schedule growth vary with concurrency. The results were somewhat surprising in that for major weapon systems the utilization of concurrency as it was implemented in the programs under study was shown to have no effect on cost performance, and that performance to development schedule, one of the purported benefits of concurrency, was actually shown to deteriorate with increases in concurrency. These results, while not an indictment of the concept of concurrency, indicate that better practices and methods are needed in the implementation of concurrency in major weapon systems. The findings are instructive to stakeholders in the weapons acquisition process in their consideration of whether and how to employ concurrent design strategies in their planning of new weapons acquisition programs.
Koorts, Harriet; Gillison, Fiona
2015-11-06
Communities are a pivotal setting in which to promote increases in child and adolescent physical activity behaviours. Interventions implemented in these settings require effective evaluation to facilitate translation of findings to wider settings. The aims of this paper are to (i) present findings from a RE-AIM evaluation of a community-based physical activity program, and (ii) review the methodological challenges faced when applying RE-AIM in practice. A single mixed-methods case study was conducted based on a concurrent triangulation design. Five sources of data were collected via interviews, questionnaires, archival records, documentation and field notes. Evidence was triangulated within RE-AIM to assess individual- and organisational-level program outcomes. Inconsistent availability of data and a lack of robust reporting challenged assessment of all five dimensions. Reach, Implementation and setting-level Adoption were less successful; Effectiveness and Maintenance at an individual and organisational level were moderately successful. Only community-level Adoption was highly successful, reflecting the key program goal of providing community-wide participation in sport and physical activity. This research highlighted important methodological constraints associated with the use of RE-AIM in practice settings. Future evaluators wishing to use RE-AIM may benefit from a mixed-method triangulation approach to offset challenges with data availability and reliability.
A concurrent distributed system for aircraft tactical decision generation
NASA Technical Reports Server (NTRS)
Mcmanus, John W.
1990-01-01
A research program investigating the use of AI techniques to aid in the development of a tactical decision generator (TDG) for within visual range (WVR) air combat engagements is discussed. The application of AI programming and problem-solving methods in the development and implementation of a concurrent version of the Computerized Logic for Air-to-Air Warfare Simulations (CLAWS) program, a second-generation TDG, is presented. Concurrent computing environments and programming approaches are discussed, and the design and performance of a prototype concurrent TDG system (Cube CLAWS) are presented. It is concluded that the Cube CLAWS has provided a useful testbed to evaluate the development of a distributed blackboard system. The project has shown that the complexity of developing specialized software on a distributed, message-passing architecture such as the Hypercube is not overwhelming, and that reasonable speedups and processor efficiency can be achieved by a distributed blackboard system. The project has also highlighted some of the costs of using a distributed approach to designing a blackboard system.
Steele, Ann; Karmiloff-Smith, Annette; Cornish, Kim; Scerif, Gaia
2012-11-01
Attention is construed as multicomponential, but the roles of its distinct subfunctions in shaping the broader developing cognitive landscape are poorly understood. The current study assessed 3- to 6-year-olds (N=83) to: (a) trace developmental trajectories of attentional processes and their structure in early childhood and (b) measure the impact of distinct attention subfunctions on concurrent and longitudinal abilities related to literacy and numeracy. Distinct trajectories across attention measures revealed the emergence of 2 attentional factors, encompassing "executive" and "sustained-selective" processes. Executive attention predicted concurrent abilities across domains at Time 1, whereas sustained-selective attention predicted basic numeracy 1 year later. These concurrent and longitudinal constraints cast a broader light on the unfolding relations between domain-general and domain-specific processes over early childhood. © 2012 The Authors. Child Development © 2012 Society for Research in Child Development, Inc.
Integrated System-Level Optimization for Concurrent Engineering With Parametric Subsystem Modeling
NASA Technical Reports Server (NTRS)
Schuman, Todd; DeWeck, Oliver L.; Sobieski, Jaroslaw
2005-01-01
The introduction of concurrent design practices to the aerospace industry has greatly increased the productivity of engineers and teams during design sessions as demonstrated by JPL's Team X. Simultaneously, advances in computing power have given rise to a host of potent numerical optimization methods capable of solving complex multidisciplinary optimization problems containing hundreds of variables, constraints, and governing equations. Unfortunately, such methods are tedious to set up and require significant amounts of time and processor power to execute, thus making them unsuitable for rapid concurrent engineering use. This paper proposes a framework for Integration of System-Level Optimization with Concurrent Engineering (ISLOCE). It uses parametric neural-network approximations of the subsystem models. These approximations are then linked to a system-level optimizer that is capable of reaching a solution quickly due to the reduced complexity of the approximations. The integration structure is described in detail and applied to the multiobjective design of a simplified Space Shuttle external fuel tank model. Further, a comparison is made between the new framework and traditional concurrent engineering (without system optimization) through an experimental trial with two groups of engineers. Each method is evaluated in terms of optimizer accuracy, time to solution, and ease of use. The results suggest that system-level optimization, running as a background process during integrated concurrent engineering sessions, is potentially advantageous as long as it is judiciously implemented.
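A generic illustration of the parametric-surrogate idea described above is sketched below: an expensive subsystem model is replaced with a quick neural-network approximation, and a system-level optimizer then searches over the cheap surrogate. The "subsystem" function, its variables, and their bounds are made-up stand-ins, not the paper's external-tank model.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from scipy.optimize import minimize

def subsystem_mass(x):                        # hypothetical subsystem analysis
    radius, thickness = x
    return radius**2 * thickness * 50.0 + 3.0 / (radius * thickness)

# Sample the expensive model once, offline, to train the surrogate.
rng = np.random.default_rng(0)
X = rng.uniform([0.5, 0.01], [2.0, 0.10], size=(400, 2))
y = np.array([subsystem_mass(x) for x in X])

surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                         random_state=0).fit(X, y)

# System-level optimization runs against the fast surrogate, not the model.
result = minimize(lambda x: float(surrogate.predict([x])[0]),
                  x0=[1.0, 0.05],
                  bounds=[(0.5, 2.0), (0.01, 0.10)])
print("surrogate optimum:", result.x, "predicted mass:", result.fun)
```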
Multiprocessor smalltalk: Implementation, performance, and analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pallas, J.I.
1990-01-01
Multiprocessor Smalltalk demonstrates the value of object-oriented programming on a multiprocessor. Its implementation and analysis shed light on three areas: concurrent programming in an object-oriented language without special extensions, implementation techniques for adapting to multiprocessors, and performance factors in the resulting system. Adding parallelism to Smalltalk code is easy, because programs already use control abstractions like iterators. Smalltalk's basic control and concurrency primitives (lambda expressions, processes, and semaphores) can be used to build parallel control abstractions, including parallel iterators, parallel objects, atomic objects, and futures. Language extensions for concurrency are not required. This implementation demonstrates that it is possible to build an efficient parallel object-oriented programming system and illustrates techniques for doing so. Three modification tools (serialization, replication, and reorganization) adapted the Berkeley Smalltalk interpreter to the Firefly multiprocessor. Multiprocessor Smalltalk's performance shows that the combination of multiprocessing and object-oriented programming can be effective: speedups (relative to the original serial version) exceed 2.0 for five processors on all the benchmarks; the median efficiency is 48%. Analysis shows both where performance is lost and how to improve and generalize the experimental results. Changes in the interpreter to support concurrency add at most 12% overhead; better access to per-process variables could eliminate much of that. Changes in the user code to express concurrency add as much as 70% overhead; this overhead could be reduced to 54% if blocks (lambda expressions) were reentrant. Performance is also lost when the program cannot keep all five processors busy.
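As a rough analogue (in Python rather than Smalltalk, so only the spirit carries over), the sketch below builds the two abstractions the abstract mentions, a future and a parallel iterator, from nothing more than threads and a semaphore.

```python
# Python analogue of building parallel control abstractions from threads and semaphores.
import threading

class Future:
    """Placeholder for a result computed concurrently; block on .value() until ready."""
    def __init__(self, fn, *args):
        self._ready = threading.Semaphore(0)
        self._result = None
        threading.Thread(target=self._run, args=(fn, args)).start()

    def _run(self, fn, args):
        self._result = fn(*args)
        self._ready.release()

    def value(self):
        self._ready.acquire()
        self._ready.release()      # allow repeated reads once the result exists
        return self._result


def parallel_map(fn, items):
    """Parallel iterator: start one concurrent evaluation per item, then collect results."""
    return [f.value() for f in [Future(fn, x) for x in items]]


if __name__ == "__main__":
    import math
    # Note: CPython's GIL limits true CPU parallelism here; the point is the abstraction,
    # not the speedup the Smalltalk/Firefly system measured.
    print(parallel_map(lambda n: sum(math.sqrt(i) for i in range(n)),
                       [10_000, 20_000, 30_000]))
```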
Exploiting loop level parallelism in nonprocedural dataflow programs
NASA Technical Reports Server (NTRS)
Gokhale, Maya B.
1987-01-01
This report discusses how loop-level parallelism is detected in a nonprocedural dataflow program and how a procedural program with concurrent loops is scheduled. Also discussed is a program restructuring technique which may be applied to recursive equations so that concurrent loops may be generated for a seemingly iterative computation. A compiler which generates C code for the language described below has been implemented. The scheduling component of the compiler and the restructuring transformation are described.
NASA Technical Reports Server (NTRS)
Short, Nick, Jr.; Bedet, Jean-Jacques; Bodden, Lee; Boddy, Mark; White, Jim; Beane, John
1994-01-01
The Goddard Space Flight Center (GSFC) Distributed Active Archive Center (DAAC) has been operational since October 1, 1993. Its mission is to support the Earth Observing System (EOS) by providing rapid access to EOS data and analysis products, and to test Earth Observing System Data and Information System (EOSDIS) design concepts. One of the challenges is to ensure quick and easy retrieval of any data archived within the DAAC's Data Archive and Distributed System (DADS). Over the 15-year life of the EOS project, an estimated several petabytes (10^15 bytes) of data will be permanently stored. Accessing that amount of information is a formidable task that will require innovative approaches. As a precursor of the full EOS system, the GSFC DAAC, with a few Terabits of storage, has implemented a prototype of a constraint-based task and resource scheduler to improve the performance of the DADS. This Honeywell Task and Resource Scheduler (HTRS), developed by the Honeywell Technology Center in cooperation with the Information Science and Technology Branch/935, the Code X Operations Technology Program, and the GSFC DAAC, makes better use of limited resources, prevents backlog of data, and provides information about resource bottlenecks and performance characteristics. The prototype, which was developed concurrently with the GSFC Version 0 (V0) DADS, models DADS activities such as ingestion and distribution with priority, precedence, resource requirements (disk and network bandwidth), and temporal constraints. HTRS supports schedule updates, insertions, and retrieval of task information via an Application Program Interface (API). The prototype has demonstrated, with a few examples, the substantial advantages of using HTRS over scheduling algorithms such as a First In First Out (FIFO) queue. The kernel scheduling engine for HTRS, called Kronos, has been successfully applied to several other domains such as space shuttle mission scheduling, demand flow manufacturing, and avionics communications scheduling.
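The toy comparison below (not HTRS or Kronos, and with invented task data) illustrates the basic point of the abstract: when tasks compete for a limited resource such as network bandwidth, priority-aware packing of work into time slots can serve urgent requests sooner and waste less capacity than a plain FIFO queue.

```python
# Toy priority-vs-FIFO scheduling under a single resource constraint; data are made up.
from collections import namedtuple

Task = namedtuple("Task", "name priority bandwidth")   # lower priority value = more urgent
LINK_CAPACITY = 10                                      # assumed bandwidth units per time slot

def schedule(tasks, use_priority):
    """Pack tasks into successive time slots without exceeding LINK_CAPACITY per slot."""
    queue = sorted(tasks, key=lambda t: t.priority) if use_priority else list(tasks)
    slots, current, used = [], [], 0
    for task in queue:
        if used + task.bandwidth > LINK_CAPACITY:
            slots.append(current)
            current, used = [], 0
        current.append(task.name)
        used += task.bandwidth
    if current:
        slots.append(current)
    return slots

if __name__ == "__main__":
    arrivals = [Task("bulk_distribution", 5, 8), Task("urgent_ingest", 1, 4),
                Task("catalog_update", 3, 2), Task("user_order", 2, 6)]
    print("FIFO     :", schedule(arrivals, use_priority=False))
    print("priority :", schedule(arrivals, use_priority=True))
```

With these numbers the FIFO ordering needs three slots and delays the urgent ingest behind a bulk transfer, while the priority ordering finishes in two slots with the urgent work first, a miniature version of the advantage the prototype demonstrated over FIFO.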
Explanation Constraint Programming for Model-based Diagnosis of Engineered Systems
NASA Technical Reports Server (NTRS)
Narasimhan, Sriram; Brownston, Lee; Burrows, Daniel
2004-01-01
We can expect to see an increase in the deployment of unmanned air and land vehicles for autonomous exploration of space. In order to maintain autonomous control of such systems, it is essential to track the current state of the system. When the system includes safety-critical components, failures or faults in the system must be diagnosed as quickly as possible, and their effects compensated for so that control and safety are maintained under a variety of fault conditions. The Livingstone fault diagnosis and recovery kernel and its temporal extension L2 are examples of model-based reasoning engines for health management. Livingstone has been shown to be effective, it is in demand, and it is being further developed. It was part of the successful Remote Agent demonstration on Deep Space One in 1999. It has been and is being utilized by several projects involving groups from various NASA centers, including the In Situ Propellant Production (ISPP) simulation at Kennedy Space Center, the X-34 and X-37 experimental reusable launch vehicle missions, Techsat-21, and advanced life support projects. Model-based and consistency-based diagnostic systems like Livingstone work only with discrete and finite domain models. When quantitative and continuous behaviors are involved, these are abstracted to discrete form using some mapping. This mapping from the quantitative domain to the qualitative domain is sometimes very involved and requires the design of highly sophisticated and complex monitors. We propose a diagnostic methodology that deals directly with quantitative models and behaviors, thereby mitigating the need for these sophisticated mappings. Our work brings together ideas from model-based diagnosis systems like Livingstone and concurrent constraint programming concepts. The system uses explanations derived from the propagation of quantitative constraints to generate conflicts. Fast conflict generation algorithms are used to generate and maintain multiple candidates whose consistency can be tracked across multiple time steps.
A comparison of Heuristic method and Llewellyn’s rules for identification of redundant constraints
NASA Astrophysics Data System (ADS)
Estiningsih, Y.; Farikhin; Tjahjana, R. H.
2018-03-01
An important technique in linear programming is the modelling and solving of practical optimization problems. Redundant constraints are considered for their effects on general linear programming problems. Identifying and removing redundant constraints avoids the calculations associated with them when solving a linear programming problem. Many methods have been proposed by researchers for the identification of redundant constraints. This paper compares a heuristic method and Llewellyn's rules for the identification of redundant constraints.
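For intuition, the sketch below applies a standard optimization-based redundancy test (not Llewellyn's rules and not the heuristic compared in the paper): a constraint of A x <= b is redundant if maximizing its left-hand side over the remaining constraints cannot exceed its right-hand side. The three-constraint system is made up for illustration.

```python
# Optimization-based redundancy test on a small, invented linear system.
import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, 1.0],
              [2.0, 1.0],
              [3.0, 2.0]])      # the third row is implied by the first two
b = np.array([4.0, 5.0, 9.5])

def is_redundant(i, A, b):
    keep = [k for k in range(len(b)) if k != i]
    # linprog minimizes, so maximize A[i] . x by negating the objective;
    # variables are nonnegative under linprog's default bounds.
    res = linprog(-A[i], A_ub=A[keep], b_ub=b[keep])
    return res.success and -res.fun <= b[i] + 1e-9

for i in range(len(b)):
    print(f"constraint {i}: redundant = {is_redundant(i, A, b)}")
```

Here the test flags only the third constraint, since the largest value 3x + 2y can reach under the other two constraints is 9, below the limit of 9.5.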
The Enhancement of Concurrent Processing through Functional Programming Languages.
1984-06-01
We assert that functional programming languages allow us to harness the processing power of computers with hundreds or even thousands of processors, and that they may offer the best way to turn imperative programs into functional ones which are well suited to concurrent processing.
ERIC Educational Resources Information Center
Urcelay, Gonzalo P.; Lipatova, Olga; Miller, Ralph R.
2009-01-01
Three Pavlovian fear conditioning experiments with rats as subjects explored the effect of extinction in the presence of a concurrent excitor. Our aim was to explore this particular treatment, documented in previous studies to deepen extinction, with novel control groups to shed light on the processes involved in extinction. Relative to subjects…
Exploring Efficacy in Negotiating Support: Women Re-Entry Students in Higher Education
ERIC Educational Resources Information Center
Filipponi-Berardinelli, Josephine Oriana
2013-01-01
The existing literature on women re-entry students reveals that women students concurrently struggle with family, work, and sometimes health issues. Women students often do not receive adequate support from their partners or from other sources in helping manage the multiple roles that compete for their time, and often face constraints that affect…
Development of a multistage compliant mechanism with new boundary constraint
NASA Astrophysics Data System (ADS)
Ling, Mingxiang; Cao, Junyi; Jiang, Zhou; Li, Qisheng
2018-01-01
This paper presents a piezo-actuated compliant mechanism with a new boundary constraint to provide a large workspace and a high dynamic frequency concurrently for precision positioning or other flexible manipulation applications. A two-stage rhombus-type displacement amplifier with the "sliding-sliding" boundary constraint is presented to maximize the dynamic frequency while retaining a large output displacement. The vibration mode is also improved by the designed boundary constraint. A theoretical kinematic model of the compliant mechanism is established to optimize the geometric parameters, and a prototype is fabricated with a compact dimension of 60 mm × 60 mm × 12 mm. The experimental testing shows that the maximum stroke is approximately 0.6 mm and the output stiffness is 1.1 N/μm, with a fundamental frequency larger than 2.2 kHz. Lastly, the excellent performance of the presented compliant mechanism is compared with several mechanisms in the previous literature. In conclusion, the presented boundary constraint strategy provides a new way to balance the trade-off between frequency response and stroke range that widely exists in compliant mechanisms.
Knowledge outcomes within rotational models of social work field education.
Birkenmaier, Julie; Curley, Jami; Rowan, Noell L
2012-01-01
This study assessed knowledge outcomes among concurrent, concurrent/sequential, and sequential rotation models of field instruction. Posttest knowledge scores of students (n = 231) in aging-related field education were higher for students who participated in the concurrent rotation model, and for those who completed field education at a long-term care facility. Scores were also higher for students in programs that infused a higher number of geriatric competencies in their curriculum. Recommendations are provided to programs considering rotation models of field education related to older adults.
Thirteen years and counting: Outcomes of a concurrent ASN/BSN enrollment program.
Heglund, Stephen; Simmons, Jessica; Wink, Diane; D'Meza Leuner, Jean
In their 2011 report, The Future of Nursing, the Institute of Medicine called for 80% of the nursing workforce to be comprised of baccalaureate-prepared Registered Nurses by the year 2020. One suggested approach to achieve this goal is the creation of programs that allow students to progress through associate and baccalaureate nursing preparation simultaneously. This paper describes the University of Central Florida's 13-year experience after implementing a Concurrent Enrollment Program. Development and structure of the program, advisement and curriculum details, and facilitators and barriers are described. Data on National Council Licensure Examination for Registered Nurses pass rates, completion rates, comparison with traditional RN-BSN students, and progression to graduate school are also included. The Concurrent Program model described here, a partnership between a specific university and state college partners, demonstrated positive outcomes that support achievement of the Institute of Medicine's goals. Copyright © 2017 Elsevier Inc. All rights reserved.
History of Concurrency. The Controversy of Military Acquisition Program Schedule Compression
1986-09-01
Report sections include The Manhattan Project, Post-War Acquisition, The Lockheed Skunk Works, and the Era of Controversy. ... had been involved in the Manhattan Project (47:55). The concurrency concept and the innovative foundation of the Air Force Ballistic Missile Program ... intercontinental ballistic missile. To shorten the program's development time they drew upon lessons from the Manhattan Project and made three ...
A Model-Driven Approach to Teaching Concurrency
ERIC Educational Resources Information Center
Carro, Manuel; Herranz, Angel; Marino, Julio
2013-01-01
We present an undergraduate course on concurrent programming where formal models are used in different stages of the learning process. The main practical difference with other approaches lies in the fact that the ability to develop correct concurrent software relies on a systematic transformation of formal models of inter-process interaction (so…
ERIC Educational Resources Information Center
Cook, Amy L.; Wilczenski, Felicia L.; Vanderberg, Laura
2017-01-01
There have been significant advances in educational programming and postsecondary options targeting acquisition of self-determination skills among students with intellectual disability. This article provides a description of an inclusive concurrent enrollment (ICE) program at an urban public university and describes findings related to student…
Ethnic Studies, Policies, and Programs: A Response to Assembly Concurrent Resolution 71.
ERIC Educational Resources Information Center
Petersen, Allan; Cepeda, Rita
Assembly Concurrent Resolution 71 (ACR 71) requests California's three public segments of higher education to review those policies and programs that are aimed at ensuring that all graduates "possess an understanding and awareness of non-white ethnic groups" and to consider adopting necessary policies to ensure that goal. This report…
1997-12-01
Fracture Analysis of the F-5 15%-Spar Bolt (Dr. Devendra Kumar, SAALC/LD, CUNY-City College, New York, NY). A Simple, Multiversion Concurrency Control ... Air Force Office of Scientific Research, Bolling Air Force Base, DC, and San Antonio Air Logistics Center, August 1997.
Choi, Mi-Ri; Jeon, Sang-Wan; Yi, Eun-Surk
2018-04-01
The purpose of this study is to analyze the differences among hospitalized cancer patients in their perception of exercise and physical activity constraints based on their medical history. The study used a questionnaire survey as the measurement tool for 194 cancer patients (male or female, aged 20 or older) living in the Seoul metropolitan area (Seoul, Gyeonggi, Incheon). The collected data were analyzed using frequency analysis, exploratory factor analysis, reliability analysis, t-test, and one-way analysis of variance using the statistical program SPSS 18.0. The following results were obtained. First, there was no statistically significant difference between cancer stage and exercise recognition/physical activity constraint. Second, there was a significant difference between cancer stage and sociocultural constraint/facility constraint/program constraint. Third, there was a significant difference between cancer operation history and physical/socio-cultural/facility/program constraint. Fourth, there was a significant difference between cancer operation history and negative perception/facility/program constraint. Fifth, there was a significant difference between ancillary cancer treatment method and negative perception/facility/program constraint. Sixth, there was a significant difference between hospitalization period and positive perception/negative perception/physical constraint/cognitive constraint. In conclusion, this study will provide the information necessary to create a patient-centered healthcare service system by analyzing the exercise recognition of hospitalized cancer patients based on their medical history and by investigating the constraint factors that prevent patients from actually making efforts to exercise.
Optimization of Car Body under Constraints of Noise, Vibration, and Harshness (NVH), and Crash
NASA Technical Reports Server (NTRS)
Kodiyalam, Srinivas; Yang, Ren-Jye; Sobieszczanski-Sobieski, Jaroslaw (Editor)
2000-01-01
To be competitive on today's market, cars have to be as light as possible while meeting the Noise, Vibration, and Harshness (NVH) requirements and conforming to Government-mandated crash survival regulations. The latter are difficult to meet because they involve very compute-intensive, nonlinear analysis: e.g., the code RADIOSS, capable of simulating the dynamics and the geometrical and material nonlinearities of a thin-walled car structure in crash, would require over 12 days of elapsed time for a single design of a 390K elastic degrees-of-freedom model if executed on a single processor of the state-of-the-art SGI Origin2000 computer. Of course, in optimization that crash analysis would have to be invoked many times. Needless to say, that has rendered such optimization intractable until now. The car finite element model is shown. The advent of computers that comprise large numbers of concurrently operating processors has created a new environment wherein the above optimization, and other engineering problems heretofore regarded as intractable, may be solved. The procedure, shown, is a piecewise-approximation-based method and involves using a sensitivity-based Taylor series approximation model for NVH and a polynomial response surface model for crash. In that method the NVH constraints are evaluated using a finite element code (MSC/NASTRAN) that yields the constraint values and their derivatives with respect to design variables. The crash constraints are evaluated using the explicit code RADIOSS on the Origin 2000, operating on 256 processors simultaneously, to generate data for a polynomial response surface in the design variable domain. The NVH constraints and their derivatives, combined with the response surface for the crash constraints, form an approximation to the system analysis (surrogate analysis) that enables a cycle of multidisciplinary optimization within move limits. In the inner loop, the NVH sensitivities are recomputed to update the NVH approximation model while keeping the crash response surface constant. In every outer loop, the crash response surface approximation is updated, including a gradual increase in the order of the response surface and the response surface extension in the direction of the search. In this optimization task, the NVH discipline has 30 design variables while the crash discipline has 20 design variables. A subset of these design variables (10) are common to both the NVH and crash disciplines. In order to construct a linear response surface for the crash discipline constraints, a minimum of 21 design points would have to be analyzed using the RADIOSS code. On a single processor in the Origin 2000, that amount of computing would require over 9 months! In this work, these runs were carried out concurrently on the Origin 2000 using multiple processors, ranging from 8 to 16, for each crash (RADIOSS) analysis. Another figure shows the wall time required for a single RADIOSS analysis using a varying number of processors, and provides a comparison of two different common data placement procedures within the allotted memories for each analysis. The initial design is an infeasible design with NVH discipline static torsion constraint violations of over 10%. The final optimized design is a feasible design with a weight reduction of 15 kg compared to the initial design.
This work demonstrates how advanced methodology for optimization combined with the technology of concurrent processing enables applications that until now were out of reach because of very long time-to-solution.
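A one-variable caricature of the approximation strategy, heavily hedged: a Taylor model stands in for the NVH constraint, a linear response surface is fit to a few sampled "crash" evaluations, and the design is improved within move limits. All functions and numbers below are invented; the real study used MSC/NASTRAN and RADIOSS on hundreds of processors.

```python
# Toy move-limited optimization with a Taylor model and a crash response surface.
import numpy as np

def nvh_constraint(t):        # stand-in for a NASTRAN-style stiffness constraint; >= 0 is feasible
    return 5.0 * t - 4.0

def nvh_gradient(t):
    return 5.0

def crash_constraint(t):      # stand-in for an expensive RADIOSS crash metric; >= 0 is feasible
    return 1.2 - t + 0.05 * np.sin(10.0 * t)

def weight(t):                # objective: mass grows with panel thickness t
    return 10.0 * t

t, move_limit = 2.0, 0.4
for cycle in range(6):
    # Linear response surface for crash, fit to a few (simulated) expensive samples.
    samples = np.linspace(t - move_limit, t + move_limit, 3)
    slope, intercept = np.polyfit(samples, [crash_constraint(s) for s in samples], 1)
    # Taylor approximation of the NVH constraint about the current design.
    g0, dg = nvh_constraint(t), nvh_gradient(t)

    def violation(c):
        return max(0.0, -(g0 + dg * (c - t))) + max(0.0, -(slope * c + intercept))

    # Minimize weight over the move-limit box subject to both approximations;
    # if no candidate satisfies them, step toward feasibility instead.
    candidates = np.linspace(t - move_limit, t + move_limit, 401)
    feasible = [c for c in candidates if violation(c) == 0.0]
    t = min(feasible, key=weight) if feasible else min(candidates, key=violation)

print(f"final thickness {t:.3f}, weight {weight(t):.2f}, "
      f"NVH {nvh_constraint(t):.3f}, crash {crash_constraint(t):.3f}")
```

Starting from an infeasible point, the toy loop walks within its move limits toward a feasible, lighter design, which is the qualitative behavior the abstract reports at a vastly larger scale.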
Yong, Paul J; Sadownik, Leslie; Brotto, Lori A
2015-01-01
Little is known about women with concurrent diagnoses of deep dyspareunia and superficial dyspareunia. The aim of this study was to determine the prevalence, associations, and outcome of women with concurrent deep-superficial dyspareunia. This is a prospective study of a multidisciplinary vulvodynia program (n = 150; mean age 28.7 ± 6.4 years). Women with superficial dyspareunia due to provoked vestibulodynia were divided into two groups: those also having deep dyspareunia (i.e., concurrent deep-superficial dyspareunia) and those with only superficial dyspareunia due to provoked vestibulodynia. Demographics, dyspareunia-related factors, other pain conditions, and psychological variables at pretreatment were tested for an association with concurrent deep-superficial dyspareunia. Outcome in both groups was assessed to 6 months posttreatment. Level of dyspareunia pain (0-10) and Female Sexual Distress Scale were the main outcome measures. The prevalence of concurrent deep-superficial dyspareunia was 44% (66/150) among women with superficial dyspareunia due to provoked vestibulodynia. At pretreatment, on multiple logistic regression, concurrent deep-superficial dyspareunia was independently associated with a higher level of dyspareunia pain (odds ratio [OR] = 1.19 [1.01-1.39], P = 0.030), diagnosis of endometriosis (OR = 4.30 [1.16-15.90], P = 0.022), history of bladder problems (OR = 3.84 [1.37-10.76], P = 0.008), and more depression symptoms (OR = 1.07 [1.02-1.12], P = 0.007), with no difference in the Female Sexual Distress Scale. At 6 months posttreatment, women with concurrent deep-superficial dyspareunia improved in the level of dyspareunia pain and in the Female Sexual Distress Scale to the same degree as women with only superficial dyspareunia due to provoked vestibulodynia. Concurrent deep-superficial dyspareunia is reported by almost half of women in a multidisciplinary vulvodynia program. In women with provoked vestibulodynia, concurrent deep-superficial dyspareunia may be related to endometriosis or interstitial cystitis, and is associated with depression and more severe dyspareunia symptoms. Standardized multidisciplinary care is effective for women with concurrent dyspareunia. © 2014 International Society for Sexual Medicine.
Image-Processing Software For A Hypercube Computer
NASA Technical Reports Server (NTRS)
Lee, Meemong; Mazer, Alan S.; Groom, Steven L.; Williams, Winifred I.
1992-01-01
Concurrent Image Processing Executive (CIPE) is a software system intended for developing and using image-processing application programs in a concurrent computing environment. Designed to shield the programmer from the complexities of concurrent-system architecture, it provides an interactive image-processing environment for the end user. CIPE utilizes the architectural characteristics of a particular concurrent system to maximize efficiency while preserving architectural independence from the user and programmer. CIPE runs on a Mark-IIIfp 8-node hypercube computer and an associated SUN-4 host computer.
39 CFR 273.7 - Concurrence of Attorney General.
Code of Federal Regulations, 2010 CFR
2010-07-01
39 CFR, Postal Service, Program Fraud Civil Remedies Act, § 273.7 Concurrence of Attorney General. (a) The Attorney General is ... the Attorney General or his designee approves such action in a written statement which specifies: (1) ...
Compile-Time Schedulability Analysis of Communicating Concurrent Programs
2006-06-28
Processes synchronize via read and write operations on the FIFO channels; these operations have been implemented with the help of semaphores. The report's contents include sections on synchronous dataflow and Boolean dataflow, and figures covering systems described by concurrent programs and a synchronous dataflow model with its topology matrix and repetition vector.
ERIC Educational Resources Information Center
Weatherly, Jeffrey N.; Thompson, Bradley J.; Hodny, Marisa; Meier, Ellen
2009-01-01
In a simulated casino environment, 6 nonpathological women played concurrently available commercial slot machines programmed to pay out at different rates. Participants did not always demonstrate preferences for the higher paying machine. The data suggest that factors other than programmed or obtained rate of reinforcement may control gambling…
Comparing host and target environments for distributed Ada programs
NASA Technical Reports Server (NTRS)
Paulk, Mark C.
1986-01-01
The Ada programming language provides a means of specifying logical concurrency by using multitasking. Extending the Ada multitasking concurrency mechanism into a physically concurrent distributed environment which imposes its own requirements can lead to incompatibilities. These problems are discussed. Using distributed Ada for a target system may be appropriate, but when using the Ada language in a host environment, a multiprocessing model may be more suitable than retargeting an Ada compiler for the distributed environment. The tradeoffs between multitasking on distributed targets and multiprocessing on distributed hosts are discussed. Comparisons of the multitasking and multiprocessing models indicate different areas of application.
NASA Astrophysics Data System (ADS)
Long, Kai; Yuan, Philip F.; Xu, Shanqing; Xie, Yi Min
2018-04-01
Most studies on composites assume that the constituent phases have different values of stiffness. Little attention has been paid to the effect of constituent phases having distinct Poisson's ratios. This research focuses on a concurrent optimization method for simultaneously designing composite structures and materials with distinct Poisson's ratios. The proposed method aims to minimize the mean compliance of the macrostructure with a given mass of base materials. In contrast to the traditional interpolation of the stiffness matrix through numerical results, an interpolation scheme of the Young's modulus and Poisson's ratio using different parameters is adopted. The numerical results demonstrate that the Poisson effect plays a key role in reducing the mean compliance of the final design. An important contribution of the present study is that the proposed concurrent optimization method can automatically distribute base materials with distinct Poisson's ratios between the macrostructural and microstructural levels under a single constraint of the total mass.
Concurrent planning and execution for a walking robot
NASA Astrophysics Data System (ADS)
Simmons, Reid
1990-07-01
The Planetary Rover project is developing the Ambler, a novel legged robot, and an autonomous software system for walking the Ambler over rough terrain. As part of the project, we have developed a system that integrates perception, planning, and real-time control to navigate a single leg of the robot through complex obstacle courses. The system is integrated using the Task Control Architecture (TCA), a general-purpose set of utilities for building and controlling distributed mobile robot systems. The walking system, as originally implemented, utilized a sequential sense-plan-act control cycle. This report describes efforts to improve the performance of the system by concurrently planning and executing steps. Concurrency was achieved by modifying the existing sequential system to utilize TCA features such as resource management, monitors, temporal constraints, and hierarchical task trees. Performance was increased in excess of 30 percent with only a relatively modest effort to convert and test the system. The results lend support to the utility of using TCA to develop complex mobile robot systems.
Constraint Logic Programming approach to protein structure prediction.
Dal Palù, Alessandro; Dovier, Agostino; Fogolari, Federico
2004-11-30
The protein structure prediction problem is one of the most challenging problems in the biological sciences. Many approaches have been proposed using database information and/or simplified protein models. The protein structure prediction problem can be cast in the form of an optimization problem. Notwithstanding its importance, the problem has very seldom been tackled by Constraint Logic Programming, a declarative programming paradigm suitable for solving combinatorial optimization problems. Constraint Logic Programming techniques have been applied to the protein structure prediction problem on the face-centered cube lattice model. Molecular dynamics techniques, endowed with the notion of constraint, have also been exploited. Even using a very simplified model, Constraint Logic Programming on the face-centered cube lattice model allowed us to obtain acceptable results for a few small proteins. In this test implementation, their (known) secondary structure and the presence of disulfide bridges are used as constraints. Simplified structures obtained in this way have been converted to all-atom models with plausible structure. Results have been compared with a similar approach using a well-established technique such as molecular dynamics. The results obtained on small proteins show that Constraint Logic Programming techniques can be employed for studying simplified protein models, which can be converted into realistic all-atom models. The advantage of Constraint Logic Programming over other, much more explored methodologies resides in rapid software prototyping, in the easy way of encoding heuristics, and in exploiting all the advances made in this research area, e.g. in constraint propagation and its use for pruning the huge search space.
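To make the idea of constraint-driven pruning over lattice conformations concrete, here is a drastically simplified, hypothetical sketch on a 2D square lattice with an HP-style contact energy. It is far simpler than the paper's face-centered cubic models and encodes only the self-avoidance constraint, not secondary structure or disulfide bridges.

```python
# Toy constraint-pruned search over 2D lattice conformations (HP model sketch).
MOVES = {"U": (0, 1), "D": (0, -1), "L": (-1, 0), "R": (1, 0)}

def contact_energy(sequence, positions):
    """Energy = -1 for every non-bonded pair of adjacent hydrophobic ('H') residues."""
    index = {p: i for i, p in enumerate(positions)}
    energy = 0
    for (x, y), i in index.items():
        for dx, dy in MOVES.values():
            j = index.get((x + dx, y + dy))
            if j is not None and j > i + 1 and sequence[i] == sequence[j] == "H":
                energy -= 1
    return energy

def fold(sequence):
    """Enumerate self-avoiding walks; the self-avoidance constraint prunes each
    branch that revisits an occupied lattice site."""
    best = (1, None)            # energies are <= 0, so any complete walk replaces this

    def extend(positions):
        nonlocal best
        if len(positions) == len(sequence):
            best = min(best, (contact_energy(sequence, positions), list(positions)),
                       key=lambda item: item[0])
            return
        x, y = positions[-1]
        for dx, dy in MOVES.values():
            nxt = (x + dx, y + dy)
            if nxt in positions:    # constraint violated: prune this branch
                continue
            extend(positions + [nxt])

    extend([(0, 0), (1, 0)])        # fix the first bond to break symmetry
    return best

if __name__ == "__main__":
    energy, conformation = fold("HPHPPHHPH")
    print("best energy:", energy)
    print("conformation:", conformation)
```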
Liu, Jason B; Berian, Julia R; Ban, Kristen A; Liu, Yaoming; Cohen, Mark E; Angelos, Peter; Matthews, Jeffrey B; Hoyt, David B; Hall, Bruce L; Ko, Clifford Y
2017-09-01
To determine whether concurrently performed operations are associated with an increased risk for adverse events. Concurrent operations occur when a surgeon is simultaneously responsible for critical portions of 2 or more operations. How this practice affects patient outcomes is unknown. Using American College of Surgeons' National Surgical Quality Improvement Program data from 2014 to 2015, operations were considered concurrent if they overlapped by ≥60 minutes or in their entirety. Propensity-score-matched cohorts were constructed to compare death or serious morbidity (DSM), unplanned reoperation, and unplanned readmission in concurrent versus non-concurrent operations. Multilevel hierarchical regression was used to account for the clustered nature of the data while controlling for procedure and case mix. There were 1430 (32.3%) surgeons from 390 (77.7%) hospitals who performed 12,010 (2.3%) concurrent operations. Plastic surgery (n = 393 [13.7%]), otolaryngology (n = 470 [11.2%]), and neurosurgery (n = 2067 [8.4%]) were specialties with the highest proportion of concurrent operations. Spine procedures were the most frequent concurrent procedures overall (n = 2059/12,010 [17.1%]). Unadjusted rates of DSM (9.0% vs 7.1%; P < 0.001), reoperation (3.6% vs 2.7%; P < 0.001), and readmission (6.9% vs 5.1%; P < 0.001) were greater in the concurrent operation cohort versus the non-concurrent. After propensity score matching and risk-adjustment, there was no significant association of concurrence with DSM (odds ratio [OR] 1.08; 95% confidence interval [CI] 0.96-1.21), reoperation (OR 1.16; 95% CI 0.96-1.40), or readmission (OR 1.14; 95% CI 0.99-1.29). In these analyses, concurrent operations were not detected to increase the risk for adverse outcomes. These results do not lessen the need for further studies, continuous self-regulation and proactive disclosure to patients.
NASA Technical Reports Server (NTRS)
Gavert, Raymond B.
1990-01-01
Some experiences of NASA configuration management in providing concurrent engineering support to the Space Station Freedom program for the achievement of life cycle benefits and total quality are discussed. Three change decision experiences involving tracing requirements and automated information systems of the electrical power system are described. The potential benefits of concurrent engineering and total quality management include improved operational effectiveness, reduced logistics and support requirements, prevention of schedule slippages, and life cycle cost savings. It is shown how configuration management can influence the benefits attained through disciplined approaches and innovations that compel consideration of all the technical elements of engineering and quality factors that apply to the program development, transition to operations and in operations. Configuration management experiences involving the Space Station program's tiered management structure, the work package contractors, international partners, and the participating NASA centers are discussed.
Software fault tolerance for real-time avionics systems
NASA Technical Reports Server (NTRS)
Anderson, T.; Knight, J. C.
1983-01-01
Avionics systems have very high reliability requirements and are therefore prime candidates for the inclusion of fault tolerance techniques. In order to provide tolerance to software faults, some form of state restoration is usually advocated as a means of recovery. State restoration can be very expensive for systems which utilize concurrent processes. The concurrency present in most avionics systems and the further difficulties introduced by timing constraints imply that providing tolerance for software faults may be inordinately expensive or complex. A straightforward pragmatic approach to software fault tolerance which is believed to be applicable to many real-time avionics systems is proposed. A classification system for software errors is presented together with approaches to recovery and continued service for each error type.
Code of Federal Regulations, 2011 CFR
2011-04-01
20 CFR, Employees' Benefits, Enrollment, § 664.500 May youth participate in both youth and adult/dislocated worker programs concurrently? (a) Yes. Under the Act, eligible youth are 14 through 21 years of age. Adults are defined in the Act ...
Code of Federal Regulations, 2010 CFR
2010-04-01
20 CFR, Employees' Benefits, Enrollment, § 664.500 May youth participate in both youth and adult/dislocated worker programs concurrently? (a) Yes. Under the Act, eligible youth are 14 through 21 years of age. Adults are defined in the Act ...
ERIC Educational Resources Information Center
Smith, Andrea Christine
2010-01-01
Students at the National College of Natural Medicine (NCNM) are eligible to concurrently study both Western medicine, as reflected by the Doctor of Naturopathic Medicine (ND) program, and Eastern medicine, as exhibited by the Master of Science in Oriental Medicine (MSOM) degree program. The dual track is unique in that the dominant Western…
Scalable Replay with Partial-Order Dependencies for Message-Logging Fault Tolerance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lifflander, Jonathan; Meneses, Esteban; Menon, Harshita
2014-09-22
Deterministic replay of a parallel application is commonly used for discovering bugs or to recover from a hard fault with message-logging fault tolerance. For message passing programs, a major source of overhead during forward execution is recording the order in which messages are sent and received. During replay, this ordering must be used to deterministically reproduce the execution. Previous work in replay algorithms often makes minimal assumptions about the programming model and application in order to maintain generality. However, in many cases, only a partial order must be recorded due to determinism intrinsic in the code, ordering constraints imposed by the execution model, and events that are commutative (their relative execution order during replay does not need to be reproduced exactly). In this paper, we present a novel algebraic framework for reasoning about the minimum dependencies required to represent the partial order for different concurrent orderings and interleavings. By exploiting this theory, we improve on an existing scalable message-logging fault tolerance scheme. The improved scheme scales to 131,072 cores on an IBM BlueGene/P with up to 2x lower overhead than one that records a total order.
Constraints in Genetic Programming
NASA Technical Reports Server (NTRS)
Janikow, Cezary Z.
1996-01-01
Genetic programming refers to a class of genetic algorithms utilizing generic representation in the form of program trees. For a particular application, one needs to provide the set of functions, whose compositions determine the space of program structures being evolved, and the set of terminals, which determine the space of specific instances of those programs. The algorithm searches the space for the best program for a given problem, applying evolutionary mechanisms borrowed from nature. Genetic algorithms have shown great capabilities in approximately solving optimization problems which could not be approximated or solved with other methods. Genetic programming extends their capabilities to deal with a broader variety of problems. However, it also extends the size of the search space, which often becomes too large to be effectively searched even by evolutionary methods. Therefore, our objective is to utilize problem constraints, if such can be identified, to restrict this space. In this publication, we propose a generic constraint specification language, powerful enough for a broad class of problem constraints. This language has two elements: one reduces only the number of program instances, and the other reduces both the space of program structures and the space of their instances. With this language, we define the minimal set of complete constraints, and a set of operators guaranteeing offspring validity from valid parents. We also show that these operators are not less efficient than the standard genetic programming operators if one preprocesses the constraints; the necessary mechanisms are identified.
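The sketch below illustrates the general idea of constraining both tree generation and crossover so that offspring are valid by construction. The constraint used here is only a toy type system (numeric versus Boolean expressions), not the constraint specification language proposed in the publication.

```python
# Constrained tree generation and type-preserving crossover for a toy GP setup.
import random

# Each primitive: name -> (return type, argument types)
FUNCTIONS = {
    "add": ("num", ("num", "num")),
    "mul": ("num", ("num", "num")),
    "if":  ("num", ("bool", "num", "num")),
    "gt":  ("bool", ("num", "num")),
}
TERMINALS = {"num": ["x", "1", "2"], "bool": ["flag"]}

def grow(return_type, depth, rng):
    """Random tree whose root returns `return_type`; the constraint restricts
    which primitives may appear at each position."""
    choices = [f for f, (rt, _) in FUNCTIONS.items() if rt == return_type]
    if depth == 0 or rng.random() < 0.3:
        return rng.choice(TERMINALS[return_type])
    name = rng.choice(choices)
    return [name] + [grow(t, depth - 1, rng) for t in FUNCTIONS[name][1]]

def node_type(tree):
    if isinstance(tree, list):
        return FUNCTIONS[tree[0]][0]
    return "bool" if tree in TERMINALS["bool"] else "num"

def subtrees(tree, path=()):
    """Yield (path, subtree) pairs for every node in the tree."""
    yield path, tree
    if isinstance(tree, list):
        for i, child in enumerate(tree[1:], start=1):
            yield from subtrees(child, path + (i,))

def replace(tree, path, new):
    if not path:
        return new
    head, *rest = path
    copy = list(tree)
    copy[head] = replace(tree[head], tuple(rest), new)
    return copy

def crossover(a, b, rng):
    """Swap a subtree of `a` with a type-compatible subtree of `b`; offspring
    validity is guaranteed by the constraint, never by after-the-fact repair."""
    path, target = rng.choice(list(subtrees(a)))
    donors = [s for _, s in subtrees(b) if node_type(s) == node_type(target)]
    return replace(a, path, rng.choice(donors)) if donors else a

if __name__ == "__main__":
    rng = random.Random(0)
    parent_a = grow("num", 3, rng)
    parent_b = grow("num", 3, rng)
    print("A:", parent_a)
    print("B:", parent_b)
    print("child:", crossover(parent_a, parent_b, rng))
```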
NASA Technical Reports Server (NTRS)
Ivancic, William D.; Shalkhauser, Mary JO
1991-01-01
Emphasis is on a destination directed packet switching architecture for a 30/20 GHz frequency division multiplex access/time division multiplex (FDMA/TDM) geostationary satellite communication network. Critical subsystems and problem areas are identified and addressed. Efforts have concentrated heavily on the space segment; however, the ground segment was considered concurrently to ensure cost efficiency and realistic operational constraints.
Circuit-switch architecture for a 30/20-GHz FDMA/TDM geostationary satellite communications network
NASA Technical Reports Server (NTRS)
Ivancic, William D.
1992-01-01
A circuit switching architecture is described for a 30/20 GHz frequency division, multiple access uplink/time division multiplexed downlink (FDMA/TDM) geostationary satellite communications network. Critical subsystems and problem areas are identified and addressed. Work was concentrated primarily on the space segment; however, the ground segment was considered concurrently to ensure cost efficiency and realistic operational constraints.
NASA Technical Reports Server (NTRS)
Ivancic, William D.; Shalkhauser, Mary JO
1992-01-01
A destination-directed packet switching architecture for a 30/20-GHz frequency division multiple access/time division multiplexed (FDMA/TDM) geostationary satellite communications network is discussed. Critical subsystems and problem areas are identified and addressed. Efforts have concentrated heavily on the space segment; however, the ground segment has been considered concurrently to ensure cost efficiency and realistic operational constraints.
Wide Area Security Region Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Makarov, Yuri V.; Lu, Shuai; Guo, Xinxin
2010-03-31
This report develops innovative and efficient methodologies and practical procedures to determine the wide-area security region of a power system, which take into consideration all types of system constraints including thermal, voltage, voltage stability, transient and potentially oscillatory stability limits in the system. The approach expands the idea of transmission system nomograms to a multidimensional case, involving multiple system limits and parameters such as transmission path constraints, zonal generation or load, etc., considered concurrently. The security region boundary is represented using its piecewise approximation with the help of linear inequalities (so-called hyperplanes) in a multi-dimensional space, consisting of system parameters that are critical for security analyses. The goal of this approximation is to find a minimum set of hyperplanes that describe the boundary with a given accuracy. Methodologies are also developed to use the security hyperplanes, pre-calculated offline, to determine system security margins in real-time system operations, to identify weak elements in the system, and to calculate key contributing factors and sensitivities to determine the best system controls in real time and to assist in developing remedial actions and transmission system enhancements offline. A prototype program that automates the simulation procedures used to build the set of security hyperplanes has also been developed. The program makes it convenient to update the set of security hyperplanes necessitated by changes in system configurations. A prototype operational tool that uses the security hyperplanes to assess security margins and to calculate optimal control directions in real time has been built to demonstrate the project success. Numerical simulations have been conducted using the full-size Western Electricity Coordinating Council (WECC) system model, and they clearly demonstrated the feasibility and the effectiveness of the developed technology. Recommendations for future work have also been formulated.
Missed opportunities for concurrent HIV-STD testing in an academic emergency department.
Klein, Pamela W; Martin, Ian B K; Quinlivan, Evelyn B; Gay, Cynthia L; Leone, Peter A
2014-01-01
We evaluated emergency department (ED) provider adherence to guidelines for concurrent HIV-sexually transmitted disease (STD) testing within an expanded HIV testing program and assessed demographic and clinical factors associated with concurrent HIV-STD testing. We examined concurrent HIV-STD testing in a suburban academic ED with a targeted, expanded HIV testing program. Patients aged 18-64 years who were tested for syphilis, gonorrhea, or chlamydia in 2009 were evaluated for concurrent HIV testing. We analyzed demographic and clinical factors associated with concurrent HIV-STD testing using multivariate logistic regression with a robust variance estimator or, where applicable, exact logistic regression. Only 28.3% of patients tested for syphilis, 3.8% tested for gonorrhea, and 3.8% tested for chlamydia were concurrently tested for HIV during an ED visit. Concurrent HIV-syphilis testing was more likely among younger patients aged 25-34 years (adjusted odds ratio [AOR] = 0.36, 95% confidence interval [CI] 0.78, 2.10) and patients with STD-related chief complaints at triage (AOR=11.47, 95% CI 5.49, 25.06). Concurrent HIV-gonorrhea/chlamydia testing was more likely among men (gonorrhea: AOR=3.98, 95% CI 2.25, 7.02; chlamydia: AOR=3.25, 95% CI 1.80, 5.86) and less likely among patients with STD-related chief complaints at triage (gonorrhea: AOR=0.31, 95% CI 0.13, 0.82; chlamydia: AOR=0.21, 95% CI 0.09, 0.50). Concurrent HIV-STD testing in an academic ED remains low. Systematic interventions that remove the decision-making burden of ordering an HIV test from providers may increase HIV testing in this high-risk population of suspected STD patients.
Verified compilation of Concurrent Managed Languages
2017-11-01
... designs for compiler intermediate representations that facilitate mechanized proofs and verification; and (d) a realistic case study that combines these ideas to prove the correctness of a state-of-the-art concurrent garbage collector. Subject terms: program verification, compiler design. Even though concurrency is a pervasive part of modern software and hardware systems, it has often been ignored in safety-critical system designs.
Power-constrained supercomputing
NASA Astrophysics Data System (ADS)
Bailey, Peter E.
As we approach exascale systems, power is turning from an optimization goal to a critical operating constraint. With power bounds imposed by both stakeholders and the limitations of existing infrastructure, achieving practical exascale computing will therefore rely on optimizing performance subject to a power constraint. However, this requirement should not add to the burden of application developers; optimizing the runtime environment given restricted power will primarily be the job of high-performance system software. In this dissertation, we explore this area and develop new techniques that extract maximum performance subject to a particular power constraint. These techniques include a method to find theoretical optimal performance, a runtime system that shifts power in real time to improve performance, and a node-level prediction model for selecting power-efficient operating points. We use a linear programming (LP) formulation to optimize application schedules under various power constraints, where a schedule consists of a DVFS state and number of OpenMP threads for each section of computation between consecutive message passing events. We also provide a more flexible mixed integer-linear (ILP) formulation and show that the resulting schedules closely match schedules from the LP formulation. Across four applications, we use our LP-derived upper bounds to show that current approaches trail optimal, power-constrained performance by up to 41%. This demonstrates limitations of current systems, and our LP formulation provides future optimization approaches with a quantitative optimization target. We also introduce Conductor, a run-time system that intelligently distributes available power to nodes and cores to improve performance. The key techniques used are configuration space exploration and adaptive power balancing. Configuration exploration dynamically selects the optimal thread concurrency level and DVFS state subject to a hardware-enforced power bound. Adaptive power balancing efficiently predicts where critical paths are likely to occur and distributes power to those paths. Greater power, in turn, allows increased thread concurrency levels, CPU frequency/voltage, or both. We describe these techniques in detail and show that, compared to the state-of-the-art technique of using statically predetermined, per-node power caps, Conductor leads to a best-case performance improvement of up to 30%, and an average improvement of 19.1%. At the node level, an accurate power/performance model will aid in selecting the right configuration from a large set of available configurations. We present a novel approach to generate such a model offline using kernel clustering and multivariate linear regression. Our model requires only two iterations to select a configuration, which provides a significant advantage over exhaustive search-based strategies. We apply our model to predict power and performance for different applications using arbitrary configurations, and show that our model, when used with hardware frequency-limiting in a runtime system, selects configurations with significantly higher performance at a given power limit than those chosen by frequency-limiting alone. When applied to a set of 36 computational kernels from a range of applications, our model accurately predicts power and performance; our runtime system based on the model maintains 91% of optimal performance while meeting power constraints 88% of the time. 
When the runtime system violates a power constraint, it exceeds the constraint by only 6% in the average case, while simultaneously achieving 54% more performance than an oracle. Through the combination of the above contributions, we hope to provide guidance and inspiration to research practitioners working on runtime systems for power-constrained environments. We also hope this dissertation will draw attention to the need for software and runtime-controlled power management under power constraints at various levels, from the processor level to the cluster level.
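As a rough illustration of the kind of linear program described above (a toy model only, not the dissertation's formulation, which also covers thread concurrency and message-passing structure), the sketch below chooses a fractional mix of hypothetical DVFS states for each code section so total time is minimized while each section's average power stays under an assumed cap.

```python
# Toy LP: pick DVFS-state mixes per code section under a per-section power cap.
import numpy as np
from scipy.optimize import linprog

POWER = np.array([60.0, 80.0, 105.0])     # watts for each DVFS state (assumed)
SPEED = np.array([1.0, 1.4, 1.7])         # relative throughput of each state (assumed)
WORK = np.array([3.0, 5.0, 2.0])          # work units per code section (assumed)
POWER_CAP = 90.0                          # watts allowed per section (assumed)

n_sections, n_states = len(WORK), len(POWER)
n_vars = n_sections * n_states            # x[s, c] = fraction of section s run in state c

# Objective: total time = sum over s, c of x[s, c] * WORK[s] / SPEED[c]
c = np.array([WORK[s] / SPEED[k] for s in range(n_sections) for k in range(n_states)])

# Equality constraints: the fractions of each section sum to 1.
A_eq = np.zeros((n_sections, n_vars))
for s in range(n_sections):
    A_eq[s, s * n_states:(s + 1) * n_states] = 1.0
b_eq = np.ones(n_sections)

# Inequality constraints: each section's power-weighted mix stays under the cap.
A_ub = np.zeros((n_sections, n_vars))
for s in range(n_sections):
    A_ub[s, s * n_states:(s + 1) * n_states] = POWER
b_ub = np.full(n_sections, POWER_CAP)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=(0, 1))
schedule = res.x.reshape(n_sections, n_states)
print("state mix per section:\n", schedule.round(3))
print("total time:", round(res.fun, 3))
```

The solution mixes the fastest (highest-power) state with a cheaper one exactly as far as the cap allows, which is the same trade-off the dissertation's upper-bound schedules quantify.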
NASA Technical Reports Server (NTRS)
Jensen, E. Douglas
1988-01-01
Alpha is a new kind of operating system that is unique in two highly significant ways. First, it is decentralized, transparently providing reliable resource management across physically dispersed nodes, so that distributed applications programming can be done largely as though it were centralized. And second, it provides comprehensive, high-technology support for real-time system integration and operation, an application area which consists predominantly of aperiodic activities having critical time constraints such as deadlines. Alpha is extremely adaptable so that it can be easily optimized for a wide range of problem-specific functionality, performance, and cost. Alpha is the first systems effort of the Archons Project, and the prototype was created at Carnegie-Mellon University directly on modified Sun multiprocessor workstation hardware. It has been demonstrated with a real-time C2 (command and control) application. Continuing research is leading to a series of enhanced follow-ons to Alpha; these are portable but initially hosted on Concurrent's MASSCOMP line of multiprocessor products.
Object Oriented Modeling and Design
NASA Technical Reports Server (NTRS)
Shaykhian, Gholam Ali
2007-01-01
The Object Oriented Modeling and Design seminar is intended for software professionals and students; it covers the concepts and a language-independent graphical notation that can be used to analyze problem requirements and design a solution to the problem. The seminar discusses the three kinds of object-oriented models: class, state, and interaction. The class model represents the static structure of a system, the state model describes the aspects of a system that change over time as well as control behavior, and the interaction model describes how objects collaborate to achieve overall results. Existing knowledge of object-oriented programming may benefit the learning of modeling and good design. Specific expectations are: create a class model; read, recognize, and describe a class model; describe association and link; show abstract classes used with multiple inheritance; explain metadata, reification, and constraints; group classes into a package; read, recognize, and describe a state model; explain states and transitions; read, recognize, and describe an interaction model; explain use cases and use case relationships; show concurrency in an activity diagram; and describe object interactions in a sequence diagram.
Bruhn, Peter; Geyer-Schulz, Andreas
2002-01-01
In this paper, we introduce genetic programming over context-free languages with linear constraints for combinatorial optimization, apply this method to several variants of the multidimensional knapsack problem, and discuss its performance relative to Michalewicz's genetic algorithm with penalty functions. With respect to Michalewicz's approach, we demonstrate that genetic programming over context-free languages with linear constraints improves convergence. A final result is that genetic programming over context-free languages with linear constraints is ideally suited to modeling complementarities between items in a knapsack problem: The more complementarities in the problem, the stronger the performance in comparison to its competitors.
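For comparison purposes, the sketch below implements only the penalty-function baseline of the kind the paper measures against: a plain genetic algorithm for a small multidimensional knapsack in which infeasible bit strings are penalized rather than excluded by the grammar. The problem data and GA parameters are made up for illustration.

```python
# Penalty-based GA baseline for a tiny multidimensional knapsack (illustrative data).
import random

VALUES = [10, 13, 7, 8, 15, 9]
WEIGHTS = [[3, 4, 2, 3, 5, 2],     # one weight row per knapsack dimension
           [2, 3, 4, 1, 4, 3]]
CAPACITIES = [9, 8]
PENALTY = 20.0                      # penalty per unit of capacity overflow

def fitness(bits):
    value = sum(v for v, b in zip(VALUES, bits) if b)
    overflow = sum(max(0, sum(w * b for w, b in zip(row, bits)) - cap)
                   for row, cap in zip(WEIGHTS, CAPACITIES))
    return value - PENALTY * overflow

def evolve(pop_size=40, generations=60, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in VALUES] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                 # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(VALUES))        # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:                     # bit-flip mutation
                i = rng.randrange(len(VALUES))
                child[i] ^= 1
            children.append(child)
        pop = parents + children
    best = max(pop, key=fitness)
    return best, fitness(best)

if __name__ == "__main__":
    bits, fit = evolve()
    print("best selection:", bits, "penalized fitness:", fit)
```

In the approach the paper advocates, the grammar and its linear constraints would prevent infeasible individuals from being generated in the first place, removing the need for the penalty term entirely.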
NASA Technical Reports Server (NTRS)
Gilbertsen, Noreen D.; Belytschko, Ted
1990-01-01
The implementation of a nonlinear explicit program on a vectorized, concurrent computer with shared memory is described and studied. The conflict between vectorization and concurrency is described and some guidelines are given for optimal block sizes. Several example problems are summarized to illustrate the types of speed-ups which can be achieved by reprogramming as compared to compiler optimization.
Concurrent Programming Using Actors: Exploiting Large-Scale Parallelism,
1985-10-07
Artificial Intelligence Laboratory, Massachusetts Institute of Technology, 545 Technology Square, Cambridge, MA; G. Agha et al.
Finite elements and the method of conjugate gradients on a concurrent processor
NASA Technical Reports Server (NTRS)
Lyzenga, G. A.; Raefsky, A.; Hager, G. H.
1985-01-01
An algorithm for the iterative solution of finite element problems on a concurrent processor is presented. The method of conjugate gradients is used to solve the system of matrix equations, which is distributed among the processors of a MIMD computer according to an element-based spatial decomposition. This algorithm is implemented in a two-dimensional elastostatics program on the Caltech Hypercube concurrent processor. The results of tests on up to 32 processors show nearly linear concurrent speedup, with efficiencies over 90 percent for sufficiently large problems.
Finite elements and the method of conjugate gradients on a concurrent processor
NASA Technical Reports Server (NTRS)
Lyzenga, G. A.; Raefsky, A.; Hager, B. H.
1984-01-01
An algorithm for the iterative solution of finite element problems on a concurrent processor is presented. The method of conjugate gradients is used to solve the system of matrix equations, which is distributed among the processors of a MIMD computer according to an element-based spatial decomposition. This algorithm is implemented in a two-dimensional elastostatics program on the Caltech Hypercube concurrent processor. The results of tests on up to 32 processors show nearly linear concurrent speedup, with efficiencies over 90% for sufficiently large problems.
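For reference, a plain serial version of the conjugate gradient iteration used in these papers is sketched below; the element-based distribution of the matrix-vector product across hypercube processors is only indicated in a comment, and the test matrix is an arbitrary symmetric positive-definite example.

```python
# Serial conjugate gradient sketch; the parallel versions distribute A @ p by elements.
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Solve A x = b for symmetric positive-definite A."""
    n = len(b)
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b - A @ x                 # residual
    p = r.copy()                  # search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p                # in the concurrent version, each processor forms its
                                  # elements' contribution and the results are summed
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((50, 50))
    A = M @ M.T + 50 * np.eye(50)     # symmetric positive-definite test matrix
    b = rng.standard_normal(50)
    x = conjugate_gradient(A, b)
    print("residual norm:", np.linalg.norm(A @ x - b))
```

The method parallelizes well precisely because each iteration is dominated by the matrix-vector product and a few inner products, which is what yields the near-linear speedups reported above.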
A Model of Workflow Composition for Emergency Management
NASA Astrophysics Data System (ADS)
Xin, Chen; Bin-ge, Cui; Feng, Zhang; Xue-hui, Xu; Shan-shan, Fu
The commonly used workflow technology is not flexible enough to deal with concurrent emergency situations. The paper proposes a novel model for defining emergency plans, in which workflow segments appear as a constituent part. A formal abstraction, which contains four operations, is defined to compose workflow segments under a constraint rule. A software system for constructing and composing business process resources is implemented and integrated into the Emergency Plan Management Application System.
NASA Technical Reports Server (NTRS)
Freeman, Kenneth A.; Walsh, Rick; Weeks, David J.
1988-01-01
Space Station issues in fault management are discussed. The system background is described with attention given to design guidelines and power hardware. A contractually developed fault management system, FRAMES, is integrated with the energy management functions, the control switchgear, and the scheduling and operations management functions. The constraints that shaped the FRAMES system and its implementation are considered.
Parallel scheduling of recursively defined arrays
NASA Technical Reports Server (NTRS)
Myers, T. J.; Gokhale, M. B.
1986-01-01
A new method of automatic generation of concurrent programs which constructs arrays defined by sets of recursive equations is described. It is assumed that the time of computation of an array element is a linear combination of its indices, and integer programming is used to seek a succession of hyperplanes along which array elements can be computed concurrently. The method can be used to schedule equations involving variable length dependency vectors and mutually recursive arrays. Portions of the work reported here have been implemented in the PS automatic program generation system.
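A tiny illustration of the hyperplane idea (with an invented recurrence, not one derived by the paper's integer-programming scheduler): elements on the wavefront i + j = t depend only on earlier wavefronts, so every element of a wavefront could be computed concurrently.

```python
# Wavefront (hyperplane) schedule for a simple recursively defined 2D array.
import numpy as np

N = 6
a = np.zeros((N, N))
a[0, :] = 1.0          # boundary values
a[:, 0] = 1.0

for t in range(2, 2 * N - 1):                 # hyperplane index = i + j
    wavefront = [(i, t - i) for i in range(1, N) if 1 <= t - i < N]
    # All updates on this wavefront are independent and could run in parallel.
    for i, j in wavefront:
        a[i, j] = a[i - 1, j] + a[i, j - 1]

print(a)    # for this recurrence, a[i, j] holds Pascal-style path counts
```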
Execution of Multidisciplinary Design Optimization Approaches on Common Test Problems
NASA Technical Reports Server (NTRS)
Balling, R. J.; Wilkinson, C. A.
1997-01-01
A class of synthetic problems for testing multidisciplinary design optimization (MDO) approaches is presented. These test problems are easy to reproduce because all functions are given as closed-form mathematical expressions. They are constructed in such a way that the optimal value of all variables and the objective is unity. The test problems involve three disciplines and allow the user to specify the number of design variables, state variables, coupling functions, design constraints, controlling design constraints, and the strength of coupling. Several MDO approaches were executed on two sample synthetic test problems. These approaches included single-level optimization approaches, collaborative optimization approaches, and concurrent subspace optimization approaches. Execution results are presented, and the robustness and efficiency of these approaches are evaluated for these sample problems.
A Tutorial on Parallel and Concurrent Programming in Haskell
NASA Astrophysics Data System (ADS)
Peyton Jones, Simon; Singh, Satnam
This practical tutorial introduces the features available in Haskell for writing parallel and concurrent programs. We first describe how to write semi-explicit parallel programs by using annotations to express opportunities for parallelism and to help control the granularity of parallelism for effective execution on modern operating systems and processors. We then describe the mechanisms provided by Haskell for writing explicitly concurrent programs, with a focus on the use of software transactional memory to help share information between threads. Finally, we show how nested data parallelism can be used to write deterministically parallel programs that allow programmers to use rich data types in data-parallel code, which is automatically transformed into flat data-parallel form for efficient execution on multi-core processors.
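The tutorial's examples are, of course, in Haskell; as a rough cross-language analogue of one of its themes, the sketch below (Python, entirely our own) shows the granularity-control idea: rather than creating one task per element, work is grouped into chunks so each parallel task is large enough to amortize scheduling overhead.

```python
# Not from the tutorial (which is Haskell-specific): a rough Python analogue of
# controlling parallel granularity by chunking work before handing it to a pool.
from concurrent.futures import ProcessPoolExecutor
from functools import partial

def expensive(x):
    return sum(i * i for i in range(x))

def run_chunk(fn, chunk):
    # one task handles a whole chunk, not a single element
    return [fn(x) for x in chunk]

def parallel_map(fn, xs, chunk_size=256):
    chunks = [xs[i:i + chunk_size] for i in range(0, len(xs), chunk_size)]
    with ProcessPoolExecutor() as pool:
        out = pool.map(partial(run_chunk, fn), chunks)
    return [y for chunk in out for y in chunk]

if __name__ == "__main__":
    print(parallel_map(expensive, list(range(2000)), chunk_size=500)[:5])
```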
Murach, Kevin A; Bagley, James R
2016-08-01
Over the last 30+ years, it has become axiomatic that performing aerobic exercise within the same training program as resistance exercise (termed concurrent exercise training) interferes with the hypertrophic adaptations associated with resistance exercise training. However, a close examination of the literature reveals that the interference effect of concurrent exercise training on muscle growth in humans is not as compelling as previously thought. Moreover, recent studies show that, under certain conditions, concurrent exercise may augment resistance exercise-induced hypertrophy in healthy human skeletal muscle. The purpose of this article is to outline the contrary evidence for an acute and chronic interference effect of concurrent exercise on skeletal muscle growth in humans and provide practical literature-based recommendations for maximizing hypertrophy when training concurrently.
NASA Astrophysics Data System (ADS)
Ranaivomiarana, Narindra; Irisarri, François-Xavier; Bettebghor, Dimitri; Desmorat, Boris
2018-04-01
An optimization methodology is proposed to find concurrently the spatial distribution of material and the distribution of material anisotropy for orthotropic, linear and elastic two-dimensional membrane structures. The shape of the structure is parameterized by a density variable that determines the presence or absence of material. The polar method is used to parameterize a general orthotropic material through the invariants of its elasticity tensor under a change of frame. A global structural stiffness maximization problem, written as a compliance minimization problem, is treated, and a volume constraint is applied. The compliance minimization can be recast as a double minimization of complementary energy. An extension of the alternate directions algorithm is proposed to solve the double minimization problem. The algorithm iterates between local minimizations in each element of the structure and global minimizations. Thanks to the polar method, the local minimizations are solved explicitly, providing analytical solutions. The global minimizations are performed with finite element calculations. The method is shown to be straightforward and efficient. Concurrent optimization of the density and anisotropy distributions of a cantilever beam and a bridge is presented.
Concurrent Image Processing Executive (CIPE). Volume 2: Programmer's guide
NASA Technical Reports Server (NTRS)
Williams, Winifred I.
1990-01-01
This manual is intended as a guide for application programmers using the Concurrent Image Processing Executive (CIPE). CIPE is intended to become the support system software for a prototype high performance science analysis workstation. In its current configuration CIPE utilizes a JPL/Caltech Mark 3fp Hypercube with a Sun-4 host. CIPE's design is capable of incorporating other concurrent architectures as well. CIPE provides a programming environment that shields application programmers from various user interfaces, file transactions, and architectural complexities. A programmer may choose to write applications that use only the Sun-4 or that use the Sun-4 with the hypercube. A hypercube program will use the hypercube's data processors and, optionally, the Weitek floating point accelerators. The CIPE programming environment provides a simple set of subroutines to activate user interface functions, specify data distributions, activate hypercube-resident applications, and communicate parameters to and from the hypercube.
Variable-Metric Algorithm For Constrained Optimization
NASA Technical Reports Server (NTRS)
Frick, James D.
1989-01-01
The Variable Metric Algorithm for Constrained Optimization (VMACO) is a nonlinear optimization computer program developed to calculate the least value of a function of n variables subject to general constraints, both equality and inequality. The first set of constraints consists of equalities and the remaining constraints are inequalities. The program uses an iterative method in seeking the optimal solution. It is written in ANSI Standard FORTRAN 77.
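VMACO itself is a FORTRAN 77 program; as an illustrative modern sketch of the same problem statement (minimize a function of n variables with equality constraints listed first and inequalities after), one could write, for example:

```python
# Not VMACO itself: a sketch of the same problem statement solved with an
# off-the-shelf iterative constrained optimizer (SLSQP) from SciPy.
import numpy as np
from scipy.optimize import minimize

def objective(x):
    return (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2

constraints = [
    {"type": "eq",   "fun": lambda x: x[0] + x[1] - 3.0},   # equality constraints first
    {"type": "ineq", "fun": lambda x: x[0] - 0.5},          # then inequalities (>= 0)
]

result = minimize(objective, x0=np.array([0.0, 0.0]),
                  method="SLSQP", constraints=constraints)
print(result.x)   # least value of the objective satisfying both constraints
```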
Symbolic Analysis of Concurrent Programs with Polymorphism
NASA Technical Reports Server (NTRS)
Rungta, Neha Shyam
2010-01-01
The current trend of multi-core and multi-processor computing is causing a paradigm shift from inherently sequential to highly concurrent and parallel applications. Certain thread interleavings, data input values, or combinations of both often cause errors in the system. Systematic verification techniques such as explicit state model checking and symbolic execution are extensively used to detect errors in such systems [7, 9]. Explicit state model checking enumerates possible thread schedules and input data values of a program in order to check for errors [3, 9]. To partially mitigate the state space explosion from data input values, symbolic execution techniques substitute data input values with symbolic values [5, 7, 6]. Explicit state model checking and symbolic execution techniques used in conjunction with exhaustive search techniques such as depth-first search are unable to detect errors in medium to large-sized concurrent programs because the number of behaviors caused by data and thread non-determinism is extremely large. We present an overview of abstraction-guided symbolic execution for concurrent programs that detects errors manifested by a combination of thread schedules and data values [8]. The technique generates a set of key program locations relevant in testing the reachability of the target locations. The symbolic execution is then guided along these locations in an attempt to generate a feasible execution path to the error state. This allows the execution to focus in parts of the behavior space more likely to contain an error.
A programing system for research and applications in structural optimization
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, J.; Rogers, J. L., Jr.
1981-01-01
The flexibility necessary for such diverse utilizations is achieved by combining, in a modular manner, a state-of-the-art optimization program, a production level structural analysis program, and user supplied and problem dependent interface programs. Standard utility capabilities in modern computer operating systems are used to integrate these programs. This approach results in flexibility of the optimization procedure organization and versatility in the formulation of constraints and design variables. Features shown in numerical examples include: variability of structural layout and overall shape geometry, static strength and stiffness constraints, local buckling failure, and vibration constraints.
Hsu, Hsun-Ta; Fulginiti, Anthony; Rice, Eric; Rhoades, Harmony; Winetrobe, Hailey; Danforth, Laura
2018-05-03
Although homeless youth are likely to engage in concurrent sexual relationships and doing so can accelerate HIV transmission, the issue of sexual concurrency (i.e., having sexual partnerships that overlap in time) has received scarce attention in this vulnerable population. The literature that exists tends to focus on individuals' characteristics that may be associated with concurrency and overlooks the influence of their social environment. Informed by the risk amplification and abatement model (RAAM), this study explored the association between pro-social and problematic social network connections, and sexual concurrency among homeless youth using drop-in center services (N = 841). Nearly 37% of youth engaged in concurrency. Partially consistent with the RAAM, regression analyses showed that affiliation with more problematic ties (i.e., having more network members who practice concurrency and unprotected sex) was associated with greater sexual concurrency. Programs addressing HIV risk among homeless youth in drop-in centers should consider the role youths' network composition may play in concurrency.
1990-04-23
Subject terms: Ada programming language, Ada... Operating system: CSC-developed Ada Real-Time Operating System (ARTOS) for bare machine environments (target), ACW 1.1... Memory size: 4 MB. Test method: testing of the MC Ada V1.2.beta / Concurrent Computer Corporation compiler and the CSC-developed Ada Real-Time Operating System (ARTOS) for bare machine environments.
Reasoning about real-time systems with temporal interval logic constraints on multi-state automata
NASA Technical Reports Server (NTRS)
Gabrielian, Armen
1991-01-01
Models of real-time systems using a single paradigm often turn out to be inadequate, whether the paradigm is based on states, rules, event sequences, or logic. A model-based approach to reasoning about real-time systems is presented in which a temporal interval logic called TIL is employed to define constraints on a new type of high level automata. The combination, called hierarchical multi-state (HMS) machines, can be used to formally model a real-time system, a dynamic set of requirements, the environment, heuristic knowledge about planning-related problem solving, and the computational states of the reasoning mechanism. In this framework, mathematical techniques were developed for: (1) proving the correctness of a representation; (2) planning of concurrent tasks to achieve goals; and (3) scheduling of plans to satisfy complex temporal constraints. HMS machines allow reasoning about a real-time system from a model of how truth arises instead of merely depending on what is true in a system.
NASA Technical Reports Server (NTRS)
1975-01-01
The persistence of the current period of inflation and its apparent resistance to traditional fiscal and monetary policies implies a circular behavior that becomes increasingly impervious to ameliorative action. This behavior is attributed to: A concurrent industrial boom among industrialized nations; price and production policies; worldwide reductions in agricultural products; international shortages of natural resources and raw materials; and rise in multinational firms and merchant banking.
Enabling communication concurrency through flexible MPI endpoints
Dinan, James; Grant, Ryan E.; Balaji, Pavan; ...
2014-09-23
MPI defines a one-to-one relationship between MPI processes and ranks. This model captures many use cases effectively; however, it also limits communication concurrency and interoperability between MPI and programming models that utilize threads. Our paper describes the MPI endpoints extension, which relaxes the longstanding one-to-one relationship between MPI processes and ranks. Using endpoints, an MPI implementation can map separate communication contexts to threads, allowing them to drive communication independently. Also, endpoints enable threads to be addressable in MPI operations, enhancing interoperability between MPI and other programming models. Furthermore, these characteristics are illustrated through several examples and an empirical study that contrasts current multithreaded communication performance with the need for high degrees of communication concurrency to achieve peak communication performance.
Enabling communication concurrency through flexible MPI endpoints
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dinan, James; Grant, Ryan E.; Balaji, Pavan
MPI defines a one-to-one relationship between MPI processes and ranks. This model captures many use cases effectively; however, it also limits communication concurrency and interoperability between MPI and programming models that utilize threads. This paper describes the MPI endpoints extension, which relaxes the longstanding one-to-one relationship between MPI processes and ranks. Using endpoints, an MPI implementation can map separate communication contexts to threads, allowing them to drive communication independently. Endpoints also enable threads to be addressable in MPI operations, enhancing interoperability between MPI and other programming models. These characteristics are illustrated through several examples and an empirical study that contrasts current multithreaded communication performance with the need for high degrees of communication concurrency to achieve peak communication performance.
Specifying and Verifying Concurrent Programs.
1985-02-01
Presented at the workshop on Verification and Specification of Concurrent Systems, held in La Colle-sur-Loup, France, in October 1984. Work supported in part by the National... References include: Proc. ACM Symposium on Principles of Programming Languages, Las Vegas (January 1980), 251-261; [7] J. V. Guttag and J. J. Horning, An Introduction... Excerpt: ...names in V(S). However, the two formulas behave differently under a renaming mapping p; in particular, p(∀v:A(v)) equals ∀v:p(A(v)), so the...
Parallel processors and nonlinear structural dynamics algorithms and software
NASA Technical Reports Server (NTRS)
Belytschko, Ted
1990-01-01
Techniques are discussed for the implementation and improvement of vectorization and concurrency in nonlinear explicit structural finite element codes. In explicit integration methods, the computation of the element internal force vector consumes the bulk of the computer time. The program can be efficiently vectorized by subdividing the elements into blocks and executing all computations in vector mode. The structuring of elements into blocks also provides a convenient way to implement concurrency by creating tasks which can be assigned to available processors for evaluation. The techniques were implemented in a 3-D nonlinear program with one-point quadrature shell elements. Concurrency and vectorization were first implemented in a single time step version of the program. Techniques were developed to minimize processor idle time and to select the optimal vector length. A comparison of run times between the program executed in scalar, serial mode and the fully vectorized code executed concurrently using eight processors shows speed-ups of over 25. Conjugate gradient methods for solving nonlinear algebraic equations are also readily adapted to a parallel environment. A new technique for improving convergence properties of conjugate gradients in nonlinear problems is developed in conjunction with other techniques such as diagonal scaling. A significant reduction in the number of iterations required for convergence is shown for a statically loaded rigid bar suspended by three equally spaced springs.
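An illustrative sketch of the element-blocking idea (not the original shell-element code; a toy 1-D bar model of our own): elements are grouped into blocks, each block's internal-force contribution is evaluated in vector mode over all elements of the block at once, and the blocks are natural units of work to hand to separate processors.

```python
# Toy sketch only: vectorized per-block internal force assembly for 2-node bar
# elements, standing in for the blocked shell-element computation the abstract
# describes.  Each block could be dispatched to a different processor.
import numpy as np

def internal_force_1d_bars(coords, disp, connectivity, EA, block_size=64):
    """Assemble the internal force vector for linear 2-node bar elements."""
    f = np.zeros_like(disp)
    for start in range(0, len(connectivity), block_size):
        block = connectivity[start:start + block_size]   # (n_block, 2) node ids
        n1, n2 = block[:, 0], block[:, 1]
        length = coords[n2] - coords[n1]                 # vectorized over the block
        strain = (disp[n2] - disp[n1]) / length
        axial = EA * strain                              # element axial forces
        np.add.at(f, n1, -axial)                         # scatter to the nodes
        np.add.at(f, n2, axial)
    return f

nodes = np.linspace(0.0, 1.0, 5)
conn = np.array([[i, i + 1] for i in range(4)])
u = 0.01 * nodes                                         # uniform 1% stretch
print(internal_force_1d_bars(nodes, u, conn, EA=10.0))   # only the end nodes carry net force
```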
Program Correctness, Verification and Testing for Exascale (Corvette)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sen, Koushik; Iancu, Costin; Demmel, James W
The goal of this project is to provide tools to assess the correctness of parallel programs written using hybrid parallelism. There is a dire lack of both theoretical and engineering know-how in the area of finding bugs in hybrid or large scale parallel programs, which our research aims to change. In the project we have demonstrated novel approaches in several areas: 1. Low overhead automated and precise detection of concurrency bugs at scale. 2. Using low overhead bug detection tools to guide speculative program transformations for performance. 3. Techniques to reduce the concurrency required to reproduce a bug using partial program restart/replay. 4. Techniques to provide reproducible execution of floating point programs. 5. Techniques for tuning the floating point precision used in codes.
Tetrapod axial evolution and developmental constraints; Empirical underpinning by a mouse model
Woltering, Joost M.; Duboule, Denis
2015-01-01
The tetrapod vertebral column has become increasingly complex during evolution as an adaptation to a terrestrial life. At the same time, the evolution of the vertebral formula became subject to developmental constraints acting on the size of the cervical and thoraco-lumbar regions. In the course of our studies concerning the evolution of Hox gene regulation, we produced a transgenic mouse model expressing fish Hox genes, which displayed a reduced number of thoraco-lumbar vertebrae and concurrent sacral homeotic transformations. Here, we analyze this mutant stock and conclude that the ancestral, pre-tetrapodial Hox code already possessed the capacity to induce vertebrae with sacral characteristics. This suggests that alterations in the interpretation of the Hox code may have participated to the evolution of this region in tetrapods, along with potential modifications of the HOX proteins themselves. With its reduced vertebral number, this mouse stock violates a previously described developmental constraint, which applies to the thoraco-lumbar region. The resulting offset between motor neuron morphology, vertebral patterning and the relative positioning of hind limbs illustrates that the precise orchestration of the Hox-clock in parallel with other ontogenetic pathways places constraints on the evolvability of the body plan. PMID:26238020
Zhang, Y M; Huang, G; Lu, H W; He, Li
2015-08-15
A key issue facing integrated water resources management and water pollution control is how to address vague parametric information. A full credibility-based chance-constrained programming (FCCP) method is thus developed by introducing the new concept of credibility into the modeling framework. FCCP can not only deal with fuzzy parameters appearing concurrently in the objective and on both sides of the constraints of the model, but also provide a credibility level indicating how much confidence one can place in the optimal modeling solutions. The method is applied to the Heshui River watershed in south-central China for demonstration. Results from the case study showed that groundwater would make up for the water shortage in the face of shrinking surface water and rising water demand, and the optimized total pumpage of groundwater from both alluvial and karst aquifers would exceed 90% of its maximum allowable level when the credibility level is higher than or equal to 0.9. It is also indicated that an increase in credibility level would induce a reduction in cost for surface water acquisition, a rise in cost for groundwater withdrawal, and negligible variation in cost for water pollution control. Copyright © 2015 Elsevier B.V. All rights reserved.
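For readers unfamiliar with credibility measures, the sketch below illustrates one common formalization (Liu's credibility measure, the average of possibility and necessity, for a triangular fuzzy parameter). Whether this is exactly the construction used in the paper is an assumption; the point is how a credibility level turns a fuzzy constraint into a crisp test.

```python
# Illustrative sketch only; the triangular fuzzy number and the demand/supply
# wording below are our own example, not data from the paper.
def credibility_leq(x, r1, r2, r3):
    """Cr{xi <= x} for a triangular fuzzy number xi = (r1, r2, r3)."""
    if x <= r1:
        return 0.0
    if x <= r2:
        return (x - r1) / (2.0 * (r2 - r1))
    if x <= r3:
        return (x + r3 - 2.0 * r2) / (2.0 * (r3 - r2))
    return 1.0

def satisfies(x, fuzzy_rhs, alpha):
    """Chance constraint Cr{xi <= x} >= alpha, e.g. 'fuzzy demand xi is met by supply x
    with credibility at least alpha'."""
    return credibility_leq(x, *fuzzy_rhs) >= alpha

print(satisfies(98, (80, 90, 100), alpha=0.9))   # (98 + 100 - 180) / 20 = 0.9 -> True
```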
A Discussion of Issues in Integrity Constraint Monitoring
NASA Technical Reports Server (NTRS)
Fernandez, Francisco G.; Gates, Ann Q.; Cooke, Daniel E.
1998-01-01
In the development of large-scale software systems, analysts, designers, and programmers identify properties of data objects in the system. The ability to check those assertions at runtime is desirable as a means of verifying the integrity of the program. Typically, programmers ensure the satisfaction of such properties through some form of manually embedded assertion check. The disadvantage of this approach is that these assertions become entangled with the program code. The goal of the research is to develop an integrity constraint monitoring mechanism whereby software system properties (called integrity constraints) held in a repository are automatically inserted into the program by the mechanism to check for incorrect program behaviors. Such a mechanism would overcome many of the deficiencies of manually embedded assertion checks. This paper gives an overview of the preliminary work performed toward this goal. The manual instrumentation of constraint checking on a series of test programs is discussed. This review is then used as the basis for a discussion of issues to be considered in developing an automated integrity constraint monitor.
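A minimal sketch (not the authors' mechanism) of the contrast the paper draws: instead of scattering manual assertion checks through the code, the properties live in a repository and a wrapper weaves the checks into the functions automatically.

```python
# Illustrative only; the repository, the constraints, and the withdraw example
# are made up for the sake of the sketch.
import functools

# repository of integrity constraints: function name -> list of (description, predicate)
CONSTRAINTS = {
    "withdraw": [
        ("amount must be positive", lambda account, amount: amount > 0),
        ("balance may not go negative", lambda account, amount: account["balance"] >= amount),
    ],
}

def monitored(fn):
    """Automatically instrument fn with the constraints registered for it."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        for text, check in CONSTRAINTS.get(fn.__name__, []):
            if not check(*args, **kwargs):
                raise AssertionError(f"integrity constraint violated: {text}")
        return fn(*args, **kwargs)
    return wrapper

@monitored
def withdraw(account, amount):
    account["balance"] -= amount
    return account

print(withdraw({"balance": 100}, 30))   # ok
# withdraw({"balance": 10}, 30)         # would raise: balance may not go negative
```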
Concurrency-based approaches to parallel programming
NASA Technical Reports Server (NTRS)
Kale, L.V.; Chrisochoides, N.; Kohl, J.; Yelick, K.
1995-01-01
The inevitable transition to parallel programming can be facilitated by appropriate tools, including languages and libraries. After describing the needs of applications developers, this paper presents three specific approaches aimed at the development of efficient and reusable parallel software for irregular and dynamic-structured problems. A salient feature of all three approaches is their exploitation of concurrency within a processor. Benefits of individual approaches such as these can be leveraged by an interoperability environment which permits modules written using different approaches to co-exist in single applications.
Reliability models for dataflow computer systems
NASA Technical Reports Server (NTRS)
Kavi, K. M.; Buckles, B. P.
1985-01-01
The demands for concurrent operation within a computer system and the representation of parallelism in programming languages have yielded a new form of program representation known as data flow (DENN 74, DENN 75, TREL 82a). A new model based on data flow principles for parallel computations and parallel computer systems is presented. Necessary conditions for liveness and deadlock freeness in data flow graphs are derived. The data flow graph is used as a model to represent asynchronous concurrent computer architectures including data flow computers.
ERIC Educational Resources Information Center
Hawaii Educational Policy Center, 2008
2008-01-01
The 2007 Hawai'i State Legislature passed Senate Concurrent Resolution 118 S.D.1 HD 1 Improving the Community's Understanding of the Department of Education's Programs and School Expenses Including a Comparison with Other States on Adequacy of Funds. Among the requests contained in the resolution were the following: "Be it further resolved…
Time on your hands: Perceived duration of sensory events is biased toward concurrent actions.
Yon, Daniel; Edey, Rosanna; Ivry, Richard B; Press, Clare
2017-02-01
Perceptual systems must rapidly generate accurate representations of the world from sensory inputs that are corrupted by internal and external noise. We can typically obtain more veridical representations by integrating information from multiple channels, but this integration can lead to biases when inputs are, in fact, not from the same source. Although a considerable amount is known about how different sources of information are combined to influence what we perceive, it is not known whether temporal features are combined. It is vital to address this question given the divergent predictions made by different models of cue combination and time perception concerning the plausibility of cross-modal temporal integration, and the implications that such integration would have for research programs in action control and social cognition. Here we present four experiments investigating the influence of movement duration on the perceived duration of an auditory tone. Participants either explicitly (Experiments 1-2) or implicitly (Experiments 3-4) produced hand movements of shorter or longer durations, while judging the duration of a concurrently presented tone (500-950 ms in duration). Across all experiments, judgments of tone duration were attracted toward the duration of executed movements (i.e., tones were perceived to be longer when executing a movement of longer duration). Our results demonstrate that temporal information associated with movement biases perceived auditory duration, placing important constraints on theories modeling cue integration for state estimation, as well as models of time perception, action control and social cognition. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Increasing Communication in Children with Concurrent Vision and Hearing Loss
ERIC Educational Resources Information Center
Brady, Nancy C.; Bashinski, Susan M.
2008-01-01
Nine children with complex communication needs and concurrent vision and hearing losses participated in an intervention program aimed at increasing intentional prelinguistic communication. The intervention constituted a pilot, descriptive study of an adapted version of prelinguistic milieu teaching, hence referred to as A-PMT. In A-PMT, natural…
2013-08-01
...in Sequential Design Optimization with Concurrent Calibration-Based Model Validation
Drignei, Dorin (Mathematics and Statistics Department); Mourelatos, Zissimos; Pandey, Vijitashwa
Program manual for ASTOP, an Arbitrary space trajectory optimization program
NASA Technical Reports Server (NTRS)
Horsewood, J. L.
1974-01-01
The ASTOP program (an Arbitrary Space Trajectory Optimization Program), designed to generate optimum low-thrust trajectories in an N-body field while satisfying selected hardware and operational constraints, is presented. The trajectory is divided into a number of segments or arcs over which the control is held constant. This constant control over each arc is optimized using a parameter optimization scheme based on gradient techniques. A modified Encke formulation of the equations of motion is employed. The program provides a wide range of constraint, end-condition, and performance-index options. The basic approach is conducive to future expansion of features, such as the incorporation of new constraints and the addition of new end conditions.
It Takes a Village: Network Effects on Rural Education in Afghanistan. PRGS Dissertation
ERIC Educational Resources Information Center
Hoover, Matthew Amos
2014-01-01
Often, development organizations confront a tradeoff between program priorities and operational constraints. These constraints may be financial, capacity, or logistical; regardless, the tradeoff often requires sacrificing portions of a program. This work is concerned with figuring out how, when constrained, an organization or program manager can…
NASA Technical Reports Server (NTRS)
Menon, R. G.; Kurdila, A. J.
1992-01-01
This paper presents a concurrent methodology to simulate the dynamics of flexible multibody systems with a large number of degrees of freedom. A general class of open-loop structures is treated and a redundant coordinate formulation is adopted. A range space method is used in which the constraint forces are calculated using a preconditioned conjugate gradient method. By using a preconditioner motivated by the regular ordering of the directed graph of the structures, it is shown that the method is order N in the total number of coordinates of the system. The overall formulation has the advantage that it permits fine parallelization and does not rely on system topology to induce concurrency. It can be efficiently implemented on the present generation of parallel computers with a large number of processors. Validation of the method is presented via numerical simulations of space structures incorporating a large number of flexible degrees of freedom.
On Why It Is Impossible to Prove that the BDX90 Dispatcher Implements a Time-sharing System
NASA Technical Reports Server (NTRS)
Boyer, R. S.; Moore, J. S.
1983-01-01
The Software Implemented Fault Tolerance (SIFT) system is written in PASCAL except for about a page of machine code. The SIFT system implements a small time sharing system in which PASCAL programs for separate application tasks are executed according to a schedule with real time constraints. The PASCAL language has no provision for handling the notion of an interrupt such as the B930 clock interrupt. The PASCAL language also lacks the notion of running a PASCAL subroutine for a given amount of time, suspending it, saving away the suspension, and later activating the suspension. Machine code was used to overcome these inadequacies of PASCAL. Code which handles clock interrupts and suspends processes is called a dispatcher. The time sharing/virtual machine idea is completely destroyed by the reconfiguration task. After termination of the reconfiguration task, the tasks run by the dispatcher have no relation to those run before reconfiguration. It is impossible to view the dispatcher as a time-sharing system implementing virtual BDX930s running concurrently when one process can wipe out the others.
7 CFR 281.1 - General purpose and scope.
Code of Federal Regulations, 2011 CFR
2011-01-01
... AGRICULTURE FOOD STAMP AND FOOD DISTRIBUTION PROGRAM ADMINISTRATION OF THE FOOD STAMP PROGRAM ON INDIAN... Program on Indian reservations either separately or concurrently with the Food distribution program. In order to assure that the Food Stamp Program is responsive to the needs of Indians on reservations, State...
Bandwidth Constraints to Using Video and Other Rich Media in Behavior Change Websites
Jazdzewski, Stephen A; McKay, H Garth; Hudson, Clinton R
2005-01-01
Background: Web-based behavior change interventions often include rich media (eg, video, audio, and large graphics). The rationale for using rich media includes the need to reach users who are not inclined or able to use text-based website content, encouragement of program engagement, and following the precedent set by news and sports websites. Objectives: We describe the development of a bandwidth usage index, which seeks to provide a practical method to gauge the extent to which websites can successfully be used within different Internet access scenarios (eg, dial-up and broadband). Methods: We conducted three studies to measure bandwidth consumption. In Study 1, we measured the bandwidth usage index for three video-rich websites (for smoking cessation, for caregivers, and for improving eldercare by family members). We then estimated the number of concurrent users that could be accommodated by each website under various Internet access scenarios. In Study 2, we sought to validate our estimated threshold number of concurrent users by testing the video-rich smoking cessation website with different numbers of concurrent users. In Study 3, we calculated the bandwidth usage index and threshold number of concurrent users for three versions of the smoking cessation website: the video-rich version (tested in Study 1), an audio-rich version, and a Web-enabled CD-ROM version in which all media-rich content was placed on a CD-ROM on the client computer. Results: In Study 1, we found that the bandwidth usage index of the video-rich websites ranged from 144 Kbps to 93 Kbps. These results indicated that dial-up modem users would not achieve a “good user experience” with any of the three rich media websites. Results for Study 2 confirmed that usability was compromised when the estimated threshold number of concurrent users was exceeded. Results for Study 3 indicated that changing a website from video- to audio-rich content reduced the bandwidth requirement by almost 50%, but it remained too large to allow satisfactory use in dial-up modem scenarios. The Web-enabled CD-ROM reduced bandwidth requirements such that even a dial-up modem user could have a good user experience with the rich media content. Conclusions: We conclude that the bandwidth usage index represents a practical tool that can help developers and researchers to measure the bandwidth requirements of their websites as well as to evaluate the feasibility of certain website designs in terms of specific use cases. These findings are discussed in terms of reaching different groups of users as well as accommodating the intended number of concurrent users. We also discuss the promising option of using Web-enabled CD-ROMs to deliver rich media content to users with dial-up Internet access. We introduce a number of researchable themes for improving our ability to develop Web-based behavior change interventions that can better deliver what they promise. PMID:16236701
Bandwidth constraints to using video and other rich media in behavior change websites.
Danaher, Brian G; Jazdzewski, Stephen A; McKay, H Garth; Hudson, Clinton R
2005-09-16
Web-based behavior change interventions often include rich media (eg, video, audio, and large graphics). The rationale for using rich media includes the need to reach users who are not inclined or able to use text-based website content, encouragement of program engagement, and following the precedent set by news and sports websites. We describe the development of a bandwidth usage index, which seeks to provide a practical method to gauge the extent to which websites can successfully be used within different Internet access scenarios (eg, dial-up and broadband). We conducted three studies to measure bandwidth consumption. In Study 1, we measured the bandwidth usage index for three video-rich websites (for smoking cessation, for caregivers, and for improving eldercare by family members). We then estimated the number of concurrent users that could be accommodated by each website under various Internet access scenarios. In Study 2, we sought to validate our estimated threshold number of concurrent users by testing the video-rich smoking cessation website with different numbers of concurrent users. In Study 3, we calculated the bandwidth usage index and threshold number of concurrent users for three versions of the smoking cessation website: the video-rich version (tested in Study 1), an audio-rich version, and a Web-enabled CD-ROM version in which all media-rich content was placed on a CD-ROM on the client computer. In Study 1, we found that the bandwidth usage index of the video-rich websites ranged from 144 Kbps to 93 Kbps. These results indicated that dial-up modem users would not achieve a "good user experience" with any of the three rich media websites. Results for Study 2 confirmed that usability was compromised when the estimated threshold number of concurrent users was exceeded. Results for Study 3 indicated that changing a website from video- to audio-rich content reduced the bandwidth requirement by almost 50%, but it remained too large to allow satisfactory use in dial-up modem scenarios. The Web-enabled CD-ROM reduced bandwidth requirements such that even a dial-up modem user could have a good user experience with the rich media content. We conclude that the bandwidth usage index represents a practical tool that can help developers and researchers to measure the bandwidth requirements of their websites as well as to evaluate the feasibility of certain website designs in terms of specific use cases. These findings are discussed in terms of reaching different groups of users as well as accommodating the intended number of concurrent users. We also discuss the promising option of using Web-enabled CD-ROMs to deliver rich media content to users with dial-up Internet access. We introduce a number of researchable themes for improving our ability to develop Web-based behavior change interventions that can better deliver what they promise.
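A back-of-the-envelope illustration of how such an index can be used (the link capacity below is our own example figure; only the Study 1 indices come from the abstract): divide the bandwidth available on the server's link by a site's bandwidth usage index to estimate a threshold number of concurrent users.

```python
# Illustrative arithmetic only, not the paper's exact calculation.
def max_concurrent_users(link_kbps, usage_index_kbps):
    return int(link_kbps // usage_index_kbps)

t1_kbps = 1544                       # a T1 line, purely for the sake of example
for index in (144, 93):              # the video-rich sites measured in Study 1
    print(index, "Kbps index ->", max_concurrent_users(t1_kbps, index), "concurrent users")
```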
NASA Technical Reports Server (NTRS)
Hines, J.
1999-01-01
Sensors 2000! (S2K!) is a specialized, integrated projects team organized to provide focused, directed, advanced biosensor and bioinstrumentation systems technology support to NASA's spaceflight and ground-based research and development programs. Specific technology thrusts include telemetry-based sensor systems, chemical/biological sensors, medical and physiological sensors, miniaturized instrumentation architectures, and data and signal processing systems. A concurrent objective is to promote the mutual use, application, and transition of developed technology by collaborating in academic-commercial-government leveraging, joint research, technology utilization and commercialization, and strategic partnering alliances. Sensors 2000! is organized around three primary program elements: Technology and Product Development, Technology Infusion and Applications, and Collaborative Activities. Technology and Product Development involves development and demonstration of biosensor and biotelemetry systems for application to NASA Space Life Sciences Programs; production of fully certified spaceflight hardware and payload elements; and sensor/measurement systems development for NASA research and development activities. Technology Infusion and Applications provides technology and program agent support to identify available and applicable technologies from multiple sources for insertion into NASA's strategic enterprises and initiatives. Collaborative Activities involve leveraging of NASA technologies with those of other government agencies, academia, and industry to concurrently provide technology solutions and products of mutual benefit to participating members.
Automatic Verification of Serializers.
1980-03-01
Table of contents fragments: 2.5 Using semaphores to implement serializers (p. 32); 2.6 A comparison of... Excerpt: ...of concurrency control, while Hewitt has concentrated on more primitive control of concurrency in a context where programs communicate by passing... A translation of serializers into clusters and semaphores is given as a possible implementation strategy. Chapter 3 presents a simple semantic model that...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Landau, David B., E-mail: david.landau@kcl.ac.uk; Hughes, Laura; Baker, Angela
2016-08-01
Purpose: To report toxicity and early survival data for IDEAL-CRT, a trial of dose-escalated concurrent chemoradiotherapy (CRT) for non-small cell lung cancer. Patients and Methods: Patients received tumor doses of 63 to 73 Gy in 30 once-daily fractions over 6 weeks with 2 concurrent cycles of cisplatin and vinorelbine. They were assigned to 1 of 2 groups according to esophageal dose. In group 1, tumor doses were determined by an experimental constraint on maximum esophageal dose, which was escalated following a 6 + 6 design from 65 Gy through 68 Gy to 71 Gy, allowing an esophageal maximum tolerated dose to be determined from early and late toxicities. Tumor doses for group 2 patients were determined by other tissue constraints, often lung. Overall survival, progression-free survival, tumor response, and toxicity were evaluated for both groups combined. Results: Eight centers recruited 84 patients: 13, 12, and 10, respectively, in the 65-Gy, 68-Gy, and 71-Gy cohorts of group 1; and 49 in group 2. The mean prescribed tumor dose was 67.7 Gy. Five grade 3 esophagitis and 3 grade 3 pneumonitis events were observed across both groups. After 1 fatal esophageal perforation in the 71-Gy cohort, 68 Gy was declared the esophageal maximum tolerated dose. With a median follow-up of 35 months, median overall survival was 36.9 months, and overall survival and progression-free survival were 87.8% and 72.0%, respectively, at 1 year and 68.0% and 48.5% at 2 years. Conclusions: IDEAL-CRT achieved significant treatment intensification with acceptable toxicity and promising survival. The isotoxic design allowed the esophageal maximum tolerated dose to be identified from relatively few patients.
Heterogeneous concurrent computing with exportable services
NASA Technical Reports Server (NTRS)
Sunderam, Vaidy
1995-01-01
Heterogeneous concurrent computing, based on the traditional process-oriented model, is approaching its functionality and performance limits. An alternative paradigm, based on the concept of services, supporting data driven computation, and built on a lightweight process infrastructure, is proposed to enhance the functional capabilities and the operational efficiency of heterogeneous network-based concurrent computing. TPVM is an experimental prototype system supporting exportable services, thread-based computation, and remote memory operations that is built as an extension of and an enhancement to the PVM concurrent computing system. TPVM offers a significantly different computing paradigm for network-based computing, while maintaining a close resemblance to the conventional PVM model in the interest of compatibility and ease of transition. Preliminary experiences have demonstrated that the TPVM framework presents a natural yet powerful concurrent programming interface, while being capable of delivering performance improvements of up to thirty percent.
Joint Chance-Constrained Dynamic Programming
NASA Technical Reports Server (NTRS)
Ono, Masahiro; Kuwata, Yoshiaki; Balaram, J. Bob
2012-01-01
This paper presents a novel dynamic programming algorithm with a joint chance constraint, which explicitly bounds the risk of failure in order to maintain the state within a specified feasible region. A joint chance constraint cannot be handled by existing constrained dynamic programming approaches since their application is limited to constraints in the same form as the cost function, that is, an expectation over a sum of one-stage costs. We overcome this challenge by reformulating the joint chance constraint into a constraint on an expectation over a sum of indicator functions, which can be incorporated into the cost function by dualizing the optimization problem. As a result, the primal variables can be optimized by standard dynamic programming, while the dual variable is optimized by a root-finding algorithm that converges exponentially. Error bounds on the primal and dual objective values are rigorously derived. We demonstrate the algorithm on a path planning problem, as well as an optimal control problem for Mars entry, descent and landing. The simulations are conducted using real terrain data of Mars, with four million discrete states at each time step.
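A compact sketch of the reformulation described above, in our own notation (the paper's exact formulation may differ):

```latex
% Our notation; F is the feasible region, \Delta the risk bound, \pi a policy.
\begin{align*}
  \min_{\pi}\ \mathbb{E}\!\left[\sum_{t=0}^{T} c_t(x_t,u_t)\right]
  \quad\text{s.t.}\quad
  \Pr\!\left(\exists\, t:\ x_t \notin F\right) \le \Delta .
\end{align*}
% Boole's inequality bounds the risk by an expectation over a sum of indicator
% functions, which has the same additive form as the cost:
\begin{align*}
  \Pr\!\left(\exists\, t:\ x_t \notin F\right)
  \;\le\; \mathbb{E}\!\left[\sum_{t=0}^{T} \mathbf{1}\{x_t \notin F\}\right] \;\le\; \Delta .
\end{align*}
% Dualizing with a multiplier \lambda \ge 0 yields a problem solvable by standard
% dynamic programming for fixed \lambda, while \lambda itself is adjusted by a
% one-dimensional root search:
\begin{align*}
  L(\lambda) \;=\; \min_{\pi}\
  \mathbb{E}\!\left[\sum_{t=0}^{T}\Big(c_t(x_t,u_t) + \lambda\,\mathbf{1}\{x_t \notin F\}\Big)\right]
  \;-\; \lambda\,\Delta .
\end{align*}
```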
King, Laurie A; Horak, Fay B
2009-01-01
This article introduces a new framework for therapists to develop an exercise program to delay mobility disability in people with Parkinson disease (PD). Mobility, or the ability to efficiently navigate and function in a variety of environments, requires balance, agility, and flexibility, all of which are affected by PD. This article summarizes recent research identifying how constraints on mobility specific to PD, such as rigidity, bradykinesia, freezing, poor sensory integration, inflexible program selection, and impaired cognitive processing, limit mobility in people with PD. Based on these constraints, a conceptual framework for exercises to maintain and improve mobility is presented. An example of a constraint-focused agility exercise program, incorporating movement principles from tai chi, kayaking, boxing, lunges, agility training, and Pilates exercises, is presented. This new constraint-focused agility exercise program is based on a strong scientific framework and includes progressive levels of sensorimotor, resistance, and coordination challenges that can be customized for each patient while maintaining fidelity. Principles for improving mobility presented here can be incorporated into an ongoing or long-term exercise program for people with PD. PMID:19228832
King, Laurie A; Horak, Fay B
2009-04-01
This article introduces a new framework for therapists to develop an exercise program to delay mobility disability in people with Parkinson disease (PD). Mobility, or the ability to efficiently navigate and function in a variety of environments, requires balance, agility, and flexibility, all of which are affected by PD. This article summarizes recent research identifying how constraints on mobility specific to PD, such as rigidity, bradykinesia, freezing, poor sensory integration, inflexible program selection, and impaired cognitive processing, limit mobility in people with PD. Based on these constraints, a conceptual framework for exercises to maintain and improve mobility is presented. An example of a constraint-focused agility exercise program, incorporating movement principles from tai chi, kayaking, boxing, lunges, agility training, and Pilates exercises, is presented. This new constraint-focused agility exercise program is based on a strong scientific framework and includes progressive levels of sensorimotor, resistance, and coordination challenges that can be customized for each patient while maintaining fidelity. Principles for improving mobility presented here can be incorporated into an ongoing or long-term exercise program for people with PD.
Rationale in Choosing a Teacher Preparation Program.
ERIC Educational Resources Information Center
Raine, LaVerne; Harkins, Donna; Sampson, Mary Beth
A study examined students' reasons for, and implications of, choosing a traditional student teaching program or a field-based program of preservice teacher education. The traditional student teaching program and the field-based program were offered concurrently for a short period of time at Texas A&M University--Commerce. Students enrolled in…
A heuristic constraint programmed planner for deep space exploration problems
NASA Astrophysics Data System (ADS)
Jiang, Xiao; Xu, Rui; Cui, Pingyuan
2017-10-01
In recent years, the increasing number of scientific payloads and growing constraints on the probe have made constraint processing technology a hotspot in the deep space planning field. In the planning procedure, the ordering of variables and values plays a vital role. In this paper we present two heuristic ordering methods for variables and values. On this basis a graphplan-like constraint-programmed planner is proposed. In the planner we convert the traditional constraint satisfaction problem to a time-tagged form with different levels. Inspired by the most-constrained-first principle in constraint satisfaction problems (CSPs), the variable heuristic is designed from the number of unassigned variables in the constraint, and the value heuristic is designed from the completion degree of the support set. Simulation experiments show that the proposed planner is effective and that its performance is competitive with other kinds of planners.
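An illustrative sketch of the two heuristics' flavor on a toy CSP of our own (not the planner itself): choose the most constrained variable first, approximated here by smallest remaining domain, and order its values by how much support they leave.

```python
# Illustrative toy CSP, not the planner.  The variable heuristic follows the
# "most constrained first" principle (approximated by smallest remaining
# domain); the value heuristic prefers values whose partial assignment keeps
# the most constraints satisfiable (a crude stand-in for the completion degree
# of the support set).
def select_variable(domains, assignment):
    unassigned = [v for v in domains if v not in assignment]
    return min(unassigned, key=lambda v: len(domains[v]))

def order_values(var, domains, constraints, assignment):
    def support(value):
        trial = dict(assignment, **{var: value})
        return sum(1 for c in constraints if c(trial))
    return sorted(domains[var], key=support, reverse=True)

def backtrack(domains, constraints, assignment=None):
    assignment = assignment or {}
    if len(assignment) == len(domains):
        return assignment
    var = select_variable(domains, assignment)
    for value in order_values(var, domains, constraints, assignment):
        trial = dict(assignment, **{var: value})
        if all(c(trial) for c in constraints):
            solution = backtrack(domains, constraints, trial)
            if solution:
                return solution
    return None

# Constraints are written to be vacuously true until all their variables are bound.
domains = {"A": [1, 2, 3], "B": [1, 2], "C": [2, 3]}
constraints = [
    lambda a: "A" not in a or "B" not in a or a["A"] != a["B"],
    lambda a: "B" not in a or "C" not in a or a["B"] < a["C"],
]
print(backtrack(domains, constraints))   # e.g. {'B': 1, 'C': 2, 'A': 2}
```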
Software engineering aspects of real-time programming concepts
NASA Astrophysics Data System (ADS)
Schoitsch, Erwin
1986-08-01
Real-time programming is a discipline of great importance not only in process control, but also in fields like communication, office automation, interactive databases, interactive graphics and operating systems development. General concepts of concurrent programming and constructs for process-synchronization are discussed in detail. Tasking and synchronization concepts, methods of process communication, interrupt and timeout handling in systems based on semaphores, signals, conditional critical regions or on real-time languages like Concurrent PASCAL, MODULA, CHILL and ADA are explained and compared with each other. The second part deals with structuring and modularization of technical processes to build reliable and maintainable real time systems. Software-quality and software engineering aspects are considered throughout the paper.
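As a concrete reminder of what such constructs buy you, here is the classic bounded-buffer pattern built from counting semaphores plus a lock (illustrative Python of our own, not code from the paper, which surveys the constructs themselves):

```python
# Bounded buffer with two counting semaphores and a mutex: the canonical
# producer/consumer synchronization pattern.
import threading
import collections

buffer = collections.deque()
empty_slots = threading.Semaphore(4)      # how much room remains in the buffer
filled_slots = threading.Semaphore(0)     # how many items are ready to consume
mutex = threading.Lock()

def producer(n_items=8):
    for item in range(n_items):
        empty_slots.acquire()             # block if the buffer is full
        with mutex:
            buffer.append(item)
        filled_slots.release()            # wake a waiting consumer

def consumer(n_items=8):
    for _ in range(n_items):
        filled_slots.acquire()            # block until an item is available
        with mutex:
            item = buffer.popleft()
        empty_slots.release()             # free a slot for the producer
        print("consumed", item)

threads = [threading.Thread(target=producer), threading.Thread(target=consumer)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```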
Concurrent engineering: Spacecraft and mission operations system design
NASA Technical Reports Server (NTRS)
Landshof, J. A.; Harvey, R. J.; Marshall, M. H.
1994-01-01
Despite our awareness of the mission design process, spacecraft historically have been designed and developed by one team and then turned over as a system to the Mission Operations organization to operate on-orbit. By applying concurrent engineering techniques and envisioning operability as an essential characteristic of spacecraft design, tradeoffs can be made in the overall mission design to minimize mission lifetime cost. Lessons learned from previous spacecraft missions will be described, as well as the implementation of concurrent mission operations and spacecraft engineering for the Near Earth Asteroid Rendezvous (NEAR) program.
Research Breathes New Life Into Senior Travel Program.
ERIC Educational Resources Information Center
Blazey, Michael
1986-01-01
A survey of older citizens concerning travel interests revealed constraints to participation in a travel program. A description is given of how research on attitudes and life styles indicated ways in which these constraints could be lessened. (JD)
Elmi, Maryam; Azin, Arash; Elnahas, Ahmad; McCready, David R; Cil, Tulin D
2018-05-14
Patients with genetic susceptibility to breast and ovarian cancer are eligible for risk-reduction surgery. Surgical morbidity of risk-reduction mastectomy (RRM) with concurrent bilateral salpingo-oophorectomy (BSO) is unknown. Outcomes in these patients were compared to patients undergoing RRM without BSO using a large multi-institutional database. A retrospective cohort analysis was conducted using the American College of Surgeon's National Surgical Quality Improvement Program (NSQIP) 2007-2016 datasets, comparing postoperative morbidity between patients undergoing RRM with patients undergoing RRM with concurrent BSO. Patients with genetic susceptibility to breast/ovarian cancer undergoing risk-reduction surgery were identified. The primary outcome was 30-day postoperative major morbidity. Secondary outcomes included surgical site infections, reoperations, readmissions, length of stay, and venous thromboembolic events. A multivariate analysis was performed to determine predictors of postoperative morbidity and the adjusted effect of concurrent BSO on morbidity. Of the 5470 patients undergoing RRM, 149 (2.7%) underwent concurrent BSO. The overall rate of major morbidity and postoperative infections was 4.5% and 4.6%, respectively. There was no significant difference in the rate of postoperative major morbidity (4.5% vs 4.7%, p = 0.91) or any of the secondary outcomes between patients undergoing RRM without BSO vs. those undergoing RRM with concurrent BSO. Multivariable analysis showed Body Mass Index (OR 1.05; p < 0.001) and smoking (OR 1.78; p = 0.003) to be the only predictors associated with major morbidity. Neither immediate breast reconstruction (OR 1.02; p = 0.93) nor concurrent BSO (OR 0.94; p = 0.89) were associated with increased postoperative major morbidity. This study demonstrated that RRM with concurrent BSO was not associated with significant additional morbidity when compared to RRM without BSO. Therefore, this joint approach may be considered for select patients at risk for both breast and ovarian cancer.
Tetrapod axial evolution and developmental constraints; Empirical underpinning by a mouse model.
Woltering, Joost M; Duboule, Denis
2015-11-01
The tetrapod vertebral column has become increasingly complex during evolution as an adaptation to a terrestrial life. At the same time, the evolution of the vertebral formula became subject to developmental constraints acting on the size of the cervical and thoraco-lumbar regions. In the course of our studies concerning the evolution of Hox gene regulation, we produced a transgenic mouse model expressing fish Hox genes, which displayed a reduced number of thoraco-lumbar vertebrae and concurrent sacral homeotic transformations. Here, we analyze this mutant stock and conclude that the ancestral, pre-tetrapodial Hox code already possessed the capacity to induce vertebrae with sacral characteristics. This suggests that alterations in the interpretation of the Hox code may have participated to the evolution of this region in tetrapods, along with potential modifications of the HOX proteins themselves. With its reduced vertebral number, this mouse stock violates a previously described developmental constraint, which applies to the thoraco-lumbar region. The resulting offset between motor neuron morphology, vertebral patterning and the relative positioning of hind limbs illustrates that the precise orchestration of the Hox-clock in parallel with other ontogenetic pathways places constraints on the evolvability of the body plan. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd.. All rights reserved.
Automatic data partitioning on distributed memory multicomputers. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Gupta, Manish
1992-01-01
Distributed-memory parallel computers are increasingly being used to provide high levels of performance for scientific applications. Unfortunately, such machines are not very easy to program. A number of research efforts seek to alleviate this problem by developing compilers that take over the task of generating communication. The communication overheads and the extent of parallelism exploited in the resulting target program are determined largely by the manner in which data is partitioned across different processors of the machine. Most of the compilers provide no assistance to the programmer in the crucial task of determining a good data partitioning scheme. A novel approach is presented, the constraints-based approach, to the problem of automatic data partitioning for numeric programs. In this approach, the compiler identifies some desirable requirements on the distribution of various arrays being referenced in each statement, based on performance considerations. These desirable requirements are referred to as constraints. For each constraint, the compiler determines a quality measure that captures its importance with respect to the performance of the program. The quality measure is obtained through static performance estimation, without actually generating the target data-parallel program with explicit communication. Each data distribution decision is taken by combining all the relevant constraints. The compiler attempts to resolve any conflicts between constraints such that the overall execution time of the parallel program is minimized. This approach has been implemented as part of a compiler called Paradigm, that accepts Fortran 77 programs, and specifies the partitioning scheme to be used for each array in the program. We have obtained results on some programs taken from the Linpack and Eispack libraries, and the Perfect Benchmarks. These results are quite promising, and demonstrate the feasibility of automatic data partitioning for a significant class of scientific application programs with regular computations.
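A toy sketch of the flavor of the constraints-based approach (not Paradigm itself; the array names, distribution labels, and quality numbers below are all made up): each reference contributes weighted constraints on an array's distribution, and the distribution with the greatest combined quality wins.

```python
# Illustrative only; real quality measures would come from static performance
# estimation of the target data-parallel program.
from collections import defaultdict

# (array, candidate distribution, quality measure)
constraints = [
    ("A", "block_rows",    5.0),   # favors communication-free row-wise access
    ("A", "block_rows",    2.0),
    ("A", "block_columns", 3.0),
    ("B", "block_columns", 4.0),
    ("B", "replicated",    1.5),
]

def choose_distributions(constraints):
    scores = defaultdict(float)
    for array, distribution, quality in constraints:
        scores[(array, distribution)] += quality        # combine all relevant constraints
    best = {}
    for (array, distribution), total in scores.items():
        if total > best.get(array, (None, -1.0))[1]:
            best[array] = (distribution, total)
        # conflicts between constraints would be resolved by re-estimating overall cost
    return {array: dist for array, (dist, _) in best.items()}

print(choose_distributions(constraints))   # {'A': 'block_rows', 'B': 'block_columns'}
```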
Control Synthesis for a Class of Hybrid Systems Subject to Configuration-Based Safety Constraints
NASA Technical Reports Server (NTRS)
Heymann, Michael; Lin, Feng; Meyer, George
1997-01-01
We examine a class of hybrid systems which we call Composite Hybrid Machines (CHM's) that consists of the concurrent (and partially synchronized) operation of Elementary Hybrid Machines (EHM's). Legal behavior, specified by a set of illegal configurations that the CHM may not enter, is to be achieved by the concurrent operation of the CHM with a suitably designed legal controller. In the present paper we focus on the problem of synthesizing a legal controller, whenever such a controller exists. More specifically, we address the problem of synthesizing the minimally restrictive legal controller. A controller is minimally restrictive if, when composed to operate concurrently with another legal controller, it will never interfere with the operation of the other controller and, therefore, can be composed to operate concurrently with any other controller that may be designed to achieve liveness specifications or optimality requirements without the need to reinvestigate or reverify legality of the composite controller. We confine our attention to a special class of CHM's where system dynamics is rate-limited and legal guards are conjunctions or disjunctions of atomic formulas in the dynamic variables (of the type x ≤ x0 or x ≥ x0). We present an algorithm for synthesis of the minimally restrictive legal controller. We demonstrate our approach by synthesizing a minimally restrictive controller for a steam boiler (the verification of which recently received a great deal of attention).
An Adaptive Flow Solver for Air-Borne Vehicles Undergoing Time-Dependent Motions/Deformations
NASA Technical Reports Server (NTRS)
Singh, Jatinder; Taylor, Stephen
1997-01-01
This report describes a concurrent Euler flow solver for flows around complex 3-D bodies. The solver is based on a cell-centered finite volume methodology on 3-D unstructured tetrahedral grids. In this algorithm, spatial discretization for the inviscid convective term is accomplished using an upwind scheme. A localized, second-order-accurate reconstruction is performed for the flow variables. Evolution in time is accomplished using an explicit three-stage Runge-Kutta method which has second order temporal accuracy. This is adapted for concurrent execution using another proven methodology based on concurrent graph abstraction. This solver operates on heterogeneous network architectures. These architectures may include a broad variety of UNIX workstations and PCs running Windows NT, symmetric multiprocessors and distributed-memory multi-computers. The unstructured grid is generated using commercial grid generation tools. The grid is automatically partitioned using a concurrent algorithm based on heat diffusion. This results in memory requirements that are inversely proportional to the number of processors. The solver uses automatic granularity control and resource management techniques both to balance load and communication requirements, and deal with differing memory constraints. These ideas are again based on heat diffusion. Results are subsequently combined for visualization and analysis using commercial CFD tools. Flow simulation results are demonstrated for a constant-section wing at subsonic, transonic, and supersonic conditions. These results are compared with experimental data and numerical results of other researchers. Performance studies are under way for a variety of network topologies.
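A minimal sketch of an explicit three-stage Runge-Kutta time step of the kind used above, applied to a model scalar ODE, follows. The stage coefficients (1/3, 1/2, 1) are a common choice and are assumed here; the abstract does not give the solver's actual coefficients.

```python
# Minimal sketch of an explicit three-stage Runge-Kutta update of the kind used
# for time marching in finite-volume solvers. The stage coefficients below are
# a common Jameson-style choice and are an assumption, not taken from the report.
import math

def residual(u):
    """Stand-in for the finite-volume residual; here the model ODE du/dt = -u."""
    return -u

def rk3_step(u, dt, alphas=(1.0 / 3.0, 1.0 / 2.0, 1.0)):
    """One multi-stage update: u_k = u_n + alpha_k * dt * R(u_{k-1})."""
    u_stage = u
    for a in alphas:
        u_stage = u + a * dt * residual(u_stage)
    return u_stage

u, dt = 1.0, 0.1
for _ in range(10):
    u = rk3_step(u, dt)
print(u, math.exp(-1.0))   # numerical vs exact solution at t = 1
```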
ERIC Educational Resources Information Center
Rivard, Mélina; Morin, Diane; Dionne, Carmen; Mello, Catherine; Gagnon, Marc-André
2015-01-01
This study documented the perceived needs of therapists, specialists, and managers who work with children with intellectual disabilities (ID) and/or autism spectrum disorders (ASD) and concurrent problem behaviours (PBs). Seventy-five respondents from specialized PB and early childhood programs within eight public rehabilitation centres were…
Functional language and data flow architectures
NASA Technical Reports Server (NTRS)
Ercegovac, M. D.; Patel, D. R.; Lang, T.
1983-01-01
This is a tutorial article about language and architecture approaches for highly concurrent computer systems based on the functional style of programming. The discussion concentrates on the basic aspects of functional languages, and sequencing models such as data-flow, demand-driven and reduction which are essential at the machine organization level. Several examples of highly concurrent machines are described.
A C++ Thread Package for Concurrent and Parallel Programming
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jie Chen; William Watson
1999-11-01
Recently thread libraries have become a common entity on various operating systems such as Unix, Windows NT and VxWorks. Those thread libraries offer significant performance enhancement by allowing applications to use multiple threads running either concurrently or in parallel on multiprocessors. However, the incompatibilities between native libraries introduce challenges for those who wish to develop portable applications.
Duality in non-linear programming
NASA Astrophysics Data System (ADS)
Jeyalakshmi, K.
2018-04-01
In this paper we consider duality and converse duality for a programming problem involving convex objective and constraint functions with finite dimensional range. We do not assume any constraint qualification. The dual is presented by reducing the problem to a standard Lagrange multiplier problem.
Watching TV and Food Intake: The Role of Content
Chapman, Colin D.; Nilsson, Victor C.; Thune, Hanna Å.; Cedernaes, Jonathan; Le Grevès, Madeleine; Hogenkamp, Pleunie S.; Benedict, Christian; Schiöth, Helgi B.
2014-01-01
Obesity is a serious and growing health concern worldwide. Watching television (TV) represents a condition during which many habitually eat, irrespective of hunger level. However, as of yet, little is known about how the content of television programs being watched differentially impacts concurrent eating behavior. In this study, eighteen normal-weight female students participated in three counter-balanced experimental conditions, including a ‘Boring’ TV condition (art lecture), an ‘Engaging’ TV condition (Swedish TV comedy series), and a no TV control condition during which participants read (a text on insects living in Sweden). Throughout each condition participants had access to both high-calorie (M&Ms) and low-calorie (grapes) snacks. We found that, relative to the Engaging TV condition, Boring TV encouraged excessive eating (+52% g, P = 0.009). Additionally, the Engaging TV condition actually resulted in significantly less concurrent intake relative to the control ‘Text’ condition (−35% g, P = 0.05). This intake was driven almost entirely by the healthy snack, grapes; however, this interaction did not reach significance (P = 0.07). Finally, there was a significant correlation between how bored participants were across all conditions, and their concurrent food intake (beta = 0.317, P = 0.02). Intake as measured by kcals was similarly patterned but did not reach significance. These results suggest that, for women, different TV programs elicit different levels of concurrent food intake, and that the degree to which a program is engaging (or alternately, boring) is related to that intake. Additionally, they suggest that emotional content (e.g. boring vs. engaging) may be more associated than modality (e.g. TV vs. text) with concurrent intake. PMID:24983245
Methodologies and systems for heterogeneous concurrent computing
NASA Technical Reports Server (NTRS)
Sunderam, V. S.
1994-01-01
Heterogeneous concurrent computing is gaining increasing acceptance as an alternative or complementary paradigm to multiprocessor-based parallel processing as well as to conventional supercomputing. While algorithmic and programming aspects of heterogeneous concurrent computing are similar to their parallel processing counterparts, system issues, partitioning and scheduling, and performance aspects are significantly different. In this paper, we discuss critical design and implementation issues in heterogeneous concurrent computing, and describe techniques for enhancing its effectiveness. In particular, we highlight the system level infrastructures that are required, aspects of parallel algorithm development that most affect performance, system capabilities and limitations, and tools and methodologies for effective computing in heterogeneous networked environments. We also present recent developments and experiences in the context of the PVM system and comment on ongoing and future work.
GLobal Integrated Design Environment (GLIDE): A Concurrent Engineering Application
NASA Technical Reports Server (NTRS)
McGuire, Melissa L.; Kunkel, Matthew R.; Smith, David A.
2010-01-01
The GLobal Integrated Design Environment (GLIDE) is a client-server software application purpose-built to mitigate issues associated with real time data sharing in concurrent engineering environments and to facilitate discipline-to-discipline interaction between multiple engineers and researchers. GLIDE is implemented in multiple programming languages utilizing standardized web protocols to enable secure parameter data sharing between engineers and researchers across the Internet in closed and/or widely distributed working environments. A well defined, HyperText Transfer Protocol (HTTP) based Application Programming Interface (API) to the GLIDE client/server environment enables users to interact with GLIDE, and each other, within common and familiar tools. One such common tool, Microsoft Excel (Microsoft Corporation), paired with its add-in API for GLIDE, is discussed in this paper. The top-level examples given demonstrate how this interface improves the efficiency of the design process of a concurrent engineering study while reducing potential errors associated with manually sharing information between study participants.
Teaching People to Manage Constraints: Effects on Creative Problem-Solving
ERIC Educational Resources Information Center
Peterson, David R.; Barrett, Jamie D.; Hester, Kimberly S.; Robledo, Issac C.; Hougen, Dean F.; Day, Eric A.; Mumford, Michael D.
2013-01-01
Constraints often inhibit creative problem-solving. This study examined the impact of training strategies for managing constraints on creative problem-solving. Undergraduates, 218 in all, were asked to work through 1 to 4 self-paced instructional programs focused on constraint management strategies. The quality, originality, and elegance of…
40 CFR 282.50 - Alabama State-Administered Program.
Code of Federal Regulations, 2010 CFR
2010-07-01
... and Recovery Act of 1976 (RCRA), as amended, 42 U.S.C. 6991 et seq. The State's program, as... Alabama underground storage tank program concurrently with this notice and it will be effective on March... to be effective on March 25, 1997. Copies of Alabama's underground storage tank program may be...
Advanced main combustion chamber program
NASA Technical Reports Server (NTRS)
1991-01-01
The topics presented are covered in viewgraph form and include the following: investment of low cost castings; usage of SSME program; usage of MSFC personnel for design effort; and usage of concurrent engineering techniques.
7 CFR 253.1 - General purpose and scope.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Distribution Program and the Food Stamp Program on Indian reservations when such concurrent operation is... Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE GENERAL REGULATIONS AND POLICIES-FOOD DISTRIBUTION ADMINISTRATION OF THE FOOD DISTRIBUTION PROGRAM...
7 CFR 253.1 - General purpose and scope.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Distribution Program and the Food Stamp Program on Indian reservations when such concurrent operation is... Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE GENERAL REGULATIONS AND POLICIES-FOOD DISTRIBUTION ADMINISTRATION OF THE FOOD DISTRIBUTION PROGRAM...
ERIC Educational Resources Information Center
Elsherif, Entisar
2017-01-01
This adaptive methodological inquiry explored the affordances and constraints of one TESOL teacher education program in Libya as a conflict zone. Data was collected through seven documents and 33 questionnaires. Questionnaires were gathered from the investigated program's teacher-educators, student-teachers, and graduates, who were in-service…
Concurrent Smalltalk on the Message-Driven Processor
1991-09-01
A language close to Concurrent Smalltalk, and having an almost identical name, is ConcurrentSmalltalk [39][40], developed independently by Yasuhiko Yokote and Mario Tokoro ("The Design and Implementation of ConcurrentSmalltalk," Proceedings of the 1986 Object-Oriented Programming Systems, Languages, and Applications Conference, September 1986).
Design and Implementation of a Threaded Search Engine for Tour Recommendation Systems
NASA Astrophysics Data System (ADS)
Lee, Junghoon; Park, Gyung-Leen; Ko, Jin-Hee; Shin, In-Hye; Kang, Mikyung
This paper implements a threaded search engine for the O(n!) search space and measures its performance, aiming to provide a responsive tour recommendation and scheduling service. As a preliminary step toward integrating POI ontology, a mobile object database, and personalization profiles for the development of new vehicular telematics services, this implementation can give a useful guideline for designing a challenging and computation-intensive vehicular telematics service. The implemented engine allocates subtrees to the respective threads and makes them run concurrently, exploiting the primitives provided by the operating system and the underlying multiprocessor architecture. It also makes it easy to add a variety of constraints; for example, the search tree is pruned when the cost of a partial allocation already exceeds the current best. The performance measurement results show that the service can run even on a low-power telematics device when the number of destinations does not exceed 15, with appropriate constraint processing.
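The sketch below mimics the subtree-per-thread decomposition with cost-based pruning on a toy distance matrix. It is not the authors' engine, and in CPython the threads mainly illustrate the work split rather than true parallel speedup; all distances are random stand-ins.

```python
# Sketch of the subtree-per-thread decomposition with cost-based pruning.
# Distances are random; this is an illustration, not the paper's engine.
import random
import threading

random.seed(0)
N = 8                                    # destinations 1..N-1, depot 0
dist = [[0 if i == j else random.randint(1, 20) for j in range(N)] for i in range(N)]

best = {"cost": float("inf"), "tour": None}
lock = threading.Lock()

def search(prefix, remaining, cost):
    if cost >= best["cost"]:             # prune: partial cost already exceeds the incumbent
        return
    if not remaining:
        with lock:
            if cost < best["cost"]:
                best["cost"], best["tour"] = cost, prefix
        return
    for nxt in remaining:
        search(prefix + [nxt], remaining - {nxt}, cost + dist[prefix[-1]][nxt])

def worker(first):
    """Explore the subtree whose first destination after the depot is `first`."""
    search([0, first], set(range(1, N)) - {first}, dist[0][first])

threads = [threading.Thread(target=worker, args=(f,)) for f in range(1, N)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(best)
```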
The relationship between hand hygiene and health care-associated infection: it’s complicated
McLaws, Mary-Louise
2015-01-01
The reasoning that improved hand hygiene compliance contributes to the prevention of health care-associated infections is widely accepted. It is also accepted that high hand hygiene alone cannot impact formidable risk factors, such as older age, immunosuppression, admission to the intensive care unit, longer length of stay, and indwelling devices. When hand hygiene interventions are concurrently undertaken with other routine or special preventive strategies, there is a potential for these concurrent strategies to confound the effect of the hand hygiene program. The result may be an overestimation of the hand hygiene intervention unless the design of the intervention or analysis controls the effect of the potential confounders. Other epidemiologic principles that may also impact the result of a hand hygiene program include failure to consider measurement error of the content of the hand hygiene program and the measurement error of compliance. Some epidemiological errors in hand hygiene programs aimed at reducing health care-associated infections are inherent and not easily controlled. Nevertheless, the inadvertent omission by authors to report these common epidemiological errors, including concurrent infection prevention strategies, suggests to readers that the effect of hand hygiene is greater than the sum of all infection prevention strategies. Worse still, this omission does not assist evidence-based practice. PMID:25678805
Young, Kaelin C; Kendall, Kristina L; Patterson, Kaitlyn M; Pandya, Priyanka D; Fairman, Ciaran M; Smith, Samuel W
2014-11-01
To assess changes in body composition, lumbar-spine bone mineral density (BMD), and rowing performance in college-level rowers over a competition season. Eleven Division I college rowers (mean ± SD 21.4 ± 3.7 y) completed 6 testing sessions throughout the course of their competition season. Testing included measurements of fat mass, bone-free lean mass (BFLM), body fat (%BF), lumbar-spine BMD, and 2000-m time-trial performance. After preseason testing, rowers participated in a periodized training program, with the addition of resistance training to the traditional aerobic-training program. Significant (P < .05) improvements in %BF, total mass, and BFLM were observed at midseason and postseason compared with preseason. Neither lumbar-spine BMD nor BMC significantly changed over the competitive season (P > .05). Finally, rowing performance (as measured by 2000-m time and average watts achieved) significantly improved at midseason and postseason compared with preseason. Our results highlight the efficacy of a seasonal concurrent training program serving to improve body composition and rowing performance, as measured by 2000-m times and average watts, among college-level rowers. Our findings offer practical applications for coaches and athletes looking to design a concurrent strength and aerobic training program to improve rowing performance across a season.
Dal Palù, Alessandro; Pontelli, Enrico; He, Jing; Lu, Yonggang
2007-01-01
The paper describes a novel framework, constructed using Constraint Logic Programming (CLP) and parallelism, to determine the association between parts of the primary sequence of a protein and alpha-helices extracted from 3D low-resolution descriptions of large protein complexes. The association is determined by extracting constraints from the 3D information, regarding length, relative position and connectivity of helices, and solving these constraints with the guidance of a secondary structure prediction algorithm. Parallelism is employed to enhance performance on large proteins. The framework provides a fast, inexpensive alternative to determine the exact tertiary structure of unknown proteins.
NASA Technical Reports Server (NTRS)
Young, Katherine C.; Sobieszczanski-Sobieski, Jaroslaw
1988-01-01
This project has two objectives. The first is to determine whether linear programming techniques can improve performance, relative to the feasible directions algorithm, when handling design optimization problems with a large number of design variables and constraints. The second is to determine whether using the Kreisselmeier-Steinhauser (KS) function to replace the constraints with one constraint will reduce the cost of the total optimization. Comparisons are made using solutions obtained with linear and non-linear methods. The results indicate that there is no cost saving from using the linear method or from using the KS function to replace constraints.
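For reference, a minimal sketch of the Kreisselmeier-Steinhauser aggregation of many constraints g_i(x) ≤ 0 into a single smooth constraint is given below; the overflow-safe form and the rho values are standard choices, not taken from the report.

```python
# Sketch of the Kreisselmeier-Steinhauser (KS) aggregation that folds many
# constraints g_i(x) <= 0 into one smooth constraint KS(g) <= 0.
# The rho values and example constraint values are illustrative assumptions.
import math

def ks(g, rho=50.0):
    """KS envelope: a smooth, conservative approximation of max(g)."""
    gmax = max(g)
    return gmax + math.log(sum(math.exp(rho * (gi - gmax)) for gi in g)) / rho

g = [-0.3, -0.05, -0.5, -0.02]           # example constraint values (all feasible)
for rho in (5.0, 50.0, 500.0):
    print(rho, ks(g, rho))               # tightens toward max(g) = -0.02 as rho grows
```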
ToonTalk(TM)--An Animated Programming Environment for Children.
ERIC Educational Resources Information Center
Kahn, Ken
This paper describes ToonTalk, a general-purpose concurrent programming system in which the source code is animated and the programming environment is a video game. The design objectives of ToonTalk were to create a self-teaching programming system for children that was also a very powerful and flexible programming tool. A keyboard can be used for…
Constraint Programming to Solve Maximal Density Still Life
NASA Astrophysics Data System (ADS)
Chu, Geoffrey; Petrie, Karen Elizabeth; Yorke-Smith, Neil
The Maximum Density Still Life problem fills a finite Game of Life board with a stable pattern of cells that has as many live cells as possible. Although simple to state, this problem is computationally challenging for any but the smallest sizes of board. It is especially difficult to prove that the maximum number of live cells has been found. Various approaches have been employed. The most successful are approaches based on Constraint Programming (CP). We describe the Maximum Density Still Life problem, introduce the concept of constraint programming, give an overview of how the problem can be modelled and solved with CP, and report on best-known results for the problem.
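A brute-force stand-in for the CP model (feasible only for tiny boards) makes the problem statement concrete: a pattern is a still life when every live cell has two or three live neighbours and no dead cell has exactly three, with cells outside the board taken as dead.

```python
# Brute-force illustration of the problem definition, not the CP approach;
# enumeration is feasible only for tiny boards.
import itertools

def is_still_life(grid, n):
    for r in range(n):
        for c in range(n):
            live = sum(grid[r + dr][c + dc]
                       for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                       if (dr or dc) and 0 <= r + dr < n and 0 <= c + dc < n)
            if grid[r][c] and live not in (2, 3):
                return False
            if not grid[r][c] and live == 3:
                return False
    return True

def max_density_still_life(n):
    best, best_grid = -1, None
    for bits in itertools.product((0, 1), repeat=n * n):
        grid = [list(bits[i * n:(i + 1) * n]) for i in range(n)]
        if sum(bits) > best and is_still_life(grid, n):
            best, best_grid = sum(bits), grid
    return best, best_grid

print(max_density_still_life(3))   # the 3x3 optimum is 6 live cells
```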
ERIC Educational Resources Information Center
Davis, Elisabeth; Smither, Cameron; Zhu, Bo; Stephan, Jennifer
2017-01-01
Acceleration programs are academically challenging courses in which high school students can simultaneously earn credit toward a high school diploma and a postsecondary degree (dual credit). These programs include Advanced Placement courses, concurrent-enrollment courses, Postsecondary Enrollment Options courses (a dual-enrollment program in…
A Mars Exploration Discovery Program
NASA Astrophysics Data System (ADS)
Hansen, C. J.; Paige, D. A.
2000-07-01
The Mars Exploration Program should consider following the Discovery Program model. In the Discovery Program a team of scientists led by a PI develop the science goals of their mission, decide what payload achieves the necessary measurements most effectively, and then choose a spacecraft with the capabilities needed to carry the payload to the desired target body. The primary constraints associated with the Discovery missions are time and money. The proposer must convince reviewers that their mission has scientific merit and is feasible. Every Announcement of Opportunity has resulted in a collection of creative ideas that fit within advertised constraints. Following this model, a "Mars Discovery Program" would issue an Announcement of Opportunity for each launch opportunity with schedule constraints dictated by the launch window and fiscal constraints in accord with the program budget. All else would be left to the proposer to choose, based on the science the team wants to accomplish, consistent with the program theme of "Life, Climate and Resources". A proposer could propose a lander, an orbiter, a fleet of SCOUT vehicles or penetrators, an airplane, a balloon mission, a large rover, a small rover, etc. depending on what made the most sense for the science investigation and payload. As in the Discovery program, overall feasibility relative to cost, schedule and technology readiness would be evaluated and be part of the selection process.
A Mars Exploration Discovery Program
NASA Technical Reports Server (NTRS)
Hansen, C. J.; Paige, D. A.
2000-01-01
The Mars Exploration Program should consider following the Discovery Program model. In the Discovery Program a team of scientists led by a PI develop the science goals of their mission, decide what payload achieves the necessary measurements most effectively, and then choose a spacecraft with the capabilities needed to carry the payload to the desired target body. The primary constraints associated with the Discovery missions are time and money. The proposer must convince reviewers that their mission has scientific merit and is feasible. Every Announcement of Opportunity has resulted in a collection of creative ideas that fit within advertised constraints. Following this model, a "Mars Discovery Program" would issue an Announcement of Opportunity for each launch opportunity with schedule constraints dictated by the launch window and fiscal constraints in accord with the program budget. All else would be left to the proposer to choose, based on the science the team wants to accomplish, consistent with the program theme of "Life, Climate and Resources". A proposer could propose a lander, an orbiter, a fleet of SCOUT vehicles or penetrators, an airplane, a balloon mission, a large rover, a small rover, etc. depending on what made the most sense for the science investigation and payload. As in the Discovery program, overall feasibility relative to cost, schedule and technology readiness would be evaluated and be part of the selection process.
Multidisciplinary Concurrent Design Optimization via the Internet
NASA Technical Reports Server (NTRS)
Woodard, Stanley E.; Kelkar, Atul G.; Koganti, Gopichand
2001-01-01
A methodology is presented which uses commercial design and analysis software and the Internet to perform concurrent multidisciplinary optimization. The methodology provides a means to develop multidisciplinary designs without requiring that all software be accessible from the same local network. The procedures are amenable to design and development teams whose members, expertise and respective software are not geographically located together. This methodology facilitates multidisciplinary teams working concurrently on a design problem of common interest. Partition of design software to different machines allows each constituent software to be used on the machine that provides the most economy and efficiency. The methodology is demonstrated on the concurrent design of a spacecraft structure and attitude control system. Results are compared to those derived from performing the design with an autonomous FORTRAN program.
Symbolic PathFinder: Symbolic Execution of Java Bytecode
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.; Rungta, Neha
2010-01-01
Symbolic Pathfinder (SPF) combines symbolic execution with model checking and constraint solving for automated test case generation and error detection in Java programs with unspecified inputs. In this tool, programs are executed on symbolic inputs representing multiple concrete inputs. Values of variables are represented as constraints generated from the analysis of Java bytecode. The constraints are solved using off-the-shelf solvers to generate test inputs guaranteed to achieve complex coverage criteria. SPF has been used successfully at NASA, in academia, and in industry.
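The toy below illustrates the underlying idea: path conditions are collected per branch and then solved to produce one concrete test input per path, with brute force over a tiny domain standing in for an off-the-shelf solver. It is not the SPF tool, and the example method and its branches are hypothetical.

```python
# Toy illustration of the idea behind symbolic execution (not the SPF tool):
# each feasible path yields a conjunction of constraints on the inputs, and a
# "solver" (here brute force over a small domain) produces one test input per path.
from itertools import product

# Path conditions of a hypothetical method f(x, y) with two nested branches.
paths = {
    "x>y and x+y>10":      lambda x, y: x > y and x + y > 10,
    "x>y and not x+y>10":  lambda x, y: x > y and not x + y > 10,
    "not x>y":             lambda x, y: not x > y,
}

def solve(constraint, domain=range(-10, 11)):
    """Return one witness (x, y) satisfying the path condition, if any."""
    return next(((x, y) for x, y in product(domain, domain) if constraint(x, y)), None)

for name, cond in paths.items():
    print(name, "->", solve(cond))       # one generated test input per path
```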
Additive manufacturing: Toward holistic design
Jared, Bradley H.; Aguilo, Miguel A.; Beghini, Lauren L.; ...
2017-03-18
Here, additive manufacturing offers unprecedented opportunities to design complex structures optimized for performance envelopes inaccessible under conventional manufacturing constraints. Additive processes also promote realization of engineered materials with microstructures and properties that are impossible via traditional synthesis techniques. Enthused by these capabilities, optimization design tools have experienced a recent revival. The current capabilities of additive processes and optimization tools are summarized briefly, while an emerging opportunity is discussed to achieve a holistic design paradigm whereby computational tools are integrated with stochastic process and material awareness to enable the concurrent optimization of design topologies, material constructs and fabrication processes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jared, Bradley H.; Aguilo, Miguel A.; Beghini, Lauren L.
Here, additive manufacturing offers unprecedented opportunities to design complex structures optimized for performance envelopes inaccessible under conventional manufacturing constraints. Additive processes also promote realization of engineered materials with microstructures and properties that are impossible via traditional synthesis techniques. Enthused by these capabilities, optimization design tools have experienced a recent revival. The current capabilities of additive processes and optimization tools are summarized briefly, while an emerging opportunity is discussed to achieve a holistic design paradigm whereby computational tools are integrated with stochastic process and material awareness to enable the concurrent optimization of design topologies, material constructs and fabrication processes.
The Development and Implementation of Outdoor-Based Secondary School Integrated Programs
ERIC Educational Resources Information Center
Comishin, Kelly; Dyment, Janet E.; Potter, Tom G.; Russell, Constance L.
2004-01-01
Four teachers share the challenges they faced when creating and running outdoor-focused secondary school integrated programs in British Columbia, Canada. The five most common challenges were funding constraints, insufficient support from administrators and colleagues, time constraints, liability and risk management, and inadequate skills and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hempling, Scott; Elefant, Carolyn; Cory, Karlynn
2010-01-01
This report details how state feed-in tariff (FIT) programs can be legally implemented and how they can comply with federal requirements. The report describes the federal constraints on FIT programs and identifies legal methods that are free of those constraints.
The Nature of Credit Constraints and Human Capital. NBER Working Paper No. 13912
ERIC Educational Resources Information Center
Lochner, Lance J.; Monge-Naranjo, Alexander
2008-01-01
This paper studies the nature and impact of credit constraints in the market for human capital. We derive endogenous constraints from the design of government student loan programs and from the limited repayment incentives in private lending markets. These constraints imply cross-sectional patterns for schooling, ability, and family income that…
Zhao, Yingfeng; Liu, Sanyang
2016-01-01
We present a practical branch and bound algorithm for globally solving the generalized linear multiplicative programming problem with multiplicative constraints. To solve the problem, a relaxation programming problem, which is equivalent to a linear program, is proposed by utilizing a new two-phase relaxation technique. In the algorithm, lower and upper bounds are simultaneously obtained by solving some linear relaxation programming problems. Global convergence has been proved, and results of some sample examples and a small random experiment show that the proposed algorithm is feasible and efficient.
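The sketch below shows a branch-and-bound loop of this general shape for minimising a product of two affine functions over a box, but it substitutes simple interval bounds on each factor for the authors' two-phase linear relaxation; the coefficients and box are invented.

```python
# Simplified branch-and-bound sketch: interval bounds on each affine factor
# stand in for the paper's two-phase linear relaxation, so only the
# bounding/branching loop is illustrated. All coefficients are made up.
import heapq

c1, d1 = (1.0, 2.0), -1.0            # f1(x) = x1 + 2*x2 - 1
c2, d2 = (3.0, -1.0), 4.0            # f2(x) = 3*x1 - x2 + 4
box_lo, box_hi = [0.0, 0.0], [2.0, 2.0]

def affine_range(c, d, lo, hi):
    """Exact interval of c.x + d over the box [lo, hi]."""
    lo_val = d + sum(ci * (l if ci >= 0 else h) for ci, l, h in zip(c, lo, hi))
    hi_val = d + sum(ci * (h if ci >= 0 else l) for ci, l, h in zip(c, lo, hi))
    return lo_val, hi_val

def lower_bound(lo, hi):
    a1, b1 = affine_range(c1, d1, lo, hi)
    a2, b2 = affine_range(c2, d2, lo, hi)
    return min(a1 * a2, a1 * b2, b1 * a2, b1 * b2)

def objective(x):
    return (d1 + sum(c * xi for c, xi in zip(c1, x))) * (d2 + sum(c * xi for c, xi in zip(c2, x)))

best = objective(box_hi)
heap = [(lower_bound(box_lo, box_hi), box_lo, box_hi)]
while heap:
    lb, lo, hi = heapq.heappop(heap)
    if lb >= best - 1e-6:                            # no remaining box can improve
        break
    mid = [(a + b) / 2 for a, b in zip(lo, hi)]
    best = min(best, objective(mid))                 # incumbent from the box centre
    k = max(range(2), key=lambda i: hi[i] - lo[i])   # branch on the longest edge
    hi_left, lo_right = hi[:], lo[:]
    hi_left[k], lo_right[k] = mid[k], mid[k]
    heapq.heappush(heap, (lower_bound(lo, hi_left), lo, hi_left))
    heapq.heappush(heap, (lower_bound(lo_right, hi), lo_right, hi))

print(round(best, 4))                # global minimum is -4, attained at x = (0, 0)
```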
ERIC Educational Resources Information Center
London, David T.
Data from the stepwise multiple regression of four educational cognitive style predictor sets on each of six academic competence criteria were used to define the concurrent validity of Hill's educational cognitive style model. The purpose was to determine how appropriate it may be to use this model as a prototype for successful academic programs…
Formal Semanol Specification of Ada.
1980-09-01
concurrent task modeling involved very little change to the SEMANOL metalanguage. A primitive capable of initiating concurrent SEMANOL task processors (i.e., #CO-COMPUTE) and two primitives corresponding to integer semaphores (i.e., #P and #V) were all that were required. In addition, these changes... synchronization techniques and choice of correct unblocking alternatives. We should note that it had been our original intention to use the Ada Translator program
Methods for design and evaluation of integrated hardware-software systems for concurrent computation
NASA Technical Reports Server (NTRS)
Pratt, T. W.
1985-01-01
Research activities and publications are briefly summarized. The major tasks reviewed are: (1) VAX implementation of the PISCES parallel programming environment; (2) Apollo workstation network implementation of the PISCES environment; (3) FLEX implementation of the PISCES environment; (4) sparse matrix iterative solver in PISCES Fortran; (5) image processing application of PISCES; and (6) a formal model of concurrent computation being developed.
Lotfi, Younes; Rezazadeh, Nima; Moossavi, Abdollah; Haghgoo, Hojjat Allah; Rostami, Reza; Bakhshi, Enayatollah; Badfar, Faride; Moghadam, Sedigheh Farokhi; Sadeghi-Firoozabadi, Vahid; Khodabandelou, Yousef
2017-12-01
Balance function has been reported to be worse in ADHD children than in their normal peers. The present study hypothesized that an improvement in balance could result in better cognitive performance in children with ADHD and concurrent vestibular impairment. This study was designed to evaluate the effects of comprehensive vestibular rehabilitation therapy on the cognitive performance of children with combined ADHD and concurrent vestibular impairment. Subjects were 54 children with combined ADHD. Those with severe vestibular impairment (n=33) were randomly assigned to two groups that were matched for age. A rehabilitation program comprising overall balance and gait, postural stability, and eye movement exercises was assigned to the intervention group. Subjects in the control group received no intervention for the same time period. Intervention was administered twice weekly for 12 weeks. Choice reaction time (CRT) and spatial working memory (SWM) subtypes of the Cambridge Neuropsychological Test Automated Battery (CANTAB) were completed pre- and post-intervention to determine the effects of vestibular rehabilitation on the cognitive performance of the subjects with ADHD and concurrent vestibular impairment. ANCOVA was used to compare the test results of the intervention and control group post-test. The percentage of correct trial scores for the CRT achieved by the intervention group post-test increased significantly compared to those of the control group (p=0.029). The CRT mean latency scores were significantly prolonged in the intervention group following intervention (p=0.007) compared to the control group. No significant change was found in spatial functioning of the subjects with ADHD following 12 weeks of intervention (p>0.05). The study highlights the effect of vestibular rehabilitation on the cognitive performance of children with combined ADHD and concurrent vestibular disorder. The findings indicate that attention can be affected by early vestibular rehabilitation, which is a basic program for improving memory function in such children. Appropriate vestibular rehabilitation programs based on the type of vestibular impairment can improve cognitive ability to some extent in children with ADHD and concurrent vestibular impairment.
Natural environment application for NASP-X-30 design and mission planning
NASA Technical Reports Server (NTRS)
Johnson, D. L.; Hill, C. K.; Brown, S. C.; Batts, G. W.
1993-01-01
The NASA/MSFC Mission Analysis Program has recently been utilized in various National Aero-Space Plane (NASP) mission and operational planning scenarios. This paper focuses on presenting various atmospheric constraint statistics based on assumed NASP mission phases using established natural environment design and parametric threshold values. Probabilities of no-go are calculated using atmospheric parameters such as temperature, humidity, density altitude, peak/steady-state winds, cloud cover/ceiling, thunderstorms, and precipitation. The program, although developed to evaluate test or operational missions after flight constraints have been established, can provide valuable information in the design phase of the NASP X-30 program. Inputting the design values as flight constraints, the Mission Analysis Program returns the probability of no-go, or launch delay, by hour and by month. This output tells the X-30 program manager whether the design values are stringent enough to meet the required test flight schedules.
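A back-of-the-envelope sketch of a no-go probability calculation is given below. The per-constraint exceedance probabilities are invented, and treating the constraints as independent is a simplifying assumption that the actual program need not make.

```python
# Rough sketch of a launch-availability calculation. The exceedance
# probabilities are made up, and independence of constraints is an assumption.
constraints = {
    "peak wind > design limit":        0.08,
    "cloud ceiling below threshold":   0.15,
    "thunderstorm within radius":      0.05,
    "density altitude out of bounds":  0.02,
}

p_go = 1.0
for name, p_exceed in constraints.items():
    p_go *= (1.0 - p_exceed)

print(f"P(no-go) = {1.0 - p_go:.3f}")   # probability at least one constraint is violated
```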
Multi-Objective Programming for Lot-Sizing with Quantity Discount
NASA Astrophysics Data System (ADS)
Kang, He-Yau; Lee, Amy H. I.; Lai, Chun-Mei; Kang, Mei-Sung
2011-11-01
Multi-objective programming (MOP) is one of the popular methods for decision making in a complex environment. In a MOP, decision makers try to optimize two or more objectives simultaneously under various constraints. A complete optimal solution seldom exists, and a Pareto-optimal solution is usually used. Some methods, such as the weighting method, which assigns priorities to the objectives and sets aspiration levels for the objectives, are used to derive a compromise solution. The ɛ-constraint method is a modified weighting method. One of the objective functions is optimized while the other objective functions are treated as constraints and are incorporated in the constraint part of the model. This research considers a stochastic lot-sizing problem with multiple suppliers and quantity discounts. The model is then transformed into a mixed integer programming (MIP) model based on the ɛ-constraint method. An illustrative example is used to illustrate the practicality of the proposed model. The results demonstrate that the model is an effective and accurate tool for determining the replenishment of a manufacturer from multiple suppliers for multi-periods.
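A toy two-objective LP below shows the mechanics of the ɛ-constraint method: one objective is optimised while a bound on the other is swept. It is not the paper's lot-sizing MIP, the coefficients are invented, and SciPy is assumed.

```python
# Toy illustration of the epsilon-constraint method, not the paper's model.
# Requires SciPy; all coefficients are invented.
import numpy as np
from scipy.optimize import linprog

# Objectives over x = (x1, x2), both to be minimised:
f1 = np.array([2.0, 3.0])        # e.g. purchasing cost
f2 = np.array([1.0, -1.0])       # e.g. negative service level

# Feasible region: x1 + x2 >= 4, 0 <= xi <= 5  (expressed as A_ub x <= b_ub)
A_ub = np.array([[-1.0, -1.0]])
b_ub = np.array([-4.0])
bounds = [(0, 5), (0, 5)]

for eps in (-3.0, -1.0, 1.0):    # bound imposed on the second objective: f2 . x <= eps
    res = linprog(f1,
                  A_ub=np.vstack([A_ub, f2]),
                  b_ub=np.hstack([b_ub, eps]),
                  bounds=bounds)
    if res.success:
        print(f"eps={eps:5.1f}  x={res.x.round(2)}  f1={res.fun:.2f}  f2={f2 @ res.x:.2f}")
```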
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahrens, J.P.; Shapiro, L.G.; Tanimoto, S.L.
1997-04-01
This paper describes a computing environment which supports computer-based scientific research work. Key features include support for automatic distributed scheduling and execution and computer-based scientific experimentation. A new flexible and extensible scheduling technique that is responsive to a user's scheduling constraints, such as the ordering of program results and the specification of task assignments and processor utilization levels, is presented. An easy-to-use constraint language for specifying scheduling constraints, based on the relational database query language SQL, is described along with a search-based algorithm for fulfilling these constraints. A set of performance studies show that the environment can schedule and execute program graphs on a network of workstations as the user requests. A method for automatically generating computer-based scientific experiments is described. Experiments provide a concise method of specifying a large collection of parameterized program executions. The environment achieved significant speedups when executing experiments; for a large collection of scientific experiments an average speedup of 3.4 on an average of 5.5 scheduled processors was obtained.
NASA Technical Reports Server (NTRS)
Yan, Jerry C.
1987-01-01
In concurrent systems, a major responsibility of the resource management system is to decide how the application program is to be mapped onto the multi-processor. Instead of using abstract program and machine models, a generate-and-test framework known as 'post-game analysis', based on data gathered during program execution, is proposed. Each iteration consists of (1) (a simulation of) an execution of the program; (2) analysis of the data gathered; and (3) the proposal of a new mapping that would have a smaller execution time. Heuristics are applied to predict execution time changes in response to small perturbations applied to the current mapping. An initial experiment was carried out using simple strategies on 'pipeline-like' applications. The results obtained from four simple strategies demonstrated that for this kind of application, even simple strategies can produce acceptable speed-up with a small number of iterations.
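A highly simplified stand-in for the generate-and-test loop is sketched below: "execute" a mapping (here, just compute its makespan), analyse the load data, and accept a small perturbation only if it is predicted to reduce execution time. The task costs and the single move-one-task heuristic are illustrative, not the paper's strategies.

```python
# Simplified stand-in for the post-game-analysis loop; task costs and the
# single heuristic (move one task off the busiest processor) are illustrative.
import random

random.seed(1)
task_cost = [random.randint(1, 9) for _ in range(12)]
n_proc = 4
mapping = [t % n_proc for t in range(len(task_cost))]      # initial round-robin mapping

def makespan(mapping):
    loads = [0] * n_proc
    for task, proc in enumerate(mapping):
        loads[proc] += task_cost[task]
    return max(loads), loads

for iteration in range(20):
    span, loads = makespan(mapping)
    busiest = loads.index(max(loads))
    lightest = loads.index(min(loads))
    candidate = next(t for t, p in enumerate(mapping) if p == busiest)
    trial = mapping[:]
    trial[candidate] = lightest
    if makespan(trial)[0] < span:                          # keep only predicted improvements
        mapping = trial
    else:
        break
print(makespan(mapping)[0], mapping)
```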
Programming languages for circuit design.
Pedersen, Michael; Yordanov, Boyan
2015-01-01
This chapter provides an overview of a programming language for Genetic Engineering of Cells (GEC). A GEC program specifies a genetic circuit at a high level of abstraction through constraints on otherwise unspecified DNA parts. The GEC compiler then selects parts which satisfy the constraints from a given parts database. GEC further provides more conventional programming language constructs for abstraction, e.g., through modularity. The GEC language and compiler is available through a Web tool which also provides functionality, e.g., for simulation of designed circuits.
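A made-up miniature of the part-selection step is sketched below: abstract parts plus constraints on their properties, and a "compiler" that picks concrete parts from a database satisfying them. The part names, properties, and constraints are hypothetical and do not reflect GEC syntax or its parts database.

```python
# Hypothetical miniature of constraint-based part selection; names, properties,
# and constraints are invented and are not GEC syntax.
from itertools import product

parts_db = {
    "promoter": [{"name": "P1", "strength": 0.2}, {"name": "P2", "strength": 0.9}],
    "rbs":      [{"name": "R1", "efficiency": 0.5}, {"name": "R2", "efficiency": 0.95}],
    "cds":      [{"name": "gfp"}, {"name": "lacI"}],
}

constraints = [
    lambda prom, rbs, cds: prom["strength"] >= 0.5,        # require a strong promoter
    lambda prom, rbs, cds: rbs["efficiency"] >= 0.9,       # and an efficient RBS
    lambda prom, rbs, cds: cds["name"] == "gfp",           # reporter output
]

solutions = [
    (prom["name"], rbs["name"], cds["name"])
    for prom, rbs, cds in product(parts_db["promoter"], parts_db["rbs"], parts_db["cds"])
    if all(c(prom, rbs, cds) for c in constraints)
]
print(solutions)    # [('P2', 'R2', 'gfp')]
```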
ERIC Educational Resources Information Center
Gong, Wen; Rojewski, Jay W.
2015-01-01
A model describing the development of curriculum and coursework for dual certification programs in two-year postsecondary vocational education institutions throughout China is described. Dual certification programs allow students to obtain an academic diploma and a national vocational qualification certificate concurrently. Chinese educators and…
Maximizing the Use of Electronic Individualized Education Program Software
ERIC Educational Resources Information Center
More, Cori M.; Hart, Juliet E.
2013-01-01
With the growing use of technology in today's schools, electronic IEP programs are being adopted by many school districts around the nation as part of special education service delivery. These programs provide a useful technology that can facilitate compliance with IDEA requirements in IEP development while concurrently lessening teacher paperwork…
NASA Technical Reports Server (NTRS)
Horvath, Joan C.; Alkalaj, Leon J.; Schneider, Karl M.; Amador, Arthur V.; Spitale, Joseph N.
1993-01-01
Robotic spacecraft are controlled by sets of commands called 'sequences.' These sequences must be checked against mission constraints. Making our existing constraint checking program faster would enable new capabilities in our uplink process. Therefore, we are rewriting this program to run on a parallel computer. To do so, we had to determine how to run constraint-checking algorithms in parallel and create a new method of specifying spacecraft models and constraints. This new specification gives us a means of representing flight systems and their predicted response to commands which could be used in a variety of applications throughout the command process, particularly during anomaly or high-activity operations. This commonality could reduce operations cost and risk for future complex missions. Lessons learned in applying some parts of this system to the TOPEX/Poseidon mission will be described.
NASA Astrophysics Data System (ADS)
Lin, Chuang; Wang, Binghui; Jiang, Ning; Farina, Dario
2018-04-01
Objective. This paper proposes a novel simultaneous and proportional multiple degree of freedom (DOF) myoelectric control method for active prostheses. Approach. The approach is based on non-negative matrix factorization (NMF) of surface EMG signals with the inclusion of sparseness constraints. By applying a sparseness constraint to the control signal matrix, it is possible to extract the basis information from arbitrary movements (quasi-unsupervised approach) for multiple DOFs concurrently. Main Results. In online testing based on target hitting, able-bodied subjects reached a greater throughput (TP) when using sparse NMF (SNMF) than with classic NMF or with linear regression (LR). Accordingly, the completion time (CT) was shorter for SNMF than NMF or LR. The same observations were made in two patients with unilateral limb deficiencies. Significance. The addition of sparseness constraints to NMF allows for a quasi-unsupervised approach to myoelectric control with superior results with respect to previous methods for the simultaneous and proportional control of multi-DOF. The proposed factorization algorithm allows robust simultaneous and proportional control, is superior to previous supervised algorithms, and, because of minimal supervision, paves the way to online adaptation in myoelectric control.
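The sketch below is a generic L1-penalised NMF with multiplicative updates, illustrating how a sparseness term on the coefficient matrix can be incorporated. It is not the paper's exact sparseness constraint or EMG pipeline, and the data are random stand-ins for EMG envelopes.

```python
# Generic sketch of NMF with an L1 sparseness penalty on the coefficients,
# using multiplicative updates; not the paper's algorithm or data.
import numpy as np

rng = np.random.default_rng(0)
V = np.abs(rng.normal(size=(8, 200)))      # 8 "channels" x 200 time samples (random stand-in)
k, lam, eps = 3, 0.1, 1e-9                 # components, sparsity weight, numerical floor

W = np.abs(rng.normal(size=(V.shape[0], k)))
H = np.abs(rng.normal(size=(k, V.shape[1])))

for _ in range(500):
    # minimise ||V - W H||_F^2 + lam * sum(H), keeping W and H non-negative
    H *= (W.T @ V) / (W.T @ W @ H + lam + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)

print("reconstruction error:", np.linalg.norm(V - W @ H))
print("fraction of near-zero coefficients:", np.mean(H < 1e-3))
```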
WINDOWAC (Wing Design Optimization With Aeroelastic Constraints): Program manual
NASA Technical Reports Server (NTRS)
Haftka, R. T.; Starnes, J. H., Jr.
1974-01-01
User and programmer documentation for the WIDOWAC programs is given. WIDOWAC may be used for the design of minimum mass wing structures subjected to flutter, strength, and minimum gage constraints. The wing structure is modeled by finite elements, flutter conditions may be both subsonic and supersonic, and mathematical programming methods are used for the optimization procedure. The user documentation gives general directions on how the programs may be used and describes their limitations; in addition, program input and output are described, and example problems are presented. A discussion of computational algorithms and flow charts of the WIDOWAC programs and major subroutines is also given.
Generalized Symbolic Execution for Model Checking and Testing
NASA Technical Reports Server (NTRS)
Khurshid, Sarfraz; Pasareanu, Corina; Visser, Willem; Kofmeyer, David (Technical Monitor)
2003-01-01
Modern software systems, which often are concurrent and manipulate complex data structures, must be extremely reliable. We present a novel framework based on symbolic execution, for automated checking of such systems. We provide a two-fold generalization of traditional symbolic execution based approaches: one, we define a program instrumentation, which enables standard model checkers to perform symbolic execution; two, we give a novel symbolic execution algorithm that handles dynamically allocated structures (e.g., lists and trees), method preconditions (e.g., acyclicity of lists), data (e.g., integers and strings) and concurrency. The program instrumentation enables a model checker to automatically explore program heap configurations (using a systematic treatment of aliasing) and manipulate logical formulae on program data values (using a decision procedure). We illustrate two applications of our framework: checking correctness of multi-threaded programs that take inputs from unbounded domains with complex structure and generation of non-isomorphic test inputs that satisfy a testing criterion. Our implementation for Java uses the Java PathFinder model checker.
Universal programming interface with concurrent access
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alferov, Oleg
2004-10-07
There exist a number of devices with a positioning nature of operation, such as mechanical linear stages, temperature controllers, or filter wheels with discrete states, and most of them have different programming interfaces. The Universal Positioner software handles all of them with a single approach: a particular hardware driver is created from a template by translating the actual commands used by the hardware to and from the universal programming interface. The software contains the universal API module itself, the demo simulation of hardware, and the front-end programs to help developers write their own software drivers, along with example drivers for actual hardware controllers. The software allows user application programs to call devices simultaneously without race conditions (multitasking and concurrent access). The template suggested in this package permits developers to integrate various devices easily into their applications using the same API. The drivers can be stacked; i.e., they can call each other via the same interface.
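A hypothetical sketch of the single-interface idea with lock-protected concurrent access follows; the class and method names are illustrative, not the package's actual API.

```python
# Hypothetical sketch of one positioning-style interface fronting different
# devices, with a lock so concurrent callers do not race. Not the actual API.
import threading

class PositionerDriver:
    """Template driver: translate universal calls to device-specific commands."""

    def __init__(self, name, to_device):
        self.name = name
        self.to_device = to_device          # callable that "sends" a device command
        self._lock = threading.Lock()
        self._position = 0.0

    def move_to(self, target):
        with self._lock:                    # serialise concurrent access
            self.to_device(f"{self.name}: MOVE {target}")
            self._position = target

    def position(self):
        with self._lock:
            return self._position

# A linear stage and a filter wheel behind the same interface.
stage = PositionerDriver("stage", to_device=print)
wheel = PositionerDriver("filterwheel", to_device=print)

threads = [threading.Thread(target=stage.move_to, args=(x,)) for x in (1.0, 2.5)]
threads += [threading.Thread(target=wheel.move_to, args=(n,)) for n in (3, 5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(stage.position(), wheel.position())
```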
NASA Astrophysics Data System (ADS)
Tremoulet, P. C.
The author describes a number of maintenance improvements in the Fiber Optic Cable System (FOCS). They were achieved during a production phase pilot concurrent engineering program. Listed in order of importance (saved maintenance time and material) by maintenance level, they are: (1) organizational level: improved fiber optic converter (FOC) BITE; (2) Intermediate level: reduced FOC adjustments from 20 to 2; partitioned FOC into electrical and optical parts; developed cost-effective fault isolation test points and test using standard test equipment; improved FOC chassis to have lower mean time to repair; and (3) depot level: revised test requirements documents (TRDs) for common automatic test equipment and incorporated ATE testability into circuit and assemblies and application-specific integrated circuits. These improvements met this contract's tailored logistics MIL-STD 1388-1A requirements of monitoring the design for supportability and determining the most effective support equipment. Important logistics lessons learned while accomplishing these maintainability and supportability improvements on the pilot concurrent engineering program are also discussed.
HIV, STD, and hepatitis risk to primary female partners of men being released from prison.
Grinstead, Olga A; Faigeles, Bonnie; Comfort, Megan; Seal, David; Nealey-Moore, Jill; Belcher, Lisa; Morrow, Kathleen
2005-01-01
Incarcerated men in the US are at increased risk for HIV, STDs and hepatitis, and many men leaving prison have unprotected sex with a primary female partner immediately following release from prison. This paper addresses risk to the primary female partners of men being released from prison (N = 106) by examining the prevalence of men's concurrent unprotected sex with other partners or needle sharing prior to and following release from prison (concurrent risk). Rates of concurrent risk were 46% prior to incarceration, 18% one month post release, and 24% three months post release. Multivariate analysis showed concurrent risk was significantly associated with having a female partner who had one or more HIV/STD risk factors and having a history of injection drug use. Findings demonstrate need for prevention programs for incarcerated men and their female partners.
State-Level Reforms That Support College-Level Program Changes in North Carolina
ERIC Educational Resources Information Center
Bowling, R. Edward; Morrissey, Sharon; Fouts, George M.
2014-01-01
This chapter describes the concurrent reforms occurring in North Carolina--both campus-level changes focused on such issues as developing structured programs of study and state-level reforms aimed at supporting the campus efforts.
De Carvalho, Irene Stuart Torrié; Granfeldt, Yvonne; Dejmek, Petr; Håkansson, Andreas
2015-03-01
Linear programming has been used extensively as a tool for nutritional recommendations. Extending the methodology to food formulation presents new challenges, since not all combinations of nutritious ingredients will produce an acceptable food. Such an extension would also help in implementation and in ensuring the feasibility of the suggested recommendations. To extend the previously used linear programming methodology from diet optimization to food formulation using consistency constraints. In addition, to exemplify usability using the case of a porridge mix formulation for emergency situations in rural Mozambique. The linear programming method was extended with a consistency constraint based on previously published empirical studies on swelling of starch in soft porridges. The new method was exemplified using the formulation of a nutritious, minimum-cost porridge mix for children aged 1 to 2 years for use as a complete relief food, based primarily on local ingredients, in rural Mozambique. A nutritious porridge fulfilling the consistency constraints was found; however, its minimum cost was not feasible with local ingredients alone. This illustrates the challenges in formulating nutritious yet economically feasible foods from local ingredients. The high cost was caused by the high cost of mineral-rich foods. A nutritious, low-cost porridge that fulfills the consistency constraints was obtained by including supplements of zinc and calcium salts as ingredients. The optimizations were successful in fulfilling all constraints and provided a feasible porridge, showing that the extended constrained linear programming methodology provides a systematic tool for designing nutritious foods.
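A toy linear program in the same spirit is sketched below: minimise ingredient cost subject to nutrient lower bounds plus a consistency constraint expressed as a cap on starch. All ingredient data, bounds, and the starch cap are invented, and SciPy is assumed.

```python
# Toy formulation in the spirit of the paper; all numbers are invented.
import numpy as np
from scipy.optimize import linprog

ingredients = ["maize flour", "bean flour", "oil", "sugar"]
cost    = np.array([0.05, 0.12, 0.30, 0.08])   # cost per 10 g portion (invented)
energy  = np.array([36.0, 34.0, 88.0, 40.0])   # kcal per 10 g
protein = np.array([0.9, 2.2, 0.0, 0.0])       # g per 10 g
starch  = np.array([7.0, 4.0, 0.0, 0.0])       # g per 10 g (drives porridge viscosity)

A_ub = np.array([-energy,      # total energy  >= 200 kcal
                 -protein,     # total protein >= 6 g
                 starch])      # total starch  <= 25 g  (the consistency constraint)
b_ub = np.array([-200.0, -6.0, 25.0])

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * len(ingredients))
if res.success:
    portions = dict(zip(ingredients, res.x.round(2)))
    print(portions, "cost:", round(res.fun, 3))
```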
Concurrent Image Processing Executive (CIPE). Volume 1: Design overview
NASA Technical Reports Server (NTRS)
Lee, Meemong; Groom, Steven L.; Mazer, Alan S.; Williams, Winifred I.
1990-01-01
The design and implementation of a Concurrent Image Processing Executive (CIPE), which is intended to become the support system software for a prototype high performance science analysis workstation, are described. The target machine for this software is a JPL/Caltech Mark 3fp Hypercube hosted by either a MASSCOMP 5600 or a Sun-3 or Sun-4 workstation; however, the design will accommodate other concurrent machines of similar architecture, i.e., local memory, multiple-instruction-multiple-data (MIMD) machines. The CIPE system provides both a multimode user interface and an applications programmer interface, and has been designed around four loosely coupled modules: user interface, host-resident executive, hypercube-resident executive, and application functions. The loose coupling between modules allows modification of a particular module without significantly affecting the other modules in the system. In order to enhance hypercube memory utilization and to allow expansion of image processing capabilities, a specialized program management method, incremental loading, was devised. To minimize data transfer between host and hypercube, a data management method which distributes, redistributes, and tracks data set information was implemented. The data management also allows data sharing among application programs. The CIPE software architecture provides a flexible environment for scientific analysis of complex remote sensing image data, such as planetary data and imaging spectrometry, utilizing state-of-the-art concurrent computation capabilities.
Concepts of Concurrent Programming
1990-04-01
... to the material presented. Carriero89: Carriero, N., and Gelernter, D., "How to Write Parallel Programs: A Guide to the Perplexed," ACM, Sept. 1989, 251-510. The report concerns the relationship between the architectures on which programs can be executed and the application domains from which problems are drawn ("Our goal is to show how programs ..."). The annotated entry notes four papers, including "Programming Languages for Distributed Computing Systems" and "How to Write Parallel Programs."
Precision orbit raising trajectories. [solar electric propulsion orbital transfer program
NASA Technical Reports Server (NTRS)
Flanagan, P. F.; Horsewood, J. L.; Pines, S.
1975-01-01
A precision trajectory program has been developed to serve as a test bed for geocentric orbit raising steering laws. The steering laws to be evaluated have been developed using optimization methods employing averaging techniques. This program provides the capability of testing the steering laws in a precision simulation. The principal system models incorporated in the program are described, including the radiation environment, the solar array model, the thrusters and power processors, the geopotential, and the solar system. Steering and array orientation constraints are discussed, and the impact of these constraints on program design is considered.
NASA Technical Reports Server (NTRS)
Kiusalaas, J.; Reddy, G. B.
1977-01-01
A finite element program is presented for computer-automated, minimum weight design of elastic structures with constraints on stresses (including local instability criteria) and displacements. Volume 1 of the report contains the theoretical and user's manual of the program. Sample problems and the listing of the program are included in Volumes 2 and 3. The element subroutines are organized so as to facilitate additions and changes by the user. As a result, a relatively minor programming effort would be required to make DESAP 1 into a special purpose program to handle the user's specific design requirements and failure criteria.
Liu, Jason B; Ban, Kristen A; Berian, Julia R; Hutter, Matthew M; Huffman, Kristopher M; Liu, Yaoming; Hoyt, David B; Hall, Bruce L; Ko, Clifford Y
2017-09-26
Objective: To determine whether perioperative outcomes differ between patients undergoing concurrent compared with non-concurrent bariatric operations in the USA. Design: Retrospective, propensity score matched cohort study. Setting: Hospitals in the US accredited by the American College of Surgeons' metabolic and bariatric surgery accreditation and quality improvement program. Participants: 513 167 patients undergoing bariatric operations between 1 January 2014 and 31 December 2016. Main outcome measures: The primary outcome measure was a composite of 30 day death, morbidity, readmission, reoperation, anastomotic or staple line leak, and bleeding events. Operative duration and lengths of stay were also assessed. Operations were defined as concurrent if they overlapped by 60 or more minutes or in their entirety. Results: In this study of 513 167 operations, 739 (29.5%) surgeons at 483 (57.8%) hospitals performed 6087 (1.2%) concurrent operations. The most frequently performed concurrent bariatric operations were sleeve gastrectomy (n=3250, 53.4%) and Roux-en-Y gastric bypass (n=1601, 26.3%). Concurrent operations were more often performed at large academic medical centers with higher operative volumes and numbers of trainees and by higher volume surgeons. Compared with non-concurrent operations, concurrent operations lasted a median of 34 minutes longer (P<0.001) and resulted in 0.3 days longer average length of stay (P<0.001). Perioperative adverse events were not observed to more likely occur in concurrent compared with non-concurrent operations (7.5% v 7.4%; relative risk 1.02, 95% confidence interval 0.90 to 1.15; P=0.84). Conclusions: Concurrent bariatric operations occurred infrequently, but when they did, there was no observable increased risk for adverse perioperative outcomes compared with non-concurrent operations. These results, however, do not argue against improved and more meaningful disclosure of concurrent surgery practices.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Administrative Personnel OFFICE OF GOVERNMENT ETHICS GOVERNMENT ETHICS OFFICE OF GOVERNMENT ETHICS AND EXECUTIVE AGENCY ETHICS PROGRAM RESPONSIBILITIES Executive Agency Ethics Training Programs § 2638.702 Definitions... agency in concurrence with the Office of Government Ethics under 5 CFR 2635.105. Employee includes...
Code of Federal Regulations, 2010 CFR
2010-01-01
... Administrative Personnel OFFICE OF GOVERNMENT ETHICS GOVERNMENT ETHICS OFFICE OF GOVERNMENT ETHICS AND EXECUTIVE AGENCY ETHICS PROGRAM RESPONSIBILITIES Executive Agency Ethics Training Programs § 2638.702 Definitions... agency in concurrence with the Office of Government Ethics under 5 CFR 2635.105. Employee includes...
Code of Federal Regulations, 2012 CFR
2012-01-01
... Administrative Personnel OFFICE OF GOVERNMENT ETHICS GOVERNMENT ETHICS OFFICE OF GOVERNMENT ETHICS AND EXECUTIVE AGENCY ETHICS PROGRAM RESPONSIBILITIES Executive Agency Ethics Training Programs § 2638.702 Definitions... agency in concurrence with the Office of Government Ethics under 5 CFR 2635.105. Employee includes...
Code of Federal Regulations, 2013 CFR
2013-01-01
... Administrative Personnel OFFICE OF GOVERNMENT ETHICS GOVERNMENT ETHICS OFFICE OF GOVERNMENT ETHICS AND EXECUTIVE AGENCY ETHICS PROGRAM RESPONSIBILITIES Executive Agency Ethics Training Programs § 2638.702 Definitions... agency in concurrence with the Office of Government Ethics under 5 CFR 2635.105. Employee includes...
Evaluation Methodologies for Estimating the Likelihood of Program Implementation Failure
ERIC Educational Resources Information Center
Durand, Roger; Decker, Phillip J.; Kirkman, Dorothy M.
2014-01-01
Despite our best efforts as evaluators, program implementation failures abound. A wide variety of valuable methodologies have been adopted to explain and evaluate the "why" of these failures. Yet, typically these methodologies have been employed concurrently (e.g., project monitoring) or to the post-hoc assessment of program activities.…
Educational Policy and Literacy Learning in an ESL Classroom: Constraints and Opportunities
ERIC Educational Resources Information Center
Ricklefs, Mariana Alvayero
2012-01-01
This dissertation was a qualitative case study of an educational program for English Language Learners (ELL) at an elementary school in a small city in the Midwest. This case study investigated how language ideologies influence the constraints and opportunities for the planning and execution of this educational program. The findings evidenced that…
Programming your way out of the past: ISIS and the META Project
NASA Technical Reports Server (NTRS)
Birman, Kenneth P.; Marzullo, Keith
1989-01-01
The ISIS distributed programming system and the META Project are described. The ISIS programming toolkit is an aid to low-level programming that makes it easy to build fault-tolerant distributed applications that exploit replication and concurrent execution. The META Project is reexamining high-level mechanisms such as the filesystem, shell language, and administration tools in distributed systems.
Darmon, Nicole; Ferguson, Elaine L; Briend, André
2002-12-01
Economic constraints may contribute to the unhealthy food choices observed among low socioeconomic groups in industrialized countries. The objective of the present study was to predict the food choices a rational individual would make to reduce his or her food budget, while retaining a diet as close as possible to the average population diet. Isoenergetic diets were modeled by linear programming. To ensure these diets were consistent with habitual food consumption patterns, departure from the average French diet was minimized and constraints that limited portion size and the amount of energy from food groups were introduced into the models. A cost constraint was introduced and progressively strengthened to assess the effect of cost on the selection of foods by the program. Strengthening the cost constraint reduced the proportion of energy contributed by fruits and vegetables, meat and dairy products and increased the proportion from cereals, sweets and added fats, a pattern similar to that observed among low socioeconomic groups. This decreased the nutritional quality of modeled diets, notably the lowest cost linear programming diets had lower vitamin C and beta-carotene densities than the mean French adult diet (i.e., <25% and 10% of the mean density, respectively). These results indicate that a simple cost constraint can decrease the nutrient densities of diets and influence food selection in ways that reproduce the food intake patterns observed among low socioeconomic groups. They suggest that economic measures will be needed to effectively improve the nutritional quality of diets consumed by these populations.
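The cost-constrained diet model described above lends itself to a compact linear-programming illustration. The sketch below (Python, scipy.optimize.linprog) minimizes the L1 departure from a habitual diet subject to an isoenergetic constraint and a tightened cost ceiling; the food list, prices, energy densities, portion caps and budget are invented placeholders, not the French survey data used in the study.

```python
# Minimal sketch of a cost-constrained diet model, loosely following the
# linear-programming setup described above. All food data are hypothetical.
import numpy as np
from scipy.optimize import linprog

foods = ["vegetables", "meat", "dairy", "cereals", "added_fats"]
cost = np.array([0.60, 1.20, 0.50, 0.20, 0.15])      # price per portion (assumed)
energy = np.array([30.0, 200.0, 120.0, 350.0, 880.0])  # kcal per portion (assumed)
habitual = np.array([3.0, 1.5, 2.0, 2.5, 0.3])        # average portions/day (assumed)

# Decision variables: deviation d_i from habitual intake, split into positive
# and negative parts so the L1 departure can be minimized by a plain LP.
n = len(foods)
c = np.ones(2 * n)                                    # minimize sum |d_i|

# Equality: the modified diet stays isoenergetic with the habitual diet.
A_eq = np.concatenate([energy, -energy])[None, :]
b_eq = np.array([0.0])

# Inequality: total cost of (habitual + d) must not exceed the budget.
budget = 3.5                                          # progressively tightened in the study
A_ub = np.concatenate([cost, -cost])[None, :]
b_ub = np.array([budget - cost @ habitual])

# Portions cannot go negative and increases are capped (crude portion-size limit).
bounds = [(0, 5)] * n + [(0, h) for h in habitual]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
if res.success:
    diet = habitual + res.x[:n] - res.x[n:]
    print(dict(zip(foods, diet.round(2))))
```

Tightening the assumed budget reproduces the qualitative pattern reported above: the solver shifts portions away from the more expensive food groups toward cereals and added fats.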
Mathematical programming for the efficient allocation of health care resources.
Stinnett, A A; Paltiel, A D
1996-10-01
Previous discussions of methods for the efficient allocation of health care resources subject to a budget constraint have relied on unnecessarily restrictive assumptions. This paper makes use of established optimization techniques to demonstrate that a general mathematical programming framework can accommodate much more complex information regarding returns to scale, partial and complete indivisibility and program interdependence. Methods are also presented for incorporating ethical constraints into the resource allocation process, including explicit identification of the cost of equity.
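As a toy illustration of the kind of framework the authors describe, the sketch below selects health programs under a budget constraint with complete indivisibility (each program is funded in full or not at all) using scipy.optimize.milp; the costs, benefits and budget are invented, and refinements such as returns to scale or program interdependence are omitted.

```python
# Budget-constrained selection of indivisible health programs (illustrative data only).
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

cost = np.array([120.0, 300.0, 180.0, 90.0, 250.0])   # cost of each program (assumed)
benefit = np.array([40.0, 85.0, 60.0, 25.0, 70.0])    # health benefit of each program (assumed)
budget = 500.0

# Maximize total benefit (milp minimizes, so negate) subject to the budget,
# with binary fund / do-not-fund decisions.
res = milp(
    c=-benefit,
    constraints=LinearConstraint(cost[None, :], -np.inf, budget),
    integrality=np.ones_like(cost),
    bounds=Bounds(0, 1),
)
assert res.success, res.message
chosen = np.flatnonzero(res.x > 0.5)
print("fund programs:", chosen.tolist(), "total benefit:", benefit[chosen].sum())
```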
Risk-Constrained Dynamic Programming for Optimal Mars Entry, Descent, and Landing
NASA Technical Reports Server (NTRS)
Ono, Masahiro; Kuwata, Yoshiaki
2013-01-01
A chance-constrained dynamic programming algorithm was developed that is capable of making optimal sequential decisions within a user-specified risk bound. This work handles stochastic uncertainties over multiple stages in the CEMAT (Combined EDL-Mobility Analyses Tool) framework. It was demonstrated by a simulation of Mars entry, descent, and landing (EDL) using real landscape data obtained from the Mars Reconnaissance Orbiter. Although standard dynamic programming (DP) provides a general framework for optimal sequential decision making under uncertainty, it typically achieves risk aversion by imposing an arbitrary penalty on failure states. Such a penalty-based approach cannot explicitly bound the probability of mission failure. A key idea behind the new approach is called risk allocation, which decomposes a joint chance constraint into a set of individual chance constraints and distributes risk over them. The joint chance constraint was reformulated into a constraint on an expectation over a sum of an indicator function, which can be incorporated into the cost function by dualizing the optimization problem. As a result, the chance-constrained optimization problem can be turned into an unconstrained optimization over a Lagrangian, which can be solved efficiently using a standard DP approach.
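The following toy sketch illustrates only the dualization idea, not the CEMAT implementation: a joint chance constraint is approximated by the sum of stage-wise failure probabilities, the failure term is priced into the stage cost through a Lagrange multiplier, and the multiplier is found by bisection so that the resulting policy respects a user-specified risk bound. All costs and failure probabilities are assumed values, and the stages are independent, so the backward recursion reduces to a per-stage choice.

```python
# Toy Lagrangian treatment of a joint chance constraint over N stages.
N = 10
p_fast = [0.002 * (k + 1) for k in range(N)]   # assumed failure prob of the cheap action
c_fast, c_slow, p_slow = 1.0, 1.6, 0.001       # assumed costs and safe-action risk

def solve_policy(lam):
    """Minimize stage cost + lam * failure probability at every stage."""
    policy, cost, risk = [], 0.0, 0.0
    for k in range(N):
        if c_fast + lam * p_fast[k] <= c_slow + lam * p_slow:
            policy.append("fast"); cost += c_fast; risk += p_fast[k]
        else:
            policy.append("slow"); cost += c_slow; risk += p_slow
    return policy, cost, risk

risk_bound = 0.03                               # user-specified bound on P(failure)
lo, hi = 0.0, 1e5                               # bisection on the multiplier
for _ in range(80):
    lam = 0.5 * (lo + hi)
    if solve_policy(lam)[2] > risk_bound:
        lo = lam                                # too risky: penalize failure more
    else:
        hi = lam
policy, cost, risk = solve_policy(hi)
print(policy.count("fast"), "fast stages, cost", round(cost, 2), "risk", round(risk, 4))
```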
Constraint programming based biomarker optimization.
Zhou, Manli; Luo, Youxi; Sun, Guoquan; Mai, Guoqin; Zhou, Fengfeng
2015-01-01
Efficient and intuitive characterization of biological big data is becoming a major challenge for modern bio-OMIC based scientists. Interactive visualization and exploration of big data has proven to be one of the successful solutions. Most of the existing feature selection algorithms do not allow interactive inputs from users during the feature selection optimization process. This study investigates the question of fixing a few user-input features in the finally selected feature subset and formulates these user-input features as constraints for a programming model. The proposed algorithm, fsCoP (feature selection based on constrained programming), performs similarly to or much better than the existing feature selection algorithms, even with the constraints from both the literature and the existing algorithms. An fsCoP biomarker may be intriguing for further wet-lab validation, since it satisfies both the classification optimization function and the biomedical knowledge. fsCoP may also be used for the interactive exploration of bio-OMIC big data by interactively adding user-defined constraints for modeling.
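A minimal way to mimic the fixed user-input-feature constraint is to seed a greedy forward search with the user's features and never remove them, as sketched below with scikit-learn on synthetic data; this only illustrates the constraint idea, not the authors' fsCoP programming model.

```python
# Greedy forward feature selection that always keeps the user-specified features.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=200, n_features=30, n_informative=6, random_state=0)
user_fixed = [3, 7]                       # features the domain expert insists on keeping
budget = 6                                # total number of features to select

selected = list(user_fixed)
candidates = [j for j in range(X.shape[1]) if j not in selected]
clf = LogisticRegression(max_iter=1000)

while len(selected) < budget:
    scores = {j: cross_val_score(clf, X[:, selected + [j]], y, cv=5).mean()
              for j in candidates}
    best = max(scores, key=scores.get)    # greedily add the most helpful feature
    selected.append(best)
    candidates.remove(best)

print("selected features:", sorted(selected))
print("CV accuracy:", cross_val_score(clf, X[:, selected], y, cv=5).mean().round(3))
```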
A Methodology for Formal Hardware Verification, with Application to Microprocessors.
1993-08-29
concurrent programming languages. Proceedings of the NATO Advanced Study Institute on Logics and Models of Concurrent Systems (Colle-sur-Loup, France, 8-19...restricted class of formulas. Bose and Fisher [26] developed a symbolic model checker based on a Cosmos switch-level model. Their modeling approach...verification using SDVS: the method and a case study. 17th Annual Microprogramming Workshop (New Orleans, LA, 30 October-2 November 1984). Published as
Benko, Matúš; Gfrerer, Helmut
2018-01-01
In this paper, we consider a sufficiently broad class of non-linear mathematical programs with disjunctive constraints, which include, e.g., mathematical programs with complementarity/vanishing constraints. We present an extension of the concept of [Formula: see text]-stationarity which can be easily combined with the well-known notion of M-stationarity to obtain the stronger property of so-called [Formula: see text]-stationarity. We show how the property of [Formula: see text]-stationarity (and thus also of M-stationarity) can be efficiently verified for the considered problem class by computing [Formula: see text]-stationary solutions of a certain quadratic program. We further consider the situation in which the point to be tested for [Formula: see text]-stationarity is not known exactly but is approximated by some convergent sequence, as is usually the case when applying a numerical method.
Program Predicts Time Courses of Human/Computer Interactions
NASA Technical Reports Server (NTRS)
Vera, Alonso; Howes, Andrew
2005-01-01
CPM X is a computer program that predicts the sequences of, and amounts of time taken by, routine actions performed by a skilled person carrying out a task. Unlike programs that simulate the interaction of the person with the task environment, CPM X predicts the time course of events as consequences of encoded constraints on human behavior. The constraints determine which cognitive and environmental processes can occur simultaneously and which have sequential dependencies. The input to CPM X comprises (1) a description of a task and strategy in a hierarchical description language and (2) a description of architectural constraints in the form of rules governing interactions of fundamental cognitive, perceptual, and motor operations. The output of CPM X is a Program Evaluation Review Technique (PERT) chart that presents a schedule of predicted cognitive, motor, and perceptual operators interacting with a task environment. The CPM X program allows direct, a priori prediction of skilled user performance on complex human-machine systems, providing a way to assess critical interfaces before they are deployed in mission contexts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ungun, B; Stanford University School of Medicine, Stanford, CA; Fu, A
2016-06-15
Purpose: To develop a procedure for including dose constraints in convex programming-based approaches to treatment planning, and to support dynamic modification of such constraints during planning. Methods: We present a mathematical approach that allows mean dose, maximum dose, minimum dose and dose volume (i.e., percentile) constraints to be appended to any convex formulation of an inverse planning problem. The first three constraint types are convex and readily incorporated. Dose volume constraints are not convex, however, so we introduce a convex restriction that is related to CVaR-based approaches previously proposed in the literature. To compensate for the conservatism of this restriction, we propose a new two-pass algorithm that solves the restricted problem on a first pass and uses this solution to form exact constraints on a second pass. In another variant, we introduce slack variables for each dose constraint to prevent the problem from becoming infeasible when the user specifies an incompatible set of constraints. We implement the proposed methods in Python using the convex programming package cvxpy in conjunction with the open source convex solvers SCS and ECOS. Results: We show, for several cases taken from the clinic, that our proposed method meets specified constraints (often with margin) when they are feasible. Constraints are met exactly when we use the two-pass method, and infeasible constraints are replaced with the nearest feasible constraint when slacks are used. Finally, we introduce ConRad, a Python-embedded free software package for convex radiation therapy planning. ConRad implements the methods described above and offers a simple interface for specifying prescriptions and dose constraints. Conclusion: This work demonstrates the feasibility of using modifiable dose constraints in a convex formulation, making it practical to guide the treatment planning process with interactively specified dose constraints. This work was supported by the Stanford BioX Graduate Fellowship and NIH Grant 5R01CA176553.
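A minimal cvxpy sketch of the slack-variable idea is shown below: beamlet intensities are optimized against a random placeholder dose-influence matrix, and each dose constraint carries a nonnegative slack penalized in the objective so that an incompatible prescription degrades gracefully instead of becoming infeasible. The matrix, structure indices, dose levels and penalty weight are assumptions; this is not the ConRad interface.

```python
# Convex fluence-map optimization with slack-softened dose constraints (toy data).
import numpy as np
import cvxpy as cp

np.random.seed(0)
n_target, n_oar, n_beamlets = 40, 200, 80
A = np.abs(np.random.randn(n_target + n_oar, n_beamlets))  # dose-influence matrix (assumed)

x = cp.Variable(n_beamlets, nonneg=True)       # beamlet intensities
dose = A @ x
s = cp.Variable(3, nonneg=True)                # one slack per dose constraint

objective = cp.Minimize(cp.sum_squares(dose[:n_target] - 60.0) + 1e3 * cp.sum(s))
constraints = [
    dose[:n_target] >= 54.0 - s[0],                    # minimum target dose (soft)
    dose[:n_target] <= 66.0 + s[1],                    # maximum target dose (soft)
    cp.sum(dose[n_target:]) / n_oar <= 20.0 + s[2],    # mean OAR dose (soft)
]
prob = cp.Problem(objective, constraints)
prob.solve(solver=cp.ECOS)
print(prob.status, "slacks used:", np.round(s.value, 2))
```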
Operations concepts for Mars missions with multiple mobile spacecraft
NASA Technical Reports Server (NTRS)
Dias, William C.
1993-01-01
Missions are being proposed which involve landing a varying number (anywhere from one to 24) of small mobile spacecraft on Mars. Mission proposals include sample returns, in situ geochemistry and geology, and instrument deployment functions. This paper discusses changes needed in traditional space operations methods for support of rover operations. Relevant differences include more frequent commanding, higher risk acceptance, streamlined procedures, and reliance on additional spacecraft autonomy, advanced fault protection, and prenegotiated decisions. New methods are especially important for missions with several Mars rovers operating concurrently against time limits. This paper also discusses likely mission design limits imposed by operations constraints.
Live immunization against East Coast fever--current status.
Di Giulio, Giuseppe; Lynen, Godelieve; Morzaria, Subhash; Oura, Chris; Bishop, Richard
2009-02-01
The infection-and-treatment method (ITM) for immunization of cattle against East Coast fever has historically been used only on a limited scale because of logistical and policy constraints. Recent large-scale deployment among pastoralists in Tanzania has stimulated demand. Concurrently, a suite of molecular tools, developed from the Theileria parva genome, has enabled improved quality control of the immunizing stabilate and post-immunization monitoring of the efficacy and biological impact of ITM in the field. This article outlines the current status of ITM immunization in the field, with associated developments in the molecular epidemiology of T. parva.
Optimization of the computational load of a hypercube supercomputer onboard a mobile robot.
Barhen, J; Toomarian, N; Protopopescu, V
1987-12-01
A combinatorial optimization methodology is developed, which enables the efficient use of hypercube multiprocessors onboard mobile intelligent robots dedicated to time-critical missions. The methodology is implemented in terms of large-scale concurrent algorithms based either on fast simulated annealing, or on nonlinear asynchronous neural networks. In particular, analytic expressions are given for the effect of single-neuron perturbations on the systems' configuration energy. Compact neuromorphic data structures are used to model effects such as precedence constraints, processor idling times, and task-schedule overlaps. Results for a typical robot-dynamics benchmark are presented.
A tool for modeling concurrent real-time computation
NASA Technical Reports Server (NTRS)
Sharma, D. D.; Huang, Shie-Rei; Bhatt, Rahul; Sridharan, N. S.
1990-01-01
Real-time computation is a significant area of research in general, and in AI in particular. The complexity of practical real-time problems demands use of knowledge-based problem solving techniques while satisfying real-time performance constraints. Since the demands of a complex real-time problem cannot be predicted (owing to the dynamic nature of the environment) powerful dynamic resource control techniques are needed to monitor and control the performance. A real-time computation model for a real-time tool, an implementation of the QP-Net simulator on a Symbolics machine, and an implementation on a Butterfly multiprocessor machine are briefly described.
Optimization of the computational load of a hypercube supercomputer onboard a mobile robot
NASA Technical Reports Server (NTRS)
Barhen, Jacob; Toomarian, N.; Protopopescu, V.
1987-01-01
A combinatorial optimization methodology is developed, which enables the efficient use of hypercube multiprocessors onboard mobile intelligent robots dedicated to time-critical missions. The methodology is implemented in terms of large-scale concurrent algorithms based either on fast simulated annealing, or on nonlinear asynchronous neural networks. In particular, analytic expressions are given for the effect of single-neuron perturbations on the systems' configuration energy. Compact neuromorphic data structures are used to model effects such as precedence constraints, processor idling times, and task-schedule overlaps. Results for a typical robot-dynamics benchmark are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carlsson, Mats; Johansson, Mikael; Larson, Jeffrey
Previous approaches for scheduling a league with round-robin and divisional tournaments involved decomposing the problem into easier subproblems. This approach, used to schedule the top Swedish handball league Elitserien, reduces the problem complexity but can result in suboptimal schedules. This paper presents an integrated constraint programming model that performs the scheduling in a single step. Particular attention is given to identifying implied and symmetry-breaking constraints that reduce the computational complexity significantly. The experimental evaluation shows that the integrated approach takes considerably less computational effort than the previous approach.
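The flavor of such a model, including a symmetry-breaking constraint, can be seen in the toy single round-robin scheduler below, written with Google OR-Tools CP-SAT; it is a generic illustration, not the Elitserien formulation evaluated in the paper.

```python
# Toy single round-robin schedule with a symmetry-breaking constraint (CP-SAT).
from ortools.sat.python import cp_model

n = 6                                      # number of teams (even)
rounds = n - 1
model = cp_model.CpModel()

# r[i, j] = round in which team i meets team j (i < j).
r = {(i, j): model.NewIntVar(0, rounds - 1, f"r_{i}_{j}")
     for i in range(n) for j in range(i + 1, n)}

# Each team plays exactly one match per round: its n-1 match rounds are all different.
for t in range(n):
    model.AddAllDifferent([r[min(t, u), max(t, u)] for u in range(n) if u != t])

# Symmetry breaking: fix team 0's opponents to rounds 0, 1, ..., n-2 in order.
for j in range(1, n):
    model.Add(r[0, j] == j - 1)

solver = cp_model.CpSolver()
status = solver.Solve(model)
if status in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    for rnd in range(rounds):
        games = [(i, j) for (i, j) in r if solver.Value(r[i, j]) == rnd]
        print(f"round {rnd}: {games}")
```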
ERIC Educational Resources Information Center
Le, Nguyen-Thinh; Menzel, Wolfgang
2009-01-01
In this paper, we introduce logic programming as a domain that exhibits some characteristics of being ill-defined. In order to diagnose student errors in such a domain, we need a means to hypothesise the student's intention, that is the strategy underlying her solution. This is achieved by weighting constraints, so that hypotheses about solution…
AP Studio Art as an Enabling Constraint for Secondary Art Education
ERIC Educational Resources Information Center
Graham, Mark A.
2009-01-01
Advanced Placement (AP) Studio Art is an influential force in secondary art education as is evident in the 31,800 portfolios submitted for review in 2008. From the perspectives of a high school educator and AP Reader, this author has observed how the constraints of the AP program can be used to generate support for high school art programs and…
The MEOW lunar project for education and science based on concurrent engineering approach
NASA Astrophysics Data System (ADS)
Roibás-Millán, E.; Sorribes-Palmer, F.; Chimeno-Manguán, M.
2018-07-01
The use of concurrent engineering in the design of space missions allows the high level of coupling and iteration among mission subsystems to be taken into account in an interrelated methodology during the preliminary conceptual phase. This work presents the result of applying concurrent engineering, over a short time span, to design the main elements of the preliminary design for a lunar exploration mission developed within the ESA Academy Concurrent Engineering Challenge 2017. During this program, students of the Master in Space Systems at the Technical University of Madrid designed a low-cost satellite to find water at the Moon's south pole as a prospect for a future human lunar base. The resulting mission, The Moon Explorer And Observer of Water/Ice (MEOW), comprises a 262 kg spacecraft to be launched into a Geostationary Transfer Orbit as a secondary payload in the 2023/2025 time frame. A three-month Weak Stability Boundary transfer via the Sun-Earth L1 Lagrange point allows for high launch-window flexibility. The different aspects of the mission (orbit analysis, spacecraft design and payload) and the possibilities of concurrent engineering are described.
NASA Astrophysics Data System (ADS)
Li, Hong; Zhang, Li; Jiao, Yong-Chang
2016-07-01
This paper presents an interactive approach based on a discrete differential evolution algorithm to solve a class of integer bilevel programming problems, in which integer decision variables are controlled by an upper-level decision maker and real-valued (continuous) decision variables are controlled by a lower-level decision maker. Using the Karush-Kuhn-Tucker optimality conditions in the lower-level programming, the original discrete bilevel formulation can be converted into a discrete single-level nonlinear programming problem with complementarity constraints, and then a smoothing technique is applied to deal with the complementarity constraints. Finally, a discrete single-level nonlinear programming problem is obtained and solved by an interactive approach. In each iteration, for each given upper-level discrete variable, a system of nonlinear equations including the lower-level variables and Lagrange multipliers is solved first, and then a discrete nonlinear programming problem with only inequality constraints is handled by using a discrete differential evolution algorithm. Simulation results show the effectiveness of the proposed approach.
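The smoothing step can be illustrated on a two-variable toy problem: the complementarity condition x, y >= 0, x*y = 0 is replaced by the perturbed Fischer-Burmeister equation, which is smooth for eps > 0 and recovers the exact condition as eps tends to zero. The sketch below (scipy, SLSQP) shows only this reformulation, not the paper's interactive discrete differential evolution algorithm.

```python
# Smoothing a complementarity constraint with the perturbed Fischer-Burmeister function.
import numpy as np
from scipy.optimize import minimize

def fb_smooth(a, b, eps):
    """phi_eps(a, b) = a + b - sqrt(a^2 + b^2 + eps); phi_0(a, b) = 0 <=> a, b >= 0 and a*b = 0."""
    return a + b - np.sqrt(a * a + b * b + eps)

# Toy problem: minimize (x-1)^2 + (y-1)^2 subject to x, y >= 0 and x*y = 0.
# The exact solution is x = 1, y = 0 (or symmetrically x = 0, y = 1).
def solve(eps):
    cons = [{"type": "eq", "fun": lambda z: fb_smooth(z[0], z[1], eps)}]
    res = minimize(lambda z: (z[0] - 1) ** 2 + (z[1] - 1) ** 2,
                   x0=[0.8, 0.2], bounds=[(0, None), (0, None)],
                   constraints=cons, method="SLSQP")
    return res.x

for eps in [1e-1, 1e-3, 1e-6]:
    x, y = solve(eps)
    print(f"eps={eps:.0e}: x={x:.4f}, y={y:.4f}, x*y={x * y:.2e}")  # x*y -> 0 as eps -> 0
```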
Minimum weight design of helicopter rotor blades with frequency constraints
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Walsh, Joanne L.
1989-01-01
The minimum weight design of helicopter rotor blades subject to constraints on fundamental coupled flap-lag natural frequencies has been studied in this paper. A constraint has also been imposed on the minimum value of the blade autorotational inertia to ensure that the blade has sufficient inertia to autorotate in case of an engine failure. The program CAMRAD has been used for the blade modal analysis and the program CONMIN has been used for the optimization. In addition, a linear approximation analysis involving Taylor series expansion has been used to reduce the analysis effort. The procedure contains a sensitivity analysis which consists of analytical derivatives of the objective function and the autorotational inertia constraint and central finite difference derivatives of the frequency constraints. Optimum designs have been obtained for blades in vacuum with both rectangular and tapered box beam structures. Design variables include taper ratio, nonstructural segment weights and box beam dimensions. The paper shows that even when starting with an acceptable baseline design, a significant amount of weight reduction is possible while satisfying all the constraints for blades with rectangular and tapered box beams.
ERIC Educational Resources Information Center
Wisconsin Univ., Madison. Inst. for Environmental Studies.
Staff and graduate students from the University of Wisconsin (UW) conducted a 2-week workshop in environmental studies for adolescent Native American students and a concurrent teacher's education program entitled "Wetland Perspectives: Ways of Looking at the Landscape." 1996 is the fifth year for the PreCollege program and the second…
ERIC Educational Resources Information Center
Milchus, Norman J.
The Wayne County Pre-Reading Program for Preventing Reading Failure is an individually, diagnostically prescribed, perceptual-cognitive-linguistic development program. The program utilizes the largest compilation of prescriptively coded, reading readiness materials to be assigned prior to and concurrent with first-year reading instruction. The…
ERIC Educational Resources Information Center
Brown, Amber L; Lee, Joohi
2017-01-01
The purpose of this study was to evaluate the efficacy of the Home Instruction for Parents of Preschool Youngsters program when implemented within Head Start programs by measuring children's language proficiency scores. Participants were kindergarteners concurrently enrolled in both a Head Start program and the Home Instruction for Parents of…
Effects of cacheing on multitasking efficiency and programming strategy on an ELXSI 6400
DOE Office of Scientific and Technical Information (OSTI.GOV)
Montry, G.R.; Benner, R.E.
1985-12-01
The impact of a cache/shared memory architecture, and, in particular, the cache coherency problem, upon concurrent algorithm and program development is discussed. In this context, a simple set of programming strategies is proposed which streamlines code development and improves code performance when multitasking in a cache/shared memory or distributed memory environment.
ERIC Educational Resources Information Center
Wielgosz, Meg; Molyneux, Paul
2015-01-01
Students learning English as an additional language (EAL) in Australian schools frequently struggle with the cultural and linguistic demands of the classroom while concurrently grappling with issues of identity and belonging. This article reports on an investigation of the role primary school visual arts programs, distinct programs with a…
NASA Technical Reports Server (NTRS)
Kolb, Mark A.
1990-01-01
Originally, computer programs for engineering design focused on detailed geometric design. Later, computer programs for algorithmically performing the preliminary design of specific well-defined classes of objects became commonplace. However, due to the need for extreme flexibility, it appears unlikely that conventional programming techniques will prove fruitful in developing computer aids for engineering conceptual design. The use of symbolic processing techniques, such as object-oriented programming and constraint propagation, facilitates such flexibility. Object-oriented programming allows programs to be organized around the objects and behavior to be simulated, rather than around fixed sequences of function and subroutine calls. Constraint propagation allows declarative statements to be understood as designating multi-directional mathematical relationships among all the variables of an equation, rather than as unidirectional assignments to the variable on the left-hand side of the equation, as in conventional computer programs. The research has concentrated on applying these two techniques to the development of a general-purpose computer aid for engineering conceptual design. Object-oriented programming techniques are utilized to implement a user-extensible database of design components. The mathematical relationships which model both geometry and physics of these components are managed via constraint propagation. In addition to this component-based hierarchy, special-purpose data structures are provided for describing component interactions and supporting state-dependent parameters. In order to investigate the utility of this approach, a number of sample design problems from the field of aerospace engineering were implemented using the prototype design tool, Rubber Airplane. The additional level of organizational structure obtained by representing design knowledge in terms of components is observed to provide greater convenience to the program user, and to result in a database of engineering information which is easier both to maintain and to extend.
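The multi-directional behavior described above can be sketched with a tiny propagation network in which a declarative relation, here area = span * chord, solves for whichever quantity is still unknown; the class and variable names are illustrative and not taken from Rubber Airplane.

```python
# Minimal multi-directional constraint propagation (toy, not the Rubber Airplane code).
class Cell:
    def __init__(self, name):
        self.name, self.value, self.constraints = name, None, []

    def set(self, value):
        if self.value is None:
            self.value = value
            for c in self.constraints:      # wake up every constraint we participate in
                c.propagate()

class Product:
    """Enforces a = b * c in any direction, whenever two of the three are known."""
    def __init__(self, a, b, c):
        self.a, self.b, self.c = a, b, c
        for cell in (a, b, c):
            cell.constraints.append(self)

    def propagate(self):
        a, b, c = self.a, self.b, self.c
        if a.value is None and None not in (b.value, c.value):
            a.set(b.value * c.value)
        elif b.value is None and None not in (a.value, c.value):
            b.set(a.value / c.value)
        elif c.value is None and None not in (a.value, b.value):
            c.set(a.value / b.value)

# Wing area = span * chord; setting any two determines the third.
area, span, chord = Cell("area"), Cell("span"), Cell("chord")
Product(area, span, chord)
area.set(20.0)
chord.set(1.25)
print(span.value)   # -> 16.0
```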
47 CFR 73.1510 - Experimental authorizations.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 47 Telecommunication 4 2010-10-01 2010-10-01 false Experimental authorizations. 73.1510 Section 73... conducted at any time the station is authorized to operate, but the minimum required schedule of programming... regularly scheduled programming concurrently with the experimental transmission if there is no significant...
Projected 2050 Model Simulations for the Chesapeake Bay Program
The Chesapeake Bay Program as has been tasked with assessing how changes in climate systems are expected to alter key variables and processes within the Watershed in concurrence with land use changes. EPA’s Office of Research and Development will be conducting historic and...
Stein, Melissa R.; Soloway, Irene J.; Jefferson, Karen S.; Roose, Robert J.; Arnsten, Julia H.; Litwin, Alain H.
2012-01-01
Chronic hepatitis C virus (HCV) infection is highly prevalent among current and former drug users. However, only a minority of patients enrolled in drug treatment programs have initiated HCV treatment. New models are needed to overcome barriers to care. In this retrospective study, we describe the implementation and outcomes of 42 patients treated in a Concurrent Group Treatment (CGT) program. Patients participated in weekly provider-led group treatment sessions which included a review of side effects; discussion of adherence and side effect management; administration of interferon injections; a brief physical exam; and ended with brief meditation. Of the first 27 patients who initiated CGT, 42% achieved a sustained viral response. Additionally, 87% (13/15) of genotype-1 infected patients treated with a direct-acting antiviral agent achieved an undetectable viral load at 24 weeks. The CGT model may be effective in overcoming barriers to treatment and improving adherence and outcomes among patients enrolled in drug treatment programs. PMID:23036920
High-throughput state-machine replication using software transactional memory.
Zhao, Wenbing; Yang, William; Zhang, Honglei; Yang, Jack; Luo, Xiong; Zhu, Yueqin; Yang, Mary; Luo, Chaomin
2016-11-01
State-machine replication is a common way of constructing general purpose fault tolerance systems. To ensure replica consistency, requests must be executed sequentially according to some total order at all non-faulty replicas. Unfortunately, this could severely limit the system throughput. This issue has been partially addressed by identifying non-conflicting requests based on application semantics and executing these requests concurrently. However, identifying and tracking non-conflicting requests require intimate knowledge of application design and implementation, and a custom fault tolerance solution developed for one application cannot be easily adopted by other applications. Software transactional memory offers a new way of constructing concurrent programs. In this article, we present the mechanisms needed to retrofit existing concurrency control algorithms designed for software transactional memory for state-machine replication. The main benefit of using software transactional memory in state-machine replication is that general purpose concurrency control mechanisms can be designed without deep knowledge of application semantics. As such, new fault tolerance systems based on state-machine replication with excellent throughput can be easily designed and maintained. In this article, we introduce three different concurrency control mechanisms for state-machine replication using software transactional memory, namely, ordered strong strict two-phase locking, conventional timestamp-based multiversion concurrency control, and speculative timestamp-based multiversion concurrency control. Our experiments show that the speculative timestamp-based multiversion concurrency control mechanism has the best performance in all types of workload, while the conventional timestamp-based multiversion concurrency control offers the worst performance due to a high abort rate in the presence of even moderate contention between transactions. The ordered strong strict two-phase locking mechanism offers the simplest solution, with excellent performance in low contention workloads and fairly good performance in high contention workloads.
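The simplest of the three mechanisms, ordered strong strict two-phase locking, can be sketched in a few lines: every transaction acquires the locks it needs in one fixed global order (which rules out deadlock) and releases them only at commit. The sketch below shows only that locking discipline on a toy account store; admission in the replication total order and the transactional-memory machinery are omitted, and the names are illustrative.

```python
# Strong strict two-phase locking with a fixed global lock order (toy example).
import threading

accounts = {"a": 100, "b": 100, "c": 100}
locks = {k: threading.Lock() for k in accounts}

def run_transaction(read_write_set, body):
    ordered = sorted(read_write_set)          # global lock order = key order
    for k in ordered:
        locks[k].acquire()                    # growing phase
    try:
        body(accounts)                        # execute against shared state
    finally:
        for k in reversed(ordered):
            locks[k].release()                # release only at commit (strict)

def transfer(src, dst, amount):
    def body(state):
        state[src] -= amount
        state[dst] += amount
    run_transaction({src, dst}, body)

threads = [threading.Thread(target=transfer, args=("a", "b", 1)) for _ in range(50)]
threads += [threading.Thread(target=transfer, args=("b", "c", 1)) for _ in range(50)]
for t in threads: t.start()
for t in threads: t.join()
print(accounts)   # total balance is preserved: {'a': 50, 'b': 100, 'c': 150}
```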
NASA Astrophysics Data System (ADS)
Welch, Jonathan
This case study focused on obsolescence management constraints that occur during development of sustainment-dominated systems. Obsolescence management constraints were explored in systems expected to last 20 years or more and that tend to use commercial off-the-shelf products. The field of obsolescence has received little study, but obsolescence has a large cost for military systems. Because developing complex systems takes an average of 3 to 8 years, and commercial off-the-shelf components are typically obsolete within 3 to 5 years, military systems are often deployed with obsolescence issues that are transferred to the sustainment community to determine solutions. The main problem addressed in the study was to identify the constraints that have caused 70% of military systems under development to be obsolete when they are delivered. The purpose of the study was to use a qualitative case study to identify constraints that interfered with obsolescence management occurring during the development stages of a program. The participants of this case study were managers, subordinates, and end-users who were logistics and obsolescence experts. Researchers largely agree that proactive obsolescence management is a lower cost solution for sustainment-dominated systems. Program managers must understand the constraints and understand the impact of not implementing proactive solutions early in the development program lifecycle. The study concluded that several constraints prevented the development program from early adoption of obsolescence management theories, specifically proactive theories. Three major themes were identified: (a) management commitment, (b) lack of details in the statement of work, and (c) vendor management. Each major theme includes several subthemes. The recommendation is that future researchers explore two areas: (a) comparing the cost of managing obsolescence early in the development process versus the cost of managing it later, and (b) exploring the costs and value of starting a centralized obsolescence group at each major defense contractor location.
NASA Technical Reports Server (NTRS)
Oleson, Steven R.
2018-01-01
The COncurrent Multidisciplinary Preliminary Assessment of Space Systems (COMPASS) Team partnered with the Applied Research Laboratory to perform a NASA Innovative Advanced Concepts (NIAC) Program study to evaluate chemical based power systems for keeping a Venus lander alive (power and cooling) and functional for a period of days. The mission class targeted was either a Discovery ($500M) or New Frontiers ($750M to $780M) class mission.
Software For Drawing Design Details Concurrently
NASA Technical Reports Server (NTRS)
Crosby, Dewey C., III
1990-01-01
Software system containing five computer-aided-design programs enables more than one designer to work on same part or assembly at same time. Reduces time necessary to produce design by implementing concept of parallel or concurrent detailing, in which all detail drawings documenting three-dimensional model of part or assembly produced simultaneously, rather than sequentially. Keeps various detail drawings consistent with each other and with overall design by distributing changes in each detail to all other affected details.
Rodgers, Rachel F; Paxton, Susan J
2014-01-01
Depressive and eating disorder symptoms are highly comorbid. To date, however, little is known regarding the efficacy of existing programs in decreasing concurrent eating disorder and depressive symptoms. We conducted a systematic review of selective and indicated controlled prevention and early intervention programs that assessed both eating disorder and depressive symptoms. We identified a total of 26 studies. The large majority of identified interventions (92%) were successful in decreasing eating disorder symptoms. However fewer than half (42%) were successful in decreasing both eating disorder and depressive symptoms. Intervention and participant characteristics did not predict success in decreasing depressive symptoms. Indicated prevention and early intervention programs targeting eating disorder symptoms are limited in their success in decreasing concurrent depressive symptoms. Further efforts to develop more efficient interventions that are successful in decreasing both eating disorder and depressive symptoms are warranted.
Algorithms and software for nonlinear structural dynamics
NASA Technical Reports Server (NTRS)
Belytschko, Ted; Gilbertsen, Noreen D.; Neal, Mark O.
1989-01-01
The objective of this research is to develop efficient methods for explicit time integration in nonlinear structural dynamics for computers which utilize both concurrency and vectorization. As a framework for these studies, the program WHAMS, which is described in Explicit Algorithms for the Nonlinear Dynamics of Shells (T. Belytschko, J. I. Lin, and C.-S. Tsay, Computer Methods in Applied Mechanics and Engineering, Vol. 42, 1984, pp 225 to 251), is used. There are two factors which make the development of efficient concurrent explicit time integration programs a challenge in a structural dynamics program: (1) the need for a variety of element types, which complicates the scheduling-allocation problem; and (2) the need for different time steps in different parts of the mesh, which is here called mixed delta t integration, so that a few stiff elements do not reduce the time steps throughout the mesh.
Snyder, Patricia; Eason, Jane M; Philibert, Darbi; Ridgway, Andrea; McCaughey, Tiffany
2008-01-01
Concurrent validity of scores for the Alberta Infant Motor Scale (AIMS) and the Peabody Developmental Gross Motor Scale-2 (PDGMS-2) was examined with a sample of 35 infants at dual risk for motor delays or disabilities. Dual risk was defined as low birthweight (
Socioeconomic Inequality in Concurrent Tobacco and Alcohol Consumption
Intarut, Nirun; Pukdeesamai, Piyalak
2017-01-01
Background: Whilst several studies have examined inequity of tobacco use and inequity of alcohol drinking individually, comparatively little is known about concurrent tobacco and alcohol consumption. The present study therefore investigated inequity of concurrent tobacco and alcohol consumption in Thailand. Methods: The 2015 Health and Welfare Survey was obtained from Thailand’s National Statistical Office and used as a source of national representative data. Concurrent tobacco and alcohol consumption was defined as current and concurrent use of both tobacco and alcohol. The wealth assets index was used as an indicator of socioeconomic inequity. Socioeconomic status included 5 groups ranging from poorest (Q1) to richest (Q5). A total of 55,920 households and 113,705 participants aged 15 years or over were included and analyzed. A weighted multiple logistic regression was performed. Results: The prevalence of concurrent tobacco and alcohol consumption, tobacco consumption only, and alcohol consumption only were 15.2% (95% CI: 14.9, 15.4), 4.7% (95% CI: 4.5, 4.8), and 18.9% (95% CI: 18.7, 19.1), respectively. Weighted multiple logistic regression showed that concurrent tobacco and alcohol consumption was high in the poorest socioeconomic group (P for trend <0.001), and tobacco consumption only was also high in the poorest group (P for trend <0.001). A high prevalence of alcohol consumption was observed in the richest group (P for trend <0.001). Conclusions: These findings suggest that tobacco and alcohol consumption prevention programs would be more effective if they considered socioeconomic inequities in concurrent tobacco and alcohol consumption rather than focusing on single drug use. PMID:28749620
Constraint-based component-modeling for knowledge-based design
NASA Technical Reports Server (NTRS)
Kolb, Mark A.
1992-01-01
The paper describes the application of various advanced programming techniques derived from artificial intelligence research to the development of flexible design tools for conceptual design. Special attention is given to two techniques which appear to be readily applicable to such design tools: constraint propagation and object-oriented programming. The implementation of these techniques in a prototype computer tool, Rubber Airplane, is described.
Cooperative and Concurrent Enrollment and College Retention
ERIC Educational Resources Information Center
Foster, Regina
2010-01-01
Oklahoma has a unique system of high schools, technology centers and community colleges that work together to enable students to receive education in technical areas. Given Oklahoma's shortage of technical degree recipients, the Cooperative Alliance Program (CAP) was developed to encourage additional students to begin technical programs during…
Practicum Experience in the Master's Degree Program for Personnel Work
ERIC Educational Resources Information Center
Kirkbride, Virginia
1971-01-01
The committee on Professional Development of NAWDC concluded that student personnel curricula should make provisions for internships and academic work to be taken concurrently to enable future graduates to increase their professional skill and understanding of the total program of student personnel services. (Author/BY)
Impacts of Groundwater Constraints on Saudi Arabia's Low-Carbon Electricity Supply Strategy.
Parkinson, Simon C; Djilali, Ned; Krey, Volker; Fricko, Oliver; Johnson, Nils; Khan, Zarrar; Sedraoui, Khaled; Almasoud, Abdulrahman H
2016-02-16
Balancing groundwater depletion, socioeconomic development and food security in Saudi Arabia will require policy that promotes expansion of unconventional freshwater supply options, such as wastewater recycling and desalination. As these processes consume more electricity than conventional freshwater supply technologies, Saudi Arabia's electricity system is vulnerable to groundwater conservation policy. This paper examines strategies for adapting to long-term groundwater constraints in Saudi Arabia's freshwater and electricity supply sectors with an integrated modeling framework. The approach combines electricity and freshwater supply planning models across provinces to provide an improved representation of coupled infrastructure systems. The tool is applied to study the interaction between policy aimed at a complete phase-out of nonrenewable groundwater extraction and concurrent policy aimed at achieving deep reductions in electricity sector carbon emissions. We find that transitioning away from nonrenewable groundwater use by the year 2050 could increase electricity demand by more than 40% relative to 2010 conditions, and require investments similar to strategies aimed at transitioning away from fossil fuels in the electricity sector. Higher electricity demands under groundwater constraints reduce flexibility of supply side options in the electricity sector to limit carbon emissions, making it more expensive to fulfill climate sustainability objectives. The results of this analysis underscore the importance of integrated long-term planning approaches for Saudi Arabia's electricity and freshwater supply systems.
Path Following in the Exact Penalty Method of Convex Programming.
Zhou, Hua; Lange, Kenneth
2015-07-01
Classical penalty methods solve a sequence of unconstrained problems that put greater and greater stress on meeting the constraints. In the limit as the penalty constant tends to ∞, one recovers the constrained solution. In the exact penalty method, squared penalties are replaced by absolute value penalties, and the solution is recovered for a finite value of the penalty constant. In practice, the kinks in the penalty and the unknown magnitude of the penalty constant prevent wide application of the exact penalty method in nonlinear programming. In this article, we examine a strategy of path following consistent with the exact penalty method. Instead of performing optimization at a single penalty constant, we trace the solution as a continuous function of the penalty constant. Thus, path following starts at the unconstrained solution and follows the solution path as the penalty constant increases. In the process, the solution path hits, slides along, and exits from the various constraints. For quadratic programming, the solution path is piecewise linear and takes large jumps from constraint to constraint. For a general convex program, the solution path is piecewise smooth, and path following operates by numerically solving an ordinary differential equation segment by segment. Our diverse applications to a) projection onto a convex set, b) nonnegative least squares, c) quadratically constrained quadratic programming, d) geometric programming, and e) semidefinite programming illustrate the mechanics and potential of path following. The final detour to image denoising demonstrates the relevance of path following to regularized estimation in inverse problems. In regularized estimation, one follows the solution path as the penalty constant decreases from a large value.
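The exactness property is easy to see on a one-variable toy problem: for minimize (x-2)^2 subject to x <= 1, the absolute-value penalty recovers the constrained solution x = 1 exactly once the penalty constant exceeds the optimal multiplier (2 here), whereas a squared penalty only approaches it in the limit. The sketch below simply traces that behavior numerically; it is not the path-following ODE solver developed in the article.

```python
# Exact (absolute-value) penalty for: minimize (x-2)^2 subject to x <= 1.
from scipy.optimize import minimize_scalar

def exact_penalty_min(rho):
    f = lambda x: (x - 2.0) ** 2 + rho * max(0.0, x - 1.0)
    return minimize_scalar(f, bounds=(-5, 5), method="bounded").x

for rho in [0.5, 1.0, 1.5, 2.0, 4.0, 8.0]:
    print(f"rho={rho:4.1f}  x*={exact_penalty_min(rho):.4f}")
# x* moves linearly from 2 toward 1 as rho grows, and stays at 1 once rho >= 2.
```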
NASA Astrophysics Data System (ADS)
Fabián Calderón Marín, Carlos; González González, Joaquín Jorge; Laguardia, Rodolfo Alfonso
2017-09-01
The combination of external beam radiotherapy modalities and systemic radiotherapy (CIERT) could be a reliable alternative for patients with multiple lesions or those for whom treatment planning may be difficult because of organ-at-risk (OAR) constraints. Radiobiological models should be able to predict the biological response to irradiation while accounting for the differences in the temporal pattern of dose delivery between the two modalities. Two CIERT scenarios were studied: sequential combination, in which one modality is executed after the other, and concurrent combination, in which both modalities run simultaneously. Expressions are provided for calculation of the dose-response magnitudes Tumor Control Probability (TCP) and Normal Tissue Complication Probability (NTCP). General results on radiobiological modeling using the linear-quadratic (LQ) model are also discussed. Inter-subject variation of radiosensitivity and the volume irradiation effect in CIERT are studied. OAR doses should be kept under control during planning of concurrent CIERT treatments as the administered activity is increased. The formulation presented here may be used for biological evaluation of prescriptions and biological treatment planning of CIERT schemes in clinical situations.
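As a reminder of the building blocks such a model combines, the snippet below evaluates the standard LQ survival fraction and Poisson TCP for a fractionated external-beam schedule; the paper's CIERT formulation additionally accounts for the temporal dose-rate pattern of the systemic component, inter-subject radiosensitivity variation and NTCP, which are not reproduced here, and the parameter values are generic assumptions.

```python
# Standard LQ / Poisson TCP building blocks (generic, not the paper's CIERT derivation).
import numpy as np

def surviving_fraction(alpha, beta, dose_per_fraction, n_fractions):
    """LQ cell survival after n fractions of size d: exp(-n*(alpha*d + beta*d^2))."""
    d, n = dose_per_fraction, n_fractions
    return np.exp(-n * (alpha * d + beta * d * d))

def tcp_poisson(n_clonogens, alpha, beta, d, n):
    """Poisson TCP: probability that no clonogenic cell survives."""
    return np.exp(-n_clonogens * surviving_fraction(alpha, beta, d, n))

# Example: 2 Gy x 30 fractions, alpha = 0.3 /Gy, alpha/beta = 10 Gy, 1e7 clonogens (assumed).
alpha, beta = 0.3, 0.03
print(f"TCP = {tcp_poisson(1e7, alpha, beta, 2.0, 30):.3f}")
```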
2016-04-01
Concurrency is broadly defined as the overlap between technology development and product development or between product development and...reported extensively on the F-35 program's cost, schedule, and performance problems. The program plans to begin increasing production rates over the...internal DOD program analyses. GAO also collected and analyzed production and supply chain performance data, and interviewed DOD, program, and
NASA Technical Reports Server (NTRS)
Head, J. W. (Editor)
1978-01-01
Developments reported at a meeting of principal investigators for NASA's planetary geology program are summarized. Topics covered include: constraints on solar system formation; asteroids, comets, and satellites; constraints on planetary interiors; volatiles and regoliths; instrument development techniques; planetary cartography; geological and geochemical constraints on planetary evolution; fluvial processes and channel formation; volcanic processes; eolian processes; radar studies of planetary surfaces; cratering as a process, landform, and dating method; and the Tharsis region of Mars. Activities at a planetary geology field conference on eolian processes are reported, and techniques recommended for the presentation and analysis of crater size-frequency data are included.
Integrity Constraint Monitoring in Software Development: Proposed Architectures
NASA Technical Reports Server (NTRS)
Fernandez, Francisco G.
1997-01-01
In the development of complex software systems, designers are required to obtain from many sources and manage vast amounts of knowledge of the system being built and communicate this information to personnel with a variety of backgrounds. Knowledge concerning the properties of the system, including the structure of, relationships between and limitations of the data objects in the system, becomes increasingly more vital as the complexity of the system and the number of knowledge sources increase. Ensuring that violations of these properties do not occur becomes steadily more challenging. One approach toward managing the enforcement of system properties, called context monitoring, uses a centralized repository of integrity constraints and a constraint satisfiability mechanism for dynamic verification of property enforcement during program execution. The focus of this paper is to describe possible software architectures that define a mechanism for dynamically checking the satisfiability of a set of constraints on a program. The next section describes the context monitoring approach in general. Section 3 gives an overview of the work currently being done toward the addition of an integrity constraint satisfiability mechanism to a high-level program language, SequenceL, and demonstrates how this model is being examined to develop a general software architecture. Section 4 describes possible architectures for a general constraint satisfiability mechanism, as well as an alternative approach that uses embedded database queries in lieu of an external monitor. The paper concludes with a brief summary outlining the current state of the research and future work.
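A minimal sketch of the context-monitoring idea, assuming a hypothetical constraint registry and decorator (not SequenceL or the architectures proposed in the paper): integrity constraints live in one central repository and are checked dynamically every time a monitored function runs.

```python
# Centralized integrity constraints checked at run time (illustrative names only).
CONSTRAINTS = []                          # centralized repository of constraints

def constraint(description):
    def register(predicate):
        CONSTRAINTS.append((description, predicate))
        return predicate
    return register

def monitored(func):
    """Run the function, then verify every registered constraint on its result."""
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        for description, predicate in CONSTRAINTS:
            if not predicate(result):
                raise AssertionError(f"integrity constraint violated: {description}")
        return result
    return wrapper

@constraint("altitudes are non-negative")
def _non_negative(traj):
    return all(h >= 0 for h in traj)

@constraint("trajectory is monotonically descending")
def _descending(traj):
    return all(a >= b for a, b in zip(traj, traj[1:]))

@monitored
def compute_descent_profile(start, steps):
    return [max(start - 2.5 * k, 0.0) for k in range(steps)]

print(compute_descent_profile(100.0, 10))   # passes both constraints
```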
The Effects of Welfare and Employment Programs on Children's Participation in Head Start
ERIC Educational Resources Information Center
Chang, Young Eun; Huston, Aletha C.; Crosby, Danielle A.; Gennetian, Lisa A.
2007-01-01
We examine the effects of 10 welfare and employment programs on single mothers' use of Head Start for their 3- to 4-year-old children, considering concurrent program effects on employment, income, and the use of other types of childcare settings. In general, these welfare and employment experiments increased parental employment and the use of…
MELD: A Logical Approach to Distributed and Parallel Programming
Goldstein, Seth Copen; Cruz, Flavio
2012-03-01
...Comp. Sci., vol. 50, pp. 1–102, 1987. [33] P. López, F. Pfenning, J. Polakow, and K. Watkins, "Monadic concurrent linear logic programming," in
Investigating Concurrency in Weapons Programs
2010-10-01
manipulation. Defense AT&L: September-October 2010...acquisition's Feb. 6, 2006, memorandum, “Design/Build Concurrency,” identified the high degree...We have consistently reported on the elevated risk of poor program outcomes from the substantial overlap of development, test, and production...concurrency a program can experience before significant cost increase is incurred rages on. Advantages and Disadvantages: Intuitively, one can see the
ERIC Educational Resources Information Center
Prins, Esther; Gungor, Ramazan
2011-01-01
This article examines the consequences of two concurrent policy changes for family literacy programs in Pennsylvania: (1) the transition from federal (Even Start) to state funding and (2) the elimination of adult education as a work activity for welfare recipients over 22 years of age. Using qualitative data from 10 family literacy programs, the…
ERIC Educational Resources Information Center
Urbaczewski, Andrew; Urbaczewski, Lise
The objective of this study was to find the answers to two primary research questions: "Do students learn programming languages better when they are offered in a particular order, such as 4th generation languages before 3rd generation languages?"; and "Do students learn programming languages better when they are taken in separate semesters as…
Cancer caregivers' perceptions of an exercise and nutrition program.
Anton, Philip M; Partridge, Julie A; Morrissy, Margaret J
2013-03-01
Little research has addressed exercise and nutrition-based interventions for cancer caregivers. This study explored cancer caregivers' perceptions of participating in a structured exercise and nutrition program alongside cancer survivors for whom they provided care. In-depth, semi-structured interviews were conducted by one interviewer with 12 cancer caregivers about their experiences participating in a structured, 12-week exercise and nutrition program designed for cancer survivors and caregivers to complete concurrently. Interviews were conducted until data saturation was reached. Inductive content analysis from individual interviews indicated three separate, but interrelated, themes: (1) the program was a positive mechanism through which caregivers shared and supported the cancer journey concurrently with survivors, (2) the program led to perceived physical and psychological benefits for both caregivers and survivors, and (3) participants perceived that participation in the program led to feeling increased social support in their caregiving duties. Findings from this study suggest that participating in an exercise- and nutrition-based intervention is viewed positively by caregivers and that the outcomes are seen as beneficial to both caregivers and survivors. Interventions that address the health needs of both members of the caregiver-survivor dyad should continue to be encouraged by allied health professionals.
Rezansoff, Stefanie N; Moniruzzaman, Akm; Clark, Elenore; Somers, Julian M
2015-10-31
The majority of Drug Treatment Court (DTC) research has examined the impact of DTCs on criminal recidivism. Comparatively little research has addressed the association between DTC participation and engagement with community-based health and social services. The present study investigated changes in participant involvement with outpatient healthcare and income assistance within a DTC cohort. We hypothesized that involvement with community-based (outpatient) health and social services would increase post-DTC participation, and that service levels would be higher among program graduates and offenders with histories of co-occurring mental and substance use disorders. Participants were 631 offenders at the DTC in Vancouver, Canada (DTCV). Administrative data representing hospital, outpatient medical care, and income assistance were examined one-year pre/post program to assess differences over time. Generalized estimating equations were used to investigate the association between changes in service use and program involvement. We also examined the relationship between level of service use and offender characteristics. Members of the cohort were disproportionately Aboriginal (33 %), had been sentenced 2.7 times in the 2 years preceding their index offence, and 50 % had been diagnosed with a non substance-related mental disorder in the five years preceding the index offence. The mean number of outpatient services post DTCV was 51, and the mean amount of social assistance paid was $5,897. Outpatient service use increased following exposure to DTCV (Adjusted Rate Ratio (ARR) = 1.45) and was significantly higher among women (ARR = 1.47), program graduation (ARR = 1.23), and those previously diagnosed with concurrent substance use and mental disorders (ARR = 4.92). Overall, hospital admissions did not increase post-program, although rates were significantly higher among women (ARR = 1.76) and those with concurrent disorders (ARR = 2.71). Income assistance increased significantly post program (ARR = 1.16), and was significantly higher among women (ARR = 1.03), and those diagnosed with substance use disorders (ARR = 1.42) and concurrent disorders (ARR = 1.72). These findings suggest that the DTCV was a catalyst for increased participant engagement with community health and social supports, and that rates of service use were consistently higher among women and individuals with concurrent disorders. Research is needed to investigate the potential link between health and social support and reductions in recidivism associated with DTCs.
NASA Technical Reports Server (NTRS)
Watts, G.
1992-01-01
A programming technique to eliminate computational instability in multibody simulations that use the Lagrange multiplier is presented. The computational instability occurs when the attached bodies drift apart and violate the constraints. The programming technique uses the constraint equation, instead of integration, to determine the coordinates that are not independent. Although the equations of motion are unchanged, a complete derivation of the incorporation of the Lagrange multiplier into the equation of motion for two bodies is presented. A listing of a digital computer program which uses the programming technique to eliminate computational instability is also presented. The computer program simulates a solid rocket booster and parachute connected by a frictionless swivel.
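To make the technique concrete, the following minimal Python sketch (an invented 1-D two-body system, not the report's booster/parachute model) integrates only the independent coordinate and recovers the dependent one from the constraint equation each step, so the bodies cannot drift apart numerically.

```python
import math

# Illustrative 1-D toy: two point masses joined by a rigid rod of length L.
# The pair's shared velocity and the position of body 1 are integrated;
# body 2's position comes from the constraint equation, not from integration.
m1, m2, L, dt = 10.0, 2.0, 1.5, 0.01
x1, v = 0.0, 0.0

def external_force(t):
    return 5.0 * math.sin(t)            # arbitrary forcing on the connected pair

for step in range(1000):
    t = step * dt
    a = external_force(t) / (m1 + m2)   # the rigid pair accelerates together
    v += a * dt
    x1 += v * dt
    x2 = x1 + L                         # dependent coordinate from the constraint,
                                        # so the constraint is never violated
print(x1, x2, x2 - x1)                  # separation stays exactly L
```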
Interactive two-stage stochastic fuzzy programming for water resources management.
Wang, S; Huang, G H
2011-08-01
In this study, an interactive two-stage stochastic fuzzy programming (ITSFP) approach has been developed through incorporating an interactive fuzzy resolution (IFR) method within an inexact two-stage stochastic programming (ITSP) framework. ITSFP can not only tackle dual uncertainties presented as fuzzy boundary intervals that exist in the objective function and the left- and right-hand sides of constraints, but also permit in-depth analyses of various policy scenarios that are associated with different levels of economic penalties when the promised policy targets are violated. A management problem in terms of water resources allocation has been studied to illustrate applicability of the proposed approach. The results indicate that a set of solutions under different feasibility degrees has been generated for planning the water resources allocation. They can help the decision makers (DMs) to conduct in-depth analyses of tradeoffs between economic efficiency and constraint-violation risk, as well as enable them to identify, in an interactive way, a desired compromise between satisfaction degree of the goal and feasibility of the constraints (i.e., risk of constraint violation). Copyright © 2011 Elsevier Ltd. All rights reserved.
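As a rough illustration of the two-stage recourse core of such models (the interval and fuzzy features are omitted, and the benefit, penalty, and scenario numbers below are invented), the following sketch solves a crisp water-allocation target problem with SciPy's linprog. Because the shortage penalty exceeds the per-unit benefit, the optimal promised target settles at an intermediate scenario level.

```python
import numpy as np
from scipy.optimize import linprog

benefit, penalty = 100.0, 160.0                 # penalty per unit shortage exceeds benefit
q = np.array([4.0, 7.0, 10.0])                  # available water in each scenario
p = np.array([0.2, 0.5, 0.3])                   # scenario probabilities
t_max = 12.0

# Decision vector [T, S_1, S_2, S_3]: promised target and recourse shortages.
c = np.concatenate([[-benefit], penalty * p])   # minimize -benefit*T + E[penalty*S]
A_ub = np.hstack([np.ones((3, 1)), -np.eye(3)]) # T - S_s <= q_s defines the shortages
res = linprog(c, A_ub=A_ub, b_ub=q,
              bounds=[(0, t_max)] + [(0, None)] * 3)
print(res.x[0], res.x[1:], -res.fun)            # target 7.0, shortage 3.0 in the dry scenario
```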
NASA Astrophysics Data System (ADS)
Zhao, Dang-Jun; Song, Zheng-Yu
2017-08-01
This study proposes a multiphase convex programming approach for rapid reentry trajectory generation that satisfies path, waypoint and no-fly zone (NFZ) constraints on Common Aerial Vehicles (CAVs). Because the time when the vehicle reaches the waypoint is unknown, the trajectory of the vehicle is divided into several phases according to the prescribed waypoints, rendering a multiphase optimization problem with free final time. Given the requirement for rapidity, the minimum flight time of each phase is preferred as the performance index in this research. Sequential linearization is used to approximate the nonlinear dynamics of the vehicle as well as the nonlinear concave path constraints on the heat rate, dynamic pressure, and normal load; meanwhile, convexification techniques are proposed to relax the concave constraints on control variables. Next, the original multiphase optimization problem is reformulated as a standard second-order cone programming problem. Theoretical analysis is conducted to show that the original problem and the converted problem have the same solution. Numerical results are presented to demonstrate that the proposed approach is efficient and effective.
Size matters: concurrency and the epidemic potential of HIV in small networks.
Carnegie, Nicole Bohme; Morris, Martina
2012-01-01
Generalized heterosexual epidemics are responsible for the largest share of the global burden of HIV. These occur in populations that do not have high rates of partner acquisition, and research suggests that a pattern of fewer, but concurrent, partnerships may be the mechanism that provides the connectivity necessary for sustained transmission. We examine how network size affects the impact of concurrency on network connectivity. We use a stochastic network model to generate a sample of networks, varying the size of the network and the level of concurrency, and compare the largest components for each scenario to the asymptotic expected values. While the threshold for the growth of a giant component does not change, the transition is more gradual in the smaller networks. As a result, low levels of concurrency generate more connectivity in small networks. Generalized HIV epidemics are by definition those that spread to a larger fraction of the population, but the mechanism may rely in part on the dynamics of transmission in a set of linked small networks. Examples include rural populations in sub-Saharan Africa and segregated minority populations in the US, where the effective size of the sexual network may well be in the hundreds, rather than thousands. Connectivity emerges at lower levels of concurrency in smaller networks, but these networks can still be disconnected with small changes in behavior. Concurrency remains a strategic target for HIV combination prevention programs in this context.
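The study's networks come from a stochastic (ERGM-based) sexual-network model; the sketch below only illustrates the size effect it reports, using plain Bernoulli random graphs near the giant-component threshold with networkx. Smaller networks show a more gradual transition, so proportionally more connectivity appears below the threshold.

```python
import networkx as nx
import numpy as np

def largest_component_fraction(n, mean_degree, trials=20, seed=0):
    """Average fraction of nodes in the largest connected component."""
    rng = np.random.default_rng(seed)
    fracs = []
    for _ in range(trials):
        g = nx.fast_gnp_random_graph(n, mean_degree / (n - 1),
                                     seed=int(rng.integers(1 << 30)))
        fracs.append(len(max(nx.connected_components(g), key=len)) / n)
    return np.mean(fracs)

for n in (100, 500, 5000):
    print(n, [round(largest_component_fraction(n, d), 3) for d in (0.8, 1.0, 1.2)])
# Below the threshold (mean degree < 1) the small networks retain relatively
# larger components, i.e. connectivity emerges at lower connectivity levels.
```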
Real-Time MENTAT programming language and architecture
NASA Technical Reports Server (NTRS)
Grimshaw, Andrew S.; Silberman, Ami; Liu, Jane W. S.
1989-01-01
Real-time MENTAT, a programming environment designed to simplify the task of programming real-time applications in distributed and parallel environments, is described. It is based on the same data-driven computation model and object-oriented programming paradigm as MENTAT. It provides an easy-to-use mechanism to exploit parallelism, language constructs for the expression and enforcement of timing constraints, and run-time support for scheduling and executing real-time programs. The real-time MENTAT programming language is an extended C++. The extensions are added to facilitate automatic detection of data flow and generation of data flow graphs, to express the timing constraints of individual granules of computation, and to provide scheduling directives for the runtime system. A high-level view of the real-time MENTAT system architecture and programming language constructs is provided.
Focus on Undergraduate Personal/Professional Preparation in Physical Education.
ERIC Educational Resources Information Center
Weinberg, Herman
A program is described that features an integrated course sequence and a continuous field-based experience. It focuses intensively on the individual needs and growth of the prospective physical education teacher. The primary components of the program are interrelated and designed to ensure that relevant content appears concurrently and…
Formal Specification and Verification of Concurrent Programs
1993-02-01
of examples from the emerging theory of programming languages. This book describes operating systems in general via the construction of MINIX, a UNIX look-alike that runs on IBM-PC compatibles. The book contains a complete MINIX manual and a complete listing of its C code.
Successful Concurrent Programs: An EXCELerate Program in Oklahoma
ERIC Educational Resources Information Center
Vargas, Juanita Gamez; Roach, Rick; David, Kevin M.
2014-01-01
The article presents the implementation and findings of a successful collaborative effort with the Oklahoma State Regents for Higher Education (OSRHE), Tulsa Community College (TCC), and two local public school districts, Tulsa Public Schools (TPS) and Union Public Schools (UPS). Known as EXCELerate, it's a five-semester dual enrollment pilot…
General Education at Albertus Magnus College.
ERIC Educational Resources Information Center
Savage, Mary
An alternative general education program for freshmen at Albertus Magnus College is described. The program, an interdisciplinary student-centered introduction to general education, is composed of two parts that the student takes concurrently: (1) a year-long seminar in thought and expression, and (2) a sequence of four (usually 7-week) courses in…
Not Just Anywhere, Anywhen: Mapping Change through Studio Work
ERIC Educational Resources Information Center
Tassoni, John Paul; Lewiecki-Wilson, Cynthia
2005-01-01
In this autoethnographic, institutional narrative, we describe the evolution of a Studio program at an open-access, regional campus of a state university. The Studio, first conceptualized by Grego and Thompson, is a one-credit writing workshop taken by students concurrently enrolled in a composition course. Developing this program necessitated…
ERIC Educational Resources Information Center
Harvey, Carl A., II; Kaplan, Allison G.
2007-01-01
This article highlights some of the exciting learning opportunities at the conference in Reno, Nevada. From an author strand running throughout the program to sessions on technology, collaboration, and general best practice, the concurrent programs promise something for everyone. There are exhibits that will showcase the latest in furniture,…
Using a Programmable Calculator to Teach Theophylline Pharmacokinetics.
ERIC Educational Resources Information Center
Closson, Richard Grant
1981-01-01
A calculator program for a Texas Instruments Model 59 to predict serum theophylline concentrations is described. The program accommodates the input of multiple dose times at irregular intervals, clearance changes due to concurrent patient diseases and age less than 17 years. The calculations for five hypothetical patients are given. (Author/MLW)
NASA Technical Reports Server (NTRS)
Kolb, Mark A.
1988-01-01
The Rubber Airplane program, which combines two symbolic processing techniques with a component-based database of design knowledge, is proposed as a computer aid for conceptual design. Using object-oriented programming, programs are organized around the objects and behavior to be simulated, and using constraint propagation, declarative statements designate mathematical relationships among all the equation variables. It is found that the additional level of organizational structure resulting from the arrangement of the design information in terms of design components provides greater flexibility and convenience.
[Constraints and opportunities for inter-sector health promotion initiatives: a case study].
Magalhães, Rosana
2015-07-01
This article analyzes the implementation of inter-sector initiatives linked to the Family Grant, Family Health, and School Health Programs in the Manguinhos neighborhood in the North Zone of Rio de Janeiro, Brazil. The study was conducted in 2010 and 2011 and included document review, local observation, and 25 interviews with program managers, professionals, and staff. This was an exploratory case study using a qualitative approach that identified constraints and opportunities for inter-sector health experiences, contributing to the debate on the effectiveness of health promotion and poverty relief programs.
Object matching using a locally affine invariant and linear programming techniques.
Li, Hongsheng; Huang, Xiaolei; He, Lei
2013-02-01
In this paper, we introduce a new matching method based on a novel locally affine-invariant geometric constraint and linear programming techniques. To model and solve the matching problem in a linear programming formulation, all geometric constraints should be able to be exactly or approximately reformulated into a linear form. This is a major difficulty for this kind of matching algorithm. We propose a novel locally affine-invariant constraint which can be exactly linearized and requires a lot fewer auxiliary variables than other linear programming-based methods do. The key idea behind it is that each point in the template point set can be exactly represented by an affine combination of its neighboring points, whose weights can be solved easily by least squares. Errors of reconstructing each matched point using such weights are used to penalize the disagreement of geometric relationships between the template points and the matched points. The resulting overall objective function can be solved efficiently by linear programming techniques. Our experimental results on both rigid and nonrigid object matching show the effectiveness of the proposed algorithm.
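The core construction can be sketched in a few lines: each template point is written as an affine combination of its neighbors, with the weights obtained by least squares, and an affine transformation of the whole template then leaves the reconstruction error at zero. The toy 2-D points below are illustrative, not the paper's data.

```python
import numpy as np

def affine_weights(point, neighbors):
    """Solve min ||point - w @ neighbors|| subject to sum(w) = 1 by least squares."""
    k = neighbors.shape[0]
    A = np.vstack([neighbors.T, np.ones(k)])   # extra row of ones enforces sum(w) = 1
    b = np.append(point, 1.0)
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    return w

template = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
w = affine_weights(template[3], template[:3])  # weights reconstructing the last point

# Applying an arbitrary affine map to the whole template keeps the error at ~0,
# which is what makes the reconstruction weights locally affine invariant.
A_map = np.array([[2.0, 0.3], [-0.1, 1.5]])
mapped = template @ A_map.T + np.array([5.0, -2.0])
print(np.linalg.norm(mapped[3] - w @ mapped[:3]))   # approximately 0
```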
NASA Technical Reports Server (NTRS)
Arneson, Heather M.; Dousse, Nicholas; Langbort, Cedric
2014-01-01
We consider control design for positive compartmental systems in which each compartment's outflow rate is described by a concave function of the amount of material in the compartment. We address the problem of determining the routing of material between compartments to satisfy time-varying state constraints while ensuring that material reaches its intended destination over a finite time horizon. We give sufficient conditions for the existence of a time-varying state-dependent routing strategy which ensures that the closed-loop system satisfies basic network properties of positivity, conservation and interconnection while ensuring that capacity constraints are satisfied, when possible, or adjusted if a solution cannot be found. These conditions are formulated as a linear programming problem. Instances of this linear programming problem can be solved iteratively to generate a solution to the finite horizon routing problem. Results are given for the application of this control design method to an example problem. Key words: linear programming; control of networks; positive systems; controller constraints and structure.
The Definition and Implementation of a Computer Programming Language Based on Constraints.
1980-08-01
though not quite reached, is a complete programming system which will implicitly support the constraint paradigm to the same extent that LISP, say...and detecting and resolving conflicts, just as LISP provides certain services such as automatic storage management, which records given data in a...defined; it permits the statement of equalities and some simple arithmetic relationships. An implementation representation is chosen, and LISP code for a
ERIC Educational Resources Information Center
Library Journal, 1971
1971-01-01
The preconference on Recruitment of Minorities featured a diversified program with an array of speakers, a panel discussion, concurrent sessions on problem solving, and summaries and critiques from two conference "floaters." (Author)
Safety and environmental constraints on space applications of fusion energy
NASA Technical Reports Server (NTRS)
Roth, J. Reece
1990-01-01
Some of the constraints on fusion reactions, plasma confinement systems, and fusion reactors intended for such space-related missions as manned or unmanned operations in near-Earth orbit, interplanetary missions, or requirements of the SDI program are examined. Of the many constraints on space power and propulsion systems, those arising from safety and environmental considerations are emphasized, since these considerations place severe constraints on some fusion systems and have not been adequately treated in previous studies.
2011-12-28
specify collaboration constraints that occur in Java and XML frameworks and that the collaboration constraints from these frameworks matter in practice. (a...programming language boundaries, and Chapter 6 and Appendix A demonstrate that Fusion can specify constraints across both Java and XML in practice. (c...designed JUnit, Josh Bloch designed Java Collections, and Krzysztof Cwalina designed the .NET Framework APIs. While all of these frameworks are very
Darmon, Nicole; Ferguson, Elaine L; Briend, André
2006-01-01
To predict, for French women, the impact of a cost constraint on the food choices required to provide a nutritionally adequate diet. Isocaloric daily diets fulfilling both palatability and nutritional constraints were modeled in linear programming, using different cost constraint levels. For each modeled diet, total departure from an observed French population's average food group pattern ("mean observed diet") was minimized. To achieve the nutritional recommendations without a cost constraint, the modeled diet provided more energy from fish, fresh fruits and green vegetables and less energy from animal fats and cheese than the "mean observed diet." Introducing and strengthening a cost constraint decreased the energy provided by meat, fresh vegetables, fresh fruits, vegetable fat, and yogurts and increased the energy from processed meat, eggs, offal, and milk. For the lowest cost diet (ie, 3.18 euros/d), marked changes from the "mean observed diet" were required, including a marked reduction in the amount of energy from fresh fruits (-85%) and green vegetables (-70%), and an increase in the amount of energy from nuts, dried fruits, roots, legumes, and fruit juices. Nutrition education for low-income French women must emphasize these affordable food choices.
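A minimal sketch of this kind of diet model, with invented foods, nutrient contents, and limits rather than the study's French food database: the decision variables are food amounts plus absolute deviations from the observed diet, and the total deviation is minimized subject to nutrient floors and a cost ceiling.

```python
import numpy as np
from scipy.optimize import linprog

foods   = ["fruit", "legumes", "meat", "grains"]      # illustrative data, per 100 g
cost    = np.array([0.50, 0.30, 1.20, 0.20])          # euros
protein = np.array([1.0, 8.0, 20.0, 3.0])             # grams
fibre   = np.array([2.0, 7.0, 0.0, 3.0])              # grams
obs     = np.array([3.0, 0.5, 1.5, 2.5])              # observed amounts, 100 g units

n = len(foods)
budget, protein_min, fibre_min = 3.0, 50.0, 25.0

# Variables are [x, d]: food amounts and absolute deviations from the observed diet.
c = np.concatenate([np.zeros(n), np.ones(n)])          # minimize total departure
I = np.eye(n)
A_ub = np.vstack([
    np.hstack([ I, -I]),                               #  x - d <= obs
    np.hstack([-I, -I]),                               # -x - d <= -obs
    np.hstack([-protein, np.zeros(n)]),                # protein >= protein_min
    np.hstack([-fibre,   np.zeros(n)]),                # fibre   >= fibre_min
    np.hstack([ cost,    np.zeros(n)]),                # cost    <= budget
])
b_ub = np.concatenate([obs, -obs, [-protein_min, -fibre_min, budget]])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (2 * n))
print(dict(zip(foods, res.x[:n].round(2))), "departure:", round(res.fun, 2))
```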
An algorithm for the solution of dynamic linear programs
NASA Technical Reports Server (NTRS)
Psiaki, Mark L.
1989-01-01
The algorithm's objective is to efficiently solve Dynamic Linear Programs (DLP) by taking advantage of their special staircase structure. This algorithm constitutes a stepping stone to an improved algorithm for solving Dynamic Quadratic Programs, which, in turn, would make the nonlinear programming method of Successive Quadratic Programs more practical for solving trajectory optimization problems. The ultimate goal is to bring trajectory optimization solution speeds into the realm of real-time control. The algorithm exploits the staircase nature of the large constraint matrix of the equality-constrained DLPs encountered when solving inequality-constrained DLPs by an active set approach. A numerically stable, staircase QL factorization of the staircase constraint matrix is carried out starting from its last rows and columns. The resulting recursion is like the time-varying Riccati equation from multi-stage LQR theory. The resulting factorization increases the efficiency of all of the typical LP solution operations over that of a dense matrix LP code. At the same time numerical stability is ensured. The algorithm also takes advantage of dynamic programming ideas about the cost-to-go by relaxing active pseudo constraints in a backwards sweeping process. This further decreases the cost per update of the LP rank-1 updating procedure, although it may result in more changes of the active set than if pseudo constraints were relaxed in a non-stagewise fashion. The usual stability of closed-loop Linear/Quadratic optimally-controlled systems, if it carries over to strictly linear cost functions, implies that the saving due to reduced factor update effort may outweigh the cost of an increased number of updates. An aerospace example is presented in which a ground-to-ground rocket's distance is maximized. This example demonstrates the applicability of this class of algorithms to aerospace guidance. It also sheds light on the efficacy of the proposed pseudo constraint relaxation scheme.
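The "staircase" being exploited is easiest to see by stacking the dynamics of a small system over a horizon; the sketch below only builds and prints that block-banded equality-constraint pattern for an invented double integrator, and does not reproduce the paper's QL factorization or active-set solver.

```python
import numpy as np

A = np.array([[1.0, 0.1],
              [0.0, 1.0]])          # double-integrator dynamics, dt = 0.1
B = np.array([[0.005],
              [0.1]])
T, nx, nu = 4, 2, 1
blk = nu + nx                       # variables ordered as [u_0, x_1, u_1, x_2, ...]

E = np.zeros((T * nx, T * blk))     # stacked equalities A x_t + B u_t - x_{t+1} = 0
for t in range(T):
    r = t * nx
    E[r:r + nx, t * blk: t * blk + nu] = B                   # + B u_t
    E[r:r + nx, t * blk + nu: (t + 1) * blk] = -np.eye(nx)   # - x_{t+1}
    if t > 0:
        E[r:r + nx, (t - 1) * blk + nu: t * blk] = A         # + A x_t (x_0 is given)

print((E != 0).astype(int))         # the nonzero pattern is the staircase the solver exploits
```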
A comparison of analysis methods to estimate contingency strength.
Lloyd, Blair P; Staubitz, Johanna L; Tapp, Jon T
2018-05-09
To date, several data analysis methods have been used to estimate contingency strength, yet few studies have compared these methods directly. To compare the relative precision and sensitivity of four analysis methods (i.e., exhaustive event-based, nonexhaustive event-based, concurrent interval, concurrent+lag interval), we applied all methods to a simulated data set in which several response-dependent and response-independent schedules of reinforcement were programmed. We evaluated the degree to which contingency strength estimates produced from each method (a) corresponded with expected values for response-dependent schedules and (b) showed sensitivity to parametric manipulations of response-independent reinforcement. Results indicated both event-based methods produced contingency strength estimates that aligned with expected values for response-dependent schedules, but differed in sensitivity to response-independent reinforcement. The precision of interval-based methods varied by analysis method (concurrent vs. concurrent+lag) and schedule type (continuous vs. partial), and showed similar sensitivities to response-independent reinforcement. Recommendations and considerations for measuring contingencies are identified. © 2018 Society for the Experimental Analysis of Behavior.
ACSYNT inner loop flight control design study
NASA Technical Reports Server (NTRS)
Bortins, Richard; Sorensen, John A.
1993-01-01
The NASA Ames Research Center developed the Aircraft Synthesis (ACSYNT) computer program to synthesize conceptual future aircraft designs and to evaluate critical performance metrics early in the design process before significant resources are committed and cost decisions made. ACSYNT uses steady-state performance metrics, such as aircraft range, payload, and fuel consumption, and static performance metrics, such as the control authority required for the takeoff rotation and for landing with an engine out, to evaluate conceptual aircraft designs. It can also optimize designs with respect to selected criteria and constraints. Many modern aircraft have stability provided by the flight control system rather than by the airframe. This may allow the aircraft designer to increase combat agility, or decrease trim drag, for increased range and payload. This strategy requires concurrent design of the airframe and the flight control system, making trade-offs of performance and dynamics during the earliest stages of design. ACSYNT presently lacks means to implement flight control system designs but research is being done to add methods for predicting rotational degrees of freedom and control effector performance. A software module to compute and analyze the dynamics of the aircraft and to compute feedback gains and analyze closed loop dynamics is required. The data gained from these analyses can then be fed back to the aircraft design process so that the effects of the flight control system and the airframe on aircraft performance can be included as design metrics. This report presents results of a feasibility study and the initial design work to add an inner loop flight control system (ILFCS) design capability to the stability and control module in ACSYNT. The overall objective is to provide a capability for concurrent design of the aircraft and its flight control system, and enable concept designers to improve performance by exploiting the interrelationships between aircraft and flight control system design parameters.
34 CFR 674.52 - Cancellation procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... EDUCATION, DEPARTMENT OF EDUCATION FEDERAL PERKINS LOAN PROGRAM Loan Cancellation § 674.52 Cancellation... loan. (d) Concurrent deferment period. The Secretary considers a Perkins Loan, NDSL or Defense Loan...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gomez, Daniel R., E-mail: dgomez@mdanderson.org; Gillin, Michael; Liao, Zhongxing
Background: Many patients with locally advanced non-small cell lung cancer (NSCLC) cannot undergo concurrent chemotherapy because of comorbidities or poor performance status. Hypofractionated radiation regimens, if tolerable, may provide an option to these patients for effective local control. Methods and Materials: Twenty-five patients were enrolled in a phase 1 dose-escalation trial of proton beam therapy (PBT) from September 2010 through July 2012. Eligible patients had histologically documented lung cancer, thymic tumors, carcinoid tumors, or metastatic thyroid tumors. Concurrent chemotherapy was not allowed, but concurrent treatment with biologic agents was. The dose-escalation schema comprised 15 fractions of 3 Gy (relative biological effectiveness [RBE])/fraction, 3.5 Gy(RBE)/fraction, or 4 Gy(RBE)/fraction. Dose constraints were derived from biologically equivalent doses of standard fractionated treatment. Results: The median follow-up time for patients alive at the time of analysis was 13 months (range, 8-28 months). Fifteen patients received treatment to hilar or mediastinal lymph nodes. Two patients experienced dose-limiting toxicity possibly related to treatment; 1 received 3.5-Gy(RBE) fractions and experienced an in-field tracheoesophageal fistula 9 months after PBT and 1 month after bevacizumab. The other patient received 4-Gy(RBE) fractions and was hospitalized for bacterial pneumonia/radiation pneumonitis 4 months after PBT. Conclusion: Hypofractionated PBT to the thorax delivered over 3 weeks was well tolerated even with significant doses to the lungs and mediastinal structures. Phase 2/3 trials are needed to compare the efficacy of this technique with standard treatment for locally advanced NSCLC.
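For readers unfamiliar with how such constraints are derived, the standard linear-quadratic biologically effective dose (BED) comparison looks roughly as follows; the alpha/beta ratio and the 66 Gy reference schedule are generic textbook assumptions, not the trial's actual planning parameters.

```python
# Linear-quadratic biologically effective dose: BED = n * d * (1 + d / (alpha/beta)).
def bed(n_fractions, dose_per_fraction, alpha_beta):
    return n_fractions * dose_per_fraction * (1 + dose_per_fraction / alpha_beta)

conventional = bed(33, 2.0, alpha_beta=10.0)   # 66 Gy in 2-Gy fractions (assumed reference)
hypofx_low   = bed(15, 3.0, alpha_beta=10.0)   # 45 Gy(RBE) arm
hypofx_high  = bed(15, 4.0, alpha_beta=10.0)   # 60 Gy(RBE) arm
print(conventional, hypofx_low, hypofx_high)   # 79.2, 58.5, 84.0 Gy10
```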
A new vertical grid nesting capability in the Weather Research and Forecasting (WRF) Model
Daniels, Megan H.; Lundquist, Katherine A.; Mirocha, Jeffrey D.; ...
2016-09-16
Mesoscale atmospheric models are increasingly used for high-resolution (<3 km) simulations to better resolve smaller-scale flow details. Increased resolution is achieved using mesh refinement via grid nesting, a procedure where multiple computational domains are integrated either concurrently or in series. A constraint in the concurrent nesting framework offered by the Weather Research and Forecasting (WRF) Model is that mesh refinement is restricted to the horizontal dimensions. This limitation prevents control of the grid aspect ratio, leading to numerical errors due to poor grid quality and preventing grid optimization. Here, a procedure permitting vertical nesting for one-way concurrent simulation is developed and validated through idealized cases. The benefits of vertical nesting are demonstrated using both mesoscale and large-eddy simulations (LES). Mesoscale simulations of the Terrain-Induced Rotor Experiment (T-REX) show that vertical grid nesting can alleviate numerical errors due to large aspect ratios on coarse grids, while allowing for higher vertical resolution on fine grids. Furthermore, the coarsening of the parent domain does not result in a significant loss of accuracy on the nested domain. LES of neutral boundary layer flow shows that, by permitting optimal grid aspect ratios on both parent and nested domains, use of vertical nesting yields improved agreement with the theoretical logarithmic velocity profile on both domains. Lastly, vertical grid nesting in WRF opens the path forward for multiscale simulations, allowing more accurate simulations spanning a wider range of scales than previously possible.
An Elegant Sufficiency: Load-Aware Differentiated Scheduling of Data Transfers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kettimuthu, Rajkumar; Vardoyan, Gayane; Agrawal, Gagan
2015-11-15
We investigate the file transfer scheduling problem, where transfers among different endpoints must be scheduled to maximize pertinent metrics. We propose two new algorithms that exploit the fact that the aggregate bandwidth obtained over a network or at a storage system tends to increase with the number of concurrent transfers—but only up to a certain limit. The first algorithm, SEAL, uses runtime information and data-driven models to approximate system load and adapt transfer schedules and concurrency so as to maximize performance while avoiding saturation. We implement this algorithm using GridFTP as the transfer protocol and evaluate it using real transfer logs in a production WAN environment. Results show that SEAL can improve average slowdowns and turnaround times by up to 25% and worst-case slowdown and turnaround times by up to 50%, compared with the best-performing baseline scheme. Our second algorithm, STEAL, further leverages user-supplied categorization of transfers as either “interactive” (requiring immediate processing) or “batch” (less time-critical). Results show that STEAL reduces the average slowdown of interactive transfers by 63% compared to the best-performing baseline and by 21% compared to SEAL. For batch transfers, compared to the best-performing baseline, STEAL improves by 18% the utilization of the bandwidth unused by interactive transfers. By elegantly ensuring a sufficient, but not excessive, allocation of concurrency to the right transfers, we significantly improve overall performance despite constraints.
NASA Astrophysics Data System (ADS)
McEvoy, Thomas Richard; Wolthusen, Stephen D.
Recent research on intrusion detection in supervisory data acquisition and control (SCADA) and DCS systems has focused on anomaly detection at protocol level based on the well-defined nature of traffic on such networks. Here, we consider attacks which compromise sensors or actuators (including physical manipulation), where intrusion may not be readily apparent as data and computational states can be controlled to give an appearance of normality, and sensor and control systems have limited accuracy. To counter these, we propose to consider indirect relations between sensor readings to detect such attacks through concurrent observations as determined by control laws and constraints.
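An illustrative consistency check of this kind is sketched below; the tank mass-balance "control law", sensor names, and tolerance are assumptions for the example, not taken from the paper.

```python
# Flag sensor readings whose implied level change contradicts the flow balance,
# i.e. an indirect relation between concurrently observed process variables.
def check_level_flow_consistency(level_prev, level_now, inflow, outflow,
                                 dt, area, tol):
    expected_delta = (inflow - outflow) * dt / area      # mass-balance prediction
    observed_delta = level_now - level_prev
    return abs(observed_delta - expected_delta) <= tol   # False => possible manipulation

# A manipulated level sensor reporting "no change" despite a net inflow:
ok = check_level_flow_consistency(level_prev=2.00, level_now=2.00,
                                  inflow=0.30, outflow=0.10,
                                  dt=10.0, area=4.0, tol=0.02)
print(ok)   # False: the concurrent observations are mutually inconsistent
```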
Pattern identification in time-course gene expression data with the CoGAPS matrix factorization.
Fertig, Elana J; Stein-O'Brien, Genevieve; Jaffe, Andrew; Colantuoni, Carlo
2014-01-01
Patterns in time-course gene expression data can represent the biological processes that are active over the measured time period. However, the orthogonality constraint in standard pattern-finding algorithms, including notably principal components analysis (PCA), confounds expression changes resulting from simultaneous, non-orthogonal biological processes. Previously, we have shown that Markov chain Monte Carlo nonnegative matrix factorization algorithms are particularly adept at distinguishing such concurrent patterns. One such matrix factorization is implemented in the software package CoGAPS. We describe the application of this software and several technical considerations for identification of age-related patterns in a public, prefrontal cortex gene expression dataset.
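CoGAPS itself is an R/Bioconductor package built on Markov chain Monte Carlo sampling; the sketch below only illustrates the underlying idea of a nonnegative, non-orthogonal factorization of a genes-by-time matrix, using scikit-learn's NMF on simulated data with two overlapping temporal patterns.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 20)
# Two overlapping (non-orthogonal) temporal programs plus noise.
patterns = np.vstack([np.exp(-((t - 0.3) ** 2) / 0.02),
                      np.exp(-((t - 0.5) ** 2) / 0.05)])
amplitudes = rng.gamma(2.0, 1.0, size=(200, 2))          # 200 simulated genes
data = amplitudes @ patterns + 0.05 * rng.random((200, 20))

model = NMF(n_components=2, init="nndsvda", max_iter=1000, random_state=0)
A = model.fit_transform(data)        # gene weights (analogous to the CoGAPS "A" matrix)
P = model.components_                # temporal patterns (analogous to "P")
print(P.shape)                       # 2 patterns x 20 time points
```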
Fox, Ashley M; Jackson, Sharon S; Hansen, Nathan B; Gasa, Nolwazi; Crewe, Mary; Sikkema, Kathleen J
2007-06-01
This study qualitatively examines the intersections of risk for intimate partner violence (IPV) and HIV infection in South Africa. Eighteen women seeking services for relationship violence were asked semistructured questions regarding their abusive experiences and HIV risk. Participants had experienced myriad forms of abuse, which reinforced each other to create a climate that sustained abuse and multiplied HIV risk. Male partners having multiple concurrent sexual relationships, and poor relationship communication compounded female vulnerability to HIV and abuse. A social environment of silence, male power, and economic constraints enabled abuse to continue. "Breaking the silence" and women's empowerment were suggested solutions.
X-33 Attitude Control Using the XRS-2200 Linear Aerospike Engine
NASA Technical Reports Server (NTRS)
Hall, Charles E.; Panossian, Hagop V.
1999-01-01
The Vehicle Control Systems Team at Marshall Space Flight Center, Structures and Dynamics Laboratory, Guidance and Control Systems Division is designing, under a cooperative agreement with Lockheed Martin Skunkworks, the Ascent, Transition, and Entry flight attitude control systems for the X-33 experimental vehicle. Test flights, while suborbital, will achieve sufficient altitudes and Mach numbers to test Single Stage To Orbit, Reusable Launch Vehicle technologies. Ascent flight control phase, the focus of this paper, begins at liftoff and ends at linear aerospike main engine cutoff (MECO). The X-33 attitude control system design is confronted by a myriad of design challenges: a short design cycle, the X-33 incremental test philosophy, the concurrent design philosophy chosen for the X-33 program, and the fact that the attitude control system design is, as usual, closely linked to many other subsystems and must deal with constraints and requirements from these subsystems. Additionally, however, and of special interest, the use of the linear aerospike engine is a departure from the gimbaled engines traditionally used for thrust vector control (TVC) in launch vehicles and poses certain design challenges. This paper discusses the unique problem of designing the X-33 attitude control system with the linear aerospike engine, requirements development, modeling and analyses that verify the design.
Optimal Planning and Problem-Solving
NASA Technical Reports Server (NTRS)
Clemet, Bradley; Schaffer, Steven; Rabideau, Gregg
2008-01-01
CTAEMS MDP Optimal Planner is problem-solving software designed to command a single spacecraft/rover, or a team of spacecraft/rovers, to perform the best action possible at all times according to an abstract model of the spacecraft/rover and its environment. It may also be useful in solving logistical problems encountered in commercial applications such as shipping and manufacturing. The planner reasons around uncertainty according to specified probabilities of outcomes using a plan hierarchy to avoid exploring certain kinds of suboptimal actions. Also, planned actions are calculated as the state-action space is expanded, rather than afterward, to reduce by an order of magnitude the processing time and memory used. The software solves planning problems with actions that can execute concurrently, that have uncertain duration and quality, and that have functional dependencies on others that affect quality. These problems are modeled in a hierarchical planning language called C_TAEMS, a derivative of the TAEMS language for specifying domains for the DARPA Coordinators program. In realistic environments, actions often have uncertain outcomes and can have complex relationships with other tasks. The planner approaches problems by considering all possible actions that may be taken from any state reachable from a given, initial state, and from within the constraints of a given task hierarchy that specifies what tasks may be performed by which team member.
Multinational Coproduction of Military Aerospace Systems.
1981-10-01
when the participating nations and industries have different and sometimes conflicting goals and practices, the endeavor is usually even more...development programs and concurrency of development and production activities. In this schedule framework, engine development problems contributed at least...that Panavia inevitably had to go through are frequently cited as lengthening the program. Thus, it is possible that the MRCA acquisition program is
Concurrent Engineering Teams. Volume 2: Annotated Bibliography
1990-11-01
publishes. They normally embody results of major projects which (a) have a direct bearing on decisions affecting major programs, (b) address...D., "What Processes do You Own? How are They Doing?," Program Manager, Journal of the Defense Systems Management College, September-October 1989, pp...216. The key ingredient to any successful TQM program is top management commitment and involvement. The early top management involvement reflects
Intergenerational enrollment and expenditure changes in Medicaid: trends from 1991 to 2005
2012-01-01
Background From its inception, Medicaid was aimed at providing insurance coverage for low income children, elderly, and disabled. Since this time, children have become a smaller proportion of the US population and Medicaid has expanded to additional eligibility groups. We sought to evaluate relative growth in spending in the Medicaid program between children and adults from 1991-2005. We hypothesize that this shifting demographic will result in fewer resources being allocated to children in the Medicaid program. Methods We utilized retrospective enrollment and expenditure data for children, adults and the elderly from 1991 to 2005 for both Medicaid and Children’s Health Insurance Program Medicaid expansion programs. Data were obtained from the Centers for Medicare and Medicaid Services using their Medicaid Statistical Information System. Results From 1991 to 2005, the number of enrollees increased by 83% to 58.7 million. This includes increases of 33% for children, 100% for adults and 50% for the elderly. Concurrently, total expenditures nationwide rose 150% to $273 billion. Expenditures for children increased from $23.4 to $65.7 billion, adults from $46.2 to $123.6 billion, and elderly from $39.2 to $71.3 billion. From 1999 to 2005, Medicaid spending on long-term care increased by 31% to $84.3 billion. Expenditures on the disabled grew by 61% to $119 billion. In total, the disabled account for 43% and long-term care 31%, of the total Medicaid budget. Conclusion Our study did not find an absolute decrease in the overall resources being directed toward children. However, increased spending on adults on a per-capita and absolute basis, particularly disabled adults, is responsible for much of the growth in spending over the past 15 years. Medicaid expenditures have grown faster than inflation and overall national health expenditures. A national strategy is needed to ensure adequate coverage for Medicaid recipients while dealing with the ongoing constraints of state and federal budgets. PMID:22992389
Ranking Forestry Investments With Parametric Linear Programming
Paul A. Murphy
1976-01-01
Parametric linear programming is introduced as a technique for ranking forestry investments under multiple constraints; it combines the advantages of simple ranking and linear programming as capital budgeting tools.
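A rough sketch of the idea follows, re-solving a small capital-budgeting LP while sweeping the budget right-hand side rather than performing a true parametric post-optimality analysis; the project data are invented.

```python
import numpy as np
from scipy.optimize import linprog

npv  = np.array([120.0, 95.0, 60.0, 40.0])   # net present value per project (illustrative)
cost = np.array([100.0, 70.0, 55.0, 25.0])   # capital required per project
land = np.array([30.0, 40.0, 10.0, 15.0])    # acres required per project

for budget in (50, 100, 150, 200):
    res = linprog(-npv,                                   # maximize total NPV
                  A_ub=np.vstack([cost, land]),
                  b_ub=[budget, 60.0],                    # capital and land limits
                  bounds=[(0, 1)] * 4)                    # fractional acceptance allowed
    print(budget, np.round(res.x, 2), round(-res.fun, 1))
# The ranking implied by which projects enter first as the budget grows
# is the information a parametric analysis extracts more directly.
```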
Declarative Programming with Temporal Constraints, in the Language CG.
Negreanu, Lorina
2015-01-01
Specifying and interpreting temporal constraints are key elements of knowledge representation and reasoning, with applications in temporal databases, agent programming, and ambient intelligence. We present and formally characterize the language CG, which tackles this issue. In CG, users are able to develop time-dependent programs, in a flexible and straightforward manner. Such programs can, in turn, be coupled with evolving environments, thus empowering users to control the environment's evolution. CG relies on a structure for storing temporal information, together with a dedicated query mechanism. Hence, we explore the computational complexity of our query satisfaction problem. We discuss previous implementation attempts of CG and introduce a novel prototype which relies on logic programming. Finally, we address the issue of consistency and correctness of CG program execution, using the Event-B modeling approach.
Using Block-local Atomicity to Detect Stale-value Concurrency Errors
NASA Technical Reports Server (NTRS)
Artho, Cyrille; Havelund, Klaus; Biere, Armin
2004-01-01
Data races do not cover all kinds of concurrency errors. This paper presents a data-flow-based technique to find stale-value errors, which are not found by low-level and high-level data race algorithms. Stale values denote copies of shared data where the copy is no longer synchronized. The algorithm to detect such values works as a consistency check that does not require any assumptions or annotations of the program. It has been implemented as a static analysis in JNuke. The analysis is sound and requires only a single execution trace if implemented as a run-time checking algorithm. Being based on an analysis of Java bytecode, it encompasses the full program semantics, including arbitrarily complex expressions. Related techniques are more complex and more prone to over-reporting.
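The paper analyzes Java bytecode; the Python sketch below (an invented account example) only illustrates the kind of stale-value error it targets: a shared value is copied out under one lock and written back under a later lock, so concurrent updates can be lost even though no low-level data race exists.

```python
import threading

balance = 0
lock = threading.Lock()

def deposit_stale(amount):
    global balance
    with lock:
        local = balance          # copy of shared state becomes stale after the lock is released
    new_value = local + amount   # computed outside any critical section
    with lock:
        balance = new_value      # may overwrite another thread's concurrent deposit

def deposit_atomic(amount):
    global balance
    with lock:                   # the whole read-modify-write stays in one critical section
        balance += amount

threads = [threading.Thread(target=deposit_stale, args=(1,)) for _ in range(1000)]
for th in threads: th.start()
for th in threads: th.join()
print(balance)                   # may print less than 1000: no data race, yet updates lost
```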
NASA Technical Reports Server (NTRS)
Tapia, R. A.; Vanrooy, D. L.
1976-01-01
A quasi-Newton method is presented for minimizing a nonlinear function while constraining the variables to be nonnegative and sum to one. The nonnegativity constraints were eliminated by working with the squares of the variables and the resulting problem was solved using Tapia's general theory of quasi-Newton methods for constrained optimization. A user's guide for a computer program implementing this algorithm is provided.
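The substitution at the heart of the method can be sketched as follows: x_i = y_i**2 removes the nonnegativity bounds and leaves a single equality constraint on the sum of squares. SciPy's SLSQP stands in for the report's quasi-Newton scheme, and the quadratic objective is invented.

```python
import numpy as np
from scipy.optimize import minimize

target = np.array([0.2, 0.5, 0.3])       # illustrative objective: match this distribution

def objective(y):
    x = y ** 2                            # automatically nonnegative
    return np.sum((x - target) ** 2)

constraint = {"type": "eq", "fun": lambda y: np.sum(y ** 2) - 1.0}   # sum of x equals one
result = minimize(objective, x0=np.sqrt(np.full(3, 1 / 3)),
                  method="SLSQP", constraints=[constraint])
x_opt = result.x ** 2                     # recover the original variables
print(np.round(x_opt, 3), round(x_opt.sum(), 3))   # approximately the target, sums to 1
```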
Agostini, Deborah; Vallorani, Luciana; Gioacchini, Annamaria; Guescini, Michele; Casadei, Lucia; Passalia, Annunziata; Del Sal, Marta; Piccoli, Giovanni; Andreani, Mauro; Federici, Ario; Stocchi, Vilberto
2017-01-01
Type 2 diabetes (T2D) is an age-related chronic disease associated with metabolic dysregulation, chronic inflammation, and activation of peripheral blood mononuclear cells (PBMC). The aim of this study was to assess the effects of a concurrent exercise training program on inflammatory status and metabolic parameters of T2D patients. Sixteen male patients (age range 55–70) were randomly assigned to an intervention group (n = 8), which underwent a concurrent aerobic and resistance training program (3 times a week; 16 weeks), or to a control group, which followed physicians' usual diabetes care advices. Training intervention significantly improved patients' body composition, blood pressure, total cholesterol, and overall fitness level. After training, plasma levels of adipokines leptin (−33.9%) and RBP4 (−21.3%), and proinflammatory markers IL-6 (−25.3%), TNF-α (−19.8%) and MCP-1 (−15.3%) decreased, whereas anabolic hormone IGF-1 level increased (+16.4%). All improvements were significantly greater than those of control patients. Plasma proteomic profile of exercised patients showed a reduction of immunoglobulin K light chain and fibrinogen as well. Training also induced a modulation of IL-6, IGF-1, and IGFBP-3 mRNAs in the PBMCs. These findings confirm that concurrent aerobic and resistance training improves T2D-related metabolic abnormalities and has the potential to reduce the deleterious health effects of diabetes-related inflammation. PMID:28713486
RSM 1.0 user's guide: A resupply scheduler using integer optimization
NASA Technical Reports Server (NTRS)
Viterna, Larry A.; Green, Robert D.; Reed, David M.
1991-01-01
The Resupply Scheduling Model (RSM) is a PC based, fully menu-driven computer program. It uses integer programming techniques to determine an optimum schedule to replace components on or before a fixed replacement period, subject to user defined constraints such as transportation mass and volume limits or available repair crew time. Principal input for RSM includes properties such as mass and volume and an assembly sequence. Resource constraints are entered for each period corresponding to the component properties. Though written to analyze the electrical power system on the Space Station Freedom, RSM is quite general and can be used to model the resupply of almost any system subject to user defined resource constraints. Presented here is a step-by-step procedure for preparing the input, performing the analysis, and interpreting the results. Instructions for installing the program and information on the algorithms are given.
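This is not the RSM code itself, but a small sketch of the kind of 0-1 model it solves, written with scipy.optimize.milp and invented component masses, due periods, capacities, and costs: each component must be shipped on or before its due period, and each period's total up-mass is capped.

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

mass      = np.array([120.0, 80.0, 200.0])   # up-mass per component (kg), illustrative
due       = np.array([1, 2, 2])              # latest allowed replacement period (0-based)
mass_cap  = np.array([250.0, 250.0, 250.0])  # transport mass available each period (kg)
ship_cost = np.array([1.0, 1.2, 1.5])        # relative cost of shipping in each period
n, T = mass.size, mass_cap.size

# x[i, p] = 1 if component i is replaced in period p, flattened row-major.
c = (mass[:, None] * ship_cost[None, :]).ravel()

ub = np.ones(n * T)
for i in range(n):
    ub[i * T + due[i] + 1:(i + 1) * T] = 0.0       # cannot ship after the due period

once = np.zeros((n, n * T))
for i in range(n):
    once[i, i * T:(i + 1) * T] = 1.0               # each component shipped exactly once
cap = np.zeros((T, n * T))
for p in range(T):
    cap[p, p::T] = mass                            # per-period transport mass limit

res = milp(c,
           integrality=np.ones(n * T),
           bounds=Bounds(np.zeros(n * T), ub),
           constraints=[LinearConstraint(once, lb=1.0, ub=1.0),
                        LinearConstraint(cap, ub=mass_cap)])
print(res.x.reshape(n, T).round().astype(int), res.fun)
```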
Building flexible real-time systems using the Flex language
NASA Technical Reports Server (NTRS)
Kenny, Kevin B.; Lin, Kwei-Jay
1991-01-01
The design and implementation of a real-time programming language called Flex, which is a derivative of C++, are presented. It is shown how different types of timing requirements might be expressed and enforced in Flex, how they might be fulfilled in a flexible way using different program models, and how the programming environment can help in making binding and scheduling decisions. The timing constraint primitives in Flex are easy to use yet powerful enough to define both independent and relative timing constraints. Program models like imprecise computation and performance polymorphism can carry out flexible real-time programs. In addition, programmers can use a performance measurement tool that produces statistically correct timing models to predict the expected execution time of a program and to help make binding decisions. A real-time programming environment is also presented.
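Flex expresses such constraints directly in the language; the Python sketch below (invented names, not Flex syntax) only mimics the imprecise-computation model mentioned above: a task yields successively better answers, and whatever was produced before the deadline is returned.

```python
import time

def with_deadline(task, fallback, deadline_s):
    """Return the most refined result produced before the deadline expires."""
    start = time.monotonic()
    result = fallback()
    for refined in task():                       # task yields successively better answers
        if time.monotonic() - start > deadline_s:
            break                                # keep the last result produced in time
        result = refined
    return result

def estimate_pi():
    total = 0.0
    for k in range(10_000_000):
        total += (-1) ** k / (2 * k + 1)
        if k % 100_000 == 0:
            yield 4 * total                      # an anytime, imprecise estimate

print(with_deadline(estimate_pi, lambda: 3.14, deadline_s=0.05))
```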
A programing system for research and applications in structural optimization
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, J.; Rogers, J. L., Jr.
1981-01-01
The paper describes a computer programming system designed to be used for methodology research as well as applications in structural optimization. The flexibility necessary for such diverse utilizations is achieved by combining, in a modular manner, a state-of-the-art optimization program, a production level structural analysis program, and user supplied and problem dependent interface programs. Standard utility capabilities existing in modern computer operating systems are used to integrate these programs. This approach results in flexibility of the optimization procedure organization and versatility in the formulation of constraints and design variables. Features shown in numerical examples include: (1) variability of structural layout and overall shape geometry, (2) static strength and stiffness constraints, (3) local buckling failure, and (4) vibration constraints. The paper concludes with a review of the further development trends of this programming system.
Chen, Wendy Y; Aertsens, Joris; Liekens, Inge; Broekx, Steven; De Nocker, Leo
2014-08-01
The strategic importance of ecosystem service valuation as an operational basis for policy decisions on natural restoration has been increasingly recognized in order to align the provision of ecosystem services with the expectation of human society. The contingent valuation method (CVM) is widely used to quantify various ecosystem services. However, two areas of concern arise: (1) whether people value specific functional ecosystem services and overlook some intrinsic aspects of natural restoration, and (2) whether people understand the temporal dimension of ecosystem services and payment schedules given in the contingent scenarios. Using a peri-urban riparian meadow restoration project in Flanders, Belgium as a case, we explored the impacts of residents' perceived importance of various ecosystem services and stated financial constraints on their willingness-to-pay for the proposed restoration project employing the CVM. The results indicated that people tended to value all the benefits of riparian ecosystem restoration concurrently, although they accorded different importances to each individual category of ecosystem services. A longer payment scheme can help the respondents to think more about the flow of ecosystem services into future generations. A weak temporal embedding effect can be detected, which might be attributed to respondents' concern about current financial constraints, rather than financial bindings associated with their income and perceived future financial constraints. This demonstrates the multidimensionality of respondents' financial concerns in CV. This study sheds light on refining future CV studies, especially with regard to public expectation of ecosystem services and the temporal dimension of ecosystem services and payment schedules.
Dual Enrollment in a Rural Environment: A Descriptive Quantitative Study
ERIC Educational Resources Information Center
Dodge, Mary Beth
2012-01-01
Dual enrollment is a federally funded program that offers high school students the opportunity to earn both high school and postsecondary credits for the same course. While the phenomenon of concurrent enrollment in postsecondary and college educational programs is not new, political support and public funding has drawn focus to the policies of…
ERIC Educational Resources Information Center
2000
This packet contains three papers on gender identity; power and influence styles in program planning; and white male backlash from a symposium on human resource development (HRD). The first paper, "Identification of Power and Influence Styles in Program Planning Practice" (Baiyin Yang), explores the relationship between HRD practitioners…
Targeted On-Demand Team Performance App Development
2018-02-01
ACCOMPLISHMENTS: Major Goals Task Description Status 1 Project Management Administration, oversight and management of all program tasks, expenditures...reporting charts, financial and project management protocols. Create, complete, and submit all documentation for program office and designated... project provided? All subjects participated in simulated emergency medicine events that included concurrent management of three patients with
Learned Resourcefulness and the Long-Term Benefits of a Chronic Pain Management Program
ERIC Educational Resources Information Center
Kennett, Deborah J.; O'Hagan, Fergal T.; Cezer, Diego
2008-01-01
A concurrent mixed methods approach was used to understand how learned resourcefulness empowers individuals. After completing Rosenbaum's Self-Control Schedule (SCS) measuring resourcefulness, 16 past clients of a multimodal pain clinic were interviewed about the kinds of pain-coping strategies they were practicing from the program. Constant…
Interpreting beyond Syntactics: A Semiotic Learning Model for Computer Programming Languages
ERIC Educational Resources Information Center
May, Jeffrey; Dhillon, Gurpreet
2009-01-01
In the information systems field there are numerous programming languages that can be used in specifying the behavior of concurrent and distributed systems. In the literature it has been argued that a lack of pragmatic and semantic consideration decreases the effectiveness of such specifications. In other words, to simply understand the syntactic…
DOT National Transportation Integrated Search
1994-12-01
The objectives of this study were to: 1) estimate the situation in year 2005 with the current TIMED program in operation, 2) estimate how long current TIMED funding must be extended to fully fund the projects listed in the TIMED program, and 3) estim...
A Professional Learning Program Designed to Increase K-12 Teachers' Instructional Technology Use
ERIC Educational Resources Information Center
Spencer, Lisa A.
2014-01-01
Despite the ready availability of many instructional-technology resources, many teachers in the researched Maryland school district are uncomfortable using technology to deliver content. This concurrent mixed methods case study examined the impact of Sharing Technology with Educators Program (STEP) on 269 K-12 teachers' technology use. The study…
Code of Federal Regulations, 2010 CFR
2010-10-01
... business status requirements will be processed concurrently by SBA. (c) All protests must be in writing and... as a service-disabled veteran-owned small business concern. 19.307 Section 19.307 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Determination...
Combining analysis with optimization at Langley Research Center. An evolutionary process
NASA Technical Reports Server (NTRS)
Rogers, J. L., Jr.
1982-01-01
The evolutionary process of combining analysis and optimization codes was traced with a view toward providing insight into the long term goal of developing the methodology for an integrated, multidisciplinary software system for the concurrent analysis and optimization of aerospace structures. It was traced along the lines of strength sizing, concurrent strength and flutter sizing, and general optimization to define a near-term goal for combining analysis and optimization codes. Development of a modular software system combining general-purpose, state-of-the-art, production-level analysis computer programs for structures, aerodynamics, and aeroelasticity with a state-of-the-art optimization program is required. Incorporation of a modular and flexible structural optimization software system into a state-of-the-art finite element analysis computer program will facilitate this effort. This effort resulted in a software system that is controlled with a special-purpose language, communicates with a data management system, and is easily modified to add new programs and capabilities. A 337 degree-of-freedom finite element model is used in verifying the accuracy of this system.
Update on Integrated Optical Design Analyzer
NASA Technical Reports Server (NTRS)
Moore, James D., Jr.; Troy, Ed
2003-01-01
Updated information on the Integrated Optical Design Analyzer (IODA) computer program has become available. IODA was described in Software for Multidisciplinary Concurrent Optical Design (MFS-31452), NASA Tech Briefs, Vol. 25, No. 10 (October 2001), page 8a. To recapitulate: IODA facilitates multidisciplinary concurrent engineering of highly precise optical instruments. The architecture of IODA was developed by reviewing design processes and software in an effort to automate design procedures. IODA significantly reduces design iteration cycle time and eliminates many potential sources of error. IODA integrates the modeling efforts of a team of experts in different disciplines (e.g., optics, structural analysis, and heat transfer) working at different locations and provides seamless fusion of data among thermal, structural, and optical models used to design an instrument. IODA is compatible with data files generated by the NASTRAN structural-analysis program and the Code V (Registered Trademark) optical-analysis program, and can be used to couple analyses performed by these two programs. IODA supports multiple-load-case analysis for quickly accomplishing trade studies. IODA can also model the transient response of an instrument under the influence of dynamic loads and disturbances.
Transputer parallel processing at NASA Lewis Research Center
NASA Technical Reports Server (NTRS)
Ellis, Graham K.
1989-01-01
The transputer parallel processing lab at NASA Lewis Research Center (LeRC) consists of 69 processors (transputers) that can be connected into various networks for use in general purpose concurrent processing applications. The main goal of the lab is to develop concurrent scientific and engineering application programs that will take advantage of the computational speed increases available on a parallel processor over the traditional sequential processor. Current research involves the development of basic programming tools. These tools will help standardize program interfaces to specific hardware by providing a set of common libraries for applications programmers. The thrust of the current effort is in developing a set of tools for graphics rendering/animation. The applications programmer currently has two options for on-screen plotting. One option can be used for static graphics displays and the other can be used for animated motion. The option for static display involves the use of 2-D graphics primitives that can be called from within an application program. These routines perform the standard 2-D geometric graphics operations in real-coordinate space as well as allowing multiple windows on a single screen.
A Framework for Dynamic Constraint Reasoning Using Procedural Constraints
NASA Technical Reports Server (NTRS)
Jonsson, Ari K.; Frank, Jeremy D.
1999-01-01
Many complex real-world decision and control problems contain an underlying constraint reasoning problem. This is particularly evident in a recently developed approach to planning, where almost all planning decisions are represented by constrained variables. This translates a significant part of the planning problem into a constraint network whose consistency determines the validity of the plan candidate. Since higher-level choices about control actions can add or remove variables and constraints, the underlying constraint network is invariably highly dynamic. Arbitrary domain-dependent constraints may be added to the constraint network and the constraint reasoning mechanism must be able to handle such constraints effectively. Additionally, real problems often require handling constraints over continuous variables. These requirements present a number of significant challenges for a constraint reasoning mechanism. In this paper, we introduce a general framework for handling dynamic constraint networks with real-valued variables, by using procedures to represent and effectively reason about general constraints. The framework is based on a sound theoretical foundation, and can be proven to be sound and complete under well-defined conditions. Furthermore, the framework provides hybrid reasoning capabilities, as alternative solution methods like mathematical programming can be incorporated into the framework, in the form of procedures.
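The paper defines the procedural-constraint framework formally; purely as an illustration, the sketch below shows one way a procedural constraint over real-valued interval domains might narrow a dynamic network to a fixpoint. All class and function names are hypothetical and are not the authors' API.

```python
# Illustrative sketch: procedural constraints over real-valued interval domains.
# Names and structure are invented for this example, not taken from the paper.

class IntervalVar:
    def __init__(self, name, lo, hi):
        self.name, self.lo, self.hi = name, lo, hi

    def narrow(self, lo, hi):
        """Intersect the current interval with [lo, hi]; return True if it changed."""
        new_lo, new_hi = max(self.lo, lo), min(self.hi, hi)
        if new_lo > new_hi:
            raise ValueError(f"{self.name}: empty domain, network inconsistent")
        changed = (new_lo, new_hi) != (self.lo, self.hi)
        self.lo, self.hi = new_lo, new_hi
        return changed

class Network:
    def __init__(self):
        self.constraints = []          # each constraint is a procedure over variables

    def add(self, procedure):          # constraints may be added (or removed) at run time
        self.constraints.append(procedure)

    def propagate(self):
        """Apply every procedural constraint until no domain changes (a fixpoint)."""
        changed = True
        while changed:
            changed = False
            for proc in list(self.constraints):
                if proc():
                    changed = True

# Example: z = x + y encoded as a procedure that narrows all three intervals.
x, y, z = IntervalVar("x", 0, 10), IntervalVar("y", 2, 5), IntervalVar("z", 0, 6)

def sum_constraint():
    changed = z.narrow(x.lo + y.lo, x.hi + y.hi)
    changed = x.narrow(z.lo - y.hi, z.hi - y.lo) or changed
    changed = y.narrow(z.lo - x.hi, z.hi - x.lo) or changed
    return changed

net = Network()
net.add(sum_constraint)
net.propagate()
print(x.lo, x.hi, y.lo, y.hi, z.lo, z.hi)   # x narrowed to [0, 4], z to [2, 6]
```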
Bliss, Jessica; Golden, Kate; Bourahla, Leila; Stoltzfus, Rebecca; Pelletier, David
2018-01-01
Background Assessment of the impact of emergency cash transfer programs on child nutritional status has been difficult to achieve due to the considerable logistic and ethical constraints that characterize humanitarian settings. Methods We present the findings from a quasi-experimental longitudinal study of a conditional emergency cash transfer program implemented by Concern Worldwide in 2012 during a food crisis in Tahoua, Niger, in which the use of a concurrent control group permits estimation of the program’s impact on child weight gain. Program beneficiaries received three transfers totaling approximately 65% of Niger’s gross national per capita income; mothers attended mandatory sessions on child and infant feeding and care practices. Dietary and anthropometric data from 211 vulnerable households and children targeted by the intervention were compared with 212 similarly vulnerable control households and children from the same 21 villages. We used multilevel mixed effects regression to estimate changes in weight and weight-for-height Z scores (WHZ) over time, and logistic regression to estimate the probability of acute malnutrition. Results We found the intervention to be associated with a 1.27 kg greater overall weight gain (P < 0.001) and a 1.82 greater overall gain in WHZ (P < 0.001). The odds of having acute malnutrition at the end of the intervention were 25 times higher among children in the comparison group than those in households receiving cash (P < 0.001). Conclusions We conclude that this emergency cash transfer program promoted child weight gain and reduced the risk of acute malnutrition among children in the context of a food crisis. We suspect that the use of strategic conditional terms and a valuable transfer size were key features in achieving this result. Limitations in study design prevent us from attributing impact to particular aspects of the program, and preclude a precise estimation of impact. Future studies of this nature would benefit from pre-baseline measurements, more exhaustive data collection on household characteristics and transfer use, and further investigation into the use of conditional terms in emergency settings. PMID:29497505
NASA Astrophysics Data System (ADS)
Malm, William C.; Schichtel, Bret A.; Hand, Jenny L.; Collett, Jeffrey L.
2017-10-01
Recent modeling and field studies have highlighted a relationship between sulfate concentrations and secondarily formed organic aerosols related to isoprene and other volatile biogenic gaseous emissions. The relationship between these biogenic emissions and sulfate is thought to be primarily associated with the effect of sulfate on aerosol acidity, increased aerosol water at high relative humidities, and aerosol volume. The Interagency Monitoring of Protected Visual Environments (IMPROVE) program provides aerosol concentration levels of sulfate (SO4) and organic carbon (OC) at 136 monitoring sites in rural and remote areas of the United States over time periods of between 15 and 28 years. This data set allows for an examination of relationships between these variables over time and space. The relative decreases in SO4 and OC were similar over most of the eastern United States, even though concentrations varied dramatically from one region to another. The analysis implied that for every unit decrease in SO4 there was about a 0.29 decrease in organic aerosol mass (OA = 1.8 × OC). This translated to a 2 μg/m3 decrease in biogenically derived secondary organic aerosol over 15 years in the southeastern United States. The analysis further implied that 35% and 27% in 2001 and 2015, respectively, of average total OA may be biogenically derived secondary organic aerosols and that there was a small but significant decrease in OA not linked to changes in SO4 concentrations. The analysis yields a constraint on ambient SO4-OC relationships that should help to refine and improve regional-scale chemical transport models.
Imai, Keisuke; Hamanaka, Masashi; Yamada, Takehiro; Yamazaki, Hidekazu; Yamamoto, Atsushi; Tsuto, Kazuma; Takegami, Tetsuro; Umezawa, Kunihiko; Ikeda, Eito; Mizuno, Toshiki
2014-01-01
Emergency neuroendovascular revascularization is a reperfusion therapy for acute stroke. The operator for this therapy has to obtain a license as a specialist in endovascular procedures. For neurologists wishing to acquire this license, there are two kinds of training programs: full-time training and concurrent training. Full-time training was chosen by the first author of this review, while concurrent training will be performed by staff in the author's department. The advantage of full-time training is the acquisition of a lot of experience of various diseases that are treated with endovascular procedures and managed in the periprocedural period. However, full-time training has the disadvantages of a requirement to discontinue medical care of neurological diseases except for stroke and employment at a remote institution. The advantages and disadvantages of concurrent training are the reverse of those of full-time training. Neither training system can succeed without cooperation from Departments of Neurology in neighboring universities and the institutional Department of Neurosurgery. It is particularly important for each neurologist to establish a goal of becoming an operator for recanalization therapy alone or for all fields of endovascular procedures because training will differ for attainment of each operator's goal.
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, J.; Bhat, R. B.
1979-01-01
A finite element program is linked with a general purpose optimization program in a 'programing system' which includes user supplied codes that contain problem dependent formulations of the design variables, objective function and constraints. The result is a system adaptable to a wide spectrum of structural optimization problems. In a sample of numerical examples, the design variables are the cross-sectional dimensions and the parameters of overall shape geometry, constraints are applied to stresses, displacements, buckling and vibration characteristics, and structural mass is the objective function. Thin-walled, built-up structures and frameworks are included in the sample. Details of the system organization and characteristics of the component programs are given.
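As a rough illustration of this "programming system" pattern, the sketch below plugs a user-supplied analysis routine into a general-purpose optimizer (SciPy's SLSQP): cross-sectional areas are the design variables, structural mass is the objective, and stresses are constrained. The toy two-member problem, its numbers, and the analysis routine are hypothetical stand-ins for the finite element and optimization codes described.

```python
# Sketch of linking a user-supplied analysis routine (a stand-in for a finite
# element code) with a general-purpose optimizer. All numbers are hypothetical.
import numpy as np
from scipy.optimize import minimize

RHO, LENGTH = 2.7e3, 1.0            # material density [kg/m^3], member length [m]
FORCES = np.array([2.0e4, 1.5e4])   # fixed member forces [N] -- crude stand-in for the analysis
SIGMA_ALLOW = 150.0e6               # allowable stress [Pa]

def analysis(areas):
    """User-supplied 'analysis code': returns member stresses for given areas."""
    return FORCES / areas

def mass(areas):                    # objective function: structural mass
    return RHO * LENGTH * np.sum(areas)

def stress_margins(areas):          # inequality constraints: sigma_allow - sigma >= 0
    return SIGMA_ALLOW - analysis(areas)

result = minimize(
    mass,
    x0=np.array([5e-4, 5e-4]),                    # initial cross-sectional areas [m^2]
    method="SLSQP",
    bounds=[(1e-5, 1e-2)] * 2,                    # side constraints on the design variables
    constraints=[{"type": "ineq", "fun": stress_margins}],
)
print(result.x, mass(result.x))     # optimal areas drive each stress to its allowable value
```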
A rigorous approach to self-checking programming
NASA Technical Reports Server (NTRS)
Hua, Kien A.; Abraham, Jacob A.
1986-01-01
Self-checking programming is shown to be an effective concurrent error detection technique. The reliability of a self-checking program, however, relies on the quality of its assertion statements. A self-checking program written without formal guidelines could provide poor coverage of errors. A constructive technique for self-checking programming is presented. A Structured Program Design Language (SPDL) suitable for self-checking software development is defined. A set of formal rules was also developed that allows the transformation of SPDL designs into self-checking designs to be done in a systematic manner.
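The sketch below illustrates only the general idea of executable assertion statements providing concurrent error detection; it is not the SPDL notation or the paper's formal rules, and the routine and its checks are invented for this example.

```python
# Minimal sketch of self-checking programming with executable assertions.
from collections import Counter

def self_checking_sort(values):
    """Sort with concurrent error detection via assertion statements."""
    original = list(values)                 # keep a copy for the postcondition checks
    result = sorted(values)

    # Assertion 1: the output is ordered.
    assert all(result[i] <= result[i + 1] for i in range(len(result) - 1)), \
        "ordering assertion violated"
    # Assertion 2: the output is a permutation of the input (nothing lost or invented),
    # checked with a diverse mechanism (multiset comparison) rather than re-sorting.
    assert Counter(original) == Counter(result), "permutation assertion violated"
    return result

print(self_checking_sort([3, 1, 2]))        # [1, 2, 3]; a faulty sort would trip an assertion
```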
45 CFR 2553.25 - What are a sponsor's administrative responsibilities?
Code of Federal Regulations, 2010 CFR
2010-10-01
...) CORPORATION FOR NATIONAL AND COMMUNITY SERVICE THE RETIRED AND SENIOR VOLUNTEER PROGRAM Eligibility and... serve concurrently in another capacity, paid or unpaid, during established working hours. The project...
JWST Operations and the Phase I and II Process
NASA Astrophysics Data System (ADS)
Beck, Tracy L.
2010-07-01
The JWST operations and Phase I and Phase II process will build upon our knowledge of the current system in use for HST. The primary observing overheads associated with JWST observations, both direct and indirect, are summarized. While some key operations constraints for JWST may cause deviations from the HST model for proposal planning, the overall interface to JWST planning will use the APT and will appear similar to the HST interface. The requirement is to have a proposal planning model similar to HST, where proposals submitted to the TAC must have at least the minimum amount of information necessary for assessment of the strength of the science. However, a goal of the JWST planning process is to have the submitted Phase I proposal in executable form, and as complete as possible for many programs. JWST will have significant constraints on the spacecraft pointing and orient, so it is beneficial for the planning process to have these scheduling constraints on programs defined as early as possible. The guide field of JWST is also much smaller than the HST guide field, so searches for available guide stars for JWST science programs must be done at the Phase I deadline. The long range observing plan for each JWST cycle will be generated initially from the TAC-accepted programs at the Phase I deadline, and the LRP will be refined after the Phase II deadline when all scheduling constraints are defined.
Space station payload operations scheduling with ESP2
NASA Technical Reports Server (NTRS)
Stacy, Kenneth L.; Jaap, John P.
1988-01-01
The Mission Analysis Division of the Systems Analysis and Integration Laboratory at the Marshall Space Flight Center is developing a system of programs to handle all aspects of scheduling payload operations for Space Station. The Expert Scheduling Program (ESP2) is the heart of this system. The task of payload operations scheduling can be simply stated as positioning the payload activities in a mission so that they collect their desired data without interfering with other activities or violating mission constraints. ESP2 is an advanced version of the Experiment Scheduling Program (ESP) which was developed by the Mission Integration Branch beginning in 1979 to schedule Spacelab payload activities. The automatic scheduler in ESP2 is an expert system that embodies the rules that expert planners would use to schedule payload operations by hand. This scheduler uses depth-first searching, backtracking, and forward chaining techniques to place an activity so that constraints (such as crew, resources, and orbit opportunities) are not violated. It has an explanation facility to show why an activity was or was not scheduled at a certain time. The ESP2 user can also place the activities in the schedule manually. The program offers graphical assistance to the user and will advise when constraints are being violated. ESP2 also has an option to identify conflict introduced into an existing schedule by changes to payload requirements, mission constraints, and orbit opportunities.
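As an illustration of the depth-first search and backtracking placement strategy described (not ESP2 itself), the following sketch places invented activities on a timeline so that a single crew resource and each activity's allowed start opportunities are never violated.

```python
# Sketch of depth-first placement with backtracking over start opportunities.
# Activities, windows, and the single crew resource are hypothetical.

ACTIVITIES = [                      # (name, duration, crew needed, allowed start times)
    ("camera_obs",  2, 1, [0, 2, 4, 6]),
    ("furnace_run", 3, 1, [0, 1, 2, 3, 4, 5]),
    ("downlink",    1, 0, [5, 6, 7]),
]
CREW_CAPACITY = 1
HORIZON = 10

def violates(placements, activity, start):
    """True if placing the activity at start breaks the crew or horizon constraint."""
    _, dur, crew, _ = activity
    for t in range(start, start + dur):
        if t >= HORIZON:
            return True
        used = crew
        for (_, odur, ocrew, _), ostart in placements:
            if ostart <= t < ostart + odur:
                used += ocrew
        if used > CREW_CAPACITY:
            return True
    return False

def schedule(remaining, placements):
    """Depth-first search with backtracking over each activity's opportunities."""
    if not remaining:
        return placements
    activity = remaining[0]
    for start in activity[3]:                  # candidate opportunities for this activity
        if not violates(placements, activity, start):
            result = schedule(remaining[1:], placements + [(activity, start)])
            if result is not None:
                return result                  # placement succeeded deeper in the tree
    return None                                # backtrack: try another start upstream

plan = schedule(ACTIVITIES, [])
for (name, *_), start in plan:
    print(f"{name} starts at t={start}")
```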
ERIC Educational Resources Information Center
Raulston, Tracy Jane
2017-01-01
In this study, a concurrent randomized multiple baseline across three parent-child dyads single-case design was employed to evaluate the effects of a brief three-week parent training program, titled Practiced Routines. The Practiced Routines parent training program included positive behavior supports (PBS) and mindfulness strategies within the…
Hsu, John; Price, Mary; Vogeli, Christine; Brand, Richard; Chernew, Michael E; Chaguturu, Sreekanth K; Weil, Eric; Ferris, Timothy G
2017-05-01
Accountable care organizations (ACOs) appear to lower medical spending, but there is little information on how they do so. We examined the impact of patient participation in a Pioneer ACO and its care management program on rates of emergency department (ED) visits and hospitalizations and on Medicare spending. We used data for the period 2009-14, exploiting naturally staggered program entry to create concurrent controls to help isolate the program effects. The care management program (the ACO's primary intervention) targeted beneficiaries with elevated but modifiable risks for future spending. ACO participation had a modest effect on spending, in line with previous estimates. Participation in the care management program was associated with substantial reductions in rates for hospitalizations and both all and nonemergency ED visits, as well as Medicare spending, when compared to preparticipation levels and to rates and spending for a concurrent sample of beneficiaries who were eligible for but had not yet started the program. Rates of ED visits and hospitalizations were reduced by 6 percent and 8 percent, respectively, and Medicare spending was reduced by 6 percent. Targeting beneficiaries with modifiable high risks and shifting care away from the ED represent viable mechanisms for altering spending within ACOs. Project HOPE—The People-to-People Health Foundation, Inc.
Zhang, Linda; Norena, Monica; Gadermann, Anne; Hubley, Anita; Russell, Lara; Aubry, Tim; To, Matthew J; Farrell, Susan; Hwang, Stephen; Palepu, Anita
2018-01-01
Individuals who are homeless or vulnerably housed have a higher prevalence of concurrent disorders, defined as having a mental health diagnosis and problematic substance use, compared to the general housed population. The study objective was to investigate the effect of having concurrent disorders on health care utilization among homeless or vulnerably housed individuals, using longitudinal data from the Health and Housing in Transition Study. In 2009, 1190 homeless or vulnerably housed adults were recruited in Ottawa, Toronto, and Vancouver, Canada. Participants completed baseline interviews and four annual follow-up interviews, providing data on sociodemographics, housing history, mental health diagnoses, problematic drug use with the Drug Abuse Screening Test (DAST-10), problematic alcohol use with the Alcohol Use Disorders Identification Test (AUDIT), chronic health conditions, and utilization of the following health care services: emergency department (ED), hospitalization, and primary care. Concurrent disorders were defined as the participant having ever received a mental health diagnosis at baseline and having problematic substance use (i.e., DAST-10 ≥ 6 and/or AUDIT ≥ 20) at any time during the study period. Three generalized mixed effects logistic regression models were used to examine the independent association of having concurrent disorders and reporting ED use, hospitalization, or primary care visits in the past 12 months. Among our sample of adults who were homeless or vulnerably housed, 22.6% (n = 261) reported having concurrent disorders at baseline. Individuals with concurrent disorders had significantly higher odds of ED use (adjusted odds ratio [AOR] = 1.71; 95% confidence interval [CI], 1.4-2.11), hospitalization (AOR = 1.45; 95% CI, 1.16-1.81), and primary care visits (AOR = 1.34; 95% CI, 1.05-1.71) in the past 12 months over the four-year follow-up period, after adjusting for potential confounders. Concurrent disorders were associated with higher rates of health care utilization when compared to those without concurrent disorders among homeless and vulnerably housed individuals. Comprehensive programs that integrate mental health and addiction services with primary care as well as community-based outreach may better address the unmet health care needs of individuals living with concurrent disorders who are vulnerable to poor health outcomes.
Fuzzy robust credibility-constrained programming for environmental management and planning.
Zhang, Yimei; Hang, Guohe
2010-06-01
In this study, a fuzzy robust credibility-constrained programming (FRCCP) is developed and applied to the planning for waste management systems. It incorporates the concepts of credibility-based chance-constrained programming and robust programming within an optimization framework. The developed method can reflect uncertainties presented as possibility-density by fuzzy-membership functions. Fuzzy credibility constraints are transformed to the crisp equivalents with different credibility levels, and ordinary fuzzy inclusion constraints are determined by their robust deterministic constraints by setting α-cut levels. The FRCCP method can provide different system costs under different credibility levels (lambda). From the results of sensitivity analyses, the operation cost of the landfill is a critical parameter. For the management, any factors that would induce cost fluctuation during landfilling operation would deserve serious observation and analysis. By FRCCP, useful solutions can be obtained to provide decision-making support for long-term planning of solid waste management systems. It could be further enhanced through incorporating methods of inexact analysis into its framework. It can also be applied to other environmental management problems.
An innovative approach to compensator design
NASA Technical Reports Server (NTRS)
Mitchell, J. R.; Mcdaniel, W. L., Jr.
1973-01-01
The computer-aided design of a compensator for a control system is considered from a frequency domain point of view. The design technique developed is based on describing the open loop frequency response by n discrete frequency points which result in n functions of the compensator coefficients. Several of these functions are chosen so that the system specifications are properly portrayed; then mathematical programming is used to improve all of these functions which have values below minimum standards. To do this, several definitions in regard to measuring the performance of a system in the frequency domain are given, e.g., relative stability, relative attenuation, proper phasing, etc. Next, theorems which govern the number of compensator coefficients necessary to make improvements in a certain number of functions are proved. After this, a mathematical programming tool for aiding in the solution of the problem is developed. This tool is called the constraint improvement algorithm. Then, for applying the constraint improvement algorithm, generalized gradients for the constraints are derived. Finally, the necessary theory is incorporated in a computer program called CIP (Compensator Improvement Program). The practical usefulness of CIP is demonstrated by two large system examples.
A System for Automatically Generating Scheduling Heuristics
NASA Technical Reports Server (NTRS)
Morris, Robert
1996-01-01
The goal of this research is to improve the performance of automated schedulers by designing and implementing an algorithm that automatically generates heuristics for selecting a schedule. The particular application selected for applying this method is the problem of scheduling telescope observations, and is called the Associate Principal Astronomer. The input to the APA scheduler is a set of observation requests submitted by one or more astronomers. Each observation request specifies an observation program as well as scheduling constraints and preferences associated with the program. The scheduler employs greedy heuristic search to synthesize a schedule that satisfies all hard constraints of the domain and achieves a good score with respect to soft constraints expressed as an objective function established by an astronomer-user.
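A minimal sketch of greedy heuristic search in the spirit described above follows, with invented observation requests: hard constraints restrict the admissible nights, soft preferences are accumulated into an objective score, and a "most constrained first" heuristic orders the requests.

```python
# Greedy-heuristic scheduling sketch. All request data are hypothetical.

REQUESTS = {                      # request -> (allowed nights, {night: preference score})
    "galaxy_A": ([1, 2, 3], {1: 0.9, 2: 0.6, 3: 0.3}),
    "binary_B": ([2, 3],    {2: 0.8, 3: 0.7}),
    "nova_C":   ([2],       {2: 1.0}),
}

def greedy_schedule(requests):
    assigned, used_nights, score = {}, set(), 0.0
    # Heuristic ordering: most constrained request first (fewest allowed nights).
    for name in sorted(requests, key=lambda r: len(requests[r][0])):
        allowed, prefs = requests[name]
        # Hard constraint: the night must be allowed and not already taken.
        candidates = [n for n in allowed if n not in used_nights]
        if not candidates:
            continue                              # request left unscheduled
        best = max(candidates, key=lambda n: prefs.get(n, 0.0))
        assigned[name], score = best, score + prefs.get(best, 0.0)
        used_nights.add(best)
    return assigned, score

print(greedy_schedule(REQUESTS))   # nova_C takes night 2; the others fall back to nights 3 and 1
```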
Optimum structural design with plate bending elements - A survey
NASA Technical Reports Server (NTRS)
Haftka, R. T.; Prasad, B.
1981-01-01
A survey is presented of recently published papers in the field of optimum structural design of plates, largely with respect to the minimum-weight design of plates subject to such constraints as fundamental frequency maximization. It is shown that, due to the availability of powerful computers, the trend in optimum plate design is away from methods tailored to specific geometry and loads and toward methods that can be easily programmed for any kind of plate, such as finite element methods. A corresponding shift is seen in optimization from variational techniques to numerical optimization algorithms. Among the topics covered are fully stressed design and optimality criteria, mathematical programming, smooth and ribbed designs, design against plastic collapse, buckling constraints, and vibration constraints.
Symbolic Execution Enhanced System Testing
NASA Technical Reports Server (NTRS)
Davies, Misty D.; Pasareanu, Corina S.; Raman, Vishwanath
2012-01-01
We describe a testing technique that uses information computed by symbolic execution of a program unit to guide the generation of inputs to the system containing the unit, in such a way that the unit's, and hence the system's, coverage is increased. The symbolic execution computes unit constraints at run-time, along program paths obtained by system simulations. We use machine learning techniques, treatment learning and function fitting, to approximate the system input constraints that will lead to the satisfaction of the unit constraints. Execution of system input predictions either uncovers new code regions in the unit under analysis or provides information that can be used to improve the approximation. We have implemented the technique and we have demonstrated its effectiveness on several examples, including one from the aerospace domain.
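The following sketch, with an invented toy system and unit constraint, illustrates the learning step only in spirit: samples labelled by simulation are used to fit an approximation of the system inputs that satisfy the unit constraint, and fresh candidates are drawn from that approximation. A decision tree stands in here for the paper's treatment-learning and function-fitting machinery.

```python
# Sketch: approximate which system-level inputs satisfy a unit path constraint.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

def system(x):
    """Toy system: some nonlinear plumbing before the unit under analysis."""
    return 3.0 * x[0] - x[1] ** 2

def unit_constraint_satisfied(u):
    """Toy stand-in for a path condition collected by symbolic execution of the unit."""
    return 1.0 < u < 4.0

# 1. Simulate the system on random inputs and label each sample.
X = rng.uniform(-3, 3, size=(500, 2))
y = np.array([unit_constraint_satisfied(system(x)) for x in X])

# 2. Fit an approximation of the system-level input region of interest.
model = DecisionTreeClassifier(max_depth=4).fit(X, y)

# 3. Propose fresh inputs that the model predicts will drive the unit down the path.
candidates = rng.uniform(-3, 3, size=(2000, 2))
predicted = candidates[model.predict(candidates)]
hits = sum(unit_constraint_satisfied(system(x)) for x in predicted)
print(f"{hits}/{len(predicted)} predicted inputs actually satisfy the unit constraint")
```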
On the Probabilistic Deployment of Smart Grid Networks in TV White Space.
Cacciapuoti, Angela Sara; Caleffi, Marcello; Paura, Luigi
2016-05-10
To accommodate the rapidly increasing demand for wireless broadband communications in Smart Grid (SG) networks, research efforts are currently ongoing to enable the SG networks to utilize the TV spectrum according to the Cognitive Radio paradigm. To this aim, in this letter, we develop an analytical framework for the optimal deployment of multiple closely-located SG Neighborhood Area Networks (NANs) concurrently using the same TV spectrum. The objective is to derive the optimal values for both the number of NANs and their coverage. More specifically, regarding the number of NANs, we derive the optimal closed-form expression, i.e., the closed-form expression that assures the deployment of the maximum number of NANs in the considered region satisfying a given collision constraint on the transmissions of the NANs. Regarding the NAN coverage, we derive the optimal closed-form expression, i.e., the closed-form expression of the NAN transmission range that assures the maximum coverage of each NAN in the considered region satisfying the given collision constraint. All the theoretical results are derived by adopting a stochastic approach. Finally, numerical results validate the theoretical analysis.
Clustering Effects on Dynamics in Ionomer Solutions: A Neutron Spin Echo Insight
NASA Astrophysics Data System (ADS)
Perahia, Dvora; Wijesinghe, Sidath; Senanayake, Manjula; Wickramasinghe, Anuradhi; Mohottalalage, Supun S.; Ohl, Michael
Ionizable blocks in ionomers associate into aggregates serving as physical cross-links and concurrently form transport pathways. The dynamics of ionomers underlie their functionality. Incorporating small numbers of ionic groups into polymers significantly constrains their dynamics. Recent computational studies demonstrated a direct correlation between ionic cluster morphology and polymer dynamics. Here using neutron spin echo, we probe the segmental dynamics of polystyrene sulfonate (PSS) as the degree of sulfonation of the PSS and the solution dielectrics are varied. Specifically, 20Wt% PSS of 11,000 g/mol with polydispersity of 1.02 with 3% and 9% sulfonation were studied in toluene (dielectric constant ɛ = 2.8), a good solvent for polystyrene, and with 5Wt% of ethanol (ɛ = 24.3) added. The dynamic structure factor S(q,t) was analyzed with a single exponential except for a limited q range where two time constants associated with constrained and mobile segments were detected. S(q,t) exhibits several distinctive time and length scales for the dynamics with a crossover appearing at the length scale of the ionic clusters. NSF DMR 1611136.
Novel TMS coils designed using an inverse boundary element method
NASA Astrophysics Data System (ADS)
Cobos Sánchez, Clemente; María Guerrero Rodriguez, Jose; Quirós Olozábal, Ángel; Blanco-Navarro, David
2017-01-01
In this work, a new method to design TMS coils is presented. It is based on the inclusion of the concept of stream function of a quasi-static electric current into a boundary element method. The proposed TMS coil design approach is a powerful technique to produce stimulators of arbitrary shape, and remarkably versatile as it permits the prototyping of many different performance requirements and constraints. To illustrate the power of this approach, it has been used for the design of TMS coils wound on rectangular flat, spherical and hemispherical surfaces, subjected to different constraints, such as minimum stored magnetic energy or power dissipation. The performances of such coils have been additionally described, and the torque experienced by each stimulator in the presence of a main static magnetic field has been theoretically determined in order to study the prospect of using them to perform TMS and fMRI concurrently. The obtained results show that the described method is an efficient tool for the design of TMS stimulators, which can be applied to a wide range of coil geometries and performance requirements.
OPTIMAL NETWORK TOPOLOGY DESIGN
NASA Technical Reports Server (NTRS)
Yuen, J. H.
1994-01-01
This program was developed as part of a research study on the topology design and performance analysis for the Space Station Information System (SSIS) network. It uses an efficient algorithm to generate candidate network designs (consisting of subsets of the set of all network components) in increasing order of their total costs, and checks each design to see if it forms an acceptable network. This technique gives the true cost-optimal network, and is particularly useful when the network has many constraints and not too many components. It is intended that this new design technique consider all important performance measures explicitly and take into account the constraints due to various technical feasibilities. In the current program, technical constraints are taken care of by the user properly forming the starting set of candidate components (e.g. nonfeasible links are not included). As subsets are generated, they are tested to see if they form an acceptable network by checking that all requirements are satisfied. Thus the first acceptable subset encountered gives the cost-optimal topology satisfying all given constraints. The user must sort the set of "feasible" link elements in increasing order of their costs. The program prompts the user for the following information for each link: 1) cost, 2) connectivity (number of stations connected by the link), and 3) the stations connected by that link. Unless instructed to stop, the program generates all possible acceptable networks in increasing order of their total costs. The program is written only to generate topologies that are simply connected. Tests on reliability, delay, and other performance measures are discussed in the documentation, but have not been incorporated into the program. This program is written in PASCAL for interactive execution and has been implemented on an IBM PC series computer operating under PC DOS. The disk contains source code only. This program was developed in 1985.
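The enumeration idea can be sketched with hypothetical link data as follows: candidate designs are examined in increasing order of total cost, and the first one forming a connected network over all stations is the cost-optimal topology. The connectivity test stands in for the program's acceptability checks.

```python
# Sketch: examine candidate link subsets in increasing cost order and return
# the first acceptable (here: connected) design. Link data are hypothetical.
from itertools import combinations

STATIONS = {"A", "B", "C", "D"}
LINKS = [                          # (cost, stations connected by the link)
    (1.0, ("A", "B")),
    (1.5, ("B", "C")),
    (2.0, ("C", "D")),
    (2.5, ("A", "D")),
    (4.0, ("B", "D")),
]

def is_connected(links):
    """Check that the chosen links form one connected network over all stations."""
    if not links:
        return False
    reached, frontier = set(), {links[0][1][0]}
    while frontier:
        node = frontier.pop()
        reached.add(node)
        for _, (u, v) in links:
            if node == u and v not in reached:
                frontier.add(v)
            if node == v and u not in reached:
                frontier.add(u)
    return reached == STATIONS

candidates = sorted(
    (subset for r in range(1, len(LINKS) + 1) for subset in combinations(LINKS, r)),
    key=lambda d: sum(cost for cost, _ in d),
)
optimal = next(d for d in candidates if is_connected(list(d)))
print(optimal, sum(cost for cost, _ in optimal))   # A-B, B-C, C-D at total cost 4.5
```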
A Structural Evaluation of a Large-Scale Quasi-Experimental Microfinance Initiative
Kaboski, Joseph P.; Townsend, Robert M.
2010-01-01
This paper uses a structural model to understand, predict, and evaluate the impact of an exogenous microcredit intervention program, the Thai Million Baht Village Fund program. We model household decisions in the face of borrowing constraints, income uncertainty, and high-yield indivisible investment opportunities. After estimation of parameters using pre-program data, we evaluate the model’s ability to predict and interpret the impact of the village fund intervention. Simulations from the model mirror the data in yielding a greater increase in consumption than credit, which is interpreted as evidence of credit constraints. A cost-benefit analysis using the model indicates that some households value the program much more than its per household cost, but overall the program costs 20 percent more than the sum of these benefits. PMID:22162594
A Structural Evaluation of a Large-Scale Quasi-Experimental Microfinance Initiative.
Kaboski, Joseph P; Townsend, Robert M
2011-09-01
This paper uses a structural model to understand, predict, and evaluate the impact of an exogenous microcredit intervention program, the Thai Million Baht Village Fund program. We model household decisions in the face of borrowing constraints, income uncertainty, and high-yield indivisible investment opportunities. After estimation of parameters using pre-program data, we evaluate the model's ability to predict and interpret the impact of the village fund intervention. Simulations from the model mirror the data in yielding a greater increase in consumption than credit, which is interpreted as evidence of credit constraints. A cost-benefit analysis using the model indicates that some households value the program much more than its per household cost, but overall the program costs 20 percent more than the sum of these benefits.
ERIC Educational Resources Information Center
Jensen, Murray; Mattheis, Allison; Loyle, Anne
2013-01-01
This article describes a one-semester anatomy and physiology course that is currently offered through the concurrent enrollment program at the University of Minnesota. The article explains how high school teachers are prepared to teach the course and describes efforts to promote program quality, student inquiry, and experiential learning.…
Relationships between Instructional Quality and Classroom Management for Beginning Urban Teachers
ERIC Educational Resources Information Center
Kwok, Andrew
2017-01-01
This mixed-methods study explores the differences in 1st-year urban teachers' classroom management beliefs and actions. The teachers in this study were in their first year of teaching in an urban context concurrent with their participation in a teacher education program offered at a large public university. Using program-wide surveys of 89…
Exploring the Popperian Framework in a Pre-Service Teacher Education Program
ERIC Educational Resources Information Center
Chitpin, Stephanie; Simon, Marielle
2006-01-01
The study reported in this article is derived from a critical analysis of the work of 28 pre-service teachers enrolled in the course "Teaching elementary language arts" in a Bachelor of Education concurrent program in a southern State university. The pre-service teachers were taught how to use an innovative knowledge-building framework based on…
ERIC Educational Resources Information Center
Western Interstate Commission for Higher Education, 2006
2006-01-01
This document was designed to inform members of the policy, education, and research communities about existing state and institutional policies and practices associated with four accelerated learning programs: Advanced Placement (AP), dual/concurrent enrollment, the International Baccalaureate (IB) Diploma Program, and Tech-Prep. This effort was…
ERIC Educational Resources Information Center
Esmaily, Hamideh M.; Silver, Ivan; Shiva, Shadi; Gargani, Alireza; Maleki-Dizaji, Nasrin; Al-Maniri, Abdullah; Wahlstrom, Rolf
2010-01-01
Introduction: An outcome-based education approach has been proposed to develop more effective continuing medical education (CME) programs. We have used this approach in developing an outcome-based educational intervention for general physicians working in primary care (GPs) and evaluated its effectiveness compared with a concurrent CME program in…
ERIC Educational Resources Information Center
Beare, Paul L.
This study reviews the effects of training and service in a student advocacy program for Emotionally Disturbed (ED) children on attitudes of 16 secondary teachers toward ED children in the regular class. The intervention program involved 6 days of inservice training on working with ED students, delivered concurrent with the teachers' serving in an…
Computer Program Re-layers Engineering Drawings
NASA Technical Reports Server (NTRS)
Crosby, Dewey C., III
1990-01-01
RULCHK computer program aids in structuring layers of information pertaining to part or assembly designed with software described in article "Software for Drawing Design Details Concurrently" (MFS-28444). Checks and optionally updates structure of layers for part. Enables designer to construct model and annotate its documentation without burden of manually layering part to conform to standards at design time.
MacDonall, James S
2017-09-01
Some have reported that changing the schedule at one alternative of a concurrent schedule changed responding at the other alternative (Catania, 1969), which seems odd because no contingencies were changed there. When concurrent schedules are programmed using two schedules, one associated with each alternative, that operate continuously, changing the schedule at one alternative also changes the switch schedule at the other alternative. Thus, changes in responding at the constant alternative could be due to the change in the switch schedule. To assess this possibility, six rats were exposed to a series of conditions that alternated between pairs of interval schedules at both alternatives and a pair of interval schedules at one, constant, alternative and a pair of extinction schedules at the other alternative. Comparing run lengths, visit durations and response rates at the constant alternative in the alternating conditions did not show consistent increases and decreases when a strict criterion for changes was used. Using a less stringent definition (any change in mean values) showed changes. The stay/switch analysis suggests it may be inaccurate to apply behavioral contrast to procedures that change from concurrent variable-interval variable-interval schedules to concurrent variable-interval extinction schedules because the contingencies in neither alternative are constant. © 2017 Society for the Experimental Analysis of Behavior.
Cooperating reduction machines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kluge, W.E.
1983-11-01
This paper presents a concept and a system architecture for the concurrent execution of program expressions of a concrete reduction language based on lambda-expressions. If formulated appropriately, these expressions are well-suited for concurrent execution, following a demand-driven model of computation. In particular, recursive program expressions with nonlinear expansion may, at run time, recursively be partitioned into a hierarchy of independent subexpressions which can be reduced by a corresponding hierarchy of virtual reduction machines. This hierarchy unfolds and collapses dynamically, with virtual machines recursively assuming the role of masters that create and eventually terminate, or synchronize with, slaves. The paper also proposes a nonhierarchically organized system of reduction machines, each featuring a stack architecture, that effectively supports the allocation of virtual machines to the real machines of the system in compliance with their hierarchical order of creation and termination. 25 references.
NASA Astrophysics Data System (ADS)
Schoitsch, Erwin
1988-07-01
Our society is depending more and more on the reliability of embedded (real-time) computer systems even in every-day life. Considering the complexity of the real world, this might become a severe threat. Real-time programming is a discipline important not only in process control and data acquisition systems, but also in fields like communication, office automation, interactive databases, interactive graphics and operating systems development. General concepts of concurrent programming and constructs for process-synchronization are discussed in detail. Tasking and synchronization concepts, methods of process communication, interrupt- and timeout handling in systems based on semaphores, signals, conditional critical regions or on real-time languages like Concurrent PASCAL, MODULA, CHILL and ADA are explained and compared with each other and with respect to their potential to quality and safety.
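As a small illustration of the semaphore-based synchronization constructs such surveys compare (shown here with Python threads rather than a real-time language like Concurrent PASCAL, MODULA, CHILL, or ADA), a bounded-buffer producer and consumer might look as follows; the buffer size and item counts are arbitrary.

```python
# Bounded-buffer producer/consumer with semaphores guarding a critical region.
import threading

buffer, MAX_ITEMS = [], 5
empty_slots = threading.Semaphore(MAX_ITEMS)   # counts free buffer slots
filled_slots = threading.Semaphore(0)          # counts items ready to consume
mutex = threading.Lock()                       # guards the critical region

def producer():
    for item in range(10):
        empty_slots.acquire()                  # block if the buffer is full
        with mutex:                            # critical region: touch shared state
            buffer.append(item)
        filled_slots.release()                 # signal the consumer

def consumer():
    for _ in range(10):
        filled_slots.acquire()                 # block until an item is available
        with mutex:
            item = buffer.pop(0)
        empty_slots.release()
        print("consumed", item)

threads = [threading.Thread(target=producer), threading.Thread(target=consumer)]
for t in threads: t.start()
for t in threads: t.join()
```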
Nonrecursive formulations of multibody dynamics and concurrent multiprocessing
NASA Technical Reports Server (NTRS)
Kurdila, Andrew J.; Menon, Ramesh
1993-01-01
Since the late 1980's, research in recursive formulations of multibody dynamics has flourished. Historically, much of this research can be traced to applications of low dimensionality in mechanism and vehicle dynamics. Indeed, there is little doubt that recursive order N methods are the method of choice for this class of systems. This approach has the advantage that a minimal number of coordinates are utilized, parallelism can be induced for certain system topologies, and the method is of order N computational cost for systems of N rigid bodies. Despite the fact that many authors have dismissed redundant coordinate formulations as being of order N(exp 3), and hence less attractive than recursive formulations, we present recent research that demonstrates that at least three distinct classes of redundant, nonrecursive multibody formulations consistently achieve order N computational cost for systems of rigid and/or flexible bodies. These formulations are as follows: (1) the preconditioned range space formulation; (2) penalty methods; and (3) augmented Lagrangian methods for nonlinear multibody dynamics. The first method can be traced to its foundation in equality constrained quadratic optimization, while the last two methods have been studied extensively in the context of coercive variational boundary value problems in computational mechanics. Until recently, however, they have not been investigated in the context of multibody simulation, and present theoretical questions unique to nonlinear dynamics. All of these nonrecursive methods have additional advantages with respect to recursive order N methods: (1) the formalisms retain the highly desirable order N computational cost; (2) the techniques are amenable to concurrent simulation strategies; (3) the approaches do not depend upon system topology to induce concurrency; and (4) the methods can be derived to balance the computational load automatically on concurrent multiprocessors. In addition to the presentation of the fundamental formulations, this paper presents new theoretical results regarding the rate of convergence of order N constraint stabilization schemes associated with the newly introduced class of methods.
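As a hedged aside, the augmented Lagrangian idea named above can be illustrated on a static toy problem (minimize f subject to an equality constraint g(x) = 0) rather than the multibody equations of motion; the sketch below conveys only the multiplier-plus-penalty mechanism and is not the authors' formulation.

```python
# Augmented Lagrangian sketch on a toy constrained minimization problem.
import numpy as np

def f(x):      return (x[0] - 2.0) ** 2 + (x[1] - 1.0) ** 2
def grad_f(x): return np.array([2 * (x[0] - 2.0), 2 * (x[1] - 1.0)])
def g(x):      return x[0] + x[1] - 1.0          # equality constraint g(x) = 0
def grad_g(x): return np.array([1.0, 1.0])

x, lam, rho = np.zeros(2), 0.0, 10.0             # iterate, multiplier, penalty weight
for outer in range(20):
    for inner in range(200):                     # inexact inner minimization (gradient steps)
        grad_L = grad_f(x) + (lam + rho * g(x)) * grad_g(x)
        x -= 0.02 * grad_L
    lam += rho * g(x)                            # multiplier update keeps the penalty bounded
print(x, g(x))                                   # converges near (1.0, 0.0) with g(x) ~ 0
```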
On the road again: concurrency and condom use among Uganda truck drivers.
Costenbader, Elizabeth C; Lancaster, Kathryn; Bufumbo, Leonard; Akol, Angela; Guest, Greg
2015-01-01
Long-distance truck drivers have been shown to be a critical population in the spread of HIV in Africa. In 2009, surveys with 385 Ugandan long-distance truck drivers measured concurrency point prevalence with two methods; it ranged from 37.4% (calendar-method) to 50.1% (direct question). The majority (84%) of relationships reported were long-term resulting in a long duration of overlap (average of 58 months) across concurrent partnerships. Only 7% of these men reported using any condoms with their spouses during the past month. Among all non-spousal relationships, duration of relationship was the factor most strongly associated with engaging in unprotected sex in the past month in a multivariable analyses controlling for partner and relationship characteristics. Innovative intervention programs for these men and their partners are needed that address the realities of truck drivers' lifestyles.
Abstraction and Assume-Guarantee Reasoning for Automated Software Verification
NASA Technical Reports Server (NTRS)
Chaki, S.; Clarke, E.; Giannakopoulou, D.; Pasareanu, C. S.
2004-01-01
Compositional verification and abstraction are the key techniques to address the state explosion problem associated with model checking of concurrent software. A promising compositional approach is to prove properties of a system by checking properties of its components in an assume-guarantee style. This article proposes a framework for performing abstraction and assume-guarantee reasoning of concurrent C code in an incremental and fully automated fashion. The framework uses predicate abstraction to extract and refine finite state models of software and it uses an automata learning algorithm to incrementally construct assumptions for the compositional verification of the abstract models. The framework can be instantiated with different assume-guarantee rules. We have implemented our approach in the COMFORT reasoning framework and we show how COMFORT out-performs several previous software model checking approaches when checking safety properties of non-trivial concurrent programs.
NASA Technical Reports Server (NTRS)
Bensalem, Saddek; Ganesh, Vijay; Lakhnech, Yassine; Munoz, Cesar; Owre, Sam; Ruess, Harald; Rushby, John; Rusu, Vlad; Saiedi, Hassen; Shankar, N.
2000-01-01
To become practical for assurance, automated formal methods must be made more scalable, automatic, and cost-effective. Such an increase in scope, scale, automation, and utility can be derived from an emphasis on a systematic separation of concerns during verification. SAL (Symbolic Analysis Laboratory) attempts to address these issues. It is a framework for combining different tools to calculate properties of concurrent systems. The heart of SAL is a language, developed in collaboration with Stanford, Berkeley, and Verimag, for specifying concurrent systems in a compositional way. Our instantiation of the SAL framework augments PVS with tools for abstraction, invariant generation, program analysis (such as slicing), theorem proving, and model checking to separate concerns as well as calculate properties (i.e., perform symbolic analysis) of concurrent systems. We describe the motivation, the language, the tools, their integration in SAL/PAS, and some preliminary experience of their use.
Strategies for Energy Efficient Resource Management of Hybrid Programming Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Dong; Supinski, Bronis de; Schulz, Martin
2013-01-01
Many scientific applications are programmed using hybrid programming models that use both message-passing and shared-memory, due to the increasing prevalence of large-scale systems with multicore, multisocket nodes. Previous work has shown that energy efficiency can be improved using software-controlled execution schemes that consider both the programming model and the power-aware execution capabilities of the system. However, such approaches have focused on identifying optimal resource utilization for one programming model, either shared-memory or message-passing, in isolation. The potential solution space, thus the challenge, increases substantially when optimizing hybrid models since the possible resource configurations increase exponentially. Nonetheless, with the accelerating adoption of hybrid programming models, we increasingly need improved energy efficiency in hybrid parallel applications on large-scale systems. In this work, we present new software-controlled execution schemes that consider the effects of dynamic concurrency throttling (DCT) and dynamic voltage and frequency scaling (DVFS) in the context of hybrid programming models. Specifically, we present predictive models and novel algorithms based on statistical analysis that anticipate application power and time requirements under different concurrency and frequency configurations. We apply our models and methods to the NPB MZ benchmarks and selected applications from the ASC Sequoia codes. Overall, we achieve substantial energy savings (8.74% on average and up to 13.8%) with some performance gain (up to 7.5%) or negligible performance loss.
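A toy sketch of the selection step follows: given predictive models of execution time and power per (thread count, frequency) configuration, choose the configuration that minimizes predicted energy within a performance-loss bound. The models and numbers are invented placeholders, not the paper's regression models.

```python
# Sketch: pick a (threads, frequency) configuration by predicted energy.
THREADS = [4, 8, 16]
FREQS_GHZ = [1.2, 1.6, 2.0, 2.4]
MAX_SLOWDOWN = 1.20                  # tolerate at most 20% performance loss

def predicted_time(threads, freq):   # stand-in for a statistical time model
    return 100.0 / (threads ** 0.7) / (freq / 2.4) ** 0.8

def predicted_power(threads, freq):  # stand-in for a statistical power model
    return 20.0 + 3.0 * threads * (freq / 2.4) ** 2.5

baseline = predicted_time(max(THREADS), max(FREQS_GHZ))
best = min(
    ((t, f) for t in THREADS for f in FREQS_GHZ
     if predicted_time(t, f) <= MAX_SLOWDOWN * baseline),
    key=lambda cfg: predicted_time(*cfg) * predicted_power(*cfg),   # energy = time x power
)
print("chosen configuration:", best)  # trades a modest slowdown for lower predicted energy
```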
Guidelines for development structured FORTRAN programs
NASA Technical Reports Server (NTRS)
Earnest, B. M.
1984-01-01
Computer programming and coding standards were compiled to serve as guidelines for the uniform writing of FORTRAN 77 programs at NASA Langley. Software development philosophy, documentation, general coding conventions, and specific FORTRAN coding constraints are discussed.
A survey of methods of feasible directions for the solution of optimal control problems
NASA Technical Reports Server (NTRS)
Polak, E.
1972-01-01
Three methods of feasible directions for optimal control are reviewed. These methods are an extension of the Frank-Wolfe method, a dual method devised by Pironneau and Polak, and a Zoutendijk method. The categories of continuous optimal control problems are shown as: (1) fixed time problems with fixed initial state, free terminal state, and simple constraints on the control; (2) fixed time problems with inequality constraints on both the initial and the terminal state and no control constraints; (3) free time problems with inequality constraints on the initial and terminal states and simple constraints on the control; and (4) fixed time problems with inequality state space constraints and constraints on the control. The nonlinear programming algorithms are derived for each of the methods in its associated category.
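As an illustration of the Frank-Wolfe idea underlying the first method, applied here to a box-constrained quadratic rather than an optimal control problem, each iteration minimizes the linearized cost over the feasible set and steps toward that extreme point; the problem data are invented.

```python
# Frank-Wolfe (conditional gradient) sketch on a box-constrained quadratic.
import numpy as np

lo, hi = np.array([0.0, 0.0]), np.array([1.0, 1.0])    # simple bounds on the "control"
target = np.array([0.3, 1.4])                          # unconstrained minimizer (outside the box)

def grad(x):
    return 2.0 * (x - target)                          # gradient of ||x - target||^2

x = np.array([1.0, 0.0])
for k in range(100):
    g = grad(x)
    s = np.where(g > 0, lo, hi)        # linear subproblem: best corner of the box
    gamma = 2.0 / (k + 2)              # standard Frank-Wolfe step size
    x = x + gamma * (s - x)            # feasible direction step stays inside the box
print(x)                               # approaches the constrained optimum [0.3, 1.0]
```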
A Kind of Nonlinear Programming Problem Based on Mixed Fuzzy Relation Equations Constraints
NASA Astrophysics Data System (ADS)
Li, Jinquan; Feng, Shuang; Mi, Honghai
In this work, a kind of nonlinear programming problem with a non-differentiable objective function and constraints expressed by a system of mixed fuzzy relation equations is investigated. First, some properties of this kind of optimization problem are obtained. Then, a polynomial-time algorithm for this kind of optimization problem is proposed based on these properties. Furthermore, we show that this algorithm is optimal for the optimization problem considered in this paper. Finally, numerical examples are provided to illustrate our algorithms.
Constrained spacecraft reorientation using mixed integer convex programming
NASA Astrophysics Data System (ADS)
Tam, Margaret; Glenn Lightsey, E.
2016-10-01
A constrained attitude guidance (CAG) system is developed using convex optimization to autonomously achieve spacecraft pointing objectives while meeting the constraints imposed by on-board hardware. These constraints include bounds on the control input and slew rate, as well as pointing constraints imposed by the sensors. The pointing constraints consist of inclusion and exclusion cones that dictate permissible orientations of the spacecraft in order to keep objects in or out of the field of view of the sensors. The optimization scheme drives a body vector towards a target inertial vector along a trajectory that consists solely of permissible orientations in order to achieve the desired attitude for a given mission mode. The non-convex rotational kinematics are handled by discretization, which also ensures that the quaternion stays unity norm. In order to guarantee an admissible path, the pointing constraints are relaxed. Depending on how strict the pointing constraints are, the degree of relaxation is tuneable. The use of binary variables permits the inclusion of logical expressions in the pointing constraints in the case that a set of sensors has redundancies. The resulting mixed integer convex programming (MICP) formulation generates a steering law that can be easily integrated into an attitude determination and control (ADC) system. A sample simulation of the system is performed for the Bevo-2 satellite, including disturbance torques and actuator dynamics which are not modeled by the controller. Simulation results demonstrate the robustness of the system to disturbances while meeting the mission requirements with desirable performance characteristics.
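The kind of inclusion/exclusion cone constraint enforced by such a guidance scheme can be sketched as a simple feasibility check; the mixed integer convex program itself is not reproduced here, and the vectors and angles below are hypothetical.

```python
# Sketch of pointing-cone constraints: an orientation is admissible only if the
# boresight stays outside an exclusion cone (e.g. about the Sun) and inside an
# inclusion cone about the target.
import numpy as np

SUN = np.array([1.0, 0.0, 0.0])                  # inertial direction to the bright object
TARGET = np.array([0.0, 0.8, 0.6])               # inertial direction to the science target
EXCLUSION_HALF_ANGLE = np.radians(30.0)          # keep boresight this far from the Sun
INCLUSION_HALF_ANGLE = np.radians(10.0)          # keep boresight this close to the target

def admissible(boresight):
    b = boresight / np.linalg.norm(boresight)
    outside_exclusion = np.dot(b, SUN) <= np.cos(EXCLUSION_HALF_ANGLE)
    inside_inclusion = np.dot(b, TARGET / np.linalg.norm(TARGET)) >= np.cos(INCLUSION_HALF_ANGLE)
    return outside_exclusion and inside_inclusion

# A discretized slew would be checked (or constrained) orientation by orientation:
for step, b in enumerate([np.array([0.5, 0.5, 0.0]), np.array([0.1, 0.8, 0.55])]):
    print(step, admissible(b))                   # first orientation fails, second passes
```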
Hydropower, an energy source whose time has come again
NASA Astrophysics Data System (ADS)
1980-01-01
Recent price increases in imported oil demonstrate the urgency for the U.S. to rapidly develop its renewable resources. One such renewable resource for which technology is available now is hydropower. Studies indicate that hydropower potential, particularly at existing dam sites, can save the country hundreds of thousands of barrels of oil per day. But problems and constraints (economic, environmental, institutional, and operational) limit its full potential. Federal programs have had little impact on helping to bring hydro projects on line. Specifically, the Department of Energy's Small Hydro Program could do more to overcome hydro constraints and problems through an effective outreach program and more emphasis on demonstration projects.
NASA Technical Reports Server (NTRS)
Kolb, Mark A.
1990-01-01
Viewgraphs on Rubber Airplane: Constraint-based Component-Modeling for Knowledge Representation in Computer Aided Conceptual Design are presented. Topics covered include: computer aided design; object oriented programming; airfoil design; surveillance aircraft; commercial aircraft; aircraft design; and launch vehicles.
Continuous Optimization on Constraint Manifolds
NASA Technical Reports Server (NTRS)
Dean, Edwin B.
1988-01-01
This paper demonstrates continuous optimization on the differentiable manifold formed by continuous constraint functions. The first order tensor geodesic differential equation is solved on the manifold in both numerical and closed analytic form for simple nonlinear programs. Advantages and disadvantages with respect to conventional optimization techniques are discussed.
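A loose illustration of optimizing while respecting a constraint manifold is projected gradient descent on the unit sphere; note that the paper solves a geodesic differential equation rather than projecting, so the sketch below only conveys the flavor, and the cost function is invented.

```python
# Tangent-space gradient step plus retraction onto the constraint manifold ||x|| = 1.
import numpy as np

A = np.diag([3.0, 1.0, 0.5])                 # minimize f(x) = x^T A x on the unit sphere

x = np.array([1.0, 1.0, 1.0]) / np.sqrt(3.0)
for _ in range(200):
    grad = 2.0 * A @ x
    tangent = grad - np.dot(grad, x) * x     # project the gradient onto the tangent space
    x = x - 0.05 * tangent                   # descend along the manifold's tangent direction
    x = x / np.linalg.norm(x)                # retract back onto the manifold
print(x)                                     # converges to the smallest-eigenvalue direction, about [0, 0, 1]
```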
Feed Forward Neural Network and Optimal Control Problem with Control and State Constraints
NASA Astrophysics Data System (ADS)
Kmet', Tibor; Kmet'ová, Mária
2009-09-01
A feed forward neural network based optimal control synthesis is presented for solving optimal control problems with control and state constraints. The paper extends adaptive critic neural network architecture proposed by [5] to the optimal control problems with control and state constraints. The optimal control problem is transcribed into a nonlinear programming problem which is implemented with adaptive critic neural network. The proposed simulation method is illustrated by the optimal control problem of nitrogen transformation cycle model. Results show that adaptive critic based systematic approach holds promise for obtaining the optimal control with control and state constraints.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gomes, C.
This report describes a successful project for transference of advanced AI technology into the domain of planning of outages of nuclear power plants as part of DOD's dual-use program. ROMAN (Rome Lab Outage Manager) is the prototype system that was developed as a result of this project. ROMAN's main innovation compared to the current state-of-the-art of outage management tools is its capability to automatically enforce safety constraints during the planning and scheduling phase. Another innovative aspect of ROMAN is the generation of more robust schedules that are feasible over time windows. In other words, ROMAN generates a family of schedules by assigning time intervals as start times to activities rather than single start times, without affecting the overall duration of the project. ROMAN uses a constraint satisfaction paradigm combining a global search tactic with constraint propagation. The derivation of very specialized representations for the constraints to perform efficient propagation is a key aspect for the generation of very fast schedules - constraints are compiled into the code, which is a novel aspect of our work using an automatic programming system, KIDS.
Outcomes Assessment in Dental Hygiene Programs.
ERIC Educational Resources Information Center
Grimes, Ellen B.
1999-01-01
A survey of 22 dental-hygiene-program directors found that programs routinely and effectively assess student outcomes and use the information for program improvements and to demonstrate accountability. Both policy and faculty/administrative support were deemed important to implementation. Time constraints were a major barrier. Outcomes-assessment…
Han, Feifei
2017-01-01
While some first language (L1) reading models suggest that inefficient word recognition and small working memory tend to inhibit higher-level comprehension processes; the Compensatory Encoding Model maintains that slow word recognition and small working memory do not normally hinder reading comprehension, as readers are able to operate metacognitive strategies to compensate for inefficient word recognition and working memory limitation as long as readers process a reading task without time constraint. Although empirical evidence is accumulated for support of the Compensatory Encoding Model in L1 reading, there is lack of research for testing of the Compensatory Encoding Model in foreign language (FL) reading. This research empirically tested the Compensatory Encoding Model in English reading among Chinese college English language learners (ELLs). Two studies were conducted. Study one focused on testing whether reading condition varying time affects the relationship between word recognition, working memory, and reading comprehension. Students were tested on a computerized English word recognition test, a computerized Operation Span task, and reading comprehension in time constraint and non-time constraint reading. The correlation and regression analyses showed that the strength of association was much stronger between word recognition, working memory, and reading comprehension in time constraint than that in non-time constraint reading condition. Study two examined whether FL readers were able to operate metacognitive reading strategies as a compensatory way of reading comprehension for inefficient word recognition and working memory limitation in non-time constraint reading. The participants were tested on the same computerized English word recognition test and Operation Span test. They were required to think aloud while reading and to complete the comprehension questions. The think-aloud protocols were coded for concurrent use of reading strategies, classified into language-oriented strategies, content-oriented strategies, re-reading, pausing, and meta-comment. The correlation analyses showed that while word recognition and working memory were only significantly related to frequency of language-oriented strategies, re-reading, and pausing, but not with reading comprehension. Jointly viewed, the results of the two studies, complimenting each other, supported the applicability of the Compensatory Encoding Model in FL reading with Chinese college ELLs. PMID:28522984
The NASA computer science research program plan
NASA Technical Reports Server (NTRS)
1983-01-01
A taxonomy of computer science is included, and the state of the art in each of the major computer science categories is summarized. A functional breakdown of NASA programs under Aeronautics R and D, space R and T, and institutional support is also included. These areas were assessed against the computer science categories. Concurrent processing, highly reliable computing, and information management are identified.
An Overview of the Runtime Verification Tool Java PathExplorer
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)
2002-01-01
We present an overview of the Java PathExplorer runtime verification tool, in short referred to as JPAX. JPAX can monitor the execution of a Java program and check that it conforms with a set of user provided properties formulated in temporal logic. JPAX can in addition analyze the program for concurrency errors such as deadlocks and data races. The concurrency analysis requires no user provided specification. The tool facilitates automated instrumentation of a program's bytecode, which when executed will emit an event stream, the execution trace, to an observer. The observer dispatches the incoming event stream to a set of observer processes, each performing a specialized analysis, such as the temporal logic verification, the deadlock analysis and the data race analysis. Temporal logic specifications can be formulated by the user in the Maude rewriting logic, where Maude is a high-speed rewriting system for equational logic, but here extended with executable temporal logic. The Maude rewriting engine is then activated as an event driven monitoring process. Alternatively, temporal specifications can be translated into efficient automata, which check the event stream. JPAX can be used during program testing to gain increased information about program executions, and can potentially furthermore be applied during operation to survey safety critical systems.
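The observer-based architecture described above (instrumented code emitting an event stream that observer processes check against properties) can be sketched in miniature. The events, the property, and the monitor interface below are hypothetical illustrations and not JPAX's actual specification language or APIs.

```python
# Minimal sketch of trace-based runtime monitoring: an instrumented program
# emits an event stream; an observer checks a temporal property over it.
# Here the property is "every 'acquire(l)' is eventually followed by a
# matching 'release(l)'", checked at the end of the trace, plus an online
# check that a lock is never released before it is acquired.

from collections import Counter

def monitor(trace):
    held = Counter()          # lock name -> outstanding acquires
    violations = []
    for i, (event, lock) in enumerate(trace):
        if event == "acquire":
            held[lock] += 1
        elif event == "release":
            if held[lock] == 0:
                violations.append(f"step {i}: release of {lock} with no prior acquire")
            else:
                held[lock] -= 1
    for lock, n in held.items():
        if n > 0:
            violations.append(f"end of trace: {lock} acquired {n} time(s) but never released")
    return violations

# Example event stream, as instrumented bytecode might emit it (hypothetical).
trace = [("acquire", "L1"), ("acquire", "L2"), ("release", "L2"), ("release", "L3")]
for v in monitor(trace):
    print("violation:", v)
```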
A Comparison of PETSC Library and HPF Implementations of an Archetypal PDE Computation
NASA Technical Reports Server (NTRS)
Hayder, M. Ehtesham; Keyes, David E.; Mehrotra, Piyush
1997-01-01
Two paradigms for distributed-memory parallel computation that free the application programmer from the details of message passing are compared for an archetypal structured scientific computation (a nonlinear, structured-grid partial differential equation boundary value problem) using the same algorithm on the same hardware. Both paradigms, parallel libraries represented by Argonne's PETSC, and parallel languages represented by the Portland Group's HPF, are found to be easy to use for this problem class, and both are reasonably effective in exploiting concurrency after a short learning curve. The level of involvement required by the application programmer under either paradigm includes specification of the data partitioning (corresponding to a geometrically simple decomposition of the domain of the PDE). Programming in SPMD style for the PETSC library requires writing the routines that discretize the PDE and its Jacobian, managing subdomain-to-processor mappings (affine global-to-local index mappings), and interfacing to library solver routines. Programming for HPF requires a complete sequential implementation of the same algorithm, introducing concurrency through subdomain blocking (an effort similar to the index mapping), and modest experimentation with rewriting loops to elucidate to the compiler the latent concurrency. Correctness and scalability are cross-validated on up to 32 nodes of an IBM SP2.
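The affine global-to-local index mappings mentioned above amount to simple arithmetic once a block decomposition is fixed. The sketch below is a generic 1D illustration of that bookkeeping; it does not reproduce PETSc's or HPF's actual distribution code.

```python
# Minimal sketch of a 1D block decomposition with affine global<->local
# index mappings, the kind of bookkeeping an SPMD code does by hand.
# Generic illustration only; not the PETSc or HPF API.

def block_range(n_global, n_procs, rank):
    """Return [start, end) of the global indices owned by `rank`."""
    base, rem = divmod(n_global, n_procs)
    start = rank * base + min(rank, rem)
    end = start + base + (1 if rank < rem else 0)
    return start, end

def global_to_local(g, start, end):
    return g - start if start <= g < end else None   # None = not owned here

def local_to_global(l, start):
    return start + l

n_global, n_procs = 10, 3
for rank in range(n_procs):
    start, end = block_range(n_global, n_procs, rank)
    owned = list(range(start, end))
    local = [global_to_local(g, start, end) for g in owned]
    print(f"rank {rank}: global {owned} -> local {local}")
```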
Preventing Run-Time Bugs at Compile-Time Using Advanced C++
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neswold, Richard
When writing software, we develop algorithms that tell the computer what to do at run-time. Our solutions are easier to understand and debug when they are properly modeled using class hierarchies, enumerations, and a well-factored API. Unfortunately, even with these design tools, we end up having to debug our programs at run-time. Worse still, debugging an embedded system changes its dynamics, making it tough to find and fix concurrency issues. This paper describes techniques using C++ to detect run-time bugs *at compile time*. A concurrency library, developed at Fermilab, is used for examples in illustrating these techniques.
An Element-Based Concurrent Partitioner for Unstructured Finite Element Meshes
NASA Technical Reports Server (NTRS)
Ding, Hong Q.; Ferraro, Robert D.
1996-01-01
A concurrent partitioner for partitioning unstructured finite element meshes on distributed memory architectures is developed. The partitioner uses an element-based partitioning strategy. Its main advantage over the more conventional node-based partitioning strategy is its modular programming approach to the development of parallel applications. The partitioner first partitions element centroids using a recursive inertial bisection algorithm. Elements and nodes then migrate according to the partitioned centroids, using a data request communication template for unpredictable incoming messages. Our scalable implementation is contrasted to a non-scalable implementation which is a straightforward parallelization of a sequential partitioner.
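Recursive inertial bisection of element centroids, the first step of the partitioner above, can be sketched in a few lines of NumPy. The centroid data are hypothetical, and the parallel element/node migration and communication steps of the real partitioner are omitted.

```python
# Minimal sketch of recursive inertial bisection (RIB) on element centroids.
# Each split projects the points onto their principal inertia axis and cuts
# at the median. Hypothetical data; serial illustration only.
import numpy as np

def inertial_bisect(points, ids, depth):
    if depth == 0 or len(ids) <= 1:
        return [ids]
    pts = points[ids]
    centered = pts - pts.mean(axis=0)
    # Principal axis = eigenvector of the covariance with the largest eigenvalue.
    cov = centered.T @ centered
    _, eigvecs = np.linalg.eigh(cov)
    axis = eigvecs[:, -1]
    proj = centered @ axis
    order = np.argsort(proj)
    half = len(ids) // 2
    left, right = ids[order[:half]], ids[order[half:]]
    return inertial_bisect(points, left, depth - 1) + inertial_bisect(points, right, depth - 1)

rng = np.random.default_rng(0)
centroids = rng.random((1000, 2))                 # element centroids (x, y)
parts = inertial_bisect(centroids, np.arange(len(centroids)), depth=2)  # 4 parts
print([len(p) for p in parts])
```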
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krishnamoorthy, Sriram; Daily, Jeffrey A.; Vishnu, Abhinav
2015-11-01
Global Arrays (GA) is a distributed-memory programming model that allows for shared-memory-style programming combined with one-sided communication, to create a set of tools that combine high performance with ease-of-use. GA exposes a relatively straightforward programming abstraction, while supporting fully-distributed data structures, locality of reference, and high-performance communication. GA was originally formulated in the early 1990’s to provide a communication layer for the Northwest Chemistry (NWChem) suite of chemistry modeling codes that was being developed concurrently.
Coverage Metrics for Model Checking
NASA Technical Reports Server (NTRS)
Penix, John; Visser, Willem; Norvig, Peter (Technical Monitor)
2001-01-01
When using model checking to verify programs in practice, it is not usually possible to achieve complete coverage of the system. In this position paper we describe ongoing research within the Automated Software Engineering group at NASA Ames on the use of test coverage metrics to measure partial coverage and provide heuristic guidance for program model checking. We are specifically interested in applying and developing coverage metrics for concurrent programs that might be used to support certification of next generation avionics software.
The PlusCal Algorithm Language
NASA Astrophysics Data System (ADS)
Lamport, Leslie
Algorithms are different from programs and should not be described with programming languages. The only simple alternative to programming languages has been pseudo-code. PlusCal is an algorithm language that can be used right now to replace pseudo-code, for both sequential and concurrent algorithms. It is based on the TLA + specification language, and a PlusCal algorithm is automatically translated to a TLA + specification that can be checked with the TLC model checker and reasoned about formally.
Interpretive model for "A Concurrency Method"
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carter, C.L.
1987-01-01
This paper describes an interpreter for "A Concurrency Method," in which concurrency is the inherent mode of operation and not an appendage to sequentiality. The method is based on the notions of data-driven execution and single-assignment while preserving a natural manner of programming. The interpreter is designed for and implemented on a network of Corvus Concept Personal Workstations, which are based on the Motorola MC68000 super-microcomputer. The interpreter utilizes the MC68000 processors in each workstation by communicating across OMNINET, the local area network designed for the workstations. The interpreter is a complete system, containing an editor, a compiler, an operating system with load balancer, and a communication facility. The system includes the basic arithmetic and trigonometric primitive operations for mathematical computations as well as the ability to construct more complex operations from these. 9 refs., 5 figs.
Denomme, William James; Benhanoh, Orry
2017-08-01
There is a growing body of research demonstrating that families of individuals with substance use and concurrent disorders (SUCD) experience a wide range of biopsychosocial problems that significantly impedes their quality of life and health. However, there has been a relative lack of treatment programs primarily focused on improving the well-being and quality of life of these family members. The current study assessed the efficacy of such a program at reducing stress, increasing perceived social support from family and friends, and increasing general, dyadic, and self-rated family functioning within these concerned family members. A sample of 125 family members of individuals with SUCDs was recruited, of which 97 participated in the treatment program and 28 were used as the comparison group. Results indicated that the treatment program significantly reduced stress, increased perceived social support from family and friends, and increased general, dyadic and self-rated family functioning. A perceived personal benefits questionnaire demonstrated that participants had a better understanding of SUCDs, better coping capabilities in regard to emotional difficulties, adopted stronger coping methods, participated in more leisure activities, and improved their relationship with the individual with a SUCD. The results of the current study further demonstrate the need to implement more of these family-member oriented psycho-educational treatment programs. Copyright © 2017 Elsevier Inc. All rights reserved.
ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: EXEL INDUSTRIAL AIRMIX SPRAY GUN
The Environmental Technology Verification Program has partnered with Concurrent Technologies Corp. to verify innovative coatings and coating equipment technologies for reducing air emissions. This report describes the performance of EXEL Industrial's Kremlin Airmix high transfer ...
Lax, Leila R; Russell, M Lynn; Nelles, Laura J; Smith, Cathy M
2009-10-01
Professional behaviors, tacitly understood by Canadian-trained physicians, are difficult to teach and often create practice barriers for IMGs. The purpose of this design research study was to develop a Web-based program simulating Canadian medical literacy and culture, and to evaluate strategies of scaffolding individual knowledge building. Study 1 (N = 20) examined usability and pedagogic design. Studies 2 (N = 39) and 3 (N = 33) examined case participation patterns. Model design was validated in Study 1. Studies 2 and 3 demonstrated high levels of participation, on unprompted third tries, on knowledge tests. Recursive patterns were strongest on Reflective Exercises. Five strategies scaffolded knowledge building: (1) video simulations, (2) contextualized resources, (3) concurrent feedback, (4) Reflective Exercises, and (5) commentaries prompting "reflection on reflection." Scaffolded design supports complex knowledge building. These findings are concurrent with educational research on the importance of recursion and revision of knowledge for improvable and relational understanding.
Space Operations Center system analysis study extension. Volume 2: Programmatics and cost
NASA Technical Reports Server (NTRS)
1982-01-01
A summary of Space Operations Center (SOC) orbital space station costs, program options and program recommendations is presented. Program structure, hardware commonality, schedules and program phasing are considered. Program options are analyzed with respect to mission needs, design and technology options, and anticipated funding constraints. Design and system options are discussed.
Scheduling the resident 80-hour work week: an operations research algorithm.
Day, T Eugene; Napoli, Joseph T; Kuo, Paul C
2006-01-01
The resident 80-hour work week requires that programs now schedule duty hours. Typically, scheduling is performed in an empirical "trial-and-error" fashion. However, this is a classic "scheduling" problem from the field of operations research (OR). It is similar to scheduling issues that airlines must face with pilots and planes routing through various airports at various times. The authors hypothesized that an OR approach using iterative computer algorithms could provide a rational scheduling solution. Institution-specific constraints of the residency problem were formulated. A total of 56 residents are rotating through 4 hospitals. Additional constraints were dictated by the Residency Review Committee (RRC) rules or the specific surgical service. For example, at Hospital 1, during the weekday hours between 6 am and 6 pm, there will be a PGY4 or PGY5 and a PGY2 or PGY3 on-duty to cover Service "A." A series of equations and logic statements was generated to satisfy all constraints and requirements. These were restated in the Optimization Programming Language used by the ILOG software suite for solving mixed integer programming problems. An integer programming solution was generated to this resource-constrained assignment problem. A total of 30,900 variables and 12,443 constraints were required. A total of man-hours of programming were used; computer run-time was 25.9 hours. A weekly schedule was generated for each resident that satisfied the RRC regulations while fulfilling all stated surgical service requirements. Each required between 64 and 80 weekly resident duty hours. The authors conclude that OR is a viable approach to schedule resident work hours. This technique is sufficiently robust to accommodate changes in resident numbers, service requirements, and service and hospital rotations.
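A much smaller version of this kind of assignment model can be written down directly as an integer program. The sketch below assumes the PuLP library is available (the study itself used ILOG/OPL); the residents, shifts, and coverage numbers are hypothetical and far smaller than the real 30,900-variable model.

```python
# Toy integer-programming sketch of duty-hour scheduling, in the spirit of
# the OR formulation described above. Hypothetical data and a stand-in
# objective; assumes PuLP (and its bundled CBC solver) is installed.
from pulp import LpProblem, LpVariable, LpMinimize, lpSum, LpBinary, LpStatus

residents = ["R1", "R2", "R3", "R4"]
days = range(7)
shifts = ["day", "night"]            # 12-hour shifts
hours = {"day": 12, "night": 12}
required = 1                          # residents needed per shift

prob = LpProblem("resident_schedule", LpMinimize)
x = LpVariable.dicts("x", (residents, days, shifts), cat=LpBinary)

# Objective: minimize total assigned hours (a stand-in for a real preference model).
prob += lpSum(hours[s] * x[r][d][s] for r in residents for d in days for s in shifts)

for d in days:
    for s in shifts:
        prob += lpSum(x[r][d][s] for r in residents) >= required     # coverage
for r in residents:
    prob += lpSum(hours[s] * x[r][d][s] for d in days for s in shifts) <= 80  # 80-h week
    for d in days:
        prob += lpSum(x[r][d][s] for s in shifts) <= 1                # one shift per day

prob.solve()
print(LpStatus[prob.status])
for r in residents:
    print(r, [(d, s) for d in days for s in shifts if x[r][d][s].value() == 1])
```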
On the linear programming bound for linear Lee codes.
Astola, Helena; Tabus, Ioan
2016-01-01
Based on an invariance-type property of the Lee-compositions of a linear Lee code, additional equality constraints can be introduced to the linear programming problem of linear Lee codes. In this paper, we formulate this property in terms of an action of the multiplicative group of the field [Formula: see text] on the set of Lee-compositions. We show some useful properties of certain sums of Lee-numbers, which are the eigenvalues of the Lee association scheme, appearing in the linear programming problem of linear Lee codes. Using the additional equality constraints, we formulate the linear programming problem of linear Lee codes in a very compact form, leading to a fast execution, which allows to efficiently compute the bounds for large parameter values of the linear codes.
NASA Astrophysics Data System (ADS)
Semiatin, S. L.; Fagin, P. N.; Goetz, R. L.; Furrer, D. U.; Dutton, R. E.
2015-09-01
The plastic-flow behavior which controls the formation of bulk residual stresses during final heat treatment of powder-metallurgy (PM), nickel-base superalloys was quantified using conventional (isothermal) stress-relaxation (SR) tests and a novel approach which simulates concurrent temperature and strain transients during cooling following solution treatment. The concurrent cooling/straining test involves characterization of the thermal compliance of the test sample. In turn, this information is used to program the ram-displacement- vs-time profile to impose a constant plastic strain rate during cooling. To demonstrate the efficacy of the new approach, SR tests (in both tension and compression) and concurrent cooling/tension-straining experiments were performed on two PM superalloys, LSHR and IN-100. The isothermal SR experiments were conducted at a series of temperatures between 1144 K and 1436 K (871 °C and 1163 °C) on samples that had been supersolvus solution treated and cooled slowly or rapidly to produce starting microstructures comprising coarse gamma grains and coarse or fine secondary gamma-prime precipitates, respectively. The concurrent cooling/straining tests comprised supersolvus solution treatment and various combinations of subsequent cooling rate and plastic strain rate. Comparison of flow-stress data from the SR and concurrent cooling/straining tests showed some similarities and some differences which were explained in the context of the size of the gamma-prime precipitates and the evolution of dislocation substructure. The magnitude of the effect of concurrent deformation during cooling on gamma-prime precipitation was also quantified experimentally and theoretically.
Construct exploit constraint in crash analysis by bypassing canary
NASA Astrophysics Data System (ADS)
Huang, Ning; Huang, Shuguang; Huang, Hui; Chang, Chao
2017-08-01
Selective symbolic execution is a common program testing technology. Built on top of it, crash analysis systems such as CRAX are often used to test the fragility of a program by constructing exploit constraints. From a study of crash analysis based on symbolic execution, this paper finds that the technique cannot bypass the canary stack protection mechanism. The paper improves on this by using API hooking in Linux. Experimental results show that API hooking effectively solves the problem that crash analysis cannot bypass canary protection.
Interior point techniques for LP and NLP
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evtushenko, Y.
By using a surjective mapping, the initial constrained optimization problem is transformed to a problem in a new space with only equality constraints. For the numerical solution of the latter problem we use the generalized gradient-projection method and Newton's method. After inverse transformation to the initial space we obtain a family of numerical methods for solving optimization problems with equality and inequality constraints. In the linear programming case, after some simplification, we obtain Dikin's algorithm, the affine scaling algorithm and a generalized primal-dual interior point linear programming algorithm.
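Dikin's affine scaling iteration, one of the methods recovered in the framework above, is compact enough to sketch directly. The LP data below are a toy example, and the code omits the safeguards (degeneracy, unboundedness, stopping tolerances) a real implementation would need; it is only an illustration, not the paper's generalized method.

```python
# Minimal sketch of Dikin's affine-scaling iteration for a standard-form LP
#   minimize c^T x  subject to  A x = b, x > 0.
import numpy as np

def affine_scaling(A, b, c, x, alpha=0.9, iters=50):
    for _ in range(iters):
        D2 = np.diag(x**2)                              # scaling by current iterate
        w = np.linalg.solve(A @ D2 @ A.T, A @ D2 @ c)   # dual estimate
        r = c - A.T @ w                                 # reduced costs
        dx = -D2 @ r                                    # descent direction; A @ dx = 0
        if np.all(dx >= -1e-12):                        # no improving feasible direction
            break
        step = alpha * min(-x[i] / dx[i] for i in range(len(x)) if dx[i] < 0)
        x = x + step * dx                               # stay strictly positive
    return x

# Toy LP: minimize -x1 - 2*x2  s.t.  x1 + x2 + s = 4,  x1 + 3*x2 + t = 6,  x >= 0.
A = np.array([[1.0, 1.0, 1.0, 0.0],
              [1.0, 3.0, 0.0, 1.0]])
b = np.array([4.0, 6.0])
c = np.array([-1.0, -2.0, 0.0, 0.0])
x0 = np.array([1.0, 1.0, 2.0, 2.0])                     # strictly feasible start
x = affine_scaling(A, b, c, x0)
print(np.round(x, 4), "objective:", round(float(c @ x), 4))   # approaches (3, 1, 0, 0)
```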
Compensator improvement for multivariable control systems
NASA Technical Reports Server (NTRS)
Mitchell, J. R.; Mcdaniel, W. L., Jr.; Gresham, L. L.
1977-01-01
A theory and the associated numerical technique are developed for an iterative design improvement of the compensation for linear, time-invariant control systems with multiple inputs and multiple outputs. A strict constraint algorithm is used in obtaining a solution of the specified constraints of the control design. The result of the research effort is the multiple input, multiple output Compensator Improvement Program (CIP). The objective of the Compensator Improvement Program is to modify in an iterative manner the free parameters of the dynamic compensation matrix so that the system satisfies frequency domain specifications. In this exposition, the underlying principles of the multivariable CIP algorithm are presented and the practical utility of the program is illustrated with space vehicle related examples.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diaz, Roberto; Jaboin, Jerry J.; Morales-Paliza, Manuel
Purpose: To conduct a retrospective review of 168 consecutively treated locally advanced head-and-neck cancer (LAHNC) patients treated with intensity-modulated radiotherapy (IMRT)/chemotherapy, to determine the rate and risk factors for developing hypothyroidism. Methods and Materials: Intensity-modulated radiotherapy was delivered in 33 daily fractions to 69.3 Gy to gross disease and 56.1 Gy to clinically normal cervical nodes. Dose-volume histograms (DVHs) of IMRT plans were used to determine radiation dose to thyroid and were compared with DVHs using conventional three-dimensional radiotherapy (3D-RT) in 10 of these same patients randomly selected for replanning and with DVHs of 16 patients in whom the thyroid was intentionally avoided during IMRT. Weekly paclitaxel (30 mg/m²) and carboplatin (area under the curve 1) were given concurrently with IMRT. Results: Sixty-one of 128 evaluable patients (47.7%) developed hypothyroidism after a median of 1.08 years after IMRT (range, 2.4 months to 3.9 years). Age and volume of irradiated thyroid were associated with hypothyroidism development after IMRT. Compared with 3D-RT, IMRT with no thyroid dose constraints resulted in significantly higher minimum, maximum, and median dose (p < 0.0001) and percentage thyroid volume receiving 10, 20, and 60 Gy (p < 0.05). Compared with 3D-RT, IMRT with thyroid dose constraints resulted in lower median dose and percentage thyroid volume receiving 30, 40, and 50 Gy (p < 0.005) but higher minimum and maximum dose (p < 0.005). Conclusions: If not protected, IMRT for LAHNC can result in higher radiation to the thyroid than with conventional 3D-RT. Techniques to reduce dose and volume of radiation to thyroid tissue with IMRT are achievable and recommended.
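The dose-volume metrics quoted above (minimum/median/maximum dose and "percentage thyroid volume receiving X Gy") are simple summaries of the per-voxel dose distribution. The sketch below shows how they are read off a dose array; the dose values are synthetic, not patient data.

```python
# Minimal sketch of the dose-volume metrics used above: VX is the percentage
# of an organ's volume receiving at least X Gy, and Dmin/Dmedian/Dmax come
# from the same per-voxel dose distribution. Synthetic dose array only.
import numpy as np

rng = np.random.default_rng(1)
thyroid_dose = rng.normal(loc=35.0, scale=12.0, size=5000).clip(min=0.0)  # Gy per voxel

def v_at_least(dose, threshold_gy):
    return 100.0 * np.mean(dose >= threshold_gy)      # percent of organ volume

for x in (10, 20, 30, 40, 50, 60):
    print(f"V{x} = {v_at_least(thyroid_dose, x):5.1f}% of thyroid volume")
print(f"Dmin = {thyroid_dose.min():.1f} Gy, Dmedian = {np.median(thyroid_dose):.1f} Gy, "
      f"Dmax = {thyroid_dose.max():.1f} Gy")
```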
Creative Funding Opportunities for Interscholastic Athletic Programs
ERIC Educational Resources Information Center
Forester, Brooke E.
2015-01-01
Athletic programs nationwide are facing budget constraints like never before. Pay-to-play programs are becoming commonplace. School districts are providing less and less funding for athletics. Still worse, many high school athletic programs are being cut entirely from the scholastic school setting. Coaches and athletic directors are being forced…
Cluster functions and scattering amplitudes for six and seven points
Harrington, Thomas; Spradlin, Marcus
2017-07-05
Scattering amplitudes in planar super-Yang-Mills theory satisfy several basic physical and mathematical constraints, including physical constraints on their branch cut structure and various empirically discovered connections to the mathematics of cluster algebras. The power of the bootstrap program for amplitudes is inversely proportional to the size of the intersection between these physical and mathematical constraints: ideally we would like a list of constraints which determine scattering amplitudes uniquely. Here, we explore this intersection quantitatively for two-loop six- and seven-point amplitudes by providing a complete taxonomy of the Gr(4, 6) and Gr(4, 7) cluster polylogarithm functions of [15] at weight 4.
NASA Astrophysics Data System (ADS)
Vahdani, Behnam; Tavakkoli-Moghaddam, Reza; Jolai, Fariborz; Baboli, Arman
2013-06-01
This article seeks to offer a systematic approach to establishing a reliable network of facilities in closed loop supply chains (CLSCs) under uncertainties. Facilities that are located in this article concurrently satisfy both traditional objective functions and reliability considerations in CLSC network designs. To attack this problem, a novel mathematical model is developed that integrates the network design decisions in both forward and reverse supply chain networks. The model also utilizes an effective reliability approach to find a robust network design. In order to make the results of this article more realistic, a CLSC for a case study in the iron and steel industry has been explored. The considered CLSC is multi-echelon, multi-facility, multi-product and multi-supplier. Furthermore, multiple facilities exist in the reverse logistics network leading to high complexities. Since the collection centres play an important role in this network, the reliability concept of these facilities is taken into consideration. To solve the proposed model, a novel interactive hybrid solution methodology is developed by combining a number of efficient solution approaches from the recent literature. The proposed solution methodology is a bi-objective interval fuzzy possibilistic chance-constraint mixed integer linear programming (BOIFPCCMILP). Finally, computational experiments are provided to demonstrate the applicability and suitability of the proposed model in a supply chain environment and to help decision makers facilitate their analyses.
Priorities for the poor: a conceptual framework for policy analysis.
Pleskovic, B; Sivitanides, P
1993-04-01
A number of poverty alleviation strategies have been developed over the last 2 decades. While these varied approaches have helped stimulate and guide policymakers' consideration of the issue, poverty nonetheless remains an enormous problem in most developing countries. Budgetary and administrative constraints demand that the comprehensive basic needs of the poor not be addressed concurrently. Poverty policy and practice will instead be most effective if needs and expenditures identified by evaluating social indicators and social expenditure programs are prioritized. Applied to the case of Morocco, a conceptual framework is presented for identifying priority poverty problems and social expenditure policies. The methodology allows one to sort out critical poverty problems and analyze their causes by using aggregate, territorial, and reference indicators; provides a framework for understanding and evaluating the interlinked effects of investments in social sectors; and introduces a tool for selecting cost-effective policy packages for poverty alleviation. The methodology could, however, be refined by improving the derivation of reference indicators by accounting for predetermined government objectives. The estimation of direct and indirect effects of investments in the critical poverty sectors could also be improved. Additional research is called for to determine how econometric models should be structured to aid in quantifying Morocco's SIO tables and how the estimated linkages may be used to extend and improve the cost-benefit analysis of antipoverty policies.
Integrated identification and control for nanosatellites reclaiming failed satellite
NASA Astrophysics Data System (ADS)
Han, Nan; Luo, Jianjun; Ma, Weihua; Yuan, Jianping
2018-05-01
Using nanosatellites to reclaim a failed satellite requires them to attach to its surface and take over its attitude control function. This is challenging, since parameters including the inertia matrix of the combined spacecraft and the relative attitude of the attached nanosatellites with respect to the given body-fixed frame of the failed satellite are all unknown after attachment. In addition, if the total control capacity needs to be increased during the reclaiming process by adding new nanosatellites, real-time parameter updating will be necessary. For these reasons, an integrated identification and control method is proposed in this paper, which enables real-time parameter identification and attitude takeover control to be conducted concurrently. Identification of the inertia matrix of the combined spacecraft and of the relative attitude of the attached nanosatellites are both considered. To guarantee sufficient excitation for identification of the inertia matrix, a modified identification equation is established by filtering out sample points that lead to ill-conditioned identification, and the identification performance of the inertia matrix is improved. Based on the real-time estimated inertia matrix, an attitude takeover controller is designed, and the stability of the controller is analysed using the Lyapunov method. The commanded control torques are allocated to each nanosatellite while satisfying the control saturation constraint using the Quadratic Programming (QP) method. Numerical simulations are carried out to demonstrate the feasibility and effectiveness of the proposed integrated identification and control method.
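Control allocation under saturation limits, as described above, is often posed as a bounded least-squares problem. The sketch below uses SciPy's lsq_linear for that purpose; the allocation matrix, commanded torque, and limits are hypothetical, and the paper's actual QP formulation may differ.

```python
# Minimal sketch of control allocation as bounded least squares:
# distribute a commanded body torque among attached nanosatellite actuators
# while respecting saturation, i.e. minimize ||B u - tau||^2 s.t. |u_i| <= u_max.
import numpy as np
from scipy.optimize import lsq_linear

B = np.array([[1.0, 0.0, -1.0,  0.5],     # maps 4 actuator commands to
              [0.0, 1.0,  0.5, -1.0],     # 3-axis body torques (hypothetical)
              [0.5, 0.5,  1.0,  1.0]])
tau_cmd = np.array([0.8, -0.3, 0.5])      # commanded torque from the attitude law
u_max = 0.4                                # per-actuator saturation limit

res = lsq_linear(B, tau_cmd, bounds=(-u_max, u_max))
u = res.x
print("actuator commands:", np.round(u, 3))
print("achieved torque:  ", np.round(B @ u, 3),
      " residual:", round(float(np.linalg.norm(B @ u - tau_cmd)), 4))
```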
7 CFR 18.3 - Development and adoption of equal employment opportunity programs.
Code of Federal Regulations, 2010 CFR
2010-01-01
... for employees of the university and may cover other rights and privileges of employees. (c... amendments to it shall be made effective by the President not later than 30 days from the date of concurrence. ...
15 CFR 930.41 - State agency response.
Code of Federal Regulations, 2010 CFR
2010-01-01
... MANAGEMENT FEDERAL CONSISTENCY WITH APPROVED COASTAL MANAGEMENT PROGRAMS Consistency for Federal Agency... concurrence with or objection to the Federal agency's consistency determination at the earliest practicable time, after providing for public participation in the State agency's review of the consistency...
Acquisition of Programming Skills
1990-04-01
skills (e.g., arithmetic reasoning, work knowledge, information processing speed); and c) passive versus active learning style. Ability measures...concurrent storage and processing an individual was capable of doing), and an active learning style. Implications of the findings for the development of
Family and Other Impacts on Retention
1992-04-01
provide the Army with an invaluable database for evaluating and designing policies and programs to enhance Army retention objectives. These programs... policy, as well as other aspects of the military force. Concurrently, continuing economic growth in the private sector will result in higher levels...work on retention and on the broader body of research on job satisfaction and job turnover. More recently, there has been both policy and theoretical
ERIC Educational Resources Information Center
Laughlin, Jerry W.
2007-01-01
There was rapid growth of Alabama community colleges in the late 1960s. At the same time, there was rapid growth nationally of fire science associate degree programs. With these concurrent events, one would expect fire department personnel in Alabama to benefit from new community college opportunities in fire science and fire administration.…
ERIC Educational Resources Information Center
Eley, Robert K., Ed.
This manual, developed to provide vocational instructors or coordinators with model training plans to be used to conduct concurrent work and education programs for disadvantaged and handicapped students, has the purposes of: (1) serving as a description of the kinds of content that should be included in a training plan, (2) serving as an example…
ERIC Educational Resources Information Center
Biggs, Marie C.; Watkins, Nancy A.
2008-01-01
Singing exaggerates the language of reading. The students find their voices in the rhythm and bounce of language by using music as an alternative technological approach to reading. A concurrent mixed methods study was conducted to investigate the use of an interactive sing-to-read program Tune Into Reading (Electronic Learning Products, 2006)…
Analysis of Interactive Graphics Display Equipment for an Automated Photo Interpretation System.
1982-06-01
System provides the hardware and software for a range of graphics processor tasks. The IMAGE System employs the RSX-11M real-time operating system in...One hard copy unit serves up to four work stations. The executive program of the IMAGE system is the DEC RSX-11M real-time operating system. In...picture controller. The PDP 11/34 executes programs concurrently under the RSX-11M real-time operating system. Each graphics program consists of a
Command/response protocols and concurrent software
NASA Technical Reports Server (NTRS)
Bynum, W. L.
1987-01-01
A version of the program to control the parallel jaw gripper is documented. The parallel jaw end-effector hardware and the Intel 8031 processor that is used to control the end-effector are briefly described. A general overview of the controller program is given, and a complete description of the program's structure and design is provided. There are three appendices: a memory map of the on-chip RAM, a cross-reference listing of the self-scheduling routines, and a summary of the top-level and monitor commands.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun Wei; Huang, Guo H., E-mail: huang@iseis.org; Institute for Energy, Environment and Sustainable Communities, University of Regina, Regina, Saskatchewan, S4S 0A2
2012-06-15
Highlights: Inexact piecewise-linearization-based fuzzy flexible programming is proposed. It is the first application to waste management under multiple complexities. It tackles nonlinear economies-of-scale effects in interval-parameter constraints. It estimates costs more accurately than the linear-regression-based model. Uncertainties are decreased and more satisfactory interval solutions are obtained. Abstract: To tackle nonlinear economies-of-scale (EOS) effects in interval-parameter constraints for a representative waste management problem, an inexact piecewise-linearization-based fuzzy flexible programming (IPFP) model is developed. In IPFP, interval parameters for waste amounts and transportation/operation costs can be quantified; aspiration levels for net system costs, as well as tolerance intervals for both capacities of waste treatment facilities and waste generation rates can be reflected; and the nonlinear EOS effects transformed from objective function to constraints can be approximated. An interactive algorithm is proposed for solving the IPFP model, which in nature is an interval-parameter mixed-integer quadratically constrained programming model. To demonstrate the IPFP's advantages, two alternative models are developed to compare their performances. One is a conventional linear-regression-based inexact fuzzy programming model (IPFP2) and the other is an IPFP model with all right-hand sides of fuzzy constraints being the corresponding interval numbers (IPFP3). The comparison results between IPFP and IPFP2 indicate that the optimized waste amounts would have similar patterns in both models. However, when dealing with EOS effects in constraints, the IPFP2 may underestimate the net system costs while the IPFP can estimate the costs more accurately. The comparison results between IPFP and IPFP3 indicate that their solutions would be significantly different. The decreased system uncertainties in IPFP's solutions demonstrate its effectiveness for providing more satisfactory interval solutions than IPFP3. Following its first application to waste management, the IPFP can be potentially applied to other environmental problems under multiple complexities.
Duckworth, Renée A
2015-12-01
Personality traits are behaviors that show limited flexibility over time and across contexts, and thus understanding their origin requires an understanding of what limits behavioral flexibility. Here, I suggest that insight into the evolutionary origin of personality traits requires determining the relative importance of selection and constraint in producing limits to behavioral flexibility. Natural selection as the primary cause of limits to behavioral flexibility assumes that the default state of behavior is one of high flexibility and predicts that personality variation arises through evolution of buffering mechanisms to stabilize behavioral expression, whereas the constraint hypothesis assumes that the default state is one of limited flexibility and predicts that the neuroendocrine components that underlie personality variation are those most constrained in flexibility. Using recent work on the neurobiology of sensitive periods and maternal programming of offspring behavior, I show that some of the most stable aspects of the neuroendocrine system are structural components and maternally induced epigenetic effects. Evidence of numerous constraints to changes in structural features of the neuroendocrine system and far fewer constraints to flexibility of epigenetic systems suggests that structural constraints play a primary role in the origin of behavioral stability and that epigenetic programming may be more important in generating adaptive variation among individuals. © 2015 New York Academy of Sciences.
Level-Set Topology Optimization with Aeroelastic Constraints
NASA Technical Reports Server (NTRS)
Dunning, Peter D.; Stanford, Bret K.; Kim, H. Alicia
2015-01-01
Level-set topology optimization is used to design a wing considering skin buckling under static aeroelastic trim loading, as well as dynamic aeroelastic stability (flutter). The level-set function is defined over the entire 3D volume of a transport aircraft wing box. Therefore, the approach is not limited by any predefined structure and can explore novel configurations. The Sequential Linear Programming (SLP) level-set method is used to solve the constrained optimization problems. The proposed method is demonstrated using three problems with mass, linear buckling and flutter objective and/or constraints. A constraint aggregation method is used to handle multiple buckling constraints in the wing skins. A continuous flutter constraint formulation is used to handle difficulties arising from discontinuities in the design space caused by a switching of the critical flutter mode.
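The constraint aggregation step mentioned above collapses many buckling constraints into a single smooth one. The paper does not state which aggregate it uses, so the Kreisselmeier-Steinhauser (KS) function below should be read as a common illustrative choice rather than the paper's exact formulation.

```python
# Sketch of constraint aggregation with the Kreisselmeier-Steinhauser (KS)
# function, which collapses many constraints g_i(x) <= 0 into one smooth,
# conservative constraint KS(g) <= 0.
import numpy as np

def ks_aggregate(g, rho=50.0):
    """KS(g) >= max(g), approaching max(g) as rho grows."""
    gmax = np.max(g)                               # shift for numerical stability
    return gmax + np.log(np.sum(np.exp(rho * (g - gmax)))) / rho

g = np.array([-0.20, -0.05, -0.31, -0.02])         # e.g. normalized buckling margins
print("max(g) =", g.max(), " KS(g) =", round(float(ks_aggregate(g)), 4))
# Enforcing KS(g) <= 0 conservatively enforces all g_i <= 0 with one constraint,
# at the cost of some conservatism controlled by rho.
```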
Affordances and Constraints of a Blended Course in a Teacher Professional Development Program
ERIC Educational Resources Information Center
Bakir, Nesrin; Devers, Christopher; Hug, Barbara
2016-01-01
Using a descriptive research design approach, this study investigated the affordances and constraints of a graduate level blended course focused on science teaching and learning. Data were gathered from 24 in-service teacher interviews and surveys. Identified affordances included the structure and implementation of the course, the flexibility of…
General Constraints on Sampling Wildlife on FIA Plots
Larissa L. Bailey; John R. Sauer; James D. Nichols; Paul H. Geissler
2005-01-01
This paper reviews the constraints to sampling wildlife populations at FIA points. Wildlife sampling programs must have well-defined goals and provide information adequate to meet those goals. Investigators should choose a state variable based on information needs and the spatial sampling scale. We discuss estimation-based methods for three state variables: species...
Scheduling double round-robin tournaments with divisional play using constraint programming
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carlsson, Mats; Johansson, Mikael; Larson, Jeffrey
We study a tournament format that extends a traditional double round-robin format with divisional single round-robin tournaments. Elitserien, the top Swedish handball league, uses such a format for its league schedule. We present a constraint programming model that characterizes the general double round-robin plus divisional single round-robin format. This integrated model allows scheduling to be performed in a single step, as opposed to common multistep approaches that decompose scheduling into smaller problems and possibly miss optimal solutions. In addition to general constraints, we introduce Elitserien-specific requirements for its tournament. These general and league-specific constraints allow us to identify implicit and symmetry-breaking properties that reduce the time to solution from hours to seconds. A scalability study of the number of teams shows that our approach is reasonably fast for even larger league sizes. The experimental evaluation of the integrated approach takes considerably less computational effort to schedule Elitserien than does the previous decomposed approach. Copyright © 2016 Elsevier B.V. All rights reserved.
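A stripped-down constraint-programming model of round-robin scheduling can be written with a CP solver in a few dozen lines. The sketch below assumes Google OR-Tools (CP-SAT) is installed and builds only a plain single round robin; the divisional play, home/away patterns, and Elitserien-specific constraints of the actual model are not shown.

```python
# Tiny constraint-programming sketch of round-robin scheduling with CP-SAT.
# n teams, n-1 rounds; every pair meets once; each team plays once per round.
from ortools.sat.python import cp_model

n = 6                                    # number of teams (even, hypothetical)
rounds = range(n - 1)
teams = range(n)

model = cp_model.CpModel()
# play[i, j, r] = 1 if team i plays team j (i < j) in round r.
play = {(i, j, r): model.NewBoolVar(f"p_{i}_{j}_{r}")
        for i in teams for j in teams if i < j for r in rounds}

for i in teams:
    for j in teams:
        if i < j:   # each pair meets exactly once over the tournament
            model.AddExactlyOne([play[i, j, r] for r in rounds])
for r in rounds:
    for t in teams:  # each team plays exactly one game per round
        model.AddExactlyOne([play[i, j, r] for i in teams for j in teams
                             if i < j and t in (i, j)])

solver = cp_model.CpSolver()
status = solver.Solve(model)
if status in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    for r in rounds:
        games = [(i, j) for i in teams for j in teams
                 if i < j and solver.Value(play[i, j, r])]
        print(f"round {r}: {games}")
```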
Improving the Held and Karp Approach with Constraint Programming
NASA Astrophysics Data System (ADS)
Benchimol, Pascal; Régin, Jean-Charles; Rousseau, Louis-Martin; Rueher, Michel; van Hoeve, Willem-Jan
Held and Karp have proposed, in the early 1970s, a relaxation for the Traveling Salesman Problem (TSP) as well as a branch-and-bound procedure that can solve small to modest-size instances to optimality [4, 5]. It has been shown that the Held-Karp relaxation produces very tight bounds in practice, and this relaxation is therefore applied in TSP solvers such as Concorde [1]. In this short paper we show that the Held-Karp approach can benefit from well-known techniques in Constraint Programming (CP) such as domain filtering and constraint propagation. Namely, we show that filtering algorithms developed for the weighted spanning tree constraint [3, 8] can be adapted to the context of the Held and Karp procedure. In addition to the adaptation of existing algorithms, we introduce a special-purpose filtering algorithm based on the underlying mechanisms used in Prim's algorithm [7]. Finally, we explored two different branching schemes to close the integrality gap. Our initial experimental results indicate that the addition of the CP techniques to the Held-Karp method can be very effective.
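The Held-Karp relaxation itself is easy to sketch: a minimum 1-tree with node penalties updated by subgradient ascent gives a lower bound on the optimal tour. The instance below is a small random one, and the CP filtering and branching discussed in the paper are omitted; this is only an illustration of the bound.

```python
# Minimal sketch of the Held-Karp 1-tree lower bound with subgradient updates
# on node penalties. Small random instance; no CP filtering or branching.
import itertools
import numpy as np

def one_tree_bound(D, iters=100, step0=1.0):
    n = len(D)
    pi = np.zeros(n)                       # node penalties
    best = -np.inf
    for k in range(iters):
        C = D + pi[:, None] + pi[None, :]  # penalized edge costs
        # MST on nodes 1..n-1 (Prim), then add node 0's two cheapest edges.
        in_tree = {1}
        deg = np.zeros(n, dtype=int)
        cost = 0.0
        while len(in_tree) < n - 1:
            i, j = min(((i, j) for i in in_tree for j in range(1, n) if j not in in_tree),
                       key=lambda e: C[e])
            in_tree.add(j); deg[i] += 1; deg[j] += 1; cost += C[i, j]
        two = sorted(range(1, n), key=lambda j: C[0, j])[:2]
        cost += C[0, two[0]] + C[0, two[1]]
        deg[0] = 2; deg[two[0]] += 1; deg[two[1]] += 1
        best = max(best, cost - 2 * pi.sum())   # valid lower bound on the tour
        if np.all(deg == 2):                    # the 1-tree is itself a tour
            break
        pi += step0 / (k + 1) * (deg - 2)       # subgradient step toward degree 2
    return best

rng = np.random.default_rng(2)
pts = rng.random((8, 2))
D = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
opt = min(sum(D[t[i], t[(i + 1) % 8]] for i in range(8))
          for t in itertools.permutations(range(8)))   # brute-force optimum
print(f"Held-Karp 1-tree bound: {one_tree_bound(D):.4f}   optimal tour: {opt:.4f}")
```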
jFuzz: A Concolic Whitebox Fuzzer for Java
NASA Technical Reports Server (NTRS)
Jayaraman, Karthick; Harvison, David; Ganesh, Vijay; Kiezun, Adam
2009-01-01
We present jFuzz, an automatic testing tool for Java programs. jFuzz is a concolic whitebox fuzzer, built on the NASA Java PathFinder, an explicit-state Java model checker and a framework for developing reliability and analysis tools for Java. Starting from a seed input, jFuzz automatically and systematically generates inputs that exercise new program paths. jFuzz uses a combination of concrete and symbolic execution, and constraint solving. Time spent on solving constraints can be significant. We implemented several well-known optimizations and name-independent caching, which aggressively normalizes the constraints to reduce the number of calls to the constraint solver. We present preliminary results obtained with these optimizations, and demonstrate the effectiveness of jFuzz in creating good test inputs. jFuzz is intended to be a research testbed for investigating new testing and analysis techniques based on concrete and symbolic execution. The source code of jFuzz is available as part of the NASA Java PathFinder.
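The name-independent caching mentioned above can be pictured with a toy normalization: path constraints that differ only in variable names are rewritten to a canonical form so that alpha-equivalent constraint sets hit the same cache entry. The string representation of constraints and the regex below are hypothetical illustrations, not jFuzz's internal data structures.

```python
# Toy version of name-independent constraint caching: rename variables in
# order of first occurrence so equivalent path conditions share a cache key,
# avoiding a repeated (expensive) solver call. Hypothetical representation.
import re

def canonicalize(constraints):
    """Rename variables to v0, v1, ... in order of first occurrence."""
    mapping = {}
    def rename(match):
        name = match.group(0)
        if name not in mapping:
            mapping[name] = f"v{len(mapping)}"
        return mapping[name]
    return tuple(re.sub(r"\b[a-zA-Z_]\w*\b(?!\s*\()", rename, c) for c in constraints)

solver_cache = {}
def solve(constraints, solver_call):
    key = canonicalize(constraints)
    if key not in solver_cache:
        solver_cache[key] = solver_call(constraints)   # expensive call happens once
    return solver_cache[key]

fake_solver = lambda cs: {"sat": True, "for": tuple(cs)}   # stand-in solver
print(solve(["x > 3", "x + y == 10"], fake_solver))
print(solve(["a > 3", "a + b == 10"], fake_solver))        # cache hit: same canonical form
print(len(solver_cache), "entry in cache")
```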
NASA Astrophysics Data System (ADS)
Sun, Jingliang; Liu, Chunsheng
2018-01-01
In this paper, the problem of intercepting a manoeuvring target within a fixed final time is posed in a non-linear constrained zero-sum differential game framework. The Nash equilibrium solution is found by solving the finite-horizon constrained differential game problem via an adaptive dynamic programming technique. In addition, a suitable non-quadratic functional is utilised to encode the control constraints into the differential game problem. A single critic network with constant weights and time-varying activation functions is constructed to approximate the solution of the associated time-varying Hamilton-Jacobi-Isaacs equation online. To properly satisfy the terminal constraint, an additional error term is incorporated in a novel weight-updating law such that the terminal constraint error is also minimised over time. By utilising Lyapunov's direct method, the closed-loop differential game system and the estimation weight error of the critic network are proved to be uniformly ultimately bounded. Finally, the effectiveness of the proposed method is demonstrated using a simple non-linear system and a non-linear missile-target interception system, assuming first-order dynamics for the interceptor and target.
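The abstract does not write out the non-quadratic functional. A form commonly used in the adaptive dynamic programming literature to encode an input bound |u| <= lambda is the following, given here as an illustrative assumption rather than the exact functional of this paper:

```latex
U(u) \;=\; 2 \int_{0}^{u} \lambda \, R \, \tanh^{-1}\!\left(\frac{v}{\lambda}\right) \mathrm{d}v ,
\qquad |u| \le \lambda .
```

Because this penalty grows steeply as |u| approaches lambda, the control that minimizes the associated Hamiltonian takes a lambda*tanh(.) form and therefore respects the saturation bound by construction.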
Motion coordination and programmable teleoperation between two industrial robots
NASA Technical Reports Server (NTRS)
Luh, J. Y. S.; Zheng, Y. F.
1987-01-01
Tasks for two coordinated industrial robots always bring the robots into contact with the same object, and the motion coordination among the robots and the object must be maintained at all times. To plan coordinated tasks, only one robot's motion is planned according to the required motion of the object. The motion of the second robot is to follow the first, as specified by a set of holonomic equality constraints at every time instant. If the object's motion must be modified in real time, only the first robot's motion has to be modified accordingly; the modification for the second robot is made implicitly through the constraint conditions, which simplifies the operation. If the object is physically removed, the second robot still continually follows the first through the constraint conditions. If the first robot is maneuvered through either the teach pendant or the keyboard, the second one moves accordingly, forming a teleoperation that is linked through software. The second robot does not need to duplicate the first robot's motion; the programming of the constraints specifies their relative motions.
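The kinematic core of such a holonomic coupling is that the second end-effector pose is defined, at every instant, by the first robot's pose composed with a fixed relative transform. The planar sketch below illustrates this with hypothetical numbers; it is not the paper's controller.

```python
# Minimal kinematic sketch of the holonomic coupling described above: the
# second robot's end-effector pose follows the first robot's pose composed
# with a fixed object-frame offset. Planar (SE(2)) homogeneous transforms.
import numpy as np

def se2(x, y, theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0,  0, 1]])

# Fixed constraint: robot 2 grasps the object 0.6 m from robot 1's grasp,
# rotated by pi (facing it). This matrix never changes during the task.
T_1_to_2 = se2(0.6, 0.0, np.pi)

# Whatever trajectory robot 1 follows (planned or teleoperated) ...
for t in np.linspace(0.0, 1.0, 5):
    T_world_1 = se2(1.0 + 0.5 * t, 0.2 * t, 0.3 * t)   # robot 1 end-effector pose
    T_world_2 = T_world_1 @ T_1_to_2                    # ... robot 2's pose follows
    x, y = T_world_2[0, 2], T_world_2[1, 2]
    theta = np.arctan2(T_world_2[1, 0], T_world_2[0, 0])
    print(f"t={t:.2f}  robot2 pose: x={x:.3f}, y={y:.3f}, theta={theta:.3f}")
```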
Sun, Wei; Huang, Guo H; Lv, Ying; Li, Gongchen
2012-06-01
To tackle nonlinear economies-of-scale (EOS) effects in interval-parameter constraints for a representative waste management problem, an inexact piecewise-linearization-based fuzzy flexible programming (IPFP) model is developed. In IPFP, interval parameters for waste amounts and transportation/operation costs can be quantified; aspiration levels for net system costs, as well as tolerance intervals for both capacities of waste treatment facilities and waste generation rates can be reflected; and the nonlinear EOS effects transformed from objective function to constraints can be approximated. An interactive algorithm is proposed for solving the IPFP model, which in nature is an interval-parameter mixed-integer quadratically constrained programming model. To demonstrate the IPFP's advantages, two alternative models are developed to compare their performances. One is a conventional linear-regression-based inexact fuzzy programming model (IPFP2) and the other is an IPFP model with all right-hand sides of fuzzy constraints being the corresponding interval numbers (IPFP3). The comparison results between IPFP and IPFP2 indicate that the optimized waste amounts would have similar patterns in both models. However, when dealing with EOS effects in constraints, the IPFP2 may underestimate the net system costs while the IPFP can estimate the costs more accurately. The comparison results between IPFP and IPFP3 indicate that their solutions would be significantly different. The decreased system uncertainties in IPFP's solutions demonstrate its effectiveness for providing more satisfactory interval solutions than IPFP3. Following its first application to waste management, the IPFP can be potentially applied to other environmental problems under multiple complexities. Copyright © 2012 Elsevier Ltd. All rights reserved.
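The piecewise-linearization idea at the heart of IPFP can be pictured in isolation: a concave economies-of-scale cost curve is replaced by linear segments between breakpoints so it can sit inside a (mixed-integer) linear constraint set. The cost function and breakpoints below are hypothetical, and the interval/fuzzy machinery of the full model is not reproduced.

```python
# Tiny illustration of piecewise linearization of an economies-of-scale cost.
import numpy as np

a = 50.0
def cost(x):
    return a * x**0.8                              # nonlinear concave EOS cost (hypothetical)

breakpoints = np.array([0.0, 50.0, 150.0, 300.0, 500.0])   # tonnes/day (hypothetical)

def piecewise_cost(x):
    """Linear interpolation of the cost between the chosen breakpoints."""
    return np.interp(x, breakpoints, cost(breakpoints))

for x in (25.0, 100.0, 400.0):
    exact, approx = cost(x), float(piecewise_cost(x))
    print(f"x={x:6.1f}  exact={exact:9.1f}  piecewise={approx:9.1f}  "
          f"error={100 * (approx - exact) / exact:6.2f}%")
# More breakpoints shrink the approximation error at the price of a larger model.
```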
NASA Astrophysics Data System (ADS)
Gutin, Gregory; Kim, Eun Jung; Soleimanfallah, Arezou; Szeider, Stefan; Yeo, Anders
The NP-hard general factor problem asks, given a graph and for each vertex a list of integers, whether the graph has a spanning subgraph where each vertex has a degree that belongs to its assigned list. The problem remains NP-hard even if the given graph is bipartite with partition U ⊎ V, and each vertex in U is assigned the list {1}; this subproblem appears in the context of constraint programming as the consistency problem for the extended global cardinality constraint. We show that this subproblem is fixed-parameter tractable when parameterized by the size of the second partite set V. More generally, we show that the general factor problem for bipartite graphs, parameterized by |V |, is fixed-parameter tractable as long as all vertices in U are assigned lists of length 1, but becomes W[1]-hard if vertices in U are assigned lists of length at most 2. We establish fixed-parameter tractability by reducing the problem instance to a bounded number of acyclic instances, each of which can be solved in polynomial time by dynamic programming.
ERIC Educational Resources Information Center
Grossman, Michael; Schortgen, Francis
2016-01-01
This article offers insights into the overall program development process and--institutional obstacles and constraints notwithstanding--successful introduction of a new national security program at a small liberal arts university at a time of growing institutional prioritization of science, technology, engineering, and mathematics (STEM) programs.…
Unifying Model-Based and Reactive Programming within a Model-Based Executive
NASA Technical Reports Server (NTRS)
Williams, Brian C.; Gupta, Vineet; Norvig, Peter (Technical Monitor)
1999-01-01
Real-time, model-based deduction has recently emerged as a vital component in AI's tool box for developing highly autonomous reactive systems. Yet one of the current hurdles towards developing model-based reactive systems is the number of methods simultaneously employed, and their corresponding melange of programming and modeling languages. This paper offers an important step towards unification. We introduce RMPL, a rich modeling language that combines probabilistic, constraint-based modeling with reactive programming constructs, while offering a simple semantics in terms of hidden-state Markov processes. We introduce probabilistic hierarchical constraint automata (PHCA), which allow Markov processes to be expressed in a compact representation that preserves the modularity of RMPL programs. Finally, a model-based executive, called Reactive Burton, is described that exploits this compact encoding to perform efficient simulation, belief state update and control sequence generation.
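The belief-state update mentioned above is, at its core, a single step of filtering over a hidden-state Markov model. The two-mode device, transition probabilities, and observation model below are hypothetical, and the sketch ignores the hierarchical and constraint-based structure that PHCA add.

```python
# Single belief-state update step for a hidden-state Markov model, the kind
# of bookkeeping a model-based executive performs after commanding the system
# and receiving an observation. Hypothetical two-mode valve model.

states = ["ok", "stuck"]
belief = {"ok": 0.95, "stuck": 0.05}

# P(next state | current state) under the commanded action "open".
transition = {"ok": {"ok": 0.98, "stuck": 0.02},
              "stuck": {"ok": 0.00, "stuck": 1.00}}
# P(observation | next state); the executive then observes "no_flow".
observation = {"ok": {"flow": 0.99, "no_flow": 0.01},
               "stuck": {"flow": 0.05, "no_flow": 0.95}}

def belief_update(belief, obs):
    new = {}
    for s2 in states:
        predicted = sum(belief[s1] * transition[s1][s2] for s1 in states)
        new[s2] = observation[s2][obs] * predicted
    z = sum(new.values())                     # normalize
    return {s: p / z for s, p in new.items()}

print(belief_update(belief, "no_flow"))       # probability mass shifts to "stuck"
```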
Activity-Centric Approach to Distributed Programming
NASA Technical Reports Server (NTRS)
Levy, Renato; Satapathy, Goutam; Lang, Jun
2004-01-01
The first phase of an effort to develop a NASA version of the Cybele software system has been completed. To give meaning to even a highly abbreviated summary of the modifications to be embodied in the NASA version, it is necessary to present the following background information on Cybele: Cybele is a proprietary software infrastructure for use by programmers in developing agent-based application programs [complex application programs that contain autonomous, interacting components (agents)]. Cybele provides support for event handling from multiple sources, multithreading, concurrency control, migration, and load balancing. A Cybele agent follows a programming paradigm, called activity-centric programming, that enables an abstraction over system-level thread mechanisms. Activity centric programming relieves application programmers of the complex tasks of thread management, concurrency control, and event management. In order to provide such functionality, activity-centric programming demands support of other layers of software. This concludes the background information. In the first phase of the present development, a new architecture for Cybele was defined. In this architecture, Cybele follows a modular service-based approach to coupling of the programming and service layers of software architecture. In a service-based approach, the functionalities supported by activity-centric programming are apportioned, according to their characteristics, among several groups called services. A well-defined interface among all such services serves as a path that facilitates the maintenance and enhancement of such services without adverse effect on the whole software framework. The activity-centric application-program interface (API) is part of a kernel. The kernel API calls the services by use of their published interface. This approach makes it possible for any application code written exclusively under the API to be portable for any configuration of Cybele.
Curriculum Mapping in Academic Libraries
ERIC Educational Resources Information Center
Buchanan, Heidi; Webb, Katy Kavanagh; Houk, Amy Harris; Tingelstad, Catherine
2015-01-01
Librarians at four different academic institutions concurrently completed curriculum mapping projects using varying methods to analyze their information literacy instruction. Curriculum mapping is a process for systematically evaluating components of an instructional program for cohesiveness, proper sequencing, and goal achievement. There is a…
Multitasking Operating Systems for the IBM PC.
ERIC Educational Resources Information Center
Owen, G. Scott
1985-01-01
The ability of a microcomputer to execute several programs at the same time is called "multitasking." The nature and use of one multitasking operating system Concurrent PC-DOS from Digital Research (the developers of the CP/M operating system) are discussed. (JN)
Development and evaluation of a radar air traffic control research task.
DOT National Transportation Integrated Search
1965-12-01
A system is described in which various elements of the radar air traffic controller's task can be presented repeatedly, reliably, and concurrently to each of six experimental subjects seated at separate task consoles. Programming of display condition...
Incentive-Based Voltage Regulation in Distribution Networks: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Xinyang; Chen, Lijun; Dall'Anese, Emiliano
This paper considers distribution networks featuring distributed energy resources, and designs incentive-based mechanisms that allow the network operator and end-customers to pursue given operational and economic objectives, while concurrently ensuring that voltages are within prescribed limits. Two different network-customer coordination mechanisms that require different amounts of information shared between the network operator and end-customers are developed to identify a solution of a well-defined social-welfare maximization problem. Notably, the signals broadcast by the network operator assume the connotation of prices/incentives that induce the end-customers to adjust the generated/consumed powers in order to avoid the violation of the voltage constraints. Stability of the proposed schemes is analytically established and numerically corroborated.
Incentive-Based Voltage Regulation in Distribution Networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dall-Anese, Emiliano; Baker, Kyri A; Zhou, Xinyang
This paper considers distribution networks featuring distributed energy resources, and designs incentive-based mechanisms that allow the network operator and end-customers to pursue given operational and economic objectives, while concurrently ensuring that voltages are within prescribed limits. Two different network-customer coordination mechanisms that require different amounts of information shared between the network operator and end-customers are developed to identify a solution of a well-defined social-welfare maximization problem. Notably, the signals broadcast by the network operator assume the connotation of prices/incentives that induce the end-customers to adjust the generated/consumed powers in order to avoid the violation of the voltage constraints. Stability of the proposed schemes is analytically established and numerically corroborated.
Magnetohydrodynamics with GAMER
NASA Astrophysics Data System (ADS)
Zhang, Ui-Han; Schive, Hsi-Yu; Chiueh, Tzihong
2018-06-01
GAMER, a parallel graphics-processing-unit-accelerated adaptive-mesh-refinement (AMR) hydrodynamic code, has been extended to support magnetohydrodynamics (MHD) with both the corner-transport-upwind and MUSCL-Hancock schemes and the constrained transport technique. A divergence-preserving operator for AMR has been applied to enforce the divergence-free constraint on the magnetic field. GAMER-MHD fully exploits concurrent execution between the graphics processing unit (GPU) MHD solver and the other central-processing-unit computation pertinent to AMR. We perform various standard tests to demonstrate that GAMER-MHD is both second-order accurate and robust, producing results as accurate as those given by high-resolution uniform-grid runs. We also explore a new 3D MHD test, in which the magnetic field assumes the Arnold–Beltrami–Childress configuration, temporarily becomes turbulent with current sheets, and finally settles to a lowest-energy equilibrium state. This 3D problem is adopted for the performance test of GAMER-MHD. The single-GPU performance reaches 1.2 × 10⁸ and 5.5 × 10⁷ cell updates per second for single- and double-precision calculations, respectively, on a Tesla P100. We also demonstrate a parallel efficiency of ∼70% for both weak and strong scaling using 1024 XK nodes on the Blue Waters supercomputer.
NASA Technical Reports Server (NTRS)
Brown, David B.
1991-01-01
The results of research and development efforts of the first six months of Task 1, Phase 3 of the project are presented. The goals of Phase 3 are: (1) to further refine the rule base and complete the comparative rule base evaluation; (2) to implement and evaluate a concurrency testing prototype; (3) to convert the complete (unit-level and concurrency) testing prototype to a workstation environment; and (4) to provide a prototype development document to facilitate the transfer of research technology to a working environment. These goals were partially met and the results are summarized.
Marriageable Women: A Focus on Participants in a Community Healthy Marriage Program
Manning, Wendy D.; Trella, Deanna; Lyons, Heidi; Toit, Nola Cora Du
2012-01-01
Although disadvantaged women are the targets of marriage programs, little attention has been paid to women's marriage constraints and their views of marriage. Drawing on an exchange framework and using qualitative data collected from single women participating in a marriage initiative, we introduce the concept of marriageable women—the notion that certain limitations may make women poor marriage partners. We find that, like their male counterparts, women also possess qualities that are not considered assets in the marriage market, such as economic constraints, mental and physical health issues, substance use, multiple-partner fertility, and gender distrust. We also consider how women participating in a marriage program frame their marriage options, and how a few opt out of the marriage market altogether. PMID:23258947
A Counterexample Guided Abstraction Refinement Framework for Verifying Concurrent C Programs
2005-05-24
source code are routinely executed. The source code is written in languages ranging from C/C++/Java to ML/OCaml. These languages differ not only in...from the difficulty to model computer programs—due to the complexity of programming languages as compared to hardware description languages—to...intermediate specification language lying between high-level Statechart-like formalisms and transition systems. Actions are encoded as changes in
Verification of Concurrent Programs. Part II. Temporal Proof Principles.
1981-09-01
not modify any of the shared program variables. In order to ensure the correct synchronization between the processes we use three semaphore variables...direct, simple, and intuitive rules for the establishment of these properties. They usually replace long but repetitively similar chains of primitive...modify the variables on which Q actually depends. A typical case is that of semaphores. We have the following property: The Semaphore Variable Rule
Language Issues in Mobile Program Security
1998-01-01
primitives, for instance synchronous operations. Nondeterminism and Privacy: Now suppose we introduce nondeterminism via a simple concurrent language...code setting is that the only observable events are those that can be observed from within a mobile program using language primitives and any host...Possibilistic NI is given in... It uses a main thread and two triggered threads, each with a busy-wait loop implementing a semaphore, to copy every bit of
The Use of a UNIX-Based Workstation in the Information Systems Laboratory
1989-03-01
system. The conclusions of the research and the resulting recommendations are presented in Chapter III. These recommendations include how to manage...required to run the program on a new system, these should not be significant changes. 2. Processing Environment The UNIX processing environment is...interactive with multi-tasking and multi-user capabilities. Multi-tasking refers to the fact that many programs can be run concurrently. This capability
Department of the Navy Acquisition and Capabilities Guidebook
2012-05-01
Cost Estimates/Service Cost Position; Cost Analysis Requirements Description (CARD)...Description (CARD). 7. Satisfactory review of program health. 8. Concurrence with draft TDS, TES, and SEP. 9. Approval of full funding...Cost Analysis Requirements Description (CARD). A sound cost estimate is based on a well-defined program. The CARD is used
Systematic and Scalable Testing of Concurrent Programs
2013-12-16
The evaluation of CHESS [107] checked eight different programs ranging from process management libraries to a distributed execution engine to a research...tool (§3.1) targets systematic testing of scheduling nondeterminism in multithreaded components of the Omega cluster management system [129], while...tool for systematic testing of multithreaded components of the Omega cluster management system [129]. In particular, §3.1.1 defines a model for
Overcoming barriers in care for the dying: Theoretical analysis of an innovative program model.
Wallace, Cara L
2016-08-01
This article explores barriers to end-of-life (EOL) care (including development of a death-denying culture, ongoing perceptions about EOL care, poor communication, delayed access, and benefit restrictions) through the theoretical lens of symbolic interactionism (SI), and applies general systems theory (GST) to a promising practice model appropriate for addressing these barriers. The Compassionate Care program is a practice model designed to bridge gaps in care for the dying and is one example of a program offering concurrent care, a recent focus of evaluation through the Affordable Care Act. Concurrent care involves offering curative care alongside palliative or hospice care. Additionally, the program offers comprehensive case management and online resources to enrollees in a national health plan (Spettell et al., 2009). SI and GST are compatible and interrelated theories that provide a relevant picture of barriers to end-of-life care and a practice model that might evoke change among multiple levels of systems. These theories promote insight into current challenges in EOL care, as well as point to areas of needed research and interventions to address them. The article concludes with implications for policy and practice, and discusses the important role of social work in impacting change within EOL care.
An Integer Programming Model for the Management of a Forest in the North of Portugal
NASA Astrophysics Data System (ADS)
Cerveira, Adelaide; Fonseca, Teresa; Mota, Artur; Martins, Isabel
2011-09-01
This study aims to develop an approach for the management of a forest of maritime pine located in the north region of Portugal. The forest is classified into five public lands, the so-called baldios, extending over 4432 ha. These baldios are co-managed by the Official Forest Services and the local communities mainly for timber production purposes. The forest planning involves non-spatial and spatial constraints. Spatial constraints dictate a maximum clearcut area and an exclusion time. An integer programming model is presented and the computational results are discussed.
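As a rough illustration of the kind of spatial constraint mentioned in this record, the short Python sketch below encodes a unit-restriction-style adjacency rule (adjacent stands may not be clearcut in the same period) in an integer program using the PuLP library; the stand values, adjacency list, and planning horizon are invented for illustration and are not the paper's data.

# Minimal sketch of a harvest-scheduling integer program with a spatial
# (adjacency) constraint of the unit-restriction kind; all data is invented.
from pulp import LpProblem, LpVariable, LpMaximize, lpSum

stands = {"A": 120, "B": 80, "C": 150}            # stand -> timber value
adjacent = [("A", "B"), ("B", "C")]               # pairs that may not be cut together
periods = [1, 2]

prob = LpProblem("forest_plan", LpMaximize)
x = {(s, t): LpVariable(f"x_{s}_{t}", cat="Binary") for s in stands for t in periods}

# Objective: total harvested value over the horizon.
prob += lpSum(stands[s] * x[s, t] for s in stands for t in periods)

# Each stand is cut at most once.
for s in stands:
    prob += lpSum(x[s, t] for t in periods) <= 1

# Adjacency (maximum clearcut area surrogate): adjacent stands not cut in the same period.
for (i, j) in adjacent:
    for t in periods:
        prob += x[i, t] + x[j, t] <= 1

prob.solve()
print({k: v.value() for k, v in x.items()})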
Very light smoking and alternative tobacco use among college students.
Li, Xiaoyin; Loukas, Alexandra; Perry, Cheryl L
2018-06-01
Concurrent use of cigarettes with alternative tobacco products (ATPs), even among very light smokers, may be harmful. This study examined current use of e-cigarettes, cigars, and hookah, and susceptibility to future use of these products in a sample of college student cigarette smokers. Participants were 1161 18-29 year old (M age = 21.15; SD = 2.72; 52.7% female; 41.2% non-Hispanic white) current, or past 30-day cigarette smokers, drawn from a larger study. Current smokers were categorized as very light smokers [≤5 cigarettes per day (cpd)] and heavier smokers (>5 cpd). 88.6% of all participating college student smokers were very light smokers and 67.7% used at least one ATP concurrently. The prevalence of current use in this sample was 42.9% for e-cigarettes, 36.4% for hookah, and 25.9% for cigars. Compared to heavier smokers, very light smokers were more likely to be younger, racial/ethnic minorities, and four-year versus two-year college students. Multilevel logistic regression models showed that after controlling for socio-demographic characteristics and substance use, being a very light smoker, compared with a heavier smoker, was negatively associated with concurrent e-cigarette use, but positively associated with concurrent cigar use, and not associated with concurrent hookah use. Moreover, compared to heavier smokers, very light smokers reported being more susceptible to future cigar and hookah use, but not e-cigarette use. Concurrent use of cigarettes with ATPs is popular among all college student smokers, but very light smokers are more likely than heavier smokers to use combustible ATPs. Smoking intervention programs and campus policies should caution smokers, especially very light smokers, against ATP use. Copyright © 2018 Elsevier Ltd. All rights reserved.
Ginsburg, Brett C; Pinkston, Jonathan W; Lamb, Richard J
2012-04-01
The selective serotonin reuptake inhibitor fluvoxamine reduces responding for ethanol at lower doses than responding for food when each is available in separate components or separate groups of rats. However, when both are available concurrently and deliveries earned per session are equal, this apparent selectivity inverts and food-maintained behavior is more sensitive than ethanol-maintained behavior to rate-decreasing effects of fluvoxamine. Here, we investigated further the impact that concurrent access to both food and ethanol has on the potency of fluvoxamine. Fluvoxamine (5.6-17.8 mg/kg) potency was assessed under conditions in which food and ethanol were available concurrently and response rates were equal [average variable intervals (VIs) 405 and 14 s for food and ethanol, respectively], as well as when density of food delivery was increased (average VI 60 s for food and VI 14 s for ethanol). The potency of fluvoxamine was also determined when only ethanol was available (food extinction and average VI 14 s for ethanol) and under multiple VIs (VI 30 s for food and ethanol) wherein either food or ethanol was the only programmed reinforcement available during each component. Fluvoxamine was less potent at decreasing ethanol self-administration when food was available concurrently {ED50 [95% confidence limit (CL): 8.2 (6.5-10.3) and 10.7 (7.9-14.4)]} versus when ethanol was available in isolation [ED50: 4.0 (2.7-5.9) and 5.1 (4.3-6.0)]. Effects on food were similar under each condition in which food was available. The results demonstrate that the potency of fluvoxamine in reducing ethanol-maintained behavior depends on whether ethanol is available in isolation or in the context of concurrently scheduled food reinforcement.
Sustaining Arts Programs in Public Education
ERIC Educational Resources Information Center
Dunstan, David
2016-01-01
The purpose of this qualitative research case study was to investigate leadership and funding decisions that determine key factors responsible for sustaining arts programs in public schools. While the educational climate, financial constraints, and standardized testing continue to impact arts programs in public education, Eastland High School, the…
Investment portfolio of a pension fund: Stochastic model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bosch-Princep, M.; Fontanals-Albiol, H.
1994-12-31
This paper presents a stochastic programming model that aims at obtaining the optimal investment portfolio of a pension fund. The model has been designed bearing in mind the liabilities of the fund to its members. The essential characteristic of the objective function and the constraints is the randomness of the coefficients and of the right-hand side of the constraints, so techniques of stochastic mathematical programming are needed to determine the amount of money that should be assigned to each sort of investment. It is also important to know the attitude of the decision maker towards running risks. The model incorporates the relation between the different coefficients of the objective function and constraints of each period of the temporal horizon, through linear and discrete random processes. Likewise, it includes hypotheses related to Spanish law on Pension Funds.
IESIP - AN IMPROVED EXPLORATORY SEARCH TECHNIQUE FOR PURE INTEGER LINEAR PROGRAMMING PROBLEMS
NASA Technical Reports Server (NTRS)
Fogle, F. R.
1994-01-01
IESIP, an Improved Exploratory Search Technique for Pure Integer Linear Programming Problems, addresses the problem of optimizing an objective function of one or more variables subject to a set of confining functions or constraints by a method called discrete optimization or integer programming. Integer programming is based on a specific form of the general linear programming problem in which all variables in the objective function and all variables in the constraints are integers. While more difficult, integer programming is required for accuracy when modeling systems with small numbers of components such as the distribution of goods, machine scheduling, and production scheduling. IESIP establishes a new methodology for solving pure integer programming problems by utilizing a modified version of the univariate exploratory move developed by Robert Hooke and T.A. Jeeves. IESIP also takes some of its technique from the greedy procedure and the idea of unit neighborhoods. A rounding scheme uses the continuous solution found by traditional methods (simplex or other suitable technique) and creates a feasible integer starting point. The Hooke and Jeeves exploratory search is modified to accommodate integers and constraints and is then employed to determine an optimal integer solution from the feasible starting solution. The user-friendly IESIP allows for rapid solution of problems up to 10 variables in size (limited by DOS allocation). Sample problems compare IESIP solutions with the traditional branch-and-bound approach. IESIP is written in Borland's TURBO Pascal for IBM PC series computers and compatibles running DOS. Source code and an executable are provided. The main memory requirement for execution is 25K. This program is available on a 5.25 inch 360K MS DOS format diskette. IESIP was developed in 1990. IBM is a trademark of International Business Machines. TURBO Pascal is registered by Borland International.
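The two-stage strategy described above (round the relaxed continuous optimum to a feasible integer point, then run a unit-step exploratory search in the spirit of Hooke and Jeeves) can be sketched in a few lines of Python. The problem data and helper names below are illustrative assumptions, not IESIP's actual code, which is written in TURBO Pascal.

# Round a relaxed solution to a feasible integer point, then run a unit-step
# exploratory search that only accepts moves which stay feasible.
import itertools

c = [3, 5]                                        # maximize c.x
A = [[2, 4], [6, 3]]                              # subject to A.x <= b, x >= 0 integer
b = [10, 12]

def feasible(x):
    return all(xi >= 0 for xi in x) and all(
        sum(a * xi for a, xi in zip(row, x)) <= bi for row, bi in zip(A, b))

def objective(x):
    return sum(ci * xi for ci, xi in zip(c, x))

def round_to_feasible(x_relaxed):
    # Try all floor/ceil combinations and keep the best feasible one.
    candidates = itertools.product(*[(int(v), int(v) + 1) for v in x_relaxed])
    feas = [x for x in candidates if feasible(x)]
    return max(feas, key=objective) if feas else tuple(0 for _ in x_relaxed)

def exploratory_search(x):
    improved = True
    while improved:
        improved = False
        for i in range(len(x)):                   # unit moves along each coordinate
            for step in (+1, -1):
                trial = list(x)
                trial[i] += step
                if feasible(trial) and objective(trial) > objective(x):
                    x, improved = tuple(trial), True
    return x

x0 = round_to_feasible([1.6, 1.7])                # pretend relaxed LP optimum
best = exploratory_search(x0)
print(best, objective(best))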
Cooperative optimization of reconfigurable machine tool configurations and production process plan
NASA Astrophysics Data System (ADS)
Xie, Nan; Li, Aiping; Xue, Wei
2012-09-01
The production process plan design and the configurations of a reconfigurable machine tool (RMT) interact with each other. Reasonable process plans with suitable RMT configurations help to improve product quality and reduce production cost. Therefore, a cooperative strategy is needed to solve both issues concurrently. In this paper, a cooperative optimization model for RMT configurations and the production process plan is presented. Its objectives take into account the impacts of both the process and the configuration. Moreover, a novel genetic algorithm is developed to provide optimal or near-optimal solutions: first, its chromosome is redesigned to be composed of three parts (operations, process plan, and RMT configurations); second, new selection, crossover and mutation operators are developed to handle the process constraints imposed by the operation processes (OP) graph, which these operators would otherwise violate by generating illegal solutions; finally, the optimal RMT configurations under the optimal process plan design are obtained. A manufacturing line case composed of three RMTs is then studied. The case shows that the optimal process plan and RMT configurations are obtained concurrently, with production cost decreasing by 6.28% and nonmonetary performance increasing by 22%. The proposed method can determine both RMT configurations and the production process, improving production capacity, functionality and equipment utilization for RMTs.
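A compact Python sketch of the chromosome layout described above (operation sequence, process-plan choices, and RMT configuration indices) together with a repair step that enforces precedence constraints from an OP graph is given below; the operations, options, and precedence data are invented for illustration, and the fitness evaluation is omitted.

# Three-part chromosome with a precedence-preserving repair step.
import random

precedence = {"drill": {"mill"}, "bore": {"drill"}}   # op -> ops that must come first
operations = ["mill", "drill", "bore"]
plan_options = [0, 1]                                 # alternative process plan per op
config_options = [0, 1, 2]                            # RMT configuration index per op

def repair(order):
    """Reorder operations so every predecessor appears before its successor."""
    fixed = []
    remaining = list(order)
    while remaining:
        for op in remaining:
            if precedence.get(op, set()) <= set(fixed):
                fixed.append(op)
                remaining.remove(op)
                break
    return fixed

def random_chromosome():
    order = repair(random.sample(operations, len(operations)))
    plans = [random.choice(plan_options) for _ in operations]
    configs = [random.choice(config_options) for _ in operations]
    return order, plans, configs

def mutate(chrom):
    order, plans, configs = chrom
    i = random.randrange(len(order))
    plans = plans.copy()
    configs = configs.copy()
    plans[i] = random.choice(plan_options)
    configs[i] = random.choice(config_options)
    return repair(order), plans, configs

print(mutate(random_chromosome()))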
Marco A. Contreras; Woodam Chung; Greg Jones
2008-01-01
Forest transportation planning problems (FTPP) have evolved from considering only the financial aspects of timber management to more holistic problems that also consider the environmental impacts of roads. These additional requirements have introduced side constraints, making FTPP larger and more complex. Mixed-integer programming (MIP) has been used to solve FTPP, but...
Management as the enabling technology for space exploration
NASA Technical Reports Server (NTRS)
Mandell, Humboldt C., Jr.; Griffin, Michael D.
1992-01-01
This paper addresses the dilemma that NASA faces in starting a major new initiative within the constraints of the current national budget. It addresses the fact that, unlike in previous NASA programs, the major mission constraints come from management factors as opposed to technologies. An action plan is presented, along with some results from early management simplification processes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morris, F. A.; Sawyer, C. H.; Maxwell, J. H.
1979-10-01
The Regional Assessments Division in the US Department of Energy (DOE) has undertaken a program to assess the probable consequences of various national energy policies in regions of the United States and to evaluate the constraints on national energy policy imposed by conditions in these regions. The program is referred to as the Regional Issues Identification and Assessment (RIIA) Program. Currently the RIIA Program is evaluating the Trendlong Mid-Mid scenario, a pattern of energy development for 1985 and 1990 derived from the Project Independence Evaluation System (PIES) model. This scenario assumes a medium annual growth rate in both the national demand for and national supply of energy. It has been disaggregated to specify the generating capacity to be supplied by each energy source in each state. Pacific Northwest Laboratory (PNL) has the responsibility for evaluating the scenario for the Federal Region 10, consisting of Alaska, Idaho, Oregon, and Washington. PNL is identifying impacts and constraints associated with realizing the scenario in a variety of categories, including air and water quality impacts, health and safety effects, and socioeconomic impacts. This report summarizes the analysis of one such category: institutional constraints - defined to include legal, organizational, and political barriers to the achievement of the scenario in the Northwest.
Advanced Concepts Research for Flywheel Technology Applications
NASA Technical Reports Server (NTRS)
Keith, Theo G., Jr.; Wagner, Robert
2004-01-01
The Missile Defense Agency (MDA) (formerly the Ballistic Missile Defense Organization) is embarking on a program to employ High Altitude Airships (HAAs) for surveillance of coastal areas as a part of homeland defense. It is envisioned that these HAAs will fly at 70,000 feet continuously for at least a year, therefore requiring a regenerative electric power system. As part of a program to entice the MDA to utilize the NASA GRC expertise in electric power and propulsion as a means of risk reduction, an internal study program was performed to examine possible configurations that may be employed on a HAA to meet a theoretical surveillance need. This entailed the development of a set of program requirements which were flowed down to system and subsystem level requirements as well as the identification of environmental and infrastructure constraints. Such infrastructure constraints include the ability to construct a reasonably sized HAA within existing airship hangars, as the size of such vehicles could reach in excess of 600 ft. The issues regarding environments at this altitude are similar to those imposed on a satellite in Low Earth Orbit. Additionally, operational constraints due to high winds at certain times of the year were examined to determine options that would allow year-round coverage of the US coast.
Archer, Charles Jens [Rochester, MN; Pinnow, Kurt Walter [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian Edward [Rochester, MN
2008-10-14
An apparatus, program product and method check for nodal faults in a row of nodes by causing each node in the row to concurrently communicate with its adjacent neighbor nodes in the row. The communications are analyzed to determine the presence of a faulty node or connection.
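A toy Python sketch of the neighbor-exchange idea in this family of records is shown below: every node in a row pings its adjacent neighbors concurrently, and a node whose exchanges all fail is flagged as a suspect. Threads stand in for nodes, and the ping function and fault set are invented for illustration; a real implementation would use the machine's messaging layer.

from concurrent.futures import ThreadPoolExecutor

N = 6
faulty = {3}                                        # pretend node 3 is dead (for the demo)

def ping(node, neighbor):
    """Stand-in for a message exchange: a dead node neither sends nor replies."""
    return node not in faulty and neighbor not in faulty

def check_node(node):
    """Each node pings its left and right neighbors and reports the outcomes."""
    report = {}
    for neighbor in (node - 1, node + 1):
        if 0 <= neighbor < N:
            report[(node, neighbor)] = ping(node, neighbor)
    return report

with ThreadPoolExecutor(max_workers=N) as pool:     # all row nodes test concurrently
    reports = list(pool.map(check_node, range(N)))

link_ok = {}
for report in reports:
    link_ok.update(report)

# A node is suspect when every exchange it took part in failed.
suspects = []
for n in range(N):
    involved = [ok for (a, b), ok in link_ok.items() if n in (a, b)]
    if involved and not any(involved):
        suspects.append(n)

print("suspect nodes:", suspects)                   # -> [3]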
Archer, Charles Jens [Rochester, MN; Pinnow, Kurt Walter [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian Edward [Rochester, MN
2012-02-07
An apparatus, program product and method check for nodal faults in a row of nodes by causing each node in the row to concurrently communicate with its adjacent neighbor nodes in the row. The communications are analyzed to determine a presence of a faulty node or connection.
Archer, Charles Jens; Pinnow, Kurt Walter; Ratterman, Joseph D.; Smith, Brian Edward
2010-02-23
An apparatus and program product check for nodal faults in a row of nodes by causing each node in the row to concurrently communicate with its adjacent neighbor nodes in the row. The communications are analyzed to determine a presence of a faulty node or connection.
The Site Program demonstration of CF Systems' organics extraction technology was conducted to obtain specific operating and cost information that could be used in evaluating the potential applicability of the technology to Superfund sites. The demonstration was conducted concurr...
Automatic Management of Parallel and Distributed System Resources
NASA Technical Reports Server (NTRS)
Yan, Jerry; Ngai, Tin Fook; Lundstrom, Stephen F.
1990-01-01
Viewgraphs on automatic management of parallel and distributed system resources are presented. Topics covered include: parallel applications; intelligent management of multiprocessing systems; performance evaluation of parallel architecture; dynamic concurrent programs; compiler-directed system approach; lattice gaseous cellular automata; and sparse matrix Cholesky factorization.
Remote file inquiry (RFI) system
NASA Technical Reports Server (NTRS)
1975-01-01
System interrogates and maintains user-definable data files from remote terminals, using English-like, free-form query language easily learned by persons not proficient in computer programming. System operates in asynchronous mode, allowing any number of inquiries within limitation of available core to be active concurrently.
A Future of Leadership Development
ERIC Educational Resources Information Center
Williams, Ken
2009-01-01
Leadership and leadership development are popular topics today. Concurrent with the construction of leadership theory, leadership development has emerged as a practice, with programs, consultants, reports, and networking opportunities proliferating. Given the reality of limited resources, it is critical that investments in and approaches to…
2011-12-18
Proceedings of the SIGMETRICS Symposium on Parallel and Distributed Tools, pages 48–59, 1998. [8] A. Dinning and E. Schonberg. Detecting access...multithreaded programs. ACM Trans. Comput. Syst., 15(4):391–411, 1997. [38] E. Schonberg. On-the-fly detection of access anomalies. In Proceedings
Evolution of a standard microprocessor-based space computer
NASA Technical Reports Server (NTRS)
Fernandez, M.
1980-01-01
An existing in-inventory computer hardware/software package (B-1 RFS/ECM) was repackaged and applied to multiple missile/space programs. Concurrent with the application efforts, low-risk modifications were made to the computer from program to program to take advantage of newer, advanced technology and to meet increasingly demanding requirements (computational and memory capabilities, longer life, and fault-tolerant autonomy). It is concluded that microprocessors hold promise in a number of critical areas for future space computer applications. However, the benefits of the DoD VHSIC Program are required, and the old proliferation problem must be revisited.
Panel discussion: prescribed burning in the 21st century
Jerry Hurley; Ishmael Messer; Stephen J. Botti; Jay Perkins; L. Dean Clark
1995-01-01
Even though many legal, social, and organizational constraints affect prescribed fire programs, the ecological and social benefits of such programs encourage their continued existence (with or without modification). The form of these programs in the next 10 to 50 years is pure speculation; but we must speculate and project the programs, as well as associated benefits...
Structured learning for robotic surgery utilizing a proficiency score: a pilot study.
Hung, Andrew J; Bottyan, Thomas; Clifford, Thomas G; Serang, Sarfaraz; Nakhoda, Zein K; Shah, Swar H; Yokoi, Hana; Aron, Monish; Gill, Inderbir S
2017-01-01
We evaluated feasibility and benefit of implementing structured learning in a robotics program. Furthermore, we assessed validity of a proficiency assessment tool for stepwise graduation. Teaching cases included robotic radical prostatectomy and partial nephrectomy. Procedure steps were categorized: basic, intermediate, and advanced. An assessment tool ["proficiency score" (PS)] was developed to evaluate ability to safely and autonomously complete a step. Graduation required a passing PS (PS ≥ 3) on three consecutive attempts. PS and validated global evaluative assessment of robotic skills (GEARS) were evaluated for completed steps. Linear regression was utilized to determine postgraduate year/PS relationship (construct validity). Spearman's rank correlation coefficient measured correlation between PS and GEARS evaluations (concurrent validity). Intraclass correlation (ICC) evaluated PS agreement between evaluator classes. Twenty-one robotic trainees participated within the pilot program, completing a median of 14 (2-69) cases each. Twenty-three study evaluators scored 14 (1-60) cases. Over 4 months, 229/294 (78 %) cases were designated "teaching" cases. Residents completed 91 % of possible evaluations; faculty completed 78 %. Verbal and quantitative feedback received by trainees increased significantly (p = 0.002, p < 0.001, respectively). Average PS increased with PGY (post-graduate year) for basic and intermediate steps (regression slopes: 0.402 (p < 0.0001), 0.323 (p < 0.0001), respectively) (construct validation). Overall, PS correlated highly with GEARS (ρ = 0.81, p < 0.0001) (concurrent validity). ICC was 0.77 (95 % CI 0.61-0.88) for resident evaluations. Structured learning can be implemented in an academic robotic program with high levels of trainee and evaluator participation, encouraging both quantitative and verbal feedback. A proficiency assessment tool developed for step-specific proficiency has construct and concurrent validity.
Efficient pairwise RNA structure prediction using probabilistic alignment constraints in Dynalign
2007-01-01
Background: Joint alignment and secondary structure prediction of two RNA sequences can significantly improve the accuracy of the structural predictions. Methods addressing this problem, however, are forced to employ constraints that reduce computation by restricting the alignments and/or structures (i.e. folds) that are permissible. In this paper, a new methodology is presented for the purpose of establishing alignment constraints based on nucleotide alignment and insertion posterior probabilities. Using a hidden Markov model, posterior probabilities of alignment and insertion are computed for all possible pairings of nucleotide positions from the two sequences. These alignment and insertion posterior probabilities are additively combined to obtain probabilities of co-incidence for nucleotide position pairs. A suitable alignment constraint is obtained by thresholding the co-incidence probabilities. The constraint is integrated with Dynalign, a free energy minimization algorithm for joint alignment and secondary structure prediction. The resulting method is benchmarked against the previous version of Dynalign and against other programs for pairwise RNA structure prediction. Results: The proposed technique eliminates manual parameter selection in Dynalign and provides significant computational time savings in comparison to prior constraints in Dynalign while simultaneously providing a small improvement in the structural prediction accuracy. Savings are also realized in memory. In experiments over a 5S RNA dataset with average sequence length of approximately 120 nucleotides, the method reduces computation by a factor of 2. The method performs favorably in comparison to other programs for pairwise RNA structure prediction: yielding better accuracy, on average, and requiring significantly fewer computational resources. Conclusion: Probabilistic analysis can be utilized in order to automate the determination of alignment constraints for pairwise RNA structure prediction methods in a principled fashion. These constraints can reduce the computational and memory requirements of these methods while maintaining or improving their accuracy of structural prediction. This extends the practical reach of these methods to longer sequences. The revised Dynalign code is freely available for download. PMID:17445273
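The constraint-construction step described in this record can be illustrated with a short Python/NumPy sketch: given posterior probabilities of alignment and insertion for all position pairs (random placeholders below, not real HMM output), combine them additively into co-incidence probabilities and threshold to obtain the set of allowed alignment pairs. The array names and threshold value are assumptions for illustration only.

import numpy as np

rng = np.random.default_rng(0)
L1, L2 = 120, 118                                  # lengths of the two RNA sequences

p_aligned = rng.random((L1, L2)) * 0.1             # P(i aligned to j)
p_insert1 = rng.random((L1, L2)) * 0.05            # P(i inserted, last match before j)
p_insert2 = rng.random((L1, L2)) * 0.05            # P(j inserted, last match before i)

# Additive combination into a co-incidence probability for each (i, j).
p_coincidence = p_aligned + p_insert1 + p_insert2

threshold = 0.1
allowed = p_coincidence >= threshold               # boolean constraint matrix

print("pairs kept:", int(allowed.sum()), "of", L1 * L2)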
Evaluation Strategies in Financial Education: Evaluation with Imperfect Instruments
ERIC Educational Resources Information Center
Robinson, Lauren; Dudensing, Rebekka; Granovsky, Nancy L.
2016-01-01
Program evaluation often suffers due to time constraints, imperfect instruments, incomplete data, and the need to report standardized metrics. This article about the evaluation process for the Wi$eUp financial education program showcases the difficulties inherent in evaluation and suggests best practices for assessing program effectiveness. We…
SIMCA T 1.0: A SAS Computer Program for Simulating Computer Adaptive Testing
ERIC Educational Resources Information Center
Raiche, Gilles; Blais, Jean-Guy
2006-01-01
Monte Carlo methodologies are frequently applied to study the sampling distribution of the estimated proficiency level in adaptive testing. These methods eliminate real situational constraints. However, these Monte Carlo methodologies are not currently supported by the available software programs, and when these programs are available, their…
A Case Study of the Development of an Early Retirement Program for University Faculty.
ERIC Educational Resources Information Center
Chronister, Jay L.; Trainer, Aileen
1985-01-01
To offset declining enrollments, financial constraints, younger faculties, and high tenure ratios, some institutions are considering early retirement programs to facilitate faculty turnover. A University of Virginia faculty committee reviewed several early retirement options and selected a cost-effective bridging program with ample incentives and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang Jing; Xia Tingyi, E-mail: xiatingyi1959@21cn.com; Wang Yingjie
Purpose: To establish the safety profile and efficacy of epidermal growth factor receptor tyrosine kinase inhibitors (EGFR-TKIs) concurrent with individualized radiotherapy (RT) in patients with locally advanced or metastatic non-small-cell lung cancer (NSCLC). Patients and Methods: Between June 2007 and January 2010, 26 patients with Stage III/IV NSCLC were enrolled in this prospective study. These patients were treated with EGFR-TKIs (gefitinib 250 mg or erlotinib 150 mg, oral daily) concurrent with individualized RT with curative intent. The thoracic RT plans were individually designed on the basis of tumor size and normal tissue volume constraints. All patients were assessed for toxicity, and 25 patients were available for efficacy. The primary endpoints were acute toxicity, overall survival, and median survival time. The secondary endpoints included local control rate, time to tumor progression, and progression-free survival (PFS). Results: Median gross tumor volume, mean lung dose, and lung V20 were 56 cm³, 8.6 Gy, and 14%, respectively. Median thoracic radiation dose was 70 Gy at a margin of gross tumor volume (range, 42-82 Gy), and median biological equivalent dose was 105 Gy (range, 60-119 Gy). Acute skin, hematologic, esophageal, and pulmonary toxicities were acceptable and manageable. Severe adverse events included neutropenia (Grade 4, 4%) and thrombocytopenia (Grade 4, 8%), esophagitis (Grade 3, 4%), and pneumonitis (Grade 3, 4%). With a median follow-up of 10.2 months, a local control rate of 96% was achieved for thoracic tumor. Median time to progression, median PFS, and median survival time were 6.3, 10.2, and 21.8 months, respectively. The 1- and 2-year PFS rates were both 42%, and 1-, 2-, and 3-year overall survival rates were 57%, 45%, and 30%, respectively. Conclusion: Concurrent EGFR-TKIs with individualized RT shows a favorable safety profile and promising outcome, therefore serving as a therapeutic option for patients with locally advanced or metastatic NSCLC.
Giant Panda Maternal Care: A Test of the Experience Constraint Hypothesis.
Snyder, Rebecca J; Perdue, Bonnie M; Zhang, Zhihe; Maple, Terry L; Charlton, Benjamin D
2016-06-07
The body condition constraint and the experience constraint hypotheses have both been proposed to account for differences in reproductive success between multiparous (experienced) and primiparous (first-time) mothers. However, because primiparous mothers are typically characterized by both inferior body condition and lack of experience when compared to multiparous mothers, interpreting experience-related differences in maternal care as support for either the body condition constraint hypothesis or the experience constraint hypothesis is extremely difficult. Here, we examined maternal behaviour in captive giant pandas, allowing us to simultaneously control for body condition and provide a rigorous test of the experience constraint hypothesis in this endangered animal. We found that multiparous mothers spent more time engaged in key maternal behaviours (nursing, grooming, and holding cubs) and had significantly less vocal cubs than primiparous mothers. This study provides the first evidence supporting the experience constraint hypothesis in the order Carnivora, and may have utility for captive breeding programs in which it is important to monitor the welfare of this species' highly altricial cubs, whose survival is almost entirely dependent on receiving adequate maternal care during the first few weeks of life.
Large-scale linear programs in planning and prediction.
DOT National Transportation Integrated Search
2017-06-01
Large-scale linear programs are at the core of many traffic-related optimization problems in both planning and prediction. Moreover, many of these involve significant uncertainty, and hence are modeled using either chance constraints, or robust optim...
A cognitive approach to classifying perceived behaviors
NASA Astrophysics Data System (ADS)
Benjamin, Dale Paul; Lyons, Damian
2010-04-01
This paper describes our work on integrating distributed, concurrent control in a cognitive architecture, and using it to classify perceived behaviors. We are implementing the Robot Schemas (RS) language in Soar. RS is a CSP-type programming language for robotics that controls a hierarchy of concurrently executing schemas. The behavior of every RS schema is defined using port automata. This provides precision to the semantics and also a constructive means of reasoning about the behavior and meaning of schemas. Our implementation uses Soar operators to build, instantiate and connect port automata as needed. Our approach is to use comprehension through generation (similar to NLSoar) to search for ways to construct port automata that model perceived behaviors. The generality of RS permits us to model dynamic, concurrent behaviors. A virtual world (Ogre) is used to test the accuracy of these automata. Soar's chunking mechanism is used to generalize and save these automata. In this way, the robot learns to recognize new behaviors.
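As a small illustration of the port-automaton notion used above, the Python sketch below defines a finite-state machine whose transitions fire on events observed at named ports; the states, ports, and example behavior are invented and are not the RS or Soar implementation.

class PortAutomaton:
    def __init__(self, start, transitions):
        # transitions: (state, port, value) -> next state
        self.state = start
        self.transitions = transitions

    def observe(self, port, value):
        key = (self.state, port, value)
        if key in self.transitions:
            self.state = self.transitions[key]
        return self.state

# A toy "approach object" behavior: move while the range sensor reports far,
# stop once it reports near.
approach = PortAutomaton(
    start="moving",
    transitions={
        ("moving", "range", "near"): "stopped",
        ("stopped", "range", "far"): "moving",
    },
)

for reading in ["far", "far", "near", "far"]:
    print(reading, "->", approach.observe("range", reading))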
Utilization of CAD/CAE for concurrent design of structural aircraft components
NASA Technical Reports Server (NTRS)
Kahn, William C.
1993-01-01
The feasibility of installing the Stratospheric Observatory for Infrared Astronomy telescope (named SOFIA) into an aircraft for NASA astronomy studies is investigated using CAD/CAE equipment to either design or supply data for every facet of design engineering. The aircraft selected for the platform was a Boeing 747, chosen on the basis of its ability to meet the flight profiles required for the given mission and payload. CAD models of the fuselage of two of the aircraft models studied (747-200 and 747 SP) were developed, and models for the component parts of the telescope and subsystems were developed by the various concurrent engineering groups of the SOFIA program, to determine the requirements for the cavity opening and for design configuration. It is noted that, by developing a plan to use CAD/CAE for concurrent engineering at the beginning of the study, it was possible to produce results in about two-thirds of the time required using traditional methods.
The detection of planetary systems from Space Station - A star observation strategy
NASA Technical Reports Server (NTRS)
Mascy, Alfred C.; Nishioka, Ken; Jorgensen, Helen; Swenson, Byron L.
1987-01-01
A 10-20-yr star-observation program for the Space Station Astrometric Telescope Facility (ATF) is proposed and evaluated by means of computer simulations. The primary aim of the program is to detect stars with planetary systems by precise determination of their motion relative to reference stars. The designs proposed for the ATF are described and illustrated; the basic parameters of the 127 stars selected for the program are listed in a table; spacecraft and science constraints, telescope slewing rates, and the possibility of limiting the program sample to stars near the Galactic equator are discussed; and the effects of these constraints are investigated by simulating 1 yr of ATF operation. Viewing all sky regions, the ATF would have 81-percent active viewing time, observing each star about 200 times (56 h) per yr; only small decrements in this performance would result from limiting the viewing field.
NASA Technical Reports Server (NTRS)
Butler, R.; Williams, F. W.
1992-01-01
A computer program for obtaining the optimum (least mass) dimensions of the kind of prismatic assemblies of laminated, composite plates which occur in advanced aerospace construction is described. Rigorous buckling analysis (derived from exact member theory) and a tailored design procedure are used to produce designs which satisfy buckling and material strength constraints and configurational requirements. Analysis is two to three orders of magnitude quicker than FEM, keeps track of all the governing modes of failure and is efficiently adapted to give sensitivities and to maintain feasibility. Tailoring encourages convergence in fewer sizing cycles than competing programs and permits start designs which are a long way from feasible and/or optimum. Comparisons with its predecessor, PASCO, show that the program is more likely to produce an optimum, will do so more quickly in some cases, and remains accurate for a wider range of problems.
Assessment of a new web-based sexual concurrency measurement tool for men who have sex with men.
Rosenberg, Eli S; Rothenberg, Richard B; Kleinbaum, David G; Stephenson, Rob B; Sullivan, Patrick S
2014-11-10
Men who have sex with men (MSM) are the most affected risk group in the United States' human immunodeficiency virus (HIV) epidemic. Sexual concurrency, the overlapping of partnerships in time, accelerates HIV transmission in populations and has been documented at high levels among MSM. However, concurrency is challenging to measure empirically and variations in assessment techniques used (primarily the date overlap and direct question approaches) and the outcomes derived from them have led to heterogeneity and questionable validity of estimates among MSM and other populations. The aim was to evaluate a novel Web-based and interactive partnership-timing module designed for measuring concurrency among MSM, and to compare outcomes measured by the partnership-timing module to those of typical approaches in an online study of MSM. In an online study of MSM aged ≥18 years, we assessed concurrency by using the direct question method and by gathering the dates of first and last sex, with enhanced programming logic, for each reported partner in the previous 6 months. From these methods, we computed multiple concurrency cumulative prevalence outcomes: direct question, day resolution / date overlap, and month resolution / date overlap including both 1-month ties and excluding ties. We additionally computed variants of the UNAIDS point prevalence outcome. The partnership-timing module was also administered. It uses an interactive month-resolution calendar to improve recall and follow-up questions to resolve temporal ambiguities, combining elements of the direct question and date overlap approaches. The agreement between the partnership-timing module and other concurrency outcomes was assessed with percent agreement, kappa statistic (κ), and matched odds ratios at the individual, dyad, and triad levels of analysis. Among 2737 MSM who completed the partnership section of the partnership-timing module, 41.07% (1124/2737) of individuals had concurrent partners in the previous 6 months. The partnership-timing module had the highest degree of agreement with the direct question. Agreement was lower with date overlap outcomes (agreement range 79%-81%, κ range .55-.59) and lowest with the UNAIDS outcome at 5 months before interview (65% agreement, κ=.14, 95% CI .12-.16). All agreements declined after excluding individuals with 1 sex partner (always classified as not engaging in concurrency), although the highest agreement was still observed with the direct question technique (81% agreement, κ=.59, 95% CI .55-.63). Similar patterns in agreement were observed with dyad- and triad-level outcomes. The partnership-timing module showed strong concurrency detection ability and agreement with previous measures. These levels of agreement were greater than others have reported among previous measures. The partnership-timing module may be well suited to quantifying concurrency among MSM at multiple levels of analysis.
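The date-overlap definition of concurrency referred to throughout this record can be written down in a few lines of Python: a respondent is classified as having concurrent partnerships if any two reported partnership intervals share at least one day. The dates below are invented for illustration.

from datetime import date
from itertools import combinations

partnerships = [                                    # (first sex, last sex) per partner
    (date(2024, 1, 10), date(2024, 3, 5)),
    (date(2024, 2, 20), date(2024, 4, 1)),
    (date(2024, 6, 1), date(2024, 6, 15)),
]

def overlaps(a, b):
    """True if the two closed date intervals share at least one day."""
    return a[0] <= b[1] and b[0] <= a[1]

has_concurrency = any(overlaps(a, b) for a, b in combinations(partnerships, 2))
print("concurrent partnerships:", has_concurrency)  # True: partners 1 and 2 overlap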
Thermodynamic Constraints Improve Metabolic Networks.
Krumholz, Elias W; Libourel, Igor G L
2017-08-08
In pursuit of establishing a realistic metabolic phenotypic space, the reversibility of reactions is thermodynamically constrained in modern metabolic networks. The reversibility constraints follow from heuristic thermodynamic poise approximations that take anticipated cellular metabolite concentration ranges into account. Because constraints reduce the feasible space, draft metabolic network reconstructions may need more extensive reconciliation, and a larger number of genes may become essential. Notwithstanding ubiquitous application, the effect of reversibility constraints on the predictive capabilities of metabolic networks has not been investigated in detail. Instead, work has focused on the implementation and validation of the thermodynamic poise calculation itself. With the advance of fast linear programming-based network reconciliation, studying the effects of reversibility constraints on network reconciliation and gene essentiality predictions has become feasible; these effects are the subject of this study. Networks with thermodynamically informed reversibility constraints yielded better gene essentiality predictions than networks constrained with randomly shuffled constraints. Unconstrained networks predicted gene essentiality as accurately as thermodynamically constrained networks, but predicted substantially fewer essential genes. Networks that were reconciled with sequence similarity data and strongly enforced reversibility constraints outperformed all other networks. We conclude that metabolic network analysis confirmed the validity of the thermodynamic constraints, and that thermodynamic poise information is actionable during network reconciliation. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
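Where the reversibility constraints enter a constraint-based model can be illustrated with a toy flux-balance calculation in Python/SciPy: thermodynamically irreversible reactions simply receive a lower flux bound of zero. The stoichiometric matrix and reaction set below are invented and are not from the study.

import numpy as np
from scipy.optimize import linprog

# Reactions: uptake, r1 (reversible), r2 (irreversible), biomass
S = np.array([
    [1, -1,  0,  0],    # metabolite A: produced by uptake, consumed by r1
    [0,  1, -1,  0],    # metabolite B
    [0,  0,  1, -1],    # metabolite C: consumed by biomass
])
reversible = [False, True, False, False]

bounds = [(-10 if rev else 0, 10) for rev in reversible]
bounds[0] = (0, 5)                      # uptake limited to 5 units

c = np.zeros(4)
c[-1] = -1                              # maximize biomass = minimize -v_biomass
res = linprog(c, A_eq=S, b_eq=np.zeros(3), bounds=bounds)
print("biomass flux:", round(res.x[-1], 3))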
Aspect-object alignment with Integer Linear Programming in opinion mining.
Zhao, Yanyan; Qin, Bing; Liu, Ting; Yang, Wei
2015-01-01
Target extraction is an important task in opinion mining. In this task, a complete target consists of an aspect and its corresponding object. However, previous work has always simply regarded the aspect as the target itself and has ignored the important "object" element. Thus, these studies have addressed incomplete targets, which are of limited use for practical applications. This paper proposes a novel and important sentiment analysis task, termed aspect-object alignment, to solve the "object neglect" problem. The objective of this task is to obtain the correct corresponding object for each aspect. We design a two-step framework for this task. We first provide an aspect-object alignment classifier that incorporates three sets of features, namely, the basic, relational, and special target features. However, the objects that are assigned to aspects in a sentence often contradict each other and possess many complicated features that are difficult to incorporate into a classifier. To resolve these conflicts, we impose two types of constraints in the second step: intra-sentence constraints and inter-sentence constraints. These constraints are encoded as linear formulations, and Integer Linear Programming (ILP) is used as an inference procedure to obtain a final global decision that is consistent with the constraints. Experiments on a corpus in the camera domain demonstrate that the three feature sets used in the aspect-object alignment classifier are effective in improving its performance. Moreover, the classifier with ILP inference performs better than the classifier without it, thereby illustrating that the two types of constraints that we impose are beneficial.
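The inference step described above can be sketched as a small integer linear program with the PuLP library: classifier scores for candidate (aspect, object) pairs are maximized subject to one-object-per-aspect and a simple intra-sentence consistency constraint. The aspects, objects, scores, and the particular constraint are invented for illustration.

from pulp import LpProblem, LpVariable, LpMaximize, lpSum

aspects = ["zoom", "battery"]
objects = ["camera", "charger"]
score = {("zoom", "camera"): 0.9, ("zoom", "charger"): 0.1,
         ("battery", "camera"): 0.4, ("battery", "charger"): 0.6}

prob = LpProblem("aspect_object_alignment", LpMaximize)
x = {(a, o): LpVariable(f"x_{a}_{o}", cat="Binary") for a in aspects for o in objects}

# Objective: total classifier score of the selected assignments.
prob += lpSum(score[a, o] * x[a, o] for a in aspects for o in objects)

# Every aspect is aligned to exactly one object.
for a in aspects:
    prob += lpSum(x[a, o] for o in objects) == 1

# Example intra-sentence constraint: coordinated aspects must share an object.
for o in objects:
    prob += x["zoom", o] == x["battery", o]

prob.solve()
print({a: next(o for o in objects if x[a, o].value() > 0.5) for a in aspects})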
Contracts and Management Services FY 1996 Site Support Program Plan: WBS 6.10.14. Revision 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knoll, J.M. Jr.
1995-09-01
This is the Contracts and Management Services site support program plan for the US DOE Hanford site. The topics addressed in the program plan include a mission statement, program objectives, planning assumptions, program constraints, work breakdown structure, milestone list, milestone description sheets, and activity detail including cost accounting narrative summary, approved funding budget, and activity detailed description.
Software For Integer Programming
NASA Technical Reports Server (NTRS)
Fogle, F. R.
1992-01-01
Improved Exploratory Search Technique for Pure Integer Linear Programming Problems (IESIP) program optimizes objective function of variables subject to confining functions or constraints, using discrete optimization or integer programming. Enables rapid solution of problems up to 10 variables in size. Integer programming required for accuracy in modeling systems containing small number of components, distribution of goods, scheduling operations on machine tools, and scheduling production in general. Written in Borland's TURBO Pascal.
38 CFR 21.7673 - Measurement of concurrent enrollments.
Code of Federal Regulations, 2012 CFR
2012-07-01
... in the program of education which the reservist is pursuing at the primary institution. This conversion will be accomplished as follows: (1) If VA measures the course at the primary institution on a... (CONTINUED) VOCATIONAL REHABILITATION AND EDUCATION Educational Assistance for Members of the Selected...
38 CFR 21.7673 - Measurement of concurrent enrollments.
Code of Federal Regulations, 2010 CFR
2010-07-01
... in the program of education which the reservist is pursuing at the primary institution. This conversion will be accomplished as follows: (1) If VA measures the course at the primary institution on a... (CONTINUED) VOCATIONAL REHABILITATION AND EDUCATION Educational Assistance for Members of the Selected...
A Measure for Evaluating the Effectiveness of Teen Pregnancy Prevention Programs.
ERIC Educational Resources Information Center
Somers, Cheryl L.; Johnson, Stephanie A.; Sawilowksy, Shlomo S.
2002-01-01
The Teen Attitude Pregnancy Scale (TAPS) was developed to measure teen attitudes and intentions regarding teenage pregnancy. The model demonstrated good internal consistency and concurrent validity for the samples in this study. Analysis revealed evidence of validity for this model. (JDM)
30 CFR 700.4 - Responsibility.
Code of Federal Regulations, 2013 CFR
2013-07-01
... concurrence of the Federal surface managing agency as unsuitable for all or certain types of surface coal... the regulation of surface coal mining and reclamation operations under the initial regulatory program... coal mining and reclamation operations on Federal lands in accordance with 30 CFR part 745. (e) The...
30 CFR 700.4 - Responsibility.
Code of Federal Regulations, 2010 CFR
2010-07-01
... concurrence of the Federal surface managing agency as unsuitable for all or certain types of surface coal... the regulation of surface coal mining and reclamation operations under the initial regulatory program... coal mining and reclamation operations on Federal lands in accordance with 30 CFR part 745. (e) The...
30 CFR 700.4 - Responsibility.
Code of Federal Regulations, 2012 CFR
2012-07-01
... concurrence of the Federal surface managing agency as unsuitable for all or certain types of surface coal... the regulation of surface coal mining and reclamation operations under the initial regulatory program... coal mining and reclamation operations on Federal lands in accordance with 30 CFR part 745. (e) The...
30 CFR 700.4 - Responsibility.
Code of Federal Regulations, 2014 CFR
2014-07-01
... concurrence of the Federal surface managing agency as unsuitable for all or certain types of surface coal... the regulation of surface coal mining and reclamation operations under the initial regulatory program... coal mining and reclamation operations on Federal lands in accordance with 30 CFR part 745. (e) The...
30 CFR 700.4 - Responsibility.
Code of Federal Regulations, 2011 CFR
2011-07-01
... concurrence of the Federal surface managing agency as unsuitable for all or certain types of surface coal... the regulation of surface coal mining and reclamation operations under the initial regulatory program... coal mining and reclamation operations on Federal lands in accordance with 30 CFR part 745. (e) The...
29 CFR 1952.313 - Final approval determination.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 1902. Accordingly, the Hawaii plan was granted final approval and concurrent Federal enforcement... employment in Hawaii. The plan does not cover maritime employment in the private sector; Federal government... effective as operations under the Federal program; to submit plan supplements in accordance with 29 CFR part...
29 CFR 1952.294 - Final approval determination.
Code of Federal Regulations, 2013 CFR
2013-07-01
.... Accordingly, the Nevada plan was granted final approval and concurrent Federal enforcement authority was... Nevada. The plan does not cover Federal government employers and employees; any private sector maritime... under the Federal program; to submit plan supplements in accordance with 29 CFR Part 1953; to allocate...
29 CFR 1952.294 - Final approval determination.
Code of Federal Regulations, 2010 CFR
2010-07-01
.... Accordingly, the Nevada plan was granted final approval and concurrent Federal enforcement authority was... Nevada. The plan does not cover Federal government employers and employees; any private sector maritime... under the Federal program; to submit plan supplements in accordance with 29 CFR Part 1953; to allocate...
Concurrent Validity of the Strength-Based "Behavioral Objective Sequence"
ERIC Educational Resources Information Center
Wilder, Lynn K.; Braaten, Sheldon; Wilhite, Kathi; Algozzine, Bob
2006-01-01
An essential task of diagnosticians is the accurate assessment of behavioral skills. Traditionally, deficit-based behavioral assessments have underscored student social skill deficits. Strength-based assessments delineate student competencies and are useful for individualized education program (IEP) and behavioral intervention plan (BIP)…
Virginia's program to combat drug-related DUI, 1988-1989.
DOT National Transportation Integrated Search
1992-01-01
Beginning on April 1, 1988, a revision to Virginia law gave police officers the authority to require an individual suspected of driving under the influence (DUI) of drugs to submit a blood sample to be tested for drug content. Concurrent with the imp...
Multilevel Atomicity - A New Correctness Criterion for Database Concurrency Control.
1982-09-01
Research Office Contract #DAAG29-79-C-0155, Office of Naval Research Contract #N00014-79-C-0873, and Advanced Research Projects Agency of the Department...steps of V. Since the transactions need not be straight-line programs, but can branch in complicated ways, I am forced to describe separately the places...not know whether these specializations provide efficient implementations. This question is a topic for future study. The new programming language