Sample records for critical path scheduling

  1. Project Scheduling Based on Risk of Gas Transmission Pipe

    NASA Astrophysics Data System (ADS)

    Silvianita; Nurbaity, A.; Mulyadi, Y.; Suntoyo; Chamelia, D. M.

    2018-03-01

    The planning of a project is bound by a time limit: the work must be completed before, or exactly at, a predetermined time. A project plan therefore needs scheduling management, which helps complete the project with maximum results while considering the constraints that will exist. Scheduling management is undertaken to deal with uncertainties and the negative impacts of time and cost on project completion. This paper discusses scheduling management for the Gresik-Semarang gas transmission pipeline project, in order to find out which scheduling plan is most effective given its risk value. Scheduling management in this paper is assisted by Microsoft Project software, which is used to find the critical path in the existing project scheduling data. The critical path is the longest path through the schedule and therefore determines the fastest possible completion time. A critical path with a completion time of 152 days is found. The risk is then calculated using the House of Risk (HOR) method, which shows that the critical path accounts for 40.98 percent of all causes of the risk events that may be experienced.

  2. Zero-Slack, Noncritical Paths

    ERIC Educational Resources Information Center

    Simons, Jacob V., Jr.

    2017-01-01

    The critical path method/program evaluation and review technique method of project scheduling is based on the importance of managing a project's critical path(s). Although a critical path is the longest path through a network, its location in large projects is facilitated by the computation of activity slack. However, logical fallacies in…

  3. Optimizing Department of Defense Acquisition Development Test and Evaluation Scheduling

    DTIC Science & Technology

    2015-06-01

    CPM Critical Path Method; DOD Department of Defense; DT&E development test and evaluation; EMD engineering and manufacturing development; GAMS ... these, including the Program Evaluation Review Technique (PERT), the Critical Path Method (CPM), and the resource-constrained project-scheduling ... problem (RCPSP). These are of particular interest to this thesis as the current scheduling method uses elements of the PERT/CPM, and the test

  4. Critical path method applied to research project planning: Fire Economics Evaluation System (FEES)

    Treesearch

    Earl B. Anderson; R. Stanton Hales

    1986-01-01

    The critical path method (CPM) of network analysis (a) depicts precedence among the many activities in a project by a network diagram; (b) identifies critical activities by calculating their starting, finishing, and float times; and (c) displays possible schedules by constructing time charts. CPM was applied to the development of the Forest Service's Fire...
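
    The three CPM calculations listed above (a precedence network, start/finish/float times, and the resulting critical activities) reduce to a forward and a backward pass over the activity network. The sketch below is a minimal, generic illustration of those passes with hypothetical activities and durations; it is not the FEES network described in the record.

    ```python
    # Minimal CPM sketch: forward and backward passes over a hypothetical activity
    # network (not the FEES project data).
    durations = {"A": 3, "B": 2, "C": 4, "D": 2}
    preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}   # precedence relations
    order = ["A", "B", "C", "D"]                                 # a topological order

    # Forward pass: earliest start (ES) and earliest finish (EF).
    ES, EF = {}, {}
    for a in order:
        ES[a] = max((EF[p] for p in preds[a]), default=0)
        EF[a] = ES[a] + durations[a]
    project_duration = max(EF.values())

    # Backward pass: latest finish (LF), latest start (LS), then total float.
    succs = {a: [b for b in order if a in preds[b]] for a in order}
    LF, LS = {}, {}
    for a in reversed(order):
        LF[a] = min((LS[s] for s in succs[a]), default=project_duration)
        LS[a] = LF[a] - durations[a]

    for a in order:
        slack = LS[a] - ES[a]
        tag = "critical" if slack == 0 else f"float={slack}"
        print(f"{a}: ES={ES[a]} EF={EF[a]} LS={LS[a]} LF={LF[a]} ({tag})")
    ```

    With these hypothetical numbers the passes mark A, C, and D as critical (zero float) and give B two units of float, for a nine-unit project duration.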

  5. Earth Observing System (EOS) Advanced Microwave Sounding Unit-A (AMSU-A) schedule plan

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This report describes Aerojet's methods and procedures used to control and administer contractual schedules for the EOS/AMSU-A program. Included are the following: the master, intermediate, and detail schedules; critical path analysis; and the total program logic network diagrams.

  6. Nonpreemptive run-time scheduling issues on a multitasked, multiprogrammed multiprocessor with dependencies, bidimensional tasks, folding and dynamic graphs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Allan Ray

    1987-05-01

    Increases in high speed hardware have mandated studies in software techniques to exploit the parallel capabilities. This thesis examines the effects a run-time scheduler has on a multiprocessor. The model consists of directed, acyclic graphs, generated from serial FORTRAN benchmark programs by the parallel compiler Parafrase. A multitasked, multiprogrammed environment is created. Dependencies are generated by the compiler. Tasks are bidimensional, i.e., they may specify both time and processor requests. Processor requests may be folded into execution time by the scheduler. The graphs may arrive at arbitrary time intervals. The general case is NP-hard; thus, a variety of heuristics are examined by a simulator. Multiprogramming demonstrates a greater need for a run-time scheduler than does monoprogramming for a variety of reasons, e.g., greater stress on the processors, a larger number of independent control paths, more variety in the task parameters, etc. The dynamic critical path series of algorithms perform well. Dynamic critical volume did not add much. Unfortunately, dynamic critical path maximizes turnaround time as well as throughput. Two schedulers are presented which balance throughput and turnaround time. The first requires classification of jobs by type; the second requires selection of a ratio value which is dependent upon system parameters. 45 refs., 19 figs., 20 tabs.

  7. Scheduling: A guide for program managers

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The following topics are discussed concerning scheduling: (1) milestone scheduling; (2) network scheduling; (3) program evaluation and review technique; (4) critical path method; (5) developing a network; (6) converting an ugly duckling to a swan; (7) network scheduling problem; (8)-(9) network scheduling when resources are limited; (10) multi-program considerations; (11) influence on program performance; (12) line-of-balance technique; (13) time management; (14) recapitulation; and (15) analysis.

  8. Critical Path Method Networks and Their Use in Claims Analysis.

    DTIC Science & Technology

    1984-01-01

    produced will only be as good as the time invested and the knowledge of the scheduler. A schedule which is based on faulty logic or which contains... fundamentals of putting a schedule together but also how the construction process functions, so that the delays can be accurately inserted. When

  9. 48 CFR 1352.271-73 - Schedule of work.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Manpower Loading Curve. (4) Trade Manning Curves. (5) Subcontracting List. (b) The Production Schedule... events, and activities and shall clearly identify the critical path. The Total Manpower Loading Curve... deviation in the Production Schedule which results in a delay in the completion of work on a vessel past the...

  10. 48 CFR 1352.271-73 - Schedule of work.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Manpower Loading Curve. (4) Trade Manning Curves. (5) Subcontracting List. (b) The Production Schedule... events, and activities and shall clearly identify the critical path. The Total Manpower Loading Curve... deviation in the Production Schedule which results in a delay in the completion of work on a vessel past the...

  11. 48 CFR 1352.271-73 - Schedule of work.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Manpower Loading Curve. (4) Trade Manning Curves. (5) Subcontracting List. (b) The Production Schedule... events, and activities and shall clearly identify the critical path. The Total Manpower Loading Curve... deviation in the Production Schedule which results in a delay in the completion of work on a vessel past the...

  12. 48 CFR 1352.271-73 - Schedule of work.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Manpower Loading Curve. (4) Trade Manning Curves. (5) Subcontracting List. (b) The Production Schedule... events, and activities and shall clearly identify the critical path. The Total Manpower Loading Curve... deviation in the Production Schedule which results in a delay in the completion of work on a vessel past the...

  13. 48 CFR 1352.271-73 - Schedule of work.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Manpower Loading Curve. (4) Trade Manning Curves. (5) Subcontracting List. (b) The Production Schedule... events, and activities and shall clearly identify the critical path. The Total Manpower Loading Curve... deviation in the Production Schedule which results in a delay in the completion of work on a vessel past the...

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Russell, J.S.

    Several factors in the development of the East Wilmington oil field by THUMS Long Beach Co. are described. These include: critical path scheduling, complex stratigraphy, reservoir engineering, drilling program, production methods, pressure maintenance, crude oil processing, automation, transportation facilities, service lines, and electrical facilities. The complexity and closely scheduled operational events interwoven in the THUMS project demand a method for carefully planning the sequence of jobs to be done, beginning with island construction up through routine production and to the LACT system. These demanding requirements necessitated the use of a critical path scheduling program. It was decided to use the program evaluation technique. This technique is used to assign responsibility for individual tasks against time allocations and to keep the overall program on schedule. The stratigraphy of East Wilmington complicates all engineering functions associated with recovery methods and reservoir evaluation. At least 5 major faults are anticipated.

  15. Do-It-Yourself Critical Path Method.

    ERIC Educational Resources Information Center

    Morris, Edward P., Jr.

    This report describes the critical path method (CPM), a system for planning and scheduling work to get the best time-cost combination for any particular job. With the use of diagrams, the report describes how CPM works on a step-by-step basis. CPM uses a network to show which parts of a job must be done and how they would eventually fit together…

  16. A DAG Scheduling Scheme on Heterogeneous Computing Systems Using Tuple-Based Chemical Reaction Optimization

    PubMed Central

    Jiang, Yuyi; Shao, Zhiqing; Guo, Yi

    2014-01-01

    A complex computing problem can be solved efficiently on a system with multiple computing nodes by dividing its implementation code into several parallel processing modules or tasks that can be formulated as directed acyclic graph (DAG) problems. The DAG jobs may be mapped to and scheduled on the computing nodes to minimize the total execution time. Searching for an optimal DAG scheduling solution is considered to be NP-complete. This paper proposes a tuple molecular structure-based chemical reaction optimization (TMSCRO) method for DAG scheduling on heterogeneous computing systems, based on a recently proposed metaheuristic, chemical reaction optimization (CRO). Compared with other CRO-based algorithms for DAG scheduling, the design of the tuple reaction molecular structure and the four elementary reaction operators of TMSCRO is more reasonable. TMSCRO also applies the concepts of constrained critical paths (CCPs), the constrained-critical-path directed acyclic graph (CCPDAG), and super molecules to accelerate convergence. We have also conducted simulation experiments to verify the effectiveness and efficiency of TMSCRO on a large set of randomly generated graphs and on graphs for real-world problems. PMID:25143977

  17. A DAG scheduling scheme on heterogeneous computing systems using tuple-based chemical reaction optimization.

    PubMed

    Jiang, Yuyi; Shao, Zhiqing; Guo, Yi

    2014-01-01

    A complex computing problem can be solved efficiently on a system with multiple computing nodes by dividing its implementation code into several parallel processing modules or tasks that can be formulated as directed acyclic graph (DAG) problems. The DAG jobs may be mapped to and scheduled on the computing nodes to minimize the total execution time. Searching for an optimal DAG scheduling solution is considered to be NP-complete. This paper proposes a tuple molecular structure-based chemical reaction optimization (TMSCRO) method for DAG scheduling on heterogeneous computing systems, based on a recently proposed metaheuristic, chemical reaction optimization (CRO). Compared with other CRO-based algorithms for DAG scheduling, the design of the tuple reaction molecular structure and the four elementary reaction operators of TMSCRO is more reasonable. TMSCRO also applies the concepts of constrained critical paths (CCPs), the constrained-critical-path directed acyclic graph (CCPDAG), and super molecules to accelerate convergence. We have also conducted simulation experiments to verify the effectiveness and efficiency of TMSCRO on a large set of randomly generated graphs and on graphs for real-world problems.

  18. Project Planning and Reporting

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Project Planning Analysis and Reporting System (PPARS) is automated aid in monitoring and scheduling of activities within project. PPARS system consists of PPARS Batch Program, five preprocessor programs, and two post-processor programs. PPARS Batch program is full CPM (Critical Path Method) scheduling program with resource capabilities. Can process networks with up to 10,000 activities.

  19. The Application of Project Management Techniques to College and University Admissions Activities.

    ERIC Educational Resources Information Center

    Bickers, Doyle

    1993-01-01

    The process of project management is illustrated through application to one activity, development of a new brochure, within the admissions program of a fictional college. The project life cycle is described, and a work responsibility schedule, project completion schedule, and critical path chart are used as planning and implementation tools. (MSE)

  20. Surface Hold Advisor Using Critical Sections

    NASA Technical Reports Server (NTRS)

    Law, Caleb Hoi Kei (Inventor); Hsiao, Thomas Kun-Lung (Inventor); Mittler, Nathan C. (Inventor); Couluris, George J. (Inventor)

    2013-01-01

    The Surface Hold Advisor Using Critical Sections is a system and method for providing hold advisories to surface controllers to prevent gridlock and resolve crossing and merging conflicts among vehicles traversing a vertex-edge graph representing a surface traffic network on an airport surface. The Advisor performs pair-wise comparisons of current position and projected path of each vehicle with other surface vehicles to detect conflicts, determine critical sections, and provide hold advisories to traffic controllers recommending vehicles stop at entry points to protected zones around identified critical sections. A critical section defines a segment of the vertex-edge graph where vehicles are in crossing or merging or opposite direction gridlock contention. The Advisor detects critical sections without reference to scheduled, projected or required times along assigned vehicle paths, and generates hold advisories to prevent conflicts without requiring network path direction-of-movement rules and without requiring rerouting, rescheduling or other network optimization solutions.

  1. Generating Performance Models for Irregular Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friese, Ryan D.; Tallent, Nathan R.; Vishnu, Abhinav

    2017-05-30

    Many applications have irregular behavior --- non-uniform input data, input-dependent solvers, irregular memory accesses, unbiased branches --- that cannot be captured using today's automated performance modeling techniques. We describe new hierarchical critical path analyses for the Palm model generation tool. To create a model's structure, we capture tasks along representative MPI critical paths. We create a histogram of critical tasks with parameterized task arguments and instance counts. To model each task, we identify hot instruction-level sub-paths and model each sub-path based on data flow, instruction scheduling, and data locality. We describe application models that generate accurate predictions for strong scaling when varying CPU speed, cache speed, memory speed, and architecture. We present results for the Sweep3D neutron transport benchmark; Page Rank on multiple graphs; Support Vector Machine with pruning; and PFLOTRAN's reactive flow/transport solver with domain-induced load imbalance.

  2. On scheduling task systems with variable service times

    NASA Astrophysics Data System (ADS)

    Maset, Richard G.; Banawan, Sayed A.

    1993-08-01

    Several strategies have been proposed for developing optimal and near-optimal schedules for task systems (jobs consisting of multiple tasks that can be executed in parallel). Most such strategies, however, implicitly assume deterministic task service times. We show that these strategies are much less effective when service times are highly variable. We then evaluate two strategies—one adaptive, one static—that have been proposed for retaining high performance despite such variability. Both strategies are extensions of critical path scheduling, which has been found to be efficient at producing near-optimal schedules. We found the adaptive approach to be quite effective.

  3. Navy Columbia Class (Ohio Replacement) Ballistic Missile Submarine (SSBN[X]) Program: Background and Issues for Congress

    DTIC Science & Technology

    2016-08-18

    of UK design and manufacture.12 For more on the arrangement for jointly building Virginia-class boats, see CRS Report RL32418, Navy Virginia (SSN... schedule reduction in Missile Tube Module manufacturing for the OR Class. This schedule reduction, on a potential critical path assembly, would reduce... Navy Columbia Class (Ohio Replacement) Ballistic Missile Submarine (SSBN[X]) Program: Background and Issues for Congress Ronald O'Rourke

  4. Implementation Of Fuzzy Approach To Improve Time Estimation [Case Study Of A Thermal Power Plant Is Considered]

    NASA Astrophysics Data System (ADS)

    Pradhan, Moumita; Pradhan, Dinesh; Bandyopadhyay, G.

    2010-10-01

    Fuzzy systems have demonstrated their ability to solve different kinds of problems in various application domains, and there is increasing interest in applying fuzzy concepts to improve the tasks of any system. Here, a case study of a thermal power plant is considered. The existing time estimates represent the time needed to complete the tasks. Applying a fuzzy linear approach, it becomes clear that at each confidence level less time is needed to complete the tasks, and as the schedule shortens, less cost is needed. The objective of this paper is to show how a system becomes more efficient when the fuzzy linear approach is applied; in other words, we want to optimize the time estimates so that all tasks are performed on appropriate schedules. For the case study, the optimistic time (to), pessimistic time (tp), and most likely time (tm) are taken as data collected from the thermal power plant. These estimates are used to calculate the expected time (te), which represents the time to complete a particular task considering all eventualities. Using the project evaluation and review technique (PERT) and the critical path method (CPM), the critical path duration (CPD) of the project is calculated; it indicates that there is a fifty percent probability that the total tasks can be completed in fifty days. Using the critical path duration and the standard deviation of the critical path, the completion probability for the whole project can then be obtained from the normal distribution. Using the trapezoidal rule on the four time estimates (to, tm, tp, te), the defuzzified value of the time estimates is calculated. For the fuzzy ranges, four confidence levels are considered: 0.4, 0.6, 0.8, and 1. From our study, it is seen that time estimates at confidence levels between 0.4 and 0.8 give better results than the other confidence levels.
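
    The PERT arithmetic invoked above is standard: te = (to + 4*tm + tp)/6, activity variance ((tp - to)/6)^2, and a normal approximation over the critical path for completion probabilities. The sketch below uses hypothetical numbers rather than the thermal power plant data; note that evaluating the target duration at the critical path duration itself yields exactly the fifty percent probability mentioned in the abstract.

    ```python
    # Standard PERT three-point estimate for one activity, plus a normal-approximation
    # completion probability for a critical path. All numbers are hypothetical.
    from math import erf, sqrt

    to, tm, tp = 4.0, 6.0, 10.0           # optimistic, most likely, pessimistic times
    te = (to + 4 * tm + tp) / 6.0         # expected time
    var = ((tp - to) / 6.0) ** 2          # variance of the activity time

    # For a whole critical path, sum te and var over its activities, then estimate the
    # probability of finishing by a target duration D.
    cpd, cp_var = 50.0, 16.0              # hypothetical critical path duration and variance
    D = 54.0
    z = (D - cpd) / sqrt(cp_var)
    prob = 0.5 * (1 + erf(z / sqrt(2)))   # normal CDF
    print(f"te={te:.2f}, var={var:.2f}, P(finish by {D}) = {prob:.2f}")
    ```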

  5. The algorithm for duration acceleration of repetitive projects considering the learning effect

    NASA Astrophysics Data System (ADS)

    Chen, Hongtao; Wang, Keke; Du, Yang; Wang, Liwan

    2018-03-01

    The repetitive project optimization problem is common in project scheduling, and the Repetitive Scheduling Method (RSM) has many irreplaceable advantages in the field of repetitive projects. As the same or similar work is repeated, workers' proficiency grows from low to high: they gain experience and improve the efficiency of operations. This is the learning effect, one of the important factors affecting optimization results in repetitive project scheduling. This paper analyzes the influence of the learning effect on the controlling path in RSM from two aspects: the case in which the learning effect changes the controlling path, and the case in which it does not. The paper proposes corresponding methods to accelerate the duration of different types of critical activities and an algorithm for duration acceleration based on the learning effect in RSM. It uses a graphical method to identify activity types and considers the impact of the learning effect on duration. The method meets the duration requirement while ensuring the lowest acceleration cost. A concrete bridge construction project is given to verify the effectiveness of the method. The results of this study will help project managers understand the impacts of the learning effect on repetitive projects and use the learning effect to optimize project scheduling.
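
    The record above does not give a specific learning-effect formula, so the sketch below assumes Wright's classic learning-curve model, in which the n-th repetition takes d1 * n ** log2(r) time units; the model choice, durations, and learning rate are illustrative assumptions only. It shows how repetition shortens activity durations, which is what can shift the controlling path in RSM.

    ```python
    # Hedged illustration of a learning effect on repetitive activity durations, using
    # Wright's learning-curve model d_n = d_1 * n ** log2(r). This model is assumed here
    # for illustration; it is not taken from the paper.
    from math import log2

    d1 = 10.0    # duration of the first repetition (hypothetical, in days)
    r = 0.9      # 90% learning rate: each doubling of repetitions cuts the duration by 10%
    units = 6    # number of repetitive units (e.g., bridge spans)

    per_unit = [d1 * n ** log2(r) for n in range(1, units + 1)]
    print("per-unit durations:", [round(d, 2) for d in per_unit])
    print("total with learning:", round(sum(per_unit), 2))
    print("total without learning:", d1 * units)
    ```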

  6. Multi-Element Integrated Project Planning at Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Mullon, Robert

    2008-01-01

    This presentation demonstrates how the ASRC Scheduling team developed working practices to support multiple NASA and ASRC Project Managers using the enterprise capabilities of Primavera P6 and P6 Web Access. This work has proceeded as part of Kennedy Ground Systems' preparation for its transition from the Shuttle Program to the Constellation Program. The presenters will cover Primavera's enterprise-class capabilities for schedule development, integrated critical path analysis, and reporting, as well as advanced Primavera P6 Web Access tools and techniques for communicating project status.

  7. The Manufacture of W-UO2 Fuel Elements for NTP Using the Hot Isostatic Pressing Consolidation Process

    NASA Technical Reports Server (NTRS)

    Broadway, Jeramie; Hickman, Robert; Mireles, Omar

    2012-01-01

    NTP is attractive for space exploration because of (1) higher Isp than traditional chemical rockets, (2) shorter trip times, (3) reduced propellant mass, and (4) increased payload. Lack of qualified fuel material is a key risk (cost, schedule, and performance), and development of a stable fuel form is a critical path, long lead activity. The goals of this project are to mature CERMET and graphite based fuel materials and to develop and demonstrate critical technologies and capabilities.

  8. A Scheduling Algorithm for Cloud Computing System Based on the Driver of Dynamic Essential Path.

    PubMed

    Xie, Zhiqiang; Shao, Xia; Xin, Yu

    2016-01-01

    To solve the task scheduling problem in cloud computing systems, this paper proposes a scheduling algorithm for cloud computing based on the driver of dynamic essential path (DDEP). The algorithm applies a predecessor-task layer priority strategy to handle the constraint relations among task nodes: each task node is assigned a priority value based on its scheduling order as determined by those constraint relations, and the task node list is generated according to these priority values. To resolve the scheduling order of task nodes that share the same priority value, a dynamic essential long path strategy is proposed. This strategy computes the dynamic essential path of the pre-scheduling task nodes based on the actual computation and communication costs of the task nodes during the scheduling process. The task node with the longest dynamic essential path is scheduled first, because the completion time of the task graph is indirectly determined by the finishing times of the task nodes on that path. Finally, we demonstrate the proposed algorithm via simulation experiments using Matlab tools. The experimental results indicate that the proposed algorithm can effectively reduce the makespan in most cases and meet a high quality performance objective.

  9. A Scheduling Algorithm for Cloud Computing System Based on the Driver of Dynamic Essential Path

    PubMed Central

    Xie, Zhiqiang; Shao, Xia; Xin, Yu

    2016-01-01

    To solve the task scheduling problem in cloud computing systems, this paper proposes a scheduling algorithm for cloud computing based on the driver of dynamic essential path (DDEP). The algorithm applies a predecessor-task layer priority strategy to handle the constraint relations among task nodes: each task node is assigned a priority value based on its scheduling order as determined by those constraint relations, and the task node list is generated according to these priority values. To resolve the scheduling order of task nodes that share the same priority value, a dynamic essential long path strategy is proposed. This strategy computes the dynamic essential path of the pre-scheduling task nodes based on the actual computation and communication costs of the task nodes during the scheduling process. The task node with the longest dynamic essential path is scheduled first, because the completion time of the task graph is indirectly determined by the finishing times of the task nodes on that path. Finally, we demonstrate the proposed algorithm via simulation experiments using Matlab tools. The experimental results indicate that the proposed algorithm can effectively reduce the makespan in most cases and meet a high quality performance objective. PMID:27490901
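
    The fragment below is not the DDEP algorithm itself; it is a generic list-scheduling sketch on a hypothetical task graph with two identical machines, meant only to illustrate the idea the abstract emphasizes: among ready task nodes, dispatch the one with the longest remaining path first, since that path indirectly bounds the makespan.

    ```python
    # Generic list-scheduling sketch (not DDEP): ready tasks are dispatched in order of
    # their longest remaining path. Task graph, costs, and machines are hypothetical.
    cost = {"t1": 4, "t2": 3, "t3": 2, "t4": 5, "t5": 1}
    preds = {"t1": [], "t2": ["t1"], "t3": ["t1"], "t4": ["t2", "t3"], "t5": ["t3"]}
    succs = {t: [u for u in cost if t in preds[u]] for t in cost}
    order = ["t1", "t2", "t3", "t4", "t5"]       # a topological order of the task graph

    # Longest remaining path: a task's own cost plus its heaviest chain of successors.
    lp = {}
    for t in reversed(order):
        lp[t] = cost[t] + max((lp[s] for s in succs[t]), default=0)

    machines = [0.0, 0.0]                        # finish times of two identical machines
    finish = {}
    while len(finish) < len(cost):
        ready = [t for t in cost if t not in finish and all(p in finish for p in preds[t])]
        t = max(ready, key=lambda x: lp[x])      # longest-path task is scheduled first
        m = min(range(len(machines)), key=machines.__getitem__)
        start = max(machines[m], max((finish[p] for p in preds[t]), default=0.0))
        finish[t] = start + cost[t]
        machines[m] = finish[t]

    print("finish times:", finish, "| makespan:", max(finish.values()))
    ```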

  10. 48 CFR 852.236-83 - Payments under fixed-price construction contracts (including NAS).

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... shall show on the critical path method (CPM) network the total cost of the guarantee period services in... prescribed in 832.111, insert the following clause in contracts that contain a section entitled “Network...) Failure either to meet schedules in Section Network Analysis System (NAS), or to process the Interim Arrow...

  11. 48 CFR 852.236-83 - Payments under fixed-price construction contracts (including NAS).

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... shall show on the critical path method (CPM) network the total cost of the guarantee period services in... prescribed in 832.111, insert the following clause in contracts that contain a section entitled “Network...) Failure either to meet schedules in Section Network Analysis System (NAS), or to process the Interim Arrow...

  12. 48 CFR 852.236-83 - Payments under fixed-price construction contracts (including NAS).

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... shall show on the critical path method (CPM) network the total cost of the guarantee period services in... prescribed in 832.111, insert the following clause in contracts that contain a section entitled “Network...) Failure either to meet schedules in Section Network Analysis System (NAS), or to process the Interim Arrow...

  13. Overview and Status of the Bioastronautics Critical Path Roadmap (BCPR)

    NASA Technical Reports Server (NTRS)

    Charles, John

    2004-01-01

    Viewgraphs on the status and overview of the Bioastronautics Critical Path Roadmap (BCPR) are presented. The topics include: 1) BCPR Objectives; 2) BCPR and OBPR Program Management; 3) BCPR Disciplines & Cross-Cutting Areas; 4) Characteristics of BCPR Reference Missions; 5) Bioastronautics Timetable (notional); 6) BCPR Processes Risk Identification, Assessment, and Management; 7) Types of BCPR Risks; 8) Enabling Questions Categories; 9) Risk Mitigation Status; 10) Defining Levels of Accepted Risk; 11) BCPR Integration; 12) BCPR Implementation, Integration, and Validation; 13) BCPR Refinement Schedule; 14) Academy Review; 15) Rating Bioastronautics Risks; 16) Risk Rating Exercises; 17) Human Health Risk Assessment Criteria (examples); 18) A Recent Risk Rating Exercise; 19) Consensus Workshop Background; 20) Consensus Workshop Rating Analysis; 21) Consensus Workshop Selected Preliminary Recommendations; and 22) Access to BCPR Content.

  14. Automated Planning and Scheduling for Planetary Rover Distributed Operations

    NASA Technical Reports Server (NTRS)

    Backes, Paul G.; Rabideau, Gregg; Tso, Kam S.; Chien, Steve

    1999-01-01

    Automated planning and scheduling, including automated path planning, has been integrated with an Internet-based distributed operations system for planetary rover operations. The resulting prototype system enables faster generation of valid rover command sequences by a distributed planetary rover operations team. The Web Interface for Telescience (WITS) provides Internet-based distributed collaboration, the Automated Scheduling and Planning Environment (ASPEN) provides automated planning and scheduling, and an automated path planner provides path planning. The system was demonstrated on the Rocky 7 research rover at JPL.

  15. SMEX-Lite Modular Solar Array Architecture

    NASA Technical Reports Server (NTRS)

    Lyons, John W.; Day, John (Technical Monitor)

    2002-01-01

    The NASA Small Explorer (SMEX) missions have typically had three years between mission definition and launch. This short schedule has posed significant challenges with respect to solar array design and procurement. Typically, the solar panel geometry is frozen prior to going out with a procurement. However, with the SMEX schedule, it has been virtually impossible to freeze the geometry in time to avoid scheduling problems with integrating the solar panels to the spacecraft. A modular solar array architecture was developed to alleviate this problem. This approach involves procuring sufficient modules for multiple missions and assembling the modules onto a solar array framework that is unique to each mission. The modular approach removes the solar array from the critical path of the SMEX integration and testing schedule. It also reduces the cost per unit area of the solar arrays and facilitates the inclusion of experiments involving new solar cell or panel technologies in the SMEX missions.

  16. A format for the interchange of scheduling models

    NASA Technical Reports Server (NTRS)

    Jaap, John P.; Davis, Elizabeth K.

    1994-01-01

    In recent years a variety of space-activity schedulers have been developed within the aerospace community. Space-activity schedulers are characterized by their need to handle large numbers of activities which are time-window constrained and make high demands on many scarce resources, but are minimally constrained by predecessor/successor requirements or critical paths. Two needs to exchange data between these schedulers have materialized. First, there is significant interest in comparing and evaluating the different scheduling engines to ensure that the best technology is applied to each scheduling endeavor. Second, there is a developing requirement to divide a single scheduling task among different sites, each using a different scheduler. In fact, the scheduling task for International Space Station Alpha (ISSA) will be distributed among NASA centers and among the international partners. The format used to interchange scheduling data for ISSA will likely use a growth version of the format discussed in this paper. The model interchange format (or MIF, pronounced as one syllable) discussed in this paper is a robust solution to the need to interchange scheduling requirements for space activities. It is highly extensible, human-readable, and can be generated or edited with common text editors. It also serves well the need to support a 'benchmark' data case which can be delivered on any computer platform.

  17. Opportunities to Investigate the Steering System for Improvement of Truck Driving Properties under Critical Road Conditions

    NASA Astrophysics Data System (ADS)

    Gidlewski, Mirosław

    2011-09-01

    Application of an electric steering system in a truck gives new opportunities to obtain a desirable and safe motion path under critical road conditions. Analysis of the opportunity to take advantage of the steering system for improvement of truck driving properties will be carried out on the basis of the results of model tests. The paper describes the vehicle model applied in the simulation tests and the methodology, as well as anticipated results. The scheduled tests will be carried out within the framework of research project No. NN509 568439, headed by the author.

  18. Feelings of energy, exercise-related self-efficacy, and voluntary exercise participation.

    PubMed

    Yoon, Seok; Buckworth, Janet; Focht, Brian; Ko, Bomna

    2013-12-01

    This study used a path analysis approach to examine the relationship between feelings of energy, exercise-related self-efficacy beliefs, and exercise participation. A cross-sectional mailing survey design was used to measure feelings of physical and mental energy, task and scheduling self-efficacy beliefs, and voluntary moderate and vigorous exercise participation in 368 healthy, full-time undergraduate students (mean age = 21.43 ± 2.32 years). The path analysis revealed that the hypothesized path model had a strong fit to the study data. The path model showed that feelings of physical energy had significant direct effects on task and scheduling self-efficacy beliefs as well as exercise behaviors. In addition, scheduling self-efficacy had direct effects on moderate and vigorous exercise participation. However, there was no significant direct relationship between task self-efficacy and exercise participation. The path model also revealed that scheduling self-efficacy partially mediated the relationship between feelings of physical energy and exercise participation.
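
    The mediation structure described (feelings of physical energy acting on exercise both directly and through scheduling self-efficacy) can be illustrated with a minimal product-of-coefficients path analysis. The data below are simulated and the coefficients are arbitrary; they bear no relation to the study's estimates.

    ```python
    # Minimal path-analysis sketch of the mediation structure in the abstract:
    # physical energy -> scheduling self-efficacy -> exercise, plus a direct path.
    # Simulated data; coefficients are arbitrary and unrelated to the study.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 368
    energy = rng.normal(size=n)
    sched_se = 0.5 * energy + rng.normal(scale=0.8, size=n)                    # path a
    exercise = 0.3 * energy + 0.4 * sched_se + rng.normal(scale=0.8, size=n)   # paths c', b

    def slopes(y, *xs):
        """OLS slopes of y on the given predictors (intercept included)."""
        X = np.column_stack([np.ones(len(y)), *xs])
        return np.linalg.lstsq(X, y, rcond=None)[0][1:]

    a = slopes(sched_se, energy)[0]
    b, c_prime = slopes(exercise, sched_se, energy)
    print(f"a={a:.2f}, b={b:.2f}, direct c'={c_prime:.2f}, indirect a*b={a * b:.2f}")
    ```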

  19. DESIGN AND CONSTRUCTION OF SCHOOL BUILDINGS. PROCEEDINGS, ASSOCIATION OF SCHOOL BUSINESS OFFICIALS OF THE UNITED STATES AND CANADA, ANNUAL MEETING AND EDUCATIONAL EXHIBIT, (50TH SAN FRANCISCO, CALIFORNIA, OCTOBER 71-22, 1964).

    ERIC Educational Resources Information Center

    LIEBESKIND, MORRIS

    PROBLEMS IN THE SCHEDULING AND COMPLETION OF SCHOOL BUILDING DESIGN AND CONSTRUCTION PROJECTS ARE DISCUSSED WITH REFERENCE TO THE CRITICAL PATH METHOD OF PROGRAMING. THE DISCUSSION GIVES A BROAD OVERVIEW OF THE METHOD WITH DETAILED SUGGESTIONS FOR SCHOOL ADMINISTRATORS. SPECIFIC SUBJECT AREAS INCLUDE--(1) CPM, A NEW MANAGEMENT TOOL, (2) CPM…

  20. System and method for optimal load and source scheduling in context aware homes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shetty, Pradeep; Foslien Graber, Wendy; Mangsuli, Purnaprajna R.

    A controller for controlling energy consumption in a home includes a constraints engine to define variables for multiple appliances in the home corresponding to various home modes and persona of an occupant of the home. A modeling engine models multiple paths of energy utilization of the multiple appliances to place the home into a desired state from a current context. An optimal scheduler receives the multiple paths of energy utilization and generates a schedule as a function of the multiple paths and a selected persona to place the home in a desired state.

  1. Propagating Resource Constraints Using Mutual Exclusion Reasoning

    NASA Technical Reports Server (NTRS)

    Frank, Jeremy; Sanchez, Romeo; Do, Minh B.; Clancy, Daniel (Technical Monitor)

    2001-01-01

    One of the most recent techniques for propagating resource constraints in constraint-based scheduling is the Energy Constraint technique. It focuses on precedence-based scheduling, where precedence relations are taken into account rather than the absolute positions of activities. Although this technique has proved efficient on discrete unary resources, it provides only loose bounds for jobs using discrete multi-capacity resources. In this paper we show how mutual exclusion reasoning can be used to propagate time bounds for activities using discrete resources. Through both examples and an empirical study, we show that our technique, based on critical path analysis and mutex reasoning, is just as effective on unary resources and more effective on multi-capacity resources.

  2. Applying the TOC Project Management to Operation and Maintenance Scheduling of a Research Vessel

    NASA Astrophysics Data System (ADS)

    Manti, M. Firdausi; Fujimoto, Hideo; Chen, Lian-Yi

    Marine research vessels and their systems are major assets in marine resources development. Since the running costs of such a ship are very high, it is necessary to reduce the total cost through efficient scheduling of operation and maintenance. To shorten the project period and make it efficient, we applied the TOC project management method, an approach developed by Dr. Eli Goldratt. It challenges traditional approaches to project management and may become the most important improvement in project management since the development of the PERT and critical path methodologies. As a case study, we present a marine geology research project (operations) together with repair projects at the repairing dock (maintenance of the vessel).

  3. The LSST Scheduler from design to construction

    NASA Astrophysics Data System (ADS)

    Delgado, Francisco; Reuter, Michael A.

    2016-07-01

    The Large Synoptic Survey Telescope (LSST) will be a highly robotic facility, demanding a very high efficiency during its operation. To achieve this, the LSST Scheduler has been envisioned as an autonomous software component of the Observatory Control System (OCS), that selects the sequence of targets in real time. The Scheduler will drive the survey using optimization of a dynamic cost function of more than 200 parameters. Multiple science programs produce thousands of candidate targets for each observation, and multiple telemetry measurements are received to evaluate the external and the internal conditions of the observatory. The design of the LSST Scheduler started early in the project supported by Model Based Systems Engineering, detailed prototyping and scientific validation of the survey capabilities required. In order to build such a critical component, an agile development path in incremental releases is presented, integrated to the development plan of the Operations Simulator (OpSim) to allow constant testing, integration and validation in a simulated OCS environment. The final product is a Scheduler that is also capable of running 2000 times faster than real time in simulation mode for survey studies and scientific validation during commissioning and operations.

  4. Agent-Based Simulations for Project Management

    NASA Technical Reports Server (NTRS)

    White, J. Chris; Sholtes, Robert M.

    2011-01-01

    Currently, the most common approach used in project planning tools is the Critical Path Method (CPM). While this method was a great improvement over the basic Gantt chart technique being used at the time, it now suffers from three primary flaws: (1) task duration is an input, (2) productivity impacts are not considered, and (3) management corrective actions are not included. Today, computers have exceptional computational power to handle complex simulations of task execution and project management activities (e.g., dynamically changing the number of resources assigned to a task when it is behind schedule). Through research under a Department of Defense contract, the author and the ViaSim team have developed a project simulation tool that enables more realistic cost and schedule estimates by using a resource-based model that literally turns the current duration-based CPM approach "on its head." The approach represents a fundamental paradigm shift in estimating projects, managing schedules, and reducing risk through innovative predictive techniques.

  5. Voltage scheduling for low power/energy

    NASA Astrophysics Data System (ADS)

    Manzak, Ali

    2001-07-01

    Power considerations have become an increasingly dominant factor in the design of both portable and desk-top systems. An effective way to reduce power consumption is to lower the supply voltage since voltage is quadratically related to power. This dissertation considers the problem of lowering the supply voltage at (i) the system level and at (ii) the behavioral level. At the system level, the voltage of the variable voltage processor is dynamically changed with the work load. Processors with limited sized buffers as well as those with very large buffers are considered. Given the task arrival times, deadline times, execution times, periods and switching activities, task scheduling algorithms that minimize energy or peak power are developed for the processors equipped with very large buffers. A relation between the operating voltages of the tasks for minimum energy/power is determined using the Lagrange multiplier method, and an iterative algorithm that utilizes this relation is developed. Experimental results show that the voltage assignment obtained by the proposed algorithm is very close (0.1% error) to that of the optimal energy assignment and the optimal peak power (1% error) assignment. Next, on-line and off-line minimum energy task scheduling algorithms are developed for processors with limited sized buffers. These algorithms have polynomial time complexity and present optimal (off-line) and close-to-optimal (on-line) solutions. A procedure to calculate the minimum buffer size given information about the size of the task (maximum, minimum), execution time (best case, worst case) and deadlines is also presented. At the behavioral level, resources operating at multiple voltages are used to minimize power while maintaining the throughput. Such a scheme has the advantage of allowing modules on the critical paths to be assigned to the highest voltage levels (thus meeting the required timing constraints) while allowing modules on non-critical paths to be assigned to lower voltage levels (thus reducing the power consumption). A polynomial time resource and latency constrained scheduling algorithm is developed to distribute the available slack among the nodes such that power consumption is minimum. The algorithm is iterative and utilizes the slack based on the Lagrange multiplier method.
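
    The behavioral-level idea above, keeping critical-path modules at the high voltage while moving modules with slack to a lower voltage, can be sketched with a simple greedy pass. The quadratic energy model, the fixed slowdown factor for the low voltage, and the module chain below are simplifying assumptions made for illustration; this is not the dissertation's Lagrange-multiplier formulation.

    ```python
    # Greedy multi-voltage assignment sketch: modules start at the high voltage, and a
    # module is moved to the low voltage only if the added delay still meets the latency
    # constraint. Energy ~ C * V^2 and the delay penalty are simplifying assumptions.
    VH, VL = 5.0, 3.3
    SLOWDOWN = 1.8                            # assumed delay factor when running at VL

    # Hypothetical chain of modules: (name, delay at VH, switched-capacitance weight).
    modules = [("m1", 2.0, 3.0), ("m2", 3.0, 5.0), ("m3", 1.0, 2.0)]
    latency_constraint = 9.0
    voltage = {name: VH for name, _, _ in modules}

    def total_delay():
        return sum(d * (SLOWDOWN if voltage[n] == VL else 1.0) for n, d, _ in modules)

    def total_energy():
        return sum(c * voltage[n] ** 2 for n, _, c in modules)

    # Try the largest potential energy saving first (proportional to capacitance here).
    for name, d, c in sorted(modules, key=lambda m: m[2], reverse=True):
        voltage[name] = VL
        if total_delay() > latency_constraint:
            voltage[name] = VH                # not enough slack; revert

    print("assignment:", voltage)
    print(f"delay={total_delay():.1f} (limit {latency_constraint}), energy={total_energy():.1f}")
    ```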

  6. X-ray satellite

    NASA Technical Reports Server (NTRS)

    1985-01-01

    A mock-up for the development of the Engineering Model (EM) and Flight Model (FM) is introduced, which reduces the delay relative to the previously planned launch date of September 30 from 7 weeks to about 3 weeks while maintaining the 4-week reserve. Compared with the new assembly integration test (EM-AIT) schedule of March 11, 1985, the EM data handling system is on the critical path. For the attitude measurement and control subsystem, sufficient flexibility is achieved through a combination of dummies and EM hardware to catch up with the existing delays.

  7. Tera-OP Reliable Intelligently Adaptive Processing System (TRIPS) Implementation

    DTIC Science & Technology

    2008-09-01

    6.8 Instruction Scheduling; 6.8.1 Spatial Path Scheduling; 6.8.2 ... oblivious scheduling for rapid application prototyping and deployment, environmental adaptivity for resilience in hostile environments, and dynamic

  8. Task path planning, scheduling and learning for free-ranging robot systems

    NASA Technical Reports Server (NTRS)

    Wakefield, G. Steve

    1987-01-01

    The development of robotics applications for space operations is often restricted by the limited movement available to guided robots. Free ranging robots can offer greater flexibility than physically guided robots in these applications. Presented here is an object oriented approach to path planning and task scheduling for free-ranging robots that allows the dynamic determination of paths based on the current environment. The system also provides task learning for repetitive jobs. This approach provides a basis for the design of free-ranging robot systems which are adaptable to various environments and tasks.

  9. A computer method for schedule processing and quick-time updating.

    NASA Technical Reports Server (NTRS)

    Mccoy, W. H.

    1972-01-01

    A schedule analysis program is presented which can be used to process any schedule with continuous flow and with no loops. Although generally thought of as a management tool, it has applicability to such extremes as music composition and computer program efficiency analysis. Other possibilities for its use include the determination of electrical power usage during some operation such as spacecraft checkout, and the determination of impact envelopes for the purpose of scheduling payloads in launch processing. At the core of the described computer method is an algorithm which computes the position of each activity bar on the output waterfall chart. The algorithm is basically a maximal-path computation which gives to each node in the schedule network the maximal path from the initial node to the given node.
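
    The core maximal-path computation described above fits in a few lines: each activity's bar position is the longest path from the initial node to that activity. The network below is hypothetical.

    ```python
    # Maximal-path computation of the kind the abstract describes: an activity bar's
    # start position is the longest path from the initial node. Hypothetical network.
    durations = {"start": 0, "a": 2, "b": 4, "c": 3, "end": 0}
    preds = {"start": [], "a": ["start"], "b": ["start"], "c": ["a", "b"], "end": ["c"]}

    position = {}
    for node in ["start", "a", "b", "c", "end"]:      # a topological order
        position[node] = max((position[p] + durations[p] for p in preds[node]), default=0)

    print(position)   # "c" starts at 4, after the longer of its two predecessor branches
    ```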

  10. Design of a universal logic block for fault-tolerant realization of any logic operation in trapped-ion quantum circuits

    NASA Astrophysics Data System (ADS)

    Goudarzi, H.; Dousti, M. J.; Shafaei, A.; Pedram, M.

    2014-05-01

    This paper presents a physical mapping tool for quantum circuits, which generates the optimal universal logic block (ULB) that can, on average, perform any logical fault-tolerant (FT) quantum operations with the minimum latency. The operation scheduling, placement, and qubit routing problems tackled by the quantum physical mapper are highly dependent on one another. More precisely, the scheduling solution affects the quality of the achievable placement solution due to resource pressures that may be created as a result of operation scheduling, whereas the operation placement and qubit routing solutions influence the scheduling solution due to resulting distances between predecessor and current operations, which in turn determines routing latencies. The proposed flow for the quantum physical mapper captures these dependencies by applying (1) a loose scheduling step, which transforms an initial quantum data flow graph into one that explicitly captures the no-cloning theorem of the quantum computing and then performs instruction scheduling based on a modified force-directed scheduling approach to minimize the resource contention and quantum circuit latency, (2) a placement step, which uses timing-driven instruction placement to minimize the approximate routing latencies while making iterative calls to the aforesaid force-directed scheduler to correct scheduling levels of quantum operations as needed, and (3) a routing step that finds dynamic values of routing latencies for the qubits. In addition to the quantum physical mapper, an approach is presented to determine the single best ULB size for a target quantum circuit by examining the latency of different FT quantum operations mapped onto different ULB sizes and using information about the occurrence frequency of operations on critical paths of the target quantum algorithm to weigh these latencies. Experimental results show an average latency reduction of about 40 % compared to previous work.

  11. Managing Capital Investments and Resources for Large, Complex Satellite Development Efforts

    NASA Technical Reports Server (NTRS)

    Ryan, Richard M.

    2017-01-01

    Once the schedule was baselined and (at least) the primary, secondary, and tertiary critical paths were established, the project: identified which entity controls which portion of the schedule reserve; dollarized the time associated with the schedule reserve and segregated it from other budget reserves, so that authorization to spend came only when authorization to utilize the time was given; and created a process to manage the control of these resources. This process was above and beyond the typical schedule controls used to monitor and maintain the day-to-day schedules. The Webb telescope will be the premier space observatory for astronomers worldwide, extending the tantalizing discoveries of the Hubble Space Telescope. It is an international collaboration among NASA, the European Space Agency, and the Canadian Space Agency. The largest telescope ever placed in space, Webb will be 100 times more powerful than Hubble; it is so big it has to fold origami-style to fit in the rocket and will unfold like a transformer once in space. The 5-layer sunshield protects the telescope from the Sun, Earth, and Moon's infrared radiation; it is like having sun protection of SPF 1 million. Unprecedented infrared sensitivity will peer back in time over 13.5 billion years to see the first galaxies born after the Big Bang. Hubble orbits 350 miles above the Earth, while Webb will orbit the Sun 1 million miles from Earth, launching from French Guiana in 2018.

  12. A statistical-based scheduling algorithm in automated data path synthesis

    NASA Technical Reports Server (NTRS)

    Jeon, Byung Wook; Lursinsap, Chidchanok

    1992-01-01

    In this paper, we propose a new heuristic scheduling algorithm based on the statistical analysis of the cumulative frequency distribution of operations among control steps. It has a tendency to escape from local minima and therefore to reach a globally optimal solution. The presented algorithm considers real-world constraints such as chained operations, multicycle operations, and pipelined data paths. The experimental results show that it gives optimal solutions, even though it is greedy in nature.

  13. Wireless Sensor Network Metrics for Real-Time Systems

    DTIC Science & Technology

    2009-05-20

    to compute the probability of end-to-end packet delivery as a function of latency, the expected radio energy consumption on the nodes from relaying... schedules for WSNs. Particularly, we focus on the impact scheduling has on path diversity, using short repeating schedules and Greedy Maximal Matching...a greedy algorithm for constructing a mesh routing topology. Finally, we study the implications of using distributed scheduling schemes to generate

  14. Dynamic Testing of the NASA Hypersonic Project Combined Cycle Engine Testbed for Mode Transition Experiments

    NASA Technical Reports Server (NTRS)

    2011-01-01

    NASA is interested in developing technology that leads to more routine, safe, and affordable access to space. Access to space using airbreathing propulsion systems has potential to meet these objectives based on Airbreathing Access to Space (AAS) system studies. To this end, the NASA Fundamental Aeronautics Program (FAP) Hypersonic Project is conducting fundamental research on a Turbine Based Combined Cycle (TBCC) propulsion system. The TBCC being studied considers a dual flow-path inlet system. One flow-path includes variable geometry to regulate airflow to a turbine engine cycle. The turbine cycle provides propulsion from take-off to supersonic flight. The second flow-path supports a dual-mode scramjet (DMSJ) cycle which would be initiated at supersonic speed to further accelerate the vehicle to hypersonic speed. For a TBCC propulsion system to accelerate a vehicle from supersonic to hypersonic speed, a critical enabling technology is the ability to safely and effectively transition from the turbine to the DMSJ-referred to as mode transition. To experimentally test methods of mode transition, a Combined Cycle Engine (CCE) Large-scale Inlet testbed was designed with two flow paths-a low speed flow-path sized for a turbine cycle and a high speed flow-path designed for a DMSJ. This testbed system is identified as the CCE Large-Scale Inlet for Mode Transition studies (CCE-LIMX). The test plan for the CCE-LIMX in the NASA Glenn Research Center (GRC) 10- by 10-ft Supersonic Wind Tunnel (10x10 SWT) is segmented into multiple phases. The first phase is a matrix of inlet characterization (IC) tests to evaluate the inlet performance and establish the mode transition schedule. The second phase is a matrix of dynamic system identification (SysID) experiments designed to support closed-loop control development at mode transition schedule operating points for the CCE-LIMX. The third phase includes a direct demonstration of controlled mode transition using a closed loop control system developed with the data obtained from the first two phases. Plans for a fourth phase include mode transition experiments with a turbine engine. This paper, focusing on the first two phases of experiments, presents developed operational and analysis tools for streamlined testing and data reduction procedures.

  15. Simulating Mission Command for Planning and Analysis

    DTIC Science & Technology

    2015-06-01

    mission plan. 14. SUBJECT TERMS: Mission Planning, CPM, PERT, Simulation, DES, Simkit, Triangle Distribution, Critical Path. 15. NUMBER OF ... Battalion Task Force; CO Company; CPM Critical Path Method; DES Discrete Event Simulation; FA BAT Field Artillery Battalion; FEL Future Event List; FIST ... management tools that can be utilized to find the critical path in military projects. These are the Critical Path Method (CPM) and the Program Evaluation and

  16. Scheduling Real-Time Mixed-Criticality Jobs

    NASA Astrophysics Data System (ADS)

    Baruah, Sanjoy K.; Bonifaci, Vincenzo; D'Angelo, Gianlorenzo; Li, Haohan; Marchetti-Spaccamela, Alberto; Megow, Nicole; Stougie, Leen

    Many safety-critical embedded systems are subject to certification requirements; some systems may be required to meet multiple sets of certification requirements, from different certification authorities. Certification requirements in such "mixed-criticality" systems give rise to interesting scheduling problems, that cannot be satisfactorily addressed using techniques from conventional scheduling theory. In this paper, we study a formal model for representing such mixed-criticality workloads. We demonstrate first the intractability of determining whether a system specified in this model can be scheduled to meet all its certification requirements, even for systems subject to two sets of certification requirements. Then we quantify, via the metric of processor speedup factor, the effectiveness of two techniques, reservation-based scheduling and priority-based scheduling, that are widely used in scheduling such mixed-criticality systems, showing that the latter of the two is superior to the former. We also show that the speedup factors are tight for these two techniques.
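
    A minimal sketch of the dual-criticality job model studied here: each job carries LO and HI execution-time estimates, and the certification question is whether deadlines hold in the corresponding scenarios. The fragment below checks only two extreme scenarios under one fixed, non-preemptive priority order on a hypothetical job set; it illustrates the model, not the paper's schedulability analysis.

    ```python
    # Dual-criticality job model sketch: jobs with LO and HI execution-time estimates,
    # checked under one fixed priority order in two extreme scenarios. Hypothetical jobs;
    # this is an illustration of the model, not a full schedulability test.
    from collections import namedtuple

    Job = namedtuple("Job", "name release deadline level c_lo c_hi")
    jobs = [
        Job("J1", 0, 4, "HI", 1, 3),
        Job("J2", 0, 3, "LO", 1, 1),
        Job("J3", 1, 6, "HI", 2, 3),
    ]
    priority = ["J2", "J1", "J3"]             # higher priority first
    by_name = {j.name: j for j in jobs}

    def meets_deadlines(budgets, must_meet):
        """Run jobs non-preemptively by priority; check deadlines of the given levels."""
        t, ok = 0.0, True
        remaining = dict(budgets)
        while remaining:
            ready = [n for n in priority if n in remaining and by_name[n].release <= t]
            if not ready:
                t = min(by_name[n].release for n in remaining)
                continue
            n = ready[0]
            t += remaining.pop(n)             # run the job to completion
            if by_name[n].level in must_meet and t > by_name[n].deadline:
                ok = False
        return ok

    lo_ok = meets_deadlines({j.name: j.c_lo for j in jobs}, {"LO", "HI"})
    hi_ok = meets_deadlines({j.name: (j.c_hi if j.level == "HI" else j.c_lo) for j in jobs}, {"HI"})
    print("LO scenario, all deadlines met:", lo_ok, "| HI scenario, HI deadlines met:", hi_ok)
    ```

    In a real mixed-criticality scheduler the LO job would be dropped once a HI job overruns its LO budget; the sketch deliberately omits that mode switch, which is exactly why a single conventional priority order struggles to satisfy both certification scenarios at once.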

  17. Incentive-compatible guaranteed renewable health insurance premiums.

    PubMed

    Herring, Bradley; Pauly, Mark V

    2006-05-01

    Theoretical models of guaranteed renewable insurance display front-loaded premium schedules. Such schedules both cover lifetime total claims of low-risk and high-risk individuals and provide an incentive for those who remain low-risk to continue to purchase the policy. Questions have been raised of whether actual individual insurance markets in the US approximate the behavior predicted by these models, both because young consumers may not be able to "afford" front-loading and because insurers may behave strategically in ways that erode the value of protection against risk reclassification. In this paper, the optimal competitive age-based premium schedule for a benchmark guaranteed renewable health insurance policy is estimated using medical expenditure data. Several factors are shown to reduce the amount of front-loading necessary. Indeed, the resulting optimal premium path increases with age. Actual premium paths exhibited by purchasers of individual insurance are close to the optimal renewable schedule we estimate. Finally, consumer utility associated with the feature is examined.

  18. Observatorio Astrofísico de Javalambre: observation scheduler and sequencer

    NASA Astrophysics Data System (ADS)

    Ederoclite, A.; Cristóbal-Hornillos, D.; Moles, M.; Cenarro, A. J.; Marín-Franch, A.; Yanes Díaz, A.; Gruel, N.; Varela, J.; Chueca, S.; Rueda-Teruel, F.; Rueda-Teruel, S.; Luis-Simoes, R.; Hernández-Fuertes, J.; López-Sainz, A.; Chioare Díaz-Martín, M.

    2013-05-01

    Observational strategy is on the critical path of any large survey. Planning a night requires knowledge of the fields already observed, the quality of the data already secured, and the fields still to be observed, in order to optimize scientific returns. In addition, field maximum altitude, sky distance/brightness during the night, and meteorological data (cloud coverage and seeing) have to be taken into account to increase the chance of a successful observation. To support the execution of the J-PAS project at the Javalambre Astrophysical Observatory, we have prepared a scheduler and a sequencer (SCH/SQ) which take all of these parameters into account. The scheduler first selects the fields which can be observed during the night and orders them on the basis of their figure of merit, taking into account the quality and spectral coverage of the existing observations as well as the possibility of getting a good observation during the night. The sequencer takes the meteorological variables into account in order to prepare the observation queue for the night. During the commissioning of the telescopes at OAJ, we expect to improve our figures of merit and eventually arrive at a system which can function semi-automatically. This poster describes the design of this software.
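
    As an illustration of the kind of ordering such a scheduler performs, the following Python sketch filters out fields that cannot yield a good observation tonight and ranks the rest by a figure of merit. The field attributes, thresholds, and merit weights are invented for the example and are not the OAJ/J-PAS implementation.

      # Minimal sketch of figure-of-merit night planning (illustrative, not the OAJ code).
      from dataclasses import dataclass

      @dataclass
      class Field:
          name: str
          max_altitude_deg: float    # highest altitude the field reaches tonight
          moon_distance_deg: float   # angular distance from the Moon
          completeness: float        # fraction of filters already observed with good quality

      def observable(f, min_altitude=40.0, min_moon_distance=30.0):
          """Keep only fields with a realistic chance of a good observation tonight."""
          return f.max_altitude_deg >= min_altitude and f.moon_distance_deg >= min_moon_distance

      def figure_of_merit(f):
          """Favor poorly covered fields that transit high and far from the Moon."""
          return (1.0 - f.completeness) * f.max_altitude_deg + 0.1 * f.moon_distance_deg

      def plan_night(fields):
          return sorted((f for f in fields if observable(f)), key=figure_of_merit, reverse=True)

      queue = plan_night([Field("field-0001", 72.0, 95.0, 0.30),
                          Field("field-0002", 45.0, 20.0, 0.10),   # too close to the Moon
                          Field("field-0003", 60.0, 120.0, 0.80)])
      print([f.name for f in queue])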

  19. Mixed Criticality Scheduling for Industrial Wireless Sensor Networks

    PubMed Central

    Jin, Xi; Xia, Changqing; Xu, Huiting; Wang, Jintao; Zeng, Peng

    2016-01-01

    Wireless sensor networks (WSNs) have been widely used in industrial systems. Their real-time performance and reliability are fundamental to industrial production. Many works have studied these two aspects, but they focus only on single-criticality WSNs. Mixed criticality requirements exist in many advanced applications in which different data flows have different levels of importance (or criticality). In this paper, first, we propose a scheduling algorithm, which guarantees the real-time performance and reliability requirements of data flows with different levels of criticality. The algorithm supports centralized optimization and adaptive adjustment. It is able to improve both the scheduling performance and flexibility. Then, we provide the schedulability test through rigorous theoretical analysis. We conduct extensive simulations, and the results demonstrate that the proposed scheduling algorithm and analysis significantly outperform existing ones. PMID:27589741

  20. Scheduling time-critical graphics on multiple processors

    NASA Technical Reports Server (NTRS)

    Meyer, Tom W.; Hughes, John F.

    1995-01-01

    This paper describes an algorithm for the scheduling of time-critical rendering and computation tasks on single- and multiple-processor architectures, with minimal pipelining. It was developed to manage scientific visualization scenes consisting of hundreds of objects, each of which can be computed and displayed at thousands of possible resolution levels. The algorithm generates the time-critical schedule using progressive-refinement techniques; it always returns a feasible schedule and, when allowed to run to completion, produces a near-optimal schedule which takes advantage of almost the entire multiple-processor system.
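
    A minimal sketch of the progressive-refinement idea is shown below: every object starts at its coarsest level (so the schedule is always feasible), and remaining frame time is spent on the refinements with the best benefit per unit cost. The cost/benefit numbers and the greedy upgrade rule are illustrative assumptions, not the scheduler from the paper.

      # Illustrative progressive-refinement scheduling of per-object resolution levels.
      import heapq

      def schedule_frame(objects, budget):
          """objects: name -> list of (cumulative_cost, benefit) per level, coarse to fine."""
          level = {name: 0 for name in objects}
          spent = sum(levels[0][0] for levels in objects.values())
          if spent > budget:
              raise ValueError("even the coarsest schedule exceeds the frame budget")
          heap = []  # candidate upgrades, best benefit-per-cost first
          def push_upgrade(name):
              levels, cur = objects[name], level[name]
              if cur + 1 < len(levels):
                  extra = levels[cur + 1][0] - levels[cur][0]
                  gain = levels[cur + 1][1] - levels[cur][1]
                  heapq.heappush(heap, (-gain / extra, name))
          for name in objects:
              push_upgrade(name)
          while heap:
              _, name = heapq.heappop(heap)
              levels, cur = objects[name], level[name]
              extra = levels[cur + 1][0] - levels[cur][0]
              if spent + extra <= budget:
                  spent, level[name] = spent + extra, cur + 1
                  push_upgrade(name)
          return level, spent

      objs = {"isosurface": [(2, 1), (5, 4), (9, 6)], "streamlines": [(1, 1), (4, 5)]}
      print(schedule_frame(objs, budget=10))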

  1. Decentralized Control of Scheduling in Distributed Systems.

    DTIC Science & Technology

    1983-03-18

    the job scheduling algorithm adapts to the changing busyness of the various hosts in the system. The environment in which the job scheduling entities...resources and processes that constitute the node and a set of interfaces for accessing these processes and resources. The structure of a node could change ...parallel. Chang [CHNG82] has also described some algorithms for detecting properties of general graphs by traversing paths in a graph in parallel. One of

  2. Real-time operating system timing jitter and its impact on motor control

    NASA Astrophysics Data System (ADS)

    Proctor, Frederick M.; Shackleford, William P.

    2001-12-01

    General-purpose microprocessors are increasingly being used for control applications due to their widespread availability and software support for non-control functions like networking and operator interfaces. Two classes of real-time operating systems (RTOS) exist for these systems. The traditional RTOS serves as the sole operating system, and provides all OS services. Examples include ETS, LynxOS, QNX, Windows CE and VxWorks. RTOS extensions add real-time scheduling capabilities to non-real-time OSes, and provide minimal services needed for the time-critical portions of an application. Examples include RTAI and RTL for Linux, and HyperKernel, OnTime and RTX for Windows NT. Timing jitter is an issue in these systems, due to hardware effects such as bus locking, caches and pipelines, and software effects from mutual exclusion resource locks, non-preemptible critical sections, disabled interrupts, and multiple code paths in the scheduler. Jitter is typically on the order of a microsecond to a few tens of microseconds for hard real-time operating systems, and ranges from milliseconds to seconds in the worst case for soft real-time operating systems. The question of its significance for controller performance arises. Naturally, the smaller the scheduling period required for a control task, the more significant is the impact of timing jitter. Aside from this intuitive relationship is the greater significance of timing on open-loop control, such as for stepper motors, than for closed-loop control, such as for servo motors. Techniques for measuring timing jitter are discussed, and comparisons between various platforms are presented. Techniques to reduce jitter or mitigate its effects are presented. The impact of jitter on stepper motor control is analyzed.

  3. A success paradigm for project managers in the aerospace industry

    NASA Astrophysics Data System (ADS)

    Bauer, Barry Jon

    Within the aerospace industry, project managers traditionally have been selected based on their technical competency. While this may lead to brilliant technical solutions to customer requirements, a lack of management ability can result in failed programs that over-run on cost, are late to critical path schedules, fail to fully utilize the diversity of talent available within the program team, and otherwise disappoint key stakeholders. This research study identifies the key competencies that a project manager should possess in order to successfully lead and manage a project in the aerospace industry. The research attempts to show evidence that within the aerospace industry, it is perceived that management competency is more important to project management success than only technical competence.

  4. Exact and Approximate Probabilistic Symbolic Execution

    NASA Technical Reports Server (NTRS)

    Luckow, Kasper; Pasareanu, Corina S.; Dwyer, Matthew B.; Filieri, Antonio; Visser, Willem

    2014-01-01

    Probabilistic software analysis seeks to quantify the likelihood of reaching a target event under uncertain environments. Recent approaches compute probabilities of execution paths using symbolic execution, but do not support nondeterminism. Nondeterminism arises naturally when no suitable probabilistic model can capture a program behavior, e.g., for multithreading or distributed systems. In this work, we propose a technique, based on symbolic execution, to synthesize schedulers that resolve nondeterminism to maximize the probability of reaching a target event. To scale to large systems, we also introduce approximate algorithms to search for good schedulers, speeding up established random sampling and reinforcement learning results through the quantification of path probabilities based on symbolic execution. We implemented the techniques in Symbolic PathFinder and evaluated them on nondeterministic Java programs. We show that our algorithms significantly improve upon a state-of-the-art statistical model checking algorithm, originally developed for Markov Decision Processes.

  5. In-Space Manufacturing: Pioneering a Sustainable Path to Mars

    NASA Technical Reports Server (NTRS)

    Werkheiser, Niki

    2015-01-01

    In order to provide meaningful impacts to exploration technology needs, the In-Space Manufacturing (ISM) Initiative must influence exploration systems design now. In-space manufacturing offers: a dramatic paradigm shift in the development and creation of space architectures; efficiency gains and risk reduction for low Earth orbit and deep space exploration; and a "pioneering" approach to maintenance, repair, and logistics leading to a sustainable, affordable supply chain model. In order to develop application-based capabilities in time to support the NASA budget and schedule, ISM must be able to leverage the significant commercial developments, which requires innovative, agile collaborative mechanisms (contracts, challenges, SBIRs, etc.); and NASA-unique investments to focus primarily on adapting the technologies and processes to the microgravity environment. We must do the foundational work - it is the critical path for taking these technologies from lab curiosities to institutionalized capabilities: characterize, certify, institutionalize, design for Additive Manufacturing (AM). Ideally, International Space Station (ISS) U.S. lab rack or partial rack space should be identified for in-space manufacturing utilization in order to continue technology development of a suite of capabilities required for exploration missions, as well as commercialization on ISS.

  6. Algorithms for Scheduling and Network Problems

    DTIC Science & Technology

    1991-09-01

    time. We already know, by Lemma 2.2.1, that WOPT = O(log(mpU)), so if we could solve this integer program optimally we would be done. However, the...Folydirat, 15:177-191, 1982. [6] I.S. Belov and Ya. N. Stolin. An algorithm in a single path operations scheduling problem. In Mathematical Economics and

  7. Time-critical multirate scheduling using contemporary real-time operating system services

    NASA Technical Reports Server (NTRS)

    Eckhardt, D. E., Jr.

    1983-01-01

    Although real-time operating systems provide many of the task control services necessary to process time-critical applications (i.e., applications with fixed, invariant deadlines), it may still be necessary to provide a scheduling algorithm at a level above the operating system in order to coordinate a set of synchronized, time-critical tasks executing at different cyclic rates. This paper examines the scheduling requirements for such applications and develops scheduling algorithms using services provided by contemporary real-time operating systems.
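
    A common way to coordinate synchronized cyclic tasks above the operating system is a major/minor frame table; the sketch below builds such a table from task periods. The task set and periods are illustrative, and the actual dispatch of each frame (e.g., a timer-driven loop releasing tasks through OS services) is omitted.

      # Sketch of a cyclic (major/minor frame) release table for multirate tasks.
      from functools import reduce
      from math import gcd

      tasks = {"attitude": 10, "guidance": 20, "telemetry": 40}   # periods in ms

      minor = reduce(gcd, tasks.values())                              # minor frame = gcd of periods
      major = reduce(lambda a, b: a * b // gcd(a, b), tasks.values())  # major frame = lcm of periods

      table = [(t, [name for name, period in tasks.items() if t % period == 0])
               for t in range(0, major, minor)]

      for start, released in table:
          print(f"minor frame at {start:3d} ms -> release {released}")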

  8. Defense AT&L (Volume 35, Number 5, September-October 2006)

    DTIC Science & Technology

    2006-10-01

    percent of production. The critical path elements driving the IOT&E schedule are not production hardware...reduced costs, and successful completion of work in the scheduled time. The Commodity Approach to Aircraft Protection Systems, Capt. Bill Chubb, USN. The...piece of Littoral Combat Ship Two during the ship’s keel laying ceremony. The Navy’s second Littoral Combat Ship is scheduled for commissioning in

  9. The Human Space Life Sciences Critical Path Roadmap Project: A Strategy for Human Space Flight through Exploration-Class Missions

    NASA Technical Reports Server (NTRS)

    Sawin, Charles F.

    1999-01-01

    The product of the critical path roadmap project is an integrated strategy for mitigating the risks associated with human exploration class missions. It is an evolving process that will assure the ability to communicate the integrated critical path roadmap. Unlike previous reports, this one will not sit on a shelf - it has the full support of the JSC Space and Life Sciences Directorate (SA) and is already being used as a decision-making tool (e.g., budget and investigation planning for Shuttle and Space Station missions). The utility of this product depends on many efforts, namely: providing the required information (completed risk data sheets, critical question information, technology data). It is essential to communicate the results of the critical path roadmap to the scientific community - this meeting is a good opportunity to do so. The web site envisioned for the critical path roadmap will provide the capability to communicate to a broader community and to track and update the system routinely.

  10. The terminal area automated path generation problem

    NASA Technical Reports Server (NTRS)

    Hsin, C.-C.

    1977-01-01

    The automated terminal area path generation problem in the advanced Air Traffic Control System (ATC) has been studied. Definitions, input, output and the interrelationships with other ATC functions have been discussed. Alternatives in modeling the problem have been identified. Problem formulations and solution techniques are presented. In particular, the solution of a minimum effort path stretching problem (path generation on a given schedule) has been carried out using the Newton-Raphson trajectory optimization method. Discussions are presented on the effects of different delivery times, aircraft entry positions, initial guesses on the boundary conditions, etc. Recommendations are made on real-world implementations.

  11. Millimeter-wave studies

    NASA Technical Reports Server (NTRS)

    Allen, Kenneth C.

    1988-01-01

    Progress on millimeter-wave propagation experiments in Hawaii is reported. A short path for measuring attenuation in rain at 9.6, 28.8, 57.6, and 96.1 GHz is in operation. A slant path from Hilo to the top of Mauna Kea is scheduled. On this path, scattering from rain and clouds that may cause interference for satellites closely spaced in geosynchronous orbit will be measured at 28.8 and 96.1 GHz. In addition, the full transmission matrix will be measured at these frequencies on the slant path. The technique and equipment used to measure the transmission matrix are described.

  12. Think-Aloud Process Superior to Thought-Listing in Increasing Children's Critical Processing of Advertising

    ERIC Educational Resources Information Center

    Rozendaal, Esther; Buijzen, Moniek; Valkenburg, Patti M.

    2012-01-01

    This study develops and tests a model of children's critical processing of advertising. Within this model, 2 paths to reduced advertising susceptibility (i.e., attitude toward the advertised brand) were hypothesized: a cognitive path and an affective path. The secondary aim was to compare these paths for different thought verbalization processes:…

  13. Novel Hybrid Scheduling Technique for Sensor Nodes with Mixed Criticality Tasks.

    PubMed

    Micea, Mihai-Victor; Stangaciu, Cristina-Sorina; Stangaciu, Valentin; Curiac, Daniel-Ioan

    2017-06-26

    Sensor networks are increasingly becoming a key technology for complex control applications. Their potential use in safety- and time-critical domains has raised the need for task scheduling mechanisms specially adapted to sensor node specific requirements, often materialized in predictable jitter-less execution of tasks characterized by different criticality levels. This paper offers an efficient scheduling solution, named Hybrid Hard Real-Time Scheduling (H²RTS), which combines a static, clock driven method with a dynamic, event driven scheduling technique, in order to provide high execution predictability, while keeping a high node Central Processing Unit (CPU) utilization factor. From the detailed, integrated schedulability analysis of the H²RTS, a set of sufficiency tests are introduced and demonstrated based on the processor demand and linear upper bound metrics. The performance and correct behavior of the proposed hybrid scheduling technique have been extensively evaluated and validated both on a simulator and on a sensor mote equipped with an ARM7 microcontroller.

  14. Ares I-X Roll Control System Development

    NASA Technical Reports Server (NTRS)

    Unger, Ronald J.; Massey, Edmund C.

    2009-01-01

    Project Managers often face challenging technical, schedule and budget issues. This presentation will explore how the Ares I-X Roll Control System Integrated Product Team (IPT) mitigated challenges such as concurrent engineering requirements and environments and evolving program processes, while successfully managing an aggressive project schedule and tight budget. IPT challenges also included communications and negotiations among inter- and intra-government agencies, including the US Air Force, NASA/MSFC Propulsion Engineering, LaRC, GRC, KSC, WSTF, and the Constellation Program. In order to successfully meet these challenges it was essential that the IPT define those items that most affected the schedule critical path, define early mitigation strategies to reduce technical, schedule, and budget risks, and maintain the end-product focus of an "unmanned test flight" context for the flight hardware. The makeup of the IPT and how it would function were also important considerations. The IPT consisted of NASA/MSFC (project management, engineering, and safety/quality) and contractors (Teledyne Brown Engineering and Pratt and Whitney Rocketdyne, who supplied heritage hardware experience). The early decision to have a small focused IPT working "badgelessly" across functional lines to eliminate functional stove-piping allowed for many more tasks to be done by fewer people. It also enhanced a sense of ownership of the products, while still being able to revert back to traditional roles in order to provide the required technical independence in design reviews and verification closures. This presentation will highlight several prominent issues and discuss how they were mitigated and the resulting Lessons Learned that might benefit other projects.

  15. Electrical utilities model for determining electrical distribution capacity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fritz, R. L.

    1997-09-03

    In its simplest form, this model was to obtain meaningful data on the current state of the Site's electrical transmission and distribution assets, and turn this vast collection of data into useful information. The resulting product is an Electrical Utilities Model for Determining Electrical Distribution Capacity which provides: current state of the electrical transmission and distribution systems; critical Hanford Site needs based on outyear planning documents; decision factor model. This model will enable Electrical Utilities management to improve forecasting requirements for service levels, budget, schedule, scope, and staffing, and recommend the best path forward to satisfy customer demands at the minimum risk and least cost to the government. A dynamic document, the model will be updated annually to reflect changes in Hanford Site activities.

  16. What's Going on with Young People Today? The Long and Twisting Path to Adulthood

    ERIC Educational Resources Information Center

    Settersten, Richard A., Jr.; Ray, Barbara

    2010-01-01

    Richard Settersten and Barbara Ray examine the lengthening transition to adulthood over the past several decades, as well as the challenges the new schedule poses for young people, families, and society. The authors begin with a brief history of becoming an adult, noting that the schedule that youth follow to arrive at adulthood changes to meet…

  17. Algorithms for constructing optimal paths and statistical analysis of passenger traffic

    NASA Astrophysics Data System (ADS)

    Trofimov, S. P.; Druzhinina, N. G.; Trofimova, O. G.

    2018-01-01

    Several existing information systems for urban passenger transport (UPT) are considered. The authors' UPT network model is presented. A new service is offered to passengers: the best path from one stop to another at a specified time. The algorithm and software implementation for finding the optimal path are presented; the algorithm uses the current UPT schedule. The article also describes an algorithm for statistical analysis of trip payments made with electronic E-cards. This algorithm yields the density of passenger traffic during the day, a density that is independent of the network topology and UPT schedules. The resulting traffic-flow density can support a number of practical problems, in particular forecasting the overloading of passenger transport during rush hours, quantitative comparison of different transport network topologies, and construction of the best UPT timetable. The efficiency of the proposed integrated approach is demonstrated on an example model town of arbitrary dimensions.
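
    The earliest-arrival variant of such a search can be sketched as a time-dependent Dijkstra over the stop graph, where relaxing an edge means boarding the first scheduled run that departs after the current arrival time. The stops, timetable, and minute-based time representation below are invented for the example and are not the authors' implementation.

      # Earliest-arrival search over a simple departure timetable (illustrative data).
      import heapq
      from bisect import bisect_left

      # timetable[(a, b)] = sorted (departure_minute, arrival_minute) pairs for direct runs a -> b
      timetable = {("A", "B"): [(480, 492), (500, 512), (520, 532)],
                   ("B", "C"): [(495, 505), (515, 525), (535, 545)],
                   ("A", "C"): [(485, 530)]}

      def earliest_arrival(source, target, start_minute):
          best = {source: start_minute}
          heap = [(start_minute, source)]
          while heap:
              t, stop = heapq.heappop(heap)
              if stop == target:
                  return t
              if t > best.get(stop, float("inf")):
                  continue
              for (a, b), runs in timetable.items():
                  if a != stop:
                      continue
                  i = bisect_left(runs, (t, -1))   # first run departing at or after time t
                  if i < len(runs) and runs[i][1] < best.get(b, float("inf")):
                      best[b] = runs[i][1]
                      heapq.heappush(heap, (runs[i][1], b))
          return None

      print(earliest_arrival("A", "C", 478))   # 505: board at 480 to B, transfer to the 495 run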

  18. CaLRS: A Critical-Aware Shared LLC Request Scheduling Algorithm on GPGPU

    PubMed Central

    Ma, Jianliang; Meng, Jinglei; Chen, Tianzhou; Wu, Minghui

    2015-01-01

    Ultra-high thread-level parallelism in modern GPUs usually generates numerous memory requests simultaneously, so there are always plenty of memory requests waiting at each bank of the shared LLC (L2 in this paper) and of global memory. For global memory, various schedulers have already been developed to adjust the request sequence, but little work has focused on the service sequence at the shared LLC. We measured that in many GPU applications requests always queue at the LLC banks for service, which provides an opportunity to optimize the service order at the LLC. By adjusting the order in which GPU memory requests are served, we can improve the schedulability of the SMs. We therefore propose a critical-aware shared LLC request scheduling algorithm (CaLRS) in this paper. The priority assigned to each memory request is central to CaLRS: we use the number of memory requests that originate from the same warp but have not yet been serviced when a request arrives at the shared LLC bank to represent the criticality of that warp. Experiments show that the proposed scheme can boost SM schedulability effectively by promoting the scheduling priority of memory requests with high criticality, and thereby indirectly improves GPU performance. PMID:25729772

  19. Novel Hybrid Scheduling Technique for Sensor Nodes with Mixed Criticality Tasks

    PubMed Central

    Micea, Mihai-Victor; Stangaciu, Cristina-Sorina; Stangaciu, Valentin; Curiac, Daniel-Ioan

    2017-01-01

    Sensor networks are increasingly becoming a key technology for complex control applications. Their potential use in safety- and time-critical domains has raised the need for task scheduling mechanisms specially adapted to sensor node specific requirements, often materialized in predictable jitter-less execution of tasks characterized by different criticality levels. This paper offers an efficient scheduling solution, named Hybrid Hard Real-Time Scheduling (H2RTS), which combines a static, clock driven method with a dynamic, event driven scheduling technique, in order to provide high execution predictability, while keeping a high node Central Processing Unit (CPU) utilization factor. From the detailed, integrated schedulability analysis of the H2RTS, a set of sufficiency tests are introduced and demonstrated based on the processor demand and linear upper bound metrics. The performance and correct behavior of the proposed hybrid scheduling technique have been extensively evaluated and validated both on a simulator and on a sensor mote equipped with an ARM7 microcontroller. PMID:28672856

  20. Robustness of mission plans for unmanned aircraft

    NASA Astrophysics Data System (ADS)

    Niendorf, Moritz

    This thesis studies the robustness of optimal mission plans for unmanned aircraft. Mission planning typically involves tactical planning and path planning. Tactical planning refers to task scheduling and in multi-aircraft scenarios also includes establishing a communication topology. Path planning refers to computing a feasible and collision-free trajectory. For a prototypical mission planning problem, the traveling salesman problem on a weighted graph, the robustness of an optimal tour is analyzed with respect to changes to the edge costs. Specifically, the stability region of an optimal tour is obtained, i.e., the set of all edge cost perturbations for which that tour is optimal. The exact stability region of solutions to variants of the traveling salesman problem is obtained from a linear programming relaxation of an auxiliary problem. Edge cost tolerances and edge criticalities are derived from the stability region. For Euclidean traveling salesman problems, robustness with respect to perturbations to vertex locations is considered and safe radii and vertex criticalities are introduced. For weighted-sum multi-objective problems, stability regions with respect to changes in the objectives, weights, and simultaneous changes are given. Most critical weight perturbations are derived. Computing exact stability regions is intractable for large instances. Therefore, tractable approximations are desirable. The stability regions of solutions to relaxations of the traveling salesman problem give under-approximations, and sets of tours give over-approximations. The application of these results to the two-neighborhood and the minimum 1-tree relaxation is discussed. Bounds on edge cost tolerances and approximate criticalities are obtainable likewise. A minimum spanning tree is an optimal communication topology for minimizing the cumulative transmission power in multi-aircraft missions. The stability region of a minimum spanning tree is given and tolerances, stability balls, and criticalities are derived. This analysis is extended to Euclidean minimum spanning trees. This thesis aims at enabling increased mission performance by providing means of assessing the robustness and optimality of a mission and methods for identifying critical elements. Examples of the application to mission planning in contested environments, cargo aircraft mission planning, multi-objective mission planning, and planning optimal communication topologies for teams of unmanned aircraft are given.

  1. Kuiper Belt Objects Along the Pluto Express Path

    NASA Technical Reports Server (NTRS)

    Jewitt, David C.

    1998-01-01

    The science objective of this work was to identify objects in the Kuiper Belt which will, in the 5 years following Pluto encounter, be close to the flight path of NASA's Pluto-Kuiper Express. Currently, launch is scheduled for 2004 with a flight time of about 1 decade. Early identification of post-Pluto targets is important for mission design and orbit refinement. An object or objects close enough to the flight path can be visited and studied at high resolution, using only residual gas in the thrusters to effect a close encounter.

  2. Scheduling Capacitated One-Way Vehicles on Paths with Deadlines

    NASA Astrophysics Data System (ADS)

    Uchida, Jun; Karuno, Yoshiyuki; Nagamochi, Hiroshi

    In this paper, we deal with a scheduling problem of minimizing the number of employed vehicles on paths. Let G=(V,E) be a path with a vertex set V={v_i | i=1,2,...,n} and an edge set E={{v_i, v_{i+1}} | i=1,2,...,n-1}. Vehicles with capacity b are initially situated at v_1. There is a job i at each vertex v_i ∈ V, which has its own handling time h_i and deadline d_i. With each edge {v_i, v_{i+1}} ∈ E, a travel time w_{i,i+1} is associated. Each job is processed by exactly one vehicle, and the number of jobs processed by a vehicle does not exceed the capacity b. A routing of a vehicle is called one-way if the vehicle visits every edge {v_i, v_{i+1}} exactly once (i.e., it simply moves from v_1 to v_n on G). Any vehicle is assumed to follow the one-way routing constraint. The problem asks to find a schedule that minimizes the number of one-way vehicles, meeting the deadline and capacity constraints. A greedy heuristic is proposed, which repeats a dynamic programming procedure for a single one-way vehicle problem of maximizing the number of non-tardy jobs. We show that the greedy heuristic runs in O(n^3) time, and the approximation ratio is at most ln b + 1.
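
    The structure of the heuristic can be sketched as follows: a dynamic program finds, for a single capacity-b one-way vehicle, a largest set of jobs it can serve without tardiness (by keeping, for each job count, the minimum total handling time of a feasible selection), and a greedy loop repeatedly assigns such a set to a new vehicle until no jobs remain. The input format and this particular DP formulation are a simplified reading of the abstract, not the authors' exact procedure.

      # Sketch of the greedy heuristic: repeat a single-vehicle DP, assign, recurse.
      INF = float("inf")

      def single_vehicle(jobs, travel, b):
          """jobs[i] = (handling_time, deadline) at vertex i, or None if already assigned;
          travel[i] = travel time from v_1 to v_{i+1}.  Returns indices one vehicle can serve."""
          n = len(jobs)
          best = [[INF] * (b + 1) for _ in range(n + 1)]    # min handling sum for (prefix, count)
          pick = [[False] * (b + 1) for _ in range(n + 1)]
          best[0][0] = 0.0
          for i in range(1, n + 1):
              for c in range(b + 1):
                  best[i][c] = best[i - 1][c]
                  if jobs[i - 1] is not None and c > 0:
                      h, d = jobs[i - 1]
                      cand = best[i - 1][c - 1] + h
                      if cand < best[i][c] and travel[i - 1] + cand <= d:
                          best[i][c], pick[i][c] = cand, True
          c = max(k for k in range(b + 1) if best[n][k] < INF)
          served, i = [], n
          while i > 0:
              if pick[i][c]:
                  served.append(i - 1)
                  c -= 1
              i -= 1
          return served

      def greedy_fleet(jobs, travel, b):
          jobs, vehicles = list(jobs), []
          while any(j is not None for j in jobs):
              served = single_vehicle(jobs, travel, b)
              if not served:        # a remaining job can never meet its deadline
                  break
              vehicles.append(sorted(served))
              for i in served:
                  jobs[i] = None
          return vehicles

      jobs = [(2, 5), (3, 12), (1, 9), (2, 20)]   # (handling time, deadline) at v_1..v_4
      travel = [0, 3, 5, 8]                       # cumulative travel time from v_1
      print(greedy_fleet(jobs, travel, b=2))      # two one-way vehicles suffice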

  3. 21 CFR 113.100 - Processing and production records.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... products, maximum fill-in or drained weight, or other critical factors specified in the scheduled process... end of the holding period; nature of container. (6) Food preservation methods wherein critical factors... scheduled processes used, including the thermal process, its associated critical factors, as well as other...

  4. 21 CFR 113.100 - Processing and production records.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... products, maximum fill-in or drained weight, or other critical factors specified in the scheduled process... end of the holding period; nature of container. (6) Food preservation methods wherein critical factors... scheduled processes used, including the thermal process, its associated critical factors, as well as other...

  5. 21 CFR 113.100 - Processing and production records.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... products, maximum fill-in or drained weight, or other critical factors specified in the scheduled process... end of the holding period; nature of container. (6) Food preservation methods wherein critical factors... scheduled processes used, including the thermal process, its associated critical factors, as well as other...

  6. PHENIX Work Breakdown Structure. Cost and schedule review copy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-02-01

    The Work Breakdown Structure (WBS) Book begins with this Overview section, which contains the high-level summary cost estimate, the cost profile, and the global construction schedule. The summary cost estimate shows the total US cost and the cost in terms of PHENIX construction funds for building the PHENIX detector. All costs in the WBS book are shown in FY 1993 dollars. Also shown are the institutional and foreign contributions, the level of pre-operations funding, and the cost of deferred items. Pie charts are presented at PHENIX WBS level 1 and 2 that show this information. The PHENIX construction funds are shown broken down to PHENIX WBS level 3 items per fiscal year, and the resulting profile is compared to the RHIC target profile. An accumulated difference of the two profiles is also shown. The PHENIX global construction schedule is presented at the end of the Overview section. Following the Overview are sections for each subsystem. Each subsystem section begins with a summary cost estimate, cost profile, and critical path. The total level 3 cost is broken down into fixed costs (M&S), engineering costs (EDIA) and labor costs. Costs are further broken down in terms of PHENIX construction funds, institutional and foreign contributions, pre-operations funding, and deferred items. Also shown is the contingency at level 3 and the level 4 breakdown of the total cost. The cost profile in fiscal years is shown at level 3. The subsystem summaries are followed by the full cost estimate and schedule sheets for that subsystem. These detailed sheets are typically carried down to level 7 or 8. The cost estimate shows Total, M&S, EDIA, and Labor breakdowns, as well as contingency, for each WBS entry.

  7. Human Factors and Modeling Methods in the Development of Control Room Modernization Concepts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hugo, Jacques V.; Slay III, Lorenzo

    nuclear power plants. Although the nuclear industry has made steady improvement in outage optimization, each day of a refueling outage still represents an opportunity to save millions of dollars and each day an outage extends past its planned end date represents millions of dollars that may have been spent unnecessarily. Reducing planned outage duration or preventing outage extensions requires careful management of the outage schedule as well as constant oversight and monitoring of work completion during the outage execution. During a typical outage, there are more than 10,000 activities on the schedule that, if not managed efficiently, may cause expensive outage delays. Management of outages currently relies largely on paper-based resources and general-purpose office software. A typical tool currently used to monitor work performance is a burn-down curve, where total remaining activities are plotted against the baseline schedule to track bulk work completion progress. While these tools are useful, there is still considerable uncertainty during a typical outage that bulk work progress is adequate and therefore a lot of management time is spent analyzing the situation on a daily basis. This paper describes recent advances made in developing a framework for the design of visual outage information presentation, as well as an overview of the scientific principles that informed the development of the visualizations. To test the utility of advanced visual outage information presentation, an outage management dashboard software application was created as part of the Department of Energy’s Advanced Outage Control Center project. This dashboard is intended to present all the critical information an outage manager would need to understand the current status of a refueling outage. The dashboard presents the critical path, bulk work performance, key performance indicators, outage milestones and metrics relating current performance to historical performance. Additionally, the dashboard includes data analysis tools to allow outage managers to drill down into the underlying data to understand the drivers of the indicators.

  8. Model Checking Real Time Java Using Java PathFinder

    NASA Technical Reports Server (NTRS)

    Lindstrom, Gary; Mehlitz, Peter C.; Visser, Willem

    2005-01-01

    The Real Time Specification for Java (RTSJ) is an augmentation of Java for real time applications of various degrees of hardness. The central features of RTSJ are real time threads; user defined schedulers; asynchronous events, handlers, and control transfers; a priority inheritance based default scheduler; non-heap memory areas such as immortal and scoped, and non-heap real time threads whose execution is not impeded by garbage collection. The Robust Software Systems group at NASA Ames Research Center has JAVA PATHFINDER (JPF), a Java model checker, under development. JPF at its core is a state-exploring JVM which can examine alternative paths in a Java program (e.g., via backtracking) by trying all nondeterministic choices, including thread scheduling order. This paper describes our implementation of an RTSJ profile (subset) in JPF, including requirements, design decisions, and current implementation status. Two examples are analyzed: jobs on a multiprogramming operating system, and a complex resource contention example involving autonomous vehicles crossing an intersection. The utility of JPF in finding logic and timing errors is illustrated, and the remaining challenges in supporting all of RTSJ are assessed.

  9. Alarm guided critical function and success path monitoring

    DOEpatents

    Scarola, Kenneth; Jamison, David S.; Manazir, Richard M.; Rescorl, Robert L.; Harmon, Daryl L.

    1994-01-01

    Alarm indications on the overview (IPSO) display are used to initiate diagnosis of challenges to critical functions or unavailability of success paths, with further alarm-based guidance toward the ultimate diagnosis.

  10. Path Flow Estimation Using Time Varying Coefficient State Space Model

    NASA Astrophysics Data System (ADS)

    Jou, Yow-Jen; Lan, Chien-Lun

    2009-08-01

    Dynamic path flow information is crucial in transportation operation and management, e.g., dynamic traffic assignment, scheduling plans, and signal timing. Time-dependent path information, although important in many respects, is nearly impossible to obtain directly. Consequently, researchers have been seeking estimation methods for deriving valuable path flow information from less expensive traffic data, primarily link traffic counts from surveillance systems. This investigation considers a path flow estimation problem involving a time-varying coefficient state space model, the Gibbs sampler, and the Kalman filter. Numerical examples using part of a real Taipei Mass Rapid Transit network with real O-D matrices are presented to demonstrate the accuracy of the proposed model. The results show that the time-varying coefficient state space model is much more effective in estimating path flow than a time-invariant model.
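
    The Kalman-filter component of such an approach can be sketched for the linear-Gaussian case, with the state being the vector of path flows, a random-walk transition, and link counts observed through a path-link incidence matrix. The matrices, noise levels, and data below are invented, and the time-varying coefficients and Gibbs sampling of the paper are not reproduced.

      # Minimal Kalman filter sketch for path flow estimation from link counts.
      import numpy as np

      H = np.array([[1.0, 0.0, 1.0],    # link 1 is traversed by paths 1 and 3
                    [0.0, 1.0, 1.0]])   # link 2 is traversed by paths 2 and 3
      Q = 4.0 * np.eye(3)               # process noise: path flows follow a random walk
      R = 1.0 * np.eye(2)               # measurement noise on link counts

      def kalman_step(x, P, y):
          x_pred, P_pred = x, P + Q                      # predict (identity transition)
          S = H @ P_pred @ H.T + R                       # innovation covariance
          K = P_pred @ H.T @ np.linalg.inv(S)            # Kalman gain
          x_new = x_pred + K @ (y - H @ x_pred)          # update with observed link counts
          P_new = (np.eye(len(x)) - K @ H) @ P_pred
          return x_new, P_new

      x, P = np.full(3, 50.0), 100.0 * np.eye(3)         # initial guess of the three path flows
      for y in [np.array([118.0, 131.0]), np.array([122.0, 127.0])]:
          x, P = kalman_step(x, P, y)
          print(np.round(x, 1))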

  11. Design Change Model for Effective Scheduling Change Propagation Paths

    NASA Astrophysics Data System (ADS)

    Zhang, Hai-Zhu; Ding, Guo-Fu; Li, Rong; Qin, Sheng-Feng; Yan, Kai-Yin

    2017-09-01

    Changes in requirements may increase product development project cost and lead time; it is therefore important to understand how requirement changes propagate in the design of complex product systems and to be able to select the best options to guide the design. Most current approaches to design change fail to take the multi-disciplinary coupling relationships and the number of parameters into account in an integrated way. A new design change model is presented to systematically analyze and search change propagation paths. First, a PDS-Behavior-Structure-based design change model is established to describe how requirement changes cause design change propagation in the behavior and structure domains. Second, a multi-disciplinary behavior matrix is used to support change propagation analysis of complex product systems, and the interaction relationships among the matrix elements are used to obtain an initial set of change paths. Finally, a rough set-based propagation space reduction tool is developed to help narrow the change propagation paths by computing the importance of the design change parameters. The proposed design change model and its associated tools are demonstrated by scheduling the change propagation paths of a high-speed train's bogie, showing their feasibility and effectiveness. The model not only supports responding quickly to diversified market requirements, but also helps satisfy customer requirements and reduce product development lead time. It can be applied to a wide range of engineering system designs with improved efficiency.
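
    The path-search step can be illustrated with a small coupling matrix: treating the behavior matrix as a directed graph, a depth-first search enumerates the acyclic propagation paths that start from a changed parameter. The parameter names and coupling entries below are invented, and the rough-set reduction step of the paper is not included.

      # Illustrative enumeration of change propagation paths over a coupling matrix.
      params = ["wheelbase", "bogie_frame", "suspension_stiffness", "damper"]
      # coupling[i][j] = 1 means a change in params[i] may propagate to params[j]
      coupling = [[0, 1, 1, 0],
                  [0, 0, 1, 1],
                  [0, 0, 0, 1],
                  [0, 0, 0, 0]]

      def propagation_paths(source):
          """Depth-first enumeration of all maximal acyclic propagation paths from source."""
          paths = []
          def dfs(node, path):
              extended = False
              for nxt in range(len(params)):
                  if coupling[node][nxt] and params[nxt] not in path:
                      extended = True
                      dfs(nxt, path + [params[nxt]])
              if not extended and len(path) > 1:
                  paths.append(path)
          dfs(params.index(source), [source])
          return paths

      for p in propagation_paths("wheelbase"):
          print(" -> ".join(p))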

  12. Women's Career Development Patterns.

    ERIC Educational Resources Information Center

    Schreiber, Pamela J.

    1998-01-01

    Women's career development is characterized by balance of work and family, career interruptions, and diverse career paths. Alternative work arrangements such as flexible schedules, telecommuting, and entrepreneurial opportunities may offer women more options for work. (SK)

  13. Functional integration of vertical flight path and speed control using energy principles

    NASA Technical Reports Server (NTRS)

    Lambregts, A. A.

    1984-01-01

    A generalized automatic flight control system was developed which integrates all longitudinal flight path and speed control functions previously provided by a pitch autopilot and autothrottle. In this design, a net thrust command is computed based on total energy demand arising from both flight path and speed targets. The elevator command is computed based on the energy distribution error between flight path and speed. The engine control is configured to produce the commanded net thrust. The design incorporates control strategies and hierarchy to deal systematically and effectively with all aircraft operational requirements, control nonlinearities, and performance limits. Consistent decoupled maneuver control is achieved for all modes and flight conditions without outer loop gain schedules, control law submodes, or control function duplication.
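
    The split between throttle and elevator can be sketched in terms of specific-energy rates: the flight path angle error and the normalized acceleration error are summed to form a total energy rate error (handled by thrust) and differenced to form an energy distribution error (handled by the elevator). The gains, signs, and trim handling below are placeholders, not the control law of the paper.

      # Hedged sketch of energy-rate-based longitudinal control (illustrative gains).
      G = 9.81  # m/s^2

      def energy_control(gamma_cmd, gamma, accel_cmd, accel, thrust_trim,
                         k_thrust=2.0, k_elevator=0.5):
          """gamma: flight path angle (rad); accel: longitudinal acceleration (m/s^2)."""
          gamma_err = gamma_cmd - gamma
          accel_err = (accel_cmd - accel) / G
          energy_rate_err = gamma_err + accel_err        # total specific energy rate error
          distribution_err = gamma_err - accel_err       # imbalance between path and speed
          thrust_cmd = thrust_trim + k_thrust * energy_rate_err     # throttle sets total energy
          elevator_cmd = -k_elevator * distribution_err             # elevator redistributes it
          return thrust_cmd, elevator_cmd

      # Climb demanded while the aircraft is accelerating: thrust rises modestly and the
      # elevator command trades the excess speed for flight path.
      print(energy_control(gamma_cmd=0.05, gamma=0.0, accel_cmd=0.0, accel=0.3, thrust_trim=0.4))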

  14. International Ultraviolet Explorer Observatory operations

    NASA Technical Reports Server (NTRS)

    1985-01-01

    This volume contains the final report for the International Ultraviolet Explorer (IUE) Observatory Operations contract. The fundamental operational objective of the International Ultraviolet Explorer (IUE) program is to translate competitively selected observing programs into IUE observations, to reduce these observations into meaningful scientific data, and then to present these data to the Guest Observer in a form amenable to the pursuit of scientific research. The IUE Observatory is the key to this objective since it is the central control and support facility for all science operations functions within the IUE Project. In carrying out the operation of this facility, a number of complex functions were provided, beginning with telescope scheduling and operation, proceeding to data processing, and ending with data distribution and scientific data analysis. In support of these critical-path functions, a number of other significant activities were also provided, including scientific instrument calibration, systems analysis, and software support. Routine activities have been summarized briefly whenever possible.

  15. Waste information management system: a web-based system for DOE waste forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geisler, T.J.; Shoffner, P.A.; Upadhyay, U.

    2007-07-01

    The implementation of the Department of Energy (DOE) mandated accelerated cleanup program has created significant potential technical impediments that must be overcome. The schedule compression will require close coordination and a comprehensive review and prioritization of the barriers that may impede treatment and disposition of the waste streams at each site. Many issues related to site waste treatment and disposal have now become potential critical path issues under the accelerated schedules. In order to facilitate accelerated cleanup initiatives, waste managers at DOE field sites and at DOE headquarters in Washington, D.C., need timely waste forecast information regarding the volumes and types of waste that will be generated by DOE sites over the next 25 years. Each local DOE site has historically collected, organized, and displayed site waste forecast information in separate and unique systems. However, waste information from all sites needs a common application to allow interested parties to understand and view the complete complex-wide picture. A common application would allow identification of total waste volumes, material classes, disposition sites, choke points, and technological or regulatory barriers to treatment and disposal. The Applied Research Center (ARC) at Florida International University (FIU) in Miami, Florida, has completed the development of this web-based forecast system. (authors)

  16. K Basins sludge removal temporary sludge storage tank system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mclean, M.A.

    1997-06-12

    Shipment of sludge from the K Basins to a disposal site is now targeted for August 2000. The current path forward for sludge disposal is shipment to Tank AW-105 in the Tank Waste Remediation System (TWRS). Significant issues concerning the feasibility of this path exist, primarily due to criticality concerns and the presence of polychlorinated biphenyls (PCBs) in the sludge at levels that trigger regulation under the Toxic Substances Control Act. Introduction of PCBs into the TWRS processes could potentially involve significant design and operational impacts to both the Spent Nuclear Fuel and TWRS projects if technical and regulatory issues related to PCB treatment cannot be satisfactorily resolved. Concerns about meeting the TWRS acceptance criteria have evolved such that new storage tanks for the K Basins sludge may be the best option for storage prior to vitrification of the sludge. A recommendation for the final disposition of the sludge is scheduled for June 30, 1997. To support this decision process, this project was developed. This project provides a preconceptual design package including preconceptual designs and cost estimates for the temporary sludge storage tanks. Development of cost estimates for the design and construction of sludge storage systems is required to help evaluate a recommendation for the final disposition of the K Basin sludge.

  17. Escalator: An Autonomous Scheduling Scheme for Convergecast in TSCH

    PubMed Central

    Oh, Sukho; Hwang, DongYeop; Kim, Ki-Hyung; Kim, Kangseok

    2018-01-01

    Time Slotted Channel Hopping (TSCH) is widely used in the industrial wireless sensor networks due to its high reliability and energy efficiency. Various timeslot and channel scheduling schemes have been proposed for achieving high reliability and energy efficiency for TSCH networks. Recently proposed autonomous scheduling schemes provide flexible timeslot scheduling based on the routing topology, but do not take into account the network traffic and packet forwarding delays. In this paper, we propose an autonomous scheduling scheme for convergecast in TSCH networks with RPL as a routing protocol, named Escalator. Escalator generates a consecutive timeslot schedule along the packet forwarding path to minimize the packet transmission delay. The schedule is generated autonomously by utilizing only the local routing topology information without any additional signaling with other nodes. The generated schedule is guaranteed to be conflict-free, in that all nodes in the network could transmit packets to the sink in every slotframe cycle. We implement Escalator and evaluate its performance with existing autonomous scheduling schemes through a testbed and simulation. Experimental results show that the proposed Escalator has lower end-to-end delay and higher packet delivery ratio compared to the existing schemes regardless of the network topology. PMID:29659508
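
    The consecutive-slot idea can be sketched as follows: a node at depth d in the routing tree transmits toward its parent in timeslot (MAX_DEPTH - d), so a packet generated at a leaf is forwarded hop by hop in consecutive slots within one slotframe. The fixed MAX_DEPTH, the character-sum channel offset, and the toy topology are simplifying assumptions for the sketch; the actual scheme derives its cells from local RPL information only.

      # Illustrative consecutive-slot assignment along the forwarding path to the sink.
      MAX_DEPTH = 4       # assumed bound on routing-tree depth (a simplification)
      NUM_CHANNELS = 4

      def tx_cell(node_id, depth):
          """Timeslot/channel offset a node uses to forward toward its parent."""
          timeslot = MAX_DEPTH - depth                              # deeper nodes send earlier
          channel = sum(ord(c) for c in node_id) % NUM_CHANNELS     # deterministic offset
          return timeslot, channel

      # parent relation: node -> (parent, depth); a packet from n23 reaches the sink over
      # consecutive timeslots 1, 2, 3 without waiting for the next slotframe.
      topology = {"n23": ("n17", 3), "n17": ("n5", 2), "n5": ("sink", 1)}
      node = "n23"
      while node != "sink":
          parent, depth = topology[node]
          print(f"{node} -> {parent} in cell {tx_cell(node, depth)}")
          node = parent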

  18. Escalator: An Autonomous Scheduling Scheme for Convergecast in TSCH.

    PubMed

    Oh, Sukho; Hwang, DongYeop; Kim, Ki-Hyung; Kim, Kangseok

    2018-04-16

    Time Slotted Channel Hopping (TSCH) is widely used in the industrial wireless sensor networks due to its high reliability and energy efficiency. Various timeslot and channel scheduling schemes have been proposed for achieving high reliability and energy efficiency for TSCH networks. Recently proposed autonomous scheduling schemes provide flexible timeslot scheduling based on the routing topology, but do not take into account the network traffic and packet forwarding delays. In this paper, we propose an autonomous scheduling scheme for convergecast in TSCH networks with RPL as a routing protocol, named Escalator. Escalator generates a consecutive timeslot schedule along the packet forwarding path to minimize the packet transmission delay. The schedule is generated autonomously by utilizing only the local routing topology information without any additional signaling with other nodes. The generated schedule is guaranteed to be conflict-free, in that all nodes in the network could transmit packets to the sink in every slotframe cycle. We implement Escalator and evaluate its performance with existing autonomous scheduling schemes through a testbed and simulation. Experimental results show that the proposed Escalator has lower end-to-end delay and higher packet delivery ratio compared to the existing schemes regardless of the network topology.

  19. 21 CFR 113.100 - Processing and production records.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...-in or drained weight, or other critical factors specified in the scheduled process shall also be... container position; speed of the container conveyor chain; and, when the scheduled process specifies... of container. (6) Food preservation methods wherein critical factors such as water activity are used...

  20. Co-scheduling of network resource provisioning and host-to-host bandwidth reservation on high-performance network and storage systems

    DOEpatents

    Yu, Dantong; Katramatos, Dimitrios; Sim, Alexander; Shoshani, Arie

    2014-04-22

    A cross-domain network resource reservation scheduler configured to schedule a path from at least one end-site includes a management plane device configured to monitor and provide information representing at least one of functionality, performance, faults, and fault recovery associated with a network resource; a control plane device configured to at least one of schedule the network resource, provision local area network quality of service, provision local area network bandwidth, and provision wide area network bandwidth; and a service plane device configured to interface with the control plane device to reserve the network resource based on a reservation request and the information from the management plane device. Corresponding methods and a computer-readable medium are also disclosed.

  1. Car painting process scheduling with harmony search algorithm

    NASA Astrophysics Data System (ADS)

    Syahputra, M. F.; Maiyasya, A.; Purnamawati, S.; Abdullah, D.; Albra, W.; Heikal, M.; Abdurrahman, A.; Khaddafi, M.

    2018-02-01

    Automotive painting programs paint the car body using robotic power, which makes the production system more efficient. The production system becomes even more efficient if attention is paid to the scheduling of the car order, which is done by considering the body shape of each car to be painted. Flow shop scheduling is a scheduling model in which all jobs to be processed flow in the same product direction/path. Scheduling problems arise when there are n jobs to be processed on the machines: it must be specified which job is done first and how jobs are allocated to the machines to obtain a scheduled production process. The Harmony Search Algorithm is a metaheuristic optimization algorithm based on music, inspired by the way musicians search for perfect harmony. This search for musical harmony is analogous to searching for the optimum in an optimization process. Based on the tests performed, the optimal car sequence with the minimum makespan value is obtained.
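
    A compact sketch of harmony search applied to a permutation flow-shop sequence is given below: the harmony memory holds candidate job orders, a new harmony is improvised by memory consideration, pitch adjustment (a swap), or random generation, and it replaces the worst memory entry when its makespan is lower. The processing times and parameter values are invented and are not the data or settings used in the paper.

      # Harmony search for a permutation flow shop (illustrative data and parameters).
      import random

      def makespan(order, proc):
          """proc[job][stage] = processing time; all jobs visit stages in the same order."""
          finish = [0.0] * len(proc[0])
          for job in order:
              for k in range(len(finish)):
                  start = max(finish[k], finish[k - 1] if k else 0.0)
                  finish[k] = start + proc[job][k]
          return finish[-1]

      def harmony_search(proc, memory_size=8, hmcr=0.9, par=0.3, iterations=2000, seed=1):
          random.seed(seed)
          n = len(proc)
          memory = [random.sample(range(n), n) for _ in range(memory_size)]
          for _ in range(iterations):
              if random.random() < hmcr:
                  new = list(random.choice(memory))       # memory consideration
                  if random.random() < par:               # pitch adjustment: swap two jobs
                      i, j = random.sample(range(n), 2)
                      new[i], new[j] = new[j], new[i]
              else:
                  new = random.sample(range(n), n)        # random improvisation
              worst = max(range(memory_size), key=lambda k: makespan(memory[k], proc))
              if makespan(new, proc) < makespan(memory[worst], proc):
                  memory[worst] = new
          return min(memory, key=lambda order: makespan(order, proc))

      proc = [[4, 3, 2], [2, 5, 1], [3, 2, 4], [5, 1, 3]]   # 4 car bodies, 3 painting stages
      best = harmony_search(proc)
      print(best, makespan(best, proc))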

  2. Quantifying tight-gas sandstone permeability via critical path analysis

    USDA-ARS?s Scientific Manuscript database

    Rock permeability has been actively investigated over the past several decades by the geosciences community. However, its accurate estimation still presents significant technical challenges, especially in spatially complex rocks. In this letter, we apply critical path analysis (CPA) to estimate perm...

  3. Implications of path tolerance and path characteristics on critical vehicle manoeuvres

    NASA Astrophysics Data System (ADS)

    Lundahl, K.; Frisk, E.; Nielsen, L.

    2017-12-01

    Path planning and path following are core components in safe autonomous driving. Typically, a path planner provides a path with some tolerance on how tightly the path should be followed. Based on that, and other path characteristics, for example, sharpness of curves, a speed profile needs to be assigned so that the vehicle can stay within the given tolerance without going unnecessarily slowly. Here, such trajectory planning is based on optimal control formulations where critical cases arise as on-the-limit solutions. The study focuses on heavy commercial vehicles, making rollover a major concern due to the relatively high centre of gravity. Several results are obtained on required model complexity depending on path characteristics, for example, quantification of required path tolerance for a simple model to be sufficient, quantification of when yaw inertia needs to be considered in more detail, and how the curvature rate of change interplays with available friction. Overall, in situations where the vehicle is subject to a wide range of driving conditions, from good transport roads to more tricky avoidance manoeuvres, the requirements on path following will vary. For this, the provided results form a basis for real-time path following.

  4. Successful Completion of the JWST OGSE2 Cryogenic Test at JSC Chamber-A While Managing Numerous Challenges

    NASA Technical Reports Server (NTRS)

    Park, Sang C.; Brinckerhoff, Pamela; Franck, Randy; Schweickart, Rusty; Thomson, Shaun; Burt, Bill; Ousley, Wes

    2016-01-01

    The James Webb Space Telescope (JWST) Optical Telescope Element (OTE) assembly is the largest optically stable infrared-optimized telescope currently being manufactured and assembled, and is scheduled for launch in 2018. The JWST OTE, including the primary mirrors, secondary mirror, and the Aft Optics Subsystem (AOS), is designed to be passively cooled and operate at near 45 Kelvin. Due to the size of its large sunshield in relation to existing test facilities, JWST cannot be optically or thermally tested as a complete observatory-level system at flight temperatures. As a result, the telescope portion along with its instrument complement will be tested as a single unit very late in the program, and on the program schedule critical path. To mitigate schedule risks, a set of 'pathfinder' cryogenic tests will be performed to reduce program risks by demonstrating the optical testing capabilities of the facility, characterizing telescope thermal performance, and allowing project personnel to learn valuable testing lessons off-line. This paper describes the 'pathfinder' cryogenic test program, focusing on the recently completed second test in the series, called the Optical Ground Support Equipment 2 (OGSE2) test. The JWST OGSE2 test was successfully completed within the allocated project schedule while faced with numerous conflicting thermal requirements during cool-down to the final cryogenic operational temperatures, and during warm-up after the cryo-stable optical tests. The challenges included developing pre-test cool-down and warm-up profiles without a reliable method to predict the thermal behaviors in a rarefied helium environment, and managing the test article hardware safety driven by the project Limits and Constraints (L&Cs). Furthermore, the OGSE2 test included the time-critical Aft Optics Subsystem (AOS), a part of the flight Optical Telescope Element that would need to be placed back into the overall telescope assembly integration flow. The OGSE2 test requirements included strict adherence to the project contamination controls due to the presence of the contamination-sensitive flight optical elements. The test operations required close coordination of numerous personnel while they were being exposed to and trained for the 'final' combined OTE and instrument cryo-test in 2017. This paper will also encompass the OGSE2 thermal data look-back review.

  5. Quality of service routing in wireless ad hoc networks

    NASA Astrophysics Data System (ADS)

    Sane, Sachin J.; Patcha, Animesh; Mishra, Amitabh

    2003-08-01

    An efficient routing protocol is essential to guarantee application-level quality of service for applications running on wireless ad hoc networks. In this paper we propose a novel routing algorithm that computes a path between a source and a destination by considering several important constraints, such as path life and the availability of sufficient energy and buffer space in each of the nodes on the path between the source and destination. The algorithm chooses the best path from among the multiple paths that it computes between two endpoints. We consider the use of control packets that run at a priority higher than the data packets in determining the multiple paths. The paper also examines the impact of different schedulers, such as weighted fair queuing and weighted random early detection, among others, in preserving the QoS level guarantees. Our extensive simulation results indicate that the algorithm improves the overall lifetime of a network, reduces the number of dropped packets, and decreases the end-to-end delay for real-time voice applications.

  6. The path to an experiment in space (from concept to flight)

    NASA Technical Reports Server (NTRS)

    Salzman, Jack A.

    1994-01-01

    The following are discussed in this viewgraph presentation on developing flight experiments for NASA's Microgravity Science and Applications Program: time from flight PI selection to launch; key flight experiment phases and schedule drivers; microgravity experiment definition/development process; definition and engineering development phase; ground-based reduced gravity research facilities; project organization; responsibilities and duties of principal investigator/co-investigators, project scientist, and project manager; the science requirements document; flight development phase; experiment cost and schedule; and keys to experiment success.

  7. Aircraft Energy Conservation during Airport Ground Operations

    DTIC Science & Technology

    1982-03-01

    ... minimized. The model can be run in a non-optimizing mode to simulate movements along pre-assigned taxi paths. The model is also designed ... (The remainder of this excerpt is table-of-contents residue: engine designations by airline and aircraft type at IAD and DCA, fuel flow rates, CY 1979 aircraft operations at IAD and DCA airports, and 1979 scheduled and non-scheduled departures from IAD by aircraft type.)

  8. Traffic Patrol Service Platform Scheduling and Containment Optimization Strategy

    NASA Astrophysics Data System (ADS)

    Wang, Tiane; Niu, Taiyang; Wan, Baocheng; Li, Jian

    This article addresses the siting and scheduling of traffic and patrol police service platforms, with the main purpose of achieving rapid containment of a suspect in an emergency event. A new boundary definition based on graph theory is proposed, and a containment model is established using 0-1 programming, the Dijkstra algorithm, the shortest path tree (SPT), and related techniques. Finally, the model is combined with data for a specific city to obtain the best containment plan.
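
    A minimal sketch of one of the ingredients named above: the shortest-path-tree computation with Dijkstra's algorithm over a road graph weighted by travel time. The graph data and the idea of sealing every intersection reachable within a time bound are illustrative assumptions, not the paper's containment model.

        import heapq

        def dijkstra_spt(graph, source):
            """graph: {node: [(neighbor, travel_time), ...]}. Returns shortest travel
            times from source and the shortest-path-tree parent of each node."""
            dist, parent = {source: 0.0}, {source: None}
            pq = [(0.0, source)]
            while pq:
                d, u = heapq.heappop(pq)
                if d > dist.get(u, float("inf")):
                    continue                      # stale queue entry
                for v, w in graph.get(u, []):
                    if d + w < dist.get(v, float("inf")):
                        dist[v], parent[v] = d + w, u
                        heapq.heappush(pq, (d + w, v))
            return dist, parent

        # Intersections the suspect could reach within t minutes are candidates to seal.
        def reachable_within(dist, t):
            return {node for node, d in dist.items() if d <= t}

        graph = {"A": [("B", 2.0), ("C", 5.0)], "B": [("C", 1.0), ("D", 4.0)],
                 "C": [("D", 1.0)], "D": []}
        dist, _ = dijkstra_spt(graph, "A")
        print(reachable_within(dist, 3.0))        # {'A', 'B', 'C'}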

  9. Office of Biological and Physical Research: Overview Transitioning to the Vision for Space Exploration

    NASA Technical Reports Server (NTRS)

    Crouch, Roger

    2004-01-01

    Viewgraphs on NASA's transition to its vision for space exploration are presented. The topics include: 1) Strategic Directives Guiding the Human Support Technology Program; 2) Progressive Capabilities; 3) A Journey to Inspire, Innovate, and Discover; 4) Risk Mitigation Status Technology Readiness Level (TRL) and Countermeasures Readiness Level (CRL); 5) Biological And Physical Research Enterprise Aligning With The Vision For U.S. Space Exploration; 6) Critical Path Roadmap Reference Missions; 7) Rating Risks; 8) Current Critical Path Roadmap (Draft) Rating Risks: Human Health; 9) Current Critical Path Roadmap (Draft) Rating Risks: System Performance/Efficiency; 10) Biological And Physical Research Enterprise Efforts to Align With Vision For U.S. Space Exploration; 11) Aligning with the Vision: Exploration Research Areas of Emphasis; 12) Code U Efforts To Align With The Vision For U.S. Space Exploration; 13) Types of Critical Path Roadmap Risks; and 14) ISS Human Support Systems Research, Development, and Demonstration. A summary discussing the vision for U.S. space exploration is also provided.

  10. 9 CFR 381.303 - Critical factors and the application of the process schedule.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    9 Animals and Animal Products, Section 381.303 (Poultry Products Inspection Regulations; Food Safety and Inspection Service): Critical factors and the application of the process schedule.

  11. Circadian rhythms, sleep, and performance in space.

    PubMed

    Mallis, M M; DeRoshia, C W

    2005-06-01

    Maintaining optimal alertness and neurobehavioral functioning during space operations is critical to enable the National Aeronautics and Space Administration's (NASA's) vision "to extend humanity's reach to the Moon, Mars and beyond" to become a reality. Field data have demonstrated that sleep times and performance of crewmembers can be compromised by extended duty days, irregular work schedules, high workload, and varying environmental factors. This paper documents evidence of significant sleep loss and disruption of circadian rhythms in astronauts and associated performance decrements during several space missions, which demonstrates the need to develop effective countermeasures. Both sleep and circadian disruptions have been identified in the Behavioral Health and Performance (BH&P) area and the Advanced Human Support Technology (AHST) area of NASA's Bioastronautics Critical Path Roadmap. Such disruptions could have serious consequences on the effectiveness, health, and safety of astronaut crews, thus reducing the safety margin and increasing the chances of an accident or incident. These decrements oftentimes can be difficult to detect and counter effectively in restrictive operational environments. NASA is focusing research on the development of optimal sleep/wake schedules and countermeasure timing and application to help mitigate the cumulative effects of sleep and circadian disruption and enhance operational performance. Investing research in humans is one of NASA's building blocks that will allow for both short- and long-duration space missions and help NASA in developing approaches to manage and overcome the human limitations of space travel. In addition to reviewing the current state of knowledge concerning sleep and circadian disruptions during space operations, this paper provides an overview of NASA's broad research goals. Also, NASA-funded research, designed to evaluate the relationships between sleep quality, circadian rhythm stability, and performance proficiency in both ground-based simulations and space mission studies, as described in the 2003 NASA Task Book, will be reviewed.

  12. Circadian rhythms, sleep, and performance in space

    NASA Technical Reports Server (NTRS)

    Mallis, M. M.; DeRoshia, C. W.

    2005-01-01

    Maintaining optimal alertness and neurobehavioral functioning during space operations is critical to enable the National Aeronautics and Space Administration's (NASA's) vision "to extend humanity's reach to the Moon, Mars and beyond" to become a reality. Field data have demonstrated that sleep times and performance of crewmembers can be compromised by extended duty days, irregular work schedules, high workload, and varying environmental factors. This paper documents evidence of significant sleep loss and disruption of circadian rhythms in astronauts and associated performance decrements during several space missions, which demonstrates the need to develop effective countermeasures. Both sleep and circadian disruptions have been identified in the Behavioral Health and Performance (BH&P) area and the Advanced Human Support Technology (AHST) area of NASA's Bioastronautics Critical Path Roadmap. Such disruptions could have serious consequences on the effectiveness, health, and safety of astronaut crews, thus reducing the safety margin and increasing the chances of an accident or incident. These decrements oftentimes can be difficult to detect and counter effectively in restrictive operational environments. NASA is focusing research on the development of optimal sleep/wake schedules and countermeasure timing and application to help mitigate the cumulative effects of sleep and circadian disruption and enhance operational performance. Investing research in humans is one of NASA's building blocks that will allow for both short- and long-duration space missions and help NASA in developing approaches to manage and overcome the human limitations of space travel. In addition to reviewing the current state of knowledge concerning sleep and circadian disruptions during space operations, this paper provides an overview of NASA's broad research goals. Also, NASA-funded research, designed to evaluate the relationships between sleep quality, circadian rhythm stability, and performance proficiency in both ground-based simulations and space mission studies, as described in the 2003 NASA Task Book, will be reviewed.

  13. 76 FR 41266 - Critical Path Manufacturing Sector Research Initiative (U01)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-13

    ... under the ``Regulatory Information'' section. The title of the page is ``Research Acquisitions... the Critical Path. Research into methods for laboratory synthesis of molecules that have been designed... accelerated by better design of the facilities where this research is conducted. Creating and making these...

  14. Scheduling Software for Complex Scenarios

    NASA Technical Reports Server (NTRS)

    2006-01-01

    Preparing a vehicle and its payload for a single launch is a complex process that involves thousands of operations. Because the equipment and facilities required to carry out these operations are extremely expensive and limited in number, optimal assignment and efficient use are critically important. Overlapping missions that compete for the same resources, ground rules, safety requirements, and the unique needs of processing vehicles and payloads destined for space impose numerous constraints that, when combined, require advanced scheduling. Traditional scheduling systems use simple algorithms and criteria when selecting activities and assigning resources and times to each activity. Schedules generated by these simple decision rules are, however, frequently far from optimal. To resolve mission-critical scheduling issues and predict possible problem areas, NASA historically relied upon expert human schedulers who used their judgment and experience to determine where things should happen, whether they would happen on time, and whether the requested resources were truly necessary.

  15. Conceptualizing the Critical Path Linked by Teacher Commitment

    ERIC Educational Resources Information Center

    Sun, Jingping

    2015-01-01

    Purpose: The purpose of this paper is to propose a critical path through which school leadership travels to students by highlighting the importance of teacher commitment. Design/methodology/approach: Using both meta-analytic and narrative review methods, this paper systematically reviews the evidence in the past 20 years about the…

  16. THE CRITICAL-PATH METHOD OF CONSTRUCTION CONTROL.

    ERIC Educational Resources Information Center

    DOMBROW, RODGER T.; MAUCHLY, JOHN

    This discussion presents a definition and brief description of the critical-path method as applied to building construction. Introductory remarks consider the most pertinent questions pertaining to CPM and the needs associated with minimizing time and cost on construction projects. Specific discussion includes--(1) advantages of network techniques,…

  17. Assessment of critical path analyses of the relationship between permeability and electrical conductivity of pore networks

    USDA-ARS?s Scientific Manuscript database

    Critical path analysis (CPA) is a method for estimating macroscopic transport coefficients of heterogeneous materials that are highly disordered at the micro-scale. Developed originally to model conduction in semiconductors, numerous researchers have noted that CPA might also have relevance to flow ...

  18. EPA Critical Path Science Plan Projects 19, 20 and 21: Human and Bovine Source Detection

    EPA Science Inventory

    The U.S. EPA Critical Path Science Plan Projects are: Project 19: develop novel bovine and human host-specific PCR assays and complete performance evaluation with other published methods. Project 20: Evaluate human-specific assays with water samples impacted with different lev...

  19. Ensuring critical event sequences in high consequence computer based systems as inspired by path expressions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kidd, M.E.C.

    1997-02-01

    The goal of our work is to provide a high level of confidence that critical software driven event sequences are maintained in the face of hardware failures, malevolent attacks and harsh or unstable operating environments. This will be accomplished by providing dynamic fault management measures directly to the software developer and to their varied development environments. The methodology employed here is inspired by previous work in path expressions. This paper discusses the perceived problems, a brief overview of path expressions, the proposed methods, and a discussion of the differences between the proposed methods and traditional path expression usage and implementation.

  20. Qudas Power Plant Turbine Restoration Project and Qudas Power Plant Expansion Project Baghdad, Iraq. Project and Sustainment Assessment

    DTIC Science & Technology

    2007-10-19

    HGPI); Combustion Inspections (CI); and inspection and scheduled critical maintenance on all auxiliary systems of the four GE Frame 9E units located...Work requirements for a functional end product. 6 services and scheduled critical maintenance on all auxiliary systems of the four GE Frame

  1. The role of critical ethnic awareness and social support in the discrimination-depression relationship among Asian Americans: path analysis.

    PubMed

    Kim, Isok

    2014-01-01

    This study used a path analytic technique to examine associations among critical ethnic awareness, racial discrimination, social support, and depressive symptoms. Using a convenience sample from online survey of Asian American adults (N = 405), the study tested 2 main hypotheses: First, based on the empowerment theory, critical ethnic awareness would be positively associated with racial discrimination experience; and second, based on the social support deterioration model, social support would partially mediate the relationship between racial discrimination and depressive symptoms. The result of the path analysis model showed that the proposed path model was a good fit based on global fit indices, χ²(2) = 4.70, p = .10; root mean square error of approximation = 0.06; comparative fit index = 0.97; Tucker-Lewis index = 0.92; and standardized root mean square residual = 0.03. The examinations of study hypotheses demonstrated that critical ethnic awareness was directly associated (b = .11, p < .05) with the racial discrimination experience, whereas social support had a significant indirect effect (b = .48; bias-corrected 95% confidence interval [0.02, 1.26]) between the racial discrimination experience and depressive symptoms. The proposed path model illustrated that both critical ethnic awareness and social support are important mechanisms for explaining the relationship between racial discrimination and depressive symptoms among this sample of Asian Americans. This study highlights the usefulness of the critical ethnic awareness concept as a way to better understand how Asian Americans might perceive and recognize racial discrimination experiences in relation to its mental health consequences.

  2. Comparative Study of Job Burnout Among Critical Care Nurses With Fixed and Rotating Shift Schedules

    PubMed Central

    Shamali, Mahdi; Shahriari, Mohsen; Babaii, Atye; Abbasinia, Mohammad

    2015-01-01

    Background: Nurses, as health care providers, are unavoidably obliged to practice shift work. The literature has reported shift work as one of the inducing factors of burnout. Despite numerous studies in this area, there are inconsistencies in the reported relationship between shift work and burnout among nurses, especially those who work in critical care settings. Objectives: The aim of this study was to compare occupational burnout in critical care nurses with and without fixed shift schedules. Patients and Methods: In this comparative study, 130 nurses with a rotating shift schedule and 130 nurses with a fixed shift schedule from six university hospitals were selected using stratified random sampling. The Maslach Burnout Inventory was used for data collection. Independent samples t-test, chi-square and one-way ANOVA tests were used to analyze the data. Results: Most of the participants were female (62.7%), aged between 22 - 29 years (38.5%), married (59.2%), and held a bachelor's degree (86.9%). The mean score of emotional exhaustion was significantly higher in nurses with fixed shift schedules (P < 0.001). However, no significant difference was found between the mean scores of the two groups in the personal accomplishment and depersonalization subscales (P > 0.05). Moreover, no significant difference was found in burnout mean scores between nurses with fixed morning and fixed night shifts (P > 0.05). The means of the emotional exhaustion subscale were significantly different in nurses with different characteristics (P < 0.05) except for gender and working unit. Conclusions: This study found that critical care nurses with fixed shift schedules display more burnout in the emotional exhaustion dimension, compared to those working with rotating shift schedules. PMID:26576442

  3. Design and implementation of priority and time-window based traffic scheduling and routing-spectrum allocation mechanism in elastic optical networks

    NASA Astrophysics Data System (ADS)

    Wang, Honghuan; Xing, Fangyuan; Yin, Hongxi; Zhao, Nan; Lian, Bizhan

    2016-02-01

    With the explosive growth of network services, reasonable traffic scheduling and efficient configuration of network resources are of great significance for increasing the efficiency of the network. In this paper, an adaptive traffic scheduling policy based on priority and time window is proposed, and the performance of this algorithm is evaluated in terms of scheduling ratio. The routing and spectrum allocation are achieved by using the Floyd shortest path algorithm and by establishing a node spectrum resource allocation model based on a greedy algorithm, which we propose. A fairness index is introduced to improve the capability of spectrum configuration. The results show that the designed traffic scheduling strategy can be applied to networks with multicast and broadcast functionalities, enabling them to respond efficiently in real time. The node spectrum configuration scheme improves frequency resource utilization and brings out the efficiency of the network.
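
    The routing step mentioned above uses the Floyd (Floyd-Warshall) all-pairs shortest-path algorithm; the sketch below shows that computation on a small adjacency matrix. The greedy node spectrum allocation model and the fairness index from the paper are not reproduced here, and the link costs are made up.

        def floyd_warshall(w):
            """w: n x n matrix of link costs, w[i][i] = 0, float('inf') where no link.
            Returns the matrix of shortest-path costs between every node pair."""
            n = len(w)
            d = [row[:] for row in w]             # work on a copy of the input
            for k in range(n):
                for i in range(n):
                    for j in range(n):
                        if d[i][k] + d[k][j] < d[i][j]:
                            d[i][j] = d[i][k] + d[k][j]
            return d

        INF = float("inf")
        links = [[0, 1, 4, INF],
                 [1, 0, 2, 6],
                 [4, 2, 0, 3],
                 [INF, 6, 3, 0]]
        print(floyd_warshall(links)[0][3])        # 6: node 0 -> 1 -> 2 -> 3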

  4. Cooperative Scheduling of Imaging Observation Tasks for High-Altitude Airships Based on Propagation Algorithm

    PubMed Central

    Chuan, He; Dishan, Qiu; Jin, Liu

    2012-01-01

    The cooperative scheduling problem on high-altitude airships for imaging observation tasks is discussed. A constraint programming model is established by analyzing the main constraints, taking the maximum task benefit and the minimum cruising distance as two optimization objectives. The cooperative scheduling problem of high-altitude airships is converted into a main problem and a subproblem by adopting a hierarchical architecture. The solution to the main problem constructs the preliminary matching between tasks and observation resources in order to reduce the search space of the original problem. Furthermore, the solution to the subproblem detects the key nodes that each airship needs to fly through in sequence, so as to obtain the cruising path. Firstly, the task set is divided using the k-core neighborhood growth cluster algorithm (K-NGCA). Then, a novel swarm intelligence algorithm named the propagation algorithm (PA) is combined with the key node search algorithm (KNSA) to optimize the cruising path of each airship and determine the execution time interval of each task. Meanwhile, this paper also provides the realization approach of the above algorithm and gives a detailed introduction to the encoding rules, search models, and propagation mechanism of the PA. Finally, the application results and comparison analysis show that the proposed models and algorithms are effective and feasible. PMID:23365522

  5. Network models for solving the problem of multicriterial adaptive optimization of investment projects control with several acceptable technologies

    NASA Astrophysics Data System (ADS)

    Shorikov, A. F.; Butsenko, E. V.

    2017-10-01

    This paper discusses the problem of multicriteria adaptive optimization of the control of investment projects in the presence of several technologies. On the basis of network modeling, a new economic-mathematical model and a method for solving this problem are proposed. Network economic-mathematical modeling makes it possible to determine the optimal time and calendar schedule for implementing an investment project and serves as an instrument for increasing the economic potential and competitiveness of the enterprise. A practical example illustrates the process of forming the network models, including the definition of the sequence of actions of a particular investment-planning process, and the construction of network-based work schedules. The parameters of the network models are calculated. Optimal (critical) paths are formed and the optimal time for implementing the chosen technologies of the investment project is calculated. The selection of the optimal technology from a set of possible technologies for project implementation, taking into account the time and cost of the work, is also shown. The proposed model and method for solving the problem of managing investment projects can serve as a basis for the development, creation, and application of computer information systems to support managerial decision making.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The Work Breakdown Structure (WBS) Book begins with this Overview section, which contains the high-level summary cost estimate, the cost profile, and the global construction schedule. The summary cost estimate shows the total US cost and the cost in terms of PHENIX construction funds for building the PHENIX detector. All costs in the WBS book are shown in FY 1993 dollars. Also shown are the institutional and foreign contributions, the level of pre-operations funding, and the cost of deferred items. Pie charts are presented at PHENIX WBS levels 1 and 2 that show this information. The PHENIX construction funds are shown broken down to PHENIX WBS level 3 items per fiscal year, and the resulting profile is compared to the RHIC target profile. An accumulated difference of the two profiles is also shown. The PHENIX global construction schedule is presented at the end of the Overview section. Following the Overview are sections for each subsystem. Each subsystem section begins with a summary cost estimate, cost profile, and critical path. The total level 3 cost is broken down into fixed costs (M&S), engineering costs (EDIA) and labor costs. Costs are further broken down in terms of PHENIX construction funds, institutional and foreign contributions, pre-operations funding, and deferred items. Also shown is the contingency at level 3 and the level 4 breakdown of the total cost. The cost profile in fiscal years is shown at level 3. The subsystem summaries are followed by the full cost estimate and schedule sheets for that subsystem. These detailed sheets are typically carried down to level 7 or 8. The cost estimate sheets give the Total, M&S, EDIA, and Labor breakdowns, as well as contingency, for each WBS entry.

  7. Waste Information Management System with 2012-13 Waste Streams - 13095

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Upadhyay, H.; Quintero, W.; Lagos, L.

    2013-07-01

    The Waste Information Management System (WIMS) 2012-13 was updated to support the Department of Energy (DOE) accelerated cleanup program. The schedule compression required close coordination and a comprehensive review and prioritization of the barriers that impeded treatment and disposition of the waste streams at each site. Many issues related to waste treatment and disposal were potential critical path issues under the accelerated schedule. In order to facilitate accelerated cleanup initiatives, waste managers at DOE field sites and at DOE Headquarters in Washington, D.C., needed timely waste forecast and transportation information regarding the volumes and types of radioactive waste that would be generated by DOE sites over the next 40 years. Each local DOE site historically collected, organized, and displayed waste forecast information in separate and unique systems. In order for interested parties to understand and view the complete DOE complex-wide picture, the radioactive waste and shipment information of each DOE site needed to be entered into a common application. The WIMS application was therefore created to serve as a common application to improve stakeholder comprehension and improve DOE radioactive waste treatment and disposal planning and scheduling. WIMS allows identification of total forecasted waste volumes, material classes, disposition sites, choke points, technological or regulatory barriers to treatment and disposal, along with forecasted waste transportation information by rail, truck and inter-modal shipments. The Applied Research Center (ARC) at Florida International University (FIU) in Miami, Florida, developed and deployed the web-based forecast and transportation system and is responsible for updating the radioactive waste forecast and transportation data on a regular basis to ensure the long-term viability and value of this system. (authors)

  8. Estimation of distribution algorithm with path relinking for the blocking flow-shop scheduling problem

    NASA Astrophysics Data System (ADS)

    Shao, Zhongshi; Pi, Dechang; Shao, Weishi

    2018-05-01

    This article presents an effective estimation of distribution algorithm, named P-EDA, to solve the blocking flow-shop scheduling problem (BFSP) with the makespan criterion. In the P-EDA, a Nawaz-Enscore-Ham (NEH)-based heuristic and the random method are combined to generate the initial population. Based on several superior individuals provided by a modified linear rank selection, a probabilistic model is constructed to describe the probabilistic distribution of the promising solution space. The path relinking technique is incorporated into EDA to avoid blindness of the search and improve the convergence property. A modified referenced local search is designed to enhance the local exploitation. Moreover, a diversity-maintaining scheme is introduced into EDA to avoid deterioration of the population. Finally, the parameters of the proposed P-EDA are calibrated using a design of experiments approach. Simulation results and comparisons with some well-performing algorithms demonstrate the effectiveness of the P-EDA for solving BFSP.
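
    Of the components listed above, the NEH heuristic used to seed the initial population is simple enough to sketch. The version below builds an NEH sequence against the classic permutation flow-shop makespan; the blocking variant treated in the paper, where a finished job must wait on its machine until the next one is free, is not modeled, and the processing times are invented.

        def makespan(seq, p):
            """Permutation flow-shop makespan; p[j][m] = time of job j on machine m."""
            comp = [0.0] * len(p[0])              # completion time on each machine
            for j in seq:
                comp[0] += p[j][0]
                for k in range(1, len(comp)):
                    comp[k] = max(comp[k], comp[k - 1]) + p[j][k]
            return comp[-1]

        def neh(p):
            order = sorted(range(len(p)), key=lambda j: -sum(p[j]))  # longest jobs first
            seq = [order[0]]
            for j in order[1:]:
                # try inserting job j at every position and keep the best partial sequence
                seq = min((seq[:i] + [j] + seq[i:] for i in range(len(seq) + 1)),
                          key=lambda s: makespan(s, p))
            return seq, makespan(seq, p)

        p = [[5, 9, 8], [9, 3, 10], [9, 4, 5], [4, 8, 8]]   # 4 jobs x 3 machines
        print(neh(p))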

  9. Class start times, sleep, and academic performance in college: a path analysis.

    PubMed

    Onyper, Serge V; Thacher, Pamela V; Gilbert, Jack W; Gradess, Samuel G

    2012-04-01

    Path analysis was used to examine the relationship between class start times, sleep, circadian preference, and academic performance in college-aged adults. Consistent with observations in middle and high school students, college students with later class start times slept longer, experienced less daytime sleepiness, and were less likely to miss class. Chronotype was an important moderator of sleep schedules and daytime functioning; those with morning preference went to bed and woke up earlier and functioned better throughout the day. The benefits of taking later classes did not extend to academic performance, however; grades were somewhat lower in students with predominantly late class schedules. Furthermore, students taking later classes were at greater risk for increased alcohol consumption, and among all the factors affecting academic performance, alcohol misuse exerted the strongest effect. Thus, these results indicate that later class start times in college, while allowing for more sleep, also increase the likelihood of alcohol misuse, ultimately impeding academic success.

  10. Time-Critical Cooperative Path Following of Multiple UAVs: Case Studies

    DTIC Science & Technology

    2012-10-30

    control algorithm for UAVs in 3D space. Section IV derives a strategy for time-critical cooperative path following of multiple UAVs that relies on the...UAVs in 3D space, in which a fleet of UAVs is tasked to converge to and follow a set of desired feasible paths so as to meet spatial and temporal...cooperative trajectory generation is not addressed in this paper. In fact, it is assumed that a set of desired 3D time trajectories p_d,i(t_d): R → R^3

  11. Developing algorithm for the critical care physician scheduling

    NASA Astrophysics Data System (ADS)

    Lee, Hyojun; Pah, Adam; Amaral, Luis; Northwestern Memorial Hospital Collaboration

    Understanding social networks has enabled us to quantitatively study social phenomena such as behaviors in the adoption and propagation of information. However, most work has focused on networks of large heterogeneous communities, and little attention has been paid to how work-relevant information spreads within networks of small and homogeneous groups of highly trained individuals, such as physicians. Among such professionals, behavior patterns and the transmission of job-relevant information depend not only on the social network between the employees but also on the schedules and the teams that work together. In order to systematically investigate how the spread of ideas and the adoption of innovations depend on a work-environment network, we sought to construct a model of the interaction network of critical care physicians at Northwestern Memorial Hospital (NMH) based on their work schedules. We inferred patterns and hidden rules, such as turnover rates, from past work schedules. Using the characteristics of the physicians' work schedules and their turnover rates, we were able to create multi-year synthetic work schedules for a generic intensive care unit. The algorithm for creating shift schedules can be applied to other schedule-dependent networks.

  12. Optimization and planning of operating theatre activities: an original definition of pathways and process modeling.

    PubMed

    Barbagallo, Simone; Corradi, Luca; de Ville de Goyet, Jean; Iannucci, Marina; Porro, Ivan; Rosso, Nicola; Tanfani, Elena; Testi, Angela

    2015-05-17

    The Operating Room (OR) is a key resource of all major hospitals, but it also accounts for up to 40% of resource costs. Improving cost effectiveness while maintaining quality of care is a universal objective. These goals imply optimizing the planning and scheduling of the activities involved. This is highly challenging due to the inherently variable and unpredictable nature of surgery. A Business Process Modeling Notation (BPMN 2.0) was used for the representation of the "OR Process" (defined as the sequence of all of the elementary steps from "patient ready for surgery" to "patient operated upon") as a general pathway ("path"). The path was then standardized as much as possible while keeping all of the key elements that allow one to address or define the other steps of planning, as well as the inherently wide variability in terms of patient specificity. The path was used to schedule OR activity, room-by-room and day-by-day, feeding the process from a "waiting list database" and using a mathematical optimization model with the objective of producing an optimized plan. The OR process was defined with special attention paid to flows, timing and resource involvement. Standardization was treated as a dynamic operation and defined an expected operating time for each operation. The optimization model has been implemented and tested on real clinical data. Comparison of the reported results with the real data shows that using the optimization model allows about 30% more patients to be scheduled than in actual practice, and better exploits OR efficiency, increasing the average operating room utilization rate by up to 20%. The optimization of OR activity planning is essential in order to manage the hospital's waiting list. Optimal planning is facilitated by defining the operation as a standard pathway where all variables are taken into account. By allowing precise scheduling, it feeds the planning process and, further upstream, the management of the waiting list in an interactive and bi-directional dynamic process.
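
    A heavily simplified sketch of the scheduling idea: waiting-list cases, each with a standardized expected operating time, are packed into OR sessions of fixed length, higher-priority cases first. This first-fit packing is only an illustration of feeding a plan from a waiting-list database; the paper's actual optimization model and BPMN path are not reproduced, and all numbers are invented.

        def plan_sessions(waiting_list, session_minutes, n_sessions):
            """waiting_list: (case_id, expected_minutes, priority) tuples, higher
            priority scheduled first. Returns per-session case lists and the overflow."""
            sessions = [{"left": session_minutes, "cases": []} for _ in range(n_sessions)]
            unscheduled = []
            for case_id, minutes, _ in sorted(waiting_list, key=lambda c: -c[2]):
                for s in sessions:                # first session with enough time left
                    if minutes <= s["left"]:
                        s["cases"].append(case_id)
                        s["left"] -= minutes
                        break
                else:
                    unscheduled.append(case_id)
            return [s["cases"] for s in sessions], unscheduled

        cases = [("c1", 120, 3), ("c2", 90, 2), ("c3", 200, 5), ("c4", 60, 1)]
        print(plan_sessions(cases, session_minutes=240, n_sessions=2))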

  13. Economic Operation of Supercritical CO2 Refrigeration Energy Storage Technology

    NASA Astrophysics Data System (ADS)

    Hay, Ryan

    With increasing penetration of intermittent renewable energy resources, improved methods of energy storage are becoming a crucial stepping stone on the path toward a smarter, greener grid. SuperCritical Technologies is a company based in Bremerton, WA, that is developing a storage technology that can operate entirely on waste heat, a resource that is otherwise released into the environment. The following research models this storage technology in several electricity spot markets around the US to determine if it is economically viable. A modification to the storage dispatch scheme is then presented which allows the storage unit to increase its profit in real-time markets by taking advantage of extreme price fluctuations. Next, the technology is modeled in combination with an industrial load profile on two different utility rate schedules to determine potential cost savings. The forecast of facility load has a significant impact on savings from the storage dispatch, so an exploration of this relationship is then presented.

  14. Study of short haul high-density V/STOL transportation systems. Volume 2: Appendices

    NASA Technical Reports Server (NTRS)

    Solomon, H. L.

    1972-01-01

    Essential supporting data to the short haul transportation study are presented. The specific appendices are arena characteristics, aerospace transportation analysis computer program, economics, model calibration, STOLport siting and services path selection, STOL schedule definition, tabulated California corridor results, and tabulated Midwest arena results.

  15. Adaptive critics for dynamic optimization.

    PubMed

    Kulkarni, Raghavendra V; Venayagamoorthy, Ganesh Kumar

    2010-06-01

    A novel action-dependent adaptive critic design (ACD) is developed for dynamic optimization. The proposed combination of a particle swarm optimization-based actor and a neural network critic is demonstrated through dynamic sleep scheduling of wireless sensor motes for wildlife monitoring. The objective of the sleep scheduler is to dynamically adapt the sleep duration to the node's battery capacity and the movement pattern of animals in its environment in order to obtain uniformly spaced snapshots of an animal along its trajectory. Simulation results show that the sleep time of the node determined by the actor-critic yields superior quality of sensory data acquisition and enhanced node longevity. Copyright 2010 Elsevier Ltd. All rights reserved.

  16. An Optimal Schedule for Urban Road Network Repair Based on the Greedy Algorithm

    PubMed Central

    Lu, Guangquan; Xiong, Ying; Wang, Yunpeng

    2016-01-01

    The schedule of urban road network recovery after rainstorms, snow and other bad weather conditions, traffic incidents, and other daily events is essential. However, limited studies have been conducted to investigate this problem. We fill this research gap by proposing an optimal schedule for urban road network repair with limited repair resources based on the greedy algorithm. Critical links are given priority in repair according to the basic concept of the greedy algorithm. In this study, the critical link for the current network is defined as the link whose restoration minimizes the ratio of the system-wide travel time of the current network to that of the worst-case network. We re-evaluate the importance of the damaged links after each repair step; that is, the critical link ranking changes along with the repair process because of the interaction among links. We repair the most critical link for the specific network state based on the greedy algorithm to obtain the optimal schedule. The algorithm can still quickly obtain a schedule even if the scale of the road network is large, because the greedy algorithm reduces computational complexity. We prove in theory that the problem can be solved optimally using the greedy algorithm. The algorithm is also demonstrated on the Sioux Falls network. The problem discussed in this paper is highly significant in dealing with urban road network restoration. PMID:27768732
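
    A minimal sketch of the greedy loop described above. The system-wide travel-time evaluator is a placeholder (in the paper it comes from evaluating the road network under each repair state), so the code only illustrates the "repair the currently most critical link, then re-rank" structure and the ratio-based criticality criterion.

        def greedy_repair_schedule(damaged_links, travel_time):
            """damaged_links: iterable of link ids; travel_time(broken_set) returns the
            system-wide travel time with those links out. Returns the repair order."""
            broken = set(damaged_links)
            worst = travel_time(broken)           # travel time of the fully damaged network
            order = []
            while broken:
                # critical link: its restoration minimizes the ratio of current
                # system-wide travel time to that of the worst network
                link = min(broken, key=lambda e: travel_time(broken - {e}) / worst)
                order.append(link)
                broken.remove(link)               # remaining links are re-ranked next pass
            return order

        # toy evaluator: each broken link adds an assumed delay to a 100-unit baseline
        delays = {"e1": 40.0, "e2": 25.0, "e3": 10.0}
        tt = lambda broken: 100.0 + sum(delays[e] for e in broken)
        print(greedy_repair_schedule(delays, tt))  # ['e1', 'e2', 'e3']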

  17. Scheduling Randomly-Deployed Heterogeneous Video Sensor Nodes for Reduced Intrusion Detection Time

    NASA Astrophysics Data System (ADS)

    Pham, Congduc

    This paper proposes to use video sensor nodes to provide an efficient intrusion detection system. We use a scheduling mechanism that takes into account the criticality of the surveillance application and present a performance study of various cover set construction strategies that take into account cameras with heterogeneous angle of view and those with very small angle of view. We show by simulation how a dynamic criticality management scheme can provide fast event detection for mission-critical surveillance applications by increasing the network lifetime and providing low stealth time of intrusions.

  18. SEPS mission and system integration/interface requirements for the space transportation system. [Solar Electric Propulsion System

    NASA Technical Reports Server (NTRS)

    Cork, M. J.; Barnett, P. M.; Shaffer, J., Jr.; Doran, B. J.

    1979-01-01

    Earth escape mission requirements on the Solar Electric Propulsion System (SEPS) are described, along with the interface definition and planned integration between SEPS, the user spacecraft, and other elements of the STS. Emphasis is placed on the Comet rendezvous mission, scheduled to be the first SEPS user. The interactive SEPS interface characteristics with the spacecraft and mission, as well as the multiple organizations and inter-related development schedules required to integrate the SEPS with the spacecraft and STS, require early attention to the definition of interfaces in order to assure a successful path to the first SEPS launch in July 1985.

  19. How to Compute a Slot Marker - Calculation of Controller Managed Spacing Tools for Efficient Descents with Precision Scheduling

    NASA Technical Reports Server (NTRS)

    Prevot, Thomas

    2012-01-01

    This paper describes the underlying principles and algorithms for computing the primary controller managed spacing (CMS) tools developed at NASA for precisely spacing aircraft along efficient descent paths. The trajectory-based CMS tools include slot markers, delay indications and speed advisories. These tools are one of three core NASA technologies integrated in NASA's ATM Technology Demonstration-1 (ATD-1), which will operationally demonstrate the feasibility of fuel-efficient, high-throughput arrival operations using Automatic Dependent Surveillance Broadcast (ADS-B) and ground-based and airborne NASA technologies for precision scheduling and spacing.

  20. The MICRO-BOSS scheduling system: Current status and future efforts

    NASA Technical Reports Server (NTRS)

    Sadeh, Norman M.

    1993-01-01

    In this paper, a micro-opportunistic approach to factory scheduling was described that closely monitors the evolution of bottlenecks during the construction of the schedule, and continuously redirects search towards the bottleneck that appears to be most critical. This approach differs from earlier opportunistic approaches, as it does not require scheduling large resource subproblems or large job subproblems before revising the current scheduling strategy. This micro-opportunistic approach was implemented in the context of the MICRO-BOSS factory scheduling system. A study comparing MICRO-BOSS against a macro-opportunistic scheduler suggests that the additional flexibility of the micro-opportunistic approach to scheduling generally yields important reductions in both tardiness and inventory.

  1. The Value of Weather Forecast in Irrigation

    NASA Astrophysics Data System (ADS)

    Cai, X.; Wang, D.

    2007-12-01

    This paper studies irrigation scheduling (when and how much water to apply during the crop growth season) in the Havana Lowlands region, Illinois, using meteorological, agronomic and agricultural production data from 2002. Irrigation scheduling determines the timing and amount of water applied to an irrigated cropland during the crop growing season. In this study a hydrologic-agronomic simulation is coupled with an optimization algorithm to search for the optimal irrigation schedule under various weather forecast horizons. The economic profit of irrigated corn from an optimized schedule is compared to that from the actual schedule, which is adopted from a previous study. Extended and reliable climate prediction and weather forecasts are found to be significantly valuable. If a weather forecast horizon is long enough to include the critical crop growth stage, in which crop yield bears the maximum loss over all stages, much economic loss can be avoided. Climate predictions of one to two months, which can cover the critical period, might be even more beneficial during a dry year. The other purpose of this paper is to analyze farmers' behavior in irrigation scheduling by comparing the "actual" schedule to the "optimized" ones. The ultimate goal of irrigation schedule optimization is to provide information to farmers so that they may modify their behavior. In practice, farmers' decisions may not follow an optimal irrigation schedule due to the impact of various factors such as natural conditions, policies, farmers' habits and empirical knowledge, and the uncertain or inexact information that they receive. This study finds that the identification of the crop growth stage with the most severe water stress is critical for irrigation scheduling. For the case study site in 2002, farmers' response to water stress was found to be late; they did not even respond appropriately to a major rainfall just 3 days ahead, which might be due to either an unreliable weather forecast or farmers' disregard of the forecast.
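
    The simulation-optimization coupling described above can be caricatured in a few lines: an assumed toy yield model is wrapped in a brute-force search over per-stage irrigation depths for the forecast horizon. Everything here (the deficit-penalty yield model, the water cost, the candidate depths) is invented for illustration; the study itself uses a calibrated hydrologic-agronomic simulation.

        from itertools import product

        def profit(schedule, forecast_rain, water_need, yield_price=1.0, water_cost=0.05):
            """Toy objective: yield drops with the squared water deficit in each stage."""
            y = 100.0
            for applied, rain, need in zip(schedule, forecast_rain, water_need):
                deficit = max(0.0, need - applied - rain)
                y -= 0.1 * deficit ** 2
            return yield_price * y - water_cost * sum(schedule)

        def best_schedule(forecast_rain, water_need, depths=(0, 10, 20, 30)):
            """Enumerate every combination of per-stage irrigation depths."""
            stages = len(forecast_rain)
            return max(product(depths, repeat=stages),
                       key=lambda s: profit(s, forecast_rain, water_need))

        print(best_schedule(forecast_rain=[5, 0, 15], water_need=[20, 30, 25]))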

  2. CPM (Critical Path Method) as a Curriculum Tool.

    ERIC Educational Resources Information Center

    Mongerson, M. Duane

    This document discusses and illustrates the use of the Critical Path Method (CPM) as a tool for developing curriculum. In so doing a brief review of the evolution of CPM as a management tool developed by E. I. duPont de Nemours Company is presented. It is also noted that CPM is only a method of sequencing learning activities and not an end unto…

  3. Defining Advancement Career Paths and Succession Plans: Critical Human Capital Retention Strategies for High-Performing Advancement Divisions

    ERIC Educational Resources Information Center

    Croteau, Jon Derek; Wolk, Holly Gordon

    2010-01-01

    There are many factors that can influence whether a highly talented staff member will build a career within an institution or use it as a stepping stone. This article defines and explores the notions of developing career paths and succession planning and why they are critical human capital investment strategies in retaining the highest performers…

  4. An Analysis of Mission Critical Computer Software in Naval Aviation

    DTIC Science & Technology

    1991-03-01

    ... software development schedules were sustained without a milestone change being made. Also, software that was released to the fleet had no major...fleet contain any major defects? This research has revealed that only about half of the original software development schedules were sustained without a

  5. Germany's ECEC Workforce: A Difficult Path to Professionalisation

    ERIC Educational Resources Information Center

    Rauschenbach, Thomas; Riedel, Birgit

    2016-01-01

    In a European comparison, the childcare profession in Germany has taken a distinct path of development which is closely interwoven with the history of early childhood education and care (ECEC) in general. Institutional choices critical to this path are the assignment of childcare as part of social welfare, the pursuit of a maternalist tradition in…

  6. Implementation of Compressed Work Schedules: Participation and Job Redesign as Critical Factors for Employee Acceptance.

    ERIC Educational Resources Information Center

    Latack, Janina C.; Foster, Lawrence W.

    1985-01-01

    Analyzes the effects of an implementation of a three-day/thirty-eight hour (3/38) work schedule among information systems personnel (N=84). Data showed that 18 months after implementation, 3/38 employees still strongly favor the compressed schedule. Data also suggest substantial organizational payoffs including reductions in sick time, overtime,…

  7. MDTM: Optimizing Data Transfer using Multicore-Aware I/O Scheduling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Liang; Demar, Phil; Wu, Wenji

    2017-05-09

    Bulk data transfer is facing significant challenges in the coming era of big data. There are multiple performance bottlenecks along the end-to-end path from the source to destination storage system. The limitations of current generation data transfer tools themselves can have a significant impact on end-to-end data transfer rates. In this paper, we identify the issues that lead to underperformance of these tools, and present a new data transfer tool with an innovative I/O scheduler called MDTM. The MDTM scheduler exploits underlying multicore layouts to optimize throughput by reducing delay and contention for I/O reading and writing operations. With our evaluations, we show how MDTM successfully avoids NUMA-based congestion and significantly improves end-to-end data transfer rates across high-speed wide area networks.

  8. MDTM: Optimizing Data Transfer using Multicore-Aware I/O Scheduling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Liang; Demar, Phil; Wu, Wenji

    2017-01-01

    Bulk data transfer is facing significant challenges in the coming era of big data. There are multiple performance bottlenecks along the end-to-end path from the source to destination storage system. The limitations of current generation data transfer tools themselves can have a significant impact on end-to-end data transfer rates. In this paper, we identify the issues that lead to underperformance of these tools, and present a new data transfer tool with an innovative I/O scheduler called MDTM. The MDTM scheduler exploits underlying multicore layouts to optimize throughput by reducing delay and contention for I/O reading and writing operations. With our evaluations, we show how MDTM successfully avoids NUMA-based congestion and significantly improves end-to-end data transfer rates across high-speed wide area networks.

  9. Scheduling of House Development Projects with CPM and PERT Method for Time Efficiency (Case Study: House Type 36)

    NASA Astrophysics Data System (ADS)

    Kholil, Muhammad; Nurul Alfa, Bonitasari; Hariadi, Madjumsyah

    2018-04-01

    Network planning is one of the management techniques used to plan and control the implementation of a project, showing the relationships between activities. The objective of this research is to construct the network planning for a house construction project at CV. XYZ and to determine the role of network planning in increasing time efficiency so that the optimal project completion period can be obtained. This research uses a descriptive method, in which data were collected by direct observation of the company, interviews, and a literature study. The result of this research is an optimal time plan for the project work. Based on the results, it can be concluded that the use of both methods in scheduling the house construction project has a very significant effect on the completion time of the project. With the CPM (Critical Path Method), the company can complete the project in 131 days, while the PERT (Program Evaluation and Review Technique) method takes 136 days. Based on the PERT calculation, Z = -0.66 was obtained, corresponding to 0.2546 from the normal distribution table, and the resulting probability is 74.54%. This means that the probability that the house construction project activities can be completed on time is reasonably high. Without these methods, project completion takes 173 days. Thus, by using the CPM method the company can save up to 42 days and gains time efficiency through network planning.
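
    Since the abstract does not list the CV. XYZ activity data, the sketch below shows the generic computation the paper relies on: a CPM forward and backward pass over an activity-on-node network to obtain the project duration and the zero-float (critical) activities, plus the PERT-style probability of meeting a deadline under the usual normal approximation. The activity names and durations are made up.

        from collections import defaultdict
        from math import erf, sqrt

        def cpm(durations, preds):
            """durations: {activity: days}; preds: {activity: [predecessor, ...]}."""
            succs = defaultdict(list)
            for a in durations:
                for p in preds.get(a, []):
                    succs[p].append(a)
            indeg = {a: len(preds.get(a, [])) for a in durations}
            order, queue = [], [a for a, d in indeg.items() if d == 0]
            while queue:                           # Kahn's algorithm: topological order
                a = queue.pop()
                order.append(a)
                for s in succs[a]:
                    indeg[s] -= 1
                    if indeg[s] == 0:
                        queue.append(s)
            es, ef = {}, {}                        # forward pass: earliest start/finish
            for a in order:
                es[a] = max((ef[p] for p in preds.get(a, [])), default=0)
                ef[a] = es[a] + durations[a]
            project = max(ef.values())
            ls, lf = {}, {}                        # backward pass: latest start/finish
            for a in reversed(order):
                lf[a] = min((ls[s] for s in succs[a]), default=project)
                ls[a] = lf[a] - durations[a]
            critical = [a for a in order if ls[a] == es[a]]   # zero total float
            return project, critical

        def pert_on_time_probability(expected, variance, deadline):
            """P(completion <= deadline) from the normal approximation used in PERT."""
            z = (deadline - expected) / sqrt(variance)
            return 0.5 * (1.0 + erf(z / sqrt(2.0)))

        durations = {"A": 10, "B": 20, "C": 15, "D": 5}
        preds = {"B": ["A"], "C": ["A"], "D": ["B", "C"]}
        print(cpm(durations, preds))               # (35, ['A', 'B', 'D'])
        print(pert_on_time_probability(expected=35, variance=9, deadline=33))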

  10. Precision Cleaning - Path to Premier

    NASA Technical Reports Server (NTRS)

    Mackler, Scott E.

    2008-01-01

    ITT Space Systems Division's new Precision Cleaning facility provides critical cleaning and packaging of aerospace flight hardware and optical payloads to meet customer performance requirements. The Precision Cleaning Path to Premier Project was a 2007 capital project and is a key element in the approved Premier Resource Management - Integrated Supply Chain Footprint Optimization Project. Formerly, precision cleaning was located offsite in a leased building. A new facility equipped with modern precision cleaning equipment, including advanced process analytical technology and improved capabilities, was designed and built after outsourcing solutions were investigated and found lacking in the ability to meet quality specifications and schedule needs. SSD cleans parts that can range in size from a single threaded fastener all the way up to large composite structures. Materials that can be processed include optics, composites, metals and various high performance coatings. We are required to provide verification to our customers that we have met their particulate and molecular cleanliness requirements, and we have that analytical capability in this new facility. The new facility footprint is approximately half the size of the former leased operation and provides double the amount of throughput. Process improvements and new cleaning equipment are projected to increase first-pass yield from 78% to 98%, avoiding $300K+/yr in rework costs. Cost avoidance of $350K/yr will result from elimination of rent, IT services, and transportation, and from decreased utility costs. Savings due to reduced staff are expected to net $400-500K/yr.

  11. Introduction of Virtualization Technology to Multi-Process Model Checking

    NASA Technical Reports Server (NTRS)

    Leungwattanakit, Watcharin; Artho, Cyrille; Hagiya, Masami; Tanabe, Yoshinori; Yamamoto, Mitsuharu

    2009-01-01

    Model checkers find failures in software by exploring every possible execution schedule. Java PathFinder (JPF), a Java model checker, has been extended recently to cover networked applications by caching data transferred in a communication channel. A target process is executed by JPF, whereas its peer process runs on a regular virtual machine outside. However, non-deterministic target programs may produce different output data in each schedule, causing the cache to restart the peer process to handle the different set of data. Virtualization tools could help us restore previous states of peers, eliminating peer restart. This paper proposes the application of virtualization technology to networked model checking, concentrating on JPF.

  12. Modern digital flight control system design for VTOL aircraft

    NASA Technical Reports Server (NTRS)

    Broussard, J. R.; Berry, P. W.; Stengel, R. F.

    1979-01-01

    Methods for and results from the design and evaluation of a digital flight control system (DFCS) for a CH-47B helicopter are presented. The DFCS employed proportional-integral control logic to provide rapid, precise response to automatic or manual guidance commands while following conventional or spiral-descent approach paths. It contained altitude- and velocity-command modes, and it adapted to varying flight conditions through gain scheduling. Extensive use was made of linear systems analysis techniques. The DFCS was designed, using linear-optimal estimation and control theory, and the effects of gain scheduling are assessed by examination of closed-loop eigenvalues and time responses.
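
    Gain scheduling, as mentioned above, amounts to interpolating pre-computed controller gains between design flight conditions; the sketch below does this over a one-dimensional airspeed table. The breakpoints and gain values are purely illustrative and are not the CH-47B design gains.

        import bisect

        # (airspeed in knots, proportional-integral gains) -- illustrative values only
        SCHEDULE = [
            (0.0,   {"Kp": 2.0, "Ki": 0.80}),
            (40.0,  {"Kp": 1.5, "Ki": 0.60}),
            (80.0,  {"Kp": 1.1, "Ki": 0.45}),
            (120.0, {"Kp": 0.9, "Ki": 0.35}),
        ]

        def scheduled_gains(airspeed):
            """Linearly interpolate the gain set at the current airspeed."""
            speeds = [s for s, _ in SCHEDULE]
            airspeed = min(max(airspeed, speeds[0]), speeds[-1])   # clamp to the table
            i = bisect.bisect_right(speeds, airspeed) - 1
            if i == len(SCHEDULE) - 1:
                return dict(SCHEDULE[-1][1])
            (s0, g0), (s1, g1) = SCHEDULE[i], SCHEDULE[i + 1]
            w = (airspeed - s0) / (s1 - s0)
            return {k: (1 - w) * g0[k] + w * g1[k] for k in g0}

        print(scheduled_gains(60.0))               # halfway between the 40- and 80-knot sets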

  13. Progress on the decommissioning of Zion nuclear generating station

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moloney, B. P.; Hess, J.

    2013-07-01

    The decommissioning of the twin 1040 MWe PWRs at Zion, near Chicago USA is a ground-breaking programme. The original owner, Exelon Nuclear Corporation, transferred the full responsibility for reactor dismantling and site license termination to a subsidiary of EnergySolutions. The target end state of the Zion site for return to Exelon will be a green field with the exception of the dry fuel storage pad. In return, ZionSolutions has access to the full value of the decommissioning trust fund. There are two potential attractions of this model: lower overall cost and significant schedule acceleration. The Zion programme which commenced in September 2010 is designed to return the cleared site with an Independent Spent Fuel Storage Installation (ISFSI) pad in 2020, 12 years earlier than planned by Exelon. The overall cost, at $500 M per full size power reactor is significantly below the long run trend of $750 M+ per PWR. Implementation of the accelerated programme has been underway for nearly three years and is making good progress. The programme is characterised by numerous projects proceeding in parallel. The critical path is defined by the inspection and removal of fuel from the pond and transfer into dry fuel storage casks on the ISFSI pad and completion of RPV segmentation. Fuel loading is expected to commence in mid-2013 with completion in late 2014. In parallel, ZionSolutions is proceeding with the segmentation of the Reactor Vessel (RV) and internals in both Units. Removal of large components from Unit 1 is underway. Numerous other projects are underway or have been completed to date. They include access openings into both containments, installation of heavy lift crane capacity, rail upgrades to support waste removal from the site, radiological characterization of facilities and equipment and numerous related tasks. As at February 2013, the programme is just ahead of schedule and within the latest budget. The paper will provide a fuller update. The first two years of the Zion programme offer some interesting learning opportunities. The critical importance of leadership and project control systems will be emphasised in the paper. Strong supplier relationships and good community cooperation are essential. A learning and adaptable team, incentivised to meet schedule and budget, drives affordability of the whole programme. Our key lessons so far concern organisation and people as much as engineering and technology. (authors)

  14. Learned Helplessness as a Schedule-Shift Effect.

    ERIC Educational Resources Information Center

    McReynolds, William T.

    1980-01-01

    The essentials of learned helplessness theory are described and supporting evidence surveyed. The explanation Seligman and Maier give for these findings is critically analyzed. A schedule-shift discrimination theory of learned helplessness effects is also discussed. (Author)

  15. The TPS Advanced Development Project for CEV

    NASA Technical Reports Server (NTRS)

    Reuther, James; Wercinski, Paul; Venkatapathy, Ethiraj; Ellerby, Don; Raiche, George; Bowman, Lynn; Jones, Craig; Kowal, John

    2006-01-01

    The CEV TPS Advanced Development Project (ADP) is a NASA in-house activity for providing two heatshield preliminary designs (a Lunar direct return as well as a LEO-only return) for the CEV, including the TPS, the carrier structure, the interfaces and the attachments. The project's primary objective is the development of a single heatshield preliminary design that meets both Lunar direct return and LEO return requirements. The effort to develop the Lunar direct return capable heatshield is considered a high risk item for the NASA CEV development effort due to the low TRL (approx. 4) of the candidate TPS materials. By initiating the TPS ADP early in the development cycle, the intent is to use materials analysis and testing in combination with manufacturing demonstrations to reduce the programmatic risk of using advanced TPS technologies in the critical path for CEV. Due to the technical and schedule risks associated with a Lunar return heatshield, the ADP will pursue a parallel-path design approach, whereby a back-up TPS/heatshield design that only meets LEO return requirements is also developed. The TPS materials and carrier structure design concept selections will be based on testing, analysis, design and evaluation of scalability and manufacturing performed under the ADP. At the TPS PDR, the preferred programmatic strategy is to transfer the continued (detailed) design, development, testing and evaluation (DDT&E) of both the Lunar direct and LEO return designs to a government/prime contractor coordinated sub-system design team. The CEV prime contractor would have responsibility for the continued heatshield sub-system development. Continued government participation would include analysis, testing and evaluation, as well as decision authority at the TPS Final System Decision (FSD) (choosing between the primary and back-up heatshields), which occurs between the TPS PDR and the TPS Critical Design Review (CDR). After TPS FSD, the prime CEV contractor will complete the detailed design, certification testing, procurement, and integration of the CEV TPS.

  16. Teaching as a Political Act: Critical Pedagogy in Library Instruction

    ERIC Educational Resources Information Center

    Fritch, Melia Erin

    2018-01-01

    This article establishes a theoretical framework for critical library instruction (and thereby critical information literacy) that is built upon critical feminist theory, critical race theory, and engaged pedagogy, among others. Using the ideas and work of theorists to create a path linking the ideas of critical analyses together, the author…

  17. Exploring the Architectural Tradespace of Severe Weather Monitoring Nanosatellite Constellations

    NASA Astrophysics Data System (ADS)

    Hitomi, N.; Selva, D.; Blackwell, W. J.

    2014-12-01

    MicroMAS-1, a 3U nanosatellite developed by MIT/LL, MIT/SSL, and University of Massachusetts, was launched on July 13, 2014 and is scheduled for deployment from the International Space Station in September. The development of MicroMAS motivates an architectural analysis of a constellation of nanosatellites with the goal of drastically reducing the cost of observing severe storms compared with current monolithic missions such as the Precision and All-Weather Temperature and Humidity (PATH) mission from the NASA Decadal Survey. Our goal is to evolve the instrument capability on weather monitoring nanosatellites to achieve higher performance and better satisfy stakeholder needs. Clear definitions of performance requirements are critical in the conceptual design phase when much of the project's lifecycle cost and performance will be fixed. Ability to perform trade studies and optimization of performance needs with instrument capability will enable design teams to focus on key technologies that will introduce high value and high return on investment. In this work, we approach the significant trades and trends of constellations for monitoring severe storms by applying our rule-based decision support tool. We examine a subset of stakeholder groups listed in the OSCAR online database (e.g., weather, climate) that would benefit from severe storm weather data and their respective observation requirements (e.g. spatial resolution, accuracy). We use ten parameters in our analysis, including atmospheric temperature, humidity, and precipitation. We compare the performance and cost of thousands of different possible constellations. The constellations support hyperspectral sounders that cover different portions of the millimeter-wave spectrum (50-60 GHz, 118GHz, 183GHz) in different orbits, and the performance results are compared against those of the monolithic PATH mission. Our preliminary results indicate that constellations using the hyperspectral millimeter wave sounders can better satisfy stakeholder needs compared to the PATH mission. Well-architected constellations have increased coverage, improved horizontal resolution from lower orbits, and improved temporal resolution. Furthermore, this improved performance can be achieved at a lower cost than what is estimated for the PATH mission.

  18. The One-Year Residency Program: An Alternative Path to the Master's Degree in Social Work.

    ERIC Educational Resources Information Center

    Salmon, Robert; Walker, Joel

    1981-01-01

    At Hunter College an alternative master's program for social workers who cannot give up employment for full-time study replaces the traditional scheduling with three flexible time periods: pre-fieldwork and post-fieldwork, part-time coursework, and one-year, four-day-a-week fieldwork experience in the place of employment. (MSE)

  19. Going Places with Books. Suggestions from "NSTA Recommends" Reviewers

    ERIC Educational Resources Information Center

    Texley, Juliana

    2008-01-01

    The National Science Education Standards encourage teachers to go on their own special path to professional development. Whether your budget and personal schedule allow real travel or vicarious adventure, summer is a great time to take a book along. So, once again this year, the reviewers of "NSTA Recommends" have come up with suggestions for your…

  20. 21 CFR 113.100 - Processing and production records.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... critical factors specified in the scheduled process shall also be recorded. In addition, the following... preservation methods wherein critical factors such as water activity are used in conjunction with thermal... critical factors, as well as other critical factors, and results of aw determinations. (7) Other systems...

  1. Path Expressions

    DTIC Science & Technology

    1975-06-01

    Carnegie-Mellon University, Computer Science Dept., Pittsburgh, PA 15213. ...Example 1. A communication between two processes is initiated by declaring a buffer which can hold a message whose interpretation is known...In other words, the functions named in a path are automatically embedded in a critical region specific for that path.) The computation of the next state in...

  2. Cyber Fighter Associate

    DTIC Science & Technology

    2016-01-01

    CyFiA was tested to accomplish a patch-management mission while securing a critical path. As a first proof of concept a simulation with a network of 10 nodes and 4...software-agility walk of the “PERFORMANCE Each Threat Managed” tree is slightly more complex than the network-agility walk. The original design of the...

  3. The effects of narrow and elevated path walking on aperture crossing.

    PubMed

    Hackney, Amy L; Cinelli, Michael E; Denomme, Luke T; Frank, James S

    2015-06-01

    The study investigated the impact that action capabilities have on identifying possibilities for action, particularly how postural threat influences the passability of apertures. To do this, the ability to maintain balance was challenged by manipulating the level of postural threat while walking. First, participants walked along a 7m path and passed through two vertical obstacles spaced 1.1-1.5×the shoulder width apart during normal walking. Next, postural threat was manipulated by having participants complete the task either walking on a narrow, ground level path or on an elevated/narrow path. Despite a decrease in walking speed as well as an increase in trunk sway in both the narrow and elevated/narrow walking conditions, the passability of apertures was only affected when the consequence of instability was greatest. In the elevated/narrow walking condition, individuals maintained a larger critical point (rotated their shoulders for larger aperture widths) compared to normal walking. However, this effect was not observed for the narrow path walking suggesting that the level of postural threat was not enough to impose similar changes to the critical point. Therefore, it appears that manipulating action capabilities by increasing postural threat does indeed influence aperture crossing behavior, however the consequence associated with instability must be high before both gait characteristics and the critical point are affected. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Space Station Freedom Data Assessment Study

    NASA Technical Reports Server (NTRS)

    Johnson, Anngienetta R.; Deskevich, Joseph

    1990-01-01

    The SSF Data Assessment Study was initiated to identify payload and operations data requirements to be supported in the Space Station era. To initiate the study payload requirements from the projected SSF user community were obtained utilizing an electronic questionnaire. The results of the questionnaire were incorporated in a personal computer compatible database used for mission scheduling and end-to-end communications analyses. This paper discusses data flow paths and associated latencies, communications bottlenecks, resource needs versus availability, payload scheduling 'warning flags' and payload data loading requirements for each major milestone in the Space Station buildup sequence. This paper also presents the statistical and analytical assessments produced using the data base, an experiment scheduling program, and a Space Station unique end-to-end simulation model. The modeling concepts and simulation methodologies presented in this paper provide a foundation for forecasting communication requirements and identifying modeling tools to be used in the SSF Tactical Operations Planning (TOP) process.

  5. Spectroscopic method for Earth-satellite-Earth laser long-path absorption measurements using Retroreflector In Space (RIS)

    NASA Technical Reports Server (NTRS)

    Sugimoto, Nobuo; Minato, Atsushi; Sasano, Yasuhiro

    1992-01-01

    The Retroreflector in Space (RIS) is a single element cube-corner retroreflector with a diameter of 0.5 m designed for earth-satellite-earth laser long-path absorption experiments. The RIS is to be loaded on the Advanced Earth Observing System (ADEOS) satellite which is scheduled for launch in Feb. 1996. The orbit for ADEOS is a sun synchronous subrecurrent polar-orbit with an inclination of 98.6 deg. It has a period of 101 minutes and an altitude of approximately 800 km. The local time at descending node is 10:15-10:45, and the recurrent period is 41 days. The velocity relative to the ground is approximately 7 km/s. In the RIS experiment, a laser beam transmitted from a ground station is reflected by RIS and received at the ground station. The absorption of the intervening atmosphere is measured in the round-trip optical path.

  6. In Praise of Ignorance

    ERIC Educational Resources Information Center

    Formica, Piero

    2014-01-01

    In this article Piero Formica examines the difference between incremental and revolutionary innovation, distinguishing between the constrained "path finders" and the unconstrained "path creators". He argues that an acceptance of "ignorance" and a willingness to venture into the unknown are critical elements in…

  7. Real-time energy-saving metro train rescheduling with primary delay identification

    PubMed Central

    Li, Keping; Schonfeld, Paul

    2018-01-01

    This paper aims to reschedule online metro trains in delay scenarios. A graph representation and a mixed integer programming model are proposed to formulate the optimization problem. The solution approach is a two-stage optimization method. In the first stage, based on a proposed train state graph and system analysis, the primary and flow-on delays are specifically analyzed and identified with a critical path algorithm. For the second stage a hybrid genetic algorithm is designed to optimize the schedule, with the delay identification results as input. Then, based on the infrastructure data of Beijing Subway Line 4 of China, case studies are presented to demonstrate the effectiveness and efficiency of the solution approach. The results show that the algorithm can quickly and accurately identify primary delays among different types of delays. The economic cost of energy consumption and total delay is considerably reduced (by more than 10% in each case). The computation time of the Hybrid-GA is low enough for rescheduling online. Sensitivity analyses further demonstrate that the proposed approach can be used as a decision-making support tool for operators. PMID:29474471
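
    The delay identification above rests on a longest-path (critical path) computation over the activity graph. As an illustration only, the Python sketch below finds the critical path of a small invented activity network; the function, activities, and durations are assumptions and are not taken from the cited paper.

        # Longest-path (critical path) computation on a toy DAG of activities.
        from collections import defaultdict

        def critical_path(durations, edges):
            """durations: {activity: duration}; edges: list of (pred, succ) pairs.
            Returns (length of the critical path, one critical path as a list)."""
            succs, indeg = defaultdict(list), defaultdict(int)
            for u, v in edges:
                succs[u].append(v)
                indeg[v] += 1
            # topological order via Kahn's algorithm
            order, queue = [], [a for a in durations if indeg[a] == 0]
            while queue:
                u = queue.pop()
                order.append(u)
                for v in succs[u]:
                    indeg[v] -= 1
                    if indeg[v] == 0:
                        queue.append(v)
            # earliest finish times, remembering the predecessor on the longest path
            ef, back = {}, {}
            for u in order:
                preds = [p for p, s in edges if s == u]
                ef[u] = max((ef[p] for p in preds), default=0) + durations[u]
                back[u] = max(preds, key=lambda p: ef[p]) if preds else None
            end = max(ef, key=ef.get)
            length, path = ef[end], []
            while end is not None:
                path.append(end)
                end = back[end]
            return length, list(reversed(path))

        acts = {"A": 3, "B": 2, "C": 4, "D": 1}
        deps = [("A", "B"), ("A", "C"), ("B", "D"), ("C", "D")]
        print(critical_path(acts, deps))   # -> (8, ['A', 'C', 'D'])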

  8. 5 CFR 532.254 - Special schedules.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ....254 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PREVAILING RATE SYSTEMS Prevailing Rate Determinations § 532.254 Special schedules. (a) A lead agency, with the approval... critical to the mission of a Federal activity based on findings that— (1) Unusual prevailing pay practices...

  9. Avoiding Biased-Feeding in the Scheduling of Collaborative Multipath TCP.

    PubMed

    Tsai, Meng-Hsun; Chou, Chien-Ming; Lan, Kun-Chan

    2016-01-01

    Smartphones have become the major communication and portable computing devices that access the Internet through Wi-Fi or mobile networks. Unfortunately, users without a mobile data subscription can only access the Internet at limited locations, such as hotspots. In this paper, we propose a collaborative bandwidth sharing protocol (CBSP) built on top of MultiPath TCP (MPTCP). CBSP enables users to buy bandwidth on demand from neighbors (called Helpers) and uses virtual interfaces to bind the subflows of MPTCP to avoid modifying the implementation of MPTCP. However, although MPTCP provides the required multi-homing functionality for bandwidth sharing, the current packet scheduling in collaborative MPTCP (e.g., Co-MPTCP) leads to the so-called biased-feeding problem. In this problem, the fastest link might always be selected to send packets whenever it has available cwnd, which results in other links not being fully utilized. In this work, we set out to design an algorithm, called Scheduled Window-based Transmission Control (SWTC), to improve the performance of packet scheduling in MPTCP, and we perform extensive simulations to evaluate its performance.
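
    The biased-feeding behaviour described above can be illustrated with a deliberately small contrast between an "always feed the lowest-RTT subflow" rule and a window-proportional split. This is only an illustration, not the paper's SWTC algorithm; the subflow names and parameters are invented.

        # Fastest-first feeding versus spreading a scheduled window across
        # subflows in proportion to their congestion windows (toy parameters).
        def greedy_fastest(packets, subflows):
            """subflows: list of (name, cwnd, rtt); always pick the lowest-RTT subflow with room."""
            sent = {name: 0 for name, _, _ in subflows}
            for _ in range(packets):
                avail = [(rtt, name) for name, cwnd, rtt in subflows if sent[name] < cwnd]
                if not avail:
                    break
                _, name = min(avail)
                sent[name] += 1
            return sent

        def window_share(packets, subflows):
            """Split the scheduled window over subflows proportionally to cwnd."""
            total = sum(cwnd for _, cwnd, _ in subflows)
            return {name: min(cwnd, packets * cwnd // total) for name, cwnd, _ in subflows}

        flows = [("wifi", 10, 20), ("helper", 30, 60)]   # (name, cwnd in packets, rtt in ms)
        print(greedy_fastest(25, flows))   # {'wifi': 10, 'helper': 15} -- fast subflow saturated first
        print(window_share(25, flows))     # {'wifi': 6, 'helper': 18}  -- proportional split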

  10. Avoiding Biased-Feeding in the Scheduling of Collaborative Multipath TCP

    PubMed Central

    2016-01-01

    Smartphones have become the major communication and portable computing devices that access the Internet through Wi-Fi or mobile networks. Unfortunately, users without a mobile data subscription can only access the Internet at limited locations, such as hotspots. In this paper, we propose a collaborative bandwidth sharing protocol (CBSP) built on top of MultiPath TCP (MPTCP). CBSP enables users to buy bandwidth on demand from neighbors (called Helpers) and uses virtual interfaces to bind the subflows of MPTCP to avoid modifying the implementation of MPTCP. However, although MPTCP provides the required multi-homing functionality for bandwidth sharing, the current packet scheduling in collaborative MPTCP (e.g., Co-MPTCP) leads to the so-called biased-feeding problem. In this problem, the fastest link might always be selected to send packets whenever it has available cwnd, which results in other links not being fully utilized. In this work, we set out to design an algorithm, called Scheduled Window-based Transmission Control (SWTC), to improve the performance of packet scheduling in MPTCP, and we perform extensive simulations to evaluate its performance. PMID:27529783

  11. The Critical Path Institute's approach to precompetitive sharing and advancing regulatory science.

    PubMed

    Woosley, R L; Myers, R T; Goodsaid, F

    2010-05-01

    Many successful large industries, such as computer-chip manufacturers, the cable television industry, and high-definition television developers,(1) have established successful precompetitive collaborations focusing on standards, applied science, and technology that advance the field for all stakeholders and benefit the public.(2) The pharmaceutical industry, however, has a well-earned reputation for fierce competition and did not demonstrate willingness to share data or knowledge until the US Food and Drug Administration (FDA) launched the Critical Path Initiative in 2004 (ref. 3).

  12. Resource Allocation and Outpatient Appointment Scheduling Using Simulation Optimization

    PubMed Central

    Ling, Teresa Wai Ching; Yeung, Wing Kwan

    2017-01-01

    This paper studies the real-life problems of outpatient clinics having the multiple objectives of minimizing resource overtime, patient waiting time, and waiting area congestion. In the clinic, there are several patient classes, each of which follows different treatment procedure flow paths through a multiphase and multiserver queuing system with scarce staff and limited space. We incorporate the stochastic factors for the probabilities of the patients being diverted into different flow paths, patient punctuality, arrival times, procedure duration, and the number of accompanied visitors. We present a novel two-stage simulation-based heuristic algorithm to assess various tactical and operational decisions for optimizing the multiple objectives. In stage I, we search for a resource allocation plan, and in stage II, we determine a block appointment schedule by patient class and a service discipline for the daily operational level. We also explore the effects of the separate strategies and their integration to identify the best possible combination. The computational experiments are designed on the basis of data from a study of an ophthalmology clinic in a public hospital. Results show that our approach significantly mitigates the undesirable outcomes by integrating the strategies and increasing the resource flexibility at the bottleneck procedures without adding resources. PMID:29104748

  13. Resource Allocation and Outpatient Appointment Scheduling Using Simulation Optimization.

    PubMed

    Lin, Carrie Ka Yuk; Ling, Teresa Wai Ching; Yeung, Wing Kwan

    2017-01-01

    This paper studies the real-life problems of outpatient clinics having the multiple objectives of minimizing resource overtime, patient waiting time, and waiting area congestion. In the clinic, there are several patient classes, each of which follows different treatment procedure flow paths through a multiphase and multiserver queuing system with scarce staff and limited space. We incorporate the stochastic factors for the probabilities of the patients being diverted into different flow paths, patient punctuality, arrival times, procedure duration, and the number of accompanied visitors. We present a novel two-stage simulation-based heuristic algorithm to assess various tactical and operational decisions for optimizing the multiple objectives. In stage I, we search for a resource allocation plan, and in stage II, we determine a block appointment schedule by patient class and a service discipline for the daily operational level. We also explore the effects of the separate strategies and their integration to identify the best possible combination. The computational experiments are designed on the basis of data from a study of an ophthalmology clinic in a public hospital. Results show that our approach significantly mitigates the undesirable outcomes by integrating the strategies and increasing the resource flexibility at the bottleneck procedures without adding resources.

  14. The MICRO-BOSS scheduling system: Current status and future efforts

    NASA Technical Reports Server (NTRS)

    Sadeh, Norman M.

    1992-01-01

    In this paper, a micro-opportunistic approach to factory scheduling was described that closely monitors the evolution of bottlenecks during the construction of the schedule and continuously redirects search towards the bottleneck that appears to be most critical. This approach differs from earlier opportunistic approaches, as it does not require scheduling large resource subproblems or large job subproblems before revising the current scheduling strategy. This micro-opportunistic approach was implemented in the context of the MICRO-BOSS factory scheduling system. A study comparing MICRO-BOSS against a macro-opportunistic scheduler suggests that the additional flexibility of the micro-opportunistic approach to scheduling generally yields important reductions in both tardiness and inventory. Current research efforts include: adaptation of MICRO-BOSS to deal with sequence-dependent setups and development of micro-opportunistic reactive scheduling techniques that will enable the system to patch the schedule in the presence of contingencies such as machine breakdowns, raw materials arriving late, job cancellations, etc.

  15. Bioastronautics Roadmap: A Risk Reduction Strategy for Human Space Exploration

    NASA Technical Reports Server (NTRS)

    2005-01-01

    The Bioastronautics Critical Path Roadmap is the framework used to identify and assess the risks to crews exposed to the hazardous environments of space. It guides the implementation of research strategies to prevent or reduce those risks. Although the BCPR identifies steps that must be taken to reduce the risks to health and performance that are associated with human space flight, the BCPR is not a "critical path" analysis in the strict engineering sense. The BCPR will evolve to accommodate new information and technology development and will enable NASA to conduct a formal critical path analysis in the future. As a management tool, the BCPR provides information for making informed decisions about research priorities and resource allocation. The outcome-driven nature of the BCPR makes it amenable for assessing the focus, progress and success of the Bioastronautics research and technology program. The BCPR is also a tool for communicating program priorities and progress to the research community and NASA management.

  16. Paving the critical path: how can clinical pharmacology help achieve the vision?

    PubMed

    Lesko, L J

    2007-02-01

    It has been almost 3 years since the launch of the FDA critical path initiative following the publication of the paper "Innovation or Stagnation: Challenges and Opportunities on the Critical Path of New Medical Product Development." The initiative was intended to create an urgency with the drug development enterprise to address the so-called "productivity problem" in modern drug development. Clinical pharmacologists are strategically aligned with solutions designed to reduce late phase clinical trial failures to show adequate efficacy and/or safety. This article reviews some of the ways that clinical pharmacologists can lead and implement change in the drug development process. It includes a discussion of model-based, semi-mechanistic drug development, drug/disease models that facilitate informed clinical trial designs and optimal dosing, the qualification process and criteria for new biomarkers and surrogate endpoints, approaches to streamlining clinical trials and new types of interaction between industry and FDA such as the end-of-phase 2A and voluntary genomic data submission meetings respectively.

  17. Team Mentoring for Interdisciplinary Team Science: Lessons From K12 Scholars and Directors.

    PubMed

    Guise, Jeanne-Marie; Geller, Stacie; Regensteiner, Judith G; Raymond, Nancy; Nagel, Joan

    2017-02-01

    Mentoring is critical for academic success. As science transitions to a team science model, team mentoring may have advantages. The goal of this study was to understand the process, benefits, and challenges of team mentoring relating to career development and research. A national survey was conducted of Building Interdisciplinary Research Careers in Women's Health (BIRCWH) program directors-current and former scholars from 27 active National Institutes of Health (NIH)-funded BIRCWH NIH K12 programs-to characterize and understand the value and challenges of the team approach to mentoring. Quantitative data were analyzed descriptively, and qualitative data were analyzed thematically. Responses were received from 25/27 (93%) program directors, 78/108 (72%) current scholars, and 91/162 (56%) former scholars. Scholars reported that team mentoring was beneficial to their career development (152/169; 90%) and research (148/169; 88%). Reported advantages included a diversity of opinions, expanded networking, development of stronger study designs, and modeling of different career paths. Challenges included scheduling and managing conflicting opinions. Advice by directors offered to junior faculty entering team mentoring included the following: not to be intimidated by senior mentors, be willing to navigate conflicting advice, be proactive about scheduling and guiding discussions, have an open mind to different approaches, be explicit about expectations and mentors' roles (including importance of having a primary mentor to help navigate discussions), and meet in person as a team. These findings suggest that interdisciplinary/interprofessional team mentoring has many important advantages, but that skills are required to optimally utilize multiple perspectives.

  18. Team Mentoring for Interdisciplinary Team Science: Lessons from K12 Scholars and Directors

    PubMed Central

    Guise, Jeanne-Marie; Geller, Stacie; Regensteiner, Judith G.; Raymond, Nancy; Nagel, Joan

    2016-01-01

    Purpose Mentoring is critical for academic success. As science transitions to a team science model, team mentoring may have advantages. The goal of this study was to understand the process, benefits, and challenges of team mentoring relating to career development and research. Method A national survey was conducted of Building Interdisciplinary Research Careers in Women’s Health (BIRCWH) program directors, current and former scholars from 27 active National Institutes of Health (NIH)-funded BIRCWH NIH K12 programs to characterize and understand the value and challenges of the team approach to mentoring. Quantitative data were analyzed descriptively and qualitative data thematically. Results Responses were received from 25/27 (93%) of program directors, 78/108 (72%) current scholars, and 91/162 (56%) former scholars. Scholars reported that team mentoring was beneficial to their career development (152/169, 90%) and research (148/169, 88%). Reported advantages included a diversity of opinions, expanded networking, development of stronger study designs, and modeling of different career paths. Challenges included scheduling and managing conflicting opinions. Advice by directors offered to junior faculty entering team mentoring included: not to be intimidated by senior mentors, be willing to navigate conflicting advice, be proactive about scheduling and guiding discussions, have an open mind to different approaches, be explicit about expectations and mentors’ roles (including importance of having a primary mentor to help navigate discussions), and meet in person as a team. Conclusions These findings suggest that interdisciplinary/interprofessional team mentoring has many important advantages, but that skills are required to optimally utilize multiple perspectives. PMID:27556675

  19. Clinical care paths: a role for finance in clinical decision-making.

    PubMed

    Abrams, Michael N; Cummings, Simone; Hage, Dana

    2012-12-01

    Care paths map the critical actions and decision points across a patient's course of medical treatment; their purpose is to guide physicians in the delivery of high-quality care while reducing care costs by avoiding services that do not contribute meaningfully to positive outcomes. Each care path development initiative should be led by a respected physician champion, whose specialty is in the area of the care episode being mapped, with the support of a clinician project manager. Once the care path has been developed and implemented, the finance leader's role begins in earnest with the tracking of financial and clinical data against care paths.

  20. When Serious Project Management is a Critical Business Requirement

    NASA Technical Reports Server (NTRS)

    Jansma, P. A.; Gibby, L.; Chambers, C.; Joines, J.; Egger, R.

    2000-01-01

    When serious project management is a critical business requirement, project managers need to integrate cost, schedule and technical scope of work across the project, and apply earned value management (EVM).

  1. Merit Pay Misfires

    ERIC Educational Resources Information Center

    Ramirez, Al

    2011-01-01

    Critics argue that the uniform salary schedule is unfair because it promotes mediocrity by rewarding poor performers while failing to recognize outstanding achievement on the job. Advocates for merit pay systems for preK-12 education also contend that the uniform salary schedule ignores the basic purpose of education--student learning. Although…

  2. Memory characteristics of ring-shaped ceramic superconductors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takeoka, A.; Hasunuma, M.; Sakaiya, S.

    1989-03-01

    For the practical application of ceramic superconductors, the authors investigated the residual magnetic field characteristics of ring-shaped ceramic superconductors in a Y-Ba-Cu-O system with high Tc. The residual magnetic field of a ring with asymmetric current paths, supplied by external currents, appeared when one of the branch currents was above the critical current. The residual magnetic field saturated when both branch currents exceeded the critical current of the ring and showed hysteresis-like characteristics. The saturated magnetic field is subject to the critical current of the ring. A superconducting ring with asymmetric current paths suggests a simple and quite new persistent-current type memory device.

  3. The influence of different training schedules on the learning of psychomotor skills for endoscopic surgery.

    PubMed

    Verdaasdonk, E G G; Stassen, L P S; van Wijk, R P J; Dankelman, J

    2007-02-01

    Psychomotor skills for endoscopic surgery can be trained with virtual reality simulators. Distributed training is more effective than massed training, but it is unclear whether distributed training over several days is more effective than distributed training within 1 day. This study aimed to determine which of these two options is the most effective for training endoscopic psychomotor skills. Students with no endoscopic experience were randomly assigned either to distributed training on 3 consecutive days (group A, n = 10) or distributed training within 1 day (group B, n = 10). For this study the SIMENDO virtual reality simulator for endoscopic skills was used. The training involved 12 repetitions of three different exercises (drop balls, needle manipulation, 30 degree endoscope) in differently distributed training schedules. All the participants performed a posttraining test (posttest) for the trained tasks 7 days after the training. The parameters measured were time, nontarget environment collisions, and instrument path length. There were no significant differences between the groups in the first training session for all the parameters. In the posttest, group A (training over several days) performed 18.7% faster than group B (training on 1 day) (p = 0.013). The collision and path length scores for group A did not differ significantly from the scores for group B. The distributed group trained over several days was faster, with the same number of errors and the same instrument path length used. Psychomotor skill training for endoscopic surgery distributed over several days is superior to training on 1 day.

  4. Three Critical Tasks America's Disadvantaged Face on Their Path to College.

    ERIC Educational Resources Information Center

    Cabrera, Alberto F.; La Nasa, Steven M.

    2000-01-01

    Using data from the National Educational Longitudinal Study of 1988, examines the wide disparity of college-choice activities between socioeconomic groups. In order to highlight this disparity, analyzes three tasks that all students must complete on their path to college. (Author/EV)

  5. Stress path dependent hydromechanical behaviour of heterogeneous carbonate rock

    NASA Astrophysics Data System (ADS)

    Gland, N.; Dautriat, J.; Dimanov, A.; Raphanel, J.

    2010-06-01

    The influence of stress paths, representative of reservoir conditions, on the hydromechanical behavior of a moderately heterogeneous carbonate has been investigated. Multiscale structural heterogeneities, common for instance in carbonate rocks, can strongly alter the mechanical response and significantly influence the evolution of flow properties with stress. Using a triaxial cell, the permeability evolutions during compression and the effects of brittle (fracture) and plastic (pore collapse) deformations at yield were measured. A strong scattering was observed in the mechanical response, both in terms of compressibility and failure threshold. Using the porosity scaling predicted by an adapted effective medium theory (based on crack growth under Hertzian contact), we have rescaled the critical pressures by the normalized porosity deviation. This procedure efficiently reduces the scattering, revealing, in the framework of proportional stress path loading, a linear relation between the critical pressures and the stress path parameter through all the deformation regimes. It leads to a new formulation for the critical state envelope in the 'mean stress, deviatoric stress' diagram. The attractive feature of this new yield envelope formulation is that only the two most common mechanical tests, 'Uniaxial Compression' and 'Hydrostatic Compression', are needed to define the yield envelope entirely. Volumic strains and normalized permeabilities are finally mapped in the stress diagram and correlated.

  6. Critical Chain Exercises

    ERIC Educational Resources Information Center

    Doyle, John Kevin

    2010-01-01

    Critical Chains project management focuses on holding buffers at the project level vs. task level, and managing buffers as a project resource. A number of studies have shown that Critical Chain project management can significantly improve organizational schedule fidelity (i.e., improve the proportion of projects delivered on time) and reduce…

  7. Predit: A temporal predictive framework for scheduling systems

    NASA Technical Reports Server (NTRS)

    Paolucci, E.; Patriarca, E.; Sem, M.; Gini, G.

    1992-01-01

    Scheduling can be formalized as a Constraint Satisfaction Problem (CSP). Within this framework activities belonging to a plan are interconnected via temporal constraints that account for slack among them. Temporal representation must include methods for constraints propagation and provide a logic for symbolic and numerical deductions. In this paper we describe a support framework for opportunistic reasoning in constraint directed scheduling. In order to focus the attention of an incremental scheduler on critical problem aspects, some discrete temporal indexes are presented. They are also useful for the prediction of the degree of resources contention. The predictive method expressed through our indexes can be seen as a Knowledge Source for an opportunistic scheduler with a blackboard architecture.
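
    As a small illustration of the temporal constraint propagation such a framework relies on, the sketch below tightens the bounds of a toy Simple Temporal Network with Floyd-Warshall; the network and bounds are invented, and the discrete indexes proposed in the paper are not reproduced here.

        # Constraints are bounds lo <= t[j] - t[i] <= hi on pairs of time points,
        # tightened by all-pairs shortest paths (Floyd-Warshall).
        INF = float("inf")

        def propagate(n, constraints):
            """n time points (0..n-1); constraints: list of (i, j, lo, hi).
            Returns d where d[i][j] is the tightest upper bound on t[j] - t[i],
            or None if the constraint network is inconsistent."""
            d = [[0 if i == j else INF for j in range(n)] for i in range(n)]
            for i, j, lo, hi in constraints:
                d[i][j] = min(d[i][j], hi)    # t[j] - t[i] <= hi
                d[j][i] = min(d[j][i], -lo)   # t[i] - t[j] <= -lo
            for k in range(n):
                for i in range(n):
                    for j in range(n):
                        if d[i][k] + d[k][j] < d[i][j]:
                            d[i][j] = d[i][k] + d[k][j]
            if any(d[i][i] < 0 for i in range(n)):
                return None                   # negative cycle: inconsistent constraints
            return d

        # activity 1 starts 2-4 units after time 0; activity 2 starts 1-3 units after activity 1,
        # so the propagated bound on t2 - t0 is [3, 7]
        print(propagate(3, [(0, 1, 2, 4), (1, 2, 1, 3)]))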

  8. Finding Out Critical Points For Real-Time Path Planning

    NASA Astrophysics Data System (ADS)

    Chen, Wei

    1989-03-01

    Path planning for a mobile robot is a classic topic, but path planning in a real-time environment is a different issue. The system resources, including sampling time, processing time, interprocess communication time, and memory space, are very limited for this type of application. This paper presents a method which abstracts the world representation from the sensory data and makes the decision as to which point will be a potentially critical point to span the world map, using incomplete knowledge about the physical world and heuristic rules. Without any previous knowledge or map of the workspace, the robot will determine the world map by roving through the workspace. The computational complexity for building and searching such a map is not more than O(n²). The find-path problem is well-known in robotics. Given an object with an initial location and orientation, a goal location and orientation, and a set of obstacles located in space, the problem is to find a continuous path for the object from the initial position to the goal position which avoids collisions with obstacles along the way. There are many methods to find a collision-free path in a given environment. Techniques for solving this problem can be classified into three approaches: 1) the configuration space approach [1],[2],[3], which represents the polygonal obstacles by vertices in a graph. The idea is to determine those parts of the free space which a reference point of the moving object can occupy without colliding with any obstacles. A path is then found for the reference point through this truly free space. Dealing with rotations turns out to be a major difficulty with the approach, requiring complex geometric algorithms which are computationally expensive. 2) the direct representation of the free space using basic shape primitives such as convex polygons [4] and overlapping generalized cones [5]. 3) the combination of techniques 1 and 2 [6], by which the space is divided into primary convex regions, overlap regions and obstacle regions; obstacle boundaries with attribute values are then represented by the vertices of a hypergraph. The primary convex regions and overlap regions are represented by hyperedges, and the centroids of the overlap regions form the critical points. The difficulty is generating the segment graph and estimating the minimum path width. All the techniques mentioned above need previous knowledge about the world for path planning, and their computational cost is not low. They are not applicable in an unknown and uncertain environment. Due to limited system resources such as CPU time, memory size and knowledge about the specific application in an intelligent system (such as a mobile robot), it is necessary to use algorithms that provide a good decision feasible with the available resources in real time, rather than the best answer that could be achieved with unlimited time and unlimited resources. A real-time path planner should meet the following requirements: quickly abstract the representation of the world from the sensory data without any previous knowledge about the robot environment; easily update the world model to spell out the global path map and to reflect changes in the robot environment; and decide, in real time and with limited resources, where the robot must go and which direction the range sensor should point. The method presented here assumes that the data from the range sensors have been processed by a signal processing unit. The path planner guides the scan of the range sensor, finds critical points, decides where the robot should go and which point is a potential critical point, generates the path map, and monitors the robot's movement to the given point. The program runs recursively until the goal is reached or the whole workspace has been roved through.

  9. Irrigation scheduling as affected by field capacity and wilting point water content from different data sources

    USDA-ARS?s Scientific Manuscript database

    Soil water content at field capacity and wilting point water content is critical information for irrigation scheduling, regardless of soil water sensor-based method (SM) or evapotranspiration (ET)-based method. Both methods require knowledge on site-specific and soil-specific Management Allowable De...

  10. The James Webb Space Telescope: Observatory Status and the Path to Launch

    NASA Technical Reports Server (NTRS)

    McElwain, Michael; Bowers, Chuck; Clampin, Mark; Niedner, Mal

    2016-01-01

    JWST will carry out transformative science from the very early universe and across cosmic time. JWST OTE and ISIM have been combined to form OTIS, which will commence environmental testing. The full JWST team has made tremendous progress since the last AT+I meeting in 2014. JWST is on track following the 2011 replan and remains on schedule to launch in October 2018.

  11. Power-constrained supercomputing

    NASA Astrophysics Data System (ADS)

    Bailey, Peter E.

    As we approach exascale systems, power is turning from an optimization goal to a critical operating constraint. With power bounds imposed by both stakeholders and the limitations of existing infrastructure, achieving practical exascale computing will therefore rely on optimizing performance subject to a power constraint. However, this requirement should not add to the burden of application developers; optimizing the runtime environment given restricted power will primarily be the job of high-performance system software. In this dissertation, we explore this area and develop new techniques that extract maximum performance subject to a particular power constraint. These techniques include a method to find theoretical optimal performance, a runtime system that shifts power in real time to improve performance, and a node-level prediction model for selecting power-efficient operating points. We use a linear programming (LP) formulation to optimize application schedules under various power constraints, where a schedule consists of a DVFS state and number of OpenMP threads for each section of computation between consecutive message passing events. We also provide a more flexible mixed integer-linear (ILP) formulation and show that the resulting schedules closely match schedules from the LP formulation. Across four applications, we use our LP-derived upper bounds to show that current approaches trail optimal, power-constrained performance by up to 41%. This demonstrates limitations of current systems, and our LP formulation provides future optimization approaches with a quantitative optimization target. We also introduce Conductor, a run-time system that intelligently distributes available power to nodes and cores to improve performance. The key techniques used are configuration space exploration and adaptive power balancing. Configuration exploration dynamically selects the optimal thread concurrency level and DVFS state subject to a hardware-enforced power bound. Adaptive power balancing efficiently predicts where critical paths are likely to occur and distributes power to those paths. Greater power, in turn, allows increased thread concurrency levels, CPU frequency/voltage, or both. We describe these techniques in detail and show that, compared to the state-of-the-art technique of using statically predetermined, per-node power caps, Conductor leads to a best-case performance improvement of up to 30%, and an average improvement of 19.1%. At the node level, an accurate power/performance model will aid in selecting the right configuration from a large set of available configurations. We present a novel approach to generate such a model offline using kernel clustering and multivariate linear regression. Our model requires only two iterations to select a configuration, which provides a significant advantage over exhaustive search-based strategies. We apply our model to predict power and performance for different applications using arbitrary configurations, and show that our model, when used with hardware frequency-limiting in a runtime system, selects configurations with significantly higher performance at a given power limit than those chosen by frequency-limiting alone. When applied to a set of 36 computational kernels from a range of applications, our model accurately predicts power and performance; our runtime system based on the model maintains 91% of optimal performance while meeting power constraints 88% of the time. 
When the runtime system violates a power constraint, it exceeds the constraint by only 6% in the average case, while simultaneously achieving 54% more performance than an oracle. Through the combination of the above contributions, we hope to provide guidance and inspiration to research practitioners working on runtime systems for power-constrained environments. We also hope this dissertation will draw attention to the need for software and runtime-controlled power management under power constraints at various levels, from the processor level to the cluster level.
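
    The LP formulation itself is not reproduced in this abstract. As a loosely related and much simplified sketch, the snippet below allocates a fixed cluster power budget across nodes to minimize makespan under an assumed linearized time-versus-power model; all coefficients are invented, and scipy.optimize.linprog stands in for whatever solver the dissertation used.

        # Makespan-minimizing power allocation with runtimes modeled as t_i = a_i - b_i * p_i.
        # Variables: per-node power p_0..p_{n-1} and the makespan T; minimize T.
        from scipy.optimize import linprog

        a = [10.0, 14.0, 12.0]        # node runtimes (s) at minimum power (assumed)
        b = [0.05, 0.08, 0.06]        # runtime reduction per extra watt (assumed)
        p_min, p_max, P_total = 50.0, 120.0, 240.0

        n = len(a)
        c = [0.0] * n + [1.0]         # objective: minimize T
        A_ub, b_ub = [], []
        for i in range(n):            # T >= a_i - b_i * p_i  ->  -b_i*p_i - T <= -a_i
            row = [0.0] * (n + 1)
            row[i], row[n] = -b[i], -1.0
            A_ub.append(row)
            b_ub.append(-a[i])
        A_ub.append([1.0] * n + [0.0])   # total power budget: sum(p_i) <= P_total
        b_ub.append(P_total)
        bounds = [(p_min, p_max)] * n + [(0, None)]

        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
        print(res.x[:n], res.x[n])    # roughly [63.7, 89.8, 86.4] W and a makespan near 6.8 s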

  12. PRACTICAL: Planning and Resource Allocation in C2-Domains With Time Critical Algorithms (PRACTICAL: Planning en Allocatie in C2-Domeinen Met Tijdkritische Algoritmen)

    DTIC Science & Technology

    1993-02-01

    The (re)planning framework incorporates the demonstrators CALIGULA and ALLOCATOR for resource allocation and scheduling, respectively. In the Command...demonstrator CALIGULA for the problem of allocating frequencies to a radio link network. The problems in the domain of scheduling are dealt with...

  13. Reply to "Comment on `Particle path through a nested Mach-Zehnder interferometer' "

    NASA Astrophysics Data System (ADS)

    Griffiths, Robert B.

    2017-06-01

    The correctness of the consistent histories analysis of weakly interacting probes, related to the path of a particle, is maintained against the criticisms in the Comment, and against the alternative approach described there, which receives no support from standard (textbook) quantum mechanics.

  14. Sociological Theory and Youth Aspiration Research: A Critical Overview.

    ERIC Educational Resources Information Center

    Picou, J. Steven; Wells, Richard H.

    Reviewing sociological theories relative to youth aspiration research, the following thesis was presented: "pre-path analysis aspiration research was characterized by a person-centered, middle-range functionalist approach which eventually shifted to a person-centered, functionalist-system approach with the introduction of the path model…

  15. Integrated Cost and Schedule using Monte Carlo Simulation of a CPM Model - 12419

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hulett, David T.; Nosbisch, Michael R.

    This discussion of the recommended practice (RP) 57R-09 of AACE International defines the integrated analysis of schedule and cost risk to estimate the appropriate level of cost and schedule contingency reserve on projects. The main contribution of this RP is to include the impact of schedule risk on cost risk and hence on the need for cost contingency reserves. Additional benefits include the prioritizing of the risks to cost, some of which are risks to schedule, so that risk mitigation may be conducted in a cost-effective way, scatter diagrams of time-cost pairs for developing joint targets of time and cost, and probabilistic cash flow which shows cash flow at different levels of certainty. Integrating cost and schedule risk into one analysis based on the project schedule loaded with costed resources from the cost estimate provides both: (1) more accurate cost estimates than if the schedule risk were ignored or incorporated only partially, and (2) an illustration of the importance of schedule risk to cost risk when the durations of activities using labor-type (time-dependent) resources are risky. Many activities such as detailed engineering, construction or software development are mainly conducted by people who need to be paid even if their work takes longer than scheduled. Level-of-effort resources, such as the project management team, are extreme examples of time-dependent resources, since if the project duration exceeds its planned duration the cost of these resources will increase over their budgeted amount. The integrated cost-schedule risk analysis is based on: - A high quality CPM schedule with logic tight enough so that it will provide the correct dates and critical paths during simulation automatically without manual intervention. - A contingency-free estimate of project costs that is loaded on the activities of the schedule. - Resolution of inconsistencies between the cost estimate and schedule that often creep into those documents as project execution proceeds. - Good-quality risk data that are usually collected in risk interviews of the project team, management and others knowledgeable in the risk of the project. The risks from the risk register are used as the basis of the risk data in the risk driver method. The risk driver method is based on the fundamental principle that identifiable risks drive overall cost and schedule risk. - A Monte Carlo simulation software program that can simulate schedule risk, burn rate risk and time-independent resource risk. The results include the standard histograms and cumulative distributions of possible cost and time results for the project. However, by simulating both cost and time simultaneously we can collect the cost-time pairs of results and hence show the scatter diagram ('football chart') that indicates the joint probability of finishing on time and on budget. Also, we can derive the probabilistic cash flow for comparison with the time-phased project budget. Finally, the risks to schedule completion and to cost can be prioritized, say at the P-80 level of confidence, to help focus the risk mitigation efforts. If the cost and schedule estimates including contingency reserves are not acceptable to the project stakeholders, the project team should conduct risk mitigation workshops and studies, deciding which risk mitigation actions to take, and re-run the Monte Carlo simulation to determine the possible improvement to the project's objectives.
Finally, it is recommended that the contingency reserves of cost and of time, calculated at a level that represents an acceptable degree of certainty and uncertainty for the project stakeholders, be added as a resource-loaded activity to the project schedule for strategic planning purposes. The risk analysis described in this paper is correct only for the current plan, represented by the schedule. The project contingency reserve of time and cost that are the main results of this analysis apply if that plan is to be followed. Of course project managers have the option of re-planning and re-scheduling in the face of new facts, in part by mitigating risk. This analysis identifies the high-priority risks to cost and to schedule, which assist the project manager in planning further risk mitigation. Some project managers reject the results and argue that they cannot possibly be so late or so overrun. Those project managers may be wasting an opportunity to mitigate risk and get a more favorable outcome. (authors)
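
    As a toy illustration of the integrated simulation idea (not the RP's procedure, software, or data), the sketch below samples triangular durations for a four-activity CPM network whose costs are driven by time-dependent burn rates, then reports P80 finish and cost. Every activity, duration range, and rate is invented.

        # Monte Carlo over a toy resource-loaded CPM network.
        import random

        # activity: (predecessors, (min, most likely, max) duration in days, burn rate in $/day)
        acts = {
            "design":    ([],                       (20, 30, 50), 4000),
            "procure":   (["design"],               (15, 20, 40), 1000),
            "construct": (["design"],               (40, 60, 90), 8000),
            "test":      (["procure", "construct"], (10, 15, 25), 3000),
        }

        def one_run():
            finish, cost = {}, 0.0
            for name, (preds, (lo, ml, hi), rate) in acts.items():  # insertion order respects the logic
                dur = random.triangular(lo, hi, ml)
                start = max((finish[p] for p in preds), default=0.0)
                finish[name] = start + dur
                cost += dur * rate          # time-dependent (labor-type) cost grows with duration
            return max(finish.values()), cost

        runs = [one_run() for _ in range(10000)]
        durations = sorted(d for d, c in runs)
        costs = sorted(c for d, c in runs)
        p80 = int(0.8 * len(runs))
        print("P80 finish (days):", round(durations[p80], 1))
        print("P80 cost ($):", round(costs[p80]))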

  16. Search Path Mapping: A Versatile Approach for Visualizing Problem-Solving Behavior.

    ERIC Educational Resources Information Center

    Stevens, Ronald H.

    1991-01-01

    Computer-based problem-solving examinations in immunology generate graphic representations of students' search paths, allowing evaluation of how organized and focused their knowledge is, how well their organization relates to critical concepts in immunology, where major misconceptions exist, and whether proper knowledge links exist between content…

  17. Jovan Skerlic as a Life Tutor

    ERIC Educational Resources Information Center

    Mumovic, Ana

    2015-01-01

    The paper studies and illuminates Jovan Skerlic's social function and "criticism anatomy" in his "History of Serbian Literature." The object of analysis is the act and actors of Skerlic's engaged criticism and method relying on facts. It is the path taken a hundred years later by Serbian criticism and literature as culture and…

  18. Autonomous power expert system

    NASA Technical Reports Server (NTRS)

    Ringer, Mark J.; Quinn, Todd M.

    1990-01-01

    The goal of the Autonomous Power System (APS) program is to develop and apply intelligent problem solving and control technologies to the Space Station Freedom Electrical Power Systems (SSF/EPS). The objectives of the program are to establish artificial intelligence/expert system technology paths, to create knowledge based tools with advanced human-operator interfaces, and to integrate and interface knowledge-based and conventional control schemes. This program is being developed at the NASA-Lewis. The APS Brassboard represents a subset of a 20 KHz Space Station Power Management And Distribution (PMAD) testbed. A distributed control scheme is used to manage multiple levels of computers and switchgear. The brassboard is comprised of a set of intelligent switchgear used to effectively switch power from the sources to the loads. The Autonomous Power Expert System (APEX) portion of the APS program integrates a knowledge based fault diagnostic system, a power resource scheduler, and an interface to the APS Brassboard. The system includes knowledge bases for system diagnostics, fault detection and isolation, and recommended actions. The scheduler autonomously assigns start times to the attached loads based on temporal and power constraints. The scheduler is able to work in a near real time environment for both scheduling and dynamic replanning.
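
    To make the load-scheduling idea concrete, here is a deliberately simple greedy sketch that assigns start times so the summed power draw never exceeds a bus limit; the load set, the limit, and the first-fit rule are invented and far simpler than the temporal reasoning and replanning described for APEX.

        # Greedy first-fit assignment of load start times under a power limit.
        def schedule_loads(loads, power_limit, horizon):
            """loads: list of (name, duration, power, earliest_start); times in slots, power in watts."""
            profile = [0.0] * horizon          # scheduled power draw per time slot
            plan = {}
            for name, dur, power, earliest in sorted(loads, key=lambda l: l[3]):
                for start in range(earliest, horizon - dur + 1):
                    if all(p + power <= power_limit for p in profile[start:start + dur]):
                        for t in range(start, start + dur):
                            profile[t] += power
                        plan[name] = start
                        break
                else:
                    plan[name] = None          # could not fit within the horizon
            return plan

        loads = [("heater", 3, 400, 0), ("pump", 2, 300, 0), ("exp1", 4, 500, 1)]
        print(schedule_loads(loads, power_limit=800, horizon=12))   # {'heater': 0, 'pump': 0, 'exp1': 3}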

  19. Autonomous power expert system

    NASA Technical Reports Server (NTRS)

    Ringer, Mark J.; Quinn, Todd M.

    1990-01-01

    The goal of the Autonomous Power System (APS) program is to develop and apply intelligent problem solving and control technologies to the Space Station Freedom Electrical Power Systems (SSF/EPS). The objectives of the program are to establish artificial intelligence/expert system technology paths, to create knowledge based tools with advanced human-operator interfaces, and to integrate and interface knowledge-based and conventional control schemes. This program is being developed at the NASA-Lewis. The APS Brassboard represents a subset of a 20 KHz Space Station Power Management And Distribution (PMAD) testbed. A distributed control scheme is used to manage multiple levels of computers and switchgear. The brassboard is comprised of a set of intelligent switchgear used to effectively switch power from the sources to the loads. The Autonomous Power Expert System (APEX) portion of the APS program integrates a knowledge based fault diagnostic system, a power resource scheduler, and an interface to the APS Brassboard. The system includes knowledge bases for system diagnostics, fault detection and isolation, and recommended actions. The scheduler autonomously assigns start times to the attached loads based on temporal and power constraints. The scheduler is able to work in a near real time environment for both scheduling and dynamic replanning.

  20. Performance Analysis of Stop-Skipping Scheduling Plans in Rail Transit under Time-Dependent Demand

    PubMed Central

    Cao, Zhichao; Yuan, Zhenzhou; Zhang, Silin

    2016-01-01

    Stop-skipping is a key method for alleviating congestion in rail transit, where schedules are sometimes difficult to implement. Several mechanisms have been proposed and analyzed in the literature, but very few performance comparisons are available. This study formulated train choice behavior estimation into the model considering passengers’ perception. If a passenger’s train path can be identified, this information would be useful for improving the stop-skipping schedule service. Multi-performance is a key characteristic of our proposed five stop-skipping schedules, but quantified analysis can be used to illustrate the different effects of well-known deterministic and stochastic forms. Problems in the novel category of forms were justified in the context of a single line rather than transit network. We analyzed four deterministic forms based on the well-known A/B stop-skipping operating strategy. A stochastic form was innovatively modeled as a binary integer programming problem. We present a performance analysis of our proposed model to demonstrate that stop-skipping can feasibly be used to improve the service of passengers and enhance the elasticity of train operations under demand variations along with an explicit parametric discussion. PMID:27420087
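
    As a very rough illustration of treating stop-skipping as binary decisions (the paper's actual model is a richer binary integer program over multiple trains and time-dependent demand), the toy sketch below enumerates skip patterns for a single train and weighs dwell-time savings against the extra wait imposed at skipped stations; every figure is invented.

        # Brute-force search over binary skip decisions for one train.
        from itertools import product

        boarders = [120, 40, 15, 90, 25]   # passengers waiting at each intermediate station
        through = 500                      # riders on board past each stop (assumed constant)
        dwell_saving = 45.0                # seconds of dwell saved per skipped stop
        skip_penalty = 300.0               # extra wait (s) per passenger at a skipped stop

        def net_cost(skip):
            saved = sum(dwell_saving * through for s in skip if s)
            penalty = sum(skip_penalty * b for s, b in zip(skip, boarders) if s)
            return penalty - saved         # negative means a net passenger-time benefit

        best = min(product((0, 1), repeat=len(boarders)), key=net_cost)
        print(best, net_cost(best))        # -> (0, 1, 1, 0, 1) -43500.0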

  1. Performance Analysis of Stop-Skipping Scheduling Plans in Rail Transit under Time-Dependent Demand.

    PubMed

    Cao, Zhichao; Yuan, Zhenzhou; Zhang, Silin

    2016-07-13

    Stop-skipping is a key method for alleviating congestion in rail transit, where schedules are sometimes difficult to implement. Several mechanisms have been proposed and analyzed in the literature, but very few performance comparisons are available. This study formulated train choice behavior estimation into the model considering passengers' perception. If a passenger's train path can be identified, this information would be useful for improving the stop-skipping schedule service. Multi-performance is a key characteristic of our proposed five stop-skipping schedules, but quantified analysis can be used to illustrate the different effects of well-known deterministic and stochastic forms. Problems in the novel category of forms were justified in the context of a single line rather than transit network. We analyzed four deterministic forms based on the well-known A/B stop-skipping operating strategy. A stochastic form was innovatively modeled as a binary integer programming problem. We present a performance analysis of our proposed model to demonstrate that stop-skipping can feasibly be used to improve the service of passengers and enhance the elasticity of train operations under demand variations along with an explicit parametric discussion.

  2. Assist-as-needed path control for the PASCAL rehabilitation robot.

    PubMed

    Keller, Urs; Rauter, Georg; Riener, Robert

    2013-06-01

    Adults and children with neurological disorders often require rehabilitation therapy to improve their arm motor functions. Complementary to conventional therapy, robotic therapy can be applied. Such robots should support arm movements while assisting only as much as needed to ensure an active participation of the patient. Different control strategies are known to provide arm support to the patient. The path controller is a strategy that helps the patient's arm to stay close to a given path while allowing for temporal and spatial freedom. In this paper, an assist-as-needed path controller is presented that is implemented in the end-effector-based robot PASCAL, which was designed for children with cerebral palsy. The new control approach is a combination of an existing path controller with additional speed restrictions to support the movement when the arm speed is too slow and to resist it when the speed is too fast. Furthermore, target-position gain scheduling is introduced in order to reach a target position with a predefined precision, along with an adaptable direction-dependent supportive flux that assists along the path. These path control features were preliminarily tested with a healthy adult volunteer in different conditions. The presented controller covers the range from a completely passive user, who needs full support, to an actively performed movement that needs no assistance. In the near future, the controller is planned to be used to enable reaching in children as well as in adults and to help increase the intensity of the rehabilitation therapy by assisting the hand movement and by provoking an active participation.
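
    A heavily simplified sketch of the underlying idea is given below: a spring-like term pulls the hand toward the reference path, while a tangential term supports motion slower than a lower speed bound and resists motion faster than an upper bound. The gains, speed bounds, and 2D geometry are invented for illustration; the actual PASCAL controller also includes the gain scheduling and supportive flux described in the abstract.

    ```python
    # Simplified, hypothetical version of the assist-as-needed idea. Gains and
    # speed bounds are invented placeholders, not values from the PASCAL robot.
    import numpy as np

    def assist_as_needed_force(pos, vel, path_point, path_tangent,
                               k_path=60.0, k_speed=15.0, v_min=0.05, v_max=0.30):
        normal_error = path_point - pos                 # pull back toward the path
        f_normal = k_path * normal_error
        v_along = float(np.dot(vel, path_tangent))      # speed along the path
        if v_along < v_min:                             # too slow: push forward
            f_tangent = k_speed * (v_min - v_along) * path_tangent
        elif v_along > v_max:                           # too fast: brake
            f_tangent = k_speed * (v_max - v_along) * path_tangent
        else:
            f_tangent = np.zeros(2)                     # inside the free band
        return f_normal + f_tangent

    print(assist_as_needed_force(np.array([0.02, 0.0]), np.array([0.01, 0.0]),
                                 np.array([0.0, 0.0]), np.array([0.0, 1.0])))
    ```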

  3. Cooperative Surveillance and Pursuit Using Unmanned Aerial Vehicles and Unattended Ground Sensors

    PubMed Central

    Las Fargeas, Jonathan; Kabamba, Pierre; Girard, Anouck

    2015-01-01

    This paper considers the problem of path planning for a team of unmanned aerial vehicles performing surveillance near a friendly base. The unmanned aerial vehicles do not possess sensors with automated target recognition capability and, thus, rely on communicating with unattended ground sensors placed on roads to detect and image potential intruders. The problem is motivated by persistent intelligence, surveillance, reconnaissance and base defense missions. The problem is formulated and shown to be intractable. A heuristic algorithm to coordinate the unmanned aerial vehicles during surveillance and pursuit is presented. Revisit deadlines are used to schedule the vehicles' paths nominally. The algorithm uses detections from the sensors to predict intruders' locations and selects the vehicles' paths by minimizing a linear combination of missed deadlines and the probability of not intercepting intruders. An analysis of the algorithm's completeness and complexity is then provided. The effectiveness of the heuristic is illustrated through simulations in a variety of scenarios. PMID:25591168
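
    As a minimal illustration of deadline-driven nominal path selection, the sketch below sends each idle vehicle to the ground sensor whose revisit deadline has the least slack after accounting for travel time. The positions, deadlines, and the greedy rule are assumptions for illustration; the paper's heuristic additionally balances missed deadlines against the probability of failing to intercept intruders.

    ```python
    # Earliest-deadline-first sketch of nominal revisit scheduling with toy data.
    import math

    def assign_revisits(uavs, sensors, speed=1.0):
        """uavs: {name: (x, y)}; sensors: {name: ((x, y), deadline)}."""
        assignments = {}
        pending = dict(sensors)
        for uav, upos in uavs.items():
            if not pending:
                break
            # pick the sensor with the least slack = deadline - travel time
            def slack(item):
                (spos, deadline) = item[1]
                travel = math.dist(upos, spos) / speed
                return deadline - travel
            name, (spos, deadline) = min(pending.items(), key=slack)
            assignments[uav] = name
            del pending[name]
        return assignments

    uavs = {"uav1": (0.0, 0.0), "uav2": (5.0, 5.0)}
    sensors = {"s1": ((1.0, 1.0), 4.0), "s2": ((6.0, 4.0), 3.0), "s3": ((2.0, 8.0), 10.0)}
    print(assign_revisits(uavs, sensors))
    ```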

  4. CQPSO scheduling algorithm for heterogeneous multi-core DAG task model

    NASA Astrophysics Data System (ADS)

    Zhai, Wenzheng; Hu, Yue-Li; Ran, Feng

    2017-07-01

    Efficient task scheduling is critical to achieve high performance in a heterogeneous multi-core computing environment. The paper focuses on the heterogeneous multi-core directed acyclic graph (DAG) task model and proposes a novel task scheduling method based on an improved chaotic quantum-behaved particle swarm optimization (CQPSO) algorithm. A task priority scheduling list was built. A processor with minimum cumulative earliest finish time (EFT) acted as the object of the first task assignment. The task precedence relationships were satisfied and the total execution time of all tasks was minimized. The experimental results show that the proposed algorithm has the advantages of strong optimization ability, simplicity and feasibility, and fast convergence, and can be applied to task scheduling optimization in other heterogeneous and distributed environments.
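
    The EFT-based assignment step mentioned above can be sketched as a small list-scheduling routine: walk the priority list and place each task on whichever core yields the earliest finish time, honoring precedence and a cross-core communication delay. The task graph, execution costs, and delay below are invented, and the CQPSO search that optimizes the priority list is not reproduced here.

    ```python
    # Minimal earliest-finish-time (EFT) list scheduling for a DAG on two
    # heterogeneous cores; illustrative data only.
    tasks = ["t1", "t2", "t3", "t4"]                  # already in priority order
    deps  = {"t1": [], "t2": ["t1"], "t3": ["t1"], "t4": ["t2", "t3"]}
    cost  = {  # execution time of each task on each core
        "t1": [3, 5], "t2": [4, 2], "t3": [6, 3], "t4": [2, 2],
    }
    comm = 1.0   # communication delay if a dependency crosses cores

    core_ready = [0.0, 0.0]       # when each core becomes free
    finish, placed = {}, {}

    for t in tasks:
        best = None
        for core in (0, 1):
            ready = core_ready[core]
            for d in deps[t]:
                arrival = finish[d] + (0.0 if placed[d] == core else comm)
                ready = max(ready, arrival)
            eft = ready + cost[t][core]
            if best is None or eft < best[0]:
                best = (eft, core)
        finish[t], placed[t] = best[0], best[1]
        core_ready[best[1]] = best[0]

    print(finish, placed)   # makespan = finish time of the last task
    ```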

  5. Uncertainty analysis of an irrigation scheduling model for water management in crop production

    USDA-ARS?s Scientific Manuscript database

    Irrigation scheduling tools are critical to allow producers to manage water resources for crop production in an accurate and timely manner. To be useful, these tools need to be accurate, complete, and relatively reliable. The current work presents the uncertainty analysis and its results for the Mis...

  6. Cooperative path planning for multi-USV based on improved artificial bee colony algorithm

    NASA Astrophysics Data System (ADS)

    Cao, Lu; Chen, Qiwei

    2018-03-01

    Due to the complex constraints, many uncertain factors and critical real-time demands of path planning for multiple unmanned surface vehicles (multi-USV), an improved artificial bee colony (I-ABC) algorithm is proposed to solve the model of cooperative path planning for multi-USV. First, the Voronoi diagram of the battlefield space is conceived to generate the optimal area of USV paths. Then the chaotic searching algorithm is used to initialize the collection of paths, which is regarded as the food sources of the ABC algorithm. With limited data, the initial collection can search the optimal area of paths perfectly. Finally, simulations of multi-USV path planning under various threats have been carried out. Simulation results verify that the I-ABC algorithm can improve the diversity of nectar sources and the convergence rate of the algorithm. It can increase the adaptability of USVs to dynamic battlefields and unexpected threats.
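
    Chaotic initialization of the food sources can be illustrated with a logistic map, a common choice in chaos-enhanced ABC variants: the map's values are stretched onto the waypoint bounds so the initial paths are spread widely. The dimensions, bounds, and seed below are placeholders and are not taken from the paper.

    ```python
    # Sketch of chaotic (logistic-map) initialisation of candidate paths.
    def logistic_sequence(x0, n, r=4.0):
        xs, x = [], x0
        for _ in range(n):
            x = r * x * (1.0 - x)
            xs.append(x)
        return xs

    def init_food_sources(n_sources, n_waypoints, low, high, seed=0.37):
        chaos = logistic_sequence(seed, n_sources * n_waypoints)
        sources, k = [], 0
        for _ in range(n_sources):
            path = []
            for _ in range(n_waypoints):
                path.append(low + (high - low) * chaos[k])   # map [0,1] into bounds
                k += 1
            sources.append(path)
        return sources

    for path in init_food_sources(n_sources=4, n_waypoints=3, low=-10.0, high=10.0):
        print([round(w, 2) for w in path])
    ```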

  7. A Novel Dual Separate Paths (DSP) Algorithm Providing Fault-Tolerant Communication for Wireless Sensor Networks.

    PubMed

    Tien, Nguyen Xuan; Kim, Semog; Rhee, Jong Myung; Park, Sang Yoon

    2017-07-25

    Fault tolerance has long been a major concern for sensor communications in fault-tolerant cyber physical systems (CPSs). Network failure problems often occur in wireless sensor networks (WSNs) due to various factors such as the insufficient power of sensor nodes, the dislocation of sensor nodes, the unstable state of wireless links, and unpredictable environmental interference. Fault tolerance is thus one of the key requirements for data communications in WSN applications. This paper proposes a novel path redundancy-based algorithm, called dual separate paths (DSP), that provides fault-tolerant communication with the improvement of the network traffic performance for WSN applications, such as fault-tolerant CPSs. The proposed DSP algorithm establishes two separate paths between a source and a destination in a network based on the network topology information. These paths are node-disjoint paths and have optimal path distances. Unicast frames are delivered from the source to the destination in the network through the dual paths, providing fault-tolerant communication and reducing redundant unicast traffic for the network. The DSP algorithm can be applied to wired and wireless networks, such as WSNs, to provide seamless fault-tolerant communication for mission-critical and life-critical applications such as fault-tolerant CPSs. The analyzed and simulated results show that the DSP-based approach not only provides fault-tolerant communication, but also improves network traffic performance. For the case study in this paper, when the DSP algorithm was applied to high-availability seamless redundancy (HSR) networks, the proposed DSP-based approach reduced the network traffic by 80% to 88% compared with the standard HSR protocol, thus improving network traffic performance.
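
    A compact way to see the node-disjoint idea is the two-pass search below: find one shortest path by breadth-first search, ban its interior nodes, and search again. This is only an illustrative heuristic; the published DSP algorithm additionally guarantees that the two disjoint paths have optimal distances.

    ```python
    # Illustrative two-pass heuristic for two node-disjoint source-destination paths.
    from collections import deque

    def bfs_path(adj, src, dst, banned=frozenset()):
        prev, queue = {src: None}, deque([src])
        while queue:
            u = queue.popleft()
            if u == dst:
                path = []
                while u is not None:
                    path.append(u)
                    u = prev[u]
                return path[::-1]
            for v in adj.get(u, []):
                if v not in prev and v not in banned:
                    prev[v] = u
                    queue.append(v)
        return None

    def dual_separate_paths(adj, src, dst):
        first = bfs_path(adj, src, dst)
        if first is None:
            return None, None
        interior = set(first[1:-1])            # forbid reuse of intermediate nodes
        second = bfs_path(adj, src, dst, banned=interior)
        return first, second

    adj = {"S": ["A", "C"], "A": ["S", "B"], "B": ["A", "D"],
           "C": ["S", "D"], "D": ["B", "C"]}
    print(dual_separate_paths(adj, "S", "D"))
    ```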

  8. Life Cycle Cost Growth Study for the Discovery and New Frontiers Program Office

    NASA Technical Reports Server (NTRS)

    Barley, Bryan; Gilbert, Paul; Newhouse, Marilyn

    2010-01-01

    The D&NF Program Office LCC Management Study provides a detailed look at the drivers underlying cost overruns and schedule delays for five D&NF missions. While none of the findings are new, the study underlines the importance of continued emphasis on sound project management techniques: a clean project management structure with a clear definition of roles and responsibilities across the various partners in a project, an understanding of institutional standards and procedures and any differences among the partners, and the critical need for a comprehensive IMS that can be used easily and routinely to identify potential threats to the critical path. The study also highlights the continuing need for realistic estimates of the total LCC. Sufficient time and resources must be allocated early in a project to ensure that the appropriate trade studies and analyses are performed across all aspects of a mission: spacecraft, ground system, operations concept, and fault management, to ensure that proposed and confirmed costs truly reflect the resource requirements over the entire mission life cycle. These studies need to include a realistic review of the assumptions underlying the use of new technologies, the integration of heritage and new hardware and software into the total mission environment, and any development and test savings based on heritage technology and lessons learned. Finally, the LCC Management Study stresses the need to listen to, carefully consider, and take positive action regarding the issues raised during reviews by the expert review teams.

  9. Unification theory of optimal life histories and linear demographic models in internal stochasticity.

    PubMed

    Oizumi, Ryo

    2014-01-01

    Life history of organisms is exposed to uncertainty generated by internal and external stochasticities. Internal stochasticity is generated by the randomness in each individual life history, such as randomness in food intake, genetic character and size growth rate, whereas external stochasticity is due to the environment. For instance, it is known that the external stochasticity tends to affect population growth rate negatively. It has been shown in a recent theoretical study using path-integral formulation in structured linear demographic models that internal stochasticity can affect population growth rate positively or negatively. However, internal stochasticity has not been the main subject of researches. Taking account of effect of internal stochasticity on the population growth rate, the fittest organism has the optimal control of life history affected by the stochasticity in the habitat. The study of this control is known as the optimal life schedule problems. In order to analyze the optimal control under internal stochasticity, we need to make use of "Stochastic Control Theory" in the optimal life schedule problem. There is, however, no such kind of theory unifying optimal life history and internal stochasticity. This study focuses on an extension of optimal life schedule problems to unify control theory of internal stochasticity into linear demographic models. First, we show the relationship between the general age-states linear demographic models and the stochastic control theory via several mathematical formulations, such as path-integral, integral equation, and transition matrix. Secondly, we apply our theory to a two-resource utilization model for two different breeding systems: semelparity and iteroparity. Finally, we show that the diversity of resources is important for species in a case. Our study shows that this unification theory can address risk hedges of life history in general age-states linear demographic models.

  10. Time-Critical Cooperative Path Following of Multiple Unmanned Aerial Vehicles over Time-Varying Networks

    DTIC Science & Technology

    2013-03-01

  11. Education through Movies: Improving Teaching Skills and Fostering Reflection among Students and Teachers

    ERIC Educational Resources Information Center

    Blasco, Pablo Gonzalez; Moreto, Graziela; Blasco, Mariluz González; Levites, Marcelo Rozenfeld; Janaudis, Marco Aurelio

    2015-01-01

    Learning through aesthetics--in which cinema is included--stimulates learner reflection. As emotions play key roles in learning attitudes and changing behavior, teachers must impact learners' affective domain. Since feelings exist before concepts, the affective path is a critical path to the rational process of learning. Cinema is the audiovisual…

  12. Wells Fargo Innovation Incubator (IN²) | NREL

    Science.gov Websites

    The Wells Fargo Innovation Incubator (IN2) is now a $30 million program supporting innovative technologies and innovators. IN2 works with the buildings portfolio of Wells Fargo to help companies de-risk their technologies and ease their path to market, helping them meet critical validation milestones along the way.

  13. 78 FR 5817 - Detecting and Evaluating Drug-Induced Liver Injury; What's Normal, What's Not, and What Should We...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-28

    ... with the Critical Path Institute (C-Path) and the Pharmaceutical Research and Manufacturers of America. Its purpose is to discuss, debate, and build consensus among stakeholders in the pharmaceutical... philanthropic support from the southern Arizona community, Science Foundation Arizona, and FDA. The...

  14. Disfluencies along the Garden Path: Brain Electrophysiological Evidence of Disrupted Sentence Processing

    ERIC Educational Resources Information Center

    Maxfield, Nathan D.; Lyon, Justine M.; Silliman, Elaine R.

    2009-01-01

    Bailey and Ferreira (2003) hypothesized and reported behavioral evidence that disfluencies (filled and silent pauses) undesirably affect sentence processing when they appear before disambiguating verbs in Garden Path (GP) sentences. Disfluencies here cause the parser to "linger" on, and apparently accept as correct, an erroneous parse. Critically,…

  15. Integrating Critical Thinking Instruction and Assessment into Online University Courses: An Action Research Study

    ERIC Educational Resources Information Center

    Mason Heinrichs, Kim R.

    2016-01-01

    Universities claim that improved critical thinking ability is an educational outcome for their graduates, but they seldom create a path for students to achieve that outcome. In this practitioner action research study, the author created a job aid, entitled "Critical Thinking as a Differentiator for Distinguished Performance," to help…

  16. A hybrid online scheduling mechanism with revision and progressive techniques for autonomous Earth observation satellite

    NASA Astrophysics Data System (ADS)

    Li, Guoliang; Xing, Lining; Chen, Yingwu

    2017-11-01

    The autonomy of self-scheduling on Earth observation satellites and the increasing scale of satellite networks have attracted much attention from researchers in recent decades. In reality, the limited onboard computational resource presents a challenge for the online scheduling algorithm. This study considered the online scheduling problem for a single autonomous Earth observation satellite within a satellite network environment. It especially addressed the case in which urgent tasks arrive stochastically during the scheduling horizon. We described the problem and proposed a hybrid online scheduling mechanism with revision and progressive techniques to solve this problem. The mechanism includes two decision policies, a when-to-schedule policy combining periodic scheduling and critical cumulative number-based event-driven rescheduling, and a how-to-schedule policy combining progressive and revision approaches to accommodate two categories of task: normal tasks and urgent tasks. Thus, we developed two heuristic (re)scheduling algorithms and compared them with other generally used techniques. Computational experiments indicated that the into-scheduling percentage of urgent tasks in the proposed mechanism is much higher than that in the periodic scheduling mechanism, and the specific performance is highly dependent on some mechanism-relevant and task-relevant factors. For the online scheduling, the modified weighted shortest imaging time first and dynamic profit system benefit heuristics outperformed the others on total profit and the percentage of successfully scheduled urgent tasks.
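
    The when-to-schedule decision can be sketched as a small trigger object: replan either when a fixed period has elapsed or as soon as the count of accumulated urgent arrivals reaches a critical value. The period and threshold values below are placeholders, not figures from the paper.

    ```python
    # Sketch of a hybrid "when-to-schedule" trigger: periodic replanning plus
    # event-driven replanning once enough urgent tasks have accumulated.
    class WhenToSchedulePolicy:
        def __init__(self, period=600.0, critical_count=3):
            self.period = period                  # seconds between periodic replans
            self.critical_count = critical_count  # urgent arrivals that force a replan
            self.last_replan = 0.0
            self.pending_urgent = 0

        def on_urgent_arrival(self):
            self.pending_urgent += 1

        def should_reschedule(self, now):
            if now - self.last_replan >= self.period:
                return True, "periodic"
            if self.pending_urgent >= self.critical_count:
                return True, "event-driven"
            return False, None

        def mark_rescheduled(self, now):
            self.last_replan = now
            self.pending_urgent = 0

    policy = WhenToSchedulePolicy()
    policy.on_urgent_arrival(); policy.on_urgent_arrival(); policy.on_urgent_arrival()
    print(policy.should_reschedule(now=120.0))   # event-driven trigger fires
    ```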

  17. Effects of Napping During Shift Work on Sleepiness and Performance in Emergency Medical Services Personnel and Similar Shift Workers: A Systematic Review and Meta-Analysis

    DOT National Transportation Integrated Search

    2018-01-11

    Background: Scheduled napping during work shifts may be an effective way to mitigate fatigue-related risk. This study aimed to critically review and synthesize existing literature on the impact of scheduled naps on fatigue-related outcomes for EMS pe...

  18. Scheduling viability tests for seeds in long-term storage based on a Bayesian Multi-Level Model

    USDA-ARS?s Scientific Manuscript database

    Genebank managers conduct viability tests on stored seeds so they can replace lots that have viability near a critical threshold, such as 50 or 85% germination. Currently, these tests are typically scheduled at uniform intervals; testing every 5 years is common. A manager needs to balance the cost...

  19. T-L Plane Abstraction-Based Energy-Efficient Real-Time Scheduling for Multi-Core Wireless Sensors.

    PubMed

    Kim, Youngmin; Lee, Ki-Seong; Pham, Ngoc-Son; Lee, Sun-Ro; Lee, Chan-Gun

    2016-07-08

    Energy efficiency is considered as a critical requirement for wireless sensor networks. As more wireless sensor nodes are equipped with multi-cores, there are emerging needs for energy-efficient real-time scheduling algorithms. The T-L plane-based scheme is known to be an optimal global scheduling technique for periodic real-time tasks on multi-cores. Unfortunately, there has been a scarcity of studies on extending T-L plane-based scheduling algorithms to exploit energy-saving techniques. In this paper, we propose a new T-L plane-based algorithm enabling energy-efficient real-time scheduling on multi-core sensor nodes with dynamic power management (DPM). Our approach addresses the overhead of processor mode transitions and reduces fragmentations of the idle time, which are inherent in T-L plane-based algorithms. Our experimental results show the effectiveness of the proposed algorithm compared to other energy-aware scheduling methods on T-L plane abstraction.
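
    The concern about mode-transition overhead and fragmented idle time can be made concrete with a standard break-even test: a core should enter a low-power state only if the predicted idle interval is long enough for the energy saved to outweigh the transition cost. The power and transition-energy figures below are invented, and this is the underlying DPM trade-off rather than the paper's algorithm.

    ```python
    # Break-even test for dynamic power management (DPM) mode transitions.
    def break_even_time(p_active_idle, p_sleep, e_transition):
        """Idle length above which sleeping saves energy."""
        return e_transition / (p_active_idle - p_sleep)

    def choose_power_state(predicted_idle, p_active_idle=0.40, p_sleep=0.02,
                           e_transition=0.15):
        t_be = break_even_time(p_active_idle, p_sleep, e_transition)
        return "sleep" if predicted_idle > t_be else "stay-active", t_be

    print(choose_power_state(predicted_idle=0.2))   # short gap: stay active
    print(choose_power_state(predicted_idle=1.5))   # long gap: worth sleeping
    ```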

  20. A Simulation Based Approach to Optimize Berth Throughput Under Uncertainty at Marine Container Terminals

    NASA Technical Reports Server (NTRS)

    Golias, Mihalis M.

    2011-01-01

    Berth scheduling is a critical function at marine container terminals and determining the best berth schedule depends on several factors including the type and function of the port, size of the port, location, nearby competition, and type of contractual agreement between the terminal and the carriers. In this paper we formulate the berth scheduling problem as a bi-objective mixed-integer problem with the objective to maximize customer satisfaction and reliability of the berth schedule under the assumption that vessel handling times are stochastic parameters following a discrete and known probability distribution. A combination of an exact algorithm, a Genetic Algorithms based heuristic and a simulation post-Pareto analysis is proposed as the solution approach to the resulting problem. Based on a number of experiments it is concluded that the proposed berth scheduling policy outperforms the berth scheduling policy where reliability is not considered.
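
    The simulation component of such an approach can be sketched as a Monte Carlo evaluation of one candidate berth order: sample the stochastic handling times, then estimate the expected total waiting time and the probability that all vessels depart by their target times as a stand-in for schedule reliability. The vessel data and distributions below are invented for illustration.

    ```python
    # Monte Carlo evaluation of a single-berth service order under stochastic
    # handling times; all data are toy values.
    import random

    vessels = {  # arrival time, target departure, discrete handling-time distribution
        "v1": {"arrival": 0, "target": 10, "handling": [(6, 0.7), (9, 0.3)]},
        "v2": {"arrival": 2, "target": 14, "handling": [(4, 0.5), (7, 0.5)]},
        "v3": {"arrival": 5, "target": 20, "handling": [(5, 0.6), (8, 0.4)]},
    }

    def sample_handling(dist):
        r, acc = random.random(), 0.0
        for value, prob in dist:
            acc += prob
            if r <= acc:
                return value
        return dist[-1][0]

    def evaluate(order, runs=5000):
        total_wait, reliable = 0.0, 0
        for _ in range(runs):
            t, wait, on_time = 0.0, 0.0, True
            for name in order:
                v = vessels[name]
                start = max(t, v["arrival"])
                wait += start - v["arrival"]
                t = start + sample_handling(v["handling"])
                on_time &= t <= v["target"]
            total_wait += wait
            reliable += on_time
        return total_wait / runs, reliable / runs

    print(evaluate(["v1", "v2", "v3"]))   # (expected waiting, reliability estimate)
    ```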

  1. A distributed agent architecture for real-time knowledge-based systems: Real-time expert systems project, phase 1

    NASA Technical Reports Server (NTRS)

    Lee, S. Daniel

    1990-01-01

    We propose a distributed agent architecture (DAA) that can support a variety of paradigms based on both traditional real-time computing and artificial intelligence. DAA consists of distributed agents that are classified into two categories: reactive and cognitive. Reactive agents can be implemented directly in Ada to meet hard real-time requirements and be deployed on on-board embedded processors. A traditional real-time computing methodology under consideration is the rate monotonic theory that can guarantee schedulability based on analytical methods. AI techniques under consideration for reactive agents are approximate or anytime reasoning that can be implemented using Bayesian belief networks as in Guardian. Cognitive agents are traditional expert systems that can be implemented in ART-Ada to meet soft real-time requirements. During the initial design of cognitive agents, it is critical to consider the migration path that would allow initial deployment on ground-based workstations with eventual deployment on on-board processors. ART-Ada technology enables this migration while Lisp-based technologies make it difficult if not impossible. In addition to reactive and cognitive agents, a meta-level agent would be needed to coordinate multiple agents and to provide meta-level control.

  2. STS-2 - SOFTWARE INTEGRATION TESTS (SIT) - KSC

    NASA Image and Video Library

    1981-09-01

    S81-36331 (24 Aug. 1981) --- Astronauts Joe H. Engle, left, and Richard H. Truly pause before participating in the integrated test of the assembled space shuttle components scheduled for launch no earlier than Sept. 30, 1981. Moments later, Engle, STS-2 crew commander, and Truly, pilot, entered the cabin of the orbiter Columbia for a mission simulation. The shuttle integrated tests (SIT) are designed to check out every connection and signal path in the STS-2 vehicle composed of the orbiter, two solid rocket boosters (SRB) and an external fuel tank (ET) for Columbia's main engines. Completion of the tests will clear the way for preparations for rollout to Pad A at Launch Complex 39, scheduled for the latter part of August or early September. Photo credit: NASA

  3. Using Planning, Scheduling and Execution for Autonomous Mars Rover Operations

    NASA Technical Reports Server (NTRS)

    Estlin, Tara A.; Gaines, Daniel M.; Chouinard, Caroline M.; Fisher, Forest W.; Castano, Rebecca; Judd, Michele J.; Nesnas, Issa A.

    2006-01-01

    With each new rover mission to Mars, rovers are traveling significantly longer distances. This distance increase raises not only the opportunities for science data collection, but also amplifies the amount of environment and rover state uncertainty that must be handled in rover operations. This paper describes how planning, scheduling and execution techniques can be used onboard a rover to autonomously generate and execute rover activities and in particular to handle new science opportunities that have been identified dynamically. We also discuss some of the particular challenges we face in supporting autonomous rover decision-making. These include interaction with rover navigation and path-planning software and handling large amounts of uncertainty in state and resource estimations. Finally, we describe our experiences in testing this work using several Mars rover prototypes in a realistic environment.

  4. Comparison of OPC job prioritization schemes to generate data for mask manufacturing

    NASA Astrophysics Data System (ADS)

    Lewis, Travis; Veeraraghavan, Vijay; Jantzen, Kenneth; Kim, Stephen; Park, Minyoung; Russell, Gordon; Simmons, Mark

    2015-03-01

    Delivering mask ready OPC corrected data to the mask shop on-time is critical for a foundry to meet the cycle time commitment for a new product. With current OPC compute resource sharing technology, different job scheduling algorithms are possible, such as priority-based resource allocation and fair-share resource allocation. In order to maximize computer cluster efficiency, minimize the cost of the data processing and deliver data on schedule, the trade-offs of each scheduling algorithm need to be understood. Using actual production jobs, each of the scheduling algorithms will be tested in a production tape-out environment. Each scheduling algorithm will be judged on its ability to deliver data on schedule and the trade-offs associated with each method will be analyzed. It is now possible to introduce advanced scheduling algorithms to the OPC data processing environment to meet the goals of on-time delivery of mask ready OPC data while maximizing efficiency and reducing cost.
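
    The two allocation policies can be contrasted with the toy routine below, which splits a fixed pool of cores among pending jobs either strictly by priority or in proportion to configured share weights. Job names, demands, priorities, and shares are invented; production schedulers also handle preemption, queueing, and job aging.

    ```python
    # Toy comparison of priority-based vs. fair-share allocation of compute cores.
    def priority_allocation(jobs, total_cores):
        """Give higher-priority jobs everything they ask for first."""
        alloc, free = {}, total_cores
        for job in sorted(jobs, key=lambda j: j["priority"], reverse=True):
            alloc[job["name"]] = min(job["demand"], free)
            free -= alloc[job["name"]]
        return alloc

    def fair_share_allocation(jobs, total_cores):
        """Split cores in proportion to each job's configured share weight."""
        total_weight = sum(j["share"] for j in jobs)
        return {j["name"]: min(j["demand"],
                               int(total_cores * j["share"] / total_weight))
                for j in jobs}

    jobs = [
        {"name": "hot-lot mask",  "demand": 600, "priority": 9, "share": 3},
        {"name": "standard mask", "demand": 400, "priority": 5, "share": 2},
        {"name": "re-run",        "demand": 300, "priority": 2, "share": 1},
    ]
    print(priority_allocation(jobs, total_cores=800))
    print(fair_share_allocation(jobs, total_cores=800))
    ```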

  5. Discrete-Event-Dynamic-System-Based Approaches for Control in Integrated Voice/Data Multihop Radio Networks.

    DTIC Science & Technology

    1994-12-07

    The set Ci is such that Ci ⊆ A and, in general, Ci ∩ Cj ≠ ∅ for i ≠ j. [...] Let θ be the M-dimensional slot assignment probability vector [θ1, ..., θM]T and Wi(θ) the expected node i waiting time. Our objective is to determine... (Figure residue omitted: nominal sample path over busy periods m and m+1; Figure 2b, "(2,m) Phantom Slot Sample Path".)

  6. Nuclear Weapons: NNSA Has a New Approach to Managing the B61-12 Life Extension, but a Constrained Schedule and Other Risks Remain

    DTIC Science & Technology

    2016-02-01

    components. In 2010, they began an LEP to consolidate four versions of a legacy nuclear weapon, the B61 bomb, into a bomb called the B61-12 (see...). ...are versions of the B61 bomb, an aircraft-delivered weapon that is a key component of the United States' commitments to the North Atlantic Treaty

  7. Special Air Missions: A Path to the 21st Century

    DTIC Science & Technology

    1997-03-01

    C2 standpoint, this physical dislocation from the parent command maintains the personal service currently in place at each of the scheduling agencies...

  8. The Search for Life Beyond Earth

    NASA Image and Video Library

    2014-07-14

    NASA Administrator Charles Bolden delivers opening remarks during a panel discussion on the search for life beyond Earth in the James E. Webb Auditorium at NASA Headquarters on Monday, July 14, 2014 in Washington, DC. The panel discussed how NASA's space-based observatories are making new discoveries and how the agency's new telescope, the James Webb Space Telescope, will continue this path of discovery after its scheduled launch in 2018. Photo Credit: (NASA/Joel Kowsky)

  9. Scheduling multimedia services in cloud computing environment

    NASA Astrophysics Data System (ADS)

    Liu, Yunchang; Li, Chunlin; Luo, Youlong; Shao, Yanling; Zhang, Jing

    2018-02-01

    Currently, security is a critical factor for multimedia services running in the cloud computing environment. As an effective mechanism, trust can improve security level and mitigate attacks within cloud computing environments. Unfortunately, existing scheduling strategies for multimedia services in the cloud computing environment do not integrate a trust mechanism when making scheduling decisions. In this paper, we propose a scheduling scheme for multimedia services in multiple clouds. First, a novel scheduling architecture is presented. Then, we build a trust model including both subjective trust and objective trust to evaluate the trust degree of multimedia service providers. By employing Bayesian theory, the subjective trust degree between multimedia service providers and users is obtained. According to the attributes of QoS, the objective trust degree of multimedia service providers is calculated. Finally, a scheduling algorithm integrating the trust of entities is proposed by considering the deadline, cost and trust requirements of multimedia services. The scheduling algorithm heuristically searches for reasonable resource allocations that satisfy the trust requirement and meet the deadlines of the multimedia services. Detailed simulation experiments demonstrate the effectiveness and feasibility of the proposed trust scheduling scheme.
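
    A minimal sketch of the selection idea follows: subjective trust is taken as the mean of a Beta posterior over past interaction outcomes, objective trust as a measured QoS score, and the provider with the highest weighted combination that still meets the task's deadline and budget is chosen. The provider data, weights, and prior are assumptions for illustration, not the paper's model.

    ```python
    # Trust-aware provider selection sketch with invented data and weights.
    providers = {
        "cloudA": {"success": 48, "failure": 2, "qos": 0.90, "time": 40, "cost": 8},
        "cloudB": {"success": 30, "failure": 10, "qos": 0.95, "time": 25, "cost": 12},
        "cloudC": {"success": 5,  "failure": 5,  "qos": 0.70, "time": 20, "cost": 5},
    }

    def subjective_trust(success, failure, alpha=1.0, beta=1.0):
        return (success + alpha) / (success + failure + alpha + beta)  # Beta mean

    def combined_trust(p, w_subj=0.6):
        return w_subj * subjective_trust(p["success"], p["failure"]) \
             + (1.0 - w_subj) * p["qos"]

    def select_provider(deadline, budget):
        feasible = {name: p for name, p in providers.items()
                    if p["time"] <= deadline and p["cost"] <= budget}
        if not feasible:
            return None
        return max(feasible, key=lambda n: combined_trust(feasible[n]))

    print(select_provider(deadline=45, budget=10))
    ```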

  10. Software-based data path for raster-scanned multi-beam mask lithography

    NASA Astrophysics Data System (ADS)

    Rajagopalan, Archana; Agarwal, Ankita; Buck, Peter; Geller, Paul; Hamaker, H. Christopher; Rao, Nagswara

    2016-10-01

    According to the 2013 SEMATECH Mask Industry Survey, roughly half of all photomasks are produced using laser mask pattern generator ("LMPG") lithography. LMPG lithography can be used for all layers at mature technology nodes, and for many non-critical and semi-critical masks at advanced nodes. The extensive use of multi-patterning at the 14-nm node significantly increases the number of critical mask layers, and the transition in wafer lithography from positive tone resist to negative tone resist at the 14-nm design node enables the switch from advanced binary masks back to attenuated phase shifting masks that require second level writes to remove unwanted chrome. LMPG lithography is typically used for second level writes due to its high productivity, absence of charging effects, and versatile non-actinic alignment capability. As multi-patterning use expands from double to triple patterning and beyond, the number of LMPG second level writes increases correspondingly. The desire to reserve the limited capacity of advanced electron beam writers for use when essential is another factor driving the demand for LMPG capacity. The increasing demand for cost-effective productivity has kept most of the laser mask writers ever manufactured running in production, sometimes long past their projected lifespan, and new writers continue to be built based on hardware developed some years ago. The data path is a case in point. While state-of-the-art when first introduced, hardware-based data path systems are difficult to modify or add new features to meet the changing requirements of the market. As data volumes increase, design styles change, and new uses are found for laser writers, it is useful to consider a replacement for this critical subsystem. The availability of low-cost, high-performance, distributed computer systems combined with highly scalable EDA software lends itself well to creating an advanced data path system. EDA software, in routine production today, scales well to hundreds or even thousands of CPU-cores, offering the potential for virtually unlimited capacity. Features available in EDA software such as sizing, scaling, tone reversal, OPC, MPC, rasterization, and others are easily adapted to the requirements of a data path system. This paper presents the motivation, requirements, design and performance of an advanced, scalable software data path system suitable to support multi-beam laser mask lithography.

  11. Report of the Cost Assessment and Validation Task Force on the International Space Station

    NASA Technical Reports Server (NTRS)

    1998-01-01

    The Cost Assessment and Validation (CAV) Task Force was established for independent review and assessment of cost, schedule and partnership performance on the International Space Station (ISS) Program. The CAV Task Force has made the following key findings: The International Space Station Program has made notable and reasonable progress over the past four years in defining and executing a very challenging and technically complex effort. The Program size, complexity, and ambitious schedule goals were beyond that which could be reasonably achieved within the $2.1 billion annual cap or $17.4 billion total cap. A number of critical risk elements are likely to have an adverse impact on the International Space Station cost and schedule. The schedule uncertainty associated with Russian implementation of joint Partnership agreements is the major threat to the ISS Program. The Fiscal Year (FY) 1999 budget submission to Congress is not adequate to execute the baseline ISS Program, cover normal program growth, and address the known critical risks. Additional annual funding of between $130 million and $250 million will be required. Completion of ISS assembly is likely to be delayed from one to three years beyond December 2003.

  13. Analysis on Tracking Schedule and Measurements Characteristics for the Spacecraft on the Phase of Lunar Transfer and Capture

    NASA Astrophysics Data System (ADS)

    Song, Young-Joo; Choi, Su-Jin; Ahn, Sang-il; Sim, Eun-Sup

    2014-03-01

    In this work, the preliminary analysis on both the tracking schedule and measurement characteristics for the spacecraft on the phase of lunar transfer and capture is performed. To analyze both the tracking schedule and measurement characteristics, the lunar transfer and capture phases' optimized trajectories are directly adapted from former research, and eleven ground tracking facilities (three Deep Space Network sites, seven Near Earth Network sites, one Daejeon site) are assumed to support the mission. Under these conceptual mission scenarios, detailed tracking schedules and expected measurement characteristics during critical maneuvers (Trans Lunar Injection, Lunar Orbit Insertion and Apoapsis Adjustment Maneuver), especially for the Daejeon station, are successfully analyzed. The orders of the predicted measurements' variances during the lunar capture phase according to critical maneuvers are found to be within the order of mm/s for the range rates and micro-deg/s for the angular rates, which are in good agreement with the recommended values of typical measurement modeling accuracies for Deep Space Networks. Although preliminary navigation accuracy guidelines are provided through this work, it is expected to give more practical insights into preparing Korea's future lunar mission, especially for developing the flight dynamics subsystem.

  14. Codification of scan path parameters and development of perimeter scan strategies for 3D bowl-shaped laser forming

    NASA Astrophysics Data System (ADS)

    Tavakoli, A.; Naeini, H. Moslemi; Roohi, Amir H.; Gollo, M. Hoseinpour; Shahabad, Sh. Imani

    2018-01-01

    In the 3D laser forming process, developing an appropriate laser scan pattern for producing specimens with high quality and uniformity is critical. This study presents certain principles for developing scan paths. Seven scan path parameters are considered, including: (1) combined linear or curved path; (2) type of combined linear path; (3) order of scan sequences; (4) the position of the start point in each scan; (5) continuous or discontinuous scan path; (6) direction of scan path; and (7) angular arrangement of combined linear scan paths. Regarding these path parameters, ten combined linear scan patterns are presented. Numerical simulations show that the continuous hexagonal scan pattern, scanning from the outer to the inner path, is the optimal one. In addition, it is observed that the position of the start point and the angular arrangement of scan paths are the most effective path parameters. Also, further experiments show that four sequences, owing to the symmetric condition they create, enhance the height and uniformity of the bowl-shaped products. Finally, the optimized hexagonal pattern was compared with the similar circular one. In the hexagonal scan path, the distortion value and the standard deviation relative to the edge height of the formed specimen are very low, and the edge height increases significantly compared to the circular scan path despite the decreased scan path length. As a result, the four-sequence hexagonal scan pattern is proposed as the optimized perimeter scan path for producing bowl-shaped products.

  15. Human instrumental performance in ratio and interval contingencies: A challenge for associative theory.

    PubMed

    Pérez, Omar D; Aitken, Michael R F; Zhukovsky, Peter; Soto, Fabián A; Urcelay, Gonzalo P; Dickinson, Anthony

    2016-12-15

    Associative learning theories regard the probability of reinforcement as the critical factor determining responding. However, the role of this factor in instrumental conditioning is not completely clear. In fact, free-operant experiments show that participants respond at a higher rate on variable ratio than on variable interval schedules even though the reinforcement probability is matched between the schedules. This difference has been attributed to the differential reinforcement of long inter-response times (IRTs) by interval schedules, which acts to slow responding. In the present study, we used a novel experimental design to investigate human responding under random ratio (RR) and regulated probability interval (RPI) schedules, a type of interval schedule that sets a reinforcement probability independently of the IRT duration. Participants responded on each type of schedule before a final choice test in which they distributed responding between two schedules similar to those experienced during training. Although response rates did not differ during training, the participants responded at a lower rate on the RPI schedule than on the matched RR schedule during the choice test. This preference cannot be attributed to a higher probability of reinforcement for long IRTs and questions the idea that similar associative processes underlie classical and instrumental conditioning.
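
    The differential reinforcement of long IRTs on plain interval schedules, which the RPI schedule is designed to remove, can be seen in a small simulation: on a random ratio schedule the reinforcement probability of a response is constant, whereas on a random interval schedule it grows with the pause since the previous response. The rate parameters below are arbitrary, and the RPI schedule itself is not simulated here.

    ```python
    # Simulation sketch: reinforcement probability vs. inter-response time (IRT)
    # on random ratio (RR) and plain random interval (RI) schedules.
    import random

    def reinforcement_prob_rr(irt, p=0.1):
        return p                                   # independent of the IRT

    def reinforcement_prob_ri(irt, setup_rate=0.05):
        # probability that the interval timer "set up" a reinforcer during the pause
        return 1.0 - (1.0 - setup_rate) ** irt

    def simulate(prob_fn, irts, trials=20000):
        results = {}
        for irt in irts:
            hits = sum(random.random() < prob_fn(irt) for _ in range(trials))
            results[irt] = round(hits / trials, 3)
        return results

    irts = [1, 5, 10, 20]          # pause lengths in arbitrary time units
    print("RR:", simulate(reinforcement_prob_rr, irts))
    print("RI:", simulate(reinforcement_prob_ri, irts))
    ```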

  16. Librarians and Teaching Faculty: Partners in Bibliographic Instruction.

    ERIC Educational Resources Information Center

    Carlson, David; Miller, Ruth H.

    1984-01-01

    This paper focuses on critical factors regarding the effectiveness of the course-related or course-integrated mode of bibliographic instruction: administrative problems (cost in time and personnel, coordination and scheduling materials development); critical role of faculty; consistency of instruction; transference of library knowledge from one…

  17. Using mobile health technology to deliver decision support for self-monitoring after lung transplantation.

    PubMed

    Jiang, Yun; Sereika, Susan M; DeVito Dabbs, Annette; Handler, Steven M; Schlenk, Elizabeth A

    2016-10-01

    Lung transplant recipients (LTR) experience problems recognizing and reporting critical condition changes during their daily health self-monitoring. Pocket PATH(®), a mobile health application, was designed to provide automatic feedback messages to LTR to guide decisions for detecting and reporting critical values of health indicators. To examine the degree to which LTR followed decision support messages to report recorded critical values, and to explore predictors of appropriately following technology decision support by reporting critical values during the first year after transplantation. A cross-sectional correlational study was conducted to analyze existing data from 96 LTR who used the Pocket PATH for daily health self-monitoring. When a critical value is entered, the device automatically generated a feedback message to guide LTR about when and what to report to their transplant coordinators. Their socio-demographics and clinical characteristics were obtained before discharge. Their use of Pocket PATH for health self-monitoring during 12 months was categorized as low (≤25% of days), moderate (>25% to ≤75% of days), and high (>75% of days) use. Following technology decision support was defined by the total number of critical feedback messages appropriately handled divided by the total number of critical feedback messages generated. This variable was dichotomized by whether or not all (100%) feedback messages were appropriately followed. Binary logistic regression was used to explore predictors of appropriately following decision support. Of the 96 participants, 53 had at least 1 critical feedback message generated during 12 months. Of these 53 participants, the average message response rate was 90% and 33 (62%) followed 100% decision support. LTR who moderately used Pocket PATH (n=23) were less likely to follow technology decision support than the high (odds ratio [OR]=0.11, p=0.02) and low (OR=0.04, p=0.02) use groups. The odds of following decision support were reduced in LTR whose income met basic needs (OR=0.01, p=0.01) or who had longer hospital stays (OR=0.94, p=0.004). A significant interaction was found between gender and past technology experience (OR=0.21, p=0.03), suggesting that with increased past technology experience, the odds of following decision support to report all critical values decreased in men but increased in women. The majority of LTR responded appropriately to mobile technology-based decision support for reporting recorded critical values. Appropriately following technology decision support was associated with gender, income, experience with technology, length of hospital stay, and frequency of use of technology for self-monitoring. Clinicians should monitor LTR, who are at risk for poor reporting of recorded critical values, more vigilantly even when LTR are provided with mobile technology decision support. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  18. Proactive routing mutation against stealthy Distributed Denial of Service attacks: metrics, modeling, and analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duan, Qi; Al-Shaer, Ehab; Chatterjee, Samrat

    The Infrastructure Distributed Denial of Service (IDDoS) attacks continue to be one of the most devastating challenges facing cyber systems. The new generation of IDDoS attacks exploit the inherent weakness of cyber infrastructure, including the deterministic nature of routes, skew distribution of flows, and Internet ossification, to discover the network critical links and launch highly stealthy flooding attacks that are not observable at the victim end. In this paper, first, we propose a new metric to quantitatively measure the potential susceptibility of any arbitrary target server or domain to stealthy IDDoS attacks, and estimate the impact of such susceptibility on enterprises. Second, we develop a proactive route mutation technique to minimize the susceptibility to these attacks by dynamically changing the flow paths periodically to invalidate the adversary's knowledge about the network and avoid targeted critical links. Our proposed approach actively changes these network paths while satisfying security and quality of service requirements. We present an integrated approach to proactive route mutation that combines infrastructure-based mutation, based on the reconfiguration of switches and routers, with a middle-box approach that uses an overlay of end-point proxies to construct a virtual network path free of critical links to reach a destination. We implemented the proactive path mutation technique on a Software Defined Network using the OpenDaylight controller to demonstrate a feasible deployment of this approach. Our evaluation validates the correctness, effectiveness, and scalability of the proposed approaches.
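
    A bare-bones sketch of the mutation idea is given below: at each mutation epoch, recompute a route that avoids the designated critical links and break ties randomly so that consecutive epochs need not reuse the same path. The topology and critical-link list are invented, and the sketch ignores the QoS/security constraints and SDN programming that the actual system handles.

    ```python
    # Minimal proactive route mutation sketch: periodic path recomputation that
    # avoids critical links, with randomized tie-breaking among alternatives.
    import random
    from collections import deque

    def random_path_avoiding(adj, src, dst, critical_links):
        banned = {frozenset(e) for e in critical_links}
        prev, queue = {src: None}, deque([src])
        while queue:
            u = queue.popleft()
            if u == dst:
                path, node = [], dst
                while node is not None:
                    path.append(node)
                    node = prev[node]
                return path[::-1]
            neighbors = [v for v in adj[u]
                         if v not in prev and frozenset((u, v)) not in banned]
            random.shuffle(neighbors)            # randomize among equal-length routes
            for v in neighbors:
                prev[v] = u
                queue.append(v)
        return None

    adj = {"src": ["r1", "r2"], "r1": ["src", "r3"], "r2": ["src", "r3"],
           "r3": ["r1", "r2", "dst"], "dst": ["r3"]}
    critical = [("r1", "r3")]                    # link the adversary is targeting
    for epoch in range(3):
        print("epoch", epoch, random_path_avoiding(adj, "src", "dst", critical))
    ```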

  19. Controlling Low-Rate Signal Path Microdischarge for an Ultra-Low-Background Proportional Counter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mace, Emily K.; Aalseth, Craig E.; Bonicalzi, Ricco

    2013-05-01

    Pacific Northwest National Laboratory (PNNL) has developed an ultra-low-background proportional counter (ULBPC) made of high purity copper. These detectors are part of an ultra-low-background counting system (ULBCS) in the newly constructed shallow underground laboratory at PNNL (at a depth of ~30 meters water-equivalent). To control backgrounds, the current preamplifier electronics are located outside the ULBCS shielding. Thus the signal from the detector travels through ~1 meter of cable and is potentially susceptible to high voltage microdischarge and other sources of electronic noise. Based on initial successful tests, commercial cables and connectors were used for this critical signal path. Subsequent testing across different batches of commercial cables and connectors, however, showed unwanted (but still low) rates of microdischarge noise. To control this noise source, two approaches were pursued: first, to carefully validate cables, connectors, and other commercial components in this critical signal path, making modifications where necessary; second, to develop a custom low-noise, low-background preamplifier that can be integrated with the ULBPC and thus remove most commercial components from the critical signal path. This integrated preamplifier approach is based on the Amptek A250 low-noise charge-integrating preamplifier module. The initial microdischarge signals observed are presented and characterized according to the suspected source. Each of the approaches for mitigation is described, and the results from both are compared with each other and with the original performance seen with commercial cables and connectors.

  20. In the linear quadratic model, the Poisson approximation and the Zaider-Minerbo formula agree on the ranking of tumor control probabilities, up to a critical cell birth rate.

    PubMed

    Ballhausen, Hendrik; Belka, Claus

    2017-03-01

    To provide a rule for the agreement or disagreement of the Poisson approximation (PA) and the Zaider-Minerbo formula (ZM) on the ranking of treatment alternatives in terms of tumor control probability (TCP) in the linear quadratic model. A general criterion involving a critical cell birth rate was formally derived. For demonstration, the criterion was applied to a distinct radiobiological model of fast growing head and neck tumors and a respective range of 22 conventional and nonconventional head and neck schedules. There is a critical cell birth rate b_crit below which PA and ZM agree on which one out of two alternative treatment schemes with single-cell survival curves S'(t) and S''(t) offers better TCP: [Formula: see text] For cell birth rates b above this critical cell birth rate, PA and ZM disagree if and only if b > b_crit > 0. In case of the exemplary head and neck schedules, out of 231 possible combinations, only 16 or 7% were found where PA and ZM disagreed. In all 231 cases the prediction of the criterion was numerically confirmed, and cell birth rates at crossovers between schedules matched the calculated critical cell birth rates. TCP estimated by PA and ZM almost never numerically coincide. Still, in many cases both formulas at least agree about which one out of two alternative fractionation schemes offers better TCP. In case of fast growing tumors featuring a high cell birth rate, however, ZM may suggest a re-evaluation of treatment options.
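
    For context, the two TCP expressions being compared are commonly quoted in the forms below (written from memory for constant birth rate b, death rate d, n_0 initial clonogens, and single-cell survival S(t); treat them as an unverified recollection rather than the paper's exact notation, whose own criterion remains elided above as "[Formula: see text]").

    ```latex
    % Poisson approximation: probability of zero surviving clonogens when their
    % number is treated as Poisson with the mean below.
    \mathrm{TCP}_{\mathrm{PA}}(t) \;=\; \exp\!\bigl[-\,n_0\, S(t)\, e^{(b-d)t}\bigr]

    % Zaider-Minerbo expression for constant birth and death rates, which is exact
    % within the underlying birth-death model of tumor cell kinetics.
    \mathrm{TCP}_{\mathrm{ZM}}(t) \;=\;
    \left[\,1-\frac{S(t)\,e^{(b-d)t}}
    {1 + b\,S(t)\,e^{(b-d)t}\displaystyle\int_0^{t}\!\frac{dt'}{S(t')\,e^{(b-d)t'}}}\right]^{n_0}
    ```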

  1. Collaborative Scheduling Using JMS in a Mixed Java and .NET Environment

    NASA Technical Reports Server (NTRS)

    Wang, Yeou-Fang; Wax, Allan; Lam, Ray; Baldwin, John; Borden, Chet

    2006-01-01

    A collaborative framework/environment was proto-typed to prove the feasibility of scheduling space flight missions on NASA's Deep Space Network (DSN) in a distributed fashion. In this environment, effective collaboration relies on efficient communications among all flight mission and DSN scheduling users. There-fore, messaging becomes critical to timely event notification and data synchronization. In the prototype, a rapid messaging system using Java Message Service (JMS) in a mixed Java and .NET environment is established. This scheme allows both Java and .NET applications to communicate with each other for data synchronization and schedule negotiation. The JMS approach we used is based on a centralized messaging scheme. With proper use of a high speed messaging system, all users in this collaborative framework can communicate with each other to generate a schedule collaboratively to meet DSN and projects tracking needs.

  2. Improvement to Airport Throughput Using Intelligent Arrival Scheduling and an Expanded Planning Horizon

    NASA Technical Reports Server (NTRS)

    Glaab, Patricia C.

    2012-01-01

    The first phase of this study investigated the amount of time a flight can be delayed or expedited within the Terminal Airspace using only speed changes. The Arrival Capacity Calculator analysis tool was used to predict the time adjustment envelope for standard descent arrivals and then for CDA arrivals. Results ranged from 0.77 to 5.38 minutes. STAR routes were configured for the ACES simulation, and a validation of the ACC results was conducted comparing the maximum predicted time adjustments to those seen in ACES. The final phase investigated full runway-to-runway trajectories using ACES. The radial distance used by the arrival scheduler was incrementally increased from 50 to 150 nautical miles (nmi). The increased Planning Horizon radii allowed the arrival scheduler to arrange, path stretch, and speed-adjust flights to more fully load the arrival stream. The average throughput for the high volume portion of the day increased from 30 aircraft per runway for the 50 nmi radius to 40 aircraft per runway for the 150 nmi radius for a traffic set representative of high volume 2018. The recommended radius for the arrival scheduler's Planning Horizon was found to be 130 nmi, which allowed more than 95% loading of the arrival stream.

  3. An Optimization-Driven Analysis Pipeline to Uncover Biomarkers and Signaling Paths: Cervix Cancer.

    PubMed

    Lorenzo, Enery; Camacho-Caceres, Katia; Ropelewski, Alexander J; Rosas, Juan; Ortiz-Mojer, Michael; Perez-Marty, Lynn; Irizarry, Juan; Gonzalez, Valerie; Rodríguez, Jesús A; Cabrera-Rios, Mauricio; Isaza, Clara

    2015-06-01

    Establishing how a series of potentially important genes might relate to each other is relevant to understand the origin and evolution of illnesses, such as cancer. High-throughput biological experiments have played a critical role in providing information in this regard. A special challenge, however, is that of trying to conciliate information from separate microarray experiments to build a potential genetic signaling path. This work proposes a two-step analysis pipeline, based on optimization, to approach meta-analysis aiming to build a proxy for a genetic signaling path.

  4. Automatic programming of arc welding robots

    NASA Astrophysics Data System (ADS)

    Padmanabhan, Srikanth

    Automatic programming of arc welding robots requires the geometric description of a part from a solid modeling system, expert weld process knowledge and the kinematic arrangement of the robot and positioner automatically. Current commercial solid models are incapable of storing explicitly product and process definitions of weld features. This work presents a paradigm to develop a computer-aided engineering environment that supports complete weld feature information in a solid model and to create an automatic programming system for robotic arc welding. In the first part, welding features are treated as properties or attributes of an object, features which are portions of the object surface--the topological boundary. The structure for representing the features and attributes is a graph called the Welding Attribute Graph (WAGRAPH). The method associates appropriate weld features to geometric primitives, adds welding attributes, and checks the validity of welding specifications. A systematic structure is provided to incorporate welding attributes and coordinate system information in a CSG tree. The specific implementation of this structure using a hybrid solid modeler (IDEAS) and an object-oriented programming paradigm is described. The second part provides a comprehensive methodology to acquire and represent weld process knowledge required for the proper selection of welding schedules. A methodology of knowledge acquisition using statistical methods is proposed. It is shown that these procedures did little to capture the private knowledge of experts (heuristics), but helped in determining general dependencies, and trends. A need was established for building the knowledge-based system using handbook knowledge and to allow the experts further to build the system. A methodology to check the consistency and validity for such knowledge addition is proposed. A mapping shell designed to transform the design features to application specific weld process schedules is described. A new approach using fixed path modified continuation methods is proposed in the final section to plan continuously the trajectory of weld seams in an integrated welding robot and positioner environment. The joint displacement, velocity, and acceleration histories all along the path as a function of the path parameter for the best possible welding condition are provided for the robot and the positioner to track various paths normally encountered in arc welding.

  5. An Energy Efficient MAC Protocol for Multi-Hop Swallowable Body Sensor Networks

    PubMed Central

    Lin, Lin; Yang, Chengfeng; Wong, Kai Juan; Yan, Hao; Shen, Junwen; Phee, Soo Jay

    2014-01-01

    Swallowable body sensor networks (BSNs) are composed of sensors which are swallowed by patients and send the collected data to an outside coordinator. These sensors are energy constrained and their batteries are difficult to replace. The medium access control (MAC) protocol plays an important role in energy management. This paper investigates an energy efficient MAC protocol design for swallowable BSNs. Multi-hop communication is analyzed and proved to be more energy efficient than single-hop communication within the human body when the circuitry power is low. Based on this result, a centrally controlled time slotting schedule is proposed. The major workload is shifted from the sensors to the coordinator. The coordinator collects the path-loss map and calculates the schedules, including routing, slot assignment and transmission power. Sensor nodes follow the schedules to send data in a multi-hop way. The proposed protocol is compared with the IEEE 802.15.6 protocol in terms of energy consumption. The results show that it is more energy efficient than IEEE 802.15.6 for swallowable BSN scenarios. PMID:25330049
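
    The coordinator-side route computation can be sketched as a shortest-path search over link energy costs derived from the reported path-loss map, which also shows why a multi-hop route can beat a direct high-loss link. The path-loss values and the loss-to-energy mapping below are toy assumptions, not the paper's channel model.

    ```python
    # Minimum-energy multi-hop routing from a path-loss map, using Dijkstra.
    import heapq

    path_loss_db = {   # symmetric path loss between nodes (dB), invented values
        ("cap1", "cap2"): 45, ("cap2", "cap3"): 48,
        ("cap1", "coord"): 78, ("cap2", "coord"): 65, ("cap3", "coord"): 55,
    }

    def link_energy(loss_db):
        return 10 ** (loss_db / 10.0) * 1e-6   # toy model: energy grows with loss

    adj = {}
    for (a, b), loss in path_loss_db.items():
        adj.setdefault(a, []).append((b, link_energy(loss)))
        adj.setdefault(b, []).append((a, link_energy(loss)))

    def min_energy_route(src, dst):
        dist, heap, prev = {src: 0.0}, [(0.0, src)], {src: None}
        while heap:
            d, u = heapq.heappop(heap)
            if u == dst:
                route, n = [], dst
                while n is not None:
                    route.append(n)
                    n = prev[n]
                return route[::-1], d
            if d > dist.get(u, float("inf")):
                continue
            for v, w in adj[u]:
                if d + w < dist.get(v, float("inf")):
                    dist[v], prev[v] = d + w, u
                    heapq.heappush(heap, (d + w, v))
        return None, float("inf")

    print(min_energy_route("cap1", "coord"))   # multi-hop route beats the direct link
    ```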

  6. Future aircraft networks and schedules

    NASA Astrophysics Data System (ADS)

    Shu, Yan

    2011-07-01

    Because of the importance of air transportation scheduling, the emergence of small aircraft and the vision of future fuel-efficient aircraft, this thesis has focused on the study of aircraft scheduling and network design involving multiple types of aircraft and flight services. It develops models and solution algorithms for the schedule design problem and analyzes the computational results. First, based on the current development of small aircraft and on-demand flight services, this thesis expands a business model for integrating on-demand flight services with the traditional scheduled flight services. This thesis proposes a three-step approach to the design of aircraft schedules and networks from scratch under the model. In the first step, both a frequency assignment model for scheduled flights that incorporates a passenger path choice model and a frequency assignment model for on-demand flights that incorporates a passenger mode choice model are created. In the second step, a rough fleet assignment model that determines a set of flight legs, each of which is assigned an aircraft type and a rough departure time is constructed. In the third step, a timetable model that determines an exact departure time for each flight leg is developed. Based on the models proposed in the three steps, this thesis creates schedule design instances that involve almost all the major airports and markets in the United States. The instances of the frequency assignment model created in this thesis are large-scale non-convex mixed-integer programming problems, and this dissertation develops an overall network structure and proposes iterative algorithms for solving these instances. The instances of both the rough fleet assignment model and the timetable model created in this thesis are large-scale mixed-integer programming problems, and this dissertation develops subproblem schemes for solving these instances. Based on these solution algorithms, this dissertation also presents computational results of these large-scale instances. To validate the models and solution algorithms developed, this thesis also compares the daily flight schedules that it designs with the schedules of the existing airlines. Furthermore, it creates instances that represent different economic and fuel-prices conditions and derives schedules under these different conditions. In addition, it discusses the implication of using new aircraft in the future flight schedules. Finally, future research in three areas---model, computational method, and simulation for validation---is proposed.

  7. Comparing the Impact of Course-Based and Apprentice-Based Research Experiences in a Life Science Laboratory Curriculum†

    PubMed Central

    Shapiro, Casey; Moberg-Parker, Jordan; Toma, Shannon; Ayon, Carlos; Zimmerman, Hilary; Roth-Johnson, Elizabeth A.; Hancock, Stephen P.; Levis-Fitzgerald, Marc; Sanders, Erin R.

    2015-01-01

    This four-year study describes the assessment of a bifurcated laboratory curriculum designed to provide upper-division undergraduate majors in two life science departments meaningful exposure to authentic research. The timing is critical as it provides a pathway for both directly admitted and transfer students to enter research. To fulfill their degree requirements, all majors complete one of two paths in the laboratory program. One path immerses students in scientific discovery experienced through team research projects (course-based undergraduate research experiences, or CUREs) and the other path through a mentored, independent research project (apprentice-based research experiences, or AREs). The bifurcated laboratory curriculum was structured using backwards design to help all students, irrespective of path, achieve specific learning outcomes. Over 1,000 undergraduates enrolled in the curriculum. Self-report survey results indicate that there were no significant differences in affective gains by path. Students conveyed which aspects of the curriculum were critical to their learning and development of research-oriented skills. Students’ interests in biology increased upon completion of the curriculum, inspiring a subset of CURE participants to subsequently pursue further research. A rubric-guided performance evaluation, employed to directly measure learning, revealed differences in learning gains for CURE versus ARE participants, with evidence suggesting a CURE can reduce the achievement gap between high-performing students and their peers. PMID:26751568

  8. Forming limit strains for non-linear strain path of AA6014 aluminium sheet deformed at room temperature

    NASA Astrophysics Data System (ADS)

    Bressan, José Divo; Liewald, Mathias; Drotleff, Klaus

    2017-10-01

    Forming limit strain curves of conventional aluminium alloy AA6014 sheets after loading with non-linear strain paths are presented and compared with the D-Bressan macroscopic model of sheet metal rupture by a critical shear stress criterion. AA6014 exhibits good formability at room temperature and, thus, is mainly employed in car body external parts manufactured at room temperature. Following Weber et al., experimental bi-linear strain paths were carried out on specimens of 1 mm thickness by pre-stretching in uniaxial and biaxial directions up to 5%, 10% and 20% strain levels before performing Nakajima tests to obtain the forming limit strain curves, FLCs. In addition, FLCs of AA6014 were predicted by employing the D-Bressan critical shear stress criterion for bi-linear strain paths, and comparisons with the experimental FLCs were analyzed and discussed. In order to obtain the material coefficients of plastic anisotropy and of strain and strain-rate hardening behavior, and to calibrate the D-Bressan model, tensile tests at two different strain rates on specimens cut at 0°, 45° and 90° to the rolling direction, as well as bulge tests, were carried out at room temperature. The correlation of the experimental bi-linear strain path FLCs with the limit strains predicted by the D-Bressan model, assuming an equivalent pre-strain calculated by the Hill 1979 yield criterion, is reasonably good.

  9. ABLEPathPlanner library for Umbra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oppel III, Fred J; Xavier, Patrick G.; Gottlieb, Eric Joseph

    Umbra contains a flexible, modular path planner that is used to simulate complex entity behaviors moving within 3D terrain environments that include buildings, barriers, roads, bridges, fences, and a variety of other terrain features (water, vegetation, slope, etc.). The path planning algorithm is a critical component required to execute these tactical behaviors, providing realistic entity movement while maintaining efficient system computing performance.
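
    The abstract does not describe the planner's algorithm; as a generic, hedged illustration of path planning over a weighted terrain-cost grid (the costs, 4-connectivity, and Manhattan heuristic below are assumptions, not Umbra's design), a minimal A* sketch follows.

    ```python
    # Minimal A* over a 2D terrain-cost grid (illustrative only; Umbra's
    # actual planner, terrain representation, and costs are not shown here).
    import heapq

    def a_star(grid, start, goal):
        """grid[r][c] is the cost of entering cell (r, c); returns a list of cells."""
        rows, cols = len(grid), len(grid[0])
        def h(cell):  # Manhattan-distance heuristic (admissible for unit-or-more costs)
            return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])
        open_set = [(h(start), 0.0, start, None)]
        came_from, best_g = {}, {start: 0.0}
        while open_set:
            _, g, cell, parent = heapq.heappop(open_set)
            if cell in came_from:
                continue                     # already expanded with a better label
            came_from[cell] = parent
            if cell == goal:
                path = [cell]
                while came_from[path[-1]] is not None:
                    path.append(came_from[path[-1]])
                return path[::-1]
            r, c = cell
            for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if 0 <= nr < rows and 0 <= nc < cols:
                    ng = g + grid[nr][nc]
                    if ng < best_g.get((nr, nc), float("inf")):
                        best_g[(nr, nc)] = ng
                        heapq.heappush(open_set, (ng + h((nr, nc)), ng, (nr, nc), cell))
        return None

    if __name__ == "__main__":
        terrain = [[1, 1, 5, 1],
                   [1, 9, 5, 1],
                   [1, 1, 1, 1]]
        print(a_star(terrain, (0, 0), (2, 3)))
    ```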

  10. Consistency of forest presence and biomass predictions modeled across overlapping spatial and temporal extents

    Treesearch

    Mark D. Nelson; Sean Healey; W. Keith Moser; J.G. Masek; Warren Cohen

    2011-01-01

    We assessed the consistency across space and time of spatially explicit models of forest presence and biomass in southern Missouri, USA, for adjacent, partially overlapping satellite image Path/Rows, and for coincident satellite images from the same Path/Row acquired in different years. Such consistency in satellite image-based classification and estimation is critical...

  11. A critical transition in leaf evolution facilitated the Cretaceous angiosperm revolution.

    PubMed

    de Boer, Hugo Jan; Eppinga, Maarten B; Wassen, Martin J; Dekker, Stefan C

    2012-01-01

    The revolutionary rise of broad-leaved (flowering) angiosperm plant species during the Cretaceous initiated a global ecological transformation towards modern biodiversity. Still, the mechanisms involved in this angiosperm radiation remain enigmatic. Here we show that the period of rapid angiosperm evolution initiated after the leaf interior (post venous) transport path length for water was reduced beyond the leaf interior transport path length for CO2 at a critical leaf vein density of 2.5-5 mm mm(-2). Data and our modelling approaches indicate that surpassing this critical vein density was a pivotal moment in leaf evolution that enabled evolving angiosperms to profit from developing leaves with more and smaller stomata in terms of higher carbon returns from equal water loss. Surpassing the critical vein density may therefore have facilitated evolving angiosperms to develop leaves with higher gas exchange capacities required to adapt to the Cretaceous CO2 decline and outcompete previously dominant coniferous species in the upper canopy.

  12. Routing and scheduling of hazardous materials shipments: algorithmic approaches to managing spent nuclear fuel transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cox, R.G.

    Much controversy surrounds government regulation of routing and scheduling of Hazardous Materials Transportation (HMT). Increases in operating costs must be balanced against expected benefits from local HMT bans and curfews when promulgating or preempting HMT regulations. Algorithmic approaches for evaluating HMT routing and scheduling regulatory policy are described. A review of current US HMT regulatory policy is presented to provide a context for the analysis. Next, a multiobjective shortest path algorithm to find the set of efficient routes under conflicting objectives is presented. This algorithm generates all efficient routes under any partial ordering in a single pass through the network. Also, scheduling algorithms are presented to estimate the travel time delay due to HMT curfews along a route. Algorithms are presented assuming either deterministic or stochastic travel times between curfew cities and also possible rerouting to avoid such cities. These algorithms are applied to the case study of US highway transport of spent nuclear fuel from reactors to permanent repositories. Two data sets were used. One data set included the US Interstate Highway System (IHS) network with reactor locations, possible repository sites, and 150 heavily populated areas (HPAs). The other data set contained estimates of the population residing within 0.5 miles of the IHS in the Eastern US. Curfew delay is dramatically reduced by optimally scheduling departure times unless inter-HPA travel times are highly uncertain. Rerouting shipments to avoid HPAs is a less efficient approach to reducing delay.
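
    The report's single-pass algorithm is not reproduced in the abstract; as a hedged illustration of the underlying idea of keeping only non-dominated (efficient) labels at each node, here is a minimal bi-objective label-correcting sketch (the graph, weights, and the objective names "time" and "exposure" are invented for the example).

    ```python
    # Sketch: bi-objective shortest paths by label correction, keeping only
    # Pareto non-dominated (cost1, cost2) labels at each node. Graph, weights,
    # and objective names are illustrative, not the report's data.
    from collections import deque

    def dominated(a, b):
        """True if label b is at least as good as a in both objectives and better in one."""
        return b[0] <= a[0] and b[1] <= a[1] and b != a

    def efficient_labels(graph, source):
        labels = {v: set() for v in graph}          # node -> set of (c1, c2)
        labels[source] = {(0, 0)}
        queue = deque([source])
        while queue:
            u = queue.popleft()
            for v, (w1, w2) in graph[u]:
                for (c1, c2) in list(labels[u]):
                    cand = (c1 + w1, c2 + w2)
                    if any(dominated(cand, l) for l in labels[v]) or cand in labels[v]:
                        continue
                    # Drop labels at v that the candidate dominates, then add it.
                    labels[v] = {l for l in labels[v] if not dominated(l, cand)}
                    labels[v].add(cand)
                    queue.append(v)
        return labels

    if __name__ == "__main__":
        # edges: node -> [(neighbor, (time, exposure)), ...]
        g = {"A": [("B", (2, 5)), ("C", (4, 1))],
             "B": [("D", (2, 4))],
             "C": [("D", (1, 1))],
             "D": []}
        print(efficient_labels(g, "A")["D"])   # Pareto-efficient (time, exposure) pairs
    ```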

  13. The Search for Life Beyond Earth

    NASA Image and Video Library

    2014-07-14

    John Grunsfeld, Associate Administrator for NASA's Science Mission Directorate, far left, speaks during a panel discussion on the search for life beyond Earth in the James E. Webb Auditorium at NASA Headquarters on Monday, July 14, 2014 in Washington, DC. The panel discussed how NASA's space-based observatories are making new discoveries and how the agency's new telescope, the James Webb Space Telescope, will continue this path of discovery after its scheduled launch in 2018. Photo Credit: (NASA/Joel Kowsky)

  14. Design and Hardware-in-the-Loop Implementation of Optimal Canonical Maneuvers for an Autonomous Planetary Aerial Vehicle

    DTIC Science & Technology

    2012-12-01

    selflessly working your own school and writing schedule around mine, supporting me throughout career paths that have been anything but traditional... observation, and other scientific research and exploration purposes. A ground rover on a planet, moon, or other body such as an asteroid must... applied to autonomous craft that could eventually operate on the surface of planets, moons, and asteroids, as well as in Earth orbit or deep space

  15. The Search for Life Beyond Earth

    NASA Image and Video Library

    2014-07-14

    NASA Chief Scientist Ellen Stofan, far left, introduces members of the panel prior to a discussion of the search for life beyond Earth in the James E. Webb Auditorium at NASA Headquarters on Monday, July 14, 2014 in Washington, DC. The panel discussed how NASA's space-based observatories are making new discoveries and how the agency's new telescope, the James Webb Space Telescope, will continue this path of discovery after its scheduled launch in 2018. Photo Credit: (NASA/Joel Kowsky)

  16. Machine Maintenance Scheduling with Reliability Engineering Method and Maintenance Value Stream Mapping

    NASA Astrophysics Data System (ADS)

    Sembiring, N.; Nasution, A. H.

    2018-02-01

    Corrective maintenance, i.e., replacing or repairing a machine component after the machine breaks down, is routinely performed in manufacturing companies. It forces the production process to stop: production time decreases while the maintenance team replaces or repairs the damaged machine component. This paper proposes a preventive maintenance schedule for a critical component of a critical machine in a crude palm oil and kernel company in order to increase maintenance efficiency. Reliability engineering and Maintenance Value Stream Mapping are used as the method and the tool to analyze the reliability of the component and to reduce waste in the process by segregating value-added and non-value-added activities.
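
    The abstract does not state which reliability model is used; as a hedged sketch of the kind of calculation reliability engineering applies here, the example below assumes a Weibull failure distribution and invented cost figures, and picks the age-replacement interval that minimizes expected cost per unit time.

    ```python
    # Sketch: choose a preventive replacement interval that minimizes expected
    # cost per unit time under an age-replacement policy with Weibull failures.
    # The Weibull parameters and costs are hypothetical, not the paper's data.
    import math

    def weibull_reliability(t, beta, eta):
        return math.exp(-((t / eta) ** beta))

    def cost_rate(T, beta, eta, c_prev, c_fail, steps=2000):
        """Expected cost per unit time if the component is replaced at age T
        or at failure, whichever comes first (age-replacement policy)."""
        R = lambda t: weibull_reliability(t, beta, eta)
        # Expected cycle length = integral of R(t) from 0 to T (trapezoidal rule).
        dt = T / steps
        exp_cycle = sum((R(i * dt) + R((i + 1) * dt)) / 2 * dt for i in range(steps))
        exp_cost = c_prev * R(T) + c_fail * (1 - R(T))
        return exp_cost / exp_cycle

    if __name__ == "__main__":
        beta, eta = 2.5, 1200.0          # shape > 1 means wear-out (hours)
        c_prev, c_fail = 1.0, 8.0        # relative costs of planned vs. breakdown work
        best = min(range(100, 2001, 50),
                   key=lambda T: cost_rate(T, beta, eta, c_prev, c_fail))
        print("suggested replacement interval ~", best, "hours")
    ```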

  17. Advanced construction management for lunar base construction - Surface operations planner

    NASA Technical Reports Server (NTRS)

    Kehoe, Robert P.

    1992-01-01

    The study proposes a conceptual solution and lays the framework for developing a new, sophisticated and intelligent tool for a lunar base construction crew to use. This concept integrates expert systems for critical decision making, virtual reality for training, logistics and laydown optimization, automated productivity measurements, and an advanced scheduling tool to form a unique new planning tool. The concept features extensive use of computers and expert systems software to support the actual work, while allowing the crew to control the project from the lunar surface. Consideration is given to a logistics data base, laydown area management, flexible critical progress scheduler, video simulation of assembly tasks, and assembly information and tracking documentation.

  18. Synchronization of Concurrent Processes

    DTIC Science & Technology

    1975-07-01

    Pettersen, Stanford University Artificial Intelligence Laboratory. ABSTRACT: The paper gives an overview of commonly used synchronization primitives and presents a solution to the readers/writers problem utilizing the proposed synchronization primitive; the solution is simpler and shorter than other known ones. The first sections of the paper... Keywords: scheduling, process scheduling, synchronization, mutual exclusion, semaphores, critical...

  19. Enabling Autonomous Rover Science through Dynamic Planning and Scheduling

    NASA Technical Reports Server (NTRS)

    Estlin, Tara A.; Gaines, Daniel; Chouinard, Caroline; Fisher, Forest; Castano, Rebecca; Judd, Michele; Nesnas, Issa

    2005-01-01

    This paper describes how dynamic planning and scheduling techniques can be used onboard a rover to autonomously adjust rover activities in support of science goals. These goals could be identified by scientists on the ground or could be identified by onboard data-analysis software. Several different types of dynamic decisions are described, including the handling of opportunistic science goals identified during rover traverses, preserving high priority science targets when resources, such as power, are unexpectedly over-subscribed, and dynamically adding additional, ground-specified science targets when rover actions are executed more quickly than expected. After describing our specific system approach, we discuss some of the particular challenges we have examined to support autonomous rover decision-making. These include interaction with rover navigation and path-planning software and handling large amounts of uncertainty in state and resource estimations.
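
    As a toy illustration of one of the dynamic decisions described (shedding lower-priority activities when power is over-subscribed), the sketch below implements a greedy repair rule with invented activities, priorities, and power figures; it is not the onboard planner used in the paper.

    ```python
    # Toy schedule-repair rule (illustrative; not the onboard planner described
    # in the paper): when projected power is over-subscribed, shed the
    # lowest-priority science activities until the plan fits the budget.

    def repair_plan(activities, power_budget):
        """activities: list of dicts with 'name', 'priority' (higher = keep), 'power'."""
        kept = sorted(activities, key=lambda a: a["priority"], reverse=True)
        total = sum(a["power"] for a in kept)
        dropped = []
        while total > power_budget and kept:
            victim = kept.pop()            # lowest-priority activity remaining
            total -= victim["power"]
            dropped.append(victim["name"])
        return kept, dropped

    if __name__ == "__main__":
        plan = [{"name": "drive", "priority": 10, "power": 40},
                {"name": "target_A_spectra", "priority": 8, "power": 25},
                {"name": "opportunistic_rock", "priority": 3, "power": 20},
                {"name": "pano_image", "priority": 5, "power": 15}]
        kept, dropped = repair_plan(plan, power_budget=80)
        print("kept:", [a["name"] for a in kept], "dropped:", dropped)
    ```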

  20. Evaluation of Microwave Landing System (MLS) effect on the delivery performance of a fixed-path metering and spacing system

    NASA Technical Reports Server (NTRS)

    Credeur, L.; Davis, C. M.; Capron, W. R.

    1981-01-01

    The metering and spacing (M&S) system algorithms described assume an aircraft two-dimensional area navigation capability. The three navigation systems compared were: very high frequency omnidirectional range/distance measuring equipment (VOR/DME) and ILS, VOR/DME and ±40° MLS, and VOR/DME and ±60° MLS. Other factors studied were M&S tentative schedule point location, route geometry effects, and approach gate location effects. Summarized results are: the MLS offers some improvement over VOR/DME and ILS if all approach routes contain computer-assisted turns; pilot reaction to moving the gate closer to the runway threshold may adversely affect M&S performance; and coupling en route metering to terminal scheduling transfers most of the terminal holding to more fuel-efficient, higher-altitude en route delay.

  1. Annealed importance sampling with constant cooling rate

    NASA Astrophysics Data System (ADS)

    Giovannelli, Edoardo; Cardini, Gianni; Gellini, Cristina; Pietraperzia, Giangaetano; Chelli, Riccardo

    2015-02-01

    Annealed importance sampling is a simulation method devised by Neal [Stat. Comput. 11, 125 (2001)] to assign weights to configurations generated by simulated annealing trajectories. In particular, the equilibrium average of a generic physical quantity can be computed by a weighted average exploiting weights and estimates of this quantity associated to the final configurations of the annealed trajectories. Here, we review annealed importance sampling from the perspective of nonequilibrium path-ensemble averages [G. E. Crooks, Phys. Rev. E 61, 2361 (2000)]. The equivalence of Neal's and Crooks' treatments highlights the generality of the method, which goes beyond the mere thermal-based protocols. Furthermore, we show that a temperature schedule based on a constant cooling rate outperforms stepwise cooling schedules and that, for a given elapsed computer time, performances of annealed importance sampling are, in general, improved by increasing the number of intermediate temperatures.
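
    As a hedged, minimal sketch of the method being reviewed (the prior, target, Metropolis kernel, and schedule length below are illustrative choices, not the paper's systems), the example runs annealed importance sampling with a linearly spaced inverse-temperature schedule, i.e. a constant cooling rate, and forms a weighted estimate of the target mean.

    ```python
    # Sketch of annealed importance sampling (AIS) with a linearly spaced
    # annealing schedule (the "constant rate" idea); the prior, target, and
    # Metropolis kernel below are illustrative choices.
    import math, random

    def log_prior(x):      # N(0, 5^2), unnormalized
        return -0.5 * (x / 5.0) ** 2

    def log_target(x):     # N(3, 0.5^2), unnormalized
        return -0.5 * ((x - 3.0) / 0.5) ** 2

    def log_f(x, beta):    # geometric bridge between prior and target
        return (1.0 - beta) * log_prior(x) + beta * log_target(x)

    def metropolis(x, beta, steps=5, scale=0.5):
        """A few Metropolis steps leaving the intermediate distribution invariant."""
        for _ in range(steps):
            y = x + random.gauss(0.0, scale)
            accept_prob = math.exp(min(0.0, log_f(y, beta) - log_f(x, beta)))
            if random.random() < accept_prob:
                x = y
        return x

    def ais(n_chains=500, n_temps=50):
        betas = [k / n_temps for k in range(n_temps + 1)]   # constant cooling rate
        weights, samples = [], []
        for _ in range(n_chains):
            x = random.gauss(0.0, 5.0)                      # draw from the prior
            logw = 0.0
            for k in range(1, len(betas)):
                logw += log_f(x, betas[k]) - log_f(x, betas[k - 1])
                x = metropolis(x, betas[k])
            weights.append(math.exp(logw))
            samples.append(x)
        wsum = sum(weights)
        return sum(w * x for w, x in zip(weights, samples)) / wsum   # weighted mean

    if __name__ == "__main__":
        random.seed(1)
        print("AIS estimate of the target mean:", round(ais(), 3))   # ~3.0
    ```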

  2. Optimizing Mars Airplane Trajectory with the Application Navigation System

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael; Riley, Derek

    2004-01-01

    Planning complex missions requires a number of programs to be executed in concert. The Application Navigation System (ANS), developed in the NAS Division, can execute many interdependent programs in a distributed environment. We show that the ANS simplifies user effort and reduces the time needed to optimize the trajectory of a Martian airplane. We use a software package, Cart3D, to evaluate trajectories and a shortest path algorithm to determine the optimal trajectory. ANS employs the GridScape to represent the dynamic state of the available computer resources. ANS then uses a scheduler to dynamically assign ready tasks to machine resources, and the GridScape to track available resources and forecast completion times of running tasks. We demonstrate the system's capability to schedule and run the trajectory optimization application with efficiency exceeding 60% on 64 processors.

  3. Dynamic advance reservation with delayed allocation

    DOEpatents

    Vokkarane, Vinod; Somani, Arun

    2014-12-02

    A method of scheduling data transmissions from a source to a destination, includes the steps of: providing a communication system having a number of channels and a number of paths, each of the channels having a plurality of designated time slots; receiving two or more data transmission requests; provisioning the transmission of the data; receiving data corresponding to at least one of the two or more data transmission requests; waiting until an earliest requested start time T.sub.s; allocating at the current time each of the two or more data transmission requests; transmitting the data; and repeating the steps of waiting, allocating, and transmitting until each of the two or more data transmission requests that have been provisioned for a transmission of data is satisfied. A system to perform the method of scheduling data transmissions is also described.
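
    To make the provision-then-allocate idea concrete (the request fields, channel/slot structure, and allocation policy below are invented for illustration and are not the patented method), here is a toy scheduler that accepts reservations early but binds channels and time slots only when the earliest requested start time arrives.

    ```python
    # Toy illustration of advance reservation with delayed allocation: requests
    # are accepted ("provisioned") early, but concrete channel/slot assignments
    # are made only at the start time. Data structures and policy are invented.

    class Scheduler:
        def __init__(self, n_channels, n_slots):
            self.free = {(c, s) for c in range(n_channels) for s in range(n_slots)}
            self.pending = []      # provisioned but not yet allocated requests

        def provision(self, name, slots_needed, start_slot):
            self.pending.append({"name": name, "need": slots_needed, "start": start_slot})

        def allocate_at(self, current_slot):
            """Called when time reaches `current_slot`: bind channels/slots now."""
            ready = [r for r in self.pending if r["start"] <= current_slot]
            allocations = {}
            for req in ready:
                usable = sorted(s for s in self.free if s[1] >= current_slot)[: req["need"]]
                if len(usable) == req["need"]:
                    self.free -= set(usable)
                    allocations[req["name"]] = usable
                    self.pending.remove(req)
            return allocations

    if __name__ == "__main__":
        sched = Scheduler(n_channels=2, n_slots=8)
        sched.provision("transfer_A", slots_needed=3, start_slot=2)
        sched.provision("transfer_B", slots_needed=2, start_slot=2)
        print(sched.allocate_at(current_slot=2))   # (channel, slot) pairs per request
    ```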

  4. T-L Plane Abstraction-Based Energy-Efficient Real-Time Scheduling for Multi-Core Wireless Sensors

    PubMed Central

    Kim, Youngmin; Lee, Ki-Seong; Pham, Ngoc-Son; Lee, Sun-Ro; Lee, Chan-Gun

    2016-01-01

    Energy efficiency is considered as a critical requirement for wireless sensor networks. As more wireless sensor nodes are equipped with multi-cores, there are emerging needs for energy-efficient real-time scheduling algorithms. The T-L plane-based scheme is known to be an optimal global scheduling technique for periodic real-time tasks on multi-cores. Unfortunately, there has been a scarcity of studies on extending T-L plane-based scheduling algorithms to exploit energy-saving techniques. In this paper, we propose a new T-L plane-based algorithm enabling energy-efficient real-time scheduling on multi-core sensor nodes with dynamic power management (DPM). Our approach addresses the overhead of processor mode transitions and reduces fragmentations of the idle time, which are inherent in T-L plane-based algorithms. Our experimental results show the effectiveness of the proposed algorithm compared to other energy-aware scheduling methods on T-L plane abstraction. PMID:27399722

  5. Application of modern control theory to scheduling and path-stretching maneuvers of aircraft in the near terminal area

    NASA Technical Reports Server (NTRS)

    Athans, M.

    1974-01-01

    A design concept of the dynamic control of aircraft in the near terminal area is discussed. An arbitrary set of nominal air routes, with possible multiple merging points, all leading to a single runway, is considered. The system allows for the automated determination of acceleration/deceleration of aircraft along the nominal air routes, as well as for the automated determination of path-stretching delay maneuvers. In addition to normal operating conditions, the system accommodates: (1) variable commanded separations over the outer marker to allow for takeoffs and between successive landings and (2) emergency conditions under which aircraft in distress have priority. The system design is based on a combination of three distinct optimal control problems involving a standard linear-quadratic problem, a parameter optimization problem, and a minimum-time rendezvous problem.

  6. Effects of aircraft and flight parameters on energy-efficient profile descents in time-based metered traffic

    NASA Technical Reports Server (NTRS)

    Dejarnette, F. R.

    1984-01-01

    Attention is given to a computer algorithm yielding the data required for a flight crew to navigate from an entry fix, about 100 nm from an airport, to a metering fix, and arrive there at a predetermined time, altitude, and airspeed. The flight path is divided into several descent and deceleration segments. Results for the case of a B-737 airliner indicate that wind and nonstandard atmospheric properties have a significant effect on the flight path and must be taken into account. While a range of combinations of Mach number and calibrated airspeed is possible for the descent segments leading to the metering fix, only small changes in the fuel consumed were observed for this range of combinations. A combination that is based on scheduling flexibility therefore seems preferable.

  7. Test results of flight guidance for fuel conservative descents in a time-based metered air traffic environment. [terminal configured vehicle

    NASA Technical Reports Server (NTRS)

    Knox, C. E.; Person, L. H., Jr.

    1981-01-01

    NASA developed, implemented, and flight tested a flight management algorithm designed to improve the accuracy of delivering an airplane in a fuel-conservative manner to a metering fix at a time designated by air traffic control. This algorithm provides a 3D path with time control (4D) for the TCV B-737 airplane to make an idle-thrust, clean-configured (landing gear up, flaps zero, and speed brakes retracted) descent to arrive at the metering fix at a predetermined time, altitude, and airspeed. The descent path is calculated for a constant Mach/airspeed schedule from linear approximations of airplane performance with considerations given for gross weight, wind, and nonstandard pressure and temperature effects. The flight management descent algorithms are described and flight test results are presented.

  8. Preliminary test results of a flight management algorithm for fuel conservative descents in a time based metered traffic environment. [flight tests of an algorithm to minimize fuel consumption of aircraft based on flight time

    NASA Technical Reports Server (NTRS)

    Knox, C. E.; Cannon, D. G.

    1979-01-01

    A flight management algorithm designed to improve the accuracy of delivering the airplane in a fuel-efficient manner to a metering fix at a time designated by air traffic control is discussed. The algorithm provides a 3-D path with time control (4-D) for a test B-737 airplane to make an idle-thrust, clean-configured descent to arrive at the metering fix at a predetermined time, altitude, and airspeed. The descent path is calculated for a constant Mach/airspeed schedule from linear approximations of airplane performance with considerations given for gross weight, wind, and nonstandard pressure and temperature effects. The flight management descent algorithms and the results of the flight tests are discussed.

  9. Analysis of Critical Parts and Materials

    DTIC Science & Technology

    1980-12-01

    Recovered fragments of the report's list of mitigation strategies for critical parts and materials: large orders; manual ordering of some critical parts; ordering spares with the original order; incentives; better capital investment... 23. Large orders; 24. Long lead procurement funding (including raw materials, facility funding); 25. Manpower analysis and training; 26. Manual ordering of some critical parts; 27. More active role in schedule negotiation; 28. Multiple source procurements; 29. Multi-year program funding; 30. Order...

  10. Concentration-Discharge Relations in the Critical Zone: Implications for Resolving Critical Zone Structure, Function, and Evolution

    NASA Astrophysics Data System (ADS)

    Chorover, Jon; Derry, Louis A.; McDowell, William H.

    2017-11-01

    Critical zone science seeks to develop mechanistic theories that describe critical zone structure, function, and long-term evolution. One postulate is that hydrogeochemical controls on critical zone evolution can be inferred from solute discharges measured down-gradient of reactive flow paths. These flow paths have variable lengths, interfacial compositions, and residence times, and their mixing is reflected in concentration-discharge (C-Q) relations. Motivation for this special section originates from a U.S. Critical Zone Observatories workshop that was held at the University of New Hampshire, 20-22 July 2015. The workshop focused on resolving mechanistic CZ controls over surface water chemical dynamics across the full range of lithogenic (e.g., nonhydrolyzing and hydrolyzing cations and oxyanions) and bioactive solutes (e.g., organic and inorganic forms of C, N, P, and S), including dissolved and colloidal species that may cooccur for a given element. Papers submitted to this special section on "concentration-discharge relations in the critical zone" include those from authors who attended the workshop, as well as others who responded to the open solicitation. Submissions were invited that utilized information pertaining to internal, integrated catchment function (relations between hydrology, biogeochemistry, and landscape structure) to help illuminate controls on observed C-Q relations.

  11. Job Scheduling in a Heterogeneous Grid Environment

    NASA Technical Reports Server (NTRS)

    Shan, Hong-Zhang; Smith, Warren; Oliker, Leonid; Biswas, Rupak

    2004-01-01

    Computational grids have the potential for solving large-scale scientific problems using heterogeneous and geographically distributed resources. However, a number of major technical hurdles must be overcome before this potential can be realized. One problem that is critical to effective utilization of computational grids is the efficient scheduling of jobs. This work addresses this problem by describing and evaluating a grid scheduling architecture and three job migration algorithms. The architecture is scalable and does not assume control of local site resources. The job migration policies use the availability and performance of computer systems, the network bandwidth available between systems, and the volume of input and output data associated with each job. An extensive performance comparison is presented using real workloads from leading computational centers. The results, based on several key metrics, demonstrate that the performance of our distributed migration algorithms is significantly greater than that of a local scheduling framework and comparable to a non-scalable global scheduling approach.
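
    As a toy illustration of the kind of trade-off such migration policies weigh (the scoring formula, site parameters, and job fields below are invented, not the paper's algorithms), the sketch estimates completion time at each site from queue wait, compute speed, and the time to move input and output data, then migrates the job to the cheapest site.

    ```python
    # Toy job-migration decision (illustrative; the paper's actual policies and
    # metrics are not reproduced): pick the site with the smallest estimated
    # completion time = queue wait + compute time + time to move input/output data.

    def estimated_completion(job, site):
        compute = job["work_tflop"] / site["tflops"]                    # seconds
        transfer = (job["input_gb"] + job["output_gb"]) * 8 / site["bandwidth_gbps"]
        return site["queue_wait_s"] + compute + transfer

    def choose_site(job, sites):
        return min(sites, key=lambda s: estimated_completion(job, s))["name"]

    if __name__ == "__main__":
        job = {"work_tflop": 5000, "input_gb": 200, "output_gb": 50}
        sites = [
            {"name": "local",   "tflops": 5,  "bandwidth_gbps": 100, "queue_wait_s": 7200},
            {"name": "remoteA", "tflops": 20, "bandwidth_gbps": 1,   "queue_wait_s": 600},
            {"name": "remoteB", "tflops": 10, "bandwidth_gbps": 10,  "queue_wait_s": 1800},
        ]
        print("migrate job to:", choose_site(job, sites))
    ```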

  12. Resolving Risks in Individual Astronauts: A New Paradigm for Critical Path Exposures

    NASA Technical Reports Server (NTRS)

    Richmond, Robert C.

    2005-01-01

    The limited number of astronauts available for risk assessment prevents classic epidemiologic study, and thereby requires an alternative approach to assessing risks within individual astronauts exposed to toxic agents identified within the Bioastronautics Critical Path Roadmap (BCPR). Developing a system of noninvasive real-time biodosimetry that provides large datasets for analysis before, during, and after missions, simultaneously determining 1) the kinds of toxic insult and 2) the degree of that insult within the tissues absorbing it, would be useful for resolving statistically significant risk assessment in individual astronauts. Therefore, a currently achievable multiparametric paradigm is presented for use in analyzing gene expression and protein expression so as to establish predictive outcomes.

  13. A path integral approach to the full Dicke model with dipole-dipole interaction

    NASA Astrophysics Data System (ADS)

    Aparicio Alcalde, M.; Stephany, J.; Svaiter, N. F.

    2011-12-01

    We consider the full Dicke spin-boson model composed of a single bosonic mode and an ensemble of N identical two-level atoms with different couplings for the resonant and anti-resonant interaction terms, and incorporate a dipole-dipole interaction between the atoms. Assuming that the system is in thermal equilibrium with a reservoir at temperature β⁻¹, we compute the free energy in the thermodynamic limit N → ∞ in the saddle-point approximation to the path integral and determine the critical temperature for the super-radiant phase transition. In the zero temperature limit, we recover the critical coupling of the quantum phase transition presented in the literature.

  14. Some tests of wet tropospheric calibration for the CASA Uno Global Positioning System experiment

    NASA Technical Reports Server (NTRS)

    Dixon, T. H.; Wolf, S. Kornreich

    1990-01-01

    Wet tropospheric path delay can be a major error source for Global Positioning System (GPS) geodetic experiments. Strategies for minimizing this error are investigated using data from CASA Uno, the first major GPS experiment in Central and South America, where wet path delays may be both high and variable. Wet path delay calibration using water vapor radiometers (WVRs) and residual delay estimation is compared with strategies where the entire wet path delay is estimated stochastically without prior calibration, using data from a 270-km test baseline in Costa Rica. Both approaches yield centimeter-level baseline repeatability and similar tropospheric estimates, suggesting that WVR calibration is not critical for obtaining high-precision results with GPS in the CASA region.

  15. On the Distribution of Free Path Lengthsfor the Periodic Lorentz Gas

    NASA Astrophysics Data System (ADS)

    Bourgain, Jean; Golse, François; Wennberg, Bernt

    Consider the domain exterior to a periodic array of obstacles, and let the free path length be the distance a particle travels before meeting an obstacle. The distribution of values of the free path length is studied in the limit of vanishing obstacle size. It is shown that one value of the scaling exponent γ is critical for this problem: in other words, the limiting behavior of the distribution depends only on whether γ is larger or smaller than this critical value.

  16. Probabilistic modeling of condition-based maintenance strategies and quantification of its benefits for airliners

    NASA Astrophysics Data System (ADS)

    Pattabhiraman, Sriram

    Airplane fuselage structures are designed with the concept of damage tolerance, wherein small damage is allowed to remain on the airplane, and damage that would otherwise affect the safety of the structure is repaired. Damage critical to the safety of the fuselage is repaired by scheduling maintenance at pre-determined intervals. Scheduling maintenance is an interesting trade-off between damage tolerance and cost: tolerance of larger damage would require less frequent maintenance, and hence a lower cost, to maintain a certain level of reliability. Alternatively, condition-based maintenance techniques have been developed using on-board sensors, which track damage continuously and request maintenance only when the damage size crosses a particular threshold. This permits tolerance of larger damage than scheduled maintenance, leading to savings in cost. This work quantifies the savings of condition-based maintenance over scheduled maintenance, and also quantifies converting the cost savings into weight savings. Structural health monitoring will need time to establish itself as a stand-alone system for maintenance, due to concerns about its diagnosis accuracy and reliability. This work therefore also investigates the effect of synchronizing a structural health monitoring system with scheduled maintenance: on-board SHM equipment is used to skip structural airframe maintenance (a subset of scheduled maintenance) whenever it is deemed unnecessary, while maintaining a desired level of structural safety. The work also predicts the necessary maintenance for a fleet of airplanes based on the current damage status of the airplanes, analyses the possibility of false alarms, wherein maintenance is requested although no critical damage is present on the airplane, and uses SHM as a tool to identify lemons, i.e., those airplanes that would warrant more maintenance trips than the average behavior of the fleet.

  17. Comparison of Two Watch Schedules for Personnel at the White House Military Office President's Emergency Operations Center.

    PubMed

    Shattuck, Nita Lewis; Matsangas, Panagiotis; Eriksen, Elke; Kulubis, Spiros

    2015-08-01

    The aim of this study was to assess effectiveness of an alternative, 24-hr-on/72-hr-off watchstanding schedule on sleep and morale of personnel assigned to the President's Emergency Operations Center (PEOC). As part of the White House Military Office, PEOC personnel historically worked a 12-hr "Panama" watch schedule. Personnel reported experiencing chronic insufficient and disrupted sleep patterns and sought advice for improving their watchstanding schedule. Participants (N = 14 active-duty military members, ages 29 to 42 years) completed the Profile of Mood State (POMS) three times: before, during, and after switching to the alternative schedule with 5-hr sleep periods built into their workday. Participants completed a poststudy questionnaire to assess individual schedule preferences. Sleep was measured actigraphically, supplemented by activity logs. As indicated by POMS scores, mood improved significantly on the new schedule. Although average total sleep amount did not change substantively, the timing of sleep was more consistent on the new schedule, resulting in better sleep hygiene. PEOC personnel overwhelmingly preferred the new schedule, reporting not only that they felt more rested but that the new schedule was more conducive to the demands of family life. Demands of family life and time spent commuting were found to be critical factors for acceptance of the alternative schedule. This new schedule will be most effective if personnel adhere to the scheduled rest periods assigned during their 24-hr duty. A successful schedule should avoid conflicts between social life and operational demands. Results may lead to changes in the work schedules of other departments with similar 24/7 responsibilities. © 2015, Human Factors and Ergonomics Society.

  18. The Search for Life Beyond Earth

    NASA Image and Video Library

    2014-07-14

    Dave Gallagher, Director of Astronomy, Physics, and Space Technology at NASA's Jet Propulsion Laboratory, speaks during a panel discussion on the search for life beyond Earth in the James E. Webb Auditorium at NASA Headquarters on Monday, July 14, 2014 in Washington, DC. The panel discussed how NASA's space-based observatories are making new discoveries and how the agency's new telescope, the James Webb Space Telescope, will continue this path of discovery after its scheduled launch in 2018. Photo Credit: (NASA/Joel Kowsky)

  19. KSC-01pp1200

    NASA Image and Video Library

    2001-06-26

    KENNEDY SPACE CENTER, Fla. -- The Joint Airlock Module, sporting a NASA logo, is moved toward the payload bay of Space Shuttle Atlantis for mission STS-104. Once installed and activated, the airlock becomes the primary path for International Space Station spacewalk entry and departure using U.S. spacesuits, which are known as Extravehicular Mobility Units, or EMUs. In addition, the Joint Airlock is designed to support the Russian Orlan spacesuit for EVA activity. Launch of Atlantis is scheduled no earlier than July 12 at 5:04 a.m. EDT

  20. The Search for Life Beyond Earth

    NASA Image and Video Library

    2014-07-14

    Sara Seager, a MacArthur Fellow and Professor of Planetary Science and Physics at the Massachusetts Institute of Technology, speaks during a panel discussion on the search for life beyond Earth in the James E. Webb Auditorium at NASA Headquarters on Monday, July 14, 2014 in Washington, DC. The panel discussed how NASA's space-based observatories are making new discoveries and how the agency's new telescope, the James Webb Space Telescope, will continue this path of discovery after its scheduled launch in 2018. Photo Credit: (NASA/Joel Kowsky)

  1. The Search for Life Beyond Earth

    NASA Image and Video Library

    2014-07-14

    John Grunsfeld, Associate Administrator for NASA's Science Mission Directorate, second from left, answers a question from the audience during a panel discussion on the search for life beyond Earth in the James E. Webb Auditorium at NASA Headquarters on Monday, July 14, 2014 in Washington, DC. The panel discussed how NASA's space-based observatories are making new discoveries and how the agency's new telescope, the James Webb Space Telescope, will continue this path of discovery after its scheduled launch in 2018. Photo Credit: (NASA/Joel Kowsky)

  2. The Search for Life Beyond Earth

    NASA Image and Video Library

    2014-07-14

    John Grunsfeld, Associate Administrator for NASA's Science Mission Directorate, far left, answers a question from the audience during a panel discussion on the search for life beyond Earth in the James E. Webb Auditorium at NASA Headquarters on Monday, July 14, 2014 in Washington, DC. The panel discussed how NASA's space-based observatories are making new discoveries and how the agency's new telescope, the James Webb Space Telescope, will continue this path of discovery after its scheduled launch in 2018. Photo Credit: (NASA/Joel Kowsky)

  3. The Search for Life Beyond Earth

    NASA Image and Video Library

    2014-07-14

    Matt Mountain, Director of the Space Telescope Science Institute and telescope scientist for the James Webb Space Telescope, speaks during a panel discussion on the search for life beyond Earth in the James E. Webb Auditorium at NASA Headquarters on Monday, July 14, 2014 in Washington, DC. The panel discussed how NASA's space-based observatories are making new discoveries and how the agency's new telescope, the James Webb Space Telescope, will continue this path of discovery after its scheduled launch in 2018. Photo Credit: (NASA/Joel Kowsky)

  4. The Search for Life Beyond Earth

    NASA Image and Video Library

    2014-07-14

    John Mather, Nobel Laureate and Project Scientist for the James Webb Space Telescope at NASA's Goddard Space Flight Center, speaks during a panel discussion on the search for life beyond Earth in the James E. Webb Auditorium at NASA Headquarters on Monday, July 14, 2014 in Washington, DC. The panel discussed how NASA's space-based observatories are making new discoveries and how the agency's new telescope, the James Webb Space Telescope, will continue this path of discovery after its scheduled launch in 2018. Photo Credit: (NASA/Joel Kowsky)

  5. The Search for Life Beyond Earth

    NASA Image and Video Library

    2014-07-14

    NASA Chief Scientist Ellen Stofan, far left, asks the members of the panel a question during a discussion of the search for life beyond Earth in the James E. Webb Auditorium at NASA Headquarters on Monday, July 14, 2014 in Washington, DC. The panel discussed how NASA's space-based observatories are making new discoveries and how the agency's new telescope, the James Webb Space Telescope, will continue this path of discovery after its scheduled launch in 2018. Photo Credit: (NASA/Joel Kowsky)

  6. 49 CFR Appendix A to Part 272 - Schedule of Civil Penalties 1

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION CRITICAL INCIDENT STRESS PLANS Pt. 272, App. A Appendix A to Part... incident stress plan: (a) Failure to inform about relief options 5,000 6,000 (b) Failure to offer timely... critical incident stress plan for approval by the Federal Railroad Administration. (a) Failure to submit a...

  7. Automating Mid- and Long-Range Scheduling for NASA's Deep Space Network

    NASA Technical Reports Server (NTRS)

    Johnston, Mark D.; Tran, Daniel; Arroyo, Belinda; Sorensen, Sugi; Tay, Peter; Carruth, Butch; Coffman, Adam; Wallace, Mike

    2012-01-01

    NASA has recently deployed a new mid-range scheduling system for the antennas of the Deep Space Network (DSN), called Service Scheduling Software, or S(sup 3). This system is architected as a modern web application containing a central scheduling database integrated with a collaborative environment, exploiting the same technologies as social web applications but applied to a space operations context. This is highly relevant to the DSN domain since the network schedule of operations is developed in a peer-to-peer negotiation process among all users who utilize the DSN (representing 37 projects including international partners and ground-based science and calibration users). The initial implementation of S(sup 3) is complete and the system has been operational since July 2011. S(sup 3) has been used for negotiating schedules since April 2011, including the baseline schedules for three launching missions in late 2011. S(sup 3) supports a distributed scheduling model, in which changes can potentially be made by multiple users based on multiple schedule "workspaces" or versions of the schedule. This has led to several challenges in the design of the scheduling database, and of a change proposal workflow that allows users to concur with or to reject proposed schedule changes, and then counter-propose with alternative or additional suggested changes. This paper describes some key aspects of the S(sup 3) system and lessons learned from its operational deployment to date, focusing on the challenges of multi-user collaborative scheduling in a practical and mission-critical setting. We will also describe the ongoing project to extend S(sup 3) to encompass long-range planning, downtime analysis, and forecasting, as the next step in developing a single integrated DSN scheduling tool suite to cover all time ranges.
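
    The abstract sketches a concur/reject/counter-propose workflow over shared schedule workspaces; the toy state machine below is one simplified reading of that idea (class names, states, and the example change text are invented, not S(sup 3)'s actual design).

    ```python
    # Minimal sketch of a change-proposal workflow of the kind described above
    # (states and transition rules are a simplified reading, not the S3 design).

    class ChangeProposal:
        def __init__(self, proposer, change, stakeholders):
            self.proposer = proposer
            self.change = change
            self.responses = {s: None for s in stakeholders}   # None / "concur" / "reject"
            self.state = "proposed"

        def respond(self, stakeholder, answer, counter=None):
            if self.state != "proposed":
                raise RuntimeError("proposal already resolved")
            self.responses[stakeholder] = answer
            if answer == "reject" and counter is not None:
                # A rejection may carry a counter-proposal that restarts negotiation.
                self.state = "countered"
                return ChangeProposal(stakeholder, counter, list(self.responses))
            if all(r == "concur" for r in self.responses.values()):
                self.state = "accepted"        # change is committed to the baseline
            elif any(r == "reject" for r in self.responses.values()):
                self.state = "rejected"
            return None

    if __name__ == "__main__":
        p = ChangeProposal("MissionX", "move pass to antenna A, 02:00-06:00",
                           ["MissionY", "MissionZ"])
        p.respond("MissionY", "concur")
        counter = p.respond("MissionZ", "reject",
                            counter="move pass to antenna B, 03:00-07:00")
        print(p.state, "->", counter.change if counter else None)
    ```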

  8. Velocity Inversion In Cylindrical Couette Gas Flows

    NASA Astrophysics Data System (ADS)

    Dongari, Nishanth; Barber, Robert W.; Emerson, David R.; Zhang, Yonghao; Reese, Jason M.

    2012-05-01

    We investigate a power-law probability distribution function to describe the mean free path of rarefied gas molecules in non-planar geometries. A new curvature-dependent model is derived by taking into account the boundary-limiting effects on the molecular mean free path for surfaces with both convex and concave curvatures. In comparison to a planar wall, we find that the mean free path for a convex surface is higher at the wall and exhibits a sharper gradient within the Knudsen layer. In contrast, a concave wall exhibits a lower mean free path near the surface and the gradients in the Knudsen layer are shallower. The Navier-Stokes constitutive relations and velocity-slip boundary conditions are modified based on a power-law scaling to describe the mean free path, in accordance with the kinetic theory of gases, i.e. transport properties can be described in terms of the mean free path. Velocity profiles for isothermal cylindrical Couette flow are obtained using the power-law model. We demonstrate that our model is more accurate than the classical slip solution, especially in the transition regime, and we are able to capture important non-linear trends associated with the non-equilibrium physics of the Knudsen layer. In addition, we establish a new criterion for the critical accommodation coefficient that leads to the non-intuitive phenomena of velocity-inversion. Our results are compared with conventional hydrodynamic models and direct simulation Monte Carlo data. The power-law model predicts that the critical accommodation coefficient is significantly lower than that calculated using the classical slip solution and is in good agreement with available DSMC data. Our proposed constitutive scaling for non-planar surfaces is based on simple physical arguments and can be readily implemented in conventional fluid dynamics codes for arbitrary geometric configurations.

  9. Channel Modelling and Performance of Non-Line-of-Sight Ultraviolet Scattering Communications

    DTIC Science & Technology

    2012-01-01

    Avalanche photodiode (APD) detectors are also rapidly being developed [6, 7]. These device advances have inspired recent research in LED-based short...response and path loss results for outdoor NLOS UV communication channels in Section 3. The impulse response modelling describes UV pulse broadening via...Both the impulse response and path loss are critical to communication system design and performance assessment. Although pulse broadening creates inter

  10. Optimizing End-to-End Big Data Transfers over Terabits Network Infrastructure

    DOE PAGES

    Kim, Youngjae; Atchley, Scott; Vallee, Geoffroy R.; ...

    2016-04-05

    While future terabit networks hold the promise of significantly improving big-data motion among geographically distributed data centers, significant challenges must be overcome even on today's 100 gigabit networks to realize end-to-end performance. Multiple bottlenecks exist along the end-to-end path from source to sink; for instance, the data storage infrastructure at both the source and sink and its interplay with the wide-area network are increasingly the bottleneck to achieving high performance. In this study, we identify the issues that lead to congestion on the path of an end-to-end data transfer in the terabit network environment, and we present a new bulk data movement framework for terabit networks, called LADS. LADS exploits the underlying storage layout at each endpoint to maximize throughput without negatively impacting the performance of shared storage resources for other users. LADS also uses the Common Communication Interface (CCI) in lieu of the sockets interface to benefit from hardware-level zero-copy and operating system bypass capabilities when available. It can further improve data transfer performance under congestion on the end systems using buffering at the source with flash storage. With our evaluations, we show that LADS can avoid congested storage elements within the shared storage resource, improving input/output bandwidth and data transfer rates across the high speed networks. We also investigate the performance degradation problems of LADS due to I/O contention on the parallel file system (PFS), when multiple LADS tools share the PFS. We design and evaluate a meta-scheduler to coordinate multiple I/O streams while sharing the PFS, to minimize the I/O contention on the PFS. Finally, with our evaluations, we observe that LADS with meta-scheduling can further improve the performance by up to 14 percent relative to LADS without meta-scheduling.
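
    The meta-scheduler is described only at a high level; as a generic, hedged sketch of the coordination idea (the stream count, chunk timing, and the semaphore-based policy are assumptions, not the LADS design), the example below caps the number of transfer streams that may touch the shared file system at once.

    ```python
    # Generic sketch of coordinating concurrent transfer streams against a shared
    # parallel file system: at most MAX_STREAMS run at once. This illustrates the
    # idea of meta-scheduling only; it is not the LADS implementation.
    import threading, time, random

    MAX_STREAMS = 4
    pfs_slots = threading.Semaphore(MAX_STREAMS)

    def transfer(stream_id, n_chunks):
        for chunk in range(n_chunks):
            with pfs_slots:                              # hold a PFS slot only while doing I/O
                time.sleep(random.uniform(0.01, 0.03))   # simulated read/write of one chunk
        print(f"stream {stream_id} done")

    if __name__ == "__main__":
        threads = [threading.Thread(target=transfer, args=(i, 5)) for i in range(10)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
    ```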

  12. Extended charge banking model of dual path shocks for implantable cardioverter defibrillators

    PubMed Central

    Dosdall, Derek J; Sweeney, James D

    2008-01-01

    Background Single path defibrillation shock methods have been improved through the use of the Charge Banking Model of defibrillation, which predicts the response of the heart to shocks as a simple resistor-capacitor (RC) circuit. While dual path defibrillation configurations have significantly reduced defibrillation thresholds, improvements to dual path defibrillation techniques have been limited to experimental observations without a practical model to aid in improving dual path defibrillation techniques. Methods The Charge Banking Model has been extended into a new Extended Charge Banking Model of defibrillation that represents small sections of the heart as separate RC circuits, uses a weighting factor based on published defibrillation shock field gradient measures, and implements a critical mass criteria to predict the relative efficacy of single and dual path defibrillation shocks. Results The new model reproduced the results from several published experimental protocols that demonstrated the relative efficacy of dual path defibrillation shocks. The model predicts that time between phases or pulses of dual path defibrillation shock configurations should be minimized to maximize shock efficacy. Discussion Through this approach the Extended Charge Banking Model predictions may be used to improve dual path and multi-pulse defibrillation techniques, which have been shown experimentally to lower defibrillation thresholds substantially. The new model may be a useful tool to help in further improving dual path and multiple pulse defibrillation techniques by predicting optimal pulse durations and shock timing parameters. PMID:18673561
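
    In the spirit of the model described above (but with invented section weights, membrane time constant, threshold, and critical-mass fraction, and a single constant-voltage pulse instead of a realistic waveform), the sketch below treats each myocardial section as a first-order RC circuit driven by a weighted shock field and tests a critical-mass success criterion.

    ```python
    # Sketch in the spirit of the extended charge-banking idea: each myocardial
    # section is an RC circuit driven by a weighted shock field, and the shock
    # "succeeds" if a critical mass of sections exceeds a response threshold.
    # Weights, time constant, threshold, and waveform are illustrative only.
    import math

    def shock_succeeds(weights, shock_v, duration_ms, tau_ms=3.0, threshold=0.6,
                       critical_fraction=0.9):
        responses = []
        for w in weights:
            # Membrane response of one section to a constant-voltage pulse:
            # V(t) = w * shock_v * (1 - exp(-t / tau))
            v = w * shock_v * (1.0 - math.exp(-duration_ms / tau_ms))
            responses.append(v)
        excited = sum(1 for v in responses if v >= threshold)
        return excited / len(responses) >= critical_fraction

    if __name__ == "__main__":
        # Field-gradient weights for 10 sections (near-electrode sections see more field).
        weights = [1.0, 0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.35, 0.3, 0.25]
        for v in (1.0, 2.0, 3.0):
            ok = shock_succeeds(weights, v, duration_ms=8)
            print(f"shock strength {v:.1f} (arb. units): success = {ok}")
    ```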

  13. Lunar Atmosphere and Dust Environment Explorer Integration and Test

    NASA Technical Reports Server (NTRS)

    Wright, Michael R.; McCormick, John L.; Hoffman, Richard G.

    2010-01-01

    Integration and test (I&T) of the Lunar Atmosphere and Dust Environment Explorer (LADEE) is presented. A collaborative NASA project between Goddard Space Flight Center and Ames Research Center, LADEE's mission is to explore the low lunar orbit environment and exosphere for constituents. Its instruments include two spectrometers, a dust detector, and a laser communication technology demonstration. Although a relatively low-cost spacecraft, LADEE has I&T requirements typical of most planetary probes, such as prelaunch contamination control, sterilization, and instrument calibration. To lead to a successful mission, I&T at the spacecraft, instrument, and observatory level must include step-by-step and end-to-end functional, environmental, and performance testing. Due to its compressed development schedule, LADEE I&T planning requires adjusting test flows and sequences to account for long-lead critical-path items and limited spares. A protoflight test-level strategy is also baselined. However, the program benefits from having two independent but collaborative teams of engineers, managers, and technicians that have a wealth of flight project experience. This paper summarizes the LADEE I&T planning, flow, facilities, and probe-unique processes. Coordination of requirements and approaches to I&T when multiple organizations are involved is discussed. Also presented are cost-effective approaches to I&T that are transferable to most any spaceflight project I&T program.

  14. Retargeting of existing FORTRAN program and development of parallel compilers

    NASA Technical Reports Server (NTRS)

    Agrawal, Dharma P.

    1988-01-01

    The software models used in implementing the parallelizing compiler for the B-HIVE multiprocessor system are described. The various models and strategies used in the compiler development are: the flexible granularity model, which allows a compromise between two extreme granularity models; the communication model, which is capable of precisely describing interprocessor communication timings and patterns; the loop type detection strategy, which identifies different types of loops; the critical path with coloring scheme, which is a versatile scheduling strategy for any multicomputer with some associated communication costs; and the loop allocation strategy, which realizes optimum overlapped operations between computation and communication of the system. Using these models, several sample routines of the AIR3D package are examined and tested. The automatically generated codes are highly parallelized to provide the maximum degree of parallelism, obtaining speedups on systems of up to 28 to 32 processors. A comparison of parallel codes for both the existing and proposed communication models is performed and the corresponding expected speedup factors are obtained. The experimentation shows that the B-HIVE compiler produces more efficient codes than existing techniques. Work is progressing well in completing the final phase of the compiler; numerous enhancements are needed to improve the capabilities of the parallelizing compiler.
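
    The compiler's critical-path-with-coloring strategy itself is not spelled out in the abstract; as a generic, hedged illustration of critical-path-driven scheduling (the task graph, costs, and the zero-communication-cost assumption below are invented), here is a minimal list scheduler that always dispatches the ready task with the longest remaining path to the exit.

    ```python
    # Generic critical-path list scheduling for a task DAG on P identical
    # processors (an illustration of the general technique named above, not the
    # B-HIVE compiler's coloring scheme; the task graph is invented).

    def critical_path_lengths(tasks, succs):
        """Longest path from each task to an exit node, including its own cost."""
        cp = {}
        def length(t):
            if t not in cp:
                cp[t] = tasks[t] + max((length(s) for s in succs.get(t, [])), default=0)
            return cp[t]
        for t in tasks:
            length(t)
        return cp

    def list_schedule(tasks, succs, n_procs):
        preds = {t: set() for t in tasks}
        for t, ss in succs.items():
            for s in ss:
                preds[s].add(t)
        cp = critical_path_lengths(tasks, succs)
        proc_free = [0.0] * n_procs
        finish, done = {}, set()
        while len(done) < len(tasks):
            ready = [t for t in tasks if t not in done and preds[t] <= done]
            t = max(ready, key=lambda x: cp[x])          # most critical task first
            p = min(range(n_procs), key=lambda i: proc_free[i])
            start = max([proc_free[p]] + [finish[q] for q in preds[t]])
            finish[t] = start + tasks[t]
            proc_free[p] = finish[t]
            done.add(t)
        return finish

    if __name__ == "__main__":
        cost = {"A": 2, "B": 3, "C": 2, "D": 4, "E": 1}
        succ = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": ["E"]}
        print(list_schedule(cost, succ, n_procs=2))      # task -> finish time
    ```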

  15. Remote observing with the Keck Telescopes from the U.S. mainland

    NASA Astrophysics Data System (ADS)

    Kibrick, Robert I.; Allen, Steve L.; Conrad, Albert

    2000-06-01

    We describe the current status of efforts to establish a high-bandwidth network from the U.S. mainland to Mauna Kea and a facility in California to support Keck remote observing and engineering via the Internet. The California facility will be an extension of the existing Keck remote operations facility located in Waimea, Hawaii. It will be targeted towards short-duration observing runs which now comprise roughly half of all scheduled science runs on the Keck Telescope. Keck technical staff in Hawaii will support remote observers on the mainland via video conferencing and collaborative software tools. Advantages and disadvantages of remote operation from California versus Hawaii are explored, and costs of alternative communication paths examined. We describe a plan for a backup communications path to protect against failure of the primary network. Alternative software models for remote operation are explored, and recent operational results described.

  16. Development and test results of a flight management algorithm for fuel conservative descents in a time-based metered traffic environment

    NASA Technical Reports Server (NTRS)

    Knox, C. E.; Cannon, D. G.

    1980-01-01

    A simple flight management descent algorithm designed to improve the accuracy of delivering an airplane in a fuel-conservative manner to a metering fix at a time designated by air traffic control was developed and flight tested. This algorithm provides a three dimensional path with terminal area time constraints (four dimensional) for an airplane to make an idle thrust, clean configured (landing gear up, flaps zero, and speed brakes retracted) descent to arrive at the metering fix at a predetermined time, altitude, and airspeed. The descent path was calculated for a constant Mach/airspeed schedule from linear approximations of airplane performance with considerations given for gross weight, wind, and nonstandard pressure and temperature effects. The flight management descent algorithm is described. The results of the flight tests flown with the Terminal Configured Vehicle airplane are presented.
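
    As a greatly simplified, hedged sketch of the kind of geometry such an algorithm works out (the real algorithm uses linearized airplane performance data with weight, wind, and nonstandard pressure and temperature corrections; the descent rate, groundspeed, and wind below are invented), the example estimates how far before the metering fix an idle-thrust descent should begin.

    ```python
    # Greatly simplified top-of-descent estimate in the spirit of a constant
    # Mach/CAS idle descent. All numbers below are illustrative.

    def top_of_descent(cruise_alt_ft, fix_alt_ft, descent_rate_fpm,
                       groundspeed_kt, wind_kt=0.0):
        """Distance before the metering fix (nm) at which to begin the descent,
        and the time the descent will take (min)."""
        alt_to_lose = cruise_alt_ft - fix_alt_ft
        time_min = alt_to_lose / descent_rate_fpm
        effective_gs = groundspeed_kt + wind_kt          # tailwind positive
        distance_nm = effective_gs * time_min / 60.0
        return distance_nm, time_min

    if __name__ == "__main__":
        dist, t = top_of_descent(cruise_alt_ft=35000, fix_alt_ft=10000,
                                 descent_rate_fpm=2200, groundspeed_kt=320,
                                 wind_kt=-30)
        print(f"start descent ~{dist:.0f} nm before the fix (~{t:.1f} min)")
    ```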

  17. False Starts and Breakthroughs: Senior Thesis Research as a Critical Learning Process

    ERIC Educational Resources Information Center

    Schaus, Margaret; Snyder, Terry

    2018-01-01

    Every senior at Haverford College writes a thesis or its equivalent, conducting independent research with guidance from faculty and librarians. Students critically engage in investigative work in archives, field studies, and labs. In this article, librarians explore the way anthropology and history thesis writers do research to define paths toward…

  18. Teaching Note: When a "Feminist Approach" Is Too Narrow

    ERIC Educational Resources Information Center

    Bondestam, Fredrik

    2011-01-01

    For feminist literary critics and teachers writing about and teaching literature "after feminism," the path is potentially treacherous. Feminist literary criticism, if it is applied too narrowly and used to reject complex literary texts that do not uphold an imagined feminist standard of "positive images" of women, can end up undermining other…

  19. Converging Paths: Creativity Research and Educational Practice

    ERIC Educational Resources Information Center

    Hanson, Michael Hanchett

    2014-01-01

    Education has long been a central issue for creativity research, and the integration of creativity and education has remained a goal and controversy. In spite of over sixty years of trying to bring creativity into education, education is often criticized for not teaching creative thinking, while also criticized from other quarters for not meeting…

  20. Retrieving Immortal Questions, Initiating Immortal Conversations

    ERIC Educational Resources Information Center

    Duarte, Eduardo M.

    2012-01-01

    In his presidential address, which is included in this collection of papers, Kip Kline suggests that the time has arrived to redirect the work of philosophy of education away from the path of critical theory, and thus to depart from what he described as the discourse of "parrhesia." The author takes Kline's critique of critical philosophy of…

  1. Relations between Goals, Self-Efficacy, Critical Thinking and Deep Processing Strategies: A Path Analysis

    ERIC Educational Resources Information Center

    Phan, Huy Phuong

    2009-01-01

    Research exploring students' academic learning has recently amalgamated different motivational theories within one conceptual framework. The inclusion of achievement goals, self-efficacy, deep processing and critical thinking has been cited in a number of studies. This article discusses two empirical studies that examined these four theoretical…

  2. Mission Operations Planning and Scheduling System (MOPSS)

    NASA Technical Reports Server (NTRS)

    Wood, Terri; Hempel, Paul

    2011-01-01

    MOPSS is a generic framework that can be configured on the fly to support a wide range of planning and scheduling applications. It is currently used to support seven missions at Goddard Space Flight Center (GSFC) in roles that include science planning, mission planning, and real-time control. Prior to MOPSS, each spacecraft project built its own planning and scheduling capability to plan satellite activities and communications and to create the commands to be uplinked to the spacecraft. This approach required creating a data repository for storing planning and scheduling information, building user interfaces to display data, generating needed scheduling algorithms, and implementing customized external interfaces. Complex scheduling problems that involved reacting to multiple variable situations were analyzed manually. Operators then used the results to add commands to the schedule. Each architecture was unique to specific satellite requirements. MOPSS is an expert system that automates mission operations and frees the flight operations team to concentrate on critical activities. It is easily reconfigured by the flight operations team as the mission evolves. The heart of the system is a custom object-oriented data layer mapped onto an Oracle relational database. The combination of these two technologies allows a user or system engineer to capture any type of scheduling or planning data in the system's generic data storage via a GUI.

  3. Structural factoring approach for analyzing stochastic networks

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.; Shier, Douglas R.

    1991-01-01

    The problem of finding the distribution of the shortest path length through a stochastic network is investigated. A general algorithm for determining the exact distribution of the shortest path length is developed based on the concept of conditional factoring, in which a directed, stochastic network is decomposed into an equivalent set of smaller, generally less complex subnetworks. Several network constructs are identified and exploited to reduce significantly the computational effort required to solve a network problem relative to complete enumeration. This algorithm can be applied to two important classes of stochastic path problems: determining the critical path distribution for acyclic networks and the exact two-terminal reliability for probabilistic networks. Computational experience with the algorithm was encouraging and allowed the exact solution of networks that have been previously analyzed only by approximation techniques.
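
    The conditional-factoring algorithm is not spelled out in the abstract. For context, the sketch below shows the complete-enumeration baseline it is compared against: the exact critical-path-length distribution of a small acyclic stochastic network, obtained by enumerating every combination of discrete arc durations. The network, durations, and probabilities are illustrative assumptions.

```python
# Minimal sketch: exact critical-path-length distribution of a small acyclic
# stochastic network by complete enumeration (the baseline that factoring
# methods improve upon). Network and distributions are illustrative.

from itertools import product
from collections import defaultdict

# arc (tail, head) -> discrete duration distribution as (duration, probability)
arcs = {
    ("s", "a"): [(2, 0.5), (4, 0.5)],
    ("s", "b"): [(3, 1.0)],
    ("a", "t"): [(1, 0.7), (5, 0.3)],
    ("b", "t"): [(2, 0.6), (6, 0.4)],
}
topo = ["s", "a", "b", "t"]            # a topological order of the acyclic network

cp_dist = defaultdict(float)            # critical-path length -> probability
arc_list = list(arcs.items())
for combo in product(*(choices for _, choices in arc_list)):
    prob, durations = 1.0, {}
    for (arc, _), (d, p) in zip(arc_list, combo):
        durations[arc] = d
        prob *= p
    # longest s-to-t path for this realization, by DP over the topological order
    dist = {"s": 0.0}
    for v in topo[1:]:
        dist[v] = max(dist[u] + durations[(u, w)] for (u, w) in durations if w == v)
    cp_dist[dist["t"]] += prob

for length in sorted(cp_dist):
    print(f"P(critical path length = {length}) = {cp_dist[length]:.3f}")
```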

  4. Impact of different eddy covariance sensors and set-up on the annual balance of CO2 and fluxes of CH4 and latent heat in the Arctic

    NASA Astrophysics Data System (ADS)

    Goodrich, J. P.; Zona, D.; Gioli, B.; Murphy, P.; Burba, G. G.; Oechel, W. C.

    2015-12-01

    Expanding eddy covariance measurements of CO2 and CH4 fluxes in the Arctic is critical for refining the global C budget. Continuous measurements are particularly challenging because of the remote locations, low power availability, and extreme weather conditions. The necessity for tailoring instrumentation at different sites further complicates the interpretation of results and may add uncertainty to estimates of annual CO2 budgets. We investigated the influence of different sensor combinations on FCO2, latent heat (LE), and FCH4, and assessed the differences in annual FCO2 estimated with different instrumentation at the same sites. Using data from four sites across the North Slope of Alaska, we resolved FCO2 and FCH4 to within 5% using different combinations of open- and closed-path gas analyzers and within 10% using heated and non-heated anemometers. A continuously heated anemometer increased data coverage relative to non-heated anemometers while resulting in comparable annual FCO2, despite over-estimating sensible heat fluxes by 15%. We also implemented an intermittent heating strategy whereby heating was activated only when ice or snow blockage of the transducers was detected. This resulted in data coverage (~ 60%) comparable to the continuously heated anemometer, while avoiding potential over-estimation of sensible heat and gas fluxes. We found good agreement in FCO2 and FCH4 from two closed-path and one open-path gas analyzer, despite the need for large spectral corrections of closed-path fluxes and density and temperature corrections to open-path sensors. However, data coverage was generally greater when using closed-path analyzers, especially during cold seasons (36-40% vs 10-14% for the open path), when fluxes from Arctic regions are particularly uncertain and potentially critical to annual C budgets. Measurement of Arctic LE remains a challenge due to strong attenuation along sample tubes, even when heated, that could not be accounted for with spectral corrections.

  5. 48 CFR 52.245-1 - Government Property.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... property was acquired consistent with its engineering, production planning, and property control operations...) Separate inventory disposal schedules are required for aircraft in any condition, flight safety critical...

  6. 48 CFR 52.245-1 - Government Property.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... property was acquired consistent with its engineering, production planning, and property control operations...) Separate inventory disposal schedules are required for aircraft in any condition, flight safety critical...

  7. 48 CFR 52.245-1 - Government Property.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... property was acquired consistent with its engineering, production planning, and property control operations...) Separate inventory disposal schedules are required for aircraft in any condition, flight safety critical...

  8. Mothers and meals. The effects of mothers' meal planning and shopping motivations on children's participation in family meals.

    PubMed

    McIntosh, William Alex; Kubena, Karen S; Tolle, Glen; Dean, Wesley R; Jan, Jie-sheng; Anding, Jenna

    2010-12-01

    Participation in family meals has been associated with benefits for health and social development of children. The objective of the study was to identify the impact of mothers' work of caring through planning regularly scheduled meals, shopping and cooking, on children's participation in family meals. Parents of children aged 9-11 or 13-15 years from 300 Houston families were surveyed about parents' work, meal planning for and scheduling of meals, motivations for food purchases, importance of family meals, and children's frequency of eating dinner with their families. The children were interviewed about the importance of eating family meals. Hypotheses were tested using path analysis to calculate indirect and total effects of variables on the outcome variable of frequency of children eating dinner with their family. Mothers' belief in the importance of family meals increased likelihood of children eating dinner with families by increasing likelihood that mothers planned dinner and that dinners were regularly scheduled. Mothers' perception of time pressures on meal preparation had a negative, indirect effect on the frequency of children's participation in family dinners by reducing mothers' meal planning. Copyright © 2010 Elsevier Ltd. All rights reserved.

  9. Relationship between total quality management, critical paths, and outcomes management.

    PubMed

    Lynn, P A

    1996-09-01

    Total quality management (TQM), clinical paths, and outcomes management are high-profile strategies in today's health care environment. Each strategy is distinct, yet there are interrelationships among them. TQM supports a customer-focused organizational culture, providing tools and techniques to identify and solve problems. Clinical paths are tools for enhancing patient care coordination and for identifying system-wide and patient population specific issues. Outcomes management is an integrated system for measuring the results in patient populations over time. There is a recent shift in outcomes measurement towards expanding both the nature of the outcomes examined and the timeframes in which they are studied.

  10. Fundamental changes to EPA's research enterprise: the path forward.

    PubMed

    Anastas, Paul T

    2012-01-17

    Environmental protection in the United States has reached a critical juncture. It has become clear that to address the complex and interrelated environmental challenges we face, we must augment our traditional approaches. The scientific community must build upon its deep understanding of risk assessment, risk management, and reductionism with tools, technologies, insights and approaches to pursue sustainability. The U.S. Environmental Protection Agency (EPA) has recognized this need for systemic change by implementing a new research paradigm called "The Path Forward." This paper outlines the principles of the Path Forward and the actions taken since 2010 to align EPA's research efforts with the goal of sustainability.

  11. Critique of Coleman's Theory of the Vanishing Cosmological Constant

    NASA Astrophysics Data System (ADS)

    Susskind, Leonard

    In these lectures I would like to review some of the criticisms of the Coleman wormhole theory of the vanishing cosmological constant. In particular, I would like to focus on the most fundamental assumption that the path integral over topologies defines a probability for the cosmological constant which has the form EXP(A) with A being the Baum-Hawking-Coleman saddle point. Coleman argues that the Euclidean path integral over all geometries may be dominated by special configurations which consist of large smooth "spheres" connected by any number of narrow wormholes. Formally summing up such configurations gives a very divergent expression for the path integral…

  12. Flexible work in call centres: Working hours, work-life conflict & health.

    PubMed

    Bohle, Philip; Willaby, Harold; Quinlan, Michael; McNamara, Maria

    2011-01-01

    Call-centre workers encounter major psychosocial pressures, including high work intensity and undesirable working hours. Little is known, however, about whether these pressures vary with employment status and how they affect work-life conflict and health. Questionnaire data were collected from 179 telephone operators in Sydney, Australia, of whom 124 (69.3%) were female and 54 (30.2%) were male. Ninety-three (52%) were permanent full-time workers, 37 (20.7%) were permanent part-time, and 49 (27.4%) were casual employees. Hypothesised structural relationships between employment status, working hours and work organisation, work-life conflict and health were tested using partial least squares modelling in PLS (Chin, 1998). The final model demonstrated satisfactory fit. It supported important elements of the hypothesised structure, although four of the proposed paths failed to reach significance and the fit was enhanced by adding a path. The final model indicated that casual workers reported more variable working hours which were relatively weakly associated with greater dissatisfaction with hours. The interaction of schedule control and variability of hours also predicted dissatisfaction with hours. Conversely, permanent workers reported greater work intensity, which was associated with both lower work schedule control and greater work-life conflict. Greater work-life conflict was associated with more fatigue and psychological symptoms. Labour market factors and the undesirability of longer hours in a stressful, high-intensity work environment appear to have contributed to the results. Copyright © 2010 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  13. Efficiently Scheduling Multi-core Guest Virtual Machines on Multi-core Hosts in Network Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoginath, Srikanth B; Perumalla, Kalyan S

    2011-01-01

    Virtual machine (VM)-based simulation is a method used by network simulators to incorporate realistic application behaviors by executing actual VMs as high-fidelity surrogates for simulated end-hosts. A critical requirement in such a method is the simulation time-ordered scheduling and execution of the VMs. Prior approaches such as time dilation are less efficient due to the high degree of multiplexing possible when multiple multi-core VMs are simulated on multi-core host systems. We present a new simulation time-ordered scheduler to efficiently schedule multi-core VMs on multi-core real hosts, with a virtual clock realized on each virtual core. The distinguishing features of our approach are: (1) customizable granularity of the VM scheduling time unit on the simulation time axis, (2) ability to take arbitrary leaps in virtual time by VMs to maximize the utilization of host (real) cores when guest virtual cores idle, and (3) empirically determinable optimality in the tradeoff between total execution (real) time and time-ordering accuracy levels. Experiments show that it is possible to get nearly perfect time-ordered execution, with a slight cost in total run time, relative to optimized non-simulation VM schedulers. Interestingly, with our time-ordered scheduler, it is also possible to reduce the time-ordering error from over 50% with a non-simulation scheduler to less than 1% with our scheduler, with almost the same run time efficiency as that of the highly efficient non-simulation VM schedulers.
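
    The scheduler itself is not reproduced in this record. The sketch below illustrates only the core idea of simulation time-ordered execution under assumed parameters: virtual cores are dispatched in order of their virtual clocks, each advances by a configurable quantum, and an idle virtual core may take a leap in virtual time. Core names, counts, the quantum, and the idle-leap rule are illustrative.

```python
# Minimal sketch of simulation time-ordered scheduling of virtual cores on fewer
# real cores: always run the virtual cores that are furthest behind in virtual
# time, advancing each by a configurable quantum. Illustrative setup only.

import heapq

QUANTUM = 1.0          # granularity of the VM scheduling time unit (virtual ms)
REAL_CORES = 2         # real cores available to the hypervisor
vcpu_clocks = {"vm0.c0": 0.0, "vm0.c1": 0.0, "vm1.c0": 0.0, "vm1.c1": 0.0}

# priority queue ordered by virtual clock keeps execution time-ordered
ready = [(clock, name) for name, clock in vcpu_clocks.items()]
heapq.heapify(ready)

def run_round():
    """Dispatch one scheduling round: the REAL_CORES most-lagging vCPUs run."""
    batch = [heapq.heappop(ready) for _ in range(min(REAL_CORES, len(ready)))]
    for clock, name in batch:
        # an idle vCPU may leap ahead in virtual time instead of burning a quantum
        leap = 3.0 * QUANTUM if name == "vm1.c1" else QUANTUM   # hypothetical idle leap
        vcpu_clocks[name] = clock + leap
        heapq.heappush(ready, (vcpu_clocks[name], name))

for _ in range(6):
    run_round()
print(vcpu_clocks)     # spread of virtual clocks bounds the time-ordering error
```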

  14. Healthcare4VideoStorm: Making Smart Decisions Based on Storm Metrics.

    PubMed

    Zhang, Weishan; Duan, Pengcheng; Chen, Xiufeng; Lu, Qinghua

    2016-04-23

    Storm-based stream processing is widely used for real-time large-scale distributed processing. Knowing the run-time status and ensuring performance is critical to providing expected dependability for some applications, e.g., continuous video processing for security surveillance. The granularity of existing scheduling strategies is too coarse to achieve good performance, and they mainly consider network resources but not computing resources when scheduling. In this paper, we propose Healthcare4Storm, a framework that finds Storm insights based on Storm metrics to gain knowledge from the health status of an application, finally ending up with smart scheduling decisions. It takes into account both network and computing resources and conducts scheduling at a fine-grained level using tuples instead of topologies. The comprehensive evaluation shows that the proposed framework has good performance and can improve the dependability of Storm-based applications.

  15. Exploring the Mechanisms of Differentiation, Dedifferentiation, Reprogramming and Transdifferentiation

    PubMed Central

    Xu, Li; Zhang, Kun; Wang, Jin

    2014-01-01

    We explored the underlying mechanisms of differentiation, dedifferentiation, reprogramming and transdifferentiation (cell type switchings) from landscape and flux perspectives. Lineage reprogramming is a new regenerative method to convert a matured cell into another cell including direct transdifferentiation without undergoing a pluripotent cell state and indirect transdifferentiation with an initial dedifferentiation-reversion (reprogramming) to a pluripotent cell state. Each cell type is quantified by a distinct valley on the potential landscape with higher probability. We investigated three driving forces for cell fate decision making: stochastic fluctuations, gene regulation and induction, which can lead to cell type switchings. We showed that under the driving forces the direct transdifferentiation process proceeds from a differentiated cell valley to another differentiated cell valley through either a distinct stable intermediate state or a certain series of unstable indeterminate states. The dedifferentiation process proceeds through a pluripotent cell state. Barrier height and the corresponding escape time from the valley on the landscape can be used to quantify the stability and efficiency of cell type switchings. We also uncovered the mechanisms of the underlying processes by quantifying the dominant biological paths of cell type switchings on the potential landscape. The dynamics of cell type switchings are determined by both landscape gradient and flux. The flux can lead to the deviations of the dominant biological paths for cell type switchings from the naively expected landscape gradient path. As a result, the corresponding dominant paths of cell type switchings are irreversible. We also classified the mechanisms of cell fate development from our landscape theory: super-critical pitchfork bifurcation, sub-critical pitchfork bifurcation, sub-critical pitchfork with two saddle-node bifurcation, and saddle-node bifurcation. Our model showed good agreements with the experiments. It provides a general framework to explore the mechanisms of differentiation, dedifferentiation, reprogramming and transdifferentiation. PMID:25133589

  16. Path planning and energy management of solar-powered unmanned ground vehicles

    NASA Astrophysics Data System (ADS)

    Kaplan, Adam

    Many of the applications pertinent to unmanned vehicles, such as environmental research and analysis, communications, and information-surveillance and reconnaissance, benefit from prolonged vehicle operation time. Conventional efforts to increase the operational time of electric-powered unmanned vehicles have traditionally focused on the design of energy-efficient components and the identification of energy efficient search patterns, while little attention has been paid to the vehicle's mission-level path plan and power management. This thesis explores the formulation and generation of integrated motion-plans and power-schedules for solar-panel equipped mobile robots operating under strict energy constraints, which cannot be effectively addressed through conventional motion planning algorithms. Transit problems are considered to design time-optimal paths using both Balkcom-Mason and Pseudo-Dubins curves. Additionally, a more complicated problem to generate mission plans for vehicles which must persistently travel between certain locations, similar to the traveling salesperson problem (TSP), is presented. A comparison between one of the common motion-planning algorithms and experimental results of the prescribed algorithms, made possible by use of a test environment and mobile robot designed and developed specifically for this research, are presented and discussed.

  17. Heuristic for Critical Machine Based a Lot Streaming for Two-Stage Hybrid Production Environment

    NASA Astrophysics Data System (ADS)

    Vivek, P.; Saravanan, R.; Chandrasekaran, M.; Pugazhenthi, R.

    2017-03-01

    Lot streaming in a hybrid flowshop (HFS) is encountered in many real-world problems. This paper deals with a heuristic approach for lot streaming based on critical machine consideration for a two-stage hybrid flowshop. The first stage has two identical parallel machines and the second stage has only one machine. The second-stage machine is considered critical for valid reasons; this kind of problem is known to be NP-hard. A mathematical model was developed for the selected problem. The simulation modelling and analysis were carried out in Extend V6 software. A heuristic was developed for obtaining an optimal lot streaming schedule. Eleven cases of lot streaming were considered. The proposed heuristic was verified and validated by real-time simulation experiments. All possible lot streaming strategies and all possible sequences under each lot streaming strategy were simulated and examined. The heuristic yielded an optimal schedule consistently in all eleven cases. An identification procedure for selecting the best lot streaming strategy was suggested.
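
    The heuristic and the eleven lot-streaming cases are not given in the abstract. The sketch below only evaluates the makespan of one candidate lot-streaming plan for the stated configuration (two identical parallel machines at stage 1, one critical machine at stage 2); sublot sizes and unit processing times are illustrative assumptions.

```python
# Minimal sketch: makespan of one lot-streaming plan for a two-stage hybrid
# flowshop (two identical parallel machines at stage 1, one critical machine at
# stage 2). Sublot sizes and unit processing times are illustrative only.

sublots = [40, 30, 30]            # a 100-unit lot split into three sublots (one candidate plan)
p1, p2 = 0.5, 0.8                 # unit processing times at stage 1 and stage 2

# stage 1: each sublot goes to the earliest-available of the two parallel machines
m1_free = [0.0, 0.0]
arrivals = []                     # (completion time at stage 1, sublot size)
for size in sublots:
    m = min(range(2), key=lambda i: m1_free[i])
    m1_free[m] += size * p1
    arrivals.append((m1_free[m], size))

# stage 2: the single critical machine processes sublots as they become available
m2_free = 0.0
for ready, size in sorted(arrivals):
    m2_free = max(m2_free, ready) + size * p2

print(f"makespan for this lot-streaming plan: {m2_free:.1f}")
```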

  18. Workforce deployment--a critical organizational competency.

    PubMed

    Harms, Roxanne

    2009-01-01

    Staff scheduling has historically been embedded within hospital operations, often defined by each new manager of a unit or program, and notably absent from the organization's practice and standards infrastructure and accountabilities of the executive team. Silvestro and Silvestro contend that "there is a need to recognize that hospital performance relies critically on the competence and effectiveness of roster planning activities, and that these activities are therefore of strategic importance." This article highlights the importance of including staff scheduling--or workforce deployment--in health care organizations' long-term strategic solutions to cope with the deepening workforce shortage (which is likely to hit harder than ever as the economy begins to recover). Viewing workforce deployment as a key organizational competency is a critical success factor for health care in the next decade, and the Workforce Deployment Maturity Model is discussed as a framework to enable organizations to measure their current capabilities, identify priorities and set goals for increasing organizational competency using a methodical and deliberate approach.

  19. Status Report on the Development of Micro-Scheduling Software for the Advanced Outage Control Center Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Germain, Shawn St.; Thomas, Kenneth; Farris, Ronald

    2014-09-01

    The long-term viability of existing nuclear power plants (NPPs) in the United States (U.S.) is dependent upon a number of factors, including maintaining high capacity factors, maintaining nuclear safety, and reducing operating costs, particularly those associated with refueling outages. Refueling outages typically take 20-30 days, and for existing light water NPPs in the U.S., the reactor cannot be in operation during the outage. Furthermore, given that many NPPs generate between $1-1.5 million/day in revenue when in operation, there is considerable interest in shortening the length of refueling outages. Yet, refueling outages are highly complex operations, involving multiple concurrent and dependent activities that are difficult to coordinate. Finding ways to improve refueling outage performance while maintaining nuclear safety has proven to be difficult. The Advanced Outage Control Center project is a research and development (R&D) demonstration activity under the Light Water Reactor Sustainability (LWRS) Program. LWRS is an R&D program which works with industry R&D programs to establish technical foundations for the licensing and managing of long-term, safe, and economical operation of current NPPs. The Advanced Outage Control Center project has the goal of improving the management of commercial NPP refueling outages. To accomplish this goal, this INL R&D project is developing an advanced outage control center (OCC) that is specifically designed to maximize the usefulness of communication and collaboration technologies for outage coordination and problem resolution activities. This report describes specific recent efforts to develop a capability called outage Micro-Scheduling. Micro-Scheduling is the ability to allocate and schedule outage support task resources on a sub-hour basis. Micro-Scheduling is the real-time fine-tuning of the outage schedule to react to the actual progress of the primary outage activities to ensure that support task resources are optimally deployed with the least amount of delay and unproductive use of resources. The remaining sections of this report describe in more detail the scheduling challenges that occur during outages, how a Micro-Scheduling capability helps address those challenges, and provide a status update on work accomplished to date and the path forward.

  20. Energy latency tradeoffs for medium access and sleep scheduling in wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Gang, Lu

    Wireless sensor networks are expected to be used in a wide range of applications from environment monitoring to event detection. The key challenge is to provide energy efficient communication; however, latency remains an important concern for many applications that require fast response. The central thesis of this work is that energy efficient medium access and sleep scheduling mechanisms can be designed without necessarily sacrificing application-specific latency performance. We validate this thesis through results from four case studies that cover various aspects of medium access and sleep scheduling design in wireless sensor networks. Our first effort, DMAC, is an adaptive, low-latency, and energy-efficient MAC for data gathering, designed to reduce sleep latency. We propose a staggered schedule, duty cycle adaptation, data prediction and the use of more-to-send packets to enable seamless packet forwarding under varying traffic load and channel contentions. Simulation and experimental results show significant energy savings and latency reduction while ensuring high data reliability. The second research effort, DESS, investigates the problem of designing sleep schedules in arbitrary network communication topologies to minimize the worst-case end-to-end latency (referred to as the delay diameter). We develop a novel graph-theoretical formulation, derive and analyze optimal solutions for the tree and ring topologies and heuristics for arbitrary topologies. The third study addresses the problem of minimum latency joint scheduling and routing (MLSR). By constructing a novel delay graph, the optimal joint scheduling and routing can be solved by an M node-disjoint paths algorithm under the multiple channel model. We further extended the algorithm to handle dynamic traffic changes and topology changes. A heuristic solution is proposed for MLSR under single channel interference. In the fourth study, EEJSPC, we first formulate a fundamental optimization problem that provides tunable energy-latency-throughput tradeoffs with joint scheduling and power control, and present both exponential- and polynomial-complexity solutions. Then we investigate the problem of minimizing total transmission energy while satisfying transmission requests within a latency bound, and present an iterative approach which converges rapidly to the optimal parameter settings.
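
    None of the four schemes is specified in enough detail here to reproduce; the sketch below illustrates only the sleep-latency model that delay-diameter style analysis (the DESS case) rests on: each node wakes once per period, a packet waits at each hop for the receiver's next wake-up slot, and the delay diameter is the worst case over all source-destination pairs of the minimum end-to-end latency. Topology, period, and slot assignment are illustrative.

```python
# Minimal sketch of the sleep-latency model behind delay-diameter analysis:
# each node wakes in one slot per period, and a packet waits at every hop for
# the receiver's next wake-up slot. Topology and slot choices are illustrative.

import heapq

K = 4                                           # wake-up slots per period
slots = {"a": 0, "b": 1, "c": 2, "d": 3}        # assigned wake-up slot of each node
neighbors = {"a": ["b", "d"], "b": ["a", "c"],  # a small ring topology
             "c": ["b", "d"], "d": ["c", "a"]}

def hop_delay(u, v):
    """Slots a packet buffered at u waits until v's next wake-up (staggered forwarding)."""
    d = (slots[v] - slots[u]) % K
    return d if d else K                        # a full period if the slots coincide

def min_latency(src):
    """Dijkstra over per-hop wake-up delays: minimum latency from src to every node."""
    dist, pq = {src: 0}, [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue
        for v in neighbors[u]:
            nd = d + hop_delay(u, v)
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

delay_diameter = max(max(min_latency(s).values()) for s in slots)
print("delay diameter (slots):", delay_diameter)
```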

  1. Strategy for D/He-3 fusion development

    NASA Technical Reports Server (NTRS)

    Santarius, John F.

    1988-01-01

    It is concluded that Deuterium/Helium-3 fusion faces a more difficult physics development path but an easier technology development path than does Deuterium/Tritium. Early D/He-3 tests in next generation D/T fusion experiments might provide a valuable D/He-3 proof-of-principle at modest cost. At least one high leverage alternate concept should be vigorously pursued. Space applications of D/He-3 fusion are critically important to large scale development.

  2. WEAMR — A Weighted Energy Aware Multipath Reliable Routing Mechanism for Hotline-Based WSNs

    PubMed Central

    Tufail, Ali; Qamar, Arslan; Khan, Adil Mehmood; Baig, Waleed Akram; Kim, Ki-Hyung

    2013-01-01

    Reliable source to sink communication is the most important factor for an efficient routing protocol, especially in the domains of military, healthcare and disaster recovery applications. We present weighted energy aware multipath reliable routing (WEAMR), a novel energy aware multipath routing protocol which utilizes hotline-assisted routing to meet such requirements for mission critical applications. The protocol reduces the average number of hops from source to destination and provides unmatched reliability compared to well-known reactive ad hoc protocols, i.e., AODV and AOMDV. Our protocol makes efficient use of network paths based on weighted cost calculation and intelligently selects the best possible paths for data transmissions. The path cost calculation considers the end-to-end number of hops, latency, and the minimum node energy value in the path. In case of path failure, path recalculation is done efficiently with minimum latency and control packet overhead. Our evaluation shows that our proposal provides better end-to-end delivery with less routing overhead and a higher packet delivery success ratio compared to AODV and AOMDV. The use of multiple paths also increases the overall lifetime of the WSN by using optimum-energy available paths between sender and receiver. PMID:23669714
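
    The exact WEAMR cost function is not given in the abstract. The sketch below is a hypothetical weighted path cost of the kind described, combining hop count, end-to-end latency, and the minimum residual energy of any node on the path; the weights, energy values, and candidate paths are illustrative assumptions.

```python
# Hypothetical weighted multipath cost of the kind the abstract describes
# (hop count, end-to-end latency, minimum residual node energy). The actual
# WEAMR weighting is not reproduced; weights and paths are illustrative.

def path_cost(path, latency_ms, energy, w_hops=1.0, w_lat=0.05, w_energy=2.0):
    """Lower is better: penalize hops and latency, reward a healthy bottleneck node."""
    hops = len(path) - 1
    min_energy = min(energy[n] for n in path)        # weakest node on the path
    return w_hops * hops + w_lat * latency_ms - w_energy * min_energy

energy = {"S": 1.0, "n1": 0.8, "n2": 0.3, "n3": 0.9, "D": 1.0}   # residual energy (0..1)
candidates = {                                       # path -> measured end-to-end latency (ms)
    ("S", "n1", "D"): 40.0,
    ("S", "n2", "D"): 25.0,
    ("S", "n1", "n3", "D"): 55.0,
}
best = min(candidates, key=lambda p: path_cost(p, candidates[p], energy))
print("selected path:", " -> ".join(best))
```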

  3. WEAMR-a weighted energy aware multipath reliable routing mechanism for hotline-based WSNs.

    PubMed

    Tufail, Ali; Qamar, Arslan; Khan, Adil Mehmood; Baig, Waleed Akram; Kim, Ki-Hyung

    2013-05-13

    Reliable source to sink communication is the most important factor for an efficient routing protocol, especially in the domains of military, healthcare and disaster recovery applications. We present weighted energy aware multipath reliable routing (WEAMR), a novel energy aware multipath routing protocol which utilizes hotline-assisted routing to meet such requirements for mission critical applications. The protocol reduces the average number of hops from source to destination and provides unmatched reliability compared to well-known reactive ad hoc protocols, i.e., AODV and AOMDV. Our protocol makes efficient use of network paths based on weighted cost calculation and intelligently selects the best possible paths for data transmissions. The path cost calculation considers the end-to-end number of hops, latency, and the minimum node energy value in the path. In case of path failure, path recalculation is done efficiently with minimum latency and control packet overhead. Our evaluation shows that our proposal provides better end-to-end delivery with less routing overhead and a higher packet delivery success ratio compared to AODV and AOMDV. The use of multiple paths also increases the overall lifetime of the WSN by using optimum-energy available paths between sender and receiver.

  4. Speed and path control for conflict-free flight in high air traffic demand in terminal airspace

    NASA Astrophysics Data System (ADS)

    Rezaei, Ali

    To accommodate the growing air traffic demand, flights will need to be planned and navigated with a much higher level of precision than today's aircraft flight path. The Next Generation Air Transportation System (NextGen) stands to benefit significantly in safety and efficiency from such movement of aircraft along precisely defined paths. Air Traffic Operations (ATO) relying on such precision--the Precision Air Traffic Operations or PATO--are the foundation of the high throughput capacity envisioned for future airports. In PATO, the preferred method is to manage the air traffic by assigning a speed profile to each aircraft in a given fleet in a given airspace (in practice known as speed control). In this research, an algorithm has been developed, set in the context of a Hybrid Control System (HCS) model, that determines whether a speed control solution exists for a given fleet of aircraft in a given airspace and, if so, computes this solution as a collective speed profile that assures separation if executed without deviation. Uncertainties such as weather are not considered, but the algorithm can be modified to include uncertainties. The algorithm first computes all feasible sequences (i.e., all sequences that allow the given fleet of aircraft to reach destinations without violating the FAA's separation requirement) by looking at all pairs of aircraft. Then, the most likely sequence is determined and the speed control solution is constructed by backward trajectory generation, starting with the aircraft last out and proceeding to the first out. This computation can be done for different sequences in parallel, which helps to reduce the computation time. If such a solution does not exist, then the algorithm calculates a minimal path modification (known as path control) that will allow separation-compliant speed control. We also prove that the algorithm will modify the path without creating a new separation violation. The new path will be generated by adding new waypoints in the airspace. As a byproduct, instead of minimal path modification, one can use the aircraft arrival time schedule to generate the sequence in which the aircraft reach their destinations.
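
    The sequencing and trajectory-generation machinery is not reproduced here. The sketch below illustrates only the backward pass the abstract describes, under strong simplifying assumptions: a fixed landing sequence, constant speed to a single merge fix, and a single separation value. All callsigns, distances, speed limits, and the separation requirement are illustrative.

```python
# Minimal sketch of the backward construction described above: fix the landing
# sequence, then work from the last aircraft toward the first, assigning each a
# fix-crossing time (and hence a speed) that preserves separation.
# Distances, speed limits, and the separation value are illustrative.

SEP = 90.0                                   # required separation at the merge fix (seconds)
# (callsign, distance to the fix in nmi, min speed in kt, max speed in kt), in landing order
fleet = [("AC1", 28.0, 180.0, 250.0),
         ("AC2", 35.0, 180.0, 250.0),
         ("AC3", 42.0, 180.0, 250.0)]

def eta_window(dist_nmi, vmin_kt, vmax_kt):
    """Earliest and latest achievable fix-crossing times at constant speed (seconds)."""
    return dist_nmi / vmax_kt * 3600.0, dist_nmi / vmin_kt * 3600.0

# Backward pass: start with the aircraft last out and proceed to the first out.
# Each aircraft gets the latest time it can achieve, leaving the most room for the
# aircraft ahead of it; a real scheduler would trade this off against delay.
assigned, latest_allowed = {}, float("inf")
for callsign, dist, vmin, vmax in reversed(fleet):
    earliest, latest = eta_window(dist, vmin, vmax)
    target = min(latest, latest_allowed)
    if target < earliest:
        raise ValueError(f"{callsign}: speed control alone is infeasible; "
                         "a path modification (path control) would be needed")
    assigned[callsign] = target
    latest_allowed = target - SEP            # the preceding aircraft must cross SEP earlier

print({c: f"{t:.0f} s" for c, t in assigned.items()})
```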

  5. A Critical Investigation of Advanced Placement U.S. History Textbooks

    ERIC Educational Resources Information Center

    Cramer, Gregory J.

    2012-01-01

    This dissertation critically investigates how Advanced Placement (AP) U.S. History textbooks portray key events in Latino/a history. The investigation is made in light of claims made by the College Board, the ACLU, scholars, and federal and state governments that the Advanced Placement program is the path to educational equity for Latino/a…

  6. Exploring Critical Factors of Self Concept among High Income Community College Graduates

    ERIC Educational Resources Information Center

    Rasul, Mohamad Sattar; Nor, Ahmad Rosli Mohd; Amat, Salleh; Rauf, Rose Amnah Abdul

    2015-01-01

    This study was undertaken to explore the critical factors influencing the self-concept of community college graduates in the development of their careers. Individuals with a positive self-concept are often associated with a good career choices and a well-panned career development path. Hence community college students should be girded with a…

  7. Commentary: Youth Are Critical to Stemming the Worldwide Tide of Chronic Disease

    ERIC Educational Resources Information Center

    Baldwin, Wendy

    2013-01-01

    Adolescence is a critical developmental stage and an opportunity to set a positive course for future health and well-being. Adolescence may be the "last best chance" to address some of the behaviors that can have significant repercussions for an individual's health trajectory, specifically the path to noncommunicable diseases (NCDs). Why should…

  8. Untrodden Paths: A Critical Conversation about Wilder Places in Outdoor Education

    ERIC Educational Resources Information Center

    Straker, Jo; Potter, Tom G.; Irwin, David

    2017-01-01

    This paper asks, what is the outdoors, and challenges conceptions of the role the outdoors play in education. It critically examines why a better understanding of the outdoors is important to outdoor education, how wilder places are essential to education, and how learning generated from these places can be translated into sustainable thinking and…

  9. Critical Behavior of Spatial Evolutionary Game with Altruistic to Spiteful Preferences on Two-Dimensional Lattices

    NASA Astrophysics Data System (ADS)

    Yang, Bo; Li, Xiao-Teng; Chen, Wei; Liu, Jian; Chen, Xiao-Song

    2016-10-01

    A self-questioning mechanism, similar to the single spin-flip of the Ising model in statistical physics, is introduced into a spatial evolutionary game model. We propose a game model with altruistic to spiteful preferences via weighted sums of own and opponent's payoffs. This game model can be transformed into an Ising model with an external field. Both the interaction between spins and the external field are determined by the elements of the payoff matrix and the preference parameter. In the case of perfect rationality at zero social temperature, this game model has three different phases: an entirely cooperative phase, an entirely non-cooperative phase and a mixed phase. In the investigations of the game model with Monte Carlo simulation, two paths of payoff and preference parameters are taken. In one path, the system undergoes a discontinuous transition from the cooperative phase to the non-cooperative phase with the change of the preference parameter. In another path, two continuous transitions appear one after another when the system changes from the cooperative phase to the non-cooperative phase with the preference parameter. The critical exponents ν, β, and γ of the two continuous phase transitions are estimated by finite-size scaling analysis. Both continuous phase transitions have the same critical exponents and they belong to the same universality class as the two-dimensional Ising model. Supported by the National Natural Science Foundation of China under Grant Nos. 11121403 and 11504384
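
    The scaling analysis itself is not reproduced in the abstract. The block below is the standard finite-size scaling ansatz (not taken from the paper) on which such an estimate of ν, β, and γ typically rests, together with the exact two-dimensional Ising values of those exponents.

```latex
% Generic finite-size scaling ansatz (not taken from the paper) for the order
% parameter m and susceptibility \chi; t = (T - T_c)/T_c, L = linear lattice size.
\begin{aligned}
  m(t, L)    &\simeq L^{-\beta/\nu}\,\tilde{m}\bigl(t\,L^{1/\nu}\bigr),\\
  \chi(t, L) &\simeq L^{\gamma/\nu}\,\tilde{\chi}\bigl(t\,L^{1/\nu}\bigr),\\
  \text{2D Ising:}\quad & \nu = 1,\qquad \beta = \tfrac{1}{8},\qquad \gamma = \tfrac{7}{4}.
\end{aligned}
```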

  10. The Effect of Sharrows, Painted Bicycle Lanes and Physically Protected Paths on the Severity of Bicycle Injuries Caused by Motor Vehicles.

    PubMed

    Wall, Stephen P; Lee, David C; Frangos, Spiros G; Sethi, Monica; Heyer, Jessica H; Ayoung-Chee, Patricia; DiMaggio, Charles J

    2016-01-01

    We conducted individual and ecologic analyses of prospectively collected data from 839 injured bicyclists who collided with motorized vehicles and presented to Bellevue Hospital, an urban Level-1 trauma center in New York City, from December 2008 to August 2014. Variables included demographics, scene information, rider behaviors, bicycle route availability, and whether the collision occurred before the road segment was converted to a bicycle route. We used negative binomial modeling to assess the risk of injury occurrence following bicycle path or lane implementation. We dichotomized U.S. National Trauma Data Bank Injury Severity Scores (ISS) into none/mild (0-8) versus moderate, severe, or critical (>8) and used adjusted multivariable logistic regression to model the association of ISS with collision proximity to sharrows (i.e., bicycle lanes designated for sharing with cars), painted bicycle lanes, or physically protected paths. Negative binomial modeling of monthly counts, while adjusting for pedestrian activity, revealed that physically protected paths were associated with 23% fewer injuries. Painted bicycle lanes reduced injury risk by nearly 90% (IDR 0.09, 95% CI 0.02-0.33). Holding all else equal, compared to no bicycle route, a bicycle injury near sharrows was nearly twice as likely to be moderate, severe, or critical (adjusted odds ratio 1.94; 95% confidence interval (CI) 0.91-4.15). Painted bicycle lanes and physically protected paths were 1.52 (95% CI 0.85-2.71) and 1.66 (95% CI 0.85-3.22) times as likely to be associated with more than mild injury, respectively.

  11. Rotational-path decomposition based recursive planning for spacecraft attitude reorientation

    NASA Astrophysics Data System (ADS)

    Xu, Rui; Wang, Hui; Xu, Wenming; Cui, Pingyuan; Zhu, Shengying

    2018-02-01

    The spacecraft reorientation is a common task in many space missions. With multiple pointing constraints, it is greatly difficult to solve the constrained spacecraft reorientation planning problem. To deal with this problem, an efficient rotational-path decomposition based recursive planning (RDRP) method is proposed in this paper. The uniform pointing-constraint-ignored attitude rotation planning process is designed to solve all rotations without considering pointing constraints. Then the whole path is checked node by node. If any pointing constraint is violated, the nearest critical increment approach will be used to generate feasible alternative nodes in the process of rotational-path decomposition. As the planning path of each subdivision may still violate pointing constraints, multiple decomposition is needed and the reorientation planning is designed as a recursive manner. Simulation results demonstrate the effectiveness of the proposed method. The proposed method has been successfully applied in two SPARK microsatellites to solve onboard constrained attitude reorientation planning problem, which were developed by the Shanghai Engineering Center for Microsatellites and launched on 22 December 2016.
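
    The RDRP method and its nearest-critical-increment rule are not reproduced in this record. The sketch below only illustrates the recursive check-and-subdivide structure for a single keep-out cone, with a naive detour-node rule standing in for the paper's feasible-node generation; the keep-out axis, cone angle, margin, and attitudes are illustrative assumptions.

```python
# Much-simplified sketch of recursive rotational-path decomposition under one
# pointing constraint: interpolate the boresight between two attitudes, and if
# an intermediate node violates a keep-out cone, plan through a detour node and
# recurse on both halves. The detour rule here is a naive stand-in, not the
# paper's nearest-critical-increment approach; all vectors are illustrative.

import numpy as np

SUN = np.array([1.0, 0.0, 0.0])            # keep-out axis (unit vector)
KEEP_OUT = np.radians(30.0)                # boresight must stay > 30 deg from SUN
MARGIN = np.radians(5.0)

def slerp(a, b, t):
    """Great-circle interpolation between unit vectors a and b."""
    ang = np.arccos(np.clip(a @ b, -1.0, 1.0))
    if ang < 1e-9:
        return a.copy()
    return (np.sin((1 - t) * ang) * a + np.sin(t * ang) * b) / np.sin(ang)

def violates(v):
    return np.arccos(np.clip(v @ SUN, -1.0, 1.0)) < KEEP_OUT

def plan(a, b, depth=0, steps=25):
    """Boresight nodes from a to b; subdivide through a detour node if needed."""
    nodes = [slerp(a, b, t) for t in np.linspace(0.0, 1.0, steps)]
    if depth > 6 or not any(violates(v) for v in nodes):
        return nodes
    # naive detour: a node on the keep-out boundary, lifted out of the a-b plane
    n = np.cross(a, b)
    n /= np.linalg.norm(n)                 # assumes a and b are not (anti)parallel
    detour = np.cos(KEEP_OUT + MARGIN) * SUN + np.sin(KEEP_OUT + MARGIN) * n
    return plan(a, detour, depth + 1)[:-1] + plan(detour, b, depth + 1)

start = np.array([0.0, 1.0, 0.0])
goal = np.array([0.6, -0.8, 0.0])
path = plan(start, goal)
print(len(path), "nodes; constraint satisfied:", not any(violates(v) for v in path))
```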

  12. Maintenance and operations contractor plan for transition to the project Hanford management contract (PHMC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Waite, J.L.

    1996-04-12

    This plan has been developed by Westinghouse Hanford Company (WHC), and its subcontractors ICF Kaiser Hanford (ICF KH) and BCS Richland, Inc. (BCSR), at the direction of the US Department of Energy (DOE), Richland Operations Office (RL). WHC and its subcontractors are hereafter referred to as the Maintenance and Operations (M and O) Contractor. The plan identifies actions involving the M and O Contractor that are critical to (1) prepare for a smooth transition to the Project Hanford Management Contractor (PHMC), and (2) support and assist the PHMC and RL in achieving transition as planned, with no or minimal impact on ongoing baseline activities. The plan is structured around two primary phases. The first is the pre-award phase, which started in mid-February 1996 and is currently scheduled to be completed on June 1, 1996, at which time the contract is currently planned to be awarded. The second is the follow-on four-month post-award phase from June 1, 1996, until October 1, 1996. Considering the magnitude and complexity of the scope of work being transitioned, completion in four months will require significant effort by all parties. To better ensure success, the M and O Contractor has developed a pre-award phase that is intended to maximize readiness for transition. Priority is given to preparation for facility assessments and processing of personnel, as these areas are determined to be on the critical path for transition. In addition, the M and O Contractor will place emphasis during the pre-award phase on closing out open items prior to contract award, including grievances, employee concerns, audit findings, compliance issues, etc.

  13. Advanced composite fuselage technology

    NASA Technical Reports Server (NTRS)

    Ilcewicz, Larry B.; Smith, Peter J.; Horton, Ray E.

    1993-01-01

    Boeing's ATCAS program has completed its third year and continues to progress towards a goal to demonstrate composite fuselage technology with cost and weight advantages over aluminum. Work on this program is performed by an integrated team that includes several groups within The Boeing Company, industrial and university subcontractors, and technical support from NASA. During the course of the program, the ATCAS team has continued to perform a critical review of composite developments by recognizing advances in metal fuselage technology. Despite recent material, structural design, and manufacturing advancements for metals, polymeric matrix composite designs studied in ATCAS still project significant cost and weight advantages for future applications. A critical path to demonstrating technology readiness for composite transport fuselage structures was created to summarize ATCAS tasks for Phases A, B, and C. This includes a global schedule and list of technical issues which will be addressed throughout the course of studies. Work performed in ATCAS since the last ACT conference is also summarized. Most activities relate to crown quadrant manufacturing scaleup and performance verification. The former was highlighted by fabricating a curved, 7 ft. by 10 ft. panel, with cocured hat-stiffeners and cobonded J-frames. In building to this scale, process developments were achieved for tow-placed skins, drape formed stiffeners, braided/RTM frames, and panel cure tooling. Over 700 tests and supporting analyses have been performed for crown material and design evaluation, including structural tests that demonstrated limit load requirements for severed stiffener/skin failsafe damage conditions. Analysis of tests for tow-placed hybrid laminates with large damage indicates a tensile fracture toughness that is higher than that observed for advanced aluminum alloys. Additional recent ATCAS achievements include crown supporting technology, keel quadrant design evaluation, and sandwich process development.

  14. The Search for Life Beyond Earth

    NASA Image and Video Library

    2014-07-14

    John Mather, Project Scientist for the James Webb Space Telescope at NASA's Goddard Space Flight Center, center, answers a question from the audience during a panel discussion on the search for life beyond Earth in the James E. Webb Auditorium at NASA Headquarters on Monday, July 14, 2014 in Washington, DC. The panel discussed how NASA's space-based observatories are making new discoveries and how the agency's new telescope, the James Webb Space Telescope, will continue this path of discovery after its scheduled launch in 2018. Photo Credit: (NASA/Joel Kowsky)

  15. The Search for Life Beyond Earth

    NASA Image and Video Library

    2014-07-14

    An animation of the James Webb Space Telescope (JWST) is projected as John Mather, Nobel Laureate and Project Scientist for the JWST speaks during a panel discussion on the search for life beyond Earth in the James E. Webb Auditorium at NASA Headquarters on Monday, July 14, 2014 in Washington, DC. The panel discussed how NASA's space-based observatories are making new discoveries and how the agency's new telescope, the James Webb Space Telescope, will continue this path of discovery after its scheduled launch in 2018. Photo Credit: (NASA/Joel Kowsky)

  16. Air pollution from aircraft operations at San Jose Municipal Airport, California

    NASA Technical Reports Server (NTRS)

    Schairer, E. T.

    1978-01-01

    The amount of air pollution discharged by arriving and departing aircraft at the San Jose Municipal Airport was estimated. These estimates were made for each one hour interval of a summer weekday in 1977. The contributions of both general aviation (personal and business aircraft) and certified air carriers (scheduled airliners) were considered. The locations at which the pollutants were discharged were estimated by approximating the flight paths of arriving and departing aircraft. Three types of pollutants were considered: carbon monoxide, hydrocarbons, and oxides of nitrogen.

  17. A 20k Payload Launch Vehicle Fast Track Development Concept Using an RD-180 Engine and a Centaur Upper Stage

    NASA Technical Reports Server (NTRS)

    Toelle, Ronald (Compiler)

    1995-01-01

    A launch vehicle concept to deliver 20,000 lb of payload to a 100-nmi orbit has been defined. A new liquid oxygen/kerosene booster powered by an RD-180 engine was designed while using a slightly modified Centaur upper stage. The design, development, and test program met the imposed 40-mo schedule by elimination of major structural testing by increased factors of safety and concurrent engineering concepts. A growth path to attain 65,000 lb of payload is developed.

  18. The Search for Life Beyond Earth

    NASA Image and Video Library

    2014-07-14

    Members of the audience walk past an example of a 1.2 meter telescope mirror that could be used in a future space telescope following a panel discussion on the search for life beyond Earth in the James E. Webb Auditorium at NASA Headquarters on Monday, July 14, 2014 in Washington, DC. The panel discussed how NASA's space-based observatories are making new discoveries and how the agency's new telescope, the James Webb Space Telescope, will continue this path of discovery after its scheduled launch in 2018. Photo Credit: (NASA/Joel Kowsky)

  19. 78 FR 13372 - Wildland Fire Executive Council Meeting Schedule

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-27

    ... leadership, direction, and program oversight in support of the Wildland Fire Leadership Council. Questions... Governance; (3) Barriers and Critical Success Factors related to the Cohesive Strategy; (4) Regional Action...

  20. [Cost accounting for gastrectomy under critical path--the usefulness of direct accounting of personnel expenses and a guide to shortening hospital stay].

    PubMed

    Nozue, M; Maruyama, T; Imamura, F; Fukue, M

    2000-08-01

    In this study, cost accounting was performed for a surgical case of gastrectomy managed according to a critical path (path), and the economic contribution of the path was determined. In addition, changes in the cost percentage with changes in the number of hospital days were simulated. Basically, cost accounting was done by department, which follows the concept of direct cost accounting in managerial accounting. Personnel expenses were calculated by both direct and indirect methods. In the direct method, the total hours personnel participated were recorded for the calculation. In the indirect method, personnel expenses were calculated from the ratio of the income of the surgical department to that of other departments. Purchase prices for all materials and drugs used were recorded to check buying costs. According to the direct calculating method, the personnel expenses came to approximately 300,000 yen, the total cost was approximately 700,000 yen, and the cost percentage was 59%. According to the indirect method, the personnel expenses were approximately 540,000 yen and the total cost was approximately 940,000 yen, the cost percentage being 80%. A simulation study of changes in the cost with changes in hospital days indicated cost percentages of approximately 53% for 19 hospital days and approximately 45% for 12 hospital days.
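
    Read literally, the reported figures are mutually consistent if the cost percentage is understood as total cost divided by a fixed per-case revenue; the short check below makes that assumed reading explicit (the revenue value is inferred from the 59% figure, not reported in the abstract).

```python
# Consistency check under one assumed reading: cost percentage = total cost / fixed case revenue.
# The revenue figure below is inferred from 700,000 yen at 59%; it is not stated in the abstract.
revenue = 700_000 / 0.59                              # roughly 1.19 million yen implied per case
print(f"direct method:   {700_000 / revenue:.0%}")    # ~59%
print(f"indirect method: {940_000 / revenue:.0%}")    # ~79%, reported as approximately 80%
```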

  1. Parents' nonstandard work schedules and child well-being: a critical review of the literature.

    PubMed

    Li, Jianghong; Johnson, Sarah E; Han, Wen-Jui; Andrews, Sonia; Kendall, Garth; Strazdins, Lyndall; Dockery, Alfred

    2014-02-01

    This paper provides a comprehensive review of empirical evidence linking parental nonstandard work schedules to four main child developmental outcomes: internalizing and externalizing problems, cognitive development, and body mass index. We evaluated the studies based on theory and methodological rigor (longitudinal data, representative samples, consideration of selection and information bias, confounders, moderators, and mediators). Of 23 studies published between 1980 and 2012 that met the selection criteria, 21 reported significant associations between nonstandard work schedules and an adverse child developmental outcome. The associations were partially mediated through parental depressive symptoms, low quality parenting, reduced parent-child interaction and closeness, and a less supportive home environment. These associations were more pronounced in disadvantaged families and when parents worked such schedules full time. We discuss the nuance, strengths, and limitations of the existing studies, and propose recommendations for future research.

  2. Technology readiness assessments: A retrospective

    NASA Astrophysics Data System (ADS)

    Mankins, John C.

    2009-11-01

    The development of new system capabilities typically depends upon the prior success of advanced technology research and development efforts. These systems developments inevitably face the three major challenges of any project: performance, schedule and budget. Done well, advanced technology programs can substantially reduce the uncertainty in all three of these dimensions of project management. Done poorly, or not at all, and new system developments suffer from cost overruns, schedule delays and the steady erosion of initial performance objectives. It is often critical for senior management to be able to determine which of these two paths is more likely—and to respond accordingly. The challenge for system and technology managers is to be able to make clear, well-documented assessments of technology readiness and risks, and to do so at key points in the life cycle of the program. In the mid 1970s, the National Aeronautics and Space Administration (NASA) introduced the concept of "technology readiness levels" (TRLs) as a discipline-independent, programmatic figure of merit (FOM) to allow more effective assessment of, and communication regarding the maturity of new technologies. In 1995, the TRL scale was further strengthened by the articulation of the first definitions of each level, along with examples (J. Mankins, Technology readiness levels, A White Paper, NASA, Washington, DC, 1995. [1]). Since then, TRLs have been embraced by the U.S. Congress' General Accountability Office (GAO), adopted by the U.S. Department of Defense (DOD), and are being considered for use by numerous other organizations. Overall, the TRLs have proved to be highly effective in communicating the status of new technologies among sometimes diverse organizations. This paper will review the concept of "technology readiness assessments", and provide a retrospective on the history of "TRLs" during the past 30 years. The paper will conclude with observations concerning prospective future directions for the important discipline of technology readiness assessments.

  3. The impact of closed-loop electronic medication management on time to first dose: a comparative study between paper and digital hospital environments.

    PubMed

    Austin, Jodie A; Smith, Ian R; Tariq, Amina

    2018-01-22

    Closed-loop electronic medication management systems (EMMS) are recognised as an effective intervention to improve medication safety, yet evidence of their effectiveness in hospitals is limited. Few studies have compared medication turnaround time for a closed-loop electronic versus paper-based medication management environment. To compare medication turnaround times in a paper-based hospital environment with a digital hospital equipped with a closed-loop EMMS, consisting of computerised physician order entry, profiled automated dispensing cabinets packaged with unit dose medications and barcode medication administration. Data were collected during 2 weeks at three private hospital sites (one with closed-loop EMMS) within the same organisation network in Queensland, Australia. Time between scheduled and actual administration times was analysed for first dose of time-critical and non-critical medications located on the ward or sourced via pharmacy. Medication turnaround times at the EMMS site were less compared to the paper-based sites (median, IQR: 35 min, 8-57 min versus 120 min, 30-180 min, P < 0.001). For time-critical medications, 77% were administered within 60 min of scheduled time at the EMMS site versus 38% for the paper-based sites. Similar difference was observed for non-critical medications, 80% were administered within 60 min of their scheduled time at the EMMS site versus 41% at the paper-based facilities. The study indicates medication turnaround times utilising a closed-loop EMMS are less compared to paper-based systems. This improvement may be attributable to increased accessibility of medications using automated dispensing cabinets and electronic medication administration records flagging tasks to nurses in real time. © 2018 Royal Pharmaceutical Society.

  4. Self-powered information measuring wireless networks using the distribution of tasks within multicore processors

    NASA Astrophysics Data System (ADS)

    Zhuravska, Iryna M.; Koretska, Oleksandra O.; Musiyenko, Maksym P.; Surtel, Wojciech; Assembay, Azat; Kovalev, Vladimir; Tleshova, Akmaral

    2017-08-01

    The article presents basic approaches to developing self-powered information measuring wireless networks (SPIM-WN) for critical applications using the distribution of tasks within multicore processors, based on the interaction of movable components both for data transmission and for the wireless transfer of energy coming from polymetric sensors. The basic mathematical model of scheduling tasks within multiprocessor systems was extended to schedule and allocate tasks between the cores of a single-chip computer (SoC) to increase the energy efficiency of SPIM-WN objects.

  5. Cache Sharing and Isolation Tradeoffs in Multicore Mixed-Criticality Systems

    DTIC Science & Technology

    2015-05-01

    … form of lockdown registers, to provide way-based partitioning. These alternatives are illustrated in Fig. 1 with respect to a quad-core ARM Cortex A9 … processor (as we do for Level-A and -B tasks), but they did not consider MC systems. Altmeyer et al. [1] considered uniprocessor scheduling on a system with a … framework. We randomly generated task sets and determined the fraction that were schedulable on our target hardware platform, the quad-core ARM Cortex A9.

  6. Causes of catastrophic failure in complex systems

    NASA Astrophysics Data System (ADS)

    Thomas, David A.

    2010-08-01

    Root causes of mission critical failures and major cost and schedule overruns in complex systems and programs are studied through the post-mortem analyses compiled for several examples, including the Hubble Space Telescope, the Challenger and Columbia Shuttle accidents, and the Three Mile Island nuclear power plant accident. The roles of organizational complexity, cognitive biases in decision making, the display of quantitative data, and cost and schedule pressure are all considered. Recommendations for mitigating the risk of similar failures in future programs are also provided.

  7. Equilibrium paths of an imperfect plate with respect to its aspect ratio

    NASA Astrophysics Data System (ADS)

    Psotny, Martin

    2017-07-01

    The stability analysis of a rectangular plate loaded in compression is presented; a specialized code based on the FEM has been created for this purpose. A special finite element with 48 degrees of freedom has been used for the analysis. The nonlinear finite element method equations are derived from the variational principle of minimum total potential energy. To trace the complete nonlinear equilibrium paths, the Newton-Raphson iteration algorithm is used, and load versus displacement control was switched during the calculation process. The peculiarities of the effects of the initial imperfections on the load-deflection paths are investigated with respect to the aspect ratio of the plate. Special attention is paid to the influence of imperfections on the post-critical buckling mode.

  8. Evaluation of the Terminal Precision Scheduling and Spacing System for Near-Term NAS Application

    NASA Technical Reports Server (NTRS)

    Thipphavong, Jane; Martin, Lynne Hazel; Swenson, Harry N.; Lin, Paul; Nguyen, Jimmy

    2012-01-01

    NASA has developed a capability for terminal area precision scheduling and spacing (TAPSS) to provide higher capacity and more efficiently manage arrivals during peak demand periods. This advanced technology is NASA's vision for the NextGen terminal metering capability. A set of human-in-the-loop experiments was conducted to evaluate the performance of the TAPSS system for near-term implementation. The experiments evaluated the TAPSS system under the current terminal routing infrastructure to validate operational feasibility. A second goal of the study was to measure the benefit of the Center and TRACON advisory tools to help prioritize the requirements for controller radar display enhancements. Simulation results indicate that using the TAPSS system provides benefits under current operations, supporting a 10% increase in airport throughput. Enhancements to Center decision support tools had limited impact on improving the efficiency of terminal operations, but did provide more fuel-efficient advisories to achieve scheduling conformance within 20 seconds. The TRACON controller decision support tools were found to provide the most benefit, by improving the precision in schedule conformance to within 20 seconds, reducing the number of arrivals having lateral path deviations by 50% and lowering subjective controller workload. Overall, the TAPSS system was found to successfully develop an achievable terminal arrival metering plan that was sustainable under heavy traffic demand levels and reduce the complexity of terminal operations when coupled with the use of the terminal controller advisory tools.

  9. Intensity-Modulated Radiation Therapy (IMRT)

    MedlinePlus

    ... specialized training in the field of radiation oncology physics, ensures the linear accelerator delivers the precise radiation ... critical normal structures, as well as the patient's health. Typically, patients are scheduled for IMRT sessions five ...

  10. Critical role of bevacizumab scheduling in combination with pre-surgical chemo-radiotherapy in MRI-defined high-risk locally advanced rectal cancer: Results of the BRANCH trial.

    PubMed

    Avallone, Antonio; Pecori, Biagio; Bianco, Franco; Aloj, Luigi; Tatangelo, Fabiana; Romano, Carmela; Granata, Vincenza; Marone, Pietro; Leone, Alessandra; Botti, Gerardo; Petrillo, Antonella; Caracò, Corradina; Iaffaioli, Vincenzo R; Muto, Paolo; Romano, Giovanni; Comella, Pasquale; Budillon, Alfredo; Delrio, Paolo

    2015-10-06

    We have previously shown that an intensified preoperative regimen including oxaliplatin plus raltitrexed and 5-fluorouracil/folinic acid (OXATOM/FUFA) during preoperative pelvic radiotherapy produced promising results in locally advanced rectal cancer (LARC). Preclinical evidence suggests that the scheduling of bevacizumab may be crucial to optimize its combination with chemo-radiotherapy. This non-randomized, non-comparative, phase II study was conducted in MRI-defined high-risk LARC. Patients received three biweekly cycles of OXATOM/FUFA during RT. Bevacizumab was given 2 weeks before the start of chemo-radiotherapy, and on the same day of chemotherapy for 3 cycles (concomitant-schedule A) or 4 days prior to the first and second cycle of chemotherapy (sequential-schedule B). Primary end point was pathological complete tumor regression (TRG1) rate. The accrual for the concomitant-schedule was early terminated because the number of TRG1 (2 out of 16 patients) was statistically inconsistent with the hypothesis of activity (30%) to be tested. Conversely, the endpoint was reached with the sequential-schedule and the final TRG1 rate among 46 enrolled patients was 50% (95% CI 35%-65%). Neutropenia was the most common grade ≥ 3 toxicity with both schedules, but it was less pronounced with the sequential than concomitant-schedule (30% vs. 44%). Postoperative complications occurred in 8/15 (53%) and 13/46 (28%) patients in schedule A and B, respectively. At 5-year follow-up the probability of PFS and OS was 80% (95%CI, 66%-89%) and 85% (95%CI, 69%-93%), respectively, for the sequential-schedule. These results highlight the relevance of bevacizumab scheduling to optimize its combination with preoperative chemo-radiotherapy in the management of LARC.

  11. Trailblazers: An Examination of Community College Black Women in Senior Level Administrator Roles--Their Stories through Their Eyes

    ERIC Educational Resources Information Center

    Williams-Bruce, Tameka Lazette

    2013-01-01

    This paper explores how Black women who work in senior level administrative positions at community colleges were able to establish successful career paths. The literature review draws from the theoretical framework of critical race theory, the Black feminist thought, and critical race feminism. The use of counter-stories establishes a platform for…

  12. Examining the Relationship between Middle School Students' Critical Reading Skills, Science Literacy Skills and Attitudes: A Structural Equation Modeling

    ERIC Educational Resources Information Center

    Karademir, Ersin; Ulucinar, Ufuk

    2017-01-01

    The purpose of this study is to verify the causal relationship between middle school students' critical reading skills, science literacy skills and attitudes towards science literacy with research data according to the default model. Through the structural equation modeling, path analysis has been applied in the study which was designed in…

  13. Drivers anticipate lead-vehicle conflicts during automated longitudinal control: Sensory cues capture driver attention and promote appropriate and timely responses.

    PubMed

    Morando, Alberto; Victor, Trent; Dozza, Marco

    2016-12-01

    Adaptive Cruise Control (ACC) has been shown to reduce the exposure to critical situations by maintaining a safe speed and headway. It has also been shown that drivers adapt their visual behavior in response to the driving task demand with ACC, anticipating an impending lead vehicle conflict by directing their eyes to the forward path before a situation becomes critical. The purpose of this paper is to identify the causes related to this anticipatory mechanism, by investigating drivers' visual behavior while driving with ACC when a potential critical situation is encountered, identified as a forward collision warning (FCW) onset (including false positive warnings). This paper discusses how sensory cues capture attention to the forward path in anticipation of the FCW onset. The analysis used the naturalistic database EuroFOT to examine visual behavior with respect to two manually-coded metrics, glance location and glance eccentricity, and then related the findings to vehicle data (such as speed, acceleration, and radar information). Three sensory cues (longitudinal deceleration, looming, and brake lights) were found to be relevant for capturing driver attention and increase glances to the forward path in anticipation of the threat; the deceleration cue seems to be dominant. The results also show that the FCW acts as an effective attention-orienting mechanism when no threat anticipation is present. These findings, relevant to the study of automation, provide additional information about drivers' response to potential lead-vehicle conflicts when longitudinal control is automated. Moreover, these results suggest that sensory cues are important for alerting drivers to an impending critical situation, allowing for a prompt reaction. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. A function-based approach to cockpit procedure aids

    NASA Technical Reports Server (NTRS)

    Phatak, Anil V.; Jain, Parveen; Palmer, Everett

    1990-01-01

    The objective of this research is to develop and test a cockpit procedural aid that can compose and present procedures that are appropriate for the given flight situation. The procedure would indicate the status of the aircraft engineering systems, and the environmental conditions. Prescribed procedures already exist for normal as well as for a number of non-normal and emergency situations, and can be presented to the crew using an interactive cockpit display. However, no procedures are prescribed or recommended for a host of plausible flight situations involving multiple malfunctions compounded by adverse environmental conditions. Under these circumstances, the cockpit procedural aid must review the prescribed procedures for the individual malfunction (when available), evaluate the alternatives or options, and present one or more composite procedures (prioritized or unprioritized) in response to the given situation. A top-down function-based conceptual approach towards composing and presenting cockpit procedures is being investigated. This approach is based upon the thought process that an operating crew must go through while attempting to meet the flight objectives given the current flight situation. In order to accomplish the flight objectives, certain critical functions must be maintained during each phase of the flight, using the appropriate procedures or success paths. The viability of these procedures depends upon the availability of required resources. If resources available are not sufficient to meet the requirements, alternative procedures (success paths) using the available resources must be constructed to maintain the critical functions and the corresponding objectives. If no success path exists that can satisfy the critical functions/objectives, then the next level of critical functions/objectives must be selected and the process repeated. Information is given in viewgraph form.

  15. Evaluation of cardiovascular risks of spaceflight does not support the NASA bioastronautics critical path roadmap.

    PubMed

    Convertino, Victor A; Cooke, William H

    2005-09-01

    Occurrence of serious cardiac dysrhythmias and diminished cardiac and vascular function are the primary cardiovascular risks of spaceflight identified in the 2005 NASA Bioastronautics Critical Path Roadmap. A review of the literature was conducted on experimental results and observational data obtained from spaceflight and relevant ground simulation studies that addressed occurrence of cardiac dysrhythmias, cardiac contractile and vascular function, manifestation of asymptomatic cardiovascular disease, orthostatic intolerance, and response to exercise stress. Based on data from astronauts who have flown in space, there is no compelling experimental evidence to support significant occurrence of cardiac dysrhythmias, manifestation of asymptomatic cardiovascular disease, or reduction in myocardial contractile function. Although there are post-spaceflight data that demonstrate lower peripheral resistance in astronauts who become presyncopal compared with non-presyncopal astronauts, it is not clear that these differences are the result of decreased vascular function. However, the evidence of postflight orthostatic intolerance and reduced exercise capacity is well substantiated by both spaceflight and ground experiments. Although attenuation of baroreflex function(s) may contribute to postflight orthostatic instability, a primary mechanism of orthostatic intolerance and reduced exercise capacity is reduced end-diastolic and stroke volume associated with lower blood volumes and consequent cardiac remodeling. Data from the literature on the current population of astronauts support the notion that the primary cardiovascular risks of spaceflight are compromised hemodynamic responses to central hypovolemia resulting in reduced orthostatic tolerance and exercise capacity rather than occurrence of cardiac dysrhythmias, reduced cardiac contractile and vascular function, or manifestation of asymptomatic cardiovascular disease. These observations warrant a critical review and revision of the 2005 Bioastronautics Critical Path Roadmap.

  16. Systems and methods for analyzing liquids under vacuum

    DOEpatents

    Yu, Xiao-Ying; Yang, Li; Cowin, James P.; Iedema, Martin J.; Zhu, Zihua

    2013-10-15

    Systems and methods for supporting a liquid against a vacuum pressure in a chamber can enable analysis of the liquid surface using vacuum-based chemical analysis instruments. No electrical or fluid connections are required to pass through the chamber walls. The systems can include a reservoir, a pump, and a liquid flow path. The reservoir contains a liquid-phase sample. The pump drives flow of the sample from the reservoir, through the liquid flow path, and back to the reservoir. The flow of the sample is not substantially driven by a differential between pressures inside and outside of the liquid flow path. An aperture in the liquid flow path exposes a stable portion of the liquid-phase sample to the vacuum pressure within the chamber. The radius, or size, of the aperture is less than or equal to a critical value required to support a meniscus of the liquid-phase sample by surface tension.
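
    The aperture-size criterion described above can be illustrated with a back-of-the-envelope Young-Laplace estimate: the largest radius whose meniscus can be held by surface tension against the pressure difference across the aperture. The sketch below is only an illustration of that idea; the fluid properties, contact angle and pressure difference are generic assumptions, not values from the patent.

```python
# Back-of-the-envelope estimate of the largest aperture radius whose meniscus
# can be held by surface tension against a ~1 atm pressure difference
# (Young-Laplace: delta_p = 2 * gamma * cos(theta) / r). Values are illustrative.
import math

gamma = 0.072        # N/m, surface tension of water near room temperature
theta = 0.0          # rad, assumed perfectly wetting contact angle
delta_p = 101325.0   # Pa, ambient liquid pressure minus chamber vacuum

r_critical = 2.0 * gamma * math.cos(theta) / delta_p
print(f"critical aperture radius ~ {r_critical * 1e6:.2f} micrometres")
```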

  17. Six paths for the future of social epidemiology.

    PubMed

    Galea, Sandro; Link, Bruce G

    2013-09-15

    Social epidemiology is now an accepted part of the academic intellectual landscape. However, in many ways, social epidemiology also runs the risk of losing the identity that distinguished it as a field during its emergence. In the present article, we scan the strengths of social epidemiology to imagine paths forward that will make the field distinct and useful to the understanding of population health in future. We suggest 6 paths to such a future, each emerging from promising research trends in the field in which social epidemiologists can, and should, lead in coming years. Each of these paths contributes to the formation of distinct capacities that social epidemiologists can claim and use to elaborate or fill in gaps in the already strong history of social epidemiology. They present an opportunity for the field to build on its strengths and move forward while leading in new and critical areas in population health.

  18. Six Paths for the Future of Social Epidemiology

    PubMed Central

    Galea, Sandro; Link, Bruce G.

    2013-01-01

    Social epidemiology is now an accepted part of the academic intellectual landscape. However, in many ways, social epidemiology also runs the risk of losing the identity that distinguished it as a field during its emergence. In the present article, we scan the strengths of social epidemiology to imagine paths forward that will make the field distinct and useful to the understanding of population health in future. We suggest 6 paths to such a future, each emerging from promising research trends in the field in which social epidemiologists can, and should, lead in coming years. Each of these paths contributes to the formation of distinct capacities that social epidemiologists can claim and use to elaborate or fill in gaps in the already strong history of social epidemiology. They present an opportunity for the field to build on its strengths and move forward while leading in new and critical areas in population health. PMID:24008899

  19. Radio propagation through solar and other extraterrestrial ionized media

    NASA Technical Reports Server (NTRS)

    Smith, E. K.; Edelson, R. E.

    1980-01-01

    The present S- and X-band communications needs in deep space are addressed to illustrate the aspects which are affected by propagation through extraterrestrial plasmas. The magnitude, critical threshold, and frequency dependence of some eight propagation effects for an S-band propagation path passing within 4 solar radii of the Sun are described. The theory and observation of propagation in extraterrestrial plasmas are discussed and the various plasma states along a near-solar propagation path are illustrated. Classical magnetoionic theory (cold anisotropic plasma) is examined for its applicability to the path in question. The characteristics of the plasma states found along the path are summarized and the errors in some of the standard approximations are indicated. Models of extraterrestrial plasmas are included. Modeling of the electron density in the solar corona and solar wind is emphasized, but some cursory information on the terrestrial planets plus Jupiter is included.

  20. Reliability analysis of component of affination centrifugal 1 machine by using reliability engineering

    NASA Astrophysics Data System (ADS)

    Sembiring, N.; Ginting, E.; Darnello, T.

    2017-12-01

    Problems appear in a company that produces refined sugar: the production floor has not reached the required level of critical machine availability because the machines often suffer damage (breakdown). This results in sudden losses of production time and production opportunities. The problem can be addressed with the Reliability Engineering method, in which a statistical approach to the historical damage data is used to identify the pattern of the distribution. The method provides values for the reliability, damage rate, and availability of a machine over the scheduled maintenance time interval. The distribution test of the time-between-failures (MTTF) data gives a lognormal distribution for the flexible hose component and a Weibull distribution for the teflon cone lifting component, while the test of the time-to-repair (MTTR) data gives an exponential distribution for the flexible hose component and a Weibull distribution for the teflon cone lifting component. For the flexible hose component on a replacement schedule of every 720 hours, the reliability obtained is 0.2451 and the availability 0.9960. For the critical teflon cone lifting component on a replacement schedule of every 1944 hours, the reliability obtained is 0.4083 and the availability 0.9927.
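
    As a minimal sketch of the kind of calculation behind such figures, the snippet below evaluates reliability at a planned replacement interval and steady-state availability from a fitted time-to-failure model and a mean repair time. The Weibull parameters and MTTR used here are placeholders, not the values fitted in the study.

```python
# Sketch: reliability at a planned replacement interval and steady-state
# availability from fitted failure/repair models. The Weibull parameters and
# the MTTR below are hypothetical placeholders, not the study's fitted values.
import math

def weibull_reliability(t, beta, eta):
    """R(t) = exp(-(t/eta)^beta) for a Weibull time-to-failure model."""
    return math.exp(-((t / eta) ** beta))

beta, eta = 1.8, 2500.0      # shape and scale (hours), hypothetical
mttf = eta * math.gamma(1.0 + 1.0 / beta)
mttr = 6.0                   # mean time to repair (hours), hypothetical

interval = 1944.0            # planned replacement interval (hours)
print("R(interval)  =", round(weibull_reliability(interval, beta, eta), 4))
print("availability =", round(mttf / (mttf + mttr), 4))
```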

  1. Routing optimization in networks based on traffic gravitational field model

    NASA Astrophysics Data System (ADS)

    Liu, Longgeng; Luo, Guangchun

    2017-04-01

    For research on the gravitational field routing mechanism on complex networks, we further analyze the gravitational effect of paths. In this study, we introduce the concept of path confidence degree to evaluate the unblocked reliability of paths; it takes the traffic state of all nodes on the path into account as a whole. On this basis, we propose an improved gravitational field routing protocol considering the gravities of all nodes on the path and the path confidence degree. In order to evaluate the transmission performance of the routing strategy, an order parameter is introduced to measure the network throughput by the critical value of the phase transition from a free-flow phase to a jammed phase, and the betweenness centrality is used to evaluate the transmission performance and traffic congestion of the network. Simulation results show that, compared with the shortest-path routing strategy and the previous gravitational field routing strategy, the proposed algorithm improves the network throughput considerably and effectively balances the traffic load within the network, and all nodes in the network are utilized with high efficiency. As long as γ ≥ α, the transmission performance reaches its maximum and remains unchanged for different α and γ, which ensures that the proposed routing protocol is highly efficient and stable.

  2. Software Security Knowledge: CWE. Knowing What Could Make Software Vulnerable to Attack

    DTIC Science & Technology

    2011-05-01

    Weakness categories listed include: … Buffer; CWE-642: External Control of Critical State Data; CWE-73: External Control of File Name or Path; CWE-426: Untrusted Search Path; CWE-94: Failure to Control Generation of Code (aka 'Code Injection'); CWE-494: Download of Code Without Integrity Check; CWE-404: Improper Resource …

  3. Properties of behavior under different random ratio and random interval schedules: A parametric study.

    PubMed

    Dembo, M; De Penfold, J B; Ruiz, R; Casalta, H

    1985-03-01

    Four pigeons were trained to peck a key under different values of a temporally defined independent variable (T) and different probabilities of reinforcement (p). Parameter T is a fixed repeating time cycle and p the probability of reinforcement for the first response of each cycle T. Two dependent variables were used: mean response rate and mean postreinforcement pause. For all values of p a critical value of the independent variable T was found (T = 1 sec) at which marked changes took place in response rate and postreinforcement pauses. Behavior typical of random ratio schedules was obtained at T < 1 sec and behavior typical of random interval schedules at T > 1 sec. Copyright © 1985. Published by Elsevier B.V.
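
    The contingency described above (a repeating cycle of length T, with the first response of each cycle reinforced with probability p) is easy to simulate. The sketch below is only an illustration of the schedule itself: responding is crudely modelled as a Poisson process, which is an assumption of this example and not a model of the pigeons' behaviour.

```python
# Toy simulation of the T-cycle schedule: time is divided into cycles of length
# T seconds, and the first response emitted in a cycle is reinforced with
# probability p. Responding is crudely modelled as a Poisson process; this
# illustrates the contingency, not the behaviour reported in the study.
import random

def simulate(T, p, response_rate=2.0, duration=3600.0, seed=1):
    random.seed(seed)
    t, cycle_start, responded_this_cycle = 0.0, 0.0, False
    responses = reinforcers = 0
    while t < duration:
        t += random.expovariate(response_rate)      # time of next response
        while t >= cycle_start + T:                  # roll over elapsed cycles
            cycle_start += T
            responded_this_cycle = False
        responses += 1
        if not responded_this_cycle:                 # first response of cycle
            responded_this_cycle = True
            if random.random() < p:
                reinforcers += 1
    return responses, reinforcers

for T in (0.5, 1.0, 4.0):
    r, sr = simulate(T, p=0.1)
    print(f"T={T:>4} s: {r} responses, {sr} reinforcers")
```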

  4. Joint Strike Fighter Acquisition: Mature Critical Technologies Needed to Reduce Risks

    DTIC Science & Technology

    2001-10-01

    GAO report GAO-02-39, October 2001 (report documentation page; only bibliographic details are recoverable). Related GAO testimony: Joint Strike Fighter Acquisition: Development Schedule Should Be Changed to Reduce Risks (GAO/T-NSIAD-00-132).

  5. Hierarchical Coloured Petri Net Based Healthcare Infrastructure Interdependency Model

    NASA Astrophysics Data System (ADS)

    Nivedita, N.; Durbha, S.

    2014-11-01

    To ensure a resilient healthcare critical infrastructure, understanding the vulnerabilities and analysing the interdependency with other critical infrastructures is important. To model this critical infrastructure and its dependencies, a hierarchical coloured Petri net modelling approach for simulating the vulnerability of healthcare critical infrastructure in a disaster situation is studied. The model enables analysis and understanding of the various state changes that occur when there is a disruption or damage to any of the critical infrastructure, and of its cascading nature. It also enables exploration of optimal paths for evacuation during the disaster. The simulation environment can be used to understand and highlight various vulnerabilities of healthcare critical infrastructure during a flood disaster scenario, minimize consequences, and enable a timely, efficient response.

  6. Chemotherapy appointment scheduling under uncertainty using mean-risk stochastic integer programming.

    PubMed

    Alvarado, Michelle; Ntaimo, Lewis

    2018-03-01

    Oncology clinics are often burdened with scheduling large volumes of cancer patients for chemotherapy treatments under limited resources such as the number of nurses and chairs. These cancer patients require a series of appointments over several weeks or months and the timing of these appointments is critical to the treatment's effectiveness. Additionally, the appointment duration, the acuity levels of each appointment, and the availability of clinic nurses are uncertain. The timing constraints, stochastic parameters, rising treatment costs, and increased demand of outpatient oncology clinic services motivate the need for efficient appointment schedules and clinic operations. In this paper, we develop three mean-risk stochastic integer programming (SIP) models, referred to as SIP-CHEMO, for the problem of scheduling individual chemotherapy patient appointments and resources. These mean-risk models are presented and an algorithm is devised to improve computational speed. Computational results were conducted using a simulation model and results indicate that the risk-averse SIP-CHEMO model with the expected excess mean-risk measure can decrease patient waiting times and nurse overtime when compared to deterministic scheduling algorithms by 42 % and 27 %, respectively.
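
    The expected-excess mean-risk idea used above can be summarised as minimising E[cost] + λ·E[(cost − target)+] over scenarios. The snippet below only evaluates that objective for two candidate schedules over a handful of sampled scenarios; the scenario waiting times, target and λ are invented, and a real SIP-CHEMO model would optimise appointment times and nurse assignments jointly rather than just score fixed schedules.

```python
# Illustration of an expected-excess mean-risk objective for comparing
# candidate schedules: E[waiting] + lam * E[max(0, waiting - target)].
# The scenario waiting times and parameters are made up for illustration.
def mean_risk(waiting_by_scenario, probs, target, lam):
    expected = sum(p * w for p, w in zip(probs, waiting_by_scenario))
    excess = sum(p * max(0.0, w - target) for p, w in zip(probs, waiting_by_scenario))
    return expected + lam * excess

probs = [0.25, 0.50, 0.25]             # scenario probabilities
schedule_a = [10.0, 25.0, 70.0]        # patient waiting (min) per scenario
schedule_b = [20.0, 30.0, 35.0]

for name, waits in (("A", schedule_a), ("B", schedule_b)):
    print(name, round(mean_risk(waits, probs, target=30.0, lam=2.0), 1))
```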

  7. Evaluation of Flow Paths and Confluences for Saltwater Intrusion and Its Influence on Fish Species Diversity in a Deltaic River Network

    NASA Astrophysics Data System (ADS)

    Shao, X.; Cui, B.; Zhang, Z.; Fang, Y.; Jawitz, J. W.

    2016-12-01

    Freshwater in a delta is often at risk of saltwater intrusion, which has been a serious issue in estuarine deltas all over the world. Salinity gradients and hydrologic connectivity in the deltas can be disturbed by saltwater intrusion, which can fluctuate frequently and locally in time and space to affect biotic processes and then to affect the distribution patterns of the riverine fishes throughout the river network. Therefore, identifying the major flow paths or locations at risk of saltwater intrusion in estuarine ecosystems is necessary for saltwater intrusion mitigation and fish species diversity conservation. In this study, we use the betweenness centrality (BC) as the weighted attribute of the river network to identify the critical confluences and detect the preferential flow paths for saltwater intrusion through the least-cost-path algorithm from graph theory approach. Moreover, we analyse the responses of the salinity and fish species diversity to the BC values of confluences calculated in the river network. Our results show that the most likely location of saltwater intrusion is not a simple gradient change from sea to land, but closely dependent on the river segments' characteristics. In addition, a significant positive correlation between the salinity and the BC values of confluences is determined in the Pearl River Delta. Changes in the BC values of confluences can produce significant variation in the fish species diversity. Therefore, the dynamics of saltwater intrusion are a growing consideration for understanding the patterns and subsequent processes driving fish community structure. Freshwater can be diverted into these major flow paths and critical confluences to improve river network management and conservation of fish species diversity under saltwater intrusion.
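
    The two graph operations mentioned above, betweenness centrality of confluences and a least-cost path from the sea outlet inland, can be sketched with networkx. The toy river network and the "resistance" edge weights below are invented for illustration and are not the study's data.

```python
# Minimal sketch of the graph-theoretic steps described above, using networkx:
# betweenness centrality of confluence nodes and a least-cost path for
# saltwater intrusion from the sea outlet into the delta. The toy network and
# resistance weights are invented.
import networkx as nx

G = nx.Graph()
edges = [("sea", "c1", 1.0), ("c1", "c2", 2.0), ("c1", "c3", 4.0),
         ("c2", "c4", 1.5), ("c3", "c4", 1.0), ("c4", "upstream", 3.0)]
G.add_weighted_edges_from(edges, weight="resistance")

bc = nx.betweenness_centrality(G, weight="resistance")
path = nx.shortest_path(G, "sea", "upstream", weight="resistance")  # least cost

print("confluence betweenness:", {n: round(v, 2) for n, v in bc.items()})
print("preferential intrusion path:", " -> ".join(path))
```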

  8. Adsorption effect on the formation of conductive path in defective TiO2: ab initio calculations

    NASA Astrophysics Data System (ADS)

    Li, Lei; Li, Wenshi; Qin, Han; Yang, Jianfeng; Mao, Ling-Feng

    2017-10-01

    Although metal/TiO2/metal junctions providing resistive switching properties have attracted much attention in recent decades, revealing the atomic nature of the conductive path in the TiO2 active layer remains a critical challenge. Here the effects of metal adsorption on the defective TiO2(1 1 0) surface are theoretically investigated via ab initio calculations. The dependence of the conductive path on the adsorption of Ti/Zr/Cu/Pt/O atoms above a lattice Ti-ion in the (1 1 0) plane and at the 〈1 1 0〉 direction of the defective TiO2(0 0 1) surface is compared. It is found that Ti adsorption at both sites gives larger contributions to the presence of a conductive path, with more stability and larger transport coefficients at the Fermi level, whereas O adsorption at both sites fails to produce a conductive path. Moreover, the adsorption of Zr/Cu/Pt atoms reduces the likelihood of a conductive path, especially when adsorbed above the lattice Ti-ion at the 〈1 1 0〉 direction. These results help clarify the interaction of the metal electrode and the oxide layer in resistive random access memory.

  9. AWAS: A dynamic work scheduling system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Y.; Hao, J.; Kocur, G.

    1994-12-31

    The Automated Work Administration System (AWAS) is an automated scheduling system developed at GTE. A typical work center has 1000 employees and processes 4000 jobs each day. Jobs are geographically distributed within the service area of the work center, require different skills, and have to be done within specified time windows. Each job can take anywhere from 12 minutes to several hours to complete. Each employee can have his/her individual schedule, skill, or working area. The jobs can enter and leave the system at any time. The employees dial up to the system to request their next job at the beginning of a day or after a job is done. The system is able to respond to the changes dynamically and produce close-to-optimum solutions in real time. We formulate the real-world problem as a minimum cost network flow problem. Both employees and jobs are formulated as nodes. Relationships between jobs and employees are formulated as arcs, and working hours contributed by employees and consumed by jobs are formulated as flow. The goal is to minimize missed commitments. We solve the problem with the successive shortest path algorithm. Combined with pre-processing and post-processing, the system produces reasonable outputs and the response time is very good.
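
    A tiny instance of the assignment-as-min-cost-flow formulation sketched above can be solved with networkx: employees supply working hours, jobs demand hours, and arc costs stand in for how undesirable an assignment is (for example, risk of missing a commitment). The data are invented, and networkx's solver is used here simply for illustration; AWAS itself uses a successive shortest path algorithm.

```python
# Tiny illustration of the min-cost network flow formulation: employees supply
# hours (negative demand), jobs consume hours (positive demand), and arc costs
# penalise undesirable assignments. Data are invented.
import networkx as nx

G = nx.DiGraph()
G.add_node("emp1", demand=-8)
G.add_node("emp2", demand=-8)
G.add_node("job1", demand=5)
G.add_node("job2", demand=6)
G.add_node("job3", demand=5)

costs = {("emp1", "job1"): 2, ("emp1", "job2"): 4, ("emp1", "job3"): 9,
         ("emp2", "job1"): 7, ("emp2", "job2"): 3, ("emp2", "job3"): 2}
for (u, v), c in costs.items():
    G.add_edge(u, v, weight=c, capacity=8)

flow = nx.min_cost_flow(G)   # hours of each employee assigned to each job
print({u: {v: f for v, f in d.items() if f} for u, d in flow.items()})
```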

  10. A Cross-Layer Duty Cycle MAC Protocol Supporting a Pipeline Feature for Wireless Sensor Networks

    PubMed Central

    Tong, Fei; Xie, Rong; Shu, Lei; Kim, Young-Chon

    2011-01-01

    Although the conventional duty cycle MAC protocols for Wireless Sensor Networks (WSNs) such as RMAC perform well in terms of saving energy and reducing end-to-end delivery latency, they were designed independently and require an extra routing protocol in the network layer to provide path information for the MAC layer. In this paper, we propose a new cross-layer duty cycle MAC protocol with data forwarding supporting a pipeline feature (P-MAC) for WSNs. P-MAC first divides the whole network into many grades around the sink. Each node identifies its grade according to its logical hop distance to the sink and simultaneously establishes a sleep/wakeup schedule using the grade information. Those nodes in the same grade keep the same schedule, which is staggered with the schedule of the nodes in the adjacent grade. Then a variation of the RTS/CTS handshake mechanism is used to forward data continuously in a pipeline fashion from the higher grade to the lower grade nodes and finally to the sink. No extra routing overhead is needed, thus increasing the network scalability while maintaining the superiority of duty-cycling. The simulation results in OPNET show that P-MAC has better performance than S-MAC and RMAC in terms of packet delivery latency and energy efficiency. PMID:22163895
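
    The grade and staggered-schedule idea can be sketched in a few lines: a node's grade is its hop distance to the sink, and each grade's wake-up time is offset by one slot so a packet can ripple toward the sink within a cycle. The topology, slot length and the exact staggering rule below are assumptions made for illustration, not the protocol's actual timing parameters.

```python
# Sketch of the grade / staggered-schedule idea in P-MAC: a node's grade is its
# hop distance to the sink, and deeper grades wake one slot earlier than the
# next lower grade so a packet can be forwarded pipeline-fashion toward the
# sink. Topology and slot length are invented.
from collections import deque

links = {"sink": ["a", "b"], "a": ["sink", "c"], "b": ["sink", "c"],
         "c": ["a", "b", "d"], "d": ["c"]}

def grades(links, sink="sink"):
    g, q = {sink: 0}, deque([sink])
    while q:                                   # plain BFS hop count
        u = q.popleft()
        for v in links[u]:
            if v not in g:
                g[v] = g[u] + 1
                q.append(v)
    return g

SLOT = 0.05                                    # seconds per forwarding slot
g = grades(links)
max_grade = max(g.values())
for node, grade in sorted(g.items(), key=lambda kv: -kv[1]):
    offset = (max_grade - grade) * SLOT        # deepest nodes wake first
    print(f"{node}: grade {grade}, wake offset {offset:.2f} s in each cycle")
```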

  11. 14 CFR 29.79 - Landing: Category A.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... approach and landing path must be established to avoid the critical areas of the height-velocity envelope... surface after complete power failure occurring during normal cruise. [Doc. No. 24802, 61 FR 21900, May 10...

  12. An integrated bioinformatics infrastructure essential for advancing pharmacogenomics and personalized medicine in the context of the FDA's Critical Path Initiative.

    PubMed

    Tong, Weida; Harris, Stephen C; Fang, Hong; Shi, Leming; Perkins, Roger; Goodsaid, Federico; Frueh, Felix W

    2007-01-01

    Pharmacogenomics (PGx) is identified in the FDA Critical Path document as a major opportunity for advancing medical product development and personalized medicine. An integrated bioinformatics infrastructure for use in FDA data review is crucial to realize the benefits of PGx for public health. We have developed an integrated bioinformatics tool, called ArrayTrack, for managing, analyzing and interpreting genomic and other biomarker data (e.g. proteomic and metabolomic data). ArrayTrack is a highly flexible and robust software platform, which allows it to evolve with technological advances and changing user needs. ArrayTrack is used in the routine review of genomic data submitted to the FDA; here, three hypothetical examples of its use in the Voluntary eXploratory Data Submission (VXDS) program are illustrated. © Published by Elsevier Ltd.

  13. Making the Buses Run.

    ERIC Educational Resources Information Center

    Fickes, Michael

    1998-01-01

    Examines issues concerning outsourcing student transportation services: cost; management needs and capabilities; goals; and politics. Critical areas of transportation management are highlighted such as personnel management, student management and discipline, risk management, fleet analysis, and routing and scheduling. (GR)

  14. Vehicle dynamics control of four in-wheel motor drive electric vehicle using gain scheduling based on tyre cornering stiffness estimation

    NASA Astrophysics Data System (ADS)

    Xiong, Lu; Yu, Zhuoping; Wang, Yang; Yang, Chen; Meng, Yufeng

    2012-06-01

    This paper focuses on the vehicle dynamic control system for a four in-wheel motor drive electric vehicle, aiming at improving vehicle stability under critical driving conditions. The vehicle dynamics controller is composed of three modules, i.e. motion following control, control allocation and vehicle state estimation. Considering the strong nonlinearity of the tyres under critical driving conditions, the yaw motion of the vehicle is regulated by gain scheduling control based on the linear quadratic regulator theory. The feed-forward and feedback gains of the controller are updated in real-time by online estimation of the tyre cornering stiffness, so as to ensure the control robustness against environmental disturbances as well as parameter uncertainty. The control allocation module allocates the calculated generalised force requirements to each in-wheel motor based on quadratic programming theory while taking the tyre longitudinal/lateral force coupling characteristic into consideration. Simulations under a variety of driving conditions are carried out to verify the control algorithm. Simulation results indicate that the proposed vehicle stability controller can effectively stabilise the vehicle motion under critical driving conditions.
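
    To make the gain-scheduling step concrete, the sketch below rebuilds a linear single-track (bicycle) yaw/sideslip model from the latest cornering-stiffness estimates and re-solves an LQR problem for the feedback gains. The vehicle parameters, weights and stiffness values are generic placeholders and the model is a common textbook simplification, not the controller structure of the paper.

```python
# Sketch of gain scheduling on estimated tyre cornering stiffness: rebuild the
# linear single-track (bicycle) model with the latest stiffness estimates and
# re-solve the LQR problem for the sideslip/yaw-rate feedback gains.
# Vehicle parameters and weights are generic placeholders.
import numpy as np
from scipy.linalg import solve_continuous_are

m, Iz, lf, lr = 1500.0, 2500.0, 1.2, 1.5        # mass, yaw inertia, axle offsets

def lqr_gain(Cf, Cr, vx, Q=np.diag([1.0, 5.0]), R=np.array([[0.1]])):
    """States: [sideslip angle, yaw rate]; input: front steering angle."""
    A = np.array([
        [-(Cf + Cr) / (m * vx), -1.0 + (Cr * lr - Cf * lf) / (m * vx ** 2)],
        [(Cr * lr - Cf * lf) / Iz, -(Cf * lf ** 2 + Cr * lr ** 2) / (Iz * vx)],
    ])
    B = np.array([[Cf / (m * vx)], [Cf * lf / Iz]])
    P = solve_continuous_are(A, B, Q, R)
    return np.linalg.solve(R, B.T @ P)            # K = R^-1 B^T P

# Gains are updated as the online stiffness estimate drops on a slippery road.
for Cf_hat in (80000.0, 50000.0, 30000.0):        # N/rad, hypothetical estimates
    K = lqr_gain(Cf=Cf_hat, Cr=Cf_hat * 1.1, vx=20.0)
    print(f"Cf ~ {Cf_hat:8.0f} N/rad  ->  K = {np.round(K, 3)}")
```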

  15. Back pressure based multicast scheduling for fair bandwidth allocation.

    PubMed

    Sarkar, Saswati; Tassiulas, Leandros

    2005-09-01

    We study the fair allocation of bandwidth in multicast networks with multirate capabilities. In multirate transmission, each source encodes its signal in layers. The lowest layer contains the most important information and all receivers of a session should receive it. If a receiver's data path has additional bandwidth, it receives higher layers which leads to a better quality of reception. The bandwidth allocation objective is to distribute the layers fairly. We present a computationally simple, decentralized scheduling policy that attains the maxmin fair rates without using any knowledge of traffic statistics and layer bandwidths. This policy learns the congestion level from the queue lengths at the nodes, and adapts the packet transmissions accordingly. When the network is congested, packets are dropped from the higher layers; therefore, the more important lower layers suffer negligible packet loss. We present analytical and simulation results that guarantee the maxmin fairness of the resulting rate allocation, and upper bound the packet loss rates for different layers.
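
    The core back-pressure decision can be sketched very compactly: on each link, serve the (session, layer) whose upstream-minus-downstream queue backlog is largest and positive, which automatically favours congested lower layers. This is a simplification of the policy in the paper; the queue contents below are invented.

```python
# Minimal sketch of the back-pressure idea: transmit, on a given link, a packet
# of the (session, layer) with the largest positive upstream-minus-downstream
# backlog differential. Queue contents are invented; the paper's policy also
# handles layer ordering and multicast branching.
def backpressure_choice(upstream_q, downstream_q):
    best, best_diff = None, 0
    for key, qu in upstream_q.items():
        diff = qu - downstream_q.get(key, 0)
        if diff > best_diff:
            best, best_diff = key, diff
    return best          # None means the link stays idle this slot

upstream_q   = {("s1", "layer0"): 9, ("s1", "layer1"): 4, ("s2", "layer0"): 6}
downstream_q = {("s1", "layer0"): 2, ("s1", "layer1"): 5, ("s2", "layer0"): 1}
print("serve:", backpressure_choice(upstream_q, downstream_q))
```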

  16. Retrieval of O+ Density From Combined OII 83.4 nm and OII 61.7 nm Limb Emissions

    NASA Astrophysics Data System (ADS)

    Geddes, G.; Finn, S. C.; Stephan, A. W.; Cook, T.; Chakrabarti, S.

    2016-12-01

    OII 83.4 nm and OII 61.7 nm emissions are produced by photoionization of neutral oxygen in the thermosphere. While OII 83.4 nm photons are resonantly scattered by O+ ions, OII 61.7 nm photons do not interact with the ionosphere. Combined observations of these two features, which share a production mechanism but have different paths through the ionosphere, can be used to infer the O+ density causing the scattering of OII 83.4 nm. We retrieve O+ density from synthetic measurements of the OII 83.4 nm and OII 61.7 nm emission features using a Markov chain Monte Carlo technique. This method allows us to quantify constraints on retrieved ionospheric parameters, giving an estimate of O+ density retrieval capability in preparation for the Limb-Imaging Ionospheric and Thermospheric Extreme-ultraviolet Spectrograph (LITES), scheduled to fly on the International Space Station in November 2016. This work is also applicable to observations from the Ionospheric Connection Explorer (ICON), scheduled for launch in June 2017.

  17. A Demonstrator Intelligent Scheduler For Sensor-Based Robots

    NASA Astrophysics Data System (ADS)

    Perrotta, Gabriella; Allen, Charles R.; Shepherd, Andrew J.

    1987-10-01

    The development of an execution module capable of functioning as an on-line supervisor for a robot equipped with a vision sensor and a tactile-sensing gripper system is described. The on-line module is supported by two off-line software modules which provide a procedural assembly-constraints language allowing the assembly task to be defined. This input is then converted into a normalised and minimised form. The host robot programming language permits high-level motions to be issued at the top level, allowing a low programming overhead for the designer, who must describe the assembly sequence. Components are selected for pick-and-place robot movement based on information derived from two cameras, one static and the other mounted on the end effector of the robot. The approach taken is multi-path scheduling as described by Fox. The system is seen to permit robot assembly in a less constrained parts-presentation environment, making full use of the sensory detail available on the robot.

  18. Improved Results for Route Planning in Stochastic Transportation Networks

    NASA Technical Reports Server (NTRS)

    Boyan, Justin; Mitzenmacher, Michael

    2000-01-01

    In the bus network problem, the goal is to generate a plan for getting from point X to point Y within a city using buses in the smallest expected time. Because bus arrival times are not determined by a fixed schedule but instead may be random, the problem requires more than standard shortest path techniques. In recent work, Datar and Ranade provide algorithms for the case where bus arrivals are assumed to be independent and exponentially distributed. We offer solutions to two important generalizations of the problem, answering open questions posed by Datar and Ranade. First, we provide a polynomial time algorithm for a much wider class of arrival distributions, namely those with increasing failure rate. This class includes not only exponential distributions but also uniform, normal, and gamma distributions. Second, in the case where bus arrival times are independent and geometric discrete random variables, we provide an algorithm for transportation networks of buses and trains, where trains run according to a fixed schedule.
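
    Under the simplest assumption, memoryless (exponential) bus arrivals, the expected wait at a stop is 1/λ regardless of when you arrive, so a least-expected-time plan reduces to Dijkstra over (expected wait + ride) costs. The sketch below shows only that simplified reduction; the algorithms in the paper for increasing-failure-rate arrivals, and for choosing among whichever usable bus arrives first, are more general. The toy network is invented.

```python
# Simplified sketch: with exponential inter-arrival times of rate lam, the
# expected wait at a stop is 1/lam, so the least-expected-time plan reduces to
# Dijkstra over (expected wait + ride time). This is a simplification of the
# problem treated in the paper; the toy network is invented.
import heapq

# stop -> list of (next_stop, arrival_rate_per_min, ride_minutes)
lines = {"X": [("A", 0.2, 10.0), ("B", 0.1, 6.0)],
         "A": [("Y", 0.25, 8.0)],
         "B": [("Y", 0.5, 12.0)],
         "Y": []}

def expected_time(lines, src, dst):
    dist, heap = {src: 0.0}, [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            return d
        if d > dist.get(u, float("inf")):
            continue
        for v, rate, ride in lines[u]:
            nd = d + 1.0 / rate + ride          # expected wait + ride time
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return float("inf")

print("expected minutes X -> Y:", round(expected_time(lines, "X", "Y"), 1))
```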

  19. Centrality and Flow Vergence gradient based Path analysis of scientific literature: A case study of Biotechnology for Engineering

    NASA Astrophysics Data System (ADS)

    Lathabai, Hiran H.; Prabhakaran, Thara; Changat, Manoj

    2015-07-01

    Biotechnology, ever since its inception, has had a huge impact on society, and its various applications have been intricately woven into the human web of life. Its evolution amidst all the other research realms vital to mankind is remarkable. In this paper, we intend to identify the radical innovations in Biotechnology for Engineering using network analyses. Centrality analysis and path analysis are used for identifying important works. The existence of a Flow Vergence effect in the scientific literature is revealed. The Flow Vergence gradient, an arc metric derived from the FV model, is utilised for path analysis, which detects pivotal papers of paradigm shift more accurately. A major paradigm shift has been identified in the business models of Biotechnology for Engineering - the 'Capability to Connectivity' model. Evidence of the adoption of business practices of BT firms by nanotechnology start-ups is also identified. The notion of critical divergence is introduced and the exhibition of interdisciplinary interaction in emerging fields due to critical divergence is discussed. Implications of the above analyses, which target (i) science and technology policy makers, (ii) industrialists and investors, and (iii) researchers in academia as well as industry, are also discussed.

  20. Observation and modeling of source effects in coda wave interferometry at Pavlof volcano

    USGS Publications Warehouse

    Haney, M.M.; van Wijk, K.; Preston, L.A.; Aldridge, D.F.

    2009-01-01

    Sorting out source and path effects for seismic waves at volcanoes is critical for the proper interpretation of underlying volcanic processes. Source or path effects imply that seismic waves interact strongly with the volcanic subsurface, either through partial resonance in a conduit (Garces et al., 2000; Sturton and Neuberg, 2006) or by random scattering in the heterogeneous volcanic edifice (Wegler and Luhr, 2001). As a result, both source and path effects can cause seismic waves to repeatedly sample parts of the volcano, leading to enhanced sensitivity to small changes in material properties at those locations. The challenge for volcano seismologists is to detect and reliably interpret these subtle changes for the purpose of monitoring eruptions. ?? 2009 Society of Exploration Geophysicists.

  1. Electromechanical actuators affected by multiple failures: Prognostic method based on spectral analysis techniques

    NASA Astrophysics Data System (ADS)

    Belmonte, D.; Vedova, M. D. L. Dalla; Ferro, C.; Maggiore, P.

    2017-06-01

    The proposal of prognostic algorithms able to identify precursors of incipient failures of primary flight command electromechanical actuators (EMA) is beneficial for anticipating the incoming failure: an early and correct interpretation of the failure degradation pattern can, in fact, trigger an early alert to the maintenance crew, who can properly schedule the servomechanism replacement. An innovative prognostic model-based approach, able to recognize progressive EMA degradations before the anomalous behaviors become critical, is proposed: the fault detection and identification (FDI) of the considered incipient failures is performed by analyzing proper system operational parameters, able to reveal the corresponding degradation path, by means of a numerical algorithm based on spectral analysis techniques. Subsequently, these operational parameters are correlated with the actual EMA health condition by means of failure maps created by a reference monitoring model-based algorithm. In this work, the proposed method has been tested in the case of an EMA affected by combined progressive failures: in particular, a partial stator single-phase turn-to-turn short circuit and rotor static eccentricity are considered. In order to evaluate the prognostic method, a numerical test bench has been conceived. Results show that the method exhibits adequate robustness and a high degree of confidence in the ability to identify an eventual malfunction early, minimizing the risk of false alarms or unannounced failures.
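
    The spectral-analysis step can be sketched generically: take the FFT of a monitored signal and track the amplitude of a fault-related spectral component as a degradation indicator. The signal, sampling rate and fault frequency below are synthetic assumptions for illustration, not the operational parameters or failure maps used in the paper.

```python
# Sketch of a spectral-analysis feature: FFT of a monitored signal (here a
# synthetic "phase current") and the amplitude of a fault-related component as
# a degradation indicator. Frequencies and amplitudes are made up.
import numpy as np

fs, f_supply, f_fault = 2000.0, 50.0, 35.0          # Hz, synthetic
t = np.arange(0, 2.0, 1.0 / fs)
healthy = np.sin(2 * np.pi * f_supply * t)
degraded = healthy + 0.08 * np.sin(2 * np.pi * f_fault * t)   # incipient fault

def band_amplitude(x, f0, fs, half_width=1.0):
    spec = np.abs(np.fft.rfft(x)) * 2.0 / len(x)    # single-sided amplitude
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    mask = np.abs(freqs - f0) <= half_width
    return spec[mask].max()

for name, sig in (("healthy", healthy), ("degraded", degraded)):
    print(name, "fault-band amplitude:", round(band_amplitude(sig, f_fault, fs), 3))
```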

  2. Learning Kinematic Constraints in Laparoscopic Surgery

    PubMed Central

    Huang, Felix C.; Mussa-Ivaldi, Ferdinando A.; Pugh, Carla M.; Patton, James L.

    2012-01-01

    To better understand how kinematic variables impact learning in surgical training, we devised an interactive environment for simulated laparoscopic maneuvers, using either 1) mechanical constraints typical of a surgical “box-trainer” or 2) virtual constraints in which free hand movements control virtual tool motion. During training, the virtual tool responded to the absolute position in space (Position-Based) or the orientation (Orientation-Based) of a hand-held sensor. Volunteers were further assigned to different sequences of target distances (Near-Far-Near or Far-Near-Far). Training with the Orientation-Based constraint enabled much lower path error and shorter movement times during training, which suggests that tool motion that simply mirrors joint motion is easier to learn. When evaluated in physically constrained (physical box-trainer) conditions, each group exhibited improved performance from training. However, Position-Based training enabled greater reductions in movement error relative to Orientation-Based (mean difference: 14.0 percent; CI: 0.7, 28.6). Furthermore, the Near-Far-Near schedule allowed a greater decrease in task time relative to the Far-Near-Far sequence (mean: −13.5 percent; CI: −19.5, −7.5). Training that focused on shallow tool insertion (near targets) might promote more efficient movement strategies by emphasizing the curvature of tool motion. In addition, our findings suggest that an understanding of absolute tool position is critical to coping with mechanical interactions between the tool and trocar. PMID:23293709

  3. Minimising back reflections from the common path objective in a fundus camera

    NASA Astrophysics Data System (ADS)

    Swat, A.

    2016-11-01

    Eliminating back reflections is critical in the design of a fundus camera with an internal illuminating system. As there is very little light reflected from the retina, even excellent antireflective coatings do not provide sufficient suppression of ghost reflections; therefore the number of surfaces in the optics common to the illuminating and imaging paths shall be minimised. Typically a single aspheric objective is used. In this paper an alternative approach, an objective with all spherical surfaces, is presented. As more surfaces are required, a more sophisticated method is needed to get rid of back reflections. Typically, back-reflection analysis comprises treating subsequent objective surfaces as mirrors, and reflections from the objective surfaces are traced back through the imaging path. This approach can be applied in both sequential and nonsequential ray tracing. It is good enough for a system check but not very suitable for the early optimisation process in the optical system design phase. There are also standard ghost-control merit function operands available for sequential ray-tracing, for example in the Zemax system, but these do not allow a back ray-trace in an alternative optical path, illumination vs. imaging. What is proposed in the paper is a complete method to incorporate ghost-reflected energy into the ray-tracing system merit function for sequential mode, which is more efficient in the optimisation process. Although developed for the specific case of a fundus camera, the method might be utilised in a wider range of applications where ghost control is critical.

  4. Primary weathering rates, water transit times, and concentration-discharge relations: A theoretical analysis for the critical zone

    NASA Astrophysics Data System (ADS)

    Ameli, Ali A.; Beven, Keith; Erlandsson, Martin; Creed, Irena F.; McDonnell, Jeffrey J.; Bishop, Kevin

    2017-01-01

    The permeability architecture of the critical zone exerts a major influence on the hydrogeochemistry of the critical zone. Water flow path dynamics drive the spatiotemporal pattern of geochemical evolution and resulting streamflow concentration-discharge (C-Q) relation, but these flow paths are complex and difficult to map quantitatively. Here we couple a new integrated flow and particle tracking transport model with a general reversible Transition State Theory style dissolution rate law to explore theoretically how C-Q relations and concentration in the critical zone respond to decline in saturated hydraulic conductivity (Ks) with soil depth. We do this for a range of flow rates and mineral reaction kinetics. Our results show that for minerals with a high ratio of equilibrium concentration (Ceq) to intrinsic weathering rate (Rmax), vertical heterogeneity in Ks enhances the gradient of weathering-derived solute concentration in the critical zone and strengthens the inverse stream C-Q relation. As Ceq/Rmax decreases, the spatial distribution of concentration in the critical zone becomes more uniform for a wide range of flow rates, and stream C-Q relation approaches chemostatic behavior, regardless of the degree of vertical heterogeneity in Ks. These findings suggest that the transport-controlled mechanisms in the hillslope can lead to chemostatic C-Q relations in the stream while the hillslope surface reaction-controlled mechanisms are associated with an inverse stream C-Q relation. In addition, as Ceq/Rmax decreases, the concentration in the critical zone and stream become less dependent on groundwater age (or transit time).
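
    A minimal numerical illustration of the reversible TST-style rate law, dC/dt = Rmax(1 − C/Ceq), integrated over a water parcel's transit time, shows how the Ceq/Rmax ratio controls whether concentrations saturate (chemostatic behaviour) or stay far from equilibrium (dilution-like behaviour). The parameter values and units below are arbitrary and are not the model configuration used in the paper.

```python
# Minimal illustration of a reversible TST-style dissolution law,
#   dC/dt = Rmax * (1 - C / Ceq),
# integrated over a water parcel's transit time. A small Ceq/Rmax ratio
# saturates quickly (chemostatic); a large ratio stays far from equilibrium.
# Parameter values are arbitrary.
import math

def concentration_at_exit(transit_time, Rmax, Ceq, c0=0.0):
    # Closed form of dC/dt = Rmax * (1 - C/Ceq) with C(0) = c0
    return Ceq + (c0 - Ceq) * math.exp(-Rmax * transit_time / Ceq)

for Ceq, Rmax in ((1.0, 1.0), (1.0, 0.01)):       # low vs. high Ceq/Rmax
    for tt in (1.0, 10.0, 100.0):                 # transit time (arbitrary units)
        c = concentration_at_exit(tt, Rmax, Ceq)
        print(f"Ceq/Rmax={Ceq/Rmax:6.0f}, transit={tt:5.0f} -> C/Ceq={c/Ceq:.2f}")
```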

  5. An Approach to Realizing Process Control for Underground Mining Operations of Mobile Machines

    PubMed Central

    Song, Zhen; Schunnesson, Håkan; Rinne, Mikael; Sturgul, John

    2015-01-01

    The excavation and production in underground mines are complicated processes which consist of many different operations. The process of underground mining is considerably constrained by the geometry and geology of the mine. The various mining operations are normally performed in series at each working face. The delay of a single operation will lead to a domino effect, thus delay the starting time for the next process and the completion time of the entire process. This paper presents a new approach to the process control for underground mining operations, e.g. drilling, bolting, mucking. This approach can estimate the working time and its probability for each operation more efficiently and objectively by improving the existing PERT (Program Evaluation and Review Technique) and CPM (Critical Path Method). If the delay of the critical operation (which is on a critical path) inevitably affects the productivity of mined ore, the approach can rapidly assign mucking machines new jobs to increase this amount at a maximum level by using a new mucking algorithm under external constraints. PMID:26062092

  6. An Approach to Realizing Process Control for Underground Mining Operations of Mobile Machines.

    PubMed

    Song, Zhen; Schunnesson, Håkan; Rinne, Mikael; Sturgul, John

    2015-01-01

    The excavation and production in underground mines are complicated processes which consist of many different operations. The process of underground mining is considerably constrained by the geometry and geology of the mine. The various mining operations are normally performed in series at each working face. The delay of a single operation will lead to a domino effect, thus delay the starting time for the next process and the completion time of the entire process. This paper presents a new approach to the process control for underground mining operations, e.g. drilling, bolting, mucking. This approach can estimate the working time and its probability for each operation more efficiently and objectively by improving the existing PERT (Program Evaluation and Review Technique) and CPM (Critical Path Method). If the delay of the critical operation (which is on a critical path) inevitably affects the productivity of mined ore, the approach can rapidly assign mucking machines new jobs to increase this amount at a maximum level by using a new mucking algorithm under external constraints.
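
    The CPM calculation that this approach builds on can be sketched compactly: a forward pass over the operation precedence network gives earliest start/finish times, a backward pass gives latest times, and operations with zero slack form the critical path. The mining operations and durations below are invented for illustration; the paper's method additionally attaches probabilities to the estimated times.

```python
# Compact CPM sketch: forward pass for earliest times, backward pass for latest
# times, zero-slack operations form the critical path. Operations and durations
# are invented.
durations = {"drill": 3.0, "charge": 1.0, "blast": 0.5, "ventilate": 1.5,
             "muck": 4.0, "bolt": 2.0}
preds = {"drill": [], "charge": ["drill"], "blast": ["charge"],
         "ventilate": ["blast"], "muck": ["ventilate"], "bolt": ["ventilate"]}

order = list(durations)                      # already in topological order here
es, ef = {}, {}
for op in order:                             # forward pass
    es[op] = max((ef[p] for p in preds[op]), default=0.0)
    ef[op] = es[op] + durations[op]

project_end = max(ef.values())
succs = {op: [o for o in order if op in preds[o]] for op in order}
lf, ls = {}, {}
for op in reversed(order):                   # backward pass
    lf[op] = min((ls[s] for s in succs[op]), default=project_end)
    ls[op] = lf[op] - durations[op]

for op in order:
    slack = ls[op] - es[op]
    flag = "  <- critical" if abs(slack) < 1e-9 else ""
    print(f"{op:9s} ES={es[op]:4.1f} LS={ls[op]:4.1f} slack={slack:4.1f}{flag}")
```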

  7. CPM and PERT in Library Management.

    ERIC Educational Resources Information Center

    Main, Linda

    1989-01-01

    Discusses two techniques of systems analysis--Critical Path Method (CPM) and Program Evaluation Review Techniques (PERT)--and their place in library management. An overview of CPM and PERT charting procedures is provided. (11 references) (Author/MES)

  8. Integration of domain and resource-based reasoning for real-time control in dynamic environments

    NASA Technical Reports Server (NTRS)

    Morgan, Keith; Whitebread, Kenneth R.; Kendus, Michael; Cromarty, Andrew S.

    1993-01-01

    A real-time software controller that successfully integrates domain-based and resource-based control reasoning to perform task execution in a dynamically changing environment is described. The design of the controller is based on the concept of partitioning the process to be controlled into a set of tasks, each of which achieves some process goal. It is assumed that, in general, there are multiple ways (tasks) to achieve a goal. The controller dynamically determines current goals and their current criticality, choosing and scheduling tasks to achieve those goals in the time available. It incorporates rule-based goal reasoning, a TMS-based criticality propagation mechanism, and a real-time scheduler. The controller has been used to build a knowledge-based situation assessment system that formed a major component of a real-time, distributed, cooperative problem solving system built under DARPA contract. It is also being employed in other applications now in progress.

  9. Surface Navigation Using Optimized Waypoints and Particle Swarm Optimization

    NASA Technical Reports Server (NTRS)

    Birge, Brian

    2013-01-01

    The design priority for manned space exploration missions is almost always placed on human safety. Proposed manned surface exploration tasks (lunar, asteroid sample returns, Mars) have the possibility of astronauts traveling several kilometers away from a home base. Deviations from preplanned paths are expected while exploring. In a time-critical emergency situation, there is a need to develop an optimal home base return path. The return path may or may not be similar to the outbound path, and what defines optimal may change with, and even within, each mission. A novel path planning algorithm and prototype program was developed using biologically inspired particle swarm optimization (PSO) that generates an optimal path of traversal while avoiding obstacles. Applications include emergency path planning on lunar, Martian, and/or asteroid surfaces, generating multiple scenarios for outbound missions, Earth-based search and rescue, as well as human manual traversal and/or path integration into robotic control systems. The strategy allows for a changing environment, and can be re-tasked at will and run in real-time situations. Given a random extraterrestrial planetary or small body surface position, the goal was to find the fastest (or shortest) path to an arbitrary position such as a safe zone or geographic objective, subject to possibly varying constraints. The problem requires a workable solution 100% of the time, though it does not require the absolute theoretical optimum. Obstacles should be avoided, but if they cannot be, then the algorithm needs to be smart enough to recognize this and deal with it. With some modifications, it works with non-stationary error topologies as well.
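
    A minimal particle swarm sketch of the waypoint-planning idea is given below: each particle encodes a set of intermediate 2-D waypoints between start and goal, and the cost is path length plus a penalty for passing near a circular obstacle. The problem setup, constants and penalty form are invented for illustration and are not the NASA program described above.

```python
# Minimal PSO sketch for 2-D waypoint planning: each particle is a set of
# intermediate waypoints; cost = path length + penalty for entering a circular
# obstacle. Problem setup and constants are invented.
import random, math

START, GOAL = (0.0, 0.0), (10.0, 10.0)
OBSTACLE, RADIUS = (5.0, 5.0), 2.0
N_WAYPOINTS, N_PARTICLES, ITERS = 3, 30, 200

def cost(flat):
    pts = [START] + [(flat[i], flat[i + 1]) for i in range(0, len(flat), 2)] + [GOAL]
    length = sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))
    penalty = sum(max(0.0, RADIUS - math.dist(p, OBSTACLE)) for p in pts[1:-1])
    return length + 50.0 * penalty

dim = 2 * N_WAYPOINTS
pos = [[random.uniform(0, 10) for _ in range(dim)] for _ in range(N_PARTICLES)]
vel = [[0.0] * dim for _ in range(N_PARTICLES)]
pbest = [p[:] for p in pos]
gbest = min(pbest, key=cost)

for _ in range(ITERS):
    for i in range(N_PARTICLES):
        for d in range(dim):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (0.72 * vel[i][d]
                         + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                         + 1.5 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if cost(pos[i]) < cost(pbest[i]):
            pbest[i] = pos[i][:]
    gbest = min(pbest, key=cost)

print("best cost:", round(cost(gbest), 2))
```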

  10. Austenite grain growth kinetics in Al-killed plain carbon steels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Militzer, M.; Giumelli, A.; Hawbolt, E.B.

    1996-11-01

    Austenite grain growth kinetics have been investigated in three Al-killed plain carbon steels. Experimental results have been validated using the statistical grain growth model by Abbruzzese and Luecke, which takes pinning by second-phase particles into account. It is shown that the pinning force is a function of the pre-heat-treatment schedule. Extrapolation to the conditions of a hot-strip mill indicates that grain growth occurs without pinning during conventional processing. Analytical relations are proposed to simulate austenite grain growth for Al-killed plain carbon steels for any thermal path in a hot-strip mill.

  11. Storm Warning Service

    NASA Technical Reports Server (NTRS)

    1993-01-01

    A Huntsville meteorologist of Baron Services, Inc. has formed a commercial weather advisory service. Weather information is based on data from Marshall Space Flight Center (MSFC) collected from antennas in Alabama and Tennessee. Bob Baron refines and enhances MSFC's real time display software. Computer data is changed to audio data for radio transmission, received by clients through an antenna and decoded by computer for display. Using his service, clients can monitor the approach of significant storms and schedule operations accordingly. Utilities and emergency management officials are able to plot a storm's path. A recent agreement with two other companies will promote continued development and marketing.

  12. Job-shop scheduling applied to computer vision

    NASA Astrophysics Data System (ADS)

    Sebastian y Zuniga, Jose M.; Torres-Medina, Fernando; Aracil, Rafael; Reinoso, Oscar; Jimenez, Luis M.; Garcia, David

    1997-09-01

    This paper presents a method for minimizing the total elapsed time spent by n tasks running on m different processors working in parallel. The developed algorithm not only minimizes the total elapsed time but also reduces the idle time and waiting time of in-process tasks. This is very important in some applications of computer vision in which the time to finish the total process is particularly critical -- quality control in industrial inspection, real-time computer vision, guided robots. The scheduling algorithm is based on two matrices derived from the precedence relationships between tasks and on the data obtained from them. The algorithm has been tested in a quality control application using computer vision, and the results obtained with different image processing algorithms have been satisfactory.
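
    The paper's two-matrix algorithm is not reproduced here; as a hedged stand-in, the sketch below shows a common greedy list-scheduling heuristic for precedence-constrained tasks on m identical processors, with made-up task names and durations.

```python
# Greedy list scheduling with precedence constraints on m identical processors
# (illustrative heuristic, not the paper's algorithm).
import heapq

def list_schedule(tasks, m):
    """tasks: {name: (duration, [predecessors])} -> {name: finish time}."""
    finish = {}
    free = [(0.0, i) for i in range(m)]            # (time the processor is free, id)
    heapq.heapify(free)
    remaining = dict(tasks)
    while remaining:
        # tasks whose predecessors have all finished
        ready = [t for t, (_, preds) in remaining.items() if all(p in finish for p in preds)]
        # longest-duration-first is a common priority rule
        t = max(ready, key=lambda t: remaining[t][0])
        dur, preds = remaining.pop(t)
        proc_free, pid = heapq.heappop(free)
        start = max([proc_free] + [finish[p] for p in preds])
        finish[t] = start + dur
        heapq.heappush(free, (finish[t], pid))
    return finish

tasks = {"grab": (2, []), "filter": (3, ["grab"]), "edges": (2, ["grab"]),
         "measure": (1, ["filter", "edges"])}
print(list_schedule(tasks, m=2))   # e.g. {'grab': 2, 'filter': 5, 'edges': 4, 'measure': 6}
```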

  13. Advances in mixed-integer programming methods for chemical production scheduling.

    PubMed

    Velez, Sara; Maravelias, Christos T

    2014-01-01

    The goal of this paper is to critically review advances in the area of chemical production scheduling over the past three decades and then present two recently proposed solution methods that have led to dramatic computational enhancements. First, we present a general framework and problem classification and discuss modeling and solution methods with an emphasis on mixed-integer programming (MIP) techniques. Second, we present two solution methods: (a) a constraint propagation algorithm that allows us to compute parameters that are then used to tighten MIP scheduling models and (b) a reformulation that introduces new variables, thus leading to effective branching. We also present computational results and an example illustrating how these methods are implemented, as well as the resulting enhancements. We close with a discussion of open research challenges and future research directions.
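
    As a toy illustration of the kind of mixed-integer programming formulation discussed above (not one of the paper's models), the sketch below schedules two batch tasks on one unit in discrete time; it assumes the PuLP package and its bundled CBC solver.

```python
# Toy discrete-time batch-scheduling MIP: choose start periods for two tasks on one
# unit so that they do not overlap and the makespan is minimized.  Assumes PuLP.
import pulp

H = range(6)                                   # time periods
dur = {"A": 2, "B": 3}                         # batch durations (periods)
prob = pulp.LpProblem("toy_batch_scheduling", pulp.LpMinimize)
# x[i][t] = 1 if task i starts at period t
x = {i: {t: pulp.LpVariable(f"x_{i}_{t}", cat="Binary") for t in H} for i in dur}
makespan = pulp.LpVariable("makespan", lowBound=0)

for i in dur:
    prob += pulp.lpSum(x[i][t] for t in H) == 1             # each task starts exactly once
    for t in H:
        prob += makespan >= (t + dur[i]) * x[i][t]          # makespan bounds each finish time
for t in H:                                                 # unit capacity: at most one
    prob += pulp.lpSum(x[i][tp] for i in dur                # task active in any period
                       for tp in H if tp <= t < tp + dur[i]) <= 1
prob += makespan                                            # objective
prob.solve(pulp.PULP_CBC_CMD(msg=False))
starts = {i: next(t for t in H if x[i][t].value() > 0.5) for i in dur}
print(starts, makespan.value())                             # e.g. one task at 0, the other after it
```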

  14. Comparison of teen and adult driver crash scenarios in a nationally representative sample of serious crashes.

    PubMed

    McDonald, Catherine C; Curry, Allison E; Kandadai, Venk; Sommers, Marilyn S; Winston, Flaura K

    2014-11-01

    Motor vehicle crashes are the leading cause of death and acquired disability during the first four decades of life. While teen drivers have the highest crash risk, few studies examine the similarities and differences in teen and adult driver crashes. We aimed to: (1) identify and compare the most frequent crash scenarios-integrated information on a vehicle's movement prior to crash, immediate pre-crash event, and crash configuration-for teen and adult drivers involved in serious crashes, and (2) for the most frequent scenarios, explore whether the distribution of driver critical errors differed for teens and adult drivers. We analyzed data from the National Motor Vehicle Crash Causation Survey, a nationally representative study of serious crashes conducted by the U.S. National Highway Traffic Safety Administration from 2005 to 2007. Our sample included 642 16- to 19-year-old and 1167 35- to 54-year-old crash-involved drivers (weighted n=296,482 and 439,356, respectively) who made a critical error that led to their crash's critical pre-crash event (i.e., event that made the crash inevitable). We estimated prevalence ratios (PR) and 95% confidence intervals (CI) to compare the relative frequency of crash scenarios and driver critical errors. The top five crash scenarios among teen drivers, accounting for 37.3% of their crashes, included: (1) going straight, other vehicle stopped, rear end; (2) stopped in traffic lane, turning left at intersection, turn into path of other vehicle; (3) negotiating curve, off right edge of road, right roadside departure; (4) going straight, off right edge of road, right roadside departure; and (5) stopped in lane, turning left at intersection, turn across path of other vehicle. The top five crash scenarios among adult drivers, accounting for 33.9% of their crashes, included the same scenarios as the teen drivers with the exception of scenario (3) and the addition of going straight, crossing over an intersection, and continuing on a straight path. For two scenarios ((1) and (3) above), teens were more likely than adults to make a critical decision error (e.g., traveling too fast for conditions). Our findings indicate that among those who make a driver critical error in a serious crash, there are few differences in the scenarios or critical driver errors for teen and adult drivers. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Research and Development Strategies in the Semiconductor Industry

    NASA Astrophysics Data System (ADS)

    Bowling, Allen

    2003-03-01

    In the 21st Century semiconductor industry, there is a critical balance between internally funded semiconductor research and development (R&D) and externally funded R&D. External R&D may include jointly-funded research collaborations/partnerships with other device manufacturers, jointly-funded consortia-based R&D, and individually-funded research programs at universities and other contract research locations. Each of these approaches has merits and each has costs. There is a critical balance between keeping the internal research and development pipeline filled and keeping it from being overspent. To meet both competitive schedule and cost goals, a semiconductor device manufacturer must decide on a model for selection of internal versus external R&D. Today, one of the most critical decisions is whether or not to do semiconductor research and development on 300 mm silicon wafers. Equipment suppliers are doing first development on 300 mm equipment. So, for the device manufacturer, there is a balance between the cost of doing development on 300 mm wafers and the development time schedule driven by equipment availability. In the face of these cost and schedule elements, device manufacturers are looking to consortia such as SEMATECH, SRC, and SRC MARCO for early development and screening of new materials and device structure approaches. This also leads to much closer development collaboration between device manufacturers and equipment suppliers. Many device manufacturers are also making use of direct contract research with universities and other contract-research organizations, such as IMEC, LETI, and other government-funded research organizations around the world. To get the most out of these external research interactions, the company must develop a strategy for management and technology integration of external R&D.

  16. The Aging of Engines: An Operator’s Perspective

    DTIC Science & Technology

    2000-10-01

    internal HCF failures of blades . Erosion of compressor gas path 2-3 components can be minimized through the use of inlet aluminide intermetallic...fatigue problems in the dovetails durability in accelerated burner rig tests [2,35]. areas of titanium alloy fan and compressor blades . Shot peening in...Criticality Analysis replacement of durability-critical components, such as FOD Foreign object damage blades and vanes. The need to balance risk and escalating

  17. Examining Transformation on the Road to the Professoriate

    ERIC Educational Resources Information Center

    Benoit, Anne C.

    2016-01-01

    This chapter presents the findings of a narrative study of two diverse faculty members on the path to their current faculty positions and examines their negotiation of a critical event in light of adult transformative learning.

  18. Expanding the ribosomal universe.

    PubMed

    Dinman, Jonathan D; Kinzy, Terri Goss

    2009-12-09

    In this issue of Structure, Taylor et al. (2009) present the most complete model of a eukaryotic ribosome to date. This achievement represents a critical milestone along the path to structurally defining the unique aspects of the eukaryotic protein synthetic machinery.

  19. Statechart Analysis with Symbolic PathFinder

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.

    2012-01-01

    We report here on our on-going work that addresses the automated analysis and test case generation for software systems modeled using multiple Statechart formalisms. The work is motivated by large programs such as NASA Exploration, that involve multiple systems that interact via safety-critical protocols and are designed with different Statechart variants. To verify these safety-critical systems, we have developed Polyglot, a framework for modeling and analysis of model-based software written using different Statechart formalisms. Polyglot uses a common intermediate representation with customizable Statechart semantics and leverages the analysis and test generation capabilities of the Symbolic PathFinder tool. Polyglot is used as follows: First, the structure of the Statechart model (expressed in Matlab Stateflow or Rational Rhapsody) is translated into a common intermediate representation (IR). The IR is then translated into Java code that represents the structure of the model. The semantics are provided as "pluggable" modules.
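
    The sketch below illustrates, in a hedged and simplified way, the notion of a common intermediate representation with pluggable Statechart semantics; it is not Polyglot's IR, code generator, or Symbolic PathFinder integration, and the states, events, and guards are invented.

```python
# Tiny statechart-like IR with pluggable transition semantics (illustrative only).
class Statechart:
    def __init__(self, initial, transitions, semantics):
        self.state = initial
        self.transitions = transitions      # list of (src, event, guard, dst)
        self.semantics = semantics          # picks which enabled transition fires

    def step(self, event, env):
        enabled = [t for t in self.transitions
                   if t[0] == self.state and t[1] == event and t[2](env)]
        if enabled:
            self.state = self.semantics(enabled)[3]
        return self.state

# One "pluggable" semantics module: fire the first enabled transition.
first_match = lambda enabled: enabled[0]

chart = Statechart("idle",
                   [("idle", "cmd", lambda e: e["armed"], "active"),
                    ("active", "fault", lambda e: True, "safe")],
                   first_match)
print(chart.step("cmd", {"armed": True}))   # -> "active"
```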

  20. [Legal framework of postgraduate nursing education in spain].

    PubMed

    Fernández, B M

    1996-01-01

    Part of the first report of the SEEIUC Forum on the training of nurses in critical care units, this article describes the postgraduate training paths established by Spanish legislation. The "Titulos Oficiales de Especialización Profesional" ("Official Degrees on Professional Specialization") cover the seven nursing specialties regulated by Decreto 992/1987. Along a second path, the "Titulos de Postgraduado no Oficiales" ("Non-official postgraduate degrees"), every university recognized under the LRU may, within its autonomy, create its own degrees and organize Master's courses, university expert and university specialist programs, and other postgraduate university degrees. To keep this autonomous offering as homogeneous as possible, an interuniversity agreement encompassing 24 national universities sets out the general criteria for the academic organization of such courses. The report concludes with an analysis of the critical care nursing training offered in Spain during the 1995/1996 academic year.

  1. NASA flight controllers - Meeting cultural and leadership challenges on the critical path to mission success

    NASA Technical Reports Server (NTRS)

    Clement, James L., Jr.; Ritsher, Jennifer Boyd

    2006-01-01

    As part of its preparation for missions to the Moon and Mars, NASA has identified high priority critical path roadmap (CPR) questions, two of which focus on the performance of mission control personnel. NASA flight controllers have always worked in an incredibly demanding setting, but the International Space Station poses even more challenges than prior missions. We surveyed 14 senior ISS flight controllers and a contrasting sample of 12 more junior controllers about the management and cultural challenges they face and the most effective strategies for addressing them. There was substantial consensus among participants on some issues, such as the importance of building a personal relationship with Russian colleagues. Responses from junior and senior controllers differed in some areas, such as training. We frame the results in terms of two CPR questions. We aim to use our results to improve flight controller training.

  2. Dynamical correlation effects on photoisomerization: Ab initio multiple spawning dynamics with MS-CASPT2 for a model trans-protonated Schiff base

    DOE PAGES

    Liu, Lihong; Liu, Jian; Martinez, Todd J.

    2015-12-17

    Here, we investigate the photoisomerization of a model retinal protonated Schiff base (trans-PSB3) using ab initio multiple spawning (AIMS) based on multi-state second order perturbation theory (MSPT2). Discrepancies between the photodynamical mechanism computed with three-root state-averaged complete active space self-consistent field (SA-3-CASSCF, which does not include dynamic electron correlation effects) and MSPT2 show that dynamic correlation is critical in this photoisomerization reaction. Furthermore, we show that the photodynamics of trans-PSB3 is not well described by predictions based on minimum energy conical intersections (MECIs) or minimum energy conical intersection (CI) seam paths. Instead, most of the CIs involved in the photoisomerization are far from MECIs and minimum energy CI seam paths. Thus, both dynamical nuclear effects and dynamic electron correlation are critical to understanding the photochemical mechanism.

  3. Electronic Resources for Security Related Information, CIAC-2307 R.1

    DTIC Science & Technology

    1994-12-01

    administrators NETwork - Statistics about the network NODEntry node1 <node2 <…>> - BITEARN NODES entry for the specified node(s) NODEntry node1 / abc */xyz...Just the “:xyz.” tag and all tags whose name starts with “ abc ” PATHs snode node1 <node2 <…>> - BITNET path between “snode” and the specified node(s...protect their data and systems. The mention of vendors or product names does not imply criticism or endorsement by the National Institute of Standards

  4. Critical Directed Energy Test and Evaluation Infrastructure Shortfalls: Results of the Directed Energy Test and Evaluation Capability Tri-Service Study Update

    DTIC Science & Technology

    2009-06-01

    Sensor H11 HPM Chamber Test Capability—Explosive Equivalent Substitute H12 HEL Irradiance & Temperature H13 HEL Near/In-Beam Path Quality H14 HPM Sensor...such things as artillery shells or UAVs and may impact the earth. Possible targets include missiles in flight or a relatively close command, control...capability is a synergy of four high priority shortfalls identified by the T-SS Update. H13 —HEL near/in-beam path quality H13 is the need for a

  5. Universal pion freeze-out in heavy-ion collisions.

    PubMed

    Adamová, D; Agakichiev, G; Appelshäuser, H; Belaga, V; Braun-Munzinger, P; Castillo, A; Cherlin, A; Damjanović, S; Dietel, T; Dietrich, L; Drees, A; Esumi, S I; Filimonov, K; Fomenko, K; Fraenkel, Z; Garabatos, C; Glässel, P; Hering, G; Holeczek, J; Kushpil, V; Lenkeit, B; Ludolphs, W; Maas, A; Marín, A; Milosević, J; Milov, A; Miśkowiec, D; Panebrattsev, Yu; Petchenova, O; Petrácek, V; Pfeiffer, A; Rak, J; Ravinovich, I; Rehak, P; Sako, H; Schmitz, W; Schukraft, J; Sedykh, S; Shimansky, S; Slívová, J; Specht, H J; Stachel, J; Sumbera, M; Tilsner, H; Tserruya, I; Wessels, J P; Wienold, T; Windelband, B; Wurm, J P; Xie, W; Yurevich, S; Yurevich, V

    2003-01-17

    Based on an evaluation of data on pion interferometry and on particle yields at midrapidity, we propose a universal condition for thermal freeze-out of pions in heavy-ion collisions. We show that freeze-out occurs when the mean free path of pions lambda(f) reaches a value of about 1 fm, which is much smaller than the spatial extent of the system at freeze-out. This critical mean free path is independent of the centrality of the collision and beam energy from the Alternating Gradient Synchrotron to the Relativistic Heavy Ion Collider.

  6. Cryogenic Fluid Management Technology Development for Nuclear Thermal Propulsion

    NASA Technical Reports Server (NTRS)

    Taylor, B. D.; Caffrey, J.; Hedayat, A.; Stephens, J.; Polsgrove, R.

    2015-01-01

    Cryogenic fluid management technology is critical to the success of future nuclear thermal propulsion powered vehicles and long duration missions. This paper discusses current capabilities in key technologies and their development path. The thermal environment, complicated by the radiation escaping the reactor of a nuclear thermal propulsion system, is examined and an analysis is presented. The technology development path required for maintaining cryogenic propellants in this environment is reviewed. This paper is intended to encourage and bring attention to the cryogenic fluid management technologies needed to enable nuclear thermal propulsion powered deep space missions.

  7. A path model for Whittaker vectors

    NASA Astrophysics Data System (ADS)

    Di Francesco, Philippe; Kedem, Rinat; Turmunkh, Bolor

    2017-06-01

    In this paper we construct weighted path models to compute Whittaker vectors in the completion of Verma modules, as well as Whittaker functions of fundamental type, for all finite-dimensional simple Lie algebras, affine Lie algebras, and the quantum algebra U_q(sl_{r+1}). This leads to series expressions for the Whittaker functions. We show how this construction leads directly to the quantum Toda equations satisfied by these functions, and to the q-difference equations in the quantum case. We investigate the critical limit of affine Whittaker functions computed in this way.

  8. 30 CFR 850.13 - Training.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...; and (iii) Handling, transportation, and storage; (2) Blast designs, including— (i) Geologic and topographic considerations; (ii) Design of a blast hole, with critical dimensions; (iii) Pattern design, field... records; (9) Schedules; (10) Preblasting surveys, including— (i) Availability, (ii) Coverage, and (iii...

  9. 30 CFR 850.13 - Training.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...; and (iii) Handling, transportation, and storage; (2) Blast designs, including— (i) Geologic and topographic considerations; (ii) Design of a blast hole, with critical dimensions; (iii) Pattern design, field... records; (9) Schedules; (10) Preblasting surveys, including— (i) Availability, (ii) Coverage, and (iii...

  10. 30 CFR 850.13 - Training.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...; and (iii) Handling, transportation, and storage; (2) Blast designs, including— (i) Geologic and topographic considerations; (ii) Design of a blast hole, with critical dimensions; (iii) Pattern design, field... records; (9) Schedules; (10) Preblasting surveys, including— (i) Availability, (ii) Coverage, and (iii...

  11. 30 CFR 850.13 - Training.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...; and (iii) Handling, transportation, and storage; (2) Blast designs, including— (i) Geologic and topographic considerations; (ii) Design of a blast hole, with critical dimensions; (iii) Pattern design, field... records; (9) Schedules; (10) Preblasting surveys, including— (i) Availability, (ii) Coverage, and (iii...

  12. 30 CFR 850.13 - Training.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...; and (iii) Handling, transportation, and storage; (2) Blast designs, including— (i) Geologic and topographic considerations; (ii) Design of a blast hole, with critical dimensions; (iii) Pattern design, field... records; (9) Schedules; (10) Preblasting surveys, including— (i) Availability, (ii) Coverage, and (iii...

  13. Critical role of bevacizumab scheduling in combination with pre-surgical chemo-radiotherapy in MRI-defined high-risk locally advanced rectal cancer: results of the branch trial

    PubMed Central

    Avallone, Antonio; Pecori, Biagio; Bianco, Franco; Aloj, Luigi; Tatangelo, Fabiana; Romano, Carmela; Granata, Vincenza; Marone, Pietro; Leone, Alessandra; Botti, Gerardo; Petrillo, Antonella; Caracò, Corradina; Iaffaioli, Vincenzo R.; Muto, Paolo; Romano, Giovanni; Comella, Pasquale; Budillon, Alfredo; Delrio, Paolo

    2015-01-01

    Background We have previously shown that an intensified preoperative regimen including oxaliplatin plus raltitrexed and 5-fluorouracil/folinic acid (OXATOM/FUFA) during preoperative pelvic radiotherapy produced promising results in locally advanced rectal cancer (LARC). Preclinical evidence suggests that the scheduling of bevacizumab may be crucial to optimize its combination with chemo-radiotherapy. Patients and methods This non-randomized, non-comparative, phase II study was conducted in MRI-defined high-risk LARC. Patients received three biweekly cycles of OXATOM/FUFA during RT. Bevacizumab was given 2 weeks before the start of chemo-radiotherapy, and on the same day of chemotherapy for 3 cycles (concomitant-schedule A) or 4 days prior to the first and second cycle of chemotherapy (sequential-schedule B). Primary end point was pathological complete tumor regression (TRG1) rate. Results The accrual for the concomitant schedule was terminated early because the number of TRG1 responses (2 of 16 patients) was statistically inconsistent with the hypothesis of activity (30%) to be tested. Conversely, the endpoint was reached with the sequential schedule and the final TRG1 rate among 46 enrolled patients was 50% (95% CI 35%–65%). Neutropenia was the most common grade ≥3 toxicity with both schedules, but it was less pronounced with the sequential than with the concomitant schedule (30% vs. 44%). Postoperative complications occurred in 8/15 (53%) and 13/46 (28%) patients in schedule A and B, respectively. At 5-year follow-up the probability of PFS and OS was 80% (95%CI, 66%–89%) and 85% (95%CI, 69%–93%), respectively, for the sequential schedule. Conclusions These results highlight the relevance of bevacizumab scheduling to optimize its combination with preoperative chemo-radiotherapy in the management of LARC. PMID:26320185

  14. A collaborative network middleware project by Lambda Station, TeraPaths, and Phoebus

    NASA Astrophysics Data System (ADS)

    Bobyshev, A.; Bradley, S.; Crawford, M.; DeMar, P.; Katramatos, D.; Shroff, K.; Swany, M.; Yu, D.

    2010-04-01

    The TeraPaths, Lambda Station, and Phoebus projects, funded by the US Department of Energy, have successfully developed network middleware services that establish on-demand and manage true end-to-end, Quality-of-Service (QoS) aware, virtual network paths across multiple administrative network domains, select network paths and gracefully reroute traffic over these dynamic paths, and streamline traffic between packet and circuit networks using transparent gateways. These services improve network QoS and performance for applications, playing a critical role in the effective use of emerging dynamic circuit network services. They provide interfaces to applications, such as dCache SRM, translate network service requests into network device configurations, and coordinate with each other to set up end-to-end network paths. The End Site Control Plane Subsystem (ESCPS) builds upon the success of the three projects by combining their individual capabilities into the next generation of network middleware. ESCPS addresses challenges such as cross-domain control plane signalling and interoperability, authentication and authorization in a Grid environment, topology discovery, and dynamic status tracking. The new network middleware will take full advantage of the perfSONAR monitoring infrastructure and the Inter-Domain Control plane efforts and will be deployed and fully vetted in the Large Hadron Collider data movement environment.

  15. Path optimization method for the sign problem

    NASA Astrophysics Data System (ADS)

    Ohnishi, Akira; Mori, Yuto; Kashiwa, Kouji

    2018-03-01

    We propose a path optimization method (POM) to evade the sign problem in the Monte-Carlo calculations for complex actions. Among many approaches to the sign problem, the Lefschetz-thimble path-integral method and the complex Langevin method are promising and extensively discussed. In these methods, real field variables are complexified and the integration manifold is determined by the flow equations or stochastically sampled. When we have singular points of the action or multiple critical points near the original integral surface, however, we risk encountering the residual and global sign problems or the singular drift term problem. One of the ways to avoid the singular points is to optimize the integration path so that it does not hit the singular points of the Boltzmann weight. By specifying the one-dimensional integration path as z = t + if(t) (f ∈ R) and by optimizing f(t) to enhance the average phase factor, we demonstrate that we can avoid the sign problem in a one-variable toy model for which the complex Langevin method is found to fail. In these proceedings, we propose the POM and discuss how we can avoid the sign problem in a toy model. We also discuss the possibility of using a neural network to optimize the path.
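
    As a hedged sketch of the one-variable setup (notation assumed here, not taken from the paper), the deformed-path integral and the average phase factor being maximized can be written as:

```latex
% One-variable integral along the parametrized path z(t) = t + i f(t), with f(t) real.
% Z is unchanged by the deformation (no singularities crossed); the average phase
% factor below is the quantity the path optimization tries to enhance.
\begin{align}
  Z &= \int_{-\infty}^{\infty} dt\, \bigl(1 + i f'(t)\bigr)\, e^{-S(t + i f(t))}, \\
  \langle e^{i\theta} \rangle
    &= \frac{Z}{\displaystyle\int_{-\infty}^{\infty} dt\,
       \bigl|\bigl(1 + i f'(t)\bigr)\, e^{-S(t + i f(t))}\bigr|}.
\end{align}
```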

  16. Portable open-path chemical sensor using a quantum cascade laser

    NASA Astrophysics Data System (ADS)

    Corrigan, Paul; Lwin, Maung; Huntley, Reuven; Chhabra, Amandeep; Moshary, Fred; Gross, Barry; Ahmed, Samir

    2009-05-01

    Remote sensing of enemy installations or their movements by trace gas detection is a critical but challenging military objective. Open path measurements over ranges of a few meters to many kilometers with sensitivity in the parts per million or billion regime are crucial in anticipating the presence of a threat. Previous approaches to detecting ground level chemical plumes, explosive constituents, or combustion have relied on low-resolution, short-range Fourier transform infrared (FTIR) spectrometry or low-sensitivity near-infrared differential optical absorption spectroscopy (DOAS). As mid-infrared quantum cascade laser (QCL) sources have improved in cost and performance, systems based on QCLs that can be tailored to monitor multiple chemical species in real time are becoming a viable alternative. We present the design of a portable, high-resolution, multi-kilometer open path trace gas sensor based on QCL technology. Using a tunable (1045-1047 cm-1) QCL, a modeled atmosphere, and a link-budget analysis with commercial component specifications, we show that with this approach, parts-per-billion accuracy for ozone or ammonia can be obtained in seconds at path lengths up to 10 km. We have assembled an open-path QCL sensor based on this theoretical approach at City College of New York, and we present preliminary results demonstrating the potential of QCLs in open-path sensing applications.
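
    The retrieval principle behind such open-path measurements is the Beer-Lambert law; the sketch below illustrates it with placeholder numbers (the cross section, path length, and signal levels are assumptions, not the cited instrument's link budget).

```python
# Hedged illustration of open-path concentration retrieval via the Beer-Lambert law.
import math

def column_concentration(I, I0, sigma, path_length_cm):
    """Return number density (molecules/cm^3) from transmitted and reference signals.

    I, I0          : measured and reference detector signals
    sigma          : absorption cross section at the laser line (cm^2/molecule)
    path_length_cm : open path length (cm)
    """
    absorbance = -math.log(I / I0)            # optical depth = sigma * N * L
    return absorbance / (sigma * path_length_cm)

# Example: 2% absorption over a 1 km path with an assumed cross section.
N = column_concentration(I=0.98, I0=1.0, sigma=1.0e-19, path_length_cm=1.0e5)
print(f"{N:.2e} molecules/cm^3")
```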

  17. Underwater Robot Task Planning Using Multi-Objective Meta-Heuristics

    PubMed Central

    Landa-Torres, Itziar; Manjarres, Diana; Bilbao, Sonia; Del Ser, Javier

    2017-01-01

    Robotics deployed in the underwater medium are subject to stringent operational conditions that impose a high degree of criticality on the allocation of resources and the schedule of operations in mission planning. In this context the so-called cost of a mission must be considered as an additional criterion when designing optimal task schedules within the mission at hand. Such a cost can be conceived as the impact of the mission on the robotic resources themselves, which range from the consumption of battery to other negative effects such as mechanic erosion. This manuscript focuses on this issue by devising three heuristic solvers aimed at efficiently scheduling tasks in robotic swarms, which collaborate together to accomplish a mission, and by presenting experimental results obtained over realistic scenarios in the underwater environment. The heuristic techniques resort to a Random-Keys encoding strategy to represent the allocation of robots to tasks and the relative execution order of such tasks within the schedule of certain robots. The obtained results reveal interesting differences in terms of Pareto optimality and spread between the algorithms considered in the benchmark, which are insightful for the selection of a proper task scheduler in real underwater campaigns. PMID:28375160

  18. Underwater Robot Task Planning Using Multi-Objective Meta-Heuristics.

    PubMed

    Landa-Torres, Itziar; Manjarres, Diana; Bilbao, Sonia; Del Ser, Javier

    2017-04-04

    Robotics deployed in the underwater medium are subject to stringent operational conditions that impose a high degree of criticality on the allocation of resources and the schedule of operations in mission planning. In this context the so-called cost of a mission must be considered as an additional criterion when designing optimal task schedules within the mission at hand. Such a cost can be conceived as the impact of the mission on the robotic resources themselves, which range from the consumption of battery to other negative effects such as mechanic erosion. This manuscript focuses on this issue by devising three heuristic solvers aimed at efficiently scheduling tasks in robotic swarms, which collaborate together to accomplish a mission, and by presenting experimental results obtained over realistic scenarios in the underwater environment. The heuristic techniques resort to a Random-Keys encoding strategy to represent the allocation of robots to tasks and the relative execution order of such tasks within the schedule of certain robots. The obtained results reveal interesting differences in terms of Pareto optimality and spread between the algorithms considered in the benchmark, which are insightful for the selection of a proper task scheduler in real underwater campaigns.
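
    A minimal sketch of the Random-Keys decoding idea described in the two records above follows; the robot and task names and the key values are invented for illustration, and the multi-objective search that produces the keys is not modeled.

```python
# Random-Keys decoding: one real-valued key per task encodes both which robot runs
# the task (integer part) and its relative order on that robot (fractional part).
def decode(keys, robots):
    """keys: {task: key in [0, len(robots))} -> {robot: ordered task list}."""
    schedule = {r: [] for r in robots}
    for task, key in keys.items():
        robot = robots[int(key)]
        schedule[robot].append((key % 1.0, task))
    return {r: [t for _, t in sorted(items)] for r, items in schedule.items()}

keys = {"survey_A": 0.42, "survey_B": 0.17, "sample_C": 1.66, "dock": 1.05}
print(decode(keys, robots=["auv1", "auv2"]))
# {'auv1': ['survey_B', 'survey_A'], 'auv2': ['dock', 'sample_C']}
```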

  19. Improving patient access to an interventional US clinic.

    PubMed

    Steele, Joseph R; Clarke, Ryan K; Terrell, John A; Brightmon, Tonya R

    2014-01-01

    A continuous quality improvement project was conducted to increase patient access to a neurointerventional ultrasonography (US) clinic. The clinic was experiencing major scheduling delays because of an increasing patient volume. A multidisciplinary team was formed that included schedulers, medical assistants, nurses, technologists, and physicians. The team created an Ishikawa diagram of the possible causes of the long wait time to the next available appointment and developed a flowchart of the steps involved in scheduling and completing a diagnostic US examination and biopsy. The team then implemented a staged intervention that included adjustments to staffing and room use (stage 1); new procedures for scheduling same-day add-on appointments (stage 2); and a lead technician rotation to optimize patient flow, staffing, and workflow (stage 3). Six months after initiation of the intervention, the mean time to the next available appointment had decreased from 25 days at baseline to 1 day, and the number of available daily appointments had increased from 38 to 55. These improvements resulted from a coordinated provider effort and had a net present value of more than $275,000. This project demonstrates that structural changes in staffing, workflow, and room use can substantially reduce scheduling delays for critical imaging procedures. © RSNA, 2014.

  20. Interferometer. [high resolution

    NASA Technical Reports Server (NTRS)

    Breckinridge, J. B.; Norton, R. H.; Schindler, R. A. (Inventor)

    1981-01-01

    A high resolution interferometer is described. The interferometer is insensitive to slight misalignment of its elements, avoids channeling in the spectrum, generates a maximum equal path fringe contrast, produces an even two sided interferogram without critical matching of the wedge angles of the beamsplitter and compensator wedges, and is optically phase tunable. The interferometer includes a mirror along the path of each beam component produced by the beamsplitter, for reflecting the beam component from the beamsplitter to a corresponding retroreflector and for reflecting the beam returned by the retroreflector back to the beamsplitter. A wedge located along each beam component path is large enough to cover the retroreflector, so that each beam component passes through the wedge during movement towards the retroreflector and away therefrom.

  1. A link-adding strategy for transport efficiency of complex networks

    NASA Astrophysics Data System (ADS)

    Ma, Jinlong; Han, Weizhan; Guo, Qing; Wang, Zhenyong; Zhang, Shuai

    2016-12-01

    Transport efficiency is one of the critical parameters for evaluating the performance of a network. In this paper, we propose an improved efficient (IE) strategy to enhance the transport efficiency of complex networks by adding a fraction of links to an existing network based on each node’s local degree centrality and the shortest path length. Simulation results show that the proposed strategy yields higher traffic capacity and a shorter average shortest path length than the low-degree-first (LDF) strategy under the shortest-path routing protocol, improving the network's overall ability to handle and deliver traffic. These findings can help alleviate congestion and guide the design and optimization of realistic networks.
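
    The sketch below illustrates one plausible form of such a degree- and distance-based link-adding rule; the scoring function is an assumption for illustration, not the paper's IE strategy, and it relies on the networkx package.

```python
# Add links between non-adjacent pairs that combine low degree with a long shortest
# path, aiming to relieve hub congestion and shorten routes (assumed scoring rule).
import itertools
import networkx as nx

def add_links(G, n_new):
    candidates = []
    for u, v in itertools.combinations(G.nodes, 2):
        if G.has_edge(u, v):
            continue
        try:
            dist = nx.shortest_path_length(G, u, v)
        except nx.NetworkXNoPath:
            continue
        score = dist / (G.degree[u] * G.degree[v])   # prefer distant, low-degree pairs
        candidates.append((score, u, v))
    for _, u, v in sorted(candidates, reverse=True)[:n_new]:
        G.add_edge(u, v)
    return G

G = nx.barabasi_albert_graph(100, 2, seed=1)
print(nx.average_shortest_path_length(G))
add_links(G, n_new=20)
print(nx.average_shortest_path_length(G))   # should drop after the added links
```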

  2. Adaptation in protein fitness landscapes is facilitated by indirect paths

    PubMed Central

    Wu, Nicholas C; Dai, Lei; Olson, C Anders; Lloyd-Smith, James O; Sun, Ren

    2016-01-01

    The structure of fitness landscapes is critical for understanding adaptive protein evolution. Previous empirical studies on fitness landscapes were confined to either the neighborhood around the wild type sequence, involving mostly single and double mutants, or a combinatorially complete subgraph involving only two amino acids at each site. In reality, the dimensionality of protein sequence space is higher (20^L) and there may be higher-order interactions among more than two sites. Here we experimentally characterized the fitness landscape of four sites in protein GB1, containing 20^4 = 160,000 variants. We found that while reciprocal sign epistasis blocked many direct paths of adaptation, such evolutionary traps could be circumvented by indirect paths through genotype space involving gain and subsequent loss of mutations. These indirect paths alleviate the constraint on adaptive protein evolution, suggesting that the heretofore neglected dimensions of sequence space may change our views on how proteins evolve. DOI: http://dx.doi.org/10.7554/eLife.16965.001 PMID:27391790

  3. Network-Based Management Procedures.

    ERIC Educational Resources Information Center

    Buckner, Allen L.

    Network-based management procedures serve as valuable aids in organizational management, achievement of objectives, problem solving, and decisionmaking. Network techniques especially applicable to educational management systems are the program evaluation and review technique (PERT) and the critical path method (CPM). Other network charting…

  4. Fluor Hanford (FH) River Corridor Transition Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MCBRIDE, D.J.

    2002-08-28

    This Transition Plan defines the scope and schedule for actions that are critical to a smooth transition of the River Corridor scope of work and to ensuring that the transition is achieved as planned, with minimal or no impact on ongoing baseline activities.

  5. 47 CFR 90.676 - Transition administrator for reconfiguration of the 806-824/851-869 MHz band in order to separate...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... their systems and ensuring that estimates contain a firm work schedule. The Transition Administrator... available to eligible applicants in the Public Safety or Critical Infrastructure Industry Categories as set...

  6. An effective combined environment test facility

    NASA Technical Reports Server (NTRS)

    Deitch, A.

    1980-01-01

    A critical missile component required operational verification while subjected to combined environments within and beyond flight parameters. The testing schedule necessitated the design and fabrication of a test facility in order to provide the specified temperatures combined with humidity, altitude and vibration.

  7. ADAPT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reynolds, John; Jankovsky, Zachary; Metzroth, Kyle G

    2018-04-04

    The purpose of the ADAPT code is to generate Dynamic Event Trees (DET) using a user specified set of simulators. ADAPT can utilize any simulation tool which meets a minimal set of requirements. ADAPT is based on the concept of DET which uses explicit modeling of the deterministic dynamic processes that take place during a nuclear reactor plant system (or other complex system) evolution along with stochastic modeling. When DET are used to model various aspects of Probabilistic Risk Assessment (PRA), all accident progression scenarios starting from an initiating event are considered simultaneously. The DET branching occurs at user specified times and/or when an action is required by the system and/or the operator. These outcomes then decide how the dynamic system variables will evolve in time for each DET branch. Since two different outcomes at a DET branching may lead to completely different paths for system evolution, the next branching for these paths may occur not only at separate times, but can be based on different branching criteria. The computational infrastructure allows for flexibility in ADAPT to link with different system simulation codes, parallel processing of the scenarios under consideration, on-line scenario management (initiation as well as termination), analysis of results, and user friendly graphical capabilities. The ADAPT system is designed for a distributed computing environment; the scheduler can track multiple concurrent branches simultaneously. The scheduler is modularized so that the DET branching strategy can be modified (e.g. biasing towards the worst-case scenario/event). Independent database systems store data from the simulation tasks and the DET structure so that the event tree can be constructed and analyzed later. ADAPT is provided with a user-friendly client which can easily sort through and display the results of an experiment, precluding the need for the user to manually inspect individual simulator runs.
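
    As a toy illustration of dynamic-event-tree expansion (ADAPT itself drives external simulators through a distributed scheduler, none of which is modeled here; the state variables, outcomes, and probabilities below are invented):

```python
# Breadth-first expansion of a toy dynamic event tree: advance deterministic
# dynamics, then spawn one child branch per stochastic outcome at each branch point.
from collections import deque

def expand_det(initial_state, simulate, branch_rules, max_depth=3):
    """Return a list of (outcome path, final state) leaves."""
    tree = []
    queue = deque([([], initial_state)])
    while queue:
        path, state = queue.popleft()
        state = simulate(state)                 # deterministic evolution of this branch
        outcomes = branch_rules(state)          # e.g. {"pump_on": p1, "pump_fails": p2}
        if not outcomes or len(path) >= max_depth:
            tree.append((path, state))
            continue
        for outcome in outcomes:
            queue.append((path + [outcome], dict(state, last=outcome)))
    return tree

sim = lambda s: dict(s, t=s["t"] + 1)
rules = lambda s: {"pump_on": 0.9, "pump_fails": 0.1} if s["t"] < 2 else {}
print(expand_det({"t": 0}, sim, rules))
```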

  8. Effects of Napping During Shift Work on Sleepiness and Performance in Emergency Medical Services Personnel and Similar Shift Workers: A Systematic Review and Meta-Analysis.

    PubMed

    Martin-Gill, Christian; Barger, Laura K; Moore, Charity G; Higgins, J Stephen; Teasley, Ellen M; Weiss, Patricia M; Condle, Joseph P; Flickinger, Katharyn L; Coppler, Patrick J; Sequeira, Denisse J; Divecha, Ayushi A; Matthews, Margaret E; Lang, Eddy S; Patterson, P Daniel

    2018-02-15

    Scheduled napping during work shifts may be an effective way to mitigate fatigue-related risk. This study aimed to critically review and synthesize existing literature on the impact of scheduled naps on fatigue-related outcomes for EMS personnel and similar shift worker groups. A systematic literature review was performed of the impact of a scheduled nap during shift work on EMS personnel or similar shift workers. The primary (critical) outcome of interest was EMS personnel safety. Secondary (important) outcomes were patient safety; personnel performance; acute states of fatigue, alertness, and sleepiness; indicators of sleep duration and/or quality; employee retention/turnover; indicators of long-term health; and cost to the system. Meta-analyses were performed to evaluate the impact of napping on a measure of personnel performance (the psychomotor vigilance test [PVT]) and measures of acute fatigue. Of 4,660 unique records identified, 13 experimental studies were determined relevant and summarized. The effect of napping on reaction time measured at the end of shift was small and non-significant (SMD 0.12, 95% CI -0.13 to 0.36; p = 0.34). Napping during work did not change reaction time from the beginning to the end of the shift (SMD -0.01, 95% CI -25.0 to 0.24; p = 0.96). Naps had a moderate, significant effect on sleepiness measured at the end of shift (SMD 0.40, 95% CI 0.09 to 0.72; p = 0.01). The difference in sleepiness from the start to the end of shift was moderate and statistically significant (SMD 0.41, 95% CI 0.09 to 0.72; p = 0.01). Reviewed literature indicated that scheduled naps at work improved performance and decreased fatigue in shift workers. Further research is required to identify the optimal timing and duration of scheduled naps to maximize the beneficial outcomes.

  9. Which way and how far? Tracking of translation and rotation information for human path integration.

    PubMed

    Chrastil, Elizabeth R; Sherrill, Katherine R; Hasselmo, Michael E; Stern, Chantal E

    2016-10-01

    Path integration, the constant updating of the navigator's knowledge of position and orientation during movement, requires both visuospatial knowledge and memory. This study aimed to develop a systems-level understanding of human path integration by examining the basic building blocks of path integration in humans. To achieve this goal, we used functional imaging to examine the neural mechanisms that support the tracking and memory of translational and rotational components of human path integration. Critically, and in contrast to previous studies, we examined movement in translation and rotation tasks with no defined end-point or goal. Navigators accumulated translational and rotational information during virtual self-motion. Activity in hippocampus, retrosplenial cortex (RSC), and parahippocampal cortex (PHC) increased during both translation and rotation encoding, suggesting that these regions track self-motion information during path integration. These results address current questions regarding distance coding in the human brain. By implementing a modified delayed match to sample paradigm, we also examined the encoding and maintenance of path integration signals in working memory. Hippocampus, PHC, and RSC were recruited during successful encoding and maintenance of path integration information, with RSC selective for tasks that required processing heading rotation changes. These data indicate distinct working memory mechanisms for translation and rotation, which are essential for updating neural representations of current location. The results provide evidence that hippocampus, PHC, and RSC flexibly track task-relevant translation and rotation signals for path integration and could form the hub of a more distributed network supporting spatial navigation. Hum Brain Mapp 37:3636-3655, 2016. © 2016 Wiley Periodicals, Inc.

  10. Hippocampus and Retrosplenial Cortex Combine Path Integration Signals for Successful Navigation

    PubMed Central

    Erdem, Uğur M.; Ross, Robert S.; Brown, Thackery I.; Hasselmo, Michael E.; Stern, Chantal E.

    2013-01-01

    The current study used fMRI in humans to examine goal-directed navigation in an open field environment. We designed a task that required participants to encode survey-level spatial information and subsequently navigate to a goal location in either first person, third person, or survey perspectives. Critically, no distinguishing landmarks or goal location markers were present in the environment, thereby requiring participants to rely on path integration mechanisms for successful navigation. We focused our analysis on mechanisms related to navigation and mechanisms tracking linear distance to the goal location. Successful navigation required translation of encoded survey-level map information for orientation and implementation of a planned route to the goal. Our results demonstrate that successful first and third person navigation trials recruited the anterior hippocampus more than trials when the goal location was not successfully reached. When examining only successful trials, the retrosplenial and posterior parietal cortices were recruited for goal-directed navigation in both first person and third person perspectives. Unique to first person perspective navigation, the hippocampus was recruited to path integrate self-motion cues with location computations toward the goal location. Last, our results demonstrate that the hippocampus supports goal-directed navigation by actively tracking proximity to the goal throughout navigation. When using path integration mechanisms in first person and third person perspective navigation, the posterior hippocampus was more strongly recruited as participants approach the goal. These findings provide critical insight into the neural mechanisms by which we are able to use map-level representations of our environment to reach our navigational goals. PMID:24305826

  11. A dual-porous, biophysical void structure model of soil for the understanding of the conditions causing nitrous oxide emission

    NASA Astrophysics Data System (ADS)

    Matthews, G. Peter; Maurizio Laudone, G.; Whalle, W. Richard; Bird, Nigel; Gregory, Andrew; Cardenas, Laura; Misselbrook, Tom

    2010-05-01

    Nitrous oxide is the fourth most important greenhouse gas. It is 300 times more potent than carbon dioxide, and two-thirds of anthropogenic nitrous oxide is emitted by agricultural land. This presentation will begin with a brief overview of the laboratory measurements of nitrous oxide emission from carefully characterised soils, presented in more detail by Cardenas et al.. The measurements were made in a twelve-chamber, gas chromatographic apparatus at North Wyke Research (formerly IGER). The presentation will then continue with a description of a void network model of sufficient accuracy and authenticity that it can be used to explain and predict the nitrous oxide production, and the modelling of the biological, chemical and physical processes for the production of nitrous oxide within the constructed network. Finally, conclusions will be drawn from a comparison of the model results with experiment. The void network model Nitrous oxide is produced by microbial activity located in ‘hotspots' within the microstructure of soil, and nutrients and gases flow or diffuse to and from these hotspots through the water or gas-filled macro-porosity. It is clear, therefore, that a network model to describe and explain nitrous oxide production must encompass the full size range of pore space active within the process, which covers 6 orders of magnitude, and must make realistic suppositions about the positional relationship of the hotspots relative to the soil macro-porosity. Previous experimental (Tsakiroglou, C. D. et al, European J.Soil Sci., 2008) and theoretical approaches to the modelling of soil void structure cannot generally meet these two requirements. We have therefore built on the success of the previous uni-porous model of soil (Matthews, G. P. et al, Wat.Resour.Res, 2010), and the concept of a critical percolation path, to develop a dual porous model (Laudone, G. M. et al, European J.Soil Sci., 2010) with the following features: • A porous unit cell, with periodic boundary conditions, and with a critical percolation path with the correct percolation characteristics and void volume of the macro-porosity of the soil. • A solid phase between the pores of the large unit cell, with the correct volume of the fraction of larger soil aggregates (larger 1 mm). • All the remaining pores of the large unit cell, which are not part of the critical percolation path, filled with smaller unit cells, which account for the micro-porosity of the soil sample. We describe the construction of a model that closely matches the following characteristics of a specific example of typical arable soil, taken from the Warren field of the Rothamsted experimental farm at Woburn, although the model can be used for a wide range of soils: (i) macroporosity and microporosity as measured by the water retention curve, (ii) the shape of the water retention characteristic under a wide range of tensions, (iii) the soil texture, and (iv) the extent of irreducible water content. Process model We will describe the insertion of Michaelis-Menten kinetics and Crank-Nicholson diffusion equations into the precisely scaled model, building on previous diffusion modelling (Laudone, G. M. et al, Chem.Eng.Sci., 2008). 
Comparison with experiment: A comparison with experimental results sheds light on (i) the positional relationships of aerobic and anaerobic bacteria relative to the critical percolation path, (ii) the relationship between the critical percolation path and the preferential/critical flow path (Figure 4), (iii) the extent of ignorance about the reaction kinetics of some of the fundamental processes occurring, (iv) the soil conditions that cause nitrous oxide emission, and (v) the effect of soil compaction on the emission. Acknowledgement: This presentation is a summary of some of the work of the BBSRC-funded U.K. soil research consortium "Soil Programme for Quality and Resilience" (BB/E001793/1 and others), of which Matthews is principal investigator.
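
    As a hedged illustration of the kind of process modeling mentioned above, the sketch below advances a 1-D concentration profile with one Crank-Nicolson diffusion step plus a Michaelis-Menten sink; the grid, diffusivity, and kinetic constants are invented and this is not the consortium's network model.

```python
# One Crank-Nicolson step of 1-D diffusion with a Michaelis-Menten consumption term,
# as a toy stand-in for transport of nutrients towards microbial "hotspots".
import numpy as np

def cn_step(c, D, dx, dt, vmax, km):
    """Advance the concentration profile c by one time step."""
    n = len(c)
    r = D * dt / (2 * dx**2)
    A = np.eye(n) * (1 + 2 * r) + np.eye(n, k=1) * (-r) + np.eye(n, k=-1) * (-r)
    B = np.eye(n) * (1 - 2 * r) + np.eye(n, k=1) * r + np.eye(n, k=-1) * r
    A[0, :], A[-1, :] = 0, 0
    A[0, 0] = A[-1, -1] = 1             # fixed-concentration boundaries
    rhs = B @ c
    rhs[0], rhs[-1] = c[0], c[-1]
    sink = vmax * c / (km + c)          # Michaelis-Menten consumption (explicit split)
    return np.linalg.solve(A, rhs) - dt * sink

c = np.linspace(1.0, 0.0, 21)           # nutrient gradient into an aggregate
for _ in range(100):
    c = cn_step(c, D=1e-2, dx=0.05, dt=0.1, vmax=0.01, km=0.5)
print(c.round(3))
```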

  12. The Role of Datasets on Scientific Influence within Conflict Research

    PubMed Central

    Van Holt, Tracy; Johnson, Jeffery C.; Moates, Shiloh; Carley, Kathleen M.

    2016-01-01

    We inductively tested if a coherent field of inquiry in human conflict research emerged in an analysis of published research involving “conflict” in the Web of Science (WoS) over a 66-year period (1945–2011). We created a citation network that linked the 62,504 WoS records and their cited literature. We performed a critical path analysis (CPA), a specialized social network analysis on this citation network (~1.5 million works), to highlight the main contributions in conflict research and to test if research on conflict has in fact evolved to represent a coherent field of inquiry. Out of this vast dataset, 49 academic works were highlighted by the CPA suggesting a coherent field of inquiry; which means that researchers in the field acknowledge seminal contributions and share a common knowledge base. Other conflict concepts that were also analyzed—such as interpersonal conflict or conflict among pharmaceuticals, for example, did not form their own CP. A single path formed, meaning that there was a cohesive set of ideas that built upon previous research. This is in contrast to a main path analysis of conflict from 1957–1971 where ideas didn’t persist in that multiple paths existed and died or emerged reflecting lack of scientific coherence (Carley, Hummon, and Harty, 1993). The critical path consisted of a number of key features: 1) Concepts that built throughout include the notion that resource availability drives conflict, which emerged in the 1960s-1990s and continued on until 2011. More recent intrastate studies that focused on inequalities emerged from interstate studies on the democracy of peace earlier on the path. 2) Recent research on the path focused on forecasting conflict, which depends on well-developed metrics and theories to model. 3) We used keyword analysis to independently show how the CP was topically linked (i.e., through democracy, modeling, resources, and geography). Publically available conflict datasets developed early on helped shape the operationalization of conflict. In fact, 94% of the works on the CP that analyzed data either relied on publically available datasets, or they generated a dataset and made it public. These datasets appear to be important in the development of conflict research, allowing for cross-case comparisons, and comparisons to previous works. PMID:27124569

  13. The Role of Datasets on Scientific Influence within Conflict Research.

    PubMed

    Van Holt, Tracy; Johnson, Jeffery C; Moates, Shiloh; Carley, Kathleen M

    2016-01-01

    We inductively tested if a coherent field of inquiry in human conflict research emerged in an analysis of published research involving "conflict" in the Web of Science (WoS) over a 66-year period (1945-2011). We created a citation network that linked the 62,504 WoS records and their cited literature. We performed a critical path analysis (CPA), a specialized social network analysis on this citation network (~1.5 million works), to highlight the main contributions in conflict research and to test if research on conflict has in fact evolved to represent a coherent field of inquiry. Out of this vast dataset, 49 academic works were highlighted by the CPA suggesting a coherent field of inquiry; which means that researchers in the field acknowledge seminal contributions and share a common knowledge base. Other conflict concepts that were also analyzed-such as interpersonal conflict or conflict among pharmaceuticals, for example, did not form their own CP. A single path formed, meaning that there was a cohesive set of ideas that built upon previous research. This is in contrast to a main path analysis of conflict from 1957-1971 where ideas didn't persist in that multiple paths existed and died or emerged reflecting lack of scientific coherence (Carley, Hummon, and Harty, 1993). The critical path consisted of a number of key features: 1) Concepts that built throughout include the notion that resource availability drives conflict, which emerged in the 1960s-1990s and continued on until 2011. More recent intrastate studies that focused on inequalities emerged from interstate studies on the democracy of peace earlier on the path. 2) Recent research on the path focused on forecasting conflict, which depends on well-developed metrics and theories to model. 3) We used keyword analysis to independently show how the CP was topically linked (i.e., through democracy, modeling, resources, and geography). Publically available conflict datasets developed early on helped shape the operationalization of conflict. In fact, 94% of the works on the CP that analyzed data either relied on publically available datasets, or they generated a dataset and made it public. These datasets appear to be important in the development of conflict research, allowing for cross-case comparisons, and comparisons to previous works.
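
    Critical/main path analysis of a citation network, as used in the two records above, can be sketched as follows; this is a common search-path-count (SPC) variant, not necessarily the exact weighting used in the study, and it assumes the networkx package.

```python
# Search-path-count (SPC) main-path extraction on a citation DAG
# (edges point from cited to citing work).
import networkx as nx

def spc_main_path(G):
    order = list(nx.topological_sort(G))
    to_here = {v: (1 if G.in_degree(v) == 0 else 0) for v in order}    # path counts from sources
    for v in order:
        for u in G.predecessors(v):
            to_here[v] += to_here[u]
    from_here = {v: (1 if G.out_degree(v) == 0 else 0) for v in order}  # path counts to sinks
    for v in reversed(order):
        for w in G.successors(v):
            from_here[v] += from_here[w]
    weight = {(u, v): to_here[u] * from_here[v] for u, v in G.edges}    # SPC edge weights
    sources = [v for v in G if G.in_degree(v) == 0]
    start = max(sources, key=lambda s: max(weight[(s, w)] for w in G.successors(s)))
    path, u = [start], start
    while G.out_degree(u):                      # greedily follow the heaviest edge
        u = max(G.successors(u), key=lambda w, u=u: weight[(u, w)])
        path.append(u)
    return path

G = nx.DiGraph([("A", "B"), ("A", "C"), ("B", "D"), ("C", "D"), ("D", "E")])
print(spc_main_path(G))   # ['A', 'B', 'D', 'E'] (or the equally weighted route via 'C')
```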

  14. Preclinical Determinants of Drug Choice under Concurrent Schedules of Drug Self-Administration

    PubMed Central

    Banks, Matthew L.; Negus, S. Stevens

    2012-01-01

    Drug self-administration procedures have played a critical role in the experimental analysis of psychoactive compounds, such as cocaine, for over 50 years. While there are numerous permutations of this procedure, this paper will specifically focus on choice procedures using concurrent schedules of intravenous drug self-administration. The aims of this paper are to first highlight the evolution of drug choice procedures and then review the subsequent preclinical body of literature utilizing these choice procedures to understand the environmental, pharmacological, and biological determinants of the reinforcing stimulus effects of drugs. A main rationale for this paper is our proposition that choice schedules are underutilized in investigating the reinforcing effects of drugs in assays of drug self-administration. Moreover, we will conclude with potential future directions and unexplored scientific space for the use of drug choice procedures. PMID:23243420

  15. Energy-aware scheduling of surveillance in wireless multimedia sensor networks.

    PubMed

    Wang, Xue; Wang, Sheng; Ma, Junjie; Sun, Xinyao

    2010-01-01

    Wireless sensor networks involve a large number of sensor nodes with a limited energy supply, which constrains the behavior of their applications. In wireless multimedia sensor networks, sensor nodes are equipped with audio and visual information collection modules, and multimedia content is ubiquitously retrieved in surveillance applications. To address the energy problems of target surveillance with wireless multimedia sensor networks, an energy-aware sensor scheduling method is proposed in this paper. Sensor nodes that acquire acoustic signals are deployed randomly in the sensing field. Target localization is based on the signal energy feature provided by multiple sensor nodes, employing particle swarm optimization (PSO). During the target surveillance procedure, sensor nodes are adaptively grouped in a fully distributed manner. Specifically, the target motion information is extracted by a forecasting algorithm based on a hidden Markov model (HMM), and the forecasting results are used to awaken sensor nodes in the vicinity of the future target position. Based on two properties, the signal energy feature and residual energy, each sensor node independently decides whether to participate in target detection using a fuzzy control approach. The local routing scheme for data transmission towards the observer is also discussed. Experimental results demonstrate the efficiency of energy-aware scheduling of surveillance in wireless multimedia sensor networks, where significant energy savings are achieved by the sensor awakening approach and data transmission paths are calculated with low computational complexity.
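
    As a rough illustration of the PSO-based, energy-feature localization step mentioned above (not the authors' implementation), the sketch below fits a target position to simulated acoustic energy readings; the sensor layout, energy model, and PSO constants are all assumed.

```python
# Sketch of particle swarm optimisation for energy-based target localisation.
# All constants and the 1/d^2 energy model are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
sensors = rng.uniform(0, 100, size=(12, 2))            # random node positions
target = np.array([62.0, 35.0])

def model(pos, nodes):                                  # acoustic energy ~ 1/d^2
    return 1e4 / (np.sum((nodes - pos) ** 2, axis=1) + 1.0)

readings = model(target, sensors) + rng.normal(0, 0.05, len(sensors))

def fitness(p):                                         # residual energy mismatch
    return np.sum((model(p, sensors) - readings) ** 2)

def pso(n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    x = rng.uniform(0, 100, size=(n_particles, 2))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([fitness(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((n_particles, 1)), rng.random((n_particles, 1))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, 0, 100)
        f = np.array([fitness(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest

print("estimated target:", pso(), "true target:", target)
```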

  16. New Opportunities for Funding Dialysis-Dependent Undocumented Individuals

    PubMed Central

    2017-01-01

    The cost of dialysis for the estimated 6500 dialysis-dependent undocumented individuals with kidney failure in the United States is high, the quality of dialysis care they receive is poor, and their treatment varies regionally. Some regions use state and matched federal funds to cover regularly scheduled dialysis treatments, while others provide treatment only in emergent life-threatening conditions. Nephrologists caring for patients who receive emergent dialysis are tasked with the difficult moral dilemma of determining “who gets dialysis that day.” Without a path to citizenship and by exclusion from the federal marketplace exchanges, undocumented individuals have limited options for their treatment. A novel opportunity to provide scheduled dialysis for this population is through the purchase of insurance off the exchange. Plans purchased off the exchange must still abide by the 2014 provision of the Patient Protection and Affordable Care Act, which prohibits insurance companies from denying coverage based on a preexisting health condition. In 2015 and 2016, >100 patients previously receiving only emergent dialysis at the two largest safety-net hospital systems in Texas obtained off-the-exchange commercial health insurance plans. These undocumented patients now receive scheduled dialysis treatments, which has improved their care and quality of life, as well as decompressed the overburdened hospital systems. The long-term sustainability of this option is not known. Socially responsive and visionary policymakers allowing the move into this bold, new direction deserve special appreciation. PMID:27577244

  17. A 48Cycles/MB H.264/AVC Deblocking Filter Architecture for Ultra High Definition Applications

    NASA Astrophysics Data System (ADS)

    Zhou, Dajiang; Zhou, Jinjia; Zhu, Jiayi; Goto, Satoshi

    In this paper, a highly parallel deblocking filter architecture for H.264/AVC is proposed that processes one macroblock in 48 clock cycles and gives real-time support to QFHD@60fps sequences at less than 100 MHz. Four edge filters, organized in two groups that simultaneously process vertical and horizontal edges, are applied in this architecture to enhance its throughput. As parallelism increases, pipeline hazards arise owing to the latency of the edge filters and the data dependencies of the deblocking algorithm. To solve this problem, a zig-zag processing schedule is proposed that eliminates the pipeline bubbles. The data path of the architecture is then derived according to the processing schedule and optimized through data flow merging to minimize the cost of logic and internal buffering. Meanwhile, the architecture's data input rate is designed to be identical to its throughput, and the transmission order of the input data matches the zig-zag processing schedule, so no intercommunication buffer is required between the deblocking filter and its upstream component for speed matching or data reordering. As a result, only one 24×64 two-port SRAM is required as the internal buffer in this design. When synthesized with a SMIC 130 nm process, the architecture has a gate count of 30.2k, which is competitive considering its high performance.

  18. Software Safety Progress in NASA

    NASA Technical Reports Server (NTRS)

    Radley, Charles F.

    1995-01-01

    NASA has developed guidelines for development and analysis of safety-critical software. These guidelines have been documented in a Guidebook for Safety Critical Software Development and Analysis. The guidelines represent a practical 'how to' approach, to assist software developers and safety analysts in cost effective methods for software safety. They provide guidance in the implementation of the recent NASA Software Safety Standard NSS-1740.13 which was released as 'Interim' version in June 1994, scheduled for formal adoption late 1995. This paper is a survey of the methods in general use, resulting in the NASA guidelines for safety critical software development and analysis.

  19. Yes: The Symptoms of OCD and Depression Are Discrete and Not Exclusively Negative Affectivity

    PubMed Central

    Moore, Kathleen A.; Howell, Jacqui

    2017-01-01

    Although Obsessive-Compulsive Disorder (OCD) and Depression are classified as separate disorders, the high incidence of co-morbidity and the strong correlations between measures of each have led to debate about the nature of their relationship. Some authors have proposed that OCD is in fact a mood disorder while others have suggested that the two disorders are grounded in negative affectivity. A third proposition is that depression is an essential part of OCD but that OCD is a separate disorder from depression. The aim in this study was to investigate these diverse propositions in a non-clinical sample and also to determine whether factors implicated in each, that is anxious and depressive cognitions, hopelessness, and self-criticism, would demonstrate commonality as predictors of the symptoms of OCD and of depression. Two hundred participants (59% female; M age = 34 years, SD = 16) completed the Padua Inventory, Carroll Rating Scale, Cognitions Checklist, Self-Criticism Scale, Beck Hopelessness Scale, Buss-Durkee Hostility Inventory-Revised and a Negative Affectivity Schedule. Results indicated strong correlations between OCD and depression and between depression and negative affectivity, but a weaker relationship between OCD and negative affectivity. Path analyses revealed that anxious and depressive cognitions, as well as hostility, predicted both disorders, but the beta weights were stronger for OCD. Self-criticism predicted only depression, while hopelessness failed to predict either disorder but was itself predicted by depressive cognitions. Depression was a stronger indicator of negative affect than OCD, and while OCD positively predicted depression, depression was a negative indicator of OCD. These results support the hypothesis that OCD and depression are discrete disorders and indicate that while depression is implicated in OCD, the reverse does not hold. While both disorders are related to negative affectivity, this relationship is much stronger for depression, thus failing to confirm that both are subsumed by a common factor, in this case negative affectivity. The proposition that depression is part of OCD but that OCD is not necessarily implicated in depression and is, in fact, a separate disorder, is supported by the current model. Further research is required to support the utility of the model in clinical samples. PMID:28553250

  20. Critical consciousness and intent to persist through college in DACA and U.S. citizen students: The role of immigration status, race, and ethnicity.

    PubMed

    Cadenas, Germán A; Bernstein, Bianca L; Tracey, Terence J G

    2018-05-21

    We used the model of critical consciousness (CC; Freire, 1973) to examine college persistence in a sample of Hispanic Deferred Action for Childhood Arrivals (DACA) college students in contrast to Hispanic and non-Hispanic White U.S. citizens. To do this, we looked to social cognitive career theory (Lent, Brown, & Hackett, 1994) to clarify the development of CC and its association with college persistence in students facing marginalization due to immigration status and racial/ethnic identity. The sample consisted of 368 undergraduate college students, including 89 Hispanic DACA recipients, 88 Hispanics with U.S. citizenship, and 191 non-Hispanic Whites with U.S. citizenship. Students completed scales on intent to persist in college, political self-efficacy, political outcome expectations, critical reflection, critical action, and supports and barriers for critical action. The data were examined using multigroup structural equation modeling; goodness of fit indices suggested good model fit for all groups. Tests of structural invariance revealed that 7 relational paths were equal across student groups, while race/ethnicity and immigration status differentiated the strength of 7 paths. Our findings indicate that there are differences in how Hispanic DACA students experience CC in relation to support for their political advocacy and activism. Findings also highlight that political outcome expectations predicted higher intent to persist in college for all students, including Hispanic DACA students. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  1. The path dependency theory: analytical framework to study institutional integration. The case of France.

    PubMed

    Trouvé, Hélène; Couturier, Yves; Etheridge, Francis; Saint-Jean, Olivier; Somme, Dominique

    2010-06-30

    The literature on integration indicates the need for an enhanced theorization of institutional integration. This article proposes path dependence as an analytical framework to study the systems in which integration takes place. PRISMA proposes a model for integrating health and social care services for older adults. This model was initially tested in Quebec. The PRISMA France study gave us an opportunity to analyze institutional integration in France. A qualitative approach was used. Analyses were based on semi-structured interviews with actors of all levels of decision-making, observations of advisory board meetings, and administrative documents. Our analyses revealed the complexity and fragmentation of institutional integration. The path dependency theory, which analyzes the change capacity of institutions by taking into account their historic structures, allows analysis of this situation. The path dependency to the Bismarckian system and the incomplete reforms of gerontological policies generate the coexistence and juxtaposition of institutional systems. In such a context, no institution has sufficient ability to determine gerontology policy and build institutional integration by itself. Using path dependence as an analytical framework helps to understand the reasons why institutional integration is critical to organizational and clinical integration, and the complex construction of institutional integration in France.

  2. Merging Hydrologic, Geochemical, and Geophysical Approaches to Understand the Regolith Architecture of a Deeply Weathered Piedmont Critical Zone

    NASA Astrophysics Data System (ADS)

    Cosans, C.; Moore, J.; Harman, C. J.

    2017-12-01

    Located in the deeply weathered Piedmont in Maryland, Pond Branch has a rich legacy of hydrological and geochemical research dating back to the first geochemical mass balance study published in 1970. More recently, geophysical investigations including seismic and electrical resistivity tomography have characterized the subsurface at Pond Branch and contributed to new hypotheses about critical zone evolution. Heterogeneity in electrical resistivity in the shallow subsurface may suggest disparate flow paths for recharge, with some regions with low hydraulic conductivity generating perched flow, while other hillslope sections recharge to the much deeper regolith boundary. These shallow and deep flow paths are hypothesized to be somewhat hydrologically and chemically connected, with the spatially and temporally discontinuous connections resulting in different hydraulic responses to recharge and different concentrations of weathering solutes. To test this hypothesis, we combined modeling and field approaches. We modeled weathering solutes along the hypothesized flow paths using PFLOTRAN. We measured hydrologic gradients in the hillslopes and riparian zone using piezometer water levels. We collected geochemical data including major ions and silica. Weathering solute concentrations were measured directly in the precipitation, hillslope springs, and the riparian zone for comparison to modeled concentration values. End member mixing methods were used to determine contributions of precipitation, hillslopes, and riparian zone to the stream. Combining geophysical, geochemical, and hydrological methods may offer insights into the source of stream water and controls on chemical weathering. Previous hypotheses that Piedmont critical zone architecture results from a balance of erosion, soil, and weathering front advance rates cannot account for the inverted regolith structure observed through seismic investigations at Pond Branch. Recent alternative hypotheses including weathering along tectonically-induced fractures and weathering front advance have been proposed, but additional data are needed to test them. Developing a thorough, nuanced understanding of the geochemical and hydrological behavior of Pond Branch may help test and refine hypotheses for Piedmont critical zone evolution.

  3. Naturally Speaking: The Naturalness Criterion and Physics at the LHC

    NASA Astrophysics Data System (ADS)

    Giudice, Gian Francesco

    The following sections are included: * Naturalness in Scientific Thought * Drowning by Numbers * A Quantum Complication * The Naturalness Criterion as a Principle * An Account of Events * The Paths Chosen by Nature * Measuring Naturalness * Anthropic Reasoning * Naturalness versus Criticality * Conclusions * References

  4. 3 CFR 8991 - Proclamation 8991 of May 31, 2013. National Oceans Month, 2013

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... critical role in nearly every part of our national life. They connect us to countries around the world, and... promote economic growth. The plan charts a path to better decision-making through science and data sharing...

  5. Global paths of time-periodic solutions of the Benjamin-Ono equation connecting arbitrary traveling waves

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ambrose, David M.; Wilkening, Jon

    2008-12-11

    We classify all bifurcations from traveling waves to non-trivial time-periodic solutions of the Benjamin-Ono equation that are predicted by linearization. We use a spectrally accurate numerical continuation method to study several paths of non-trivial solutions beyond the realm of linear theory. These paths are found to either re-connect with a different traveling wave or to blow up. In the latter case, as the bifurcation parameter approaches a critical value, the amplitude of the initial condition grows without bound and the period approaches zero. We propose a conjecture that gives the mapping from one bifurcation to its counterpart on the other side of the path of non-trivial solutions. By experimentation with data fitting, we identify the form of the exact solutions on the path connecting two traveling waves, which represents the Fourier coefficients of the solution as power sums of a finite number of particle positions whose elementary symmetric functions execute simple orbits in the complex plane (circles or epicycles). We then solve a system of algebraic equations to express the unknown constants in the new representation in terms of the mean, a spatial phase, a temporal phase, four integers (enumerating the bifurcation at each end of the path) and one additional bifurcation parameter. We also find examples of interior bifurcations from these paths of already non-trivial solutions, but we do not attempt to analyze their algebraic structure.

  6. Quality of Work Life Issues for the 1980s.

    ERIC Educational Resources Information Center

    Rosow, Jerome M.

    1981-01-01

    Considers some of the economic, sociological, technological, and psychological factors influencing the shape of work in America during the 1980s. Critical issues are pay, employee benefits, job security, alternative work schedules, occupational stress, economic participation, and democracy in the workplace. (CT)

  7. 14 CFR 1214.805 - Unforeseen customer delay.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... problem pose a threat of delay to the Shuttle launch schedule or critical off-line activities, NASA shall... availability of facilities, equipment, and personnel. In requesting NASA to make such special efforts, the customer shall agree to reimburse NASA the estimated additional cost incurred. ...

  8. 14 CFR 1214.805 - Unforeseen customer delay.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... problem pose a threat of delay to the Shuttle launch schedule or critical off-line activities, NASA shall... availability of facilities, equipment, and personnel. In requesting NASA to make such special efforts, the customer shall agree to reimburse NASA the estimated additional cost incurred. ...

  9. 14 CFR 1214.805 - Unforeseen customer delay.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... problem pose a threat of delay to the Shuttle launch schedule or critical off-line activities, NASA shall... availability of facilities, equipment, and personnel. In requesting NASA to make such special efforts, the customer shall agree to reimburse NASA the estimated additional cost incurred. ...

  10. Report: EPA Plans for Managing Counter Terrorism/ Emergency Response Equipment and Protecting Critical Assets Not Fully Implemented

    EPA Pesticide Factsheets

    Report #09-P-0087, January 27, 2009. EPA has progressed in implementing the counter terrorism/emergency response (CT/ER) initiatives, but is behind schedule in implementing the Radiation Ambient Monitoring (RadNet) System.

  11. 46 CFR 160.135-9 - Preapproval review.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... lay-up schedule for a Fiber Reinforced Plastic (FRP) lifeboat; (7) Hull and canopy construction..., fiberglass, cloth, and plastic used in the lifeboat's manufacture; (10) Fabrication details for each major... water spray systems drawings and specifications, if installed; (16) Plans for critical subassemblies...

  12. 46 CFR 160.135-9 - Preapproval review.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... lay-up schedule for a Fiber Reinforced Plastic (FRP) lifeboat; (7) Hull and canopy construction..., fiberglass, cloth, and plastic used in the lifeboat's manufacture; (10) Fabrication details for each major... water spray systems drawings and specifications, if installed; (16) Plans for critical subassemblies...

  13. 46 CFR 160.135-9 - Preapproval review.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... lay-up schedule for a Fiber Reinforced Plastic (FRP) lifeboat; (7) Hull and canopy construction..., fiberglass, cloth, and plastic used in the lifeboat's manufacture; (10) Fabrication details for each major... water spray systems drawings and specifications, if installed; (16) Plans for critical subassemblies...

  14. Spaceborne Gravity Gradiometers

    NASA Technical Reports Server (NTRS)

    Wells, W. C. (Editor)

    1984-01-01

    The current status of gravity gradiometers and technology that could be available in the 1990's for the GRAVSAT-B mission are assessed. Problems associated with sensors, testing, spacecraft, and data processing are explored as well as critical steps, schedule, and cost factors in the development plan.

  15. Mission-directed path planning for planetary rover exploration

    NASA Astrophysics Data System (ADS)

    Tompkins, Paul

    2005-07-01

    Robotic rovers uniquely benefit planetary exploration---they enable regional exploration with the precision of in-situ measurements, a combination impossible from an orbiting spacecraft or fixed lander. Mission planning for planetary rover exploration currently utilizes sophisticated software for activity planning and scheduling, but simplified path planning and execution approaches tailored for localized operations to individual targets. This approach is insufficient for the investigation of multiple, regionally distributed targets in a single command cycle. Path planning tailored for this task must consider the impact of large scale terrain on power, speed and regional access; the effect of route timing on resource availability; the limitations of finite resource capacity and other operational constraints on vehicle range and timing; and the mutual influence between traverses and upstream and downstream stationary activities. Encapsulating this reasoning in an efficient autonomous planner would allow a rover to continue operating rationally despite significant deviations from an initial plan. This research presents mission-directed path planning that enables an autonomous, strategic reasoning capability for robotic explorers. Planning operates in a space of position, time and energy. Unlike previous hierarchical approaches, it treats these dimensions simultaneously to enable globally-optimal solutions. The approach calls on a near incremental search algorithm designed for planning and re-planning under global constraints, in spaces of higher than two dimensions. Solutions under this method specify routes that avoid terrain obstacles, optimize the collection and use of rechargeable energy, satisfy local and global mission constraints, and account for the time and energy of interleaved mission activities. Furthermore, the approach efficiently re-plans in response to updates in vehicle state and world models, and is well suited to online operation aboard a robot. Simulations show that the new methodology succeeds where conventional path planners would fail. Three planetary-relevant field experiments demonstrate the power of mission-directed path planning in directing actual exploration robots. Offline mission-directed planning sustained a solar-powered rover in a 24-hour sun-synchronous traverse. Online planning and re-planning enabled full navigational autonomy of over 1 kilometer, and supported the execution of science activities distributed over hundreds of meters.

  16. Combined Cycle Engine Large-Scale Inlet for Mode Transition Experiments: System Identification Rack Hardware Design

    NASA Technical Reports Server (NTRS)

    Thomas, Randy; Stueber, Thomas J.

    2013-01-01

    The System Identification (SysID) Rack is a real-time hardware-in-the-loop data acquisition (DAQ) and control instrument rack that was designed and built to support inlet testing in the NASA Glenn Research Center 10- by 10-Foot Supersonic Wind Tunnel. This instrument rack is used to support experiments on the Combined-Cycle Engine Large-Scale Inlet for Mode Transition Experiment (CCE-LIMX). The CCE-LIMX is a testbed for an integrated dual flow-path inlet configuration with the two flow paths in an over-and-under arrangement such that the high-speed flow path is located below the low-speed flow path. The CCE-LIMX includes multiple actuators that are designed to redirect airflow from one flow path to the other; this action is referred to as "inlet mode transition." Multiple phases of experiments have been planned to support research that investigates inlet mode transition: inlet characterization (Phase-1) and system identification (Phase-2). The SysID Rack hardware design met the following requirements to support Phase-1 and Phase-2 experiments: safely and effectively move multiple actuators individually or synchronously; sample and save effector control and position sensor feedback signals; automate control of actuator positioning based on a mode transition schedule; sample and save pressure sensor signals; and perform DAQ and control processes operating at 2.5 kHz. This document describes the hardware components used to build the SysID Rack including their function, specifications, and system interface. Furthermore, provided in this document are a SysID Rack effectors signal list (signal flow); system identification experiment setup; illustrations indicating a typical SysID Rack experiment; and a SysID Rack performance overview for Phase-1 and Phase-2 experiments. The SysID Rack described in this document was a useful tool to meet the project objectives.

  17. Game injuries in relation to game schedules in the National Basketball Association.

    PubMed

    Teramoto, Masaru; Cross, Chad L; Cushman, Daniel M; Maak, Travis G; Petron, David J; Willick, Stuart E

    2017-03-01

    Injury management is critical in the National Basketball Association (NBA), as players experience a wide variety of injuries. Recently, it has been suggested that game schedules, such as back-to-back games and four games in five days, increase the risk of injuries in the NBA. The aim of this study was to examine the association between game schedules and player injuries in the NBA. Descriptive epidemiology study. The present study analyzed game injuries and game schedules in the 2012-13 through 2014-15 regular seasons. Game injuries by game schedules and players' profiles were examined using an exact binomial test, the Fisher's exact test and the Mann-Whitney-Wilcoxon test. A Poisson regression analysis was performed to predict the number of game injuries sustained by each player from game schedules and injured players' profiles. There were a total of 681 cases of game injuries sustained by 280 different players during the three years (total N=1443 players). Playing back-to-back games or playing four games in five days alone was not associated with an increased rate of game injuries, whereas a significant positive association was found between game injuries and playing away from home (p<0.05). Playing back-to-back games and away games were significant predictors of frequent game injuries (p<0.05). Game schedules could be one factor that impacts the risk of game injuries in the NBA. The findings could be useful for designing optimal game schedules in the NBA as well as helping NBA teams make adjustments to minimize game injuries. Copyright © 2016 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
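
    For readers unfamiliar with the modeling step, a minimal Poisson-regression sketch of the kind described (per-player injury counts regressed on schedule predictors) is shown below using statsmodels; the data are synthetic and the variable names are illustrative, not the study's.

```python
# Minimal sketch (synthetic data, illustrative variable names): Poisson
# regression of per-player injury counts on schedule-related predictors.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 280
df = pd.DataFrame({
    "back_to_back_games": rng.integers(5, 25, n),   # games on zero days' rest
    "away_games": rng.integers(30, 45, n),
    "age": rng.integers(19, 38, n),
})
# Synthetic outcome so the example runs end to end.
lam = np.exp(-2.0 + 0.03 * df["back_to_back_games"] + 0.02 * df["away_games"])
df["injuries"] = rng.poisson(lam)

X = sm.add_constant(df[["back_to_back_games", "away_games", "age"]])
model = sm.GLM(df["injuries"], X, family=sm.families.Poisson()).fit()
print(model.summary())
print(np.exp(model.params))  # incidence rate ratios per unit increase
```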

  18. Spatial-spectral characterization of focused spatially chirped broadband laser beams.

    PubMed

    Greco, Michael J; Block, Erica; Meier, Amanda K; Beaman, Alex; Cooper, Samuel; Iliev, Marin; Squier, Jeff A; Durfee, Charles G

    2015-11-20

    Proper alignment is critical to obtain the desired performance from focused spatially chirped beams, for example in simultaneous spatial and temporal focusing (SSTF). We present a simple technique for inspecting the beam paths and focusing conditions for the spectral components of a broadband beam. We spectrally resolve the light transmitted past a knife edge as it is scanned across the beam at several axial positions. The measurement yields information about spot size, M², and the propagation paths of different frequency components. We also present calculations to illustrate the effects of defocus aberration on SSTF beams.

  19. Universal Health Coverage - The Critical Importance of Global Solidarity and Good Governance Comment on "Ethical Perspective: Five Unacceptable Trade-offs on the Path to Universal Health Coverage".

    PubMed

    Reis, Andreas A

    2016-06-07

    This article provides a commentary on Ole Norheim's editorial entitled "Ethical perspective: Five unacceptable trade-offs on the path to universal health coverage." It reinforces its message that an inclusive, participatory process is essential for ethical decision-making and underlines the crucial importance of good governance in setting fair priorities in healthcare. Solidarity on both national and international levels is needed to make progress towards the goal of universal health coverage (UHC). © 2016 by Kerman University of Medical Sciences.

  20. Low-latency optical parallel adder based on a binary decision diagram with wavelength division multiplexing scheme

    NASA Astrophysics Data System (ADS)

    Shinya, A.; Ishihara, T.; Inoue, K.; Nozaki, K.; Kita, S.; Notomi, M.

    2018-02-01

    We propose an optical parallel adder based on a binary decision diagram that can calculate simply by propagating light through electrically controlled optical pass gates. The CARRY and CARRY-bar (complement) operations are multiplexed in one circuit by a wavelength division multiplexing scheme to reduce the number of optical elements, and only a single gate constitutes the critical path for one digit calculation. The processing time reaches picoseconds per digit when we use 100-μm-long optical pass gates, which is ten times faster than a CMOS circuit.

  1. STS-113 Space Shuttle Endeavour launch

    NASA Technical Reports Server (NTRS)

    2002-01-01

    KENNEDY SPACE CENTER, FLA. - Water near Launch Pad 39A provides a mirror image of Space Shuttle Endeavour blazing a path into the night sky after launch on mission STS-113. Liftoff occurred on time at 7:49:47 p.m. EST. The launch is the 19th for Endeavour, and the 112th flight in the Shuttle program. Mission STS-113 is the 16th assembly flight to the International Space Station, carrying another structure for the Station, the P1 integrated truss. Also onboard are the Expedition 6 crew, who will replace Expedition 5. Endeavour is scheduled to land at KSC after an 11-day journey.

  2. STS-109 Crew Interview: Grunsfeld

    NASA Technical Reports Server (NTRS)

    2002-01-01

    STS-109 Payload Commander John Grunsfeld is seen during a prelaunch interview answering questions about his inspiration to become an astronaut and his career path. He gives details on the mission's goal (which is to service the Hubble Space Telescope (HST)), his role during the mission, the five scheduled spacewalks, the Columbia Orbiter's recent upgrades, and what he sees as the challenges of the mission. Grunsfeld describes how his experience on the STS-103 mission, a previous HST servicing mission, has helped prepare him for the STS-109 mission. The interview ends with Grunsfeld explaining why the servicing of the Reaction Wheel Assembly, a task added late in his training, is so important.

  3. Cell transmission model of dynamic assignment for urban rail transit networks.

    PubMed

    Xu, Guangming; Zhao, Shuo; Shi, Feng; Zhang, Feilian

    2017-01-01

    For an urban rail transit network, the space-time flow distribution can play an important role in evaluating and optimizing the allocation of space-time resources. To obtain the space-time flow distribution without the restriction of schedules, a dynamic assignment problem is proposed based on the concept of continuous transmission. To solve the dynamic assignment problem, a cell transmission model is built for urban rail transit networks. The priority principle, queuing process, capacity constraints and congestion effects are considered in the cell transmission mechanism. An efficient method is then designed to solve the shortest path problem for an urban rail network, which decreases the computational cost of solving the cell transmission model. The instantaneous dynamic user-optimal state can be reached with the method of successive averages. Many passenger flow evaluation indexes can be generated, providing effective support for the optimization of train schedules and capacity evaluation for urban rail transit networks. Finally, the model and its potential application are demonstrated via two numerical experiments using a small-scale network and the Beijing Metro network.
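
    A compact sketch of the method of successive averages loop referred to above is given below; it uses a placeholder congestion function and a toy network rather than the paper's cell transmission model, purely to show the assignment pattern.

```python
# Sketch of the method of successive averages (MSA) used to approach a
# dynamic user-optimal assignment; the cost function and network here are
# stand-ins, not the paper's cell transmission model.
import networkx as nx

def congested_time(free_time, flow, capacity):
    # Simple BPR-style congestion effect as a stand-in for cell dynamics.
    return free_time * (1.0 + 0.15 * (flow / capacity) ** 4)

def msa_assignment(G, demand, iterations=50):
    """G edges carry 'free_time' and 'capacity'; demand is {(o, d): trips}."""
    flow = {e: 0.0 for e in G.edges}
    for k in range(1, iterations + 1):
        # Update link travel times from current flows.
        for e in G.edges:
            G.edges[e]["time"] = congested_time(
                G.edges[e]["free_time"], flow[e], G.edges[e]["capacity"])
        # All-or-nothing loading on the current shortest paths.
        aon = {e: 0.0 for e in G.edges}
        for (o, d), trips in demand.items():
            path = nx.shortest_path(G, o, d, weight="time")
            for e in zip(path, path[1:]):
                aon[e] += trips
        # Average the new loading into the running flow (step size 1/k).
        flow = {e: flow[e] + (aon[e] - flow[e]) / k for e in G.edges}
    return flow

G = nx.DiGraph()
G.add_edge("A", "B", free_time=4, capacity=600)
G.add_edge("A", "C", free_time=6, capacity=900)
G.add_edge("B", "D", free_time=5, capacity=600)
G.add_edge("C", "D", free_time=4, capacity=900)
print(msa_assignment(G, {("A", "D"): 1000}))
```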

  4. Path scheduling for multiple mobile actors in wireless sensor network

    NASA Astrophysics Data System (ADS)

    Trapasiya, Samir D.; Soni, Himanshu B.

    2017-05-01

    In wireless sensor networks (WSNs), energy is the main constraint. We address this issue for networks with a single mobile actor as well as multiple mobile actors. We propose a Rendezvous Point Selection Scheme (RPSS) in which Rendezvous Nodes are selected by a set covering approach and, from these, Rendezvous Points are selected so as to reduce the tour length. The mobile actor's tour is scheduled to pass through those Rendezvous Points as in the Travelling Salesman Problem (TSP). We also propose a novel rendezvous node rotation scheme for fair utilisation of all the nodes. We compare RPSS with a stationary actor scheme as well as RD-VT, RD-VT-SMT and WRP-SMT on performance metrics such as energy consumption, network lifetime and route length, and find better outcomes in all cases for a single actor. We also apply RPSS to multiple mobile actor cases, namely Multi-Actor Single Depot (MASD) termination and Multi-Actor Multiple Depot (MAMD) termination, and observe through extensive simulation that MAMD saves network energy in an optimised way and enhances network lifetime compared to all other schemes.
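
    The sketch below illustrates the two generic ingredients named in the abstract, a greedy set-cover pass to pick rendezvous nodes and a nearest-neighbour tour heuristic to schedule the mobile actor through them; coordinates, radii, and function names are illustrative assumptions, not RPSS itself.

```python
# Hedged sketch: greedy set cover for rendezvous node selection plus a
# nearest-neighbour TSP heuristic for the actor tour. Not the RPSS algorithm.
import math
import random

def greedy_set_cover(sensors, radius):
    """Pick rendezvous nodes so every sensor lies within `radius` of one."""
    uncovered = set(range(len(sensors)))
    chosen = []
    while uncovered:
        best = max(range(len(sensors)), key=lambda c: sum(
            1 for s in uncovered
            if math.dist(sensors[c], sensors[s]) <= radius))
        chosen.append(best)
        uncovered -= {s for s in uncovered
                      if math.dist(sensors[best], sensors[s]) <= radius}
    return chosen

def nearest_neighbour_tour(points, depot):
    """Order the rendezvous points into a tour starting and ending at the depot."""
    tour, remaining, cur = [depot], set(points), depot
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(cur, p))
        tour.append(nxt)
        remaining.remove(nxt)
        cur = nxt
    return tour + [depot]

random.seed(1)
sensors = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(40)]
rn = greedy_set_cover(sensors, radius=25)
tour = nearest_neighbour_tour([sensors[i] for i in rn], depot=(0.0, 0.0))
print(len(rn), "rendezvous points; tour length",
      round(sum(math.dist(a, b) for a, b in zip(tour, tour[1:])), 1))
```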

  5. A Hybrid Approach to Composite Damage and Failure Analysis Combining Synergistic Damage Mechanics and Peridynamics

    DTIC Science & Technology

    2016-06-30

    Texas A&M Engineering Experiment Station (TEES), 1470 William D. Fitch Parkway. ... 0.7% strain when the dilatational energy density reaches the experimentally determined critical value (0.2 MPa). ... To validate whether the critical ... implementation against experimental results in terms of the crack path shape. We perform convergence studies in terms of the non-local region size for

  6. Immunogenicity of HPV prophylactic vaccines: Serology assays and their use in HPV vaccine evaluation and development.

    PubMed

    Pinto, Ligia A; Dillner, Joakim; Beddows, Simon; Unger, Elizabeth R

    2018-01-17

    When administered as standard three-dose schedules, the licensed HPV prophylactic vaccines have demonstrated extraordinary immunogenicity and efficacy. We summarize the immunogenicity of these licensed vaccines and the most commonly used serology assays, with a focus on key considerations for one-dose vaccine schedules. Although immune correlates of protection against infection are not entirely clear, both preclinical and clinical evidence point to neutralizing antibodies as the principal mechanism of protection. Thus, immunogenicity assessments in vaccine trials have focused on measurements of antibody responses to the vaccine. Non-inferiority of antibody responses after two doses of HPV vaccines separated by 6 months has been demonstrated and this evidence supported the recent WHO recommendations for two-dose vaccination schedules in both boys and girls 9-14 years of age. There is also some evidence suggesting that one dose of HPV vaccines may provide protection similar to the currently recommended two-dose regimens but robust data on efficacy and immunogenicity of one-dose vaccine schedules are lacking. In addition, immunogenicity has been assessed and reported using different methods, precluding direct comparison of results between different studies and vaccines. New head-to-head vaccine trials evaluating one-dose immunogenicity and efficacy have been initiated and an increase in the number of trials relying on immunobridging is anticipated. Therefore, standardized measurement and reporting of immunogenicity for the up to nine HPV types targeted by the current vaccines is now critical. Building on previous HPV serology assay standardization and harmonization efforts initiated by the WHO HPV LabNet in 2006, new secondary standards, critical reference reagents and testing guidelines will be generated as part of a new partnership to facilitate harmonization of the immunogenicity testing in new HPV vaccine trials. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. Adaptive Management of Social-Ecological Systems: The Path Forward

    EPA Science Inventory

    Adaptive management remains at the forefront of environmental management nearly 40 years after its original conception, largely because we have yet to develop other methodologies that offer the same promise. Despite the criticisms of adaptive management and the numerous failed at...

  8. THE MAQC (MICROARRAY QUALITY CONTROL) PROJECT: CALIBRATED RNA SAMPLES, REFERENCE DATASETS, AND QC METRICS AND THRESHOLDS

    EPA Science Inventory

    FDA's Critical Path Initiative identifies pharmacogenomics and toxicogenomics as key opportunities in advancing medical product development and personalized medicine, and the Guidance for Industry: Pharmacogenomic Data Submissions has been released. Microarrays represent a co...

  9. 14 CFR § 1214.805 - Unforeseen customer delay.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... problem pose a threat of delay to the Shuttle launch schedule or critical off-line activities, NASA shall... availability of facilities, equipment, and personnel. In requesting NASA to make such special efforts, the customer shall agree to reimburse NASA the estimated additional cost incurred. ...

  10. DDC19: An Indictment.

    ERIC Educational Resources Information Center

    Berman, Sanford

    1980-01-01

    Criticizes the 19th edition of "Dewey Decimal Classification" for violating traditional classification goals for library materials and ignoring the desires of libraries and other users. A total reform is proposed to eliminate Phoenix schedules and to accept only those relocations approved by an editorial board of users. (RAA)

  11. Greedy data transportation scheme with hard packet deadlines for wireless ad hoc networks.

    PubMed

    Lee, HyungJune

    2014-01-01

    We present a greedy data transportation scheme with hard packet deadlines for ad hoc sensor networks of stationary nodes and multiple mobile nodes with scheduled trajectory paths and arrival times. In the proposed routing strategy, each stationary ad hoc node en route decides whether to relay a packet to the next stationary node on the shortest path toward the destination or to a passing-by mobile node that will carry it closer to the destination. We aim to utilize mobile nodes to minimize the total routing cost as long as the selected route can satisfy the end-to-end packet deadline. We evaluate our proposed routing algorithm in terms of routing cost, packet delivery ratio, packet delivery time, and usability of mobile nodes based on network-level simulations. Simulation results show that our proposed algorithm fully exploits the remaining time until the packet deadline, turning it into networking benefits that reduce the overall routing cost and improve packet delivery performance. We also demonstrate that the routing scheme guarantees packet delivery with hard deadlines, contributing to QoS improvement in various network services.

  12. Greedy Data Transportation Scheme with Hard Packet Deadlines for Wireless Ad Hoc Networks

    PubMed Central

    Lee, HyungJune

    2014-01-01

    We present a greedy data transportation scheme with hard packet deadlines for ad hoc sensor networks of stationary nodes and multiple mobile nodes with scheduled trajectory paths and arrival times. In the proposed routing strategy, each stationary ad hoc node en route decides whether to relay a packet to the next stationary node on the shortest path toward the destination or to a passing-by mobile node that will carry it closer to the destination. We aim to utilize mobile nodes to minimize the total routing cost as long as the selected route can satisfy the end-to-end packet deadline. We evaluate our proposed routing algorithm in terms of routing cost, packet delivery ratio, packet delivery time, and usability of mobile nodes based on network-level simulations. Simulation results show that our proposed algorithm fully exploits the remaining time until the packet deadline, turning it into networking benefits that reduce the overall routing cost and improve packet delivery performance. We also demonstrate that the routing scheme guarantees packet delivery with hard deadlines, contributing to QoS improvement in various network services. PMID:25258736
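
    To make the per-hop decision rule concrete, here is a hedged sketch of the kind of greedy choice the abstract describes: prefer a passing-by mobile carrier only when its scheduled delivery still meets the hard deadline and undercuts the static shortest-path cost. The cost model and names are assumptions, not the authors' protocol.

```python
# Illustrative decision rule in the spirit of the abstract (names and cost
# model are assumptions): pick a mobile ferry only if it still meets the
# packet's hard deadline and is cheaper than pure stationary relaying.
from dataclasses import dataclass
from typing import Optional

@dataclass
class MobilePass:
    arrival_time: float      # when the mobile node passes this hop
    delivery_time: float     # when it would reach the destination
    carry_cost: float        # routing cost charged for the ferrying

def choose_next_hop(now, deadline, static_hops_left, per_hop_delay,
                    per_hop_cost, mobile: Optional[MobilePass]):
    """Return ('static', cost) or ('mobile', cost) for the next forwarding step."""
    static_delivery = now + static_hops_left * per_hop_delay
    static_cost = static_hops_left * per_hop_cost
    if static_delivery > deadline:
        raise RuntimeError("deadline cannot be met even on the static shortest path")
    if (mobile is not None
            and mobile.delivery_time <= deadline      # hard deadline still respected
            and mobile.carry_cost < static_cost):     # and ferrying is actually cheaper
        return "mobile", mobile.carry_cost
    return "static", static_cost

# A packet with 60 s of slack: the ferry is slower but cheaper and still meets
# the deadline, so it is chosen.
print(choose_next_hop(now=0.0, deadline=60.0, static_hops_left=8,
                      per_hop_delay=1.0, per_hop_cost=5.0,
                      mobile=MobilePass(arrival_time=5.0,
                                        delivery_time=40.0, carry_cost=12.0)))
```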

  13. Contact Graph Routing Enhancements Developed in ION for DTN

    NASA Technical Reports Server (NTRS)

    Segui, John S.; Burleigh, Scott

    2013-01-01

    The Interplanetary Overlay Network (ION) software suite is an open-source, flight-ready implementation of networking protocols including the Delay/Disruption Tolerant Networking (DTN) Bundle Protocol (BP), the CCSDS (Consultative Committee for Space Data Systems) File Delivery Protocol (CFDP), and many others including the Contact Graph Routing (CGR) DTN routing system. While DTN offers the capability to tolerate disruption and long signal propagation delays in transmission, without an appropriate routing protocol, no data can be delivered. CGR was built for space exploration networks with scheduled communication opportunities (typically based on trajectories and orbits), represented as a contact graph. Since CGR uses knowledge of future connectivity, the contact graph can grow rather large, and so efficient processing is desired. These enhancements allow CGR to scale to predicted NASA space network complexities and beyond. This software improves upon CGR by adopting an earliest-arrival-time cost metric and using the Dijkstra path selection algorithm. Moving to Dijkstra path selection also enables construction of an earliest-arrival-time tree for multicast routing. The enhancements have been rolled into ION 3.0 available on sourceforge.net.
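
    The enhancement described above amounts to an earliest-arrival-time Dijkstra search over a contact plan; the sketch below shows that idea in simplified form (no bundle sizes, queueing, or residual-capacity checks), with a made-up contact plan rather than ION's actual data structures.

```python
# Sketch of Dijkstra-style earliest-arrival routing over a contact plan.
# Simplified: ignores bundle size, queueing delay, and contact capacity.
import heapq

# Each contact: (from_node, to_node, start, end, one_way_light_time)
contacts = [
    ("rover", "orbiter", 100, 200, 1),
    ("rover", "orbiter", 400, 500, 1),
    ("orbiter", "earth", 250, 600, 600),
    ("orbiter", "relay", 150, 300, 2),
    ("relay", "earth", 300, 900, 600),
]

def earliest_arrival(contacts, source, dest, t0):
    best = {source: t0}
    pq = [(t0, source)]
    while pq:
        t, node = heapq.heappop(pq)
        if node == dest:
            return t
        if t > best.get(node, float("inf")):
            continue
        for frm, to, start, end, owlt in contacts:
            if frm != node or t > end:
                continue                      # contact unusable from here
            arrive = max(t, start) + owlt     # wait for the window, then propagate
            if arrive < best.get(to, float("inf")):
                best[to] = arrive
                heapq.heappush(pq, (arrive, to))
    return None

print(earliest_arrival(contacts, "rover", "earth", t0=120))
```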

  14. Body Investment, Depression, and Alcohol Use as Risk Factors for Suicide Proneness in College Students

    PubMed Central

    Lamis, Dorian A.; Malone, Patrick S.; Langhinrichsen-Rohling, Jennifer; Ellis, Thomas E.

    2009-01-01

    This study examined the relationships among three risk factors – body investment, depression, and alcohol use – and suicide proneness as measured by the Life Attitudes Schedule – Short Form (LAS-SF) in college students (n = 318). Path analysis was used to construct a causal model of suicide proneness. The Body Investment Scale (BIS) subscales were assumed to be causally prior to depression, which was in turn modeled as occurring prior to alcohol use, which was in turn modeled as prior to suicide proneness. Results revealed that, as expected, suicide proneness was positively predicted by alcohol use, alcohol use was positively predicted by depression, and depression was negatively predicted by the body image component of the BIS. Additionally, the body image-suicide proneness link was significantly mediated by depression and its direct effect on suicide proneness, as well as by the two-mediator path from body image through depression and drinking to suicide proneness. Implications are offered for the improved identification and treatment of young adults at risk for suicidal and health-diminishing behaviors. PMID:20573605

  15. Development and demonstration of an on-board mission planner for helicopters

    NASA Technical Reports Server (NTRS)

    Deutsch, Owen L.; Desai, Mukund

    1988-01-01

    Mission management tasks can be distributed within a planning hierarchy, where each level of the hierarchy addresses a scope of action, an associated time scale or planning horizon, and requirements for plan generation response time. The current work is focused on the far-field planning subproblem, with a scope and planning horizon encompassing the entire mission and with a response time required to be about two minutes. The far-field planning problem is posed as a constrained optimization problem and algorithms and structural organizations are proposed for the solution. Algorithms are implemented in a developmental environment, and performance is assessed with respect to optimality and feasibility for the intended application and in comparison with alternative algorithms. This is done for the three major components of far-field planning: goal planning, waypoint path planning, and timeline management. It appears feasible to meet performance requirements on a 10 MIPS flyable processor (dedicated to far-field planning) using a heuristically-guided simulated annealing technique for the goal planner, a modified A* search for the waypoint path planner, and a speed scheduling technique developed for this project.
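
    As a concrete (and deliberately generic) illustration of the waypoint path planning component, the following is a plain A* grid search; the actual planner described above adds threat, fuel, and timeline constraints that are omitted here.

```python
# Generic A* grid search, shown only to make the "modified A* waypoint path
# planner" component concrete; real mission constraints are omitted.
import heapq

def a_star(grid, start, goal):
    """grid: 2D list, 1 = obstacle; returns a list of (row, col) cells."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])   # Manhattan heuristic
    open_set = [(h(start), 0, start, None)]
    came_from, g_best = {}, {start: 0}
    while open_set:
        _, g, cur, parent = heapq.heappop(open_set)
        if cur in came_from:
            continue
        came_from[cur] = parent
        if cur == goal:
            path = [cur]
            while came_from[path[-1]] is not None:
                path.append(came_from[path[-1]])
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0 and g + 1 < g_best.get(nxt, 1e9)):
                g_best[nxt] = g + 1
                heapq.heappush(open_set, (g + 1 + h(nxt), g + 1, nxt, cur))
    return None

grid = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0],
        [0, 1, 1, 0]]
print(a_star(grid, (0, 0), (3, 3)))
```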

  16. High accurate interpolation of NURBS tool path for CNC machine tools

    NASA Astrophysics Data System (ADS)

    Liu, Qiang; Liu, Huan; Yuan, Songmei

    2016-09-01

    Feedrate fluctuation caused by the approximation errors of interpolation methods has a strong effect on machining quality in NURBS interpolation, but at present few methods can efficiently eliminate or reduce it to a satisfactory level without sacrificing computing efficiency. To solve this problem, a highly accurate interpolation method for NURBS tool paths is proposed. The proposed method efficiently reduces feedrate fluctuation by forming a quartic equation with respect to the curve parameter increment, which can be solved efficiently by analytic methods in real time. Theoretically, the proposed method can totally eliminate feedrate fluctuation for any 2nd-degree NURBS curve and can interpolate 3rd-degree NURBS curves with minimal feedrate fluctuation. Moreover, a smooth feedrate planning algorithm is proposed to generate smooth tool motion while considering multiple constraints and scheduling errors through an efficient planning strategy. Experiments are conducted to verify the feasibility and applicability of the proposed method. This research presents a novel NURBS interpolation method with not only high accuracy but also satisfactory computing efficiency.
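
    The quartic-in-Δu idea can be sketched as follows: match the chord length of a second-order Taylor expansion of the curve to the desired feed per interpolation period and pick the admissible real root. The code below is an assumption-laden illustration (a plain parametric curve stands in for a NURBS evaluator, and numpy.roots replaces the analytic solution).

```python
# Hedged sketch of solving a quartic in the parameter increment so that the
# chord per interpolation period matches the commanded feed. Not the paper's
# implementation: a circle stands in for the NURBS curve.
import numpy as np

def curve(u):          # stand-in curve C(u); replace with a NURBS evaluator
    return np.array([np.cos(u), np.sin(u)]) * 50.0

def d1(u, h=1e-5):     # numerical first derivative C'(u)
    return (curve(u + h) - curve(u - h)) / (2 * h)

def d2(u, h=1e-5):     # numerical second derivative C''(u)
    return (curve(u + h) - 2 * curve(u) + curve(u - h)) / h**2

def param_increment(u, feedrate, period):
    """Solve |C' du + 0.5 C'' du^2|^2 = (F*T)^2 for the parameter step du."""
    dL = feedrate * period
    c1, c2 = d1(u), d2(u)
    # 0.25|C''|^2 du^4 + (C'.C'') du^3 + |C'|^2 du^2 - dL^2 = 0
    coeffs = [0.25 * (c2 @ c2), c1 @ c2, c1 @ c1, 0.0, -dL**2]
    roots = np.roots(coeffs)
    real = roots[np.isclose(roots.imag, 0.0) & (roots.real > 0)].real
    return real.min()   # smallest positive real root is the admissible step

u, F, T = 0.3, 100.0, 0.001          # mm/s feedrate, 1 ms interpolation period
du = param_increment(u, F, T)
print(du, np.linalg.norm(curve(u + du) - curve(u)))  # chord length ≈ F*T = 0.1 mm
```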

  17. Analysis of the Gulf of Mexico's Veracruz-Havana Route of La Flota de la Nueva España

    NASA Astrophysics Data System (ADS)

    Lugo-Fernández, A.; Ball, D. A.; Gravois, M.; Horrell, C.; Irion, J. B.

    2007-06-01

    During colonial times, an active maritime trade existed between Spain and the New World, with convoys sailing annually to and from Mexico and returning via Havana, Cuba, after wintering in America. A database constructed from secondary and open sources revealed that Spanish vessels were sailing over open waters along a northern path near Louisiana and a southern path across the central Gulf of Mexico. These routes were traversed in about one month and scheduling for the convoy was based on an understanding of the Americas’ meteorological and oceanographic climate. However, other factors may also have been involved in the directional layout of the routes. Today these ancient routes crisscross planning areas for oil and gas lease sales in the US Exclusive Economic Zone and the information presented in this article may aid in identifying areas where historic shipwrecks may lie. Maps and documents found during this study helped piece together the evolution of our understanding of the Gulf of Mexico surface circulation and how this knowledge influenced sailing during colonial times.

  18. Aseptic Handling of the MOMA Mass Spectrometer After Dry Heat Microbial Reduction

    NASA Technical Reports Server (NTRS)

    Lalime, Erin

    2017-01-01

    Mars Organic Molecule Analyzer Mass Spectrometer (MOMA-MS) is an instrument in the larger MOMA instrument suite for the European Space Agency (ESA) ExoMars 2020 Rover. As a life-detection instrument on a Mars landing mission, MOMA-MS has very stringent Planetary Protection (PP) bioburden requirements. Within the MOMA instrument suite, the hardware surfaces of the sample path must be cleaned to a level of 0.03 spore/sq m. To meet this requirement, a process called Dry Heat Microbial Reduction (DHMR) is used to decrease the number of viable spores by 4 orders of magnitude. Before DHMR, the hardware is handled using standard cleanroom practices, while after DHMR, all sample path surfaces must be handled aseptically when exposed. Aseptic handling of the sample path involves a number of strategies and protocols, including working only in an aseptic ISO class 5 work space, limiting the time of exposure, using sterile garmenting with sterile gloves, and using sterile tools. Before work begins, the aseptic workspace will be tested for bioburden and particle fallout, and all tools that will contact sample path surfaces must be sterilized. During the exposure activity, sterile garments will be worn, sterile tools will be handled in a two-person setup so that the operator touches only the sterile tool and not the exterior surfaces of the sterile pouch, and the environment will be monitored with active and passive fallout for bioburden and particle levels. Any breach in planetary protection cleanliness can necessitate repeating DHMR, which not only has significant cost and schedule implications but also poses a risk to hardware that is not rated for repeated long exposures to high temperatures.

  19. Intermittent Metronomic Drug Schedule Is Essential for Activating Antitumor Innate Immunity and Tumor Xenograft Regression

    PubMed Central

    Chen, Chong-Sheng; Doloff, Joshua C; Waxman, David J

    2014-01-01

    Metronomic chemotherapy using cyclophosphamide (CPA) is widely associated with antiangiogenesis; however, recent studies implicate other immune-based mechanisms, including antitumor innate immunity, which can induce major tumor regression in implanted brain tumor models. This study demonstrates the critical importance of drug schedule: CPA induced a potent antitumor innate immune response and tumor regression when administered intermittently on a 6-day repeating metronomic schedule but not with the same total exposure to activated CPA administered on an every 3-day schedule or using a daily oral regimen that serves as the basis for many clinical trials of metronomic chemotherapy. Notably, the more frequent metronomic CPA schedules abrogated the antitumor innate immune and therapeutic responses. Further, the innate immune response and antitumor activity both displayed an unusually steep dose-response curve and were not accompanied by antiangiogenesis. The strong recruitment of innate immune cells by the 6-day repeating CPA schedule was not sustained, and tumor regression was abolished, by a moderate (25%) reduction in CPA dose. Moreover, an ∼20% increase in CPA dose eliminated the partial tumor regression and weak innate immune cell recruitment seen in a subset of the every 6-day treated tumors. Thus, metronomic drug treatment must be at a sufficiently high dose but also sufficiently well spaced in time to induce strong sustained antitumor immune cell recruitment. Many current clinical metronomic chemotherapeutic protocols employ oral daily low-dose schedules that do not meet these requirements, suggesting that they may benefit from optimization designed to maximize antitumor immune responses. PMID:24563621

  20. Assessment of critical path analyses of the relationship between permeability and electrical conductivity of pore networks

    NASA Astrophysics Data System (ADS)

    Skaggs, Todd H.

    2011-10-01

    Critical path analysis (CPA) is a method for estimating macroscopic transport coefficients of heterogeneous materials that are highly disordered at the micro-scale. Although CPA was developed originally to model conduction in semiconductors, numerous researchers have noted that it might also have relevance to flow and transport processes in porous media. However, the results of several numerical investigations of critical path analysis on pore network models raise questions about the applicability of CPA to porous media. Among other things, these studies found that (i) in well-connected 3D networks, CPA predictions were inaccurate and became worse when heterogeneity was increased; and (ii) CPA could not fully explain the transport properties of 2D networks. To better understand the applicability of CPA to porous media, we made numerical computations of permeability and electrical conductivity on 2D and 3D networks with differing pore-size distributions and geometries. A new CPA model for the relationship between permeability and electrical conductivity was found to be in good agreement with the numerical data and to be a significant improvement over a classical CPA model. In sufficiently disordered 3D networks, the new CPA prediction was within ±20% of the true value, and was nearly optimal in terms of minimizing the squared prediction errors across differing network configurations. The agreement of CPA predictions with 2D network computations was similarly good, although 2D networks are in general not well-suited for evaluating CPA. Numerical transport coefficients derived for regular 3D networks of slit-shaped pores were found to be in better agreement with experimental data from rock samples than were coefficients derived for networks of cylindrical pores.
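
    A small sketch of the core critical-path-analysis step (under assumed pore-network details, not the paper's model) is shown below: bonds are added in decreasing order of conductance until the inlet face first connects to the outlet face, and the conductance of the last-added bond is the critical conductance from which CPA scales permeability and electrical conductivity.

```python
# Illustrative CPA step on an assumed pore network: find the critical bond
# conductance g_c at which a spanning cluster first forms between the inlet
# and outlet faces of a 2D lattice.
import random
import networkx as nx

def critical_conductance(n=20, seed=0):
    rng = random.Random(seed)
    lattice = nx.grid_2d_graph(n, n)
    # Log-uniform bond conductances mimic a broad (disordered) pore-size spread.
    bonds = sorted(((10 ** rng.uniform(-6, 0), e) for e in lattice.edges),
                   reverse=True)
    # Virtual super-nodes tie together the inlet (row 0) and outlet (row n-1) faces.
    sub = nx.Graph()
    sub.add_nodes_from(lattice.nodes)
    sub.add_edges_from(("IN", (0, j)) for j in range(n))
    sub.add_edges_from(("OUT", (n - 1, j)) for j in range(n))
    for g, (u, v) in bonds:                  # strongest bonds first
        sub.add_edge(u, v)
        if nx.has_path(sub, "IN", "OUT"):
            return g                         # first spanning cluster -> critical bond
    return None

print(critical_conductance())
```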

  1. Applicability of Markets to Global Scheduling in Grids: Critical Examination of General Equilibrium Theory and Market Folklore

    NASA Technical Reports Server (NTRS)

    Nakai, Junko; VanDerWijngaart, Rob F.

    2003-01-01

    Markets are often considered superior to other global scheduling mechanisms for distributed computing systems. This claim is supported by: a casual observation from our everyday life that markets successfully equilibrate supply and demand, and the features of markets which originate in the general equilibrium theory, e.g., efficiency and the lack of necessity of a central controller. This paper describes why such beliefs in markets are not warranted. It does so by examining the general equilibrium theory, in terms of scope, abstraction, and interpretation. Not only does the general equilibrium theory fail to provide a satisfactory explanation of actual economies, including a computing-resource economy, it also falls short of supplying theoretical foundations for commonly held views of market desirability. This paper also points out that the argument for the desirability of markets involves circular reasoning and that the desirability can be established only vis-a-vis a scheduling goal. Finally, recasting the conclusion of Arrow's Impossibility Theorem as that for global scheduling, we conclude that there exists no market-based scheduler that is rational (in the sense defined in microeconomic theory), takes into account utility of more than one user, and yet yields a Pareto-optimal outcome for arbitrary user utility functions.

  2. Historical Mass, Power, Schedule, and Cost Growth for NASA Spacecraft

    NASA Technical Reports Server (NTRS)

    Hayhurst, Marc R.; Bitten, Robert E.; Shinn, Stephen A.; Judnick, Daniel C.; Hallgrimson, Ingrid E.; Youngs, Megan A.

    2016-01-01

    Although spacecraft developers have been moving towards standardized product lines as the aerospace industry has matured, NASA's continual need to push the cutting edge of science to accomplish unique, challenging missions can still lead to spacecraft resource growth over time. This paper assesses historical mass, power, cost, and schedule growth for multiple NASA spacecraft from the last twenty years and compares to industry reserve guidelines to understand where the guidelines may fall short. Growth is assessed from project start to launch, from the time of the preliminary design review (PDR) to launch and from the time of the critical design review (CDR) to launch. Data is also assessed not just at the spacecraft bus level, but also at the subsystem level wherever possible, to help obtain further insight into possible drivers of growth. Potential recommendations to minimize spacecraft mass, power, cost, and schedule growth for future missions are also discussed.

  3. Principles and Guidelines for Duty and Rest Scheduling in Commercial Aviation

    NASA Technical Reports Server (NTRS)

    Dinges, David F.; Graeber, R. Curtis; Rosekind, Mark R.; Samel, Alexander

    1996-01-01

    The aviation industry requires 24-hour activities to meet operational demands. Growth in global long-haul, regional, overnight cargo, and short-haul domestic operations will continue to increase these round-the-clock requirements. Flight crews must be available to support 24-hour-a-day operations to meet these industry demands. Both domestic and international aviation can also require crossing multiple time zones. Therefore, shift work, night work, irregular work schedules, unpredictable work schedules, and time zone changes will continue to be commonplace components of the aviation industry. These factors pose known challenges to human physiology, and because they result in performance-impairing fatigue, they pose a risk to safety. It is critical to acknowledge and, whenever possible, incorporate scientific information on fatigue, human sleep, and circadian physiology into 24-hour aviation operations. Utilization of such scientific information can help promote crew performance and alertness during flight operations and thereby maintain and improve the safety margin.

  4. The tail wags the dog: managing large telescope construction projects with lagging requirements and creeping scope

    NASA Astrophysics Data System (ADS)

    Warner, Mark

    2014-08-01

    In a perfect world, large telescopes would be developed and built in logical, sequential order. First, scientific requirements would be agreed upon, vetted, and fully developed. From these, instrument designers would define their own subsystem requirements and specifications, and then flesh out preliminary designs. This in turn would allow optic designers to specify lens and mirror requirements, which would permit telescope mounts and drives to be designed. Finally, software and safety systems, enclosures and domes, buildings, foundations, and infrastructure would be specified and developed. Unfortunately, the order of most large telescope projects is the opposite of this sequence. We don't live in a perfect world. Scientists usually don't want to commit to operational requirements until late in the design process, instrument designers frequently change and update their designs due to improving filter and camera technologies, and mount and optics engineers seem to live by the words "more" and "better" throughout their own design processes. Amplifying this is the fact that site construction of buildings and domes is usually among the earliest critical-path items on the schedule and is often subject to lengthy permitting and environmental processes. These facility and support items therefore must get underway quickly, often before operational requirements are fully considered. Mirrors and mounts also have very long fabrication lead times, which in turn necessitates that they be specified and purchased early. All of these factors can result in expensive and time-consuming change orders when requirements are finalized and/or shift late in the process. This paper discusses some of these issues encountered on large, multi-year construction projects. It also presents some techniques and ideas to minimize their effects on schedule and cost. Included is a discussion of the role of Interface Control Documents (ICDs), the importance (and danger) of making big-picture decisions early, and the value of designing flexibility and adaptability into subsystems. In a perfect world, science would be the big dog in the room, wagging the engineering tail. In our non-perfect world, however, it's often the tail that ends up wagging the dog instead.

  5. NASA Bioculture System: From Experiment Definition to Flight Payload

    NASA Technical Reports Server (NTRS)

    Sato, Kevin Y.; Almeida, Eduardo; Austin, Edward M.

    2014-01-01

    Starting in 2015, the NASA Bioculture System will be available to the science community to conduct cell biology and microbiology experiments on ISS. The Bioculture System carries ten environmentally independent Cassettes, which house the experiments. The closed-loop fluid flow path subsystem in each Cassette provides a perfusion-based method for maintaining specimen cultures in a shear-free environment by using a biochamber based on porous hollow-fiber bioreactor technology. Each Cassette contains an incubator and a separate insulated refrigerator compartment for storage of media, samples, nutrients, and additives. The hardware is capable of fully automated or manual specimen culturing and processing, including in-flight experiment initiation, sampling, and fixation; culturing of up to BSL-2 specimens; and the ability to run up to 10 independent cultures in parallel for statistical analysis. The incubation and culturing of specimens in the Bioculture System is a departure from standard laboratory culturing methods. Therefore, it is critical that the PI understand the pre-flight testing required to successfully use the Bioculture System for an on-orbit experiment. Overall, the PI will conduct a series of ground tests to define flight experiment and on-orbit implementation requirements, verify biocompatibility, and determine baseline bioreactor conditions. The ground test processes for the utilization of the Bioculture System, from experiment selection to flight, will be reviewed. Also, pre-flight test schedules and the use of COTS ground test equipment (CellMax and FiberCell systems) and the Bioculture System will be discussed.

  6. Neural Architectures for Control

    NASA Technical Reports Server (NTRS)

    Peterson, James K.

    1991-01-01

    The cerebellar model articulated controller (CMAC) neural architectures are shown to be viable for the purposes of real-time learning and control. Software tools for the exploration of CMAC performance were developed for three hardware platforms: the Macintosh, the IBM PC, and the Sun workstation. All algorithm development was done using the C programming language. These software tools were then used to implement an adaptive critic neuro-control design that learns in real time how to back up a trailer truck. The truck backer-upper experiment is a standard performance measure in the neural network literature, but previously the training of the controllers was done off-line. With the CMAC neural architectures, it was possible to train the neuro-controllers on-line, in real time, on an MS-DOS 386 PC. CMAC neural architectures are also used in conjunction with a hierarchical planning approach to find collision-free paths over 2-D analog-valued obstacle fields. The method constructs a coarse-resolution version of the original problem and then finds the corresponding coarse optimal path using multipass dynamic programming. CMAC artificial neural architectures are used to estimate the analog transition costs that dynamic programming requires. The CMAC architectures are trained in real time for each obstacle field presented. The coarse optimal path is then used as a baseline for the construction of a fine-scale optimal path through the original obstacle array. These results are a very good indication of the potential power of neural architectures in control design. In order to reach as wide an audience as possible, we have run a seminar on neuro-control that has met once per week since 20 May 1991. This seminar has thoroughly covered the CMAC architecture, relevant portions of classical control, back propagation through time, and adaptive critic designs.
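    The coarse-to-fine planning step described above can be illustrated with a short sketch: an analog-valued obstacle field is block-averaged to a coarse grid, a cost-to-go table is computed with multipass dynamic programming (value iteration), and the coarse path is read off by greedy descent. In the reported work the transition costs are estimated on-line by CMAC networks; here they are simply given, and the grid sizes and cost field are assumptions.

```python
import numpy as np

def coarse_costs(fine_cost, factor):
    """Downsample an obstacle cost field by block-averaging (the coarse problem)."""
    h, w = fine_cost.shape
    return fine_cost[:h - h % factor, :w - w % factor] \
        .reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def dp_path(cost, goal):
    """Multipass dynamic programming (value iteration) for a minimum-cost
    4-connected path to `goal`; returns the cost-to-go table."""
    H, W = cost.shape
    V = np.full((H, W), np.inf)
    V[goal] = 0.0
    moves = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    changed = True
    while changed:                       # repeated sweeps until convergence
        changed = False
        for i in range(H):
            for j in range(W):
                for di, dj in moves:
                    ni, nj = i + di, j + dj
                    if 0 <= ni < H and 0 <= nj < W:
                        cand = cost[i, j] + V[ni, nj]
                        if cand < V[i, j] - 1e-12:
                            V[i, j] = cand
                            changed = True
    return V

def extract_path(V, start):
    """Greedy descent of the cost-to-go table from `start` to the goal."""
    H, W = V.shape
    path, (i, j) = [start], start
    while V[i, j] > 0.0:
        nbrs = [(i + di, j + dj) for di, dj in [(-1, 0), (1, 0), (0, -1), (0, 1)]
                if 0 <= i + di < H and 0 <= j + dj < W]
        i, j = min(nbrs, key=lambda p: V[p])
        path.append((i, j))
    return path

rng = np.random.default_rng(1)
fine = rng.random((32, 32)) + 0.05          # analog-valued obstacle field
coarse = coarse_costs(fine, factor=4)       # 8x8 coarse problem
V = dp_path(coarse, goal=(7, 7))
print(extract_path(V, start=(0, 0)))        # coarse path; would seed the fine-scale search
```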

  7. The Case for Teacher-Led Innovation

    ERIC Educational Resources Information Center

    Parry, Louka

    2018-01-01

    In order to bring the benefits of new education approaches, strategies, and tools to classrooms around the world, we need champions of innovation to blaze the path and serve as role models of creativity and exploration. Teachers are uniquely positioned to serve in this critical role.

  8. Opto-Electronic and Interconnects Hierarchical Design Automation System (OE-IDEAS)

    DTIC Science & Technology

    2004-05-01

    Excerpt (table-of-contents and text fragments): 8.2 Simulation of Critical Path from the Mayo "10G" System MCM Board; Benchmarks from the DaVinci Netbook website. "In May 2002, CFDRC downloaded all the materials from the DaVinci Netbook website containing the benchmark..."

  9. Predicting field weed emergence with empirical models and soft computing techniques

    USDA-ARS?s Scientific Manuscript database

    Seedling emergence is the most important phenological process that influences the success of weed species; therefore, predicting weed emergence timing plays a critical role in scheduling weed management measures. Important efforts have been made in the attempt to develop models to predict seedling e...
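    Although the snippet above is truncated, the empirical-model idea it describes is commonly realized as a sigmoidal curve of cumulative emergence versus accumulated thermal or hydrothermal time. The sketch below fits a Weibull curve with nonlinear least squares; the data, the choice of a Weibull form, and the hydrothermal-time predictor are assumptions for illustration, not the models of the manuscript.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical data: cumulative % emergence vs. accumulated hydrothermal time (HTT).
htt = np.array([20, 40, 60, 80, 100, 130, 160, 200, 250], dtype=float)
emerg = np.array([1, 4, 12, 30, 52, 74, 88, 96, 99], dtype=float)

def weibull(x, a, b, c):
    """Weibull cumulative emergence curve: asymptote a, scale b, shape c."""
    return a * (1.0 - np.exp(-(x / b) ** c))

params, _ = curve_fit(weibull, htt, emerg, p0=[100.0, 100.0, 2.0])
a, b, c = params
print(f"asymptote={a:.1f}%  scale={b:.1f}  shape={c:.2f}")
print(f"predicted emergence at HTT=120: {weibull(120.0, a, b, c):.1f}%")
```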

  10. 5 CFR 531.403 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PAY UNDER THE GENERAL SCHEDULE... performance by an employee that warrants advancement of the employee's rate of basic pay to the next higher.... Calendar week means a period of any seven consecutive calendar days. Critical element has the meaning given...

  11. 9 CFR 381.303 - Critical factors and the application of the process schedule.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE AGENCY ORGANIZATION AND TERMINOLOGY; MANDATORY MEAT AND POULTRY PRODUCTS INSPECTION AND VOLUNTARY INSPECTION AND CERTIFICATION POULTRY PRODUCTS INSPECTION REGULATIONS.../air ratio; and (2) Heating medium flow rate. (Approved by the Office of Management and Budget under...

  12. 9 CFR 381.303 - Critical factors and the application of the process schedule.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE AGENCY ORGANIZATION AND TERMINOLOGY; MANDATORY MEAT AND POULTRY PRODUCTS INSPECTION AND VOLUNTARY INSPECTION AND CERTIFICATION POULTRY PRODUCTS INSPECTION REGULATIONS.../air ratio; and (2) Heating medium flow rate. (Approved by the Office of Management and Budget under...

  13. 9 CFR 381.303 - Critical factors and the application of the process schedule.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE AGENCY ORGANIZATION AND TERMINOLOGY; MANDATORY MEAT AND POULTRY PRODUCTS INSPECTION AND VOLUNTARY INSPECTION AND CERTIFICATION POULTRY PRODUCTS INSPECTION REGULATIONS.../air ratio; and (2) Heating medium flow rate. (Approved by the Office of Management and Budget under...

  14. 9 CFR 381.303 - Critical factors and the application of the process schedule.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE AGENCY ORGANIZATION AND TERMINOLOGY; MANDATORY MEAT AND POULTRY PRODUCTS INSPECTION AND VOLUNTARY INSPECTION AND CERTIFICATION POULTRY PRODUCTS INSPECTION REGULATIONS.../air ratio; and (2) Heating medium flow rate. (Approved by the Office of Management and Budget under...

  15. Dynamic phase transitions of the Blume-Emery-Griffiths model under an oscillating external magnetic field by the path probability method

    NASA Astrophysics Data System (ADS)

    Ertaş, Mehmet; Keskin, Mustafa

    2015-03-01

    Using the path probability method (PPM) with point distribution, we study the dynamic phase transitions (DPTs) in the Blume-Emery-Griffiths (BEG) model under an oscillating external magnetic field. The phases in the model are obtained by solving the dynamic equations for the average order parameters; a disordered phase, an ordered phase, and four mixed phases are found. We also investigate the thermal behavior of the dynamic order parameters to analyze the nature of the dynamic transitions as well as to obtain the DPT temperatures. The dynamic phase diagrams are presented in three different planes and exhibit a dynamic tricritical point, double critical end point, critical end point, quadruple point, and triple point, as well as reentrant behavior, depending strongly on the values of the system parameters. We compare and discuss these dynamic phase diagrams with those obtained within Glauber-type stochastic dynamics based on mean-field theory.
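    The PPM equations for the BEG model are lengthy, but the workflow described above (integrate dynamic equations for the average order parameters under an oscillating field, then classify the phase from the period-averaged order parameter) can be illustrated with the simpler Glauber-type mean-field equation that the abstract uses for comparison, written here for a spin-1/2 magnetization rather than the full BEG order parameters. The temperatures, field amplitude, and integration settings are assumptions.

```python
import numpy as np

def dynamic_order_parameter(T, h0, omega=2 * np.pi, zJ=1.0, transient=100, steps=1000):
    """Euler-integrate the Glauber-type mean-field dynamic equation
        dm/dt = -m + tanh((zJ*m + h0*cos(omega*t)) / T)
    and return Q, the average of m(t) over one drive period after discarding
    `transient` periods.  |Q| > 0 marks the dynamically ordered phase; Q ~ 0
    marks the disordered phase."""
    dt = 2 * np.pi / (omega * steps)
    m, t = 0.1, 0.0
    for _ in range(transient * steps):
        m += dt * (-m + np.tanh((zJ * m + h0 * np.cos(omega * t)) / T))
        t += dt
    Q = 0.0
    for _ in range(steps):
        m += dt * (-m + np.tanh((zJ * m + h0 * np.cos(omega * t)) / T))
        t += dt
        Q += m * dt
    return Q * omega / (2 * np.pi)

for T in (0.3, 0.6, 0.9, 1.2):   # temperatures in units of zJ/k_B
    print(f"T={T:.1f}  Q={dynamic_order_parameter(T, h0=0.3):+.3f}")
```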

  16. Interview with Janet Woodcock: progress on the FDA's critical path initiative.

    PubMed

    Woodcock, Janet

    2009-12-01

    Janet Woodcock is the Director of the US FDA's Center for Drug Evaluation and Research. Dr Woodcock held various positions within the FDA's Office of the Commissioner from October 2003 until 1 April 2008, serving as Deputy Commissioner and Chief Medical Officer, Deputy Commissioner for Operations and Chief Operating Officer, and Director of the Critical Path Programs. She oversaw scientific and medical regulatory operations for the FDA. Dr Woodcock served as Director of the Center for Drug Evaluation and Research at the FDA from 1994 to 2005. She previously served in other positions at the FDA, including Director of the Office of Therapeutics Research and Review and Acting Deputy Director of the Center for Biologics Evaluation and Research. Dr Woodcock received her MD from Northwestern Medical School (IL, USA), and completed further training and held teaching appointments at the Pennsylvania State University (PA, USA) and the University of California, San Francisco (CA, USA). She joined the FDA in 1986.

  17. Making psycholinguistics musical: Self-paced reading time evidence for shared processing of linguistic and musical syntax

    PubMed Central

    Robert Slevc, L.; Rosenberg, Jason C.; Patel, Aniruddh D.

    2009-01-01

    Linguistic processing, especially syntactic processing, is often considered a hallmark of human cognition; thus the domain specificity or domain generality of syntactic processing has attracted considerable debate. These experiments address this issue by simultaneously manipulating syntactic processing demands in language and music. Participants performed self-paced reading of garden-path sentences in which structurally unexpected words cause temporary syntactic processing difficulty. A musical chord accompanied each sentence segment, with the resulting sequence forming a coherent chord progression. When structurally unexpected words were paired with harmonically unexpected chords, participants showed substantially enhanced garden-path effects. No such interaction was observed when the critical words violated semantic expectancy, nor when the critical chords violated timbral expectancy. These results support a prediction of the shared syntactic integration resource hypothesis (SSIRH; Patel, 2003), which suggests that music and language draw on a common pool of limited processing resources for integrating incoming elements into syntactic structures. PMID:19293110

  18. Human spaceflight technology needs-a foundation for JSC's technology strategy

    NASA Astrophysics Data System (ADS)

    Stecklein, J. M.

    Human space exploration has always been heavily influenced by goals to achieve a specific mission on a specific schedule. This approach drove rapid technology development, the rapidity of which added risks and became a major driver for costs and cost uncertainty. The National Aeronautics and Space Administration (NASA) is now approaching the extension of human presence throughout the solar system by balancing a proactive yet less schedule-driven development of technology with opportunistic scheduling of missions as the needed technologies are realized. This approach should provide cost-effective, low-risk technology development that will enable efficient and effective manned spaceflight missions. As a first step, the NASA Human Spaceflight Architecture Team (HAT) has identified a suite of critical technologies needed to support future manned missions across a range of destinations, including cis-lunar space, near-Earth asteroid visits, lunar exploration, Mars moons, and Mars exploration. The challenge now is to develop a strategy and plan for technology development that efficiently enables these missions over a reasonable time period, without increasing technology development costs unnecessarily due to schedule pressure, and subsequently mitigating development and mission risks. NASA's Johnson Space Center (JSC), as the nation's primary center for human exploration, is addressing this challenge through an innovative approach to allocating Internal Research and Development funding to projects. The HAT Technology Needs (TechNeeds) Database has been developed to correlate critical technologies with the NASA Office of Chief Technologist Technology Area Breakdown Structure (TABS). The TechNeeds Database illuminates that many critical technologies may support a single technical capability gap, that many HAT technology needs may map to a single TABS technology discipline, and that a single HAT technology need may map to multiple TABS technology disciplines. The TechNeeds Database greatly clarifies understanding of the complex relationships of critical technologies to mission and architecture element needs. Extensions to the core TechNeeds Database allow JSC to factor in and appropriately weight JSC core technology competencies, commercialization potential, and partnership potential. The inherent coupling among these, along with an appropriate importance weighting, has provided an initial prioritization for allocation of technology development research funding at JSC. The HAT Technology Needs Database, with a core of built-in reports, clarifies and communicates complex technology needs for cost-effective human space exploration so that an organization seeking to assure that research prioritization supports human spaceflight of the future can be successful.

  19. Human Spaceflight Technology Needs - A Foundation for JSC's Technology Strategy

    NASA Technical Reports Server (NTRS)

    Stecklein, Jonette M.

    2013-01-01

    Human space exploration has always been heavily influenced by goals to achieve a specific mission on a specific schedule. This approach drove rapid technology development, the rapidity of which adds risks and is a major driver for costs and cost uncertainty. The National Aeronautics and Space Administration (NASA) is now approaching the extension of human presence throughout the solar system by balancing a proactive yet less schedule-driven development of technology with opportunistic scheduling of missions as the needed technologies are realized. This approach should provide cost-effective, low-risk technology development that will enable efficient and effective manned spaceflight missions. As a first step, the NASA Human Spaceflight Architecture Team (HAT) has identified a suite of critical technologies needed to support future manned missions across a range of destinations, including cis-lunar space, near-Earth asteroid visits, lunar exploration, Mars moons, and Mars exploration. The challenge now is to develop a strategy and plan for technology development that efficiently enables these missions over a reasonable time period, without increasing technology development costs unnecessarily due to schedule pressure, and subsequently mitigating development and mission risks. NASA's Johnson Space Center (JSC), as the nation's primary center for human exploration, is addressing this challenge through an innovative approach to allocating Internal Research and Development funding to projects. The HAT Technology Needs (TechNeeds) Database has been developed to correlate critical technologies with the NASA Office of Chief Technologist Technology Area Breakdown Structure (TABS). The TechNeeds Database illuminates that many critical technologies may support a single technical capability gap, that many HAT technology needs may map to a single TABS technology discipline, and that a single HAT technology need may map to multiple TABS technology disciplines. The TechNeeds Database greatly clarifies understanding of the complex relationships of critical technologies to mission and architecture element needs. Extensions to the core TechNeeds Database allow JSC to factor in and appropriately weight JSC Center Core Technology Competencies, as well as considerations of Commercialization Potential and Partnership Potential. The inherent coupling among these, along with an appropriate importance weighting, has provided an initial prioritization for allocation of technology development research funding for JSC. The HAT Technology Needs Database, with a core of built-in reports, clarifies and communicates complex technology needs for cost-effective human space exploration such that an organization seeking to assure that research prioritization supports human spaceflight of the future can be successful.
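    A toy sketch of the many-to-many mapping and weighted prioritization described in the two abstracts above is given below. The technology needs, TABS labels, attribute scores, and weights are all invented for illustration; only the structure (needs mapping to several TABS disciplines, with a weighted score driving funding priority) reflects the description.

```python
# Hypothetical sketch of the TechNeeds-style structure: each technology need can
# map to several TABS disciplines, and a weighted score combining competency fit,
# commercialization potential, and partnership potential yields a prioritization.
# All entries and weights below are invented for illustration.

tech_needs = {
    "closed-loop life support": {
        "tabs": ["TA06 Human Health & Life Support", "TA07 Exploration Destination Systems"],
        "competency_fit": 0.9, "commercial": 0.3, "partnership": 0.6,
    },
    "autonomous rendezvous & docking": {
        "tabs": ["TA04 Robotics & Autonomy", "TA05 Comm & Navigation"],
        "competency_fit": 0.6, "commercial": 0.7, "partnership": 0.8,
    },
}

weights = {"competency_fit": 0.5, "commercial": 0.2, "partnership": 0.3}

def priority(need):
    """Weighted score used to rank needs for funding allocation."""
    return sum(weights[k] * need[k] for k in weights)

for name, need in sorted(tech_needs.items(), key=lambda kv: -priority(kv[1])):
    print(f"{priority(need):.2f}  {name}  -> {', '.join(need['tabs'])}")
```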

  20. Analyzing the nursing organizational structure and process from a scheduling perspective.

    PubMed

    Maenhout, Broos; Vanhoucke, Mario

    2013-09-01

    The efficient and effective management of nursing personnel is of critical importance in a hospital environment, where nursing accounts for approximately 25% of a hospital's operational costs. The nurse organizational structure and the organizational processes highly affect the nurses' working conditions and the quality of care provided. In this paper, we investigate the impact of different nurse organization structures and different organizational processes for a real-life situation in a Belgian university hospital. To make accurate nurse staffing decisions, the employed solution methodology incorporates shift-scheduling characteristics in order to overcome the deficiencies of the many phase-specific methodologies proposed in the academic literature.
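    A minimal example of the kind of shift-scheduling structure such staffing analyses incorporate is sketched below using the PuLP modeling library: binary assignment variables, per-shift coverage requirements, and a one-shift-per-day rule. The nurses, requirements, and objective are hypothetical and far simpler than the methodology of the paper.

```python
# A minimal nurse shift-assignment sketch (PuLP).  Nurses, coverage requirements,
# and the one-shift-per-day rule are hypothetical.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary

nurses = ["N1", "N2", "N3", "N4", "N5"]
days = list(range(7))
shifts = ["early", "late", "night"]
required = {"early": 2, "late": 2, "night": 1}   # nurses needed per shift per day

prob = LpProblem("nurse_scheduling", LpMinimize)
x = LpVariable.dicts("work", (nurses, days, shifts), cat=LpBinary)

# Objective: minimize the total number of assigned shifts (a stand-in for staffing cost).
prob += lpSum(x[n][d][s] for n in nurses for d in days for s in shifts)

for d in days:
    # Coverage: every shift on every day must meet its requirement.
    for s in shifts:
        prob += lpSum(x[n][d][s] for n in nurses) >= required[s]
    # Each nurse works at most one shift per day.
    for n in nurses:
        prob += lpSum(x[n][d][s] for s in shifts) <= 1

prob.solve()
for n in nurses:
    print(n, [next((s for s in shifts if x[n][d][s].value() == 1), "-") for d in days])
```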
