Sample records for time analysis project

  1. Projecting Event-Based Analysis Dates in Clinical Trials: An Illustration Based on the International Duration Evaluation of Adjuvant Chemotherapy (IDEA) Collaboration. Projecting analysis dates for the IDEA collaboration.

    PubMed

    Renfro, Lindsay A; Grothey, Axel M; Paul, James; Floriani, Irene; Bonnetain, Franck; Niedzwiecki, Donna; Yamanaka, Takeharu; Souglakos, Ioannis; Yothers, Greg; Sargent, Daniel J

    2014-12-01

    Clinical trials are expensive and lengthy, where success of a given trial depends on observing a prospectively defined number of patient events required to answer the clinical question. The point at which this analysis time occurs depends on both patient accrual and primary event rates, which typically vary throughout the trial's duration. We demonstrate real-time analysis date projections using data from a collection of six clinical trials that are part of the IDEA collaboration, an international preplanned pooling of data from six trials testing the duration of adjuvant chemotherapy in stage III colon cancer, and we additionally consider the hypothetical impact of one trial's early termination of follow-up. In the absence of outcome data from IDEA, monthly accrual rates for each of the six IDEA trials were used to project subsequent trial-specific accrual, while historical data from similar Adjuvant Colon Cancer Endpoints (ACCENT) Group trials were used to construct a parametric model for IDEA's primary endpoint, disease-free survival, under the same treatment regimen. With this information and using the planned total accrual from each IDEA trial protocol, individual patient accrual and event dates were simulated and the overall IDEA interim and final analysis times projected. Projections were then compared with actual (previously undisclosed) trial-specific event totals at a recent census time for validation. The change in projected final analysis date assuming early termination of follow-up for one IDEA trial was also calculated. Trial-specific predicted event totals were close to the actual number of events per trial for the recent census date at which the number of events per trial was known, with the overall IDEA projected number of events only off by eight patients. Potential early termination of follow-up by one IDEA trial was estimated to postpone the overall IDEA final analysis date by 9 months. Real-time projection of the final analysis time during a trial, or the overall analysis time during a trial collaborative such as IDEA, has practical implications for trial feasibility when these projections are translated into additional time and resources required.
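
    The projection method summarized above lends itself to a compact Monte Carlo sketch. The snippet below is a minimal illustration (not the authors' code): it assumes Poisson patient accrual and a Weibull event-time model standing in for the ACCENT-based parametric model, and every parameter value is hypothetical.

    ```python
    # Minimal sketch of event-based analysis-date projection (not the authors'
    # code). Assumes Poisson accrual and Weibull event times; all parameter
    # values here are hypothetical, not taken from IDEA or ACCENT.
    import numpy as np

    rng = np.random.default_rng(42)

    def project_analysis_time(n_patients, accrual_rate, shape, scale,
                              target_events, n_sims=1000):
        """Simulate calendar times (months) at which `target_events` events accrue."""
        out = np.empty(n_sims)
        for i in range(n_sims):
            # Patient entry times: homogeneous Poisson accrual.
            entry = np.cumsum(rng.exponential(1.0 / accrual_rate, n_patients))
            # Event times from entry: Weibull stand-in for the parametric DFS model.
            event = scale * rng.weibull(shape, n_patients)
            out[i] = np.sort(entry + event)[target_events - 1]
        return out

    times = project_analysis_time(n_patients=2000, accrual_rate=50,
                                  shape=0.9, scale=80, target_events=500)
    print(f"median projected analysis time: {np.median(times):.1f} months")
    ```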

  2. Projecting Event-Based Analysis Dates in Clinical Trials: An Illustration Based on the International Duration Evaluation of Adjuvant Chemotherapy (IDEA) Collaboration. Projecting analysis dates for the IDEA collaboration

    PubMed Central

    Renfro, Lindsay A.; Grothey, Axel M.; Paul, James; Floriani, Irene; Bonnetain, Franck; Niedzwiecki, Donna; Yamanaka, Takeharu; Souglakos, Ioannis; Yothers, Greg; Sargent, Daniel J.

    2015-01-01

    Purpose Clinical trials are expensive and lengthy, where success of a given trial depends on observing a prospectively defined number of patient events required to answer the clinical question. The point at which this analysis time occurs depends on both patient accrual and primary event rates, which typically vary throughout the trial's duration. We demonstrate real-time analysis date projections using data from a collection of six clinical trials that are part of the IDEA collaboration, an international preplanned pooling of data from six trials testing the duration of adjuvant chemotherapy in stage III colon cancer, and we additionally consider the hypothetical impact of one trial's early termination of follow-up. Patients and Methods In the absence of outcome data from IDEA, monthly accrual rates for each of the six IDEA trials were used to project subsequent trial-specific accrual, while historical data from similar Adjuvant Colon Cancer Endpoints (ACCENT) Group trials were used to construct a parametric model for IDEA's primary endpoint, disease-free survival, under the same treatment regimen. With this information and using the planned total accrual from each IDEA trial protocol, individual patient accrual and event dates were simulated and the overall IDEA interim and final analysis times projected. Projections were then compared with actual (previously undisclosed) trial-specific event totals at a recent census time for validation. The change in projected final analysis date assuming early termination of follow-up for one IDEA trial was also calculated. Results Trial-specific predicted event totals were close to the actual number of events per trial for the recent census date at which the number of events per trial was known, with the overall IDEA projected number of events only off by eight patients. Potential early termination of follow-up by one IDEA trial was estimated to postpone the overall IDEA final analysis date by 9 months. Conclusions Real-time projection of the final analysis time during a trial, or the overall analysis time during a trial collaborative such as IDEA, has practical implications for trial feasibility when these projections are translated into additional time and resources required. PMID:26989447

  3. 75 FR 65620 - Inglis Hydropower, LLC; Notice of Application Ready for Environmental Analysis and Soliciting...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-26

    ... Hydropower, LLC; Notice of Application Ready for Environmental Analysis and Soliciting Comments...: Inglis Hydropower, LLC. e. Name of Project: Inglis Hydropower Project. f. Location: The project would be... ready for environmental analysis at this time. l. The proposed 2.0-megawatt Inglis Hydropower Project...

  4. Sampling and Analysis Plan - Guidance and Template v.4 - General Projects - 04/2014

    EPA Pesticide Factsheets

    This Sampling and Analysis Plan (SAP) guidance and template is intended to assist organizations in documenting the procedural and analytical requirements for one-time, or time-limited, projects involving the collection of water, soil, sediment, or other

  5. Machine translation project alternatives analysis

    NASA Technical Reports Server (NTRS)

    Bajis, Catherine J.; Bedford, Denise A. D.

    1993-01-01

    The Machine Translation Project consists of several components, two of which, the Project Plan and the Requirements Analysis, have already been delivered. The Project Plan details the overall rationale, objectives and time-table for the project as a whole. The Requirements Analysis compares a number of available machine translation systems, their capabilities, possible configurations, and costs. The Alternatives Analysis has resulted in a number of conclusions and recommendations to the NASA STI program concerning the acquisition of specific MT systems and related hardware and software.

  6. Technology Transition Pull: A Case Study of Rate Monotonic Analysis (Part 2).

    DTIC Science & Technology

    1995-04-01

    met in software-intensive real-time systems. RMA allows engineers to understand and predict the timing behavior of real-time software to a degree... not previously possible. The Rate Monotonic Analysis for Real-Time Systems (RMARTS) Project at the SEI has demonstrated how to design, implement... troubleshoot, and maintain real-time systems using RMA. From 1987-1992, the project worked to develop the technology and encourage its widespread

  7. A Social Network Analysis of 140 Community‐Academic Partnerships for Health: Examining the Healthier Wisconsin Partnership Program

    PubMed Central

    Ahmed, Syed M.; Maurana, Cheryl A.; DeFino, Mia C.; Brewer, Devon D.

    2015-01-01

    Abstract Introduction: Social Network Analysis (SNA) provides an important, underutilized approach to evaluating Community-Academic Partnerships for Health (CAPHs). This study examines administrative data from 140 CAPHs funded by the Healthier Wisconsin Partnership Program (HWPP). Methods: Funder data were normalized to maximize the number of interconnections between funded projects and 318 non-redundant community partner organizations in a dual-mode analysis examining the period from 2003-2013. Two strategic planning periods, 2003-2008 vs. 2009-2014, allowed temporal comparison. Results: Connectivity of the network was largely unchanged over time, with most projects and partner organizations connected to a single large component in both time periods. Inter-partner ties formed in HWPP projects were transient. Most community partners were only involved in projects during one strategic time period. Community organizations participating in both time periods were involved in significantly more projects during the first time period than partners participating in the first time period only (Cohen's d = 0.93). Discussion: This approach represents a significant step toward using objective (non-survey) data for large clusters of health partnerships and has implications for translational science in community settings. Considerations for government, funders, and communities are offered. Examining partnerships within health priority areas, orphaned projects, and faculty ties to these networks are areas for future research. PMID:25974413
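
    As a hedged illustration of the dual-mode (two-mode) network approach described above, the sketch below builds a small bipartite project-partner graph with networkx and reports the size of the largest connected component; the edge list is invented, not HWPP data.

    ```python
    # Hedged sketch of a two-mode ("dual mode") partnership network with
    # networkx; the project/organization edge list is invented, not HWPP data.
    import networkx as nx

    edges = [("proj1", "orgA"), ("proj1", "orgB"),
             ("proj2", "orgB"), ("proj3", "orgC")]

    G = nx.Graph()
    G.add_nodes_from({p for p, _ in edges}, bipartite=0)  # funded projects
    G.add_nodes_from({o for _, o in edges}, bipartite=1)  # partner organizations
    G.add_edges_from(edges)

    # The connectivity statistic compared across time periods above: how much
    # of the network sits in a single large component.
    largest = max(nx.connected_components(G), key=len)
    print(f"largest component: {len(largest)}/{G.number_of_nodes()} nodes")
    ```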

  8. A Requirements-Based Exploration of Open-Source Software Development Projects--Towards a Natural Language Processing Software Analysis Framework

    ERIC Educational Resources Information Center

    Vlas, Radu Eduard

    2012-01-01

    Open source projects do have requirements; they are, however, mostly informal, text descriptions found in requests, forums, and other correspondence. Understanding such requirements provides insight into the nature of open source projects. Unfortunately, manual analysis of natural language requirements is time-consuming, and for large projects,…

  9. Low-Latency Embedded Vision Processor (LLEVS)

    DTIC Science & Technology

    2016-03-01

    [Table-of-contents excerpt: 3.2.3 Task 3, Projected Performance Analysis of FPGA-based Vision Processor; 3.2.3.1 Algorithms Latency Analysis; ...Programmable Gate Array Custom Hardware for Real-Time Multiresolution Analysis] ...conduct data analysis for performance projections. The data acquired through measurements, simulation and estimation provide the requisite platform for

  10. Development of a funding, cost, and spending model for satellite projects

    NASA Technical Reports Server (NTRS)

    Johnson, Jesse P.

    1989-01-01

    The need for a predictive budget/funding model is obvious. The current models used by the Resource Analysis Office (RAO) are used to predict the total costs of satellite projects. An effort was conducted to extend the modeling capabilities from total budget analysis to analysis of total budget and budget outlays over time. A statistically based, data-driven methodology was used to derive and develop the model. The budget data for the last 18 GSFC-sponsored satellite projects were analyzed and used to build a funding model which would describe the historical spending patterns. This raw data consisted of dollars spent in each specific year and their 1989 dollar equivalents. This data was converted to the standard format used by the RAO group and placed in a database. A simple statistical analysis was performed to calculate the gross statistics associated with project length and project cost and the conditional statistics on project length and project cost. The modeling approach used is derived from the theory of embedded statistics, which states that properly analyzed data will produce the underlying generating function. The process of funding large-scale projects over extended periods of time is described by Life Cycle Cost Models (LCCM). The data was analyzed to find a model in the generic form of an LCCM. The model developed is based on a Weibull function whose parameters are found by both nonlinear optimization and nonlinear regression. In order to use this model it is necessary to transform the problem from a dollar/time space to a percentage-of-total-budget/time space. This transformation is equivalent to moving to a probability space. By using the basic rules of probability, the validity of both the optimization and the regression steps is ensured. This statistically significant model is then integrated and inverted. The resulting output represents a project schedule which relates the amount of money spent to the percentage of project completion.
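
    The Weibull spending-curve idea translates directly into a short fitting example. The sketch below uses hypothetical data points (not the GSFC database): it fits a Weibull CDF to normalized cumulative spending with scipy and inverts the fit to recover a schedule point.

    ```python
    # Sketch of the Weibull spending-curve idea: fit a Weibull CDF to
    # cumulative spending (fraction of total budget) versus normalized project
    # time, then invert the fit. Data points are hypothetical.
    import numpy as np
    from scipy.optimize import curve_fit

    def weibull_cdf(t, shape, scale):
        return 1.0 - np.exp(-(t / scale) ** shape)

    t = np.array([0.1, 0.25, 0.4, 0.55, 0.7, 0.85, 1.0])         # normalized time
    spent = np.array([0.02, 0.12, 0.33, 0.58, 0.78, 0.92, 0.98])  # budget fraction

    (shape, scale), _ = curve_fit(weibull_cdf, t, spent, p0=(2.0, 0.6))

    # Inverting the CDF: at what fraction of project time is 50% of budget spent?
    t50 = scale * (-np.log(1 - 0.5)) ** (1 / shape)
    print(f"shape={shape:.2f}, scale={scale:.2f}, 50% spend at t={t50:.2f}")
    ```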

  11. Success in health information exchange projects: solving the implementation puzzle.

    PubMed

    Sicotte, Claude; Paré, Guy

    2010-04-01

    Interest in health information exchange (HIE), defined as the use of information technology to support the electronic transfer of clinical information across health care organizations, continues to grow among those pursuing greater patient safety and health care accessibility and efficiency. In this paper, we present the results of a longitudinal multiple-case study of two large-scale HIE implementation projects carried out in real time over 3-year and 2-year periods in Québec, Canada. Data were primarily collected through semi-structured interviews (n=52) with key informants, namely implementation team members and targeted users. These were supplemented with non-participant observation of team meetings and by the analysis of organizational documents. The cross-case comparison was particularly relevant given that project circumstances led to contrasting outcomes: while one project failed, the other was a success. A risk management analysis was performed taking a process view in order to capture the complexity of project implementations as evolving phenomena that are affected by interdependent pre-existing and emergent risks that tend to change over time. The longitudinal case analysis clearly demonstrates that the risk factors were closely intertwined. Systematic ripple effects from one risk factor to another were observed. This risk interdependence evolved dynamically over time, with a snowball effect that rendered a change of path progressively more difficult as time passed. The results of the cross-case analysis demonstrate a direct relationship between the quality of an implementation strategy and project outcomes. Copyright 2010 Elsevier Ltd. All rights reserved.

  12. Case-Exercises, Diagnosis, and Explanations in a Knowledge Based Tutoring System for Project Planning.

    ERIC Educational Resources Information Center

    Pulz, Michael; Lusti, Markus

    PROJECTTUTOR is an intelligent tutoring system that enhances conventional classroom instruction by teaching problem solving in project planning. The domain knowledge covered by the expert module is divided into three functions: structural analysis identifies the activities that make up the project; time analysis computes the earliest and latest…

  13. Analysis of loss of time value during road maintenance project

    NASA Astrophysics Data System (ADS)

    Sudarsana, Dewa Ketut; Sanjaya, Putu Ari

    2017-06-01

    Lane closure is frequently performed in the execution of road maintenance projects. It has negative impacts on road users, such as increased vehicle operating costs and the loss of time value. Nevertheless, analysis of the loss of time value in Indonesia has not previously been carried out. The time value for road users was parameterized using the city/regional minimum wage approach. Vehicle speed before construction was obtained by observation, while the speed during the road maintenance project was predicted by multiplying the pre-construction speed by a speed adjustment factor. For national road maintenance projects on two-lane, two-way urban and interurban roads in Bali province in fiscal year 2015, the loss of time value averaged IDR 12,789,000 per day per road link. The relationship between traffic volume and the road users' loss of time value was obtained with a logarithmic model.
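
    A minimal sketch of fitting a logarithmic model of the form the abstract reports; both arrays below are invented observations, not the Bali data.

    ```python
    # Fitting loss = a + b*ln(volume), the logarithmic form reported above;
    # the observations are invented for illustration.
    import numpy as np

    volume = np.array([500, 1000, 2000, 4000, 8000])   # vehicles/day
    loss = np.array([4.2, 7.9, 11.8, 16.1, 20.3])      # million IDR/day

    b, a = np.polyfit(np.log(volume), loss, 1)          # linear in ln(volume)
    print(f"loss ~ {a:.2f} + {b:.2f} * ln(volume)")
    ```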

  14. The Contribution of Human Factors in Military System Development: Methodological Considerations

    DTIC Science & Technology

    1980-07-01

    [Excerpt: 1. Risk/Uncertainty Analysis - Project Scoring - Utility Scales - Relevance Tree Techniques (Reverse Factor Analysis); 2. Computer Simulation] ...effectiveness of mathematical models for R&D project selection. Management Science, April 1973, 18. ... Souder, W.E., A scoring methodology for... [index-term excerpt: PROFICIENCY - test scores (written); RADIATION - radiation effects, aircrew performance in radiation environments; REACTION TIME - (time

  15. An IS Project Management Course Project

    ERIC Educational Resources Information Center

    Frank, Ronald L.

    2010-01-01

    Information Systems curricula should provide project management (PM) theory, current practice, and hands-on experience. The schedule usually does not allow time in Analysis and Design courses for development oriented project management instruction other than a short introduction. Similarly, networking courses usually don't put project management…

  16. Application fuzzy multi-attribute decision analysis method to prioritize project success criteria

    NASA Astrophysics Data System (ADS)

    Phong, Nguyen Thanh; Quyen, Nguyen Le Hoang Thuy To

    2017-11-01

    Project success is a foundation for a project owner to manage and control not only the current project but also future potential projects in construction companies. However, identifying the key success criteria for evaluating a particular project in real practice is a challenging task. Normally, it depends on many factors, such as the expectations of the project owner and stakeholders, the triple constraints of the project (cost, time, quality), and the company's mission, vision, and objectives. Traditional decision-making methods for measuring project success are usually based on the subjective opinions of panel experts, resulting in irrational and inappropriate decisions. Therefore, this paper introduces a multi-attribute decision analysis method (MADAM) for weighting project success criteria by using a fuzzy Analytical Hierarchy Process approach. It is found that this method is useful when dealing with imprecise and uncertain human judgments in evaluating project success criteria. Moreover, this research also suggests that although cost, time, and quality are the three classic project success criteria, the satisfaction of the project owner and the acceptance of project stakeholders with the completed project are the most important criteria for project success evaluation in Vietnam.
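
    As a rough illustration of fuzzy-AHP weighting (here via Buckley's geometric-mean method, one common variant; the paper's exact scheme may differ), the sketch below weights three hypothetical criteria from an invented triangular-fuzzy comparison matrix.

    ```python
    # Fuzzy-AHP weighting sketch (Buckley's geometric-mean variant). Three
    # hypothetical criteria; pairwise comparisons are triangular fuzzy
    # numbers (l, m, u), illustrative only.
    import numpy as np

    criteria = ["cost", "time", "quality"]
    A = [  # A[i][j] compares criterion i to j
        [(1, 1, 1),       (1, 2, 3),     (2, 3, 4)],
        [(1/3, 1/2, 1),   (1, 1, 1),     (1, 2, 3)],
        [(1/4, 1/3, 1/2), (1/3, 1/2, 1), (1, 1, 1)],
    ]

    # Fuzzy geometric mean of each row, component-wise over (l, m, u).
    geo = np.array([[np.prod([row[j][k] for j in range(3)]) ** (1 / 3)
                     for k in range(3)] for row in A])

    # Fuzzy weights: divide (l, m, u) by the reversed column sums (u, m, l),
    # then defuzzify by averaging and normalize to crisp weights.
    fuzzy_w = geo / geo.sum(axis=0)[::-1]
    crisp = fuzzy_w.mean(axis=1)
    crisp /= crisp.sum()

    for name, w in zip(criteria, crisp):
        print(f"{name}: {w:.3f}")
    ```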

  17. A human factors methodology for real-time support applications

    NASA Technical Reports Server (NTRS)

    Murphy, E. D.; Vanbalen, P. M.; Mitchell, C. M.

    1983-01-01

    A general approach to the human factors (HF) analysis of new or existing projects at NASA/Goddard is delineated. Because the methodology evolved from HF evaluations of the Mission Planning Terminal (MPT) and the Earth Radiation Budget Satellite Mission Operations Room (ERBS MOR), it is directed specifically to the HF analysis of real-time support applications. Major topics included for discussion are the process of establishing a working relationship between the Human Factors Group (HFG) and the project, orientation of HF analysts to the project, human factors analysis and review, and coordination with major cycles of system development. Sub-topics include specific areas for analysis and appropriate HF tools. Management support functions are outlined. References provide a guide to sources of further information.

  18. Integrated Cost and Schedule using Monte Carlo Simulation of a CPM Model - 12419

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hulett, David T.; Nosbisch, Michael R.

    This discussion of the recommended practice (RP) 57R-09 of AACE International defines the integrated analysis of schedule and cost risk to estimate the appropriate level of cost and schedule contingency reserve on projects. The main contribution of this RP is to include the impact of schedule risk on cost risk and hence on the need for cost contingency reserves. Additional benefits include the prioritizing of the risks to cost, some of which are risks to schedule, so that risk mitigation may be conducted in a cost-effective way; scatter diagrams of time-cost pairs for developing joint targets of time and cost; and probabilistic cash flow, which shows cash flow at different levels of certainty. Integrating cost and schedule risk into one analysis based on the project schedule loaded with costed resources from the cost estimate provides both: (1) more accurate cost estimates than if the schedule risk were ignored or incorporated only partially, and (2) an illustration of the importance of schedule risk to cost risk when the durations of activities using labor-type (time-dependent) resources are risky. Many activities such as detailed engineering, construction or software development are mainly conducted by people who need to be paid even if their work takes longer than scheduled. Level-of-effort resources, such as the project management team, are extreme examples of time-dependent resources, since if the project duration exceeds its planned duration the cost of these resources will increase over their budgeted amount. The integrated cost-schedule risk analysis is based on: a high-quality CPM schedule with logic tight enough so that it will provide the correct dates and critical paths during simulation automatically without manual intervention; a contingency-free estimate of project costs that is loaded on the activities of the schedule; resolution of inconsistencies between the cost estimate and the schedule that often creep into those documents as project execution proceeds; good-quality risk data that are usually collected in risk interviews of the project team, management and others knowledgeable in the risk of the project (the risks from the risk register are used as the basis of the risk data in the risk driver method, which is based on the fundamental principle that identifiable risks drive overall cost and schedule risk); and a Monte Carlo simulation software program that can simulate schedule risk, burn-rate risk and time-independent resource risk. The results include the standard histograms and cumulative distributions of possible cost and time results for the project. However, by simulating both cost and time simultaneously we can collect the cost-time pairs of results and hence show the scatter diagram ('football chart') that indicates the joint probability of finishing on time and on budget. Also, we can derive the probabilistic cash flow for comparison with the time-phased project budget. Finally, the risks to schedule completion and to cost can be prioritized, say at the P-80 level of confidence, to help focus the risk mitigation efforts. If the cost and schedule estimates including contingency reserves are not acceptable to the project stakeholders, the project team should conduct risk mitigation workshops and studies, deciding which risk mitigation actions to take, and re-run the Monte Carlo simulation to determine the possible improvement to the project's objectives. Finally, it is recommended that the contingency reserves of cost and of time, calculated at a level that represents an acceptable degree of certainty and uncertainty for the project stakeholders, be added as a resource-loaded activity to the project schedule for strategic planning purposes. The risk analysis described in this paper is correct only for the current plan, represented by the schedule. The project contingency reserves of time and cost that are the main results of this analysis apply if that plan is to be followed. Of course, project managers have the option of re-planning and re-scheduling in the face of new facts, in part by mitigating risk. This analysis identifies the high-priority risks to cost and to schedule, which assists the project manager in planning further risk mitigation. Some project managers reject the results and argue that they cannot possibly be so late or so overrun. Those project managers may be wasting an opportunity to mitigate risk and get a more favorable outcome. (authors)
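
    A toy version of the integrated cost-schedule simulation described in this record: sample risky durations, roll them up through a (here trivially serial) CPM network, and cost time-dependent resources by realized duration. The network, rates and distributions are invented, not from the RP.

    ```python
    # Toy integrated cost-schedule Monte Carlo in the spirit of RP 57R-09;
    # all numbers are invented for illustration.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 5000

    design = rng.triangular(20, 30, 55, n)    # risky durations, in days
    build = rng.triangular(40, 60, 110, n)
    test = rng.triangular(10, 15, 35, n)

    duration = design + build + test          # serial network: simple sum
    day_rate = 8_000                          # time-dependent (labor-type) $/day
    fixed = 250_000                           # time-independent costs, $
    cost = fixed + day_rate * duration

    # The (duration, cost) pairs are the "football chart" data; P-80 quantiles
    # indicate contingency at 80% confidence.
    print(f"P-80 finish: {np.percentile(duration, 80):.0f} days, "
          f"P-80 cost: ${np.percentile(cost, 80):,.0f}")
    ```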

  19. Anomalies and contradictions in an airport construction project: a historical analysis based on Cultural-Historical Activity Theory.

    PubMed

    Lopes, Manoela Gomes Reis; Vilela, Rodolfo Andrade de Gouveia; Querol, Marco Antônio Pereira

    2018-02-19

    Large construction projects involve the functioning of a complex activity system (AS) in network format. Anomalies such as accidents, delays, reworks, etc., can be explained by contradictions that emerge historically in the system. The aim of this study was to analyze the history of an airport construction project to understand the current contradictions and anomalies in the AS and how they emerged. A case study was conducted for this purpose, combining Collective Work Analysis, interviews, observations, and analysis of documents that provided the basis for sessions in the Change Laboratory, where a participant timeline was elaborated with the principal events during the construction project. Based on the timeline, a historical analysis of the airport's AS revealed critical historical events and contradictions that explained the anomalies that occurred during the project. The analysis showed that the airport had been planned for construction with politically determined deadlines that were insufficient and inconsistent with the project's complexity. The choice of the contract modality, which assigned responsibility to a joint venture for all of the project's phases, was another critical historical event, because it allowed launching the construction before a definitive executive project had been drafted. There were also different cultures in companies working together for the first time in the context of a project with time pressures and outsourcing of activities without the necessary coordination. Identifying these contradictions and their historical origins proved essential for understanding the current situation and efforts to prevent similar situations in the future.

  20. Rate Monotonic Analysis for Real-Time Systems

    DTIC Science & Technology

    1991-03-01

    The essential goal of the Rate Monotonic Analysis (RMA) for Real-Time Systems Project at the Software Engineering Institute is to catalyze... improvement in the practice of real-time systems engineering, specifically by increasing the use of rate monotonic analysis and scheduling algorithms. In this

  21. Quarterly Update, January-March 1992

    DTIC Science & Technology

    1992-03-01

    representations to support exploiting that commonality. The project used the Feature-Oriented Domain Analysis (FODA) method, developed by the project in 1990, in... [acronym-list excerpt: FODA, feature-oriented domain analysis; FTP, file transfer protocol] ...concentrations, and product market share for 23 countries. Along with other SEI staff, members of the Rate Monotonic Analysis for Real-Time Systems (RMARTS

  22. Research Project Evaluation-Learnings from the PATHWAYS Project Experience.

    PubMed

    Galas, Aleksander; Pilat, Aleksandra; Leonardi, Matilde; Tobiasz-Adamczyk, Beata

    2018-05-25

    Every research project faces challenges regarding how to achieve its goals in a timely and effective manner. The purpose of this paper is to present a project evaluation methodology gathered during the implementation of the Participation to Healthy Workplaces and Inclusive Strategies in the Work Sector (the EU PATHWAYS Project). The PATHWAYS project involved multiple countries and multi-cultural aspects of re/integrating chronically ill patients into labor markets in different countries. This paper describes the project's key evaluation issues, including: (1) purposes, (2) advisability, (3) tools, (4) implementation, and (5) possible benefits, and presents the advantages of continuous monitoring with a project evaluation tool that assesses structure and resources, process, management and communication, achievements, and outcomes. The project used a mixed evaluation approach that included Strengths (S), Weaknesses (W), Opportunities (O), and Threats (T) (SWOT) analysis. A methodology for the evaluation of longitudinal EU projects is described. The evaluation process made it possible to highlight strengths and weaknesses and revealed good coordination and communication between project partners, as well as some key issues such as the need for a shared glossary covering areas investigated by the project, problematic issues related to the involvement of stakeholders from outside the project, and issues with timing. Numerical SWOT analysis showed improvement in project performance over time. The proportion of project partners participating in the evaluation varied from 100% to 83.3%. There is a need for the implementation of a structured evaluation process in multidisciplinary projects involving different stakeholders in diverse socio-environmental and political conditions. Based on the PATHWAYS experience, a clear monitoring methodology is suggested as essential in every multidisciplinary research project.

  23. The TimeStudio Project: An open source scientific workflow system for the behavioral and brain sciences.

    PubMed

    Nyström, Pär; Falck-Ytter, Terje; Gredebäck, Gustaf

    2016-06-01

    This article describes a new open source scientific workflow system, the TimeStudio Project, dedicated to the behavioral and brain sciences. The program is written in MATLAB and features a graphical user interface for the dynamic pipelining of computer algorithms developed as TimeStudio plugins. TimeStudio includes both a set of general plugins (for reading data files, modifying data structures, visualizing data structures, etc.) and a set of plugins specifically developed for the analysis of event-related eyetracking data as a proof of concept. It is possible to create custom plugins to integrate new or existing MATLAB code anywhere in a workflow, making TimeStudio a flexible workbench for organizing and performing a wide range of analyses. The system also features an integrated sharing and archiving tool for TimeStudio workflows, which can be used to share workflows both during the data analysis phase and after scientific publication. TimeStudio thus facilitates the reproduction and replication of scientific studies, increases the transparency of analyses, and reduces individual researchers' analysis workload. The project website (http://timestudioproject.com) contains the latest releases of TimeStudio, together with documentation and user forums.

  24. ANALYSIS OF COSTS FOR THE TREATMENT OF DENTAL FLUOROSIS

    EPA Science Inventory

    The research project was initiated to conduct a cost/benefit analysis for those communities whose fluoride levels exceed two times the optimum and to determine the economic impact of defluoridating community drinking water. The initial data used in this study were from a project ...

  25. Environmental regulation of plant gene expression: an RT-qPCR laboratory project for an upper-level undergraduate biochemistry or molecular biology course.

    PubMed

    Eickelberg, Garrett J; Fisher, Alison J

    2013-01-01

    We present a novel laboratory project employing "real-time" RT-qPCR to measure the effect of environment on the expression of the FLOWERING LOCUS C gene, a key regulator of floral timing in Arabidopsis thaliana plants. The project requires four 3-hr laboratory sessions and is aimed at upper-level undergraduate students in biochemistry or molecular biology courses. The project provides students with hands-on experience with RT-qPCR, the current "gold standard" for gene expression analysis, including detailed data analysis using the common 2^(-ΔΔCT) method. Moreover, it provides a convenient starting point for many inquiry-driven projects addressing diverse questions concerning ecological biochemistry, naturally occurring genetic variation, developmental biology, and the regulation of gene expression in nature. Copyright © 2013 Wiley Periodicals, Inc.
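
    The 2^(-ΔΔCT) calculation itself is small enough to show inline. The sketch below uses invented Ct values and a hypothetical reference gene; only the arithmetic follows the standard method.

    ```python
    # Standard 2^(-ddCt) arithmetic with invented Ct values. FLC is the
    # target; the reference gene (here imagined as UBQ10) is the normalizer.
    dct_cold = 24.1 - 18.0   # Ct(FLC) - Ct(reference), cold-grown plants
    dct_warm = 21.6 - 17.9   # same difference for warm-grown (control) plants

    ddct = dct_cold - dct_warm
    fold_change = 2 ** (-ddct)
    print(f"relative FLC expression, cold vs warm: {fold_change:.2f}-fold")
    ```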

  26. Enabling Predictive Simulation and UQ of Complex Multiphysics PDE Systems by the Development of Goal-Oriented Variational Sensitivity Analysis and a-Posteriori Error Estimation Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Estep, Donald

    2015-11-30

    This project addressed the challenge of predictive computational analysis of strongly coupled, highly nonlinear multiphysics systems characterized by multiple physical phenomena that span a large range of length- and time-scales. Specifically, the project was focused on computational estimation of numerical error and sensitivity analysis of computational solutions with respect to variations in parameters and data. In addition, the project investigated the use of accurate computational estimates to guide efficient adaptive discretization. The project developed, analyzed and evaluated new variational adjoint-based techniques for integration, model, and data error estimation/control and sensitivity analysis, in evolutionary multiphysics multiscale simulations.

  27. Fast Missile Boat Project Planning using CPM and What If Analysis Method

    NASA Astrophysics Data System (ADS)

    Silvianita; Firmansyah, R.; Rosyid, D. M.; Suntoyo; Chamelia, D. M.

    2018-03-01

    This paper discusses the analysis of fast missile boat project planning with CPM and What-If analysis. Poor performance can cause project delays and cost overruns, two things that often occur in project management. Scheduling has an important role in controlling the project, and good scheduling is required because it will affect product quality. Scheduling is useful for managing activities so that the project completion time can be as short as possible, and for determining whether the costs of the project are in optimum condition. Project planning can be done to determine whether the project is already in an optimum condition. Alternatives that can be used to overcome project delay are to increase labor and equipment; the next step is to look for the optimum solution, taking into account the duration and gain of each alternative.
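
    For reference, the CPM computation the paper relies on reduces to a forward and a backward pass. The sketch below runs both over a small hypothetical activity network (not the boat project's schedule) and reports float; zero-float activities form the critical path.

    ```python
    # Compact CPM: forward and backward passes over a hypothetical network.
    durations = {"A": 3, "B": 5, "C": 2, "D": 4}
    preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}
    order = ["A", "B", "C", "D"]                    # topological order

    es, ef = {}, {}
    for a in order:                                 # forward pass
        es[a] = max((ef[p] for p in preds[a]), default=0)
        ef[a] = es[a] + durations[a]

    end = max(ef.values())
    succ = {a: [b for b in order if a in preds[b]] for a in order}

    ls, lf = {}, {}
    for a in reversed(order):                       # backward pass
        lf[a] = min((ls[s] for s in succ[a]), default=end)
        ls[a] = lf[a] - durations[a]

    for a in order:
        slack = ls[a] - es[a]
        print(f"{a}: ES={es[a]} EF={ef[a]} LS={ls[a]} LF={lf[a]} float={slack}"
              + (" (critical)" if slack == 0 else ""))
    ```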

  28. Accelerated Optical Projection Tomography Applied to In Vivo Imaging of Zebrafish

    PubMed Central

    Correia, Teresa; Yin, Jun; Ramel, Marie-Christine; Andrews, Natalie; Katan, Matilda; Bugeon, Laurence; Dallman, Margaret J.; McGinty, James; Frankel, Paul; French, Paul M. W.; Arridge, Simon

    2015-01-01

    Optical projection tomography (OPT) provides a non-invasive 3-D imaging modality that can be applied to longitudinal studies of live disease models, including in zebrafish. Current limitations include the requirement of a minimum number of angular projections for reconstruction of reasonable OPT images using filtered back projection (FBP), which is typically several hundred, leading to acquisition times of several minutes. It is highly desirable to decrease the number of required angular projections to decrease both the total acquisition time and the light dose to the sample. This is particularly important to enable longitudinal studies, which involve measurements of the same fish at different time points. In this work, we demonstrate that the use of an iterative algorithm to reconstruct sparsely sampled OPT data sets can provide useful 3-D images with 50 or fewer projections, thereby significantly decreasing the minimum acquisition time and light dose while maintaining image quality. A transgenic zebrafish embryo with fluorescent labelling of the vasculature was imaged to acquire densely sampled (800 projections) and under-sampled data sets of transmitted and fluorescence projection images. The under-sampled OPT data sets were reconstructed using an iterative total variation-based image reconstruction algorithm and compared against FBP reconstructions of the densely sampled data sets. To illustrate the potential for quantitative analysis following rapid OPT data acquisition, a Hessian-based method was applied to automatically segment the reconstructed images to select the vasculature network. Results showed that 3-D images of the zebrafish embryo and its vasculature of sufficient visual quality for quantitative analysis can be reconstructed using the iterative algorithm from only 32 projections—achieving up to 28 times improvement in imaging speed and leading to total acquisition times of a few seconds. PMID:26308086

  29. Building Information Modelling (BIM) and Unmanned Aerial Vehicle (UAV) technologies in infrastructure construction project management and delay and disruption analysis

    NASA Astrophysics Data System (ADS)

    Vacanas, Yiannis; Themistocleous, Kyriacos; Agapiou, Athos; Hadjimitsis, Diofantos

    2015-06-01

    Time in infrastructure construction projects has always been a fundamental issue, from the inception of a project, through the construction process, and often after completion and delivery. In a typical construction contract, time-related matters such as the completion date and possible delays are among the most important issues dealt with by the contract provisions. In the event of delay, there are usually provisions for an extension-of-time award to the contractor, with possible reimbursement for the extra cost and expenses caused by this extension of the contract duration. In case the contractor is not entitled to an extension of time, the owner will possibly be entitled to amounts as compensation for the time he was prevented from using his development. Even in the event of completion within the time agreed, under certain circumstances a contractor may have claims for reimbursement of extra costs incurred due to induced acceleration measures he had to take in order to mitigate disruption caused to the progress of the works by the owner or his representatives. Depending on the size of the project and the agreement amount, these reimbursement sums may be extremely high. Therefore, innovative methods that exploit new technologies are essential for effective project management, the avoidance of delays, and delay analysis and mitigation; moreover, methods for efficiently collecting information during the construction process are required, so that disputes regarding time are avoided or resolved in a quick and fair manner. This paper explores the state of the art of existing uses of Building Information Modelling (BIM) and Unmanned Aerial Vehicle (UAV) technologies in the construction industry in general. Moreover, the paper considers the prospect of using BIM technology in conjunction with UAV technology for efficient and accurate as-built data collection and illustration of work progress during an infrastructure construction project, in order to achieve more effective project management, record keeping and delay analysis.

  30. Rt-Space: A Real-Time Stochastically-Provisioned Adaptive Container Environment

    DTIC Science & Technology

    2017-08-04

    This project was directed at component-based soft real-time (SRT) systems implemented on multicore platforms. To facilitate... upon average-case or near-average-case task execution times. The main intellectual contribution of this project was the development of methods for... allocating CPU time to components and associated analysis for validating SRT correctness.

  31. Analysis of extreme values of the economic efficiency indicators of transport infrastructure projects

    NASA Astrophysics Data System (ADS)

    Korytárová, J.; Vaňková, L.

    2017-10-01

    This paper builds on the authors' previous research into the evaluation of the economic efficiency of transport infrastructure projects using the economic efficiency indicators NPV, IRR and BCR. The values of the indicators and the subsequent outputs of the sensitivity analysis show extremely favourable values in some cases. The authors analysed these indicators down to the level of the input variables and examined which inputs have a larger share in these extreme values. The NCF for the calculation of the above-mentioned ratios is created by benefits that arise as the difference between the zero and investment options of the project (savings in travel and operating costs, savings in travel time costs, reduction in accident costs and savings in exogenous costs) as well as total agency costs. Savings in travel time costs, which contribute to the overall utility of projects by more than 70%, appear to be the most important benefit over a long time horizon; this is the reason why this benefit is emphasized. The outcome of the article shows how the particular basic variables contribute to the overall robustness of the economic efficiency of these projects.
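
    The three indicators named above are easy to compute for a toy cash flow. The sketch below uses invented cost and benefit streams (including travel-time savings, the dominant benefit noted above) and finds the IRR by bisection.

    ```python
    # NPV, IRR and BCR for an invented transport-project cash flow; IRR found
    # by bisection on the NPV function. Streams and rate are illustrative only.
    import numpy as np

    rate = 0.05
    costs = np.array([100.0, 20.0, 5.0, 5.0, 5.0])      # agency costs per year
    benefits = np.array([0.0, 30.0, 45.0, 50.0, 55.0])  # incl. travel-time savings
    years = np.arange(len(costs))
    disc = (1 + rate) ** -years

    npv = np.sum((benefits - costs) * disc)
    bcr = np.sum(benefits * disc) / np.sum(costs * disc)

    def npv_at(r):
        return np.sum((benefits - costs) / (1 + r) ** years)

    lo, hi = 0.0, 1.0          # NPV decreases in r here: bisect for the root
    for _ in range(100):
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if npv_at(mid) > 0 else (lo, mid)

    print(f"NPV={npv:.1f}, BCR={bcr:.2f}, IRR={lo:.1%}")
    ```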

  32. An Analysis of Project Performance for Partnering Projects in the U.S. Army Corps of Engineers

    DTIC Science & Technology

    1992-12-01

    AN ANALYSIS OF PROJECT PERFORMANCE FOR PARTNERING PROJECTS IN THE U.S. ARMY CORPS OF ENGINEERS, by DAVID CHARLES WESTON, B.S. Thesis presented to the Faculty of the Graduate School of The University of Texas... ACKNOWLEDGEMENTS: I would like to acknowledge and thank those members of the U.S. Army Corps of Engineers who spent their time and effort collecting and

  33. Naturalizing the Future in Factual Discourse: A Critical Linguistic Analysis of a Projected Event.

    ERIC Educational Resources Information Center

    Dunmire, Patricia L.

    1997-01-01

    Examines the linguistic processes through which a projected effect is constructed within factual discourse. Applies critical linguistic analysis to coverage of the 1990 Gulf War in the "New York Times" and "Washington Post." Expands on work in critical linguistics and demonstrates how political interests underlying newspaper…

  34. Analysis of Florida Department Of Transportation Transit Corridor Program/Projects: Technical Memorandum Number Two; Summary of Transit Corridor Projects Status/Strengths and Weaknesses/Lessons Learned

    DOT National Transportation Integrated Search

    2001-03-01

    The purpose of this project is to review and summarize the performance of transit corridor projects funded by the FDOT during the time period from July 1, 1993 through December 31, 1999. Through surveys and interviews, CUTR will summarize in a "lesso...

  35. Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.

    PubMed

    Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias

    2016-01-01

    To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.
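
    A minimal interrupted-time-series example: segmented OLS regression with level-change and trend-change terms at the intervention month, run on a synthetic series (the study's decision-support data are not reproduced here).

    ```python
    # Segmented-regression ITS sketch on a synthetic monthly series: pre-trend,
    # plus a level change and a trend change at the intervention month.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    t = np.arange(24)                            # months 0..23
    post = (t >= 12).astype(float)               # intervention at month 12
    t_after = np.where(t >= 12, t - 12, 0.0)

    # Synthetic outcome: baseline 50, slight upward drift, then a drop.
    y = 50 + 0.2 * t - 5 * post - 0.5 * t_after + rng.normal(0, 1, t.size)

    X = sm.add_constant(np.column_stack([t, post, t_after]))
    fit = sm.OLS(y, X).fit()
    print(fit.params)  # [intercept, pre-trend, level change, trend change]
    ```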

  36. Risk assessment framework on time impact: Infrastructure projects in soft soil during construction stage

    NASA Astrophysics Data System (ADS)

    Low, W. W.; Wong, K. S.; Lee, J. L.

    2018-04-01

    With the growth of economy and population, there is an increase in infrastructure construction projects. As such, it is unavoidable to have construction projects on soft soil. Without a proper risk management plan, construction projects are vulnerable to different types of risks that can negatively affect a project's time, cost and quality. A literature review showed that little or no research has focused on risk assessment of infrastructure projects on soft soil. Hence, the aim of this research is to propose a risk assessment framework for infrastructure projects on soft soil during the construction stage. This research focused on the impact of risks on project time and on internal risk factors. The research method was the Analytical Hierarchy Process, and the sample population comprised industry experts with experience in infrastructure projects. The analysis showed that, among internal factors, the five most significant risks to the time element are lack of special equipment, potential contractual disputes and claims, shortage of skilled workers, delay or lack of materials supply, and insolvency of the contractor/sub-contractor. The results indicate that resource risk factors play a critical role in the project time frame of infrastructure projects on soft soil during the construction stage.

  37. Financial and tax risks at implementation of "Chayanda-Lensk" section of "Sila Sibiri" gas transportation system construction project

    NASA Astrophysics Data System (ADS)

    Sharf, I. V.; Chukhareva, N. V.; Kuznetsova, L. P.

    2014-08-01

    The high social and economic importance of large-scale projects for the gasification of East Siberian regions of Russia and for diversifying gas exports poses the problem of a comprehensive risk analysis of the project. This article discusses the various types of risks that could significantly affect the timing of implementation and the effectiveness of the project for the construction of the first line of "Sila Sibiri", the "Chayanda-Lensk" section. Special attention is paid to the financial and tax aspects of the project. A graphically presented analysis of the dynamics of financial indicators reflects certain periods of effectiveness in implementing the project. The authors also discuss the possible causes and consequences of the risks.

  38. The analysis and forecasting of male cycling time trial records established within England and Wales.

    PubMed

    Dyer, Bryce; Hassani, Hossein; Shadi, Mehran

    2016-01-01

    The format of cycling time trials in England, Wales and Northern Ireland involves riders competing individually over several fixed race distances of 10-100 miles in length and over time-constrained formats of 12 and 24 h in duration. Drawing on data provided by the national governing body that covers the regions of England and Wales, an analysis of six male competition record progressions was undertaken to illustrate their progression. Future forecasts are then projected through use of the Singular Spectrum Analysis technique. This method has not been applied to sport-based time series data before. All six records have seen progressive improvement and are non-linear in nature. Five records saw their highest rate of record change during the 1950-1969 period. While the frequency of new records has generally declined since this period, the magnitude of performance improvement has generally increased. The Singular Spectrum Analysis technique successfully provided forecast projections in the short to medium term with a high level of fit to the time series data.
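
    A bare-bones sketch of the SSA machinery referenced above: embed a synthetic record series in a trajectory matrix, truncate its SVD, and Hankelize back to a smoothed series. Forecasting, as in the paper, would extend this with a linear recurrence fitted to the reconstructed components.

    ```python
    # Bare-bones SSA on a synthetic, falling "record time" series; the data
    # are invented, not the governing body's records.
    import numpy as np

    rng = np.random.default_rng(2)
    series = np.linspace(60, 45, 40) + rng.normal(0, 0.8, 40)

    L = 12                                        # window length
    K = series.size - L + 1
    X = np.column_stack([series[i:i + L] for i in range(K)])  # trajectory matrix

    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    r = 2                                         # keep leading components
    Xr = (U[:, :r] * s[:r]) @ Vt[:r]

    trend = np.zeros(series.size)                 # diagonal averaging
    counts = np.zeros(series.size)
    for i in range(L):
        for j in range(K):
            trend[i + j] += Xr[i, j]
            counts[i + j] += 1
    trend /= counts
    print(trend[-5:])                             # smoothed tail of the series
    ```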

  39. Nonlinear models for estimating GSFC travel requirements

    NASA Technical Reports Server (NTRS)

    Buffalano, C.; Hagan, F. J.

    1974-01-01

    A methodology is presented for estimating travel requirements for a particular period of time. Travel models were generated using nonlinear regression analysis techniques on a database of FY-72 and FY-73 information from 79 GSFC projects. Although the subject matter relates to GSFC activities, the type of analysis used and the manner of selecting the relevant variables would be of interest to other NASA centers, government agencies, private corporations and, in general, any organization with a significant travel budget. Models were developed for each of six types of activity: flight projects (in-house and out-of-house), experiments on non-GSFC projects, international projects, ART/SRT, data analysis, advanced studies, tracking and data, and indirects.

  40. Graduate Student Project: Employer Operations Management Analysis

    ERIC Educational Resources Information Center

    Fish, Lynn A.

    2008-01-01

    Part-time graduate students at an Association to Advance Collegiate Schools of Business-accredited college complete a unique project by applying operations management concepts to their current employer. More than 92% of 368 graduates indicated that this experiential project was a positive learning experience, and results show a positive impact on…

  41. Experiential Learning through Integrated Project Work: An Example from Soil Science.

    ERIC Educational Resources Information Center

    Mellor, Antony

    1991-01-01

    Describes the planning, implementation, and evaluation of an integrated student soil science project. Reports that the course was designed to develop student-centered approaches to learning and to develop transferable skills and personal qualities at the same time. Explains that the project included fieldwork, laboratory analysis, data…

  42. Projected Changes in Hydrological Extremes in a Cold Region Watershed: Sensitivity of Results to Statistical Methods of Analysis

    NASA Astrophysics Data System (ADS)

    Dibike, Y. B.; Eum, H. I.; Prowse, T. D.

    2017-12-01

    Flows originating from alpine-dominated cold-region watersheds typically experience extended winter low flows followed by spring snowmelt and summer rainfall-driven high flows. In a warmer climate, there will be a temperature-induced shift in precipitation from snow towards rain, as well as changes in snowmelt timing, affecting the frequency of extreme high- and low-flow events, which could significantly alter ecosystem services. This study examines the potential changes in the frequency and severity of hydrologic extremes in the Athabasca River watershed in Alberta, Canada based on the Variable Infiltration Capacity (VIC) hydrologic model and selected, statistically downscaled climate change scenario data from the latest Coupled Model Intercomparison Project (CMIP5). The sensitivity of these projected changes is also examined by applying different extreme-flow analysis methods. The hydrological model projections show an overall increase in mean annual streamflow in the watershed and a corresponding shift of the freshet timing to an earlier period. Most of the streams are projected to experience increases during the winter and spring seasons and decreases during the summer and early fall seasons, with overall projected increases in extreme high flows, especially for low-frequency events. While the middle and lower parts of the watershed are characterised by projected increases in extreme high flows, the high-elevation alpine region is mainly characterised by corresponding decreases in extreme low-flow events. However, the magnitude of projected changes in extreme flow varies over a wide range, especially for low-frequency events, depending on the climate scenario and period of analysis, and sometimes in a nonlinear way. Nonetheless, the sensitivity of the projected changes to the statistical method of analysis is found to be relatively small compared to the inter-model variability.

  43. 75 FR 47585 - Jonathan and Jayne Chase; Notice of Application Tendered for Filing With the Commission and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-06

    .... e. Name of Project: Troy Hydroelectric Project. f. Location: On the Missisquoi River, in the Town of Troy, Orleans County, Vermont. The project would not occupy lands of the United States. g. Filed... application is not ready for environmental analysis at this time. n. The Troy Project would consist of: (1...

  44. The diversity and evolution of ecological and environmental citizen science.

    PubMed

    Pocock, Michael J O; Tweddle, John C; Savage, Joanna; Robinson, Lucy D; Roy, Helen E

    2017-01-01

    Citizen science-the involvement of volunteers in data collection, analysis and interpretation-simultaneously supports research and public engagement with science, and its profile is rapidly rising. Citizen science represents a diverse range of approaches, but until now this diversity has not been quantitatively explored. We conducted a systematic internet search and discovered 509 environmental and ecological citizen science projects. We scored each project for 32 attributes based on publicly obtainable information and used multiple factor analysis to summarise this variation to assess citizen science approaches. We found that projects varied according to their methodological approach from 'mass participation' (e.g. easy participation by anyone anywhere) to 'systematic monitoring' (e.g. trained volunteers repeatedly sampling at specific locations). They also varied in complexity from approaches that are 'simple' to those that are 'elaborate' (e.g. provide lots of support to gather rich, detailed datasets). There was a separate cluster of entirely computer-based projects but, in general, we found that the range of citizen science projects in ecology and the environment showed continuous variation and cannot be neatly categorised into distinct types of activity. While the diversity of projects begun in each time period (pre 1990, 1990-99, 2000-09 and 2010-13) has not increased, we found that projects tended to have become increasingly different from each other as time progressed (possibly due to changing opportunities, including technological innovation). Most projects were still active so consequently we found that the overall diversity of active projects (available for participation) increased as time progressed. Overall, understanding the landscape of citizen science in ecology and the environment (and its change over time) is valuable because it informs the comparative evaluation of the 'success' of different citizen science approaches. Comparative evaluation provides an evidence-base to inform the future development of citizen science activities.

  45. Analysis of interactions among barriers in project risk management

    NASA Astrophysics Data System (ADS)

    Dandage, Rahul V.; Mantha, Shankar S.; Rane, Santosh B.; Bhoola, Vanita

    2018-03-01

    In the context of the scope, time, cost, and quality constraints, failure is not uncommon in project management. While small projects have a 70% chance of success, large projects virtually have no chance of meeting the quadruple constraints. While there is no dearth of research on project risk management, the manifestation of barriers to project risk management is a less-explored topic. The success of project management is oftentimes based on the understanding of barriers to effective risk management, application of an appropriate risk management methodology, proactive leadership to avoid barriers, workers' attitudes, adequate resources, organizational culture, and involvement of top management. This paper presents various risk categories and barriers to risk management in domestic and international projects, identified through a literature survey and feedback from project professionals. After analysing the various modelling methods used in the project risk management literature, interpretive structural modelling (ISM) and MICMAC analysis were used to analyse interactions among the barriers and prioritize them. The analysis indicates that lack of top management support, lack of formal training, and lack of attention to cultural differences are the highest-priority barriers, among many others.
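
    The ISM step can be illustrated compactly: take a binary influence matrix over barriers (invented here, not the paper's data), close it transitively to get the reachability matrix, and read MICMAC driving and dependence power off its row and column sums.

    ```python
    # ISM/MICMAC sketch over an invented barrier influence matrix.
    import numpy as np

    barriers = ["top mgmt support", "formal training",
                "cultural differences", "resources"]
    adj = np.array([[1, 1, 0, 1],        # adj[i][j]=1: barrier i influences j
                    [0, 1, 0, 1],        # (illustrative relations only)
                    [0, 1, 1, 0],
                    [0, 0, 0, 1]])

    reach = adj.copy()
    for _ in range(len(barriers)):       # repeated-squaring-style closure
        reach = ((reach + reach @ reach) > 0).astype(int)

    driving = reach.sum(axis=1)          # MICMAC driving power
    dependence = reach.sum(axis=0)       # MICMAC dependence power
    for b, d, p in zip(barriers, driving, dependence):
        print(f"{b}: driving={d}, dependence={p}")
    ```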

  46. An Analysis of Problems in College Students' Participation in the Western China Program

    ERIC Educational Resources Information Center

    Yumei, Yi

    2008-01-01

    Since its initiation in 2003, the College Student Western China Program has had several satisfying achievements. At the same time, however, problems exist in the project. This article gives a brief analysis of problems encountered in the project from the aspects of publicity and campaign work, plans and schedules, student participation, voluntary…

  47. Critical path method applied to research project planning: Fire Economics Evaluation System (FEES)

    Treesearch

    Earl B. Anderson; R. Stanton Hales

    1986-01-01

    The critical path method (CPM) of network analysis (a) depicts precedence among the many activities in a project by a network diagram; (b) identifies critical activities by calculating their starting, finishing, and float times; and (c) displays possible schedules by constructing time charts. CPM was applied to the development of the Forest Service's Fire...

  8. Advanced Ground Systems Maintenance Physics Models For Diagnostics Project

    NASA Technical Reports Server (NTRS)

    Perotti, Jose M.

    2015-01-01

    The project will use high-fidelity physics models and simulations to simulate real-time operations of cryogenic and other fluid systems and calculate the status/health of those systems. The project enables the delivery of system health advisories to ground system operators. The capability will also be used to conduct planning and analysis of cryogenic system operations. This project will develop and implement high-fidelity physics-based modeling techniques to simulate the real-time operation of cryogenics and other fluids systems and, when compared to the real-time operation of the actual systems, provide assessment of their state. Physics-model-calculated measurements (called “pseudo-sensors”) will be compared to the system real-time data. Comparison results will be utilized to provide systems operators with enhanced monitoring of systems' health and status, identify off-nominal trends, and diagnose system/component failures. This capability can also be used to conduct planning and analysis of cryogenics and other fluid systems designs. This capability will be interfaced with the ground operations command and control system as a part of the Advanced Ground Systems Maintenance (AGSM) project to help assure system availability and mission success. The initial capability will be developed for the Liquid Oxygen (LO2) ground loading systems.

  9. Multi-spacecraft solar energetic particle analysis of FERMI gamma-ray flare events within the HESPERIA H2020 project

    NASA Astrophysics Data System (ADS)

    Tziotziou, Kostas; Malandraki, Olga; Valtonen, Eino; Heber, Bernd; Zucca, Pietro; Klein, Karl-Ludwig; Vainio, Rami; Tsiropoula, Georgia; Share, Gerald

    2017-04-01

    Multi-spacecraft observations of solar energetic particle (SEP) events are important for understanding the acceleration processes and the interplanetary propagation of particles released during eruptive events. In this work, we have carefully studied 25 gamma-ray flare events observed by FERMI and investigated possible associations with SEP-related events observed with STEREO and L1 spacecraft in the heliosphere. A data-driven velocity dispersion analysis (VDA) and Time-Shifting Analysis (TSA) are used for deriving the release times of protons and electrons at the Sun and for comparing them with the respective times stemming from the gamma-ray event analysis and their X-ray signatures, in an attempt to interconnect the SEPs and Fermi events and better understand the physics involved. Acknowledgements: This project has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement No 637324.
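    The core of a velocity dispersion analysis is compact enough to sketch: if particles of speed v are released at the Sun at time t_release and travel a path length s, their onset time at a spacecraft is t_onset = t_release + s/v, so a linear fit of onset time against 1/v yields the release time (intercept) and path length (slope). The sketch below uses synthetic numbers, not HESPERIA data.

    ```python
    # Velocity dispersion analysis (VDA) sketch: fit onset times versus inverse
    # particle speed to infer the solar release time and path length.
    # All onset times below are synthetic, for illustration only.
    import numpy as np

    c = 299_792.458          # speed of light [km/s]
    AU = 1.495978707e8       # astronomical unit [km]

    beta = np.array([0.35, 0.45, 0.55, 0.65])             # particle speeds v/c
    inv_v = 1.0 / (beta * c)                              # [s/km]
    t_onset = np.array([1811.0, 1431.0, 1189.0, 1021.0])  # onsets [s after t0]

    slope, intercept = np.polyfit(inv_v, t_onset, 1)      # t = t_release + s*(1/v)
    print(f"inferred release time: {intercept:.0f} s after t0")
    print(f"inferred path length:  {slope / AU:.2f} AU")
    ```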

  10. QuickNGS elevates Next-Generation Sequencing data analysis to a new level of automation.

    PubMed

    Wagle, Prerana; Nikolić, Miloš; Frommolt, Peter

    2015-07-01

    Next-Generation Sequencing (NGS) has emerged as a widely used tool in molecular biology. While time and cost for the sequencing itself are decreasing, the analysis of the massive amounts of data remains challenging. Since multiple algorithmic approaches for the basic data analysis have been developed, there is now an increasing need to efficiently use these tools to obtain results in reasonable time. We have developed QuickNGS, a new workflow system for laboratories with the need to analyze data from multiple NGS projects at a time. QuickNGS takes advantage of parallel computing resources, a comprehensive back-end database, and a careful selection of previously published algorithmic approaches to build fully automated data analysis workflows. We demonstrate the efficiency of our new software by a comprehensive analysis of 10 RNA-Seq samples which we can finish in only a few minutes of hands-on time. The approach we have taken is suitable to process even much larger numbers of samples and multiple projects at a time. Our approach considerably reduces the barriers that still limit the usability of the powerful NGS technology and finally decreases the time to be spent before proceeding to further downstream analysis and interpretation of the data.

  11. Wearable Biomedical Measurement Systems for Assessment of Mental Stress of Combatants in Real Time

    PubMed Central

    Seoane, Fernando; Mohino-Herranz, Inmaculada; Ferreira, Javier; Alvarez, Lorena; Buendia, Ruben; Ayllón, David; Llerena, Cosme; Gil-Pita, Roberto

    2014-01-01

    The Spanish Ministry of Defense, through its Future Combatant program, has sought to develop technology aids with the aim of extending combatants' operational capabilities. Within this framework, the ATREC project, funded by the “Coincidente” program, aims at analyzing diverse biometrics to assess the stress levels of combatants through real-time monitoring. This project combines multiple disciplines and fields, including wearable instrumentation, textile technology, signal processing, pattern recognition and psychological analysis of the obtained information. In this work the ATREC project is described, including the different execution phases, the wearable biomedical measurement systems, the experimental setup, and the biomedical signal analysis and speech processing performed. The preliminary results obtained from the analysis of data collected during the first phase of the project are presented, indicating the good classification performance exhibited when using features obtained from electrocardiographic recordings and electrical bioimpedance measurements from the thorax. These results suggest that cardiac and respiration activity offer better biomarkers for the assessment of stress than speech, galvanic skin response or skin temperature when recorded with wearable biomedical measurement systems. PMID:24759113

  12. Wearable biomedical measurement systems for assessment of mental stress of combatants in real time.

    PubMed

    Seoane, Fernando; Mohino-Herranz, Inmaculada; Ferreira, Javier; Alvarez, Lorena; Buendia, Ruben; Ayllón, David; Llerena, Cosme; Gil-Pita, Roberto

    2014-04-22

    The Spanish Ministry of Defense, through its Future Combatant program, has sought to develop technology aids with the aim of extending combatants' operational capabilities. Within this framework, the ATREC project, funded by the "Coincidente" program, aims at analyzing diverse biometrics to assess the stress levels of combatants through real-time monitoring. This project combines multiple disciplines and fields, including wearable instrumentation, textile technology, signal processing, pattern recognition and psychological analysis of the obtained information. In this work the ATREC project is described, including the different execution phases, the wearable biomedical measurement systems, the experimental setup, and the biomedical signal analysis and speech processing performed. The preliminary results obtained from the analysis of data collected during the first phase of the project are presented, indicating the good classification performance exhibited when using features obtained from electrocardiographic recordings and electrical bioimpedance measurements from the thorax. These results suggest that cardiac and respiration activity offer better biomarkers for the assessment of stress than speech, galvanic skin response or skin temperature when recorded with wearable biomedical measurement systems.

  13. Analysis of Time-Series Quasi-Experiments. Final Report.

    ERIC Educational Resources Information Center

    Glass, Gene V.; Maguire, Thomas O.

    The objective of this project was to investigate the adequacy of statistical models developed by G. E. P. Box and G. C. Tiao for the analysis of time-series quasi-experiments: (1) The basic model developed by Box and Tiao is applied to actual time-series experiment data from two separate experiments, one in psychology and one in educational…

  14. 78 FR 38307 - Gresham Municipal Utilities; Notice of Application Tendered for Filing With the Commission and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-26

    .... Date filed: June 10, 2013. d. Applicant: Gresham Municipal Utilities. e. Name of Project: Upper Red Lake Dam Hydroelectric Project. f. Location: On Red River in Shawano County, Wisconsin. No federal... analysis at this time. n. The Upper Red Lake Dam Hydroelectric Project would consist of the following...

  15. Advanced Ground Systems Maintenance Physics Models for Diagnostics Project

    NASA Technical Reports Server (NTRS)

    Harp, Janicce Leshay

    2014-01-01

    The project will use high-fidelity physics models and simulations to simulate real-time operations of cryogenic and other fluid systems and calculate the status/health of those systems. The project enables the delivery of system health advisories to ground system operators. The capability will also be used to conduct planning and analysis of cryogenic system operations.

  16. The ECLSS Advanced Automation Project Evolution and Technology Assessment

    NASA Technical Reports Server (NTRS)

    Dewberry, Brandon S.; Carnes, James R.; Lukefahr, Brenda D.; Rogers, John S.; Rochowiak, Daniel M.; Mckee, James W.; Benson, Brian L.

    1990-01-01

    Viewgraphs on Environmental Control and Life Support System (ECLSS) advanced automation project evolution and technology assessment are presented. Topics covered include: the ECLSS advanced automation project; descriptions of automatic fault diagnosis of ECLSS subsystems; in-line, real-time chemical and microbial fluid analysis; and a description of object-oriented, distributed chemical and microbial modeling of regenerative environmental control systems.

  17. Systems Analysis of Rapid Transit Underground Construction : Volume 2. Sections 6-9 and Appendixes.

    DOT National Transportation Integrated Search

    1974-12-01

    Three San Francisco Bay Area Rapid Transit (BART) projects and two Washington Metropolitan Area Transit Authority (WMATA) projects are analyzed with respect to time schedules, costs, and sensitivity to physical and institutional controls. These data ...

  18. The Integration of Multi-State Clarus Data into Data Visualization Tools

    DOT National Transportation Integrated Search

    2011-12-20

    This project focused on the integration of all Clarus Data into the Regional Integrated Transportation Information System (RITIS) for real-time situational awareness and historical safety data analysis. The initial outcomes of this project are the fu...

  19. 23 CFR 627.3 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION ENGINEERING AND TRAFFIC OPERATIONS VALUE ENGINEERING... construction. Value Engineering (VE) analysis. The systematic process of reviewing and assessing a project by a...; and (3) Reducing the time to develop and deliver the project. Value Engineering (VE) Job Plan. A...

  20. 23 CFR 627.3 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION ENGINEERING AND TRAFFIC OPERATIONS VALUE ENGINEERING... construction. Value Engineering (VE) analysis. The systematic process of reviewing and assessing a project by a...; and (3) Reducing the time to develop and deliver the project. Value Engineering (VE) Job Plan. A...

  1. Hybrid function projective synchronization in complex dynamical networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wei, Qiang; Wang, Xing-yuan, E-mail: wangxy@dlut.edu.cn; Hu, Xiao-peng

    2014-02-15

    This paper investigates hybrid function projective synchronization (HFPS) in complex dynamical networks. When the complex dynamical networks can be synchronized up to an equilibrium or periodic orbit, a hybrid feedback controller is designed so that the different components of each node's state vector can be synchronized up to different desired scaling functions in complex dynamical networks with time delay. HFPS in complex dynamical networks with constant delay and HFPS in complex dynamical networks with time-varying coupling delay are studied, respectively. Finally, numerical simulations show the effectiveness of the theoretical analysis.

  2. Household survey analysis of the impact of comprehensive strategies to improve the expanded programme on immunisation at the county level in western China, 2006–2010

    PubMed Central

    Zhou, Yuqing; Xing, Yi; Liang, Xiaofeng; Yue, Chenyan; Zhu, Xu; Hipgrave, David

    2016-01-01

    Objective

    To evaluate interventions to improve routine vaccination coverage and caregiver knowledge in China's remote west, where routine immunisation is relatively weak.

    Design

    Prospective pre–post (2006–2010) evaluation in project counties; retrospective comparison based on 2004 administrative data at baseline and surveyed post-intervention (2010) data in selected non-project counties.

    Setting

    Four project counties and one non-project county in each of four provinces.

    Participants

    3390 children in project counties at baseline, and 3299 in project and 830 in non-project counties post-intervention; and 3279 caregivers at baseline, and 3389 in project and 830 in non-project counties post-intervention.

    Intervention

    Multicomponent, inexpensive knowledge-strengthening and service-strengthening activities, and innovative multisectoral engagement.

    Data collection

    Standard 30-cluster household surveys of vaccine coverage and caregiver interviews pre-intervention and post-intervention in each project county. Similar surveys in one non-project county selected by local authorities in each province post-intervention. Administrative data on vaccination coverage in non-project counties at baseline.

    Primary outcome measures

    Changes in vaccine coverage between baseline and project completion (2010); comparative caregiver knowledge in all counties in 2010.

    Analysis

    Crude (χ2) analysis of changes and differences in vaccination coverage and related knowledge. Multiple logistic regression to assess associations with timely coverage.

    Results

    Timely coverage of four routine vaccines increased by 21% (p<0.001) and hepatitis B (HepB) birth dose by 35% (p<0.001) over baseline in project counties. Comparison with non-project counties revealed secular improvement in most provinces, except that new-vaccine coverage was mostly higher in project counties. Ethnicity, province, birthplace, vaccination site, dual-parental out-migration and parental knowledge had significant associations with coverage. Knowledge increased for all variables but one in project counties (highest p<0.05) and was substantially higher than in non-project counties (p<0.01).

    Conclusions

    Comprehensive but inexpensive strategies improved vaccination coverage and caretaker knowledge in western China. Establishing multisectoral leadership, involving the education sector and including immunisation in public-sector performance standards are affordable and effective interventions. PMID:26966053

  3. 76 FR 17160 - Notice of Finding of No Significant Antitrust Changes and Time for Filing Requests for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-28

    ... Time for Filing Requests for Reevaluation. FOR FURTHER INFORMATION CONTACT: Aaron Szabo, Financial Analyst, Financial Analysis and International Projects Branch, Division of Policy and Rulemaking, Office... Analysis and Recommendation In reaching this conclusion, the NRC staff considered the structure of the...

  4. Can performance on summative evaluation of wax-added dental anatomy projects be better predicted from the combination of supervised and unsupervised practice than from supervised practice alone?

    PubMed

    Radjaeipour, G; Chambers, D W; Geissberger, M

    2016-11-01

    The study explored the effects of adding student-directed projects in a pre-clinical dental anatomy laboratory on improving the predictability of students' eventual performance on summative evaluation exercises, given the presence of intervening faculty-controlled, in-class practice. All students from four consecutive classes (n = 555) completed wax-added home projects (HP), spending as much or as little time as desired and receiving no faculty feedback, followed by similar laboratory projects (LP) with time limits and feedback, and then summative practical projects (PP) in a timed format but without faculty feedback. Path analysis was used to assess whether the student-directed HP had any effect over and above the laboratory projects. Average scores were HP = 0.785 (SD = 0.089); LP = 0.736 (SD = 0.092); and PP = 0.743 (SD = 0.108). Path analysis was applied to show the effects of including a student-controlled home practice exercise on summative exercise performance. HP contributed 57% direct effect and 37% mediated effect through the LP condition. Student-directed home practice provided a measurable improvement in ability to predict eventual performance in summative test cases over and above the predictive contribution of intervening faculty-controlled practice conditions. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  5. Identifying inaccuracy of MS Project using system analysis

    NASA Astrophysics Data System (ADS)

    Fachrurrazi; Husin, Saiful; Malahayati, Nurul; Irzaidi

    2018-05-01

    The problem encountered in the project owner's financial accounting reports is the difference between total project costs from MS Project and those from the Indonesian standard (SNI, Indonesia's standard cost-estimating book). It is one of the MS Project problems concerning cost accuracy, and it means cost data cannot be used in an integrated way across all project components. This study focuses on finding the causes of inaccuracy in MS Project. The operational aims of this study are: (i) identifying the cost analysis procedures of both the current method (SNI) and MS Project; (ii) identifying the cost bias in each element of the cost analysis procedure; and (iii) analysing the cost differences (cost bias) in each element to identify the cause of MS Project's inaccuracy relative to SNI. The method in this study is a comparative system analysis of MS Project and SNI. The results are: (i) the Work of Resources element in MS Project is limited to two decimal digits, which leads to its inaccuracy, where the Work of Resources (referred to as effort) in MS Project corresponds to the multiplication of the quantities of activities by the resource requirements in SNI; and (ii) MS Project and SNI differ in their costing (cost estimation) methods, in that SNI uses quantity-based costing (QBC) while MS Project uses time-based costing (TBC). Based on this research, we recommend that contractors who use SNI make an adjustment to the Work of Resources in MS Project (with a correction index) so that it can be integrated with the project owner's financial accounting system. Further research will address improving MS Project as an integrated tool for all project participants.
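    A minimal numeric sketch of the rounding effect in result (i): when effort ("Work") is stored to only two decimal digits, a time-based cost differs slightly from the exact quantity-based cost, and the bias accumulates over many work items. The quantity, labour coefficient, and rate below are hypothetical, not taken from the paper.

    ```python
    # Sketch of the rounding bias described above: quantity-based costing (QBC)
    # multiplies exact quantities by resource coefficients, while storing Work
    # rounded to two decimals (as the abstract reports for MS Project)
    # introduces a small per-item cost difference. Numbers are hypothetical.
    quantity = 137.5          # e.g. m^3 of concrete work
    coeff_hours = 1.323       # labour-hours required per m^3 (hypothetical)
    rate = 12.0               # cost per labour-hour

    work_exact = quantity * coeff_hours      # 181.9125 hours
    work_rounded = round(work_exact, 2)      # 181.91 hours after 2-decimal storage

    qbc_cost = work_exact * rate             # quantity-based costing
    tbc_cost = work_rounded * rate           # time-based costing after rounding
    print(f"QBC cost: {qbc_cost:.4f}")
    print(f"TBC cost: {tbc_cost:.4f}")
    print(f"bias per item: {qbc_cost - tbc_cost:.4f}")  # accumulates over items
    ```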

  6. The diversity and evolution of ecological and environmental citizen science

    PubMed Central

    Tweddle, John C.; Savage, Joanna; Robinson, Lucy D.; Roy, Helen E.

    2017-01-01

    Citizen science—the involvement of volunteers in data collection, analysis and interpretation—simultaneously supports research and public engagement with science, and its profile is rapidly rising. Citizen science represents a diverse range of approaches, but until now this diversity has not been quantitatively explored. We conducted a systematic internet search and discovered 509 environmental and ecological citizen science projects. We scored each project for 32 attributes based on publicly obtainable information and used multiple factor analysis to summarise this variation to assess citizen science approaches. We found that projects varied according to their methodological approach from ‘mass participation’ (e.g. easy participation by anyone anywhere) to ‘systematic monitoring’ (e.g. trained volunteers repeatedly sampling at specific locations). They also varied in complexity from approaches that are ‘simple’ to those that are ‘elaborate’ (e.g. provide lots of support to gather rich, detailed datasets). There was a separate cluster of entirely computer-based projects but, in general, we found that the range of citizen science projects in ecology and the environment showed continuous variation and cannot be neatly categorised into distinct types of activity. While the diversity of projects begun in each time period (pre 1990, 1990–99, 2000–09 and 2010–13) has not increased, we found that projects tended to have become increasingly different from each other as time progressed (possibly due to changing opportunities, including technological innovation). Most projects were still active so consequently we found that the overall diversity of active projects (available for participation) increased as time progressed. Overall, understanding the landscape of citizen science in ecology and the environment (and its change over time) is valuable because it informs the comparative evaluation of the ‘success’ of different citizen science approaches. Comparative evaluation provides an evidence-base to inform the future development of citizen science activities. PMID:28369087

  7. The RACE (R&D in Advanced Communications Technologies for Europe) Program of the European Communities

    DTIC Science & Technology

    1988-08-17

    ... asynchronous TDM for all channels; and hybrid solutions, possibly involving dynamic rearranging. However, since technoeconomic considerations may im... qualitative analysis, under the headings: timing of introduction, net... tem requirements. For electronic switching, CMOS, silicon bipolar, and gallium arsenide are favored... or ring configurations. Project 1029 (Microelectronic Components): in this project an up-to-date analysis was made of the state of the

  8. Improving Project Management with Simulation and Completion Distribution Functions

    NASA Technical Reports Server (NTRS)

    Cates, Grant R.

    2004-01-01

    Despite the critical importance of project completion timeliness, management practices in place today remain inadequate for addressing the persistent problem of project completion tardiness. A major culprit in late projects is uncertainty, which most, if not all, projects are inherently subject to. This uncertainty resides in the estimates for activity durations, the occurrence of unplanned and unforeseen events, and the availability of critical resources. In response to this problem, this research developed a comprehensive simulation based methodology for conducting quantitative project completion time risk analysis. It is called the Project Assessment by Simulation Technique (PAST). This new tool enables project stakeholders to visualize uncertainty or risk, i.e. the likelihood of their project completing late and the magnitude of the lateness, by providing them with a completion time distribution function of their projects. Discrete event simulation is used within PAST to determine the completion distribution function for the project of interest. The simulation is populated with both deterministic and stochastic elements. The deterministic inputs include planned project activities, precedence requirements, and resource requirements. The stochastic inputs include activity duration growth distributions, probabilities for events that can impact the project, and other dynamic constraints that may be placed upon project activities and milestones. These stochastic inputs are based upon past data from similar projects. The time for an entity to complete the simulation network, subject to both the deterministic and stochastic factors, represents the time to complete the project. Repeating the simulation hundreds or thousands of times allows one to create the project completion distribution function. The Project Assessment by Simulation Technique was demonstrated to be effective for the on-going NASA project to assemble the International Space Station. Approximately $500 million per month is being spent on this project, which is scheduled to complete by 2010. NASA project stakeholders participated in determining and managing completion distribution functions produced from PAST. The first result was that project stakeholders improved project completion risk awareness. Secondly, using PAST, mitigation options were analyzed to improve project completion performance and reduce total project cost.
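    The core of such a simulation-based completion-time analysis can be sketched in a few lines: sample stochastic durations for each activity, respect the precedence constraints, and collect the project finish time over many replications. The sketch below is a minimal illustration with an invented network and triangular duration distributions, not the PAST tool itself.

    ```python
    # Minimal Monte Carlo sketch of a project completion-time distribution over
    # a precedence network with stochastic activity durations (hypothetical
    # network and estimates; not the PAST tool).
    import random

    predecessors = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}
    # (low, mode, high) triangular duration estimates per activity, in days
    estimates = {"A": (2, 3, 6), "B": (1, 2, 5), "C": (3, 4, 9), "D": (1, 2, 4)}

    def simulate_once():
        finish = {}
        for act in predecessors:  # dict order here already respects precedence
            start = max((finish[p] for p in predecessors[act]), default=0.0)
            low, mode, high = estimates[act]
            finish[act] = start + random.triangular(low, high, mode)
        return max(finish.values())

    completions = sorted(simulate_once() for _ in range(10_000))
    p50 = completions[len(completions) // 2]
    p90 = completions[int(len(completions) * 0.9)]
    print(f"median completion: {p50:.1f} days, 90th percentile: {p90:.1f} days")
    ```

    Sorting the simulated completion times yields exactly the kind of completion distribution function described above, from which stakeholders can read the probability of finishing by any given date.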

  9. Further dissemination of medical education projects after presentation at a pediatric national meeting (1998-2008).

    PubMed

    Smith, Sherilyn; Kind, Terry; Beck, Gary; Schiller, Jocelyn; McLauchlan, Heather; Harris, Mitchell; Gigante, Joseph

    2014-01-01

    Further dissemination of medical education work presented at national meetings is limited. The purpose of this study was to explore dissemination outcomes of scholarly work in pediatric medical education. Council on Medical Student Education in Pediatrics (COMSEP) members who presented at COMSEP national meetings from 1998 to 2008 received a questionnaire about scholarly dissemination outcomes. Descriptive statistics and chi-square analysis explored variables related to dissemination. Qualitative analysis of free text comments explored barriers to dissemination. Outcomes were determined for 81% of presentations (138/171). The dissemination rate was 67% (92/138 presentations), with 47 publications (34%). Dissemination rates did not vary by presentation type (poster vs. oral) or project type. There was no relationship between presentation type, project type, and dissemination method. Barriers included perceived inadequate time, mentorship, and methodological skills for scholarly work. Most projects were further disseminated. Additional resources including mentoring and protected time for scholarly work are needed by educators to optimize dissemination.

  10. Protein Analysis Using Real-Time PCR Instrumentation: Incorporation in an Integrated, Inquiry-Based Project

    ERIC Educational Resources Information Center

    Southard, Jonathan N.

    2014-01-01

    Instrumentation for real-time PCR is used primarily for amplification and quantitation of nucleic acids. The capability to measure fluorescence while controlling temperature in multiple samples can also be applied to the analysis of proteins. Conformational stability and changes in stability due to ligand binding are easily assessed. Protein…

  11. Real Data and Rapid Results: Ocean Color Data Analysis with Giovanni (GES DISC Interactive Online Visualization and ANalysis Infrastructure)

    NASA Technical Reports Server (NTRS)

    Acker, J. G.; Leptoukh, G.; Kempler, S.; Gregg, W.; Berrick, S.; Zhu, T.; Liu, Z.; Rui, H.; Shen, S.

    2004-01-01

    The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has taken a major step addressing the challenge of using archived Earth Observing System (EOS) data for regional or global studies by developing an infrastructure with a World Wide Web interface which allows online, interactive data analysis: the GES DISC Interactive Online Visualization and ANalysis Infrastructure, or "Giovanni." Giovanni provides a data analysis environment that is largely independent of underlying data file format. The Ocean Color Time-Series Project has created an initial implementation of Giovanni using monthly Standard Mapped Image (SMI) data products from the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) mission. Giovanni users select geophysical parameters, and the geographical region and time period of interest. The system rapidly generates a graphical or ASCII numerical data output. Currently available output options are: Area plot (averaged or accumulated over any available data period for any rectangular area); Time plot (time series averaged over any rectangular area); Hovmöller plots (image view of any longitude-time and latitude-time cross sections); ASCII output for all plot types; and area plot animations. Future plans include correlation plots, output formats compatible with Geographical Information Systems (GIS), and higher temporal resolution data. The Ocean Color Time-Series Project will produce sensor-independent ocean color data beginning with the Coastal Zone Color Scanner (CZCS) mission and extending through SeaWiFS and Moderate Resolution Imaging Spectroradiometer (MODIS) data sets, and will enable incorporation of Visible/Infrared Imaging Radiometer Suite (VIIRS) data, which will be added to Giovanni. The first phase of Giovanni will also include tutorials demonstrating the use of Giovanni and collaborative assistance in the development of research projects using the SeaWiFS and Ocean Color Time-Series Project data in the online Laboratory for Ocean Color Users (LOCUS). The synergy of Giovanni with high-quality ocean color data provides users with the ability to investigate a variety of important oceanic phenomena, such as coastal primary productivity related to pelagic fisheries, seasonal patterns and interannual variability, interdependence of atmospheric dust aerosols and harmful algal blooms, and the potential effects of climate change on oceanic productivity.
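    Two of these output options are easy to sketch on a generic (time, latitude, longitude) datacube: a time series averaged over a rectangular area, and a latitude-time Hovmöller section. The datacube and region bounds below are synthetic, and cos-latitude area weighting is omitted for brevity; this is not Giovanni's implementation, only the underlying operations.

    ```python
    # Sketch of two Giovanni-style products on a gridded (time, lat, lon) cube:
    # an area-averaged time series and a latitude-time (Hovmöller) section.
    import numpy as np

    ntime, nlat, nlon = 24, 90, 180
    lats = np.linspace(-89, 89, nlat)
    lons = np.linspace(-179, 179, nlon)
    cube = np.random.rand(ntime, nlat, nlon)   # stand-in for monthly SMI values

    # Rectangular region of interest (hypothetical bounds).
    lat_sel = (lats >= -10) & (lats <= 10)
    lon_sel = (lons >= 120) & (lons <= 160)

    # Time plot: series averaged over the rectangular area at each time step.
    region = cube[:, lat_sel][:, :, lon_sel]
    time_series = region.mean(axis=(1, 2))       # shape (ntime,)

    # Hovmöller section: average over the selected longitudes -> (time, lat).
    hovmoller = cube[:, :, lon_sel].mean(axis=2)  # shape (ntime, nlat)

    print(time_series.shape, hovmoller.shape)
    ```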

  12. The PI-Mode of Project Management

    NASA Technical Reports Server (NTRS)

    Isaac, Dan

    1997-01-01

    The PI-Mode is NASA's new approach to project management. It responds to the Agency's new policy to develop scientific missions that deliver the highest quality science for a fixed cost. It also attempts to provide more research opportunities by reducing project development times and increasing the number of launches per year. In order to accomplish this, the Principal Investigator is placed at the helm of the project with full responsibility over all aspects of the mission, including instrument and spacecraft development, as well as mission operations and data analysis. This paper intends to study the PI-Mode to determine the strengths and weaknesses of such a new project management technique. It also presents an analysis of its possible impact on the scientific community and its relations with industry, NASA, and other institutions.

  13. Modeling and projection of dengue fever cases in Guangzhou based on variation of weather factors.

    PubMed

    Li, Chenlu; Wang, Xiaofeng; Wu, Xiaoxu; Liu, Jianing; Ji, Duoying; Du, Juan

    2017-12-15

    Dengue fever is one of the most serious vector-borne infectious diseases, especially in Guangzhou, China. Dengue viruses and their vector, Aedes albopictus, are sensitive to climate change, primarily through weather factors. Previous research has mainly focused on identifying the relationship between climate factors and dengue cases, or on developing dengue case models that include non-climate factors. However, there has been little research addressing the modeling and projection of dengue cases solely from the perspective of climate change. This study addressed this topic using long time-series data (1998-2014). First, sensitive weather factors were identified through a meta-analysis that included literature review screening, lagged analysis, and collinearity analysis. Then, the key factors were determined: monthly average temperature at a lag of two months, and monthly average relative humidity and monthly average precipitation at lags of three months. Second, time-series Poisson analysis was used with the generalized additive model approach to develop a dengue model based on the key weather factors for January 1998 to December 2012. Data from January 2013 to July 2014 were used to validate that the model was reliable and reasonable. Finally, future weather data (January 2020 to December 2070) were input into the model to project the occurrence of dengue cases under different climate scenarios (RCP 2.6 and RCP 8.5). Longer time-series analysis and scientifically selected weather variables were used to develop a dengue model to ensure reliability. The projections suggested that seasonal disease control (especially in summer and fall) and mitigation of greenhouse gas emissions could help reduce the incidence of dengue fever. The results of this study are intended to provide a scientific and theoretical basis for the prevention and control of dengue fever in Guangzhou. Copyright © 2017 Elsevier B.V. All rights reserved.
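    A minimal sketch of this kind of model: Poisson regression of monthly case counts on the lagged weather factors named above (temperature at lag 2; humidity and precipitation at lag 3). The study used a generalized additive model; a plain Poisson GLM on synthetic data stands in for it here, so coefficients and data are illustrative only.

    ```python
    # Poisson time-series sketch: monthly case counts regressed on lagged
    # weather covariates (synthetic data; a GAM with smooth terms would be
    # closer to the study's approach).
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 180  # months
    df = pd.DataFrame({
        "temp": 22 + 6 * np.sin(2 * np.pi * np.arange(n) / 12) + rng.normal(0, 1, n),
        "humid": rng.uniform(60, 90, n),
        "precip": rng.gamma(2.0, 40.0, n),
    })
    # Synthetic counts driven by temperature two months earlier.
    df["cases"] = rng.poisson(np.exp(0.08 * df["temp"].shift(2).fillna(22) - 1.0))

    X = pd.DataFrame({
        "temp_lag2": df["temp"].shift(2),     # lag of two months
        "humid_lag3": df["humid"].shift(3),   # lag of three months
        "precip_lag3": df["precip"].shift(3),
    }).dropna()
    y = df.loc[X.index, "cases"]

    model = sm.GLM(y, sm.add_constant(X), family=sm.families.Poisson()).fit()
    print(model.summary().tables[1])
    ```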

  14. HydroClimATe: hydrologic and climatic analysis toolkit

    USGS Publications Warehouse

    Dickinson, Jesse; Hanson, Randall T.; Predmore, Steven K.

    2014-01-01

    The potential consequences of climate variability and climate change have been identified as major issues for the sustainability and availability of the worldwide water resources. Unlike global climate change, climate variability represents deviations from the long-term state of the climate over periods of a few years to several decades. Currently, rich hydrologic time-series data are available, but the combination of data preparation and statistical methods developed by the U.S. Geological Survey as part of the Groundwater Resources Program is relatively unavailable to hydrologists and engineers who could benefit from estimates of climate variability and its effects on periodic recharge and water-resource availability. This report documents HydroClimATe, a computer program for assessing the relations between variable climatic and hydrologic time-series data. HydroClimATe was developed for a Windows operating system. The software includes statistical tools for (1) time-series preprocessing, (2) spectral analysis, (3) spatial and temporal analysis, (4) correlation analysis, and (5) projections. The time-series preprocessing tools include spline fitting, standardization using a normal or gamma distribution, and transformation by a cumulative departure. The spectral analysis tools include discrete Fourier transform, maximum entropy method, and singular spectrum analysis. The spatial and temporal analysis tool is empirical orthogonal function analysis. The correlation analysis tools are linear regression and lag correlation. The projection tools include autoregressive time-series modeling and generation of many realizations. These tools are demonstrated in four examples that use stream-flow discharge data, groundwater-level records, gridded time series of precipitation data, and the Multivariate ENSO Index.
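    As a small taste of these tools, the sketch below applies two of the listed operations to a synthetic hydrologic series: standardization to z-scores, and generation of many projection realizations from a fitted AR(1) model. This is a generic illustration, not HydroClimATe's own code.

    ```python
    # Standardize a series, fit an AR(1) model by lag-1 correlation, and
    # generate an ensemble of projection realizations (all data synthetic).
    import numpy as np

    rng = np.random.default_rng(1)
    series = np.cumsum(rng.normal(0, 1, 600)) * 0.1 + 10  # monthly levels

    # Time-series preprocessing: standardization.
    z = (series - series.mean()) / series.std()

    # Projection: AR(1) coefficient from the lag-1 autocorrelation.
    phi = np.corrcoef(z[:-1], z[1:])[0, 1]
    sigma = np.sqrt(1 - phi**2)   # innovation std for unit-variance AR(1)

    def realization(nsteps, start):
        out = [start]
        for _ in range(nsteps - 1):
            out.append(phi * out[-1] + sigma * rng.normal())
        return np.array(out)

    ensemble = np.array([realization(120, z[-1]) for _ in range(1000)])
    print("10-year-ahead spread (5th-95th pct):",
          np.percentile(ensemble[:, -1], [5, 95]).round(2))
    ```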

  15. Defense Planning In A Time Of Conflict: A Comparative Analysis Of The 2001-2014 Quadrennial Defense Reviews, And Implications For The Army--Executive Summary

    DTIC Science & Technology

    2018-01-01

    The purpose of the project was to perform a comparative historical review of the four Quadrennial Defense Reviews (QDRs) conducted since the first... The project produced a report, Defense Planning in a Time of Conflict: A Comparative Analysis of the 2001–2014 Quadrennial Defense Reviews, and Implications for the Army, that documented

  16. Analysis, comparison, and contrast of two primary maintenance contracting techniques used by the Florida Department of Transportation [summary].

    DOT National Transportation Integrated Search

    2016-12-01

    A common method of government contracting is seeking the lowest bid that meets project specifications. While this assures that money is saved upfront, it may not ensure other project values, such as delivery time, ability to address unforeseen situat...

  17. Tech Prep Model for Marketing Education.

    ERIC Educational Resources Information Center

    Ruhland, Sheila K.; King, Binky M.

    A project was conducted to develop two tech prep models for marketing education (ME) in Missouri to provide a sequence of courses for skill-enhanced and time-shortened programs. First, labor market trends, employment growth projections, and business and industry labor needs in Missouri were researched and analyzed. The analysis results were used…

  18. The shifting dynamics of social roles and project ownership over the lifecycle of a community-based participatory research project.

    PubMed

    Salsberg, Jon; Macridis, Soultana; Garcia Bengoechea, Enrique; Macaulay, Ann C; Moore, Spencer

    2017-06-01

    Community-based participatory research (CBPR) is often initiated by academic researchers, yet relies on meaningful community engagement and ownership to have lasting impact. Little is understood about how ownership shifts from academic to community partners. We examined a CBPR project over its life course and asked: what does the evolution of ownership look like from project initiation by an academic (non-community) champion (T1); to maturation, when the intervention is ready to be deployed (T2); to independence, the time when the original champion steps aside (T3); and finally, to its maintenance, when the community has had an opportunity to function independently of the original academic champion (T4)? Using sociometric (whole network) social network analysis, knowledge leadership was measured using 'in-degree centrality'. Stakeholder network structure was measured using 'centralisation' and 'core-periphery analysis'. The Friedman rank sum test was used to measure change in actor roles over time from T1 to T4. Project stakeholder roles were observed to shift significantly (P < 0.005) from initiation (T1) to project maintenance (T4). Community stakeholders emerged into positions of knowledge leadership, while the roles of academic partners diminished in importance. The overall stakeholder network demonstrated a structural shift towards a core of densely interacting community stakeholders. This was the first study to use social network analysis to document a shift in ownership from academic to community partners, indicating community self-determination over the research process. Further analysis of qualitative data will determine which participatory actions or strategies were responsible for this observed change. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
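    The two network measures named here are straightforward to compute; the sketch below does so on a toy directed "who seeks knowledge from whom" network using networkx. The edges and the Freeman-style centralization formula are illustrative assumptions, not the study's data or exact method.

    ```python
    # In-degree centrality and a Freeman-style centralization index on a toy
    # directed network: an edge u -> v means "u seeks knowledge from v", so
    # knowledge leaders have high in-degree. Edges are hypothetical.
    import networkx as nx

    G = nx.DiGraph([
        ("c1", "academic"), ("c2", "academic"), ("c3", "academic"),  # T1-like
        ("c1", "c2"), ("c3", "c2"), ("c4", "c2"), ("academic", "c2"),  # T4-like
    ])

    in_centrality = nx.in_degree_centrality(G)
    print(sorted(in_centrality.items(), key=lambda kv: -kv[1]))

    # Freeman-style centralization: how dominated the network is by its most
    # central actor (near 1 for a star network, near 0 when evenly spread).
    n = G.number_of_nodes()
    c_max = max(in_centrality.values())
    centralization = sum(c_max - c for c in in_centrality.values()) / (n - 1)
    print(f"in-degree centralization: {centralization:.2f}")
    ```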

  19. Gridded Surface Subsurface Hydrologic Analysis Modeling for Analysis of Flood Design Features at the Picayune Strand Restoration Project

    DTIC Science & Technology

    2016-08-01

    Figure 9 (caption): Discharge time series for the Miller pump system. In C2, the Miller Canal pump system was implicitly simulated by a time series of outflows assigned to model cells. This flow time series was representative of how the pump system would operate during the storm events simulated in this work (USACE 2004).

  20. Potential value of satellite cloud pictures in weather modification projects

    NASA Technical Reports Server (NTRS)

    Biswas, K. R.

    1972-01-01

    Satellite imagery for one project season of cloud seeding programs in the northern Great Plains has been surveyed for its probable usefulness in weather modification programs. The research projects and the meteorological information available are described. A few illustrative examples of satellite imagery analysis are cited and discussed, along with local observations of weather and the seeding decisions made in the research program. This analysis indicates a definite correlation between satellite-observed cloud patterns and the types of cloud seeding activity undertaken, and suggests a high probability of better and/or earlier decisions if the imagery is available in real time. Infrared imagery provides better estimates of cloud height which can be useful in assessing the possibility of a hail threat. The satellite imagery appears to be of more value to area-seeding projects than to single-cloud seeding experiments where the imagery is of little value except as an aid in local forecasting and analysis.

  1. Metropolitan Spokane Region Water Resources Study. Appendix 1. Institutional Analysis

    DTIC Science & Technology

    1976-01-01

    projects in the State of Washington. It has the authority to require compliance from all local... projects of relatively modest scope. Project financing is based, as in the case of irrigation districts, on assessments levied on district lands. The ...treatment facilities would be phased out at that time. The other interim facilities, all west of Five Mile Prairie, are projected to remain in service

  2. Overview of the data analysis and new micro-pattern gas detector development for the Active Target Time Projection Chamber (AT-TPC) project.

    NASA Astrophysics Data System (ADS)

    Ayyad, Yassid; Mittig, Wolfgang; Bazin, Daniel; Cortesi, Marco

    2017-07-01

    The Active Target Time Projection Chamber (AT-TPC) project at the NSCL (National Superconducting Cyclotron Laboratory, Michigan State University) is a novel active-target detector tailored for low-energy nuclear reactions in inverse kinematics with radioactive ion beams. The AT-TPC allows for a full three-dimensional reconstruction of the reaction and provides high luminosity without degradation of resolution by the thickness of the target. Since all the particles (and also the reaction vertex) are tracked inside the detector, the AT-TPC has full 4π efficiency. The AT-TPC can operate in a magnetic field (2 T) that improves the identification of the particles and the energy resolution through the measurement of the magnetic rigidity. Another important characteristic of the AT-TPC is the high-gain operation achieved by the hybrid thick Gas Electron Multiplier (THGEM)-Micromegas pad plane, which also allows operation in pure elemental gases. These two features make the AT-TPC a unique high-resolution spectrometer with full acceptance for nuclear physics reactions. This work presents an overview of the project, focused on the data analysis and the development of new micro-pattern gas detectors.

  3. Local-feature analysis for automated coarse-graining of bulk-polymer molecular dynamics simulations.

    PubMed

    Xue, Y; Ludovice, P J; Grover, M A

    2012-12-01

    A method for automated coarse-graining of bulk polymers is presented, using the data-mining tool of local feature analysis. Most existing methods for polymer coarse-graining define superatoms based on their covalent bonding topology along the polymer backbone, but here superatoms are defined based only on their correlated motions, as observed in molecular dynamics simulations. Correlated atomic motions are identified in the simulation data using local feature analysis, between atoms in the same or in different polymer chains. Groups of highly correlated atoms constitute the superatoms in the coarse-graining scheme, and the positions of their seed coordinates are then projected forward in time. Based on only the seed positions, local feature analysis enables the full reconstruction of all atomic positions. This reconstruction suggests an iterative scheme for reducing the computational cost of the simulations: use the reconstructed positions to initialize another short molecular dynamics simulation, identify new superatoms, and again project forward in time.
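    The central idea, grouping atoms whose motions are correlated into superatoms, can be sketched with a plain correlation matrix and hierarchical clustering. This is a stand-in for the paper's local feature analysis, and the trajectory below is synthetic.

    ```python
    # Group atoms by correlated motion: build an atom-atom motion correlation
    # matrix from trajectory frames, convert to a distance, and cluster.
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage

    rng = np.random.default_rng(2)
    nframes, natoms = 500, 12

    # Synthetic 1-D displacements: three latent motions, four atoms each.
    latent = rng.normal(size=(nframes, 3))
    traj = np.repeat(latent, 4, axis=1) + 0.3 * rng.normal(size=(nframes, natoms))

    corr = np.corrcoef(traj.T)                    # atom-atom motion correlation
    dist = 1.0 - np.abs(corr)                     # correlated atoms -> small dist
    Z = linkage(dist[np.triu_indices(natoms, k=1)], method="average")
    superatoms = fcluster(Z, t=0.5, criterion="distance")
    print("superatom assignment per atom:", superatoms)
    ```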

  4. Evaluation of grid generation technologies from an applied perspective

    NASA Technical Reports Server (NTRS)

    Hufford, Gary S.; Harrand, Vincent J.; Patel, Bhavin C.; Mitchell, Curtis R.

    1995-01-01

    An analysis of the grid generation process from the point of view of an applied CFD engineer is given. Issues addressed include geometric modeling, structured grid generation, unstructured grid generation, hybrid grid generation, and the use of virtual parts libraries in large parametric analysis projects. The analysis is geared towards comparing the effective turnaround time for specific grid generation and CFD projects. The conclusion was that no single grid generation methodology is universally suited to all CFD applications, owing to limitations in both grid generation and flow solver technology. A new geometric modeling and grid generation tool, CFD-GEOM, is introduced to effectively integrate the geometric modeling process with the various grid generation methodologies, including structured, unstructured, and hybrid procedures. The full integration of geometric modeling and grid generation allows the implementation of extremely efficient updating procedures, a necessary requirement for large parametric analysis projects. The concept of using virtual parts libraries in conjunction with hybrid grids for large parametric analysis projects is also introduced to improve the efficiency of the applied CFD engineer.

  5. Independent Peer Review of Communications, Navigation, and Networking re-Configurable Testbed (CoNNeCT) Project Antenna Pointing Subsystem (APS) Integrated Gimbal Assembly (IGA) Structural Analysis

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.; Larsen, Curtis E.; Pellicciotti, Joseph W.

    2010-01-01

    The Glenn Research Center Chief Engineer's Office requested an independent review of the structural analysis and modeling of the Communications, Navigation, and Networking re-Configurable Testbed (CoNNeCT) Project Antenna Pointing Subsystem (APS) Integrated Gimbal Assembly (IGA), to be conducted by the NASA Engineering and Safety Center (NESC). At this time, the IGA had completed its critical design review (CDR). The assessment was to be a peer review of the NEi-NASTRAN model of the APS Antenna, and not a peer review of the design and the analysis that had been completed by the GRC team for CDR. Thus, only a limited amount of information was provided on the structural analysis. However, the NESC team had difficulty separating analysis concerns from modeling issues. The team studied the NASTRAN model, but did not fully investigate how the model was used by the CoNNeCT Project and how the Project was interpreting the results. The team's findings, observations, and NESC recommendations are contained in this report.

  6. Project delay analysis of HRSG

    NASA Astrophysics Data System (ADS)

    Silvianita; Novega, A. S.; Rosyid, D. M.; Suntoyo

    2017-08-01

    Completion of an HRSG (Heat Recovery Steam Generator) fabrication project sometimes does not meet the target date written in the contract. A delay in the fabrication process can cause several disadvantages for the fabricator, including forfeit payments and delays to the HRSG construction process and, ultimately, to HRSG trials. In this paper, the authors apply a semi-quantitative analysis to HRSG pressure-part fabrication delay, for a plant configuration of 1 GT (Gas Turbine) + 1 HRSG + 1 STG (Steam Turbine Generator), using the bow-tie analysis method. Bow-tie analysis combines FTA (fault tree analysis) and ETA (event tree analysis) to develop the risk matrix of the HRSG. The results of the FTA identify threats to be addressed by preventive measures, while the results of the ETA describe the impacts of the fabrication delay.
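    The FTA half of a bow-tie reduces to combining basic-event probabilities through OR and AND gates, assuming independent events. The gate structure and probabilities below are hypothetical, not taken from the paper.

    ```python
    # Fault-tree sketch: basic-event probabilities combined through OR/AND
    # gates give the top-event (fabrication delay) probability, assuming
    # independent basic events. All events and numbers are hypothetical.
    def p_or(*ps):   # probability that at least one event occurs
        q = 1.0
        for p in ps:
            q *= (1.0 - p)
        return 1.0 - q

    def p_and(*ps):  # probability that all events occur
        out = 1.0
        for p in ps:
            out *= p
        return out

    p_late_material = 0.15
    p_supplier_fail = 0.05
    p_welder_shortage = 0.10
    p_qc_rework = 0.20

    p_material_issue = p_or(p_late_material, p_supplier_fail)
    p_production_issue = p_and(p_welder_shortage, p_qc_rework)  # both needed
    p_delay = p_or(p_material_issue, p_production_issue)        # top event
    print(f"P(pressure-part fabrication delay) = {p_delay:.3f}")
    ```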

  7. CosmoQuest:Using Data Validation for More Than Just Data Validation

    NASA Astrophysics Data System (ADS)

    Lehan, C.; Gay, P.

    2016-12-01

    It is often taken for granted that different scientists completing the same task (e.g. mapping geologic features) will get the same results, and data validation is often skipped or under-utilized due to time and funding constraints. Robbins et al. (2014), however, demonstrated that this is a needed step, as large variation can exist even among collaborating team members completing straightforward tasks like marking craters. Data validation should be much more than a simple post-project verification of results. The CosmoQuest virtual research facility employs regular data validation for a variety of benefits, including real-time user feedback, real-time tracking to observe user activity while it is happening, and using pre-solved data to analyze users' progress and to help them retain skills. Some creativity in this area can drastically improve project results. We discuss methods of validating data in citizen science projects and outline the variety of uses for validation, which, when used properly, improves the scientific output of the project and the user experience for the citizens doing the work. More than just a tool for scientists, validation can assist users in both learning and retaining important information and skills, improving the quality and quantity of data gathered. Real-time analysis of user data can give key information on the effectiveness of the project that a broad glance would miss, and properly presenting that analysis is vital. Training users to validate their own data, or the data of others, can significantly improve the accuracy of data from misinformed or novice users.

  8. IoGET: Internet of Geophysical and Environmental Things

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mudunuru, Maruti Kumar

    The objective of this project is to provide novel and fast reduced-order models for onboard computation at sensor nodes for real-time analysis. The approach will require that LANL perform high-fidelity numerical simulations, construct simple reduced-order models (ROMs) using machine learning and signal processing algorithms, and use real-time data analysis for ROMs and compressive sensing at sensor nodes.

  9. A Product Analysis Method and Its Staging to Develop Redesign Competences

    ERIC Educational Resources Information Center

    Hansen, Claus Thorp; Lenau, Torben Anker

    2013-01-01

    Most product development work in industrial practice is incremental, i.e., the company has had a product in production and on the market for some time, and now time has come to design an upgraded variant. This type of redesign project requires that the engineering designers have competences to carry through an analysis of the existing product…

  10. Modelling the Cooling of Coffee: Insights from a Preliminary Study in Indonesia

    ERIC Educational Resources Information Center

    Widjaja, Wanty

    2010-01-01

    This paper discusses an attempt to examine pre-service teachers' mathematical modelling skills. A modelling project investigating relationships between temperature and time in the process of cooling of coffee was chosen. The analysis was based on group written reports of the cooling of coffee project and observation of classroom discussion.…
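    The textbook model behind such a task is Newton's law of cooling, T(t) = T_env + (T0 - T_env)e^(-kt). The abstract does not say which model the pre-service teachers fitted, so the sketch below is a generic illustration fitting that law to synthetic measurements.

    ```python
    # Fit Newton's law of cooling to (synthetic) coffee temperature readings.
    import numpy as np
    from scipy.optimize import curve_fit

    def newton_cooling(t, T_env, T0, k):
        # T(t) = T_env + (T0 - T_env) * exp(-k t)
        return T_env + (T0 - T_env) * np.exp(-k * t)

    t_min = np.array([0, 5, 10, 15, 20, 30, 45, 60], dtype=float)
    temps = np.array([90.0, 76.7, 66.1, 57.6, 50.8, 41.1, 32.7, 28.4])

    (T_env, T0, k), _ = curve_fit(newton_cooling, t_min, temps, p0=(25, 90, 0.05))
    print(f"ambient ~{T_env:.1f} C, initial ~{T0:.1f} C, k ~{k:.3f} /min")
    ```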

  11. Updated Intensity - Duration - Frequency Curves Under Different Future Climate Scenarios

    NASA Astrophysics Data System (ADS)

    Ragno, E.; AghaKouchak, A.

    2016-12-01

    Current infrastructure design procedures rely on the use of Intensity - Duration - Frequency (IDF) curves retrieved under the assumption of temporal stationarity, meaning that occurrences of extreme events are expected to be time invariant. However, numerous studies have observed more severe extreme events over time. Hence, the stationarity assumption for extreme analysis may not be appropriate in a warming climate. This issue raises concerns regarding the safety and resilience of the existing and future infrastructures. Here we employ historical and projected (RCP 8.5) CMIP5 runs to investigate IDF curves of 14 urban areas across the United States. We first statistically assess changes in precipitation extremes using an energy-based test for equal distributions. Then, through a Bayesian inference approach for stationary and non-stationary extreme value analysis, we provide updated IDF curves based on climatic model projections. This presentation summarizes the projected changes in statistics of extremes. We show that, based on CMIP5 simulations, extreme precipitation events in some urban areas can be 20% more severe in the future, even when projected annual mean precipitation is expected to remain similar to the ground-based climatology.
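    The stationary building block of such an analysis is a GEV fit to annual maxima, from which return levels (and hence IDF points) are read off. The study's Bayesian inference and non-stationary extension go beyond this sketch, and the annual maxima below are synthetic.

    ```python
    # Fit a GEV distribution to annual-maximum precipitation intensities and
    # compute a 100-year return level (synthetic data; stationary case only).
    import numpy as np
    from scipy.stats import genextreme

    rng = np.random.default_rng(3)
    annual_max = genextreme.rvs(c=-0.1, loc=40, scale=10, size=60,
                                random_state=rng)   # synthetic maxima [mm/h]

    shape, loc, scale = genextreme.fit(annual_max)

    # 100-year return level: intensity exceeded with probability 1/100 per year.
    rl_100 = genextreme.ppf(1 - 1 / 100, shape, loc=loc, scale=scale)
    print(f"100-year return level: {rl_100:.1f} mm/h")
    ```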

  12. [Analysis of evaluation process of research projects submitted to the Fondo de Investigación Sanitaria, Spain].

    PubMed

    Prieto Carles, C; Gómez-Gerique, J; Gutiérrez Millet, V; Veiga de Cabo, J; Sanz Martul, E; Mendoza Hernández, J L

    2000-10-07

    At the present time it seems clear that improving research is both an unquestionable goal and the right way to develop technological innovation, services, and patents. However, such improvement, and the corresponding funding, needs to rest on a fine-grained and rigorous evaluation process, an assessment tool to which all research projects applying to a public or private call for proposals should be submitted so that funding decisions are coherent with the investment to be made. To this end, the main goal of this work was to analyse the evaluation process traditionally used by the Fondo de Investigación Sanitaria (FIS) and to propose the most appropriate modifications. A sample of 431 research projects from the 1998 call was analysed. The evaluations from FIS and ANEP (National Evaluation and Prospective Agency) were themselves evaluated and scored (evaluation quality) in their main contents by 3 independent evaluators, and the results were compared between the agencies at the internal (FIS) and the external (FIS/ANEP) level. The FIS evaluation involved 20 commissions, or areas of knowledge. The internal (FIS) analysis clearly showed that evaluation quality was related to the assigned commission (F = 3.71; p < 0.001) and to the duration of the proposed research (F = 3.42; p < 0.05), but not to the evaluator. The quality of the ANEP evaluation, on the other hand, depended on all three factors. Overall, the ANEP evaluation was better than the FIS evaluation for three-year projects, but showed no significant differences for one- or two-year projects. In all cases, evaluations with a negative final result (financing denied) showed a higher average quality than positive evaluations. The results obtained advise some changes in the evaluative structure and a review of the FIS technical commissions, with a view to improving the evaluation process.

  13. Schedule Analysis Software Saves Time for Project Planners

    NASA Technical Reports Server (NTRS)

    2015-01-01

    Since the early 2000s, a resource management team at Marshall Space Flight Center has developed and improved the Schedule Test and Assessment Tool, a software add-on capable of analyzing, summarizing, and finding logic gaps in project schedules. Companies like Lanham, Maryland-based Vantage Systems Inc. use the tool to manage NASA projects, but it has also been released for free to more than 200 US companies, agencies, and other entities.

  14. Time/frequency systems.

    DOT National Transportation Integrated Search

    1971-06-01

    The report summarizes the work performed at DOT/TSC on the Time/Frequency ATC System study project. Principal emphasis in this report is given to the evaluation and analysis of the technological risk areas. A survey and description of proposed T/F sy...

  15. When, not if: the inescapability of an uncertain climate future.

    PubMed

    Ballard, Timothy; Lewandowsky, Stephan

    2015-11-28

    Climate change projections necessarily involve uncertainty. Analysis of the physics and mathematics of the climate system reveals that greater uncertainty about future temperature increases is nearly always associated with greater expected damages from climate change. In contrast to those normative constraints, uncertainty is frequently cited in public discourse as a reason to delay mitigative action. This failure to understand the actual implications of uncertainty may incur notable future costs. It is therefore important to communicate uncertainty in a way that improves people's understanding of climate change risks. We examined whether responses to projections were influenced by whether the projection emphasized uncertainty in the outcome or in its time of arrival. We presented participants with statements and graphs indicating projected increases in temperature, sea levels, ocean acidification and a decrease in arctic sea ice. In the uncertain-outcome condition, statements reported the upper and lower confidence bounds of the projected outcome at a fixed time point. In the uncertain time-of-arrival condition, statements reported the upper and lower confidence bounds of the projected time of arrival for a fixed outcome. Results suggested that people perceived the threat as more serious and were more likely to encourage mitigative action in the time-uncertain condition than in the outcome-uncertain condition. This finding has implications for effectively communicating the climate change risks to policy-makers and the general public. © 2015 The Author(s).

  16. Nonnegative Matrix Factorization for Efficient Hyperspectral Image Projection

    NASA Technical Reports Server (NTRS)

    Iacchetta, Alexander S.; Fienup, James R.; Leisawitz, David T.; Bolcar, Matthew R.

    2015-01-01

    Hyperspectral imaging for remote sensing has prompted development of hyperspectral image projectors that can be used to characterize hyperspectral imaging cameras and techniques in the lab. One such emerging astronomical hyperspectral imaging technique is wide-field double-Fourier interferometry. NASA's current, state-of-the-art, Wide-field Imaging Interferometry Testbed (WIIT) uses a Calibrated Hyperspectral Image Projector (CHIP) to generate test scenes and provide a more complete understanding of wide-field double-Fourier interferometry. Given enough time, the CHIP is capable of projecting scenes with astronomically realistic spatial and spectral complexity. However, this would require a very lengthy data collection process. For accurate but time-efficient projection of complicated hyperspectral images with the CHIP, the field must be decomposed both spectrally and spatially in a way that provides a favorable trade-off between accurately projecting the hyperspectral image and the time required for data collection. We apply nonnegative matrix factorization (NMF) to decompose hyperspectral astronomical datacubes into eigenspectra and eigenimages that allow time-efficient projection with the CHIP. Included is a brief analysis of NMF parameters that affect accuracy, including the number of eigenspectra and eigenimages used to approximate the hyperspectral image to be projected. For the chosen field, the normalized mean squared synthesis error is under 0.01 with just 8 eigenspectra. NMF of hyperspectral astronomical fields better utilizes the CHIP's capabilities, providing time-efficient and accurate representations of astronomical scenes to be imaged with the WIIT.
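
    As a rough illustration of the decomposition described above, the following sketch factors a synthetic datacube into a handful of eigenimages and eigenspectra with scikit-learn's NMF and reports the normalized mean squared reconstruction error. The cube dimensions, the choice of k = 8 components, and the random test scene are illustrative assumptions; this is not the WIIT/CHIP data or the authors' pipeline.

```python
# Minimal sketch: decompose a hyperspectral datacube into k eigenimages
# and eigenspectra with NMF, then check the reconstruction error.
# The synthetic cube and the value k = 8 are illustrative assumptions.
import numpy as np
from sklearn.decomposition import NMF

nx, ny, nbands, k = 64, 64, 100, 8
cube = np.random.rand(nx, ny, nbands)          # stand-in for an astronomical scene

X = cube.reshape(nx * ny, nbands)              # pixels x wavelengths matrix
model = NMF(n_components=k, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(X)                     # eigenimages (one column per component)
H = model.components_                          # eigenspectra (one row per component)

nmse = np.linalg.norm(X - W @ H) ** 2 / np.linalg.norm(X) ** 2
print(f"normalized MSE with {k} components: {nmse:.4f}")

eigenimages = W.T.reshape(k, nx, ny)           # each can be projected as a 2-D frame
```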

  17. An Overview of the Role of Systems Analysis in NASA's Hypersonics Project

    NASA Technical Reports Server (NTRS)

    Robinson, Jeffrey S.; Martin, John G.; Bowles, Jeffrey V.; Mehta, Unmeel B.; Snyder, Christopher A.

    2006-01-01

    NASA's Aeronautics Research Mission Directorate recently restructured its Vehicle Systems Program, refocusing it towards understanding the fundamental physics that govern flight in all speed regimes. Now called the Fundamental Aeronautics Program, it comprises four new projects: Subsonic Fixed Wing, Subsonic Rotary Wing, Supersonics, and Hypersonics. The Aeronautics Research Mission Directorate has charged the Hypersonics Project with having a basic understanding of all systems that travel at hypersonic speeds within the atmospheres of Earth and other planets. This includes both powered and unpowered systems, such as re-entry vehicles and vehicles powered by rocket or airbreathing propulsion that cruise in and accelerate through the atmosphere. The primary objective of the Hypersonics Project is to develop physics-based predictive tools that enable the design, analysis and optimization of such systems. The Hypersonics Project charges the systems analysis discipline team with providing it the decision-making information it needs to properly guide research and technology development. Credible, rapid, and robust multi-disciplinary system analysis processes and design tools are required in order to generate this information. To this end, the principal challenges for the systems analysis team are the introduction of high fidelity physics into the analysis process and integration into a design environment, quantification of design uncertainty through the use of probabilistic methods, reduction in design cycle time, and the development and implementation of robust processes and tools enabling a wide design space and associated technology assessment capability. This paper will discuss the roles and responsibilities of the systems analysis discipline team within the Hypersonics Project as well as the tools, methods, processes, and approach that the team will undertake in order to perform its project-designated functions.

  18. Report on Integration of Existing Grid Models for N-R HES Interaction Focused on Balancing Authorities for Sub-hour Penalties and Opportunities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McJunkin, Timothy; Epiney, Aaron; Rabiti, Cristian

    2017-06-01

    This report summarizes the effort in the Nuclear-Renewable Hybrid Energy System (N-R HES) project on the Level 4 milestone, which considers integrating existing grid models into the Risk Analysis Virtual Environment (RAVEN) and Modelica [1] optimization and economic analyses that have been the focus of the project to date, so that optimization can address shorter time intervals than existing electric grid models resolve.

  19. Research in Computational Astrobiology

    NASA Technical Reports Server (NTRS)

    Chaban, Galina; Colombano, Silvano; Scargle, Jeff; New, Michael H.; Pohorille, Andrew; Wilson, Michael A.

    2003-01-01

    We report on several projects in the field of computational astrobiology, which is devoted to advancing our understanding of the origin, evolution and distribution of life in the Universe using theoretical and computational tools. Research projects included modifying existing computer simulation codes to use efficient, multiple time step algorithms, statistical methods for analysis of astrophysical data via optimal partitioning methods, electronic structure calculations on water-nucleic acid complexes, incorporation of structural information into genomic sequence analysis methods and calculations of shock-induced formation of polycyclic aromatic hydrocarbon compounds.

  20. Detecting Shielded Special Nuclear Materials Using Multi-Dimensional Neutron Source and Detector Geometries

    NASA Astrophysics Data System (ADS)

    Santarius, John; Navarro, Marcos; Michalak, Matthew; Fancher, Aaron; Kulcinski, Gerald; Bonomo, Richard

    2016-10-01

    A newly initiated research project will be described that investigates methods for detecting shielded special nuclear materials by combining multi-dimensional neutron sources, forward/adjoint calculations modeling neutron and gamma transport, and sparse data analysis of detector signals. The key tasks for this project are: (1) developing a radiation transport capability for use in optimizing adaptive-geometry, inertial-electrostatic confinement (IEC) neutron source/detector configurations for neutron pulses distributed in space and/or phased in time; (2) creating distributed-geometry, gas-target, IEC fusion neutron sources; (3) applying sparse data and noise reduction algorithms, such as principal component analysis (PCA) and wavelet transform analysis, to enhance detection fidelity; and (4) educating graduate and undergraduate students. Funded by DHS DNDO Project 2015-DN-077-ARI095.

  1. Risk management of PPP project in the preparation stage based on Fault Tree Analysis

    NASA Astrophysics Data System (ADS)

    Xing, Yuanzhi; Guan, Qiuling

    2017-03-01

    Risk management of PPP (Public Private Partnership) projects can improve risk control between government departments and private investors, leading to better decisions, reduced investment losses, and mutual benefit. This paper therefore takes the risks of the PPP project preparation stage as its research object, identifying and confirming four types of risk. Fault tree analysis (FTA) is then used to evaluate the risk factors belonging to the different parts and to quantify their degree of influence on the basis of the risk identification. In addition, the importance order of the risk factors in the PPP project preparation stage is determined by calculating the structural importance of each unit. The results show that the accuracy of government decision-making, the rationality of private investors' fund allocation, and the instability of market returns are the main factors generating shared risk on such projects.
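
    To make the structural-importance calculation concrete, here is a minimal sketch for a toy fault tree, TOP = A OR (B AND C). The tree, its events, and the resulting numbers are invented for illustration and are not the paper's PPP risk model; the structural importance of an event is computed as the fraction of states of the remaining events in which that event is critical to the top event.

```python
# Minimal sketch: Birnbaum structural importance for a toy fault tree,
# TOP = A OR (B AND C). The tree itself is an illustrative assumption,
# not the risk model from the paper.
from itertools import product

def top_event(a, b, c):
    """Structure function: top event occurs if A fails, or B and C both fail."""
    return a or (b and c)

def structural_importance(index, n=3):
    critical = 0
    states = list(product([0, 1], repeat=n - 1))
    for rest in states:
        x_up = list(rest); x_up.insert(index, 1)   # event occurs
        x_dn = list(rest); x_dn.insert(index, 0)   # event does not occur
        if top_event(*x_up) and not top_event(*x_dn):
            critical += 1                          # the event is critical here
    return critical / len(states)

for i, name in enumerate("ABC"):
    print(f"structural importance of {name}: {structural_importance(i):.3f}")
```

    For this toy tree the calculation gives 0.75 for A and 0.25 each for B and C, which is the kind of importance ordering the paper derives for its risk factors.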

  2. Time Projection Compton Spectrometer (TPCS). User's guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Landron, C.O.; Baldwin, G.T.

    1994-04-01

    The Time Projection Compton Spectrometer (TPCS) is a radiation diagnostic designed to determine the time-integrated energy spectrum, between 100 keV and 2 MeV, of flash x-ray sources. This guide is intended as a reference for the routine operator of the TPCS. Contents include a brief overview of the principle of operation, detailed component descriptions, detailed assembly and disassembly procedures, a guide to routine operations, and troubleshooting flowcharts. A detailed treatment of the principle of operation, signal analysis, and spectrum unfolding algorithms is beyond the scope of this guide; however, the guide references sources containing this information.

  3. Projection of landfill stabilization period by time series analysis of leachate quality and transformation trends of VOCs.

    PubMed

    Sizirici, Banu; Tansel, Berrin

    2010-01-01

    The purpose of this study was to evaluate the suitability of using time series analysis of selected leachate quantity and quality parameters to forecast the duration of the post-closure period of a closed landfill. Selected leachate quality parameters (i.e., sodium, chloride, iron, bicarbonate, total dissolved solids (TDS), and ammonium as N) and volatile organic compounds (VOCs) (i.e., vinyl chloride, 1,4-dichlorobenzene, chlorobenzene, benzene, toluene, ethyl benzene, xylenes, total BTEX) were analyzed by the time series multiplicative decomposition model to estimate the projected levels of the parameters. These parameters were selected based on their detection levels and consistency of detection in leachate samples. In addition, VOCs detected in leachate and their chemical transformations were considered in view of the decomposition stage of the landfill. Projected leachate quality trends were analyzed and compared with the maximum contaminant level (MCL) for the respective parameters. Conditions that lead to specific trends (i.e., increasing, decreasing, or steady) and interactions of leachate quality parameters were evaluated. Decreasing trends were projected for leachate quantity and for concentrations of sodium, chloride, TDS, ammonia as N, vinyl chloride, 1,4-dichlorobenzene, benzene, toluene, ethyl benzene, xylenes, and total BTEX. Increasing trends were projected for concentrations of iron, bicarbonate, and chlorobenzene. Anaerobic conditions in the landfill favor corrosion of iron, resulting in higher iron concentrations over time. Bicarbonate formation as a byproduct of bacterial respiration during waste decomposition, together with the landfill's lime rock cap system, contributes to the increasing levels of bicarbonate in leachate. Chlorobenzene is produced during anaerobic biodegradation of 1,4-dichlorobenzene; hence, the increasing trend of chlorobenzene may be due to the declining trend of 1,4-dichlorobenzene. The time series multiplicative decomposition model in general provides an adequate forecast for future planning purposes for the parameters monitored in leachate. The model projections for 1,4-dichlorobenzene were less accurate than the projections for vinyl chloride and chlorobenzene. Based on the trends observed, future monitoring needs for the selected leachate parameters were identified.
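
    A minimal sketch of the multiplicative decomposition idea, applied with statsmodels to a synthetic monthly chloride series: the series is split into trend, seasonal, and residual factors, and the trend is extrapolated linearly. The data, the 12-month period, and the extrapolation are illustrative assumptions, not the study's monitoring record or fitted model.

```python
# Minimal sketch: multiplicative decomposition of a monthly leachate series
# and a naive trend extrapolation. The synthetic chloride series is an
# illustrative assumption.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Synthetic monthly chloride concentrations with a declining trend (mg/L).
dates = pd.date_range("2000-01", periods=120, freq="MS")
t = np.arange(120)
series = pd.Series(
    np.linspace(800, 300, 120)                     # declining trend
    * (1 + 0.1 * np.sin(2 * np.pi * t / 12))       # 12-month seasonality
    * np.random.uniform(0.95, 1.05, 120),          # noise
    index=dates,
)

parts = seasonal_decompose(series, model="multiplicative", period=12)

# Extrapolate the trend component linearly and re-apply the seasonal indices.
mask = parts.trend.notna().values                  # centered average leaves NaN ends
slope, intercept = np.polyfit(t[mask], parts.trend.values[mask], 1)
t_future = np.arange(120, 180)                     # five more years
forecast = (slope * t_future + intercept) * np.tile(parts.seasonal.values[:12], 5)
print(f"projected level five years out: {forecast[-1]:.0f} mg/L")
```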

  4. Collaborative Resource Allocation

    NASA Technical Reports Server (NTRS)

    Wang, Yeou-Fang; Wax, Allan; Lam, Raymond; Baldwin, John; Borden, Chester

    2007-01-01

    Collaborative Resource Allocation Networking Environment (CRANE) Version 0.5 is a prototype created to prove the newest concept of using a distributed environment to schedule Deep Space Network (DSN) antenna times in a collaborative fashion. This program is for all space-flight and terrestrial science project users and DSN schedulers to perform scheduling activities and conflict resolution, both synchronously and asynchronously. Project schedulers can, for the first time, participate directly in scheduling their tracking times into the official DSN schedule, and negotiate directly with other projects in an integrated scheduling system. A master schedule covers long-range, mid-range, near-real-time, and real-time scheduling time frames all in one, rather than the current method of separate functions that are supported by different processes and tools. CRANE also provides private workspaces (both dynamic and static), data sharing, scenario management, user control, rapid messaging (based on Java Message Service), data/time synchronization, workflow management, notification (including emails), conflict checking, and a linkage to a schedule generation engine. The data structure with corresponding database design combines object trees with multiple associated mortal instances and relational database to provide unprecedented traceability and simplify the existing DSN XML schedule representation. These technologies are used to provide traceability, schedule negotiation, conflict resolution, and load forecasting from real-time operations to long-range loading analysis up to 20 years in the future. CRANE includes a database, a stored procedure layer, an agent-based middle tier, a Web service wrapper, a Windows Integrated Analysis Environment (IAE), a Java application, and a Web page interface.

  5. Wake acoustic analysis and image decomposition via beamforming of microphone signal projections on wavelet subspaces

    DOT National Transportation Integrated Search

    2006-05-08

    This paper describes the integration of wavelet analysis and time-domain beamforming : of microphone array output signals for analyzing the acoustic emissions from airplane : generated wake vortices. This integrated process provides visual and quanti...

  6. Analysis method comparison of on-time and on-budget data.

    DOT National Transportation Integrated Search

    2007-02-01

    New Mexico Department of Transportation (NMDOT) results for On-Time and On-Budget performance measures as reported in (AASHTO/SCoQ) NCHRP 20-24(37) Project Measuring Performance Among State DOTs (Phase I) are lower than construction personnel kno...

  7. Project development laboratories energy fuels and oils based on NRU “MPEI”

    NASA Astrophysics Data System (ADS)

    Burakov, I. A.; Burakov, A. Y.; Nikitina, I. S.; Khomenkov, A. M.; Paramonova, A. O.; Khtoo Naing, Aung

    2017-11-01

    In the process of improving the efficiency of power plants, the use of high-quality fuels and lubricants is a topical issue. During transportation, preparation for use, storage, and maintenance, the properties of fuels and lubricants may deteriorate, which entails a reduction in the efficiency of power plants. One way to prevent this deterioration is timely analysis by the relevant laboratories. At present, the provision of energy fuel and oil laboratories at thermal power stations is merely satisfactory, and the training of qualified personnel for these laboratories is a serious problem, since the laboratories lack the capability to perform the complete list of required tests. The solution to this problem is to explore the application of methods for analyzing the properties of fuels and lubricants during the training and retraining of qualified personnel. In this regard, laboratory projects for solid, liquid, and gaseous fuels and for power-plant oils and lubricants have been developed on the basis of MPEI. The projects provide the complete list of tests required for timely control of these properties and for preventing their deterioration. The financial component of implementing the developed projects was assessed based on the use of the modern equipment required for the tests.

  8. Subspace-based interference removal methods for a multichannel biomagnetic sensor array.

    PubMed

    Sekihara, Kensuke; Nagarajan, Srikantan S

    2017-10-01

    In biomagnetic signal processing, the theory of the signal subspace has been applied to removing interfering magnetic fields, and a representative algorithm is the signal space projection algorithm, in which the signal/interference subspace is defined in the spatial domain as the span of signal/interference-source lead field vectors. This paper extends the notion of this conventional (spatial domain) signal subspace by introducing a new definition of signal subspace in the time domain. It defines the time-domain signal subspace as the span of row vectors that contain the source time course values. This definition leads to symmetric relationships between the time-domain and the conventional (spatial-domain) signal subspaces. As a review, this article shows that the notion of the time-domain signal subspace provides useful insights over existing interference removal methods from a unified perspective. Using the time-domain signal subspace, it is possible to interpret a number of interference removal methods as the time domain signal space projection. Such methods include adaptive noise canceling, sensor noise suppression, the common temporal subspace projection, the spatio-temporal signal space separation, and the recently-proposed dual signal subspace projection. Our analysis using the notion of the time domain signal space projection reveals implicit assumptions these methods rely on, and shows that the difference between these methods results only from the manner of deriving the interference subspace. Numerical examples that illustrate the results of our arguments are provided.

  9. Subspace-based interference removal methods for a multichannel biomagnetic sensor array

    NASA Astrophysics Data System (ADS)

    Sekihara, Kensuke; Nagarajan, Srikantan S.

    2017-10-01

    Objective. In biomagnetic signal processing, the theory of the signal subspace has been applied to removing interfering magnetic fields, and a representative algorithm is the signal space projection algorithm, in which the signal/interference subspace is defined in the spatial domain as the span of signal/interference-source lead field vectors. This paper extends the notion of this conventional (spatial domain) signal subspace by introducing a new definition of signal subspace in the time domain. Approach. It defines the time-domain signal subspace as the span of row vectors that contain the source time course values. This definition leads to symmetric relationships between the time-domain and the conventional (spatial-domain) signal subspaces. As a review, this article shows that the notion of the time-domain signal subspace provides useful insights over existing interference removal methods from a unified perspective. Main results and significance. Using the time-domain signal subspace, it is possible to interpret a number of interference removal methods as the time domain signal space projection. Such methods include adaptive noise canceling, sensor noise suppression, the common temporal subspace projection, the spatio-temporal signal space separation, and the recently-proposed dual signal subspace projection. Our analysis using the notion of the time domain signal space projection reveals implicit assumptions these methods rely on, and shows that the difference between these methods results only from the manner of deriving the interference subspace. Numerical examples that illustrate the results of our arguments are provided.
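
    Both versions of this record center on signal space projection. The sketch below shows the core operation in the spatial domain: given an orthonormal basis for an interference subspace, sensor data are projected onto its orthogonal complement. The dimensions and random basis are illustrative assumptions; this is the generic projector, not the authors' complete method.

```python
# Minimal sketch: spatial-domain signal space projection. Given an
# orthonormal basis B for the interference subspace (columns playing the
# role of interference-source lead field directions), data are projected
# onto its orthogonal complement. All dimensions are illustrative.
import numpy as np

n_sensors, n_samples, n_interf = 64, 1000, 3
rng = np.random.default_rng(0)

B, _ = np.linalg.qr(rng.standard_normal((n_sensors, n_interf)))  # orthonormal basis
data = rng.standard_normal((n_sensors, n_samples))               # sensors x time

P = np.eye(n_sensors) - B @ B.T        # projector onto the interference-free subspace
cleaned = P @ data

# Components along the interference subspace are removed to numerical precision:
print(np.abs(B.T @ cleaned).max())     # ~1e-15
```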

  10. A Classroom-Based Distributed Workflow Initiative for the Early Involvement of Undergraduate Students in Scientific Research

    ERIC Educational Resources Information Center

    Friedrich, Jon M.

    2014-01-01

    Engaging freshman and sophomore students in meaningful scientific research is challenging because of their developing skill set and their necessary time commitments to regular classwork. A project called the Chondrule Analysis Project was initiated to engage first- and second-year students in an initial research experience and also accomplish…

  11. Chapter 4: Overview of the vegetation management treatment economic analysis module in the integrated landscape assessment project

    Treesearch

    Xiaoping Zhou; Miles A. Hemstrom

    2014-01-01

    Forest land provides various ecosystem services, including timber, biomass, and carbon sequestration. Estimating trends in these ecosystem services is essential for assessing potential outcomes of landscape management scenarios. However, the state-and transition models used in the Integrated Landscape Assessment Project for simulating landscape changes over time do not...

  12. Time Resolved Microfluorescence In Biomedical Diagnosis

    NASA Astrophysics Data System (ADS)

    Schneckenburger, Herbert

    1985-12-01

    A measuring system combining subnanosecond laser-induced fluorescence with microscopic signal detection was installed and used for diverse projects in the biomedical and environmental fields. These projects range from tumor diagnosis and enzymatic analysis to measurements of the activity of methanogenic bacteria, which affect biogas production and waste water cleaning. The advantages of this method and its practical applicability are discussed.

  13. Projected Supply, Demand, and Shortages of Registered Nurses, 2000-2020.

    ERIC Educational Resources Information Center

    Health Resources and Services Administration (DHHS/PHS), Rockville, MD. National Center for Health Workforce Analysis.

    The supply, demand, and shortages of registered nurses (RNs) were projected and analyzed for 2000-2020. According to the analysis, the national supply of full-time-equivalent registered nurses in 2000 was estimated at 1.89 million versus an estimated demand of 2 million, leaving a shortage of 110,000 (6%). The shortage is expected to grow…

  14. Analysis Strategy and Selection Procedure for νμ Charged Current Inclusive Interactions using the π0 Detector (P0D) and the Time Projection Chambers of the T2K Experiment

    NASA Astrophysics Data System (ADS)

    Reinherz-Aronis, Erez; Clifton, Alex; Das, Raj; Toki, Walter; Johnson, Robert; Marino, Alysia; Yuan, Tianlu

    2013-04-01

    νμ charged-current events are produced and collected by the Near Detectors (ND280) in the Tokai to Kamioka (T2K) experiment. This talk focuses on those interactions that are created in the Pi-Zero Detector (P0D) and whose momentum is measured by the Time Projection Chambers (TPC). A description of the analysis event selection is presented, including the data-quality cuts, beam-quality parameters, and fiducial-volume boundaries applied to the beginning of the P0D track. In addition, the procedure for matching a TPC track to a P0D track, and the optimization of this procedure, is presented.

  15. The Advanced Rapid Imaging and Analysis (ARIA) Project: Status of SAR products for Earthquakes, Floods, Volcanoes and Groundwater-related Subsidence

    NASA Astrophysics Data System (ADS)

    Owen, S. E.; Yun, S. H.; Hua, H.; Agram, P. S.; Liu, Z.; Sacco, G. F.; Manipon, G.; Linick, J. P.; Fielding, E. J.; Lundgren, P.; Farr, T. G.; Webb, F.; Rosen, P. A.; Simons, M.

    2017-12-01

    The Advanced Rapid Imaging and Analysis (ARIA) project for Natural Hazards is focused on rapidly generating high-level geodetic imaging products and placing them in the hands of the solid earth science and local, national, and international natural hazard communities by providing science product generation, exploration, and delivery capabilities at an operational level. Space-based geodetic measurement techniques including Interferometric Synthetic Aperture Radar (InSAR), differential Global Positioning System, and SAR-based change detection have become critical additions to our toolset for understanding and mapping the damage and deformation caused by earthquakes, volcanic eruptions, floods, landslides, and groundwater extraction. Until recently, processing of these data sets had been handcrafted for each study or event and had not generated products rapidly and reliably enough for response to natural disasters or for timely analysis of large data sets. The ARIA project, a joint venture co-sponsored by the California Institute of Technology and by NASA through the Jet Propulsion Laboratory, has been capturing the knowledge applied to these responses and building it into an automated infrastructure to generate imaging products in near real-time that can improve situational awareness for disaster response. In addition to supporting the growing science and hazard response communities, the ARIA project has developed the automated imaging and analysis capabilities necessary to keep up with the influx of raw SAR data from geodetic imaging missions such as ESA's Sentinel-1A/B, now operating with repeat intervals as short as 6 days, and the upcoming NASA NISAR mission. We will present the progress and results we have made on automating the analysis of Sentinel-1A/B SAR data for hazard monitoring and response, with emphasis on recent developments and end user engagement in flood extent mapping and deformation time series for both volcano monitoring and mapping of groundwater-related subsidence.

  16. Regional climate projections for the MENA-CORDEX domain: analysis of projected temperature and precipitation changes

    NASA Astrophysics Data System (ADS)

    Hänsler, Andreas; Weber, Torsten; Eggert, Bastian; Saeed, Fahad; Jacob, Daniela

    2014-05-01

    Within the CORDEX initiative, a multi-model suite of regionalized climate change information is being made available for several regions of the world. The German Climate Service Center (CSC) takes part in this initiative by applying the regional climate model REMO to downscale global climate projections of different coupled general circulation models (GCMs) for several CORDEX domains. For the MENA-CORDEX domain, a set of regional climate change projections has been established at the CSC by downscaling CMIP5 projections of the Max-Planck-Institute Earth System Model (MPI-ESM) for the scenarios RCP4.5 and RCP8.5 with the regional model REMO, covering the period from 1950 to 2100 at a horizontal resolution of 0.44 degrees. In this study we investigate projected changes in future climate conditions over the domain towards the end of the 21st century. The analysis focuses on projected changes in temperature and rainfall characteristics, and differences between the two scenarios are highlighted.

  17. Time dependent analysis of assay comparability: a novel approach to understand intra- and inter-site variability over time

    NASA Astrophysics Data System (ADS)

    Winiwarter, Susanne; Middleton, Brian; Jones, Barry; Courtney, Paul; Lindmark, Bo; Page, Ken M.; Clark, Alan; Landqvist, Claire

    2015-09-01

    We demonstrate here a novel use of statistical tools to study the intra- and inter-site variability of five early drug metabolism and pharmacokinetics in vitro assays over time. Firstly, a tool for process control is presented. It shows the overall assay variability, allows changes due to assay adjustments to be followed, and can additionally highlight other, potentially unexpected variations. Secondly, we define the minimum discriminatory difference/ratio to help projects understand how experimental values measured at different sites at a given time can be compared. Such discriminatory values are calculated for 3-month periods and followed over time for each assay. Again, assay modifications, especially assay harmonization efforts, can be noted. Both the process control tool and the variability estimates are based on the results of control compounds tested every time an assay is run. Variability estimates for a limited set of project compounds were computed as well and found to be comparable. This analysis reinforces the need to consider assay variability in decision making, compound ranking, and in silico modeling.
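
    As an illustration of one plausible formulation of a minimum discriminatory ratio, the sketch below derives, from repeated control-compound runs and assuming log-normal assay error, the smallest fold-difference between two single measurements that exceeds assay noise at 95% confidence. The control values, the confidence level, and the exact formula are assumptions; the paper's own definition is not reproduced here.

```python
# Minimal sketch of one common formulation of a minimum discriminatory
# ratio (MDR) under a log-normal error model. The control-compound values
# and the formula's exact form are assumptions, not taken from the paper.
import numpy as np

control_runs = np.array([5.2, 4.8, 6.1, 5.5, 4.9, 5.8, 5.1, 6.0])  # repeated runs
sd_log = np.std(np.log(control_runs), ddof=1)      # assay SD on the log scale

mdr = np.exp(1.96 * np.sqrt(2) * sd_log)           # sqrt(2): both values carry noise
print(f"two results differ meaningfully only beyond a {mdr:.2f}-fold ratio")
```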

  18. Parent-adolescent joint projects involving leisure time and activities during the transition to high school.

    PubMed

    Marshall, Sheila K; Young, Richard A; Wozniak, Agnieszka; Lollis, Susan; Tilton-Weaver, Lauree; Nelson, Margo; Goessling, Kristen

    2014-10-01

    Leisure research to date has generally overlooked planning and organizing of leisure time and activities between parents and adolescents. This investigation examined how a sample of Canadian adolescents and their parents jointly constructed and acted on goals related to adolescents' leisure time during the move from elementary to high school. Using the Qualitative Action-Project Method, data were collected over an 8-10 month period from 26 parent-adolescent dyads located in two urban sites, through video-taped conversations about leisure time, video recall interviews, and telephone monitoring interviews. Analysis of the data revealed that the joint projects of the 26 dyads could be grouped into three clusters: a) governance transfer or attempts to shift, from parent to adolescent, responsibility over academic demands, organizing leisure time, and safety with peers, b) balancing extra-curricular activities with family life, academics, and social activities, and c) relationship adjustment or maintenance.

  19. The concept of value stream mapping to reduce of work-time waste as applied the smart construction management

    NASA Astrophysics Data System (ADS)

    Elizar, Suripin, Wibowo, Mochamad Agung

    2017-11-01

    Delays on construction sites occur due to the systematic accumulation of time waste across the various activities that make up the construction process. Work-time waste is non-value-adding activity; the term distinguishes waste occurring during the construction process from the physical construction waste found on site. The aim of this study is to identify work-time waste using the concept of Value Stream Mapping (VSM), applied within smart construction management. VSM analysis is a business process improvement method whose application began in the manufacturing community. The research method is based on a theoretically informed case study and a literature review. Data were collected by questionnaire through personal interviews with 383 respondents on construction projects in Indonesia. The results show that the concept of VSM can identify causes of work-time waste. From the questionnaire results and a quantitative analysis, 29 variables were obtained that influence work-time waste, or non-value-adding activities. In three construction project cases, an average of 14.88% of working time was classified as waste. The concept of VSM can thus systematically reveal current practices and opportunities for improvement in the face of global challenges: it can help optimize the reduction of work-time waste, improve quality standards in construction management, and support managers in decisions that make performance more efficient and construction projects more sustainable.
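
    As a toy illustration of how a VSM-style activity log yields a work-time waste share of the kind reported above, the sketch below classifies activities as value-adding or not and computes the waste percentage. The activities, durations, and flags are invented for illustration.

```python
# Minimal sketch: computing the work-time waste share from a value-stream
# activity log, in the spirit of VSM. All entries are illustrative assumptions.
activities = [
    ("formwork installation",   180, True),   # (name, minutes, value-adding?)
    ("waiting for materials",    35, False),
    ("rebar placement",         150, True),
    ("rework after inspection",  25, False),
    ("concrete pouring",        120, True),
]

total = sum(minutes for _, minutes, _ in activities)
waste = sum(minutes for _, minutes, adds_value in activities if not adds_value)
print(f"work-time waste: {100 * waste / total:.2f}% of {total} min observed")
```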

  20. Development and analysis of a modular approach to payload specialist training. [training of spacecrews for Spacelab

    NASA Technical Reports Server (NTRS)

    Watters, H.; Steadman, J.

    1976-01-01

    A modular training approach for Spacelab payload crews is described. Representative missions are defined for training requirements analysis, training hardware, and simulations. Training times are projected for each experiment of each representative flight. A parametric analysis of the various flights defines resource requirements for a modular training facility at different flight frequencies. The modular approach is believed to be more flexible, time saving, and economical than previous single high fidelity trainer concepts. Block diagrams of training programs are shown.

  1. The development of the nursing profession in a globalised context: A qualitative case study in Kerala, India.

    PubMed

    Timmons, Stephen; Evans, Catrin; Nair, Sreelekha

    2016-10-01

    In this paper, we look at the relationship between globalisation and the professional project, using nursing in Kerala as an exemplar. Our focus is on the intersection of the professional project, gender, and globalisation processes. Included in our analysis are the ways in which gender affects the professional project in the global south, and the development of a professional project that is closely tied to global markets and global migration, revealing the political-economic, historical, and cultural factors that influence the shape and consequences of nurse migration. The phenomenon that enabled our analysis, by showing these forces at work in a particular time and place, was an outbreak of strikes by nurses working in private hospitals in Kerala in 2011-2012.

  2. Projecting Future Scheduled Airline Demand, Schedules and NGATS Benefits Using TSAM

    NASA Technical Reports Server (NTRS)

    Dollyhigh, Samuel; Smith, Jeremy; Viken, Jeff; Trani, Antonio; Baik, Hojong; Hinze, Nickolas; Ashiabor, Senanu

    2006-01-01

    The Transportation Systems Analysis Model (TSAM) developed by Virginia Tech's Air Transportation Systems Lab and NASA Langley can provide detailed analysis of the effects on the demand for air travel of a full range of NASA and FAA aviation projects. TSAM has been used to project the passenger demand for very light jet (VLJ) air taxi service, scheduled airline demand growth and future schedules, Next Generation Air Transportation System (NGATS) benefits, and future passenger revenues for the Airport and Airway Trust Fund. TSAM can project the resulting demand when new vehicles and/or technology is inserted into the long distance (100 or more miles one-way) transportation system, as well as changes in demand as a result of fare yield increases or decreases, airport transit times, scheduled flight times, ticket taxes, reductions or increases in flight delays, and so on. TSAM models all long distance travel in the contiguous U.S. and determines the mode choice of the traveler based on detailed trip costs, travel time, schedule frequency, purpose of the trip (business or non-business), and household income level of the traveler. Demand is modeled at the county level, with an airport choice module providing up to three airports as part of the mode choice. Future enplanements at airports can be projected for different scenarios. A Fratar algorithm and a schedule generator are applied to generate future flight schedules. This paper presents the application of TSAM to modeling future scheduled air passenger demand and resulting airline schedules, the impact of NGATS goals and objectives on passenger demand, along with projections for passenger fee receipts for several scenarios for the FAA Airport and Airway Trust Fund.

  3. Best practices from WisDOT mega and ARRA projects : statistical analysis and % time vs. % cost metrics.

    DOT National Transportation Integrated Search

    2012-03-01

    This study was undertaken to: 1) apply a benchmarking process to identify best practices within four areas Wisconsin Department of Transportation (WisDOT) construction management and 2) analyze two performance metrics, % Cost vs. % Time, tracked by t...

  4. 7 CFR 1775.20 - Reporting.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... (CONTINUED) TECHNICAL ASSISTANCE GRANTS Grant Application Processing § 1775.20 Reporting. (a) Grantees shall constantly monitor performance to ensure that time schedules are being met, projected work by time periods is..., results of activity); (2) Analysis of challenges or setbacks that occurred during the grant period; (3...

  5. Computational Understanding: Analysis of Sentences and Context

    DTIC Science & Technology

    1974-05-01

    Computer Science Department, Stanford, California. ...there is the need for programs that can respond in useful ways to information expressed in a natural language. However a computational understanding... buying structure because "Mary" appears where it does. But the time for analysis was rarely over five seconds of computer time, when the Lisp program...

  6. United States Marine Corps Aerial Refueling Requirements Analysis

    DTIC Science & Technology

    2000-12-01

    REFUELING REQUIREMENTS ANALYSIS. William R. Gates, Systems Management Department, Naval Postgraduate School, 555 Dyer Road, Room 229, Monterey, CA 93943. ...refueling customers. KC130s are rotated through the track 24 hours per day, providing customers fuel as needed. KC130 sorties are scheduled to reflect the... projected average customer (aircraft) arrival rates, average service times (time required to engage the drogue, receive fuel and...

  7. Existing Whole-House Solutions Case Study: Bay Ridge Gardens - Mixed Humid Affordable Multifamily Housing Deep Energy Retrofit, Annapolis, Maryland

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2013-10-01

    Under this project, the BA-PIRC research team evaluated the installation, measured performance, and cost-effectiveness of efficiency upgrade measures for a tenant-in-place deep energy retrofit at the Bay Ridge multifamily development in Annapolis, Maryland. The design and construction phase of the Bay Ridge project was completed in August 2012. This case study summarizes system commissioning, short-term test results, utility bill data analysis, and analysis of real-time data collected over a one-year period after the retrofit was complete.

  8. Methodology for construction compliance monitoring in the crediting of investment projects for road construction

    NASA Astrophysics Data System (ADS)

    Vaynshtok, Natalia

    2017-10-01

    The article presents the results of developing a methodology for construction compliance monitoring in the crediting of investment projects for road construction. A work scope analysis of construction audit was conducted, and an algorithm for financial audit in crediting investment projects was developed. Furthermore, possible pitfalls and abuses by counterparties were investigated, and recommendations were given that allow the bank to receive objective and independent information on the progress of the project in real time. This mechanism is useful to the bank for insuring against possible risks and for the targeted and rational use of credit funds.

  9. Finite difference time domain modeling of spiral antennas

    NASA Technical Reports Server (NTRS)

    Penney, Christopher W.; Beggs, John H.; Luebbers, Raymond J.

    1992-01-01

    The objectives outlined in the original proposal for this project were to create a well-documented computer analysis model based on the finite-difference, time-domain (FDTD) method that would be capable of computing antenna impedance, far-zone radiation patterns, and radar cross-section (RCS). The ability to model a variety of penetrable materials in addition to conductors is also desired. The spiral antennas under study by this project meet these requirements since they are constructed of slots cut into conducting surfaces which are backed by dielectric materials.

  10. Configuration management issues and objectives for a real-time research flight test support facility

    NASA Technical Reports Server (NTRS)

    Yergensen, Stephen; Rhea, Donald C.

    1988-01-01

    An account is given of configuration management activities for the Western Aeronautical Test Range (WATR) at NASA-Ames, whose primary function is the conduct of aeronautical research flight testing through real-time processing and display, tracking, and communications systems. The processing of WATR configuration change requests for specific research flight test projects must be conducted in such a way as to refrain from compromising the reliability of WATR support to all project users. Configuration management's scope ranges from mission planning to operations monitoring and performance trend analysis.

  11. Theoretical analysis to interpret projected image data from in-situ 3-dimensional equiaxed nucleation and growth

    NASA Astrophysics Data System (ADS)

    Mooney, Robin P.; McFadden, Shaun

    2017-12-01

    In-situ observation of crystal growth in transparent media allows us to observe solidification phase change in real-time. These systems are analogous to opaque systems such as metals. The interpretation of transient 2-dimensional area projections from 3-dimensional phase change phenomena occurring in a bulky sample is problematic due to uncertainty of impingement and hidden nucleation events; in stereology this problem is known as over-projection. This manuscript describes and demonstrates a continuous model for nucleation and growth using the well-established Johnson-Mehl-Avrami-Kolmogorov model, and provides a method to relate 3-dimensional volumetric data (nucleation events, volume fraction) to observed data in a 2-dimensional projection (nucleation count, area fraction). A parametric analysis is performed; the projection phenomenon is shown to be significant in cases where nucleation is occurring continuously with a relatively large variance. In general, area fraction on a projection plane will overestimate the volume fraction within the sample and the nuclei count recorded on the projection plane will underestimate the number of real nucleation events. The statistical framework given in this manuscript provides a methodology to deal with the differences between the observed (projected) data and the real (volumetric) measures.
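
    A minimal Monte Carlo sketch of the over-projection effect: spheres of a common radius placed in a slab are sampled both in 3-D (volume fraction) and through their projected 2-D disks (area fraction), and the projected area fraction comes out markedly higher. The slab geometry, nucleus count, and fixed radius are illustrative assumptions, not the paper's JMAK-based continuous model.

```python
# Minimal sketch: Monte Carlo illustration of "over-projection" -- the
# projected area fraction of spheres in a slab exceeds the true volume
# fraction. All geometry parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
L, depth, n_nuclei, radius = 1.0, 0.3, 60, 0.05
centers = rng.uniform([0.0, 0.0, 0.0], [L, L, depth], size=(n_nuclei, 3))
pts = rng.uniform([0.0, 0.0, 0.0], [L, L, depth], size=(50_000, 3))

# True volume fraction: sample points falling inside any sphere.
d3 = np.linalg.norm(pts[:, None, :] - centers[None, :, :], axis=2)
vol_frac = (d3.min(axis=1) < radius).mean()

# Projected area fraction: (x, y) positions falling inside any projected disk.
d2 = np.linalg.norm(pts[:, None, :2] - centers[None, :, :2], axis=2)
area_frac = (d2.min(axis=1) < radius).mean()

print(f"volume fraction {vol_frac:.3f} < projected area fraction {area_frac:.3f}")
```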

  12. What drives continuous improvement project success in healthcare?

    PubMed

    Stelson, Paul; Hille, Joshua; Eseonu, Chinweike; Doolen, Toni

    2017-02-13

    Purpose The purpose of this paper is to present findings from a study of factors that affect continuous improvement (CI) project success in hospitals. Design/methodology/approach Quantitative regression analysis was performed on Likert scale survey responses. Qualitative thematic analysis was performed on open-ended survey responses and written reports on CI projects. Findings The paper identifies managerial and employee factors that affect project success. These factors include managerial support, communication, and affective commitment. Affective commitment is the extent to which employees perceive the change as being needed or necessary. Practical implications The results highlight how managerial decisions and approaches to communication, before, during, and after CI projects, affect project success. The results also show that success depends on the way employees perceive proposed changes. This suggests the need for a more individualized approach to CI, lean, and broader change initiatives. Originality/value This research is the first to apply project success and sustainability theory to CI projects, beyond Kaizen events, in healthcare environments. The research is particularly important at a time when healthcare organizations are required to make rapid changes with limited resources as they work toward outcome-based assessment and reimbursement rules.

  13. Benefit-Cost Analysis of TAT Phase I Worker Training. Training and Technology Project. Special Report.

    ERIC Educational Resources Information Center

    Kirby, Frederick C.; Castagna, Paul A.

    The purpose of this study is to estimate costs and benefits and to compute alternative benefit-cost ratios for both the individuals and the Federal Government as a result of investing time and resources in the Training and Technology (TAT) Project. TAT is a continuing experimental program in training skilled workers for private industry. The five…

  14. Time-Resolved Microfluorescence In Biomedical Diagnosis

    NASA Astrophysics Data System (ADS)

    Schneckenburger, Herbert

    1985-02-01

    A measuring system combining subnanosecond laser-induced fluorescence with microscopic signal detection was installed and used for diverse projects in the biomedical and environmental fields. These projects range from tumor diagnosis and enzymatic analysis to measurements of the activity of methanogenic bacteria, which affect biogas production and waste water cleaning. The advantages of this method and its practical applicability are discussed.

  15. A Factorial Analysis of Timed and Untimed Measures of Mathematics and Reading Abilities in School Aged Twins

    ERIC Educational Resources Information Center

    Hart, Sara A.; Petrill, Stephen A.; Thompson, Lee A.

    2010-01-01

    The present study examined the phenotypic and genetic relationship between fluency and non-fluency-based measures of reading and mathematics performance. Participants were drawn from the Western Reserve Reading and Math Project, an ongoing longitudinal twin project of same-sex MZ and DZ twins from Ohio. The present analyses are based on…

  16. The SMART-NAS Testbed

    NASA Technical Reports Server (NTRS)

    Aquilina, Rudolph A.

    2015-01-01

    The SMART-NAS Testbed for Safe Trajectory Based Operations Project will deliver an evaluation capability, critical to the ATM community, allowing full NextGen and beyond-NextGen concepts to be assessed and developed. To meet this objective, a strong focus will be placed on concept integration and validation to enable a gate-to-gate trajectory-based system capability that satisfies a full vision for NextGen. The SMART-NAS for Safe TBO Project consists of six sub-projects. Three of the sub-projects are focused on exploring and developing technologies, concepts and models for evolving and transforming air traffic management operations in the ATM+2 time horizon, while the remaining three sub-projects are focused on developing the tools and capabilities needed for testing these advanced concepts. Function Allocation, Networked Air Traffic Management and Trajectory Based Operations are developing concepts and models. SMART-NAS Testbed, System Assurance Technologies and Real-time Safety Modeling are developing the tools and capabilities to test these concepts. Simulation and modeling capabilities will include the ability to assess multiple operational scenarios of the national airspace system, accept data feeds that allow shadowing of actual operations in real-time, fast-time, and/or hybrid modes of operation in distributed environments, and enable integrated examinations of concepts, algorithms, technologies, and NAS architectures. An important focus within this project is to enable the development of a real-time, system-wide safety assurance system. The basis of such a system is a continuum of information acquisition, analysis, and assessment that enables awareness and corrective action to detect and mitigate potential threats to continuous system-wide safety at all levels. This process, which currently can only be done post operations, will be driven towards "real-time" assessments in the 2035 time frame.

  17. The NOVA project: maximizing beam time efficiency through synergistic analyses of SRμCT data

    NASA Astrophysics Data System (ADS)

    Schmelzle, Sebastian; Heethoff, Michael; Heuveline, Vincent; Lösel, Philipp; Becker, Jürgen; Beckmann, Felix; Schluenzen, Frank; Hammel, Jörg U.; Kopmann, Andreas; Mexner, Wolfgang; Vogelgesang, Matthias; Jerome, Nicholas Tan; Betz, Oliver; Beutel, Rolf; Wipfler, Benjamin; Blanke, Alexander; Harzsch, Steffen; Hörnig, Marie; Baumbach, Tilo; van de Kamp, Thomas

    2017-09-01

    Beam time and the resulting SRμCT data are a valuable resource for researchers of a broad scientific community in the life sciences. Most research groups, however, are interested only in a specific organ and use only a fraction of their data; the rest usually remains untapped. Through a new collaborative approach, the NOVA project (Network for Online Visualization and synergistic Analysis of tomographic data) aims to demonstrate that more efficient use of the valuable beam time is possible through coordinated research on different organ systems. The biological partners in the project cover different scientific aspects and thus serve as a model community for the collaborative approach. As proof of principle, different aspects of insect head morphology will be investigated (e.g., biomechanics of the mouthparts, and neurobiology with the topology of sensory areas). This effort is accomplished through the development of advanced analysis tools for the ever-increasing quantity of tomographic datasets. In the preceding project, ASTOR, we already demonstrated considerable progress in semi-automatic segmentation and classification of internal structures. Further improvement of these methods is essential for efficient use of beam time and will be pursued in the current NOVA project. Significant enhancements are also planned at PETRA III beamline P05, covering all possible contrast modalities in x-ray imaging optimized for biological samples, the reconstruction algorithms, and the tools for subsequent analysis and management of the data. All improvements made to key technologies within this project will, in the long term, be equally beneficial for all users of tomography instrumentation.

  18. Statistical analysis of vessel waiting time and lockage times on the upper Mississippi River.

    DOT National Transportation Integrated Search

    2011-10-01

    This project uses statistical methods to analyze traffic congestion of the upper Mississippi and : the Illinois Rivers, in particular, locks 18, 20, 21, 22, 24, and 25 on the upper Mississippi and : the Lagrange and Peoria locks on the Illinois River...

  19. Cyberpark 2000: Protected Areas Management Pilot Project. Satellite time series vegetation monitoring

    NASA Astrophysics Data System (ADS)

    Monteleone, M.; Lanorte, A.; Lasaponara, R.

    2009-04-01

    Cyberpark 2000 is a project funded by the EU Regional Operating Program of the Apulia Region (2000-2006). Its main objective is to develop a new assessment model for the management and monitoring of protected areas in Foggia Province (Apulia Region) based on Information and Communication Technologies. The results described here are part of research activities aimed at developing an environmental monitoring system based on satellite time series. The study includes: (A) satellite time series of high-spatial-resolution data to support the analysis of static fire risk factors through land use mapping and the spectral/quantitative characterization of vegetation fuels; (B) MODIS satellite time series to support dynamic fire risk evaluation of the study area, integrated with fire detection by thermal imaging cameras placed at panoramic viewpoints; (C) integrated high-spatial- and high-temporal-resolution satellite time series to support studies of change detection factors or anomalies in vegetation cover; and (D) satellite time series for monitoring (i) post-fire vegetation recovery and (ii) spatio-temporal vegetation dynamics in unburned and burned vegetation cover.

  20. Real-time analysis keratometer

    NASA Technical Reports Server (NTRS)

    Adachi, Iwao P. (Inventor); Adachi, Yoshifumi (Inventor); Frazer, Robert E. (Inventor)

    1987-01-01

    A computer assisted keratometer in which a fiducial line pattern reticle illuminated by CW or pulsed laser light is projected on a corneal surface through lenses, a prismoidal beamsplitter, a quarterwave plate, and objective optics. The reticle surface is curved as a conjugate of an ideal corneal curvature. The fiducial image reflected from the cornea undergoes a polarization shift through the quarterwave plate and beamsplitter, whereby the projected and reflected beams are separated and directed orthogonally. The reflected beam fiducial pattern forms a moire pattern with a replica of the first reticle. This moire pattern contains transverse aberration due to differences in curvature between the cornea and the ideal corneal curvature. The moire pattern is analyzed in real time by a computer which displays either the CW moire pattern or a pulsed mode analysis of the transverse aberration of the cornea under observation, in real time. With the eye focused on a plurality of fixation points in succession, a survey of the entire corneal topography is made, and a contour map or three dimensional plot of the cornea can be produced as a computer readout in addition to corneal radius and refractive power analysis.

  1. Acquisition and production of skilled behavior in dynamic decision-making tasks

    NASA Technical Reports Server (NTRS)

    Kirlik, Alex

    1993-01-01

    Summaries of the four projects completed during the performance of this research are included. The four projects described are: Perceptual Augmentation Aiding for Situation Assessment, Perceptual Augmentation Aiding for Dynamic Decision-Making and Control, Action Advisory Aiding for Dynamic Decision-Making and Control, and Display Design to Support Time-Constrained Route Optimization. Papers based on each of these projects are currently in preparation. The theoretical framework upon which the first three projects are based, Ecological Task Analysis, was also developed during the performance of this research, and is described in a previous report. A project concerned with modeling strategies in human control of a dynamic system was also completed during the performance of this research.

  2. Assessment Approach for Identifying Compatibility of Restoration Projects with Geomorphic and Flooding Processes in Gravel Bed Rivers

    NASA Astrophysics Data System (ADS)

    DeVries, Paul; Aldrich, Robert

    2015-08-01

    A critical requirement for a successful river restoration project in a dynamic gravel bed river is that it be compatible with natural hydraulic and sediment transport processes operating at the reach scale. The potential for failure is greater at locations where the influence of natural processes is inconsistent with intended project function and performance. We present an approach using practical GIS, hydrologic, hydraulic, and sediment transport analyses to identify locations where specific restoration project types have the greatest likelihood of working as intended because their function and design are matched with flooding and morphologic processes. The key premise is to identify whether a specific river analysis segment (length ~1-10 bankfull widths) within a longer reach is geomorphically active or inactive in the context of vertical and lateral stabilities, and hydrologically active for floodplain connectivity. Analyses involve empirical channel geometry relations, aerial photographic time series, LiDAR data, HEC-RAS hydraulic modeling, and a time-integrated sediment transport budget to evaluate trapping efficiency within each segment. The analysis segments are defined by HEC-RAS model cross sections. The results have been used effectively to identify feasible projects in a variety of alluvial gravel bed river reaches with lengths between 11 and 80 km and 2-year flood magnitudes between ~350 and 1330 m3/s. Projects constructed based on the results have all performed as planned. In addition, the results provide key criteria for formulating erosion and flood management plans.

  3. Assessment Approach for Identifying Compatibility of Restoration Projects with Geomorphic and Flooding Processes in Gravel Bed Rivers.

    PubMed

    DeVries, Paul; Aldrich, Robert

    2015-08-01

    A critical requirement for a successful river restoration project in a dynamic gravel bed river is that it be compatible with natural hydraulic and sediment transport processes operating at the reach scale. The potential for failure is greater at locations where the influence of natural processes is inconsistent with intended project function and performance. We present an approach using practical GIS, hydrologic, hydraulic, and sediment transport analyses to identify locations where specific restoration project types have the greatest likelihood of working as intended because their function and design are matched with flooding and morphologic processes. The key premise is to identify whether a specific river analysis segment (length ~1-10 bankfull widths) within a longer reach is geomorphically active or inactive in the context of vertical and lateral stabilities, and hydrologically active for floodplain connectivity. Analyses involve empirical channel geometry relations, aerial photographic time series, LiDAR data, HEC-RAS hydraulic modeling, and a time-integrated sediment transport budget to evaluate trapping efficiency within each segment. The analysis segments are defined by HEC-RAS model cross sections. The results have been used effectively to identify feasible projects in a variety of alluvial gravel bed river reaches with lengths between 11 and 80 km and 2-year flood magnitudes between ~350 and 1330 m(3)/s. Projects constructed based on the results have all performed as planned. In addition, the results provide key criteria for formulating erosion and flood management plans.
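
    A minimal sketch of the time-integrated sediment budget idea for a single analysis segment, where trapping efficiency is the fraction of incoming load retained over the analysis period. The daily flow series and the rating-curve coefficients are invented for illustration and do not reproduce the authors' HEC-RAS-based budget.

```python
# Minimal sketch: a time-integrated sediment budget for one analysis
# segment, with trapping efficiency = (load in - load out) / load in.
# The flows and rating-curve coefficients are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(7)
q = rng.lognormal(mean=4.0, sigma=0.8, size=365)   # daily discharge, m3/s

load_in = 0.05 * q ** 1.8                          # upstream rating curve
load_out = 0.04 * q ** 1.8                         # downstream rating curve

trap_eff = (load_in.sum() - load_out.sum()) / load_in.sum()
print(f"annual trapping efficiency: {trap_eff:.1%}")   # positive => aggrading
```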

  4. Improving Initiation and Tracking of Research Projects at an Academic Health Center: A Case Study.

    PubMed

    Schmidt, Susanne; Goros, Martin; Parsons, Helen M; Saygin, Can; Wan, Hung-Da; Shireman, Paula K; Gelfond, Jonathan A L

    2017-09-01

    Research service cores at academic health centers are important in driving translational advancements. Specifically, biostatistics and research design units provide services and training in data analytics, biostatistics, and study design. However, the increasing demand and complexity of assigning appropriate personnel to time-sensitive projects strains existing resources, potentially decreasing productivity and increasing costs. Improving processes for project initiation, assigning appropriate personnel, and tracking time-sensitive projects can eliminate bottlenecks and utilize resources more efficiently. In this case study, we describe our application of lean six sigma principles to our biostatistics unit to establish a systematic continual process improvement cycle for intake, allocation, and tracking of research design and data analysis projects. The define, measure, analyze, improve, and control methodology was used to guide the process improvement. Our goal was to assess and improve the efficiency and effectiveness of operations by objectively measuring outcomes, automating processes, and reducing bottlenecks. As a result, we developed a web-based dashboard application to capture, track, categorize, streamline, and automate project flow. Our workflow system resulted in improved transparency, efficiency, and workload allocation. Using the dashboard application, we reduced the average study intake time from 18 to 6 days, a 66.7% reduction over 12 months (January to December 2015).

  5. Software Project Management and Measurement on the World-Wide-Web (WWW)

    NASA Technical Reports Server (NTRS)

    Callahan, John; Ramakrishnan, Sudhaka

    1996-01-01

    We briefly describe a system for forms-based, work-flow management that helps members of a software development team overcome geographical barriers to collaboration. Our system, called the Web Integrated Software Environment (WISE), is implemented as a World-Wide-Web service that allows for management and measurement of software development projects based on dynamic analysis of change activity in the workflow. WISE tracks issues in a software development process, provides informal communication between the users with different roles, supports to-do lists, and helps in software process improvement. WISE minimizes the time devoted to metrics collection and analysis by providing implicit delivery of messages between users based on the content of project documents. The use of a database in WISE is hidden from the users who view WISE as maintaining a personal 'to-do list' of tasks related to the many projects on which they may play different roles.

  6. A Genealogical Interpretation of Principal Components Analysis

    PubMed Central

    McVean, Gil

    2009-01-01

Principal components analysis, PCA, is a statistical method commonly used in population genetics to identify structure in the distribution of genetic variation across geographical location and ethnic background. However, while the method is often used to inform about historical demographic processes, little is known about the relationship between fundamental demographic parameters and the projection of samples onto the primary axes. Here I show that for SNP data the projection of samples onto the principal components can be obtained directly from considering the average coalescent times between pairs of haploid genomes. The result provides a framework for interpreting PCA projections in terms of underlying processes, including migration, geographical isolation, and admixture. I also demonstrate a link between PCA and Wright's FST and show that SNP ascertainment has a largely simple and predictable effect on the projection of samples. Using examples from human genetics, I discuss the application of these results to empirical data and the implications for inference. PMID:19834557
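
    The central observation, that PCA projections can be recovered from a matrix of pairwise quantities between genomes, can be sketched numerically. The toy example below uses random 0/1 genotypes (not real SNP data) and recovers the PC scores from the eigendecomposition of the n-by-n sample covariance between genomes; in this framework the expectation of that matrix is a linear function of average pairwise coalescent times:

    ```python
    # Minimal sketch of the duality the paper exploits: PCA projections of SNP
    # data can be recovered from the matrix of pairwise similarities between
    # genomes. Data here are random placeholders, not real genotypes.
    import numpy as np

    rng = np.random.default_rng(0)
    n, L = 20, 500                        # haploid genomes, SNP sites
    G = rng.integers(0, 2, size=(n, L)).astype(float)

    X = G - G.mean(axis=0)                # centre each SNP
    C = X @ X.T / L                       # n x n covariance between genomes
    vals, vecs = np.linalg.eigh(C)        # eigendecomposition of pairwise matrix
    order = np.argsort(vals)[::-1]
    pc_scores = vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))

    # Leading columns of pc_scores are the sample projections onto PC1, PC2, ...
    print(pc_scores[:5, :2])
    ```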

  7. Effort Drivers Estimation for Brazilian Geographically Distributed Software Development

    NASA Astrophysics Data System (ADS)

    Almeida, Ana Carina M.; Souza, Renata; Aquino, Gibeon; Meira, Silvio

To meet the requirements of today’s fast-paced markets, it is important to develop projects on time and with the minimum use of resources. A good estimate is key to achieving this goal. Several companies have started to work with geographically distributed teams to reduce costs and shorten time-to-market. Some researchers indicate that this approach introduces new challenges, because the teams work in different time zones and may differ in culture and language. It is already known that multisite development increases the software cycle time. Data from 15 DSD projects from 10 distinct companies were collected. The analysis identifies drivers that significantly impact the total effort planned to develop systems using the DSD approach in Brazil.

  8. Characterizing the performance of ecosystem models across time scales: A spectral analysis of the North American Carbon Program site-level synthesis

    Treesearch

    Michael C. Dietze; Rodrigo Vargas; Andrew D. Richardson; Paul C. Stoy; Alan G. Barr; Ryan S. Anderson; M. Altaf Arain; Ian T. Baker; T. Andrew Black; Jing M. Chen; Philippe Ciais; Lawrence B. Flanagan; Christopher M. Gough; Robert F. Grant; David Hollinger; R. Cesar Izaurralde; Christopher J. Kucharik; Peter Lafleur; Shugang Liu; Erandathie Lokupitiya; Yiqi Luo; J. William Munger; Changhui Peng; Benjamin Poulter; David T. Price; Daniel M. Ricciuto; William J. Riley; Alok Kumar Sahoo; Kevin Schaefer; Andrew E. Suyker; Hanqin Tian; Christina Tonitto; Hans Verbeeck; Shashi B. Verma; Weifeng Wang; Ensheng Weng

    2011-01-01

    Ecosystem models are important tools for diagnosing the carbon cycle and projecting its behavior across space and time. Despite the fact that ecosystems respond to drivers at multiple time scales, most assessments of model performance do not discriminate different time scales. Spectral methods, such as wavelet analyses, present an alternative approach that enables the...

  9. Project of Near-Real-Time Generation of ShakeMaps and a New Hazard Map in Austria

    NASA Astrophysics Data System (ADS)

    Jia, Yan; Weginger, Stefan; Horn, Nikolaus; Hausmann, Helmut; Lenhardt, Wolfgang

    2016-04-01

Target-orientated prevention and effective crisis management can reduce or avoid damage and save lives in case of a strong earthquake. To achieve this goal, a project for automatically generated ShakeMaps (maps of ground motion and shaking intensity) and for updating the Austrian hazard map was started at ZAMG (Zentralanstalt für Meteorologie und Geodynamik) in 2015. The first goal of the project is the near-real-time generation of ShakeMaps following strong earthquakes in Austria, to provide rapid, accurate and official information in support of governmental crisis management. Using methods and software newly developed by SHARE (Seismic Hazard Harmonization in Europe) and GEM (Global Earthquake Model), which allow a transnational analysis at the European level, a new generation of Austrian hazard maps will ultimately be calculated. This presentation gives more information on the project and its current status.

  10. Differential Die-Away Instrument: Report on Fuel Assembly Mock-up Measurements with Neutron Generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goodsell, Alison Victoria; Swinhoe, Martyn Thomas; Henzl, Vladimir

    2014-09-18

Fresh fuel experiments for the differential die-away (DDA) project were performed using a DT neutron generator, a 15x15 PWR fuel assembly, and nine 3He detectors in a water tank inside of a shielded cell at Los Alamos National Laboratory (LANL). Eight different fuel enrichments were created using low enriched (LEU) and depleted uranium (DU) dioxide fuel rods. A list-mode data acquisition system recorded the time-dependent signal and analysis of the DDA signal die-away time was performed. The die-away time depended on the amount of fissile material in the fuel assembly and the position of the detector. These experiments were performed in support of the spent nuclear fuel Next Generation Safeguards Initiative DDA project. Lessons learned from the fresh fuel DDA instrument experiments and simulations will provide useful information to the spent fuel project.
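
    Estimating a die-away time from a recorded time-dependent signal is, in its simplest form, an exponential fit. The sketch below, with entirely synthetic counts and an invented time constant, shows one hedged version of that step; the actual analysis of list-mode 3He data is more involved:

    ```python
    # Minimal sketch: estimating a die-away time constant by fitting a single
    # exponential plus background to a neutron count signal. Synthetic data only.
    import numpy as np
    from scipy.optimize import curve_fit

    def decay(t, a, tau, bkg):
        return a * np.exp(-t / tau) + bkg

    t = np.linspace(0.0, 5000.0, 200)            # microseconds after the DT pulse
    rng = np.random.default_rng(1)
    counts = decay(t, 1000.0, 800.0, 5.0) + rng.normal(0.0, 5.0, t.size)

    popt, _ = curve_fit(decay, t, counts, p0=(900.0, 500.0, 1.0))
    print(f"fitted die-away time: {popt[1]:.0f} microseconds")
    ```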

  11. POWER SPECTRUM ANALYSIS AND TIME-DOMAIN ANALYSIS OF HRV RECORDED BY TELEMETRY IN UNRESTRAINED RATS. (R828046)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  12. Decision Analysis Techniques for Adult Learners: Application to Leadership

    ERIC Educational Resources Information Center

    Toosi, Farah

    2017-01-01

    Most decision analysis techniques are not taught at higher education institutions. Leaders, project managers and procurement agents in industry have strong technical knowledge, and it is crucial for them to apply this knowledge at the right time to make critical decisions. There are uncertainties, problems, and risks involved in business…

  13. Involving People in the Analysis: Listening, Reflecting, Discounting Nothing.

    ERIC Educational Resources Information Center

    Richardson, Malcolm

    2002-01-01

    This article explores methodological issues arising from a research project that involved six people with learning difficulties in researching aspects of their own lives. These included how participants were included in data analysis and the researcher's role. It stresses the importance of the researcher listening to participants, taking time to…

  14. The Evaluation and Research of Multi-Project Programs: Program Component Analysis.

    ERIC Educational Resources Information Center

    Baker, Eva L.

    1977-01-01

    It is difficult to base evaluations on concepts irrelevant to state policy making. Evaluation of a multiproject program requires both time and differentiation of method. Data from the California Early Childhood Program illustrate process variables for program component analysis, and research questions for intraprogram comparison. (CP)

  15. 77 FR 68757 - Clean River Power MR-1, LLC; Clean River Power MR-2, LLC; Clean River Power MR-3, LLC; Clean...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-16

    ... environmental analysis at this time. n. The proposed Beverly Lock and Dam Water Power Project would be located... River; (3) two turbine-generator units providing a combined installed capacity of 3.0 megawatts (MW); (4... about 17,853 megawatt-hours (MWh). The proposed Devola Lock and Dam Water Power Project would be located...

  16. Time-delayed chameleon: Analysis, synchronization and FPGA implementation

    NASA Astrophysics Data System (ADS)

    Rajagopal, Karthikeyan; Jafari, Sajad; Laarem, Guessas

    2017-12-01

In this paper we report a time-delayed chameleon-like chaotic system which can belong to different families of chaotic attractors depending on the choice of parameters. Such a characteristic of self-excited and hidden chaotic flows in a simple 3D system with time delay has not been reported earlier. The dynamics of the proposed time-delayed system are analysed in time-delay space and parameter space. A novel adaptive modified functional projective lag synchronization algorithm is derived for synchronizing identical time-delayed chameleon systems with uncertain parameters. The proposed time-delayed systems and the synchronization algorithm with controllers and parameter estimates are then implemented in FPGA using hardware-software co-simulation and the results are presented.
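
    Integrating a system with a constant time delay requires carrying a history of the state. A minimal fixed-step scheme with a history buffer is sketched below; the right-hand side is a generic placeholder, since the paper's chameleon equations are not reproduced here:

    ```python
    # Minimal sketch of fixed-step integration of a system with one constant
    # time delay, using a history buffer. The RHS is a generic stand-in.
    import numpy as np

    def integrate_dde(f, x0, tau, dt, n_steps):
        n_hist = int(round(tau / dt))             # delay expressed in steps
        hist = np.tile(np.asarray(x0, float), (n_hist + n_steps + 1, 1))
        for k in range(n_hist, n_hist + n_steps):
            x, x_del = hist[k], hist[k - n_hist]  # current and delayed state
            hist[k + 1] = x + dt * f(x, x_del)    # explicit Euler step
        return hist[n_hist:]

    def f(x, x_del):                              # placeholder delayed feedback
        return np.array([x[1], -0.5 * x[1] - x[0] + np.sin(x_del[0])])

    traj = integrate_dde(f, [0.1, 0.0], tau=1.0, dt=0.01, n_steps=5000)
    print(traj[-1])
    ```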

  17. NATUREYES: Development of surveillance monitoring system for open spaces based on the analysis of different optical sensors data and environmental conditions

    NASA Astrophysics Data System (ADS)

    Molina-Jimenez, Teresa; Caballero-Aroca, Jose; Simón-Martín, Santiago; Hervás-Juan, Juan; García-Martínez, Jose-David; Pérez-Picazo, Emilio; Dolz-García, Ramón; Pons-Vila, Alejandro; Quintana-Rumbau, Salvador; Valiente Pardo, Jose Antonio; Estrela, Maria José; Pastor-Guzmán, Francisco

    2005-09-01

We present results of an R&D project aimed at producing an environmental surveillance system that, working in wild areas, allows real-time observation and control of ambient factors that could produce a natural disaster. The main objective of the project is the development of an open platform capable of working with several kinds of sensors, in order to adapt itself to the needs of each situation. The detection of environmental risks and the management of these data to give a real-time response is the overall objective of the project. The main parts of the system are: 1. Detection system: capable of real-time data and image communication, fully autonomous, and designed with the environmental conditions in mind. 2. Alarm management headquarters: real-time reception of data from the detector network; all data are analysed to enable a decision about whether or not there is an alarm situation. 3. Mobile alarm-reception system: portable system for receiving the alarm signal from the headquarters. The project was financed by the Science and Technology Ministry, National Research and Development Programme (TIC2000-0366-P4, 2001-2004).

  18. Modeling and Analysis of Geoelectric Fields: Extended Solar Shield

    NASA Astrophysics Data System (ADS)

    Ngwira, C. M.; Pulkkinen, A. A.

    2016-12-01

In the NASA Applied Sciences Program Solar Shield project, an unprecedented first-principles-based system to forecast geomagnetically induced currents (GIC) in high-voltage power transmission systems was developed. Rapid progress in the field of numerical physics-based space environment modeling has led to major developments over the past few years. In this study, the modeling and analysis of induced geoelectric fields are discussed. Specifically, we focus on the successful incorporation of 3-D EM transfer functions in the modeling of E-fields, and on the analysis of near real-time simulation outputs used in the Solar Shield forecast system. The extended Solar Shield is a collaborative project between DHS, NASA, NOAA, CUA and EPRI.

  19. Financial Analysis: A Review of the Methods and Their Application to Employee Training. Training and Development Research Center Project Number Nine.

    ERIC Educational Resources Information Center

    Mosier, Nancy R.

    Financial analysis techniques are tools that help managers make sound financial decisions that contribute to general corporate objectives. A literature review reveals that the most commonly used financial analysis techniques are payback time, average rate of return, present value or present worth, and internal rate of return. Despite the success…

  20. Planetary Spatial Analyst

    NASA Technical Reports Server (NTRS)

    Keely, Leslie

    2008-01-01

This is a status report for the project entitled Planetary Spatial Analyst (PSA), covering activities from the project's inception on October 1, 2007 to June 1, 2008. Originally a three-year proposal, PSA was awarded funding for one year and required a revised work statement and budget. At the time of this writing, the project is well on track with respect to both work completed and budget. The revised project focused on two objectives: build a solid connection with the target community and implement a prototype software application that provides 3D visualization and spatial analysis technologies for that community. Progress has been made on both of these objectives.

  1. AMON: Transition to real-time operations

    NASA Astrophysics Data System (ADS)

    Cowen, D. F.; Keivani, A.; Tešić, G.

    2016-04-01

The Astrophysical Multimessenger Observatory Network (AMON) will link the world's leading high-energy neutrino, cosmic-ray, gamma-ray and gravitational wave observatories by performing real-time coincidence searches for multimessenger sources from observatories' subthreshold data streams. The resulting coincidences will be distributed to interested parties in the form of electronic alerts for real-time follow-up observation. We will present the science case, design elements, current and projected partner observatories, status of the AMON project, and an initial AMON-enabled analysis. The prototype of the AMON server has been online since August 2014 and has been processing archival data. Currently, we are deploying new high-uptime servers and will be ready to start issuing alerts as early as winter 2015/16.
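
    The core of a multimessenger coincidence search is matching event times across streams within a window. A minimal two-pointer sketch over two sorted, randomly generated event lists is shown below; real AMON analyses also weigh spatial overlap and event significance, which are omitted here:

    ```python
    # Minimal sketch of a two-stream temporal coincidence search of the kind
    # AMON performs across observatories. Event times are random placeholders.
    import numpy as np

    def coincidences(t_a, t_b, window_s):
        """Return (i, j) index pairs with |t_a[i] - t_b[j]| <= window_s.

        Both input arrays must be sorted in ascending order.
        """
        pairs, j0 = [], 0
        for i, ta in enumerate(t_a):
            while j0 < len(t_b) and t_b[j0] < ta - window_s:
                j0 += 1               # advance past events too early to match
            j = j0
            while j < len(t_b) and t_b[j] <= ta + window_s:
                pairs.append((i, j))
                j += 1
        return pairs

    rng = np.random.default_rng(2)
    neutrinos = np.sort(rng.uniform(0, 86400, 50))    # event times (s), one day
    gamma_rays = np.sort(rng.uniform(0, 86400, 200))
    print(len(coincidences(neutrinos, gamma_rays, window_s=100.0)))
    ```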

  2. Rural telemedicine project in northern New Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zink, S.; Hahn, H.; Rudnick, J.

A virtual electronic medical record system is being securely deployed over the Internet in northern New Mexico using TeleMed, a multimedia medical records management system that uses CORBA-based client-server technology and a distributed database architecture. The goal of the NNM Rural Telemedicine Project is to implement TeleMed in fifteen rural clinics and two hospitals within a 25,000 square mile area of northern New Mexico. Evaluation of the project consists of three components: job task analysis, an audit of immunized children, and time motion studies. Preliminary results of the evaluation components are presented.

  3. Enabling drug discovery project decisions with integrated computational chemistry and informatics

    NASA Astrophysics Data System (ADS)

    Tsui, Vickie; Ortwine, Daniel F.; Blaney, Jeffrey M.

    2017-03-01

    Computational chemistry/informatics scientists and software engineers in Genentech Small Molecule Drug Discovery collaborate with experimental scientists in a therapeutic project-centric environment. Our mission is to enable and improve pre-clinical drug discovery design and decisions. Our goal is to deliver timely data, analysis, and modeling to our therapeutic project teams using best-in-class software tools. We describe our strategy, the organization of our group, and our approaches to reach this goal. We conclude with a summary of the interdisciplinary skills required for computational scientists and recommendations for their training.

  4. Capturing change: the duality of time-lapse imagery to acquire data and depict ecological dynamics

    USGS Publications Warehouse

    Brinley Buckley, Emma M.; Allen, Craig R.; Forsberg, Michael; Farrell, Michael; Caven, Andrew J.

    2017-01-01

    We investigate the scientific and communicative value of time-lapse imagery by exploring applications for data collection and visualization. Time-lapse imagery has a myriad of possible applications to study and depict ecosystems and can operate at unique temporal and spatial scales to bridge the gap between large-scale satellite imagery projects and observational field research. Time-lapse data sequences, linking time-lapse imagery with data visualization, have the ability to make data come alive for a wider audience by connecting abstract numbers to images that root data in time and place. Utilizing imagery from the Platte Basin Timelapse Project, water inundation and vegetation phenology metrics are quantified via image analysis and then paired with passive monitoring data, including streamflow and water chemistry. Dynamic and interactive time-lapse data sequences elucidate the visible and invisible ecological dynamics of a significantly altered yet internationally important river system in central Nebraska.
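
    Quantifying a metric such as water inundation from a time-lapse frame reduces to classifying pixels and reporting a fraction. The sketch below uses a crude blue-dominance rule on a random placeholder frame; it stands in for, and is certainly simpler than, the project's actual image-analysis pipeline:

    ```python
    # Minimal sketch: a water-inundation metric from one time-lapse frame via
    # a simple blue-dominance pixel rule. Illustrative stand-in only.
    import numpy as np

    def water_fraction(rgb, blue_margin=20):
        """Fraction of pixels where blue exceeds red and green by a margin."""
        r = rgb[..., 0].astype(int)
        g = rgb[..., 1].astype(int)
        b = rgb[..., 2].astype(int)
        water = (b > r + blue_margin) & (b > g + blue_margin)
        return water.mean()

    frame = np.random.default_rng(3).integers(0, 256, size=(480, 640, 3))
    print(f"inundated area fraction: {water_fraction(frame):.3f}")
    ```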

  5. Operation Sun Beam, Shot Small Boy. Project Officer's report - Project 7. 10. Spectral analysis with high-time resolution of the thermal-radiation pulse

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahoney, J.J.; Harris, L.H.; Hennecke, H.J.

    1985-09-01

The primary objective of this project was to investigate the spectral irradiance and luminosity versus time for the first thermal pulse at Shot Small Boy. This was accomplished by use of spectral filters with narrow band passes, phototubes, and magnetic tape recorders with high time resolution at two locations. The measured elapsed time to the first thermal maximum was from 50 to 110 microseconds, depending on wavelength. A graph of radiant thermal power versus time was obtained for the thermal pulse. The delineation of the first thermal pulse, especially the rise portion, is considered to be more definite than has been obtained previously. The resolution time of the instrumentation was approximately 50 microseconds. Secondary objectives were to measure the total luminosity versus time and also to measure the atmospheric attenuation. These objectives were accomplished by making measurements at two distances, 2.5 and 3.5 miles, from ground zero. In the case of the total luminosity measurements, a system of filters with a spectral transmittance approximating the sensitivity response of the average human eye was used. The results are tabulated in the report.

  6. Discounting the distant future-Data on Australian discount rates estimated by a stochastic interest rate model.

    PubMed

    Truong, Chi; Trück, Stefan

    2017-04-01

Data on certainty equivalent discount factors and discount rates for stochastic interest rates in Australia are provided in this paper. The data has been used for the analysis of investments into climate adaptation projects in 'It's not now or never: Implications of investment timing and risk aversion on climate adaptation to extreme events' (Truong and Trück, 2016) [3] and can be used for other cost-benefit analysis studies in Australia. The data is of particular interest for the discounting of projects that create monetary costs and benefits in the distant future.
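
    The declining certainty-equivalent discount rate that motivates the dataset follows from averaging discount factors, not rates, across stochastic interest-rate paths. The sketch below uses a Vasicek-type short rate with invented parameters (not the paper's calibrated model) to reproduce the qualitative effect:

    ```python
    # Minimal sketch: certainty-equivalent discount factors from simulated
    # stochastic short-rate paths (Vasicek-type; parameters illustrative).
    import numpy as np

    rng = np.random.default_rng(4)
    r0, kappa, theta, sigma = 0.04, 0.1, 0.04, 0.01
    dt, horizon, n_paths = 1.0, 200, 10000        # annual steps, 200 years

    r = np.full(n_paths, r0)
    integral = np.zeros(n_paths)
    ce_factor = np.empty(horizon)
    for t in range(horizon):
        integral += r * dt
        ce_factor[t] = np.exp(-integral).mean()   # E[exp(-int_0^t r ds)]
        r += kappa * (theta - r) * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)

    ce_rate = -np.log(ce_factor) / (np.arange(horizon) + 1)
    print(ce_rate[[0, 49, 99, 199]])   # certainty-equivalent rate declines with horizon
    ```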

  7. Impact of Requirements Quality on Project Success or Failure

    NASA Astrophysics Data System (ADS)

    Tamai, Tetsuo; Kamata, Mayumi Itakura

We are interested in the relationship between the quality of the requirements specifications for software projects and the subsequent outcome of the projects. To examine this relationship, we investigated 32 projects started and completed between 2003 and 2005 by the software development division of a large company in Tokyo. The company has collected reliable data on requirements specification quality, as evaluated by software quality assurance teams, and overall project performance data relating to cost and time overruns. The data for requirements specification quality were first converted into a multidimensional space, with each dimension corresponding to an item of the recommended structure for software requirements specifications (SRS) defined in IEEE Std. 830-1998. We applied various statistical analysis methods to the SRS quality data and project outcomes.

  8. What weight should be assigned to future environmental impacts? A probabilistic cost benefit analysis using recent advances on discounting.

    PubMed

    Almansa, Carmen; Martínez-Paz, José M

    2011-03-01

    Cost-benefit analysis is a standard methodological platform for public investment evaluation. In high environmental impact projects, with a long-term effect on future generations, the choice of discount rate and time horizon is of particular relevance, because it can lead to very different profitability assessments. This paper describes some recent approaches to environmental discounting and applies them, together with a number of classical procedures, to the economic evaluation of a plant for the desalination of irrigation return water from intensive farming, aimed at halting the degradation of an area of great ecological value, the Mar Menor, in South Eastern Spain. A Monte Carlo procedure is used in four CBA approaches and three time horizons to carry out a probabilistic sensitivity analysis designed to integrate the views of an international panel of experts in environmental discounting with the uncertainty affecting the market price of the project's main output, i.e., irrigation water for a water-deprived area. The results show which discounting scenarios most accurately estimate the socio-environmental profitability of the project while also considering the risk associated with these two key parameters. The analysis also provides some methodological findings regarding ways of assessing financial and environmental profitability in decisions concerning public investment in the environment. Copyright © 2010 Elsevier B.V. All rights reserved.
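
    A probabilistic cost-benefit analysis of this kind can be sketched in a few lines: draw the uncertain inputs, discount, and summarize the NPV distribution. All cash flows, distributions and parameters below are invented; the structure, not the numbers, is the point:

    ```python
    # Minimal sketch of a Monte Carlo CBA: NPV distribution under uncertainty
    # in both the discount rate and the price of the project's output.
    import numpy as np

    rng = np.random.default_rng(5)
    n_sims, horizon = 20000, 50
    capex = 30.0                                     # initial cost (monetary units)
    water_volume = 2.0                               # output sold per year

    price = rng.normal(1.5, 0.4, n_sims)             # uncertain output price
    rate = rng.triangular(0.01, 0.03, 0.06, n_sims)  # expert-elicited discount rate

    years = np.arange(1, horizon + 1)
    discount = (1.0 + rate[:, None]) ** -years       # per-simulation discount factors
    npv = -capex + (price[:, None] * water_volume * discount).sum(axis=1)

    print(f"P(NPV > 0) = {(npv > 0).mean():.2f}, median NPV = {np.median(npv):.1f}")
    ```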

  9. Impact of agile methodologies on team capacity in automotive radio-navigation projects

    NASA Astrophysics Data System (ADS)

    Prostean, G.; Hutanu, A.; Volker, S.

    2017-01-01

The development processes used in automotive radio-navigation projects are constantly under adaptation pressure. While the software development models are based on automotive production processes, the integration of peripheral components into an automotive system triggers a high number of requirement modifications. The use of traditional development models in the automotive industry pushes a team's development capacity to its limits. The root cause lies in the inflexibility of current processes and their limits of adaptation. This paper addresses a new project management approach for the development of radio-navigation projects. Understanding the weaknesses of currently used models helped us develop and integrate agile methodologies within the traditional development model structure. In the first part we focus on change management methods to reduce the inflow of requests for change. An established change management risk analysis process enables project management to judge the impact of a requirement change and also gives the project time to implement some changes. However, in large automotive radio-navigation projects the time saved is not enough to implement the large number of changes submitted to the project. In the second part of this paper we focus on increasing team capacity by integrating agile methodologies into the traditional model at critical project phases. The overall objective of this paper is to demonstrate the need for process adaptation in order to resolve project team capacity bottlenecks.

  10. Ocean Surface Wave Optical Roughness - Analysis of Innovative Measurements

    DTIC Science & Technology

    2012-09-30

The goals of the program are to (1) examine time-dependent oceanic radiance distribution in relation to dynamic surface boundary layer (SBL) processes; (2) … An overview of results is provided by Zappa et al. [2012] and Dickey et al. [2012]; topics include TOGA-COARE, air-sea fluxes, and time series measurements.

  11. Analysis of High Precision GPS Time Series and Strain Rates for the Geothermal Play Fairway Analysis of Washington State Prospects Project

    DOE Data Explorer

    Michael Swyer

    2015-02-22

This dataset comprises Global Positioning System (GPS) time series from the National Science Foundation (NSF) Earthscope’s Plate Boundary Observatory (PBO) and Central Washington University’s Pacific Northwest Geodetic Array (PANGA). GPS station velocities were used to infer strain rates using the ‘splines in tension’ method. Strain rates were derived separately for subduction zone locking at depth and for block rotation near the surface within crustal block boundaries.
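
    Once velocities are interpolated onto a grid, strain rates follow from spatial gradients of the velocity field. The sketch below uses simple finite differences on a synthetic shear field rather than the 'splines in tension' interpolation used for this dataset:

    ```python
    # Minimal sketch: horizontal strain rates from gridded GPS velocities by
    # finite differences. The velocity field is a synthetic uniform shear.
    import numpy as np

    dx = dy = 10e3                                   # grid spacing (m)
    x, y = np.meshgrid(np.arange(0, 500e3, dx), np.arange(0, 500e3, dy))
    ve = 1e-9 * y                                    # east velocity (m/yr)
    vn = np.zeros_like(ve)                           # north velocity

    dve_dy, dve_dx = np.gradient(ve, dy, dx)         # axis 0 is y, axis 1 is x
    dvn_dy, dvn_dx = np.gradient(vn, dy, dx)

    exx, eyy = dve_dx, dvn_dy                        # normal strain rates (1/yr)
    exy = 0.5 * (dve_dy + dvn_dx)                    # shear strain rate
    second_invariant = np.sqrt(exx**2 + eyy**2 + 2 * exy**2)
    print(second_invariant.max())
    ```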

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eto, Joseph H.; Parashar, Manu; Lewis, Nancy Jo

The Real Time System Operations (RTSO) 2006-2007 project focused on two parallel technical tasks: (1) Real-Time Applications of Phasors for Monitoring, Alarming and Control; and (2) Real-Time Voltage Security Assessment (RTVSA) Prototype Tool. The overall goal of the phasor applications project was to accelerate adoption and foster greater use of new, more accurate, time-synchronized phasor measurements by conducting research and prototyping applications on California ISO's phasor platform, the Real-Time Dynamics Monitoring System (RTDMS), that provide previously unavailable information on the dynamic stability of the grid. Feasibility assessment studies were conducted on potential application of this technology for small-signal stability monitoring, validating/improving existing stability nomograms, conducting frequency response analysis, and obtaining real-time sensitivity information on key metrics to assess grid stress. Based on study findings, prototype applications for real-time visualization and alarming, small-signal stability monitoring, measurement based sensitivity analysis and frequency response assessment were developed, factory- and field-tested at the California ISO and at BPA. The goal of the RTVSA project was to provide California ISO with a prototype voltage security assessment tool that runs in real time within California ISO's new reliability and congestion management system. CERTS conducted a technical assessment of appropriate algorithms, developed a prototype incorporating state-of-art algorithms (such as the continuation power flow, direct method, boundary orbiting method, and hyperplanes) into a framework most suitable for an operations environment. Based on study findings, a functional specification was prepared, which the California ISO has since used to procure a production-quality tool that is now a part of a suite of advanced computational tools that is used by California ISO for reliability and congestion management.

  13. The Three Gorges Project: How sustainable?

    NASA Astrophysics Data System (ADS)

    Kepa Brian Morgan, Te Kipa; Sardelic, Daniel N.; Waretini, Amaria F.

    2012-08-01

In 1984 the Government of China approved the decision to construct the Three Gorges Dam Project, the largest project since the Great Wall. The project had many barriers to overcome, and the decision was made at a time when sustainability was a relatively unknown concept. The decision to construct the Three Gorges Project remains contentious today, especially since Deputy Director of the Three Gorges Project Construction Committee, Wang Xiaofeng, stated that "We absolutely cannot relax our guard against ecological and environmental security problems sparked by the Three Gorges Project" (Bristow, 2007; McCabe, 2007). The question therefore was posed: how sustainable is the Three Gorges Project? Conventional approaches to sustainability assessment tend to use monetary-based assessment aligned to triple bottom line thinking. That is, projects are evaluated as trade-offs between economic, environmental and social costs and benefits. The question of sustainability is considered using such a traditional Cost-Benefit Analysis approach, as undertaken in 1988 by a CIPM-Yangtze Joint Venture, and the Mauri Model Decision Making Framework (MMDMF). The Mauri Model differs from other approaches in that sustainability performance indicators are considered independently from any particular stakeholder bias. Bias is then introduced subsequently as a sensitivity analysis on the raw results obtained. The MMDMF is unique in that it is based on the Māori concept of Mauri, the binding force between the physical and the spiritual attributes of something, or the capacity to support life in the air, soil, and water. This concept of Mauri is analogous to the Chinese concept of Qi, and there are many analogous concepts in other cultures. It is the universal relevance of Mauri that allows its use to assess sustainability. This research identified that the MMDMF was a strong complement to Cost-Benefit Analysis, which is not designed as a sustainability assessment tool in itself. The MMDMF does have relevance in identifying areas of conflict, and it can support the Cost-Benefit Analysis in assessing sustainability, as a Decision Support Tool. The research concluded that, based on both models, the Three Gorges Project, as understood in 1988 and incorporating more recent sustainability analysis, is contributing to enhanced sustainability.
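
    The two-stage logic described, unbiased indicator scores first with stakeholder bias added afterwards as a sensitivity analysis, can be illustrated with a generic weighted-scoring sketch. The dimensions, scores and weightings below are invented and do not reproduce the MMDMF's actual worksheets:

    ```python
    # Minimal sketch of two-stage scoring of the general kind the Mauri Model
    # performs: unbiased scores first, stakeholder weightings applied after.
    import numpy as np

    dimensions = ["environmental", "cultural", "social", "economic"]
    raw_scores = np.array([-1.0, -0.5, 0.5, 1.5])   # unbiased scores, -2..+2 scale

    stakeholder_weights = {
        "unweighted":   np.array([0.25, 0.25, 0.25, 0.25]),
        "conservation": np.array([0.55, 0.20, 0.15, 0.10]),
        "developer":    np.array([0.10, 0.10, 0.20, 0.60]),
    }

    for name, w in stakeholder_weights.items():
        print(f"{name:>12}: weighted score = {w @ raw_scores:+.2f}")
    ```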

  14. A Value Analysis of Lean Processes in Target Value Design and Integrated Project Delivery.

    PubMed

    Nanda, Upali; K Rybkowski, Zofia; Pati, Sipra; Nejati, Adeleh

    2017-04-01

    To investigate what key stakeholders consider to be the advantages and the opportunities for improvement in using lean thinking and tools in the integrated project delivery (IPD) process. A detailed literature review was followed by case study of a Lean-IPD project. Interviews with members of the project leadership team, focus groups with the integrated team as well as the design team, and an online survey of all stakeholders were conducted. Statistical analysis and thematic content analysis were used to analyze the data, followed by a plus-delta analysis. (1) Learning is a large, implicit benefit of Lean-IPD that is not currently captured by any success metric; (2) the cardboard mock-up was the most successful lean strategy; (3) although a collaborative project, the level of influence of different stakeholder groups was perceived to be different by different stakeholders; (4) overall, Lean-IPD was rated as better than traditional design-bid-build methods; and (5) opportunities for improvement reported were increase in accurate cost estimating, more efficient use of time, perception of imbalance of control/influence, and need for facilitation (which represents different points of view). While lean tools and an IPD method are preferred to traditional design-bid-build methods, the perception of different stakeholders varies and more work needs to be done to allow a truly shared decision-making model. Learning was identified as one of the biggest advantages.

  15. Development Status of the WetLab-2 Project: New Tools for On-orbit Real-time Quantitative Gene Expression.

    NASA Technical Reports Server (NTRS)

    Jung, Jimmy; Parra, Macarena P.; Almeida, Eduardo; Boone, Travis; Chinn, Tori; Ricco, Antonio; Souza, Kenneth; Hyde, Liz; Rukhsana, Yousuf; Richey, C. Scott

    2013-01-01

The primary objective of NASA Ames Research Center's WetLab-2 Project is to place on the ISS a research platform to facilitate gene expression analysis via quantitative real-time PCR (qRT-PCR) of biological specimens grown or cultured on orbit. The WetLab-2 equipment will be capable of processing multiple sample types ranging from microbial cultures to animal tissues dissected on-orbit. In addition to the logistical benefits of in-situ sample processing and analysis, conducting qRT-PCR on-orbit eliminates the confounding effects on gene expression of reentry stresses and shock acting on live cells and organisms. The system can also validate terrestrial analyses of samples returned from ISS by providing quantitative on-orbit gene expression benchmarking prior to sample return. The ability to get on-orbit data will provide investigators with the opportunity to adjust experimental parameters for subsequent trials based on the real-time data analysis without need for sample return and re-flight. Finally, WetLab-2 can be used for analysis of air, surface, water, and clinical samples to monitor environmental contaminants and crew health. The verification flight of the instrument is scheduled to launch on SpaceX-5 in Aug. 2014. Progress to date: The WetLab-2 project completed a thorough study of commercially available qRT-PCR systems and performed a downselect based on both scientific and engineering requirements. The selected instrument, the Cepheid SmartCycler, has advantages including modular design (16 independent PCR modules), low power consumption, and rapid ramp times. The SmartCycler has multiplex capabilities, assaying up to four genes of interest in each of the 16 modules. The WetLab-2 team is currently working with Cepheid to modify the unit for housing within an EXPRESS rack locker on the ISS. This will enable the downlink of data to the ground and provide uplink capabilities for programming, commanding, monitoring, and instrument maintenance. The project is currently designing a module that will lyse the cells and extract RNA of sufficient quality for use in qRT-PCR reactions while using a housekeeping gene to normalize RNA concentration and integrity. Current testing focuses on two promising commercial products and chemistries that allow for RNA extraction with minimal complexity and crew time.
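
    The housekeeping-gene normalization mentioned at the end is conventionally done with the delta-delta-Ct method. The sketch below shows that calculation with invented Ct values; it is the standard textbook formula, not necessarily the exact WetLab-2 pipeline:

    ```python
    # Minimal sketch of the standard delta-delta-Ct normalization of qRT-PCR
    # results against a housekeeping gene. Ct values here are invented.
    def relative_expression(ct_target_flight, ct_housekeeping_flight,
                            ct_target_ground, ct_housekeeping_ground):
        """Fold change of a target gene (flight vs ground), 2^-(ddCt)."""
        d_ct_flight = ct_target_flight - ct_housekeeping_flight
        d_ct_ground = ct_target_ground - ct_housekeeping_ground
        dd_ct = d_ct_flight - d_ct_ground
        return 2.0 ** (-dd_ct)

    # Target amplifies ~2 cycles earlier on orbit -> ~4-fold up-regulation
    print(relative_expression(22.0, 18.0, 24.0, 18.0))
    ```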

  16. Engineering the LISA Project: Systems Engineering Challenges

    NASA Technical Reports Server (NTRS)

    Evans, Jordan P.

    2006-01-01

The Laser Interferometer Space Antenna (LISA) is a joint NASA/ESA mission to detect and measure gravitational waves with periods from 1 s to 10000 s. The systems engineering challenges of developing a giant interferometer, 5 million kilometers on a side, are numerous. Some of the key challenges are presented in this paper. The organizational challenges imposed by sharing the engineering function between three centers (ESA ESTEC, NASA GSFC, and JPL) across nine time zones are addressed. The issues and approaches to allocation of the acceleration noise and measurement sensitivity budget terms across a traditionally decomposed system are discussed. Additionally, using LISA to detect gravitational waves for the first time presents significant data analysis challenges, many of which drive the project system design. The approach to understanding the implications of science data analysis on the system is also addressed.

  17. Using project performance to measure effectiveness of quality management system maintenance and practices in construction industry.

    PubMed

    Leong, Tiong Kung; Zakuan, Norhayati; Mat Saman, Muhamad Zameri; Ariff, Mohd Shoki Md; Tan, Choy Soon

    2014-01-01

This paper proposed seven existing and new performance indicators to measure the effectiveness of quality management system (QMS) maintenance and practices in construction industry. This research is carried out with a questionnaire based on QMS variables which are extracted from literature review and project performance indicators which are established from project management's theory. Data collected was analyzed using correlation and regression analysis. The findings indicate that client satisfaction and time variance have positive and significant relationship with QMS while other project performance indicators do not show significant results. Further studies can use the same project performance indicators to study the effectiveness of QMS in different sampling areas to improve the generalizability of the findings.

  18. Lunar lander ground support system

    NASA Technical Reports Server (NTRS)

    1991-01-01

This year's project, like the previous Aerospace Group's project, involves a lunar transportation system. The basic timeline will be the years 2010-2030, and the system will be referred to as a second generation system, as lunar bases would be present. The project design completed this year is referred to as the Lunar Lander Ground Support System (LLGSS). The area chosen for analysis encompasses a great number of vehicles and personnel. The designs of certain elements of the overall lunar mission are complete projects in themselves. For this reason, the project chosen for the Senior Aerospace Design is the design of specific servicing vehicles, and additions or modifications to existing vehicles, to support servicing and maintenance of the lunar lander while on the surface.

  19. Using Project Performance to Measure Effectiveness of Quality Management System Maintenance and Practices in Construction Industry

    PubMed Central

    Leong, Tiong Kung; Ariff, Mohd. Shoki Md.

    2014-01-01

This paper proposed seven existing and new performance indicators to measure the effectiveness of quality management system (QMS) maintenance and practices in construction industry. This research is carried out with a questionnaire based on QMS variables which are extracted from literature review and project performance indicators which are established from project management's theory. Data collected was analyzed using correlation and regression analysis. The findings indicate that client satisfaction and time variance have positive and significant relationship with QMS while other project performance indicators do not show significant results. Further studies can use the same project performance indicators to study the effectiveness of QMS in different sampling areas to improve the generalizability of the findings. PMID:24701182

  20. Blended learning in situated contexts: 3-year evaluation of an online peer review project.

    PubMed

    Bridges, S; Chang, J W W; Chu, C H; Gardner, K

    2014-08-01

Situated and sociocultural perspectives on learning indicate that the design of complex tasks supported by educational technologies holds potential for dental education in moving novices towards closer approximation of the clinical outcomes of their expert mentors. A cross-faculty, student-centred, web-based project in operative dentistry was established within the Universitas 21 (U21) network of higher education institutions to support university goals for internationalisation in clinical learning by enabling distributed interactions across sites and institutions. This paper aims to present an evaluation of one dental faculty's project experience of curriculum redesign for deeper student learning. A mixed-method case study approach was utilised. Three cohorts of second-year students from a 5-year bachelor of dental surgery (BDS) programme were invited to participate in annual surveys and focus group interviews on project completion. Survey data were analysed for differences between years using multivariate logistical regression analysis. Thematic analysis of questionnaire open responses and interview transcripts was conducted. Multivariate logistic regression analysis noted significant differences across items over time indicating learning improvements, attainment of university aims and the positive influence of redesign. Students perceived the enquiry-based project as stimulating and motivating, and building confidence in operative techniques. Institutional goals for greater understanding of others and lifelong learning showed improvement over time. Despite positive scores, students indicated global citizenship and intercultural understanding were conceptually challenging. Establishment of online student learning communities through a blended approach to learning stimulated motivation and intellectual engagement, thereby supporting a situated approach to cognition. Sociocultural perspectives indicate that novice-expert interactions supported student development of professional identities. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  1. CIAN - Cell Imaging and Analysis Network at the Biology Department of McGill University

    PubMed Central

    Lacoste, J.; Lesage, G.; Bunnell, S.; Han, H.; Küster-Schöck, E.

    2010-01-01

The Cell Imaging and Analysis Network (CIAN) provides services and tools to researchers in the field of cell biology from within or outside Montreal's McGill University community. CIAN is composed of six scientific platforms: Cell Imaging (confocal and fluorescence microscopy), Proteomics (2-D protein gel electrophoresis and DiGE, fluorescent protein analysis), Automation and High throughput screening (Pinning robot and liquid handler), Protein Expression for Antibody Production, Genomics (real-time PCR), and Data storage and analysis (cluster, server, and workstations). Users submit project proposals, and can obtain training and consultation in any aspect of the facility, or initiate projects with the full-service platforms. CIAN is designed to facilitate training, enhance interactions, as well as share and maintain resources and expertise.

  2. NIR monitoring of in-service wood structures

    Treesearch

    Michela Zanetti; Timothy G. Rials; Douglas Rammer

    2005-01-01

    Near infrared spectroscopy (NIRS) was used to study a set of Southern Yellow Pine boards exposed to natural weathering for different periods of exposure time. This non-destructive spectroscopic technique is a very powerful tool to predict the weathering of wood when used in combination with multivariate analysis (Principal Component Analysis, PCA, and Projection to...

  3. Random Matrix Theory in molecular dynamics analysis.

    PubMed

    Palese, Luigi Leonardo

    2015-01-01

It is well known that, in some situations, principal component analysis (PCA) carried out on molecular dynamics data results in the appearance of cosine-shaped low-index projections. Because this is reminiscent of the results obtained by performing PCA on a multidimensional Brownian dynamics, it has been suggested that short-time protein dynamics is essentially nothing more than a noisy signal. Here we use Random Matrix Theory to analyze a series of short-time molecular dynamics experiments which are specifically designed to be simulations with high cosine content. We use as a model system the protein apoCox17, a mitochondrial copper chaperone. Spectral analysis of correlation matrices allows us to easily differentiate random correlations, simply deriving from the finite length of the process, from non-random signals reflecting the intrinsic system properties. Our results clearly show that protein dynamics is not really Brownian, even in the presence of cosine-shaped low-index projections on the principal axes. Copyright © 2014 Elsevier B.V. All rights reserved.
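
    The cosine-shaped projections referred to are usually quantified by the cosine content of Hess (2000), the normalized squared overlap of a PC projection with a half-period cosine. A sketch, tested on a synthetic noisy cosine so the score comes out near 1:

    ```python
    # Minimal sketch: the cosine content diagnostic for a PC projection p(t);
    # values near 1 mimic random diffusion. Test signal is synthetic.
    import numpy as np

    def cosine_content(p, i=1):
        """Hess (2000) cosine content of a projection p(t); lies in [0, 1]."""
        t = np.linspace(0.0, 1.0, len(p))
        c = np.cos(i * np.pi * t)
        overlap = np.trapz(c * p, t) ** 2
        return overlap / (np.trapz(c * c, t) * np.trapz(p * p, t))

    rng = np.random.default_rng(6)
    t = np.linspace(0.0, 1.0, 1000)
    p = np.cos(np.pi * t) + 0.05 * rng.standard_normal(t.size)  # cosine-like PC1
    print(f"cosine content: {cosine_content(p):.2f}")           # close to 1
    ```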

  4. Verdant: automated annotation, alignment and phylogenetic analysis of whole chloroplast genomes.

    PubMed

    McKain, Michael R; Hartsock, Ryan H; Wohl, Molly M; Kellogg, Elizabeth A

    2017-01-01

    Chloroplast genomes are now produced in the hundreds for angiosperm phylogenetics projects, but current methods for annotation, alignment and tree estimation still require some manual intervention reducing throughput and increasing analysis time for large chloroplast systematics projects. Verdant is a web-based software suite and database built to take advantage a novel annotation program, annoBTD. Using annoBTD, Verdant provides accurate annotation of chloroplast genomes without manual intervention. Subsequent alignment and tree estimation can incorporate newly annotated and publically available plastomes and can accommodate a large number of taxa. Verdant sharply reduces the time required for analysis of assembled chloroplast genomes and removes the need for pipelines and software on personal hardware. Verdant is available at: http://verdant.iplantcollaborative.org/plastidDB/ It is implemented in PHP, Perl, MySQL, Javascript, HTML and CSS with all major browsers supported. mrmckain@gmail.comSupplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  5. Investigating the effects of PDC cutters geometry on ROP using the Taguchi technique

    NASA Astrophysics Data System (ADS)

    Jamaludin, A. A.; Mehat, N. M.; Kamaruddin, S.

    2017-10-01

At times, the performance of a polycrystalline diamond compact (PDC) bit drops, affecting the rate of penetration (ROP). The objective of this project is to investigate the effect of PDC cutter geometry and to optimize it. An intensive study of cutter geometry can further enhance ROP performance. A relatively extended analysis was carried out, and four significant geometry factors that directly improve ROP were identified: cutter size, back rake angle, side rake angle and chamfer angle. An appropriate optimization technique that effectively controls all influential geometry factors during cutter manufacturing is introduced and adopted in this project. By adopting an L9 Taguchi orthogonal array (OA), a simulation experiment is conducted using explicit dynamics finite element analysis. Through a structured Taguchi analysis, ANOVA confirms that the most significant geometry factor for improving ROP is cutter size (99.16% percentage contribution). The optimized cutter is expected to drill with high ROP, which can reduce rig time and, in turn, the total drilling cost.
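
    An L9 orthogonal array accommodates up to four factors at three levels in nine runs, and factor influence can be screened from per-level signal-to-noise means ahead of a formal ANOVA. The sketch below uses invented ROP responses, not the study's finite element results, and the mapping of array columns to factors is assumed for illustration:

    ```python
    # Minimal sketch: Taguchi L9 array for four 3-level factors with
    # larger-is-better S/N ratios for ROP. Responses are invented.
    import numpy as np

    L9 = np.array([[0,0,0,0],[0,1,1,1],[0,2,2,2],
                   [1,0,1,2],[1,1,2,0],[1,2,0,1],
                   [2,0,2,1],[2,1,0,2],[2,2,1,0]])  # factor levels per run

    rop = np.array([12.1, 13.4, 14.0, 12.8, 15.2, 13.1, 13.9, 14.6, 15.8])  # m/h
    sn = -10.0 * np.log10(1.0 / rop**2)             # larger-is-better S/N, one run

    factors = ["cutter size", "back rake", "side rake", "chamfer"]  # assumed order
    for f, name in enumerate(factors):
        means = [sn[L9[:, f] == lvl].mean() for lvl in range(3)]
        print(f"{name}: S/N range = {max(means) - min(means):.2f} dB")
    ```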

  6. Period Changes in Pulsating Red Supergiant Stars: A Science and Education Project

    NASA Astrophysics Data System (ADS)

    Percy, J. R.; Favaro, E.; Glasheen, J.; Ho, B.; Sato, H.

    2008-12-01

    We describe research done as part of the University of Toronto Mentorship Program, which enables outstanding senior high school students to work on research projects at the university. The students began with extensive background reading on variable stars, and became familiar with various forms of time-series analysis by applying them to a few red supergiant variables in the AAVSO International Database; we report on the results. They also prepared a useful manual for our publicly-available self-correlation analysis software. They undertook an intensive analysis of the period changes in BC Cyg, using the AAVSO and Turner data and the (O-C) method, in the hope that evolutionary period changes could be observed. The (O-C) diagram, however, is dominated by errors in determining the times of maximum, and by the effects of cycle-to-cycle period fluctuations. As a result, the (O-C) method is generally not effective for these stars. We also describe the Mentorship Program and its elements, and reflect on the students' experience.
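
    The (O-C) method compares observed times of maximum against a linear ephemeris; a steady period change shows up as a parabola whose quadratic coefficient is half the period change per cycle. The sketch below uses synthetic maxima with invented numbers, and illustrates the entry's caveat that timing scatter of the size typical for these stars can swamp the quadratic signal:

    ```python
    # Minimal sketch of the (O-C) method with a synthetic, slowly lengthening
    # period plus timing scatter. All values are invented.
    import numpy as np

    epoch0, period = 2450000.0, 700.0                  # JD of cycle zero; period (d)
    rng = np.random.default_rng(7)
    cycles = np.arange(0, 40)
    observed = (epoch0 + period * cycles + 0.05 * cycles**2
                + rng.normal(0.0, 10.0, cycles.size))  # period change + scatter

    o_minus_c = observed - (epoch0 + period * cycles)  # residuals from ephemeris

    a = np.polyfit(cycles, o_minus_c, 2)[0]            # quadratic coeff = dP/dE / 2
    print(f"dP/dE ~ {2 * a:.3f} d/cycle, dP/dt ~ {2 * a / period:.2e}")
    ```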

  7. Ranking the Project Management Success Factors for Construction Project in South India

    NASA Astrophysics Data System (ADS)

    Aneesha, K.; Haridharan, M. K.

    2017-07-01

In today’s construction industry, each project must succeed and be run efficiently if a firm is to achieve an advantage over its competitors. Effective project management overcomes these types of challenges. This study identifies the success factors which are important for project management in construction project success. From the literature review, 26 factors were found to be critical. Project managers, construction managers, civil engineers, contractors and site engineers were the respondents. After analyzing the data in SPSS software, the dominant factors from the regression analysis are top management support, a competent project team, the ability to solve problems, realistic cost and time estimates, information/communication, and competency of the project manager (6 factors out of the 12 considered, from the 26). Effective communication between stakeholders received the highest priority, and client involvement, good leadership and clarity of project goals received second priority. Informal communication gives better results than formal communication such as written formats. To remove communication barriers with stakeholders, informal communication, such as speaking face-to-face in a language that suits the stakeholders, is recommended.

  8. Application of Multi-Model CMIP5 Analysis in Future Drought Adaptation Strategies

    NASA Astrophysics Data System (ADS)

    Casey, M.; Luo, L.; Lang, Y.

    2014-12-01

Drought influences the efficacy of numerous natural and artificial systems including species diversity, agriculture, and infrastructure. Global climate change raises concerns that extend well beyond atmospheric and hydrological disciplines; as climate changes with time, the need for system adaptation becomes apparent. Drought, as a natural phenomenon, is typically defined relative to the climate in which it occurs. Typically a 30-year reference time frame (RTF) is used to determine the severity of a drought event. This study investigates the projected future droughts over North America with different RTFs. Confidence in future hydroclimate projection is characterized by the agreement of long term (2005-2100) multi-model precipitation (P) and temperature (T) projections within the Coupled Model Intercomparison Project Phase 5 (CMIP5). Drought severity and the propensity of extreme conditions are measured by the multi-scalar, probabilistic, RTF-based Standard Precipitation Index (SPI) and Standard Precipitation Evapotranspiration Index (SPEI). SPI considers only P while SPEI incorporates evapotranspiration (E) via T; comparing the two reveals the role of temperature change in future hydroclimate change. Future hydroclimate conditions, hydroclimate extremity, and CMIP5 model agreement are assessed for each Representative Concentration Pathway (RCP 2.6, 4.5, 6.0, 8.5) in regions throughout North America for the entire year and for the boreal seasons. In addition, multiple time scales of SPI and SPEI are calculated to characterize drought at time scales ranging from short to long term. The study explores a simple, standardized method for considering adaptation in future drought assessment, which provides a novel perspective to incorporate adaptation with climate change. The result of the analysis is a multi-dimensional, probabilistic summary of the hydrological (P, E) environment a natural or artificial system must adapt to over time. Studies similar to this with specified criteria (SPI/SPEI value, time scale, RCP, etc.) can provide professionals in a variety of disciplines with necessary climatic insight to develop adaptation strategies.
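
    An SPI value is obtained by fitting a distribution (commonly a gamma) to an accumulated precipitation series over the reference time frame and mapping its CDF onto standard-normal quantiles. A minimal sketch with synthetic monthly totals, omitting zero-precipitation handling and multi-scale accumulation:

    ```python
    # Minimal sketch of the Standardized Precipitation Index: gamma fit to a
    # precipitation series, then a standard-normal transform. Synthetic data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(8)
    precip = rng.gamma(shape=2.0, scale=30.0, size=360)    # monthly totals (mm)

    shape, loc, scale = stats.gamma.fit(precip, floc=0.0)  # reference climatology
    cdf = stats.gamma.cdf(precip, shape, loc=loc, scale=scale)
    spi = stats.norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6))     # normal quantiles

    print(f"driest month SPI: {spi.min():.2f}")            # <= -2: extreme drought
    ```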

  9. Image quality in thoracic 4D cone-beam CT: A sensitivity analysis of respiratory signal, binning method, reconstruction algorithm, and projection angular spacing

    PubMed Central

    Shieh, Chun-Chien; Kipritidis, John; O’Brien, Ricky T.; Kuncic, Zdenka; Keall, Paul J.

    2014-01-01

    Purpose: Respiratory signal, binning method, and reconstruction algorithm are three major controllable factors affecting image quality in thoracic 4D cone-beam CT (4D-CBCT), which is widely used in image guided radiotherapy (IGRT). Previous studies have investigated each of these factors individually, but no integrated sensitivity analysis has been performed. In addition, projection angular spacing is also a key factor in reconstruction, but how it affects image quality is not obvious. An investigation of the impacts of these four factors on image quality can help determine the most effective strategy in improving 4D-CBCT for IGRT. Methods: Fourteen 4D-CBCT patient projection datasets with various respiratory motion features were reconstructed with the following controllable factors: (i) respiratory signal (real-time position management, projection image intensity analysis, or fiducial marker tracking), (ii) binning method (phase, displacement, or equal-projection-density displacement binning), and (iii) reconstruction algorithm [Feldkamp–Davis–Kress (FDK), McKinnon–Bates (MKB), or adaptive-steepest-descent projection-onto-convex-sets (ASD-POCS)]. The image quality was quantified using signal-to-noise ratio (SNR), contrast-to-noise ratio, and edge-response width in order to assess noise/streaking and blur. The SNR values were also analyzed with respect to the maximum, mean, and root-mean-squared-error (RMSE) projection angular spacing to investigate how projection angular spacing affects image quality. Results: The choice of respiratory signals was found to have no significant impact on image quality. Displacement-based binning was found to be less prone to motion artifacts compared to phase binning in more than half of the cases, but was shown to suffer from large interbin image quality variation and large projection angular gaps. Both MKB and ASD-POCS resulted in noticeably improved image quality almost 100% of the time relative to FDK. In addition, SNR values were found to increase with decreasing RMSE values of projection angular gaps with strong correlations (r ≈ −0.7) regardless of the reconstruction algorithm used. Conclusions: Based on the authors’ results, displacement-based binning methods, better reconstruction algorithms, and the acquisition of even projection angular views are the most important factors to consider for improving thoracic 4D-CBCT image quality. In view of the practical issues with displacement-based binning and the fact that projection angular spacing is not currently directly controllable, development of better reconstruction algorithms represents the most effective strategy for improving image quality in thoracic 4D-CBCT for IGRT applications at the current stage. PMID:24694143
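
    The SNR and CNR metrics used for the quantification are simple region-of-interest statistics. A hedged sketch on a synthetic slice is shown below; the paper's exact ROI placement and its edge-response-width estimation are not reproduced:

    ```python
    # Minimal sketch: SNR and CNR from regions of interest in a reconstructed
    # slice. The image and ROI positions are synthetic placeholders.
    import numpy as np

    rng = np.random.default_rng(9)
    image = rng.normal(100.0, 5.0, (256, 256))
    image[100:140, 100:140] += 30.0                  # a high-contrast insert

    roi_signal = image[100:140, 100:140]
    roi_background = image[10:50, 10:50]

    snr = roi_signal.mean() / roi_background.std()
    cnr = (roi_signal.mean() - roi_background.mean()) / roi_background.std()
    print(f"SNR = {snr:.1f}, CNR = {cnr:.1f}")
    ```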

  10. Large Scale Analyses and Visualization of Adaptive Amino Acid Changes Projects.

    PubMed

    Vázquez, Noé; Vieira, Cristina P; Amorim, Bárbara S R; Torres, André; López-Fernández, Hugo; Fdez-Riverola, Florentino; Sousa, José L R; Reboiro-Jato, Miguel; Vieira, Jorge

    2018-03-01

    When changes at a few amino acid sites are the target of selection, adaptive amino acid changes in protein sequences can be identified using maximum-likelihood methods based on models of codon substitution (such as codeml). Although such methods have been employed numerous times with a variety of organisms, the time needed to collect the data and prepare the input files means that analyses are usually limited to tens or hundreds of coding regions. Nevertheless, the recent availability of flexible and easy-to-use computer applications that collect relevant data (such as BDBM) and infer positively selected amino acid sites (such as ADOPS) makes the entire process easier and quicker than before. However, as reported here, the lack of a batch option in ADOPS still precludes the analysis of hundreds or thousands of sequence files. Given the interest in, and feasibility of, such large-scale projects, we have also developed a database where ADOPS projects can be stored. This study therefore also presents the B+ database, which is both a data repository and a convenient interface for examining the information contained in ADOPS projects without the need to download and unzip the corresponding ADOPS project file. The ADOPS projects available at B+ can also be downloaded, unzipped, and opened using the ADOPS graphical interface. The availability of such a database ensures the repeatability of results, promotes data reuse with significant savings on the time needed for preparing datasets, and effortlessly allows further exploration of the data contained in ADOPS projects.

  11. A framework for studying transient dynamics of population projection matrix models.

    PubMed

    Stott, Iain; Townley, Stuart; Hodgson, David James

    2011-09-01

    Empirical models are central to effective conservation and population management, and should be predictive of real-world dynamics. Available modelling methods are diverse, but analysis usually focuses on long-term dynamics that are unable to describe the complicated short-term time series that can arise even from simple models following ecological disturbances or perturbations. Recent interest in such transient dynamics has led to diverse methodologies for their quantification in density-independent, time-invariant population projection matrix (PPM) models, but the fragmented nature of this literature has stifled the widespread analysis of transients. We review the literature on transient analyses of linear PPM models and synthesise a coherent framework. We promote the use of standardised indices, and categorise indices according to their focus on either convergence times or transient population density, and on either transient bounds or case-specific transient dynamics. We use a large database of empirical PPM models to explore relationships between indices of transient dynamics. This analysis promotes the use of population inertia as a simple, versatile and informative predictor of transient population density, but criticises the utility of established indices of convergence times. Our findings should guide further development of analyses of transient population dynamics using PPMs or other empirical modelling techniques. © 2011 Blackwell Publishing Ltd/CNRS.
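
    A minimal numerical sketch of two of the standardized indices this framework promotes, computed for an invented three-stage PPM: reactivity (the maximum one-timestep amplification of the growth-standardized matrix) and upper and lower bounds on population inertia derived from the dominant right and left eigenvectors.

        import numpy as np

        A = np.array([[0.0, 0.5, 2.0],      # invented 3-stage projection matrix
                      [0.3, 0.0, 0.0],
                      [0.0, 0.6, 0.8]])

        lams, W = np.linalg.eig(A)
        i = np.argmax(lams.real)
        lam = lams[i].real                              # asymptotic growth rate
        w = np.abs(W[:, i].real); w = w / w.sum()       # stable stage structure
        lams_t, V = np.linalg.eig(A.T)
        v = np.abs(V[:, np.argmax(lams_t.real)].real)   # reproductive values

        A_hat = A / lam                                 # discount asymptotic growth
        reactivity = A_hat.sum(axis=0).max()            # max one-step amplification
        inertia_up = v.max() * w.sum() / (v @ w)        # bounds on transient density
        inertia_lo = v.min() * w.sum() / (v @ w)
        print(f"lambda={lam:.3f} reactivity={reactivity:.3f} "
              f"inertia in [{inertia_lo:.3f}, {inertia_up:.3f}]")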

  12. Mesoscopic Community Structure of Financial Markets Revealed by Price and Sign Fluctuations.

    PubMed

    Almog, Assaf; Besamusca, Ferry; MacMahon, Mel; Garlaschelli, Diego

    2015-01-01

    The mesoscopic organization of complex systems, from financial markets to the brain, is intermediate between the microscopic dynamics of individual units (stocks or neurons, in these cases) and the macroscopic dynamics of the system as a whole. The organization is determined by "communities" of units whose dynamics, represented by time series of activity, are more strongly correlated internally than with the rest of the system. Recent studies have shown that the binary projections of various financial and neural time series exhibit nontrivial dynamical features that resemble those of the original data. This implies that a significant piece of information is encoded in the binary projection (i.e., the sign) of such increments. Here, we explore whether the binary signatures of multiple time series can replicate the same complex community organization of the financial market as the original weighted time series. We adopt a method that has been specifically designed to detect communities from cross-correlation matrices of time series data. Our analysis shows that the simpler binary representation leads to a community structure that is almost identical to that obtained using the full weighted representation. These results confirm that binary projections of financial time series contain significant structural information.
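
    A sketch of the core comparison, using synthetic returns with two built-in "communities" as a stand-in for real market data: the cross-correlation structure of the weighted series is compared with that of their binary (sign) projections.

        import numpy as np

        rng = np.random.default_rng(2)
        T, n = 2000, 20
        common = rng.normal(size=(T, 2))                # two community factors
        group = np.repeat([0, 1], n // 2)
        X = 0.6 * common[:, group] + rng.normal(size=(T, n))   # weighted series
        B = np.sign(X)                                  # binary projection

        C_w = np.corrcoef(X, rowvar=False)              # weighted correlations
        C_b = np.corrcoef(B, rowvar=False)              # binary correlations
        mask = ~np.eye(n, dtype=bool)                   # off-diagonal entries only
        r = np.corrcoef(C_w[mask], C_b[mask])[0, 1]
        print(f"structure agreement r = {r:.3f}")       # close to 1 in this toy case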

  13. Nonlinear Real-Time Optical Signal Processing.

    DTIC Science & Technology

    1981-06-30

    Real-time homomorphic and logarithmic filtering by halftone nonlinear processing has been achieved, in systems with large bandwidth and space-bandwidth products, along with a detailed analysis of degradation due to the finite gamma. The report covers research objectives and progress, halftone processing, and direct nonlinear processing.

  14. PERTS: A Prototyping Environment for Real-Time Systems

    NASA Technical Reports Server (NTRS)

    Liu, Jane W. S.; Lin, Kwei-Jay; Liu, C. L.

    1991-01-01

    We discuss an ongoing project to build a Prototyping Environment for Real-Time Systems, called PERTS. PERTS is a unique prototyping environment in that it has (1) tools and performance models for the analysis and evaluation of real-time prototype systems, (2) building blocks for flexible real-time programs and the supporting system software, (3) basic building blocks of distributed and intelligent real-time applications, and (4) an execution environment. PERTS will make recent and future theoretical advances in real-time system design and engineering readily usable to practitioners. In particular, it will provide an environment for the use and evaluation of new design approaches, for experimentation with alternative system building blocks, and for the analysis and performance profiling of prototype real-time systems.
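
    As a representative example of the schedulability analyses such an environment automates (not code from PERTS itself), here is the classic Liu and Layland rate-monotonic utilization test; the task set is invented.

        def rm_schedulable(tasks):
            """tasks: list of (computation_time, period) pairs."""
            n = len(tasks)
            u = sum(c / p for c, p in tasks)            # total processor utilization
            bound = n * (2 ** (1 / n) - 1)              # Liu & Layland RM bound
            return u, bound, u <= bound

        u, bound, ok = rm_schedulable([(1, 4), (1, 5), (2, 10)])
        print(f"U = {u:.3f}, bound = {bound:.3f}, schedulable: {ok}")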

  15. A rapid response air quality analysis system for use in projects having stringent quality assurance requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowman, A.W.

    1990-04-01

    This paper describes an approach to solving air quality analysis problems that frequently occur during iterations of the baseline change process. From a schedule standpoint, it is desirable to perform this evaluation in as short a time as possible, while budgetary pressures limit the size of the staff available to do the work. Without a method in place to deal with baseline change proposal requests, the environmental analysts may not be able to produce the analysis results in the time frame expected. Using a concept called the Rapid Response Air Quality Analysis System (RAAS), the problems of timing and cost become tractable. The system could be adapted to assess other atmospheric pathway impacts, e.g., acoustics or visibility. The air quality analysis system used to perform the environmental assessment (EA) for the Salt Repository Project (part of the Civilian Radioactive Waste Management Program), and later to evaluate the consequences of proposed baseline changes, consists of three components: emission source data files; emission rates contained in spreadsheets; and impact assessment model codes. The spreadsheets contain user-written codes (macros) that calculate emission rates from (1) emission source data (e.g., numbers and locations of sources, detailed operating schedules, and source specifications including horsepower, load factor, and duty cycle); (2) emission factors such as those published by the U.S. Environmental Protection Agency; and (3) control efficiencies.
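
    A minimal sketch of the macro logic described above, under assumed inputs: an hourly emission rate computed from source counts and specifications, an EPA-style emission factor, and a control efficiency. Every number is hypothetical.

        def emission_rate(n_sources, horsepower, load_factor, duty_cycle,
                          emission_factor_g_per_hp_hr, control_efficiency):
            """Controlled emission rate in grams per hour for one source group."""
            uncontrolled = (n_sources * horsepower * load_factor * duty_cycle
                            * emission_factor_g_per_hp_hr)
            return uncontrolled * (1.0 - control_efficiency)

        # e.g. 12 diesel engines, 300 hp, 60% load, running 80% of each hour,
        # a 6 g/hp-hr NOx factor, and 30% control efficiency
        print(emission_rate(12, 300, 0.6, 0.8, 6.0, 0.3), "g/hr")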

  16. Use of a Weather Generator for analysis of projections of future daily temperature and its validation with climate change indices

    NASA Astrophysics Data System (ADS)

    Di Piazza, A.; Cordano, E.; Eccel, E.

    2012-04-01

    The issue of climate change detection is considered a major challenge. In particular, high-temporal-resolution climate change scenarios are required to evaluate the effects of climate change on agricultural management (crop suitability, yields, risk assessment, etc.), energy production, and water management. In this work, a "Weather Generator" technique was used for downscaling climate change scenarios for temperature. An R package (RMAWGEN, Cordano and Eccel, 2011 - available on http://cran.r-project.org) was developed to generate synthetic daily weather conditions using the theory of vector autoregressive (VAR) models. The VAR model was chosen for its ability to maintain the temporal and spatial correlations among variables. In particular, observed time series of daily maximum and minimum temperature are transformed into "new" normally distributed variable time series, which are used to calibrate the parameters of a VAR model by ordinary least squares. The implemented algorithm, applied to monthly mean climatic values downscaled from Global Climate Model predictions, can therefore generate several stochastic daily scenarios in which the statistical consistency among series is preserved. Further details are given in the RMAWGEN documentation. An application is presented here using a dataset of daily temperature time series recorded at 41 sites in the Trentino region for the period 1958-2010. Temperature time series were pre-processed to fill missing values (by a site-specific calibrated Inverse Distance Weighting algorithm, corrected for elevation) and to remove inhomogeneities. Several climatic indices, useful for a range of impact assessment applications, were taken into account, and their time trends within the series were analyzed. The indices range from classical ones, such as annual and seasonal mean temperatures and their anomalies (from the reference period 1961-1990), to the climate change indices selected from the list recommended by the World Meteorological Organization Commission for Climatology (WMO-CCL) and the Expert Team on Climate Change Detection, Monitoring and Indices (ETCCDMI) of the Research Programme on Climate Variability and Predictability (CLIVAR). Each index was applied both to observed (and processed) data and to synthetic time series produced by the Weather Generator over the thirty-year reference period 1981-2010, in order to validate the procedure. Climate projections were statistically downscaled for a selection of sites for the two 30-year periods 2021-2050 and 2071-2099 of the European project "Ensembles" multi-model output (scenario A1B). The use of several climatic indices strengthens the trend analysis of both the generated synthetic series and the future climate projections.
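
    RMAWGEN itself is an R package; the sketch below only illustrates the underlying VAR idea in Python: fit a VAR(1) model to (already normalized) daily series by ordinary least squares, then simulate synthetic series whose auto- and cross-correlations match the fit. The data are synthetic.

        import numpy as np

        rng = np.random.default_rng(3)
        T = 5000
        z = np.zeros((T, 2))
        A_true = np.array([[0.7, 0.1], [0.2, 0.6]])
        for t in range(1, T):                           # synthetic "observed" data
            z[t] = A_true @ z[t - 1] + rng.normal(size=2)

        Y, X = z[1:], z[:-1]
        B, *_ = np.linalg.lstsq(X, Y, rcond=None)       # OLS fit of X @ B ~ Y
        A_hat = B.T                                     # VAR(1) coefficient matrix
        resid = Y - X @ B
        cov = np.cov(resid, rowvar=False)               # innovation covariance

        sim = np.zeros((T, 2))                          # one stochastic scenario
        for t in range(1, T):
            sim[t] = A_hat @ sim[t - 1] + rng.multivariate_normal([0, 0], cov)
        print(np.corrcoef(sim, rowvar=False)[0, 1])     # cross-correlation preserved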

  17. "A day in my life" photography project: the silent voice of pediatric bone marrow transplant patients.

    PubMed

    Breitwieser, Carrie L; Vaughn, Lisa M

    2014-01-01

    A photovoice project was conducted with pediatric bone marrow transplant (BMT) patients to examine their coping skills and interpretation of their experience during a BMT, especially when hospitalized. We also wanted to determine how photovoice could be used within a pediatric BMT unit. Sixteen children (ages 4-14) and 2 young adults (ages 22 and 25) from a pediatric BMT unit participated in the project. Six BMT outpatients participated in the data analysis and evaluation phase. Fourteen clinical staff evaluated the impact of the project on their practice. Three primary themes emerged from the pre- and post-BMT photos, accompanying detailed notes, and BMT outpatient analysis of the photos: (a) BMT is "torture," (b) BMT is "time slipping away," and (c) BMT requires normalization, comfort, distraction, and support. BMT patients and staff concluded that photovoice helped express and release emotions regarding the challenges of BMT. BMT staff noted that the results of this project reminded them of the importance of being patient-centered and mindful of patient experience and the therapeutic relationship. © 2014 by Association of Pediatric Hematology/Oncology Nurses.

  18. The Santa Monica freeway diamond lanes. Volume I. Summary. Final report, March 1976-August 1976

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Billheimer, J.W.; Bullemer, R.J.; Fratessa, C.

    1977-09-01

    The Santa Monica Freeway Diamond Lanes, a pair of concurrent-flow preferential lanes for buses and carpools linking the City of Santa Monica, California, with the Los Angeles CBD, opened on March 16, 1976 and operated amid much controversy for 21 weeks until the U.S. District Court halted the project. The Diamond Lane project marked the first time preferential lanes had been created by taking busy freeway lanes out of existing service and dedicating them to the exclusive use of high-occupancy vehicles. This report summarizes the findings of the evaluation of the project. The report addresses a broad range of impacts in the following major areas: traffic speeds and travel times; traffic volumes and carpool formation; bus operations and ridership; safety and enforcement; energy and air quality; and public attitudes and response. Analysis shows that the project succeeded in increasing carpool ridership by 65%, and the increased bus service accompanying the Diamond Lanes caused bus ridership to more than triple. Nonetheless, energy savings and air quality improvements were insignificant, freeway accidents increased significantly, non-carpoolers lost far more time than carpoolers gained, and a heated public outcry developed, which has delayed the implementation of other preferential treatment projects in Southern California.

  19. Importance of Requirements Analysis & Traceability to Improve Software Quality and Reduce Cost and Risk

    NASA Technical Reports Server (NTRS)

    Kapoor, Manju M.; Mehta, Manju

    2010-01-01

    The goal of this paper is to emphasize the importance of developing complete and unambiguous requirements early in the project cycle (prior to the Preliminary Design Phase). Having a complete set of requirements early in the project cycle allows sufficient time to generate a traceability matrix. Requirements traceability and analysis are the key elements in improving the verification and validation process, and thus overall software quality. Traceability can be most beneficial when the system changes: if changes are made to high-level requirements, the corresponding low-level requirements need to be modified as well. Traceability ensures that requirements are appropriately and efficiently verified at the various levels, whereas analysis ensures that a correctly interpreted set of requirements is produced.

  20. Cost analysis of a project to digitize classic articles in neurosurgery

    PubMed Central

    Bauer, Kathleen

    2002-01-01

    In summer 2000, the Cushing/Whitney Medical Library at Yale University began a demonstration project to digitize classic articles in neurosurgery from the late 1800s and early 1900s. The objective of the first phase of the project was to measure the time and costs involved in digitization, and those results are reported here. In the second phase, metadata will be added to the digitized articles, and the project will be publicized. Thirteen articles were scanned using optical character recognition (OCR) software, and the resulting text files were carefully proofread. Time for photocopying, scanning, and proofreading were recorded. This project achieved an average cost per item (total pages plus images) of $4.12, a figure at the high end of average costs found in other studies. This project experienced high costs for two reasons. First, the articles contained many images, which required extra processing. Second, the older fonts and the poor condition of many of these articles complicated the OCR process. The average article cost $84.46 to digitize. Although costs were high, the selection of historically important articles maximized the benefit gained from the investment in digitization. PMID:11999182

  2. The accuracy of matrix population model projections for coniferous trees in the Sierra Nevada, California

    USGS Publications Warehouse

    van Mantgem, P.J.; Stephenson, N.L.

    2005-01-01

    1. We assess the use of simple, size-based matrix population models for projecting population trends for six coniferous tree species in the Sierra Nevada, California. We used demographic data from 16 673 trees in 15 permanent plots to create 17 separate time-invariant, density-independent population projection models, and determined differences between trends projected from initial surveys with a 5-year interval and observed data during two subsequent 5-year time steps. 2. We detected departures from the assumptions of the matrix modelling approach in terms of strong growth autocorrelations. We also found evidence of observation errors for measurements of tree growth and, to a more limited degree, recruitment. Loglinear analysis provided evidence of significant temporal variation in demographic rates for only two of the 17 populations. 3. Total population sizes were strongly predicted by model projections, although population dynamics were dominated by carryover from the previous 5-year time step (i.e. there were few cases of recruitment or death). Fractional changes to overall population sizes were less well predicted. Compared with a null model and a simple demographic model lacking size structure, matrix model projections were better able to predict total population sizes, although the differences were not statistically significant. Matrix model projections were also able to predict short-term rates of survival, growth and recruitment. Mortality frequencies were not well predicted. 4. Our results suggest that simple size-structured models can accurately project future short-term changes for some tree populations. However, not all populations were well predicted and these simple models would probably become more inaccurate over longer projection intervals. The predictive ability of these models would also be limited by disturbance or other events that destabilize demographic rates. © 2005 British Ecological Society.

  3. Decision Making Methodology to Mitigate Damage From Glacial Lake Outburst Floods From Imja Lake in Nepal

    NASA Astrophysics Data System (ADS)

    McKinney, D. C.; Cuellar, A. D.

    2015-12-01

    Climate change has accelerated glacial retreat in high-altitude glaciated regions of Nepal, leading to the growth and formation of glacier lakes. Glacial lake outburst floods (GLOFs) are sudden events triggered by an earthquake, moraine failure, or other shock that causes a sudden outflow of water. These floods are catastrophic because of their sudden onset, the difficulty of predicting them, and the enormous quantity of water and debris that rapidly floods downstream areas. Imja Lake in the Himalaya of Nepal has experienced accelerated growth since it first appeared in the 1960s. Communities threatened by a flood from Imja Lake have advocated for projects to adapt to the increasing threat of a GLOF. Nonetheless, discussions surrounding projects for Imja have not included a rigorous analysis of the potential consequences of a flood, the probability of an event, or the costs of mitigation projects, in part because this information is unknown or uncertain. This work presents a demonstration of a decision-making methodology developed to rationally analyze the risks posed by Imja Lake and the various adaptation projects proposed, using available information. The authors use decision analysis, data envelopment analysis (DEA), and sensitivity analysis to assess proposed adaptation measures that would mitigate damage in downstream communities from a GLOF. We use an existing hydrodynamic model of the at-risk area to determine how adaptation projects will affect downstream flooding, and estimate fatalities using an empirical method developed for dam failures. The DEA methodology allows us to estimate the value of a statistical life implied by each project, given the cost of the project and the number of lives saved, in order to determine which project is the most efficient. In contrast, the decision analysis methodology requires fatalities to be assigned a cost, but allows the inclusion of uncertainty in the decision-making process. We compare the output of these two methodologies and determine the sensitivity of the conclusions to changes in uncertain input parameters, including project cost, the value of a statistical life, and the time to a GLOF event.
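
    A toy sketch of the two comparisons described: the DEA-style view ranks projects by the value of a statistical life implied by cost per life saved, while the decision-analysis view monetizes fatalities under an assumed GLOF probability. All project names and figures are hypothetical.

        projects = {                       # name: (cost $M, expected lives saved)
            "lake lowering 3 m":  (3.0,  5.0),
            "lake lowering 10 m": (12.0, 12.0),
            "early warning":      (1.5,  4.0),
        }

        for name, (cost, lives) in projects.items():
            print(f"{name:18s} implied VSL = ${cost / lives:.2f} M per life")

        # decision-analysis view: expected total cost given monetized fatalities
        vsl, p_glof, baseline_deaths = 1.0, 0.3, 20.0   # assumed inputs
        for name, (cost, lives) in projects.items():
            expected = cost + p_glof * (baseline_deaths - lives) * vsl
            print(f"{name:18s} expected cost = ${expected:.2f} M")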

  4. GET electronics samples data analysis

    NASA Astrophysics Data System (ADS)

    Giovinazzo, J.; Goigoux, T.; Anvar, S.; Baron, P.; Blank, B.; Delagnes, E.; Grinyer, G. F.; Pancin, J.; Pedroza, J. L.; Pibernat, J.; Pollacco, E.; Rebii, A.; Roger, T.; Sizun, P.

    2016-12-01

    The General Electronics for TPCs (GET) system has been developed to equip a generation of time projection chamber detectors for nuclear physics, and may also be used for a wider range of detector types. The goal of this paper is to propose initial analysis procedures to be applied to raw data samples from the GET system, in order to correct for systematic effects observed in test measurements. We also present a method to estimate the response function of the GET system channels. The response function is required in analyses where the input signal needs to be reconstructed, in terms of its time distribution, from the registered output samples.
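
    A sketch of why the channel response function matters downstream, under an assumed response shape: once a response h is estimated, the input time distribution can be recovered from the registered samples by regularized FFT deconvolution. The signals and response model are illustrative, not the GET system's measured response.

        import numpy as np

        n = 512
        t = np.arange(n)
        h = (t / 20.0) * np.exp(-t / 20.0); h /= h.sum()    # assumed response shape
        x = np.zeros(n); x[100] = 1.0; x[180] = 0.6         # "true" input pulses
        rng = np.random.default_rng(4)
        y = np.convolve(x, h)[:n] + rng.normal(0, 1e-3, n)  # registered samples

        H = np.fft.rfft(h, n)
        Y = np.fft.rfft(y, n)
        eps = 1e-3                                          # regularization term
        x_rec = np.fft.irfft(Y * H.conj() / (np.abs(H) ** 2 + eps), n)
        print(np.argsort(x_rec)[-2:])                       # peaks near 100 and 180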

  5. Providing Decision-Relevant Information for a State Climate Change Action Plan

    NASA Astrophysics Data System (ADS)

    Wake, C.; Frades, M.; Hurtt, G. C.; Magnusson, M.; Gittell, R.; Skoglund, C.; Morin, J.

    2008-12-01

    Carbon Solutions New England (CSNE), a public-private partnership formed to promote collective action to achieve a low-carbon society, has been working with the Governor-appointed New Hampshire Climate Change Policy Task Force (NHCCTF) to support the development of a state Climate Change Action Plan. CSNE's role has been to quantify the potential carbon emissions reduction, implementation costs, and cost savings at three distinct time periods (2012, 2025, 2050) for a range of strategies identified by the Task Force. These strategies were developed for several sectors (transportation and land use, electricity generation and use, building energy use, and agriculture, forestry, and waste). New Hampshire's existing and projected economic and population growth are well above the regional average, creating additional challenges for the state to meet regional emission reduction targets. However, by pursuing an ambitious suite of renewable energy and energy efficiency strategies, New Hampshire may be able to continue growing while reducing emissions at a rate close to 3% per year up to 2025. This suite includes efficiency improvements in new and existing buildings, a renewable portfolio standard for electricity generation, avoiding forested land conversion, fuel economy gains in new vehicles, and a reduction in vehicle miles traveled. Most (over 80%) of these emission reduction strategies are projected to provide net economic savings in 2025. A collaborative and iterative process was developed among the key partners in the project. The foundation for the project's success included: a diverse analysis team with leadership that was committed to the project, an open-source analysis approach, weekly meetings and frequent communication among the partners, interim reporting of analysis, and an established and trusting relationship among the partners, in part due to collaboration on previous projects. To develop decision-relevant information for the Task Force, CSNE addressed several challenges, including: allocating the emission reduction and economic impacts of local- to state-scale mitigation strategies that are in reality integrated on regional and/or national scales; incorporating changes to the details of the strategies over time; identifying and quantifying key variables; choosing appropriate levels of detail for over 100 strategies within the limited analysis timeframe; integrating individual strategies into a coherent whole; and structuring data presentation to maximize transparency of analysis without confusing or overwhelming decision makers.

  6. Description of the supporting factors of final project in Mathematics and Natural Sciences Faculty of Syiah Kuala University with multiple correspondence analysis

    NASA Astrophysics Data System (ADS)

    Rusyana, Asep; Nurhasanah; Maulizasari

    2018-05-01

    Syiah Kuala University (Unsyiah) aims to produce graduates who are qualified to work or to create their own employment, so the implementation of the final project course must be effective. This research uses data from an evaluation conducted by the Faculty of Mathematics and Natural Sciences (FMIPA) of Unsyiah. Factors that support the completion of the final project include duration, guidance, final project seminars, facilities, public impact, and quality. This research aims to identify the factors related to the completion of the final project and to identify similarities among variables. The factors that support completion of the final project in every study program in FMIPA are (1) duration, (2) guidance, and (3) facilities. The correlations among these factors are examined with chi-square tests, after which the variables are analyzed with multiple correspondence analysis. Based on the correspondence plot, the guidance activities and facilities in the Informatics study program fall in the fair category, while guidance and facilities in Chemistry fall in the best category. In addition, students in Physics finish the final project in the shortest time, while students in Pharmacy take the longest.
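
    A minimal sketch of the first analysis step named above: a chi-square test of independence between study program and completion-duration category, using scipy's contingency-table test. The counts are invented; the multiple correspondence analysis that follows in the study is not reproduced here.

        import numpy as np
        from scipy.stats import chi2_contingency

        # rows: study programs; columns: completion duration (fast, medium, slow)
        table = np.array([[20, 15, 5],      # Physics
                          [10, 18, 12],     # Chemistry
                          [6, 14, 20]])     # Pharmacy
        chi2, p, dof, expected = chi2_contingency(table)
        print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")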

  7. Analysis and fifteen-year projection of the market for LANDSAT data

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The potential market for LANDSAT products through the 1990s was determined. Results are presented in a matrix format. Improved resolution is a major factor in the marketability of LANDSAT data, with the 10-meter resolution (projected for 1995) having a significant impact on federal, private, and international users, and on agricultural, minerals, and national defense applications. Data delivery time and competition from the French remote sensing system are also considered.

  8. [Tobacco quality analysis of producing areas of Yunnan tobacco using near-infrared (NIR) spectrum].

    PubMed

    Wang, Yi; Ma, Xiang; Wen, Ya-Dong; Yu, Chun-Xia; Wang, Luo-Ping; Zhao, Long-Lian; Li, Jun-Hui

    2013-01-01

    In the present study, tobacco quality analysis of different producing areas was carried out using spectrum projection and correlation methods. The data were near-infrared (NIR) spectra, collected in 2010 for industrial classification, of leaves from the middle part of the tobacco plant, supplied by Hongta Tobacco (Group) Co., Ltd. A total of 1276 superior tobacco leaf samples were collected from four producing areas: three (Yuxi, Chuxiong, and Zhaotong, in Yunnan province) growing tobacco variety K326, and one (Dali) growing variety Hongda. When the samples were randomly divided in a 2:1 ratio into analysis and verification sets, the verification set corresponded well with the analysis set under spectrum projection: the correlation coefficients of the first and second projection dimensions were all above 0.99. The study also presents a method for obtaining quantitative similarity values between samples from different producing areas. These similarity values are instructive for tobacco plant planning, quality management, acquisition of raw tobacco materials, and tobacco leaf blending.

  9. Private participation in infrastructure: A risk analysis of long-term contracts in power sector

    NASA Astrophysics Data System (ADS)

    Ceran, Nisangul

    The objective of this dissertation is to assess whether private participation in the energy sector through long-term contracting, such as Build-Operate-Transfer (BOT) investments, is an effective way of promoting efficiency in the economy. To this end, the theoretical literature on the issue is discussed, the experience of several developing countries is examined, and a BOT project undertaken by the Enron company in Turkey is studied in depth as a case study. Different risk analysis techniques, including sensitivity analysis and probabilistic risk analysis with the Monte Carlo simulation (MCS) method, are applied to assess the financial feasibility and risks of the case study project, and to shed light on the level of rent-seeking in BOT agreements. Although data on rent-seeking and corruption are difficult to obtain, the analysis of the case study investment using the sensitivity and MCS methods provides some information that can be used in assessing the level of rent-seeking in BOT projects. The risk analysis made it possible to test the sustainability of long-term BOT contracts by analyzing the project's financial feasibility with and without government guarantees. Testing the sustainability of the project under different scenarios helps in understanding the potential costs and contingent liabilities for the government, and the project's impact on a country's overall economy. The results of the MCS-based risk analysis for the case study BOT project strongly suggest that such projects do not serve the interest of society and transfer substantial amounts of public money to private companies, implying severe governance problems. It is found that not only the government but also the private sector may be reluctant about full privatization of infrastructure, due to factors such as large sunk costs, the very long time period over which returns are received, political and macroeconomic uncertainties, and an insufficient institutional and regulatory environment. It is concluded that BOT-type infrastructure projects are not an efficient way of promoting private sector participation in infrastructure; they tend to serve the interests of rent-seekers rather than those of society. Since concession contracts and Treasury guarantees shift the commercial risk to the government, the private sector has no incentive to be efficient. The concession agreements distort market conditions by preventing free competition in the market.
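
    A compact sketch of the Monte Carlo feasibility test described, under wholly hypothetical parameters: draw uncertain generation, price, and operating cost; compute NPV with and without a take-or-pay style revenue floor; and compare downside risk. This illustrates the generic technique, not the dissertation's actual model.

        import numpy as np

        rng = np.random.default_rng(5)
        n, years, r = 10_000, 20, 0.10
        capex = 600.0                                   # $M, sunk construction cost
        disc = (1 + r) ** -np.arange(1, years + 1)      # discount factors

        energy = rng.normal(3.0, 0.6, (n, 1)).clip(min=0.5)   # TWh/yr sold
        price = rng.normal(60.0, 12.0, (n, 1))                # $/MWh market price
        opex = rng.normal(80.0, 15.0, (n, 1))                 # $M/yr operating cost

        rev_market = np.broadcast_to(energy * price, (n, years))   # $M/yr
        rev_floor = np.maximum(rev_market, 180.0)       # take-or-pay guarantee
        for rev, label in [(rev_market, "no guarantee"), (rev_floor, "with floor")]:
            npv = ((rev - opex) @ disc) - capex
            print(f"{label:13s} mean NPV = {npv.mean():7.1f} $M, "
                  f"P(NPV<0) = {(npv < 0).mean():.2f}")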

  10. The Pilot Phase of the Global Soil Wetness Project Phase 3

    NASA Astrophysics Data System (ADS)

    Kim, H.; Oki, T.

    2015-12-01

    After the second phase of the Global Soil Wetness Project (GSWP2), an early global continuous gridded multi-model analysis, a comprehensive set of land surface fluxes and state variables became available. It has been broadly utilized in the hydrology community, and its success has motivated a follow-on that takes advantage of recent scientific progress and extends the relatively short time span (1986-1995) of the previous project. In the third phase proposed here (GSWP3), an extensive set of quantities for hydrological, energy, and ecological systems will be produced to investigate their long-term (1901-2010) changes. The energy-water-carbon cycles and their interactions are also examined subcomponent-wise, with appropriate model verification, in ensemble land simulations. In this study, the preliminary results and the problems found in the first-round analysis of the GSWP3 pilot study are shown. We also discuss how this global offline simulation activity contributes to wider communities and to larger efforts such as the Coupled Model Intercomparison Project Phase 6 (CMIP6).

  11. Semiconductor grade, solar silicon purification project

    NASA Technical Reports Server (NTRS)

    Ingle, W. M.; Rosler, R. R.; Thompson, S. W.; Chaney, R. E.

    1979-01-01

    Experimental apparatus and procedures used in the development of a 3-step SiF2(x) polymer transport purification process are described. Both S.S.M.S. and E.S. analysis demonstrated that major purification had occurred, and some samples were indistinguishable from semiconductor-grade silicon (except possibly for phosphorus). Recent electrical analysis via crystal growth reveals that the product contains compensated phosphorus and boron. The low projected product cost and short energy payback time suggest that the economics of this process will result in a cost below the goal of $10/kg (1975 dollars). The process appears to be readily scalable to a major silicon purification facility.

  12. The New York Times Report on Teenage Reading Tastes and Habits.

    ERIC Educational Resources Information Center

    Freiberger, Rema

    In order to learn whether teenagers are reading books and, if so, which books they choose, "The New York Times" conducted a fact-finding project. Questionnaires were mailed to the school librarians and English chairmen of 7000 secondary and intermediate schools. The wide variety of answers to observable trends necessitated the analysis of a random…

  13. Precise 3D Track Reconstruction Algorithm for the ICARUS T600 Liquid Argon Time Projection Chamber Detector

    DOE PAGES

    Antonello, M.; Baibussinov, B.; Benetti, P.; ...

    2013-01-15

    Liquid Argon Time Projection Chamber (LAr TPC) detectors offer charged particle imaging capability with remarkable spatial resolution. Precise event reconstruction procedures are critical in order to fully exploit the potential of this technology. In this paper we present a new, general approach to 3D reconstruction for the LAr TPC, with a practical application to track reconstruction. The efficiency of the method is evaluated on a sample of simulated tracks. We also present the application of the method to the analysis of stopping particle tracks collected during ICARUS T600 detector operation with the CNGS neutrino beam.

  14. Road weather connected vehicle applications : benefit-cost analysis interim report.

    DOT National Transportation Integrated Search

    2013-01-01

    RWMP is currently engaged in a project to evaluate the potential benefits of road weather connected vehicle applications. Of particular interest are the potential improvements in safety, reductions in travel time, improved travel reliability, reducti...

  15. When is Testing Sufficient

    NASA Technical Reports Server (NTRS)

    Rosenberg, Linda H.; Arthur, James D.; Stapko, Ruth K.; Davani, Darush

    1999-01-01

    The Software Assurance Technology Center (SATC) at NASA Goddard Space Flight Center has been investigating how projects can determine when sufficient testing has been completed. For most projects, schedules are underestimated, and the last phase of software development, testing, must be compressed. Two questions are frequently asked: "To what extent is the software error-free?" and "How much time and effort is required to detect and remove the remaining errors?" Clearly, neither question can be answered with absolute certainty. Nonetheless, the ability to answer these questions with some acceptable level of confidence is highly desirable. First, knowing the extent to which a product is error-free, we can judge when it is time to terminate testing. Second, if errors are judged to be present, we can perform a cost/benefit trade-off analysis to estimate when the software will be ready for use and at what cost. This paper explains the efforts of the SATC to help projects determine what constitutes sufficient testing and when it is most cost-effective to stop testing.
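
    A sketch of one common way to answer both questions (not necessarily the SATC's specific method): fit a software reliability growth model, here Goel-Okumoto, to cumulative defect counts, then estimate the residual errors. The defect history is invented.

        import numpy as np
        from scipy.optimize import curve_fit

        def goel_okumoto(t, a, b):
            """Expected cumulative defects found by test time t."""
            return a * (1.0 - np.exp(-b * t))

        weeks = np.arange(1, 13)
        found = np.array([8, 15, 21, 26, 30, 33, 36, 38, 40, 41, 42, 43])
        (a, b), _ = curve_fit(goel_okumoto, weeks, found, p0=(50.0, 0.1))
        remaining = a - goel_okumoto(weeks[-1], a, b)
        print(f"estimated total defects a = {a:.1f}, remaining ~ {remaining:.1f}")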

  16. Overview of computational control research at UT Austin

    NASA Technical Reports Server (NTRS)

    Wie, Bong

    1989-01-01

    An overview of current research activities at UT Austin is presented, addressing certain technical issues in the following areas: (1) Computer-Aided Nonlinear Control Design: In this project, the describing function method is employed for the nonlinear control analysis and design of a flexible spacecraft equipped with pulse-modulated reaction jets. The INCA program has been enhanced to allow the numerical calculation of describing functions as well as nonlinear limit cycle analysis in the frequency domain; (2) Robust Linear Quadratic Gaussian (LQG) Compensator Synthesis: Robust control design techniques and software tools are developed for flexible space structures with parameter uncertainty. In particular, an interactive, robust multivariable control design capability is being developed for the INCA program; and (3) LQR-Based Autonomous Control System for the Space Station: In this project, real-time implementation of an LQR-based autonomous control system is investigated for the space station with time-varying inertias and significant multibody dynamic interactions.

  17. AEP Ohio gridSMART Demonstration Project Real-Time Pricing Demonstration Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Widergren, Steven E.; Subbarao, Krishnappa; Fuller, Jason C.

    2014-02-01

    This report contributes initial findings from an analysis of significant aspects of the gridSMART® Real-Time Pricing (RTP) – Double Auction demonstration project. Over the course of four years, Pacific Northwest National Laboratory (PNNL) worked with American Electric Power (AEP), Ohio and Battelle Memorial Institute to design, build, and operate an innovative system to engage residential consumers and their end-use resources in a participatory approach to electric system operations, an incentive-based approach that has the promise of providing greater efficiency under normal operating conditions and greater flexibility to react under situations of system stress. The material contained in this report supplements the findings documented by AEP Ohio in the main body of the gridSMART report. It delves into three main areas: impacts on system operations, impacts on households, and observations about the sensitivity of load to price changes.
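
    A stylized sketch of the market mechanism named in the project title: a uniform-price double auction in which bids and offers are matched and the market clears where demand meets supply. The clearing convention (midpoint of the marginal bid and offer) and all prices are illustrative, not the demonstration's actual rules.

        def clear_double_auction(bids, offers):
            """Uniform-price clearing. bids/offers: (price, quantity) tuples."""
            bids = sorted(bids, key=lambda x: -x[0])    # demand, high to low
            offers = sorted(offers, key=lambda x: x[0]) # supply, low to high
            traded, i, j = 0.0, 0, 0
            while i < len(bids) and j < len(offers) and bids[i][0] >= offers[j][0]:
                q = min(bids[i][1], offers[j][1])       # match marginal pair
                traded += q
                bids[i] = (bids[i][0], bids[i][1] - q)
                offers[j] = (offers[j][0], offers[j][1] - q)
                if bids[i][1] == 0: i += 1
                if offers[j][1] == 0: j += 1
            # clear at the midpoint of the first unmatched bid and offer
            price = (bids[min(i, len(bids) - 1)][0]
                     + offers[min(j, len(offers) - 1)][0]) / 2
            return price, traded

        bids = [(0.12, 3), (0.10, 5), (0.07, 4)]        # $/kWh, kW of demand
        offers = [(0.05, 6), (0.09, 4), (0.15, 5)]      # $/kWh, kW of supply
        print(clear_double_auction(bids, offers))       # -> (0.08, 8.0)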

  18. Development and Implementation of a Generic Analysis Template for Structural-Thermal-Optical-Performance Modeling

    NASA Technical Reports Server (NTRS)

    Scola, Salvatore; Stavely, Rebecca; Jackson, Trevor; Boyer, Charlie; Osmundsen, Jim; Turczynski, Craig; Stimson, Chad

    2016-01-01

    Performance-related effects of system level temperature changes can be a key consideration in the design of many types of optical instruments. This is especially true for space-based imagers, which may require complex thermal control systems to maintain alignment of the optical components. Structural-Thermal-Optical-Performance (STOP) analysis is a multi-disciplinary process that can be used to assess the performance of these optical systems when subjected to the expected design environment. This type of analysis can be very time consuming, which makes it difficult to use as a trade study tool early in the project life cycle. In many cases, only one or two iterations can be performed over the course of a project. This limits the design space to best practices since it may be too difficult, or take too long, to test new concepts analytically. In order to overcome this challenge, automation, and a standard procedure for performing these studies is essential. A methodology was developed within the framework of the Comet software tool that captures the basic inputs, outputs, and processes used in most STOP analyses. This resulted in a generic, reusable analysis template that can be used for design trades for a variety of optical systems. The template captures much of the upfront setup such as meshing, boundary conditions, data transfer, naming conventions, and post-processing, and therefore saves time for each subsequent project. A description of the methodology and the analysis template is presented, and results are described for a simple telescope optical system.

  20. Optimizing Energy Consumption in Building Designs Using Building Information Model (BIM)

    NASA Astrophysics Data System (ADS)

    Egwunatum, Samuel; Joseph-Akwara, Esther; Akaigwe, Richard

    2016-09-01

    Given the ability of a Building Information Model (BIM) to serve as a multi-disciplinary data repository, this paper seeks to explore and exploit the sustainability value of Building Information Modelling/models in delivering buildings that require less energy for their operation, emit less CO2, and at the same time provide a comfortable living environment for their occupants. This objective was achieved by a critical and extensive review of the literature covering: (1) building energy consumption, (2) building energy performance and analysis, and (3) building information modeling and energy assessment. The literature cited in this paper showed that linking an energy analysis tool with a BIM model helped project design teams to predict and create optimized energy consumption. To validate this finding, an in-depth analysis was carried out on a completed BIM-integrated construction project, the Arboleda Project in the Dominican Republic. The findings showed that the BIM-based energy analysis helped the design team achieve the world's first 103% positive energy building. From the research findings, the paper concludes that linking an energy analysis tool with a BIM model helps to expedite the energy analysis process, provide more detailed and accurate results, and deliver energy-efficient buildings. The study further recommends that the adoption of level 2 BIM and the integration of BIM in energy optimization analyses should be made compulsory for all projects, irrespective of the method of procurement (government-funded or otherwise) or size.

  1. From climate-change spaghetti to climate-change distributions for 21st Century California

    USGS Publications Warehouse

    Dettinger, M.D.

    2005-01-01

    The uncertainties associated with climate-change projections for California are unlikely to disappear any time soon, and yet important long-term decisions will be needed to accommodate those potential changes. Projection uncertainties have typically been addressed by analysis of a few scenarios, chosen based on availability or to capture the extreme cases among available projections. However, by focusing on more common projections rather than the most extreme projections (using a new resampling method), new insights into current projections emerge: (1) uncertainties associated with future greenhouse-gas emissions are comparable with the differences among climate models, so that neither source of uncertainties should be neglected or underrepresented; (2) twenty-first century temperature projections spread more, overall, than do precipitation scenarios; (3) projections of extremely wet futures for California are true outliers among current projections; and (4) current projections that are warmest tend, overall, to yield a moderately drier California, while the cooler projections yield a somewhat wetter future. The resampling approach applied in this paper also provides a natural opportunity to objectively incorporate measures of model skill and the likelihoods of various emission scenarios into future assessments.
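
    A toy sketch of the resampling approach described: resample an ensemble of projections with weights built from assumed model-skill scores and emission-scenario likelihoods, then summarize the resulting distribution. All numbers are placeholders, not the paper's projections.

        import numpy as np

        rng = np.random.default_rng(6)
        dT = np.array([1.5, 2.1, 2.4, 3.0, 3.8, 4.5])     # degC warming by 2100
        skill = np.array([0.9, 1.0, 0.8, 1.0, 0.7, 0.6])  # model-skill weights
        scen = np.array([0.8, 1.0, 1.0, 1.2, 1.2, 0.8])   # scenario likelihoods
        w = skill * scen
        w /= w.sum()

        sample = rng.choice(dT, size=10_000, replace=True, p=w)
        print(np.percentile(sample, [10, 50, 90]))        # projection distribution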

  2. Longitudinal versus cross-sectional methodology for estimating the economic burden of breast cancer: a pilot study.

    PubMed

    Mullins, C Daniel; Wang, Junling; Cooke, Jesse L; Blatt, Lisa; Baquet, Claudia R

    2004-01-01

    Projecting future breast cancer treatment expenditure is critical for budgeting purposes, medical decision making and the allocation of resources in order to maximise the overall impact on health-related outcomes of care. Currently, both longitudinal and cross-sectional methodologies are used to project the economic burden of cancer. This pilot study examined the differences in estimates that were obtained using these two methods, focusing on Maryland, US Medicaid reimbursement data for chemotherapy and prescription drugs for the years 1999-2000. Two different methodologies for projecting life cycles of cancer expenditure were considered. The first examined expenditure according to chronological time (calendar quarter) for all cancer patients in the database in a given quarter. The second examined only the most recent quarter and constructed a hypothetical expenditure life cycle by taking into consideration the number of quarters since the respective patient had her first claim. We found different average expenditures using the same data and over the same time period. The longitudinal measurement had less extreme peaks and troughs, and yielded average expenditure in the final period that was 60% higher than that produced using the cross-sectional analysis; however, the longitudinal analysis had intermediate periods with significantly lower estimated expenditure than the cross-sectional data. These disparate results signify that each of the methods has merit. The longitudinal method tracks changes over time while the cross-sectional approach reflects more recent data, e.g. current practice patterns. Thus, this study reiterates the importance of considering the methodology when projecting future cancer expenditure.
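
    A small pandas sketch of the two estimators on invented claims data: the cross-sectional average groups expenditure by calendar quarter, while the longitudinal average re-indexes each patient's claims by quarters since her first claim before averaging.

        import pandas as pd

        claims = pd.DataFrame({
            "patient": ["A", "A", "A", "B", "B", "C"],
            "quarter": ["2019Q1", "2019Q2", "2019Q3", "2019Q2", "2019Q3", "2019Q3"],
            "spend":   [9000, 4000, 2500, 8500, 3900, 9200],
        })
        claims["q"] = pd.PeriodIndex(claims["quarter"], freq="Q")

        # cross-sectional: mean expenditure per calendar quarter
        print(claims.groupby("quarter")["spend"].mean())

        # longitudinal: mean expenditure by quarters since each patient's first claim
        first = claims.groupby("patient")["q"].transform("min")
        claims["since_first"] = (claims["q"] - first).map(lambda d: d.n)
        print(claims.groupby("since_first")["spend"].mean())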

  3. Involvement of the anterior cingulate cortex in time-based prospective memory task monitoring: An EEG analysis of brain sources using Independent Component and Measure Projection Analysis

    PubMed Central

    Burgos, Pablo; Kilborn, Kerry; Evans, Jonathan J.

    2017-01-01

    Objective: Time-based prospective memory (PM), remembering to do something at a particular moment in the future, is considered to depend upon self-initiated strategic monitoring, involving a retrieval mode (sustained maintenance of the intention) plus target checking (intermittent time checks). The present experiment was designed to explore what brain regions and brain activity are associated with these components of strategic monitoring in time-based PM tasks. Method: 24 participants were asked to reset a clock every four minutes, while performing a foreground ongoing word categorisation task. EEG activity was recorded and data were decomposed into source-resolved activity using Independent Component Analysis. Common brain regions across participants, associated with retrieval mode and target checking, were found using Measure Projection Analysis. Results: Participants decreased their performance on the ongoing task when concurrently performed with the time-based PM task, reflecting an active retrieval mode that relied on withdrawal of limited resources from the ongoing task. Brain activity, with its source in or near the anterior cingulate cortex (ACC), showed changes associated with an active retrieval mode, including greater negative ERP deflections, decreased theta synchronization, and increased alpha suppression for events locked to the ongoing task while maintaining a time-based intention. Activity in the ACC was also associated with time checks and was found consistently across participants; however, we did not find an association with time perception processing per se. Conclusion: The involvement of the ACC in both aspects of time-based PM monitoring may be related to different functions that have been attributed to it: strategic control of attention during the retrieval mode (distributing attentional resources between the ongoing task and the time-based task) and anticipatory/decision-making processing associated with clock checks. PMID:28863146
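
    A minimal sketch of the source-separation stage on synthetic data: FastICA (scikit-learn's implementation, standing in for the ICA decomposition used in the study) unmixes multichannel recordings into independent time courses. The subsequent clustering of sources into domains (Measure Projection Analysis) is not reproduced here.

        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(7)
        t = np.linspace(0, 10, 2000)
        sources = np.c_[np.sin(7 * t),                  # oscillatory source
                        np.sign(np.sin(3 * t)),         # square-wave source
                        rng.normal(size=t.size)]        # noise source
        mixing = rng.normal(size=(8, 3))                # 8 "electrodes", 3 sources
        eeg = sources @ mixing.T                        # channel-space recordings

        ica = FastICA(n_components=3, random_state=0)
        activations = ica.fit_transform(eeg)            # source time courses
        print(activations.shape, ica.mixing_.shape)     # (2000, 3) (8, 3)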

  4. Data Rods: High Speed, Time-Series Analysis of Massive Cryospheric Data Sets Using Object-Oriented Database Methods

    NASA Astrophysics Data System (ADS)

    Liang, Y.; Gallaher, D. W.; Grant, G.; Lv, Q.

    2011-12-01

    Change over time is the central driver of climate change detection. The goal is to diagnose the underlying causes and make projections into the future. In an effort to optimize this process we have developed the Data Rod model, an object-oriented approach that provides the ability to query grid cell changes and their relationships to neighboring grid cells through time. The time series data is organized in time-centric structures called "data rods." A single data rod can be pictured as the multi-spectral data history at one grid cell: a vertical column of data through time. This resolves the long-standing problem of managing time-series data and opens new possibilities for temporal data analysis. This structure enables rapid time-centric analysis at any grid cell across multiple sensors and satellite platforms. Collections of data rods can be spatially and temporally filtered, statistically analyzed, and aggregated for use with pattern matching algorithms. Likewise, individual image pixels can be extracted to generate multi-spectral imagery at any spatial and temporal location. The Data Rods project has created a series of prototype databases to store and analyze massive datasets containing multi-modality remote sensing data. Using object-oriented technology, this method overcomes the operational limitations of traditional relational databases. To demonstrate the speed and efficiency of time-centric analysis using the Data Rods model, we have developed a sea ice detection algorithm. This application determines the concentration of sea ice in a small spatial region across a long temporal window. If performed using traditional analytical techniques, this task would typically require extensive data downloads and spatial filtering. Using Data Rods databases, the exact spatio-temporal data set is immediately available; no extraneous data is downloaded, and all selected data querying occurs transparently on the server side. Moreover, fundamental statistical calculations such as running averages are easily implemented against the time-centric columns of data.
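
    A minimal sketch of the data-rod idea under invented data: one contiguous, time-ordered column of values per grid cell, so that a time-centric query such as a running mean touches only that cell's column.

        import numpy as np

        class DataRod:
            """Time-ordered observation column for a single grid cell."""
            def __init__(self, times, values):
                order = np.argsort(times)
                self.times = np.asarray(times)[order]
                self.values = np.asarray(values)[order]

            def running_mean(self, window):
                kernel = np.ones(window) / window
                return np.convolve(self.values, kernel, mode="valid")

        rng = np.random.default_rng(8)
        rod = DataRod(times=np.arange(365), values=rng.normal(0.6, 0.1, 365))
        print(rod.running_mean(30)[:3])                 # fast, cell-local analysis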

  5. Ultra-rapid EOP determination with VLBI

    NASA Astrophysics Data System (ADS)

    Haas, Rüdiger; Kurihara, Shinobu; Nozawa, Kentaro; Hobiger, Thomas; Lovell, Jim; McCallum, Jamie; Quick, Jonathan

    2013-04-01

    In 2007 the Geospatial Information Authority of Japan (GSI) and the Onsala Space Observatory (OSO) started a project aiming at determining the earth rotation angle, usually expressed as dUT1, in near real time. In the beginning of this project, dedicated one-hour-long single-baseline experiments were observed periodically using the VLBI stations Onsala (Sweden) and Tsukuba (Japan). The strategy is that the observed VLBI data are sent in real time via the international optical fibre backbone to the VLBI correlator at Tsukuba, where the data are correlated and analyzed in near real time, producing ultra-rapid dUT1 results. An offline version of this strategy was adopted in 2009 for the regular VLBI intensive series INT-2, involving Wettzell (Germany) and Tsukuba. Since March 2010 the INT-2 has also used real-time e-transfer, and since June 2010 automated analysis as well. Starting in 2009 the ultra-rapid approach was applied to regular 24-hour VLBI sessions involving Tsukuba and Onsala, so that ultra-rapid dUT1 results can be produced already during ongoing VLBI sessions. This strategy was successfully operated during the 15-day CONT11 campaign. In 2011 the ultra-rapid strategy was extended to a network of VLBI stations, so that not only dUT1 but also the polar motion components can be determined in near real time. Initially, in November 2011, a dedicated three-station session was observed involving Onsala, Tsukuba and Hobart (Tasmania, Australia). In 2012 several regular 24-hour IVS sessions involving Onsala, Tsukuba and HartRAO (South Africa) were operated with the ultra-rapid strategy, and in several cases Hobart was added as a fourth station. For this project we use the new analysis software c5++ developed by the National Institute of Information and Communications Technology (NICT). In this presentation we give an overview of the UREOP project, describe recent developments, and discuss the obtained results.

  6. The Version 2 Global Precipitation Climatology Project (GPCP) Monthly Precipitation Analysis (1979-Present)

    NASA Technical Reports Server (NTRS)

    Adler, Robert F.; Huffman, George J.; Chang, Alfred; Ferraro, Ralph; Xie, Ping-Ping; Janowiak, John; Rudolf, Bruno; Schneider, Udo; Curtis, Scott; Bolvin, David

    2003-01-01

    The Global Precipitation Climatology Project (GPCP) Version 2 Monthly Precipitation Analysis is described. This globally complete, monthly analysis of surface precipitation at 2.5 degrees x 2.5 degrees latitude-longitude resolution is available from January 1979 to the present. It is a merged analysis that incorporates precipitation estimates from low-orbit-satellite microwave data, geosynchronous-orbit-satellite infrared data, and rain gauge observations. The merging approach utilizes the higher accuracy of the low-orbit microwave observations to calibrate, or adjust, the more frequent geosynchronous infrared observations. The data set is extended back into the premicrowave era (before 1987) by using infrared-only observations calibrated to the microwave-based analysis of the later years. The combined satellite-based product is adjusted by the rain gauge analysis. This monthly analysis is the foundation for the GPCP suite of products, including those at finer temporal resolution, the individual satellite estimates, and error estimates for each field. The 23-year GPCP climatology is characterized, along with time and space variations of precipitation.

  7. The SERENDIP 2 SETI project: Current status

    NASA Technical Reports Server (NTRS)

    Bowyer, C. S.; Werthimer, D.; Donnelly, C.; Herrick, W.; Lampton, M.

    1991-01-01

    Over the past 30 years, interest in extraterrestrial intelligence has progressed from philosophical discussion to rigorous scientific endeavors attempting to make contact. Since it is impossible to assess the probability of success and the amount of telescope time needed for detection, Search for Extraterrestrial Intelligence (SETI) projects are plagued with the problem of obtaining the large amounts of time needed on the world's precious few large radio telescopes. To circumvent this problem, the Search for Extraterrestrial Radio Emissions from Nearby Developed Intelligent Populations (SERENDIP) instrument operates autonomously in a piggyback mode, utilizing whatever observing plan is chosen by the primary observer. In this way, large quantities of high-quality data can be collected in a cost-effective and unobtrusive manner. During normal operations, SERENDIP logs statistically significant events for further offline analysis. Because of the large number of terrestrial and near-space transmitters, a major element of the SERENDIP project involves identifying and rejecting spurious signals from these sources. Another major element of the SERENDIP project (as well as most other SETI efforts) is identifying candidate extraterrestrial intelligence (ETI) signals. Events selected as candidate ETI signals are studied further in a targeted search program which utilizes between 24 and 48 hours of dedicated telescope time each year.

  8. U.S. Energy Service Company Industry: Market Size and Project Performance from 1990-2008

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larsen, Peter; Goldman, Charles; Satchwell, Andrew

    2012-08-21

    The U.S. energy service company (ESCO) industry is an example of a private sector business model where energy savings are delivered to customers primarily through the use of performance-based contracts. This study was conceived as a snapshot of the ESCO industry prior to the economic slowdown and the introduction of federal stimulus funding mandated by enactment of the American Recovery and Reinvestment Act of 2009 (ARRA). This study utilizes two parallel analytic approaches to characterize ESCO industry and market trends in the U.S.: (1) a "top-down" approach involving a survey of individual ESCOs to estimate aggregate industry activity and (2) a "bottom-up" analysis of a database of ~3,250 projects (representing over $8B in project investment) that reports market trends including installed EE retrofit strategies, project installation costs and savings, project payback times, and benefit-cost ratios over time. Despite the onset of a severe economic recession, the U.S. ESCO industry managed to grow at about 7% per year between 2006 and 2008. ESCO industry revenues were about $4.1 billion in 2008, and ESCOs anticipate accelerated growth through 2011 (25% per year). We found that 2,484 ESCO projects in our database generated ~$4.0 billion ($2009) in net, direct economic benefits to their customers. We estimate that the ESCO project database includes about 20% of all U.S. ESCO market activity from 1990-2008. Assuming the net benefits per project are comparable for ESCO projects that are not included in the LBNL database, this would suggest that the ESCO industry has generated ~$23 billion in net direct economic benefits for customers at projects installed between 1990 and 2008. There is empirical evidence confirming that the industry is evolving by installing more comprehensive and complex measures, including onsite generation and measures to address deferred maintenance, but this evolution has significant implications for customer project economics, especially at K-12 schools. We found that the median simple payback time has increased from 1.9 to 3.2 years in private sector projects since the early-to-mid 1990s and from 5.2 to 10.5 years in public sector projects for the same time period.
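
    As a point of reference for the payback and benefit-cost figures above, the sketch below computes a simple payback time and a discounted benefit-cost ratio for a hypothetical retrofit; the dollar amounts, lifetime, and discount rate are invented for illustration, not drawn from the LBNL database.

        # Simple payback time and a benefit-cost ratio for a hypothetical retrofit.
        install_cost = 500_000.0        # $ project installation cost
        annual_savings = 95_000.0       # $ per year in avoided energy costs
        lifetime_years = 15
        discount_rate = 0.05

        simple_payback = install_cost / annual_savings

        # Present value of a level annuity of savings over the measure lifetime.
        pv_savings = annual_savings * (1 - (1 + discount_rate) ** -lifetime_years) / discount_rate
        benefit_cost_ratio = pv_savings / install_cost

        print(f"simple payback: {simple_payback:.1f} years")
        print(f"benefit-cost ratio: {benefit_cost_ratio:.2f}")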

  9. Improvement of the cost-benefit analysis algorithm for high-rise construction projects

    NASA Astrophysics Data System (ADS)

    Gafurov, Andrey; Skotarenko, Oksana; Plotnikov, Vladimir

    2018-03-01

    The specific nature of high-rise investment projects, which entail long construction periods, high risks, etc., implies a need to improve the standard algorithm of cost-benefit analysis. An improved algorithm is described in the article. To develop the improved algorithm of cost-benefit analysis for high-rise construction projects, the following methods were used: weighted average cost of capital, dynamic cost-benefit analysis of investment projects, risk mapping, scenario analysis, sensitivity analysis of critical ratios, etc. This comprehensive approach helped adapt the original algorithm to the feasibility objectives of high-rise construction. The authors assembled the algorithm of cost-benefit analysis for high-rise construction projects on the basis of risk mapping and sensitivity analysis of critical ratios. The suggested project risk management algorithms greatly expand the standard algorithm of cost-benefit analysis in investment projects, namely: the "Project analysis scenario" flowchart, which improves the quality and reliability of forecasting reports in investment projects; the main stages of cash flow adjustment based on risk mapping, for better cost-benefit project analysis given the broad range of risks in high-rise construction; and analysis of dynamic cost-benefit values with respect to project sensitivity to crucial variables, improving flexibility in the implementation of high-rise projects.
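
    To illustrate two of the listed ingredients, the sketch below combines the standard weighted average cost of capital formula with a dynamic (discounted) cost-benefit calculation and a one-variable sensitivity test; all capital-structure figures and cash flows are hypothetical, and this is not the authors' full algorithm.

        # Discount a high-rise project's cash flows at the WACC and test NPV
        # sensitivity to a critical variable (here, the scale of sales inflows).
        def wacc(e, d, re, rd, tax):
            """WACC = E/V*Re + D/V*Rd*(1-T), with V = E + D."""
            v = e + d
            return e / v * re + d / v * rd * (1 - tax)

        def npv(rate, cash_flows):
            """Cash flows indexed from year 0."""
            return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

        r = wacc(e=60e6, d=40e6, re=0.14, rd=0.08, tax=0.20)
        base = [-80e6, 10e6, 25e6, 35e6, 45e6, 30e6]   # construction outlay, then sales

        # Sensitivity: scale the inflows by +/-10% and watch NPV respond.
        for k in (0.9, 1.0, 1.1):
            flows = [base[0]] + [cf * k for cf in base[1:]]
            print(f"inflow factor {k:.1f}: NPV = {npv(r, flows) / 1e6:,.1f} M")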

  10. [Publication rate of Deutsche Forschungsgemeinschaft (DFG)-supported research projects. An analysis of the "fate" of DFG-support methods in anesthesia, surgery and internal medicine].

    PubMed

    Boldt, J; Maleck, W

    2000-09-22

    Outstanding medical research is not possible without financial support, yet the success of supported research projects has only rarely been evaluated. The publication rate of research projects supported by the German Research Council (Deutsche Forschungsgemeinschaft [DFG]) was assessed separately for internal medicine, surgery, and anesthesiology. Based on the "Figures and Facts" published by the DFG, all supported projects of 1996 in the three specialities were included. In a Medline-based analysis, all papers dealing with the supported project and all papers published by the supported researchers from 1996 to May 2000 were documented. A total of 315 grants were analysed (internal medicine: 234; surgery: 63; anesthesiology: 18). Projects with clinical topics were supported less often (n = 80) than experimental projects (n = 235). Results from 162 (69.3%) of the grants in internal medicine, 41 (65.1%) in surgery, and 14 (77.8%) in anesthesiology were published. In anesthesiology, all published projects appeared in English (internal medicine: 98.2%; surgery: 95%). Independent of the topic of the grant, several supported researchers in internal medicine and surgery did not publish any papers between 1996 and May 2000, whereas all supported anesthesiologists published in peer-reviewed journals in this period. The publication rate of DFG-supported projects is not sufficient. Apart from a final internal report after completion of the research project, no quality control exists for DFG grants. Unfortunately, not all supported projects were published. Better feedback between DFG financial support and the publication rate of DFG grants is desirable.

  11. Learning through projects in the training of biomedical engineers: an application experience

    NASA Astrophysics Data System (ADS)

    Gambi, José Antonio Li; Peme, Carmen

    2011-09-01

    Learning through Projects in the curriculum consists of the identification and analysis of a problem and the design of solution, execution and evaluation strategies by teams of students. The project is conceived as a set of strategies articulated and developed over a certain period of time to solve a problem contextualized in continually changing situations, where constant evaluation provides feedback for adjustments. In 2009, Learning through Projects was applied in the subject Hospital Facilities, and three intervention projects were developed in health centers. This first stage is restricted to the analysis of the aspects considered basic to professional training: a) Context knowledge: future biomedical engineers must be familiar with the complex health system in which they will practise their profession; b) Team work: this is one of the essential skills in the training of students, since Biomedical Engineering connects the life sciences with the exact sciences and technology; c) Regulations: the activities related to the profession require the implementation of regulations; therefore, knowing and applying these regulations is a fundamental aspect to be analyzed at this stage; d) Project evaluation: the elaboration and study of co-evaluation reports, which help establish whether Learning through Projects contributes to the training. This new line of investigation aims to discover whether the application of this learning strategy changes the training of students in relation to their future professional career. The findings of this ongoing investigation will allow an assessment of whether its application can be extended. Key words: engineering, biomedical, learning, projects, strategies.

  12. Track inspection planning and risk measurement analysis.

    DOT National Transportation Integrated Search

    2014-11-01

    This project models track inspection operations on a railroad network and discusses how the inspection results can be used to measure the risk of failure on the tracks. In particular, the inspection times of the tracks, inspection frequency of the ...

  13. A quantitative analysis study on the implementation of partnering in the design and build construction project

    NASA Astrophysics Data System (ADS)

    Halil, F. M.; Nasir, N. M.; Shukur, A. S.; Hashim, H.

    2018-02-01

    Design and Build construction projects involve the largest scale of investment cost as compared to the traditional approach. In Design and Build, the client hires a design professional who designs according to the client's needs and specifications. This research aims to explore the concept of partnering as practised in the design and build procurement approach. The selection of design professionals such as contractors and consultants is therefore crucial to ensure successful project completion on time, on cost, and to quality. A quantitative methodology was adopted: questionnaires were administered to public clients by postal survey. The results show that public clients regard project management capability and commitment to budget as crucial elements of partnering from the design professional in design and build construction projects.

  14. Risk management, financial evaluation and funding for wastewater and stormwater reuse projects.

    PubMed

    Furlong, Casey; De Silva, Saman; Gan, Kein; Guthrie, Lachlan; Considine, Robert

    2017-04-15

    This paper considers risk management, financial evaluation and funding in seven Australian wastewater and stormwater reuse projects. The investigated case studies show that responsible parties have generally been well equipped to identify potential risks. In relation to financial evaluation methods, however, some serious discrepancies have been identified, such as the time periods used for analysis and how stormwater benefits are valued. Most of the projects have required external, often National Government, funding to proceed. As National funding is likely to become less common in the future, future reuse projects may need to be funded internally by the water industry. To enable this, the authors propose that the industry requires (1) a standard project evaluation process, and (2) an infrastructure funders' forum (or committee) with representation from both utilities and regulators, in order to compare and prioritise future reuse projects against each other. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Microbiological Quality and Food Safety of Plants Grown on ISS Project

    NASA Technical Reports Server (NTRS)

    Wheeler, Raymond M. (Compiler)

    2014-01-01

    The goal of this project is to select and advance methods to enable real-time sampling, microbiological analysis, and sanitation of crops grown on the International Space Station (ISS). These methods would validate the microbiological quality of crops grown for consumption to ensure safe and palatable fresh foods. This would be achieved through the development/advancement of microbiological sample collection, rapid pathogen detection and effective sanitation methods that are compatible with a microgravity environment.

  16. Update of the DTM thermosphere model in the framework of the H2020 project `SWAMI'

    NASA Astrophysics Data System (ADS)

    Bruinsma, S.; Jackson, D.; Stolle, C.; Negrin, S.

    2017-12-01

    In the framework of the H2020 project SWAMI (Space Weather Atmosphere Model and Indices), which is expected to start in January 2018, the CIRA thermosphere specification model DTM2013 will be improved by assimilating more density data, to drive down remaining biases, and by using a new high-cadence Kp geomagnetic index, to improve storm-time performance. Five more years of GRACE high-resolution densities (2012-2016), densities from the last year of the GOCE mission, Swarm mean densities, and mean densities for 2010-2017 inferred from the geodetic satellites at about 800 km are now available. The DTM2013 model will be compared with the new density data in order to detect possible systematic errors or other deficiencies, and a first analysis will be presented. A more detailed analysis of model performance under storm conditions will also be provided, which will then serve as the benchmark for quantifying the model improvement expected with the higher-cadence Kp indices. In the SWAMI project, the DTM model will be coupled to the Met Office Unified Model in the 120-160 km altitude region in order to create a whole atmosphere model. It can be used for launch operations, re-entry computations, orbit prediction, and aeronomy and space weather studies. The project objectives and timeline will be given.

  17. Galaxy Zoo: Science and Public Engagement Hand in Hand

    NASA Astrophysics Data System (ADS)

    Masters, Karen; Lintott, Chris; Feldt, Julie; Keel, Bill; Skibba, Ramin

    2015-08-01

    Galaxy Zoo (www.galaxyzoo.org) is familiar to many as a hugely successful citizen science project. Hundreds of thousands of members of the public have contributed to Galaxy Zoo, which collects visual classifications of galaxies in images from a variety of surveys (e.g. the Sloan Digital Sky Survey, Hubble Space Telescope surveys, and others) using an internet tool. Galaxy Zoo inspired the creation of "The Zooniverse" (www.zooniverse.org), which is now the world's leading online platform for citizen science, hosting a growing number of projects (at the time of writing ~30) making use of crowdsourced data analysis in all areas of academic research, and boasting over 1.3 million participants. Galaxy Zoo has also shown itself, in a series of (now ~60) peer-reviewed papers, to be a fantastic database for the study of galaxy evolution. Participation in citizen science is also fantastic public engagement with scientific research. But what do the participants learn while they are involved in crowdsourced data analysis? In this talk I will discuss how public engagement via citizen science can be an effective means of outreach from data-intensive astronomical surveys. A citizen science project (if done right) can and should increase the scientific output of an astronomical project, while at the same time inspiring participants to learn more about the science and techniques of astronomy.

  18. The Seismic Tool-Kit (STK): an open source software for seismology and signal processing.

    NASA Astrophysics Data System (ADS)

    Reymond, Dominique

    2016-04-01

    We present an open source software project (GNU public license), named STK: Seismic ToolKit, dedicated mainly to seismology and signal processing. The STK project, which started in 2007, is hosted by SourceForge.net and counts more than 19,500 downloads at the time of writing. The STK project is composed of two main branches. The first is a graphical interface dedicated to signal processing in the SAC format (SAC_ASCII and SAC_BIN), where the signal can be plotted, zoomed, filtered, integrated, differentiated, etc. (a large variety of IIR and FIR filters is provided). The spectral density of the signal is estimated via the Fourier transform, with visualization of the Power Spectral Density (PSD) on a linear or log scale, as well as an evolutive time-frequency representation (or sonogram). Three-component signals can also be processed to estimate their polarization properties, either for a given window or for moving windows along the time axis. This polarization analysis is useful for extracting polarized noise and for differentiating P waves, Rayleigh waves, Love waves, etc. The second branch is a panel of utility programs for working in terminal mode, with basic programs for computing azimuth and distance in spherical geometry, inter/auto-correlation, spectral density, time-frequency representations for an entire directory of signals, focal planes and main component axes, P-wave radiation patterns, polarization analysis of different waves (including noise), under/over-sampling of signals, cubic-spline smoothing, and linear/non-linear regression analysis of data sets. A MINimum library of Linear AlGebra (MIN-LINAG) is also provided for the main matrix operations, such as QR/QL decomposition, Cholesky solution of linear systems, finding eigenvalues/eigenvectors, and QR-solve/Eigen-solve of linear equation systems. STK is developed in C/C++, mainly under Linux OS, and has also been partially implemented under MS-Windows. Useful links: http://sourceforge.net/projects/seismic-toolkit/ http://sourceforge.net/p/seismic-toolkit/wiki/browse_pages/
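
    As a sketch of the spectral-density estimation performed by the graphical branch, the following Python snippet computes a Welch PSD of a synthetic trace (STK itself is written in C/C++); the sampling rate, signal content, and window length are assumptions for illustration.

        import numpy as np
        from scipy.signal import welch

        # PSD estimation of a synthetic seismic-like trace.
        fs = 100.0                                  # sampling rate, Hz
        t = np.arange(0, 600, 1 / fs)               # 10-minute record
        rng = np.random.default_rng(1)
        signal = np.sin(2 * np.pi * 1.5 * t) + 0.5 * rng.normal(size=t.size)  # 1.5 Hz arrival plus noise

        freqs, psd = welch(signal, fs=fs, nperseg=4096)
        print(f"spectral peak near {freqs[np.argmax(psd)]:.2f} Hz")  # expect ~1.5 Hz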

  19. On-line analysis capabilities developed to support the AFW wind-tunnel tests

    NASA Technical Reports Server (NTRS)

    Wieseman, Carol D.; Hoadley, Sherwood T.; Mcgraw, Sandra M.

    1992-01-01

    A variety of on-line analysis tools were developed to support two active flexible wing (AFW) wind-tunnel tests. These tools were developed to verify control law execution, to satisfy analysis requirements of the control law designers, to provide measures of system stability in a real-time environment, and to provide project managers with a quantitative measure of controller performance. Descriptions and purposes of the developed capabilities are presented along with examples. Procedures for saving and transferring data for near real-time analysis, and descriptions of the corresponding data interface programs are also presented. The on-line analysis tools worked well before, during, and after the wind tunnel test and proved to be a vital and important part of the entire test effort.

  20. Open source tools for fluorescent imaging.

    PubMed

    Hamilton, Nicholas A

    2012-01-01

    As microscopy becomes increasingly automated and imaging expands in the spatial and time dimensions, quantitative analysis tools for fluorescent imaging are becoming critical to remove both bottlenecks in throughput as well as fully extract and exploit the information contained in the imaging. In recent years there has been a flurry of activity in the development of bio-image analysis tools and methods with the result that there are now many high-quality, well-documented, and well-supported open source bio-image analysis projects with large user bases that cover essentially every aspect from image capture to publication. These open source solutions are now providing a viable alternative to commercial solutions. More importantly, they are forming an interoperable and interconnected network of tools that allow data and analysis methods to be shared between many of the major projects. Just as researchers build on, transmit, and verify knowledge through publication, open source analysis methods and software are creating a foundation that can be built upon, transmitted, and verified. Here we describe many of the major projects, their capabilities, and features. We also give an overview of the current state of open source software for fluorescent microscopy analysis and the many reasons to use and develop open source methods. Copyright © 2012 Elsevier Inc. All rights reserved.

  1. PERSEUS QC: preparing statistic data sets

    NASA Astrophysics Data System (ADS)

    Belokopytov, Vladimir; Khaliulin, Alexey; Ingerov, Andrey; Zhuk, Elena; Gertman, Isaac; Zodiatis, George; Nikolaidis, Marios; Nikolaidis, Andreas; Stylianou, Stavros

    2017-09-01

    The Desktop Oceanographic Data Processing Module was developed for visual analysis of interdisciplinary cruise measurements. The program provides the possibility of data selection based on different criteria, map plotting, sea horizontal sections, and sea depth vertical profiles. The data selection in the area of interest can be specified according to a set of different physical and chemical parameters, complemented by additional parameters such as the cruise number, ship name, and time period. The visual analysis of a set of vertical profiles in the selected area makes it possible to determine the quality of the data, their location and the time of the in-situ measurements, and to exclude any questionable data from the statistical analysis. For each selected set of profiles, the average vertical profile, the minimal and maximal values of the parameter under examination, and the root mean square (r.m.s.) are estimated. These estimates are compared with the parameter ranges set for each sub-region by the MEDAR/MEDATLAS-II and SeaDataNet2 projects. In the framework of the PERSEUS project, certain parameters which lacked a range were calculated from scratch, while some of the previously used ranges were re-defined using more comprehensive data sets based on the SeaDataNet2, SESAME and PERSEUS projects. In some cases we have used additional sub-regions to redefine the ranges more precisely. The recalculated ranges are used to improve the PERSEUS Data Quality Control.
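
    A minimal sketch of the range-based check described above: per-level mean, min/max, and r.m.s. statistics are computed for a set of profiles, and values outside a sub-regional range are flagged as questionable. The profiles and the acceptance range are synthetic stand-ins for the MEDAR/MEDATLAS-II and SeaDataNet2 style limits.

        import numpy as np

        # Range-based quality control of a set of vertical profiles.
        rng = np.random.default_rng(2)
        profiles = rng.normal(loc=15.0, scale=1.0, size=(50, 20))  # 50 temperature profiles, 20 levels
        profiles[3, 5] = 40.0                                      # plant a questionable value

        # Statistics per depth level across the selected profiles.
        mean = profiles.mean(axis=0)
        vmin, vmax = profiles.min(axis=0), profiles.max(axis=0)
        rms = np.sqrt(np.mean((profiles - mean) ** 2, axis=0))

        # Hypothetical sub-regional acceptance range.
        range_lo, range_hi = 5.0, 30.0
        suspect = np.argwhere((profiles < range_lo) | (profiles > range_hi))
        print("suspect (profile, level) pairs:", suspect.tolist())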

  2. (Re)Discovering Retrospective Miscue Analysis: An Action Research Exploration Using Recorded Readings to Improve Third-Grade Students' Reading Fluency

    ERIC Educational Resources Information Center

    Born, Melissa; Curtis, Reagan

    2013-01-01

    An action research project was undertaken focused on integrating recorded readings and Retrospective Miscue Analysis (RMA) into center-based instructional time in a third-grade classroom. Initial DIBELS test results were used to select 6 struggling readers, all of whom showed improved fluency in response to our instructional interventions. The…

  3. 77 FR 52344 - Information Collection Request Sent to the Office of Management and Budget (OMB) for Approval...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-29

    ... [excerpt of the estimated annual burden table: number of responses, time per response, and nonhour burden cost, by tier, including Tier 1 (Desktop Analysis)] ... developers of these small-scale projects do the desktop analysis described in Tier 1 or Tier 2 using publicly... published in the Federal Register (77 FR 19683) a notice of our intent to request that OMB renew approval...

  4. Emission projections for the U.S. Environmental Protection Agency Section 812 second prospective Clean Air Act cost/benefit analysis.

    PubMed

    Wilson, James H; Mullen, Maureen A; Bollman, Andrew D; Thesing, Kirstin B; Salhotra, Manish; Divita, Frank; Neumann, James E; Price, Jason C; DeMocker, James

    2008-05-01

    Section 812 of the Clean Air Act Amendments (CAAA) of 1990 requires the U.S. Environmental Protection Agency (EPA) to perform periodic, comprehensive analyses of the total costs and total benefits of programs implemented pursuant to the CAAA. The first prospective analysis was completed in 1999. The second prospective analysis was initiated during 2005. The first step in the second prospective analysis was the development of base and projection year emission estimates that will be used to generate benefit estimates of CAAA programs. This paper describes the analysis, methods, and results of the recently completed emission projections. There are several unique features of this analysis. One is the use of consistent economic assumptions from the Department of Energy's Annual Energy Outlook 2005 (AEO 2005) projections as the basis for estimating 2010 and 2020 emissions for all sectors. Another is the analysis of the different emission paths for scenarios both with and without the CAAA. Other features include its being the first EPA analysis to use the 2002 National Emission Inventory files as the basis for 48-state emission projections, the incorporation of control factor files from the Regional Planning Organizations (RPOs) that had completed emission projections at the time the analysis was performed, and the modeling of the emission benefits of the expected adoption of measures to meet the 8-hr ozone National Ambient Air Quality Standards (NAAQS), the Clean Air Visibility Rule, and the PM2.5 NAAQS. This analysis shows that the 1990 CAAA have produced significant reductions in criteria pollutant emissions since 1990 and that these emission reductions are expected to continue through 2020. CAAA provisions had reduced volatile organic compound (VOC) emissions by approximately 7 million t/yr by 2000, and are estimated to produce associated VOC emission reductions of 16.7 million t by 2020. Total oxides of nitrogen (NOx) emission reductions attributable to the CAAA are 5, 12, and 17 million t in 2000, 2010, and 2020, respectively. Sulfur dioxide (SO2) emission benefits during the study period are dominated by electricity-generating unit (EGU) SO2 emission reductions. These EGU emission benefits go from 7.5 million t reduced in 2000 to 15 million t reduced in 2020.
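
    The generic arithmetic behind such projections is a base-year inventory scaled by activity growth, with control effectiveness applied in the with-CAAA case; the sketch below shows that form with invented sector names and factors, not the study's actual inputs.

        # Apply growth and control factors to a base-year inventory to obtain
        # with-CAAA and without-CAAA projection-year estimates (illustrative).
        base_year_tons = {"EGU_SO2": 10.2e6, "onroad_NOx": 8.5e6, "solvents_VOC": 4.0e6}
        growth = {"EGU_SO2": 1.15, "onroad_NOx": 1.25, "solvents_VOC": 1.30}       # activity growth
        control_eff = {"EGU_SO2": 0.55, "onroad_NOx": 0.60, "solvents_VOC": 0.35}  # control effectiveness

        for sector, base in base_year_tons.items():
            with_caaa = base * growth[sector] * (1 - control_eff[sector])
            without_caaa = base * growth[sector]
            print(f"{sector}: reduction attributable to controls = "
                  f"{(without_caaa - with_caaa) / 1e6:.1f} M tons")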

  5. Times of Maximum Light: The Passband Dependence

    NASA Astrophysics Data System (ADS)

    Joner, M. D.; Laney, C. D.

    2004-05-01

    We present UBVRIJHK light curves for the dwarf Cepheid variable star AD Canis Minoris. These data are part of a larger project to determine absolute magnitudes for this class of stars. Our figures clearly show changes in the times of maximum light, the amplitude, and the light curve morphology that depend on the passband used in the observation. Note that when data from a variety of passbands are used in studies that require a period analysis or that search for small changes in the pulsational period, it is easy to introduce significant systematic errors into the results. We thank the Brigham Young University Department of Physics and Astronomy for continued support of our research. We also acknowledge the South African Astronomical Observatory for time granted to this project.

  6. Applying Real-Time UML: Real-World Experiences

    NASA Astrophysics Data System (ADS)

    Cooling, Niall; Pachschwoell, Stefan

    2004-06-01

    This paper presents Austrian Aerospace's experiences of applying UML for the design of an embedded real-time avionics system based on Feabhas' "Pragma Process". It describes the complete lifecycle from adoption of UML, through training, CASE-tool selection, system analysis, and software design and development of the project itself. It concludes by reflecting on the experiences obtained and some lessons learnt.

  7. Application of Linear Discriminant Analysis in Dimensionality Reduction for Hand Motion Classification

    NASA Astrophysics Data System (ADS)

    Phinyomark, A.; Hu, H.; Phukpattaranont, P.; Limsakul, C.

    2012-01-01

    The classification of upper-limb movements based on surface electromyography (EMG) signals is an important issue in the control of assistive devices and rehabilitation systems. Increasing the number of EMG channels and features in order to increase the number of control commands can yield a high-dimensional feature vector. To cope with the accuracy and computation problems associated with high dimensionality, it is commonplace to apply a processing step that transforms the data to a space of significantly lower dimension with only a limited loss of useful information. Linear discriminant analysis (LDA) has been successfully applied as an EMG feature projection method. Recently, a number of extended LDA-based algorithms have been proposed that are more competitive than classical LDA in terms of both classification accuracy and computational cost/time. This paper presents the findings of a comparative study of classical LDA and five extended LDA methods. In a quantitative comparison based on seven multi-feature sets, three extended LDA-based algorithms, namely uncorrelated LDA, orthogonal LDA and orthogonal fuzzy neighborhood discriminant analysis, produce better class separability when compared with a baseline system (without feature projection), principal component analysis (PCA), and classical LDA. Based on 7-dimensional time-domain and time-scale feature vectors, these methods achieved 95.2% and 93.2% classification accuracy, respectively, using a linear discriminant classifier.
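
    A minimal sketch of classical LDA as a feature-projection step, using scikit-learn on random stand-in data; the class count, feature dimension, and printed accuracy are illustrative assumptions, not the paper's EMG data or results.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        # Project synthetic "EMG feature" vectors to a lower-dimensional space
        # with classical LDA, then score a linear classifier on the projection.
        rng = np.random.default_rng(3)
        n_classes, n_per_class, n_features = 6, 40, 28          # e.g. 4 channels x 7 features
        X = np.vstack([rng.normal(loc=c, scale=2.0, size=(n_per_class, n_features))
                       for c in range(n_classes)])
        y = np.repeat(np.arange(n_classes), n_per_class)

        # LDA projects to at most (n_classes - 1) dimensions.
        lda = LinearDiscriminantAnalysis(n_components=n_classes - 1)
        X_proj = lda.fit_transform(X, y)
        acc = cross_val_score(LinearDiscriminantAnalysis(), X_proj, y, cv=5).mean()
        print(X_proj.shape, f"cv accuracy: {acc:.2f}")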

  8. Image quality in thoracic 4D cone-beam CT: A sensitivity analysis of respiratory signal, binning method, reconstruction algorithm, and projection angular spacing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shieh, Chun-Chien; Kipritidis, John; O’Brien, Ricky T.

    Purpose: Respiratory signal, binning method, and reconstruction algorithm are three major controllable factors affecting image quality in thoracic 4D cone-beam CT (4D-CBCT), which is widely used in image guided radiotherapy (IGRT). Previous studies have investigated each of these factors individually, but no integrated sensitivity analysis has been performed. In addition, projection angular spacing is also a key factor in reconstruction, but how it affects image quality is not obvious. An investigation of the impacts of these four factors on image quality can help determine the most effective strategy in improving 4D-CBCT for IGRT. Methods: Fourteen 4D-CBCT patient projection datasets with various respiratory motion features were reconstructed with the following controllable factors: (i) respiratory signal (real-time position management, projection image intensity analysis, or fiducial marker tracking), (ii) binning method (phase, displacement, or equal-projection-density displacement binning), and (iii) reconstruction algorithm [Feldkamp–Davis–Kress (FDK), McKinnon–Bates (MKB), or adaptive-steepest-descent projection-onto-convex-sets (ASD-POCS)]. The image quality was quantified using signal-to-noise ratio (SNR), contrast-to-noise ratio, and edge-response width in order to assess noise/streaking and blur. The SNR values were also analyzed with respect to the maximum, mean, and root-mean-squared-error (RMSE) projection angular spacing to investigate how projection angular spacing affects image quality. Results: The choice of respiratory signals was found to have no significant impact on image quality. Displacement-based binning was found to be less prone to motion artifacts compared to phase binning in more than half of the cases, but was shown to suffer from large interbin image quality variation and large projection angular gaps. Both MKB and ASD-POCS resulted in noticeably improved image quality almost 100% of the time relative to FDK. In addition, SNR values were found to increase with decreasing RMSE values of projection angular gaps with strong correlations (r ≈ −0.7) regardless of the reconstruction algorithm used. Conclusions: Based on the authors' results, displacement-based binning methods, better reconstruction algorithms, and the acquisition of even projection angular views are the most important factors to consider for improving thoracic 4D-CBCT image quality. In view of the practical issues with displacement-based binning and the fact that projection angular spacing is not currently directly controllable, development of better reconstruction algorithms represents the most effective strategy for improving image quality in thoracic 4D-CBCT for IGRT applications at the current stage.
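
    The reported correlation between SNR and the root-mean-squared projection angular gap can be reproduced in miniature as follows; the simulated angle sets and the toy quality model are assumptions, not the patient datasets.

        import numpy as np

        # Correlate an image-quality score with the RMSE of projection angular gaps.
        rng = np.random.default_rng(4)
        snr_scores, gap_rmse = [], []
        for _ in range(14):                         # 14 simulated 4D-CBCT bins
            angles = np.sort(rng.uniform(0, 360, size=rng.integers(60, 240)))
            gaps = np.diff(angles)
            rmse = np.sqrt(np.mean(gaps ** 2))
            gap_rmse.append(rmse)
            # Toy model: quality degrades with uneven, sparse angular sampling.
            snr_scores.append(30.0 - 2.0 * rmse + rng.normal(0, 1.0))

        r = np.corrcoef(gap_rmse, snr_scores)[0, 1]
        print(f"Pearson r between gap RMSE and SNR: {r:.2f}")   # negative, as reported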

  9. PRODIGEN: visualizing the probability landscape of stochastic gene regulatory networks in state and time space.

    PubMed

    Ma, Chihua; Luciani, Timothy; Terebus, Anna; Liang, Jie; Marai, G Elisabeta

    2017-02-15

    Visualizing the complex probability landscape of stochastic gene regulatory networks can further biologists' understanding of phenotypic behavior associated with specific genes. We present PRODIGEN (PRObability DIstribution of GEne Networks), a web-based visual analysis tool for the systematic exploration of probability distributions over simulation time and state space in such networks. PRODIGEN was designed in collaboration with bioinformaticians who research stochastic gene networks. The analysis tool combines in a novel way existing, expanded, and new visual encodings to capture the time-varying characteristics of probability distributions: spaghetti plots over one dimensional projection, heatmaps of distributions over 2D projections, enhanced with overlaid time curves to display temporal changes, and novel individual glyphs of state information corresponding to particular peaks. We demonstrate the effectiveness of the tool through two case studies on the computed probabilistic landscape of a gene regulatory network and of a toggle-switch network. Domain expert feedback indicates that our visual approach can help biologists: 1) visualize probabilities of stable states, 2) explore the temporal probability distributions, and 3) discover small peaks in the probability landscape that have potential relation to specific diseases.

  10. Mesoscopic Community Structure of Financial Markets Revealed by Price and Sign Fluctuations

    PubMed Central

    Almog, Assaf; Besamusca, Ferry; MacMahon, Mel; Garlaschelli, Diego

    2015-01-01

    The mesoscopic organization of complex systems, from financial markets to the brain, is an intermediate between the microscopic dynamics of individual units (stocks or neurons, in the mentioned cases), and the macroscopic dynamics of the system as a whole. The organization is determined by “communities” of units whose dynamics, represented by time series of activity, is more strongly correlated internally than with the rest of the system. Recent studies have shown that the binary projections of various financial and neural time series exhibit nontrivial dynamical features that resemble those of the original data. This implies that a significant piece of information is encoded into the binary projection (i.e. the sign) of such increments. Here, we explore whether the binary signatures of multiple time series can replicate the same complex community organization of the financial market, as the original weighted time series. We adopt a method that has been specifically designed to detect communities from cross-correlation matrices of time series data. Our analysis shows that the simpler binary representation leads to a community structure that is almost identical with that obtained using the full weighted representation. These results confirm that binary projections of financial time series contain significant structural information. PMID:26226226
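
    A small sketch of the binary-projection idea: take the sign of each series, recompute the cross-correlation matrix, and compare it with the full weighted version. The factor-model data are synthetic, and this omits the community-detection step applied in the paper.

        import numpy as np

        # Compare the cross-correlation matrix of synthetic "return" series with
        # the one computed from their binary (sign) projections.
        rng = np.random.default_rng(5)
        common = rng.normal(size=500)
        returns = np.array([0.6 * common + 0.8 * rng.normal(size=500) for _ in range(10)])

        signs = np.sign(returns)                   # the binary projection of the series

        corr_full = np.corrcoef(returns)
        corr_sign = np.corrcoef(signs)

        # If the sign carries most of the structure, the two matrices stay close.
        off = ~np.eye(10, dtype=bool)
        print(f"mean |difference| of off-diagonal correlations: "
              f"{np.abs(corr_full[off] - corr_sign[off]).mean():.3f}")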

  11. Corporate Social Responsibility: A Real Options Approach to the Challenge of Financial Sustainability

    PubMed Central

    2015-01-01

    Background In contemporary complex societies, social values like ethics, corporate social responsibility, and being respectful with the environment, among others, are becoming social requirements. Corporations are expected to fulfill them and, according to empirical evidence, an overwhelming majority aspires to good social valuation. At the same time, the maximization of market share value in the long run continues to be the central corporate goal. Making environmental and social expenses compatible with value creation is a central challenge for corporations since it implies the financial sustainability of Corporate Social Responsibility (CSR). Methods and Results The value creation capacity of CSR projects, mainly through innovation, is widely acknowledged in economic literature and corporate practice. This fact creates the need for a quantitative framework capable of summarizing the value creation capacity of the variables involved in CSR projects. With this aim, we build a sensitivity analysis of real option ratios that studies and quantifies the value creation capacity of CSR projects connected with innovation. Ratio analysis has the advantage of being scale independent. Hence, it furnishes a homogeneous framework to express the interaction of value creation variables and, thus, supports strategic thinking quantitatively. Often, CSR expenses can be regarded as preliminary projects that create the opportunity to undertake a full future project. For them, we obtain the minimum expectations scenario that makes a preliminary project financially sustainable, which can be interpreted as a call option. We propose a classification of CSR projects from the decision analysis perspective following a two-fold approach: their relationship with value creation and their links with existing corporate activities. This classification of CSR projects aims to help choose the best capital budgeting method for studying the financial sustainability of a project and to identify those CSR projects that fulfill the required features to be studied from the real options perspective. PMID:25938410

  12. Corporate social responsibility: a real options approach to the challenge of financial sustainability.

    PubMed

    Bosch-Badia, Maria-Teresa; Montllor-Serrats, Joan; Tarrazon-Rodon, Maria-Antonia

    2015-01-01

    In contemporary complex societies, social values like ethics, corporate social responsibility, and being respectful with the environment, among others, are becoming social requirements. Corporations are expected to fulfill them and, according to empirical evidence, an overwhelming majority aspires to good social valuation. At the same time, the maximization of market share value in the long run continues to be the central corporate goal. Making environmental and social expenses compatible with value creation is a central challenge for corporations since it implies the financial sustainability of Corporate Social Responsibility (CSR). The value creation capacity of CSR projects, mainly through innovation, is widely acknowledged in economic literature and corporate practice. This fact creates the need for a quantitative framework capable of summarizing the value creation capacity of the variables involved in CSR projects. With this aim, we build a sensitivity analysis of real option ratios that studies and quantifies the value creation capacity of CSR projects connected with innovation. Ratio analysis has the advantage of being scale independent. Hence, it furnishes a homogeneous framework to express the interaction of value creation variables and, thus, supports strategic thinking quantitatively. Often, CSR expenses can be regarded as preliminary projects that create the opportunity to undertake a full future project. For them, we obtain the minimum expectations scenario that makes a preliminary project financially sustainable, which can be interpreted as a call option. We propose a classification of CSR projects from the decision analysis perspective following a two-fold approach: their relationship with value creation and their links with existing corporate activities. This classification of CSR projects aims to help choose the best capital budgeting method for studying the financial sustainability of a project and to identify those CSR projects that fulfill the required features to be studied from the real options perspective.
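
    Where a preliminary CSR expense creates the opportunity to undertake a full future project, it can be sketched as a European call, as below; the Black-Scholes valuation and every input here are illustrative assumptions, not the authors' ratio framework.

        from math import exp, log, sqrt
        from statistics import NormalDist

        # Value a preliminary project as a European call on the follow-on project.
        def bs_call(s, k, t, r, sigma):
            d1 = (log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * sqrt(t))
            d2 = d1 - sigma * sqrt(t)
            n = NormalDist().cdf
            return s * n(d1) - k * exp(-r * t) * n(d2)

        s = 12e6      # PV of the full project's expected cash flows (hypothetical)
        k = 10e6      # investment needed to launch the full project (hypothetical)
        option_value = bs_call(s, k, t=3.0, r=0.03, sigma=0.35)
        print(f"option value: {option_value / 1e6:.2f} M")
        # Under these assumptions, the preliminary (CSR) expense is financially
        # sustainable if it costs less than the option value it creates.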

  13. Investigation into the computerized data bases of the Employment and Training Administration. Regional Management Information System Project (RMIS) report on second-year activities, 1975--1976

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Postle, W.; Heckman, B.

    1977-01-01

    The research and development project discussed was aimed at creating the necessary computer system for the rapid retrieval, analysis, and display of information to meet the individual and nonroutine needs of the Department of Labor's Employment and Training Administration and the general public. The major objective was to demonstrate that it was both feasible and practical to organize data that are currently available and to provide planning and management information in a much more usable and timely fashion than previously possible. Fast access to data with a system which is easy to use was an important project goal. Programs were written to analyze and display data by means of bar, pie, and line charts, etc. Although prototypical interactive retrieval, analysis, and report formation tools have been developed, further research and development of interactive tools is required. (RWR)

  14. Analysis of the Transport and Fate of Metals Released From ...

    EPA Pesticide Factsheets

    This project’s objectives were to provide timely analysis of water quality following the release of acid mine drainage from the Gold King Mine into the Animas and San Juan Rivers, in order to 1) generate a comprehensive picture of the plume at the river-system level, 2) help inform future monitoring efforts, and 3) predict potential secondary effects that could occur from materials that may remain stored within the system. The project focuses on assessing metals contamination during the plume and in the first month following the event.

  15. A projection-based model reduction strategy for the wave and vibration analysis of rotating periodic structures

    NASA Astrophysics Data System (ADS)

    Beli, D.; Mencik, J.-M.; Silva, P. B.; Arruda, J. R. F.

    2018-05-01

    The wave finite element method has proved to be an efficient and accurate numerical tool for the free and forced vibration analysis of linear reciprocal periodic structures, i.e. those conforming to symmetrical wave fields. In this paper, its use is extended to the analysis of rotating periodic structures, which, due to the gyroscopic effect, exhibit asymmetric wave propagation. A projection-based strategy using a reduced symplectic wave basis is employed, which provides a well-conditioned eigenproblem for computing waves in rotating periodic structures. The proposed formulation is applied to the free and forced response analysis of homogeneous, multi-layered and phononic ring structures. In all test cases, the following features are highlighted: well-conditioned dispersion diagrams, good accuracy, and low computational time. The proposed strategy is particularly convenient for the simulation of rotating structures when parametric analysis over several rotational speeds is required, e.g. for calculating Campbell diagrams. This provides an efficient and flexible framework for the analysis of rotordynamic problems.
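
    The general shape of a projection-based reduction is sketched below: full-order matrices are projected onto a reduced basis and a much smaller eigenproblem is solved. The random symmetric matrices and QR basis are placeholders; the paper's reduced symplectic wave basis is not reproduced here.

        import numpy as np

        # Generic Galerkin model reduction: project full-order "stiffness" and
        # "mass" matrices onto a reduced basis Phi and solve the small problem.
        rng = np.random.default_rng(6)
        n, m = 200, 12
        A = rng.normal(size=(n, n)); K = A @ A.T + n * np.eye(n)   # SPD stand-in stiffness
        B = rng.normal(size=(n, n)); M = B @ B.T + n * np.eye(n)   # SPD stand-in mass

        Phi, _ = np.linalg.qr(rng.normal(size=(n, m)))             # orthonormal reduced basis
        K_r, M_r = Phi.T @ K @ Phi, Phi.T @ M @ Phi                # Galerkin projection

        # Reduced generalized eigenproblem K_r x = lambda M_r x.
        eigvals = np.linalg.eigvals(np.linalg.solve(M_r, K_r))
        print(np.sort(eigvals.real)[:5])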

  16. Malczyce, Poland: a multifaceted community action project in eastern Europe in a time of rapid economic change.

    PubMed

    Moskalewicz, J; Swiatktewicz, G

    2000-01-01

    The major focus of this paper is the sustainability of a one-year demonstration project on drug misuse prevention, which was implemented in a local community affected by acute economic crisis and high unemployment. The project was initiated by the Institute of Psychiatry and Neurology, and supported by the European Commission. The primary goal of the project was to demonstrate that community-based prevention is possible and feasible within the context of current transitions in Poland. Its major outcome was a community prevention package consisting of a number of booklets and videos to assist other communities in their prevention efforts. Experiences from this study suggest that factors contributing to the sustainability of a community prevention project can be identified and emphasized through simple analysis of community surveys, as well as focus group discussion.

  17. Why HID headlights bother older drivers

    PubMed Central

    Mainster, M A; Timberlake, G T

    2003-01-01

    Driving requires effective coordination of visual, motor, and cognitive skills. Visual skills are pushed to their limit at night by decreased illumination and by disabling glare from oncoming headlights. High intensity discharge (HID) headlamps project light farther down roads, improving their owners' driving safety by increasing the time available to react to potential problems. Glare is proportional to headlamp brightness, however, so increasing headlamp brightness also increases potential glare for oncoming drivers, particularly on curving two-lane roads. This problem is worse for older drivers because of their increased intraocular light scattering, glare sensitivity, and photostress recovery time. An analysis of automobile headlights, intraocular stray light, glare, and night driving shows that brightness rather than blueness is the primary reason for the visual problems that HID headlights can cause for older drivers who confront them. The increased light projected by HID headlights is potentially valuable, but serious questions remain regarding how and where it should be projected. PMID:12488274

  18. Prediction of accrual closure date in multi-center clinical trials with discrete-time Poisson process models.

    PubMed

    Tang, Gong; Kong, Yuan; Chang, Chung-Chou Ho; Kong, Lan; Costantino, Joseph P

    2012-01-01

    In a phase III multi-center cancer clinical trial or a large public health study, sample size is predetermined to achieve desired power, and study participants are enrolled from tens or hundreds of participating institutions. As accrual approaches the target size, the coordinating data center needs to project the accrual closure date on the basis of the observed accrual pattern and notify the participating sites several weeks in advance. In the past, projections were simply based on crude assessments, and conservative measures were incorporated in order to achieve the target accrual size. This approach often resulted in excessive accrual and, subsequently, unnecessary financial burden on the study sponsors. Here we proposed a discrete-time Poisson process-based method to estimate the accrual rate at the time of projection and, subsequently, the trial closure date. To ensure that the target size would be reached with high confidence, we also proposed a conservative method for the closure date projection. The proposed method was illustrated through the analysis of the accrual data of the National Surgical Adjuvant Breast and Bowel Project trial B-38. The results showed that application of the proposed method could help save a considerable amount of expenditure in patient management without compromising the accrual goal in multi-center clinical trials. Copyright © 2012 John Wiley & Sons, Ltd.
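
    A minimal sketch of a discrete-time Poisson projection in this spirit: recent monthly counts give a rate estimate, a point projection of the months to closure follows, and a conservative projection picks the smallest horizon that reaches the target with high probability. The counts, target, and 90% level are invented, and this simplifies the cited method.

        import numpy as np
        from scipy.stats import poisson

        # Project a trial's accrual closure date from recent monthly enrollments.
        recent_monthly_counts = [38, 41, 35, 44, 40, 39]   # last 6 months across sites
        rate = np.mean(recent_monthly_counts)              # MLE of the Poisson rate

        target, enrolled = 4800, 4495
        remaining = target - enrolled

        # Point projection: months until the expected cumulative count covers the gap.
        months_point = int(np.ceil(remaining / rate))

        # Conservative projection: smallest m whose cumulative accrual Poisson(m*rate)
        # reaches the remaining target with 90% probability.
        m = months_point
        while poisson.cdf(remaining - 1, m * rate) > 0.10:
            m += 1
        print(f"point estimate: {months_point} months; 90%-confidence: {m} months")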

  19. Environmental monitoring for public access and community tracking (EMPACT): Discussion of the program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engel-Cox, J.A.

    1999-07-01

    The Environmental Monitoring for Public Access and Community Tracking (EMPACT) program is a unique initiative to provide time-relevant environmental information that is easily accessible and clearly communicated to residents in 86 of the nation's largest metropolitan areas. EMPACT is a US Environmental Protection Agency (EPA) program to use innovative and time-relevant monitoring and communication technologies. President Clinton articulated his vision for right-to-know programs when he directed the EPA to provide local environmental information to communities. EPA Administrator Carol Browner created EMPACT and other programs to meet this vision, giving EMPACT the goal of providing timely, useful and accurate environmental and public health information to all Americans. This paper is an analysis of the status of the EMPACT program during its first 2 years. EMPACT has launched 27 environmental monitoring and communication projects, including metropolitan grants, EPA Headquarter and Regional projects, and research activities. These projects are located in 37 states and 68 cities throughout the United States, and represent significant progress towards EMPACT's goal of reaching 86 major metropolitan areas throughout all 50 states, the District of Columbia and Puerto Rico by 2001. These projects focus on five principles established by EPA Administrator Browner: using advanced technology and science, establishing partnerships, increasing public access to data, communicating useful action-oriented information, and establishing a framework for sharing monitoring techniques and data between projects.

  20. Evolution of the U.S. Energy Service Company Industry: Market Size and Project Performance from 1990-2008

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larsen, Peter; Goldman, Charles A.; Satchwell, Andrew

    2012-05-08

    The U.S. energy service company (ESCO) industry is an example of a private sector business model where energy savings are delivered to customers primarily through the use of performance-based contracts. This study was conceived as a snapshot of the ESCO industry prior to the economic slowdown and the introduction of federal stimulus funding mandated by enactment of the American Recovery and Reinvestment Act of 2009 (ARRA). This study utilizes two parallel analytic approaches to characterize ESCO industry and market trends in the U.S.: (1) a "top-down" approach involving a survey of individual ESCOs to estimate aggregate industry activity and (2) a "bottom-up" analysis of a database of ~3,265 projects (representing over $8B in project investment) that reports market trends including installed EE retrofit strategies, project installation costs and savings, project payback times, and benefit-cost ratios over time. Despite the onset of an economic recession, the U.S. ESCO industry managed to grow at about 7% per year between 2006 and 2008. ESCO industry revenues are relatively small compared to total U.S. energy expenditures (about $4.1 billion in 2008), but ESCOs anticipated accelerated growth through 2011 (25% per year). We found that 2,484 ESCO projects in our database generated ~$4.0 billion ($2009) in net, direct economic benefits to their customers. We estimate that the ESCO project database includes about 20% of all U.S. ESCO market activity from 1990-2008. Assuming the net benefits per project are comparable for ESCO projects that are not included in the LBNL database, this would suggest that the ESCO industry has generated ~$23 billion in net direct economic benefits for customers at projects installed between 1990 and 2008. We found that nearly 85% of all public and institutional projects met or exceeded the guaranteed level of savings. We estimated that a typical ESCO project generated $1.50 of direct benefits for every dollar of customer investment. There is empirical evidence confirming that the industry is responding to customer demand by installing more comprehensive and complex measures, including onsite generation and measures to address deferred maintenance, but this evolution has significant implications for customer project economics, especially at K-12 schools. We found that the median simple payback time has increased from 1.9 to 3.2 years in private sector projects since the early-to-mid 1990s and from 5.2 to 10.5 years in public sector projects for the same time period.

  1. Trends in Health Disparities, Health Inequity, and Social Determinants of Health Research: A 17-Year Analysis of NINR, NCI, NHLBI, and NIMHD Funding.

    PubMed

    Kneipp, Shawn M; Schwartz, Todd A; Drevdahl, Denise J; Canales, Mary K; Santacroce, Sheila; Santos, Hudson P; Anderson, Ruth

    The theoretical landscape of health disparities research now emphasizes health inequities and the role that social determinants of health (SDOH) play in creating and perpetuating them. Whether National Institutes of Health (NIH) funding patterns reflect this theoretical shift is unknown. The aim of this study was to examine the National Institute of Nursing Research's (NINR) funding for research focused on health disparities, health inequities, and SDOH, relative to other key NIH institutes. Data on 32,968 projects funded by NINR, the National Cancer Institute, the National Heart, Lung, and Blood Institute, and the National Institute on Minority Health and Health Disparities (NIMHD) during the years 2000 through 2016 were downloaded from NIH RePORTER; those with health disparities, health inequity, or SDOH terms used in the abstract were identified. Descriptive statistics and a general linear model approach were used to assess differences in cumulative project counts and funding proportions, and funding trends over time. Overall, funding for health disparities projects was 14-19 times greater than for health inequity and SDOH projects and was more concentrated in centers and institutional training than in individual research projects. NINR's proportion of funding for disparities projects was consistently greater than that of the National Cancer Institute and the National Heart, Lung, and Blood Institute, but not for inequities and SDOH projects. NIMHD's proportion of funding for disparities, and inequities and SDOH projects (combined) was 2-30 times greater than that of other institutes. Over the 16-year period, funding for disparities, inequity, and SDOH projects each increased (all ps < .05); however, growth in inequities and SDOH funding was not evident in more recent years. Funding for projects focused on health inequities and SDOH lags behind theoretical shifts in the broader health disparities research arena. With the exception of NIMHD, there is a disconnect between funding for projects with a disparities orientation in institutional training and center projects relative to individual research projects. These trends have implications for nurse scientists seeking NIH funding to support health equity-oriented research.

  2. 7 CFR 1738.2 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... financial analysis prepared by the Agency, based on the financial projections supplied by the applicant and... definition includes Variable Interest Entities as described in Financial Accounting Standards Board... required at the time the application was received by the Agency. Build-out means the construction...

  3. 7 CFR 1738.2 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... financial analysis prepared by the Agency, based on the financial projections supplied by the applicant and... definition includes Variable Interest Entities as described in Financial Accounting Standards Board... required at the time the application was received by the Agency. Build-out means the construction...

  4. 7 CFR 1738.2 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... financial analysis prepared by the Agency, based on the financial projections supplied by the applicant and... definition includes Variable Interest Entities as described in Financial Accounting Standards Board... required at the time the application was received by the Agency. Build-out means the construction...

  5. Enrollment Projection within a Decision-Making Framework.

    ERIC Educational Resources Information Center

    Armstrong, David F.; Nunley, Charlene Wenckowski

    1981-01-01

    Two methods used to predict enrollment at Montgomery College in Maryland are compared and evaluated, and the administrative context in which they are used is considered. The two methods involve time series analysis (curve fitting) and indicator techniques (yield from components). (MSE)

  6. A Study of Technical Engineering Peer Reviews at NASA

    NASA Technical Reports Server (NTRS)

    Chao, Lawrence P.; Tumer, Irem Y.; Bell, David G.

    2003-01-01

    This report describes the state of practices of design reviews at NASA and research into what can be done to improve peer review practices. There are many types of reviews at NASA: required and not, formalized and informal, programmatic and technical. Standing formal project reviews such as the Preliminary Design Review and Critical Design Review are a required part of every project and mission development. However, the technical engineering peer reviews that support teams' work on such projects are informal, sometimes ad hoc, and inconsistent across the organization. The goal of this work is to identify best practices and lessons learned from NASA's experience, supported by academic research and methodologies, to ultimately improve the process. This research has determined that the organization, composition, scope, and approach of the reviews impact their success. Failure Modes and Effects Analysis (FMEA) can identify key areas of concern before or during the reviews. Product definition tools like the Project Priority Matrix, engineering-focused Customer Value Chain Analysis (CVCA), and project- or system-based Quality Function Deployment (QFD) help prioritize resources in reviews. The use of information technology and structured design methodologies can strengthen the engineering peer review process to help NASA work towards error-proofing the design process.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watrous, Matthew George; Adamic, Mary Louise; Olson, John Eric

    The goal of the project, New Paradigms for Isotope Ratio Mass Spectrometry: Raising the Scientific Profile and Improved Performance for Accelerator Mass Spectrometry (AMS) and Thermal Ionization Mass Spectrometry (TIMS), is to ensure that the ongoing isotope ratio determination capability within the U.S. Department of Energy complex is the world's best for application to nonproliferation. This report spells out the progress of Task 4, Transition of TIMS to AMS for Iodine Analysis, of the larger project. The subtasks under Task 4 and the accomplishments throughout the three-year project life cycle are presented in this report. Progress was made in optimization of chemical extraction, determination of a detection limit for 127Iodine, production of standard materials for AMS analysis quality assurance, facilitation of knowledge exchange with respect to analyzing iodine on an AMS, cross comparison with a world-leading AMS laboratory, supercritical fluid extraction of iodine for AMS analysis, and electrodeposition of seawater as a direct method of preparation for iodine analysis by AMS--all with the goal of minimizing the time required to stand up an AMS capability for iodine analysis of exposed air filters at INL. An effective extraction method has been developed and demonstrated for iodine analysis of exposed air filters. Innovative techniques for cathode preparation for AMS analysis were developed, demonstrated, and published. The known gap of a lack of available reference standard materials for the analysis of iodine by AMS was filled by the preparation of homogeneous materials calibrated against NIST materials. A minimum limit on the amount of abundant isotope in a sample was determined for AMS analysis. The knowledge exchange was highly successful: scientists engaged the international AMS community at conferences, as well as in their laboratories for collaborative work. The supercritical fluid extraction work has produced positive data but is not yet a replacement for leaching. The added seawater work has led to a good method and a possible publication. The focus of this project was to minimize the time needed to stand up the AMS capability by having all the preparation and supporting functions worked out ahead of the instrument's arrival. Owing to this preparatory work and its success, the instrument was delivered and turned over to the INL in February 2015. Since then, INL scientists have been successfully vetting its capabilities and making their own measurements, which agree well with the world's leading laboratories for iodine analysis by AMS. Initially, all AMS data were gathered on other laboratories' instruments, but during the last six months the data for this project have come from INL's AMS.

  8. Risk analysis for renewable energy projects due to constraints arising

    NASA Astrophysics Data System (ADS)

    Prostean, G.; Vasar, C.; Prostean, O.; Vartosu, A.

    2016-02-01

    Starting from the European Union (EU) target of a binding 20% share of renewable energy in final energy consumption by 2020, this article illustrates the identification of risks in the implementation of wind energy projects in Romania, risks which could have complex technical, social, and administrative implications. In the specific projects analyzed in this paper, critical bottlenecks in the future wind power supply chain were identified, along with the delays that may reasonably arise. Renewable energy technologies face a number of constraints that have delayed the scaling-up of their production processes, their transport processes, their equipment reliability, and so on; implementing these types of projects therefore requires a complex, specialized team, whose coordination also involves specific risks. The research team applied an analytical risk approach to identify major risks encountered within a wind farm project developed in isolated regions of Romania with different particularities, configured for different geographical areas (hill and mountain locations). Identification of major risks was based on a conceptual model set up for the entire project implementation process, within which the specific constraints of the process were identified. Integration risks were examined in an empirical study based on the HAZOP (Hazard and Operability) method. The discussion analyzes our results in the context of renewable energy project implementation in Romania and creates a framework for assessing the supply of energy from renewable sources to any entity.

  9. The projection of a test genome onto a reference population and applications to humans and archaic hominins.

    PubMed

    Yang, Melinda A; Harris, Kelley; Slatkin, Montgomery

    2014-12-01

    We introduce a method for comparing a test genome with numerous genomes from a reference population. Sites in the test genome are given a weight, w, that depends on the allele frequency, x, in the reference population. The projection of the test genome onto the reference population is the average weight for each x, [Formula: see text]. The weight is assigned in such a way that, if the test genome is a random sample from the reference population, then [Formula: see text]. Using analytic theory, numerical analysis, and simulations, we show how the projection depends on the time of population splitting, the history of admixture, and changes in past population size. The projection is sensitive to small amounts of past admixture, the direction of admixture, and admixture from a population not sampled (a ghost population). We compute the projections of several human and two archaic genomes onto three reference populations from the 1000 Genomes project-Europeans, Han Chinese, and Yoruba-and discuss the consistency of our analysis with previously published results for European and Yoruba demographic history. Including higher amounts of admixture between Europeans and Yoruba soon after their separation and low amounts of admixture more recently can resolve discrepancies between the projections and demographic inferences from some previous studies. Copyright © 2014 by the Genetics Society of America.
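
    As a toy illustration of the projection idea, the Python sketch below assigns each site the weight w = (indicator that the test genome carries the allele)/x, one simple choice with E[w | x] = 1 when the test genome is a random draw from the reference population; the paper's actual weighting scheme is not reproduced in the abstract and may differ, and the simulated data are purely illustrative.

      # Hedged sketch of a projection computation under an assumed weight.
      import numpy as np

      rng = np.random.default_rng(0)
      n_ref, n_sites = 100, 50_000

      # Simulated true allele frequencies, a reference sample, and a test
      # genome drawn from the same population (so the curve should be flat).
      freqs = rng.uniform(0.05, 0.95, n_sites)
      ref_counts = rng.binomial(n_ref, freqs)
      x = ref_counts / n_ref                     # reference allele frequencies
      test = rng.random(n_sites) < freqs         # haploid test genome alleles

      # Assumed weight: w = 1[test carries allele] / x, zero where x = 0.
      weights = np.where(x > 0, test / np.maximum(x, 1e-9), 0.0)

      # The projection is the average weight within each frequency bin.
      bins = np.linspace(0, 1, 21)
      idx = np.digitize(x, bins) - 1
      projection = np.array(
          [weights[idx == b].mean() for b in range(len(bins) - 1)])
      # Roughly flat near 1.0 away from the extreme frequency bins, where
      # sampling noise in x biases the simple weight used here.
      print(np.round(projection, 2))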

  10. Relationship between time management in construction industry and project management performance

    NASA Astrophysics Data System (ADS)

    Nasir, Najuwa; Nawi, Mohd Nasrun Mohd; Radzuan, Kamaruddin

    2016-08-01

    The construction industry, particularly in Malaysia, struggles to achieve good time management for construction projects. Project managers have a great responsibility to keep a project on track for completion within the planned time. However, studies show that delays, especially in the Malaysian construction industry, remain unresolved due to weaknesses in managing projects. In addition, the quality of time management on construction projects is generally poor. Because of the progressively extended delays, time performance has become an important subject to explore in order to investigate delay factors. The method of this study is a review of the literature on issues in the construction industry that affect the time performance of projects in general, focusing on the processes involved in project management. The review found that knowledge, commitment, and cooperation are the main criteria for managing a project smoothly from execution through completion. It can be concluded that strength in these criteria between the project manager and team members is highly needed for good time performance. However, the factors of poor time performance that are strongly related to project management have not been well established. Hence, this study was conducted to establish the factors of poor time performance and their relationship with project management.

  11. Project management practice and its effects on project success in Malaysian construction industry

    NASA Astrophysics Data System (ADS)

    Haron, N. A.; Devi, P.; Hassim, S.; Alias, A. H.; Tahir, M. M.; Harun, A. N.

    2017-12-01

    Rapid economic development has increased the demand for construction of infrastructure and facilities globally. Sustainable development and globalization are the new 'Zeitgeist' of the 21st century. In order to implement these projects successfully and to meet their functional aims within their lifetimes, efficient project management practice is needed. The aim of this study is to identify the critical success factors (CSFs) and the extent of use of project management practices that affect project success, especially during the implementation stage. A mixed-method data collection approach was adopted, combining semi-structured interviews with self-administered questionnaires completed by 232 respondents. The analysis showed that new and emerging criteria such as customer satisfaction, competency of the project team, and performance of subcontractors/suppliers are becoming measures of success in addition to the classic iron triangle's view of time, cost and quality. The study also provided insight into the extent of use of different project management practices in the industry.

  12. Simulation of the National Aerospace System for Safety Analysis

    NASA Technical Reports Server (NTRS)

    Pritchett, Amy; Goldsman, Dave; Statler, Irv (Technical Monitor)

    2002-01-01

    Work started on this project on January 1, 1999, the first year of the grant. Following the outline of the grant proposal, a simulator architecture has been established which can incorporate the variety of types of models needed to accurately simulate national airspace dynamics. For the sake of efficiency, this architecture was based on an established single-aircraft flight simulator, the Reconfigurable Flight Simulator (RFS), already developed at Georgia Tech. Likewise, in the first year substantive changes and additions were made to the RFS to convert it into a simulation of the National Airspace System, with the flexibility to incorporate many types of models: aircraft models; controller models; airspace configuration generators; discrete event generators; embedded statistical functions; and display and data outputs. The architecture has been developed with the capability to accept any models of these types; due to its object-oriented structure, individual simulator components can be added and removed during run-time, and can be compiled separately. Simulation objects from other projects should be easy to convert to meet architecture requirements, with the intent both that this project can incorporate established simulation components from other projects and that other projects can easily use this simulation without significant time investment.
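
    The run-time pluggability described above can be illustrated with a short, hypothetical Python sketch; the class and method names are invented for illustration and are not the RFS's actual implementation.

      # Hedged sketch: components sharing a common interface can be
      # registered and removed while the simulation loop is running.
      from abc import ABC, abstractmethod

      class SimObject(ABC):
          """Common interface for aircraft, controller, and event models."""
          @abstractmethod
          def step(self, t: float) -> None: ...

      class Aircraft(SimObject):
          def __init__(self, ident: str, speed: float):
              self.ident, self.x, self.speed = ident, 0.0, speed
          def step(self, t: float) -> None:
              self.x += self.speed        # trivial dynamics for illustration

      class Simulator:
          def __init__(self):
              self._objects = []
          def add(self, obj: SimObject) -> None:     # legal mid-run
              self._objects.append(obj)
          def remove(self, obj: SimObject) -> None:  # legal mid-run
              self._objects.remove(obj)
          def run(self, steps: int) -> None:
              for t in range(steps):
                  # Iterate over a copy: the set may change during a step.
                  for obj in list(self._objects):
                      obj.step(t)

      sim = Simulator()
      sim.add(Aircraft("N100", speed=2.0))
      sim.run(10)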

  13. Cointegration and Nonstationarity in the Context of Multiresolution Analysis

    NASA Astrophysics Data System (ADS)

    Worden, K.; Cross, E. J.; Kyprianou, A.

    2011-07-01

    Cointegration has established itself as a powerful means of projecting out long-term trends from time-series data in the context of econometrics. Recent work by the current authors has further established that cointegration can be applied profitably in the context of structural health monitoring (SHM), where it is desirable to project out the effects of environmental and operational variations from data in order that they do not generate false positives in diagnostic tests. The concept of cointegration is partly built on a clear understanding of the ideas of stationarity and nonstationarity for time-series. Nonstationarity in this context is 'traditionally' established through the use of statistical tests, e.g. the hypothesis test based on the augmented Dickey-Fuller statistic. However, it is important to understand the distinction in this case between 'trend' stationarity and stationarity of the AR models typically fitted as part of the analysis process. The current paper will discuss this distinction in the context of SHM data and will extend the discussion by the introduction of multi-resolution (discrete wavelet) analysis as a means of characterising the time-scales on which nonstationarity manifests itself. The discussion will be based on synthetic data and also on experimental data for the guided-wave SHM of a composite plate.
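
    A brief sketch of the two ingredients discussed above, using standard Python libraries (statsmodels for the augmented Dickey-Fuller test, PyWavelets for the discrete wavelet transform); the synthetic trend-plus-noise series stands in for SHM data.

      # Hedged sketch: test a series for nonstationarity, then view it at
      # multiple resolutions so the time-scale of the trend can be seen.
      import numpy as np
      import pywt
      from statsmodels.tsa.stattools import adfuller

      rng = np.random.default_rng(1)
      t = np.arange(1024)
      series = 0.01 * t + np.sin(2 * np.pi * t / 64) + rng.normal(0, 0.5, t.size)

      adf_stat, p_value, *_ = adfuller(series)
      print(f"ADF statistic = {adf_stat:.2f}, p = {p_value:.3f}")
      # A large p-value fails to reject the unit-root null: nonstationary.

      # Multi-resolution view: the slow trend concentrates in the coarse
      # approximation coefficients, the noise in the fine detail levels.
      coeffs = pywt.wavedec(series, "db4", level=5)
      names = ["approx(5)"] + [f"detail({k})" for k in range(5, 0, -1)]
      for name, c in zip(names, coeffs):
          print(f"{name}: {c.size} coefficients, energy {np.sum(c**2):.1f}")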

  14. National Economic Development Procedures Manual - Overview Manual for Conducting National Economic Development Analysis

    DTIC Science & Technology

    1991-10-01

    ... could compensate losers and improve society ... The bulk of project costs are generally incurred during the construction period. Benefits ... the cost he would be willing to incur to catch the 1,000 pounds of fish is $4,000 ... that each fisherman would produce at each possible price. Nonetheless ... neutrality, enabling them to use expected annual damages as the measure of a beneficiary's willingness ... Project costs are incurred primarily at the time ...

  15. Engineering studies of vectorcardiographs in blood pressure measuring systems

    NASA Technical Reports Server (NTRS)

    Mark, R. G.

    1975-01-01

    The following projects involving cardiovascular instrumentation were conducted: (1) the development and fabrication of a three-dimensional display measurement system for vectorcardiograms, (2) the development and fabrication of a cardiovascular monitoring system to noninvasively monitor beat-by-beat the blood pressure and heart rate using aortic pulse wave velocity, (3) the development of software for an interactive system to analyze systolic time interval data, and (4) the development of microprocessor-based physiologic instrumentation, focusing initially on EKG rhythm analysis. Brief descriptions of these projects were given.

  16. Students talk about energy in project-based inquiry science

    NASA Astrophysics Data System (ADS)

    Harrer, Benedikt W.; Flood, Virginia J.; Wittmann, Michael C.

    2013-01-01

    We examine the types of emergent language eighth grade students in rural Maine middle schools use when they discuss energy in their first experiences with Project-Based Inquiry Science: Energy, a research-based curriculum that uses a specific language for talking about energy. By comparing the language used in the curriculum materials with students' language, we find that students' talk is at times more aligned with a Stores and Transfer model of energy than the Forms model supported by the curriculum.

  17. Project SAFE: A Blueprint for Flight Standards. Part 1.

    DTIC Science & Technology

    1985-01-01

    ... manage quota utilization and scheduling was designed to meet a more stable and predictable training environment than that which now exists in the Flight ... accurate and timely reporting of field office activities, and f. Provide improved capability to conduct national level analyses to predict and prevent ... and analysis of the task performed by the Flight Standards ... (Brief description of the project) ... The objective of the ...

  18. Bayesian analyses of time-interval data for environmental radiation monitoring.

    PubMed

    Luo, Peng; Sharp, Julia L; DeVol, Timothy A

    2013-01-01

    Time-interval (time difference between two consecutive pulses) analysis based on the principles of Bayesian inference was investigated for online radiation monitoring. Using experimental and simulated data, Bayesian analysis of time-interval data [Bayesian (ti)] was compared with Bayesian and a conventional frequentist analysis of counts in a fixed count time [Bayesian (cnt) and single interval test (SIT), respectively]. The performances of the three methods were compared in terms of average run length (ARL) and detection probability for several simulated detection scenarios. Experimental data were acquired with a DGF-4C system in list mode. Simulated data were obtained using Monte Carlo techniques to obtain a random sampling of the Poisson distribution. All statistical algorithms were developed using the R Project for statistical computing. Bayesian analysis of time-interval information provided a detection probability similar to that of Bayesian analysis of count information, but the authors were able to make a decision with fewer pulses at relatively higher radiation levels. In addition, for cases with a very short presence of the source (shorter than the count time), time-interval information is more sensitive for detecting a change than count information, since the source signal is averaged with the background over the entire count time. The relationships of the source time, change points, and modifications to the Bayesian approach for increasing detection probability are presented.
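
    The core idea can be sketched with a conjugate toy model in Python (a simplification, not the authors' algorithm): inter-pulse intervals from a Poisson source are exponentially distributed, so a Gamma prior on the count rate updates after every pulse rather than only after a fixed count time.

      # Hedged sketch: Bayesian inference on inter-pulse time intervals.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      background_rate = 5.0          # counts per second (assumed known)
      true_rate = 12.0               # elevated rate while a source is present
      intervals = rng.exponential(1.0 / true_rate, size=30)

      # Gamma(a0, b0) prior on the rate; with exponential intervals the
      # posterior after n pulses is Gamma(a0 + n, b0 + sum of intervals).
      a0, b0 = 1.0, 1.0 / background_rate
      a_post = a0 + intervals.size
      b_post = b0 + intervals.sum()

      # Declare "source present" when P(rate > background | data) is high.
      p_elevated = stats.gamma.sf(background_rate, a_post, scale=1.0 / b_post)
      print(f"P(rate > background) = {p_elevated:.3f}")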

  19. Design, implementation and migration of security systems as an extreme project.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scharmer, Carol; Trujillo, David

    2010-08-01

    Decision Trees, algorithms, software code, risk management, reports, plans, drawings, change control, presentations, and analysis - all useful tools and efforts but time consuming, resource intensive, and potentially costly for projects that have absolute schedule and budget constraints. What are necessary and prudent efforts when a customer calls with a major security problem that needs to be fixed with a proven, off-the-approval-list, multi-layered integrated system with high visibility and limited funding that expires at the end of the Fiscal Year? Whether driven by budget cycles, safety, or by management decree, many such projects begin with generic scopes and funding allocated based on a rapid management 'guestimate.' Then a Project Manager (PM) is assigned a project with a predefined and potentially limited scope, compressed schedule, and potentially insufficient funding. The PM is tasked to rapidly and cost effectively coordinate a requirements-based design, implementation, test, and turnover of a fully operational system to the customer, all while the customer is operating and maintaining an existing security system. Many project management manuals call this an impossible project that should not be attempted. However, security is serious business and the reality is that rapid deployment of proven systems via an 'Extreme Project' is sometimes necessary. Extreme Projects can be wildly successful but require a dedicated team of security professionals led by an experienced project manager using a highly-tailored and agile project management process with management support at all levels, all combined with significant interface with the customer. This paper does not advocate such projects or condone eliminating the valuable analysis and project management techniques. Indeed, having worked on a well-planned project provides the basis for experienced team members to complete Extreme Projects. This paper does, however, provide insight into what it takes for projects to be successfully implemented and accepted when completed under extreme conditions.

  20. Design implementation and migration of security systems as an extreme project.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scharmer, Carol

    2010-10-01

    Decision Trees, algorithms, software code, risk management, reports, plans, drawings, change control, presentations, and analysis - all useful tools and efforts but time consuming, resource intensive, and potentially costly for projects that have absolute schedule and budget constraints. What are necessary and prudent efforts when a customer calls with a major security problem that needs to be fixed with a proven, off-the-approval-list, multi-layered integrated system with high visibility and limited funding that expires at the end of the Fiscal Year? Whether driven by budget cycles, safety, or by management decree, many such projects begin with generic scopes and funding allocated based on a rapid management 'guestimate.' Then a Project Manager (PM) is assigned a project with a predefined and potentially limited scope, compressed schedule, and potentially insufficient funding. The PM is tasked to rapidly and cost effectively coordinate a requirements-based design, implementation, test, and turnover of a fully operational system to the customer, all while the customer is operating and maintaining an existing security system. Many project management manuals call this an impossible project that should not be attempted. However, security is serious business and the reality is that rapid deployment of proven systems via an 'Extreme Project' is sometimes necessary. Extreme Projects can be wildly successful but require a dedicated team of security professionals led by an experienced project manager using a highly-tailored and agile project management process with management support at all levels, all combined with significant interface with the customer. This paper does not advocate such projects or condone eliminating the valuable analysis and project management techniques. Indeed, having worked on a well-planned project provides the basis for experienced team members to complete Extreme Projects. This paper does, however, provide insight into what it takes for projects to be successfully implemented and accepted when completed under extreme conditions.

  1. Estimating the Energy, Demand and Cost Savings from a Geothermal Heat Pump ESPC Project at Fort Polk, LA Through Utility Bill Analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shonder, John A; Hughes, Patrick

    2006-01-01

    Energy savings performance contracts (ESPCs) are a method of financing energy conservation projects using the energy cost savings generated by the conservation measures themselves. Ideally, reduced energy costs are visible as reduced utility bills, but in fact this is not always the case. On large military bases, for example, a single electric meter typically covers hundreds of individual buildings. Savings from an ESPC involving only a small number of these buildings will have little effect on the overall utility bill. In fact, changes in mission, occupancy, and energy prices could cause substantial increases in utility bills. For this reason, other, more practical, methods have been developed to measure and verify savings in ESPC projects. Nevertheless, increasing utility bills--when ESPCs are expected to be reducing them--are problematic and can lead some observers to question whether savings are actually being achieved. In this paper, the authors use utility bill analysis to determine energy, demand, and cost savings from an ESPC project that installed geothermal heat pumps in the family housing areas of the military base at Fort Polk, Louisiana. The savings estimates for the first year after the retrofits were found to be in substantial agreement with previous estimates that were based on submetered data. However, the utility bills also show that electrical use tended to increase as time went on. Since other data show that the energy use in family housing has remained about the same over the period, the authors conclude that the savings from the ESPC have persisted, and increases in electrical use must be due to loads unassociated with family housing. This shows that under certain circumstances, and with the proper analysis, utility bills can be used to estimate savings from ESPC projects. However, these circumstances are rare and over time the comparison may be invalidated by increases in energy use in areas unaffected by the ESPC.
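
    As an illustration of one common form of utility-bill analysis (a weather-normalized regression; the paper's exact model is not given in the abstract), the Python sketch below fits monthly use against cooling degree days before and after a retrofit and differences the two fits under identical weather.

      # Hedged sketch: degree-day regression of monthly electricity use.
      import numpy as np

      def fit_energy_model(cdd: np.ndarray, kwh: np.ndarray) -> np.ndarray:
          """Least-squares fit: kwh ~ base + slope * cooling degree days."""
          X = np.column_stack([np.ones_like(cdd), cdd])
          coef, *_ = np.linalg.lstsq(X, kwh, rcond=None)
          return coef

      cdd = np.array([50.0, 120, 300, 450, 400, 250])   # illustrative weather
      pre_kwh = 2000 + 6.0 * cdd + np.random.default_rng(3).normal(0, 50, cdd.size)
      post_kwh = 1800 + 4.5 * cdd + np.random.default_rng(4).normal(0, 50, cdd.size)

      pre = fit_energy_model(cdd, pre_kwh)
      post = fit_energy_model(cdd, post_kwh)
      # Evaluate both fits under the same (average) weather and difference.
      savings = (pre - post) @ np.array([1.0, cdd.mean()])
      print(f"estimated savings: {savings:.0f} kWh per average month")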

  2. A decision-analytic approach to the optimal allocation of resources for endangered species consultation

    USGS Publications Warehouse

    Converse, Sarah J.; Shelley, Kevin J.; Morey, Steve; Chan, Jeffrey; LaTier, Andrea; Scafidi, Carolyn; Crouse, Deborah T.; Runge, Michael C.

    2011-01-01

    The resources available to support conservation work, whether time or money, are limited. Decision makers need methods to help them identify the optimal allocation of limited resources to meet conservation goals, and decision analysis is uniquely suited to assist with the development of such methods. In recent years, a number of case studies have been described that examine optimal conservation decisions under fiscal constraints; here we develop methods to look at other types of constraints, including limited staff and regulatory deadlines. In the US, Section Seven consultation, an important component of protection under the federal Endangered Species Act, requires that federal agencies overseeing projects consult with federal biologists to avoid jeopardizing species. A benefit of consultation is negotiation of project modifications that lessen impacts on species, so staff time allocated to consultation supports conservation. However, some offices have experienced declining staff, potentially reducing the efficacy of consultation. This is true of the US Fish and Wildlife Service's Washington Fish and Wildlife Office (WFWO) and its consultation work on federally-threatened bull trout (Salvelinus confluentus). To improve effectiveness, WFWO managers needed a tool to help allocate this work to maximize conservation benefits. We used a decision-analytic approach to score projects based on the value of staff time investment, and then identified an optimal decision rule for how scored projects would be allocated across bins, where projects in different bins received different time investments. We found that, given current staff, the optimal decision rule placed 80% of informal consultations (those where expected effects are beneficial, insignificant, or discountable) in a short bin where they would be completed without negotiating changes. The remaining 20% would be placed in a long bin, warranting an investment of seven days, including time for negotiation. For formal consultations (those where expected effects are significant), 82% of projects would be placed in a long bin, with an average time investment of 15 days. The WFWO is using this decision-support tool to help allocate staff time. Because workload allocation decisions are iterative, we describe a monitoring plan designed to increase the tool's efficacy over time. This work has general application beyond Section Seven consultation, in that it provides a framework for efficient investment of staff time in conservation when such time is limited and when regulatory deadlines prevent an unconstrained approach. © 2010.
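
    The flavor of the bin-allocation rule can be sketched as a small knapsack-style heuristic in Python; the scores, time requirements, and greedy rule below are invented for illustration and are not the paper's actual optimization.

      # Hedged sketch: allocate consultations to a "long" bin until the
      # available staff days run out, favoring value per staff day.
      from dataclasses import dataclass

      @dataclass
      class Consultation:
          name: str
          score: float       # value of extra time (from a scoring step)
          long_days: float   # staff days required for the long bin

      def allocate(projects: list, budget_days: float) -> dict:
          """Greedy: put the highest value-per-day projects in the long bin."""
          bins = {}
          ranked = sorted(projects, key=lambda p: p.score / p.long_days,
                          reverse=True)
          for p in ranked:
              if p.long_days <= budget_days:
                  bins[p.name] = "long"
                  budget_days -= p.long_days
              else:
                  bins[p.name] = "short"
          return bins

      projects = [Consultation("culvert", 8.0, 7),
                  Consultation("dock", 2.0, 7),
                  Consultation("levee", 9.0, 15)]
      print(allocate(projects, budget_days=20))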

  3. UAS Integration in the NAS Project: Flight Test 3 Data Analysis of JADEM-Autoresolver Detect and Avoid System

    NASA Technical Reports Server (NTRS)

    Gong, Chester; Wu, Minghong G.; Santiago, Confesor

    2016-01-01

    The Unmanned Aircraft Systems Integration in the National Airspace System project, or UAS Integration in the NAS, aims to reduce technical barriers related to safety and operational challenges associated with enabling routine UAS access to the NAS. The UAS Integration in the NAS Project conducted a flight test activity, referred to as Flight Test 3 (FT3), involving several Detect-and-Avoid (DAA) research prototype systems between June 15, 2015 and August 12, 2015 at the Armstrong Flight Research Center (AFRC). This report documents the flight testing and analysis results for the NASA Ames-developed JADEM-Autoresolver DAA system, referred to as 'Autoresolver' herein. Four flight test days (June 17, 18, 22, and July 22) were dedicated to Autoresolver testing. The objectives of this test were as follows: 1. Validate CPA prediction accuracy and detect-and-avoid (DAA, formerly known as self-separation) alerting logic in realistic flight conditions. 2. Validate DAA trajectory model including maneuvers. 3. Evaluate TCAS/DAA interoperability. 4. Inform final Minimum Operating Performance Standards (MOPS). Flight test scenarios were designed to collect data to directly address the objectives 1-3. Objective 4, inform final MOPS, was a general objective applicable to the UAS in the NAS project as a whole, of which flight test is a subset. This report presents analysis results completed in support of the UAS in the NAS project FT3 data review conducted on October 20, 2015. Due to time constraints and, to a lesser extent, TCAS data collection issues, objective 3 was not evaluated in this analysis.
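
    For readers unfamiliar with CPA prediction (objective 1), the Python sketch below computes the closest point of approach for two constant-velocity trajectories; the fielded DAA trajectory model validated in FT3 also handles maneuvers, which this toy version does not.

      # Hedged sketch: closed-form CPA for straight-line relative motion.
      import numpy as np

      def cpa(p_own, v_own, p_intruder, v_intruder):
          """Time and distance of closest approach for constant velocities."""
          dp = np.asarray(p_intruder, float) - np.asarray(p_own, float)
          dv = np.asarray(v_intruder, float) - np.asarray(v_own, float)
          speed2 = np.dot(dv, dv)
          # Minimize |dp + t*dv|; clamp to t >= 0 (no closing in the past).
          t = 0.0 if speed2 == 0 else max(0.0, -np.dot(dp, dv) / speed2)
          return t, float(np.linalg.norm(dp + t * dv))

      t_cpa, d_cpa = cpa([0, 0], [120, 0], [5000, -2000], [-60, 40])
      print(f"CPA in {t_cpa:.1f} s at {d_cpa:.0f} m")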

  4. [Tobacco quality analysis of industrial classification of different years using near-infrared (NIR) spectrum].

    PubMed

    Wang, Yi; Xiang, Ma; Wen, Ya-Dong; Yu, Chun-Xia; Wang, Luo-Ping; Zhao, Long-Lian; Li, Jun-Hui

    2012-11-01

    In this study, a tobacco quality analysis of the main industrial classifications across different years was carried out using spectrum projection and correlation methods. The data were near-infrared (NIR) spectra from Hongta Tobacco (Group) Co., Ltd.: 5,730 industrially classified tobacco leaf samples from Yuxi in Yunnan province, collected from 2007 to 2010 by near-infrared spectroscopy, drawn from different plant parts and colors and all belonging to the HONGDA tobacco variety. The results showed that, when the samples of a given year were randomly divided into analysis and verification sets at a ratio of 2:1, the verification set corresponded with the analysis set under spectrum projection, with correlation coefficients above 0.98. The correlation coefficients between different years under spectrum projection were above 0.97; the highest was between 2008 and 2009 and the lowest between 2007 and 2010. The study also discussed a method for obtaining quantitative similarity values for different industrial classification samples. The similarity and consistency values are instructive for the combination and replacement of tobacco leaves in blending.
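
    A minimal sketch of this kind of between-year comparison, in Python with simulated spectra (the paper's actual spectrum-projection computation is not reproduced here):

      # Hedged sketch: represent each year's class by its mean NIR spectrum
      # and use the correlation between mean spectra as a similarity value.
      import numpy as np

      rng = np.random.default_rng(5)
      wavelengths = 500
      base = np.sin(np.linspace(0, 8, wavelengths))   # stand-in spectral shape

      def year_samples(shift: float, n: int = 200) -> np.ndarray:
          """Simulate n spectra for one year with a small systematic shift."""
          return base + shift + rng.normal(0, 0.05, (n, wavelengths))

      spectra_2008 = year_samples(0.00)
      spectra_2009 = year_samples(0.02)

      r = np.corrcoef(spectra_2008.mean(axis=0),
                      spectra_2009.mean(axis=0))[0, 1]
      print(f"between-year correlation: {r:.4f}")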

  5. Interprofessional teaching and learning in the health care professions: A qualitative evaluation of the Robert Bosch Foundation's grant program "Operation Team"

    PubMed Central

    Nock, Lukas

    2016-01-01

    Aim: Interprofessional teaching and learning is gaining significance in the health professions. At the same time, the development and implementation of such educational courses is demanding. Focusing on factors critical to success, the aim of this paper is to evaluate the experience gathered by eight grant projects in which interprofessional courses were designed. Emphasis is placed on the level of cooperation between the participating educational institutions, course content, the operative implementation of the course units, and their permanent integration into curricula. Method: Data were collected in semi-structured, guideline-based interviews with project leaders and team members (n=43). University and vocational students who had attended the evaluated courses were also included in the survey (n=7) as a means to triangulate data. Analysis was carried out based on qualitative content analysis. Results: A participatory, dialogue-centered model of cooperation appears to be most suited for developing and implementing courses. Factors critical to success include the time when courses are offered, the conditions for attendance, the different teaching and learning cultures of the professions involved, the preparation and deployment of instructors, and the role played by project coordination. Permanently integrating interprofessional units into medical curricula revealed itself to be difficult. Conclusion: While the development and realization of interprofessional courses can be achieved easily enough in projects, curricular integration of the new course units is challenging. With respect to the latter, not only are large amounts of staffing resources and time required, but also the creation of the necessary system-level structures, not just within the educational institutions (organizational development) but also in the frameworks governing the professions. PMID:27280127

  6. Interprofessional teaching and learning in the health care professions: A qualitative evaluation of the Robert Bosch Foundation's grant program "Operation Team".

    PubMed

    Nock, Lukas

    2016-01-01

    Interprofessional teaching and learning is gaining significance in the health professions. At the same time, the development and implementation of such educational courses is demanding. Focusing on factors critical to success, the aim of this paper is to evaluate the experience gathered by eight grant projects in which interprofessional courses were designed. Emphasis is placed on the level of cooperation between the participating educational institutions, course content, the operative implementation of the course units, and their permanent integration into curricula. Data were collected in semi-structured, guideline-based interviews with project leaders and team members (n=43). University and vocational students who had attended the evaluated courses were also included in the survey (n=7) as a means to triangulate data. Analysis was carried out based on qualitative content analysis. A participatory, dialogue-centered model of cooperation appears to be most suited for developing and implementing courses. Factors critical to success include the time when courses are offered, the conditions for attendance, the different teaching and learning cultures of the professions involved, the preparation and deployment of instructors, and the role played by project coordination. Permanently integrating interprofessional units into medical curricula revealed itself to be difficult. While the development and realization of interprofessional courses can be achieved easily enough in projects, curricular integration of the new course units is challenging. With respect to the latter, not only are large amounts of staffing resources and time required, but also the creation of the necessary system-level structures, not just within the educational institutions (organizational development) but also in the frameworks governing the professions.

  7. Estimation and prediction of origin-destination matrices for I-66.

    DOT National Transportation Integrated Search

    2011-09-01

    This project uses the Box-Jenkins time-series technique to model and forecast the traffic flows and then uses the flow forecasts to predict the origin-destination matrices. First, a detailed analysis was conducted to investigate the best data cor...
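
    A minimal Box-Jenkins sketch in Python using statsmodels; the ARIMA order and the synthetic 15-minute flow series are illustrative assumptions, not the project's fitted model.

      # Hedged sketch: fit an ARIMA model to a flow series and forecast.
      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA

      rng = np.random.default_rng(6)
      flow = 1000 + 200 * np.sin(np.arange(288) * 2 * np.pi / 96)  # 15-min counts
      flow = flow + rng.normal(0, 30, flow.size)

      model = ARIMA(flow, order=(1, 0, 1)).fit()
      forecast = model.forecast(steps=4)      # next hour of 15-minute flows
      print(np.round(forecast, 0))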

  8. User requirements for project-oriented remote sensing

    NASA Technical Reports Server (NTRS)

    Hitchcock, H. C.; Baxter, F. P.; Cox, T. L.

    1975-01-01

    Registration of remotely sensed data to geodetic coordinates provides for overlay analysis of land use data. For aerial photographs of a large area, differences in scales, dates, and film types are reconciled, and multispectral scanner data are machine registered at the time of acquisition.

  9. Hybrid methods for cybersecurity analysis :

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Warren Leon,; Dunlavy, Daniel M.

    2014-01-01

    Early 2010 saw a significant change in adversarial techniques aimed at network intrusion: a shift from malware delivered via email attachments toward the use of hidden, embedded hyperlinks to initiate sequences of downloads and interactions with web sites and network servers containing malicious software. Enterprise security groups were well poised and experienced in defending against the former attacks, but the new types of attacks were larger in number, more challenging to detect, dynamic in nature, and required the development of new technologies and analytic capabilities. The Hybrid LDRD project was aimed at delivering new capabilities in large-scale data modeling and analysis to enterprise security operators and analysts and understanding the challenges of detection and prevention of emerging cybersecurity threats. Leveraging previous LDRD research efforts and capabilities in large-scale relational data analysis, large-scale discrete data analysis and visualization, and streaming data analysis, new modeling and analysis capabilities were quickly brought to bear on the problems in email phishing and spear phishing attacks in the Sandia enterprise security operational groups at the onset of the Hybrid project. As part of this project, a software development and deployment framework was created within the security analyst workflow tool sets to facilitate the delivery and testing of new capabilities as they became available, and machine learning algorithms were developed to address the challenge of dynamic threats. Furthermore, researchers from the Hybrid project were embedded in the security analyst groups for almost a full year, engaged in daily operational activities and routines, creating an atmosphere of trust and collaboration between the researchers and security personnel. The Hybrid project has altered the way that research ideas can be incorporated into the production environments of Sandia's enterprise security groups, reducing time to deployment from months and years to hours and days for the application of new modeling and analysis capabilities to emerging threats. The development and deployment framework has been generalized into the Hybrid Framework and incorporated into several LDRD, WFO, and DOE/CSL projects and proposals. And most importantly, the Hybrid project has provided Sandia security analysts with new, scalable, extensible analytic capabilities that have resulted in alerts not detectable using their previous workflow tool sets.

  10. Nationwide disturbance attribution on NASA’s earth exchange: experiences in a high-end computing environment

    Treesearch

    J. Chris Toney; Karen G. Schleeweis; Jennifer Dungan; Andrew Michaelis; Todd Schroeder; Gretchen G. Moisen

    2015-01-01

    The North American Forest Dynamics (NAFD) project’s Attribution Team is completing nationwide processing of historic Landsat data to provide a comprehensive annual, wall-to-wall analysis of US disturbance history, with attribution, over the last 25+ years. Per-pixel time series analysis based on a new nonparametric curve fitting algorithm yields several metrics useful...

  11. Sensitivity Analysis for Multidisciplinary Systems (SAMS)

    DTIC Science & Technology

    2016-12-01

    ... support both mode-based structural representations and time-dependent, nonlinear finite element structural dynamics. This interim report describes ... Adaptation, & Sensitivity Toolkit • Elasticity, heat transfer, & compressible flow • Adjoint solver for sensitivity analysis • High-order finite elements ...

  12. Risk Dimensions and Political Decisions Frame Environmental Communication: A Content Analysis of Seven U.S. Newspapers from 1970-2010

    ERIC Educational Resources Information Center

    Grantham, Susan; Vieira, Edward T., Jr.

    2014-01-01

    This project examined the focus of environmental news frames used in seven American newspapers between 1970 and 2010. During this time newspapers were a primary source of news. Based on gatekeeping and agenda-setting theory, as well as source credibility, the content analysis of 2,123 articles examined the environmental topics within the articles,…

  13. The Post-Dam System. Volume 5. Harvard Project Manager (HPM).

    DTIC Science & Technology

    1992-10-01

    ... collected and analyzed to determine structural integrity and usability. From this analysis, a repair schedule is developed. This is currently a time ... information on mission-critical facility damage is collected and analyzed to determine structural integrity and usability. From this analysis, a repair ... to determine repair strategies with an expert system, keep track of materials and equipment with a relational database management system, and ...

  14. Real-time volumetric image reconstruction and 3D tumor localization based on a single x-ray projection image for lung cancer radiotherapy.

    PubMed

    Li, Ruijiang; Jia, Xun; Lewis, John H; Gu, Xuejun; Folkerts, Michael; Men, Chunhua; Jiang, Steve B

    2010-06-01

    To develop an algorithm for real-time volumetric image reconstruction and 3D tumor localization based on a single x-ray projection image for lung cancer radiotherapy. Given a set of volumetric images of a patient at N breathing phases as the training data, deformable image registration was performed between a reference phase and the other N-1 phases, resulting in N-1 deformation vector fields (DVFs). These DVFs can be represented efficiently by a few eigenvectors and coefficients obtained from principal component analysis (PCA). By varying the PCA coefficients, new DVFs can be generated, which, when applied on the reference image, lead to new volumetric images. A volumetric image can then be reconstructed from a single projection image by optimizing the PCA coefficients such that its computed projection matches the measured one. The 3D location of the tumor can be derived by applying the inverted DVF on its position in the reference image. The algorithm was implemented on graphics processing units (GPUs) to achieve real-time efficiency. The training data were generated using a realistic and dynamic mathematical phantom with ten breathing phases. The testing data were 360 cone beam projections corresponding to one gantry rotation, simulated using the same phantom with a 50% increase in breathing amplitude. The average relative image intensity error of the reconstructed volumetric images is 6.9% +/- 2.4%. The average 3D tumor localization error is 0.8 +/- 0.5 mm. On an NVIDIA Tesla C1060 GPU card, the average computation time for reconstructing a volumetric image from each projection is 0.24 s (range: 0.17 to 0.35 s). The authors have shown the feasibility of reconstructing volumetric images and localizing tumor positions in 3D in near real-time from a single x-ray image.
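
    The reconstruction idea can be compressed into a toy 2D Python sketch (not the authors' GPU implementation): learn principal deformation modes from training DVFs, then optimize the PCA coefficients so the projection of the warped reference image matches the measured one.

      # Hedged sketch: PCA of DVFs plus coefficient optimization, in 2D.
      import numpy as np
      from scipy.ndimage import map_coordinates
      from scipy.optimize import minimize

      ny, nx = 32, 32
      yy, xx = np.mgrid[0:ny, 0:nx]
      ref = np.exp(-((yy - 16) ** 2 + (xx - 16) ** 2) / 30.0)  # toy "tumor"

      def make_dvf(amp):
          """Toy DVF: rigid shift in y by amp (flattened for PCA)."""
          return np.stack([np.full((ny, nx), float(amp)),
                           np.zeros((ny, nx))]).ravel()

      train = np.stack([make_dvf(a) for a in np.linspace(-3, 3, 9)])
      mean_dvf = train.mean(axis=0)
      _, _, vt = np.linalg.svd(train - mean_dvf, full_matrices=False)
      modes = vt[:2]                       # leading PCA eigenvectors

      def warp(image, dvf_flat):
          dy, dx = dvf_flat.reshape(2, ny, nx)
          return map_coordinates(image, [yy - dy, xx - dx], order=1)

      def project(image):
          return image.sum(axis=0)         # toy 1D "x-ray" projection

      measured = project(warp(ref, make_dvf(2.2)))   # unseen breathing state

      def objective(c):
          return np.sum((project(warp(ref, mean_dvf + c @ modes))
                         - measured) ** 2)

      c_opt = minimize(objective, x0=np.zeros(2), method="Nelder-Mead").x
      recovered = (mean_dvf + c_opt @ modes).reshape(2, ny, nx)[0]
      print("recovered shift ~", round(float(recovered.mean()), 2))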

  15. Vertical structure and physical processes of the Madden-Julian Oscillation: Biases and uncertainties at short range

    NASA Astrophysics Data System (ADS)

    Xavier, Prince K.; Petch, Jon C.; Klingaman, Nicholas P.; Woolnough, Steve J.; Jiang, Xianan; Waliser, Duane E.; Caian, Mihaela; Cole, Jason; Hagos, Samson M.; Hannay, Cecile; Kim, Daehyun; Miyakawa, Tomoki; Pritchard, Michael S.; Roehrig, Romain; Shindo, Eiki; Vitart, Frederic; Wang, Hailan

    2015-05-01

    An analysis of diabatic heating and moistening processes in 12 to 36 h lead time forecasts from 12 global circulation models is presented as part of the "Vertical structure and physical processes of the Madden-Julian Oscillation (MJO)" project. A lead time of 12-36 h is chosen to constrain the large-scale dynamics and thermodynamics to be close to observations while avoiding being too close to the initial spin-up of the models as they adjust to being driven from the Years of Tropical Convection (YOTC) analysis. A comparison of the vertical velocity and rainfall with the observations and YOTC analysis suggests that the phases of convection associated with the MJO are constrained in most models at this lead time, although the rainfall in the suppressed phase is typically overestimated. Although the large-scale dynamics is reasonably constrained, moistening and heating profiles have large intermodel spread. In particular, there are large spreads in convective heating and moistening at midlevels during the transition to active convection. Radiative heating and cloud parameters have the largest relative spread across models at upper levels during the active phase. A detailed analysis of time-step behavior shows that some models exhibit strong intermittency in rainfall, and that the relationship between precipitation and dynamics differs between models. The wealth of model outputs archived during this project is a very valuable resource for model developers beyond the study of the MJO. In addition, the findings of this study can inform the design of process model experiments, and inform the priorities for field experiments and future observing systems.

  16. Outsourced probe data effectiveness on signalized arterials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharifi, Elham; Young, Stanley Ernest; Eshragh, Sepideh

    This paper presents results of an I-95 Corridor Coalition sponsored project to assess the ability of outsourced vehicle probe data to provide accurate travel time on signalized roadways for the purposes of real-time operations as well as performance measures. The quality of outsourced probe data on freeways has led many departments of transportation to consider such data for arterial performance monitoring. From April 2013 through June of 2014, the University of Maryland Center for Advanced Transportation Technology gathered travel times from several arterial corridors within the mid-Atlantic region using Bluetooth traffic monitoring (BTM) equipment, and compared these travel times with the data reported to the I95 Vehicle Probe Project (VPP) from an outsourced probe data vendor. The analysis consisted of several methodologies: (1) a traditional analysis that used precision and bias speed metrics; (2) a slowdown analysis that quantified the percentage of significant traffic disruptions accurately captured in the VPP data; (3) a sampled distribution method that uses overlay methods to enhance and analyze recurring congestion patterns. (4) Last, the BTM and VPP data from each 24-hour period of data collection were reviewed by the research team to assess the extent to which VPP captured the nature of the traffic flow. Based on the analysis, probe data is recommended only on arterial roadways with signal densities (measured in signals per mile) up to one, and it should be tested and used with caution for signal densities between one and two, and is not recommended when signal density exceeds two.
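
    The first methodology can be illustrated with a tiny Python sketch of bias and precision computed against the Bluetooth ground truth; the numbers are invented, and the Coalition's formal metric definitions are not reproduced here.

      # Hedged sketch: speed bias and precision of probe data vs. BTM truth.
      import numpy as np

      btm = np.array([31.0, 28.5, 12.0, 26.0, 30.5])   # Bluetooth speeds, mph
      vpp = np.array([33.0, 29.0, 18.0, 27.5, 31.0])   # probe-data speeds, mph

      errors = vpp - btm
      bias = errors.mean()              # systematic offset
      precision = errors.std(ddof=1)    # spread of errors around the bias
      print(f"bias = {bias:+.1f} mph, precision = {precision:.1f} mph")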

  17. Multivariate statistical analysis of wildfires in Portugal

    NASA Astrophysics Data System (ADS)

    Costa, Ricardo; Caramelo, Liliana; Pereira, Mário

    2013-04-01

    Several studies demonstrate that wildfires in Portugal present high temporal and spatial variability as well as cluster behavior (Pereira et al., 2005, 2011). This study aims to contribute to the characterization of the fire regime in Portugal with a multivariate statistical analysis of the time series of the number of fires and area burned in Portugal during the 1980 - 2009 period. The data used in the analysis is an extended version of the Rural Fire Portuguese Database (PRFD) (Pereira et al., 2011), provided by the National Forest Authority (Autoridade Florestal Nacional, AFN), the Portuguese Forest Service, which includes information for more than 500,000 fire records. There are many advanced techniques for examining the relationships among multiple time series at the same time (e.g., canonical correlation analysis, principal components analysis, factor analysis, path analysis, multiple analyses of variance, clustering systems). This study compares and discusses the results obtained with these different techniques. Pereira, M.G., Trigo, R.M., DaCamara, C.C., Pereira, J.M.C., Leite, S.M., 2005: "Synoptic patterns associated with large summer forest fires in Portugal". Agricultural and Forest Meteorology. 129, 11-25. Pereira, M. G., Malamud, B. D., Trigo, R. M., and Alves, P. I.: The history and characteristics of the 1980-2005 Portuguese rural fire database, Nat. Hazards Earth Syst. Sci., 11, 3343-3358, doi:10.5194/nhess-11-3343-2011, 2011. This work is supported by European Union Funds (FEDER/COMPETE - Operational Competitiveness Programme) and by national funds (FCT - Portuguese Foundation for Science and Technology) under the project FCOMP-01-0124-FEDER-022692, the project FLAIR (PTDC/AAC-AMB/104702/2008) and the EU 7th Framework Program through FUME (contract number 243888).
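
    One of the listed techniques, principal components analysis of the joint number-of-fires and area-burned record, can be sketched in Python on simulated monthly series (the PRFD data themselves are not reproduced here).

      # Hedged sketch: PCA of two standardized, seasonally co-varying series.
      import numpy as np

      rng = np.random.default_rng(9)
      months = 360                                  # 1980-2009, monthly
      season = np.sin(np.arange(months) * 2 * np.pi / 12 - np.pi / 2) + 1
      n_fires = 1000 * season + rng.normal(0, 100, months)
      burned = 800 * season + rng.normal(0, 150, months)

      X = np.column_stack([n_fires, burned])
      Xc = (X - X.mean(axis=0)) / X.std(axis=0)     # standardize both series
      _, s, vt = np.linalg.svd(Xc, full_matrices=False)
      explained = s**2 / np.sum(s**2)
      print("variance explained by PC1:", round(float(explained[0]), 3))
      print("PC1 loadings:", np.round(vt[0], 2))    # the co-varying fire regime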

  18. Classifications for Coastal Wetlands Planning, Protection and Restoration Act site-specific projects: 2008 and 2009

    USGS Publications Warehouse

    Jones, William R.; Garber, Adrienne

    2012-01-01

    The Coastal Wetlands Planning, Protection and Restoration Act (CWPPRA) funds over 100 wetland restoration projects across Louisiana. Integral to the success of CWPPRA is its long-term monitoring program, which enables State and Federal agencies to determine the effectiveness of each restoration effort. One component of this monitoring program is the analysis of high-resolution, color-infrared aerial photography at the U.S. Geological Survey's National Wetlands Research Center in Lafayette, Louisiana. Color-infrared aerial photography (9- by 9-inch) is obtained before project construction and several times after construction. Each frame is scanned on a photogrammetric scanner that produces a high-resolution image in Tagged Image File Format (TIFF). By using image-processing software, these TIFF files are then orthorectified and mosaicked to produce a seamless image of a project area and its associated reference area (a control site near the project that has common environmental features, such as marsh type, soil types, and water salinities). The project and reference areas are then classified according to pixel value into two distinct classes, land and water. After initial land and water ratios have been established by using photography obtained before and after project construction, subsequent comparisons can be made over time to determine land-water change. Several challenges are associated with the land-water interpretation process. Primarily, land-water classifications are often complicated by the presence of floating aquatic vegetation that occurs throughout the freshwater systems of coastal Louisiana and that is sometimes difficult to differentiate from emergent marsh. Other challenges include tidal fluctuations and water movement from strong winds, which may result in flooding and inundation of emergent marsh during certain conditions. Compensating for these events is difficult but possible by using other sources of imagery to verify marsh conditions for other dates in time.
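
    At its core, the land-water classification step reduces to per-pixel thresholding; a hedged Python sketch follows, with an invented band and threshold (the operational workflow also involves orthorectification, mosaicking, and handling of floating vegetation).

      # Hedged sketch: classify pixels as land or water by value, then
      # compute the land fraction; change is the difference between dates.
      import numpy as np

      rng = np.random.default_rng(8)
      nir = rng.uniform(0, 255, (100, 100))   # stand-in near-infrared band

      THRESHOLD = 60                          # water is dark in the NIR band
      water = nir < THRESHOLD                 # boolean pixel classification

      land_ratio = 1.0 - water.mean()
      print(f"land fraction: {land_ratio:.2%}")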

  19. Scalable Open Science Approach for Mutation Calling of Tumor Exomes Using Multiple Genomic Pipelines.

    PubMed

    Ellrott, Kyle; Bailey, Matthew H; Saksena, Gordon; Covington, Kyle R; Kandoth, Cyriac; Stewart, Chip; Hess, Julian; Ma, Singer; Chiotti, Kami E; McLellan, Michael; Sofia, Heidi J; Hutter, Carolyn; Getz, Gad; Wheeler, David; Ding, Li

    2018-03-28

    The Cancer Genome Atlas (TCGA) cancer genomics dataset includes over 10,000 tumor-normal exome pairs across 33 different cancer types, in total >400 TB of raw data files requiring analysis. Here we describe the Multi-Center Mutation Calling in Multiple Cancers project, our effort to generate a comprehensive encyclopedia of somatic mutation calls for the TCGA data to enable robust cross-tumor-type analyses. Our approach accounts for variance and batch effects introduced by the rapid advancement of DNA extraction, hybridization-capture, sequencing, and analysis methods over time. We present best practices for applying an ensemble of seven mutation-calling algorithms with scoring and artifact filtering. The dataset created by this analysis includes 3.5 million somatic variants and forms the basis for PanCan Atlas papers. The results have been made available to the research community along with the methods used to generate them. This project is the result of collaboration from a number of institutes and demonstrates how team science drives extremely large genomics projects. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
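
    The ensemble idea can be sketched schematically in Python as consensus voting across callers; the threshold and variant records below are invented, and the project's actual scoring and artifact filtering are considerably more elaborate.

      # Hedged sketch: keep a variant when enough individual callers agree.
      from collections import Counter

      # Each caller reports a set of variants as (chrom, pos, ref, alt).
      calls = {
          "callerA": {("chr1", 10100, "A", "T"), ("chr2", 22000, "G", "C")},
          "callerB": {("chr1", 10100, "A", "T")},
          "callerC": {("chr1", 10100, "A", "T"), ("chr3", 5000, "C", "G")},
      }

      MIN_CALLERS = 2   # consensus threshold (assumed, not the project's value)

      votes = Counter(v for variant_set in calls.values() for v in variant_set)
      consensus = sorted(v for v, n in votes.items() if n >= MIN_CALLERS)
      for variant in consensus:
          print(variant, f"({votes[variant]}/{len(calls)} callers)")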

  20. Characterization of cracking behavior using posttest fractographic analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kobayashi, T.; Shockey, D.A.

    A determination of time to initiation of stress corrosion cracking in structures and test specimens is important for performing structural failure analysis and for setting inspection intervals. Yet it is seldom possible to establish how much of a component's lifetime represents the time to initiation of fracture and how much represents postinitiation crack growth. This exploratory research project was undertaken to examine the feasibility of determining crack initiation times and crack growth rates from posttest examination of fracture surfaces of constant-extension-rate-test (CERT) specimens by using the fracture reconstruction applying surface topography analysis (FRASTA) technique. The specimens used in this study were Type 304 stainless steel fractured in several boiling water reactor (BWR) aqueous environments. 2 refs., 25 figs., 2 tabs.

  1. Portfolio Analysis of Renewable Energy Opportunities: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richards, Allison; Deprizio, Jodi; Anderson, Kate

    Time Warner Cable (TWC), now Charter Communications (CC), partnered with the National Renewable Energy Laboratory (NREL) to assess the technical and economic potential for solar photovoltaic (PV), wind, and ground-source heat-pump systems at 696 TWC facilities. NREL identified 306 sites where adding a renewable energy system would provide cost savings over the project life-cycle. In general, the top sites have some combination of high electricity rates ($0.16-$0.29/kWh), significant state incentives, and favorable net-metering policies. If all projects were implemented via third-party power purchase agreements, TWC/CC would save $37 million over 25 years and meet 10.5% of their energy consumption with renewable energy. This paper describes the portfolio screening methodology used to identify and prioritize renewable energy opportunities across the TWC sites, as well as a summary of the potential cost savings that may be realized by implementing these projects. This may provide a template for other companies interested in identifying and prioritizing renewable energy opportunities across a large number of geographically dispersed sites. Following this initial portfolio analysis, NREL will be conducting in-depth analysis of project development opportunities at ten sites and evaluating off-grid solutions that may enable carbon emission reduction and grid independence at select facilities.

  2. Exploring sustainability transitions in households: insights from real-life experiments

    NASA Astrophysics Data System (ADS)

    Baedeker, Carolin; Buhl, Johannes; Greiff, Kathrin; Hasselkuß, Marco; Liedtke, Christa; Lukas, Melanie

    2016-04-01

    Societal transformation towards sustainable consumption and production, especially in urban areas, is a key challenge. The design and implementation of sustainable product service systems (PSS) might be the initial point, in which private households play a major role. The Sustainable LivingLab research infrastructure was developed as an experimental setting for investigating consumption and production patterns in private households, especially to explore socio-technical innovations which are helpful to guide sustainability transitions. The suggested presentation describes results of several real-life experiments conducted in German households, e.g. the project SusLabNRW (North-Rhine Westphalia as part of the European SusLabNWE-Project), the EnerTransRuhr project as well as the PATHWAYS project that explore patterns of action, time use, social practices and the related resource use in private households. The presentation gives an overview of the employed methods and analysed data: qualitative interviews, social network analysis, a survey on household activities and inventories, and a sustainability assessment (resource profiles, MIPS household analysis). Households' resource consumption was calculated in all fields of activity to analyse social practices' impact. The presentation illustrates how aggregated data can inform scenario analysis and concludes with an outlook onto transition pathways at household level and socio-technical innovations in the fields of housing, nutrition and mobility.

  3. Using Statistical Analysis Software to Advance Nitro Plasticizer Wettability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shear, Trevor Allan

    Statistical analysis in science is an extremely powerful tool that is often underutilized. Additionally, it is frequently the case that data is misinterpreted or not used to its fullest extent. Utilizing the advanced software JMP®, many aspects of experimental design and data analysis can be evaluated and improved. This overview will detail the features of JMP® and how they were used to advance a project, resulting in time and cost savings, as well as the collection of scientifically sound data. The project analyzed in this report addresses the inability of a nitro plasticizer to coat a gold-coated quartz crystal sensor used in a quartz crystal microbalance. Through the use of the JMP® software, the wettability of the nitro plasticizer was increased by over 200% using an atmospheric plasma pen, ensuring good sample preparation and reliable results.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakafuji, Dora; Gouveia, Lauren

    This project supports development of the next-generation, integrated energy management system (EMS) infrastructure able to incorporate advanced visualization of behind-the-meter distributed resource information and probabilistic renewable energy generation forecasts to inform real-time operational decisions. The project involves end-users and active feedback from a Utility Advisory Team (UAT) to help inform how information can be used to enhance operational functions (e.g. unit commitment, load forecasting, Automatic Generation Control (AGC) reserve monitoring, ramp alerts) within two major EMS platforms. Objectives include: engaging utility operations personnel to develop user input on displays, set expectations, test and review; developing ease-of-use and timeliness metrics for measuring enhancements; developing prototype integrated capabilities within two operational EMS environments; demonstrating an integrated decision analysis platform with real-time wind and solar forecasting information and timely distributed resource information; seamlessly integrating new 4-dimensional information into operations without increasing workload and complexity; developing sufficient analytics to inform and confidently transform and adopt new operating practices and procedures; disseminating project lessons learned through industry-sponsored workshops and conferences; and building on collaborative utility-vendor partnerships and industry capabilities.

  5. Feature selection and back-projection algorithms for nonline-of-sight laser-gated viewing

    NASA Astrophysics Data System (ADS)

    Laurenzis, Martin; Velten, Andreas

    2014-11-01

    We discuss new approaches to analyze laser-gated viewing data for nonline-of-sight vision with a frame-to-frame back-projection as well as feature selection algorithms. Although first back-projection approaches use time transients for each pixel, our method has the ability to calculate the projection of imaging data on the voxel space for each frame. Further, different data analysis algorithms and their sequential application were studied with the aim of identifying and selecting signals from different target positions. A slight modification of commonly used filters leads to a powerful selection of local maximum values. It is demonstrated that the choice of the filter has an impact on the selectivity, i.e., multiple-target detection, as well as on the localization precision.
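
    As a rough illustration of the back-projection idea, the sketch below accumulates each gated frame's intensities into the voxels whose round-trip path length from the relay wall matches that frame's gate delay. The geometry is simplified to the wall-to-voxel legs only, and all variable names are assumptions; the paper's frame-to-frame method and filtering are more refined.

        # Simplified per-frame back-projection onto a voxel grid for
        # non-line-of-sight laser-gated viewing. Illustrative only.
        import numpy as np

        C = 3e8  # speed of light, m/s

        def backproject(frames, delays, wall_pts, voxels, tol=0.05):
            """frames: (n_frames, n_wall) intensities at wall sample points;
            delays: gate delay per frame in seconds; wall_pts, voxels: (N, 3)."""
            # Round-trip wall->voxel->wall path length for every voxel/wall pair.
            path = 2.0 * np.linalg.norm(voxels[:, None, :] - wall_pts[None, :, :], axis=2)
            vol = np.zeros(len(voxels))
            for frame, t in zip(frames, delays):
                mask = np.abs(path - C * t) < tol   # voxels consistent with this gate
                vol += (mask * frame[None, :]).sum(axis=1)
            return vol

        rng = np.random.default_rng(0)
        vol = backproject(rng.random((5, 64)), np.linspace(5e-9, 25e-9, 5),
                          rng.random((64, 3)), rng.random((1000, 3)) * 2.0)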

  6. Real-time acquisition and display of flow contrast using speckle variance optical coherence tomography in a graphics processing unit.

    PubMed

    Xu, Jing; Wong, Kevin; Jian, Yifan; Sarunic, Marinko V

    2014-02-01

    In this report, we describe a graphics processing unit (GPU)-accelerated processing platform for real-time acquisition and display of flow contrast images with Fourier domain optical coherence tomography (FDOCT) in mouse and human eyes in vivo. Motion contrast from blood flow is processed using the speckle variance OCT (svOCT) technique, which relies on the acquisition of multiple B-scan frames at the same location and tracking the change of the speckle pattern. Real-time mouse and human retinal imaging using two different custom-built OCT systems with processing and display performed on GPU are presented with an in-depth analysis of performance metrics. The display output included structural OCT data, en face projections of the intensity data, and the svOCT en face projections of retinal microvasculature; these results compare projections with and without speckle variance in the different retinal layers to reveal significant contrast improvements. As a demonstration, videos of real-time svOCT for in vivo human and mouse retinal imaging are included in our results. The capability of performing real-time svOCT imaging of the retinal vasculature may be a useful tool in a clinical environment for monitoring disease-related pathological changes in the microcirculation such as diabetic retinopathy.
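
    The speckle variance computation itself is simple to state: acquire N B-scans at the same location and take the per-pixel variance across frames, which is high where flowing blood decorrelates the speckle. A minimal NumPy sketch follows; the paper's contribution is performing this on a GPU at acquisition rate, which the sketch does not attempt.

        # Interframe speckle variance from repeated B-scans (synthetic data).
        import numpy as np

        def speckle_variance(bscans):
            """bscans: (N, depth, width) OCT intensity frames acquired at the
            same tissue location; returns per-pixel variance across frames."""
            return bscans.var(axis=0)

        frames = np.random.rand(8, 512, 400)   # stand-in for 8 repeated B-scans
        sv = speckle_variance(frames)
        en_face = sv.mean(axis=0)              # simple en face projection over depth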

  7. Do climate model predictions agree with long-term precipitation trends in the arid southwestern United States?

    NASA Astrophysics Data System (ADS)

    Elias, E.; Rango, A.; James, D.; Maxwell, C.; Anderson, J.; Abatzoglou, J. T.

    2016-12-01

    Researchers evaluating climate projections across southwestern North America observed a decreasing precipitation trend. Aridification was most pronounced in the cold (non-monsoonal) season, whereas downward trends in precipitation were smaller in the warm (monsoonal) season. In this region, based upon a multimodel mean of 20 Coupled Model Intercomparison Project 5 models using a business-as-usual (Representative Concentration Pathway 8.5) trajectory, midcentury precipitation is projected to increase slightly during the monsoonal time period (July-September; 6%) and decrease slightly during the remainder of the year (October-June; -4%). We use observed long-term (1915-2015) monthly precipitation records from 16 weather stations to investigate how well measured trends corroborate climate model predictions during the monsoonal and non-monsoonal timeframe. Running trend analysis using the Mann-Kendall test for 15 to 101 year moving windows reveals that half the stations showed significant (p≤0.1), albeit small, increasing trends based on the longest term record. Trends based on shorter-term records reveal a period of significant precipitation decline at all stations representing the 1950s drought. Trends from 1930 to 2015 reveal significant annual, monsoonal and non-monsoonal increases in precipitation (Fig 1). The 1960 to 2015 time window shows no significant precipitation trends. The more recent time window (1980 to 2015) shows a slight, but not significant, increase in monsoonal precipitation and a larger, significant decline in non-monsoonal precipitation. GCM precipitation projections are consistent with more recent trends for the region. Running trends from the most recent time window (mid-1990s to 2015) at all stations show increasing monsoonal precipitation and decreasing Oct-Jun precipitation, with significant trends at 6 of 16 stations. Running trend analysis revealed that the long-term trends were not persistent throughout the series length, but depended on the period examined. Recent trends in Southwest precipitation are directionally consistent with anthropogenic climate change.
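
    The running trend analysis described can be sketched in a few lines: compute the Mann-Kendall S statistic and its normal-approximation p-value over moving windows of different lengths. The implementation below omits tie corrections and runs on a synthetic series; it illustrates the test rather than the authors' exact processing.

        # Mann-Kendall trend test in moving windows (no tie correction).
        import numpy as np
        from math import erf, sqrt

        def mann_kendall(x):
            x = np.asarray(x, dtype=float)
            n = len(x)
            s = sum(np.sign(x[j] - x[i])
                    for i in range(n - 1) for j in range(i + 1, n))
            var_s = n * (n - 1) * (2 * n + 5) / 18.0
            z = (s - np.sign(s)) / sqrt(var_s) if s != 0 else 0.0
            p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided
            return s, p

        rng = np.random.default_rng(0)
        precip = rng.gamma(2.0, 150.0, size=101)   # synthetic 101-yr annual series
        for w in (15, 51, 101):
            s, p = mann_kendall(precip[-w:])
            print(f"window={w:3d}  S={s:6.0f}  p={p:.3f}")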

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harben, P E; Harris, D; Myers, S

    Seismic imaging and tracking methods have intelligence and monitoring applications. Current systems, however, do not adequately calibrate or model the unknown geological heterogeneity. Current systems are also not designed for rapid data acquisition and analysis in the field. This project seeks to build the core technological capabilities coupled with innovative deployment, processing, and analysis methodologies to allow seismic methods to be effectively utilized in the applications of seismic imaging and vehicle tracking where rapid (minutes to hours) and real-time analysis is required. The goal of this project is to build capabilities in acquisition system design, utilization and in full 3D finite difference modeling as well as statistical characterization of geological heterogeneity. Such capabilities coupled with a rapid field analysis methodology based on matched field processing are applied to problems associated with surveillance, battlefield management, finding hard and deeply buried targets, and portal monitoring. This project benefits the U.S. military and intelligence community in support of LLNL's national-security mission. FY03 was the final year of this project. In the 2.5 years this project has been active, numerous and varied developments and milestones have been accomplished. A wireless communication module for seismic data was developed to facilitate rapid seismic data acquisition and analysis. The E3D code was enhanced to include topographic effects. Codes were developed to implement the Karhunen-Loeve (K-L) statistical methodology for generating geological heterogeneity that can be utilized in E3D modeling. The matched field processing methodology applied to vehicle tracking and based on a field calibration to characterize geological heterogeneity was tested and successfully demonstrated in a tank tracking experiment at the Nevada Test Site. A 3-seismic-array vehicle tracking testbed was installed on-site at LLNL for testing real-time seismic tracking methods. A field experiment was conducted over a tunnel at the Nevada Site that quantified the tunnel reflection signal and, coupled with modeling, identified key needs and requirements in experimental layout of sensors. A large field experiment was conducted at the Lake Lynn Laboratory, a mine safety research facility in Pennsylvania, over a tunnel complex in realistic, difficult conditions. This experiment gathered the necessary data for a full 3D attempt to apply the methodology. The experiment also collected data to analyze the capabilities to detect and locate in-tunnel explosions for mine safety and other applications.

  9. VE at Scope Time (VEST): Three construction examples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sperling, R.B.

    1991-04-01

    "Value Engineering at Scope Time (VEST)" was published in Value World, January-February-March 1991. That article describes VEST as a four-phase process utilizing the "heart" of VE methodology, which is designed to be used with members of construction design teams to help them focus on the scope of work by doing cost modeling, function analysis, brainstorming and evaluation of ideas. With minimal training, designers, architects and engineers can become energized to find creative design solutions and learn an effective, synergistic team approach to facilities design projects using VEST. If time is available, the team can begin the development of some higher-ranked ideas into preliminary proposals. This paper is an expansion of that article, adding a brief section on training and presenting three examples of VEST on construction projects at a federally funded research laboratory.

  10. Advanced Ground Systems Maintenance Prognostics Project

    NASA Technical Reports Server (NTRS)

    Perotti, Jose M.

    2015-01-01

    The project implements prognostics capabilities to predict when a component, system, or subsystem will no longer meet desired functional or performance criteria, called the end of life. The capability also provides an assessment of the remaining useful life of a hardware component. The project enables the delivery of system health advisories to ground system operators. This project will use modeling techniques and algorithms to assess components' health and predict remaining life for such components. The prognostics capability being developed will be used: during the design phase and during pre/post operations, to conduct planning and analysis of system design, maintenance and logistics plans, and system/mission operations plans; and during real-time operations, to monitor changes to components' health and assess their impact on operations. This capability will be interfaced to Ground Operations' command and control system as a part of the AGSM project to help assure system availability and mission success. The initial modeling effort for this capability will be developed for Liquid Oxygen ground loading applications.
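
    A generic remaining-useful-life calculation of the kind such prognostics perform: fit a degradation trend to a health indicator and extrapolate to a failure threshold. The linear model, threshold, and data below are illustrative assumptions; the summary does not specify the AGSM algorithms.

        # Toy remaining-useful-life estimate from a degrading health indicator.
        import numpy as np

        def remaining_useful_life(t, health, threshold):
            slope, intercept = np.polyfit(t, health, 1)   # linear degradation fit
            if slope >= 0:
                return np.inf                             # no degradation trend seen
            t_fail = (threshold - intercept) / slope      # threshold crossing time
            return max(t_fail - t[-1], 0.0)

        t = np.arange(0.0, 50.0)                          # hours in service
        rng = np.random.default_rng(1)
        health = 1.0 - 0.004 * t + rng.normal(0, 0.01, t.size)
        print(f"RUL ~ {remaining_useful_life(t, health, threshold=0.6):.0f} hours")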

  11. Understanding extreme sea levels for broad-scale coastal impact and adaptation analysis

    NASA Astrophysics Data System (ADS)

    Wahl, T.; Haigh, I. D.; Nicholls, R. J.; Arns, A.; Dangendorf, S.; Hinkel, J.; Slangen, A. B. A.

    2017-07-01

    One of the main consequences of mean sea level rise (SLR) on human settlements is an increase in flood risk due to an increase in the intensity and frequency of extreme sea levels (ESL). While substantial research efforts are directed towards quantifying projections and uncertainties of future global and regional SLR, corresponding uncertainties in contemporary ESL have not been assessed and projections are limited. Here we quantify, for the first time at global scale, the uncertainties in present-day ESL estimates, which have by default been ignored in broad-scale sea-level rise impact assessments to date. ESL uncertainties exceed those from global SLR projections and, assuming that we meet the Paris agreement goals, the projected SLR itself by the end of the century in many regions. Both uncertainties in SLR projections and ESL estimates need to be understood and combined to fully assess potential impacts and adaptation needs.
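
    The abstract does not spell out the estimation method, but a standard route to present-day ESL estimates with uncertainty is to fit an extreme-value distribution to annual-maximum water levels and bootstrap the return level. The sketch below uses a GEV fit on synthetic data; the distribution choice, record length, and confidence level are assumptions for illustration.

        # Bootstrapped 100-year return level from a GEV fit (synthetic record).
        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(2)
        annual_max = genextreme.rvs(c=-0.1, loc=1.0, scale=0.15, size=60,
                                    random_state=rng)     # synthetic annual maxima, m

        def return_level(sample, period=100):
            c, loc, scale = genextreme.fit(sample)
            return genextreme.ppf(1 - 1.0 / period, c, loc=loc, scale=scale)

        boot = [return_level(rng.choice(annual_max, size=annual_max.size))
                for _ in range(500)]                      # resample with replacement
        lo, hi = np.percentile(boot, [5, 95])
        print(f"100-yr ESL: {return_level(annual_max):.2f} m "
              f"(90% CI {lo:.2f}-{hi:.2f} m)")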

  12. A web-based repository of surgical simulator projects.

    PubMed

    Leskovský, Peter; Harders, Matthias; Székely, Gábor

    2006-01-01

    The use of computer-based surgical simulators for training of prospective surgeons has been a topic of research for more than a decade. As a result, a large number of academic projects have been carried out, and a growing number of commercial products are available on the market. Keeping track of all these endeavors for established groups as well as for newly started projects can be quite arduous. Gathering information on existing methods, already traveled research paths, and problems encountered is a time consuming task. To alleviate this situation, we have established a modifiable online repository of existing projects. It contains detailed information about a large number of simulator projects gathered from web pages, papers and personal communication. The database is modifiable (with password protected sections) and also allows for a simple statistical analysis of the collected data. For further information, the surgical repository web page can be found at www.virtualsurgery.vision.ee.ethz.ch.

  13. Project SEARCH UK - Evaluating Its Employment Outcomes.

    PubMed

    Kaehne, Axel

    2016-11-01

    The study reports the findings of an evaluation of Project SEARCH UK. The programme develops internships for young people with intellectual disabilities who are about to leave school or college. The aim of the evaluation was to investigate at what rate Project SEARCH provided employment opportunities to participants. The evaluation obtained data from all sites operational in the UK at the time of evaluation (n = 17) and analysed employment outcomes. Data were available for 315 young people (n = 315) in the programme and pay and other employment related data were available for a subsample. The results of the analysis suggest that Project SEARCH achieves on average employment rates of around 50 per cent. Project SEARCH UK represents a valuable addition to the supported employment provision in the UK. Its unique model should inform discussions around best practice in supported employment. Implications for other supported employment programmes are discussed. © 2015 John Wiley & Sons Ltd.

  14. Criteria for Developing a Successful Privatization Project

    DTIC Science & Technology

    1989-05-01

    conceptualization and planning are required when pursuing privatization projects. In fact, privatization project proponents need to know how to... selection of projects for analysis, methods of acquiring information about these projects, and the analysis framework. Chapter IV includes the analysis. A... performed an analysis to determine common conceptual and creative approaches and lessons learned. This analysis was then used to develop criteria for

  15. Causative factors of cost overrun in highway projects of Sindh province of Pakistan

    NASA Astrophysics Data System (ADS)

    Sohu, S.; Halid, A.; Nagapan, S.; Fattah, A.; Latif, I.; Ullah, K.

    2017-11-01

    Cost overrun is an increase in the cost of a project over the budget approved by the parties at the time of tender. Cost overrun in the construction of highway projects is a common problem worldwide, and the construction industry of Pakistan is also facing this crucial problem in its highway projects. The main objective of this research is to identify the causative factors of cost overrun in highway projects of Sindh province of Pakistan. A well-designed questionnaire was developed based on 64 common factors of cost overrun from the literature review. The developed questionnaire was distributed among 30 selected experts from owners/clients, designers/consultants, and contractors with more than 20 years' experience in highway projects. The collected data were statistically analyzed. The analysis showed that delay in payment by the client, inadequate planning, client interference, poor contract management, delay in decision making, change of project scope, and financial problems faced by the client were the most causative factors of cost overrun in highway projects. This research will alert stakeholders of highway projects in Sindh province so that they can avoid cost overrun in projects.

  16. The Antarctic Ice.

    ERIC Educational Resources Information Center

    Radok, Uwe

    1985-01-01

    The International Antarctic Glaciological Project has collected information on the East Antarctic ice sheet since 1969. Analysis of ice cores revealed climatic history, and radar soundings helped map bedrock of the continent. Computer models of the ice sheet and its changes over time will aid in predicting the future. (DH)

  17. Play Nice Across Time Space

    NASA Technical Reports Server (NTRS)

    Conroy, Michael P.

    2015-01-01

    This lecture is an overview of simulation technologies, methods, and practices, as applied to current and past NASA programs. The focus is on sharing experience and the overall benefits to programs and projects of having appropriate simulation and analysis capabilities available at the correct point in a system lifecycle.

  18. Analysis of change orders in geotechnical engineering work at INDOT : [technical summary].

    DOT National Transportation Integrated Search

    2011-01-01

    There was a perception at INDOT that the number of change orders connected with geotechnical work was excessive, and that, as a consequence, geotechnical projects were not completed on time or within budget. It was reported that INDOT construction pr...

  19. Methods for forecasting freight in uncertainty : time series analysis of multiple factors.

    DOT National Transportation Integrated Search

    2011-01-31

    The main goal of this research was to analyze and more accurately model freight movement in Alabama. Ultimately, the goal of this project was to provide an overall approach to the integration of accurate freight models into transportation plans a...

  20. Developing Legacy: Health Planning in the Host City of Porto Alegre for the 2014 Football World Cup.

    PubMed

    Witt, Regina Rigatto; Kotlhar, Mauro Kruter; Mesquita, Marilise Oliveira; Lima, Maria Alice Dias da Silva; Marin, Sandra Mara; Day, Carolina Baltar; Bandeira, Andrea Goncalves; Hutton, Alison

    2015-12-01

    To describe the process adopted to identify, classify, and evaluate the legacy of health care planning in the host city of Porto Alegre for the Football World Cup 2014. There is an emerging interest in the need to demonstrate a sustainable health legacy from mass gathering investments. Leaving a public health legacy for future host cities and countries is now an important part of planning for these events. The Ministry of Sports initiated and coordinated the development of projects in the host cities to identify actions, projects, and constructions to be developed to prepare for the World Cup. In Porto Alegre, a common structure was developed by the coordinating team to instruct legacy identification, classification, and evaluation. This structure was based on international documentary analysis (including official reports, policy documents, and web-based resources) and direct communication with recognized experts in the field. Sixteen total legacies were identified for health surveillance (5) and health services (11). They were classified according to the strategic area, organizations involved, dimension, typology, planned or unplanned, tangible or intangible, territorial coverage, and situation prior to the World Cup. Possible impacts were then assessed as positive, negative, and potentiating, and mitigating actions were indicated. The project allowed the identification, classification, and development of health legacy, including risk analysis, surveillance, mitigation measures, and provision of emergency medical care. Although the project intended the development of indicators to measure the identified legacies, evaluation was not possible at the time of publication due to time constraints.

  1. RTEMS CENTRE- RTEMS Improvement

    NASA Astrophysics Data System (ADS)

    Silva, Helder; Constantino, Alexandre; Freitas, Daniel; Coutinho, Manuel; Faustino, Sergio; Sousa, Jose; Dias, Luis; Zulianello, Marco

    2010-08-01

    During the last two years, EDISOFT's RTEMS CENTRE team [1], jointly with the European Space Agency and with the support of the worldwide RTEMS community [2], has been developing an activity to facilitate the qualification of the real-time operating system RTEMS (Real-Time Executive for Multiprocessor Systems). This paper intends to give high-level visibility of the progress and the results obtained in the RTEMS Improvement [3] activity. The primary objective [4] of the project is to improve the RTEMS product and its documentation, and to facilitate the qualification of RTEMS for future space missions, taking into consideration the specific operational requirements. The sections below provide a brief overview of the RTEMS operating system and the activities performed in the RTEMS Improvement project, which include the selection of API managers to be qualified, the tailoring process, the requirements analysis, the reverse engineering and design of RTEMS, the quality assurance process, the ISVV activities, the test campaign, the results obtained, the criticality analysis and the facilitation of the qualification process.

  2. Participatory arts programs in residential dementia care: Playing with language differences.

    PubMed

    Swinnen, Aagje; de Medeiros, Kate

    2017-01-01

    This article examines connections between language, identity, and cultural difference in the context of participatory arts in residential dementia care. Specifically, it looks at how language differences become instruments for the language play that characterizes the participatory arts programs, TimeSlips and the Alzheimer's Poetry Project. These are two approaches that are predominantly spoken-word driven. Although people living with dementia experience cognitive decline that affects language, they are linguistic agents capable of participating in ongoing negotiation processes of connection, belonging, and in- and exclusion through language use. The analysis of two ethnographic vignettes, based on extensive fieldwork in the closed wards of two Dutch nursing homes, illustrates how TimeSlips and the Alzheimer's Poetry Project support them in this agency. The theoretical framework of the analysis consists of literature on the linguistic agency of people living with dementia, the notions of the homo ludens (or man the player) and ludic language, as well as linguistic strategies of belonging in relation to place.

  3. Energy development through modeling and territorial intelligence: A participatory decision-making tool for the sustainable development of wind energy projects

    NASA Astrophysics Data System (ADS)

    Vazquez Rascon, Maria de Lourdes

    This thesis focuses on the implementation of a participatory and transparent decision-making tool for wind farm projects. This tool is based on an (argumentative) framework that reflects the value systems of the stakeholders involved in these projects, and it employs two multicriteria methods: multicriteria decision aiding and participatory geographic information systems, making it possible to represent these value systems by criteria and indicators to be evaluated. The stakeholders' value systems allow the inclusion of environmental, economic and socio-cultural aspects of wind energy projects and, thus, a sustainable-development vision of wind projects. This vision is analyzed using the 16 sustainability principles included in Quebec's Sustainable Development Act. Four specific objectives were instrumented to favor a logical completion of the work and to ensure the development of a successful tool: designing a methodology to couple MCDA and participatory GIS, testing the developed methodology in a case study, performing a robustness analysis to address strategic issues, and analyzing the strengths, weaknesses, opportunities and threats of the developed methodology. Achieving the first goal allowed us to obtain a decision-making tool called Territorial Intelligence Modeling for Energy Development (the TIMED approach). The TIMED approach is visually represented by a figure expressing the idea of a co-constructed decision in which all stakeholders are the focus of the methodology. TIMED is composed of four modules: multicriteria decision analysis, participatory geographic information systems, active involvement of the stakeholders, and scientific knowledge/local knowledge. The integration of these four modules allows for the analysis of different implementation scenarios of wind turbines in order to choose the best one, based on a participatory and transparent decision-making process that takes into account stakeholders' concerns. The second objective enabled the testing of TIMED in an ex-post experience of a wind farm in operation since 2006. In this test, 11 people participated, representing four stakeholder categories: the private sector, the public sector, experts and civil society. This test allowed us to analyze the current context in which wind projects are developed in Quebec. The concerns of some stakeholders regarding situations that are not considered in the current context were explored through a third goal, which allowed us to run simulations taking strategic-level assumptions into account. Examples of the strategic level are the communication tools used to approach the host community and the type of park ownership. Finally, the fourth objective, a SWOT analysis with the participation of eight experts, allowed us to verify the extent to which the TIMED approach succeeded in constructing four spaces for participatory decision-making: physical, intellectual, emotional and procedural. From this analysis, 116 strengths, 28 weaknesses, 32 constraints and 54 opportunities were identified.
    Contributions, applications, limitations and extensions of this research include: providing a participatory decision-making methodology that takes into account socio-cultural, environmental and economic variables; holding reflection sessions on a wind farm in operation; the MCDA knowledge acquired by participants involved in testing the proposed methodology; taking into account the physical, intellectual, emotional and procedural spaces needed to articulate a participatory decision; the use of the proposed methodology for renewable energy sources other than wind; the need for an interdisciplinary team to apply the methodology; access to quality data; access to information technologies; the right to public participation; the neutrality of experts; the relationships between experts and non-experts; cultural constraints; improvement of the designed indicators; the implementation of a Web platform for participatory decision-making; and the writing of a manual on the use of the developed methodology. Keywords: wind farm, multicriteria decision, geographic information systems, TIMED approach, sustainable wind energy project development, renewable energy, social participation, robustness concern, SWOT analysis.

  4. Event coincidence analysis for quantifying statistical interrelationships between event time series. On the role of flood events as triggers of epidemic outbreaks

    NASA Astrophysics Data System (ADS)

    Donges, J. F.; Schleussner, C.-F.; Siegmund, J. F.; Donner, R. V.

    2016-05-01

    Studying event time series is a powerful approach for analyzing the dynamics of complex dynamical systems in many fields of science. In this paper, we describe the method of event coincidence analysis to provide a framework for quantifying the strength, directionality and time lag of statistical interrelationships between event series. Event coincidence analysis allows one to formulate and test null hypotheses on the origin of the observed interrelationships, including tests based on Poisson processes or, more generally, stochastic point processes with a prescribed inter-event time distribution and other higher-order properties. Applying the framework to country-level observational data yields evidence that flood events have acted as triggers of epidemic outbreaks globally since the 1950s. Facing projected future changes in the statistics of climatic extreme events, statistical techniques such as event coincidence analysis will be relevant for investigating the impacts of anthropogenic climate change on human societies and ecosystems worldwide.
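
    The core quantity is easy to sketch: a trigger coincidence rate, i.e. the fraction of events in one series (here, epidemic onsets) preceded by at least one event in another series (floods) within a window delta_t, compared against a surrogate null. The window length and the synthetic data below are illustrative, not the paper's country-level analysis.

        # Trigger coincidence rate with a simple surrogate-based null.
        import numpy as np

        def trigger_rate(a_times, b_times, delta_t):
            """Fraction of b events with at least one a event in [t - delta_t, t]."""
            a = np.asarray(a_times)
            hits = sum(np.any((a <= t) & (a >= t - delta_t)) for t in b_times)
            return hits / len(b_times)

        rng = np.random.default_rng(3)
        floods = np.sort(rng.uniform(0, 3650, 40))        # event days over 10 years
        epidemics = np.sort(rng.uniform(0, 3650, 15))
        obs = trigger_rate(floods, epidemics, delta_t=30)
        null = [trigger_rate(np.sort(rng.uniform(0, 3650, 40)), epidemics, 30)
                for _ in range(1000)]                     # uniform surrogates
        print(f"observed rate {obs:.2f}, p ~ {np.mean(np.array(null) >= obs):.3f}")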

  5. Promoting cultural understanding through pediatric clinical dyads: an education research project.

    PubMed

    McDermott-Levy, Ruth; Cantrell, Mary Ann; Reynolds, Kathryn

    2014-11-01

    This project explored the experiences of six undergraduate nursing students, three American nursing students and three nursing students from the Sultanate of Oman, who participated in a faculty-initiated education research project as part of their pediatric clinical practicum. Students were placed in dyads, with one American-born student and one Omani student in each dyad. Omani students also were paired with American nurse preceptors. A transcript-based content analysis was used to analyze data generated from qualitative focus group student interviews and student journals. The analysis generated three themes describing how myths were dispelled, cultural barriers were broken down, and knowledge was gained from another cultural perspective. The nurse preceptors were surveyed at the conclusion of the program. The survey findings suggest that preceptors gained a different cultural perspective on nursing care and were better informed of the Omani students' learning needs. There was, however, an additional investment of preceptor time in meeting the learning needs of international students. Additional faculty time was also required for preparation and during clinical conferencing to address differences in nursing practice between the U.S. and Oman while meeting course learning objectives. Overall, the educational program provided evidence of enhancing American and Omani student cultural competence and Omani student adaptation to the United States. Coupling a domestic student with an international student to form dyads from the beginning of international students' experience could be a significant enhancement to both groups of students' learning experience. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. LIBS: a potential tool for industrial/agricultural waste water analysis

    NASA Astrophysics Data System (ADS)

    Karpate, Tanvi; K. M., Muhammed Shameem; Nayak, Rajesh; V. K., Unnikrishnan; Santhosh, C.

    2016-04-01

    Laser Induced Breakdown Spectroscopy (LIBS) is a multi-elemental analysis technique with various advantages and the ability to detect any element in real time. This technique holds potential for environmental monitoring, and analyses of soil, glass, paint, water, plastic, etc. confirm the robustness of the technique for such applications. Compared to the currently available water quality monitoring methods and techniques, LIBS has several advantages, viz. no need for sample preparation, fast and easy operation, and a chemical-free process. In LIBS, a powerful pulsed laser generates a plasma, which is then analyzed to obtain quantitative and qualitative details of the elements present in the sample. Another main advantage of the LIBS technique is that it can operate in standoff mode for real-time analysis. Water samples from industries and agricultural strata tend to carry many pollutants, making the water harmful for consumption. The emphasis of this project is to determine such harmful pollutants present in trace amounts in industrial and agricultural wastewater. When a high-intensity laser is made incident on the sample, a plasma is generated which gives multielemental emission spectra. LIBS analysis has shown outstanding success for solid samples. For liquid samples, the analysis is challenging, as the liquid may splash under the high laser energy, making it difficult to generate a plasma. This project also deals with determining the most efficient method for testing water samples, both qualitatively and quantitatively, using LIBS.
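
    Interpreting a LIBS spectrum reduces, at its simplest, to finding emission peaks and matching them against known line wavelengths. The sketch below does this on a synthetic spectrum; the line table holds approximate wavelengths for a few contaminants of interest, and the peak heights, tolerance, and noise level are illustrative assumptions.

        # Toy LIBS peak matching against a small emission-line table.
        import numpy as np
        from scipy.signal import find_peaks

        lines = {"Pb": 405.78, "Cd": 508.58, "Cr": 425.43, "Na": 589.00}  # nm, approx.

        wl = np.linspace(200, 800, 6000)                  # wavelength grid, nm
        spectrum = np.random.rand(6000) * 0.05            # background noise
        for center in (405.78, 589.00):                   # synthetic contaminant peaks
            spectrum += np.exp(-((wl - center) / 0.3) ** 2)

        peaks, _ = find_peaks(spectrum, height=0.5, distance=5)
        for p in peaks:
            for elem, line in lines.items():
                if abs(wl[p] - line) < 0.2:
                    print(f"{elem} line at {wl[p]:.2f} nm "
                          f"(intensity {spectrum[p]:.2f})")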

  7. Communication, coordination and cooperation in construction projects: business environment and human behaviours

    NASA Astrophysics Data System (ADS)

    Salah Alaloul, Wesam; Shahir Liew, Mohd; Zawawi, Noor Amila Wan

    2017-12-01

    The accomplishment of construction projects is extremely dependent on the integration of several stakeholders; therefore none of them has the control or the ability to accomplish the project alone. Each of them may influence and be influenced by the project management approach. There is no comprehensive theoretical platform for defining Communication, Coordination and Cooperation (3Cs) in the management of construction project. This paper deliberates the function of the 3Cs different theoretical perceptions. Through an analysis of selected articles from reputable academic journals in construction management, the business environment and human behaviour were identified as two main parts. A little has been done so far about the 3Cs, and how they are correlated with construction projects performance. Therefore, the objective of this paper is to explain the definitions and the association between the 3Cs. There is a significant link between communication and coordination. Coordination alternatively, is trust-based a logic of mutual and exchange. Consequently, cooperation is much more sophisticated, which needing more time and attempts.

  8. Useful Life Prediction for Payload Carrier Hardware

    NASA Technical Reports Server (NTRS)

    Ben-Arieh, David

    2002-01-01

    The Space Shuttle has been identified for use through 2020. Payload carrier systems will be needed to support missions through the same time frame. To support the future decision-making process with reliable systems, it is necessary to analyze design integrity, identify possible sources of undesirable risk, and recognize required upgrades for carrier systems. This project analyzed the information available regarding the carriers and estimated the probability of their becoming obsolete under different scenarios. In addition, this project resulted in a plan for an improved information system that will improve monitoring and control of the various carriers. The information collected throughout this project is presented in this report as process flows, historical records, and statistical analysis.

  9. Advanced engineering environment pilot project.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwegel, Jill; Pomplun, Alan R.; Abernathy, Rusty

    2006-10-01

    The Advanced Engineering Environment (AEE) is a concurrent engineering concept that enables real-time process tooling design and analysis, collaborative process flow development, automated document creation, and full process traceability throughout a product's life cycle. The AEE will enable NNSA's Design and Production Agencies to collaborate through a singular integrated process. Sandia National Laboratories and Parametric Technology Corporation (PTC) are working together on a prototype AEE pilot project to evaluate PTC's product collaboration tools relative to the needs of the NWC. The primary deliverable for the project is a set of validated criteria for defining a complete commercial off-the-shelf (COTS) solution to deploy the AEE across the NWC.

  10. Automated detection of Pi 2 pulsations using wavelet analysis: 1. Method and an application for substorm monitoring

    USGS Publications Warehouse

    Nose, M.; Iyemori, T.; Takeda, M.; Kamei, T.; Milling, D.K.; Orr, D.; Singer, H.J.; Worthington, E.W.; Sumitomo, N.

    1998-01-01

    Wavelet analysis is suitable for investigating waves, such as Pi 2 pulsations, which are limited in both time and frequency. We have developed an algorithm to detect Pi 2 pulsations by wavelet analysis. We tested the algorithm and found that the results of Pi 2 detection are consistent with those obtained by visual inspection. The algorithm is applied in a project which aims at the nowcasting of substorm onsets. In this project we use real-time geomagnetic field data, with a sampling rate of 1 second, obtained at mid- and low-latitude stations (Mineyama in Japan, the York SAMNET station in the U.K., and Boulder in the U.S.). These stations are each separated by about 120° in longitude, so at least one station is on the nightside at all times. We plan to analyze the real-time data at each station using the Pi 2 detection algorithm, and to exchange the detection results among these stations via the Internet. Therefore we can obtain information about substorm onsets in real-time, even if we are on the dayside. We have constructed a system to detect Pi 2 pulsations automatically at Mineyama observatory. The detection results for the period of February to August 1996 showed that the rate of successful detection of Pi 2 pulsations was 83.4% for the nightside (18-06 MLT) and 26.5% for the dayside (06-18 MLT). The detection results near local midnight (20-02 MLT) give a rate of successful detection of 93.2%.
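
    The detection principle can be sketched compactly: convolve the 1-second magnetometer series with wavelets spanning the Pi 2 band (periods of roughly 40-150 s) and flag intervals where band-averaged wavelet power exceeds a quiet-time threshold. The Morlet wavelet, the threshold, and the test signal below are illustrative choices, not the authors' tuned algorithm.

        # Band-limited Morlet wavelet power for Pi 2 detection (synthetic data).
        import numpy as np

        def morlet_band_power(x, dt, periods, w0=6.0):
            """Wavelet power averaged over the given center periods (seconds)."""
            t = np.arange(-300, 301) * dt
            power = np.zeros(len(x))
            for p in periods:
                s = p * w0 / (2 * np.pi)                  # scale for center period p
                wav = (np.exp(1j * w0 * t / s)
                       * np.exp(-t**2 / (2 * s**2)) / np.sqrt(s))
                power += np.abs(np.convolve(x, wav, mode="same")) ** 2
            return power / len(periods)

        rng = np.random.default_rng(4)
        x = rng.normal(0, 0.1, 3600)                      # one hour at 1-s sampling
        burst_t = np.arange(300.0)
        x[1800:2100] += np.sin(2 * np.pi * burst_t / 80.0)  # 80-s Pi 2-like burst
        p = morlet_band_power(x, 1.0, periods=[40, 60, 80, 100, 150])
        onsets = np.flatnonzero(p > 5 * np.median(p))
        print("candidate onset (s):", onsets[0] if onsets.size else None)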

  11. CMS Analysis and Data Reduction with Apache Spark

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gutsche, Oliver; Canali, Luca; Cremer, Illia

    Experimental Particle Physics has been at the forefront of analyzing the world's largest datasets for decades. The HEP community was among the first to develop suitable software and computing tools for this task. In recent times, new toolkits and systems for distributed data processing, collectively called "Big Data" technologies, have emerged from industry and open source projects to support the analysis of Petabyte and Exabyte datasets in industry. While the principles of data analysis in HEP have not changed (filtering and transforming experiment-specific data formats), these new technologies use different approaches and tools, promising a fresh look at analysis of very large datasets that could potentially reduce the time-to-physics with increased interactivity. Moreover, these new tools are typically actively developed by large communities, often profiting from industry resources, and under open source licensing. These factors result in a boost for adoption and maturity of the tools and for the communities supporting them, at the same time helping to reduce the cost of ownership for the end-users. In this talk, we are presenting studies of using Apache Spark for end user data analysis. We are studying the HEP analysis workflow separated into two thrusts: the reduction of centrally produced experiment datasets and the end-analysis up to the publication plot. Studying the first thrust, CMS is working together with CERN openlab and Intel on the CMS Big Data Reduction Facility. The goal is to reduce 1 PB of official CMS data to 1 TB of ntuple output for analysis. We are presenting the progress of this 2-year project with first results of scaling up Spark-based HEP analysis. Studying the second thrust, we are presenting studies on using Apache Spark for a CMS Dark Matter physics search, comparing Spark's feasibility, usability and performance to the ROOT-based analysis.
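
    The reduction thrust maps naturally onto Spark's DataFrame operations: read the centrally produced data, apply a physics selection, and write a slimmer ntuple. The sketch below is illustrative PySpark; the file paths, column names, and the cut are hypothetical, not CMS's actual schema or selection.

        # Illustrative PySpark skim: filter events and keep a few columns.
        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        spark = SparkSession.builder.appName("toy-reduction").getOrCreate()

        events = spark.read.parquet("events.parquet")     # hypothetical input path
        ntuple = (events
                  .filter(F.col("met") > 200)             # example selection cut
                  .select("run", "lumi", "event", "met", "jet_pt"))
        ntuple.write.mode("overwrite").parquet("ntuple.parquet")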

  12. Impact of a rural solar electrification project on the level and structure of women’s empowerment

    NASA Astrophysics Data System (ADS)

    Burney, Jennifer; Alaofè, Halimatou; Naylor, Rosamond; Taren, Douglas

    2017-09-01

    Although development organizations agree that reliable access to energy and energy services—one of the 17 Sustainable Development Goals—is likely to have profound and perhaps disproportionate impacts on women, few studies have directly empirically estimated the impact of energy access on women’s empowerment. This is a result of both a relative dearth of energy access evaluations in general and a lack of clarity on how to quantify gender impacts of development projects. Here we present an evaluation of the impacts of the Solar Market Garden—a distributed photovoltaic irrigation project—on the level and structure of women’s empowerment in Benin, West Africa. We use a quasi-experimental design (matched-pair villages) to estimate changes in empowerment for project beneficiaries after one year of Solar Market Garden production relative to non-beneficiaries in both treatment and comparison villages (n = 771). To create an empowerment metric, we constructed a set of general questions based on existing theories of empowerment, and then used latent variable analysis to understand the underlying structure of empowerment locally. We repeated this analysis at follow-up to understand whether the structure of empowerment had changed over time, and then measured changes in both the levels and likelihood of empowerment over time. We show that the Solar Market Garden significantly positively impacted women’s empowerment, particularly through the domain of economic independence. In addition to providing rigorous evidence for the impact of a rural renewable energy project on women’s empowerment, our work lays out a methodology that can be used in the future to benchmark the gender impacts of energy projects.
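
    The latent variable step can be mimicked in a few lines: fit a factor model to questionnaire items and read off per-respondent factor scores and item loadings. The sketch below uses scikit-learn's FactorAnalysis on synthetic responses; the two-factor structure, the loadings, and the item count are assumptions for illustration, not the study's instrument.

        # Stand-in latent variable analysis of survey items (synthetic data).
        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(5)
        n = 771                                   # respondents, as in the study
        latent = rng.normal(size=(n, 2))          # e.g. economic independence, voice
        loadings = np.array([[0.9, 0.1], [0.8, 0.2], [0.1, 0.9],
                             [0.2, 0.8], [0.5, 0.5]])
        items = latent @ loadings.T + rng.normal(0, 0.5, (n, 5))

        fa = FactorAnalysis(n_components=2, random_state=0)
        scores = fa.fit_transform(items)          # per-respondent factor scores
        print(np.round(fa.components_, 2))        # estimated item loadings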

  13. Interdisciplinary Investigations in Support of Project DI-MOD

    NASA Technical Reports Server (NTRS)

    Starks, Scott A. (Principal Investigator)

    1996-01-01

    Various concepts from time series analysis are used as the basis for the development of algorithms to assist in the analysis and interpretation of remote sensed imagery. An approach to trend detection that is based upon the fractal analysis of power spectrum estimates is presented. Additionally, research was conducted toward the development of a software architecture to support processing tasks associated with databases housing a variety of data. An algorithmic approach which provides for the automation of the state monitoring process is presented.
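
    The report gives only a summary of the fractal approach; one common form is to estimate a spectral exponent beta from a log-log fit to the power spectrum, P(f) ~ f**(-beta), and treat shifts in beta as a trend indicator. A minimal sketch on synthetic series, offered as an interpretation rather than the report's exact algorithm:

        # Spectral exponent (beta) from a log-log periodogram fit.
        import numpy as np

        def spectral_exponent(x):
            x = np.asarray(x, float) - np.mean(x)
            freqs = np.fft.rfftfreq(len(x), d=1.0)[1:]    # drop zero frequency
            power = np.abs(np.fft.rfft(x))[1:] ** 2
            slope, _ = np.polyfit(np.log(freqs), np.log(power), 1)
            return -slope                                 # beta in P(f) ~ f**-beta

        rng = np.random.default_rng(6)
        print(f"white noise beta ~ {spectral_exponent(rng.standard_normal(4096)):.2f}")
        print(f"random walk beta ~ "
              f"{spectral_exponent(np.cumsum(rng.standard_normal(4096))):.2f}")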

  14. The Induction of Chaos in Electronic Circuits Final Report-October 1, 2001

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R.M.Wheat, Jr.

    2003-04-01

    This project, now known by the name "Chaos in Electronic Circuits," was originally tasked as a two-year project to examine various "fault" or "non-normal" operational states of common electronic circuits with some focus on determining the feasibility of exploiting these states. Efforts over the two-year duration of this project have been dominated by the study of the chaotic behavior of electronic circuits. These efforts have included setting up laboratory space and hardware for conducting laboratory tests and experiments, acquiring and developing computer simulation and analysis capabilities, conducting literature surveys, developing test circuitry and computer models to exercise and test our capabilities, and experimenting with and studying the use of RF injection as a means of inducing chaotic behavior in electronics. An extensive array of nonlinear time series analysis tools has been developed and integrated into a package named "After Acquisition" (AA), including capabilities such as Delayed Coordinate Embedding Mapping (DCEM), Time Resolved (3-D) Fourier Transform, and several other phase space re-creation methods. Many computer models have been developed for Spice and for the ATP (Alternative Transients Program), modeling the several working circuits that have been developed for use in the laboratory. And finally, methods of induction of chaos in electronic circuits have been explored.
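
    Of the tools named, delayed coordinate embedding is the easiest to illustrate: reconstruct a phase-space trajectory from a scalar time series by stacking delayed copies of the signal. The delay, embedding dimension, and test signal below are illustrative parameters, not those used in the AA package.

        # Delay-coordinate embedding of a scalar time series.
        import numpy as np

        def delay_embed(x, m=3, tau=10):
            """Return the (len(x) - (m-1)*tau, m) matrix of delay vectors
            [x(t), x(t+tau), ..., x(t+(m-1)*tau)]."""
            x = np.asarray(x)
            n = len(x) - (m - 1) * tau
            return np.column_stack([x[i * tau: i * tau + n] for i in range(m)])

        t = np.linspace(0, 60, 3000)
        x = np.sin(t) + 0.4 * np.sin(2.7 * t)     # stand-in for a circuit output
        traj = delay_embed(x, m=3, tau=25)
        print(traj.shape)                          # (2950, 3)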

  15. Aeroservoelasticity

    NASA Technical Reports Server (NTRS)

    Noll, Thomas E.

    1990-01-01

    The paper describes recent accomplishments and current research projects along four main thrusts in aeroservoelasticity at NASA Langley. One activity focuses on enhancing the modeling and analysis procedures to accurately predict aeroservoelastic interactions. Improvements to the minimum-state method of approximating unsteady aerodynamics are shown to provide precise low-order models for design and simulation tasks. Recent extensions in aerodynamic correction-factor methodology are also described. With respect to analysis procedures, the paper reviews novel enhancements to matched filter theory and random process theory for predicting the critical gust profile and the associated time-correlated gust loads for structural design considerations. Two research projects leading towards improved design capability are also summarized: (1) an integrated structure/control design capability and (2) procedures for obtaining low-order robust digital control laws for aeroelastic applications.

  16. A methodology for the evaluation of program cost and schedule risk for the SEASAT program

    NASA Technical Reports Server (NTRS)

    Abram, P.; Myers, D.

    1976-01-01

    An interactive computerized project management software package (RISKNET) is designed to analyze the effect of the risk involved in each specific activity on the results of the total SEASAT-A program. Both the time and the cost of each distinct activity can be modeled with an uncertainty interval so as to provide the project manager with not only the expected time and cost for the completion of the total program, but also with the expected range of costs corresponding to any desired level of significance. The nature of the SEASAT-A program is described. The capabilities of RISKNET and the implementation plan of a RISKNET analysis for the development of SEASAT-A are presented.
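
    The style of analysis RISKNET performs can be sketched with a Monte Carlo over activity uncertainty intervals: draw each activity's duration and cost from a triangular distribution and accumulate program totals, reading off expected values and percentile ranges. The serial activity chain and all numbers below are invented for illustration; RISKNET itself handles general activity networks.

        # Monte Carlo cost/schedule risk over a toy serial activity chain.
        import numpy as np

        rng = np.random.default_rng(7)
        activities = [  # ((min, mode, max) weeks, (min, mode, max) $M)
            ((4, 6, 10), (1.0, 1.5, 2.5)),
            ((8, 12, 20), (3.0, 4.0, 7.0)),
            ((2, 3, 6), (0.5, 0.8, 1.2)),
        ]
        n = 10_000
        dur = sum(rng.triangular(*d, size=n) for d, _ in activities)
        cost = sum(rng.triangular(*c, size=n) for _, c in activities)
        print(f"duration: mean {dur.mean():.1f} wk, "
              f"90th pct {np.percentile(dur, 90):.1f} wk")
        print(f"cost: mean ${cost.mean():.1f}M, "
              f"90th pct ${np.percentile(cost, 90):.1f}M")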

  17. FAILSAFE Health Management for Embedded Systems

    NASA Technical Reports Server (NTRS)

    Horvath, Gregory A.; Wagner, David A.; Wen, Hui Ying; Barry, Matthew

    2010-01-01

    The FAILSAFE project is developing concepts and prototype implementations for software health management in mission-critical, real-time embedded systems. The project unites features of the industry-standard ARINC 653 Avionics Application Software Standard Interface and JPL's Mission Data System (MDS) technology. The ARINC 653 standard establishes requirements for the services provided by partitioned, real-time operating systems. The MDS technology provides a state analysis method, canonical architecture, and software framework that facilitates the design and implementation of software-intensive complex systems. The MDS technology has been used to provide the health management function for an ARINC 653 application implementation. In particular, the focus is on showing how this combination enables reasoning about, and recovering from, application software problems.

  18. An Empirical Analysis of the Cascade Secret Key Reconciliation Protocol for Quantum Key Distribution

    DTIC Science & Technology

    2011-09-01

    performance with the parity checks within each pass increasing and as a result, the processing time is expected to increase as well. A conclusion is drawn... timely manner has driven efforts to develop new key distribution methods. The most promising method is Quantum Key Distribution (QKD) and is... thank the QKD Project Team for all of the insight and support they provided in such a short time period. Thanks are especially in order for my

  19. Bias and robustness of uncertainty components estimates in transient climate projections

    NASA Astrophysics Data System (ADS)

    Hingray, Benoit; Blanchet, Juliette; Jean-Philippe, Vidal

    2016-04-01

    A critical issue in climate change studies is the estimation of uncertainties in projections along with the contribution of the different uncertainty sources, including scenario uncertainty, the different components of model uncertainty and internal variability. Quantifying the different uncertainty sources in fact raises different problems. For instance, and for the sake of simplicity, an estimate of model uncertainty is classically obtained from the empirical variance of the climate responses obtained for the different modeling chains. These estimates are, however, biased. Another difficulty arises from the limited number of members that are classically available for most modeling chains. In this case, the climate response of one given chain and the effect of its internal variability may be difficult, if not impossible, to separate. The estimates of the scenario uncertainty, model uncertainty and internal variability components are thus likely not to be robust. We explore the importance of the bias and the robustness of the estimates for two classical Analysis of Variance (ANOVA) approaches: a Single Time approach (STANOVA), based only on the data available for the considered projection lead time, and a time series based approach (QEANOVA), which assumes quasi-ergodicity of climate outputs over the whole available climate simulation period (Hingray and Saïd, 2014). We explore both issues for a simple but classical configuration where uncertainties in projections are composed of two single sources: model uncertainty and internal climate variability. The bias in model uncertainty estimates is explored from theoretical expressions of unbiased estimators developed for both ANOVA approaches. The robustness of uncertainty estimates is explored for multiple synthetic ensembles of time series projections generated with Monte Carlo simulations. For both ANOVA approaches, when the empirical variance of climate responses is used to estimate model uncertainty, the bias is always positive. It can be especially high with STANOVA. In the most critical configurations, when the number of members available for each modeling chain is small (< 3) and when internal variability explains most of the total uncertainty variance (75% or more), the overestimation is higher than 100% of the true model uncertainty variance. The bias can be considerably reduced with a time series ANOVA approach, owing to the multiple time steps accounted for. The longer the transient time period used for the analysis, the larger the reduction. When a quasi-ergodic ANOVA approach is applied to decadal data for the whole 1980-2100 period, the bias is reduced by a factor of 2.5 to 20, depending on the projection lead time. In all cases, the bias is likely to be non-negligible for a large number of climate impact studies, resulting in a likely large overestimation of the contribution of model uncertainty to total variance. For both approaches, the robustness of all uncertainty estimates is higher when more members are available, when internal variability is smaller and/or when the response-to-uncertainty ratio is higher. QEANOVA estimates are much more robust than STANOVA ones: QEANOVA simulated confidence intervals are roughly 3 to 5 times smaller than STANOVA ones. Except for STANOVA when fewer than 3 members are available, the robustness is rather high for total uncertainty and moderate for internal variability estimates.
For model uncertainty or response-to-uncertainty ratio estimates, the robustness is conversely low for QEANOVA and very low for STANOVA. In the most critical configurations (small number of members, large internal variability), large over- or underestimation of uncertainty components is thus very likely. To propose relevant uncertainty analyses and avoid misleading interpretations, estimates of uncertainty components should therefore be bias-corrected and ideally come with estimates of their robustness. This work is part of the COMPLEX Project (European Collaborative Project FP7-ENV-2012 number: 308601; http://www.complex.ac.uk/). Hingray, B., Saïd, M., 2014. Partitioning internal variability and model uncertainty components in a multimodel multireplicate ensemble of climate projections. J. Climate. doi:10.1175/JCLI-D-13-00629.1. Hingray, B., Blanchet, J. (revision) Unbiased estimators for uncertainty components in transient climate projections. J. Climate. Hingray, B., Blanchet, J., Vidal, J.P. (revision) Robustness of uncertainty components estimates in climate projections. J. Climate.
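
    The bias discussed for the single-time approach is easy to reproduce on synthetic data: with a few members per chain, the naive variance of chain means overshoots the true model uncertainty by roughly the internal variance divided by the number of members, and subtracting that term (the textbook random-effects correction) removes most of the bias. The estimator below is the standard one, shown for illustration, and is not necessarily the exact STANOVA/QEANOVA formulation.

        # Single-time ANOVA partition on a synthetic multi-chain ensemble.
        import numpy as np

        rng = np.random.default_rng(8)
        n_chains, n_members = 6, 3
        true_model_sd, internal_sd = 0.3, 0.8
        responses = rng.normal(0, true_model_sd, n_chains)   # chain climate responses
        ensemble = responses[:, None] + rng.normal(0, internal_sd,
                                                   (n_chains, n_members))

        chain_means = ensemble.mean(axis=1)
        internal_var = ensemble.var(axis=1, ddof=1).mean()   # internal variability
        naive_model_var = chain_means.var(ddof=1)            # biased upward
        unbiased_model_var = naive_model_var - internal_var / n_members
        print(f"internal {internal_var:.2f}, naive model {naive_model_var:.2f}, "
              f"unbiased model {unbiased_model_var:.2f} (true {true_model_sd**2:.2f})")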

  20. Analysis of Variables to Predict First Year Persistence Using Logistic Regression Analysis at the University of South Florida: Model v2.0

    ERIC Educational Resources Information Center

    Herreid, Charlene H.; Miller, Thomas E.

    2009-01-01

    This article is the fourth in a series of articles describing an attrition prediction and intervention project at the University of South Florida (USF) in Tampa. In this article, the researchers describe the updated version of the prediction model. The original model was developed from a sample of about 900 First Time in College (FTIC) students…

  1. Monomolecular Silane Coatings on Magnesium/Aluminium Fuels

    DTIC Science & Technology

    1991-07-01

    SUMMARY: The aim of this project was to investigate the curing reaction between CTBN and magnesium/aluminium alloy surfaces. A dispersion of... performing rheological experiments with these coated magnesium particles and CTBN. Surface analysis of the alloys shows a high percentage of magnesium... Rheological analysis of these alloys dispersed 40% w/w in CTBN shows increasing rates of change in viscosity with time for each alloy with increasing nominal

  2. Projection-Based Reduced Order Modeling for Spacecraft Thermal Analysis

    NASA Technical Reports Server (NTRS)

    Qian, Jing; Wang, Yi; Song, Hongjun; Pant, Kapil; Peabody, Hume; Ku, Jentung; Butler, Charles D.

    2015-01-01

    This paper presents a mathematically rigorous, subspace projection-based reduced order modeling (ROM) methodology and an integrated framework to automatically generate reduced order models for spacecraft thermal analysis. Two key steps in the reduced order modeling procedure are described: (1) the acquisition of a full-scale spacecraft model in the ordinary differential equation (ODE) and differential algebraic equation (DAE) form to resolve its dynamic thermal behavior; and (2) the ROM to markedly reduce the dimension of the full-scale model. Specifically, proper orthogonal decomposition (POD) in conjunction with discrete empirical interpolation method (DEIM) and trajectory piece-wise linear (TPWL) methods are developed to address the strong nonlinear thermal effects due to coupled conductive and radiative heat transfer in the spacecraft environment. Case studies using NASA-relevant satellite models are undertaken to verify the capability and to assess the computational performance of the ROM technique in terms of speed-up and error relative to the full-scale model. ROM exhibits excellent agreement in spatiotemporal thermal profiles (<0.5% relative error in pertinent time scales) along with salient computational acceleration (up to two orders of magnitude speed-up) over the full-scale analysis. These findings establish the feasibility of ROM to perform rational and computationally affordable thermal analysis, develop reliable thermal control strategies for spacecraft, and greatly reduce the development cycle times and costs.
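
    The core projection step can be illustrated compactly. The sketch below builds a POD basis from snapshots of a small synthetic linear system and integrates the Galerkin-projected reduced model; the spacecraft-specific operators, DEIM, and TPWL treatment of nonlinear radiation are beyond this toy example.

      # POD-based model order reduction for a linear thermal-style system
      # dT/dt = A T + b. The matrices are synthetic stand-ins, not the
      # paper's spacecraft model.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 200                                   # full-order dimension
      A = -np.diag(rng.uniform(0.5, 2.0, n))    # stable "conduction" operator
      b = rng.uniform(0.0, 1.0, n)              # heat load

      # Collect snapshots of the full model with explicit Euler
      dt, steps = 0.01, 500
      T = np.zeros(n)
      snaps = []
      for _ in range(steps):
          T = T + dt * (A @ T + b)
          snaps.append(T.copy())
      X = np.array(snaps).T                     # n x steps snapshot matrix

      # POD basis: leading left singular vectors of the snapshot matrix
      U, s, _ = np.linalg.svd(X, full_matrices=False)
      k = 10
      V = U[:, :k]                              # n x k projection basis

      # Galerkin-projected reduced operators, then integrate in k dimensions
      Ar, br = V.T @ A @ V, V.T @ b
      q = np.zeros(k)
      for _ in range(steps):
          q = q + dt * (Ar @ q + br)
      print("relative error:", np.linalg.norm(V @ q - T) / np.linalg.norm(T))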

  3. Making real options really work.

    PubMed

    van Putten, Alexander B; MacMillan, Ian C

    2004-12-01

    As a way to value growth opportunities, real options have had a difficult time catching on with managers. Many CFOs believe the method ensures the overvaluation of risky projects. This concern is legitimate, but abandoning real options as a valuation model isn't the solution. Companies that rely solely on discounted cash flow (DCF) analysis underestimate the value of their projects and may fail to invest enough in uncertain but highly promising opportunities. CFOs need not--and should not--choose one approach over the other. Far from being a replacement for DCF analysis, real options are an essential complement, and a project's total value should encompass both. DCF captures a base estimate of value; real options take into account the potential for big gains. This is not to say that there aren't problems with real options. As currently applied, they focus almost exclusively on the risks associated with revenues, ignoring the risks associated with a project's costs. It's also true that option valuations almost always ignore assets that an initial investment in a subsequently abandoned project will often leave the company. In this article, the authors present a simple formula for combining DCF and option valuations that addresses these two problems. Using an integrated approach, managers will, in the long run, select better projects than their more timid competitors while keeping risk under control. Thus, they will outperform their rivals in both the product and the capital markets.
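
    The additive logic (project value = DCF value plus option value) can be illustrated with a generic Black-Scholes call standing in for the growth option. This is a textbook-style sketch, not the authors' adjusted formula; all numbers are invented.

      # Project value as DCF plus a real-option component (generic illustration).
      from math import log, sqrt, exp, erf

      def norm_cdf(x):
          return 0.5 * (1.0 + erf(x / sqrt(2.0)))

      def bs_call(S, K, r, sigma, T):
          # Standard Black-Scholes call on the follow-on opportunity
          d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
          d2 = d1 - sigma * sqrt(T)
          return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

      dcf_value = -2.0   # base NPV of phase 1 (in $M): negative on its own
      option = bs_call(S=30.0, K=40.0, r=0.05, sigma=0.4, T=3.0)
      print(f"total = {dcf_value + option:.1f} $M")  # the option can flip the decision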

  4. Improved work zone design guidelines and enhanced model of travel delays in work zones : Phase I, portability and scalability of interarrival and service time probability distribution functions for different locations in Ohio and the establishment of impr

    DOT National Transportation Integrated Search

    2006-01-01

    The project focuses on two major issues - the improvement of current work zone design practices and an analysis of vehicle interarrival time (IAT) and speed distributions for the development of a digital computer simulation model for queues and t...
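
    As a hint of the distribution-fitting work described, the sketch below estimates an exponential interarrival-time model from synthetic headway data and checks its tail; the actual project fits and compares distributions from Ohio field data.

      # Fit an exponential IAT model to observed headways (synthetic here).
      import numpy as np

      rng = np.random.default_rng(2)
      iats = rng.exponential(scale=4.0, size=1000)   # headways (s), synthetic

      lam = 1.0 / iats.mean()                        # MLE of exponential rate
      print(f"estimated rate: {lam:.3f} veh/s, mean headway: {1/lam:.1f} s")

      # Quick fit check: empirical vs model survival at a few headways
      for t in (2.0, 5.0, 10.0):
          print(t, (iats > t).mean(), np.exp(-lam * t))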

  5. Pro-sustainability choices and child deaths averted: from project experience to investment strategy.

    PubMed

    Sarriot, Eric G; Swedberg, Eric A; Ricca, James G

    2011-05-01

    The pursuit of the Millennium Development Goals and advancing the 'global health agenda' demand the achievement of health impact at scale through efficient investments. We have previously offered that sustainability-a necessary condition for successful expansion of programmes-can be addressed in practical terms. Based on benchmarks from actual child survival projects, we assess the expected impact of translating pro-sustainability choices into investment strategies. We review the experience of Save the Children US in Guinea in terms of investment, approach to sustainability and impact. It offers three benchmarks for impact: Entry project (21 lives saved of children under age five per US$100 000), Expansion project (37 LS/US$100k), and Continuation project (100 LS/US$100k). Extrapolating this experience, we model the impact of a traditional investment scenario against a pro-sustainability scenario and compare the deaths averted per dollar spent over five project cycles. The impact per dollar spent on a pro-sustainability strategy is 3.4 times that of a traditional one over the long run (range from 2.2 to 5.7 times in a sensitivity analysis). This large efficiency differential between two investment approaches offers a testable hypothesis for large-scale/long-term studies. The 'bang for the buck' of health programmes could be greatly increased by following a pro-sustainability investment strategy.
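
    The reported 3.4x ratio can be reproduced under one plausible reading of the benchmarks: the traditional scenario repeats Entry-type projects for five cycles, while the pro-sustainability scenario progresses from Entry through Expansion to Continuation. The cycle structure below is our assumption; only the per-cycle benchmarks come from the abstract.

      # Back-of-envelope reconstruction of the 3.4x figure.
      entry, expansion, continuation = 21, 37, 100   # lives saved per US$100k

      traditional = [entry] * 5
      pro_sustainability = [entry, expansion, continuation, continuation, continuation]

      ratio = sum(pro_sustainability) / sum(traditional)
      print(f"impact ratio: {ratio:.1f}x")           # ~3.4, matching the abstract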

  6. Management and analysis of Michigan intelligent transportation systems center data with application to the Detroit area I-75 corridor.

    DOT National Transportation Integrated Search

    2011-02-01

    An understanding of traffic flow in time and space is fundamental to the development of strategies for the efficient use of the existing transportation infrastructure in large metropolitan areas. Thus, this project involved developing the methods...

  7. MIDAS Website. Revised

    NASA Technical Reports Server (NTRS)

    Goodman, Allen; Shively, R. Joy (Technical Monitor)

    1997-01-01

    MIDAS, Man-machine Integration Design and Analysis System, is a unique combination of software tools aimed at reducing design cycle time, supporting quantitative predictions of human-system effectiveness and improving the design of crew stations and their associated operating procedures. This project is supported jointly by the US Army and NASA.

  8. Signal optimization and analysis using PASSER V-07 : training workshop: code IPR006.

    DOT National Transportation Integrated Search

    2011-01-01

    The objective of this project was to conduct one pilot workshop and five regular workshops to teach the effective use of the enhanced PASSER V-07 arterial signal timing optimization software. PASSER V-07 and materials for conducting a one-day trainin...

  9. Promoting energy efficiency through improved electricity pricing: A mid-project report

    NASA Astrophysics Data System (ADS)

    Action, J. P.; Kohler, D. F.; Mitchell, B. M.; Park, R. E.

    1982-03-01

    Five related areas of electricity demand analysis under alternative rate forms were studied. Adjustments by large commercial and industrial customers were examined, as was residential demand under time-of-day (TOD) pricing. A methodology for evaluating alternative rate structures was developed and applied.

  10. Climate change impact on streamflow in large-scale river basins: projections and their uncertainties sourced from GCMs and RCP scenarios

    NASA Astrophysics Data System (ADS)

    Nasonova, Olga N.; Gusev, Yeugeniy M.; Kovalev, Evgeny E.; Ayzel, Georgy V.

    2018-06-01

    Climate change impact on river runoff was investigated within the framework of the second phase of the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP2) using a physically-based land surface model, Soil Water - Atmosphere - Plants (SWAP) (developed in the Institute of Water Problems of the Russian Academy of Sciences), and meteorological projections (for 2006-2099) simulated by five General Circulation Models (GCMs) (GFDL-ESM2M, HadGEM2-ES, IPSL-CM5A-LR, MIROC-ESM-CHEM, and NorESM1-M) for each of four Representative Concentration Pathway (RCP) scenarios (RCP2.6, RCP4.5, RCP6.0, and RCP8.5). Eleven large-scale river basins were used in this study. First, SWAP was calibrated and validated against monthly values of measured river runoff using forcing data from the WATCH data set, and all GCM projections were bias-corrected to the WATCH data. Then, for each basin, 20 projections of possible changes in river runoff during the 21st century were simulated by SWAP. Analysis of the obtained hydrological projections allowed us to estimate their uncertainties resulting from the application of different GCMs and RCP scenarios. On average, the contribution of different GCMs to the uncertainty of projected river runoff is nearly twice as large as the contribution of RCP scenarios. At the same time, the contribution of GCMs slightly decreases with time.
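
    The GCM-versus-RCP comparison amounts to partitioning the spread of projected runoff changes across a 5 x 4 GCM-by-RCP matrix of simulations. The sketch below does this with a simple main-effects decomposition on synthetic numbers (not SWAP output).

      # Compare GCM vs RCP contributions to projection spread (synthetic data).
      import numpy as np

      rng = np.random.default_rng(3)
      gcm_effect = rng.normal(0, 10, (5, 1))     # spread across GCMs (%)
      rcp_effect = rng.normal(0, 7, (1, 4))      # spread across RCPs (%)
      runoff_change = 5 + gcm_effect + rcp_effect + rng.normal(0, 2, (5, 4))

      var_gcm = runoff_change.mean(axis=1).var(ddof=1)  # variance across GCM means
      var_rcp = runoff_change.mean(axis=0).var(ddof=1)  # variance across RCP means
      print(f"GCM/RCP variance ratio: {var_gcm / var_rcp:.1f}")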

  11. In Vivo Time-Lapse Imaging in the Zebrafish Lateral Line: A Flexible, Open-Ended Research Project for an Undergraduate Neurobiology Laboratory Course.

    PubMed

    Marra, Molly H; Tobias, Zachary J C; Cohen, Hannah R; Glover, Greta; Weissman, Tamily A

    2015-01-01

    The lateral line sensory system in fish detects movements in the water and allows fish to respond to predators, prey, and other stimuli. As the lateral line forms in the first two days of zebrafish development, axons extend caudally along the lateral surface of the fish, eventually forming synapses with hair cells of neuromasts. Growing lateral line axons are located superficially under the skin and can be labeled in living zebrafish using fluorescent protein expression. This system provides a relatively straightforward approach for in vivo time-lapse imaging of neuronal development in an undergraduate setting. Here we describe an upper-level neurobiology laboratory module in which students investigate aspects of axonal development in the zebrafish lateral line system. Students learn to handle and image living fish, collect time-lapse videos of moving mitochondria, and quantitatively measure mitochondrial dynamics by generating and analyzing kymographs of their movements. Energy demands may differ between axons with extending growth cones versus axons that have already reached their targets and are forming synapses. Since relatively little is known about this process in developing lateral line axons, students generate and test their own hypotheses regarding how mitochondrial dynamics may differ at two different time points in axonal development. Students also learn to incorporate into their analysis a powerful yet accessible quantitative tool, the kymograph, which is used to graph movement over time. After students measure and quantify dynamics in living fish at 1 and 2 days post fertilization, this module extends into independent projects, in which students can expand their studies in a number of different, inquiry-driven directions. The project can also be pared down for courses that wish to focus solely on the quantitative analysis (without fish handling), or vice versa. This research module provides a useful approach for the design of open-ended laboratory research projects that integrate the scientific process into undergraduate Biology courses, as encouraged by the AAAS and NSF Vision and Change Initiative.
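
    The kymograph analysis the students perform can be sketched in a few lines: each row of a kymograph is the intensity profile along the axon at one time point, so a moving mitochondrion traces a sloped streak whose slope is its velocity. The movie below is synthetic; real input would be a line-scan stack from the microscope.

      # Build a kymograph from a synthetic time-lapse line scan and estimate
      # the velocity of one moving mitochondrion from the streak's slope.
      import numpy as np

      n_t, n_x = 60, 200                    # frames, pixels along the axon
      kymo = np.zeros((n_t, n_x))
      for t in range(n_t):
          pos = int(20 + 1.5 * t)           # mitochondrion moving 1.5 px/frame
          kymo[t, max(0, pos - 2):pos + 3] = 1.0

      # Velocity estimate from the brightest pixel per frame (px/frame)
      track = kymo.argmax(axis=1)
      velocity = np.polyfit(np.arange(n_t), track, 1)[0]
      print(f"estimated velocity: {velocity:.2f} px/frame")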

  12. Using operations research to plan improvement of the transport of critically ill patients.

    PubMed

    Chen, Jing; Awasthi, Anjali; Shechter, Steven; Atkins, Derek; Lemke, Linda; Fisher, Les; Dodek, Peter

    2013-01-01

    Operations research is the application of mathematical modeling, statistical analysis, and mathematical optimization to understand and improve processes in organizations. The objective of this study was to illustrate how the methods of operations research can be used to identify opportunities to reduce the absolute value and variability of interfacility transport intervals for critically ill patients. After linking data from two patient transport organizations in British Columbia, Canada, for all critical care transports during the calendar year 2006, the steps for transfer of critically ill patients were tabulated into a series of time intervals. Statistical modeling, root-cause analysis, Monte Carlo simulation, and sensitivity analysis were used to test the effect of changes in component intervals on the overall duration and variation of transport times. Based on quality improvement principles, we focused on reducing the 75th percentile and standard deviation of these intervals. We analyzed a total of 3808 ground and air transports. Constraining the time spent by transport personnel at sending and receiving hospitals was projected to reduce the total time taken by 33 minutes, with as much as a 20% reduction in the standard deviation of these transport intervals in 75% of ground transfers. Enforcing a policy of requiring acceptance of patients who have life- or limb-threatening conditions or organ failure was projected to reduce the standard deviation of air transport time by 63 minutes and the standard deviation of ground transport time by 68 minutes. Based on findings from our analyses, we developed recommendations for technology renovation, personnel training, system improvement, and policy enforcement. Use of the tools of operations research identifies opportunities for improvement in a complex system of critical care transport.
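
    The Monte Carlo step can be illustrated as follows: total transport time is simulated as a sum of component intervals, and a candidate policy (capping time at hospitals) is compared against the baseline on the 75th percentile and standard deviation. All distributions and parameters below are invented for illustration, not fitted to the British Columbia data.

      # Monte Carlo comparison of a baseline transport process vs one policy change.
      import numpy as np

      rng = np.random.default_rng(4)
      n = 10_000
      dispatch = rng.lognormal(mean=3.0, sigma=0.4, size=n)     # minutes
      travel = rng.lognormal(mean=3.6, sigma=0.3, size=n)
      at_hospital = rng.lognormal(mean=3.4, sigma=0.6, size=n)

      def summarize(total, label):
          print(f"{label}: p75 = {np.percentile(total, 75):.0f} min, "
                f"sd = {total.std():.0f} min")

      summarize(dispatch + travel + at_hospital, "baseline")
      summarize(dispatch + travel + np.minimum(at_hospital, 45), "capped at 45 min")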

  13. Testing Software Development Project Productivity Model

    NASA Astrophysics Data System (ADS)

    Lipkin, Ilya

    Software development is an increasingly influential factor in today's business environment, and a major issue affecting software development is how an organization estimates projects. If the organization underestimates cost, schedule, and quality requirements, the end results will not meet customer needs. On the other hand, if the organization overestimates these criteria, resources that could have been used more profitably will be wasted. There is no accurate model or measure available to guide an organization in estimating software development, and existing estimation models often underestimate software development efforts by as much as 500 to 600 percent. To address this issue, existing models are usually calibrated using local data with a small sample size, with the resulting estimates not offering improved cost analysis. This study presents a conceptual model for accurately estimating software development, based on an extensive literature review and theoretical analysis grounded in Sociotechnical Systems (STS) theory. The conceptual model serves as a solution to bridge organizational and technological factors and is validated using an empirical dataset provided by the DoD. The practical implications of this study allow practitioners to concentrate on the specific constructs of interest that provide the best value for the least amount of time. This study outlines the key contributing constructs that are unique for Software Size E-SLOC, Man-hours Spent, and Quality of the Product, those constructs having the largest contribution to project productivity. This study discusses customer characteristics and provides a framework for a simplified project analysis for source selection evaluation and audit task reviews for customers and suppliers. The theoretical contributions of this study provide an initial theory-based hypothesized project productivity model that can be used as a generic overall model across several application domains, such as IT, Command and Control, and Simulation. This research validates findings from previous work concerning software project productivity and leverages those results in this study. The hypothesized project productivity model provides statistical support and validation of expert opinions used by practitioners in the field of software project estimation.

  14. Phenotype analysis of congenital and neurodevelopmental disorders in the next generation sequencing era.

    PubMed

    Carey, John C

    2017-09-01

    The designation, phenotype, was proposed as a term by Wilhelm Johannsen in 1909. The word is derived from the Greek, phano (showing) and typo (type), phanotypos. Phenotype has become a widely recognized term, even outside of the genetics community, in recent years with the ongoing identification of human disease genes. The term has been defined as the observable constitution of an organism, but sometimes refers to a condition when a person has a particular clinical presentation. Analysis of phenotype is a timely theme because advances in the understanding of the genetic basis of human disease and the emergence of next generation sequencing have spurred a renewed interest in phenotype and the proposal to establish a "Human Phenome Project." This article summarizes the principles of phenotype analysis that are important in medical genetics and describes approaches to comprehensive phenotype analysis in the investigation of patients with human disorders. I discuss the various elements related to disease phenotypes and highlight neurofibromatosis type 1 and the Elements of Morphology Project as illustrations of the principles. In recent years, the notion of "deep phenotyping" has emerged. Currently there are now a number of proposed strategies and resources to approach this concept. Not since the 1960s and 1970s has there been such an exciting time in the history of medicine surrounding the analysis of phenotype in genetic disorders. © 2017 Wiley Periodicals, Inc.

  15. An Integrated Analysis-Test Approach

    NASA Technical Reports Server (NTRS)

    Kaufman, Daniel

    2003-01-01

    This viewgraph presentation provides an overview of a project to develop a computer program that integrates data analysis and test procedures. The software application aims to propose a new perspective on traditional mechanical analysis and test procedures and to integrate pre-test and test analysis calculation methods. The program should also be usable on portable devices, allowing 'quasi-real time' analysis of data sent by electronic means. Test methods reviewed during this presentation include: shaker swept sine and random tests, shaker shock mode tests, shaker base-driven modal survey tests, and acoustic tests.

  16. Community capacity as an "inside job": evolution of perceived ownership within a university-aboriginal community partnership.

    PubMed

    Cargo, Margaret D; Delormier, Treena; Lévesque, Lucie; McComber, Alex M; Macaulay, Ann C

    2011-01-01

    PURPOSE. To assess the evolution of perceived ownership of a university-Aboriginal community partnership across three project stages. DESIGN. Survey administration to project partners during project formalization (1996-T1), mobilization (1999-T2), and maintenance (2004-T3). SETTING. Aboriginal community of Kahnawake, outside Montreal, Quebec, Canada. PARTICIPANTS. Partners involved in influencing decision making in the Kahnawake Schools Diabetes Prevention Project (KSDPP). MEASURE AND ANALYSIS. A measure of perceived primary ownership subjected to linear trend analysis. RESULTS. KSDPP staff were perceived as the primary owner at T1 and shared ownership with Community Advisory Board (CAB) members at T2 and T3. Trend tests indicated greater perceived CAB ownership between T1 and T3 (χ²(1) = 12.3, p < .0001) and declining KSDPP staff ownership (χ²(1) = 10.5, p < .001) over time. Academic partners were never perceived as primary owners. CONCLUSION. This project was community driven from the beginning. It was not dependent on an external academic change agent to activate the community and develop the community's capacity to plan and implement a solution. It still took several years for the grassroots CAB to take responsibility from KSDPP staff, thus indicating the need for sustained funding to build grassroots community capacity.

  17. Assessment of Evidence Base from Medical Debriefs Data on Space Motion Sickness Incidence and Treatment

    NASA Technical Reports Server (NTRS)

    Younker, D.R.; Daniels, V.R.; Boyd, J.L.; Putcha, L.

    2008-01-01

    An objective of this data compilation and analysis project is to examine the incidence and treatment efficacy of common pathophysiological disturbances during spaceflight. Analysis of medical debriefs data indicated that astronauts used medications to alleviate symptoms of four major ailments: sleep disturbances, space motion sickness (SMS), pain (headache, back pain), and sinus congestion. In the present data compilation and analysis project on SMS treatment during space missions, subject demographics (gender, age, first-time or repeat flyer), incidence and severity of SMS symptoms, and subjective treatment efficacy were examined from 317 crewmember debrief records from STS-1 through STS-89. Preliminary analysis of the data revealed that 50% of crew members reported SMS symptoms on at least one flight and 22% never experienced them. In addition, there were 387 medication dosing episodes reported, and promethazine was the most commonly used medication. Results of the analysis of symptom checklists, medication use/efficacy, and gender and flight record differences in incidence and treatment efficacy will be presented. Evidence gaps for treatment efficacy along with medication use trend analysis will be identified.

  18. Bridging clinical researcher perceptions and health IT realities: A case study of stakeholder creep.

    PubMed

    Panyard, Daniel J; Ramly, Edmond; Dean, Shannon M; Bartels, Christie M

    2018-02-01

    We present a case report detailing a challenge in health information technology (HIT) project implementations we term "stakeholder creep": not thoroughly identifying which stakeholders need to be involved and why before starting a project, consequently not understanding the true effort, skill sets, social capital, and time required to complete the project. A root cause analysis was performed post-implementation to understand what led to stakeholder creep. HIT project stakeholders were given a questionnaire to comment on these misconceptions and a proposed implementation tool to help mitigate stakeholder creep. Stakeholder creep contributed to an unexpected increase in time (3-month delayed go-live) and effort (68% over expected HIT work hours). Four main clinician/researcher misconceptions were identified that contributed to the development of stakeholder creep: 1) that EHR IT is a single group; 2) that all EHR IT members know the entire EHR functionality; 3) that changes to an EHR need the input of just a single EHR IT member; and 4) that the technological complexity of a project mirrors the clinical complexity. HIT project stakeholders similarly perceived clinicians/researchers to hold these misconceptions. The proposed stakeholder planning tool was perceived to be feasible and helpful. Stakeholder creep can negatively affect HIT project implementations. Projects may be susceptible to stakeholder creep when clinicians/researchers hold misconceptions related to HIT organization and processes. Implementation tools, such as the proposed stakeholder checklist, could be helpful in preempting and mitigating the effect of stakeholder creep. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Exploring biomedical ontology mappings with graph theory methods.

    PubMed

    Kocbek, Simon; Kim, Jin-Dong

    2017-01-01

    In the era of the semantic web, life science ontologies play an important role in tasks such as annotating biological objects, linking relevant data pieces, and verifying data consistency. Understanding ontology structures and overlapping ontologies is essential for tasks such as ontology reuse and development. We present an exploratory study in which we examine structure and look for patterns in BioPortal, a comprehensive publicly available repository of life science ontologies. We report an analysis of biomedical ontology mapping data over time. We apply graph theory methods such as Modularity Analysis and Betweenness Centrality to analyse data gathered at five different time points. We identify communities, i.e., sets of overlapping ontologies, and define similar and closest communities. We demonstrate the evolution of identified communities over time and identify the core ontologies of the closest communities. We use BioPortal project and category data to measure community coherence. We also validate identified communities through their mutual mentions in the scientific literature. By comparing mapping data gathered at five different time points, we identified similar and closest communities of overlapping ontologies and demonstrated the evolution of communities over time. Results showed that anatomy and health ontologies tend to form more isolated communities compared to other categories. We also showed that communities contain all or the majority of ontologies being used in narrower projects. In addition, we identified major changes in mapping data after the migration to BioPortal Version 4.
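
    The graph-theoretic core of the analysis can be sketched with networkx: ontologies as nodes, mappings as weighted edges, communities from modularity optimization, and betweenness centrality to find bridging ontologies. The toy graph below is illustrative; the study uses BioPortal mapping counts.

      # Community detection and betweenness on a toy ontology-mapping graph.
      import networkx as nx
      from networkx.algorithms import community

      G = nx.Graph()
      G.add_weighted_edges_from([
          ("GO", "PRO", 120), ("GO", "ChEBI", 80), ("PRO", "ChEBI", 60),
          ("FMA", "UBERON", 200), ("UBERON", "ZFA", 90), ("FMA", "ZFA", 40),
          ("SNOMED", "ICD10", 150), ("SNOMED", "MeSH", 70),
      ])

      # Sets of overlapping ontologies via greedy modularity maximization
      communities = community.greedy_modularity_communities(G, weight="weight")
      print([sorted(c) for c in communities])

      # Which ontology bridges the most shortest paths?
      centrality = nx.betweenness_centrality(G)
      print(max(centrality, key=centrality.get), "is the most 'between' ontology")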

  20. Instrumentation for Aerosol and Gas Speciation

    NASA Technical Reports Server (NTRS)

    Coggiola, Michael J.

    1998-01-01

    Using support from NASA Grant No. NAG 2-963, SRI International successfully completed the project entitled 'Instrumentation for Aerosol and Gas Speciation.' This effort (SRI Project 7383) covered the design, fabrication, testing, and deployment of a real-time aerosol speciation instrument in NASA's DC-8 aircraft during the Spring 1996 SUbsonic aircraft: Contrail and Cloud Effects Special Study (SUCCESS) mission. This final technical report describes the pertinent details of the instrument design, its capabilities, its deployment during SUCCESS, the data acquired from the mission, and the post-mission calibration, data reduction, and analysis.

  1. Abstract - Cooperative Research and Development Agreement between Ames National Laboratory and National Energy Technology Laboratory AGMT-0609

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bryden, Mark; Tucker, David A.

    The goal of this project is to develop a merged environment for simulation and analysis (MESA) at the National Energy Technology Laboratory’s (NETL) Hybrid Performance (Hyper) project laboratory. The MESA sensor lab developed as a component of this research will provide a development platform for investigating: 1) advanced control strategies, 2) testing and development of sensor hardware, 3) various modeling in-the-loop algorithms and 4) other advanced computational algorithms for improved plant performance using sensors, real-time models, and complex systems tools.

  2. Les annees de transition: une epoque de changement. Revue et analyse des projets pilotes sur les questions relatives aux annees de transition. Volume 1: Resume (Years of Transition: Times for Change. A Review and Analysis of Pilot Projects Investigating Issues in the Transition Years. Volume 1: Summary).

    ERIC Educational Resources Information Center

    Hargreaves, Andy; And Others

    In 1990, the Ontario (Canada) Ministry of Education implemented the Transition Years project, an initiative for restructuring middle-grades education. This French-language document presents findings of a study that identified effective policies and practices used by the pilot schools. Data were derived from: (1) surveys completed by staffs in…

  3. WetLab-2: Tools for Conducting On-Orbit Quantitative Real-Time Gene Expression Analysis on ISS

    NASA Technical Reports Server (NTRS)

    Parra, Macarena; Almeida, Eduardo; Boone, Travis; Jung, Jimmy; Schonfeld, Julie

    2014-01-01

    The objective of NASA Ames Research Centers WetLab-2 Project is to place on the ISS a research platform capable of conducting gene expression analysis via quantitative real-time PCR (qRT-PCR) of biological specimens sampled or cultured on orbit. The project has selected a Commercial-Off-The-Shelf (COTS) qRT-PCR system, the Cepheid SmartCycler and will fly it in its COTS configuration. The SmartCycler has a number of advantages including modular design (16 independent PCR modules), low power consumption, rapid ramp times and the ability to detect up to four separate fluorescent channels at one time enabling multiplex assays that can be used for normalization and to study multiple genes of interest in each module. The team is currently working with Cepheid to enable the downlink of data from the ISS to the ground and provide uplink capabilities for programming, commanding, monitoring, and instrument maintenance. The project has adapted commercial technology to design a module that can lyse cells and extract RNA of sufficient quality and quantity for use in qRT-PCR reactions while using a housekeeping gene to normalize RNA concentration and integrity. The WetLab-2 system is capable of processing multiple sample types ranging from microbial cultures to animal tissues dissected on-orbit. The ability to conduct qRT-PCR on-orbit eliminates the confounding effects on gene expression of reentry stresses and shock acting on live cells and organisms or the concern of RNA degradation of fixed samples. The system can be used to validate terrestrial analyses of samples returned from ISS by providing on-orbit gene expression benchmarking prior to sample return. The ability to get on orbit data will provide investigators with the opportunity to adjust experiment parameters for subsequent trials based on the real-time data analysis without need for sample return and re-flight. Researchers will also be able to sample multigenerational changes in organisms. Finally, the system can be used for analysis of air, surface, water, and clinical samples to monitor environmental contaminants and crew health. The verification flight of the instrument is scheduled to launch on SpaceX-7 in June 2015.

  4. The influence of client brief and change order in construction project

    NASA Astrophysics Data System (ADS)

    Mahat, N. A. A.; Adnan, H.

    2018-02-01

    Construction briefing is a statement of needs about the client's intentions and project objectives. The briefing process is the preliminary stage in the design process, and a successful brief helps achieve project delivery on target time, cost, and quality. Although there are many efforts to capture the client's requirements and needs for a project, this information is often not collected adequately enough to produce proper design solutions. This may lead the client to issue change orders during the construction phase. This paper is concerned with the influence of the client's briefing on change orders in construction works. The research objective is to identify the influence of the client's brief on change orders, with the aim of reducing change orders in project delivery. This research adopted both qualitative and quantitative data collection methods, namely content analysis and semi-structured interviews. The findings highlight factors contributing to change orders and the essential attributes of clients during the briefing stage that may help minimise them.

  5. General practitioners in Styria - who is willing to take part in research projects and why? : A survey by the Institute of General Practice and Health Services Research.

    PubMed

    Poggenburg, Stephanie; Reinisch, Manuel; Höfler, Reinhild; Stigler, Florian; Avian, Alexander; Siebenhofer, Andrea

    2017-11-01

    Increasing recognition of general practice is reflected in the growing number of university institutes devoted to the subject and Health Services Research (HSR) is flourishing as a result. In May 2015 the Institute of General Practice and Evidence-based Health Services Research, Medical University of Graz, initiated a survey of Styrian GPs. The aim of the survey was to determine the willingness to take part in HSR projects, to collect sociodemographic data from GPs who were interested and to identify factors affecting participation in research projects. Of the 1015 GPs who received the questionnaire, 142 (14%) responded and 135 (13%) were included in the analysis. Overall 106 (10%) GPs indicated their willingness to take part in research projects. Factors inhibiting participation were lack of time, administrative workload, and lack of assistance. Overall, 10% of Styrian GPs were willing to participate in research projects. Knowledge about the circumstances under which family doctors are prepared to participate in HSR projects will help in the planning of future projects.

  6. Updating temperature and salinity mean values and trends in the Western Mediterranean: The RADMED project

    NASA Astrophysics Data System (ADS)

    Vargas-Yáñez, M.; García-Martínez, M. C.; Moya, F.; Balbín, R.; López-Jurado, J. L.; Serra, M.; Zunino, P.; Pascual, J.; Salat, J.

    2017-09-01

    The RADMED project is devoted to the implementation and maintenance of a multidisciplinary monitoring system around the Spanish Mediterranean waters. This observing system is based on periodic multidisciplinary cruises covering the coastal waters, continental shelf and slope waters, and some deep stations (>2000 m) from the westernmost Alboran Sea to Barcelona in the Catalan Sea, including the Balearic Islands. The project was launched in 2007, unifying and extending some previous monitoring projects that had more limited geographical coverage. Some of the time series currently available extend from 1992, while the more recent ones were initiated in 2007. The present work updates the available time series up to and including 2015 and shows the capability of these time series for two main purposes: the calculation of mean values for the properties of the main water masses around the Spanish Mediterranean, and the study of the interannual and decadal variability of such properties. The data set provided by the RADMED project has been merged with historical data from the MEDAR/MEDATLAS data base for the calculation of temperature and salinity trends from 1900 to 2015. The analysis of these time series shows that the intermediate and deep layers of the Western Mediterranean have increased their temperature and salinity, with an acceleration of the warming and salting trends from 1943. Trends for the heat absorbed by the water column for the 1943-2015 period range between 0.2 and 0.6 W/m2, depending on the methodology used. The temperature and salinity trends for the same period and for the intermediate layer are 0.002 °C/yr and 0.001 yr-1, respectively. Deep layers warmed and increased their salinity at rates of 0.004 °C/yr and 0.001 yr-1.

  7. Investigation of frame-to-frame back projection and feature selection algorithms for non-line-of-sight laser gated viewing

    NASA Astrophysics Data System (ADS)

    Laurenzis, Martin; Velten, Andreas

    2014-10-01

    In the present paper, we discuss new approaches to analyzing laser gated viewing data for non-line-of-sight vision, with a novel frame-to-frame back projection as well as feature selection algorithms. While earlier back projection approaches use time transients for each pixel, our new method can calculate the projection of imaging data onto the obscured voxel space for each frame. Further, four different data analysis algorithms were studied with the aim of identifying and selecting signals from different target positions. A slight modification of commonly used filters leads to powerful selection of local maximum values. It is demonstrated that the choice of filter has an impact on the selectivity, i.e., multiple-target detection, as well as on the localization precision.
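
    The principle of back projection onto an obscured voxel space can be shown in miniature: each measured arrival time constrains the hidden target to an ellipse (with the laser spot and sensed spot as foci), and accumulating these constraints over several measurements localizes the target at the intersection. The 2D geometry below is a toy stand-in for the paper's frame-to-frame method.

      # Toy time-of-flight back projection onto a hidden 2D voxel grid.
      import numpy as np

      laser = np.array([0.0, 0.0])             # laser spot on the relay wall
      target = np.array([0.6, 0.8])            # hidden target (ground truth)
      sensors = [np.array([1.0, 0.0]), np.array([-0.4, 0.0]), np.array([0.3, 0.0])]

      xs = ys = np.linspace(0.0, 1.5, 151)     # hidden voxel grid
      X, Y = np.meshgrid(xs, ys)

      heat = np.zeros_like(X)
      for s in sensors:
          # Measured path length: laser spot -> target -> sensed spot
          t_meas = np.linalg.norm(target - laser) + np.linalg.norm(target - s)
          # Back project: grid points whose path length matches get a vote
          path = (np.hypot(X - laser[0], Y - laser[1])
                  + np.hypot(X - s[0], Y - s[1]))
          heat += np.exp(-((path - t_meas) ** 2) / (2 * 0.01 ** 2))

      iy, ix = np.unravel_index(heat.argmax(), heat.shape)
      print("reconstructed:", (xs[ix], ys[iy]), "true:", tuple(target))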

  8. Demystifying first-cost green building premiums in healthcare.

    PubMed

    Houghton, Adele; Vittori, Gail; Guenther, Robin

    2009-01-01

    This study assesses the extent of "first-cost green building construction premiums" in the healthcare sector based on data submitted by and interviews with 13 current LEED-certified and LEED-registered healthcare project teams, coupled with a literature survey of articles on the topics of actual and perceived first-cost premiums associated with green building strategies. This analysis covers both perceived and realized costs across a range of projects in this sector, leading to the following conclusions: Construction first-cost premiums may be lower than is generally perceived, and they appear to be independent of both building size and level of "green" achievement; projects are using financial incentives and philanthropy to drive higher levels of achievement; premiums are decreasing over time; and projects are benefiting from improvements in health and productivity which, although difficult to monetize, are universally valued.

  9. Water quality monitor (EMPAX instrument)

    NASA Technical Reports Server (NTRS)

    Kelliher, Warren C.; Clark, Ben; Thornton, Mike

    1991-01-01

    The impetus of the Viking Mission to Mars led to the first miniaturization of an X-ray Fluorescence Spectrometer (XRFS). Two units were flown on the Viking Mission and successfully operated for two years analyzing the elemental composition of the Martian soil. Under a Bureau of Mines/NASA Technology Utilization project, this XRFS design was utilized to produce a battery-powered, portable unit for elemental analysis of geological samples. This paper details design improvements and additional sampling capabilities that were incorporated into a second-generation portable XRFS funded by the EPA/NASA Technology Utilization project. The unit, Environment Monitoring with Portable Analysis by X-ray (EMPAX), was developed specifically for quantitative determination of toxic metal contamination; for the EPA and any industry affected by environmental concerns, EMPAX fulfills a critical need to provide on-site, real-time analysis. A patent was issued on EMPAX, but a commercial manufacturer is still being sought.

  10. Quality management: reduction of waiting time and efficiency enhancement in an ENT-university outpatients' department

    PubMed Central

    Helbig, Matthias; Helbig, Silke; Kahla-Witzsch, Heike A; May, Angelika

    2009-01-01

    Background Public health systems are confronted with constantly rising costs, and diagnostic as well as treatment services are becoming more and more specialized. These are the reasons for an interdisciplinary project aiming, on the one hand, at simplifying the planning and scheduling of patient appointments and, on the other hand, at fulfilling all requirements of efficiency and treatment quality. Methods To understand procedures and solve problems, the responsible project group proceeded strictly through four methodical steps: actual state analysis, analysis of causes, correcting measures, and examination of effectiveness. Various methods of quality management, such as opinion polls, data collections, and several procedures for problem identification as well as for solution proposals, were applied. All activities were realized according to the requirements of the clinic's ISO 9001:2000 certified quality management system. The development of this project is described step by step, from the planning phase to inauguration into the daily routine of the clinic and the subsequent control of effectiveness. Results Five significant problem fields could be identified. After an analysis of causes, the major remedial measures were: installation of a patient telephone hotline, standardization of appointment arrangements for all patients, modification of the appointments book to account for the reason for the visit by planning defined working periods for certain symptoms and treatments, improvement of telephone counselling, and transition to flexible time planning through daily updates of the appointments book. After implementation of these changes into the clinic's routine, success could be demonstrated by significantly reduced waiting times and a resulting increase in patient satisfaction. Conclusion Systematic scrutiny of the existing organizational structures of the outpatients' department of our clinic by means of actual state analysis and analysis of causes revealed the necessity of improvement. According to the rules of quality management, correcting measures and a subsequent examination of effectiveness were performed. These changes resulted in higher satisfaction of patients, referring colleagues, and clinic staff alike. Additionally, the clinic is able to cope with an increasing demand for appointments in outpatients' departments, and the clinic's human resources are employed more effectively. PMID:19183496

  11. Temporal Dissociation of Neocortical and Hippocampal Contributions to Mental Time Travel Using Intracranial Recordings in Humans.

    PubMed

    Schurr, Roey; Nitzan, Mor; Eliahou, Ruth; Spinelli, Laurent; Seeck, Margitta; Blanke, Olaf; Arzy, Shahar

    2018-01-01

    In mental time travel (MTT) one is "traveling" back-and-forth in time, remembering, and imagining events. Despite intensive research regarding memory processes in the hippocampus, it was only recently shown that the hippocampus plays an essential role in encoding the temporal order of events remembered, and therefore plays an important role in MTT. Does it also encode the temporal relations of these events to the remembering self? We asked patients undergoing pre-surgical evaluation with depth electrodes penetrating the temporal lobes bilaterally toward the hippocampus to project themselves in time to a past, future, or present time-point, and then make judgments regarding various events. Classification analysis of intracranial evoked potentials revealed clear temporal dissociation in the left hemisphere between lateral-temporal electrodes, activated at ~100-300 ms, and hippocampal electrodes, activated at ~400-600 ms. This dissociation may suggest a division of labor in the temporal lobe during self-projection in time, hinting toward the different roles of the lateral-temporal cortex and the hippocampus in MTT and the temporal organization of the related events with respect to the experiencing self.

  12. NAS Demand Predictions, Transportation Systems Analysis Model (TSAM) Compared with Other Forecasts

    NASA Technical Reports Server (NTRS)

    Viken, Jeff; Dollyhigh, Samuel; Smith, Jeremy; Trani, Antonio; Baik, Hojong; Hinze, Nicholas; Ashiabor, Senanu

    2006-01-01

    The current work incorporates the Transportation Systems Analysis Model (TSAM) to predict the future demand for airline travel. TSAM is a multi-mode, national model that predicts the demand for all long-distance travel at a county level based upon population and demographics. The model conducts a mode choice analysis to compute the demand for commercial airline travel based upon the traveler's purpose of the trip, value of time, and the cost and time of the trip. The county demand for airline travel is then aggregated (or distributed) to the airport level, and the enplanement demand at commercial airports is modeled. With the growth in flight demand, and utilizing current airline flight schedules, the Fratar algorithm is used to develop future flight schedules in the NAS. The projected flights can then be flown through air transportation simulators to quantify the ability of the NAS to meet future demand. A major strength of the TSAM analysis is that scenario planning can be conducted to quantify capacity requirements at individual airports based upon different future scenarios. Different demographic scenarios can be analyzed to model the demand sensitivity to them. Also, it is fairly well known, but not well modeled at the airport level, that the demand for travel is highly dependent on the cost of travel, or the fare yield of the airline industry. The FAA projects the fare yield (in constant-year dollars) to keep decreasing into the future. The magnitude and/or direction of these projections can be suspect in light of the general lack of airline profits and the large rises in airline fuel cost. Also, changes in travel time and convenience have an influence on the demand for air travel, especially for business travel. Future planners cannot easily conduct sensitivity studies of future demand with the FAA TAF data, nor with the Boeing or Airbus projections. In TSAM many factors can be parameterized, and various demand sensitivities can be predicted for future travel. The resulting demand scenarios can be incorporated into future flight schedules, therefore providing a quantifiable demand for flights in the NAS for a range of futures. In addition, new future airline business scenarios are investigated that illustrate when direct flights can replace connecting flights and larger aircraft can be substituted, when justified by demand.
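
    The Fratar step mentioned above is an iterative proportional fitting of a base-year origin-destination table to forecast row and column totals. The sketch below shows it on an invented 3x3 table; TSAM would supply county- or airport-level data.

      # Fratar (iterative proportional fitting) growth of an O-D trip table.
      import numpy as np

      trips = np.array([[0., 100., 50.],
                        [80., 0., 120.],
                        [40., 90., 0.]])       # base-year O-D trips (invented)
      row_targets = trips.sum(1) * np.array([1.2, 1.5, 1.1])  # forecast origins
      col_targets = trips.sum(0) * np.array([1.3, 1.2, 1.4])  # forecast destinations
      col_targets *= row_targets.sum() / col_targets.sum()    # make totals consistent

      for _ in range(50):                      # alternate row/column scaling
          trips *= (row_targets / trips.sum(1))[:, None]
          trips *= (col_targets / trips.sum(0))[None, :]

      print(np.round(trips, 1))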

  13. Bibliographic control of audiovisuals: analysis of a cataloging project using OCLC.

    PubMed

    Curtis, J A; Davison, F M

    1985-04-01

    The staff of the Quillen-Dishner College of Medicine Library cataloged 702 audiovisual titles between July 1, 1982, and June 30, 1983, using the OCLC database. This paper discusses the library's audiovisual collection and describes the method and scope of a study conducted during this project, the cataloging standards and conventions adopted, the assignment and use of NLM classification, the provision of summaries for programs, and the amount of staff time expended in cataloging typical items. An analysis of the use of OCLC for this project resulted in the following findings: the rate of successful searches for audiovisual copy was 82.4%; the error rate for records used was 41.9%; modifications were required in every record used; the Library of Congress and seven member institutions provided 62.8% of the records used. It was concluded that the effort to establish bibliographic control of audiovisuals is not widespread and that expanded and improved audiovisual cataloging by the Library of Congress and the National Library of Medicine would substantially contribute to that goal.

  14. Technical and investigative support for high density digital satellite recording systems

    NASA Technical Reports Server (NTRS)

    Schultz, R. A.

    1983-01-01

    Recent results of dropout measurements and defect analysis conducted on one reel of Ampex 721 which was submitted for evaluation by the manufacturer are described. The results or status of other tape evaluation activities are also reviewed. Several changes in test interpretations and applications are recommended. In some cases, deficiencies in test methods or equipment became apparent during continued work on this project and other IITRI tape evaluation projects. Techniques and equipment for future tasks such as tape qualification are also recommended and discussed. Project effort and expenditures were kept at a relatively low level. This rate provided added development time and experience with the IITRI Dropout Measurement System, which is approaching its potential as a computer based dropout analysis tool. Another benefit is the expanded data base on critical parameters that can be achieved from tests on different tape types and lots as they become available. More consideration and effort was directed toward identification of critical parameters, development of meaningful repeatable test procedures, and tape procurement strategy.

  15. Real-time monitoring of CO2 storage sites: Application to Illinois Basin-Decatur Project

    USGS Publications Warehouse

    Picard, G.; Berard, T.; Chabora, E.; Marsteller, S.; Greenberg, S.; Finley, R.J.; Rinck, U.; Greenaway, R.; Champagnon, C.; Davard, J.

    2011-01-01

    Optimization of carbon dioxide (CO2) storage operations for efficiency and safety requires use of monitoring techniques and implementation of control protocols. The monitoring techniques consist of permanent sensors and tools deployed for measurement campaigns. Large amounts of data are thus generated. These data must be managed and integrated for interpretation at different time scales. A fast interpretation loop involves combining continuous measurements from permanent sensors as they are collected to enable a rapid response to detected events; a slower loop requires combining large datasets gathered over longer operational periods from all techniques. The purpose of this paper is twofold. First, it presents an analysis of the monitoring objectives to be performed in the slow and fast interpretation loops. Second, it describes the implementation of the fast interpretation loop with a real-time monitoring system at the Illinois Basin-Decatur Project (IBDP) in Illinois, USA. © 2011 Published by Elsevier Ltd.

  16. Hydrologic index development and application to selected Coastwide Reference Monitoring System sites and Coastal Wetlands Planning, Protection and Restoration Act projects

    USGS Publications Warehouse

    Snedden, Gregg A.; Swenson, Erick M.

    2012-01-01

    Hourly time-series salinity and water-level data are collected at all stations within the Coastwide Reference Monitoring System (CRMS) network across coastal Louisiana. These data, in addition to vegetation and soils data collected as part of CRMS, are used to develop a suite of metrics and indices to assess wetland condition in coastal Louisiana. This document addresses the primary objectives of the CRMS hydrologic analytical team, which were to (1) adopt standard time-series analytical techniques that could effectively assess spatial and temporal variability in hydrologic characteristics across the Louisiana coastal zone on site, project, basin, and coastwide scales and (2) develop and apply an index based on wetland hydrology that can describe the suitability of local hydrology in the context of maximizing the productivity of wetland plant communities. Approaches to quantifying tidal variability (least squares harmonic analysis) and partitioning variability of time-series data to various time scales (spectral analysis) are presented. The relation between marsh elevation and the tidal frame of a given hydrograph is described. A hydrologic index that integrates water-level and salinity data, which are collected hourly, with vegetation data that are collected annually is developed. To demonstrate its utility, the hydrologic index is applied to 173 CRMS sites across the coast, and variability in index scores across marsh vegetation types (fresh, intermediate, brackish, and saline) is assessed. The index is also applied to 11 sites located in three Coastal Wetlands Planning, Protection and Restoration Act projects, and the ability of the index to convey temporal hydrologic variability in response to climatic stressors and restoration measures, as well as the effect that this community may have on wetland plant productivity, is illustrated.
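
    The least squares harmonic analysis adopted by the team can be sketched directly: fit cosine/sine pairs at known tidal frequencies to an hourly water-level record and read off the constituent amplitudes. The record below is synthetic and uses only the M2 and K1 constituents.

      # Least-squares harmonic fit of tidal constituents to hourly water levels.
      import numpy as np

      hours = np.arange(24 * 30)                  # one month of hourly data
      freqs = {"M2": 2 * np.pi / 12.42, "K1": 2 * np.pi / 23.93}  # rad/hour

      # Synthetic record: 0.4 m M2, 0.15 m K1, plus noise
      rng = np.random.default_rng(5)
      level = (0.4 * np.cos(freqs["M2"] * hours + 0.3)
               + 0.15 * np.cos(freqs["K1"] * hours - 1.0)
               + rng.normal(0, 0.05, hours.size))

      # Design matrix: a mean term plus a cosine/sine pair per constituent
      cols = [np.ones_like(hours, dtype=float)]
      for w in freqs.values():
          cols += [np.cos(w * hours), np.sin(w * hours)]
      A = np.column_stack(cols)
      coef, *_ = np.linalg.lstsq(A, level, rcond=None)

      for i, name in enumerate(freqs):
          a, b = coef[1 + 2 * i], coef[2 + 2 * i]
          print(f"{name}: amplitude = {np.hypot(a, b):.2f} m")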

  17. Role of interstitial branching in the development of visual corticocortical connections: a time-lapse and fixed-tissue analysis.

    PubMed

    Ruthazer, Edward S; Bachleda, Amelia R; Olavarria, Jaime F

    2010-12-15

    We combined fixed-tissue and time-lapse analyses to investigate the axonal branching phenomena underlying the development of topographically organized ipsilateral projections from area 17 to area 18a in the rat. These complementary approaches allowed us to relate static, large-scale information provided by traditional fixed-tissue analysis to highly dynamic, local, small-scale branching phenomena observed with two-photon time-lapse microscopy in acute slices of visual cortex. Our fixed-tissue data revealed that labeled area 17 fibers invaded area 18a gray matter at topographically restricted sites, reaching superficial layers in significant numbers by postnatal day 6 (P6). Moreover, most parental axons gave rise to only one or occasionally a small number of closely spaced interstitial branches beneath 18a. Our time-lapse data showed that many filopodium-like branches emerged along parental axons in white matter or deep layers in area 18a. Most of these filopodial branches were transient, often disappearing after several minutes to hours of exploratory extension and retraction. These dynamic behaviors decreased significantly from P4, when the projection is first forming, through the second postnatal week, suggesting that the expression of, or sensitivity to, cortical cues promoting new branch addition in the white matter is developmentally down-regulated coincident with gray matter innervation. Together, these data demonstrate that the development of topographically organized corticocortical projections in rats involves extensive exploratory branching along parental axons and invasion of cortex by only a small number of interstitial branches, rather than the widespread innervation of superficial cortical layers by an initially exuberant population of branches. © 2010 Wiley-Liss, Inc.

  18. Role of Interstitial Branching in the Development of Visual Corticocortical Connections: A Time-Lapse and Fixed-Tissue Analysis

    PubMed Central

    Ruthazer, Edward S.; Bachleda, Amelia R.; Olavarria, Jaime F.

    2013-01-01

    We combined fixed-tissue and time-lapse analyses to investigate the axonal branching phenomena underlying the development of topographically organized ipsilateral projections from area 17 to area 18a in the rat. These complementary approaches allowed us to relate static, large-scale information provided by traditional fixed-tissue analysis to highly dynamic, local, small-scale branching phenomena observed with two-photon time-lapse microscopy in acute slices of visual cortex. Our fixed-tissue data revealed that labeled area 17 fibers invaded area 18a gray matter at topographically restricted sites, reaching superficial layers in significant numbers by postnatal day 6 (P6). Moreover, most parental axons gave rise to only one or occasionally a small number of closely spaced interstitial branches beneath 18a. Our time-lapse data showed that many filopodium-like branches emerged along parental axons in white matter or deep layers in area 18a. Most of these filopodial branches were transient, often disappearing after several minutes to hours of exploratory extension and retraction. These dynamic behaviors decreased significantly from P4, when the projection is first forming, through the second postnatal week, suggesting that the expression of, or sensitivity to, cortical cues promoting new branch addition in the white matter is developmentally down-regulated coincident with gray matter innervation. Together, these data demonstrate that the development of topographically organized corticocortical projections in rats involves extensive exploratory branching along parental axons and invasion of cortex by only a small number of interstitial branches, rather than the widespread innervation of superficial cortical layers by an initially exuberant population of branches. PMID:21031561

  19. The Emerging Infrastructure of Autonomous Astronomy

    NASA Astrophysics Data System (ADS)

    Seaman, R.; Allan, A.; Axelrod, T.; Cook, K.; White, R.; Williams, R.

    2007-10-01

    Advances in the understanding of cosmic processes demand that sky transient events be confronted with statistical techniques honed on static phenomena. Time domain data sets require vast surveys such as LSST {http://www.lsst.org/lsst_home.shtml} and Pan-STARRS {http://www.pan-starrs.ifa.hawaii.edu}. A new autonomous infrastructure must close the loop from the scheduling of survey observations, through data archiving and pipeline processing, to the publication of transient event alerts and automated follow-up, and to the easy analysis of resulting data. The IVOA VOEvent {http://voevent.org} working group leads efforts to characterize sky transient alerts published through VOEventNet {http://voeventnet.org}. The Heterogeneous Telescope Networks (HTN {http://www.telescope-networks.org}) consortium are observatories and robotic telescope projects seeking interoperability with a long-term goal of creating an e-market for telescope time. Two projects relying on VOEvent and HTN are eSTAR {http://www.estar.org.uk} and the Thinking Telescope {http://www.thinkingtelescopes.lanl.gov} Project.

  20. Towards an operational use of space imagery for oil pollution monitoring in the Mediterranean basin: a demonstration in the Adriatic Sea.

    PubMed

    Ferraro, Guido; Bernardini, Annalia; David, Matej; Meyer-Roux, Serge; Muellenhoff, Oliver; Perkovic, Marko; Tarchi, Dario; Topouzelis, Kostas

    2007-04-01

    Studies of operational pollution carried out by the European Commission - Joint Research Centre in the Mediterranean Sea for the years 1999-2004 are briefly introduced. A specific analysis of the Adriatic Sea for the same period demonstrates that this area has been characterized by a considerable number of illegal discharges from ships. After setting out the historical background of the project AESOP (aerial and satellite surveillance of operational pollution in the Adriatic Sea), the content, partners and aim of the project are presented. Finally, the results of the first phase of the AESOP project are presented; they seem very encouraging. For the first time in the Adriatic, real-time detection of oil spills in satellite images and immediate verification by the Coast Guard have been undertaken. An exploratory activity has also been carried out in collaboration with the University of Ljubljana to use the automatic identification system (AIS) to identify the ships detected in the satellite images.

  1. The Seismic Tool-Kit (STK): An Open Source Software For Learning the Basis of Signal Processing and Seismology.

    NASA Astrophysics Data System (ADS)

    Reymond, D.

    2016-12-01

    We present an open source software project (GNU public license), named STK: Seismic Tool-Kit, dedicated mainly to learning signal processing and seismology. The STK project, started in 2007, is hosted by SourceForge.net and counts more than 20,000 downloads at the time of writing. The STK project is composed of two main branches. The first is a graphical interface dedicated to signal processing of data in the SAC format (SAC_ASCII and SAC_BIN), where the signal can be plotted, zoomed, filtered, integrated, differentiated, etc. (a large variety of IIR and FIR filters is provided). The passage to the frequency domain via the Fourier transform is used to introduce the estimation of the spectral density of the signal, with visualization of the Power Spectral Density (PSD) in linear or log scale, and also the evolutive time-frequency representation (or sonagram). Three-component signals can also be processed to estimate their polarization properties, either for a given window or for sliding windows along the time axis. This polarization analysis is useful for extracting polarized noise and for differentiating P waves, Rayleigh waves, Love waves, etc. The second branch is a panel of utility programs for working in terminal mode, with basic programs for computing azimuth and distance in spherical geometry, inter/auto-correlation, spectral density, time-frequency representations for an entire directory of signals, focal planes and main component axes, the radiation pattern of P waves, polarization analysis of different waves (including noise), under/over-sampling of signals, cubic-spline smoothing, and linear/nonlinear regression analysis of data sets. STK is developed in C/C++, mainly under Linux, and has also been partially implemented under MS-Windows. STK has been used in several schools for viewing and plotting seismic records provided by IRIS, and as a practical support for teaching the basics of signal processing. Useful links: http://sourceforge.net/projects/seismic-toolkit/ and http://sourceforge.net/p/seismic-toolkit/wiki/browse_pages/
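
    STK itself is written in C/C++, but the style of analysis it teaches is easy to sketch in a few lines. The Python fragment below (an illustration, not STK code) band-pass filters a synthetic trace with an IIR filter and estimates its power spectral density, two of the operations described above.

        # Illustrative sketch (not STK code): band-pass filtering and PSD
        # estimation of a synthetic seismic-like trace.
        import numpy as np
        from scipy.signal import butter, filtfilt, welch

        fs = 100.0                     # sampling rate, Hz
        t = np.arange(0, 60, 1 / fs)   # one minute of synthetic signal
        x = np.sin(2 * np.pi * 1.5 * t) + 0.5 * np.random.randn(t.size)

        # IIR band-pass (Butterworth), applied forward-backward to avoid
        # phase distortion -- analogous to STK's filtering menu.
        b, a = butter(4, [0.5, 5.0], btype="bandpass", fs=fs)
        xf = filtfilt(b, a, x)

        # Welch estimate of the PSD, viewable in linear or log scale.
        f, psd = welch(xf, fs=fs, nperseg=1024)
        print("peak frequency: %.2f Hz" % f[np.argmax(psd)])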

  2. Vertical structure and physical processes of the Madden-Julian Oscillation: Biases and uncertainties at short range

    DOE PAGES

    Xavier, Prince K.; Petch, Jon C.; Klingaman, Nicholas P.; ...

    2015-05-26

    We present an analysis of diabatic heating and moistening processes from 12 to 36 h lead time forecasts from 12 Global Circulation Models as part of the “Vertical structure and physical processes of the Madden-Julian Oscillation (MJO)” project. A lead time of 12–36 h is chosen to constrain the large-scale dynamics and thermodynamics to be close to observations while avoiding being too close to the initial spin-up of the models as they adjust to being driven from the Years of Tropical Convection (YOTC) analysis. A comparison of the vertical velocity and rainfall with the observations and YOTC analysis suggests that the phases of convection associated with the MJO are constrained in most models at this lead time although the rainfall in the suppressed phase is typically overestimated. Although the large-scale dynamics is reasonably constrained, moistening and heating profiles have large intermodel spread. In particular, there are large spreads in convective heating and moistening at midlevels during the transition to active convection. Radiative heating and cloud parameters have the largest relative spread across models at upper levels during the active phase. A detailed analysis of time step behavior shows that some models show strong intermittency in rainfall and differences in the precipitation and dynamics relationship between models. In conclusion, the wealth of model outputs archived during this project is a very valuable resource for model developers beyond the study of the MJO. Additionally, the findings of this study can inform the design of process model experiments, and inform the priorities for field experiments and future observing systems.

  3. Correcting Spellings in Second Language Learners' Computer-Assisted Collaborative Writing

    ERIC Educational Resources Information Center

    Musk, Nigel

    2016-01-01

    The present study uses multimodal conversation analysis to examine how pupils studying English as a foreign language make spelling corrections in real time while doing collaborative computer-assisted project work. Unlike most previous related investigations, this study focuses on the "process" rather than evaluating the final…

  4. Self-Directed Learning Research in the Community/Junior College: Description, Conclusions, and Recommendations.

    ERIC Educational Resources Information Center

    Long, Huey B.; Walsh, Stephen M.

    1993-01-01

    Offers an analysis of 11 dissertations focusing on self-directed learning (SDL) in community colleges, highlighting the importance of promoting SDL, the relationship between the level of SDL and other variables, verification and measurement of time spent on SDL projects, and effects of SDL. (DMM)

  5. Ethical Internationalisation in Higher Education: Interfaces with International Development and Sustainability

    ERIC Educational Resources Information Center

    Pashby, Karen; de Oliveira Andreotti, Vanessa

    2016-01-01

    This analysis is situated within a larger project focusing on ethics and internationalisation in higher education. Internationalisation is occurring at a fast pace and encompasses overlapping and contradictory aims largely framed by market imperatives. At the same time, institutions of higher education increasingly promote sustainability. We use a…

  6. Cost analysis of the Mercedes-Benz occupant detection system for air bag shut-off

    DOT National Transportation Integrated Search

    1997-04-01

    The objective of this project is to develop the variable manufacturing costs and lead time estimates of an "occupant detection system for air bag shut off". The Mercedes-Benz system was used as a base for developing this data. The system consists of ...

  7. Alabama Vocational Management Information System. Final Report.

    ERIC Educational Resources Information Center

    Patterson, Douglas; And Others

    A project was developed to design and implement a management information system (MIS) to provide decision makers with accurate, usable, and timely data and information concerning input, output, and impact of vocational education. The objectives were to (1) design an MIS embracing student accounting, fiscal accounting, manpower analysis, and…

  8. Approaches to Data Analysis in Longitudinal Field Investigations of Educational Programs.

    ERIC Educational Resources Information Center

    Jovick, Thomas D.

    The federally funded longitudinal field study called Management Implications of Team Teaching (MITT) required a search for an appropriate strategy for analyzing through-time relationships among selected variables. The MITT project used questionnaires and interviews to collect data concerning the work, governance, attitudes, and orientation of…

  9. Projective-anticipating, projective, and projective-lag synchronization of time-delayed chaotic systems on random networks.

    PubMed

    Feng, Cun-Fang; Xu, Xin-Jian; Wang, Sheng-Jun; Wang, Ying-Hai

    2008-06-01

    We study projective-anticipating, projective, and projective-lag synchronization of time-delayed chaotic systems on random networks. We relax some limitations of previous work, in which projective-anticipating and projective-lag synchronization could be achieved only between two coupled chaotic systems. In this paper, we realize projective-anticipating and projective-lag synchronization on complex dynamical networks composed of a large number of interconnected components. Moreover, whereas previous work on projective synchronization in complex dynamical networks assumed node dynamics given by coupled partially linear chaotic systems, here the node dynamics are time-delayed chaotic systems free of the partial-linearity restriction. Based on Lyapunov stability theory, we suggest a generic method to achieve projective-anticipating, projective, and projective-lag synchronization of time-delayed chaotic systems on random dynamical networks, and we derive existence and sufficient stability conditions. The validity of the proposed method is demonstrated and verified by examining specific examples using Ikeda and Mackey-Glass systems on Erdos-Renyi networks.
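
    For readers unfamiliar with the node dynamics, the sketch below integrates the Mackey-Glass time-delayed system (one of the two example systems) with a crude fixed-step Euler scheme. The parameters are the standard chaotic regime for this equation, not values taken from the paper.

        # Minimal sketch: Euler integration of the Mackey-Glass delay system
        # dx/dt = beta*x(t-tau)/(1+x(t-tau)^n) - gamma*x(t).
        # beta=0.2, gamma=0.1, n=10, tau=17 is the standard chaotic regime.
        import numpy as np

        beta, gamma, n, tau = 0.2, 0.1, 10, 17.0
        dt = 0.1
        steps = 20000
        delay = int(tau / dt)

        x = np.empty(steps + delay)
        x[:delay] = 0.5 + 0.05 * np.random.rand(delay)   # initial history

        for k in range(delay, steps + delay - 1):
            x_tau = x[k - delay]                          # delayed state
            dx = beta * x_tau / (1.0 + x_tau ** n) - gamma * x[k]
            x[k + 1] = x[k] + dt * dx

        print("min/max of attractor:", x[delay:].min(), x[delay:].max())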

  10. On the use of IT investment assessment methods in the area of spatial data infrastructure

    NASA Astrophysics Data System (ADS)

    Zwirowicz-Rutkowska, Agnieszka

    2016-06-01

    One of the important issues in the development of spatial data infrastructures (SDIs) is the carrying out of economic and financial analysis. It is essential to determine expenses and also to assess the effects resulting from the development and use of infrastructures. Costs and benefits assessment can be associated with assessment of the infrastructure's effectiveness and efficiency as well as the infrastructure's value, understood as its impact on economic aspects of organisational performance, both for the organisation that realises an SDI project and for all users of the infrastructure. The aim of this paper is to provide an overview of various investment assessment methods and an analysis of the different types of costs and benefits used for information technology (IT) projects. Based on the literature, an analysis of examples of the use of these methods in the area of spatial data infrastructures is also presented, and the issues of SDI projects and investments are outlined. The results of the analysis indicate the usefulness of financial methods from different fields of management for SDI building, development and use. The author proposes, in addition to the financial methods, adapting the various techniques used for IT investments and their development, taking into consideration the specificity of SDIs, for the purpose of assessing different types of costs and benefits and integrating financial aspects with non-financial ones. Among the challenges are the identification and quantification of costs and benefits, and the establishment of measures that fit the characteristics of an SDI project and the artefacts resulting from its realisation. Moreover, subjectivity and variability in time should be taken into account, as consequences of the goals, policies and business context of the organisation undertaking the project or using its artefacts, and of the investors.
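
    Of the financial appraisal methods such a survey typically covers, net present value is the most common. A minimal sketch, with hypothetical cash flows for an SDI project (the figures are not from the paper):

        # Illustrative NPV calculation for an IT/SDI investment appraisal.
        def npv(rate, cashflows):
            """Discount yearly cash flows (year 0 first) to present value."""
            return sum(cf / (1.0 + rate) ** year
                       for year, cf in enumerate(cashflows))

        # Year 0: build cost; years 1-5: net benefits from infrastructure use.
        flows = [-500_000, 90_000, 120_000, 150_000, 150_000, 150_000]
        print("NPV at 7%%: %.0f" % npv(0.07, flows))
        # A positive NPV favours the investment; non-financial benefits
        # (the paper's point) still have to be weighed separately.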

  11. Constellation Ground Systems Launch Availability Analysis: Enhancing Highly Reliable Launch Systems Design

    NASA Technical Reports Server (NTRS)

    Gernand, Jeffrey L.; Gillespie, Amanda M.; Monaghan, Mark W.; Cummings, Nicholas H.

    2010-01-01

    Success of the Constellation Program's lunar architecture requires successfully launching two vehicles, Ares I/Orion and Ares V/Altair, within a very limited time period. The reliability and maintainability of flight vehicles and ground systems must deliver a high probability of successfully launching the second vehicle in order to avoid wasting the on-orbit asset launched by the first vehicle. The Ground Operations Project determined which ground subsystems had the potential to affect the probability of the second launch and allocated quantitative availability requirements to these subsystems. The Ground Operations Project also developed a methodology to estimate subsystem reliability, availability, and maintainability to ensure that ground subsystems complied with allocated launch availability and maintainability requirements. The verification analysis developed quantitative estimates of subsystem availability based on design documentation, testing results, and other information. Where appropriate, actual performance history was used to calculate failure rates for legacy subsystems or comparable components that will support Constellation. The results of the verification analysis will be used to assess compliance with requirements and to highlight design or performance shortcomings for further decision making. This case study will discuss the subsystem requirements allocation process, describe the ground systems methodology for completing quantitative reliability, availability, and maintainability analysis, and present findings and observations based on the analysis leading to the Ground Operations Project Preliminary Design Review milestone.
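
    The arithmetic behind such an availability allocation can be pictured with a toy example (hypothetical figures, not Constellation allocations): steady-state availability per subsystem is MTBF/(MTBF+MTTR), and subsystems that must all be up at once combine in series.

        # Toy availability roll-up: A = MTBF / (MTBF + MTTR), series system.
        def availability(mtbf_hours, mttr_hours):
            return mtbf_hours / (mtbf_hours + mttr_hours)

        subsystems = {
            "propellant loading": availability(2000.0, 8.0),
            "ground power":       availability(5000.0, 4.0),
            "environmental ctrl": availability(3000.0, 6.0),
        }

        # A serial system is available only when every subsystem is available.
        system_a = 1.0
        for name, a in subsystems.items():
            print("%-20s A = %.5f" % (name, a))
            system_a *= a
        print("system availability  A = %.5f" % system_a)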

  12. Optimization of image quality and acquisition time for lab-based X-ray microtomography using an iterative reconstruction algorithm

    NASA Astrophysics Data System (ADS)

    Lin, Qingyang; Andrew, Matthew; Thompson, William; Blunt, Martin J.; Bijeljic, Branko

    2018-05-01

    Non-invasive laboratory-based X-ray microtomography has been widely applied in many industrial and research disciplines. However, the main barrier to the use of laboratory systems compared to a synchrotron beamline is their much longer image acquisition time (hours per scan compared to seconds or minutes at a synchrotron), which restricts the study of dynamic in situ processes. Therefore, the majority of existing laboratory X-ray microtomography is limited to static imaging; relatively fast imaging (tens of minutes per scan) can only be achieved by sacrificing image quality, e.g. by reducing the exposure time or the number of projections. To alleviate this barrier, we introduce an optimized implementation of a well-known iterative reconstruction algorithm that allows users to reconstruct tomographic images with reasonable image quality, but requires lower X-ray signal counts and fewer projections than conventional methods. Quantitative analysis and comparison between the iterative and the conventional filtered back-projection reconstruction algorithm was performed using a sandstone rock sample with and without liquid phases in the pore space. Overall, by implementing the iterative reconstruction algorithm, the required image acquisition time for samples such as this, with sparse object structure, can be reduced by a factor of up to 4 without measurable loss of sharpness or signal-to-noise ratio.
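
    The paper's optimized implementation is not reproduced here, but the trade-off it exploits can be demonstrated with scikit-image's stock algorithms: given a sparse set of projections, an iterative SART reconstruction typically recovers the image better than conventional filtered back-projection.

        # Illustrative comparison (not the paper's code): iterative SART vs.
        # filtered back-projection from only 45 projection angles.
        import numpy as np
        from skimage.data import shepp_logan_phantom
        from skimage.transform import radon, iradon, iradon_sart

        image = shepp_logan_phantom()
        theta = np.linspace(0.0, 180.0, 45, endpoint=False)  # sparse angles
        sinogram = radon(image, theta=theta)

        fbp = iradon(sinogram, theta=theta)          # conventional FBP
        sart = iradon_sart(sinogram, theta=theta)
        sart = iradon_sart(sinogram, theta=theta, image=sart)  # 2nd pass

        for name, rec in (("FBP", fbp), ("SART x2", sart)):
            rmse = np.sqrt(np.mean((rec - image) ** 2))
            print("%-8s RMSE = %.4f" % (name, rmse))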

  13. Using Six Sigma methodology to reduce patient transfer times from floor to critical-care beds.

    PubMed

    Silich, Stephan J; Wetz, Robert V; Riebling, Nancy; Coleman, Christine; Khoueiry, Georges; Abi Rafeh, Nidal; Bagon, Emma; Szerszen, Anita

    2012-01-01

    In response to concerns regarding delays in transferring critically ill patients to intensive care units (ICU), a quality improvement project, using the Six Sigma process, was undertaken to correct issues leading to transfer delay. To test the efficacy of a Six Sigma intervention to reduce transfer time and establish a patient transfer process that would effectively enhance communication between hospital caregivers and improve the continuum of care for patients. The project was conducted at a 714-bed tertiary care hospital in Staten Island, New York. A Six Sigma multidisciplinary team was assembled to assess areas that needed improvement, manage the intervention, and analyze the results. The Six Sigma process identified eight key steps in the transfer of patients from general medical floors to critical care areas. Preintervention data and a root-cause analysis helped to establish the goal transfer-time limits of 3 h for any individual transfer and 90 min for the average of all transfers. The Six Sigma approach is a problem-solving methodology that resulted in almost a 60% reduction in patient transfer time from a general medical floor to a critical care area. The Six Sigma process is a feasible method for implementing healthcare related quality of care projects, especially those that are complex. © 2011 National Association for Healthcare Quality.

  14. THE NANOGRAV NINE-YEAR DATA SET: OBSERVATIONS, ARRIVAL TIME MEASUREMENTS, AND ANALYSIS OF 37 MILLISECOND PULSARS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arzoumanian, Zaven; Brazier, Adam; Chatterjee, Shami

    2015-11-01

    We present high-precision timing observations spanning up to nine years for 37 millisecond pulsars monitored with the Green Bank and Arecibo radio telescopes as part of the North American Nanohertz Observatory for Gravitational Waves (NANOGrav) project. We describe the observational and instrumental setups used to collect the data, and the methodology applied for calculating pulse times of arrival; these include novel methods for measuring instrumental offsets and characterizing low signal-to-noise ratio timing results. The time of arrival data are fit to a physical timing model for each source, including terms that characterize time-variable dispersion measure and frequency-dependent pulse shape evolution. In conjunction with the timing model fit, we have performed a Bayesian analysis of a parameterized timing noise model for each source, and detect evidence for excess low-frequency, or “red,” timing noise in 10 of the pulsars. For 5 of these cases this is likely due to interstellar medium propagation effects rather than intrinsic spin variations. Subsequent papers in this series will present further analysis of this data set aimed at detecting or limiting the presence of nanohertz-frequency gravitational wave signals.
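
    The sketch below is not the NANOGrav pipeline; it only illustrates the two-step logic of the analysis on mock data: fit a deterministic timing model to arrival-time residuals, then look for excess low-frequency ("red") power in what remains.

        # Mock illustration of timing-model fitting and a crude red-noise check.
        import numpy as np

        rng = np.random.default_rng(0)
        t = np.sort(rng.uniform(0.0, 9.0, 300))   # 9 yr of observation epochs

        # Mock residuals: quadratic trend + white noise + a slow wander.
        resid = 2e-6 * t**2 - 1e-6 * t + 1e-7 * rng.standard_normal(t.size)
        resid += 5e-7 * np.sin(2 * np.pi * t / 6.0)

        coef = np.polyfit(t, resid, 2)            # quadratic timing-model fit
        post_fit = resid - np.polyval(coef, t)

        # Compare post-fit power below and above ~1 cycle/yr.
        freqs = np.linspace(0.05, 5.0, 200)
        power = np.array([np.abs(np.sum(post_fit * np.exp(-2j * np.pi * f * t))) ** 2
                          for f in freqs])
        ratio = power[freqs < 1.0].mean() / power[freqs >= 1.0].mean()
        print("low/high frequency power ratio: %.1f" % ratio)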

  15. A tool to assess sex-gender when selecting health research projects.

    PubMed

    Tomás, Concepción; Yago, Teresa; Eguiluz, Mercedes; Samitier, M A Luisa; Oliveros, Teresa; Palacios, Gemma

    2015-04-01

    To validate the questionnaire "Gender Perspective in Health Research" (GPIHR), which assesses the inclusion of the gender perspective (GP) in research projects. Validation study in two stages: feasibility was analysed in the first; reliability, internal consistency and validity in the second. Aragón Institute of Health Science, Aragón, Spain. GPIHR was applied to 118 research projects funded in national and international competitive tenders from 2003 to 2012. Analysis of inter- and intra-observer reliability with the Kappa index and of internal consistency with Cronbach's alpha. Content validity was analysed through literature review and construct validity with an exploratory factor analysis. The validated GPIHR has 10 questions: 3 in the introduction, 1 for objectives, 3 for methodology and 3 for research purpose. Average time of application was 13 min. Inter-observer reliability (Kappa) varied between 0.35 and 0.94 and intra-observer between 0.40 and 0.94. The theoretical construct is supported in the literature. Factor analysis identifies three levels of GP inclusion: "difference by sex", "gender sensitive" and "feminist research", with internal consistencies of 0.64, 0.87 and 0.81, respectively, which explain 74.78% of variance. The GPIHR questionnaire is a valid tool to assess GP and useful for researchers who would like to include GP in their projects. Copyright © 2014 Elsevier España, S.L.U. All rights reserved.
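
    Both reliability statistics reported above are short computations. A minimal sketch on made-up rating data (not the GPIHR scores):

        # Cronbach's alpha (internal consistency) and Cohen's kappa
        # (inter-observer agreement) on synthetic binary ratings.
        import numpy as np

        def cronbach_alpha(items):
            """items: (n_subjects, k_items) score matrix."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_var = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return (k / (k - 1.0)) * (1.0 - item_var / total_var)

        def cohen_kappa(a, b, categories=2):
            """a, b: integer ratings from two observers on the same items."""
            a, b = np.asarray(a), np.asarray(b)
            po = np.mean(a == b)                      # observed agreement
            pe = sum(np.mean(a == c) * np.mean(b == c)
                     for c in range(categories))      # chance agreement
            return (po - pe) / (1.0 - pe)

        scores = np.random.default_rng(1).integers(0, 2, size=(118, 10))
        # Random data, so alpha will be near zero; real scales score higher.
        print("alpha: %.2f" % cronbach_alpha(scores))
        # Demo: treat two score columns as two observers' binary ratings.
        print("kappa: %.2f" % cohen_kappa(scores[:, 0], scores[:, 1]))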

  16. A methodology to estimate uncertainty for emission projections through sensitivity analysis.

    PubMed

    Lumbreras, Julio; de Andrés, Juan Manuel; Pérez, Javier; Borge, Rafael; de la Paz, David; Rodríguez, María Encarnación

    2015-04-01

    Air pollution abatement policies must be based on quantitative information on current and future emissions of pollutants. As emission projection uncertainties are inevitable and traditional statistical treatments of uncertainty are highly time- and resource-consuming, a simplified methodology for nonstatistical uncertainty estimation based on sensitivity analysis is presented in this work. The methodology was applied to the "with measures" scenario for Spain, specifically to the 12 highest-emitting sectors for greenhouse gases and air pollutants. Examples of the methodology's application to two important sectors (power plants, and agriculture and livestock) are shown and explained in depth. Uncertainty bands were obtained up to 2020 by modifying the driving factors of the 12 selected sectors, and the methodology was tested against a recomputed emission trend in a low economic-growth perspective and against official figures for 2010, showing a very good performance. A solid understanding and quantification of uncertainties related to atmospheric emission inventories and projections provide useful information for policy negotiations. However, as many of those uncertainties are irreducible, there is interest in how they could be managed in order to derive robust policy conclusions. Taking this into account, a method developed to use sensitivity analysis as a source of information to derive nonstatistical uncertainty bands for emission projections is presented and applied to Spain. This method simplifies uncertainty assessment and allows other countries to take advantage of their sensitivity analyses.
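
    The core idea, projecting an emission trend under perturbed driving factors to obtain a nonstatistical band, can be sketched in a few lines (hypothetical baseline and growth rates, not the Spanish inventory figures):

        # Sensitivity-derived uncertainty band for an emission projection.
        import numpy as np

        years = np.arange(2013, 2021)
        base_emissions = 100.0          # kt of pollutant in the base year
        activity_growth = 0.02          # central driving-factor assumption
        spread = 0.01                   # sensitivity: growth varied +/- 1 pt

        def project(growth):
            return base_emissions * (1.0 + growth) ** (years - years[0])

        central = project(activity_growth)
        upper = project(activity_growth + spread)
        lower = project(activity_growth - spread)

        for y, lo, c, hi in zip(years, lower, central, upper):
            print("%d  %6.1f  [%6.1f, %6.1f]" % (y, c, lo, hi))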

  17. Detection of Buried Targets via Active Selection of Labeled Data: Application to Sensing Subsurface UXO

    DTIC Science & Technology

    2007-06-01

    …this project was on developing new statistical algorithms for analysis of electromagnetic induction (EMI) and magnetometer data measured at actual…

  18. GetData: A filesystem-based, column-oriented database format for time-ordered binary data

    NASA Astrophysics Data System (ADS)

    Wiebe, Donald V.; Netterfield, Calvin B.; Kisner, Theodore S.

    2015-12-01

    The GetData Project is the reference implementation of the Dirfile Standards, a filesystem-based, column-oriented database format for time-ordered binary data. Dirfiles provide a fast, simple format for storing and reading data, suitable for both quicklook and analysis pipelines. GetData provides a C API and bindings exist for various other languages. GetData is distributed under the terms of the GNU Lesser General Public License.
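
    The sketch below deliberately does not use the GetData API (its C library and language bindings are documented with the project); it only illustrates the dirfile concept itself, assuming a directory in which each field is stored as its own flat binary file so that single columns can be read quickly and independently.

        # Conceptual sketch only -- NOT the GetData API.  Each field lives in
        # its own raw binary file inside a directory, dirfile-style.
        import numpy as np, os, tempfile

        dirfile = tempfile.mkdtemp(prefix="dirfile_demo_")

        # "Write" two time-ordered fields as raw binary columns.
        detector = np.random.randn(100_000).astype("<f8")
        azimuth = np.linspace(0.0, 360.0, 100_000).astype("<f8")
        detector.tofile(os.path.join(dirfile, "detector"))
        azimuth.tofile(os.path.join(dirfile, "azimuth"))

        # Quicklook read of one column without touching the other:
        # memory-map the file and slice the samples of interest.
        col = np.memmap(os.path.join(dirfile, "detector"), dtype="<f8", mode="r")
        print("samples 1000-1005 of 'detector':", np.asarray(col[1000:1005]))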

  19. WISE: Automated support for software project management and measurement. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Ramakrishnan, Sudhakar

    1995-01-01

    One important aspect of software development and IV&V is measurement. Unless a software development effort is measured in some way, it is difficult to judge the effectiveness of current efforts and predict future performance. Collection of metrics and adherence to a process are difficult tasks in a software project. Change activity is a powerful indicator of project status. Automated systems that can handle change requests, issues, and other process documents provide an excellent platform for tracking the status of the project. A World Wide Web based architecture is developed for (a) making metrics collection an implicit part of the software process, (b) providing metric analysis dynamically, (c) supporting automated tools that can complement current practices of in-process improvement, and (d) overcoming geographical barriers. An operational system (WISE) instantiates this architecture, allowing for the improvement of the software process in a realistic environment. The tool tracks issues in the software development process, provides informal communication among users with different roles, supports to-do lists (TDL), and helps in software process improvement. WISE minimizes the time devoted to metrics collection and analysis, and captures software change data. Automated tools like WISE focus on understanding and managing the software process. The goal is improvement through measurement.

  20. Multigeneration data migration from legacy systems

    NASA Astrophysics Data System (ADS)

    Ratib, Osman M.; Liu, Brent J.; Kho, Hwa T.; Tao, Wenchao; Wang, Cun; McCoy, J. Michael

    2003-05-01

    The migration of image data from different generations of legacy archive systems represents a technical challenge and an incremental cost in transitions to newer generations of PACS. UCLA Medical Center has elected to completely replace the existing PACS infrastructure, encompassing several generations of legacy systems, with a new commercial system providing enterprise-wide image management and communication. One of the most challenging parts of the project was the migration of large volumes of legacy images into the new system. Planning of the migration required the development of specialized software and hardware, and included different phases of data mediation from existing databases to the new PACS database prior to the migration of the image data. The project plan included a detailed analysis of resources and cost of data migration to optimize the process and minimize the delay of a hybrid operation where the legacy systems need to remain operational. Our analysis and project planning showed that the data migration represents the most critical path in the process of PACS renewal. Careful planning and optimization of the project timeline and allocated resources are critical to minimize the financial impact and the time delays that such migrations can impose on the implementation plan.

  1. GI-POP: a combinational annotation and genomic island prediction pipeline for ongoing microbial genome projects.

    PubMed

    Lee, Chi-Ching; Chen, Yi-Ping Phoebe; Yao, Tzu-Jung; Ma, Cheng-Yu; Lo, Wei-Cheng; Lyu, Ping-Chiang; Tang, Chuan Yi

    2013-04-10

    Sequencing of microbial genomes is important because microbes carry genes for antibiotic resistance and pathogenesis. However, even with the help of new assembly software, finishing a whole genome is a time-consuming task. In most bacteria, pathogenic or antibiotic genes are carried in genomic islands. Therefore, a quick genomic island (GI) prediction method is useful for ongoing genome sequencing projects. In this work, we built a Web server called GI-POP (http://gipop.life.nthu.edu.tw) which integrates a sequence assembling tool, a functional annotation pipeline, and a high-performance GI predicting module based on a support vector machine (SVM) method called genomic island genomic profile scanning (GI-GPS). The draft genomes of ongoing genome projects, in contigs or scaffolds, can be submitted to our Web server, which returns functional annotation and highly probable GI predictions. GI-POP is a comprehensive annotation Web server designed for ongoing genome project analysis. Researchers can perform annotation and obtain pre-analytic information, including possible GIs, coding/non-coding sequences and functional analyses, from their draft genomes. This pre-analytic system can provide useful information for finishing a genome sequencing project. Copyright © 2012 Elsevier B.V. All rights reserved.
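
    GI-GPS's actual features and training data are not reproduced here; the sketch below only illustrates the general shape of SVM-based genomic profile scanning with scikit-learn, using GC content of sequence windows as a stand-in feature and fully synthetic sequences as data.

        # Toy SVM genomic-island scan: host-like vs island-like windows
        # separated by a compositional feature (GC content).  Synthetic data.
        import numpy as np
        from sklearn.svm import SVC

        rng = np.random.default_rng(2)

        def gc_content(window):
            return sum(b in "GC" for b in window) / len(window)

        def random_seq(n, gc):
            p = [(1 - gc) / 2, gc / 2, gc / 2, (1 - gc) / 2]  # A, C, G, T
            return "".join(rng.choice(list("ACGT"), p=p, size=n))

        # Training set: host windows (GC ~ 0.50) vs island windows (GC ~ 0.38).
        host = [random_seq(1000, 0.50) for _ in range(50)]
        island = [random_seq(1000, 0.38) for _ in range(50)]
        X = np.array([[gc_content(w)] for w in host + island])
        y = np.array([0] * 50 + [1] * 50)

        clf = SVC(kernel="rbf", probability=True).fit(X, y)

        query = random_seq(1000, 0.40)   # a window from a draft contig
        prob = clf.predict_proba([[gc_content(query)]])[0, 1]
        print("P(genomic island) = %.2f" % prob)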

  2. Risk assessment of underpass infrastructure project based on ISO 31000 and ISO 21500 using fishbone diagram and RFMEA (project risk failure mode and effects analysis) method

    NASA Astrophysics Data System (ADS)

    Purwanggono, Bambang; Margarette, Anastasia

    2017-12-01

    Timely completion of highway construction is important for smooth transportation, especially as motor vehicle ownership is expected to increase each year. This study was therefore conducted to analyze the constraints in an infrastructure development project. The research was conducted on the Jatingaleh Underpass Project, Semarang, while the project was running and experiencing delays in its implementation. The aim is to identify the constraints that occur in the execution of a road infrastructure project, in particular those causing delays. A fishbone diagram was used to find root causes and derive possible means of mitigation, coupled with the RFMEA method to determine the critical risks that must be addressed immediately on the road infrastructure project. The tabulated data indicate that the most feasible mitigation measure is to issue Standard Operating Procedure (SOP) recommendations for handling utilities that interfere with project implementation. The risk assessment process was carried out systematically based on ISO 31000:2009 on risk management, and the delay variables were determined using the process-group requirements of ISO 21500:2013 on project management.
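
    The scoring step at the heart of (R)FMEA is simple enough to sketch: severity, occurrence, and detection ratings (1-10) multiply into a risk priority number, and risks above a cut-off are escalated. The register entries and threshold below are hypothetical, not the Jatingaleh data.

        # Toy RFMEA-style scoring: RPN = severity * occurrence * detection.
        risks = [
            {"risk": "utility lines inside excavation area", "S": 8, "O": 7, "D": 6},
            {"risk": "late delivery of precast segments",    "S": 6, "O": 5, "D": 4},
            {"risk": "heavy rain floods the underpass pit",  "S": 7, "O": 4, "D": 3},
        ]

        RPN_CRITICAL = 200   # project-specific cut-off for "address immediately"

        for r in risks:
            r["RPN"] = r["S"] * r["O"] * r["D"]

        for r in sorted(risks, key=lambda r: r["RPN"], reverse=True):
            flag = "CRITICAL" if r["RPN"] >= RPN_CRITICAL else ""
            print("%-40s RPN=%3d %s" % (r["risk"], r["RPN"], flag))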

  3. Analysis of the Transient Process of Wind Power Resources when there are Voltage Sags in the Distribution Grid

    NASA Astrophysics Data System (ADS)

    Nhu Y, Do

    2018-03-01

    Vietnam has abundant wind power resources, and both the capacity and the number of wind power projects in Vietnam are increasing year by year. As more wind power is fed into the national grid, it is necessary to study and analyze the connections in order to ensure their safety and reliability. In the national distribution grid, voltage sags occur regularly and can strongly affect the operation of wind power; the most serious consequence is disconnection. The paper presents an analysis of the distribution grid's transient processes during voltage sags. Based on the analysis, solutions are recommended to improve the reliability and effective operation of wind power resources.

  4. An application of computer aided requirements analysis to a real time deep space system

    NASA Technical Reports Server (NTRS)

    Farny, A. M.; Morris, R. V.; Hartsough, C.; Callender, E. D.; Teichroew, D.; Chikofsky, E.

    1981-01-01

    The entire procedure of incorporating the requirements and goals of a space flight project into integrated, time ordered sequences of spacecraft commands, is called the uplink process. The Uplink Process Control Task (UPCT) was created to examine the uplink process and determine ways to improve it. The Problem Statement Language/Problem Statement Analyzer (PSL/PSA) designed to assist the designer/analyst/engineer in the preparation of specifications of an information system is used as a supporting tool to aid in the analysis. Attention is given to a definition of the uplink process, the definition of PSL/PSA, the construction of a PSA database, the value of analysis to the study of the uplink process, and the PSL/PSA lessons learned.

  5. The BANANA project. V. Misaligned and precessing stellar rotation axes in CV Velorum

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Albrecht, Simon; Winn, Joshua N.; Triaud, Amaury

    As part of the Binaries Are Not Always Neatly Aligned project (BANANA), we have found that the eclipsing binary CV Velorum has misaligned rotation axes. Based on our analysis of the Rossiter-McLaughlin effect, we find sky-projected spin-orbit angles of β_p = –52° ± 6° and β_s = 3° ± 7° for the primary and secondary stars (B2.5V + B2.5V, P = 6.9 days). We combine this information with several measurements of changing projected stellar rotation speeds (v sin i_*) over the last 30 yr, leading to a model in which the primary star's obliquity is ≈65°, and its spin axis precesses around the total angular momentum vector with a period of about 140 yr. The geometry of the secondary star is less clear, although a significant obliquity is also implicated by the observed time variations in the v sin i_*. By integrating the secular tidal evolution equations backward in time, we find that the system could have evolved from a state of even stronger misalignment similar to DI Herculis, a younger but otherwise comparable binary.

  6. A model to assess the Mars Telecommunications Network relay robustness

    NASA Technical Reports Server (NTRS)

    Girerd, Andre R.; Meshkat, Leila; Edwards, Charles D., Jr.; Lee, Charles H.

    2005-01-01

    The relatively long mission durations and compatible radio protocols of current and projected Mars orbiters have enabled the gradual development of a heterogeneous constellation providing proximity communication services for surface assets. The current and forecasted capability of this evolving network has reached the point that designers of future surface missions consider complete dependence on it. Such designers, along with those architecting network requirements, have a need to understand the robustness of projected communication service. A model has been created to identify the robustness of the Mars Network as a function of surface location and time. Due to the decade-plus time horizon considered, the network will evolve, with emerging productive nodes and nodes that cease or fail to contribute. The model is a flexible framework to holistically process node information into measures of capability robustness that can be visualized for maximum understanding. Outputs from JPL's Telecom Orbit Analysis Simulation Tool (TOAST) provide global telecom performance parameters for current and projected orbiters. Probabilistic estimates of orbiter fuel life are derived from orbit keeping burn rates, forecasted maneuver tasking, and anomaly resolution budgets. Orbiter reliability is estimated probabilistically. A flexible scheduling framework accommodates the projected mission queue as well as potential alterations.

  7. The Forsyth County Cervical Cancer Prevention Project--II. Compliance with screening follow-up of abnormal cervical smears.

    PubMed

    Michielutte, R; Dignan, M; Bahnson, J; Wells, H B

    1994-12-01

    The Forsyth County Cervical Cancer Prevention Project was a community-wide cancer education program to address the problem of cervical cancer incidence and mortality among minority women in Forsyth County, North Carolina. This paper reports program results with regard to increasing compliance with follow-up for abnormal cervical smears. An analysis of trends prior to and after implementation of the educational program was conducted in one private and two public health primary care clinics to provide an assessment of the impact of the project in improving compliance with follow-up among black women. A similar analysis also was conducted for white women. The results of medical record reviews of follow-up procedures for 878 abnormal cervical smears suggested a modest program effect among black women. The percentage of black women who returned for follow-up and treatment of an abnormal cervical smear significantly increased during the time the program was in effect. The trend analysis further indicated that this increase did not begin prior to the intervention period and was maintained throughout the duration of the intervention. No significant change in the percentage who returned for follow-up was found for white women.

  8. An Evaluation of Research Ethics in Undergraduate Health Science Research Methodology Programs at a South African University.

    PubMed

    Coetzee, Tanya; Hoffmann, Willem A; de Roubaix, Malcolm

    2015-10-01

    The amended research ethics policy at a South African University required the ethics review of undergraduate research projects, prompting the need to explore the content and teaching approach of research ethics education in health science undergraduate programs. Two qualitative data collection strategies were used: document analysis (syllabi and study guides) and semi-structured interviews with research methodology coordinators. Five main themes emerged: (a) timing of research ethics courses, (b) research ethics course content, (c) sub-optimal use of creative classroom activities to facilitate research ethics lectures, (d) understanding the need for undergraduate project research ethics review, and (e) research ethics capacity training for research methodology lecturers and undergraduate project supervisors. © The Author(s) 2015.

  9. An Analysis of the Costs, Benefits, and Implications of Different Approaches to Capturing the Value of Renewable Energy Tax Incentives

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolinger, Mark

    This report compares the relative costs, benefits, and implications of capturing the value of renewable energy tax benefits in these three different ways – applying them against outside income, carrying them forward in time until they can be fully absorbed internally, or monetizing them through third-party tax equity investors – to see which method is most competitive under various scenarios. It finds that under current law and late-2013 market conditions, monetization makes sense for all but the most tax-efficient project sponsors. In other words, for most project sponsors, bringing in third-party tax equity currently provides net benefits to a project.

  10. High resolution global climate modelling; the UPSCALE project, a large simulation campaign

    NASA Astrophysics Data System (ADS)

    Mizielinski, M. S.; Roberts, M. J.; Vidale, P. L.; Schiemann, R.; Demory, M.-E.; Strachan, J.; Edwards, T.; Stephens, A.; Lawrence, B. N.; Pritchard, M.; Chiu, P.; Iwi, A.; Churchill, J.; del Cano Novales, C.; Kettleborough, J.; Roseblade, W.; Selwood, P.; Foster, M.; Glover, M.; Malcolm, A.

    2014-01-01

    The UPSCALE (UK on PRACE: weather-resolving Simulations of Climate for globAL Environmental risk) project constructed and ran an ensemble of HadGEM3 (Hadley Centre Global Environment Model 3) atmosphere-only global climate simulations over the period 1985-2011, at resolutions of N512 (25 km), N216 (60 km) and N96 (130 km) as used in current global weather forecasting, seasonal prediction and climate modelling respectively. Alongside these present climate simulations a parallel ensemble looking at extremes of future climate was run, using a time-slice methodology to consider conditions at the end of this century. These simulations were primarily performed using a 144 million core hour, single year grant of computing time from PRACE (the Partnership for Advanced Computing in Europe) in 2012, with additional resources supplied by the Natural Environment Research Council (NERC) and the Met Office. Almost 400 terabytes of simulation data were generated on the HERMIT supercomputer at the High Performance Computing Center Stuttgart (HLRS), and transferred to the JASMIN super-data cluster provided by the Science and Technology Facilities Council Centre for Data Archival (STFC CEDA) for analysis and storage. In this paper we describe the implementation of the project, present the technical challenges in terms of optimisation, data output, transfer and storage that such a project involves and include details of the model configuration and the composition of the UPSCALE dataset. This dataset is available for scientific analysis to allow assessment of the value of model resolution in both present and potential future climate conditions.

  11. High-resolution global climate modelling: the UPSCALE project, a large-simulation campaign

    NASA Astrophysics Data System (ADS)

    Mizielinski, M. S.; Roberts, M. J.; Vidale, P. L.; Schiemann, R.; Demory, M.-E.; Strachan, J.; Edwards, T.; Stephens, A.; Lawrence, B. N.; Pritchard, M.; Chiu, P.; Iwi, A.; Churchill, J.; del Cano Novales, C.; Kettleborough, J.; Roseblade, W.; Selwood, P.; Foster, M.; Glover, M.; Malcolm, A.

    2014-08-01

    The UPSCALE (UK on PRACE: weather-resolving Simulations of Climate for globAL Environmental risk) project constructed and ran an ensemble of HadGEM3 (Hadley Centre Global Environment Model 3) atmosphere-only global climate simulations over the period 1985-2011, at resolutions of N512 (25 km), N216 (60 km) and N96 (130 km) as used in current global weather forecasting, seasonal prediction and climate modelling respectively. Alongside these present climate simulations a parallel ensemble looking at extremes of future climate was run, using a time-slice methodology to consider conditions at the end of this century. These simulations were primarily performed using a 144 million core hour, single year grant of computing time from PRACE (the Partnership for Advanced Computing in Europe) in 2012, with additional resources supplied by the Natural Environment Research Council (NERC) and the Met Office. Almost 400 terabytes of simulation data were generated on the HERMIT supercomputer at the High Performance Computing Center Stuttgart (HLRS), and transferred to the JASMIN super-data cluster provided by the Science and Technology Facilities Council Centre for Data Archival (STFC CEDA) for analysis and storage. In this paper we describe the implementation of the project, present the technical challenges in terms of optimisation, data output, transfer and storage that such a project involves and include details of the model configuration and the composition of the UPSCALE data set. This data set is available for scientific analysis to allow assessment of the value of model resolution in both present and potential future climate conditions.

  12. 24 CFR 965.406 - Benefit/cost analysis for similar projects.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    24 CFR, Housing and Urban Development, Existing PHA-Owned Projects, § 965.406 Benefit/cost analysis for similar projects: PHAs with more than one project of similar design and utilities service may prepare a benefit/cost analysis for a representative…

  13. Digitizing dissertations for an institutional repository: a process and cost analysis.

    PubMed

    Piorun, Mary; Palmer, Lisa A

    2008-07-01

    This paper describes the Lamar Soutter Library's process and costs associated with digitizing 300 doctoral dissertations for a newly implemented institutional repository at the University of Massachusetts Medical School. Project tasks included identifying metadata elements, obtaining and tracking permissions, converting the dissertations to an electronic format, and coordinating workflow between library departments. Each dissertation was scanned, reviewed for quality control, enhanced with a table of contents, processed through an optical character recognition function, and added to the institutional repository. Three hundred and twenty dissertations were digitized and added to the repository for a cost of $23,562, or $0.28 per page. Seventy-four percent of the authors who were contacted (n = 282) granted permission to digitize their dissertations. Processing time per title was 170 minutes, for a total processing time of 906 hours. In the first 17 months, full-text dissertations in the collection were downloaded 17,555 times. Locally digitizing dissertations or other scholarly works for inclusion in institutional repositories can be cost effective, especially if small, defined projects are chosen. A successful project serves as an excellent recruitment strategy for the institutional repository and helps libraries build new relationships. Challenges include workflow, cost, policy development, and copyright permissions.

  14. Outcrop Analysis of the Cretaceous Mesaverde Group: Jicarilla Apache Reservation, New Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ridgley, Jennie; Dunbar, Robin Wright

    2001-04-24

    Field work for this project was conducted during July and April 1998, at which time fourteen measured sections were described and correlated on or adjacent to Jicarilla Apache Reservation lands. A fifteenth section, described east of the main field area, is included in this report, although its distant location precluded use in the correlations and cross sections presented herein. Ground-based photo mosaics were shot for much of the exposed Mesaverde outcrop belt and were used to assist in correlation. Outcrop gamma-ray surveys were conducted at six of the fifteen measured sections using a GAD-6 scintillometer. The raw gamma-ray data are included in this report; however, analysis of those data is part of the ongoing Phase Two of this project.

  15. Urban ninth-grade girls' interactions with and outcomes from a design-oriented physics project

    NASA Astrophysics Data System (ADS)

    Higginbotham, Thomas Eric Miksad

    Past literature has documented a shrinking but persistent gap in physics and engineering for females, both in school and in the workforce. A commonly recommended strategy to invite girls into science at the school level is to have students work on design-projects in groups, which has been shown to increase all students' learning outcomes and attitudes towards science. Students (n=28) in a ninth-grade inner-city physics class participated in such a project, in which they built remotely operated underwater vehicles (ROVs) over the course of one month. Students (n=23) in a comparison classroom learned the same content using the Active Physics curriculum during the same time frame. Mixed methods were used to study the ROV classroom. Students in both classes were given pre- and post-physics content tests. Qualitative data collected during the project included field notes, video, and teacher interviews. Macro-level data analysis was done, which informed further micro-analysis. Macro-analysis revealed significantly higher learning outcomes for the ROV class than for the non-ROV class. Within the ROV class, girls, and in particular, girls in female-majority groups had increased learning outcomes and high levels of interest and engagement with the project, while girls in mixed-sex and male-majority groups did not. Qualitative macro-analysis revealed that in all of the female-majority groups, females took leadership roles within the groups, while in all of the non female-majority groups, males took leadership roles. The only groups in which girls completely disengaged from the project were mixed-sex or male majority groups. Case studies and cross case analysis suggested that girls foregrounded group process over product, and used the level of group unity as a metric of the groups' success. Groups led by girls were more cooperative and exhibited distributed leadership and participation. These findings were interpreted through lenses of expectation states theory and social interdependence theory. This study suggests that the commonly recommended non-sexist strategy of using hands-on group work can be positive, but should be undertaken with conscious attention to group dynamics.

  16. Using Group Research Projects to Stimulate Undergraduate Astronomy Major Learning

    NASA Astrophysics Data System (ADS)

    McGraw, Allison M.; Hardegree-Ullman, K. K.; Turner, J. D.; Shirley, Y. L.; Walker-LaFollette, A. M.; Robertson, A. N.; Carleton, T. M.; Smart, B. M.; Towner, A. P. M.; Wallace, S. C.; Smith, C. W.; Small, L. C.; Daugherty, M. J.; Guvenen, B. C.; Crawford, B. E.; Austin, C. L.; Schlingman, W. M.

    2012-05-01

    The University of Arizona Astronomy Club has been working on two large group research projects since 2009. One research project is a transiting extrasolar planet project that is fully student led and run. We observed the transiting exoplanets, TrES-3b and TrES-4b, with the 1.55 meter Kuiper Telescope in near-UV and optical filters in order to detect any asymmetries between filters. The second project is a radio astronomy survey utilizing the Arizona Radio Observatory 12m telescope on Kitt Peak to study molecular gas in cold cores identified by the Planck all sky survey. This project provides a unique opportunity for a large group of students to get hands-on experience observing with a world-class radio observatory. These projects involve students in every single step of the process including: proposal writing to obtain telescope time on various Southern Arizona telescopes, observing at these telescopes, data reduction and analysis, managing large data sets, and presenting results at scientific meetings and in journal publications. The primary goal of these projects is to involve students in cutting-edge research early on in their undergraduate studies. The projects are designed to be continuous long term projects so that new students can easily join. As of January 2012 the extrasolar planet project became an official independent study class. New students learn from the more experienced students on the projects creating a learner-centered environment.

  17. Open environments to support systems engineering tool integration: A study using the Portable Common Tool Environment (PCTE)

    NASA Technical Reports Server (NTRS)

    Eckhardt, Dave E., Jr.; Jipping, Michael J.; Wild, Chris J.; Zeil, Steven J.; Roberts, Cathy C.

    1993-01-01

    A study of computer engineering tool integration using the Portable Common Tool Environment (PCTE) Public Interface Standard is presented. Over a 10-week time frame, three existing software products were encapsulated to work in the Emeraude environment, an implementation of the PCTE version 1.5 standard. The software products used were a computer-aided software engineering (CASE) design tool, a software reuse tool, and a computer architecture design and analysis tool. The tool set was then demonstrated to work in a coordinated design process in the Emeraude environment. The project and the features of PCTE used are described, experience with the use of Emeraude environment over the project time frame is summarized, and several related areas for future research are summarized.

  18. The AAL project: automated monitoring and intelligent analysis for the ATLAS data taking infrastructure

    NASA Astrophysics Data System (ADS)

    Kazarov, A.; Lehmann Miotto, G.; Magnoni, L.

    2012-06-01

    The Trigger and Data Acquisition (TDAQ) system of the ATLAS experiment at CERN is the infrastructure responsible for collecting and transferring ATLAS experimental data from detectors to the mass storage system. It relies on a large, distributed computing environment, including thousands of computing nodes with thousands of applications running concurrently. In such a complex environment, information analysis is fundamental for controlling application behavior, error reporting and operational monitoring. During data-taking runs, streams of messages sent by applications via the message reporting system, together with data published from applications via information services, are the main sources of knowledge about the correctness of running operations. The flow of data produced (with an average rate of O(1-10 kHz)) is constantly monitored by experts to detect problems or misbehavior. This requires strong competence and experience in understanding and discovering problems and root causes, and often the meaningful information is not in a single message or update but in the aggregated behavior over a certain time window. The AAL project aims to reduce the manpower needed and to assure a constantly high quality of problem detection by automating most of the monitoring tasks and providing real-time correlation of data-taking and system metrics. The project combines technologies from different disciplines: it leverages an event-driven architecture to unify the flow of data from the ATLAS infrastructure, a Complex Event Processing (CEP) engine for the correlation of events, and a message-oriented architecture for component integration. The project is composed of two main components: a core processing engine, responsible for the correlation of events through expert-defined queries, and a Web-based front end to present real-time information and interact with the system. All components work in a loosely coupled, event-based architecture, with a message broker centralizing all communication between modules. The result is an intelligent system able to extract and compute relevant information from the flow of operational data and provide real-time feedback to human experts, who can promptly react when needed. The paper presents the design and implementation of the AAL project, together with the results of its usage as an automated monitoring assistant for the ATLAS data-taking infrastructure.
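
    The expert-defined correlation queries can be pictured as small stateful rules over the message stream. The toy fragment below is not AAL's engine, and the application name and thresholds are hypothetical; it keeps a sliding time window per application and raises an alert when the error rate in the window crosses a threshold.

        # Toy CEP-style rule: alert when one application's error rate within a
        # sliding window exceeds a threshold.
        from collections import deque

        WINDOW_S = 60.0     # correlation window, seconds
        THRESHOLD = 5       # errors per window that should alert an expert

        class RateRule:
            def __init__(self, app):
                self.app, self.events = app, deque()

            def feed(self, timestamp, app, severity):
                if app != self.app or severity != "ERROR":
                    return None
                self.events.append(timestamp)
                # Expire events that have fallen outside the window.
                while self.events and timestamp - self.events[0] > WINDOW_S:
                    self.events.popleft()
                if len(self.events) >= THRESHOLD:
                    return "ALERT: %s produced %d errors in %.0f s" % (
                        self.app, len(self.events), WINDOW_S)

        rule = RateRule("daq_node_042")   # hypothetical application name
        stream = [(t, "daq_node_042", "ERROR") for t in (1, 5, 9, 20, 31, 35)]
        for ts, app, sev in stream:
            msg = rule.feed(ts, app, sev)
            if msg:
                print(msg)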

  19. ALL-PATHWAYS DOSE ANALYSIS FOR THE PORTSMOUTH ON-SITE WASTE DISPOSAL FACILITY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, F.; Phifer, M.

    A Portsmouth On-Site Waste Disposal Facility (OSWDF) all-pathways analysis has been conducted that considers the radiological impacts to a resident farmer. It is assumed that the resident farmer utilizes a farm pond contaminated by the OSWDF to irrigate a garden and pasture and to water livestock from which food for the resident farmer is obtained, and that the farmer utilizes groundwater from the Berea sandstone aquifer for domestic purposes (i.e., drinking water and showering). As described by FBP 2014b, the Hydrologic Evaluation of Landfill Performance (HELP) model (Schroeder et al. 1994) and the Subsurface Transport Over Multiple Phases (STOMP) model (White and Oostrom 2000, 2006) were used to model the flow and transport from the OSWDF to the Points of Assessment (POAs) associated with the 680-ft elevation sandstone layer (680 SSL) and the Berea sandstone aquifer. From this modeling, the activity concentrations of radionuclides were projected over time at the POAs. The activity concentrations were utilized as input to a GoldSimTM (GTG 2010) dose model, described herein, in order to project the dose to a resident farmer over time. A base case and five sensitivity cases were analyzed. The sensitivity cases included an evaluation of the impacts of using a conservative inventory, an uncased well to the Berea sandstone aquifer, a low waste-zone uranium distribution coefficient (Kd), different transfer factors, and reference person exposure parameters (i.e., at the 95th percentile). The maximum base case dose within the 1,000-year assessment period was projected to be 1.5E-14 mrem/yr, and the maximum base case dose at any time less than 10,000 years was projected to be 0.002 mrem/yr. The maximum projected dose of any sensitivity case was approximately 2.6 mrem/yr, associated with the use of an uncased well to the Berea sandstone aquifer. This sensitivity case is considered very unlikely because it assumes leakage from the location of greatest concentration in the 680 SSL into the Berea sandstone aquifer over time and does not conform to standard private water well construction practices. The bottom line is that all predicted doses from the base case and five sensitivity cases fall well below the DOE all-pathways Performance Objective of 25 mrem/yr.

  20. A Climatology of Tropospheric CO over the Central and Southeastern United States and the Southwestern Pacific Ocean Derived from Space, Air, and Ground-based Infrared Interferometer Spectra

    NASA Technical Reports Server (NTRS)

    McMillian, W. Wallace; Strow, L. Larrabee; Revercomb, H.; Knuteson, R.; Thompson, A.

    2003-01-01

    This final report summarizes all research activities and publications undertaken as part of NASA Atmospheric Chemistry and Modeling Analysis Program (ACMAP) Grant NAG-1-2022, 'A Climatology of Tropospheric CO over the Central and Southeastern United States and the Southwestern Pacific Ocean Derived from Space, Air, and Ground-based Infrared Interferometer Spectra'. Major project accomplishments include: (1) analysis of more than 300,000 AERI spectra from the ARM SGP site yielding a 5-year (1998-2002) timeseries of CO retrievals from the Lamont, OK AERI; (2) development of a prototype CO profile retrieval algorithm for AERI spectra; (3) validation and publication of the first CO retrievals from the Scanning High-resolution Interferometer Sounder (SHIS); and (4) development of a prototype AERI tropospheric O3 retrieval algorithm. Compilation and publication of the 5-year Lamont, OK timeseries is underway, including a new collaboration with scientists at the Lawrence Berkeley National Laboratory. Public access to this data will be provided upon article submission. A comprehensive CO analysis of the archive of HIS spectra remains the only originally proposed activity with little progress. The greatest challenge faced in this project was motivating the University of Wisconsin Co-Investigators to deliver their archived HIS and AERIOO data along with the requisite temperature and water vapor profiles in a timely manner. Part of the supplied HIS dataset from ASHOE may be analyzed as part of a Master's Thesis under a separate project. Our success with the SAFARI 2000 SHIS CO analysis demonstrates the utility of such aircraft remote sensing data given the proper support from the instrument investigators. In addition to the PI and Co-Is, personnel involved in this CO climatology project include one Post Doctoral Fellow, one Research Scientist, two graduate students, and two undergraduate students. A total of fifteen presentations regarding research related to this project were delivered at eleven different scientific meetings. Thus far, three publications have resulted from this project with another five in preparation. No subject inventions resulted from this research project.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agol, Dorice, E-mail: d.agol@uea.ac.uk; Latawiec, Agnieszka E., E-mail: a.latawiec@iis-rio.org; Opole University of Technology, Department of Production Engineering and Logistics, Luboszycka 5, 45-036 Opole

    There has been an increased interest in using sustainability indicators for evaluating the impacts of development and conservation projects. Past and recent experiences have shown that sustainability indicators can be powerful tools for measuring the outcomes of various interventions, when used appropriately and adequately. Currently, there is a range of methods for applying sustainability indicators for project impact evaluation at the environment–development interface. At the same time, a number of challenges persist which have implications for impact evaluation processes, especially in developing countries. We highlight some key and recurrent challenges, using three cases from Kenya, Indonesia and Brazil. In this study, we have conducted a comparative analysis across multiple projects from the three countries, which aimed to conserve biodiversity and improve livelihoods. The assessments of these projects were designed to evaluate their positive, negative, short-term, long-term, direct and indirect impacts. We have identified a set of commonly used sustainability indicators to evaluate the projects and have discussed opportunities and challenges associated with their application. Our analysis shows that impact evaluation processes present good opportunities for applying sustainability indicators. On the other hand, we find that project proponents (e.g. managers, evaluators, donors/funders) face challenges with establishing the full impacts of interventions, and that these are rooted in monitoring and evaluation processes, lack of evidence-based impacts, difficulties of measuring certain outcomes and concerns over the scale of a range of impacts. We outline key lessons learnt from the multiple cases and propose ways to overcome common problems. Results from our analysis demonstrate practical experiences of applying sustainability indicators in developing-country contexts with different prevailing socio-economic, cultural and environmental conditions. The knowledge derived from this study may therefore be useful to a wider audience concerned with the sustainable integration of development and environmental conservation. - Highlights: • Sustainability indicators are increasingly used for evaluating project impacts. • Lessons learnt are based on case studies from Africa, Asia and South America. • Similar challenges arise when assessing impacts of development and conservation projects. • Pragmatic solutions are needed to overcome challenges when assessing project impacts.

  2. A quality improvement project sustainably decreased time to onset of active physical therapy intervention in patients with acute lung injury.

    PubMed

    Dinglas, Victor D; Parker, Ann M; Reddy, Dereddi Raja S; Colantuoni, Elizabeth; Zanni, Jennifer M; Turnbull, Alison E; Nelliot, Archana; Ciesla, Nancy; Needham, Dale M

    2014-10-01

    Rehabilitation started early during an intensive care unit (ICU) stay is associated with improved outcomes and is the basis for many quality improvement (QI) projects showing important changes in practice. However, little evidence exists regarding whether such changes are sustainable in real-world practice. To evaluate the sustained effect of a quality improvement project on the timing of initiation of active physical therapy intervention in patients with acute lung injury (ALI). This was a pre-post evaluation using prospectively collected data involving consecutive patients with ALI admitted pre-quality improvement (October 2004-April 2007, n = 120) versus post-quality improvement (July 2009-July 2012, n = 123) from a single medical ICU. The primary outcome was time to first active physical therapy intervention, defined as strengthening, mobility, or cycle ergometry exercises. Among ICU survivors, more patients in the post-quality improvement versus pre-quality improvement group received physical therapy in the ICU (89% vs. 24%, P < 0.001) and were able to stand, transfer, or ambulate during physical therapy in the ICU (64% vs. 7%, P < 0.001). Among all patients in the post-quality improvement versus pre-quality improvement group, there was a shorter median (interquartile range) time to first physical therapy (4 [2, 6] vs. 11 d [6, 29], P < 0.001) and a greater median (interquartile range) proportion of ICU days with physical therapy after initiation (50% [33, 67%] vs. 18% [4, 47%], P = 0.003). In multivariable regression analysis, the post-quality improvement period was associated with shorter time to physical therapy (adjusted hazard ratio [95% confidence interval], 8.38 [4.98, 14.11], P < 0.001), with this association significant for each of the 5 years during the post-quality improvement period. The following variables were independently associated with a longer time to physical therapy: higher Sequential Organ Failure Assessment score (0.93 [0.89, 0.97]), higher FiO2 (0.86 [0.75, 0.99] for each 10% increase), use of an opioid infusion (0.47 [0.25, 0.89]), and deep sedation (0.24 [0.12, 0.46]). In this single-site, pre-post analysis of patients with ALI, an early rehabilitation quality improvement project was independently associated with a substantial decrease in the time to initiation of active physical therapy intervention that was sustained over 5 years. Over the entire pre-post period, severity of illness and sedation were independently associated with a longer time to initiation of active physical therapy intervention in the ICU.

  3. The economics of project analysis: Optimal investment criteria and methods of study

    NASA Technical Reports Server (NTRS)

    Scriven, M. C.

    1979-01-01

    Insight is provided toward the development of an optimal program for investment analysis of project proposals offering commercial potential and of its components. This involves a critique of economic investment criteria viewed in relation to the requirements of engineering economy analysis. An outline for a systems approach to project analysis is given. Application of the Leontief input-output methodology to the analysis of projects involving multiple processes and products is investigated. Effective application of elements of neoclassical economic theory to investment analysis of project components is demonstrated. Patterns of both static and dynamic activity levels are incorporated.
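    The Leontief step amounts to solving x = (I - A)^-1 d for gross activity levels x, given a technology matrix A and final demand d. A minimal sketch with an invented two-process technology matrix (the numbers are illustrative only, not from the paper):

    ```python
    import numpy as np

    # Leontief input-output model: total output x satisfies x = A @ x + d,
    # where A holds inter-process input requirements per unit of output and
    # d is final demand. Solving gives x = (I - A)^-1 d.
    A = np.array([[0.2, 0.3],    # illustrative: inputs of process 1 per unit output
                  [0.1, 0.4]])   # illustrative: inputs of process 2 per unit output
    d = np.array([100.0, 50.0])  # final demand for the two products

    x = np.linalg.solve(np.eye(2) - A, d)  # total activity levels required
    print(x)  # gross output each process must produce to meet demand
    ```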

  4. Astronomy Remote Observing Research Projects of US High School Students

    NASA Astrophysics Data System (ADS)

    Kadooka, M.; Meech, K. J.

    2006-08-01

    In order to address the challenging climate for promoting astronomy education in the high schools, we have used astronomy projects to give students authentic research experiences and to encourage their pursuit of science and technology careers. Initially, we conducted teacher workshops to develop a cadre of teachers who have been instrumental in recruiting students to work on projects. Once identified, these students have been motivated to conduct astronomy research projects with appropriate guidance. Some have worked on these projects during non-school hours and others through a research course. The goal has been for students to meet the objectives of inquiry-based learning, a major US National Science Standard. Case studies will be described using event-based learning with the NASA Deep Impact mission. Hawaii students became active participants investigating comet properties through the NASA Deep Impact mission. The Deep Impact Education and Public Outreach group developed materials which were used by our students. After learning how to use image processing software, these students obtained Comet 9P/Tempel 1 images for their projects in real time from the remote observing Faulkes Telescope North located on Haleakala, Maui. Besides conducting event-based projects, which are time critical, Oregon students have worked on galaxy and sunspot projects. For variable star research, they used images obtained from the remote observing offline mode of the Lowell Telescope located in Flagstaff, Arizona. Essential to these projects has been the consistent follow-up required for honing skills in observing, image processing, analysis, and communication of project results through Science Fair entries. Key to our success has been the network of professional and amateur astronomers and educators collaborating in a multiplicity of ways to mentor our students. This work in progress, and the process of inspiring students to pursue careers in science and technology with these projects, will be shared.

  5. The PEPR GeneChip data warehouse, and implementation of a dynamic time series query tool (SGQT) with graphical interface.

    PubMed

    Chen, Josephine; Zhao, Po; Massaro, Donald; Clerch, Linda B; Almon, Richard R; DuBois, Debra C; Jusko, William J; Hoffman, Eric P

    2004-01-01

    Publicly accessible DNA databases (genome browsers) are rapidly accelerating post-genomic research (see http://www.genome.ucsc.edu/), with integrated genomic DNA, gene structure, EST/splicing and cross-species ortholog data. DNA databases have relatively low dimensionality; the genome is a linear code that anchors all associated data. In contrast, RNA expression and protein databases need to be able to handle very high dimensional data, with time, tissue, cell type and genes as interrelated variables. The high dimensionality of microarray expression profile data, and the lack of a standard experimental platform, have complicated the development of web-accessible databases and analytical tools. We have designed and implemented a public resource of expression profile data containing 1024 human, mouse and rat Affymetrix GeneChip expression profiles, generated in the same laboratory and subject to the same quality and procedural controls (Public Expression Profiling Resource; PEPR). Our Oracle-based PEPR data warehouse includes a novel time series query analysis tool (SGQT), enabling dynamic generation of graphs and spreadsheets showing the action of any transcript of interest over time. In this report, we demonstrate the utility of this tool using a 27 time point, in vivo muscle regeneration series. This data warehouse and associated analysis tools provide access to multidimensional microarray data through web-based interfaces, both for download of all types of raw data for independent analysis, and for straightforward gene-based queries. Planned implementations of PEPR will include web-based remote entry of projects adhering to quality control and standard operating procedure (QC/SOP) criteria, and automated output of alternative probe set algorithms for each project (see http://microarray.cnmcresearch.org/pgadatatable.asp).
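    A toy version of an SGQT-style transcript-over-time query is easy to sketch; the column names and values below are hypothetical stand-ins, not PEPR's actual Oracle schema:

    ```python
    import pandas as pd

    # Sketch of a time-series query over a long-format expression table.
    # Schema and data are invented for illustration.
    profiles = pd.DataFrame({
        "probe_set": ["MyoD1"] * 3 + ["Actb"] * 3,
        "time_point_days": [0, 3, 7, 0, 3, 7],
        "signal": [120.0, 890.0, 450.0, 5000.0, 5100.0, 4950.0],
    })

    def transcript_over_time(df, probe_set):
        """Return the expression trajectory of one transcript, ready to plot."""
        return (df[df["probe_set"] == probe_set]
                .sort_values("time_point_days")
                .set_index("time_point_days")["signal"])

    print(transcript_over_time(profiles, "MyoD1"))
    ```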

  6. The PEPR GeneChip data warehouse, and implementation of a dynamic time series query tool (SGQT) with graphical interface

    PubMed Central

    Chen, Josephine; Zhao, Po; Massaro, Donald; Clerch, Linda B.; Almon, Richard R.; DuBois, Debra C.; Jusko, William J.; Hoffman, Eric P.

    2004-01-01

    Publicly accessible DNA databases (genome browsers) are rapidly accelerating post-genomic research (see http://www.genome.ucsc.edu/), with integrated genomic DNA, gene structure, EST/splicing and cross-species ortholog data. DNA databases have relatively low dimensionality; the genome is a linear code that anchors all associated data. In contrast, RNA expression and protein databases need to be able to handle very high dimensional data, with time, tissue, cell type and genes as interrelated variables. The high dimensionality of microarray expression profile data, and the lack of a standard experimental platform, have complicated the development of web-accessible databases and analytical tools. We have designed and implemented a public resource of expression profile data containing 1024 human, mouse and rat Affymetrix GeneChip expression profiles, generated in the same laboratory and subject to the same quality and procedural controls (Public Expression Profiling Resource; PEPR). Our Oracle-based PEPR data warehouse includes a novel time series query analysis tool (SGQT), enabling dynamic generation of graphs and spreadsheets showing the action of any transcript of interest over time. In this report, we demonstrate the utility of this tool using a 27 time point, in vivo muscle regeneration series. This data warehouse and associated analysis tools provide access to multidimensional microarray data through web-based interfaces, both for download of all types of raw data for independent analysis, and for straightforward gene-based queries. Planned implementations of PEPR will include web-based remote entry of projects adhering to quality control and standard operating procedure (QC/SOP) criteria, and automated output of alternative probe set algorithms for each project (see http://microarray.cnmcresearch.org/pgadatatable.asp). PMID:14681485

  7. Method for decreasing CT simulation time of complex phantoms and systems through separation of material specific projection data

    NASA Astrophysics Data System (ADS)

    Divel, Sarah E.; Christensen, Soren; Wintermark, Max; Lansberg, Maarten G.; Pelc, Norbert J.

    2017-03-01

    Computer simulation is a powerful tool in CT; however, long simulation times for complex phantoms and systems, especially when modeling many physical aspects (e.g., spectrum, finite detector and source size), hinder the ability to realistically and efficiently evaluate and optimize CT techniques. Long simulation times primarily result from tracing hundreds of line integrals through each of the hundreds of geometrical shapes defined within the phantom. However, when the goal is to perform dynamic simulations or test many scan protocols using a particular phantom, traditional simulation methods inefficiently and repeatedly calculate line integrals through the same set of structures although only a few parameters change in each new case. In this work, we have developed a new simulation framework that overcomes such inefficiencies by dividing the phantom into material-specific regions with the same time-attenuation profiles, acquiring and storing monoenergetic projections of those regions, and subsequently scaling and combining the projections to create equivalent polyenergetic sinograms. The simulation framework is especially efficient for the validation and optimization of CT perfusion, which requires analysis of many stroke cases and testing of hundreds of scan protocols on a realistic and complex numerical brain phantom. Using this updated framework to conduct a 31-time-point simulation with 80 mm of z-coverage of a brain phantom on two 16-core Linux servers, we reduced the simulation time from 62 hours to under 2.6 hours, a 95% reduction.
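    The separation idea can be illustrated in a few lines: path lengths through each material region are traced once, and any spectrum can then be applied by scaling and recombining them. A sketch under invented attenuation values and geometry (not the paper's phantom or spectrum):

    ```python
    import numpy as np

    # Material-separated simulation sketch: per-material path lengths along each
    # ray are computed once and stored; polyenergetic sinograms for any spectrum
    # are then assembled by scaling and combining. All values are placeholders.
    path_lengths = {                                   # cm of material per ray
        "water": np.array([2.0, 5.0, 5.0, 1.0]),
        "bone":  np.array([0.0, 0.5, 1.0, 0.0]),
    }
    spectrum = np.array([0.2, 0.5, 0.3])               # relative weights, sum = 1
    mu = {                                             # mu(E) in 1/cm, invented
        "water": np.array([0.27, 0.19, 0.17]),
        "bone":  np.array([1.00, 0.50, 0.35]),
    }

    # I/I0 = sum_E w(E) * exp(-sum_m mu_m(E) * L_m); sinogram = -ln(I/I0)
    line_int = sum(np.outer(mu[m], path_lengths[m]) for m in path_lengths)
    transmitted = spectrum @ np.exp(-line_int)         # weight energies by spectrum
    poly_sinogram = -np.log(transmitted)
    print(poly_sinogram)
    ```

    The efficiency point is that changing the spectrum or the time-attenuation scaling reuses the stored path lengths, so no rays need to be retraced.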

  8. The CHT2 Project: Diachronic 3D Reconstruction of Historic Sites

    NASA Astrophysics Data System (ADS)

    Guidi, G.; Micoli, L.; Gonizzi Barsanti, S.; Malik, U.

    2017-08-01

    Digital modelling of archaeological and architectural monuments in their current state and in their presumed past aspect has been recognized not only as a way of explaining to the public the genesis of a historical site, but also as an effective tool for research. The search for historical sources, their proper analysis, and an interdisciplinary relationship between technological disciplines and the humanities are fundamental for obtaining reliable hypothetical reconstructions. This paper presents an experimental activity defined by the project Cultural Heritage Through Time - CHT2 (http://cht2-project.eu), funded in the framework of the Joint Programming Initiative on Cultural Heritage (JPI-CH) of the European Commission, whose goal is to develop time-varying 3D products from landscape to architectural scale. The work presented here deals with the implementation of the methodology on one of the case studies: the late Roman circus of Milan, built in the era when the city was the capital of the Western Roman Empire (286-402 A.D.), a case in which the physical evidence has now almost entirely disappeared. The diachronic reconstruction is based on a proper mix of quantitative data originating from present-day 3D surveys and historical sources such as ancient maps, drawings, archaeological reports, archaeological restriction decrees and old photographs. These heterogeneous sources were first georeferenced and then properly integrated according to the methodology defined in the framework of the CHT2 project, to hypothesize a reliable reconstruction of the area in different historical periods.

  9. Comparison of Different Approach of Back Projection Method in Retrieving the Rupture Process of Large Earthquakes

    NASA Astrophysics Data System (ADS)

    Tan, F.; Wang, G.; Chen, C.; Ge, Z.

    2016-12-01

    Back-projection of teleseismic P waves [Ishii et al., 2005] has been widely used to image the rupture of earthquakes. Besides conventional narrowband beamforming in the time domain, approaches in the frequency domain, such as MUSIC back projection (Meng 2011) and compressive sensing (Yao et al., 2011), have been proposed to improve resolution. Each method has its advantages and disadvantages and should be used appropriately in different cases, so a thorough study comparing and testing these methods is needed. We wrote a GUI program that puts the three methods together so that users can conveniently process the same data with different methods and compare the results. We then used all the methods to process several earthquake datasets, including the 2008 Wenchuan Mw7.9 earthquake and the 2011 Tohoku-Oki Mw9.0 earthquake, as well as theoretical seismograms of both simple sources and complex ruptures. Our results show differences in efficiency, accuracy and stability among the methods. Quantitative and qualitative analyses are applied to measure their dependence on data and parameters, such as station number, station distribution, grid size, calculation window length and so on. In general, back projection makes it possible to get a good result in a very short time using fewer than 20 lines of high-quality data with a proper station distribution, but the swimming artifact can be significant; some measures, for instance combining global seismic data, could help ameliorate it. MUSIC back projection needs relatively more data to obtain a better and more stable result, which means it needs much more time, since its runtime grows markedly faster than back projection's as the station number increases. Compressive sensing deals more effectively with multiple sources in the same time window but costs the longest time, due to repeatedly solving matrix problems. The resolution of all the methods is complicated and depends on many factors; an important one is the grid size, which in turn influences runtime significantly. More detailed results from this research may help people choose proper data, methods and parameters.
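    For readers unfamiliar with the conventional time-domain approach the abstract compares against, here is a minimal shift-and-stack back-projection sketch; the station geometry, noise waveforms, and homogeneous-velocity travel-time function are all placeholders (real studies use global travel-time tables such as IASP91):

    ```python
    import numpy as np

    # Minimal time-domain back-projection: for each trial grid point, align
    # station waveforms by predicted travel time and stack; the brightest
    # grid point in each window images the rupture. All inputs are synthetic.
    rng = np.random.default_rng(0)
    fs = 10.0                                             # samples per second
    n_sta, n_t = 8, 2000
    stations = rng.uniform(-50.0, 50.0, size=(n_sta, 2))  # station coords, km
    waveforms = rng.normal(0.0, 0.1, size=(n_sta, n_t))   # noise placeholder

    def travel_time(src_xy, sta_xy, v_km_s=6.0):
        # Crude homogeneous-velocity placeholder for a travel-time table.
        return np.linalg.norm(src_xy - sta_xy) / v_km_s

    def beam_power(src_xy, t0_idx, win=50):
        """Stack station windows aligned on predicted arrivals; return power."""
        stack = np.zeros(win)
        for k in range(n_sta):
            shift = t0_idx + int(round(travel_time(src_xy, stations[k]) * fs))
            stack += waveforms[k, shift:shift + win]
        return float(np.sum((stack / n_sta) ** 2))

    # Scan a grid of candidate source points for one time window.
    grid = [np.array([x, y]) for x in range(-20, 21, 10) for y in range(-20, 21, 10)]
    powers = [beam_power(g, t0_idx=100) for g in grid]
    print(grid[int(np.argmax(powers))])   # brightest grid point in this window
    ```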

  10. Selling science 2.0: What scientific projects receive crowdfunding online?

    PubMed

    Schäfer, Mike S; Metag, Julia; Feustle, Jessica; Herzog, Livia

    2016-09-19

    Crowdfunding has emerged as an additional source for financing research in recent years. The study at hand identifies and tests explanatory factors influencing the success of scientific crowdfunding projects by drawing on news value theory, the "reputation signaling" approach, and economic theories of online payment. A standardized content analysis of 371 projects on English- and German-language platforms reveals that each theory provides factors influencing crowdfunding success. Projects presented on science-only crowdfunding platforms have a higher success rate. Projects are also more likely to be successful if their presentation includes visualizations and humor, if their funding target is lower, if potential donors have to relinquish less personal data, and if more interaction between researchers and donors is possible. This suggests that after donors decide to visit a scientific crowdfunding platform, factors unrelated to science matter more for subsequent funding decisions, raising questions about the potential and implications of crowdfunding science. © The Author(s) 2016.

  11. Analysis Of Leakage In Carbon Sequestration Projects In Forestry: A Case Study Of Upper Magat Watershed, Philippines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lasco, Rodel D.; Pulhin, Florencia B.; Sales, Renezita F.

    2007-06-01

    The role of forestry projects in carbon conservation and sequestration is receiving much attention because of their role in the mitigation of climate change. The main objective of the study is to analyze the potential of the Upper Magat Watershed for a carbon sequestration project. The three main development components of the project are forest conservation, tree plantations, and agroforestry farm development. At Year 30, the watershed can attain a net carbon benefit of 19.5 M tC at a cost of US$ 34.5 M. The potential leakage of the project is estimated using historical experience in technology adoption in watershed areas in the Philippines and a high adoption rate. Two leakage scenarios were used: baseline and project leakage scenarios. Most of the leakage occurs in the first 10 years of the project, as displacement of livelihood occurs during this time. The carbon lost via leakage is estimated to be 3.7 M tC in the historical adoption scenario, and 8.1 M tC under the enhanced adoption scenario.
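    The leakage figures above can be rolled up with simple arithmetic; the sketch below merely recombines the reported numbers (the net benefit and cost per tonne of carbon after leakage are derived here for illustration, not figures quoted by the study):

    ```python
    # Back-of-envelope roll-up of the quoted figures (M tC = million tonnes C).
    gross_benefit_mtc = 19.5
    cost_musd = 34.5
    leakage_mtc = {"historical adoption": 3.7, "enhanced adoption": 8.1}

    for scenario, lost in leakage_mtc.items():
        net = gross_benefit_mtc - lost
        print(f"{scenario}: net {net:.1f} M tC, "
              f"~US$ {cost_musd / net:.2f} per tC after leakage")
    ```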

  12. Composting projects under the Clean Development Mechanism: Sustainable contribution to mitigate climate change

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rogger, Cyrill; Beaurain, Francois; Schmidt, Tobias S., E-mail: tobiasschmidt@ethz.ch

    2011-01-15

    The Clean Development Mechanism (CDM) of the Kyoto Protocol aims to reduce greenhouse gas emissions in developing countries and at the same time to assist these countries in sustainable development. While composting, as a suitable mitigation option in the waste sector, can clearly contribute to the former goal, there are indications that high rents can also be achieved regarding the latter. In this article composting is compared with other CDM project types inside and outside the waste sector with regard to both project numbers and contribution to sustainable development. It is found that, despite the high number of waste projects, composting is underrepresented, and a major reason for this is identified. Based on a multi-criteria analysis it is shown that composting has a higher potential for contribution to sustainable development than most other best-in-class projects. As these contributions can only be assured if certain requirements are followed, eight key obligations are presented.

  13. Tight-frame based iterative image reconstruction for spectral breast CT

    PubMed Central

    Zhao, Bo; Gao, Hao; Ding, Huanjun; Molloi, Sabee

    2013-01-01

    Purpose: To investigate a tight-frame based iterative reconstruction (TFIR) technique for spectral breast computed tomography (CT) that uses fewer projections while achieving greater image quality. Methods: The experimental data were acquired with a fan-beam breast CT system based on a cadmium zinc telluride photon-counting detector. The images were reconstructed with a varying number of projections using the TFIR and filtered backprojection (FBP) techniques, and image quality between the two techniques was evaluated. Spatial resolution was evaluated using a high-resolution phantom, and the contrast-to-noise ratio (CNR) was evaluated using a postmortem breast sample. The postmortem breast samples were decomposed into water, lipid, and protein contents based on images reconstructed from TFIR with 204 projections and FBP with 614 projections. The volumetric fractions of water, lipid, and protein from the image-based measurements in both TFIR and FBP were compared to chemical analysis. Results: The spatial resolution and CNR were comparable for the images reconstructed by TFIR with 204 projections and FBP with 614 projections. Both reconstruction techniques provided accurate quantification of the water, lipid, and protein composition of the breast tissue when compared with data from the reference-standard chemical analysis. Conclusions: Accurate breast tissue decomposition can be achieved with threefold fewer projection images by the TFIR technique without any reduction in image spatial resolution or CNR. This can result in a two-thirds reduction of patient dose in a multislit, multislice spiral CT system, in addition to reduced scanning time. PMID:23464320
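    A minimal sketch of image-based three-material decomposition of the kind described, assuming per-voxel attenuation measured in a few energy bins plus a volume-conservation constraint; the attenuation coefficients and measurement are invented placeholders, not values from the study:

    ```python
    import numpy as np

    # Per-voxel attenuation in several energy bins is modeled as a
    # volume-fraction-weighted mix of water, lipid, and protein, with the
    # fractions constrained to sum to 1. All numbers are illustrative.
    mu_basis = np.array([   # rows: energy bins; cols: water, lipid, protein (1/cm)
        [0.27, 0.22, 0.30],
        [0.21, 0.17, 0.24],
        [0.18, 0.15, 0.20],
    ])
    voxel_mu = np.array([0.245, 0.192, 0.166])   # "measured" mu per bin (fake)

    # Append the volume-conservation row: f_water + f_lipid + f_protein = 1.
    A = np.vstack([mu_basis, np.ones(3)])
    b = np.append(voxel_mu, 1.0)
    fractions, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(dict(zip(["water", "lipid", "protein"], fractions.round(3))))
    ```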

  14. Project ATLANTA (Atlanta Land use Analysis: Temperature and Air Quality): Use of Remote Sensing and Modeling to Analyze How Urban Land Use Change Affects Meteorology and Air Quality Through Time

    NASA Technical Reports Server (NTRS)

    Quattrochi, Dale A.; Luvall, Jeffrey C.; Estes, Maurice G., Jr.

    1999-01-01

    This paper presents an overview of Project ATLANTA (ATlanta Land use ANalysis: Temperature and Air-quality), an investigation that seeks to observe, measure, model, and analyze how the rapid growth of the Atlanta, Georgia metropolitan area since the early 1970s has impacted the region's climate and air quality. The primary objectives of this research effort are: (1) to investigate and model the relationships between land cover change in the Atlanta metropolitan area and the development of the urban heat island phenomenon through time; (2) to investigate and model the temporal relationships between Atlanta urban growth and land cover change on air quality; and (3) to model the overall effects of urban development on surface energy budget characteristics across the Atlanta urban landscape through time. Our key goal is to derive a better scientific understanding of how land cover changes associated with urbanization in the Atlanta area, principally the transformation of forest lands to urban land covers through time, have affected, and will affect, local and regional climate, surface energy flux, and air quality characteristics. Allied with this goal is the prospect that the results from this research can be applied by urban planners, environmental managers and other decision-makers to determine how urbanization has impacted the climate and overall environment of the Atlanta area. Multiscaled remote sensing data, particularly high resolution thermal infrared data, are integral to this study for the analysis of thermal energy fluxes across the Atlanta urban landscape.

  15. Parallel Event Analysis Under Unix

    NASA Astrophysics Data System (ADS)

    Looney, S.; Nilsson, B. S.; Oest, T.; Pettersson, T.; Ranjard, F.; Thibonnier, J.-P.

    The ALEPH experiment at LEP, the CERN CN division and Digital Equipment Corp. have, in a joint project, developed a parallel event analysis system. The parallel physics code is identical to ALEPH's standard analysis code, ALPHA, only the organisation of input/output is changed. The user may switch between sequential and parallel processing by simply changing one input "card". The initial implementation runs on an 8-node DEC 3000/400 farm, using the PVM software, and exhibits a near-perfect speed-up linearity, reducing the turn-around time by a factor of 8.

  16. Real-time development of data acquisition and analysis software for hands-on physiology education in neuroscience: G-PRIME.

    PubMed

    Lott, Gus K; Johnson, Bruce R; Bonow, Robert H; Land, Bruce R; Hoy, Ronald R

    2009-01-01

    We report on the real-time creation of an application for hands-on neurophysiology in an advanced undergraduate teaching laboratory. Enabled by the rapid software development tools included in the Matlab technical computing environment (The Mathworks, Natick, MA), a team, consisting of a neurophysiology educator and a biophysicist trained as an electrical engineer, interfaced to a course of approximately 15 students from engineering and biology backgrounds. The result is the powerful freeware data acquisition and analysis environment, "g-PRIME." The software was developed from week to week in response to curriculum demands, and student feedback. The program evolved from a simple software oscilloscope, enabling RC circuit analysis, to a suite of tools supporting analysis of neuronal excitability and synaptic transmission analysis in invertebrate model systems. The program has subsequently expanded in application to university courses, research, and high school projects in the US and abroad as free courseware.

  17. CUDA-based high-performance computing of the S-BPF algorithm with no-waiting pipelining

    NASA Astrophysics Data System (ADS)

    Deng, Lin; Yan, Bin; Chang, Qingmei; Han, Yu; Zhang, Xiang; Xi, Xiaoqi; Li, Lei

    2015-10-01

    The backprojection-filtration (BPF) algorithm has become a good solution for local reconstruction in cone-beam computed tomography (CBCT). However, the reconstruction speed of BPF is a severe limitation for clinical applications. The selective-backprojection filtration (S-BPF) algorithm was developed to improve the parallel performance of BPF through selective backprojection. Furthermore, the general-purpose graphics processing unit (GP-GPU) is a popular tool for accelerating reconstruction, and much work has been done on optimizing the cone-beam back-projection. As the cone-beam back-projection becomes faster, data transportation takes a much larger share of the reconstruction time than before. This paper focuses on minimizing the total reconstruction time of the S-BPF algorithm by hiding the data transportation among hard disk, CPU and GPU. Based on an analysis of the S-BPF algorithm, several strategies are implemented: (1) asynchronous calls are used to overlap the execution of the CPU and the GPU; (2) an innovative strategy is applied to obtain the DBP image while effectively hiding the transport time; (3) two streams, one for data transportation and one for calculation, are synchronized by cudaEvent in the inverse finite Hilbert transform on the GPU. Our main contribution is a reconstruction of the S-BPF algorithm in which the GPU calculates continuously and no time is lost to data transportation: a 512^3 volume is reconstructed in less than 0.7 s on a single Tesla-based K20 GPU from 182 projection views with 512^2 pixels per projection. The time cost of our implementation is about half that of a version without the overlap behavior.
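    The pipelining idea itself (overlapping transfers with kernels via asynchronous streams and events) is CUDA-specific, but the scheduling pattern generalizes; below is a toy double-buffered producer/consumer analogy in Python, standing in for cudaMemcpyAsync plus kernel streams, and emphatically not the authors' implementation:

    ```python
    import threading, queue, time

    # No-waiting pipelining sketch: one worker streams chunks in (standing in
    # for host-to-device transfers), another processes them (standing in for
    # the back-projection kernel), so transfer time hides behind compute.
    chunks = queue.Queue(maxsize=2)          # double buffering

    def loader(n_views):
        for v in range(n_views):
            time.sleep(0.01)                 # pretend disk/PCIe transfer
            chunks.put(f"projection view {v}")
        chunks.put(None)                     # sentinel: no more data

    def backprojector():
        while (view := chunks.get()) is not None:
            time.sleep(0.02)                 # pretend back-projection work
            print("processed", view)

    t = threading.Thread(target=loader, args=(5,))
    t.start()
    backprojector()
    t.join()
    ```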

  18. On the Hydrologic Adjustment of Climate-Model Projections: The Potential Pitfall of Potential Evapotranspiration

    USGS Publications Warehouse

    Milly, Paul C.D.; Dunne, Krista A.

    2011-01-01

    Hydrologic models often are applied to adjust projections of hydroclimatic change that come from climate models. Such adjustment includes climate-bias correction, spatial refinement ("downscaling"), and consideration of the roles of hydrologic processes that were neglected in the climate model. Described herein is a quantitative analysis of the effects of hydrologic adjustment on the projections of runoff change associated with projected twenty-first-century climate change. In a case study including three climate models and 10 river basins in the contiguous United States, the authors find that relative (i.e., fractional or percentage) runoff change computed with hydrologic adjustment more often than not was less positive (or, equivalently, more negative) than what was projected by the climate models. The dominant contributor to this decrease in runoff was a ubiquitous change in runoff (median -11%) caused by the hydrologic model’s apparent amplification of the climate-model-implied growth in potential evapotranspiration. Analysis suggests that the hydrologic model, on the basis of the empirical, temperature-based modified Jensen–Haise formula, calculates a change in potential evapotranspiration that is typically 3 times the change implied by the climate models, which explicitly track surface energy budgets. In comparison with the amplification of potential evapotranspiration, central tendencies of other contributions from hydrologic adjustment (spatial refinement, climate-bias adjustment, and process refinement) were relatively small. The authors’ findings highlight the need for caution when projecting changes in potential evapotranspiration for use in hydrologic models or drought indices to evaluate climate-change impacts on water.
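    Schematically, the amplification arithmetic looks as follows; the sketch uses an invented temperature-index PET of the Jensen-Haise family with placeholder coefficients, and simply applies the paper's reported ~3x factor for comparison (it does not reproduce the authors' calculation or the actual modified Jensen-Haise coefficients):

    ```python
    # Schematic illustration of why a temperature-index PET formula can imply
    # much larger PET changes than an energy-budget model. Coefficients are
    # invented; only the ~3x ratio comes from the abstract above.
    c, Tx, Rs = 0.025, -3.0, 20.0        # placeholder coefficient, offset, radiation

    def pet_temperature_index(T):
        return c * (T - Tx) * Rs          # mm/day, schematic Jensen-Haise form

    T_now, T_future = 15.0, 17.0          # +2 C warming scenario (placeholder)
    dPET_temp = pet_temperature_index(T_future) - pet_temperature_index(T_now)
    dPET_energy = dPET_temp / 3.0         # per the reported ~3x amplification

    print(f"temperature-index dPET: {dPET_temp:.2f} mm/day")
    print(f"energy-budget-scale dPET: {dPET_energy:.2f} mm/day")
    ```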

  19. Cost-Savings Analysis of the Better Beginnings, Better Futures Community-Based Project for Young Children and Their Families: A 10-Year Follow-up.

    PubMed

    Peters, Ray DeV; Petrunka, Kelly; Khan, Shahriar; Howell-Moneta, Angela; Nelson, Geoffrey; Pancer, S Mark; Loomis, Colleen

    2016-02-01

    This study examined the long-term cost-savings of the Better Beginnings, Better Futures (BBBF) initiative, a community-based early intervention project for young children living in socioeconomically disadvantaged neighborhoods during their transition to primary school. A quasi-experimental, longitudinal two-group design was used to compare costs and outcomes for children and families in three BBBF project neighborhoods (n = 401) and two comparison neighborhoods (n = 225). A cost-savings analysis was conducted using all project costs for providing up to 4 years of BBBF programs when children were in junior kindergarten (JK) (4 years old) to grade 2 (8 years old). Data on 19 government service cost measures were collected from the longitudinal research sample from the time the youth were in JK through to grade 12 (18 years old), 10 years after ending project participation. The average family incremental net savings to government of providing the BBBF project was $6331 in 2014 Canadian dollars. When the BBBF monetary return to government as a ratio of savings to costs was calculated, for every dollar invested by the government, a return of $2.50 per family was saved. Findings from this study have important implications for government investments in early interventions focused on a successful transition to primary school as well as parenting programs and community development initiatives in support of children's development.

  20. Systematic review finds that study data not published in full text articles have unclear impact on meta-analyses results in medical research.

    PubMed

    Schmucker, Christine M; Blümle, Anette; Schell, Lisa K; Schwarzer, Guido; Oeller, Patrick; Cabrera, Laura; von Elm, Erik; Briel, Matthias; Meerpohl, Joerg J

    2017-01-01

    A meta-analysis as part of a systematic review aims to provide a thorough, comprehensive and unbiased statistical summary of data from the literature. However, relevant study results could be missing from a meta-analysis because of selective publication and inadequate dissemination. If missing outcome data differ systematically from published ones, a meta-analysis will be biased, with an inaccurate assessment of the intervention effect. As part of the EU-funded OPEN project (www.open-project.eu) we conducted a systematic review that assessed whether the inclusion of data that were not published at all and/or published only in the grey literature influences pooled effect estimates in meta-analyses and leads to different interpretation. Systematic review of published literature (methodological research projects). Four bibliographic databases were searched up to February 2016 without restriction of publication year or language. Methodological research projects were considered eligible for inclusion if they reviewed a cohort of meta-analyses which (i) compared pooled effect estimates of meta-analyses of health care interventions according to publication status of data or (ii) examined whether the inclusion of unpublished or grey literature data impacts the result of a meta-analysis. Seven methodological research projects including 187 meta-analyses comparing pooled treatment effect estimates according to different publication status were identified. Two research projects showed that published data showed larger pooled treatment effects in favour of the intervention than unpublished or grey literature data (Ratio of ORs 1.15, 95% CI 1.04-1.28 and 1.34, 95% CI 1.09-1.66). In the remaining research projects, pooled effect estimates and/or overall findings were not significantly changed by the inclusion of unpublished and/or grey literature data; including these data did, however, increase the precision of the pooled estimates, with narrower 95% confidence intervals. Although we may anticipate that systematic reviews and meta-analyses not including unpublished or grey literature study results are likely to overestimate the treatment effects, current empirical research shows that this is only the case in a minority of reviews. Therefore, currently, a meta-analyst should particularly consider time, effort and costs when adding such data to their analysis. Future research is needed to identify which reviews may benefit most from including unpublished or grey data.
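    The "ratio of odds ratios" comparison above combines two log-odds-ratio estimates; a minimal sketch of that arithmetic, with invented inputs (the ORs and standard errors are placeholders, not data from the reviewed projects):

    ```python
    import math

    # Ratio of odds ratios (e.g., published vs. unpublished data) and its 95% CI,
    # combined on the log scale assuming independent log-OR estimates.
    def ratio_of_ors(or1, se1, or2, se2):
        log_ratio = math.log(or1) - math.log(or2)
        se = math.sqrt(se1**2 + se2**2)
        lo = math.exp(log_ratio - 1.96 * se)
        hi = math.exp(log_ratio + 1.96 * se)
        return math.exp(log_ratio), (lo, hi)

    ror, ci = ratio_of_ors(or1=0.80, se1=0.05, or2=0.70, se2=0.08)
    print(f"RoR = {ror:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
    ```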

  1. Systematic review finds that study data not published in full text articles have unclear impact on meta-analyses results in medical research

    PubMed Central

    Blümle, Anette; Schell, Lisa K.; Schwarzer, Guido; Oeller, Patrick; Cabrera, Laura; von Elm, Erik; Briel, Matthias; Meerpohl, Joerg J.

    2017-01-01

    Background A meta-analysis as part of a systematic review aims to provide a thorough, comprehensive and unbiased statistical summary of data from the literature. However, relevant study results could be missing from a meta-analysis because of selective publication and inadequate dissemination. If missing outcome data differ systematically from published ones, a meta-analysis will be biased, with an inaccurate assessment of the intervention effect. As part of the EU-funded OPEN project (www.open-project.eu) we conducted a systematic review that assessed whether the inclusion of data that were not published at all and/or published only in the grey literature influences pooled effect estimates in meta-analyses and leads to different interpretation. Methods and findings Systematic review of published literature (methodological research projects). Four bibliographic databases were searched up to February 2016 without restriction of publication year or language. Methodological research projects were considered eligible for inclusion if they reviewed a cohort of meta-analyses which (i) compared pooled effect estimates of meta-analyses of health care interventions according to publication status of data or (ii) examined whether the inclusion of unpublished or grey literature data impacts the result of a meta-analysis. Seven methodological research projects including 187 meta-analyses comparing pooled treatment effect estimates according to different publication status were identified. Two research projects showed that published data showed larger pooled treatment effects in favour of the intervention than unpublished or grey literature data (Ratio of ORs 1.15, 95% CI 1.04–1.28 and 1.34, 95% CI 1.09–1.66). In the remaining research projects, pooled effect estimates and/or overall findings were not significantly changed by the inclusion of unpublished and/or grey literature data; including these data did, however, increase the precision of the pooled estimates, with narrower 95% confidence intervals. Conclusions Although we may anticipate that systematic reviews and meta-analyses not including unpublished or grey literature study results are likely to overestimate the treatment effects, current empirical research shows that this is only the case in a minority of reviews. Therefore, currently, a meta-analyst should particularly consider time, effort and costs when adding such data to their analysis. Future research is needed to identify which reviews may benefit most from including unpublished or grey data. PMID:28441452

  2. Modernizing sports facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dustin, R.

    Modernization and renovation of sports facilities challenge the design team to balance a number of requirements: spectator and owner expectations, existing building and site conditions, architectural layouts, code and legislation issues, time constraints and budget issues. System alternatives are evaluated and selected based on the relative priorities of these requirements. These priorities are unique to each project. At Alexander Memorial Coliseum, project schedules, construction funds and facility usage became the priorities. The ACC basketball schedule and arrival of the Centennial Olympics dictated the construction schedule. Initiation and success of the project depended on the commitment of the design team to meet coliseum funding levels established three years ago. Analysis of facility usage and system alternative capabilities drove the design team to select a system that met the project requirements and will maximize the benefits to the owner and spectators for many years to come.

  3. Results of student-peer collaboration in the development of the Geoscience Student Data Network

    NASA Astrophysics Data System (ADS)

    Block, K. A.; Snyder, W. S.; Williams, N.; Rudolph, E.

    2012-12-01

    The Geoscience Student Data Network (GSDNet) is an NSF-CCLI project to develop a software application that facilitates student collaboration and data analysis. Cyberinfrastructure development is accompanied by a three-course curriculum that includes a field component implemented jointly at City College of New York (CCNY) and Boise State University (BSU). We report on the challenges of utilizing existing social networking technology for student collaboration and the hurdles of real-time information exchange on heavily taxed networks and facilities. The field component and research project currently underway is engaging eight students from CCNY and their BSU peer-mentors. Students are characterizing a geothermal prospect in Idaho by combining data collected in the field, laboratory studies and cyberinfrastructure outlets using the GSDNet prototype. We will summarize results of student projects from data collection, metadata documentation, online collaboration, and project dissemination.

  4. Climate change and soil salinity: The case of coastal Bangladesh.

    PubMed

    Dasgupta, Susmita; Hossain, Md Moqbul; Huq, Mainul; Wheeler, David

    2015-12-01

    This paper estimates location-specific soil salinity in coastal Bangladesh for 2050. The analysis was conducted in two stages: First, changes in soil salinity for the period 2001-2009 were assessed using information recorded at 41 soil monitoring stations by the Soil Research Development Institute. Using these data, a spatial econometric model was estimated linking soil salinity with the salinity of nearby rivers, land elevation, temperature, and rainfall. Second, future soil salinity for 69 coastal sub-districts was projected from climate-induced changes in river salinity and projections of rainfall and temperature based on time trends for 20 Bangladesh Meteorological Department weather stations in the coastal region. The findings indicate that climate change poses a major soil salinization risk in coastal Bangladesh. Across 41 monitoring stations, the annual median projected change in soil salinity is 39 % by 2050. Above the median, 25 % of all stations have projected changes of 51 % or higher.
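    A toy version of the two-stage idea: fit soil salinity on river salinity, elevation, temperature, and rainfall, then project 2050 salinity by swapping in climate-driven covariates. All data below are random placeholders, and plain least squares stands in for the study's spatial econometric model:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 41                                          # soil monitoring stations
    X = np.column_stack([
        np.ones(n),                                 # intercept
        rng.uniform(1, 20, n),                      # river salinity (dS/m)
        rng.uniform(0, 10, n),                      # land elevation (m)
        rng.uniform(24, 30, n),                     # mean temperature (C)
        rng.uniform(1500, 3000, n),                 # annual rainfall (mm)
    ])
    beta_true = np.array([0.5, 0.30, -0.15, 0.10, -0.0005])
    y = X @ beta_true + rng.normal(0, 0.3, n)       # "observed" soil salinity

    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)  # stage 1: fit the model

    X_2050 = X.copy()                               # stage 2: project covariates
    X_2050[:, 1] *= 1.25                            # saltier rivers (placeholder)
    X_2050[:, 3] += 1.5                             # warmer climate (placeholder)
    baseline, future = X @ beta_hat, X_2050 @ beta_hat
    print(f"median projected change: {np.median((future - baseline) / baseline):.0%}")
    ```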

  5. What is the reward? Medical students' learning and personal development during a research project course.

    PubMed

    Möller, Riitta; Shoshan, Maria; Heikkilä, Kristiina

    2015-01-01

    Until recently, the outcome of medical students' research projects has mainly been assessed in terms of scientific publications, whereas other results important for students' development have been less studied. The aim of this study was to investigate medical students' experiences of learning as an outcome of the research project course. Written reflections of 50 students were analyzed by manifest inductive content analysis. Three categories emerged: 'thinking as a scientist', 'working as a scientist', and 'personal development'. Students became more aware about the nature of knowledge, how to generate new knowledge, and developed skills in scientific thinking and critical appraisal. Unexpectedly, effects on personal characteristics, such as self-confidence, self-discipline, independence, and time management skills were also acknowledged. We conclude that individual research projects enhance research-specific skills and competencies needed in evidence-based clinical work and are beneficial for personal and professional development.

  6. First results of Liguria-Trento transplant network project: a model for a macroregional network and real-time registry in Italy.

    PubMed

    Valente, R; Cambiaso, F; Santori, G; Ghirelli, R; Gianelli, A; Valente, U

    2004-04-01

    In Italy, health-care telematics is funded and supported at the level of the national government or regional institutions. In 1999, the Italian Ministry of Health started to fund the Liguria-Trento Transplant Network (LTTN) project, a health research project with the aim of building an informative system for donor management and transplantation activity in a macroregional area. At the time of the LTTN project proposal, no published transplant network informative system fulfilled Italian rules on the telematic management of electronic documentation concerning transplantation activity. Partners of the LTTN project were two Regional Transplant Coordinating Centres, the Nord Italia Transplant Interregional Coordinating Centre, and the Italian Institute of Health/National Transplant Coordinating Centre. Project Total Quality Management methods were adopted. Technological and case analysis followed ANSI-HL7, CEN-TC251, and Object-Oriented Software Engineering standards. A low-tech prototype powered by a web-access relational database is running on a transplant network including web-based clients located in 17 intensive care units, in the Nord Italia Transplant Interregional Coordinating Centre, and at the Italian Institute of Health/National Transplant Coordinating Centre. The LTTN registry includes pretransplant, surgical, and posttransplant phases for liver, kidney, pancreas, and kidney-pancreas transplantation in adult and pediatric recipients. Clinical specifications were prioritized in agreement with the RAND/UCLA appropriateness method. Further implementation will include formal rules for data access and output release, fault tolerance, and a continuous registry evolution plan.

  7. Evaluation and Verification of Decadal Predictions using the MiKlip Central Evaluation System - a Case Study using the MiKlip Prototype Model Data

    NASA Astrophysics Data System (ADS)

    Illing, Sebastian; Schuster, Mareike; Kadow, Christopher; Kröner, Igor; Richling, Andy; Grieger, Jens; Kruschke, Tim; Lang, Benjamin; Redl, Robert; Schartner, Thomas; Cubasch, Ulrich

    2016-04-01

    MiKlip is a project for medium-term climate prediction funded by the Federal Ministry of Education and Research in Germany (BMBF); it aims to create a model system able to provide reliable decadal climate forecasts. During the first project phase of MiKlip, the sub-project INTEGRATION, located at Freie Universität Berlin, developed a framework for scientific infrastructures (FREVA). More information about FREVA can be found in EGU2016-13060. An instance of this framework is used as the Central Evaluation System (CES) of the MiKlip project. Throughout the first project phase, various sub-projects developed over 25 analysis tools - so-called plugins - for the CES. The main focus of these plugins is the evaluation and verification of decadal climate prediction data, but most plugins are not limited to this scope; they target a wide range of scientific questions, ranging from preprocessing tools like the "LeadtimeSelector", which creates lead-time-dependent time series from decadal hindcast sets, through tracking tools like the "Zykpak" plugin, which can objectively locate and track mid-latitude cyclones, to plugins like "MurCSS" or "SPECS", which calculate deterministic and probabilistic skill metrics. We also integrated some analyses from the Model Evaluation Tools (MET) developed at NCAR. We will show the theoretical background, the technical implementation strategies, and selected results from the evaluation of the MiKlip Prototype decadal prediction system for these tools.
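    A toy version of what a "LeadtimeSelector"-style preprocessing step does, assuming a long-format hindcast table; the schema and values are invented, not FREVA's actual data model:

    ```python
    import pandas as pd

    # From yearly-initialized decadal hindcasts, pull out the time series of
    # all values at a fixed lead time, indexed by the valid (target) year.
    hindcasts = pd.DataFrame({
        "init_year": [2000, 2000, 2000, 2001, 2001, 2001, 2002, 2002, 2002],
        "lead_year": [1, 2, 3, 1, 2, 3, 1, 2, 3],
        "temp_anom": [0.10, 0.20, 0.15, 0.12, 0.22, 0.18, 0.14, 0.25, 0.20],
    })

    def leadtime_series(df, lead):
        """Hindcast values at a given lead year, indexed by init_year + lead."""
        sel = df[df["lead_year"] == lead].copy()
        sel["valid_year"] = sel["init_year"] + lead
        return sel.set_index("valid_year")["temp_anom"]

    print(leadtime_series(hindcasts, lead=2))
    ```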

  8. Multi-objective optimization of discrete time-cost tradeoff problem in project networks using non-dominated sorting genetic algorithm

    NASA Astrophysics Data System (ADS)

    Shahriari, Mohammadreza

    2016-06-01

    The time-cost tradeoff problem is one of the most important and applicable problems in the project scheduling area. Many factors can force managers to crash the schedule: early utilization, early commissioning and operation, improving the project cash flow, avoiding unfavorable weather conditions, compensating for delays, and so on. Since extra resources must be allocated to shorten the finishing time of a project, and since project managers want to spend the lowest possible amount of money while achieving the maximum crashing time, both direct and indirect costs are affected, and the time value of money comes into play: when the starting activities of a project are crashed, the extra investment is tied up until the end date of the project, whereas when the final activities are crashed, the extra investment is tied up for a much shorter period. This study presents a two-objective mathematical model that balances compression of the project time against activity delays, providing a suitable tool for decision makers constrained by available facilities and project due dates. The scheduling problem is also drawn closer to real-world conditions by considering a nonlinear objective function and the time value of money. The problem was solved using NSGA-II, and the effect of time compression on the non-dominated set is reported.
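    A minimal sketch of the two-objective evaluation behind such a model, with a serial activity chain standing in for a full project network (a real network needs a critical-path pass) and a crude compounding term standing in for the time value of money; the activity modes and rates are invented:

    ```python
    import itertools

    # Discrete time-cost tradeoff: each activity has (duration, direct cost)
    # modes; we enumerate mode combinations, evaluate both objectives, and
    # keep the non-dominated (duration, cost) pairs. NSGA-II scales this up.
    modes = [  # per activity: (duration_weeks, direct_cost) options, invented
        [(4, 100), (3, 160), (2, 250)],
        [(6, 200), (4, 320)],
        [(5, 150), (3, 260)],
    ]
    weekly_indirect, rate = 30, 0.002   # indirect cost/week, weekly discount rate

    def evaluate(choice):
        duration = sum(d for d, _ in choice)         # serial chain for brevity
        direct = sum(c for _, c in choice)
        # Compound the direct spend over the project duration as a crude
        # proxy for capital being tied up (the time value of money).
        cost = direct * (1 + rate) ** duration + weekly_indirect * duration
        return duration, cost

    points = [evaluate(c) for c in itertools.product(*modes)]
    pareto = [p for p in points
              if not any((q[0] <= p[0] and q[1] < p[1]) or
                         (q[0] < p[0] and q[1] <= p[1]) for q in points)]
    print(sorted(pareto))   # the non-dominated duration/cost frontier
    ```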

  9. How to Grow Project Scientists: A Systematic Approach to Developing Project Scientists

    NASA Technical Reports Server (NTRS)

    Kea, Howard

    2011-01-01

    The Project Manager is one of the key individuals who can determine the success or failure of a project. NASA is fully committed to the training and development of Project Managers across the agency to ensure that highly capable individuals are equipped with the competencies and experience to successfully lead a project. An equally critical position is that of the Project Scientist. The Project Scientist provides the scientific leadership necessary for the scientific success of a project by ensuring that the mission meets or exceeds the scientific requirements. Traditionally, NASA Goddard project scientists were appointed and approved by the Center Science Director based on their knowledge, experience, and other qualifications. However, the process to obtain the necessary knowledge, skills and abilities was not documented or done in a systematic way. NASA Goddard's current Science Director, Nicholas White, saw the need to create a pipeline for developing new project scientists and appointed a team to develop a process for training potential project scientists. The team members were Dr. Harley Thronson, Chair; Dr. Howard Kea; Mr. Mark Goldman, DACUM facilitator; and the late Dr. Michael VanSteenberg. The DACUM process, an occupational analysis and evaluation system, was used to produce a picture of the project scientist's duties, tasks, knowledge, and skills. The output was a 3-day introductory course detailing all the required knowledge, skills and abilities a scientist must develop over time to qualify for selection as a Project Scientist.

  10. Communicating mega-projects in the face of uncertainties: Israeli mass media treatment of the Dead Sea Water Canal.

    PubMed

    Fischhendler, Itay; Cohen-Blankshtain, Galit; Shuali, Yoav; Boykoff, Max

    2015-10-01

    Given the potential for uncertainties to influence mega-projects, this study examines how mega-projects are deliberated in the public arena. The paper traces the strategies used to promote the Dead Sea Water Canal. Findings show that the Dead Sea mega-project was encumbered by ample uncertainties. Treatment of uncertainties in early coverage was dominated by economics and raised primarily by politicians, while more contemporary media discourses have been dominated by ecological uncertainties voiced by environmental non-governmental organizations. This change in uncertainty type is explained by the changing nature of the project and by shifts in societal values over time. The study also reveals that 'uncertainty reduction' and to a lesser degree, 'project cancellation', are still the strategies most often used to address uncertainties. Statistical analysis indicates that although uncertainties and strategies are significantly correlated, there may be other intervening variables that affect this correlation. This research also therefore contributes to wider and ongoing considerations of uncertainty in the public arena through various media representational practices. © The Author(s) 2013.

  11. Dynamic Assessment of Seismic Risk (DASR) by Multi-parametric Observations: Preliminary Results of PRIME experiment within the PRE-EARTHQUAKES EU-FP7 Project

    NASA Astrophysics Data System (ADS)

    Tramutoli, V.; Inan, S.; Jakowski, N.; Pulinets, S. A.; Romanov, A.; Filizzola, C.; Shagimuratov, I.; Pergola, N.; Ouzounov, D. P.; Papadopoulos, G. A.; Parrot, M.; Genzano, N.; Lisi, M.; Alparlsan, E.; Wilken, V.; Tsybukia, K.; Romanov, A.; Paciello, R.; Zakharenkova, I.; Romano, G.

    2012-12-01

    The integration of different observations, together with the refinement of data analysis methods, is generally expected to improve our present knowledge of the preparatory phases of earthquakes and of their possible precursors. This is the main goal of PRE-EARTHQUAKES (Processing Russian and European EARTH observations for earthQUAKE precursors Studies), the FP7 project which, to this end, has brought together different international expertise and observational capabilities over the last 2 years. In the learning phase of the project, different parameters (e.g., thermal anomalies, total electron content, radon concentration, etc.), measured from ground and satellite systems and analyzed using different data analysis approaches, were studied for selected geographic areas and specific past seismic events. In July 2012, PRIME (PRE-EARTHQUAKES Real-time Integration and Monitoring Experiment) started attempting to perform, on the basis of independent observations collected and integrated in real time through the PEG (PRE-EARTHQUAKES Geo-portal), a Dynamic Assessment of Seismic Risk (DASR) for selected geographic areas of Europe (Italy-Greece-Turkey) and Asia (Kamchatka, Sakhalin, Japan). In this paper, the results achieved so far, as well as the potential and opportunities they open for a worldwide Earthquake Observation System (EQuOS), a dedicated component of GEOSS (Global Earth Observation System of Systems), will be presented.

  12. Implementing EVM Data Analysis Adding Value from a NASA Project Manager's Perspective

    NASA Technical Reports Server (NTRS)

    Counts, Stacy; Kerby, Jerald

    2006-01-01

    Data analysis is one of the keys to an effective Earned Value Management (EVM) process. Project Managers (PMs) must continually evaluate data in assessing the health of their projects, and good analysis of data can assist PMs in making better decisions in managing projects. To better support our PMs, the National Aeronautics and Space Administration (NASA) Marshall Space Flight Center (MSFC) recently renewed its emphasis on sound EVM data analysis practices and processes. During this presentation we will discuss the approach that MSFC followed in implementing better data analysis across its Center, and we will address our approach to effectively equipping and supporting our projects in applying a sound data analysis process. In addition, the PM for the Space Station Biological Research Project will share her experiences of how effective data analysis can benefit a PM in the decision-making process, and will discuss how the emphasis on data analysis has helped create a solid method for assessing the project's performance. Used successfully, data analysis can be an effective and efficient tool in today's environment of increasing workloads and downsizing workforces.
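    For reference, the core EVM indicators that such analysis builds on are simple formulas; a sketch with hypothetical numbers:

    ```python
    # Standard earned value metrics; all input figures are hypothetical.
    BAC = 1_000_000                          # budget at completion ($)
    PV, EV, AC = 400_000, 350_000, 420_000   # planned value, earned value, actual cost

    CV, SV = EV - AC, EV - PV                # cost / schedule variance
    CPI, SPI = EV / AC, EV / PV              # cost / schedule performance indices
    EAC = BAC / CPI                          # estimate at completion (CPI method)

    print(f"CPI={CPI:.2f} SPI={SPI:.2f} CV=${CV:,} SV=${SV:,} EAC=${EAC:,.0f}")
    ```

    Here CPI and SPI below 1.0 flag cost and schedule trouble respectively, which is exactly the kind of signal a PM would act on during periodic reviews.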

  13. Climate, Water and Energy in the Nordic Countries

    NASA Astrophysics Data System (ADS)

    Snorrason, A.; Jonsdottir, J. F.

    2003-04-01

    In light of the recent IPCC Climate Change Assessment and recent progress in meteorological and hydrological modelling, the directors of the Nordic hydrological institutes (CHIN) initiated a research project, "Climate, Water and Energy" (CWE), with funding from Nordic Energy Research and the Nordic Council of Ministers, focusing on climatic impact assessment in the energy sector. Climatic variability and change affect the hydrological systems, which in turn affect the energy sector; this will increase the risk associated with the development and use of water resources in the Nordic countries. Within the CWE project, four thematic groups work on this issue of climatic change and on how changes in precipitation and temperature will directly influence runoff. A primary aim of the CWE climate group is to derive a common scenario or "best-guess" estimate of climate change in northern Europe and Greenland, based on recent regional climate change experiments and representing the change from 1990 to 2050 under the IPCC SRES B2 emission scenario. A data set, along with the most important information for using the scenario, is available at the project web site. The glacier group has chosen 8 glaciers from Greenland, Iceland, Norway and Sweden for an analysis of the response of glaciers to climate changes. Mass balance and dynamical changes, corresponding to the common scenario for climate changes, will be modelled and effects on glacier hydrology will be estimated. The long time series group has reported on the status of time series analysis in the Nordic countries. The group will select and quality-control time series of stream flow to be included in the Nordic component of the FRIEND database. The group will also collect information on time series for other variables, and these series will be systematically analysed with respect to trends and other long-term changes. The hydrological modelling group has reported on "Climate change impacts on water resources in the Nordic countries - State of the art and discussion of principles". The group will compare different hydrological models and discuss uncertainties in models and climate scenarios, while production of new results based on the composite scenario from the CWE climate group depends on other projects. The product of the project will be an in-depth analysis of the present status of research and know-how in the sphere of climatic and hydrological research in the Nordic countries. It will be a synthesis and integration of present research with a focus on the needs of the energy sector. It will also identify and prioritise key future research areas that are of benefit to the energy sector.

  14. Wireless Cloud Computing on Guided Missile Destroyers: A Business Case Analysis

    DTIC Science & Technology

    2013-06-01


  15. An Analysis of Televised Public Service Advertising. Drug Abuse Information Research Project.

    ERIC Educational Resources Information Center

    Hanneman, Gerhard J.; And Others

    Government regulations state that broadcasters are obligated to allot program time to matters of public interest, but neither law nor precedent has determined their commitment to present messages on social problems. To determine the amount of public service advertising (PSA) that is broadcast, particularly anti-drug appeals, a content analysis…

  16. Performance Evaluation of Reliable Multicast Protocol for Checkout and Launch Control Systems

    NASA Technical Reports Server (NTRS)

    Shu, Wei Wennie; Porter, John

    2000-01-01

    The overall objective of this project is to study the reliability and performance of the Real Time Critical Network (RTCN) for checkout and launch control systems (CLCS). The major tasks include reliability and performance evaluation of the Reliable Multicast (RM) package, fault tolerance analysis, and design of a dual redundant network architecture.

  17. Achievement Emotions and Academic Performance: Longitudinal Models of Reciprocal Effects

    ERIC Educational Resources Information Center

    Pekrun, Reinhard; Lichtenfeld, Stephanie; Marsh, Herbert W.; Murayama, Kou; Goetz, Thomas

    2017-01-01

    A reciprocal effects model linking emotion and achievement over time is proposed. The model was tested using five annual waves of the Project for the Analysis of Learning and Achievement in Mathematics (PALMA) longitudinal study, which investigated adolescents' development in mathematics (Grades 5-9; N = 3,425 German students; mean starting…

  18. 42 CFR 57.1508 - Amount of interest subsidy payments; limitations.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... payments will be made, and the level of such payments shall be determined by the Secretary on the basis of... consideration his analysis of the present and reasonable projected future financial ability of the applicant to... Profession Personnel § 57.1508 Amount of interest subsidy payments; limitations. The length of time for which...

  19. Short-Term Enrollment Forecasting for Accurate Budget Planning.

    ERIC Educational Resources Information Center

    Salley, Charles D.

    1979-01-01

    Reliance on enrollment trend models for revenue projections has led to a scenario of alternating overbudgeted and underbudgeted years. A study of a large, public university indicates that time series analysis should be used instead to anticipate the orderly seasonal and cyclical patterns that are visible in a period of moderate trend growth.…

  20. The Co-Operative: Good with Schools?

    ERIC Educational Resources Information Center

    Coates, Max

    2015-01-01

    The article is a summary of a small-scale research project which considers the formation of Co-operative Trust Schools. This was carried out in 2013 at a time when the number of schools becoming Academies and Trust Schools through the Co-operative College was burgeoning. Through questionnaire, interview, documentary analysis and exploration of…

  1. South African CSP projects under the REIPPP programme - Requirements, challenges and opportunities

    NASA Astrophysics Data System (ADS)

    Relancio, Javier; Cuellar, Alberto; Walker, Gregg; Ettmayr, Chris

    2016-05-01

    Thus far seven Concentrated Solar Power (CSP) projects have been awarded under the Renewable Energy Independent Power Producer Procurement Programme (REIPPPP), totalling 600 MW: one project is in operation, four are under construction and two are on their way to financial close. This provides an excellent opportunity to analyse key features of the projects that have contributed to or detracted from the programme's success. The paper draws from Mott MacDonald's involvement as Technical Advisor on the seven CSP projects that have been successful under the REIPPPP to date, as well as other global CSP developments. It presents how various programme requirements have affected the implementation of projects, such as the technical requirements, time-of-day tariff structure, economic development requirements and the renewable energy grid code. The increasingly competitive tariffs offered have encouraged developers to investigate efficiency-maximising project configurations and cost-saving mechanisms, as well as to feature state-of-the-art technology in their proposals. The paper assesses the role of the project participants (developers, lenders and government) with regard to these innovative technologies and solutions. In our paper we discuss the status of projects and the South African market, analysing the main challenges and opportunities that in turn have influenced various aspects such as technology choice, operational regimes and supply chain arrangements.

  2. Intensity - Duration - Frequency Curves for U.S. Cities in a Warming Climate

    NASA Astrophysics Data System (ADS)

    Ragno, Elisa; AghaKouchak, Amir; Love, Charlotte; Vahedifard, Farshid; Cheng, Linyin; Lima, Carlos

    2017-04-01

    Current infrastructure design procedures rely on the use of Intensity-Duration-Frequency (IDF) curves retrieved under the assumption of temporal stationarity, meaning that occurrences of extreme events are expected to be time invariant. However, numerous studies have observed more severe extreme events over time. Hence, the stationarity assumption for extreme value analysis may not be appropriate in a warming climate. This issue raises concerns regarding the safety and resilience of infrastructure and natural slopes. Here we employ daily precipitation data from historical and projected (RCP 8.5) CMIP5 runs to investigate IDF curves of 14 urban areas across the United States. We first statistically assess changes in precipitation extremes using an energy-based test for equal distributions. Then, through a Bayesian inference approach for stationary and non-stationary extreme value analysis, we provide updated IDF curves based on future climate model projections. We show that, based on CMIP5 simulations, U.S. cities may experience extreme precipitation events up to 20% more intense and twice as frequent, relative to historical records, despite the expectation of unchanged annual mean precipitation.
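
    For illustration, the sketch below fits a stationary GEV distribution to synthetic annual precipitation maxima and reads off return levels, the building block behind a single point on an IDF curve. The paper's actual analysis is Bayesian and non-stationary; all numbers here are invented.

```python
# Minimal sketch: stationary GEV fit to annual maxima and return-level
# estimation. Illustrative only; data are synthetic, not CMIP5 output.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
# synthetic 60-year record of annual daily-precipitation maxima (mm)
annual_maxima_mm = genextreme.rvs(c=-0.1, loc=40, scale=10, size=60,
                                  random_state=rng)

# Fit GEV by maximum likelihood (scipy's c is the negated shape parameter).
c, loc, scale = genextreme.fit(annual_maxima_mm)

# Return level for a T-year event: the (1 - 1/T) quantile of the fitted GEV.
for T in (2, 10, 50, 100):
    level = genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
    print(f"{T:>3}-year daily precipitation: {level:6.1f} mm")
```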

  3. Autonoetic consciousness: Reconsidering the role of episodic memory in future-oriented self-projection.

    PubMed

    Klein, Stanley B

    2016-01-01

    Following the seminal work of Ingvar (1985. "Memory for the future": An essay on the temporal organization of conscious awareness. Human Neurobiology, 4, 127-136), Suddendorf (1994. The discovery of the fourth dimension: Mental time travel and human evolution. Master's thesis. University of Waikato, Hamilton, New Zealand), and Tulving (1985. Memory and consciousness. Canadian Psychology/Psychologie Canadienne, 26, 1-12), exploration of the ability to anticipate and prepare for future contingencies that cannot be known with certainty has grown into a thriving research enterprise. A fundamental tenet of this line of inquiry is that future-oriented mental time travel, in most of its presentations, is underwritten by a property or an extension of episodic recollection. However, a careful conceptual analysis of exactly how episodic memory functions in this capacity has yet to be undertaken. In this paper I conduct such an analysis. Based on conceptual, phenomenological, and empirical considerations, I conclude that the autonoetic component of episodic memory, not episodic memory per se, is the causally determinative factor enabling an individual to project him or herself into a personal future.

  4. Sustainability of mega water diversion projects: Experience and lessons from China.

    PubMed

    Yu, Min; Wang, Chaoran; Liu, Yi; Olsson, Gustaf; Wang, Chunyan

    2018-04-01

    Water availability and water demand are not evenly distributed in time and space. Many mega water diversion projects have been launched to alleviate water shortages in China. This paper analyzes the temporal and spatial features of 59 mega water diversion projects in China using statistical analysis. The relationship between nine major basins is measured using a network analysis method, and the associated economic, environmental and social impacts are explored using an impact analysis method. The study finds the development of water diversion has experienced four stages in China, from a starting period through to a period of high-speed development. Both the length of water diversion channels and the amount of transferred water have increased significantly in the past 50 years. As of 2015, over 100 billion m³ of water was transferred in China through 16,000 km of channels. These projects reached over half of China's provinces. The Yangtze River Basin is now the largest source of transferred water. Through inter-basin water diversion, China gains the opportunity to increase Gross Domestic Product by 4%. However, the construction costs exceed 150 billion US dollars, larger than in any other country. The average cost per unit of transferred water has increased with time and scale but decreased from western to eastern China. Furthermore, annual total energy consumption for pumping exceeded 50 billion kilowatt-hours and the related greenhouse gas emissions are estimated to be 48 million tons. It is worth noting that ecological problems caused by water diversion affect the Han River and Yellow River Basins. Over 500 thousand people have been relocated away from their homes due to water diversion. To improve the sustainability of water diversion, four kinds of innovative measures have been provided for decision makers: national diversion guidelines, integrated water basin management, economic incentives and ex-post evaluation. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. PINT, a New Pulsar Timing Software

    NASA Astrophysics Data System (ADS)

    Luo, Jing; Jenet, Fredrick A.; Ransom, Scott M.; Demorest, Paul; Van Haasteren, Rutger; Archibald, Anne

    2015-01-01

    We present PINT, a new pulsar timing software package. The pulsar timing community currently depends heavily on Tempo/Tempo2, a package for the analysis of pulsar timing data. However, for high-accuracy pulsar timing projects, such as pulsar timing for gravitational wave detection, alternative software is needed for the purpose of cross-checking results. We are therefore developing a Tempo-independent package with a different structure, in which modules are designed to be more isolated and easier to expand. Instead of C, we use Python as our programming language for its flexibility and powerful docstrings. Here we present the detailed design and the first results of the software.
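
    The modular design idea can be sketched generically as follows: isolated, swappable components each contribute pulse phase, and a timing model combines them into residuals. The class and method names below are hypothetical illustrations, not PINT's actual API.

```python
# Hypothetical sketch of a modular timing-model design; not PINT's real API.
import numpy as np

class Spindown:
    """Phase contribution from pulse frequency f0 and spindown f1."""
    def __init__(self, f0, f1):
        self.f0, self.f1 = f0, f1
    def phase(self, t):
        return self.f0 * t + 0.5 * self.f1 * t**2

class TimingModel:
    """Sums phase contributions from independent, isolated components."""
    def __init__(self, components):
        self.components = components
    def residuals(self, toas):
        phase = sum(c.phase(toas) for c in self.components)
        frac = phase - np.round(phase)      # offset from nearest whole pulse
        return frac / self.components[0].f0  # residuals in seconds

toas = np.sort(np.random.default_rng(1).uniform(0, 1e7, 100))  # seconds
model = TimingModel([Spindown(f0=29.946923, f1=-3.77e-10)])    # Crab-like
print(model.residuals(toas)[:5])
```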

  6. Naval Surface Forces Real-Time Reutilization Asset Management Warehouses: A Cost-Benefit Analysis

    DTIC Science & Technology

    2008-12-01

    shared their valuable insights. And finally, to our advisors Dr . Ken Euske and CDR Brett Wagner who kept us on track throughout the project. Nick could...of $206,368,657. SURFOR’s East and West Coast warehouses accounted for $146,975,108 of the list value. The team identified potential cost...obligating the funds. A relatively simple cost benefit analysis of the reprogramming costs versus the cost savings would justify the expense. In the

  7. Nonlinear Analysis of Time Series in Genome-Wide Linkage Disequilibrium Data

    NASA Astrophysics Data System (ADS)

    Hernández-Lemus, Enrique; Estrada-Gil, Jesús K.; Silva-Zolezzi, Irma; Fernández-López, J. Carlos; Hidalgo-Miranda, Alfredo; Jiménez-Sánchez, Gerardo

    2008-02-01

    The statistical study of large-scale genomic data has turned out to be a very important tool in population genetics. Quantitative methods are essential to understand and implement association studies in the biomedical and health sciences. Nevertheless, the characterization of recently admixed populations has been an elusive problem due to the presence of a number of complex phenomena. For example, linkage disequilibrium structures are thought to be more complex than their non-recently admixed population counterparts, presenting the so-called ancestry blocks, admixed regions that are not yet smoothed by the effect of genetic recombination. In order to distinguish characteristic features of various populations we have implemented several methods, some of them borrowed or adapted from the analysis of nonlinear time series in statistical physics and quantitative physiology. We calculate the main fractal dimensions (Kolmogorov's capacity, information dimension and correlation dimension, usually named D0, D1 and D2). We have also performed detrended fluctuation analysis and information-based similarity index calculations for the probability distribution of correlations of the linkage disequilibrium coefficient of six recently admixed (mestizo) populations within the Mexican Genome Diversity Project [1] and for the non-recently admixed populations in the International HapMap Project [2]. Nonlinear correlations showed up as a consequence of internal structure within the haplotype distributions. The analysis of these correlations, as well as the scope and limitations of these procedures within the biomedical sciences, is discussed.
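
    One of the named techniques, detrended fluctuation analysis, fits in a few lines. The sketch below runs on synthetic white noise (expected scaling exponent near 0.5) rather than linkage disequilibrium data, and uses a plain linear detrend per window.

```python
# Minimal detrended fluctuation analysis (DFA) sketch; synthetic input only.
import numpy as np

def dfa(x, scales):
    y = np.cumsum(x - np.mean(x))            # integrated profile
    flucts = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        rms = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)      # linear detrend per window
            rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t))**2)))
        flucts.append(np.mean(rms))
    # scaling exponent alpha: slope of log F(s) versus log s
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

x = np.random.default_rng(2).standard_normal(4096)  # white noise: alpha ~ 0.5
print(f"DFA exponent: {dfa(x, np.array([16, 32, 64, 128, 256])):.2f}")
```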

  8. Student ownership of projects in an upper-division optics laboratory course: A multiple case study of successful experiences

    NASA Astrophysics Data System (ADS)

    Dounas-Frazer, Dimitri R.; Stanley, Jacob T.; Lewandowski, H. J.

    2017-12-01

    We investigate students' sense of ownership of multiweek final projects in an upper-division optics lab course. Using a multiple case study approach, we describe three student projects in detail. Within-case analyses focused on identifying key issues in each project, and constructing chronological descriptions of those events. Cross-case analysis focused on identifying emergent themes with respect to five dimensions of project ownership: student agency, instructor mentorship, peer collaboration, interest and value, and affective responses. Our within- and cross-case analyses yielded three major findings. First, coupling division of labor with collective brainstorming can help balance student agency, instructor mentorship, and peer collaboration. Second, students' interest in the project and perceptions of its value can increase over time; initial student interest in the project topic is not a necessary condition for student ownership of the project. Third, student ownership is characterized by a wide range of emotions that fluctuate as students alternate between extended periods of struggle and moments of success while working on their projects. These findings not only extend the literature on student ownership into a new educational domain—namely, upper-division physics labs—they also have concrete implications for the design of experimental physics projects in courses for which student ownership is a desired learning outcome. We describe the course and projects in sufficient detail that others can adapt our results to their particular contexts.

  9. Enhancing Astronomy Major Learning Through Group Research Projects

    NASA Astrophysics Data System (ADS)

    McGraw, Allison M.; Hardegree-Ullman, K.; Turner, J.; Shirley, Y. L.; Walker-Lafollette, A.; Scott, A.; Guvenen, B.; Raphael, B.; Sanford, B.; Smart, B.; Nguyen, C.; Jones, C.; Smith, C.; Cates, I.; Romine, J.; Cook, K.; Pearson, K.; Biddle, L.; Small, L.; Donnels, M.; Nieberding, M.; Kwon, M.; Thompson, R.; De La Rosa, R.; Hofmann, R.; Tombleson, R.; Smith, T.; Towner, A. P.; Wallace, S.

    2013-01-01

    The University of Arizona Astronomy Club has been using group research projects to enhance the learning experience of undergraduates in astronomy and related fields. Students work on two projects that employ a peer-mentoring system so they can learn crucial skills and concepts necessary in research environments. Students work on a transiting exoplanet project using the 1.55-meter Kuiper Telescope on Mt. Bigelow in Southern Arizona to collect near-UV and optical wavelength data. The goal of the project is to refine planetary parameters and to attempt to detect exoplanet magnetic fields by searching for near-UV light curve asymmetries. The other project is a survey that utilizes the 12-meter Arizona Radio Observatory on Kitt Peak to search for the spectroscopic signature of infall in nearby starless cores. These are unique projects because students are involved throughout the entire research process, including writing proposals for telescope time, observing at the telescopes, data reduction and analysis, writing papers for publication in journals, and presenting research at scientific conferences. Exoplanet project members are able to receive independent study credit for participating in the research, which helps keep the project on track. Both projects allow students to work on professional research and prepare for several astronomy courses early in their academic career. They also encourage teamwork and mentor-style peer teaching, and can help students identify their own research projects as they expand their knowledge.

  10. Rapid SAR and GPS Measurements and Models for Hazard Science and Situational Awareness

    NASA Astrophysics Data System (ADS)

    Owen, S. E.; Yun, S. H.; Hua, H.; Agram, P. S.; Liu, Z.; Moore, A. W.; Rosen, P. A.; Simons, M.; Webb, F.; Linick, J.; Fielding, E. J.; Lundgren, P.; Sacco, G. F.; Polet, J.; Manipon, G.

    2016-12-01

    The Advanced Rapid Imaging and Analysis (ARIA) project for Natural Hazards is focused on rapidly generating higher level geodetic imaging products and placing them in the hands of the solid earth science and local, national, and international natural hazard communities by providing science product generation, exploration, and delivery capabilities at an operational level. Space-based geodetic measurement techniques such as Interferometric Synthetic Aperture Radar (InSAR), Differential Global Positioning System (DGPS), SAR-based change detection, and image pixel tracking have recently become critical additions to our toolset for understanding and mapping the damage caused by earthquakes, volcanic eruptions, landslides, and floods. Analyses of these data sets are still largely handcrafted following each event and are not generated rapidly and reliably enough for response to natural disasters or for timely analysis of large data sets. The ARIA project, a joint venture co-sponsored by California Institute of Technology (Caltech) and by NASA through the Jet Propulsion Laboratory (JPL), has been capturing the knowledge applied to these responses and building it into an automated infrastructure to generate imaging products in near real-time that can improve situational awareness for disaster response. In addition, the ARIA project is developing the capabilities to provide automated imaging and analysis capabilities necessary to keep up with the imminent increase in raw data from geodetic imaging missions planned for launch by NASA, as well as international space agencies. We will present the progress we have made on automating the analysis of SAR data for hazard monitoring and response using data from Sentinel 1a/b as well as continuous GPS stations. Since the beginning of our project, our team has imaged events and generated response products for events around the world. These response products have enabled many conversations with those in the disaster response community about the potential usefulness of rapid SAR and GPS-based information. We will present progress on our data system technology that enables rapid and reliable production of imagery, as well as lessons learned from our engagement with FEMA and others in the hazard response community on the important actionable information that they need.

  11. Metal fires and their implications for advanced reactors.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nowlen, Steven Patrick; Figueroa, Victor G.; Olivier, Tara Jean

    This report details the primary results of the Laboratory Directed Research and Development project (LDRD 08-0857) Metal Fires and Their Implications for Advanced Reactors. Advanced reactors may employ liquid metal coolants, typically sodium, because of their many desirable qualities. This project addressed some of the significant challenges associated with the use of liquid metal coolants, primary among these being the extremely rapid oxidation (combustion) that occurs at the high operating temperatures in reactors. The project identified a number of areas in which gaps existed in knowledge pertinent to reactor safety analyses. Experimental and analysis capabilities were developed in these areas to varying degrees. In conjunction with team participation in a DOE gap analysis panel, focus was on the oxidation of spilled sodium on thermally massive surfaces. These are spills onto surfaces that substantially cool the sodium during the oxidation process, and they are relevant because standard risk mitigation procedures seek to move spill environments into this regime through rapid draining of spilled sodium. While the spilled sodium is not quenched, the burning mode is different in that there is a transition to a smoldering mode that has not been comprehensively described previously. Prior work has described spilled sodium as a pool fire, but there is a crucial, experimentally observed transition to a smoldering mode of oxidation. A series of experimental measurements has comprehensively described the thermal evolution of this type of sodium fire for the first time. A new physics-based model has been developed that also predicts the thermal evolution of this type of sodium fire for the first time. The model introduces smoldering oxidation through porous oxide layers to go beyond the traditional pool fire analyses carried out previously and to predict experimentally observed trends. Combined, these developments add significantly to the safety analysis capabilities of the advanced-reactor community for directly relevant scenarios. Beyond the focus on thermally interacting and smoldering sodium pool fires, experimental and analysis capabilities for sodium spray fires were also developed in this project.

  12. Bayesian lead time estimation for the Johns Hopkins Lung Project data.

    PubMed

    Jang, Hyejeong; Kim, Seongho; Wu, Dongfeng

    2013-09-01

    Lung cancer screening using X-rays has been controversial for many years. A major concern is whether lung cancer screening really brings any survival benefit, which depends on effective treatment after early detection. The problem was analyzed from a different point of view, and estimates were presented of the projected lead time for participants in a lung cancer screening program using the Johns Hopkins Lung Project (JHLP) data. The newly developed method of lead time estimation was applied, where the lifetime T was treated as a random variable rather than a fixed value, so that the number of future screenings for a given individual is also a random variable. Using the actuarial life table available from the United States Social Security Administration, the lifetime distribution was first obtained; the lead time distribution was then projected using the JHLP data. The data analysis with the JHLP data shows that, for a male heavy smoker with initial screening age at 50, 60, or 70, the probability of no-early-detection with semiannual screens will be 32.16%, 32.45%, and 33.17%, respectively, while the mean lead time is 1.36, 1.33 and 1.23 years. The probability of no-early-detection increases monotonically as the screening interval increases, and it increases slightly as the initial age increases for the same screening interval. The mean lead time and its standard error decrease when the screening interval increases for all age groups, and both decrease when initial age increases with the same screening interval. The overall mean lead time estimated with a random lifetime T is slightly less than that with a fixed value of T. It is hoped that this result will help improve current screening programs. Copyright © 2013 Ministry of Health, Saudi Arabia. Published by Elsevier Ltd. All rights reserved.
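
    The random-lifetime idea lends itself to a toy Monte Carlo: with lifetime T random, the number of screens a participant receives is random too, and a lead time distribution follows. The sketch below uses invented parameters and a simple progressive-disease model; it is not the paper's Bayesian estimator and does not use the JHLP data or the actuarial life table.

```python
# Toy simulation of screening lead time under a random lifetime T.
# All distributions and parameters are invented for illustration.
import numpy as np

rng = np.random.default_rng(3)
n = 50_000
t0, delta, sens = 50.0, 0.5, 0.8           # first-screen age, interval, sensitivity

T = t0 + rng.weibull(2.0, n) * 25.0        # random lifetime beyond age 50
onset = t0 + rng.exponential(15.0, n)      # age at preclinical onset
sojourn = rng.exponential(2.0, n)          # preclinical sojourn time
clinical = onset + sojourn                 # age of clinical surfacing

lead = np.zeros(n)
for i in range(n):
    s = t0
    while s < min(T[i], clinical[i]):      # screens stop at death or diagnosis
        if s >= onset[i] and rng.random() < sens:
            lead[i] = clinical[i] - s      # detected early: lead time > 0
            break
        s += delta

cases = onset < T                          # disease develops within lifetime
detected = lead > 0
print(f"P(no early detection | case) = {1 - detected[cases].mean():.3f}")
print(f"mean lead time | detected    = {lead[detected].mean():.2f} years")
```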

  13. Extreme Performance Scalable Operating Systems Final Progress Report (July 1, 2008 - October 31, 2011)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malony, Allen D; Shende, Sameer

    This is the final progress report for the FastOS (Phase 2) (FastOS-2) project with Argonne National Laboratory and the University of Oregon (UO). The project started at UO on July 1, 2008 and ran until April 30, 2010, at which time a six-month no-cost extension began. The FastOS-2 work at UO delivered excellent results in all research work areas: scalable parallel monitoring, kernel-level performance measurement, parallel I/O system measurement, large-scale and hybrid application performance measurement, and online scalable performance data reduction and analysis.

  14. The method of projected characteristics for the evolution of magnetic arches

    NASA Technical Reports Server (NTRS)

    Nakagawa, Y.; Hu, Y. Q.; Wu, S. T.

    1987-01-01

    A numerical method for solving the fully nonlinear MHD equations is described. In particular, a formulation based on the newly developed method of projected characteristics (Nakagawa, 1981), suitable for studying the evolution of magnetic arches due to motions of their foot-points, is presented. The final formulation is given in the form of difference equations; therefore, an analysis of numerical stability is also presented. Most importantly, the derivation of physically self-consistent, time-dependent boundary conditions (i.e., the evolving boundary equations) is given in detail, and some results obtained with such boundary equations are reported.
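
    The principle behind characteristics-based schemes can be seen in the simplest possible case, scalar advection, where the solution is constant along characteristic lines, so each grid point is updated by tracing its foot point back one time step. The sketch below shows only that principle; the full MHD method of projected characteristics involves coupled wave families and projected boundary conditions far beyond this.

```python
# Characteristics sketch for u_t + a u_x = 0 on a periodic domain:
# u is constant along x - a t = const, so interpolate at the foot point.
import numpy as np

a, dt, nsteps = 1.0, 0.02, 100
L, nx = 2 * np.pi, 200
x = np.linspace(0, L, nx, endpoint=False)
u = np.exp(-10 * (x - 1.5)**2)             # initial pulse

for _ in range(nsteps):
    foot = (x - a * dt) % L                # trace the characteristic back
    u = np.interp(foot, x, u, period=L)    # value is carried along the line

center = 1.5 + a * nsteps * dt
dist = (x - center + L / 2) % L - L / 2    # periodic distance to pulse center
exact = np.exp(-10 * dist**2)
# linear interpolation adds some numerical diffusion, so the error is small
# but nonzero
print(f"max error after {nsteps} steps: {np.abs(u - exact).max():.3e}")
```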

  15. Knowledge Repository for Fmea Related Knowledge

    NASA Astrophysics Data System (ADS)

    Cândea, Gabriela Simona; Kifor, Claudiu Vasile; Cândea, Ciprian

    2014-11-01

    This paper presents an innovative usage of a knowledge system in the Failure Mode and Effects Analysis (FMEA) process, using an ontology to represent the knowledge. The knowledge system is built to serve the multi-project work that nowadays takes place in any manufacturing or service provider, where knowledge must be retained and reused at the company level and not only at the project level. The system follows the FMEA methodology, and the validation of the concept is compliant with the automotive industry standards published by the Automotive Industry Action Group, among others. Collaboration is assured through a web-based GUI that supports multiple-user access at any time.
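
    At its core, the FMEA bookkeeping such a repository has to serve reduces to rating and ranking failure modes. A minimal sketch follows, with illustrative field names rather than the paper's ontology: each mode carries severity (S), occurrence (O) and detection (D) ratings, and the risk priority number RPN = S * O * D orders corrective actions.

```python
# Minimal FMEA record and RPN ranking; field names are illustrative.
from dataclasses import dataclass

@dataclass
class FailureMode:
    item: str
    mode: str
    severity: int    # 1-10
    occurrence: int  # 1-10
    detection: int   # 1-10 (10 = hardest to detect)

    @property
    def rpn(self) -> int:
        return self.severity * self.occurrence * self.detection

modes = [
    FailureMode("pump seal", "leak", 7, 4, 6),
    FailureMode("sensor", "drift", 5, 6, 8),
    FailureMode("weld", "crack", 9, 2, 3),
]
for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{fm.item:10s} {fm.mode:6s} RPN={fm.rpn}")
```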

  16. UMTRA project water sampling and analysis plan, Durango, Colorado

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-01-01

    Surface remedial action has been completed at the Uranium Mill Tailings Remedial Action Project in Durango, Colorado. Contaminated soil and debris have been removed from the former processing site and placed in the Bodo Canyon disposal cell. Ground water at the former uranium mill/tailings site and raffinate pond area has been contaminated by the former milling operations. The ground water at the disposal site was not impacted by the former milling operations at the time of the cell's construction. Activities for fiscal 1994 involve ground water sampling and site characterization of the disposal site.

  17. Project SHARE Sustainable Hydropower in Alpine Rivers Ecosystems

    NASA Astrophysics Data System (ADS)

    Mammoliti Mochet, Andrea

    2010-05-01

    SHARE - Sustainable Hydropower in Alpine Rivers Ecosystems is a running project approved early and co-funded by the European Regional Development Fund in the context of the European Territorial Cooperation Alpine Space programme 2007-2013: the project formally started in August 2009 and will end in July 2012. Hydropower is the most important renewable resource for electricity production in alpine areas: it has advantages for the global CO2 balance but creates serious environmental impacts. RES-e Directives require the enhancement of renewable electricity but, at the same time, the Water Framework Directive obliges member states to reach or maintain a "good" ecological status of water bodies, intrinsically limiting hydropower exploitation. Administrators daily face increasing demands for water abstraction but lack reliable tools to rigorously evaluate their effects on mountain rivers and the social and economic outcomes on longer time scales. The project intends to develop, test and promote a decision support system to merge, on an unprejudiced basis, river ecosystem and hydropower requirements. This approach will be led using existing scientific tools, adjustable to transnational, national and local norms, and carried on by a permanent panel of administrators and stakeholders. Scientific knowledge related to HP & river management will be "translated" by the communication tools and used as concrete added value to build a decision support system. In particular, Multicriteria Analysis (MCA) will be applied to assess different management alternatives where a single-criterion approach (such as cost-benefit analysis) falls short, especially where environmental, technical, economic and social criteria can't be quantified by monetary values; a sketch of this scoring idea follows below. All the existing monitoring databases will be used and harmonized with new information collected during the pilot case studies. At the same time, all information collected will be available to end users and actors of related projects. The project openly pursues integrated river management aims (environmental and economic): - define, share and test a decision-making framework based on validated methodologies in order to allow public decision makers to take transparent decisions about planning and management of HP concessions, taking into account the resulting effects on river ecosystems and on all different stakeholders; - creation of a technical panel including public decision makers, stakeholders and PPs to promote & transfer the SHARE approach at local, national & transnational level to concretely upgrade the actual standard of problem-solving attitude; - classify scenarios of water use optimization, taking into account the different actor needs; - establish a set of generally applicable and comparable indicators & monitoring standards based on transferable guidelines and metrics considering the specific disparities among power stations, diversity of technical approaches and different river ecosystems; - designation and mapping of the most vulnerable typologies of alpine hydro systems; - designation and mapping of the most convenient sites and typologies of "low impact" new plants; - contribute to the concrete local implementation of the WFD and RES-e directives. The project partnership embodies different alpine countries & hydrosystems, profiles, status, end users, networks and previous experiences. At the same time, the project official observers represent the links with networks, end users & stakeholders outside the project.
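
    The weighted-sum scoring that simple multicriteria analysis builds on can be sketched in a few lines. Criteria, weights and scores below are invented for illustration; real MCA in a project like SHARE would use stakeholder-derived weights and potentially richer aggregation rules.

```python
# Weighted-sum multicriteria scoring sketch with invented numbers.
import numpy as np

criteria = ["energy yield", "ecological status", "cost", "social acceptance"]
weights = np.array([0.3, 0.35, 0.2, 0.15])          # must sum to 1
# rows: management alternatives; columns: criteria scores normalized to [0,1]
scores = np.array([
    [0.9, 0.3, 0.6, 0.5],   # maximize abstraction
    [0.6, 0.7, 0.7, 0.7],   # seasonal minimum flows
    [0.3, 0.9, 0.8, 0.8],   # strict ecological flows
])
total = scores @ weights
names = ["max abstraction", "seasonal flows", "eco flows"]
for alt, t in sorted(zip(names, total), key=lambda p: -p[1]):
    print(f"{alt:16s} score = {t:.2f}")
```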

  18. Evaluation of three electronic report processing systems for preparing hydrologic reports of the U.S Geological Survey, Water Resources Division

    USGS Publications Warehouse

    Stiltner, G.J.

    1990-01-01

    In 1987, the Water Resources Division of the U.S. Geological Survey undertook three pilot projects to evaluate electronic report processing systems as a means to improve the quality and timeliness of reports pertaining to water resources investigations. The three projects selected for study included the use of the following configuration of software and hardware: Ventura Publisher software on an IBM model AT personal computer, PageMaker software on a Macintosh computer, and FrameMaker software on a Sun Microsystems workstation. The following assessment criteria were to be addressed in the pilot studies: The combined use of text, tables, and graphics; analysis of time; ease of learning; compatibility with the existing minicomputer system; and technical limitations. It was considered essential that the camera-ready copy produced be in a format suitable for publication. Visual improvement alone was not a consideration. This report consolidates and summarizes the findings of the electronic report processing pilot projects. Text and table files originating on the existing minicomputer system were successfully transformed to the electronic report processing systems in American Standard Code for Information Interchange (ASCII) format. Graphics prepared using a proprietary graphics software package were transferred to all the electronic report processing software through the use of Computer Graphic Metafiles. Graphics from other sources were entered into the systems by scanning paper images. Comparative analysis of time needed to process text and tables by the electronic report processing systems and by conventional methods indicated that, although more time is invested in creating the original page composition for an electronically processed report , substantial time is saved in producing subsequent reports because the format can be stored and re-used by electronic means as a template. Because of the more compact page layouts, costs of printing the reports were 15% to 25% less than costs of printing the reports prepared by conventional methods. Because the largest report workload in the offices conducting water resources investigations is preparation of Water-Resources Investigations Reports, Open-File Reports, and annual State Data Reports, the pilot studies only involved these projects. (USGS)

  19. Cost/Schedule Control Systems Criteria: A Reference Guide to C/SCSC information

    DTIC Science & Technology

    1992-09-01

    Smith, Larry A. "Mainframe ARTEMIS: More than a Project Management Tool -- Earned Value Analysis (PEVA)," Project Management Journal, 19:23-28 (April 1988).

  20. Theoretical framework of the causes of construction time and cost overruns

    NASA Astrophysics Data System (ADS)

    Ullah, K.; Abdullah, A. H.; Nagapan, S.; Suhoo, S.; Khan, M. S.

    2017-11-01

    Any construction practitioner's fundamental goal is to complete projects within the estimated duration and budget and to the expected quality targets. However, time and cost overruns are a regular and universal phenomenon in construction projects, and construction projects in Malaysia are no exception to these problems. In order to accomplish the successful completion of construction projects on specified time and within planned cost, various factors should be given serious attention so that issues such as time and cost overrun can be addressed. This paper aims to construct a framework for the causes of time overrun and cost overrun in construction projects in Malaysia. Based on the relevant literature, causative factors of time overrun and cost overrun in Malaysian construction projects are summarized and theoretical frameworks of the causes of construction time overrun and cost overrun are constructed. The developed frameworks for construction time and cost overruns, based on the existing literature, will assist construction practitioners in planning efficient approaches for achieving successful completion of projects.

  1. [The final situation in the Turkey "Stent for Life" project].

    PubMed

    Ertaş, Gökhan; Kozan, Omer; Değertekin, Muzaffer; Kervan, Umit; Aksoy, Mehmet; Koç, Orhan; Göktekin, Omer

    2012-09-01

    The Stent for Life (SFL) project's main mission is to increase the use of primary percutaneous coronary intervention (PCI) to more than 70% of all acute ST-segment elevation myocardial infarction (STEMI) patients. Prior to the SFL project, thrombolysis was the dominant reperfusion strategy, since a low percentage of acute STEMI patients had access to primary PCI in our country. In this study, we present the main barriers to access to primary PCI in the centers involved in the SFL project. Patients with acute STEMI admitted to the centers involved in the SFL project between 2009 and 2011 were included in the analysis. Since the inception of the SFL project, the primary PCI rate has reached over 90% in SFL pilot cities. In the last 5 years, the number of ambulances and emergency stations has increased. Since the start of the collaboration with the 112 Emergency Service, the great majority of cases have been reached via the emergency medical system. The mean door-to-balloon time for the pilot cities was 54.72±43.66 minutes. After three years of the SFL project, primary PCI has emerged as the preferred reperfusion strategy for patients with STEMI in pilot cities.

  2. Aural analysis of image texture via cepstral filtering and sonification

    NASA Astrophysics Data System (ADS)

    Rangayyan, Rangaraj M.; Martins, Antonio C. G.; Ruschioni, Ruggero A.

    1996-03-01

    Texture plays an important role in image analysis and understanding, with many applications in medical imaging and computer vision. However, analysis of texture by image processing is a rather difficult issue, with most techniques oriented towards statistical analysis which may not have readily comprehensible perceptual correlates. We propose new methods for auditory display (AD) and sonification of (quasi-) periodic texture (where a basic texture element or 'texton' is repeated over the image field) and random texture (which could be modeled as filtered or 'spot' noise). Although the AD designed is not intended to be speech-like or musical, we draw analogies between the two types of texture mentioned above and voiced/unvoiced speech, and design a sonification algorithm which incorporates physical and perceptual concepts of texture and speech. More specifically, we present a method for AD of texture where the projections of the image at various angles (Radon transforms or integrals) are mapped to audible signals and played in sequence. In the case of random texture, the spectral envelopes of the projections are related to the filter spot characteristics, and convey the essential information for texture discrimination. In the case of periodic texture, the AD provides timbre and pitch related to the texton and periodicity. In another procedure for sonification of periodic texture, we propose to first deconvolve the image using cepstral analysis to extract information about the texton and horizontal and vertical periodicities. The projections of individual textons at various angles are used to create a voiced-speech-like signal with each projection mapped to a basic wavelet, the horizontal period to pitch, and the vertical period to rhythm on a longer time scale. The sound pattern then consists of a serial, melody-like sonification of the patterns for each projection. We believe that our approaches provide the much-desired 'natural' connection between the image data and the sounds generated. We have evaluated the sonification techniques with a number of synthetic textures. The sound patterns created have demonstrated the potential of the methods in distinguishing between different types of texture. We are investigating the application of these techniques to auditory analysis of texture in medical images such as magnetic resonance images.
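
    The projection-to-sound mapping can be sketched as follows: image projections at a set of angles (a crude Radon transform via rotation) are resampled to audio rate and concatenated into a signal played in sequence. The mapping choices below (segment duration, normalization, sample rate) are invented; the paper's sonification is more elaborate.

```python
# Sketch: Radon-like projections of an image mapped to an audio-rate signal.
import numpy as np
from scipy.ndimage import rotate

rng = np.random.default_rng(4)
image = rng.random((128, 128))                    # stand-in random texture
angles = np.arange(0, 180, 15)
sr, seg_s = 8000, 0.25                            # sample rate, seconds/angle

segments = []
for theta in angles:
    # rotate, then sum columns: the projection of the image at this angle
    proj = rotate(image, float(theta), reshape=False, order=1).sum(axis=0)
    proj = (proj - proj.mean()) / (proj.std() + 1e-12)
    t = np.linspace(0, 1, int(sr * seg_s))
    segments.append(np.interp(t, np.linspace(0, 1, proj.size), proj))

audio = np.concatenate(segments)   # could be written out, e.g. as a WAV file
print(audio.shape, audio.min(), audio.max())
```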

  3. Lessons learned from setting up the NOWESP research data base: Experiences in an interdisciplinary research project

    NASA Astrophysics Data System (ADS)

    Radach, Günther; Gekeler, Jens

    1996-09-01

    Research carried out within the framework of the MAST project NOWESP (North-West European Shelf Programme) was based on a multi-parameter data set of existing marine data, relevant for estimating trends, variability and fluxes on the Northwest European Shelf. The data sets were provided by the partners of the project. Additional data sets were obtained from several other institutions. During the project, the data were organized in the NOWESP Research Data Base (NRDB), for which a special data base scheme was defined that was capable of storing different types of marine data. Data products, like time series and interpolated fields, were provided to the partners for analysis (Radach et al. [1997]). After three years of project time, the feasibility of such an approach is discussed. Ways of optimizing data access and evaluation are proposed. A project-oriented Research Data Base is a useful tool because of its flexibility and proximity to the research being carried out. However, several requirements must be met to derive optimum benefits from this type of service unit. Since this task usually is carried out by a limited number of staff, an early start of project data management is recommended. To enable future projects to succeed in an analogous compilation of relevant data for their use, as performed in NOWESP, the task of organizing the data sets for any short-term project should be shared between a research data base group and a national or international data centre whose experience and software could be used. It must be ensured that only quality controlled data sets from the individual data-producing projects are delivered to the national data centres. It is recommended that data quality control should be performed by the originators and/or data centres before delivering any data sets to the research data base. Delivery of the (full) data sets should be checked and their quality should be approved by authorized data centres.

  4. Project management: importance for diagnostic laboratories.

    PubMed

    Croxatto, A; Greub, G

    2017-07-01

    The need for diagnostic laboratories to improve both quality and productivity alongside personnel shortages incite laboratory managers to constantly optimize laboratory workflows, organization, and technology. These continuous modifications of the laboratories should be conducted using efficient project and change management approaches to maximize the opportunities for successful completion of the project. This review aims at presenting a general overview of project management with an emphasis on selected critical aspects. Conventional project management tools and models, such as HERMES, described in the literature, associated personal experience, and educational courses on management have been used to illustrate this review. This review presents general guidelines of project management and highlights their importance for microbiology diagnostic laboratories. As an example, some critical aspects of project management will be illustrated with a project of automation, as experienced at the laboratories of bacteriology and hygiene of the University Hospital of Lausanne. It is important to define clearly beforehand the objective of a project, its perimeter, its costs, and its time frame including precise duration estimates of each step. Then, a project management plan including explanations and descriptions on how to manage, execute, and control the project is necessary to continuously monitor the progression of a project to achieve its defined goals. Moreover, a thorough risk analysis with contingency and mitigation measures should be performed at each phase of a project to minimize the impact of project failures. The increasing complexities of modern laboratories mean clinical microbiologists must use several management tools including project and change management to improve the outcome of major projects and activities. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.

  5. Web services for ecosystem services management and poverty alleviation

    NASA Astrophysics Data System (ADS)

    Buytaert, W.; Baez, S.; Veliz Rosas, C.

    2011-12-01

    Over the last decades, near real-time environmental observation, technical advances in computer power and cyber-infrastructure, and the development of environmental software algorithms have increased dramatically. The integration of these evolutions is one of the major challenges of the next decade for environmental sciences. Worldwide, many coordinated activities are ongoing to make this integration a reality. However, far less attention is paid to the question of how these developments can benefit environmental services management in a poverty alleviation context. Such projects are typically faced with issues of large predictive uncertainties, limited resources, limited local scientific capacity. At the same time, the complexity of the socio-economic contexts requires a very strong bottom-up oriented and interdisciplinary approach to environmental data collection and processing. Here, we present the results of two projects on integrated environmental monitoring and scenario analysis aimed at poverty alleviation in the Peruvian Andes and Amazon. In the upper Andean highlands, farmers are monitoring the water cycle of headwater catchments to analyse the impact of land-use changes on stream flow and potential consequences for downstream irrigation. In the Amazon, local communities are monitoring the dynamics of turtle populations and their relations with river levels. In both cases, the use of online databases and web processing services enable real-time analysis of the data and scenario analysis. The system provides both physical and social indicators to assess the impact of land-use management options on local socio-economic development.

  6. Near Real-Time Surveillance for Influenza Vaccine Safety: Proof-of-Concept in the Vaccine Safety Datalink Project

    PubMed Central

    Greene, Sharon K.; Kulldorff, Martin; Lewis, Edwin M.; Li, Rong; Yin, Ruihua; Weintraub, Eric S.; Fireman, Bruce H.; Lieu, Tracy A.; Nordin, James D.; Glanz, Jason M.; Baxter, Roger; Jacobsen, Steven J.; Broder, Karen R.; Lee, Grace M.

    2010-01-01

    The emergence of pandemic H1N1 influenza in 2009 has prompted public health responses, including production and licensure of new influenza A (H1N1) 2009 monovalent vaccines. Safety monitoring is a critical component of vaccination programs. As proof-of-concept, the authors mimicked near real-time prospective surveillance for prespecified neurologic and allergic adverse events among enrollees in 8 medical care organizations (the Vaccine Safety Datalink Project) who received seasonal trivalent inactivated influenza vaccine during the 2005/06–2007/08 influenza seasons. In self-controlled case series analysis, the risk of adverse events in a prespecified exposure period following vaccination was compared with the risk in 1 control period for the same individual either before or after vaccination. In difference-in-difference analysis, the relative risk in exposed versus control periods each season was compared with the relative risk in previous seasons since 2000/01. The authors used Poisson-based analysis to compare the risk of Guillain-Barré syndrome following vaccination in each season with that in previous seasons. Maximized sequential probability ratio tests were used to adjust for repeated analyses on weekly data. With administration of 1,195,552 doses to children under age 18 years and 4,773,956 doses to adults, no elevated risk of adverse events was identified. Near real-time surveillance for selected adverse events can be implemented prospectively to rapidly assess seasonal and pandemic influenza vaccine safety. PMID:19965887
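
    In its simplest form, the self-controlled case series logic conditions on a case's total events: the count falling in the risk window is binomial, with probability fixed by the ratio of window lengths, so an excess is testable without external controls. The sketch below uses invented numbers and does not implement the study's maximized sequential probability ratio tests.

```python
# Self-controlled case series sketch: conditional binomial test of an
# excess of events in the post-vaccination risk window. Numbers invented.
from scipy.stats import binomtest

risk_days, control_days = 42, 365          # risk window vs control window
events_in_risk, total_events = 18, 60      # pooled over cases (illustrative)

p0 = risk_days / (risk_days + control_days)   # null window-length probability
result = binomtest(events_in_risk, total_events, p0, alternative="greater")
rel_incidence = (events_in_risk / risk_days) / \
                ((total_events - events_in_risk) / control_days)
print(f"relative incidence = {rel_incidence:.2f}, p = {result.pvalue:.4f}")
```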

  7. Condensing Massive Satellite Datasets For Rapid Interactive Analysis

    NASA Astrophysics Data System (ADS)

    Grant, G.; Gallaher, D. W.; Lv, Q.; Campbell, G. G.; Fowler, C.; LIU, Q.; Chen, C.; Klucik, R.; McAllister, R. A.

    2015-12-01

    Our goal is to enable users to interactively analyze massive satellite datasets, identifying anomalous data or values that fall outside of thresholds. To achieve this, the project seeks to create a derived database containing only the most relevant information, accelerating the analysis process. The database is designed to be an ancillary tool for the researcher, not an archival database to replace the original data. This approach is aimed at improving performance by reducing the overall size by way of condensing the data. The primary challenges of the project include: - The nature of the research question(s) may not be known ahead of time. - The thresholds for determining anomalies may be uncertain. - Problems associated with processing cloudy, missing, or noisy satellite imagery. - The contents and method of creation of the condensed dataset must be easily explainable to users. The architecture of the database will reorganize spatially-oriented satellite imagery into temporally-oriented columns of data (a.k.a., "data rods") to facilitate time-series analysis. The database itself is an open-source parallel database, designed to make full use of clustered server technologies. A demonstration of the system capabilities will be shown. Applications for this technology include quick-look views of the data, as well as the potential for on-board satellite processing of essential information, with the goal of reducing data latency.
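
    The "data rods" reorganization amounts to viewing a (time, y, x) image stack as one time series per pixel and keeping only values that fall outside a threshold. The sketch below uses synthetic data and an invented per-rod threshold rule.

```python
# Sketch: reorganize a (time, y, x) stack into per-pixel "data rods" and
# flag out-of-threshold values. Shapes and threshold are illustrative.
import numpy as np

rng = np.random.default_rng(5)
stack = rng.normal(0.0, 1.0, (365, 64, 64))       # a year of daily imagery
stack[200:205, 10, 20] += 15.0                    # implant one anomalous pixel

rods = stack.reshape(stack.shape[0], -1).T        # (n_pixels, n_times) rods
mu = rods.mean(axis=1, keepdims=True)
sigma = rods.std(axis=1, keepdims=True)
anomalous = np.abs(rods - mu) > 6 * sigma         # per-rod threshold test

pix, days = np.nonzero(anomalous)
for p in np.unique(pix):
    y, x = divmod(int(p), 64)                     # row-major pixel index
    print(f"pixel ({y},{x}): anomalies on days {days[pix == p]}")
```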

  8. An improved optimization algorithm and Bayes factor termination criterion for sequential projection pursuit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Webb-Robertson, Bobbie-Jo M.; Jarman, Kristin H.; Harvey, Scott D.

    2005-05-28

    A fundamental problem in analysis of highly multivariate spectral or chromatographic data is reduction of dimensionality. Principal components analysis (PCA), concerned with explaining the variance-covariance structure of the data, is a commonly used approach to dimension reduction. Recently an attractive alternative to PCA, sequential projection pursuit (SPP), has been introduced. Designed to elicit clustering tendencies in the data, SPP may be more appropriate when performing clustering or classification analysis. However, the existing genetic algorithm (GA) implementation of SPP has two shortcomings: computation time and inability to determine the number of factors necessary to explain the majority of the structure in the data. We address both these shortcomings. First, we introduce a new SPP algorithm, a random scan sampling algorithm (RSSA), that significantly reduces computation time. We compare the computational burden of the RSSA and GA implementations of SPP on a dataset containing Raman spectra of twelve organic compounds. Second, we propose a Bayes factor criterion, BFC, as an effective measure for selecting the number of factors needed to explain the majority of the structure in the data. We compare SPP to PCA on two datasets varying in type, size, and difficulty; in both cases SPP achieves a higher accuracy with a lower number of latent variables.
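
    A random-scan search for an interesting projection can be sketched directly: sample random unit directions and keep the one optimizing a projection index. The sketch below uses minimum kurtosis, which favors bimodal (i.e., clustered) projections; the actual RSSA and the index used in the paper differ in detail.

```python
# Random-scan projection pursuit sketch on synthetic two-cluster data.
import numpy as np

rng = np.random.default_rng(6)
# two Gaussian clusters in 10-D, separated along the first axis
X = rng.normal(0, 1, (200, 10))
X[:100, 0] += 4.0
X = X - X.mean(axis=0)

def kurtosis(z):
    z = (z - z.mean()) / z.std()
    return np.mean(z**4)       # Gaussian ~ 3; bimodal projections are lower

best_w, best_k = None, np.inf
for _ in range(2000):                      # random scan over directions
    w = rng.standard_normal(X.shape[1])
    w /= np.linalg.norm(w)
    k = kurtosis(X @ w)
    if k < best_k:
        best_w, best_k = w, k

print(f"best index {best_k:.2f}; weight on clustered axis {abs(best_w[0]):.2f}")
```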

  9. Innovative Instrumentation and Analysis of the Temperature Measurement for High Temperature Gasification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seong W. Lee

    The project entitled ''Innovative Instrumentation and Analysis of the Temperature Measurement for High Temperature Gasification'' was successfully completed by the Principal Investigator, Dr. S. Lee, and his research team in the Center for Advanced Energy Systems and Environmental Control Technologies at Morgan State University. The major results and outcomes were presented in semi-annual progress reports and annual project review meetings/presentations. Specifically, the literature survey covering gasifier temperature measurement, ultrasonic cleaning applications, and the spray coating process, as well as the gasifier simulator (cold model) testing, was successfully conducted during the first year. The results show that four factors (blower voltage, ultrasonic application, injection time intervals, particle weight) were considered significant factors that affect the temperature measurement. Then the gasifier simulator (hot model) design and fabrication, as well as systematic tests on the hot model, were completed to test the significant factors on temperature measurement in the second year. Advanced industrial analytic methods such as statistics-based experimental design, analysis of variance (ANOVA) and regression methods were applied in the hot model tests. The results show that operational parameters (i.e., air flow rate, water flow rate, fine dust particle amount, ammonia addition) had a significant impact on the temperature measurement inside the gasifier simulator. Experimental design and ANOVA are very efficient ways to design and analyze the experiments. The results show that the air flow rate and fine dust particle amount are statistically significant to the temperature measurement. The regression model provided the functional relation between the temperature and these factors with substantial accuracy. In the last year of the project period, the ultrasonic and subsonic cleaning methods and coating materials were tested/applied on the thermocouple cleaning according to the proposed approach. Different frequencies, application times and powers of the ultrasonic/subsonic output were tested. The results show that the ultrasonic approach is one of the best methods to clean the thermocouple tips during routine operation of the gasifier. In addition, a real-time data acquisition system was designed and applied in the experiments. This advanced instrumentation provided efficient and accurate data acquisition for this project. In summary, the accomplishment of the project provided useful information on the ultrasonic cleaning method applied to thermocouple tip cleaning. The temperature measurement could be much improved both in accuracy and duration provided that the proposed approach is widely used in gasification facilities.
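
    The style of analysis described, factor screening by ANOVA plus a regression relating temperature to the factors, can be sketched on simulated readings. Factor levels, coefficients, and noise below are invented for illustration.

```python
# Sketch: one-way ANOVA per factor and a least-squares regression of
# temperature on the factors, using simulated data.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(7)
airflow = np.repeat([1.0, 2.0, 3.0], 20)           # factor A levels
dust = np.tile(np.repeat([0.5, 1.5], 10), 3)       # factor B levels
temp = 900 + 25 * airflow - 12 * dust + rng.normal(0, 5, 60)

# One-way ANOVA: does temperature differ across airflow levels?
groups = [temp[airflow == lvl] for lvl in np.unique(airflow)]
print("airflow ANOVA:", f_oneway(*groups))

# Regression: functional relation temp ~ airflow + dust
A = np.column_stack([np.ones_like(temp), airflow, dust])
coef, *_ = np.linalg.lstsq(A, temp, rcond=None)
print(f"temp = {coef[0]:.1f} + {coef[1]:.1f}*airflow {coef[2]:+.1f}*dust")
```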

  10. Earthquake source imaging by high-resolution array analysis at regional distances: the 2010 M7 Haiti earthquake as seen by the Venezuela National Seismic Network

    NASA Astrophysics Data System (ADS)

    Meng, L.; Ampuero, J. P.; Rendon, H.

    2010-12-01

    Back projection of teleseismic waves based on array processing has become a popular technique for earthquake source imaging, in particular to track the areas of the source that generate the strongest high-frequency radiation. The technique has previously been applied to study the rupture process of the Sumatra earthquake and the supershear rupture of the Kunlun earthquake. Here we attempt to image the Haiti earthquake using data recorded by the Venezuela National Seismic Network (VNSN). The network is composed of 22 broad-band stations with an east-west oriented geometry, located approximately 10 degrees away from Haiti in the direction perpendicular to the Enriquillo fault strike. This is the first opportunity to exploit the privileged position of the VNSN to study large earthquake ruptures in the Caribbean region, and also a great opportunity to explore back projection of the crustal Pn phase at regional distances, which provides unique complementary insights to teleseismic source inversions. The challenge in the analysis of the 2010 M7.0 Haiti earthquake is its very compact source region, possibly shorter than 30 km, which is below the resolution limit of standard back projection techniques based on beamforming. Results of back projection analysis using teleseismic USArray data reveal few details of the rupture process. To overcome the classical resolution limit we explored the Multiple Signal Classification (MUSIC) method, a high-resolution array processing technique based on the signal-noise orthogonality in the eigenspace of the data covariance, which achieves both enhanced resolution and a better ability to resolve closely spaced sources. We experimented with various synthetic earthquake scenarios to test the resolution and find that MUSIC provides at least 3 times higher resolution than beamforming. We also studied the inherent bias due to interference of coherent Green's functions, which leads to a potential quantification of the bias uncertainty of the back projection. Preliminary results from the Venezuela dataset show east-to-west rupture propagation along the fault at sub-Rayleigh rupture speed, consistent with a compact source with two significant asperities, which are confirmed by the source time function obtained from Green's function deconvolution and by other source inversion results. These efforts could lead the Venezuela National Seismic Network to play a prominent role in the timely characterization of the rupture process of large earthquakes in the Caribbean, including future ruptures along the yet unbroken segments of the Enriquillo fault system.
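
    A minimal sketch of the MUSIC idea referenced above, in the standard narrowband direction-of-arrival setting rather than the authors' seismic back projection: the pseudospectrum is the inverse of the projection of a steering vector onto the noise subspace of the data covariance. The array geometry, source angles, and noise levels below are illustrative assumptions.

```python
import numpy as np

def music_spectrum(X, n_sources, thetas, d=0.5):
    """MUSIC pseudospectrum for a uniform linear array.

    X: (n_sensors, n_snapshots) complex narrowband data
    n_sources: assumed number of sources
    thetas: candidate directions of arrival (radians)
    d: sensor spacing in wavelengths
    """
    n_sensors = X.shape[0]
    R = (X @ X.conj().T) / X.shape[1]          # sample covariance
    eigvals, eigvecs = np.linalg.eigh(R)       # eigenvalues ascending
    En = eigvecs[:, :n_sensors - n_sources]    # noise subspace
    k = np.arange(n_sensors)
    P = np.empty(len(thetas))
    for i, th in enumerate(thetas):
        a = np.exp(-2j * np.pi * d * k * np.sin(th))      # steering vector
        P[i] = 1.0 / np.linalg.norm(En.conj().T @ a) ** 2  # peaks at sources
    return P

# Example: two closely spaced sources at 8 and 14 degrees, 22 sensors
rng = np.random.default_rng(2)
n_sensors, n_snap = 22, 200
k = np.arange(n_sensors)[:, None]
angles = np.deg2rad([8.0, 14.0])
A = np.exp(-2j * np.pi * 0.5 * k * np.sin(angles))
S = rng.normal(size=(2, n_snap)) + 1j * rng.normal(size=(2, n_snap))
noise = rng.normal(size=(n_sensors, n_snap)) + 1j * rng.normal(size=(n_sensors, n_snap))
X = A @ S + 0.1 * noise
thetas = np.deg2rad(np.linspace(-30, 30, 601))
P = music_spectrum(X, n_sources=2, thetas=thetas)
print(np.rad2deg(thetas[P.argmax()]))          # near one of the two sources
```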

  11. Estimation of seismically detectable portion of a gas plume: CO2CRC Otway project case study

    NASA Astrophysics Data System (ADS)

    Pevzner, Roman; Caspari, Eva; Bona, Andrej; Galvin, Robert; Gurevich, Boris

    2013-04-01

    The CO2CRC Otway project comprises several experiments involving CO2/CH4 or pure CO2 gas injection into different geological formations at the Otway test site (Victoria, Australia). During the first stage of the project, which finished in 2010, more than 64,000 t of gas were injected into a depleted gas reservoir at ~2 km depth. At the moment, preparations are ongoing for the next stage of the project, which aims to examine the capabilities of seismic monitoring for a small-scale injection (up to 15,000 t) into a saline formation. Time-lapse seismic is one of the most common methods for CO2 geosequestration monitoring, and significant experience was gained during the first stage of the project through acquisition and analysis of the 4D surface seismic and numerous time-lapse VSP surveys. To justify the second stage of the project and optimise the parameters of the experiment, several modelling studies were conducted. To predict the seismic signal, we populate a realistic geological model with elastic properties, model their changes using a fluid substitution technique applied to fluid flow simulation results, and compute synthetic seismic baseline and monitor volumes. To assess the detectability of the time-lapse signal caused by the injection, we assume that the time-lapse noise level will be equivalent to the level of difference between the last two Otway 3D surveys, acquired in 2009 and 2010 using conventional surface techniques (15,000 lbs vibroseis sources and single geophones as receivers). To quantify the uncertainties in plume imaging/visualisation due to the time-lapse noise realisation, we propose to use multiple noise realisations with the same F-Kx-Ky amplitude spectra as the field noise for each synthetic signal volume. With a signal detection criterion defined in terms of the signal/time-lapse noise level on a single trace, we estimate the visible portion of the plume as a function of this criterion. This approach also provides an opportunity to evaluate the probability of signal detection. The authors acknowledge the funding provided by the Australian government through its CRC program to support this CO2CRC research project. We also acknowledge the CO2CRC's corporate sponsors and the financial assistance provided through Australian National Low Emissions Coal Research and Development (ANLEC R&D). ANLEC R&D is supported by Australian Coal Association Low Emissions Technology Limited and the Australian Government through the Clean Energy Initiative.
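
    A minimal sketch of the proposed noise-realisation idea under simplifying assumptions: given an observed noise volume, new realisations sharing its amplitude spectrum are produced by randomizing the phase in the Fourier domain. The array sizes are arbitrary and the "field noise" is simulated here.

```python
import numpy as np

def noise_realisation(field_noise, rng):
    """New noise volume with the same F-Kx-Ky amplitude spectrum as
    field_noise (a real 3-D array: time, x, y) but randomized phase."""
    amp = np.abs(np.fft.fftn(field_noise))   # target amplitude spectrum
    w = rng.normal(size=field_noise.shape)   # white noise supplies the phase
    W = np.fft.fftn(w)
    phase = W / np.abs(W)                    # unit-modulus, Hermitian phase
    return np.real(np.fft.ifftn(amp * phase))

rng = np.random.default_rng(3)
field = rng.normal(size=(64, 32, 32))        # stand-in for field noise
real1 = noise_realisation(field, rng)
real2 = noise_realisation(field, rng)
# Both realisations share the field amplitude spectrum but differ in phase.
print(np.allclose(np.abs(np.fft.fftn(real1)), np.abs(np.fft.fftn(field))))
```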

  12. Inexpensive and Highly Reproducible Cloud-Based Variant Calling of 2,535 Human Genomes

    PubMed Central

    Shringarpure, Suyash S.; Carroll, Andrew; De La Vega, Francisco M.; Bustamante, Carlos D.

    2015-01-01

    Population-scale sequencing of whole human genomes is becoming economically feasible; however, data management and analysis remain a formidable challenge for many research groups. Large sequencing studies, like the 1000 Genomes Project, have improved our understanding of human demography and the effect of rare genetic variation in disease. Variant calling on datasets of hundreds or thousands of genomes is time-consuming, expensive, and not easily reproducible given the myriad components of a variant calling pipeline. Here, we describe a cloud-based pipeline for joint variant calling in large samples using the Real Time Genomics population caller. We deployed the population caller on the Amazon cloud with the DNAnexus platform in order to achieve low-cost variant calling. Using our pipeline, we identified 68.3 million variants in 2,535 samples from Phase 3 of the 1000 Genomes Project. By performing the variant calling in a parallel manner, the data were processed within 5 days at a compute cost of $7.33 per sample (a total cost of $18,590 for completed jobs and $21,805 for all jobs). Analysis of how cost and running time depend on data size suggests that, given the near-linear scalability, cloud computing can be a cheap and efficient platform for analyzing even larger sequencing studies in the future. PMID:26110529
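
    The per-sample cost arithmetic in the abstract is easy to reproduce; the short sketch below uses only figures quoted above, plus a hypothetical larger cohort to illustrate the authors' near-linear scaling argument.

```python
# Cost arithmetic from the abstract: $7.33/sample across 2,535 genomes.
samples = 2535
cost_per_sample = 7.33                     # USD, completed jobs
completed_total = samples * cost_per_sample
print(f"completed jobs: ${completed_total:,.2f}")  # ~$18,582, close to the
                                                   # reported $18,590 total

# Under the near-linear scaling the authors observe, projecting to a
# hypothetical larger cohort is a single multiplication:
projected = 100_000 * cost_per_sample
print(f"100k genomes (hypothetical): ${projected:,.0f}")
```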

  13. Preliminary Results for the OECD/NEA Time Dependent Benchmark using Rattlesnake, Rattlesnake-IQS and TDKENO

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeHart, Mark D.; Mausolff, Zander; Weems, Zach

    2016-08-01

    One goal of the MAMMOTH M&S project is to validate the analysis capabilities within MAMMOTH. Historical data have shown limited value for validation of full three-dimensional (3D) multi-physics methods. Initial analysis considered the TREAT startup minimum critical core and one of the startup transient tests. At present, validation is focusing on measurements taken during the M8CAL test calibration series. These exercises will be valuable in a preliminary assessment of the ability of MAMMOTH to perform coupled multi-physics calculations; calculations performed to date are being used to validate the neutron transport solver Rattlesnake and the fuels performance code BISON. Other validation projects outside of TREAT are available for single-physics benchmarking. Because the transient solution capability of Rattlesnake is one of the key attributes that makes it unique for TREAT transient simulations, validating the transient solution of Rattlesnake against other time-dependent kinetics benchmarks has considerable value. The Nuclear Energy Agency (NEA) of the Organization for Economic Cooperation and Development (OECD) has recently developed a computational benchmark for transient simulations. This benchmark considers both two-dimensional (2D) and 3D configurations for a total of 26 different transients. All are negative reactivity insertions, typically returning to the critical state after some time.
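
    The benchmark transients themselves require a transport solver such as Rattlesnake, but the class of problem, a negative reactivity insertion that later returns to critical, can be illustrated with one-delayed-group point kinetics. The sketch below is not the benchmark specification; the parameters and reactivity history are assumed for illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp

# One-delayed-group point kinetics; all parameters assumed for illustration.
beta = 0.0065      # delayed neutron fraction
lam = 0.08         # precursor decay constant (1/s)
Lambda = 5.0e-5    # neutron generation time (s)

def rho(t):
    # Negative reactivity step, withdrawn at t = 5 s so the system
    # returns to critical, as in the benchmark's class of transients.
    return -0.5 * beta if t < 5.0 else 0.0

def kinetics(t, y):
    n, C = y
    dn = (rho(t) - beta) / Lambda * n + lam * C
    dC = beta / Lambda * n - lam * C
    return [dn, dC]

# Start from critical equilibrium: C0 = beta * n0 / (Lambda * lam)
n0 = 1.0
y0 = [n0, beta * n0 / (Lambda * lam)]
sol = solve_ivp(kinetics, (0.0, 20.0), y0, method="Radau", max_step=0.1)
print(sol.y[0, -1])   # power settles at a reduced steady level
```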

  14. Knowledge to Action - Understanding Natural Hazards-Induced Power Outage Scenarios for Actionable Disaster Responses

    NASA Astrophysics Data System (ADS)

    Kar, B.; Robinson, C.; Koch, D. B.; Omitaomu, O.

    2017-12-01

    The Sendai Framework for Disaster Risk Reduction 2015-2030 identified four priorities for preventing and reducing disaster risk: i) understanding disaster risk; ii) strengthening governance to manage disaster risk; iii) investing in disaster risk reduction for resilience; and iv) enhancing disaster preparedness for effective response, and to "Build Back Better" in recovery, rehabilitation, and reconstruction. While forecasting and decision-making tools are in place to predict and understand the future impacts of natural hazards, the current knowledge-to-action approach fails to provide the updated information decision makers need to undertake response and recovery efforts following a hazard event. For instance, during a tropical storm event, advisories are released every two to three hours, but manual analysis of geospatial data to determine the potential impacts of the event tends to be time-consuming and largely a post-event process. Researchers at Oak Ridge National Laboratory have developed a Spatial Decision Support System that enables real-time analysis of storm impact based on each updated advisory. A prototype of the tool, which focuses on determining projected power outage areas and projected outage durations, demonstrates the feasibility of integrating science with decision making so that emergency management personnel can act in real time to protect communities and reduce risk.

  15. Artificial intelligence for multi-mission planetary operations

    NASA Technical Reports Server (NTRS)

    Atkinson, David J.; Lawson, Denise L.; James, Mark L.

    1990-01-01

    A brief introduction is given to an automated system called the Spacecraft Health Automated Reasoning Prototype (SHARP). SHARP is designed to demonstrate automated health and status analysis for multi-mission spacecraft and ground data systems operations. The SHARP system combines conventional computer science methodologies with artificial intelligence techniques to produce an effective method for detecting and analyzing potential spacecraft and ground systems problems. The system performs real-time analysis of spacecraft and other related telemetry, and is also capable of examining data in historical context. Telecommunications link analysis of the Voyager II spacecraft was the initial focus for evaluation of the prototype in a real-time operations setting during the Voyager spacecraft encounter with Neptune in August 1989. The preliminary results of the SHARP project and plans for future application of the technology are discussed.

  16. Time Overrun in Construction Project

    NASA Astrophysics Data System (ADS)

    Othman, I.; Shafiq, Nasir; Nuruddin, M. F.

    2017-12-01

    Timely completion is a key criterion for success in any project, regardless of industry. Unfortunately, the construction industry in Malaysia has been labelled as one facing poor performance, leading to failure to achieve effective time management. As a consequence, many projects face substantial time overruns. This study assesses the causes of construction project time overruns in Malaysia using a structured questionnaire survey. Each respondent was asked to assign a one-to-five rating to each of 18 time factors identified from the literature review. Out of the 50 questionnaires sent out, 33 were returned, a response rate of 66%. Data received from the questionnaires were analysed and processed using descriptive statistics procedures. Findings from the study revealed that design and documentation issues, project management and contract administration, ineffective project planning and scheduling, contractor's site management, and financial resource management were the major factors causing time overrun. It is hoped that this study will help practitioners implement mitigation measures at the planning stage in order to achieve successful construction projects.
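
    A hedged sketch of the kind of descriptive-statistics treatment described above: rank Likert-scale factors by mean rating, with the relative importance index (mean/5) as a common companion summary. The factor names are taken from the abstract; the ratings are simulated, not the survey data.

```python
import numpy as np
import pandas as pd

# Simulated 1-5 ratings from 33 respondents for five of the reported
# factors (names from the abstract; the data here are invented).
rng = np.random.default_rng(4)
factors = ["design and documentation issues",
           "project management and contract administration",
           "ineffective project planning and scheduling",
           "contractor's site management",
           "financial resource management"]
ratings = pd.DataFrame(rng.integers(1, 6, size=(33, len(factors))),
                       columns=factors)

# Rank factors by mean rating; RII = mean/5 is a common survey summary.
summary = pd.DataFrame({"mean": ratings.mean(), "std": ratings.std()})
summary["RII"] = summary["mean"] / 5.0
print(summary.sort_values("mean", ascending=False).round(3))
```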

  17. Seismpol_ a visual-basic computer program for interactive and automatic earthquake waveform analysis

    NASA Astrophysics Data System (ADS)

    Patanè, Domenico; Ferrari, Ferruccio

    1997-11-01

    A Microsoft Visual Basic computer program for waveform analysis of seismic signals is presented. The program combines interactive and automatic processing of digital signals using data recorded by three-component seismic stations. The analysis procedure can be used either for interactive earthquake analysis or for automatic on-line processing of seismic recordings. The algorithm works in the time domain using the Covariance Matrix Decomposition (CMD) method, so that polarization characteristics may be computed continuously in real time and seismic phases can be identified and discriminated. Visual inspection of the particle motion in orthogonal planes of projection (hodograms) reduces the danger of misinterpretation arising from application of the polarization filter. Careful choice of the time window and frequency intervals improves the quality of the extracted polarization information; the program uses a band-pass Butterworth filter to process the signals in the frequency domain, decomposing a selected signal window into a series of narrow frequency bands. Significant results, supported by well-defined polarizations and source azimuth estimates for P and S phases, are also obtained for short-period seismic events (local microearthquakes).
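
    As a simplified sketch of covariance-based polarization analysis in the spirit of the CMD method (not the Seismpol code itself), the following Python fragment computes rectilinearity, azimuth, and incidence from the eigenstructure of the covariance matrix of a three-component window; the attribute definitions used are common textbook variants.

```python
import numpy as np

def polarization_attributes(z, n, e):
    """Polarization attributes of a three-component window from the
    eigendecomposition of its 3x3 covariance matrix (simplified)."""
    X = np.vstack([z, n, e])
    C = np.cov(X)                          # 3x3 covariance matrix
    w, V = np.linalg.eigh(C)               # eigenvalues ascending
    l1, l2, l3 = w[2], w[1], w[0]
    rectilinearity = 1.0 - (l2 + l3) / (2.0 * l1)
    u = V[:, 2]                            # principal eigenvector [z, n, e]
    azimuth = np.degrees(np.arctan2(u[2], u[1])) % 360.0  # from north
    incidence = np.degrees(np.arccos(abs(u[0]) / np.linalg.norm(u)))
    return rectilinearity, azimuth, incidence

# Synthetic rectilinear "P-like" arrival polarized at ~30 degrees azimuth
t = np.linspace(0, 1, 200)
s = np.sin(2 * np.pi * 8 * t) * np.exp(-3 * t)
rng = np.random.default_rng(5)
z = 0.8 * s + 0.05 * rng.normal(size=t.size)
n = np.cos(np.radians(30)) * s + 0.05 * rng.normal(size=t.size)
e = np.sin(np.radians(30)) * s + 0.05 * rng.normal(size=t.size)
print(polarization_attributes(z, n, e))    # high rectilinearity, ~30 deg
```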

  18. Investigating the Nature of Dark Energy using Type Ia Supernovae with WFIRST-AFTA Space Mission

    NASA Astrophysics Data System (ADS)

    Perlmutter, Saul

    Scientifically, the WFIRST supernova program is unique: it makes possible a dark energy measurement that no other space mission or ground-based project is addressing, a measurement that will set the standard in determining the expansion history of the universe continuously from low to high redshifts (0.1 < z < 1.7). In the context of the WFIRST Science Definition Team, several participants in this proposal developed a first version of a supernova program, described in the WFIRST SDT Report. While this program was judged to be robust, and the estimates of its sensitivity to the cosmological parameters were felt to be reliable, time limitations meant the analysis was limited in depth on a number of issues. The objective of this proposal is to develop the program further. Technically, this is arguably the WFIRST measurement that requires the most advanced project development, since it requires near-real-time analysis and follow-up with WFIRST, and since it uses the IFU spectrograph in the WFI package, the IFU being the WFIRST instrument that does not yet have a completely consistent set of specifications in the design iteration of the SDT report. In this proposal for the WFIRST Scientific Investigation Team, focused primarily on the supernova dark energy measurements, we address these crucial technical needs by bringing the larger supernova community's expertise on the science elements together with a smaller focused team that can produce the specific deliverables. The objectives of this 5-year proposal are: 1. development of scientific performance requirements for the study of dark energy using Type Ia supernovae; 2. design of an observing strategy using the Wide Field Instrument (WFI) and the Integral Field Spectrometer Unit (IFU); 3. development of science data analysis techniques and data analysis software; 4. development of ground and space calibration requirements, with estimation of realistic correlated errors, both statistical and systematic; 5. development of simulations and data challenges to validate the above; and 6. development of complete plans, in coordination with the WFIRST project, for all aspects of science simulations, precursor observations, ground calibration, observational needs, data processing, ancillary data collection/incorporation, analysis, dissemination, and documentation of the proposed science investigation. The 5-year program also intends to provide the following deliverables: 1. documentation describing detailed scientific performance requirements; 2. documentation describing the design of an observing program; 3. documentation of science data analysis techniques; 4. simulations and data challenges to validate the above items; and 5. algorithms used to perform processing of science data, to serve as a basis for the WFIRST pipeline. To achieve these objectives, the plan is to set up a Supernova Project Office, seven Supernova Working Groups, and two Supernova Software Deliverables Teams. During the recent years of work with the Science Definition Team, it has been clear that the WFIRST Project Office requires a continuous series of scientific answers to the stream of design and requirements questions that arise in the development of the mission. One of the highest priorities of the Supernova Project Office will be to coordinate with the WFIRST Project Office and serve as the one-stop source of answers to such questions. The second topic of this proposal is Weak Lensing (WL). The intrinsic broad wavelength coverage and excellent flux calibration of the IFU spectra will provide an important training set, beyond what is possible from the ground, for the photometric redshift measurements required for the WL survey. At this time the IFU design details are not fully developed, and our studies will ensure that the WL photo-z requirements are folded into a realistic final IFU design.

  19. Initial Data Analysis Results for ATD-2 ISAS HITL Simulation

    NASA Technical Reports Server (NTRS)

    Lee, Hanbong

    2017-01-01

    To evaluate the operational procedures and information requirements for the core functional capabilities of the ATD-2 project, such as the tactical surface metering tool, the APREQ-CFR procedure, and data element exchanges between ramp and tower, human-in-the-loop (HITL) simulations were performed in March 2017. This presentation shows the initial data analysis results from the HITL simulations. Various airport performance metrics were analyzed and compared across the different runway configurations and metering values in the tactical surface scheduler. These metrics include gate holding time, taxi-out time, runway throughput, queue size and wait time in queue, and TMI flight compliance. In addition to the metering value, other factors affecting airport performance in the HITL simulation, including run duration, runway changes, and TMI constraints, are also discussed.
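
    A hedged sketch of the metric comparison described above: group flight records by runway configuration and metering value, then aggregate performance metrics with pandas. The field names and all values below are invented stand-ins; the actual HITL logs are not part of this abstract.

```python
import numpy as np
import pandas as pd

# Simulated flight records from a HITL run (fields and values invented).
rng = np.random.default_rng(6)
n = 200
flights = pd.DataFrame({
    "runway_config": rng.choice(["north", "south"], n),
    "metering_value": rng.choice([2, 3, 4], n),   # assumed target queue sizes
    "gate_hold_min": rng.exponential(3.0, n),
    "taxi_out_min": rng.normal(18.0, 4.0, n),
})

# Compare performance metrics across configurations and metering values
metrics = (flights
           .groupby(["runway_config", "metering_value"])
           .agg(mean_gate_hold=("gate_hold_min", "mean"),
                mean_taxi_out=("taxi_out_min", "mean"),
                flights=("taxi_out_min", "size")))
print(metrics.round(2))
```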

  20. Statistical tools for analysis and modeling of cosmic populations and astronomical time series: CUDAHM and TSE

    NASA Astrophysics Data System (ADS)

    Loredo, Thomas; Budavari, Tamas; Scargle, Jeffrey D.

    2018-01-01

    This presentation provides an overview of open-source software packages addressing two challenging classes of astrostatistics problems. (1) CUDAHM is a C++ framework for hierarchical Bayesian modeling of cosmic populations, leveraging graphics processing units (GPUs) to enable applying this computationally challenging paradigm to large datasets. CUDAHM is motivated by measurement error problems in astronomy, where density estimation and linear and nonlinear regression must be addressed for populations of thousands to millions of objects whose features are measured with possibly complex uncertainties, potentially including selection effects. An example calculation demonstrates accurate GPU-accelerated luminosity function estimation for simulated populations of 10^6 objects in about two hours using a single NVIDIA Tesla K40c GPU. (2) Time Series Explorer (TSE) is a collection of software in Python and MATLAB for exploratory analysis and statistical modeling of astronomical time series. It comprises a library of stand-alone functions and classes, as well as an application environment for interactive exploration of time series data. The presentation summarizes key capabilities of this emerging project, including new algorithms for analysis of irregularly sampled time series.
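
    For the irregularly sampled time series mentioned above, the classic tool such packages build on is the Lomb-Scargle periodogram; below is a minimal sketch with scipy (TSE itself is not used here; the signal and sampling are simulated).

```python
import numpy as np
from scipy.signal import lombscargle

# Irregularly sampled synthetic signal with a single periodic component
rng = np.random.default_rng(7)
t = np.sort(rng.uniform(0.0, 100.0, 300))     # irregular sample times
true_freq = 0.17                               # cycles per unit time
y = np.sin(2.0 * np.pi * true_freq * t) + 0.3 * rng.normal(size=t.size)

# lombscargle expects angular frequencies; scan 0.01-0.5 cycles/unit
freqs = np.linspace(0.01, 0.5, 2000)
pgram = lombscargle(t, y - y.mean(), 2.0 * np.pi * freqs)
print(freqs[pgram.argmax()])                   # recovers ~0.17
```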
