Sample records for projection method applied

  1. An Aural Learning Project: Assimilating Jazz Education Methods for Traditional Applied Pedagogy

    ERIC Educational Resources Information Center

    Gamso, Nancy M.

    2011-01-01

    The Aural Learning Project (ALP) was developed to incorporate jazz method components into the author's classical practice and her applied woodwind lesson curriculum. The primary objective was to place a more focused pedagogical emphasis on listening and hearing than is traditionally used in the classical applied curriculum. The components of the…

  2. A Formula for Fixing Troubled Projects: The Scientific Method Meets Leadership

    NASA Technical Reports Server (NTRS)

    Wagner, Sandra

    2006-01-01

    This presentation focuses on project management, specifically addressing project issues using the scientific method of problem-solving. Two sample projects where this methodology has been applied are provided.

  3. Lessons learned applying CASE methods/tools to Ada software development projects

    NASA Technical Reports Server (NTRS)

    Blumberg, Maurice H.; Randall, Richard L.

    1993-01-01

    This paper describes the lessons learned from introducing CASE methods/tools into organizations and applying them to actual Ada software development projects. This paper will be useful to any organization planning to introduce a software engineering environment (SEE) or evolving an existing one. It contains management level lessons learned, as well as lessons learned in using specific SEE tools/methods. The experiences presented are from Alpha Test projects established under the STARS (Software Technology for Adaptable and Reliable Systems) project. They reflect the front end efforts by those projects to understand the tools/methods, initial experiences in their introduction and use, and later experiences in the use of specific tools/methods and the introduction of new ones.

  4. Space-time least-squares Petrov-Galerkin projection in nonlinear model reduction.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choi, Youngsoo; Carlberg, Kevin Thomas

    Our work proposes a space-time least-squares Petrov-Galerkin (ST-LSPG) projection method for model reduction of nonlinear dynamical systems. In contrast to typical nonlinear model-reduction methods that first apply Petrov-Galerkin projection in the spatial dimension and subsequently apply time integration to numerically resolve the resulting low-dimensional dynamical system, the proposed method applies projection in space and time simultaneously. To accomplish this, the method first introduces a low-dimensional space-time trial subspace, which can be obtained by computing tensor decompositions of state-snapshot data. The method then computes discrete-optimal approximations in this space-time trial subspace by minimizing the residual arising after time discretization over all space and time in a weighted ℓ2-norm. This norm can be defined to enable complexity reduction (i.e., hyper-reduction) in time, which leads to space-time collocation and space-time GNAT variants of the ST-LSPG method. Advantages of the approach relative to typical spatial-projection-based nonlinear model reduction methods such as Galerkin projection and least-squares Petrov-Galerkin projection include: (1) a reduction of both the spatial and temporal dimensions of the dynamical system, (2) the removal of spurious temporal modes (e.g., unstable growth) from the state space, and (3) error bounds that exhibit slower growth in time. Numerical examples performed on model problems in fluid dynamics demonstrate the ability of the method to generate orders-of-magnitude computational savings relative to spatial-projection-based reduced-order models without sacrificing accuracy.
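
    The core LSPG idea, minimizing the full-order residual over a low-dimensional trial subspace, can be sketched in a few lines. The following is a minimal illustration only (spatial LSPG on a synthetic linear residual, with a POD basis built from snapshots); the paper's ST-LSPG additionally constructs a space-time basis from tensor decompositions and supports hyper-reduction, none of which is shown here.

    ```python
    # Minimal sketch (not the authors' ST-LSPG code): least-squares Petrov-Galerkin
    # projection of a linear residual onto a POD trial basis built from snapshots.
    import numpy as np

    rng = np.random.default_rng(0)

    # Full-order model: A x = b (stand-in for a time-discretized residual).
    n = 200
    A = np.diag(np.linspace(1.0, 10.0, n)) + 0.01 * rng.standard_normal((n, n))
    b = rng.standard_normal(n)

    # Snapshot matrix (columns = previously computed states); POD basis via SVD.
    snapshots = np.linalg.solve(A, rng.standard_normal((n, 20)))
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    Phi = U[:, :5]                      # low-dimensional trial subspace

    # LSPG step: minimize the full-order residual over the trial subspace,
    # i.e. q* = argmin_q || A Phi q - b ||_2, then lift back to full space.
    q, *_ = np.linalg.lstsq(A @ Phi, b, rcond=None)
    x_rom = Phi @ q

    x_full = np.linalg.solve(A, b)
    print("relative ROM error:", np.linalg.norm(x_rom - x_full) / np.linalg.norm(x_full))
    ```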

  5. Planning "and" Sprinting: Use of a Hybrid Project Management Methodology within a CIS Capstone Course

    ERIC Educational Resources Information Center

    Baird, Aaron; Riggins, Frederick J.

    2012-01-01

    An increasing number of information systems projects in industry are managed using hybrid project management methodologies, but this shift in project management methods is not fully represented in our CIS curriculums. CIS capstone courses often include an applied project that is managed with traditional project management methods (plan first,…

  6. Using the Project Method in Distributive Education. Teacher's Manual.

    ERIC Educational Resources Information Center

    Maletta, Edwin

    The document explains how to integrate the project training methods into a distributive education curriculum for grades 10 or 11. The purpose of this teacher's manual is to give an overall picture of the project method in use. Ten sample projects are included which could apply to any distributive education student concentrating on the major areas…

  7. Electronic-projecting Moire method applying CBR-technology

    NASA Astrophysics Data System (ADS)

    Kuzyakov, O. N.; Lapteva, U. V.; Andreeva, M. A.

    2018-01-01

    An electronic-projecting method based on the Moire effect for examining surface topology is suggested. The conditions for forming Moire fringes and the dependence of their parameters on the reference parameters of the object and virtual grids are analyzed. The control system structure and the decision-making subsystem are elaborated. The subsystem is implemented using CBR technology, based on applying a case base. The approach of analysing and forming a decision for each separate local area, with subsequent formation of a common topology map, is applied.

  8. Proposal and Evaluation of Management Method for College Mechatronics Education Applying the Project Management

    NASA Astrophysics Data System (ADS)

    Ando, Yoshinobu; Eguchi, Yuya; Mizukawa, Makoto

    In this research, we proposed and evaluated a management method for college mechatronics education by applying project management. We applied our management method to the seminar “Microcomputer Seminar” for third-year students in the Department of Electrical Engineering, Shibaura Institute of Technology, and succeeded in managing the seminar in 2006. Our management method received a good evaluation in a questionnaire survey.

  9. Critical path method applied to research project planning: Fire Economics Evaluation System (FEES)

    Treesearch

    Earl B. Anderson; R. Stanton Hales

    1986-01-01

    The critical path method (CPM) of network analysis (a) depicts precedence among the many activities in a project by a network diagram; (b) identifies critical activities by calculating their starting, finishing, and float times; and (c) displays possible schedules by constructing time charts. CPM was applied to the development of the Forest Service's Fire...
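
    The forward/backward passes of CPM described in the abstract are straightforward to reproduce. Below is a minimal sketch on a small hypothetical activity network (the activities and durations are invented, not taken from the FEES project).

    ```python
    # Minimal critical path method sketch on a hypothetical activity network.
    durations = {"A": 3, "B": 5, "C": 2, "D": 4, "E": 6}
    predecessors = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"], "E": ["D"]}

    # Forward pass: earliest start / finish (activities listed in precedence order).
    es, ef = {}, {}
    for act in durations:
        es[act] = max((ef[p] for p in predecessors[act]), default=0)
        ef[act] = es[act] + durations[act]

    project_end = max(ef.values())

    # Backward pass: latest finish / start, then float.
    successors = {a: [s for s, preds in predecessors.items() if a in preds] for a in durations}
    lf, ls = {}, {}
    for act in reversed(list(durations)):
        lf[act] = min((ls[s] for s in successors[act]), default=project_end)
        ls[act] = lf[act] - durations[act]

    for act in durations:
        slack = ls[act] - es[act]
        print(act, "ES", es[act], "EF", ef[act], "float", slack, "critical" if slack == 0 else "")
    ```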

  10. A Method of Measuring the Costs and Benefits of Applied Research.

    ERIC Educational Resources Information Center

    Sprague, John W.

    The Bureau of Mines studied the application of the concepts and methods of cost-benefit analysis to the problem of ranking alternative applied research projects. Procedures for measuring the different classes of project costs and benefits, both private and public, are outlined, and cost-benefit calculations are presented, based on the criteria of…

  11. Virtual fringe projection system with nonparallel illumination based on iteration

    NASA Astrophysics Data System (ADS)

    Zhou, Duo; Wang, Zhangying; Gao, Nan; Zhang, Zonghua; Jiang, Xiangqian

    2017-06-01

    Fringe projection profilometry has been widely applied in many fields. To set up an ideal measuring system, a virtual fringe projection technique has been studied to assist in the design of hardware configurations. However, existing virtual fringe projection systems use parallel illumination and have a fixed optical framework. This paper presents a virtual fringe projection system with nonparallel illumination. Using an iterative method to calculate intersection points between rays and reference planes or object surfaces, the proposed system can simulate projected fringe patterns and captured images. A new explicit calibration method has been presented to validate the precision of the system. Simulated results indicate that the proposed iterative method outperforms previous systems. Our virtual system can be applied to error analysis, algorithm optimization, and help operators to find ideal system parameter settings for actual measurements.

  12. Nonrigid registration of 3D longitudinal optical coherence tomography volumes with choroidal neovascularization

    NASA Astrophysics Data System (ADS)

    Wei, Qiangding; Shi, Fei; Zhu, Weifang; Xiang, Dehui; Chen, Haoyu; Chen, Xinjian

    2017-02-01

    In this paper, we propose a 3D registration method for retinal optical coherence tomography (OCT) volumes. The proposed method consists of five main steps: First, a projection image of the 3D OCT scan is created. Second, the vessel enhancement filter is applied on the projection image to detect vessel shadow. Third, landmark points are extracted based on both vessel positions and layer information. Fourth, the coherent point drift method is used to align retinal OCT volumes. Finally, a nonrigid B-spline-based registration method is applied to find the optimal transform to match the data. We applied this registration method on 15 3D OCT scans of patients with Choroidal Neovascularization (CNV). The Dice coefficients (DSC) between layers are greatly improved after applying the nonrigid registration.

  13. Reflections on Mixing Methods in Applied Linguistics Research

    ERIC Educational Resources Information Center

    Hashemi, Mohammad R.

    2012-01-01

    This commentary advocates the use of mixed methods research--that is the integration of qualitative and quantitative methods in a single study--in applied linguistics. Based on preliminary findings from a research project in progress, some reflections on the current practice of mixing methods as a new trend in applied linguistics are put forward.…

  14. 78 FR 13402 - Proposed Collection; Comment Request for Regulation Project

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-27

    ... the election not to apply look-back method in de minimis cases. DATES: Written comments should be... . SUPPLEMENTARY INFORMATION: Title: Election Not to Apply Look-Back Method in De Minimis Cases. OMB Number: 1545...), a taxpayer may elect not to apply the look-back method to long-term contracts in de minimis cases...

  15. ESP 2.0: Improved method for projecting U.S. GHG and air pollution emissions through 2055

    EPA Science Inventory

    The Emission Scenario Projection (ESP) method is used to develop multi-decadal projections of U.S. Greenhouse Gas (GHG) and criteria pollutant emissions. The resulting future-year emissions can then be translated into an emissions inventory and applied in climate and air quality mod...

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, H; Kong, V; Jin, J

    Purpose: A synchronized moving grid (SMOG) has been proposed to reduce scatter and lag artifacts in cone beam computed tomography (CBCT). However, information is missing in each projection because certain areas are blocked by the grid. A previous solution to this issue is acquiring 2 complementary projections at each position, which increases scanning time. This study reports our first result using an inter-projection sensor fusion (IPSF) method to estimate missing projection data in our prototype SMOG-based CBCT system. Methods: An in-house SMOG assembly with a 1:1 grid of 3 mm gap has been installed in a CBCT benchtop. The grid moves back and forth in a 3-mm amplitude and up to 20-Hz frequency. A control program in LabView synchronizes the grid motion with the platform rotation and x-ray firing so that the grid patterns for any two neighboring projections are complementary. A Catphan was scanned with 360 projections. After scatter correction, the IPSF algorithm was applied to estimate the missing signal for each projection using the information from the 2 neighboring projections. The Feldkamp-Davis-Kress (FDK) algorithm was applied to reconstruct CBCT images. The CBCTs were compared to those reconstructed using normal projections without applying the SMOG system. Results: The SMOG-IPSF method may reduce image dose by half due to the radiation blocked by the grid. The method almost completely removed scatter-related artifacts, such as the cupping artifacts. The evaluation of line pair patterns in the CatPhan suggested that the spatial resolution degradation was minimal. Conclusion: The SMOG-IPSF is promising in reducing scatter artifacts and improving image quality while reducing radiation dose.
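
    The idea of estimating grid-blocked pixels from the two complementary neighboring projections can be illustrated with a toy example. This sketch simply averages the neighbors where the middle projection is blocked; the abstract does not spell out the actual IPSF filling rule, so the averaging used here is an assumption, and the data are synthetic.

    ```python
    # Toy illustration (not the authors' IPSF algorithm): fill grid-blocked pixels of a
    # projection by averaging the two neighbouring projections, whose grid pattern is
    # complementary, so every blocked pixel is measured in both neighbours.
    import numpy as np

    h, w = 64, 64
    yy, xx = np.mgrid[0:h, 0:w]

    def projection(i):
        # smooth synthetic projections that change slowly with projection index
        return np.sin(0.1 * (xx + 2 * i)) + np.cos(0.15 * yy)

    truth = [projection(i) for i in range(3)]

    def grid_mask(i):
        # 1:1 moving grid: even projections block even rows, odd projections block odd rows
        mask = np.ones((h, w), dtype=bool)
        mask[(i % 2)::2, :] = False      # False = blocked by the grid
        return mask

    measured = [np.where(grid_mask(i), p, np.nan) for i, p in enumerate(truth)]

    # Fill the blocked pixels of the middle projection from its two complementary neighbours.
    est = measured[1].copy()
    blocked = np.isnan(est)
    neighbour_avg = 0.5 * (measured[0] + measured[2])
    est[blocked] = neighbour_avg[blocked]

    print("mean abs error on filled pixels:", np.abs(est - truth[1])[blocked].mean())
    ```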

  17. University of Tennessee Center for Space Transportation and Applied Research (CSTAR)

    NASA Astrophysics Data System (ADS)

    1995-10-01

    The Center for Space Transportation and Applied Research had projects with space applications in six major areas: laser materials processing, artificial intelligence/expert systems, space transportation, computational methods, chemical propulsion, and electric propulsion. The closeout status of all these projects is addressed.

  18. University of Tennessee Center for Space Transportation and Applied Research (CSTAR)

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Center for Space Transportation and Applied Research had projects with space applications in six major areas: laser materials processing, artificial intelligence/expert systems, space transportation, computational methods, chemical propulsion, and electric propulsion. The closeout status of all these projects is addressed.

  19. Spotting the difference in molecular dynamics simulations of biomolecules

    NASA Astrophysics Data System (ADS)

    Sakuraba, Shun; Kono, Hidetoshi

    2016-08-01

    Comparing two trajectories from molecular simulations conducted under different conditions is not a trivial task. In this study, we apply a method called Linear Discriminant Analysis with ITERative procedure (LDA-ITER) to compare two molecular simulation results by finding the appropriate projection vectors. Because LDA-ITER attempts to determine a projection such that the projections of the two trajectories do not overlap, the comparison does not suffer from a strong anisotropy, which is an issue in protein dynamics. LDA-ITER is applied to two test cases: the T4 lysozyme protein simulation with or without a point mutation and the allosteric protein PDZ2 domain of hPTP1E with or without a ligand. The projection determined by the method agrees with the experimental data and previous simulations. The proposed procedure, which complements existing methods, is a versatile analytical method that is specialized to find the "difference" between two trajectories.
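
    A plain linear discriminant analysis already captures the basic idea of projecting two sets of simulation frames onto a direction that separates them; the iterative LDA-ITER refinement is not reproduced here. A minimal sketch on invented "trajectory" feature vectors using scikit-learn follows.

    ```python
    # Sketch of the underlying idea (plain LDA, not the authors' LDA-ITER variant):
    # find a projection vector that separates frames of two simulated "trajectories".
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)
    n_frames, n_features = 500, 30          # e.g. frames x collective coordinates

    traj_a = rng.normal(0.0, 1.0, size=(n_frames, n_features))
    traj_b = rng.normal(0.0, 1.0, size=(n_frames, n_features))
    traj_b[:, 3] += 1.5                     # condition B shifts one coordinate (e.g. a mutation)

    X = np.vstack([traj_a, traj_b])
    y = np.array([0] * n_frames + [1] * n_frames)

    lda = LinearDiscriminantAnalysis(n_components=1).fit(X, y)
    projection = lda.scalings_[:, 0]        # discriminating projection vector
    print("coordinate contributing most to the difference:", int(np.argmax(np.abs(projection))))
    ```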

  20. Lessons Learned from Client Projects in an Undergraduate Project Management Course

    ERIC Educational Resources Information Center

    Pollard, Carol E.

    2012-01-01

    This work proposes that a subtle combination of three learning methods offering "just in time" project management knowledge, coupled with hands-on project management experience can be particularly effective in producing project management students with employable skills. Students were required to apply formal project management knowledge to gain…

  1. 26 CFR 1.467-9 - Effective dates and automatic method changes for certain agreements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... applies only to rental agreements described in § 1.467-8. (c) Application of regulation project IA-292-84... before May 18, 1999, a taxpayer may choose to apply the provisions of regulation project IA-292-84 (1996... comply with the provisions of §§ 1.467-1 through 1.467-7. (2) Application of regulation project IA-292-84...

  2. Developing integrated methods to address complex resource and environmental issues

    USGS Publications Warehouse

    Smith, Kathleen S.; Phillips, Jeffrey D.; McCafferty, Anne E.; Clark, Roger N.

    2016-02-08

    Introduction: This circular provides an overview of selected activities that were conducted within the U.S. Geological Survey (USGS) Integrated Methods Development Project, an interdisciplinary project designed to develop new tools and conduct innovative research requiring integration of geologic, geophysical, geochemical, and remote-sensing expertise. The project was supported by the USGS Mineral Resources Program, and its products and acquired capabilities have broad applications to missions throughout the USGS and beyond. In addressing challenges associated with understanding the location, quantity, and quality of mineral resources, and in investigating the potential environmental consequences of resource development, a number of field and laboratory capabilities and interpretative methodologies evolved from the project that have applications to traditional resource studies as well as to studies related to ecosystem health, human health, disaster and hazard assessment, and planetary science. New or improved tools and research findings developed within the project have been applied to other projects and activities. Specifically, geophysical equipment and techniques have been applied to a variety of traditional and nontraditional mineral- and energy-resource studies, military applications, environmental investigations, and applied research activities that involve climate change, mapping techniques, and monitoring capabilities. Diverse applied geochemistry activities provide a process-level understanding of the mobility, chemical speciation, and bioavailability of elements, particularly metals and metalloids, in a variety of environmental settings. Imaging spectroscopy capabilities maintained and developed within the project have been applied to traditional resource studies as well as to studies related to ecosystem health, human health, disaster assessment, and planetary science. Brief descriptions of capabilities and laboratory facilities and summaries of some applications of project products and research findings are included in this circular. The work helped support the USGS mission to “provide reliable scientific information to describe and understand the Earth; minimize loss of life and property from natural disasters; manage water, biological, energy, and mineral resources; and enhance and protect our quality of life.” Activities within the project include the following: spanned scales from microscopic to planetary; demonstrated broad applications across disciplines; included life-cycle studies of mineral resources; incorporated specialized areas of expertise in applied geochemistry, including mineralogy, hydrogeology, analytical chemistry, aqueous geochemistry, biogeochemistry, microbiology, aquatic toxicology, and public health; and incorporated specialized areas of expertise in geophysics, including magnetics, gravity, radiometrics, electromagnetics, seismic, ground-penetrating radar, borehole radar, and imaging spectroscopy. This circular consists of eight sections that contain summaries of various activities under the project.
    The eight sections are listed below: Laboratory Facilities and Capabilities, which includes brief descriptions of the various types of laboratories and capabilities used for the project; Method and Software Development, which includes summaries of remote-sensing, geophysical, and mineralogical methods developed or enhanced by the project; Instrument Development, which includes descriptions of geophysical instruments developed under the project; Minerals, Energy, and Climate, which includes summaries of research that applies to mineral or energy resources, environmental processes and monitoring, and carbon sequestration by earth materials; Element Cycling, Toxicity, and Health, which includes summaries of several process-oriented geochemical and biogeochemical studies and health-related research activities; Hydrogeology and Water Quality, which includes descriptions of innovative geophysical, remote-sensing, and geochemical research pertaining to hydrogeology and water-quality applications; Hazards and Disaster Assessment, which includes summaries of research and method development that were applied to natural hazards, human-caused hazards, and disaster assessments; and Databases and Framework Studies, which includes descriptions of fundamental applications of geophysical studies and of the importance of archived data.

  3. Introducing quality improvement methods into local public health departments: structured evaluation of a statewide pilot project.

    PubMed

    Riley, William; Parsons, Helen; McCoy, Kim; Burns, Debra; Anderson, Donna; Lee, Suhna; Sainfort, François

    2009-10-01

    To test the feasibility and assess the preliminary impact of a unique statewide quality improvement (QI) training program designed for public health departments. One hundred and ninety-five public health employees/managers from 38 local health departments throughout Minnesota were selected to participate in a newly developed QI training program and 65 of those engaged in and completed eight expert-supported QI projects over a period of 10 months from June 2007 through March 2008. As part of the Minnesota Quality Improvement Initiative, a structured distance education QI training program was designed and deployed in a first large-scale pilot. To evaluate the preliminary impact of the program, a mixed-method evaluation design was used based on four dimensions: learner reaction, knowledge, intention to apply, and preliminary outcomes. Subjective ratings of three dimensions of training quality were collected from participants after each of the scheduled learning sessions. Pre- and post-QI project surveys were administered to collect participant reactions, knowledge, future intention to apply learning, and perceived outcomes. Monthly and final QI project reports were collected to further inform success and preliminary outcomes of the projects. The participants reported (1) high levels of satisfaction with the training sessions, (2) increased perception of the relevance of the QI techniques, (3) increased perceived knowledge of all specific QI methods and techniques, (4) increased confidence in applying QI techniques on future projects, (5) increased intention to apply techniques on future QI projects, and (6) high perceived success of, and satisfaction with, the projects. Finally, preliminary outcomes data show moderate to large improvements in quality and/or efficiency for six out of eight projects. QI methods and techniques can be successfully implemented in local public health agencies on a statewide basis using the collaborative model through distance training and expert facilitation. This unique training can improve both core and support processes and lead to favorable staff reactions, increased knowledge, and improved health outcomes. The program can be further improved and deployed and holds great promise to facilitate the successful dissemination of proven QI methods throughout local public health departments.

  4. 26 CFR 1.174-3 - Treatment as expenses.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...) of this section. If adopted, the method shall apply to all research and experimental expenditures... method is requested, and a description of the project or projects with respect to which research or... change to a different method of treating research or experimental expenditures shall be in writing and...

  5. Discriminative Projection Selection Based Face Image Hashing

    NASA Astrophysics Data System (ADS)

    Karabat, Cagatay; Erdogan, Hakan

    Face image hashing is an emerging method used in biometric verification systems. In this paper, we propose a novel face image hashing method based on a new technique called discriminative projection selection. We apply the Fisher criterion for selecting the rows of a random projection matrix in a user-dependent fashion. Moreover, another contribution of this paper is to employ a bimodal Gaussian mixture model at the quantization step. Our simulation results on three different databases demonstrate that the proposed method has superior performance in comparison to previously proposed random projection based methods.
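
    A hedged sketch of the selection step: score each row of a random projection matrix with a Fisher-like ratio computed from genuine and impostor samples, keep the top rows, and binarize. The bimodal Gaussian-mixture quantizer of the paper is replaced here by simple thresholding, and all data are synthetic.

    ```python
    # Hedged sketch of discriminative projection selection: score each row of a random
    # projection matrix with a Fisher-like ratio (between-class over within-class scatter
    # of the projected features) and keep the most discriminative rows.
    import numpy as np

    rng = np.random.default_rng(42)
    d, n_rows, k = 256, 100, 32              # feature length, candidate rows, rows kept

    user = rng.normal(0.0, 1.0, size=d)      # enrolled user's mean face feature (synthetic)
    genuine  = user + 0.3 * rng.standard_normal((50, d))
    impostor = rng.standard_normal((50, d))

    R = rng.standard_normal((n_rows, d))     # candidate random projection matrix

    pg, pi = genuine @ R.T, impostor @ R.T   # projected features: samples x rows
    between = (pg.mean(0) - pi.mean(0)) ** 2
    within = pg.var(0) + pi.var(0)
    fisher = between / within

    rows = np.argsort(fisher)[-k:]           # user-dependent selection of projection rows

    # Simplified quantization: threshold the selected projections at their median.
    projected = user @ R[rows].T
    hash_bits = (projected > np.median(projected)).astype(int)
    print("hash:", hash_bits)
    ```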

  6. Parallel High Order Accuracy Methods Applied to Non-Linear Hyperbolic Equations and to Problems in Materials Sciences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jan Hesthaven

    2012-02-06

    Final report for DOE Contract DE-FG02-98ER25346 entitled Parallel High Order Accuracy Methods Applied to Non-Linear Hyperbolic Equations and to Problems in Materials Sciences. Principal Investigator Jan S. Hesthaven Division of Applied Mathematics Brown University, Box F Providence, RI 02912 Jan.Hesthaven@Brown.edu February 6, 2012 Note: This grant was originally awarded to Professor David Gottlieb and the majority of the work envisioned reflects his original ideas. However, when Prof Gottlieb passed away in December 2008, Professor Hesthaven took over as PI to ensure proper mentoring of students and postdoctoral researchers already involved in the project. This unusual circumstance has naturally impacted themore » project and its timeline. However, as the report reflects, the planned work has been accomplished and some activities beyond the original scope have been pursued with success. Project overview and main results The effort in this project focuses on the development of high order accurate computational methods for the solution of hyperbolic equations with application to problems with strong shocks. While the methods are general, emphasis is on applications to gas dynamics with strong shocks.« less

  7. Implicit and Explicit: An Experiment in Applied Psycholinguistics, Assessing Different Methods of Teaching Grammatical Structures in English as a Foreign Language.

    ERIC Educational Resources Information Center

    Olsson, Margareta

    Project 3 of the GUME research project on foreign language teaching methods, in line with Projects 1 and 2, questions whether the best effect in language teaching is achieved solely by intensive drilling of the structure in question (the implicit method) or if grammatical explanations further the assimilation of the patterns so that, within the…

  8. 26 CFR 1.175-6 - Adoption or change of method.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... project or farm as to which the method or change of method is to apply; (4) Set forth the amount of all... farm. The authorization with respect to the special project or single farm will not affect the method... 26 Internal Revenue 3 2010-04-01 2010-04-01 false Adoption or change of method. 1.175-6 Section 1...

  9. Water supply management using an extended group fuzzy decision-making method: a case study in north-eastern Iran

    NASA Astrophysics Data System (ADS)

    Minatour, Yasser; Bonakdari, Hossein; Zarghami, Mahdi; Bakhshi, Maryam Ali

    2015-09-01

    The purpose of this study was to develop a group fuzzy multi-criteria decision-making method to be applied in rating problems associated with water resources management. Here, Chen's group fuzzy TOPSIS method is extended by a difference technique to handle the uncertainties of group decision making, and the extended group fuzzy TOPSIS method is then combined with a consistency check. In the presented method, linguistic judgments are first screened through a consistency-checking process, and afterward these judgments are used in the extended Chen's fuzzy TOPSIS method. Each expert's opinion is converted to crisp numbers and then, to capture uncertainty, the group's opinions are converted to fuzzy numbers using three mathematical operators. The proposed method is applied to select the optimal strategy for the rural water supply of Nohoor village in north-eastern Iran, as a case study and illustrative example. Sensitivity analyses of the results and comparison of the results with project reality showed that the proposed method offers good results for water resources projects.
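
    The ranking core of TOPSIS (distance to the positive and negative ideal solutions) can be shown compactly. The sketch below uses classical crisp TOPSIS on invented scores; the paper's fuzzy group aggregation, difference technique, and consistency check are omitted.

    ```python
    # Illustrative crisp TOPSIS ranking (the paper extends Chen's *fuzzy* group TOPSIS;
    # the fuzzy aggregation is omitted in this sketch).
    import numpy as np

    # Rows = water-supply strategies, columns = criteria (all treated as benefit criteria).
    scores = np.array([[7.0, 5.0, 8.0],
                       [6.0, 9.0, 4.0],
                       [8.0, 6.0, 6.0]])
    weights = np.array([0.5, 0.3, 0.2])

    norm = scores / np.linalg.norm(scores, axis=0)        # vector normalization
    v = norm * weights                                    # weighted normalized matrix

    ideal, anti = v.max(axis=0), v.min(axis=0)            # positive / negative ideal solutions
    d_plus = np.linalg.norm(v - ideal, axis=1)
    d_minus = np.linalg.norm(v - anti, axis=1)
    closeness = d_minus / (d_plus + d_minus)              # relative closeness to the ideal

    ranking = np.argsort(closeness)[::-1]
    print("strategies ranked best to worst:", ranking, "closeness:", closeness.round(3))
    ```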

  10. Projection-slice theorem based 2D-3D registration

    NASA Astrophysics Data System (ADS)

    van der Bom, M. J.; Pluim, J. P. W.; Homan, R.; Timmer, J.; Bartels, L. W.

    2007-03-01

    In X-ray guided procedures, the surgeon or interventionalist is dependent on his or her knowledge of the patient's specific anatomy and the projection images acquired during the procedure by a rotational X-ray source. Unfortunately, these X-ray projections fail to give information on the patient's anatomy in the dimension along the projection axis. It would be very profitable to provide the surgeon or interventionalist with a 3D insight of the patient's anatomy that is directly linked to the X-ray images acquired during the procedure. In this paper we present a new robust 2D-3D registration method based on the Projection-Slice Theorem. This theorem gives us a relation between the pre-operative 3D data set and the interventional projection images. Registration is performed by minimizing a translation invariant similarity measure that is applied to the Fourier transforms of the images. The method was tested by performing multiple exhaustive searches on phantom data of the Circle of Willis and on a post-mortem human skull. Validation was performed visually by comparing the test projections to the ones that corresponded to the minimal value of the similarity measure. The Projection-Slice Theorem Based method was shown to be very effective and robust, and provides capture ranges up to 62 degrees. Experiments have shown that the method is capable of retrieving similar results when translations are applied to the projection images.
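
    The projection-slice theorem that this registration relies on is easy to verify numerically: the 1D Fourier transform of a parallel projection equals the central line of the image's 2D Fourier transform. A small self-contained check on random data follows.

    ```python
    # Numerical check of the projection-slice theorem that the registration builds on:
    # the 1D FFT of a projection (sum along one axis) equals the corresponding central
    # line of the 2D FFT of the image.
    import numpy as np

    rng = np.random.default_rng(0)
    img = rng.random((128, 128))

    projection = img.sum(axis=0)                   # parallel projection onto the x-axis
    slice_1d = np.fft.fft(projection)

    f2d = np.fft.fft2(img)
    central_line = f2d[0, :]                       # ky = 0 line of the 2D spectrum

    print("max difference:", np.max(np.abs(slice_1d - central_line)))  # ~1e-12
    ```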

  11. Applying the TOC Project Management to Operation and Maintenance Scheduling of a Research Vessel

    NASA Astrophysics Data System (ADS)

    Manti, M. Firdausi; Fujimoto, Hideo; Chen, Lian-Yi

    Marine research vessels and their systems are major assets in marine resources development. Since the running costs of such a ship are very high, it is necessary to reduce the total cost through efficient scheduling of operation and maintenance. To reduce the project period and make it efficient, we applied the TOC project management method, a project management approach developed by Dr. Eli Goldratt. It challenges traditional approaches to project management and will become the most important improvement in project management since the development of PERT and critical path methodologies. As a case study, we present a marine geology research project, representing operations, together with repairing-dock projects for the maintenance of vessels.

  12. The Benefits and Costs of National Service: Methods for Benefit Assessment with Application to Three AmeriCorps Programs.

    ERIC Educational Resources Information Center

    Neumann, George R.; And Others

    A study applied the principles of benefit-cost analysis to three prototype grants programs of AmeriCorps: AmeriCorps for Math and Literacy, Project First, and the East Bay Conservation Corps. It studied the methods these projects used and estimated the benefits using data from projects similar in approach and implementation. Benefits received by…

  13. Laser projection positioning of spatial contour curves via a galvanometric scanner

    NASA Astrophysics Data System (ADS)

    Tu, Junchao; Zhang, Liyan

    2018-04-01

    The technology of laser projection positioning is widely applied in advanced manufacturing fields (e.g. composite plying, parts location and installation). In order to use it better, a laser projection positioning (LPP) system is designed and implemented. Firstly, the LPP system is built from a laser galvanometric scanning (LGS) system and a binocular vision system, and the system model is constructed by applying a single-hidden-layer feed-forward neural network (SLFN). Secondly, the LGS system and the binocular system, which are mutually independent, are integrated through a data-driven calibration method based on the extreme learning machine (ELM) algorithm. Finally, a projection positioning method is proposed within the framework of the calibrated SLFN system model. A well-designed experiment is conducted to verify the viability and effectiveness of the proposed system. In addition, the accuracy of projection positioning is evaluated, showing that the LPP system achieves a good localization effect.

  14. An Approach to Teaching Applied GIS: Implementation for Local Organizations.

    ERIC Educational Resources Information Center

    Benhart, John, Jr.

    2000-01-01

    Describes the instructional method, Client-Life Cycle GIS Project Learning, used in a course at Indiana University of Pennsylvania that enables students to learn with and about geographic information system (GIS). Discusses the course technical issues in GIS and an example project using this method. (CMK)

  15. Use of the Hotelling observer to optimize image reconstruction in digital breast tomosynthesis

    PubMed Central

    Sánchez, Adrian A.; Sidky, Emil Y.; Pan, Xiaochuan

    2015-01-01

    Abstract. We propose an implementation of the Hotelling observer that can be applied to the optimization of linear image reconstruction algorithms in digital breast tomosynthesis. The method is based on considering information within a specific region of interest, and it is applied to the optimization of algorithms for detectability of microcalcifications. Several linear algorithms are considered: simple back-projection, filtered back-projection, back-projection filtration, and Λ-tomography. The optimized algorithms are then evaluated through the reconstruction of phantom data. The method appears robust across algorithms and parameters and leads to the generation of algorithm implementations which subjectively appear optimized for the task of interest. PMID:26702408

  16. Correction for specimen movement and rotation errors for in-vivo Optical Projection Tomography

    PubMed Central

    Birk, Udo Jochen; Rieckher, Matthias; Konstantinides, Nikos; Darrell, Alex; Sarasa-Renedo, Ana; Meyer, Heiko; Tavernarakis, Nektarios; Ripoll, Jorge

    2010-01-01

    The application of optical projection tomography to in-vivo experiments is limited by specimen movement during the acquisition. We present a set of mathematical correction methods applied to the acquired data stacks to correct for movement in both directions of the image plane. These methods have been applied to correct experimental data taken from in-vivo optical projection tomography experiments in Caenorhabditis elegans. Successful reconstructions for both fluorescence and white light (absorption) measurements are shown. Since no difference between movement of the animal and movement of the rotation axis is made, this approach at the same time removes artifacts due to mechanical drifts and errors in the assumed center of rotation. PMID:21258448

  17. Applied Linguistics Project: Student-Led Computer Assisted Research in High School EAL/EAP

    ERIC Educational Resources Information Center

    Bohát, Róbert; Rödlingová, Beata; Horáková, Nina

    2015-01-01

    The Applied Linguistics Project (ALP) started at the International School of Prague (ISP) in 2013. Every year, Grade 9 English as an Additional Language (EAL) students identify an area of learning in need of improvement and design a research method followed by data collection and analysis using basic computer software tools or online corpora.…

  18. Software Reliability 2002

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores R.

    2003-01-01

    In FY01 we learned that hardware reliability models need substantial changes to account for differences in software, thus making software reliability measurements more effective, accurate, and easier to apply. These reliability models are generally based on familiar distributions or parametric methods. An obvious question is "What new statistical and probability models can be developed using non-parametric and distribution-free methods instead of the traditional parametric method?" Two approaches to software reliability engineering appear somewhat promising. The first study, begun in FY01, is based in hardware reliability, a very well established science that has many aspects that can be applied to software. This research effort has investigated mathematical aspects of hardware reliability and has identified those applicable to software. Currently the research effort is applying and testing these approaches to software reliability measurement. These parametric models require much project data that may be difficult to apply and interpret. Projects at GSFC are often complex in both technology and schedules. Assessing and estimating reliability of the final system is extremely difficult when various subsystems are tested and completed long before others. Parametric and distribution-free techniques may offer a new and accurate way of modeling failure time and other project data to provide earlier and more accurate estimates of system reliability.

  19. 75 FR 5852 - Proposed Collection; Comment Request for Regulation Project

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-04

    ... Method in De Minimis Cases (Sec. 1.460-6). DATES: Written comments should be received on or before April... INFORMATION: Title: Election Not to Apply Look-Back Method in De Minimis Cases. OMB Number: 1545-1572... may elect not to apply the look-back method to long-term contracts in de minimis cases. The taxpayer...

  20. Connecting University and Student Teaching Experiences through the Jackdaw Kit Project

    ERIC Educational Resources Information Center

    Marshall, Jill

    2010-01-01

    The author shares the challenges faced in her social studies methods class due to the constraints of NCLB. By incorporating the jackdaw kit project within her social studies methods course, her candidates were able to connect what they were talking about in methods and apply it to their students teaching situations where there was little time for…

  1. Prioritizing sewer rehabilitation projects using AHP-PROMETHEE II ranking method.

    PubMed

    Kessili, Abdelhak; Benmamar, Saadia

    2016-01-01

    The aim of this paper is to develop a methodology for the prioritization of sewer rehabilitation projects for Algiers (Algeria) sewer networks to support the National Sanitation Office in its challenge to make decisions on prioritization of sewer rehabilitation projects. The methodology applies multiple-criteria decision making. The study includes 47 projects (collectors) and 12 criteria to evaluate them. These criteria represent the different issues considered in the prioritization of the projects, which are structural, hydraulic, environmental, financial, social and technical. The analytic hierarchy process (AHP) is used to determine weights of the criteria and the Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE II) method is used to obtain the final ranking of the projects. The model was verified using the sewer data of Algiers. The results have shown that the method can be used for prioritizing sewer rehabilitation projects.
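
    A compact, hedged sketch of the two stages on invented numbers: AHP criterion weights from a pairwise comparison matrix (geometric-mean approximation of the principal eigenvector), followed by PROMETHEE II net outranking flows with the simple "usual" preference function.

    ```python
    # Hedged sketch of the AHP + PROMETHEE II pipeline on made-up data.
    import numpy as np

    # AHP: pairwise comparison of 3 criteria (illustrative values).
    pairwise = np.array([[1.0, 3.0, 5.0],
                         [1/3, 1.0, 2.0],
                         [1/5, 1/2, 1.0]])
    gm = pairwise.prod(axis=1) ** (1 / pairwise.shape[1])   # geometric-mean method
    weights = gm / gm.sum()

    # PROMETHEE II: 4 projects evaluated on the 3 criteria (higher is better).
    evals = np.array([[6.0, 4.0, 7.0],
                      [8.0, 5.0, 3.0],
                      [5.0, 9.0, 6.0],
                      [7.0, 6.0, 5.0]])
    n = evals.shape[0]

    # "Usual" preference function: 1 if a beats b on a criterion, else 0.
    pref = (evals[:, None, :] > evals[None, :, :]).astype(float)
    pi = (pref * weights).sum(axis=2)              # aggregated preference index pi(a, b)

    phi_plus = pi.sum(axis=1) / (n - 1)            # leaving flow
    phi_minus = pi.sum(axis=0) / (n - 1)           # entering flow
    net_flow = phi_plus - phi_minus

    print("net flows:", net_flow.round(3), "ranking:", np.argsort(net_flow)[::-1])
    ```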

  2. How is the weather? Forecasting inpatient glycemic control

    PubMed Central

    Saulnier, George E; Castro, Janna C; Cook, Curtiss B; Thompson, Bithika M

    2017-01-01

    Aim: Apply methods of damped trend analysis to forecast inpatient glycemic control. Method: Observed and calculated point-of-care blood glucose data trends were determined over 62 weeks. Mean absolute percent error was used to calculate differences between observed and forecasted values. Comparisons were drawn between model results and linear regression forecasting. Results: The forecasted mean glucose trends observed during the first 24 and 48 weeks of projections compared favorably to the results provided by linear regression forecasting. However, in some scenarios, the damped trend method changed inferences compared with linear regression. In all scenarios, mean absolute percent error values remained below the 10% accepted by demand industries. Conclusion: Results indicate that forecasting methods historically applied within demand industries can project future inpatient glycemic control. Additional study is needed to determine if forecasting is useful in the analyses of other glucometric parameters and, if so, how to apply the techniques to quality improvement. PMID:29134125
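
    Damped-trend (Holt) exponential smoothing and the mean absolute percent error are standard and can be sketched directly; the smoothing parameters and the synthetic weekly glucose series below are illustrative, not the study's data.

    ```python
    # Hedged sketch of damped-trend (Holt) exponential smoothing and MAPE.
    import numpy as np

    def damped_trend_forecast(y, alpha=0.4, beta=0.2, phi=0.9, horizon=8):
        level, trend = y[0], y[1] - y[0]
        for obs in y[1:]:
            prev_level = level
            level = alpha * obs + (1 - alpha) * (prev_level + phi * trend)
            trend = beta * (level - prev_level) + (1 - beta) * phi * trend
        # h-step-ahead forecasts: level + (phi + phi^2 + ... + phi^h) * trend
        return np.array([level + trend * sum(phi ** k for k in range(1, h + 1))
                         for h in range(1, horizon + 1)])

    def mape(actual, forecast):
        return 100.0 * np.mean(np.abs((actual - forecast) / actual))

    # Synthetic 62-week mean glucose series with a slow downward trend.
    weekly_glucose = 180 - 0.5 * np.arange(62) + np.random.default_rng(3).normal(0, 3, 62)
    fit_window, holdout = weekly_glucose[:54], weekly_glucose[54:]
    forecast = damped_trend_forecast(fit_window, horizon=len(holdout))
    print("MAPE (%):", round(mape(holdout, forecast), 2))
    ```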

  3. An applied study using systems engineering methods to prioritize green systems options

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Sonya M; Macdonald, John M

    2009-01-01

    For many years, there have been questions about the effectiveness of applying different green solutions. If you're building a home and wish to use green technologies, where do you start? While all technologies sound promising, which will perform the best over time? All this has to be considered within the cost and schedule of the project. The amount of information available on the topic can be overwhelming. We seek to examine if Systems Engineering methods can be used to help people choose and prioritize technologies that fit within their project and budget. Several methods are used to gain perspective into how to select the green technologies, such as the Analytic Hierarchy Process (AHP) and Kepner-Tregoe. In our study, subjects applied these methods to analyze cost, schedule, and trade-offs. Results will document whether the experimental approach is applicable to defining system priorities for green technologies.

  4. Newly invented biobased materials from low-carbon, diverted waste fibers: research methods, testing, and full-scale application in a case study structure

    Treesearch

    Julee A Herdt; John Hunt; Kellen Schauermann

    2016-01-01

    This project demonstrates newly invented, biobased construction materials developed by applying low-carbon, biomass waste sources through the authors' engineered fiber processes and technology. If manufactured and applied at large scale, the project inventions can divert large volumes of cellulose waste into high-performance, low-embodied-energy, environmental construction...

  5. Projection-based estimation and nonuniformity correction of sensitivity profiles in phased-array surface coils.

    PubMed

    Yun, Sungdae; Kyriakos, Walid E; Chung, Jun-Young; Han, Yeji; Yoo, Seung-Schik; Park, Hyunwook

    2007-03-01

    To develop a novel approach for calculating the accurate sensitivity profiles of phased-array coils, resulting in correction of nonuniform intensity in parallel MRI. The proposed intensity-correction method estimates the accurate sensitivity profile of each channel of the phased-array coil. The sensitivity profile is estimated by fitting a nonlinear curve to every projection view through the imaged object. The nonlinear curve-fitting efficiently obtains the low-frequency sensitivity profile by eliminating the high-frequency image contents. Filtered back-projection (FBP) is then used to compute the estimates of the sensitivity profile of each channel. The method was applied to both phantom and brain images acquired from the phased-array coil. Intensity-corrected images from the proposed method had more uniform intensity than those obtained by the commonly used sum-of-squares (SOS) approach. With the use of the proposed correction method, the intensity variation was reduced to 6.1% from 13.1% of the SOS. When the proposed approach was applied to the computation of the sensitivity maps during sensitivity encoding (SENSE) reconstruction, it outperformed the SOS approach in terms of the reconstructed image uniformity. The proposed method is more effective at correcting the intensity nonuniformity of phased-array surface-coil images than the conventional SOS method. In addition, the method was shown to be resilient to noise and was successfully applied for image reconstruction in parallel imaging.

  6. The use of a projection method to simplify portal and hepatic vein segmentation in liver anatomy.

    PubMed

    Huang, Shaohui; Wang, Boliang; Cheng, Ming; Huang, Xiaoyang; Ju, Ying

    2008-12-01

    In living donor liver transplantation, the volume of the potential graft must be measured to ensure sufficient liver function after surgery. Couinaud divided the liver into 8 functionally independent segments. However, this method is not simple to perform in 3D space directly. Thus, we propose a rapid method to segment the liver based on the hepatic vessel tree. The most important step of this method is vascular projection. By carefully selecting a projection plane, a 3D point can be fixed in the projection plane. This greatly helps in rapid classification. This method was validated by applying it to a 3D liver depicted on CT images, and the result was in good agreement with Couinaud's classification.

  7. Image restoration by the method of convex projections: part 2 applications and numerical results.

    PubMed

    Sezan, M I; Stark, H

    1982-01-01

    The image restoration theory discussed in a previous paper by Youla and Webb [1] is applied to a simulated image and the results compared with the well-known method known as the Gerchberg-Papoulis algorithm. The results show that the method of image restoration by projection onto convex sets, by providing a convenient technique for utilizing a priori information, performs significantly better than the Gerchberg-Papoulis method.
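
    The mechanics of restoration by projections onto convex sets can be shown in one dimension: alternate between enforcing the measured low-frequency spectrum and the known support. This is a generic illustration of alternating projections (essentially the band-limited extrapolation setting), not the specific constraint sets used in the paper.

    ```python
    # Tiny 1D demonstration of restoration by alternating projections onto convex sets:
    # constraint 1 = the measured low-frequency spectrum, constraint 2 = known finite support.
    import numpy as np

    n = 256
    x_true = np.zeros(n)
    x_true[100:140] = 1.0                             # object with known support [100, 140)

    keep = np.zeros(n, dtype=bool)                    # only low frequencies were measured
    keep[:20] = True
    keep[-19:] = True
    measured_spectrum = np.fft.fft(x_true) * keep

    x = np.zeros(n)
    for _ in range(200):
        # Projection onto the set consistent with the measured spectrum.
        X = np.fft.fft(x)
        X[keep] = measured_spectrum[keep]
        x = np.fft.ifft(X).real
        # Projection onto the set of signals supported on [100, 140).
        x[:100] = 0.0
        x[140:] = 0.0

    print("relative error after POCS:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
    ```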

  8. Variance analysis of forecasted streamflow maxima in a wet temperate climate

    NASA Astrophysics Data System (ADS)

    Al Aamery, Nabil; Fox, James F.; Snyder, Mark; Chandramouli, Chandra V.

    2018-05-01

    Coupling global climate models, hydrologic models and extreme value analysis provides a method to forecast streamflow maxima; however, the elusive variance structure of the results hinders confidence in application. Directly correcting the bias of forecasts using the relative change between forecast and control simulations has been shown to marginalize hydrologic uncertainty, reduce model bias, and remove systematic variance when predicting mean monthly and mean annual streamflow, prompting our investigation of streamflow maxima. We assess the variance structure of streamflow maxima using realizations of emission scenario, global climate model type and project phase, downscaling methods, bias correction, extreme value methods, and hydrologic model inputs and parameterization. Results show that the relative change of streamflow maxima was not dependent on systematic variance from the annual maxima versus peak-over-threshold method applied, although we stress that researchers must strictly adhere to the rules from extreme value theory when applying the peak-over-threshold method. Regardless of which method is applied, extreme value model fitting does add variance to the projection, and the variance is an increasing function of the return period. Unlike the relative change of mean streamflow, results show that the variance of the maxima's relative change was dependent on all climate model factors tested as well as hydrologic model inputs and calibration. Ensemble projections forecast an increase of streamflow maxima for 2050 with pronounced forecast standard error, including an increase of +30(±21), +38(±34) and +51(±85)% for 2, 20 and 100 year streamflow events for the wet temperate region studied. The variance of maxima projections was dominated by climate model factors and extreme value analyses.
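
    The annual-maxima branch of such an analysis amounts to fitting a generalized extreme value distribution and reading off return levels. A hedged sketch with synthetic maxima and scipy follows; the numbers are illustrative only.

    ```python
    # Hedged sketch: fit a GEV distribution to synthetic annual streamflow maxima and read
    # off 2-, 20- and 100-year return levels (the annual-maxima branch of the analysis).
    import numpy as np
    from scipy import stats

    annual_maxima = stats.genextreme.rvs(c=-0.1, loc=300.0, scale=80.0, size=60,
                                         random_state=7)

    shape, loc, scale = stats.genextreme.fit(annual_maxima)
    for return_period in (2, 20, 100):
        # Return level = value exceeded with probability 1/return_period in any year.
        level = stats.genextreme.isf(1.0 / return_period, shape, loc=loc, scale=scale)
        print(f"{return_period:>3}-year return level: {level:.1f} m^3/s")
    ```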

  9. Effectiveness of project ACORDE materials: applied evaluative research in a preclinical technique course.

    PubMed

    Shugars, D A; Trent, P J; Heymann, H O

    1979-08-01

    Two instructional strategies, the traditional lecture method and a standardized self-instructional (ACORDE) format, were compared for efficiency and perceived usefulness in a preclinical restorative dentistry technique course through the use of a posttest-only control group research design. Control and experimental groups were compared on (a) technique grades, (b) didactic grades, (c) amount of time spent, (d) student and faculty perceptions, and (e) observation of social dynamics. The results of this study demonstrated the effectiveness of Project ACORDE materials in teaching dental students, provided an example of applied research designed to test contemplated instructional innovations prior to use and used a method which highlighted qualitative, as well as quantitative, techniques for data gathering in applied research.

  10. Wind Plant Performance Prediction (WP3) Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craig, Anna

    The methods for analysis of operational wind plant data are highly variable across the wind industry, leading to high uncertainties in the validation and bias-correction of preconstruction energy estimation methods. Lack of credibility in the preconstruction energy estimates leads to significant impacts on project financing and therefore the final levelized cost of energy for the plant. In this work, the variation in the evaluation of a wind plant's operational energy production as a result of variations in the processing methods applied to the operational data is examined. Preliminary results indicate that selection of the filters applied to the data and the filter parameters can have significant impacts on the final computed assessment metrics.

  11. Integration of optical measurement methods with flight parameter measurement systems

    NASA Astrophysics Data System (ADS)

    Kopecki, Grzegorz; Rzucidlo, Pawel

    2016-05-01

    During the AIM (advanced in-flight measurement techniques) and AIM2 projects, innovative modern techniques were developed. The purpose of the AIM project was to develop optical measurement techniques dedicated to flight tests. Such methods give information about aircraft element deformation, thermal loads, pressure distribution, etc. In AIM2 the development of optical methods for flight testing was continued. In particular, this project aimed at the development of methods that could be easily applied in flight tests in an industrial setting. Another equally important task was to guarantee the synchronization of the classical measuring system with cameras. The PW-6U glider used in flight tests was provided by the Rzeszów University of Technology. The glider had all the equipment necessary for testing the IPCT (image pattern correlation technique) and IRT (infrared thermometry) methods. Additionally, equipment adequate for the measurement, registration and analysis of typical flight parameters has been developed. This article describes the designed system and presents its application during flight tests. Additionally, the results obtained in flight tests show certain limitations of the IRT method as applied.

  12. A Primer on Bootstrap Factor Analysis as Applied to Health Studies Research

    ERIC Educational Resources Information Center

    Lu, Wenhua; Miao, Jingang; McKyer, E. Lisako J.

    2014-01-01

    Objectives: To demonstrate how the bootstrap method could be conducted in exploratory factor analysis (EFA) with a syntax written in SPSS. Methods: The data obtained from the Texas Childhood Obesity Prevention Policy Evaluation project (T-COPPE project) were used for illustration. A 5-step procedure to conduct bootstrap factor analysis (BFA) was…

  13. Applying the Kanban method in problem-based project work: a case study in a manufacturing engineering bachelor's programme at Aalborg University Copenhagen

    NASA Astrophysics Data System (ADS)

    Balve, Patrick; Krüger, Volker; Tolstrup Sørensen, Lene

    2017-11-01

    Problem-based learning (PBL) has proven to be highly effective for educating students in an active and self-motivated manner in various disciplines. Student projects carried out following PBL principles are very dynamic and carry a high level of uncertainty, both conditions under which agile project management approaches are assumed to be highly supportive. The paper describes an empirical case study carried out at Aalborg University Copenhagen involving students from two different semesters of a Bachelor of Science programme. While executing the study, compelling examples of how PBL and the agile project management method Kanban blend could be identified. A final survey reveals that applying Kanban produces noticeable improvements with respect to creating, assigning and coordinating project tasks. Other improvements were found in group communication, knowledge about the work progress with regards to both the individual and the collective and the students' way of continuously improving their own teamwork.

  14. Negotiating a Systems Development Method

    NASA Astrophysics Data System (ADS)

    Karlsson, Fredrik; Hedström, Karin

    Systems development methods (or methods) are often applied in a tailored version to fit the actual situation. Method tailoring is, in most of the existing literature, viewed as either (a) a highly rational process with the method engineer as the driver, where the project members are passive information providers, or (b) an unstructured process where the systems developer makes individual choices, a selection process without any driver. The purpose of this chapter is to illustrate that important design decisions during method tailoring are made by project members through negotiation. The study has been carried out using the perspective of actor-network theory. Our narratives depict method tailoring as more complex than (a) and (b): they show that the driver role rotates between the project members and that design decisions are based on influences from several project members. However, these design decisions are not consensus decisions.

  15. Project-Based Learning in Primary Schools: Effects on Pupils' Learning and Attitudes

    ERIC Educational Resources Information Center

    Kaldi, Stavroula; Filippatou, Diamanto; Govaris, Christos

    2011-01-01

    This study focuses upon the effectiveness of project-based learning on primary school pupils regarding their content knowledge and attitudes towards self-efficacy, task value, group work, teaching methods applied and peers from diverse ethnic backgrounds. A cross-curricular project was implemented within the curriculum area of environmental…

  16. Lessons from comparative effectiveness research methods development projects funded under the Recovery Act.

    PubMed

    Zurovac, Jelena; Esposito, Dominick

    2014-11-01

    The American Recovery and Reinvestment Act of 2009 (ARRA) directed nearly US$29.2 million to comparative effectiveness research (CER) methods development. To help inform future CER methods investments, we describe the ARRA CER methods projects, identify barriers to this research and discuss the alignment of topics with published methods development priorities. We used several existing resources and held discussions with ARRA CER methods investigators. Although funded projects explored many identified priority topics, investigators noted that much work remains. For example, given the considerable investments in CER data infrastructure, the methods development field can benefit from additional efforts to educate researchers about the availability of new data sources and about how best to apply methods to match their research questions and data.

  17. Project-Based Learning in Programmable Logic Controller

    NASA Astrophysics Data System (ADS)

    Seke, F. R.; Sumilat, J. M.; Kembuan, D. R. E.; Kewas, J. C.; Muchtar, H.; Ibrahim, N.

    2018-02-01

    Project-based learning is a learning method that uses project activities as the core of learning and requires student creativity in completing the project. The aim of this study is to investigate the influence of project-based learning methods on students with a high level of creativity in learning the Programmable Logic Controller (PLC). This study used an experimental method with an experimental class and a control class consisting of 24 students: 12 students of high creativity and 12 students of low creativity. The application of project-based learning methods to the PLC courses, combined with the level of student creativity, enables the students to be directly involved in the work of the PLC project, which gives them experience in utilizing PLCs for the benefit of the industry. Therefore, it is concluded that the project-based learning method is one of the superior learning methods to apply to highly creative students in PLC courses. This method can be used as an effort to improve student learning outcomes and student creativity, as well as to educate prospective teachers to become reliable educators in theory and practice, who will be tasked with creating qualified human resource candidates to meet future industry needs.

  18. A Case Study on Reducing Children's Screen Time: The Project of Screen Free Week

    ERIC Educational Resources Information Center

    Kara, Hatice Gözde Ertürk

    2018-01-01

    The current study aims to direct children to alternative activities over a one-week period by applying the Screen Free Week project with volunteer families. The ultimate aim of the study is to reduce children's screen time. The instrumental case study method, one of the qualitative research methods, was employed. Five children attending the…

  19. The successive projection algorithm as an initialization method for brain tumor segmentation using non-negative matrix factorization.

    PubMed

    Sauwen, Nicolas; Acou, Marjan; Bharath, Halandur N; Sima, Diana M; Veraart, Jelle; Maes, Frederik; Himmelreich, Uwe; Achten, Eric; Van Huffel, Sabine

    2017-01-01

    Non-negative matrix factorization (NMF) has become a widely used tool for additive parts-based analysis in a wide range of applications. As NMF is a non-convex problem, the quality of the solution will depend on the initialization of the factor matrices. In this study, the successive projection algorithm (SPA) is proposed as an initialization method for NMF. SPA builds on convex geometry and allocates endmembers based on successive orthogonal subspace projections of the input data. SPA is a fast and reproducible method, and it aligns well with the assumptions made in near-separable NMF analyses. SPA was applied to multi-parametric magnetic resonance imaging (MRI) datasets for brain tumor segmentation using different NMF algorithms. Comparison with common initialization methods shows that SPA achieves similar segmentation quality and it is competitive in terms of convergence rate. Whereas SPA was previously applied as a direct endmember extraction tool, we have shown improved segmentation results when using SPA as an initialization method, as it allows further enhancement of the sources during the NMF iterative procedure.
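
    To make the initialization step above concrete, the following is a minimal sketch (not the authors' code) of SPA-style column selection followed by a plain multiplicative-update NMF; the matrix sizes, rank and iteration counts are illustrative placeholders.

    ```python
    import numpy as np

    def spa_select(X, r):
        """Pick r columns of X by successive orthogonal projections (SPA-style).
        At each step the column with the largest residual norm is chosen and the
        data are projected onto the orthogonal complement of that column."""
        R = X.astype(float).copy()
        idx = []
        for _ in range(r):
            j = int(np.argmax(np.sum(R**2, axis=0)))   # most "extreme" remaining column
            idx.append(j)
            u = R[:, j] / np.linalg.norm(R[:, j])
            R = R - np.outer(u, u @ R)                  # project out the chosen direction
        return idx

    def nmf_with_spa_init(X, r, n_iter=200, eps=1e-9):
        """NMF by multiplicative updates, initialised with SPA-selected columns."""
        W = X[:, spa_select(X, r)].copy()               # endmember-like initial basis
        H = np.random.default_rng(0).random((r, X.shape[1]))
        for _ in range(n_iter):
            H *= (W.T @ X) / (W.T @ W @ H + eps)
            W *= (X @ H.T) / (W @ H @ H.T + eps)
        return W, H

    # toy usage: a small nonnegative data matrix standing in for stacked MRI features
    X = np.abs(np.random.default_rng(1).random((30, 100)))
    W, H = nmf_with_spa_init(X, r=3)
    print(np.linalg.norm(X - W @ H) / np.linalg.norm(X))
    ```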

  20. The topology of galaxy clustering.

    NASA Astrophysics Data System (ADS)

    Coles, P.; Plionis, M.

    The authors discuss an objective method for quantifying the topology of the galaxy distribution using only projected galaxy counts. The method is a useful complement to fully three-dimensional studies of topology based on the genus by virtue of the enormous projected data sets available. Applying the method to the Lick counts they find no evidence for large-scale non-gaussian behaviour, whereas the small-scale distribution is strongly non-gaussian, with a shift in the meatball direction.

  1. Research methods in nursing students' Bachelor's theses in Sweden: A descriptive study.

    PubMed

    Johansson, Linda; Silén, Marit

    2018-07-01

    During the nursing programme in Sweden, students complete an independent project that allows them to receive both a professional qualification as a nurse and a Bachelor's degree. This project gives students the opportunity to develop and apply skills such as critical thinking, problem-solving and decision-making, thus preparing them for their future work. However, only a few, small-scale studies have analysed the independent project to gain more insight into how nursing students carry out this task. The aim of the present study was to describe the methods, including ethical considerations and assessment of data quality, applied in nursing students' independent Bachelor's degree projects in a Swedish context. A descriptive study with a quantitative approach was used. A total of 490 independent projects were analysed using descriptive statistics. Literature reviews were the predominant project form. References were often used to support the analysis method; they were not, however, always relevant to the method. This was also true of ethical considerations. When a qualitative approach was used and data were collected through interviews, the participants were typically professionals. In qualitative projects involving analysis of biographies/autobiographies or blogs, participants were either persons with a disease or next of kin of a person with a disease. Although most of the projects were literature reviews, it seemed unclear to the nursing students how the data should be analysed as well as what ethical issues should be raised in relation to the method. Consequently, further research and guidance are needed. In Sweden, independent projects are not considered research and are therefore not required to undergo ethics vetting. However, it is important that they be designed so as to avoid possible research ethics problems. Asking persons about their health, which occurred in some of the empirical projects, may therefore be considered questionable.

  2. Exploring an Experiential Learning Project through Kolb's Learning Theory Using a Qualitative Research Method

    ERIC Educational Resources Information Center

    Chan, Cecilia Ka Yuk

    2012-01-01

    Experiential learning pedagogy is taking a lead in the development of graduate attributes and educational aims as these are of prime importance for society. This paper shows a community service experiential project conducted in China. The project enabled students to serve the affected community in a post-earthquake area by applying their knowledge…

  3. A low-complexity 2-point step size gradient projection method with selective function evaluations for smoothed total variation based CBCT reconstructions

    NASA Astrophysics Data System (ADS)

    Song, Bongyong; Park, Justin C.; Song, William Y.

    2014-11-01

    The Barzilai-Borwein (BB) 2-point step size gradient method is receiving attention for accelerating Total Variation (TV) based CBCT reconstructions. In order to become truly viable for clinical applications, however, its convergence property needs to be properly addressed. We propose a novel fast converging gradient projection BB method that requires ‘at most one function evaluation’ in each iterative step. This Selective Function Evaluation method, referred to as GPBB-SFE in this paper, exhibits the desired convergence property when it is combined with a ‘smoothed TV’ or any other differentiable prior. This way, the proposed GPBB-SFE algorithm offers fast and guaranteed convergence to the desired 3DCBCT image with minimal computational complexity. We first applied this algorithm to a Shepp-Logan numerical phantom. We then applied it to a CatPhan 600 physical phantom (The Phantom Laboratory, Salem, NY) and a clinically-treated head-and-neck patient, both acquired on the TrueBeam™ system (Varian Medical Systems, Palo Alto, CA). Furthermore, we accelerated the reconstruction by implementing the algorithm on an NVIDIA GTX 480 GPU card. We first compared GPBB-SFE with three recently proposed BB-based CBCT reconstruction methods available in the literature using the Shepp-Logan numerical phantom with 40 projections. It is found that GPBB-SFE shows either faster convergence speed/time or superior convergence properties compared to existing BB-based algorithms. With the CatPhan 600 physical phantom, the GPBB-SFE algorithm requires only 3 function evaluations in 30 iterations and, using only 60 projections, reconstructs an image of quality comparable to the standard 364-projection FDK reconstruction. We then applied the algorithm to a clinically-treated head-and-neck patient. It was observed that the GPBB-SFE algorithm requires only 18 function evaluations in 30 iterations. Compared with the FDK algorithm with 364 projections, the GPBB-SFE algorithm produces a visibly equivalent quality CBCT image for the head-and-neck patient with only 180 projections, in 131.7 s, further supporting its clinical applicability.
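
    For readers unfamiliar with the underlying optimizer, the sketch below shows a generic Barzilai-Borwein gradient-projection loop on a toy nonnegative least-squares problem; it is only the basic BB scheme, not the GPBB-SFE algorithm itself, which additionally uses a smoothed-TV objective and a safeguard requiring at most one function evaluation per iteration.

    ```python
    import numpy as np

    def gp_bb(grad, project, x0, n_iter=100, step0=1.0):
        """Generic gradient-projection iteration with a Barzilai-Borwein step size.
        grad(x): gradient of a smooth objective; project(x): projection onto the
        feasible set (e.g. nonnegative images)."""
        x = project(x0)
        g = grad(x)
        step = step0
        for _ in range(n_iter):
            x_new = project(x - step * g)
            g_new = grad(x_new)
            s, y = x_new - x, g_new - g
            sy = float(np.vdot(s, y))
            step = float(np.vdot(s, s)) / sy if sy > 0 else step0   # BB1 step size
            x, g = x_new, g_new
        return x

    # toy usage: recover a nonnegative vector from overdetermined measurements
    rng = np.random.default_rng(0)
    A = rng.standard_normal((80, 40))
    x_true = np.maximum(rng.standard_normal(40), 0)
    b = A @ x_true
    x_hat = gp_bb(lambda x: A.T @ (A @ x - b),      # gradient of 0.5*||Ax - b||^2
                  lambda x: np.maximum(x, 0),       # projection onto nonnegative orthant
                  np.zeros(40), n_iter=300, step0=1e-3)
    print(np.linalg.norm(x_hat - x_true))
    ```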

  4. A low-complexity 2-point step size gradient projection method with selective function evaluations for smoothed total variation based CBCT reconstructions.

    PubMed

    Song, Bongyong; Park, Justin C; Song, William Y

    2014-11-07

    The Barzilai-Borwein (BB) 2-point step size gradient method is receiving attention for accelerating Total Variation (TV) based CBCT reconstructions. In order to become truly viable for clinical applications, however, its convergence property needs to be properly addressed. We propose a novel fast converging gradient projection BB method that requires 'at most one function evaluation' in each iterative step. This Selective Function Evaluation method, referred to as GPBB-SFE in this paper, exhibits the desired convergence property when it is combined with a 'smoothed TV' or any other differentiable prior. This way, the proposed GPBB-SFE algorithm offers fast and guaranteed convergence to the desired 3DCBCT image with minimal computational complexity. We first applied this algorithm to a Shepp-Logan numerical phantom. We then applied it to a CatPhan 600 physical phantom (The Phantom Laboratory, Salem, NY) and a clinically-treated head-and-neck patient, both acquired on the TrueBeam™ system (Varian Medical Systems, Palo Alto, CA). Furthermore, we accelerated the reconstruction by implementing the algorithm on an NVIDIA GTX 480 GPU card. We first compared GPBB-SFE with three recently proposed BB-based CBCT reconstruction methods available in the literature using the Shepp-Logan numerical phantom with 40 projections. It is found that GPBB-SFE shows either faster convergence speed/time or superior convergence properties compared to existing BB-based algorithms. With the CatPhan 600 physical phantom, the GPBB-SFE algorithm requires only 3 function evaluations in 30 iterations and, using only 60 projections, reconstructs an image of quality comparable to the standard 364-projection FDK reconstruction. We then applied the algorithm to a clinically-treated head-and-neck patient. It was observed that the GPBB-SFE algorithm requires only 18 function evaluations in 30 iterations. Compared with the FDK algorithm with 364 projections, the GPBB-SFE algorithm produces a visibly equivalent quality CBCT image for the head-and-neck patient with only 180 projections, in 131.7 s, further supporting its clinical applicability.

  5. IT-supported integrated care pathways for diabetes: A compilation and review of good practices.

    PubMed

    Vrijhoef, Hubertus Jm; de Belvis, Antonio Giulio; de la Calle, Matias; de Sabata, Maria Stella; Hauck, Bastian; Montante, Sabrina; Moritz, Annette; Pelizzola, Dario; Saraheimo, Markku; Guldemond, Nick A

    2017-06-01

    Integrated Care Pathways (ICPs) are a method for the mutual decision-making and organization of care for a well-defined group of patients during a well-defined period. The aim of a care pathway is to enhance the quality of care by improving patient outcomes, promoting patient safety, increasing patient satisfaction, and optimizing the use of resources. To describe this concept, different names are used, e.g. care pathways and integrated care pathways. Modern information technologies (IT) can support ICPs by enabling patient empowerment, better management, and the monitoring of care provided by multidisciplinary teams. This study analyses ICPs across Europe, identifying commonalities and success factors to establish good practices for IT-supported ICPs in diabetes care. A mixed-method approach was applied, combining desk research on 24 projects from the European Innovation Partnership on Active and Healthy Ageing (EIP on AHA) with follow-up interviews of project participants, and a non-systematic literature review. We applied a Delphi technique to select process and outcome indicators, derived from different literature sources which were compiled and applied for the identification of successful good practices. Desk research identified sixteen projects featuring IT-supported ICPs, mostly derived from the EIP on AHA, as good practices based on our criteria. Follow-up interviews were then conducted with representatives from 9 of the 16 projects to gather information not publicly available and understand how these projects were meeting the identified criteria. In parallel, the non-systematic literature review of 434 PubMed search results revealed a total of eight relevant projects. On the basis of the selected EIP on AHA project data and non-systematic literature review, no commonalities with regard to defined process or outcome indicators could be identified through our approach. Conversely, the research produced a heterogeneous picture in all aspects of the projects' indicators. Data from desk research and follow-up interviews partly lacked information on outcome and performance, which limited the comparison between practices. Applying a comprehensive set of indicators in a multi-method approach to assess the projects included in this research study did not reveal any obvious commonalities which might serve as a blueprint for future IT-supported ICP projects. Instead, an unexpected high degree of heterogeneity was observed, that may reflect diverse local implementation requirements e.g. specificities of the local healthcare system, local regulations, or preexisting structures used for the project setup. Improving the definition of and reporting on project outcomes could help advance research on and implementation of effective integrated care solutions for chronic disease management across Europe.

  6. Assessment of data quality needs for use in transportation applications.

    DOT National Transportation Integrated Search

    2013-02-01

    The objective of this project is to investigate the data quality measures and how they are applied to travel time prediction. This project showcases a short-term travel time prediction method that takes into account the data needs of the real-time...

  7. Applying Program Theory-Driven Approach to Design and Evaluate a Teacher Professional Development Program

    ERIC Educational Resources Information Center

    Lin, Su-ching; Wu, Ming-sui

    2016-01-01

    This study was the first year of a two-year project which applied a program theory-driven approach to evaluating the impact of teachers' professional development interventions on students' learning by using a mix of methods, qualitative inquiry, and quasi-experimental design. The current study was to show the results of using the method of…

  8. Electron-driven processes in polyatomic molecules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKoy, Vincent

    2017-03-20

    This project developed and applied scalable computational methods to obtain information about low-energy electron collisions with larger polyatomic molecules. Such collisions are important in modeling radiation damage to living systems, in spark ignition and combustion, and in plasma processing of materials. The focus of the project was to develop efficient methods that could be used to obtain both fundamental scientific insights and data of practical value to applications.

  9. TH-EF-207A-05: Feasibility of Applying SMEIR Method On Small Animal 4D Cone Beam CT Imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhong, Y; Zhang, Y; Shao, Y

    Purpose: Small animal cone beam CT imaging has been widely used in preclinical research. Due to the higher respiratory and heart rates of small animals, motion blurring is inevitable and needs to be corrected in the reconstruction. The simultaneous motion estimation and image reconstruction (SMEIR) method, which uses projection images of all phases, has proved to be effective in motion model estimation and able to reconstruct motion-compensated images. We demonstrate the application of SMEIR to small animal 4D cone beam CT imaging by computer simulations on a digital rat model. Methods: The small animal CBCT imaging system was simulated with a source-to-detector distance of 300 mm and a source-to-object distance of 200 mm. A sequence of rat phantoms was generated with a 0.4 mm³ voxel size. The respiratory cycle was taken as 1.0 second and the motions were simulated with a diaphragm motion of 2.4 mm and an anterior-posterior expansion of 1.6 mm. The projection images were calculated using a ray-tracing method, and 4D-CBCT images were reconstructed using the SMEIR and FDK methods. The SMEIR method iterates over two alternating steps: 1) motion-compensated iterative image reconstruction using projections from all respiration phases and 2) motion model estimation directly from projections through a 2D-3D deformable registration of the image obtained in the first step to the projection images of the other phases. Results: The images reconstructed using the SMEIR method reproduced the features in the original phantom. Projections from the same phase were also reconstructed using the FDK method. Compared with the FDK results, the images from the SMEIR method substantially improve the image quality with minimal artifacts. Conclusion: We demonstrate that it is viable to apply the SMEIR method to reconstruct small animal 4D-CBCT images.

  10. Non-Adiabatic Molecular Dynamics Methods for Materials Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Furche, Filipp; Parker, Shane M.; Muuronen, Mikko J.

    2017-04-04

    The flow of radiative energy in light-driven materials such as photosensitizer dyes or photocatalysts is governed by non-adiabatic transitions between electronic states and cannot be described within the Born-Oppenheimer approximation commonly used in electronic structure theory. The non-adiabatic molecular dynamics (NAMD) methods based on Tully surface hopping and time-dependent density functional theory developed in this project have greatly extended the range of molecular materials that can be tackled by NAMD simulations. New algorithms to compute molecular excited state and response properties efficiently were developed. Fundamental limitations of common non-linear response methods were discovered and characterized. Methods for accurate computations of vibronic spectra of materials such as black absorbers were developed and applied. It was shown that open-shell TDDFT methods capture bond breaking in NAMD simulations, a longstanding challenge for single-reference molecular dynamics simulations. The methods developed in this project were applied to study the photodissociation of acetaldehyde and revealed that non-adiabatic effects are experimentally observable in fragment kinetic energy distributions. Finally, the project enabled the first detailed NAMD simulations of photocatalytic water oxidation by titania nanoclusters, uncovering the mechanism of this fundamentally important reaction for fuel generation and storage.

  11. Optical efficiency of solar concentrators by a reverse optical path method.

    PubMed

    Parretta, A; Antonini, A; Milan, E; Stefancich, M; Martinelli, G; Armani, M

    2008-09-15

    A method for the optical characterization of a solar concentrator, based on the reverse illumination by a Lambertian source and measurement of intensity of light projected on a far screen, has been developed. It is shown that the projected light intensity is simply correlated to the angle-resolved efficiency of a concentrator, conventionally obtained by a direct illumination procedure. The method has been applied by simulating simple reflective nonimaging and Fresnel lens concentrators.

  12. Superiorization with level control

    NASA Astrophysics Data System (ADS)

    Cegielski, Andrzej; Al-Musallam, Fadhel

    2017-04-01

    The convex feasibility problem is to find a common point of a finite family of closed convex subsets. In many applications one requires something more, namely finding a common point of closed convex subsets which minimizes a continuous convex function. The latter requirement leads to an application of the superiorization methodology, which actually sits between methods for the convex feasibility problem and methods for convex constrained minimization. Inspired by the superiorization idea, we introduce a method which sequentially applies a long-step algorithm to a sequence of convex feasibility problems; the method employs quasi-nonexpansive operators as well as subgradient projections with level control and does not require evaluation of the metric projection. We replace a perturbation of the iterates (applied in the superiorization methodology) by a perturbation of the current level in minimizing the objective function. We consider the method in Euclidean space in order to guarantee strong convergence, although the method is well defined in a Hilbert space.
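
    The subgradient projection with level control mentioned above can be illustrated with a small sketch: the single-step function below is the standard Polyak-type relaxed projection onto a sublevel set, and the loop that lowers the level is a deliberately crude stand-in for the paper's level-control rule.

    ```python
    import numpy as np

    def subgradient_projection(x, f, subgrad, level):
        """Subgradient projection of x onto the sublevel set {y : f(y) <= level}.
        If f(x) already meets the level nothing is done; otherwise a standard
        Polyak-type step along a subgradient is taken."""
        fx = f(x)
        if fx <= level:
            return x
        g = subgrad(x)
        return x - (fx - level) / float(np.vdot(g, g)) * g

    # toy usage: drive f(x) = ||x||^2 below a gradually lowered level
    f = lambda x: float(x @ x)
    subgrad = lambda x: 2.0 * x
    x = np.array([3.0, -4.0])
    level = f(x)
    for _ in range(50):
        level *= 0.9                       # crude "level control": lower the target
        x = subgradient_projection(x, f, subgrad, level)
    print(f(x), level)
    ```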

  13. Segmentation of touching handwritten Japanese characters using the graph theory method

    NASA Astrophysics Data System (ADS)

    Suwa, Misako

    2000-12-01

    Projection analysis methods have been widely used to segment Japanese character strings. However, if adjacent characters have overhanging strokes or a touching point does not correspond to the histogram minimum, these methods are prone to errors. In contrast, non-projection analysis methods proposed for numerals or alphabetic characters cannot simply be applied to Japanese characters because of differences in the structure of the characters. Based on an over-segmenting strategy, a new pre-segmentation method is presented in this paper: touching patterns are represented as graphs and touching strokes are regarded as elements of proper edge cutsets. Using graph-theoretical techniques, the cutset matrix is calculated. Then, by applying pruning rules, potential touching strokes are determined and the patterns are over-segmented. Moreover, this algorithm was confirmed in simulations to be valid for touching patterns with overhanging strokes and for doubly connected patterns.

  14. Applying scrum methods to ITS projects.

    DOT National Transportation Integrated Search

    2017-08-01

    The introduction of new technology generally brings new challenges and new methods to help with deployments. Agile methodologies have been introduced in the information technology industry to potentially speed up development. The Federal Highway Admi...

  15. Projection-free approximate balanced truncation of large unstable systems

    NASA Astrophysics Data System (ADS)

    Flinois, Thibault L. B.; Morgans, Aimee S.; Schmid, Peter J.

    2015-08-01

    In this article, we show that the projection-free, snapshot-based, balanced truncation method can be applied directly to unstable systems. We prove that even for unstable systems, the unmodified balanced proper orthogonal decomposition algorithm theoretically yields a converged transformation that balances the Gramians (including the unstable subspace). We then apply the method to a spatially developing unstable system and show that it results in reduced-order models of similar quality to the ones obtained with existing methods. Due to the unbounded growth of unstable modes, a practical restriction on the final impulse response simulation time appears, which can be adjusted depending on the desired order of the reduced-order model. Recommendations are given to further reduce the cost of the method if the system is large and to improve the performance of the method if it does not yield acceptable results in its unmodified form. Finally, the method is applied to the linearized flow around a cylinder at Re = 100 to show that it actually is able to accurately reproduce impulse responses for more realistic unstable large-scale systems in practice. The well-established approximate balanced truncation numerical framework therefore can be safely applied to unstable systems without any modifications. Additionally, balanced reduced-order models can readily be obtained even for large systems, where the computational cost of existing methods is prohibitive.
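
    As a point of reference, the following is a minimal snapshot-based balanced POD sketch for a small stable discrete-time system; the article's contribution is that essentially this same projection-free construction carries over to unstable systems, which the toy below does not attempt to demonstrate.

    ```python
    import numpy as np

    def balanced_pod(A, B, C, n_snap=50, r=2):
        """Snapshot-based approximate balanced truncation for x_{k+1} = A x_k + B u_k,
        y_k = C x_k.  Direct snapshots are A^k B, adjoint snapshots are (A^T)^k C^T."""
        X = np.hstack([np.linalg.matrix_power(A, k) @ B for k in range(n_snap)])
        Y = np.hstack([np.linalg.matrix_power(A.T, k) @ C.T for k in range(n_snap)])
        U, s, Vt = np.linalg.svd(Y.T @ X, full_matrices=False)   # empirical Hankel singular values
        Phi = X @ Vt[:r].T / np.sqrt(s[:r])                      # balancing modes
        Psi = Y @ U[:, :r] / np.sqrt(s[:r])                      # adjoint modes
        return Psi.T @ A @ Phi, Psi.T @ B, C @ Phi               # reduced (A, B, C)

    # toy usage: reduce a random stable 20-state system to 2 states
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 20))
    A *= 0.9 / max(abs(np.linalg.eigvals(A)))                    # force spectral radius < 1
    B, C = rng.standard_normal((20, 1)), rng.standard_normal((1, 20))
    Ar, Br, Cr = balanced_pod(A, B, C)
    print(Ar.shape)
    ```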

  16. Printed Circuit Board Quality Assurance

    NASA Technical Reports Server (NTRS)

    Sood, Bhanu

    2016-01-01

    PCB Assurance Summary: PCB assurance activities are informed by risk in the context of the Project. Lessons are being applied across Projects for continuous improvement. Newer component technologies and smaller, higher-pitch devices lead to tighter and more demanding PCB designs, identifying new research areas: new materials, designs, structures and test methods.

  17. Teaching Power Electronics with a Design-Oriented, Project-Based Learning Method at the Technical University of Denmark

    ERIC Educational Resources Information Center

    Zhang, Zhe; Hansen, Claus Thorp; Andersen, Michael A. E.

    2016-01-01

    Power electronics is a fast-developing technology within the electrical engineering field. This paper presents the results and experiences gained from applying design-oriented project-based learning to switch-mode power supply design in a power electronics course at the Technical University of Denmark (DTU). Project-based learning (PBL) is known…

  18. How economics can further the success of ecological restoration.

    PubMed

    Iftekhar, Md Sayed; Polyakov, Maksym; Ansell, Dean; Gibson, Fiona; Kay, Geoffrey M

    2017-04-01

    Restoration scientists and practitioners have recently begun to include economic and social aspects in the design and investment decisions for restoration projects. With few exceptions, ecological restoration studies that include economics focus solely on evaluating the costs of restoration projects. However, economic principles, tools, and instruments can be applied to a range of other factors that affect project success. We considered the relevance of applying economics to address 4 key challenges of ecological restoration: assessing social and economic benefits, estimating overall costs, project prioritization and selection, and long-term financing of restoration programs. We found it is uncommon to consider all types of benefits (such as nonmarket values) and costs (such as transaction costs) in restoration programs. The total benefit of a restoration project can be estimated using market prices and various nonmarket valuation techniques. The total cost of a project can be estimated using methods based on property or land-sale prices, such as the hedonic pricing method, and organizational surveys. Securing continuous (or long-term) funding is also vital to accomplishing restoration goals and can be achieved by establishing synergy with existing programs, public-private partnerships, and financing through taxation.

  19. A numerical study of different projection-based model reduction techniques applied to computational homogenisation

    NASA Astrophysics Data System (ADS)

    Soldner, Dominic; Brands, Benjamin; Zabihyan, Reza; Steinmann, Paul; Mergheim, Julia

    2017-10-01

    Computing the macroscopic material response of a continuum body commonly involves the formulation of a phenomenological constitutive model. However, the response is mainly influenced by the heterogeneous microstructure. Computational homogenisation can be used to determine the constitutive behaviour on the macro-scale by solving a boundary value problem at the micro-scale for every so-called macroscopic material point within a nested solution scheme. Hence, this procedure requires the repeated solution of similar microscopic boundary value problems. To reduce the computational cost, model order reduction techniques can be applied. An important aspect thereby is the robustness of the obtained reduced model. Within this study reduced-order modelling (ROM) for the geometrically nonlinear case using hyperelastic materials is applied for the boundary value problem on the micro-scale. This involves the Proper Orthogonal Decomposition (POD) for the primary unknown and hyper-reduction methods for the arising nonlinearity. Therein three methods for hyper-reduction, differing in how the nonlinearity is approximated and the subsequent projection, are compared in terms of accuracy and robustness. Introducing interpolation or Gappy-POD based approximations may not preserve the symmetry of the system tangent, rendering the widely used Galerkin projection sub-optimal. Hence, a different projection related to a Gauss-Newton scheme (Gauss-Newton with Approximated Tensors- GNAT) is favoured to obtain an optimal projection and a robust reduced model.

  20. Statistical deprojection of galaxy pairs

    NASA Astrophysics Data System (ADS)

    Nottale, Laurent; Chamaraux, Pierre

    2018-06-01

    Aims: The purpose of the present paper is to provide methods of statistical analysis of the physical properties of galaxy pairs. We perform this study to apply it later to catalogs of isolated pairs of galaxies, especially two new catalogs we recently constructed that contain ≈1000 and ≈13 000 pairs, respectively. We are particularly interested by the dynamics of those pairs, including the determination of their masses. Methods: We could not compute the dynamical parameters directly since the necessary data are incomplete. Indeed, we only have at our disposal one component of the intervelocity between the members, namely along the line of sight, and two components of their interdistance, i.e., the projection on the sky-plane. Moreover, we know only one point of each galaxy orbit. Hence we need statistical methods to find the probability distribution of 3D interdistances and 3D intervelocities from their projections; we designed those methods under the term deprojection. Results: We proceed in two steps to determine and use the deprojection methods. First we derive the probability distributions expected for the various relevant projected quantities, namely intervelocity vz, interdistance rp, their ratio, and the product rp v_z^2, which is involved in mass determination. In a second step, we propose various methods of deprojection of those parameters based on the previous analysis. We start from a histogram of the projected data and we apply inversion formulae to obtain the deprojected distributions; lastly, we test the methods by numerical simulations, which also allow us to determine the uncertainties involved.
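
    A short forward Monte Carlo helps to see what the deprojection has to undo: for isotropically oriented pairs, a single true 3D interdistance and intervelocity produce broad distributions of the projected quantities rp and vz. The script below only simulates this forward projection model (with placeholder values); it is not the authors' inversion formulae.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000
    r3d = 1.0          # true 3D interdistance (arbitrary units)
    v3d = 1.0          # true 3D intervelocity modulus

    # isotropic orientation: cos(theta) uniform on [-1, 1]
    cos_t = rng.uniform(-1.0, 1.0, n)
    sin_t = np.sqrt(1.0 - cos_t**2)

    rp = r3d * sin_t            # projected (sky-plane) separation
    vz = v3d * np.abs(cos_t)    # line-of-sight velocity component

    # expected analytic forms for a single true (r3d, v3d):
    #   p(vz) is uniform on [0, v3d];  p(rp) = rp / (r3d * sqrt(r3d**2 - rp**2))
    print(np.mean(vz), v3d / 2)             # uniform distribution: mean is v3d/2
    print(np.mean(rp), np.pi / 4 * r3d)     # mean projected separation is (pi/4) r3d
    ```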

  1. Applied Cognitive Task Analysis (ACTA) Methodology

    DTIC Science & Technology

    1997-11-01

    experience-based cognitive skills. The primary goal of this project was to develop streamlined methods of Cognitive Task Analysis that would fill this need...We have made important progress in this direction. We have developed streamlined methods of Cognitive Task Analysis. Our evaluation study indicates...developed a CD-based stand-alone instructional package, which will make the Applied Cognitive Task Analysis (ACTA) tools widely accessible. A survey of the

  2. Evaluation of the impacts of climate change on disease vectors through ecological niche modelling.

    PubMed

    Carvalho, B M; Rangel, E F; Vale, M M

    2017-08-01

    Vector-borne diseases are exceptionally sensitive to climate change. Predicting vector occurrence in specific regions is a challenge that disease control programs must meet in order to plan and execute control interventions and climate change adaptation measures. Recently, an increasing number of scientific articles have applied ecological niche modelling (ENM) to study medically important insects and ticks. With a myriad of available methods, it is challenging to interpret their results. Here we review the future projections of disease vectors produced by ENM, and assess their trends and limitations. Tropical regions are currently occupied by many vector species, but future projections indicate poleward expansions of suitable climates for their occurrence; entomological surveillance must therefore be carried out continuously in areas projected to become suitable. The most commonly applied methods were the maximum entropy algorithm, generalized linear models, the genetic algorithm for rule set prediction, and discriminant analysis. Lack of consideration of the full known current distribution of the target species in models with future projections has led to questionable predictions. We conclude that there is no ideal 'gold standard' method to model vector distributions; researchers are encouraged to test different methods for the same data. Such practice is becoming common in the field of ENM, but still lags behind in studies of disease vectors.

  3. Higher-order Traits and Happiness in the Workplace: The Importance of Occupational Project Scale for the Evaluation of Characteristic Adaptations.

    PubMed

    Buruk, Pelin; Şimşek, Ömer Faruk; Kocayörük, Ercan

    2017-01-01

    This study attempts to explain the relationship between job satisfaction and the Big Two, Stability and Plasticity, which are the higher-order traits of Big Five. Occupational Project, a narrative construct, was considered a mediator variable in this relationship. Occupational Project consists of affective and cognitive evaluations of an individual's work life as a project in terms of the completed (past), the ongoing (present) and the prospective (future) parts. The survey method was applied to a sample of 253 participants. The results supported the proposed model, in which Occupational Project mediated the relationship between the Big Two and both job satisfaction and affect in workplace. Discussion is focused on applying Occupational Project as a practical tool for management. Consideration of an employee's Occupational Project could provide management with a means to question, understand, intervene with and redefine the narrative quality of his/her occupational project that influences job satisfaction.

  4. Environmental impact assessment for alternative-energy power plants in México.

    PubMed

    González-Avila, María E; Beltrán-Morales, Luis Felipe; Braker, Elizabeth; Ortega-Rubio, Alfredo

    2006-07-01

    Ten Environmental Impact Assessment Reports (EIAR) were reviewed for projects involving alternative power plants in Mexico developed during the last twelve years. Our analysis focused on the methods used to assess the impacts produced by hydroelectric and geothermal power projects. The methods used to assess impacts in EIARs ranged from the simplest descriptive criteria to quantitative models. These methods are not concordant with the level of the EIAR required by the environmental authority or even with the kind of project developed. It is concluded that there is no correlation between the tools used to assess impacts and the assigned type of the EIAR. Because the methods used to assess the impacts produced by these power projects have not changed during this period, we propose a quantitative method, based on ecological criteria and tools, to assess the impacts produced by hydroelectric and geothermal plants according to the specific characteristics of the project. The proposed method is supported by environmental norms and can assist environmental authorities in assigning the correct level and tools to be applied to hydroelectric and geothermal projects. The proposed method can be adapted to other production activities in Mexico and to other countries.

  5. Fast projection/backprojection and incremental methods applied to synchrotron light tomographic reconstruction.

    PubMed

    de Lima, Camila; Salomão Helou, Elias

    2018-01-01

    Iterative methods for tomographic image reconstruction have the computational cost of each iteration dominated by the computation of the (back)projection operator, which takes roughly O(N³) floating point operations (flops) for N × N pixel images. Furthermore, classical iterative algorithms may take too many iterations to achieve acceptable images, thereby making the use of these techniques impractical for high-resolution images. Techniques have been developed in the literature to reduce the computational cost of the (back)projection operator to O(N² log N) flops. Also, incremental algorithms have been devised that reduce by an order of magnitude the number of iterations required to achieve acceptable images. The present paper introduces an incremental algorithm with a cost of O(N² log N) flops per iteration and applies it to the reconstruction of very large tomographic images obtained from synchrotron light illuminated data.

  6. Advances in the NASA Earth Science Division Applied Science Program

    NASA Astrophysics Data System (ADS)

    Friedl, L.; Bonniksen, C. K.; Escobar, V. M.

    2016-12-01

    The NASA Earth Science Division's Applied Science Program advances the understanding of, and the ability to use, remote sensing data in support of socio-economic needs. The integration of socio-economic considerations into NASA Earth Science projects has advanced significantly. The large variety of acquisition methods used has required innovative implementation options. The integration of application themes and the implementation of application science activities in flight projects are continuing to evolve. The creation of the recently released Earth Science Division Directive on Project Applications Program and the addition of an application science requirement in the recent EVM-2 solicitation document NASA's current intent. Continuing improvements in the Earth Science Applications Science Program are expected in the areas of thematic integration, Project Applications Program tailoring for Class D missions, and transfer of knowledge between scientists and projects.

  7. 78 FR 23961 - Request for Steering Committee Nominations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-23

    ... development of a methods research agenda and coordination of methods research in support of using electronic... surveillance, and methods research and application for scientific professionals. 3. IMEDS-Evaluation: Applies... transparent way to create exciting new research projects to advance regulatory science. The Foundation acts as...

  8. Morphological characterization of diesel soot agglomerates based on the Beer-Lambert law

    NASA Astrophysics Data System (ADS)

    Lapuerta, Magín; Martos, Francisco J.; José Expósito, Juan

    2013-03-01

    A new method is proposed for the determination of the number of primary particles composing soot agglomerates emitted from diesel engines as well as their individual fractal dimension. The method is based on the Beer-Lambert law and it is applied to micro-photographs taken in high resolution transmission electron microscopy. Differences in the grey levels of the images lead to a more accurate estimation of the geometry of the agglomerate (in this case radius of gyration) than other methods based exclusively on the planar projections of the agglomerates. The method was validated by applying it to different images of the same agglomerate observed from different angles of incidence, and proving that the effect of the angle of incidence is minor, contrary to other methods. Finally, the comparisons with other methods showed that the size, number of primary particles and fractal dimension (the latter depending on the particle size) are usually underestimated when only planar projections of the agglomerates are considered.
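
    The thickness-from-grey-level idea behind such a method can be sketched in a few lines, assuming a calibrated incident intensity, an effective attenuation coefficient and a known primary-particle diameter; all numerical values below are placeholders rather than calibrated TEM parameters, and the function is an illustration of the Beer-Lambert principle, not the authors' procedure.

    ```python
    import numpy as np

    def primary_particle_count(image, i0, mu, d_p, pixel_size):
        """Estimate the number of primary particles in a TEM projection image.
        Beer-Lambert: I = I0 * exp(-mu * t), so the soot thickness per pixel is
        t = ln(I0 / I) / mu.  Summing t * pixel_area gives the agglomerate volume,
        which is then divided by the volume of one spherical primary particle."""
        t = np.log(i0 / np.clip(image, 1e-12, None)) / mu      # thickness map
        volume = np.sum(t) * pixel_size**2                      # integrated soot volume
        v_primary = np.pi * d_p**3 / 6.0                        # volume of one primary sphere
        return volume / v_primary

    # toy usage with a synthetic grey-level image (all values are placeholders)
    rng = np.random.default_rng(0)
    img = 255.0 * np.exp(-0.02 * rng.integers(0, 30, size=(64, 64)))
    print(primary_particle_count(img, i0=255.0, mu=0.02, d_p=25.0, pixel_size=1.0))
    ```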

  9. Lessons from NASA Applied Sciences Program: Success Factors in Applying Earth Science in Decision Making

    NASA Astrophysics Data System (ADS)

    Friedl, L. A.; Cox, L.

    2008-12-01

    The NASA Applied Sciences Program collaborates with organizations to discover and demonstrate applications of NASA Earth science research and technology to decision making. The desired outcome is for public and private organizations to use NASA Earth science products in innovative applications for sustained, operational uses to enhance their decisions. In addition, the program facilitates the end-user feedback to Earth science to improve products and demands for research. The Program thus serves as a bridge between Earth science research and technology and the applied organizations and end-users with management, policy, and business responsibilities. Since 2002, the Applied Sciences Program has sponsored over 115 applications-oriented projects to apply Earth observations and model products to decision making activities. Projects have spanned numerous topics - agriculture, air quality, water resources, disasters, public health, aviation, etc. The projects have involved government agencies, private companies, universities, non-governmental organizations, and foreign entities in multiple types of teaming arrangements. The paper will examine this set of applications projects and present specific examples of successful use of Earth science in decision making. The paper will discuss scientific, organizational, and management factors that contribute to or impede the integration of the Earth science research in policy and management. The paper will also present new methods the Applied Sciences Program plans to implement to improve linkages between science and end users.

  10. Holomorphic projections and Ramanujan’s mock theta functions

    PubMed Central

    Imamoğlu, Özlem; Raum, Martin; Richter, Olav K.

    2014-01-01

    We use spectral methods of automorphic forms to establish a holomorphic projection operator for tensor products of vector-valued harmonic weak Maass forms and vector-valued modular forms. We apply this operator to discover simple recursions for Fourier series coefficients of Ramanujan’s mock theta functions. PMID:24591582

  11. BLACK SKIMMERS (RYNCHOPS NIGER) IN AN URBAN LANDSCAPE: CONTAMINANT AND DIET INFLUENCES ON REPRODUCTIVE OUTPUT IN SAN DIEGO BAY

    EPA Science Inventory

    The project addresses an applied ecological question through interdisciplinary and innovative analytical methods. By combining stable isotope and contaminant analyses, this project will provide a comprehensive perspective of how contaminants and diet vary within an organism...

  12. Project risk management in the construction of high-rise buildings

    NASA Astrophysics Data System (ADS)

    Titarenko, Boris; Hasnaoui, Amir; Titarenko, Roman; Buzuk, Liliya

    2018-03-01

    This paper presents project risk management methods that allow risks in the construction of high-rise buildings to be identified more effectively and managed throughout the life cycle of the project. One of the project risk management processes is a quantitative analysis of risks. The quantitative analysis usually includes the assessment of the potential impact of project risks and their probabilities. This paper reviews the most popular methods of risk probability assessment and seeks to indicate the advantages of the robust approach over traditional methods. Within the framework of the project risk management model, a robust approach due to P. Huber is applied and expanded for the tasks of regression analysis of project data. The suggested algorithms used to assess the parameters in statistical models allow reliable estimates to be obtained. A review of the theoretical problems in the development of robust models built on the methodology of minimax estimates was carried out, and an algorithm for the situation of asymmetric "contamination" was developed.

  13. Cryopreservation and conservation of microalgae: the development of a Pan-European scientific and biotechnological resource (the COBRA project).

    PubMed

    Day, J G; Benson, E E; Harding, K; Knowles, B; Idowu, M; Bremner, D; Santos, L; Santos, F; Friedl, T; Lorenz, M; Lukesova, A; Elster, J; Lukavsky, J; Herdman, M; Rippka, R; Hall, T

    2005-01-01

    Microalgae are one of the most biologically important elements of worldwide ecology and could be the source of diverse new products and medicines. COBRA (The COnservation of a vital european scientific and Biotechnological Resource: microAlgae and cyanobacteria) is the acronym for a European Union, RTD Infrastructures project (Contract No. QLRI-CT-2001-01645). This project is in the process of developing a European Biological Resource Centre based on existing algal culture collections. The COBRA project's central aim is to apply cryopreservation methodologies to microalgae and cyanobacteria, organisms that, to date, have proved difficult to conserve using cryogenic methods. In addition, molecular and biochemical stability tests have been developed to ensure that the equivalent strains of microorganisms supplied by the culture collections give high quality and consistent performance. Fundamental and applied knowledge of stress physiology form an essential component of the project and this is being employed to assist the optimisation of methods for preserving a wide range of algal diversity. COBRA's "Resource Centre" utilises Information Technologies (IT) and Knowledge Management practices to assist project coordination, management and information dissemination and facilitate the generation of new knowledge pertaining to algal conservation. This review of the COBRA project will give a summary of current methodologies for cryopreservation of microalgae and procedures adopted within the COBRA project to enhance preservation techniques for this diverse group of organisms.

  14. Experiences Using Formal Methods for Requirements Modeling

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David

    1996-01-01

    This paper describes three cases studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, the formal modeling provided a cost effective enhancement of the existing verification and validation processes. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.

  15. Evaluation on Cost Overrun Risks of Long-distance Water Diversion Project Based on SPA-IAHP Method

    NASA Astrophysics Data System (ADS)

    Yuanyue, Yang; Huimin, Li

    2018-02-01

    Large investment, long routes, and many change orders are among the main causes of cost overruns in long-distance water diversion projects. Building on existing research, this paper constructs a full-process cost overrun risk evaluation index system for water diversion projects, applies the SPA-IAHP method to set up a cost overrun risk evaluation model, and calculates and ranks the weight of every risk evaluation index. Finally, the cost overrun risks are comprehensively evaluated by calculating the linkage measure, and a comprehensive risk level is obtained. The SPA-IAHP method can evaluate risks accurately and with high reliability. As verified by a case calculation, it can provide valid cost overrun decision-making information to construction companies.

  16. "Intelligent Ensemble" Projections of Precipitation and Surface Radiation in Support of Agricultural Climate Change Adaptation

    NASA Technical Reports Server (NTRS)

    Taylor, Patrick C.; Baker, Noel C.

    2015-01-01

    Earth's climate is changing and will continue to change into the foreseeable future. Expected changes in the climatological distribution of precipitation, surface temperature, and surface solar radiation will significantly impact agriculture. Adaptation strategies are, therefore, required to reduce the agricultural impacts of climate change. Climate change projections of precipitation, surface temperature, and surface solar radiation distributions are necessary input for adaptation planning studies. These projections are conventionally constructed from an ensemble of climate model simulations (e.g., the Coupled Model Intercomparison Project 5 (CMIP5)) as an equally weighted average: one model, one vote. Each climate model, however, represents the array of climate-relevant physical processes with varying degrees of fidelity, influencing the projection of individual climate variables differently. Presented here is a new approach, termed the "Intelligent Ensemble," that constructs climate variable projections by weighting each model according to its ability to represent key physical processes, e.g., the precipitation probability distribution. This approach provides added value over the equally weighted average method. Physical process metrics applied in the "Intelligent Ensemble" method are created using a combination of NASA and NOAA satellite and surface-based cloud, radiation, temperature, and precipitation data sets. The "Intelligent Ensemble" method is applied to the RCP4.5 and RCP8.5 anthropogenic climate forcing simulations within the CMIP5 archive to develop a set of climate change scenarios for precipitation, temperature, and surface solar radiation in each USDA Farm Resource Region for use in climate change adaptation studies.
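
    A minimal sketch of the skill-weighting idea, assuming inverse-RMSE weights against an observed reference; the actual "Intelligent Ensemble" uses satellite-derived physical process metrics (e.g., the precipitation probability distribution) rather than the toy RMSE score used here, and all data below are made up.

    ```python
    import numpy as np

    def skill_weighted_projection(historical, observed, future):
        """Weight each model by inverse RMSE against observations over a historical
        period, then average the future projections with those weights.
        historical, future: arrays of shape (n_models, n_samples); observed: (n_samples,)."""
        rmse = np.sqrt(np.mean((historical - observed) ** 2, axis=1))
        weights = 1.0 / rmse
        weights /= weights.sum()
        return weights, weights @ future

    # toy usage with three made-up "models" of daily precipitation
    rng = np.random.default_rng(0)
    obs = rng.gamma(2.0, 2.0, 365)                               # observed reference series
    hist = obs + rng.normal(0, [[0.5], [1.0], [3.0]], (3, 365))  # models with different skill
    fut = hist + 0.3                                             # stand-in future projections
    w, proj = skill_weighted_projection(hist, obs, fut)
    print(w, proj.mean())
    ```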

  17. Business Models for Training and Performance Improvement Departments

    ERIC Educational Resources Information Center

    Carliner, Saul

    2004-01-01

    Although typically applied to entire enterprises, the concept of business models applies to training and performance improvement groups. Business models are "the method by which firm[s] build and use [their] resources to offer.. value." Business models affect the types of projects, services offered, skills required, business processes, and type of…

  18. Applying the Kanban Method in Problem-Based Project Work: A Case Study in A Manufacturing Engineering Bachelor's Programme at Aalborg University Copenhagen

    ERIC Educational Resources Information Center

    Balve, Patrick; Krüger, Volker; Tolstrup Sørensen, Lene

    2017-01-01

    Problem-based learning (PBL) has proven to be highly effective for educating students in an active and self-motivated manner in various disciplines. Student projects carried out following PBL principles are very dynamic and carry a high level of uncertainty, both conditions under which agile project management approaches are assumed to be highly…

  19. CloudSat system engineering: techniques that point to a future success

    NASA Technical Reports Server (NTRS)

    Basilio, R. R.; Boain, R. J.; Lam, T.

    2002-01-01

    Over the past three years the CloudSat Project, a NASA Earth System Science Pathfinder mission to provide from space the first global survey of cloud profiles and cloud physical properties, has implemented a successful project system engineering approach. Techniques learned through heuristic reasoning about past project events and professional experience were applied along with select methods recently touted to increase effectiveness without compromising efficiency.

  20. Applying a 2D based CAD scheme for detecting micro-calcification clusters using digital breast tomosynthesis images: an assessment

    NASA Astrophysics Data System (ADS)

    Park, Sang Cheol; Zheng, Bin; Wang, Xiao-Hui; Gur, David

    2008-03-01

    Digital breast tomosynthesis (DBT) has emerged as a promising imaging modality for screening mammography. However, visually detecting micro-calcification clusters depicted on DBT images is a difficult task. Computer-aided detection (CAD) schemes for detecting micro-calcification clusters depicted on mammograms can achieve high performance, and the use of CAD results can assist radiologists in detecting subtle micro-calcification clusters. In this study, we compared the performance of an available 2D based CAD scheme with one that includes a new grouping and scoring method when applied to both projection and reconstructed DBT images. We selected a dataset involving 96 DBT examinations acquired on 45 women. Each DBT image set included 11 low dose projection images and a varying number of reconstructed image slices ranging from 18 to 87. In this dataset, 20 true-positive micro-calcification clusters were visually detected on the projection images and 40 on the reconstructed images. We first applied the CAD scheme previously developed in our laboratory to the DBT dataset. We then tested a new grouping method that defines an independent cluster by grouping the same cluster detected on different projection or reconstructed images. We then compared four scoring methods to assess the CAD performance. The maximum sensitivity levels observed for the different grouping and scoring methods were 70% and 88% for the projection and reconstructed images, with maximum false-positive rates of 4.0 and 15.9 per examination, respectively. This preliminary study demonstrates that (1) among the maximum, minimum, and average CAD-generated scores, using the maximum score of the grouped cluster regions achieved the highest performance level, (2) the histogram based scoring method is reasonably effective in reducing false-positive detections on the projection images, although overall CAD sensitivity is lower there due to the lower signal-to-noise ratio, and (3) CAD achieved both higher sensitivity and a higher false-positive rate (per examination) on the reconstructed images. We concluded that, without changing the detection threshold or performing pre-filtering to possibly increase detection sensitivity, current CAD schemes developed and optimized for 2D mammograms perform relatively poorly on DBT and need to be re-optimized using DBT datasets, and new grouping and scoring methods need to be incorporated into the schemes if these are to be used on DBT examinations.
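
    The grouping-and-scoring step can be sketched as follows: detections from different projection images or reconstructed slices are merged when their in-plane centres fall within a distance threshold, and each merged cluster keeps the maximum CAD score. The radius and the detection tuples below are illustrative placeholders, not the scheme's actual parameters.

    ```python
    import numpy as np

    def group_detections(detections, radius=5.0):
        """Greedy grouping of per-image CAD detections into clusters for one exam.
        detections: list of (x, y, score) from all slices/projections.
        Detections whose (x, y) centres fall within `radius` are treated as the
        same physical cluster, and the group is scored with its maximum score."""
        groups = []                       # each group: [cx, cy, max_score, count]
        for x, y, s in detections:
            for g in groups:
                if (g[0] - x) ** 2 + (g[1] - y) ** 2 <= radius ** 2:
                    g[2] = max(g[2], s)   # keep the maximum CAD score for the group
                    g[3] += 1
                    break
            else:
                groups.append([x, y, s, 1])
        return groups

    # toy usage: three detections of one cluster plus one isolated detection
    dets = [(100.2, 50.1, 0.62), (101.0, 49.5, 0.80), (99.8, 50.6, 0.71), (200.0, 120.0, 0.55)]
    print(group_detections(dets))
    ```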

  1. Twinning European and South Asian river basins to enhance capacity and implement adaptive integrated water resources management approaches - results from the EC-project BRAHMATWINN

    NASA Astrophysics Data System (ADS)

    Flügel, W.-A.

    2011-04-01

    The EC project BRAHMATWINN carried out a harmonised integrated water resources management (IWRM) approach, as addressed by the European Water Initiative (EWI), in headwater river systems of alpine mountain massifs of the twinned Upper Danube River Basin (UDRB) and Upper Brahmaputra River Basin (UBRB) in Europe and South Asia, respectively. Social and natural scientists, in cooperation with water law experts and local stakeholders, produced the project outcomes presented in Chapters 2 to 10 of this publication. BRAHMATWINN applied a holistic approach towards IWRM comprising climate modelling, socio-economic and governance analysis and concepts, together with methods and integrated tools of applied Geoinformatics. A detailed description of the deliverables produced by the BRAHMATWINN project is published on the project homepage http://www.brahmatwinn.uni-jena.de.

  2. Interdisciplinary Curriculum Development in Hospital Methods Improvement. Final Report.

    ERIC Educational Resources Information Center

    Watt, John R.

    The major purpose of this project was to develop a "package" curriculum of Hospital Methods Improvement techniques for college students in health related majors. The elementary Industrial Engineering methods for simplifying work and saving labor were applied to the hospital environment and its complex of problems. The report's…

  3. Restoration of multichannel microwave radiometric images

    NASA Technical Reports Server (NTRS)

    Chin, R. T.; Yeh, C. L.; Olson, W. S.

    1983-01-01

    A constrained iterative image restoration method is applied to multichannel diffraction-limited imagery. This method is based on the Gerchberg-Papoulis algorithm utilizing incomplete information and partial constraints. The procedure is described using the orthogonal projection operators which project onto two prescribed subspaces iteratively. Some of its properties and limitations are also presented. The selection of appropriate constraints was emphasized in a practical application. Multichannel microwave images, each having different spatial resolution, were restored to a common highest resolution to demonstrate the effectiveness of the method. Both noise-free and noisy images were used in this investigation.
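
    A minimal 1-D sketch of this kind of constrained iterative (Gerchberg-Papoulis-style) restoration is shown below: the iteration alternates between enforcing the measured low-frequency spectrum and applying spatial constraints (nonnegativity and finite support). The real application is 2-D and multichannel with channel-specific resolution constraints; the band limit and support used here are arbitrary.

    ```python
    import numpy as np

    def gerchberg_papoulis(measured_spectrum, known_mask, support, n_iter=200):
        """Alternate projections between (i) the set of signals whose spectrum matches
        the measured low-frequency data and (ii) signals obeying spatial constraints
        (nonnegative, zero outside the support)."""
        x = np.real(np.fft.ifft(measured_spectrum * known_mask))
        for _ in range(n_iter):
            x = np.where(support, np.maximum(x, 0.0), 0.0)          # spatial-domain constraints
            X = np.fft.fft(x)
            X = np.where(known_mask, measured_spectrum, X)          # keep measured frequencies
            x = np.real(np.fft.ifft(X))
        return x

    # toy usage: recover a compact nonnegative signal from its low-pass spectrum
    n = 128
    truth = np.zeros(n); truth[40:60] = 1.0
    freq = np.fft.fftfreq(n)
    mask = np.abs(freq) < 0.10                                      # diffraction-limited band
    spec = np.fft.fft(truth)
    support = np.zeros(n, bool); support[30:70] = True
    rec = gerchberg_papoulis(spec, mask, support)
    print(np.linalg.norm(rec - truth) / np.linalg.norm(truth))
    ```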

  4. The Value of Methodical Management: Optimizing Science Results

    NASA Astrophysics Data System (ADS)

    Saby, Linnea

    2016-01-01

    As science progresses, making new discoveries in radio astronomy becomes increasingly complex. Instrumentation must be incredibly fine-tuned and well-understood, scientists must consider the skills and schedules of large research teams, and inter-organizational projects sometimes require coordination between observatories around the globe. Structured and methodical management allows scientists to work more effectively in this environment and leads to optimal science output. This report outlines the principles of methodical project management in general, and describes how those principles are applied at the National Radio Astronomy Observatory (NRAO) in Charlottesville, Virginia.

  5. Experiences Using Lightweight Formal Methods for Requirements Modeling

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David

    1997-01-01

    This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, formal methods enhanced the existing verification and validation processes, by testing key properties of the evolving requirements, and helping to identify weaknesses. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.

  6. Darboux coordinates and instanton corrections in projective superspace

    NASA Astrophysics Data System (ADS)

    Crichigno, P. Marcos; Jain, Dharmesh

    2012-10-01

    By demanding consistency of the Legendre transform construction of hyperkähler metrics in projective superspace, we derive the expression for the Darboux coordinates on the hyperkähler manifold. We apply these results to study the Coulomb branch moduli space of 4D, N=2 super-Yang-Mills theory (SYM) on R^3 × S^1, recovering the results by GMN. We also apply this method to study the electric corrections to the moduli space of 5D, N=1 SYM on R^3 × T^2 and give the Darboux coordinates explicitly.

  7. Project-Based Learning Environments: Challenging Preservice Teachers to Act in the Moment

    ERIC Educational Resources Information Center

    Wilhelm, Jennifer; Sherrod, Sonya; Walters, Kendra

    2008-01-01

    In this design study, the authors documented means by which preservice teachers experienced mathematics and science associated with understanding the Moon and sky as they participated in project work within their mathematics and science methods course. The authors examined the way preservice teachers applied mathematics needed to accomplish…

  8. PROJECT EARNING POWER, GRANT RD-1806-G. HISTORY AND FINAL REPORT.

    ERIC Educational Resources Information Center

    LANGDON, MARGARET

    The purpose of the project was to apply current knowledge, methods, and techniques in industrial design and product development to rehabilitating the handicapped by using the labor force available in sheltered workshops and homebound programs. Demonstration programs were established as task forces in Chicago, New York, and Los Angeles with the…

  9. The DEVELOP National Program's Strategy for Communicating Applied Science Outcomes

    NASA Astrophysics Data System (ADS)

    Childs-Gleason, L. M.; Ross, K. W.; Crepps, G.; Favors, J.; Kelley, C.; Miller, T. N.; Allsbrook, K. N.; Rogers, L.; Ruiz, M. L.

    2016-12-01

    NASA's DEVELOP National Program conducts rapid feasibility projects that enable the future workforce and current decision makers to collaborate and build capacity to use Earth science data to enhance environmental management and policy. The program communicates its results and applications to a broad spectrum of audiences through a variety of methods: "virtual poster sessions" that engage the general public through short project videos and interactive dialogue periods, a "Campus Ambassador Corps" that communicates about the program and its projects to academia, scientific and policy conference presentations, community engagement activities and end-of-project presentations, project "hand-offs" providing results and tools to project partners, traditional publications (both gray literature and peer-reviewed), an interactive website project gallery, targeted brochures, and through multiple social media venues and campaigns. This presentation will describe the various methods employed by DEVELOP to communicate the program's scientific outputs, target audiences, general statistics, community response and best practices.

  10. Local blur analysis and phase error correction method for fringe projection profilometry systems.

    PubMed

    Rao, Li; Da, Feipeng

    2018-05-20

    We introduce a flexible error correction method for fringe projection profilometry (FPP) systems in the presence of the local blur phenomenon. Local blur caused by global light transport, such as camera defocus, projector defocus, and subsurface scattering, will cause significant systematic errors in FPP systems. Previous methods, which adopt high-frequency patterns to separate the direct and global components, fail when the global light transport occurs locally. In this paper, the influence of local blur on phase quality is thoroughly analyzed, and a concise error correction method is proposed to compensate for the phase errors. For the defocus phenomenon, this method can be directly applied. With the aid of spatially varying point spread functions and a local frontal plane assumption, experiments show that the proposed method can effectively alleviate the system errors and improve the final reconstruction accuracy in various scenes. For a subsurface scattering scenario, if the translucent object is dominated by multiple scattering, the proposed method can also be applied to correct systematic errors once the bidirectional scattering-surface reflectance distribution function of the object material is measured.

  11. A PDF projection method: A pressure algorithm for stand-alone transported PDFs

    NASA Astrophysics Data System (ADS)

    Ghorbani, Asghar; Steinhilber, Gerd; Markus, Detlev; Maas, Ulrich

    2015-03-01

    In this paper, a new formulation of the projection approach is introduced for stand-alone probability density function (PDF) methods. The method is suitable for applications in low-Mach number transient turbulent reacting flows. The method is based on a fractional step method in which first the advection-diffusion-reaction equations are modelled and solved within a particle-based PDF method to predict an intermediate velocity field. Then the mean velocity field is projected onto a space where the continuity for the mean velocity is satisfied. In this approach, a Poisson equation is solved on the Eulerian grid to obtain the mean pressure field. Then the mean pressure is interpolated at the location of each stochastic Lagrangian particle. The formulation of the Poisson equation avoids the time derivatives of the density (due to convection) as well as second-order spatial derivatives. This in turn eliminates the major sources of instability in the presence of stochastic noise that are inherent in particle-based PDF methods. The convergence of the algorithm (in the non-turbulent case) is investigated first by the method of manufactured solutions. Then the algorithm is applied to a one-dimensional turbulent premixed flame in order to assess the accuracy and convergence of the method in the case of turbulent combustion. As a part of this work, we also apply the algorithm to a more realistic flow, namely a transient turbulent reacting jet, in order to assess the performance of the method.
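    A minimal, constant-density sketch of the projection step described above (solve a Poisson equation for the mean pressure on an Eulerian grid, then subtract the pressure gradient so that the corrected mean velocity satisfies continuity) is given below. Periodic boundaries, an FFT-based solver, and random intermediate velocities are simplifying assumptions; the Lagrangian particle and PDF transport machinery of the paper is omitted.

```python
# Pressure-Poisson projection on a periodic 2-D grid with spectral derivatives.
import numpy as np

n, L, dt, rho = 64, 1.0, 1e-3, 1.0
dx = L / n
k = 2 * np.pi * np.fft.fftfreq(n, d=dx)
kx, ky = np.meshgrid(k, k, indexing="ij")
k2 = kx**2 + ky**2
k2[0, 0] = 1.0                                  # avoid division by zero (mean mode)

rng = np.random.default_rng(1)
u_star = rng.standard_normal((n, n))            # intermediate velocity components
v_star = rng.standard_normal((n, n))

def ddx(f, axis):
    """Spectral derivative of f along the given axis."""
    return np.fft.ifft2(1j * (kx if axis == 0 else ky) * np.fft.fft2(f)).real

div = ddx(u_star, 0) + ddx(v_star, 1)
p_hat = np.fft.fft2(rho / dt * div) / (-k2)     # solve  lap(p) = (rho/dt) div(u*)
p_hat[0, 0] = 0.0
p = np.fft.ifft2(p_hat).real

u = u_star - dt / rho * ddx(p, 0)               # projection: subtract pressure gradient
v = v_star - dt / rho * ddx(p, 1)
print("max |div u| after projection:", float(np.abs(ddx(u, 0) + ddx(v, 1)).max()))
```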

  12. [Tobacco quality analysis of industrial classification of different years using near-infrared (NIR) spectrum].

    PubMed

    Wang, Yi; Xiang, Ma; Wen, Ya-Dong; Yu, Chun-Xia; Wang, Luo-Ping; Zhao, Long-Lian; Li, Jun-Hui

    2012-11-01

    In this study, tobacco quality analysis of the main industrial classifications of different years was carried out applying spectrum projection and correlation methods. The data were near-infrared (NIR) spectra from Hongta Tobacco (Group) Co., Ltd. NIR spectra of 5730 tobacco leaf industrial classification samples from Yuxi in Yunnan Province, covering 2007 to 2010, were collected; the samples came from different parts and colors and all belong to the HONGDA tobacco variety. The results showed that, when the samples from the same year were divided randomly into analysis and verification sets at a ratio of 2:1, the verification set corresponded with the analysis set under spectrum projection, as their correlation coefficients were above 0.98. The correlation coefficients between two different years obtained by spectrum projection were above 0.97. The highest correlation coefficient was between 2008 and 2009, and the lowest was between 2007 and 2010. The study also discussed a method to obtain quantitative similarity values for different industrial classification samples. The similarity and consistency values are instructive for the combination and replacement of tobacco leaves in blending.

  13. A STUDY ON IMPLEMENTATION OF A PROCUREMENT MANAGEMENT METHOD ADDRESSING DELIVERY UNCERTAINTY OF EQUIPMENT, MATERIALS AND INSTRUCTIONS

    NASA Astrophysics Data System (ADS)

    Asaine, Keita; Asaine, Wataru; Shiratsuchi, Ryoma; Yoshida, Takaichi; Hashimoto, Masaaki

    This paper highlights the issues of procurement management in construction projects, such as the late delivery of purchased equipment and materials and missing instructions from customers, which cause construction schedule delays and cost overruns. We point out that most of these problems are caused by a lack of synchronization between procurement activities and process control. We therefore propose a managerial method that enables better synchronization between the two, and apply it to a construction company. We discuss the necessary conditions for and validity of incorporating the method and show how to establish its mechanics through the case study. Furthermore, we show that the method not only addresses procurement issues but also brings additional benefits, such as shortening project lead time and reducing project cost.

  14. Activity analysis: contributions to the innovation of projects for aircrafts cabins.

    PubMed

    Rossi, N T; Greghi, F M; Menegon, L N; Souza, G B J

    2012-01-01

    This article presents results obtained from an ergonomics intervention in a project for the design of aircraft cabins. The study aims to analyze the contribution of the method adopted for analyzing passengers' activities in reference situations, that is, real-use situations in aircraft cabins, and for analyzing typical activities performed by people in their own environment. Within this perspective, the study presents two analyses that highlight the use of electronic devices. The first analysis was recorded by filming a real commercial flight. In the second, device use was observed within the domestic environment. The same method, based on activity analysis, was applied in both contexts: filming the activity, analyzing postures and actions, conducting self-confrontation interviews, reconstructing the course of action, and elaborating posture envelopes. The results indicate that the developed method can be applied to different contexts and reveals different ways of occupying space to meet personal needs while performing an activity, which can help anticipate users' needs as well as indicate some possibilities for innovation.

  15. Undergraduate degree projects in the Swedish dental schools: a documentary analysis.

    PubMed

    Franzén, C; Brown, G

    2013-05-01

    Undergraduate degree projects have recently been introduced into courses in the four Swedish dental schools. The rationale for research projects is that they enable students to develop research skills and to show their ability to apply and develop knowledge relevant to professional practice. This paper reports a qualitative analysis of the curriculum documents and handbooks including the criteria used to assess the students' research reports. The aim was to investigate commonalities and differences in the design of degree projects between the four Swedish dental schools and to explore any inconsistencies within the documents. The documentary analysis was based on the constant comparison method. Four overarching themes emerged from the analysis: (i) developing scientific expertise, (ii) developing professional expertise, (iii) following rules and (iv) fostering creativity. The documents from the four dental schools revealed similar views on the purposes of the projects and provided similar assessment criteria. The students were requested to formulate an odontological problem, apply a relevant scientific method, analyse texts and empirical data, express critical reflections and write a short thesis. The students were free to choose topics. There were differences between the dental schools in the emphasis placed on the practical uses of the projects and the theoretical background of the projects. Two of the schools insisted on rigid rules for completing and writing the project, yet paradoxically emphasised creativity. There were wide variations in the required length of the project report. The report may prove useful to dental schools in other countries who are about to design undergraduate research projects. © 2013 John Wiley & Sons A/S. Published by Blackwell Publishing Ltd.

  16. Applying Various Methods of Communicating Science for Community Decision-Making and Public Awareness: A NASA DEVELOP National Program Case Study

    NASA Astrophysics Data System (ADS)

    Miller, T. N.; Brumbaugh, E. J.; Barker, M.; Ly, V.; Schick, R.; Rogers, L.

    2015-12-01

    The NASA DEVELOP National Program conducts over eighty Earth science projects every year. Each project applies NASA Earth observations to impact decision-making related to a local or regional community concern. Small, interdisciplinary teams create a methodology to address the specific issue, and then pass on the results to partner organizations, as well as providing them with instruction to continue using remote sensing for future decisions. Many different methods are used by individual teams, and the program as a whole, to communicate results and research accomplishments to decision-makers, stakeholders, alumni, and the general public. These methods vary in scope from formal publications to more informal venues, such as social media. This presentation will highlight the communication techniques used by the DEVELOP program. Audiences, strategies, and outlets will be discussed, including a newsletter, microjournal, video contest, and several others.

  17. On randomized algorithms for numerical solution of applied Fredholm integral equations of the second kind

    NASA Astrophysics Data System (ADS)

    Voytishek, Anton V.; Shipilov, Nikolay M.

    2017-11-01

    In this paper, the systematization of numerical (computer-implemented) randomized functional algorithms for approximating a solution of a Fredholm integral equation of the second kind is carried out. Three types of such algorithms are distinguished: the projection, the mesh and the projection-mesh methods. The possibilities for using these algorithms to solve practically important problems are investigated in detail. The disadvantages of the mesh algorithms, related to the need to calculate values of the kernels of the integral equations at fixed points, are identified. In practice, these kernels have integrable singularities, and calculation of their values is impossible. Thus, for applied problems related to solving Fredholm integral equations of the second kind, it is expedient to use not the mesh but the projection and projection-mesh randomized algorithms.
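    To make the distinction concrete, a deterministic Galerkin projection method for a second-kind Fredholm equation is sketched below with an invented separable kernel whose exact solution is known; the randomized (Monte Carlo) functional versions discussed in the paper are not reproduced here.

```python
# Galerkin projection for u(x) = f(x) + lam * int_0^1 K(x,t) u(t) dt.
# With K(x,t) = x*t, lam = 1, f(x) = x, the exact solution is u(x) = 1.5 x.
import numpy as np

lam = 1.0
K = lambda x, t: x * t
f = lambda x: x

basis = [lambda x: np.ones_like(x), lambda x: x, lambda x: x**2]
nodes, weights = np.polynomial.legendre.leggauss(20)
x = 0.5 * (nodes + 1.0)               # map Gauss nodes to [0, 1]
w = 0.5 * weights

m = len(basis)
A = np.zeros((m, m))
b = np.zeros(m)
for i, phi_i in enumerate(basis):
    b[i] = np.sum(w * phi_i(x) * f(x))
    for j, phi_j in enumerate(basis):
        mass = np.sum(w * phi_i(x) * phi_j(x))
        # double integral of phi_i(x) K(x,t) phi_j(t) via tensor-product quadrature
        kern = np.sum((w * phi_i(x))[:, None] * K(x[:, None], x[None, :]) * (w * phi_j(x))[None, :])
        A[i, j] = mass - lam * kern

c = np.linalg.solve(A, b)
u = lambda s: sum(ci * phi(s) for ci, phi in zip(c, basis))
print("u(0.5) =", float(u(np.array(0.5))), "(exact 0.75)")
```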

  18. New human biomonitoring methods for chemicals of concern-the German approach to enhance relevance.

    PubMed

    Kolossa-Gehring, Marike; Fiddicke, Ulrike; Leng, Gabriele; Angerer, Jürgen; Wolz, Birgit

    2017-03-01

    In Germany strong efforts have been made within the last years to develop new methods for human biomonitoring (HBM). The German Federal Ministry for the Environment, Nature Conservation, Building and Nuclear Safety (BMUB) and the German Chemical Industry Association e. V. (VCI) have cooperated since 2010 to increase the knowledge on the internal exposure of the general population to chemicals. The project's aim is to promote human biomonitoring by developing new analytical methods. A key partner of the cooperation is the German Environment Agency (UBA), which has been entrusted with the scientific coordination. Another key partner is the "HBM Expert Panel", which each year puts together a list of chemicals of interest to the project, from which the Steering Committee of the project chooses up to five substances for which method development will be started. Emphasis is placed on substances with either a potential health relevance or substances to which the general population is potentially exposed to a considerable extent. The HBM Expert Panel also advises on method development. Once a method is developed, it is usually first applied to about 40 non-occupationally exposed individuals. A next step is applying the methods to different samples: either, if the time trend is of major interest, to samples from the German Environmental Specimen Bank, or, in case exposure sources and the distribution of exposure levels in the general population are the focus, to samples from children and adolescents from the population-representative 5th German Environmental Survey (GerES V). Results are expected in late 2018. This article describes the challenges faced during method development and the solutions found. An overview presents the 34 selected substances, the 14 methods developed and the 7 HBM-I values derived in the period from 2010 to mid 2016. Copyright © 2016 The Authors. Published by Elsevier GmbH. All rights reserved.

  19. Estimation of wind regime from combination of RCM and NWP data in the Gulf of Riga (Baltic Sea)

    NASA Astrophysics Data System (ADS)

    Sile, T.; Sennikovs, J.; Bethers, U.

    2012-04-01

    The Gulf of Riga is a semi-enclosed gulf located in the eastern part of the Baltic Sea. Reliable wind climate data are crucial for the development of wind energy. The objective of this study is to create high-resolution wind parameter datasets for the Gulf of Riga using climate and numerical weather prediction (NWP) models as an alternative to methods that rely on observations, with the expectation of benefiting from comparing different approaches. The models used for the estimation of the wind regime are an ensemble of Regional Climate Models (RCM, ENSEMBLES, 23 runs are considered) and high-resolution NWP data. Future projections provided by RCMs are of interest; however, their spatial resolution is unsatisfactory. We describe a method of spatial refinement of RCM data using NWP data to resolve small-scale features. We apply the method of RCM bias correction (Sennikovs and Bethers, 2009), previously used for temperature and precipitation, to wind data, and use NWP data instead of observations. The refinement function is calculated using the contemporary climate (1981-2010) and later applied to RCM near-future (2021-2050) projections to produce a dataset with the same resolution as the NWP data. This method corrects for RCM biases that were shown to be present in the initial analysis, and inter-model statistical analysis was carried out to estimate uncertainty. Using the datasets produced by this method, the current and future projections of wind speed and wind energy density are calculated. Acknowledgments: This research is part of the GORWIND (The Gulf of Riga as a Resource for Wind Energy) project (EU34711). The ENSEMBLES data used in this work was funded by the EU FP6 Integrated Project ENSEMBLES (Contract number 505539) whose support is gratefully acknowledged.
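    The exact refinement function of Sennikovs and Bethers (2009) is not given in the abstract; the sketch below shows one common formulation of such a correction, quantile mapping fitted on the contemporary period and then applied to the future RCM wind speeds. The synthetic Weibull-distributed data are purely illustrative.

```python
# Quantile-mapping refinement: fit a correction by matching RCM and NWP wind-speed
# quantiles over the contemporary period, then apply it to the future projection.
import numpy as np

rng = np.random.default_rng(42)
nwp_hist = rng.weibull(2.0, 20000) * 8.0      # "observational" proxy, 1981-2010
rcm_hist = rng.weibull(2.2, 20000) * 6.5      # biased RCM, same period
rcm_future = rng.weibull(2.2, 20000) * 7.0    # RCM projection, 2021-2050

q = np.linspace(0.01, 0.99, 99)
rcm_q = np.quantile(rcm_hist, q)
nwp_q = np.quantile(nwp_hist, q)

def refine(wind):
    """Map each RCM value onto the NWP distribution via the fitted quantiles."""
    return np.interp(wind, rcm_q, nwp_q)

future_refined = refine(rcm_future)
print(f"future mean wind, raw vs refined: "
      f"{rcm_future.mean():.2f} vs {future_refined.mean():.2f} m/s")
```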

  20. A simple method for determining stress intensity factors for a crack in bi-material interface

    NASA Astrophysics Data System (ADS)

    Morioka, Yuta

    Because of the violently oscillating nature of the stress and displacement fields near the crack tip, it is difficult to obtain stress intensity factors for a crack between two dissimilar media. For a crack in a homogeneous medium, it is common practice to find stress intensity factors through strain energy release rates. However, individual strain energy release rates do not exist for a bi-material interface crack. Hence it is necessary to find alternative methods to evaluate stress intensity factors. Several methods have been proposed in the past; however, they involve mathematical complexity and sometimes require additional finite element analysis. The purpose of this research is to develop a simple method to find stress intensity factors for bi-material interface cracks. A finite element based projection method is proposed in this research. It is shown that the projection method yields very accurate stress intensity factors for a crack in isotropic and anisotropic bi-material interfaces. The projection method is also compared to the displacement ratio method and the energy method proposed by other authors. Through this comparison it is found that the projection method is much simpler to apply, with accuracy comparable to that of the displacement ratio method.

  1. Assessing the contribution of different factors in RegCM4.3 regional climate model projections using the Factor Separation method over the Med-CORDEX domain

    NASA Astrophysics Data System (ADS)

    Zsolt Torma, Csaba; Giorgi, Filippo

    2014-05-01

    A set of regional climate model (RCM) simulations applying dynamical downscaling of global climate model (GCM) simulations over the Mediterranean domain specified by the international initiative Coordinated Regional Downscaling Experiment (CORDEX) was completed with the Regional Climate Model RegCM, version RegCM4.3. Two GCMs were selected from the Coupled Model Intercomparison Project Phase 5 (CMIP5) ensemble to provide the driving fields for RegCM: HadGEM2-ES (HadGEM) and MPI-ESM-MR (MPI). The simulations consist of an ensemble including multiple physics configurations and different Representative Concentration Pathways (RCP4.5 and RCP8.5). In total 15 simulations were carried out with 7 model physics configurations with varying convection and land surface schemes. The horizontal grid spacing of the RCM simulations is 50 km and the simulated period in all cases is 1970-2100 (1970-2099 in the case of the HadGEM-driven simulations). This ensemble includes a combination of experiments in which different model components are changed individually and in combination, and thus lends itself optimally to the application of the Factor Separation (FS) method. This study applies the FS method to investigate the contributions of different factors, along with their synergy, on this set of RCM projections for the Mediterranean region. The FS method is applied to 6 projections for the period 1970-2100 performed with the regional model RegCM4.3 over the Med-CORDEX domain. Two different sets of factors are intercompared, namely the driving global climate model (HadGEM and MPI) boundary conditions against two model physics settings (convection scheme and irrigation). We find that both the GCM driving conditions and the model physics provide important contributions, depending on the variable analyzed (surface air temperature and precipitation), season (winter vs. summer) and time horizon into the future, while the synergy term mostly tends to counterbalance the contributions of the individual factors. We demonstrate the usefulness of the FS method to assess different sources of uncertainty in RCM-based regional climate projections.
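    For readers unfamiliar with the technique, the standard two-factor Factor Separation decomposition (Stein and Alpert, 1993) can be written as below. The notation is generic; assigning the factors to the driving GCM choice and a physics option follows the abstract but is otherwise an illustrative reading.

```latex
% f_0, f_1, f_2, f_{12} denote simulations with neither, one, or both factors
% switched on; the hatted terms are the separated contributions.
\begin{align*}
  \hat{f}_0    &= f_0, \\
  \hat{f}_1    &= f_1 - f_0, \\
  \hat{f}_2    &= f_2 - f_0, \\
  \hat{f}_{12} &= f_{12} - (f_1 + f_2) + f_0 \quad \text{(synergy term)}.
\end{align*}
```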

  2. The Stories of Inventions: An Interdisciplinary, Project-Based Unit for U.S. History Students

    ERIC Educational Resources Information Center

    Nargund-Joshi, Vanashri; Bragg, John

    2017-01-01

    During the second industrial revolution (1870-1914), scientists moved away from trial-and-error methods to more systematically apply the principles of chemistry, physics, and biology (Mokyr 1998). The authors chose this period as the foundation of a project-based learning (PBL) unit integrated with the ninth-grade U.S. history curriculum (Thomas…

  3. Pencil Pressure and Anxiety in Drawings: A Techno-Projective Approach

    ERIC Educational Resources Information Center

    LaRoque, Sean Davis; Obrzut, John E.

    2006-01-01

    This study used a techno-projective assessment method to analyze the relationship between pencil pressure applied during drawing tasks and state anxiety (S-anxiety) and trait anxiety (T-anxiety) levels. A highly accurate and precise pressure-sensitive palette was used by participants (N = 50) between the ages of 6 and 11 to reliably and…

  4. Single-shot color fringe projection for three-dimensional shape measurement of objects with discontinuities.

    PubMed

    Dai, Meiling; Yang, Fujun; He, Xiaoyuan

    2012-04-20

    A simple but effective fringe projection profilometry method is proposed to measure 3D shape by using one snapshot color sinusoidal fringe pattern. A color fringe pattern encoding a sinusoidal fringe (as the red component) and a uniform intensity pattern (as the blue component) is projected by a digital video projector, and the deformed fringe pattern is recorded by a color CCD camera. The captured color fringe pattern is separated into its RGB components, and a division operation is applied to the red and blue channels to reduce the variable reflection intensity. Shape information of the tested object is decoded by applying an arcsine algorithm to the normalized fringe pattern with subpixel resolution. In the case of fringe discontinuities caused by height steps, or spatially isolated surfaces, the separated blue component is binarized and used for correcting the phase demodulation. A simple and robust method is also introduced to compensate for the nonlinear intensity response of the digital video projector. The experimental results demonstrate the validity of the proposed method.
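    The decoding path described above (divide red by blue to cancel reflectivity, normalize, then take an arcsine) is sketched below on synthetic data. The scaling of the sinusoid and the test values are assumptions; phase unwrapping and the projector-nonlinearity compensation are omitted.

```python
# Simplified single-shot decode: ratio of red (sinusoidal) to blue (uniform)
# channels removes the unknown albedo, and an arcsine recovers the wrapped phase.
import numpy as np

h, w = 4, 512
x = np.arange(w)
reflect = 0.3 + 0.6 * np.random.default_rng(0).random((h, w))   # unknown albedo
phase_true = 2 * np.pi * x / 64.0                                # carrier phase
red = reflect * (0.5 + 0.5 * np.sin(phase_true))                 # sinusoidal channel
blue = reflect * 1.0                                             # uniform channel

ratio = red / blue                       # reflectivity cancels: 0.5 + 0.5*sin(phase)
normalized = np.clip(2.0 * ratio - 1.0, -1.0, 1.0)               # back to sin(phase)
wrapped = np.arcsin(normalized)          # principal value in [-pi/2, pi/2]

print("recovered range (rad):", float(wrapped.min()), "to", float(wrapped.max()))
```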

  5. Incorporating ITS into transportation improvement planning : the Seattle Case Study using PRUEVIIN

    DOT National Transportation Integrated Search

    1998-01-01

    This project explored methods to analyze ITS strategies within Major Investment Study (MIS) studies and to apply them in a case study. The case study developed methods to define alternatives, and to estimate impacts and costs at the level required fo...

  6. A RUTCOR Project on Discrete Applied Mathematics

    DTIC Science & Technology

    1989-01-30

    the more important results of this work is the possibility that Groebner basis methods of computational commutative algebra might lead to effective...Billera, L.J., " Groebner Basis Methods for Multivariate Splines," prepared for the Proceedings of the Oslo Conference on Computer-aided Geometric Design

  7. Projection-reduction method applied to deriving non-linear optical conductivity for an electron-impurity system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kang, Nam Lyong; Lee, Sang-Seok; Graduate School of Engineering, Tottori University, 4-101 Koyama-Minami, Tottori

    2013-07-15

    The projection-reduction method introduced by the present authors is known to give a validated theory for optical transitions in systems of electrons interacting with phonons. In this work, using this method, we derive the linear and first-order nonlinear optical conductivities for an electron-impurity system and examine whether the expressions faithfully satisfy the quantum mechanical philosophy, in the same way as for the electron-phonon systems. The result shows that the Fermi distribution function for electrons, energy denominators, and electron-impurity coupling factors are contained properly in organized manners, along with absorption of photons for each electron transition process, in the final expressions. Furthermore, the result is shown to be represented properly by schematic diagrams, as in the formulation of electron-phonon interaction. Therefore, in conclusion, we claim that this method can be applied in modeling optical transitions of electrons interacting with both impurities and phonons.

  8. A Case Study of Coordination in Distributed Agile Software Development

    NASA Astrophysics Data System (ADS)

    Hole, Steinar; Moe, Nils Brede

    Global Software Development (GSD) has gained significant popularity as an emerging paradigm. Companies also show interest in applying agile approaches in distributed development to combine the advantages of both approaches. However, in their most radical forms, agile and GSD can be placed at each end of a plan-based/agile spectrum because of how work is coordinated. We describe how three GSD projects applying agile methods coordinate their work. We found that trust is needed to reduce the need for standardization and direct supervision when coordinating work in a GSD project, and that electronic chatting supports mutual adjustment. Further, co-location and modularization mitigate communication problems, enable agility in at least part of a GSD project, and render the implementation of Scrum of Scrums possible.

  9. User Participation in Coproduction of Health Innovation: Proposal for a Synergy Project

    PubMed Central

    Zukauskaite, Elena; Westberg, Niklas

    2018-01-01

    Background This project concerns advancing knowledge, methods, and logic for user participation in the coproduction of health innovations. Such advancement is vital for several reasons. From a user perspective, participation in coproduction provides an opportunity to gain real influence over the goal definition, design, and implementation of health innovations, ensuring that the solution developed solves real problems in the right ways. From a societal perspective, it is a means to improve the efficiency of health care and the implementation of the Patient Act. As for industry, frameworks and knowledge of coproduction offer tools to operate in a complex sector, with great potential for innovation of services and products. Objective The fundamental objective of this project is to advance knowledge and methods of how user participation in the coproduction of health innovations can be applied in order to benefit users, industry, and the public sector. Methods This project is a synergy project, which means that the objective will be accomplished through collaboration and meta-analysis between three subprojects that address different user groups, apply different strategies to promote human health, and relate to different parts of the health sector. Furthermore, the subprojects focus on distinctive stages in the spectrum of innovation, with the objective of generating knowledge of the innovation process as a whole. The project is organized around three work packages related to three challenges: coproduction, positioning, and realization. Each subproject is designed such that it has its own field of study with clearly identified objectives but also targets the work packages to contribute to the project as a whole. The work on the work packages will use case methodology for data collection and analysis based on the subprojects as data sources. More concretely, the logic of multiple case studies will be applied, with each subproject representing a separate case; the cases are similar in their attention to user participation in coproduction but differ regarding, for example, context and target groups. At the synergy level, the framework methodology will be used to handle and analyze the vast amount of information generated within the subprojects. Results The project period is from July 1, 2018 to June 30, 2022. Conclusions By addressing the objective of this project, we will create new knowledge on how to manage challenges to health innovation associated with the coproduction process, the positioning of solutions, and realization. PMID:29743159

  10. In vivo fluorescence lifetime optical projection tomography

    PubMed Central

    McGinty, James; Taylor, Harriet B.; Chen, Lingling; Bugeon, Laurence; Lamb, Jonathan R.; Dallman, Margaret J.; French, Paul M. W.

    2011-01-01

    We demonstrate the application of fluorescence lifetime optical projection tomography (FLIM-OPT) to in vivo imaging of lysC:GFP transgenic zebrafish embryos (Danio rerio). This method has been applied to unambiguously distinguish between the fluorescent protein (GFP) signal in myeloid cells from background autofluorescence based on the fluorescence lifetime. The combination of FLIM, an inherently ratiometric method, in conjunction with OPT results in a quantitative 3-D tomographic technique that could be used as a robust method for in vivo biological and pharmaceutical research, for example as a readout of Förster resonance energy transfer based interactions. PMID:21559145

  11. Diagnostics and Active Control of Aircraft Interior Noise

    NASA Technical Reports Server (NTRS)

    Fuller, C. R.

    1998-01-01

    This project deals with developing advanced methods for investigating and controlling interior noise in aircraft. The work concentrates on developing and applying the techniques of Near Field Acoustic Holography (NAH) and Principal Component Analysis (PCA) to the aircraft interior noise dynamic problem. This involves investigating the current state of the art, developing new techniques and then applying them to the particular problem being studied. The knowledge gained under the first part of the project was then used to develop and apply new, advanced noise control techniques for reducing interior noise. A new fully active control approach based on the PCA was developed and implemented on a test cylinder. Finally an active-passive approach based on tunable vibration absorbers was to be developed and analytically applied to a range of test structures from simple plates to aircraft fuselages.

  12. Validation Process Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lewis, John E.; English, Christine M.; Gesick, Joshua C.

    This report documents the validation process as applied to projects awarded through Funding Opportunity Announcements (FOAs) within the U.S. Department of Energy Bioenergy Technologies Office (DOE-BETO). It describes the procedures used to protect and verify project data, as well as the systematic framework used to evaluate and track performance metrics throughout the life of the project. This report also describes the procedures used to validate the proposed process design, cost data, analysis methodologies, and supporting documentation provided by the recipients.

  13. Application of Artificial Intelligence Techniques in Unmanned Aerial Vehicle Flight

    NASA Technical Reports Server (NTRS)

    Bauer, Frank H. (Technical Monitor); Dufrene, Warren R., Jr.

    2003-01-01

    This paper describes the development of an application of Artificial Intelligence for Unmanned Aerial Vehicle (UAV) control. The project was done as part of the requirements for a class in Artificial Intelligence (AI) at Nova Southeastern University and as an adjunct to a project at NASA Goddard Space Flight Center's Wallops Flight Facility for a resilient, robust, and intelligent UAV flight control system. A method is outlined which allows a base-level application for applying an AI method, Fuzzy Logic, to aspects of Control Logic for UAV flight. One element of UAV flight, automated altitude hold, has been implemented and preliminary results displayed. A low-cost approach was taken using freeware, GNU software, and demo programs. The focus of this research has been to outline some of the AI techniques used for UAV flight control and discuss some of the tools used to apply AI techniques. The intent is to succeed with the implementation of applying AI techniques to actually control different aspects of the flight of a UAV.
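    To illustrate the kind of rule-based mapping a fuzzy-logic altitude hold performs, a toy controller is sketched below. The membership functions, rule consequents, and gains are invented for the example and are not the NASA/Wallops implementation.

```python
# Toy fuzzy-logic altitude hold: fuzzify the altitude error with triangular
# memberships, apply three rules, and defuzzify with a weighted average.
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def altitude_hold_throttle(error_m: float) -> float:
    """Map altitude error (target minus actual, metres) to a throttle command delta."""
    mu = {
        "too_low":   tri(error_m, 0.0, 20.0, 40.0),    # aircraft below target
        "on_target": tri(error_m, -10.0, 0.0, 10.0),
        "too_high":  tri(error_m, -40.0, -20.0, 0.0),  # aircraft above target
    }
    consequent = {"too_low": +0.15, "on_target": 0.0, "too_high": -0.15}
    num = sum(mu[k] * consequent[k] for k in mu)
    den = sum(mu.values()) or 1.0
    return num / den                                   # weighted-average defuzzification

for err in (-30.0, -5.0, 0.0, 12.0, 35.0):
    print(f"error {err:+.0f} m -> throttle delta {altitude_hold_throttle(err):+.3f}")
```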

  14. An Engineering Method of Civil Jet Requirements Validation Based on Requirements Project Principle

    NASA Astrophysics Data System (ADS)

    Wang, Yue; Gao, Dan; Mao, Xuming

    2018-03-01

    A method of requirements validation is developed and defined to meet the needs of civil jet requirements validation in product development. Based on the requirements project principle, this method does not affect the conventional design elements and can effectively connect the requirements with the design. It realizes the modern civil jet development concept that "requirement is the origin, design is the basis". So far, the method has been successfully applied in civil jet aircraft development in China. Taking takeoff field length as an example, the validation process and the validation method of the requirements are introduced in detail in the study, with the hope of providing experience for other civil jet product designs.

  15. An Auxiliary Method To Reduce Potential Adverse Impacts Of Projected Land Developments: Subwatershed Prioritization

    EPA Science Inventory

    An index based method is developed that ranks the subwatersheds of a watershed based on their relative impacts on watershed response to anticipated land developments, and then applied to an urbanizing watershed in Eastern Pennsylvania. Simulations with a semi-distributed hydrolo...

  16. Multi-projector auto-calibration and placement optimization for non-planar surfaces

    NASA Astrophysics Data System (ADS)

    Li, Dong; Xie, Jinghui; Zhao, Lu; Zhou, Lijing; Weng, Dongdong

    2015-10-01

    Non-planar projection has been widely applied in virtual reality and digital entertainment and exhibitions because of its flexible layout and immersive display effects. Compared with planar projection, a non-planar projection is more difficult to achieve because projector calibration and image distortion correction are difficult processes. This paper uses a cylindrical screen as an example to present a new method for automatically calibrating a multi-projector system in a non-planar environment without using 3D reconstruction. This method corrects the geometric calibration error caused by the screen's manufactured imperfections, such as an undulating surface or a slant in the vertical plane. In addition, based on actual projection demand, this paper presents the overall performance evaluation criteria for the multi-projector system. According to these criteria, we determined the optimal placement for the projectors. This method also extends to surfaces that can be parameterized, such as spheres, ellipsoids, and paraboloids, and demonstrates a broad applicability.

  17. Out-of-Focus Projector Calibration Method with Distortion Correction on the Projection Plane in the Structured Light Three-Dimensional Measurement System.

    PubMed

    Zhang, Jiarui; Zhang, Yingjie; Chen, Bo

    2017-12-20

    The three-dimensional measurement system with a binary defocusing technique is widely applied in diverse fields. The measurement accuracy is mainly determined by the out-of-focus projector calibration accuracy. In this paper, a high-precision out-of-focus projector calibration method that is based on distortion correction on the projection plane and a nonlinear optimization algorithm is proposed. To this end, the paper experimentally establishes the principle that the projector has noticeable distortions outside its focus plane. Based on this principle, the proposed method uses a high-order radial and tangential lens distortion representation on the projection plane to correct the calibration residuals caused by projection distortion. The final accurate parameters of the out-of-focus projector were obtained using a nonlinear optimization algorithm with good initial values, which were provided by coarsely calibrating the parameters of the out-of-focus projector on the focal and projection planes. Finally, the experimental results demonstrated that the proposed method can accurately calibrate an out-of-focus projector, regardless of the amount of defocusing.
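    The abstract does not give the exact distortion representation; a commonly used radial-plus-tangential (Brown-Conrady) form on the projection plane, consistent with the description above, reads as follows. The polynomial order and parameter names are assumptions.

```latex
% Normalised projection-plane coordinates (x, y), r^2 = x^2 + y^2; k_i are radial
% and p_i tangential distortion coefficients. The paper's exact parameterisation
% may differ.
\begin{align*}
  x_d &= x\,(1 + k_1 r^2 + k_2 r^4 + k_3 r^6) + 2 p_1 x y + p_2\,(r^2 + 2x^2),\\
  y_d &= y\,(1 + k_1 r^2 + k_2 r^4 + k_3 r^6) + p_1\,(r^2 + 2y^2) + 2 p_2 x y.
\end{align*}
```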

  18. A stable and high-order accurate discontinuous Galerkin based splitting method for the incompressible Navier-Stokes equations

    NASA Astrophysics Data System (ADS)

    Piatkowski, Marian; Müthing, Steffen; Bastian, Peter

    2018-03-01

    In this paper we consider discontinuous Galerkin (DG) methods for the incompressible Navier-Stokes equations in the framework of projection methods. In particular we employ symmetric interior penalty DG methods within the second-order rotational incremental pressure correction scheme. The major focus of the paper is threefold: i) We propose a modified upwind scheme based on the Vijayasundaram numerical flux that has favourable properties in the context of DG. ii) We present a novel postprocessing technique in the Helmholtz projection step based on H(div) reconstruction of the pressure correction that is computed locally, is a projection in the discrete setting and ensures that the projected velocity satisfies the discrete continuity equation exactly. As a consequence it also provides local mass conservation of the projected velocity. iii) Numerical results demonstrate the properties of the scheme for different polynomial degrees applied to two-dimensional problems with known solution as well as large-scale three-dimensional problems. In particular we address second-order convergence in time of the splitting scheme as well as its long-time stability.
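    For orientation, one step of a rotational incremental pressure-correction splitting can be written as below. It is shown with a first-order time discretization and a semi-implicit convection term for brevity; the paper uses a second-order scheme inside a DG spatial discretization, so this is only a schematic.

```latex
% Predict a tentative velocity with the old pressure, solve a Poisson problem for
% the pressure increment, project the velocity, and apply the rotational update.
\begin{align*}
  \frac{\tilde{u}^{n+1} - u^{n}}{\Delta t}
    - \nu \Delta \tilde{u}^{n+1}
    + (u^{n} \cdot \nabla)\, \tilde{u}^{n+1}
    + \nabla p^{n} &= f^{n+1}, \\
  \Delta \phi^{n+1} &= \frac{1}{\Delta t}\, \nabla \cdot \tilde{u}^{n+1},
    \qquad \partial_n \phi^{n+1} = 0 \ \text{on } \partial\Omega, \\
  u^{n+1} &= \tilde{u}^{n+1} - \Delta t\, \nabla \phi^{n+1}, \\
  p^{n+1} &= p^{n} + \phi^{n+1} - \nu\, \nabla \cdot \tilde{u}^{n+1}.
\end{align*}
```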

  19. The Convergence of Heat, Groundwater & Fracture Permeability. Innovative Play Fairway Modelling Applied to the Tularosa Basin Phase 1 Project Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, Carlon R.; Nash, Gregory D.; Sorkhabi, Rasoul

    This report summarizes the activities and key findings of the project team during Phase 1 (August 2014-October 2015) of the Tularosa Basin Geothermal Play Fairway Analysis Project. The Tularosa Basin Play Fairway Analysis (PFA) project tested two distinct geothermal exploration methodologies covering the entire basin within South Central New Mexico and Far West Texas. Throughout the initial phase of the project, the underexplored basin proved to be a challenging, yet ideal test bed to evaluate the effectiveness of the team's data collection techniques as well as the effectiveness of our innovative PFA. Phase 1 of the effort employed a low-cost, pragmatic approach using two methods to identify potential geothermal plays within the study area and then compared and contrasted the results of each method to rank and evaluate potential plays. Both methods appear to be very effective and highly transferable to other areas.

  20. Statistically Validated Networks in Bipartite Complex Systems

    PubMed Central

    Tumminello, Michele; Miccichè, Salvatore; Lillo, Fabrizio; Piilo, Jyrki; Mantegna, Rosario N.

    2011-01-01

    Many complex systems present an intrinsic bipartite structure where elements of one set link to elements of the second set. In these complex systems, such as the system of actors and movies, elements of one set are qualitatively different from elements of the other set. The properties of these complex systems are typically investigated by constructing and analyzing a projected network on one of the two sets (for example the actor network or the movie network). Complex systems are often very heterogeneous in the number of relationships that the elements of one set establish with the elements of the other set, and this heterogeneity makes it very difficult to discriminate links of the projected network that merely reflect the system's heterogeneity from links that are relevant for unveiling the properties of the system. Here we introduce an unsupervised method to statistically validate each link of a projected network against a null hypothesis that takes into account system heterogeneity. We apply the method to a biological, an economic and a social complex system. The method we propose is able to detect network structures which are very informative about the organization and specialization of the investigated systems, and identifies those relationships between elements of the projected network that cannot be explained simply by system heterogeneity. We also show that our method applies to bipartite systems in which different relationships might have a different qualitative nature, generating statistically validated networks in which such difference is preserved. PMID:21483858
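    Such a heterogeneity-aware null hypothesis is commonly expressed as a hypergeometric test on the co-occurrences of two projected nodes, followed by a multiple-comparison correction; a rough sketch of that test is given below. The example numbers and the Bonferroni threshold are illustrative, and the sketch is not the authors' implementation.

```python
# Hypergeometric link validation for a bipartite projection: the number of common
# neighbours of two projected nodes is compared with what random co-occurrence
# given their degrees would produce.
from scipy.stats import hypergeom

def link_pvalue(n_other: int, deg_a: int, deg_b: int, co_occurrences: int) -> float:
    """P(X >= co_occurrences) when deg_a draws are taken at random from n_other
    items, deg_b of which are 'successes' (neighbours of the other node)."""
    return float(hypergeom.sf(co_occurrences - 1, n_other, deg_b, deg_a))

n_movies = 5000          # size of the "other" set (e.g. movies in an actor network)
deg_a, deg_b = 40, 60    # degrees of two actors in the bipartite graph
observed = 6             # movies they appear in together

p = link_pvalue(n_movies, deg_a, deg_b, observed)
n_tests = 10_000         # number of candidate links examined
print(f"p-value = {p:.2e}, Bonferroni-significant at 0.01: {p < 0.01 / n_tests}")
```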

  1. Reflection Coefficients.

    ERIC Educational Resources Information Center

    Greenslade, Thomas B., Jr.

    1994-01-01

    Discusses and provides an example of reflectivity approximation to determine whether reflection will occur. Provides a method to show thin-film interference on a projection screen. Also applies the reflectivity concepts to electromagnetic wave systems. (MVL)

  2. Pervasive healthcare as a scientific discipline.

    PubMed

    Bardram, J E

    2008-01-01

    The OECD countries are facing a set of core challenges: an increasing elderly population; an increasing number of chronic and lifestyle-related diseases; an expanding scope of what medicine can do; and an increasing lack of medical professionals. Pervasive healthcare asks how pervasive computing technology can be designed to meet these challenges. The objective of this paper is to discuss 'pervasive healthcare' as a research field and to establish how novel and distinct it is compared to related work within biomedical engineering, medical informatics, and ubiquitous computing. The paper presents the research questions, approach, technologies, and methods of pervasive healthcare and discusses these in comparison to those of other related scientific disciplines. A set of central research themes is presented: monitoring and body sensor networks; pervasive assistive technologies; pervasive computing for hospitals; and preventive and persuasive technologies. Two projects illustrate the kind of research being done in pervasive healthcare. The first project is targeted at home-based monitoring of hypertension; the second project is designing context-aware technologies for hospitals. Both projects approach the healthcare challenges in a new way, apply a new type of research method, and come up with new kinds of technological solutions. 'Clinical proof-of-concept' is recommended as a new method for pervasive healthcare research; the method helps design and test pervasive healthcare technologies, and in ascertaining their clinical potential before large-scale clinical tests are needed. The paper concludes that pervasive healthcare as a research field and agenda is novel; it is addressing new emerging research questions, represents a novel approach, designs new types of technologies, and applies a new kind of research method.

  3. Ultra-high resolution computed tomography imaging

    DOEpatents

    Paulus, Michael J.; Sari-Sarraf, Hamed; Tobin, Jr., Kenneth William; Gleason, Shaun S.; Thomas, Jr., Clarence E.

    2002-01-01

    A method for ultra-high resolution computed tomography imaging, comprising the steps of: focusing a high energy particle beam, for example x-rays or gamma-rays, onto a target object; acquiring a 2-dimensional projection data set representative of the target object; generating a corrected projection data set by applying a deconvolution algorithm, having an experimentally determined transfer function, to the 2-dimensional data set; storing the corrected projection data set; incrementally rotating the target object through an angle of approximately 180 degrees, and after each incremental rotation, repeating the radiating, acquiring, generating and storing steps; and, after the rotating step, applying a cone-beam algorithm, for example a modified tomographic reconstruction algorithm, to the corrected projection data sets to generate a 3-dimensional image. The size of the spot focus of the beam is reduced to not greater than approximately 1 micron, and even to not greater than approximately 0.5 microns.
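    The patent specifies only that each 2-D projection is deconvolved with an experimentally determined transfer function before reconstruction; one simple way to realize such a correction is a Wiener-style frequency-domain deconvolution, sketched below with a synthetic Gaussian blur standing in for the measured transfer function. The blur width and regularization constant are assumptions.

```python
# Wiener-style deconvolution of a single 2-D projection with a known blur kernel.
import numpy as np

n = 128
y, x = np.mgrid[:n, :n]
truth = ((x - 64) ** 2 + (y - 64) ** 2 < 20 ** 2).astype(float)   # toy projection

sigma = 2.0
g = np.exp(-((x - n // 2) ** 2 + (y - n // 2) ** 2) / (2 * sigma ** 2))
psf = np.fft.ifftshift(g / g.sum())            # stand-in for the measured transfer function

H = np.fft.fft2(psf)
blurred = np.fft.ifft2(np.fft.fft2(truth) * H).real

eps = 1e-3                                     # noise-dependent regularisation
wiener = np.conj(H) / (np.abs(H) ** 2 + eps)
corrected = np.fft.ifft2(np.fft.fft2(blurred) * wiener).real

print("max abs error, blurred vs corrected:",
      round(float(np.abs(blurred - truth).max()), 3),
      round(float(np.abs(corrected - truth).max()), 3))
```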

  4. Aerospace reliability applied to biomedicine.

    NASA Technical Reports Server (NTRS)

    Lalli, V. R.; Vargo, D. J.

    1972-01-01

    An analysis is presented that indicates that the reliability and quality assurance methodology selected by NASA to minimize failures in aerospace equipment can be applied directly to biomedical devices to improve hospital equipment reliability. The Space Electric Rocket Test project is used as an example of NASA application of reliability and quality assurance (R&QA) methods. By analogy a comparison is made to show how these same methods can be used in the development of transducers, instrumentation, and complex systems for use in medicine.

  5. Applying quantum principles to psychology

    NASA Astrophysics Data System (ADS)

    Busemeyer, Jerome R.; Wang, Zheng; Khrennikov, Andrei; Basieva, Irina

    2014-12-01

    This article starts out with a detailed example illustrating the utility of applying quantum probability to psychology. Then it describes several alternative mathematical methods for mapping fundamental quantum concepts (such as state preparation, measurement, state evolution) to fundamental psychological concepts (such as stimulus, response, information processing). For state preparation, we consider both pure states and densities with mixtures. For measurement, we consider projective measurements and positive operator valued measurements. The advantages and disadvantages of each method with respect to applications in psychology are discussed.
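    As a tiny numerical companion to the projective-measurement mapping discussed in the article, the sketch below computes a response probability as the squared norm of a state's projection onto an answer subspace and forms the post-measurement state. The two-dimensional state and the "yes" basis vector are invented for the illustration.

```python
# Projective measurement: P(response) = ||P psi||^2, post-state = P psi / ||P psi||.
import numpy as np

psi = np.array([0.8, 0.6], dtype=complex)          # prepared "belief" state, norm 1
yes = np.array([1.0, 0.0], dtype=complex)          # basis vector for the "yes" answer
P_yes = np.outer(yes, yes.conj())                  # projector onto the "yes" subspace

projected = P_yes @ psi
p_yes = float(np.vdot(projected, projected).real)  # response probability
post_state = projected / np.sqrt(p_yes)            # state after answering "yes"

print("P(yes) =", p_yes)
print("post-measurement state:", post_state)
```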

  6. Parametric Cost and Schedule Modeling for Early Technology Development

    DTIC Science & Technology

    2018-04-02

    Best Paper in the Analysis Methods Category and 2017 Best Paper Overall awards. It was also presented at the 2017 NASA Cost and Schedule Symposium. ...information contribute to the lack of data, objective models, and methods that can be broadly applied in early planning stages. Scientific

  7. The Historical Method of Inquiry in a Teacher Training Program: Theory and Metatheory.

    ERIC Educational Resources Information Center

    Kimmons, Ron

    A historical method of inquiry can be applied to an experimental teacher training program, specifically, the Ford Training and Preparation Program (FTPP). The historical method requires gathering a lot of loose ideas and events that have been part of the project and hanging them together in an integrated way. To achieve this, two organizing…

  8. FIELD ANALYTICAL METHODS: ADVANCED FIELD MONITORING METHODS DEVELOPMENT AND EVALUATION OF NEW AND INNOVATIVE TECHNOLOGIES THAT SUPPORT THE SITE CHARACTERIZATION AND MONITORING REQUIREMENTS OF THE SUPERFUND PROGRAM.

    EPA Science Inventory

    The overall goal of this task is to help reduce the uncertainties in the assessment of environmental health and human exposure by better characterizing hazardous wastes through cost-effective analytical methods. Research projects are directed towards the applied development and ...

  9. Watershed-scale evaluation of the Water Erosion Prediction Project (WEPP) model in the Lake Tahoe basin

    Treesearch

    Erin S. Brooks; Mariana Dobre; William J. Elliot; Joan Q. Wu; Jan Boll

    2016-01-01

    Forest managers need methods to evaluate the impacts of management at the watershed scale. The Water Erosion Prediction Project (WEPP) has the ability to model disturbed forested hillslopes, but has difficulty addressing some of the critical processes that are important at a watershed scale, including baseflow and water yield. In order to apply WEPP to...

  10. Evaluating the Fatih Project Applications in the Turkish Educational System According to Teachers' Viewpoints (Turkey)

    ERIC Educational Resources Information Center

    Sozen, Erol; Coskun, Mücahit

    2017-01-01

    The purpose of this study was to analyze teachers' perspectives on the usage of smart boards and Tablet PCs in the Fatih Project using some variables (gender, branches, school types, educational status etc). The measurement scale of the study was developed and applied with the high school teachers in Düzce. Quantitative research methods were used…

  11. Measurement of Software Project Management Effectiveness

    DTIC Science & Technology

    2008-12-01

    with technical, financial, policy, and non-technical concerns of stakeholders, to develop and select suitable risk control actions, and implementation...not intervene in their projects and therefore affect their views. In most research experiments, researchers apply a controlled event, method or...enable a consistency check among the responses and for other research purposes. Therefore, for the risk control area model, only the responses from

  12. Applied Academic Skills in Vocational and Nonvocational Classrooms: A Classroom Observation and Focus Group Study.

    ERIC Educational Resources Information Center

    Watkins, Larae

    After 2 years of a 5-year pilot project to develop approaches to strengthen basic competencies of students enrolled in vocational programs in Oklahoma, two of the projects were reviewed. The study sought to: (1) document the incidence and level of basic skills instruction, along with the teaching methods and materials used, in the vocational and…

  13. The Effectiveness of Project-Based Learning on Pupils with Learning Difficulties Regarding Academic Performance, Group Work and Motivation

    ERIC Educational Resources Information Center

    Filippatou, Diamanto; Kaldi, Stavroula

    2010-01-01

    This study focuses upon the effectiveness of project-based learning on primary school pupils with learning difficulties regarding their academic performance and attitudes towards self efficacy, task value, group work and teaching methods applied. The present study is a part of a larger one that included six Greek fourth-grade primary school…

  14. The Effect on the 8th Grade Students' Attitude towards Statistics of Project Based Learning

    ERIC Educational Resources Information Center

    Koparan, Timur; Güven, Bülent

    2014-01-01

    This study investigates the effect of the project based learning approach on 8th grade students' attitude towards statistics. With this aim, an attitude scale towards statistics was developed. Quasi-experimental research model was used in this study. Following this model in the control group the traditional method was applied to teach statistics…

  15. Innovative Instrumentation and Analysis of the Temperature Measurement for High Temperature Gasification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seong W. Lee

    The project entitled ''Innovative Instrumentation and Analysis of the Temperature Measurement for High Temperature Gasification'' was successfully completed by the Principal Investigator, Dr. S. Lee, and his research team in the Center for Advanced Energy Systems and Environmental Control Technologies at Morgan State University. The major results and outcomes were presented in semi-annual progress reports and annual project review meetings/presentations. Specifically, the literature survey, covering gasifier temperature measurement, ultrasonic cleaning applications, and the spray coating process, and the gasifier simulator (cold model) testing were successfully conducted during the first year. The results show that four factors (blower voltage, ultrasonic application, injection time interval, particle weight) were considered significant factors that affect the temperature measurement. Then the gasifier simulator (hot model) design and fabrication, as well as systematic tests on the hot model, were completed to test the significant factors in temperature measurement in the second year. Advanced industrial analytic methods such as statistics-based experimental design, analysis of variance (ANOVA) and regression methods were applied in the hot model tests. The results show that operational parameters (i.e., air flow rate, water flow rate, fine dust particle amount, ammonia addition) had a significant impact on the temperature measurement inside the gasifier simulator. Experimental design and ANOVA are very efficient ways to design and analyze the experiments. The results show that the air flow rate and fine dust particle amount are statistically significant to the temperature measurement. The regression model provided the functional relation between the temperature and these factors with substantial accuracy. In the last year of the project period, the ultrasonic and subsonic cleaning methods and coating materials were tested and applied to thermocouple cleaning according to the proposed approach. Different frequencies, application times and power levels of the ultrasonic/subsonic output were tested. The results show that the ultrasonic approach is one of the best methods to clean the thermocouple tips during routine operation of the gasifier. In addition, a real-time data acquisition system was designed and applied in the experiments. This advanced instrumentation provided efficient and accurate data acquisition for this project. In summary, the accomplishment of the project provided useful information on the ultrasonic cleaning method applied to thermocouple tip cleaning. The temperature measurement could be much improved both in accuracy and duration provided that the proposed approach is widely used in gasification facilities.

  16. MODIFYING EPA METHOD 314.0 FOR ANALYSIS OF PERCHLORATE IN AQUEOUS SAMPLES CONTAINING HIGH TOTAL DISSOLVED SOLIDS

    EPA Science Inventory

    Through the Regional Applied Research Effort (RARE) program, the Chemical Exposure Research Branch and Region 9 personnel in San Francisco, California are collaborating on a project to explore sample pretreatment and preconcentration techniques to lower the method detection limit...

  17. Delphi: An Overview, An Application, Some Lessons.

    ERIC Educational Resources Information Center

    Moore, Carl M.; Coke, James G.

    This paper discusses Delphi, a method of utilizing individuals' knowledge, judgment, and opinions to address complex questions, and applies the method to a community planning project in Stow, Ohio. There are four phases of any Delphi: (1) exploring the subject under discussion, with each individual contributing pertinent information, (2) reaching an…

  18. THE CRITICAL-PATH METHOD OF CONSTRUCTION CONTROL.

    ERIC Educational Resources Information Center

    DOMBROW, RODGER T.; MAUCHLY, JOHN

    THIS DISCUSSION PRESENTS A DEFINITION AND BRIEF DESCRIPTION OF THE CRITICAL-PATH METHOD AS APPLIED TO BUILDING CONSTRUCTION. INTRODUCING REMARKS CONSIDER THE MOST PERTINENT QUESTIONS PERTAINING TO CPM AND THE NEEDS ASSOCIATED WITH MINIMIZING TIME AND COST ON CONSTRUCTION PROJECTS. SPECIFIC DISCUSSION INCLUDES--(1) ADVANTAGES OF NETWORK TECHNIQUES,…
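
    The record above describes CPM at a high level; a minimal sketch of the forward/backward pass that CPM performs to find earliest/latest times and the critical activities is given below. The activity network and durations are illustrative only, not taken from the record.

    ```python
    # Minimal critical-path method (CPM) sketch: forward and backward pass over an
    # activity-on-node network. Activities and durations are illustrative only.
    def topo_order(activities):
        # Simple Kahn-style topological sort of the precedence graph.
        order, placed = [], set()
        while len(order) < len(activities):
            for n, (_, preds) in activities.items():
                if n not in placed and all(p in placed for p in preds):
                    order.append(n)
                    placed.add(n)
        return order

    def cpm(activities):
        """activities: {name: (duration, [predecessors])} -> schedule data."""
        es, ef = {}, {}                      # forward pass: earliest start/finish
        for name in topo_order(activities):
            dur, preds = activities[name]
            es[name] = max((ef[p] for p in preds), default=0)
            ef[name] = es[name] + dur
        project_end = max(ef.values())
        ls, lf = {}, {}                      # backward pass: latest start/finish
        for name in reversed(topo_order(activities)):
            succs = [s for s, (_, ps) in activities.items() if name in ps]
            lf[name] = min((ls[s] for s in succs), default=project_end)
            ls[name] = lf[name] - activities[name][0]
        critical = [n for n in activities if es[n] == ls[n]]   # zero slack
        return es, ef, ls, lf, critical

    # Example network: A -> B, A -> C, {B, C} -> D
    acts = {"A": (3, []), "B": (2, ["A"]), "C": (4, ["A"]), "D": (1, ["B", "C"])}
    print(cpm(acts)[-1])  # critical activities, here ['A', 'C', 'D']
    ```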

  19. A RUTCOR Project in Discrete Applied Mathematics

    DTIC Science & Technology

    1990-02-20

    Excerpt: "...representations of smooth piecewise polynomial functions over triangulated regions have led in particular to the conclusion that Groebner basis methods of..." Cited work includes "Reversing Number of a Digraph" (in preparation) and Billera, L.J., and Rose, L.L., "Groebner Basis Methods for Multivariate Splines," RRR 1-89, January.

  20. Introduction to 3D Graphics through Excel

    ERIC Educational Resources Information Center

    Benacka, Jan

    2013-01-01

    The article presents a method of explaining the principles of 3D graphics through making a revolvable and sizable orthographic parallel projection of cuboid in Excel. No programming is used. The method was tried in fourteen 90 minute lessons with 181 participants, which were Informatics teachers, undergraduates of Applied Informatics and gymnasium…

  1. The Multi-SAG project: filling the MultiDark simulations with semi-analytic galaxies

    NASA Astrophysics Data System (ADS)

    Vega-Martínez, C. A.; Cora, S. A.; Padilla, N. D.; Muñoz Arancibia, A. M.; Orsi, A. A.; Ruiz, A. N.

    2016-08-01

    The semi-analytical model sag is a code of galaxy formation and evolution which is applied to halo catalogs and merger trees extracted from cosmological N-body simulations of dark matter. This contribution describes the project of constructing a catalog of simulated galaxies by adapting and applying the model sag to two publicly available dark matter simulations of the Spanish MultiDark Project. Those simulations have particles in boxes with sizes of 1000 Mpc and 400 Mpc, respectively, with Planck cosmological parameters. They cover a large range of masses and halo mass resolutions; therefore, each simulation is able to produce more than 150 million simulated galaxies. A detailed description of the method is given, and the first statistical results are shown.

  2. 28 CFR 100.16 - Cost estimate submission.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., including: (1) The judgmental factors applied, such as trends or budgetary data, and the mathematical or other methods used in the estimate, including those used in projecting from known data; and (2) The...

  3. Analysis of Expedited Defense Contracting Methods in the Acquisition of Emerging Technology

    DTIC Science & Technology

    2016-12-01

    MBA Professional Report, Naval Postgraduate School, Monterey, California. Authors: Jacob D. Sabin and Mark K. Zakner. Excerpt: "...firms. The DOD has authority for applying non-traditional contracting methods to better adapt to this competitive marketplace. This project studied non..."

  4. Applying knowledge-anchored hypothesis discovery methods to advance clinical and translational research: the OAMiner project

    PubMed Central

    Jackson, Rebecca D; Best, Thomas M; Borlawsky, Tara B; Lai, Albert M; James, Stephen; Gurcan, Metin N

    2012-01-01

    The conduct of clinical and translational research regularly involves the use of a variety of heterogeneous and large-scale data resources. Scalable methods for the integrative analysis of such resources, particularly when attempting to leverage computable domain knowledge in order to generate actionable hypotheses in a high-throughput manner, remain an open area of research. In this report, we describe both a generalizable design pattern for such integrative knowledge-anchored hypothesis discovery operations and our experience in applying that design pattern in the experimental context of a set of driving research questions related to the publicly available Osteoarthritis Initiative data repository. We believe that this ‘test bed’ project and the lessons learned during its execution are both generalizable and representative of common clinical and translational research paradigms. PMID:22647689

  5. Accelerated perturbation-resilient block-iterative projection methods with application to image reconstruction

    PubMed Central

    Nikazad, T; Davidi, R; Herman, G. T.

    2013-01-01

    We study the convergence of a class of accelerated perturbation-resilient block-iterative projection methods for solving systems of linear equations. We prove convergence to a fixed point of an operator even in the presence of summable perturbations of the iterates, irrespective of the consistency of the linear system. For a consistent system, the limit point is a solution of the system. In the inconsistent case, the symmetric version of our method converges to a weighted least squares solution. Perturbation resilience is utilized to approximate the minimum of a convex functional subject to the equations. A main contribution, as compared to previously published approaches to achieving similar aims, is a more than an order of magnitude speed-up, as demonstrated by applying the methods to problems of image reconstruction from projections. In addition, the accelerated algorithms are illustrated to be better, in a strict sense provided by the method of statistical hypothesis testing, than their unaccelerated versions for the task of detecting small tumors in the brain from X-ray CT projection data. PMID:23440911
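
    For illustration, a plain block-iterative projection sweep (Cimmino-style averaging of hyperplane projections) for a linear system A x = b is sketched below. The acceleration and perturbation resilience analyzed in the paper are not reproduced, and the test matrix is synthetic.

    ```python
    # Plain block-iterative projection sketch for A x = b: within each block the
    # projections of x onto the rows' hyperplanes are averaged (Cimmino-style).
    import numpy as np

    def block_projection(A, b, n_blocks=4, n_iters=200, relax=1.0):
        m, n = A.shape
        x = np.zeros(n)
        blocks = np.array_split(np.arange(m), n_blocks)
        for _ in range(n_iters):
            for idx in blocks:                          # sweep through the blocks
                Ab, bb = A[idx], b[idx]
                r = bb - Ab @ x                         # block residual
                row_norms = np.sum(Ab**2, axis=1)
                x = x + relax * Ab.T @ (r / row_norms) / len(idx)
        return x

    rng = np.random.default_rng(0)
    A = rng.normal(size=(40, 10))
    x_true = rng.normal(size=10)
    b = A @ x_true                                      # consistent system
    print(np.linalg.norm(block_projection(A, b) - x_true))  # should be tiny
    ```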

  6. Accelerated perturbation-resilient block-iterative projection methods with application to image reconstruction.

    PubMed

    Nikazad, T; Davidi, R; Herman, G T

    2012-03-01

    We study the convergence of a class of accelerated perturbation-resilient block-iterative projection methods for solving systems of linear equations. We prove convergence to a fixed point of an operator even in the presence of summable perturbations of the iterates, irrespective of the consistency of the linear system. For a consistent system, the limit point is a solution of the system. In the inconsistent case, the symmetric version of our method converges to a weighted least squares solution. Perturbation resilience is utilized to approximate the minimum of a convex functional subject to the equations. A main contribution, as compared to previously published approaches to achieving similar aims, is a more than an order of magnitude speed-up, as demonstrated by applying the methods to problems of image reconstruction from projections. In addition, the accelerated algorithms are illustrated to be better, in a strict sense provided by the method of statistical hypothesis testing, than their unaccelerated versions for the task of detecting small tumors in the brain from X-ray CT projection data.

  7. Research requirements to reduce civil helicopter life cycle cost

    NASA Technical Reports Server (NTRS)

    Blewitt, S. J.

    1978-01-01

    The problem of the high cost of helicopter development, production, operation, and maintenance is defined and the cost drivers are identified. Helicopter life cycle costs would decrease by about 17 percent if currently available technology were applied. With advanced technology, a reduction of about 30 percent in helicopter life cycle costs is projected. Technological and managerial deficiencies which contribute to high costs are examined; basic research and development projects which can reduce costs include methods for reduced fuel consumption; improved turbine engines; airframe and engine production methods; safety; rotor systems; and advanced transmission systems.

  8. Results of a Formal Methods Demonstration Project

    NASA Technical Reports Server (NTRS)

    Kelly, J.; Covington, R.; Hamilton, D.

    1994-01-01

    This paper describes the results of a cooperative study conducted by a team of researchers in formal methods at three NASA Centers to demonstrate FM techniques and to tailor them to critical NASA software systems. This pilot project applied FM to an existing critical software subsystem, the Shuttle's Jet Select subsystem (Phase I of an ongoing study). The present study shows that FM can be used successfully to uncover hidden issues in a highly critical and mature Functional Subsystem Software Requirements (FSSR) specification which are very difficult to discover by traditional means.

  9. Equidistant map projections of a triaxial ellipsoid with the use of reduced coordinates

    NASA Astrophysics Data System (ADS)

    Pędzich, Paweł

    2017-12-01

    The paper presents a new method of constructing equidistant map projections of a triaxial ellipsoid as a function of reduced coordinates. Equations for x and y coordinates are expressed with the use of the normal elliptic integral of the second kind and Jacobian elliptic functions. This solution allows to use common known and widely described in literature methods of solving such integrals and functions. The main advantage of this method is the fact that the calculations of x and y coordinates are practically based on a single algorithm that is required to solve the elliptic integral of the second kind. Equations are provided for three types of map projections: cylindrical, azimuthal and pseudocylindrical. These types of projections are often used in planetary cartography for presentation of entire and polar regions of extraterrestrial objects. The paper also contains equations for the calculation of the length of a meridian and a parallel of a triaxial ellipsoid in reduced coordinates. Moreover, graticules of three coordinates systems (planetographic, planetocentric and reduced) in developed map projections are presented. The basic properties of developed map projections are also described. The obtained map projections may be applied in planetary cartography in order to create maps of extraterrestrial objects.
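
    A small sketch of the basic building block that such equidistant projections rely on is given below: the meridian arc length expressed through the incomplete elliptic integral of the second kind (here evaluated with SciPy). The semi-axes are illustrative, and the full triaxial formulation of the paper is not reproduced.

    ```python
    # Meridian arc length of an ellipse via the incomplete elliptic integral of the
    # second kind; semi-axes are illustrative (Earth-like), not a specific body.
    import numpy as np
    from scipy.special import ellipeinc

    a, b = 6378.137, 6356.752        # km, illustrative equatorial/polar semi-axes
    m = 1.0 - (b / a) ** 2           # parameter m = k**2 of the integral

    def meridian_arc(beta):
        """Arc length along the meridian ellipse x = a*sin(t), z = b*cos(t),
        integrated from the pole (t = 0) to t = beta (radians)."""
        # ds = a*sqrt(1 - m*sin(t)**2) dt, hence s = a * E(beta | m).
        return a * ellipeinc(beta, m)

    print(meridian_arc(np.pi / 2))   # quarter-meridian length, ~10002 km here
    ```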

  10. Statistical iterative reconstruction for streak artefact reduction when using multidetector CT to image the dento-alveolar structures.

    PubMed

    Dong, J; Hayakawa, Y; Kober, C

    2014-01-01

    When metallic prosthetic appliances and dental fillings exist in the oral cavity, the appearance of metal-induced streak artefacts is not avoidable in CT images. The aim of this study was to develop a method for artefact reduction using the statistical reconstruction on multidetector row CT images. Adjacent CT images often depict similar anatomical structures. Therefore, reconstructed images with weak artefacts were attempted using projection data of an artefact-free image in a neighbouring thin slice. Images with moderate and strong artefacts were continuously processed in sequence by successive iterative restoration where the projection data was generated from the adjacent reconstructed slice. First, the basic maximum likelihood-expectation maximization algorithm was applied. Next, the ordered subset-expectation maximization algorithm was examined. Alternatively, a small region of interest setting was designated. Finally, the general purpose graphic processing unit machine was applied in both situations. The algorithms reduced the metal-induced streak artefacts on multidetector row CT images when the sequential processing method was applied. The ordered subset-expectation maximization and small region of interest reduced the processing duration without apparent detriments. A general-purpose graphic processing unit realized the high performance. A statistical reconstruction method was applied for the streak artefact reduction. The alternative algorithms applied were effective. Both software and hardware tools, such as ordered subset-expectation maximization, small region of interest and general-purpose graphic processing unit achieved fast artefact correction.
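
    The core maximum likelihood-expectation maximization (MLEM) update that such reconstructions build on can be sketched as follows for a toy system matrix; the slice-by-slice sequential processing, ordered-subset (OSEM) acceleration and GPU implementation of the study are not shown.

    ```python
    # Minimal MLEM iteration for a toy linear projection model y ~ Poisson(A x).
    import numpy as np

    def mlem(A, y, n_iters=50, eps=1e-12):
        m, n = A.shape
        x = np.ones(n)                          # non-negative initial image
        sens = A.sum(axis=0) + eps              # sensitivity image A^T 1
        for _ in range(n_iters):
            proj = A @ x + eps                  # forward projection
            x = x * (A.T @ (y / proj)) / sens   # multiplicative MLEM update
        return x

    rng = np.random.default_rng(1)
    A = rng.uniform(size=(64, 16))              # toy system matrix
    x_true = rng.uniform(1.0, 5.0, size=16)
    y = rng.poisson(A @ x_true).astype(float)   # noisy projection data
    print(np.round(mlem(A, y)[:5], 2))
    ```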

  11. 5-SPICE: the application of an original framework for community health worker program design, quality improvement and research agenda setting

    PubMed Central

    Palazuelos, Daniel; DaEun Im, Dana; Peckarsky, Matthew; Schwarz, Dan; Farmer, Didi Bertrand; Dhillon, Ranu; Johnson, Ari; Orihuela, Claudia; Hackett, Jill; Bazile, Junior; Berman, Leslie; Ballard, Madeleine; Panjabi, Raj; Ternier, Ralph; Slavin, Sam; Lee, Scott; Selinsky, Steve; Mitnick, Carole Diane

    2013-01-01

    Introduction Despite decades of experience with community health workers (CHWs) in a wide variety of global health projects, there is no established conceptual framework that structures how implementers and researchers can understand, study and improve their respective programs based on lessons learned by other CHW programs. Objective To apply an original, non-linear framework and case study method, 5-SPICE, to multiple sister projects of a large, international non-governmental organization (NGO), and other CHW projects. Design Engaging a large group of implementers, researchers and the best available literature, the 5-SPICE framework was refined and then applied to a selection of CHW programs. Insights gleaned from the case study method were summarized in a tabular format named the ‘5×5-SPICE chart’. This format graphically lists the ways in which essential CHW program elements interact, both positively and negatively, in the implementation field. Results The 5×5-SPICE charts reveal a variety of insights that come from a more complex understanding of how essential CHW project elements interact and influence each other in their unique context. Some have been well described in the literature previously, while others are exclusive to this article. An analysis of how best to compensate CHWs is also offered as an example of the type of insights that this method may yield. Conclusions The 5-SPICE framework is a novel instrument that can be used to guide discussions about CHW projects. Insights from this process can help guide quality improvement efforts, or be used as hypotheses that will form the basis of a program's research agenda. Recent experience with research protocols embedded into successfully implemented projects demonstrates how such hypotheses can be rigorously tested. PMID:23561023

  12. An Ensemble Successive Project Algorithm for Liquor Detection Using Near Infrared Sensor.

    PubMed

    Qu, Fangfang; Ren, Dong; Wang, Jihua; Zhang, Zhong; Lu, Na; Meng, Lei

    2016-01-11

    Spectral analysis technique based on near infrared (NIR) sensor is a powerful tool for complex information processing and high precision recognition, and it has been widely applied to quality analysis and online inspection of agricultural products. This paper proposes a new method to address the instability of small sample sizes in the successive projections algorithm (SPA) as well as the lack of association between selected variables and the analyte. The proposed method is an evaluated bootstrap ensemble SPA method (EBSPA) based on a variable evaluation index (EI) for variable selection, and is applied to the quantitative prediction of alcohol concentrations in liquor using NIR sensor. In the experiment, the proposed EBSPA with three kinds of modeling methods are established to test their performance. In addition, the proposed EBSPA combined with partial least square is compared with other state-of-the-art variable selection methods. The results show that the proposed method can solve the defects of SPA and it has the best generalization performance and stability. Furthermore, the physical meaning of the selected variables from the near infrared sensor data is clear, which can effectively reduce the variables and improve their prediction accuracy.
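
    A sketch of the plain successive projections algorithm (SPA) selection step that EBSPA builds on is shown below: greedily pick the wavelength whose spectral column has the largest norm after projecting out the columns already selected. The bootstrap ensemble and evaluation index of the proposed method are not reproduced, and the spectra are synthetic.

    ```python
    # Plain SPA sketch for spectral variable selection.
    import numpy as np

    def spa(X, k, start=0):
        """X: (samples x wavelengths) matrix; returns indices of k variables."""
        selected = [start]
        for _ in range(k - 1):
            S = X[:, selected]                           # columns chosen so far
            # Projector onto the orthogonal complement of span(S).
            P = np.eye(X.shape[0]) - S @ np.linalg.pinv(S)
            residual_norms = np.linalg.norm(P @ X, axis=0)
            residual_norms[selected] = -np.inf           # never re-select
            selected.append(int(np.argmax(residual_norms)))
        return selected

    rng = np.random.default_rng(2)
    X = rng.normal(size=(30, 100))                       # toy NIR spectra matrix
    print(spa(X, k=5))
    ```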

  13. A New Continuous-Time Equality-Constrained Optimization to Avoid Singularity.

    PubMed

    Quan, Quan; Cai, Kai-Yuan

    2016-02-01

    In equality-constrained optimization, a standard regularity assumption is often associated with feasible point methods, namely, that the gradients of constraints are linearly independent. In practice, the regularity assumption may be violated. In order to avoid such a singularity, a new projection matrix is proposed based on which a feasible point method to continuous-time, equality-constrained optimization is developed. First, the equality constraint is transformed into a continuous-time dynamical system with solutions that always satisfy the equality constraint. Second, a new projection matrix without singularity is proposed to realize the transformation. An update (or say a controller) is subsequently designed to decrease the objective function along the solutions of the transformed continuous-time dynamical system. The invariance principle is then applied to analyze the behavior of the solution. Furthermore, the proposed method is modified to address cases in which solutions do not satisfy the equality constraint. Finally, the proposed optimization approach is applied to three examples to demonstrate its effectiveness.
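
    For context, the textbook projected-gradient step with the standard projection matrix P = I - J^T (J J^T)^{-1} J is sketched below. Note that it relies on exactly the full-rank (regularity) assumption that the paper's new projection matrix is designed to avoid, so this is the baseline rather than the proposed method; the toy problem is illustrative.

    ```python
    # Textbook projected-gradient step for min f(x) s.t. h(x) = 0 using the
    # standard tangent-space projection (requires a full-rank constraint Jacobian).
    import numpy as np

    def projected_gradient_step(x, grad_f, jac_h, step=0.1):
        g = grad_f(x)
        J = jac_h(x)                                    # (n_constraints x n_vars)
        P = np.eye(len(x)) - J.T @ np.linalg.solve(J @ J.T, J)
        return x - step * (P @ g)                       # move within the tangent space

    # Toy problem: minimize ||x||^2 subject to x0 + x1 = 1.
    grad_f = lambda x: 2 * x
    jac_h = lambda x: np.array([[1.0, 1.0]])
    x = np.array([2.0, -1.0])                           # start on the constraint
    for _ in range(100):
        x = projected_gradient_step(x, grad_f, jac_h)
    print(np.round(x, 3))    # converges to [0.5, 0.5] along the constraint line
    ```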

  14. Evaluating the effect of a third-party implementation of resolution recovery on the quality of SPECT bone scan imaging using visual grading regression.

    PubMed

    Hay, Peter D; Smith, Julie; O'Connor, Richard A

    2016-02-01

    The aim of this study was to evaluate the benefits to SPECT bone scan image quality when applying resolution recovery (RR) during image reconstruction using software provided by a third-party supplier. Bone SPECT data from 90 clinical studies were reconstructed retrospectively using software supplied independent of the gamma camera manufacturer. The current clinical datasets contain 120×10 s projections and are reconstructed using an iterative method with a Butterworth postfilter. Five further reconstructions were created with the following characteristics: 10 s projections with a Butterworth postfilter (to assess intraobserver variation); 10 s projections with a Gaussian postfilter with and without RR; and 5 s projections with a Gaussian postfilter with and without RR. Two expert observers were asked to rate image quality on a five-point scale relative to our current clinical reconstruction. Datasets were anonymized and presented in random order. The benefits of RR on image scores were evaluated using ordinal logistic regression (visual grading regression). The application of RR during reconstruction increased the probability of both observers of scoring image quality as better than the current clinical reconstruction even where the dataset contained half the normal counts. Type of reconstruction and observer were both statistically significant variables in the ordinal logistic regression model. Visual grading regression was found to be a useful method for validating the local introduction of technological developments in nuclear medicine imaging. RR, as implemented by the independent software supplier, improved bone SPECT image quality when applied during image reconstruction. In the majority of clinical cases, acquisition times for bone SPECT intended for the purposes of localization can safely be halved (from 10 s projections to 5 s) when RR is applied.

  15. Seal coat research project

    DOT National Transportation Integrated Search

    1999-12-01

    This study evaluates the use of seal coating as a method to protect bituminous pavements from oxidation, water infiltration, and raveling. The Minnesota Department of Transportation (Mn/DOT) applied seal coating to a roadway segment of Trunk Highway ...

  16. Coordinated development of leading biomass pretreatment technologies.

    PubMed

    Wyman, Charles E; Dale, Bruce E; Elander, Richard T; Holtzapple, Mark; Ladisch, Michael R; Lee, Y Y

    2005-12-01

    For the first time, a single source of cellulosic biomass was pretreated by leading technologies using identical analytical methods to provide comparative performance data. In particular, ammonia explosion, aqueous ammonia recycle, controlled pH, dilute acid, flowthrough, and lime approaches were applied to prepare corn stover for subsequent biological conversion to sugars through a Biomass Refining Consortium for Applied Fundamentals and Innovation (CAFI) among Auburn University, Dartmouth College, Michigan State University, the National Renewable Energy Laboratory, Purdue University, and Texas A&M University. An Agricultural and Industrial Advisory Board provided guidance to the project. Pretreatment conditions were selected based on the extensive experience of the team with each of the technologies, and the resulting fluid and solid streams were characterized using standard methods. The data were used to close material balances, and energy balances were estimated for all processes. The digestibilities of the solids by a controlled supply of cellulase enzyme and the fermentability of the liquids were also assessed and used to guide selection of optimum pretreatment conditions. Economic assessments were applied based on the performance data to estimate each pretreatment cost on a consistent basis. Through this approach, comparative data were developed on sugar recovery from hemicellulose and cellulose by the combined pretreatment and enzymatic hydrolysis operations when applied to corn stover. This paper introduces the project and summarizes the shared methods for papers reporting results of this research in this special edition of Bioresource Technology.

  17. [Tobacco quality analysis of producing areas of Yunnan tobacco using near-infrared (NIR) spectrum].

    PubMed

    Wang, Yi; Ma, Xiang; Wen, Ya-Dong; Yu, Chun-Xia; Wang, Luo-Ping; Zhao, Long-Lian; Li, Jun-Hui

    2013-01-01

    In the present study, tobacco quality analysis of different producing areas was carried out by applying spectrum projection and correlation methods. The industrial classification data were near-infrared (NIR) spectra, collected in 2010, of leaves from the middle part of the tobacco plant supplied by Hongta Tobacco (Group) Co., Ltd. A total of 1276 superior tobacco leaf samples were collected from four producing areas; three areas (Yuxi, Chuxiong and Zhaotong, in Yunnan province) grow the tobacco variety K326, and one area (Dali) grows the variety Hongda. When the samples were randomly divided in a 2:1 ratio into analysis and verification sets, the verification set corresponded with the analysis set under spectrum projection, because their correlation coefficients on the first and second projection dimensions were all above 0.99. The study also discussed a method to obtain quantitative similarity values among samples from different producing areas. The similarity values are instructive for tobacco planting planning, quality management, acquisition of raw tobacco materials and tobacco leaf blending.

  18. Defining Instructional Quality by Employing the Total Quality Management (TQM) Method: A Research Project.

    ERIC Educational Resources Information Center

    Croker, Robert E.; And Others

    The feasibility of using W. E. Deming's total quality management (TQM) method to define instructional quality was examined by surveying three groups of students attending Idaho State University's College of Education and School of Applied Technology: 31 students seeking cosmetology certification; 75 undergraduates pursuing degrees in corporate…

  19. 76 FR 70954 - Idaho Panhandle National Forests, Idaho; Idaho Panhandle National Forest Noxious Weed Treatment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-16

    ... registered herbicides is one of the various treatment methods that are proposed. The overall project goal is... insects; and herbicides that target specific invasive plant species. The application of herbicides would... spraying would be the primary method of applying herbicide in order to target individual and groups of...

  20. A Brain-Computer Interface Project Applied in Computer Engineering

    ERIC Educational Resources Information Center

    Katona, Jozsef; Kovari, Attila

    2016-01-01

    Keeping up with novel methods and keeping abreast of new applications are crucial issues in engineering education. In brain research, one of the most significant research areas in recent decades, many developments have application in both modern engineering technology and education. New measurement methods in the observation of brain activity open…

  1. Developing Health Indicators for People with Intellectual Disabilities. The Method of the Pomona Project

    ERIC Educational Resources Information Center

    van Schrojenstein Lantman-de Valk, H.; Linehan, C.; Kerr, M.; Noonan-Walsh, P.

    2007-01-01

    Aim: Recently, attention has focused on the health inequalities experienced by people with intellectual disabilities (ID) when compared with the general population. To inform policies aimed at equalizing health opportunities, comparable evidence is needed about the aspects of their health that may be amenable to intervention. Method: Applying the…

  2. ACCESS 3. Approximation concepts code for efficient structural synthesis: User's guide

    NASA Technical Reports Server (NTRS)

    Fleury, C.; Schmit, L. A., Jr.

    1980-01-01

    A user's guide is presented for ACCESS-3, a research oriented program which combines dual methods and a collection of approximation concepts to achieve excellent efficiency in structural synthesis. The finite element method is used for structural analysis and dual algorithms of mathematical programming are applied in the design optimization procedure. This program retains all of the ACCESS-2 capabilities and the data preparation formats are fully compatible. Four distinct optimizer options were added: interior point penalty function method (NEWSUMT); second order primal projection method (PRIMAL2); second order Newton-type dual method (DUAL2); and first order gradient projection-type dual method (DUAL1). A pure discrete and mixed continuous-discrete design variable capability, and zero order approximation of the stress constraints are also included.

  3. 34 CFR 664.5 - What definitions apply to the Fulbright-Hays Group Projects Abroad Program?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Projects Abroad Program? 664.5 Section 664.5 Education Regulations of the Offices of the Department of... PROJECTS ABROAD PROGRAM General § 664.5 What definitions apply to the Fulbright-Hays Group Projects Abroad... apply to this program: The following definitions apply to the Fulbright-Hays Group Projects Abroad...

  4. 34 CFR 664.5 - What definitions apply to the Fulbright-Hays Group Projects Abroad Program?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Projects Abroad Program? 664.5 Section 664.5 Education Regulations of the Offices of the Department of... PROJECTS ABROAD PROGRAM General § 664.5 What definitions apply to the Fulbright-Hays Group Projects Abroad... apply to this program: The following definitions apply to the Fulbright-Hays Group Projects Abroad...

  5. 34 CFR 664.5 - What definitions apply to the Fulbright-Hays Group Projects Abroad Program?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Projects Abroad Program? 664.5 Section 664.5 Education Regulations of the Offices of the Department of... PROJECTS ABROAD PROGRAM General § 664.5 What definitions apply to the Fulbright-Hays Group Projects Abroad... apply to this program: The following definitions apply to the Fulbright-Hays Group Projects Abroad...

  6. 34 CFR 664.5 - What definitions apply to the Fulbright-Hays Group Projects Abroad Program?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Projects Abroad Program? 664.5 Section 664.5 Education Regulations of the Offices of the Department of... PROJECTS ABROAD PROGRAM General § 664.5 What definitions apply to the Fulbright-Hays Group Projects Abroad... apply to this program: The following definitions apply to the Fulbright-Hays Group Projects Abroad...

  7. 34 CFR 664.5 - What definitions apply to the Fulbright-Hays Group Projects Abroad Program?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Projects Abroad Program? 664.5 Section 664.5 Education Regulations of the Offices of the Department of... PROJECTS ABROAD PROGRAM General § 664.5 What definitions apply to the Fulbright-Hays Group Projects Abroad... apply to this program: The following definitions apply to the Fulbright-Hays Group Projects Abroad...

  8. The effectiveness of Microsoft Project in assessing extension of time under PAM 2006 standard form of contract

    NASA Astrophysics Data System (ADS)

    Suhaida, S. K.; Wong, Z. D.

    2017-11-01

    Time equals money, and this applies in the construction industry, where time is very important. Most standard forms of contract provide contractual clauses to ascertain time and money in such scenarios, and Extension of Time (EOT) is one of them. Under certain circumstances and delays, the contractor is allowed to apply for an EOT in order to complete the works at a later completion date without Liquidated Damages (LD) being imposed on the claimant. However, both claimants and assessors encounter problems in assessing EOT. The aim of this research is to recommend the use of Microsoft Project as a tool for assessing EOT under the standard form of contract PAM 2006. A quantitative method was applied to respondents consisting of architects and quantity surveyors (QS) in order to collect data on the challenges in assessing EOT claims and on the effectiveness of Microsoft Project as a tool. The findings of this research highlight that Microsoft Project can serve as a basis for performing EOT tasks, as the software can be used as a data bank to store information that is crucial for preparing and evaluating EOT claims.

  9. Method for 3D profilometry measurement based on contouring moire fringe

    NASA Astrophysics Data System (ADS)

    Shi, Zhiwei; Lin, Juhua

    2007-12-01

    3D shape measurement has recently been one of the most active branches of optical research. This paper presents a method of 3D profilometry that combines the moire projection method with phase-shifting technology under SCM (Single Chip Microcomputer) control. Automatic measurement of 3D surface profiles can be carried out by applying this method with high speed and high precision.
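
    A sketch of the standard four-step phase-shifting calculation that such systems use to recover the wrapped phase from projected fringes is given below; the data are synthetic and the SCM-controlled moire hardware of the paper is not modeled.

    ```python
    # Standard four-step phase-shifting: with fringe images shifted by 0, 90, 180
    # and 270 degrees, the wrapped phase follows from a simple arctangent.
    import numpy as np

    def wrapped_phase(I0, I1, I2, I3):
        return np.arctan2(I3 - I1, I0 - I2)     # wrapped to (-pi, pi]

    # Synthetic check against a known phase map.
    x = np.linspace(0, 4 * np.pi, 256)
    phi_true = 0.8 * np.sin(x / 3.0)
    shifts = [0, np.pi / 2, np.pi, 3 * np.pi / 2]
    frames = [1.0 + 0.5 * np.cos(phi_true + d) for d in shifts]
    print(np.allclose(wrapped_phase(*frames), phi_true))   # True
    ```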

  10. Comparing Networks from a Data Analysis Perspective

    NASA Astrophysics Data System (ADS)

    Li, Wei; Yang, Jing-Yu

    Two predominant ways of comparing networks to probe their characteristics are global property statistics and subgraph enumeration. However, they suffer from limited information and computationally exhaustive enumeration. Here, we present an approach to compare networks from the perspective of data analysis. Initially, the approach projects each node of the original network as a high-dimensional data point, so the network is seen as a cloud of data points. The dispersion information of the principal component analysis (PCA) projection of the generated data clouds can then be used to distinguish networks. We applied this node projection method to yeast protein-protein interaction networks and Internet Autonomous System networks, two types of networks with several similar higher-order properties. The method can efficiently distinguish one from the other. Identical results on different datasets from independent sources also indicate that the method is a robust and universal framework.
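
    A minimal sketch of the node-projection idea is shown below, assuming adjacency rows as the data points and the PCA explained-variance spectrum as the comparison signature; the exact dispersion statistic used by the authors may differ, and the example networks are synthetic.

    ```python
    # Node-projection sketch: each node's adjacency row is a data point; networks
    # are compared through the dispersion (explained-variance spectrum) of PCA.
    import numpy as np

    def pca_spectrum(adj, k=5):
        X = adj - adj.mean(axis=0)                  # center the node "data points"
        cov = X.T @ X / (len(X) - 1)
        eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]
        return eigvals[:k] / eigvals.sum()          # leading explained-variance ratios

    rng = np.random.default_rng(3)
    er = (rng.random((200, 200)) < 0.05).astype(float)       # Erdos-Renyi-like
    er = np.triu(er, 1); er = er + er.T
    hub = np.zeros((200, 200)); hub[0, 1:] = hub[1:, 0] = 1  # star-like network
    print(pca_spectrum(er), pca_spectrum(hub))               # clearly different spectra
    ```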

  11. Note: An online testing method for lifetime projection of high power light-emitting diode under accelerated reliability test.

    PubMed

    Chen, Qi; Chen, Quan; Luo, Xiaobing

    2014-09-01

    In recent years, due to the fast development of high power light-emitting diodes (LEDs), their lifetime prediction and assessment have become a crucial issue. Although in situ measurement has been widely used for reliability testing in the laser diode community, it has not been applied commonly in the LED community. In this paper, an online testing method for LED lifetime projection under accelerated reliability testing is proposed and a prototype was built. Optical parametric data were collected. The systematic error and the measuring uncertainty were calculated to be within 0.2% and within 2%, respectively. With this online testing method, experimental data can be acquired continuously and a sufficient amount of data can be gathered. Thus, the projection fitting accuracy can be improved (r² = 0.954) and the testing duration can be shortened.
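
    As an illustration of lifetime projection from lumen-maintenance data, the sketch below fits an exponential decay model and extrapolates to an L70 threshold (in the spirit of TM-21-style projection); the model, sample data and threshold are assumptions, not the paper's measurements.

    ```python
    # Exponential lumen-maintenance fit and L70 lifetime projection (illustrative).
    import numpy as np
    from scipy.optimize import curve_fit

    hours = np.array([0, 500, 1000, 2000, 3000, 4000, 5000, 6000], dtype=float)
    lumen = np.array([1.00, 0.99, 0.985, 0.97, 0.96, 0.945, 0.935, 0.925])

    model = lambda t, B, alpha: B * np.exp(-alpha * t)   # normalized luminous flux
    (B, alpha), _ = curve_fit(model, hours, lumen, p0=(1.0, 1e-5))

    L70 = np.log(B / 0.70) / alpha                       # hours until 70% maintenance
    print(f"alpha = {alpha:.2e} per hour, projected L70 ~ {L70:,.0f} h")
    ```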

  12. Project JOVE. [microgravity experiments and applications

    NASA Technical Reports Server (NTRS)

    Lyell, M. J.

    1994-01-01

    The goal of this project is to investigate new areas of research pertaining to free surface-interface fluids mechanics and/or microgravity which have potential commercial applications. This paper presents an introduction to ferrohydrodynamics (FHD), and discusses some applications. Also, computational methods for solving free surface flow problems are presented in detail. Both have diverse applications in industry and in microgravity fluids applications. Three different modeling schemes for FHD flows are addressed and the governing equations, including Maxwell's equations, are introduced. In the area of computational modeling of free surface flows, both Eulerian and Lagrangian schemes are discussed. The state of the art in computational methods applied to free surface flows is elucidated. In particular, adaptive grids and re-zoning methods are discussed. Additional research results are addressed and copies of the publications produced under the JOVE Project are included.

  13. Decision problems in management of construction projects

    NASA Astrophysics Data System (ADS)

    Szafranko, E.

    2017-10-01

    In a construction business, one must oftentimes make decisions during all stages of a building process, from planning a new construction project through its execution to the stage of using the finished structure. As a rule, the decision making process is complicated by certain conditions specific to civil engineering. With such diverse decision situations, it is recommended to apply various decision making support methods. Both literature and hands-on experience suggest several methods based on analytical and computational procedures, some less and some more complex. This article presents the methods which can be helpful in supporting decision making processes in the management of civil engineering projects. These are multi-criteria methods, such as MCE, AHP or indicator methods. Because the methods have different advantages and disadvantages, and decision situations have their own specific nature, a brief summary of the methods alongside some recommendations regarding their practical application is given at the end of the paper. The main aim of this article is to review the methods of decision support and analyze their possible use in the construction industry.
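
    As one concrete example of the multi-criteria methods mentioned above, a standard AHP weight calculation (principal eigenvector of a pairwise comparison matrix, with a consistency check) is sketched below, as it is often applied to ranking construction alternatives; the comparison values are illustrative.

    ```python
    # Standard AHP priority weights from a pairwise comparison matrix.
    import numpy as np

    def ahp_weights(M):
        vals, vecs = np.linalg.eig(M)
        i = np.argmax(vals.real)
        w = np.abs(vecs[:, i].real)
        w = w / w.sum()                                # normalized priority weights
        n = M.shape[0]
        ci = (vals.real[i] - n) / (n - 1)              # consistency index
        ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)   # tabulated random index
        return w, ci / ri                              # weights and consistency ratio

    # Three criteria compared pairwise on Saaty's 1-9 scale (illustrative values).
    M = np.array([[1.0, 3.0, 5.0],
                  [1/3., 1.0, 2.0],
                  [1/5., 1/2., 1.0]])
    w, cr = ahp_weights(M)
    print(np.round(w, 3), round(cr, 3))   # CR < 0.1 indicates acceptable consistency
    ```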

  14. Cost-benefit evaluation of a decentralized water system for wastewater reuse and environmental protection.

    PubMed

    Chen, R; Wang, X C

    2009-01-01

    This paper proposed a net benefit value (NBV) model for cost-benefit evaluation of wastewater treatment and reuse projects, with attention mainly paid to decentralized systems, which are drawing wide interest all over the world, especially in water-deficient countries and regions. In the NBV model, all the factors related to project costs are monetary ones which can be calculated using traditional methods, while many of the factors related to project benefits are non-monetary ones which need sophisticated methods for monetization. In this regard, the authors elaborated several methods for monetization of the benefits from wastewater discharge reduction, local environment improvement, and human health protection. The proposed model and methods were applied to the cost-benefit evaluation of a decentralized water reclamation and reuse project in a newly developed residential area in Xi'an, China. The system with dual-pipe collection and grey water treatment and reuse was found to be economically eligible (NBV > 0) when all the treated water is reused for artificial pond replenishment, gardening and other non-potable purposes, taking into account the benefit of water saving. When environmental benefits are further considered, the economic advantage of the project is even more significant.

  15. On experimental damage localization by SP2E: Application of H∞ estimation and oblique projections

    NASA Astrophysics Data System (ADS)

    Lenzen, Armin; Vollmering, Max

    2018-05-01

    In this article experimental damage localization based on H∞ estimation and state projection estimation error (SP2E) is studied. Based on an introduced difference process, a state space representation is derived for advantageous numerical solvability. Because real structural excitations are presumed to be unknown, a general input is applied therein, which allows synchronization and normalization. Furthermore, state projections are introduced to enhance damage identification. While first experiments to verify method SP2E have already been conducted and published, further laboratory results are analyzed here. Therefore, SP2E is used to experimentally localize stiffness degradations and mass alterations. Furthermore, the influence of projection techniques is analyzed. In summary, method SP2E is able to localize structural alterations, which has been observed by results of laboratory experiments.

  16. A Framework Applied Three Ways: Responsive Methods of Co-Developing and Implementing Community Science Solutions for Local Impact

    NASA Astrophysics Data System (ADS)

    Goodwin, M.; Pandya, R.; Udu-gama, N.; Wilkins, S.

    2017-12-01

    While one-size-fits-all may work for most hats, it rarely does for communities. Research products, methods and knowledge may be usable at a local scale, but applying them often presents a challenge due to issues like availability, accessibility, awareness, lack of trust, and time. However, in an environment with diminishing federal investment in issues related to climate change, natural hazards, and natural resource use and management, the ability of communities to access and leverage science has never been more urgent. Established, yet responsive frameworks and methods can help scientists and communities work together to identify and address specific challenges and leverage science to make a local impact. Through the launch of over 50 community science projects since 2013, the Thriving Earth Exchange (TEX) has created a living framework consisting of a set of milestones by which teams of scientists and community leaders navigate the challenges of working together. Central to the framework are context, trust, project planning and refinement, relationship management and community impact. We find that careful and respectful partnership management results in trust and an open exchange of information. Community science partnerships grounded in local priorities result in the development and exchange of stronger decision-relevant tools, resources and knowledge. This presentation will explore three methods TEX uses to apply its framework to community science partnerships: cohort-based collaboration, online dialogues, and one-on-one consultation. The choice of method should be responsive to a community's needs and working style. For example, a community may require customized support, desire the input and support of peers, or require consultation with multiple experts before deciding on a course of action. Knowing and applying the method of engagement best suited to achieve the community's objectives will ensure that the science is most effectively translated and applied.

  17. 3D image acquisition by fiber-based fringe projection

    NASA Astrophysics Data System (ADS)

    Pfeifer, Tilo; Driessen, Sascha

    2005-02-01

    In macroscopic production processes several measuring methods are used to assure the quality of 3D parts. One of the most widespread techniques is fringe projection. It is a fast and accurate method to obtain the topography of a part as a computer file which can be processed in further steps, e.g. to compare the measured part to a given CAD file. In this article it will be shown how the fringe projection method is applied to a fiber-optic system. The fringes generated by a miniaturized fringe projector (MiniRot) are first projected onto the front-end of an image guide using special optics. The image guide serves as a transmitter for the fringes in order to get them onto the surface of a micro part. A second image guide is used to observe the micro part. It is mounted at an angle relative to the illuminating image guide so that the triangulation condition is fulfilled. With a CCD camera connected to the second image guide the projected fringes are recorded, and the data are analyzed by an image processing system.

  18. REGIONAL RESEARCH, METHODS, AND SUPPORT

    EPA Science Inventory

    The Human Exposure and Atmospheric Sciences Division (HEASD) has several collaborations with regional partners through the Regional Science Program (RSP) managed by ORD's Office of Science Policy (OSP). These projects resulted from common interests outlined in the Regional Appli...

  19. Successful Application of Active Learning Techniques to Introductory Microbiology.

    ERIC Educational Resources Information Center

    Hoffman, Elizabeth A.

    2001-01-01

    Points out the low student achievement in microbiology courses and presents an active learning method applied in an introductory microbiology course which features daily quizzes, cooperative learning activities, and group projects. (Contains 30 references.) (YDS)

  20. Medulloblastoma | Office of Cancer Genomics

    Cancer.gov

    The Medulloblastoma Project was developed to apply newly emerging genomic methods towards the discovery of novel genetic alterations in medulloblastoma (MB). MB is the most common malignant brain tumor in children, accounting for approximately 20% of all pediatric brain tumors.

  1. Truncated RAP-MUSIC (TRAP-MUSIC) for MEG and EEG source localization.

    PubMed

    Mäkelä, Niko; Stenroos, Matti; Sarvas, Jukka; Ilmoniemi, Risto J

    2018-02-15

    Electrically active brain regions can be located applying MUltiple SIgnal Classification (MUSIC) on magneto- or electroencephalographic (MEG; EEG) data. We introduce a new MUSIC method, called truncated recursively-applied-and-projected MUSIC (TRAP-MUSIC). It corrects a hidden deficiency of the conventional RAP-MUSIC algorithm, which prevents estimation of the true number of brain-signal sources accurately. The correction is done by applying a sequential dimension reduction to the signal-subspace projection. We show that TRAP-MUSIC significantly improves the performance of MUSIC-type localization; in particular, it successfully and robustly locates active brain regions and estimates their number. We compare TRAP-MUSIC and RAP-MUSIC in simulations with varying key parameters, e.g., signal-to-noise ratio, correlation between source time-courses, and initial estimate for the dimension of the signal space. In addition, we validate TRAP-MUSIC with measured MEG data. We suggest that with the proposed TRAP-MUSIC method, MUSIC-type localization could become more reliable and suitable for various online and offline MEG and EEG applications. Copyright © 2017 Elsevier Inc. All rights reserved.

  2. Applying the min-projection strategy to improve the transient performance of the three-phase grid-connected inverter.

    PubMed

    Baygi, Mahdi Oloumi; Ghazi, Reza; Monfared, Mohammad

    2014-07-01

    Applying the min-projection strategy (MPS) to a three-phase grid-connected inverter to improve its transient performance is the main objective of this paper. For this purpose, the inverter is first modeled as a switched linear system. Then, the feasibility of the MPS technique is investigated and the stability criterion is derived. Hereafter, the fundamental equations of the MPS for the control of the inverter are obtained. The proposed scheme is simulated in PSCAD/EMTDC environment. The validity of the MPS approach is confirmed by comparing the obtained results with those of VOC method. The results demonstrate that the proposed method despite its simplicity provides an excellent transient performance, fully decoupled control of active and reactive powers, acceptable THD level and a reasonable switching frequency. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
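
    A generic sketch of a min-projection switching law is given below, assuming toy subsystem matrices and an identity Lyapunov-like weight: at each step the mode minimizing x^T P A_i x (the steepest instantaneous decrease of V = x^T P x) is selected. The inverter model, controller design and stability proof of the paper are not reproduced.

    ```python
    # Generic min-projection switching for a switched linear system x' = A_sigma x.
    import numpy as np

    A_modes = [np.array([[0.0, 1.0], [-2.0, -0.2]]),
               np.array([[-0.2, 2.0], [-1.0, 0.0]])]   # illustrative toy modes
    P = np.eye(2)                                       # Lyapunov-like weight (assumed)

    def step(x, dt=1e-3):
        i = int(np.argmin([x @ P @ (A @ x) for A in A_modes]))  # min-projection law
        return x + dt * (A_modes[i] @ x), i

    x = np.array([1.0, 0.5])
    for _ in range(20000):
        x, mode = step(x)
    print(np.linalg.norm(x))   # well below the initial norm: state driven toward the origin
    ```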

  3. Simultaneous planning of the project scheduling and material procurement problem under the presence of multiple suppliers

    NASA Astrophysics Data System (ADS)

    Tabrizi, Babak H.; Ghaderi, Seyed Farid

    2016-09-01

    Simultaneous planning of project scheduling and material procurement can improve the project execution costs. Hence, the issue has been addressed here by a mixed-integer programming model. The proposed model facilitates the procurement decisions by accounting for a number of suppliers offering a distinctive discount formula from which to purchase the required materials. It is aimed at developing schedules with the best net present value regarding the obtained benefit and costs of the project execution. A genetic algorithm is applied to deal with the problem, in addition to a modified version equipped with a variable neighbourhood search. The underlying factors of the solution methods are calibrated by the Taguchi method to obtain robust solutions. The performance of the aforementioned methods is compared for different problem sizes, in which the utilized local search proved efficient. Finally, a sensitivity analysis is carried out to check the effect of inflation on the objective function value.
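
    A small sketch of the net-present-value objective that such scheduling models maximize is shown below, with activity cash flows discounted to their finish times; the activities, cash flows and discount rate are toy values, and the genetic algorithm and procurement decisions of the paper are not reproduced.

    ```python
    # NPV of a schedule: discount each activity's cash flow to its finish period.
    import numpy as np

    def schedule_npv(finish_times, cash_flows, rate=0.01):
        """finish_times in periods; cash_flows per activity (negative = cost)."""
        t = np.asarray(finish_times, dtype=float)
        c = np.asarray(cash_flows, dtype=float)
        return float(np.sum(c / (1.0 + rate) ** t))

    # Two candidate schedules for the same activities; the delayed payoff loses value.
    print(schedule_npv([2, 5, 8], [-50, -30, 120]))
    print(schedule_npv([2, 6, 10], [-50, -30, 120]))   # later finish -> lower NPV
    ```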

  4. Case-Based Capture and Reuse of Aerospace Design Rationale

    NASA Technical Reports Server (NTRS)

    Leake, David B.

    2001-01-01

    The goal of this project was to apply artificial intelligence techniques to facilitate capture and reuse of aerospace design rationale. The project combined case-based reasoning (CBR) and concept maps (CMaps) to develop methods for capturing, organizing, and interactively accessing records of experiences encapsulating the methods and rationale underlying expert aerospace design, in order to bring the captured knowledge to bear to support future reasoning. The project's results contribute both principles and methods for effective design-aiding systems that aid capture and access of useful design knowledge. The project has been guided by the tenets that design-aiding systems must: (1) Leverage a designer's knowledge, rather than attempting to replace it; (2) Be able to reflect different designers' differing conceptualizations of the design task, and to clarify those conceptualizations to others; (3) Include capabilities to capture information both by interactive knowledge modeling and during normal use; and (4) Integrate into normal designer tasks as naturally and unobtrusive as possible.

  5. Generic precise augmented reality guiding system and its calibration method based on 3D virtual model.

    PubMed

    Liu, Miao; Yang, Shourui; Wang, Zhangying; Huang, Shujun; Liu, Yue; Niu, Zhenqi; Zhang, Xiaoxuan; Zhu, Jigui; Zhang, Zonghua

    2016-05-30

    Augmented reality systems can be applied to provide precise guidance for various kinds of manual work. The adaptability and guiding accuracy of such systems are determined by the computational model and the corresponding calibration method. In this paper, a novel type of augmented reality guiding system and the corresponding design scheme are proposed. Guided by external positioning equipment, the proposed system can achieve high relative indication accuracy in a large working space. Meanwhile, the proposed system is realized with a digital projector, and the general back projection model is derived from the geometric relationship between the digitized 3D model and the projector in free space. The corresponding calibration method is also designed for the proposed system to obtain the parameters of the projector. To validate the proposed back projection model, coordinate data collected by 3D positioning equipment are used to calculate and optimize the extrinsic parameters. The final projecting indication accuracy of the system is verified with a subpixel pattern projecting technique.

  6. Bayesian probabilistic population projections for all countries.

    PubMed

    Raftery, Adrian E; Li, Nan; Ševčíková, Hana; Gerland, Patrick; Heilig, Gerhard K

    2012-08-28

    Projections of countries' future populations, broken down by age and sex, are widely used for planning and research. They are mostly done deterministically, but there is a widespread need for probabilistic projections. We propose a bayesian method for probabilistic population projections for all countries. The total fertility rate and female and male life expectancies at birth are projected probabilistically using bayesian hierarchical models estimated via Markov chain Monte Carlo using United Nations population data for all countries. These are then converted to age-specific rates and combined with a cohort component projection model. This yields probabilistic projections of any population quantity of interest. The method is illustrated for five countries of different demographic stages, continents and sizes. The method is validated by an out of sample experiment in which data from 1950-1990 are used for estimation, and applied to predict 1990-2010. The method appears reasonably accurate and well calibrated for this period. The results suggest that the current United Nations high and low variants greatly underestimate uncertainty about the number of oldest old from about 2050 and that they underestimate uncertainty for high fertility countries and overstate uncertainty for countries that have completed the demographic transition and whose fertility has started to recover towards replacement level, mostly in Europe. The results also indicate that the potential support ratio (persons aged 20-64 per person aged 65+) will almost certainly decline dramatically in most countries over the coming decades.
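
    To illustrate the difference between probabilistic projections and deterministic high/low variants, the sketch below propagates uncertainty in a single growth rate by Monte Carlo and reports prediction intervals; the Bayesian hierarchical models, cohort-component machinery and UN data of the paper are not reproduced, and all numbers are illustrative.

    ```python
    # Toy probabilistic projection: Monte Carlo over an uncertain growth rate.
    import numpy as np

    rng = np.random.default_rng(4)
    pop0 = 10.0                                  # initial population, millions
    years, n_draws = 50, 10_000

    # Uncertain annual growth rate: mean 0.5%, sd 0.3% (illustrative numbers).
    rates = rng.normal(0.005, 0.003, size=(n_draws, years))
    trajectories = pop0 * np.cumprod(1.0 + rates, axis=1)

    final = trajectories[:, -1]
    lo, med, hi = np.percentile(final, [2.5, 50, 97.5])
    print(f"2.5%: {lo:.1f}M   median: {med:.1f}M   97.5%: {hi:.1f}M")
    ```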

  7. Manufacturing Methods and Technology Project Summary Reports

    DTIC Science & Technology

    1982-12-01

    Excerpts: "...aluminide was used to eliminate adhesive failures. A doctor blade and expandable ring segment were selected as the tooling to apply the 0.010 inch..."; "...contractual effort is to develop manufacturing technology for the production of integrally bladed impellers using titanium pre-alloyed powder and..." Listed projects include 176 7046, 17T 7046 and 177 7046 (Precision Cast Titanium Compressor Casing).

  8. Applying Affect Recognition in Serious Games: The PlayMancer Project

    NASA Astrophysics Data System (ADS)

    Ben Moussa, Maher; Magnenat-Thalmann, Nadia

    This paper presents an overview and the state of the art in applications of affect recognition in serious games for supporting patients in behavioral and mental disorder treatments and chronic pain rehabilitation, within the framework of the European project PlayMancer. Three key technologies are discussed: facial affect recognition, fusion of different affect recognition methods, and the application of affect recognition in serious games.

  9. Application of information-retrieval methods to the classification of physical data

    NASA Technical Reports Server (NTRS)

    Mamotko, Z. N.; Khorolskaya, S. K.; Shatrovskiy, L. I.

    1975-01-01

    Scientific data received from satellites are characterized as a multi-dimensional time series, whose terms are vector functions of a vector of measurement conditions. Information retrieval methods are used to construct lower dimensional samples on the basis of the condition vector, in order to obtain these data and to construct partial relations. The methods are applied to the joint Soviet-French Arkad project.

  10. Los Alamos Science: The Human Genome Project. Number 20, 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cooper, N G; Shea, N

    1992-01-01

    This article provides a broad overview of the Human Genome Project, with particular emphasis on work being done at Los Alamos. It tries to emphasize the scientific aspects of the project, compared to the more speculative information presented in the popular press. There is a brief introduction to modern genetics, including a review of classic work. There is a broad overview of the Genome Project, describing what the project is, what are some of its major five-year goals, what are major technological challenges ahead of the project, and what can the field of biology, as well as society, expect to see as benefits from this project. Specific results on the efforts directed at mapping chromosomes 16 and 5 are discussed. A brief introduction to DNA libraries is presented, bearing in mind that Los Alamos has housed such libraries for many years prior to the Genome Project. Information on efforts to do applied computational work related to the project are discussed, as well as experimental efforts to do rapid DNA sequencing by means of single-molecule detection using applied spectroscopic methods. The article introduces the Los Alamos staff which are working on the Genome Project, and concludes with brief discussions on ethical, legal, and social implications of this work; a brief glimpse of genetics as it may be practiced in the next century; and a glossary of relevant terms.

  11. Los Alamos Science: The Human Genome Project. Number 20, 1992

    DOE R&D Accomplishments Database

    Cooper, N. G.; Shea, N. eds.

    1992-01-01

    This document provides a broad overview of the Human Genome Project, with particular emphasis on work being done at Los Alamos. It tries to emphasize the scientific aspects of the project, compared to the more speculative information presented in the popular press. There is a brief introduction to modern genetics, including a review of classic work. There is a broad overview of the Genome Project, describing what the project is, what are some of its major five-year goals, what are major technological challenges ahead of the project, and what can the field of biology, as well as society expect to see as benefits from this project. Specific results on the efforts directed at mapping chromosomes 16 and 5 are discussed. A brief introduction to DNA libraries is presented, bearing in mind that Los Alamos has housed such libraries for many years prior to the Genome Project. Information on efforts to do applied computational work related to the project are discussed, as well as experimental efforts to do rapid DNA sequencing by means of single-molecule detection using applied spectroscopic methods. The article introduces the Los Alamos staff which are working on the Genome Project, and concludes with brief discussions on ethical, legal, and social implications of this work; a brief glimpse of genetics as it may be practiced in the next century; and a glossary of relevant terms.

  12. Irreducible projective representations and their physical applications

    NASA Astrophysics Data System (ADS)

    Yang, Jian; Liu, Zheng-Xin

    2018-01-01

    An eigenfunction method is applied to reduce the regular projective representations (Reps) of finite groups to obtain their irreducible projective Reps. Anti-unitary groups are treated specially, where the decoupled factor systems and modified Schur’s lemma are introduced. We discuss the applications of irreducible Reps in many-body physics. It is shown that in symmetry protected topological phases, geometric defects or symmetry defects may carry projective Rep of the symmetry group; while in symmetry enriched topological phases, intrinsic excitations (such as spinons or visons) may carry projective Rep of the symmetry group. We also discuss the applications of projective Reps in problems related to spectrum degeneracy, such as in search of models without sign problem in quantum Monte Carlo simulations.

  13. DNA: The Strand that Connects Us All

    ScienceCinema

    Kaplan, Matt [Univ. of Arizona, Tucson, AZ (United States). Genetics Core Facility

    2018-04-26

    Learn how the methods and discoveries of human population genetics are applied for personal genealogical reconstruction and anthropological testing. Dr. Kaplan starts with a short general review of human genetics and the biology behind this form of DNA testing. He looks at how DNA testing is performed and how samples are processed in the University of Arizona laboratory. He also examines examples of personal genealogical results from Family Tree DNA and personal anthropological results from the Genographic Project. Finally, he describes the newest project in the UA laboratory, the DNA Shoah Project.

  14. Risk evaluation of highway engineering project based on the fuzzy-AHP

    NASA Astrophysics Data System (ADS)

    Yang, Qian; Wei, Yajun

    2011-10-01

    Engineering projects are social activities that integrate technology, economy, management, and organization. There are uncertainties in every aspect of an engineering project, so risk management urgently needs to be strengthened. Based on an analysis of the characteristics of highway engineering and a study of the basic theory of risk evaluation, the paper built an index system for highway project risk evaluation. In addition, based on fuzzy mathematics principles, the analytic hierarchy process was applied, and a comprehensive fuzzy-AHP appraisal model was set up for the risk evaluation of expressway concessionary projects. The validity and practicability of the risk evaluation of an expressway concessionary project were verified after the model was applied to the practice of a project.

  15. Linguistic analysis of project ownership for undergraduate research experiences.

    PubMed

    Hanauer, D I; Frederick, J; Fotinakes, B; Strobel, S A

    2012-01-01

    We used computational linguistic and content analyses to explore the concept of project ownership for undergraduate research. We used linguistic analysis of student interview data to develop a quantitative methodology for assessing project ownership and applied this method to measure degrees of project ownership expressed by students in relation to different types of educational research experiences. The results of the study suggest that the design of a research experience significantly influences the degree of project ownership expressed by students when they describe those experiences. The analysis identified both positive and negative aspects of project ownership and provided a working definition for how a student experiences his or her research opportunity. These elements suggest several features that could be incorporated into an undergraduate research experience to foster a student's sense of project ownership.

  16. Application of Artificial Intelligence Techniques in Uninhabited Aerial Vehicle Flight

    NASA Technical Reports Server (NTRS)

    Dufrene, Warren R., Jr.

    2004-01-01

    This paper describes the development of an application of Artificial Intelligence (AI) for Unmanned Aerial Vehicle (UAV) control. The project was done as part of the requirements for a class in AI at Nova Southeastern University and as a beginning project at NASA Wallops Flight Facility for a resilient, robust, and intelligent UAV flight control system. A method is outlined which allows a base-level application for applying an Artificial Intelligence method, Fuzzy Logic, to aspects of Control Logic for UAV flight. One element of UAV flight, automated altitude hold, has been implemented and preliminary results displayed.

  17. Application of Artificial Intelligence Techniques in Uninhabited Aerial Vehicle Flight

    NASA Technical Reports Server (NTRS)

    Dufrene, Warren R., Jr.

    2003-01-01

    This paper describes the development of an application of Artificial Intelligence (AI) for Unmanned Aerial Vehicle (UAV) control. The project was done as part of the requirements for a class in AI at Nova Southeastern University and as a beginning project at NASA Wallops Flight Facility for a resilient, robust, and intelligent UAV flight control system. A method is outlined which allows a base-level application for applying an Artificial Intelligence method, Fuzzy Logic, to aspects of Control Logic for UAV flight. One element of UAV flight, automated altitude hold, has been implemented and preliminary results displayed.

  18. The Development of Online Tutorial Program Design Using Problem-Based Learning in Open Distance Learning System

    ERIC Educational Resources Information Center

    Said, Asnah; Syarif, Edy

    2016-01-01

    This research aimed to evaluate the design of an online tutorial program applying problem-based learning to the Research Methods course currently implemented in the Open Distance Learning (ODL) system. The students must take a Research Methods course to prepare themselves for academic writing projects. Problem-based learning basically emphasizes the process of…

  19. Applying a Mixed Method Design to Evaluate Training Seminars within an Early Childhood Education Project

    ERIC Educational Resources Information Center

    Grammatikopoulos, Vasilis; Zachopoulou, Evridiki; Tsangaridou, Niki; Liukkonen, Jarmo; Pickup, Ian

    2008-01-01

    The body of research relating to assessment in education suggests that professional developers and seminar administrators have generally paid little attention to evaluation procedures. Scholars have also been critical of evaluations which use a single data source and have favoured the use of a multiple method design to generate a complete picture…

  20. Benchmarking and Performance Measurement.

    ERIC Educational Resources Information Center

    Town, J. Stephen

    This paper defines benchmarking and its relationship to quality management, describes a project which applied the technique in a library context, and explores the relationship between performance measurement and benchmarking. Numerous benchmarking methods contain similar elements: deciding what to benchmark; identifying partners; gathering…

  1. The Stanford how things work project

    NASA Technical Reports Server (NTRS)

    Fikes, Richard; Gruber, Tom; Iwasaki, Yumi

    1994-01-01

    We provide an overview of the Stanford How Things Work (HTW) project, an ongoing integrated collection of research activities in the Knowledge Systems Laboratory at Stanford University. The project is developing technology for representing knowledge about engineered devices in a form that enables the knowledge to be used in multiple systems for multiple reasoning tasks and reasoning methods that enable the represented knowledge to be effectively applied to the performance of the core engineering task of simulating and analyzing device behavior. The central new capabilities currently being developed in the project are automated assistance with model formulation and with verification that a design for an electro-mechanical device satisfies its functional specification.

  2. Comparison of different numerical treatments for x-ray phase tomography of soft tissue from differential phase projections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pelliccia, Daniele; Vaz, Raquel; Svalbe, Imants

    X-ray imaging of soft tissues is made difficult by their low absorbance. The use of x-ray phase imaging and tomography can significantly enhance the detection of these tissues, and several approaches have been proposed to this end. Methods such as analyzer-based imaging or grating interferometry produce differential phase projections that can be used to reconstruct the 3D distribution of the sample refractive index. We report on the quantitative comparison of three different methods to obtain x-ray phase tomography with filtered back-projection from differential phase projections in the presence of noise. The three procedures represent different numerical approaches to solve the same mathematical problem, namely phase retrieval and filtered back-projection. It is found that obtaining individual phase projections and subsequently applying a conventional filtered back-projection algorithm produces the best results for noisy experimental data, when compared with other procedures based on the Hilbert transform. The algorithms are tested on simulated phantom data with added noise and the predictions are confirmed by experimental data acquired using a grating interferometer. The experiment is performed on unstained adult zebrafish, an important model organism for biomedical studies. The method optimization described here allows resolution of weak soft tissue features, such as muscle fibers.

  3. Using Remote Sensing, Geomorphology, and Soils to Map Episodic Streams in Drylands

    NASA Astrophysics Data System (ADS)

    Thibodeaux-Yost, S. N. S.

    2016-12-01

    Millions of acres of public land in the California deserts are currently being evaluated and permitted for the construction of large-scale renewable energy projects. The absence of a standard method for identifying episodic streams in arid and semi-arid (dryland) regions is a source of conflict between project developers and the government agencies responsible for conserving natural resources and permitting renewable energy projects. There is a need for a consistent, efficient, and cost-effective dryland stream delineation protocol that accurately reflects the extent and distribution of active watercourses. This thesis evaluates the stream delineation method and results used by the developer for the proposed Ridgecrest Solar Power Project on the El Paso Fan, Ridgecrest, Kern County, California. This evaluation is then compared and contrasted with results achieved using remote sensing, geomorphology, soils, and GIS analysis to identify stream presence on the site. This study's results identified 105 acres of watercourse, a value 10 times greater than that originally identified by the project developer. In addition, the applied methods provide an ecohydrologic base map to better inform project siting and potential project impact mitigation opportunities. This study concludes that remote sensing, geomorphology, and dryland soils can be used to accurately and efficiently identify episodic stream activity and the extent of watercourses in dryland environments.

  4. Projector primary-based optimization for superimposed projection mappings

    NASA Astrophysics Data System (ADS)

    Ahmed, Bilal; Lee, Jong Hun; Lee, Yong Yi; Lee, Kwan H.

    2018-01-01

    Recently, many researchers have focused on fully overlapping projections for three-dimensional (3-D) projection mapping systems, but reproducing a high-quality appearance with this technology still remains a challenge. On top of existing color compensation-based methods, much effort is still required to faithfully reproduce an appearance that is free from artifacts, colorimetric inconsistencies, and inappropriate illuminance over the 3-D projection surface. This is because overlapping projections are commonly treated as an additive-linear mixture of color, which, according to our detailed observations, is not the case. We propose a method that enables us to use high-quality appearance data measured from original objects and regenerate the same appearance by projecting optimized images using multiple projectors, ensuring that the projection-rendered results look visually close to the real object. We prepare our target appearances by photographing original objects. Then, using calibrated projector-camera pairs, we compensate for missing geometric correspondences to make our method robust against noise. The heart of our method is a target appearance-driven adaptive sampling of the projection surface, followed by a representation of overlapping projections in terms of the projector-primary response. This gives projector-primary weights that facilitate blending, and constraints are applied to the system. These samples are used to populate a light transport-based system. The system is then solved by minimizing the error, utilizing intersample overlaps to obtain the projection images in a noise-free manner. We ensure that we make the best use of available hardware resources to recreate projection-mapped appearances that look as close to the original object as possible. Our experiments show compelling results in terms of visual similarity and colorimetric error.

  5. Selection of climate change scenario data for impact modelling.

    PubMed

    Sloth Madsen, M; Maule, C Fox; MacKellar, N; Olesen, J E; Christensen, J Hesselbjerg

    2012-01-01

    Impact models investigating climate change effects on food safety often need detailed climate data. The aim of this study was to select climate change projection data for selected crop phenology and mycotoxin impact models. Using the ENSEMBLES database of climate model output, this study illustrates how the projected climate change signal of important variables such as temperature, precipitation, and relative humidity depends on the choice of the climate model. Using climate change projections from at least two different climate models is recommended to account for model uncertainty. To make the climate projections suitable for impact analysis at the local scale, a weather generator approach was adopted. As the weather generator did not treat all the necessary variables, an ad-hoc statistical method was developed to synthesise realistic values of missing variables. The method is presented in this paper, applied to relative humidity, but it could be adapted to other variables if needed.

  6. In-line phase contrast micro-CT reconstruction for biomedical specimens.

    PubMed

    Fu, Jian; Tan, Renbo

    2014-01-01

    X-ray phase contrast micro computed tomography (micro-CT) can non-destructively provide the internal structure information of soft tissues and low atomic number materials. It has become an invaluable analysis tool for biomedical specimens. Here an in-line phase contrast micro-CT reconstruction technique is reported, which consists of a projection extraction method and the conventional filtered back-projection (FBP) reconstruction algorithm. The projection extraction is implemented by applying the Fourier transform to the forward projections of in-line phase contrast micro-CT. This work comprises a numerical study of the method and its experimental verification using a biomedical specimen dataset measured at an X-ray tube source micro-CT setup. The numerical and experimental results demonstrate that the presented technique can improve the imaging contrast of biomedical specimens. It will be of interest for a wide range of in-line phase contrast micro-CT applications in medicine and biology.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mosher, J.C.; Leahy, R.M.

    A new method for source localization is described that is based on a modification of the well known multiple signal classification (MUSIC) algorithm. In classical MUSIC, the array manifold vector is projected onto an estimate of the signal subspace, but errors in the estimate can make location of multiple sources difficult. Recursively applied and projected (RAP) MUSIC uses each successively located source to form an intermediate array gain matrix, and projects both the array manifold and the signal subspace estimate into its orthogonal complement. The MUSIC projection is then performed in this reduced subspace. Using the metric of principal angles, the authors describe a general form of the RAP-MUSIC algorithm for the case of diversely polarized sources. Through a uniform linear array simulation, the authors demonstrate the improved Monte Carlo performance of RAP-MUSIC relative to MUSIC and two other sequential subspace methods, S-MUSIC and IES-MUSIC.
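
    To make the recursion concrete, the following minimal Python sketch assumes fixed-orientation sources, a precomputed array-manifold dictionary A (one column per candidate location), and an orthonormal signal-subspace estimate Us; all names are illustrative rather than taken from the report. Each pass scores candidates by subspace correlation, then projects both the manifold and the subspace estimate into the orthogonal complement of the sources located so far:

      import numpy as np

      def subspace_correlation(a, U):
          # cosine of the principal angle between vector a and the subspace span(U)
          n = np.linalg.norm(a)
          return 0.0 if n < 1e-12 else np.linalg.norm(U.conj().T @ (a / n))

      def rap_music(A, Us, n_sources):
          # A: (n_sensors, n_grid) manifold dictionary; Us: (n_sensors, r) orthonormal basis
          found = []
          P = np.eye(A.shape[0])                       # orthogonal-complement projector
          for _ in range(n_sources):
              Up, _ = np.linalg.qr(P @ Us)             # re-orthonormalise projected subspace
              scores = [subspace_correlation(P @ A[:, j], Up) for j in range(A.shape[1])]
              found.append(int(np.argmax(scores)))
              Q, _ = np.linalg.qr(A[:, found])         # gain matrix of located sources
              P = np.eye(A.shape[0]) - Q @ Q.conj().T  # project out located sources
          return found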

  8. Accelerated Optical Projection Tomography Applied to In Vivo Imaging of Zebrafish

    PubMed Central

    Correia, Teresa; Yin, Jun; Ramel, Marie-Christine; Andrews, Natalie; Katan, Matilda; Bugeon, Laurence; Dallman, Margaret J.; McGinty, James; Frankel, Paul; French, Paul M. W.; Arridge, Simon

    2015-01-01

    Optical projection tomography (OPT) provides a non-invasive 3-D imaging modality that can be applied to longitudinal studies of live disease models, including in zebrafish. Current limitations include the requirement of a minimum number of angular projections for reconstruction of reasonable OPT images using filtered back projection (FBP), which is typically several hundred, leading to acquisition times of several minutes. It is highly desirable to decrease the number of required angular projections to decrease both the total acquisition time and the light dose to the sample. This is particularly important to enable longitudinal studies, which involve measurements of the same fish at different time points. In this work, we demonstrate that the use of an iterative algorithm to reconstruct sparsely sampled OPT data sets can provide useful 3-D images with 50 or fewer projections, thereby significantly decreasing the minimum acquisition time and light dose while maintaining image quality. A transgenic zebrafish embryo with fluorescent labelling of the vasculature was imaged to acquire densely sampled (800 projections) and under-sampled data sets of transmitted and fluorescence projection images. The under-sampled OPT data sets were reconstructed using an iterative total variation-based image reconstruction algorithm and compared against FBP reconstructions of the densely sampled data sets. To illustrate the potential for quantitative analysis following rapid OPT data acquisition, a Hessian-based method was applied to automatically segment the reconstructed images to select the vasculature network. Results showed that 3-D images of the zebrafish embryo and its vasculature of sufficient visual quality for quantitative analysis can be reconstructed using the iterative algorithm from only 32 projections—achieving up to 28 times improvement in imaging speed and leading to total acquisition times of a few seconds. PMID:26308086

  9. Parallelizable 3D statistical reconstruction for C-arm tomosynthesis system

    NASA Astrophysics Data System (ADS)

    Wang, Beilei; Barner, Kenneth; Lee, Denny

    2005-04-01

    Clinical diagnosis and security detection tasks increasingly require 3D (three-dimensional) information that is difficult or impossible to obtain from 2D (two-dimensional) radiographs. As a 3D radiographic and non-destructive imaging technique, digital tomosynthesis is especially suited to cases where 3D information is required while complete projection data are not available. FBP (filtered back projection) is extensively used in industry for its speed and simplicity. However, it struggles with situations where only a limited number of projections from constrained directions are available, or where the SNR (signal-to-noise ratio) of the projections is low. In order to deal with noise and take into account a priori information about the object, a statistical image reconstruction method is described based on the acquisition model of X-ray projections. We formulate an ML (maximum likelihood) function for this model and develop an ordered-subsets iterative algorithm to estimate the unknown attenuation of the object. Simulations show that satisfactory results can be obtained after 1 to 2 iterations, after which there is no significant improvement in image quality. An adaptive Wiener filter is also applied to the reconstructed image to remove noise. Some approximations to speed up the reconstruction computation are also considered. Applying this method to computer-generated projections of a revised Shepp phantom and to true projections from diagnostic radiographs of a patient's hand and from mammography images yields reconstructions of impressive quality. Parallel programming is also implemented and tested. The quality of the reconstructed object is conserved, while the computation time is reduced by almost the number of threads used.

  10. Advanced development and calibration of the network robustness index to identify critical road network links.

    DOT National Transportation Integrated Search

    2010-05-31

    In this research project, transportation flexibility and reliability concepts are extended and applied to a new method for identifying the most critical links in a road network. Current transportation management practices typically utilize locali...

  11. Material Testing and Initial Pavement Design Modeling: Minnesota Road Research Project

    DOT National Transportation Integrated Search

    1996-09-01

    Between January 1990 and December 1994, a study verified and applied a Corps of Engineers-developed mechanistic design and evaluation method for pavements in seasonal frost areas as part of a Construction Productivity Advancement Research (CPAR) proj...

  12. Regional projection of climate impact indices over the Mediterranean region

    NASA Astrophysics Data System (ADS)

    Casanueva, Ana; Frías, M. Dolores; Herrera, Sixto; Bedia, Joaquín; San Martín, Daniel; Gutiérrez, José Manuel; Zaninovic, Ksenija

    2014-05-01

    Climate Impact Indices (CIIs) are being increasingly used in different socioeconomic sectors to transfer information about climate change impacts and risks to stakeholders. CIIs are typically based on different weather variables such as temperature, wind speed, precipitation or humidity and comprise, in a single index, the relevant meteorological information for the particular impact sector (in this study wildfires and tourism). This dependence on several climate variables poses important limitations to the application of statistical downscaling techniques, since physical consistency among variables is required in most cases to obtain reliable local projections. The present study assesses the suitability of the "direct" downscaling approach, in which the downscaling method is directly applied to the CII. In particular, for illustrative purposes, we consider two popular indices used in the wildfire and tourism sectors, the Fire Weather Index (FWI) and the Physiological Equivalent Temperature (PET), respectively. As an example, two case studies are analysed over two representative Mediterranean regions of interest for the EU CLIM-RUN project: continental Spain for the FWI and Croatia for the PET. Results obtained with this "direct" downscaling approach are similar to those found from the application of the statistical downscaling to the individual meteorological drivers prior to the index calculation ("component" downscaling) thus, a wider range of statistical downscaling methods could be used. As an illustration, future changes in both indices are projected by applying two direct statistical downscaling methods, analogs and linear regression, to the ECHAM5 model. Larger differences were found between the two direct statistical downscaling approaches than between the direct and the component approaches with a single downscaling method. While these examples focus on particular indices and Mediterranean regions of interest for CLIM-RUN stakeholders, the same study could be extended to other indices and regions.

  13. Through Their Eyes: Lessons Learned Using Participatory Methods in Health Care Quality Improvement Projects

    PubMed Central

    Balbale, Salva N.; Locatelli, Sara M.; LaVela, Sherri L.

    2016-01-01

    In this methodological article, we examine participatory methods in-depth to demonstrate how these methods can be adopted for quality improvement (QI) projects in health care. We draw on existing literature and our QI initiatives in the Department of Veterans Affairs to discuss the application of photovoice and guided tours in QI efforts. We highlight lessons learned and several benefits of using participatory methods in this area. Using participatory methods, evaluators can engage patients, providers and other stakeholders as partners to enhance care. Participant involvement helps yield actionable data that can be translated into improved care practices. Use of these methods also helps generate key insights to inform improvements that truly resonate with stakeholders. Using participatory methods is a valuable strategy to harness participant engagement and drive improvements that address individual needs. In applying these innovative methodologies, evaluators can transcend traditional approaches to uniquely support evaluations and improvements in health care. PMID:26667882

  14. Learning the scientific method using GloFish.

    PubMed

    Vick, Brianna M; Pollak, Adrianna; Welsh, Cynthia; Liang, Jennifer O

    2012-12-01

    Here we describe projects that used GloFish, brightly colored, fluorescent, transgenic zebrafish, in experiments that enabled students to carry out all steps in the scientific method. In the first project, students in an undergraduate genetics laboratory course successfully tested hypotheses about the relationships between GloFish phenotypes and genotypes using PCR, fluorescence microscopy, and test crosses. In the second and third projects, students doing independent research carried out hypothesis-driven experiments that also developed new GloFish projects for future genetics laboratory students. Brianna Vick, an undergraduate student, identified causes of the different shades of color found in orange GloFish. Adrianna Pollak, as part of a high school science fair project, characterized the fluorescence emission patterns of all of the commercially available colors of GloFish (red, orange, yellow, green, blue, and purple). The genetics laboratory students carrying out the first project found that learning new techniques and applying their knowledge of genetics were valuable. However, assessments of their learning suggest that this project was not challenging to many of the students. Thus, the independent projects will be valuable as bases to widen the scope and range of difficulty of experiments available to future genetics laboratory students.

  15. Engaging stakeholders: lessons from the use of participatory tools for improving maternal and child care health services.

    PubMed

    Ekirapa-Kiracho, Elizabeth; Ghosh, Upasona; Brahmachari, Rittika; Paina, Ligia

    2017-12-28

    Effective stakeholder engagement in research and implementation is important for improving the development and implementation of policies and programmes. A varied number of tools have been employed for stakeholder engagement. In this paper, we discuss two participatory methods for engaging with stakeholders - participatory social network analysis (PSNA) and participatory impact pathways analysis (PIPA). Based on our experience, we derive lessons about when and how to apply these tools. This paper was informed by a review of project reports and documents in addition to reflection meetings with the researchers who applied the tools. These reports were synthesised and used to make thick descriptions of the applications of the methods while highlighting key lessons. PSNA and PIPA both allowed a deep understanding of how the system actors are interconnected and how they influence maternal health and maternal healthcare services. The findings from the PSNA provided guidance on how stakeholders of a health system are interconnected and how they can stimulate more positive interaction between the stakeholders by exposing existing gaps. The PIPA meeting enabled the participants to envision how they could expand their networks and resources by mentally thinking about the contributions that they could make to the project. The processes that were considered critical for successful application of the tools and achievement of outcomes included training of facilitators, language used during the facilitation, the number of times the tool is applied, length of the tools, pretesting of the tools, and use of quantitative and qualitative methods. Whereas both tools allowed the identification of stakeholders and provided a deeper understanding of the type of networks and dynamics within the network, PIPA had a higher potential for promoting collaboration between stakeholders, likely due to allowing interaction between them. Additionally, it was implemented within a participatory action research project. PIPA also allowed participatory evaluation of the project from the perspective of the community. This paper provides lessons about the use of these participatory tools.

  16. A method to deconvolve stellar rotational velocities II. The probability distribution function via Tikhonov regularization

    NASA Astrophysics Data System (ADS)

    Christen, Alejandra; Escarate, Pedro; Curé, Michel; Rial, Diego F.; Cassetti, Julia

    2016-10-01

    Aims: Knowing the distribution of stellar rotational velocities is essential for understanding stellar evolution. Because we measure the projected rotational speed v sin i, we need to solve an ill-posed problem given by a Fredholm integral of the first kind to recover the "true" rotational velocity distribution. Methods: After discretization of the Fredholm integral we apply the Tikhonov regularization method to obtain directly the probability distribution function for stellar rotational velocities. We propose a simple and straightforward procedure to determine the Tikhonov parameter. We applied Monte Carlo simulations to prove that the Tikhonov method is a consistent estimator and asymptotically unbiased. Results: This method is applied to a sample of cluster stars. We obtain confidence intervals using a bootstrap method. Our results are in close agreement with those obtained using the Lucy method for recovering the probability density distribution of rotational velocities. Furthermore, the Lucy estimate lies inside our confidence interval. Conclusions: Tikhonov regularization is a highly robust method that deconvolves the rotational velocity probability density function from a sample of v sin i data directly, without the need for any convergence criteria.
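
    In discretized form, the Tikhonov step described above reduces to a standard penalized least-squares problem (notation here is illustrative, not the authors'):

      \hat{p}_{\lambda} = \arg\min_{p} \, \| K p - y \|^{2} + \lambda \| L p \|^{2} = (K^{T} K + \lambda L^{T} L)^{-1} K^{T} y,

    where K is the discretized Fredholm kernel relating the underlying rotational-velocity distribution p to the observed v sin i distribution y, L is a regularization operator (often the identity or a derivative operator), and \lambda is the Tikhonov parameter whose choice the paper addresses.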

  17. Impact of Requirements Quality on Project Success or Failure

    NASA Astrophysics Data System (ADS)

    Tamai, Tetsuo; Kamata, Mayumi Itakura

    We are interested in the relationship between the quality of the requirements specifications for software projects and the subsequent outcome of the projects. To examine this relationship, we investigated 32 projects started and completed between 2003 and 2005 by the software development division of a large company in Tokyo. The company has collected reliable data on requirements specification quality, as evaluated by software quality assurance teams, and overall project performance data relating to cost and time overruns. The data for requirements specification quality were first converted into a multiple-dimensional space, with each dimension corresponding to an item of the recommended structure for software requirements specifications (SRS) defined in IEEE Std. 830-1998. We applied various statistical analysis methods to the SRS quality data and project outcomes.

  18. Research on the raw data processing method of the hydropower construction project

    NASA Astrophysics Data System (ADS)

    Tian, Zhichao

    2018-01-01

    Based on the characteristics of fixed quota data, this paper compares various mathematical-statistical analysis methods and selects the improved Grubbs criterion to analyze the data, examining through this analysis which data the processing method does not suit. It is shown that this method can be applied to the processing of fixed raw data. The paper provides a reference for reasonably determining effective quota analysis data.
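
    The paper's improved criterion is not reproduced here, but as an orientation point the classical two-sided Grubbs test, on which it builds, can be sketched in Python as follows (names and the significance level are illustrative):

      import numpy as np
      from scipy import stats

      def grubbs_most_extreme(x, alpha=0.05):
          """Classical two-sided Grubbs test: return the index of the most extreme
          value if it is an outlier at significance level alpha, else None (n >= 3)."""
          x = np.asarray(x, dtype=float)
          n = x.size
          G = np.max(np.abs(x - x.mean())) / x.std(ddof=1)
          t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
          G_crit = ((n - 1) / np.sqrt(n)) * np.sqrt(t**2 / (n - 2 + t**2))
          return int(np.argmax(np.abs(x - x.mean()))) if G > G_crit else None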

  19. Bias-correction of CORDEX-MENA projections using the Distribution Based Scaling method

    NASA Astrophysics Data System (ADS)

    Bosshard, Thomas; Yang, Wei; Sjökvist, Elin; Arheimer, Berit; Graham, L. Phil

    2014-05-01

    Within the Regional Initiative for the Assessment of the Impact of Climate Change on Water Resources and Socio-Economic Vulnerability in the Arab Region (RICCAR), led by UN ESCWA, CORDEX RCM projections for the Middle East and North Africa (MENA) domain are used to drive hydrological impact models. Bias-correction of the newly available CORDEX-MENA projections is a central part of this project. In this study, the distribution based scaling (DBS) method has been applied to 6 regional climate model projections driven by 2 RCP emission scenarios. The DBS method uses a quantile mapping approach and features a conditional temperature correction dependent on the wet/dry state in the climate model data. The CORDEX-MENA domain is particularly challenging for bias-correction as it spans very diverse climates showing pronounced dry and wet seasons. Results show that the regional climate models simulate too low temperatures and often have a displaced rainfall band compared to WATCH ERA-Interim forcing data in the reference period 1979-2008. DBS is able to correct the temperature biases as well as some aspects of the precipitation biases. Special focus is given to the analysis of the influence of the dry-frequency bias (i.e. climate models simulating too few rain days) on the bias-corrected projections and on the modification of the climate change signal by the DBS method.
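
    DBS itself fits parametric distributions to the reference data (and conditions the temperature correction on the wet/dry state), but its core is a quantile mapping. A minimal empirical sketch of that core, with illustrative names and no claim to match the operational implementation, is:

      import numpy as np

      def quantile_map(model_ref, obs_ref, model_scen):
          # map each scenario value through the model-reference CDF onto the
          # observed-reference quantiles (empirical quantile mapping)
          model_ref = np.sort(np.asarray(model_ref, dtype=float))
          p = np.searchsorted(model_ref, model_scen, side="right") / model_ref.size
          return np.quantile(np.sort(obs_ref), np.clip(p, 0.0, 1.0))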

  20. Design for Usability; practice-oriented research for user-centered product design.

    PubMed

    van Eijk, Daan; van Kuijk, Jasper; Hoolhorst, Frederik; Kim, Chajoong; Harkema, Christelle; Dorrestijn, Steven

    2012-01-01

    The Design for Usability project aims at improving the usability of electronic professional and consumer products by creating new methodology and methods for user-centred product development, which are feasible to apply in practice. The project was focused on 5 key areas: (i) design methodology, expanding the existing approach of scenario-based design to incorporate the interaction between product design, user characteristics, and user behaviour; (ii) company processes, barriers and enablers for usability in practice; (iii) user characteristics in relation to types of products and use-situations; (iv) usability decision-making; and (v) product impact on user behaviour. The project team developed methods and techniques in each of these areas to support the design of products with a high level of usability. This paper brings together and summarizes the findings.

  1. An efficient variable projection formulation for separable nonlinear least squares problems.

    PubMed

    Gan, Min; Li, Han-Xiong

    2014-05-01

    We consider in this paper a class of nonlinear least squares problems in which the model can be represented as a linear combination of nonlinear functions. The variable projection algorithm projects the linear parameters out of the problem, leaving a nonlinear least squares problem involving only the nonlinear parameters. To implement the variable projection algorithm more efficiently, we propose a new variable projection functional based on matrix decomposition. The advantage of the proposed formulation is that the size of the decomposed matrix may be much smaller than those of previous ones. The Levenberg-Marquardt algorithm, using the finite difference method, is then applied to minimize the new criterion. Numerical results show that the proposed approach achieves a significant reduction in computing time.
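
    A minimal sketch of the variable projection idea (not the authors' matrix-decomposition variant) is shown below for a separable two-exponential model; the basis, data, and parameter names are illustrative. The linear coefficients are eliminated by a least-squares solve inside the residual, and only the nonlinear parameters are passed to a Levenberg-Marquardt solver:

      import numpy as np
      from scipy.optimize import least_squares

      def phi(t, alpha):
          # nonlinear basis: two decaying exponentials (illustrative choice)
          return np.column_stack([np.exp(-alpha[0] * t), np.exp(-alpha[1] * t)])

      def varpro_residual(alpha, t, y):
          A = phi(t, alpha)
          c, *_ = np.linalg.lstsq(A, y, rcond=None)   # project linear parameters out
          return A @ c - y

      t = np.linspace(0.0, 4.0, 200)
      y = 2.0 * np.exp(-1.3 * t) + 0.7 * np.exp(-0.4 * t)
      fit = least_squares(varpro_residual, x0=[1.0, 0.1], args=(t, y), method="lm")
      print(fit.x)   # nonlinear parameters, approximately {1.3, 0.4} up to ordering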

  2. Comparison of extended field-of-view reconstructions in C-arm flat-detector CT using patient size, shape or attenuation information.

    PubMed

    Kolditz, Daniel; Meyer, Michael; Kyriakou, Yiannis; Kalender, Willi A

    2011-01-07

    In C-arm-based flat-detector computed tomography (FDCT) it frequently happens that the patient exceeds the scan field of view (SFOV) in the transaxial direction because of the limited detector size. This results in data truncation and CT image artefacts. In this work three truncation correction approaches for extended field-of-view (EFOV) reconstructions have been implemented and evaluated. An FDCT-based method estimates the patient size and shape from the truncated projections by fitting an elliptical model to the raw data in order to apply an extrapolation. In a camera-based approach the patient is sampled with an optical tracking system and this information is used to apply an extrapolation. In a CT-based method the projections are completed by artificial projection data obtained from the CT data acquired in an earlier exam. For all methods the extended projections are filtered and backprojected with a standard Feldkamp-type algorithm. Quantitative evaluations have been performed by simulations of voxelized phantoms on the basis of the root mean square deviation and a quality factor Q (Q = 1 represents the ideal correction). Measurements with a C-arm FDCT system have been used to validate the simulations and to investigate the practical applicability using anthropomorphic phantoms which caused truncation in all projections. The proposed approaches enlarged the FOV to cover wider patient cross-sections. Thus, image quality inside and outside the SFOV has been improved. Best results have been obtained using the CT-based method, followed by the camera-based and the FDCT-based truncation correction. For simulations, quality factors up to 0.98 have been achieved. Truncation-induced cupping artefacts have been reduced, e.g., from 218% to less than 1% for the measurements. The proposed truncation correction approaches for EFOV reconstructions are an effective way to ensure accurate CT values inside the SFOV and to recover peripheral information outside the SFOV.

  3. A Bayesian Ensemble Approach for Epidemiological Projections

    PubMed Central

    Lindström, Tom; Tildesley, Michael; Webb, Colleen

    2015-01-01

    Mathematical models are powerful tools for epidemiology and can be used to compare control actions. However, different models and model parameterizations may provide different prediction of outcomes. In other fields of research, ensemble modeling has been used to combine multiple projections. We explore the possibility of applying such methods to epidemiology by adapting Bayesian techniques developed for climate forecasting. We exemplify the implementation with single model ensembles based on different parameterizations of the Warwick model run for the 2001 United Kingdom foot and mouth disease outbreak and compare the efficacy of different control actions. This allows us to investigate the effect that discrepancy among projections based on different modeling assumptions has on the ensemble prediction. A sensitivity analysis showed that the choice of prior can have a pronounced effect on the posterior estimates of quantities of interest, in particular for ensembles with large discrepancy among projections. However, by using a hierarchical extension of the method we show that prior sensitivity can be circumvented. We further extend the method to include a priori beliefs about different modeling assumptions and demonstrate that the effect of this can have different consequences depending on the discrepancy among projections. We propose that the method is a promising analytical tool for ensemble modeling of disease outbreaks. PMID:25927892

  4. Application of ISO standard 27048: dose assessment for the monitoring of workers for internal radiation exposure.

    PubMed

    Henrichs, K

    2011-03-01

    Besides ongoing developments in the dosimetry of incorporated radionuclides, there are various efforts to improve the monitoring of workers for potential or real intakes of radionuclides. The disillusioning experience with numerous intercomparison projects identified substantial differences between national regulations, concepts, applied programmes and methods, and dose assessment procedures. Measured activities were not directly comparable because of significant differences between measuring frequencies and methods, but also results of case studies for dose assessments revealed differences of orders of magnitude. Besides the general common interest in reliable monitoring results, at least the cross-border activities of workers (e.g. nuclear power plant services) require consistent approaches and comparable results. The International Standardization Organization therefore initiated projects to standardise programmes for the monitoring of workers, the requirements for measuring laboratories and the processes for the quantitative evaluation of monitoring results in terms of internal assessed doses. The strength of the concepts applied by the international working group consists in a unified approach defining the requirements, databases and processes. This paper is intended to give a short introduction into the standardization project followed by a more detailed description of the dose assessment standard, which will be published in the very near future.

  5. Color-coded depth information in volume-rendered magnetic resonance angiography

    NASA Astrophysics Data System (ADS)

    Smedby, Orjan; Edsborg, Karin; Henriksson, John

    2004-05-01

    Magnetic Resonance Angiography (MRA) and Computed Tomography Angiography (CTA) data are usually presented using Maximum Intensity Projection (MIP) or Volume Rendering Technique (VRT), but these often fail to demonstrate a stenosis if the projection angle is not suitably chosen. In order to make vascular stenoses visible in projection images independent of the choice of viewing angle, a method is proposed to supplement these images with colors representing the local caliber of the vessel. After preprocessing the volume image with a median filter, segmentation is performed by thresholding, and a Euclidean distance transform is applied. The distance to the background from each voxel in the vessel is mapped to a color. These colors can either be rendered directly using MIP or be presented together with opacity information based on the original image using VRT. The method was tested in a synthetic dataset containing a cylindrical vessel with stenoses in varying angles. The results suggest that the visibility of stenoses is enhanced by the color information. In clinical feasibility experiments, the technique was applied to clinical MRA data. The results are encouraging and indicate that the technique can be used with clinical images.
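
    A minimal sketch of the caliber-mapping pipeline as described (median filter, threshold segmentation, Euclidean distance transform) could look as follows; function and parameter names are illustrative, and the normalized distances would subsequently be looked up in a color table and fused with the MIP or VRT rendering:

      import numpy as np
      from scipy import ndimage

      def caliber_map(volume, threshold):
          # median filter, threshold segmentation, then distance to the background
          filtered = ndimage.median_filter(volume, size=3)
          vessel = filtered > threshold
          dist = ndimage.distance_transform_edt(vessel)
          return dist / dist.max() if dist.max() > 0 else dist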

  6. Fisher's method of scoring in statistical image reconstruction: comparison of Jacobi and Gauss-Seidel iterative schemes.

    PubMed

    Hudson, H M; Ma, J; Green, P

    1994-01-01

    Many algorithms for medical image reconstruction adopt versions of the expectation-maximization (EM) algorithm. In this approach, parameter estimates are obtained which maximize a complete data likelihood or penalized likelihood, in each iteration. Implicitly (and sometimes explicitly) penalized algorithms require smoothing of the current reconstruction in the image domain as part of their iteration scheme. In this paper, we discuss alternatives to EM which adapt Fisher's method of scoring (FS) and other methods for direct maximization of the incomplete data likelihood. Jacobi and Gauss-Seidel methods for non-linear optimization provide efficient algorithms applying FS in tomography. One approach uses smoothed projection data in its iterations. We investigate the convergence of Jacobi and Gauss-Seidel algorithms with clinical tomographic projection data.
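
    For orientation, the full Fisher-scoring update, which the Jacobi and Gauss-Seidel schemes approximate without forming or inverting the complete information matrix, replaces the observed information of Newton's method with its expectation I(\theta):

      \theta^{(k+1)} = \theta^{(k)} + I(\theta^{(k)})^{-1} \, \nabla \ell(\theta^{(k)}),

    where \ell denotes the incomplete-data log-likelihood (or penalized log-likelihood).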

  7. Bayesian probabilistic population projections for all countries

    PubMed Central

    Raftery, Adrian E.; Li, Nan; Ševčíková, Hana; Gerland, Patrick; Heilig, Gerhard K.

    2012-01-01

    Projections of countries’ future populations, broken down by age and sex, are widely used for planning and research. They are mostly done deterministically, but there is a widespread need for probabilistic projections. We propose a Bayesian method for probabilistic population projections for all countries. The total fertility rate and female and male life expectancies at birth are projected probabilistically using Bayesian hierarchical models estimated via Markov chain Monte Carlo using United Nations population data for all countries. These are then converted to age-specific rates and combined with a cohort component projection model. This yields probabilistic projections of any population quantity of interest. The method is illustrated for five countries of different demographic stages, continents and sizes. The method is validated by an out of sample experiment in which data from 1950–1990 are used for estimation, and applied to predict 1990–2010. The method appears reasonably accurate and well calibrated for this period. The results suggest that the current United Nations high and low variants greatly underestimate uncertainty about the number of oldest old from about 2050 and that they underestimate uncertainty for high fertility countries and overstate uncertainty for countries that have completed the demographic transition and whose fertility has started to recover towards replacement level, mostly in Europe. The results also indicate that the potential support ratio (persons aged 20–64 per person aged 65+) will almost certainly decline dramatically in most countries over the coming decades. PMID:22908249

  8. High Performance Automatic Character Skinning Based on Projection Distance

    NASA Astrophysics Data System (ADS)

    Li, Jun; Lin, Feng; Liu, Xiuling; Wang, Hongrui

    2018-03-01

    Skeleton-driven deformation methods have been commonly used for character deformation. The process of painting skin weights for character deformation is a long-winded task requiring manual tweaking. We present a novel method to calculate skinning weights automatically from a 3D human geometric model and the corresponding skeleton. The method first groups each mesh vertex of the 3D human model with a skeleton bone by the minimum distance from the mesh vertex to each bone. Second, it calculates each vertex's weights to the adjacent bones from the distance of the vertex's projection point to the bone joints. Our method's output can be applied not only to any kind of skeleton-driven deformation, but also to motion-capture-driven (mocap-driven) deformation. Experimental results show that our method not only has strong generality and robustness, but also high performance.
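
    As a rough illustration of distance-based skinning (a generic inverse-distance weighting over bone segments, not the paper's exact projection-distance rule), a Python sketch with illustrative names might read:

      import numpy as np

      def segment_distance(p, a, b):
          # distance from vertex p to the bone segment (a, b), clamping the projection
          ab = b - a
          t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
          return np.linalg.norm(p - (a + t * ab))

      def vertex_weights(p, bones, eps=1e-8):
          # soft weights of vertex p over all bones, normalised to sum to one
          d = np.array([segment_distance(p, a, b) for a, b in bones])
          w = 1.0 / (d + eps)
          return w / w.sum()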

  9. Space Radiation Cancer Risks

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.

    2007-01-01

    Space radiation presents major challenges to astronauts on the International Space Station and for future missions to the Earth's moon or Mars. Methods used to project risks on Earth need to be modified because of the large uncertainties in projecting cancer risks from space radiation, and these uncertainties thus impact safety factors. We describe NASA's unique approach to radiation safety, which applies uncertainty-based criteria within the occupational health program for astronauts: the two terrestrial criteria of a point estimate of a maximum acceptable level of risk and application of the principle of As Low As Reasonably Achievable (ALARA) are supplemented by a third requirement that protects against risk projection uncertainties using the upper 95% confidence level (CL) in the radiation cancer projection model. NASA's acceptable level of risk for the ISS and the new lunar program has been set at the point estimate of a 3-percent risk of exposure-induced death (REID). Tissue-averaged organ dose equivalents are combined with age-at-exposure and gender-dependent risk coefficients to project the cumulative occupational radiation risks incurred by astronauts. The 95% CL criterion is in practice a stronger criterion than ALARA, but not an absolute cut-off as is applied to a point projection of a 3% REID. We describe the most recent astronaut dose limits, and present a historical review of astronaut organ dose estimates from the Mercury program through the current ISS program, and future projections for lunar and Mars missions. NASA's 95% CL criterion is linked to a vibrant ground-based radiobiology program investigating the radiobiology of high-energy protons and heavy ions. The near-term goal of research is new knowledge leading to the reduction of uncertainties in projection models. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. The current model for projecting space radiation cancer risk relies on the three assumptions of linearity, additivity, and scaling, along with the use of population averages. We describe uncertainty estimates for this model, and new experimental data that shed light on the accuracy of the underlying assumptions. These methods make it possible to express risk management objectives in terms of quantitative metrics, i.e., the number of days in space without exceeding a given risk level within well-defined confidence limits. The resulting methodology is applied to several human space exploration mission scenarios including a lunar station, a deep space outpost, and a Mars mission. Factors that dominate risk projection uncertainties and the application of this approach to assess candidate mitigation approaches are described.

  10. What is the problem in problem-based learning in higher education mathematics

    NASA Astrophysics Data System (ADS)

    Dahl, Bettina

    2018-01-01

    Problem- and Project-Based Learning (PBL) emphasises collaborative work on problems relevant to society and the relation between theory and practice. PBL suits engineering students as preparation for their future professions, but what about mathematics? Mathematics is not just applied mathematics; it is also a body of abstract knowledge whose application in society is not always obvious. Does mathematics, including pure mathematics, fit into a PBL curriculum? This paper argues that it does, for two reasons: (1) PBL resembles the working methods of research mathematicians. (2) The concept of society includes the society of researchers, to whom theoretical mathematics is relevant. The paper describes two cases of university PBL projects in mathematics: one in pure mathematics and the other in applied mathematics. The paper also discusses that future engineers need to understand the world of mathematics as well as how engineers fit into a process of fundamental research turned into applied science.

  11. Reconstruction of the two-dimensional gravitational potential of galaxy clusters from X-ray and Sunyaev-Zel'dovich measurements

    NASA Astrophysics Data System (ADS)

    Tchernin, C.; Bartelmann, M.; Huber, K.; Dekel, A.; Hurier, G.; Majer, C. L.; Meyer, S.; Zinger, E.; Eckert, D.; Meneghetti, M.; Merten, J.

    2018-06-01

    Context. The mass of galaxy clusters is not a direct observable, nonetheless it is commonly used to probe cosmological models. Based on the combination of all main cluster observables, that is, the X-ray emission, the thermal Sunyaev-Zel'dovich (SZ) signal, the velocity dispersion of the cluster galaxies, and gravitational lensing, the gravitational potential of galaxy clusters can be jointly reconstructed. Aims: We derive the two main ingredients required for this joint reconstruction: the potentials individually reconstructed from the observables and their covariance matrices, which act as a weight in the joint reconstruction. We show here the method to derive these quantities. The result of the joint reconstruction applied to a real cluster will be discussed in a forthcoming paper. Methods: We apply the Richardson-Lucy deprojection algorithm to data on a two-dimensional (2D) grid. We first test the 2D deprojection algorithm on a β-profile. Assuming hydrostatic equilibrium, we further reconstruct the gravitational potential of a simulated galaxy cluster based on synthetic SZ and X-ray data. We then reconstruct the projected gravitational potential of the massive and dynamically active cluster Abell 2142, based on the X-ray observations collected with XMM-Newton and the SZ observations from the Planck satellite. Finally, we compute the covariance matrix of the projected reconstructed potential of the cluster Abell 2142 based on the X-ray measurements collected with XMM-Newton. Results: The gravitational potentials of the simulated cluster recovered from synthetic X-ray and SZ data are consistent, even though the potential reconstructed from X-rays shows larger deviations from the true potential. Regarding Abell 2142, the projected gravitational cluster potentials recovered from SZ and X-ray data reproduce well the projected potential inferred from gravitational-lensing observations. We also observe that the covariance matrix of the potential for Abell 2142 reconstructed from XMM-Newton data sensitively depends on the resolution of the deprojected grid and on the smoothing scale used in the deprojection. Conclusions: We show that the Richardson-Lucy deprojection method can be effectively applied on a grid and that the projected potential is well recovered from real and simulated data based on X-ray and SZ signal. The comparison between the reconstructed potentials from the different observables provides additional information on the validity of the assumptions as function of the projected radius.
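
    For orientation, the Richardson-Lucy update used in such deprojections can be written in generic kernel notation (not the paper's own) as

      f^{(k+1)}(x) = f^{(k)}(x) \int \frac{g(y)}{\int K(y \mid x')\, f^{(k)}(x')\, dx'} \, K(y \mid x)\, dy,

    where g is the observed projected signal, f the deprojected quantity, and K(y|x) the projection kernel, assumed normalized over y.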

  12. Recent archaeomagnetic studies in Slovakia: Comparison of methodological approaches

    NASA Astrophysics Data System (ADS)

    Kubišová, Lenka

    2016-03-01

    We review the recent archaeomagnetic studies carried out on the territory of Slovakia, focusing on the comparison of methodological approaches and discussing pros and cons of the individual applied methods from the perspective of our experience. The most widely used methods for the determination of the intensity and direction of the archaeomagnetic field by demagnetisation of the sample material are alternating field (AF) demagnetisation and the Thellier double-heating method. These methods are used not only for archaeomagnetic studies but also help to solve some geological problems. The two methods were applied to samples collected recently at several sites in Slovakia, where archaeological prospection prompted by earthwork or reconstruction work on development projects demanded archaeomagnetic dating. We then discuss the advantages and weaknesses of the investigated methods from different perspectives, based on several examples and our recent experience.

  13. Analysis of angular momentum properties of photons emitted in fundamental atomic processes

    NASA Astrophysics Data System (ADS)

    Zaytsev, V. A.; Surzhykov, A. S.; Shabaev, V. M.; Stöhlker, Th.

    2018-04-01

    Many atomic processes result in the emission of photons. Analysis of the properties of emitted photons, such as energy and angular distribution as well as polarization, is regarded as a powerful tool for gaining more insight into the physics of corresponding processes. Another characteristic of light is the projection of its angular momentum upon propagation direction. This property has attracted a special attention over the past decades due to studies of twisted (or vortex) light beams. Measurements being sensitive to this projection may provide valuable information about the role of angular momentum in the fundamental atomic processes. Here we describe a simple theoretical method for determination of the angular momentum properties of the photons emitted in various atomic processes. This method is based on the evaluation of expectation value of the total angular momentum projection operator. To illustrate the method, we apply it to the textbook examples of plane-wave, spherical-wave, and Bessel light. Moreover, we investigate the projection of angular momentum for the photons emitted in the process of the radiative recombination with ionic targets. It is found that the recombination photons do carry a nonzero projection of the orbital angular momentum.
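
    In compact form (notation illustrative), the quantity evaluated by this method is the expectation value of the total angular momentum projection of the emitted photon along the propagation direction z,

      \langle J_z \rangle = \frac{\langle \Psi_{\gamma} | \hat{J}_z | \Psi_{\gamma} \rangle}{\langle \Psi_{\gamma} | \Psi_{\gamma} \rangle}, \qquad \hat{J}_z = \hat{L}_z + \hat{S}_z,

    where |\Psi_{\gamma}\rangle is the emitted-photon state and \hat{L}_z, \hat{S}_z are its orbital and spin parts.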

  14. Laboratory and field studies of photocatalytic NOx and O3 removal by coatings on concrete.

    DOT National Transportation Integrated Search

    2017-03-01

    This project involved thorough testing of titanium dioxide (TiO2)-containing commercial photocatalytic coatings applied to portland cement concrete for highway applications, focusing on the use of these coatings as an abatement method for atmosph...

  15. Developing a Robust Strategy for Implementing a Water Resources Master Plan in Lima, Peru

    NASA Astrophysics Data System (ADS)

    Kalra, N.; Groves, D.; Bonzanigo, L.; Molina-Perez, E.

    2015-12-01

    Lima, the capital of Peru, faces significant water stress. It is the fifth largest metropolitan area in Latin America, and the second largest desert city in the world. The city has developed a Master Plan of major investment projects to improve water reliability until 2040. Yet key questions remain. Is the Master Plan sufficient for ensuring reliability in the face of deeply uncertain future climate change and demand? How do uncertain budget and project feasibility conditions shape Lima's options? How should the investments in the plan be prioritized, and can some be delayed? Lima is not alone in facing these planning challenges. Governments invest billions of dollars annually in long-term projects. Yet deep uncertainties pose formidable challenges to making near-term decisions that make long-term sense. The World Bank has spearheaded a community of practice on methods for Decision Making Under Deep Uncertainty (DMU). This pilot project in Peru is the first in-depth application of DMU techniques to water supply planning in a developing country. It builds on prior analysis done in New York, California, and for the Colorado River, yet shows how these methods can be applied in regions which do not have as advanced data or tools available. The project combines three methods in particular -- Robust Decision Making, Decision Scaling, and Adaptive Pathways -- to help Lima implement its Master Plan in a way that is robust, no-regret, and adaptive. It was done in close partnership with SEDAPAL, the water utility company in Lima, and in coordination with other national WRM and meteorological agencies. This talk will: Present the planning challenges Lima and other cities face, including climate change Describe DMU methodologies and how they were applied in collaboration with SEDAPAL Summarize recommendations for achieving long-term water reliability in Lima Suggest how these methodologies can benefit other investment projects in developing countries.

  16. A methodology to estimate uncertainty for emission projections through sensitivity analysis.

    PubMed

    Lumbreras, Julio; de Andrés, Juan Manuel; Pérez, Javier; Borge, Rafael; de la Paz, David; Rodríguez, María Encarnación

    2015-04-01

    Air pollution abatement policies must be based on quantitative information on current and future emissions of pollutants. As emission projection uncertainties are inevitable and traditional statistical treatments of uncertainty are highly time- and resource-consuming, a simplified methodology for nonstatistical uncertainty estimation based on sensitivity analysis is presented in this work. The methodology was applied to the "with measures" scenario for Spain, specifically over the 12 highest-emitting sectors for greenhouse gas and air pollutant emissions. Examples of the methodology's application for two important sectors (power plants, and agriculture and livestock) are shown and explained in depth. Uncertainty bands were obtained up to 2020 by modifying the driving factors of the 12 selected sectors, and the methodology was tested against a recomputed emission trend in a low economic-growth perspective and official figures for 2010, showing very good performance. A solid understanding and quantification of uncertainties related to atmospheric emission inventories and projections provide useful information for policy negotiations. However, as many of those uncertainties are irreducible, there is interest in how they could be managed in order to derive robust policy conclusions. Taking this into account, a method developed to use sensitivity analysis as a source of information to derive nonstatistical uncertainty bands for emission projections is presented and applied to Spain. This method simplifies uncertainty assessment and allows other countries to take advantage of their sensitivity analyses.
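    The paper's exact procedure is not reproduced in the abstract; the following minimal Python sketch, with hypothetical sector names and perturbation magnitudes, only illustrates the general idea of deriving a nonstatistical uncertainty band by perturbing the driving factors of a projected emission trend.

    ```python
    import numpy as np

    # Hypothetical baseline emission projection (kt/yr) for one sector, 2015-2020
    years = np.arange(2015, 2021)
    baseline = np.array([100.0, 98.0, 97.0, 95.0, 94.0, 92.0])

    # Hypothetical driving factors and the relative perturbations explored in the
    # sensitivity analysis (e.g. activity level, emission factor).
    perturbations = {
        "activity":        (-0.10, 0.10),   # +/- 10 % around the central assumption
        "emission_factor": (-0.05, 0.05),   # +/-  5 %
    }

    # Each perturbed scenario rescales the baseline; the envelope of all scenarios
    # gives a non-statistical uncertainty band around the projection.
    scenarios = [baseline]
    for deltas in perturbations.values():
        for d in deltas:
            scenarios.append(baseline * (1.0 + d))

    scenarios = np.vstack(scenarios)
    lower, upper = scenarios.min(axis=0), scenarios.max(axis=0)
    for y, lo, up in zip(years, lower, upper):
        print(f"{y}: {lo:.1f} - {up:.1f} kt")
    ```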

  17. Beta Hebbian Learning as a New Method for Exploratory Projection Pursuit.

    PubMed

    Quintián, Héctor; Corchado, Emilio

    2017-09-01

    In this research, a novel family of learning rules called Beta Hebbian Learning (BHL) is thoroughly investigated to extract information from high-dimensional datasets by projecting the data onto low-dimensional (typically two-dimensional) subspaces, improving on existing exploratory methods by providing a clear representation of the data's internal structure. BHL applies a family of learning rules derived from the Probability Density Function (PDF) of the residual based on the beta distribution. This family of rules may be called Hebbian in that they all use a simple multiplication of the output of the neural network with some function of the residuals after feedback. The derived learning rules can be linked to an adaptive form of Exploratory Projection Pursuit, and with artificial distributions the networks perform as the theory suggests they should: the use of different learning rules derived from different PDFs allows the identification of "interesting" dimensions (as far from the Gaussian distribution as possible) in high-dimensional datasets. This novel algorithm, BHL, has been tested over seven artificial datasets to study the behavior of the BHL parameters, and was later applied successfully over four real datasets, comparing its results, in terms of performance, with other well-known exploratory projection models such as Maximum Likelihood Hebbian Learning (MLHL), Locally-Linear Embedding (LLE), Curvilinear Component Analysis (CCA), Isomap and Neural Principal Component Analysis (Neural PCA).
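    The exact BHL rule is defined in the cited paper and is not reproduced here; the sketch below shows only the generic negative-feedback Hebbian architecture that such exploratory projection pursuit rules share, with a placeholder cubic nonlinearity standing in for the beta-PDF-derived function of the residual.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def hebbian_epp(X, n_components=2, lr=1e-3, epochs=20, f=lambda e: e ** 3):
        """Generic negative-feedback Hebbian network for exploratory projection
        pursuit: y = W x, residual e = x - W^T y, update dW = lr * outer(y, f(e)).
        The placeholder cubic f stands in for the beta-PDF-derived BHL rule."""
        W = rng.normal(scale=0.1, size=(n_components, X.shape[1]))
        for _ in range(epochs):
            for x in X[rng.permutation(len(X))]:
                y = W @ x                 # feed-forward output
                e = x - W.T @ y           # residual after feedback
                W += lr * np.outer(y, f(e))
        return X @ W.T                    # low-dimensional projection of the data

    # Toy data: 500 samples, 10 features, one deliberately non-Gaussian direction
    X = rng.normal(size=(500, 10))
    X[:, 3] = rng.laplace(size=500)       # the "interesting" dimension
    print(hebbian_epp(X).shape)           # (500, 2)
    ```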

  18. Predictive control of a chaotic permanent magnet synchronous generator in a wind turbine system

    NASA Astrophysics Data System (ADS)

    Manal, Messadi; Adel, Mellit; Karim, Kemih; Malek, Ghanes

    2015-01-01

    This paper investigates how to address the chaos problem in a permanent magnet synchronous generator (PMSG) in a wind turbine system. A predictive control approach is proposed to suppress the chaotic behavior and stabilize operation; an advantage of this method is that it need be applied to only one state of the wind turbine system. The use of genetic algorithms to estimate the optimal parameter values of the wind turbine leads to maximization of the power generation. Moreover, some simulation results are included to visualize the effectiveness and robustness of the proposed method. Project supported by the CMEP-TASSILI Project (Grant No. 14MDU920).

  19. Multispectral high-resolution hologram generation using orthographic projection images

    NASA Astrophysics Data System (ADS)

    Muniraj, I.; Guo, C.; Sheridan, J. T.

    2016-08-01

    We present a new method of synthesizing a digital hologram of three-dimensional (3D) real-world objects from multiple orthographic projection images (OPI). High-resolution multiple perspectives of the 3D objects (i.e., a two-dimensional elemental image array) are captured under incoherent white light using the synthetic aperture integral imaging (SAII) technique and their OPIs are obtained respectively. The reference beam is then multiplied with the corresponding OPI and integrated to form a Fourier hologram. Eventually, a modified phase retrieval algorithm (GS/HIO) is applied to reconstruct the hologram. The principle is validated experimentally and the results support the feasibility of the proposed method.

  20. Integrated Data Collection and Analysis Project: Friction Correlation Study

    DTIC Science & Technology

    2015-08-01

    Friction test methods authorized in AOP-7 include the Pendulum Friction, Rotary Friction, Sliding Friction (such as the ABL), BAM Friction, and Steel/Fiber Shoe methods. A variable compressive force is applied downward through the wheel hydraulically (50-1995 psi). The 5 kg pendulum impacts (8 ft/sec is the

  1. Limited-memory trust-region methods for sparse relaxation

    NASA Astrophysics Data System (ADS)

    Adhikari, Lasith; DeGuchy, Omar; Erway, Jennifer B.; Lockhart, Shelby; Marcia, Roummel F.

    2017-08-01

    In this paper, we solve the l2-l1 sparse recovery problem by transforming the objective function of this problem into an unconstrained differentiable function and applying a limited-memory trust-region method. Unlike gradient projection-type methods, which use only the current gradient, our approach uses gradients from previous iterations to obtain a more accurate Hessian approximation. Numerical experiments show that our proposed approach eliminates spurious solutions more effectively while improving computational time.
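    The paper's specific transformation of the l2-l1 objective is not given in the abstract; as a hedged illustration, one common way to obtain an unconstrained differentiable surrogate is pseudo-Huber smoothing of the l1 term, sketched below (this is not necessarily the transformation used by the authors).

    ```python
    import numpy as np

    def smoothed_l2_l1(A, b, lam, mu=1e-3):
        """Differentiable surrogate of 0.5*||Ax - b||^2 + lam*||x||_1 obtained by
        replacing |x_i| with the pseudo-Huber term sqrt(x_i^2 + mu^2) - mu.
        (One common smoothing; not necessarily the transformation used in the paper.)"""
        def f(x):
            r = A @ x - b
            return 0.5 * r @ r + lam * np.sum(np.sqrt(x ** 2 + mu ** 2) - mu)

        def grad(x):
            return A.T @ (A @ x - b) + lam * x / np.sqrt(x ** 2 + mu ** 2)

        return f, grad

    # Tiny sparse-recovery example solved with plain gradient descent
    rng = np.random.default_rng(1)
    A = rng.normal(size=(40, 100))
    x_true = np.zeros(100)
    x_true[[5, 17, 60]] = [2.0, -1.5, 1.0]
    b = A @ x_true

    lam, mu = 0.1, 1e-3
    f, grad = smoothed_l2_l1(A, b, lam, mu)
    x = np.zeros(100)
    step = 1.0 / (np.linalg.norm(A, 2) ** 2 + lam / mu)   # below 1/Lipschitz constant
    for _ in range(3000):
        x -= step * grad(x)
    print("largest coefficients at indices:", np.sort(np.argsort(-np.abs(x))[:3]))
    ```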

  2. Exemplar-based inpainting as a solution to the missing wedge problem in electron tomography.

    PubMed

    Trampert, Patrick; Wang, Wu; Chen, Delei; Ravelli, Raimond B G; Dahmen, Tim; Peters, Peter J; Kübel, Christian; Slusallek, Philipp

    2018-04-21

    A new method for dealing with incomplete projection sets in electron tomography is proposed. The approach is inspired by exemplar-based inpainting techniques in image processing and heuristically generates data for missing projection directions. The method has been extended to work on three-dimensional data. In general, electron tomography reconstructions suffer from elongation artifacts along the beam direction. These artifacts can be seen in the corresponding Fourier domain as a missing wedge. The new method synthetically generates projections for these missing directions with the help of a dictionary-based approach that is able to convey both structure and texture at the same time. It constitutes a preprocessing step that can be combined with any tomographic reconstruction algorithm. The new algorithm was applied to phantom data, to a real electron tomography data set taken from a catalyst, as well as to a real dataset containing solely colloidal gold particles. Visually, the synthetic projections, reconstructions, and corresponding Fourier power spectra showed a decrease of the typical missing wedge artifacts. Quantitatively, the inpainting method is capable of reducing missing wedge artifacts and improves tomogram quality with respect to full width at half maximum measurements. Copyright © 2018. Published by Elsevier B.V.

  3. An electromagnetic induction method for underground target detection and characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartel, L.C.; Cress, D.H.

    1997-01-01

    An improved capability for subsurface structure detection is needed to support military and nonproliferation requirements for inspection and for surveillance of activities of threatening nations. As part of the DOE/NN-20 program to apply geophysical methods to detect and characterize underground facilities, Sandia National Laboratories (SNL) initiated an electromagnetic induction (EMI) project to evaluate low frequency electromagnetic (EM) techniques for subsurface structure detection. Low frequency, in this case, extended from kilohertz to hundreds of kilohertz. An EMI survey procedure had already been developed for borehole imaging of coal seams and had successfully been applied in a surface mode to detect a drug smuggling tunnel. The SNL project has focused on building upon the success of that procedure and applying it to surface and low altitude airborne platforms. Part of SNL's work has focused on improving that technology through improved hardware and data processing. The improved hardware development has been performed utilizing Laboratory Directed Research and Development (LDRD) funding. In addition, SNL's effort focused on: (1) improvements in modeling of the basic geophysics of the illuminating electromagnetic field and its coupling to the underground target (partially funded using LDRD funds) and (2) development of techniques for phase-based and multi-frequency processing and spatial processing to support subsurface target detection and characterization. The products of this project are: (1) an evaluation of an improved EM gradiometer, (2) an improved gradiometer concept for possible future development, (3) an improved modeling capability, (4) demonstration of an EM wave migration method for target recognition, and a demonstration that the technology is capable of detecting targets to depths exceeding 25 meters.

  4. Dual Adaptive Filtering by Optimal Projection Applied to Filter Muscle Artifacts on EEG and Comparative Study

    PubMed Central

    Peyrodie, Laurent; Szurhaj, William; Bolo, Nicolas; Pinti, Antonio; Gallois, Philippe

    2014-01-01

    Muscle artifacts constitute one of the major problems in electroencephalogram (EEG) examinations, particularly for the diagnosis of epilepsy, where pathological rhythms occur within the same frequency bands as those of artifacts. This paper proposes to use the dual adaptive filtering by optimal projection (DAFOP) method to automatically remove artifacts while preserving true cerebral signals. DAFOP is a two-step method. The first step consists in applying the common spatial pattern (CSP) method to two frequency windows to identify the slowest components, which are considered to be cerebral sources. The two frequency windows are defined by optimizing convolutional filters. The second step consists in using a regression method to reconstruct the signal independently within various frequency windows. This method was evaluated by two neurologists on a selection of 114 pages with muscle artifacts, from 20 clinical recordings of awake and sleeping adults, subject to pathological signals and epileptic seizures. A blind comparison was then conducted with the canonical correlation analysis (CCA) method and conventional low-pass filtering at 30 Hz. The filtering rate was 84.3% for muscle artifacts with a 6.4% reduction of cerebral signals even for the fastest waves. DAFOP was found to be significantly more efficient than CCA and 30 Hz filters. The DAFOP method is fast and automatic and can be easily used in clinical EEG recordings. PMID:25298967

  5. Subspace-based interference removal methods for a multichannel biomagnetic sensor array.

    PubMed

    Sekihara, Kensuke; Nagarajan, Srikantan S

    2017-10-01

    In biomagnetic signal processing, the theory of the signal subspace has been applied to removing interfering magnetic fields, and a representative algorithm is the signal space projection algorithm, in which the signal/interference subspace is defined in the spatial domain as the span of signal/interference-source lead field vectors. This paper extends the notion of this conventional (spatial domain) signal subspace by introducing a new definition of signal subspace in the time domain. It defines the time-domain signal subspace as the span of row vectors that contain the source time course values. This definition leads to symmetric relationships between the time-domain and the conventional (spatial-domain) signal subspaces. As a review, this article shows that the notion of the time-domain signal subspace provides useful insights over existing interference removal methods from a unified perspective. Using the time-domain signal subspace, it is possible to interpret a number of interference removal methods as the time domain signal space projection. Such methods include adaptive noise canceling, sensor noise suppression, the common temporal subspace projection, the spatio-temporal signal space separation, and the recently-proposed dual signal subspace projection. Our analysis using the notion of the time domain signal space projection reveals implicit assumptions these methods rely on, and shows that the difference between these methods results only from the manner of deriving the interference subspace. Numerical examples that illustrate the results of our arguments are provided.

  6. Subspace-based interference removal methods for a multichannel biomagnetic sensor array

    NASA Astrophysics Data System (ADS)

    Sekihara, Kensuke; Nagarajan, Srikantan S.

    2017-10-01

    Objective. In biomagnetic signal processing, the theory of the signal subspace has been applied to removing interfering magnetic fields, and a representative algorithm is the signal space projection algorithm, in which the signal/interference subspace is defined in the spatial domain as the span of signal/interference-source lead field vectors. This paper extends the notion of this conventional (spatial domain) signal subspace by introducing a new definition of signal subspace in the time domain. Approach. It defines the time-domain signal subspace as the span of row vectors that contain the source time course values. This definition leads to symmetric relationships between the time-domain and the conventional (spatial-domain) signal subspaces. As a review, this article shows that the notion of the time-domain signal subspace provides useful insights over existing interference removal methods from a unified perspective. Main results and significance. Using the time-domain signal subspace, it is possible to interpret a number of interference removal methods as the time domain signal space projection. Such methods include adaptive noise canceling, sensor noise suppression, the common temporal subspace projection, the spatio-temporal signal space separation, and the recently-proposed dual signal subspace projection. Our analysis using the notion of the time domain signal space projection reveals implicit assumptions these methods rely on, and shows that the difference between these methods results only from the manner of deriving the interference subspace. Numerical examples that illustrate the results of our arguments are provided.
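    As a minimal sketch of the spatial-domain signal space projection idea reviewed above, the following Python snippet removes synthetic interference by projecting sensor data onto the orthogonal complement of an interference subspace spanned by known interference topographies; all data and dimensions are made up.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_sensors, n_times = 64, 1000

    # Synthetic sensor data: brain signal plus interference with known spatial patterns
    brain_pattern = rng.normal(size=(n_sensors, 1))
    interf_patterns = rng.normal(size=(n_sensors, 2))       # two interference sources
    data = (brain_pattern @ rng.normal(size=(1, n_times))
            + interf_patterns @ rng.normal(size=(2, n_times)) * 5.0)

    # Spatial-domain signal space projection: project the data onto the orthogonal
    # complement of the interference subspace span(interf_patterns).
    U, _, _ = np.linalg.svd(interf_patterns, full_matrices=False)
    P = np.eye(n_sensors) - U @ U.T          # orthogonal projector
    cleaned = P @ data

    # The interference contribution is (numerically) removed
    print(np.linalg.norm(U.T @ cleaned))     # approximately 0
    ```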

  7. Return on Scientific Investment - RoSI: a PMO dynamical index proposal for scientific projects performance evaluation and management.

    PubMed

    Caous, Cristofer André; Machado, Birajara; Hors, Cora; Zeh, Andrea Kaufmann; Dias, Cleber Gustavo; Amaro Junior, Edson

    2012-01-01

    To propose a measure (index) of expected risks for evaluating and following up the performance of research projects, taking into account financial and structural parameters for their development. A ranking of acceptable results for research projects with complex variables was used as an index to gauge project performance. To implement this method, the ulcer index was applied as the basic model to accommodate the following variables: costs, high-impact publications, fundraising, and patent registries. The proposed structured analysis, named here RoSI (Return on Scientific Investment), comprises a pipeline of analysis that characterizes risk with a modeling tool handling multiple variables interacting in semi-quantitative environments. This method was tested with data from three different projects in our Institution (projects A, B and C). Different curves reflected the ulcer indexes, identifying the project with the lowest risk (project C) related to development and expected results according to initial or full investment. The results showed that this model contributes significantly to the analysis of risk and planning, as well as to the definition of necessary investments that consider contingency actions with benefits to the different stakeholders: the investor or donor, the project manager, and the researchers.
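    The paper's adaptation of the ulcer index to project variables is not reproduced here; for reference, the classical ulcer index (the root-mean-square drawdown from the running peak) can be sketched as follows, applied to a hypothetical project performance series.

    ```python
    import numpy as np

    def ulcer_index(series):
        """Classical ulcer index: RMS of percentage drawdowns from the running peak."""
        series = np.asarray(series, dtype=float)
        running_peak = np.maximum.accumulate(series)
        drawdown_pct = 100.0 * (series - running_peak) / running_peak
        return np.sqrt(np.mean(drawdown_pct ** 2))

    # Hypothetical quarterly performance scores of two research projects
    project_a = [1.00, 1.05, 0.95, 1.10, 1.08, 1.20]
    project_b = [1.00, 0.80, 0.75, 0.90, 1.10, 1.15]
    print(ulcer_index(project_a))   # smaller value -> shallower drawdowns, lower "risk"
    print(ulcer_index(project_b))   # deeper drawdowns -> higher index
    ```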

  8. How to Sustain Change and Support Continuous Quality Improvement

    PubMed Central

    McQuillan, Rory; Harel, Ziv; Weizman, Adam V.; Thomas, Alison; Nesrallah, Gihad; Bell, Chaim M.; Chan, Christopher T.; Chertow, Glenn M.

    2016-01-01

    To achieve sustainable change, quality improvement initiatives must become the new way of working rather than something added on to routine clinical care. However, most organizational change is not maintained. In this next article in this Moving Points in Nephrology feature on quality improvement, we provide health care professionals with strategies to sustain and support quality improvement. Threats to sustainability may be identified both at the beginning of a project and when it is ready for implementation. The National Health Service Sustainability Model is reviewed as one example to help identify issues that affect long-term success of quality improvement projects. Tools to help sustain improvement include process control boards, performance boards, standard work, and improvement huddles. Process control and performance boards are methods to communicate improvement results to staff and leadership. Standard work is a written or visual outline of current best practices for a task and provides a framework to ensure that changes that have improved patient care are consistently and reliably applied to every patient encounter. Improvement huddles are short, regular meetings among staff to anticipate problems, review performance, and support a culture of improvement. Many of these tools rely on principles of visual management, which keep systems transparent and simple so that every staff member can rapidly distinguish normal from abnormal working conditions. Even when quality improvement methods are properly applied, the success of a project still depends on contextual factors. Context refers to aspects of the local setting in which the project operates. Context affects resources, leadership support, data infrastructure, team motivation, and team performance. For these reasons, the same project may thrive in a supportive context and fail in a different context. To demonstrate the practical applications of these quality improvement principles, these principles are applied to a hypothetical quality improvement initiative that aims to promote home dialysis (home hemodialysis and peritoneal dialysis). PMID:27016498

  9. Evaluation of Historical and Projected Agricultural Climate Risk Over the Continental US

    NASA Astrophysics Data System (ADS)

    Zhu, X.; Troy, T. J.; Devineni, N.

    2016-12-01

    Food demands are rising due to an increasing population with changing food preferences, which places pressure on agricultural systems. In addition, in the past decade climate extremes have highlighted the vulnerability of our agricultural production to climate variability. Quantitative analyses in the climate-agriculture research field have been performed in many studies. However, climate risk remains difficult to evaluate at large scales, yet such evaluation shows great potential to help us better understand historical climate change impacts and evaluate future risk given climate projections. In this study, we developed a framework to evaluate climate risk quantitatively by applying statistical methods such as Bayesian regression, distribution fitting, and Monte Carlo simulation. We applied the framework over different climate regions in the continental US both historically and for modeled climate projections. The relative importance of each major growing-season climate index, such as maximum dry period or heavy precipitation, was evaluated to determine which climate indices play a role in affecting crop yields. The statistical modeling framework was applied using county yields, with irrigated and rainfed yields separated to evaluate their different risks. This framework provides estimates of the climate risk facing agricultural production in the near term that account for the full uncertainty of climate occurrences, range of crop response, and spatial correlation in climate. In particular, the method provides robust estimates of the importance of irrigation in mitigating agricultural climate risk. The results of this study can contribute to decision making about crop choice and water use in an uncertain climate.
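    A hedged, generic illustration of the kind of regression-plus-Monte-Carlo risk estimate described above (not the study's actual model, data, or climate indices):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical county data: growing-season maximum dry period (days) vs. crop yield
    dry_period = rng.uniform(5, 40, size=200)
    yield_obs = 10.0 - 0.12 * dry_period + rng.normal(scale=0.8, size=200)

    # Fit a simple linear yield response to the climate index
    slope, intercept = np.polyfit(dry_period, yield_obs, 1)
    residual_sd = np.std(yield_obs - (intercept + slope * dry_period))

    # Monte Carlo simulation of near-term yield risk under a hypothetical
    # distribution of future dry-period lengths
    future_dry = rng.gamma(shape=6.0, scale=4.0, size=10_000)
    sim_yield = (intercept + slope * future_dry
                 + rng.normal(scale=residual_sd, size=10_000))
    print("P(yield < 6):", np.mean(sim_yield < 6.0))
    ```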

  10. A contrast between DEMATEL-ANP and ANP methods for six sigma project selection: a case study in healthcare industry

    PubMed Central

    2015-01-01

    Background The project selection process is a crucial step for healthcare organizations at the moment of implementing six sigma programs in both administrative and caring processes. However, six-sigma project selection is often defined as a decision-making process with interaction and feedback between criteria, so it is necessary to explore different methods to help healthcare companies determine the six-sigma projects that provide the maximum benefits. This paper describes the application of both ANP (Analytic Network Process) and DEMATEL (Decision Making Trial and Evaluation Laboratory)-ANP in a public medical centre to establish the most suitable six sigma project; finally, these methods were compared to evaluate their performance in the decision-making process. Methods ANP and DEMATEL-ANP were used to evaluate 6 six sigma project alternatives under an evaluation model composed of 3 strategies, 4 criteria and 15 sub-criteria. Judgement matrixes were completed by the six sigma team whose participants worked in different departments of the medical centre. Results The improvement of the care opportunity in obstetric outpatients was elected as the most suitable six sigma project with a score of 0.117 as contribution to the organization goals. DEMATEL-ANP performed better at the decision-making process since it reduced the error probability due to interactions and feedback. Conclusions ANP and DEMATEL-ANP effectively supported six sigma project selection processes, helping to create a complete framework that guarantees the prioritization of projects that provide maximum benefits to healthcare organizations. As DEMATEL-ANP performed better, it should be used by practitioners involved in decisions related to the implementation of six sigma programs in the healthcare sector, accompanied by adequate identification of the evaluation criteria that support the decision-making model. Thus, this comparative study contributes to choosing more effective approaches in this field. Suggestions for further work are also proposed so that these methods can be applied more adequately in six sigma project selection processes in healthcare. PMID:26391445
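    The ANP step that both methods share ends with raising the column-stochastic weighted supermatrix to successive powers until it converges; a minimal sketch with a hypothetical 3x3 supermatrix follows (the DEMATEL weighting and the full ANP network structure are omitted).

    ```python
    import numpy as np

    def limit_supermatrix(W, tol=1e-9, max_iter=10_000):
        """Raise a column-stochastic weighted supermatrix to successive powers
        until it stabilizes; the columns of the limit give the ANP priorities."""
        M = W.copy()
        for _ in range(max_iter):
            M_next = M @ W
            if np.max(np.abs(M_next - M)) < tol:
                return M_next
            M = M_next
        return M

    # Hypothetical 3x3 weighted supermatrix (columns sum to 1)
    W = np.array([[0.2, 0.5, 0.3],
                  [0.5, 0.1, 0.4],
                  [0.3, 0.4, 0.3]])
    limit = limit_supermatrix(W)
    print(limit[:, 0])   # limiting priorities (all columns are identical at convergence)
    ```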

  11. Restructuring Schools by Applying Deming's Management Theories.

    ERIC Educational Resources Information Center

    Melvin, Charles A., III

    1991-01-01

    Four school districts adopted a school restructuring project using Deming's business management method. Deming offered alternative views of organizations based on psychology, systems, perceptual framework, and causes of variance. He listed 14 points for quality improvement. Evaluation indicated that key staff members willingly engaged in…

  12. Joe Robertson | NREL

    Science.gov Websites

    Joe Robertson, Research Engineer, Joseph.Robertson@nrel.gov | 303-275-4575. Joe joined NREL in 2012. His research activities include automated building model student from the Colorado School of Mines on projects involving numerical methods applied to uncertainty

  13. Play Nice Across Time Space

    NASA Technical Reports Server (NTRS)

    Conroy, Michael P.

    2015-01-01

    This lecture is an overview of simulation technologies, methods, and practices as applied to current and past NASA programs. The focus is on sharing experience and the overall benefits to programs and projects of having appropriate simulation and analysis capabilities available at the correct point in a system lifecycle.

  14. Beam hardening correction in CT myocardial perfusion measurement

    NASA Astrophysics Data System (ADS)

    So, Aaron; Hsieh, Jiang; Li, Jian-Ying; Lee, Ting-Yim

    2009-05-01

    This paper presents a method for correcting beam hardening (BH) in cardiac CT perfusion imaging. The proposed algorithm works with reconstructed images instead of projection data. It applies thresholds to separate low (soft tissue) and high (bone and contrast) attenuating material in a CT image. The BH error in each projection is estimated by a polynomial function of the forward projection of the segmented image. The error image is reconstructed by back-projection of the estimated errors. A BH-corrected image is then obtained by subtracting a scaled error image from the original image. Phantoms were designed to simulate the BH artifacts encountered in cardiac CT perfusion studies of humans and animals that are most commonly used in cardiac research. These phantoms were used to investigate whether BH artifacts can be reduced with our approach and to determine the optimal settings, which depend upon the anatomy of the scanned subject, of the correction algorithm for patient and animal studies. The correction algorithm was also applied to correct BH in a clinical study to further demonstrate the effectiveness of our technique.
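    A schematic Python sketch of the image-domain workflow described above, using scikit-image's radon/iradon as stand-ins for the scanner geometry; the threshold, polynomial coefficients, and scale factor are placeholders rather than the calibrated values from the paper, and a filtered back-projection is used where the paper reconstructs the error image by back-projection.

    ```python
    import numpy as np
    from skimage.transform import radon, iradon
    from skimage.data import shepp_logan_phantom

    image = shepp_logan_phantom()                  # stand-in for a cardiac CT image
    theta = np.linspace(0.0, 180.0, 180, endpoint=False)

    # 1) Segment high-attenuating material (bone/contrast) with a simple threshold
    high_only = np.where(image > 0.5, image, 0.0)  # placeholder threshold

    # 2) Forward-project the segmented image
    p_high = radon(high_only, theta=theta)

    # 3) Estimate the BH error in each projection with a polynomial of the
    #    high-material forward projection (placeholder coefficients c1, c2)
    c1, c2 = 0.0, 0.02
    error_proj = c1 * p_high + c2 * p_high ** 2

    # 4) Reconstruct the error image (here via filtered back-projection)
    error_image = iradon(error_proj, theta=theta)

    # 5) Subtract a scaled error image from the original image
    alpha = 1.0                                    # placeholder scale factor
    corrected = image - alpha * error_image
    print(corrected.shape)
    ```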

  15. Ranked centroid projection: a data visualization approach with self-organizing maps.

    PubMed

    Yen, G G; Wu, Z

    2008-02-01

    The self-organizing map (SOM) is an efficient tool for visualizing high-dimensional data. In this paper, the clustering and visualization capabilities of the SOM, especially in the analysis of textual data, i.e., document collections, are reviewed and further developed. A novel clustering and visualization approach based on the SOM is proposed for the task of text mining. The proposed approach first transforms the document space into a multidimensional vector space by means of document encoding. Afterwards, a growing hierarchical SOM (GHSOM) is trained and used as a baseline structure to automatically produce maps with various levels of detail. Following the GHSOM training, the new projection method, namely the ranked centroid projection (RCP), is applied to project the input vectors to a hierarchy of 2-D output maps. The RCP is used as a data analysis tool as well as a direct interface to the data. In a set of simulations, the proposed approach is applied to an illustrative data set and two real-world scientific document collections to demonstrate its applicability.

  16. Sampling limits for electron tomography with sparsity-exploiting reconstructions.

    PubMed

    Jiang, Yi; Padgett, Elliot; Hovden, Robert; Muller, David A

    2018-03-01

    Electron tomography (ET) has become a standard technique for 3D characterization of materials at the nano-scale. Traditional reconstruction algorithms such as weighted back projection suffer from disruptive artifacts with insufficient projections. Popularized by compressed sensing, sparsity-exploiting algorithms have been applied to experimental ET data and show promise for improving reconstruction quality or reducing the total beam dose applied to a specimen. Nevertheless, theoretical bounds for these methods have been less explored in the context of ET applications. Here, we perform numerical simulations to investigate the performance of ℓ1-norm and total-variation (TV) minimization under various imaging conditions. From 36,100 different simulated structures, our results show that specimens with more complex structures generally require more projections for exact reconstruction. However, once sufficient data are acquired, dividing the beam dose over more projections provides no improvement, analogous to the traditional dose-fraction theorem. Moreover, a limited tilt range of ±75° or less can result in distorting artifacts in sparsity-exploiting reconstructions. The influence of optimization parameters on reconstructions is also discussed. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Assessing the Assessment Methods: Climate Change and Hydrologic Impacts

    NASA Astrophysics Data System (ADS)

    Brekke, L. D.; Clark, M. P.; Gutmann, E. D.; Mizukami, N.; Mendoza, P. A.; Rasmussen, R.; Ikeda, K.; Pruitt, T.; Arnold, J. R.; Rajagopalan, B.

    2014-12-01

    The Bureau of Reclamation, the U.S. Army Corps of Engineers, and other water management agencies have an interest in developing reliable, science-based methods for incorporating climate change information into longer-term water resources planning. Such assessments must quantify projections of future climate and hydrology, typically relying on some form of spatial downscaling and bias correction to produce watershed-scale weather information that subsequently drives hydrology and other water resource management analyses (e.g., water demands, water quality, and environmental habitat). Water agencies continue to face challenging method decisions in these endeavors: (1) which downscaling method should be applied and at what resolution; (2) what observational dataset should be used to drive downscaling and hydrologic analysis; (3) what hydrologic model(s) should be used and how should these models be configured and calibrated? There is a critical need to understand the ramifications of these method decisions, as they affect the signal and uncertainties produced by climate change assessments and, thus, adaptation planning. This presentation summarizes results from a three-year effort to identify strengths and weaknesses of widely applied methods for downscaling climate projections and assessing hydrologic conditions. Methods were evaluated from two perspectives: historical fidelity, and tendency to modulate a global climate model's climate change signal. On downscaling, four methods were applied at multiple resolutions: statistically using Bias Correction Spatial Disaggregation, Bias Correction Constructed Analogs, and Asynchronous Regression; dynamically using the Weather Research and Forecasting model. Downscaling results were then used to drive hydrologic analyses over the contiguous U.S. using multiple models (VIC, CLM, PRMS), with added focus placed on case study basins within the Colorado Headwaters. The presentation will identify which types of climate changes are expressed robustly across methods versus those that are sensitive to method choice; which method choices seem relatively more important; and where strategic investments in research and development can substantially improve the guidance on climate change provided to water managers.

  18. Project- versus Lecture-Based Courses: Assessing the Role of Course Structure on Perceived Utility, Anxiety, Academic Performance, and Satisfaction in the Undergraduate Research Methods Course

    ERIC Educational Resources Information Center

    Rubenking, Bridget; Dodd, Melissa

    2018-01-01

    Previous research suggests that undergraduate research methods students doubt the utility of course content and experience math and research anxiety. Research also suggests involving students in hands-on, applied research activities, although empirical data on the scope and nature of these activities are lacking. This study compared academic…

  19. Resource allocation in road infrastructure using ANP priorities with ZOGP formulation-A case study

    NASA Astrophysics Data System (ADS)

    Alias, Suriana; Adna, Norfarziah; Soid, Siti Khuzaimah; Kardri, Mahani

    2013-09-01

    Road infrastructure (RI) project evaluation and selection is concerned with the allocation of scarce organizational resources. In this paper, an improved RI project selection methodology is suggested which reflects interdependencies among evaluation criteria and candidate projects. The Fuzzy Delphi Method (FDM) is used to elicit expert group opinion and to determine the degree of interdependence among the alternative projects. In order to provide a systematic approach to setting priorities among multiple criteria and trade-offs among objectives, the Analytic Network Process (ANP) is applied prior to the Zero-One Goal Programming (ZOGP) formulation. Specifically, this paper demonstrates how to combine FDM and ANP with ZOGP through a real-world RI empirical example of an ongoing decision-making project in Johor, Malaysia.
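    A minimal sketch of the final ZOGP selection step, written with the PuLP library and hypothetical ANP priorities, costs, and budget; the goal constraints are simplified here to a single hard budget constraint.

    ```python
    import pulp

    # Hypothetical ANP priorities and costs for four candidate road projects
    priorities = {"P1": 0.35, "P2": 0.28, "P3": 0.22, "P4": 0.15}
    costs      = {"P1": 40.0, "P2": 25.0, "P3": 20.0, "P4": 10.0}   # in million
    budget = 60.0

    model = pulp.LpProblem("RI_project_selection", pulp.LpMaximize)
    x = {p: pulp.LpVariable(p, cat="Binary") for p in priorities}

    # Objective: maximize the total ANP priority of the selected projects
    model += pulp.lpSum(priorities[p] * x[p] for p in priorities)

    # Budget "goal" treated as a hard constraint in this simplified sketch
    model += pulp.lpSum(costs[p] * x[p] for p in priorities) <= budget

    model.solve(pulp.PULP_CBC_CMD(msg=False))
    selected = [p for p in priorities if x[p].value() == 1]
    print("selected projects:", selected)
    ```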

  20. Data-based adjoint and H2 optimal control of the Ginzburg-Landau equation

    NASA Astrophysics Data System (ADS)

    Banks, Michael; Bodony, Daniel

    2017-11-01

    Equation-free, reduced-order methods of control are desirable when the governing system of interest is of very high dimension or the control is to be applied to a physical experiment. Two-phase flow optimal control problems, our target application, fit these criteria. Dynamic Mode Decomposition (DMD) is a data-driven method for model reduction that can be used to resolve the dynamics of very high dimensional systems and project the dynamics onto a smaller, more manageable basis. We evaluate the effectiveness of DMD-based forward and adjoint operator estimation when applied to H2 optimal control approaches for the linear and nonlinear Ginzburg-Landau equation. Perspectives on applying the data-driven adjoint to two-phase flow control will be given. Supported by the Office of Naval Research (ONR) as part of the Multidisciplinary University Research Initiatives (MURI) Program, under Grant Number N00014-16-1-2617.
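    For reference, a minimal exact-DMD sketch (SVD-based) on synthetic snapshot data; it shows only the model-reduction step, not the adjoint estimation or H2 optimal control machinery of the abstract.

    ```python
    import numpy as np

    def dmd(X, Y, rank):
        """Exact dynamic mode decomposition: given snapshot pairs X (states at t)
        and Y (states at t+dt), return DMD eigenvalues and modes of rank `rank`."""
        U, s, Vh = np.linalg.svd(X, full_matrices=False)
        U, s, V = U[:, :rank], s[:rank], Vh[:rank].conj().T
        A_tilde = U.conj().T @ Y @ V @ np.diag(1.0 / s)   # reduced linear operator
        eigvals, W = np.linalg.eig(A_tilde)
        modes = Y @ V @ np.diag(1.0 / s) @ W              # exact DMD modes
        return eigvals, modes

    # Synthetic snapshots from a 2-mode linear system embedded in 100 dimensions
    rng = np.random.default_rng(0)
    t = np.linspace(0, 10, 201)
    base = rng.normal(size=(100, 2))
    coeffs = np.vstack([np.exp(0.1j * 2 * np.pi * t), np.exp(-0.05 * t)])
    data = base @ coeffs
    X, Y = data[:, :-1], data[:, 1:]

    eigvals, modes = dmd(X, Y, rank=2)
    print(np.round(eigvals, 3))     # discrete-time eigenvalues of the two modes
    ```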

  1. Enhanced Molecular Dynamics Methods Applied to Drug Design Projects.

    PubMed

    Ziada, Sonia; Braka, Abdennour; Diharce, Julien; Aci-Sèche, Samia; Bonnet, Pascal

    2018-01-01

    Nobel Laureate Richard P. Feynman stated: "[…] everything that living things do can be understood in terms of jiggling and wiggling of atoms […]." The importance of computer simulations of macromolecules, which use classical mechanics principles to describe atom behavior, is widely acknowledged, and nowadays they are applied in many fields such as material sciences and drug discovery. With the increase in computing power, molecular dynamics simulations can be applied to understand biological mechanisms at realistic timescales. In this chapter, we share our computational experience, providing a global view of two widely used enhanced molecular dynamics methods to study protein structure and dynamics through the description of their characteristics and limits, and we provide some examples of their applications in drug design. We also discuss the appropriate choice of software and hardware. In a detailed practical procedure, we describe how to set up, run, and analyze two main molecular dynamics methods, the umbrella sampling (US) and the accelerated molecular dynamics (aMD) methods.

  2. New developments in transit noise and vibration criteria

    NASA Astrophysics Data System (ADS)

    Hanson, Carl E.

    2004-05-01

    Federal Transit Administration (FTA) noise and vibration impact criteria were developed in the early 1990's. The noise criteria are ambient-based, developed from the Schultz curve and fundamental research performed by the U.S. Environmental Protection Agency in the 1970's. The vibration criteria are single-value rms vibration velocity levels. After 10 years of experience applying the criteria in assessments of new transit projects throughout the United States, FTA is updating its methods. The approach to assessing new projects in existing high-noise environments will be clarified. A method for assessing noise impacts due to horn blowing at grade crossings will be provided. The vibration criteria will be expanded to include spectral information. This paper summarizes the background of the current criteria, discusses examples where existing methods are lacking, and describes the planned remedies to improve criteria and methods.

  3. Biology Machine Initiative: Developing Innovative Novel Methods to Improve Neuro-rehabilitation for Amputees and Treatment for Patients at Remote Sites with Acute Brain Injury

    DTIC Science & Technology

    2010-09-01

    ...service to support the creation of both normative and pathological neuroinformatics databases. Project 3 Deliverable: The aim of this project is to... [The remainder of the extracted text consists of citation fragments, including D. Hammond, P. Vandergheynst, and R. Gribonval, "Wavelets on Graphs via Spectral Graph Theory," and papers from the International Conference on Electrical Bioimpedance, Gainesville, FL, April 4-8, 2010.]

  4. Sequential Probability Ratio Testing with Power Projective Base Method Improves Decision-Making for BCI

    PubMed Central

    Liu, Rong

    2017-01-01

    Obtaining a fast and reliable decision is an important issue in brain-computer interfaces (BCI), particularly in practical real-time applications such as wheelchair or neuroprosthetic control. In this study, the EEG signals were first analyzed with a power projective base method. Then we applied a decision-making model, sequential probability ratio testing (SPRT), for single-trial classification of motor imagery movement events. The unique strength of this proposed classification method lies in its accumulative process, which increases the discriminative power as more and more evidence is observed over time. The properties of the method were illustrated on thirteen subjects' recordings from three datasets. Results showed that our proposed power projective method outperformed two benchmark methods for every subject. Moreover, with the sequential classifier, the accuracies across subjects were significantly higher than with nonsequential ones. The average maximum accuracy of the SPRT method was 84.1%, as compared with 82.3% accuracy for the sequential Bayesian (SB) method. The proposed SPRT method provides an explicit relationship between stopping time, thresholds, and error, which is important for balancing the time-accuracy trade-off. These results suggest SPRT would be useful in speeding up decision-making while trading off errors in BCI. PMID:29348781
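    A generic sequential probability ratio test between two Gaussian hypotheses, illustrating the accumulative decision process described above; the power projective base features of the paper are replaced by synthetic scores.

    ```python
    import numpy as np
    from scipy.stats import norm

    def sprt(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.05, beta=0.05):
        """Sequential probability ratio test between H0: N(mu0, sigma) and
        H1: N(mu1, sigma). Returns the decision and the number of samples used."""
        upper = np.log((1 - beta) / alpha)       # accept H1 when exceeded
        lower = np.log(beta / (1 - alpha))       # accept H0 when undercut
        llr = 0.0
        for n, x in enumerate(samples, start=1):
            llr += norm.logpdf(x, mu1, sigma) - norm.logpdf(x, mu0, sigma)
            if llr >= upper:
                return "H1", n
            if llr <= lower:
                return "H0", n
        return "undecided", len(samples)

    rng = np.random.default_rng(0)
    decision, n_used = sprt(rng.normal(loc=1.0, scale=1.0, size=100))
    print(decision, n_used)       # typically "H1" after only a few samples
    ```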

  5. MO-FG-CAMPUS-JeP3-03: Detection of Unpredictable Patient Movement During SBRT Using a Single KV Projection of An On-Board CBCT System: Simulation Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Y; Sharp, G; Winey, B

    Purpose: An unpredictable movement of a patient can occur during SBRT even when immobilization devices are applied. In SBRT treatments using a conventional linear accelerator, detection of such movements relies heavily on human interaction and monitoring. This study aims to detect such positional abnormalities in real-time by assessing intra-fractional gantry mounted kV projection images of a patient's spine. Methods: We propose a self-CBCT image based spine tracking method consisting of the following steps: (1) Acquire a pre-treatment CBCT image; (2) Transform the CBCT volume according to the couch correction; (3) Acquire kV projections during treatment beam delivery; (4) Simultaneously with each acquisition generate a DRR from the CBCT volume based on the current projection geometry; (5) Perform an intensity gradient-based 2D registration between spine ROI images of the projection and the DRR images; (6) Report an alarm if the detected 2D displacement is beyond a threshold value. To demonstrate the feasibility, retrospective simulations were performed on 1,896 projections from nine CBCT sessions of three patients who received lung SBRT. The unpredictable movements were simulated by applying random rotations and translations to the reference CBCT prior to each DRR generation. As the ground truth, the 3D translations and/or rotations causing >3 mm displacement of the midpoint of the thoracic spine were regarded as abnormal. In the measurements, different threshold values of 2D displacement were tested to investigate the sensitivity and specificity of the proposed method. Results: A linear relationship between the ground truth 3D displacement and the detected 2D displacement was observed (R^2 = 0.44). When the 2D displacement threshold was set to 3.6 mm, the overall sensitivity and specificity were 77.7±5.7% and 77.9±3.5%, respectively. Conclusion: In this simulation study, it was demonstrated that intrafractional kV projections from an on-board CBCT system have the potential to detect unpredictable patient movement during SBRT. This research is funded by an Interfractional Imaging Research Grant from Elekta.

  6. Chemically intuited, large-scale screening of MOFs by machine learning techniques

    NASA Astrophysics Data System (ADS)

    Borboudakis, Giorgos; Stergiannakos, Taxiarchis; Frysali, Maria; Klontzas, Emmanuel; Tsamardinos, Ioannis; Froudakis, George E.

    2017-10-01

    A novel computational methodology for large-scale screening of MOFs is applied to gas storage with the use of machine learning technologies. This approach is a promising trade-off between the accuracy of ab initio methods and the speed of classical approaches, strategically combined with chemical intuition. The results demonstrate that the chemical properties of MOFs are indeed predictable (stochastically, not deterministically) using machine learning methods and automated analysis protocols, with the accuracy of predictions increasing with sample size. Our initial results indicate that this methodology is promising not only for gas storage in MOFs but also for many other materials science projects.
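    A hedged, generic sketch of the screening idea: train a supervised model on hypothetical MOF descriptors and evaluate it by cross-validation (the descriptors, targets, and model choice here are illustrative, not those of the study).

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)

    # Hypothetical MOF descriptors (pore diameter, surface area, void fraction, ...)
    n_mofs, n_descriptors = 2000, 6
    X = rng.uniform(size=(n_mofs, n_descriptors))
    # Hypothetical gas-uptake target with a nonlinear dependence plus noise
    y = 3.0 * X[:, 0] * X[:, 2] + np.sin(4 * X[:, 1]) + 0.1 * rng.normal(size=n_mofs)

    model = RandomForestRegressor(n_estimators=200, random_state=0)
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print("cross-validated R^2:", scores.mean().round(3))
    ```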

  7. 31 CFR 205.16 - What special rules apply to Federal assistance programs and projects funded by the Federal...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... assistance programs and projects funded by the Federal Highway Trust Fund? 205.16 Section 205.16 Money and... special rules apply to Federal assistance programs and projects funded by the Federal Highway Trust Fund? The following applies to Federal assistance programs and projects funded out of the Federal Highway...

  8. 31 CFR 205.16 - What special rules apply to Federal assistance programs and projects funded by the Federal...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... assistance programs and projects funded by the Federal Highway Trust Fund? 205.16 Section 205.16 Money and... special rules apply to Federal assistance programs and projects funded by the Federal Highway Trust Fund? The following applies to Federal assistance programs and projects funded out of the Federal Highway...

  9. 31 CFR 205.16 - What special rules apply to Federal assistance programs and projects funded by the Federal...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... assistance programs and projects funded by the Federal Highway Trust Fund? 205.16 Section 205.16 Money and... special rules apply to Federal assistance programs and projects funded by the Federal Highway Trust Fund? The following applies to Federal assistance programs and projects funded out of the Federal Highway...

  10. 31 CFR 205.16 - What special rules apply to Federal assistance programs and projects funded by the Federal...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... assistance programs and projects funded by the Federal Highway Trust Fund? 205.16 Section 205.16 Money and... special rules apply to Federal assistance programs and projects funded by the Federal Highway Trust Fund? The following applies to Federal assistance programs and projects funded out of the Federal Highway...

  11. 31 CFR 205.16 - What special rules apply to Federal assistance programs and projects funded by the Federal...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... assistance programs and projects funded by the Federal Highway Trust Fund? 205.16 Section 205.16 Money and... special rules apply to Federal assistance programs and projects funded by the Federal Highway Trust Fund? The following applies to Federal assistance programs and projects funded out of the Federal Highway...

  12. Geological Sequestration Training and Research Program in Capture and Transport: Development of the Most Economical Separation Method for CO2 Capture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vahdat, Nader

    2013-09-30

    The project provided hands-on training and networking opportunities to undergraduate students in the area of carbon dioxide (CO2) capture and transport, through a fundamental research study focused on advanced separation methods that can be applied to the capture of CO2 resulting from the combustion of fossil fuels for power generation. The project team's approach to achieving its objectives was to leverage existing Carbon Capture and Storage (CCS) course materials and teaching methods to create and implement an annual CCS short course for the Tuskegee University community; conduct a survey of CO2 separation and capture methods; utilize data to verify and develop computer models for CO2 capture; and build CCS networks and hands-on training experiences. The objectives accomplished as a result of this project were: (1) a comprehensive survey of CO2 capture methods was conducted and mathematical models were developed to compare the potential economics of the different methods based on the total cost per year per unit of CO2 avoidance; and (2) training was provided to introduce the latest CO2 capture technologies and deployment issues to the university community.

  13. Performing to Understand: Cultural Wealth, Precarity, and Shelter-Dwelling Youth

    ERIC Educational Resources Information Center

    Gallagher, Kathleen; Rodricks, Dirk J.

    2017-01-01

    Collaborating with "Project: Humanity," an acclaimed socially engaged theatre company, we mobilized, over 16 weeks, an applied theatre methodology of drama workshops and traditional qualitative research methods to explore issues of spatialized inequality and localized poverty with a youth shelter community in Toronto, Canada.…

  14. Inclusive Education at Primary Level: Reality or Phantasm

    ERIC Educational Resources Information Center

    Khan, Itfaq Khaliq; Behlol, Malik Ghulam

    2014-01-01

    The objectives of this study were to assess the impacts of Inclusive Education (IE) Project implemented in government schools of Islamabad and anticipate its practicability for public schools. Quantitative and qualitative methods were applied for data collection. Study instruments were structured interviews, unstructured focus group discussions,…

  15. Service Learning and Community Health Nursing: A Natural Fit.

    ERIC Educational Resources Information Center

    Miller, Marilyn P.; Swanson, Elizabeth

    2002-01-01

    Community health nursing students performed community assessments and proposed and implemented service learning projects that addressed adolescent smoking in middle schools, home safety for elderly persons, industrial worker health, and sexual abuse of teenaged girls. Students learned to apply epidemiological research methods, mobilize resources,…

  16. IDC Re-Engineering Phase 2 Glossary Version 1.3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, Christopher J.; Harris, James M.

    2017-01-01

    This document contains the glossary of terms used for the IDC Re-Engineering Phase 2 project. This version was created for Iteration E3. The IDC applies automatic processing methods in order to produce, archive, and distribute standard IDC products on behalf of all States Parties.

  17. Research in navigation and optimization for space trajectories

    NASA Technical Reports Server (NTRS)

    Pines, S.; Kelley, H. J.

    1979-01-01

    Topics covered include: (1) initial Cartesian coordinates for rapid precision orbit prediction; (2) accelerating convergence in optimization methods using search routines by applying curvilinear projection ideas; (3) perturbation-magnitude control for difference-quotient estimation of derivatives; and (4) determining the accelerometer bias for in-orbit shuttle trajectories.

  18. The Filtered Abel Transform and Its Application in Combustion Diagnostics

    NASA Technical Reports Server (NTRS)

    Simons, Stephen N. (Technical Monitor); Yuan, Zeng-Guang

    2003-01-01

    Many non-intrusive combustion diagnosis methods generate line-of-sight projections of a flame field. To reconstruct the spatial field of the measured properties, these projections need to be deconvoluted. When the spatial field is axisymmetric, commonly used deconvolution methods include the Abel transform, the onion peeling method, and the two-dimensional Fourier transform method and its derivatives such as the filtered back projection methods. This paper proposes a new approach for performing the Abel transform, which possesses the exactness of the Abel transform and the flexibility of incorporating various filters in the reconstruction process. The Abel transform is an exact method and the simplest among these commonly used methods. It is evinced in this paper that all exact reconstruction methods for axisymmetric distributions must be equivalent to the Abel transform because of its uniqueness and exactness. A detailed proof is presented to show that the two-dimensional Fourier methods, when applied to axisymmetric cases, are identical to the Abel transform. Discrepancies among various reconstruction methods stem from the different approximations made to perform numerical calculations. An equation relating the spectrum of a set of projection data to that of the corresponding spatial distribution is obtained, which shows that the spectrum of the projection is equal to the Abel transform of the spectrum of the corresponding spatial distribution. From the equation, if either the projection or the distribution is bandwidth limited, the other is also bandwidth limited, and both have the same bandwidth. If the two are not bandwidth limited, the Abel transform has a bias against low wave number components in most practical cases. This explains why the Abel transform and all exact deconvolution methods are sensitive to high wave number noise. The filtered Abel transform is based on the fact that the Abel transform of filtered projection data is equal to an integral transform of the original projection data with the kernel function being the Abel transform of the filtering function. The kernel function is independent of the projection data and can be obtained separately once the filtering function is selected. Users can select the best filtering function for a particular set of experimental data. Once the kernel function is obtained, it can be applied repeatedly to a number of projection data sets (rows) from the same experiment. When an entire flame image that contains a large number of projection lines needs to be processed, the new approach significantly reduces computational effort in comparison with the conventional approach in which each projection data set is deconvoluted separately. Computer codes have been developed to perform the filtered Abel transform for an entire flame field. Measured soot volume fraction data of a jet diffusion flame are processed as an example.
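    The filtered Abel transform itself is not reproduced here; the sketch below only implements the plain forward Abel transform that maps an axisymmetric distribution to its line-of-sight projection, using a substitution that removes the integrable singularity, and checks it against the known Gaussian pair.

    ```python
    import numpy as np

    def abel_forward(f, y, R=6.0, n=2000):
        """Forward Abel transform F(y) = 2 * integral_y^R f(r) r / sqrt(r^2 - y^2) dr,
        computed with the substitution u = sqrt(r^2 - y^2) to remove the singularity."""
        u = np.linspace(0.0, np.sqrt(max(R ** 2 - y ** 2, 0.0)), n)
        vals = f(np.sqrt(u ** 2 + y ** 2))
        # Trapezoidal rule on the regularized integrand
        return 2.0 * (np.sum(vals) - 0.5 * (vals[0] + vals[-1])) * (u[1] - u[0])

    # Test on a Gaussian, whose Abel transform is known analytically:
    # f(r) = exp(-r^2)  ->  F(y) = sqrt(pi) * exp(-y^2)
    f = lambda r: np.exp(-r ** 2)
    ys = np.linspace(0.0, 2.0, 5)
    numeric = np.array([abel_forward(f, y) for y in ys])
    analytic = np.sqrt(np.pi) * np.exp(-ys ** 2)
    print(np.max(np.abs(numeric - analytic)))   # small discretization error
    ```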

  19. Comparing and using assessments of the value of information to clinical decision-making.

    PubMed Central

    Urquhart, C J; Hepworth, J B

    1996-01-01

    This paper discusses the Value project, which assessed the value to clinical decision-making of information supplied by National Health Service (NHS) library and information services. The project not only showed how health libraries in the United Kingdom help clinicians in decision-making but also provided quality assurance guidelines for these libraries to help make their information services more effective. The paper reviews methods and results used in previous studies of the value of health libraries, noting that methodological differences appear to affect the results. The paper also discusses aspects of user involvement, categories of clinical decision-making, the value of information to present and future clinical decisions, and the combination of quantitative and qualitative assessments of value, as applied to the Value project and the studies reviewed. The Value project also demonstrated that the value placed on information depends in part on the career stage of the physician. The paper outlines the structure of the quality assurance tool kit, which is based on the findings and methods used in the Value project. PMID:8913550

  20. The Boeing Company Applied Academics Project Evaluation: Year Four. Evaluation Report.

    ERIC Educational Resources Information Center

    Wang, Changhua; Owens, Thomas R.

    This paper describes fourth-year outcomes (1993-94) of the Boeing Company-funded Applied Academics Project. Since the 1990-91 school year, the company has provided funds to improve and expand applied academics in 60 Washington high schools. Data were collected from pre- and post-surveys of students enrolled in the project's Applied Mathematics…

  1. Multiple site receptor modeling with a minimal spanning tree combined with a Kohonen neural network

    NASA Astrophysics Data System (ADS)

    Hopke, Philip K.

    1999-12-01

    A combination of two pattern recognition methods has been developed that allows the generation of geographical emission maps from multivariate environmental data. In such a projection into a visually interpretable subspace by a Kohonen Self-Organizing Feature Map, the topology of the higher dimensional variable space can be preserved, but part of the information about the correct neighborhood among the sample vectors will be lost. This can partly be compensated for by an additional projection of Prim's Minimal Spanning Tree into the trained neural network. This new environmental receptor modeling technique has been adapted for multiple sampling sites. The behavior of the method has been studied using simulated data. Subsequently, the method has been applied to mapping data sets from the Southern California Air Quality Study. The projection of 17 chemical variables measured at up to 8 sampling sites provided a 2D, visually interpretable, geometrically reasonable arrangement of air pollution sources in the South Coast Air Basin.
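    A minimal Kohonen self-organizing feature map sketch on synthetic multivariate data, illustrating the projection step described above; the minimal-spanning-tree overlay and all parameter choices of the study are omitted (values below are hypothetical).

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def train_som(X, grid=(8, 8), epochs=20, lr0=0.5, sigma0=3.0):
        """Train a 2D Kohonen self-organizing feature map and return the codebook."""
        rows, cols = grid
        W = rng.normal(size=(rows, cols, X.shape[1]))
        gy, gx = np.mgrid[0:rows, 0:cols]          # grid coordinates of the units
        for epoch in range(epochs):
            lr = lr0 * (1 - epoch / epochs)
            sigma = sigma0 * (1 - epoch / epochs) + 0.5
            for x in X[rng.permutation(len(X))]:
                d = np.linalg.norm(W - x, axis=2)                   # unit distances
                by, bx = np.unravel_index(np.argmin(d), d.shape)    # best matching unit
                h = np.exp(-((gy - by) ** 2 + (gx - bx) ** 2) / (2 * sigma ** 2))
                W += lr * h[..., None] * (x - W)                    # neighborhood update
        return W

    def project(X, W):
        """Map each sample to the grid position of its best matching unit."""
        return np.array([np.unravel_index(np.argmin(np.linalg.norm(W - x, axis=2)),
                                          W.shape[:2]) for x in X])

    # Synthetic "17 chemical variables at multiple sites" style data
    X = rng.normal(size=(300, 17))
    W = train_som(X)
    print(project(X, W)[:5])        # 2D map coordinates of the first five samples
    ```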

  2. Simultaneous reconstruction of 3D refractive index, temperature, and intensity distribution of combustion flame by double computed tomography technologies based on spatial phase-shifting method

    NASA Astrophysics Data System (ADS)

    Guo, Zhenyan; Song, Yang; Yuan, Qun; Wulan, Tuya; Chen, Lei

    2017-06-01

    In this paper, a transient multi-parameter three-dimensional (3D) reconstruction method is proposed to diagnose and visualize a combustion flow field. Emission and transmission tomography based on spatial phase-shifted technology are combined to reconstruct, simultaneously, the various physical parameter distributions of a propane flame. Two cameras triggered by the internal trigger mode capture the projection information of the emission and moiré tomography, respectively. A two-step spatial phase-shifting method is applied to extract the phase distribution in the moiré fringes. By using the filtered back-projection algorithm, we reconstruct the 3D refractive-index distribution of the combustion flow field. Finally, the 3D temperature distribution of the flame is obtained from the refractive index distribution using the Gladstone-Dale equation. Meanwhile, the 3D intensity distribution is reconstructed based on the radiation projections from the emission tomography. Therefore, the structure and edge information of the propane flame are well visualized.
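    The temperature step relies on the Gladstone-Dale relation; written out (with the ideal-gas step stated as a common assumption that may differ in detail from the paper's formulation):

    ```latex
    % Gladstone-Dale relation linking refractive index n and gas density \rho
    n - 1 = K\,\rho
    % Combined with the ideal gas law \rho = pM/(RT) at (assumed) constant pressure p,
    % the temperature field follows from the reconstructed refractive index:
    T = \frac{K\,p\,M}{R\,(n - 1)}
    ```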

  3. Evaluation of Health Equity Impact of Structural Policies: Overview of Research Methods Used in the SOPHIE Project.

    PubMed

    Kunst, Anton E

    2017-07-01

    This article briefly assesses the research methods that were applied in the SOPHIE project to evaluate the impact of structural policies on population health and health inequalities. The evaluation of structural policies is one of the key methodological challenges in today's public health. The experience in the SOPHIE project was that mixed methods are essential to identify, understand, and predict the health impact of structural policies. On the one hand, quantitative studies that included spatial comparisons or time trend analyses, preferably in a quasi-experimental design, showed that some structural policies were associated with improved population health and smaller health inequalities. On the other hand, qualitative studies, often inspired by realist approaches, were important to understand how these policies could have achieved the observed impact and why they would succeed in some settings but fail in others. This review ends with five recommendations for future studies that aim to evaluate, understand, and predict how health inequalities can be reduced through structural policies.

  4. Detection of the nipple in automated 3D breast ultrasound using coronal slab-average-projection and cumulative probability map

    NASA Astrophysics Data System (ADS)

    Kim, Hannah; Hong, Helen

    2014-03-01

    We propose an automatic method for nipple detection on 3D automated breast ultrasound (3D ABUS) images using a coronal slab-average-projection and a cumulative probability map. First, to identify coronal images showing a marked distinction between the nipple-areola region and the skin, the skewness of each coronal image is measured and the negatively skewed images are selected. A coronal slab-average-projection image is then reformatted from the selected images. Second, to localize the nipple-areola region, an elliptical ROI covering it is detected using the Hough ellipse transform in the slab-average-projection image. Finally, to separate the nipple from the areola region, 3D Otsu's thresholding is applied to the elliptical ROI, and a cumulative probability map within the ROI is generated by assigning high probability to low-intensity regions. Falsely detected small components are eliminated using morphological opening, and the center point of the detected nipple region is calculated. Experimental results show that our method provides a 94.4% nipple detection rate.
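
    The slice-selection step lends itself to a compact implementation. Below is a minimal sketch, assuming the 3D ABUS volume is a NumPy array indexed as (coronal slice, row, column); the layout and fallback behaviour are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from scipy.stats import skew

def coronal_slab_average_projection(volume):
    """Average the negatively skewed coronal slices into a single projection image."""
    skews = np.array([skew(sl, axis=None) for sl in volume])  # skewness per coronal slice
    selected = volume[skews < 0]          # keep only negatively skewed slices
    if selected.size == 0:                # fall back to all slices if none qualify
        selected = volume
    return selected.mean(axis=0)          # slab-average-projection image

# Example with a synthetic volume of 80 coronal slices
vol = np.random.rand(80, 256, 256)
print(coronal_slab_average_projection(vol).shape)   # (256, 256)
```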

  5. Globally maximizing, locally minimizing: unsupervised discriminant projection with applications to face and palm biometrics.

    PubMed

    Yang, Jian; Zhang, David; Yang, Jing-Yu; Niu, Ben

    2007-04-01

    This paper develops an unsupervised discriminant projection (UDP) technique for dimensionality reduction of high-dimensional data in small sample size cases. UDP can be seen as a linear approximation of a multimanifolds-based learning framework which takes into account both the local and nonlocal quantities. UDP characterizes the local scatter as well as the nonlocal scatter, seeking to find a projection that simultaneously maximizes the nonlocal scatter and minimizes the local scatter. This characteristic makes UDP more intuitive and more powerful than the most up-to-date method, Locality Preserving Projection (LPP), which considers only the local scatter for clustering or classification tasks. The proposed method is applied to face and palm biometrics and is examined using the Yale, FERET, and AR face image databases and the PolyU palmprint database. The experimental results show that UDP consistently outperforms LPP and PCA and outperforms LDA when the training sample size per class is small. This demonstrates that UDP is a good choice for real-world biometrics applications.
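
    The criterion described above can be written as a generalized eigenproblem. The sketch below, assuming dense NumPy data and a simple k-NN connectivity graph, builds local and nonlocal scatter matrices and keeps the directions that maximize the nonlocal-to-local scatter ratio; it is an illustrative approximation of the UDP idea, not the authors' implementation (which, in small-sample cases, would typically reduce dimension first).

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.neighbors import kneighbors_graph

def udp_projection(X, n_components=2, k=5):
    """Directions w maximizing (w' S_N w) / (w' S_L w): nonlocal vs. local scatter."""
    n, d = X.shape
    H = kneighbors_graph(X, k, mode='connectivity').toarray()
    H = np.maximum(H, H.T)                      # symmetric k-NN adjacency (local pairs)
    K = 1.0 - H - np.eye(n)                     # complement adjacency (nonlocal pairs)

    def scatter(W):
        L = np.diag(W.sum(axis=1)) - W          # graph Laplacian of the pair weights
        return X.T @ L @ X

    S_L, S_N = scatter(H), scatter(K)
    S_L += 1e-6 * np.trace(S_L) / d * np.eye(d) # mild regularization for stability
    vals, vecs = eigh(S_N, S_L)                 # generalized eigenproblem S_N w = lambda S_L w
    return vecs[:, ::-1][:, :n_components]      # eigenvectors with the largest eigenvalues

# Example: project 50-dimensional samples to 2-D
X = np.random.rand(100, 50)
print((X @ udp_projection(X)).shape)            # (100, 2)
```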

  6. [Characteristics and innovation in projects of ethnomedicine and ethnopharmacology funded by National Natural Science Foundation of China].

    PubMed

    Han, Li-wei

    2015-09-01

    The overall situation of projects in ethnomedicine and ethnopharmacology funded by the National Natural Science Foundation of China (NSFC) since 2008 is presented in this paper. The main sources of the characteristics and innovation of the funded projects were summarized; these may come from several aspects, such as ethnomedical theories, the dominant diseases of ethnomedicine, special diseases in areas inhabited by ethnic minorities, unique ethnomedical therapies, special methods for applying medication, endemic medicinal materials in ethnic minority areas, and the same medicinal materials with different applications. Examples have been provided as references for applicants in the fields of ethnomedicine and ethnopharmacology.

  7. Research on Green Construction Technology Applied at Guangzhou Hongding Building Project

    NASA Astrophysics Data System (ADS)

    Lou, Yong Zhong

    2018-06-01

    Green construction technology is the embodiment of the sustainable development strategy in the construction industry, and it is a new construction mode with higher environmental protection requirements. Based on the Hongding building project, this paper describes the application and innovation of green construction techniques during the project's implementation, as well as the difficulties and characteristics encountered in practice. The economic and social benefits of green construction are compared with those of the traditional construction model; the achievements and experience gained with green construction technology in the project are summarized; the ideas and methods used in implementing green construction are abstracted; and some suggestions are put forward for the development of green construction.

  8. User Participation in Coproduction of Health Innovation: Proposal for a Synergy Project.

    PubMed

    Nygren, Jens; Zukauskaite, Elena; Westberg, Niklas

    2018-05-09

    This project concerns advancing knowledge, methods, and logic for user participation in the coproduction of health innovations. Such advancement is vital for several reasons. From a user perspective, participation in coproduction provides an opportunity to gain real influence over goal definition, design, and implementation of health innovations, ensuring that the developed solution solves real problems in the right ways. From a societal perspective, it is a means to improve the efficiency of health care and the implementation of the Patient Act. For industry, frameworks and knowledge of coproduction offer tools to operate in a complex sector, with great potential for innovation of services and products. The fundamental objective of this project is to advance knowledge and methods of how user participation in the coproduction of health innovations can be applied to benefit users, industry, and the public sector. This is a synergy project, which means that the objective will be accomplished through collaboration and meta-analysis across three subprojects that address different user groups, apply different strategies to promote human health, and relate to different parts of the health sector. Furthermore, the subprojects focus on distinct stages in the spectrum of innovation, with the objective of generating knowledge of the innovation process as a whole. The project is organized around three work packages related to three challenges: coproduction, positioning, and realization. Each subproject is designed so that it has its own field of study with clearly identified objectives but also targets work packages to contribute to the project as a whole. The work on the work packages will use case methodology for data collection and analysis, with the subprojects as data sources. More concretely, a multiple case study logic will be applied, with each subproject representing a separate case; the cases are similar in their attention to user participation in coproduction but differ regarding, for example, context and target groups. At the synergy level, framework methodology will be used to handle and analyze the large amount of information generated within the subprojects. The project period is from July 1, 2018 to June 30, 2022. By addressing the objective of this project, we will create new knowledge on how to manage challenges to health innovation associated with the coproduction process, the positioning of solutions, and realization. ©Jens Nygren, Elena Zukauskaite, Niklas Westberg. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 09.05.2018.

  9. 3D Modeling as Method for Construction and Analysis of Graphic Objects

    NASA Astrophysics Data System (ADS)

    Kheyfets, A. L.; Vasilieva, V. N.

    2017-11-01

    The use of 3D modeling for constructing and analyzing perspective projections and shadows is considered, and the creation of photorealistic images is shown. The perspective of a construction project and the characterization of its image are given as an example. The authors consider the construction of a dynamic block as a means of storing graphical information and automating geometric constructions, demonstrated with the example of a dynamic block for a truss node. The constructions are presented as applied to the AutoCAD software. The paper is aimed at improving the graphic methods of architectural design and the educational process for training Bachelor's degree students majoring in construction.

  10. Fuel Aging in Storage and Transportation (FAST): Accelerated Characterization and Performance Assessment of the Used Nuclear Fuel Storage System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDeavitt, Sean

    2016-08-02

    This Integrated Research Project (IRP) was established to characterize key limiting phenomena related to the performance of used nuclear fuel (UNF) storage systems. This was an applied engineering project with a specific application in view (i.e., UNF dry storage). The completed tasks made use of a mixture of basic science and engineering methods. The overall objective was to create, or enable the creation of, predictive tools in the form of observation methods, phenomenological models, and databases that will enable the design, installation, and licensing of dry UNF storage systems capable of containing UNF for extended periods of time.

  11. A Survey of Variable Extragalactic Sources with XTE's All Sky Monitor (ASM)

    NASA Technical Reports Server (NTRS)

    Jernigan, Garrett

    1998-01-01

    The original goal of the project was the near real-time detection of AGN utilizing the SSC 3 of the ASM on XTE which does a deep integration on one 100 square degree region of the sky. While the SSC never performed sufficiently well to allow the success of this goal, the work on the project has led to the development of a new analysis method for coded aperture systems which has now been applied to ASM data for mapping regions near clusters of galaxies such as the Perseus Cluster and the Coma Cluster. Publications are in preparation that describe both the new method and the results from mapping clusters of galaxies.

  12. Fan beam image reconstruction with generalized Fourier slice theorem.

    PubMed

    Zhao, Shuangren; Yang, Kang; Yang, Kevin

    2014-01-01

    For parallel-beam geometry, Fourier reconstruction works via the Fourier slice theorem (also called the central slice theorem or projection slice theorem). For the fan-beam situation, the Fourier slice theorem can be extended to a generalized Fourier slice theorem (GFST) for fan-beam image reconstruction. We briefly introduced this method at a conference; this paper reintroduces the GFST method for fan-beam geometry in detail. The GFST method can be described as follows: the Fourier plane is filled by adding up the contributions from all fan-beam projections individually; the values in the Fourier plane are thereby calculated directly on Cartesian coordinates, avoiding the interpolation from polar to Cartesian coordinates in the Fourier domain; an inverse fast Fourier transform is then applied to the Fourier-plane image, yielding a reconstructed image in the spatial domain. The reconstructed images from the GFST method and the filtered backprojection (FBP) method are compared. The major differences between the GFST and FBP methods are: (1) the interpolation is performed on different data sets: the GFST method interpolates the projection data, whereas the FBP method interpolates the filtered projection data; (2) the filtering is done in different places: the GFST filters in the Fourier domain, whereas the FBP method applies a ramp filter to the projections. The resolution of the ramp filter varies with location, but the filter in the Fourier domain yields a resolution that is invariant with location. One advantage of the GFST method over the FBP method is that, in the short-scan situation, an exact solution can be obtained with the GFST method but not with the FBP method. The computational cost of both the GFST and FBP methods is O(N^3), where N is the number of pixels in one dimension.
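
    The parallel-beam slice theorem that the GFST generalizes can be checked numerically in a few lines. The sketch below, assuming a simple synthetic image and a projection taken along one image axis (so no interpolation is needed), verifies that the 1D FFT of the projection equals the central line of the image's 2D FFT; the fan-beam generalization itself is not reproduced here.

```python
import numpy as np

# Synthetic object and its parallel projection along the first (y) axis
image = np.zeros((128, 128))
image[40:90, 50:80] = 1.0
projection = image.sum(axis=0)

# Fourier slice theorem: 1-D FFT of the projection == central row of the 2-D FFT
slice_1d = np.fft.fft(projection)
central_slice = np.fft.fft2(image)[0, :]
print(np.allclose(slice_1d, central_slice))   # True
```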

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Danko, George L

    To increase understanding of the energy extraction capacity of Enhanced Geothermal Systems (EGS), a numerical model development and application project has been completed. The general objective of the project is to develop and apply a new, data-coupled Thermal-Hydrological-Mechanical-Chemical (T-H-M-C) model in which the four internal components can be freely selected from existing simulation software without merging and cross-combining a diverse set of computational codes. Eight tasks were completed during the project period. The results are reported in five publications, an MS thesis, twelve quarterly reports, and two annual reports to DOE. Two US patents were also issued during the project period, with one patent application originating prior to the start of the project. The “Multiphase Physical Transport Modeling Method and Modeling System” (U.S. Patent 8,396,693 B2, 2013), a key element in the GHE sub-model solution, is successfully used for EGS studies. The “Geothermal Energy Extraction System and Method" invention (U.S. Patent 8,430,166 B2, 2013) originates from the time of project performance, describing a new fluid flow control solution. The new, coupled T-H-M-C numerical model will help in analyzing and designing new, efficient EGS systems.

  14. Through Their Eyes: Lessons Learned Using Participatory Methods in Health Care Quality Improvement Projects.

    PubMed

    Balbale, Salva N; Locatelli, Sara M; LaVela, Sherri L

    2016-08-01

    In this methodological article, we examine participatory methods in depth to demonstrate how these methods can be adopted for quality improvement (QI) projects in health care. We draw on existing literature and our QI initiatives in the Department of Veterans Affairs to discuss the application of photovoice and guided tours in QI efforts. We highlight lessons learned and several benefits of using participatory methods in this area. Using participatory methods, evaluators can engage patients, providers, and other stakeholders as partners to enhance care. Participant involvement helps yield actionable data that can be translated into improved care practices. Use of these methods also helps generate key insights to inform improvements that truly resonate with stakeholders. Using participatory methods is a valuable strategy to harness participant engagement and drive improvements that address individual needs. In applying these innovative methodologies, evaluators can transcend traditional approaches to uniquely support evaluations and improvements in health care. © The Author(s) 2015.

  15. Structure and Management of an Engineering Senior Design Course.

    PubMed

    Tanaka, Martin L; Fischer, Kenneth J

    2016-07-01

    The design of products and processes is an important area in engineering. Students in engineering schools learn fundamental principles in their courses but often lack an opportunity to apply these methods to real-world problems until their senior year. This article describes important elements that should be incorporated into a senior capstone design course. It includes a description of the general principles used in engineering design and a discussion of why students often have difficulty with application and revert to trial and error methods. The structure of a properly designed capstone course is dissected and its individual components are evaluated. Major components include assessing resources, identifying projects, establishing teams, understanding requirements, developing conceptual designs, creating detailed designs, building prototypes, testing performance, and final presentations. In addition to the course design, team management and effective mentoring are critical to success. This article includes suggested guidelines and tips for effective design team leadership, attention to detail, investment of time, and managing project scope. Furthermore, the importance of understanding business culture, displaying professionalism, and considerations of different types of senior projects is discussed. Through a well-designed course and proper mentoring, students will learn to apply their engineering skills and gain basic business knowledge that will prepare them for entry-level positions in industry.

  16. Thermodynamic and Transport Properties of Real Air Plasma in Wide Range of Temperature and Pressure

    NASA Astrophysics Data System (ADS)

    Wang, Chunlin; Wu, Yi; Chen, Zhexin; Yang, Fei; Feng, Ying; Rong, Mingzhe; Zhang, Hantian

    2016-07-01

    Air plasma has been widely applied in industrial manufacturing. In this paper, the thermodynamic and transport properties of both dry and humid air plasmas are calculated over the temperature range 300-100,000 K and pressures of 0.1-100 atm. To build a more precise model of real air plasma, over 70 species are considered in the composition calculation. Two different methods, the Gibbs free energy minimization method and the mass action law method, are used to determine the composition of the air plasma in different temperature ranges. For the transport coefficients, the simplified Chapman-Enskog method developed by Devoto has been applied using the most recent collision integrals. It is found that the presence of CO2 has almost no effect on the properties of air plasma. The influence of H2O can be ignored except in low-pressure air plasma, in which the saturated vapor pressure is relatively high. The results will serve as credible inputs for computational simulation of air plasma. Supported by the National Key Basic Research Program of China (973 Program)(No. 2015CB251002), National Natural Science Foundation of China (Nos. 51521065, 51577145), the Science and Technology Project Funds of the Grid State Corporation (SGTYHT/13-JS-177), the Fundamental Research Funds for the Central Universities, and State Grid Corporation Project (GY71-14-004)

  17. Perspectives on Linguistic Documentation from Sociolinguistic Research on Dialects

    ERIC Educational Resources Information Center

    Tagliamonte, Sali A.

    2017-01-01

    The goal of the paper is to demonstrate how sociolinguistic research can be applied to endangered language documentation field linguistics. It first provides an overview of the techniques and practices of sociolinguistic fieldwork and the ensuing corpus compilation methods. The discussion is framed with examples from research projects focused on…

  18. Quality Improvement Initiative in School-Based Health Centers across New Mexico

    ERIC Educational Resources Information Center

    Booker, John M.; Schluter, Janette A.; Carrillo, Kris; McGrath, Jane

    2011-01-01

    Background: Quality improvement principles have been applied extensively to health care organizations, but implementation of quality improvement methods in school-based health centers (SBHCs) remains in a developmental stage with demonstration projects under way in individual states and nationally. Rural areas, such as New Mexico, benefit from the…

  19. Research on Mobile Learning Activities Applying Tablets

    ERIC Educational Resources Information Center

    Kurilovas, Eugenijus; Juskeviciene, Anita; Bireniene, Virginija

    2015-01-01

    The paper aims to present current research on mobile learning activities in Lithuania while implementing flagship EU-funded CCL project on application of tablet computers in education. In the paper, the quality of modern mobile learning activities based on learning personalisation, problem solving, collaboration, and flipped class methods is…

  20. INVESTIGATION OF ORGANIC WEED CONTROL METHODS, PESTICIDE SPECIAL STUDY, COLORADO STATE UNIVERSITY

    EPA Science Inventory

    The project is proposed for the 2003 and 2004 growing seasons. Corn gluten meal (CGM), treated paper mulch and plastic mulch, along with conventional herbicide, will be applied to fields of drip irrigated broccoli in a randomized complete block design with 6 replicates. Due to ...

  1. The instrumental seismicity of the Barents and Kara sea region: relocated event catalog from early twentieth century to 1989

    NASA Astrophysics Data System (ADS)

    Morozov, Alexey Nikolaevich; Vaganova, Natalya V.; Asming, Vladimir E.; Konechnaya, Yana V.; Evtyugina, Zinaida A.

    2018-05-01

    We have relocated seismic events registered within the Barents and Kara sea region from the early twentieth century to 1989 with a view to creating a relocated catalog. For the relocation, we collected all available seismic bulletins from the global network, using data from the ISC Bulletin (International Seismological Centre), the ISC-GEM project (International Seismological Centre-Global Earthquake Model), the EuroSeismos project, and Soviet seismic stations of the Geophysical Survey of the Russian Academy of Sciences. The location was performed by applying a modified method of generalized beamforming. We considered several travel time models and selected the one with the best location accuracy for ground truth events. Verification of the modified method and selection of the travel time model were performed using data on four nuclear explosions that occurred in the area of the Novaya Zemlya Archipelago and in the north of the European part of Russia. The modified method and the Barents travel time model provide sufficient accuracy for event location in the region. The relocation procedure was applied to 31 of 36 seismic events registered within the Barents and Kara sea region.

  2. Transductive multi-view zero-shot learning.

    PubMed

    Fu, Yanwei; Hospedales, Timothy M; Xiang, Tao; Gong, Shaogang

    2015-11-01

    Most existing zero-shot learning approaches exploit transfer learning via an intermediate semantic representation shared between an annotated auxiliary dataset and a target dataset with different classes and no annotation. A projection from a low-level feature space to the semantic representation space is learned from the auxiliary dataset and applied without adaptation to the target dataset. In this paper we identify two inherent limitations with these approaches. First, due to having disjoint and potentially unrelated classes, the projection functions learned from the auxiliary dataset/domain are biased when applied directly to the target dataset/domain. We call this problem the projection domain shift problem and propose a novel framework, transductive multi-view embedding, to solve it. The second limitation is the prototype sparsity problem which refers to the fact that for each target class, only a single prototype is available for zero-shot learning given a semantic representation. To overcome this problem, a novel heterogeneous multi-view hypergraph label propagation method is formulated for zero-shot learning in the transductive embedding space. It effectively exploits the complementary information offered by different semantic representations and takes advantage of the manifold structures of multiple representation spaces in a coherent manner. We demonstrate through extensive experiments that the proposed approach (1) rectifies the projection shift between the auxiliary and target domains, (2) exploits the complementarity of multiple semantic representations, (3) significantly outperforms existing methods for both zero-shot and N-shot recognition on three image and video benchmark datasets, and (4) enables novel cross-view annotation tasks.
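
    For context, the baseline pipeline that suffers from the projection domain shift described above can be sketched in a few lines: a regressor maps low-level features to the semantic space on the auxiliary classes and is then applied unchanged to the unseen target classes. The data shapes and names below are hypothetical; the paper's transductive multi-view embedding and hypergraph label propagation are not shown.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Hypothetical setup: 85-d semantic (attribute) prototypes, 1024-d visual features
aux_protos = rng.random((30, 85))                      # 30 auxiliary (seen) classes
aux_labels = rng.integers(0, 30, size=500)
mix = rng.random((85, 1024))
aux_feats = aux_protos[aux_labels] @ mix + 0.1 * rng.random((500, 1024))

tgt_protos = rng.random((10, 85))                      # 10 unseen target classes
tgt_feats = rng.random((20, 1024))                     # unlabeled target samples

# Learn the feature-to-semantic projection on the auxiliary domain only
proj = Ridge(alpha=1.0).fit(aux_feats, aux_protos[aux_labels])

# Zero-shot prediction: project target features, assign the nearest class prototype.
# Reusing `proj` without adaptation is exactly where the projection domain shift arises.
sem = proj.predict(tgt_feats)
dists = np.linalg.norm(sem[:, None, :] - tgt_protos[None, :, :], axis=2)
print(dists.argmin(axis=1))                            # predicted target classes
```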

  3. Volumetric display containing multiple two-dimensional color motion pictures

    NASA Astrophysics Data System (ADS)

    Hirayama, R.; Shiraki, A.; Nakayama, H.; Kakue, T.; Shimobaba, T.; Ito, T.

    2014-06-01

    We have developed an algorithm which can record multiple two-dimensional (2-D) gradated projection patterns in a single three-dimensional (3-D) object. Each recorded pattern has its own projection direction and can only be seen from that direction. The proposed algorithm has two important features: the number of recorded patterns is theoretically infinite, and no meaningful pattern can be seen outside of the projection directions. In this paper, we expanded the algorithm to record multiple 2-D projection patterns in color. There are two popular ways of color mixing: additive and subtractive. Additive color mixing, used to mix light, is based on RGB primaries, while subtractive color mixing, used to mix inks, is based on CMY primaries. We developed two coloring methods, one based on additive mixing and one on subtractive mixing. We performed numerical simulations of the coloring methods and confirmed their effectiveness. We also fabricated two types of volumetric display and applied the proposed algorithm to them. One is a cubic display constructed from light-emitting diodes (LEDs) in an 8×8×8 array, whose lighting patterns are controlled by a microcomputer board. The other is made of a 7×7 array of threads, each illuminated by a projector connected to a PC. As a result of the implementation, we succeeded in recording multiple 2-D color motion pictures in the volumetric displays. Our algorithm can be applied to digital signage, media art, and so forth.

  4. Research in Atomic, Ionic and Photonic Systems for Scalable Deterministic Quantum Logic

    DTIC Science & Technology

    2005-11-17

    1. Ion Trap Project (DL, ANS, DS) Year 1 The “pushing gate” that we intend to use to entangle ions was thoroughly studied theoretically (milestone 1...allow more complex experimental sequences (e.g. Raman sideband cooling). We achieved important goals on the way to implementing an entangling gate in...for a two-ion entangling gate (in the method of [3]), we applied the same force to a single ion. When applied to a spin superposition state, the

  5. Size-extensive QCISDT — implementation and application

    NASA Astrophysics Data System (ADS)

    Cremer, Dieter; He, Zhi

    1994-05-01

    A size-extensive quadratic CI method with single (S), double (D), and triple (T) excitations, QCISDT, has been derived by appropriate cancellation of disconnected terms in the CISDT projection equations. Matrix elements of the new QCI method have been evaluated in terms of two-electron integrals and applied to a number of atoms and small molecules. While QCISDT results are of similar accuracy to CCSDT results, the new method is easier to implement, converges in many cases faster and, thereby, leads to advantages compared to CCSDT.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harben, P E; Harris, D; Myers, S

    Seismic imaging and tracking methods have intelligence and monitoring applications. Current systems, however, do not adequately calibrate or model the unknown geological heterogeneity. Current systems are also not designed for rapid data acquisition and analysis in the field. This project seeks to build the core technological capabilities, coupled with innovative deployment, processing, and analysis methodologies, to allow seismic methods to be effectively utilized in seismic imaging and vehicle tracking applications where rapid (minutes to hours) and real-time analysis is required. The goal of this project is to build capabilities in acquisition system design and utilization, in full 3D finite-difference modeling, and in statistical characterization of geological heterogeneity. Such capabilities, coupled with a rapid field analysis methodology based on matched field processing, are applied to problems associated with surveillance, battlefield management, finding hard and deeply buried targets, and portal monitoring. This project benefits the U.S. military and intelligence community in support of LLNL's national-security mission. FY03 was the final year of this project. In the 2.5 years this project was active, numerous and varied developments and milestones were accomplished. A wireless communication module for seismic data was developed to facilitate rapid seismic data acquisition and analysis. The E3D code was enhanced to include topographic effects. Codes were developed to implement the Karhunen-Loeve (K-L) statistical methodology for generating geological heterogeneity that can be utilized in E3D modeling. The matched field processing methodology applied to vehicle tracking, based on a field calibration to characterize geological heterogeneity, was tested and successfully demonstrated in a tank tracking experiment at the Nevada Test Site. A 3-seismic-array vehicle tracking testbed was installed on-site at LLNL for testing real-time seismic tracking methods. A field experiment was conducted over a tunnel at the Nevada Test Site that quantified the tunnel reflection signal and, coupled with modeling, identified key needs and requirements for the experimental layout of sensors. A large field experiment was conducted at the Lake Lynn Laboratory, a mine safety research facility in Pennsylvania, over a tunnel complex in realistic, difficult conditions. This experiment gathered the data necessary for a full 3D attempt to apply the methodology. The experiment also collected data to analyze the capability to detect and locate in-tunnel explosions for mine safety and other applications.

  7. Deducing Climatic Elasticity to Assess Projected Climate Change Impacts on Streamflow Change across China

    NASA Astrophysics Data System (ADS)

    Liu, Jianyu; Zhang, Qiang; Zhang, Yongqiang; Chen, Xi; Li, Jianfeng; Aryal, Santosh K.

    2017-10-01

    Climatic elasticity has been widely applied to assess streamflow responses to climate change. To fully assess the impacts of climate under global warming on streamflow and to reduce the error and uncertainty from the various control variables, we develop a four-parameter (precipitation, catchment characteristics n, and maximum and minimum temperatures) climatic elasticity method named PnT, based on the widely used Budyko framework and the simplified Makkink equation. We use this method to carry out the first comprehensive evaluation of the streamflow response to potential climate change for 372 widely spread catchments in China. The PnT climatic elasticity was first evaluated for the period 1980-2000 and then used to evaluate the streamflow response to climate change based on 12 global climate models under the Representative Concentration Pathway 2.6 (RCP2.6) and RCP8.5 emission scenarios. The results show that (1) the PnT climatic elasticity method is reliable; (2) projected increasing streamflow occurs in more than 60% of the selected catchments, with mean increments of 9% and 15.4% under RCP2.6 and RCP8.5 respectively; and (3) uncertainties in the projected streamflow are considerable in several regions, such as the Pearl River and Yellow River, with more than 40% of the selected catchments showing inconsistent change directions. Our results can help Chinese policy makers manage and plan water resources more effectively, and the PnT climatic elasticity method could be applied to other parts of the world.
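
    The idea of a climatic elasticity can be illustrated with a generic Budyko-type water balance. The sketch below, assuming the Choudhury-Yang form of the Budyko curve and illustrative catchment values, computes a numerical precipitation elasticity of streamflow; the paper's PnT formulation, which additionally links potential evaporation to maximum and minimum temperature through the Makkink equation, is not reproduced.

```python
import numpy as np

def choudhury_streamflow(P, E0, n):
    """Mean-annual streamflow Q = P - E with E from the Choudhury-Yang Budyko curve."""
    E = P * E0 / (P**n + E0**n) ** (1.0 / n)
    return P - E

def precipitation_elasticity(P, E0, n, rel_step=0.01):
    """Numerical elasticity: relative change in Q per relative change in P."""
    Q = choudhury_streamflow(P, E0, n)
    dQ = choudhury_streamflow(P * (1.0 + rel_step), E0, n) - Q
    return (dQ / Q) / rel_step

# Illustrative values (mm/yr), not data from the study
P, E0, n = 800.0, 1000.0, 1.8
print(precipitation_elasticity(P, E0, n))   # roughly 2: a 1% rise in P gives ~2% more Q
```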

  8. From climate-change spaghetti to climate-change distributions for 21st Century California

    USGS Publications Warehouse

    Dettinger, M.D.

    2005-01-01

    The uncertainties associated with climate-change projections for California are unlikely to disappear any time soon, and yet important long-term decisions will be needed to accommodate those potential changes. Projection uncertainties have typically been addressed by analysis of a few scenarios, chosen based on availability or to capture the extreme cases among available projections. However, by focusing on more common projections rather than the most extreme projections (using a new resampling method), new insights into current projections emerge: (1) uncertainties associated with future greenhouse-gas emissions are comparable with the differences among climate models, so that neither source of uncertainties should be neglected or underrepresented; (2) twenty-first century temperature projections spread more, overall, than do precipitation scenarios; (3) projections of extremely wet futures for California are true outliers among current projections; and (4) current projections that are warmest tend, overall, to yield a moderately drier California, while the cooler projections yield a somewhat wetter future. The resampling approach applied in this paper also provides a natural opportunity to objectively incorporate measures of model skill and the likelihoods of various emission scenarios into future assessments.

  9. Aligning ERP systems with companies' real needs: an `Operational Model Based' method

    NASA Astrophysics Data System (ADS)

    Mamoghli, Sarra; Goepp, Virginie; Botta-Genoulaz, Valérie

    2017-02-01

    Enterprise Resource Planning (ERP) systems offer standard functionalities that have to be configured and customised by a specific company depending on its own requirements. A consistent alignment is therefore an essential success factor of ERP projects. To manage this alignment, an 'Operational Model Based' method is proposed. It is based on the design and the matching of models, and conforms to the modelling views and constructs of the ISO 19439 and 19440 enterprise-modelling standards. It is characterised by: (1) a predefined design and matching order of the models; (2) the formalisation, in terms of modelling constructs, of alignment and misalignment situations; and (3) their association with a set of decisions in order to mitigate the misalignment risk. Thus, a comprehensive understanding of the alignment management during ERP projects is given. Unlike existing methods, this one includes decisions related to the organisational changes an ERP system can induce, as well as criteria on which the best decision can be based. In this way, it provides effective support and guidance to companies implementing ERP systems, as the alignment process is detailed and structured. The method is applied on the ERP project of a Small and Medium Enterprise, showing that it can be used even in contexts where the ERP project expertise level is low.

  10. A remark on copy number variation detection methods.

    PubMed

    Li, Shuo; Dou, Xialiang; Gao, Ruiqi; Ge, Xinzhou; Qian, Minping; Wan, Lin

    2018-01-01

    Copy number variations (CNVs) are gains and losses of DNA sequence in a genome. High-throughput platforms such as microarrays and next-generation sequencing (NGS) technologies have been applied to detect genome-wide copy number losses. Although progress has been made in both approaches, the accuracy and consistency of CNV calling from the two platforms remain in dispute. In this study, we perform a deep analysis of copy number losses in 254 human DNA samples, which have both SNP microarray data and NGS data publicly available from the HapMap Project and the 1000 Genomes Project, respectively. We show that the copy number losses reported by the HapMap Project and the 1000 Genomes Project have less than 30% overlap, even though both projects required cross-platform (e.g., PCR, microarray, and high-throughput sequencing) experimental support for these reports and employed state-of-the-art calling methods. On the other hand, copy number losses called directly from the HapMap microarray data by an accurate algorithm, CNVhac, almost all show lower read mapping depth in the NGS data; furthermore, 88% of them are supported by breakpoint-containing sequences in the NGS data. Our results demonstrate the ability of microarrays to call CNVs and suggest that the non-essential requirement of additional cross-platform support may introduce false negatives. The inconsistency between the CNV reports of the HapMap Project and the 1000 Genomes Project might result from the limited information contained in microarray data, inconsistent detection criteria, or the filtering effect of cross-platform support. Statistical tests on the CNVs called by CNVhac show that microarray data can offer reliable CNV reports, and the majority of CNV candidates can be confirmed by raw sequences. Therefore, CNV candidates given by a good caller can be highly reliable without cross-platform support, and additional experimental information should be applied as needed rather than required by default.
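
    Cross-platform comparisons of the kind reported above usually rely on an interval-overlap rule. The sketch below, assuming a simple reciprocal-overlap criterion and toy base-pair coordinates (the study's actual matching criteria and thresholds are not specified here), computes the fraction of one call set matched by another.

```python
def reciprocal_overlap(a, b, min_fraction=0.5):
    """True if intervals a = (start, end) and b overlap by at least `min_fraction`
    of the length of *each* interval (reciprocal-overlap rule)."""
    ov = min(a[1], b[1]) - max(a[0], b[0])
    if ov <= 0:
        return False
    return ov / (a[1] - a[0]) >= min_fraction and ov / (b[1] - b[0]) >= min_fraction

def overlap_rate(calls_a, calls_b, min_fraction=0.5):
    """Fraction of calls in calls_a matched by at least one call in calls_b."""
    matched = sum(any(reciprocal_overlap(a, b, min_fraction) for b in calls_b)
                  for a in calls_a)
    return matched / len(calls_a) if calls_a else 0.0

# Toy deletion calls on one chromosome (illustrative coordinates only)
array_calls = [(100_000, 150_000), (300_000, 340_000), (900_000, 960_000)]
ngs_calls = [(105_000, 148_000), (700_000, 720_000)]
print(overlap_rate(array_calls, ngs_calls))   # 0.333...
```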

  11. Remote Sensing Image Classification Applied to the First National Geographical Information Census of China

    NASA Astrophysics Data System (ADS)

    Yu, Xin; Wen, Zongyong; Zhu, Zhaorong; Xia, Qiang; Shun, Lan

    2016-06-01

    Image classification still has a long way to go, even though it has been studied for almost half a century. Researchers have achieved much in the image classification domain, but there is still a considerable distance between theory and practice. However, new methods from the artificial intelligence domain are being absorbed into image classification, with each field drawing on the strengths of the other to offset its weaknesses, which opens up new prospects. Networks usually play the role of a high-level language, as seen in artificial intelligence and statistics, because they are used to build complex models from simple components. In recent years, Bayesian networks, one type of probabilistic network, have become a powerful data mining technique for handling uncertainty in complex domains. In this paper, we apply Tree Augmented Naive Bayesian networks (TAN) to texture classification of high-resolution remote sensing images and put forward a new method to construct the network topology in terms of training accuracy based on the training samples. Since 2013, the Chinese government has been carrying out the first national geographical information census project, which mainly interprets geographical information from high-resolution remote sensing images. This paper therefore applies Bayesian networks to remote sensing image classification in order to improve image interpretation in the census project. In the experiment, we chose remote sensing images of Beijing. Experimental results demonstrate that TAN outperforms the Naive Bayesian Classifier (NBC) and the Maximum Likelihood Classification method (MLC) in overall classification accuracy. In addition, the proposed method can reduce the workload of field workers and improve work efficiency. Although it is time consuming, it is an attractive and effective method for assisting office-based image interpretation.
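
    The distinctive part of a TAN classifier is its tree-shaped feature-dependency structure, classically obtained as a maximum spanning tree over pairwise conditional mutual information given the class. The sketch below, assuming discretized (integer-coded) texture features, builds that tree; it illustrates the standard Chow-Liu-style construction rather than the accuracy-driven topology search proposed in the paper.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

def conditional_mutual_information(xi, xj, c):
    """I(Xi; Xj | C) estimated from discrete, integer-coded samples."""
    cmi = 0.0
    for cv in np.unique(c):
        mask = (c == cv)
        p_c = mask.mean()
        a, b = xi[mask], xj[mask]
        for av in np.unique(a):
            for bv in np.unique(b):
                p_ab = np.mean((a == av) & (b == bv))
                if p_ab > 0:
                    cmi += p_c * p_ab * np.log(p_ab / (np.mean(a == av) * np.mean(b == bv)))
    return cmi

def tan_feature_tree(X, y):
    """Feature edges of a TAN structure: maximum spanning tree over CMI weights."""
    d = X.shape[1]
    W = np.zeros((d, d))
    for i in range(d):
        for j in range(i + 1, d):
            W[i, j] = conditional_mutual_information(X[:, i], X[:, j], y)
    mst = minimum_spanning_tree(-W).toarray()     # negate weights: maximum spanning tree
    return [(i, j) for i in range(d) for j in range(d) if mst[i, j] != 0]

# Example with hypothetical discretized texture features and class labels
rng = np.random.default_rng(0)
X = rng.integers(0, 4, size=(200, 5))
y = rng.integers(0, 3, size=200)
print(tan_feature_tree(X, y))
```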

  12. NASA Langley Research and Technology-Transfer Program in Formal Methods

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Caldwell, James L.; Carreno, Victor A.; Holloway, C. Michael; Miner, Paul S.; DiVito, Ben L.

    1995-01-01

    This paper presents an overview of NASA Langley research program in formal methods. The major goals of this work are to make formal methods practical for use on life critical systems, and to orchestrate the transfer of this technology to U.S. industry through use of carefully designed demonstration projects. Several direct technology transfer efforts have been initiated that apply formal methods to critical subsystems of real aerospace computer systems. The research team consists of five NASA civil servants and contractors from Odyssey Research Associates, SRI International, and VIGYAN Inc.

  13. Uncertainty in benefit cost analysis of smart grid demonstration-projects in the U.S., China, and Italy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karali, Nihan; Flego, Gianluca; Yu, Jiancheng

    Given the substantial investments required, there has been keen interest in conducting benefits analysis, i.e., quantifying, and often monetizing, the performance of smart grid technologies. In this study, we compare two different approaches: (1) the Electric Power Research Institute (EPRI) benefits analysis method and its adaptation to the European context by the European Commission Joint Research Centre (JRC), and (2) the Analytic Hierarchy Process (AHP) and fuzzy logic decision-making method. These are applied to three demonstration projects executed in three different countries (the United States, China, and Italy), considering uncertainty in each case. This work was conducted under the U.S.-China Climate Change Working Group on smart grid, with an additional major contribution by the European Commission. The following is a brief description of the three demonstration projects.

  14. Business Architecture Development at Public Administration - Insights from Government EA Method Engineering Project in Finland

    NASA Astrophysics Data System (ADS)

    Valtonen, Katariina; Leppänen, Mauri

    Governments worldwide are concerned for efficient production of services to customers. To improve quality of services and to make service production more efficient, information and communication technology (ICT) is largely exploited in public administration (PA). Succeeding in this exploitation calls for large-scale planning which embraces issues from strategic to technological level. In this planning the notion of enterprise architecture (EA) is commonly applied. One of the sub-architectures of EA is business architecture (BA). BA planning is challenging in PA due to a large number of stakeholders, a wide set of customers, and solid and hierarchical structures of organizations. To support EA planning in Finland, a project to engineer a government EA (GEA) method was launched. In this chapter, we analyze the discussions and outputs of the project workshops and reflect emerged issues on current e-government literature. We bring forth insights into and suggestions for government BA and its development.

  15. High Accuracy Human Activity Recognition Based on Sparse Locality Preserving Projections.

    PubMed

    Zhu, Xiangbin; Qiu, Huiling

    2016-01-01

    Human activity recognition (HAR) from temporal streams of sensory data has been applied to many fields, such as healthcare services, intelligent environments, and cyber security. However, the classification accuracy of most existing methods is insufficient for some applications, especially healthcare services. To improve accuracy, it is necessary to develop a novel method that takes full account of the intrinsic sequential characteristics of time-series sensory data. Moreover, each human activity may have correlated feature relationships at different levels. Therefore, in this paper, we propose a three-stage continuous hidden Markov model (TSCHMM) approach to recognize human activities. The proposed method comprises coarse, fine, and accurate classification. Feature reduction is an important step in the classification process. In this paper, sparse locality preserving projections (SpLPP) is exploited to determine the optimal feature subsets for accurate classification of the stationary-activity data. It can extract more discriminative activity features from the sensor data than locality preserving projections. Furthermore, all of the gyro-based features are used for accurate classification of the moving data. Compared with other methods, our method uses a significantly smaller number of features, and the overall accuracy is clearly improved.

  16. High Accuracy Human Activity Recognition Based on Sparse Locality Preserving Projections

    PubMed Central

    2016-01-01

    Human activity recognition (HAR) from temporal streams of sensory data has been applied to many fields, such as healthcare services, intelligent environments, and cyber security. However, the classification accuracy of most existing methods is insufficient for some applications, especially healthcare services. To improve accuracy, it is necessary to develop a novel method that takes full account of the intrinsic sequential characteristics of time-series sensory data. Moreover, each human activity may have correlated feature relationships at different levels. Therefore, in this paper, we propose a three-stage continuous hidden Markov model (TSCHMM) approach to recognize human activities. The proposed method comprises coarse, fine, and accurate classification. Feature reduction is an important step in the classification process. In this paper, sparse locality preserving projections (SpLPP) is exploited to determine the optimal feature subsets for accurate classification of the stationary-activity data. It can extract more discriminative activity features from the sensor data than locality preserving projections. Furthermore, all of the gyro-based features are used for accurate classification of the moving data. Compared with other methods, our method uses a significantly smaller number of features, and the overall accuracy is clearly improved. PMID:27893761

  17. Development of a three-dimensional correction method for optical distortion of flow field inside a liquid droplet.

    PubMed

    Gim, Yeonghyeon; Ko, Han Seo

    2016-04-15

    In this Letter, a three-dimensional (3D) optical correction method, verified by simulation, was developed to reconstruct droplet-based flow fields. In the simulation, a synthetic phantom was reconstructed using a simultaneous multiplicative algebraic reconstruction technique with three detectors positioned around the synthetic object (represented by the phantom) at offset angles of 30° relative to each other. Additionally, a projection matrix was developed using the ray tracing method. If the phantom is in liquid, its image can be distorted because the light passes through a convex liquid-vapor interface. Because of this optical distortion, the projection matrix used to reconstruct a 3D field should be built from the revised ray instead of the original projection ray; the revised ray can be obtained from the refraction occurring at the liquid surface. As a result, the error in the reconstructed field of the phantom could be reduced using the developed optical correction method. In addition, the developed optical method was applied to a Taylor cone formed by the high voltage between the droplet and the substrate.
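
    The revision-ray computation reduces to applying Snell's law in vector form at the liquid-vapor interface. The sketch below is a generic refraction helper under assumed refractive indices for water and air; the droplet geometry, normal direction, and indices are illustrative and not taken from the Letter.

```python
import numpy as np

def refract(d, normal, n1, n2):
    """Refract unit ray direction `d` at a surface with unit `normal` oriented toward
    the incident side, for refractive indices n1 -> n2. Returns the refracted unit
    direction, or None in the case of total internal reflection."""
    d = d / np.linalg.norm(d)
    normal = normal / np.linalg.norm(normal)
    eta = n1 / n2
    cos_i = -np.dot(normal, d)
    sin2_t = eta**2 * (1.0 - cos_i**2)
    if sin2_t > 1.0:
        return None                      # total internal reflection
    cos_t = np.sqrt(1.0 - sin2_t)
    return eta * d + (eta * cos_i - cos_t) * normal

# Example: a ray leaving a water droplet (n ~ 1.33) into air (n = 1.0) through a point
# on a spherical surface; the local normal is the radial direction, oriented inward
# (toward the incident, liquid side).
d = np.array([0.0, 0.2, 1.0])
surface_point = np.array([0.0, 0.5, 0.8])
normal = -surface_point / np.linalg.norm(surface_point)
print(refract(d, normal, 1.33, 1.0))     # revised (refraction-corrected) ray direction
```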

  18. A contrast between DEMATEL-ANP and ANP methods for six sigma project selection: a case study in healthcare industry.

    PubMed

    Ortíz, Miguel A; Felizzola, Heriberto A; Nieto Isaza, Santiago

    2015-01-01

    The project selection process is a crucial step for healthcare organizations when implementing six sigma programs in both administrative and care processes. However, six sigma project selection is often a decision-making process with interaction and feedback between criteria, so it is necessary to explore different methods to help healthcare companies determine the six sigma projects that provide the maximum benefits. This paper describes the application of both ANP (Analytic Network Process) and DEMATEL (Decision Making Trial and Evaluation Laboratory)-ANP in a public medical centre to establish the most suitable six sigma project, and the two methods were then compared to evaluate their performance in the decision-making process. ANP and DEMATEL-ANP were used to evaluate 6 six sigma project alternatives under an evaluation model composed of 3 strategies, 4 criteria, and 15 sub-criteria. Judgement matrixes were completed by the six sigma team, whose participants worked in different departments of the medical centre. Improving the care opportunity for obstetric outpatients was selected as the most suitable six sigma project, with a score of 0.117 as its contribution to the organization's goals. DEMATEL-ANP performed better in the decision-making process since it reduced the error probability due to interactions and feedback. ANP and DEMATEL-ANP effectively supported the six sigma project selection process, helping to create a complete framework that guarantees the prioritization of projects that provide maximum benefits to healthcare organizations. As DEMATEL-ANP performed better, it should be used by practitioners involved in decisions related to the implementation of six sigma programs in the healthcare sector, accompanied by adequate identification of the evaluation criteria that support the decision-making model. Thus, this comparative study contributes to choosing more effective approaches in this field. Suggestions for further work are also proposed so that these methods can be applied more adequately to six sigma project selection in healthcare.
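
    At the core of both ANP and DEMATEL-ANP is the derivation of priority weights from pairwise comparison matrices. The sketch below shows the standard eigenvector-based priority and consistency-ratio computation for a single comparison matrix; the judgement values are illustrative, and the DEMATEL influence weighting and the ANP supermatrix limit are not shown.

```python
import numpy as np

# Saaty's random consistency index, indexed by matrix size
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41}

def ahp_priorities(pairwise):
    """Priority weights and consistency ratio from a pairwise comparison matrix."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    vals, vecs = np.linalg.eig(A)
    k = int(np.argmax(vals.real))
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                                    # normalized priority vector
    ci = (vals[k].real - n) / (n - 1)               # consistency index
    cr = ci / RI[n] if RI.get(n, 0.0) else 0.0      # consistency ratio (< 0.1 desired)
    return w, cr

# Illustrative 3x3 comparison of criteria (not the medical centre's judgements)
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
weights, cr = ahp_priorities(A)
print(weights, cr)
```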

  19. Geometric artifacts reduction for cone-beam CT via L0-norm minimization without dedicated phantoms.

    PubMed

    Gong, Changcheng; Cai, Yufang; Zeng, Li

    2018-01-01

    For cone-beam computed tomography (CBCT), transversal shifts of the rotation center are inevitable and result in geometric artifacts in the CT images. In this work, we propose a novel geometric calibration method for CBCT, which can also be used in micro-CT. The symmetry property of the sinogram is used for the first calibration, and the L0-norm of the gradient image of the reconstructed image is then used as the cost function to be minimized for the second calibration. An iterative search method is adopted to pursue the local minimum of the L0-norm minimization problem. The transversal shift value is updated with a fixed step size within a search range determined by the first calibration. In addition, a graphics processing unit (GPU)-based FDK algorithm and acceleration techniques are designed to accelerate the calibration process of the presented method. In simulation experiments, the mean absolute difference (MAD) and the standard deviation (SD) of the transversal shift value were less than 0.2 pixels between the noise-free and noisy projection images, indicating highly accurate calibration with the new method. In real data experiments, the smaller entropies of the corrected images also indicated that higher-resolution images were acquired using the corrected projection data and that textures were well preserved. The study results also support the feasibility of applying the proposed method to other imaging modalities.
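
    The second calibration step can be sketched as a search over candidate shifts that scores each candidate by an approximate L0-norm of the reconstructed image's gradient. The snippet below uses a simple grid search and a gradient-magnitude count as the L0 surrogate; `reconstruct(projections, shift)` is a placeholder for an FDK/FBP routine that applies the candidate detector shift before backprojection, so this is an illustration of the idea rather than the paper's GPU-accelerated iterative scheme.

```python
import numpy as np

def gradient_l0(image, eps=1e-3):
    """Approximate L0-norm of the gradient image: pixels whose gradient magnitude
    exceeds a small threshold."""
    gy, gx = np.gradient(image)
    return int(np.count_nonzero(np.hypot(gx, gy) > eps))

def calibrate_transversal_shift(projections, reconstruct, search_range, step=0.5):
    """Pick the shift (in detector pixels) minimizing the gradient L0 of the
    reconstruction; `reconstruct` is an assumed user-supplied routine."""
    best_shift, best_cost = None, np.inf
    for shift in np.arange(search_range[0], search_range[1] + step, step):
        cost = gradient_l0(reconstruct(projections, shift))
        if cost < best_cost:
            best_shift, best_cost = shift, cost
    return best_shift
```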

  20. Systematic approach to cutoff frequency selection in continuous-wave electron paramagnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    Hirata, Hiroshi; Itoh, Toshiharu; Hosokawa, Kouichi; Deng, Yuanmu; Susaki, Hitoshi

    2005-08-01

    This article describes a systematic method for determining the cutoff frequency of the low-pass window function that is used for deconvolution in two-dimensional continuous-wave electron paramagnetic resonance (EPR) imaging. An evaluation function for the criterion used to select the cutoff frequency is proposed, and is the product of the effective width of the point spread function for a localized point signal and the noise amplitude of a resultant EPR image. The present method was applied to EPR imaging for a phantom, and the result of cutoff frequency selection was compared with that based on a previously reported method for the same projection data set. The evaluation function has a global minimum point that gives the appropriate cutoff frequency. Images with reasonably good resolution and noise suppression can be obtained from projections with an automatically selected cutoff frequency based on the present method.
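
    The selection criterion described above, the product of the effective point-spread-function width and the image noise amplitude, can be scanned over candidate cutoff frequencies and minimized. In the sketch below, `psf_width(fc)` and `noise_amplitude(fc)` are assumed callables that deconvolve the projection data with cutoff `fc` and measure the two quantities; they are placeholders, since their implementation depends on the imaging system. The toy example only illustrates that a width term falling with the cutoff and a noise term rising with it produce an interior minimum.

```python
import numpy as np

def select_cutoff(cutoffs, psf_width, noise_amplitude):
    """Return the cutoff frequency at the global minimum of width * noise."""
    scores = np.array([psf_width(fc) * noise_amplitude(fc) for fc in cutoffs])
    return cutoffs[int(np.argmin(scores))]

# Toy stand-ins: resolution improves (width shrinks) and noise grows with the cutoff
cutoffs = np.linspace(0.05, 0.5, 46)
best = select_cutoff(cutoffs, lambda f: 1.0 / f, lambda f: np.exp(6.0 * f))
print(best)   # interior minimum near 1/6 for these stand-in functions
```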

  1. OpenMDAO: Framework for Flexible Multidisciplinary Design, Analysis and Optimization Methods

    NASA Technical Reports Server (NTRS)

    Heath, Christopher M.; Gray, Justin S.

    2012-01-01

    The OpenMDAO project is underway at NASA to develop a framework which simplifies the implementation of state-of-the-art tools and methods for multidisciplinary design, analysis and optimization. Foremost, OpenMDAO has been designed to handle variable problem formulations, encourage reconfigurability, and promote model reuse. This work demonstrates the concept of iteration hierarchies in OpenMDAO to achieve a flexible environment for supporting advanced optimization methods which include adaptive sampling and surrogate modeling techniques. In this effort, two efficient global optimization methods were applied to solve a constrained, single-objective and constrained, multiobjective version of a joint aircraft/engine sizing problem. The aircraft model, NASA's next-generation advanced single-aisle civil transport, is being studied as part of the Subsonic Fixed Wing project to help meet simultaneous program goals for reduced fuel burn, emissions, and noise. This analysis serves as a realistic test problem to demonstrate the flexibility and reconfigurability offered by OpenMDAO.

  2. Reconstruction of gas distribution pipelines in MOZG in Poland using PE and PA pipes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borowicz, W.; Podziemski, T.; Kramek, E.

    1996-12-31

    MOZG--Warsaw Regional Gas Distribution Company was established in 1856 and is now one of six gas distribution companies in Poland. Due to steadily increasing safety demands, some of the pipelines will need reconstruction; the majority of the substandard piping is located in urban areas. The company wanted to gain experience in applying reconstruction technologies using two different plastic materials, polyethylene and polyamide, and to assess the technical and economic practicalities of performing relining processes. A PE project, a large-diameter polyethylene relining (450 mm) conducted in Warsaw in 1994/95, and PA projects, relinings using polyamide pipes conducted in Radom and in Warsaw during 1993 and 1994, are the most interesting and representative examples of this kind of work. Thanks to the experience obtained while carrying out these projects, reconstruction of old gas pipelines has become routine. The company now often uses polyethylene relining of smaller diameters and continues both construction and reconstruction of the gas network using PA pipes. This paper presents the accumulated knowledge, showing the advantages and disadvantages of the applied methods. It describes project design and implementation in detail and reports on the necessary preparation work, on-site job organization, and the most common problems arising during construction works.

  3. Automatic cable artifact removal for cardiac C-arm CT imaging

    NASA Astrophysics Data System (ADS)

    Haase, C.; Schäfer, D.; Kim, M.; Chen, S. J.; Carroll, J.; Eshuis, P.; Dössel, O.; Grass, M.

    2014-03-01

    Cardiac C-arm computed tomography (CT) imaging using interventional C-arm systems can be applied in various areas of interventional cardiology, ranging from structural heart disease and electrophysiology interventions to valve procedures in hybrid operating rooms. In contrast to conventional CT systems, the reconstruction field of view (FOV) of C-arm systems is limited to a region of interest in the cone-beam (along the patient axis) and fan-beam (transaxial plane) directions. Hence, highly X-ray opaque objects (e.g. cables from the interventional setup) outside the reconstruction field of view yield streak artifacts in the reconstruction volume. To decrease the impact of these streaks, a cable tracking approach is applied to the 2D projection sequences, with subsequent interpolation. The proposed approach uses the fact that the projected position of objects outside the reconstruction volume depends strongly on the projection perspective. By tracking candidate points over multiple projections, only objects outside the reconstruction volume are segmented in the projections. The method is quantitatively evaluated based on 30 simulated CT data sets. The 3D root mean square deviation from a reference image could be reduced in all cases by an average of 50% (min 16%, max 76%). Image quality improvement is shown for clinical whole heart data sets acquired on an interventional C-arm system.

  4. Topics in computational physics

    NASA Astrophysics Data System (ADS)

    Monville, Maura Edelweiss

    Computational Physics spans a broad range of applied fields extending beyond the border of traditional physics tracks. Demonstrated flexibility and the capability to switch to a new project and pick up the basics of the new field quickly are among the essential requirements for a computational physicist. In line with the above-mentioned prerequisites, my thesis describes the development and results of two computational projects belonging to two different applied science areas. The first project is a Materials Science application. It is a prescription for an innovative nano-fabrication technique that is built out of two other known techniques. The preliminary results of the simulation of this novel nano-patterning fabrication method show an average improvement, roughly equal to 18%, with respect to the single techniques it draws on. The second project is a Homeland Security application aimed at preventing smuggling of nuclear material at ports of entry. It is concerned with a simulation of an active material interrogation system based on the analysis of induced photo-nuclear reactions. This project consists of a preliminary evaluation of the photo-fission implementation in the more robust radiation transport Monte Carlo codes, followed by the customization and extension of MCNPX, a Monte Carlo code developed at Los Alamos National Laboratory, and MCNP-PoliMi. The final stage of the project consists of testing the interrogation system against some real world scenarios, for the purpose of determining the system's reliability, material discrimination power, and limitations.

  5. Applying the algorithm "assessing quality using image registration circuits" (AQUIRC) to multi-atlas segmentation

    NASA Astrophysics Data System (ADS)

    Datteri, Ryan; Asman, Andrew J.; Landman, Bennett A.; Dawant, Benoit M.

    2014-03-01

    Multi-atlas registration-based segmentation is a popular technique in the medical imaging community, used to transform anatomical and functional information from a set of atlases onto a new patient that lacks this information. The accuracy of the projected information on the target image is dependent on the quality of the registrations between the atlas images and the target image. Recently, we have developed a technique called AQUIRC that aims at estimating the error of a non-rigid registration at the local level and was shown to correlate with error in a simulated case. Herein, we extend this work by applying AQUIRC to atlas selection at the local level across multiple structures in cases in which non-rigid registration is difficult. AQUIRC is applied to 6 structures: the brainstem, optic chiasm, left and right optic nerves, and the left and right eyes. We compare the results of AQUIRC to those of popular techniques, including Majority Vote, STAPLE, Non-Local STAPLE, and Locally-Weighted Vote. We show that AQUIRC can be used as a method to combine multiple segmentations and increase the accuracy of the projected information on a target image, and is comparable to cutting-edge methods in the multi-atlas segmentation field.
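
    The simplest of the fusion baselines named above, Majority Vote, can be sketched in a few lines. The Python snippet below is an illustrative sketch only (the tiny label arrays and the function name are invented here); it is not the AQUIRC method itself.

    import numpy as np

    def majority_vote(atlas_labels):
        """atlas_labels: array-like of shape (n_atlases, *volume_shape) with integer labels."""
        stacked = np.asarray(atlas_labels)
        n_labels = int(stacked.max()) + 1
        # Count the votes for every label at each voxel, then keep the most frequent label.
        counts = np.stack([(stacked == lab).sum(axis=0) for lab in range(n_labels)])
        return counts.argmax(axis=0)

    if __name__ == "__main__":
        a1 = np.array([[0, 1], [2, 2]])
        a2 = np.array([[0, 1], [1, 2]])
        a3 = np.array([[0, 0], [2, 2]])
        print(majority_vote([a1, a2, a3]))   # -> [[0 1] [2 2]]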

  6. Technical Note: Synchrotron-based high-energy x-ray phase sensitive microtomography for biomedical research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Huiqiang; Wu, Xizeng, E-mail: xwu@uabmc.edu; Xiao, Tiqiao, E-mail: tqxiao@sinap.ac.cn

    Purpose: Propagation-based phase-contrast CT (PPCT) utilizes highly sensitive phase-contrast technology applied to x-ray microtomography. Performing phase retrieval on the acquired angular projections can enhance image contrast and enable quantitative imaging. In this work, the authors demonstrate the validity and advantages of a novel technique for high-resolution PPCT by using the generalized phase-attenuation duality (PAD) method of phase retrieval. Methods: A high-resolution angular projection data set of a fish head specimen was acquired with a monochromatic 60-keV x-ray beam. In one approach, the projection data were directly used for tomographic reconstruction. In two other approaches, the projection data were preprocessed by phase retrieval based on either the linearized PAD method or the generalized PAD method. The reconstructed images from all three approaches were then compared in terms of tissue contrast-to-noise ratio and spatial resolution. Results: The authors’ experimental results demonstrated the validity of the PPCT technique based on the generalized PAD-based method. In addition, the results show that the authors’ technique is superior to the direct PPCT technique as well as the linearized PAD-based PPCT technique in terms of their relative capabilities for tissue discrimination and characterization. Conclusions: This novel PPCT technique demonstrates great potential for biomedical imaging, especially for applications that require high spatial resolution and limited radiation exposure.

  7. Applying Sociology to the Teaching of Applied Sociology.

    ERIC Educational Resources Information Center

    Wallace, Richard Cheever

    A college-level applied sociology course in which students use sociological theory or research methodology to solve social problems is described. Guidelines for determining appropriate projects are: (1) the student must feel there is a substantial need for the project; (2) the project must be approachable through recognized sociological…

  8. 23 CFR 636.116 - What organizational conflict of interest requirements apply to design-build projects?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... apply to design-build projects? 636.116 Section 636.116 Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION ENGINEERING AND TRAFFIC OPERATIONS DESIGN-BUILD CONTRACTING General § 636.116 What organizational conflict of interest requirements apply to design-build projects? (a) State...

  9. 23 CFR 636.116 - What organizational conflict of interest requirements apply to design-build projects?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... apply to design-build projects? 636.116 Section 636.116 Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION ENGINEERING AND TRAFFIC OPERATIONS DESIGN-BUILD CONTRACTING General § 636.116 What organizational conflict of interest requirements apply to design-build projects? (a) State...

  10. 23 CFR 636.116 - What organizational conflict of interest requirements apply to design-build projects?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... apply to design-build projects? 636.116 Section 636.116 Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION ENGINEERING AND TRAFFIC OPERATIONS DESIGN-BUILD CONTRACTING General § 636.116 What organizational conflict of interest requirements apply to design-build projects? (a) State...

  11. 23 CFR 636.116 - What organizational conflict of interest requirements apply to design-build projects?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... apply to design-build projects? 636.116 Section 636.116 Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION ENGINEERING AND TRAFFIC OPERATIONS DESIGN-BUILD CONTRACTING General § 636.116 What organizational conflict of interest requirements apply to design-build projects? (a) State...

  12. Statistical bias correction method applied on CMIP5 datasets over the Indian region during the summer monsoon season for climate change applications

    NASA Astrophysics Data System (ADS)

    Prasanna, V.

    2018-01-01

    This study makes use of temperature and precipitation from CMIP5 climate model output for climate change application studies over the Indian region during the summer monsoon season (JJAS). Bias correction of temperature and precipitation from CMIP5 GCM simulation results with respect to observation is discussed in detail. The non-linear statistical bias correction is a suitable bias correction method for climate change data because it is simple and does not add artificial uncertainties to the impact assessment of climate change scenarios for climate change application studies (agricultural production changes) in the future. The simple statistical bias correction uses observational constraints on the GCM baseline, and the projected results are scaled with respect to the changing magnitude in future scenarios, varying from one model to the other. Two types of bias correction techniques are shown here: (1) a simple bias correction using a percentile-based quantile-mapping algorithm and (2) a simple but improved bias correction method, a cumulative distribution function (CDF; Weibull distribution function)-based quantile-mapping algorithm. This study shows that the percentile-based quantile mapping method gives results similar to the CDF (Weibull)-based quantile mapping method, and both methods are comparable. The bias correction is applied to temperature and precipitation variables for present climate and future projected data to make use of it in a simple statistical model to understand the future changes in crop production over the Indian region during the summer monsoon season. In total, 12 CMIP5 models are used for Historical (1901-2005), RCP4.5 (2005-2100), and RCP8.5 (2005-2100) scenarios. The climate index from each CMIP5 model and the observed agricultural yield index over the Indian region are used in a regression model to project the changes in the agricultural yield over India from RCP4.5 and RCP8.5 scenarios. The results revealed a better convergence of model projections in the bias-corrected data compared to the uncorrected data. The study can be extended to localized regional domains aimed at understanding the changes in agricultural productivity in the future with an agro-economic or a simple statistical model. The statistical model indicated that the total food grain yield is going to increase over the Indian region in the future: the increase is approximately 50 kg/ha for the RCP4.5 scenario from 2001 until the end of 2100, and approximately 90 kg/ha for the RCP8.5 scenario over the same period. There are many studies using bias correction techniques, but this study applies the bias correction technique to future climate scenario data from CMIP5 models and then applies it to crop statistics to find future crop yield changes over the Indian region.
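
    A hedged sketch of the percentile-based quantile-mapping variant described above is given below in Python. The gamma-distributed series, sample sizes, and function name are illustrative assumptions, not the CMIP5 data or the exact algorithm used in the study.

    import numpy as np

    def quantile_map(model_hist, obs, model_future, n_quantiles=99):
        """Map model values onto the observed distribution via empirical quantiles."""
        probs = np.linspace(0.01, 0.99, n_quantiles)
        q_model = np.quantile(model_hist, probs)   # quantiles of the model baseline
        q_obs = np.quantile(obs, probs)            # matching quantiles of the observations
        # Each future model value is located on the baseline quantile curve and
        # replaced by the observed value at that quantile, which removes the
        # baseline bias while keeping the projected change signal.
        return np.interp(model_future, q_model, q_obs)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        obs = rng.gamma(2.0, 4.0, 5000)            # "observed" monsoon precipitation
        model_hist = rng.gamma(2.0, 5.0, 5000)     # biased GCM baseline
        model_future = rng.gamma(2.2, 5.0, 5000)   # projected scenario
        corrected = quantile_map(model_hist, obs, model_future)
        print(obs.mean(), model_future.mean(), corrected.mean())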

  13. On the inversion of geodetic integrals defined over the sphere using 1-D FFT

    NASA Astrophysics Data System (ADS)

    García, R. V.; Alejo, C. A.

    2005-08-01

    An iterative method is presented which performs inversion of integrals defined over the sphere. The method is based on one-dimensional fast Fourier transform (1-D FFT) inversion and is implemented with the projected Landweber technique, which is used to solve constrained least-squares problems reducing the associated 1-D cyclic-convolution error. The results obtained are as precise as the direct matrix inversion approach, but with better computational efficiency. A case study uses the inversion of Hotine’s integral to obtain gravity disturbances from geoid undulations. Numerical convergence is also analyzed and comparisons with respect to the direct matrix inversion method using conjugate gradient (CG) iteration are presented. Like the CG method, the number of iterations needed to get the optimum (i.e., small) error decreases as the measurement noise increases. Nevertheless, for discrete data given over a whole parallel band, the method can be applied directly without implementing the projected Landweber method, since no cyclic convolution error exists.
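
    A minimal Python sketch of a projected Landweber iteration for a 1-D cyclic-convolution system, of the kind described above, follows. The Gaussian kernel, the nonnegativity constraint, and the step-size choice are illustrative assumptions, not the geodetic operators of the paper.

    import numpy as np

    def projected_landweber(kernel, data, n_iter=200):
        """Solve data = kernel (*) x with a nonnegativity constraint, via 1-D FFT."""
        K = np.fft.fft(kernel)
        relax = 1.0 / np.max(np.abs(K)) ** 2           # step size below 2 / ||A||^2
        x = np.zeros_like(data)
        for _ in range(n_iter):
            residual = data - np.real(np.fft.ifft(K * np.fft.fft(x)))                    # d - A x
            x = x + relax * np.real(np.fft.ifft(np.conj(K) * np.fft.fft(residual)))      # x + relax * A^T r
            x = np.maximum(x, 0.0)                     # projection onto the constraint set
        return x

    if __name__ == "__main__":
        n = 256
        kernel = np.exp(-0.5 * (np.arange(n) - n // 2) ** 2 / 4.0)
        kernel = np.roll(kernel / kernel.sum(), -n // 2)   # center kernel at index 0 for cyclic convolution
        truth = np.zeros(n)
        truth[60:70] = 1.0
        data = np.real(np.fft.ifft(np.fft.fft(kernel) * np.fft.fft(truth)))
        data += 1e-3 * np.random.default_rng(1).standard_normal(n)
        print(np.abs(projected_landweber(kernel, data) - truth).max())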

  14. Monitoring hydrofrac-induced seismicity by surface arrays - the DHM-Project Basel case study

    NASA Astrophysics Data System (ADS)

    Blascheck, P.; Häge, M.; Joswig, M.

    2012-04-01

    The method "nanoseismic monitoring" was applied during the hydraulic stimulation at the Deep-Heat-Mining-Project (DHM-Project) Basel. Two small arrays in a distance of 2.1 km and 4.8 km to the borehole recorded continuously for two days. During this time more than 2500 seismic events were detected. The method of the surface monitoring of induced seismicity was compared to the reference which the hydrofrac monitoring presented. The latter was conducted by a network of borehole seismometers by Geothermal Explorers Limited. Array processing provides a outlier resistant, graphical jack-knifing localization method which resulted in a average deviation towards the reference of 850 m. Additionally, by applying the relative localization master-event method, the NNW-SSE strike direction of the reference was confirmed. It was shown that, in order to successfully estimate the magnitude of completeness as well as the b-value at the event rate and detection sensibility present, 3 h segments of data are sufficient. This is supported by two segment out of over 13 h of evaluated data. These segments were chosen so that they represent a time during the high seismic noise during normal working hours in daytime as well as the minimum anthropogenic noise at night. The low signal-to-noise ratio was compensated by the application of a sonogram event detection as well as a coincidence analysis within each array. Sonograms allow by autoadaptive, non-linear filtering to enhance signals whose amplitudes are just above noise level. For these events the magnitude was determined by the master-event method, allowing to compute the magnitude of completeness by the entire-magnitude-range method provided by the ZMAP toolbox. Additionally, the b-values were determined and compared to the reference values. An introduction to the method of "nanoseismic monitoring" will be given as well as the comparison to reference data in the Basel case study.

  15. Climate Projection Data base for Roads - CliPDaR: Design a guideline for a transnational database of downscaled climate projection data for road impact models - within the Conference's of European Directors of Roads (CEDR) TRANSNATIONAL ROAD RESEARCH PROG

    NASA Astrophysics Data System (ADS)

    Matulla, Christoph; Namyslo, Joachim; Fuchs, Tobias; Türk, Konrad

    2013-04-01

    The European road sector is vulnerable to extreme weather phenomena, which can cause large socio-economic losses. Almost every year several weather-triggered events occur (such as heavy precipitation, floods, landslides, high winds, snow and ice, and heat or cold waves) that disrupt transportation, knock out power lines, cut off populated regions from the outside, and so on. In order to avoid imbalances in the supply of vital goods to people, as well as to prevent negative impacts on the health and life of people travelling by car, it is essential to know present and future threats to roads. Climate change might increase future threats to roads. CliPDaR focuses on parts of the European road network and contributes, based on the current body of knowledge, to the establishment of guidelines helping to decide which methods and scenarios to apply for the estimation of future climate-change-related challenges in the field of road maintenance. Based on regional-scale climate change projections, specific road-impact models are applied in order to support protection measures. In recent years, it has been recognised that it is essential to assess the uncertainty and reliability of given climate projections by using ensemble approaches and downscaling methods. A large amount of scientific work has been done to evaluate these approaches with regard to reliability and usefulness for investigations of the possible impacts of climate change. CliPDaR is going to collect the existing approaches and methodologies in European countries, discuss their differences and - in close cooperation with the road owners - develop a common line on future applications of climate projection data to road impact models. As such, the project will focus on reviewing and assessing existing regional climate change projections regarding transnational highway transport needs. The final project report will include recommendations on how the findings of CliPDaR may support the decision processes of European national road administrations regarding possible future climate change impacts. First project results are presented at the conference.

  16. A NEW METHOD TO QUANTIFY AND REDUCE THE NET PROJECTION ERROR IN WHOLE-SOLAR-ACTIVE-REGION PARAMETERS MEASURED FROM VECTOR MAGNETOGRAMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Falconer, David A.; Tiwari, Sanjiv K.; Moore, Ronald L.

    Projection errors limit the use of vector magnetograms of active regions (ARs) far from the disk center. In this Letter, for ARs observed up to 60° from the disk center, we demonstrate a method for measuring and reducing the projection error in the magnitude of any whole-AR parameter that is derived from a vector magnetogram that has been deprojected to the disk center. The method assumes that the center-to-limb curve of the average of the parameter’s absolute values, measured from the disk passage of a large number of ARs and normalized to each AR’s absolute value of the parameter at central meridian, gives the average fractional projection error at each radial distance from the disk center. To demonstrate the method, we use a large set of large-flux ARs and apply the method to a whole-AR parameter that is among the simplest to measure: whole-AR magnetic flux. We measure 30,845 SDO/Helioseismic and Magnetic Imager vector magnetograms covering the disk passage of 272 large-flux ARs, each having whole-AR flux >10^22 Mx. We obtain the center-to-limb radial-distance run of the average projection error in measured whole-AR flux from a Chebyshev fit to the radial-distance plot of the 30,845 normalized measured values. The average projection error in the measured whole-AR flux of an AR at a given radial distance is removed by multiplying the measured flux by the correction factor given by the fit. The correction is important both for the study of the evolution of ARs and for improving the accuracy of forecasts of an AR’s major flare/coronal mass ejection productivity.
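
    The correction procedure can be sketched as a Chebyshev fit to whole-AR fluxes normalized by each AR's central-meridian value, followed by division by the fitted factor (equivalent to multiplying by the correction factor). The Python sketch below uses synthetic measurements; the quadratic fall-off, noise level, and sample size are assumptions for illustration only.

    import numpy as np
    from numpy.polynomial import chebyshev as C

    rng = np.random.default_rng(0)

    # Simulated disk passages: radial distance r (0 = disk center, 1 = limb) and the
    # measured whole-AR flux normalized to the value at central meridian.
    r = rng.uniform(0.0, 0.87, 30000)                 # out to roughly 60 deg from disk center
    norm_flux = (1.0 - 0.35 * r**2) * (1.0 + 0.05 * rng.standard_normal(r.size))

    coeffs = C.chebfit(r, norm_flux, deg=4)           # center-to-limb curve of the average error

    def corrected_flux(measured_flux, radial_distance):
        """Remove the average projection error at this radial distance."""
        return measured_flux / C.chebval(radial_distance, coeffs)

    print(corrected_flux(8.0e21, 0.6))                # e.g. a flux in Mx measured at r = 0.6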

  17. Image-Based 2D Re-Projection for Attenuation Substitution in PET Neuroimaging.

    PubMed

    Laymon, Charles M; Minhas, Davneet S; Becker, Carl R; Matan, Cristy; Oborski, Matthew J; Price, Julie C; Mountz, James M

    2018-02-27

    In dual modality positron emission tomography (PET)/magnetic resonance imaging (MRI), attenuation correction (AC) methods are continually improving. Although a new AC can sometimes be generated from existing MR data, its application requires a new reconstruction. We evaluate an approximate 2D projection method that allows offline image-based reprocessing. 2-Deoxy-2-[18F]fluoro-D-glucose ([18F]FDG) brain scans were acquired (Siemens HR+) for six subjects. Attenuation data were obtained using the scanner's transmission source (SAC). Additional scanning was performed on a Siemens mMR including production of a Dixon-based MR AC (MRAC). The MRAC was imported to the HR+ and the PET data were reconstructed twice: once using native SAC (ground truth); once using the imported MRAC (imperfect AC). The re-projection method was implemented as follows. The MRAC PET was forward projected to approximately reproduce attenuation-corrected sinograms. The SAC and MRAC images were forward projected and converted to attenuation-correction factors (ACFs). The MRAC ACFs were removed from the MRAC PET sinograms by division; the SAC ACFs were applied by multiplication. The regenerated sinograms were reconstructed by filtered back projection to produce images (SUBAC PET) in which SAC has been substituted for MRAC. Ideally SUBAC PET should match SAC PET. Via coregistered T1 images, FreeSurfer (FS; MGH, Boston) was used to define a set of cortical gray matter regions of interest. Regional activity concentrations were extracted for SAC PET, MRAC PET, and SUBAC PET. SUBAC PET showed substantially smaller root mean square error than MRAC PET, with averaged values of 1.5% versus 8.1%. Re-projection is a viable image-based method for the application of an alternate attenuation correction in neuroimaging.
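
    In two dimensions the substitution step can be sketched with a generic projector and filtered back projection. The snippet below is a hedged illustration that uses skimage's radon/iradon as stand-ins for the scanner's forward projector and reconstruction; the disk phantom and the attenuation values are invented for the example.

    import numpy as np
    from skimage.transform import radon, iradon

    angles = np.linspace(0.0, 180.0, 180, endpoint=False)

    def acf(mu_map):
        """Attenuation-correction factors exp(+line integral of mu) per sinogram bin."""
        return np.exp(radon(mu_map, theta=angles))

    n = 128
    yy, xx = np.mgrid[:n, :n]
    disk = ((xx - n / 2) ** 2 + (yy - n / 2) ** 2) < (n / 3) ** 2
    mrac_pet = disk.astype(float)              # PET image as reconstructed with the imperfect MRAC
    mu_sac = disk * 0.0096                     # "true" transmission-based attenuation map (per pixel)
    mu_mrac = disk * 0.0080                    # imperfect MR-derived attenuation map

    sino = radon(mrac_pet, theta=angles)       # forward project the MRAC PET image
    sino_sub = sino / acf(mu_mrac) * acf(mu_sac)   # divide out MRAC ACFs, multiply in SAC ACFs
    subac_pet = iradon(sino_sub, theta=angles)     # filtered back projection
    print(subac_pet.max())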

  18. Towards Real Time Diagnostics of Hybrid Welding Laser/GMAW

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Timothy Mcjunkin; Dennis C. Kunerth; Corrie Nichol

    2013-07-01

    Methods are currently being developed toward a more robust system for real-time feedback in the high-throughput process combining laser welding with gas metal arc welding. A combination of ultrasonic, eddy current, electronic monitoring, and visual techniques is being applied to the welding process. Initial simulation and bench-top evaluation of proposed real-time techniques on weld samples are presented along with the concepts to apply the techniques concurrently to the weld process. Consideration of the eventual code acceptance of the methods and system is also being researched as a component of this project. The goal is to detect defects or precursors to defects and correct them when possible during the weld process.

  19. Towards real time diagnostics of Hybrid Welding Laser/GMAW

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McJunkin, T. R.; Kunerth, D. C.; Nichol, C. I.

    2014-02-18

    Methods are currently being developed toward a more robust system for real-time feedback in the high-throughput process combining laser welding with gas metal arc welding. A combination of ultrasonic, eddy current, electronic monitoring, and visual techniques is being applied to the welding process. Initial simulation and bench-top evaluation of proposed real-time techniques on weld samples are presented along with the concepts to apply the techniques concurrently to the weld process. Consideration of the eventual code acceptance of the methods and system is also being researched as a component of this project. The goal is to detect defects or precursors to defects and correct them when possible during the weld process.

  20. Towards real time diagnostics of Hybrid Welding Laser/GMAW

    NASA Astrophysics Data System (ADS)

    McJunkin, T. R.; Kunerth, D. C.; Nichol, C. I.; Todorov, E.; Levesque, S.

    2014-02-01

    Methods are currently being developed toward a more robust system for real-time feedback in the high-throughput process combining laser welding with gas metal arc welding. A combination of ultrasonic, eddy current, electronic monitoring, and visual techniques is being applied to the welding process. Initial simulation and bench-top evaluation of proposed real-time techniques on weld samples are presented along with the concepts to apply the techniques concurrently to the weld process. Consideration of the eventual code acceptance of the methods and system is also being researched as a component of this project. The goal is to detect defects or precursors to defects and correct them when possible during the weld process.

  1. Advancing complementary and alternative medicine through social network analysis and agent-based modeling.

    PubMed

    Frantz, Terrill L

    2012-01-01

    This paper introduces the contemporary perspectives and techniques of social network analysis (SNA) and agent-based modeling (ABM) and advocates applying them to advance various aspects of complementary and alternative medicine (CAM). SNA and ABM are invaluable methods for representing, analyzing and projecting complex, relational, social phenomena; they provide both an insightful vantage point and a set of analytic tools that can be useful in a wide range of contexts. Applying these methods in the CAM context can aid the ongoing advances in the CAM field, in both its scientific aspects and in developing broader acceptance in associated stakeholder communities. Copyright © 2012 S. Karger AG, Basel.

  2. Visualizing phylogenetic tree landscapes.

    PubMed

    Wilgenbusch, James C; Huang, Wen; Gallivan, Kyle A

    2017-02-02

    Genomic-scale sequence alignments are increasingly used to infer phylogenies in order to better understand the processes and patterns of evolution. Different partitions within these new alignments (e.g., genes, codon positions, and structural features) often favor hundreds if not thousands of competing phylogenies. Summarizing and comparing phylogenies obtained from multi-source data sets using current consensus tree methods discards valuable information and can disguise potential methodological problems. Discovery of efficient and accurate dimensionality reduction methods used to display at once in 2 or 3 dimensions the relationship among these competing phylogenies will help practitioners diagnose the limits of current evolutionary models and potential problems with phylogenetic reconstruction methods when analyzing large multi-source data sets. We introduce several dimensionality reduction methods to visualize in 2 and 3 dimensions the relationship among competing phylogenies obtained from gene partitions found in three mid- to large-size mitochondrial genome alignments. We test the performance of these dimensionality reduction methods by applying several goodness-of-fit measures. The intrinsic dimensionality of each data set is also estimated to determine whether projections in 2 and 3 dimensions can be expected to reveal meaningful relationships among trees from different data partitions. Several new approaches to aid in the comparison of different phylogenetic landscapes are presented. Curvilinear Components Analysis (CCA) and a stochastic gradient descent (SGD) optimization method give the best representation of the original tree-to-tree distance matrix for each of the three mitochondrial genome alignments and greatly outperformed the method currently used to visualize tree landscapes. The CCA + SGD method converged at least as fast as previously applied methods for visualizing tree landscapes. We demonstrate for all three mtDNA alignments that 3D projections significantly increase the fit between the tree-to-tree distances and can facilitate the interpretation of the relationship among phylogenetic trees. We demonstrate that the choice of dimensionality reduction method can significantly influence the spatial relationship among a large set of competing phylogenetic trees. We highlight the importance of selecting a dimensionality reduction method to visualize large multi-locus phylogenetic landscapes and demonstrate that 3D projections of mitochondrial tree landscapes better capture the relationship among the trees being compared.
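
    As a rough illustration of projecting a tree-to-tree distance matrix into two or three dimensions and scoring the fit, the Python sketch below uses classical multidimensional scaling and a Kruskal-style stress measure as simple stand-ins for the CCA + SGD method and the goodness-of-fit measures evaluated in the paper; the random "tree" distances are synthetic.

    import numpy as np

    def classical_mds(D, dim=3):
        """Embed an n x n distance matrix D into `dim` dimensions."""
        n = D.shape[0]
        J = np.eye(n) - np.ones((n, n)) / n
        B = -0.5 * J @ (D ** 2) @ J                      # double-centered Gram matrix
        vals, vecs = np.linalg.eigh(B)
        order = np.argsort(vals)[::-1][:dim]             # largest eigenvalues first
        return vecs[:, order] * np.sqrt(np.clip(vals[order], 0.0, None))

    def stress(D, X):
        """Normalized residual between original and embedded pairwise distances."""
        d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
        return np.sqrt(((D - d) ** 2).sum() / (D ** 2).sum())

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        pts = rng.standard_normal((50, 6))               # stand-in "trees" living in 6-D
        D = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
        for k in (2, 3):
            print(k, stress(D, classical_mds(D, k)))     # the 3-D projection should fit better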

  3. Smoothing of climate time series revisited

    NASA Astrophysics Data System (ADS)

    Mann, Michael E.

    2008-08-01

    We present an easily implemented method for smoothing climate time series, generalizing upon an approach previously described by Mann (2004). The method adaptively weights the three lowest order time series boundary constraints to optimize the fit with the raw time series. We apply the method to the instrumental global mean temperature series from 1850-2007 and to various surrogate global mean temperature series from 1850-2100 derived from the CMIP3 multimodel intercomparison project. These applications demonstrate that the adaptive method systematically out-performs certain widely used default smoothing methods, and is more likely to yield accurate assessments of long-term warming trends.

  4. Applying New Methods to Diagnose Coral Diseases

    USGS Publications Warehouse

    Kellogg, Christina A.; Zawada, David G.

    2009-01-01

    Coral disease, one of the major causes of reef degradation and coral death, has been increasing worldwide since the 1970s, particularly in the Caribbean. Despite increased scientific study, simple questions about the extent of disease outbreaks and the causative agents remain unanswered. A component of the U.S. Geological Survey Coral Reef Ecosystem STudies (USGS CREST) project is focused on developing and using new methods to approach the complex problem of coral disease.

  5. Method and apparatus for detecting and/or imaging clusters of small scattering centers in the body

    DOEpatents

    Perez-Mendez, V.; Sommer, F.G.

    1982-07-13

    An ultrasonic method and apparatus are provided for detecting and imaging clusters of small scattering centers in the breast wherein periodic pulses are applied to an ultrasound emitting transducer and projected into the body, thereafter being received by at least one receiving transducer positioned to receive scattering from the scattering center clusters. The signals are processed to provide an image showing cluster extent and location. 6 figs.

  6. Method and apparatus for detecting and/or imaging clusters of small scattering centers in the body

    DOEpatents

    Perez-Mendez, Victor; Sommer, Frank G.

    1982-01-01

    An ultrasonic method and apparatus are provided for detecting and imaging clusters of small scattering centers in the breast wherein periodic pulses are applied to an ultrasound emitting transducer and projected into the body, thereafter being received by at least one receiving transducer positioned to receive scattering from the scattering center clusters. The signals are processed to provide an image showing cluster extent and location.

  7. A Modeling and Data Analysis of Laser Beam Propagation in the Maritime Domain

    DTIC Science & Technology

    2015-05-18

    approach to computing pdfs is the Kernel Density Method (Reference [9] has an introduction to the method), which we will apply to compute the pdf of our... The project has two parts to it: 1) we present a computational analysis of different probability density function approximation techniques; and 2) we introduce preliminary steps towards developing a

  8. Reconstruction method for fluorescent X-ray computed tomography by least-squares method using singular value decomposition

    NASA Astrophysics Data System (ADS)

    Yuasa, T.; Akiba, M.; Takeda, T.; Kazama, M.; Hoshino, A.; Watanabe, Y.; Hyodo, K.; Dilmanian, F. A.; Akatsuka, T.; Itai, Y.

    1997-02-01

    We describe a new attenuation correction method for fluorescent X-ray computed tomography (FXCT) applied to image nonradioactive contrast materials in vivo. The principle of the FXCT imaging is that of computed tomography of the first generation. Using monochromatized synchrotron radiation from the BLNE-5A bending-magnet beam line of Tristan Accumulation Ring in KEK, Japan, we studied phantoms with the FXCT method, and we succeeded in delineating a 4-mm-diameter channel filled with a 500 μg I/ml iodine solution in a 20-mm-diameter acrylic cylindrical phantom. However, to detect smaller iodine concentrations, attenuation correction is needed. We present a correction method based on the equation representing the measurement process. The discretized equation system is solved by the least-squares method using the singular value decomposition. The attenuation correction method is applied to the projections by the Monte Carlo simulation and the experiment to confirm its effectiveness.
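
    The least-squares step described above (solving the discretized measurement equation with the singular value decomposition) can be sketched with a truncated pseudo-inverse. In the hedged Python sketch below, the system matrix is a random stand-in, not the FXCT forward model.

    import numpy as np

    def tsvd_solve(A, b, rel_tol=1e-3):
        """Least-squares solution keeping singular values above rel_tol * s_max."""
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        keep = s > rel_tol * s[0]
        # x = V diag(1/s) U^T b, restricted to the retained singular values.
        return Vt[keep].T @ ((U[:, keep].T @ b) / s[keep])

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        A = rng.standard_normal((200, 80))          # stand-in discretized measurement operator
        x_true = rng.standard_normal(80)
        b = A @ x_true + 0.01 * rng.standard_normal(200)
        print(np.linalg.norm(tsvd_solve(A, b) - x_true))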

  9. Development of high-rise buildings: digitalization of life cycle management

    NASA Astrophysics Data System (ADS)

    Gusakova, Elena

    2018-03-01

    The analysis of the accumulated long-term experience in the construction and operation of high-rise buildings reveals not only the engineering specificity of such projects, but also systemic problems in the field of project management. Most of the project decisions are made by the developer and the investor in the early stages of the life cycle - from the acquisition of the site to the start of operation - so most of the participants in the construction and operation of a high-rise building are far removed from the strategic life-cycle management of the project. Addressing these tasks through the informatization of management has largely exhausted its efficiency potential. This is due to the fact that the applied IT systems automated traditional "inherited" processes and management structures and, in addition, were focused on informatization of the activities of the construction company rather than of the construction project. Therefore, in the development of high-rise buildings, research into approaches and methods for managing the full life cycle of projects, so as to improve their competitiveness, becomes topical. For this purpose, the article substantiates the most promising approaches and methods of information modeling of high-rise construction as a basis for managing the full life cycle of such a project. The article considers the reengineering of information interaction schemes among project participants, the formation of a unified digital environment for the project life cycle, and the development of systems integrating data management and project management.

  10. The Social Process of Analyzing Real Water Resource Systems Plans and Management Policies

    NASA Astrophysics Data System (ADS)

    Loucks, Daniel

    2016-04-01

    Developing and applying systems analysis methods for improving the development and management of real world water resource systems, I have learned, is primarily a social process. This talk is a call for more recognition of this reality in the modeling approaches we propose in the papers and books we publish. The mathematical models designed to inform planners and managers of water systems that we see in many of our journals often seem more complex than they need be. They also often seem not as connected to reality as they could be. While it may be easier to publish descriptions of complex models than simpler ones, and while adding complexity to models might make them better able to mimic or resemble the actual complexity of the real physical and/or social systems or processes being analyzed, the usefulness of such models often can be an illusion. Sometimes the important features of reality that are of concern or interest to those who make decisions can be adequately captured using relatively simple models. Finding the right balance for the particular issues being addressed or the particular decisions that need to be made is an art. When applied to real world problems or issues in specific basins or regions, systems modeling projects often involve more attention to the social aspects than the mathematical ones. Mathematical models addressing connected interacting interdependent components of complex water systems are in fact some of the most useful methods we have to study and better understand the systems we manage around us. They can help us identify and evaluate possible alternative solutions to problems facing humanity today. The study of real world systems of interacting components using mathematical models is commonly called applied systems analysis. Performing such analyses with decision makers rather than of decision makers is critical if the needed trust between project personnel and their clients is to be developed. Using examples from recent and ongoing modeling projects in different parts of the world, this talk will attempt to show how the degree of project success depends on the degree of attention given to communication between project personnel, the stakeholders and decision-making institutions. It will also highlight how initial project terms-of-reference and expected outcomes can change, sometimes in surprising ways, during the course of such projects. Changing project objectives often result from changing stakeholder values, emphasizing the need for analyses that can adapt to this uncertainty.

  11. Project Management Using Modern Guidance, Navigation and Control Theory

    NASA Technical Reports Server (NTRS)

    Hill, Terry R.

    2011-01-01

    Implementing guidance, navigation, and control (GN&C) theory principles and applying them to the human element of project management and control is not a new concept. As both the literature on the subject and the real-world applications are neither readily available nor comprehensive with regard to how such principles might be applied, this paper has been written to educate the project manager on the "laws of physics" of his or her project (not to teach a GN&C engineer how to become a project manager) and to provide an intuitive, mathematical explanation as to the control and behavior of projects. This paper will also address how the fundamental principles of modern GN&C were applied to the National Aeronautics and Space Administration's (NASA) Constellation Program (CxP) space suit project, ensuring the project was managed within cost, schedule, and budget. A project that is akin to a physical system can be modeled and managed using the same over arching principles of GN&C that would be used if that project were a complex vehicle, a complex system(s), or complex software with time-varying processes (at times nonlinear) containing multiple data inputs of varying accuracy and a range of operating points. The classic GN&C theory approach could thus be applied to small, well-defined projects; yet when working with larger, multiyear projects necessitating multiple organizational structures, numerous external influences, and a multitude of diverse resources, modern GN&C principles are required to model and manage the project. The fundamental principles of a GN&C system incorporate these basic concepts: State, Behavior, Feedback Control, Navigation, Guidance and Planning Logic systems. The State of a system defines the aspects of the system that can change over time; e.g., position, velocity, acceleration, coordinate-based attitude, and temperature, etc. The Behavior of the system focuses more on what changes are possible within the system; this is denoted in the state of the system. The behavior of a system, as captured in the system modeling, when properly done will aid in accurately predicting future system performance. The Feedback Control system understands the state and behavior of the system and uses feedback to adjust control inputs into the system. The feedback, which is the right arm of the Control system, allows change to be affected in the overall system; it therefore is important to not only correctly identify the system feedback inputs, but also the system response to the feedback inputs. The Navigation system takes multiple data inputs and based on a priori knowledge of the inputs, develops a statistically based weighting of the inputs and measurements to determine the system's state. Guidance and Planning Logic of the system, complete with an understanding of where the system is (provided by the Navigation system), will in turn determine where the system needs to be and how to get it there. With any system/project, it is critical that the objective of the system/project be clearly defined -- not only to plan but to measure performance and to aid in guiding the system or the project. The system principles discussed above, which can be and have been applied to the current CxP space suit development project, can also be mapped to real-world constituents, thus allowing project managers to apply systems theories that are well defined in engineering and mathematics to a discipline (i.e., Project Management) that historically has been based in personal experience and intuition. 
This mapping of GN&C theory to Project Management will, in turn, permit a direct, methodical approach to Project Management, planning and control, providing a tool to help predict (and guide) performance and an understanding of the project constraints, how the project can be controlled, and the impacts of external influences and inputs. This approach, to a project manager, flows down to the three bottom-line variables of cost, schedule, and scope and the needed control of these three variables to successfully perform and complete a project.

  12. Biorthogonal projected energies of a Gutzwiller similarity transformed Hamiltonian.

    PubMed

    Wahlen-Strothman, J M; Scuseria, G E

    2016-12-07

    We present a method incorporating biorthogonal orbital-optimization, symmetry projection, and double-occupancy screening with a non-unitary similarity transformation generated by the Gutzwiller factor [Formula: see text], and apply it to the Hubbard model. Energies are calculated with mean-field computational scaling with high-quality results comparable to coupled cluster singles and doubles. This builds on previous work performing similarity transformations with more general, two-body Jastrow-style correlators. The theory is tested on 2D lattices ranging from small systems into the thermodynamic limit and is compared to available reference data.

  13. Reflections on Descriptive Psychology: NASA, Media and Technology, Observation

    NASA Technical Reports Server (NTRS)

    Aucoin, Paschal J., Jr.

    1999-01-01

    At NASA, we have used methods of Descriptive Psychology (DP) to solve problems in several areas: Simulation of proposed Lunar/Mars missions at high level to assess feasibility and needs in the robotics and automation areas. How we would go about making a "person-like" robot. Design and implementation of Systems Engineering practices on behalf of future projects with emphasis on interoperability. Design of a Question and Answer dialog system to handle student questions about Advanced Life Support (ALS) systems - students learn biology by applying it to ALS projects.

  14. Projected Changes in Hydrological Extremes in a Cold Region Watershed: Sensitivity of Results to Statistical Methods of Analysis

    NASA Astrophysics Data System (ADS)

    Dibike, Y. B.; Eum, H. I.; Prowse, T. D.

    2017-12-01

    Flows originating from alpine-dominated cold-region watersheds typically experience extended winter low flows followed by spring snowmelt and summer rainfall-driven high flows. In a warmer climate, there will be a temperature-induced shift in precipitation from snow towards rain as well as changes in snowmelt timing affecting the frequency of extreme high and low flow events, which could significantly alter ecosystem services. This study examines the potential changes in the frequency and severity of hydrologic extremes in the Athabasca River watershed in Alberta, Canada based on the Variable Infiltration Capacity (VIC) hydrologic model and selected and statistically downscaled climate change scenario data from the latest Coupled Model Intercomparison Project (CMIP5). The sensitivity of these projected changes is also examined by applying different extreme flow analysis methods. The hydrological model projections show an overall increase in mean annual streamflow in the watershed and a corresponding shift in the freshet timing to an earlier period. Most of the streams are projected to experience increases during the winter and spring seasons and decreases during the summer and early fall seasons, with overall projected increases in extreme high flows, especially for low-frequency events. While the middle and lower parts of the watershed are characterised by projected increases in extreme high flows, the high elevation alpine region is mainly characterised by corresponding decreases in extreme low flow events. However, the magnitude of projected changes in extreme flow varies over a wide range, especially for low-frequency events, depending on the climate scenario and period of analysis, and sometimes in a nonlinear way. Nonetheless, the sensitivity of the projected changes to the statistical method of analysis is found to be relatively small compared to the inter-model variability.

  15. Projected quasiparticle theory for molecular electronic structure

    NASA Astrophysics Data System (ADS)

    Scuseria, Gustavo E.; Jiménez-Hoyos, Carlos A.; Henderson, Thomas M.; Samanta, Kousik; Ellis, Jason K.

    2011-09-01

    We derive and implement symmetry-projected Hartree-Fock-Bogoliubov (HFB) equations and apply them to the molecular electronic structure problem. All symmetries (particle number, spin, spatial, and complex conjugation) are deliberately broken and restored in a self-consistent variation-after-projection approach. We show that the resulting method yields a comprehensive black-box treatment of static correlations with effective one-electron (mean-field) computational cost. The ensuing wave function is of multireference character and permeates the entire Hilbert space of the problem. The energy expression is different from regular HFB theory but remains a functional of an independent quasiparticle density matrix. All reduced density matrices are expressible as an integration of transition density matrices over a gauge grid. We present several proof-of-principle examples demonstrating the compelling power of projected quasiparticle theory for quantum chemistry.

  16. Distance majorization and its applications.

    PubMed

    Chi, Eric C; Zhou, Hua; Lange, Kenneth

    2014-08-01

    The problem of minimizing a continuously differentiable convex function over an intersection of closed convex sets is ubiquitous in applied mathematics. It is particularly interesting when it is easy to project onto each separate set, but nontrivial to project onto their intersection. Algorithms based on Newton's method such as the interior point method are viable for small to medium-scale problems. However, modern applications in statistics, engineering, and machine learning are posing problems with potentially tens of thousands of parameters or more. We revisit this convex programming problem and propose an algorithm that scales well with dimensionality. Our proposal is an instance of a sequential unconstrained minimization technique and revolves around three ideas: the majorization-minimization principle, the classical penalty method for constrained optimization, and quasi-Newton acceleration of fixed-point algorithms. The performance of our distance majorization algorithms is illustrated in several applications.
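
    A minimal sketch of the distance-majorization idea, combining the majorization-minimization principle with a classical penalty escalation, is shown below for projecting a point onto the intersection of a ball and a half-space. The objective, the two sets, and the penalty schedule are illustrative choices, and the quasi-Newton acceleration discussed in the paper is omitted.

    import numpy as np

    def proj_ball(x, radius=1.0):                  # projection onto {||x|| <= radius}
        nrm = np.linalg.norm(x)
        return x if nrm <= radius else x * (radius / nrm)

    def proj_halfspace(x, a, b):                   # projection onto {a.x <= b}
        viol = a @ x - b
        return x if viol <= 0.0 else x - viol * a / (a @ a)

    def distance_majorization(y, projections, rho=1.0, outer=30, inner=50):
        """Minimize ||x - y||^2 over the intersection of the projected sets."""
        x = y.copy()
        for _ in range(outer):
            for _ in range(inner):
                # Majorize each squared distance by ||x - P_i(x_k)||^2 at the current iterate;
                # the resulting quadratic surrogate is minimized by a weighted average.
                anchors = [P(x) for P in projections]
                x = (y + rho * sum(anchors)) / (1.0 + rho * len(anchors))
            rho *= 2.0                             # classical penalty-method escalation
        return x

    if __name__ == "__main__":
        y = np.array([3.0, 3.0])
        a = np.array([1.0, 0.0])
        sets = [lambda x: proj_ball(x, 1.0), lambda x: proj_halfspace(x, a, 0.2)]
        print(distance_majorization(y, sets))      # approaches roughly (0.2, 0.98)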

  17. An improved robust buffer allocation method for the project scheduling problem

    NASA Astrophysics Data System (ADS)

    Ghoddousi, Parviz; Ansari, Ramin; Makui, Ahmad

    2017-04-01

    Unpredictable uncertainties cause delays and additional costs for projects. Often, when using traditional approaches, the optimizing procedure of the baseline project plan fails and leads to delays. In this study, a two-stage multi-objective buffer allocation approach is applied for robust project scheduling. In the first stage, some decisions are made on buffer sizes and allocation to the project activities. A set of Pareto-optimal robust schedules is designed using the meta-heuristic non-dominated sorting genetic algorithm (NSGA-II) based on the decisions made in the buffer allocation step. In the second stage, the Pareto solutions are evaluated in terms of the deviation from the initial start time and due dates. The proposed approach was implemented on a real dam construction project. The outcomes indicated that the obtained buffered schedule reduces the cost of disruptions by 17.7% compared with the baseline plan, with an increase of about 0.3% in the project completion time.

  18. Planungsmodelle und Planungsmethoden. Anhaltspunkte zur Strukturierung und Gestaltung von Planungsprozessen

    NASA Astrophysics Data System (ADS)

    Diller, Christian; Karic, Sarah; Oberding, Sarah

    2017-06-01

    This article addresses the question of the phases of the political planning process in which planners apply their methodological set of tools. To that end, the results of a research project are presented, which were gained by an examination of planning cases reported in learned journals. First, it is argued which model of the planning process is most suitable to reflect the regarded cases and how it relates to models of the political process. Thereafter, it is analyzed which types of planning methods are applied in the several stages of the planning process. The central findings: although complex, many planning processes can be thoroughly pictured by a linear model with predominantly simple feedback loops. Even in times of the communicative turn, planners should take care to apply not only communicative methods but also the classical analytical-rational methods in their set of tools. These are helpful especially for understanding the political process before and after the actual planning phase.

  19. Developing Recreation, Leisure, and Sport Professional Competencies through Practitioner/Academic Service Engagement Partnerships

    ERIC Educational Resources Information Center

    VanSickle, Jennifer; Schaumleffel, Nathan A.

    2016-01-01

    The goal of many universities is to prepare students for professional careers, especially in the applied field of recreation, leisure, and sport (Smith, O'Dell, & Schaumleffel, 2002). While some universities continue to use traditional knowledge-transfer methods to accomplish this goal, others have developed service engagement projects that…

  20. Focus Group Discussions: Three Examples from Family and Consumer Science Research.

    ERIC Educational Resources Information Center

    Garrison, M. E. Betsy; Pierce, Sarah H.; Monroe, Pamela A.; Sasser, Diane D.; Shaffer, Amy C.; Blalock, Lydia B.

    1999-01-01

    Gives examples of the focus group method in terms of question development, group composition and recruitment, interview protocols, and data analysis as applied to three family and consumer-sciences research projects: consumer behavior of working female adolescents, work readiness of adult males with low educational attainment, and definition of…

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramachandran, N.

    New technologies were used to cost-effectively remediate several hundred feet of radioactively contaminated subsurface drain pipes at the General Motors site in Adrian, Michigan, and to conduct post-remedial verification surveys. Supplemental cleanup criteria were applied to inaccessible areas of the project, and inexpensive treatment technology was used to treat wastewater generated. Application of these methods resulted in substantial cost savings.

  2. Real World Projects: Creating a Home-Grown Fundraiser for Your Sales Course

    ERIC Educational Resources Information Center

    Fine, Monica B.; Clark, Paul C.

    2013-01-01

    More than any other area of business, expertise in personal selling and sales management can best be seen through applied learning styles. Many universities are now offering sales concentrations in marketing or even MBA degrees. However, many students still feel instructors' teaching methods are outdated. Instructors use many different techniques…

  3. Principal Component Analysis: Resources for an Essential Application of Linear Algebra

    ERIC Educational Resources Information Center

    Pankavich, Stephen; Swanson, Rebecca

    2015-01-01

    Principal Component Analysis (PCA) is a highly useful topic within an introductory Linear Algebra course, especially since it can be used to incorporate a number of applied projects. This method represents an essential application and extension of the Spectral Theorem and is commonly used within a variety of fields, including statistics,…

  4. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR THE GENERATION AND OPERATION OF DATA DICTIONARIES (UA-D-4.0)

    EPA Science Inventory

    The purpose of this SOP is to provide a standard method for the writing of data dictionaries. This procedure applies to the dictionaries used during the Arizona NHEXAS project and the "Border" study. Keywords: guidelines; data dictionaries.

    The National Human Exposure Assessme...

  5. SU-E-QI-08: Fourier Properties of Cone Beam CT Projection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bai, T; UT Southwestern Medical Center, Dallas, TX; Yan, H

    Purpose: To explore the Fourier properties of cone beam CT (CBCT) projections and apply the property to directly estimate the noise level of CBCT projections without any prior information. Methods: By utilizing the properties of the Bessel function, we derive the Fourier properties of the CBCT projections of an arbitrary point object. It is found that there exists a double-wedge shaped region in the Fourier space where the intensity is approximately zero. We further derive the Fourier properties of independent noise added to CBCT projections. The expectation of the squared modulus at any point of the Fourier space is constant and the value approximately equals the noise energy. We further validate the theory in numerical simulations for both a delta function object and an NCAT phantom with different levels of noise added. Results: Our simulation confirmed the existence of the double-wedge shaped region in the Fourier domain for the x-ray projection image. The boundary locations of this region agree well with theoretical predictions. In the experiments of estimating noise level, the mean relative error between the theoretical estimate and the ground truth values is 2.697%. Conclusion: A novel theory on the Fourier properties of CBCT projections has been discovered. Accurate noise level estimation can be achieved by applying this theory directly to the measured CBCT projections. This work was supported in part by NIH (1R01CA154747-01), NSFC (No. 61172163), Research Fund for the Doctoral Program of Higher Education of China (No. 20110201110011) and China Scholarship Council.
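
    The estimation idea can be sketched as averaging the squared FFT modulus over a region where the projection's own spectrum is negligible. In the hedged Python sketch below, a plain high-frequency corner stands in for the double-wedge region derived in the abstract, and the smooth Gaussian "projection" is synthetic.

    import numpy as np

    def estimate_noise_sigma(projection, frac=0.15):
        """Estimate additive white-noise sigma from a (near) signal-free spectral region."""
        F = np.fft.fftshift(np.fft.fft2(projection))
        power = np.abs(F) ** 2 / projection.size    # E[|F|^2] / N equals sigma^2 for white noise
        corner = power[: int(frac * power.shape[0]), : int(frac * power.shape[1])]
        return np.sqrt(corner.mean())               # assumes the corner carries no signal energy

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        yy, xx = np.mgrid[:256, :384]
        signal = np.exp(-((xx - 192.0) ** 2 + (yy - 128.0) ** 2) / (2.0 * 40.0 ** 2))
        noisy = signal + 0.05 * rng.standard_normal(signal.shape)
        print(estimate_noise_sigma(noisy))           # should recover roughly 0.05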

  6. Wave propagation simulation in the upper core of sodium-cooled fast reactors using a spectral-element method for heterogeneous media

    NASA Astrophysics Data System (ADS)

    Nagaso, Masaru; Komatitsch, Dimitri; Moysan, Joseph; Lhuillier, Christian

    2018-01-01

    The ASTRID project, a French fourth-generation sodium-cooled nuclear reactor, is currently under development by the Alternative Energies and Atomic Energy Commission (CEA). In this project, the development of monitoring techniques for a nuclear reactor during operation is identified as a major issue for enhancing plant safety. Ultrasonic measurement techniques (e.g. thermometry, visualization of internal objects) are regarded as powerful inspection tools for sodium-cooled fast reactors (SFR), including ASTRID, due to the opacity of liquid sodium. Inside a sodium cooling circuit, the medium becomes heterogeneous because of the complex flow state, especially during operation, and the effect of this heterogeneity on acoustic propagation is not negligible. Thus, it is necessary to carry out verification experiments for the development of component technologies, while such experiments using liquid sodium may be relatively large-scale. This is why numerical simulation methods are essential for preparing real experiments or for supplementing the limited number of experimental results. Though various numerical methods have been applied to wave propagation in liquid sodium, there is still no method verified for three-dimensional heterogeneity. Moreover, since the interior of a reactor core is a complex acousto-elastic coupled region, it has also been difficult to simulate such problems with conventional methods. The objective of this study is to address these two points by applying a three-dimensional spectral-element method. In this paper, our initial results of a three-dimensional simulation study on a heterogeneous medium (the first point) are shown. To represent the heterogeneity of liquid sodium, a four-dimensional temperature field (three spatial dimensions and one temporal dimension) calculated by computational fluid dynamics (CFD) with Large-Eddy Simulation was applied instead of the conventional approach (i.e. a Gaussian random field). This three-dimensional numerical experiment shows that we could verify the effects of the heterogeneity of the propagation medium on waves in liquid sodium.

  7. Do project management and network governance contribute to inter-organisational collaboration in primary care? A mixed methods study.

    PubMed

    Schepman, Sanneke; Valentijn, Pim; Bruijnzeels, Marc; Maaijen, Marlies; de Bakker, Dinny; Batenburg, Ronald; de Bont, Antoinette

    2018-06-07

    The need for organisational development in primary care has increased as it is accepted as a means of curbing rising costs and responding to demographic transitions. It is only within such inter-organisational networks that small-scale practices can offer treatment to complex patients and continuity of care. The aim of this paper is to explore, through the experience of professionals and patients, whether, and how, project management and network governance can improve the outcomes of projects which promote inter-organisational collaboration in primary care. This paper describes a study of projects aimed at improving inter-organisational collaboration in Dutch primary care. The projects' success in project management and network governance was monitored by interviewing project leaders and board members on the one hand, and improvement in the collaboration by surveying professionals and patients on the other. Both qualitative and quantitative methods were applied to assess the projects. These were analysed, finally, using multi-level models in order to account for the variation in the projects, professionals and patients. Successful network governance was associated positively with the professionals' satisfaction with the collaboration; but not with improvements in the quality of care as experienced by patients. Neither patients nor professionals perceived successful project management as associated with the outcomes of the collaboration projects. This study shows that network governance in particular makes a difference to the outcomes of inter-organisational collaboration in primary care. However, project management is not a predictor for successful inter-organisational collaboration in primary care.

  8. Scientific rigour in qualitative research--examples from a study of women's health in family practice.

    PubMed

    Hamberg, K; Johansson, E; Lindgren, G; Westman, G

    1994-06-01

    The increase in qualitative research in family medicine raises a demand for critical discussions about design, methods and conclusions. This article shows how scientific claims for truthful findings and neutrality can be assessed. Established concepts such as validity, reliability, objectivity and generalization cannot be used in qualitative research. Alternative criteria for scientific rigour, initially introduced by Lincoln and Guba, are presented: credibility, dependability, confirmability and transferability. These criteria have been applied to a research project, a qualitative study with in-depth interviews with female patients suffering from chronic pain in the locomotor system. The interview data were analysed on the basis of grounded theory. The proposed indicators for scientific rigour were shown to be useful when applied to the research project. Several examples are given. Difficulties in the use of the alternative criteria are also discussed.

  9. Angular reconstitution-based 3D reconstructions of nanomolecular structures from superresolution light-microscopy images

    PubMed Central

    Salas, Desirée; Le Gall, Antoine; Fiche, Jean-Bernard; Valeri, Alessandro; Ke, Yonggang; Bron, Patrick; Bellot, Gaetan

    2017-01-01

    Superresolution light microscopy allows the imaging of labeled supramolecular assemblies at a resolution surpassing the classical diffraction limit. A serious limitation of the superresolution approach is sample heterogeneity and the stochastic character of the labeling procedure. To increase the reproducibility and the resolution of the superresolution results, we apply multivariate statistical analysis methods and 3D reconstruction approaches originally developed for cryogenic electron microscopy of single particles. These methods allow for the reference-free 3D reconstruction of nanomolecular structures from two-dimensional superresolution projection images. Since these 2D projection images all show the structure in high-resolution directions of the optical microscope, the resulting 3D reconstructions have the best possible isotropic resolution in all directions. PMID:28811371

  10. COOMET pilot comparison 473/RU-a/09: Comparison of hydrophone calibrations in the frequency range 250 Hz to 200 kHz

    NASA Astrophysics Data System (ADS)

    Yi, Chen; Isaev, A. E.; Yuebing, Wang; Enyakov, A. M.; Teng, Fei; Matveev, A. N.

    2011-01-01

    A description is given of the COOMET project 473/RU-a/09: a pilot comparison of hydrophone calibrations at frequencies from 250 Hz to 200 kHz between Hangzhou Applied Acoustics Research Institute (HAARI, China)—pilot laboratory—and Russian National Research Institute for Physicotechnical and Radio Engineering Measurements (VNIIFTRI, Designated Institute of Russia of the CIPM MRA). Two standard hydrophones, B&K 8104 and TC 4033, were calibrated and compared to assess the current state of hydrophone calibration of HAARI (China) and Russia. Three different calibration methods were applied: a vibrating column method, a free-field reciprocity method and a comparison method. The standard facilities of each laboratory were used, and three different sound fields were applied: pressure field, free-field and reverberant field. The maximum deviation of the sensitivities of two hydrophones between the participants' results was 0.36 dB. The final report has been peer-reviewed and approved for publication by the CCAUV-KCWG.

  11. Revealing Risks in Adaptation Planning: expanding Uncertainty Treatment and dealing with Large Projection Ensembles during Planning Scenario development

    NASA Astrophysics Data System (ADS)

    Brekke, L. D.; Clark, M. P.; Gutmann, E. D.; Wood, A.; Mizukami, N.; Mendoza, P. A.; Rasmussen, R.; Ikeda, K.; Pruitt, T.; Arnold, J. R.; Rajagopalan, B.

    2015-12-01

    Adaptation planning assessments often rely on single methods for climate projection downscaling and hydrologic analysis, do not reveal uncertainties from associated method choices, and thus likely produce overly confident decision-support information. Recent work by the authors has highlighted this issue by identifying strengths and weaknesses of widely applied methods for downscaling climate projections and assessing hydrologic impacts. This work has shown that many of the methodological choices made can alter the magnitude, and even the sign of the climate change signal. Such results motivate consideration of both sources of method uncertainty within an impacts assessment. Consequently, the authors have pursued development of improved downscaling techniques spanning a range of method classes (quasi-dynamical and circulation-based statistical methods) and developed approaches to better account for hydrologic analysis uncertainty (multi-model; regional parameter estimation under forcing uncertainty). This presentation summarizes progress in the development of these methods, as well as implications of pursuing these developments. First, having access to these methods creates an opportunity to better reveal impacts uncertainty through multi-method ensembles, expanding on present-practice ensembles which are often based only on emissions scenarios and GCM choices. Second, such expansion of uncertainty treatment combined with an ever-expanding wealth of global climate projection information creates a challenge of how to use such a large ensemble for local adaptation planning. To address this challenge, the authors are evaluating methods for ensemble selection (considering the principles of fidelity, diversity and sensitivity) that is compatible with present-practice approaches for abstracting change scenarios from any "ensemble of opportunity". Early examples from this development will also be presented.

  12. Public project success as seen in a broad perspective.: Lessons from a meta-evaluation of 20 infrastructure projects in Norway.

    PubMed

    Volden, Gro Holst

    2018-08-01

    Infrastructure projects in developed countries are rarely evaluated ex-post. Despite their number and scope, our knowledge about their various impacts is surprisingly limited. The paper argues that such projects must be assessed in a broad perspective that includes operational, tactical and strategic aspects, and unintended as well as intended effects. A generic six-criteria evaluation framework is suggested, inspired by a framework frequently used to evaluate development assistance projects. It is tested on 20 Norwegian projects from various sectors (transport, defence, ICT, buildings). The results indicate that the majority of projects were successful, especially in operational terms, possibly because they underwent external quality assurance up-front. It is argued that applying this type of standardized framework provides a good basis for comparison and learning across sectors. It is suggested that evaluations should be conducted with the aim of promoting accountability, building knowledge about infrastructure projects, and continuously improving the tools, methods and governance arrangements used in the front-end of project development. Copyright © 2018 The Author. Published by Elsevier Ltd. All rights reserved.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agol, Dorice, E-mail: d.agol@uea.a.c.uk; Latawiec, Agnieszka E., E-mail: a.latawiec@iis-rio.org; Opole University of Technology, Department of Production Engineering and Logistics, Luboszycka 5, 45-036 Opole

    There has been an increased interest in using sustainability indicators for evaluating the impacts of development and conservation projects. Past and recent experiences have shown that sustainability indicators can be powerful tools for measuring the outcomes of various interventions, when used appropriately and adequately. Currently, there is a range of methods for applying sustainability indicators to project impact evaluation at the environment–development interface. At the same time, a number of challenges persist which have implications for impact evaluation processes, especially in developing countries. We highlight some key and recurrent challenges, using three cases from Kenya, Indonesia and Brazil. In this study, we have conducted a comparative analysis across multiple projects from the three countries, which aimed to conserve biodiversity and improve livelihoods. The assessments of these projects were designed to evaluate their positive, negative, short-term, long-term, direct and indirect impacts. We have identified a set of commonly used sustainability indicators to evaluate the projects and have discussed opportunities and challenges associated with their application. Our analysis shows that impact evaluation processes present good opportunities for applying sustainability indicators. On the other hand, we find that project proponents (e.g. managers, evaluators, donors/funders) face challenges in establishing the full impacts of interventions, and that these are rooted in monitoring and evaluation processes, a lack of evidence-based impacts, difficulties in measuring certain outcomes, and concerns over the scale of a range of impacts. We outline key lessons learnt from the multiple cases and propose ways to overcome common problems. Results from our analysis demonstrate practical experiences of applying sustainability indicators in developing-country contexts with different prevailing socio-economic, cultural and environmental conditions. The knowledge derived from this study may therefore be useful to a wide range of audiences concerned with the sustainable integration of development and environmental conservation. - Highlights: • Sustainability indicators are increasingly used for evaluating project impacts. • Lessons learnt are based on case studies from Africa, Asia and South America. • Similar challenges when assessing impacts of development and conservation projects • Need for pragmatic solutions to overcome challenges when assessing project impacts.

  14. a Target Aware Texture Mapping for Sculpture Heritage Modeling

    NASA Astrophysics Data System (ADS)

    Yang, C.; Zhang, F.; Huang, X.; Li, D.; Zhu, Y.

    2017-08-01

    In this paper, we propose a target-aware image-to-model registration method using the silhouette as the matching clue. The target sculpture object in a natural environment can be automatically detected in an image with a complex background with the assistance of 3D geometric data. The silhouette can then be automatically extracted and applied in image-to-model matching. Because the user does not need to deliberately outline the target area, the time required for precise image-to-model matching is greatly reduced. To enhance the method, we also improved the silhouette matching algorithm to support conditional silhouette matching. Two experiments, using a stone lion sculpture of the Ming Dynasty and a portable relic in a museum, are given to evaluate the proposed method. The method proposed in this paper has been extended and developed into mature software applied in many cultural heritage documentation projects.

  15. The Expanding Role of Applications in the Development and Validation of CFD at NASA

    NASA Technical Reports Server (NTRS)

    Schuster, David M.

    2010-01-01

    This paper focuses on the recent escalation in application of CFD to manned and unmanned flight projects at NASA and the need to often apply these methods to problems for which little or no previous validation data directly applies. The paper discusses the evolution of NASA's CFD development from a strict Develop, Validate, Apply strategy to sometimes allowing for a Develop, Apply, Validate approach. The risks of this approach and some of its unforeseen benefits are discussed and tied to specific operational examples. There are distinct advantages for the CFD developer that is able to operate in this paradigm, and recommendations are provided for those inclined and willing to work in this environment.

  16. A project management system for the X-29A flight test program

    NASA Technical Reports Server (NTRS)

    Stewart, J. F.; Bauer, C. A.

    1983-01-01

    The project-management system developed for NASA's participation in the X-29A aircraft development program is characterized from a theoretical perspective, as an example of a system appropriate to advanced, highly integrated technology projects. System-control theory is applied to the analysis of classical project-management techniques and structures, which are found to be of the closed-loop multivariable type, and the effects of increasing project complexity and integration are evaluated. The importance of information flow, sampling frequency, information holding, and delays is stressed. The X-29A system is developed in four stages: establishment of overall objectives and requirements, determination of information processes (block diagrams), definition of personnel functional roles and relationships, and development of a detailed work-breakdown structure. The resulting system is shown to require a greater information flow to management than conventional methods. Sample block diagrams are provided.

  17. A low-cost and portable realization on fringe projection three-dimensional measurement

    NASA Astrophysics Data System (ADS)

    Xiao, Suzhi; Tao, Wei; Zhao, Hui

    2015-12-01

    Fringe projection three-dimensional measurement is applied in a wide range of industrial applications. Traditional fringe projection systems have the disadvantages of high cost, large size, and complicated calibration requirements. In this paper we introduce a low-cost, portable realization of three-dimensional measurement using a pico projector. It has the advantages of low cost, compact physical size, and flexible configuration. For the proposed fringe projection system, there is no restriction on the relative alignment (parallelism or perpendicularity) of the camera and projector during installation. Moreover, a plane-based calibration method is adopted, which avoids critical requirements on the calibration setup such as an additional gauge block or a precise linear z-stage. Error sources in the proposed system are also discussed. The experimental results demonstrate the feasibility of the proposed low-cost and portable fringe projection system.
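    The paper does not state which fringe analysis scheme it uses, so as general background the sketch below shows the standard N-step phase-shifting approach often used in fringe projection systems: generate phase-shifted sinusoidal patterns and recover the wrapped phase from the captured images. Pattern sizes and the fringe period are illustrative.

```python
import numpy as np

def fringe_patterns(width, height, period_px, steps=4):
    """Generate `steps` phase-shifted sinusoidal fringe images for projection."""
    x = np.arange(width)
    patterns = []
    for k in range(steps):
        phase = 2 * np.pi * x / period_px + 2 * np.pi * k / steps
        row = 0.5 + 0.5 * np.cos(phase)                # intensities in [0, 1]
        patterns.append(np.tile(row, (height, 1)))
    return patterns

def wrapped_phase(images):
    """Recover the wrapped phase from N-step phase-shifted captures."""
    n = len(images)
    shifts = 2 * np.pi * np.arange(n) / n
    num = sum(img * np.sin(s) for img, s in zip(images, shifts))
    den = sum(img * np.cos(s) for img, s in zip(images, shifts))
    return -np.arctan2(num, den)                       # wrapped phase in [-pi, pi]

patterns = fringe_patterns(640, 480, period_px=32)
phi = wrapped_phase(patterns)                          # ideal case: recovers the projected phase
```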

  18. An Integrated Environment for Efficient Formal Design and Verification

    NASA Technical Reports Server (NTRS)

    1998-01-01

    The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.

  19. Spectral-Spatial Shared Linear Regression for Hyperspectral Image Classification.

    PubMed

    Haoliang Yuan; Yuan Yan Tang

    2017-04-01

    Classification of the pixels in a hyperspectral image (HSI) is an important task that has been widely applied in many practical applications. Its major challenge is the high-dimension, small-sample-size problem. To deal with this problem, many subspace learning (SL) methods have been developed to reduce the dimension of the pixels while preserving the important discriminant information. Motivated by the ridge linear regression (RLR) framework for SL, we propose a spectral-spatial shared linear regression method (SSSLR) for extracting the feature representation. Compared with RLR, our proposed SSSLR has the following two advantages. First, we utilize a convex set to explore the spatial structure for computing the linear projection matrix. Second, we utilize a shared structure learning model, formed by the original data space and a hidden feature space, to learn a more discriminant linear projection matrix for classification. To optimize our proposed method, an efficient iterative algorithm is proposed. Experimental results on two popular HSI data sets, i.e., Indian Pines and Salinas, demonstrate that our proposed methods outperform many SL methods.
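    SSSLR itself is not specified in enough detail here to reproduce; as a point of reference, the sketch below shows the ridge linear regression (RLR) projection it builds on, mapping spectra to one-hot class targets. The function name and toy data are illustrative.

```python
import numpy as np

def ridge_projection(X, y, n_classes, lam=1.0):
    """Learn a linear projection W by ridge regression onto one-hot class targets.

    X: (n_samples, n_bands) spectra; y: integer labels in [0, n_classes).
    Solves W = (X^T X + lam I)^-1 X^T Y, the RLR baseline that SSSLR extends.
    """
    Y = np.eye(n_classes)[y]                          # one-hot targets
    d = X.shape[1]
    W = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)
    return W                                          # (n_bands, n_classes) projection matrix

# Toy usage: project training spectra into the learned low-dimensional space.
rng = np.random.default_rng(0)
X_train, y_train = rng.normal(size=(100, 50)), rng.integers(0, 3, 100)
W = ridge_projection(X_train, y_train, n_classes=3)
Z = X_train @ W                                       # features for a downstream classifier
```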

  20. Dual respiratory and cardiac motion estimation in PET imaging: Methods design and quantitative evaluation.

    PubMed

    Feng, Tao; Wang, Jizhe; Tsui, Benjamin M W

    2018-04-01

    The goal of this study was to develop and evaluate four post-reconstruction respiratory and cardiac (R&C) motion vector field (MVF) estimation methods for cardiac 4D PET data. In Method 1, the dual R&C motions were estimated directly from the dual R&C gated images. In Method 2, respiratory motion (RM) and cardiac motion (CM) were estimated separately from the respiratory-gated-only and cardiac-gated-only images. The effects of RM on CM estimation were modeled in Method 3 by applying an image-based RM correction to the cardiac gated images before CM estimation; the effects of CM on RM estimation were neglected. Method 4 iteratively models the mutual effects of RM and CM during dual R&C motion estimation. Realistic simulation data were generated for quantitative evaluation of the four methods. Almost noise-free PET projection data were generated from the 4D XCAT phantom with realistic R&C MVFs using Monte Carlo simulation. Poisson noise was added to the scaled projection data to generate additional datasets at two more noise levels. All the projection data were reconstructed using a 4D image reconstruction method to obtain dual R&C gated images. The four dual R&C MVF estimation methods were applied to the dual R&C gated images, and the accuracy of motion estimation was quantitatively evaluated using the root mean square error (RMSE) of the estimated MVFs. Results show that among the four estimation methods, Method 2 performed the worst in the noise-free case while Method 1 performed the worst in the noisy cases in terms of quantitative accuracy of the estimated MVF. Methods 4 and 3 showed comparable results and achieved RMSEs up to 35% lower than Method 1 in the noisy cases. In conclusion, we have developed and evaluated four different post-reconstruction R&C MVF estimation methods for use in 4D PET imaging. Comparison of the performance of the four methods on simulated data indicates that separate R&C estimation with modeling of RM before CM estimation (Method 3) is the best option for accurate estimation of dual R&C motion in clinical situations. © 2018 American Association of Physicists in Medicine.
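    The quantitative evaluation rests on the RMSE between estimated and ground-truth motion vector fields. A minimal sketch of that metric is shown below; the array layout (a 3-component displacement vector per voxel) is an assumption.

```python
import numpy as np

def mvf_rmse(estimated, truth):
    """Root mean square error between two motion vector fields.

    Both arrays are assumed to have shape (nz, ny, nx, 3), i.e. a 3-component
    displacement vector at every voxel; the error magnitude is averaged over voxels.
    """
    diff = estimated - truth
    return np.sqrt(np.mean(np.sum(diff ** 2, axis=-1)))

# Toy check: a constant 1-voxel error in one component gives RMSE = 1.
truth = np.zeros((8, 8, 8, 3))
est = truth.copy()
est[..., 0] += 1.0
print(mvf_rmse(est, truth))   # 1.0
```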

  1. 25 CFR 170.912 - Does Indian employment preference apply to Federal-aid Highway Projects?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 25 Indians 1 2011-04-01 2011-04-01 false Does Indian employment preference apply to Federal-aid Highway Projects? 170.912 Section 170.912 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR... Indian Preference § 170.912 Does Indian employment preference apply to Federal-aid Highway Projects? (a...

  2. 25 CFR 170.912 - Does Indian employment preference apply to Federal-aid Highway Projects?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 25 Indians 1 2013-04-01 2013-04-01 false Does Indian employment preference apply to Federal-aid Highway Projects? 170.912 Section 170.912 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR... Indian Preference § 170.912 Does Indian employment preference apply to Federal-aid Highway Projects? (a...

  3. 25 CFR 170.912 - Does Indian employment preference apply to Federal-aid Highway Projects?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 25 Indians 1 2012-04-01 2011-04-01 true Does Indian employment preference apply to Federal-aid Highway Projects? 170.912 Section 170.912 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR... Indian Preference § 170.912 Does Indian employment preference apply to Federal-aid Highway Projects? (a...

  4. 25 CFR 170.912 - Does Indian employment preference apply to Federal-aid Highway Projects?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 25 Indians 1 2014-04-01 2014-04-01 false Does Indian employment preference apply to Federal-aid Highway Projects? 170.912 Section 170.912 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR... Indian Preference § 170.912 Does Indian employment preference apply to Federal-aid Highway Projects? (a...

  5. Performance assurance of the re-applying project documentation

    NASA Astrophysics Data System (ADS)

    Kozlova, Olga

    2017-10-01

    Re-use of project documentation is a cost-effective measure. It saves budgetary funds that would otherwise be spent on developing new project documentation. It also makes it possible to adopt better-proven design decisions and to avoid repeating mistakes. State construction-management authorities are now forming a separate institution for re-applied project documentation. The article outlines the main tasks of such measures and the issues that must be solved to achieve a strongly positive result.

  6. Comparison of Different Approach of Back Projection Method in Retrieving the Rupture Process of Large Earthquakes

    NASA Astrophysics Data System (ADS)

    Tan, F.; Wang, G.; Chen, C.; Ge, Z.

    2016-12-01

    Back-projection of teleseismic P waves [Ishii et al., 2005] has been widely used to image the rupture of earthquakes. Besides conventional narrowband beamforming in the time domain, frequency-domain approaches such as MUSIC back projection (Meng, 2011) and compressive sensing (Yao et al., 2011) have been proposed to improve the resolution. Each method has its advantages and disadvantages and should be used appropriately in different cases. A thorough study comparing and testing these methods is therefore needed. We have written a GUI program that puts the three methods together so that the same data can be conveniently processed with different methods and the results compared. We then use all the methods to process several earthquake datasets, including the 2008 Wenchuan Mw 7.9 earthquake and the 2011 Tohoku-Oki Mw 9.0 earthquake, as well as theoretical seismograms of both simple sources and complex ruptures. Our results show differences in efficiency, accuracy and stability among the methods. Quantitative and qualitative analyses are applied to measure their dependence on data and parameters, such as station number, station distribution, grid size, calculation window length and so on. In general, back projection makes it possible to obtain a good result in a very short time using fewer than 20 high-quality traces with a proper station distribution, but the swimming artifact can be significant. Some measures, for instance combining global seismic data, could help mitigate this artifact. MUSIC back projection needs relatively more data to obtain a better and more stable result, which means it requires considerably more time, since its runtime grows noticeably faster than that of back projection as the station number increases. Compressive sensing deals more effectively with multiple sources in the same time window but costs the most time because it repeatedly solves matrix problems. The resolution of all the methods is complicated and depends on many factors. An important one is the grid size, which in turn influences runtime significantly. More detailed results from this research may help users choose proper data, methods and parameters.
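    As a reference point for the conventional approach compared here, the sketch below shows a bare-bones time-domain back-projection: traces are shifted by predicted travel times and stacked on a grid of candidate source nodes. Travel-time computation and data handling are simplified assumptions.

```python
import numpy as np

def back_project(traces, dt, travel_times):
    """Conventional time-domain back-projection onto a source grid.

    traces: (n_stations, n_samples) waveforms sampled at interval dt (s).
    travel_times: (n_grid, n_stations) predicted P travel times (s) from each
    grid node to each station. Returns the peak stacked energy per grid node.
    """
    n_grid, n_sta = travel_times.shape
    n_samp = traces.shape[1]
    power = np.zeros(n_grid)
    for g in range(n_grid):
        shifts = np.round(travel_times[g] / dt).astype(int)
        length = n_samp - shifts.max()
        stack = np.zeros(length)
        for s in range(n_sta):
            stack += traces[s, shifts[s]:shifts[s] + length]   # align and stack
        power[g] = np.max(stack ** 2)                           # beam power at this node
    return power

# Toy usage with random data: 5 stations, a 10-node source grid.
rng = np.random.default_rng(1)
traces = rng.normal(size=(5, 2000))
tt = rng.uniform(1.0, 5.0, size=(10, 5))
print(back_project(traces, dt=0.01, travel_times=tt).shape)    # (10,)
```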

  7. Conic Sampling: An Efficient Method for Solving Linear and Quadratic Programming by Randomly Linking Constraints within the Interior

    PubMed Central

    Serang, Oliver

    2012-01-01

    Linear programming (LP) problems are commonly used in analysis and resource allocation, frequently surfacing as approximations to more difficult problems. Existing approaches to LP have been dominated by a small group of methods, and randomized algorithms have not enjoyed popularity in practice. This paper introduces a novel randomized method of solving LP problems by moving along the facets and within the interior of the polytope along rays randomly sampled from the polyhedral cones defined by the bounding constraints. This conic sampling method is then applied to randomly sampled LPs, and its runtime performance is shown to compare favorably to the simplex and primal affine-scaling algorithms, especially on polytopes with certain characteristics. The conic sampling method is then adapted and applied to solve a certain quadratic program, which computes a projection onto a polytope; the proposed method is shown to outperform the proprietary software Mathematica on large, sparse QP problems constructed from mass spectrometry-based proteomics. PMID:22952741
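    The quadratic program mentioned above, projection of a point onto a polytope, is min ||x - p||^2 subject to Ax <= b. The sketch below solves it with a general-purpose solver purely for illustration; it is not the paper's conic sampling method, which targets much larger, sparse instances.

```python
import numpy as np
from scipy.optimize import minimize

def project_onto_polytope(p, A, b):
    """Euclidean projection of p onto {x : A x <= b}, posed as a small QP.

    Uses SciPy's SLSQP solver for illustration; the paper's point is that its
    conic sampling method solves the same problem more efficiently at scale.
    """
    cons = {"type": "ineq", "fun": lambda x: b - A @ x}      # feasibility: b - Ax >= 0
    res = minimize(lambda x: 0.5 * np.sum((x - p) ** 2), x0=np.zeros_like(p),
                   jac=lambda x: x - p, constraints=cons, method="SLSQP")
    return res.x

# Project the point (2, 2) onto the unit box [0, 1]^2.
A = np.vstack([np.eye(2), -np.eye(2)])
b = np.array([1.0, 1.0, 0.0, 0.0])
print(project_onto_polytope(np.array([2.0, 2.0]), A, b))     # approximately [1, 1]
```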

  8. Stochastic Methods for Aircraft Design

    NASA Technical Reports Server (NTRS)

    Pelz, Richard B.; Ogot, Madara

    1998-01-01

    The global stochastic optimization method simulated annealing (SA) was adapted and applied to various problems in aircraft design. The research was aimed at overcoming the problem of finding an optimal design in a space with multiple minima and the roughness ubiquitous to numerically generated nonlinear objective functions. SA was modified to reduce the number of objective function evaluations needed for an optimal design, historically the main criticism of stochastic methods. SA was applied to many CFD/MDO problems including: low sonic-boom bodies, minimum drag on supersonic fore-bodies, minimum drag on supersonic aeroelastic fore-bodies, minimum drag on HSCT aeroelastic wings, the FLOPS preliminary design code, another preliminary aircraft design study with vortex lattice aerodynamics, and HSR complete aircraft aerodynamics. In every case, SA provided a simple, robust and reliable optimization method which found optimal designs in on the order of 100 objective function evaluations. Perhaps most importantly, technology from this academic/industrial project has been successfully transferred; this method is the method of choice for optimization problems at Northrop Grumman.
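    For readers unfamiliar with SA, the sketch below is a generic simulated annealing loop; the paper's modifications for reducing objective function evaluations are not reproduced, and the toy objective is purely illustrative.

```python
import math
import random

def simulated_annealing(objective, x0, step, n_iter=1000, t0=1.0, cooling=0.995):
    """Generic simulated annealing: accept worse designs with a temperature-
    dependent probability so the search can escape local minima."""
    x, fx = x0, objective(x0)
    best_x, best_f = x, fx
    temp = t0
    for _ in range(n_iter):
        cand = [xi + random.uniform(-step, step) for xi in x]   # perturb the design
        fc = objective(cand)
        if fc < fx or random.random() < math.exp(-(fc - fx) / temp):
            x, fx = cand, fc                                    # accept the move
            if fx < best_f:
                best_x, best_f = x, fx
        temp *= cooling                                         # cool the schedule
    return best_x, best_f

# Toy usage: a rough 2D objective with many local minima.
f = lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2 + 0.3 * math.sin(20 * v[0])
print(simulated_annealing(f, [5.0, 5.0], step=0.5))
```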

  9. "Applied" Aspects of the Drug Resistance Strategies Project

    ERIC Educational Resources Information Center

    Hecht, Michael L.; Miller-Day, Michelle A.

    2010-01-01

    This paper discusses the applied aspects of our Drug Resistance Strategies Project. We argue that a new definitional distinction is needed to expand the notion of "applied" from the traditional notion of utilizing theory, which we call "applied.1," in order to consider theory-grounded, theory testing and theory developing applied research. We…

  10. A New Analytic-Adaptive Model for EGS Assessment, Development and Management Support

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Danko, George L

    To increase understanding of the energy extraction capacity of Enhanced Geothermal Systems (EGS), a numerical model development and application project has been completed. The general objective of the project is to develop and apply a new, data-coupled Thermal-Hydrological-Mechanical-Chemical (T-H-M-C) model in which the four internal components can be freely selected from existing simulation software without merging and cross-combining a diverse set of computational codes. Eight tasks were completed during the project period. The results are reported in five publications, an MS thesis, twelve quarterly reports, and two annual reports to DOE. Two US patents were also issued during the project period, with one patent application originating prior to the start of the project. The “Multiphase Physical Transport Modeling Method and Modeling System” (U.S. Patent 8,396,693 B2, 2013), a key element in the GHE sub-model solution, is successfully used for EGS studies. The “Geothermal Energy Extraction System and Method" invention (U.S. Patent 8,430,166 B2, 2013) originates from the time of project performance, describing a new fluid flow control solution. The new, coupled T-H-M-C numerical model will help in analyzing and designing new, efficient EGS systems.

  11. Seismicity detection around the subducting seamount off Ibaraki in the Japan Trench using dense OBS array data

    NASA Astrophysics Data System (ADS)

    Nakatani, Y.; Mochizuki, K.; Shinohara, M.; Yamada, T.; Hino, R.; Ito, Y.; Murai, Y.; Sato, T.

    2013-12-01

    A subducting seamount with a height of about 3 km was revealed off Ibaraki in the Japan Trench by a seismic survey (Mochizuki et al., 2008). Mochizuki et al. (2008) also interpreted the interplate coupling as weak over the seamount, because seismicity was low and the slip of the recent large earthquake did not propagate over it. To carry out further investigation, we deployed a dense ocean bottom seismometer (OBS) array around the seamount for about a year. During the observation period, seismicity off Ibaraki was activated by the occurrence of the 2011 Tohoku earthquake. The southern edge of the mainshock rupture area is considered by many source analyses to be located off Ibaraki. Moreover, Kubo et al. (2013) proposed that the seamount played an important role in the rupture termination of the largest aftershock. Therefore, in this study, we try to understand the spatiotemporal variation of seismicity around the seamount before and after the Mw 9.0 event as a first step toward elucidating the relationship between the subducting seamount and seismogenic behavior. We used velocity waveforms from 1 Hz long-term OBSs which were densely deployed at station intervals of about 6 km. The sampling rate is 200 Hz and the observation period is from October 16, 2010 to September 19, 2011. Because of ambient noise and the effects of thick seafloor sediments, it is difficult to apply seismicity-detection methods developed for on-land observational data to OBS data and to handle the continuous waveforms automatically. We therefore apply a back-projection method (e.g., Kiser and Ishii, 2012) to the OBS waveform data, which estimates the energy-release source by stacking waveforms. Among the many back-projection methods, we adopt a semblance analysis (e.g., Honda et al., 2008), which can detect weak signals. First, we constructed a 3-D velocity structure model off Ibaraki by compiling the results of marine seismic surveys (e.g., Nakahigashi et al., 2012). Then, we divided the target area into small cells and calculated P-wave travel times between each station and all cells by the fast marching method (Rawlinson et al., 2006). After constructing the theoretical travel-time tables, we applied a suitable frequency filter to the observed waveforms and estimated the seismic energy release by projecting semblance values. By applying our method, we could successfully detect magnitude 2-3 earthquakes.
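    The semblance measure used for stacking can be sketched as follows: traces are aligned by predicted travel times and the coherence of the stack is computed for one candidate grid node. Window length and array shapes are assumptions.

```python
import numpy as np

def semblance(traces, dt, travel_times, window_s=1.0):
    """Semblance of travel-time-aligned traces for one candidate source node.

    traces: (n_stations, n_samples); travel_times: (n_stations,) predicted
    arrival times (s). Returns a value in [0, 1]; coherent arrivals give ~1.
    """
    n_sta, n_samp = traces.shape
    shifts = np.round(travel_times / dt).astype(int)
    win = int(window_s / dt)
    aligned = np.stack([traces[i, shifts[i]:shifts[i] + win] for i in range(n_sta)])
    num = np.sum(np.sum(aligned, axis=0) ** 2)           # energy of the stack
    den = n_sta * np.sum(aligned ** 2) + 1e-12           # total energy of the traces
    return num / den

# Identical traces give semblance ~1; incoherent traces give a small value.
sig = np.sin(2 * np.pi * 5 * np.arange(0, 4, 0.01))
traces = np.tile(sig, (6, 1))
print(semblance(traces, dt=0.01, travel_times=np.zeros(6)))    # ~1.0
```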

  12. Identification of Balanced Chromosomal Rearrangements Previously Unknown Among Participants in the 1000 Genomes Project: Implications for Interpretation of Structural Variation in Genomes and the Future of Clinical Cytogenetics

    PubMed Central

    Dong, Zirui; Wang, Huilin; Chen, Haixiao; Jiang, Hui; Yuan, Jianying; Yang, Zhenjun; Wang, Wen-Jing; Xu, Fengping; Guo, Xiaosen; Cao, Ye; Zhu, Zhenzhen; Geng, Chunyu; Cheung, Wan Chee; Kwok, Yvonne K; Yang, Huangming; Leung, Tak Yeung; Morton, Cynthia C.; Cheung, Sau Wai; Choy, Kwong Wai

    2017-01-01

    Purpose Recent studies demonstrate that whole-genome sequencing (WGS) enables detection of cryptic rearrangements in apparently balanced chromosomal rearrangements (also known as balanced chromosomal abnormalities, BCAs) previously identified by conventional cytogenetic methods. We aimed to assess our analytical tool for detecting BCAs in the 1000 Genomes Project without prior knowledge of the affected bands. Methods The 1000 Genomes Project provides an unprecedented integrated map of structural variants in phenotypically normal subjects, but there is no information on the potential inclusion of subjects with apparent BCAs akin to those traditionally detected in diagnostic cytogenetics laboratories. We applied our analytical tool to 1,166 genomes from the 1000 Genomes Project with sufficient physical coverage (8.25-fold). Results Our approach detected four reciprocal balanced translocations and four inversions ranging in size from 57.9 kb to 13.3 Mb, all of which were confirmed by cytogenetic methods and PCR studies. One of the DNAs harbors a subtle translocation that is not readily identified by chromosome analysis due to the similar banding patterns and sizes of the exchanged segments, and another results in disruption of all transcripts of an OMIM gene. Conclusions Our study demonstrates the extension of low-coverage WGS to unbiased detection of BCAs, including translocations and inversions previously unknown in the 1000 Genomes Project. PMID:29095815

  13. Modematic: a fast laser beam analyzing system for high power CO2-laser beams

    NASA Astrophysics Data System (ADS)

    Olsen, Flemming O.; Ulrich, Dan

    2003-03-01

    The performance of an industrial laser depends very much on the characteristics of the laser beam. The ISO standards 11146 and 11154, describing test methods for laser beam parameters, have been approved. Implementing these methods in industry is difficult, and especially for infrared laser sources such as the CO2 laser, the available analyzing systems are slow, difficult to apply, and of limited reliability due to the nature of the detection methods. In a EUREKA project, the goal was defined to develop a laser beam analyzing system dedicated to high-power CO2 lasers which could fulfill the demands of an entire analyzing system: automating the time-consuming pre-alignment and beam-conditioning work required before a beam mode analysis, automating the analysis sequences and data analysis required to determine the laser beam caustics, and, last but not least, delivering reliable, close to real-time data to the operator. The results of this project work are described in this paper. The research project has led to the development of the Modematic laser beam analyzer, which is ready for the market.

  14. A new method for teaching physical examination to junior medical students

    PubMed Central

    Sayma, Meelad; Williams, Hywel Rhys

    2016-01-01

    Introduction Teaching effective physical examination is a key component in the education of medical students. Preclinical medical students often have insufficient clinical knowledge to apply to physical examination recall, which may hinder their learning when taught through certain understanding-based models. This pilot project aimed to develop a method to teach physical examination to preclinical medical students using “core clinical cases”, overcoming the need for “rote” learning. Methods This project was developed utilizing three cycles of planning, action, and reflection. Thematic analysis of feedback was used to improve this model, and ensure it met student expectations. Results and discussion A model core clinical case developed in this project is described, with gout as the basis for a “foot and ankle” examination. Key limitations and difficulties encountered on implementation of this pilot are discussed for future users, including the difficulty encountered in “content overload”. Conclusion This approach aims to teach junior medical students physical examination through understanding, using a simulated patient environment. Robust research is now required to demonstrate efficacy and repeatability in the physical examination of other systems. PMID:26937208

  15. Detection of Nitrogen Content in Rubber Leaves Using Near-Infrared (NIR) Spectroscopy with Correlation-Based Successive Projections Algorithm (SPA).

    PubMed

    Tang, Rongnian; Chen, Xupeng; Li, Chuang

    2018-05-01

    Near-infrared spectroscopy is an efficient, low-cost technology that has potential as an accurate method for detecting the nitrogen content of natural rubber leaves. The successive projections algorithm (SPA) is a widely used variable selection method for multivariate calibration, which uses projection operations to select a variable subset with minimum multi-collinearity. However, due to fluctuations in the correlation between variables, high collinearity may still exist among non-adjacent variables of the subset obtained by the basic SPA. Based on an analysis of the correlation matrix of the spectral data, this paper proposes a correlation-based SPA (CB-SPA) that applies the successive projections algorithm in regions with consistent correlation. The results show that CB-SPA can select variable subsets with more valuable variables and less multi-collinearity. Meanwhile, models established with the CB-SPA subset outperform those using basic SPA subsets in predicting nitrogen content in terms of both cross-validation and external prediction. Moreover, CB-SPA is more efficient: the time cost of its selection procedure is one-twelfth that of the basic SPA.
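    For reference, the sketch below implements the basic SPA step that CB-SPA modifies: at each iteration the remaining variables are projected onto the subspace orthogonal to the last selected variable, and the variable with the largest residual norm is chosen. The starting variable and toy data are illustrative.

```python
import numpy as np

def spa_select(X, k, start=0):
    """Basic successive projections algorithm for variable selection.

    X: (n_samples, n_vars) spectra, columns are variables. Each step projects
    the remaining columns onto the orthogonal complement of the previously
    selected ones, which keeps the chosen subset minimally collinear.
    """
    Xp = X.astype(float).copy()
    selected = [start]
    for _ in range(k - 1):
        ref = Xp[:, selected[-1]]
        denom = ref @ ref
        for j in range(Xp.shape[1]):
            Xp[:, j] = Xp[:, j] - (Xp[:, j] @ ref) / denom * ref   # project out the last pick
        norms = np.linalg.norm(Xp, axis=0)
        norms[selected] = -np.inf                                  # exclude variables already chosen
        selected.append(int(np.argmax(norms)))
    return selected

# Toy usage: pick 5 minimally collinear wavelengths out of 200.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 200))
print(spa_select(X, k=5))
```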

  16. Robust method to detect and locate local earthquakes by means of amplitude measurements.

    NASA Astrophysics Data System (ADS)

    del Puy Papí Isaba, María; Brückl, Ewald

    2016-04-01

    In this study we present a robust new method to detect and locate medium- and low-magnitude local earthquakes. This method is based on an empirical model of the ground motion obtained from amplitude data of earthquakes in the area of interest, which were located using traditional methods. The first step of our method is the computation of maximum resultant ground velocities in sliding time windows covering the whole period of interest. In the second step, these maximum resultant ground velocities are back-projected to every point of a grid covering the whole area of interest while applying the empirical amplitude-distance relations. We refer to these back-projected ground velocities as pseudo-magnitudes. The number of operating seismic stations in the local network equals the number of pseudo-magnitudes at each grid point. Our method introduces the new idea of selecting the minimum pseudo-magnitude at each grid point for further analysis instead of searching for a minimum of the L2 or L1 norm. If no detectable earthquake occurred, the spatial distribution of the minimum pseudo-magnitudes constrains the magnitude of weak earthquakes hidden in the ambient noise. In the case of a detectable local earthquake, the spatial distribution of the minimum pseudo-magnitudes shows a significant maximum at the grid point nearest to the actual epicenter. The application of our method is restricted to the area confined by the convex hull of the seismic station network. Additionally, one must ensure that no dead traces are involved in the processing. Compared to methods based on L2 and even L1 norms, our new method is almost wholly insensitive to outliers (data from locally disturbed seismic stations). A further advantage is the fast determination of the epicenter and magnitude of a seismic event located within the seismic network. This is possible because a back-projection matrix, independent of the registered amplitude, is obtained and stored for each seismic station. As a direct consequence, we save computing time in the calculation of the final back-projected maximum resultant amplitude at every grid point. The capability of the method was demonstrated first using synthetic data. Next, the method was applied to data from 43 local earthquakes of low and medium magnitude (1.7 < magnitude < 4.3). These earthquakes were recorded and detected by the seismic network ALPAACT (seismological and geodetic monitoring of Alpine PAnnonian ACtive Tectonics) in the period 2010/06/11 to 2013/09/20. Data provided by the ALPAACT network are used to understand seismic activity in the Mürz Valley - Semmering - Vienna Basin transfer fault system in Austria and what makes it a relatively high earthquake hazard and risk area. The method will substantially support our efforts to involve scholars from polytechnic schools in seismological work within the Sparkling Science project Schools & Quakes.
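    The core idea, back-projecting window maxima through an empirical amplitude-distance relation and keeping the minimum pseudo-magnitude at each grid point, can be sketched as below. The relation and its coefficients are illustrative placeholders, not the empirical model calibrated for the ALPAACT network.

```python
import numpy as np

def min_pseudo_magnitude(peak_vel, sta_xy, grid_xy, a=1.66, b=-1.0):
    """Back-project peak ground velocities to a grid and keep, per grid node,
    the minimum pseudo-magnitude over all stations.

    peak_vel: (n_sta,) maximum resultant velocities in one time window.
    sta_xy, grid_xy: station and grid-node coordinates in km.
    The relation M = log10(v) + a*log10(r) + b is an illustrative placeholder.
    """
    r = np.linalg.norm(grid_xy[:, None, :] - sta_xy[None, :, :], axis=2)   # (n_grid, n_sta)
    pseudo_mag = np.log10(peak_vel)[None, :] + a * np.log10(np.maximum(r, 0.1)) + b
    return pseudo_mag.min(axis=1)   # for a real event this peaks near the epicenter

# Toy usage: 4 stations around a short line of grid nodes.
sta = np.array([[0, 0], [10, 0], [0, 10], [10, 10]], float)
grid = np.stack([np.linspace(-5, 15, 21), np.full(21, 5.0)], axis=1)
print(min_pseudo_magnitude(np.array([1e-4, 2e-4, 1e-4, 3e-4]), sta, grid).round(2))
```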

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weiss, Paul

    Spectroscopic imaging tools and methods, based on scanning tunneling microscopes (STMs), are being developed and applied to examine buried layers and interfaces with ultrahigh resolution. These new methods measure buried contacts, molecule-substrate bonds, buried dipoles in molecular layers, and key structural aspects of adsorbed molecules, such as tilt angles. We are developing the ability to locate lateral projections of molecular parts as a means of determining the structures of molecular layers. We are developing the ability to measure the orientation of buried functionality.

  18. A Workshop on the Integration of Numerical and Symbolic Computing Methods Held in Saratoga Springs, New York on July 9-11, 1990

    DTIC Science & Technology

    1991-04-01

    The workshop promoted dialogue among researchers in symbolic methods and numerical computation, and their applications in certain disciplines of artificial intelligence.

  19. Comparative Analysis of the Volatile Components of Agrimonia eupatoria from Leaves and Roots by Gas Chromatography-Mass Spectrometry and Multivariate Curve Resolution

    PubMed Central

    Feng, Xiao-Liang; He, Yun-biao; Liang, Yi-Zeng; Wang, Yu-Lin; Huang, Lan-Fang; Xie, Jian-Wei

    2013-01-01

    Gas chromatography-mass spectrometry and multivariate curve resolution were applied to the differential analysis of the volatile components in Agrimonia eupatoria specimens from different plant parts. After extraction by the water distillation method, the volatile components of Agrimonia eupatoria leaves and roots were detected by GC-MS. The qualitative and quantitative analysis of the volatile components in the main root of Agrimonia eupatoria was then completed with the help of subwindow factor analysis, which resolves the two-dimensional original data into mass spectra and chromatograms. Of the 87 separated constituents in the total ion chromatogram of the volatile components, 68 were identified and quantified, accounting for about 87.03% of the total content. The common peaks in the leaves were then extracted with the orthogonal projection resolution method. Among the components determined, 52 coexisted in the studied samples, although the relative content of each component differed to some extent. The results showed fair consistency in their GC-MS fingerprints. This was the first application of the orthogonal projection method to comparing different plant parts of Agrimonia eupatoria, and it reduced the burden of the qualitative analysis as well as its subjectivity. The obtained results proved the combined approach powerful for the analysis of complex Agrimonia eupatoria samples. The developed method can be used for further study and quality control of Agrimonia eupatoria. PMID:24286016

  20. Comparative Analysis of the Volatile Components of Agrimonia eupatoria from Leaves and Roots by Gas Chromatography-Mass Spectrometry and Multivariate Curve Resolution.

    PubMed

    Feng, Xiao-Liang; He, Yun-Biao; Liang, Yi-Zeng; Wang, Yu-Lin; Huang, Lan-Fang; Xie, Jian-Wei

    2013-01-01

    Gas chromatography-mass spectrometry and multivariate curve resolution were applied to the differential analysis of the volatile components in Agrimonia eupatoria specimens from different plant parts. After extraction by the water distillation method, the volatile components of Agrimonia eupatoria leaves and roots were detected by GC-MS. The qualitative and quantitative analysis of the volatile components in the main root of Agrimonia eupatoria was then completed with the help of subwindow factor analysis, which resolves the two-dimensional original data into mass spectra and chromatograms. Of the 87 separated constituents in the total ion chromatogram of the volatile components, 68 were identified and quantified, accounting for about 87.03% of the total content. The common peaks in the leaves were then extracted with the orthogonal projection resolution method. Among the components determined, 52 coexisted in the studied samples, although the relative content of each component differed to some extent. The results showed fair consistency in their GC-MS fingerprints. This was the first application of the orthogonal projection method to comparing different plant parts of Agrimonia eupatoria, and it reduced the burden of the qualitative analysis as well as its subjectivity. The obtained results proved the combined approach powerful for the analysis of complex Agrimonia eupatoria samples. The developed method can be used for further study and quality control of Agrimonia eupatoria.

  1. Enterprise resource planning (ERP) implementation using the value engineering methodology and Six Sigma tools

    NASA Astrophysics Data System (ADS)

    Leu, Jun-Der; Lee, Larry Jung-Hsing

    2017-09-01

    Enterprise resource planning (ERP) is a software solution that integrates the operational processes of the business functions of an enterprise. However, implementing ERP systems is a complex process. In addition to the technical issues, companies must address problems associated with business process re-engineering, time and budget control, and organisational change. Numerous industrial studies have shown that the failure rate of ERP implementation is high, even for well-designed systems. Thus, ERP projects typically require a clear methodology to support the project execution and effectiveness. In this study, we propose a theoretical model for ERP implementation. The value engineering (VE) method forms the basis of the proposed framework, which integrates Six Sigma tools. The proposed framework encompasses five phases: knowledge generation, analysis, creation, development and execution. In the VE method, potential ERP problems related to software, hardware, consultation and organisation are analysed in a group-decision manner and in relation to value, and Six Sigma tools are applied to avoid any project defects. We validate the feasibility of the proposed model by applying it to an international manufacturing enterprise in Taiwan. The results show improvements in customer response time and operational efficiency in terms of work-in-process and turnover of materials. Based on the evidence from the case study, the theoretical framework is discussed together with the study's limitations and suggestions for future research.

  2. Speckle noise removal applied to ultrasound image of carotid artery based on total least squares model.

    PubMed

    Yang, Lei; Lu, Jun; Dai, Ming; Ren, Li-Jie; Liu, Wei-Zong; Li, Zhen-Zhou; Gong, Xue-Hao

    2016-10-06

    A speckle noise removal method for ultrasound images based on a total least squares model is proposed and applied to images of cardiovascular structures such as the carotid artery. On the basis of the least squares principle, a total least squares model is established for the speckle noise removal process in cardiac ultrasound images; orthogonal projection transformation is applied to the output of the model, and denoising of the speckle noise in the cardiac ultrasound image is thereby realized. Experimental results show that the improved algorithm can greatly improve the resolution of the image and meet the needs of clinical diagnosis and treatment of the cardiovascular system of the head and neck. Furthermore, the success in imaging carotid arteries has strong implications for neurological complications such as stroke.
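    The abstract does not give the exact model formulation; as background, the sketch below shows the classical total least squares solution via the SVD of the augmented matrix, which is the usual starting point for TLS-based processing.

```python
import numpy as np

def total_least_squares(A, b):
    """Classical TLS solution of A x ~ b when both A and b contain noise.

    Uses the SVD of the augmented matrix [A | b]; the solution is read from the
    right singular vector associated with the smallest singular value.
    """
    n = A.shape[1]
    Z = np.column_stack([A, b])
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    v = Vt[-1]                        # right singular vector of the smallest singular value
    return -v[:n] / v[n]

# Toy usage: recover a known slope/intercept from data noisy on both sides.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 200)
A = np.column_stack([x + 0.01 * rng.normal(size=x.size), np.ones_like(x)])
b = 2.0 * x + 1.0 + 0.01 * rng.normal(size=x.size)
print(total_least_squares(A, b))      # close to [2.0, 1.0]
```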

  3. Applying Multimodel Ensemble from Regional Climate Models for Improving Runoff Projections on Semiarid Regions of Spain

    NASA Astrophysics Data System (ADS)

    Garcia Galiano, S. G.; Olmos, P.; Giraldo Osorio, J. D.

    2015-12-01

    In the Mediterranean area, significant changes in temperature and precipitation are expected throughout the century. These trends could exacerbate existing conditions in regions already vulnerable to climatic variability, reducing water availability. Improving knowledge about the plausible impacts of climate change on water cycle processes at the basin scale is an important step in building adaptive capacity in this region, where severe water shortages are expected in the coming decades. A Regional Climate Model (RCM) ensemble, in combination with distributed hydrological models with few parameters, constitutes a valid and robust methodology for increasing the reliability of climate and hydrological projections. To reach this objective, a novel methodology for building RCM ensembles of meteorological variables (rainfall and temperatures) was applied. RCM ensembles are justified because they increase the reliability of climate and hydrological projections. The evaluation of RCM goodness-of-fit for building the ensemble is based on empirical probability density functions (PDFs) extracted from both the RCM datasets and a high-resolution gridded observational dataset for the period 1961-1990. The applied method considers the seasonal and annual variability of rainfall and temperatures. The RCM ensembles constitute the input to a distributed hydrological model at the basin scale for assessing the runoff projections. The selected hydrological model has few parameters in order to reduce the uncertainties involved. The study basin is a headwater basin of the Segura River Basin, located in southeastern Spain. The impacts on runoff and its trend, from both the observational dataset and the climate projections, were assessed. Relative to the control period 1961-1990, plausible significant decreases in runoff for the period 2021-2050 were identified.

  4. Sequential projection pursuit for optimised vibration-based damage detection in an experimental wind turbine blade

    NASA Astrophysics Data System (ADS)

    Hoell, Simon; Omenzetter, Piotr

    2018-02-01

    To advance the concept of smart structures in large systems, such as wind turbines (WTs), it is desirable to be able to detect structural damage early while using minimal instrumentation. Data-driven vibration-based damage detection methods can be competitive in that respect because global vibrational responses encompass the entire structure. Multivariate damage sensitive features (DSFs) extracted from acceleration responses enable the detection of changes in a structure via statistical methods. However, even though such DSFs contain information about the structural state, they may not be optimised for the damage detection task. This paper addresses this shortcoming by exploring a DSF projection technique specialised for statistical structural damage detection. High-dimensional initial DSFs are projected onto a low-dimensional space for improved damage detection performance and a simultaneous reduction of the computational burden. The technique is based on sequential projection pursuit, where the projection vectors are optimised one by one using an advanced evolutionary strategy. The approach is applied to laboratory experiments with a small-scale WT blade under wind-like excitations. Autocorrelation function coefficients calculated from acceleration signals are employed as DSFs. The optimal numbers of projection vectors are identified with the help of a fast forward selection procedure. To benchmark the proposed method, selections of original DSFs as well as principal component analysis scores from these features are additionally investigated. The optimised DSFs are tested for damage detection on previously unseen data from the healthy state and a wide range of damage scenarios. It is demonstrated that using selected subsets of the initial and transformed DSFs improves damage detectability compared to the full set of features. Furthermore, superior results can be achieved by projecting the autocorrelation coefficients onto just a single optimised projection vector.

  5. 23 CFR 636.104 - Does this part apply to all Federal-aid design-build projects?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 23 Highways 1 2011-04-01 2011-04-01 false Does this part apply to all Federal-aid design-build projects? 636.104 Section 636.104 Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION ENGINEERING AND TRAFFIC OPERATIONS DESIGN-BUILD CONTRACTING General § 636.104 Does this part apply to all Federal-aid design-build projects? The...

  6. Optimization of energy window and evaluation of scatter compensation methods in MPS using the ideal observer with model mismatch

    NASA Astrophysics Data System (ADS)

    Ghaly, Michael; Links, Jonathan M.; Frey, Eric

    2015-03-01

    In this work, we used the ideal observer (IO) and IO with model mismatch (IO-MM) applied in the projection domain and an anthropomorphic Channelized Hotelling Observer (CHO) applied to reconstructed images to optimize the acquisition energy window width and evaluate various scatter compensation methods in the context of a myocardial perfusion SPECT defect detection task. The IO has perfect knowledge of the image formation process and thus reflects performance with perfect compensation for image-degrading factors. Thus, using the IO to optimize imaging systems could lead to suboptimal parameters compared to those optimized for humans interpreting SPECT images reconstructed with imperfect or no compensation. The IO-MM allows incorporating imperfect system models into the IO optimization process. We found that with near-perfect scatter compensation, the optimal energy window for the IO and CHO were similar; in its absence the IO-MM gave a better prediction of the optimal energy window for the CHO using different scatter compensation methods. These data suggest that the IO-MM may be useful for projection-domain optimization when model mismatch is significant, and that the IO is useful when followed by reconstruction with good models of the image formation process.

  7. Applying a machine learning model using a locally preserving projection based feature regeneration algorithm to predict breast cancer risk

    NASA Astrophysics Data System (ADS)

    Heidari, Morteza; Zargari Khuzani, Abolfazl; Danala, Gopichandh; Mirniaharikandehei, Seyedehnafiseh; Qian, Wei; Zheng, Bin

    2018-03-01

    Both conventional and deep machine learning have been used to develop decision-support tools applied in medical imaging informatics. In order to take advantage of both conventional and deep learning approaches, this study investigates the feasibility of applying a locality preserving projection (LPP) based feature regeneration algorithm to build a new machine learning classifier model to predict short-term breast cancer risk. First, a computer-aided image processing scheme was used to segment and quantify breast fibro-glandular tissue volume. Next, 44 initially computed image features related to bilateral mammographic tissue density asymmetry were extracted. Then, an LPP-based feature combination method was applied to regenerate a new operational feature vector using a maximal variance approach. Last, a k-nearest neighbor (KNN) based machine learning classifier using the LPP-generated feature vectors was developed to predict breast cancer risk. A testing dataset involving negative mammograms acquired from 500 women was used. Among them, 250 were positive and 250 remained negative in the next subsequent mammography screening. Applied to this dataset, the LPP-generated feature vector reduced the number of features from 44 to 4. Using a leave-one-case-out validation method, the area under the ROC curve produced by the KNN classifier significantly increased from 0.62 to 0.68 (p < 0.05) and the odds ratio was 4.60 with a 95% confidence interval of [3.16, 6.70]. The study demonstrated that this new LPP-based feature regeneration approach produces an optimal feature vector and yields improved performance in predicting the risk of women having breast cancer detected in the next subsequent mammography screening.
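    A minimal sketch of a standard locality preserving projection (heat-kernel affinity, graph Laplacian, generalized eigenproblem) is given below; the maximal-variance feature regeneration step described in the paper is not reproduced, and all parameter values are illustrative.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

def lpp(X, n_components=4, k=10, sigma=1.0):
    """Standard locality preserving projection.

    Builds a k-NN heat-kernel affinity W, the graph Laplacian L = D - W, and
    solves X^T L X a = lambda X^T D X a for the smallest eigenvalues.
    Returns a projection matrix of shape (n_features, n_components).
    """
    d2 = cdist(X, X, "sqeuclidean")
    W = np.exp(-d2 / (2 * sigma ** 2))
    idx = np.argsort(d2, axis=1)[:, 1:k + 1]          # k nearest neighbours (excluding self)
    mask = np.zeros_like(W, dtype=bool)
    rows = np.repeat(np.arange(X.shape[0]), k)
    mask[rows, idx.ravel()] = True
    W = np.where(mask | mask.T, W, 0.0)               # symmetrised k-NN graph
    np.fill_diagonal(W, 0.0)
    D = np.diag(W.sum(axis=1))
    L = D - W
    A = X.T @ L @ X
    B = X.T @ D @ X + 1e-6 * np.eye(X.shape[1])       # small ridge for numerical stability
    vals, vecs = eigh(A, B)                           # generalized symmetric eigenproblem
    return vecs[:, :n_components]                     # directions with the smallest eigenvalues

# Toy usage: reduce 44 image features to 4 before a KNN classifier.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 44))
Z = X @ lpp(X, n_components=4)
print(Z.shape)                                        # (200, 4)
```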

  8. Effort to Accelerate MBSE Adoption and Usage at JSC

    NASA Technical Reports Server (NTRS)

    Wang, Lui; Izygon, Michel; Okron, Shira; Garner, Larry; Wagner, Howard

    2016-01-01

    This paper describes the authors' experience in adopting Model Based System Engineering (MBSE) at the NASA/Johnson Space Center (JSC). Since 2009, NASA/JSC has been applying MBSE using the Systems Modeling Language (SysML) to a number of advanced projects. Models integrate views of the system from multiple perspectives, capturing the system design information for multiple stakeholders. This method has allowed engineers to better control changes, improve traceability from requirements to design, and manage the numerous interactions between components. As a project progresses, the models become the official source of information and are used by multiple stakeholders. Three major types of challenges that hamper the adoption of MBSE technology are described. These challenges are addressed by a multipronged approach that includes educating the main stakeholders, implementing an organizational infrastructure that supports the adoption effort, defining a set of modeling guidelines to help engineers in their modeling effort, providing a toolset that supports the generation of valuable products, and providing a library of reusable models. JSC project case studies are presented to illustrate how the proposed approach has been successfully applied.

  9. Project-based teaching in health informatics: a course on health care quality improvement.

    PubMed

    Moehr, J R; Berenji, G R; Green, C J; Kagolovsky, Y

    2001-01-01

    Teaching the skills and knowledge required in health informatics [1] is a challenge because the skill of applying knowledge in real life requires practice. We relate our experience introducing a practice component into a course on "Health Care Quality Improvement". Working health care professionals were invited to bring an actual quality problem from their place of work and to work alongside students in running the problem through a quality improvement project lifecycle. Multiple technological and process-oriented teaching innovations were employed, including project sessions in observation rooms, video recording of these sessions, generation of demonstration examples, and distance education components. Both students and their collaborators from the workplace developed proficiency in applying quality improvement methods as well as in experiencing the realities of group processes, information gaps and organizational constraints. The principles used to achieve high involvement of the whole class, as well as the resources and technical support employed, are described. The resulting academic and practical achievements are discussed in relation to alternative instructional modalities, with respect to didactic implications for similar endeavors, and beyond to other fields such as systems engineering.

  10. A Comparison of Earthquake Back-Projection Imaging Methods for Dense Local Arrays, and Application to the 2011 Virginia Aftershock Sequence

    NASA Astrophysics Data System (ADS)

    Beskardes, G. D.; Hole, J. A.; Wang, K.; Wu, Q.; Chapman, M. C.; Davenport, K. K.; Michaelides, M.; Brown, L. D.; Quiros, D. A.

    2016-12-01

    Back-projection imaging has recently become a practical method for local earthquake detection and location due to the deployment of densely sampled, continuously recorded, local seismograph arrays. Back-projection is scalable to earthquakes with a wide range of magnitudes, from very tiny to very large. Local dense arrays provide the opportunity to capture very tiny events for a range of applications, such as tectonic microseismicity, source scaling studies, wastewater injection-induced seismicity, hydraulic fracturing, CO2 injection monitoring, volcano studies, and mining safety. While back-projection sometimes utilizes the full seismic waveform, the waveforms are often pre-processed to overcome imaging issues. We compare the performance of back-projection using four previously used data pre-processing methods: full waveform, envelope, short-term average / long-term average (STA/LTA), and kurtosis. The goal is to identify an optimized strategy for an entirely automated imaging process that is robust in the presence of real-data issues, has the lowest signal-to-noise thresholds for detection and for location, has the best spatial resolution of the energy imaged at the source, preserves magnitude information, and considers computational cost. Real-data issues include aliased station spacing, low signal-to-noise ratio (to <1), large noise bursts, and spatially varying waveform polarity. For evaluation, the four imaging methods were applied to the aftershock sequence of the 2011 Virginia earthquake as recorded by the AIDA array with 200-400 m station spacing. These data include earthquake magnitudes from -2 to 3 with highly variable signal-to-noise ratios, spatially aliased noise, and large noise bursts: realistic issues in many environments. Each of the four back-projection methods has advantages and disadvantages, and a combined multi-pass method achieves the best of all criteria. Preliminary imaging results from the 2011 Virginia dataset will be presented.
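    As a hedged illustration of one of the pre-processing options compared above, the sketch below computes an STA/LTA characteristic function for a single trace; the window lengths, the simple (non-recursive) averaging, and the stacking comment are assumptions for the sketch, not choices made in the study.

      import numpy as np

      def sta_lta(trace, fs, sta_win=0.5, lta_win=5.0, eps=1e-10):
          """Simple STA/LTA characteristic function for back-projection pre-processing.

          trace : 1-D array of waveform samples
          fs    : sampling rate in Hz
          Returns an array of the same length; values well above 1 flag impulsive energy.
          """
          x2 = trace.astype(float) ** 2
          n_sta, n_lta = int(sta_win * fs), int(lta_win * fs)
          sta = np.convolve(x2, np.ones(n_sta) / n_sta, mode="same")
          lta = np.convolve(x2, np.ones(n_lta) / n_lta, mode="same")
          return sta / (lta + eps)

      # A back-projection stack would then, for each candidate source location, shift each
      # station's characteristic function by the predicted travel time and sum; peaks in
      # the stack indicate detections.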

  11. Integrating Theory and Practice: Applying the Quality Improvement Paradigm to Product Line Engineering

    NASA Technical Reports Server (NTRS)

    Stark, Michael; Hennessy, Joseph F. (Technical Monitor)

    2002-01-01

    My assertion is that not only are product lines a relevant research topic, but that the tools used by empirical software engineering researchers can address observed practical problems. Our experience at NASA has been that there are often externally proposed solutions available, but that we have had difficulties applying them in our particular context. We have also focused on return-on-investment issues when evaluating product lines, and while these are important, one cannot attain objective data on success or failure until several applications from a product family have been deployed. The use of the Quality Improvement Paradigm (QIP) can address these issues: (1) planning an adoption path from an organization's current state to a product line approach; (2) constructing a development process to fit the organization's adoption path; (3) evaluating product line development processes as the project is being developed. The QIP consists of the following six steps: (1) Characterize the project and its environment; (2) Set quantifiable goals for successful project performance; (3) Choose the appropriate process models, supporting methods, and tools for the project; (4) Execute the process, analyze interim results, and provide real-time feedback for corrective action; (5) Analyze the results of completed projects and recommend improvements; and (6) Package the lessons learned as updated and refined process models. A figure shows the QIP in detail. The iterative nature of the QIP supports an incremental development approach to product lines, and the project learning and feedback provide the necessary early evaluations.

  12. Applications of Earth Observations for Fisheries Management: An analysis of socioeconomic benefits

    NASA Astrophysics Data System (ADS)

    Friedl, L.; Kiefer, D. A.; Turner, W.

    2013-12-01

    This paper will discuss the socioeconomic impacts of a project applying Earth observations and models to support management and conservation of tuna and other marine resources in the eastern Pacific Ocean. A project team created a software package that produces statistical analyses and dynamic maps of habitat for pelagic ocean biota. The tool integrates sea surface temperature and chlorophyll imagery from MODIS, ocean circulation models, and other data products. The project worked with the Inter-American Tropical Tuna Commission, which issues fishery management information, such as stock assessments, for the eastern Pacific region. The Commission uses the tool and broader habitat information to produce better estimates of stock and thus improve their ability to identify species that could be at risk of overfishing. The socioeconomic analysis quantified the relative value that Earth observations contributed to accurate stock size assessments through improvements in calculating population size. The analysis team calculated the first-order economic costs of a fishery collapse (or shutdown), and they calculated the benefits of improved estimates that reduce the uncertainty of stock size and thus reduce the risk of fishery collapse. The team estimated that the project reduced the probability of collapse of different fisheries, and the analysis generated net present values of risk mitigation. USC led the project with sponsorship from the NASA Earth Science Division's Applied Sciences Program, which conducted the socioeconomic impact analysis. The paper will discuss the project and focus primarily on the analytic methods, impact metrics, and the results of the socioeconomic benefits analysis.

  13. ECUT (Energy Conversion and Utilization Technologies) program: Biocatalysis project

    NASA Technical Reports Server (NTRS)

    Baresi, Larry

    1989-01-01

    The Annual Report presents the fiscal year (FY) 1988 research activities and accomplishments for the Biocatalysis Project of the U.S. Department of Energy, Energy Conversion and Utilization Technologies (ECUT) Division. The ECUT Biocatalysis Project is managed by the Jet Propulsion Laboratory, California Institute of Technology. The Biocatalysis Project is a mission-oriented, applied research and exploratory development activity directed toward resolution of the major generic technical barriers that impede the development of biologically catalyzed commercial chemical production. The approach toward achieving project objectives involves an integrated participation of universities, industrial companies and government research laboratories. The Project's technical activities were organized into three work elements: (1) The Molecular Modeling and Applied Genetics work element includes research on modeling of biological systems, developing rigorous methods for the prediction of three-dimensional (tertiary) protein structure from the amino acid sequence (primary structure) for designing new biocatalysts, defining kinetic models of biocatalyst reactivity, and developing genetically engineered solutions to the generic technical barriers that preclude widespread application of biocatalysis. (2) The Bioprocess Engineering work element supports efforts in novel bioreactor concepts that are likely to lead to substantially higher levels of reactor productivity, product yields and lower separation energetics. Results of work within this work element will be used to establish the technical feasibility of critical bioprocess monitoring and control subsystems. (3) The Bioprocess Design and Assessment work element attempts to develop procedures (via user-friendly computer software) for assessing the energy-economics of biocatalyzed chemical production processes, and to initiate technology transfer for advanced bioprocesses.

  14. ECUT (Energy Conversion and Utilization Technologies) program: Biocatalysis project

    NASA Astrophysics Data System (ADS)

    Baresi, Larry

    1989-03-01

    The Annual Report presents the fiscal year (FY) 1988 research activities and accomplishments for the Biocatalysis Project of the U.S. Department of Energy, Energy Conversion and Utilization Technologies (ECUT) Division. The ECUT Biocatalysis Project is managed by the Jet Propulsion Laboratory, California Institute of Technology. The Biocatalysis Project is a mission-oriented, applied research and exploratory development activity directed toward resolution of the major generic technical barriers that impede the development of biologically catalyzed commercial chemical production. The approach toward achieving project objectives involves an integrated participation of universities, industrial companies and government research laboratories. The Project's technical activities were organized into three work elements: (1) The Molecular Modeling and Applied Genetics work element includes research on modeling of biological systems, developing rigorous methods for the prediction of three-dimensional (tertiary) protein structure from the amino acid sequence (primary structure) for designing new biocatalysts, defining kinetic models of biocatalyst reactivity, and developing genetically engineered solutions to the generic technical barriers that preclude widespread application of biocatalysis. (2) The Bioprocess Engineering work element supports efforts in novel bioreactor concepts that are likely to lead to substantially higher levels of reactor productivity, product yields and lower separation energetics. Results of work within this work element will be used to establish the technical feasibility of critical bioprocess monitoring and control subsystems. (3) The Bioprocess Design and Assessment work element attempts to develop procedures (via user-friendly computer software) for assessing the energy-economics of biocatalyzed chemical production processes, and to initiate technology transfer for advanced bioprocesses.

  15. An initial investigation into methods of computing transonic aerodynamic sensitivity coefficients

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1994-01-01

    The primary accomplishments of the project are as follows: (1) Using the transonic small perturbation equation as a flowfield model, the project demonstrated that the quasi-analytical method could be used to obtain aerodynamic sensitivity coefficients for airfoils at subsonic, transonic, and supersonic conditions for design variables such as Mach number, airfoil thickness, maximum camber, angle of attack, and location of maximum camber. It was established that the quasi-analytical approach was an accurate method for obtaining aerodynamic sensitivity derivatives for airfoils at transonic conditions and usually more efficient than the finite difference approach. (2) The usage of symbolic manipulation software to determine the appropriate expressions and computer coding associated with the quasi-analytical method for sensitivity derivatives was investigated. Using the three dimensional fully conservative full potential flowfield model, it was determined that symbolic manipulation along with a chain rule approach was extremely useful in developing a combined flowfield and quasi-analytical sensitivity derivative code capable of considering a large number of realistic design variables. (3) Using the three dimensional fully conservative full potential flowfield model, the quasi-analytical method was applied to swept wings (i.e. three dimensional) at transonic flow conditions. (4) The incremental iterative technique has been applied to the three dimensional transonic nonlinear small perturbation flowfield formulation, an equivalent plate deflection model, and the associated aerodynamic and structural discipline sensitivity equations; and coupled aeroelastic results for an aspect ratio three wing in transonic flow have been obtained.
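    As a hedged, generic illustration of the quasi-analytical idea summarized above, the sketch below differentiates an implicitly defined solution through the converged residual Jacobian instead of re-solving at perturbed inputs; the scalar toy residual is an assumption for the sketch and is not the transonic small perturbation or full potential formulation used in the project.

      # Toy implicit "flow" model: R(q, alpha) = 0 defines the state q(alpha).
      def residual(q, alpha):
          return q ** 3 + q - alpha

      def solve_state(alpha, q0=0.0, tol=1e-12):
          q = q0
          for _ in range(100):                      # Newton iteration on R(q, alpha) = 0
              dRdq = 3 * q ** 2 + 1
              step = residual(q, alpha) / dRdq
              q -= step
              if abs(step) < tol:
                  break
          return q

      def sensitivity_quasi_analytical(q):
          # Differentiating R(q(alpha), alpha) = 0 gives dq/dalpha = -(dR/dq)^-1 * dR/dalpha
          dRdq = 3 * q ** 2 + 1
          dRdalpha = -1.0
          return -dRdalpha / dRdq

      def sensitivity_finite_difference(alpha, h=1e-6):
          return (solve_state(alpha + h) - solve_state(alpha - h)) / (2 * h)

      alpha = 2.0
      q = solve_state(alpha)
      print(sensitivity_quasi_analytical(q), sensitivity_finite_difference(alpha))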

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Looney, Brian; Kaback, Dawn S.; LeBoeuf, Eugene L.

    Beginning in 2006, the US Department of Energy (DOE) supported nine applied research projects to improve the protection of the Columbia River and mitigate the impacts of Hanford Site groundwater. These projects were funded through a supplemental Congressional budget allocation, and are now in various stages of completion in accordance with the research plans. The DOE Office of Environmental Management Groundwater and Soil Cleanup Technologies (EM-22) sponsored a technical peer review meeting for these projects in Richland, WA, July 28-31, 2008. The overall objective of the peer review is to provide information to support DOE decisions about the status and potential future application of the various technologies. The charge for the peer review panel was to develop recommendations for each of the nine 'technologies'. Team members for the July 2008 review were Brian Looney, Gene LeBoeuf, Dawn Kaback, Karen Skubal, Joe Rossabi, Paul Deutsch, and David Cocke. Previous project reviews were held in May 2007 and March-May of 2006. The team used the following four rating categories for projects: (a) Incorporate the technology/strategy in ongoing and future EM activities; (b) Finish existing scope of applied research and determine potential for EM activities when research program is finished; (c) Discontinue current development activities and do not incorporate technology/strategy into ongoing and future EM activities unless a significant and compelling change in potential viability is documented; and (d) Supplement original funded work to obtain the data needed to support a DOE decision to incorporate the technology into ongoing and future EM activities. The supplemental funding portfolio included two projects that addressed strontium, five projects that addressed chromium, one project that addressed uranium and one project that addressed carbon tetrachloride. The projects ranged from in situ treatment methods for immobilizing contaminants using chemical-based methods such as phosphate addition, to innovative surface treatment technologies such as electrocoagulation. Total funding for the nine projects was $9,900,000 in fiscal year (FY) 2006 and $2,000,000 in FY 2007. At the Richland meeting, the peer reviewers provided a generally neutral assessment of the projects and overall progress, and a generally positive assessment with regard to the principal investigators meeting their stated research objectives and performing the planned laboratory research and limited field work. Only one project, the Electrocoagulation Treatability Test, received a rating of 'discontinue' from the team because the project goals had not been met. Because this particular project has already ended, no action with respect to funding withdrawal is necessary. All other projects were recommended to be finished and/or incorporated into field efforts at Hanford. Specific technical comments and recommendations were provided by the team for each project.

  17. Optimal resolution in maximum entropy image reconstruction from projections with multigrid acceleration

    NASA Technical Reports Server (NTRS)

    Limber, Mark A.; Manteuffel, Thomas A.; Mccormick, Stephen F.; Sholl, David S.

    1993-01-01

    We consider the problem of image reconstruction from a finite number of projections over the space L¹(Ω), where Ω is a compact subset of ℝ². We prove that, given a discretization of the projection space, the function that generates the correct projection data and maximizes the Boltzmann-Shannon entropy is piecewise constant on a certain discretization of Ω, which we call the 'optimal grid'. It is on this grid that one obtains the maximum resolution given the problem setup. The size of this grid grows very quickly as the number of projections and number of cells per projection grow, indicating fast computational methods are essential to make its use feasible. We use a Fenchel duality formulation of the problem to keep the number of variables small while still using the optimal discretization, and propose a multilevel scheme to improve convergence of a simple cyclic maximization scheme applied to the dual problem.

  18. The study on stage financing model of IT project investment.

    PubMed

    Chen, Si-hua; Xu, Sheng-hua; Lee, Changhoon; Xiong, Neal N; He, Wei

    2014-01-01

    Stage financing is the basic operation of venture capital investment. In investment, venture capitalists usually use different strategies to obtain the maximum returns. Due to its advantages in reducing information asymmetry and agency costs, stage financing is widely used by venture capitalists. Although considerable attention has been devoted to stage financing, very little is known about the risk aversion strategies of IT projects. This paper mainly addresses the problem of risk aversion in venture capital investment in IT projects. Based on an analysis of the characteristics of venture capital investment in IT projects, this paper introduces a real option pricing model to measure the value brought by the stage financing strategy and designs a risk aversion model for IT projects. Because the real option pricing method regards investment activity as a contingent decision, it helps in judging the management flexibility of IT projects and thus in making a more reasonable evaluation of IT programs. Lastly, application to a real case further illustrates the effectiveness and feasibility of the model.
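    As a hedged illustration of the real-option view of staged investment described above, the sketch below values the option to proceed with a later funding stage using a Black-Scholes-style formula, treating the expected project payoff as the underlying asset and the next-stage investment as the strike; the parameter names and numbers are illustrative assumptions, not the authors' model.

      from math import exp, log, sqrt
      from statistics import NormalDist

      def stage_option_value(V, I, T, r, sigma):
          """Black-Scholes-style value of the option to invest I at time T.

          V     : present value of the expected payoff if the stage is funded
          I     : investment required at the next stage (the strike)
          T     : time until the stage decision, in years
          r     : risk-free rate
          sigma : volatility of the project value
          """
          N = NormalDist().cdf
          d1 = (log(V / I) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
          d2 = d1 - sigma * sqrt(T)
          return V * N(d1) - I * exp(-r * T) * N(d2)

      # Example: the flexibility of staging carries value beyond the naive NPV of V - I
      print(stage_option_value(V=10.0, I=8.0, T=1.5, r=0.03, sigma=0.45))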

  19. The Study on Stage Financing Model of IT Project Investment

    PubMed Central

    Xu, Sheng-hua; Xiong, Neal N.

    2014-01-01

    Stage financing is the basic operation of venture capital investment. In investment, venture capitalists usually use different strategies to obtain the maximum returns. Due to its advantages in reducing information asymmetry and agency costs, stage financing is widely used by venture capitalists. Although considerable attention has been devoted to stage financing, very little is known about the risk aversion strategies of IT projects. This paper mainly addresses the problem of risk aversion in venture capital investment in IT projects. Based on an analysis of the characteristics of venture capital investment in IT projects, this paper introduces a real option pricing model to measure the value brought by the stage financing strategy and designs a risk aversion model for IT projects. Because the real option pricing method regards investment activity as a contingent decision, it helps in judging the management flexibility of IT projects and thus in making a more reasonable evaluation of IT programs. Lastly, application to a real case further illustrates the effectiveness and feasibility of the model. PMID:25147845

  20. A Different Approach to the Scientific Research Methods Course: Effects of a Small-Scale Research Project on Pre-Service Teachers

    ERIC Educational Resources Information Center

    Bastürk, Savas

    2017-01-01

    Selecting and applying appropriate research techniques, analysing data using information and communication technologies, transferring the obtained results of the analysis into tables and interpreting them are the performance indicators evaluated by the Ministry of National Education under teacher competencies. At the beginning of the courses that…

  1. Solution of the Wang Chang-Uhlenbeck equation for molecular hydrogen

    NASA Astrophysics Data System (ADS)

    Anikin, Yu. A.

    2017-06-01

    Molecular hydrogen is modeled by numerically solving the Wang Chang-Uhlenbeck equation. The differential scattering cross sections of molecules are calculated using the quantum mechanical scattering theory of rigid rotors. The collision integral is computed by applying a fully conservative projection method. Numerical results for relaxation, heat conduction, and a one-dimensional shock wave are presented.

  2. A Qualitative Study Comparing the Instruction on Vectors between a Physics Course and a Trigonometry Course

    ERIC Educational Resources Information Center

    James, Wendy Michelle

    2013-01-01

    Science and engineering instructors often observe that students have difficulty using or applying prerequisite mathematics knowledge in their courses. This qualitative project uses a case-study method to investigate the instruction in a trigonometry course and a physics course based on a different methodology and set of assumptions about student…

  3. E-Learning: A Means to Increase Learner Involvement in Research

    ERIC Educational Resources Information Center

    de Beer, Marie; Mason, Roger B.

    2014-01-01

    This paper investigates a method for increasing the involvement of marketing fourth year learners in academic research, by encouraging greater participation in, and commitment to, their research project in the Applied Marketing IV subject. It is assumed that greater involvement will result in a greater pass rate. The main reasons for this lack of…

  4. Money Management and the Consumer, Credit: "Ch . a . r . ge!".

    ERIC Educational Resources Information Center

    Florida State Dept. of Education, Tallahassee. Div. of Elementary and Secondary Education.

    This unit, one of a series of six Project SCAT (Skills for Consumer Applied Today) units, is designed to help senior high school students develop consumer education skills. For a description of the specific objectives and format of the units, see SO 013 467. This document provides teaching methods, learning activities, and student booklet for a…

  5. Applying Laser Cutting Techniques through Horology for Teaching Effective STEM in Design and Technology

    ERIC Educational Resources Information Center

    Jones, Lewis C. R.; Tyrer, John R.; Zanker, Nigel P.

    2013-01-01

    This paper explores the pedagogy underpinning the use of laser manufacturing methods for the teaching of science, technology, engineering and mathematics (STEM) at key stage 3 design and technology. Clock making (horology) has been a popular project in design and technology (D&T) found in many schools, typically it focuses on aesthetical…

  6. Interdisciplinary Program for Quantitative Flaw Definition.

    DTIC Science & Technology

    1978-01-01

    Table of contents excerpt: … Ceramics; Unit C, Task 4 - Microfocus X-Ray and Image Enhancement of Radiographic Data; Unit C, Task 5 - Conventional Ultrasonic Inspection Methods Applied to Ceramics. Overview, Project I - Quantitative… Unit C was initiated in October of 1977 following encouraging nondestructive defect detectability studies in structural ceramics, using…

  7. Sustaining Comprehensive Physical Activity Practice in Elementary School: A Case Study Applying Mixed Methods

    ERIC Educational Resources Information Center

    Tjomsland, Hege Eikland

    2010-01-01

    This study examines an elementary school which during enrollment in the European Network of Health Promoting Schools, 1993-2003, and the Norwegian Physical Activity and Healthy Meals Project, 2004-2006, selected physical activity (PA) as a prioritized area. Survey data, school documents, and focus group data were collected and analyzed through a…

  8. The Analysis of the Impact of Individual Weighting Factor on Individual Scores

    ERIC Educational Resources Information Center

    Kilic, Gulsen Bagci; Cakan, Mehtap

    2006-01-01

    In this study, category-based self and peer assessment were applied twice in a semester in an Elementary Science Teaching Methods course in order to assess individual contributions of group members to group projects as well as to analyze the impact of Individual Weighting Factors (IWF) on individual scores and individual grades. IWF were…

  9. "Learning to Work" in Small Businesses: Learning and Training for Young Adults with Learning Disabilities

    ERIC Educational Resources Information Center

    Ruggeri-Stevens, Geoff; Goodwin, Susan

    2007-01-01

    Purpose: The paper alerts small business employers to new dictates of the Disability Discrimination Act (2005) as it applies to learning disabilities. Then the "Learning to Work" project featured in the paper offers small business employers a set of approaches and methods for the identification of a learning-disabled young adult…

  10. The Use of Gap Analysis to Increase Student Completion Rates at Travelor Adult School

    ERIC Educational Resources Information Center

    Gil, Blanca Estela

    2013-01-01

    This project applied the gap analysis problem-solving framework (Clark & Estes, 2008) in order to help develop strategies to increase completion rates at Travelor Adult School. The purpose of the study was to identify whether the knowledge, motivation and organization barriers were contributing to the identified gap. A mixed method approached…

  11. Accomplishing PETE Learning Standards and Program Accreditation through Teacher Candidates' Technology-Based Service Learning Projects

    ERIC Educational Resources Information Center

    Gibbone, Anne; Mercier, Kevin

    2014-01-01

    Teacher candidates' use of technology is a component of physical education teacher education (PETE) program learning goals and accreditation standards. The methods presented in this article can help teacher candidates to learn about and apply technology as an instructional tool prior to and during field or clinical experiences. The goal in…

  12. Reconceptualizing Teacher Education Programs: Applying Dewey's Theories to Service-Learning with Early Childhood Preservice Teachers

    ERIC Educational Resources Information Center

    Lake, Vickie E.; Winterbottom, Christian; Ethridge, Elizabeth A.; Kelly, Loreen

    2015-01-01

    Dewey's concept of enabling children to explore based on their own interests has evolved into investigations and projects using methods of exploration, experimentation, and discovery--three tenets of service-learning. Using mixed methodology, the authors examined the implementation of service-learning in a teacher education program. A total of 155…

  13. Engaging communities and climate change futures with Multi-Scale, Iterative Scenario Building (MISB) in the western United States

    Treesearch

    Daniel Murphy; Carina Wyborn; Laurie Yung; Daniel R. Williams; Cory Cleveland; Lisa Eby; Solomon Dobrowski; Erin Towler

    2016-01-01

    Current projections of future climate change foretell potentially transformative ecological changes that threaten communities globally. Using two case studies from the United States Intermountain West, this article highlights the ways in which a better articulation between theory and methods in research design can generate proactive applied tools that enable...

  14. A Realistic Experimental Design and Statistical Analysis Project

    ERIC Educational Resources Information Center

    Muske, Kenneth R.; Myers, John A.

    2007-01-01

    A realistic applied chemical engineering experimental design and statistical analysis project is documented in this article. This project has been implemented as part of the professional development and applied statistics courses at Villanova University over the past five years. The novel aspects of this project are that the students are given a…

  15. Community Action Projects: Applying Biotechnology in the Real World

    ERIC Educational Resources Information Center

    Nguyen, Phuong D.; Siegel, Marcelle A.

    2015-01-01

    Project-based learning and action research are powerful pedagogies in improving science education. We implemented a semester-long course using project-based action research to help students apply biotechnology knowledge learned in the classroom to the real world. Students had several choices to make in the project: working individually or as a…

  16. Designing Project-Based Courses with a Focus on Group Formation and Assessment

    ERIC Educational Resources Information Center

    Richards, Debbie

    2009-01-01

    The value and the pitfalls of project and group work are well recognized. The principles and elements which apply to projects in general, apply to project-based courses. Thoughtful and detailed planning, understanding of the stakeholders and their needs, a good design, appropriate testing, monitoring and quality control and continual management…

  17. The European space debris safety and mitigation standard

    NASA Astrophysics Data System (ADS)

    Alby, F.; Alwes, D.; Anselmo, L.; Baccini, H.; Bonnal, C.; Crowther, R.; Flury, W.; Jehn, R.; Klinkrad, H.; Portelli, C.; Tremayne-Smith, R.

    2001-10-01

    A standard has been proposed as one of the series of ECSS Standards intended to be applied together for the management, engineering and product assurance in space projects and applications. The requirements in the Standard are defined in terms of what must be accomplished, rather than in terms of how to organise and perform the necessary work. This allows existing organisational structures and methods within agencies and industry to be applied where they are effective, and for such structures and methods to evolve as necessary, without the need for rewriting the standards. The Standard comprises management requirements, design requirements and operational requirements. The standard was prepared by the European Debris Mitigation Standard Working Group (EDMSWG) involving members from ASI, BNSC, CNES, DLR and ESA.

  18. Autonomous Kinematic Calibration of the Robot Manipulator with a Linear Laser-Vision Sensor

    NASA Astrophysics Data System (ADS)

    Kang, Hee-Jun; Jeong, Jeong-Woo; Shin, Sung-Weon; Suh, Young-Soo; Ro, Young-Schick

    This paper presents a new autonomous kinematic calibration technique using a laser-vision sensor called "Perceptron TriCam Contour". Because the sensor measures by capturing the image of a projected laser line on the surface of the object, we set up a long, straight line of very fine string inside the robot workspace, and then allow the sensor mounted on the robot to measure the intersection point of the string and the projected laser line. The data collected by changing the robot configuration and measuring the intersection points are constrained to lie on a single straight line, so that the closed-loop calibration method can be applied. The resulting calibration method is simple and accurate, and is also suitable for on-site calibration in an industrial environment. The method is implemented on a Hyundai VORG-35 robot to demonstrate its effectiveness.

  19. Distance majorization and its applications

    PubMed Central

    Chi, Eric C.; Zhou, Hua; Lange, Kenneth

    2014-01-01

    The problem of minimizing a continuously differentiable convex function over an intersection of closed convex sets is ubiquitous in applied mathematics. It is particularly interesting when it is easy to project onto each separate set, but nontrivial to project onto their intersection. Algorithms based on Newton’s method such as the interior point method are viable for small to medium-scale problems. However, modern applications in statistics, engineering, and machine learning are posing problems with potentially tens of thousands of parameters or more. We revisit this convex programming problem and propose an algorithm that scales well with dimensionality. Our proposal is an instance of a sequential unconstrained minimization technique and revolves around three ideas: the majorization-minimization principle, the classical penalty method for constrained optimization, and quasi-Newton acceleration of fixed-point algorithms. The performance of our distance majorization algorithms is illustrated in several applications. PMID:25392563
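    As a hedged sketch of the distance-majorization idea summarized above, each iteration below replaces the squared distance to each constraint set by the squared distance to the current projection point, which gives a closed-form update when the objective is quadratic; the choice of objective, the two example sets, and the penalty schedule are illustrative assumptions, not the authors' implementation.

      import numpy as np

      def project_ball(x, center, radius):
          d = x - center
          n = np.linalg.norm(d)
          return x if n <= radius else center + radius * d / n

      def project_halfspace(x, a, b):
          # Projection onto the halfspace {x : a.x <= b}
          viol = a @ x - b
          return x if viol <= 0 else x - viol * a / (a @ a)

      def project_onto_intersection(y, projections, rho=1.0, growth=1.5, iters=200):
          """Minimize 0.5*||x - y||^2 over the intersection of convex sets via
          majorization-minimization on the penalized objective
          0.5*||x - y||^2 + (rho/2) * sum_i dist(x, C_i)^2."""
          x = y.astype(float).copy()
          for _ in range(iters):
              # Majorization: dist(x, C_i)^2 <= ||x - P_i(x_k)||^2 with equality at x_k
              anchors = [P(x) for P in projections]
              # Minimization of the quadratic surrogate has a closed form
              x = (y + rho * sum(anchors)) / (1.0 + rho * len(anchors))
              rho *= growth          # classical penalty method: progressively tighten constraints
          return x

      # Hypothetical example: project a point onto (unit ball) ∩ (halfspace x + y <= 0.5)
      projs = [lambda x: project_ball(x, np.zeros(2), 1.0),
               lambda x: project_halfspace(x, np.array([1.0, 1.0]), 0.5)]
      print(project_onto_intersection(np.array([2.0, 2.0]), projs))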

  20. Fringe image processing based on structured light series

    NASA Astrophysics Data System (ADS)

    Gai, Shaoyan; Da, Feipeng; Li, Hongyan

    2009-11-01

    Code analysis of the fringe image plays a vital role in data acquisition for structured light systems, affecting the precision, computational speed and reliability of the measurement process. Based on the self-normalizing characteristic, a fringe image processing method using structured light is proposed. In this method, a series of projected patterns is used when detecting the fringe order of the image pixels. The structured light system geometry is presented: it consists of a white-light projector and a digital camera, where the former projects sinusoidal fringe patterns onto the object and the latter acquires the fringe patterns deformed by the object's shape. Binary images with distinct white and black stripes can then be obtained, and the ability to resist image noise is improved greatly. The proposed method can be implemented easily and applied to profile measurement based on a special binary code over a wide field.

  1. Nursing Research, CER, PICO and PCORI.

    PubMed

    Patel, Darpan I

    2018-01-01

    Community and public health nurse researchers are a unique cohort of nurse researchers who have the skills and capacity to lead projects and programs of science centered on improving patient outcomes through methods of comparative effectiveness research (CER). CER, as a general method, has been taught to all nurses in the form of the PICO question to improve evidence-based practices. As the climate for funding becomes more and more competitive, nurse researchers are primed to lead the change in improving patient outcomes through patient-centered outcomes research (PCOR). However, the number of projects funded by agencies like the Patient-Centered Outcomes Research Institute falls well below the capabilities of the field. The purpose of this commentary is to promote the field of PCOR and encourage novice and experienced nurse researchers to apply for funding from the PCORI by introducing different methods for building capacity and promoting engagement in the national conversations on PCOR and CER.

  2. 20 CFR 638.600 - Applied vocational skills training (VST) through work projects.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 20 Employees' Benefits 3 2011-04-01 2011-04-01 false Applied vocational skills training (VST... Skills Training (VST) § 638.600 Applied vocational skills training (VST) through work projects. (a)(1) The Job Corps Director shall establish procedures for administering applied vocational skills training...

  3. 20 CFR 638.600 - Applied vocational skills training (VST) through work projects.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Applied vocational skills training (VST... Skills Training (VST) § 638.600 Applied vocational skills training (VST) through work projects. (a)(1) The Job Corps Director shall establish procedures for administering applied vocational skills training...

  4. Environmental Impact Assessment of the Industrial Estate Development Plan with the Geographical Information System and Matrix Methods

    PubMed Central

    Ghasemian, Mohammad; Poursafa, Parinaz; Amin, Mohammad Mehdi; Ziarati, Mohammad; Ghoddousi, Hamid; Momeni, Seyyed Alireza; Rezaei, Amir Hossein

    2012-01-01

    Background. The purpose of this study is the environmental impact assessment of an industrial estate development plan. Methods. This cross-sectional study was conducted in 2010 in Isfahan province, Iran. GIS and matrix methods were applied. Data analysis was done to identify the current situation of the region, zone vulnerable areas, and scope the region. Quantitative evaluation was done using the matrix of Wooten and Rau. Results. The net score for the impact of industrial unit operation on air quality of the project area was (−3). Given the transport of industrial estate pollutants, residential areas located within a radius of 2500 meters of the city were expected to be affected most. The net score for the impact of construction of industrial units on plant species of the project area was (−2). Environmental protected areas were not affected by the air and soil pollutants because of their distance from the industrial estate. Conclusion. The positive effects of project activities outweigh the drawbacks, and the sum of scores allocated to the project activities on environmental factors was (+37). Overall, the project does not have detrimental effects on the environment or the residential neighborhood. EIA should be considered as an anticipatory, participatory environmental management tool before determining a plan application. PMID:22272210

  5. Influence of climate change on the flowering of temperate fruit trees

    NASA Astrophysics Data System (ADS)

    Perez-Lopez, D.; Ruiz-Ramos, M.; Sánchez-Sánchez, E.; Centeno, A.; Prieto-Egido, I.; Lopez-de-la-Franca, N.

    2012-04-01

    It is well known that winter chilling is necessary for the flowering of temperate trees. The chilling requirement is a criterion for choosing a species or variety at a given location. Chemical products can also be used to reduce the chilling-hour requirement, but they make production more expensive. This study first analysed the observed values of chilling hours for some representative agricultural locations in Spain over the last three decades and their projected changes under climate change scenarios. Chilling is usually measured and calculated as chilling-hours, and different methods have been used to calculate them (e.g. Richardson et al., 1974, among others) according to the species considered. For our objective, the North Carolina method (Shaltout and Unrath, 1983) was applied for apples, the Utah method (Richardson et al., 1974) for peach and grapevine, and the approach used by De Melo-Abreu et al. (2004) for olive trees. The influence of climate change on temperate trees was studied by calculating projections of chilling-hours with climate data from Regional Climate Models (RCMs) at high resolution (25 km) from the European Project ENSEMBLES (http://www.ensembles-eu.org/). These projections allow the modelled variations of chill-hours between the second half of the 20th century and the first half of the 21st century to be analysed at the study locations.
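    As a rough sketch of how chill units are accumulated from hourly temperatures under one of the models named above, the function below approximates the Utah model; the temperature breakpoints are rounded approximations and should be checked against the published tables (Richardson et al., 1974) before any real use, and hourly temperature input is an assumption of the sketch.

      def utah_chill_units(hourly_temps_c):
          """Approximate Utah-model chill units from hourly temperatures (deg C).

          The breakpoints below are rounded approximations of the published model;
          consult the original tables for the exact values.
          """
          total = 0.0
          for t in hourly_temps_c:
              if t <= 1.4:
                  total += 0.0      # too cold to contribute
              elif t <= 2.4:
                  total += 0.5
              elif t <= 9.1:
                  total += 1.0      # most effective chilling range
              elif t <= 12.4:
                  total += 0.5
              elif t <= 15.9:
                  total += 0.0
              elif t <= 18.0:
                  total -= 0.5      # warm hours negate accumulated chilling
              else:
                  total -= 1.0
          return total

      # Hypothetical usage with one winter of hourly RCM output for a site:
      # chill = utah_chill_units(hourly_temperature_series)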

  6. Compressive sensing method for recognizing cat-eye effect targets.

    PubMed

    Li, Li; Li, Hui; Dang, Ersheng; Liu, Bo

    2013-10-01

    This paper proposes a cat-eye effect target recognition method based on compressive sensing (CS) and presents a recognition scheme (sample processing before reconstruction based on compressed sensing, or SPCS) for image processing. In this method, linear projections of the original image sequences are applied to remove dynamic background distractions and extract cat-eye effect targets. Furthermore, the corresponding imaging mechanism for acquiring active and passive image sequences is put forward. This method uses fewer images to recognize cat-eye effect targets, reduces data storage, and translates traditional target identification, based on original image processing, into the processing of measurement vectors. The experimental results show that the SPCS method is feasible and superior to the shape-frequency dual criteria method.

  7. There's an app for that shirt! Evaluation of augmented reality tracking methods on deformable surfaces for fashion design

    NASA Astrophysics Data System (ADS)

    Ruzanka, Silvia; Chang, Ben; Behar, Katherine

    2013-03-01

    In this paper we present appARel, a creative research project at the intersection of augmented reality, fashion, and performance art. appARel is a mobile augmented reality application that transforms otherwise ordinary garments with 3D animations and modifications. With appARel, entire fashion collections can be uploaded in a smartphone application, and "new looks" can be downloaded in a software update. The project will culminate in a performance art fashion show, scheduled for March 2013. appARel includes textile designs incorporating fiducial markers, garment designs that integrate multiple markers with the human body, and iOS and Android apps that apply different augments, or "looks", to a garment. We discuss our philosophy for combining computer-generated and physical objects, and share the challenges we encountered in applying fiducial markers to the 3D curvatures of the human body.

  8. [Hospital management in Brazil: a review of the literature with a view toenhance administrative practices in hospitals].

    PubMed

    Farias, Diego Carlos; Araujo, Fernando Oliveira de

    2017-06-01

    Hospitals are complex organizations which, in addition to the technical assistance expected in the context of treatment and prevention of health hazards, also require good management practices aimed at improving efficiency in their core business. However, in administrative terms, recurrent conflicts arise involving technical and managerial areas. This article therefore sets out to conduct a review of the scientific literature pertaining to the themes of hospital management and of projects applied in the hospital context. In terms of methodology, the study adopts the webiblioming method of collection and systematic analysis of knowledge in indexed journal databases. The results show a greater interest on the part of researchers in a more vertically and horizontally dialogical administration, better definition of work processes, innovative technological tools to support the management process and, finally, the possibility of applying project management methodologies in collaboration with hospital management.

  9. The requirements and feasibility of business planning in the office of space and terrestrial applications

    NASA Technical Reports Server (NTRS)

    Greenberg, J. S.; Miller, B. P.

    1979-01-01

    The feasibility of applying strategic business planning techniques which are developed and used in the private sector to the planning of certain projects within the NASA Office of Space and Terrestrial Applications was assessed. The methods of strategic business planning that are currently in use in the private sector are examined. The typical contents of a private sector strategic business plan and the techniques commonly used to develop the contents of the plan are described, along with modifications needed to apply these concepts to public sector projects. The current long-range planning process in the Office of Space and Terrestrial Applications is reviewed and program initiatives that might be candidates for the use of strategic business planning techniques are identified. In order to more fully illustrate the information requirements of a strategic business plan for a NASA program, a sample business plan is prepared for a hypothetical Operational Earth Resources Satellite program.

  10. Why and how to include anthropological perspective into multidisciplinary research in the Polish health system.

    PubMed

    Witeska-Młynarczyk, Anna D

    2012-01-01

    The article focuses on ways in which anthropological knowledge, incorporated into multidisciplinary and multilevel research projects, can be applied for understanding health- and illness-related behaviours and functioning of the health system in Poland. It selectively presents potential theoretical and methodological contributions of the anthropological discipline to the field of applied health research, and briefly reviews selected ethnographic theories and methods for researching and interpreting socio-cultural conditioning of healing, health and illness related practices. The review focuses on the following approaches: Critical Medical Anthropology, Cultural Interpretive Theory, phenomenology, narrative analysis, and the biography of pharmaceuticals. The author highlights the need for team work and use of a holistic perspective while analyzing the health system in Poland, and underlines the need for serious attention and financial support to be given to multidisciplinary research projects of which anthropology is a part.

  11. a 3d GIS Method Applied to Cataloging and Restoring: the Case of Aurelian Walls at Rome

    NASA Astrophysics Data System (ADS)

    Canciani, M.; Ceniccola, V.; Messi, M.; Saccone, M.; Zampilli, M.

    2013-07-01

    The project involves architecture, archaeology, restoration, graphic documentation and computer imaging. The objective is development of a method for documentation of an architectural feature, based on a three-dimensional model obtained through laser scanning technologies, linked to a database developed in GIS environment. The case study concerns a short section of Rome's Aurelian walls, including the Porta Latina. The city walls are Rome's largest single architectural monument, subject to continuous deterioration, modification and maintenance since their original construction beginning in 271 AD. The documentation system provides a flexible, precise and easily-applied instrument for recording the full appearance, materials, stratification palimpsest and conservation status, in order to identify restoration criteria and intervention priorities, and to monitor and control the use and conservation of the walls over time. The project began with an analysis and documentation campaign integrating direct, traditional recording methods with indirect, topographic instrument and 3D laser scanning recording. These recording systems permitted development of a geographic information system based on three-dimensional modelling of separate, individual elements, linked to a database and related to the various stratigraphic horizons, the construction techniques, the component materials and their state of degradation. The investigations of the extant wall fabric were further compared to historic documentation, from both graphic and descriptive sources. The resulting model constitutes the core of the GIS system for this specific monument. The methodology is notable for its low cost, precision, practicality and thoroughness, and can be applied to the entire Aurelian wall and to other monuments.

  12. An Analysis of the Costs, Benefits, and Implications of Different Approaches to Capturing the Value of Renewable Energy Tax Incentives

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolinger, Mark

    This report compares the relative costs, benefits, and implications of capturing the value of renewable energy tax benefits in these three different ways – applying them against outside income, carrying them forward in time until they can be fully absorbed internally, or monetizing them through third-party tax equity investors – to see which method is most competitive under various scenarios. It finds that under current law and late-2013 market conditions, monetization makes sense for all but the most tax-efficient project sponsors. In other words, for most project sponsors, bringing in third-party tax equity currently provides net benefits to a project.

  13. Lay and professional stakeholder involvement in scoping palliative care issues: Methods used in seven European countries.

    PubMed

    Brereton, Louise; Ingleton, Christine; Gardiner, Clare; Goyder, Elizabeth; Mozygemba, Kati; Lysdahl, Kristin Bakke; Tummers, Marcia; Sacchini, Dario; Leppert, Wojciech; Blaževičienė, Aurelija; van der Wilt, Gert Jan; Refolo, Pietro; De Nicola, Martina; Chilcott, James; Oortwijn, Wija

    2017-02-01

    Stakeholders are people with an interest in a topic. Internationally, stakeholder involvement in palliative care research and health technology assessment requires development. Stakeholder involvement adds value throughout research (from prioritising topics to disseminating findings). Philosophies and understandings about the best ways to involve stakeholders in research differ internationally. Stakeholder involvement took place in seven countries (England, Germany, Italy, Lithuania, the Netherlands, Norway and Poland). Findings informed a project that developed concepts and methods for health technology assessment and applied these to evaluate models of palliative care service delivery. To report on stakeholder involvement in the INTEGRATE-HTA project and how issues identified informed project development. Using stakeholder consultation or a qualitative research design, as appropriate locally, stakeholders in seven countries acted as 'advisors' to aid researchers' decision making. Thematic analysis was used to identify key issues across countries. A total of 132 stakeholders (82 professionals and 50 'lay' people) aged ⩾18 participated in individual face-to-face or telephone interviews, consultation meetings or focus groups. Different stakeholder involvement methods were used successfully to identify key issues in palliative care. A total of 23 issues common to three or more countries informed decisions about the intervention and comparator of interest, sub questions and specific assessments within the health technology assessment. Stakeholders, including patients and families undergoing palliative care, can inform project decision making using various involvement methods according to the local context. Researchers should consider local understandings about stakeholder involvement as views of appropriate and feasible methods vary. Methods for stakeholder involvement, especially consultation, need further development.

  14. Bézier B̄ projection

    NASA Astrophysics Data System (ADS)

    Miao, Di; Borden, Michael J.; Scott, Michael A.; Thomas, Derek C.

    2018-06-01

    In this paper we demonstrate the use of Bézier projection to alleviate locking phenomena in structural mechanics applications of isogeometric analysis. Interpreting the well-known B̄ projection in two different ways, we develop two formulations for locking problems in beams and nearly incompressible elastic solids. One formulation leads to a sparse symmetric system and the other leads to a sparse non-symmetric system. To demonstrate the utility of Bézier projection for both geometric and material locking phenomena we focus on transverse shear locking in Timoshenko beams and volumetric locking in nearly incompressible linear elasticity, although the approach can be applied to other types of locking phenomena as well. Bézier projection is a local projection technique with optimal approximation properties, which in many cases produces solutions comparable to global L² projection. In the context of B̄ methods, the use of Bézier projection produces sparse stiffness matrices with only a slight increase in bandwidth when compared to standard displacement-based methods. Of particular importance is that the approach is applicable to any spline representation that can be written in Bézier form, such as NURBS, T-splines, LR-splines, etc. We discuss in detail how to integrate this approach into an existing finite element framework with minimal disruption through the use of Bézier extraction operators and a newly introduced dual basis for the Bézier projection operator. We then demonstrate the behavior of the two proposed formulations through several challenging benchmark problems.

  15. Projection correlation based view interpolation for cone beam CT: primary fluence restoration in scatter measurement with a moving beam stop array.

    PubMed

    Yan, Hao; Mou, Xuanqin; Tang, Shaojie; Xu, Qiong; Zankl, Maria

    2010-11-07

    Scatter correction is an open problem in x-ray cone beam (CB) CT. The measurement of scatter intensity with a moving beam stop array (BSA) is a promising technique that offers a low patient dose and accurate scatter measurement. However, when restoring the blocked primary fluence behind the BSA, spatial interpolation cannot restore the high-frequency part well, causing streaks in the reconstructed image. To address this problem, we deduce a projection correlation (PC) to utilize the redundancy (over-determined information) in neighbouring CB views. PC indicates that the main high-frequency information is contained in neighbouring angular projections, rather than in the current projection itself, which provides a guiding principle for high-frequency information restoration. On this basis, we present the projection correlation based view interpolation (PC-VI) algorithm and validate that it outperforms spatial interpolation alone. The PC-VI based moving BSA method is developed. In this method, PC-VI is employed instead of spatial interpolation, and new moving modes are designed, which greatly improve the performance of the moving BSA method in terms of reliability and practicability. Evaluation is made on a high-resolution voxel-based human phantom, realistically including the entire procedure of scatter measurement with a moving BSA, which is simulated by analytical ray-tracing plus Monte Carlo simulation with EGSnrc. With the proposed method, we obtain visually artefact-free images approaching the ideal correction. Compared with the spatial interpolation based method, the relative mean square error is reduced by a factor of 6.05-15.94 for different slices. PC-VI does well in CB redundancy mining; therefore, it has further potential in CBCT studies.

  16. Signal separation by nonlinear projections: The fetal electrocardiogram

    NASA Astrophysics Data System (ADS)

    Schreiber, Thomas; Kaplan, Daniel T.

    1996-05-01

    We apply a locally linear projection technique which has been developed for noise reduction in deterministically chaotic signals to extract the fetal component from scalar maternal electrocardiographic (ECG) recordings. Although we do not expect the maternal ECG to be deterministic chaotic, typical signals are effectively confined to a lower-dimensional manifold when embedded in delay space. The method is capable of extracting fetal heart rate even when the fetal component and the noise are of comparable amplitude. If the noise is small, more details of the fetal ECG, like P and T waves, can be recovered.
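    As a heavily hedged sketch of the locally linear projection idea summarized above, the function below delay-embeds a scalar recording, projects each delay vector onto a local principal subspace, and averages the corrected coordinates back into a time series; the embedding dimension, neighbourhood size, and number of retained directions are arbitrary choices for the sketch, not those of the paper, and in the paper's application the maternal component is what the projection captures while the fetal contribution appears in the part that is removed.

      import numpy as np

      def local_projection_filter(x, dim=7, n_neighbors=30, keep=2):
          """One pass of locally linear projection noise reduction on a scalar signal x."""
          N = len(x) - dim + 1
          emb = np.stack([x[i:i + N] for i in range(dim)], axis=1)  # delay vectors, shape (N, dim)
          corrected = np.zeros_like(emb, dtype=float)
          for i in range(N):
              d = np.linalg.norm(emb - emb[i], axis=1)
              nbr = np.argsort(d)[:n_neighbors]                     # nearest delay vectors
              mean = emb[nbr].mean(axis=0)
              local = emb[nbr] - mean
              _, _, vt = np.linalg.svd(local, full_matrices=False)  # local principal directions
              basis = vt[:keep]                                     # retained low-dimensional subspace
              corrected[i] = mean + basis.T @ (basis @ (emb[i] - mean))
          # Average the corrected embedding coordinates back onto the time series
          y = np.zeros(len(x))
          norm = np.zeros(len(x))
          for j in range(dim):
              y[j:j + N] += corrected[:, j]
              norm[j:j + N] += 1
          return y / np.maximum(norm, 1)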

  17. Development of a model for predicting NASA/MSFC program success

    NASA Technical Reports Server (NTRS)

    Riggs, Jeffrey; Miller, Tracy; Finley, Rosemary

    1990-01-01

    Research conducted during the execution of a previous contract (NAS8-36955/0039) firmly established the feasibility of developing a tool to aid decision makers in predicting the potential success of proposed projects. The final report from that investigation contains an outline of the method to be applied in developing this Project Success Predictor Model. As a follow-on to the previous study, this report describes in detail the development of this model and includes full explanation of the data-gathering techniques used to poll expert opinion. The report includes the presentation of the model code itself.

  18. Applied Anthropology in Broadcasting

    ERIC Educational Resources Information Center

    Eiselein, E. B.

    1976-01-01

    Three different applied media anthropology projects are described. These projects stem from the broadcasters' legal need to know about the community (community ascertainment), the broadcasters' need to know about the station audience (audience profile), and the broadcasters' desire to change a community (action projects). (Author)

  19. Welding studs detection based on line structured light

    NASA Astrophysics Data System (ADS)

    Geng, Lei; Wang, Jia; Wang, Wen; Xiao, Zhitao

    2018-01-01

    The quality of welding studs is critical for the installation and localization of car components during automobile general assembly. A welding stud detection method based on line structured light is proposed. First, an adaptive threshold is designed to compute binary images. Then, the light stripes of the image are extracted after skeleton line extraction and morphological filtering, and the direction vector of the main light stripe is calculated using the length of the light stripe. Finally, gray projections along the orientation of the main light stripe and along the perpendicular orientation are computed to obtain gray projection curves, which are used to detect the studs. Experimental results demonstrate that the error rate of the proposed method is lower than 0.1%, making it suitable for automobile manufacturing.
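
    A minimal sketch of the gray-projection step, assuming the stripe extraction and direction estimation have already been done elsewhere: rotate the image so the main light stripe is horizontal, then sum gray values along and across the stripe. Peaks in the across-stripe curve are candidate stud locations. Function names and parameters are illustrative.

```python
import numpy as np
from scipy.ndimage import rotate

def gray_projection_curves(img, stripe_angle_deg):
    """Return gray-projection curves along and across the main light stripe."""
    aligned = rotate(img.astype(float), -stripe_angle_deg, reshape=True, order=1)
    along = aligned.sum(axis=1)    # projection along the stripe direction
    across = aligned.sum(axis=0)   # projection perpendicular to the stripe
    return along, across
```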

  20. Automatic three-dimensional tracking of particles with high-numerical-aperture digital lensless holographic microscopy.

    PubMed

    Restrepo, John F; Garcia-Sucerquia, Jorge

    2012-02-15

    We present an automatic procedure for 3D tracking of micrometer-sized particles with high-NA digital lensless holographic microscopy. The method uses a two-feature approach to search for the best focal planes and to distinguish particles from artifacts or other elements in the reconstructed stream of holograms. A set of reconstructed images is axially projected onto a single image, and the centers of mass of all reconstructed elements are identified from the projected image. Starting from the centers of mass, the morphology of the maximum-intensity profile along the reconstruction direction allows particles to be distinguished from other elements. The method is tested with modeled holograms and applied to automatically track micrometer-sized bubbles in a 4 mm³ sample of soda.
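
    A minimal sketch of the projection-and-centres step (the projection operator, threshold, and focus metric are assumptions, not details from the paper): collapse the reconstructed stack along the optical axis, segment candidate elements, and take their in-plane centres of mass as seeds for the subsequent axial focus search.

```python
import numpy as np
from scipy import ndimage

def candidate_centres(stack, thresh_rel=0.5):
    """stack: (n_z, n_y, n_x) reconstructed intensity volume."""
    proj = stack.max(axis=0)                              # axial (maximum) projection
    mask = proj > thresh_rel * proj.max()                 # segment bright candidates
    labels, n = ndimage.label(mask)
    return ndimage.center_of_mass(proj, labels, range(1, n + 1))
```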

  1. The Data-to-Action Framework: A Rapid Program Improvement Process.

    PubMed

    Zakocs, Ronda; Hill, Jessica A; Brown, Pamela; Wheaton, Jocelyn; Freire, Kimberley E

    2015-08-01

    Although health education programs may benefit from quality improvement methods, scant resources exist to help practitioners apply these methods for program improvement. The purpose of this article is to describe the Data-to-Action framework, a process that guides practitioners through rapid-feedback cycles in order to generate actionable data to improve implementation of ongoing programs. The framework was designed while implementing DELTA PREP, a 3-year project aimed at building the primary prevention capacities of statewide domestic violence coalitions. The authors describe the framework's main steps and provide a case example of a rapid-feedback cycle and several examples of rapid-feedback memos produced during the project period. The authors also discuss implications for health education evaluation and practice. © 2015 Society for Public Health Education.

  2. FBP and BPF reconstruction methods for circular X-ray tomography with off-center detector.

    PubMed

    Schäfer, Dirk; Grass, Michael; van de Haar, Peter

    2011-07-01

    Circular scanning with an off-center planar detector is an acquisition scheme that saves detector area while keeping a large field of view (FOV). Several filtered back-projection (FBP) algorithms have been proposed earlier. The purpose of this work is to present two newly developed back-projection filtration (BPF) variants and to evaluate the image quality of these methods compared to the existing state-of-the-art FBP methods. The first new BPF algorithm applies redundancy weighting of overlapping opposite projections before differentiation in a single projection. The second one uses the Katsevich-type differentiation involving two neighboring projections followed by redundancy weighting and back-projection. An averaging scheme is presented to mitigate streak artifacts inherent to circular BPF algorithms along the Hilbert filter lines in the off-center transaxial slices of the reconstructions. Image quality is assessed visually on reconstructed slices of simulated and clinical data. Quantitative evaluation studies are performed with the Forbild head phantom by calculating root-mean-squared deviations (RMSDs) from the voxelized phantom for different detector overlap settings and by investigating the noise-resolution trade-off with a wire phantom in the full-detector and off-center scenarios. The noise-resolution behavior of all off-center reconstruction methods corresponds to their full-detector performance, with the best resolution for the FDK-based methods with the given imaging geometry. With respect to RMSD and visual inspection, the proposed BPF with Katsevich-type differentiation outperforms all other methods for the smallest chosen detector overlap of about 15 mm. The best FBP method is the algorithm that is also based on the Katsevich-type differentiation and subsequent redundancy weighting. For a wider overlap of about 40-50 mm, these two algorithms produce similar results, outperforming the other three methods. The clinical case with a detector overlap of about 17 mm confirms these results. The BPF-type reconstructions with Katsevich differentiation are widely independent of the size of the detector overlap and give the best results with respect to RMSD and visual inspection for minimal detector overlap. The increased homogeneity will improve correct assessment of lesions in the entire field of view.
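
    For orientation only, the sketch below shows one commonly used smooth redundancy weight for an off-center detector: rays measured twice by opposite projections in the overlap band are feathered so that each opposing pair sums to one, avoiding streaks at the overlap edges. This is an assumed sine-squared profile, not necessarily the weighting used in the paper.

```python
import numpy as np

def redundancy_weight(u, u_edge, overlap):
    """u: detector coordinate (mm); u_edge: inner detector edge; overlap: band width (mm)."""
    w = np.ones_like(u, dtype=float)
    t = (u - u_edge) / overlap
    band = (t >= 0) & (t <= 1)
    w[band] = np.sin(0.5 * np.pi * t[band]) ** 2   # rises smoothly from 0 to 1 over the band
    w[t < 0] = 0.0                                 # fully redundant rays outside the FOV edge
    return w
```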

  3. A new method for teaching physical examination to junior medical students.

    PubMed

    Sayma, Meelad; Williams, Hywel Rhys

    2016-01-01

    Teaching effective physical examination is a key component in the education of medical students. Preclinical medical students often have insufficient clinical knowledge to apply to physical examination recall, which may hinder their learning when taught through certain understanding-based models. This pilot project aimed to develop a method to teach physical examination to preclinical medical students using "core clinical cases", overcoming the need for "rote" learning. This project was developed utilizing three cycles of planning, action, and reflection. Thematic analysis of feedback was used to improve this model, and ensure it met student expectations. A model core clinical case developed in this project is described, with gout as the basis for a "foot and ankle" examination. Key limitations and difficulties encountered on implementation of this pilot are discussed for future users, including the difficulty encountered in "content overload". This approach aims to teach junior medical students physical examination through understanding, using a simulated patient environment. Robust research is now required to demonstrate efficacy and repeatability in the physical examination of other systems.

  4. Early Detection of Risk Taking in Groups and Individuals

    DTIC Science & Technology

    2016-03-25

    The project applies such methods to Twitter messages from a range of social events, some where disruption is present and others where it is not. The data come from real-time social media in the form of Twitter messages created during differing social events.

  5. Simulation of scenario earthquake influenced field by using GIS

    USGS Publications Warehouse

    Zuo, H.-Q.; Xie, L.-L.; Borcherdt, R.D.

    1999-01-01

    The method for estimating site effects on ground motion specified by Borcherdt (1994a, 1994b) is briefly introduced in the paper. This method, together with detailed geological and site classification data for the San Francisco Bay area of California, United States, is applied to simulate the influenced field of a scenario earthquake using GIS technology, and software for the simulation has been developed. The paper is a partial result of a cooperative research project between the China Seismological Bureau and the US Geological Survey.

  6. Applied Innovative Technologies for Characterization of Nitrocellulose and Nitroglycerin Contaminated Buildings and Soils

    DTIC Science & Technology

    2008-07-01

    Analyses of the NG test group samples are summarized in Table 4-8 along with results for the lab reference method, STL (SW-846) Method 8330. (Cost & Performance Report, Project ER-0130.)

  7. Public Interest Waiver of American Iron and Steel Requirements to the Winston-Salem and Forsyth County City/County Utilities Commission for 30 Inch TR Flex Pipe Fittings

    EPA Pesticide Factsheets

    Product specific project waivers apply only to the specified product and proposed project referenced in the waiver. Any other project that wishes to use a similar product must apply for a separate waiver based on specific project circumstances.

  8. The application of S-transformation and M-2DPCA in I.C. Engine fault diagnosis

    NASA Astrophysics Data System (ADS)

    Zhang, Shixiong; Cai, Yanping; Mu, Weijie

    2017-04-01

    The problems of parameter selection and feature extraction in traditional vibration-based diagnosis of internal combustion engines are discussed. A method based on the S-transform and Module Two-Dimensional Principal Component Analysis (M-2DPCA) is proposed for fault diagnosis of the I.C. engine valve mechanism. First, cylinder-surface vibration signals are transformed into images through the S-transform. Second, the vibration spectrum images are partitioned into sub-images, and M-2DPCA extracts optimized projection vectors from the general distribution matrix G computed over all sample sub-images. Finally, the feature matrices obtained by projecting the images serve as the inputs to a nearest neighbor classifier, which assigns the fault type. Applied to vibration signals from a valve mechanism operating in eight modes, the method achieves a recognition rate of 94.17 percent, demonstrating its effectiveness.
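
    A minimal sketch of the 2DPCA projection step (the module partitioning, S-transform, and classifier are omitted; names are illustrative): build the image scatter matrix G from the training spectrum images, keep its leading eigenvectors as projection axes, and project each image onto them to obtain a compact feature matrix.

```python
import numpy as np

def twodpca_axes(images, n_axes=5):
    """images: array of shape (n_samples, h, w); returns (w, n_axes) projection axes."""
    mean = images.mean(axis=0)
    G = sum((im - mean).T @ (im - mean) for im in images) / len(images)  # image scatter matrix
    vals, vecs = np.linalg.eigh(G)
    order = np.argsort(vals)[::-1][:n_axes]
    return vecs[:, order]                      # columns are the optimized projection vectors

def project(image, axes):
    return image @ axes                        # feature matrix of shape (h, n_axes)
```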

  9. Standards for documenting and monitoring bird reintroduction projects

    USGS Publications Warehouse

    Sutherland, W.J.; Armstrong, D.; Butchart, S.H.M.; Earnhardt, J.M.; Ewen, J.; Jamieson, I.; Jones, C.G.; Lee, R.; Newbery, P.; Nichols, J.D.; Parker, K.A.; Sarrazin, F.; Seddon, P.J.; Shah, N.; Tatayah, V.

    2010-01-01

    It would be much easier to assess the effectiveness of different reintroduction methods, and so improve the success of reintroductions, if there were greater standardization in documentation of the methods and outcomes. We suggest a series of standards for documenting and monitoring the methods and outcomes associated with reintroduction projects for birds. Key suggestions are: documenting the planned release before it occurs, specifying the information required on each release, postrelease monitoring occurring at standard intervals of 1 and 5 years (and 10 for long-lived species), carrying out a population estimate unless impractical, distinguishing restocked and existing individuals when supplementing populations, and documenting the results. We suggest these principles would apply, largely unchanged, to other vertebrate classes. Similar methods could be adopted for invertebrates and plants with appropriate modification. We suggest that organizations publicly state whether they will adopt these approaches when undertaking reintroductions. Similar standardization would be beneficial for a wide range of topics in environmental monitoring, ecological studies, and practical conservation. ©2010 Wiley Periodicals, Inc.

  10. Strategy on energy saving reconstruction of distribution networks based on life cycle cost

    NASA Astrophysics Data System (ADS)

    Chen, Xiaofei; Qiu, Zejing; Xu, Zhaoyang; Xiao, Chupeng

    2017-08-01

    Because funds for actual distribution network reconstruction projects are often limited, a cost-benefit model and a decision-making method are crucial for energy-saving reconstruction of distribution networks. From the perspective of life cycle cost (LCC), the research life cycle is first determined for the energy-saving reconstruction of distribution networks with multiple devices. Then, a new life cycle cost-benefit model for energy-saving reconstruction of distribution networks is developed, in which the modification schemes include distribution transformer replacement, line replacement, and reactive power compensation. For the operation loss cost and maintenance cost, an operation cost model that considers the influence of seasonal load characteristics and a segmented maintenance cost model for transformers are proposed. Finally, a decision-making method that maximizes the energy-saving profit per unit LCC is developed while also considering financial and technical constraints. The model and method are applied to a real distribution network reconstruction, and the results show that they are effective.

  11. Implementing Kernel Methods Incrementally by Incremental Nonlinear Projection Trick.

    PubMed

    Kwak, Nojun

    2016-05-20

    Recently, the nonlinear projection trick (NPT) was introduced, enabling direct computation of the coordinates of samples in a reproducing kernel Hilbert space. With NPT, any machine learning algorithm can be extended to a kernel version without relying on the so-called kernel trick. However, NPT is inherently difficult to implement incrementally because an ever-growing kernel matrix must be handled as additional training samples are introduced. In this paper, an incremental version of the NPT (INPT) is proposed based on the observation that the centerization step in NPT is unnecessary. Because the proposed INPT does not change the coordinates of the old data, the coordinates obtained by INPT can directly be used in any incremental method to implement its kernel version. The effectiveness of the INPT is shown by applying it to implement incremental versions of kernel methods such as kernel singular value decomposition, kernel principal component analysis, and kernel discriminant analysis, which are utilized for problems of kernel matrix reconstruction, letter classification, and face image retrieval, respectively.
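
    For context, here is a minimal sketch of the batch nonlinear projection trick as it is commonly stated: from a positive semi-definite kernel matrix K, obtain explicit sample coordinates X with X Xᵀ = K, so that any linear method can then be run on X directly. The incremental variant proposed in the paper, which avoids recomputing this decomposition as samples arrive, is not shown here.

```python
import numpy as np

def npt_coordinates(K, tol=1e-10):
    """Return X (n_samples, r) such that X @ X.T reproduces the kernel matrix K."""
    vals, vecs = np.linalg.eigh(K)        # K is symmetric positive semi-definite
    keep = vals > tol                     # discard numerically zero directions
    return vecs[:, keep] * np.sqrt(vals[keep])   # rows are explicit sample coordinates
```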

  12. SU-D-206-04: Iterative CBCT Scatter Shading Correction Without Prior Information

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bai, Y; Wu, P; Mao, T

    2016-06-15

    Purpose: To estimate and remove the scatter contamination in acquired cone-beam CT (CBCT) projections, in order to suppress shading artifacts and improve image quality without prior information. Methods: The uncorrected CBCT images containing shading artifacts are reconstructed by applying the standard FDK algorithm to the raw CBCT projections. The uncorrected image is then segmented to generate an initial template image. To estimate the scatter signal, differences are calculated by subtracting the simulated projections of the template image from the raw projections. Since scatter signals are dominantly continuous and low-frequency in the projection domain, they are estimated by low-pass filtering the difference signals and subtracted from the raw CBCT projections to achieve the scatter correction. Finally, the corrected CBCT image is reconstructed from the corrected projection data. Since an accurate template image is not readily segmented from the uncorrected CBCT image, the proposed scheme is iterated until the produced template no longer changes. Results: The proposed scheme is evaluated on Catphan©600 phantom data and CBCT images acquired from a pelvis patient. The results show that shading artifacts are effectively suppressed by the proposed method. Using multi-detector CT (MDCT) images as reference, quantitative analysis is performed to measure the quality of the corrected images. Compared to images without correction, the proposed method reduces the overall CT number error from over 200 HU to less than 50 HU and increases spatial uniformity. Conclusion: An iterative strategy that does not rely on prior information is proposed in this work to remove the shading artifacts due to scatter contamination in the projection domain. The method is evaluated in phantom and patient studies, and the results show that the image quality is remarkably improved. The proposed method is efficient and practical for addressing the poor image quality of CBCT images. This work is supported by the Zhejiang Provincial Natural Science Foundation of China (Grant No. LR16F010001) and the National High-tech R&D Program for Young Scientists by the Ministry of Science and Technology of China (Grant No. 2015AA020917).
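
    The loop below is a minimal sketch of the iterative scheme described in the record. The `fdk`, `forward_project`, and `segment` callables are placeholders assumed to be supplied by an existing reconstruction and ray-tracing toolchain; the smoothing width and iteration count are illustrative.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def iterative_scatter_correction(raw_proj, fdk, forward_project, segment,
                                 n_iter=5, sigma=20):
    """raw_proj: (n_views, n_rows, n_cols) acquired CBCT projections."""
    corrected = raw_proj.copy()
    for _ in range(n_iter):
        image = fdk(corrected)                       # reconstruct current estimate
        template = segment(image)                    # piecewise-constant template image
        primary = forward_project(template)          # simulated scatter-free projections
        diff = raw_proj - primary                    # scatter plus residual error
        scatter = gaussian_filter(diff, sigma=(0, sigma, sigma))  # keep low-frequency part
        corrected = raw_proj - np.clip(scatter, 0, None)
    return corrected
```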

  13. Construction and demolition waste generation rates for high-rise buildings in Malaysia.

    PubMed

    Mah, Chooi Mei; Fujiwara, Takeshi; Ho, Chin Siong

    2016-12-01

    Construction and demolition waste continues to increase sharply in step with the economic growth of less developed countries. Though the construction industry is large, it is composed of small firms with individual waste management practices, often leading to deleterious environmental outcomes. Quantifying construction and demolition waste generation allows policy makers and stakeholders to understand the true internal and external costs of construction, providing a necessary foundation for waste management planning that may overcome these outcomes and be both economically and environmentally optimal. This study offers a theoretical method for estimating the construction and demolition project waste generation rate by utilising available data, including waste disposal truck size and number, and waste volume and composition. This method is proposed as a less burdensome and more broadly applicable alternative to waste estimation by on-site hand sorting and weighing. The developed method is applied to 11 projects across Malaysia as the case study. This study quantifies the waste generation rate and illustrates how the construction method influences it, estimating that the conventional construction method has a waste generation rate of 9.88 t per 100 m², the mixed-construction method 3.29 t per 100 m², and demolition projects 104.28 t per 100 m². © The Author(s) 2016.
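
    The arithmetic below is purely illustrative of the kind of estimate the method produces; all input values are hypothetical and are not taken from the study. It reduces truck counts, nominal truck volume, and an assumed bulk density to a per-floor-area generation rate.

```python
# Hypothetical inputs for one building project (not data from the paper)
n_truckloads = 120            # disposal truckloads recorded for the project
truck_volume_m3 = 10.0        # nominal volume per truckload
bulk_density_t_per_m3 = 0.8   # assumed density of the mixed waste
gross_floor_area_m2 = 9_700   # gross floor area of the project

waste_t = n_truckloads * truck_volume_m3 * bulk_density_t_per_m3
rate_t_per_100m2 = waste_t / (gross_floor_area_m2 / 100.0)
print(f"waste generation rate: {rate_t_per_100m2:.2f} t per 100 m^2")
```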

  14. Characterization of a high-energy in-line phase contrast tomosynthesis prototype.

    PubMed

    Wu, Di; Yan, Aimin; Li, Yuhua; Wong, Molly D; Zheng, Bin; Wu, Xizeng; Liu, Hong

    2015-05-01

    In this research, a high-energy in-line phase contrast tomosynthesis prototype was developed and characterized through quantitative investigations and phantom studies. The prototype system consists of an x-ray source, a motorized rotation stage, and a CMOS detector with a pixel pitch of 0.05 mm. The x-ray source was operated at 120 kVp for this study, and the objects were mounted on the rotation stage 76.2 cm (R1) from the source and 114.3 cm (R2) from the detector. The large air gap between the object and detector guarantees sufficient phase-shift effects. The quantitative evaluation of this prototype included modulation transfer function and noise power spectrum measurements conducted under both projection mode and tomosynthesis mode. Phantom studies were performed on three custom-designed phantoms with complex structures: a five-layer bubble wrap phantom, a fishbone phantom, and a chicken breast phantom with embedded fibrils and mass structures extracted from an ACR phantom. In-plane images of the phantoms were acquired to investigate their image quality through observation, intensity profile plots, edge enhancement evaluations, and/or contrast-to-noise ratio calculations. In addition, the robust phase-attenuation duality (PAD)-based phase retrieval method was applied to tomosynthesis for the first time in this research; it was utilized as a preprocessing method to fully exhibit phase contrast in the angular projections before reconstruction. The resolution and noise characteristics of this high-energy in-line phase contrast tomosynthesis prototype were successfully investigated and demonstrated. The phantom studies demonstrated that this imaging prototype can remove the structure overlap present in the phantom projections, obtain well-delineated interfaces, and achieve better contrast-to-noise ratios after applying phase retrieval to the angular projections. This research successfully demonstrated a high-energy in-line phase contrast tomosynthesis prototype. In addition, the PAD-based method of phase retrieval was combined with tomosynthesis imaging for the first time, which demonstrated its capability to significantly improve the contrast-to-noise ratios in the images.

  15. Characterizing uncertain sea-level rise projections to support investment decisions.

    PubMed

    Sriver, Ryan L; Lempert, Robert J; Wikman-Svahn, Per; Keller, Klaus

    2018-01-01

    Many institutions worldwide are considering how to include uncertainty about future changes in sea-levels and storm surges into their investment decisions regarding large capital infrastructures. Here we examine how to characterize deeply uncertain climate change projections to support such decisions using Robust Decision Making analysis. We address questions regarding how to confront the potential for future changes in low probability but large impact flooding events due to changes in sea-levels and storm surges. Such extreme events can affect investments in infrastructure but have proved difficult to consider in such decisions because of the deep uncertainty surrounding them. This study utilizes Robust Decision Making methods to address two questions applied to investment decisions at the Port of Los Angeles: (1) Under what future conditions would a Port of Los Angeles decision to harden its facilities against extreme flood scenarios at the next upgrade pass a cost-benefit test, and (2) Do sea-level rise projections and other information suggest such conditions are sufficiently likely to justify such an investment? We also compare and contrast the Robust Decision Making methods with a full probabilistic analysis. These two analysis frameworks result in similar investment recommendations for different idealized future sea-level projections, but provide different information to decision makers and envision different types of engagement with stakeholders. In particular, the full probabilistic analysis begins by aggregating the best scientific information into a single set of joint probability distributions, while the Robust Decision Making analysis identifies scenarios where a decision to invest in near-term response to extreme sea-level rise passes a cost-benefit test, and then assembles scientific information of differing levels of confidence to help decision makers judge whether or not these scenarios are sufficiently likely to justify making such investments. Results highlight the highly-localized and context dependent nature of applying Robust Decision Making methods to inform investment decisions.

  16. Characterizing uncertain sea-level rise projections to support investment decisions

    PubMed Central

    Lempert, Robert J.; Wikman-Svahn, Per; Keller, Klaus

    2018-01-01

    Many institutions worldwide are considering how to include uncertainty about future changes in sea-levels and storm surges into their investment decisions regarding large capital infrastructures. Here we examine how to characterize deeply uncertain climate change projections to support such decisions using Robust Decision Making analysis. We address questions regarding how to confront the potential for future changes in low probability but large impact flooding events due to changes in sea-levels and storm surges. Such extreme events can affect investments in infrastructure but have proved difficult to consider in such decisions because of the deep uncertainty surrounding them. This study utilizes Robust Decision Making methods to address two questions applied to investment decisions at the Port of Los Angeles: (1) Under what future conditions would a Port of Los Angeles decision to harden its facilities against extreme flood scenarios at the next upgrade pass a cost-benefit test, and (2) Do sea-level rise projections and other information suggest such conditions are sufficiently likely to justify such an investment? We also compare and contrast the Robust Decision Making methods with a full probabilistic analysis. These two analysis frameworks result in similar investment recommendations for different idealized future sea-level projections, but provide different information to decision makers and envision different types of engagement with stakeholders. In particular, the full probabilistic analysis begins by aggregating the best scientific information into a single set of joint probability distributions, while the Robust Decision Making analysis identifies scenarios where a decision to invest in near-term response to extreme sea-level rise passes a cost-benefit test, and then assembles scientific information of differing levels of confidence to help decision makers judge whether or not these scenarios are sufficiently likely to justify making such investments. Results highlight the highly-localized and context dependent nature of applying Robust Decision Making methods to inform investment decisions. PMID:29414978

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shao, Huaiyong, E-mail: huaiyongshao@163.com; Center for Global Change and Earth Observations, Michigan State University, East Lansing 48823, MI; Sun, Xiaofei

    The Chinese government has conducted the Returning Grazing Land to Grassland Project (RGLGP) across large portions of grasslands in western China since 2003. In order to explore and understand the impact of the RGLGP on the grassland's eco-environment, we utilized the Projection Pursuit Model (PPM) and a Geographic Information System (GIS) to develop a spatial assessment model to examine the ecological vulnerability of the grassland. Our results include five indications: (1) it is practical to apply the spatial PPM to ecological vulnerability assessment for the grassland; this methodology avoids creating an artificial hypothesis, thereby providing objective results that successfully execute a multi-index assessment process and analysis under non-linear systems in eco-environments; (2) the spatial PPM is not only capable of evaluating regional eco-environmental vulnerability in a quantitative way, but can also quantitatively demonstrate the degree of effect of each evaluation index on regional eco-environmental vulnerability; (3) the eco-environment of the Xianshui River Basin falls into the medium range level. The normalized difference vegetation index (NDVI) and land use and cover change (LUCC) crucially influence the Xianshui River Basin's eco-environmental vulnerability. Generally, regional eco-environmental conditions in the Xianshui River Basin improved between 2000 and 2010. The RGLGP positively affected NDVI and LUCC structure, thereby promoting the enhancement of the regional eco-environment; (4) the Xianshui River Basin divides its ecological vulnerability across different levels; therefore our study investigates three ecological regions and proposes specific suggestions for each in order to assist in eco-environmental protection and rehabilitation; and (5) the spatial PPM established by this study has the potential to be applied to all types of grassland eco-environmental vulnerability assessments under the RGLGP and under similar conditions in the Returning Agriculture Land to Forest Project (RALFP). However, when establishing an eco-environmental vulnerability assessment model, it is necessary to choose suitable evaluation indexes in accordance with regional eco-environmental characteristics. - Highlights: • We present a method for regional eco-environmental vulnerability assessment. • The method combines the Projection Pursuit Model with a Geographic Information System. • The Returning Grazing Land to Grassland Project is crucial to environment recovery. • The method is more objective in assessing regional eco-environmental vulnerability.

  18. Implementing EVM Data Analysis Adding Value from a NASA Project Manager's Perspective

    NASA Technical Reports Server (NTRS)

    Counts, Stacy; Kerby, Jerald

    2006-01-01

    Data analysis is one of the keys to an effective Earned Value Management (EVM) process. Project managers (PMs) must continually evaluate data in assessing the health of their projects, and good analysis of data can assist PMs in making better decisions in managing projects. To better support our PMs, the National Aeronautics and Space Administration (NASA) Marshall Space Flight Center (MSFC) recently renewed its emphasis on sound EVM data analysis practices and processes. During this presentation we will discuss the approach that MSFC followed in implementing better data analysis across its Center and will address our approach to effectively equip and support our projects in applying a sound data analysis process. In addition, the PM for the Space Station Biological Research Project will share her experiences of how effective data analysis can benefit a PM in the decision-making process, discussing how the emphasis on data analysis has helped create a solid method for assessing the project's performance. Using data analysis successfully can be an effective and efficient tool in today's environment of increasing workloads and downsizing workforces.

  19. Reduced order modeling of fluid/structure interaction.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barone, Matthew Franklin; Kalashnikova, Irina; Segalman, Daniel Joseph

    2009-11-01

    This report describes work performed from October 2007 through September 2009 under the Sandia Laboratory Directed Research and Development project titled 'Reduced Order Modeling of Fluid/Structure Interaction.' This project addresses fundamental aspects of techniques for construction of predictive Reduced Order Models (ROMs). A ROM is defined as a model, derived from a sequence of high-fidelity simulations, that preserves the essential physics and predictive capability of the original simulations but at a much lower computational cost. Techniques are developed for construction of provably stable linear Galerkin projection ROMs for compressible fluid flow, including a method for enforcing boundary conditions that preserves numerical stability. A convergence proof and error estimates are given for this class of ROM, and the method is demonstrated on a series of model problems. A reduced order method, based on the method of quadratic components, for solving the von Karman nonlinear plate equations is developed and tested. This method is applied to the problem of nonlinear limit cycle oscillations encountered when the plate interacts with an adjacent supersonic flow. A stability-preserving method for coupling the linear fluid ROM with the structural dynamics model for the elastic plate is constructed and tested. Methods for constructing efficient ROMs for nonlinear fluid equations are developed and tested on a one-dimensional convection-diffusion-reaction equation. These methods are combined with a symmetrization approach to construct a ROM technique for application to the compressible Navier-Stokes equations.
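
    As a generic illustration of Galerkin projection ROMs (the report's stability-preserving boundary treatment and compressible-flow specifics are not reproduced here), the sketch below builds a POD basis from snapshots of a linear system x' = A x and projects the operator onto it. Variable names are illustrative.

```python
import numpy as np

def pod_basis(snapshots, r):
    """snapshots: (n_dof, n_snap) matrix of solution snapshots; returns leading r POD modes."""
    U, _, _ = np.linalg.svd(snapshots, full_matrices=False)
    return U[:, :r]

def galerkin_rom(A, Phi):
    """Reduced operator A_r = Phi^T A Phi for the Galerkin-projected dynamics a' = A_r a."""
    return Phi.T @ A @ Phi

# Usage: integrate a' = A_r a with a(0) = Phi.T @ x0, then reconstruct x(t) ≈ Phi @ a(t).
```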

  20. Upper ankle joint space detection on low contrast intraoperative fluoroscopic C-arm projections

    NASA Astrophysics Data System (ADS)

    Thomas, Sarina; Schnetzke, Marc; Brehler, Michael; Swartman, Benedict; Vetter, Sven; Franke, Jochen; Grützner, Paul A.; Meinzer, Hans-Peter; Nolden, Marco

    2017-03-01

    Intraoperative mobile C-arm fluoroscopy is widely used for interventional verification in trauma surgery, with high flexibility combined with low cost being the main advantages of the method. However, the lack of global device-to-patient orientation is challenging when comparing the acquired data to other intra-patient datasets. In upper ankle joint fracture reduction accompanied by an unstable syndesmosis, a comparison to the unfractured contralateral side is helpful for verification of the reduction result. To reduce dose and operation time, our approach aims at comparing single projections of the unfractured ankle with volumetric images of the reduced fracture. For precise assessment, a pre-alignment of both datasets is a crucial step. We propose a contour extraction pipeline to estimate the joint space location for a pre-alignment of fluoroscopic C-arm projections containing the upper ankle joint. A quadtree-based hierarchical variance comparison extracts potential feature points, and a Hough transform is applied to identify bone shaft lines together with the tibiotalar joint space. Using this information we can define the coarse orientation of the projections independently of the ankle pose during acquisition, in order to align those images to the volume of the fractured ankle. The proposed method was evaluated on thirteen cadaveric datasets consisting of 100 projections each, with image planes manually adjusted by three trauma surgeons. The results show that the method can be used to detect the joint space orientation. The correlation between angle deviation and anatomical projection direction gives valuable input on the acquisition direction for future clinical experiments.
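
    A minimal sketch of the line-finding step, assuming the quadtree-based variance comparison has already produced a binary feature map `features` (that stage and all parameters are illustrative): a Hough transform recovers the dominant bone-shaft lines, whose orientation then fixes the coarse pose of the projection.

```python
import numpy as np
from skimage.transform import hough_line, hough_line_peaks

def shaft_lines(features, n_lines=2):
    """features: 2D boolean feature map; returns (orientation, offset) per detected line."""
    h, theta, d = hough_line(features)                       # Hough accumulator
    _, angles, dists = hough_line_peaks(h, theta, d, num_peaks=n_lines)
    return list(zip(angles, dists))
```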
