Sample records for methods including observations

  1. Development of a structured observational method for the systematic assessment of school food-choice architecture.

    PubMed

    Ozturk, Orgul D; McInnes, Melayne M; Blake, Christine E; Frongillo, Edward A; Jones, Sonya J

    2016-01-01

    The objective of this study is to develop a structured observational method for the systematic assessment of the food-choice architecture that can be used to identify key points for behavioral economic intervention intended to improve the health quality of children's diets. We use an ethnographic approach with observations at twelve elementary schools to construct our survey instrument. Elements of the structured observational method include decision environment, salience, accessibility/convenience, defaults/verbal prompts, number of choices, serving ware/method/packaging, and social/physical eating environment. Our survey reveals important "nudgeable" components of the elementary school food-choice architecture, including precommitment and default options on the lunch line.

  2. False alarm recognition in hyperspectral gas plume identification

    DOEpatents

    Conger, James L [San Ramon, CA]; Lawson, Janice K [Tracy, CA]; Aimonetti, William D [Livermore, CA]

    2011-03-29

    According to one embodiment, a method for analyzing hyperspectral data includes collecting first hyperspectral data of a scene using a hyperspectral imager during a no-gas period and analyzing the first hyperspectral data using one or more gas plume detection logics. The gas plume detection logic is executed using a low detection threshold, and detects each occurrence of an observed hyperspectral signature. The method also includes generating a histogram for all occurrences of each observed hyperspectral signature which is detected using the gas plume detection logic, and determining a probability of false alarm (PFA) for all occurrences of each observed hyperspectral signature based on the histogram. Possibly at some other time, the method includes collecting second hyperspectral data, and analyzing the second hyperspectral data using the one or more gas plume detection logics and the PFA to determine if any gas is present. Other systems and methods are also included.
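    The histogram-to-PFA step described above can be sketched as follows; the Gaussian background scores, bin count, threshold `alpha`, and helper names are illustrative assumptions, not details from the patent:

    ```python
    import numpy as np

    def pfa_from_histogram(no_gas_scores, bins=50):
        """Estimate probability of false alarm (PFA) per score level from
        detections made during a known no-gas period."""
        counts, edges = np.histogram(no_gas_scores, bins=bins)
        # Survival function: fraction of no-gas detections at or above each bin
        tail = np.cumsum(counts[::-1])[::-1] / counts.sum()
        return edges[:-1], tail

    def is_gas_present(score, edges, pfa, alpha=0.01):
        """Declare gas present only if the score's empirical PFA is below alpha."""
        idx = np.searchsorted(edges, score, side="right") - 1
        idx = np.clip(idx, 0, len(pfa) - 1)
        return pfa[idx] < alpha

    rng = np.random.default_rng(0)
    no_gas = rng.normal(0.0, 1.0, 10000)       # first (no-gas) hyperspectral data
    edges, pfa = pfa_from_histogram(no_gas)
    print(is_gas_present(5.0, edges, pfa))     # strong signature in second data
    print(is_gas_present(0.1, edges, pfa))     # background-level signature
    ```

    The low detection threshold in the patent corresponds to keeping every occurrence in `no_gas_scores`; the PFA table then filters the second-pass detections.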

  3. 75 FR 51752 - Proposed Information Collection; Comment Request; An Observer Program for Vessels in the Pacific...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-23

    ... training, debriefing or responses to suspension or decertification. II. Method of Collection Respondents have a choice of either electronic or paper forms. Methods of submittal include electronic (Web-based... define observer duties, train and debrief observers, and manage observer data and its release. The...

  4. Systems and methods for determining a spacecraft orientation

    NASA Technical Reports Server (NTRS)

    Harman, Richard R (Inventor); Luquette, Richard J (Inventor); Lee, Michael H (Inventor)

    2004-01-01

    Disclosed are systems and methods of determining or estimating an orientation of a spacecraft. An exemplary system generates telemetry data, including star observations, in a satellite. A ground station processes the telemetry data with data from a star catalog, to generate display data which, in this example, includes observed stars overlaid with catalog stars. An operator views the display and generates an operator input signal using a mouse device, to pair up observed and catalog stars. Circuitry in the ground station then processes two pairs of observed and catalog stars, to determine an orientation of the spacecraft.
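    Determining an orientation from two pairs of observed and catalog stars is the classic TRIAD construction; the sketch below assumes noise-free unit-vector measurements and is not the patented processing chain:

    ```python
    import numpy as np

    def triad(b1, b2, r1, r2):
        """TRIAD algorithm: rotation matrix taking reference-frame vectors
        (catalog stars r1, r2) into body-frame vectors (observed stars b1, b2)."""
        def frame(v1, v2):
            t1 = v1 / np.linalg.norm(v1)
            t2 = np.cross(v1, v2)
            t2 /= np.linalg.norm(t2)
            t3 = np.cross(t1, t2)
            return np.column_stack([t1, t2, t3])
        return frame(b1, b2) @ frame(r1, r2).T

    # Known attitude: 90-degree rotation about z
    R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
    r1, r2 = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 1.0])
    R_est = triad(R_true @ r1, R_true @ r2, r1, r2)   # observed = rotated catalog
    print(np.allclose(R_est, R_true))
    ```

    With exact measurements the two-pair solution recovers the attitude exactly; with noisy star observations the first pair dominates the estimate.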

  5. Evaluation of internal noise methods for Hotelling observers

    NASA Astrophysics Data System (ADS)

    Zhang, Yani; Pham, Binh T.; Eckstein, Miguel P.

    2005-04-01

    Including internal noise in computer model observers to degrade their performance to human levels is a common method that allows quantitative comparison of human and model performance. In this paper, we studied two different types of methods for injecting internal noise into Hotelling model observers. The first method adds internal noise to the output of the individual channels: a) independent non-uniform channel noise, b) independent uniform channel noise. The second method adds internal noise to the decision variable arising from the combination of channel responses: a) internal noise standard deviation proportional to the decision variable's standard deviation due to the external noise, b) internal noise standard deviation proportional to the decision variable's variance caused by the external noise. We tested the square-window Hotelling observer (HO), channelized Hotelling observer (CHO), and Laguerre-Gauss Hotelling observer (LGHO). The studied task was detection of a filling defect of varying size/shape in one of four simulated arterial segment locations with real x-ray angiography backgrounds. Results show that the internal noise method that leads to the best prediction of human performance differs across the studied model observers. The CHO model best predicts human observer performance with the channel internal noise. The HO and LGHO best predict human observer performance with the decision variable internal noise. These results might help explain why previous studies have found different results on the ability of each Hotelling model to predict human performance. Finally, the present results might guide researchers in choosing a method to include internal noise in their Hotelling models.
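    A minimal sketch of the decision-variable internal-noise method (the second type above), assuming Gaussian decision variables in a two-alternative task and an arbitrary proportionality factor `k`:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def percent_correct(signal_scores, noise_scores):
        """2AFC percent correct: observer picks the larger decision variable."""
        return np.mean(signal_scores > noise_scores)

    # Decision variables of a model observer over many trials (assumed Gaussian)
    lam_signal = rng.normal(2.0, 1.0, 20000)   # signal-present responses
    lam_noise = rng.normal(0.0, 1.0, 20000)    # signal-absent responses

    # Internal noise std proportional to the external-noise-induced std (factor k)
    k = 1.5
    sigma_int = k * lam_noise.std()
    noisy_signal = lam_signal + rng.normal(0.0, sigma_int, lam_signal.size)
    noisy_noise = lam_noise + rng.normal(0.0, sigma_int, lam_noise.size)

    pc_model = percent_correct(lam_signal, lam_noise)
    pc_degraded = percent_correct(noisy_signal, noisy_noise)
    print(pc_model > pc_degraded)   # internal noise degrades performance
    ```

    In practice `k` is tuned so that `pc_degraded` matches the measured human percent correct.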

  6. 75 FR 13515 - Office of Innovation and Improvement (OII); Overview Information; Ready-to-Learn Television...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-22

    ... on rigorous scientifically based research methods to assess the effectiveness of a particular... activities and programs; and (B) Includes research that-- (i) Employs systematic, empirical methods that draw... or observational methods that provide reliable and valid data across evaluators and observers, across...

  7. Comparison of Observed and Predicted Abutment Scour at Selected Bridges in Maine

    USGS Publications Warehouse

    Lombard, Pamela J.; Hodgkins, Glenn A.

    2008-01-01

    Maximum abutment-scour depths predicted with five different methods were compared to maximum abutment-scour depths observed at 100 abutments at 50 bridge sites in Maine with a median bridge age of 66 years. Prediction methods included the Froehlich/Hire method, the Sturm method, and the Maryland method published in Federal Highway Administration Hydraulic Engineering Circular 18 (HEC-18); the Melville method; and envelope curves. No correlation was found between scour calculated using any of the prediction methods and observed scour. Abutment scour observed in the field ranged from 0 to 6.8 feet, with an average observed scour of less than 1.0 foot. Fifteen of the 50 bridge sites had no observable scour. Equations frequently overpredicted scour by an order of magnitude and in some cases by two orders of magnitude. The equations also underpredicted scour 4 to 14 percent of the time.

  8. Qualitative Methods Can Enrich Quantitative Research on Occupational Stress: An Example from One Occupational Group

    ERIC Educational Resources Information Center

    Schonfeld, Irvin Sam; Farrell, Edwin

    2010-01-01

    The chapter examines the ways in which qualitative and quantitative methods support each other in research on occupational stress. Qualitative methods include eliciting from workers unconstrained descriptions of work experiences, careful first-hand observations of the workplace, and participant-observers describing "from the inside" a…

  9. Comparison of different methods to compute a preliminary orbit of Space Debris using radar observations

    NASA Astrophysics Data System (ADS)

    Ma, Hélène; Gronchi, Giovanni F.

    2014-07-01

    We present a new method of preliminary orbit determination for space debris using radar observations, which we call Infang. We can perform a linkage of two sets of four observations collected at close times. The context is characterized by the accuracy of the range ρ, whereas the right ascension α and the declination δ are much more inaccurate due to observational errors. This method can correct α, δ, assuming exact knowledge of the range ρ. Considering no perturbations from the J2 effect, but including errors in the observations, we can compare the new method, the classical method of Gibbs, and the more recent Keplerian integrals method. The development of Infang is ongoing, and it will be further improved and tested.
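    The classical Gibbs method used as a comparison baseline can be sketched from three coplanar position vectors; the circular test orbit and Earth's gravitational parameter below are illustrative choices:

    ```python
    import numpy as np

    MU = 398600.4418  # Earth's gravitational parameter, km^3/s^2

    def gibbs(r1, r2, r3, mu=MU):
        """Classical Gibbs method: velocity (km/s) at the middle of three
        coplanar position vectors (km)."""
        n1, n2, n3 = (np.linalg.norm(r) for r in (r1, r2, r3))
        D = np.cross(r1, r2) + np.cross(r2, r3) + np.cross(r3, r1)
        N = n1 * np.cross(r2, r3) + n2 * np.cross(r3, r1) + n3 * np.cross(r1, r2)
        S = r1 * (n2 - n3) + r2 * (n3 - n1) + r3 * (n1 - n2)
        return np.sqrt(mu / (np.linalg.norm(N) * np.linalg.norm(D))) * (
            np.cross(D, r2) / n2 + S)

    # Circular 7000 km orbit sampled at 0, 30, and 60 degrees true anomaly
    r = 7000.0
    angles = np.radians([0.0, 30.0, 60.0])
    r1, r2, r3 = (r * np.array([np.cos(a), np.sin(a), 0.0]) for a in angles)
    v2 = gibbs(r1, r2, r3)
    print(np.linalg.norm(v2))  # circular speed sqrt(mu/r), about 7.546 km/s
    ```

    For radar data the ranges are accurate, so the position vectors feeding Gibbs are well determined; the angular errors are what the Infang correction targets.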

  10. Applications of propensity score methods in observational comparative effectiveness and safety research: where have we come and where should we go?

    PubMed

    Borah, Bijan J; Moriarty, James P; Crown, William H; Doshi, Jalpa A

    2014-01-01

    Propensity score (PS) methods have proliferated in recent years in observational studies in general and in observational comparative effectiveness research (CER) in particular. PS methods are an important set of tools for estimating treatment effects in observational studies, enabling adjustment for measured confounders in an easy-to-understand and transparent way. This article demonstrates how PS methods have been used to address specific CER questions from 2001 through to 2012 by identifying six impactful studies from this period. This article also discusses areas for improvement, including data infrastructure, and a unified set of guidelines in terms of PS implementation and reporting, which will boost confidence in evidence generated through observational CER using PS methods.
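    One widely used PS method, inverse probability of treatment weighting (IPTW), can be sketched on simulated observational data; the data-generating model, effect size, and logistic fit below are assumptions for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 20000
    x = rng.normal(size=n)                       # measured confounder
    e_true = 1.0 / (1.0 + np.exp(-x))            # true propensity score
    t = rng.random(n) < e_true                   # confounded treatment assignment
    y = 2.0 * t + 3.0 * x + rng.normal(size=n)   # outcome; true effect = 2

    # Estimate the propensity score with a small Newton-Raphson logistic fit
    X = np.column_stack([np.ones(n), x])
    beta = np.zeros(2)
    for _ in range(25):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        grad = X.T @ (t - p)
        hess = (X * (p * (1 - p))[:, None]).T @ X
        beta += np.linalg.solve(hess, grad)
    e_hat = 1.0 / (1.0 + np.exp(-X @ beta))

    naive = y[t].mean() - y[~t].mean()           # biased by the confounder
    iptw = np.mean(t * y / e_hat) - np.mean((1 - t) * y / (1 - e_hat))
    print(round(naive, 2), round(iptw, 2))       # IPTW recovers ~2.0
    ```

    The weighting removes the measured confounding, which is the "easy-to-understand and transparent" adjustment the abstract refers to; unmeasured confounders remain a limitation.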

  11. Comparison of observed and predicted abutment scour at selected bridges in Maine.

    DOT National Transportation Integrated Search

    2008-01-01

    Maximum abutment-scour depths predicted with five different methods were compared to maximum abutment-scour depths observed at 100 abutments at 50 bridge sites in Maine with a median bridge age of 66 years. Prediction methods included the Froehli...

  12. The Dependability of Classroom Observations.

    ERIC Educational Resources Information Center

    Hiatt, Diana Buell; Keesling, J. Ward

    A generalizability study of timed observations was conducted in 25 primary grade classes to observe teachers' use of time--for instruction, evaluation of instruction, and classroom management--according to the hour and day observed. Observational methods used by on-site researchers included videotape, checklists, running documentaries, frequency…

  13. The Effects of Including Observed Means or Latent Means as Covariates in Multilevel Models for Cluster Randomized Trials

    ERIC Educational Resources Information Center

    Aydin, Burak; Leite, Walter L.; Algina, James

    2016-01-01

    We investigated methods of including covariates in two-level models for cluster randomized trials to increase power to detect the treatment effect. We compared multilevel models that included either an observed cluster mean or a latent cluster mean as a covariate, as well as the effect of including Level 1 deviation scores in the model. A Monte…

  14. Software sensors for bioprocesses.

    PubMed

    Bogaerts, Ph; Vande Wouwer, A

    2003-10-01

    State estimation is a significant problem in biotechnological processes, due to the general lack of hardware sensor measurements of the variables describing the process dynamics. The objective of this paper is to review a number of software sensor design methods, including extended Kalman filters, receding-horizon observers, asymptotic observers, and hybrid observers, which can be efficiently applied to bioprocesses. These methods are illustrated with simulation and real-life case studies.
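    The reviewed observers are nonlinear, but the software-sensor idea can be illustrated with a minimal linear Luenberger observer that reconstructs an unmeasured state from a measured one; the system matrices and gain below are illustrative assumptions:

    ```python
    import numpy as np

    dt = 0.1
    A = np.array([[1.0, dt], [0.0, 1.0]])  # state: [measured conc., unmeasured rate]
    C = np.array([[1.0, 0.0]])             # hardware sensor sees concentration only
    L = np.array([[0.6], [0.8]])           # observer gain; (A - L C) is stable here

    x = np.array([1.0, 0.5])               # true process state
    x_hat = np.zeros(2)                    # software-sensor estimate
    for _ in range(200):
        y = C @ x                                          # sensor measurement
        x_hat = A @ x_hat + (L @ (y - C @ x_hat)).ravel()  # predict + correct
        x = A @ x                                          # true process evolves

    print(np.round(x_hat - x, 3))  # estimation error decays to ~[0, 0]
    ```

    The correction term drives the estimation error by `(A - L C)`, whose eigenvalues (0.8 and 0.6 here) lie inside the unit circle, so the unmeasured rate is recovered without a dedicated sensor.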

  15. Studies of Trace Gas Chemical Cycles Using Observations, Inverse Methods and Global Chemical Transport Models

    NASA Technical Reports Server (NTRS)

    Prinn, Ronald G.

    2001-01-01

    For interpreting observational data, and in particular for use in inverse methods, accurate and realistic chemical transport models are essential. Toward this end we have, in recent years, helped develop and utilize a number of three-dimensional models including the Model for Atmospheric Transport and Chemistry (MATCH).

  16. Controllers, observers, and applications thereof

    NASA Technical Reports Server (NTRS)

    Gao, Zhiqiang (Inventor); Zhou, Wankun (Inventor); Miklosovic, Robert (Inventor); Radke, Aaron (Inventor); Zheng, Qing (Inventor)

    2011-01-01

    Controller scaling and parameterization are described. Techniques that can be improved by employing the scaling and parameterization include, but are not limited to, controller design, tuning, and optimization. The scaling and parameterization methods described here apply to transfer-function-based controllers, including PID controllers. The parameterization methods also apply to state feedback and state observer based controllers, as well as linear active disturbance rejection control (ADRC). Parameterization simplifies the use of ADRC. A discrete extended state observer (DESO) and a generalized extended state observer (GESO) are described. They improve the performance of the ESO and therefore ADRC. A tracking control algorithm is also described that improves the performance of the ADRC controller. A general algorithm is described for applying ADRC to multi-input multi-output systems. Several specific applications of the control systems and processes are disclosed.
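    A discrete extended state observer of the kind referred to above can be sketched for a first-order plant; the bandwidth parameterization (gains 2ω, ω²), plant model, and disturbance are illustrative assumptions, not the patented algorithms:

    ```python
    dt = 0.001
    b0 = 1.0
    w_o = 30.0                          # observer bandwidth (rad/s)
    beta1, beta2 = 2 * w_o, w_o ** 2    # bandwidth-parameterized ESO gains

    y, u, d = 0.0, 0.5, 1.0             # plant output, constant input, unknown disturbance
    z1, z2 = 0.0, 0.0                   # ESO states: estimates of y and of total disturbance f
    for _ in range(10000):              # simulate 10 s
        f = -y + d                      # "total disturbance" lumping dynamics + d
        y += dt * (f + b0 * u)          # plant: y' = f + b0*u
        e = y - z1
        z1 += dt * (z2 + b0 * u + beta1 * e)   # ESO: estimate output ...
        z2 += dt * (beta2 * e)                 # ... and total disturbance

    print(round(z2, 3), round(-y + d, 3))  # disturbance estimate tracks f
    ```

    Collapsing the gain choice to a single bandwidth ω is the parameterization idea the abstract highlights: one tuning knob instead of one gain per observer state.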

  17. Bayesian data analysis in observational comparative effectiveness research: rationale and examples.

    PubMed

    Olson, William H; Crivera, Concetta; Ma, Yi-Wen; Panish, Jessica; Mao, Lian; Lynch, Scott M

    2013-11-01

    Many comparative effectiveness research and patient-centered outcomes research studies will need to be observational for one or both of two reasons: first, randomized trials are expensive and time-consuming; and second, only observational studies can answer some research questions. It is generally recognized that there is a need to increase the scientific validity and efficiency of observational studies. Bayesian methods for the design and analysis of observational studies are scientifically valid and offer many advantages over frequentist methods, including, importantly, the ability to conduct comparative effectiveness research/patient-centered outcomes research more efficiently. Bayesian data analysis is being introduced into outcomes studies that we are conducting. Our purpose here is to describe our view of some of the advantages of Bayesian methods for observational studies and to illustrate both realized and potential advantages by describing studies we are conducting in which various Bayesian methods have been or could be implemented.
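    A minimal example of the Bayesian approach: a conjugate Beta-Binomial comparison of two treatments' response rates. The observational counts and the uniform prior below are hypothetical, chosen only to illustrate how a posterior probability of superiority is computed:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical observational counts: responders / patients per treatment
    a_succ, a_n = 80, 100
    b_succ, b_n = 65, 100

    # Beta(1, 1) prior is conjugate, so each posterior is again a Beta
    post_a = rng.beta(1 + a_succ, 1 + a_n - a_succ, 100000)
    post_b = rng.beta(1 + b_succ, 1 + b_n - b_succ, 100000)

    # Direct probability statement unavailable to frequentist p-values
    prob_a_better = np.mean(post_a > post_b)
    print(round(prob_a_better, 2))
    ```

    The posterior can be updated as new patients accrue, which is one source of the efficiency gain the abstract describes.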

  18. Water Budget Estimation by Assimilating Multiple Observations and Hydrological Modeling Using Constrained Ensemble Kalman Filtering

    NASA Astrophysics Data System (ADS)

    Pan, M.; Wood, E. F.

    2004-05-01

    This study explores a method to estimate the various components of the water cycle (ET, runoff, land storage, etc.) from a number of different information sources, including both observations and observation-enhanced model simulations. Unlike existing data assimilation schemes, this constrained Kalman filtering approach keeps the water budget perfectly closed while optimally updating the states of the underlying model (the VIC model) using observations. Assimilating different data sources in this way has several advantages: (1) a physical model is included to make the estimated time series smooth, free of gaps, and more physically consistent; (2) uncertainties in the model and observations are properly addressed; (3) the model is constrained by observations, which reduces model biases; and (4) the water balance is preserved throughout the assimilation. Experiments are carried out in the Southern Great Plains region, where the necessary observations have been collected. This method may also be implemented in other applications with physical constraints (e.g., energy cycles) and at different scales.
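    The budget-closure constraint can be enforced by projecting an unconstrained analysis onto the constraint surface; the component values and error covariances below are assumed for illustration and are not the VIC-based implementation:

    ```python
    import numpy as np

    # Water-budget components: [precipitation, ET, runoff, storage change]
    x = np.array([10.0, 4.2, 3.1, 2.9])      # unconstrained analysis (assumed)
    x[1] += 0.4                               # perturb ET so the budget no longer closes

    A = np.array([[1.0, -1.0, -1.0, -1.0]])   # closure: P - ET - R - dS = 0
    W = np.diag([0.5, 1.0, 1.0, 1.0])         # error covariance (assumed): trust P most

    # Minimum-variance projection onto the constraint surface A x = 0:
    # adjust each component in proportion to its uncertainty
    K = W @ A.T @ np.linalg.inv(A @ W @ A.T)
    x_closed = x - (K @ (A @ x)).ravel()

    print(np.round(A @ x_closed, 10))  # budget residual after projection: [0.]
    ```

    Components with larger error variance absorb more of the residual, so the closure correction is spread according to the stated uncertainties rather than evenly.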

  19. Fingerprint detection

    DOEpatents

    Saunders, George C.

    1992-01-01

    A method for detection and visualization of latent fingerprints is provided and includes contacting a substrate containing a latent print thereon with a colloidal metal composition for time sufficient to allow reaction of said colloidal metal composition with said latent print, and preserving or recording the observable print. Further, the method for detection and visualization of latent fingerprints can include contacting the metal composition-latent print reaction product with a secondary metal-containing solution for time sufficient to allow precipitation of said secondary metal thereby enhancing the visibility of the latent print, and preserving or recording the observable print.

  20. On the Direct Assimilation of Along-track Sea Surface Height Observations into a Free-surface Ocean Model Using a Weak Constraints Four Dimensional Variational (4dvar) Method

    NASA Astrophysics Data System (ADS)

    Ngodock, H.; Carrier, M.; Smith, S. R.; Souopgui, I.; Martin, P.; Jacobs, G. A.

    2016-02-01

    The representer method is adopted for solving a weak constraints 4dvar problem for the assimilation of ocean observations including along-track SSH, using a free surface ocean model. Direct 4dvar assimilation of SSH observations along the satellite tracks requires that the adjoint model be integrated with Dirac impulses on the right hand side of the adjoint equations for the surface elevation equation. The solution of this adjoint model will inevitably include surface gravity waves, and it constitutes the forcing for the tangent linear model (TLM) according to the representer method. This yields an analysis that is contaminated by gravity waves. A method for avoiding the generation of the surface gravity waves in the analysis is proposed in this study; it consists of removing the adjoint of the free surface from the right hand side (rhs) of the free surface mode in the TLM. The information from the SSH observations will still propagate to all other variables via the adjoint of the balance relationship between the barotropic and baroclinic modes, resulting in the correction to the surface elevation. Two assimilation experiments are carried out in the Gulf of Mexico: one with adjoint forcing included on the rhs of the TLM free surface equation, and the other without. Both analyses are evaluated against the assimilated SSH observations, SSH maps from Aviso and independent surface drifters, showing that the analysis that did not include adjoint forcing in the free surface is more accurate. This study shows that when a weak constraint 4dvar approach is considered for the assimilation of along-track SSH observations using a free surface model, with the aim of correcting the mesoscale circulation, an independent model error should not be assigned to the free surface.

  1. The value of including observational studies in systematic reviews was unclear: a descriptive study.

    PubMed

    Seida, Jennifer; Dryden, Donna M; Hartling, Lisa

    2014-12-01

    To evaluate (1) how often observational studies are included in comparative effectiveness reviews (CERs); (2) the rationale for including observational studies; (3) how data from observational studies are appraised, analyzed, and graded; and (4) the impact of observational studies on strength of evidence (SOE) and conclusions. Descriptive study of 23 CERs published through the Effective Health Care Program of the U.S. Agency for Healthcare Research and Quality. Authors searched for observational studies in 20 CERs, of which 18 included a median of 11 (interquartile range, 2-31) studies. Sixteen CERs incorporated the observational studies in their SOE assessments. Seventy-eight comparisons from 12 CERs included evidence from both trials and observational studies; observational studies had an impact on SOE and conclusions for 19 (24%) comparisons. There was diversity across the CERs regarding decisions to include observational studies; study designs considered; and approaches used to appraise, synthesize, and grade SOE. Reporting and methods guidance are needed to ensure clarity and consistency in how observational studies are incorporated in CERs. It was not always clear that observational studies added value in light of the additional resources needed to search for, select, appraise, and analyze such studies. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. Qualitative methods in PhD theses from general practice in Scandinavia.

    PubMed

    Malterud, Kirsti; Hamberg, Katarina; Reventlow, Susanne

    2017-12-01

    Qualitative methodology is gaining increasing attention and esteem in medical research, with general practice research taking a lead. With these methods, human and social interaction and meaning can be explored and shared by systematic interpretation of text from talk, observation or video. Qualitative studies are often included in Ph.D. theses from general practice in Scandinavia. Still, the Ph.D. programs across nations and institutions offer only limited training in qualitative methods. In this opinion article, we draw upon our observations and experiences, unpacking and reflecting upon values and challenges at stake when qualitative studies are included in Ph.D. theses. Hypotheses to explain these observations are presented, followed by suggestions for standards of evaluation and improvement of Ph.D. programs. The authors conclude that multimethod Ph.D. theses should be encouraged in general practice research, in order to offer future researchers an appropriate toolbox.

  3. Qualitative methods in PhD theses from general practice in Scandinavia

    PubMed Central

    Malterud, Kirsti; Hamberg, Katarina; Reventlow, Susanne

    2017-01-01

    Qualitative methodology is gaining increasing attention and esteem in medical research, with general practice research taking a lead. With these methods, human and social interaction and meaning can be explored and shared by systematic interpretation of text from talk, observation or video. Qualitative studies are often included in Ph.D. theses from general practice in Scandinavia. Still, the Ph.D. programs across nations and institutions offer only limited training in qualitative methods. In this opinion article, we draw upon our observations and experiences, unpacking and reflecting upon values and challenges at stake when qualitative studies are included in Ph.D. theses. Hypotheses to explain these observations are presented, followed by suggestions for standards of evaluation and improvement of Ph.D. programs. The authors conclude that multimethod Ph.D. theses should be encouraged in general practice research, in order to offer future researchers an appropriate toolbox. PMID:29094644

  4. System and method for anomaly detection

    DOEpatents

    Scherrer, Chad

    2010-06-15

    A system and method for detecting one or more anomalies in a plurality of observations is provided. In one illustrative embodiment, the observations are real-time network observations collected from a stream of network traffic. The method includes performing a discrete decomposition of the observations, and introducing derived variables to increase storage and query efficiencies. A mathematical model, such as a conditional independence model, is then generated from the formatted data. The formatted data is also used to construct frequency tables which maintain an accurate count of specific variable occurrence as indicated by the model generation process. The formatted data is then applied to the mathematical model to generate scored data. The scored data is then analyzed to detect anomalies.
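    The frequency-table scoring under a conditional-independence model can be sketched as follows; the traffic pairs, smoothing constant, and scoring form are illustrative assumptions, not the patented system:

    ```python
    import math
    from collections import Counter

    # Training observations: (protocol, port) pairs from "normal" traffic (assumed data)
    normal = [("tcp", 80)] * 500 + [("tcp", 443)] * 400 + [("udp", 53)] * 100

    # Frequency tables for the model P(proto) * P(port | proto)
    proto_counts = Counter(p for p, _ in normal)
    pair_counts = Counter(normal)
    n = len(normal)

    def score(proto, port, alpha=0.5):
        """Anomaly score = -log P(proto, port) under the model; Laplace
        smoothing keeps unseen combinations finite rather than infinite."""
        p_proto = (proto_counts[proto] + alpha) / (n + 2 * alpha)
        p_port = (pair_counts[(proto, port)] + alpha) / (proto_counts[proto] + 10 * alpha)
        return -math.log(p_proto * p_port)

    # Rare combinations receive high scores and surface as anomalies
    print(score("tcp", 80) < score("udp", 6667))
    ```

    Maintaining only the counts the model needs, rather than raw observations, is what gives the frequency-table approach its storage and query efficiency on streaming traffic.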

  5. Review on pen-and-paper-based observational methods for assessing ergonomic risk factors of computer work.

    PubMed

    Rahman, Mohd Nasrull Abdol; Mohamad, Siti Shafika

    2017-01-01

    Computer work is associated with musculoskeletal disorders (MSDs). Several methods have been developed to assess computer-work risk factors related to MSDs. This review aims to give an overview of current pen-and-paper-based observational techniques for assessing ergonomic risk factors of computer work. We searched an electronic database for materials from 1992 until 2015. The selected methods focused on computer work, pen-and-paper observational methods, office risk factors, and musculoskeletal disorders. This review was developed to assess the risk factors, reliability, and validity of pen-and-paper observational methods associated with computer work. Two evaluators independently carried out this review. Seven observational methods used to assess exposure to office risk factors for work-related musculoskeletal disorders were identified. The risk factors covered by current pen-and-paper-based observational tools were postures, office components, force, and repetition. Of the seven methods, only five had been tested for reliability; these were proven reliable and were rated as moderate to good. For validity, only four of the seven methods had been tested, with moderate results. Many observational tools already exist, but no single tool appears to cover all of the risk factors, including working posture, office components, force, repetition, and the office environment, at office workstations and computer work. Although proper validation of exposure assessment techniques is the most important factor in developing a tool, not all existing observational methods have been tested for reliability and validity. Furthermore, this review provides researchers with ways to improve pen-and-paper-based observational methods for assessing ergonomic risk factors of computer work.

  6. Method and apparatus for displaying information

    NASA Technical Reports Server (NTRS)

    Huang, Sui (Inventor); Eichler, Gabriel (Inventor); Ingber, Donald E. (Inventor)

    2010-01-01

    A method for displaying large amounts of information. The method includes the steps of forming a spatial layout of tiles each corresponding to a representative reference element; mapping observed elements onto the spatial layout of tiles of representative reference elements; assigning a respective value to each respective tile of the spatial layout of the representative elements; and displaying an image of the spatial layout of tiles of representative elements. Each tile includes atomic attributes of representative elements. The invention also relates to an apparatus for displaying large amounts of information. The apparatus includes a tiler forming a spatial layout of tiles, each corresponding to a representative reference element; a comparator mapping observed elements onto said spatial layout of tiles of representative reference elements; an assigner assigning a respective value to each respective tile of said spatial layout of representative reference elements; and a display displaying an image of the spatial layout of tiles of representative reference elements.
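    The tiler/comparator/assigner pipeline can be sketched in one dimension; the tile representatives and observed elements below are assumed for illustration:

    ```python
    import numpy as np

    # Tiler: reference tiles on a 1-D strip, each with a representative value
    tiles = np.array([0.0, 1.0, 2.0, 3.0, 4.0])

    # Comparator: map each observed element onto its nearest representative tile
    observed = np.array([0.2, 0.9, 1.1, 3.8, 4.2])
    assignment = np.argmin(np.abs(observed[:, None] - tiles[None, :]), axis=1)

    # Assigner: give each tile a display value (here, a count of mapped elements)
    values = np.bincount(assignment, minlength=tiles.size)
    print(values.tolist())  # [1, 2, 0, 0, 2]
    ```

    The display step then renders `values` as an image; with many tiles this compresses a large element set into a fixed spatial layout.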

  7. [Essential characteristics of qualitative research and its commonly used methods].

    PubMed

    Zhang, Hong-wei

    2008-02-01

    The main objective of qualitative research lies in exploring the opinions, attitudes, behavior, and experience of a person as a social actor, including as a patient. This essay introduces the basic characteristics of qualitative research, including its naturalistic character, its inductive method, its openness, and its holistic perspective; the results of qualitative research are presented in text form, and its commonly used methods include observation, individual interviews, and focus group discussions.

  8. Shoreline Activities.

    ERIC Educational Resources Information Center

    Stout, Prentice K.

    1980-01-01

    Described is a method of photographically recording the seasonal movements of the beach, and the equipment and details used in this method. Included is a method for determining current speed and instruction for construction of a tin can viewing box to be used in underwater sea form observations. (Author/DS)

  9. Angiographic Method Using Green Porphyrins in Primate Eyes

    DOEpatents

    Miller, Joan W.; Young, Lucy H.Y.; Gragoudas, Evangelos S.

    1998-01-13

    An angiographic method to observe the condition of blood vessels, including neovasculature in the eyes of living primates using green porphyrins and light at a wavelength of 550-700 nm to effect fluorescence is disclosed.

  10. Low vacuum scanning electron microscopy for paraffin sections utilizing the differential stainability of cells and tissues with platinum blue.

    PubMed

    Inaga, Sumire; Hirashima, Sayuri; Tanaka, Keiichi; Katsumoto, Tetsuo; Kameie, Toshio; Nakane, Hironobu; Naguro, Tomonori

    2009-07-01

    The present study introduces a novel method for the direct observation of histological paraffin sections by low vacuum scanning electron microscopy (LVSEM) with platinum blue (Pt-blue) treatment. Pt-blue was applied not only as a backscattered electron (BSE) signal enhancer but also as a histologically specific stain. In this method, paraffin sections of the rat tongue prepared for conventional light microscopy (LM) were stained on glass slides with a Pt-blue staining solution (pH 9) and observed in a LVSEM using BSE detector. Under LVSEM, overviews of whole sections as well as three-dimensional detailed observations of individual cells and tissues could be easily made at magnifications from x40 to x10,000. Each kind of cell and tissue observed in the section could be clearly distinguished due to the different yields of BSE signals, which depended on the surface structures and different affinities to Pt-blue. Thus, we roughly classified cellular and tissue components into three groups according to the staining intensity of Pt-blue observed by LM and LVSEM: 1) a strongly stained (deep blue by LM and brightest by LVSEM) group which included epithelial tissue, endothelium and mast cells; 2) a moderately stained (light blue and bright) group which included muscular tissue and nervous tissue; 3) an unstained or weakly stained (colorless and dark) group which included elastic fibers and collagen fibers. We expect that this method will prove useful for the three-dimensional direct observation of histological paraffin sections of various tissues by LVSEM with higher resolutions than LM.

  11. Determining characteristics of artificial near-Earth objects using observability analysis

    NASA Astrophysics Data System (ADS)

    Friedman, Alex M.; Frueh, Carolin

    2018-03-01

    Observability analysis is a method for determining whether a chosen state of a system can be determined from the output or measurements. Knowledge of state information availability resulting from observability analysis leads to improved sensor tasking for observation of orbital debris and better control of active spacecraft. This research performs numerical observability analysis of artificial near-Earth objects. Analysis of linearization methods and state transition matrices is performed to determine the viability of applying linear observability methods to the nonlinear orbit problem. Furthermore, pre-whitening is implemented to reformulate classical observability analysis. In addition, the state in observability analysis is typically composed of position and velocity; however, including object characteristics beyond position and velocity can be crucial for precise orbit propagation. For example, solar radiation pressure has a significant impact on the orbit of high area-to-mass ratio objects in geosynchronous orbit. Therefore, determining the time required for solar radiation pressure parameters to become observable is important for understanding debris objects. In order to compare observability analysis results with and without measurement noise and an extended state, quantitative measures of observability are investigated and implemented.
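    Linear observability analysis of the kind discussed reduces to a rank test on the observability matrix; the double-integrator example below is illustrative, not the orbit dynamics studied in the paper:

    ```python
    import numpy as np

    def observability_matrix(A, C):
        """Stack [C; CA; CA^2; ...]: full column rank means every state can be
        reconstructed from the measurements."""
        n = A.shape[0]
        return np.vstack([C @ np.linalg.matrix_power(A, k) for k in range(n)])

    # Double integrator: state = [position, velocity]
    A = np.array([[0.0, 1.0], [0.0, 0.0]])
    C_pos = np.array([[1.0, 0.0]])   # measure position
    C_vel = np.array([[0.0, 1.0]])   # measure velocity

    rank_pos = np.linalg.matrix_rank(observability_matrix(A, C_pos))
    rank_vel = np.linalg.matrix_rank(observability_matrix(A, C_vel))
    print(rank_pos, rank_vel)  # 2 (observable) vs 1 (position unobservable)
    ```

    Extending the state beyond position and velocity, as the abstract describes for solar radiation pressure parameters, adds columns to this test; the question is then how long an observation arc is needed before the extended matrix reaches full rank.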

  12. Modification of the BAX Salmonella test kit to include a hot start functionality (modification of AOAC Official Method 2003.09).

    PubMed

    Wallace, F Morgan; DiCosimo, Deana; Farnum, Andrew; Tice, George; Andaloro, Bridget; Davis, Eugene; Burns, Frank R

    2011-01-01

    In 2010, the BAX System PCR assay for Salmonella was modified to include a hot start functionality designed to keep the reaction enzyme inactive until PCR begins. To extend the assay's Official Methods of Analysis status to include this procedural modification, an evaluation was conducted on four food types that were simultaneously analyzed with the BAX System and either the U.S. Food and Drug Administration's Bacteriological Analytical Manual or the U.S. Department of Agriculture-Food Safety and Inspection Service Microbiology Laboratory Guidebook reference method for detecting Salmonella. Identical performance between the BAX System method and the reference methods was observed. Additionally, lysates were analyzed using both the BAX System Classic and BAX System Q7 instruments with identical results using both platforms for all samples tested. Of the 100 samples analyzed, 34 samples were positive for both the BAX System and reference methods, and 66 samples were negative by both the BAX System and reference methods, demonstrating 100% correlation. No instrument platform variation was observed. Additional inclusivity and exclusivity testing using the modified test kit demonstrated it to be 100% accurate in evaluating test panels of 352 Salmonella strains and 46 non-Salmonella strains.

  13. Forest Herbicide Washoff From Foliar Applications

    Treesearch

    J.L. Michael; Kevin L. Talley; H.C. Fishburn

    1992-01-01

    Field and laboratory experiments were conducted to develop and test methods for determining washoff of foliar applied herbicides typically used in forestry in the South. Preliminary results show good agreement between results of the laboratory methods used and observations from field experiments during actual precipitation events. Methods included application of...

  14. A Simple Experimental Demonstration of Microbial Growth and Interaction.

    ERIC Educational Resources Information Center

    Wainwright, Milton

    1988-01-01

    Described is a simple, safe, and inexpensive experiment which allows secondary school pupils to observe how fungi and bacteria grow and interact with each other. Included are discussions of materials, methods, observations, and a historical comment. (Author/CW)

  15. Measurement of RF lightning emissions

    NASA Technical Reports Server (NTRS)

    Lott, G. K., Jr.; Honnell, M. A.; Shumpert, T. H.

    1981-01-01

    A lightning radio emission observation laboratory is described. The signals observed and recorded include HF, VHF and UHF radio emissions, optical signature, electric field measurements, and thunder. The objectives of the station, the equipment used, and the recording methods are discussed.

  16. The ratio method: A new tool to study one-neutron halo nuclei

    DOE PAGES

    Capel, Pierre; Johnson, R. C.; Nunes, F. M.

    2013-10-02

    Recently a new observable to study halo nuclei was introduced, based on the ratio between breakup and elastic angular cross sections. This new observable is shown by the analysis of specific reactions to be independent of the reaction mechanism and to provide nuclear-structure information of the projectile. Here we explore the details of this ratio method, including the sensitivity to binding energy and angular momentum of the projectile. We also study the reliability of the method as a function of breakup energy. Lastly, we provide guidelines and specific examples for experimentalists who wish to apply this method.

  17. A Direct Method for Viewing Ferromagnetic Phase Transition.

    ERIC Educational Resources Information Center

    Lue, Chin-Shan

    1994-01-01

    Provides a method, using the Rowland ring as a specimen, to observe the phase transition process directly on the oscilloscope and even extract the critical exponent of ferromagnetic transition. Includes theory, experimental setup, and results. (MVL)

  18. The Association between Observed Parental Emotion Socialization and Adolescent Self-Medication

    ERIC Educational Resources Information Center

    Hersh, Matthew A.; Hussong, Andrea M.

    2009-01-01

    The current study examined the moderating influence of observed parental emotion socialization (PES) on self-medication in adolescents. Strengths of the study include the use of a newly developed observational coding system further extending the study of PES to adolescence, the use of an experience sampling method to assess the daily covariation…

  19. Conceptual and statistical issues in couples observational research: Rationale and methods for design decisions.

    PubMed

    Baucom, Brian R W; Leo, Karena; Adamo, Colin; Georgiou, Panayiotis; Baucom, Katherine J W

    2017-12-01

    Observational behavioral coding methods are widely used for the study of relational phenomena. There are numerous guidelines for the development and implementation of these methods that include principles for creating new and adapting existing coding systems as well as principles for creating coding teams. While these principles have been successfully implemented in research on relational phenomena, the ever expanding array of phenomena being investigated with observational methods calls for a similar expansion of these principles. Specifically, guidelines are needed for decisions that arise in current areas of emphasis in couple research including observational investigation of related outcomes (e.g., relationship distress and psychological symptoms), the study of change in behavior over time, and the study of group similarities and differences in the enactment and perception of behavior. This article describes conceptual and statistical considerations involved in these 3 areas of research and presents principle- and empirically based rationale for design decisions related to these issues. A unifying principle underlying these guidelines is the need for careful consideration of fit between theory, research questions, selection of coding systems, and creation of coding teams. Implications of (mis)fit for the advancement of theory are discussed.

  20. 77 FR 40866 - Applications for New Awards; Innovative Approaches to Literacy Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-11

    ... supported by the methods that have been employed. The term includes, appropriate to the research being... observational methods that provide reliable data; (iv) making claims of causal relationships only in random...; and (vii) using research designs and methods appropriate to the research question posed...

  1. 75 FR 13745 - Office of Innovation and Improvement Overview Information; Ready To Teach Program-General...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-23

    ... on rigorous, scientifically based research methods to assess the effectiveness of a particular... and programs; and (B) Includes research that-- (i) Employs systematic, empirical methods that draw on... hypotheses and justify the general conclusions drawn; (iii) Relies on measurements or observational methods...

  2. Luminosity and astrometry of comets: A review

    NASA Technical Reports Server (NTRS)

    Roemer, E.

    1976-01-01

    Visual and photographic observations of the brightness of comets are reviewed including methods and sources of errors. Nuclear magnitude estimates are discussed and interpreted in relation to determination of appropriate exposure times for photographic observations. The importance of brightness ephemerides is emphasized.

  3. Classical methods and modern analysis for studying fungal diversity

    Treesearch

    John Paul Schmit

    2005-01-01

    In this chapter, we examine the use of classical methods to study fungal diversity. Classical methods rely on the direct observation of fungi, rather than sampling fungal DNA. We summarize a wide variety of classical methods, including direct sampling of fungal fruiting bodies, incubation of substrata in moist chambers, culturing of endophytes, and particle plating. We...

  4. Classical Methods and Modern Analysis for Studying Fungal Diversity

    Treesearch

    J. P. Schmit; D. J. Lodge

    2005-01-01

    In this chapter, we examine the use of classical methods to study fungal diversity. Classical methods rely on the direct observation of fungi, rather than sampling fungal DNA. We summarize a wide variety of classical methods, including direct sampling of fungal fruiting bodies, incubation of substrata in moist chambers, culturing of endophytes, and particle plating. We...

  5. The assessment of cognitive errors using an observer-rated method.

    PubMed

    Drapeau, Martin

    2014-01-01

    Cognitive Errors (CEs) are a key construct in cognitive behavioral therapy (CBT). Integral to CBT is that individuals with depression process information in an overly negative or biased way, and that this bias is reflected in specific depressotypic CEs which are distinct from normal information processing. Despite the importance of this construct in CBT theory, practice, and research, few methods are available to researchers and clinicians to reliably identify CEs as they occur. In this paper, the author presents a rating system, the Cognitive Error Rating Scale, which can be used by trained observers to identify and assess the cognitive errors of patients or research participants in vivo, i.e., as they are used or reported by the patients or participants. The method is described, including some of the more important rating conventions to be considered when using the method. This paper also describes the 15 cognitive errors assessed, and the different summary scores, including valence of the CEs, that can be derived from the method.

  6. Reporting quality of statistical methods in surgical observational studies: protocol for systematic review.

    PubMed

    Wu, Robert; Glen, Peter; Ramsay, Tim; Martel, Guillaume

    2014-06-28

    Observational studies dominate the surgical literature. Statistical adjustment is an important strategy to account for confounders in observational studies. Research has shown that published articles are often poor in statistical quality, which may jeopardize their conclusions. The Statistical Analyses and Methods in the Published Literature (SAMPL) guidelines have been published to help establish standards for statistical reporting. This study will seek to determine whether the quality of statistical adjustment and the reporting of these methods are adequate in surgical observational studies. We hypothesize that incomplete reporting will be found in all surgical observational studies, and that the quality and reporting of these methods will be of lower quality in surgical journals when compared with medical journals. Finally, this work will seek to identify predictors of high-quality reporting. This work will examine the top five general surgical and medical journals, based on a 5-year impact factor (2007-2012). All observational studies investigating an intervention related to an essential component area of general surgery (defined by the American Board of Surgery), with an exposure, outcome, and comparator, will be included in this systematic review. Essential elements related to statistical reporting and quality were extracted from the SAMPL guidelines and include domains such as intent of analysis, primary analysis, multiple comparisons, numbers and descriptive statistics, association and correlation analyses, linear regression, logistic regression, Cox proportional hazard analysis, analysis of variance, survival analysis, propensity analysis, and independent and correlated analyses. Each article will be scored as a proportion based on fulfilling criteria in relevant analyses used in the study. A logistic regression model will be built to identify variables associated with high-quality reporting.
A comparison will be made between the scores of surgical observational studies published in medical versus surgical journals. Secondary outcomes will pertain to individual domains of analysis. Sensitivity analyses will be conducted. This study will explore the reporting and quality of statistical analyses in surgical observational studies published in the most referenced surgical and medical journals in 2013 and examine whether variables (including the type of journal) can predict high-quality reporting.

  7. Position and speed control of brushless DC motors using sensorless techniques and application trends.

    PubMed

    Gamazo-Real, José Carlos; Vázquez-Sánchez, Ernesto; Gómez-Gil, Jaime

    2010-01-01

    This paper provides a technical review of position and speed sensorless methods for controlling Brushless Direct Current (BLDC) motor drives, including background on sensor-based control, its limitations, and recent advances. The performance and reliability of BLDC motor drives have improved as conventional control and sensing techniques have been supplemented by sensorless technology. Sensorless advances are reviewed and recent developments in this area are introduced with their inherent advantages and drawbacks, including the analysis of practical implementation issues and applications. The study includes a thorough overview of state-of-the-art back-EMF sensing methods, including Terminal Voltage Sensing, Third Harmonic Voltage Integration, Terminal Current Sensing, Back-EMF Integration and PWM strategies. The most relevant techniques based on estimation and models are also briefly analysed, such as the Sliding-mode Observer, Extended Kalman Filter, Model Reference Adaptive System, adaptive observers (full-order and pseudo-reduced-order) and Artificial Neural Networks.
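As a loose illustration of the back-EMF sensing idea this review surveys, the sketch below detects zero crossings of an idealized sinusoidal phase back-EMF; real sensorless drives typically schedule commutation about 30 electrical degrees after each crossing. The waveform and sample rate are invented for the example and stand in for sampled terminal-voltage measurements:

```python
import math

def zero_crossings(samples):
    """Indices i where the signal changes sign between samples i-1 and i."""
    return [i for i in range(1, len(samples)) if samples[i - 1] * samples[i] < 0.0]

# one electrical cycle of an idealized sinusoidal phase back-EMF, sampled
# once per electrical degree (offset by half a degree to avoid exact zeros)
emf = [math.sin(math.radians(d + 0.5)) for d in range(360)]
crossings = zero_crossings(emf)
# a sensorless drive would schedule the next commutation event roughly
# 30 electrical degrees after each detected zero crossing
```

In hardware this detection is done against the motor's neutral point (or a virtual neutral), and noise filtering around the crossing is essential; none of that is modeled here.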

  8. Systematic review of methods used in meta-analyses where a primary outcome is an adverse or unintended event

    PubMed Central

    2012-01-01

    Background Adverse consequences of medical interventions are a source of concern, but clinical trials may lack power to detect elevated rates of such events, while observational studies have inherent limitations. Meta-analysis allows the combination of individual studies, which can increase power and provide stronger evidence relating to adverse events. However, meta-analysis of adverse events has associated methodological challenges. The aim of this study was to systematically identify and review the methodology used in meta-analyses where a primary outcome is an adverse or unintended event, following a therapeutic intervention. Methods Using a collection of reviews identified previously, 166 references that included a meta-analysis were selected for review. At least one of the primary outcomes in each review was an adverse or unintended event. The nature of the intervention, source of funding, number of individual meta-analyses performed, number of primary studies included in the review, and use of meta-analytic methods were all recorded. Specific areas of interest relating to the methods used included the choice of outcome metric, methods of dealing with sparse events, heterogeneity, publication bias and use of individual patient data. Results The 166 included reviews were published between 1994 and 2006. Interventions included drugs and surgery among other interventions. Many of the references being reviewed included multiple meta-analyses, with 44.6% (74/166) including more than ten. Randomised trials only were included in 42.2% of meta-analyses (70/166), observational studies only in 33.7% (56/166) and a mix of observational studies and trials in 15.7% (26/166). Sparse data, in the form of zero events in one or both arms where the outcome was a count of events, was found in 64 reviews of two-arm studies, of which 41 (64.1%) had zero events in both arms.
Conclusions Meta-analyses of adverse events data are common and useful in terms of increasing the power to detect an association with an intervention, especially when the events are infrequent. However, with regard to existing meta-analyses, a wide variety of different methods have been employed, often with no evident rationale for using a particular approach. More specifically, the approach to dealing with zero events varies, and guidelines on this issue would be desirable. PMID:22553987
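One common (though debated) way to handle the zero-event cells this review highlights is a continuity correction: add 0.5 to every cell of the 2x2 table before computing the odds ratio, so the estimate and its variance are defined. A minimal sketch with made-up counts; this is one of several possible approaches, not the guideline the authors call for:

```python
import math

def odds_ratio(a, b, c, d, correction=0.5):
    """Odds ratio with a 95% CI for a 2x2 table (a, b = events/non-events in
    one arm; c, d = in the other). When any cell is zero, a continuity
    correction is added to every cell so log(OR) and its SE are defined."""
    if 0 in (a, b, c, d):
        a, b, c, d = (x + correction for x in (a, b, c, d))
    est = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of the log odds ratio
    lo = math.exp(math.log(est) - 1.96 * se)
    hi = math.exp(math.log(est) + 1.96 * se)
    return est, (lo, hi)

# a sparse table: zero events in the treatment arm, three in control
est, ci = odds_ratio(0, 100, 3, 97)
```

Alternatives that avoid corrections altogether (exact methods, Peto odds ratios, beta-binomial models) exist, which is precisely why the review calls for guidance on the choice.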

  9. HIGH-PRECISION ASTROMETRIC MILLIMETER VERY LONG BASELINE INTERFEROMETRY USING A NEW METHOD FOR ATMOSPHERIC CALIBRATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rioja, M.; Dodson, R., E-mail: maria.rioja@icrar.org

    2011-04-15

    We describe a new method which achieves high-precision very long baseline interferometry (VLBI) astrometry in observations at millimeter (mm) wavelengths. It combines fast frequency-switching observations, to correct for the dominant non-dispersive tropospheric fluctuations, with slow source-switching observations, for the remaining ionospheric dispersive terms. We call this method source-frequency phase referencing. Provided that the switching cycles match the properties of the propagation media, one can recover the source astrometry. We present an analytic description of the two-step calibration strategy, along with an error analysis to characterize its performance. Also, we provide observational demonstrations of a successful application with observations using the Very Long Baseline Array at 86 GHz of the source pairs 3C274/3C273 and 1308+326/1308+328 under various conditions. We conclude that this method is widely applicable to mm-VLBI observations of many target sources, and unique in providing bona fide astrometrically registered images and high-precision relative astrometric measurements in mm-VLBI using existing and newly built instruments, including space VLBI.
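The cancellation at the heart of the first calibration step can be shown with a toy phase model: because the tropospheric term is non-dispersive (it scales linearly with frequency), subtracting the low-frequency phase scaled by the frequency ratio R removes it exactly, leaving the dispersive ionospheric residual for the slower source-switching step. All coefficients below are invented for illustration and are not the paper's error model:

```python
import math

def phase(nu, trop, ion=5.0e16):
    """Toy interferometric phase (radians) at frequency nu (Hz): a
    non-dispersive tropospheric term (scales with nu) plus a dispersive
    ionospheric term (scales with 1/nu). Coefficients are invented."""
    return 2.0 * math.pi * nu * trop + ion / nu

nu_lo, nu_hi = 43e9, 86e9
R = nu_hi / nu_lo          # frequency ratio (2 for a 43 -> 86 GHz pair)

def sfpr_residual(trop):
    """Frequency phase transfer: high-band phase minus R times low-band phase."""
    return phase(nu_hi, trop) - R * phase(nu_lo, trop)

# the tropospheric term cancels regardless of its fluctuating value, while a
# dispersive ionospheric residual survives for the source-switching step
r_calm = sfpr_residual(2.0e-9)
r_stormy = sfpr_residual(9.0e-9)
```

In practice the fast switching must be quicker than the tropospheric coherence time for the transferred phase to be valid, which is the "switching cycles match the propagation media" condition in the abstract.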

  10. WFIRST: Microlensing Parallax Observations from K2 in the Exoplanet Microlensing Field

    NASA Astrophysics Data System (ADS)

    Ranc, Clement; Poleski, Radek; Bennett, David; K2C9 Microlensing Science Experiment Team

    2018-01-01

    The recent explosion in our understanding of exoplanetary systems has been driven primarily by the Kepler mission, which has replaced radial velocities as our main planet discovery method. While Kepler has provided a large sample of planets that will allow a robust statistical determination of the properties of exoplanets in close orbits about their host stars, the Kepler mission was stopped shortly after the start of its 5th year. This led to the Kepler 2 (K2) mission, which could observe up to 18 different fields in the ecliptic plane, including a fraction of the WFIRST microlensing field. The K2 mission has focused on lower mass host stars and spent one observing campaign in the Galactic bulge to make use of Kepler’s orbit to determine the masses and distances to microlensing systems via the microlensing parallax effect. These K2 Campaign 9 observations help to develop the microlensing planet detection method, which will be employed by the WFIRST mission that will extend the statistical census of exoplanets to include low-mass planets in wide orbits. While the photometric light curve of a microlensing event observed from the ground provides important constraints on the lens physical parameters, in many cases the lens mass and distance from Earth remain degenerate. The poster will show how simultaneous space- and ground-based observations can break this mass-distance degeneracy. This method will be used for a fraction of the events observed by WFIRST. Finally, the poster will present a new method to correct the K2 photometry for correlated systematic noise. This investigation helps in characterizing the properties of the lens stars and source stars in one WFIRST field with high extinction.

  11. Comparing WSA coronal and solar wind model predictions driven by line-of-sight and vector HMI ADAPT maps

    NASA Astrophysics Data System (ADS)

    Arge, C. N.; Henney, C. J.; Shurkin, K.; Wallace, S.

    2017-12-01

    As the primary input to nearly all coronal models, reliable estimates of the global solar photospheric magnetic field distribution are critical for accurate modeling and understanding of solar and heliospheric magnetic fields. The Air Force Data Assimilative Photospheric flux Transport (ADAPT) model generates synchronic (i.e., globally instantaneous) maps by evolving observed solar magnetic flux using relatively well understood transport processes when measurements are not available and then updating modeled flux with new observations (available from both the Earth and the far-side of the Sun) using data assimilation methods that rigorously take into account model and observational uncertainties. ADAPT is capable of assimilating line-of-sight and vector magnetic field data from all observatory sources including the expected photospheric vector magnetograms from the Polarimetric and Helioseismic Imager (PHI) on the Solar Orbiter, as well as those generated using helioseismic methods. This paper compares Wang-Sheeley-Arge (WSA) coronal and solar wind modeling results at Earth and STEREO A & B using ADAPT input model maps derived from both line-of-sight and vector SDO/HMI magnetograms that include methods for incorporating observations of a large, newly emerged (July 2010) far-side active region (AR11087).

  12. A Longitudinal Examination of Agitation and Resident Characteristics in the Nursing Home

    ERIC Educational Resources Information Center

    Burgio, Louis D.; Park, Nan Sook; Hardin, J. Michael; Sun, Fei

    2007-01-01

    Purpose: Agitation frequently accompanies cognitive decline among nursing home residents. This study used cross-sectional and longitudinal (up to 18 months) methods to examine agitation among profoundly and moderately impaired residents using both staff report and direct observation methods. Design and Methods: The study included participants (N =…

  13. Monte Carlo Bayesian inference on a statistical model of sub-gridcolumn moisture variability using high-resolution cloud observations. Part 1: Method.

    PubMed

    Norris, Peter M; da Silva, Arlindo M

    2016-07-01

    A method is presented to constrain a statistical model of sub-gridcolumn moisture variability using high-resolution satellite cloud data. The method can be used for large-scale model parameter estimation or cloud data assimilation. The gridcolumn model includes assumed probability density function (PDF) intra-layer horizontal variability and a copula-based inter-layer correlation model. The observables used in the current study are Moderate Resolution Imaging Spectroradiometer (MODIS) cloud-top pressure, brightness temperature and cloud optical thickness, but the method should be extensible to direct cloudy radiance assimilation for a small number of channels. The algorithm is a form of Bayesian inference with a Markov chain Monte Carlo (MCMC) approach to characterizing the posterior distribution. This approach is especially useful in cases where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach is not gradient-based and allows jumps into regions of non-zero cloud probability. The current study uses a skewed-triangle distribution for layer moisture. The article also includes a discussion of the Metropolis and multiple-try Metropolis versions of MCMC.
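The Metropolis kernel this article discusses can be sketched independently of the gridcolumn moisture model. Below, a random-walk Metropolis sampler targets a skewed triangular density, standing in for the skewed-triangle layer-moisture distribution; the density parameters, step size and sample count are illustrative only, and nothing of the copula or MODIS machinery is modeled:

```python
import random

def triangle_pdf(x, a=0.0, c=0.2, b=1.0):
    """Skewed triangular density on [a, b] with mode c (normalization is
    irrelevant for Metropolis, which only uses density ratios)."""
    if x < a or x > b:
        return 0.0
    return (x - a) / (c - a) if x <= c else (b - x) / (b - c)

def metropolis(pdf, n, x0=0.5, step=0.2, seed=1):
    """Random-walk Metropolis: propose x + U(-step, step), accept with
    probability min(1, pdf(proposal) / pdf(current))."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n):
        prop = x + rng.uniform(-step, step)
        if rng.random() < pdf(prop) / pdf(x):
            x = prop
        samples.append(x)
    return samples

samples = metropolis(triangle_pdf, 20000)
est_mean = sum(samples) / len(samples)
# the true mean of triangle(0, 0.2, 1) is (0 + 0.2 + 1) / 3 = 0.4
```

Because acceptance depends only on density ratios, the chain can jump from a zero-cloud state into regions of non-zero cloud probability, which is exactly the advantage over gradient-based assimilation that the abstract emphasizes.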

  14. Monte Carlo Bayesian Inference on a Statistical Model of Sub-Gridcolumn Moisture Variability Using High-Resolution Cloud Observations. Part 1: Method

    NASA Technical Reports Server (NTRS)

    Norris, Peter M.; Da Silva, Arlindo M.

    2016-01-01

    A method is presented to constrain a statistical model of sub-gridcolumn moisture variability using high-resolution satellite cloud data. The method can be used for large-scale model parameter estimation or cloud data assimilation. The gridcolumn model includes assumed probability density function (PDF) intra-layer horizontal variability and a copula-based inter-layer correlation model. The observables used in the current study are Moderate Resolution Imaging Spectroradiometer (MODIS) cloud-top pressure, brightness temperature and cloud optical thickness, but the method should be extensible to direct cloudy radiance assimilation for a small number of channels. The algorithm is a form of Bayesian inference with a Markov chain Monte Carlo (MCMC) approach to characterizing the posterior distribution. This approach is especially useful in cases where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach is not gradient-based and allows jumps into regions of non-zero cloud probability. The current study uses a skewed-triangle distribution for layer moisture. The article also includes a discussion of the Metropolis and multiple-try Metropolis versions of MCMC.

  15. Monte Carlo Bayesian inference on a statistical model of sub-gridcolumn moisture variability using high-resolution cloud observations. Part 1: Method

    PubMed Central

    Norris, Peter M.; da Silva, Arlindo M.

    2018-01-01

    A method is presented to constrain a statistical model of sub-gridcolumn moisture variability using high-resolution satellite cloud data. The method can be used for large-scale model parameter estimation or cloud data assimilation. The gridcolumn model includes assumed probability density function (PDF) intra-layer horizontal variability and a copula-based inter-layer correlation model. The observables used in the current study are Moderate Resolution Imaging Spectroradiometer (MODIS) cloud-top pressure, brightness temperature and cloud optical thickness, but the method should be extensible to direct cloudy radiance assimilation for a small number of channels. The algorithm is a form of Bayesian inference with a Markov chain Monte Carlo (MCMC) approach to characterizing the posterior distribution. This approach is especially useful in cases where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach is not gradient-based and allows jumps into regions of non-zero cloud probability. The current study uses a skewed-triangle distribution for layer moisture. The article also includes a discussion of the Metropolis and multiple-try Metropolis versions of MCMC. PMID:29618847

  16. Cox regression analysis with missing covariates via nonparametric multiple imputation.

    PubMed

    Hsu, Chiu-Hsieh; Yu, Mandi

    2018-01-01

    We consider the situation of estimating Cox regression in which some covariates are subject to missingness, and there exists additional information (including observed event time, censoring indicator and fully observed covariates) which may be predictive of the missing covariates. We propose to use two working regression models: one for predicting the missing covariates and the other for predicting the missing probabilities. For each missing covariate observation, these two working models are used to define a nearest neighbor imputing set. This set is then used to non-parametrically impute covariate values for the missing observation. Upon the completion of imputation, Cox regression is performed on the multiply imputed datasets to estimate the regression coefficients. In a simulation study, we compare the nonparametric multiple imputation approach with the augmented inverse probability weighted (AIPW) method, which directly incorporates the two working models into estimation of Cox regression, and the predictive mean matching imputation (PMM) method. We show that all approaches can reduce bias due to a non-ignorable missingness mechanism. The proposed nonparametric imputation method is robust to misspecification of either one of the two working models and robust to misspecification of the link function of the two working models. In contrast, the PMM method is sensitive to misspecification of the covariates included in imputation. The AIPW method is sensitive to the selection probability. We apply the approaches to a breast cancer dataset from the Surveillance, Epidemiology and End Results (SEER) Program.
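The nearest-neighbor imputing-set idea can be sketched in a few lines. This is not the authors' procedure (which combines two working models and operates within Cox regression); it is a toy single-score version with invented data, just to show how an imputing set is formed and drawn from:

```python
import random

def nn_impute(observed, scores, score_missing, k=5, rng=random):
    """Build a nearest-neighbor imputing set: rank complete cases by how close
    their working-model predictive score is to the missing case's score, then
    draw at random from the k nearest observed covariate values."""
    ranked = sorted(zip(scores, observed), key=lambda p: abs(p[0] - score_missing))
    imputing_set = [value for _, value in ranked[:k]]
    return rng.choice(imputing_set)

# toy complete cases: an observed covariate value and a predictive score from
# a (hypothetical) working regression model for each case
observed = [1.0, 1.2, 3.1, 3.0, 2.9, 1.1]
scores = [1.1, 1.2, 3.0, 3.1, 2.8, 1.0]

random.seed(0)
# multiple imputation: repeat the random draw so that imputation
# uncertainty propagates into the final variance estimates
draws = [nn_impute(observed, scores, score_missing=3.05, k=3) for _ in range(10)]
```

Drawing from neighbors' observed values, rather than plugging in a single model prediction, is what makes the imputation nonparametric and keeps imputed values on the support of the data.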

  17. Assessment of Interobserver Reliability in Nutrition Studies that Use Direct Observation of School Meals

    PubMed Central

    BAGLIO, MICHELLE L.; BAXTER, SUZANNE DOMEL; GUINN, CAROLINE H.; THOMPSON, WILLIAM O.; SHAFFER, NICOLE M.; FRYE, FRANCESCA H. A.

    2005-01-01

    This article (a) provides a general review of interobserver reliability (IOR) and (b) describes our method for assessing IOR for items and amounts consumed during school meals for a series of studies regarding the accuracy of fourth-grade children's dietary recalls validated with direct observation of school meals. A widely used validation method for dietary assessment is direct observation of meals. Although many studies utilize several people to conduct direct observations, few published studies indicate whether IOR was assessed. Assessment of IOR is necessary to determine that the information collected does not depend on who conducted the observation. Two strengths of our method for assessing IOR are that IOR was assessed regularly throughout the data collection period and that IOR was assessed for foods at the item and amount level instead of at the nutrient level. Adequate agreement among observers is essential to the reasoning behind using observation as a validation tool. Readers are encouraged to question the results of studies that fail to mention and/or to include the results for assessment of IOR when multiple people have conducted observations. PMID:15354155
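IOR for categorical item-level observations is often summarized with chance-corrected agreement. Below is a minimal sketch of Cohen's kappa for two observers; the food labels are invented, and the article's own agreement statistic is not specified here:

```python
from collections import Counter

def cohens_kappa(ratings1, ratings2):
    """Chance-corrected agreement between two observers' categorical ratings
    of the same items: kappa = (p_o - p_e) / (1 - p_e)."""
    n = len(ratings1)
    p_o = sum(a == b for a, b in zip(ratings1, ratings2)) / n   # observed agreement
    c1, c2 = Counter(ratings1), Counter(ratings2)
    p_e = sum(c1[cat] * c2[cat] for cat in c1) / (n * n)        # expected by chance
    return (p_o - p_e) / (1 - p_e)

# two observers' item-level records for six tray components (invented labels)
obs_a = ["milk", "milk", "pizza", "apple", "milk", "pizza"]
obs_b = ["milk", "milk", "pizza", "milk", "milk", "pizza"]
kappa = cohens_kappa(obs_a, obs_b)
```

Raw percent agreement (5/6 here) overstates reliability when a few categories dominate, which is why chance-corrected measures are preferred for item-level IOR.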

  18. Statistical procedures for evaluating daily and monthly hydrologic model predictions

    USGS Publications Warehouse

    Coffey, M.E.; Workman, S.R.; Taraba, J.L.; Fogle, A.W.

    2004-01-01

    The overall study objective was to evaluate the applicability of different qualitative and quantitative methods for comparing daily and monthly SWAT computer model hydrologic streamflow predictions to observed data, and to recommend statistical methods for use in future model evaluations. Statistical methods were tested using daily streamflows and monthly equivalent runoff depths. The statistical techniques included linear regression, Nash-Sutcliffe efficiency, nonparametric tests, t-test, objective functions, autocorrelation, and cross-correlation. None of the methods specifically applied to the non-normal distribution and dependence between data points for the daily predicted and observed data. Of the tested methods, median objective functions, sign test, autocorrelation, and cross-correlation were most applicable for the daily data. The robust coefficient of determination (CD*) and robust modeling efficiency (EF*) objective functions were the preferred methods for daily model results due to the ease of comparing these values with a fixed ideal reference value of one. Predicted and observed monthly totals were more normally distributed, and there was less dependence between individual monthly totals than was observed for the corresponding predicted and observed daily values. More statistical methods were available for comparing SWAT model-predicted and observed monthly totals. The 1995 monthly SWAT model predictions and observed data had a regression R² of 0.70, a Nash-Sutcliffe efficiency of 0.41, and the t-test failed to reject the equal data means hypothesis. The Nash-Sutcliffe coefficient and the R² coefficient were the preferred methods for monthly results due to the ability to compare these coefficients to a set ideal value of one.
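The Nash-Sutcliffe efficiency used in the study is straightforward to compute; a minimal sketch with invented daily flows (the SWAT data themselves are not reproduced here):

```python
def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1.0 is a perfect fit, 0.0 means the model
    predicts no better than the observed mean, negative values are worse."""
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

# invented daily streamflow values (e.g. m^3/s)
obs_flow = [2.0, 3.5, 5.0, 4.0, 3.0]
sim_flow = [2.2, 3.1, 4.6, 4.3, 2.8]
efficiency = nse(obs_flow, sim_flow)
```

The fixed ideal value of one is what makes NSE (and its robust variant EF*) easy to interpret across models, the property the study cites when recommending it.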

  19. Thermal well-test method

    DOEpatents

    Tsang, C.F.; Doughty, C.A.

    1984-02-24

A well-test method involving injection of hot (or cold) water into a groundwater aquifer, or of cold water into a geothermal reservoir, is disclosed. By making temperature measurements at various depths in one or more observation wells, certain properties of the aquifer are determined. These properties, not obtainable from conventional well-test procedures, include permeability anisotropy, layering in the aquifer, and in-situ thermal properties. The temperature measurements at various depths are obtained from thermistors mounted in the observation wells.

  20. Thermal well-test method

    DOEpatents

    Tsang, Chin-Fu; Doughty, Christine A.

    1985-01-01

A well-test method involving injection of hot (or cold) water into a groundwater aquifer, or of cold water into a geothermal reservoir. By making temperature measurements at various depths in one or more observation wells, certain properties of the aquifer are determined. These properties, not obtainable from conventional well-test procedures, include permeability anisotropy, layering in the aquifer, and in-situ thermal properties. The temperature measurements at various depths are obtained from thermistors mounted in the observation wells.

  1. Power-law behaviour evaluation from foreign exchange market data using a wavelet transform method

    NASA Astrophysics Data System (ADS)

    Wei, H. L.; Billings, S. A.

    2009-09-01

Numerous studies in the literature have shown that the dynamics of many time series, including observations in foreign exchange markets, exhibit scaling behaviours. A simple new statistical approach, derived from the concept of the continuous wavelet transform correlation function (WTCF), is proposed for the evaluation of power-law properties from observed data. The new method reveals that foreign exchange rates obey power laws and thus belong to the class of self-similar processes.
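The core of any such power-law evaluation is how a scale-dependent statistic grows with scale on a log-log plot. As a hedged illustration, the sketch below estimates a Hurst-type scaling exponent with a simple increment-variance (structure-function) approach, a deliberately simpler stand-in for the paper's WTCF; for an ordinary random walk the exponent should come out near 0.5:

```python
import numpy as np

def scaling_exponent(x, lags):
    """Estimate a Hurst-type exponent H from the power law
    Var[x(t+tau) - x(t)] ~ tau**(2H), via a log-log regression."""
    v = [np.var(x[lag:] - x[:-lag]) for lag in lags]
    slope, _ = np.polyfit(np.log(lags), np.log(v), 1)
    return slope / 2.0

rng = np.random.default_rng(0)
walk = np.cumsum(rng.standard_normal(20000))  # random walk: H = 0.5 in theory
lags = np.array([1, 2, 4, 8, 16, 32, 64])
H = scaling_exponent(walk, lags)
print(round(H, 2))  # close to 0.5
```

A self-similar series with persistent trends would yield H above 0.5; anti-persistent data below 0.5.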

  2. The Specificity of Observational Studies in Physical Activity and Sports Sciences: Moving Forward in Mixed Methods Research and Proposals for Achieving Quantitative and Qualitative Symmetry.

    PubMed

    Anguera, M Teresa; Camerino, Oleguer; Castañer, Marta; Sánchez-Algarra, Pedro; Onwuegbuzie, Anthony J

    2017-01-01

Mixed methods studies are being applied to an increasing diversity of fields. In this paper, we discuss the growing use, and enormous potential, of mixed methods research in the field of sport and physical activity. A second aim is to contribute to strengthening the characteristics of mixed methods research by showing how systematic observation offers rigor within a flexible framework that can be applied to a wide range of situations. Observational methodology is characterized by high scientific rigor and flexibility throughout its different stages and allows the objective study of spontaneous behavior in natural settings, with no external influence. Mixed methods researchers need to take bold yet thoughtful decisions regarding both substantive and procedural issues. We present three fundamental and complementary ideas to guide researchers in this respect: we show why studies of sport and physical activity that use a mixed methods research approach should be included in the field of mixed methods research, we highlight the numerous possibilities offered by observational methodology in this field through the transformation of descriptive data into quantifiable code matrices, and we discuss possible solutions for achieving true integration of qualitative and quantitative findings.

  3. Applying Nyquist's method for stability determination to solar wind observations

    NASA Astrophysics Data System (ADS)

    Klein, Kristopher G.; Kasper, Justin C.; Korreck, K. E.; Stevens, Michael L.

    2017-10-01

    The role instabilities play in governing the evolution of solar and astrophysical plasmas is a matter of considerable scientific interest. The large number of sources of free energy accessible to such nearly collisionless plasmas makes general modeling of unstable behavior, accounting for the temperatures, densities, anisotropies, and relative drifts of a large number of populations, analytically difficult. We therefore seek a general method of stability determination that may be automated for future analysis of solar wind observations. This work describes an efficient application of the Nyquist instability method to the Vlasov dispersion relation appropriate for hot, collisionless, magnetized plasmas, including the solar wind. The algorithm recovers the familiar proton temperature anisotropy instabilities, as well as instabilities that had been previously identified using fits extracted from in situ observations in Gary et al. (2016). Future proposed applications of this method are discussed.
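At the heart of the Nyquist method is the argument principle: the number of times the dispersion function winds around the origin, evaluated along a closed contour enclosing the upper half of the complex frequency plane, counts the unstable (growing) roots. A minimal numerical sketch with a toy polynomial "dispersion relation" standing in for the Vlasov relation used in the paper:

```python
import numpy as np

def winding_number(f, contour):
    """Times f(contour) winds around the origin; by the argument
    principle this equals zeros minus poles of f inside the contour."""
    phase = np.unwrap(np.angle(f(contour)))
    return int(round((phase[-1] - phase[0]) / (2 * np.pi)))

def rectangle(re_min, re_max, im_min, im_max, n=2000):
    """Closed rectangular contour traversed counterclockwise."""
    t = np.linspace(0.0, 1.0, n)
    bottom = re_min + (re_max - re_min) * t + 1j * im_min
    right = re_max + 1j * (im_min + (im_max - im_min) * t)
    top = re_max - (re_max - re_min) * t + 1j * im_max
    left = re_min + 1j * (im_max - (im_max - im_min) * t)
    return np.concatenate([bottom, right, top, left])

# Toy dispersion function with one unstable root at omega = 0.5 + 0.3j
# and one damped root at omega = -1.0 - 0.2j (illustrative, not Vlasov).
D = lambda w: (w - (0.5 + 0.3j)) * (w - (-1.0 - 0.2j))

contour = rectangle(-3, 3, 1e-3, 3)  # box just above the real axis
print(winding_number(D, contour))    # → 1 (one growing mode)
```

The automation described in the abstract amounts to evaluating the hot-plasma dispersion function instead of this toy polynomial along such a contour.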

  4. Mosaic CCD method: A new technique for observing dynamics of cometary magnetospheres

    NASA Technical Reports Server (NTRS)

    Saito, T.; Takeuchi, H.; Kozuba, Y.; Okamura, S.; Konno, I.; Hamabe, M.; Aoki, T.; Minami, S.; Isobe, S.

    1992-01-01

On April 29, 1990, the plasma tail of Comet Austin was observed with a CCD camera on the 105-cm Schmidt telescope at the Kiso Observatory of the University of Tokyo. The area of the CCD used in this observation is only about 1 sq cm. When this CCD is used on the 105-cm Schmidt telescope at the Kiso Observatory, the area corresponds to a narrow square field of view of about 12 x 12 arcminutes. By comparison with the photograph of Comet Austin taken by Numazawa (personal communication) on the same night, we see that only a small part of the plasma tail can be photographed at one time with the CCD. However, by shifting the view on the CCD after each exposure, we succeeded in imaging the entire length of the cometary magnetosphere of 1.6 x 10(exp 6) km. This new technique is called 'the mosaic CCD method'. In order to study the dynamics of cometary plasma tails, seven frames of the comet from the head to the tail region were twice imaged with the mosaic CCD method and two sets of images were obtained. Six microstructures, including arcade structures, were identified in both images. Sketches of the plasma tail including microstructures are included.

  5. Pharmacoeconomics

    PubMed Central

    Hughes, Dyfrig A

    2012-01-01

    Pharmacoeconomics is an essential component of health technology assessment and the appraisal of medicines for use by UK National Health Service (NHS) patients. As a comparatively young discipline, its methods continue to evolve. Priority research areas for development include methods for synthesizing indirect comparisons when head-to-head trials have not been performed, synthesizing qualitative evidence (for example, stakeholder views), addressing the limitations of the EQ-5D tool for assessing quality of life, including benefits not captured in quality-adjusted life years (QALYs), ways of assessing valuation methods (for determining utility scores), extrapolation of costs and benefits beyond those observed in trials, early estimation of cost-effectiveness (including mechanism-based economic evaluation), methods for incorporating the impact of non-adherence and the role of behavioural economics in influencing patients and prescribers. PMID:22360714

  6. Inter- and intra- observer reliability of risk assessment of repetitive work without an explicit method.

    PubMed

    Eliasson, Kristina; Palm, Peter; Nyman, Teresia; Forsman, Mikael

    2017-07-01

A common way to conduct practical risk assessments is to observe a job and report the observed long-term risks for musculoskeletal disorders. The aim of this study was to evaluate the inter- and intra-observer reliability of ergonomists' risk assessments without the support of an explicit risk assessment method. Twenty-one experienced ergonomists assessed the risk level (low, moderate, high risk) of eight upper body regions, as well as the global risk, of 10 video-recorded work tasks. Intra-observer reliability was assessed by having nine of the ergonomists repeat the procedure at least three weeks after the first assessment. Each ergonomist made the risk assessments based on his or her experience and knowledge. The statistical parameters of reliability included agreement in %, kappa, linearly weighted kappa, intraclass correlation and Kendall's coefficient of concordance. The average inter-observer agreement on the global risk was 53% and the corresponding linearly weighted kappa (Kw) was 0.32, indicating fair reliability. The intra-observer agreement was 61% with a Kw of 0.41. This study indicates that risk assessments of the upper body, without the use of an explicit observational method, have unacceptable reliability. It is therefore recommended to use systematic risk assessment methods to a higher degree. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
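The linearly weighted kappa reported above down-weights near-miss disagreements on the ordinal risk scale relative to plain kappa. A self-contained sketch (the ratings below are invented for illustration, not the study's data):

```python
import numpy as np

def weighted_kappa(r1, r2, n_levels):
    """Linearly weighted Cohen's kappa for two raters giving ordinal
    ratings coded 0 .. n_levels-1."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    obs = np.zeros((n_levels, n_levels))
    for a, b in zip(r1, r2):
        obs[a, b] += 1
    obs /= obs.sum()
    # Expected table under independence, from the marginal distributions
    exp = np.outer(obs.sum(axis=1), obs.sum(axis=0))
    i, j = np.indices((n_levels, n_levels))
    w = np.abs(i - j) / (n_levels - 1)  # linear disagreement weights
    return 1.0 - (w * obs).sum() / (w * exp).sum()

# Ratings on a 3-level risk scale: 0=low, 1=moderate, 2=high (illustrative).
rater1 = [0, 1, 2, 1, 0, 2, 1, 1, 2, 0]
rater2 = [0, 1, 1, 1, 0, 2, 2, 1, 2, 0]
print(round(weighted_kappa(rater1, rater2, 3), 3))  # → 0.762
```

With only three levels the linear weights make a one-step disagreement (low vs. moderate) cost half as much as a two-step one (low vs. high).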

  7. Computing observables in curved multifield models of inflation—A guide (with code) to the transport method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dias, Mafalda; Seery, David; Frazer, Jonathan, E-mail: m.dias@sussex.ac.uk, E-mail: j.frazer@sussex.ac.uk, E-mail: a.liddle@sussex.ac.uk

    We describe how to apply the transport method to compute inflationary observables in a broad range of multiple-field models. The method is efficient and encompasses scenarios with curved field-space metrics, violations of slow-roll conditions and turns of the trajectory in field space. It can be used for an arbitrary mass spectrum, including massive modes and models with quasi-single-field dynamics. In this note we focus on practical issues. It is accompanied by a Mathematica code which can be used to explore suitable models, or as a basis for further development.

  8. The observation of sporadic meteors and meteor showers by means of radio technology measuring equipment

    NASA Astrophysics Data System (ADS)

    Schippke, W.

    1981-08-01

Tracking meteors with radio equipment has the advantage of permitting continuous observation, independent of meteorological conditions and of the time of day or night. Two methods exist for registering meteor trails: a passive and an active one. The appropriate frequency range for both methods is the lower VHF range. Passive observation requires a very sensitive measurement receiver along with recording equipment and a suitable antenna system. In Europe there are many television transmitters that are eminently suited for detecting meteor trails. The active method for tracking meteors is more difficult and requires more expensive equipment than the passive method; it is based on the use of a VHF metric-wave radar, with devices normally operating at a frequency of approximately 50 or 60 MHz. Attention is given to the theory of meteoric scattering, the various types of ionized trails, the geometry of meteor trails, results obtained at an observational station in Munich, and observations in the 144-MHz band.

  9. Interdisciplinary research on patient-provider communication: a cross-method comparison.

    PubMed

    Chou, Wen-Ying Sylvia; Han, Paul; Pilsner, Alison; Coa, Kisha; Greenberg, Larrie; Blatt, Benjamin

    2011-01-01

Patient-provider communication, a key aspect of healthcare delivery, has been assessed through multiple methods for purposes of research, education, and quality control. Common techniques include satisfaction ratings and quantitatively and qualitatively oriented direct observations. Identifying the strengths and weaknesses of different approaches is critically important in determining the appropriate assessment method for a specific research or practical goal. Analyzing ten videotaped simulated encounters between medical students and Standardized Patients (SPs), this study compared three existing assessment methods on the same data set. The methods were: (1) dichotomized SP ratings of students' communication skills; (2) Roter Interaction Analysis System (RIAS) analysis; and (3) inductive discourse analysis informed by sociolinguistic theories. The large dichotomous contrast between good and poor ratings in (1) was not evidenced in any of the other methods. Following a discussion of the strengths and weaknesses of each approach, we pilot-tested a combined assessment done by coders blinded to the results of (1)-(3). This type of integrative approach has the potential of adding a quantifiable dimension to qualitative, discourse-based observations. Subjecting the same data set to separate analytic methods provides an excellent opportunity for methodological comparison with the goal of informing future assessment of clinical encounters.

  10. A method to improve visual similarity of breast masses for an interactive computer-aided diagnosis environment.

    PubMed

    Zheng, Bin; Lu, Amy; Hardesty, Lara A; Sumkin, Jules H; Hakim, Christiane M; Ganott, Marie A; Gur, David

    2006-01-01

The purpose of this study was to develop and test a method for selecting "visually similar" regions of interest depicting breast masses from a reference library to be used in an interactive computer-aided diagnosis (CAD) environment. A reference library including 1000 malignant mass regions and 2000 benign and CAD-generated false-positive regions was established. When a suspicious mass region is identified, the scheme segments the region and searches for similar regions from the reference library using a multifeature-based k-nearest neighbor (KNN) algorithm. To improve selection of reference images, we added an interactive step. All actual masses in the reference library were subjectively rated on a scale from 1 to 9 as to their visual margin spiculation. When an observer identifies a suspected mass region during a case interpretation, he/she first rates the margins, and the computerized search is then limited to regions rated as having similar levels of spiculation (within +/-1 scale difference). In an observer preference study including 85 test regions, two sets of the six "similar" reference regions selected by the KNN with and without the interactive step were displayed side by side with each test region. Four radiologists and five nonclinician observers selected the more appropriate ("similar") reference set in a two-alternative forced-choice preference experiment. All four radiologists and five nonclinician observers preferred the sets of regions selected by the interactive method, with average frequencies of 76.8% and 74.6%, respectively. The overall preference for the interactive method was highly significant (p < 0.001). The study demonstrated that a simple interactive approach that includes a subjectively perceived rating of one feature alone, namely margin spiculation, could substantially improve the selection of "visually similar" reference images.
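The interactive step can be sketched as an ordinary k-nearest-neighbor search restricted to library regions whose spiculation rating falls within ±1 of the observer's rating. The features and ratings below are randomly generated stand-ins, not the study's reference library:

```python
import numpy as np

def select_similar_regions(query_feat, query_rating, lib_feats,
                           lib_ratings, k=6):
    """Indices of the k library regions nearest to the query in feature
    space, restricted to regions whose subjective spiculation rating is
    within +/-1 of the query's rating."""
    lib_feats = np.asarray(lib_feats, dtype=float)
    lib_ratings = np.asarray(lib_ratings)
    candidates = np.flatnonzero(np.abs(lib_ratings - query_rating) <= 1)
    d = np.linalg.norm(lib_feats[candidates] - query_feat, axis=1)
    return candidates[np.argsort(d)[:k]]

rng = np.random.default_rng(1)
lib_feats = rng.random((200, 5))        # 5 image features per region
lib_ratings = rng.integers(1, 10, 200)  # spiculation ratings 1-9
query = rng.random(5)
picks = select_similar_regions(query, query_rating=5,
                               lib_feats=lib_feats,
                               lib_ratings=lib_ratings)
print(len(picks))  # up to 6 reference regions
```

Filtering before the distance computation is what keeps the displayed reference set perceptually consistent with the observer's own margin rating.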

  11. Semi-supervised clustering methods

    PubMed Central

    Bair, Eric

    2013-01-01

    Cluster analysis methods seek to partition a data set into homogeneous subgroups. It is useful in a wide variety of applications, including document processing and modern genetics. Conventional clustering methods are unsupervised, meaning that there is no outcome variable nor is anything known about the relationship between the observations in the data set. In many situations, however, information about the clusters is available in addition to the values of the features. For example, the cluster labels of some observations may be known, or certain observations may be known to belong to the same cluster. In other cases, one may wish to identify clusters that are associated with a particular outcome variable. This review describes several clustering algorithms (known as “semi-supervised clustering” methods) that can be applied in these situations. The majority of these methods are modifications of the popular k-means clustering method, and several of them will be described in detail. A brief description of some other semi-supervised clustering algorithms is also provided. PMID:24729830

  12. Semi-supervised clustering methods.

    PubMed

    Bair, Eric

    2013-01-01

    Cluster analysis methods seek to partition a data set into homogeneous subgroups. It is useful in a wide variety of applications, including document processing and modern genetics. Conventional clustering methods are unsupervised, meaning that there is no outcome variable nor is anything known about the relationship between the observations in the data set. In many situations, however, information about the clusters is available in addition to the values of the features. For example, the cluster labels of some observations may be known, or certain observations may be known to belong to the same cluster. In other cases, one may wish to identify clusters that are associated with a particular outcome variable. This review describes several clustering algorithms (known as "semi-supervised clustering" methods) that can be applied in these situations. The majority of these methods are modifications of the popular k-means clustering method, and several of them will be described in detail. A brief description of some other semi-supervised clustering algorithms is also provided.
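One of the simplest semi-supervised variants this review describes seeds k-means with the observations whose cluster labels are known and keeps those assignments fixed. A minimal sketch on synthetic two-cluster data (the data and seeding scheme are illustrative, not from the review):

```python
import numpy as np

def seeded_kmeans(X, labels, n_iter=20):
    """k-means where some points carry known cluster labels (label >= 0);
    labeled points keep their assignment and seed the initial centroids.
    labels == -1 marks unlabeled points."""
    X = np.asarray(X, dtype=float)
    labels = np.asarray(labels)
    k = labels.max() + 1
    # Initialize centroids from the labeled seed points
    centers = np.array([X[labels == c].mean(axis=0) for c in range(k)])
    for _ in range(n_iter):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        assign = np.where(labels >= 0, labels, d.argmin(axis=1))
        centers = np.array([X[assign == c].mean(axis=0) for c in range(k)])
    return assign, centers

rng = np.random.default_rng(2)
a = rng.normal(0.0, 0.3, (50, 2))   # cluster around (0, 0)
b = rng.normal(3.0, 0.3, (50, 2))   # cluster around (3, 3)
X = np.vstack([a, b])
labels = np.full(100, -1)
labels[0], labels[50] = 0, 1        # one labeled seed per cluster
assign, centers = seeded_kmeans(X, labels)
print(int((assign[:50] == 0).sum()), int((assign[50:] == 1).sum()))
```

Even a single labeled point per cluster pins the cluster identities, which unsupervised k-means cannot do.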

  13. Cross-comparison and evaluation of air pollution field estimation methods

    NASA Astrophysics Data System (ADS)

    Yu, Haofei; Russell, Armistead; Mulholland, James; Odman, Talat; Hu, Yongtao; Chang, Howard H.; Kumar, Naresh

    2018-04-01

Accurate estimates of human exposure are critical for air pollution health studies, and a variety of methods are currently being used to assign pollutant concentrations to populations. Results from these methods may differ substantially, which can affect the outcomes of health impact assessments. Here, we applied 14 methods for developing spatiotemporal air pollutant concentration fields of eight pollutants to the Atlanta, Georgia region. These methods include eight methods relying mostly on air quality observations (CM: central monitor; SA: spatial average; IDW: inverse distance weighting; KRIG: kriging; TESS-D: discontinuous tessellation; TESS-NN: natural neighbor tessellation with interpolation; LUR: land use regression; AOD: downscaled satellite-derived aerosol optical depth), one using the RLINE dispersion model, and five methods using a chemical transport model (CMAQ), with and without using observational data to constrain results. The derived fields were evaluated and compared. Overall, all methods generally perform better in urban than rural areas, and for secondary than primary pollutants. We found the CM and SA methods may be appropriate only for small domains, and for secondary pollutants, though the SA method led to large negative spatial correlations when using data withholding for PM2.5 (spatial correlation coefficient R = -0.81). The TESS-D method was found to have major limitations. Results of the IDW, KRIG and TESS-NN methods are similar. They are found to be better suited for secondary pollutants because of their satisfactory temporal performance (e.g. average temporal R2 > 0.85 for PM2.5 but less than 0.35 for the primary pollutant NO2). In addition, they are suitable only for areas with relatively dense monitoring networks, given their limited ability to capture spatial concentration variability, as indicated by the negative spatial R (lower than -0.2 for PM2.5 when assessed using data withholding). The performance of the LUR and AOD methods was similar to kriging. Using RLINE and CMAQ fields without fusing observational data led to substantial errors and biases, though the CMAQ model captured spatial gradients reasonably well (spatial R = 0.45 for PM2.5). Two unique tests conducted here included quantifying autocorrelation of method biases (which can be important in time series analyses) and how well the methods capture the observed interspecies correlations (which would be of particular importance in multipollutant health assessments). Autocorrelation of method biases lasted longest, and interspecies correlations of primary pollutants were higher than observed, when air quality models were used without data fusion. Use of hybrid methods that combine air quality model outputs with observational data overcomes some of these limitations and is better suited for health studies. Results from this study contribute to better understanding the strengths and weaknesses of different methods for estimating human exposures.
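Of the observation-driven methods compared above, inverse distance weighting (IDW) is the easiest to sketch. The monitor locations and PM2.5 values below are hypothetical:

```python
import numpy as np

def idw(xy_obs, values, xy_target, power=2.0):
    """Inverse-distance-weighted estimate at each target point from
    scattered monitor observations."""
    xy_obs = np.asarray(xy_obs, dtype=float)
    xy_target = np.asarray(xy_target, dtype=float)
    values = np.asarray(values, dtype=float)
    d = np.linalg.norm(xy_target[:, None, :] - xy_obs[None, :, :], axis=2)
    d = np.maximum(d, 1e-12)  # avoid division by zero at monitor locations
    w = 1.0 / d ** power
    return (w * values).sum(axis=1) / w.sum(axis=1)

# Four hypothetical PM2.5 monitors (km coordinates, ug/m3)
monitors = [(0, 0), (10, 0), (0, 10), (10, 10)]
pm25 = [8.0, 12.0, 10.0, 14.0]
est = idw(monitors, pm25, [(5.0, 5.0), (0.0, 0.0)])
print(np.round(est, 2))  # → [11.  8.] (center = mean; at a monitor = its value)
```

The smooth weighting explains the finding above: IDW tracks temporal variation at monitored sites well but cannot create spatial gradients sharper than the monitor spacing allows.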

  14. Incorporating Animals in Phenological Assessments: USA National Phenology Network Methods to Observe Animal Phenology

    NASA Astrophysics Data System (ADS)

    Miller-Rushing, A. J.; Weltzin, J. F.

    2009-12-01

Many assessments of phenology, particularly those operating at large scales, focus on the phenology of plants, in part because of the relevance of plants in cycles of leaf greening and browning that are visible from satellite-based remote sensing, and because plants contribute significantly to global and regional biogeochemical cycles. The USA National Phenology Network (USA-NPN), a consortium of individuals, agencies, and organizations, promotes integrated assessments of both plant and animal phenology. The network is currently developing standard methods to add animal phenology to existing assessments of plant phenology. The first phase of the standard methods will be implemented online in spring 2010. The methods for observing animals will be similar to the standard methods for making on-the-ground observations of plants: observers will be asked to monitor a fixed location regularly throughout the year. During each visit, observers will answer a series of "yes-no" questions that address the phenological state of the species of interest: Is the species present? Is it mating? Is it feeding? And so on. We are currently testing this method in several national parks in the northeastern United States, including Acadia National Park and the Appalachian Trail. By collecting new observations of this sort for a range of animals (amphibians, birds, fish, insects, mammals, and reptiles), we will greatly increase the ability of scientists and natural resource managers to understand how temporal relationships among these species and the plants on which they depend may be changing. To bolster the data available, we are collaborating with existing monitoring programs to develop common monitoring techniques, data sharing technologies, and visualizations. We are also beginning to collect legacy datasets, such as one from the North American Bird Phenology Program that includes 90 years of observations of bird migration times from across the continent.
We believe that increasing the amount of animal phenology data available for scientists, natural resource managers, and educators, will greatly advance our understanding of phenological changes and their causes and consequences, particularly in this time of rapid environmental change.

  15. Correlation between human observer performance and model observer performance in differential phase contrast CT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Ke; Garrett, John; Chen, Guang-Hong

    2013-11-15

Purpose: With the recently expanding interest and developments in x-ray differential phase contrast CT (DPC-CT), the evaluation of its task-specific detection performance and comparison with the corresponding absorption CT under a given radiation dose constraint become increasingly important. Mathematical model observers are often used to quantify the performance of imaging systems, but their correlations with actual human observers need to be confirmed for each new imaging method. This work is an investigation of the effects of stochastic DPC-CT noise on the correlation of detection performance between model and human observers with signal-known-exactly (SKE) detection tasks. Methods: The detectabilities of different objects (five disks with different diameters and two breast lesion masses) embedded in an experimental DPC-CT noise background were assessed using both model and human observers. The detectability of the disk and lesion signals was then measured using five types of model observers: the prewhitening ideal observer, the nonprewhitening (NPW) observer, the nonprewhitening observer with eye filter and internal noise (NPWEi), the prewhitening observer with eye filter and internal noise (PWEi), and the channelized Hotelling observer (CHO). The same objects were also evaluated by four human observers using the two-alternative forced choice method. The results from the model observer experiment were quantitatively compared to the human observer results to assess the correlation between the two techniques. Results: The contrast-to-detail (CD) curve generated by the human observers for the disk-detection experiments shows that the required contrast to detect a disk is inversely proportional to the square root of the disk size. Based on the CD curves, the ideal and NPW observers tend to systematically overestimate the performance of the human observers. The NPWEi and PWEi observers did not predict human performance well either, as the slopes of their CD curves tended to be steeper. The CHO generated the best quantitative agreement with human observers, with its CD curve overlapping with that of the human observers. Statistical equivalence between the CHO and humans can be claimed within 11% of the human observer results, including both the disk and lesion detection experiments. Conclusions: The model observer method can be used to accurately represent human observer performance with the stochastic DPC-CT noise for SKE tasks with sizes ranging from 8 to 128 pixels. The incorporation of anatomical noise remains to be studied.
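As a hedged illustration of the model-observer idea, the sketch below implements the NPW observer for an SKE disk-detection task in white noise (the study used correlated DPC-CT noise, where the CHO agreed best with humans). In white noise the NPW statistic reduces to the matched filter, so d' should land near sqrt(sum(template**2)), about 4.5 for the disk below:

```python
import numpy as np

def npw_dprime(template, present, absent):
    """Nonprewhitening (NPW) model observer for an SKE task: correlate
    each image with the known signal template and compute the
    detectability index d' from the two test-statistic distributions."""
    t = template.ravel()
    s_p = present.reshape(len(present), -1) @ t   # signal-present statistics
    s_a = absent.reshape(len(absent), -1) @ t     # signal-absent statistics
    return (s_p.mean() - s_a.mean()) / np.sqrt(0.5 * (s_p.var() + s_a.var()))

rng = np.random.default_rng(3)
n, size = 400, 32
y, x = np.mgrid[:size, :size]
disk = ((x - 16) ** 2 + (y - 16) ** 2 <= 25).astype(float) * 0.5  # r=5 disk

absent = rng.standard_normal((n, size, size))            # noise only
present = rng.standard_normal((n, size, size)) + disk    # noise + disk
print(round(npw_dprime(disk, present, absent), 1))       # near 4.5
```

In correlated noise the NPW observer loses this optimality, which is consistent with the abstract's finding that it overestimates human performance there.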

  16. A Constructivist Study of Middle School Students' Narratives and Ecological Illustrations

    ERIC Educational Resources Information Center

    Stokrocki, Mary L.; Flatt, Barbara; York, Emily

    2010-01-01

    Using participant observation, we describe/interpret the results of teaching a constructivist unit that empowered students in narrative writing and illustration. Participant observation methods included daily note taking, pre-post questioning, and photographing artworks. We analyzed students' stories and illustrations with borrowed and emerging…

  17. Transformative Learning through Education Abroad: A Case Study of a Community College Program

    ERIC Educational Resources Information Center

    Brenner, Ashley A.

    2014-01-01

    This case study examined how participating in a short-term education abroad program fostered transformative learning for a small group of community college students. As a participant-observer, I utilized ethnographic methods, including interviews, observations, and document analysis, to understand students' perceptions of their experiences…

  18. Quantifying Vegetation Biophysical Variables from Imaging Spectroscopy Data: A Review on Retrieval Methods

    NASA Astrophysics Data System (ADS)

    Verrelst, Jochem; Malenovský, Zbyněk; Van der Tol, Christiaan; Camps-Valls, Gustau; Gastellu-Etchegorry, Jean-Philippe; Lewis, Philip; North, Peter; Moreno, Jose

    2018-06-01

    An unprecedented spectroscopic data stream will soon become available with forthcoming Earth-observing satellite missions equipped with imaging spectroradiometers. This data stream will open up a vast array of opportunities to quantify a diversity of biochemical and structural vegetation properties. The processing requirements for such large data streams require reliable retrieval techniques enabling the spatiotemporally explicit quantification of biophysical variables. With the aim of preparing for this new era of Earth observation, this review summarizes the state-of-the-art retrieval methods that have been applied in experimental imaging spectroscopy studies inferring all kinds of vegetation biophysical variables. Identified retrieval methods are categorized into: (1) parametric regression, including vegetation indices, shape indices and spectral transformations; (2) nonparametric regression, including linear and nonlinear machine learning regression algorithms; (3) physically based, including inversion of radiative transfer models (RTMs) using numerical optimization and look-up table approaches; and (4) hybrid regression methods, which combine RTM simulations with machine learning regression methods. For each of these categories, an overview of widely applied methods with application to mapping vegetation properties is given. In view of processing imaging spectroscopy data, a critical aspect involves the challenge of dealing with spectral multicollinearity. The ability to provide robust estimates, retrieval uncertainties and acceptable retrieval processing speed are other important aspects in view of operational processing. Recommendations towards new-generation spectroscopy-based processing chains for operational production of biophysical variables are given.
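Category (1), parametric regression with a vegetation index, can be sketched in a few lines. The reflectance model and coefficients below are invented for illustration (the saturation of NDVI at high LAI is precisely why a plain linear fit is imperfect):

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical training set: leaf area index (LAI) and simulated
# red/NIR reflectances with an exponential saturation, plus noise.
lai = rng.uniform(0.5, 6.0, 100)
nir = 0.45 * (1 - np.exp(-0.5 * lai)) + 0.05 + rng.normal(0, 0.01, 100)
red = 0.25 * np.exp(-0.6 * lai) + 0.03 + rng.normal(0, 0.01, 100)
ndvi = (nir - red) / (nir + red)

# Parametric retrieval: fit LAI = a + b * NDVI by least squares
b, a = np.polyfit(ndvi, lai, 1)
lai_hat = a + b * ndvi
r2 = 1 - np.sum((lai - lai_hat) ** 2) / np.sum((lai - lai.mean()) ** 2)
print(round(r2, 2))
```

The nonparametric and hybrid categories in the review replace this single-index linear map with machine learning regressions trained on observed or RTM-simulated spectra.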

  19. Hydrodynamic models of a cepheid atmosphere. Ph.D. Thesis - Maryland Univ., College Park

    NASA Technical Reports Server (NTRS)

    Karp, A. H.

    1974-01-01

A method for including the solution of the transfer equation in a standard Henyey-type hydrodynamic code was developed. This modified Henyey method was used in an implicit hydrodynamic code to compute deep envelope models of a classical Cepheid with a period of 12 days, including radiative transfer effects in the optically thin zones. It was found that the velocity gradients in the atmosphere are not responsible for the large microturbulent velocities observed in Cepheids but may be responsible for the occurrence of supersonic microturbulence. The splitting of the cores of the strong lines was found to be due to shock-induced temperature inversions in the line-forming region. The adopted light, color, and velocity curves were used to study three methods frequently used to determine the mean radii of Cepheids. It is concluded that an accuracy of 10% is possible only if high quality observations are used.

  20. Analysis of terrestrial conditions and dynamics

    NASA Technical Reports Server (NTRS)

    Goward, S. N. (Principal Investigator)

    1984-01-01

    Land spectral reflectance properties for selected locations, including the Goddard Space Flight Center, the Wallops Flight Facility, a MLA test site in Cambridge, Maryland, and an acid test site in Burlington, Vermont, were measured. Methods to simulate the bidirectional reflectance properties of vegetated landscapes and a data base for spatial resolution were developed. North American vegetation patterns observed with the Advanced Very High Resolution Radiometer were assessed. Data and methods needed to model large-scale vegetation activity with remotely sensed observations and climate data were compiled.

  1. Multiple-Bit Differential Detection of OQPSK

    NASA Technical Reports Server (NTRS)

    Simon, Marvin

    2005-01-01

    A multiple-bit differential-detection method has been proposed for the reception of radio signals modulated with offset quadrature phase-shift keying (offset QPSK or OQPSK). The method is also applicable to other spectrally efficient offset quadrature modulations. This method is based partly on the same principles as those of a multiple-symbol differential-detection method for M-ary QPSK, which includes QPSK (that is, non-offset QPSK) as a special case. That method was introduced more than a decade ago by the author of the present method as a means of improving performance relative to a traditional (two-symbol observation) differential-detection scheme. Instead of symbol-by-symbol detection, both that method and the present one are based on a concept of maximum-likelihood sequence estimation (MLSE). As applied to the modulations in question, MLSE involves consideration of (1) all possible binary data sequences that could have been received during an observation time of some number, N, of symbol periods and (2) selection of the sequence that yields the best match to the noise-corrupted signal received during that time. The performance of the prior method was shown to range from that of traditional differential detection for short observation times (small N) to that of ideal coherent detection (with differential encoding) for long observation times (large N).
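The MLSE idea above can be sketched by brute force for the non-offset DQPSK special case: over an N-symbol observation window, try all M^(N-1) differential sequences and keep the one maximizing the noncoherent metric |sum_k r_k exp(-j phi_k)|. The symbols and noise level below are illustrative:

```python
import numpy as np
from itertools import product

def msdd(r, M=4):
    """Multiple-symbol differential detection (brute-force MLSE) for
    M-PSK: over an N-symbol window, pick the N-1 differential phases
    maximizing the noncoherent metric, which is invariant to any
    unknown constant carrier phase."""
    N = len(r)
    best, best_seq = -1.0, None
    for seq in product(range(M), repeat=N - 1):
        phi = 2 * np.pi * np.cumsum((0,) + seq) / M
        metric = abs(np.sum(r * np.exp(-1j * phi)))
        if metric > best:
            best, best_seq = metric, seq
    return best_seq

rng = np.random.default_rng(5)
data = (3, 1, 2, 0)                       # differential QPSK symbols
phases = 2 * np.pi * np.cumsum((0,) + data) / 4
tx = np.exp(1j * phases)
# Unknown carrier phase (0.7 rad) plus a little complex noise
rx = tx * np.exp(1j * 0.7) + 0.05 * (rng.standard_normal(5)
                                     + 1j * rng.standard_normal(5))
print(msdd(rx))  # → (3, 1, 2, 0)
```

Growing the window N moves performance from conventional two-symbol differential detection toward coherent detection, at the cost of the exponentially growing search that the abstract's offset-QPSK extension also faces.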

  2. An inter-observer Ki67 reproducibility study applying two different assessment methods: on behalf of the Danish Scientific Committee of Pathology, Danish breast cancer cooperative group (DBCG).

    PubMed

    Laenkholm, Anne-Vibeke; Grabau, Dorthe; Møller Talman, Maj-Lis; Balslev, Eva; Bak Jylling, Anne Marie; Tabor, Tomasz Piotr; Johansen, Morten; Brügmann, Anja; Lelkaitis, Giedrius; Di Caterino, Tina; Mygind, Henrik; Poulsen, Thomas; Mertz, Henrik; Søndergaard, Gorm; Bruun Rasmussen, Birgitte

    2018-01-01

    In 2011, the St. Gallen Consensus Conference introduced the use of pathology to define the intrinsic breast cancer subtypes by application of the immunohistochemical (IHC) surrogate markers ER, PR, HER2 and Ki67, with a specified Ki67 cutoff (>14%) for the luminal B-like definition. Reports of impaired reproducibility of Ki67 estimation and threshold inconsistency led to the initiation of this quality assurance study (2013-2015). The aim of the study was to investigate inter-observer variation in Ki67 estimation in malignant breast tumors by two different quantification methods (an assessment method and a count method), including a measure of agreement between the methods. Fourteen experienced breast pathologists from 12 pathology departments evaluated 118 slides from a consecutive series of malignant breast tumors. The staining interpretation was performed according to both the Danish and Swedish guidelines. Reproducibility was quantified by the intra-class correlation coefficient (ICC) and Light's kappa, with dichotomization of observations at the greater-than (>) 20% threshold. The agreement between observations by the two quantification methods was evaluated by Bland-Altman plot. For the fourteen raters, the median ranged from 20% to 40% by the assessment method and from 22.5% to 36.5% by the count method. Light's kappa was 0.664 for observations by the assessment method and 0.649 by the count method. The ICC was 0.82 (95% CI: 0.77-0.86) by the assessment method vs. 0.84 (95% CI: 0.80-0.87) by the count method. Although the study in general showed moderate to good inter-observer agreement according to both the ICC and Light's kappa, major discrepancies were still identified, especially in the mid-range of observations. Consequently, for now, Ki67 estimation is not implemented in the DBCG treatment algorithm.
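
    As an illustration of the agreement statistic used above, Light's kappa is simply the mean of Cohen's kappa over all rater pairs, here applied to ratings dichotomized at the >20% threshold. A minimal pure-Python sketch (function names are our own, not the study's code):

```python
from itertools import combinations

def cohens_kappa(a, b):
    """Cohen's kappa for two raters' binary labels."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n        # observed agreement
    pa, pb = sum(a) / n, sum(b) / n
    pe = pa * pb + (1 - pa) * (1 - pb)                # chance agreement
    return (po - pe) / (1 - pe)

def lights_kappa(ratings):
    """Light's kappa: mean pairwise Cohen's kappa over all raters.

    ratings: one list of 0/1 labels per rater, e.g. Ki67 estimates
    dichotomized at the >20% threshold.
    """
    pairs = list(combinations(ratings, 2))
    return sum(cohens_kappa(a, b) for a, b in pairs) / len(pairs)
```

    For example, three raters who disagree on one of four cases yield a Light's kappa of about 0.67, close to the values reported in the study.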

  3. Position and Speed Control of Brushless DC Motors Using Sensorless Techniques and Application Trends

    PubMed Central

    Gamazo-Real, José Carlos; Vázquez-Sánchez, Ernesto; Gómez-Gil, Jaime

    2010-01-01

    This paper provides a technical review of position and speed sensorless methods for controlling Brushless Direct Current (BLDC) motor drives, including background on sensor-based approaches, their limitations, and recent advances. The performance and reliability of BLDC motor drives have improved as conventional control and sensing techniques have been supplemented by sensorless technology. Sensorless advances are then reviewed, and recent developments in this area are introduced with their inherent advantages and drawbacks, including an analysis of practical implementation issues and applications. The study includes a deep overview of state-of-the-art back-EMF sensing methods, which include Terminal Voltage Sensing, Third Harmonic Voltage Integration, Terminal Current Sensing, Back-EMF Integration and PWM strategies. The most relevant techniques based on estimation and models are also briefly analysed, such as the Sliding-mode Observer, Extended Kalman Filter, Model Reference Adaptive System, Adaptive observers (Full-order and Pseudoreduced-order) and Artificial Neural Networks. PMID:22163582
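
    A hedged sketch of the simplest back-EMF idea the review covers: detect sign changes of the floating-phase terminal voltage and interpolate the crossing time; a real drive would then schedule commutation roughly 30 electrical degrees later. The function and the sampling scheme are illustrative assumptions, not taken from the review.

```python
def bemf_zero_crossings(t, v):
    """Back-EMF zero-crossing times by linear interpolation.

    t, v : sample times and floating-phase terminal voltage measured
           relative to a virtual neutral point.
    In a sensorless drive, commutation would be scheduled roughly
    30 electrical degrees after each detected crossing.
    """
    crossings = []
    for i in range(1, len(v)):
        if v[i - 1] * v[i] < 0:                   # sign change detected
            frac = v[i - 1] / (v[i - 1] - v[i])   # linear interpolation
            crossings.append(t[i - 1] + frac * (t[i] - t[i - 1]))
    return crossings
```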

  4. Method and apparatus for in-situ drying investigation and optimization of slurry drying methodology

    DOEpatents

    Armstrong, Beth L.; Daniel, Claus; Howe, Jane Y.; Kiggans, Jr, James O.; Sabau, Adrian S.; Wood, III, David L.; Kalnaus, Sergiy

    2016-05-10

    A method of drying cast slurries that includes calculating drying conditions from an experimental model for a cast slurry and forming a cast film. An infrared heating probe is positioned on one side of the cast slurry and a thermal probe is positioned on the opposing side. The infrared heating probe may control the temperature of the cast slurry during drying. The cast slurry may be observed with an optical microscope while the drying conditions from the experimental model are applied. Observing the cast slurry includes detecting the incidence of micro-structural changes during drying to determine whether the drying conditions from the experimental model are optimal.

  5. System and method for confining an object to a region of fluid flow having a stagnation point

    NASA Technical Reports Server (NTRS)

    Schroeder, Charles M. (Inventor); Babcock, Hazen P. (Inventor); Shaqfeh, Eric S. G. (Inventor); Chu, Steven (Inventor)

    2006-01-01

    A device for confining an object to a region proximate to a fluid flow stagnation point includes one or more inlets for carrying the fluid into the region, one or more outlets for carrying the fluid out of the region, and a controller, in fluidic communication with the inlets and outlets, for adjusting the motion of the fluid to produce a stagnation point in the region, thereby confining the object to the region. Applications include, for example, prolonged observation of the object, manipulation of the object, etc. The device optionally may employ a feedback control mechanism, a sensing apparatus (e.g., for imaging), and a storage medium for storing, and a computer for analyzing and manipulating, data acquired from observing the object. The invention further provides methods of using such a device and system in a number of fields, including biology, chemistry, physics, material science, and medical science.

  6. Primary mass discrimination of high energy cosmic rays using PNN and k-NN methods

    NASA Astrophysics Data System (ADS)

    Rastegarzadeh, G.; Nemati, M.

    2018-02-01

    Probabilistic neural network (PNN) and k-Nearest Neighbors (k-NN) methods are widely used data classification techniques. In this paper, these two methods have been used to classify Extensive Air Shower (EAS) data sets simulated with the CORSIKA code for three primary cosmic rays: proton, oxygen and iron nuclei at energies of 100 TeV-10 PeV. This study follows earlier investigations into primary-cosmic-ray mass-sensitive observables. We propose a new approach to measuring the mass-sensitive observables of EAS in order to improve primary mass separation: the EAS observables are measured locally rather than integrated over the whole shower. The relationship between the number of observables included in the classification methods and the prediction accuracy has also been investigated. We show that local measurements and the inclusion of more mass-sensitive observables in the classification process improve classification quality, and that the muon and electron energy densities can serve as mass-sensitive observables for primary mass classification. It should be noted that this study was performed for the Tehran observation level without considering the details of any particular EAS detection array.
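
    For concreteness, a minimal k-NN majority-vote classifier over mass-sensitive feature vectors might look like the sketch below. This is illustrative only; the paper's actual feature set, distance metric, and choice of k are not specified here.

```python
from collections import Counter

def knn_classify(train, query, k=3):
    """Majority-vote k-NN with squared Euclidean distance.

    train : list of (feature_vector, label) pairs, e.g. locally measured
            muon/electron energy densities labelled by primary species.
    """
    def d2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(train, key=lambda item: d2(item[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]
```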

  7. Model Uncertainty Quantification Methods In Data Assimilation

    NASA Astrophysics Data System (ADS)

    Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.

    2017-12-01

    Data Assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging, from improving weather forecasts to tracking targets, as in the Apollo 11 mission. The use of Data Assimilation methods in high-dimensional complex geophysical systems is an active area of research, where there exist many opportunities to enhance existing methodologies. One of the central challenges is model uncertainty quantification; the outcome of any Data Assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real-world scenarios. These include methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.
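
    The assimilation step that these uncertainty estimates feed into can be illustrated with the scalar Kalman analysis equations, where the weight given to an observation depends directly on the assigned model and observation error variances. This is a textbook sketch, not the author's method:

```python
def kalman_update(x, P, y, H, R):
    """Scalar Kalman analysis step.

    Blends a model forecast x (error variance P) with an observation y
    (observation operator H, error variance R). The assigned variances
    P and R are exactly the quantities that model uncertainty
    quantification must supply.
    """
    K = P * H / (H * P * H + R)     # Kalman gain
    x_a = x + K * (y - H * x)       # analysis mean
    P_a = (1 - K * H) * P           # analysis variance
    return x_a, P_a
```

    With equal forecast and observation variances the analysis splits the difference, which shows why mis-specified variances bias every downstream prediction.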

  8. Regression-Based Estimates of Observed Functional Status in Centenarians

    PubMed Central

    Mitchell, Meghan B.; Miller, L. Stephen; Woodard, John L.; Davey, Adam; Martin, Peter; Burgess, Molly; Poon, Leonard W.

    2011-01-01

    Purpose of the Study: There is a lack of consensus on the best method of functional assessment, and there is a paucity of studies on daily functioning in centenarians. We sought to compare associations between performance-based, self-report, and proxy report measures of functional status in centenarians. We expected the strongest relationships between proxy reports and observed performance of basic activities of daily living (BADLs) and instrumental activities of daily living (IADLs). We hypothesized that the discrepancy between self-report and observed daily functioning would be modified by cognitive status. We additionally sought to provide clinicians with estimates of centenarians’ observed daily functioning based on their mental status in combination with subjective measures of activities of daily living (ADLs). Design and Methods: Two hundred and forty-four centenarians from the Georgia Centenarian Study were included in this cross-sectional population-based study. Measures included the Direct Assessment of Functional Status, self-report and proxy report of functional status, and the Mini-Mental State Examination (MMSE). Results: Associations between observed and proxy reports were stronger than between observed and self-report across BADL and IADL measures. A significant MMSE by type of report interaction was found, indicating that lower MMSE performance is associated with a greater discrepancy between subjective and objective ADL measures. Implications: Results demonstrate associations between 3 methods of assessing functional status and suggest proxy reports are generally more accurate than self-report measures. Cognitive status accounted for some of the discrepancy between observed and self-reports, and we provide clinicians with tables to estimate centenarians’ performance on observed functional measures based on MMSE and subjective report of functional status. PMID:20974657

  9. Mixed methods study on the use of and attitudes towards safety checklists in interventional radiology.

    PubMed

    Munn, Zachary; Giles, Kristy; Aromataris, Edoardo; Deakin, Anita; Schultz, Timothy; Mandel, Catherine; Peters, Micah Dj; Maddern, Guy; Pearson, Alan; Runciman, William

    2018-02-01

    The use of safety checklists in interventional radiology is an intervention aimed at reducing mortality and morbidity. Currently, little is known about their practical use in Australian radiology departments. The primary aim of this mixed methods study was to evaluate how safety checklists (SC) are used and completed in radiology departments within Australian hospitals, and to describe attitudes towards their use among Australian radiologists. A mixed methods approach employing both quantitative and qualitative techniques was used. Direct observations of checklist use during radiological procedures were performed to determine compliance. Medical records were also audited to investigate whether there was any discrepancy between practice (actual care measured by direct observation) and documentation (documented care measured by an audit of records). A focus group with Australian radiologists was conducted to determine attitudes towards the use of checklists. Among the four participating radiology departments, the overall observed mean completion of the components of the checklist was 38%. The checklist items most commonly addressed by operating theatre staff during observations were correct patient (80%) and correct procedure (60%). Findings from the direct observations conflicted with the medical record audit, in which a higher percentage of completion was documented (64%) compared with the 38% observed. The focus group participants spoke of barriers to the use of checklists, including the culture of radiology departments. This is the first study of safety checklist use in radiology within Australia. Overall completion was low across the sites included in this study. Compliance data collected from observations differed markedly from reported compliance in medical records. There remain significant barriers to the proper use of safety checklists in Australian radiology departments.
© 2017 The Royal Australian and New Zealand College of Radiologists.

  10. Minimization for conditional simulation: Relationship to optimal transport

    NASA Astrophysics Data System (ADS)

    Oliver, Dean S.

    2014-05-01

    In this paper, we consider the problem of generating independent samples from a conditional distribution when independent samples from the prior distribution are available. Although there are exact methods for sampling from the posterior (e.g. Markov chain Monte Carlo or acceptance/rejection), these methods tend to be computationally demanding when evaluation of the likelihood function is expensive, as it is for most geoscience applications. As an alternative, in this paper we discuss deterministic mappings of variables distributed according to the prior to variables distributed according to the posterior. Although many deterministic mappings might be equally useful, we will focus our discussion on a class of algorithms that obtain implicit mappings by minimization of a cost function that includes measures of data mismatch and model variable mismatch. Algorithms of this type include quasi-linear estimation, randomized maximum likelihood, the perturbed observation ensemble Kalman filter, and ensembles of perturbed analyses (4D-Var). When the prior pdf is Gaussian and the observation operators are linear, we show that these minimization-based simulation methods solve an optimal transport problem with a nonstandard cost function. When the observation operators are nonlinear, however, the mapping of variables from the prior to the posterior obtained from those methods is only approximate. Errors arise from neglect of the Jacobian determinant of the transformation and from the possibility of discontinuous mappings.
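
    For the scalar linear-Gaussian case, the minimization-based mapping discussed above (randomized maximum likelihood) has a closed form, which makes the prior-to-posterior transport explicit. The notation below is an assumption for illustration, not the paper's:

```python
def rml_map(m_prior_sample, d_perturbed, C, R, g):
    """Randomized-maximum-likelihood mapping, scalar linear-Gaussian case.

    Each prior sample m_pr (prior variance C), together with a perturbed
    observation d_pert (error variance R, linear operator g), is mapped to
    the minimizer of
        (m - m_pr)**2 / C + (g*m - d_pert)**2 / R,
    which is the closed form returned below. For linear g, the mapped
    ensemble is distributed according to the posterior.
    """
    return (m_prior_sample / C + g * d_perturbed / R) / (1.0 / C + g * g / R)
```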

  11. Assessing methane emission estimation methods based on atmospheric measurements from oil and gas production using LES simulations

    NASA Astrophysics Data System (ADS)

    Saide, P. E.; Steinhoff, D.; Kosovic, B.; Weil, J.; Smith, N.; Blewitt, D.; Delle Monache, L.

    2017-12-01

    There are a wide variety of methods that have been proposed and used to estimate methane emissions from oil and gas production by using air composition and meteorology observations in conjunction with dispersion models. Although there has been some verification of these methodologies using controlled releases and concurrent atmospheric measurements, it is difficult to assess the accuracy of these methods for more realistic scenarios considering factors such as terrain, emissions from multiple components within a well pad, and time-varying emissions representative of typical operations. In this work we use a large-eddy simulation (LES) to generate controlled but realistic synthetic observations, which can be used to test multiple source term estimation methods, also known as an Observing System Simulation Experiment (OSSE). The LES is based on idealized simulations of the Weather Research & Forecasting (WRF) model at 10 m horizontal grid-spacing covering an 8 km by 7 km domain with terrain representative of a region located in the Barnett shale. Well pads are setup in the domain following a realistic distribution and emissions are prescribed every second for the components of each well pad (e.g., chemical injection pump, pneumatics, compressor, tanks, and dehydrator) using a simulator driven by oil and gas production volume, composition and realistic operational conditions. The system is setup to allow assessments under different scenarios such as normal operations, during liquids unloading events, or during other prescribed operational upset events. Methane and meteorology model output are sampled following the specifications of the emission estimation methodologies and considering typical instrument uncertainties, resulting in realistic observations (see Figure 1). We will show the evaluation of several emission estimation methods including the EPA Other Test Method 33A and estimates using the EPA AERMOD regulatory model. 
We will also show source estimation results from advanced methods such as variational inverse modeling, and Bayesian inference and stochastic sampling techniques. Future directions including other types of observations, other hydrocarbons being considered, and assessment of additional emission estimation methods will be discussed.

  12. Can Functional Brain Imaging Be Used to Explore Interactivity and Cognition in Multimedia Learning Environments?

    ERIC Educational Resources Information Center

    Dalgarno, Barney; Kennedy, Gregor; Bennett, Sue

    2010-01-01

    This paper reviews existing methods used to address questions about interactivity, cognition and learning in multimedia learning environments. Existing behavioural and self-report methods identified include observations, audit trails, questionnaires, interviews, video-stimulated recall, and think-aloud protocols. The limitations of these methods…

  13. Partners in Inquiry: A Collaborative Life Science Investigation with Preservice Teachers and Kindergarten Students

    ERIC Educational Resources Information Center

    Eckhoff, Angela

    2017-01-01

    This article documents a collaborative project involving preservice early childhood education students' development of inquiry-based learning experiences alongside kindergarten students within a science methods course. To document this project, I utilized a multiple methods approach and data included classroom observations, transcripts from lesson…

  14. Post-PRK corneal scatter measurements with a scanning confocal slit photon counter

    NASA Astrophysics Data System (ADS)

    Taboada, John; Gaines, David; Perez, Mary A.; Waller, Steve G.; Ivan, Douglas J.; Baldwin, J. Bruce; LoRusso, Frank; Tutt, Ronald C.; Perez, Jose; Tredici, Thomas; Johnson, Dan A.

    2000-06-01

    Increased corneal light scatter, or 'haze', has been associated with excimer laser photorefractive surgery of the cornea. The increased scatter can affect visual performance; however, topical steroid treatment after surgery substantially reduces the post-PRK scatter. For the treatment and monitoring of the scattering characteristics of the cornea, various methods have been developed to objectively measure the magnitude of the scatter. These methods generally can measure scatter associated with clinically observable levels of haze. For patients with moderate to low PRK corrections receiving steroid treatment, measurement becomes fairly difficult, as the clinical haze rating is non-observable. The goal of this development was to realize an objective, non-invasive physical measurement that could produce a significant reading at any level, including the background present in a normal cornea. As back-scatter is the only readily accessible observable, the instrument is based on this measurement. Achieving this required the use of a confocal method to bias out the background light that would normally confound conventional methods. A number of subjects with nominal refractive errors in an Air Force study have undergone PRK surgery. A measurable increase in corneal scatter has been observed in these subjects even though clinical ratings of the haze were noted as level zero. Other favorable aspects of this back-scatter based instrument include an optical capability to perform what is equivalent to an optical A-scan of the anterior chamber. Lens scatter can also be measured.

  15. The Specificity of Observational Studies in Physical Activity and Sports Sciences: Moving Forward in Mixed Methods Research and Proposals for Achieving Quantitative and Qualitative Symmetry

    PubMed Central

    Anguera, M. Teresa; Camerino, Oleguer; Castañer, Marta; Sánchez-Algarra, Pedro; Onwuegbuzie, Anthony J.

    2017-01-01

    Mixed methods studies are being applied to an increasing diversity of fields. In this paper, we discuss the growing use—and enormous potential—of mixed methods research in the field of sport and physical activity. A second aim is to contribute to strengthening the characteristics of mixed methods research by showing how systematic observation offers rigor within a flexible framework that can be applied to a wide range of situations. Observational methodology is characterized by high scientific rigor and flexibility throughout its different stages and allows the objective study of spontaneous behavior in natural settings, with no external influence. Mixed methods researchers need to take bold yet thoughtful decisions regarding both substantive and procedural issues. We present three fundamental and complementary ideas to guide researchers in this respect: we show why studies of sport and physical activity that use a mixed methods research approach should be included in the field of mixed methods research, we highlight the numerous possibilities offered by observational methodology in this field through the transformation of descriptive data into quantifiable code matrices, and we discuss possible solutions for achieving true integration of qualitative and quantitative findings. PMID:29312061

  16. A novel method for correcting scanline-observational bias of discontinuity orientation

    PubMed Central

    Huang, Lei; Tang, Huiming; Tan, Qinwen; Wang, Dingjian; Wang, Liangqing; Ez Eldin, Mutasim A. M.; Li, Changdong; Wu, Qiong

    2016-01-01

    Scanline observation is known to introduce an angular bias into the probability distribution of orientation in three-dimensional space. In this paper, numerical solutions expressing the functional relationship between the scanline-observational distribution (in one-dimensional space) and the inherent distribution (in three-dimensional space) are derived using probability theory and calculus under the independence hypothesis of dip direction and dip angle. Based on these solutions, a novel method for obtaining the inherent distribution (also for correcting the bias) is proposed, an approach which includes two procedures: 1) Correcting the cumulative probabilities of orientation according to the solutions, and 2) Determining the distribution of the corrected orientations using approximation methods such as the one-sample Kolmogorov-Smirnov test. The inherent distribution corrected by the proposed method can be used for discrete fracture network (DFN) modelling, which is applied to such areas as rockmass stability evaluation, rockmass permeability analysis, rockmass quality calculation and other related fields. To maximize the correction capacity of the proposed method, the observed sample size is suggested through effectiveness tests for different distribution types, dispersions and sample sizes. The performance of the proposed method and the comparison of its correction capacity with existing methods are illustrated with two case studies. PMID:26961249
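
    For comparison with the numerical solutions proposed in the paper, the classical scanline bias correction (commonly attributed to Terzaghi) simply reweights each observed discontinuity by the reciprocal of the cosine of the angle between the scanline and the discontinuity normal. A sketch follows; the cap on the weight (used near the blind zone) is an assumption of this example:

```python
import math

def terzaghi_weights(angles_deg, max_weight=10.0):
    """Classical scanline bias weights (Terzaghi correction).

    angles_deg : angle between the scanline and each observed
                 discontinuity normal, in degrees.
    Planes whose normals are nearly perpendicular to the scanline are
    under-sampled, so each observation is weighted by 1/|cos(angle)|,
    capped to avoid blow-up inside the blind zone.
    """
    weights = []
    for a in angles_deg:
        c = abs(math.cos(math.radians(a)))
        weights.append(min(1.0 / c, max_weight) if c > 1e-12 else max_weight)
    return weights
```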

  17. An information theory application to improve understanding of subsurface flow and transport conditions at the BARC OPE3 site

    USDA-ARS?s Scientific Manuscript database

    Improving understanding of subsurface conditions includes comparison and discrimination of concurrent models. Additional observations can be useful for that purpose. The objective of this work was to implement and test a novel method for optimization of selecting locations for additional observation...

  18. Critical discussion of evaluation parameters for inter-observer variability in target definition for radiation therapy.

    PubMed

    Fotina, I; Lütgendorf-Caucig, C; Stock, M; Pötter, R; Georg, D

    2012-02-01

    Inter-observer studies represent a valid method for the evaluation of target definition uncertainties and contouring guidelines. However, the literature does not yet give clear guidelines for reporting contouring variability. Thus, the purpose of this work was to compare and discuss various methods for determining variability on the basis of clinical cases and a literature review. In this study, 7 prostate and 8 lung cases were contoured on CT images by 8 experienced observers. Analysis of variability included descriptive statistics, calculation of overlap measures, and statistical measures of agreement. Cross tables with ratios and correlations were established for the overlap parameters. It was shown that the minimal set of parameters to be reported should include at least one of three volume overlap measures (i.e., generalized conformity index, Jaccard coefficient, or conformation number). High correlation between these parameters, and scatter of the results, was observed. A combination of descriptive statistics, an overlap measure, and a statistical measure of agreement or reliability analysis is required to fully report inter-rater variability in delineation.
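
    Two of the overlap measures recommended above are straightforward to compute once contours are represented as voxel sets. A minimal sketch of the Jaccard coefficient and a generalized conformity index in pairwise-sum form (exact definitions vary somewhat across papers, so treat the second function as one common variant):

```python
from itertools import combinations

def jaccard(a, b):
    """Jaccard (overlap) coefficient of two contours given as voxel sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def generalized_conformity(volumes):
    """Generalized conformity index over several observers:
    sum of pairwise intersection volumes over sum of pairwise unions."""
    vols = [set(v) for v in volumes]
    inter = sum(len(a & b) for a, b in combinations(vols, 2))
    union = sum(len(a | b) for a, b in combinations(vols, 2))
    return inter / union
```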

  19. Testing alternative ground water models using cross-validation and other methods

    USGS Publications Warehouse

    Foglia, L.; Mehl, S.W.; Hill, M.C.; Perona, P.; Burlando, P.

    2007-01-01

    Many methods can be used to test alternative ground water models. Of concern in this work are methods able to (1) rank alternative models (also called model discrimination) and (2) identify observations important to parameter estimates and predictions (equivalent to the purpose served by some types of sensitivity analysis). Some of the measures investigated are computationally efficient; others are computationally demanding. The latter are generally needed to account for model nonlinearity. The efficient model discrimination methods investigated include the information criteria: the corrected Akaike information criterion, Bayesian information criterion, and generalized cross-validation. The efficient sensitivity analysis measures used are dimensionless scaled sensitivity (DSS), composite scaled sensitivity, and parameter correlation coefficient (PCC); the other statistics are DFBETAS, Cook's D, and observation-prediction statistic. Acronyms are explained in the introduction. Cross-validation (CV) is a computationally intensive nonlinear method that is used for both model discrimination and sensitivity analysis. The methods are tested using up to five alternative parsimoniously constructed models of the ground water system of the Maggia Valley in southern Switzerland. The alternative models differ in their representation of hydraulic conductivity. A new method for graphically representing CV and sensitivity analysis results for complex models is presented and used to evaluate the utility of the efficient statistics. The results indicate that for model selection, the information criteria produce similar results at much smaller computational cost than CV. For identifying important observations, the only obviously inferior linear measure is DSS; the poor performance was expected because DSS does not include the effects of parameter correlation and PCC reveals large parameter correlations. © 2007 National Ground Water Association.
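
    The information criteria compared above can be computed directly from a least-squares fit. A sketch using the Gaussian-error forms with constant terms dropped, which is harmless when ranking models fitted to the same data:

```python
import math

def aicc(n, k, sse):
    """Corrected Akaike information criterion for a least-squares fit:
    n observations, k estimated parameters, sse = sum of squared
    (weighted) residuals. Gaussian-error form, constants dropped."""
    aic = n * math.log(sse / n) + 2 * k
    return aic + 2 * k * (k + 1) / (n - k - 1)   # small-sample correction

def bic(n, k, sse):
    """Bayesian information criterion, same Gaussian-error form."""
    return n * math.log(sse / n) + k * math.log(n)
```

    Lower values indicate the preferred model; the BIC's log(n) penalty favors more parsimonious models than the AICc as the sample grows.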

  20. An evaluation of methods for estimating decadal stream loads

    NASA Astrophysics Data System (ADS)

    Lee, Casey J.; Hirsch, Robert M.; Schwarz, Gregory E.; Holtschlag, David J.; Preston, Stephen D.; Crawford, Charles G.; Vecchia, Aldo V.

    2016-11-01

    Effective management of water resources requires accurate information on the mass, or load of water-quality constituents transported from upstream watersheds to downstream receiving waters. Despite this need, no single method has been shown to consistently provide accurate load estimates among different water-quality constituents, sampling sites, and sampling regimes. We evaluate the accuracy of several load estimation methods across a broad range of sampling and environmental conditions. This analysis uses random sub-samples drawn from temporally-dense data sets of total nitrogen, total phosphorus, nitrate, and suspended-sediment concentration, and includes measurements of specific conductance which was used as a surrogate for dissolved solids concentration. Methods considered include linear interpolation and ratio estimators, regression-based methods historically employed by the U.S. Geological Survey, and newer flexible techniques including Weighted Regressions on Time, Season, and Discharge (WRTDS) and a generalized non-linear additive model. No single method is identified to have the greatest accuracy across all constituents, sites, and sampling scenarios. Most methods provide accurate estimates of specific conductance (used as a surrogate for total dissolved solids or specific major ions) and total nitrogen - lower accuracy is observed for the estimation of nitrate, total phosphorus and suspended sediment loads. Methods that allow for flexibility in the relation between concentration and flow conditions, specifically Beale's ratio estimator and WRTDS, exhibit greater estimation accuracy and lower bias. 
Evaluation of methods across simulated sampling scenarios indicates that (1) high-flow sampling is necessary to produce accurate load estimates, (2) extrapolation of sample data through time or across more extreme flow conditions reduces load estimate accuracy, and (3) WRTDS and methods that use a Kalman filter or smoothing to correct for departures between individual modeled and observed values benefit most from more frequent water-quality sampling.
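
    Beale's ratio estimator, highlighted above for its accuracy and low bias, scales the sampled load-to-flow ratio by the full-period mean flow with a small-sample bias correction. A sketch under assumed notation (variable names are our own):

```python
def beale_ratio_load(sample_q, sample_l, mean_q_all):
    """Beale's bias-corrected ratio estimator of mean daily load.

    sample_q, sample_l : discharge and load on the n sampled days.
    mean_q_all         : mean discharge over the whole estimation period,
                         known from continuous flow records.
    """
    n = len(sample_q)
    qbar = sum(sample_q) / n
    lbar = sum(sample_l) / n
    s_lq = sum((l - lbar) * (q - qbar)
               for l, q in zip(sample_l, sample_q)) / (n - 1)
    s_qq = sum((q - qbar) ** 2 for q in sample_q) / (n - 1)
    bias = (1 + s_lq / (n * lbar * qbar)) / (1 + s_qq / (n * qbar ** 2))
    return mean_q_all * (lbar / qbar) * bias
```

    When load is exactly proportional to flow, the bias factor is 1 and the estimator reduces to the plain ratio estimate.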

  1. An evaluation of methods for estimating decadal stream loads

    USGS Publications Warehouse

    Lee, Casey; Hirsch, Robert M.; Schwarz, Gregory E.; Holtschlag, David J.; Preston, Stephen D.; Crawford, Charles G.; Vecchia, Aldo V.

    2016-01-01

    Effective management of water resources requires accurate information on the mass, or load, of water-quality constituents transported from upstream watersheds to downstream receiving waters. Despite this need, no single method has been shown to consistently provide accurate load estimates among different water-quality constituents, sampling sites, and sampling regimes. We evaluate the accuracy of several load estimation methods across a broad range of sampling and environmental conditions. This analysis uses random sub-samples drawn from temporally dense data sets of total nitrogen, total phosphorus, nitrate, and suspended-sediment concentration, and includes measurements of specific conductance, which was used as a surrogate for dissolved-solids concentration. Methods considered include linear interpolation and ratio estimators, regression-based methods historically employed by the U.S. Geological Survey, and newer flexible techniques including Weighted Regressions on Time, Season, and Discharge (WRTDS) and a generalized non-linear additive model. No single method is identified to have the greatest accuracy across all constituents, sites, and sampling scenarios. Most methods provide accurate estimates of specific conductance (used as a surrogate for total dissolved solids or specific major ions) and total nitrogen; lower accuracy is observed for the estimation of nitrate, total phosphorus, and suspended-sediment loads. Methods that allow for flexibility in the relation between concentration and flow conditions, specifically Beale's ratio estimator and WRTDS, exhibit greater estimation accuracy and lower bias. Evaluation of methods across simulated sampling scenarios indicates that (1) high-flow sampling is necessary to produce accurate load estimates, (2) extrapolation of sample data through time or across more extreme flow conditions reduces load-estimate accuracy, and (3) WRTDS and methods that use a Kalman filter or smoothing to correct for departures between individual modeled and observed values benefit most from more frequent water-quality sampling.
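    Beale's ratio estimator lends itself to a short sketch. The following is a generic, unstratified form (not the authors' exact implementation), assuming streamflow is gauged continuously while concentration, and hence load, is sampled on only a subset of days:

```python
import numpy as np

def beale_ratio_load(sample_load, sample_flow, all_flow):
    """Unstratified Beale ratio estimator of total constituent load.

    sample_load : daily loads (concentration x flow) on the n sampled days
    sample_flow : daily mean streamflow on those same sampled days
    all_flow    : daily mean streamflow for every day of the period
                  (streamflow is assumed to be gauged continuously)
    """
    sample_load = np.asarray(sample_load, dtype=float)
    sample_flow = np.asarray(sample_flow, dtype=float)
    all_flow = np.asarray(all_flow, dtype=float)

    n = sample_load.size
    l_bar = sample_load.mean()
    q_bar = sample_flow.mean()

    # Sample covariance of load and flow, and sample variance of flow
    s_lq = np.cov(sample_load, sample_flow, ddof=1)[0, 1]
    s_qq = sample_flow.var(ddof=1)

    # Beale's bias-corrected ratio of load to flow
    ratio = (l_bar / q_bar) * (1.0 + s_lq / (n * l_bar * q_bar)) \
                            / (1.0 + s_qq / (n * q_bar ** 2))

    # Expand by the total flow volume over the full period
    return ratio * all_flow.sum()
```

    The bias-correction terms are what distinguish Beale's estimator from a naive ratio estimate; when concentration is exactly proportional to flow they cancel and the estimator is exact.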

  2. Photochemical transformations of diazocarbonyl compounds: expected and novel reactions

    NASA Astrophysics Data System (ADS)

    Galkina, O. S.; Rodina, L. L.

    2016-05-01

    Photochemical reactions of diazocarbonyl compounds are well established in synthetic practice as an efficient method for ring contraction and for the homologation of carboxylic acids, and as a means of carbene generation. However, the interpretation of the observed transformations of diazo compounds in electronically excited states remains incomplete and requires careful study of the fine mechanisms of these processes, specific to the different excited states of diazo compounds, using modern methods of investigation, including laser technology. The review analyzes new data in the chemistry of excited states of diazocarbonyl compounds. The bibliography includes 155 references.

  3. Evaluating performance of risk identification methods through a large-scale simulation of observational data.

    PubMed

    Ryan, Patrick B; Schuemie, Martijn J

    2013-10-01

    There has been only limited evaluation of statistical methods for identifying safety risks of drug exposure in observational healthcare data. Simulations can support empirical evaluation, but have not been shown to adequately model the real-world phenomena that challenge observational analyses. To design and evaluate a probabilistic framework (OSIM2) for generating simulated observational healthcare data, and to use these data for evaluating the performance of methods in identifying associations between drug exposure and health outcomes of interest. Seven observational designs, including case-control, cohort, self-controlled case series, and self-controlled cohort designs, were applied to 399 drug-outcome scenarios in 6 simulated datasets with no effect and injected relative risks of 1.25, 1.5, 2, 4, and 10, respectively. Longitudinal data for 10 million simulated patients were generated using a model derived from an administrative claims database, with associated demographics, periods of drug exposure derived from pharmacy dispensings, and medical conditions derived from diagnoses on medical claims. Simulation validation was performed through descriptive comparison with real source data. Method performance was evaluated using area under the ROC curve (AUC), bias, and mean squared error. OSIM2 replicates the prevalence and types of confounding observed in real claims data. When simulated data are injected with relative risks (RR) ≥ 2, all designs have good predictive accuracy (AUC > 0.90), but when RR < 2, no method achieves 100% accurate predictions. Each method exhibits a different bias profile, which changes with the effect size. OSIM2 can support methodological research. Results from simulation suggest method operating characteristics are far from nominal properties.
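    The AUC metric used here to score each design reduces to a Mann-Whitney probability over positive (injected-risk) and negative (no-effect) drug-outcome pairs. A minimal sketch (the scores and function are illustrative, not OSIM2's actual code):

```python
def auc_from_scores(positive_scores, negative_scores):
    """Area under the ROC curve computed as the Mann-Whitney probability
    that a randomly chosen positive (a drug-outcome pair with an injected
    relative risk) is scored higher than a randomly chosen negative control."""
    pairs = len(positive_scores) * len(negative_scores)
    wins = 0.0
    for p in positive_scores:
        for n in negative_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5  # ties count half
    return wins / pairs
```

    An AUC of 1.0 means every true signal out-scores every negative control; 0.5 is chance-level discrimination.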

  4. The Main Belt Comets and ice in the Solar System

    NASA Astrophysics Data System (ADS)

    Snodgrass, Colin; Agarwal, Jessica; Combi, Michael; Fitzsimmons, Alan; Guilbert-Lepoutre, Aurelie; Hsieh, Henry H.; Hui, Man-To; Jehin, Emmanuel; Kelley, Michael S. P.; Knight, Matthew M.; Opitom, Cyrielle; Orosei, Roberto; de Val-Borro, Miguel; Yang, Bin

    2017-11-01

    We review the evidence for buried ice in the asteroid belt; specifically the questions around the so-called Main Belt Comets (MBCs). We summarise the evidence for water throughout the Solar System, and describe the various methods for detecting it, including remote sensing from ultraviolet to radio wavelengths. We review progress in the first decade of study of MBCs, including observations, modelling of ice survival, and discussion on their origins. We then look at which methods will likely be most effective for further progress, including the key challenge of direct detection of (escaping) water in these bodies.

  5. The Inherent Uncertainty of In-Situ Observations and its Implications for Modeling Evapotranspiration

    NASA Astrophysics Data System (ADS)

    Alfieri, J. G.

    2012-12-01

    In-situ observations are essential to a broad range of applications, including the development, calibration, and validation of both numerical and remote sensing-based models. For example, observational data are required to evaluate the skill of these models both to represent the complex biogeophysical processes regulating evapotranspiration (ET) and to predict the magnitude of the moisture flux. As such, by propagating into these subsequent activities, any uncertainty or errors associated with the observational data can adversely impact the accuracy and utility of these models. It is therefore critical that the factors driving measurement uncertainty are fully understood so that steps can be taken to account for their effects and mitigate their impact on subsequent analyses. Field measurements of ET can be collected using a variety of techniques, including eddy covariance (EC), lysimetry (LY), and scintillometry (SC). Each of these methods is underpinned by a unique set of theoretical considerations and practical constraints; as a result, each method is susceptible to differing types of systematic and random error. Since the uncertainty associated with field measurements depends on how well numerous factors, for example, environmental conditions, adhere to those prescribed by the underlying assumptions, the quality of in-situ observations collected via the differing methods can vary significantly both over time and from site to site. Using data from both site studies and large field campaigns, such as IHOP_2002 and BEAREX08, the sources of uncertainty in field observations will be discussed. The impact of measurement uncertainty on model validation will also be illustrated.
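    Of the three techniques named, eddy covariance is the most direct to illustrate: the latent heat flux is air density times the latent heat of vaporization times the covariance of vertical-wind and specific-humidity fluctuations. A minimal sketch with illustrative constants (the constants and names are not from the abstract):

```python
import numpy as np

RHO_AIR = 1.2      # air density, kg m^-3 (illustrative)
LAMBDA_V = 2.45e6  # latent heat of vaporization, J kg^-1 (illustrative)

def latent_heat_flux(w, q):
    """Eddy-covariance latent heat flux: LE = rho * lambda * cov(w', q'),
    where primes denote departures from the averaging-interval means of
    high-frequency vertical wind w (m/s) and specific humidity q (kg/kg)."""
    w = np.asarray(w, dtype=float)
    q = np.asarray(q, dtype=float)
    w_prime = w - w.mean()
    q_prime = q - q.mean()
    return RHO_AIR * LAMBDA_V * np.mean(w_prime * q_prime)
```

    The method's core assumptions (stationarity, homogeneous fetch, negligible mean vertical wind) are exactly the "prescribed" conditions whose violation drives the measurement uncertainty discussed above.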

  6. Partial spline models for the inclusion of tropopause and frontal boundary information in otherwise smooth two- and three-dimensional objective analysis

    NASA Technical Reports Server (NTRS)

    Shiau, Jyh-Jen; Wahba, Grace; Johnson, Donald R.

    1986-01-01

    A new method, based on partial spline models, is developed for including specified discontinuities in otherwise smooth two- and three-dimensional objective analyses. The method is appropriate for including tropopause height information in two- and three-dimensional temperature analyses, using the O'Sullivan-Wahba physical variational method for analysis of satellite radiance data, and may in principle be used in a combined variational analysis of observed, forecast, and climate information. A numerical method for its implementation is described and a prototype two-dimensional analysis based on simulated radiosonde and tropopause height data is shown. The method may also be appropriate for other geophysical problems, such as modeling the ocean thermocline, fronts, discontinuities, etc.
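    A partial spline model of the kind described can be sketched generically (the notation here is illustrative, not the authors'):

```latex
% Generic partial spline model: the analyzed field is a smooth component
% plus a small number of fixed basis functions carrying the prescribed
% discontinuity,
\[
  f(x) \;=\; g(x) \;+\; \sum_{j=1}^{q} \beta_j\,\psi_j(x)
\]
% where g is a smooth (spline) term estimated by penalized least squares,
% and each \psi_j is discontinuous across the specified surface (e.g. the
% tropopause height), so the fitted analysis is smooth everywhere except
% where the discontinuity is deliberately imposed.
```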

  7. An aerial survey method to estimate sea otter abundance

    USGS Publications Warehouse

    Bodkin, James L.; Udevitz, Mark S.; Garner, Gerald W.; Amstrup, Steven C.; Laake, Jeffrey L.; Manly, Bryan F.J.; McDonald, Lyman L.; Robertson, Donna G.

    1999-01-01

    Sea otters (Enhydra lutris) occur in shallow coastal habitats and can be highly visible on the sea surface. They generally rest in groups and their detection depends on factors that include sea conditions, viewing platform, observer technique and skill, distance, habitat and group size. While visible on the surface, they are difficult to see while diving and may dive in response to an approaching survey platform. We developed and tested an aerial survey method that uses intensive searches within portions of strip transects to adjust for availability and sightability biases. Correction factors are estimated independently for each survey and observer. In tests of our method using shore-based observers, we estimated detection probabilities of 0.52-0.72 in standard strip-transects and 0.96 in intensive searches. We used the survey method in Prince William Sound, Alaska to estimate a sea otter population size of 9,092 (SE = 1422). The new method represents an improvement over various aspects of previous methods, but additional development and testing will be required prior to its broad application.
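    The availability/sightability correction described above can be sketched as follows. The function names and numbers are hypothetical, but the logic (estimate a survey- and observer-specific detection probability from the intensive searches, then expand the strip counts) follows the abstract:

```python
def detection_probability(groups_seen_standard, groups_found_intensive):
    """Survey- and observer-specific sightability: groups detected on the
    standard strip pass divided by groups present, treating the intensive
    search as near-complete (the authors estimated ~0.96 detection there)."""
    return groups_seen_standard / groups_found_intensive

def estimate_abundance(strip_count, p_detect, area_fraction_surveyed):
    """Expand the detection-corrected strip-transect count to the study area."""
    return strip_count / p_detect / area_fraction_surveyed
```

    For example, if an observer detected 13 of 20 groups found by intensive searching, counted 130 otters on strips, and the strips covered 10% of the study area, the corrected estimate would be 130 / 0.65 / 0.1 = 2,000 otters.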

  8. New measurements of photospheric magnetic fields in late-type stars and emerging trends

    NASA Technical Reports Server (NTRS)

    Saar, S. H.; Linsky, J. L.

    1986-01-01

    The magnetic fields of late-type stars are measured using the method of Saar et al. (1986). The method includes radiative transfer effects and compensation for line blending; the photospheric magnetic field parameters are derived by comparing observed and theoretical line profiles using an LTE code that includes line saturation and the full Zeeman pattern. The preliminary mean active region magnetic field strengths (B) and surface area coverages for 20 stars are discussed. It is observed that there is a trend of increasing B towards the cooler dwarf stars, and the linear correlation between B and the equipartition value of the magnetic field strength suggests that the photospheric gas pressure determines the photospheric magnetic field strengths. A tendency toward larger filling factors at larger stellar angular velocities is also detected.

  9. A manual for inexpensive methods of analyzing and utilizing remote sensor data

    NASA Technical Reports Server (NTRS)

    Elifrits, C. D.; Barr, D. J.

    1978-01-01

    Instructions are provided for inexpensive methods of using remote sensor data to observe the earth's surface. Where possible, relative costs are included. Equipment needed for analysis of remote sensor data is described, along with methods of using these equipment items and the advantages and disadvantages of each. Interpretation and analysis of stereo photos and the interpretation of typical patterns such as tone and texture, land cover, drainage, and erosional form are described. Similar treatment is given to monoscopic image interpretation, including LANDSAT MSS data. Enhancement techniques are detailed with respect to their application, together with simple techniques for creating enhanced data items. Techniques described include additive and subtractive (Diazo process) color techniques and enlargement of photos or images. Applications of these processes, including mapping of land resources, engineering soils, geology, water resources, environmental conditions, and crops and/or vegetation, are outlined.

  10. Atmospheric Effects on InSAR Measurements and Their Mitigation

    PubMed Central

    Ding, Xiao-li; Li, Zhi-wei; Zhu, Jian-jun; Feng, Guang-cai; Long, Jiang-ping

    2008-01-01

    Interferometric Synthetic Aperture Radar (InSAR) is a powerful technology for observing the Earth's surface, especially for mapping the Earth's topography and deformations. InSAR measurements are however often significantly affected by the atmosphere as the radar signals propagate through the atmosphere whose state varies both in space and in time. Great efforts have been made in recent years to better understand the properties of the atmospheric effects and to develop methods for mitigating the effects. This paper provides a systematic review of the work carried out in this area. The basic principles of atmospheric effects on repeat-pass InSAR are first introduced. The studies on the properties of the atmospheric effects, including the magnitudes of the effects determined in the various parts of the world, the spectra of the atmospheric effects, the isotropic properties and the statistical distributions of the effects, are then discussed. The various methods developed for mitigating the atmospheric effects are then reviewed, including the methods that are based on PSInSAR processing, the methods that are based on interferogram modeling, and those that are based on external data such as GPS observations, ground meteorological data, and satellite data including those from the MODIS and MERIS. Two examples that use MODIS and MERIS data respectively to calibrate atmospheric effects on InSAR are also given. PMID:27873822
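    One of the simplest interferogram-modeling mitigations in this literature removes the topography-correlated (stratified) part of the atmospheric delay by regressing phase on elevation. A minimal sketch, not any specific paper's implementation:

```python
import numpy as np

def remove_stratified_delay(phase, elevation):
    """Fit phase = a*elevation + b by least squares and subtract the trend.

    This targets only the topography-correlated (stratified) component of
    the atmospheric delay; the turbulent component requires other
    mitigations, such as temporal filtering in PSInSAR processing or
    external GPS / MODIS / MERIS water-vapour data."""
    phase = np.asarray(phase, dtype=float)
    elevation = np.asarray(elevation, dtype=float)
    A = np.column_stack([elevation, np.ones_like(elevation)])
    coeffs, *_ = np.linalg.lstsq(A, phase, rcond=None)
    return phase - A @ coeffs
```

    The fitted slope is sometimes interpreted as a phase-elevation delay rate; subtracting the trend can also remove real deformation that happens to correlate with topography, which is one reason external data are preferred when available.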

  11. Atmospheric Effects on InSAR Measurements and Their Mitigation.

    PubMed

    Ding, Xiao-Li; Li, Zhi-Wei; Zhu, Jian-Jun; Feng, Guang-Cai; Long, Jiang-Ping

    2008-09-03

    Interferometric Synthetic Aperture Radar (InSAR) is a powerful technology for observing the Earth's surface, especially for mapping the Earth's topography and deformations. InSAR measurements are however often significantly affected by the atmosphere as the radar signals propagate through the atmosphere whose state varies both in space and in time. Great efforts have been made in recent years to better understand the properties of the atmospheric effects and to develop methods for mitigating the effects. This paper provides a systematic review of the work carried out in this area. The basic principles of atmospheric effects on repeat-pass InSAR are first introduced. The studies on the properties of the atmospheric effects, including the magnitudes of the effects determined in the various parts of the world, the spectra of the atmospheric effects, the isotropic properties and the statistical distributions of the effects, are then discussed. The various methods developed for mitigating the atmospheric effects are then reviewed, including the methods that are based on PSInSAR processing, the methods that are based on interferogram modeling, and those that are based on external data such as GPS observations, ground meteorological data, and satellite data including those from the MODIS and MERIS. Two examples that use MODIS and MERIS data respectively to calibrate atmospheric effects on InSAR are also given.

  12. Functional Additive Mixed Models

    PubMed Central

    Scheipl, Fabian; Staicu, Ana-Maria; Greven, Sonja

    2014-01-01

    We propose an extensive framework for additive regression models for correlated functional responses, allowing for multiple partially nested or crossed functional random effects with flexible correlation structures for, e.g., spatial, temporal, or longitudinal functional data. Additionally, our framework includes linear and nonlinear effects of functional and scalar covariates that may vary smoothly over the index of the functional response. It accommodates densely or sparsely observed functional responses and predictors which may be observed with additional error and includes both spline-based and functional principal component-based terms. Estimation and inference in this framework is based on standard additive mixed models, allowing us to take advantage of established methods and robust, flexible algorithms. We provide easy-to-use open source software in the pffr() function for the R-package refund. Simulations show that the proposed method recovers relevant effects reliably, handles small sample sizes well and also scales to larger data sets. Applications with spatially and longitudinally observed functional data demonstrate the flexibility in modeling and interpretability of results of our approach. PMID:26347592

  13. Functional Additive Mixed Models.

    PubMed

    Scheipl, Fabian; Staicu, Ana-Maria; Greven, Sonja

    2015-04-01

    We propose an extensive framework for additive regression models for correlated functional responses, allowing for multiple partially nested or crossed functional random effects with flexible correlation structures for, e.g., spatial, temporal, or longitudinal functional data. Additionally, our framework includes linear and nonlinear effects of functional and scalar covariates that may vary smoothly over the index of the functional response. It accommodates densely or sparsely observed functional responses and predictors which may be observed with additional error and includes both spline-based and functional principal component-based terms. Estimation and inference in this framework is based on standard additive mixed models, allowing us to take advantage of established methods and robust, flexible algorithms. We provide easy-to-use open source software in the pffr() function for the R-package refund. Simulations show that the proposed method recovers relevant effects reliably, handles small sample sizes well and also scales to larger data sets. Applications with spatially and longitudinally observed functional data demonstrate the flexibility in modeling and interpretability of results of our approach.

  14. How are Curious People Viewed and How Do they Behave in Social Situations? From the Perspectives of Self, Friends, Parents, and Unacquainted Observers

    PubMed Central

    Kashdan, Todd B.; Sherman, Ryne A.; Yarbro, Jessica; Funder, David C.

    2012-01-01

    Objective People who are open and curious orient their lives around an appreciation of novelty and a strong urge to explore, discover, and grow. Researchers have recently shown that being an open, curious person is linked to healthy social outcomes. Method To better understand the benefits (and liabilities) of being a curious person, we used a multi-method design of social behavior to assess the perspectives of multiple informants including self, friends, and parents, and behavior coded from direct observations in unstructured social interactions. Results We found an impressive degree of convergence among self, friends, and parent reports of curiosity, and observer-rated behavioral correlates of curiosity. A curious personality was linked to a wide range of adaptive behaviors including tolerance of anxiety and uncertainty, positive emotional expressiveness, initiation of humor and playfulness, unconventional thinking, and a non-defensive, non-critical attitude. Conclusions This characterization of curious people provides insights into mechanisms underlying associated healthy social outcomes. PMID:22583101

  15. From Ethnography to Items: A Mixed Methods Approach to Developing a Survey to Examine Graduate Engineering Student Retention

    ERIC Educational Resources Information Center

    Crede, Erin; Borrego, Maura

    2013-01-01

    As part of a sequential exploratory mixed methods study, 9 months of ethnographically guided observations and interviews were used to develop a survey examining graduate engineering student retention. Findings from the ethnographic fieldwork yielded several themes, including international diversity, research group organization and climate,…

  16. Evaluation of Methods to Estimate the Surface Downwelling Longwave Flux during Arctic Winter

    NASA Technical Reports Server (NTRS)

    Chiacchio, Marc; Francis, Jennifer; Stackhouse, Paul, Jr.

    2002-01-01

    Surface longwave radiation fluxes dominate the energy budget of nighttime polar regions, yet little is known about the relative accuracy of existing satellite-based techniques to estimate this parameter. We compare eight methods to estimate the downwelling longwave radiation flux and validate their performance with measurements from two field programs in the Arctic: the Coordinated Eastern Arctic Experiment (CEAREX) conducted in the Barents Sea during the autumn and winter of 1988, and the Lead Experiment performed in the Beaufort Sea in the spring of 1992. Five of the eight methods were developed for satellite-derived quantities, and three are simple parameterizations based on surface observations. All of the algorithms require information about cloud fraction, which is provided from the NASA-NOAA Television and Infrared Observation Satellite (TIROS) Operational Vertical Sounder (TOVS) polar pathfinder dataset (Path-P); some techniques ingest temperature and moisture profiles (also from Path-P); one-half of the methods assume that clouds are opaque and have a constant geometric thickness of 50 hPa, and three include no thickness information whatsoever. With a somewhat limited validation dataset, the following primary conclusions result: (1) all methods exhibit approximately the same correlations with measurements and rms differences, but the biases range from -34 W/sq m (16% of the mean) to nearly 0; (2) the error analysis described here indicates that the assumption of a 50-hPa cloud thickness is too thin by a factor of 2 on average in polar nighttime conditions; (3) cloud-overlap techniques, which effectively increase mean cloud thickness, significantly improve the results; (4) simple Arctic-specific parameterizations performed poorly, probably because they were developed with surface-observed cloud fractions; and (5) the single algorithm that includes an estimate of cloud thickness exhibits the smallest differences from observations.
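    A simple surface-based parameterization of the kind compared here can be sketched as follows. The functional form follows Maykut and Church's Arctic formula (clear-sky emission scaled up by cloud fraction), but treat the coefficients as illustrative rather than as any of the eight evaluated methods:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def downwelling_longwave(t_air_kelvin, cloud_fraction):
    """Downwelling longwave flux (W m^-2) from 2-m air temperature and
    cloud fraction in [0, 1]: effective clear-sky emission multiplied by
    a cloud-fraction enhancement factor."""
    clear_sky = 0.7855 * SIGMA * t_air_kelvin ** 4
    return clear_sky * (1.0 + 0.2232 * cloud_fraction ** 2.75)
```

    Such parameterizations depend only on screen-level quantities, which is why training them on surface-observed cloud fractions and then driving them with satellite-derived cloud fractions can degrade their performance, as conclusion (4) suggests.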

  17. The development and validation of The Inquiry Science Observation Coding Sheet.

    PubMed

    Brandon, P R; Taum, A K H; Young, D B; Pottenger, F M

    2008-08-01

    Evaluation reports increasingly document the degree of program implementation, particularly the extent to which programs adhere to prescribed steps and procedures. Many reports are cursory, however, and few, if any, fully portray the long and winding path taken when developing evaluation instruments, particularly observation instruments. In this article, we describe the development of an observational method for evaluating the degree to which K-12 inquiry science programs are implemented, including the many steps and decisions that occurred during the development, and present evidence for the reliability and validity of the data that we collected with the instrument. The article introduces a method for measuring the adherence of inquiry science implementation and gives evaluators a full picture of what they might expect when developing observation instruments for assessing the degree of program implementation.

  18. The effects of climate downscaling technique and observational data set on modeled ecological responses.

    PubMed

    Pourmokhtarian, Afshin; Driscoll, Charles T; Campbell, John L; Hayhoe, Katharine; Stoner, Anne M K

    2016-07-01

    Assessments of future climate change impacts on ecosystems typically rely on multiple climate model projections, but often utilize only one downscaling approach trained on one set of observations. Here, we explore the extent to which modeled biogeochemical responses to changing climate are affected by the selection of the climate downscaling method and training observations used at the montane landscape of the Hubbard Brook Experimental Forest, New Hampshire, USA. We evaluated three downscaling methods: the delta method (or the change factor method), monthly quantile mapping (Bias Correction-Spatial Disaggregation, or BCSD), and daily quantile regression (Asynchronous Regional Regression Model, or ARRM). Additionally, we trained outputs from four atmosphere-ocean general circulation models (AOGCMs) (CCSM3, HadCM3, PCM, and GFDL-CM2.1) driven by higher (A1fi) and lower (B1) future emissions scenarios on two sets of observations (1/8° resolution grid vs. individual weather station) to generate the high-resolution climate input for the forest biogeochemical model PnET-BGC (eight ensembles of six runs). The choice of downscaling approach and spatial resolution of the observations used to train the downscaling model impacted modeled soil moisture and streamflow, which in turn affected forest growth, net N mineralization, net soil nitrification, and stream chemistry. All three downscaling methods were highly sensitive to the observations used, resulting in projections that were significantly different between station-based and grid-based observations. The choice of downscaling method also slightly affected the results, however not as much as the choice of observations. Using spatially smoothed gridded observations and/or methods that do not resolve sub-monthly shifts in the distribution of temperature and/or precipitation can produce biased results in model applications run at greater temporal and/or spatial resolutions. These results underscore the importance of carefully considering field observations used for training, as well as the downscaling method used to generate climate change projections, for smaller-scale modeling studies. Different sources of variability, including selection of AOGCM, emissions scenario, downscaling technique, and data used for training downscaling models, result in a wide range of projected forest ecosystem responses to future climate change. © 2016 by the Ecological Society of America.
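    Of the three downscaling approaches compared, the delta (change factor) method is the simplest to sketch. The function below (names and inputs are illustrative) shifts each observed day by the GCM-projected monthly-mean change, which is precisely why it cannot resolve sub-monthly shifts in the distribution:

```python
def delta_method_downscale(obs_daily, gcm_hist_monthly, gcm_future_monthly):
    """Delta (change factor) downscaling for temperature.

    obs_daily          : list of (month_index, temperature) pairs
    gcm_hist_monthly   : GCM monthly-mean temperature, historical run
    gcm_future_monthly : GCM monthly-mean temperature, future scenario

    Each observed day is shifted by the projected change for its calendar
    month; the shape of the daily distribution within a month is preserved
    unchanged from the observations."""
    return [(m, t + (gcm_future_monthly[m] - gcm_hist_monthly[m]))
            for m, t in obs_daily]
```

    BCSD and ARRM instead adjust quantiles of the distribution (monthly and daily, respectively), so they can alter variability as well as the mean; all three, however, inherit whatever biases exist in the training observations.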

  19. Development of a novel observational measure for anxiety in young children: The Anxiety Dimensional Observation Scale

    PubMed Central

    Mian, Nicholas D.; Carter, Alice S.; Pine, Daniel S.; Wakschlag, Lauren S.; Briggs-Gowan, Margaret J.

    2015-01-01

    Background Identifying anxiety disorders in preschool-age children represents an important clinical challenge. Observation is essential to clinical assessment and can help differentiate normative variation from clinically significant anxiety. Yet, most anxiety assessment methods for young children rely on parent-reports. The goal of this article is to present and preliminarily test the reliability and validity of a novel observational paradigm for assessing a range of fearful and anxious behaviors in young children, the Anxiety Dimensional Observation Schedule (Anx-DOS). Methods A diverse sample of 403 children, aged 3 to 6 years, and their mothers was studied. Reliability and validity in relation to parent reports (Preschool Age Psychiatric Assessment) and known risk factors, including indicators of behavioral inhibition (latency to touch novel objects) and attention bias to threat (in the dot-probe task) were investigated. Results The Anx-DOS demonstrated good inter-rater reliability and internal consistency. Evidence for convergent validity was demonstrated relative to mother-reported separation anxiety, social anxiety, phobic avoidance, trauma symptoms, and past service use. Finally, fearfulness was associated with observed latency and attention bias toward threat. Conclusions Findings support the Anx-DOS as a method for capturing early manifestations of fearfulness and anxiety in young children. Multimethod assessments incorporating standardized methods for assessing discrete, observable manifestations of anxiety may be beneficial for early identification and clinical intervention efforts. PMID:25773515

  20. Impact of Tropopause Structures on Deep Convective Transport Observed during MACPEX

    NASA Astrophysics Data System (ADS)

    Mullendore, G. L.; Bigelbach, B. C.; Christensen, L. E.; Maddox, E.; Pinkney, K.; Wagner, S.

    2016-12-01

    Deep convection is the most efficient method of transporting boundary layer mass to the upper troposphere and lower stratosphere (UTLS). The Mid-latitude Airborne Cirrus Properties Experiment (MACPEX) was conducted during April of 2011 over the central U.S. With a focus on cirrus clouds, the campaign flights often sampled large cirrus anvils downstream from deep convection and included an extensive observational suite of chemical measurements on a high-altitude aircraft. As double-tropopause structures are a common feature in the central U.S. during the springtime, the MACPEX campaign provides a good opportunity to gather cases of deep convective transport in the context of both single and double tropopause structures. Sampling of chemical plumes well downstream from convection allows for sampling in relatively quiescent conditions and analysis of irreversible transport. The analysis presented includes multiple methods to assess air mass source and possible convective processing, including back trajectories and ratios of chemical concentrations. Although missions were flown downstream of deep convection, recent processing by convection does not seem likely in all cases in which high-altitude carbon monoxide plumes were observed. Additionally, the impact of single and double tropopause structures on deep convective transport is shown to be strongly dependent on high stability layers.

  1. Graphene device and method of using graphene device

    DOEpatents

    Bouchiat, Vincent; Girit, Caglar; Kessler, Brian; Zettl, Alexander K.

    2015-08-11

    An embodiment of a graphene device includes a layered structure, first and second electrodes, and a dopant island. The layered structure includes a conductive layer, an insulating layer, and a graphene layer. The electrodes are coupled to the graphene layer. The dopant island is coupled to an exposed surface of the graphene layer between the electrodes. An embodiment of a method of using a graphene device includes providing the graphene device. A voltage is applied to the conductive layer of the graphene device. Another embodiment of a method of using a graphene device includes providing the graphene device without the dopant island. A dopant island is placed on an exposed surface of the graphene layer between the electrodes. A voltage is applied to the conductive layer of the graphene device. A response of the dopant island to the voltage is observed.

  2. Studies on Training Ground Observers to Estimate Range to Aerial Targets.

    ERIC Educational Resources Information Center

    McCluskey, Michael R.; And Others

    Six pilot studies were conducted to determine the effects of training on range estimation performance for aerial targets, and to identify some of the relevant variables. Observers were trained to estimate ranges of 350, 400, 800, 1,500, or 2,500 meters. Several variations of range estimation training methods were used, including immediate…

  3. Spectroradiometric considerations for advanced land observing systems

    NASA Technical Reports Server (NTRS)

    Slater, P. N.

    1986-01-01

    Research aimed at improving the inflight absolute radiometric calibration of advanced land observing systems was initiated. Emphasis was on the satellite sensor calibration program at White Sands. Topics addressed include: absolute radiometric calibration of advanced remote sensing; atmospheric effects on reflected radiation; inflight radiometric calibration; field radiometric methods for reflectance and atmospheric measurement; and calibration of field reflectance radiometers.

  4. Assessing the Impact of Observations on Numerical Weather Forecasts Using the Adjoint Method

    NASA Technical Reports Server (NTRS)

    Gelaro, Ronald

    2012-01-01

    The adjoint of a data assimilation system provides a flexible and efficient tool for estimating observation impacts on short-range weather forecasts. The impacts of any or all observations can be estimated simultaneously based on a single execution of the adjoint system. The results can be easily aggregated according to data type, location, channel, etc., making this technique especially attractive for examining the impacts of new hyper-spectral satellite instruments and for conducting regular, even near-real time, monitoring of the entire observing system. This talk provides a general overview of the adjoint method, including the theoretical basis and practical implementation of the technique. Results are presented from the adjoint-based observation impact monitoring tool in NASA's GEOS-5 global atmospheric data assimilation and forecast system. When performed in conjunction with standard observing system experiments (OSEs), the adjoint results reveal both redundancies and dependencies between observing system impacts as observations are added or removed from the assimilation system. Understanding these dependencies may be important for optimizing the use of the current observational network and defining requirements for future observing systems.
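    The adjoint observation-impact calculation can be written compactly. This is the standard Langland-Baker form with generic notation, not necessarily GEOS-5's exact formulation:

```latex
% Adjoint-based observation impact (Langland--Baker form): the change in a
% forecast error norm e attributed to assimilating observations y is
% approximated by
\[
  \delta e \;\approx\;
  \bigl(\mathbf{y} - H(\mathbf{x}_b)\bigr)^{\top}\,
  \mathbf{K}^{\top}\,
  \frac{\partial e}{\partial \mathbf{x}_a}
\]
% where y - H(x_b) is the innovation vector, K is the gain matrix of the
% assimilation, and K^T (the adjoint of the analysis) maps the forecast
% error gradient, itself obtained from the adjoint of the forecast model,
% back onto observation space. Because the result is a sum over individual
% innovations, it can be partitioned by data type, channel, or location,
% which is what makes single-execution, per-observation monitoring possible.
```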

  5. Quantifying the predictive consequences of model error with linear subspace analysis

    USGS Publications Warehouse

    White, Jeremy T.; Doherty, John E.; Hughes, Joseph D.

    2014-01-01

    All computer models are simplified and imperfect simulators of complex natural systems. The discrepancy arising from simplification induces bias in model predictions, which may be amplified by the process of model calibration. This paper presents a new method to identify and quantify the predictive consequences of calibrating a simplified computer model. The method is based on linear theory, and it scales efficiently to the large numbers of parameters and observations characteristic of groundwater and petroleum reservoir models. The method is applied to a range of predictions made with a synthetic integrated surface-water/groundwater model with thousands of parameters. Several different observation processing strategies and parameterization/regularization approaches are examined in detail, including use of the Karhunen-Loève parameter transformation. Predictive bias arising from model error is shown to be prediction specific and often invisible to the modeler. The amount of calibration-induced bias is influenced by several factors, including how expert knowledge is applied in the design of parameterization schemes, the number of parameters adjusted during calibration, how observations and model-generated counterparts are processed, and the level of fit with observations achieved through calibration. Failure to properly implement any of these factors in a prediction-specific manner may increase the potential for predictive bias in ways that are not visible to the calibration and uncertainty analysis process.
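The Karhunen-Loève parameter transformation mentioned above re-expresses correlated parameters in the eigenvector basis of their prior covariance, so calibration adjusts a few dominant modes rather than many correlated parameters. A minimal sketch under an assumed, hypothetical covariance (not the paper's model):

```python
import numpy as np

# assumed prior covariance for three spatially correlated model parameters
cov = np.array([[1.0, 0.8, 0.5],
                [0.8, 1.0, 0.8],
                [0.5, 0.8, 1.0]])

eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]        # re-sort descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()
print(np.round(explained, 2))   # fraction of prior variance carried by each KL mode

# truncating to the leading k modes: calibration then adjusts the k mode
# coefficients instead of all correlated parameters directly
k = 1
basis = eigvecs[:, :k]   # parameters ~ basis @ coefficients
```

With strong prior correlation, most of the variance lands in the first mode, which is why the truncation can sharply reduce the number of adjustable quantities.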

  6. Underlying risk factors for prescribing errors in long-term aged care: a qualitative study.

    PubMed

    Tariq, Amina; Georgiou, Andrew; Raban, Magdalena; Baysari, Melissa Therese; Westbrook, Johanna

    2016-09-01

    To identify system-related risk factors perceived to contribute to prescribing errors in Australian long-term care settings, that is, residential aged care facilities (RACFs). The study used qualitative methods to explore factors that contribute to unsafe prescribing in RACFs. Data were collected at three RACFs in metropolitan Sydney, Australia between May and November 2011. Participants included RACF managers, doctors, pharmacists and RACF staff actively involved in prescribing-related processes. Methods included non-participant observations (74 h), in-depth semistructured interviews (n=25) and artefact analysis. Detailed process activity models were developed for observed prescribing episodes supplemented by triangulated analysis using content analysis methods. System-related factors perceived to increase the risk of prescribing errors in RACFs were classified into three overarching themes: communication systems, team coordination and staff management. Factors associated with communication systems included limited point-of-care access to information, inadequate handovers, information storage across different media (paper, electronic and memory), poor legibility of charts, information double handling, multiple faxing of medication charts and reliance on manual chart reviews. Team factors included lack of established lines of responsibility, inadequate team communication and limited participation of doctors in multidisciplinary initiatives like medication advisory committee meetings. Factors related to staff management and workload included doctors' time constraints and their accessibility, lack of trained RACF staff and high RACF staff turnover. The study highlights several system-related factors including laborious methods for exchanging medication information, which often act together to contribute to prescribing errors. 
Multiple interventions (eg, technology systems, team communication protocols) are required to support the collaborative nature of RACF prescribing. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  7. [Control of anticoagulation in patients with non-valvular atrial fibrillation in a primary care clinical practice setting in the different autonomous communities. PAULA study].

    PubMed

    Polo García, J; Barrios Alonso, V; Escobar Cervantes, C; Prieto Valiente, L; Lobos Bejarano, J M; Vargas Ortega, D; Prieto Díaz, M Á; Alonso Moreno, F J; Barquilla García, A

    2017-04-01

    To determine the differences between regions in the level of control of patients with non-valvular atrial fibrillation treated with vitamin K antagonists, included in the PAULA study. Observational, cross-sectional/retrospective study, including 139 Primary Care physicians from 99 Health Care centres in all autonomous communities (except La Rioja). Anticoagulation control was defined as the time in therapeutic range assessed by either the direct method (poor control <60%) or the Rosendaal method (poor control <65%). A total of 1,524 patients were included. Small differences in the baseline characteristics of the patients were observed. Differences in the percentage of time in therapeutic range were observed: according to the Rosendaal method (mean 69.0±17.7%), from 78.1±16.6% (Basque Country) to 61.5±14.0% (Balearic Islands); by the direct method (mean 63.2±17.9%), from 73.6±16.6% (Basque Country) to 57.5±15.7% (Extremadura). When comparing regions, in those where the Primary Care physicians assumed full control without restrictions on prescription, the percentage of time in therapeutic range by the direct method was 63.89% vs. 60.95% in those with restrictions (p=.006); by the Rosendaal method, 69.39% compared with 67.68% (p=.1036). There are significant differences in the level of control between regions, and levels of control in some regions are still inadequate. In regions where the Primary Care physicians assumed the management of anticoagulation without prescription restrictions, time in therapeutic range was somewhat higher, showing a favourable trend towards better control. These findings may have clinical implications, and deserve consideration and specific analysis. Copyright © 2016 Sociedad Española de Médicos de Atención Primaria (SEMERGEN). Publicado por Elsevier España, S.L.U. All rights reserved.
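The two control metrics named in this record differ in how they treat the time between INR measurements; a minimal Python sketch contrasts them, using hypothetical dates, hypothetical INR values, and the conventional 2.0-3.0 target range (none taken from the study):

```python
from datetime import date

def ttr_direct(inrs, low=2.0, high=3.0):
    """Direct method: fraction of INR measurements inside the target range."""
    return sum(low <= v <= high for v in inrs) / len(inrs)

def ttr_rosendaal(days, inrs, low=2.0, high=3.0, steps=100):
    """Rosendaal method: fraction of time in range, interpolating INR
    linearly between consecutive measurements."""
    time_total = 0.0
    time_in = 0.0
    for (d0, v0), (d1, v1) in zip(zip(days, inrs), zip(days[1:], inrs[1:])):
        span = (d1 - d0).days
        for k in range(steps):
            t = (k + 0.5) / steps        # midpoint of each sub-interval
            v = v0 + (v1 - v0) * t       # linearly interpolated INR
            time_total += span / steps
            if low <= v <= high:
                time_in += span / steps
    return time_in / time_total

days = [date(2016, 1, 1), date(2016, 1, 15), date(2016, 2, 1)]
inrs = [1.8, 2.6, 3.4]
print(round(ttr_direct(inrs), 2))             # 1 of 3 measurements in range: 0.33
print(round(ttr_rosendaal(days, inrs), 2))    # about 19 of 31 days in range: 0.61
```

The direct method counts measurements in range, while the Rosendaal method credits the interpolated days in range, so the two can disagree for the same patient, as the study's different regional means reflect.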

  8. Spontaneously Generating Life in Your Classroom? Pasteur, Spallanzani and Science Process.

    ERIC Educational Resources Information Center

    Byington, Scott

    2001-01-01

    Presents an experiment that tests for spontaneous generation, or abiogenesis. Observes microbial growth in nutrient broth under seven different flask environments. Includes instructions for the methods. (YDS)

  9. Investigations of the lower and middle atmosphere at the Arecibo Observatory and a description of the new VHF radar project

    NASA Technical Reports Server (NTRS)

    Rottger, J.; Ierkic, H. M.; Zimmerman, R. K.; Hagen, J.

    1986-01-01

    The atmospheric science research at the Arecibo Observatory is performed by means of (active) radar methods and (passive) optical methods. The active methods utilize the 430 MHz radar, the S-band radar at 2380 MHz, and a recently constructed Very High Frequency (VHF) radar. The passive methods include measurements of the mesopause temperature by observing the rotational emissions from OH bands. The VHF radar design is discussed.

  10. Accelerated aging: prediction of chemical stability of pharmaceuticals.

    PubMed

    Waterman, Kenneth C; Adami, Roger C

    2005-04-11

    Methods of rapidly and accurately assessing the chemical stability of pharmaceutical dosage forms are reviewed with respect to the major degradation mechanisms generally observed in pharmaceutical development. Methods are discussed, with the appropriate caveats, for accelerated aging of liquid and solid dosage forms, including small and large molecule active pharmaceutical ingredients. In particular, this review covers general thermal methods, as well as accelerated aging methods appropriate to oxidation, hydrolysis, reaction with reactive excipient impurities, photolysis and protein denaturation.

  11. Integrating qualitative research into occupational health: a case study among hospital workers.

    PubMed

    Gordon, Deborah R; Ames, Genevieve M; Yen, Irene H; Gillen, Marion; Aust, Birgit; Rugulies, Reiner; Frank, John W; Blanc, Paul D

    2005-04-01

    We sought to better use qualitative approaches in occupational health research and integrate them with quantitative methods. We systematically reviewed, selected, and adapted qualitative research methods as part of a multisite study of the predictors and outcomes of work-related musculoskeletal disorders among hospital workers in two large urban tertiary hospitals. The methods selected included participant observation; informal, open-ended, and semistructured interviews with individuals or small groups; and archival study. The nature of the work and social life of the hospitals and the foci of the study all favored using more participant observation methods in the case study than initially anticipated. Exploiting the full methodological spectrum of qualitative methods in occupational health is increasingly relevant. Although labor-intensive, these approaches may increase the yield of established quantitative approaches otherwise used in isolation.

  12. A usability evaluation of Lazada mobile application

    NASA Astrophysics Data System (ADS)

    Hussain, Azham; Mkpojiogu, Emmanuel O. C.; Jamaludin, Nur Hafiza; Moh, Somia T. L.

    2017-10-01

    This paper reports on a usability evaluation of the Lazada mobile application, an online shopping app for mobile devices. The evaluation was conducted with 12 users aged 18 to 24; seven were expert users and the other five were novice users. The study objectives were to evaluate the perceived effectiveness, efficiency and satisfaction of the mobile application. The results provide positive feedback and show that the mobile shopping app is effective, efficient, and satisfying as perceived by the study participants. However, there are some observed usability issues with the main menu and the payment method that necessitate improvements to increase the application's effectiveness, efficiency and satisfaction. The suggested improvements include: 1) the main menu should be capitalized and placed on the left side of the mobile app, and 2) a payment method tutorial should be included as a hyperlink on the payment method page. These observations will be helpful to the owners of the application in developing future versions of the app.

  13. Analysis models for the estimation of oceanic fields

    NASA Technical Reports Server (NTRS)

    Carter, E. F.; Robinson, A. R.

    1987-01-01

    A general model for statistically optimal estimates is presented for dealing with scalar, vector and multivariate datasets. The method deals with anisotropic fields and treats space and time dependence equivalently. Problems addressed include the analysis, or the production of synoptic time series of regularly gridded fields from irregular and gappy datasets, and the estimate of fields by compositing observations from several different instruments and sampling schemes. Technical issues are discussed, including the convergence of statistical estimates, the choice of representation of the correlations, the influential domain of an observation, and the efficiency of numerical computations.
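A statistically optimal (Gauss-Markov) estimate of the kind this record describes can be sketched in one dimension: map irregular, gappy observations onto a regular grid through assumed covariances. The Gaussian covariance form, decay scale, noise level, and data below are illustrative assumptions, not the paper's model:

```python
import numpy as np

def objective_analysis(x_obs, y_obs, x_grid, length=1.0, noise=0.1):
    """Gauss-Markov estimate of a 1-D field on x_grid from noisy
    observations at x_obs, under an assumed Gaussian covariance."""
    def cov(a, b):
        # assumed isotropic Gaussian covariance between locations a and b
        return np.exp(-((a[:, None] - b[None, :]) ** 2) / (2 * length ** 2))

    c_dd = cov(x_obs, x_obs) + noise * np.eye(len(x_obs))  # obs-obs covariance + noise
    c_gd = cov(x_grid, x_obs)                              # grid-obs covariance
    return c_gd @ np.linalg.solve(c_dd, y_obs)

x_obs = np.array([0.0, 0.7, 2.0])      # irregular observation locations
y_obs = np.sin(x_obs)                  # hypothetical observed values
x_grid = np.linspace(0.0, 2.0, 5)      # regular analysis grid
print(np.round(objective_analysis(x_obs, y_obs, x_grid), 2))
```

The choice of covariance (here isotropic and Gaussian) encodes the anisotropy and space-time dependence the paper discusses; compositing data from several instruments amounts to stacking their observations, with instrument-specific noise terms, into the same system.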

  14. Accessible maps for the color vision deficient observers: past and present knowledge and future possibilities

    NASA Astrophysics Data System (ADS)

    Kvitle, Anne Kristin

    2018-05-01

    Color is part of the visual variables in map, serving an aesthetic part and as a guide of attention. Impaired color vision affects the ability to distinguish colors, which makes the task of decoding the map colors difficult. Map reading is reported as a challenging task for these observers, especially when the size of stimuli is small. The aim of this study is to review existing methods for map design for color vision deficient users. A systematic review of research literature and case studies of map design for CVD observers has been conducted in order to give an overview of current knowledge and future research challenges. In addition, relevant research on simulations of CVD and color image enhancement for these observers from other fields of industry is included. The study identified two main approaches: pre-processing by using accessible colors and post-processing by using enhancement methods. Some of the methods may be applied for maps, but requires tailoring of test images according to map types.

  15. Implementing Curriculum-Based Learning Portfolio: A Case Study in Taiwan

    ERIC Educational Resources Information Center

    Chen, Shu-Chin Susan; Cheng, Yu-Pay

    2011-01-01

    The main purpose of this descriptive research is to examine and document the development of a curriculum-based learning portfolio model for children in a preschool for three- to six-year-olds in Taiwan. Data collection methods adopted include classroom observation, in-depth interviews, questionnaires and documentation. Participants include a preschool…

  16. Ciencias 2 (Science 2). [Student's Workbook].

    ERIC Educational Resources Information Center

    Raposo, Lucilia

    Ciencias 2 is the second in a series of elementary science textbooks written for Portuguese-speaking students. The text develops the basic skills that students need to study their surroundings and observe natural facts and phenomena by following scientific methods. The book is composed of 10 chapters and includes 57 lessons. Topics included are…

  17. Visual Literacy: Does It Enhance Leadership Abilities Required for the Twenty-First Century?

    ERIC Educational Resources Information Center

    Bintz, Carol

    2016-01-01

    The twenty-first century hosts a well-established global economy, where leaders are required to have increasingly complex skills that include creativity, innovation, vision, relatability, critical thinking and well-honed communications methods. The experience gained by learning to be visually literate includes the ability to see, observe, analyze,…

  18. A method for estimating the vertical distribution of the SAGE II opaque cloud frequency

    NASA Technical Reports Server (NTRS)

    Wang, Pi-Huan; Mccormick, M. Patrick; Minnis, Patrick; Kent, Geoffrey S.; Yue, Glenn K.; Skeens, Kristi M.

    1995-01-01

    A method is developed to infer the vertical distribution of the occurrence frequency of clouds that are opaque to the Stratospheric Aerosol and Gas Experiment (SAGE) II instrument. An application of the method to the 1986 SAGE II observations is included in this paper. The 1986 SAGE II results are compared with the 1952-1981 cloud climatology of Warren et al. (1986, 1988).

  19. Marginalized zero-altered models for longitudinal count data.

    PubMed

    Tabb, Loni Philip; Tchetgen, Eric J Tchetgen; Wellenius, Greg A; Coull, Brent A

    2016-10-01

    Count data often exhibit more zeros than predicted by common count distributions like the Poisson or negative binomial. In recent years, there has been considerable interest in methods for analyzing zero-inflated count data in longitudinal or other correlated data settings. A common approach has been to extend zero-inflated Poisson models to include random effects that account for correlation among observations. However, these models have been shown to have a few drawbacks, including interpretability of regression coefficients and numerical instability of fitting algorithms even when the data arise from the assumed model. To address these issues, we propose a model that parameterizes the marginal associations between the count outcome and the covariates as easily interpretable log relative rates, while including random effects to account for correlation among observations. One of the main advantages of this marginal model is that it allows a basis upon which we can directly compare the performance of standard methods that ignore zero inflation with that of a method that explicitly takes zero inflation into account. We present simulations of these various model formulations in terms of bias and variance estimation. Finally, we apply the proposed approach to analyze toxicological data of the effect of emissions on cardiac arrhythmias.
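The "more zeros than predicted" observation that motivates these models can be made concrete: under a zero-inflated Poisson, the zero probability mixes structural zeros with ordinary Poisson zeros and exceeds that of a plain Poisson with the same mean. A small sketch with hypothetical parameters (not the paper's fitted model):

```python
import math

def zip_zero_prob(pi, lam):
    """P(Y = 0) under a zero-inflated Poisson: a structural zero with
    probability pi, otherwise a Poisson(lam) zero."""
    return pi + (1 - pi) * math.exp(-lam)

def poisson_zero_prob(mean):
    """P(Y = 0) under a plain Poisson with the given mean."""
    return math.exp(-mean)

pi, lam = 0.3, 2.0
mean = (1 - pi) * lam   # the marginal mean of the ZIP is (1 - pi) * lam
print(round(zip_zero_prob(pi, lam), 3))    # 0.395: zero probability with inflation
print(round(poisson_zero_prob(mean), 3))   # 0.247: zero probability without inflation
```

The gap between the two zero probabilities is exactly what standard methods that ignore zero inflation fail to capture, and what the marginalized model lets one compare directly.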

  20. Marginalized zero-altered models for longitudinal count data

    PubMed Central

    Tabb, Loni Philip; Tchetgen, Eric J. Tchetgen; Wellenius, Greg A.; Coull, Brent A.

    2015-01-01

    Count data often exhibit more zeros than predicted by common count distributions like the Poisson or negative binomial. In recent years, there has been considerable interest in methods for analyzing zero-inflated count data in longitudinal or other correlated data settings. A common approach has been to extend zero-inflated Poisson models to include random effects that account for correlation among observations. However, these models have been shown to have a few drawbacks, including interpretability of regression coefficients and numerical instability of fitting algorithms even when the data arise from the assumed model. To address these issues, we propose a model that parameterizes the marginal associations between the count outcome and the covariates as easily interpretable log relative rates, while including random effects to account for correlation among observations. One of the main advantages of this marginal model is that it allows a basis upon which we can directly compare the performance of standard methods that ignore zero inflation with that of a method that explicitly takes zero inflation into account. We present simulations of these various model formulations in terms of bias and variance estimation. Finally, we apply the proposed approach to analyze toxicological data of the effect of emissions on cardiac arrhythmias. PMID:27867423

  1. New Developments in Observer Performance Methodology in Medical Imaging

    PubMed Central

    Chakraborty, Dev P.

    2011-01-01

    A common task in medical imaging is assessing whether a new imaging system, or a variant of an existing one, is an improvement over an existing imaging technology. Imaging systems are generally quite complex, consisting of several components – e.g., image acquisition hardware, image processing and display hardware and software, and image interpretation by radiologists– each of which can affect performance. While it may appear odd to include the radiologist as a “component” of the imaging chain, since the radiologist’s decision determines subsequent patient care, the effect of the human interpretation has to be included. Physical measurements like modulation transfer function, signal to noise ratio, etc., are useful for characterizing the non-human parts of the imaging chain under idealized and often unrealistic conditions, such as uniform background phantoms, target objects with sharp edges, etc. Measuring the effect on performance of the entire imaging chain, including the radiologist, and using real clinical images, requires different methods that fall under the rubric of observer performance methods or “ROC analysis”. The purpose of this paper is to review recent developments in this field, particularly with respect to the free-response method. PMID:21978444

  2. Improving methods to evaluate the impacts of plant invasions: lessons from 40 years of research

    PubMed Central

    Stricker, Kerry Bohl; Hagan, Donald; Flory, S. Luke

    2015-01-01

    Methods used to evaluate the ecological impacts of biological invasions vary widely from broad-scale observational studies to removal experiments in invaded communities and experimental additions in common gardens and greenhouses. Different methods provide information at diverse spatial and temporal scales with varying levels of reliability. Thus, here we provide a synthetic and critical review of the methods used to evaluate the impacts of plant invasions and provide recommendations for future research. We review the types of methods available and report patterns in methods used, including the duration and spatial scale of studies and plant functional groups examined, from 410 peer-reviewed papers published between 1971 and 2011. We found that there has been a marked increase in papers published on plant invasion impacts since 2003 and that more than half of all studies employed observational methods while <5 % included predictive modelling. Most of the studies were temporally and spatially restricted with 51 % of studies lasting <1 year and almost half of all studies conducted in plots or mesocosms <1 m2. There was also a bias in life form studied: more than 60 % of all studies evaluated impacts of invasive forbs and graminoids while <16 % focused on invasive trees. To more effectively quantify invasion impacts, we argue that longer-term experimental research and more studies that use predictive modelling and evaluate impacts of invasions on ecosystem processes and fauna are needed. Combining broad-scale observational studies with experiments and predictive modelling may provide the most insight into invasion impacts for policy makers and land managers seeking to reduce the effects of plant invasions. PMID:25829379

  3. Intelligent Systems: Terrestrial Observation and Prediction Using Remote Sensing Data

    NASA Technical Reports Server (NTRS)

    Coughlan, Joseph C.

    2005-01-01

    NASA has made science and technology investments to better utilize its large space-borne remote sensing data holdings of the Earth. With the launch of Terra, NASA created a data-rich environment where the challenge is to fully utilize the data collected from EOS. However, despite unprecedented amounts of observed data, there is a need for increasing the frequency, resolution, and diversity of observations. Current terrestrial models that use remote sensing data were constructed in a relatively data- and compute-limited era and do not take full advantage of online learning methods and assimilation techniques that can exploit these data. NASA has invested in visualization, data mining and knowledge discovery methods, which have facilitated data exploitation, but these methods are insufficient for improving Earth science models that have extensive background knowledge, nor do they refine understanding of complex processes. Investing in interdisciplinary teams that include computational scientists can lead to new models and systems for online operation and analysis of data that can autonomously improve in prediction skill over time.

  4. Causal inference from observational data.

    PubMed

    Listl, Stefan; Jürges, Hendrik; Watt, Richard G

    2016-10-01

    Randomized controlled trials have long been considered the 'gold standard' for causal inference in clinical research. In the absence of randomized experiments, identification of reliable intervention points to improve oral health is often perceived as a challenge. But other fields of science, such as social science, have always been challenged by ethical constraints to conducting randomized controlled trials. Methods have been established to make causal inference using observational data, and these methods are becoming increasingly relevant in clinical medicine, health policy and public health research. This study provides an overview of state-of-the-art methods specifically designed for causal inference in observational data, including difference-in-differences (DiD) analyses, instrumental variables (IV), regression discontinuity designs (RDD) and fixed-effects panel data analysis. The described methods may be particularly useful in dental research, not least because of the increasing availability of routinely collected administrative data and electronic health records ('big data'). © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  5. The presence of field geologists in Mars-like terrain

    NASA Technical Reports Server (NTRS)

    Mcgreevy, Michael W.

    1992-01-01

    Methods of ethnographic observation and analysis have been coupled with object-oriented analysis and design concepts to begin the development of a clear path from observations in the field to the design of virtual presence systems. The existence of redundancies in field geology and presence allowed for the application of methods for understanding complex systems. As a result of this study, some of these redundancies have been characterized. Those described are all classes of continuity relations, including the continuities of continuous existence, context-constituent continuities, and state-process continuities. The discussion of each includes statements of general relationships, logical consequences of these, and hypothetical situations in which the relationships would apply. These are meant to aid in the development of a theory of presence. The discussion also includes design considerations, providing guidance for the design of virtual planetary exploration systems and other virtual presence systems. Converging evidence regarding continuity in presence is found in the nature of psychological dissociation. Specific methodological refinements should enhance ecological validity in subsequent field studies, which are in progress.

  6. Investigation of random walks knee cartilage segmentation model using inter-observer reproducibility: Data from the osteoarthritis initiative.

    PubMed

    Hong-Seng, Gan; Sayuti, Khairil Amir; Karim, Ahmad Helmy Abdul

    2017-01-01

    Existing knee cartilage segmentation methods have several reported technical drawbacks. In essence, graph cuts remains highly susceptible to image noise despite extended research interest; the active shape model is often constrained by the selection of training data, while shortest-path methods have demonstrated a shortcut problem in the presence of weak boundaries, which are a common problem in medical images. The aim of this study is to investigate the capability of random walks as a knee cartilage segmentation method. Experts scribbled on knee cartilage images to initialize random walks segmentation. Then, the reproducibility of the method was assessed against manual segmentation using the Dice Similarity Index. The evaluation consists of normal cartilage and diseased cartilage sections, divided into whole and single cartilage categories. A total of 15 normal images and 10 osteoarthritic images were included. The results showed that the random walks method demonstrated high reproducibility in both normal cartilage (observer 1: 0.83±0.028 and observer 2: 0.82±0.026) and osteoarthritic cartilage (observer 1: 0.80±0.069 and observer 2: 0.83±0.029). Besides, results from both experts were found to be consistent with each other, suggesting that the inter-observer variation is insignificant (normal: P=0.21; diseased: P=0.15). The proposed segmentation model overcomes technical problems reported for existing semi-automated techniques and demonstrated highly reproducible and consistent results against the manual segmentation method.
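The Dice Similarity Index used for the reproducibility assessment has a simple closed form; a minimal sketch with hypothetical pixel sets (not the study's data):

```python
def dice(a, b):
    """Dice Similarity Index: 2*|A ∩ B| / (|A| + |B|) for two sets of
    segmented pixel coordinates."""
    return 2.0 * len(a & b) / (len(a) + len(b))

# hypothetical (row, col) pixels labelled as cartilage by each method
manual = {(0, 1), (0, 2), (1, 1), (1, 2)}
auto = {(0, 1), (0, 2), (1, 1)}
print(round(dice(manual, auto), 2))   # 2*3 / (4+3) = 0.86
```

A value of 1 means perfect overlap and 0 means none, so the study's scores around 0.8 indicate substantial agreement between random walks and manual segmentation.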

  7. Extending the Li&Ma method to include PSF information

    NASA Astrophysics Data System (ADS)

    Nievas-Rosillo, M.; Contreras, J. L.

    2016-02-01

    The so-called Li&Ma formula is still the most frequently used method for estimating the significance of observations carried out by Imaging Atmospheric Cherenkov Telescopes. In this work a straightforward extension of the method for point sources, which profits from the good imaging capabilities of current instruments, is proposed. It is based on a likelihood ratio under the assumption of a well-known PSF and a smooth background. Its performance is tested with Monte Carlo simulations based on real observations, and its sensitivity is compared to standard methods which do not incorporate PSF information. The gain in significance that can be attributed to the inclusion of the PSF is around 10% and can be boosted if a background model is assumed or a finer binning is used.

  8. Spatio-Temporal Evolutions of Non-Orthogonal Equatorial Wave Modes Derived from Observations

    NASA Astrophysics Data System (ADS)

    Barton, C.; Cai, M.

    2015-12-01

    Equatorial waves have been studied extensively due to their importance to tropical climate and weather systems. Historically, their activity has been diagnosed mainly in the wavenumber-frequency domain. Recently, many studies have projected observational data onto parabolic cylinder functions (PCF), which represent the meridional structure of individual wave modes, to attain time-dependent spatial wave structures. In this study, we propose a methodology that seeks to identify individual wave modes in instantaneous fields of observations by determining their projections on PCF modes according to equatorial wave theory. The new method has the benefit of yielding a closed system with a unique solution for all waves' spatial structures, including IG waves, for a given instantaneous observed field. We have applied our method to the ERA-Interim reanalysis dataset in the tropical stratosphere, where the wave-mean flow interaction mechanism for the quasi-biennial oscillation (QBO) is well understood. We have confirmed the continuous evolution of the selection mechanism for equatorial waves in the stratosphere from observations, as predicted by the theory for the QBO. This also validates the proposed method for decomposition of observed tropical wave fields into non-orthogonal equatorial wave modes.

  9. [An Introduction to Methods for Evaluating Health Care Technology].

    PubMed

    Lee, Ting-Ting

    2015-06-01

    Healthcare technology advances rapidly and continually, making it critical to ensure that it is used effectively to achieve its original goals. This paper presents three methods that healthcare professionals may apply in the evaluation of healthcare technology. These methods are: the perception/experiences of users, user work-pattern changes, and chart review or data mining. The first method includes two categories: using interviews to explore the user experience and using theory-based questionnaire surveys. The second method applies work sampling to observe the work-pattern changes of users. The last method conducts chart reviews or data mining to analyze the designated variables. In conclusion, while evaluative feedback may be used to improve the design and development of healthcare technology applications, the informatics competency and informatics literacy of users may be further explored in future research.

  10. Innovations and Lessons Learned Developing the USDA Long-Term Agroecosystem Research Network Common Observatory Data Repository

    NASA Astrophysics Data System (ADS)

    Campbell, J. D.; Heilman, P.; Goodrich, D. C.; Sadler, J.

    2015-12-01

    The objective for the USDA Long-Term Agroecosystem Research (LTAR) network Common Observatory Repository (CORe) is to provide data management services, including archive, discovery, and access for consistently observed data across all 18 nodes. LTAR members have an average of 56 years of diverse historic data. Each LTAR location has designated a representative 'permanent' site as its common meteorological observatory. CORe implementation is phased, starting with meteorology, then adding hydrology, eddy flux, soil, and biology data. A design goal was to adopt existing best practices while minimizing the additional data management duties for the researchers. LTAR is providing support for data management specialists at the locations, and the National Agricultural Library is providing central data management services. Maintaining continuity with historical observations is essential, so observations from both the legacy and new common methods are included in CORe. International standards are used to store robust descriptive metadata (ISO 19115) for the observation station and surrounding locale (WMO), sensors (SensorML), and activity (e.g., re-calibration, locale changes) to provide sufficient detail for novel data re-use for the next 50 years. To facilitate data submission, a simple text format was designed. Datasets in CORe will receive DOIs to encourage citations giving fair credit to data providers. Data and metadata access are designed to support multiple formats and naming conventions. An automated QC process is being developed to enhance comparability among LTAR locations and to generate QC process metadata. Data provenance is maintained with a permanent record of changes, including those by local scientists reviewing the automated QC results. Lessons learned so far include an increase in site acceptance of CORe following the decision to store data from both legacy and new common methods. A larger than anticipated variety of currently used methods, with potentially significant differences for future data use, was found. Cooperative peer support among locations with the same sensors, coupled with central support, has reduced redundancy in procedural and data documentation.

  11. How Do Children with ADHD (Mis)Manage Their Real-Life Dyadic Friendships? A Multi-Method Investigation

    ERIC Educational Resources Information Center

    Normand, Sebastien; Schneider, Barry H.; Lee, Matthew D.; Maisonneuve, Marie-France; Kuehn, Sally M.; Robaey, Philippe

    2011-01-01

    This multimethod study provides detailed information about the friendships of 87 children (76% boys) with ADHD and 46 comparison children aged 7-13 years. The methods included parent and teacher ratings, self-report measures and direct observation of friends' dyadic behaviors in three structured analogue tasks. Results indicated that, in contrast…

  12. Charting the Learning Journey of a Group of Adults Returning to Education

    ERIC Educational Resources Information Center

    Mooney, Des

    2011-01-01

Using a qualitative case study method, the researcher studied a group of adult returning students completing a childcare course. Methods used included focus groups, a questionnaire, and observations. Taking a holistic analysis approach to the case (Yin 2003), the researcher then focused on a number of key issues. From this analysis the themes of…

  13. Clowne Science Scheme--A Method Based Course for the Early Years in Secondary Schools

    ERIC Educational Resources Information Center

    Burden, I. J.; And Others

    1975-01-01

    Describes a two-year course sequence that is team taught and theme centered. Themes include the earth, the senses, time, and rate of change. The teaching method is the discovery approach and the role of the teacher is outlined. Explains student assessment and outlines problems and observations related to the program. (GS)

  14. Sherlock Holmes, Master Problem Solver.

    ERIC Educational Resources Information Center

    Ballew, Hunter

    1994-01-01

    Shows the connections between Sherlock Holmes's investigative methods and mathematical problem solving, including observations, characteristics of the problem solver, importance of data, questioning the obvious, learning from experience, learning from errors, and indirect proof. (MKR)

  15. 2-COLOR Pupil Imaging Method to Detect Stellar Oscillations

    NASA Astrophysics Data System (ADS)

    Costantino, Sigismondi; Alessandro, Cacciani; Mauro, Dolci; Stuart, Jeffries; Eric, Fossat; Ludovico, Cesario; Paolo, Rapex; Luca, Bertello; Ferenc, Varadi; Wolfgang, Finsterle

Stellar intensity oscillations observed from the ground are strongly affected by atmospheric noise. For solar-type stars, even Antarctic scintillation noise is still overwhelming. We proposed and tested a differential method that images two-color pupils of the telescope on the same CCD detector in order to compensate for sky intensity fluctuations, guiding, and saturation problems. SOHO data reveal that our method has an efficiency of 70% with respect to the absolute amplitude variations. Using two instruments, at Dome C and the South Pole, we can further minimize atmospheric color noise with cross-spectrum methods. This way we also decrease the likelihood of gaps in the data string due to bad weather. Observationally, while waiting for the South Pole/Dome C sites, we are carrying out tests with available telescopes at Big Bear, Mt. Wilson, Teramo, and Milano. On the data analysis side we use the Random Lag Singular Cross-Spectrum Analysis, which eliminates noise from the observed signal better than the traditional Fourier transform. This method is also well suited for extracting common oscillatory components from two or more observations, including their relative phases, as we are planning to do.

  16. [Application of nootropic agents in complex treatment of patients with concussion of the brain].

    PubMed

    Tkachev, A V

    2007-01-01

65 patients with mild craniocerebral trauma were observed. In addition to general clinical methods, medical examination included CT (MRI) of the brain and an oculist examination including observation of the eye fundus. To objectify patients' complaints, the authors used the Galveston orientation and amnesia test, a feeling scale (psychological test), and a table to determine the level of memory. Tests were carried out on the first, tenth, and thirtieth days of treatment. Patients of the first group received pramistar as part of complex treatment; patients of the second group received piracetam. Patients of both groups noted considerable improvement during complex treatment (disappearance of headache, dizziness, and nausea), while patients receiving pramistar had better restoration of orientation and feeling. Pramistar was also more effective in patients with amnesia.

  17. Understanding radio polarimetry. V. Making matrix self-calibration work: processing of a simulated observation

    NASA Astrophysics Data System (ADS)

    Hamaker, J. P.

    2006-09-01

Context: This is Paper V in a series on polarimetric aperture synthesis based on the algebra of 2×2 matrices. Aims: It validates the matrix self-calibration theory of the preceding Paper IV and outlines the algorithmic methods that had to be developed for its application. Methods: New avenues of polarimetric self-calibration opened up in Paper IV are explored by processing a simulated observation. To focus on the polarimetric issues, it is set up so as to sidestep some of the common complications of aperture synthesis, yet properly represent physical conditions. In addition to a representative collection of observing errors, the simulated instrument includes strongly varying Faraday rotation and antennas with unequal feeds. The selfcal procedure is described in detail, including aspects in which it differs from the scalar case, and its effects are demonstrated with a number of intermediate image results. Results: The simulation's outcome is in full agreement with the theory. The nonlinear matrix equations for instrumental parameters are readily solved by iteration; a convergence problem is easily remedied with a new ancillary algorithm. Instrumental effects are cleanly separated from source properties without reference to changes in parallactic rotation during the observation. Polarimetric images of high purity and dynamic range result. As theory predicts, polarimetric errors that are common to all sources inevitably remain; prior knowledge of the statistics of linear and circular polarization in a typical observed field can be applied to eliminate most of them. Conclusions: The paper conclusively demonstrates that matrix selfcal per se is a viable method that may foster substantial advancement in the art of radio polarimetry. For its application in real observations, a number of issues must be resolved that matrix selfcal has in common with its scalar sibling, such as the treatment of extended sources and the familiar sampling and aliasing problems.
The close analogy between scalar interferometry and its matrix-based generalisation suggests that one may apply well-developed methods of scalar interferometry. Marrying these methods to those of this paper will require a significant investment in new software. Two such developments are known to be foreseen or underway.

  18. Network meta-analysis combining individual patient and aggregate data from a mixture of study designs with an application to pulmonary arterial hypertension.

    PubMed

    Thom, Howard H Z; Capkun, Gorana; Cerulli, Annamaria; Nixon, Richard M; Howard, Luke S

    2015-04-12

Network meta-analysis (NMA) is a methodology for indirectly comparing, and strengthening direct comparisons of, two or more treatments for the management of disease by combining evidence from multiple studies. It is sometimes not possible to perform treatment comparisons as evidence networks restricted to randomized controlled trials (RCTs) may be disconnected. We propose a Bayesian NMA model that allows single-arm, before-and-after observational studies to be included in order to complete these disconnected networks. We illustrate the method with an indirect comparison of treatments for pulmonary arterial hypertension (PAH). Our method uses a random effects model for placebo improvements to include single-arm observational studies into a general NMA. Building on recent research for binary outcomes, we develop a covariate-adjusted continuous-outcome NMA model that combines individual patient data (IPD) and aggregate data from two-arm RCTs with the single-arm observational studies. We apply this model to a complex comparison of therapies for PAH combining IPD from a phase-III RCT of imatinib as add-on therapy for PAH and aggregate data from RCTs and single-arm observational studies, both identified by a systematic review. Through the inclusion of observational studies, our method allowed the comparison of imatinib as add-on therapy for PAH with other treatments. This comparison had not been previously possible due to the limited RCT evidence available. However, the credible intervals of our posterior estimates were wide, so the overall results were inconclusive. The comparison should be treated as exploratory and should not be used to guide clinical practice. Our method for the inclusion of single-arm observational studies allows the performance of indirect comparisons that had previously not been possible due to incomplete networks composed solely of available RCTs. We also built on many recent innovations to enable researchers to use both aggregate data and IPD.
This method could be used in similar situations where treatment comparisons have not been possible due to restrictions to RCT evidence and where a mixture of aggregate data and IPD are available.
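The full model described in this abstract is Bayesian and requires MCMC software; its details are in the paper. The core idea of an indirect comparison through a common comparator can, however, be sketched with the simpler frequentist (Bucher) adjusted indirect comparison. The effect estimates below are purely hypothetical:

```python
import math

def indirect_comparison(d_ab, se_ab, d_cb, se_cb):
    """Bucher-style adjusted indirect comparison of A vs C through a
    common comparator B: d_AC = d_AB - d_CB, and variances add."""
    d_ac = d_ab - d_cb
    se_ac = math.sqrt(se_ab ** 2 + se_cb ** 2)
    ci = (d_ac - 1.96 * se_ac, d_ac + 1.96 * se_ac)  # 95% CI
    return d_ac, se_ac, ci

# Hypothetical effects (e.g. change in 6-minute walk distance) from two
# placebo-controlled trials: treatment A vs placebo, treatment C vs placebo.
d_ac, se_ac, ci = indirect_comparison(d_ab=35.0, se_ab=8.0,
                                      d_cb=22.0, se_cb=6.0)
```

Note how the standard error of the indirect estimate is larger than either direct one; this is why indirect comparisons, like the one in the abstract, tend to have wide intervals.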

  19. The self-calibration method for multiple systems at the CHARA Array

    NASA Astrophysics Data System (ADS)

    O'Brien, David

    The self-calibration method, a new interferometric technique at the CHARA Array, has been used to derive orbits for several spectroscopic binaries. This method uses the wide component of a hierarchical triple system to calibrate visibility measurements of the triple's close binary system. At certain baselines and separations, the calibrator in one of these systems can be observed quasi-simultaneously with the target. Depending on the orientation of the CHARA observation baseline relative to the orientation of the wide orbit of the triple system, separated fringe packets may be observed. A sophisticated observing scheme must be put in place to ensure the existence of separated fringe packets on nights of observation. Prior to the onset of this project, the reduction of separated fringe packet data had never included the goal of deriving visibilities for both fringe packets, so new data reduction software has been written. Visibilities obtained with separated fringe packet data for the target close binary are run through both Monte Carlo simulations and grid search programs in order to determine the best-fit orbital elements of the close binary. Several targets have been observed in this fashion, and orbits have been derived for seven targets, including three new orbits. Derivation of the orbit of the close pair in a triple system allows for the calculation of the mutual inclination, which is the angle between the planes of the wide and close orbit. Knowledge of this quantity may give insight into the formation processes that create multiple star systems. INDEX WORDS: Long-baseline interferometry, Self calibration, Separated fringe packets, Triple systems, Close binaries, Multiple systems, Orbital parameters, Near-infrared interferometry

  20. A Comparison of Observed Abundances in Five Well-Studied Planetary Nebulae

    NASA Astrophysics Data System (ADS)

    Tanner, Jolene; Balick, B.; Kwitter, K. B.

    2013-01-01

We have assembled data and derived abundances from several recent careful studies of five bright planetary nebulae (PNe) of low, moderate, and high ionization and relatively simple morphology. Each of the studies employs different apertures, aperture placement, and facilities for the observations. Various methods were used to derive total abundances. All used spectral windows that included [OII]3727 in the UV through argon lines in the red. Our ultimate goal is to determine the extent to which the derived abundances are consistent. We show that the reddening-corrected line ratios are surprisingly similar despite the different modes of observation, and that the various abundance analysis methods yield generally consistent results for He/H, N/H, O/H, and Ne/H (within 50%, with a few larger deviations). In addition, we processed the line ratios from the different sources using a common abundance derivation method (ELSA) to search for clues of systematic methodological inconsistencies. None were uncovered.

  1. Best (but oft-forgotten) practices: propensity score methods in clinical nutrition research.

    PubMed

    Ali, M Sanni; Groenwold, Rolf Hh; Klungel, Olaf H

    2016-08-01

In observational studies, treatment assignment is a nonrandom process and treatment groups may not be comparable in their baseline characteristics, a phenomenon known as confounding. Propensity score (PS) methods can be used to achieve comparability of treated and nontreated groups in terms of their observed covariates and, as such, control for confounding in estimating treatment effects. In this article, we provide step-by-step guidance on how to use PS methods. For illustrative purposes, we used simulated data based on an observational study of the relation between oral nutritional supplementation and hospital length of stay. We focused on the key aspects of PS analysis, including covariate selection, PS estimation, covariate balance assessment, treatment effect estimation, and reporting. PS matching, stratification, covariate adjustment, and weighting are discussed. R code and example data are provided to show the different steps in a PS analysis. © 2016 American Society for Nutrition.
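The article's worked examples are in R; as a language-agnostic illustration, the matching step it describes can be sketched in Python. This is a minimal sketch under stated assumptions: propensity scores are taken as already estimated (e.g., from a logistic regression of treatment on baseline covariates, not shown), matching is greedy 1:1 within a caliper, and all numbers are invented:

```python
def match_and_estimate(treated, controls, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on the propensity score
    within a caliper. treated/controls are lists of
    (propensity_score, outcome) tuples. Returns (matched pairs,
    estimated average treatment effect among the matched)."""
    available = list(controls)
    pairs = []
    for ps_t, y_t in sorted(treated):
        if not available:
            break
        # nearest still-available control by propensity score
        best = min(available, key=lambda c: abs(c[0] - ps_t))
        if abs(best[0] - ps_t) <= caliper:
            pairs.append(((ps_t, y_t), best))
            available.remove(best)  # matching without replacement
    if not pairs:
        return [], None
    ate = sum(t[1] - c[1] for t, c in pairs) / len(pairs)
    return pairs, ate

# Invented data: (propensity score, hospital length of stay in days)
treated = [(0.62, 6.0), (0.55, 7.0), (0.71, 5.5)]
controls = [(0.60, 7.5), (0.52, 8.0), (0.70, 7.0), (0.20, 9.0)]

pairs, ate = match_and_estimate(treated, controls)
```

In a real analysis, covariate balance in the matched sample would be checked (e.g., with standardized mean differences) before estimating the effect, as the article emphasizes.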

  2. Observation of negative differential resistance in mesoscopic graphene oxide devices.

    PubMed

    Rathi, Servin; Lee, Inyeal; Kang, Moonshik; Lim, Dongsuk; Lee, Yoontae; Yamacli, Serhan; Joh, Han-Ik; Kim, Seongsu; Kim, Sang-Woo; Yun, Sun Jin; Choi, Sukwon; Kim, Gil-Ho

    2018-05-08

The fractions of various functional groups in graphene oxide (GO) are directly related to its electrical and chemical properties and can be controlled by various reduction methods, such as thermal, chemical, and optical. However, a method with sufficient controllability to regulate the reduction process has been missing. In this work, a hybrid method of thermal and Joule heating processes is demonstrated in which progressive control of the ratio of various functional groups can be achieved in a localized area. With this precise control of the carbon-oxygen ratio, negative differential resistance (NDR) is observed in the current-voltage characteristics of a two-terminal device in the ambient environment, due to charge-activated electrochemical reactions at the GO surface. This experimental observation correlates with the optical and chemical characterizations. This NDR behavior offers new opportunities for the fabrication and application of such novel electronic devices in a wide range of device applications, including switches and oscillators.

  3. Evaluation of Two Types of Differential Item Functioning in Factor Mixture Models with Binary Outcomes

    ERIC Educational Resources Information Center

    Lee, HwaYoung; Beretvas, S. Natasha

    2014-01-01

    Conventional differential item functioning (DIF) detection methods (e.g., the Mantel-Haenszel test) can be used to detect DIF only across observed groups, such as gender or ethnicity. However, research has found that DIF is not typically fully explained by an observed variable. True sources of DIF may include unobserved, latent variables, such as…

  4. Spatial scale of deformation constrained by combinations of InSAR and GPS observations in Southern California

    NASA Astrophysics Data System (ADS)

    Lohman, R. B.; Scott, C. P.

    2014-12-01

Efforts to understand the buildup and release of strain within the Earth's crust often rely on well-characterized observations of ground deformation, over time scales that include interseismic periods, earthquakes, and transient deformation episodes. Constraints on current rates of surface deformation in 1, 2, or 3 dimensions can be obtained by examining sets of GPS and Interferometric Synthetic Aperture Radar (InSAR) observations, both alone and in combination. Contributions to the observed signal often include motion along faults, seasonal cycles of subsidence and recharge associated with aquifers, anthropogenic extraction of hydrocarbons, and variations in atmospheric water vapor and ionospheric properties. Here we examine methods for extracting time-varying ground deformation signals from combinations of InSAR and GPS data, real and synthetic, applied to Southern California. We show that two methods for combining the data, removal of a GPS-constrained function (a plane) from the InSAR data and filtering, result in a clear tradeoff between the contributions of the two data types at different spatial scales. We also show that the contribution of seasonal signals to the secular rates at GPS sites is large enough to be a significant error source in this estimation process, and should be accounted for.
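The "removal of a GPS-constrained plane" step can be sketched as an ordinary least-squares fit of a plane to the GPS-minus-InSAR residuals, which is then subtracted from the InSAR field. The abstract does not describe the actual processing chain; this is a minimal Python sketch with invented residuals:

```python
def fit_plane(points):
    """Least-squares fit of z = a*x + b*y + c to (x, y, z) samples,
    via the 3x3 normal equations solved by Gaussian elimination."""
    ata = [[0.0] * 3 for _ in range(3)]   # A^T A for rows [x, y, 1]
    atz = [0.0] * 3                       # A^T z
    for x, y, z in points:
        row = (x, y, 1.0)
        for i in range(3):
            for j in range(3):
                ata[i][j] += row[i] * row[j]
            atz[i] += row[i] * z
    # Gaussian elimination with partial pivoting on the augmented matrix
    m = [ata[i] + [atz[i]] for i in range(3)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, 3):
            f = m[r][col] / m[col][col]
            for k in range(col, 4):
                m[r][k] -= f * m[col][k]
    coeffs = [0.0] * 3
    for r in (2, 1, 0):   # back substitution
        coeffs[r] = (m[r][3] - sum(m[r][k] * coeffs[k]
                                   for k in range(r + 1, 3))) / m[r][r]
    return coeffs  # a, b, c

# Invented GPS-minus-InSAR residuals at station locations (km, km, mm/yr)
residuals = [(0, 0, 1.0), (10, 0, 3.0), (0, 10, 2.0), (10, 10, 4.0)]
a, b, c = fit_plane(residuals)
# subtract the GPS-constrained plane from each sample
corrected = [(x, y, z - (a * x + b * y + c)) for x, y, z in residuals]
```

Because a plane carries only long-wavelength information, this choice hands the long spatial scales to GPS and leaves shorter scales to InSAR, which is the tradeoff the abstract describes.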

  5. An Investigation on the Contribution of GLONASS to the Precise Point Positioning for Short Time Observations

    NASA Astrophysics Data System (ADS)

    Ulug, R.; Ozludemir, M. T.

    2016-12-01

After 2011, through the modernization process of GLONASS, the number of satellites increased rapidly. This progress has made GLONASS the only fully operational alternative to GPS for point positioning. So far, many studies have been conducted to investigate the contribution of GLONASS to point positioning using different methods such as Real Time Kinematic (RTK) and Precise Point Positioning (PPP). The latter, PPP, is a method that performs precise position determination using a single GNSS receiver. The PPP method has become very attractive since the early 2000s and has provided great advantages for engineering and scientific applications. However, PPP needs at least 2 hours of observation time, and the required observation length may be longer depending on several factors, such as the number of satellites and the satellite configuration. The more satellites, the shorter the required observation time. Nevertheless, the impact of the number of satellites included must be known very well. In this study, to determine the contribution of GLONASS to PPP, GLONASS satellite observations were added one by one, from 1 to 5 satellites, to 2, 4, and 6 hours of observations. For this purpose, the data collected at the IGS site ISTA were used. Data processing was done for Day of Year (DOY) 197 in 2016. 24 hours of GPS observations were processed with the Bernese 5.2 PPP module and the output was selected as the reference, while 2, 4, and 6 hours of GPS and GPS/GLONASS observations were processed with the magicGNSS PPP module. The results clearly showed that GPS/GLONASS observations improved positional accuracy, precision, dilution of precision, and convergence to the reference coordinates. In this context, coordinate differences between 24 hours of GPS observations and 6 hours of GPS/GLONASS observations were less than 2 cm.

  6. Suicide Risk Assessment and Prevention: A Systematic Review Focusing on Veterans.

    PubMed

    Nelson, Heidi D; Denneson, Lauren M; Low, Allison R; Bauer, Brian W; O'Neil, Maya; Kansagara, Devan; Teo, Alan R

    2017-10-01

    Suicide rates in veteran and military populations in the United States are high. This article reviews studies of the accuracy of methods to identify individuals at increased risk of suicide and the effectiveness and adverse effects of health care interventions relevant to U.S. veteran and military populations in reducing suicide and suicide attempts. Trials, observational studies, and systematic reviews relevant to U.S. veterans and military personnel were identified in searches of MEDLINE, PsycINFO, SocINDEX, and Cochrane databases (January 1, 2008, to September 11, 2015), on Web sites, and in reference lists. Investigators extracted and confirmed data and dual-rated risk of bias for included studies. Nineteen studies evaluated accuracy of risk assessment methods, including models using retrospective electronic records data and clinician- or patient-rated instruments. Most methods demonstrated sensitivity ≥80% or area-under-the-curve values ≥.70 in single studies, including two studies based on electronic records of veterans and military personnel, but specificity varied. Suicide rates were reduced in six of eight observational studies of population-level interventions. Only two of ten trials of individual-level psychotherapy reported statistically significant differences between treatment and usual care. Risk assessment methods have been shown to be sensitive predictors of suicide and suicide attempts, but the frequency of false positives limits their clinical utility. Research to refine these methods and examine clinical applications is needed. Studies of suicide prevention interventions are inconclusive; trials of population-level interventions and promising therapies are required to support their clinical use.
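The review's point that false positives limit clinical utility follows directly from Bayes' rule at low event prevalence. A short Python sketch with hypothetical operating characteristics (80% sensitivity, 90% specificity, 1% prevalence; these are illustrative values, not figures from the review):

```python
def predictive_values(sensitivity, specificity, prevalence):
    """Positive and negative predictive value via Bayes' rule."""
    tp = sensitivity * prevalence                  # true positives
    fp = (1.0 - specificity) * (1.0 - prevalence)  # false positives
    fn = (1.0 - sensitivity) * prevalence          # false negatives
    tn = specificity * (1.0 - prevalence)          # true negatives
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return ppv, npv

# Hypothetical screen: 80% sensitive, 90% specific, 1% annual event rate.
ppv, npv = predictive_values(0.80, 0.90, 0.01)
# At low prevalence, most flagged individuals are false positives,
# even though the screen is sensitive: PPV is well under 10%.
```

This is why a risk assessment method can be a "sensitive predictor" in validation studies and still flag far more non-cases than cases in practice.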

  7. Timely disclosure of progress in long-term cancer survival: the boomerang method substantially improved estimates in a comparative study.

    PubMed

    Brenner, Hermann; Jansen, Lina

    2016-02-01

    Monitoring cancer survival is a key task of cancer registries, but timely disclosure of progress in long-term survival remains a challenge. We introduce and evaluate a novel method, denoted "boomerang method," for deriving more up-to-date estimates of long-term survival. We applied three established methods (cohort, complete, and period analysis) and the boomerang method to derive up-to-date 10-year relative survival of patients diagnosed with common solid cancers and hematological malignancies in the United States. Using the Surveillance, Epidemiology and End Results 9 database, we compared the most up-to-date age-specific estimates that might have been obtained with the database including patients diagnosed up to 2001 with 10-year survival later observed for patients diagnosed in 1997-2001. For cancers with little or no increase in survival over time, the various estimates of 10-year relative survival potentially available by the end of 2001 were generally rather similar. For malignancies with strongly increasing survival over time, including breast and prostate cancer and all hematological malignancies, the boomerang method provided estimates that were closest to later observed 10-year relative survival in 23 of the 34 groups assessed. The boomerang method can substantially improve up-to-dateness of long-term cancer survival estimates in times of ongoing improvement in prognosis. Copyright © 2016 Elsevier Inc. All rights reserved.
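The boomerang method itself is defined in the paper; the established period analysis it is compared against can be sketched. Period analysis estimates long-term survival by multiplying conditional year-by-year survival probabilities, each taken from the most recent calendar period rather than from a single old diagnosis cohort. A Python sketch with invented probabilities:

```python
def cumulative_survival(conditional):
    """Cumulative survival as the running product of year-by-year
    conditional survival probabilities S(year k | alive at k-1)."""
    out, s = [], 1.0
    for p in conditional:
        s *= p
        out.append(s)
    return out

# Invented conditional relative-survival probabilities for follow-up
# years 1..10, each as observed in the most recent calendar period
# (period analysis borrows every follow-up year from the latest data).
recent_period = [0.95, 0.96, 0.97, 0.97, 0.98,
                 0.98, 0.98, 0.99, 0.99, 0.99]
ten_year = cumulative_survival(recent_period)[-1]
```

When prognosis is improving, recent-period conditional probabilities are higher than those an old cohort actually experienced, so period estimates are more up to date than cohort estimates; the boomerang method aims to push this up-to-dateness further still.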

  8. Impulsivity-hyperactivity and subtypes of aggression in early childhood: an observational and short-term longitudinal study.

    PubMed

    Ostrov, Jamie M; Godleski, Stephanie A

    2009-08-01

    This short-term longitudinal study (N = 112) was conducted to explore the concurrent and prospective associations between teacher-reported impulsive-hyperactive behavior and observed relational and physical aggression during early childhood (M = 45.54 months old, SD = 9.07). Multiple informants and methods including observational methods (i.e., 160 min per child) were used to assess aggression and impulsivity-hyperactivity. All measures were found to be valid and reliable. Prospective hierarchical regression analyses revealed that impulsivity-hyperactivity was associated with increases in observed physical aggression across time, controlling for initial relational aggression and gender. These findings add to the growing developmental psychopathology literature that suggests that distinguishing between subtypes of aggression during early childhood may be important for understanding the course of impulsivity-hyperactivity in young children. Implications for practice are discussed.

  9. Teaching machines to find mantle composition

    NASA Astrophysics Data System (ADS)

    Atkins, Suzanne; Tackley, Paul; Trampert, Jeannot; Valentine, Andrew

    2017-04-01

    The composition of the mantle affects many geodynamical processes by altering factors such as the density, the location of phase changes, and melting temperature. The inferences we make about mantle composition also determine how we interpret the changes in velocity, reflections, attenuation and scattering seen by seismologists. However, the bulk composition of the mantle is very poorly constrained. Inferences are made from meteorite samples, rock samples from the Earth and inferences made from geophysical data. All of these approaches require significant assumptions and the inferences made are subject to large uncertainties. Here we present a new method for inferring mantle composition, based on pattern recognition machine learning, which uses large scale in situ observations of the mantle to make fully probabilistic inferences of composition for convection simulations. Our method has an advantage over other petrological approaches because we use large scale geophysical observations. This means that we average over much greater length scales and do not need to rely on extrapolating from localised samples of the mantle or planetary disk. Another major advantage of our method is that it is fully probabilistic. This allows us to include all of the uncertainties inherent in the inference process, giving us far more information about the reliability of the result than other methods. Finally our method includes the impact of composition on mantle convection. This allows us to make much more precise inferences from geophysical data than other geophysical approaches, which attempt to invert one observation with no consideration of the relationship between convection and composition. We use a sampling based inversion method, using hundreds of convection simulations run using StagYY with self consistent mineral physics properties calculated using the PerpleX package. 
The observations from these simulations are used to train a neural network to make a probabilistic inference for major element oxide composition of the mantle. We find we can constrain bulk mantle FeO molar percent, FeO/MgO and FeO/SiO2 using observations of the temperature and density structure of the mantle in convection simulations.
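The authors train a neural network on convection-simulation outputs; as a hedged stand-in for that pattern-recognition step, a k-nearest-neighbour vote yields the same kind of probabilistic class output. Everything below (the feature summaries, labels, and values) is invented for illustration and is not the authors' pipeline:

```python
def knn_probabilities(train, query, k=3):
    """Probabilistic classification by k-nearest-neighbour vote: each
    class's probability is its fraction of the k nearest training
    samples (Euclidean distance in feature space)."""
    nearest = sorted(train, key=lambda s: sum((a - b) ** 2
                     for a, b in zip(s[0], query)))[:k]
    probs = {}
    for _, label in nearest:
        probs[label] = probs.get(label, 0) + 1.0 / k
    return probs

# Invented training set: (mean temperature anomaly, mean density anomaly)
# summaries from convection runs, labelled by the FeO content of each run.
train = [((0.10, 0.30), "low FeO"), ((0.12, 0.35), "low FeO"),
         ((0.80, 0.90), "high FeO"), ((0.85, 0.95), "high FeO"),
         ((0.15, 0.40), "low FeO")]
probs = knn_probabilities(train, query=(0.11, 0.33), k=3)
```

The appeal of any such trained-on-simulations approach, as the abstract argues, is that the output is a probability over compositions rather than a single best-fit value.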

  10. Hydrodynamics in a Degenerate, Strongly Attractive Fermi Gas

    NASA Technical Reports Server (NTRS)

    Thomas, John E.; Kinast, Joseph; Hemmer, Staci; Turlapov, Andrey; O'Hara, Ken; Gehm, Mike; Granade, Stephen

    2004-01-01

In summary, we use all-optical methods with evaporative cooling near a Feshbach resonance to produce a strongly interacting degenerate Fermi gas. We observe hydrodynamic behavior in the expansion dynamics. At low temperatures, collisions may not explain the expansion dynamics. We observe hydrodynamics in the trapped gas. Our observations include collisionally damped excitation spectra at high temperature, which were not discussed above. In addition, we observe weakly damped breathing modes at low temperature. The observed temperature dependence of the damping time and hydrodynamic frequency is consistent neither with collisional dynamics nor with collisionless mean-field interactions. These observations constitute the first evidence for superfluid hydrodynamics in a Fermi gas.

  11. Aeronautic Instruments. Section VI : Aerial Navigation and Navigating Instruments

    NASA Technical Reports Server (NTRS)

    Eaton, H N

    1923-01-01

This report outlines briefly the methods of aerial navigation which have been developed during the past few years, with a description of the different instruments used. Dead reckoning, the most universal method of aerial navigation, is first discussed. Then follows an outline of the principles of navigation by astronomical observation; a discussion of the practical use of natural horizons, such as sea, land, and cloud, in making sextant observations; and the use of artificial horizons, including the bubble, pendulum, and gyroscopic types. A description is given of the recent development of the radio direction finder and its application to navigation.

  12. Natural family planning.

    PubMed

    Bourdillon, C

    1982-11-01

Frequently, when one mentions natural family planning methods, the response is doubt, bewilderment, ridicule, or scorn. Much of this is due to the fact that many people know only of the rhythm method, which depends on a calculation based on the menstrual pattern. What many people do not know, including physicians and nurses, is that in Australia Drs. John and Lyn Billings have been scientifically researching natural methods of family planning for 25 years, and they have pioneered the Ovulation Method of family planning. The Billings Ovulation Method depends only on the mucus sign. It is based on scientific knowledge of the combined fertility of husband and wife, and an understanding of the physiology of the female body through simple observations. Important facts relating to the practice of the method include: ovulation takes place only once in the cycle; mucus is secreted by the cervical mucosa before ovulation; the ovum lives for only 3 days at the most; and sperm live for 5 days at the most, and only in the presence of this fertile mucus. There are 2 types of mucus. The 1st type to appear is cloudy or white. It is nonslippery, sticky, and breaks when stretched between 2 fingers. The 2nd type is like the raw white of a hen's egg. It is very slippery, much clearer than the former, and stretches when pulled between 2 fingers. This is fertile mucus, and ovulation occurs on the last day that it is present. Of course, that day cannot be recognized as the last until its absence is observed on the following day. The mucus sign can be both seen and felt. After a few months a woman will readily recognize her fertile time, but daily charting is advocated. Simple signs, representing the various observations, are taught. Once a couple knows and understands their combined fertility through observation of the mucus sign, they can plan their family. Rules of the method are outlined.
The self control required by this method can only serve to increase the selfless love and unselfish care and respect between partners.

  13. The Role of Aesthetic Artifacts in Creative Writing Research: Casting Student Identity Narratives as Cultural Data

    ERIC Educational Resources Information Center

    Bailey, Christine I.

    2014-01-01

    Drawing upon a postmodern ethnographic approach, the modes of inquiry into this qualitative study included observation and data analysis in order to represent a particular community of students: first year college freshmen from a mid-size, religiously-affiliated university in the southern United States. The methods included artifact…

  14. Transforming Elementary Science Teacher Education by Bridging Formal and Informal Science Education in an Innovative Science Methods Course

    NASA Astrophysics Data System (ADS)

    Riedinger, Kelly; Marbach-Ad, Gili; Randy McGinnis, J.; Hestness, Emily; Pease, Rebecca

    2011-02-01

We investigated curricular and pedagogical innovations in an undergraduate science methods course for elementary education majors at the University of Maryland. The goals of the innovative elementary science methods course included improving students' attitudes toward and views of science and science teaching, modeling innovative science teaching methods, and encouraging students to continue in teacher education. We redesigned the elementary science methods course to include aspects of informal science education. The informal science education course features included informal science educator guest speakers, a live animal demonstration, and a virtual field trip. We compared data from a treatment course (n = 72) and a comparison course (n = 26). Data collection included researchers' observations, instructors' reflections, and teacher candidates' feedback. Teacher candidate feedback involved interviews and results on a reliable and valid Attitudes and Beliefs about the Nature of and the Teaching of Science instrument. We used complementary methods to analyze the data collected. A key finding of the study was that while benefits were found in both types of courses, the difference in results underscores the need to identify the primary purpose of the innovation as a vital consideration.

  15. Imaging of surface spin textures on bulk crystals by scanning electron microscopy

    NASA Astrophysics Data System (ADS)

    Akamine, Hiroshi; Okumura, So; Farjami, Sahar; Murakami, Yasukazu; Nishida, Minoru

    2016-11-01

    Direct observation of magnetic microstructures is vital for advancing spintronics and other technologies. Here we report a method for imaging surface domain structures on bulk samples by scanning electron microscopy (SEM). Complex magnetic domains, referred to as the maze state in CoPt/FePt alloys, were observed at a spatial resolution of less than 100 nm by using an in-lens annular detector. The method allows for imaging almost all the domain walls in the mazy structure, whereas the visualisation of the domain walls with the classical SEM method was limited. Our method provides a simple way to analyse surface domain structures in the bulk state that can be used in combination with SEM functions such as orientation or composition analysis. Thus, the method extends applications of SEM-based magnetic imaging, and is promising for resolving various problems at the forefront of fields including physics, magnetics, materials science, engineering, and chemistry.

  16. The identification of incident cancers in UK primary care databases: a systematic review.

    PubMed

    Rañopa, Michael; Douglas, Ian; van Staa, Tjeerd; Smeeth, Liam; Klungel, Olaf; Reynolds, Robert; Bhaskaran, Krishnan

    2015-01-01

UK primary care databases are frequently used in observational studies with cancer outcomes. We aimed to systematically review methods used by such studies to identify and validate incident cancers of the breast, colorectum, and prostate. Medline and Embase (1980-2013) were searched for UK primary care database studies with incident breast, colorectal, or prostate cancer outcomes. Data on the methods used for case ascertainment were extracted and summarised. Questionnaires were sent to corresponding authors to obtain details about case ascertainment. Eighty-four studies of breast (n = 51), colorectal (n = 54), and prostate cancer (n = 31) were identified; 30 examined >1 cancer type. Among the 84 studies, 57 defined cancers using only diagnosis codes, while 27 required further evidence such as chemotherapy. Few studies described methods used to create cancer code lists (n = 5) or made lists available directly (n = 5). Twenty-eight code lists were received on request from study authors. All included malignant neoplasm diagnosis codes, but there was considerable variation in the specific codes included, which was not explained by coding dictionary changes. Code lists also varied in terms of other types of codes included, such as in-situ, cancer morphology, history of cancer, and secondary/suspected/borderline cancer codes. In UK primary care database studies, methods for identifying breast, colorectal, and prostate cancers were often unclear. Code lists were often unavailable, and where provided, we observed variation in the individual codes and types of codes included. Clearer reporting of methods and publication of code lists would improve transparency and reproducibility of studies. Copyright © 2014 John Wiley & Sons, Ltd.

  17. Astronomical Methods in Aerial Navigation

    NASA Technical Reports Server (NTRS)

    Beij, K Hilding

    1925-01-01

    The astronomical method of determining position is universally used in marine navigation and may also be of service in aerial navigation. The practical application of the method, however, must be modified and adapted to conform to the requirements of aviation. Much of this work of adaptation has already been accomplished, but being scattered through various technical journals in a number of languages, is not readily available. This report is for the purpose of collecting under one cover such previous work as appears to be of value to the aerial navigator, comparing instruments and methods, indicating the best practice, and suggesting future developments. The various methods of determining position and their application and value are outlined, and a brief resume of the theory of the astronomical method is given. Observation instruments are described in detail. A complete discussion of the reduction of observations follows, including a rapid method of finding position from the altitudes of two stars. Maps and map cases are briefly considered. A bibliography of the subject is appended.

  18. Quantifying biopsychosocial aspects in everyday contexts: an integrative methodological approach from the behavioral sciences

    PubMed Central

    Portell, Mariona; Anguera, M Teresa; Hernández-Mendo, Antonio; Jonsson, Gudberg K

    2015-01-01

    Contextual factors are crucial for evaluative research in psychology, as they provide insights into what works, for whom, in what circumstances, in what respects, and why. Studying behavior in context, however, poses numerous methodological challenges. Although a comprehensive framework for classifying methods seeking to quantify biopsychosocial aspects in everyday contexts was recently proposed, this framework does not contemplate contributions from observational methodology. The aim of this paper is to justify and propose a more general framework that includes observational methodology approaches. Our analysis is rooted in two general concepts: ecological validity and methodological complementarity. We performed a narrative review of the literature on research methods and techniques for studying daily life and describe their shared properties and requirements (collection of data in real time, on repeated occasions, and in natural settings) and classification criteria (eg, variables of interest and level of participant involvement in the data collection process). We provide several examples that illustrate why, despite their higher costs, studies of behavior and experience in everyday contexts offer insights that complement findings provided by other methodological approaches. We urge that observational methodology be included in classifications of research methods and techniques for studying everyday behavior and advocate a renewed commitment to prioritizing ecological validity in behavioral research seeking to quantify biopsychosocial aspects. PMID:26089708

  19. The value of the NDT-Bobath method in post-stroke gait training.

    PubMed

    Mikołajewska, Emilia

    2013-01-01

Stroke is perceived as a major cause of disability, including gait disorders, and finding more effective methods of gait reeducation in post-stroke survivors is one of the most important issues in contemporary neurorehabilitation. The aim of this paper is to present the outcomes of a study of post-stroke gait reeducation using the NeuroDevelopmental Treatment-Bobath (NDT-Bobath) method. The research was conducted among 60 adult patients who had undergone ischemic stroke and were treated using the NDT-Bobath method. Gait reeducation was assessed using spatio-temporal gait parameters (gait velocity, cadence, and stride length), measured by the same therapist twice: on admission and after the tenth session of gait reeducation. Among the 60 patients involved in the study, recovery was observed in 39 cases (65%) for gait velocity, in 39 cases (65%) for cadence, and in 50 cases (83.33%) for stride length. Benefits were observed after short-term therapy, reflected by measurable, statistically significant changes in the patients' gait parameters.

  20. Transit spectroscopy of the extrasolar planet HD 209458B: The search for water

    NASA Astrophysics Data System (ADS)

    Rojo, Patricio Michel

This dissertation describes an attempt to detect water in the atmosphere of the extrasolar planet HD 209458b using transit spectroscopy. It first discusses the importance of water detection and reviews the state of knowledge about extrasolar planets, including the main statistical trends and the detection methods employed to date. The importance of the transiting planets and the many measurements of the known ones are also discussed. A radiative transfer model designed and built specifically for this project predicts, given a planetary temperature/pressure/composition profile, the wavelength dependence of the stellar spectrum modulation due to a transiting planet. A total of 352 spectra around 1.8 µm were obtained on four nights (three in transit) of observations on August 3-4, September 26, and October 3 of 2002 using ISAAC at the Very Large Telescope. Correlating the modeled modulation with the infrared spectra yields a nondetection of water in the atmosphere of HD 209458b. The nondetection is found to be due to an unfortunate choice of observing parameters and conditions that made it impossible to reach the required sensitivity. Nonetheless, the results are scaled with synthetic spectra to place strong limits on the planetary system configurations for which the observing parameters and telluric conditions would have yielded a successful detection. None of the 10 other known transiting planets would be detectable with the choice of parameters and conditions for this observation. A quantitative model of an improved observing strategy for future observations of this kind is developed. The improvements include airmass and timing constraints, the simultaneous observation of a calibrator star, and a new method to find the optimal wavelength range.
The data-reduction process includes several original techniques that were developed during this work, such as a method to remove fringes from flat fields and several methods to correct for telluric absorption, among others. Some of the code developed for this project is available under the GNU General Public License at the DSpace Internet archive from Cornell University.

  1. Supplement: The Rate of Binary Black Hole Mergers Inferred from Advanced LIGO Observations Surrounding GW150914

    NASA Technical Reports Server (NTRS)

    Abbott, B. P.; Abbott, R.; Abbott, T. D.; Abernathy, M. R.; Acernese, F.; Ackley, K.; Adams, C.; Adams, T.; Addesso, P.; Adhikari, R. X.; hide

    2016-01-01

This article provides supplemental information for a Letter reporting the rate of binary black hole (BBH) coalescences inferred from 16 days of coincident Advanced LIGO observations surrounding the transient gravitational-wave (GW) signal GW150914. In that work we reported various rate estimates whose 90% confidence intervals fell in the range 2-600 Gpc⁻³ yr⁻¹. Here we give details on our method and computations, including information about our search pipelines, a derivation of our likelihood function for the analysis, a description of the astrophysical search trigger distribution expected from merging BBHs, details on our computational methods, a description of the effects and our model for calibration uncertainty, and an analytic method for estimating our detector sensitivity, which is calibrated to our measurements.

  2. Explaining transgression in respiratory rate observation methods in the emergency department: A classic grounded theory analysis.

    PubMed

    Flenady, Tracy; Dwyer, Trudy; Applegarth, Judith

    2017-09-01

Abnormal respiratory rates are one of the first indicators of clinical deterioration in emergency department (ED) patients. Despite the importance of respiratory rate observations, this vital sign is often inaccurately recorded on ED observation charts, compromising patient safety. Concurrently, there is a paucity of research reporting why this phenomenon occurs. To develop a substantive theory explaining ED registered nurses' reasoning when they miss or misreport respiratory rate observations, this research project employed a classic grounded theory (CGT) analysis of qualitative data from seventy-nine registered nurses currently working in EDs within Australia. Data collected included detailed responses from individual interviews and open-ended responses from an online questionnaire. CGT research methods were utilised; therefore coding was central to the abstraction of data and its reintegration as theory. Constant comparison, synonymous with CGT methods, was employed to code data. This approach facilitated the identification of the main concern of the participants and aided in the generation of theory explaining how the participants processed this issue. The main concern identified is that ED registered nurses do not believe that collecting an accurate respiratory rate for ALL patients at EVERY round of observations is a requirement, yet organisational requirements often dictate that a value for the respiratory rate be included each time vital signs are collected. The theory 'Rationalising Transgression' explains how participants continually resolve this problem. The study found that despite feeling professionally conflicted, nurses often erroneously record respiratory rate observations, and then rationalise this behaviour by employing strategies that adjust the significance of the organisational requirement. These strategies include: Compensating, when nurses believe they are compensating for errant behaviour by enhancing the patient's outcome; Minimalizing, when nurses believe that the patient's outcome would be no different whether or not they recorded an accurate respiratory rate; and Trivialising, a strategy that sanctions negligent behaviour and occurs when nurses 'cut corners' to get the job done. Nurses use these strategies to titrate the level of emotional discomfort associated with erroneous behaviour, thereby rationalising transgression. CONCLUSION: This research reveals that despite continuing education regarding gold-standard guidelines for respiratory rate collection, suboptimal practice continues. Ideally, to combat this transgression, a culture shift must occur regarding nurses' understanding of acceptable practice methods. Nurses must receive education in a way that permeates their understanding of the relationship between the regular collection of accurate respiratory rate observations and optimal patient outcomes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Quality of data in multiethnic health surveys.

    PubMed Central

    Pasick, R. J.; Stewart, S. L.; Bird, J. A.; D'Onofrio, C. N.

    2001-01-01

    OBJECTIVE: There has been insufficient research on the influence of ethno-cultural and language differences in public health surveys. Using data from three independent studies, the authors examine methods to assess data quality and to identify causes of problematic survey questions. METHODS: Qualitative and quantitative methods were used in this exploratory study, including secondary analyses of data from three baseline surveys (conducted in English, Spanish, Cantonese, Mandarin, and Vietnamese). Collection of additional data included interviews with investigators and interviewers; observations of item development; focus groups; think-aloud interviews; a test-retest assessment survey; and a pilot test of alternatively worded questions. RESULTS: The authors identify underlying causes for the 12 most problematic variables in three multiethnic surveys and describe them in terms of ethnic differences in reliability, validity, and cognitive processes (interpretation, memory retrieval, judgment formation, and response editing), and differences with regard to cultural appropriateness and translation problems. CONCLUSIONS: Multiple complex elements affect measurement in a multiethnic survey, many of which are neither readily observed nor understood through standard tests of data quality. Multiethnic survey questions are best evaluated using a variety of quantitative and qualitative methods that reveal different types and causes of problems. PMID:11889288

  4. A New Ghost Cell/Level Set Method for Moving Boundary Problems: Application to Tumor Growth

    PubMed Central

    Macklin, Paul

    2011-01-01

In this paper, we present a ghost cell/level set method for the evolution of interfaces whose normal velocity depends upon the solutions of linear and nonlinear quasi-steady reaction-diffusion equations with curvature-dependent boundary conditions. Our technique includes a ghost cell method that accurately discretizes normal derivative jump boundary conditions without smearing jumps in the tangential derivative; a new iterative method for solving linear and nonlinear quasi-steady reaction-diffusion equations; an adaptive discretization to compute the curvature and normal vectors; and a new discrete approximation to the Heaviside function. We present numerical examples that demonstrate better than 1.5-order convergence for problems where traditional ghost cell methods either fail to converge or attain at best sub-linear accuracy. We apply our techniques to a model of tumor growth in complex, heterogeneous tissues that consists of a nonlinear nutrient equation and a pressure equation with geometry-dependent jump boundary conditions. We simulate the growth of glioblastoma (an aggressive brain tumor) into a large, 1 cm square of brain tissue that includes heterogeneous nutrient delivery and varied biomechanical characteristics (white matter, gray matter, cerebrospinal fluid, and bone), and we observe growth morphologies that are highly dependent upon the variations of the tissue characteristics—an effect observed in real tumor growth. PMID:21331304
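For context on one ingredient of such schemes: level-set codes typically replace the sharp Heaviside function with a smoothed version near the interface. The form below is the classical smoothed Heaviside commonly used in the level-set literature, shown only as background; it is not the new discrete approximation proposed in this paper.

```python
import numpy as np

def smoothed_heaviside(phi, eps):
    """Classical smoothed Heaviside H_eps(phi) used in level-set methods.

    Returns 0 for phi < -eps, 1 for phi > eps, and a smooth sinusoidal
    transition in between. This is standard background material, not the
    paper's new discrete approximation.
    """
    transition = 0.5 * (1.0 + phi / eps + np.sin(np.pi * phi / eps) / np.pi)
    return np.where(phi > eps, 1.0, np.where(phi < -eps, 0.0, transition))

# phi is the signed distance to the interface; eps sets the smoothing width.
vals = smoothed_heaviside(np.array([-2.0, 0.0, 2.0]), 1.0)
```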

  5. Radar studies of the atmosphere using spatial and frequency diversity

    NASA Astrophysics Data System (ADS)

    Yu, Tian-You

    This work provides results from a thorough investigation of atmospheric radar imaging including theory, numerical simulations, observational verification, and applications. The theory is generalized to include the existing imaging techniques of coherent radar imaging (CRI) and range imaging (RIM), which are shown to be special cases of three-dimensional imaging (3D Imaging). Mathematically, the problem of atmospheric radar imaging is posed as an inverse problem. In this study, the Fourier, Capon, and maximum entropy (MaxEnt) methods are proposed to solve the inverse problem. After the introduction of the theory, numerical simulations are used to test, validate, and exercise these techniques. Statistical comparisons of the three methods of atmospheric radar imaging are presented for various signal-to-noise ratio (SNR), receiver configuration, and frequency sampling. The MaxEnt method is shown to generally possess the best performance for low SNR. The performance of the Capon method approaches the performance of the MaxEnt method for high SNR. In limited cases, the Capon method actually outperforms the MaxEnt method. The Fourier method generally tends to distort the model structure due to its limited resolution. Experimental justification of CRI and RIM is accomplished using the Middle and Upper (MU) Atmosphere Radar in Japan and the SOUnding SYstem (SOUSY) in Germany, respectively. A special application of CRI to the observation of polar mesosphere summer echoes (PMSE) is used to show direct evidence of wave steepening and possibly explain gravity wave variations associated with PMSE.
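The Capon (minimum-variance) estimator discussed in this record admits a compact sketch: given the covariance matrix R of the receiver-array signals and a steering vector a(θ), the angular power is P(θ) = 1 / (aᴴ R⁻¹ a), so power is high only where the data are consistent with a source at θ. The array geometry, source angle, and noise level below are illustrative assumptions, not parameters from the study.

```python
import numpy as np

def steering_vector(n_sensors, theta):
    # Half-wavelength-spaced uniform linear array (an illustrative assumption).
    return np.exp(1j * np.pi * np.arange(n_sensors) * np.sin(theta))

def capon_spectrum(R, thetas, n_sensors):
    # Capon power P(theta) = 1 / (a^H R^-1 a).
    Rinv = np.linalg.inv(R)
    powers = []
    for th in thetas:
        a = steering_vector(n_sensors, th)
        powers.append(1.0 / np.real(a.conj() @ Rinv @ a))
    return np.array(powers)

# Synthetic snapshots: one source at theta0 = 0.3 rad plus white noise.
rng = np.random.default_rng(0)
n, snapshots, theta0 = 8, 200, 0.3
a0 = steering_vector(n, theta0)
s = rng.standard_normal(snapshots) + 1j * rng.standard_normal(snapshots)
noise = 0.1 * (rng.standard_normal((n, snapshots))
               + 1j * rng.standard_normal((n, snapshots)))
X = np.outer(a0, s) + noise
R = X @ X.conj().T / snapshots  # sample covariance of the array data

thetas = np.linspace(-np.pi / 2, np.pi / 2, 361)
spec = capon_spectrum(R, thetas, n)  # should peak near theta0
```

The same machinery illustrates why Capon approaches MaxEnt performance at high SNR: the data-adaptive weights suppress sidelobes that limit the fixed-window Fourier method.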

  6. Utility of Lava Tubes on Other Worlds

    NASA Technical Reports Server (NTRS)

    Walden, Bryce E.; Billings, T. L.; York, Cheryl Lynn; Gillett, S. L.; Herbert, M. V.

    1998-01-01

    On Mars, as on Earth, lava tubes are found in the extensive lava fields associated with shield volcanism. Lunar lava-tube traces are located near mare-highland boundaries, giving access to a variety of minerals and other resources, including steep slopes, prominent heights for local area communications and observation, large-surface areas in shade, and abundant basalt plains suitable for landing sites, mass-drivers, surface transportation, regolith harvesting, and other uses. Methods for detecting lava tubes include visual observations of collapse trenches and skylights, ground-penetrating radar, gravimetry, magnetometry, seismography, atmospheric effects, laser, lidar, infrared, and human or robotic exploration.

  7. Methodological tools for the collection and analysis of participant observation data using grounded theory.

    PubMed

    Laitinen, Heleena; Kaunonen, Marja; Astedt-Kurki, Päivi

    2014-11-01

    To give clarity to the analysis of participant observation in nursing when implementing the grounded theory method. Participant observation (PO) is a method of collecting data that reveals the reality of daily life in a specific context. In grounded theory, interviews are the primary method of collecting data but PO gives a distinctive insight, revealing what people are really doing, instead of what they say they are doing. However, more focus is needed on the analysis of PO. An observational study carried out to gain awareness of nursing care and its electronic documentation in four acute care wards in hospitals in Finland. Discussion of using the grounded theory method and PO as a data collection tool. The following methodological tools are discussed: an observational protocol, jotting of notes, microanalysis, the use of questioning, constant comparison, and writing and illustrating. Each tool has specific significance in collecting and analysing data, working in constant interaction. Grounded theory and participant observation supplied rich data and revealed the complexity of the daily reality of acute care. In this study, the methodological tools provided a base for the study at the research sites and outside. The process as a whole was challenging. It was time-consuming and it required rigorous and simultaneous data collection and analysis, including reflective writing. Using these methodological tools helped the researcher stay focused from data collection and analysis to building theory. Using PO as a data collection method in qualitative nursing research provides insights. It is not commonly discussed in nursing research and therefore this study can provide insight, which cannot be seen or revealed by using other data collection methods. Therefore, this paper can produce a useful tool for those who intend to use PO and grounded theory in their nursing research.

  8. Qualitative Analysis of E-Liquid Emissions as a Function of Flavor Additives Using Two Aerosol Capture Methods.

    PubMed

    Eddingsaas, Nathan; Pagano, Todd; Cummings, Cody; Rahman, Irfan; Robinson, Risa; Hensel, Edward

    2018-02-13

This work investigates emissions sampling methods employed for qualitative identification of compounds in e-liquids and their resultant aerosols to assess what capture methods may be sufficient to identify harmful and potentially harmful constituents present. Three popular e-liquid flavors (cinnamon, mango, vanilla) were analyzed using qualitative gas chromatography-mass spectrometry (GC-MS) in the un-puffed state. Each liquid was also machine-puffed under realistic-use flow rate conditions and emissions were captured using two techniques: filter pads and methanol impingers. GC-MS analysis was conducted on the emissions captured using both techniques from all three e-liquids. The e-liquid GC-MS analysis resulted in positive identification of 13 compounds from the cinnamon flavor e-liquid, 31 from mango, and 19 from vanilla, including a number of compounds observed in all e-liquid experiments. Nineteen compounds were observed in emissions which were not present in the un-puffed e-liquid. Qualitative GC-MS analysis of the emissions samples identified compounds observed in all three sample types: e-liquid, impinger, and filter pad, and in each subset thereof. A limited number of compounds were observed in emissions captured with impingers but not in emissions captured using filter pads; a larger number of compounds were observed in emissions collected on the filter pads but not in those captured with impingers. It is demonstrated that sampling methods have different sampling efficiencies and some compounds might be missed using only one method. It is recommended to investigate filter pads, impingers, thermal desorption tubes, and solvent extraction resins to establish robust sampling methods for emissions testing of e-cigarette emissions.


  9. Qualitative Analysis of E-Liquid Emissions as a Function of Flavor Additives Using Two Aerosol Capture Methods

    PubMed Central

    Eddingsaas, Nathan; Pagano, Todd; Cummings, Cody; Rahman, Irfan; Robinson, Risa

    2018-01-01

This work investigates emissions sampling methods employed for qualitative identification of compounds in e-liquids and their resultant aerosols to assess what capture methods may be sufficient to identify harmful and potentially harmful constituents present. Three popular e-liquid flavors (cinnamon, mango, vanilla) were analyzed using qualitative gas chromatography-mass spectrometry (GC-MS) in the un-puffed state. Each liquid was also machine-puffed under realistic-use flow rate conditions and emissions were captured using two techniques: filter pads and methanol impingers. GC-MS analysis was conducted on the emissions captured using both techniques from all three e-liquids. The e-liquid GC-MS analysis resulted in positive identification of 13 compounds from the cinnamon flavor e-liquid, 31 from mango, and 19 from vanilla, including a number of compounds observed in all e-liquid experiments. Nineteen compounds were observed in emissions which were not present in the un-puffed e-liquid. Qualitative GC-MS analysis of the emissions samples identified compounds observed in all three sample types: e-liquid, impinger, and filter pad, and in each subset thereof. A limited number of compounds were observed in emissions captured with impingers but not in emissions captured using filter pads; a larger number of compounds were observed in emissions collected on the filter pads but not in those captured with impingers. It is demonstrated that sampling methods have different sampling efficiencies and some compounds might be missed using only one method. It is recommended to investigate filter pads, impingers, thermal desorption tubes, and solvent extraction resins to establish robust sampling methods for emissions testing of e-cigarette emissions. PMID:29438289

  10. New seismic array solution for earthquake observations and hydropower plant health monitoring

    NASA Astrophysics Data System (ADS)

    Antonovskaya, Galina N.; Kapustian, Natalya K.; Moshkunov, Alexander I.; Danilov, Alexey V.; Moshkunov, Konstantin A.

    2017-09-01

We present a novel fusion of seismic safety-monitoring data for the hydropower plant in Chirkey (Caucasus Mountains, Russia). This includes new hardware solutions and observation methods, along with their technical limitations, for three types of applications: (a) seismic monitoring of the Chirkey reservoir area, (b) structural monitoring of the dam, and (c) monitoring of turbine vibrations. Previous observation and data-processing schemes for health monitoring did not include complex data analysis; the new system is more rational and less expensive. Its key new feature is remote monitoring of turbine vibration. A comparison of the data obtained at the test facilities and by hydropower plant inspection with remote sensors enables early detection of hazardous hydrodynamic phenomena.

  11. Investigating the role of background and observation error correlations in improving a model forecast of forest carbon balance using four dimensional variational data assimilation.

    NASA Astrophysics Data System (ADS)

    Pinnington, Ewan; Casella, Eric; Dance, Sarah; Lawless, Amos; Morison, James; Nichols, Nancy; Wilkinson, Matthew; Quaife, Tristan

    2016-04-01

    Forest ecosystems play an important role in sequestering human emitted carbon-dioxide from the atmosphere and therefore greatly reduce the effect of anthropogenic induced climate change. For that reason understanding their response to climate change is of great importance. Efforts to implement variational data assimilation routines with functional ecology models and land surface models have been limited, with sequential and Markov chain Monte Carlo data assimilation methods being prevalent. When data assimilation has been used with models of carbon balance, background "prior" errors and observation errors have largely been treated as independent and uncorrelated. Correlations between background errors have long been known to be a key aspect of data assimilation in numerical weather prediction. More recently, it has been shown that accounting for correlated observation errors in the assimilation algorithm can considerably improve data assimilation results and forecasts. In this paper we implement a 4D-Var scheme with a simple model of forest carbon balance, for joint parameter and state estimation and assimilate daily observations of Net Ecosystem CO2 Exchange (NEE) taken at the Alice Holt forest CO2 flux site in Hampshire, UK. We then investigate the effect of specifying correlations between parameter and state variables in background error statistics and the effect of specifying correlations in time between observation error statistics. The idea of including these correlations in time is new and has not been previously explored in carbon balance model data assimilation. In data assimilation, background and observation error statistics are often described by the background error covariance matrix and the observation error covariance matrix. 
We outline novel methods for creating correlated versions of these matrices, using a set of previously postulated dynamical constraints to include correlations in the background error statistics and a Gaussian correlation function to include time correlations in the observation error statistics. The methods used in this paper will allow the inclusion of time correlations between many different observation types in the assimilation algorithm, meaning that previously neglected information can be accounted for. In our experiments we compared the results using our new correlated background and observation error covariance matrices with those using diagonal covariance matrices. We found that using the new correlated matrices reduced the root mean square error in the 14-year forecast of daily NEE by 44%, decreasing from 4.22 g C m⁻² day⁻¹ to 2.38 g C m⁻² day⁻¹.
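The Gaussian correlation function mentioned for the observation errors can be sketched directly: R_ij = σ² exp(−(t_i − t_j)² / 2L²) for observation times t_i, t_j. The error standard deviation and correlation length below are illustrative placeholders, not the values used in the study.

```python
import numpy as np

def correlated_obs_covariance(n_obs, sigma, length_scale):
    """Time-correlated observation-error covariance matrix R built from a
    Gaussian correlation function over daily observation times.

    sigma and length_scale (in days) are illustrative assumptions.
    """
    t = np.arange(n_obs, dtype=float)          # daily observation times
    dt = t[:, None] - t[None, :]               # pairwise time separations
    return sigma**2 * np.exp(-dt**2 / (2.0 * length_scale**2))

R = correlated_obs_covariance(10, sigma=0.5, length_scale=2.0)
# Diagonal entries are the error variance sigma**2 == 0.25;
# off-diagonal entries decay smoothly with time separation.
```

Setting length_scale toward zero recovers the diagonal matrix used as the baseline in the comparison described above.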

  12. CHAPTER 9: USING CENSUS DATA TO APPROXIMATE NEIGHBORHOOD EFFECTS

    EPA Science Inventory

    INTRODUCTION Despite the development of innovative neighborhood data collection methods, such as systematic social observation (1, 2), and the utilization of novel administrative data sources including delinquent tax records, homelessness shelter utilization, reports of housing ...

  13. Spatial Evaluation and Verification of Earthquake Simulators

    NASA Astrophysics Data System (ADS)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2017-06-01

In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against currently observed earthquake seismicity is necessary, and following past simulator and forecast-model verification methods, we address the challenges of applying spatial forecast verification to simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element, and a smoothing method based on the power laws of the epidemic-type aftershock sequence (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a rate that decays with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m > 6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
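The power-law smoothing idea can be sketched as follows: each simulated epicenter contributes one unit of rate spread over the whole test grid, decaying with epicentral distance. The kernel form (d + d0)^(−q) and the parameter values below are illustrative assumptions; the study's actual ETAS-based kernel and constants are not given in the abstract.

```python
import numpy as np

def powerlaw_rate_map(event_xy, grid_x, grid_y, d0=5.0, q=1.5):
    """Spread each simulated epicenter's rate over the whole grid with a
    power-law kernel ~ (d + d0)**(-q), normalized so every event
    contributes unit total rate (d0, q are illustrative, not the
    study's values)."""
    gx, gy = np.meshgrid(grid_x, grid_y)       # shape (ny, nx)
    rate = np.zeros_like(gx, dtype=float)
    for ex, ey in event_xy:
        d = np.hypot(gx - ex, gy - ey)         # epicentral distance
        kern = (d + d0) ** (-q)
        rate += kern / kern.sum()              # unit rate per event
    return rate

# Two simulated epicenters on a 100 km x 100 km grid at 1 km spacing.
grid = np.linspace(0.0, 100.0, 101)
rmap = powerlaw_rate_map([(20.0, 20.0), (80.0, 50.0)], grid, grid)
```

Unlike nearest-neighbor mapping, every grid cell receives some rate, so off-fault observed epicenters are never scored against a zero-rate cell.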

  14. Land Surface Model Biases and their Impacts on the Assimilation of Snow-related Observations

    NASA Astrophysics Data System (ADS)

    Arsenault, K. R.; Kumar, S.; Hunter, S. M.; Aman, R.; Houser, P. R.; Toll, D.; Engman, T.; Nigro, J.

    2007-12-01

    Some recent snow modeling studies have employed a wide range of assimilation methods to incorporate snow cover or other snow-related observations into different hydrological or land surface models. These methods often include taking both model and observation biases into account throughout the model integration. This study focuses more on diagnosing the model biases and presenting their subsequent impacts on assimilating snow observations and modeled snowmelt processes. In this study, the land surface model, the Community Land Model (CLM), is used within the Land Information System (LIS) modeling framework to show how such biases impact the assimilation of MODIS snow cover observations. Alternative in-situ and satellite-based observations are used to help guide the CLM LSM in better predicting snowpack conditions and more realistic timing of snowmelt for a western US mountainous region. Also, MODIS snow cover observation biases will be discussed, and validation results will be provided. The issues faced with inserting or assimilating MODIS snow cover at moderate spatial resolutions (like 1km or less) will be addressed, and the impacts on CLM will be presented.

  15. Pedigree data analysis with crossover interference.

    PubMed Central

    Browning, Sharon

    2003-01-01

    We propose a new method for calculating probabilities for pedigree genetic data that incorporates crossover interference using the chi-square models. Applications include relationship inference, genetic map construction, and linkage analysis. The method is based on importance sampling of unobserved inheritance patterns conditional on the observed genotype data and takes advantage of fast algorithms for no-interference models while using reweighting to allow for interference. We show that the method is effective for arbitrarily many markers with small pedigrees. PMID:12930760
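The reweighting idea — sample under a fast no-interference model, then reweight toward the interference model — is ordinary importance sampling. A minimal sketch with toy one-dimensional densities standing in for the inheritance-pattern models (all names and densities here are illustrative, not the paper's pedigree likelihoods):

```python
import math
import random

def importance_estimate(n=100_000, seed=1):
    """Estimate an expectation under a 'target' model by sampling from a
    cheaper 'proposal' model and reweighting each draw by the density
    ratio, mirroring the paper's reweighting of no-interference samples.
    Toy setup: proposal is N(0,1), target is N(1,1), so E[x] = 1."""
    rng = random.Random(seed)

    def proposal_pdf(x):  # easy-to-sample stand-in for the fast model
        return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

    def target_pdf(x):    # stand-in for the interference model
        return math.exp(-(x - 1) ** 2 / 2) / math.sqrt(2 * math.pi)

    num = den = 0.0
    for _ in range(n):
        x = rng.gauss(0.0, 1.0)              # draw from the proposal
        w = target_pdf(x) / proposal_pdf(x)  # importance weight
        num += w * x
        den += w
    return num / den  # self-normalized importance-sampling estimate
```

The same pattern applies when the "draws" are inheritance patterns and the densities are pedigree likelihoods under the two crossover models.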

  16. Estimating the state of a geophysical system with sparse observations: time delay methods to achieve accurate initial states for prediction

    NASA Astrophysics Data System (ADS)

    An, Zhe; Rey, Daniel; Ye, Jingxin; Abarbanel, Henry D. I.

    2017-01-01

The problem of forecasting the behavior of a complex dynamical system through analysis of observational time-series data becomes difficult when the system expresses chaotic behavior and the measurements are sparse in space and/or time. Although this situation is quite typical across many fields, including numerical weather prediction, the issue of whether the available observations are "sufficient" for generating successful forecasts is still not well understood. An analysis by Whartenby et al. (2013) found that in the context of the nonlinear shallow water equations on a β plane, standard nudging techniques require observing approximately 70 % of the full set of state variables. Here we examine the same system using a method introduced by Rey et al. (2014a), which generalizes standard nudging methods to utilize time-delayed measurements. We show that in certain circumstances, it provides a sizable reduction in the number of observations required to construct accurate estimates and high-quality predictions. In particular, we find that this estimate of 70 % can be reduced to about 33 % using time delays, and even further if Lagrangian drifter locations are also used as measurements.
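Plain nudging, the baseline that the time-delay method generalizes, can be illustrated with a Lorenz-63 twin experiment in which only one of three state variables is observed. This is a toy stand-in for the shallow water system, not the authors' setup; the coupling gain `k` and step size are assumed values.

```python
def lorenz_step(s, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system."""
    x, y, z = s
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def nudge(n_steps=4000, k=20.0, dt=0.01):
    """Twin experiment: a 'truth' run is observed only through x, and a
    model copy is nudged toward those observations. Returns the state
    estimation error before and after nudging."""
    truth = (1.0, 1.0, 1.0)
    est = (-5.0, 5.0, 20.0)  # poor initial guess for the model copy

    def err(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5

    e0 = err(truth, est)
    for _ in range(n_steps):
        truth = lorenz_step(truth, dt)
        est = lorenz_step(est, dt)
        # relax only the observed variable x toward the observation
        est = (est[0] + dt * k * (truth[0] - est[0]), est[1], est[2])
    return e0, err(truth, est)
```

The unobserved variables synchronize through the dynamics; the time-delay generalization of Rey et al. supplies extra information of this kind when even fewer variables are observed.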

  17. a Simple Spatially Weighted Measure of Temporal Stability for Data with Limited Temporal Observations

    NASA Astrophysics Data System (ADS)

    Piburn, J.; Stewart, R.; Morton, A.

    2017-10-01

Identifying erratic or unstable time series is an area of interest to many fields. Recently, there have been successful developments toward this goal. These newly developed methodologies, however, come from domains where it is typical to have several thousand or more temporal observations. This creates a challenge when attempting to apply these methodologies to time series with far fewer temporal observations, such as in socio-cultural understanding, a domain where a typical time series of interest might only consist of 20-30 annual observations. Most existing methodologies simply cannot say anything interesting with so few data points, yet researchers are still tasked to work within the confines of the data. Recently a method for characterizing instability in a time series with limited temporal observations was published. This method, the Attribute Stability Index (ASI), uses an approximate-entropy-based method to characterize a time series' instability. In this paper we propose an explicitly spatially weighted extension of the Attribute Stability Index. By including a mechanism to account for spatial autocorrelation, this work represents a novel approach for the characterization of space-time instability. As a case study we explore national youth male unemployment across the world from 1991-2014.
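The approximate-entropy calculation underlying the ASI can be sketched with the standard Pincus algorithm. This is a generic implementation, not the published ASI code; the tolerance default of 0.2 standard deviations is a conventional choice.

```python
import math

def approx_entropy(series, m=2, r=None):
    """Approximate entropy (ApEn) of a short time series: low values
    indicate a regular/stable series, high values an erratic one.
    m is the template length; r the match tolerance (defaults to
    0.2 * standard deviation, a common convention)."""
    n = len(series)
    if r is None:
        mean = sum(series) / n
        r = 0.2 * (sum((x - mean) ** 2 for x in series) / n) ** 0.5

    def phi(m):
        # average log-frequency of template matches within tolerance r
        # (Chebyshev distance between length-m templates)
        templates = [series[i:i + m] for i in range(n - m + 1)]
        total = 0.0
        for a in templates:
            c = sum(1 for b in templates
                    if max(abs(u - v) for u, v in zip(a, b)) <= r)
            total += math.log(c / len(templates))
        return total / len(templates)

    return phi(m) - phi(m + 1)
```

On a 30-point series — the annual-data regime discussed above — a perfectly repeating series scores near zero while an erratic one scores substantially higher, which is what makes an entropy-based index usable with so few observations.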

  18. Sediment and nutrients transport in watershed and their impact on coastal environment

    PubMed Central

    Ikeda, Syunsuke; Osawa, Kazutoshi; Akamatsu, Yoshihisa

    2009-01-01

    Sediment and nutrients yields especially from farmlands were studied in a watershed in Ishigaki island, Okinawa, Japan. The transport processes of these materials in rivers, mangrove, lagoon and coastal zones were studied by using various observation methods including stable isotope analysis. They were simulated by using a WEPP model which was modified to be applicable to such small islands by identifying several factors from the observations. The model predicts that a proper combination of civil engineering countermeasure and change of farming method can reduce the sediment yield from the watershed by 74%. Observations of water quality and coral recruitment test in Nagura bay indicate that the water is eutrophicated and the corals cannot grow for a long time. Based on these observations, a quantitative target of the reduction of sediment and nutrients yield in watershed can be decided rationally. PMID:19907124

  19. Research Methods in Healthcare Epidemiology and Antimicrobial Stewardship-Observational Studies.

    PubMed

    Snyder, Graham M; Young, Heather; Varman, Meera; Milstone, Aaron M; Harris, Anthony D; Munoz-Price, Silvia

    2016-10-01

    Observational studies compare outcomes among subjects with and without an exposure of interest, without intervention from study investigators. Observational studies can be designed as a prospective or retrospective cohort study or as a case-control study. In healthcare epidemiology, these observational studies often take advantage of existing healthcare databases, making them more cost-effective than clinical trials and allowing analyses of rare outcomes. This paper addresses the importance of selecting a well-defined study population, highlights key considerations for study design, and offers potential solutions including biostatistical tools that are applicable to observational study designs. Infect Control Hosp Epidemiol 2016;1-6.

  20. Observational evidence and strength of evidence domains: case examples

    PubMed Central

    2014-01-01

    Background Systematic reviews of healthcare interventions most often focus on randomized controlled trials (RCTs). However, certain circumstances warrant consideration of observational evidence, and such studies are increasingly being included as evidence in systematic reviews. Methods To illustrate the use of observational evidence, we present case examples of systematic reviews in which observational evidence was considered as well as case examples of individual observational studies, and how they demonstrate various strength of evidence domains in accordance with current Agency for Healthcare Research and Quality (AHRQ) Evidence-based Practice Center (EPC) methods guidance. Results In the presented examples, observational evidence is used when RCTs are infeasible or raise ethical concerns, lack generalizability, or provide insufficient data. Individual study case examples highlight how observational evidence may fulfill required strength of evidence domains, such as study limitations (reduced risk of selection, detection, performance, and attrition); directness; consistency; precision; and reporting bias (publication, selective outcome reporting, and selective analysis reporting), as well as additional domains of dose-response association, plausible confounding that would decrease the observed effect, and strength of association (magnitude of effect). Conclusions The cases highlighted in this paper demonstrate how observational studies may provide moderate to (rarely) high strength evidence in systematic reviews. PMID:24758494

  1. A Review on Human Activity Recognition Using Vision-Based Method.

    PubMed

    Zhang, Shugang; Wei, Zhiqiang; Nie, Jie; Huang, Lei; Wang, Shuang; Li, Zhen

    2017-01-01

    Human activity recognition (HAR) aims to recognize activities from a series of observations on the actions of subjects and the environmental conditions. The vision-based HAR research is the basis of many applications including video surveillance, health care, and human-computer interaction (HCI). This review highlights the advances of state-of-the-art activity recognition approaches, especially for the activity representation and classification methods. For the representation methods, we sort out a chronological research trajectory from global representations to local representations, and recent depth-based representations. For the classification methods, we conform to the categorization of template-based methods, discriminative models, and generative models and review several prevalent methods. Next, representative and available datasets are introduced. Aiming to provide an overview of those methods and a convenient way of comparing them, we classify existing literatures with a detailed taxonomy including representation and classification methods, as well as the datasets they used. Finally, we investigate the directions for future research.

  2. A Review on Human Activity Recognition Using Vision-Based Method

    PubMed Central

    Nie, Jie

    2017-01-01

    Human activity recognition (HAR) aims to recognize activities from a series of observations on the actions of subjects and the environmental conditions. The vision-based HAR research is the basis of many applications including video surveillance, health care, and human-computer interaction (HCI). This review highlights the advances of state-of-the-art activity recognition approaches, especially for the activity representation and classification methods. For the representation methods, we sort out a chronological research trajectory from global representations to local representations, and recent depth-based representations. For the classification methods, we conform to the categorization of template-based methods, discriminative models, and generative models and review several prevalent methods. Next, representative and available datasets are introduced. Aiming to provide an overview of those methods and a convenient way of comparing them, we classify existing literatures with a detailed taxonomy including representation and classification methods, as well as the datasets they used. Finally, we investigate the directions for future research. PMID:29065585

  3. An adaptive approach to the dynamic allocation of buffer storage. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Crooke, S. C.

    1970-01-01

    Several strategies for the dynamic allocation of buffer storage are simulated and compared. The basic algorithms investigated, using actual statistics observed in the Univac 1108 EXEC 8 System, include the buddy method and the first-fit method. Modifications are made to the basic methods in an effort to improve and to measure allocation performance. A simulation model of an adaptive strategy is developed which permits interchanging the two different methods, the buddy and the first-fit methods with some modifications. Using an adaptive strategy, each method may be employed in the statistical environment in which its performance is superior to the other method.
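The first-fit strategy, one of the two basic methods compared, can be sketched over a free list of holes. This is an illustrative model, not the Univac 1108 implementation; the buddy method would instead round each request up to a power of two and split/coalesce blocks in pairs.

```python
def first_fit(free_list, request):
    """First-fit allocation over a free list of (offset, size) holes:
    take the first hole large enough, splitting off the remainder.
    Returns (offset, new_free_list), or (None, free_list) on failure."""
    for i, (off, size) in enumerate(free_list):
        if size >= request:
            rest = free_list[:i] + free_list[i + 1:]
            if size > request:
                # keep the unused tail of the hole on the free list
                rest.insert(i, (off + request, size - request))
            return off, rest
    return None, free_list
```

An adaptive allocator of the kind simulated in the thesis would monitor request statistics and switch between this routine and a buddy-style one as the workload changes.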

  4. Photonic orbital angular momentum in starlight. Further analysis of the 2011 Starfire Optical Range Observations

    NASA Astrophysics Data System (ADS)

    Oesch, Denis W.; Sanchez, Darryl J.

    2014-07-01

    Context. Each attempt by the Atmospheric Simulation and Adaptive-optics Laboratory Testbed (ASALT) research group to detect turbulence-induced photonic orbital angular momentum (POAM) has been successful, spanning laboratory, simulation and field experiments, with the possible exception of the 2011 Starfire Optical Range (SOR) astronomical observations, a search for POAM induced by astronomical sources. Aims: The purposes of this work are to discuss how POAM from astronomical turbulent assemblages of molecules or atoms (TAMA) would appear in observations and then to reanalyze the data from the 2011 SOR observations using a more refined technique as a demonstration of POAM in starlight. Methods: This work uses the method of projections used previously in analysis of terrestrial data. Results: Using the method of projections, the noise floor of the system was reevaluated and is found to be no greater than 1%. Reevaluation of the 2011 SOR observations reveals that a POAM signal is evident in all of the data. Conclusions: POAM signals have been found in every instance of extended propagation through turbulence conducted by the ASALT research group, including the 2011 SOR observations. POAM is an inevitable result of the propagation of optical waves through turbulence. We express our gratitude to the Air Force Office of Scientific Research for their support of this research.

  5. Monte Carlo Bayesian Inference on a Statistical Model of Sub-Gridcolumn Moisture Variability using High-Resolution Cloud Observations

    NASA Astrophysics Data System (ADS)

    Norris, P. M.; da Silva, A. M., Jr.

    2016-12-01

    Norris and da Silva recently published a method to constrain a statistical model of sub-gridcolumn moisture variability using high-resolution satellite cloud data. The method can be used for large-scale model parameter estimation or cloud data assimilation (CDA). The gridcolumn model includes assumed-PDF intra-layer horizontal variability and a copula-based inter-layer correlation model. The observables used are MODIS cloud-top pressure, brightness temperature and cloud optical thickness, but the method should be extensible to direct cloudy radiance assimilation for a small number of channels. The algorithm is a form of Bayesian inference with a Markov chain Monte Carlo (MCMC) approach to characterizing the posterior distribution. This approach is especially useful in cases where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach is not gradient-based and allows jumps into regions of non-zero cloud probability. In the example provided, the method is able to restore marine stratocumulus near the Californian coast where the background state has a clear swath. The new approach not only significantly reduces mean and standard deviation biases with respect to the assimilated observables, but also improves the simulated rotational-Ramman scattering cloud optical centroid pressure against independent (non-assimilated) retrievals from the OMI instrument. One obvious difficulty for the method, and other CDA methods, is the lack of information content in passive cloud observables on cloud vertical structure, beyond cloud-top and thickness, thus necessitating strong dependence on the background vertical moisture structure. It is found that a simple flow-dependent correlation modification due to Riishojgaard is helpful, better honoring inversion structures in the background state.
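The MCMC machinery can be illustrated with a minimal random-walk Metropolis sampler on a toy one-dimensional posterior; the target and step size here are assumptions standing in for the sub-gridcolumn moisture posterior, not the authors' model.

```python
import math
import random

def metropolis(log_post, x0=0.0, n=20000, step=1.0, seed=0):
    """Minimal random-walk Metropolis sampler. Because moves are finite
    jumps accepted by a probability ratio (not gradient steps), the chain
    can reach regions of non-zero cloud probability even when the
    starting state is clear-sky."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n):
        xp = x + rng.gauss(0.0, step)       # propose a jump
        lpp = log_post(xp)
        if math.log(rng.random()) < lpp - lp:  # accept w.p. min(1, ratio)
            x, lp = xp, lpp
        samples.append(x)
    return samples

# toy posterior: Gaussian with mean 2, sd 1
samples = metropolis(lambda x: -0.5 * (x - 2.0) ** 2)
```

After discarding a burn-in, the sample histogram characterizes the posterior, which is the role the MCMC chain plays in the cloud data assimilation above.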

  6. The reliability of clinical decisions based on the cervical vertebrae maturation staging method.

    PubMed

    Sohrabi, Aydin; Babay Ahari, Sahar; Moslemzadeh, Hossein; Rafighi, Ali; Aghazadeh, Zahra

    2016-02-01

    Of the various techniques used to determine the optimum timing for growth modification treatments, the cervical vertebrae maturation method has great advantages, including validity and no need for extra X-ray exposure. Recently, the reproducibility of this method has been questioned. The aim of this study was to investigate the cause of poor reproducibility of this method and to assess the reproducibility of the clinical decisions made based on it. Seventy lateral cephalograms of Iranian patients aged 9‒15 years were observed twice by five experienced orthodontists. In addition to determining the developmental stage, each single parameter involved in this method was assessed in terms of inter- and intra-observer reproducibility. In order to evaluate the reproducibility of clinical decisions based on this method, cervical vertebrae maturation staging (CVMS) I and II were considered as phase 1 and CVMS IV and V were considered as phase 3. By considering the clinical approach of the CVMS method, inter-observer reproducibility of this method increased from 0.48 to 0.61 (moderate to substantial) and intra-observer reproducibility enhanced from 0.72 to 0.74. 1. Complete visualization of the first four cervical vertebrae was an inclusion criterion, which also limits the clinical application of CVMS method. 2. These results can be generalized when determining growth modification treatments solely for Class II patients. Difficulty in determining the morphology of C3 and C4 leads to poor reproducibility of the CVMS method. Despite this, it has acceptable reproducibility in determining the timing of functional treatment for Class II patients. © The Author 2015. Published by Oxford University Press on behalf of the European Orthodontic Society. All rights reserved. For permissions, please email: journals.permissions@oup.com.
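The reported reproducibility figures (e.g., 0.48 rising to 0.61, "moderate to substantial") are kappa-type agreement statistics. A sketch of Cohen's kappa for two raters shows how such values arise; the study itself involved five observers, so a multi-rater variant would have been used in practice.

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters over the same cases:
    chance-corrected agreement (p_o - p_e) / (1 - p_e).
    Conventionally, 0.41-0.60 reads as 'moderate' and
    0.61-0.80 as 'substantial' agreement."""
    n = len(r1)
    p_o = sum(a == b for a, b in zip(r1, r2)) / n       # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    p_e = sum(c1[k] * c2[k] for k in c1) / (n * n)      # chance agreement
    return (p_o - p_e) / (1 - p_e)
```

Collapsing the five CVMS stages into broader clinical phases, as done above, increases p_o without changing the logic, which is why the kappa values rise.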

  7. A complementary marriage of perspectives: understanding organizational social context using mixed methods.

    PubMed

    Beidas, Rinad S; Wolk, Courtney L Benjamin; Walsh, Lucia M; Evans, Arthur C; Hurford, Matthew O; Barg, Frances K

    2014-11-23

Organizational factors impact the delivery of mental health services in community settings. Mixed-methods analytic approaches have been recommended, though little research within implementation science has explicitly compared inductive and deductive perspectives to understand their relative value in understanding the same constructs. The purpose of our study is to use two different paradigmatic approaches to deepen our understanding of organizational social context. We accomplish this by using a mixed-methods approach in an investigation of organizational social context in community mental health clinics. Nineteen agencies, representing 23 sites, participated. Enrolled participants included 130 therapists, 36 supervisors, and 22 executive administrators. Quantitative data was obtained via the Organizational Social Context (OSC) measure. Qualitative data, comprising direct observation with spot sampling generated from agency visits, was coded using content analysis and grounded theory. The present study examined elements of organizational social context that would have been missed if only quantitative data had been obtained and utilized mixed methods to investigate if stratifying observations based on quantitative ratings from the OSC resulted in the emergence of differential themes. Four of the six OSC constructs were commonly observed in field observations (i.e., proficiency, rigidity, functionality, stress), while the remaining two constructs were not frequently observed (i.e., resistance, engagement). Constructs emerged related to organizational social context that may have been missed if only quantitative measurement was employed, including those around the physical environment, commentary about evidence-based practice initiatives, leadership, cultural diversity, distrust, and affect. Stratifying agencies by "best," "average," and "worst" organizational social context impacted interpretation for three constructs (affect, stress, and leadership). Results support the additive value of integrating inductive and deductive perspectives in implementation science research. This synthesis of approaches facilitated a more comprehensive understanding and interpretation of the findings than would have been possible if either methodology had been employed in isolation.

  8. Molecular identification of Neofabraea species associated with bull's-eye rot on apple using rolling-circle amplification of partial EF-1α sequence.

    PubMed

    Lin, Huijiao; Jiang, Xiang; Yi, Jianping; Wang, Xinguo; Zuo, Ranling; Jiang, Zide; Wang, Weifang; Zhou, Erxun

    2018-01-01

    A rolling-circle amplification (RCA) method with padlock probes targeted on EF-1α regions was developed for rapid detection of apple bull's-eye rot pathogens, including Neofabraea malicorticis, N. perennans, N. kienholzii, and N. vagabunda (synonym: N. alba). Four padlock probes (PLP-Nm, PLP-Np, PLP-Nk, and PLP-Nv) were designed and tested against 28 samples, including 22 BER pathogen cultures, 4 closely related species, and 2 unrelated species that may cause serious apple decays. The assay successfully identified all the bull's-eye rot pathogenic fungi at the level of species, while no cross-reaction was observed in all target species and no false-positive reaction was observed with all strains used for reference. This study showed that the use of padlock probes and the combination of probe signal amplification by RCA provided an effective and sensitive method for the rapid identification of Neofabraea spp. The method could therefore be a useful tool for monitoring bull's-eye rot pathogens in port quarantine and orchard epidemiological studies.

  9. Development and evaluation of a method of calibrating medical displays based on fixed adaptation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sund, Patrik, E-mail: patrik.sund@vgregion.se; Månsson, Lars Gunnar; Båth, Magnus

    2015-04-15

Purpose: The purpose of this work was to develop and evaluate a new method for calibration of medical displays that includes the effect of fixed adaptation, using equipment and luminance levels typical of a modern radiology department. Methods: Low contrast sinusoidal test patterns were derived at nine luminance levels from 2 to 600 cd/m² and used in a two-alternative forced choice observer study, where the adaptation level was fixed at the logarithmic average of 35 cd/m². The contrast sensitivity at each luminance level was derived by establishing a linear relationship between the ten pattern contrast levels used at every luminance level and a detectability index (d′) calculated from the fraction of correct responses. A Gaussian function was fitted to the data and normalized to the adaptation level. The corresponding equation was used in a display calibration method that included the grayscale standard display function (GSDF) but compensated for fixed adaptation. In the evaluation study, the contrast of circular objects with a fixed pixel contrast was displayed using both calibration methods and was rated on a five-grade scale. Results were calculated using a visual grading characteristics method. Error estimations in both observer studies were derived using a bootstrap method. Results: The contrast sensitivities for the darkest and brightest patterns compared to the contrast sensitivity at the adaptation luminance were 37% and 56%, respectively. The obtained Gaussian fit corresponded well with similar studies. The evaluation study showed a higher degree of equally distributed contrast throughout the luminance range with the calibration method compensated for fixed adaptation than for the GSDF. The two lowest scores for the GSDF were obtained for the darkest and brightest patterns. These scores were significantly lower than the lowest score obtained for the compensated GSDF. For the GSDF, the scores for all luminance levels were statistically separated from the average value; three were lower and two were higher. For the compensated GSDF, three of the scores could not be separated from the average value. Conclusions: An observer study using clinically relevant displays and luminance settings has demonstrated that the calibration of displays according to the GSDF causes the perceived contrast to be unevenly distributed when using displays with a high luminance range. As the luminance range increases, the perceived contrast in the dark and bright regions will be significantly lower than the perceived contrast in the middle of the luminance range. A new calibration method that includes the effect of fixed adaptation was developed and evaluated in an observer study and was found to distribute the contrast of the display more evenly throughout the grayscale than the GSDF.
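The conversion from fraction correct in a two-alternative forced-choice task to a detectability index d′ can be sketched under the standard equal-variance Gaussian observer model; the abstract does not give the authors' exact conversion, so this formula is an assumption.

```python
from math import sqrt
from statistics import NormalDist

def dprime_2afc(fraction_correct):
    """Detectability index d' from the fraction of correct responses in
    a two-alternative forced-choice (2AFC) task, under the standard
    equal-variance Gaussian observer model: d' = sqrt(2) * z(Pc),
    where z is the inverse standard-normal CDF."""
    return sqrt(2) * NormalDist().inv_cdf(fraction_correct)
```

Chance performance (Pc = 0.5) maps to d′ = 0, and roughly 76% correct maps to d′ ≈ 1, which is why a linear fit of contrast against d′ yields a contrast-sensitivity threshold.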

  10. Detection of medication-related problems in hospital practice: a review

    PubMed Central

    Manias, Elizabeth

    2013-01-01

    This review examines the effectiveness of detection methods in terms of their ability to identify and accurately determine medication-related problems in hospitals. A search was conducted of databases from inception to June 2012. The following keywords were used in combination: medication error or adverse drug event or adverse drug reaction, comparison, detection, hospital and method. Seven detection methods were considered: chart review, claims data review, computer monitoring, direct care observation, interviews, prospective data collection and incident reporting. Forty relevant studies were located. Detection methods that were better able to identify medication-related problems compared with other methods tested in the same study included chart review, computer monitoring, direct care observation and prospective data collection. However, only small numbers of studies were involved in comparisons with direct care observation (n = 5) and prospective data collection (n = 6). There was little focus on detecting medication-related problems during various stages of the medication process, and comparisons associated with the seriousness of medication-related problems were examined in 19 studies. Only 17 studies involved appropriate comparisons with a gold standard, which provided details about sensitivities and specificities. In view of the relatively low identification of medication-related problems with incident reporting, use of this method in tracking trends over time should be met with some scepticism. Greater attention should be placed on combining methods, such as chart review and computer monitoring in examining trends. More research is needed on the use of claims data, direct care observation, interviews and prospective data collection as detection methods. PMID:23194349

  11. Advances in the use of observed spatial patterns of catchment hydrological response

    NASA Astrophysics Data System (ADS)

    Grayson, Rodger B.; Blöschl, Günter; Western, Andrew W.; McMahon, Thomas A.

    Over the past two decades there have been repeated calls for the collection of new data for use in developing hydrological science. The last few years have begun to bear fruit from the seeds sown by these calls, through increases in the availability and utility of remote sensing data, as well as the execution of campaigns in research catchments aimed at providing new data for advancing hydrological understanding and predictive capability. In this paper we discuss some philosophical considerations related to model complexity, data availability and predictive performance, highlighting the potential of observed patterns in moving the science and practice of catchment hydrology forward. We then review advances that have arisen from recent work on spatial patterns, including in the characterisation of spatial structure and heterogeneity, and the use of patterns for developing, calibrating and testing distributed hydrological models. We illustrate progress via examples using observed patterns of snow cover, runoff occurrence and soil moisture. Methods for the comparison of patterns are presented, illustrating how they can be used to assess hydrologically important characteristics of model performance. These methods include point-to-point comparisons, spatial relationships between errors and landscape parameters, transects, and optimal local alignment. It is argued that the progress made to date augers well for future developments, but there is scope for improvements in several areas. These include better quantitative methods for pattern comparisons, better use of pattern information in data assimilation and modelling, and a call for improved archiving of data from field studies to assist in comparative studies for generalising results and developing fundamental understanding.
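The simplest member of the pattern-comparison family mentioned above, point-to-point comparison, can be sketched for two spatial fields flattened to lists of cell values. The pairing of RMSE with pattern correlation is illustrative; the paper reviews a broader set of measures.

```python
from math import sqrt

def point_to_point(obs, sim):
    """Point-to-point comparison of an observed and a simulated spatial
    pattern (e.g., soil moisture or snow cover on the same grid):
    RMSE captures amplitude errors, Pearson correlation captures
    agreement in spatial structure."""
    n = len(obs)
    rmse = sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / n)
    mo, ms = sum(obs) / n, sum(sim) / n
    cov = sum((o - mo) * (s - ms) for o, s in zip(obs, sim))
    vo = sum((o - mo) ** 2 for o in obs)
    vs = sum((s - ms) ** 2 for s in sim)
    corr = cov / sqrt(vo * vs)
    return rmse, corr
```

A model can score a high pattern correlation yet a large RMSE (right structure, wrong amplitude), which is exactly the kind of hydrologically meaningful distinction the paper argues pattern data makes visible.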

  12. Characterization and Simulation of the Thermoacoustic Instability Behavior of an Advanced, Low Emissions Combustor Prototype

    NASA Technical Reports Server (NTRS)

    DeLaat, John C.; Paxson, Daniel E.

    2008-01-01

    Extensive research is being done toward the development of ultra-low-emissions combustors for aircraft gas turbine engines. However, these combustors have an increased susceptibility to thermoacoustic instabilities. This type of instability was recently observed in an advanced, low emissions combustor prototype installed in a NASA Glenn Research Center test stand. The instability produces pressure oscillations that grow with increasing fuel/air ratio, preventing full power operation. The instability behavior makes the combustor a potentially useful test bed for research into active control methods for combustion instability suppression. The instability behavior was characterized by operating the combustor at various pressures, temperatures, and fuel and air flows representative of operation within an aircraft gas turbine engine. Trends in instability behavior versus operating condition have been identified and documented, and possible explanations for the trends provided. A simulation developed at NASA Glenn captures the observed instability behavior. The physics-based simulation includes the relevant physical features of the combustor and test rig, employs a Sectored 1-D approach, includes simplified reaction equations, and provides time-accurate results. A computationally efficient method is used for area transitions, which decreases run times and allows the simulation to be used for parametric studies, including control method investigations. Simulation results show that the simulation exhibits a self-starting, self-sustained combustion instability and also replicates the experimentally observed instability trends versus operating condition. Future plans are to use the simulation to investigate active control strategies to suppress combustion instabilities and then to experimentally demonstrate active instability suppression with the low emissions combustor prototype, enabling full power, stable operation.

  13. Characterization and Simulation of Thermoacoustic Instability in a Low Emissions Combustor Prototype

    NASA Technical Reports Server (NTRS)

    DeLaat, John C.; Paxson, Daniel E.

    2008-01-01

    Extensive research is being done toward the development of ultra-low-emissions combustors for aircraft gas turbine engines. However, these combustors have an increased susceptibility to thermoacoustic instabilities. This type of instability was recently observed in an advanced, low emissions combustor prototype installed in a NASA Glenn Research Center test stand. The instability produces pressure oscillations that grow with increasing fuel/air ratio, preventing full power operation. The instability behavior makes the combustor a potentially useful test bed for research into active control methods for combustion instability suppression. The instability behavior was characterized by operating the combustor at various pressures, temperatures, and fuel and air flows representative of operation within an aircraft gas turbine engine. Trends in instability behavior vs. operating condition have been identified and documented. A simulation developed at NASA Glenn captures the observed instability behavior. The physics-based simulation includes the relevant physical features of the combustor and test rig, employs a Sectored 1-D approach, includes simplified reaction equations, and provides time-accurate results. A computationally efficient method is used for area transitions, which decreases run times and allows the simulation to be used for parametric studies, including control method investigations. Simulation results show that the simulation exhibits a self-starting, self-sustained combustion instability and also replicates the experimentally observed instability trends vs. operating condition. Future plans are to use the simulation to investigate active control strategies to suppress combustion instabilities and then to experimentally demonstrate active instability suppression with the low emissions combustor prototype, enabling full power, stable operation.

  14. A Robust Method of Measuring Other-Race and Other-Ethnicity Effects: The Cambridge Face Memory Test Format

    PubMed Central

    McKone, Elinor; Stokes, Sacha; Liu, Jia; Cohan, Sarah; Fiorentini, Chiara; Pidcock, Madeleine; Yovel, Galit; Broughton, Mary; Pelleg, Michel

    2012-01-01

    Other-race and other-ethnicity effects on face memory have remained a topic of consistent research interest over several decades, across fields including face perception, social psychology, and forensic psychology (eyewitness testimony). Here we demonstrate that the Cambridge Face Memory Test format provides a robust method for measuring these effects. Testing the Cambridge Face Memory Test original version (CFMT-original; European-ancestry faces from Boston USA) and a new Cambridge Face Memory Test Chinese (CFMT-Chinese), with European and Asian observers, we report a race-of-face by race-of-observer interaction that was highly significant despite modest sample size and despite observers who had quite high exposure to the other race. We attribute this to high statistical power arising from the very high internal reliability of the tasks. This power also allows us to demonstrate a much smaller within-race other ethnicity effect, based on differences in European physiognomy between Boston faces/observers and Australian faces/observers (using the CFMT-Australian). PMID:23118912

  15. Online and unsupervised face recognition for continuous video stream

    NASA Astrophysics Data System (ADS)

    Huo, Hongwen; Feng, Jufu

    2009-10-01

    We present a novel online face recognition approach for video stream in this paper. Our method includes two stages: pre-training and online training. In the pre-training phase, our method observes interactions, collects batches of input data, and attempts to estimate their distributions (Box-Cox transformation is adopted here to normalize rough estimates). In the online training phase, our method incrementally improves classifiers' knowledge of the face space and updates it continuously with incremental eigenspace analysis. The performance achieved by our method shows its great potential in video stream processing.
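
    The record above mentions adopting a Box-Cox transformation to normalize rough distribution estimates during pre-training. As a hedged illustration (the data and variable names are invented, not the authors' code), scipy.stats.boxcox can normalize a positively skewed batch of scores:

```python
import numpy as np
from scipy.stats import boxcox, skew

# Hypothetical stand-in for the "rough estimates" collected during pre-training:
# a positively skewed batch of similarity scores.
rng = np.random.default_rng(42)
rough_estimates = rng.lognormal(mean=0.0, sigma=0.8, size=500)

# Box-Cox requires strictly positive input; it returns the transformed data and
# the lambda that maximizes the log-likelihood of normality.
normalized, lam = boxcox(rough_estimates)
print(f"skew before: {skew(rough_estimates):.2f}, after: {skew(normalized):.2f}")
```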

  16. Observer roles that optimise learning in healthcare simulation education: a systematic review.

    PubMed

    O'Regan, Stephanie; Molloy, Elizabeth; Watterson, Leonie; Nestel, Debra

    2016-01-01

    Simulation is widely used in health professional education. The convention that learners must be actively involved may limit access to this educational method. The aim of this paper is to review the evidence for learning methods that employ directed observation as an alternative to hands-on participation in scenario-based simulation training. We sought studies that either directly compared the learning outcomes of observers with those of active participants or identified factors important for the engagement of observers in simulation. We systematically searched health and education databases and reviewed journals and bibliographies for studies investigating or referring to observer roles in simulation using mannequins, simulated patients or role play simulations. A quality framework was used to rate the studies. Nine studies met the inclusion criteria. Five studies suggest learning outcomes in observer roles are as good as or better than those in hands-on roles; four document learner satisfaction in observer roles; five used a tool to guide observers; and eight involved observers in the debrief. Learning and satisfaction in observer roles are closely associated with observer tools, learner engagement, role clarity and contribution to the debrief. Learners who valued observer roles described them as affording an overarching view, examination of details from a distance, and meaningful feedback during the debrief. Learners who did not value observer roles described them as passive, or boring compared with hands-on engagement in the simulation encounter. Learning outcomes and role satisfaction for observers are improved through learner engagement and the use of observer tools. The value that students attach to observer roles appears contingent on role clarity, use of observer tools, and inclusion of observers' perspectives in the debrief.

  17. Corrected confidence bands for functional data using principal components.

    PubMed

    Goldsmith, J; Greven, S; Crainiceanu, C

    2013-03-01

    Functional principal components (FPC) analysis is widely used to decompose and express functional observations. Curve estimates implicitly condition on basis functions and other quantities derived from FPC decompositions; however these objects are unknown in practice. In this article, we propose a method for obtaining correct curve estimates by accounting for uncertainty in FPC decompositions. Additionally, pointwise and simultaneous confidence intervals that account for both model- and decomposition-based variability are constructed. Standard mixed model representations of functional expansions are used to construct curve estimates and variances conditional on a specific decomposition. Iterated expectation and variance formulas combine model-based conditional estimates across the distribution of decompositions. A bootstrap procedure is implemented to understand the uncertainty in principal component decomposition quantities. Our method compares favorably to competing approaches in simulation studies that include both densely and sparsely observed functions. We apply our method to sparse observations of CD4 cell counts and to dense white-matter tract profiles. Code for the analyses and simulations is publicly available, and our method is implemented in the R package refund on CRAN. Copyright © 2013, The International Biometric Society.

  18. Corrected Confidence Bands for Functional Data Using Principal Components

    PubMed Central

    Goldsmith, J.; Greven, S.; Crainiceanu, C.

    2014-01-01

    Functional principal components (FPC) analysis is widely used to decompose and express functional observations. Curve estimates implicitly condition on basis functions and other quantities derived from FPC decompositions; however these objects are unknown in practice. In this article, we propose a method for obtaining correct curve estimates by accounting for uncertainty in FPC decompositions. Additionally, pointwise and simultaneous confidence intervals that account for both model- and decomposition-based variability are constructed. Standard mixed model representations of functional expansions are used to construct curve estimates and variances conditional on a specific decomposition. Iterated expectation and variance formulas combine model-based conditional estimates across the distribution of decompositions. A bootstrap procedure is implemented to understand the uncertainty in principal component decomposition quantities. Our method compares favorably to competing approaches in simulation studies that include both densely and sparsely observed functions. We apply our method to sparse observations of CD4 cell counts and to dense white-matter tract profiles. Code for the analyses and simulations is publicly available, and our method is implemented in the R package refund on CRAN. PMID:23003003
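
    Both records describe propagating the uncertainty of the FPC decomposition itself into curve estimates and bands. A minimal sketch of the underlying ideas, assuming simulated curves and an SVD-based decomposition (this is an illustration, not the refund implementation):

```python
import numpy as np

rng = np.random.default_rng(1)
n_curves, n_grid = 60, 50
t = np.linspace(0.0, 1.0, n_grid)

# Simulated functional data: a smooth mean, two principal modes, plus noise.
scores = rng.normal(size=(n_curves, 2)) * np.array([2.0, 0.7])
modes = np.vstack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
curves = np.sin(np.pi * t) + scores @ modes + 0.1 * rng.normal(size=(n_curves, n_grid))

def fpc_decompose(X, k=2):
    """Estimate the mean function and the first k FPCs from a curve matrix."""
    mu = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, vt[:k]

mu_hat, fpcs = fpc_decompose(curves)

# Reconstruct one curve from its FPC scores (conditioning on this decomposition).
c = curves[0]
recon = mu_hat + (c - mu_hat) @ fpcs.T @ fpcs

# Bootstrap over curves to see how much the estimated mean function itself
# varies -- the kind of decomposition uncertainty the paper argues must enter
# the confidence bands.
boot_means = np.array([curves[rng.integers(0, n_curves, n_curves)].mean(axis=0)
                       for _ in range(200)])
lower, upper = np.percentile(boot_means, [2.5, 97.5], axis=0)
```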

  19. A Novel Video Tracking Method to Evaluate the Effect of Influenza Infection and Antiviral Treatment on Ferret Activity

    PubMed Central

    Oh, Ding Yuan; Barr, Ian G.; Hurt, Aeron C.

    2015-01-01

    Ferrets are the preferred animal model for assessing influenza virus infection, virulence and transmission, as they display clinical symptoms and pathogenesis similar to those of humans. Measures of disease severity in the ferret include weight loss, temperature rise, sneezing, viral shedding and reduced activity. To date, the only available method for measuring activity has been the assignment of an arbitrary score by a ‘blind’ observer based on a pre-defined responsiveness scale. This manual scoring method is subjective and prone to bias. In this study, we describe a novel video-tracking methodology for determining activity changes in a ferret model of influenza infection. This method eliminates the limitations of manual scoring, which include the need for a sole ‘blind’ observer and the requirement to recognise the ‘normal’ activity of ferrets in order to assign relative activity scores. In ferrets infected with an A(H1N1)pdm09 virus, video-tracking was more sensitive than manual scoring in detecting changes in activity. Using this video-tracking method, oseltamivir treatment was found to ameliorate the effect of influenza infection on activity in ferrets. Oseltamivir treatment was associated with an improvement in clinical symptoms, including reduced inflammatory responses in the upper respiratory tract, lower body weight loss and a smaller rise in body temperature, despite no significant reduction in viral shedding. In summary, this novel video-tracking method is an easy-to-use, objective and sensitive methodology for measuring ferret activity. PMID:25738900
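
    One simple video-tracking statistic of the kind the abstract describes is centroid path length per unit time. The function below is a hypothetical stand-in, not the tracking software used in the study:

```python
import numpy as np

def activity_index(positions, fps):
    """Total centroid path length per second from an (n, 2) track of (x, y) points."""
    steps = np.diff(np.asarray(positions, dtype=float), axis=0)
    path_length = np.hypot(steps[:, 0], steps[:, 1]).sum()
    duration_s = (len(positions) - 1) / fps
    return path_length / duration_s

# An animal moving 3 units per frame along x, filmed at 25 frames per second:
track = [(3.0 * i, 0.0) for i in range(100)]
print(activity_index(track, fps=25))  # 75.0 units/s
```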

  20. Science + Writing = Super Learning. Writing Workshop.

    ERIC Educational Resources Information Center

    Bower, Paula Rogovin

    1993-01-01

    Article presents suggestions for motivating elementary students to learn by combining science and writing. The strategies include planning the right environment; teaching the scientific method; establishing a link to literature; and making time for students to observe, experiment, and write. (SM)

  1. How Instructional Designers Solve Workplace Problems

    ERIC Educational Resources Information Center

    Fortney, Kathleen S.; Yamagata-Lynch, Lisa C.

    2013-01-01

    This naturalistic inquiry investigated how instructional designers engage in complex and ambiguous problem solving across organizational boundaries in two corporations. Participants represented a range of instructional design experience, from novices to experts. Research methods included a participant background survey, observations of…

  2. The estimation of bone cyst volume using the Cavalieri principle on computed tomography images.

    PubMed

    Say, Ferhat; Gölpınar, Murat; Kılınç, Cem Yalın; Şahin, Bünyamin

    2018-01-01

    To evaluate the volume of bone cysts using the planimetry method of the Cavalieri principle. A retrospective analysis was carried out on data from 25 computed tomography (CT) images of patients with bone cysts. The volume of the cysts was calculated by two independent observers using the planimetry method, and the procedures were repeated 1 month later by each observer. The overall mean volume of the bone cysts was 29.25 ± 25.86 cm³. The mean volumes calculated by the first observer for the first and second sessions were 29.18 ± 26.14 and 29.27 ± 26.19 cm³, respectively; those calculated by the second observer were 29.32 ± 26.36 and 29.23 ± 26.36 cm³, respectively. Statistical analysis showed no difference and high agreement between the first and second measurements of both observers, and Bland-Altman plots showed strong intraobserver and interobserver concordance in the measurement of bone cyst volume. The mean total time needed to obtain the cyst volume was 5.27 ± 2.30 min. Bone cyst volume can be objectively evaluated on CT using the planimetry method of the Cavalieri principle, with high interobserver and intraobserver agreement. This volume measurement can be used to evaluate cyst remodeling, including complete healing and cyst recurrence.
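
    The Cavalieri estimate behind the planimetry method is simply the sum of the sectioned areas multiplied by the section thickness. A minimal sketch (the values are illustrative, not patient data):

```python
import math

def cavalieri_volume(slice_areas_cm2, slice_thickness_cm):
    """Volume estimate: sum of the sectioned areas times the section thickness."""
    return sum(slice_areas_cm2) * slice_thickness_cm

# A cylinder of radius 2 cm cut into 10 slices of 0.5 cm: every cross-section
# has area pi * r^2, so the estimate equals the analytic volume pi * 4 * 5.
areas = [math.pi * 2.0 ** 2] * 10
volume = cavalieri_volume(areas, 0.5)
print(round(volume, 2))  # 62.83
```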

  3. On the use and computation of the Jordan canonical form in system theory

    NASA Technical Reports Server (NTRS)

    Sridhar, B.; Jordan, D.

    1974-01-01

    This paper investigates various aspects of the application of the Jordan canonical form of a matrix in system theory and develops a computational approach to determining the Jordan form for a given matrix. Applications include pole placement, controllability and observability studies, serving as an intermediate step in yielding other canonical forms, and theorem proving. The computational method developed in this paper is both simple and efficient. The method is based on the definition of a generalized eigenvector and a natural extension of Gauss elimination techniques. Examples are included for demonstration purposes.

  4. Instructional Methods for Neuroscience in Nurse Anesthesia Graduate Programs: A Survey of Educational Programs

    DTIC Science & Technology

    1999-10-01

    Instructional Methods 4 December 5, 1998). Taught simultaneously with the Human Anatomy course, the neuroscience courses clinically orient the students...drama. His medical writings showed penetrating and often accurate observations on human anatomy , including the nervous system. He established the...Pathophysiology, Advanced Anesthesia Courses, Pharmacology and Human Anatomy . Research Question # 2 The second research question was What are the

  5. Evaluating critical thinking in clinical practice.

    PubMed

    Oermann, M H

    1997-01-01

    Although much has been written about measurement instruments for evaluating critical thinking in nursing, this article describes clinical evaluation strategies for critical thinking. Five methods are discussed: 1) observation of students in practice; 2) questions for critical thinking, including Socratic questioning; 3) conferences; 4) problem-solving strategies; and 5) written assignments. These methods provide a means of evaluating students' critical thinking within the context of clinical practice.

  6. Positron emission tomography probe to monitor selected sugar metabolism in vivo

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Witte, Owen; Clark, Peter M.; Castillo, Blanca Graciela Flores

    The invention disclosed herein provides selected ribose isomers that are useful as PET probes (e.g. [18F]-2-fluoro-2-deoxy-arabinose). These probes are useful, for example, in methods designed to monitor physiological processes, including ribose metabolism, and/or to selectively observe certain tissues and organs in vivo. The invention further provides methods for making and using such probes.

  7. Innovations in the Analysis of Chandra-ACIS Observations

    NASA Astrophysics Data System (ADS)

    Broos, Patrick S.; Townsley, Leisa K.; Feigelson, Eric D.; Getman, Konstantin V.; Bauer, Franz E.; Garmire, Gordon P.

    2010-05-01

    As members of the instrument team for the Advanced CCD Imaging Spectrometer (ACIS) on NASA's Chandra X-ray Observatory and as Chandra General Observers, we have developed a wide variety of data analysis methods that we believe are useful to the Chandra community, and have constructed a significant body of publicly available software (the ACIS Extract package) addressing important ACIS data and science analysis tasks. This paper seeks to describe these data analysis methods for two purposes: to document the data analysis work performed in our own science projects and to help other ACIS observers judge whether these methods may be useful in their own projects (regardless of what tools and procedures they choose to implement those methods). The ACIS data analysis recommendations we offer here address much of the workflow in a typical ACIS project, including data preparation, point source detection via both wavelet decomposition and image reconstruction, masking point sources, identification of diffuse structures, event extraction for both point and diffuse sources, merging extractions from multiple observations, nonparametric broadband photometry, analysis of low-count spectra, and automation of these tasks. Many of the innovations presented here arise from several, often interwoven, complications that are found in many Chandra projects: large numbers of point sources (hundreds to several thousand), faint point sources, misaligned multiple observations of an astronomical field, point source crowding, and scientifically relevant diffuse emission.

  8. Evidence for Periodicity in 43 year-long Monitoring of NGC 5548

    NASA Astrophysics Data System (ADS)

    Bon, E.; Zucker, S.; Netzer, H.; Marziani, P.; Bon, N.; Jovanović, P.; Shapovalova, A. I.; Komossa, S.; Gaskell, C. M.; Popović, L. Č.; Britzen, S.; Chavushyan, V. H.; Burenkov, A. N.; Sergeev, S.; La Mura, G.; Valdés, J. R.; Stalevski, M.

    2016-08-01

    We present an analysis of 43 years (1972 to 2015) of spectroscopic observations of the Seyfert 1 galaxy NGC 5548. This includes 12 years of new unpublished observations (2003 to 2015). We compiled about 1600 Hβ spectra and analyzed the long-term spectral variations of the 5100 Å continuum and the Hβ line. Our analysis is based on standard procedures, including the Lomb-Scargle method, which is known to be of limited power for such heterogeneous data sets, and a new method developed specifically for this project that is more robust and reveals a ~5700 day periodicity in the continuum light curve, the Hβ light curve, and the radial velocity curve of the red wing of the Hβ line. The data are consistent with orbital motion inside the broad emission line region of the source. We discuss several possible mechanisms that can explain this periodicity, including orbiting dusty and dust-free clouds, a binary black hole system, tidal disruption events, and the effect of an orbiting star periodically passing through an accretion disk.
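
    The Lomb-Scargle periodogram referenced above is designed for unevenly sampled time series. A hedged sketch using scipy on a synthetic, irregularly sampled light curve (the numbers are illustrative, not the NGC 5548 data):

```python
import numpy as np
from scipy.signal import lombscargle

# Irregularly sampled light curve carrying a ~5700-day periodic signal.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 43 * 365.25, 1600))   # epochs over ~43 years, in days
true_period = 5700.0
flux = np.sin(2 * np.pi * t / true_period) + 0.2 * rng.normal(size=t.size)

# Scan trial periods; lombscargle expects angular frequencies (rad / day).
periods = np.linspace(2000.0, 12000.0, 4000)
power = lombscargle(t, flux - flux.mean(), 2 * np.pi / periods, normalize=True)
best_period = periods[np.argmax(power)]
print(f"recovered period: {best_period:.0f} days")
```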

  9. Observed fearlessness and positive parenting interact to predict childhood callous-unemotional behaviors among low-income boys

    PubMed Central

    Waller, Rebecca; Shaw, Daniel S.; Hyde, Luke W.

    2016-01-01

    Background Callous-unemotional behaviors identify children at risk for severe and chronic antisocial behavior. Research is needed to establish pathways from temperament and parenting factors that give rise to callous-unemotional behaviors, including interactions of positive versus harsh parenting with child fearlessness. Methods Multi-method data, including parent reports and observations of parent and child behavior, were drawn from a prospective, longitudinal sample of low-income boys (N=310) with assessments at 18, 24, and 42 months, and at ages 10–12 years old. Results Parent-reported callous-unemotional, oppositional, and attention-deficit factors were separable at 42 months. Callous-unemotional behaviors at 42 months predicted callous-unemotional behaviors at ages 10–12, accounting for earlier oppositional and attention-deficit behaviors and self-reported child delinquency at ages 10–12. Observations of fearlessness at 24 months predicted callous-unemotional behaviors at 42 months, but only when parents exhibited low observed levels of positive parenting. The interaction of fearlessness and low positive parenting indirectly predicted callous-unemotional behaviors at 10–12 via callous-unemotional behaviors at 42 months. Conclusions Early fearlessness interacts with low positive parenting to predict early callous-unemotional behaviors, with lasting effects of this person-by-context interaction on callous-unemotional behaviors into late-childhood. PMID:27917472

  10. Nurse practitioner preferences for distance education methods related to learning style, course content, and achievement.

    PubMed

    Andrusyszyn, M A; Cragg, C E; Humbert, J

    2001-04-01

    The relationships among multiple distance delivery methods, preferred learning style, content, and achievement were examined for primary care nurse practitioner students. A researcher-designed questionnaire was completed by 86 (71%) participants, while 6 engaged in follow-up interviews. Participants preferred learning by "considering the big picture," "setting their own learning plans," and "focusing on concrete examples." Several positive associations were found: learning on one's own with learning by reading and with setting one's own learning plans; small groups with learning through discussion; and large groups with learning new things through hearing and with having learning plans set by others. The most preferred method was print-based material and the least preferred was audio tape. The methods best suited to specific content included video teleconferencing for counseling, political action, and transcultural issues, and video tape for physical assessment. Convenience, self-direction, and timing of learning were more important than delivery method or learning style. The preferred order of learning was reading, discussing, observing, doing, and reflecting. Recommended considerations when designing distance courses include a mix of delivery methods, specific content, outcomes, learner characteristics, and the state of technology.

  11. Pareto-front shape in multiobservable quantum control

    NASA Astrophysics Data System (ADS)

    Sun, Qiuyang; Wu, Re-Bing; Rabitz, Herschel

    2017-03-01

    Many scenarios in the sciences and engineering require simultaneous optimization of multiple objective functions, which are usually conflicting or competing. In such problems the Pareto front, where none of the individual objectives can be further improved without degrading some others, shows the tradeoff relations between the competing objectives. This paper analyzes the Pareto-front shape for the problem of quantum multiobservable control, i.e., optimizing the expectation values of multiple observables in the same quantum system. Analytic and numerical results demonstrate that with two commuting observables the Pareto front is a convex polygon consisting of flat segments only, while with noncommuting observables the Pareto front includes convexly curved segments. We also assess the capability of a weighted-sum method to continuously capture the points along the Pareto front. Illustrative examples with realistic physical conditions are presented, including NMR control experiments on a 1H-13C two-spin system with two commuting or noncommuting observables.
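
    The weighted-sum scan assessed in the paper can be illustrated with a toy one-dimensional problem: for convex objectives, sweeping the weight traces out the whole convex front, consistent with the commuting-observable case described above. This is an illustration, not the paper's quantum-control computation:

```python
import numpy as np

# Two competing objectives of one control variable x (toy stand-ins for two
# observable expectation values): f1 is minimized at x = 1, f2 at x = -1.
def f1(x):
    return (x - 1.0) ** 2

def f2(x):
    return (x + 1.0) ** 2

# Weighted-sum scan: minimize w*f1 + (1-w)*f2 for weights w in [0, 1].
# For these convex objectives the minimizer is analytic: x*(w) = 2w - 1.
weights = np.linspace(0.0, 1.0, 51)
front = np.array([(f1(x), f2(x)) for x in 2.0 * weights - 1.0])

print(front[0], front[-1])  # endpoints of the traced Pareto front
```

Along the traced front, improving one objective necessarily degrades the other, which is the tradeoff relation the abstract describes.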

  12. Analysing seismic-source mechanisms by linear-programming methods.

    USGS Publications Warehouse

    Julian, B.R.

    1986-01-01

    Linear-programming methods are powerful and efficient tools for objectively analysing seismic focal mechanisms and are applicable to a wide range of problems, including tsunami warning and nuclear explosion identification. The source mechanism is represented as a point in the 6-D space of moment-tensor components. The present method can easily be extended to fit observed seismic-wave amplitudes (either signed or absolute) subject to polarity constraints, and to assess the range of mechanisms consistent with a set of measured amplitudes. -from Author
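
    A moment-tensor fit of this general kind can be posed as a linear program. As a hedged sketch (synthetic coefficients, not the author's formulation), a least-absolute-deviation fit of the six moment-tensor components to observed amplitudes a ≈ G m uses slack variables:

```python
import numpy as np
from scipy.optimize import linprog

# Least-absolute-deviation fit of a 6-component moment tensor m to observed
# amplitudes a ~ G m, posed as a linear program with slack variables e >= 0:
#     minimize sum(e)   subject to   -e <= G m - a <= e
rng = np.random.default_rng(3)
n = 20
G = rng.normal(size=(n, 6))     # synthetic excitation coefficients
m_true = rng.normal(size=6)
a = G @ m_true                  # noise-free synthetic amplitudes

c = np.concatenate([np.zeros(6), np.ones(n)])
A_ub = np.block([[G, -np.eye(n)], [-G, -np.eye(n)]])
b_ub = np.concatenate([a, -a])
bounds = [(None, None)] * 6 + [(0, None)] * n
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
m_est = res.x[:6]               # recovers m_true exactly for noise-free data
```

Polarity constraints could be added as further inequality rows restricting the sign of each predicted amplitude.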

  13. Change Semantic Constrained Online Data Cleaning Method for Real-Time Observational Data Stream

    NASA Astrophysics Data System (ADS)

    Ding, Yulin; Lin, Hui; Li, Rongrong

    2016-06-01

    Recent breakthroughs in sensor networks have made it possible to collect and assemble increasing amounts of real-time observational data, observing dynamic phenomena at previously impossible time and space scales. Real-time observational data streams present potentially profound opportunities for real-time applications in disaster mitigation and emergency response by providing accurate and timely estimates of the environment's status. However, the data are subject to inevitable anomalies (including errors and anomalous changes/events) caused by the environment being monitored. These "big but dirty" real-time observational data streams can rarely achieve their full potential in downstream real-time models and applications because of their low data quality. Timely and meaningful online data cleaning is therefore a necessary pre-requisite to ensure the quality, reliability, and timeliness of real-time observational data. A straightforward streaming-data cleaning approach is to define models or classifiers representing the normal behavior of sensor data streams and then declare any deviation from the model as erroneous. The effectiveness of such models is undermined by dynamic changes in the deployed environment. Because the process being observed is complicated and changing, real-time observational data are diverse and dynamic, exhibiting typical Big (Geo) Data characteristics: dynamics and diversity are reflected not only in the data values but also in the changing patterns of the data distributions. The distribution of real-time observational data is thus not stationary but dynamic, and once the data pattern changes, the model must adapt; otherwise it will not fit the subsequent data stream, which may lead to large estimation errors. To achieve the best generalization error, an important challenge for any data cleaning methodology is to characterize the behavior of the data-stream distribution and adaptively update the model, incorporating new information and discarding old information. This changing behavior invalidates traditional data cleaning methods, which rely on the assumption of a stationary data distribution, and drives the need for more dynamic and adaptive online data cleaning methods. To overcome these shortcomings, this paper presents a change-semantics-constrained online filtering method for real-time observational data. Based on the principle that the filter parameter should vary in accordance with the data's change patterns, the method embeds a semantic description that quantitatively depicts those change patterns, allowing the filter parameter to adapt automatically. Real-time observational water-level data streams from different precipitation scenarios were selected for testing. Experimental results show that the method yields more accurate and reliable water-level information, supporting prompt, scientifically sound flood assessment and decision-making.
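
    One simple instance of the adaptive online-cleaning idea, sketched here with invented names and thresholds (not the paper's change-semantics filter), is a detector whose running statistics forget old data so the model tracks a drifting distribution:

```python
import numpy as np

class OnlineCleaner:
    """Flag anomalous readings with exponentially weighted running statistics.

    A larger alpha forgets history faster, which is one simple way to let the
    filter adapt when the underlying data distribution changes.
    """

    def __init__(self, alpha=0.05, z_max=4.0):
        self.alpha, self.z_max = alpha, z_max
        self.mean, self.var, self.n = 0.0, 1.0, 0

    def update(self, x):
        if self.n < 10:                     # warm-up: accept early points
            self.mean += (x - self.mean) / (self.n + 1)
            self.n += 1
            return False
        z = abs(x - self.mean) / (self.var ** 0.5 + 1e-12)
        is_anomaly = z > self.z_max
        if not is_anomaly:                  # only clean points update the model
            d = x - self.mean
            self.mean += self.alpha * d
            self.var = (1 - self.alpha) * (self.var + self.alpha * d * d)
            self.n += 1
        return is_anomaly

rng = np.random.default_rng(7)
cleaner = OnlineCleaner()
levels = 2.0 + 0.05 * rng.normal(size=300)   # steady water-level readings
levels[150] = 9.9                            # injected sensor spike
flags = [cleaner.update(x) for x in levels]
```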

  14. Initial assessment of image quality for low-dose PET: evaluation of lesion detectability

    NASA Astrophysics Data System (ADS)

    Schaefferkoetter, Joshua D.; Yan, Jianhua; Townsend, David W.; Conti, Maurizio

    2015-07-01

    In the context of investigating the potential of low-dose PET imaging for screening applications, we developed methods to assess small lesion detectability as a function of the number of counts in the scan. We present here our methods and preliminary validation using tuberculosis cases. FDG-PET data from seventeen patients presenting diffuse hyper-metabolic lung lesions were selected for the study, to include a wide range of lesion sizes and contrasts. Reduced doses were simulated by randomly discarding events in the PET list mode, and ten realizations at each simulated dose were generated and reconstructed. The data were grouped into 9 categories determined by the number of included true events, from  >40 M to  <250 k counts. The images reconstructed from the original full statistical set were used to identify lung lesions, and each was, at every simulated dose, quantified by 6 parameters: lesion metabolic volume, lesion-to-background contrast, mean lesion tracer uptake, standard deviation of activity measurements (across realizations), lesion signal-to-noise ratio (SNR), and Hotelling observer SNR. Additionally, a lesion-detection task including 550 images was presented to several experienced image readers for qualitative assessment. Human observer performances were ranked using receiver operating characteristic analysis. The observer results were correlated with the lesion image measurements and used to train mathematical observer models. Absolute sensitivities and specificities of the human observers, as well as the area under the ROC curve, showed clustering and performance similarities among images produced from 5 million or greater counts. The results presented here are from a clinically realistic but highly constrained experiment, and more work is needed to validate these findings with a larger patient population.

  15. Initial assessment of image quality for low-dose PET: evaluation of lesion detectability.

    PubMed

    Schaefferkoetter, Joshua D; Yan, Jianhua; Townsend, David W; Conti, Maurizio

    2015-07-21

    In the context of investigating the potential of low-dose PET imaging for screening applications, we developed methods to assess small lesion detectability as a function of the number of counts in the scan. We present here our methods and preliminary validation using tuberculosis cases. FDG-PET data from seventeen patients presenting diffuse hyper-metabolic lung lesions were selected for the study, to include a wide range of lesion sizes and contrasts. Reduced doses were simulated by randomly discarding events in the PET list mode, and ten realizations at each simulated dose were generated and reconstructed. The data were grouped into 9 categories determined by the number of included true events, from  >40 M to  <250 k counts. The images reconstructed from the original full statistical set were used to identify lung lesions, and each was, at every simulated dose, quantified by 6 parameters: lesion metabolic volume, lesion-to-background contrast, mean lesion tracer uptake, standard deviation of activity measurements (across realizations), lesion signal-to-noise ratio (SNR), and Hotelling observer SNR. Additionally, a lesion-detection task including 550 images was presented to several experienced image readers for qualitative assessment. Human observer performances were ranked using receiver operating characteristic analysis. The observer results were correlated with the lesion image measurements and used to train mathematical observer models. Absolute sensitivities and specificities of the human observers, as well as the area under the ROC curve, showed clustering and performance similarities among images produced from 5 million or greater counts. The results presented here are from a clinically realistic but highly constrained experiment, and more work is needed to validate these findings with a larger patient population.
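
    The reduced-dose simulation described in both records, randomly discarding events from the PET list mode, amounts to binomial thinning. A minimal sketch with a synthetic event list (not the clinical data):

```python
import numpy as np

def thin_listmode(event_times, keep_fraction, rng):
    """Simulate a reduced-dose scan by randomly discarding list-mode events."""
    keep = rng.random(len(event_times)) < keep_fraction
    return event_times[keep]

rng = np.random.default_rng(11)
full_scan = np.sort(rng.uniform(0.0, 600.0, 40_000))   # event times over a 600 s scan
low_dose = thin_listmode(full_scan, keep_fraction=0.125, rng=rng)  # ~1/8 of the counts
print(len(full_scan), len(low_dose))
```

Repeating the thinning with fresh random draws yields independent realizations at the same simulated dose, as the study does ten times per count level.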

  16. Dynamics of intracellular processes in live-cell systems unveiled by fluorescence correlation microscopy.

    PubMed

    González Bardeci, Nicolás; Angiolini, Juan Francisco; De Rossi, María Cecilia; Bruno, Luciana; Levi, Valeria

    2017-01-01

    Fluorescence fluctuation-based methods are non-invasive microscopy tools especially suited for the study of dynamical aspects of biological processes. These methods examine spontaneous intensity fluctuations produced by fluorescent molecules moving through the small, femtoliter-sized observation volume defined in confocal and multiphoton microscopes. The quantitative analysis of the intensity trace provides information on the processes producing the fluctuations that include diffusion, binding interactions, chemical reactions and photophysical phenomena. In this review, we present the basic principles of the most widespread fluctuation-based methods, discuss their implementation in standard confocal microscopes and briefly revise some examples of their applications to address relevant questions in living cells. The ultimate goal of these methods in the Cell Biology field is to observe biomolecules as they move, interact with targets and perform their biological action in the natural context. © 2016 IUBMB Life, 69(1):8-15, 2017. © 2016 International Union of Biochemistry and Molecular Biology.
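
    The intensity-fluctuation analysis at the heart of these methods rests on the normalized autocorrelation G(τ) = ⟨δI(t)δI(t+τ)⟩ / ⟨I⟩². A hedged sketch using an AR(1) surrogate signal in place of real photon counts:

```python
import numpy as np
from scipy.signal import lfilter

def fluct_autocorr(intensity, max_lag):
    """Normalized fluctuation autocorrelation G(tau) = <dI(t) dI(t+tau)> / <I>^2."""
    d = intensity - intensity.mean()
    denom = intensity.mean() ** 2
    return np.array([np.mean(d[:-lag] * d[lag:]) / denom
                     for lag in range(1, max_lag + 1)])

# Synthetic intensity trace with exponentially correlated fluctuations:
# an AR(1) process stands in for molecules drifting through the focal volume.
rng = np.random.default_rng(5)
phi = 0.95                                    # correlation per sample
fluct = lfilter([1.0], [1.0, -phi], rng.normal(size=200_000))
intensity = 100.0 + fluct

G = fluct_autocorr(intensity, max_lag=100)
# G decays with lag; the characteristic time is about -1/ln(phi) samples.
```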

  17. Reporting of methodological features in observational studies of pre-harvest food safety.

    PubMed

    Sargeant, Jan M; O'Connor, Annette M; Renter, David G; Kelton, David F; Snedeker, Kate; Wisener, Lee V; Leonard, Erin K; Guthrie, Alessia D; Faires, Meredith

    2011-02-01

    Observational studies in pre-harvest food safety may be useful for identifying risk factors and for evaluating potential mitigation strategies to reduce foodborne pathogens. However, there are no structured reporting guidelines for these types of study designs in livestock species. Our objective was to evaluate the reporting of observational studies in the pre-harvest food safety literature using guidelines modified from the human healthcare literature. We identified 100 pre-harvest food safety studies published between 1999 and 2009. Each study was evaluated independently by two reviewers using a structured checklist. Of the 38 studies that explicitly stated the observational study design, 27 were described as cross-sectional studies, eight as case-control studies, and three as cohort studies. Study features reported in over 75% of the selected studies included: description of the geographic location of the studies, definitions and sources of data for outcomes, organizational level and source of data for independent variables, description of statistical methods and results, number of herds enrolled in the study and included in the analysis, and sources of study funding. However, other features were not consistently reported, including details related to eligibility criteria for groups (such as barn, room, or pen) and individuals, numbers of groups and individuals included in various stages of the study, identification of primary outcomes, the distinction between putative risk factors and confounding variables, the identification of a primary exposure variable, the referent level for evaluation of categorical variable associations, methods of controlling confounding variables and missing variables, model fit, details of subset analysis, demographic information at the sampling unit level, and generalizability of the study results. 
Improvement in reporting of observational studies of pre-harvest food safety will aid research readers and reviewers in interpreting and evaluating the results of such studies. Copyright © 2010 Elsevier B.V. All rights reserved.

  18. Stochastic rainfall synthesis for urban applications using different regionalization methods

    NASA Astrophysics Data System (ADS)

    Callau Poduje, A. C.; Leimbach, S.; Haberlandt, U.

    2017-12-01

The proper design and efficient operation of urban drainage systems require long and continuous rainfall series at a high temporal resolution. Unfortunately, such time series are usually available at only a few locations, and it is therefore desirable to develop a stochastic precipitation model to generate rainfall at locations without observations. The model presented is based on an alternating renewal process and involves an external and an internal structure. The members of these structures are described by probability distributions which are site specific. Different regionalization methods based on site descriptors are presented which are used for estimating the distributions for locations without observations. Regional frequency analysis, multiple linear regression and a vine-copula method are applied for this purpose. An area located in the north-west of Germany, involving a total of 81 stations with 5 min rainfall records, is used to compare the different methods. The site descriptors include information available for the whole region: position, topography and hydrometeorologic characteristics estimated from long-term observations. The methods are compared directly by cross validation of different rainfall statistics. Given that the model is stochastic, the evaluation is based on ensembles of many long synthetic time series which are compared with observed ones. The performance is also evaluated indirectly by setting up a fictional urban hydrological system to test the capability of the different methods regarding flooding and overflow characteristics. The results show a good representation of the seasonal variability and good performance in reproducing the sample statistics of the rainfall characteristics. The copula-based method proves to be the most robust of the three. Advantages and disadvantages of the different methods are presented and discussed.
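The alternating renewal structure can be sketched as dry and wet spells that alternate, with durations and event depths drawn from fitted distributions. A toy illustration only: exponential distributions and the parameter values below are invented for demonstration, whereas the study fits site-specific distributions and regionalizes their parameters.

```python
import numpy as np

def alternating_renewal_events(n_events, seed=None,
                               dry_mean_h=30.0, wet_mean_h=4.0,
                               depth_mean_mm=3.0):
    """Toy alternating renewal rainfall generator: dry and wet spells
    alternate; durations and event depths are drawn from exponential
    distributions here purely for illustration."""
    rng = np.random.default_rng(seed)
    dry = rng.exponential(dry_mean_h, n_events)        # dry-spell durations (h)
    wet = rng.exponential(wet_mean_h, n_events)        # wet-spell durations (h)
    depths = rng.exponential(depth_mean_mm, n_events)  # event depths (mm)
    # event i starts after i dry spells and i-1 wet spells have elapsed
    starts = np.cumsum(dry) + np.concatenate(([0.0], np.cumsum(wet)[:-1]))
    return starts, wet, depths

starts, durations, depths = alternating_renewal_events(1000, seed=0)
```

Generating many such synthetic series and comparing their statistics with observed records mirrors the ensemble-based evaluation described above.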

  19. Hurricane Isaac: observations and analysis of coastal change

    USGS Publications Warehouse

    Guy, Kristy K.; Stockdon, Hilary F.; Plant, Nathaniel G.; Doran, Kara S.; Morgan, Karen L.M.

    2013-01-01

    Understanding storm-induced coastal change and forecasting these changes require knowledge of the physical processes associated with a storm and the geomorphology of the impacted coastline. The primary physical process of interest is sediment transport that is driven by waves, currents, and storm surge associated with storms. Storm surge, which is the rise in water level due to the wind, barometric pressure, and other factors, allows both waves and currents to impact parts of the coast not normally exposed to these processes. Coastal geomorphology reflects the coastal changes associated with extreme-storm processes. Relevant geomorphic variables that are observable before and after storms include sand dune elevation, beach width, shoreline position, sediment grain size, and foreshore beach slope. These variables, in addition to hydrodynamic processes, can be used to quantify coastal change and are used to predict coastal vulnerability to storms (Stockdon and others, 2007). The U.S. Geological Survey (USGS) National Assessment of Coastal Change Hazards (NACCH) project (http://coastal.er.usgs.gov/national-assessment/) provides hazard information to those concerned about the Nation’s coastlines, including residents of coastal areas, government agencies responsible for coastal management, and coastal researchers. Extreme-storm research is a component of the NACCH project (http://coastal.er.usgs.gov/hurricanes/) that includes development of predictive understanding, vulnerability assessments using models, and updated observations in response to specific storm events. In particular, observations were made to determine morphological changes associated with Hurricane Isaac, which made landfall in the United States first at Southwest Pass, at the mouth of the Mississippi River, at 0000 August 29, 2012 UTC (Coordinated Universal Time) and again, 8 hours later, west of Port Fourchon, Louisiana (Berg, 2013). 
Methods of observation included oblique aerial photography, airborne light detection and ranging (lidar) topographic surveys, and ground-based topographic surveys. This report documents data-collection efforts and presents qualitative and quantitative descriptions of hurricane-induced changes to the shoreline, beaches, dunes, and infrastructure in the region that was heavily impacted by Hurricane Isaac. The report is divided into the following sections. Section 1: Introduction. Section 2: Storm Overview, presents a synopsis of the storm, including meteorological evolution, wind-speed impact area, wind-wave generation, and storm-surge extent and magnitudes. Section 3: Coastal-Change Observations, describes data-collection missions, including acquisition of oblique aerial photography and airborne lidar topographic surveys, in response to Hurricane Isaac. Section 4: Coastal-Change Analysis, describes data-analysis methods and observations of coastal change.

  20. Estimating non-circular motions in barred galaxies using numerical N-body simulations

    NASA Astrophysics Data System (ADS)

    Randriamampandry, T. H.; Combes, F.; Carignan, C.; Deg, N.

    2015-12-01

    The observed velocities of the gas in barred galaxies are a combination of the azimuthally averaged circular velocity and non-circular motions, primarily caused by gas streaming along the bar. These non-circular flows must be accounted for before the observed velocities can be used in mass modelling. In this work, we examine the performance of the tilted-ring method and the DISKFIT algorithm for transforming velocity maps of barred spiral galaxies into rotation curves (RCs) using simulated data. We find that the tilted-ring method, which does not account for streaming motions, under-/overestimates the circular motions when the bar is parallel/perpendicular to the projected major axis. DISKFIT, which does include streaming motions, is limited to orientations where the bar is not aligned with either the major or minor axis of the image. Therefore, we propose a method of correcting RCs based on numerical simulations of galaxies. We correct the RC derived from the tilted-ring method based on a numerical simulation of a galaxy with similar properties and projections as the observed galaxy. Using observations of NGC 3319, which has a bar aligned with the major axis, as a test case, we show that the inferred mass models from the uncorrected and corrected RCs are significantly different. These results show the importance of correcting for the non-circular motions and demonstrate that new methods of accounting for these motions are necessary as current methods fail for specific bar alignments.

  1. Observations of enhanced aerosol longwave radiative forcing over an urban environment

    NASA Astrophysics Data System (ADS)

    Panicker, A. S.; Pandithurai, G.; Safai, P. D.; Kewat, S.

    2008-02-01

Collocated measurements of sun/sky radiance, aerosol chemical composition and radiative fluxes were used to estimate longwave aerosol radiative forcing over Pune, an Indian urban site, during the dry winter (December 2004 to February 2005) by two methods. The hybrid method, which uses observed downwelling and modeled upwelling longwave fluxes for different aerosol loadings, yielded a surface forcing of 9.4 Wm-2. The model approach uses skyradiometer-derived spectral aerosol optical properties in the visible and near-infrared wavelengths, modeled aerosol properties in 1.2-40 μm from observed soot and chemical composition data, MODIS water vapor, and TOMS column ozone in a radiative transfer model. Estimates from the model method showed a longwave enhancement of 6.5 and 8.2 Wm-2 at the surface with a tropical model atmosphere and with temporally varying profiles of temperature and humidity, respectively. The study reveals that about 25% of the aerosol shortwave cooling is compensated by the increase in longwave radiation due to aerosol absorption.

  2. Use of Vertically Integrated Ice in WRF-Based Forecasts of Lightning Threat

    NASA Technical Reports Server (NTRS)

    McCaul, E. W., jr.; Goodman, S. J.

    2008-01-01

    Previously reported methods of forecasting lightning threat using fields of graupel flux from WRF simulations are extended to include the simulated field of vertically integrated ice within storms. Although the ice integral shows less temporal variability than graupel flux, it provides more areal coverage, and can thus be used to create a lightning forecast that better matches the areal coverage of the lightning threat found in observations of flash extent density. A blended lightning forecast threat can be constructed that retains much of the desirable temporal sensitivity of the graupel flux method, while also incorporating the coverage benefits of the ice integral method. The graupel flux and ice integral fields contributing to the blended forecast are calibrated against observed lightning flash origin density data, based on Lightning Mapping Array observations from a series of case studies chosen to cover a wide range of flash rate conditions. Linear curve fits that pass through the origin are found to be statistically robust for the calibration procedures.
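The calibration step, a least-squares line forced through the origin, has a closed form: slope = Σxy / Σx². A small sketch with invented calibration pairs standing in for proxy-field values (graupel flux or ice integral) and observed flash origin density:

```python
import numpy as np

def slope_through_origin(proxy, flash_density):
    """Least-squares slope for a line forced through the origin:
    slope = sum(x*y) / sum(x*x)."""
    x = np.asarray(proxy, dtype=float)
    y = np.asarray(flash_density, dtype=float)
    return float(np.dot(x, y) / np.dot(x, x))

# Invented calibration pairs: proxy field values vs. observed flash density
slope = slope_through_origin([1.0, 2.0, 3.0], [2.1, 3.9, 6.0])
```

Forcing the intercept to zero encodes the physical constraint that a zero proxy value should predict zero lightning threat.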

  3. The Environmental Impact on Occupational Therapy Interventions.

    PubMed

    Skubik-Peplaski, Camille Louise; Howell, Dana; Hunter, Elizabeth

    2016-01-01

The purpose of this study was to investigate how the environment influenced the intervention choices occupational therapists made for patients recovering from a stroke in an inpatient rehabilitation hospital. Three occupational therapists were observed providing intervention for six patients over a 16-month period. Treatment spaces included a therapy gym, a gym with kitchen combination, and a home-like space. Midway through the study, furniture was added to the therapy gym to make it more home-like. Observations included therapist selection of treatment location and interventions, and observational data on the environment and interactions among therapists and patients. This study found that inpatient rehabilitation environments did influence interventions. The occupational therapists provided therapy in the standard therapy gym environment most often, whether it was enhanced to be more home-like or not, and predominantly used preparatory methods.

  4. Support of surgical process modeling by using adaptable software user interfaces

    NASA Astrophysics Data System (ADS)

    Neumuth, T.; Kaschek, B.; Czygan, M.; Goldstein, D.; Strauß, G.; Meixensberger, J.; Burgert, O.

    2010-03-01

Surgical Process Modeling (SPM) is a powerful method for acquiring data about the evolution of surgical procedures. Surgical Process Models are used in a variety of use cases including evaluation studies, requirements analysis and procedure optimization, surgical education, and workflow management scheme design. This work proposes the use of adaptive, situation-aware user interfaces for observation support software for SPM. We developed a method to support the modeling of the observer by using an ontological knowledge base, which drives the graphical user interface for the observer and restricts the terminology search space depending on the current situation. The evaluation study shows that adaptive user interfaces significantly decreased the observer's workload. Fifty-four SPM observation protocols were analyzed using the NASA Task Load Index, showing that the adaptive user interface significantly reduces observer workload on the effort, mental demand, and temporal demand criteria, helping the observer concentrate on the essential task of modeling the surgical process.

  5. Flexible methods for segmentation evaluation: results from CT-based luggage screening.

    PubMed

    Karimi, Seemeen; Jiang, Xiaoqian; Cosman, Pamela; Martz, Harry

    2014-01-01

Imaging systems used in aviation security include segmentation algorithms in an automatic threat recognition pipeline. The segmentation algorithms evolve in response to emerging threats and changing performance requirements. Analysis of segmentation algorithms' behavior, including the nature of errors and feature recovery, facilitates their development. However, evaluation methods from the literature provide limited characterization of the segmentation algorithms. Our objective was to develop segmentation evaluation methods that measure systematic errors such as oversegmentation and undersegmentation, outliers, and overall errors; the methods must also measure feature recovery and allow us to prioritize segments. We developed two complementary evaluation methods using statistical techniques and information theory. We also created a semi-automatic method to define ground truth from 3D images. We applied our methods to evaluate five segmentation algorithms developed for CT luggage screening. We validated our methods with synthetic problems and an observer evaluation. Both methods selected the same best segmentation algorithm, and human evaluation confirmed the findings. The measurement of systematic errors and prioritization helped in understanding the behavior of each segmentation algorithm. Our evaluation methods allow us to measure and explain the accuracy of segmentation algorithms.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kathawa, J.; Fry, C.; Thoennessen, M., E-mail: thoennessen@nscl.msu.edu

    Currently, thirty-eight palladium, thirty-eight antimony, thirty-nine tellurium, thirty-eight iodine, and forty xenon isotopes have been observed and the discovery of these isotopes is described here. For each isotope a brief synopsis of the first refereed publication, including the production and identification method, is presented.

  7. New Onset Diabetes: A Guide for Kidney Transplant Recipients

    MedlinePlus

    ... recommendations for immunosuppressive therapy must be based on observational studies of the incidence of PTDM in groups ... you maintain a healthy weight. There are several methods of meal planning. Two include using ... education, patient and community services, public education ...

  8. An Improved Experiment to Illustrate the Effect of Electronegativity on Chemical Shift.

    ERIC Educational Resources Information Center

    Boggess, Robert K.

    1988-01-01

    Describes a method for using nuclear magnetic resonance to observe the effect of electronegativity on the chemical shift of protons in similar compounds. Suggests the use of 1,3-dihalopropanes as samples. Includes sample questions. (MVL)

  9. A mixed methods descriptive investigation of readiness to change in rural hospitals participating in a tele-critical care intervention

    PubMed Central

    2013-01-01

    Background Telemedicine technology can improve care to patients in rural and medically underserved communities yet adoption has been slow. The objective of this study was to study organizational readiness to participate in an academic-community hospital partnership including clinician education and telemedicine outreach focused on sepsis and trauma care in underserved, rural hospitals. Methods This is a multi-method, observational case study. Participants included staff from 4 participating rural South Carolina hospitals. Using a readiness-for-change model, we evaluated 5 general domains and the related factors or topics of organizational context via key informant interviews (n=23) with hospital leadership and staff, compared these to data from hospital staff surveys (n=86) and triangulated data with investigators’ observational reports. Survey items were grouped into 4 categories (based on content and fit with conceptual model) and scored, allowing regression analyses for inferential comparisons to assess factors related to receptivity toward the telemedicine innovation. Results General agreement existed on the need for the intervention and feasibility of implementation. Previous experience with a telemedicine program appeared pivotal to enthusiasm. Perception of need, task demands and resource need explained nearly 50% of variation in receptivity. Little correlation emerged with hospital or ED leadership culture and support. However qualitative data and investigator observations about communication and differing support among disciplines and between staff and leadership could be important to actual implementation. Conclusions A mixed methods approach proved useful in assessing organizational readiness for change in small organizations. Further research on variable operational definitions, potential influential factors, appropriate and feasible methods and valid instruments for such research are needed. PMID:23360332

  10. Contribution of formative research to design an environmental program for obesity prevention in schools in Mexico City.

    PubMed

    Bonvecchio, Anabelle; Théodore, Florence L; Safdie, Margarita; Duque, Tiffany; Villanueva, María Ángeles; Torres, Catalina; Rivera, Juan

    2014-01-01

    This paper describes the methods and key findings of formative research conducted to design a school-based program for obesity prevention. Formative research was based on the ecological model and the principles of social marketing. A mixed method approach was used. Qualitative (direct observation, indepth interviews, focus group discussions and photo-voice) and quantitative (closed ended surveys, checklists, anthropometry) methods were employed. Formative research key findings, including barriers by levels of the ecological model, were used for designing a program including environmental strategies to discourage the consumption of energy dense foods and sugar beverages. Formative research was fundamental to developing a context specific obesity prevention program in schools that seeks environment modification and behavior change.

  11. An Avalanche Diode Electron Detector for Observing NEET

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kishimoto, Shunji

    2004-05-12

    Nuclear excitation by electron transition (NEET) occurs in atomic inner-shell ionization if the nuclear excitation and the electron transition have nearly the same energy and a common multipolarity. We successfully observed the NEET on 197Au and on 193Ir using a silicon avalanche diode electron detector. The detector was used to find internal conversion electrons emitted from excited nuclei in time spectroscopy with a time gate method. Some nuclear resonant levels, including 8.410 keV on 169Tm and 80.577 keV on 166Er, were also observed with the detector.

  12. [Current situation and thinking of diagnosis and treatment in some types of thyroid cancer].

    PubMed

    Yang, X Y; Yu, Y; Li, D P; Dong, L

    2017-04-07

With the rising incidence of thyroid cancer, its treatment is becoming increasingly standardized. However, opinions differ on the treatment of some types of thyroid cancer, including the timing of surgery, the surgical method, and the follow-up observation plan. There are mainly two categories of such patients: patients diagnosed as familial thyroid cancer mutation carriers through family screening, including medullary thyroid carcinoma and familial nonmedullary thyroid carcinoma, and patients with thyroid microcarcinoma who can be observed after diagnosis by fine-needle biopsy cytology. We discuss the current situation of diagnosis and treatment for these patients.

  13. Critically appraising qualitative research: a guide for clinicians more familiar with quantitative techniques.

    PubMed

    Kisely, Stephen; Kendall, Elizabeth

    2011-08-01

Papers using qualitative methods are increasingly common in psychiatric journals. This overview is an introduction to critically appraising a qualitative paper for clinicians who are more familiar with quantitative methods. Qualitative research uses data from interviews (semi-structured or unstructured), focus groups, observations or written materials. Data analysis is inductive, allowing meaning to emerge from the data, rather than the more deductive, hypothesis-centred approach of quantitative research. This overview compares and contrasts quantitative and qualitative research methods. Quantitative concepts such as reliability, validity, statistical power, bias and generalisability have qualitative equivalents. These include triangulation, trustworthiness, saturation, reflexivity and applicability. Reflexivity also shares features of transference. Qualitative approaches include: ethnography, action-assessment, grounded theory, case studies and mixed methods. Qualitative research can complement quantitative approaches. An understanding of both is useful in critically appraising the psychiatric literature.

  14. Comparing observer models and feature selection methods for a task-based statistical assessment of digital breast tomosynthesis in reconstruction space

    NASA Astrophysics Data System (ADS)

    Park, Subok; Zhang, George Z.; Zeng, Rongping; Myers, Kyle J.

    2014-03-01

A task-based assessment of image quality [1] for digital breast tomosynthesis (DBT) can be done in either the projected or reconstructed data space. As the choice of observer models and feature selection methods can vary depending on the type of task and data statistics, we previously investigated the performance of two channelized-Hotelling observer models in conjunction with 2D Laguerre-Gauss (LG) and two implementations of partial least squares (PLS) channels, along with that of the Hotelling observer, in binary detection tasks involving DBT projections [2, 3]. The difference in these observers lies in how the spatial correlation in DBT angular projections is incorporated in the observer's strategy to perform the given task. In the current work, we extend our method to the reconstructed data space of DBT. We investigate how various model observers, including the aforementioned, compare for performing the binary detection of a spherical signal embedded in structured breast phantoms with the use of DBT slices reconstructed via filtered back projection. We explore how well the model observers incorporate the spatial correlation between different numbers of reconstructed DBT slices while varying the number of projections. For this, relatively small and large scan angles (24° and 96°) are used for comparison. Our results indicate that 1) given a particular scan angle, the number of projections needed to achieve the best performance for each observer is similar across all observer/channel combinations, i.e., Np = 25 for scan angle 96° and Np = 13 for scan angle 24°, and 2) given these sufficient numbers of projections, the number of slices for each observer to achieve the best performance differs depending on the channel/observer types, which is more pronounced in the narrow scan angle case.
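The channelized Hotelling observer underlying such studies can be sketched generically: project each image onto a small set of channels, then compute the detectability SNR from the mean channel-output difference and the intra-class channel covariance. This is an illustration, not the authors' implementation; random channels below stand in for the Laguerre-Gauss or PLS channels, and the flat additive signal is invented.

```python
import numpy as np

def cho_snr(present, absent, channels):
    """Channelized Hotelling observer SNR: project images onto channels,
    then SNR^2 = dv^T S^{-1} dv, with dv the mean channel-output
    difference and S the average intra-class channel covariance."""
    vp = present @ channels                  # channel outputs, signal present
    va = absent @ channels                   # channel outputs, signal absent
    dv = vp.mean(axis=0) - va.mean(axis=0)
    s = 0.5 * (np.cov(vp, rowvar=False) + np.cov(va, rowvar=False))
    w = np.linalg.solve(s, dv)               # Hotelling template (channel space)
    return float(np.sqrt(dv @ w))

rng = np.random.default_rng(1)
channels = rng.standard_normal((64, 4))      # stand-ins for LG/PLS channels
absent = rng.standard_normal((200, 64))      # flattened background-only images
present = absent + 0.5                       # same images plus a flat "signal"
snr = cho_snr(present, absent, channels)
```

The channelization reduces the covariance estimation problem from the pixel dimension to the (much smaller) channel dimension, which is what makes the Hotelling strategy tractable for image data.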

  15. Interobserver Reliability of the Total Body Score System for Quantifying Human Decomposition.

    PubMed

    Dabbs, Gretchen R; Connor, Melissa; Bytheway, Joan A

    2016-03-01

    Several authors have tested the accuracy of the Total Body Score (TBS) method for quantifying decomposition, but none have examined the reliability of the method as a scoring system by testing interobserver error rates. Sixteen participants used the TBS system to score 59 observation packets including photographs and written descriptions of 13 human cadavers in different stages of decomposition (postmortem interval: 2-186 days). Data analysis used a two-way random model intraclass correlation in SPSS (v. 17.0). The TBS method showed "almost perfect" agreement between observers, with average absolute correlation coefficients of 0.990 and average consistency correlation coefficients of 0.991. While the TBS method may have sources of error, scoring reliability is not one of them. Individual component scores were examined, and the influences of education and experience levels were investigated. Overall, the trunk component scores were the least concordant. Suggestions are made to improve the reliability of the TBS method. © 2016 American Academy of Forensic Sciences.
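The two-way random intraclass correlation used in such reliability studies corresponds to ICC(2,1) in the Shrout and Fleiss scheme, computable directly from ANOVA mean squares. A compact sketch with illustrative data (not the study's ratings):

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single
    rater (Shrout & Fleiss). `ratings` is (n_subjects, k_raters)."""
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    msr = k * np.sum((x.mean(axis=1) - grand) ** 2) / (n - 1)  # subjects MS
    msc = n * np.sum((x.mean(axis=0) - grand) ** 2) / (k - 1)  # raters MS
    sse = np.sum((x - grand) ** 2) - msr * (n - 1) - msc * (k - 1)
    mse = sse / ((n - 1) * (k - 1))
    return float((msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n))

perfect = icc_2_1([[1, 1], [2, 2], [3, 3]])   # identical ratings
offset = icc_2_1([[1, 2], [2, 3], [3, 4]])    # constant rater offset
```

Because ICC(2,1) measures absolute agreement, a constant offset between raters lowers the coefficient even though their rankings agree perfectly, which is the appropriate behavior for a scoring system like TBS.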

  16. Preliminary evaluation of the importance of existing hydraulic-head observation locations to advective-transport predictions, Death Valley regional flow system, California and Nevada

    USGS Publications Warehouse

    Hill, Mary C.; Ely, D. Matthew; Tiedeman, Claire; O'Brien, Grady M.; D'Agnese, Frank A.; Faunt, Claudia C.

    2001-01-01

    When a model is calibrated by nonlinear regression, calculated diagnostic statistics and measures of uncertainty provide a wealth of information about many aspects of the system. This report presents a method of ranking the likely importance of existing observation locations using measures of prediction uncertainty. It is suggested that continued monitoring is warranted at more important locations, and unwarranted or less warranted at less important locations. The report develops the methodology and then demonstrates it using the hydraulic-head observation locations of a three-layer model of the Death Valley regional flow system. The predictions of interest are subsurface transport from beneath Yucca Mountain and 14 Underground Test Areas. The advective component of transport is considered because it is the component most affected by the system dynamics represented by the scale model being used. The problem is addressed using the capabilities of the U.S. Geological Survey computer program MODFLOW-2000, with its ADVective-Travel Observation (ADV) Package, and an additional computer program developed for this work. The methods presented in this report are used in three ways. (1) The ratings for individual observations are obtained by manipulating the measures of prediction uncertainty, and do not involve recalibrating the model. In this analysis, observation locations are each omitted individually and the resulting increase in uncertainty in the predictions is calculated. The uncertainty is quantified as standard deviations on the simulated advective transport. The increase in uncertainty is quantified as the percent increase in the standard deviations caused by omitting the one observation location from the calculation of standard deviations. In general, observation locations associated with larger increases are rated as more important. 
(2) Ratings for largely geographically based groups are obtained using a straightforward extension of the method used for individual observation locations. This analysis is needed where observations are clustered to determine whether the area is important to the predictions of interest. (3) Finally, the method is used to evaluate omitting a set of 100 observation locations. The locations were selected because they had low individual ratings and were not one of the few locations at which hydraulic heads from deep in the system were measured. The major results of the three analyses, when applied to the three-layer DVRFS ground-water flow system, are described in the following paragraphs. The discussion is labeled using the numbers 1 to 3 to clearly relate it to the three ways the method is used, as listed above. (1) The individual observation location analysis indicates that three observation locations are most important. They are located in Emigrant Valley, Oasis Valley, and Beatty. Of importance is that these and other observations shown to be important by this analysis are far from the travel paths considered. This displays the importance of the regional setting within which the transport occurs, the importance of including some sites throughout the area in the monitoring network, and the importance of including sites in these areas in particular. The method considered in this report indicates that the 19 observation locations that reflect hydraulic heads deeper in the system (in model layers 1, 2, and 3) are not very important. This appears to be because the locations of these observations are in the vicinity of shallow observation locations that also generally are rated as low importance, and because the model layers are hydraulically well connected vertically. The value of deep observations to testing conceptual models, however, is stressed. 
As a result, the deep observations are rated higher than is consistent with the results of the analysis presented, and none of these observations are omitted in the scenario discussed under (3) below. (2) The geographic grouping of th

  17. Application of Sliding Mode Methods to the Design of Reconfigurable Flight Control Systems

    NASA Technical Reports Server (NTRS)

    Wells, Scott R.

    2002-01-01

Observer-based sliding mode control is investigated for application to aircraft reconfigurable flight control. A comprehensive overview of reconfigurable flight control is given, including a review of the current state-of-the-art within the subdisciplines of fault detection, parameter identification, adaptive control schemes, and dynamic control allocation. Of the adaptive control methods reviewed, sliding mode control (SMC) appears very promising due to its property of invariance to matched uncertainty. An overview of sliding mode control is given and its remarkable properties are demonstrated by example. Sliding mode methods, however, are difficult to implement because unmodeled parasitic dynamics cause immediate and severe instability. This presents a challenge for all practical applications with limited-bandwidth actuators. One method to deal with parasitic dynamics is the use of an asymptotic observer in the feedback path. Observer-based SMC is investigated, and a method for selecting observer gains is offered. An additional method for shaping the feedback loop using a filter is also developed. It is shown that this SMC prefilter is equivalent to a form of model reference hedging. A complete design procedure is given which takes advantage of the sliding mode boundary layer to recast the SMC as a linear control law. Frequency domain loop shaping is then used to design the sliding manifold. Finally, three aircraft applications are demonstrated. An F-18/HARV is used to demonstrate a SISO pitch rate tracking controller. It is also used to demonstrate a MIMO lateral-directional roll rate tracking controller. The last application is a full linear six degree-of-freedom advanced tailless fighter model. The observer-based SMC is seen to provide excellent tracking with superior robustness to parameter changes and actuator failures.
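The boundary-layer idea that recasts SMC as a linear control law can be sketched in one line: the discontinuous sign function in the switching term is replaced by a saturation, which is linear for small sliding-variable magnitudes. The gain and boundary-layer width below are arbitrary illustration values, not from the thesis.

```python
import numpy as np

def smc_boundary_layer(s, gain, phi):
    """Switching term with a boundary layer: -gain * sat(s / phi).
    For |s| < phi the law is linear in s (the 'recast as a linear
    control law'); outside, it saturates like the ideal sign-based SMC."""
    return -gain * float(np.clip(s / phi, -1.0, 1.0))

u_inside = smc_boundary_layer(0.05, gain=2.0, phi=0.1)   # linear region
u_outside = smc_boundary_layer(1.0, gain=2.0, phi=0.1)   # saturated region
```

Inside the boundary layer the controller behaves as high-gain linear feedback, which is what permits frequency-domain loop shaping; the trade-off is that ideal invariance to matched uncertainty holds only outside the layer.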

  18. Taking risks with a growth mindset: long-term influence of an elementary pre-service after school science practicum

    NASA Astrophysics Data System (ADS)

    Cartwright, T. J.; Hallar, B.

    2018-02-01

    In this study, we present the long-term influence of an after school science practicum associated with an elementary science methods course. The practicum, or field experience, could be considered a community-based service learning programme, as it is situated both within and for the community. Study participants included eight third- and fifth-grade teachers who had taken elementary science methods courses; four had participated in the after school teaching practicum, while the other four had experienced a more traditional observation-based elementary science practicum. All of these teachers were in their second or third year of teaching, 3-4 years after taking the methods course. Investigation methods included questionnaires, field observations and semi-structured individual interviews. Teachers who completed the after school practicum more regularly utilised reform-based teaching strategies and cited the practicum as preparing them to use these strategies in their own classrooms. All teachers exhibited a growth mindset to some degree, but the after school practicum participants demonstrated a wider use of reform-based teaching strategies and a higher growth mindset. Elementary teachers perceive risk associated with three key aspects of instruction: (1) managing instruction and classroom management, (2) teaching science through guided inquiry, and (3) fitting science around other 'mandated' curricula such as math and reading.

  19. Improved Predictions of Drug-Drug Interactions Mediated by Time-Dependent Inhibition of CYP3A.

    PubMed

    Yadav, Jaydeep; Korzekwa, Ken; Nagar, Swati

    2018-05-07

    Time-dependent inactivation (TDI) of cytochrome P450s (CYPs) is a leading cause of clinical drug-drug interactions (DDIs). Current methods tend to overpredict DDIs. In this study, a numerical approach was used to model complex CYP3A TDI in human-liver microsomes. The inhibitors evaluated included troleandomycin (TAO), erythromycin (ERY), verapamil (VER), and diltiazem (DTZ) along with the primary metabolites N-demethyl erythromycin (NDE), norverapamil (NV), and N-desmethyl diltiazem (NDD). The complexities incorporated into the models included multiple-binding kinetics, quasi-irreversible inactivation, sequential metabolism, inhibitor depletion, and membrane partitioning. The resulting inactivation parameters were incorporated into static in vitro-in vivo correlation (IVIVC) models to predict clinical DDIs. For 77 clinically observed DDIs, with a hepatic-CYP3A-synthesis-rate constant of 0.000146 min⁻¹, the average fold difference between the observed and predicted DDIs was 3.17 for the standard replot method and 1.45 for the numerical method. Similar results were obtained using a synthesis-rate constant of 0.00032 min⁻¹. These results suggest that numerical methods can successfully model complex in vitro TDI kinetics and that the resulting DDI predictions are more accurate than those obtained with the standard replot approach.
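
    An "average fold difference" such as the 3.17 vs. 1.45 reported above is commonly computed as the geometric mean of per-interaction fold errors max(pred/obs, obs/pred); this is one standard definition and may not be the paper's exact metric, and the AUC-ratio values below are invented for illustration:

```python
import math

def avg_fold_error(observed, predicted):
    """Geometric mean fold error across DDI pairs (a common summary metric;
    the study's precise definition may differ)."""
    logs = [abs(math.log(p / o)) for o, p in zip(observed, predicted)]
    return math.exp(sum(logs) / len(logs))

# Hypothetical observed vs. predicted AUC ratios, not data from the study.
obs = [2.0, 5.5, 3.1, 1.8]
pred = [2.4, 4.0, 3.0, 3.2]
print(round(avg_fold_error(obs, pred), 2))  # → 1.32
```

    A value of 1.0 would mean perfect prediction; larger values indicate systematic over- or under-prediction on the fold scale.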

  20. Inter-observer reliability of animal-based welfare indicators included in the Animal Welfare Indicators welfare assessment protocol for dairy goats.

    PubMed

    Vieira, A; Battini, M; Can, E; Mattiello, S; Stilwell, G

    2018-01-08

    This study was conducted within the context of the Animal Welfare Indicators (AWIN) project; its underlying scientific motivation was the scarcity of data on inter-observer reliability (IOR) of welfare indicators, reliability being an essential step in developing on-farm welfare assessment protocols. The objective was therefore to evaluate the IOR of the animal-based indicators (at group and individual level) of the AWIN welfare assessment protocol (prototype) for dairy goats. Two pairs of observers, one in Portugal and one in Italy, each visited 10 farms and applied the AWIN prototype protocol. Farms in both countries were visited between January and March 2014, and all observers received the same training before the farm visits began. The data collected during farm visits and analysed in this study include both group-level and individual-level observations. Most of the group-level indicators reached the highest IOR level ('substantial', 0.85 to 0.99) in both field studies, pointing to a usable set of animal-based welfare indicators that were therefore included in the first level of the final AWIN welfare assessment protocol for dairy goats. Inter-observer reliability of individual-level indicators was lower, but the majority still reached 'fair to good' (0.41 to 0.75) or 'excellent' (0.76 to 1) levels. In the paper we explore reasons for the differences in IOR between group- and individual-level indicators, including how the number of individual-level indicators assessed on each animal and the restraining method may have affected the results. Furthermore, we discuss the differences in IOR of individual-level indicators between the two countries: the Portuguese pair of observers reached a higher IOR level than the Italian observers, which may stem from the restraining method applied or from the observers' different backgrounds and experience. Finally, the discussion emphasizes that reliability is not an absolute attribute of an indicator but derives from an interaction between the indicators, the observers and the situation in which the assessment takes place. This highlights the importance of further considering indicator reliability while developing welfare assessment protocols.
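
    Inter-observer reliability for a binary animal-based indicator is often summarized with Cohen's kappa, which corrects raw agreement for chance; the abstract's reliability bands ('fair to good', 'substantial', 'excellent') are typical interpretations of such coefficients, though the AWIN study may have used a different statistic. The scores below are hypothetical:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two observers scoring the same
    animals on a categorical indicator."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum(ca[k] * cb[k] for k in ca) / n**2   # chance agreement
    return (observed - expected) / (1 - expected)

# Hypothetical presence/absence scores (0/1) for ten goats, two observers.
a = [0, 1, 1, 0, 0, 1, 0, 1, 1, 0]
b = [0, 1, 1, 0, 1, 1, 0, 1, 0, 0]
print(round(cohens_kappa(a, b), 2))  # → 0.6
```

    Raw agreement here is 0.8, but kappa discounts the 0.5 agreement expected by chance, giving 0.6, i.e. 'fair to good' on the scale quoted in the abstract.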

  1. Continuous throughput and long-term observation of single-molecule FRET without immobilization.

    PubMed

    Tyagi, Swati; VanDelinder, Virginia; Banterle, Niccolò; Fuertes, Gustavo; Milles, Sigrid; Agez, Morgane; Lemke, Edward A

    2014-03-01

    We present an automated microfluidic platform that performs multisecond observation of single molecules with millisecond time resolution while bypassing the need for immobilization procedures. With this system, we confine biomolecules to a thin excitation field by reversibly collapsing microchannels to nanochannels. We demonstrate the power of our method by studying a variety of complex nucleic acid and protein systems, including DNA Holliday junctions, nucleosomes and human transglutaminase 2.

  2. Recent developments with the asian dust and aerosol lidar observation network (AD-NET)

    NASA Astrophysics Data System (ADS)

    Sugimoto, Nobuo; Shimizu, Atsushi; Nishizawa, Tomoaki; Jin, Yoshitaka

    2018-04-01

    Recent developments of lidars and data analysis methods for AD-Net, and studies using AD-Net, are presented. Continuous observation started in 2001 at three stations using polarization-sensitive Mie-scattering lidars. Currently, lidars, including three multi-wavelength Raman lidars and one high-spectral-resolution lidar, are operated at 20 stations. Recent studies on validation/assimilation of chemical transport models, climatology, and epidemiology of Asian dust are also described.

  3. Data-Driven Model Uncertainty Estimation in Hydrologic Data Assimilation

    NASA Astrophysics Data System (ADS)

    Pathiraja, S.; Moradkhani, H.; Marshall, L.; Sharma, A.; Geenens, G.

    2018-02-01

    The increasing availability of earth observations necessitates mathematical methods to optimally combine such data with hydrologic models. Several algorithms exist for such purposes, under the umbrella of data assimilation (DA). However, DA methods are often applied in a suboptimal fashion for complex real-world problems, due largely to several practical implementation issues. One such issue is error characterization, which is known to be critical for a successful assimilation. Mischaracterized errors lead to suboptimal forecasts, and in the worst case, to degraded estimates even compared to the no assimilation case. Model uncertainty characterization has received little attention relative to other aspects of DA science. Traditional methods rely on subjective, ad hoc tuning factors or parametric distribution assumptions that may not always be applicable. We propose a novel data-driven approach (named SDMU) to model uncertainty characterization for DA studies where (1) the system states are partially observed and (2) minimal prior knowledge of the model error processes is available, except that the errors display state dependence. It includes an approach for estimating the uncertainty in hidden model states, with the end goal of improving predictions of observed variables. The SDMU is therefore suited to DA studies where the observed variables are of primary interest. Its efficacy is demonstrated through a synthetic case study with low-dimensional chaotic dynamics and a real hydrologic experiment for one-day-ahead streamflow forecasting. In both experiments, the proposed method leads to substantial improvements in the hidden states and observed system outputs over a standard method involving perturbation with Gaussian noise.
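
    The partially observed setting described above is usually handled with an ensemble analysis step that spreads information from observed to hidden states through sample cross-covariances. The stochastic EnKF update below is a standard data-assimilation building block, shown as a minimal sketch (the paper's SDMU method concerns how model-error perturbations are characterized, not this update itself); all numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(ensemble, obs, obs_var, H):
    """Stochastic ensemble Kalman filter analysis step for a single scalar
    observation. Columns of `ensemble` are ensemble members."""
    n_ens = ensemble.shape[1]
    perturbed_obs = obs + rng.normal(0, np.sqrt(obs_var), n_ens)
    Hx = H @ ensemble
    X = ensemble - ensemble.mean(axis=1, keepdims=True)   # state anomalies
    Y = Hx - Hx.mean(axis=1, keepdims=True)               # obs-space anomalies
    Pxy = X @ Y.T / (n_ens - 1)
    Pyy = Y @ Y.T / (n_ens - 1) + obs_var
    K = Pxy / Pyy                                         # scalar obs -> simple gain
    return ensemble + K * (perturbed_obs - Hx)

# Two-state system where only the first state is observed.
ens = rng.normal([[1.0], [2.0]], 0.5, size=(2, 50))
H = np.array([[1.0, 0.0]])
analysis = enkf_update(ens, obs=1.2, obs_var=0.01, H=H)
print(analysis.mean(axis=1).round(2))
```

    With an accurate observation (small obs_var) the observed state is pulled close to 1.2, while the hidden state moves only as much as its sampled correlation with the observed one warrants.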

  4. Developing a framework for evaluating tallgrass prairie reconstruction methods and management

    USGS Publications Warehouse

    Larson, Diane L.; Ahlering, Marissa; Drobney, Pauline; Esser, Rebecca; Larson, Jennifer L.; Viste-Sparkman, Karen

    2018-01-01

    The thousands of hectares of prairie reconstructed each year in the tallgrass prairie biome can provide a valuable resource for evaluation of seed mixes, planting methods, and post-planting management if methods used and resulting characteristics of the prairies are recorded and compiled in a publicly accessible database. The objective of this study was to evaluate the use of such data to understand the outcomes of reconstructions over a 10-year period at two U.S. Fish and Wildlife Service refuges. Variables included number of species planted, seed source (combine-harvest or combine-harvest plus hand-collected), fire history, and planting method and season. In 2015 we surveyed vegetation on 81 reconstructions and calculated proportion of planted species observed; introduced species richness; native species richness, evenness and diversity; and mean coefficient of conservatism. We conducted exploratory analyses to learn how implied communities based on seed mix compared with observed vegetation; which seeding or management variables were influential in the outcome of the reconstructions; and consistency of responses between the two refuges. Insights from this analysis include: 1) proportion of planted species observed in 2015 declined as planted richness increased, but lack of data on seeding rate per species limited conclusions about value of added species; 2) differing responses to seeding and management between the two refuges suggest the importance of geographic variability that could be addressed using a public database; and 3) variables such as fire history are difficult to quantify consistently and should be carefully evaluated in the context of a public data repository.
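
    The vegetation summaries named above (richness, evenness, diversity) follow standard community-ecology formulas: Shannon diversity H' and Pielou evenness J' = H'/ln(richness). A minimal sketch with invented quadrat counts:

```python
import math

def diversity_metrics(counts):
    """Species richness, Shannon diversity (H'), and Pielou evenness (J')
    from per-species abundance counts (standard formulas; illustrative only)."""
    total = sum(counts)
    props = [c / total for c in counts if c > 0]
    richness = len(props)
    shannon = -sum(p * math.log(p) for p in props)
    evenness = shannon / math.log(richness) if richness > 1 else 0.0
    return richness, shannon, evenness

# Hypothetical quadrat counts for five planted species.
richness, H, J = diversity_metrics([12, 8, 5, 3, 2])
print(richness, round(H, 2), round(J, 2))  # → 5 1.43 0.89
```

    Evenness near 1 would indicate abundances spread equally across species; values well below 1, as here, indicate dominance by a few species.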

  5. Automatic insertion of simulated microcalcification clusters in a software breast phantom

    NASA Astrophysics Data System (ADS)

    Shankla, Varsha; Pokrajac, David D.; Weinstein, Susan P.; DeLeo, Michael; Tuite, Catherine; Roth, Robyn; Conant, Emily F.; Maidment, Andrew D.; Bakic, Predrag R.

    2014-03-01

    An automated method has been developed to insert realistic clusters of simulated microcalcifications (MCs) into computer models of breast anatomy. This algorithm has been developed as part of a virtual clinical trial (VCT) software pipeline, which includes the simulation of breast anatomy, mechanical compression, image acquisition, image processing, display and interpretation. An automated insertion method has value in VCTs involving large numbers of images. The insertion method was designed to support various insertion placement strategies, governed by probability distribution functions (pdf). The pdf can be predicated on histological or biological models of tumor growth, or estimated from the locations of actual calcification clusters. To validate the automated insertion method, a 2-AFC observer study was designed to compare two placement strategies, undirected and directed. The undirected strategy could place a MC cluster anywhere within the phantom volume. The directed strategy placed MC clusters within fibroglandular tissue on the assumption that calcifications originate from epithelial breast tissue. Three radiologists were asked to select between two simulated phantom images, one from each placement strategy. Furthermore, questions were posed to probe the rationale behind the observer's selection. The radiologists found the resulting cluster placement to be realistic in 92% of cases, validating the automated insertion method. There was a significant preference for the cluster to be positioned on a background of adipose or mixed adipose/fibroglandular tissues. Based upon these results, this automated lesion placement method will be included in our VCT simulation pipeline.
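
    The "directed" placement strategy described above amounts to sampling insertion locations from a probability distribution that is nonzero only over fibroglandular voxels. A toy sketch of that sampling step, with an invented 2-D phantom labeling (the actual pipeline operates on full 3-D anatomy models):

```python
import numpy as np

rng = np.random.default_rng(42)

def place_cluster(tissue, n_candidates=1):
    """Draw insertion voxels with probability proportional to a placement pdf
    that is uniform over fibroglandular voxels (label 1) and zero elsewhere --
    a sketch of a directed strategy, not the authors' code."""
    pdf = (tissue == 1).astype(float)
    pdf /= pdf.sum()
    flat_idx = rng.choice(tissue.size, size=n_candidates, p=pdf.ravel())
    return np.unravel_index(flat_idx, tissue.shape)

# Toy 4x4 phantom slice: 0 = adipose, 1 = fibroglandular.
phantom = np.zeros((4, 4), dtype=int)
phantom[1:3, 1:3] = 1
rows, cols = place_cluster(phantom, n_candidates=5)
print(list(zip(rows.tolist(), cols.tolist())))
```

    Replacing the uniform mask with a pdf estimated from real cluster locations, or from a tumor-growth model, changes only the `pdf` array, which is the flexibility the abstract describes.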

  6. An Approach to the Evaluation of Hypermedia.

    ERIC Educational Resources Information Center

    Knussen, Christina; And Others

    1991-01-01

    Discusses methods that may be applied to the evaluation of hypermedia, based on six models described by Lawton. Techniques described include observation, self-report measures, interviews, automated measures, psychometric tests, checklists and criterion-based techniques, process models, Experimentally Measuring Usability (EMU), and a naturalistic…

  7. Training Feedback Handbook. Research Product 83-7.

    ERIC Educational Resources Information Center

    Burnside, Billy L.; And Others

    This handbook is designed to assist training developers and evaluators in structuring their collection of feedback data. Addressed first are various methods for collecting feedback data, including informal feedback, existing unit performance records, questionnaires, structured interviews, systematic observation, and testing. The next chapter, a…

  8. In Abundance: Networked Participatory Practices as Scholarship

    ERIC Educational Resources Information Center

    Stewart, Bonnie E.

    2015-01-01

    In an era of knowledge abundance, scholars have the capacity to distribute and share ideas and artifacts via digital networks, yet networked scholarship often remains unrecognized within institutional spheres of influence. Using ethnographic methods including participant observation, interviews, and document analysis, this study investigates…

  9. 39 CFR 3001.31 - Evidence.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... methods employed in statistical compilations. The principal title of each exhibit should state what it... furnished: (i) Market research. (a) The following data and information shall be provided: (1) A clear and detailed description of the sample, observational, and data preparation designs, including definitions of...

  10. 39 CFR 3001.31 - Evidence.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... item of information used and the methods employed in statistical compilations. The principal title of... furnished: (i) Market research. (a) The following data and information shall be provided: (1) A clear and detailed description of the sample, observational, and data preparation designs, including definitions of...

  11. 39 CFR 3001.31 - Evidence.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... item of information used and the methods employed in statistical compilations. The principal title of... should be furnished: (i) Market research. (a) The following data and information shall be provided: (1) A clear and detailed description of the sample, observational, and data preparation designs, including...

  12. 39 CFR 3001.31 - Evidence.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... item of information used and the methods employed in statistical compilations. The principal title of... should be furnished: (i) Market research. (a) The following data and information shall be provided: (1) A clear and detailed description of the sample, observational, and data preparation designs, including...

  13. 39 CFR 3001.31 - Evidence.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... item of information used and the methods employed in statistical compilations. The principal title of... should be furnished: (i) Market research. (a) The following data and information shall be provided: (1) A clear and detailed description of the sample, observational, and data preparation designs, including...

  14. Getting Real: Implementing Assessment Alternatives in Mathematics.

    ERIC Educational Resources Information Center

    Hopkins, Martha H.

    1997-01-01

    Recounts experiences of a university professor who returned to the elementary classroom and attempted to implement the National Council of Teachers of Mathematics Standards and appropriate assessment methods, including nontraditional paper-and-pencil tasks, journal-like writing assignments, focused observations, and performance-based assessments…

  15. Using qualitative methods to understand factors contributing to patient satisfaction among dermatology patients: a systematic review.

    PubMed

    Gibbons, Caitlin; Singh, Sanminder; Gibbons, Brittany; Clark, Caitlin; Torres, Josefina; Cheng, Michelle Y; Wang, Elizabeth A; Armstrong, April W

    2018-05-01

    In this systematic review, we aimed to synthesize data identifying factors that contribute to patient satisfaction in dermatology care, as reported by qualitative methods. We performed a comprehensive search of the literature using the PubMed database for articles published between January 1, 2000 and February 9, 2015. The initial search yielded 186 articles, of which 13 were included after applying inclusion and exclusion criteria. The 13 reviewed articles included a total of 330 patients. Using in-field observations and semistructured interviews, the studies found that qualitative methods and analysis increased providers' sensitivity to patient needs and enhanced patient care. Analyses using qualitative methods found that patients' satisfaction with their healthcare provider is associated with (1) confidence in the provider's diagnosis, (2) perception of patient-centered, individualized recommendations and (3) the quality of patient education and provider explanation during a visit. Patient satisfaction is measured using either quantitative or qualitative methods. Quantitative methods yield standardized data that often do not capture the nuances of the patient experience. In contrast, qualitative methodology is integral to gathering patient perspectives on care and satisfaction and should be included in future research models.

  16. Estimating the state of a geophysical system with sparse observations: time delay methods to achieve accurate initial states for prediction

    DOE PAGES

    An, Zhe; Rey, Daniel; Ye, Jingxin; ...

    2017-01-16

    The problem of forecasting the behavior of a complex dynamical system through analysis of observational time-series data becomes difficult when the system expresses chaotic behavior and the measurements are sparse in space and/or time. Despite the fact that this situation is quite typical across many fields, including numerical weather prediction, the issue of whether the available observations are "sufficient" for generating successful forecasts is still not well understood. An analysis by Whartenby et al. (2013) found that in the context of the nonlinear shallow water equations on a β plane, standard nudging techniques require observing approximately 70 % of the full set of state variables. Here we examine the same system using a method introduced by Rey et al. (2014a), which generalizes standard nudging methods to utilize time delayed measurements. We show that in certain circumstances it provides a sizable reduction in the number of observations required to construct accurate estimates and high-quality predictions. In particular, we find that this estimate of 70 % can be reduced to about 33 % using time delays, and even further if Lagrangian drifter locations are also used as measurements.
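
    The standard nudging that the time-delay method generalizes can be demonstrated on a toy system: drive a model toward a "truth" trajectory by adding a relaxation term on the observed component only. The Lorenz-63 sketch below is illustrative (gain, initial conditions, and integrator are invented) and is not the shallow-water experiment of the paper:

```python
import numpy as np

def lorenz(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Lorenz-63 vector field."""
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def nudged_error(steps=20000, dt=0.001, k=20.0):
    """Integrate a 'truth' trajectory and a model nudged toward it through
    the observed x component only; return the final synchronization error."""
    truth = np.array([1.0, 1.0, 1.0])
    model = np.array([5.0, -5.0, 20.0])
    for _ in range(steps):
        truth = truth + dt * lorenz(truth)
        nudge = np.array([k * (truth[0] - model[0]), 0.0, 0.0])  # observe x only
        model = model + dt * (lorenz(model) + nudge)
    return float(np.linalg.norm(truth - model))

err = nudged_error()
print(round(err, 6))
```

    Observing one of three state variables suffices to synchronize this small system; the paper's point is that for high-dimensional geophysical models the required observed fraction is large (~70 %) unless time-delayed measurements are also exploited.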

  17. Estimating the state of a geophysical system with sparse observations: time delay methods to achieve accurate initial states for prediction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    An, Zhe; Rey, Daniel; Ye, Jingxin

    The problem of forecasting the behavior of a complex dynamical system through analysis of observational time-series data becomes difficult when the system expresses chaotic behavior and the measurements are sparse in space and/or time. Despite the fact that this situation is quite typical across many fields, including numerical weather prediction, the issue of whether the available observations are "sufficient" for generating successful forecasts is still not well understood. An analysis by Whartenby et al. (2013) found that in the context of the nonlinear shallow water equations on a β plane, standard nudging techniques require observing approximately 70 % of the full set of state variables. Here we examine the same system using a method introduced by Rey et al. (2014a), which generalizes standard nudging methods to utilize time delayed measurements. We show that in certain circumstances it provides a sizable reduction in the number of observations required to construct accurate estimates and high-quality predictions. In particular, we find that this estimate of 70 % can be reduced to about 33 % using time delays, and even further if Lagrangian drifter locations are also used as measurements.

  18. Traversing the many paths of workflow research: developing a conceptual framework of workflow terminology through a systematic literature review

    PubMed Central

    Novak, Laurie L; Johnson, Kevin B; Lorenzi, Nancy M

    2010-01-01

    The objective of this review was to describe methods used to study and model workflow. The authors included studies set in a variety of industries using qualitative, quantitative and mixed methods. Of the 6221 matching abstracts, 127 articles were included in the final corpus. The authors collected data from each article on researcher perspective, study type, methods type, specific methods, approaches to evaluating quality of results, definition of workflow and dependent variables. Ethnographic observation and interviews were the most frequently used methods. Long study durations revealed the large time commitment required for descriptive workflow research. The most frequently discussed technique for evaluating quality of study results was triangulation. The definition of the term “workflow” and choice of methods for studying workflow varied widely across research areas and researcher perspectives. The authors developed a conceptual framework of workflow-related terminology for use in future research and present this model for use by other researchers. PMID:20442143

  19. Advancing Methods for U.S. Transgender Health Research

    PubMed Central

    Reisner, Sari L.; Deutsch, Madeline B.; Bhasin, Shalender; Bockting, Walter; Brown, George R.; Feldman, Jamie; Garofalo, Rob; Kreukels, Baudewijntje; Radix, Asa; Safer, Joshua D.; Tangpricha, Vin; T’Sjoen, Guy; Goodman, Michael

    2016-01-01

    Purpose of Review To describe methodological challenges, gaps, and opportunities in U.S. transgender health research. Recent Findings Lack of large prospective observational studies and intervention trials, limited data on risks and benefits of gender affirmation (e.g., hormones and surgical interventions), and inconsistent use of definitions across studies hinder evidence-based care for transgender people. Systematic high-quality observational and intervention-testing studies may be carried out using several approaches, including general population-based, health systems-based, clinic-based, venue-based, and hybrid designs. Each of these approaches has its strengths and limitations; however, harmonization of research efforts is needed. Ongoing development of evidence-based clinical recommendations will benefit from a series of observational and intervention studies aimed at identification, recruitment, and follow-up of transgender people of different ages, from different racial, ethnic, and socioeconomic backgrounds and with diverse gender identities. Summary Transgender health research faces challenges that include standardization of lexicon, agreed-upon population definitions, study design, sampling, measurement, outcome ascertainment, and sample size. Application of existing and new methods is needed to fill existing gaps, increase the scientific rigor and reach of transgender health research, and inform evidence-based prevention and care for this underserved population. PMID:26845331

  20. A mixed-methods observational study of human milk sharing communities on Facebook.

    PubMed

    Perrin, Maryanne Tigchelaar; Goodell, L Suzanne; Allen, Jonathan C; Fogleman, April

    2014-04-01

    The Food and Drug Administration discourages the casual sharing of human milk because of the risk of pathogen transmission. No information is currently available on the prevalence of this practice. The purpose of this mixed-methods observational study is to describe the size and activity of online milk sharing communities. Data for 3 months were extracted from nine public Facebook pages that facilitate the exchange of human milk. The numbers of participants, interactions, and comments were analyzed. We observed 954 individuals participating in milk sharing. The number of interactions per individual ranged from none to 16 (mean, 1.74 ± 1.65). Top reasons that participants requested milk included "lactation problems" (69.4%) and "child health problems" (48.5%). Nearly half of donors were offering 100 ounces or more, which is the minimum to be eligible to donate to nonprofit milk banks. Milk sharing networks in the United States are active, with thousands of individuals participating in the direct exchange of raw human milk. Public health issues include increasing the supply of pasteurized donor milk for fragile infants, increasing breastfeeding support, and helping milk sharing families appropriately manage risks.

  1. Evaluation of reporting quality for observational studies using routinely collected health data in pharmacovigilance.

    PubMed

    Nie, Xiaolu; Zhang, Ying; Wu, Zehao; Jia, Lulu; Wang, Xiaoling; Langan, Sinéad M; Benchimol, Eric I; Peng, Xiaoxia

    2018-06-01

    To appraise the reporting quality of studies concerning linezolid-related thrombocytopenia against the REporting of studies Conducted using Observational Routinely-collected health Data (RECORD) statement. Medline, Embase, the Cochrane Library and ClinicalTrials.gov were searched for observational studies of linezolid-related thrombocytopenia using routinely collected health data from 2000 to 2017. Two reviewers screened potentially eligible articles and extracted data independently. Reporting quality was then assessed by two senior researchers using the RECORD statement. Of 25 included studies, 11 (44.0%) mentioned the type of data in the title and/or abstract. Of the 38 items derived from the RECORD statement, the median number reported in the included studies was 22 (interquartile range (IQR) 18 to 27). Inadequate reporting was discovered in the following aspects: validation of the codes or algorithms, study size estimation, quantitative variables, subgroup statistical methods, missing data, follow-up/matching or sampling strategy, sensitivity analysis and data cleaning methods, funding and role of funders, and accessibility of the protocol and raw data. This study provides evidence that the reporting quality of post-marketing safety evaluation studies conducted using routinely collected health data is often insufficient. Future stakeholders are encouraged to endorse the RECORD guidelines in pharmacovigilance.

  2. The Trumorph® system: The new universal technique for the observation and analysis of the morphology of living sperm. [corrected].

    PubMed

    Soler, C; García-Molina, A; Contell, J; Silvestre, M A; Sancho, M

    2015-07-01

    Evaluation of sperm morphology is a fundamental component of semen analysis, but its real significance has been obscured by a plethora of techniques that involve fixation and staining procedures that induce artefacts. Here we describe Trumorph®, a new method for sperm morphology assessment based upon examination of wet preparations of living spermatozoa immobilized by a short 60°C shock and observed using negative phase contrast microscopy. We observed samples from five animals of the following species: bull, boar, goat and rabbit. In every case, all the components of the sperm head and tail were perfectly defined, including the acrosome and midpiece (in all its length, including cytoplasmic droplets). A range of morphological forms was observed, similar to those found in conventional fixed and stained preparations, but other forms were found that are distinguishable only by the optics used. The ease of preparation makes it a robust method applicable to the analysis of living unmodified spermatozoa in a range of situations. Subsequent studies on well-characterized samples are required to describe the morphology of potentially fertilizing spermatozoa.

  3. The Impact of Current and Future Polar Orbiting Satellite Data on Numerical Weather Prediction at NASA/GSFC

    NASA Technical Reports Server (NTRS)

    Atlas, Robert

    2004-01-01

    The lack of adequate observational data continues to be recognized as a major factor limiting both atmospheric research and numerical prediction on a variety of temporal and spatial scales. Since the advent of meteorological satellites in the 1960s, a considerable research effort has been directed toward the design of space-borne meteorological sensors, the development of optimal methods for the utilization of these data, and an assessment of the influence of existing satellite data and the potential influence of future satellite observations on numerical weather prediction. This has included both Observing System Experiments (OSEs) and Observing System Simulation Experiments (OSSEs). OSEs are conducted to evaluate the impact of specific observations or classes of observations on analyses and forecasts. While OSEs are performed with existing data, OSSEs are conducted to evaluate the potential for future observing systems to improve NWP, as well as to evaluate trade-offs in observing system design, and to develop and test improved methods for data assimilation. At the conference, results from OSEs evaluating satellite data sets that have recently become available to the global observing system, such as AIRS and SeaWinds, and results from OSSEs to determine the potential impact of space-based lidar winds will be presented.

  4. Observation duration analysis for Earth surface features from a Moon-based platform

    NASA Astrophysics Data System (ADS)

    Ye, Hanlin; Guo, Huadong; Liu, Guang; Ren, Yuanzhen

    2018-07-01

    Earth System Science is a discipline that performs holistic and comprehensive research on various components of the Earth. One of a key issue for the Earth monitoring and observation is to enhance the observation duration, the time intervals during which the Earth surface features can be observed by sensors. In this work, we propose to utilise the Moon as an Earth observation platform. Thanks to the long distance between the Earth and the Moon, and the vast space on the lunar surface which is suitable for sensor installation, this Earth observation platform could have large spatial coverage, long temporal duration, and could perform multi-layer detection of the Earth. The line of sight between a proposed Moon-based platform and the Earth will change with different lunar surface positions; therefore, in this work, the position of the lunar surface was divided into four regions, including one full observation region and three incomplete observation regions. As existing methods are not able to perform global-scale observations, a Boolean matrix method was established to calculate the necessary observation durations from a Moon-based platform. Based on Jet Propulsion Laboratory (JPL) ephemerides and Earth Orientation Parameters (EOP), a formula was developed to describe the geometrical relationship between the Moon-based platform and Earth surface features in the unified spatial coordinate system and the unified time system. In addition, we compared the observation geometries at different positions on the lunar surface and two parameters that are vital to observation duration calculations were considered. Finally, an analysis method was developed. We found that the observation duration of a given Earth surface feature shows little difference regardless of sensor position within the full observation region. However, the observation duration for sensors in the incomplete observation regions is reduced by at least half. 
In summary, our results demonstrate the suitability of a Moon-based platform located in the full observation region.
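    The horizon test underlying such a Boolean visibility matrix can be sketched as follows. This is an illustrative simplification (point Moon, spherical Earth, no libration or ephemeris model), not the authors' full formulation: a surface point sees the Moon whenever the Moon lies above the point's local horizon, and the total observation duration is the count of visible time steps.

```python
import math

def visibility_matrix(lat_lons, moon_positions, earth_radius=6371.0):
    """Boolean visibility matrix V[i][t]: surface point i sees the Moon at time t.

    A point sees the Moon when the Moon is above its local horizon, i.e. the
    dot product of the local zenith direction and the point-to-Moon vector is
    positive. lat_lons are (lat, lon) in radians; moon_positions are (x, y, z)
    in km in an Earth-centred frame, one entry per time step.
    """
    V = []
    for lat, lon in lat_lons:
        # Unit zenith (outward normal) vector of the surface point.
        zx = math.cos(lat) * math.cos(lon)
        zy = math.cos(lat) * math.sin(lon)
        zz = math.sin(lat)
        # Cartesian position of the surface point itself.
        px, py, pz = earth_radius * zx, earth_radius * zy, earth_radius * zz
        row = []
        for mx, my, mz in moon_positions:
            dx, dy, dz = mx - px, my - py, mz - pz
            row.append(zx * dx + zy * dy + zz * dz > 0.0)
        V.append(row)
    return V

def observation_duration(row, dt_hours):
    """Total observation duration for one surface point, in hours."""
    return sum(row) * dt_hours
```

    With the Moon placed 384,400 km along the +x axis, a point at lat = 0, lon = 0 is visible while its antipode is not; summing each row of the matrix gives the per-point duration.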

  5. Observational constraints on Tachyon and DBI inflation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Sheng; Liddle, Andrew R., E-mail: sl277@sussex.ac.uk, E-mail: arl@roe.ac.uk

    2014-03-01

    We present a systematic method for evaluating perturbation observables in non-canonical single-field inflation models within the slow-roll approximation, which, allied with field redefinitions, enables predictions to be established for a wide range of models. We use this to investigate various non-canonical inflation models, including Tachyon inflation and DBI inflation. The Lambert W function is used extensively in our method for the evaluation of observables. In the Tachyon case, in the slow-roll approximation the model can be approximated by a canonical field with a redefined potential, which yields predictions in better agreement with observations than the canonical equivalents. For DBI inflation models we consider contributions from both the scalar potential and the warp geometry. In the case of a quartic potential, we find a formula for the observables under both non-relativistic (sound speed c_s^2 ∼ 1) and relativistic (c_s^2 ≪ 1) behaviour of the scalar DBI inflaton. For a quadratic potential we find two branches in the non-relativistic c_s^2 ∼ 1 case, determined by the competition of model parameters, while for the relativistic case c_s^2 → 0 we find consistency with results already in the literature. We present a comparison to the latest Planck satellite observations. Most of the non-canonical models we investigate, including the Tachyon, are better fits to the data than canonical models with the same potential, but we find that DBI models in the slow-roll regime have difficulty in matching the data.
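    The canonical baseline that these non-canonical predictions are compared against is standard slow-roll arithmetic. As a sketch (the canonical quartic case only, with M_Pl = 1; the paper's Lambert-W machinery for non-canonical kinetic terms is not reproduced here):

```python
# Canonical slow-roll observables for V(phi) = lambda * phi^4 (M_Pl = 1).
# epsilon = (1/2)(V'/V)^2 = 8/phi^2 and eta = V''/V = 12/phi^2, while
# N e-folds before the end of inflation (epsilon = 1) gives phi^2 = 8*(N + 1).

def quartic_observables(N):
    phi2 = 8.0 * (N + 1.0)              # field value N e-folds before the end
    eps = 8.0 / phi2                    # first slow-roll parameter
    eta = 12.0 / phi2                   # second slow-roll parameter
    n_s = 1.0 - 6.0 * eps + 2.0 * eta   # scalar spectral index
    r = 16.0 * eps                      # tensor-to-scalar ratio
    return n_s, r

n_s, r = quartic_observables(60)
```

    For N = 60 this gives n_s ≈ 0.951 and r ≈ 0.26; such a large tensor-to-scalar ratio is why the canonical quartic potential is disfavoured by Planck, motivating the non-canonical modifications discussed in the abstract.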

  6. Potential Pitfalls of Reporting and Bias in Observational Studies With Propensity Score Analysis Assessing a Surgical Procedure: A Methodological Systematic Review.

    PubMed

    Lonjon, Guillaume; Porcher, Raphael; Ergina, Patrick; Fouet, Mathilde; Boutron, Isabelle

    2017-05-01

    To describe the evolution of the use and reporting of propensity score (PS) analysis in observational studies assessing a surgical procedure. Assessing surgery in randomized controlled trials raises several challenges. Observational studies with PS analysis are a robust alternative for comparative effectiveness research. In this methodological systematic review, we identified all PubMed reports of observational studies with PS analysis that evaluated a surgical procedure and described the evolution of their use over time. Then, we selected a sample of articles published from August 2013 to July 2014 and systematically appraised the quality of reporting and potential bias of the PS analysis used. We selected 652 reports of observational studies with PS analysis. The publications increased over time, from 1 report in 1987 to 198 in 2013. Among the 129 reports assessed, 20% (n = 24) did not detail the covariates included in the PS and 77% (n = 100) did not report a justification for including these covariates in the PS. The rate of missing data for potential covariates was reported in 9% of articles. When a crossover by conversion was possible, only 14% of reports (n = 12) mentioned this issue. For matched analysis, 10% of articles reported all 4 key elements that allow for reproducibility of a PS-matched analysis (matching ratio, method to choose the nearest neighbors, replacement and method for statistical analysis). Observational studies with PS analysis in surgery are increasing in frequency, but specific methodological issues and weaknesses in reporting exist.
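    The four reporting elements named above (matching ratio, method to choose nearest neighbours, replacement, and analysis method) pin down a PS-matched analysis completely. A minimal sketch of one common choice, greedy 1:1 nearest-neighbour matching without replacement on precomputed propensity scores (illustrative only; the review does not prescribe this variant):

```python
def match_one_to_one(treated_ps, control_ps, caliper=None):
    """Greedy 1:1 nearest-neighbour propensity-score matching without replacement.

    treated_ps / control_ps map unit ids to propensity scores. Returns a list
    of (treated_id, control_id) pairs; treated units whose nearest available
    control falls outside the caliper (if one is given) are left unmatched.
    """
    available = dict(control_ps)
    pairs = []
    # Process treated units in propensity-score order for determinism.
    for t_id, t_ps in sorted(treated_ps.items(), key=lambda kv: kv[1]):
        if not available:
            break
        c_id = min(available, key=lambda c: abs(available[c] - t_ps))
        if caliper is None or abs(available[c_id] - t_ps) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]   # without replacement: each control used once
    return pairs
```

    Reporting exactly these choices (1:1 ratio, greedy nearest neighbour, no replacement, caliper value) is what allows a reader to reproduce the matched sample.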

  7. Radiological findings for hip dysplasia at skeletal maturity. Validation of digital and manual measurement techniques.

    PubMed

    Engesæter, Ingvild Øvstebø; Laborie, Lene Bjerke; Lehmann, Trude Gundersen; Sera, Francesco; Fevang, Jonas; Pedersen, Douglas; Morcuende, José; Lie, Stein Atle; Engesæter, Lars Birger; Rosendahl, Karen

    2012-07-01

    To report on intra-observer, inter-observer, and inter-method reliability and agreement for radiological measurements used in the diagnosis of hip dysplasia at skeletal maturity, as obtained by a manual and a digital measurement technique. Pelvic radiographs from 95 participants (56 females) in a follow-up hip study of 18- to 19-year-old patients were included. Eleven radiological measurements relevant for hip dysplasia (Sharp's, Wiberg's, and Ogata's angles; acetabular roof angle of Tönnis; articulo-trochanteric distance; acetabular depth-width ratio; femoral head extrusion index; maximum teardrop width; and the joint space width in three different locations) were validated. Three observers measured the radiographs using both a digital measurement program and manually in AgfaWeb1000. Inter-method and inter- and intra-observer agreement were analyzed using the mean differences between the readings/readers, establishing the 95% limits of agreement. We also calculated the minimum detectable change and the intra-class correlation coefficient. Large variations among different radiological measurements were demonstrated. However, the variation was not related to the use of either the manual or digital measurement technique. For measurements with greater absolute values (Sharp's angle, femoral head extrusion index, and acetabular depth-width ratio) the inter- and intra-observer and inter-method agreements were better as compared to measurements with lower absolute values (acetabular roof angle, teardrop and joint space width). The inter- and intra-observer variation differs notably across different radiological measurements relevant for hip dysplasia at skeletal maturity, a fact that should be taken into account in clinical practice. The agreement between the manual and digital methods is good.
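    The agreement statistics used here are standard. A minimal sketch of the 95% limits of agreement (Bland-Altman) and one common definition of the minimum detectable change, computed from paired readings (illustrative values only, not the study's data):

```python
import math

def limits_of_agreement(reads_a, reads_b):
    """95% limits of agreement between two sets of paired readings.

    Returns (lower limit, upper limit, SD of the paired differences):
    mean difference +/- 1.96 * SD of the differences.
    """
    diffs = [a - b for a, b in zip(reads_a, reads_b)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))
    return mean_d - 1.96 * sd_d, mean_d + 1.96 * sd_d, sd_d

def minimum_detectable_change(sd_d):
    """MDC from the SD of paired differences: 1.96 * sqrt(2) * SEM,
    with SEM = sd_d / sqrt(2), which simplifies to 1.96 * sd_d."""
    return 1.96 * sd_d
```

    Wide limits of agreement relative to the clinically relevant range of a measurement (e.g. a few degrees for Sharp's angle) are what make the inter-observer variation "notable" in practice.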

  8. Video Surveillance Captures Student Hand Hygiene Behavior, Reactivity to Observation, and Peer Influence in Kenyan Primary Schools

    PubMed Central

    Pickering, Amy J.; Blum, Annalise G.; Breiman, Robert F.; Ram, Pavani K.; Davis, Jennifer

    2014-01-01

    Background: In-person structured observation is considered the best approach for measuring hand hygiene behavior, yet is expensive, time consuming, and may alter behavior. Video surveillance could be a useful tool for objectively monitoring hand hygiene behavior if validated against current methods. Methods: Student hand cleaning behavior was monitored with video surveillance and in-person structured observation, both simultaneously and separately, at four primary schools in urban Kenya over a study period of 8 weeks. Findings: Video surveillance and in-person observation captured similar rates of hand cleaning (absolute difference <5%, p = 0.74). Video surveillance documented higher hand cleaning rates (71%) when at least one other person was present at the hand cleaning station, compared to when a student was alone (48%; rate ratio = 1.14 [95% CI 1.01–1.28]). Students increased hand cleaning rates during simultaneous video and in-person monitoring as compared to single-method monitoring, suggesting reactivity to each method of monitoring. This trend was documented at schools receiving a handwashing with soap intervention, but not at schools receiving a sanitizer intervention. Conclusion: Video surveillance of hand hygiene behavior yields results comparable to in-person observation among schools in a resource-constrained setting. Video surveillance also has certain advantages over in-person observation, including rapid data processing and the capability to capture new behavioral insights. Peer influence can significantly improve student hand cleaning behavior and, when possible, should be exploited in the design and implementation of school hand hygiene programs. PMID:24676389
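    A rate ratio with a 95% confidence interval, as reported here, is conventionally computed on the log scale under a Poisson assumption. A minimal sketch (illustrative counts, not the study's raw data):

```python
import math

def rate_ratio_ci(events1, time1, events2, time2, z=1.96):
    """Rate ratio with a normal-approximation CI on the log scale.

    Assumes Poisson event counts, so SE(log RR) = sqrt(1/events1 + 1/events2);
    the z = 1.96 default gives a 95% interval.
    """
    rr = (events1 / time1) / (events2 / time2)
    se = math.sqrt(1.0 / events1 + 1.0 / events2)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi
```

    An interval whose lower bound exceeds 1 (as with the 1.01–1.28 interval above) is what supports the claim that peer presence increases hand cleaning.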

  9. Hooked Flare Ribbons and Flux-rope-related QSL Footprints

    NASA Astrophysics Data System (ADS)

    Zhao, Jie; Gilchrist, Stuart A.; Aulanier, Guillaume; Schmieder, Brigitte; Pariat, Etienne; Li, Hui

    2016-05-01

    We studied the magnetic topology of active region 12158 on 2014 September 10 and compared it with the observations before and early in the flare that began at 17:21 UT (SOL2014-09-10T17:45:00). Our results show that the sigmoidal structure and flare ribbons of this active region observed by the Solar Dynamics Observatory/Atmospheric Imaging Assembly can be well reproduced with a Grad-Rubin nonlinear force-free field extrapolation method. Various inverse-S- and inverse-J-shaped magnetic field lines, which surround a coronal flux rope, coincide with the sigmoid as observed in different extreme-ultraviolet wavelengths, including its multithreaded curved ends. The observed distribution of surface currents in the magnetic polarity where it was not prescribed is also well reproduced. This validates our numerical implementation and setup of the Grad-Rubin method. The modeled double inverse-J-shaped quasi-separatrix layer (QSL) footprints match the observed flare ribbons during the rising phase of the flare, including their hooked parts. The spiral-like shape of the latter may be related to a complex pre-eruptive flux rope with more than one turn of twist, as obtained in the model. These ribbon-associated flux-rope QSL footprints are consistent with the new standard flare model in 3D, with the presence of a hyperbolic flux tube located below an inverse-teardrop-shaped coronal QSL. This is a new step toward forecasting the locations of reconnection and ribbons in solar flares and determining the geometrical properties of eruptive flux ropes.

  10. State estimation and prediction using clustered particle filters.

    PubMed

    Lee, Yoonsang; Majda, Andrew J

    2016-12-20

    Particle filtering is an essential tool to improve uncertain model predictions by incorporating noisy observational data from complex systems including non-Gaussian features. A class of particle filters, clustered particle filters, is introduced for high-dimensional nonlinear systems, which uses relatively few particles compared with the standard particle filter. The clustered particle filter captures non-Gaussian features of the true signal, which are typical in complex nonlinear dynamical systems such as geophysical systems. The method is also robust in the difficult regime of high-quality sparse and infrequent observations. The key features of the clustered particle filtering are coarse-grained localization through the clustering of the state variables and particle adjustment to stabilize the method; each observation affects only neighbor state variables through clustering and particles are adjusted to prevent particle collapse due to high-quality observations. The clustered particle filter is tested for the 40-dimensional Lorenz 96 model with several dynamical regimes including strongly non-Gaussian statistics. The clustered particle filter shows robust skill in both achieving accurate filter results and capturing non-Gaussian statistics of the true signal. It is further extended to multiscale data assimilation, which provides the large-scale estimation by combining a cheap reduced-order forecast model and mixed observations of the large- and small-scale variables. This approach enables the use of a larger number of particles due to the computational savings in the forecast model. The multiscale clustered particle filter is tested for one-dimensional dispersive wave turbulence using a forecast model with model errors.
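    The coarse-grained localization described above can be sketched in a few lines: each cluster of state variables is weighted and resampled using only its own observations, so a single observation never reweights the whole state vector. This is a deliberately minimal sketch (Gaussian observation noise, multinomial resampling) and omits the paper's particle-adjustment step:

```python
import math, random

def clustered_particle_filter(particles, obs, clusters, obs_var, rng):
    """One localized analysis step of a clustered particle filter.

    particles: list of state vectors (lists of floats); obs: observed values
    (None for unobserved components); clusters: list of index lists
    partitioning the state variables; obs_var: observation noise variance.
    Each cluster is weighted and resampled independently.
    """
    n = len(particles)
    new_parts = [list(p) for p in particles]
    for idx in clusters:
        # Local log-weights from the observations inside this cluster only.
        logw = []
        for p in particles:
            lw = 0.0
            for j in idx:
                if obs[j] is not None:
                    lw += -0.5 * (obs[j] - p[j]) ** 2 / obs_var
            logw.append(lw)
        m = max(logw)                      # subtract max for numerical safety
        w = [math.exp(l - m) for l in logw]
        s = sum(w)
        w = [x / s for x in w]
        # Multinomial resampling of this cluster's block of variables.
        for i in range(n):
            k = rng.choices(range(n), weights=w)[0]
            for j in idx:
                new_parts[i][j] = particles[k][j]
    return new_parts
```

    Because weights are computed per cluster, the effective dimension each particle set must cover is the cluster size, not the full state dimension; that is what lets relatively few particles avoid weight collapse in high dimensions.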

  11. State estimation and prediction using clustered particle filters

    PubMed Central

    Lee, Yoonsang; Majda, Andrew J.

    2016-01-01

    Particle filtering is an essential tool to improve uncertain model predictions by incorporating noisy observational data from complex systems including non-Gaussian features. A class of particle filters, clustered particle filters, is introduced for high-dimensional nonlinear systems, which uses relatively few particles compared with the standard particle filter. The clustered particle filter captures non-Gaussian features of the true signal, which are typical in complex nonlinear dynamical systems such as geophysical systems. The method is also robust in the difficult regime of high-quality sparse and infrequent observations. The key features of the clustered particle filtering are coarse-grained localization through the clustering of the state variables and particle adjustment to stabilize the method; each observation affects only neighbor state variables through clustering and particles are adjusted to prevent particle collapse due to high-quality observations. The clustered particle filter is tested for the 40-dimensional Lorenz 96 model with several dynamical regimes including strongly non-Gaussian statistics. The clustered particle filter shows robust skill in both achieving accurate filter results and capturing non-Gaussian statistics of the true signal. It is further extended to multiscale data assimilation, which provides the large-scale estimation by combining a cheap reduced-order forecast model and mixed observations of the large- and small-scale variables. This approach enables the use of a larger number of particles due to the computational savings in the forecast model. The multiscale clustered particle filter is tested for one-dimensional dispersive wave turbulence using a forecast model with model errors. PMID:27930332

  12. Integrated software for the detection of epileptogenic zones in refractory epilepsy.

    PubMed

    Mottini, Alejandro; Miceli, Franco; Albin, Germán; Nuñez, Margarita; Ferrándo, Rodolfo; Aguerrebere, Cecilia; Fernandez, Alicia

    2010-01-01

    In this paper we present integrated software designed to help nuclear medicine physicians in the detection of epileptogenic zones (EZ) by means of ictal-interictal SPECT and MR images. This tool was designed to be flexible, friendly and efficient. A novel detection method (A-contrario) was included along with the classical detection method (subtraction analysis). The software's performance was evaluated with two separate sets of validation studies: visual interpretation of 12 patient images by an experienced observer, and objective analysis of virtual brain phantom experiments by the proposed numerical observers. Our results support the potential use of the proposed software to help nuclear medicine physicians detect EZ in clinical practice.
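    The classical subtraction analysis mentioned above can be sketched on flattened, co-registered images: normalise both scans to a common mean intensity, subtract, and flag voxels whose difference exceeds a threshold in standard-deviation units. This is an illustrative simplification (1-D voxel lists, global mean normalisation), not the paper's implementation:

```python
def subtraction_candidates(ictal, interictal, k=2.0):
    """Flag candidate epileptogenic voxels by ictal-interictal subtraction.

    Both inputs are flattened, co-registered images (lists of voxel
    intensities). The interictal image is scaled to the ictal mean, the
    images are subtracted, and voxels more than k SDs above the mean
    difference are returned as candidate indices.
    """
    m_i = sum(ictal) / len(ictal)
    m_j = sum(interictal) / len(interictal)
    scaled = [v * m_i / m_j for v in interictal]   # intensity normalisation
    diff = [a - b for a, b in zip(ictal, scaled)]
    mu = sum(diff) / len(diff)
    sd = (sum((d - mu) ** 2 for d in diff) / (len(diff) - 1)) ** 0.5
    return [i for i, d in enumerate(diff) if d > mu + k * sd]
```

    The a-contrario approach replaces this fixed k-SD threshold with a bound on the expected number of false detections, which is what the phantom experiments with numerical observers compare.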

  13. Using genetic data to strengthen causal inference in observational research.

    PubMed

    Pingault, Jean-Baptiste; O'Reilly, Paul F; Schoeler, Tabea; Ploubidis, George B; Rijsdijk, Frühling; Dudbridge, Frank

    2018-06-05

    Causal inference is essential across the biomedical, behavioural and social sciences. By progressing from confounded statistical associations to evidence of causal relationships, causal inference can reveal complex pathways underlying traits and diseases and help to prioritize targets for intervention. Recent progress in genetic epidemiology - including statistical innovation, massive genotyped data sets and novel computational tools for deep data mining - has fostered the intense development of methods exploiting genetic data and relatedness to strengthen causal inference in observational research. In this Review, we describe how such genetically informed methods differ in their rationale, applicability and inherent limitations and outline how they should be integrated in the future to offer a rich causal inference toolbox.
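    One of the best-known genetically informed methods in this toolbox is Mendelian randomization, where a genetic variant serves as an instrument for an exposure. A minimal sketch of the per-variant Wald ratio and its inverse-variance-weighted combination (illustrative only; real analyses must check the instrument assumptions):

```python
def wald_ratio(beta_outcome, beta_exposure):
    """Single-variant Mendelian randomization estimate: the ratio of the
    variant's effect on the outcome to its effect on the exposure."""
    return beta_outcome / beta_exposure

def ivw_estimate(ratios, ses):
    """Inverse-variance-weighted combination of per-variant ratio estimates,
    each with its standard error."""
    weights = [1.0 / se ** 2 for se in ses]
    return sum(w * r for w, r in zip(weights, ratios)) / sum(weights)
```

    Because genotypes are fixed at conception, such estimates are protected from classical confounding and reverse causation, which is the rationale for using them to strengthen observational findings.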

  14. Cold dark matter. 1: The formation of dark halos

    NASA Technical Reports Server (NTRS)

    Gelb, James M.; Bertschinger, Edmund

    1994-01-01

    We use numerical simulations of critically closed cold dark matter (CDM) models to study the effects of numerical resolution on observable quantities. We study simulations with up to 256^3 particles using the particle-mesh (PM) method and with up to 144^3 particles using the adaptive particle-particle/particle-mesh (P3M) method. Comparisons of galaxy halo distributions are made among the various simulations. We also compare distributions with observations, and we explore methods for identifying halos, including a new algorithm that finds all particles within closed contours of the smoothed density field surrounding a peak. The simulated halos show more substructure than predicted by the Press-Schechter theory. We are able to rule out all Ω = 1 CDM models with linear amplitude σ_8 ≳ 0.5 because the simulations produce too many massive halos compared with the observations. The simulations also produce too many low-mass halos. The distribution of halos characterized by their circular velocities for the P3M simulations is in reasonable agreement with the observations for 150 km/s ≤ V_circ ≤ 350 km/s.
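    The closed-contour halo finder mentioned above amounts to a flood fill from a density peak, collecting everything connected to the peak above a contour threshold. A minimal 2-D grid sketch (the paper works with smoothed particle density fields in 3-D; this is an illustration of the idea, not the authors' code):

```python
from collections import deque

def halo_members(density, peak, threshold):
    """Collect grid cells connected to a density peak above a contour level.

    density: 2-D grid (list of lists); peak: (row, col) of the density peak;
    threshold: contour level defining the halo boundary. Uses breadth-first
    flood fill with 4-connectivity.
    """
    rows, cols = len(density), len(density[0])
    seen = {peak}
    queue = deque([peak])
    members = []
    while queue:
        r, c = queue.popleft()
        if density[r][c] < threshold:
            continue                     # outside the closed contour
        members.append((r, c))
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append((nr, nc))
    return members
```

    Unlike a friends-of-friends linker, this construction cannot bridge two halos through a low-density saddle below the contour level, which is what makes it attractive for separating substructure.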

  15. Investigating the relationship between two home numeracy measures: A questionnaire and observations during Lego building and book reading.

    PubMed

    Mutaf Yildiz, Belde; Sasanguie, Delphine; De Smedt, Bert; Reynvoet, Bert

    2018-06-01

    Home numeracy has been defined as parent-child interactions that include experiences with numerical content in daily-life settings. Previous studies have commonly operationalized home numeracy either via questionnaires or via observational methods. These studies have shown that both types of measures are positively related to variability in children's mathematical skills. This study investigated whether these distinct data collection methods index the same aspect of home numeracy. The frequencies of home numeracy activities and parents' opinions about their children's mathematics education were assessed via a questionnaire. The amount of home numeracy talk was observed via two semi-structured videotaped parent-child activity sessions (Lego building and book reading). Children's mathematical skills were examined with two calculation subtests. We observed that parents' reports and the number of observed numeracy interactions were not related to each other. Interestingly, parents' reports of numeracy activities were positively related to children's calculation abilities, whereas the observed home numeracy talk was negatively related to children's calculation abilities. These results indicate that the two methods tap into different aspects of home numeracy. Statement of contribution What is already known on this subject? Home numeracy, that is, parent-child interactions that include experiences with numerical content, is supposed to have a positive impact on calculation or mathematical ability in general. Despite many positive results, some studies have failed to find such an association. Home numeracy has been assessed with questionnaires on the frequency of numerical experiences and with observations of parent-child interactions; however, these two measures of home numeracy have never been compared directly. What does this study add?
    This study assessed home numeracy through questionnaires and observations in 44 parent-child dyads and showed that home numeracy measures derived from questionnaires and observations are not related. Moreover, the relations between the reported frequency of home numeracy activities and calculation on the one hand, and parent-child number talk (derived from observations) and calculation on the other, run in opposite directions: the frequency of activities is positively related to calculation performance, and the amount of number talk is negatively related to calculation. This study shows that both measures tap into different aspects of home numeracy, which can be an important factor explaining inconsistencies in the literature. © 2018 The British Psychological Society.
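    The "positively related" and "negatively related" claims above are correlations between each home-numeracy measure and the calculation scores. A minimal sketch of the Pearson correlation used for such comparisons (toy data, not the study's):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between two paired measures, e.g. a questionnaire
    score and a calculation score for the same parent-child dyads."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

    Two measures that each correlate with a third variable but not with each other (as the questionnaire and observation measures do here) are the statistical signature of the two methods indexing different constructs.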

  16. Supplement: “The Rate of Binary Black Hole Mergers Inferred from Advanced LIGO Observations Surrounding GW150914” (2016, ApJL, 833, L1)

    NASA Astrophysics Data System (ADS)

    Abbott, B. P.; Abbott, R.; Abbott, T. D.; Abernathy, M. R.; Acernese, F.; Ackley, K.; Adams, C.; Adams, T.; Addesso, P.; Adhikari, R. X.; Adya, V. B.; Affeldt, C.; Agathos, M.; Agatsuma, K.; Aggarwal, N.; Aguiar, O. D.; Aiello, L.; Ain, A.; Ajith, P.; Allen, B.; Allocca, A.; Altin, P. A.; Anderson, S. B.; Anderson, W. G.; Arai, K.; Araya, M. C.; Arceneaux, C. C.; Areeda, J. S.; Arnaud, N.; Arun, K. G.; Ascenzi, S.; Ashton, G.; Ast, M.; Aston, S. M.; Astone, P.; Aufmuth, P.; Aulbert, C.; Babak, S.; Bacon, P.; Bader, M. K. M.; Baker, P. T.; Baldaccini, F.; Ballardin, G.; Ballmer, S. W.; Barayoga, J. C.; Barclay, S. E.; Barish, B. C.; Barker, D.; Barone, F.; Barr, B.; Barsotti, L.; Barsuglia, M.; Barta, D.; Bartlett, J.; Bartos, I.; Bassiri, R.; Basti, A.; Batch, J. C.; Baune, C.; Bavigadda, V.; Bazzan, M.; Behnke, B.; Bejger, M.; Bell, A. S.; Bell, C. J.; Berger, B. K.; Bergman, J.; Bergmann, G.; Berry, C. P. L.; Bersanetti, D.; Bertolini, A.; Betzwieser, J.; Bhagwat, S.; Bhandare, R.; Bilenko, I. A.; Billingsley, G.; Birch, J.; Birney, R.; Biscans, S.; Bisht, A.; Bitossi, M.; Biwer, C.; Bizouard, M. A.; Blackburn, J. K.; Blair, C. D.; Blair, D. G.; Blair, R. M.; Bloemen, S.; Bock, O.; Bodiya, T. P.; Boer, M.; Bogaert, G.; Bogan, C.; Bohe, A.; Bojtos, P.; Bond, C.; Bondu, F.; Bonnand, R.; Boom, B. A.; Bork, R.; Boschi, V.; Bose, S.; Bouffanais, Y.; Bozzi, A.; Bradaschia, C.; Brady, P. R.; Braginsky, V. B.; Branchesi, M.; Brau, J. E.; Briant, T.; Brillet, A.; Brinkmann, M.; Brisson, V.; Brockill, P.; Brooks, A. F.; Brown, D. A.; Brown, D. D.; Brown, N. M.; Buchanan, C. C.; Buikema, A.; Bulik, T.; Bulten, H. J.; Buonanno, A.; Buskulic, D.; Buy, C.; Byer, R. L.; Cadonati, L.; Cagnoli, G.; Cahillane, C.; Calderón Bustillo, J.; Callister, T.; Calloni, E.; Camp, J. B.; Cannon, K. C.; Cao, J.; Capano, C. D.; Capocasa, E.; Carbognani, F.; Caride, S.; Casanueva Diaz, J.; Casentini, C.; Caudill, S.; Cavaglià, M.; Cavalier, F.; Cavalieri, R.; Cella, G.; Cepeda, C. 
B.; Cerboni Baiardi, L.; Cerretani, G.; Cesarini, E.; Chakraborty, R.; Chalermsongsak, T.; Chamberlin, S. J.; Chan, M.; Chao, S.; Charlton, P.; Chassande-Mottin, E.; Chen, H. Y.; Chen, Y.; Cheng, C.; Chincarini, A.; Chiummo, A.; Cho, H. S.; Cho, M.; Chow, J. H.; Christensen, N.; Chu, Q.; Chua, S.; Chung, S.; Ciani, G.; Clara, F.; Clark, J. A.; Cleva, F.; Coccia, E.; Cohadon, P.-F.; Colla, A.; Collette, C. G.; Cominsky, L.; Constancio, M., Jr.; Conte, A.; Conti, L.; Cook, D.; Corbitt, T. R.; Cornish, N.; Corsi, A.; Cortese, S.; Costa, C. A.; Coughlin, M. W.; Coughlin, S. B.; Coulon, J.-P.; Countryman, S. T.; Couvares, P.; Cowan, E. E.; Coward, D. M.; Cowart, M. J.; Coyne, D. C.; Coyne, R.; Craig, K.; Creighton, J. D. E.; Cripe, J.; Crowder, S. G.; Cumming, A.; Cunningham, L.; Cuoco, E.; Dal Canton, T.; Danilishin, S. L.; D’Antonio, S.; Danzmann, K.; Darman, N. S.; Dattilo, V.; Dave, I.; Daveloza, H. P.; Davier, M.; Davies, G. S.; Daw, E. J.; Day, R.; De, S.; DeBra, D.; Debreczeni, G.; Degallaix, J.; De Laurentis, M.; Deléglise, S.; Del Pozzo, W.; Denker, T.; Dent, T.; Dereli, H.; Dergachev, V.; De Rosa, R.; DeRosa, R. T.; DeSalvo, R.; Dhurandhar, S.; Díaz, M. C.; Di Fiore, L.; Di Giovanni, M.; Di Lieto, A.; Di Pace, S.; Di Palma, I.; Di Virgilio, A.; Dojcinoski, G.; Dolique, V.; Donovan, F.; Dooley, K. L.; Doravari, S.; Douglas, R.; Downes, T. P.; Drago, M.; Drever, R. W. P.; Driggers, J. C.; Du, Z.; Ducrot, M.; Dwyer, S. E.; Edo, T. B.; Edwards, M. C.; Effler, A.; Eggenstein, H.-B.; Ehrens, P.; Eichholz, J.; Eikenberry, S. S.; Engels, W.; Essick, R. C.; Etzel, T.; Evans, M.; Evans, T. M.; Everett, R.; Factourovich, M.; Fafone, V.; Fair, H.; Fairhurst, S.; Fan, X.; Fang, Q.; Farinon, S.; Farr, B.; Farr, W. M.; Favata, M.; Fays, M.; Fehrmann, H.; Fejer, M. M.; Ferrante, I.; Ferreira, E. C.; Ferrini, F.; Fidecaro, F.; Fiori, I.; Fiorucci, D.; Fisher, R. 
P.; Flaminio, R.; Fletcher, M.; Fong, H.; Fournier, J.-D.; Franco, S.; Frasca, S.; Frasconi, F.; Frei, Z.; Freise, A.; Frey, R.; Frey, V.; Fricke, T. T.; Fritschel, P.; Frolov, V. V.; Fulda, P.; Fyffe, M.; Gabbard, H. A. G.; Gair, J. R.; Gammaitoni, L.; Gaonkar, S. G.; Garufi, F.; Gatto, A.; Gaur, G.; Gehrels, N.; Gemme, G.; Gendre, B.; Genin, E.; Gennai, A.; George, J.; Gergely, L.; Germain, V.; Ghosh, Archisman; Ghosh, S.; Giaime, J. A.; Giardina, K. D.; Giazotto, A.; Gill, K.; Glaefke, A.; Goetz, E.; Goetz, R.; Gondan, L.; González, G.; Gonzalez Castro, J. M.; Gopakumar, A.; Gordon, N. A.; Gorodetsky, M. L.; Gossan, S. E.; Gosselin, M.; Gouaty, R.; Graef, C.; Graff, P. B.; Granata, M.; Grant, A.; Gras, S.; Gray, C.; Greco, G.; Green, A. C.; Groot, P.; Grote, H.; Grunewald, S.; Guidi, G. M.; Guo, X.; Gupta, A.; Gupta, M. K.; Gushwa, K. E.; Gustafson, E. K.; Gustafson, R.; Hacker, J. J.; Hall, B. R.; Hall, E. D.; Hammond, G.; Haney, M.; Hanke, M. M.; Hanks, J.; Hanna, C.; Hannam, M. D.; Hanson, J.; Hardwick, T.; Harms, J.; Harry, G. M.; Harry, I. W.; Hart, M. J.; Hartman, M. T.; Haster, C.-J.; Haughian, K.; Heidmann, A.; Heintze, M. C.; Heitmann, H.; Hello, P.; Hemming, G.; Hendry, M.; Heng, I. S.; Hennig, J.; Heptonstall, A. W.; Heurs, M.; Hild, S.; Hoak, D.; Hodge, K. A.; Hofman, D.; Hollitt, S. E.; Holt, K.; Holz, D. E.; Hopkins, P.; Hosken, D. J.; Hough, J.; Houston, E. A.; Howell, E. J.; Hu, Y. M.; Huang, S.; Huerta, E. A.; Huet, D.; Hughey, B.; Husa, S.; Huttner, S. H.; Huynh-Dinh, T.; Idrisy, A.; Indik, N.; Ingram, D. R.; Inta, R.; Isa, H. N.; Isac, J.-M.; Isi, M.; Islas, G.; Isogai, T.; Iyer, B. R.; Izumi, K.; Jacqmin, T.; Jang, H.; Jani, K.; Jaranowski, P.; Jawahar, S.; Jiménez-Forteza, F.; Johnson, W. W.; Jones, D. I.; Jones, R.; Jonker, R. J. G.; Ju, L.; K, Haris; Kalaghatgi, C. V.; Kalogera, V.; Kandhasamy, S.; Kang, G.; Kanner, J. 
B.; Karki, S.; Kasprzack, M.; Katsavounidis, E.; Katzman, W.; Kaufer, S.; Kaur, T.; Kawabe, K.; Kawazoe, F.; Kéfélian, F.; Kehl, M. S.; Keitel, D.; Kelley, D. B.; Kells, W.; Kennedy, R.; Key, J. S.; Khalaidovski, A.; Khalili, F. Y.; Khan, I.; Khan, S.; Khan, Z.; Khazanov, E. A.; Kijbunchoo, N.; Kim, C.; Kim, J.; Kim, K.; Kim, Nam-Gyu; Kim, Namjun; Kim, Y.-M.; King, E. J.; King, P. J.; Kinzel, D. L.; Kissel, J. S.; Kleybolte, L.; Klimenko, S.; Koehlenbeck, S. M.; Kokeyama, K.; Koley, S.; Kondrashov, V.; Kontos, A.; Korobko, M.; Korth, W. Z.; Kowalska, I.; Kozak, D. B.; Kringel, V.; Krishnan, B.; Królak, A.; Krueger, C.; Kuehn, G.; Kumar, P.; Kuo, L.; Kutynia, A.; Lackey, B. D.; Landry, M.; Lange, J.; Lantz, B.; Lasky, P. D.; Lazzarini, A.; Lazzaro, C.; Leaci, P.; Leavey, S.; Lebigot, E. O.; Lee, C. H.; Lee, H. K.; Lee, H. M.; Lee, K.; Lenon, A.; Leonardi, M.; Leong, J. R.; Leroy, N.; Letendre, N.; Levin, Y.; Levine, B. M.; Li, T. G. F.; Libson, A.; Littenberg, T. B.; Lockerbie, N. A.; Logue, J.; Lombardi, A. L.; Lord, J. E.; Lorenzini, M.; Loriette, V.; Lormand, M.; Losurdo, G.; Lough, J. D.; Lück, H.; Lundgren, A. P.; Luo, J.; Lynch, R.; Ma, Y.; MacDonald, T.; Machenschalk, B.; MacInnis, M.; Macleod, D. M.; Magaña-Sandoval, F.; Magee, R. M.; Mageswaran, M.; Majorana, E.; Maksimovic, I.; Malvezzi, V.; Man, N.; Mandel, I.; Mandic, V.; Mangano, V.; Mansell, G. L.; Manske, M.; Mantovani, M.; Marchesoni, F.; Marion, F.; Márka, S.; Márka, Z.; Markosyan, A. S.; Maros, E.; Martelli, F.; Martellini, L.; Martin, I. W.; Martin, R. M.; Martynov, D. V.; Marx, J. N.; Mason, K.; Masserot, A.; Massinger, T. J.; Masso-Reid, M.; Matichard, F.; Matone, L.; Mavalvala, N.; Mazumder, N.; Mazzolo, G.; McCarthy, R.; McClelland, D. E.; McCormick, S.; McGuire, S. C.; McIntyre, G.; McIver, J.; McManus, D. J.; McWilliams, S. T.; Meacher, D.; Meadors, G. D.; Meidam, J.; Melatos, A.; Mendell, G.; Mendoza-Gandara, D.; Mercer, R. 
A.; Merilh, E.; Merzougui, M.; Meshkov, S.; Messenger, C.; Messick, C.; Meyers, P. M.; Mezzani, F.; Miao, H.; Michel, C.; Middleton, H.; Mikhailov, E. E.; Milano, L.; Miller, J.; Millhouse, M.; Minenkov, Y.; Ming, J.; Mirshekari, S.; Mishra, C.; Mitra, S.; Mitrofanov, V. P.; Mitselmakher, G.; Mittleman, R.; Moggi, A.; Mohan, M.; Mohapatra, S. R. P.; Montani, M.; Moore, B. C.; Moore, C. J.; Moraru, D.; Moreno, G.; Morriss, S. R.; Mossavi, K.; Mours, B.; Mow-Lowry, C. M.; Mueller, C. L.; Mueller, G.; Muir, A. W.; Mukherjee, Arunava; Mukherjee, D.; Mukherjee, S.; Mukund, N.; Mullavey, A.; Munch, J.; Murphy, D. J.; Murray, P. G.; Mytidis, A.; Nardecchia, I.; Naticchioni, L.; Nayak, R. K.; Necula, V.; Nedkova, K.; Nelemans, G.; Neri, M.; Neunzert, A.; Newton, G.; Nguyen, T. T.; Nielsen, A. B.; Nissanke, S.; Nitz, A.; Nocera, F.; Nolting, D.; Normandin, M. E.; Nuttall, L. K.; Oberling, J.; Ochsner, E.; O’Dell, J.; Oelker, E.; Ogin, G. H.; Oh, J. J.; Oh, S. H.; Ohme, F.; Oliver, M.; Oppermann, P.; Oram, Richard J.; O’Reilly, B.; O’Shaughnessy, R.; Ottaway, D. J.; Ottens, R. S.; Overmier, H.; Owen, B. J.; Pai, A.; Pai, S. A.; Palamos, J. R.; Palashov, O.; Palomba, C.; Pal-Singh, A.; Pan, H.; Pankow, C.; Pannarale, F.; Pant, B. C.; Paoletti, F.; Paoli, A.; Papa, M. A.; Paris, H. R.; Parker, W.; Pascucci, D.; Pasqualetti, A.; Passaquieti, R.; Passuello, D.; Patricelli, B.; Patrick, Z.; Pearlstone, B. L.; Pedraza, M.; Pedurand, R.; Pekowsky, L.; Pele, A.; Penn, S.; Perreca, A.; Phelps, M.; Piccinni, O.; Pichot, M.; Piergiovanni, F.; Pierro, V.; Pillant, G.; Pinard, L.; Pinto, I. M.; Pitkin, M.; Poggiani, R.; Popolizio, P.; Porter, E. K.; Post, A.; Powell, J.; Prasad, J.; Predoi, V.; Premachandra, S. S.; Prestegard, T.; Price, L. R.; Prijatelj, M.; Principe, M.; Privitera, S.; Prodi, G. A.; Prokhorov, L.; Puncken, O.; Punturo, M.; Puppo, P.; Pürrer, M.; Qi, H.; Qin, J.; Quetschke, V.; Quintero, E. A.; Quitzow-James, R.; Raab, F. J.; Rabeling, D. 
S.; Radkins, H.; Raffai, P.; Raja, S.; Rakhmanov, M.; Rapagnani, P.; Raymond, V.; Razzano, M.; Re, V.; Read, J.; Reed, C. M.; Regimbau, T.; Rei, L.; Reid, S.; Reitze, D. H.; Rew, H.; Reyes, S. D.; Ricci, F.; Riles, K.; Robertson, N. A.; Robie, R.; Robinet, F.; Rocchi, A.; Rolland, L.; Rollins, J. G.; Roma, V. J.; Romano, R.; Romanov, G.; Romie, J. H.; Rosińska, D.; Rowan, S.; Rüdiger, A.; Ruggi, P.; Ryan, K.; Sachdev, S.; Sadecki, T.; Sadeghian, L.; Salconi, L.; Saleem, M.; Salemi, F.; Samajdar, A.; Sammut, L.; Sampson, L.; Sanchez, E. J.; Sandberg, V.; Sandeen, B.; Sanders, J. R.; Sassolas, B.; Sathyaprakash, B. S.; Saulson, P. R.; Sauter, O.; Savage, R. L.; Sawadsky, A.; Schale, P.; Schilling, R.; Schmidt, J.; Schmidt, P.; Schnabel, R.; Schofield, R. M. S.; Schönbeck, A.; Schreiber, E.; Schuette, D.; Schutz, B. F.; Scott, J.; Scott, S. M.; Sellers, D.; Sengupta, A. S.; Sentenac, D.; Sequino, V.; Sergeev, A.; Serna, G.; Setyawati, Y.; Sevigny, A.; Shaddock, D. A.; Shah, S.; Shahriar, M. S.; Shaltev, M.; Shao, Z.; Shapiro, B.; Shawhan, P.; Sheperd, A.; Shoemaker, D. H.; Shoemaker, D. M.; Siellez, K.; Siemens, X.; Sigg, D.; Silva, A. D.; Simakov, D.; Singer, A.; Singer, L. P.; Singh, A.; Singh, R.; Singhal, A.; Sintes, A. M.; Slagmolen, B. J. J.; Smith, J. R.; Smith, N. D.; Smith, R. J. E.; Son, E. J.; Sorazu, B.; Sorrentino, F.; Souradeep, T.; Srivastava, A. K.; Staley, A.; Steinke, M.; Steinlechner, J.; Steinlechner, S.; Steinmeyer, D.; Stephens, B. C.; Stevenson, S.; Stone, R.; Strain, K. A.; Straniero, N.; Stratta, G.; Strauss, N. A.; Strigin, S.; Sturani, R.; Stuver, A. L.; Summerscales, T. Z.; Sun, L.; Sutton, P. J.; Swinkels, B. L.; Szczepańczyk, M. J.; Tacca, M.; Talukder, D.; Tanner, D. B.; Tápai, M.; Tarabrin, S. P.; Taracchini, A.; Taylor, R.; Theeg, T.; Thirugnanasambandam, M. P.; Thomas, E. G.; Thomas, M.; Thomas, P.; Thorne, K. A.; Thorne, K. S.; Thrane, E.; Tiwari, S.; Tiwari, V.; Tokmakov, K. V.; Tomlinson, C.; Tonelli, M.; Torres, C. V.; Torrie, C. 
I.; Töyrä, D.; Travasso, F.; Traylor, G.; Trifirò, D.; Tringali, M. C.; Trozzo, L.; Tse, M.; Turconi, M.; Tuyenbayev, D.; Ugolini, D.; Unnikrishnan, C. S.; Urban, A. L.; Usman, S. A.; Vahlbruch, H.; Vajente, G.; Valdes, G.; Vallisneri, M.; van Bakel, N.; van Beuzekom, M.; van den Brand, J. F. J.; Van Den Broeck, C.; Vander-Hyde, D. C.; van der Schaaf, L.; van Heijningen, J. V.; van Veggel, A. A.; Vardaro, M.; Vass, S.; Vasúth, M.; Vaulin, R.; Vecchio, A.; Vedovato, G.; Veitch, J.; Veitch, P. J.; Venkateswara, K.; Verkindt, D.; Vetrano, F.; Viceré, A.; Vinciguerra, S.; Vine, D. J.; Vinet, J.-Y.; Vitale, S.; Vo, T.; Vocca, H.; Vorvick, C.; Voss, D.; Vousden, W. D.; Vyatchanin, S. P.; Wade, A. R.; Wade, L. E.; Wade, M.; Walker, M.; Wallace, L.; Walsh, S.; Wang, G.; Wang, H.; Wang, M.; Wang, X.; Wang, Y.; Ward, R. L.; Warner, J.; Was, M.; Weaver, B.; Wei, L.-W.; Weinert, M.; Weinstein, A. J.; Weiss, R.; Welborn, T.; Wen, L.; Wesels, P.; Westphal, T.; Wette, K.; Whelan, J. T.; White, D. J.; Whiting, B. F.; Williams, R. D.; Williamson, A. R.; Willis, J. L.; Willke, B.; Wimmer, M. H.; Winkler, W.; Wipf, C. C.; Wittel, H.; Woan, G.; Worden, J.; Wright, J. L.; Wu, G.; Yablon, J.; Yam, W.; Yamamoto, H.; Yancey, C. C.; Yap, M. J.; Yu, H.; Yvert, M.; Zadrożny, A.; Zangrando, L.; Zanolin, M.; Zendri, J.-P.; Zevin, M.; Zhang, F.; Zhang, L.; Zhang, M.; Zhang, Y.; Zhao, C.; Zhou, M.; Zhou, Z.; Zhu, X. J.; Zucker, M. E.; Zuraw, S. E.; Zweizig, J.; LIGO Scientific Collaboration; Virgo Collaboration

    2016-12-01

    This article provides supplemental information for a Letter reporting the rate of binary black hole (BBH) coalescences inferred from 16 days of coincident Advanced LIGO observations surrounding the transient gravitational-wave (GW) signal GW150914. In that work we reported various rate estimates whose 90% confidence intervals fell in the range 2–600 Gpc⁻³ yr⁻¹. Here we give details on our method and computations, including information about our search pipelines, a derivation of our likelihood function for the analysis, a description of the astrophysical search trigger distribution expected from merging BBHs, details on our computational methods, a description of the effects and our model for calibration uncertainty, and an analytic method for estimating our detector sensitivity, which is calibrated to our measurements.
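
    The rate interval quoted above comes from Poisson counting statistics over a sensitive spacetime volume. A minimal sketch of that kind of inference, assuming a flat prior on the rate and invented inputs rather than the paper's actual pipeline:

```python
import math

def rate_interval(n_events, vt, cl=0.90, grid_max=1000.0, npts=200000):
    """Equal-tailed credible interval on a Poisson rate R, given n_events
    observed in a sensitive spacetime volume vt (e.g. Gpc^3 yr), computed
    on a grid. Flat prior on R -- an illustrative choice, not the prior
    used in the paper."""
    dr = grid_max / npts
    rs = [(i + 0.5) * dr for i in range(npts)]
    # unnormalised log-posterior: Poisson likelihood x flat prior
    logpost = [n_events * math.log(r * vt) - r * vt for r in rs]
    peak = max(logpost)
    post = [math.exp(lp - peak) for lp in logpost]
    total = sum(post)
    cdf, lo, hi = 0.0, None, None
    for r, p in zip(rs, post):
        cdf += p / total
        if lo is None and cdf >= (1 - cl) / 2:
            lo = r
        if hi is None and cdf >= 1 - (1 - cl) / 2:
            hi = r
    return lo, hi

# hypothetical: 2 events seen in VT = 0.1 Gpc^3 yr
lo, hi = rate_interval(2, 0.1)
```

    With these invented inputs the 90% interval spans roughly 8 to 63 events Gpc⁻³ yr⁻¹; the width illustrates why small-number counting experiments quote such broad rate ranges.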

  17. Identifying the Oscillatory Mechanism of the Glucose Oxidase-Catalase Coupled Enzyme System.

    PubMed

    Muzika, František; Jurašek, Radovan; Schreiberová, Lenka; Radojković, Vuk; Schreiber, Igor

    2017-10-12

    We provide experimental evidence of periodic and aperiodic oscillations in an enzymatic system of glucose oxidase-catalase in a continuous-flow stirred reactor coupled by a membrane with a continuous-flow reservoir supplied with hydrogen peroxide. To describe such dynamics, we formulate a detailed mechanism based on partial results in the literature. Finally, we introduce a novel method for estimation of unknown kinetic parameters. The method is based on matching experimental data at an oscillatory instability with stoichiometric constraints of the mechanism formulated by applying the stability theory of reaction networks. This approach has been used to estimate rate coefficients in the catalase part of the mechanism. Remarkably, model simulations show good agreement with the observed oscillatory dynamics, including apparently chaotic intermittent behavior. Our method can be applied to any reaction system with an experimentally observable dynamical instability.
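
    The full enzyme mechanism is beyond a short sketch, but the qualitative phenomenon, sustained oscillations in a nonlinear reaction network, can be illustrated with the textbook Brusselator model (a stand-in chosen for brevity, unrelated to the glucose oxidase-catalase mechanism):

```python
def brusselator(a=1.0, b=3.0, dt=1e-3, steps=60000):
    """Forward-Euler integration of the Brusselator, a minimal two-species
    chemical oscillator. For b > 1 + a**2 the steady state is unstable and
    the system settles onto a limit cycle."""
    x, y = 1.0, 1.0
    xs = []
    for _ in range(steps):
        dx = a + x * x * y - (b + 1.0) * x
        dy = b * x - x * x * y
        x += dt * dx
        y += dt * dy
        xs.append(x)
    return xs

xs = brusselator()
# count mean-crossings in the second half of the run: a sustained
# oscillation keeps crossing its own mean long after the transient
tail = xs[len(xs) // 2:]
mean = sum(tail) / len(tail)
crossings = sum(1 for u, v in zip(tail, tail[1:])
                if (u - mean) * (v - mean) < 0)
```

    A real analysis, as in the paper, would go further and connect the onset of such oscillations to the stoichiometry of the network.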

  18. Young children's communication and literacy: a qualitative study of language in the inclusive preschool.

    PubMed

    Kliewer, C

    1995-06-01

    Interactive and literacy-based language use of young children within the context of an inclusive preschool classroom was explored. An interpretivist framework and qualitative research methods, including participant observation, were used to examine and analyze language in five preschool classes composed of children with and without disabilities. Children's language use included spoken, written, signed, and typed forms. Results showed complex communicative and literacy-related language use on the part of young children that fell outside conventional adult perspectives. Also, children who used expressive methods other than speech were often left out of the contexts where spoken language was richest and most complex.

  19. Comparison of three different methods for effective introduction of platelet-rich plasma on PLGA woven mesh.

    PubMed

    Lee, Ji-Hye; Nam, Jinwoo; Kim, Hee Joong; Yoo, Jeong Joon

    2015-03-11

    For successful tissue regeneration, effective cell delivery to the defect site is very important. Various types of polymer biomaterials have been developed and applied for effective cell delivery. PLGA (poly(lactic-co-glycolic acid)), a synthetic polymer, is a commercially available and FDA-approved material. Platelet-rich plasma (PRP) is an autologous growth factor cocktail containing various growth factors including PDGF, TGFβ-1 and BMPs, and has shown positive effects on cell behaviors. We hypothesized that PRP pretreatment on PLGA mesh using different methods would cause different patterns of platelet adhesion and activation stages, which would modulate cell adhesion and proliferation on the PLGA mesh. In this study, we pretreated PRP on PLGA using three different methods, simple dripping (SD), dynamic oscillation (DO) and centrifugation (CE), then observed the amount of adhered platelets and their activation stage distribution. The highest amount of platelets was observed on CE mesh and calcium-treated CE mesh. Moreover, calcium addition after PRP coating triggered dramatic activation of platelets, which showed large and flat platelet morphologies with rich fibrin networks. Human chondrocytes (hCs) and human bone marrow stromal cells (hBMSCs) were next cultured on PRP-pretreated PLGA meshes prepared using the different methods. CE mesh showed a significant increase in the initial cell adhesion of hCs and proliferation of hBMSCs compared with SD and DO meshes. The results demonstrated that the centrifugation method can be considered a promising and simple coating method to introduce PRP onto PLGA polymeric material, which could improve cell-material interaction.

  20. An Alternative to EPA Method 9 -- Field Validation of the Digital Opacity Compliance System (DOCS)

    DTIC Science & Technology

    2005-03-15

    at the completion of the Phase I and Phase II DOCS field demonstration. These included the following: 1) anemometer, 2) sling psychrometer, 3) Abney...anemometer (Eastern Technical Associates, Inc.) Sky conditions Visual observation Relative Humidity Sling Psychrometer (Eastern Technical Associates...least have access to a range of climatic monitoring equipment including the following: 1) anemometer, 2) sling psychrometer, 3) Abney Level (sun angle

  1. Behavioral and brain pattern differences between acting and observing in an auditory task

    PubMed Central

    Karanasiou, Irene S; Papageorgiou, Charalabos; Tsianaka, Eleni I; Matsopoulos, George K; Ventouras, Errikos M; Uzunoglu, Nikolaos K

    2009-01-01

    Background Recent research has shown that errors seem to influence the patterns of brain activity. Additionally current notions support the idea that similar brain mechanisms are activated during acting and observing. The aim of the present study was to examine the patterns of brain activity of actors and observers elicited upon receiving feedback information of the actor's response. Methods The task used in the present research was an auditory identification task that included both acting and observing settings, ensuring concurrent ERP measurements of both participants. The performance of the participants was investigated in conditions of varying complexity. ERP data were analyzed with regards to the conditions of acting and observing in conjunction to correct and erroneous responses. Results The obtained results showed that the complexity induced by cue dissimilarity between trials was a demodulating factor leading to poorer performance. The electrophysiological results suggest that feedback information results in different intensities of the ERP patterns of observers and actors depending on whether the actor had made an error or not. The LORETA source localization method yielded significantly larger electrical activity in the supplementary motor area (Brodmann area 6), the posterior cingulate gyrus (Brodmann area 31/23) and the parietal lobe (Precuneus/Brodmann area 7/5). Conclusion These findings suggest that feedback information has a different effect on the intensities of the ERP patterns of actors and observers depending on whether the actor committed an error. Certain neural systems, including medial frontal area, posterior cingulate gyrus and precuneus may mediate these modulating effects. Further research is needed to elucidate in more detail the neuroanatomical and neuropsychological substrates of these systems. PMID:19154586

  2. Assessing digital literacy in web-based physical activity surveillance: the WIN study.

    PubMed

    Mathew, Merly; Morrow, James R; Frierson, Georita M; Bain, Tyson M

    2011-01-01

    PURPOSE. Investigate relations between demographic characteristics and submission method, Internet or paper, when physical activity behaviors are reported. DESIGN. Observational. SETTING. Metropolitan. SUBJECTS. Adult women (N = 918) observed weekly for 2 years (total number of weekly reports, 44,963). MEASURES. Independent variables included age, race, education, income, employment status, and Internet skills. Dependent variables were method of submission (Internet or paper) and adherence. ANALYSIS. Logistic regression to analyze weekly odds of submitting data online and meeting study adherence criteria. Model 1 investigated method of submission, model 2 analyzed meeting the study's Internet adherence criterion, and model 3 analyzed meeting total adherence regardless of submission method. RESULTS. Whites, those with good Internet skills, and those reporting higher incomes were more likely to log online. Those who were white, older, and reported good Internet skills were more likely to be at least 75% adherent online. Older women were more likely to be adherent regardless of method. Employed women were less likely to log online or be adherent. CONCLUSION. Providing participants with multiple submission methods may reduce potential bias and provide more generalizable results relevant for future Internet-based research.
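
    The weekly-odds models above are standard logistic regressions. As a sketch of the technique on synthetic data (the "internet skill" feature, coefficients, and sample size below are invented, not the WIN study's):

```python
import math
import random

def logistic_fit(X, y, lr=0.1, epochs=800):
    """Logistic regression fitted by batch gradient descent -- the model
    family used to relate predictors to the odds of an outcome."""
    k, n = len(X[0]), len(X)
    beta = [0.0] * k
    for _ in range(epochs):
        grad = [0.0] * k
        for row, yy in zip(X, y):
            z = sum(b * v for b, v in zip(beta, row))
            p = 1.0 / (1.0 + math.exp(-z))   # predicted probability
            for j in range(k):
                grad[j] += (p - yy) * row[j]
        beta = [b - lr * g / n for b, g in zip(beta, grad)]
    return beta

rng = random.Random(0)
# one invented predictor ("internet skill"); higher skill -> higher odds
X = [[1.0, rng.uniform(-2, 2)] for _ in range(400)]
y = [1 if rng.random() < 1.0 / (1.0 + math.exp(-(0.5 + 1.5 * s))) else 0
     for _, s in X]
beta = logistic_fit(X, y)   # beta[1] recovers a positive skill effect
```

    In practice a statistics package would be used; the sketch only shows the shape of the model behind the reported odds ratios.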

  3. [Quality analysis of observational studies on pelvic organ prolapse in China].

    PubMed

    Wang, Y T; Tao, L Y; He, H J; Han, J S

    2017-06-25

    Objective: To evaluate the quality of observational studies on pelvic organ prolapse in China. Methods: The checklist of the strengthening the reporting of observational studies in epidemiology (STROBE) statement was applied to evaluate the observational studies. The articles were searched in the SinoMed database using the terms: prolapse, uterine prolapse, cystocele, rectal prolapse and pelvic floor; limited to Chinese core journals in obstetrics and gynecology from January 1996 to December 2015. With two 10-year groups (1996-2005 and 2006-2015), the χ² test was used to evaluate inter-group differences. Results: (1) A total of 386 observational studies were selected, including 15.5% (60/386) case-control studies, 80.6% (311/386) cohort studies and 3.9% (15/386) cross-sectional studies. (2) The checklist contained 22 items comprising 34 sub-items in total. Seventeen sub-items (50.0%, 17/34) had a reporting ratio below 50% across all articles, including: 1a (study design) 3.9% (15/386), 6a (participants) 24.6% (95/386), 6b (matched studies) 0 (0/386), 9 (bias) 8.3% (32/386), 10 (study size) 3.9%, 11 (quantitative variables) 41.2% (159/386), 12b-12e (statistical methods in detail) 0-2.6% (10/386), 13a (numbers of individuals at each stage of study) 18.9% (73/386), 13b (reasons for non-participation at each stage) 18.9%, 13c (flow diagram) 0, 16b and 16c (results of category boundaries and relative risk) 9.6% (37/386) and 0, 19 (limitations) 31.6% (122/386), 22 (funding) 20.5% (79/386). (3) The quality of articles published in the two decades (1996-2005 and 2006-2015) was compared, and 38.2% (13/34) of sub-items had significantly improved in the second decade (all P<0.05). 
The improved items were as follows: 1b (integrity of abstract), 2 (background/rationale), 6a (participants), 7 (variables), 8 (data sources/measurement), 9 (bias), 11 (quantitative variables), 12a (statistical methods), 17 (other analyses), 18 (key results), 19 (limitations), 21 (generalisability), 22 (funding). Conclusions: The quality of observational studies on POP in China is suboptimal on half of the evaluation items, although the quality of articles published in the second decade has significantly improved.
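
    The decade-to-decade comparisons above rest on the χ² test for 2×2 tables (item reported vs. not reported, by decade). A minimal sketch with invented counts, not the study's actual data:

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic (no continuity correction) for a 2x2
    table [[a, b], [c, d]], e.g. item reported vs. not reported in each
    decade. Counts are invented for illustration."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# hypothetical: an item reported in 30/150 papers in 1996-2005
# vs. 92/236 papers in 2006-2015
stat = chi2_2x2(30, 120, 92, 144)
improved = stat > 3.841  # critical value for df = 1 at alpha = 0.05
```

    With these made-up counts the statistic clearly exceeds the df = 1 critical value, i.e. the reporting rate differs significantly between the decades.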

  4. Inverse Abbe-method for observing small refractive index changes in liquids.

    PubMed

    Räty, Jukka; Peiponen, Kai-Erik

    2015-05-01

    This study concerns an optical method for detecting minuscule refractive index changes in the liquid phase. The proposed method reverses the operation of the traditional Abbe refractometer and thus exploits the light dispersion properties of materials, i.e., the dependence of the refractive index on the wavelength of light. In practice, the method involves recording light reflection spectra in the visible spectral range. This inverse Abbe method is suitable for liquid quality studies, e.g., monitoring water purity. Tests have shown that the method can detect NaCl or ethanol concentrations in water below the per-mil level. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. A Belgian Approach to Learning Disabilities.

    ERIC Educational Resources Information Center

    Hayes, Cheryl W.

    The paper reviews Belgian philosophy toward the education of learning disabled students and cites the differences between American behaviorally-oriented theory and Belgian emphasis on identifying the underlying causes of the disability. Academic methods observed in Belgium (including psychodrama and perceptual motor training) are discussed and are…

  6. Earth Observing System (EOS) Advanced Microwave Sounding Unit-A (AMSU-A) schedule plan

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This report describes Aerojet's methods and procedures used to control and administer contractual schedules for the EOS/AMSU-A program. Included are the following: the master, intermediate, and detail schedules; critical path analysis; and the total program logic network diagrams.

  7. Water Rockets and Indirect Measurement.

    ERIC Educational Resources Information Center

    Inman, Duane

    1997-01-01

    Describes an activity that teaches a number of scientific concepts including indirect measurement, Newton's third law of motion, manipulating and controlling variables, and the scientific method of inquiry. Uses process skills such as observation, inference, prediction, mensuration, and communication as well as problem solving and higher-order…

  8. Diurnal Motion of the Sun as Seen From Mercury

    ERIC Educational Resources Information Center

    Turner, Lawrence E., Jr.

    1978-01-01

    Two methods are described for the quantitative description of the motion of the sun as observed from Mercury. A listing of a computer subroutine is included. The combination of slow rotation and high eccentricity of Mercury's orbit makes this problem an interesting one. (BB)

  9. Maintaining Situation Awareness with Autonomous Airborne Observation Platforms

    NASA Technical Reports Server (NTRS)

    Freed, Michael; Fitzgerald, Will

    2005-01-01

    Unmanned Aerial Vehicles (UAVs) offer tremendous potential as intelligence, surveillance and reconnaissance (ISR) platforms for early detection of security threats and for acquisition and maintenance of situation awareness in crisis conditions. However, using their capabilities effectively requires addressing a range of practical and theoretical problems. The paper will describe progress by the "Autonomous Rotorcraft Project," a collaborative effort between NASA and the U.S. Army to develop a practical, flexible capability for UAV-based ISR. Important facets of the project include optimization methods for allocating scarce aircraft resources to observe numerous, distinct sites of interest; intelligent flight automation software that integrates high-level plan generation capabilities with executive control, failure response and flight control functions; a system architecture supporting reconfiguration of onboard sensors to address different kinds of threats; and an advanced prototype vehicle designed to allow large-scale production at low cost. The paper will also address human interaction issues, including an empirical method for determining how to allocate roles and responsibilities between flight automation and human operators.

  10. Introducing Research Methods to Undergraduate Majors Through an On-Campus Observatory with The University of Toledo's Ritter Observatory

    NASA Astrophysics Data System (ADS)

    Richardson, Noel; Hardegree-Ullman, Kevin; Bjorkman, Jon Eric; Bjorkman, Karen S.; Ritter Observing Team

    2017-01-01

    With a 1-m telescope on the University of Toledo (OH) main campus, we have initiated a graduate student-undergraduate partnership to help teach undergraduates observational methods and introduce them to research through peer mentorship. For the last 3 years, we have trained up to 21 undergraduates (primarily physics/astronomy majors) in a given academic semester, ranging from freshmen to seniors. Various projects are currently being conducted by undergraduate students with guidance from graduate student mentors, including constructing three-color images, observing transiting exoplanets, and determining binary star orbits from echelle spectra. This academic year we initiated a large group research project to help students learn about the databases, journal repositories, and online observing tools astronomers use for day-to-day research. We discuss the early inclusion of these students in observational astronomy and research, and the impact it has on departmental retention, undergraduate involvement, and academic success.

  11. Optimization of Exposure Time Division for Multi-object Photometry

    NASA Astrophysics Data System (ADS)

    Popowicz, Adam; Kurek, Aleksander R.

    2017-09-01

    Optical observations of wide fields of view entail the problem of selecting the best exposure time. As many objects are usually observed simultaneously, the quality of photometry of the brightest ones is always better than that of the dimmer ones, even though all of them are frequently equally interesting to astronomers. Thus, measuring all objects with the highest possible precision is desirable. In this paper, we present a new optimization algorithm, dedicated to the division of exposure time into sub-exposures, which enables photometry with a more balanced noise budget. The proposed technique increases the photometric precision of dimmer objects at the expense of the measurement fidelity of the brightest ones. We have tested the method on real observations using two telescope setups, demonstrating its usefulness and good consistency with theoretical expectations. The main application of our approach is a wide range of sky surveys, including those performed by space telescopes. The method can be used to plan virtually any photometric observation of objects that span a wide range of magnitudes.
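
    The trade-off such an algorithm exploits can be seen in a toy noise model: splitting a fixed total exposure into more sub-exposures keeps bright stars below saturation, but charges the dim stars extra read noise. A sketch under an assumed photon-plus-read-noise model (all numbers invented; this is not the paper's optimization algorithm):

```python
import math

def snr(flux, t_total, n_sub, read_noise=10.0, full_well=6.0e4):
    """Photon-limited SNR of an object of given flux (e-/s) observed for
    t_total seconds split into n_sub equal sub-exposures, each adding
    read_noise electrons of noise. Sub-exposures that would saturate
    the detector (full_well electrons) return 0. Illustrative model only."""
    t_sub = t_total / n_sub
    if flux * t_sub > full_well:          # bright star saturates the frame
        return 0.0
    signal = flux * t_total
    noise = math.sqrt(signal + n_sub * read_noise ** 2)
    return signal / noise

# a bright and a dim star, 100 s total, coarse vs. fine division
bright, dim = 5.0e3, 50.0
snr_coarse = (snr(bright, 100.0, 4), snr(dim, 100.0, 4))
snr_fine = (snr(bright, 100.0, 16), snr(dim, 100.0, 16))
```

    Here the coarse split saturates the bright star entirely, while the fine split recovers it at a modest SNR cost to the dim star; balancing this trade-off across all objects is the optimization problem the paper addresses.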

  12. Estimation of Fine Particulate Matter in Taipei Using Landuse Regression and Bayesian Maximum Entropy Methods

    PubMed Central

    Yu, Hwa-Lung; Wang, Chih-Hsih; Liu, Ming-Che; Kuo, Yi-Ming

    2011-01-01

    Fine airborne particulate matter (PM2.5) has adverse effects on human health. Assessing the long-term effects of PM2.5 exposure on human health and ecology is often limited by a lack of reliable PM2.5 measurements. In Taipei, PM2.5 levels were not systematically measured until August 2005. Due to the popularity of geographic information systems (GIS), the landuse regression method has been widely used in the spatial estimation of PM concentrations. This method accounts for the potential contributing factors of the local environment, such as traffic volume. Geostatistical methods, on the other hand, account for the spatiotemporal dependence among the observations of ambient pollutants. This study assesses the performance of the landuse regression model for the spatiotemporal estimation of PM2.5 in the Taipei area. Specifically, this study integrates the landuse regression model with the geostatistical approach within the framework of the Bayesian maximum entropy (BME) method. The resulting epistemic framework can assimilate knowledge bases including: (a) empirical-based spatial trends of PM concentration based on landuse regression, (b) the spatio-temporal dependence among PM observation information, and (c) site-specific PM observations. The proposed approach performs the spatiotemporal estimation of PM2.5 levels in the Taipei area (Taiwan) from 2005–2007. PMID:21776223

  13. Estimation of fine particulate matter in Taipei using landuse regression and bayesian maximum entropy methods.

    PubMed

    Yu, Hwa-Lung; Wang, Chih-Hsih; Liu, Ming-Che; Kuo, Yi-Ming

    2011-06-01

    Fine airborne particulate matter (PM2.5) has adverse effects on human health. Assessing the long-term effects of PM2.5 exposure on human health and ecology is often limited by a lack of reliable PM2.5 measurements. In Taipei, PM2.5 levels were not systematically measured until August 2005. Due to the popularity of geographic information systems (GIS), the landuse regression method has been widely used in the spatial estimation of PM concentrations. This method accounts for the potential contributing factors of the local environment, such as traffic volume. Geostatistical methods, on the other hand, account for the spatiotemporal dependence among the observations of ambient pollutants. This study assesses the performance of the landuse regression model for the spatiotemporal estimation of PM2.5 in the Taipei area. Specifically, this study integrates the landuse regression model with the geostatistical approach within the framework of the Bayesian maximum entropy (BME) method. The resulting epistemic framework can assimilate knowledge bases including: (a) empirical-based spatial trends of PM concentration based on landuse regression, (b) the spatio-temporal dependence among PM observation information, and (c) site-specific PM observations. The proposed approach performs the spatiotemporal estimation of PM2.5 levels in the Taipei area (Taiwan) from 2005-2007.
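
    The land-use regression step is, at its core, a multiple linear regression of observed PM2.5 on local covariates. A self-contained OLS sketch via the normal equations (covariate names and numbers invented for illustration, and no geostatistical/BME component):

```python
def ols(X, y):
    """Ordinary least squares via the normal equations, solved with
    Gaussian elimination and partial pivoting. X rows are
    [1, covariate_1, covariate_2, ...]."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yy for r, yy in zip(X, y)) for i in range(k)]
    for col in range(k):                      # forward elimination
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k                          # back substitution
    for i in reversed(range(k)):
        beta[i] = (b[i] - sum(A[i][j] * beta[j]
                              for j in range(i + 1, k))) / A[i][i]
    return beta

# invented covariates: traffic volume and road density -> PM2.5 level
X = [[1, t, d] for t, d in [(2, 1), (4, 3), (6, 2), (8, 5), (10, 4)]]
y = [12.0, 17.0, 19.0, 25.0, 26.0]
beta = ols(X, y)   # [intercept, traffic coefficient, density coefficient]
```

    The BME framework in the paper then treats this regression surface as one knowledge base among several, rather than as the final estimate.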

  14. Random forests for classification in ecology

    USGS Publications Warehouse

    Cutler, D.R.; Edwards, T.C.; Beard, K.H.; Cutler, A.; Hess, K.T.; Gibson, J.; Lawler, J.J.

    2007-01-01

    Classification procedures are some of the most widely used statistical methods in ecology. Random forests (RF) is a new and powerful statistical classifier that is well established in other disciplines but is relatively unknown in ecology. Advantages of RF compared to other statistical classifiers include (1) very high classification accuracy; (2) a novel method of determining variable importance; (3) ability to model complex interactions among predictor variables; (4) flexibility to perform several types of statistical data analysis, including regression, classification, survival analysis, and unsupervised learning; and (5) an algorithm for imputing missing values. We compared the accuracies of RF and four other commonly used statistical classifiers using data on invasive plant species presence in Lava Beds National Monument, California, USA, rare lichen species presence in the Pacific Northwest, USA, and nest sites for cavity nesting birds in the Uinta Mountains, Utah, USA. We observed high classification accuracy in all applications as measured by cross-validation and, in the case of the lichen data, by independent test data, when comparing RF to other common classification methods. We also observed that the variables that RF identified as most important for classifying invasive plant species coincided with expectations based on the literature. ?? 2007 by the Ecological Society of America.
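
    The random-forest recipe (bootstrap resampling, random feature subsets, majority vote) can be shrunk to one-level trees for illustration. A hypothetical minimal sketch of that recipe, not the Breiman implementation the paper evaluates:

```python
import random

def stump_fit(X, y, feat_candidates):
    """Best threshold split over a given subset of features (1-level tree).
    Returns (feature, threshold, left_label, right_label)."""
    best = None
    for f in feat_candidates:
        for thr in sorted({row[f] for row in X}):
            left = [yy for row, yy in zip(X, y) if row[f] <= thr]
            right = [yy for row, yy in zip(X, y) if row[f] > thr]
            if not left or not right:
                continue
            lmaj = max(set(left), key=left.count)
            rmaj = max(set(right), key=right.count)
            err = sum(v != lmaj for v in left) + sum(v != rmaj for v in right)
            if best is None or err < best[0]:
                best = (err, f, thr, lmaj, rmaj)
    return best[1:]

def forest_fit(X, y, n_trees=25, seed=0):
    """Bootstrap samples + random feature subsets + majority vote:
    the core random-forest idea, shrunk to one-level trees."""
    rng = random.Random(seed)
    n, k = len(X), len(X[0])
    trees = []
    for _ in range(n_trees):
        idx = [rng.randrange(n) for _ in range(n)]            # bootstrap
        feats = rng.sample(range(k), max(1, int(k ** 0.5)))   # feature subset
        trees.append(stump_fit([X[i] for i in idx], [y[i] for i in idx], feats))
    return trees

def forest_predict(trees, row):
    votes = [(lm if row[f] <= thr else rm) for f, thr, lm, rm in trees]
    return max(set(votes), key=votes.count)

# toy presence/absence data: both invented features separate the classes
X = [[i, 2 * i] for i in range(10)]
y = [0] * 5 + [1] * 5
trees = forest_fit(X, y)
preds = [forest_predict(trees, row) for row in X]
```

    Real applications, like those in the paper, additionally use deep trees, out-of-bag error estimates, and permutation-based variable importance.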

  15. Flexible methods for segmentation evaluation: Results from CT-based luggage screening

    PubMed Central

    Karimi, Seemeen; Jiang, Xiaoqian; Cosman, Pamela; Martz, Harry

    2017-01-01

    BACKGROUND Imaging systems used in aviation security include segmentation algorithms in an automatic threat recognition pipeline. The segmentation algorithms evolve in response to emerging threats and changing performance requirements. Analysis of segmentation algorithms’ behavior, including the nature of errors and feature recovery, facilitates their development. However, evaluation methods from the literature provide limited characterization of the segmentation algorithms. OBJECTIVE To develop segmentation evaluation methods that measure systematic errors such as oversegmentation and undersegmentation, outliers, and overall errors. The methods must measure feature recovery and allow us to prioritize segments. METHODS We developed two complementary evaluation methods using statistical techniques and information theory. We also created a semi-automatic method to define ground truth from 3D images. We applied our methods to evaluate five segmentation algorithms developed for CT luggage screening. We validated our methods with synthetic problems and an observer evaluation. RESULTS Both methods selected the same best segmentation algorithm. Human evaluation confirmed the findings. The measurement of systematic errors and prioritization helped in understanding the behavior of each segmentation algorithm. CONCLUSIONS Our evaluation methods allow us to measure and explain the accuracy of segmentation algorithms. PMID:24699346
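
    Over- and undersegmentation can be counted from the overlap table between ground-truth and predicted labels. A simple overlap-fraction heuristic on a 1-D toy labelling (an illustrative stand-in, not the statistical and information-theoretic measures developed in the paper):

```python
from collections import defaultdict

def over_under_counts(gt, pred, min_frac=0.1):
    """Count oversegmented ground-truth segments (split across several
    predicted segments) and undersegmenting predicted segments (covering
    several ground-truth segments), ignoring overlaps smaller than
    min_frac of the segment."""
    overlap = defaultdict(int)
    size_gt = defaultdict(int)
    size_pr = defaultdict(int)
    for g, p in zip(gt, pred):
        overlap[(g, p)] += 1
        size_gt[g] += 1
        size_pr[p] += 1
    over = sum(1 for g in size_gt
               if sum(1 for (gg, p), c in overlap.items()
                      if gg == g and c >= min_frac * size_gt[g]) > 1)
    under = sum(1 for p in size_pr
                if sum(1 for (g, pp), c in overlap.items()
                       if pp == p and c >= min_frac * size_pr[p]) > 1)
    return over, under

# toy 1-D "image": ground truth has segments A and B;
# the prediction splits A into two pieces and recovers B intact
gt = list("AAAAABBBBB")
pred = list("1122233333")
over, under = over_under_counts(gt, pred)
```

    On this toy labelling the heuristic flags one oversegmented ground-truth segment and no undersegmentation.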

  16. Using instrumental variables to estimate a Cox's proportional hazards regression subject to additive confounding

    PubMed Central

    Tosteson, Tor D.; Morden, Nancy E.; Stukel, Therese A.; O'Malley, A. James

    2014-01-01

    The estimation of treatment effects is one of the primary goals of statistics in medicine. Estimation based on observational studies is subject to confounding. Statistical methods for controlling bias due to confounding include regression adjustment, propensity scores and inverse probability weighted estimators. These methods require that all confounders are recorded in the data. The method of instrumental variables (IVs) can eliminate bias in observational studies even in the absence of information on confounders. We propose a method for integrating IVs within the framework of Cox's proportional hazards model and demonstrate the conditions under which it recovers the causal effect of treatment. The methodology is based on the approximate orthogonality of an instrument with unobserved confounders among those at risk. We derive an estimator as the solution to an estimating equation that resembles the score equation of the partial likelihood in much the same way as the traditional IV estimator resembles the normal equations. To justify this IV estimator for a Cox model we perform simulations to evaluate its operating characteristics. Finally, we apply the estimator to an observational study of the effect of coronary catheterization on survival. PMID:25506259
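
    The orthogonality idea behind IV estimation is easiest to see in the linear analogue, where the causal slope is cov(z, w)/cov(z, x). A synthetic sketch (a linear model, not the paper's Cox estimating equation; all data are simulated):

```python
import random

def iv_slope(z, x, w):
    """Classic linear instrumental-variable (Wald) estimator:
    beta = cov(z, w) / cov(z, x). Consistent because the instrument z
    is independent of the unobserved confounder."""
    n = len(z)
    mz, mx, mw = sum(z) / n, sum(x) / n, sum(w) / n
    czw = sum((a - mz) * (b - mw) for a, b in zip(z, w))
    czx = sum((a - mz) * (b - mx) for a, b in zip(z, x))
    return czw / czx

rng = random.Random(1)
n = 20000
u = [rng.gauss(0, 1) for _ in range(n)]                    # unobserved confounder
z = [rng.gauss(0, 1) for _ in range(n)]                    # instrument
x = [zi + ui + rng.gauss(0, 1) for zi, ui in zip(z, u)]    # confounded treatment
w = [2.0 * xi + 3.0 * ui + rng.gauss(0, 1)                 # outcome, true effect 2
     for xi, ui in zip(x, u)]

beta_iv = iv_slope(z, x, w)
# naive OLS slope cov(x, w) / var(x) is biased upward by the confounder
mx, mw = sum(x) / n, sum(w) / n
beta_ols = (sum((a - mx) * (b - mw) for a, b in zip(x, w))
            / sum((a - mx) ** 2 for a in x))
```

    The IV estimate recovers the true effect of 2 while the naive slope is pulled toward 3 by the confounder; the paper embeds the same orthogonality principle in the partial-likelihood score of a Cox model.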

  17. Using instrumental variables to estimate a Cox's proportional hazards regression subject to additive confounding.

    PubMed

    MacKenzie, Todd A; Tosteson, Tor D; Morden, Nancy E; Stukel, Therese A; O'Malley, A James

    2014-06-01

    The estimation of treatment effects is one of the primary goals of statistics in medicine. Estimation based on observational studies is subject to confounding. Statistical methods for controlling bias due to confounding include regression adjustment, propensity scores and inverse probability weighted estimators. These methods require that all confounders are recorded in the data. The method of instrumental variables (IVs) can eliminate bias in observational studies even in the absence of information on confounders. We propose a method for integrating IVs within the framework of Cox's proportional hazards model and demonstrate the conditions under which it recovers the causal effect of treatment. The methodology is based on the approximate orthogonality of an instrument with unobserved confounders among those at risk. We derive an estimator as the solution to an estimating equation that resembles the score equation of the partial likelihood in much the same way as the traditional IV estimator resembles the normal equations. To justify this IV estimator for a Cox model we perform simulations to evaluate its operating characteristics. Finally, we apply the estimator to an observational study of the effect of coronary catheterization on survival.

  18. Experience of treatment of patients with granulomatous lobular mastitis

    PubMed Central

    Hur, Sung Mo; Cho, Dong Hui; Lee, Se Kyung; Choi, Min-Young; Bae, Soo Youn; Koo, Min Young; Kim, Sangmin; Choe, Jun-Ho; Kim, Jung-Han; Kim, Jee Soo; Nam, Seok-Jin; Yang, Jung-Hyun

    2013-01-01

    Purpose To present the authors' experience with various treatment methods for granulomatous lobular mastitis (GLM) and to determine effective treatment methods for GLM. Methods Fifty patients who were diagnosed with GLM were classified into five groups based on the initial treatment methods they underwent, which included observation (n = 8), antibiotics (n = 3), steroid (n = 13), drainage (n = 14), and surgical excision (n = 12). The treatment processes in each group were examined and their clinical characteristics, treatment processes, and results were analyzed respectively. Results Success rates with each initial treatment were: observation, 87.5%; antibiotics, 33.3%; steroids, 30.8%; drainage, 28.6%; and surgical excision, 91.7%. In most cases of observation, the lesions were small and the symptoms were mild. A total of 23 patients underwent surgical excision during treatment. Surgical excision showed particularly fast recovery, a high success rate (90.3%), and a low recurrence rate (8.7%). Conclusion The clinical course of GLM is complex and the outcomes of each treatment type are variable. Surgery may play an important role when a lesion is determined to be mass-forming or appears localized as an abscess pocket during breast examination or imaging study. PMID:23833753

  19. Gingival Retraction Methods: A Systematic Review.

    PubMed

    Tabassum, Sadia; Adnan, Samira; Khan, Farhan Raza

    2017-12-01

    The aim of this systematic review was to assess gingival retraction methods in terms of the amount of gingival retraction achieved and the changes observed in various clinical parameters: gingival index (GI), plaque index (PI), probing depth (PD), and attachment loss (AL). Data sources included three major databases, PubMed, CINAHL plus (Ebsco), and Cochrane, along with hand search. The search was made using the key terms in different permutations of gingival retraction* AND displacement method* OR technique* OR agents OR material* OR medicament*. The initial search yielded 145 articles, which were narrowed down to 10 articles using strict eligibility criteria: clinical trials or experimental studies on gingival retraction methods, conducted on human permanent teeth only, with the amount of tooth structure gained and assessment of clinical parameters as the outcomes. Gingival retraction was measured in 6/10 studies, whereas the clinical parameters were assessed in 5/10 studies. The total number of teeth assessed in the 10 included studies was 400. The most common method used for gingival retraction was chemomechanical. The results were heterogeneous with regard to the outcome variables. No method seemed to be significantly superior to the others in terms of gingival retraction achieved. Clinical parameters were not significantly affected by the gingival retraction method. © 2016 by the American College of Prosthodontists.

  20. Explanation and Elaboration Document for the STROBE-Vet Statement: Strengthening the Reporting of Observational Studies in Epidemiology-Veterinary Extension.

    PubMed

    O'Connor, A M; Sargeant, J M; Dohoo, I R; Erb, H N; Cevallos, M; Egger, M; Ersbøll, A K; Martin, S W; Nielsen, L R; Pearl, D L; Pfeiffer, D U; Sanchez, J; Torrence, M E; Vigre, H; Waldner, C; Ward, M P

    2016-11-01

    The STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) statement was first published in 2007 and again in 2014. The purpose of the original STROBE was to provide guidance for authors, reviewers, and editors to improve the comprehensiveness of reporting; however, STROBE has a unique focus on observational studies. Although much of the guidance provided by the original STROBE document is directly applicable, it was deemed useful to map those statements to veterinary concepts, provide veterinary examples, and highlight unique aspects of reporting in veterinary observational studies. Here, we present the examples and explanations for the checklist items included in the STROBE-Vet statement. Thus, this is a companion document to the STROBE-Vet statement methods and process document (JVIM_14575 "Methods and Processes of Developing the Strengthening the Reporting of Observational Studies in Epidemiology-Veterinary (STROBE-Vet) Statement" undergoing proofing), which describes the checklist and how it was developed. Copyright © 2016 The Authors. Journal of Veterinary Internal Medicine published by Wiley Periodicals, Inc. on behalf of the American College of Veterinary Internal Medicine.

  1. Using Simulations to Investigate Decision Making in Airline Operations

    NASA Technical Reports Server (NTRS)

    Bruce, Peter J.; Gray, Judy H.

    2003-01-01

    This paper examines a range of methods for collecting data to investigate decision making in airline Operations Control Centres (OCCs). A study was conducted of 52 controllers in five OCCs of both domestic and international airlines in the Asia-Pacific region. The methods used included surveys, interviews, observations, simulations, and a think-aloud protocol. The paper compares and evaluates the suitability of these techniques for gathering data and provides recommendations on the application of simulations. Keywords: Data Collection, Decision-Making, Research Methods, Simulation, Think-Aloud Protocol.

  2. The use of photogrammetric and stereophotogrammetric methods in aerodynamic experiments

    NASA Astrophysics Data System (ADS)

    Shmyreva, V. N.; Iakovlev, V. A.

    The possibilities afforded by photogrammetry and stereophotogrammetry in current aerodynamic experiments, methods of image recording, and observation data processing are briefly reviewed. Some specific experiments illustrating the application of stereophotogrammetry are described. The applications discussed include the monitoring of model position in wind tunnels, determination of model deformations and displacements, determination of the deformations of real structural elements in static strength tests, and solution of a variety of problems in hydrodynamics.

  3. A data model for environmental scientists

    NASA Astrophysics Data System (ADS)

    Kapeljushnik, O.; Beran, B.; Valentine, D.; van Ingen, C.; Zaslavsky, I.; Whitenack, T.

    2008-12-01

    Environmental science encompasses a wide range of disciplines, from water chemistry to microbiology, ecology, and atmospheric sciences. Studies often require working across disciplines that differ in their ways of describing and storing data, such that it is not possible to devise a monolithic one-size-fits-all data solution. Based on our experiences with the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) Observations Data Model and the Berkeley Water Center FLUXNET carbon-climate work, and by examining standards such as the EPA's Water Quality Exchange (WQX), we have developed a flexible data model that allows extensions without altering the schema, so that scientists can define custom metadata elements to describe their data, including observations and analysis methods as well as sensors and geographical features. The data model supports various types of observations, including fixed-point and moving sensors, bottled samples, rasters from remote sensors and models, and categorical descriptions (e.g., taxonomy), by employing user-defined types when necessary. It leverages the ADO.NET Entity Framework to provide semantic data models for differing disciplines while maintaining a common schema below the entity layer. This abstraction layer simplifies data retrieval and manipulation by hiding the logic and complexity of the relational schema from users, thus allowing programmers and scientists to deal directly with objects such as observations, sensors, watersheds, river reaches, channel cross-sections, laboratory analysis methods, and samples, as opposed to table joins, columns, and rows.

  4. Overview of clinical research design.

    PubMed

    Hartung, Daniel M; Touchette, Daniel

    2009-02-15

    Basic concepts and terminology of clinical research design are presented for new clinical investigators. Clinical research, i.e., research involving human subjects, can be described as either observational or experimental. The findings of all clinical research can be threatened by bias and confounding. Biases are systematic errors in how study subjects are selected or measured that result in false inferences. Confounding is a distortion in findings attributable to the mixing of the effects of different variables. Uncontrolled observational research is generally more prone to bias and confounding than experimental research. Observational research includes designs such as the cohort study, case-control study, and cross-sectional study, while experimental research typically involves a randomized controlled trial (RCT). The cohort study, which includes the RCT, defines subject allocation on the basis of the exposure of interest (e.g., drug, disease-management program) and follows the patients to assess the outcomes. The case-control study uses the primary outcome of interest (e.g., adverse event) to define subject allocation, and different exposures are assessed in a retrospective manner. Cross-sectional research evaluates both exposure and outcome concurrently. Each of these designs possesses different strengths and weaknesses in answering research questions, and each underlies many study subtypes. While experimental research is the strongest method for establishing causality, it can be difficult to accomplish under many scenarios. Observational clinical research offers many design alternatives that may be appropriate if planned and executed carefully.

  5. Kinematic and Hydrometeor Data Products from Scanning Radars during MC3E

    DOE Data Explorer

    Matthews, Alyssa; Dolan, Brenda; Rutledge, Steven

    2016-02-29

    Recently, the Radar Meteorology Group at Colorado State University completed major case studies of some of the top cases from MC3E, including 25 April, 20 May, and 23 May 2011. A discussion of the analysis methods as well as the radar quality control methods is included. For each case, a brief overview is first provided. Then, multiple-Doppler analyses (using available X-SAPR and C-SAPR data) are presented, including statistics on vertical air motions subdivided by convective and stratiform precipitation. Mean profiles and CFADs of vertical motion are included to facilitate comparison with ASR model simulations. Retrieved vertical motion has also been verified with vertically pointing profiler data. Finally, for each case, hydrometeor types derived from polarimetric radar observations are included; these can be used for comparison with model-generated hydrometeor fields. Instructions for accessing all the data fields are also included. The web page can be found at: http://radarmet.atmos.colostate.edu/mc3e/research/

  6. Simulation of fine organic aerosols in the western Mediterranean area during the ChArMEx 2013 summer campaign

    NASA Astrophysics Data System (ADS)

    Cholakian, Arineh; Beekmann, Matthias; Colette, Augustin; Coll, Isabelle; Siour, Guillaume; Sciare, Jean; Marchand, Nicolas; Couvidat, Florian; Pey, Jorge; Gros, Valerie; Sauvage, Stéphane; Michoud, Vincent; Sellegri, Karine; Colomb, Aurélie; Sartelet, Karine; Langley DeWitt, Helen; Elser, Miriam; Prévot, André S. H.; Szidat, Sonke; Dulac, François

    2018-05-01

    The simulation of fine organic aerosols with CTMs (chemistry-transport models) in the western Mediterranean basin has not been studied until recently. The ChArMEx (Chemistry-Aerosol Mediterranean Experiment) SOP 1b (Special Observation Period 1b) intensive field campaign in the summer of 2013 gathered a large and comprehensive data set of observations, allowing the study of different aspects of the Mediterranean atmosphere, including the formation of organic aerosols (OAs) in 3-D models. In this study, we used the CHIMERE CTM to perform simulations for the SAFMED (Secondary Aerosol Formation in the MEDiterranean) period (July to August 2013) of this campaign. In particular, we evaluated four schemes for the simulation of OA: the CHIMERE standard scheme, the VBS (volatility basis set) standard scheme with two parameterizations including aging of biogenic secondary OA, and a modified version of the VBS scheme which includes fragmentation and formation of nonvolatile OA. The results from these four schemes are compared to observations at two stations in the western Mediterranean basin, located at Ersa, Cap Corse (Corsica, France), and at Cap Es Pinar (Mallorca, Spain). These observations include OA mass concentration, PMF (positive matrix factorization) results for different OA fractions, and 14C observations showing the fossil or nonfossil origins of carbonaceous particles. Because of the complex orography of the Ersa site, an original method for calculating an orographic representativeness error (ORE) has been developed. It is concluded that the modified VBS scheme is close to observations in all three aspects mentioned above; the standard VBS scheme without BSOA (biogenic secondary organic aerosol) aging also performs satisfactorily in simulating the mass concentration of OA, but not in the source origin comparisons. In addition, the OA sources over the western Mediterranean basin are explored. OA shows a major biogenic origin, especially several hundred meters above the surface; however, over the Gulf of Genoa near the surface, the anthropogenic contribution is of similar importance. A general assessment of other species was performed to evaluate the robustness of the simulations for this particular domain before evaluating the OA simulation schemes. It is also shown that the Cap Corse site presents important orographic complexity, which makes comparison between model simulations and observations difficult. The method designed to estimate the orographic representativeness error for species measured at Ersa yields an uncertainty of between 50 and 85% for primary pollutants and around 2-10% for secondary species.

  7. Practical Entanglement Estimation for Spin-System Quantum Simulators.

    PubMed

    Marty, O; Cramer, M; Plenio, M B

    2016-03-11

    We present practical methods to measure entanglement for quantum simulators that can be realized with trapped ions, cold atoms, and superconducting qubits. Focusing on long- and short-range Ising-type Hamiltonians, we introduce schemes that are applicable under realistic experimental conditions including mixedness due to, e.g., noise or temperature. In particular, we identify a single observable whose expectation value serves as a lower bound to entanglement and that may be obtained by a simple quantum circuit. As such circuits are not (yet) available for every platform, we investigate the performance of routinely measured observables as quantitative entanglement witnesses. Possible applications include experimental studies of entanglement scaling in critical systems and the reliable benchmarking of quantum simulators.

  8. The effect of urinary cadmium on cardiovascular fitness as measured by VO{sub 2} max in white, black and Mexican Americans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Egwuogu, Heartley; Shendell, Derek G.; Department of Environmental and Occupational Health, University of Medicine and Dentistry of New Jersey

    Objectives: We explored potential effects of cadmium exposure on cardiovascular fitness measures, including gender and racial/ethnic differences. Methods: Data were from the 1999 to 2000 National Health and Nutrition Examination Survey (NHANES); 1963 participating subjects were included in our analysis. The volume of oxygen consumed at sub-maximal activity (VO{sub 2} max) was recorded in a series of graded exercises; the goal was to elicit 75% of predetermined age-specific heart rates. Cadmium from urine samples was measured in the laboratory using standard methods. Multivariate linear regression analyses were performed to determine potential relationships. Results: Increased urinary cadmium concentrations were generally associated with decreased estimated VO{sub 2} max values. Gender and racial/ethnic differences were also observed. Specifically, associations were statistically significant for white males and Mexican American females. Conclusion: Inverse associations between urinary cadmium concentrations and estimated VO{sub 2} max values were observed, including racial and gender differences. The implications of such gender and racial/ethnic differences for the long-term cardiovascular health and health disparities of present public health concern warrant further investigation.

  9. TU-FG-209-11: Validation of a Channelized Hotelling Observer to Optimize Chest Radiography Image Processing for Nodule Detection: A Human Observer Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanchez, A; Little, K; Chung, J

    Purpose: To validate the use of a Channelized Hotelling Observer (CHO) model for guiding image processing parameter selection and enable improved nodule detection in digital chest radiography. Methods: In a previous study, an anthropomorphic chest phantom was imaged with and without PMMA simulated nodules using a GE Discovery XR656 digital radiography system. The impact of image processing parameters was then explored using a CHO with 10 Laguerre-Gauss channels. In this work, we validate the CHO's trend in nodule detectability as a function of two processing parameters by conducting a signal-known-exactly, multi-reader multi-case (MRMC) ROC observer study. Five naive readers scored confidence of nodule visualization in 384 images with 50% nodule prevalence. The image backgrounds were regions of interest extracted from 6 normal patient scans, and the digitally inserted simulated nodules were obtained from phantom data in previous work. Each patient image was processed with both a near-optimal and a worst-case parameter combination, as determined by the CHO for nodule detection. The same 192 ROIs were used for each image processing method, with 32 randomly selected lung ROIs per patient image. Finally, the MRMC data were analyzed using the freely available iMRMC software of Gallas et al. Results: The image processing parameters optimized for the CHO led to a statistically significant improvement (p=0.049) in human observer AUC, from 0.78 to 0.86, relative to the image processing implementation that produced the lowest CHO performance. Conclusion: Differences in user-selectable image processing methods on a commercially available digital radiography system were shown to have a marked impact on the performance of human observers in the task of lung nodule detection. Further, the effect of processing on humans was similar to the effect on CHO performance. Future work will expand this study to include a wider range of detection/classification tasks and more observers, including experienced chest radiologists.
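The channelized Hotelling observer used in this study can be sketched compactly. The following is an illustrative implementation, not the authors' code: it builds rotationally symmetric Laguerre-Gauss channels (the channel width `a`, channel count, and the simulated nodule in the usage below are assumptions) and computes the Hotelling detectability index in channel space.

```python
import numpy as np

def laguerre(n, x):
    """Laguerre polynomial L_n(x) via the standard three-term recurrence."""
    l_prev, l_curr = np.ones_like(x), 1.0 - x
    if n == 0:
        return l_prev
    for k in range(1, n):
        l_prev, l_curr = l_curr, ((2 * k + 1 - x) * l_curr - k * l_prev) / (k + 1)
    return l_curr

def lg_channels(size, a, n_channels):
    """Rotationally symmetric Laguerre-Gauss channels on a size x size grid."""
    yy, xx = np.indices((size, size)) - (size - 1) / 2.0
    g = 2 * np.pi * (xx**2 + yy**2) / a**2
    chans = [np.exp(-g / 2) * laguerre(n, g) for n in range(n_channels)]
    # columns are unit-norm channel vectors, shape (size^2, n_channels)
    return np.stack([c.ravel() / np.linalg.norm(c) for c in chans], axis=1)

def cho_detectability(signal_imgs, background_imgs, U):
    """Hotelling SNR in channel space: d = sqrt(dv^T S^-1 dv)."""
    vs = signal_imgs.reshape(len(signal_imgs), -1) @ U       # channelized signal-present
    vb = background_imgs.reshape(len(background_imgs), -1) @ U
    dv = vs.mean(axis=0) - vb.mean(axis=0)
    S = 0.5 * (np.cov(vs.T) + np.cov(vb.T))                  # pooled channel covariance
    return float(np.sqrt(dv @ np.linalg.solve(S, dv)))
```

On white-noise backgrounds with a Gaussian nodule added, the detectability of signal-present versus signal-absent ensembles is well above the no-signal baseline.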

  10. [Combination of endoscopic methods in diagnostics and surgical treatment of perforative duodenal ulcer].

    PubMed

    Timofeev, M E; Shapoval'yants, S G; Mikhalev, A I; Fedorov, E D; Konyukhov, G V

    2016-01-01

    To present the results of surgical management of perforated duodenal ulcer using a combination of endoscopic methods. The study included 279 patients with perforated duodenal ulcer who were operated on between 1996 and 2012. Diagnostics and treatment tactics were based on an algorithm developed in our clinic that includes the use of both esophagogastroduodenoscopy and laparoscopy. The presented technique confirmed the correct diagnosis and defined the treatment tactics and choice of surgery in 100% of cases. Sixty-seven patients had contraindications for laparoscopic suturing and underwent conventional operations; in this group, postoperative complications and deaths were observed in 25 (37.3%) and 9 (13.4%) patients, respectively. Laparoscopic suturing was performed in 212 patients. Complications were diagnosed in 19 (8.9%) cases, including 8 (3.7%) intraoperative and 11 (5.2%) postoperative. There were no deaths in this group.

  11. Using ACE Observations of Interplanetary Particles and Magnetic Fields as Possible Contributors to Variations Observed at Van Allen Probes during Major events in 2013

    NASA Astrophysics Data System (ADS)

    Armstrong, T. P.; Manweiler, J. W.; Gerrard, A. J.; Gkioulidou, M.; Lanzerotti, L. J.; Patterson, J. D.

    2013-12-01

    Observations from ACE EPAM, including energy spectra of protons, helium, and oxygen, will be prepared for coordinated use in estimating the direct and indirect access of energetic particles to the inner and outer geomagnetic trapping zones. Complete temporal coverage from ACE at 12-second, 5-minute, 17-minute, hourly, and daily cadences will be used to catalog interplanetary events arriving at Earth, including interplanetary magnetic field sector boundaries, interplanetary shocks, and interplanetary coronal mass ejections (ICMEs). The first 6 months of 2013 included both highly disturbed times (March 17 and May 22) and extended quiet periods with little or no variation. Among the specific questions that coordinated ACE and Van Allen Probes observations may aid in resolving are: 1. How much, if any, direct capture of interplanetary energetic particles occurs, and what conditions account for it? 2. How much influence do interplanetary field and particle variations have on energization and/or loss of geomagnetically trapped populations? The poster will also present important links and describe methods and details of access to numerically expressed ACE EPAM and Van Allen Probes RBSPICE observations, which can be flexibly and easily accessed via the internet by students and senior researchers.

  12. Positions of minor planets and Comet Panther (1980 u) obtained at the Chorzow Observatory

    NASA Astrophysics Data System (ADS)

    Wlodarczyk, I.

    Photographic observations of 17 asteroids and Comet Panther were made between 1977 and 1982 with a 200/1000 mm photographic camera coupled to a 300/4500 mm refractor. The Turner method with the complete second-order polynomial was used to reduce the 16 x 16 cm ORWO ZU-2 plates that were obtained. The tabulated information for each asteroid and the comet includes the number of the observation, the time of the observation in Universal Time, the topocentric position of the object referred to the mean epoch 1950.0, the dispersions in right ascension and declination, the duration of the exposure in minutes, and the symbol of the observer. Ten observers participated in the program.
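The Turner reduction with a complete second-order polynomial amounts to a linear least-squares fit of quadratic plate constants mapping measured plate coordinates to standard coordinates. A minimal sketch with synthetic reference stars (not the observatory's actual reduction software; the coefficients below are hypothetical):

```python
import numpy as np

def turner_design(x, y):
    """Complete second-order polynomial terms: 1, x, y, x^2, xy, y^2."""
    return np.column_stack([np.ones_like(x), x, y, x * x, x * y, y * y])

def fit_plate_constants(x, y, xi, eta):
    """Least-squares plate constants mapping measured (x, y) to standard (xi, eta)."""
    A = turner_design(x, y)
    cx, *_ = np.linalg.lstsq(A, xi, rcond=None)
    cy, *_ = np.linalg.lstsq(A, eta, rcond=None)
    return cx, cy

def apply_plate_constants(x, y, cx, cy):
    """Evaluate the fitted plate solution at new measured positions."""
    A = turner_design(x, y)
    return A @ cx, A @ cy
```

With reference stars from a known quadratic mapping and no measurement noise, the fit recovers the plate constants exactly (to rounding).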

  13. A tri-modality image fusion method for target delineation of brain tumors in radiotherapy.

    PubMed

    Guo, Lu; Shen, Shuming; Harris, Eleanor; Wang, Zheng; Jiang, Wei; Guo, Yu; Feng, Yuanming

    2014-01-01

    To develop a tri-modality image fusion method for better target delineation in image-guided radiotherapy for patients with brain tumors. A new method of tri-modality image fusion was developed that can fuse and display all image sets in one panel with one operation, and a feasibility study of gross tumor volume (GTV) delineation was conducted using data from three patients with brain tumors, including simulation CT, MRI, and 18F-fluorodeoxyglucose positron emission tomography (18F-FDG PET) examinations before radiotherapy. Tri-modality image fusion was implemented after image registrations of CT+PET and CT+MRI, and the transparency weight of each modality could be adjusted and set by users. Three radiation oncologists delineated GTVs for all patients using dual-modality (MRI/CT) and tri-modality (MRI/CT/PET) image fusion, respectively. Inter-observer variation was assessed by the coefficient of variation (COV), the average distance between surface and centroid (ADSC), and the local standard deviation (SDlocal). Analysis of COV was also performed to evaluate intra-observer volume variation. The inter-observer variation analysis showed that the mean COV was 0.14 (±0.09) for dual-modality and 0.07 (±0.01) for tri-modality; the standard deviation of ADSC was significantly reduced (p<0.05) with tri-modality; and SDlocal averaged over the median GTV surface was reduced in patient 2 (from 0.57 cm to 0.39 cm) and patient 3 (from 0.42 cm to 0.36 cm) with the new method. Intra-observer volume variation was also significantly reduced (p = 0.00) with the tri-modality method compared with the dual-modality method. With the new tri-modality image fusion method, smaller inter- and intra-observer variation in GTV definition for brain tumors can be achieved, improving the consistency and accuracy of target delineation in individualized radiotherapy.
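The core blending operation, combining co-registered slices with user-adjustable transparency weights, can be sketched as follows. The per-modality normalization and default weights are illustrative assumptions; the paper's tool additionally handles registration and interactive display.

```python
import numpy as np

def normalize(img):
    """Scale an image to [0, 1] for display blending."""
    img = img.astype(float)
    lo, hi = img.min(), img.max()
    return (img - lo) / (hi - lo) if hi > lo else np.zeros_like(img)

def fuse_trimodality(ct, mri, pet, w_ct=0.4, w_mri=0.4, w_pet=0.2):
    """Blend three co-registered slices with user-set transparency weights."""
    w = np.array([w_ct, w_mri, w_pet], float)
    w = w / w.sum()  # weights act like per-modality opacity and sum to 1
    return w[0] * normalize(ct) + w[1] * normalize(mri) + w[2] * normalize(pet)
```

Because the weights are normalized and each input is scaled to [0, 1], the fused display also stays in [0, 1].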

  14. Perspectives on Social Network Analysis for Observational Scientific Data

    NASA Astrophysics Data System (ADS)

    Singh, Lisa; Bienenstock, Elisa Jayne; Mann, Janet

    This chapter is a conceptual look at data quality issues that arise during scientific observations and their impact on social network analysis. We provide examples of the many types of incompleteness, bias and uncertainty that impact the quality of social network data. Our approach is to leverage the insights and experience of observational behavioral scientists familiar with the challenges of making inference when data are not complete, and suggest avenues for extending these to relational data questions. The focus of our discussion is on network data collection using observational methods because they contain high dimensionality, incomplete data, varying degrees of observational certainty, and potential observer bias. However, the problems and recommendations identified here exist in many other domains, including online social networks, cell phone networks, covert networks, and disease transmission networks.

  15. A new FIB fabrication method for micropillar specimens for three-dimensional observation using scanning transmission electron microscopy.

    PubMed

    Fukuda, Muneyuki; Tomimatsu, Satoshi; Nakamura, Kuniyasu; Koguchi, Masanari; Shichi, Hiroyasu; Umemura, Kaoru

    2004-01-01

    A new method to prepare micropillar specimens with the high aspect ratio suitable for three-dimensional scanning transmission electron microscopy (3D-STEM) was developed. The key features of the micropillar fabrication are, first, microsampling to extract a small piece including the structure of interest from an IC chip and, second, an ion beam with an incident direction of 60 degrees to the pillar's axis, which enables the parallel sidewalls of the pillar to be produced with a high aspect ratio. A memory-cell structure (length: 6 µm; width: 300 x 500 nm) was fabricated in the micropillar and observed from various directions with a 3D-STEM. A planiform capacitor covered with granular surfaces, as well as a solid crossing gate and metal lines, was successfully observed three-dimensionally at a resolution of approximately 5 nm.

  16. Lessons Learned from Crime Caught on Camera

    PubMed Central

    Bernasco, Wim

    2018-01-01

    Objectives: The widespread use of camera surveillance in public places offers criminologists the opportunity to systematically and unobtrusively observe crime, their main subject matter. The purpose of this essay is to inform the reader of current developments in research on crimes caught on camera. Methods: We address the importance of direct observation of behavior and review criminological studies that used observational methods, with and without cameras, including the ones published in this issue. We also discuss the uses of camera recordings in other social sciences and in biology. Results: We formulate six key insights that emerge from the literature and make recommendations for future research. Conclusions: Camera recordings of real-life crime are likely to become part of the criminological tool kit that will help us better understand the situational and interactional elements of crime. Like any source, it has limitations that are best addressed by triangulation with other sources. PMID:29472728

  17. A Simple Method for Drawing Chiral Mononuclear Octahedral Metal Complexes

    ERIC Educational Resources Information Center

    Mohamadou, Aminou; Haudrechy, Arnaud

    2008-01-01

    Octahedral transition-metal complexes are involved in a number of reactions and octahedral coordination geometry, frequently observed for metallic centers, includes important topographical stereochemistry. Depending on the number and nature of different ligands, octahedral coordination units with at least two different monodentate ligands give…

  18. Narratives of Experiential Learning: Students' Engagement in a Physical Activity-Based Service-Learning Course

    ERIC Educational Resources Information Center

    Whitley, Meredith A.; Walsh, David; Hayden, Laura; Gould, Daniel

    2017-01-01

    Purpose: Three undergraduate students' experiences in a physical activity-based service learning course are chronicled using narrative inquiry. Method: Data collection included demographics questionnaires, pre- and postservice interviews, reflection journals, postservice written reflections, and participant observations. The data were analyzed…

  19. Quantifying Variations In Multi-parameter Models With The Photon Clean Method (PCM) And Bootstrap Methods

    NASA Astrophysics Data System (ADS)

    Carpenter, Matthew H.; Jernigan, J. G.

    2007-05-01

    We present examples of an analysis progression consisting of a synthesis of the Photon Clean Method (Carpenter, Jernigan, Brown, Beiersdorfer 2007) and bootstrap methods to quantify errors and variations in many-parameter models. The Photon Clean Method (PCM) works well for model spaces in which the number of parameters is proportional to the number of photons, so a Monte Carlo paradigm is a natural numerical approach. Consequently, PCM, an "inverse Monte Carlo" method, requires a new approach to quantifying errors compared with common analysis methods for fitting models of low dimensionality. This presentation will explore the methodology and presentation of analysis results derived from a variety of public data sets, including observations with XMM-Newton, Chandra, and other NASA missions. Special attention is given to the visualization of both data and models, including dynamic interactive presentations. This work was performed under the auspices of the Department of Energy under contract No. W-7405-Eng-48. We thank Peter Beiersdorfer and Greg Brown for their support of this technical portion of a larger program related to science with the LLNL EBIT program.
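The bootstrap half of this pairing follows a standard resampling pattern. As an illustration only, the sketch below resamples a hypothetical photon event list and brackets a one-parameter maximum-likelihood estimate (an exponential rate, estimated as the reciprocal of the mean); the PCM model space is far larger, so this shows only the error-quantification idea, not the authors' pipeline.

```python
import numpy as np

def bootstrap_ci(events, estimator, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval by resampling the event list."""
    rng = np.random.default_rng(seed)
    stats = np.array([
        estimator(rng.choice(events, size=len(events), replace=True))
        for _ in range(n_boot)
    ])
    lo, hi = np.quantile(stats, [alpha / 2, 1 - alpha / 2])
    return stats.mean(), (lo, hi)
```

For an event list drawn with a true rate of 3, the bootstrap distribution centers near the sample estimate and the percentile interval brackets it.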

  20. A new multistage groundwater transport inverse method: presentation, evaluation, and implications

    USGS Publications Warehouse

    Anderman, Evan R.; Hill, Mary C.

    1999-01-01

    More computationally efficient methods of using concentration data are needed to estimate groundwater flow and transport parameters. This work introduces and evaluates a three‐stage nonlinear‐regression‐based iterative procedure in which trial advective‐front locations link decoupled flow and transport models. Method accuracy and efficiency are evaluated by comparing results to those obtained when flow‐ and transport‐model parameters are estimated simultaneously. The new method is evaluated as conclusively as possible by using a simple test case that includes distinct flow and transport parameters, but does not include any approximations that are problem dependent. The test case is analytical; the only flow parameter is a constant velocity, and the transport parameters are longitudinal and transverse dispersivity. Any difficulties detected using the new method in this ideal situation are likely to be exacerbated in practical problems. Monte‐Carlo analysis of observation error ensures that no specific error realization obscures the results. Results indicate that, while this, and probably other, multistage methods do not always produce optimal parameter estimates, the computational advantage may make them useful in some circumstances, perhaps as a precursor to using a simultaneous method.

  1. A Luenberger observer for reaction-diffusion models with front position data

    NASA Astrophysics Data System (ADS)

    Collin, Annabelle; Chapelle, Dominique; Moireau, Philippe

    2015-11-01

    We propose a Luenberger observer for reaction-diffusion models with propagating front features, and for data associated with the location of the front over time. Such models are considered in various application fields, such as electrophysiology, wild-land fire propagation and tumor growth modeling. Drawing our inspiration from image processing methods, we start by proposing an observer for the eikonal-curvature equation that can be derived from the reaction-diffusion model by an asymptotic expansion. We then carry over this observer to the underlying reaction-diffusion equation by an 'inverse asymptotic analysis', and we show that the associated correction in the dynamics has a stabilizing effect for the linearized estimation error. We also discuss the extension to joint state-parameter estimation by using the earlier-proposed ROUKF strategy. We then illustrate and assess our proposed observer method with test problems pertaining to electrophysiology modeling, including with a realistic model of cardiac atria. Our numerical trials show that state estimation is directly very effective with the proposed Luenberger observer, while specific strategies are needed to accurately perform parameter estimation, as is usual with Kalman filtering used in a nonlinear setting, and we demonstrate two such successful strategies.
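As background for the construction above, a Luenberger observer in its simplest linear, discrete-time form copies the model dynamics and corrects with the measured output error. The system matrices and gain below are hypothetical toy choices (the paper's models are nonlinear reaction-diffusion equations); the point is only the predict-plus-output-error-correction structure.

```python
import numpy as np

A = np.array([[1.0, 0.1], [0.0, 0.9]])   # toy system dynamics x_{k+1} = A x_k
C = np.array([[1.0, 0.0]])               # only the first state is measured
L = np.array([[0.5], [0.3]])             # gain: eigenvalues of A - L C are 0.8 and 0.6

def observer_step(x_hat, y):
    """One Luenberger update: model prediction plus output-error correction."""
    return A @ x_hat + L @ (y - C @ x_hat)
```

The estimation error obeys e_{k+1} = (A - L C) e_k, so with the gain above it contracts geometrically even though the observer starts with no knowledge of the state.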

  2. Inferring extinction risks from sighting records.

    PubMed

    Thompson, C J; Lee, T E; Stone, L; McCarthy, M A; Burgman, M A

    2013-12-07

    Estimating the probability that a species is extinct based on historical sighting records is important when deciding how much effort and money to invest in conservation policies. The framework we offer is more general than others in the literature to date. Our formulation allows for definite and uncertain observations, and thus better accommodates the realities of sighting record quality. Typically, the probability of observing a species given it is extant/extinct is challenging to define, especially when the possibility of a false observation is included. As such, we assume that observation probabilities derive from a representative probability density function. We incorporate this randomness in two different ways ("quenched" versus "annealed") using a framework that is equivalent to a Bayes formulation. The two methods can lead to significantly different estimates for extinction. In the case of definite sightings only, we provide an explicit deterministic calculation (in which observation probabilities are point estimates). Furthermore, our formulation replicates previous work in certain limiting cases. In the case of uncertain sightings, we allow for the possibility of several independent observational types (specimen, photographs, etc.). The method is applied to the Caribbean monk seal, Monachus tropicalis (which has only definite sightings), and synthetic data, with uncertain sightings. © 2013 Elsevier Ltd. All rights reserved.
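The definite-sightings limiting case mentioned in the abstract admits an explicit calculation. The sketch below follows the classical stationary-Poisson argument in the style of Solow (1993), not this paper's quenched/annealed formulation, and the sighting record in the usage is hypothetical.

```python
def extinction_p_value(sightings, t_now):
    """P-value against 'still extant' from definite sighting times.

    Under a stationary Poisson sighting process on (0, t_now), the n sighting
    times are i.i.d. uniform, so P(latest sighting <= t_n) = (t_n / t_now) ** n.
    A small value casts doubt on continued persistence.
    """
    n = len(sightings)
    t_last = max(sightings)
    return (t_last / t_now) ** n
```

For example, with five sightings whose last falls at the midpoint of the record, the p-value is 0.5**5 = 0.03125; a more recent last sighting yields a larger (less suspicious) value.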

  3. A multi-source data assimilation framework for flood forecasting: Accounting for runoff routing lags

    NASA Astrophysics Data System (ADS)

    Meng, S.; Xie, X.

    2015-12-01

    In flood forecasting practice, model performance is usually degraded by various sources of uncertainty, including uncertainties from input data, model parameters, model structures, and output observations. Data assimilation is a useful methodology for reducing uncertainties in flood forecasting. For short-term flood forecasting, an accurate estimate of the initial soil moisture condition improves forecasting performance, and the time delay of runoff routing is another important factor. Moreover, observations of hydrological variables (including ground observations and satellite observations) are becoming easily available, so the reliability of short-term flood forecasting can be improved by assimilating multi-source data. The objective of this study is to develop a multi-source data assimilation framework for real-time flood forecasting. In this framework, the first step assimilates up-layer soil moisture observations to update the model state and generated runoff based on the ensemble Kalman filter (EnKF) method, and the second step assimilates discharge observations to update the model state and runoff within a fixed time window based on the ensemble Kalman smoother (EnKS) method. This smoothing technique is adopted to account for the runoff routing lag. Assimilating soil moisture and discharge observations in this way is expected to improve the flood forecasting. To distinguish the effectiveness of this dual-step assimilation framework, we designed a dual-EnKF algorithm in which the observed soil moisture and discharge are assimilated separately without accounting for the runoff routing lag. The results show that the multi-source data assimilation framework can effectively improve flood forecasting, especially when the runoff routing has a distinct time lag.
Thus, this new data assimilation framework holds a great potential in operational flood forecasting by merging observations from ground measurement and remote sensing retrivals.
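
    The dual-step assimilation described above rests on the ensemble Kalman filter analysis step. The following is a minimal sketch of one EnKF update, not the authors' implementation: the two-variable state (soil moisture, runoff), the observation operator, and all numbers are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(X, y, r, H):
    """One EnKF analysis step.
    X : (m, n) ensemble of model states (m state variables, n members)
    y : (d,) observed value(s)
    r : observation error variance
    H : (d, m) observation operator
    """
    n = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)        # state anomalies
    HX = H @ X
    HA = HX - HX.mean(axis=1, keepdims=True)     # predicted-obs anomalies
    Pxy = A @ HA.T / (n - 1)                     # cross-covariance
    Pyy = HA @ HA.T / (n - 1) + r * np.eye(H.shape[0])
    K = Pxy @ np.linalg.inv(Pyy)                 # Kalman gain
    # Perturb the observation so the analysis spread stays statistically correct
    Y = y[:, None] + rng.normal(0.0, np.sqrt(r), size=(H.shape[0], n))
    return X + K @ (Y - HX)

# Toy example: 2-variable state (soil moisture, runoff), soil moisture observed
X = rng.normal([[0.30], [5.0]], [[0.05], [1.0]], size=(2, 100))
H = np.array([[1.0, 0.0]])
Xa = enkf_update(X, np.array([0.35]), 0.01**2, H)
```

    In the paper's dual-step scheme this update would be applied first to soil moisture, with an EnKS applied afterwards to smooth states over the routing-lag window.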

  4. Comparison of safety effect estimates obtained from empirical Bayes before-after study, propensity scores-potential outcomes framework, and regression model with cross-sectional data.

    PubMed

    Wood, Jonathan S; Donnell, Eric T; Porter, Richard J

    2015-02-01

    A variety of different study designs and analysis methods have been used to evaluate the performance of traffic safety countermeasures. The most common study designs and methods include observational before-after studies using the empirical Bayes method and cross-sectional studies using regression models. The propensity scores-potential outcomes framework has recently been proposed as an alternative traffic safety countermeasure evaluation method to address the challenges associated with selection biases that can be part of cross-sectional studies. Crash modification factors derived from the application of all three methods have not yet been compared. This paper compares the results of retrospective, observational evaluations of a traffic safety countermeasure using both before-after and cross-sectional study designs. The paper describes the strengths and limitations of each method, focusing primarily on how each addresses site selection bias, which is a common issue in observational safety studies. The Safety Edge paving technique, which seeks to mitigate crashes related to roadway departure events, is the countermeasure used in the present study to compare the alternative evaluation methods. All three methods yielded results that were consistent with each other and with previous research. The empirical Bayes results had the smallest standard errors. It is concluded that the propensity scores with potential outcomes framework is a viable alternative analysis method to the empirical Bayes before-after study. It should be considered whenever a before-after study is not possible or practical.
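
    The empirical Bayes approach referenced above combines a safety performance function (SPF) prediction with the observed crash count. A minimal sketch of the standard EB weighting; all numbers are illustrative, not values from this study:

```python
# Empirical Bayes estimate of expected crash frequency at a site
# (standard EB weighting form; all numbers are illustrative)
mu = 3.2   # crashes/yr predicted by the safety performance function (SPF)
x  = 5     # crashes observed at the site in one year
k  = 0.4   # overdispersion parameter of the SPF (negative binomial)

w  = 1.0 / (1.0 + k * mu)      # weight placed on the SPF prediction
eb = w * mu + (1.0 - w) * x    # EB-adjusted expected crash frequency
```

    The EB estimate always lies between the SPF prediction and the observed count, pulling high observed counts back toward the prediction and thereby correcting for regression-to-the-mean.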

  5. BONNSAI: correlated stellar observables in Bayesian methods

    NASA Astrophysics Data System (ADS)

    Schneider, F. R. N.; Castro, N.; Fossati, L.; Langer, N.; de Koter, A.

    2017-02-01

    In an era of large spectroscopic surveys of stars and big data, sophisticated statistical methods become increasingly important for inferring fundamental stellar parameters such as mass and age. Bayesian techniques are powerful because they can match all available observables simultaneously to stellar models while taking prior knowledge properly into account. However, in most cases it is assumed that observables are uncorrelated, which is generally not the case. Here, we include correlations in the Bayesian code Bonnsai by incorporating the covariance matrix in the likelihood function. We derive a parametrisation of the covariance matrix that, in addition to classical uncertainties, only requires the specification of a correlation parameter that describes how observables co-vary. Our correlation parameter depends purely on the method with which observables have been determined and can be analytically derived in some cases. This approach therefore has the advantage that correlations can be accounted for even if information about them is not available in specific cases but is known in general. Because the new likelihood model is a better approximation of the data, the reliability and robustness of the inferred parameters are improved. We find that neglecting correlations biases the most likely values of inferred stellar parameters and affects the precision with which these parameters can be determined. The importance of these biases depends on the strength of the correlations and the uncertainties. We apply our technique to massive OB stars, but emphasise that it is valid for any type of star. For effective temperatures and surface gravities determined from atmosphere modelling, we find that masses can be underestimated on average by 0.5σ and mass uncertainties overestimated by a factor of about 2 when neglecting correlations. At the same time, the age precisions are underestimated over a wide range of stellar parameters.
We conclude that accounting for correlations is essential in order to derive reliable stellar parameters including robust uncertainties and will be vital when entering an era of precision stellar astrophysics thanks to the Gaia satellite.
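
    The role of the correlation parameter can be sketched with a small Gaussian likelihood in which off-diagonal covariances are built from a single coefficient ρ, as the abstract describes. This is only an illustrative sketch, not the Bonnsai code; the Teff and log g values are invented:

```python
import numpy as np

def log_likelihood(obs, model, sigma, rho):
    """Gaussian log-likelihood with one correlation parameter rho.
    Off-diagonal covariances are rho * sigma_i * sigma_j; rho = 0
    recovers the usual uncorrelated (diagonal) likelihood."""
    d = obs - model
    C = rho * np.outer(sigma, sigma)       # covariance matrix
    np.fill_diagonal(C, sigma**2)
    sign, logdet = np.linalg.slogdet(C)
    k = len(obs)
    return -0.5 * (d @ np.linalg.solve(C, d) + logdet + k * np.log(2 * np.pi))

# Teff and log g from the same atmosphere fit, assumed correlated
obs   = np.array([30000.0, 3.9])   # K, dex (invented values)
model = np.array([29500.0, 4.0])
sigma = np.array([1000.0, 0.1])
ll_corr   = log_likelihood(obs, model, sigma, rho=0.7)
ll_uncorr = log_likelihood(obs, model, sigma, rho=0.0)
```

    Comparing the two values shows how a nonzero ρ reweights the same observables relative to the diagonal likelihood, which is what shifts the most likely stellar parameters.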

  6. Magnification Bias in Gravitational Arc Statistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caminha, G. B.; Estrada, J.; Makler, M.

    2013-08-29

    The statistics of gravitational arcs in galaxy clusters is a powerful probe of cluster structure and may provide complementary cosmological constraints. Despite recent progress, discrepancies remain between modelling and observations of arc abundance, especially regarding the redshift distribution of strong lensing clusters. Besides, fast "semi-analytic" methods still have to incorporate the success obtained with simulations. In this paper we discuss the contribution of magnification to gravitational arc statistics. Although lensing conserves surface brightness, the magnification increases the signal-to-noise ratio of the arcs, enhancing their detectability. We present an approach to include this and other observational effects in semi-analytic calculations for arc statistics. The cross section for arc formation (σ) is computed through a semi-analytic method based on the ratio of the eigenvalues of the magnification tensor. Using this approach we obtained the scaling of σ with respect to the magnification and other parameters, allowing for a fast computation of the cross section. We apply this method to evaluate the expected number of arcs per cluster using an elliptical Navarro-Frenk-White matter distribution. Our results show that the magnification has a strong effect on the arc abundance, enhancing the fraction of arcs, moving the peak of the arc fraction to higher redshifts, and softening its decrease at high redshifts. We argue that the effect of magnification should be included in arc statistics modelling and that it could help to reconcile arc statistics predictions with the observational data.

  7. Pre-operative predictive factors for gallbladder cholesterol polyps using conventional diagnostic imaging

    PubMed Central

    Choi, Ji-Hoon; Yun, Jung-Won; Kim, Yong-Sung; Lee, Eun-A; Hwang, Sang-Tae; Cho, Yong-Kyun; Kim, Hong-Joo; Park, Jung-Ho; Park, Dong-Il; Sohn, Chong-Il; Jeon, Woo-Kyu; Kim, Byung-Ik; Kim, Hyoung-Ook; Shin, Jun-Ho

    2008-01-01

    AIM: To determine the clinical data that might be useful for differentiating benign from malignant gallbladder (GB) polyps by comparing radiological methods, including abdominal ultrasonography (US) and computed tomography (CT) scanning, with postoperative pathology findings. METHODS: Fifty-nine patients underwent laparoscopic cholecystectomy for a GB polyp of around 10 mm. They were divided into two groups, one with cholesterol polyps and the other with non-cholesterol polyps. Clinical features such as gender, age, symptoms, size and number of polyps, the presence of a GB stone, the radiologically measured maximum diameter of the polyp by US and CT scanning, and the measurements of diameter from postoperative pathology were recorded for comparative analysis. RESULTS: Fifteen of the 41 cases with cholesterol polyps (36.6%) were detected with US but not CT scanning, whereas all 18 non-cholesterol polyps were observed using both methods. In the cholesterol polyp group, the maximum measured diameter of the polyp was smaller by CT scan than by US. Consequently, the discrepancy between those two scanning measurements was greater than for the non-cholesterol polyp group. CONCLUSION: The clinical signs indicative of a cholesterol polyp include: (1) a polyp observed by US but not observable by CT scanning, (2) a smaller diameter on the CT scan compared to US, and (3) a discrepancy in its maximum diameter between US and CT measurements. In addition, US and the CT scan had low accuracy in predicting the polyp diameter compared to that determined by postoperative pathology. PMID:19058309

  8. Electro-active sensor, method for constructing the same; apparatus and circuitry for detection of electro-active species

    NASA Technical Reports Server (NTRS)

    Buehler, Martin (Inventor)

    2009-01-01

    An electro-active sensor includes a nonconductive platform with a first electrode set attached with a first side of a nonconductive platform. The first electrode set serves as an electrochemical cell that may be utilized to detect electro-active species in solution. A plurality of electrode sets and a variety of additional electrochemical cells and sensors may be attached with the nonconductive platform. The present invention also includes a method for constructing the aforementioned electro-active sensor. Additionally, an apparatus for detection and observation is disclosed, where the apparatus includes a sealable chamber for insertion of a portion of an electro-active sensor. The apparatus allows for monitoring and detection activities. Allowing for control of attached cells and sensors, a dual-mode circuitry is also disclosed. The dual-mode circuitry includes a switch, allowing the circuitry to be switched from a potentiostat to a galvanostat mode.

  9. Resampling methods in Microsoft Excel® for estimating reference intervals

    PubMed Central

    Theodorsson, Elvar

    2015-01-01

    Computer-intensive resampling/bootstrap methods are feasible when calculating reference intervals from non-Gaussian or small reference samples. Microsoft Excel® in version 2010 or later includes natural functions, which lend themselves well to this purpose, including recommended interpolation procedures for estimating the 2.5th and 97.5th percentiles.
The purpose of this paper is to introduce the reader to resampling estimation techniques in general and in using Microsoft Excel® 2010 for the purpose of estimating reference intervals in particular.
    Parametric methods are preferable to resampling methods when the distribution of observations in the reference samples is Gaussian or can be transformed to that distribution, even when the number of reference samples is less than 120. Resampling methods are appropriate when the distribution of data from the reference samples is non-Gaussian and the number of reference individuals and corresponding samples is on the order of 40. At least 500-1000 random samples with replacement should be taken from the results of measurement of the reference samples. PMID:26527366
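
    The percentile-bootstrap procedure the paper implements in Excel can equally be sketched in Python; the function name and the simulated log-normal reference sample below are illustrative assumptions, not the paper's worksheet:

```python
import numpy as np

rng = np.random.default_rng(42)

def bootstrap_reference_interval(values, n_boot=1000):
    """Percentile-bootstrap estimate of the 2.5th and 97.5th percentiles
    (the reference limits), with a 90% confidence interval for each limit."""
    values = np.asarray(values, dtype=float)
    lows, highs = [], []
    for _ in range(n_boot):
        resample = rng.choice(values, size=len(values), replace=True)
        lows.append(np.percentile(resample, 2.5))
        highs.append(np.percentile(resample, 97.5))
    return (np.mean(lows), np.mean(highs),
            np.percentile(lows, [5, 95]), np.percentile(highs, [5, 95]))

# ~40 skewed (log-normal) reference samples, the small-sample case the
# paper recommends bootstrapping for
ref = rng.lognormal(mean=1.0, sigma=0.4, size=40)
lo, hi, lo_ci, hi_ci = bootstrap_reference_interval(ref)
```

    The n_boot default matches the paper's recommendation of at least 500-1000 resamples with replacement.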

  10. Resampling methods in Microsoft Excel® for estimating reference intervals.

    PubMed

    Theodorsson, Elvar

    2015-01-01

    Computer-intensive resampling/bootstrap methods are feasible when calculating reference intervals from non-Gaussian or small reference samples. Microsoft Excel® in version 2010 or later includes natural functions, which lend themselves well to this purpose, including recommended interpolation procedures for estimating the 2.5th and 97.5th percentiles.
The purpose of this paper is to introduce the reader to resampling estimation techniques in general and in using Microsoft Excel® 2010 for the purpose of estimating reference intervals in particular.
    Parametric methods are preferable to resampling methods when the distribution of observations in the reference samples is Gaussian or can be transformed to that distribution, even when the number of reference samples is less than 120. Resampling methods are appropriate when the distribution of data from the reference samples is non-Gaussian and the number of reference individuals and corresponding samples is on the order of 40. At least 500-1000 random samples with replacement should be taken from the results of measurement of the reference samples.

  11. Non-seismic tsunamis: filling the forecast gap

    NASA Astrophysics Data System (ADS)

    Moore, C. W.; Titov, V. V.; Spillane, M. C.

    2015-12-01

    Earthquakes are the generation mechanism in over 85% of tsunamis. However, non-seismic tsunamis, including those generated by meteorological events, landslides, volcanoes, and asteroid impacts, can inundate significant areas and have a large far-field effect. The current National Oceanic and Atmospheric Administration (NOAA) tsunami forecast system falls short in detecting these phenomena. This study attempts to classify the range of effects possible from these non-seismic threats, and to investigate detection methods appropriate for use in a forecast system. Typical observation platforms are assessed, including DART bottom pressure recorders and tide gauges. Other detection paths include atmospheric pressure anomaly algorithms for detecting meteotsunamis and the early identification of asteroids large enough to produce a regional hazard. Real-time assessment of observations for forecast use can provide guidance to mitigate the effects of a non-seismic tsunami.

  12. Remote observations of reentering spacecraft including the space shuttle orbiter

    NASA Astrophysics Data System (ADS)

    Horvath, Thomas J.; Cagle, Melinda F.; Grinstead, Jay H.; Gibson, David M.

    Flight measurement is a critical phase in development, validation and certification processes of technologies destined for future civilian and military operational capabilities. This paper focuses on several recent NASA-sponsored remote observations that have provided unique engineering and scientific insights of reentry vehicle flight phenomenology and performance that could not necessarily be obtained with more traditional instrumentation methods such as onboard discrete surface sensors. The missions highlighted include multiple spatially-resolved infrared observations of the NASA Space Shuttle Orbiter during hypersonic reentry from 2009 to 2011, and emission spectroscopy of comparatively small-sized sample return capsules returning from exploration missions. Emphasis has been placed upon identifying the challenges associated with these remote sensing missions with focus on end-to-end aspects that include the initial science objective, selection of the appropriate imaging platform and instrumentation suite, target flight path analysis and acquisition strategy, pre-mission simulations to optimize sensor configuration, logistics and communications during the actual observation. Explored are collaborative opportunities and technology investments required to develop a next-generation quantitative imaging system (i.e., an intelligent sensor and platform) with greater capability, which could more affordably support cross cutting civilian and military flight test needs.

  13. Remote Observations of Reentering Spacecraft Including the Space Shuttle Orbiter

    NASA Technical Reports Server (NTRS)

    Horvath, Thomas J.; Cagle, Melinda F.; Grinstead, Jay H.; Gibson, David

    2013-01-01

    Flight measurement is a critical phase in development, validation and certification processes of technologies destined for future civilian and military operational capabilities. This paper focuses on several recent NASA-sponsored remote observations that have provided unique engineering and scientific insights of reentry vehicle flight phenomenology and performance that could not necessarily be obtained with more traditional instrumentation methods such as onboard discrete surface sensors. The missions highlighted include multiple spatially-resolved infrared observations of the NASA Space Shuttle Orbiter during hypersonic reentry from 2009 to 2011, and emission spectroscopy of comparatively small-sized sample return capsules returning from exploration missions. Emphasis has been placed upon identifying the challenges associated with these remote sensing missions with focus on end-to-end aspects that include the initial science objective, selection of the appropriate imaging platform and instrumentation suite, target flight path analysis and acquisition strategy, pre-mission simulations to optimize sensor configuration, logistics and communications during the actual observation. Explored are collaborative opportunities and technology investments required to develop a next-generation quantitative imaging system (i.e., an intelligent sensor and platform) with greater capability, which could more affordably support cross cutting civilian and military flight test needs.

  14. Experience of treatment of patients with granulomatous lobular mastitis.

    PubMed

    Hur, Sung Mo; Cho, Dong Hui; Lee, Se Kyung; Choi, Min-Young; Bae, Soo Youn; Koo, Min Young; Kim, Sangmin; Choe, Jun-Ho; Kim, Jung-Han; Kim, Jee Soo; Nam, Seok-Jin; Yang, Jung-Hyun; Lee, Jeong Eon

    2013-07-01

    To present the authors' experience with various treatment methods for granulomatous lobular mastitis (GLM) and to determine effective treatment methods for GLM. Fifty patients who were diagnosed with GLM were classified into five groups based on the initial treatment methods they underwent, which included observation (n = 8), antibiotics (n = 3), steroid (n = 13), drainage (n = 14), and surgical excision (n = 12). The treatment process in each group was examined, and clinical characteristics, treatment courses, and outcomes were analyzed. Success rates with each initial treatment were observation, 87.5%; antibiotics, 33.3%; steroids, 30.8%; drainage, 28.6%; and surgical excision, 91.7%. In most cases of observation, the lesions were small and the symptoms were mild. A total of 23 patients underwent surgical excision during treatment. Surgical excision showed particularly fast recovery, a high success rate (90.3%), and a low recurrence rate (8.7%). The clinical course of GLM is complex and the outcomes of each treatment type are variable. Surgery may play an important role when a lesion is determined to be mass-forming or appears localized as an abscess pocket during breast examination or imaging study.

  15. Live CLEM imaging to analyze nuclear structures at high resolution.

    PubMed

    Haraguchi, Tokuko; Osakada, Hiroko; Koujin, Takako

    2015-01-01

    Fluorescence microscopy (FM) and electron microscopy (EM) are powerful tools for observing molecular components in cells. FM can provide temporal information about cellular proteins and structures in living cells. EM provides nanometer-resolution images of cellular structures in fixed cells. We have combined FM and EM to develop a new method of correlative light and electron microscopy (CLEM), called "Live CLEM." In this method, the dynamic behavior of specific molecules of interest is first observed in living cells using FM, and then cellular structures in the same cell are observed using EM. Following image acquisition, the FM and EM images are compared, enabling the fluorescent images to be correlated with the high-resolution images of cellular structures obtained using EM. Because this method enables analysis of dynamic events involving specific molecules of interest in the context of specific cellular structures at high resolution, it is useful for the study of nuclear structures, including nuclear bodies. Here we describe Live CLEM as applied to the study of nuclear structures in mammalian cells.

  16. Fight the power: the limits of empiricism and the costs of positivistic rigor.

    PubMed

    Indick, William

    2002-01-01

    A summary of the influence of positivistic philosophy and empiricism on the field of psychology is followed by a critique of the empirical method. The dialectic process is advocated as an alternative method of inquiry. The main advantage of the dialectic method is that it is open to any logical argument, including empirical hypotheses, but unlike empiricism, it does not automatically reject arguments that are not based on observable data. Evolutionary and moral psychology are discussed as examples of important fields of study that could benefit from types of arguments that frequently do not conform to the empirical standards of systematic observation and falsifiability of hypotheses. A dialectic method is shown to be a suitable perspective for those fields of research, because it allows for logical arguments that are not empirical and because it fosters a functionalist perspective, which is indispensable for both evolutionary and moral theories. It is suggested that all psychologists may gain from adopting a dialectic approach, rather than restricting themselves to empirical arguments alone.

  17. Structured Matrix Completion with Applications to Genomic Data Integration.

    PubMed

    Cai, Tianxi; Cai, T Tony; Zhang, Anru

    2016-01-01

    Matrix completion has attracted significant recent attention in many fields including statistics, applied mathematics and electrical engineering. Current literature on matrix completion focuses primarily on independent sampling models under which the individual observed entries are sampled independently. Motivated by applications in genomic data integration, we propose a new framework of structured matrix completion (SMC) to treat structured missingness by design. Specifically, our proposed method aims at efficient matrix recovery when a subset of the rows and columns of an approximately low-rank matrix are observed. We provide theoretical justification for the proposed SMC method and derive a lower bound for the estimation errors, which together establish the optimal rate of recovery over certain classes of approximately low-rank matrices. Simulation studies show that the method performs well in finite samples under a variety of configurations. The method is applied to integrate several ovarian cancer genomic studies with different extents of genomic measurements, which enables us to construct more accurate prediction rules for ovarian cancer survival.

  18. What InSAR time-series methods are best suited for the Ecuadorian volcanoes

    NASA Astrophysics Data System (ADS)

    Mirzaee, S.; Amelung, F.

    2017-12-01

    Ground displacement measurements from stacks of SAR images obtained using interferometric time-series approaches play an increasingly important role in volcanic hazard assessment. Inflation of the ground surface can indicate that magma is ascending to shallower levels and that a volcano is getting ready for an eruption. Commonly used InSAR time-series approaches include Small Baseline (SB), Persistent Scatterer InSAR (PSI) and SqueeSAR methods, but it remains unclear which approach is best suited for volcanic environments. In this poster we present InSAR deformation measurements for the active volcanoes of Ecuador (Cotopaxi, Tungurahua and Pichincha) using a variety of InSAR time-series methods. We discuss the pros and cons of each method given the available data stacks (TerraSAR-X, COSMO-SkyMed and Sentinel-1) in an effort to design a comprehensive observation strategy for the Ecuadorian volcanoes. SAR data are provided in the framework of the Group on Earth Observations' Ecuadorian Volcano Geohazard Supersite.

  19. Biases and power for groups comparison on subjective health measurements.

    PubMed

    Hamel, Jean-François; Hardouin, Jean-Benoit; Le Neel, Tanguy; Kubis, Gildas; Roquelaure, Yves; Sébille, Véronique

    2012-01-01

    Subjective health measurements are increasingly used in clinical research, particularly for comparisons between patient groups. Two main types of analytical strategies can be used for such data: classical test theory (CTT), which relies on observed scores, and models from item response theory (IRT), which relate the item responses to a latent parameter, often called the latent trait. Whether IRT or CTT is the more appropriate method for comparing two independent groups of patients on a patient-reported outcome measurement remains unknown and was investigated using simulations. For CTT-based analyses, group comparison was performed using a t-test on the scores. For IRT-based analyses, several methods were compared, according to whether the Rasch model was considered with random effects or with fixed effects, and whether the group effect was included as a covariate or not. Individual latent trait values were estimated either by a deterministic method or by stochastic approaches, and the latent traits were then compared with a t-test. Finally, a two-step method was performed to compare the latent trait distributions, and a Wald test was performed to test the group effect in the Rasch model including group covariates. The only unbiased IRT-based method was the group-covariate Wald test performed on the random-effects Rasch model. This model displayed the highest observed power, which was similar to the power of the score t-test. These results need to be extended to the case, frequently encountered in practice, where data are missing and possibly informative.

  20. UTILIZATION OF THE WAVEFRONT SENSOR AND SHORT-EXPOSURE IMAGES FOR SIMULTANEOUS ESTIMATION OF QUASI-STATIC ABERRATION AND EXOPLANET INTENSITY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frazin, Richard A., E-mail: rfrazin@umich.edu

    2013-04-10

    Heretofore, the literature on exoplanet detection with coronagraphic telescope systems has paid little attention to the information content of short exposures and methods of utilizing the measurements of adaptive optics wavefront sensors. This paper provides a framework for the incorporation of the wavefront sensor measurements in the context of observing modes in which the science camera takes millisecond exposures. In this formulation, the wavefront sensor measurements provide a means to jointly estimate the static speckle and the planetary signal. The ability to estimate planetary intensities in as little as a few seconds has the potential to greatly improve the efficiency of exoplanet search surveys. For simplicity, the mathematical development assumes a simple optical system with an idealized Lyot coronagraph. Unlike currently used methods, in which increasing the observation time beyond a certain threshold is useless, this method produces estimates whose error covariances decrease more quickly than inversely proportional to the observation time. This is due to the fact that the estimates of the quasi-static aberrations are informed by a new random (but approximately known) wavefront every millisecond. The method can be extended to include angular (due to diurnal field rotation) and spectral diversity. Numerical experiments are performed with wavefront data from the AEOS Adaptive Optics System sensing at 850 nm. These experiments assume a science camera wavelength λ of 1.1 μm, that the measured wavefronts are exact, and a Gaussian approximation of shot noise. The effects of detector read-out noise and other issues are left to future investigations. A number of static aberrations are introduced, including one with a spatial frequency exactly corresponding to the planet location, which was at a distance of ≈3λ/D from the star. Using only 4 s of simulated observation time, a planetary intensity of ≈1 photon ms⁻¹, and a stellar intensity of ≈10⁵ photons ms⁻¹ (contrast ratio 10⁵), the short-exposure estimation method recovers the amplitudes of the static aberrations with 1% accuracy, and the planet brightness with 20% accuracy.

  1. Statistical approaches to the analysis of point count data: A little extra information can go a long way

    USGS Publications Warehouse

    Farnsworth, G.L.; Nichols, J.D.; Sauer, J.R.; Fancy, S.G.; Pollock, K.H.; Shriner, S.A.; Simons, T.R.; Ralph, C. John; Rich, Terrell D.

    2005-01-01

    Point counts are a standard sampling procedure for many bird species, but lingering concerns still exist about the quality of information produced from the method. It is well known that variation in observer ability and environmental conditions can influence the detection probability of birds in point counts, but many biologists have been reluctant to abandon point counts in favor of more intensive approaches to counting. However, over the past few years a variety of statistical and methodological developments have begun to provide practical ways of overcoming some of the problems with point counts. We describe some of these approaches, and show how they can be integrated into standard point count protocols to greatly enhance the quality of the information. Several tools now exist for estimation of detection probability of birds during counts, including distance sampling, double-observer methods, time-depletion (removal) methods, and hybrid methods that combine these approaches. Many counts are conducted in habitats that make auditory detection of birds much more likely than visual detection. As a framework for understanding detection probability during such counts, we propose separating the probability that a bird is detected during a count into two components: (1) the probability that the bird vocalizes during the count and (2) the probability that this vocalization is detected by an observer. In addition, we propose that some measure of the area sampled during a count is necessary for valid inferences about bird populations. This can be done by employing fixed-radius counts or more sophisticated distance-sampling models. We recommend that any studies employing point counts be designed to estimate detection probability and to include a measure of the area sampled.
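
    The proposed two-component decomposition amounts to a simple product, which then enters the usual count adjustment for imperfect detection; all numbers below are illustrative, not estimates from the paper:

```python
# Decomposition of detection probability for an auditory point count
# (all numbers are illustrative)
p_vocalize = 0.6   # probability a bird vocalizes during the count
p_heard    = 0.8   # probability the observer detects a given vocalization
p_detect   = p_vocalize * p_heard   # overall detection probability

count = 24                  # birds detected at a point
n_hat = count / p_detect    # estimated number of birds actually present
```

    Dividing the raw count by the estimated detection probability is the canonical correction; the paper's point is that both factors must be estimated, e.g. by removal or double-observer methods, rather than assumed.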

  2. SUPPLEMENT: “THE RATE OF BINARY BLACK HOLE MERGERS INFERRED FROM ADVANCED LIGO OBSERVATIONS SURROUNDING GW150914” (2016, ApJL, 833, L1)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abbott, B. P.; Abbott, R.; Abernathy, M. R.

    This article provides supplemental information for a Letter reporting the rate of binary black hole (BBH) coalescences inferred from 16 days of coincident Advanced LIGO observations surrounding the transient gravitational-wave (GW) signal GW150914. In that work we reported various rate estimates whose 90% confidence intervals fell in the range 2–600 Gpc⁻³ yr⁻¹. Here we give details on our method and computations, including information about our search pipelines, a derivation of our likelihood function for the analysis, a description of the astrophysical search trigger distribution expected from merging BBHs, details on our computational methods, a description of the effects and our model for calibration uncertainty, and an analytic method for estimating our detector sensitivity, which is calibrated to our measurements.

  3. System and method for phase retrieval for radio telescope and antenna control

    NASA Technical Reports Server (NTRS)

    Dean, Bruce H. (Inventor)

    2013-01-01

    Disclosed herein are systems, methods, and non-transitory computer-readable storage media for radio phase retrieval. A system practicing the method gathers first data from radio waves associated with an object observed via a first aperture, gathers second data from radio waves associated with the object observed via an introduced second aperture associated with the first aperture, generates reduced-noise data by incoherently subtracting the second data from the first data, and performs phase retrieval for the radio waves by modeling the reduced-noise data using a single Fourier transform. The first and second apertures are at different positions, such as side by side. This approach can include determining a value Q, the ratio of the wavelength times the focal ratio to the pixel spacing. This information can be used to accurately measure and correct alignment errors or other optical system flaws in the apertures.
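
    The quantity Q described above is a simple ratio; the numbers below are illustrative, not values from the patent:

```python
# Sampling parameter Q = (wavelength * focal ratio) / pixel spacing
# (illustrative values; in the usual convention Q >= 2 means the point
# spread function is Nyquist-sampled by the detector)
wavelength  = 1.0e-6    # m
focal_ratio = 40.0      # dimensionless f-number
pixel_pitch = 18.0e-6   # m

Q = wavelength * focal_ratio / pixel_pitch
```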

  4. Multiscale Structure of UXO Site Characterization: Spatial Estimation and Uncertainty Quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ostrouchov, George; Doll, William E.; Beard, Les P.

    2009-01-01

    Unexploded ordnance (UXO) site characterization must consider both how the contamination is generated and how we observe that contamination. Within the generation and observation processes, dependence structures can be exploited at multiple scales. We describe a conceptual site characterization process, the dependence structures available at several scales, and consider their statistical estimation aspects. It is evident that most of the statistical methods that are needed to address the estimation problems are known but their application-specific implementation may not be available. We demonstrate estimation at one scale and propose a representation for site contamination intensity that takes full account of uncertainty, is flexible enough to answer regulatory requirements, and is a practical tool for managing detailed spatial site characterization and remediation. The representation is based on point process spatial estimation methods that require modern computational resources for practical application. These methods have provisions for including prior and covariate information.

  5. Reliability and Validity of Observational Risk Screening in Evaluating Dynamic Knee Valgus

    PubMed Central

    Ekegren, Christina L.; Miller, William C.; Celebrini, Richard G.; Eng, Janice J.; MacIntyre, Donna L.

    2012-01-01

    Study Design Nonexperimental methodological study. Objectives To determine the interrater and intrarater reliability and validity of using observational risk screening guidelines to evaluate dynamic knee valgus. Background A deficiency in the neuromuscular control of the hip has been identified as a key risk factor for non-contact anterior cruciate ligament (ACL) injury in postpubescent females. This deficiency can manifest itself as a valgus knee alignment during tasks involving hip and knee flexion. There are currently no scientifically tested methods to screen for dynamic knee valgus in the clinic or on the field. Methods Three physiotherapists used observational risk screening guidelines to rate 40 adolescent female soccer players according to their risk of ACL injury. The rating was based on the amount of dynamic knee valgus observed on a drop-jump landing. Ratings were evaluated for intrarater and interrater agreement using kappa coefficients. Sensitivity and specificity of ratings were evaluated by comparing observational ratings with measurements obtained using 3-dimensional (3D) motion analysis. Results Kappa coefficients for intrarater and interrater agreement ranged from 0.75 to 0.85, indicating that ratings were reasonably consistent over time and between physiotherapists. Sensitivity values were inadequate, ranging from 67% to 87%, indicating that raters failed to detect up to a third of "truly high risk" individuals. Specificity values ranged from 60% to 72%, which was considered adequate for the purposes of the screen. Conclusion Observational risk screening is a practical and cost-effective method of screening for ACL injury risk. Rater agreement and specificity were acceptable for this method but sensitivity was not. To detect a greater proportion of individuals at risk of ACL injury, coaches and clinicians should ensure that they include additional tests for other high-risk characteristics in their screening protocols. PMID:19721212
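The statistics this record reports (kappa coefficients for agreement, sensitivity and specificity against 3D motion analysis) all derive from 2x2 count tables. A minimal sketch, using made-up counts rather than the study's actual data:

```python
def cohens_kappa(a, b, c, d):
    """Cohen's kappa from a 2x2 agreement table:
    a = both raters say 'high risk', d = both say 'low risk',
    b and c = the two kinds of disagreement."""
    n = a + b + c + d
    p_obs = (a + d) / n                      # observed agreement
    p_exp = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2  # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

def sensitivity_specificity(tp, fn, fp, tn):
    """Validity of observational ratings against a reference 'truth'
    (here, hypothetically, 3D motion analysis)."""
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative counts only, chosen to sum to the study's n = 40 players.
kappa = cohens_kappa(13, 2, 4, 21)
sens, spec = sensitivity_specificity(13, 2, 4, 21)
```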

  6. The Potential Utility of Urinary Biomarkers for Risk Prediction in Combat Casualties: A Prospective Observational Cohort Study

    DTIC Science & Technology

    2015-06-16

    are associated with poor outcomes, including death and the need for renal replacement therapy. Methods: We conducted a prospective, observational study...

  7. UV spectroscopy including ISM line absorption of the exciting star of Abell 35

    NASA Astrophysics Data System (ADS)

    Ziegler, M.; Rauch, T.; Werner, K.; Kruk, J. W.

    Reliable spectral analysis that is based on high-resolution UV observations requires an adequate, simultaneous modeling of the interstellar line absorption and reddening. In the case of the central star of the planetary nebula Abell 35, BD-22 3467, we demonstrate our current standard spectral-analysis method that is based on the Tübingen NLTE Model-Atmosphere Package (TMAP). We present an ongoing spectral analysis of FUSE and HST/STIS observations of BD-22 3467.

  8. An efficient probe of the cosmological CPT violation

    NASA Astrophysics Data System (ADS)

    Zhao, Gong-Bo; Wang, Yuting; Xia, Jun-Qing; Li, Mingzhe; Zhang, Xinmin

    2015-07-01

    We develop an efficient method based on the linear regression algorithm to probe the cosmological CPT violation using the CMB polarisation data. We validate this method using simulated CMB data and apply it to recent CMB observations. We find that a combined data sample of BICEP1 and BOOMERanG 2003 favours a nonzero isotropic rotation angle at the 2.3σ confidence level, i.e., ᾱ = −3.3° ± 1.4° (68% CL) with systematics included.

  9. Method of calculating retroreflector-array transfer functions. [laser range finders

    NASA Technical Reports Server (NTRS)

    Arnold, D. A.

    1978-01-01

    Techniques and equations used in calculating the transfer functions that relate the observed return laser pulses to the center of mass of the retroreflector array of the Lageos satellite, and of most other retroreflector-equipped satellites now in orbit, are described. The methods derived include the effects of coherent interference, diffraction, polarization, and dihedral-angle offsets. Particular emphasis is given to deriving expressions for the diffraction pattern and active reflecting area of various cube-corner designs.

  10. Calibration of hydrological models using flow-duration curves

    NASA Astrophysics Data System (ADS)

    Westerberg, I. K.; Guerrero, J.-L.; Younger, P. M.; Beven, K. J.; Seibert, J.; Halldin, S.; Freer, J. E.; Xu, C.-Y.

    2011-07-01

    The degree of belief we have in predictions from hydrologic models will normally depend on how well they can reproduce observations. Calibrations with traditional performance measures, such as the Nash-Sutcliffe model efficiency, are challenged by problems including: (1) uncertain discharge data, (2) variable sensitivity of different performance measures to different flow magnitudes, (3) influence of unknown input/output errors and (4) inability to evaluate model performance when observation time periods for discharge and model input data do not overlap. This paper explores a calibration method using flow-duration curves (FDCs) to address these problems. The method focuses on reproducing the observed discharge frequency distribution rather than the exact hydrograph. It consists of applying limits of acceptability for selected evaluation points (EPs) on the observed uncertain FDC in the extended GLUE approach. Two ways of selecting the EPs were tested - based on equal intervals of discharge and of volume of water. The method was tested and compared to a calibration using the traditional model efficiency for the daily four-parameter WASMOD model in the Paso La Ceiba catchment in Honduras and for Dynamic TOPMODEL evaluated at an hourly time scale for the Brue catchment in Great Britain. The volume method of selecting EPs gave the best results in both catchments with better calibrated slow flow, recession and evaporation than the other criteria. Observed and simulated time series of uncertain discharges agreed better for this method both in calibration and prediction in both catchments. An advantage with the method is that the rejection criterion is based on an estimation of the uncertainty in discharge data and that the EPs of the FDC can be chosen to reflect the aims of the modelling application, e.g. using more/less EPs at high/low flows. 
While the method appears less sensitive to epistemic input/output errors than previous use of limits of acceptability applied directly to the time series of discharge, it still requires a reasonable representation of the distribution of inputs. Additional constraints might therefore be required in catchments subject to snow and where peak-flow timing at sub-daily time scales is of high importance. The results suggest that the calibration method can be useful when observation time periods for discharge and model input data do not overlap. The method could also be suitable for calibration to regional FDCs while taking uncertainties in the hydrological model and data into account.
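The core mechanics of the FDC-based calibration described above (building the flow-duration curve, selecting evaluation points by equal volumes of water, and applying GLUE-style limits of acceptability) can be sketched briefly. This is an illustrative simplification with hypothetical function names, not the authors' implementation:

```python
import numpy as np

def flow_duration_curve(q):
    """Exceedance probabilities and discharges sorted highest first."""
    q_sorted = np.sort(np.asarray(q, float))[::-1]
    p_exceed = (np.arange(1, q_sorted.size + 1) - 0.5) / q_sorted.size
    return p_exceed, q_sorted

def volume_evaluation_points(q_sorted, n_ep):
    """Indices on the FDC that split the total volume of water into
    n_ep roughly equal parts (the 'volume method' of EP selection)."""
    cum = np.cumsum(q_sorted) / np.sum(q_sorted)
    targets = (np.arange(1, n_ep + 1) - 0.5) / n_ep
    return np.searchsorted(cum, targets)

def behavioural(sim_at_eps, lower, upper):
    """Limits of acceptability: accept a model run only if its FDC
    lies inside the discharge-uncertainty bounds at every EP."""
    return bool(np.all((sim_at_eps >= lower) & (sim_at_eps <= upper)))
```

In a GLUE loop, `behavioural` would be evaluated for each sampled parameter set, retaining only the runs whose simulated FDC stays within the observed discharge uncertainty at all evaluation points.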

  11. Calibration of hydrological models using flow-duration curves

    NASA Astrophysics Data System (ADS)

    Westerberg, I. K.; Guerrero, J.-L.; Younger, P. M.; Beven, K. J.; Seibert, J.; Halldin, S.; Freer, J. E.; Xu, C.-Y.

    2010-12-01

    The degree of belief we have in predictions from hydrologic models depends on how well they can reproduce observations. Calibrations with traditional performance measures such as the Nash-Sutcliffe model efficiency are challenged by problems including: (1) uncertain discharge data, (2) variable importance of the performance with flow magnitudes, (3) influence of unknown input/output errors and (4) inability to evaluate model performance when observation time periods for discharge and model input data do not overlap. A new calibration method using flow-duration curves (FDCs) was developed which addresses these problems. The method focuses on reproducing the observed discharge frequency distribution rather than the exact hydrograph. It consists of applying limits of acceptability for selected evaluation points (EPs) of the observed uncertain FDC in the extended GLUE approach. Two ways of selecting the EPs were tested - based on equal intervals of discharge and of volume of water. The method was tested and compared to a calibration using the traditional model efficiency for the daily four-parameter WASMOD model in the Paso La Ceiba catchment in Honduras and for Dynamic TOPMODEL evaluated at an hourly time scale for the Brue catchment in Great Britain. The volume method of selecting EPs gave the best results in both catchments with better calibrated slow flow, recession and evaporation than the other criteria. Observed and simulated time series of uncertain discharges agreed better for this method both in calibration and prediction in both catchments without resulting in overpredicted simulated uncertainty. An advantage with the method is that the rejection criterion is based on an estimation of the uncertainty in discharge data and that the EPs of the FDC can be chosen to reflect the aims of the modelling application e.g. using more/less EPs at high/low flows. 
While the new method is less sensitive to epistemic input/output errors than the normal use of limits of acceptability applied directly to the time series of discharge, it still requires a reasonable representation of the distribution of inputs. Additional constraints might therefore be required in catchments subject to snow. The results suggest that the new calibration method can be useful when observation time periods for discharge and model input data do not overlap. The new method could also be suitable for calibration to regional FDCs while taking uncertainties in the hydrological model and data into account.

  12. Generating Dynamic Persistence in the Time Domain

    NASA Astrophysics Data System (ADS)

    Guerrero, A.; Smith, L. A.; Smith, L. A.; Kaplan, D. T.

    2001-12-01

    Many dynamical systems present long-range correlations. Physically, these systems range from biological to economical, including geological and urban systems. Important geophysical candidates for this type of behaviour include weather (or climate) and earthquake sequences. Persistence is characterised by a slowly decaying correlation function that, in theory, never dies out. The persistence exponent reflects the degree of memory in the system, and much effort has been expended creating and analysing methods that estimate this parameter and model data exhibiting persistence. The most widely used methods for generating long-correlated time series are not dynamical systems in the time domain, but instead are derived from a given spectral density. Little attention has been paid to modelling persistence in the time domain. The time-domain approach has the advantage that an observation at a certain time can be calculated from previous observations, which is particularly suitable when investigating the predictability of a long-memory process. We describe two such methods in the time domain. One is a traditional approach using fractional ARIMA (autoregressive integrated moving average) models; the second uses a novel approach that extends a given series using random Fourier basis functions. The statistical quality of the two methods is compared, and they are contrasted with weather data which, reportedly, shows persistence. The suitability of this approach both for estimating predictability and for making predictions is discussed.
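A standard time-domain construction of a persistent series of the kind discussed here is the fractional ARIMA(0, d, 0) process, obtained by filtering white noise with slowly decaying moving-average weights. A minimal sketch, assuming a truncated MA expansion and illustrative parameters (this is the textbook construction, not necessarily the authors' exact implementation):

```python
import numpy as np

def arfima_0d0(n, d, seed=None, burn=1000):
    """Simulate ARFIMA(0, d, 0): x_t = sum_k psi_k * eps_{t-k},
    with psi_0 = 1 and psi_k = psi_{k-1} * (k - 1 + d) / k.
    For 0 < d < 0.5 the autocorrelations decay as a power law
    (long memory / persistence)."""
    rng = np.random.default_rng(seed)
    k = np.arange(1, n + burn)
    psi = np.concatenate(([1.0], np.cumprod((k - 1 + d) / k)))
    eps = rng.standard_normal(n + burn)
    x = np.convolve(eps, psi)[: n + burn]  # causal filter output
    return x[burn:]                        # discard start-up transient

x = arfima_0d0(2000, d=0.4, seed=0)
```

Because each new value is a weighted sum of past innovations, the same recursion can be run forward to study predictability, which is the advantage of the time-domain formulation noted in the abstract.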

  13. The state of the art and future opportunities for using longitudinal n-of-1 methods in health behaviour research: a systematic literature overview.

    PubMed

    McDonald, Suzanne; Quinn, Francis; Vieira, Rute; O'Brien, Nicola; White, Martin; Johnston, Derek W; Sniehotta, Falko F

    2017-12-01

    n-of-1 studies test hypotheses within individuals based on repeated measurement of variables within the individual over time. Intra-individual effects may differ from those found in between-participant studies. Using examples from a systematic review of n-of-1 studies in health behaviour research, this article provides a state of the art overview of the use of n-of-1 methods, organised according to key methodological considerations related to n-of-1 design and analysis, and describes future challenges and opportunities. A comprehensive search strategy (PROSPERO:CRD42014007258) was used to identify articles published between 2000 and 2016, reporting observational or interventional n-of-1 studies with health behaviour outcomes. Thirty-nine articles were identified which reported on n-of-1 observational designs and a range of n-of-1 interventional designs, including AB, ABA, ABABA, alternating treatments, n-of-1 randomised controlled trial, multiple baseline and changing criterion designs. Behaviours measured included treatment adherence, physical activity, drug/alcohol use, sleep, smoking and eating behaviour. Descriptive, visual or statistical analyses were used. We identify scope and opportunities for using n-of-1 methods to answer key questions in health behaviour research. n-of-1 methods provide the tools needed to help advance theoretical knowledge and personalise/tailor health behaviour interventions to individuals.

  14. Statistical Correction of Air Temperature Forecasts for City and Road Weather Applications

    NASA Astrophysics Data System (ADS)

    Mahura, Alexander; Petersen, Claus; Sass, Bent; Gilet, Nicolas

    2014-05-01

    The method for statistical correction of air/road surface temperature forecasts was developed based on analysis of long-term time series of meteorological observations and forecasts (from the HIgh Resolution Limited Area Model and the Road Conditions Model; 3 km horizontal resolution). It was tested for May-Aug 2012 and Oct 2012-Mar 2013, respectively. The developed method is based mostly on forecasted meteorological parameters, with a minimal inclusion of observations (covering only a pre-history period). Although the first-iteration correction takes relevant temperature observations into account, the further adjustment of air and road temperature forecasts is based purely on forecasted meteorological parameters. The method is model independent, i.e. it can be applied for temperature correction with other types of models having different horizontal resolutions. It is relatively fast due to application of the singular value decomposition method to the matrix solution for the coefficients. Moreover, there is always a possibility for additional improvement through extra tuning of the temperature forecasts for some locations (stations), in particular where the MAEs are generally higher than elsewhere (see Gilet et al., 2014). For city weather applications, a new operational procedure for statistical correction of the air temperature forecasts has been elaborated and implemented for the HIRLAM-SKA model runs at 00, 06, 12, and 18 UTC, covering forecast lengths up to 48 hours. The procedure includes segments for extraction of observations and forecast data, assigning these to forecast lengths, statistical correction of temperature, one- and multi-day statistical evaluation of model performance, decision-making on using corrections by station, interpolation, visualisation, and storage/backup. Pre-operational air temperature correction runs have been performed for mainland Denmark since mid-April 2013 and have shown good results. 
Tests also showed that the CPU time required for the operational procedure is relatively short (less than 15 minutes, a large part of which is spent on interpolation). They also showed that long-term pre-historical data (containing forecasts and observations) are not needed to start correcting forecasts; at least a couple of weeks is sufficient when a new observational station is included and added to the forecast point. For the road weather application, the operationalisation of the statistical correction of road surface temperature forecasts (for the RWM system's daily hourly runs covering forecast lengths up to 5 hours ahead) for the Danish road network (about 400 road stations) was also implemented, and it has been running in a test mode since Sep 2013. The method can also be applied for correction of the dew point temperature and wind speed (as part of observations/forecasts at synoptic stations), since both of these meteorological parameters are parts of the proposed system of equations. Evaluation of the method's performance for improving wind speed forecasts is planned as well, along with consideration of possible wind direction improvements (which is more complex due to the multi-modal distribution of such data). The method worked for the entire domain of mainland Denmark (tested for 60 synoptic and 395 road stations) and hence can be applied at any geographical point within this domain, e.g. through interpolation to about 100 cities' locations (for the Danish national byvejr forecasts). Moreover, we can assume that the same method can be used in other geographical areas; evaluation for other domains (with a focus on Greenland and the Nordic countries) is planned. In addition, a similar approach might be tested for statistical correction of concentrations of chemical species, but that will require additional elaboration and evaluation.
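The "matrix solution ... via singular value decomposition" mentioned above amounts to solving an ordinary least-squares regression through an SVD. A minimal sketch with synthetic data; all variable names and the toy forecast/observation model are illustrative assumptions, not the operational HIRLAM setup:

```python
import numpy as np

def svd_regression(X, y):
    """Least-squares coefficients beta for y ~ X @ beta, via SVD.
    The SVD route is numerically stable even when predictors
    (e.g. forecast temperature, wind, humidity) are nearly collinear."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return Vt.T @ ((U.T @ y) / s)

# Synthetic example: learn a bias correction from forecast/observation pairs.
rng = np.random.default_rng(1)
forecast = rng.normal(10.0, 5.0, size=200)             # forecast 2 m temperature
obs = 0.9 * forecast + 1.5 + rng.normal(0, 0.3, 200)   # synthetic "observed" truth
X = np.column_stack([np.ones_like(forecast), forecast])
intercept, slope = svd_regression(X, obs)
corrected = intercept + slope * forecast               # corrected forecast
```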

  15. Vector wind profile gust model

    NASA Technical Reports Server (NTRS)

    Adelfang, S. I.

    1981-01-01

    To enable development of a vector wind gust model suitable for orbital flight test operations and trade studies, hypotheses concerning the distributions of gust component variables were verified. Methods for verifying the hypotheses that observed gust variables, including gust component magnitude, gust length, u range, and L range, are gamma distributed are presented. Observed gust modulus is shown to be drawn from a bivariate gamma distribution that can be approximated with a Weibull distribution. Zonal and meridional gust components are bivariate gamma distributed. An analytical method for testing for bivariate gamma distributed variables is presented. Two distributions for gust modulus are described, and the results of extensive hypothesis testing of one of the distributions are presented. The validity of the gamma distribution for representation of gust component variables is established.
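A common first step in testing a gamma-distribution hypothesis of this kind is a method-of-moments fit, whose parameters then feed a goodness-of-fit test. The sketch below shows only the moment estimators, on made-up data rather than the study's gust measurements:

```python
import numpy as np

def gamma_mom(x):
    """Method-of-moments estimates of the gamma parameters:
    shape = mean^2 / variance, scale = variance / mean.
    (These follow from E[X] = shape * scale, Var[X] = shape * scale^2.)"""
    x = np.asarray(x, float)
    m, v = x.mean(), x.var()
    return m * m / v, v / m

# Deterministic illustration: sample mean 2.5, population variance 1.25.
shape, scale = gamma_mom([1.0, 2.0, 3.0, 4.0])
print(shape, scale)  # 5.0 0.5
```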

  16. Plate measurement techniques and reduction methods used by the West German satellite observers, and resulting consequences for the observation

    NASA Technical Reports Server (NTRS)

    Deker, H.

    1971-01-01

    The West German tracking stations are equipped with ballistic cameras. Plate measurement and plate reduction must therefore follow photogrammetric methods. Approximately 100 star positions and 200 satellite positions are measured on each plate. The mathematical model for spatial rotation of the bundle of rays is extended by including terms for distortion and internal orientation of the camera as well as by providing terms for refraction which are computed for the measured coordinates of the star positions on the plate. From the measuring accuracy of the plate coordinates it follows that the timing accuracy for the exposures has to be about one millisecond, in order to obtain a homogeneous system.

  17. Some observations on glass-knife making.

    PubMed

    Ward, R T

    1977-11-01

    The yield of usable knife edge per knife (for thin sectioning) was markedly increased when glass knives were made at an included angle of 55 degrees rather than the customary 45 degrees. A large number of measurements of edge check marks made with a routine light-scattering method, as well as observations made on a smaller number of test sections with the electron microscope, indicated the superiority of 55-degree knives. Knives were made with both taped pliers and an LKB Knifemaker, and were graded by methods easily applied in any biological electron microscope laboratory. Depending on the mode of fracture, the yield of knives having more than 33% of their edges free of check marks was 30 to 100 times greater at 55 degrees than at 45 degrees.

  18. Video surveillance captures student hand hygiene behavior, reactivity to observation, and peer influence in Kenyan primary schools.

    PubMed

    Pickering, Amy J; Blum, Annalise G; Breiman, Robert F; Ram, Pavani K; Davis, Jennifer

    2014-01-01

    In-person structured observation is considered the best approach for measuring hand hygiene behavior, yet is expensive, time consuming, and may alter behavior. Video surveillance could be a useful tool for objectively monitoring hand hygiene behavior if validated against current methods. Student hand cleaning behavior was monitored with video surveillance and in-person structured observation, both simultaneously and separately, at four primary schools in urban Kenya over a study period of 8 weeks. Video surveillance and in-person observation captured similar rates of hand cleaning (absolute difference <5%, p = 0.74). Video surveillance documented higher hand cleaning rates (71%) when at least one other person was present at the hand cleaning station, compared to when a student was alone (48%; rate ratio = 1.14 [95% CI 1.01-1.28]). Students increased hand cleaning rates during simultaneous video and in-person monitoring as compared to single-method monitoring, suggesting reactivity to each method of monitoring. This trend was documented at schools receiving a handwashing with soap intervention, but not at schools receiving a sanitizer intervention. Video surveillance of hand hygiene behavior yields results comparable to in-person observation among schools in a resource-constrained setting. Video surveillance also has certain advantages over in-person observation, including rapid data processing and the capability to capture new behavioral insights. Peer influence can significantly improve student hand cleaning behavior and, when possible, should be exploited in the design and implementation of school hand hygiene programs.

  19. Observations on anatomical aspects of the fruit, leaf and stem tissues of four Citrullus spp.

    USDA-ARS?s Scientific Manuscript database

    Morphological characteristics of the fruit, stem and leaf tissues of four species of Citrullus (L.) Schrad. were examined using standard histological methods. Plant materials included the cultivated watermelon (C. lanatus (Thunb.) Matsum. & Nakai) and three of its related species; C. colocynthis (...

  20. The Illogic of Youth Driving Culture

    ERIC Educational Resources Information Center

    Tilleczek, Kate C.

    2004-01-01

    Most adolescent deaths are caused by injury sustained in traffic crashes, and driver education does not necessarily reduce the problem. This multi-method, ethnographic study describes the logic and regulation of youth driving culture in a northern Ontario community. This included 40 hours of participant observation and a survey of 88 novice…

  1. Flipping Undergraduate Finite Mathematics: Findings and Implications

    ERIC Educational Resources Information Center

    Guerrero, Shannon; Beal, Melissa; Lamb, Chris; Sonderegger, Derek; Baumgartel, Drew

    2015-01-01

    This paper reports on a research project that investigated the effects of a flipped instructional approach on student attitudes and achievement in a lower division university-level Finite Mathematics course. The project employed a mixed-methods design that included content exams, an attitude survey, open-ended student responses, observations, and…

  2. Mediating Factors in Literacy Instruction: How Novice Elementary Teachers Navigate New Teaching Contexts

    ERIC Educational Resources Information Center

    Scales, Roya Qualls; Wolsey, Thomas DeVere; Young, Janet; Smetana, Linda; Grisham, Dana L.; Lenski, Susan; Dobler, Elizabeth; Yoder, Karen Kreider; Chambers, Sandra A.

    2017-01-01

    This longitudinal study, framed by activity theory, examines what seven novice teachers' talk and actions reveal about their literacy teaching practices then delves into mediating influences of the teaching context. Utilizing collective, multi-case methods, data sources included interviews, observations, and artifacts. Findings indicate novices…

  3. Acoustic monitoring system to quantify ingestive behavior of free-grazing cattle

    USDA-ARS?s Scientific Manuscript database

    Methods to estimate intake in grazing livestock include using markers, visual observation, mechanical sensors that respond to jaw movement and acoustic recording. In most of the acoustic monitoring studies, the microphone is inverted on the forehead of the grazing livestock and the skull is utilize...

  4. Family Sense-Making Practices in Science Center Conversations

    ERIC Educational Resources Information Center

    Zimmerman, Heather Toomey; Reeve, Suzanne; Bell, Philip

    2010-01-01

    In this paper, we examine the interactional ways that families make meaning from biological exhibits during a visit to an interactive science center. To understand the museum visits from the perspectives of the families, we use ethnographic and discourse analytic methods, including pre- and postvisit interviews, videotaped observations of the…

  5. Crossing the Chasm--Introducing Flexible Learning into the Botswana Technical Education Programme: From Policy to Action

    ERIC Educational Resources Information Center

    Richardson, Alison Mead

    2009-01-01

    This paper reports on a longitudinal, ethnomethodological case study of the development towards flexible delivery of the Botswana Technical Education Programme (BTEP), offered by Francistown College of Technical & Vocational Education (FCTVE). Data collection methods included documentary analysis, naturalistic participant observation, and…

  6. Best Laid Plans: How Community College Student Success Courses Work

    ERIC Educational Resources Information Center

    Hatch, Deryl K.; Mardock-Uman, Naomi; Garcia, Crystal E.; Johnson, Mary

    2018-01-01

    Objective: Beyond understanding whether first-year student success interventions in community colleges are effective--for which there is mixed evidence in the literature--this study's purpose was to uncover how they work to realize observed outcomes, including at times unanticipated undesirable outcomes. Method: This qualitative multiple case…

  7. SCHOOL INTEGRATION CONTROVERSIES IN NEW YORK CITY, A PILOT STUDY.

    ERIC Educational Resources Information Center

    SWANSON, BERT E.

    THE MAJOR PROBLEM OF THIS PREPARATORY RESEARCH PROGRAM WAS TO ASCERTAIN THE FEASIBILITY OF MAKING A FULL-SCALE STUDY OF THE DYNAMICS OF SCHOOL INTEGRATION CONTROVERSIES IN NEW YORK CITY. METHODS INVOLVED INTERVIEWING AND OBSERVING LEADERS AT CITYWIDE AND NEIGHBORHOOD LEVELS, INCLUDING SCHOOL ADMINISTRATORS, SCHOOL BOARD MEMBERS, TEACHERS, PARENT…

  8. Exploratory Analysis in Learning Analytics

    ERIC Educational Resources Information Center

    Gibson, David; de Freitas, Sara

    2016-01-01

    This article summarizes the methods, observations, challenges and implications for exploratory analysis drawn from two learning analytics research projects. The cases include an analysis of a games-based virtual performance assessment and an analysis of data from 52,000 students over a 5-year period at a large Australian university. The complex…

  9. Delivery Systems: "Saber Tooth" Effect in Counseling.

    ERIC Educational Resources Information Center

    Traylor, Elwood B.

    This study reported the role of counselors as perceived by black students in a secondary school. Observational and interview methods were employed to obtain data from 24 black students selected at random from the junior and senior classes of a large metropolitan secondary school. Findings include: counselors were essentially concerned with…

  10. Laptop Computers in the Elementary Classroom: Authentic Instruction with At-Risk Students

    ERIC Educational Resources Information Center

    Kemker, Kate; Barron, Ann E.; Harmes, J. Christine

    2007-01-01

    This case study investigated the integration of laptop computers into an elementary classroom in a low socioeconomic status (SES) school. Specifically, the research examined classroom management techniques and aspects of authentic learning relative to the student projects and activities. A mixed methods approach included classroom observations,…

  11. A Program Evaluation of a Leadership Academy for School Principals

    ERIC Educational Resources Information Center

    Wagner, Kristi E.

    2014-01-01

    This program evaluation focused on mid-range outcomes of a leadership academy for school principals. The mixed-methods evaluation included interviews, principals' instructional observation database, and teacher surveys. The Principal Academy program was designed to build principals' knowledge of high-yield instructional strategies (Hattie, 2009),…

  12. Emerging Technologies for Assembly of Microscale Hydrogels

    PubMed Central

    Kavaz, Doga; Demirel, Melik C.; Demirci, Utkan

    2013-01-01

    Assembly of cell encapsulating building blocks (i.e., microscale hydrogels) has significant applications in areas including regenerative medicine, tissue engineering, and cell-based in vitro assays for pharmaceutical research and drug discovery. Inspired by the repeating functional units observed in native tissues and biological systems (e.g., the lobule in liver, the nephron in kidney), assembly technologies aim to generate complex tissue structures by organizing microscale building blocks. Novel assembly technologies enable fabrication of engineered tissue constructs with controlled properties including tunable microarchitectural and predefined compositional features. Recent advances in micro- and nano-scale technologies have enabled engineering of microgel based three dimensional (3D) constructs. There is a need for high-throughput and scalable methods to assemble microscale units with a complex 3D micro-architecture. Emerging assembly methods include novel technologies based on microfluidics, acoustic and magnetic fields, nanotextured surfaces, and surface tension. In this review, we survey emerging microscale hydrogel assembly methods offering rapid, scalable microgel assembly in 3D, and provide future perspectives and discuss potential applications. PMID:23184717

  13. Leveraging the Unified Access Framework: A Tale of an Integrated Ocean Data Prototype

    NASA Astrophysics Data System (ADS)

    O'Brien, K.; Kern, K.; Smith, B.; Schweitzer, R.; Simons, R.; Mendelssohn, R.; Diggs, S. C.; Belbeoch, M.; Hankin, S.

    2014-12-01

The Tropical Pacific Observing System (TPOS) has been functioning and capturing measurements since the mid-1990s, having been established during the very successful Tropical Ocean Global Atmosphere (TOGA) project. Unfortunately, in the current environment, some 20 years after the end of the TOGA project, sustaining the observing system is proving difficult. With the many advances in methods of observing the ocean, a group of scientists is taking a fresh look at what the Tropical Pacific Observing System requires for sustainability. This includes utilizing a wide variety of observing system platforms, including Argo floats, unmanned drifters, moorings, and ships. This variety of platforms measuring ocean data also presents a significant challenge for integrated data management. It is recognized that data and information management is crucial to the success and impact of any observing system. To be successful, it is also crucial to avoid building stovepipes for data management. To that end, NOAA's Observing System Monitoring Center (OSMC) has been tasked to create a testbed of integrated real-time and delayed-mode observations for the Tropical Pacific region in support of the TPOS. The observing networks included in the prototype are: Argo floats, OceanSITES moorings, drifting buoys, hydrographic surveys, underway carbon observations and, of course, real-time ocean measurements. In this presentation, we will discuss how the OSMC project is building the integrated data prototype using existing free and open source software. We will explore how we are leveraging successful data management frameworks pioneered by efforts such as NOAA's Unified Access Framework project. We will also show examples of how conforming to well-known conventions and standards allows for discoverability, usability and interoperability of data.

  14. Development of an environmental high-voltage electron microscope for reaction science.

    PubMed

    Tanaka, Nobuo; Usukura, Jiro; Kusunoki, Michiko; Saito, Yahachi; Sasaki, Katuhiro; Tanji, Takayoshi; Muto, Shunsuke; Arai, Shigeo

    2013-02-01

Environmental transmission electron microscopy and ultra-high resolution electron microscopic observation using aberration correctors have recently emerged as topics of great interest. The former method is an extension of the so-called in situ electron microscopy that has been performed since the 1970s. Current research in this area has been focusing on dynamic observation with atomic resolution under gaseous atmospheres and in liquids. Since 2007, Nagoya University has been developing a new 1-MV high-voltage (scanning) transmission electron microscope that can be used to observe nanomaterials under conditions that include the presence of gases, liquids and illuminating light, and it can also be used to perform mechanical operations on nanometre-sized areas as well as electron tomography and elemental analysis by electron energy loss spectroscopy. The new instrument has been used to image and analyse various types of samples, including biological ones.

  15. Seismic data fusion anomaly detection

    NASA Astrophysics Data System (ADS)

    Harrity, Kyle; Blasch, Erik; Alford, Mark; Ezekiel, Soundararajan; Ferris, David

    2014-06-01

Detecting anomalies in non-stationary signals has valuable applications in many fields, including medicine and meteorology, such as identifying possible heart conditions from an electrocardiography (ECG) signal or predicting earthquakes via seismographic data. Among the many available anomaly detection algorithms, it is important to compare candidate methods. In this paper, we examine and compare two approaches to anomaly detection and assess how data fusion methods may improve performance. The first approach uses an artificial neural network (ANN) to detect anomalies in a wavelet de-noised signal. The second uses a perspective neural network (PNN) to analyze an arbitrary number of "perspectives", or transformations, of the observed signal for anomalies. Possible perspectives include wavelet de-noising, the Fourier transform, peak-filtering, etc. To evaluate these techniques via signal fusion metrics, we first apply signal preprocessing techniques, such as de-noising, to the original signal and then use a neural network to find anomalies in the resulting signal. From this secondary result it is possible to apply data fusion techniques that can be evaluated with existing data fusion metrics for single and multiple perspectives. The result shows which anomaly detection method, according to the metrics, is better suited overall for anomaly detection applications. The method used in this study could be applied to compare other signal processing algorithms.
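
    The wavelet preprocessing step described in this record can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: it uses a one-level Haar transform for de-noising and a simple z-score threshold standing in for the trained ANN detector; the signal and threshold values are invented.

```python
import math

def haar_denoise(signal, thresh):
    """One-level Haar wavelet shrinkage: small detail coefficients
    (assumed to be noise) are zeroed before reconstruction."""
    approx, detail = [], []
    for i in range(0, len(signal) - 1, 2):
        a, b = signal[i], signal[i + 1]
        approx.append((a + b) / math.sqrt(2))
        detail.append((a - b) / math.sqrt(2))
    detail = [d if abs(d) > thresh else 0.0 for d in detail]
    out = []
    for s, d in zip(approx, detail):
        out.append((s + d) / math.sqrt(2))
        out.append((s - d) / math.sqrt(2))
    return out

def flag_anomalies(signal, k=3.0):
    """Flag samples more than k standard deviations from the mean
    (a crude stand-in for the trained neural network detector)."""
    n = len(signal)
    mean = sum(signal) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in signal) / n)
    return [i for i, x in enumerate(signal) if abs(x - mean) > k * std]

# A flat signal with one spike: de-noising keeps the spike (its detail
# coefficient is large), and the detector then flags it.
sig = [1.0] * 32
sig[10] = 9.0
clean = haar_denoise(sig, thresh=0.5)
print(flag_anomalies(clean))  # → [10]
```

A multi-perspective variant, as in the PNN approach, would run the detector on several such transformations and fuse the flagged indices.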

  16. Observation of asphalt binder microstructure with ESEM.

    PubMed

    Mikhailenko, P; Kadhim, H; Baaj, H; Tighe, S

    2017-09-01

The observation of asphalt binder with the environmental scanning electron microscope (ESEM) has shown the potential to reveal asphalt binder microstructure and its evolution with binder aging. A procedure for the induction and identification of the microstructure in asphalt binder was established in this study, including sample preparation and observation parameters. A suitable heat-based sample preparation method was determined for the test, and several stainless steel and Teflon sample moulds were developed; stainless steel was found to be the preferable material. The magnification and ESEM settings conducive to observing the 3D microstructure were determined through a number of observations to be 1000×, although other magnifications could be considered. Both a straight-run binder (PG 58-28) and an air-blown oxidised binder were analysed, and their structures were compared for relative size, abundance and other characteristics, showing a clear evolution in the fibril microstructure. The microstructure took longer to appear for the oxidised binder. It was confirmed that the fibril microstructure corresponded to actual characteristics of the asphalt binder. Additionally, a 'bee' micelle structure was found as a transitional structure in ESEM observation. The test methods in this study will be used for more comprehensive analysis of asphalt binder microstructure. © 2017 The Authors Journal of Microscopy © 2017 Royal Microscopical Society.

  17. Unmanned aerial vehicles for surveying marine fauna: assessing detection probability.

    PubMed

    Hodgson, Amanda; Peel, David; Kelly, Natalie

    2017-06-01

Aerial surveys are conducted for various fauna to assess abundance, distribution, and habitat use over large spatial scales. They are traditionally conducted using light aircraft, with observers recording sightings in real time. Unmanned Aerial Vehicles (UAVs) offer an alternative with many potential advantages, including eliminating human risk. To be effective, this emerging platform needs to provide detection rates of animals comparable to traditional methods. UAVs can also acquire new types of information, and these new data require a reevaluation of the traditional analyses used in aerial surveys, including estimation of the probability of detecting animals. We conducted 17 replicate UAV surveys of humpback whales (Megaptera novaeangliae) while simultaneously obtaining a 'census' of the population from land-based observations, to assess UAV detection probability. The ScanEagle UAV, carrying a digital SLR camera, continuously captured images (with 75% overlap) along transects covering the visual range of land-based observers. We also used ScanEagle to conduct focal follows of whale pods (n = 12, mean duration = 40 min), to assess a new method of estimating availability. A comparison of the whale detections from the UAV to the land-based census provided an estimated UAV detection probability of 0.33 (CV = 0.25; incorporating both availability and perception biases), which was not affected by environmental covariates (Beaufort sea state, glare, and cloud cover). According to our focal follows, the mean availability was 0.63 (CV = 0.37), with pods including mother/calf pairs having a higher availability (0.86, CV = 0.20) than those without (0.59, CV = 0.38). The follows also revealed (and provided a potential correction for) a downward bias in group size estimates from the UAV surveys, which resulted from asynchronous diving within whale pods and a relatively short observation window of 9 s.
We have shown that UAVs are an effective alternative to traditional methods, providing a detection probability that is within the range of previous studies for our target species. We also describe a method of assessing availability bias that represents spatial and temporal characteristics of a survey, from the same perspective as the survey platform, is benign, and provides additional data on animal behavior. © 2017 by the Ecological Society of America.
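
    As a rough illustration of how such bias corrections are applied in aerial survey analysis, the sketch below scales a raw count by availability and perception probabilities. Only the availability value (0.63) comes from this record; the raw count, the perception probability, and the group-size correction are hypothetical.

```python
def corrected_abundance(raw_count, availability, perception):
    """Scale a raw aerial count by the probability that an animal was
    available at the surface AND perceived in the imagery — a standard
    Horvitz-Thompson-style correction in aerial survey analysis."""
    return raw_count / (availability * perception)

def group_size_correction(observed_size, p_visible):
    """Hypothetical inflation of an observed pod size when members dive
    asynchronously and only a fraction are visible at once."""
    return observed_size / p_visible

# Illustrative numbers: availability 0.63 as reported from the focal
# follows; the raw count (20) and perception probability (0.8) are
# invented for this sketch.
print(round(corrected_abundance(20, 0.63, 0.8), 1))
print(group_size_correction(3, 0.75))  # → 4.0
```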

  18. Analysis of survival data from telemetry projects

    USGS Publications Warehouse

    Bunck, C.M.; Winterstein, S.R.; Pollock, K.H.

    1985-01-01

Telemetry techniques can be used to study the survival rates of animal populations and are particularly suitable for species or settings for which band recovery models are not. Statistical methods for estimating survival rates and parameters of survival distributions from observations of radio-tagged animals will be described. These methods have been applied in medical and engineering studies and to the study of nest success. Estimates and tests based on discrete models, originally introduced by Mayfield, and on continuous models, both parametric and nonparametric, will be described. Generalizations, including staggered entry of subjects into the study and identification of mortality factors, will be considered. Additional discussion topics will include sample size considerations, relocation frequency for subjects, and the use of covariates.
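
    A minimal sketch of a nonparametric (Kaplan-Meier-style) survival estimate with staggered entry, of the kind this record describes for radio-tagged animals. The data and estimator details below are illustrative assumptions, not taken from the paper:

```python
def kaplan_meier(records):
    """Product-limit survival estimate from (entry, exit, died) triples.
    Staggered entry: an animal is only at risk between the time it was
    radio-tagged and the time it died or its signal was lost (censored
    animals leave the risk set without registering a death)."""
    death_times = sorted({stop for start, stop, died in records if died})
    curve, s = [], 1.0
    for t in death_times:
        at_risk = sum(1 for start, stop, _ in records if start < t <= stop)
        deaths = sum(1 for _, stop, died in records if stop == t and died)
        s *= 1.0 - deaths / at_risk
        curve.append((t, s))
    return curve

# Hypothetical tracking data: (week tagged, last week observed, death
# observed?). Animals 2 and 4 are censored; animals 3 and 4 enter late.
records = [(0, 4, True), (0, 6, False), (2, 8, True), (3, 9, False)]
print(kaplan_meier(records))  # → [(4, 0.75), (8, 0.375)]
```

At week 4 all four animals are at risk (S = 3/4); by week 8 only the two late entrants remain, so the second death halves the estimate.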

  19. The Dust Storm Index (DSI): A method for monitoring broadscale wind erosion using meteorological records

    NASA Astrophysics Data System (ADS)

    O'Loingsigh, T.; McTainsh, G. H.; Tews, E. K.; Strong, C. L.; Leys, J. F.; Shinkfield, P.; Tapper, N. J.

    2014-03-01

Wind erosion of soils is a natural process that has shaped semi-arid and arid landscapes for millennia. This paper describes the Dust Storm Index (DSI), a methodology for monitoring wind erosion at continental scale using Australian Bureau of Meteorology (ABM) meteorological observational data extending back to the mid-1960s. While the 46-year length of the DSI record is its greatest strength from a wind erosion monitoring perspective, there are a number of technical challenges to its use, because when the World Meteorological Organisation (WMO) recording protocols were established, the use of the data for wind erosion monitoring was never intended. Data recording and storage protocols are examined, including the effects of changes to the definition of how observers should interpret and record dust events. A method is described for selecting the 180 long-term ABM stations used in this study, and the limitations of variable observation frequencies between stations are in part resolved. The rationale behind the DSI equation is explained, and the temporal and spatial data visualisation products presented include a long-term national wind erosion record (1965-2011), continental DSI maps, and maps of the erosion event types that are factored into the DSI equation. The DSI is tested against dust concentration data and found to provide an accurate representation of wind erosion activity. As the ABM observational records used here were collected according to WMO protocols, the DSI methodology could be used in all countries with WMO-compatible meteorological observation and recording systems.
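
    The DSI equation itself weights dust-event counts by severity class. As a sketch only: the structure below is the general shape of such an index, but the weights are placeholders, not the published coefficients, which are defined in the paper.

```python
def dust_storm_index(severe, moderate, local, weights=(5.0, 1.0, 0.05)):
    """Weighted sum of annual dust-event counts at a station.
    NOTE: the weights here are illustrative placeholders; the actual
    DSI coefficients are those defined in the publication."""
    w_severe, w_moderate, w_local = weights
    return w_severe * severe + w_moderate * moderate + w_local * local

# Hypothetical station-year: 2 severe dust storms, 7 moderate dust
# storms, and 40 local dust events.
print(dust_storm_index(2, 7, 40))
```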

  20. HOOKED FLARE RIBBONS AND FLUX-ROPE-RELATED QSL FOOTPRINTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Jie; Li, Hui; Gilchrist, Stuart A.

    2016-05-20

We studied the magnetic topology of active region 12158 on 2014 September 10 and compared it with the observations before and early in the flare that began at 17:21 UT (SOL2014-09-10T17:45:00). Our results show that the sigmoidal structure and flare ribbons of this active region observed by the Solar Dynamics Observatory/Atmospheric Imaging Assembly can be well reproduced with a Grad–Rubin nonlinear force-free field extrapolation method. Various inverse-S- and inverse-J-shaped magnetic field lines, which surround a coronal flux rope, coincide with the sigmoid as observed in different extreme-ultraviolet wavelengths, including its multithreaded curved ends. Also, the observed distribution of surface currents in the magnetic polarity where it was not prescribed is well reproduced. This validates our numerical implementation and setup of the Grad–Rubin method. The modeled double inverse-J-shaped quasi-separatrix layer (QSL) footprints match the observed flare ribbons during the rising phase of the flare, including their hooked parts. The spiral-like shape of the latter may be related to a complex pre-eruptive flux rope with more than one turn of twist, as obtained in the model. These ribbon-associated flux-rope QSL footprints are consistent with the new standard flare model in 3D, with the presence of a hyperbolic flux tube located below an inverse-teardrop-shaped coronal QSL. This is a new step toward forecasting the locations of reconnection and ribbons in solar flares and the geometrical properties of eruptive flux ropes.

  1. A Comparison of Assessment Tools: Is Direct Observation an Improvement Over Objective Structured Clinical Examinations for Communications Skills Evaluation?

    PubMed

    Goch, Abraham M; Karia, Raj; Taormina, David; Kalet, Adina; Zuckerman, Joseph; Egol, Kenneth A; Phillips, Donna

    2018-04-01

Evaluation of resident physicians' communications skills is a challenging task and is increasingly accomplished with standardized examinations. There exists a need to identify effective, efficient methods for the assessment of communications skills. We compared the objective structured clinical examination (OSCE) and direct observation as approaches for assessing resident communications skills. We conducted a retrospective cohort analysis of orthopaedic surgery resident physicians at a single tertiary care academic institution, using the Institute for Healthcare Communication "4 Es" model for effective communication. Data were collected between 2011 and 2015. Residents were included if they had 1 OSCE assessment and 2 or more complete direct observation assessments; 28 of a possible 59 residents (47%) met these criteria, each with OSCE and complete direct observation assessment checklists, and were included in the analysis. A total of 89% (25 of 28) of residents passed the communications skills OSCE; only 54% (15 of 28) passed the direct observation communications assessment. There was a positive, moderate correlation between OSCE and direct observation scores overall (r = 0.415, P = .028). There was no agreement between OSCE and direct observation in categorizing residents into passing and failing scores (κ = 0.205, P = .16), after adjusting for chance agreement. Our results suggest that OSCE and direct observation tools provide different insights into resident communications skills (simulation of rare and challenging situations versus real-life daily encounters), and may provide useful perspectives on resident communications skills in different contexts.
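
    The chance-adjusted agreement statistic reported here is Cohen's kappa. A minimal sketch for binary pass/fail ratings follows; the example counts are invented to mirror the study's pass rates, not its raw data.

```python
def cohens_kappa(pairs):
    """Chance-corrected agreement (Cohen's kappa) between two binary
    pass/fail ratings of the same subjects."""
    n = len(pairs)
    observed = sum(1 for a, b in pairs if a == b) / n
    pa = sum(1 for a, _ in pairs if a) / n  # rater 1 pass rate
    pb = sum(1 for _, b in pairs if b) / n  # rater 2 pass rate
    expected = pa * pb + (1 - pa) * (1 - pb)
    return (observed - expected) / (1 - expected)

# Hypothetical (OSCE pass?, direct observation pass?) outcomes for 28
# residents; the counts are invented for illustration.
pairs = [(True, True)] * 15 + [(True, False)] * 10 + [(False, False)] * 3
print(round(cohens_kappa(pairs), 3))  # → 0.243
```

High raw agreement can still yield low kappa when one rating's pass rate is near 100%, which is why the study's 89% vs. 54% pass rates produce little chance-adjusted agreement.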

  2. Uncertainty in Citizen Science observations: from measurement to user perception

    NASA Astrophysics Data System (ADS)

    Lahoz, William; Schneider, Philipp; Castell, Nuria

    2016-04-01

Citizen Science concerns general public engagement in scientific research, in which citizens actively contribute to science either with their intellectual effort or surrounding knowledge, or with their tools and resources. The advent of technologies such as the Internet and smartphones, and the growth in their usage, has significantly increased the potential benefits of Citizen Science activities. Citizen Science observations from low-cost sensors, smartphones and Citizen Observatories represent a novel and recent development in platforms for observing the Earth System, with the opportunity to extend the range of observational platforms available to society to spatio-temporal scales (10-100s m; 1 hr or less) highly relevant to citizen needs. The potential value of Citizen Science is high, with applications in science, education, social aspects, and policy aspects, but this potential, particularly for citizens and policymakers, remains largely untapped. Key areas where Citizen Science data are starting to show demonstrable benefits include GEOSS Societal Benefit Areas such as Health and Weather. Citizen Science observations present many challenges, including simulation of smaller spatial scales, noisy data, combination with traditional observational methods (satellite and in situ data), and the assessment, representation and visualization of uncertainty. Among these challenges, the assessment and representation of uncertainty and its communication to users is fundamental, as it provides the qualitative and/or quantitative information that influences the belief users will have in environmental information. This presentation will discuss the challenges in the assessment and representation of uncertainty in Citizen Science observations, its communication to users, including the use of visualization, and the perception of this uncertainty information by users of Citizen Science observations.

  3. Preliminary Error Budget for the Reflected Solar Instrument for the Climate Absolute Radiance and Refractivity Observatory

    NASA Technical Reports Server (NTRS)

    Thome, Kurtis; Gubbels, Timothy; Barnes, Robert

    2011-01-01

The Climate Absolute Radiance and Refractivity Observatory (CLARREO) plans to observe climate change trends over decadal time scales to determine the accuracy of climate projections. The project relies on spaceborne earth observations of SI-traceable variables sensitive to key decadal change parameters. The mission includes a reflected solar instrument retrieving at-sensor reflectance over the 320 to 2300 nm spectral range with 500-m spatial resolution and 100-km swath. Reflectance is obtained from the ratio of measurements of the earth's surface to those made while viewing the sun, relying on a calibration approach that retrieves reflectance with uncertainties less than 0.3%. The calibration is predicated on heritage hardware, reduction of sensor complexity, adherence to detector-based calibration standards, and an ability to simulate in the laboratory on-orbit sources in both size and brightness to provide the basis of a transfer to orbit of the laboratory calibration, including a link to absolute solar irradiance measurements. The Climate Absolute Radiance and Refractivity Observatory (CLARREO) mission addresses the need to observe high-accuracy, long-term climate change trends and to use decadal change observations as the most critical method to determine the accuracy of climate change projections such as those in the IPCC Report. A rigorously known accuracy of both decadal change observations and climate projections is critical in order to enable sound policy decisions.
The CLARREO Project will implement a spaceborne earth observation mission designed to provide rigorous SI-traceable observations (i.e., radiance, reflectance, and refractivity) that are sensitive to a wide range of key decadal change variables, including: 1) surface temperature and atmospheric temperature profile; 2) atmospheric water vapor profile; 3) far-infrared water vapor greenhouse; 4) aerosol properties and anthropogenic aerosol direct radiative forcing; 5) total and spectral solar irradiance; 6) broadband reflected and emitted radiative fluxes; 7) cloud properties; and 8) surface albedo. There are two methods the CLARREO mission will rely on to achieve these critical decadal change benchmarks: direct observation and reference inter-calibration. A quantitative analysis of the strengths and weaknesses of the two methods has led to the recommended CLARREO mission approach. The project consists of two satellites launched into 90-degree, precessing orbits separated by 90 degrees. The instrument suite on each spacecraft includes one emitted infrared spectrometer, two reflected solar spectrometers dividing the spectrum from the ultraviolet through the near infrared, and one global navigation receiver for radio occultation. The measurements will be acquired for a period of three years minimum, with a five-year lifetime goal, enabling follow-on missions to extend the climate record over the decades needed to understand climate change. The current work concentrates on the reflected solar instrument, giving an overview of its design and calibration approach. The calibration description includes the approach to achieving an SI-traceable system on orbit. The calibration overview is followed by a preliminary error budget based on techniques currently in place at the National Institute of Standards and Technology (NIST).

  4. Method for assessment of stormwater treatment facilities - Synthetic road runoff addition including micro-pollutants and tracer.

    PubMed

    Cederkvist, Karin; Jensen, Marina B; Holm, Peter E

    2017-08-01

Stormwater treatment facilities (STFs) are becoming increasingly widespread, but knowledge of their performance is limited. This is due to difficulties in obtaining representative samples during storm events and in documenting removal of the broad range of contaminants found in stormwater runoff. This paper presents a method to evaluate STFs by addition of synthetic runoff with representative concentrations of contaminant species, including the use of a tracer to correct removal rates for losses not caused by the STF. A list of organic and inorganic contaminant species, including trace elements, representative of road runoff is suggested, as well as relevant concentration ranges. The method was used for adding contaminants to three different STFs: a curbstone extension with filter soil, a dual porosity filter, and six different permeable pavements. Evaluation of the method showed that it is possible to add a well-defined mixture of contaminants under different field conditions by having a flexible system, mixing different stock solutions on site, and using a bromide tracer to correct outlet concentrations. Bromide recovery ranged from only 12% in one of the permeable pavements to 97% in the dual porosity filter, stressing the importance of including a conservative tracer when computing contaminant retention values. The method is considered useful in future treatment performance testing of STFs. The observed performance of the STFs is presented in forthcoming papers. Copyright © 2017 Elsevier Ltd. All rights reserved.
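
    One simple way to apply such a tracer correction is sketched below. This is an assumed formulation (conservative tracer, losses proportional across species), not necessarily the paper's exact equations; the concentrations are invented, and only the 97% bromide recovery figure comes from this record.

```python
def corrected_retention(c_in, c_out, br_in, br_out):
    """Fractional contaminant retention, with the outlet concentration
    corrected by bromide tracer recovery so that losses not caused by
    the facility (e.g. dilution or leakage) are not credited as
    treatment. Assumes the tracer behaves conservatively."""
    recovery = br_out / br_in           # fraction of tracer recovered
    return 1.0 - (c_out / recovery) / c_in

# Hypothetical event: a contaminant at 100 ug/L in and 30 ug/L out,
# with 97% bromide recovery (the best recovery reported above).
print(round(corrected_retention(100.0, 30.0, 1.0, 0.97), 3))
```

With low recovery (e.g. the 12% case), the correction matters enormously: most of the apparent "removal" would otherwise be water loss, not treatment.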

  5. A Literature Review of the Effect of Malaria on Stunting.

    PubMed

    Jackson, Bianca D; Black, Robert E

    2017-11-01

    Background: The current version of the Lives Saved Tool (LiST) maternal and child health impact modeling software does not include an effect of malaria on stunting. Objective: This literature review was undertaken to determine whether such a causal link should be included in the LiST model. Methods: The PubMed, Embase, and Scopus databases were searched by using broad search terms. The searches returned a total of 4281 documents. Twelve studies from among the retrieved documents were included in the review according to the inclusion and exclusion criteria. Results: There was mixed evidence for an effect of malaria on stunting among longitudinal observational studies, and none of the randomized controlled trials of malaria interventions found an effect of the interventions on stunting. Conclusions: There is insufficient evidence to include malaria as a determinant of stunting or an effect of malaria interventions on stunting in the LiST model. The paucity and heterogeneity of the available literature were a major limitation. In addition, the studies included in the review consistently fulfilled their ethical responsibility to treat children under observation for malaria, which may have interfered with the natural history of the disease and prevented any observable effect on stunting or linear growth. © 2017 American Society for Nutrition.

  6. Wavelet-Bayesian inference of cosmic strings embedded in the cosmic microwave background

    NASA Astrophysics Data System (ADS)

    McEwen, J. D.; Feeney, S. M.; Peiris, H. V.; Wiaux, Y.; Ringeval, C.; Bouchet, F. R.

    2017-12-01

Cosmic strings are a well-motivated extension to the standard cosmological model and could induce a subdominant component in the anisotropies of the cosmic microwave background (CMB), in addition to the standard inflationary component. The detection of strings, while observationally challenging, would provide a direct probe of physics at very high-energy scales. We develop a framework for cosmic string inference from observations of the CMB made over the celestial sphere, performing a Bayesian analysis in wavelet space where the string-induced CMB component has distinct statistical properties to the standard inflationary component. Our wavelet-Bayesian framework provides a principled approach to compute the posterior distribution of the string tension Gμ and the Bayesian evidence ratio comparing the string model to the standard inflationary model. Furthermore, we present a technique to recover an estimate of any string-induced CMB map embedded in observational data. Using Planck-like simulations, we demonstrate the application of our framework and evaluate its performance. The method is sensitive to Gμ ∼ 5 × 10⁻⁷ for Nambu-Goto string simulations that include an integrated Sachs-Wolfe contribution only and do not include any recombination effects, before any parameters of the analysis are optimized. The sensitivity of the method compares favourably with other techniques applied to the same simulations.

  7. Quantification of the Precipitation Loss of Radiation Belt Electrons Observed by SAMPEX (Invited)

    NASA Astrophysics Data System (ADS)

    Tu, W.; Li, X.; Selesnick, R. S.; Looper, M. D.

    2010-12-01

Based on SAMPEX/PET observations, the fluxes and the spatial and temporal variations of electron loss to the atmosphere in the Earth’s radiation belt were quantified using a drift-diffusion model that includes the effects of azimuthal drift and pitch angle diffusion. The electrons measured by SAMPEX can be classified as trapped, quasi-trapped (in the drift loss cone), or precipitating (in the bounce loss cone), and the model simulates the low-altitude electron distribution observed by SAMPEX. After fitting the model results to the data, the magnitudes and variations of the electron loss rate can be estimated from the optimum model parameter values. In this presentation we give an overview of our method and published results, followed by some recent improvements we have made to the model, including updating the quantified electron lifetimes more frequently (e.g., every two hours instead of half a day) to achieve smoother variations, estimating the adiabatic effects at SAMPEX’s orbit and their influence on our model results, and calculating the error bar associated with each quantified electron lifetime. This method, combining a model with low-altitude observations, provides direct quantification of the electron loss rate, as required for any accurate modeling of radiation belt electron dynamics.

  8. Community Structure Of Coral Reefs In Saebus Island, Sumenep District, East Java

    NASA Astrophysics Data System (ADS)

    Rizmaadi, Mada; Riter, Johannes; Fatimah, Siti; Rifaldi, Riyan; Yoga, Arditho; Ramadhan, Fikri; Ambariyanto, Ambariyanto

    2018-02-01

Increasing degradation of coral reef ecosystems has created many concerns. Reduction of this damage can only be achieved through good and proper management of coral reef ecosystems based on their existing condition. The condition of a coral reef ecosystem can be determined by assessing its community structure. This study investigates the community structure of coral reef ecosystems around Saebus Island, Sumenep District, East Java, using satellite imagery analysis and field observations. Satellite imagery analysis by the Lyzenga method was used to determine the observation stations and substrate distribution. Field observations were carried out using the Line Intercept Transect method at 4 stations, at depths of 3 and 10 meters. The results showed that the percentage of coral reef coverage at depths of 3 and 10 meters was 64.36% and 59.29%, respectively, both falling in the fine coverage category. This study found a total of 25 genera from 13 families of corals across all stations. The most common genera were Acropora, Porites, and Pocillopora, while the least common were Favites and Montastrea. The average values of the Diversity, Uniformity and Dominancy indices were 2.94, 0.8 and 0.18, classified as medium, high, and low, respectively. These results suggest that the coral reef ecosystems around Saebus Island are in good condition.
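
    The two quantities at the heart of this record are straightforward to compute. The sketch below shows Line Intercept Transect percent cover and a Shannon-type diversity index; the intercept lengths and 50 m transect length are invented (chosen to reproduce the 64.36% figure), and the index conventions are assumptions, not necessarily those used in the study.

```python
import math

def lit_cover(live_intercepts, transect_length):
    """Line Intercept Transect: percent cover is the summed length of
    live-coral intercepts divided by the total transect length."""
    return 100.0 * sum(live_intercepts) / transect_length

def shannon_diversity(counts):
    """Shannon diversity H' over genus abundance counts (natural log)."""
    n = sum(counts)
    return -sum(c / n * math.log(c / n) for c in counts if c)

# Hypothetical intercepts (m) along an assumed 50 m transect, chosen to
# reproduce the 64.36% cover reported at 3 m depth.
print(round(lit_cover([12.5, 8.0, 11.68], 50.0), 2))  # → 64.36
# 25 equally abundant genera would give H' = ln(25) ≈ 3.22; the study's
# H' of 2.94 reflects uneven abundances across genera.
print(round(shannon_diversity([10] * 25), 2))  # → 3.22
```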

  9. Anthropometric typology of male and female rowers using k-means clustering.

    PubMed

    Forjasz, Justyna

    2011-06-01

The aim of this paper is to present the morphological features of rowers. The objective is to establish the type of body build best suited to the present requirements of this sports discipline through the determination of the most important morphological features in rowing with regard to the type of racing boat. The subjects of this study were competitors who practise rowing and were members of the Junior National Team. The variables considered included a group of 32 anthropometric measurements of body composition determined using the BIA method among male and female athletes, also taking rowing boat categories into account. To determine the analysed structures of male and female rowers, a cluster analysis was performed using the k-means method. In the group of male and female rowers using long paddles, higher mean values of the analysed features were observed, with the exception of fat-free mass and water content in both genders, and of trunk length and horizontal reach in women, for which higher means were achieved in the short-paddle group. On the men's team, the two groups differed significantly in body mass, longitudinal features, horizontal reach, hand width and body circumferences, while on the women's team they differed in body mass, width and length of the chest, body circumferences and fat content. The grouping method used in this paper confirmed morphological differences among the competitors with regard to the type of racing boat.
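
    The grouping step is standard k-means (Lloyd's algorithm). A minimal 1-D sketch follows; the study clustered 32 features per athlete, and the body-mass values below are invented for illustration.

```python
import random

def kmeans(points, k, iters=100, seed=0):
    """Plain Lloyd's algorithm on 1-D feature values (for brevity;
    the same loop works on vectors with a vector distance)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda j: abs(p - centers[j]))
            clusters[nearest].append(p)
        # Recompute centers as cluster means; stop at a fixed point.
        new = [sum(c) / len(c) if c else centers[i]
               for i, c in enumerate(clusters)]
        if new == centers:
            break
        centers = new
    return sorted(centers)

# Hypothetical body masses (kg) showing two clear morphological groups.
masses = [62.0, 64.0, 63.0, 90.0, 92.0, 91.0]
print(kmeans(masses, k=2))  # → [63.0, 91.0]
```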

  10. Anthropometric Typology of Male and Female Rowers Using K-Means Clustering

    PubMed Central

    Forjasz, Justyna

    2011-01-01

The aim of this paper is to present the morphological features of rowers. The objective is to establish the type of body build best suited to the present requirements of this sports discipline through the determination of the most important morphological features in rowing with regard to the type of racing boat. The subjects of this study were competitors who practise rowing and were members of the Junior National Team. The variables considered included a group of 32 anthropometric measurements of body composition determined using the BIA method among male and female athletes, also taking rowing boat categories into account. To determine the analysed structures of male and female rowers, a cluster analysis was performed using the k-means method. In the group of male and female rowers using long paddles, higher mean values of the analysed features were observed, with the exception of fat-free mass and water content in both genders, and of trunk length and horizontal reach in women, for which higher means were achieved in the short-paddle group. On the men’s team, the two groups differed significantly in body mass, longitudinal features, horizontal reach, hand width and body circumferences, while on the women’s team they differed in body mass, width and length of the chest, body circumferences and fat content. The grouping method used in this paper confirmed morphological differences among the competitors with regard to the type of racing boat. PMID:23486287

  11. Network reconstruction via graph blending

    NASA Astrophysics Data System (ADS)

    Estrada, Rolando

    2016-05-01

    Graphs estimated from empirical data are often noisy and incomplete due to the difficulty of faithfully observing all the components (nodes and edges) of the true graph. This problem is particularly acute for large networks, where the number of components may far exceed available surveillance capabilities. Errors in the observed graph can render subsequent analyses invalid, so it is vital to develop robust methods that can minimize these observational errors. Errors in the observed graph may include missing and spurious components, as well as fused (multiple nodes merged into one) and split (a single node misinterpreted as many) nodes. Traditional graph reconstruction methods are only able to identify missing or spurious components (primarily edges, and to a lesser degree nodes), so we developed a novel graph blending framework that allows us to cast the full estimation problem as a simple edge addition/deletion problem. Armed with this framework, we systematically investigate the viability of various topological graph features, such as the degree distribution or the clustering coefficients, and existing graph reconstruction methods for tackling the full estimation problem. Our experimental results suggest that incorporating any topological feature as a source of information actually hinders reconstruction accuracy. We provide a theoretical analysis of this phenomenon and suggest several avenues for improving this estimation problem.
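
    The framework's central move, casting estimation as pure edge addition/deletion, can be sketched minimally as below. The graphs and node labels are invented; the real blending framework must additionally translate fused and split nodes into edge edits:

    ```python
    def edge_edits(observed, target):
        """Cast reconstruction as pure edge addition/deletion: return the
        edges missing from the observed graph and the spurious ones."""
        obs_set = {frozenset(e) for e in observed}
        tgt_set = {frozenset(e) for e in target}
        return tgt_set - obs_set, obs_set - tgt_set

    # Invented toy graphs: the observer missed edge b-d and hallucinated c-d
    observed = [("a", "b"), ("b", "c"), ("c", "d")]
    true_graph = [("a", "b"), ("b", "c"), ("b", "d")]
    to_add, to_delete = edge_edits(observed, true_graph)
    ```

    Any reconstruction method can then be scored by how many of these two edit sets it recovers.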

  12. Scalable parallel elastic-plastic finite element analysis using a quasi-Newton method with a balancing domain decomposition preconditioner

    NASA Astrophysics Data System (ADS)

    Yusa, Yasunori; Okada, Hiroshi; Yamada, Tomonori; Yoshimura, Shinobu

    2018-04-01

    A domain decomposition method for large-scale elastic-plastic problems is proposed. The proposed method is based on a quasi-Newton method in conjunction with a balancing domain decomposition preconditioner. The use of a quasi-Newton method overcomes two problems associated with the conventional domain decomposition method based on the Newton-Raphson method: (1) avoidance of a double-loop iteration algorithm, which generally has large computational complexity, and (2) consideration of the local concentration of nonlinear deformation, which is observed in elastic-plastic problems with stress concentration. Moreover, the application of a balancing domain decomposition preconditioner ensures scalability. Using the conventional and proposed domain decomposition methods, several numerical tests, including weak scaling tests, were performed. The convergence performance of the proposed method is comparable to that of the conventional method. In particular, in elastic-plastic analysis, the proposed method exhibits better convergence performance than the conventional method.
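
    The benefit of a quasi-Newton iteration, reusing a rank-one-updated Jacobian approximation instead of reassembling and refactorizing the true Jacobian each step, can be sketched on a toy 2×2 nonlinear system using Broyden's update. This is only an illustration of the general idea, not the paper's preconditioned finite-element solver:

    ```python
    def solve2(B, f):
        """Solve the 2x2 system B * d = f directly."""
        det = B[0][0] * B[1][1] - B[0][1] * B[1][0]
        return ((B[1][1] * f[0] - B[0][1] * f[1]) / det,
                (B[0][0] * f[1] - B[1][0] * f[0]) / det)

    def broyden(F, x, B, iters=50, tol=1e-12):
        """Broyden's quasi-Newton iteration: instead of rebuilding the
        Jacobian every step (the costly inner work of Newton-Raphson),
        apply a rank-one update to the current approximation B."""
        fx = F(x)
        for _ in range(iters):
            if abs(fx[0]) + abs(fx[1]) < tol:
                break
            d = solve2(B, fx)
            xn = (x[0] - d[0], x[1] - d[1])
            fn = F(xn)
            s = (xn[0] - x[0], xn[1] - x[1])
            y = (fn[0] - fx[0], fn[1] - fx[1])
            ss = s[0] * s[0] + s[1] * s[1]
            Bs = (B[0][0] * s[0] + B[0][1] * s[1],
                  B[1][0] * s[0] + B[1][1] * s[1])
            u = ((y[0] - Bs[0]) / ss, (y[1] - Bs[1]) / ss)  # (y - B s)/|s|^2
            B = [[B[0][0] + u[0] * s[0], B[0][1] + u[0] * s[1]],
                 [B[1][0] + u[1] * s[0], B[1][1] + u[1] * s[1]]]
            x, fx = xn, fn
        return x

    # Toy system: x + y = 3, x*y = 2; B starts as the exact Jacobian at x0
    F = lambda p: (p[0] + p[1] - 3.0, p[0] * p[1] - 2.0)
    root = broyden(F, (0.5, 2.5), [[1.0, 1.0], [2.5, 0.5]])
    ```

    The same trade-off, slightly slower convergence per step in exchange for much cheaper steps, motivates the single-loop algorithm in the paper.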

  13. Calculating observables in inhomogeneous cosmologies. Part I: general framework

    NASA Astrophysics Data System (ADS)

    Hellaby, Charles; Walters, Anthony

    2018-02-01

    We lay out a general framework for calculating the variation of a set of cosmological observables, down the past null cone of an arbitrarily placed observer, in a given arbitrary inhomogeneous metric. The observables include redshift, proper motions, area distance and redshift-space density. Of particular interest are observables that are zero in the spherically symmetric case, such as proper motions. The algorithm is based on the null geodesic equation and the geodesic deviation equation, and it is tailored to creating a practical numerical implementation. The algorithm provides a method for tracking which light rays connect moving objects to the observer at successive times. Our algorithm is applied to the particular case of the Szekeres metric. A numerical implementation has been created and some results will be presented in a subsequent paper. Future work will explore the range of possibilities.

  14. Summary of Fluidic Thrust Vectoring Research Conducted at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Deere, Karen A.

    2003-01-01

    Interest in low-observable aircraft and in lowering an aircraft's exhaust system weight sparked decades of research on fixed-geometry exhaust nozzles. The desire for such integrated exhaust nozzles was the catalyst for new fluidic control techniques, including throat-area control, expansion control, and thrust-vector-angle control. This paper summarizes a variety of fluidic thrust vectoring concepts that have been tested both experimentally and computationally at NASA Langley Research Center. The nozzle concepts are divided into three categories according to the method used for fluidic thrust vectoring: the shock vector control method, the throat shifting method, and the counterflow method. This paper explains the thrust vectoring mechanism for each fluidic method, provides examples of configurations tested for each method, and discusses the advantages and disadvantages of each.

  15. Tsunami simulation method initiated from waveforms observed by ocean bottom pressure sensors for real-time tsunami forecast; Applied for 2011 Tohoku Tsunami

    NASA Astrophysics Data System (ADS)

    Tanioka, Yuichiro

    2017-04-01

    After the tsunami disaster due to the 2011 Tohoku-oki great earthquake, improvement of tsunami forecasting has been an urgent issue in Japan. The National Institute of Disaster Prevention is installing a cable network system for earthquake and tsunami observation (S-NET) on the ocean bottom along the Japan and Kurile trenches. This cable system includes 125 pressure sensors (tsunami meters) separated by 30 km. Along the Nankai trough, JAMSTEC has already installed and operated cable network systems of seismometers and pressure sensors (DONET and DONET2). These are the densest observation network systems on top of the source areas of great underthrust earthquakes in the world. Real-time tsunami forecasting has traditionally depended on estimation of earthquake parameters, such as the epicenter, depth, and magnitude of the earthquake. Recently, a tsunami forecast method has been developed using the estimation of the tsunami source from tsunami waveforms observed at ocean-bottom pressure sensors. However, when many pressure sensors separated by 30 km lie on top of the source area, we do not need to estimate the tsunami source or the earthquake source to compute the tsunami. Instead, we can initiate a tsunami simulation directly from those dense tsunami observations. Observed tsunami height differences over a time interval at ocean-bottom pressure sensors separated by 30 km were used to estimate the tsunami height distribution at a particular time. In our new method, the tsunami numerical simulation is initiated from that estimated tsunami height distribution. In this paper, the above method is improved and applied to the tsunami generated by the 2011 Tohoku-oki great earthquake. The tsunami source model of the 2011 Tohoku-oki great earthquake estimated by Gusman et al. (2012) from observed tsunami waveforms and coseismic deformation observed by GPS and ocean-bottom sensors is used in this study. 
The ocean surface deformation is computed from the source model and used as the initial condition of the tsunami simulation. By assuming that this computed tsunami is a real tsunami observed at the ocean-bottom sensors, a new tsunami simulation is carried out using the above method. The stations in the assumed distribution (each station separated by 15 min., about 30 km) "observed" tsunami waveforms that were actually computed from the source model. Tsunami height distributions are estimated with the above method at 40, 80, and 120 seconds after the origin time of the earthquake. The Near-field Tsunami Inundation forecast method (Gusman et al. 2014) was used to estimate the tsunami inundation along the Sanriku coast. The results show that the observed tsunami inundation is well explained by the estimated inundation, and that it takes about 10 minutes from the origin time of the earthquake to estimate the tsunami inundation. The new method developed in this paper is thus very effective for real-time tsunami forecasting.
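
    One ingredient of the method, turning sparse sensor heights into a gridded initial condition for the simulation, can be sketched as a simple 1-D interpolation. The sensor positions and heights below are hypothetical; the real method estimates a 2-D height distribution from height differences over a time interval:

    ```python
    def initial_field(sensor_x, sensor_h, grid_x):
        """Linearly interpolate sparse sensor heights onto a model grid to
        build an initial sea-surface elevation field for the simulation."""
        segments = list(zip(sensor_x, sensor_h, sensor_x[1:], sensor_h[1:]))
        field = []
        for x in grid_x:
            for x0, h0, x1, h1 in segments:
                if x0 <= x <= x1:
                    field.append(h0 + (x - x0) / (x1 - x0) * (h1 - h0))
                    break
            else:
                field.append(0.0)  # outside sensor coverage
        return field

    sensor_x = [0.0, 30.0, 60.0, 90.0]   # km along a transect, ~30 km spacing
    sensor_h = [0.0, 1.5, 0.8, 0.0]      # m, hypothetical estimated heights
    grid_x = [float(i) for i in range(0, 91, 5)]  # 5 km model grid
    eta0 = initial_field(sensor_x, sensor_h, grid_x)
    ```

    The simulation is then stepped forward from `eta0` rather than from a fault-slip source model.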

  16. Solar cell efficiency and high temperature processing of n-type silicon grown by the noncontact crucible method

    DOE PAGES

    Jensen, Mallory A.; LaSalvia, Vincenzo; Morishige, Ashley E.; ...

    2016-08-01

    The capital expense (capex) of conventional crystal growth methods is a barrier to sustainable growth of the photovoltaic industry. It is challenging for innovative techniques to displace conventional growth methods due to the low dislocation density and high lifetime required for high-efficiency devices. One promising innovation in crystal growth is the noncontact crucible method (NOC-Si), which combines aspects of Czochralski (Cz) and conventional casting. This material has the potential to satisfy the dual requirements, with capex likely between that of Cz (high capex) and multicrystalline silicon (mc-Si, low capex). In this contribution, we observe a strong dependence of solar cell efficiency on ingot height, correlated with the evolution of swirl-like defects, for single-crystalline n-type silicon grown by the NOC-Si method. We posit that these defects are similar to those observed in Cz, and we explore the response of NOC-Si to high-temperature treatments including phosphorus diffusion gettering (PDG) and Tabula Rasa (TR). The highest lifetimes (2033 μs for the top of the ingot and 342 μs for the bottom) are achieved for TR followed by a PDG process comprising a standard plateau and a low-temperature anneal. Further improvements can be gained by tailoring the time-temperature profiles of each process. Lifetime analysis after the PDG process indicates the presence of a getterable impurity in the as-grown material, while analysis after TR points to the presence of oxide precipitates, especially at the bottom of the ingot. Uniform lifetime degradation is observed after TR, which we assign to a presently unknown defect. Future work includes additional TR processing to uncover the nature of this defect, microstructural characterization of suspected oxide precipitates, and optimization of the TR process to achieve the dual goals of high lifetime and spatial homogenization.

  17. Near-infrared Variability in the 2MASS Calibration Fields: A Search for Planetary Transit Candidates

    NASA Technical Reports Server (NTRS)

    Plavchan, Peter; Jura, M.; Kirkpatrick, J. Davy; Cutri, Roc M.; Gallagher, S. C.

    2008-01-01

    The Two Micron All Sky Survey (2MASS) photometric calibration observations cover approximately 6 square degrees on the sky in 35 'calibration fields,' each sampled in nominal photometric conditions between 562 and 3692 times during the 4 years of the 2MASS mission. We compile a catalog of variables from the calibration observations to search for M dwarfs transited by extrasolar planets. We present our methods for measuring periodic and nonperiodic flux variability. From 7554 sources with apparent K(sub s) magnitudes between 5.6 and 16.1, we identify 247 variables, including extragalactic variables and 23 periodic variables. We have discovered three M dwarf eclipsing systems, including two candidates for transiting extrasolar planets.

  18. An experimental facility for the visual study of turbulent flows.

    NASA Technical Reports Server (NTRS)

    Brodkey, R. S.; Hershey, H. C.; Corino, E. R.

    1971-01-01

    An experimental technique which allows visual observations of the wall area in turbulent pipe flow is described in detail. It requires neither the introduction of any injection or measuring device into the flow nor the presence of a two-phase flow or a non-Newtonian fluid. The technique involves suspending solid MgO particles of colloidal size in trichloroethylene and photographing their motions near the wall with a high-speed movie camera moving with the flow. Trichloroethylene was chosen in order to eliminate the index-of-refraction problem at a curved wall. An evaluation of the technique, including a discussion of its limitations, is presented, and the technique is compared with previous methods of visual observation of turbulent flow.

  19. Asteroid orbital inversion using uniform phase-space sampling

    NASA Astrophysics Data System (ADS)

    Muinonen, K.; Pentikäinen, H.; Granvik, M.; Oszkiewicz, D.; Virtanen, J.

    2014-07-01

    We review statistical inverse methods for asteroid orbit computation from a small number of astrometric observations and short time intervals of observations. With the help of Markov-chain Monte Carlo methods (MCMC), we present a novel inverse method that utilizes uniform sampling of the phase space for the orbital elements. The statistical orbital ranging method (Virtanen et al. 2001, Muinonen et al. 2001) was developed to resolve long-standing challenges in the initial computation of orbits for asteroids. The ranging method starts from the selection of a pair of astrometric observations. Thereafter, the topocentric ranges and angular deviations in R.A. and Decl. are randomly sampled. The two Cartesian positions allow for the computation of orbital elements and, subsequently, the computation of ephemerides for the observation dates. Candidate orbital elements are included in the sample of accepted elements if the χ^2-value between the observed and computed observations is within a pre-defined threshold. The sample orbital elements obtain weights based on a debiasing procedure. When the weights are available, the full sample of orbital elements allows probabilistic assessments of, e.g., object classification and ephemeris computation, as well as the computation of collision probabilities. The MCMC ranging method (Oszkiewicz et al. 2009; see also Granvik et al. 2009) replaces the original sampling algorithm described above with a proposal probability density function (p.d.f.), and a chain of sample orbital elements results in the phase space. MCMC ranging is based on a bivariate Gaussian p.d.f. for the topocentric ranges, and allows for the sampling to focus on the phase-space domain with most of the probability mass. In the virtual-observation MCMC method (Muinonen et al. 2012), the proposal p.d.f. for the orbital elements is chosen to mimic the a posteriori p.d.f. 
for the elements: first, random errors are simulated for each observation, resulting in a set of virtual observations; second, the corresponding virtual least-squares orbital elements are derived using the Nelder-Mead downhill simplex method; third, repeating the procedure twice allows the computation of a difference between two sets of virtual orbital elements; and, fourth, this orbital-element difference constitutes a symmetric proposal in a random-walk Metropolis-Hastings algorithm, avoiding the explicit computation of the proposal p.d.f. In a discrete approximation, the allowed proposals coincide with the differences that are based on a large number of pre-computed sets of virtual least-squares orbital elements. The virtual-observation MCMC method is thus based on the characterization of the relevant volume in the orbital-element phase space. Here we utilize MCMC to map the phase-space domain of acceptable solutions. We can make use of the proposal p.d.f.s from the MCMC ranging and virtual-observation methods. The present phase-space mapping produces, upon convergence, a uniform sampling of the solution space within a pre-defined χ^2-value. The weights of the sampled orbital elements are then computed on the basis of the corresponding χ^2-values. The present method resembles the original ranging method. On one hand, MCMC mapping is insensitive to local extrema in the phase space and efficiently maps the solution space. This is somewhat contrary to the MCMC methods described above. On the other hand, MCMC mapping can suffer from producing a small number of sample elements with small χ^2-values, as in the original ranging method. We apply the methods to example near-Earth, main-belt, and transneptunian objects, and highlight the utilization of the methods in the data processing and analysis pipeline of the ESA Gaia space mission.
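
    The mapping idea, accepting any proposal whose χ^2 lies below a pre-defined threshold so that the chain samples the acceptable region uniformly, can be sketched on a toy two-parameter model. The linear model below is a hypothetical stand-in for the orbital forward computation:

    ```python
    import random

    def chi2(params, obs):
        """Toy chi-square for a linear 'ephemeris' model y = a*t + b (a
        hypothetical stand-in for the real orbital forward computation)."""
        a, b = params
        return sum((y - (a * t + b)) ** 2 for t, y in obs)

    def mcmc_map(obs, start, threshold, steps=5000, scale=0.1, seed=1):
        """Random-walk chain that accepts any proposal with chi-square below
        the threshold; upon convergence this samples the acceptable region
        of parameter space uniformly."""
        rng = random.Random(seed)
        x = start
        sample = []
        for _ in range(steps):
            cand = (x[0] + rng.gauss(0, scale), x[1] + rng.gauss(0, scale))
            if chi2(cand, obs) < threshold:
                x = cand  # move; otherwise stay (the state is re-counted)
            sample.append(x)
        return sample

    obs = [(t, 2.0 * t + 1.0) for t in range(5)]  # synthetic noiseless data
    sample = mcmc_map(obs, start=(2.0, 1.0), threshold=0.5)
    ```

    Every retained state satisfies the χ^2 cut, so the sample traces out the acceptable solution region rather than concentrating at the best-fit point.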

  20. Development and Evaluation of New Methods for Estimating Albedo-area for Stable GEOs

    NASA Astrophysics Data System (ADS)

    Payne, T. E.; Gregory, S. A.; Dentamaro, A.; Ernst, M.; Hollon, J.; Kruchten, A.; Chaudhary, A. B.; Dao, P. D.

    Although direct measurements of the projected areas of various Geosynchronous Earth Orbit (GEO) satellite facets are impossible without high-resolution imaging, estimates of the albedo-area (aA) product lead to the possibility of inferring the area. Such size estimates are an integral part of a satellite's identity. We are engaged in parallel development of two methods for calculating aA for the body/communication antennae structures and one method for the solar panels. We have previously reported on the Two Facet Model (2FM) method for body aA, and here we discuss a method based on differences between new observations and a baseline catalog that has been constructed from the GEO Observations with Longitudinal Diversity Simultaneously (GOLDS) data. We report on evaluations of the 2FM and differential method (DM) algorithm results. We also discuss a new method of estimating solar panel aA by fitting new data that include specular glints. All of these measurement methods are compared to models and simulations that serve as a proxy for ground truth. Because of the partially directional nature of the composite Bi-directional Reflectivity Distribution Function (BRDF) of all bus-mounted appendages, the variance of body aA results is expected to be significant. Short-term and long-term variance of the derived aAs will also be discussed.

  1. Applicability of optical scanner method for fine root dynamics

    NASA Astrophysics Data System (ADS)

    Kume, Tomonori; Ohashi, Mizue; Makita, Naoki; Khoon Kho, Lip; Katayama, Ayumi; Matsumoto, Kazuho; Ikeno, Hidetoshi

    2016-04-01

    Fine root dynamics is one of the important components in forest carbon cycling, as ~60% of tree photosynthetic production can be allocated to root growth and metabolic activities. Various techniques have been developed for monitoring fine root biomass, production, and mortality in order to understand carbon pools and fluxes resulting from fine root dynamics. The minirhizotron method is now a widely used technique, in which a transparent tube is inserted into the soil and researchers count increases and decreases of roots along the tube using images taken by a minirhizotron camera or minirhizotron video camera inside the tube. This method allows us to observe root behavior directly without destruction, but it has several weaknesses, e.g., the difficulty of scaling the results up to stand level because of the small observation windows. Also, most of the image analyses are performed manually, which may yield insufficiently quantitative and objective data. Recently, a scanner method has been proposed, which can produce much larger (A4-size) images at lower cost than the minirhizotron methods. However, laborious and time-consuming image analysis still limits the applicability of this method. In this study, therefore, we aimed to develop a new protocol for scanner image analysis to extract root behavior in soil. We evaluated the applicability of this method in two ways: 1) the impact of different observers, including root-study professionals and semi- and non-professionals, on the detected results of root dynamics such as abundance, growth, and decomposition; and 2) the impact of window size on the results, using a random-sampling exercise. We applied our new protocol to analyze temporal changes of root behavior in sequential scanner images from a Bornean tropical forest. The results detected by the six observers showed considerable concordance in temporal changes in the abundance and growth of fine roots, but less in decomposition. 
We also examined potential errors due to window size in the temporal changes in abundance and growth using the detected results; the results suggest high applicability of the scanner method when wide observation windows are used.

  2. Correlation and 3D-tracking of objects by pointing sensors

    DOEpatents

    Griesmeyer, J. Michael

    2017-04-04

    A method and system for tracking at least one object using a plurality of pointing sensors and a tracking system are disclosed herein. In a general embodiment, the tracking system is configured to receive a series of observation data relative to the at least one object over a time base for each of the plurality of pointing sensors. The observation data may include sensor position data, pointing vector data and observation error data. The tracking system may further determine a triangulation point using a magnitude of a shortest line connecting a line of sight value from each of the series of observation data from each of the plurality of sensors to the at least one object, and perform correlation processing on the observation data and triangulation point to determine if at least two of the plurality of sensors are tracking the same object. Observation data may also be branched, associated and pruned using new incoming observation data.
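
    The triangulation step, taking the midpoint of the shortest segment connecting two sensors' lines of sight, has a standard closed form. A minimal sketch for two pointing sensors follows; the positions and pointing vectors are invented, and the real system also carries observation-error data through correlation processing:

    ```python
    def triangulate(p1, d1, p2, d2):
        """Return the midpoint of the shortest segment connecting the two
        lines of sight p_i + t_i*d_i, and the segment's length."""
        def dot(a, b): return sum(u * v for u, v in zip(a, b))
        def sub(a, b): return tuple(u - v for u, v in zip(a, b))
        def add(a, b): return tuple(u + v for u, v in zip(a, b))
        def scl(a, s): return tuple(u * s for u in a)
        w = sub(p1, p2)
        a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
        d, e = dot(d1, w), dot(d2, w)
        denom = a * c - b * b  # zero only if the lines of sight are parallel
        t1 = (b * e - c * d) / denom
        t2 = (a * e - b * d) / denom
        q1, q2 = add(p1, scl(d1, t1)), add(p2, scl(d2, t2))
        gap = sub(q1, q2)
        return scl(add(q1, q2), 0.5), dot(gap, gap) ** 0.5

    # Two hypothetical sensors whose lines of sight intersect at (1, 1, 1)
    mid, miss = triangulate((0.0, 0.0, 0.0), (1.0, 1.0, 1.0),
                            (2.0, 0.0, 0.0), (-1.0, 1.0, 1.0))
    ```

    A small miss distance is evidence that the two sensors are tracking the same object; a large one argues against the association.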

  3. LADEE/LDEX observations of lunar pickup ion distribution and variability

    NASA Astrophysics Data System (ADS)

    Poppe, A. R.; Halekas, J. S.; Szalay, J. R.; Horányi, M.; Levin, Z.; Kempf, S.

    2016-04-01

    We report fortuitous observations of low-energy lunar pickup ion fluxes near the Moon while in the solar wind by the Lunar Dust Experiment (LDEX) on board the Lunar Atmosphere and Dust Environment Explorer (LADEE). We describe the method of observation and the empirical calibration of the instrument for ion observations. LDEX observes several trends in the exospheric ion production rate, including a scale height of approximately 100 km, a positive, linear correlation with solar wind flux, and evidence of a slight enhancement near 7-8 h local time. We compare the LDEX observations to both LADEE Neutral Mass Spectrometer ion mode observations and theoretical models. The LDEX data are best fit by total exospheric ion production rates of ≈6 × 10^3 m^-3 s^-1 with dominant contributions from Al+, CO+, and Ar+, although the LDEX data suggest that the aluminum neutral density and corresponding ion production rate are lower than predicted by recent models.
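
    The quoted scale height corresponds to fitting an exponential density profile n(z) = n0·exp(-z/H) in log space. A minimal sketch with synthetic data follows (the real analysis works on calibrated LDEX fluxes, not idealized densities):

    ```python
    import math

    def scale_height(alts, densities):
        """Fit ln(n) = ln(n0) - z/H by least squares and return H."""
        ys = [math.log(n) for n in densities]
        n = len(alts)
        mx = sum(alts) / n
        my = sum(ys) / n
        slope = (sum((x - mx) * (y - my) for x, y in zip(alts, ys))
                 / sum((x - mx) ** 2 for x in alts))
        return -1.0 / slope

    alts = [0.0, 50.0, 100.0, 150.0, 200.0]              # km above the surface
    dens = [6.0e3 * math.exp(-z / 100.0) for z in alts]  # synthetic, H = 100 km
    H = scale_height(alts, dens)
    ```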

  4. Seafloor Geodetic Observations West off Miyake-jima Island During January to April, 2001

    NASA Astrophysics Data System (ADS)

    Mochizuki, M.; Sato, M.; Yabuki, T.; Yoshida, Z.; Asada, A.

    2001-12-01

    An intensive earthquake swarm started under Miyake-jima Island, 180 km south of Honshu, Japan, on June 26, 2000. The swarm migrated northwest of Miyake-jima Island, where numerous earthquakes, more than 100,000, were detected within about two months and extensive crustal deformation was observed by on-land geodetic observations. We started seafloor geodetic observation in this area to monitor seafloor deformation for a better understanding of the underground magmatic activity. This poster presents a summary of the observations and preliminary results. IIS has been developing a method of seafloor geodesy in cooperation with JHD. A combination of kinematic GPS measurements and precise acoustic ranging techniques is employed to achieve centimeter-level seafloor geodesy. The first observation site using the method was the Kumano trough, where the Philippine Sea Plate subducts beneath the Japan island arc. It was confirmed that the method could locate the horizontal position of the seafloor reference points within a 4 cm standard deviation (Asada and Yabuki, 2001). We apply this seafloor positioning method to the observations conducted in the area west of Miyake-jima Island. Three seafloor reference systems (Stations A, B, and C), each consisting of three or four acoustic mirror transponders, were installed in November and December 2000 in the triangular area surrounded by the three islands Miyake-jima, Nii-jima and Koudu-jima. This area would be deformed remarkably by underground magma movement, including magma injections from deeper parts. The distances among the three reference systems were set to about 15 km. Stations A and B were located on the two sides of the NW-SE-trending seismically active area. The observations have been conducted three times to date: in January, February and April 2001. We obtained less data than expected due to bad sea conditions during the January and February observations. 
In addition, a fast and rapidly varying ocean current prevented us from keeping the ship on the planned lines during the April observation. Although the amount and quality of the data are lower than projected, analyses are ongoing, with software adapted to each data set in order to extract as much information as possible from the available data.

  5. History of telescopic observations of the Martian satellites

    NASA Astrophysics Data System (ADS)

    Pascu, D.; Erard, S.; Thuillot, W.; Lainey, V.

    2014-11-01

    This article reviews the studies of the Martian satellites Phobos and Deimos carried out by means of ground-based telescopic observations, in the domains of astrometry and dynamics as well as physical characterization. The study spans the first period of investigation of the Martian satellites, from their discovery in 1877, through astrometric and spectrometric methods, mainly before the modern space era. It also includes observations performed with the Hubble Space Telescope. The different techniques used and the main results obtained for the positioning, size estimates, albedo and surface composition are described.

  6. Continuous monitoring of seasonal phenological development by BBCH code

    NASA Astrophysics Data System (ADS)

    Cornelius, Christine; Estrella, Nicole; Menzel, Annette

    2010-05-01

    Phenology, the science of recurrent seasonal natural events, is a proxy for changes in ecosystems due to recent global climate change. Phenological studies mostly deal with data on the beginning of different development stages, e.g. budburst or the beginning of flowering. Only a few studies focus on the end of phenological stages, such as the end of flowering or seed dispersal. Information about the entire development cycle of plants, including data on the end of stages, is obtained by observing plants according to the extended BBCH scale (MEIER 1997). The scale is a standardized growth-stage key which allows a less labor-intensive, weekly observation rhythm. Every week, the frequencies of all occurring phenological stages are noted. These frequencies then constitute the basis for interpolating the development of each phenological stage, even when it was not directly observed during field work. Because of the lack of studies using this kind of key for observations over the entire development cycle, there is no common methodology for analyzing the data. Our objective was therefore to find a method of analysis with which onset dates as well as endpoints of each development stage could be defined. Three different methods of analysis were compared. Results show that there is no significant difference in the onset dates of phenological stages between the methods tested. However, the method of pooled pre/post stage development seems to be most suitable for climate change studies, followed by the method of cumulative stage development and the method of weighted plant development.

  7. Dietary assessment in elderly people: experiences gained from studies in the Netherlands.

    PubMed

    de Vries, J H M; de Groot, L C P G M; van Staveren, W A

    2009-02-01

    In selecting a dietary assessment method, several aspects, such as the aim of the study and the characteristics of the target population, should be taken into account. In elderly people, diminished functionality and cognitive decline may hamper dietary assessment and require tailored approaches to assess dietary intake. The objective of this paper is to summarize our experience with dietary assessment in a number of different studies in population groups over 65 years of age in the Netherlands, and to discuss this experience in the perspective of other nutrition surveys in the elderly. In longitudinal studies, we applied a modified dietary history; in clinical nursing home studies, trained staff observed and recorded food consumption; and in a controlled trial in healthy elderly men, we used a food frequency questionnaire (FFQ). For all methods applied in the community-dwelling elderly, validation studies showed a similar underestimation of intake of 10-15% compared with the reference value. In the care-dependent elderly, the underestimation was less: 5% according to an observational method. The methods varied widely in the resources required, including the burden on the participants, field staff and finances. For effective dietary assessment in older adults, the major challenge will be to distinguish between those elderly who are able to respond correctly to the less intensive methods, such as 24-h recalls or FFQs, and those who are not able to respond to these methods and require adapted techniques, for example, observational records.

  8. Ionosphere Profile Estimation Using Ionosonde & GPS Data in an Inverse Refraction Calculation

    NASA Astrophysics Data System (ADS)

    Psiaki, M. L.

    2014-12-01

    A method has been developed to assimilate ionosonde virtual heights and GPS slant TEC data to estimate the parameters of a local ionosphere model, including estimates of the topside and of latitude and longitude variations. This effort seeks to better assimilate a variety of remote sensing data in order to characterize local (and eventually regional and global) ionosphere electron density profiles. The core calculations involve a forward refractive ray-tracing solution and a nonlinear optimal estimation algorithm that inverts the forward model. The ray-tracing calculations solve a nonlinear two-point boundary value problem for the curved ionosonde or GPS ray path through a parameterized electron density profile. It implements a full 3D solution that can handle the case of a tilted ionosphere. These calculations use Hamiltonian equivalents of the Appleton-Hartree magneto-plasma refraction index model. The current ionosphere parameterization is a modified Booker profile. It has been augmented to include latitude and longitude dependencies. The forward ray-tracing solution yields a given signal's group delay and beat carrier phase observables. An auxiliary set of boundary value problem solutions determines the sensitivities of the ray paths and observables with respect to the parameters of the augmented Booker profile. The nonlinear estimation algorithm compares the measured ionosonde virtual-altitude observables and GPS slant-TEC observables to the corresponding values from the forward refraction model. It uses the parameter sensitivities of the model to iteratively improve its parameter estimates in a way that reduces the residual errors between the measurements and their modeled values. This method has been applied to data from HAARP in Gakona, AK, and has produced good TEC and virtual height fits. 
It has been extended to characterize electron density perturbations caused by HAARP heating experiments through the use of GPS slant TEC data for an LOS through the heated zone. The next planned extension of the method is to estimate the parameters of a regional ionosphere profile. The input observables will be slant TEC from an array of GPS receivers and group delay and carrier phase observables from an array of high-frequency beacons. The beacon array will function as a sort of multi-static ionosonde.
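
    The estimator's iteration, using model sensitivities to reduce the residuals between measured and modeled observables, is essentially a Gauss-Newton update. A minimal sketch on a toy two-parameter forward model follows; the exponential model here is a hypothetical stand-in for the actual Appleton-Hartree ray-tracing computation:

    ```python
    import math

    def gauss_newton(model, jac, params, data, iters=10):
        """Two-parameter Gauss-Newton: solve the 2x2 normal equations
        J^T J dp = J^T r each iteration and update the parameters."""
        for _ in range(iters):
            r = [y - model(x, params) for x, y in data]
            J = [jac(x, params) for x, _ in data]
            a = sum(j[0] * j[0] for j in J)
            b = sum(j[0] * j[1] for j in J)
            c = sum(j[1] * j[1] for j in J)
            g0 = sum(j[0] * ri for j, ri in zip(J, r))
            g1 = sum(j[1] * ri for j, ri in zip(J, r))
            det = a * c - b * b
            params = (params[0] + (c * g0 - b * g1) / det,
                      params[1] + (a * g1 - b * g0) / det)
        return params

    # Hypothetical forward model y = p0 * exp(p1 * x) standing in for the
    # ray-traced observables; jac returns the sensitivities (dy/dp0, dy/dp1)
    model = lambda x, p: p[0] * math.exp(p[1] * x)
    jac = lambda x, p: (math.exp(p[1] * x), p[0] * x * math.exp(p[1] * x))
    data = [(x, 2.0 * math.exp(0.5 * x)) for x in (0.0, 0.5, 1.0, 1.5)]
    fit = gauss_newton(model, jac, (2.0, 0.4), data)  # start near the truth
    ```

    As in the paper's estimator, the sensitivities come from an auxiliary computation alongside the forward model; a reasonable starting guess is needed for convergence.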

  9. Estimation of geopotential from satellite-to-satellite range rate data: Numerical results

    NASA Technical Reports Server (NTRS)

    Thobe, Glenn E.; Bose, Sam C.

    1987-01-01

    A technique for high-resolution geopotential field estimation by recovering the harmonic coefficients from satellite-to-satellite range rate data is presented and tested against both a controlled analytical simulation of a one-day satellite mission (maximum degree and order 8) and then against a Cowell method simulation of a 32-day mission (maximum degree and order 180). Innovations include: (1) a new frequency-domain observation equation based on kinetic energy perturbations which avoids much of the complication of the usual Keplerian element perturbation approaches; (2) a new method for computing the normalized inclination functions which unlike previous methods is both efficient and numerically stable even for large harmonic degrees and orders; (3) the application of a mass storage FFT to the entire mission range rate history; (4) the exploitation of newly discovered symmetries in the block diagonal observation matrix which reduce each block to the product of (a) a real diagonal matrix factor, (b) a real trapezoidal factor with half the number of rows as before, and (c) a complex diagonal factor; (5) a block-by-block least-squares solution of the observation equation by means of a custom-designed Givens orthogonal rotation method which is both numerically stable and tailored to the trapezoidal matrix structure for fast execution.
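Innovation (5) relies on Givens rotations, which triangularize a least-squares system one stable 2x2 rotation per subdiagonal entry. A generic dense sketch of the idea follows; the authors' solver is custom-tailored to the trapezoidal block structure, which this illustration does not attempt to reproduce:

```python
import math

def givens_lstsq(A, b):
    """Solve min ||A x - b|| by zeroing subdiagonal entries with Givens rotations."""
    m, n = len(A), len(A[0])
    A = [row[:] for row in A]  # work on copies
    b = b[:]
    for j in range(n):
        for i in range(m - 1, j, -1):
            if A[i][j] == 0.0:
                continue
            # Rotation (c, s) that zeroes A[i][j] against pivot A[j][j].
            r = math.hypot(A[j][j], A[i][j])
            c, s = A[j][j] / r, A[i][j] / r
            for k in range(j, n):
                t = c * A[j][k] + s * A[i][k]
                A[i][k] = -s * A[j][k] + c * A[i][k]
                A[j][k] = t
            t = c * b[j] + s * b[i]
            b[i] = -s * b[j] + c * b[i]
            b[j] = t
    # Back-substitution on the resulting upper-triangular system.
    x = [0.0] * n
    for j in range(n - 1, -1, -1):
        x[j] = (b[j] - sum(A[j][k] * x[k] for k in range(j + 1, n))) / A[j][j]
    return x
```

Each rotation touches only two rows, which is why the method can be shaped to a known sparsity pattern such as the trapezoidal factor described in the abstract.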

  10. Demonstrating an Effective Marine Biodiversity Observation Network in the Santa Barbara Channel

    NASA Astrophysics Data System (ADS)

    Miller, R. J.

    2016-02-01

    The Santa Barbara Channel (SBC) is a transition zone characterized by high species and habitat diversity and strong environmental gradients within a relatively small area where cold- and warm-water species found from Baja to the Bering Sea coexist. These characteristics make the SBC an ideal setting for our demonstration Marine Biodiversity Observation Network (BON) project that integrates biological levels from genes to habitats and links biodiversity observations to environmental forcing and biogeography. SBC BON is building a comprehensive demonstration system that includes representation of all levels of biotic diversity, key new tools to expand the scales of present observation, and a data management network to integrate new and existing data sources. Our system will be scalable to expand into a full regional Marine BON, and the methods and decision support tools we develop will be transferable to other regions. Incorporating a broad set of habitats, including nearshore coast, continental shelf, and pelagic, and taxonomic breadth from microbes to whales will facilitate this transferability. The Santa Barbara Channel marine BON has three broad objectives: (1) integrate biodiversity data to enable inferences about regional biodiversity; (2) develop advanced methods in optical and acoustic imaging and genomics for monitoring biodiversity, in partnership with ongoing monitoring and research programs, to begin filling the gaps in our knowledge; and (3) implement a tradeoff framework that optimizes allocation of sampling effort. Here we discuss our progress towards these goals and challenges in developing an effective MBON.

  11. Using Empirical Orthogonal Teleconnections to Analyze Interannual Precipitation Variability in China

    NASA Astrophysics Data System (ADS)

    Stephan, C.; Klingaman, N. P.; Vidale, P. L.; Turner, A. G.; Demory, M. E.; Guo, L.

    2017-12-01

    Interannual rainfall variability in China affects agriculture, infrastructure and water resource management. A consistent and objective method, Empirical Orthogonal Teleconnection (EOT) analysis, is applied to precipitation observations over China in all seasons. Instead of maximizing the explained space-time variance, the method identifies regions in China that best explain the temporal variability in domain-averaged rainfall. It recovers known teleconnections, including high positive correlations with ENSO in eastern China in winter, along the Yangtze River in summer, and in southeast China during spring. New findings include that variability along the southeast coast in winter, in the Yangtze valley in spring, and in eastern China in autumn is associated with extratropical Rossby wave trains. The same analysis is applied to six climate simulations of the Met Office Unified Model with and without air-sea coupling and at horizontal resolutions of 40, 90 and 200 km. All simulations reproduce the observed patterns of interannual rainfall variability in winter, spring and autumn; the leading pattern in summer is present in all but one simulation. However, only in two simulations are all patterns associated with the observed physical mechanism. Coupled simulations capture more observed patterns of variability, and associate more of them with the correct physical mechanism, than atmosphere-only simulations at the same resolution. Finer resolution does not improve the fidelity of these patterns or their associated mechanisms. Evaluating climate models only by the geographical distribution of mean precipitation and its interannual variance is insufficient; attention must also be paid to the associated mechanisms.

  12. Jet-torus connection in radio galaxies. Relativistic hydrodynamics and synthetic emission

    NASA Astrophysics Data System (ADS)

    Fromm, C. M.; Perucho, M.; Porth, O.; Younsi, Z.; Ros, E.; Mizuno, Y.; Zensus, J. A.; Rezzolla, L.

    2018-01-01

    Context. High-resolution very long baseline interferometry observations of active galactic nuclei have revealed asymmetric structures in the jets of radio galaxies. These asymmetric structures may be due to internal asymmetries in the jets, may be induced by the different conditions in the surrounding ambient medium, including the obscuring torus, or may reflect a combination of the two. Aims: In this paper we investigate the influence of the ambient medium, including the obscuring torus, on the observed properties of jets from radio galaxies. Methods: We performed special-relativistic hydrodynamic (SRHD) simulations of over-pressured and pressure-matched jets using the special-relativistic hydrodynamics code Ratpenat, which is based on a second-order accurate finite-volume method and an approximate Riemann solver. Using a newly developed radiative transfer code to compute the electromagnetic radiation, we modelled several jets embedded in various ambient-medium and torus configurations and subsequently computed the non-thermal emission produced by the jet and the thermal absorption from the torus. To better compare the emission simulations with observations we produced synthetic radio maps, taking into account the properties of the observatory. Results: The detailed analysis of our simulations shows that observed properties such as the core shift could be used to distinguish between over-pressured and pressure-matched jets. In addition to the properties of the jets, insights into the extent and density of the obscuring torus can be obtained from analyses of the single-dish spectrum and spectral index maps.

  13. Volume-labeled nanoparticles and methods of preparation

    DOEpatents

    Wang, Wei; Gu, Baohua; Retterer, Scott T; Doktycz, Mitchel J

    2015-04-21

    Compositions comprising nanosized objects (i.e., nanoparticles) in which at least one observable marker, such as a radioisotope or fluorophore, is incorporated within the nanosized object. The nanosized objects include, for example, metal or semi-metal oxide (e.g., silica), quantum dot, noble metal, magnetic metal oxide, organic polymer, metal salt, and core-shell nanoparticles, wherein the label is incorporated within the nanoparticle or selectively in a metal oxide shell of a core-shell nanoparticle. Methods of preparing the volume-labeled nanoparticles are also described.

  14. Toward Intelligent Autonomous Agents for Cyber Defense: Report of the 2017 Workshop by the North Atlantic Treaty Organization (NATO) Research Group IST-152 RTG

    DTIC Science & Technology

    2018-04-18

    Significant research is currently conducted on dynamic learning and threat detection. However, this work is held back by gaps in validation methods ... and network path rotation (e.g., Stream Splitting MTD). Agents can also employ various cyber-deception methods, including direct observation hiding ... ARL-SR-0395 ● APR 2018 US Army Research Laboratory Toward Intelligent Autonomous Agents for Cyber Defense: Report of the 2017

  15. Toward Intelligent Autonomous Agents for Cyber Defense: Report of the 2017 Workshop by the North Atlantic Treaty Organization (NATO) Research Group IST-152-RTG

    DTIC Science & Technology

    2018-04-01

    Significant research is currently conducted on dynamic learning and threat detection. However, this work is held back by gaps in validation methods ... and network path rotation (e.g., Stream Splitting MTD). Agents can also employ various cyber-deception methods, including direct observation hiding ... ARL-SR-0395 ● APR 2018 US Army Research Laboratory Toward Intelligent Autonomous Agents for Cyber Defense: Report of the 2017

  16. Thermal energy storage devices, systems, and thermal energy storage device monitoring methods

    DOEpatents

    Tugurlan, Maria; Tuffner, Francis K; Chassin, David P.

    2016-09-13

    Thermal energy storage devices, systems, and thermal energy storage device monitoring methods are described. According to one aspect, a thermal energy storage device includes a reservoir configured to hold a thermal energy storage medium, a temperature control system configured to adjust a temperature of the thermal energy storage medium, and a state observation system configured to provide information regarding an energy state of the thermal energy storage device at a plurality of different moments in time.

  17. Vision inspection system and method

    NASA Technical Reports Server (NTRS)

    Huber, Edward D. (Inventor); Williams, Rick A. (Inventor)

    1997-01-01

    An optical vision inspection system (4) and method for multiplexed illuminating, viewing, analyzing and recording a range of characteristically different kinds of defects, depressions, and ridges in a selected material surface (7) with first and second alternating optical subsystems (20, 21) illuminating and sensing successive frames of the same material surface patch. To detect the different kinds of surface features including abrupt as well as gradual surface variations, correspondingly different kinds of lighting are applied in time-multiplexed fashion to the common surface area patches under observation.

  18. [Investigation on emission properties of biogenic VOCs of landscape plants in Shenzhen].

    PubMed

    Huang, Ai-Kui; Li, Nan; Guenther, Alex; Greenberg, Jim; Baker, Brad; Graessli, Michael; Bai, Jian-Hui

    2011-12-01

    Isoprene and monoterpene emissions were characterized for 158 species of plants growing in Shenzhen, China, using a flow and enclosure sampling method with GC-MS analysis performed in the USA. This survey was designed to include all of the dominant plants within the Shenzhen region as well as unique plants such as cycads. These are the first such measurements in a subtropical Asian metropolis. Substantial isoprene emissions were observed from thirty-one species, including Caryota mitis, Adenanthera pavonina var. microsperma, Mangifera indica and Excoecoria agalloch. Monoterpene emissions were observed from fifty-two species, including Passiflora edulis and Bambusa glaucescens cv. silverstripe, as well as some primitive and rare Cycadaceae and Cyatheaceae plants. For the first time, some red plants were measured; most of them were found to release terpenes. These results will be used to develop biogenic emission model estimates for Shenzhen and the surrounding region that can be used as inputs for regional air quality models.

  19. Low-frequency Carbon Radio Recombination Lines. I. Calculations of Departure Coefficients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salgado, F.; Morabito, L. K.; Oonk, J. B. R.

    In the first paper of this series, we study the level population problem of recombining carbon ions. We focus our study on high quantum numbers, anticipating observations of carbon radio recombination lines to be carried out by the Low Frequency Array. We solve the level population equation including angular momentum levels with updated collision rates up to high principal quantum numbers. We derive departure coefficients by solving the level population equation in the hydrogenic approximation and including low-temperature dielectronic capture effects. Our results in the hydrogenic approximation agree well with those of previous works. When comparing our results including dielectronic capture, we find differences that we ascribe to updates in the atomic physics (e.g., collision rates) and to the approximate solution method of the statistical equilibrium equations adopted in previous studies. A comparison with observations is discussed in an accompanying article, as radiative transfer effects need to be considered.

  20. The Asthma Mobile Health Study, a large-scale clinical observational study using ResearchKit.

    PubMed

    Chan, Yu-Feng Yvonne; Wang, Pei; Rogers, Linda; Tignor, Nicole; Zweig, Micol; Hershman, Steven G; Genes, Nicholas; Scott, Erick R; Krock, Eric; Badgeley, Marcus; Edgar, Ron; Violante, Samantha; Wright, Rosalind; Powell, Charles A; Dudley, Joel T; Schadt, Eric E

    2017-04-01

    The feasibility of using mobile health applications to conduct observational clinical studies requires rigorous validation. Here, we report initial findings from the Asthma Mobile Health Study, a research study, including recruitment, consent, and enrollment, conducted entirely remotely by smartphone. We achieved secure bidirectional data flow between investigators and 7,593 participants from across the United States, including many with severe asthma. Our platform enabled prospective collection of longitudinal, multidimensional data (e.g., surveys, devices, geolocation, and air quality) in a subset of users over the 6-month study period. Consistent trending and correlation of interrelated variables support the quality of data obtained via this method. We detected increased reporting of asthma symptoms in regions affected by heat, pollen, and wildfires. Potential challenges with this technology include selection bias, low retention rates, reporting bias, and data security. These issues require attention to realize the full potential of mobile platforms in research and patient care.

  1. Using Kriging with a heterogeneous measurement error to improve the accuracy of extreme precipitation return level estimation

    NASA Astrophysics Data System (ADS)

    Yin, Shui-qing; Wang, Zhonglei; Zhu, Zhengyuan; Zou, Xu-kai; Wang, Wen-ting

    2018-07-01

    Extreme precipitation can cause flooding and may result in great economic losses and deaths. The return level is a commonly used measure of extreme precipitation events and is required for hydrological engineering designs, including those of sewerage systems, dams, reservoirs and bridges. In this paper, we propose a two-step method to estimate the return level and its uncertainty for a study region. In the first step, we use the generalized extreme value distribution, the L-moment method and the stationary bootstrap to estimate the return level and its uncertainty at sites with observations. In the second step, a spatial model incorporating the heterogeneous measurement errors and covariates is trained to estimate return levels at sites with no observations and to improve the estimates at sites with limited information. The proposed method is applied to daily rainfall data from 273 weather stations in the Haihe river basin of North China. We compare the proposed method with two alternatives: the first is based on the ordinary Kriging method without measurement error, and the second smooths the estimated location and scale parameters of the generalized extreme value distribution by the universal Kriging method. Results show that the proposed method outperforms its counterparts. We also propose a novel approach to assess the two-step method by comparing it with the at-site estimation method over a series of reduced observation lengths. Estimates of the 2-, 5-, 10-, 20-, 50- and 100-year return level maps and the corresponding uncertainties are provided for the Haihe river basin, and a comparison with those released by the Hydrology Bureau of the Ministry of Water Resources of China is made.
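In the first step, once the generalized extreme value (GEV) parameters have been fitted at a station (e.g., by the L-moment method), the T-year return level is the (1 - 1/T) quantile of the fitted distribution. A hedged sketch of that final quantile computation, using the conventional location/scale/shape names mu, sigma, xi rather than anything from the paper:

```python
import math

def gev_return_level(mu, sigma, xi, T):
    """T-year return level: the (1 - 1/T) quantile of a fitted GEV distribution."""
    y = -math.log(1.0 - 1.0 / T)      # reduced-variate argument
    if abs(xi) < 1e-9:                # Gumbel limit as the shape xi -> 0
        return mu - sigma * math.log(y)
    return mu + sigma / xi * (y ** (-xi) - 1.0)
```

For example, the 100-year Gumbel (xi = 0) return level for mu = 0, sigma = 1 is about 4.60; a positive shape parameter, typical of heavy-tailed precipitation extremes, pushes the same quantile higher.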

  2. Logistic Regression Likelihood Ratio Test Analysis for Detecting Signals of Adverse Events in Post-market Safety Surveillance.

    PubMed

    Nam, Kijoeng; Henderson, Nicholas C; Rohan, Patricia; Woo, Emily Jane; Russek-Cohen, Estelle

    2017-01-01

    The Vaccine Adverse Event Reporting System (VAERS) and other product surveillance systems compile reports of product-associated adverse events (AEs), and these reports may include a wide range of information including age, gender, and concomitant vaccines. Controlling for possible confounding variables such as these is an important task when utilizing surveillance systems to monitor post-market product safety. A common method for handling possible confounders is to compare observed product-AE combinations with adjusted baseline frequencies where the adjustments are made by stratifying on observable characteristics. Though approaches such as these have proven to be useful, in this article we propose a more flexible logistic regression approach which allows for covariates of all types rather than relying solely on stratification. Indeed, a main advantage of our approach is that the general regression framework provides flexibility to incorporate additional information such as demographic factors and concomitant vaccines. As part of our covariate-adjusted method, we outline a procedure for signal detection that accounts for multiple comparisons and controls the overall Type 1 error rate. To demonstrate the effectiveness of our approach, we illustrate our method with an example involving febrile convulsion, and we further evaluate its performance in a series of simulation studies.

  3. State estimation improves prospects for ocean research

    NASA Astrophysics Data System (ADS)

    Stammer, Detlef; Wunsch, C.; Fukumori, I.; Marshall, J.

    Rigorous global ocean state estimation methods can now be used to produce dynamically consistent time-varying model/data syntheses, the results of which are being used to study a variety of important scientific problems. Figure 1 shows a schematic of a complete ocean observing and synthesis system that includes global observations and state-of-the-art ocean general circulation models (OGCM) run on modern computer platforms. A global observing system is described in detail in Smith and Koblinsky [2001], and the present status of ocean modeling and anticipated improvements are addressed by Griffies et al. [2001]. Here, the focus is on the third component of state estimation: the synthesis of the observations and a model into a unified, dynamically consistent estimate.

  4. A Wide Band SpectroPolarimeter (Abstract)

    NASA Astrophysics Data System (ADS)

    Menke, J.

    2017-12-01

    (Abstract only) This is the third paper in a series describing experiments in developing amateur spectropolarimetry instrumentation and observational methods. Spectropolarimetry (SP) can provide insight into the extra-stellar environment, including the presence of dust and alignment forces (e.g., magnetic fields). The first two papers (SAS 2014, 2016) described the SP1, a spectropolarimeter based on the medium-resolution spectrometer on our 18-inch f/3.5 Newtonian. The desire to observe fainter stars led to the development of the SP2 reported here, which uses a low-resolution spectrometer. The SP2 has been used with a C11 f/10 telescope and has allowed observations down to about magnitude 8. This paper describes the SP2 and observational results to date.

  5. Assimilation of Precipitation Measurement Missions Microwave Radiance Observations With GEOS-5

    NASA Technical Reports Server (NTRS)

    Jin, Jianjun; Kim, Min-Jeong; McCarty, Will; Akella, Santha; Gu, Wei

    2015-01-01

    The Global Precipitation Measurement (GPM) Core Observatory satellite was launched in February 2014. The GPM Microwave Imager (GMI) is a conically scanning radiometer measuring 13 channels ranging from 10 to 183 GHz and sampling between 65°S and 65°N. This instrument is a successor to the Tropical Rainfall Measurement Mission (TRMM) Microwave Imager (TMI), which has observed 9 channels at frequencies ranging from 10 to 85 GHz between 40°S and 40°N since 1997. This presentation outlines the base procedures developed to assimilate GMI and TMI radiances in clear-sky conditions, including quality control methods, thinning decisions, and the estimation of observation errors. This presentation also shows the impact of these observations when they are incorporated into the GEOS-5 atmospheric data assimilation system.

  6. Ultimate pier and contraction scour prediction in cohesive soils at selected bridges in Illinois

    USGS Publications Warehouse

    Straub, Timothy D.; Over, Thomas M.; Domanski, Marian M.

    2013-01-01

    The Scour Rate In COhesive Soils-Erosion Function Apparatus (SRICOS-EFA) method includes an ultimate scour prediction, which is the equilibrium maximum pier and contraction scour of cohesive soils over time. The purpose of this report is to present the results of testing the ultimate pier and contraction scour methods for cohesive soils at 30 bridge sites in Illinois. Comparisons of the ultimate cohesive and noncohesive methods, along with the Illinois Department of Transportation (IDOT) cohesive soil reduction-factor method and measured scour, are presented. Also presented are the results of comparing historic IDOT laboratory and field values of unconfined compressive strength of soils (Qu). The unconfined compressive strength is used in both the ultimate cohesive and reduction-factor methods, and knowing how the values from field methods compare to the laboratory methods is critical to the informed application of the methods. On average, the noncohesive method predicts the highest amount of scour, followed by the reduction-factor method; the ultimate cohesive method predicts the lowest amount of scour. The 100-year scour predictions from the ultimate cohesive, noncohesive, and reduction-factor methods for each bridge site and soil are always larger than observed scour in this study, except for the 12% of predicted values that are all within 0.4 ft of the observed scour. The ultimate cohesive scour prediction is smaller than the noncohesive prediction for 78% of bridge sites and soils. Seventy-six percent of the ultimate cohesive predictions show a 45% or greater reduction from noncohesive predictions that are over 10 ft. Comparing the ultimate cohesive and reduction-factor 100-year scour prediction methods for each bridge site and soil, the ultimate cohesive method predicts less scour than the reduction-factor method for 51% of bridge sites and soils.
Critical shear stress remains a needed parameter in the ultimate scour prediction for cohesive soils. The unconfined soil compressive strength measured by IDOT in the laboratory was found to provide a good prediction of critical shear stress, as measured by using the erosion function apparatus in a previous study. Because laboratory Qu analyses are time-consuming and expensive, the ability of field-measured Rimac data to estimate unconfined soil strength in the critical shear–soil strength relation was tested. A regression analysis was completed using a historic IDOT dataset containing 366 data pairs of laboratory Qu and field Rimac measurements from common sites with cohesive soils. The resulting equations provide a point prediction of Qu for any Rimac value, together with a 90% confidence interval. The prediction equations are not significantly different from the identity Qu = Rimac. The alternative predictions of ultimate cohesive scour presented in this study assume Qu will be estimated using Rimac measurements that include computed uncertainty. In particular, the ultimate cohesive predicted scour is greater than observed scour over the entire 90% confidence interval range for predicting Qu at the bridges and soils used in this study, with the exception of six predicted values that are all within 0.6 ft of the observed scour.
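The regression step (predict laboratory Qu from field Rimac with an uncertainty band) can be sketched generically as ordinary least squares plus a prediction interval. This sketch uses a normal-approximation z of 1.645 for a 90% interval rather than the study's exact construction, so it is illustrative only:

```python
import math

def ols_predict(xs, ys, x_new, z=1.645):
    """Fit y = a + b*x by least squares; return (prediction, lower, upper)
    for an approximate 90% prediction interval at x_new (normal approx.)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    a = my - b * mx
    resid = [y - (a + b * x) for x, y in zip(xs, ys)]
    s = math.sqrt(sum(r * r for r in resid) / (n - 2))   # residual std. error
    se = s * math.sqrt(1 + 1 / n + (x_new - mx) ** 2 / sxx)
    yhat = a + b * x_new
    return yhat, yhat - z * se, yhat + z * se
```

A prediction interval (for a new observation) is wider than a confidence interval for the fitted mean, which is the right comparison when asking whether predicted scour stays above observed scour across the whole Qu range.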

  7. Exercise and Cardiometabolic Risk Factors in Graduate Students: A Longitudinal, Observational Study

    ERIC Educational Resources Information Center

    Racette, Susan B.; Inman, Cindi L.; Clark, B. Ruth; Royer, Nathaniel K.; Steger-May, Karen; Deusinger, Susan S.

    2014-01-01

    Objective: To evaluate cardiometabolic risk of students longitudinally and compare them with age-matched national samples. Participants: Participants are 134 graduate students enrolled between August 2005 and May 2010. Methods: Students were assessed at the beginning and end of their 3-year curriculum. Comparative samples included 966 National…

  8. Use of a portable hyperspectral imaging system for monitoring the efficacy of sanitation procedures in produce processing plants

    USDA-ARS?s Scientific Manuscript database

    Cleaning and sanitation of production surfaces and equipment play a critical role in lowering the risk of foodborne illness associated with consumption of fresh-cut produce. Visual observation and sampling methods, including ATP tests and cell culturing, are commonly used to monitor the effectivenes...

  9. Third Culture Kids: Transition and Persistence When Repatriating to Attend University

    ERIC Educational Resources Information Center

    Jennison Smith, Virginia M

    2011-01-01

    Scope and Method of Study. This qualitative study used heuristic phenomenology and involved 20 interviews at 3 different sites, 2 focus groups, formal and informal observations, and the analysis of various artifacts including photos, drawings, and documents. Findings and Conclusions. Third culture kids are a diverse group of individuals who spent…

  10. Hispanic Families' Attitudes towards Their Heritage Language in Houston and Its Suburbs

    ERIC Educational Resources Information Center

    Vargas Blanco, Edgar Mauricio

    2015-01-01

    The following study explores the attitudes of 40 Hispanic families towards Spanish and their ethnic community in Houston and its suburbs. The research participants included 20 Colombian families and 20 Mexican families. A mixed methodology using quantitative and qualitative methods was used. Through family's observation, surveys and interviews to…

  11. Application of Research to the Teaching of Basketball at the Elementary School Level.

    ERIC Educational Resources Information Center

    Turkington, H. David

    This paper explores what has been reported by several researchers in an attempt to make recommendations on the most effective method of introducing basketball to elementary school age children. Based on this review the following observations are brought forward for consideration. Factors which significantly affected student success include: height…

  12. Agile Methods and Request for Change (RFC): Observations from DoD Acquisition Programs

    DTIC Science & Technology

    2014-01-01

    at the Software Development Plan, then it's worth having a conversation with the contractor that includes answering the above questions. MSA TD EMD ... [Lapham 2010] CMU/SEI-2013-TN-031 ... those undertaken in more traditional waterfall-based developments. Some of the government PMO enabling

  13. Strategies in an Arts Program for Adults with Atypical Communication

    ERIC Educational Resources Information Center

    Lukac, Christina

    2017-01-01

    The purpose of this study was to observe and implement strategies and adaptations in an arts program for adults with atypical communication due to developmental and intellectual disabilities. This study was conducted in the field using an action research approach with triangulated methods of data collection including semi-structured interviews,…

  14. Experiential Learning and Sustainable Economic Development in Appalachian Communities: A Teaching Note

    ERIC Educational Resources Information Center

    Tonn, Bruce; Ezzell, Tim; Ogle, Eric

    2010-01-01

    This paper describes the results of a participative planning class held in economically dis-advantaged communities in east Tennessee. The class follows a structured method, which includes community workshops and project development, in dealing with the communities. Among many observations gained in eight years of running the class are that…

  15. Leadership Behaviors and Its Relation with Principals' Management Experience

    ERIC Educational Resources Information Center

    Mehdinezhad, Vali; Sardarzahi, Zaid

    2016-01-01

    This paper aims at studying the leadership behaviors reported by principals and observed by teachers and its relationship with management experience of principals. A quantitative method was used in this study. The target population included all principals and teachers of guidance schools and high schools in the Dashtiari District, Iran. A sample…

  16. Sugar and Spice, Toads and Mice: Gender Issues in Family Therapy Training.

    ERIC Educational Resources Information Center

    Roberts, Janine McGill

    1991-01-01

    Presents methods to help family therapy trainees and clinicians articulate how to address gender in families. Describes four experiential exercises (including gender survival messages, gender framed circular questions, and process observation sheets) for training and use with clients. Can examine learnings about gender from families of origin,…

  17. An Investigation of Higher-Order Thinking Skills in Smaller Learning Community Social Studies Classrooms

    ERIC Educational Resources Information Center

    Fischer, Christopher; Bol, Linda; Pribesh, Shana

    2011-01-01

    This study investigated the extent to which higher-order thinking skills are promoted in social studies classes in high schools that are implementing smaller learning communities (SLCs). Data collection in this mixed-methods study included classroom observations and in-depth interviews. Findings indicated that higher-order thinking was rarely…

  18. Exploring Biliteracy Developments among Asian Women in Diasporas: The Case of Taiwan

    ERIC Educational Resources Information Center

    Lee, Yu-Hsiu Hugo

    2010-01-01

    This study presents a broad overview of diasporic biliteracy developments in immigrant women after examining observation data in one Taiwanese community. Methodologies include a mixture of narrative inquiry with some ethnographic methods. Fifteen Asian women in diaspora, two Burmese, one Cambodian, one Filipino, four Indonesians, three Thai and…

  19. Engaging Pre-Service Teachers in Multinational, Multi-Campus Scientific and Mathematical Inquiry

    ERIC Educational Resources Information Center

    Wilhelm, Jennifer Anne; Smith, Walter S.; Walters, Kendra L.; Sherrod, Sonya E.; Mulholland, Judith

    2008-01-01

    Pre-service teachers from Texas and Indiana in the United States and from Queensland, Australia, observed the Moon for a semester and compared and contrasted their findings in asynchronous Internet discussion groups. The 188 pre-service teachers were required to conduct inquiry investigations for their methods coursework which included an initial…

  20. Enhancing Students' Communication Skills through Problem Posing and Presentation

    ERIC Educational Resources Information Center

    Sugito; E. S., Sri Mulyani; Hartono; Supartono

    2017-01-01

    This study explored how to enhance communication skills through the problem-posing and presentation method. The subjects of this research were seventh-grade students at a junior high school, including 20 males and 14 females. This research was conducted in two cycles, and each cycle consisted of four steps: planning, action, observation, and…

  1. Space Frontiers for New Pedagogies: A Tale of Constraints and Possibilities

    ERIC Educational Resources Information Center

    Jessop, Tansy; Gubby, Laura; Smith, Angela

    2012-01-01

    This article draws together two linked studies on formal teaching spaces within one university. The first consisted of a multi-method analysis, including observations of four teaching events, interviews with academics and estates staff, analysis of architectural plans, and a talking campus tour. The second study surveyed 166 students about their…

  2. Bayes Nets in Educational Assessment: Where Do the Numbers Come from? CSE Technical Report.

    ERIC Educational Resources Information Center

    Mislevy, Robert J.; Almond, Russell G.; Yan, Duanli; Steinberg, Linda S.

    Educational assessments that exploit advances in technology and cognitive psychology can produce observations and pose student models that outstrip familiar test-theoretic models and analytic methods. Bayesian inference networks (BINs), which include familiar models and techniques as special cases, can be used to manage belief about students'…

  3. Mark-recapture with multiple, non-invasive marks.

    PubMed

    Bonner, Simon J; Holmberg, Jason

    2013-09-01

    Non-invasive marks, including pigmentation patterns, acquired scars, and genetic markers, are often used to identify individuals in mark-recapture experiments. If animals in a population can be identified from multiple, non-invasive marks then some individuals may be counted twice in the observed data. Analyzing the observed histories without accounting for these errors will provide incorrect inference about the population dynamics. Previous approaches to this problem include modeling data from only one mark and combining estimators obtained from each mark separately assuming that they are independent. Motivated by the analysis of data from the ECOCEAN online whale shark (Rhincodon typus) catalog, we describe a Bayesian method to analyze data from multiple, non-invasive marks that is based on the latent-multinomial model of Link et al. (2010, Biometrics 66, 178-185). Further to this, we describe a simplification of the Markov chain Monte Carlo algorithm of Link et al. (2010, Biometrics 66, 178-185) that leads to more efficient computation. We present results from the analysis of the ECOCEAN whale shark data and from simulation studies comparing our method with the previous approaches. © 2013, The International Biometric Society.
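
    A toy simulation illustrates why unlinked multiple marks inflate naive counts (the population size, capture probabilities, and mark-type split below are illustrative choices of ours, not values from the ECOCEAN data):

```python
import random

random.seed(1)

def simulate_two_mark_counts(n_animals=500, n_occasions=5,
                             p_capture=0.3, p_mark_a=0.6):
    """Each capture records the animal under mark type A or B (hypothetical
    probabilities). If sightings under the two mark types are never linked,
    an animal seen under both is counted as two distinct individuals."""
    naive_n = 0      # individuals implied by treating each mark type separately
    distinct_n = 0   # individuals actually seen at least once
    for _ in range(n_animals):
        seen_a = seen_b = False
        for _ in range(n_occasions):
            if random.random() < p_capture:
                if random.random() < p_mark_a:
                    seen_a = True
                else:
                    seen_b = True
        if seen_a or seen_b:
            distinct_n += 1
        naive_n += int(seen_a) + int(seen_b)  # double-counts animals seen under both marks
    return naive_n, distinct_n

naive_n, distinct_n = simulate_two_mark_counts()
print(naive_n, distinct_n)  # naive_n exceeds the number of distinct animals seen
```

    The latent-multinomial model of Link et al. resolves exactly this ambiguity by treating the true, linked capture histories as latent variables.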

  4. Moon-based Earth Observation for Large Scale Geoscience Phenomena

    NASA Astrophysics Data System (ADS)

    Guo, Huadong; Liu, Guang; Ding, Yixing

    2016-07-01

    The capability of Earth observation for large-global-scale natural phenomena needs to be improved, and new observing platforms are expected. We have studied the concept of the Moon as an Earth-observation platform in recent years. Compared with man-made satellite platforms, Moon-based Earth observation can obtain multi-spherical, full-band, active and passive information, and has the following advantages: large observation range, variable view angle, long-term continuous observation, and an extra-long life cycle, with the characteristics of longevity, consistency, integrity, stability and uniqueness. Moon-based Earth observation is suitable for monitoring large-scale geoscience phenomena, including large-scale atmosphere change, large-scale ocean change, large-scale land-surface dynamic change, solid-earth dynamic change, etc. For the purpose of establishing a Moon-based Earth observation platform, we plan to study the following five aspects: mechanisms and models of Moon-based observation of macroscopic Earth-science phenomena; sensor parameter optimization and methods of Moon-based Earth observation; site selection and environment of Moon-based Earth observation; the Moon-based Earth observation platform; and a Moon-based Earth observation fundamental scientific framework.

  5. Effect of distance-related heterogeneity on population size estimates from point counts

    USGS Publications Warehouse

    Efford, Murray G.; Dawson, Deanna K.

    2009-01-01

    Point counts are used widely to index bird populations. Variation in the proportion of birds counted is a known source of error, and for robust inference it has been advocated that counts be converted to estimates of absolute population size. We used simulation to assess nine methods for the conduct and analysis of point counts when the data included distance-related heterogeneity of individual detection probability. Distance from the observer is a ubiquitous source of heterogeneity, because nearby birds are more easily detected than distant ones. Several recent methods (dependent double-observer, time of first detection, time of detection, independent multiple-observer, and repeated counts) do not account for distance-related heterogeneity, at least in their simpler forms. We assessed bias in estimates of population size by simulating counts with fixed radius w over four time intervals (occasions). Detection probability per occasion was modeled as a half-normal function of distance with scale parameter sigma and intercept g(0) = 1.0. Bias varied with sigma/w; values of sigma inferred from published studies were often 50% for a 100-m fixed-radius count. More critically, the bias of adjusted counts sometimes varied more than that of unadjusted counts, and inference from adjusted counts would be less robust. The problem was not solved by using mixture models or including distance as a covariate. Conventional distance sampling performed well in simulations, but its assumptions are difficult to meet in the field. We conclude that no existing method allows effective estimation of population size from point counts.
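
    A minimal sketch of the simulation design described above, for one illustrative parameter combination (sigma = 50, w = 100; one setting of our choosing, not the paper's full design):

```python
import math
import random

random.seed(0)

def simulate_point_count(n_birds=1000, w=100.0, sigma=50.0):
    """Birds placed uniformly in a disc of radius w; each is detected with
    half-normal probability p(r) = exp(-r^2 / (2 sigma^2)), so g(0) = 1.
    The raw count understates abundance because distant birds are missed."""
    counted = 0
    for _ in range(n_birds):
        r = w * math.sqrt(random.random())  # uniform over the disc: density 2r/w^2
        if random.random() < math.exp(-r ** 2 / (2.0 * sigma ** 2)):
            counted += 1
    return counted

count = simulate_point_count()
# Expected detected fraction is (2 sigma^2 / w^2)(1 - exp(-w^2 / (2 sigma^2))) ≈ 0.43 here
print(count / 1000.0)
```

    Extending this sketch with an estimator of detection probability per occasion is how bias in the adjusted counts can be assessed.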

  6. Study on individual stochastic model of GNSS observations for precise kinematic applications

    NASA Astrophysics Data System (ADS)

    Próchniewicz, Dominik; Szpunar, Ryszard

    2015-04-01

    The proper definition of the mathematical positioning model, which comprises functional and stochastic models, is a prerequisite for obtaining optimal estimates of the unknown parameters. Especially important in this definition is realistic modelling of the stochastic properties of the observations, which are more receiver-dependent and time-varying than the deterministic relationships. This is particularly true for precise kinematic applications, which are characterized by weakened model strength. In this case, an incorrect or simplified definition of the stochastic model limits the performance of ambiguity resolution and the accuracy of position estimation. In this study we investigate methods of describing the measurement noise of GNSS observations and its impact on deriving a precise kinematic positioning model. In particular, stochastic modelling of the individual components of the variance-covariance matrix of the observation noise, performed using observations from a very short baseline and a laboratory GNSS signal generator, is analyzed. Experimental results indicate that using an individual stochastic model of the observations, including elevation dependency and cross-correlation, instead of assuming that the raw measurements are independent with the same variance, improves the performance of ambiguity resolution as well as rover positioning accuracy. This shows that the proposed stochastic assessment method could be an important part of a complex calibration procedure for GNSS equipment.
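
    An elevation-dependent weighting of this general kind can be sketched as follows (the coefficients are common textbook values, not the ones estimated in the study, and the sketch omits the cross-correlation terms of a full variance-covariance matrix):

```python
import math

def elevation_variance(elev_deg, sigma0=0.003, k=0.003):
    """Common elevation-dependent noise model for GNSS observations
    (sigma0 and k in metres are illustrative): sigma^2 = sigma0^2 + k^2 / sin(E)^2."""
    e = math.radians(elev_deg)
    return sigma0 ** 2 + (k / math.sin(e)) ** 2

def weights(elevations):
    """Diagonal weight-matrix entries 1/sigma^2, i.e. assuming independent
    observations; a full individual stochastic model would also carry
    off-diagonal cross-correlation terms."""
    return [1.0 / elevation_variance(e) for e in elevations]

w = weights([90.0, 30.0, 10.0])
print(w)  # low-elevation observations receive the smallest weight
```

    Replacing the identity-scaled weight matrix of the naive model with these entries is the simplest form of the individual stochastic model discussed above.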

  7. An artificial network model for estimating the network structure underlying partially observed neuronal signals.

    PubMed

    Komatsu, Misako; Namikawa, Jun; Chao, Zenas C; Nagasaka, Yasuo; Fujii, Naotaka; Nakamura, Kiyohiko; Tani, Jun

    2014-01-01

    Many previous studies have proposed methods for quantifying neuronal interactions. However, these methods evaluated the interactions between recorded signals in an isolated network. In this study, we present a novel approach for estimating interactions between observed neuronal signals by theorizing that those signals are observed from only a part of the network that also includes unobserved structures. We propose a variant of the recurrent network model that consists of both observable and unobservable units. The observable units represent recorded neuronal activity, and the unobservable units are introduced to represent activity from unobserved structures in the network. The network structures are characterized by connective weights, i.e., the interaction intensities between individual units, which are estimated from recorded signals. We applied this model to multi-channel brain signals recorded from monkeys, and obtained robust network structures with physiological relevance. Furthermore, the network exhibited common features that portrayed cortical dynamics as inversely correlated interactions between excitatory and inhibitory populations of neurons, which are consistent with the previous view of cortical local circuits. Our results suggest that the novel concept of incorporating an unobserved structure into network estimations has theoretical advantages and could provide insights into brain dynamics beyond what can be directly observed. Copyright © 2014 Elsevier Ireland Ltd and the Japan Neuroscience Society. All rights reserved.

  8. Full observation of ultrafast cascaded radiationless transitions from S{sub 2}(ππ{sup ∗}) state of pyrazine using vacuum ultraviolet photoelectron imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horio, Takuya; Spesyvtsev, Roman; Nagashima, Kazuki

    A photoexcited molecule undergoes multiple deactivation and reaction processes simultaneously or sequentially, which have been observed by combinations of various experimental methods. However, a single experimental method that enables complete observation of the photo-induced dynamics would be of great assistance for such studies. Here we report a full observation of cascaded electronic dephasing from S{sub 2}(ππ{sup *}) in pyrazine (C{sub 4}N{sub 2}H{sub 4}) by time-resolved photoelectron imaging (TRPEI) using 9.3-eV vacuum ultraviolet pulses with a sub-20 fs time duration. While we previously demonstrated a real-time observation of the ultrafast S{sub 2}(ππ{sup *}) → S{sub 1}(nπ{sup *}) internal conversion in pyrazine using TRPEI with UV pulses, this study presents a complete observation of the dynamics including radiationless transitions from S{sub 1} to S{sub 0} (internal conversion) and T{sub 1}(nπ{sup *}) (intersystem crossing). Also discussed are the role of {sup 1}A{sub u}(nπ{sup *}) in the internal conversion and the configuration interaction of the S{sub 2}(ππ{sup *}) electronic wave function.

  9. Effects of music therapy and distraction cards on pain relief during phlebotomy in children.

    PubMed

    Aydin, Diler; Sahiner, Nejla Canbulat

    2017-02-01

    To investigate three different distraction methods (distraction cards, listening to music, and distraction cards + music) on pain and anxiety relief in children during phlebotomy. This study was a prospective, randomized, controlled trial. The sample consisted of children aged 7 to 12 years who required blood tests. The children were randomized into four groups: distraction cards, music, distraction cards + music, and controls. Data were obtained through face-to-face interviews with the children, their parents, and the observer before and after the procedure. The children's pain levels were assessed and reported by the parents, the observers, and the children themselves, who self-reported using the Wong-Baker FACES scale. The children's anxiety levels were also assessed using the Children's Fear Scale. Two hundred children (mean age: 9.01 ± 2.35 years) were included. No difference was found between the groups in the self-, parent-, and observer-reported procedural pain levels (p=0.72, p=0.23, p=0.15, respectively). Furthermore, no significant differences were observed between groups in procedural child anxiety levels according to the parents and observer (p=0.092, p=0.096, respectively). Pain and anxiety relief was seen with all three methods during phlebotomy; however, no statistically significant difference was observed. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. Inferences about landbird abundance from count data: recent advances and future directions

    USGS Publications Warehouse

    Nichols, J.D.; Thomas, L.; Conn, P.B.; Thomson, David L.; Cooch, Evan G.; Conroy, Michael J.

    2009-01-01

    We summarize results of a November 2006 workshop dealing with recent research on the estimation of landbird abundance from count data. Our conceptual framework includes a decomposition of the probability of detecting a bird potentially exposed to sampling efforts into four separate probabilities. Primary inference methods are described and include distance sampling, multiple observers, time of detection, and repeated counts. The detection parameters estimated by these different approaches differ, leading to different interpretations of resulting estimates of density and abundance. Simultaneous use of combinations of these different inference approaches can not only lead to increased precision but also provide the ability to decompose components of the detection process. Recent efforts to test the efficacy of these different approaches using natural systems and a new bird radio test system provide sobering conclusions about the ability of observers to detect and localize birds in auditory surveys. Recent research is reported on efforts to deal with such potential sources of error as bird misclassification, measurement error, and density gradients. Methods for inference about spatial and temporal variation in avian abundance are outlined. Discussion topics include opinions about the need to estimate detection probability when drawing inference about avian abundance, methodological recommendations based on the current state of knowledge and suggestions for future research.
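
    The four-way decomposition of the counting process can be written as a simple product (the component names below are our paraphrase of the framework, not the workshop report's exact notation, and the values are purely illustrative):

```python
def overall_detection_prob(p_sampled, p_present, p_available, p_detected):
    """Probability a bird is counted = P(its location is sampled) x
    P(it is present during the count) x P(it is available to be detected,
    e.g. sings) x P(the observer perceives it given availability)."""
    return p_sampled * p_present * p_available * p_detected

# Even with a plot certain to be sampled, imperfect presence, availability,
# and perception multiply down the overall counting probability quickly.
p = overall_detection_prob(1.0, 0.9, 0.5, 0.6)
print(p)
```

    The different inference methods listed above estimate different subsets of these factors, which is why their density estimates carry different interpretations.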

  11. Review of Batteryless Wireless Sensors Using Additively Manufactured Microwave Resonators.

    PubMed

    Memon, Muhammad Usman; Lim, Sungjoon

    2017-09-09

    The significant improvements observed in the field of bulk-production of printed microchip technologies in the past decade have allowed the fabrication of microchip printing on numerous materials including organic and flexible substrates. Printed sensors and electronics are of significant interest owing to the fast and low-cost fabrication techniques used in their fabrication. The increasing amount of research and deployment of specially printed electronic sensors in a number of applications demonstrates the immense attention paid by researchers to this topic in the pursuit of achieving wider-scale electronics on different dielectric materials. Although there are many traditional methods for fabricating radio frequency (RF) components, they are time-consuming, expensive, complicated, and require more power for operation than additive fabrication methods. This paper serves as a summary/review of improvements made to the additive printing technologies. The article focuses on three recently developed printing methods for the fabrication of wireless sensors operating at microwave frequencies. The fabrication methods discussed include inkjet printing, three-dimensional (3D) printing, and screen printing.

  12. Review of Batteryless Wireless Sensors Using Additively Manufactured Microwave Resonators

    PubMed Central

    2017-01-01

    The significant improvements observed in the field of bulk-production of printed microchip technologies in the past decade have allowed the fabrication of microchip printing on numerous materials including organic and flexible substrates. Printed sensors and electronics are of significant interest owing to the fast and low-cost fabrication techniques used in their fabrication. The increasing amount of research and deployment of specially printed electronic sensors in a number of applications demonstrates the immense attention paid by researchers to this topic in the pursuit of achieving wider-scale electronics on different dielectric materials. Although there are many traditional methods for fabricating radio frequency (RF) components, they are time-consuming, expensive, complicated, and require more power for operation than additive fabrication methods. This paper serves as a summary/review of improvements made to the additive printing technologies. The article focuses on three recently developed printing methods for the fabrication of wireless sensors operating at microwave frequencies. The fabrication methods discussed include inkjet printing, three-dimensional (3D) printing, and screen printing. PMID:28891947

  13. Zero-mode clad waveguides for performing spectroscopy with confined effective observation volumes

    DOEpatents

    Levene, Michael J.; Korlach, Jonas; Turner, Stephen W.; Craighead, Harold G.; Webb, Watt W.

    2005-07-12

    The present invention is directed to a method and an apparatus for analysis of an analyte. The method involves providing a zero-mode waveguide which includes a cladding surrounding a core where the cladding is configured to preclude propagation of electromagnetic energy of a frequency less than a cutoff frequency longitudinally through the core of the zero-mode waveguide. The analyte is positioned in the core of the zero-mode waveguide and is then subjected, in the core of the zero-mode waveguide, to activating electromagnetic radiation of a frequency less than the cut-off frequency under conditions effective to permit analysis of the analyte in an effective observation volume which is more compact than if the analysis were carried out in the absence of the zero-mode waveguide.
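
    The sub-cutoff condition can be illustrated with standard circular-waveguide theory (the 100 nm core diameter below is a typical zero-mode-waveguide scale chosen by us for illustration; the patent text specifies no particular geometry):

```python
import math

def te11_cutoff_frequency(diameter_m, n_medium=1.0, c=3.0e8):
    """Cutoff frequency of the dominant TE11 mode of a circular waveguide,
    f_c = 1.841 * c / (pi * d * n), from standard waveguide theory."""
    return 1.841 * c / (math.pi * diameter_m * n_medium)

# For a ~100 nm metal-clad hole the cutoff lies well above visible-light
# frequencies (~5e14 Hz), so illumination cannot propagate down the core:
# only an evanescent tail enters, confining the effective observation volume.
fc = te11_cutoff_frequency(100e-9)
print(fc)  # on the order of 1e15 Hz
```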

  14. Waveguides for performing spectroscopy with confined effective observation volumes

    DOEpatents

    Levene, Michael J.; Korlach, Jonas; Turner, Stephen W.; Craighead, Harold G.; Webb, Watt W.

    2006-03-14

    The present invention is directed to a method and an apparatus for analysis of an analyte. The method involves providing a zero-mode waveguide which includes a cladding surrounding a core where the cladding is configured to preclude propagation of electromagnetic energy of a frequency less than a cutoff frequency longitudinally through the core of the zero-mode waveguide. The analyte is positioned in the core of the zero-mode waveguide and is then subjected, in the core of the zero-mode waveguide, to activating electromagnetic radiation of a frequency less than the cut-off frequency under conditions effective to permit analysis of the analyte in an effective observation volume which is more compact than if the analysis were carried out in the absence of the zero-mode waveguide.

  15. The effect of the dynamic wet troposphere on radio interferometric measurements

    NASA Technical Reports Server (NTRS)

    Treuhaft, R. N.; Lanyi, G. E.

    1987-01-01

    A statistical model of water vapor fluctuations is used to describe the effect of the dynamic wet troposphere on radio interferometric measurements. It is assumed that the spatial structure of refractivity is approximated by Kolmogorov turbulence theory, and that the temporal fluctuations are caused by spatial patterns moved over a site by the wind, and these assumptions are examined for the VLBI delay and delay rate observables. The results suggest that the delay rate measurement error is usually dominated by water vapor fluctuations, and water vapor induced VLBI parameter errors and correlations are determined as a function of the delay observable errors. A method is proposed for including the water vapor fluctuations in the parameter estimation method to obtain improved parameter estimates and parameter covariances.

  16. Giving voice to vulnerable people: the value of shadowing for phenomenological healthcare research.

    PubMed

    van der Meide, Hanneke; Leget, Carlo; Olthuis, Gert

    2013-11-01

    Phenomenological healthcare research should include the lived experiences of a broad group of healthcare users. In this paper it is shown how shadowing can give a voice to people in vulnerable situations who are often excluded from interview studies. Shadowing is an observational method in which the researcher observes an individual during a relatively long time. Central aspects of the method are the focus on meaning expressed by the whole body, and an extended stay of the researcher in the phenomenal event itself. Inherent in shadowing is a degree of ambivalence that both challenges the researcher and provides meaningful insights about the phenomenon. A case example of a phenomenological study on the experiences of elderly hospital patients is used to show what shadowing yields.

  17. Biases and Power for Groups Comparison on Subjective Health Measurements

    PubMed Central

    Hamel, Jean-François; Hardouin, Jean-Benoit; Le Neel, Tanguy; Kubis, Gildas; Roquelaure, Yves; Sébille, Véronique

    2012-01-01

    Subjective health measurements are increasingly used in clinical research, particularly for patient group comparisons. Two main types of analytical strategies can be used for such data: so-called classical test theory (CTT), relying on observed scores, and models from Item Response Theory (IRT), relying on a response model that relates item responses to a latent parameter, often called the latent trait. Whether IRT or CTT would be the most appropriate method to compare two independent groups of patients on a patient-reported outcomes measurement remains unknown and was investigated using simulations. For CTT-based analyses, group comparison was performed using a t-test on the scores. For IRT-based analyses, several methods were compared, according to whether the Rasch model was considered with random effects or with fixed effects, and whether the group effect was included as a covariate or not. Individual latent trait values were estimated using either a deterministic method or stochastic approaches. Latent traits were then compared with a t-test. Finally, a two-step method was performed to compare the latent trait distributions, and a Wald test was performed to test the group effect in the Rasch model including group covariates. The only unbiased IRT-based method was the group-covariate Wald test, performed on the random-effects Rasch model. This model displayed the highest observed power, which was similar to the power of the score t-test. These results need to be extended to the case frequently encountered in practice where data are missing and possibly informative. PMID:23115620
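
    The CTT arm of such a simulation can be sketched as follows: generate dichotomous Rasch responses for two groups differing in mean latent trait, then compare raw sum scores with a Welch t statistic (the item difficulties, group sizes, and effect size are our illustrative choices, not the paper's simulation design):

```python
import math
import random

random.seed(42)

def rasch_responses(thetas, difficulties):
    """Dichotomous responses under the Rasch model:
    P(X = 1) = 1 / (1 + exp(-(theta - b)))."""
    return [[int(random.random() < 1.0 / (1.0 + math.exp(-(th - b))))
             for b in difficulties] for th in thetas]

def welch_t(scores_a, scores_b):
    """Welch t statistic on raw sum scores (the CTT-based group comparison)."""
    ma = sum(scores_a) / len(scores_a)
    mb = sum(scores_b) / len(scores_b)
    va = sum((s - ma) ** 2 for s in scores_a) / (len(scores_a) - 1)
    vb = sum((s - mb) ** 2 for s in scores_b) / (len(scores_b) - 1)
    return (ma - mb) / math.sqrt(va / len(scores_a) + vb / len(scores_b))

items = [-1.0, -0.5, 0.0, 0.5, 1.0]                      # illustrative difficulties
group_a = [random.gauss(1.0, 1.0) for _ in range(100)]   # true group effect = 1.0
group_b = [random.gauss(0.0, 1.0) for _ in range(100)]
scores_a = [sum(r) for r in rasch_responses(group_a, items)]
scores_b = [sum(r) for r in rasch_responses(group_b, items)]
t_stat = welch_t(scores_a, scores_b)
print(t_stat)  # positive: group A scores higher on average
```

    The IRT-based alternatives in the paper replace the sum scores with estimated or modelled latent traits; the unbiased variant tests the group covariate inside the random-effects Rasch model itself.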

  18. Performance characteristics of an ion chromatographic method for the quantitation of citrate and phosphate in pharmaceutical solutions.

    PubMed

    Jenke, Dennis; Sadain, Salma; Nunez, Karen; Byrne, Frances

    2007-01-01

    The performance of an ion chromatographic method for measuring citrate and phosphate in pharmaceutical solutions is evaluated. Performance characteristics examined include accuracy, precision, specificity, response linearity, robustness, and the ability to meet system suitability criteria. In general, the method is found to be robust within reasonable deviations from its specified operating conditions. Analytical accuracy is typically 100 +/- 3%, and short-term precision is not more than 1.5% relative standard deviation. The instrument response is linear over a range of 50% to 150% of the standard preparation target concentrations (12 mg/L for phosphate and 20 mg/L for citrate), and the results obtained using a single-point standard versus a calibration curve are essentially equivalent. A small analytical bias is observed and ascribed to the relative purity of the differing salts, used as raw materials in tested finished products and as reference standards in the analytical method. The assay is specific in that no phosphate or citrate peaks are observed in a variety of method-related solutions and matrix blanks (with and without autoclaving). The assay with manual preparation of the eluents is sensitive to the composition of the eluent in the sense that the eluent must be effectively degassed and protected from CO(2) ingress during use. In order for the assay to perform effectively, extensive system equilibration and conditioning is required. However, a properly conditioned and equilibrated system can be used to test a number of samples via chromatographic runs that include many (> 50) injections.

  19. Intercomparison of 3D pore-scale flow and solute transport simulation methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Xiaofan; Mehmani, Yashar; Perkins, William A.

    2016-09-01

    Multiple numerical approaches have been developed to simulate porous media fluid flow and solute transport at the pore scale. These include methods that 1) explicitly model the three-dimensional geometry of pore spaces and 2) those that conceptualize the pore space as a topologically consistent set of stylized pore bodies and pore throats. In previous work we validated a model of class 1, based on direct numerical simulation using computational fluid dynamics (CFD) codes, against magnetic resonance velocimetry (MRV) measurements of pore-scale velocities. Here we expand that validation to include additional models of class 1 based on the immersed-boundary method (IMB), lattice Boltzmann method (LBM), and smoothed particle hydrodynamics (SPH), as well as a model of class 2 (a pore-network model or PNM). The PNM approach used in the current study was recently improved and demonstrated to accurately simulate solute transport in a two-dimensional experiment. While the PNM approach is computationally much less demanding than direct numerical simulation methods, the effect of conceptualizing complex three-dimensional pore geometries on solute transport in the manner of PNMs has not been fully determined. We apply all four approaches (CFD, LBM, SPH and PNM) to simulate pore-scale velocity distributions and nonreactive solute transport, and intercompare the model results with previously reported experimental observations. Experimental observations are limited to measured pore-scale velocities, so solute transport comparisons are made only among the various models. Comparisons are drawn both in terms of macroscopic variables (e.g., permeability, solute breakthrough curves) and microscopic variables (e.g., local velocities and concentrations).

  20. Round-robin comparison of methods for the detection of human enteric viruses in lettuce.

    PubMed

    Le Guyader, Françoise S; Schultz, Anna-Charlotte; Haugarreau, Larissa; Croci, Luciana; Maunula, Leena; Duizer, Erwin; Lodder-Verschoor, Froukje; von Bonsdorff, Carl-Henrik; Suffredini, Elizabetha; van der Poel, Wim M M; Reymundo, Rosanna; Koopmans, Marion

    2004-10-01

    Five methods for detecting human enteric virus contamination in lettuce were compared. To mimic the multiple contaminations observed after sewage contamination, samples were artificially contaminated with human calicivirus, poliovirus, and animal calicivirus strains at different concentrations. Nucleic acid extractions were done at the same time in the same laboratory to reduce assay-to-assay variability. Results showed that the two critical steps are the washing step and the removal of inhibitors. The more reliable methods (sensitivity, simplicity, low cost) included an elution/concentration step and a commercial kit. Such development of sensitive methods for viral detection in foods other than shellfish is important to improve food safety.

  1. Human nonverbal courtship behavior--a brief historical review.

    PubMed

    Moore, Monica M

    2010-03-01

    This article reviews research findings documenting the nature of nonverbal courtship behavior compiled through both observation and self-report methods. I briefly present the major theoretical perspectives guiding research methodologies used in the field and in the laboratory. Studies of verbal courtship, including those conducted via computer, via text messaging, or through personal advertisement, are not included in this review. The article ends by elucidating some key features of human nonverbal courtship behavior that have become apparent after scrutinizing these data.

  2. Comparative study of methods for recognition of an unknown person's action from a video sequence

    NASA Astrophysics Data System (ADS)

    Hori, Takayuki; Ohya, Jun; Kurumisawa, Jun

    2009-02-01

    This paper proposes a tensor-decomposition-based method that can recognize an unknown person's action from a video sequence, where the unknown person is not included in the database (tensor) used for the recognition. The tensor consists of persons, actions and time-series image features. For the observed unknown person's action, one of the actions stored in the tensor is assumed. Using the motion signature obtained from the assumption, the unknown person's actions are synthesized. The actions of one of the persons in the tensor are replaced by the synthesized actions. Then, the core tensor for the replaced tensor is computed. This process is repeated over the actions and persons. For each iteration, the difference between the replaced and original core tensors is computed. The assumption that gives the minimal difference is the action recognition result. For the time-series image features, which are stored in the tensor and extracted from the observed video sequence, a feature based on the contour shape of the human body silhouette is used. To show the validity of our proposed method, it is experimentally compared with the nearest-neighbor rule and a principal-component-analysis-based method. Experiments using seven kinds of actions performed by 33 persons show that our proposed method achieves better recognition accuracy for the seven actions than the other methods.

  3. Determination of thermal wave reflection coefficient to better estimate defect depth using pulsed thermography

    NASA Astrophysics Data System (ADS)

    Sirikham, Adisorn; Zhao, Yifan; Mehnen, Jörn

    2017-11-01

    Thermography is a promising method for detecting subsurface defects, but accurate measurement of defect depth is still a big challenge because thermographic signals are typically corrupted by imaging noise and affected by 3D heat conduction. Existing methods based on numerical models are susceptible to signal noise, and methods based on analytical models require rigorous assumptions that usually cannot be satisfied in practical applications. This paper presents a new method to improve the measurement accuracy of subsurface defect depth by determining the thermal wave reflection coefficient, which is usually assumed to be known a priori, directly from observed data. This is achieved by introducing a new heat transfer model that includes multiple physical parameters to better describe the observed thermal behaviour in pulsed thermographic inspection. Numerical simulations are used to evaluate the performance of the proposed method against four selected state-of-the-art methods. Results show that the accuracy of depth measurement is improved by up to 10% when the noise level is high and the thermal wave reflection coefficient is low. The feasibility of the proposed method on real data is also validated through a case study on characterising flat-bottom holes in carbon fibre reinforced polymer (CFRP) laminates, which have wide application in various sectors of industry.
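
    The role of the reflection coefficient can be seen in the standard one-dimensional mirror-image model for pulsed thermography (our notation and parameter values for illustration; the paper's extended model includes additional physical parameters):

```python
import math

def surface_temperature(t, depth, alpha, R, q_over_e=1.0, n_terms=20):
    """1D thermal-wave image series for the surface temperature over a defect
    at `depth` (m), with diffusivity alpha (m^2/s) and reflection coefficient R:
    T(t) = (Q / (e * sqrt(pi * t))) * (1 + 2 * sum_n R^n * exp(-(n*depth)^2 / (alpha*t)))."""
    s = sum((R ** n) * math.exp(-((n * depth) ** 2) / (alpha * t))
            for n in range(1, n_terms + 1))
    return q_over_e / math.sqrt(math.pi * t) * (1.0 + 2.0 * s)

# A lower R weakens the temperature contrast over the defect, so assuming a
# pre-known R = 1 can bias the depth inferred from the contrast decay.
t_obs = 4.0  # seconds after the flash (illustrative)
t_full = surface_temperature(t_obs, depth=1e-3, alpha=5e-7, R=1.0)
t_half = surface_temperature(t_obs, depth=1e-3, alpha=5e-7, R=0.5)
print(t_full > t_half)
```

    Fitting both R and the depth to the observed cooling curve, rather than fixing R, is the essence of the improvement described above.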

  4. Extracting galactic structure parameters from multivariated density estimation

    NASA Technical Reports Server (NTRS)

    Chen, B.; Creze, M.; Robin, A.; Bienayme, O.

    1992-01-01

    Multivariate statistical analysis, including cluster analysis (unsupervised classification), discriminant analysis (supervised classification), principal component analysis (a dimensionality-reduction method), and nonparametric density estimation, has been successfully used to search for meaningful associations in the 5-dimensional space of observables between observed points and sets of simulated points generated from a synthetic approach to galaxy modelling. These methodologies can be applied as new tools to obtain information about otherwise unrecognizable hidden structure and to place important constraints on the space distribution of various stellar populations in the Milky Way. In this paper, we concentrate on illustrating how nonparametric density estimation can substitute for the true densities of both the simulated sample and the real sample in the five-dimensional space. To fit model-predicted densities to reality, we derive a set of n equations (where n is the total number of observed points) in m unknown parameters (where m is the number of predefined groups). A least-squares estimation then allows us to determine the density laws of the different groups and components in the Galaxy. The output from our software, which can be used in many research fields, also gives the systematic error between the model and the observation via Bayes' rule.
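    The density-substitution step can be sketched in one dimension: hypothetical group densities are estimated nonparametrically with kernels, evaluated at the observed points to form n equations in m unknown group weights, and solved by least squares. The groups, sample sizes, and mixture proportions below are invented for illustration.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# two hypothetical stellar "groups" simulated in a 1-D observable
groups = [rng.normal(0.0, 1.0, 500), rng.normal(3.0, 0.5, 500)]
# "observed" sample: an (unknown to the fit) 30/70 mixture of the two groups
observed = np.concatenate([rng.normal(0.0, 1.0, 300), rng.normal(3.0, 0.5, 700)])

# kernel density estimates substitute for the unknown true densities
group_kdes = [gaussian_kde(g) for g in groups]
obs_kde = gaussian_kde(observed)

# evaluate at the observed points: one equation per point, one unknown per group
x = observed
A = np.column_stack([k(x) for k in group_kdes])   # n_points x m_groups
b = obs_kde(x)
weights, *_ = np.linalg.lstsq(A, b, rcond=None)
weights /= weights.sum()   # normalise to mixture proportions
```

    With well-separated groups the recovered weights approximate the true mixture fractions; in the paper's setting the same system is solved in five dimensions with model-generated groups.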

  5. Estimation of Electron Density profile Using the Propagation Characteristics of Radio Waves by S-520-29 Sounding Rocket

    NASA Astrophysics Data System (ADS)

    Itaya, K.; Ishisaka, K.; Ashihara, Y.; Abe, T.; Kumamoto, A.; Kurihara, J.

    2015-12-01

    The S-520-29 sounding rocket experiment was carried out at the Uchinoura Space Center (USC) at 19:10 JST on 17 August 2014. The purpose of this experiment was to observe the sporadic E layer that appears in the lower ionosphere near 100 km altitude. Three methods were used to observe the sporadic E layer. The first is an optical method that images the light of metal ions emitted by resonance scattering in the sporadic E layer. The second is observation of the propagation characteristics of LF/MF-band radio waves transmitted from the ground. The third is measurement of the electron density in the vicinity of the rocket using a fast Langmuir probe and an impedance probe. We analyze the propagation characteristics of radio waves in the sporadic E layer from the results of the second method. The rocket was equipped with an LF/MF-band radio receiver, whose antenna consists of a three-axis loop antenna, to observe radio waves during flight. The receiver records three transmissions from the ground: 873 kHz (JOGB), 666 kHz (JOBK), and 60 kHz (JJY). The 873 kHz and 60 kHz waves are transmitted from north of the rocket trajectory, and the 666 kHz wave from the east. During the experiment the receiver worked properly and the observation of radio wave intensity was completed. We analyze the observations using Doppler shift calculations based on frequency analysis. Radio waves received by the sounding rocket are Doppler shifted and affected by polarization, the direction of the rocket spin, and the Earth's magnetic field, so the received waves are first separated into characteristic waves by frequency analysis, and the Doppler shift is then calculated from the separated data. As a result, the 873 kHz and 666 kHz waves were found to be reflected by the ionosphere, while the 60 kHz wave propagated through the ionosphere because its wavelength is longer than the thickness of the sporadic E layer. In this study, we present the results of the LF/MF-band radio receiver observations and the electron density of the ionosphere derived by frequency analysis from the S-520-29 sounding rocket experiment.
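    To first order, the Doppler calculation the analysis relies on reduces to the classical relation Δf = -f₀·v_r/c. This toy function (the function name and the example radial velocity are assumptions for illustration) shows the scale of the frequency shifts involved for one of the transmitter frequencies above.

```python
# speed of light in vacuum, m/s
C = 299_792_458.0

def doppler_shift(f0_hz, radial_velocity_ms):
    """First-order Doppler frequency offset, in Hz, seen by a receiver moving
    at radial_velocity_ms (positive = receding) relative to a fixed
    transmitter at carrier frequency f0_hz."""
    return -f0_hz * radial_velocity_ms / C
```

    For the 873 kHz carrier, a 1 km/s radial velocity (a typical sounding-rocket speed scale) gives a shift of only a few hertz, which is why a careful frequency analysis of the separated characteristic waves is needed.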

  6. Dark matter self-interactions and small scale structure

    NASA Astrophysics Data System (ADS)

    Tulin, Sean; Yu, Hai-Bo

    2018-02-01

    We review theories of dark matter (DM) beyond the collisionless paradigm, known as self-interacting dark matter (SIDM), and their observable implications for astrophysical structure in the Universe. Self-interactions are motivated, in part, due to the potential to explain long-standing (and more recent) small scale structure observations that are in tension with collisionless cold DM (CDM) predictions. Simple particle physics models for SIDM can provide a universal explanation for these observations across a wide range of mass scales spanning dwarf galaxies, low and high surface brightness spiral galaxies, and clusters of galaxies. At the same time, SIDM leaves intact the success of ΛCDM cosmology on large scales. This report covers the following topics: (1) small scale structure issues, including the core-cusp problem, the diversity problem for rotation curves, the missing satellites problem, and the too-big-to-fail problem, as well as recent progress in hydrodynamical simulations of galaxy formation; (2) N-body simulations for SIDM, including implications for density profiles, halo shapes, substructure, and the interplay between baryons and self-interactions; (3) semi-analytic Jeans-based methods that provide a complementary approach for connecting particle models with observations; (4) merging systems, such as cluster mergers (e.g., the Bullet Cluster) and minor infalls, along with recent simulation results for mergers; (5) particle physics models, including light mediator models and composite DM models; and (6) complementary probes for SIDM, including indirect and direct detection experiments, particle collider searches, and cosmological observations. We provide a summary and critical look for all current constraints on DM self-interactions and an outline for future directions.

  7. Exercise, exercise training, and the immune system. A compendium of research (1902-1991)

    NASA Technical Reports Server (NTRS)

    Hardesty, A. J.; Greenleaf, J. E.; Simonson, S.; Hu, A.; Jackson, C. G. R.

    1993-01-01

    This compendium includes abstracts and synopses of clinical observations and of more basic studies involving physiological mechanisms concerning the interaction of physical exercise and the human immune system. If the author's abstract or summary was appropriate, it was included; in other cases, a more detailed synopsis of the paper was prepared under the subheadings 'Purpose,' 'Methods,' 'Results,' and 'Conclusions.' Author and subject indices are provided, plus a selected bibliography of related work and of papers received while the volume was being prepared for publication. This volume includes material published from 1902 through 1991.

  8. Acceleration Tolerance: Effect of Exercise, Acceleration Training; Bed Rest and Weightlessness Deconditioning. A Compendium of Research (1950-1996)

    NASA Technical Reports Server (NTRS)

    Chou, J. L.; McKenzie, M. A.; Stad, N. J.; Barnes, P. R.; Jackson, C. G. R.; Ghiasvand, F.; Greenleaf, J. E.

    1997-01-01

    This compendium includes abstracts and annotations of clinical observations and of more basic studies involving physiological mechanisms concerning interaction of acceleration, training and deconditioning. If the author's abstract or summary was appropriate, it was included. In other cases a more detailed annotation of the paper was prepared under the subheadings Purpose, Methods, Results, and Conclusions. Author and keyword indices are provided, plus an additional selected bibliography of related work and of those papers received after the volume was prepared for publication. This volume includes material published from 1950-1996.

  9. A simplified flight-test method for determining aircraft takeoff performance that includes effects of pilot technique

    NASA Technical Reports Server (NTRS)

    Larson, T. J.; Schweikhard, W. G.

    1974-01-01

    A method for evaluating aircraft takeoff performance from brake release to air-phase height that requires fewer tests than conventionally required is evaluated with data for the XB-70 airplane. The method defines the effects of pilot technique on takeoff performance quantitatively, including the decrease in acceleration from drag due to lift. For a given takeoff weight and throttle setting, a single takeoff provides enough data to establish a standardizing relationship for the distance from brake release to any point where velocity is appropriate to rotation. The lower rotation rates penalized takeoff performance in terms of ground roll distance; the lowest observed rotation rate required a ground roll distance that was 19 percent longer than the highest. Rotations at the minimum rate also resulted in lift-off velocities that were approximately 5 knots lower than the highest rotation rate at any given lift-off distance.

  10. Qualitative research methods in renal medicine: an introduction.

    PubMed

    Bristowe, Katherine; Selman, Lucy; Murtagh, Fliss E M

    2015-09-01

    Qualitative methodologies are becoming increasingly widely used in health research. However, within some specialties, including renal medicine, qualitative approaches remain under-represented in the high-impact factor journals. Qualitative research can be undertaken: (i) as a stand-alone research method, addressing specific research questions; (ii) as part of a mixed methods approach alongside quantitative approaches or (iii) embedded in clinical trials, or during the development of complex interventions. The aim of this paper is to introduce qualitative research, including the rationale for choosing qualitative approaches, and guidance for ensuring quality when undertaking and reporting qualitative research. In addition, we introduce types of qualitative data (observation, interviews and focus groups) as well as some of the most commonly encountered methodological approaches (case studies, ethnography, phenomenology, grounded theory, thematic analysis, framework analysis and content analysis). © The Author 2015. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.

  11. School Site Visits for Community-Based Participatory Research on Healthy Eating

    PubMed Central

    Patel, Anisha I.; Bogart, Laura M.; Uyeda, Kimberly E.; Martinez, Homero; Knizewski, Ritamarie; Ryan, Gery W.; Schuster, Mark A.

    2010-01-01

    Background School nutrition policies are gaining support as a means of addressing childhood obesity. Community-based participatory research (CBPR) offers an approach for academic and community partners to collaborate to translate obesity-related school policies into practice. Site visits, in which trained observers visit settings to collect multilevel data (e.g., observation, qualitative interviews), may complement other methods that inform health promotion efforts. This paper demonstrates the utility of site visits in the development of an intervention to implement obesity-related policies in Los Angeles Unified School District (LAUSD) middle schools. Methods In 2006, trained observers visited four LAUSD middle schools. Observers mapped cafeteria layout; observed food/beverage offerings, student consumption, waste patterns, and duration of cafeteria lines; spoke with school staff and students; and collected relevant documents. Data were examined for common themes and patterns. Results Food and beverages sold in study schools met LAUSD nutritional guidelines, and nearly all observed students had time to eat most or all of their meal. Some LAUSD policies were not implemented, including posting nutritional information for cafeteria food, marketing school meals to improve student participation in the National School Lunch Program, and serving a variety of fruits and vegetables. Cafeteria understaffing and cost were obstacles to policy implementation. Conclusions Site visits were a valuable methodology for evaluating the implementation of school district obesity-related policies and contributed to the development of a CBPR intervention to translate school food policies into practice. Future CBPR studies may consider site visits in their toolbox of formative research methods. PMID:19896033

  12. The integration of astro-geodetic data observed with ACSYS to the local geoid models Istanbul-Turkey

    NASA Astrophysics Data System (ADS)

    Halicioglu, Kerem; Ozludemir, M. Tevfik; Deniz, Rasim; Ozener, Haluk; Albayrak, Muge; Ulug, Rasit; Basoglu, Burak

    2017-04-01

    Astro-geodetic deflection-of-the-vertical components provide accurate and valuable information on the Earth's gravity field. Conventional methods require considerable effort and time, whereas new instruments, namely digital zenith camera systems (DZCS), have been designed to eliminate drawbacks of the conventional methods, such as observer-dependent errors and long observation times, and to improve observation accuracy. The observation principle is based on capturing star images near the zenith direction to determine the astronomical coordinates of the station point, integrating a CCD, a telescope, tiltmeters, and GNSS devices. In Turkey a new DZCS has been designed and tested on a control network located in Istanbul, whose geoid height differences were known with an accuracy of ±3.5 cm. The Astro-geodetic Camera System (ACSYS) was used to determine the deflection-of-the-vertical components with an accuracy of ±0.1-0.3 arcseconds, and the results were compared with geoid height differences using the astronomical levelling procedure, as well as with values calculated from global geopotential models. In this study the recent results of the first digital zenith camera system of Turkey are presented, and future work is introduced, including current developments of the system (hardware and software upgrades) and a new observation strategy for the ACSYS. We also discuss the contribution and integration of astro-geodetic deflection-of-the-vertical components into geoid determination studies in the light of ongoing projects in Turkey.
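    The astronomical levelling step mentioned above can be sketched as the standard discrete integral ΔN = -ε̄·Δs along a profile: geoid height differences accumulate from deflection-of-the-vertical components averaged over each segment. The arcsecond-to-radian conversion and trapezoid averaging are standard; the profile values below are invented for illustration.

```python
import numpy as np

def astronomical_levelling(epsilon_arcsec, segment_lengths_m):
    """Cumulative geoid height differences (metres) along a profile from
    vertical-deflection components: dN = -mean(eps_i, eps_{i+1}) * ds,
    with eps given in arcseconds at the segment endpoints."""
    eps = np.asarray(epsilon_arcsec) * np.pi / (180.0 * 3600.0)  # to radians
    ds = np.asarray(segment_lengths_m)
    dN = -0.5 * (eps[:-1] + eps[1:]) * ds   # trapezoid rule per segment
    return np.cumsum(dN)

# toy profile: constant 1-arcsecond deflection over two 1 km segments
N = astronomical_levelling([1.0, 1.0, 1.0], [1000.0, 1000.0])
```

    A constant 1-arcsecond deflection lowers the geoid by about 4.85 mm per kilometre, which illustrates why the ±0.1-0.3 arcsecond ACSYS accuracy is compatible with the ±3.5 cm control-network geoid heights.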

  13. Site-occupancy distribution modeling to correct population-trend estimates derived from opportunistic observations

    USGS Publications Warehouse

    Kery, M.; Royle, J. Andrew; Schmid, Hans; Schaub, M.; Volet, B.; Hafliger, G.; Zbinden, N.

    2010-01-01

    Species' assessments must frequently be derived from opportunistic observations made by volunteers (i.e., citizen scientists). Interpretation of the resulting data to estimate population trends is plagued with problems, including teasing apart genuine population trends from variations in observation effort. We devised a way to correct for annual variation in effort when estimating trends in occupancy (species distribution) from faunal or floral databases of opportunistic observations. First, for all surveyed sites, detection histories (i.e., strings of detection-nondetection records) are generated. Within-season replicate surveys provide information on the detectability of an occupied site. Detectability directly represents observation effort; hence, estimating detectability means correcting for observation effort. Second, site-occupancy models are applied directly to the detection-history data set (i.e., without aggregation by site and year) to estimate detectability and species distribution (occupancy, i.e., the true proportion of sites where a species occurs). Site-occupancy models also provide unbiased estimators of components of distributional change (i.e., colonization and extinction rates). We illustrate our method with data from a large citizen-science project in Switzerland in which field ornithologists record opportunistic observations. We analyzed data collected on four species: the widespread Kingfisher (Alcedo atthis) and Sparrowhawk (Accipiter nisus) and the scarce Rock Thrush (Monticola saxatilis) and Wallcreeper (Tichodroma muraria). Our method requires that all observed species are recorded. Detectability was <1 and varied over the years. Simulations suggested some robustness, but we advocate recording complete species lists (checklists), rather than recording individual records of single species. The representation of observation effort with its effect on detectability provides a solution to the problem of differences in effort encountered when extracting trend information from haphazard observations. We expect our method to be widely applicable for global biodiversity monitoring and modeling of species distributions. © 2010 Society for Conservation Biology.
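    The core of a single-season site-occupancy model can be sketched as a two-parameter likelihood over detection histories. This is a minimal illustration of the model class (constant occupancy ψ and detectability p), not the authors' year-structured analysis; the simulated ψ and p values are invented.

```python
import numpy as np
from scipy.optimize import minimize

def occupancy_nll(params, y):
    """Negative log-likelihood of a constant-psi, constant-p single-season
    site-occupancy model. y: sites x visits 0/1 detection-history matrix;
    params are (psi, p) on the logit scale."""
    psi, p = 1.0 / (1.0 + np.exp(-params))
    d = y.sum(axis=1)                      # detections per site
    K = y.shape[1]                         # replicate visits per site
    # sites with at least one detection are certainly occupied
    ll_det = np.log(psi) + d * np.log(p) + (K - d) * np.log(1 - p)
    # sites with no detections: occupied-but-missed, or truly unoccupied
    ll_none = np.log(psi * (1 - p) ** K + (1 - psi))
    return -np.where(d > 0, ll_det, ll_none).sum()

# simulate 500 sites x 3 visits with true occupancy 0.6, detectability 0.4
rng = np.random.default_rng(42)
z = rng.random(500) < 0.6                  # latent occupancy states
y = (rng.random((500, 3)) < 0.4) & z[:, None]

res = minimize(occupancy_nll, x0=np.zeros(2), args=(y.astype(float),))
psi_hat, p_hat = 1.0 / (1.0 + np.exp(-res.x))
```

    The within-season replicate visits are what make p identifiable, which is exactly the mechanism the paper uses to separate observation effort from genuine distribution change.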

  14. Measuring domestic water use: a systematic review of methodologies that measure unmetered water use in low-income settings.

    PubMed

    Tamason, Charlotte C; Bessias, Sophia; Villada, Adriana; Tulsiani, Suhella M; Ensink, Jeroen H J; Gurley, Emily S; Mackie Jensen, Peter Kjaer

    2016-11-01

    To present a systematic review of methods for measuring domestic water use in settings where water meters cannot be used. We systematically searched the EMBASE, PubMed, Water Intelligence Online, Water Engineering and Development Center, IEEExplore, Scielo, and Science Direct databases for articles that reported methodologies for measuring water use at the household level where water metering infrastructure was absent or incomplete. A narrative review explores similarities and differences between the included studies and provides recommendations for future research on water use. A total of 21 studies were included in the review. Methods ranged from single-day to 14-consecutive-day visits, and water use recall ranged from 12 h to 7 days. Data were collected using questionnaires, observations or both. Many studies only collected information on water that was carried into the household, and some failed to mention whether water was used outside the home. Water use in the selected studies was found to range from 2 to 113 litres per capita per day. No standardised methods for measuring unmetered water use were found, which brings into question the validity and comparability of studies that have measured unmetered water use. In future studies, it will be essential to define all components that make up water use and determine how they will be measured. A pre-study that involves observations and direct measurements during water collection periods (these will have to be determined through questioning) should be used to determine optimal methods for obtaining water use information in a survey. Day-to-day and seasonal variation should be included. A study that investigates water use recall is warranted to further develop standardised methods to measure water use; in the meantime, water use recall should be limited to 24 h or fewer. © 2016 The Authors. Tropical Medicine & International Health Published by John Wiley & Sons Ltd.

  15. Development of Gridded Ensemble Precipitation and Temperature Datasets for the Contiguous United States Plus Hawai'i and Alaska

    NASA Astrophysics Data System (ADS)

    Newman, A. J.; Clark, M. P.; Nijssen, B.; Wood, A.; Gutmann, E. D.; Mizukami, N.; Longman, R. J.; Giambelluca, T. W.; Cherry, J.; Nowak, K.; Arnold, J.; Prein, A. F.

    2016-12-01

    Gridded precipitation and temperature products are inherently uncertain due to myriad factors, including interpolation from a sparse observation network, measurement representativeness, and measurement errors. Despite this, uncertainty estimates are typically not included, or are added to each dataset in a product-specific way with little general applicability across datasets. A lack of quantitative uncertainty estimates for hydrometeorological forcing fields limits their utility for land-surface and hydrologic modeling techniques such as data assimilation, probabilistic forecasting, and verification. To address this gap, we have developed a first-of-its-kind gridded, observation-based ensemble of precipitation and temperature at a daily increment for the period 1980-2012 over the United States (including Alaska and Hawaii). A longer, higher-resolution version (1970-present, 1/16th degree) has also been implemented to support real-time hydrologic monitoring and prediction in several regional US domains. We will present the development and evaluation of the dataset, along with initial applications for ensemble data assimilation and probabilistic evaluation of high-resolution regional climate model simulations. We will also present results for the new high-resolution products for Alaska and Hawaii (2 km and 250 m, respectively), completing the first ensemble observation-based product suite for all 50 states. Finally, we will present plans to improve the ensemble dataset, focusing on the methods used for station interpolation and ensemble generation, as well as methods to fuse station data with numerical weather prediction model output.

  16. Virtual Observatory and Colitec Software: Modules, Features, Methods

    NASA Astrophysics Data System (ADS)

    Pohorelov, A. V.; Khlamov, S. V.; Savanevych, V. E.; Briukhovetskyi, A. B.; Vlasenko, V. P.

    In this article we describe the complex processing system created by the CoLiTec project. The system includes user-friendly tools for processing control and results review, integration with online catalogs, and a number of computational modules based on the developed methods, some of which are described in the article. The main directions of CoLiTec software development are the Virtual Observatory, software for automated asteroid and comet detection, and software for brightness equalization. The CoLiTec software is widely used in a number of observatories in the CIS. It has been used in about 700 000 observations, during which 1560 asteroids were discovered, including 5 NEOs, 21 Jupiter Trojans, 1 Centaur, and 4 comets.

  17. The Snapshot A-Star SurveY (SASSY)

    NASA Astrophysics Data System (ADS)

    Garani, Jasmine; Nielsen, Eric L.; Marchis, Franck; Liu, Michael C.; Macintosh, Bruce; Rajan, Abhijith; De Rosa, Robert J.; Wang, Jason; Esposito, Thomas; Best, William M. J.; Bowler, Brendan P.; Dupuy, Trent J.; Ruffio, Jean-Baptiste

    2017-01-01

    We present the Snapshot A-Star SurveY (SASSY), an adaptive optics survey conducted using NIRC2 on the Keck II telescope to search for young, self-luminous planets and brown dwarfs (M > 5 MJup) around high-mass stars (M > 1.5 M⊙). We describe a custom data-reduction pipeline developed for the coronagraphic observations of our 200 target stars. Our data analysis method includes basic near-infrared data processing (flat-field correction, bad pixel removal, distortion correction) as well as PSF subtraction through a Reference Differential Imaging algorithm based on a library of PSFs derived from the observations using the pyKLIP routine. We present early results from the survey, including planet and brown dwarf candidates and the status of ongoing follow-up observations. Utilizing the high contrast of Keck NIRC2 coronagraphic observations, SASSY reaches sensitivity to brown dwarfs and planetary-mass companions at separations between 0.6'' and 4''. With over 200 stars observed, we are tripling the number of high-mass stars imaged at these contrasts and sensitivities compared to previous surveys. This work was supported by the NSF REU program at the SETI Institute and NASA grant NNX14AJ80G.
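    The Reference Differential Imaging step can be sketched as a projection onto the principal components of a PSF library, which is the core idea behind KLIP-style subtraction. This toy version on flattened 1-D "images" is an assumption-laden illustration, not the pyKLIP pipeline itself; the basis patterns and the injected point source are invented.

```python
import numpy as np

def rdi_psf_subtract(target, ref_library, k=5):
    """Reference differential imaging: model the target PSF as its projection
    onto the first k principal components (KL modes) of a reference-PSF
    library, then subtract that model. Images are flattened 1-D arrays."""
    R = ref_library - ref_library.mean(axis=1, keepdims=True)
    tgt = target - target.mean()
    _, _, Vt = np.linalg.svd(R, full_matrices=False)
    modes = Vt[:k]                       # k orthonormal eigenimages
    model = modes.T @ (modes @ tgt)      # least-squares PSF model of target
    return tgt - model

# demo: stellar PSFs spanned by 3 basis patterns, plus a faint point source
rng = np.random.default_rng(0)
basis = rng.standard_normal((3, 100))
refs = rng.standard_normal((20, 3)) @ basis       # 20 reference PSFs
planet = np.zeros(100)
planet[7] = 5.0                                   # companion at "pixel" 7
target = rng.standard_normal(3) @ basis + planet

residual = rdi_psf_subtract(target, refs, k=3)
```

    Because the stellar component lies in the span of the reference modes while the point source mostly does not, the subtraction suppresses the star and leaves the companion signal largely intact.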

  18. An Adaptive Network-based Fuzzy Inference System for the detection of thermal and TEC anomalies around the time of the Varzeghan, Iran, (Mw = 6.4) earthquake of 11 August 2012

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2013-09-01

    Anomaly detection is extremely important for forecasting the date, location, and magnitude of an impending earthquake. In this paper, an Adaptive Network-based Fuzzy Inference System (ANFIS) is proposed to detect thermal and Total Electron Content (TEC) anomalies around the time of the Varzeghan, Iran, (Mw = 6.4) earthquake that struck NW Iran on 11 August 2012. ANFIS is a well-known hybrid neuro-fuzzy network for modeling non-linear complex systems. The anomalies detected using the proposed method are compared to those obtained by applying classical and intelligent methods, including the interquartile, Auto-Regressive Integrated Moving Average (ARIMA), Artificial Neural Network (ANN), and Support Vector Machine (SVM) methods. The dataset, comprising Aqua-MODIS Land Surface Temperature (LST) night-time snapshot images and Global Ionospheric Maps (GIM), spans 62 days. If the difference between the value predicted by the ANFIS method and the observed value exceeds a pre-defined threshold, the observed value, in the absence of non-seismic effective parameters, can be regarded as a precursory anomaly. For both the LST and TEC precursors, the ANFIS method shows very good agreement with the other implemented classical and intelligent methods, indicating that ANFIS is capable of detecting earthquake anomalies. The applied methods detected anomalous occurrences 1 and 2 days before the earthquake. The detection of the thermal and TEC anomalies derives its credibility from the overall efficiency and potential of the five integrated methods.
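    Of the baseline methods named above, the interquartile method is the simplest to sketch: flag days whose observed-minus-predicted residual falls outside the Tukey band around the middle quartiles. The 62-day series, noise scale, and injected spike below are invented for illustration; the paper's thresholds and predictors differ.

```python
import numpy as np

def iqr_anomalies(observed, predicted, k=1.5):
    """Interquartile-range precursor test: return indices where the residual
    between observed and predicted values falls outside
    [q1 - k*IQR, q3 + k*IQR]."""
    resid = np.asarray(observed, dtype=float) - np.asarray(predicted, dtype=float)
    q1, q3 = np.percentile(resid, [25, 75])
    band = k * (q3 - q1)
    return np.flatnonzero((resid < q1 - band) | (resid > q3 + band))

# 62 days of well-predicted values with one injected precursor-like spike
rng = np.random.default_rng(3)
predicted = np.zeros(62)
observed = predicted + 0.1 * rng.standard_normal(62)
observed[60] += 5.0   # hypothetical anomaly shortly before the event

days = iqr_anomalies(observed, predicted)
```

    The same residual-thresholding logic underlies the ANFIS comparison in the abstract: the method differs only in how the predicted value for each day is produced.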

  19. Russian-Cuban Colocation Station for Radio Astronomical Observation and Monitoring of Near-Earth Space

    NASA Astrophysics Data System (ADS)

    Ivanov, D. V.; Uratsuka, M.-R.; Ipatov, A. V.; Marshalov, D. A.; Shuygina, N. V.; Vasilyev, M. V.; Gayazov, I. S.; Ilyin, G. N.; Bondarenko, Yu. S.; Melnikov, A. E.; Suvorkin, V. V.

    2018-04-01

    The article presents the main possibilities of using the projected Russian-Cuban geodynamic colocation station, based at the Institute of Geophysics and Astronomy of the Ministry of Science, Technology and the Environment of the Republic of Cuba, to carry out radio observations and monitor near-Earth space. The potential capabilities of the station are considered for supporting various observational programs: astrophysical observations; space-geodesy observations using very long baseline radio interferometers, global navigation satellite systems, laser rangers, and various Doppler systems; and monitoring of artificial and natural bodies in near-Earth and deep space, including the ranging of asteroids approaching the Earth. The results of modeling observations at the planned station are compared with those obtained at existing geodynamic stations. The efficiency of the projected Russian-Cuban station for solving astronomical tasks is considered.

  20. Mission design for an orbiting volcano observatory

    NASA Technical Reports Server (NTRS)

    Penzo, Paul A.; Johnston, M. Daniel

    1990-01-01

    The Mission to Planet Earth initiative will require global observation of land, sea, and atmosphere, and all associated phenomena, over the coming years, perhaps for decades. One phenomenon playing a major part in Earth's environment is volcanic activity. Orbital observations, including IR, UV, and visible imaging, may be made to monitor many active sites, and eventually increase our understanding of volcanoes and lead to the predictability of eruptions. This paper presents the orbital design and maneuvering capability of a low-cost, volcano-observing satellite flying in low Earth orbit. Major science requirements include observing as many as 10 to 20 active sites daily, or every two or three days. Given specific geographic locations of these sites, it is necessary to search the trajectory space for those orbits which maximize overflight opportunities. Once the satellite is in orbit, it may also be desirable to alter the orbit to fly over targets of opportunity: active areas which are not being monitored but which show signs of erupting, or have in fact erupted. Multiple-impulse orbital maneuvering methods have been developed to minimize propellant usage for these orbital changes.
