Science.gov

Sample records for quantitative survey design

  1. Telephone Survey Designs.

    ERIC Educational Resources Information Center

    Casady, Robert J.

    The concepts, definitions, and notation that have evolved with the development of telephone survey design methodology are discussed and presented as a unified structure. This structure is then applied to some of the more well-known telephone survey designs and alternative designs are developed. The relative merits of the different survey designs…

  2. Quantitative resilience analysis through control design.

    SciTech Connect

    Sunderland, Daniel; Vugrin, Eric D.; Camphouse, Russell Chris

    2009-09-01

    Critical infrastructure resilience has become a national priority for the U.S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. Few quantitative resilience methods exist, and those that do tend to be rather simplistic and, hence, incapable of sufficiently assessing all aspects of critical infrastructure resilience. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the development of quantitative resilience analysis through the application of control design methods. Specifically, we conducted a survey of infrastructure models to assess what types of control design might be applicable for critical infrastructure resilience assessment. As a result of this survey, we developed a decision process that directs the resilience analyst to the control method most likely applicable to the system under consideration. Furthermore, we developed optimal control strategies for two sets of representative infrastructure systems to demonstrate how control methods could be used to assess the resilience of the systems to catastrophic disruptions. We present recommendations for future work to continue the development of quantitative resilience analysis methods.

  3. Applying Knowledge of Quantitative Design and Analysis

    ERIC Educational Resources Information Center

    Baskas, Richard S.

    2011-01-01

    This study compared and contrasted two quantitative scholarly articles in relation to their research designs. Their designs were analyzed by the comparison of research references and research specific vocabulary to describe how various research methods were used. When researching and analyzing quantitative scholarly articles, it is imperative to…

  4. 1997 construction & design survey.

    PubMed

    Pinto, C

    1997-03-31

    Managed care might seem to be putting a damper on healthcare construction, but in fact it's one of several industry changes creating opportunities for architectural and design firms. One example of a trend toward making surroundings as pleasant as possible is the west campus expansion at East Texas Medical Center in Tyler (left). Designed and built by Ellerbe Becket and completed in 1995, the project, including a nine-story medical office building, features artwork and rooftop gardens. PMID:10165801

  5. Report on Solar Water Heating Quantitative Survey

    SciTech Connect

    Focus Marketing Services

    1999-05-06

    This report details the results of a quantitative research study undertaken to better understand the marketplace for solar water-heating systems from the perspective of home builders, architects, and home buyers.

  6. WATERSHED BASED SURVEY DESIGNS

    EPA Science Inventory

    The development of watershed-based design and assessment tools will help to serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional condition to meet Section 305(b), identification of impaired water bodies or wate...

  7. Food Consumption and Handling Survey for Quantitative Microbiological Consumer Phase Risk Assessments.

    PubMed

    Chardon, Jurgen; Swart, Arno

    2016-07-01

    In the consumer phase of a typical quantitative microbiological risk assessment (QMRA), mathematical equations identify data gaps. To acquire useful data we designed a food consumption and food handling survey (2,226 respondents) for QMRA applications that is especially aimed at obtaining quantitative data. For a broad spectrum of food products, the survey covered the following topics: processing status at retail, consumer storage, preparation, and consumption. Questions were designed to facilitate distribution fitting. In the statistical analysis, special attention was given to the selection of the most adequate distribution to describe the data. Bootstrap procedures were used to describe uncertainty. The final result was a coherent quantitative consumer phase food survey and parameter estimates for food handling and consumption practices in The Netherlands, including variation over individuals and uncertainty estimates.
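
    The distribution-fitting and bootstrap steps described in this abstract can be sketched with simulated data. This is a minimal illustration, not the paper's actual analysis; the lognormal model, sample size, and storage-time framing are all assumptions.

```python
import math
import random
import statistics

random.seed(0)

# Hypothetical storage-time responses (days) for one survey question,
# standing in for the real food consumption and handling data.
data = [random.lognormvariate(1.0, 0.5) for _ in range(500)]

# Fit a lognormal by matching the moments of log(data) -- one of several
# candidate distributions one would compare (e.g. by AIC) before selecting
# the most adequate one, as the abstract describes.
logs = [math.log(x) for x in data]
mu_hat = statistics.mean(logs)
sigma_hat = statistics.stdev(logs)

# Nonparametric bootstrap to describe uncertainty in the fitted mu.
boot = []
for _ in range(1000):
    resample = random.choices(data, k=len(data))
    boot.append(statistics.mean(math.log(x) for x in resample))
boot.sort()
ci_lo, ci_hi = boot[24], boot[974]  # approximate 95% percentile interval
print(f"mu_hat={mu_hat:.3f}  95% CI=({ci_lo:.3f}, {ci_hi:.3f})")
```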

  8. Qualities of a Psychiatric Mentor: A Quantitative Singaporean Survey

    ERIC Educational Resources Information Center

    Tor, Phern-Chern; Goh, Lee-Gan; Ang, Yong-Guan; Lim, Leslie; Winslow, Rasaiah-Munidasa; Ng, Beng-Yeong; Wong, Sze-Tai; Ng, Tse-Pin; Kia, Ee-Heok

    2011-01-01

    Objective: Psychiatric mentors are an important part of the new, seamless training program in Singapore. There is a need to assess the qualities of a good psychiatric mentor vis-a-vis those of a good psychiatrist. Method: An anonymous survey was sent out to all psychiatry trainees and psychiatrists in Singapore to assess quantitatively the…

  9. RESOLVE and ECO: Survey Design

    NASA Astrophysics Data System (ADS)

    Kannappan, Sheila; Moffett, Amanda J.; Norris, Mark A.; Eckert, Kathleen D.; Stark, David; Berlind, Andreas A.; Snyder, Elaine M.; Norman, Dara J.; Hoversten, Erik A.; RESOLVE Team

    2016-01-01

    The REsolved Spectroscopy Of a Local VolumE (RESOLVE) survey is a volume-limited census of stellar, gas, and dynamical mass as well as star formation and galaxy interactions within >50,000 cubic Mpc of the nearby cosmic web, reaching down to dwarf galaxies of baryonic mass ~10^9 Msun and spanning multiple large-scale filaments, walls, and voids. RESOLVE is surrounded by the ~10x larger Environmental COntext (ECO) catalog, with matched custom photometry and environment metrics enabling analysis of cosmic variance with greater statistical power. For the ~1500 galaxies in its two equatorial footprints, RESOLVE goes beyond ECO in providing (i) deep 21cm data with adaptive sensitivity ensuring HI mass detections or upper limits <10% of the stellar mass and (ii) 3D optical spectroscopy including both high-resolution ionized gas or stellar kinematic data for each galaxy and broad 320-725nm spectroscopy spanning [OII] 3727, Halpha, and Hbeta. RESOLVE is designed to complement other radio and optical surveys in providing diverse, contiguous, and uniform local/global environment data as well as unusually high completeness extending into the gas-dominated dwarf galaxy regime. RESOLVE also offers superb reprocessed photometry including full, deep NUV coverage and synergy with other equatorial surveys as well as unique northern and southern facilities such as Arecibo, the GBT, and ALMA. The RESOLVE and ECO surveys have been supported by funding from NSF grants AST-0955368 and OCI-1156614.

  10. Watershed-based survey designs.

    PubMed

    Detenbeck, Naomi E; Cincotta, Dan; Denver, Judith M; Greenlee, Susan K; Olsen, Anthony R; Pitchford, Ann M

    2005-04-01

    Watershed-based sampling design and assessment tools help serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional conditions to meet Section 305(b), identification of impaired water bodies or watersheds to meet Section 303(d), and development of empirical relationships between causes or sources of impairment and biological responses. Creation of GIS databases for hydrography, hydrologically corrected digital elevation models, and hydrologic derivatives such as watershed boundaries and upstream-downstream topology of subcatchments would provide a consistent seamless nationwide framework for these designs. The elements of a watershed-based sample framework can be represented either as a continuous infinite set defined by points along a linear stream network, or as a discrete set of watershed polygons. Watershed-based designs can be developed with existing probabilistic survey methods, including the use of unequal probability weighting, stratification, and two-stage frames for sampling. Case studies for monitoring of Atlantic Coastal Plain streams, West Virginia wadeable streams, and coastal Oregon streams illustrate three different approaches for selecting sites for watershed-based survey designs. PMID:15861987
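
    The unequal-probability weighting mentioned in the abstract can be illustrated with a toy frame of watershed polygons. This is a hedged sketch, not the authors' procedure (operational programs typically use spatially balanced methods such as GRTS); the watershed names, areas, and weight formula below are invented for illustration.

```python
import random

random.seed(42)

# Hypothetical frame of watershed polygons with areas (km^2); larger
# watersheds receive proportionally higher inclusion probability.
watersheds = {f"ws{i:02d}": area for i, area in enumerate(
    [12, 45, 8, 90, 33, 27, 61, 5, 74, 19])}

n = 4  # sample size
total = sum(watersheds.values())

# Sequential weighted draw without replacement: a simple stand-in for an
# unequal-probability design over a discrete set of watershed polygons.
frame = dict(watersheds)
sample = []
for _ in range(n):
    pick = random.choices(list(frame), weights=frame.values(), k=1)[0]
    sample.append(pick)
    del frame[pick]

# Analysis weight ~ 1 / (first-draw inclusion probability), a rough proxy.
weights = {s: total / (n * watersheds[s]) for s in sample}
print(sample, weights)
```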

  11. Watershed-based survey designs

    USGS Publications Warehouse

    Detenbeck, N.E.; Cincotta, D.; Denver, J.M.; Greenlee, S.K.; Olsen, A.R.; Pitchford, A.M.

    2005-01-01

    Watershed-based sampling design and assessment tools help serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional conditions to meet Section 305(b), identification of impaired water bodies or watersheds to meet Section 303(d), and development of empirical relationships between causes or sources of impairment and biological responses. Creation of GIS databases for hydrography, hydrologically corrected digital elevation models, and hydrologic derivatives such as watershed boundaries and upstream-downstream topology of subcatchments would provide a consistent seamless nationwide framework for these designs. The elements of a watershed-based sample framework can be represented either as a continuous infinite set defined by points along a linear stream network, or as a discrete set of watershed polygons. Watershed-based designs can be developed with existing probabilistic survey methods, including the use of unequal probability weighting, stratification, and two-stage frames for sampling. Case studies for monitoring of Atlantic Coastal Plain streams, West Virginia wadeable streams, and coastal Oregon streams illustrate three different approaches for selecting sites for watershed-based survey designs. © Springer Science + Business Media, Inc. 2005.

  12. Examining Stress in Graduate Assistants: Combining Qualitative and Quantitative Survey Methods

    ERIC Educational Resources Information Center

    Mazzola, Joseph J.; Walker, Erin J.; Shockley, Kristen M.; Spector, Paul E.

    2011-01-01

    The aim of this study was to employ qualitative and quantitative survey methods in a concurrent mixed model design to assess stressors and strains in graduate assistants. The stressors most frequently reported qualitatively were work overload, interpersonal conflict, and organizational constraints; the most frequently reported psychological…

  13. Quantitative three-dimensional low-speed wake surveys

    NASA Technical Reports Server (NTRS)

    Brune, G. W.

    1992-01-01

    Theoretical and practical aspects of conducting three-dimensional wake measurements in large wind tunnels are reviewed with emphasis on applications in low-speed aerodynamics. Such quantitative wake surveys furnish separate values for the components of drag, such as profile drag and induced drag, but also measure lift without the use of a balance. In addition to global data, details of the wake flowfield as well as spanwise distributions of lift and drag are obtained. The paper demonstrates the value of this measurement technique using data from wake measurements conducted by Boeing on a variety of low-speed configurations including the complex high-lift system of a transport aircraft.

  14. WATERSHED-BASED SURVEY DESIGNS

    EPA Science Inventory

    Watershed-based sampling design and assessment tools help serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional conditions to meet Section 305(b), identification of impaired water bodies or watersheds to meet Sectio...

  15. Armchair Survey Sampling: An Aid in Teaching Survey Design.

    ERIC Educational Resources Information Center

    Thompson, M. E.

    A fictitious community of 583 households was set up to simulate a survey population, and was used in two laboratory assignments where students "interviewed" householders by a quota sampling procedure and tested the performance of several probability sampling designs. (Author/JEG)
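
    A classroom exercise like this is easy to simulate. The sketch below assumes a hypothetical 583-household population with made-up household sizes and tests one probability design (simple random sampling) against the known population mean; it is an illustration in the spirit of the article, not its actual materials.

```python
import random

random.seed(7)

# A fictitious community of 583 households, each with 1-6 members,
# standing in for the simulated survey population described above.
households = [random.randint(1, 6) for _ in range(583)]
true_mean = sum(households) / len(households)

# One probability design students might test: a simple random sample
# of 50 households, with the sample mean as the estimator.
sample = random.sample(households, 50)
est = sum(sample) / len(sample)
print(f"true mean={true_mean:.2f}  SRS estimate={est:.2f}")
```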

  16. Quantitative Laughter Detection, Measurement, and Classification-A Critical Survey.

    PubMed

    Cosentino, Sarah; Sessa, Salvatore; Takanishi, Atsuo

    2016-01-01

    The study of human nonverbal social behaviors has taken a more quantitative and computational approach in recent years due to the development of smart interfaces and virtual agents or robots able to interact socially. One of the most interesting nonverbal social behaviors, producing a characteristic vocal signal, is laughing. Laughter is produced in several different situations: in response to external physical, cognitive, or emotional stimuli; to negotiate social interactions; and also, pathologically, as a consequence of neural damage. For this reason, laughter has attracted researchers from many disciplines. A consequence of this multidisciplinarity is the absence of a holistic vision of this complex behavior: the methods of analysis and classification of laughter, as well as the terminology used, are heterogeneous, and the findings are sometimes contradictory and poorly documented. This survey aims at collecting and presenting objective measurement methods and results from a variety of studies in different fields, to contribute to building a unified model and taxonomy of laughter. This could be successfully used for advances in several fields, from artificial intelligence and human-robot interaction to medicine and psychiatry. PMID:26887012

  17. Statistical considerations in designing raptor surveys

    USGS Publications Warehouse

    Pendleton, G.W.

    1989-01-01

    Careful sampling design is required to obtain useful estimates of raptor abundance. Well-defined objectives, selection of appropriate sample units and sampling scheme, and attention to detail to reduce extraneous sources of variability and error are all important considerations in designing a raptor survey.

  18. Survey Design for Large-Scale, Unstructured Resistivity Surveys

    NASA Astrophysics Data System (ADS)

    Labrecque, D. J.; Casale, D.

    2009-12-01

    In this paper, we discuss the issues in designing data collection strategies for large-scale, poorly structured resistivity surveys. Existing or proposed applications for these types of surveys include carbon sequestration, enhanced oil recovery monitoring, monitoring of leachate from working or abandoned mines, and mineral surveys. Electrode locations are generally chosen by land access, utilities, roads, existing wells, etc. Classical arrays such as the Wenner array or dipole-dipole arrays are not applicable if the electrodes cannot be placed in quasi-regular lines or grids. A new, far more generalized strategy is needed for building data collection schemes. Following the approach of earlier two-dimensional (2-D) survey designs, the proposed method begins by defining a base array. In 2-D design, this base array is often a standard dipole-dipole array. For unstructured three-dimensional (3-D) design, determining this base array is a multi-step process. The first step is to determine a set of base dipoles with similar characteristics. For example, the base dipoles may consist of electrode pairs trending within 30 degrees of north and between 100 and 250 m in length. These dipoles are then combined into a trial set of arrays. This trial set of arrays is reduced by applying a series of filters based on criteria such as separation between the dipoles. Using the base array set, additional arrays are added and tested to determine the overall improvement in resolution and to determine an optimal set of arrays. Examples of the design process are shown for a proposed carbon sequestration monitoring system.
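
    The base-dipole filtering step described in the abstract can be sketched directly. The electrode coordinates below are invented, but the filter reproduces the stated criteria: keep electrode pairs trending within 30 degrees of north and 100-250 m long.

```python
import math

# Hypothetical electrode layout: (name, x, y) in metres, placed wherever
# land access allows rather than on a regular grid.
electrodes = [("e0", 0, 0), ("e1", 40, 120), ("e2", 200, 30),
              ("e3", 150, 210), ("e4", 60, 260), ("e5", 220, 180)]

def dipole_ok(a, b, max_az_deg=30.0, min_len=100.0, max_len=250.0):
    """Keep pairs trending within 30 degrees of north and 100-250 m long,
    mirroring the example base-dipole criteria in the abstract."""
    dx, dy = b[1] - a[1], b[2] - a[2]
    length = math.hypot(dx, dy)
    if not (min_len <= length <= max_len):
        return False
    azimuth = math.degrees(math.atan2(abs(dx), abs(dy)))  # 0 deg = north
    return azimuth <= max_az_deg

# All unordered electrode pairs that pass the filter form the base set,
# which would then be combined into trial arrays and filtered further.
base_dipoles = [(a[0], b[0]) for i, a in enumerate(electrodes)
                for b in electrodes[i + 1:] if dipole_ok(a, b)]
print(base_dipoles)
```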

  19. Ambulance Design Survey 2011: A Summary Report

    PubMed Central

    Lee, Y Tina; Kibira, Deogratias; Feeney, Allison Barnard; Marshall, Jennifer

    2013-01-01

    Current ambulance designs are ergonomically inefficient and often unsafe for practical treatment response to medical emergencies. Thus, the patient compartment of a moving ambulance is a hazardous working environment. As a consequence, emergency medical services (EMS) workers suffer fatalities and injuries that far exceed those of the average workplace in the United States. To reduce injury and mortality rates in ambulances, the Department of Homeland Security Science and Technology Directorate has teamed with the National Institute of Standards and Technology, the National Institute for Occupational Safety and Health, and BMT Designers & Planners in a joint project to produce science-based ambulance patient compartment design standards. This project will develop new crash-safety design standards and improved user-design interface guidance for patient compartments that are safer for EMS personnel and patients, and facilitate improved patient care. The project team has been working with practitioners, EMS workers’ organizations, and manufacturers to solicit needs and requirements to address related issues. This paper presents an analysis of practitioners’ concerns, needs, and requirements for improved designs elicited through the web-based survey of ambulance design conducted by the National Institute of Standards and Technology. This paper also introduces the survey, analyzes the survey results, and discusses recommendations for future ambulance patient compartment design. PMID:26401439

  20. Design-based and model-based inference in surveys of freshwater mollusks

    USGS Publications Warehouse

    Dorazio, R.M.

    1999-01-01

    Well-known concepts in statistical inference and sampling theory are used to develop recommendations for planning and analyzing the results of quantitative surveys of freshwater mollusks. Two methods of inference commonly used in survey sampling (design-based and model-based) are described and illustrated using examples relevant in surveys of freshwater mollusks. The particular objectives of a survey and the type of information observed in each unit of sampling can be used to help select the sampling design and the method of inference. For example, the mean density of a sparsely distributed population of mollusks can be estimated with higher precision by using model-based inference or by using design-based inference with adaptive cluster sampling than by using design-based inference with conventional sampling. More experience with quantitative surveys of natural assemblages of freshwater mollusks is needed to determine the actual benefits of different sampling designs and inferential procedures.
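
    The contrast between design-based and model-based inference for sparse, clustered populations can be made concrete with simulated quadrat counts. This sketch is illustrative only: the population, sample size, and the (deliberately inappropriate) Poisson model are assumptions, not the paper's data; it shows how a model-based standard error can understate uncertainty when the model ignores clustering.

```python
import random
import statistics

random.seed(1)

# Hypothetical quadrat counts for a sparse, clustered mollusk population:
# mostly empty quadrats plus a few dense patches (strongly overdispersed).
population = [0] * 180 + [random.randint(5, 30) for _ in range(20)]
random.shuffle(population)

# Simple random sample of quadrats.
sample = random.sample(population, 40)
mean = statistics.mean(sample)

# Design-based standard error: relies only on the SRS design, not a model.
se_design = statistics.stdev(sample) / len(sample) ** 0.5

# Model-based standard error under a Poisson model (variance = mean),
# which badly understates uncertainty for clustered counts like these.
se_poisson = (mean / len(sample)) ** 0.5

print(f"mean={mean:.2f}  design SE={se_design:.2f}  Poisson SE={se_poisson:.2f}")
```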

  1. Spatially balanced survey designs for natural resources

    EPA Science Inventory

    Ecological resource monitoring programs typically require the use of a probability survey design to select locations or entities to be physically sampled in the field. The ecological resource of interest, the target population, occurs over a spatial domain and the sample selecte...

  2. National Lake Assessment 2012 Potential Survey Design

    EPA Science Inventory

    In 2012 the Office of Water in collaboration with states and tribal nations will conduct the second National Lake Assessment. The purpose of this presentation is to present potential survey design approaches for this national assessment. Currently discussions are underway to de...

  3. The XMM-LSS survey. Survey design and first results

    NASA Astrophysics Data System (ADS)

    Pierre, Marguerite; Valtchanov, Ivan; Altieri, Bruno; Andreon, Stefano; Bolzonella, Micol; Bremer, Malcolm; Disseau, Ludovic; Dos Santos, Sergio; Gandhi, Poshak; Jean, Christophe; Pacaud, Florian; Read, Andrew; Refregier, Alexandre; Willis, Jon; Adami, Christophe; Alloin, Danielle; Birkinshaw, Mark; Chiappetti, Lucio; Cohen, Aaron; Detal, Alain; Duc, Pierre-Alain; Gosset, Eric; Hjorth, Jens; Jones, Laurence; Le Fèvre, Olivier; Lonsdale, Carol; Maccagni, Dario; Mazure, Alain; McBreen, Brian; McCracken, Henry; Mellier, Yannick; Ponman, Trevor; Quintana, Hernan; Rottgering, Huub; Smette, Alain; Surdej, Jean; Starck, Jean-Luc; Vigroux, Laurent; White, Simon

    2004-09-01

    The XMM Large Scale Structure survey (XMM-LSS) is a medium-deep, large-area X-ray survey. Its goal is to extend the large-scale structure investigations attempted using ROSAT cluster samples to higher redshifts. Two main science goals have driven the survey design: the evolutionary study of the cluster-cluster correlation function and of the cluster number density. The adopted observing configuration consists of an equatorial mosaic of 10 ks pointings, separated by 20 arcmin and covering 8° × 8°, giving a point-source sensitivity of ~5 × 10^-15 erg cm^-2 s^-1 in the 0.5-2 keV band. This will yield more than 800 clusters of galaxies and a sample of X-ray AGN with a space density of about 300 deg^-2. We present the expected cosmological implications of the survey in the context of ΛCDM models and cluster evolution. We give an overview of the first observational results. The XMM-LSS survey is associated with several other major surveys, ranging from the UV to the radio wavebands, which will provide the necessary resources for X-ray source identification and further statistical studies. In particular, the associated CFHTLS weak lensing and AMiBA Sunyaev-Zel'dovich surveys over the entire XMM-LSS area will provide for the first time a comprehensive study of the mass distribution and of cluster physics in the universe on scales of a few hundred Mpc. We describe the main characteristics of our wavelet-based X-ray pipeline and source identification procedures, including the classification of cluster candidates by means of a photometric redshift analysis. This permits the selection of suitable targets for spectroscopic follow-up. We present preliminary results from the first 25 XMM-LSS pointings: X-ray source properties, optical counterparts, and highlights from the first Magellan and VLT/FORS2 spectroscopic runs, as well as preliminary results from the NIR search for z > 1…

  4. 76 FR 27384 - Agency Information Collection Activity (Veteran Suicide Prevention Online Quantitative Surveys...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-11

    ... AFFAIRS Agency Information Collection Activity (Veteran Suicide Prevention Online Quantitative Surveys...). Type of Review: New collection. Abstract: VA's top priority is the prevention of Veterans suicide. It... better understand Veterans and their families' awareness of VA's suicide prevention and mental...

  5. Statistical design of quantitative mass spectrometry-based proteomic experiments.

    PubMed

    Oberg, Ann L; Vitek, Olga

    2009-05-01

    We review the fundamental principles of statistical experimental design, and their application to quantitative mass spectrometry-based proteomics. We focus on class comparison using Analysis of Variance (ANOVA), and discuss how randomization, replication and blocking help avoid systematic biases due to the experimental procedure, and help optimize our ability to detect true quantitative changes between groups. We also discuss the issues of pooling multiple biological specimens for a single mass analysis, and calculation of the number of replicates in a future study. When applicable, we emphasize the parallels between designing quantitative proteomic experiments and experiments with gene expression microarrays, and give examples from that area of research. We illustrate the discussion using theoretical considerations, and using real-data examples of profiling of disease.
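
    The randomization and blocking principles discussed in the abstract can be sketched for a small hypothetical experiment: allocating disease and control specimens across mass-spectrometry batches so that group is not confounded with batch. The group sizes and batch structure below are invented for illustration.

```python
import random

random.seed(3)

# Hypothetical proteomics experiment: 4 disease and 4 control specimens
# to be allocated across 2 MS batches (blocks) of 4 runs each.
specimens = [("disease", i) for i in range(4)] + [("control", i) for i in range(4)]

# Blocking: put equal numbers of each group in every batch, so batch
# effects cannot masquerade as group differences in the ANOVA.
disease = [s for s in specimens if s[0] == "disease"]
control = [s for s in specimens if s[0] == "control"]
random.shuffle(disease)
random.shuffle(control)

batches = []
for b in range(2):
    batch = disease[2 * b:2 * b + 2] + control[2 * b:2 * b + 2]
    random.shuffle(batch)  # randomization: run order within the block
    batches.append(batch)
print(batches)
```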

  6. The Dark Energy Survey CCD imager design

    SciTech Connect

    Cease, H.; DePoy, D.; Diehl, H.T.; Estrada, J.; Flaugher, B.; Guarino, V.; Kuk, K.; Kuhlmann, S.; Schultz, K.; Schmitt, R.L.; Stefanik, A.; /Fermilab /Ohio State U. /Argonne

    2008-06-01

    The Dark Energy Survey is planning to use a 3 sq. deg. camera that houses a ≈0.5 m diameter focal plane of 62 2k × 4k CCDs. The camera vessel, including the optical window cell, focal plate, focal plate mounts, cooling system, and thermal controls, is described. As part of the development of the mechanical and cooling design, a full-scale prototype camera vessel has been constructed and is now being used for multi-CCD readout tests. Results from this prototype camera are described.

  7. National Survey of Men: design and execution.

    PubMed

    Tanfer, K

    1993-01-01

    The National Survey of Men (NSM-I) was conducted in 1991 to examine issues related to sexual behavior and condom use among noninstitutionalized US men aged 20-39, intended as the baseline survey for a longitudinal study. A total of 20,086 housing units were canvassed, 2434 were excluded, and 16,414 of the remaining 17,652 housing units were successfully screened for eligibility. The main sample of the general population contained 1062 listing areas and an oversample contained 153 listing areas designated as black listing areas. The probability of selection of a listing area in the main survey sample was 1 in 10,511, while the probability in the black oversample was 1 in 1164. The questionnaire consisted of personal particulars; sexual initiation and current exposure; current wife or partner; previous marital relationships; other nonmarital sexual partners; nonsexual partners; health and risk-taking behavior; attitudes, perceptions, and knowledge of health-related and contraception-related issues; reasons for using or not using condoms; follow-up information; interviewer observations; and self-administered questions. Data collection and processing were carried out by the Institute for Survey Research at Temple University, Philadelphia. A total of 206 interviewers and 9 regional field coordinators were recruited for the field work; of these, 189 interviewers and 7 coordinators worked on the survey. The response rate of 70% was considered respectable, given the highly sensitive nature of the questions. Standard errors for various estimated percentages were provided separately for the white and the black samples. After the survey was completed, the final sample was weighted to reflect differential sampling rates, as well as to account for multiple households, multiple eligibility, and differential nonresponse. The final weight consisted of sampling weight, screening weight, eligibility weight, nonresponse weight, and poststratification weight, scaled to the sample size.
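
    The composite weighting described in the abstract can be sketched as a product of component weights scaled to the sample size. Only the 1-in-10,511 sampling rate and the 70% response rate come from the abstract; the remaining component values and the three-respondent example are invented for illustration.

```python
# Hypothetical component weights for one respondent; the final analysis
# weight is their product, later scaled across respondents.
components = {
    "sampling": 10511.0,        # inverse of the 1-in-10,511 selection rate
    "screening": 1.05,          # illustrative value
    "eligibility": 1.02,        # illustrative value
    "nonresponse": 1.0 / 0.70,  # inverse of the 70% response rate
    "poststratification": 0.98, # illustrative value
}

raw = 1.0
for w in components.values():
    raw *= w

# With many respondents, divide each raw weight by the mean raw weight so
# the final weights average to 1, i.e. sum to the sample size.
raw_weights = [raw, raw * 0.8, raw * 1.3]  # e.g. three respondents
mean_raw = sum(raw_weights) / len(raw_weights)
final = [w / mean_raw for w in raw_weights]
print(final)
```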

  8. Quantitative proteomic survey of endoplasmic reticulum in mouse liver.

    PubMed

    Song, Yanping; Jiang, Ying; Ying, Wantao; Gong, Yan; Yan, Yujuan; Yang, Dong; Ma, Jie; Xue, Xiaofang; Zhong, Fan; Wu, Songfeng; Hao, Yunwei; Sun, Aihua; Li, Tao; Sun, Wei; Wei, Handong; Zhu, Yunping; Qian, Xiaohong; He, Fuchu

    2010-03-01

    To gain a better understanding of the critical function of the endoplasmic reticulum (ER) in liver, we carried out a proteomic survey of mouse liver ER. The ER proteome was profiled with a new three-dimensional, gel-based strategy. From 6152 and 6935 MS spectra, 903 and 1042 proteins were identified with at least two peptide matches at 95% confidence in the rough (r) and smooth (s) ER, respectively. Comparison of the rER and sER proteomes showed that calcium-binding proteins are significantly enriched in the sER, suggesting that the ion-binding function of the ER is compartmentalized. Comparison of the rat and mouse ER proteomes showed that 662 proteins were common to both, comprising 53.5% and 49.3% of those proteomes, respectively. We proposed that these proteins were stably expressed proteins that were essential for the maintenance of ER function. GO annotation with a hypergeometric model supported this hypothesis. Unexpectedly, 210 unknown proteins and some proteins previously reported to occur in the cytosol were highly enriched in the ER. This study provides a reference map for the ER proteome of liver. Identification of new ER proteins will enhance our current understanding of the ER and also suggest new functions for this organelle.

  9. Young people, alcohol, and designer drinks: quantitative and qualitative study.

    PubMed Central

    Hughes, K.; MacKintosh, A. M.; Hastings, G.; Wheeler, C.; Watson, J.; Inglis, J.

    1997-01-01

    OBJECTIVE: To examine the appeal of "designer drinks" to young people. DESIGN: Qualitative and quantitative research comprising group discussions and questionnaire-led interviews with young people, accompanied by a self-completion questionnaire. SETTINGS: Argyll and Clyde Health Board area, west Scotland. SUBJECTS: Eight groups aged 12-17 years; 824 aged 12-17 recruited by multistage cluster probability sample from the community health index. RESULTS: Young people were familiar with designer drinks, especially MD 20/20 and leading brands of strong white cider. Attitudes towards these drinks varied quite distinctly with age, clearly reflecting their attitudes towards and motivations for drinking in general. The brand imagery of designer drinks, in contrast with that of more mainstream drinks, matched many 14 and 15 year olds' perceptions and expectations of drinking. Popularity of designer drinks peaked between the ages of 13 and 16, while more conventional drinks showed a consistent increase in popularity with age. Consumption of designer drinks tended to be in less controlled circumstances and was associated with heavier alcohol intake and greater drunkenness. CONCLUSIONS: Designer drinks are a cause for concern. They appeal to young people, often more so than conventional drinks, and are particularly attractive to 14-16 year olds. Consumption of designer drinks is also associated with drinking in less controlled environments, heavier drinking, and greater drunkenness. There is a need for policy debate to assess the desirability of these drinks and the extent to which further controls on their marketing are required. PMID:9040387

  10. Quantitative study designs used in quality improvement and assessment.

    PubMed

    Ormes, W S; Brim, M B; Coggan, P

    2001-01-01

    This article describes common quantitative design techniques that can be used to collect and analyze quality data. An understanding of the differences between these design techniques can help healthcare quality professionals make the most efficient use of their time, energies, and resources. To evaluate the advantages and disadvantages of these various study designs, it is necessary to assess factors that threaten the degree to which quality professionals may infer a cause-and-effect relationship from the data collected. Processes, the conduits of organizational function, often can be assessed by methods that do not take into account confounding and compromising circumstances that affect the outcomes of their analyses. An assumption that the implementation of process improvements may cause real change is incomplete without a consideration of other factors that might also have caused the same result. It is only through the identification, assessment, and exclusion of these alternative factors that administrators and healthcare quality professionals can assess the degree to which true process improvement or compliance has occurred. This article describes the advantages and disadvantages of common quantitative design techniques and reviews the corresponding threats to the interpretability of data obtained from their use. PMID:11378972

  11. The Design of Grids in Web Surveys

    PubMed Central

    Couper, Mick P.; Tourangeau, Roger; Conrad, Frederick G.; Zhang, Chan

    2014-01-01

    Grid or matrix questions are associated with a number of problems in Web surveys. In this paper, we present results from two experiments testing the design of grid questions to reduce breakoffs, missing data, and satisficing. The first examines dynamic elements to help guide respondents through the grid and tests splitting a larger grid into component pieces. The second manipulates the visual complexity of the grid and tests simplifying the grid. We find that using dynamic feedback to guide respondents through a multi-question grid helps reduce missing data. Splitting the grids into component questions further reduces missing data and motivated underreporting. The visual complexity of the grid appeared to have little effect on performance. PMID:25258472

  12. The Design of Grids in Web Surveys.

    PubMed

    Couper, Mick P; Tourangeau, Roger; Conrad, Frederick G; Zhang, Chan

    2013-06-01

    Grid or matrix questions are associated with a number of problems in Web surveys. In this paper, we present results from two experiments testing the design of grid questions to reduce breakoffs, missing data, and satisficing. The first examines dynamic elements to help guide respondents through the grid and tests splitting a larger grid into component pieces. The second manipulates the visual complexity of the grid and tests simplifying the grid. We find that using dynamic feedback to guide respondents through a multi-question grid helps reduce missing data. Splitting the grids into component questions further reduces missing data and motivated underreporting. The visual complexity of the grid appeared to have little effect on performance.

  13. Online Survey Design and Development: A Janus-Faced Approach

    ERIC Educational Resources Information Center

    Lauer, Claire; McLeod, Michael; Blythe, Stuart

    2013-01-01

    In this article we propose a "Janus-faced" approach to survey design--an approach that encourages researchers to consider how they can design and implement surveys more effectively using the latest web and database tools. Specifically, this approach encourages researchers to look two ways at once; attending to both the survey interface…

  14. Survey of rural, private wells. Statistical design

    USGS Publications Warehouse

    Mehnert, Edward; Schock, Susan C.; ,

    1991-01-01

    Half of Illinois' 38 million acres were planted in corn and soybeans in 1988. On the 19 million acres planted in corn and soybeans, approximately 1 million tons of nitrogen fertilizer and 50 million pounds of pesticides were applied. Because groundwater is the water supply for over 90 percent of rural Illinois, the occurrence of agricultural chemicals in groundwater in Illinois is of interest to the agricultural community, the public, and regulatory agencies. The occurrence of agricultural chemicals in groundwater is well documented. However, the extent of this contamination still needs to be defined. This can be done by randomly sampling wells across a geographic area. Key elements of a random, water-well sampling program for regional groundwater quality include the overall statistical design of the program, definition of the sample population, selection of wells to be sampled, and analysis of survey results. These elements must be consistent with the purpose for conducting the program; otherwise, the program will not provide the desired information. The need to carefully design and conduct a sampling program becomes readily apparent when one considers the high cost of collecting and analyzing a sample. For a random sampling program conducted in Illinois, the key elements, as well as the limitations imposed by available information, are described.

  15. Quantitative survey on the shape of the back of men's head as viewed from the side.

    PubMed

    Tamir, Abraham

    2013-05-01

    This article quantitatively classifies the back of men's heads, viewed from the side, into 4 shapes, which are demonstrated in some of the figures in this article. For self-evident reasons, the shapes were blurred. The survey is based on the analysis of 2220 shapes obtained by photographing mainly bald men and by finding pictures on the Internet. To the best of the author's knowledge, this quantitative approach has never been implemented before. The results obtained are as follows: the percentage of 376 "flat heads" is 17%; the percentage of 755 "little round heads," 34%; the percentage of 1017 "round heads," 45.8%; and the percentage of 72 "very round heads," 3.2%. This quantitative survey is an additional step in the quantitative analysis of the shapes of the parts of the face; in articles previously published or forthcoming in this journal, the shapes of the nose, ear conch, and human eye were analyzed quantitatively. In addition, the shapes of the leg toes were also analyzed. Finally, it should be noted that, for obvious reasons, the survey is based on men's heads, most of which are bald.
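
    The reported percentages can be checked against the stated counts (total n = 2220); a quick sketch:

```python
# Verify the reported shape percentages against the stated counts (n = 2220).
counts = {"flat": 376, "little round": 755, "round": 1017, "very round": 72}
total = sum(counts.values())  # 2220: counts are internally consistent

percentages = {shape: round(100 * n / total, 1) for shape, n in counts.items()}
print(percentages)  # {'flat': 16.9, 'little round': 34.0, 'round': 45.8, 'very round': 3.2}
```

    The article rounds 16.9% up to the 17% quoted for "flat heads"; the other figures match exactly.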

  16. Quantitative survey on the shape of the back of men's head as viewed from the side.

    PubMed

    Tamir, Abraham

    2013-05-01

    This article quantitatively classifies the back of men's heads, viewed from the side, into 4 shapes, which are demonstrated in some of the figures in this article. For self-evident reasons, the shapes were blurred. The survey is based on the analysis of 2220 shapes obtained by photographing mainly bald men and by finding pictures on the Internet. To the best of the author's knowledge, this quantitative approach has never been implemented before. The results obtained are as follows: the percentage of 376 "flat heads" is 17%; the percentage of 755 "little round heads," 34%; the percentage of 1017 "round heads," 45.8%; and the percentage of 72 "very round heads," 3.2%. This quantitative survey is an additional step in the quantitative analysis of the shapes of the parts of the face; in articles previously published or forthcoming in this journal, the shapes of the nose, ear conch, and human eye were analyzed quantitatively. In addition, the shapes of the leg toes were also analyzed. Finally, it should be noted that, for obvious reasons, the survey is based on men's heads, most of which are bald. PMID:23714907

  17. 76 FR 9637 - Proposed Information Collection (Veteran Suicide Prevention Online Quantitative Surveys) Activity...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-18

    ... AFFAIRS Proposed Information Collection (Veteran Suicide Prevention Online Quantitative Surveys) Activity... outreach efforts on the prevention of suicide among Veterans and their families. DATES: Written comments...). Type of Review: New collection. Abstract: VA's top priority is the prevention of Veterans suicide....

  18. Practical Guidelines for Evaluating Sampling Designs in Survey Studies.

    ERIC Educational Resources Information Center

    Fan, Xitao; Wang, Lin

    The popularity of sample surveys in evaluation and research makes it necessary for consumers to tell a good survey from a poor one. Several sources were identified that gave advice on how to evaluate a sample design used in a survey study. The sources are either too limited or too extensive to be useful practically. The purpose of this paper is to…

  19. Design and Validation of the Quantum Mechanics Conceptual Survey

    ERIC Educational Resources Information Center

    McKagan, S. B.; Perkins, K. K.; Wieman, C. E.

    2010-01-01

    The Quantum Mechanics Conceptual Survey (QMCS) is a 12-question survey of students' conceptual understanding of quantum mechanics. It is intended to be used to measure the relative effectiveness of different instructional methods in modern physics courses. In this paper, we describe the design and validation of the survey, a process that included…

  20. Quantitative estimation of minimum offset for multichannel surface-wave survey with actively exciting source

    USGS Publications Warehouse

    Xu, Y.; Xia, J.; Miller, R.D.

    2006-01-01

    Multichannel analysis of surface waves is a developing method widely used in shallow subsurface investigations. The field procedures and related parameters are very important for successful applications. Among these parameters, the source-receiver offset range is seldom discussed in theory and normally determined by empirical or semi-quantitative methods in current practice. This paper discusses the problem from a theoretical perspective. A formula for quantitatively evaluating a layered homogeneous elastic model was developed. The analytical results based on simple models and experimental data demonstrate that the formula is correct for surface wave surveys for near-surface applications. © 2005 Elsevier B.V. All rights reserved.

  1. Research on Basic Design Education: An International Survey

    ERIC Educational Resources Information Center

    Boucharenc, C. G.

    2006-01-01

    This paper reports on the results of a survey and qualitative analysis on the teaching of "Basic Design" in schools of design and architecture located in 22 countries. In the context of this research work, Basic Design means the teaching and learning of design fundamentals that may also be commonly referred to as the Principles of Two- and…

  2. Designing community surveys to provide a basis for noise policy

    NASA Technical Reports Server (NTRS)

    Fields, J. M.

    1980-01-01

    After examining reports from a large number of social surveys, two areas were identified where methodological improvements in the surveys would be especially useful for public policy. The two study areas are: the definition of noise indexes and the assessment of noise impact. Improvements in the designs of surveys are recommended which would increase the validity and reliability of the noise indexes. Changes in interview questions and sample designs are proposed which would enable surveys to provide measures of noise impact which are directly relevant for public policy.

  3. Designing occupancy studies: general advice and allocating survey effort

    USGS Publications Warehouse

    MacKenzie, D.I.; Royle, J. Andrew

    2005-01-01

    1. The fraction of sampling units in a landscape where a target species is present (occupancy) is an extensively used concept in ecology. Yet in many applications the species will not always be detected in a sampling unit even when present, resulting in biased estimates of occupancy. Given that sampling units are surveyed repeatedly within a relatively short timeframe, a number of similar methods have now been developed to provide unbiased occupancy estimates. However, practical guidance on the efficient design of occupancy studies has been lacking. 2. In this paper we comment on a number of general issues related to designing occupancy studies, including the need for clear objectives that are explicitly linked to science or management, selection of sampling units, timing of repeat surveys and allocation of survey effort. Advice on the number of repeat surveys per sampling unit is considered in terms of the variance of the occupancy estimator, for three possible study designs. 3. We recommend that sampling units should be surveyed a minimum of three times when detection probability is high (> 0.5 per survey), unless a removal design is used. 4. We found that an optimal removal design will generally be the most efficient, but we suggest it may be less robust to assumption violations than a standard design. 5. Our results suggest that for a rare species it is more efficient to survey more sampling units less intensively, while for a common species fewer sampling units should be surveyed more intensively. 6. Synthesis and applications. Reliable inferences can only result from quality data. To make the best use of logistical resources, study objectives must be clearly defined; sampling units must be selected, and repeated surveys timed appropriately; and a sufficient number of repeated surveys must be conducted. Failure to do so may compromise the integrity of the study. The guidance given here on study design issues is particularly applicable to studies of species
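
    A quantity underlying repeat-survey recommendations like the one above is the cumulative probability of detecting a species at least once in K visits to an occupied unit, 1 − (1 − p)^K. The sketch below illustrates that quantity only; the paper's own recommendation is derived from the variance of the occupancy estimator, not from this simple threshold:

```python
import math

# Cumulative probability of at least one detection in k repeat surveys
# of an occupied unit, given per-survey detection probability p.
def cumulative_detection(p, k):
    return 1 - (1 - p) ** k

# Smallest number of repeat surveys so the cumulative probability
# reaches a chosen target (target is an illustrative choice, not the paper's).
def min_surveys(p, target=0.95):
    return math.ceil(math.log(1 - target) / math.log(1 - p))

print(cumulative_detection(0.5, 3))  # 0.875: three visits at p = 0.5
print(min_surveys(0.3))              # 9: low detectability needs many more visits
```

    The steep growth in `min_surveys` as p falls mirrors the paper's point that design effort must be matched to detectability.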

  4. Variance estimation for systematic designs in spatial surveys.

    PubMed

    Fewster, R M

    2011-12-01

    In spatial surveys for estimating the density of objects in a survey region, systematic designs will generally yield lower variance than random designs. However, estimating the systematic variance is well known to be a difficult problem. Existing methods tend to overestimate the variance, so although the variance is genuinely reduced, it is over-reported, and the gain from the more efficient design is lost. The current approaches to estimating a systematic variance for spatial surveys are to approximate the systematic design by a random design, or approximate it by a stratified design. Previous work has shown that approximation by a random design can perform very poorly, while approximation by a stratified design is an improvement but can still be severely biased in some situations. We develop a new estimator based on modeling the encounter process over space. The new "striplet" estimator has negligible bias and excellent precision in a wide range of simulation scenarios, including strip-sampling, distance-sampling, and quadrat-sampling surveys, and including populations that are highly trended or have strong aggregation of objects. We apply the new estimator to survey data for the spotted hyena (Crocuta crocuta) in the Serengeti National Park, Tanzania, and find that the reported coefficient of variation for estimated density is 20% using approximation by a random design, 17% using approximation by a stratified design, and 11% using the new striplet estimator. This large reduction in reported variance is verified by simulation. PMID:21534940

  5. Variance estimation for systematic designs in spatial surveys.

    PubMed

    Fewster, R M

    2011-12-01

    In spatial surveys for estimating the density of objects in a survey region, systematic designs will generally yield lower variance than random designs. However, estimating the systematic variance is well known to be a difficult problem. Existing methods tend to overestimate the variance, so although the variance is genuinely reduced, it is over-reported, and the gain from the more efficient design is lost. The current approaches to estimating a systematic variance for spatial surveys are to approximate the systematic design by a random design, or approximate it by a stratified design. Previous work has shown that approximation by a random design can perform very poorly, while approximation by a stratified design is an improvement but can still be severely biased in some situations. We develop a new estimator based on modeling the encounter process over space. The new "striplet" estimator has negligible bias and excellent precision in a wide range of simulation scenarios, including strip-sampling, distance-sampling, and quadrat-sampling surveys, and including populations that are highly trended or have strong aggregation of objects. We apply the new estimator to survey data for the spotted hyena (Crocuta crocuta) in the Serengeti National Park, Tanzania, and find that the reported coefficient of variation for estimated density is 20% using approximation by a random design, 17% using approximation by a stratified design, and 11% using the new striplet estimator. This large reduction in reported variance is verified by simulation.
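
    The opening claim, that systematic designs yield lower variance than random designs when the population is spatially trended, is easy to demonstrate by simulation. This is a self-contained sketch of that general fact (population, sizes, and seed are arbitrary choices), not an implementation of the paper's striplet estimator:

```python
import random
import statistics

random.seed(1)
N, n = 1000, 20                    # population size and sample size (illustrative)
pop = [0.1 * i for i in range(N)]  # strongly trended population
k = N // n                         # systematic sampling interval

def systematic_mean():
    start = random.randrange(k)    # random start, then every k-th unit
    return sum(pop[start::k]) / n

def srs_mean():
    return sum(random.sample(pop, n)) / n

reps = 2000
var_sys = statistics.pvariance([systematic_mean() for _ in range(reps)])
var_srs = statistics.pvariance([srs_mean() for _ in range(reps)])
print(var_sys < var_srs)  # True: systematic sampling has far lower variance here
```

    The difficulty the paper addresses is the converse problem: a single systematic sample gives no direct replication from which to estimate that (smaller) variance.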

  6. Designing surveys for tests of gravity.

    PubMed

    Jain, Bhuvnesh

    2011-12-28

    Modified gravity theories may provide an alternative to dark energy to explain cosmic acceleration. We argue that the observational programme developed to test dark energy needs to be augmented to capture new tests of gravity on astrophysical scales. Several distinct signatures of gravity theories exist outside the 'linear' regime, especially owing to the screening mechanism that operates inside halos such as the Milky Way to ensure that gravity tests in the solar system are satisfied. This opens up several decades in length scale and classes of galaxies at low redshift that can be exploited by surveys. While theoretical work on models of gravity is in the early stages, we can already identify new regimes that cosmological surveys could target to test gravity. These include: (i) a small-scale component that focuses on the interior and vicinity of galaxy and cluster halos, (ii) spectroscopy of low-redshift galaxies, especially galaxies smaller than the Milky Way, in environments that range from voids to clusters, and (iii) a programme of combining lensing and dynamical information, from imaging and spectroscopic surveys, respectively, on the same (or statistically identical) sample of galaxies.

  7. Designing surveys for tests of gravity.

    PubMed

    Jain, Bhuvnesh

    2011-12-28

    Modified gravity theories may provide an alternative to dark energy to explain cosmic acceleration. We argue that the observational programme developed to test dark energy needs to be augmented to capture new tests of gravity on astrophysical scales. Several distinct signatures of gravity theories exist outside the 'linear' regime, especially owing to the screening mechanism that operates inside halos such as the Milky Way to ensure that gravity tests in the solar system are satisfied. This opens up several decades in length scale and classes of galaxies at low redshift that can be exploited by surveys. While theoretical work on models of gravity is in the early stages, we can already identify new regimes that cosmological surveys could target to test gravity. These include: (i) a small-scale component that focuses on the interior and vicinity of galaxy and cluster halos, (ii) spectroscopy of low-redshift galaxies, especially galaxies smaller than the Milky Way, in environments that range from voids to clusters, and (iii) a programme of combining lensing and dynamical information, from imaging and spectroscopic surveys, respectively, on the same (or statistically identical) sample of galaxies. PMID:22084295

  8. Survey of Fashion Design Employers. Volume IX, No. 16.

    ERIC Educational Resources Information Center

    Aurand, Cecilia; Lucas, John A.

    A survey was conducted to determine the availability of internship opportunities for fashion design students at Harper College and to measure the value of Harper design graduates to their employers. A sample of 279 manufacturers, contacts, and retail stores employing fashion designers were identified in the Chicago metropolitan area and after two…

  9. 7. Historic American Buildings Survey ORIGINAL DESIGN SUBMITTED BY PEABODY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. Historic American Buildings Survey ORIGINAL DESIGN SUBMITTED BY PEABODY AND STEARNS (FROM THE ORIGINAL IN THE LIBRARY OF THE VOLTA BUREAU) - Volta Bureau, 1537 Thirty-fifth Street Northwest, Washington, District of Columbia, DC

  10. The Dark Energy Survey instrument design

    SciTech Connect

    Flaugher, B.; /Fermilab

    2006-05-01

    We describe a new project, the Dark Energy Survey (DES), aimed at measuring the dark energy equation of state parameter, w, to a statistical precision of ≈5%, with four complementary techniques. The survey will use a new 3 sq. deg. mosaic camera (DECam) mounted at the prime focus of the Blanco 4m telescope at the Cerro Tololo Inter-American Observatory (CTIO). DECam includes a large mosaic camera, a five element optical corrector, four filters (g,r,i,z), and the associated infrastructure for operation in the prime focus cage. The focal plane consists of 62 2K x 4K CCD modules (0.27''/pixel) arranged in a hexagon inscribed within the 2.2 deg. diameter field of view. We plan to use the 250 micron thick fully-depleted CCDs that have been developed at the Lawrence Berkeley National Laboratory (LBNL). At Fermilab, we will establish a packaging factory to produce four-side buttable modules for the LBNL devices, as well as to test and grade the CCDs. R&D is underway and delivery of DECam to CTIO is scheduled for 2009.

  11. Sample design for the residential energy consumption survey

    SciTech Connect

    Not Available

    1994-08-01

    The purpose of this report is to provide detailed information about the multistage area-probability sample design used for the Residential Energy Consumption Survey (RECS). It is intended as a technical report, for use by statisticians, to better understand the theory and procedures followed in the creation of the RECS sample frame. For a more cursory overview of the RECS sample design, refer to the appendix entitled "How the Survey was Conducted," which is included in the statistical reports produced for each RECS survey year.

  12. Surveying clinicians by web: current issues in design and administration.

    PubMed

    Dykema, Jennifer; Jones, Nathan R; Piché, Tara; Stevenson, John

    2013-09-01

    The versatility, speed, and reduced costs with which web surveys can be conducted with clinicians are often offset by low response rates. Drawing on best practices and general recommendations in the literature, we provide an evidence-based overview of methods for conducting online surveys with providers. We highlight important advantages and disadvantages of conducting provider surveys online and include a review of differences in response rates between web and mail surveys of clinicians. When administered online, design-based features affect rates of survey participation and data quality. We examine features likely to have an impact including sample frames, incentives, contacts (type, timing, and content), mixed-mode approaches, and questionnaire length. We make several recommendations regarding optimal web-based designs, but more empirical research is needed, particularly with regard to identifying which combinations of incentive and contact approaches yield the highest response rates and are the most cost-effective. PMID:23975760

  13. Surveying clinicians by web: current issues in design and administration.

    PubMed

    Dykema, Jennifer; Jones, Nathan R; Piché, Tara; Stevenson, John

    2013-09-01

    The versatility, speed, and reduced costs with which web surveys can be conducted with clinicians are often offset by low response rates. Drawing on best practices and general recommendations in the literature, we provide an evidence-based overview of methods for conducting online surveys with providers. We highlight important advantages and disadvantages of conducting provider surveys online and include a review of differences in response rates between web and mail surveys of clinicians. When administered online, design-based features affect rates of survey participation and data quality. We examine features likely to have an impact including sample frames, incentives, contacts (type, timing, and content), mixed-mode approaches, and questionnaire length. We make several recommendations regarding optimal web-based designs, but more empirical research is needed, particularly with regard to identifying which combinations of incentive and contact approaches yield the highest response rates and are the most cost-effective.

  14. Survey of intraocular lens material and design.

    PubMed

    Doan, Kim T; Olson, Randall J; Mamalis, Nick

    2002-02-01

    Modern cataract surgery is constantly evolving and improving in terms of lens material and design. Researchers and physicians strive to obtain better refractive correction with a smaller wound size while minimizing the host cell response, limiting the proliferation of lens epithelial cells that leads to opacification of the lens capsule. Intraocular lens material varies in water content, refractive index, and tensile strength. Intraocular lens design has undergone revisions to prohibit lens epithelial cell migration and reflection of internal and external light. The evolution of intraocular lenses and extracapsular cataract surgery has led to faster postoperative recovery and better visual outcomes.

  15. Survey of quantitative data on the solar energy and its spectra distribution

    NASA Technical Reports Server (NTRS)

    Thekaekara, M. P.

    1976-01-01

    This paper presents a survey of available quantitative data on the total and spectral solar irradiance at ground level and outside the atmosphere. Measurements from research aircraft have resulted in the currently accepted NASA/ASTM standards of the solar constant and zero air mass solar spectral irradiance. The intrinsic variability of solar energy output and programs currently under way for more precise measurements from spacecraft are discussed. Instrumentation for solar measurements and their reference radiation scales are examined. Insolation data available from the records of weather stations are reviewed for their applicability to solar energy conversion. Two alternate methods of solarimetry are briefly discussed.

  16. Optical Design for a Survey X-Ray Telescope

    NASA Technical Reports Server (NTRS)

    Saha, Timo T.; Zhang, William W.; McClelland, Ryan S.

    2014-01-01

    Optical design trades are underway at the Goddard Space Flight Center to define a telescope for an x-ray survey mission. Top-level science objectives of the mission include the study of x-ray transients, surveying and long-term monitoring of compact objects in nearby galaxies, as well as both deep and wide-field x-ray surveys. In this paper we consider Wolter, Wolter-Schwarzschild, and modified Wolter-Schwarzschild telescope designs as basic building blocks for the tightly nested survey telescope. Design principles and dominating aberrations of individual telescopes and nested telescopes are discussed and we compare the off-axis optical performance at 1.0 keV and 4.0 keV across a 1.0-degree full field-of-view.

  17. Magnetic resonance elastography hardware design: a survey.

    PubMed

    Tse, Z T H; Janssen, H; Hamed, A; Ristic, M; Young, I; Lamperth, M

    2009-05-01

    Magnetic resonance elastography (MRE) is an emerging technique capable of measuring the shear modulus of tissue. A suspected tumour can be identified by comparing its properties with those of tissues surrounding it; this can be achieved even in deep-lying areas as long as mechanical excitation is possible. This would allow non-invasive methods for cancer-related diagnosis in areas not accessible with conventional palpation. An actuating mechanism is required to generate the necessary tissue displacements directly on the patient in the scanner and three different approaches, in terms of actuator action and position, exist to derive stiffness measurements. However, the magnetic resonance (MR) environment places considerable constraints on the design of such devices, such as the possibility of mutual interference between electrical components, the scanner field, and radio frequency pulses, and the physical space restrictions of the scanner bore. This paper presents a review of the current solutions that have been developed for MRE devices giving particular consideration to the design criteria including the required vibration frequency and amplitude in different applications, the issue of MR compatibility, actuation principles, design complexity, and scanner synchronization issues. The future challenges in this field are also described.

  18. Quantitative research.

    PubMed

    Watson, Roger

    2015-04-01

    This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.

  19. Quantitative and In-Depth Survey of the Isotopic Abundance Distribution Errors in Shotgun Proteomics.

    PubMed

    Chang, Cheng; Zhang, Jiyang; Xu, Changming; Zhao, Yan; Ma, Jie; Chen, Tao; He, Fuchu; Xie, Hongwei; Zhu, Yunping

    2016-07-01

    Accuracy is an important metric when mass spectrometry (MS) is used in large-scale quantitative proteomics research. For MS-based quantification by extracted ion chromatogram (XIC), both the mass and intensity dimensions must be accurate. Although much research has focused on mass accuracy in recent years, less attention has been paid to intensity errors. Here, we investigated signal intensity measurement errors systematically and quantitatively using the natural properties of isotopic distributions. First, we defined a normalized isotopic abundance error model and presented its merits and demerits. Second, a comprehensive survey of the isotopic abundance errors using data sets with increasing sample complexities and concentrations was performed. We examined parameters such as error distribution, relationships between signal intensities within one isotopic cluster, and correlations between different peak errors in isotopic profiles. Our data demonstrated that the high resolution MS platforms might also generate large isotopic intensity measurement errors (approximately 20%). Meanwhile, this error can be reduced to less than 5% using a novel correction algorithm, which is based on the theoretical isotopic abundance distribution. Finally, a nonlinear relationship was observed as the abundance error decreased in isotopic profiles with higher intensity. Our findings are expected to provide insight into isotopic abundance recalibration in quantitative proteomics.
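
    The theoretical isotopic abundance distribution that such comparisons rest on can be sketched with a carbon-only binomial approximation. This is an illustration of the general idea, not the paper's error model or correction algorithm: real peptide patterns also involve H, N, O, and S isotopes, and the function names here are hypothetical:

```python
from math import comb

P_13C = 0.0107  # natural abundance of carbon-13

def carbon_isotope_pattern(n_carbons, n_peaks=4):
    """Theoretical isotopic abundances under a carbon-only binomial model,
    normalized so the retained peaks sum to 1."""
    raw = [comb(n_carbons, k) * P_13C**k * (1 - P_13C)**(n_carbons - k)
           for k in range(n_peaks)]
    total = sum(raw)
    return [x / total for x in raw]

def normalized_abundance_errors(observed, theoretical):
    """Per-peak relative error of a normalized observed pattern vs. theory."""
    s = sum(observed)
    obs = [x / s for x in observed]
    return [(o - t) / t for o, t in zip(obs, theoretical)]

theo = carbon_isotope_pattern(50)
print(theo[0])  # monoisotopic fraction, roughly (1 - 0.0107)**50
```

    Normalizing both patterns before differencing is what makes the error measure insensitive to overall signal intensity, isolating the distortion of the isotopic shape itself.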

  20. A Quantitative Approach to the Design of School Bus Routes.

    ERIC Educational Resources Information Center

    Tracz, George S.

    A number of factors--including the reorganization of school administrative structures, the availability of new technology, increased competition among groups for limited resources, and changing patterns of communication--suggest an increased need for quantitative analysis in the school district decision-making process. One area of school…

  1. A SUCCESSFUL BROADBAND SURVEY FOR GIANT Lyα NEBULAE. I. SURVEY DESIGN AND CANDIDATE SELECTION

    SciTech Connect

    Prescott, Moire K. M.; Dey, Arjun; Jannuzi, Buell T.

    2012-04-01

    Giant Lyα nebulae (or Lyα 'blobs') are likely sites of ongoing massive galaxy formation, but the rarity of these powerful sources has made it difficult to form a coherent picture of their properties, ionization mechanisms, and space density. Systematic narrowband Lyα nebula surveys are ongoing, but the small redshift range covered and the observational expense limit the comoving volume that can be probed by even the largest of these surveys and pose a significant problem when searching for such rare sources. We have developed a systematic search technique designed to find large Lyα nebulae at 2 ≲ z ≲ 3 within deep broadband imaging and have carried out a survey of the 9.4 deg² NOAO Deep Wide-Field Survey Boötes field. With a total survey comoving volume of ≈10⁸ h₇₀⁻³ Mpc³, this is the largest volume survey for Lyα nebulae ever undertaken. In this first paper in the series, we present the details of the survey design and a systematically selected sample of 79 candidates, which includes one previously discovered Lyα nebula.
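
    The quoted comoving volume can be roughly reproduced from the survey area and redshift range alone. The sketch below assumes a flat ΛCDM cosmology with H₀ = 70 km s⁻¹ Mpc⁻¹ and Ωm = 0.3 (so h₇₀ = 1 and the result is directly in the paper's units); the cosmological parameters are assumptions, not taken from the paper:

```python
import math

H0, OM, OL = 70.0, 0.3, 0.7   # flat LambdaCDM parameters (assumed)
C = 299792.458                # speed of light, km/s
FULL_SKY_DEG2 = 41252.96      # total solid angle of the sky in square degrees

def E(z):
    return math.sqrt(OM * (1 + z) ** 3 + OL)

def comoving_distance(z, steps=5000):
    """Line-of-sight comoving distance in Mpc via midpoint-rule integration."""
    dz = z / steps
    return (C / H0) * sum(dz / E((i + 0.5) * dz) for i in range(steps))

def survey_volume(z1, z2, area_deg2):
    """Comoving volume of a redshift shell subtending area_deg2 on the sky."""
    shell = (4 * math.pi / 3) * (comoving_distance(z2) ** 3
                                 - comoving_distance(z1) ** 3)
    return shell * area_deg2 / FULL_SKY_DEG2

V = survey_volume(2.0, 3.0, 9.4)
print(f"{V:.2e} Mpc^3")  # on the order of 1e8, consistent with the quoted volume
```

    The calculation shows why broadband selection wins for rare sources: the 2 ≲ z ≲ 3 shell is roughly an order of magnitude deeper in redshift than a typical narrowband slice, with the volume scaling accordingly.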

  2. New journal selection for quantitative survey of infectious disease research: application for Asian trend analysis

    PubMed Central

    2009-01-01

    Background Quantitative survey of research articles, as an application of bibliometrics, is an effective tool for grasping overall trends in various medical research fields. This type of survey has also been applied to infectious disease research; however, previous studies were insufficient as they underestimated articles published in non-English or regional journals. Methods Using a combination of Scopus™ and PubMed, the databases of scientific literature, and English and non-English keywords directly linked to infectious disease control, we identified international and regional infectious disease journals. In order to ascertain whether the newly selected journals were appropriate for surveying a wide range of research articles, we compared the number of original articles and reviews registered in the selected journals to those in the 'Infectious Disease Category' of the Science Citation Index Expanded™ (SCI Infectious Disease Category) during 1998-2006. Subsequently, we applied the newly selected journals to survey the number of original articles and reviews originating from 11 Asian countries during the same period. Results One hundred journals, written in English or 7 non-English languages, were newly selected as infectious disease journals. The journals published 14,156 original articles and reviews of Asian origin and 118,158 throughout the world, more than those registered in the SCI Infectious Disease Category (4,621 of Asian origin and 66,518 of the world in the category). In Asian trend analysis of the 100 journals, Japan had the highest percentage of original articles and reviews in the area, and no noticeable increase in articles was revealed during the study period. China, India and Taiwan had relatively large numbers and a high increase rate of original articles among Asian countries. When adjusting the publication of original articles according to the country population and the gross domestic product (GDP), Singapore and Taiwan were the most

  3. Lack of quantitative training among early-career ecologists: a survey of the problem and potential solutions

    PubMed Central

    Ezard, Thomas H.G.; Jørgensen, Peter S.; Zimmerman, Naupaka; Chamberlain, Scott; Salguero-Gómez, Roberto; Curran, Timothy J.; Poisot, Timothée

    2014-01-01

    Proficiency in mathematics and statistics is essential to modern ecological science, yet few studies have assessed the level of quantitative training received by ecologists. To do so, we conducted an online survey. The 937 respondents were mostly early-career scientists who studied biology as undergraduates. We found a clear self-perceived lack of quantitative training: 75% were not satisfied with their understanding of mathematical models; 75% felt that the level of mathematics was “too low” in their ecology classes; 90% wanted more mathematics classes for ecologists; and 95% more statistics classes. Respondents thought that 30% of classes in ecology-related degrees should be focused on quantitative disciplines, which is likely higher than for most existing programs. The main suggestion to improve quantitative training was to relate theoretical and statistical modeling to applied ecological problems. Improving quantitative training will require dedicated, quantitative classes for ecology-related degrees that contain good mathematical and statistical practice. PMID:24688862

  4. Simulating future uncertainty to guide the selection of survey designs for long-term monitoring

    USGS Publications Warehouse

    Garman, Steven L.; Schweiger, E. William; Manier, Daniel J.; Gitzen, Robert A.; Millspaugh, Joshua J.; Cooper, Andrew B.; Licht, Daniel S.

    2012-01-01

    A goal of environmental monitoring is to provide sound information on the status and trends of natural resources (Messer et al. 1991, Theobald et al. 2007, Fancy et al. 2009). When monitoring observations are acquired by measuring a subset of the population of interest, probability sampling as part of a well-constructed survey design provides the most reliable and legally defensible approach to achieve this goal (Cochran 1977, Olsen et al. 1999, Schreuder et al. 2004; see Chapters 2, 5, 6, 7). Previous works have described the fundamentals of sample surveys (e.g. Hansen et al. 1953, Kish 1965). Interest in survey designs and monitoring over the past 15 years has led to extensive evaluations and new developments of sample selection methods (Stevens and Olsen 2004), of strategies for allocating sample units in space and time (Urquhart et al. 1993, Overton and Stehman 1996, Urquhart and Kincaid 1999), and of estimation (Lesser and Overton 1994, Overton and Stehman 1995) and variance properties (Larsen et al. 1995, Stevens and Olsen 2003) of survey designs. Carefully planned, “scientific” (Chapter 5) survey designs have become a standard in contemporary monitoring of natural resources. Based on our experience with the long-term monitoring program of the US National Park Service (NPS; Fancy et al. 2009; Chapters 16, 22), operational survey designs tend to be selected using the following procedures. For a monitoring indicator (i.e. variable or response), a minimum detectable trend requirement is specified, based on the minimum level of change that would result in meaningful change (e.g. degradation). A probability of detecting this trend (statistical power) and an acceptable level of uncertainty (Type I error; see Chapter 2) within a specified time frame (e.g. 10 years) are specified to ensure timely detection. Explicit statements of the minimum detectable trend, the time frame for detecting the minimum trend, power, and acceptable probability of Type I error (
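The design procedure described above (specify a minimum detectable trend, a time frame, statistical power, and a Type I error rate) can be explored by simulation. A minimal Monte Carlo sketch, assuming annual surveys, a linear trend, and independent Gaussian year-to-year variation (all numerical values are illustrative, not from the chapter):

```python
import numpy as np

rng = np.random.default_rng(42)

def ols_slope(y, t):
    """Ordinary least-squares slope of y against time t."""
    t = t - t.mean()
    return (t @ (y - y.mean())) / (t @ t)

def simulated_power(trend, sd, years=10, alpha=0.10, n_sim=2000):
    """Monte Carlo power to detect a linear trend from annual surveys.

    trend: true change per year; sd: residual (sampling + year-effect) variation.
    The one-sided critical value is taken from simulated no-trend data.
    """
    t = np.arange(years, dtype=float)
    null = np.array([ols_slope(rng.normal(0, sd, years), t) for _ in range(n_sim)])
    crit = np.quantile(null, 1 - alpha)
    alt = np.array([ols_slope(trend * t + rng.normal(0, sd, years), t)
                    for _ in range(n_sim)])
    return float(np.mean(alt > crit))

# e.g. power to detect a 2-units/yr trend with sd = 5 over 10 annual surveys
power = simulated_power(trend=2.0, sd=5.0)
```

Varying `years`, `sd` (which depends on sample allocation), and `alpha` reproduces the trade-offs a survey designer must weigh when selecting a design.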

  5. Multidisciplinary aerospace design optimization: Survey of recent developments

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw; Haftka, Raphael T.

    1995-01-01

    The increasing complexity of engineering systems has sparked increasing interest in multidisciplinary optimization (MDO). This paper presents a survey of recent publications in aerospace, a field where interest in MDO has been particularly intense. The two main challenges of MDO are computational expense and organizational complexity; accordingly, the survey focuses on the various ways researchers deal with these challenges. The survey is organized by a breakdown of MDO into its conceptual components, with sections on Mathematical Modeling, Design-oriented Analysis, Approximation Concepts, Optimization Procedures, System Sensitivity, and Human Interface. Because the authors' main expertise is in the structures area, the bulk of the references focus on the interaction of the structures discipline with other disciplines. In particular, two sections at the end focus on two such interactions that have recently been pursued with particular vigor: Simultaneous Optimization of Structures and Aerodynamics, and Simultaneous Optimization of Structures Combined With Active Control.

  6. Survey of quantitative antimicrobial consumption per production stage in farrow-to-finish pig farms in Spain

    PubMed Central

    Moreno, Miguel A.

    2014-01-01

    Objectives To characterise antimicrobial use (AMU) per production stage in terms of drugs, routes of application, indications, duration and exposed animals in farrow-to-finish pig farms in Spain. Design Survey using a questionnaire on AMU during the six months prior to the interview, administered in face-to-face interviews completed from April to October 2010. Participants 108 potentially eligible farms covering the whole country were selected using a multistage sampling methodology; of these, 33 were excluded because they did not fulfil the participation criteria and 49 were surveyed. Results The rank of the most used antimicrobials per farm and production stage and administration route started with polymyxins (colistin) by feed during the growing and the preweaning phases, followed by β-lactams by feed during the growing and the preweaning phases and by injection during the preweaning phase. Conclusions The study demonstrates that the growing stage (from weaning to the start of finishing) has the highest AMU according to different quantitative indicators (number of records, number of antimicrobials used, percentage of farms reporting use, relative number of exposed animals per farm and duration of exposure); feed is the administration route that produces the highest antimicrobial exposure based on the higher number of exposed animals and the longer duration of treatment; and there are large differences in AMU among individual pig farms. PMID:26392868

  7. Survey design and extent estimates for the National Lakes Assessment

    EPA Science Inventory

    The US Environmental Protection Agency (EPA) conducted a National Lake Assessment (NLA) in the conterminous USA in 2007 as part of a national assessment of aquatic resources using probability based survey designs. The USEPA Office of Water led the assessment, in cooperation with...

  8. Engaging Students in Survey Design and Data Collection

    ERIC Educational Resources Information Center

    Sole, Marla A.

    2015-01-01

    Every day, people use data to make decisions that affect their personal and professional lives, trusting that the data are correct. Many times, however, the data are inaccurate, as a result of a flaw in the design or methodology of the survey used to collect the data. Researchers agree that only questions that are clearly worded, unambiguous, free…

  9. Integrated Analysis and Tools for Land Subsidence Surveying and Monitoring: a Semi-Quantitative Approach

    NASA Astrophysics Data System (ADS)

    Mosconi, A.; Pozzoli, A.; Meroni, A.; Gagliano, S.

    2015-10-01

    This paper presents an integrated approach to land subsidence monitoring that combines measurements from different sensors. Eni S.p.A., the main Italian oil and gas company, constantly surveys its fields with state-of-the-art and innovative techniques, so a method able to integrate the results is an important and timely topic. The world today is a multi-sensor platform, and measurement integration is strictly necessary; combining the different data sources must be done carefully, exploiting the best performance of each technique. An integrated analysis allows the interpretation of simultaneous temporal series of data coming from different sources and attempts to separate the contributions to subsidence. With this purpose, Exelis VIS, in collaboration with Eni S.p.A., customized PISAV (Permanent Interferometric Scatterometer Analysis and Visualization), an ENVI extension able to combine all the different data collected in the surveys. This article presents some significant examples that show the potential of this tool in oil and gas activity: a hydrocarbon storage field, where the comparison between SAR measurements and production volumes reveals a correlation between the two in a few steps; and a hydrocarbon production field with the Satellite Survey Unit (S.S.U.), where SAR, CGPS, piezometers and assestimeters measure in the same area at the same time, giving the opportunity to analyse the data contextually. In the integrated analysis performed with PISAV, a mathematically rigorous study is not always possible, and a semi-quantitative approach is then the only method for interpreting the results. As a result, in the first test case a strong correlation between injected hydrocarbon volume and vertical displacement was highlighted; in the second, the integrated analysis offered several advantages in monitoring land subsidence: it permits a first qualitative separation of the natural and anthropic components of subsidence, and also gives more

  10. Survey Says? A Primer on Web-based Survey Design and Distribution

    PubMed Central

    Oppenheimer, Adam J.; Pannucci, Christopher J.; Kasten, Steven J.; Haase, Steven C.

    2011-01-01

    The internet has changed the way in which we gather and interpret information. While books were once the exclusive bearers of data, knowledge is now only a keystroke away. The internet has also facilitated the synthesis of new knowledge. Specifically, it has become a tool through which medical research is conducted. A review of the literature reveals that in the past year, over one hundred medical publications have been based on web-based survey data alone. Due to emerging internet technologies, web-based surveys can now be launched with little computer knowledge. They may also be self-administered, eliminating personnel requirements. Ultimately, an investigator may build, implement, and analyze survey results with speed and efficiency, obviating the need for mass mailings and data processing. All of these qualities have rendered telephone and mail-based surveys virtually obsolete. Despite these capabilities, web-based survey techniques are not without their limitations, namely recall and response biases. When used properly, however, web-based surveys can greatly simplify the research process. This article discusses the implications of web-based surveys and provides guidelines for their effective design and distribution. PMID:21701347

  11. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper, we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
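As a minimal illustration of the metamodel idea, the sketch below fits a polynomial response surface (one of the techniques reviewed) to a handful of samples of a stand-in "expensive" analysis code. The function, sample counts, and polynomial degree are invented for illustration:

```python
import numpy as np

def expensive_analysis(x):
    """Stand-in for a costly, deterministic computer analysis code."""
    return np.sin(3 * x) + 0.5 * x**2

# design of experiments: a small set of sample points on the design interval
x_train = np.linspace(-1.0, 1.0, 9)
y_train = expensive_analysis(x_train)

# response-surface metamodel: cubic polynomial fitted by least squares
coeffs = np.polyfit(x_train, y_train, deg=3)
metamodel = np.poly1d(coeffs)

# the cheap surrogate replaces the expensive code during optimization;
# its approximation error can be checked on a dense validation grid
x_test = np.linspace(-1.0, 1.0, 101)
max_err = np.max(np.abs(metamodel(x_test) - expensive_analysis(x_test)))
```

Kriging or a neural network would replace the `polyfit` step with an interpolating model; the sample-then-surrogate workflow is the same.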

  12. Large Synoptic Survey Telescope: From Science Drivers to Reference Design

    SciTech Connect

    Ivezic, Z.; Axelrod, T.; Brandt, W.N.; Burke, D.L.; Claver, C.F.; Connolly, A.; Cook, K.H.; Gee, P.; Gilmore, D.K.; Jacoby, S.H.; Jones, R.L.; Kahn, S.M.; Kantor, J.P.; Krabbendam, V.; Lupton, R.H.; Monet, D.G.; Pinto, P.A.; Saha, A.; Schalk, T.L.; Schneider, D.P.; Strauss, Michael A.; /Washington U., Seattle, Astron. Dept. /LSST Corp. /Penn State U., Astron. Astrophys. /KIPAC, Menlo Park /NOAO, Tucson /LLNL, Livermore /UC, Davis /Princeton U., Astrophys. Sci. Dept. /Naval Observ., Flagstaff /Arizona U., Astron. Dept. - Steward Observ. /UC, Santa Cruz /Harvard U. /Johns Hopkins U. /Illinois U., Urbana

    2011-10-14

    In the history of astronomy, major advances in our understanding of the Universe have come from dramatic improvements in our ability to accurately measure astronomical quantities. Aided by rapid progress in information technology, current sky surveys are changing the way we view and study the Universe. Next-generation surveys will maintain this revolutionary progress. We focus here on the most ambitious survey currently planned in the visible band, the Large Synoptic Survey Telescope (LSST). LSST will have unique survey capability in the faint time domain. The LSST design is driven by four main science themes: constraining dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. It will be a large, wide-field ground-based system designed to obtain multiple images covering the sky that is visible from Cerro Pachon in Northern Chile. The current baseline design, with an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg² field of view, and a 3,200 Megapixel camera, will allow about 10,000 square degrees of sky to be covered using pairs of 15-second exposures in two photometric bands every three nights on average. The system is designed to yield high image quality, as well as superb astrometric and photometric accuracy. The survey area will include 30,000 deg² with δ < +34.5°, and will be imaged multiple times in six bands, ugrizy, covering the wavelength range 320-1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will observe a 20,000 deg² region about 1000 times in the six bands during the anticipated 10 years of operation. These data will result in databases including 10 billion galaxies and a similar number of stars, and will serve the majority of science programs. The remaining 10% of the observing time will be allocated to special programs such as Very Deep and Very Fast time domain surveys. We describe how the
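The quoted cadence can be checked with back-of-envelope arithmetic. The per-visit overhead below is an assumed figure for slew and readout, not a number from the abstract:

```python
FOV = 9.6            # deg^2 field of view (from the abstract)
AREA = 10_000.0      # deg^2 covered every three nights (from the abstract)
EXPOSURE = 15.0      # s per exposure, taken in back-to-back pairs
OVERHEAD = 5.0       # s per visit for slew/readout (assumption)

fields = AREA / FOV                     # pointings per 3-night cycle (~1042)
visit_time = 2 * EXPOSURE + OVERHEAD    # one exposure pair + overhead, per band
bands_per_cycle = 2                     # each field visited in two bands

hours = fields * bands_per_cycle * visit_time / 3600
hours_per_night = hours / 3             # observing hours needed per night
```

The result, roughly 7 hours of open-shutter-plus-overhead time per night, is consistent with a survey that fills most of each observing night.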

  13. The ZInEP Epidemiology Survey: background, design and methods.

    PubMed

    Ajdacic-Gross, Vladeta; Müller, Mario; Rodgers, Stephanie; Warnke, Inge; Hengartner, Michael P; Landolt, Karin; Hagenmuller, Florence; Meier, Magali; Tse, Lee-Ting; Aleksandrowicz, Aleksandra; Passardi, Marco; Knöpfli, Daniel; Schönfelder, Herdis; Eisele, Jochen; Rüsch, Nicolas; Haker, Helene; Kawohl, Wolfram; Rössler, Wulf

    2014-12-01

    This article introduces the design, sampling, field procedures and instruments used in the ZInEP Epidemiology Survey. This survey is one of six ZInEP projects (Zürcher Impulsprogramm zur nachhaltigen Entwicklung der Psychiatrie, i.e. the "Zurich Program for Sustainable Development of Mental Health Services"). It parallels the longitudinal Zurich Study with a sample comparable in age and gender, and with similar methodology, including identical instruments. Thus, it is aimed at assessing changes in the prevalence rates of common mental disorders and the use of professional help and psychiatric services. Moreover, the current survey widens the spectrum of topics by including sociopsychiatric questionnaires on stigma, stress-related biological measures such as load and cortisol levels, electroencephalographic (EEG) and near-infrared spectroscopy (NIRS) examinations with various paradigms, and sociophysiological tests. The structure of the ZInEP Epidemiology Survey entails four subprojects: a short telephone screening using the SCL-27 (n of nearly 10,000), a comprehensive face-to-face interview based on the SPIKE (Structured Psychopathological Interview and Rating of the Social Consequences for Epidemiology: the main instrument of the Zurich Study) with a stratified sample (n = 1500), tests in the Center for Neurophysiology and Sociophysiology (n = 227), and a prospective study with up to three follow-up interviews and further measures (n = 157). In sum, the four subprojects of the ZInEP Epidemiology Survey deliver a large interdisciplinary database. PMID:24942564

  14. Exploring the utility of quantitative network design in evaluating Arctic sea ice thickness sampling strategies

    NASA Astrophysics Data System (ADS)

    Kaminski, T.; Kauker, F.; Eicken, H.; Karcher, M.

    2015-08-01

    We present a quantitative network design (QND) study of the Arctic sea ice-ocean system using a software tool that can evaluate hypothetical observational networks in a variational data assimilation system. For a demonstration, we evaluate two idealised flight transects derived from NASA's Operation IceBridge airborne ice surveys in terms of their potential to improve 10-day to 5-month sea ice forecasts. As target regions for the forecasts we select the Chukchi Sea, an area particularly relevant for maritime traffic and offshore resource exploration, as well as two areas related to the Barnett ice severity index (BSI), a standard measure of shipping conditions along the Alaskan coast that is routinely issued by ice services. Our analysis quantifies the benefits of sampling upstream of the target area and of reducing the sampling uncertainty. We demonstrate how observations of sea ice and snow thickness can constrain ice and snow variables in a target region and quantify the complementarity of combining two flight transects. We further quantify the benefit of improved atmospheric forecasts and a well-calibrated model.
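A QND system of this kind typically scores a hypothetical observing network by the posterior uncertainty it leaves in a target quantity. A minimal linear-Gaussian sketch of that scoring step (the covariances, observation operator, and error values below are invented; the actual study uses a variational assimilation system for a sea ice-ocean model):

```python
import numpy as np

def posterior_cov(B, H, R):
    """Posterior covariance of a linear-Gaussian assimilation system:
    A = (H^T R^-1 H + B^-1)^-1, where B is the prior (background)
    covariance, H the observation operator, and R the observation-error
    covariance."""
    return np.linalg.inv(H.T @ np.linalg.inv(R) @ H + np.linalg.inv(B))

# toy state: ice thickness in 3 regions; prior sd 0.5 m, spatially correlated
B = 0.25 * np.array([[1.0, 0.5, 0.2],
                     [0.5, 1.0, 0.5],
                     [0.2, 0.5, 1.0]])

# candidate "flight transect": observes regions 0 and 1 with 0.1 m error
H = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
R = 0.01 * np.eye(2)

A = posterior_cov(B, H, R)
# uncertainty reduction in the unobserved target region 2, via correlations
prior_sd, post_sd = np.sqrt(B[2, 2]), np.sqrt(A[2, 2])
```

Comparing `post_sd` across candidate networks quantifies, as in the study, the benefit of sampling upstream of (and correlated with) the target region.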

  15. A survey on methods of design features identification

    NASA Astrophysics Data System (ADS)

    Grabowik, C.; Kalinowski, K.; Paprocka, I.; Kempa, W.

    2015-11-01

    currently identified feature. In the IFR method system designer defines a set of features and sets a collection of recognition process parameters. It allows to unambiguously identifying individual features in automatic or semiautomatic way directly in CAD system or in an external application to which the part model might be transferred. Additionally a user is able to define non-geometrical information such as: overall dimensions, surface roughness etc. In this paper a survey on methods of features identification and recognition is presented especially in context of AFR methods.

  16. Mixing Qualitative and Quantitative Methods: Insights into Design and Analysis Issues

    ERIC Educational Resources Information Center

    Lieber, Eli

    2009-01-01

    This article describes and discusses issues related to research design and data analysis in the mixing of qualitative and quantitative methods. It is increasingly desirable to use multiple methods in research, but questions arise as to how best to design and analyze the data generated by mixed methods projects. I offer a conceptualization for such…

  17. Perceived future career prospects in general practice: quantitative results from questionnaire surveys of UK doctors

    PubMed Central

    Lambert, Trevor W; Smith, Fay; Goldacre, Michael J

    2016-01-01

    Background There are more studies of current job satisfaction among GPs than of their views about their future career prospects, although both are relevant to commitment to careers in general practice. Aim To report on the views of GPs compared with clinicians in other specialties about their future career prospects. Design and setting Questionnaire surveys were sent to UK medical doctors who graduated in selected years between 1974 and 2008. Method Questionnaires were sent to the doctors at different times after graduation, ranging from 3 to 24 years. Results Based on the latest survey of each graduation year of the 20 940 responders, 66.2% of GPs and 74.2% of hospital doctors were positive about their prospects and 9.7% and 8.3%, respectively, were negative. However, with increasing time since graduation and increasing levels of seniority, GPs became less positive about their prospects; by contrast, over time, surgeons became more positive. Three to 5 years after graduation, 86.3% of those training in general practice were positive about their prospects compared with 52.9% of surgical trainees: in surveys conducted 12–24 years after graduation, 60.2% of GPs and 76.6% of surgeons were positive about their prospects. Conclusion GPs held broadly positive views of their career prospects, as did other doctors. However, there was an increase in negativity with increasing time since graduation that was not seen in hospital doctors. Research into the causes of this negativity and policy measures to ameliorate it would contribute to the continued commitment of GPs and may help to reduce attrition. PMID:27578813

  18. SEDS: The Spitzer Extended Deep Survey. Survey Design, Photometry, and Deep IRAC Source Counts

    NASA Technical Reports Server (NTRS)

    Ashby, M. L. N.; Willner, S. P.; Fazio, G. G.; Huang, J.-S.; Arendt, A.; Barmby, P.; Barro, G.; Bell, E. F.; Bouwens, R.; Cattaneo, A.; Croton, D.; Dave, R.; Dunlop, J. S.; Egami, E.; Faber, S.; Finlator, K.; Grogin, N. A.; Guhathakurta, P.; Hernquist, L.; Hora, J. L.; Illingworth, G.; Kashlinsky, A.; Koekemoer, A. M.; Koo, D. C.; Moseley, H.

    2013-01-01

    The Spitzer Extended Deep Survey (SEDS) is a very deep infrared survey within five well-known extragalactic science fields: the UKIDSS Ultra-Deep Survey, the Extended Chandra Deep Field South, COSMOS, the Hubble Deep Field North, and the Extended Groth Strip. SEDS covers a total area of 1.46 deg² to a depth of 26 AB mag (3σ) in both of the warm Infrared Array Camera (IRAC) bands at 3.6 and 4.5 micron. Because of its uniform depth of coverage in so many widely separated fields, SEDS is subject to roughly 25% smaller errors due to cosmic variance than a single-field survey of the same size. SEDS was designed to detect and characterize galaxies from intermediate to high redshifts (z = 2-7) with a built-in means of assessing the impact of cosmic variance on the individual fields. Because the full SEDS depth was accumulated in at least three separate visits to each field, typically with six-month intervals between visits, SEDS also furnishes an opportunity to assess the infrared variability of faint objects. This paper describes the SEDS survey design, processing, and publicly available data products. Deep IRAC counts for the more than 300,000 galaxies detected by SEDS are consistent with models based on known galaxy populations. Discrete IRAC sources contribute 5.6 ± 1.0 and 4.4 ± 0.8 nW m⁻² sr⁻¹ at 3.6 and 4.5 micron to the diffuse cosmic infrared background (CIB). IRAC sources cannot contribute more than half of the total CIB flux estimated from DIRBE data. Barring an unexpected error in the DIRBE flux estimates, half the CIB flux must therefore come from a diffuse component.

  19. Design Considerations: Falcon M Dwarf Habitable Exoplanet Survey

    NASA Astrophysics Data System (ADS)

    Polsgrove, Daniel; Novotny, Steven; Della-Rose, Devin J.; Chun, Francis; Tippets, Roger; O'Shea, Patrick; Miller, Matthew

    2016-01-01

    The Falcon Telescope Network (FTN) is an assemblage of twelve automated 20-inch telescopes positioned around the globe, controlled from the Cadet Space Operations Center (CSOC) at the US Air Force Academy (USAFA) in Colorado Springs, Colorado. Five of the 12 sites are currently installed, with full operational capability expected by the end of 2016. Though optimized for studying near-earth objects to accomplish its primary mission of Space Situational Awareness (SSA), the Falcon telescopes are in many ways similar to those used by ongoing and planned exoplanet transit surveys targeting individual M dwarf stars (e.g., MEarth, APACHE, SPECULOOS). The network's worldwide geographic distribution provides additional potential advantages. We have performed analytical and empirical studies exploring the viability of employing the FTN for a future survey of nearby late-type M dwarfs tailored to detect transits of 1-2 R_Earth exoplanets in habitable-zone orbits. We present empirical results on photometric precision derived from data collected with multiple Falcon telescopes on a set of nearby (< 25 pc) M dwarfs using infrared filters and a range of exposure times, as well as sample light curves created from images gathered during known transits of varying depths. An investigation of survey design parameters is also described, including an analysis of site-specific weather data, anticipated telescope time allocation and the percentage of nearby M dwarfs with sufficient check stars within the Falcons' 11' x 11' field-of-view required to perform effective differential photometry. The results of this ongoing effort will inform the likelihood of discovering one (or more) habitable-zone exoplanets given current occurrence rate estimates over a nominal five-year campaign, and will dictate specific survey design features in preparation for initiating project execution when the FTN begins full-scale automated operations.
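The differential photometry against check stars mentioned above can be sketched in a few lines: systematics shared across the field (clouds, airmass) divide out when the target is ratioed to a check-star ensemble, leaving the transit signal. All numbers below are synthetic and illustrative:

```python
import numpy as np

def differential_lightcurve(target_flux, check_fluxes):
    """Relative photometry: divide the target by the summed check-star
    ensemble, then normalize by the median (dominated by out-of-transit
    points)."""
    ensemble = check_fluxes.sum(axis=0)
    ratio = target_flux / ensemble
    return ratio / np.median(ratio)

rng = np.random.default_rng(1)
n = 200
transparency = 1 + 0.05 * np.sin(np.linspace(0, 3, n))  # shared systematics
depth = np.ones(n)
depth[90:110] = 0.995                                   # a 0.5% transit, illustrative
target = 1e5 * transparency * depth * rng.normal(1, 0.001, n)
checks = np.array([a * 1e5 * transparency * rng.normal(1, 0.001, n)
                   for a in (0.8, 1.2, 1.5)])           # three check stars

lc = differential_lightcurve(target, checks)
in_transit = lc[90:110].mean()  # close to the 0.995 transit depth
```

The number of usable check stars in the 11' x 11' field, as the abstract notes, sets how well the ensemble averages down its own noise.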

  20. The 2-degree Field Lensing Survey: design and clustering measurements

    NASA Astrophysics Data System (ADS)

    Blake, Chris; Amon, Alexandra; Childress, Michael; Erben, Thomas; Glazebrook, Karl; Harnois-Deraps, Joachim; Heymans, Catherine; Hildebrandt, Hendrik; Hinton, Samuel R.; Janssens, Steven; Johnson, Andrew; Joudaki, Shahab; Klaes, Dominik; Kuijken, Konrad; Lidman, Chris; Marin, Felipe A.; Parkinson, David; Poole, Gregory B.; Wolf, Christian

    2016-11-01

    We present the 2-degree Field Lensing Survey (2dFLenS), a new galaxy redshift survey performed at the Anglo-Australian Telescope. 2dFLenS is the first wide-area spectroscopic survey specifically targeting the area mapped by deep-imaging gravitational lensing fields, in this case the Kilo-Degree Survey. 2dFLenS obtained 70 079 redshifts in the range z < 0.9 over an area of 731 deg2, and is designed to extend the data sets available for testing gravitational physics and promote the development of relevant algorithms for joint imaging and spectroscopic analysis. The redshift sample consists first of 40 531 Luminous Red Galaxies (LRGs), which enable analyses of galaxy-galaxy lensing, redshift-space distortion, and the overlapping source redshift distribution by cross-correlation. An additional 28 269 redshifts form a magnitude-limited (r < 19.5) nearly complete subsample, allowing direct source classification and photometric-redshift calibration. In this paper, we describe the motivation, target selection, spectroscopic observations, and clustering analysis of 2dFLenS. We use power spectrum multipole measurements to fit the redshift-space distortion parameter of the LRG sample in two redshift ranges 0.15 < z < 0.43 and 0.43 < z < 0.7 as β = 0.49 ± 0.15 and β = 0.26 ± 0.09, respectively. These values are consistent with those obtained from LRGs in the Baryon Oscillation Spectroscopic Survey. 2dFLenS data products will be released via our website http://2dflens.swin.edu.au.

  1. Quality-by-Design II: Application of Quantitative Risk Analysis to the Formulation of Ciprofloxacin Tablets.

    PubMed

    Claycamp, H Gregg; Kona, Ravikanth; Fahmy, Raafat; Hoag, Stephen W

    2016-04-01

    Qualitative risk assessment methods are often used as the first step to determining design space boundaries; however, quantitative assessments of risk with respect to the design space, i.e., calculating the probability of failure for a given severity, are needed to fully characterize design space boundaries. Quantitative risk assessment methods in design and operational spaces are a significant aid to evaluating proposed design space boundaries. The goal of this paper is to demonstrate a relatively simple strategy for design space definition using a simplified Bayesian Monte Carlo simulation. This paper builds on a previous paper that used failure mode and effects analysis (FMEA) qualitative risk assessment and Plackett-Burman design of experiments to identify the critical quality attributes. The results show that the sequential use of qualitative and quantitative risk assessments can focus the design of experiments on a reduced set of critical material and process parameters that determine a robust design space under conditions of limited laboratory experimentation. This approach provides a strategy by which the degree of risk associated with each known parameter can be calculated and allocates resources in a manner that manages risk to an acceptable level.
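A simplified version of the Monte Carlo risk calculation, estimating the probability of failure at points of a candidate design space, might look like the sketch below. The dissolution model, coefficients, and acceptance limit are hypothetical, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(7)

def prob_failure(hardness, lubricant, n=50_000):
    """Monte Carlo estimate of P(dissolution < 80% at 30 min) at one point
    of the design space (hypothetical linear process model, Gaussian noise)."""
    # hypothetical response model: dissolution vs tablet hardness & lubricant level
    mean = 92.0 - 3.0 * (hardness - 10.0) - 15.0 * (lubricant - 0.5)
    draws = rng.normal(mean, 3.0, n)   # 3.0 = residual/model uncertainty
    return float(np.mean(draws < 80.0))

# map failure risk over a coarse grid of the two process parameters
grid = [(h, l, prob_failure(h, l))
        for h in (8, 10, 12) for l in (0.25, 0.5, 0.75)]

# design space = region where risk stays below an acceptance limit, e.g. 1%
design_space = [(h, l) for h, l, p in grid if p < 0.01]
```

In the full Bayesian version, the model coefficients themselves carry posterior uncertainty and are drawn inside the Monte Carlo loop rather than fixed.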

  2. Spectroscopic survey telescope design. I - Primary mirror structure and support

    NASA Astrophysics Data System (ADS)

    Ray, F. B.; Krishnamachari, S. V.

    1988-09-01

    The present design for a spectroscopic survey telescope uses a spherical primary mirror whose figure requires that a secondary focus assembly be driven at the tracking rate in an attitude normal to the spherical focal surface, while the telescope, tilted at a predetermined angular zenith distance, need only be 'set' (and clamped) occasionally in azimuth. The spherical primary mirror segments are configured to an identical radius of curvature and supported on a fully triangulated stainless steel space frame; a structural analysis using finite elements indicates that the expected static performance of both the individual segments and the overall space frame presents reasonable goals for current engineering practice.

  3. Quantitative Survey and Structural Classification of Hydraulic Fracturing Chemicals Reported in Unconventional Gas Production.

    PubMed

    Elsner, Martin; Hoelzer, Kathrin

    2016-04-01

    Much interest is directed at the chemical structure of hydraulic fracturing (HF) additives in unconventional gas exploitation. To bridge the gap between existing alphabetical disclosures by function/CAS number and emerging scientific contributions on fate and toxicity, we review the structural properties which motivate HF applications, and which determine environmental fate and toxicity. Our quantitative overview relied on voluntary U.S. disclosures evaluated from the FracFocus registry by different sources and on a House of Representatives ("Waxman") list. Out of over 1000 reported substances, classification by chemistry yielded succinct subsets able to illustrate the rationale of their use, and physicochemical properties relevant for environmental fate, toxicity and chemical analysis. While many substances were nontoxic, frequent disclosures also included notorious groundwater contaminants like petroleum hydrocarbons (solvents), precursors of endocrine disruptors like nonylphenols (nonemulsifiers), toxic propargyl alcohol (corrosion inhibitor), tetramethylammonium (clay stabilizer), biocides or strong oxidants. Application of highly oxidizing chemicals, together with occasional disclosures of putative delayed acids and complexing agents (i.e., compounds designed to react in the subsurface) suggests that relevant transformation products may be formed. To adequately investigate such reactions, available information is not sufficient, but instead a full disclosure of HF additives is necessary. PMID:26902161

  4. Skin Microbiome Surveys Are Strongly Influenced by Experimental Design.

    PubMed

    Meisel, Jacquelyn S; Hannigan, Geoffrey D; Tyldsley, Amanda S; SanMiguel, Adam J; Hodkinson, Brendan P; Zheng, Qi; Grice, Elizabeth A

    2016-05-01

    Culture-independent studies to characterize skin microbiota are increasingly common, due in part to affordable and accessible sequencing and analysis platforms. Compared to culture-based techniques, DNA sequencing of the bacterial 16S ribosomal RNA (rRNA) gene or whole metagenome shotgun (WMS) sequencing provides more precise microbial community characterizations. Most widely used protocols were developed to characterize microbiota of other habitats (i.e., gastrointestinal) and have not been systematically compared for their utility in skin microbiome surveys. Here we establish a resource for the cutaneous research community to guide experimental design in characterizing skin microbiota. We compare two widely sequenced regions of the 16S rRNA gene to WMS sequencing for recapitulating skin microbiome community composition, diversity, and genetic functional enrichment. We show that WMS sequencing most accurately recapitulates microbial communities, but sequencing of hypervariable regions 1-3 of the 16S rRNA gene provides highly similar results. Sequencing of hypervariable region 4 poorly captures skin commensal microbiota, especially Propionibacterium. WMS sequencing, which is resource and cost intensive, provides evidence of a community's functional potential; however, metagenome predictions based on 16S rRNA sequence tags closely approximate WMS genetic functional profiles. This study highlights the importance of experimental design for downstream results in skin microbiome surveys. PMID:26829039

  5. The Large Synoptic Survey Telescope preliminary design overview

    NASA Astrophysics Data System (ADS)

    Krabbendam, V. L.; Sweeney, D.

    2010-07-01

    The Large Synoptic Survey Telescope (LSST) Project is a public-private partnership that is well into the design and development of the complete observatory system to conduct a wide, fast, deep survey and to process and serve the data. The telescope has a 3-mirror wide field optical system with an 8.4 meter primary, 3.4 meter secondary, and 5 meter tertiary mirror. The reflective optics feed three refractive elements and a 64 cm 3.2 gigapixel camera. The LSST data management system will reduce, transport, archive, and issue alerts from the roughly 15 terabytes of data produced nightly, and will serve the raw and catalog data accumulating at an average of 7 petabytes per year to the community without any proprietary period. The project has completed several data challenges designed to prototype and test the data management system to significant pre-construction levels. The project continues to attract institutional partners and has acquired non-federal funding sufficient to construct the primary mirror, already in progress at the University of Arizona, build the secondary mirror substrate, completed by Corning, and fund detector prototype efforts, several of which have been tested on the sky. A focus of the project is systems engineering, risk reduction through prototyping, and major efforts in image simulation and operations simulation. The project has submitted a proposal for construction to the National Science Foundation Major Research Equipment and Facilities Construction (MREFC) program and has prepared project advocacy papers for the National Research Council's Astronomy 2010 Decadal Survey. The project is preparing for a 2012 construction funding authorization.

  6. Quantitative Feedback Theory (QFT) applied to the design of a rotorcraft flight control system

    NASA Technical Reports Server (NTRS)

    Hess, R. A.; Gorder, P. J.

    1992-01-01

    Quantitative Feedback Theory describes a frequency-domain technique for the design of multi-input, multi-output control systems which meet time or frequency domain performance criteria when specified uncertainty exists in the linear description of the vehicle dynamics. Quantitative Feedback Theory is applied to the design of the longitudinal flight control system for a linear uncertain model of the AH-64 rotorcraft. In this model, the uncertainty is assigned, and is assumed to be attributable to actual uncertainty in the dynamic model and to the changes in the vehicle aerodynamic characteristics which occur near hover. The model includes an approximation to the rotor and actuator dynamics. The design example indicates the manner in which handling qualities criteria may be incorporated into the design of realistic rotorcraft control systems in which significant uncertainty exists in the vehicle model.

  7. The Unique Optical Design of the NESSI Survey Telescope

    NASA Astrophysics Data System (ADS)

    Ackermann, M.; McGraw, J.; Zimmer, P.; Williams, T.

    The NESSI Survey telescope will be the second incarnation of the CCD/Transit Instrument. It is being designed to accomplish precision astronomical measurements, thus requiring excellent image quality and virtually no distortion over an inscribed 1° x 1° scientific field of view. Project constraints such as re-use of an existing unperforated parabolic f/2.2 primary mirror, and the desire to re-use much of the existing CTI structure, have forced the design in one direction. Scientific constraints such as the 1.42° field, 60μm/arcsec plate scale, zero focus shift with wavelength, zero distortion and 80% encircled energy within 0.25arcsec spot diameters have further limited remaining design options. After exploring nearly every optical telescope configuration known to man, and several never before imagined, the NESSI Project Team has arrived at a unique optical design that produces a field and images meeting or exceeding all these constraints. The baseline configuration is that of a "bent Cassegrain," employing a convex hyperbolic secondary, a 45° folding flat and a four lens refractive field group. One unique feature of this design is that all four lenses lie outside the primary aperture, and thus introduce no obscuration. A second unique aspect of the design is that the largest lens is only slightly larger than the focal plane array. The field corrector lenses are not large by today's standards but still large enough to make the availability of glass a serious concern. A number of high performing designs were abandoned when it was learned the glass was either not available or would require a special production. With a little luck, a little insight and a lot of work, we followed the "rugged ways to the stars," and were able to arrive at a relatively simple Cassegrain design where only one corrector lens had an aspheric surface, a simple parabola, and all four lenses were made of BK7 glass. This design appears to be manufacturable and essentially meets all of the

  8. Discrimination, Personality, and Achievement: A Survey of Northern Blacks. Quantitative Studies in Social Relations Series.

    ERIC Educational Resources Information Center

    Crain, Robert L.; Weisman, Carol Sachs

    In the Spring of 1966, the Civil Rights Commission asked the National Opinion Research Center (NORC) to conduct a survey of Northern blacks to determine the effects, if any, of attending integrated versus segregated schools. The result was an extensive survey of 1651 black men and women, aged 21 to 45, living in the metropolitan areas of the…

  9. Quantitative Survey and Structural Classification of Fracking Chemicals Reported in Unconventional Gas Exploitation

    NASA Astrophysics Data System (ADS)

    Elsner, Martin; Schreglmann, Kathrin

    2015-04-01

    Few technologies are being discussed in such controversial terms as hydraulic fracturing ("fracking") in the recovery of unconventional gas. Particular concern regards the chemicals that may return to the surface as a result of hydraulic fracturing. These are either "fracking chemicals" - chemicals that are injected together with the fracking fluid to optimize the fracturing performance - or geogenic substances which may turn up during gas production, in the so-called produced water originating from the target formation. Knowledge about them is warranted for several reasons. (1) Monitoring. Air emissions are reported to arise from well drilling, the gas itself or condensate tanks. In addition, potential spills and accidents bear the danger of surface and shallow groundwater contaminations. Monitoring strategies are therefore warranted to screen for "indicator" substances of potential impacts. (2) Chemical Analysis. To meet these analytical demands, target substances must be defined so that adequate sampling approaches and analytical methods can be developed. (3) Transformation in the Subsurface. Identification and classification of fracking chemicals (aromatics vs. alcohols vs. acids, esters, etc.) is further important to assess the possibility of subsurface reactions which may potentially generate new, as yet unidentified transformation products. (4) Wastewater Treatment. For the same reason chemical knowledge is important for optimized wastewater treatment strategies. (5) Human and Ecosystem Health. Knowledge of the most frequent fracking chemicals is further essential for risk assessment (environmental behavior, toxicity). (6) Public Discussions. Finally, an overview of reported fracking chemicals can provide unbiased scientific input into current public debates and enable critical reviews of Green Chemistry approaches. Presently, however, such information is not readily available. We aim to close this knowledge gap by providing a quantitative overview of chemical

  10. Mechanistic and quantitative insight into cell surface targeted molecular imaging agent design.

    PubMed

    Zhang, Liang; Bhatnagar, Sumit; Deschenes, Emily; Thurber, Greg M

    2016-01-01

    Molecular imaging agent design involves simultaneously optimizing multiple probe properties. While several desired characteristics are straightforward, including high affinity and low non-specific background signal, in practice there are quantitative trade-offs between these properties. These include plasma clearance, where fast clearance lowers background signal but can reduce target uptake, and binding, where high affinity compounds sometimes suffer from lower stability or increased non-specific interactions. Further complicating probe development, many of the optimal parameters vary depending on both target tissue and imaging agent properties, making empirical approaches or previous experience difficult to translate. Here, we focus on low molecular weight compounds targeting extracellular receptors, which have some of the highest contrast values for imaging agents. We use a mechanistic approach to provide a quantitative framework for weighing trade-offs between molecules. Our results show that specific target uptake is well-described by quantitative simulations for a variety of targeting agents, whereas non-specific background signal is more difficult to predict. Two in vitro experimental methods for estimating background signal in vivo are compared - non-specific cellular uptake and plasma protein binding. Together, these data provide a quantitative method to guide probe design and focus animal work for more cost-effective and time-efficient development of molecular imaging agents. PMID:27147293
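The clearance-versus-uptake trade-off described above can be reproduced with a toy saturable-binding model; the one-compartment plasma model and all rate constants below are illustrative assumptions, not the authors' mechanistic simulation:

```python
import math

def target_uptake(k_clear, k_on=1.0, b_max=1.0, k_off=0.1, dt=0.01, t_end=24.0):
    """Euler integration of a minimal uptake model: plasma concentration
    decays exponentially while the agent binds a saturable surface target."""
    b = 0.0  # bound fraction of target sites
    for i in range(int(t_end / dt)):
        c = math.exp(-k_clear * i * dt)               # plasma level after clearance
        b += dt * (k_on * c * (b_max - b) - k_off * b)
    return b

# Faster plasma clearance lowers background signal, but it also leaves
# less agent in circulation to bind, reducing final target uptake.
slow = target_uptake(k_clear=0.1)
fast = target_uptake(k_clear=2.0)
```

Even this crude model exhibits the qualitative trade-off the abstract describes: the fast-clearing agent ends with lower target occupancy than the slow-clearing one.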

  15. Current State of Agile User-Centered Design: A Survey

    NASA Astrophysics Data System (ADS)

    Hussain, Zahid; Slany, Wolfgang; Holzinger, Andreas

    Agile software development methods are quite popular nowadays and are being adopted at an increasing rate in the industry every year. However, these methods are still lacking usability awareness in their development lifecycle, and the integration of usability/User-Centered Design (UCD) into agile methods is not adequately addressed. This paper presents the preliminary results of a recently conducted online survey regarding the current state of the integration of agile methods and usability/UCD. Worldwide, responses were received from 92 practitioners. The results show that the majority of practitioners perceive that the integration of agile methods with usability/UCD has added value to their adopted processes and to their teams; has resulted in the improvement of usability and quality of the product developed; and has increased the satisfaction of the end-users of the product developed. The most frequently used HCI techniques are low-fidelity prototyping, conceptual designs, observational studies of users, usability expert evaluations, field studies, personas, rapid iterative testing, and laboratory usability testing.

  16. Practical Tools for Designing and Weighting Survey Samples

    ERIC Educational Resources Information Center

    Valliant, Richard; Dever, Jill A.; Kreuter, Frauke

    2013-01-01

    Survey sampling is fundamentally an applied field. The goal in this book is to put an array of tools at the fingertips of practitioners by explaining approaches long used by survey statisticians, illustrating how existing software can be used to solve survey problems, and developing some specialized software where needed. This book serves at least…
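As a concrete illustration of the base-weighting step such tools automate (the sample below is hypothetical): design weights are the inverse inclusion probabilities, and the Horvitz-Thompson estimator sums weighted observations:

```python
def base_weights(inclusion_probs):
    """Design (base) weights: the inverse of each unit's inclusion probability."""
    return [1.0 / p for p in inclusion_probs]

def weighted_total(values, weights):
    """Horvitz-Thompson estimator of a population total."""
    return sum(v * w for v, w in zip(values, weights))

# Hypothetical stratified sample: 2 units drawn from a stratum of 10
# (p = 0.2) and 3 units drawn from a stratum of 30 (p = 0.1).
probs = [0.2, 0.2, 0.1, 0.1, 0.1]
y = [5.0, 7.0, 2.0, 4.0, 3.0]
w = base_weights(probs)
total_hat = weighted_total(y, w)  # each unit "represents" 1/p population units
```

Note that the weights sum to the population size (40), a useful sanity check on any weighting scheme before nonresponse or calibration adjustments are applied.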

  17. Textile materials for the design of wearable antennas: a survey.

    PubMed

    Salvado, Rita; Loss, Caroline; Gonçalves, Ricardo; Pinho, Pedro

    2012-11-15

    In the broad context of Wireless Body Sensor Networks for healthcare and pervasive applications, the design of wearable antennas offers the possibility of ubiquitous monitoring, communication and energy harvesting and storage. Specific requirements for wearable antennas are a planar structure and flexible construction materials. Several properties of the materials influence the behaviour of the antenna. For instance, the bandwidth and the efficiency of a planar microstrip antenna are mainly determined by the permittivity and the thickness of the substrate. The use of textiles in wearable antennas requires the characterization of their properties. Specific electrical conductive textiles are available on the market and have been successfully used. Ordinary textile fabrics have been used as substrates. However, little information can be found on the electromagnetic properties of regular textiles. Therefore this paper is mainly focused on the analysis of the dielectric properties of normal fabrics. In general, textiles present a very low dielectric constant that reduces the surface wave losses and increases the impedance bandwidth of the antenna. However, textile materials are constantly exchanging water molecules with the surroundings, which affects their electromagnetic properties. In addition, textile fabrics are porous, anisotropic and compressible materials whose thickness and density might change with low pressures. Therefore it is important to know how these characteristics influence the behaviour of the antenna in order to minimize unwanted effects. This paper presents a survey of the key points for the design and development of textile antennas, from the choice of the textile materials to the framing of the antenna. An analysis of the textile materials that have been used is also presented.
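The permittivity and thickness dependence noted above can be made concrete with the standard transmission-line design equations for a rectangular microstrip patch (textbook formulas, e.g. Balanis; the substrate values below are illustrative, not measurements from the survey):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def patch_dimensions(f_hz, eps_r, h_m):
    """Standard transmission-line design equations for a rectangular microstrip
    patch: returns (width, length) in metres for resonance at f_hz on a
    substrate with relative permittivity eps_r and thickness h_m."""
    w = C / (2 * f_hz) * math.sqrt(2 / (eps_r + 1))
    eps_eff = (eps_r + 1) / 2 + (eps_r - 1) / 2 / math.sqrt(1 + 12 * h_m / w)
    # Fringing-field length extension at each radiating edge.
    dl = 0.412 * h_m * ((eps_eff + 0.3) * (w / h_m + 0.264)) \
         / ((eps_eff - 0.258) * (w / h_m + 0.8))
    length = C / (2 * f_hz * math.sqrt(eps_eff)) - 2 * dl
    return w, length

# A low-permittivity textile substrate yields a physically larger patch
# than a conventional FR-4 board at the same frequency.
w_textile, l_textile = patch_dimensions(2.45e9, eps_r=1.6, h_m=1.0e-3)
w_fr4, l_fr4 = patch_dimensions(2.45e9, eps_r=4.4, h_m=1.0e-3)
```

The larger patch on textile is the flip side of the bandwidth benefit: low permittivity loosens field confinement, which widens bandwidth but enlarges the antenna.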

  19. A robust rotorcraft flight control system design methodology utilizing quantitative feedback theory

    NASA Technical Reports Server (NTRS)

    Gorder, Peter James

    1993-01-01

    Rotorcraft flight control systems present design challenges which often exceed those associated with fixed-wing aircraft. First, large variations in the response characteristics of the rotorcraft result from the wide range of airspeeds of typical operation (hover to over 100 kts). Second, the assumption of vehicle rigidity often employed in the design of fixed-wing flight control systems is rarely justified in rotorcraft where rotor degrees of freedom can have a significant impact on the system performance and stability. This research was intended to develop a methodology for the design of robust rotorcraft flight control systems. Quantitative Feedback Theory (QFT) was chosen as the basis for the investigation. Quantitative Feedback Theory is a technique which accounts for variability in the dynamic response of the controlled element in the design of robust control systems. It was developed to address a Multiple-Input Single-Output (MISO) design problem, and utilizes two degrees of freedom to satisfy the design criteria. Two techniques were examined for extending the QFT MISO technique to the design of a Multiple-Input-Multiple-Output (MIMO) flight control system (FCS) for a UH-60 Black Hawk Helicopter. In the first, a set of MISO systems, mathematically equivalent to the MIMO system, was determined. QFT was applied to each member of the set simultaneously. In the second, the same set of equivalent MISO systems were analyzed sequentially, with closed loop response information from each loop utilized in subsequent MISO designs. The results of each technique were compared, and the advantages of the second, termed Sequential Loop Closure, were clearly evident.
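A first step in any QFT design is generating plant templates: the region of the Nichols plane that the uncertain plant family can occupy at each frequency. A minimal sketch for a hypothetical uncertain plant P(s) = k/(s(τs+1)), not the rotorcraft model of the paper:

```python
import cmath
import math

def plant_response(omega, k, tau):
    """Frequency response of P(s) = k / (s * (tau*s + 1)) at s = j*omega."""
    s = 1j * omega
    return k / (s * (tau * s + 1))

def template(omega, k_range, tau_range, n=5):
    """QFT template: the (magnitude in dB, phase in radians) points the plant
    family can occupy at one frequency, gridded over the uncertain parameters."""
    pts = []
    for i in range(n):
        for j in range(n):
            k = k_range[0] + (k_range[1] - k_range[0]) * i / (n - 1)
            tau = tau_range[0] + (tau_range[1] - tau_range[0]) * j / (n - 1)
            p = plant_response(omega, k, tau)
            pts.append((20 * math.log10(abs(p)), cmath.phase(p)))
    return pts

tpl = template(omega=1.0, k_range=(1.0, 4.0), tau_range=(0.1, 0.5))
# Gain spread of the template at this frequency; larger templates demand
# more feedback to meet the same closed-loop tolerance.
spread_db = max(m for m, _ in tpl) - min(m for m, _ in tpl)
```

In a full QFT design these templates are translated into bounds on the nominal loop transmission, which the designer then shapes loop by loop.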

  20. The Development of the Progressive in 19th Century English: A Quantitative Survey.

    ERIC Educational Resources Information Center

    Arnaud, Rene

    1998-01-01

    Expansion of the progressive (be+ing periphrastic form, where "be" is at the same time the copula and a statement of existence) was a major feature of modernization of the English verb system in the 19th century. A survey (1787-1880) of a collection of private letters, most from famous writers, reveals that linguistic factors played a small role…

  1. Trajectory Design for the Transiting Exoplanet Survey Satellite (TESS)

    NASA Technical Reports Server (NTRS)

    Dichmann, Donald J.; Parker, Joel; Williams, Trevor; Mendelsohn, Chad

    2014-01-01

    The Transiting Exoplanet Survey Satellite (TESS) is a National Aeronautics and Space Administration (NASA) mission launching in 2017. TESS will travel in a highly eccentric orbit around Earth, with initial perigee radius near 17 Earth radii (Re) and apogee radius near 59 Re. The orbit period is near 2:1 resonance with the Moon, with apogee nearly 90 degrees out-of-phase with the Moon, in a configuration that has been shown to be operationally stable. TESS will execute phasing loops followed by a lunar flyby, with a final maneuver to achieve 2:1 resonance with the Moon. The goals of a resonant orbit with long-term stability, short eclipses and limited oscillations of perigee present significant challenges to the trajectory design. To rapidly assess launch opportunities, we adapted the Schematics Window Methodology (SWM76) launch window tool to assess the TESS mission constraints. To understand the long-term dynamics of such a resonant orbit in the Earth-Moon system we employed Dynamical Systems Theory in the Circular Restricted 3-Body Problem (CR3BP). For precise trajectory analysis we use a high-fidelity model and multiple shooting in the General Mission Analysis Tool (GMAT) to optimize the maneuver delta-V and meet mission constraints. Finally we describe how the techniques we have developed can be applied to missions with similar requirements.
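The quoted geometry can be checked against Kepler's third law: a 17 Re x 59 Re orbit has a period close to half the Moon's sidereal month, consistent with the stated 2:1 lunar resonance (the constants below are standard values, and the two-body check ignores lunar and solar perturbations):

```python
import math

MU_EARTH = 398_600.4418              # km^3/s^2, Earth gravitational parameter
R_EARTH = 6_378.137                  # km, Earth equatorial radius
MOON_SIDEREAL_PERIOD_D = 27.321661   # days

def orbit_period_days(a_km):
    """Keplerian period of a two-body Earth orbit with semi-major axis a."""
    return 2 * math.pi * math.sqrt(a_km**3 / MU_EARTH) / 86_400

# Semi-major axis implied by the quoted 17 Re perigee and 59 Re apogee.
a = (17 + 59) / 2 * R_EARTH
period = orbit_period_days(a)           # roughly 13.7 days
resonant = MOON_SIDEREAL_PERIOD_D / 2   # 2:1 resonance target, ~13.66 days
```

The two values agree to within about 0.1 day; the residual is absorbed by the final resonance-targeting maneuver and by the perturbations a high-fidelity model captures.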

  3. National Aquatic Resource Surveys: Integration of Geospatial Data in Their Survey Design and Analysis

    EPA Science Inventory

    The National Aquatic Resource Surveys (NARS) are a series of four statistical surveys conducted by the U.S. Environmental Protection Agency working in collaboration with states, tribal nations and other federal agencies. The surveys are conducted for lakes and reservoirs, streams...

  4. Influenza knowledge, attitude, and behavior survey for grade school students: design and novel assessment methodology.

    PubMed

    Koep, Tyler H; Huskins, W Charles; Clemens, Christal; Jenkins, Sarah; Pierret, Chris; Ekker, Stephen C; Enders, Felicity T

    2014-12-01

Despite the fact that infectious diseases can spread readily in grade schools, few studies have explored prevention in this setting. Additionally, we lack valid tools for students to self-report knowledge, attitudes, and behaviors. As part of an ongoing study of a curriculum intervention to promote healthy behaviors, we developed and evaluated age-appropriate surveys to determine students' understanding of influenza prevention. Surveys were adapted from adolescent and adult influenza surveys and administered to students in grades 2-5 (ages 7-11) at two Rochester public schools. We assessed student understanding by analyzing percent repeatability of 20 survey questions and compared percent "don't know" (DK) responses across grades, gender, and race. Questions thought to be ambiguous after early survey administration were investigated in student focus groups, modified as appropriate, and reassessed. The response rate across all surveys was >87%. Survey questions were well understood; 16 of 20 questions demonstrated strong pre/post repeatability (>70%). Only 1 question showed an increase in DK response for higher grades (p < .0001). Statistical analysis and qualitative feedback led to modification of 3 survey questions and improved measures of understanding in the final survey administration. Grade-school students' knowledge, attitudes and behavior toward influenza prevention can be assessed using surveys. Quantitative and qualitative analysis may be used to assess participant understanding and refine survey development for pediatric survey instruments. These methods may be used to assess the repeatability and validity of surveys to assess the impact of health education interventions in young children.
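The "percent repeatability" measure used above — the share of respondents who give the same answer on repeated administrations of a question — can be sketched as simple pre/post agreement. This is an illustrative reconstruction, not the study's actual analysis code; the function name and toy responses are invented for the example:

```python
def percent_repeatability(pre, post):
    """Percent of respondents giving the same answer on two
    administrations of the same survey question (pre/post agreement)."""
    if len(pre) != len(post) or not pre:
        raise ValueError("pre and post must be equal-length, non-empty")
    agree = sum(a == b for a, b in zip(pre, post))
    return 100.0 * agree / len(pre)

# toy responses from five students; "dk" = "don't know"
pre  = ["yes", "no", "dk", "yes", "yes"]
post = ["yes", "no", "no", "yes", "yes"]
print(percent_repeatability(pre, post))  # 80.0
```

Under the study's criterion, a question with agreement above 70% would count as well understood.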

  5. Perspectives of Speech-Language Pathologists on the Use of Telepractice in Schools: Quantitative Survey Results

    PubMed Central

    Tucker, Janice K.

    2012-01-01

This research surveyed 170 school-based speech-language pathologists (SLPs) in one northeastern state, with only 1.8% reporting telepractice use in school settings. These results were consistent with two ASHA surveys (2002; 2011) that reported limited use of telepractice for school-based speech-language pathology. In the present study, willingness to use telepractice was inversely related to age, perhaps because younger members of the profession are more accustomed to using technology. Overall, respondents were concerned about the validity of assessments administered via telepractice; whether clinicians can adequately establish rapport with clients via telepractice; and whether therapy conducted via telepractice can be as effective as in-person speech-language therapy. Most respondents indicated the need to establish procedures and guidelines for school-based telepractice programs. PMID:25945204

  6. Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method, 4th Edition

    ERIC Educational Resources Information Center

    Dillman, Don A.; Smyth, Jolene D.; Christian, Leah Melani

    2014-01-01

    For over two decades, Dillman's classic text on survey design has aided both students and professionals in effectively planning and conducting mail, telephone, and, more recently, Internet surveys. The new edition is thoroughly updated and revised, and covers all aspects of survey research. It features expanded coverage of mobile phones, tablets,…

  7. Quantitative Hydrogeological Framework Interpretations from Modeling Helicopter Electromagnetic Survey Data, Nebraska Panhandle

    NASA Astrophysics Data System (ADS)

    Abraham, J. D.; Ball, L. B.; Bedrosian, P. A.; Cannia, J. C.; Deszcz-Pan, M.; Minsley, B. J.; Peterson, S. M.; Smith, B. D.

    2009-12-01

The need for allocation and management of water resources within the state of Nebraska has created a demand for innovative approaches to data collection for development of hydrogeologic frameworks to be used for 2D and 3D groundwater models. In 2008, the USGS, in cooperation with the North Platte Natural Resources District, the South Platte Natural Resources District, and the University of Nebraska Conservation and Survey Division, began using frequency-domain helicopter electromagnetic (HEM) surveys to map selected sections of the Nebraska Panhandle. The surveys took place in selected sections of the North Platte River valley, Lodgepole Creek, and portions of the adjacent tablelands. The objective of the surveys is to map the aquifers of the area to improve understanding of the groundwater-surface water relationships and develop better hydrogeologic frameworks used in making more accurate 3D groundwater models of the area. For the HEM method to have an impact in a groundwater model at the basin scale, hydrostratigraphic units need to have detectable physical property (electrical resistivity) contrasts. When these contrasts exist within the study area and they are detectable from an airborne platform, large areas can be surveyed to rapidly generate 2D and 3D maps and models of 3D hydrogeologic features. To make the geophysical data useful to multidimensional groundwater models, numerical inversion is necessary to produce a depth-dependent physical property data set reflecting hydrogeologic features. These maps and depth images of electrical resistivity are not, by themselves, directly useful to the hydrogeologist; they need to be turned into maps and depth images of the hydrostratigraphic units and hydrogeologic features. Through a process of numerical imaging, inversion, sensitivity analysis, geological ground truthing (boreholes), and geological interpretation, hydrogeologic features are characterized. Resistivity depth sections produced from this process are used to pick

  8. Rotorcraft control system design for uncertain vehicle dynamics using quantitative feedback theory

    NASA Technical Reports Server (NTRS)

    Hess, R. A.

    1994-01-01

    Quantitative Feedback Theory describes a frequency-domain technique for the design of multi-input, multi-output control systems which must meet time or frequency domain performance criteria when specified uncertainty exists in the linear description of the vehicle dynamics. This theory is applied to the design of the longitudinal flight control system for a linear model of the BO-105C rotorcraft. Uncertainty in the vehicle model is due to the variation in the vehicle dynamics over a range of airspeeds from 0-100 kts. For purposes of exposition, the vehicle description contains no rotor or actuator dynamics. The design example indicates the manner in which significant uncertainty exists in the vehicle model. The advantage of using a sequential loop closure technique to reduce the cost of feedback is demonstrated by example.

  9. Inversion-free decentralised quantitative feedback design of large-scale systems

    NASA Astrophysics Data System (ADS)

    Labibi, B.; Mahdi Alavi, S. M.

    2016-06-01

    In this paper, a new method for robust decentralised control of multi-input multi-output (MIMO) systems using quantitative feedback theory (QFT) is suggested. The proposed method does not need inversion of the plant transfer function matrix in the design process. For a given system, an equivalent descriptor system representation is defined. By using this representation, sufficient conditions for closed-loop diagonal dominance over the uncertainty space are obtained. These conditions transform the original MIMO system into a set of isolated multi-input single-output (MISO) subsystems. Then, the local controllers are designed by using the typical MISO QFT technique for each isolated subsystem to satisfy the predefined desired specifications and the closed-loop diagonal dominance sufficient conditions. The proposed technique is less conservative in comparison to the approaches using the over-bounding concept in the design procedure. The effectiveness of the proposed technique is finally assessed on a MIMO Scara robot.

  10. Rotorcraft flight control design using quantitative feedback theory and dynamic crossfeeds

    NASA Technical Reports Server (NTRS)

    Cheng, Rendy P.

    1995-01-01

    A multi-input, multi-output controls design with robust crossfeeds is presented for a rotorcraft in near-hovering flight using quantitative feedback theory (QFT). Decoupling criteria are developed for dynamic crossfeed design and implementation. Frequency dependent performance metrics focusing on piloted flight are developed and tested on 23 flight configurations. The metrics show that the resulting design is superior to alternative control system designs using conventional fixed-gain crossfeeds and to feedback-only designs which rely on high gains to suppress undesired off-axis responses. The use of dynamic, robust crossfeeds prior to the QFT design reduces the magnitude of required feedback gain and results in performance that meets current handling qualities specifications relative to the decoupling of off-axis responses. The combined effect of the QFT feedback design following the implementation of low-order, dynamic crossfeed compensator successfully decouples ten of twelve off-axis channels. For the other two channels it was not possible to find a single, low-order crossfeed that was effective.

  11. SDSS-IV MaNGA: Survey Design and Progress

    NASA Astrophysics Data System (ADS)

    Yan, Renbin; MaNGA Team

    2016-01-01

The ongoing SDSS-IV/MaNGA Survey will obtain integral field spectroscopy at a resolution of R~2000 with a wavelength coverage from 3,600A to 10,300A for 10,000 nearby galaxies. Within each 3 degree diameter pointing of the 2.5m Sloan Telescope, we deploy 17 hexagonal fiber bundles with sizes ranging from 12 to 32 arcsec in diameter. The bundles are built with 2 arcsec fibers and have a 56% fill factor. During observations, we obtain sets of exposures at 3 different dither positions to achieve near-critical sampling of the effective point spread function, which has a FWHM of about 2.5 arcsec, corresponding to 1-2 kpc for the majority of the galaxies targeted. The flux calibration is done using 12 additional mini-fiber-bundles targeting standard stars simultaneously with science targets, achieving a calibration accuracy better than 5% over 90% of the wavelength range. The target galaxies are selected to ensure uniform spatial coverage in units of effective radii for the majority of the galaxies while maximizing spatial resolution. About 2/3 of the sample is covered out to 1.5Re (primary sample) and 1/3 of the sample is covered to 2.5Re (secondary sample). The sample is designed to have approximately equal representation from high and low mass galaxies while maintaining volume-limited selection at fixed absolute magnitudes. We obtain an average S/N of 4 per Angstrom in r-band continuum at a surface brightness of 23 AB arcsec-2. With spectral stacking in an elliptical annulus covering 1-1.5Re, our primary sample galaxies have a median S/N of ~60 per Angstrom in r-band.

  12. SAS procedures for designing and analyzing sample surveys

    USGS Publications Warehouse

    Stafford, Joshua D.; Reinecke, Kenneth J.; Kaminski, Richard M.

    2003-01-01

Complex surveys often are necessary to estimate occurrence (or distribution), density, and abundance of plants and animals for purposes of research and conservation. Most scientists are familiar with simple random sampling, where sample units are selected from a population of interest (sampling frame) with equal probability. However, the goal of ecological surveys often is to make inferences about populations over large or complex spatial areas where organisms are not homogeneously distributed or sampling frames are inconvenient or impossible to construct. Candidate sampling strategies for such complex surveys include stratified, multistage, and adaptive sampling (Thompson 1992, Buckland 1994).
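The stratified design named above can be sketched in a few lines: partition the frame by a stratum variable, then sample within each stratum at a fixed fraction. This is a minimal pure-Python illustration with an invented toy frame; a production survey would use a dedicated tool such as SAS's SURVEYSELECT procedure:

```python
import random

def stratified_sample(frame, strata_key, fraction, seed=0):
    """Proportional stratified random sample: within each stratum,
    select units with equal probability at the given sampling fraction."""
    rng = random.Random(seed)
    strata = {}
    for unit in frame:
        strata.setdefault(strata_key(unit), []).append(unit)
    sample = []
    for units in strata.values():
        n = max(1, round(fraction * len(units)))  # at least one unit per stratum
        sample.extend(rng.sample(units, n))
    return sample

# hypothetical frame: (site_id, habitat) pairs, stratified by habitat
frame = [(i, "wetland" if i % 3 == 0 else "upland") for i in range(30)]
chosen = stratified_sample(frame, strata_key=lambda u: u[1], fraction=0.2)
print(len(chosen))  # 6 (2 of 10 wetland sites + 4 of 20 upland sites)
```

Stratification guarantees every habitat type appears in the sample, which simple random sampling of the whole frame does not.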

  13. Survey design research: a tool for answering nursing research questions.

    PubMed

    Siedlecki, Sandra L; Butler, Robert S; Burchill, Christian N

    2015-01-01

    The clinical nurse specialist is in a unique position to identify and study clinical problems in need of answers, but lack of time and resources may discourage nurses from conducting research. However, some research methods can be used by the clinical nurse specialist that are not time-intensive or cost prohibitive. The purpose of this article is to explain the utility of survey methodology for answering a number of nursing research questions. The article covers survey content, reliability and validity issues, sample size considerations, and methods of survey delivery.

  14. Survey of Quantitative Research Metrics to Assess Pilot Performance in Upset Recovery

    NASA Technical Reports Server (NTRS)

    Le Vie, Lisa R.

    2016-01-01

Accidents attributable to in-flight loss of control are the primary cause of fatal commercial jet accidents worldwide. The National Aeronautics and Space Administration (NASA) conducted a literature review to determine and identify the quantitative standards for assessing upset recovery performance. This review contains current recovery procedures for both military and commercial aviation and includes the metrics researchers use to assess aircraft recovery performance. Metrics include time to first input, recognition time, and recovery time, and whether the input was correct or incorrect. Other metrics included are: the state of the autopilot and autothrottle, control wheel/sidestick movement resulting in pitch and roll, and inputs to the throttle and rudder. In addition, airplane state measures, such as roll reversals, altitude loss/gain, maximum vertical speed, maximum/minimum air speed, maximum bank angle and maximum g loading are reviewed as well.

  15. The Importance of Adhering to Details of the Total Design Method (TDM) for Mail Surveys.

    ERIC Educational Resources Information Center

    Dillman, Don A.; And Others

    1984-01-01

The empirical effects of adhering to the details of the Total Design Method (TDM) for mail surveys are discussed, based on the implementation of a common survey in 11 different states. The results suggest that greater adherence results in higher response, especially in the later stages of the TDM. (BW)

  16. A Survey of Former Drafting & Engineering Design Technology Students. Summary Findings of Respondents District-Wide.

    ERIC Educational Resources Information Center

    Glyer-Culver, Betty

    In fall 2001 staff of the Los Rios Community College District Office of Institutional Research collaborated with occupational deans, academic deans, and faculty to develop and administer a survey of former Drafting and Engineering Design Technology students. The survey was designed to determine how well courses had met the needs of former drafting…

  17. The Health Effects of Climate Change: A Survey of Recent Quantitative Research

    PubMed Central

    Grasso, Margherita; Manera, Matteo; Chiabai, Aline; Markandya, Anil

    2012-01-01

    In recent years there has been a large scientific and public debate on climate change and its direct as well as indirect effects on human health. In particular, a large amount of research on the effects of climate changes on human health has addressed two fundamental questions. First, can historical data be of some help in revealing how short-run or long-run climate variations affect the occurrence of infectious diseases? Second, is it possible to build more accurate quantitative models which are capable of predicting the future effects of different climate conditions on the transmissibility of particularly dangerous infectious diseases? The primary goal of this paper is to review the most relevant contributions which have directly tackled those questions, both with respect to the effects of climate changes on the diffusion of non-infectious and infectious diseases, with malaria as a case study. Specific attention will be drawn on the methodological aspects of each study, which will be classified according to the type of quantitative model considered, namely time series models, panel data and spatial models, and non-statistical approaches. Since many different disciplines and approaches are involved, a broader view is necessary in order to provide a better understanding of the interactions between climate and health. In this respect, our paper also presents a critical summary of the recent literature related to more general aspects of the impacts of climate changes on human health, such as: the economics of climate change; how to manage the health effects of climate change; the establishment of Early Warning Systems for infectious diseases. PMID:22754455

  20. Sample size and optimal sample design in tuberculosis surveys

    PubMed Central

    Sánchez-Crespo, J. L.

    1967-01-01

    Tuberculosis surveys sponsored by the World Health Organization have been carried out in different communities during the last few years. Apart from the main epidemiological findings, these surveys have provided basic statistical data for use in the planning of future investigations. In this paper an attempt is made to determine the sample size desirable in future surveys that include one of the following examinations: tuberculin test, direct microscopy, and X-ray examination. The optimum cluster sizes are found to be 100-150 children under 5 years of age in the tuberculin test, at least 200 eligible persons in the examination for excretors of tubercle bacilli (direct microscopy) and at least 500 eligible persons in the examination for persons with radiological evidence of pulmonary tuberculosis (X-ray). Modifications of the optimum sample size in combined surveys are discussed. PMID:5300008
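Cluster sample sizes like those recommended above are conventionally derived by inflating a simple-random-sample size by a design effect, deff = 1 + (m - 1) * ICC, where m is the cluster size and ICC the within-cluster correlation. The sketch below shows this standard textbook calculation with invented inputs; it is not the paper's own derivation, which optimises cluster size against survey cost:

```python
import math

def cluster_sample_size(p, margin, cluster_size, icc, z=1.96):
    """Sample size to estimate a prevalence p to within +/- margin
    at 95% confidence, inflated by the cluster-sampling design effect:
    n = (1 + (m - 1) * icc) * z^2 * p * (1 - p) / margin^2."""
    deff = 1 + (cluster_size - 1) * icc
    n_srs = (z ** 2) * p * (1 - p) / margin ** 2
    return math.ceil(deff * n_srs)

# hypothetical example: 5% prevalence estimated to +/- 1 percentage point,
# clusters of 150 children, assumed ICC of 0.01
print(cluster_sample_size(p=0.05, margin=0.01, cluster_size=150, icc=0.01))
```

With ICC = 0 (no clustering) the formula reduces to the familiar simple-random-sample size.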

  1. ESTIMATING AMPHIBIAN OCCUPANCY RATES IN PONDS UNDER COMPLEX SURVEY DESIGNS

    EPA Science Inventory

    Monitoring the occurrence of specific amphibian species in ponds is one component of the US Geological Survey's Amphibian Monitoring and Research Initiative. Two collaborative studies were conducted in Olympic National Park and southeastern region of Oregon. The number of ponds...

  2. Translating HIV sequences into quantitative fitness landscapes predicts viral vulnerabilities for rational immunogen design.

    PubMed

    Ferguson, Andrew L; Mann, Jaclyn K; Omarjee, Saleha; Ndung'u, Thumbi; Walker, Bruce D; Chakraborty, Arup K

    2013-03-21

    A prophylactic or therapeutic vaccine offers the best hope to curb the HIV-AIDS epidemic gripping sub-Saharan Africa, but it remains elusive. A major challenge is the extreme viral sequence variability among strains. Systematic means to guide immunogen design for highly variable pathogens like HIV are not available. Using computational models, we have developed an approach to translate available viral sequence data into quantitative landscapes of viral fitness as a function of the amino acid sequences of its constituent proteins. Predictions emerging from our computationally defined landscapes for the proteins of HIV-1 clade B Gag were positively tested against new in vitro fitness measurements and were consistent with previously defined in vitro measurements and clinical observations. These landscapes chart the peaks and valleys of viral fitness as protein sequences change and inform the design of immunogens and therapies that can target regions of the virus most vulnerable to selection pressure.

  3. Edesign: Primer and Enhanced Internal Probe Design Tool for Quantitative PCR Experiments and Genotyping Assays

    PubMed Central

    Kasahara, Naoko; Delobel, Diane; Hanami, Takeshi; Tanaka, Yuki; de Hoon, Michiel J. L.; Hayashizaki, Yoshihide; Usui, Kengo; Harbers, Matthias

    2016-01-01

Analytical PCR experiments preferably use internal probes for monitoring the amplification reaction and specific detection of the amplicon. Such internal probes have to be designed in close context with the amplification primers, and may require additional considerations for the detection of genetic variations. Here we describe Edesign, a new online and stand-alone tool for designing sets of PCR primers together with an internal probe for conducting quantitative real-time PCR (qPCR) and genotyping experiments. Edesign can be used for selecting standard DNA oligonucleotides such as TaqMan probes, but has been further extended with new functions and enhanced design features for Eprobes. Eprobes, with their single thiazole orange-labelled nucleotide, allow for highly sensitive genotypic assays because of their higher DNA binding affinity as compared to standard DNA oligonucleotides. Using new thermodynamic parameters, Edesign considers unique features of Eprobes during primer and probe design for establishing qPCR experiments and genotyping by melting curve analysis. Additional functions in Edesign allow probe design for effective discrimination between wild-type sequences and genetic variations either using standard DNA oligonucleotides or Eprobes. Edesign can be freely accessed online at http://www.dnaform.com/edesign2/, and the source code is available for download. PMID:26863543
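Melting temperature is the central quantity in the primer/probe design described above. Edesign uses full nearest-neighbor thermodynamics with parameters extended for Eprobes; as a much cruder illustration of the idea, the classic Wallace rule gives a first-pass Tm estimate for short oligonucleotides (this simplification is not part of Edesign):

```python
def wallace_tm(primer):
    """Rough melting temperature by the Wallace rule, valid only for
    short oligonucleotides (< ~14 nt): Tm = 2*(A+T) + 4*(G+C) degrees C.
    Real design tools use nearest-neighbor thermodynamics instead."""
    p = primer.upper()
    if set(p) - set("ACGT"):
        raise ValueError("primer must contain only A, C, G, T")
    at = p.count("A") + p.count("T")
    gc = p.count("G") + p.count("C")
    return 2 * at + 4 * gc

print(wallace_tm("ACGTACGTACGT"))  # 6 AT + 6 GC -> 36 C
```

The rule makes the GC-content dependence of Tm explicit: each G/C pair contributes twice as much stability as an A/T pair.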

  4. A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework

    NASA Astrophysics Data System (ADS)

    Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo

An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, Data Base Management Systems (DBMS), etc. in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. The identification and hierarchy of the framework requirements, and the corresponding solutions, were carefully investigated for two reference MDO frameworks: a general one and an aircraft-oriented one. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improvement of the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
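The AHP step mentioned above converts pairwise "how much more important is criterion A than B" judgments into numeric priority weights. A common approximation is the row geometric-mean method, sketched below with an invented 3-criterion comparison matrix (the paper's actual criteria and judgments are not reproduced here):

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights from a reciprocal
    pairwise-comparison matrix via the row geometric-mean method,
    normalized so the weights sum to 1."""
    n = len(pairwise)
    gms = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gms)
    return [g / total for g in gms]

# hypothetical judgments: criterion 0 is 3x as important as 1, 5x as 2;
# criterion 1 is 2x as important as 2 (lower triangle holds reciprocals)
M = [[1,     3,   5],
     [1 / 3, 1,   2],
     [1 / 5, 1 / 2, 1]]
print(ahp_weights(M))  # roughly [0.65, 0.23, 0.12]
```

The resulting weights can then feed a QFD matrix to score candidate frameworks against the weighted criteria.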

  5. SKA Weak Lensing II: Simulated Performance and Survey Design Considerations

    NASA Astrophysics Data System (ADS)

    Bonaldi, Anna; Harrison, Ian; Camera, Stefano; Brown, Michael L.

    2016-08-01

    We construct a pipeline for simulating weak lensing cosmology surveys with the Square Kilometre Array (SKA), taking as inputs telescope sensitivity curves; correlated source flux, size and redshift distributions; a simple ionospheric model; source redshift and ellipticity measurement errors. We then use this simulation pipeline to optimise a 2-year weak lensing survey performed with the first deployment of the SKA (SKA1). Our assessments are based on the total signal-to-noise of the recovered shear power spectra, a metric that we find to correlate very well with a standard dark energy figure of merit. We first consider the choice of frequency band, trading off increases in number counts at lower frequencies against poorer resolution; our analysis strongly prefers the higher frequency Band 2 (950-1760 MHz) channel of the SKA-MID telescope to the lower frequency Band 1 (350-1050 MHz). Best results would be obtained by allowing the centre of Band 2 to shift towards lower frequency, around 1.1 GHz. We then move on to consider survey size, finding that an area of 5,000 square degrees is optimal for most SKA1 instrumental configurations. Finally, we forecast the performance of a weak lensing survey with the second deployment of the SKA. The increased survey size (3π steradian) and sensitivity improves both the signal-to-noise and the dark energy metrics by two orders of magnitude.
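The optimisation metric described above, the total signal-to-noise of the recovered shear power spectra, is commonly written as (S/N)^2 = sum_l f_sky (2l + 1) / 2 * [C_l / (C_l + N_l)]^2. The sketch below evaluates that standard expression on toy spectra; it is an assumed form of the metric, not the paper's pipeline, and the power-law signal and flat noise level are invented:

```python
import math

def total_sn(cl, nl, fsky, ells):
    """Total detection S/N of an angular power spectrum:
    (S/N)^2 = sum over l of fsky * (2l + 1) / 2 * (C_l / (C_l + N_l))^2,
    where N_l is the (shape) noise spectrum and fsky the sky fraction."""
    sn2 = sum(fsky * (2 * l + 1) / 2.0 * (cl[i] / (cl[i] + nl[i])) ** 2
              for i, l in enumerate(ells))
    return math.sqrt(sn2)

# toy power-law signal with a flat noise floor
ells = list(range(10, 1000))
cl = [1.0 / l ** 2 for l in ells]
nl = [1e-5] * len(ells)
print(round(total_sn(cl, nl, fsky=0.12, ells=ells), 1))
```

Because fsky factors out, the total S/N scales as the square root of survey area at fixed depth, which is why the area/depth trade-off dominates the survey-design choices in the abstract.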

  6. The IMACS Cluster Building Survey. V. Further Evidence for Starburst Recycling from Quantitative Galaxy Morphologies

    NASA Astrophysics Data System (ADS)

    Abramson, Louis E.; Dressler, Alan; Gladders, Michael D.; Oemler, Augustus, Jr.; Poggianti, Bianca M.; Monson, Andrew; Persson, Eric; Vulcani, Benedetta

    2013-11-01

    Using J- and K s-band imaging obtained as part of the IMACS Cluster Building Survey (ICBS), we measure Sérsic indices for 2160 field and cluster galaxies at 0.31 < z < 0.54. Using both mass- and magnitude-limited samples, we compare the distributions for spectroscopically determined passive, continuously star-forming, starburst, and post-starburst systems and show that previously established spatial and statistical connections between these types extend to their gross morphologies. Outside of cluster cores, we find close structural ties between starburst and continuously star-forming, as well as post-starburst and passive types, but not between starbursts and post-starbursts. These results independently support two conclusions presented in Paper II of this series: (1) most starbursts are the product of a non-disruptive triggering mechanism that is insensitive to global environment, such as minor mergers; (2) starbursts and post-starbursts generally represent transient phases in the lives of "normal" star-forming and quiescent galaxies, respectively, originating from and returning to these systems in closed "recycling" loops. In this picture, spectroscopically identified post-starbursts constitute a minority of all recently terminated starbursts, largely ruling out the typical starburst as a quenching event in all but the densest environments. Data were obtained using the 6.5 m Magellan Telescopes at Las Campanas Observatory, Chile.

  8. Hydrological drought types in cold climates: quantitative analysis of causing factors and qualitative survey of impacts

    NASA Astrophysics Data System (ADS)

    Van Loon, A. F.; Ploum, S. W.; Parajka, J.; Fleig, A. K.; Garnier, E.; Laaha, G.; Van Lanen, H. A. J.

    2015-04-01

    For drought management and prediction, knowledge of causing factors and socio-economic impacts of hydrological droughts is crucial. Propagation of meteorological conditions in the hydrological cycle results in different hydrological drought types that require separate analysis. In addition to the existing hydrological drought typology, we here define two new drought types related to snow and ice. A snowmelt drought is a deficiency in the snowmelt discharge peak in spring in snow-influenced basins and a glaciermelt drought is a deficiency in the glaciermelt discharge peak in summer in glacierised basins. In 21 catchments in Austria and Norway we studied the meteorological conditions in the seasons preceding and at the time of snowmelt and glaciermelt drought events. Snowmelt droughts in Norway were mainly controlled by below-average winter precipitation, while in Austria both temperature and precipitation played a role. For glaciermelt droughts, the effect of below-average summer air temperature was dominant, both in Austria and Norway. Subsequently, we investigated the impacts of temperature-related drought types (i.e. snowmelt and glaciermelt drought, but also cold and warm snow season drought and rain-to-snow-season drought). In historical archives and drought databases for the US and Europe many impacts were found that can be attributed to these temperature-related hydrological drought types, mainly in the agriculture and electricity production (hydropower) sectors. However, drawing conclusions on the frequency of occurrence of different drought types from reported impacts is difficult, mainly because of reporting biases and the inevitably limited spatial and temporal scales of the information. Finally, this study shows that complete integration of quantitative analysis of causing factors and qualitative analysis of impacts of temperature-related droughts is not yet possible. Analysis of selected events, however, points out that it can be a promising research

  9. Applications of numerical optimization methods to helicopter design problems: A survey

    NASA Technical Reports Server (NTRS)

    Miura, H.

    1984-01-01

    This survey covers applications of mathematical programming methods used to improve the design of helicopters and their components. Applications of multivariable search techniques in finite-dimensional space are considered. Five categories of helicopter design problems are addressed: (1) conceptual and preliminary design, (2) rotor-system design, (3) airframe structures design, (4) control-system design, and (5) flight-trajectory planning. Key technical progress in numerical optimization methods relevant to rotorcraft applications is summarized.

  10. The JCMT Gould Belt Survey: a quantitative comparison between SCUBA-2 data reduction methods

    NASA Astrophysics Data System (ADS)

    Mairs, S.; Johnstone, D.; Kirk, H.; Graves, S.; Buckle, J.; Beaulieu, S. F.; Berry, D. S.; Broekhoven-Fiene, H.; Currie, M. J.; Fich, M.; Hatchell, J.; Jenness, T.; Mottram, J. C.; Nutter, D.; Pattle, K.; Pineda, J. E.; Salji, C.; Francesco, J. Di; Hogerheijde, M. R.; Ward-Thompson, D.; JCMT Gould Belt survey Team

    2015-12-01

    Performing ground-based submillimetre observations is a difficult task as the measurements are subject to absorption and emission from water vapour in the Earth's atmosphere and time variation in weather and instrument stability. Removing these features and other artefacts from the data is a vital process which affects the characteristics of the recovered astronomical structure we seek to study. In this paper, we explore two data reduction methods for data taken with the Submillimetre Common-User Bolometer Array-2 (SCUBA-2) at the James Clerk Maxwell Telescope (JCMT). The JCMT Legacy Reduction 1 (JCMT LR1) and the Gould Belt Legacy Survey Legacy Release 1 (GBS LR1) reductions both use the same software (STARLINK) but differ in their choice of data reduction parameters. We find that the JCMT LR1 reduction is suitable for determining whether or not compact emission is present in a given region and the GBS LR1 reduction is tuned in a robust way to uncover more extended emission, which better serves more in-depth physical analyses of star-forming regions. Using the GBS LR1 method, we find that compact sources are recovered well, even at a peak brightness of only three times the noise, whereas the reconstruction of larger objects requires much care when drawing boundaries around the expected astronomical signal in the data reduction process. Incorrect boundaries can lead to false structure identification or cause real structure to be missed. In the JCMT LR1 reduction, the extent of the true structure of objects larger than a point source is never fully recovered.

  11. Ergonomic Based Design and Survey of Elementary School Furniture

    ERIC Educational Resources Information Center

    Maheshwar; Jawalkar, Chandrashekhar S.

    2014-01-01

    This paper presents the ergonomic aspects in designing and prototyping of desks cum chairs used in elementary schools. The procedures adopted for the assessment included: the study of existing school furniture, design analysis and development of prototypes. The design approach proposed a series of adjustable desks and chairs developed in terms of…

  12. ESTIMATING PROPORTION OF AREA OCCUPIED UNDER COMPLEX SURVEY DESIGNS

    EPA Science Inventory

    Estimating proportion of sites occupied, or proportion of area occupied (PAO) is a common problem in environmental studies. Typically, field surveys do not ensure that occupancy of a site is made with perfect detection. Maximum likelihood estimation of site occupancy rates when...
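
The maximum likelihood estimation the abstract refers to is typically built on the standard site-occupancy model with imperfect detection. A minimal sketch of one site's likelihood contribution (a simplification that assumes constant occupancy probability psi and detection probability p; the EPA study additionally handles stratification and unequal selection probabilities):

```python
def history_likelihood(history, psi, p):
    """Likelihood of one site's detection history over K visits.

    history: sequence of 0/1 detections. If anything was detected, the
    site is surely occupied: psi * prod(p or 1-p). If nothing was
    detected, either it is occupied and missed every time, or unoccupied:
    psi * (1-p)**K + (1-psi).
    """
    if any(history):
        like = psi
        for d in history:
            like *= p if d else (1 - p)
        return like
    return psi * (1 - p) ** len(history) + (1 - psi)

# A site visited 3 times, detected only on visit 2 (hypothetical values):
L = history_likelihood([0, 1, 0], psi=0.6, p=0.4)  # 0.6 * 0.6 * 0.4 * 0.6
```

Maximizing the product of such terms over all surveyed sites yields the occupancy estimate; survey weights would enter for a complex design.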

  13. Influenza Knowledge, Attitude, and Behavior Survey for Grade School Students: Design and Novel Assessment Methodology

    PubMed Central

    Koep, Tyler H.; Huskins, W. Charles; Clemens, Christal; Jenkins, Sarah; Pierret, Chris; Ekker, Stephen C.; Enders, Felicity T.

    2016-01-01

    Background Despite the fact that infectious diseases can spread readily in grade schools, few studies have explored prevention in this setting. Additionally, we lack valid tools for students to self-report knowledge, attitudes, and behaviors. As part of an ongoing study of a curriculum intervention to promote healthy behaviors, we developed and evaluated age-appropriate surveys to determine students’ understanding of influenza prevention. Methods Surveys were adapted from adolescent and adult influenza surveys and administered to students in grades 2–5 (ages 7–11) at two Rochester public schools. We assessed student understanding by analyzing percent repeatability of 20 survey questions and compared percent “Don’t Know” (DK) responses across grades, gender, and race. Questions thought to be ambiguous after early survey administration were investigated in student focus groups, modified as appropriate, and reassessed. Results The response rate across all surveys was > 87%. Survey questions were well understood; 17 of 20 questions demonstrated strong pre/post repeatability (> 70%). Only 1 question showed an increase in DK response for higher grades (p < .0001). Statistical analysis and qualitative feedback led to modification of 3 survey questions and improved measures of understanding in the final survey administration. Conclusions Grade-school students’ knowledge, attitudes and behavior toward influenza prevention can be assessed using surveys. Quantitative and qualitative analysis may be used to assess participant understanding and refine survey development for pediatric survey instruments. These methods may be used to assess the repeatability and validity of surveys to assess the impact of health education interventions in young children. PMID:24859735

  14. Influenza knowledge, attitude, and behavior survey for grade school students: design and novel assessment methodology.

    PubMed

    Koep, Tyler H; Huskins, W Charles; Clemens, Christal; Jenkins, Sarah; Pierret, Chris; Ekker, Stephen C; Enders, Felicity T

    2014-12-01

    Despite the fact that infectious diseases can spread readily in grade schools, few studies have explored prevention in this setting. Additionally, we lack valid tools for students to self-report knowledge, attitudes, and behaviors. As part of an ongoing study of a curriculum intervention to promote healthy behaviors, we developed and evaluated age-appropriate surveys to determine students' understanding of influenza prevention. Surveys were adapted from adolescent and adult influenza surveys and administered to students in grades 2-5 (ages 7-11) at two Rochester public schools. We assessed student understanding by analyzing percent repeatability of 20 survey questions and compared percent "don't know" (DK) responses across grades, gender, and race. Questions thought to be ambiguous after early survey administration were investigated in student focus groups, modified as appropriate, and reassessed. The response rate across all surveys was >87%. Survey questions were well understood; 16 of 20 questions demonstrated strong pre/post repeatability (>70%). Only 1 question showed an increase in DK response for higher grades (p < .0001). Statistical analysis and qualitative feedback led to modification of 3 survey questions and improved measures of understanding in the final survey administration. Grade-school students' knowledge, attitudes and behavior toward influenza prevention can be assessed using surveys. Quantitative and qualitative analysis may be used to assess participant understanding and refine survey development for pediatric survey instruments. These methods may be used to assess the repeatability and validity of surveys to assess the impact of health education interventions in young children. PMID:24859735

  15. Quantitative imaging of the human upper airway: instrument design and clinical studies

    NASA Astrophysics Data System (ADS)

    Leigh, M. S.; Armstrong, J. J.; Paduch, A.; Sampson, D. D.; Walsh, J. H.; Hillman, D. R.; Eastwood, P. R.

    2006-08-01

    Imaging of the human upper airway is widely used in medicine, in both clinical practice and research. Common imaging modalities include video endoscopy, X-ray CT, and MRI. However, no current modality is both quantitative and safe to use for extended periods of time. Such a capability would be particularly valuable for sleep research, which is inherently reliant on long observation sessions. We have developed an instrument capable of quantitative imaging of the human upper airway, based on endoscopic optical coherence tomography. There are no dose limits for optical techniques, and the minimally invasive imaging probe is safe for use in overnight studies. We report on the design of the instrument and its use in preliminary clinical studies, and we present results from a range of initial experiments. The experiments show that the instrument is capable of imaging during sleep, and that it can record dynamic changes in airway size and shape. This information is useful for research into sleep disorders, and potentially for clinical diagnosis and therapies.

  16. Design, synthesis and exploring the quantitative structure-activity relationship of some antioxidant flavonoid analogues.

    PubMed

    Das, Sreeparna; Mitra, Indrani; Batuta, Shaikh; Niharul Alam, Md; Roy, Kunal; Begum, Naznin Ara

    2014-11-01

    A series of flavonoid analogues were synthesized and screened for in vitro antioxidant activity through their ability to quench the 1,1-diphenyl-2-picryl hydrazyl (DPPH) radical. The activity of these compounds, measured in comparison to the well-known standard antioxidants (29-32), their precursors (38-42) and other bioactive moieties (38-42) partially resembling the flavone skeleton, was analyzed further to develop Quantitative Structure-Activity Relationship (QSAR) models using the Genetic Function Approximation (GFA) technique. Based on the essential structural requirements predicted by the QSAR models, some analogues were designed, synthesized and tested for activity. The predicted and experimental activities of these compounds were well correlated. Flavone analogue 20 was found to be the most potent antioxidant.

  17. Genetic resources for quantitative trait analysis: novelty and efficiency in design from an Arabidopsis perspective.

    PubMed

    Wijnen, Cris L; Keurentjes, Joost J B

    2014-04-01

    The use of genetic resources for the analysis of quantitative traits finds its roots in crop breeding but has seen a rejuvenation in Arabidopsis thaliana thanks to specific tools and genomic approaches. Although widely used in numerous crop and natural species, many approaches were first developed in this reference plant. We will discuss the scientific background and historical use of mapping populations in Arabidopsis and highlight the technological innovations that drove the development of novel strategies. We will especially lay emphasis on the methodologies used to generate the diverse population types and designate possible applications. Finally we highlight some of the most recent developments in generating genetic mapping resources and suggest specific usage for these novel tools and concepts.

  18. Controls design with crossfeeds for hovering rotorcraft using quantitative feedback theory

    NASA Technical Reports Server (NTRS)

    Tischler, Mark B.; Biezad, Daniel J.; Cheng, Rendy

    1996-01-01

    A multi-input, multi-output controls design with dynamic crossfeed pre-compensation is presented for rotorcraft in near-hovering flight using Quantitative Feedback Theory (QFT). The resulting closed-loop control system bandwidth allows the rotorcraft to be considered for use as an inflight simulator. The use of dynamic, robust crossfeeds prior to the QFT design reduces the magnitude of required feedback gain and results in performance that meets most handling qualities specifications relative to the decoupling of off-axis responses. Handling qualities are Level 1 for both low-gain tasks and high-gain tasks in the roll, pitch, and yaw axes except for the 10 deg/sec moderate-amplitude yaw command where the rotorcraft exhibits Level 2 handling qualities in the yaw axis caused by phase lag. The combined effect of the QFT feedback design following the implementation of low-order, dynamic crossfeed compensators successfully decouples ten of twelve off-axis channels. For the other two channels it was not possible to find a single, low-order crossfeed that was effective. This is an area to be investigated in future research.

  19. Probability of detection of nests and implications for survey design

    USGS Publications Warehouse

    Smith, P.A.; Bart, J.; Lanctot, Richard B.; McCaffery, B.J.; Brown, S.

    2009-01-01

    Surveys based on double sampling include a correction for the probability of detection by assuming complete enumeration of birds in an intensively surveyed subsample of plots. To evaluate this assumption, we calculated the probability of detecting active shorebird nests by using information from observers who searched the same plots independently. Our results demonstrate that this probability varies substantially by species and stage of the nesting cycle but less by site or density of nests. Among the species we studied, the estimated single-visit probability of nest detection during the incubation period varied from 0.21 for the White-rumped Sandpiper (Calidris fuscicollis), the most difficult species to detect, to 0.64 for the Western Sandpiper (Calidris mauri), the most easily detected species, with a mean across species of 0.46. We used these detection probabilities to predict the fraction of persistent nests found over repeated nest searches. For a species with the mean value for detectability, the detection rate exceeded 0.85 after four visits. This level of nest detection was exceeded in only three visits for the Western Sandpiper, but six to nine visits were required for the White-rumped Sandpiper, depending on the type of survey employed. Our results suggest that the double-sampling method's requirement of nearly complete counts of birds in the intensively surveyed plots is likely to be met for birds with nests that survive over several visits of nest searching. Individuals with nests that fail quickly or individuals that do not breed can be detected with high probability only if territorial behavior is used to identify likely nesting pairs. © The Cooper Ornithological Society, 2009.
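
The visit counts quoted above follow, to first order, from treating visits as independent Bernoulli trials: after n visits a persistent nest is found with probability 1 - (1 - p)**n. A sketch under that independence assumption (the study's fuller analysis also accounts for nest survival and survey type, so the numbers only roughly match):

```python
import math

def visits_needed(p, target=0.85):
    """Smallest n with cumulative detection 1 - (1 - p)**n >= target,
    assuming independent visits with constant detection probability p."""
    return math.ceil(math.log(1 - target) / math.log(1 - p))

# Single-visit detection probabilities quoted in the abstract:
for name, p in [("White-rumped Sandpiper", 0.21),
                ("mean across species", 0.46),
                ("Western Sandpiper", 0.64)]:
    print(name, visits_needed(p))
```

With p = 0.46 this gives four visits to exceed 0.85, matching the abstract; with p = 0.21 it gives nine, the upper end of the reported six-to-nine range.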

  20. Methods for the Design and Administration of Web-based Surveys

    PubMed Central

    Schleyer, Titus K. L.; Forrest, Jane L.

    2000-01-01

    This paper describes the design, development, and administration of a Web-based survey to determine the use of the Internet in clinical practice by 450 dental professionals. The survey blended principles of a controlled mail survey with data collection through a Web-based database application. The survey was implemented as a series of simple HTML pages and tested with a wide variety of operating environments. The response rate was 74.2 percent. Eighty-four percent of the participants completed the Web-based survey, and 16 percent used e-mail or fax. Problems identified during survey administration included incompatibilities/technical problems, usability problems, and a programming error. The cost of the Web-based survey was 38 percent less than that of an equivalent mail survey. A general formula for calculating breakeven points between electronic and hardcopy surveys is presented. Web-based surveys can significantly reduce turnaround time and cost compared with mail surveys and may enhance survey item completion rates. PMID:10887169
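
The paper's breakeven formula is not reproduced in this abstract; a generic version, assuming each mode has a fixed setup cost plus a per-respondent cost (all numbers below are hypothetical illustrations, not the study's figures):

```python
def breakeven_n(fixed_web, per_web, fixed_mail, per_mail):
    """Sample size at which total web-survey cost equals total mail cost,
    i.e. fixed_web + n*per_web == fixed_mail + n*per_mail."""
    return (fixed_web - fixed_mail) / (per_mail - per_web)

# Hypothetical numbers: web has higher setup cost but lower unit cost.
n = breakeven_n(fixed_web=2000, per_web=1.0, fixed_mail=200, per_mail=7.0)
# Beyond n respondents, the web survey is the cheaper mode.
```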

  1. Quantitative determination of rarity of freshwater fishes and implications for imperiled-species designations.

    PubMed

    Pritt, Jeremy J; Frimpong, Emmanuel A

    2010-10-01

    Conserving rare species and protecting biodiversity and ecosystem functioning depends on sound information on the nature of rarity. Rarity is multidimensional and has a variety of definitions, which presents the need for a quantitative classification scheme with which to categorize species as rare or common. We constructed such a classification for North American freshwater fishes to better describe rarity in fishes and provide researchers and managers with a tool to streamline conservation efforts. We used data on range extents, habitat specificities, and local population sizes of North American freshwater fishes and a variety of quantitative methods and statistical decision criteria, including quantile regression and a cost-function algorithm to determine thresholds for categorizing a species as rare or common. Species fell into eight groups that conform to an established framework for rarity. Fishes listed by the American Fisheries Society (AFS) as endangered, threatened, or vulnerable were most often rare because their local population sizes were low, ranges were small, and they had specific habitat needs, in that order, whereas unlisted species were most often considered common on the basis of these three factors. Species with large ranges generally had few specific habitat needs, whereas those with small ranges tended to have narrow habitat specificities. We identified 30 species not designated as imperiled by AFS that were rare along all dimensions of rarity and may warrant further study or protection, and we found three designated species that were common along all dimensions and may require a review of their imperilment status. Our approach could be applied to other taxa to aid conservation decisions and serve as a useful tool for future revisions of listings of fish species. PMID:20337684
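
The eight groups correspond to the classic rarity framework in which each species is scored as restricted or wide on three dimensions (range extent, habitat specificity, local population size), giving 2**3 combinations. A sketch with hypothetical placeholder thresholds, not the study's fitted quantile-regression or cost-function values:

```python
def rarity_group(range_km2, n_habitats, pop_size,
                 range_thresh=10_000, habitat_thresh=3, pop_thresh=1_000):
    """Classify a species into one of 8 rarity groups as a triple of
    booleans: (wide range, habitat generalist, large local populations).
    (True, True, True) is common on all axes; (False, False, False) is
    rare on all axes. Thresholds here are illustrative only."""
    return (range_km2 >= range_thresh,
            n_habitats >= habitat_thresh,
            pop_size >= pop_thresh)

# A species restricted on all three axes falls in the rarest group:
group = rarity_group(500, 1, 50)  # (False, False, False)
```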

  2. Estimating occupancy rates with imperfect detection under complex survey designs

    EPA Science Inventory

    Monitoring the occurrence of specific amphibian species is of interest. Typically, the monitoring design is a complex design that involves stratification and unequal probability of selection. When conducting field visits to selected sites, a common problem is that during a singl...

  3. Survey design for lakes and reservoirs in the United States to assess contaminants in fish tissue

    EPA Science Inventory

    The National Lake Fish Tissue Study (NLFTS) was the first survey of fish contamination in lakes and reservoirs in the 48 conterminous states based on probability survey design. This study included the largest set (268) of persistent, bioaccumulative, and toxic (PBT) chemicals ev...

  4. The Laboratory Course Assessment Survey: A Tool to Measure Three Dimensions of Research-Course Design

    ERIC Educational Resources Information Center

    Corwin, Lisa A.; Runyon, Christopher; Robinson, Aspen; Dolan, Erin L.

    2015-01-01

    Course-based undergraduate research experiences (CUREs) are increasingly being offered as scalable ways to involve undergraduates in research. Yet few if any design features that make CUREs effective have been identified. We developed a 17-item survey instrument, the Laboratory Course Assessment Survey (LCAS), that measures students' perceptions…

  5. Targeting Urban Watershed Stressor Gradients: Stream Survey Design, Ecological Responses, and Implications of Land Cover Resolution

    EPA Science Inventory

    We conducted a stream survey in the Narragansett Bay Watershed designed to target a gradient of development intensity, and to examine how associated changes in nutrients, carbon, and stressors affect periphyton and macroinvertebrates. Concentrations of nutrients, cations, and ani...

  6. Effect of survey design and catch rate estimation on total catch estimates in Chinook salmon fisheries

    USGS Publications Warehouse

    McCormick, Joshua L.; Quist, Michael C.; Schill, Daniel J.

    2012-01-01

    Roving–roving and roving–access creel surveys are the primary techniques used to obtain information on harvest of Chinook salmon Oncorhynchus tshawytscha in Idaho sport fisheries. Once interviews are conducted using roving–roving or roving–access survey designs, mean catch rate can be estimated with the ratio-of-means (ROM) estimator, the mean-of-ratios (MOR) estimator, or the MOR estimator with exclusion of short-duration (≤0.5 h) trips. Our objective was to examine the relative bias and precision of total catch estimates obtained from use of the two survey designs and three catch rate estimators for Idaho Chinook salmon fisheries. Information on angling populations was obtained by direct visual observation of portions of Chinook salmon fisheries in three Idaho river systems over an 18-d period. Based on data from the angling populations, Monte Carlo simulations were performed to evaluate the properties of the catch rate estimators and survey designs. Among the three estimators, the ROM estimator provided the most accurate and precise estimates of mean catch rate and total catch for both roving–roving and roving–access surveys. On average, the root mean square error of simulated total catch estimates was 1.42 times greater and relative bias was 160.13 times greater for roving–roving surveys than for roving–access surveys. Length-of-stay bias and nonstationary catch rates in roving–roving surveys both appeared to affect catch rate and total catch estimates. Our results suggest that use of the ROM estimator in combination with an estimate of angler effort provided the least biased and most precise estimates of total catch for both survey designs. However, roving–access surveys were more accurate than roving–roving surveys for Chinook salmon fisheries in Idaho.
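
The three catch-rate estimators compared above differ only in how per-trip data are aggregated. A sketch on hypothetical angler-interview data (the estimated total catch is then the chosen rate times an independent estimate of angler effort):

```python
def ratio_of_means(trips):
    """ROM: total catch divided by total effort across all interviews."""
    return sum(c for c, h in trips) / sum(h for c, h in trips)

def mean_of_ratios(trips, min_hours=0.0):
    """MOR: average of per-trip catch rates; setting min_hours=0.5
    reproduces the variant that excludes short-duration trips."""
    rates = [c / h for c, h in trips if h > min_hours]
    return sum(rates) / len(rates)

# Hypothetical interviews: (fish caught, hours fished)
trips = [(0, 0.5), (1, 2.0), (2, 4.0), (0, 1.0)]

rom = ratio_of_means(trips)            # 3 / 7.5 = 0.4 fish per hour
mor = mean_of_ratios(trips)            # mean of [0, 0.5, 0.5, 0] = 0.25
mor_excl = mean_of_ratios(trips, 0.5)  # short trip dropped
```

Note how the MOR estimator weights every trip equally regardless of duration, which is one mechanism behind the length-of-stay bias the abstract mentions.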

  7. Rational design of surface/interface chemistry for quantitative in vivo monitoring of brain chemistry.

    PubMed

    Zhang, Meining; Yu, Ping; Mao, Lanqun

    2012-04-17

    To understand the molecular basis of brain functions, researchers would like to be able to quantitatively monitor the levels of neurochemicals in the extracellular fluid in vivo. However, the chemical and physiological complexity of the central nervous system (CNS) presents challenges for the development of these analytical methods. This Account describes the rational design and careful construction of electrodes and nanoparticles with specific surface/interface chemistry for quantitative in vivo monitoring of brain chemistry. We used the redox nature of neurochemicals at the electrode/electrolyte interface to establish a basis for monitoring specific neurochemicals. Carbon nanotubes provide an electrode/electrolyte interface for the selective oxidation of ascorbate, and we have developed both in vivo voltammetry and an online electrochemical detecting system for continuously monitoring this molecule in the CNS. Although Ca(2+) and Mg(2+) are involved in a number of neurochemical signaling processes, they are still difficult to detect in the CNS. These divalent cations can enhance electrocatalytic oxidation of NADH at an electrode modified with toluidine blue O. We used this property to develop online electrochemical detection systems for simultaneous measurements of Ca(2+) and Mg(2+) and for continuous selective monitoring of Mg(2+) in the CNS. We have also harnessed biological schemes for neurosensing in the brain to design other monitoring systems. By taking advantage of the distinct reaction properties of dopamine (DA), we have developed a nonoxidative mechanism for DA sensing and a system that can potentially be used for continuous sensing of DA release. Using "artificial peroxidase" (Prussian blue) to replace a natural peroxidase (horseradish peroxidase, HRP), our online system can simultaneously detect basal levels of glucose and lactate. By substituting oxidases with dehydrogenases, we have used enzyme-based biosensing schemes to develop a physiologically

  8. Systematic review of effects of current transtibial prosthetic socket designs--Part 2: Quantitative outcomes.

    PubMed

    Safari, Mohammad Reza; Meier, Margrit Regula

    2015-01-01

    This review is an attempt to untangle the complexity of transtibial prosthetic socket fit and perhaps find some indication of whether a particular prosthetic socket type might be best for a given situation. In addition, we identified knowledge gaps, thus providing direction for possible future research. We followed the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines, using medical subject headings and standard key words to search for articles in relevant databases. No restrictions were made on study design and type of outcome measure used. From the obtained search results (n = 1,863), 35 articles were included. The relevant data were entered into a predefined data form that included the Downs and Black risk of bias assessment checklist. This article presents the results from the systematic review of the quantitative outcomes (n = 27 articles). Trends indicate that vacuum-assisted suction sockets improve gait symmetry, volume control, and residual limb health more than other socket designs. Hydrostatic sockets seem to yield fewer inconsistent socket fittings, mitigating a problem that greatly influences outcome measures. Knowledge gaps exist in the understanding of clinically meaningful changes in socket fit and its effect on biomechanical outcomes. Further, safe and comfortable pressure thresholds under various conditions should be determined through a systematic approach.

  9. One-year monthly quantitative survey of noroviruses, enteroviruses, and adenoviruses in wastewater collected from six plants in Japan.

    PubMed

    Katayama, Hiroyuki; Haramoto, Eiji; Oguma, Kumiko; Yamashita, Hiromasa; Tajima, Atsushi; Nakajima, Hideichiro; Ohgaki, Shinichiro

    2008-03-01

    Sewerage systems are important nodes to monitor human enteric pathogens transmitted via water. A quantitative virus survey was performed once a month for a year to understand the seasonal profiles of noroviruses genotype 1 and genotype 2, enteroviruses, and adenoviruses in sewerage systems. A total of 72 samples of influent, secondary-treated wastewater before chlorination and effluent were collected from six wastewater treatment plants in Japan. Viruses were successfully recovered from 100 ml of influent and 1000 ml of the secondary-treated wastewater and effluent using the acid rinse method. Viruses were determined by the RT-PCR or PCR method to obtain the most probable number for each sample. All the samples were also assayed for fecal coliforms (FCs) by a double-layer method. The seasonal profiles of noroviruses genotype 1 and genotype 2 in influent were very similar, i.e. they were abundant in winter (from November to March) at a geometric mean value of 190 and 200 RT-PCR units/ml, respectively, and less frequent in summer (from June to September), at 4.9 and 9.1 RT-PCR units/ml, respectively. The concentrations of enteroviruses and adenoviruses were mostly constant all the year round, 17 RT-PCR units/ml and 320 PCR units/ml in influent, and 0.044 RT-PCR units/ml and 7.0 PCR units/ml in effluent, respectively.
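
The concentrations above are reported as geometric means, the usual choice for virus counts that span orders of magnitude. A minimal helper (averaging on a log scale):

```python
import math

def geometric_mean(xs):
    """Geometric mean: exp of the arithmetic mean of the logs."""
    return math.exp(sum(math.log(x) for x in xs) / len(xs))

# E.g. the two winter influent values quoted for noroviruses:
gm = geometric_mean([190, 200])  # sqrt(190 * 200) for two values
```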

  10. Temporally Adaptive Sampling: A Case Study in Rare Species Survey Design with Marbled Salamanders (Ambystoma opacum)

    PubMed Central

    Charney, Noah D.; Kubel, Jacob E.; Eiseman, Charles S.

    2015-01-01

    Improving detection rates for elusive species with clumped distributions is often accomplished through adaptive sampling designs. This approach can be extended to include species with temporally variable detection probabilities. By concentrating survey effort in years when the focal species are most abundant or visible, overall detection rates can be improved. This requires either long-term monitoring at a few locations where the species are known to occur or models capable of predicting population trends using climatic and demographic data. For marbled salamanders (Ambystoma opacum) in Massachusetts, we demonstrate that annual variation in detection probability of larvae is regionally correlated. In our data, the difference in survey success between years was far more important than the difference among the three survey methods we employed: diurnal surveys, nocturnal surveys, and dipnet surveys. Based on these data, we simulate future surveys to locate unknown populations under a temporally adaptive sampling framework. In the simulations, when pond dynamics are correlated over the focal region, the temporally adaptive design improved mean survey success by as much as 26% over a non-adaptive sampling design. Employing a temporally adaptive strategy costs very little, is simple, and has the potential to substantially improve the efficient use of scarce conservation funds. PMID:25799224
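
The gain from temporally adaptive sampling can be illustrated with a toy simulation (all numbers hypothetical, not the paper's fitted model): detectability is shared across the region each year, and the adaptive surveyor skips years predicted to be poor.

```python
import random

def detection_rate(adaptive, seed=1, n_years=50, ponds_per_year=10):
    """Fraction of pond surveys detecting larvae under a toy model in
    which each year is regionally 'good' (p = 0.6) or 'bad' (p = 0.1)."""
    rng = random.Random(seed)
    detections = surveys = 0
    for _ in range(n_years):
        p_year = rng.choice([0.1, 0.6])  # regionally correlated detectability
        if adaptive and p_year < 0.5:
            continue  # adaptive design: skip surveys in a predicted bad year
        for _ in range(ponds_per_year):
            surveys += 1
            detections += rng.random() < p_year
    return detections / surveys

print("non-adaptive:", round(detection_rate(False), 2))
print("adaptive:    ", round(detection_rate(True), 2))
```

In this sketch the adaptive design's per-survey success rate is substantially higher because effort is spent only in high-detectability years, mirroring the improvement the study reports.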

  13. Survey of electrical submersible systems design, application, and testing

    SciTech Connect

    Durham, M.O.; Lea, J.F.

    1996-05-01

    The electrical submersible pump industry has numerous recommended practices and procedures addressing various facets of the operation. Ascertaining the appropriate technique is tedious. Seldom are all the documents available at one location. This synopsis of all the industry practices provides a ready reference for testing, design, and application of electrical submersible pumping systems. An extensive bibliography identifies significant documents for further reference.

  14. The influence of the design matrix on treatment effect estimates in the quantitative analyses of single-subject experimental design research.

    PubMed

    Moeyaert, Mariola; Ugille, Maaike; Ferron, John M; Beretvas, S Natasha; Van den Noortgate, Wim

    2014-09-01

    The quantitative methods for analyzing single-subject experimental data have expanded during the last decade, including the use of regression models to statistically analyze the data, but still a lot of questions remain. One question is how to specify predictors in a regression model to account for the specifics of the design and estimate the effect size of interest. These quantitative effect sizes are used in retrospective analyses and allow synthesis of single-subject experimental study results which is informative for evidence-based decision making, research and theory building, and policy discussions. We discuss different design matrices that can be used for the most common single-subject experimental designs (SSEDs), namely, the multiple-baseline designs, reversal designs, and alternating treatment designs, and provide empirical illustrations. The purpose of this article is to guide single-subject experimental data analysts interested in analyzing and meta-analyzing SSED data.

  15. Improved Optical Design for the Large Synoptic Survey Telescope (LSST)

    SciTech Connect

    Seppala, L

    2002-09-24

    This paper presents an improved optical design for the LSST, an f/1.25 three-mirror telescope covering 3.0 degrees full field angle, with 6.9 m effective aperture diameter. The telescope operates at five wavelength bands spanning 386.5 nm to 1040 nm (B, V, R, I and Z). For all bands, 80% of the polychromatic diffracted energy is collected within 0.20 arc-seconds diameter. The reflective telescope uses an 8.4 m f/1.06 concave primary, a 3.4 m convex secondary and a 5.2 m concave tertiary in a Paul geometry. The system length is 9.2 m. A refractive corrector near the detector uses three fused silica lenses, rather than the two lenses of previous designs. Earlier designs required that one element be a vacuum barrier, but now the detector sits in an inert gas at ambient pressure. The last lens is the gas barrier. Small adjustments lead to optimal correction at each band. The filters have different axial thicknesses. The primary and tertiary mirrors are repositioned for each wavelength band. The new optical design incorporates features to simplify manufacturing. They include a flat detector, a far less aspheric convex secondary (10 µm from best-fit sphere) and reduced aspheric departures on the lenses and tertiary mirror. Five aspheric surfaces, on all three mirrors and on two lenses, are used. The primary is nearly parabolic. The telescope is fully baffled so that no specularly reflected light from any field angle, inside or outside of the full field angle of 3.0 degrees, can reach the detector.
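The quoted optics allow a quick consistency check: the system focal ratio and effective aperture fix the focal length and plate scale, and hence the physical size of the 0.20 arc-second image diameter at the detector. A short calculation, assuming the system operates at exactly f/1.25:

```python
D_EFF   = 6.9    # effective aperture diameter, m (from the abstract)
F_RATIO = 1.25   # system focal ratio (f/1.25)

f_len = F_RATIO * D_EFF                   # effective focal length: 8.625 m
arcsec_per_mm = 206265.0 / (f_len * 1e3)  # plate scale (206265 arcsec/radian)
spot_80 = 0.20 / arcsec_per_mm * 1e3      # 0.20 arcsec in micrometres

print(f"focal length : {f_len:.3f} m")
print(f"plate scale  : {arcsec_per_mm:.1f} arcsec/mm")
print(f"0.20 arcsec  = {spot_80:.1f} um at the focal plane")
```

The 0.20 arc-second energy diameter works out to under 10 µm at the focal plane, which illustrates why aspheric tolerances and a flat detector matter in this design.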

  16. Modified Universal Design Survey: Enhancing Operability of Launch Vehicle Ground Crew Worksites

    NASA Technical Reports Server (NTRS)

    Blume, Jennifer L.

    2010-01-01

    Operability is a driving requirement for next-generation space launch vehicles. Launch site ground operations include numerous operator tasks to prepare the vehicle for launch or to perform preflight maintenance. Ensuring that components requiring operator interaction at the launch site are designed for optimal human use is a high priority for operability. To promote operability, a Design Quality Evaluation Survey based on the Universal Design framework was developed to support Human Factors Engineering (HFE) evaluation for NASA's launch vehicles. Universal Design per se is not a priority for launch vehicle processing; however, applying its principles increases the probability of an error-free and efficient design, which promotes operability. The Design Quality Evaluation Survey incorporates and tailors the seven Universal Design Principles and adds new measures for Safety and Efficiency. Adapting an approach proven to measure Universal Design performance in products, each principle is associated with multiple performance measures that are rated by the degree to which each statement is true. The Design Quality Evaluation Survey was employed for several launch vehicle ground processing worksite analyses. The tool was found to be most useful for comparative judgments as opposed to an assessment of a single design option. It provided a useful piece of additional data when assessing possible operator interfaces or worksites for operability.

  17. Sample and design considerations in post-disaster mental health needs assessment tracking surveys

    PubMed Central

    Kessler, Ronald C.; Keane, Terence M.; Ursano, Robert J.; Mokdad, Ali; Zaslavsky, Alan M.

    2009-01-01

    Although needs assessment surveys are carried out after many large natural and man-made disasters, synthesis of findings across these surveys and disaster situations about patterns and correlates of need is hampered by inconsistencies in study designs and measures. Recognizing this problem, the US Substance Abuse and Mental Health Services Administration (SAMHSA) assembled a task force in 2004 to develop a model study design and interview schedule for use in post-disaster needs assessment surveys. The US National Institute of Mental Health subsequently approved a plan to establish a center to implement post-disaster mental health needs assessment surveys in the future using an integrated series of measures and designs of the sort proposed by the SAMHSA task force. A wide range of measurement, design, and analysis issues will arise in developing this center. Given that the least widely discussed of these issues concerns study design, the current report focuses on the most important sampling and design issues proposed for this center based on our experiences with the SAMHSA task force, subsequent Katrina surveys, and earlier work in other disaster situations. PMID:19035440

  18. Using simulation to evaluate wildlife survey designs: polar bears and seals in the Chukchi Sea.

    PubMed

    Conn, Paul B; Moreland, Erin E; Regehr, Eric V; Richmond, Erin L; Cameron, Michael F; Boveng, Peter L

    2016-01-01

    Logistically demanding and expensive wildlife surveys should ideally yield defensible estimates. Here, we show how simulation can be used to evaluate alternative survey designs for estimating wildlife abundance. Specifically, we evaluate the potential of instrument-based aerial surveys (combining infrared imagery with high-resolution digital photography to detect and identify species) for estimating abundance of polar bears and seals in the Chukchi Sea. We investigate the consequences of different levels of survey effort, flight track allocation and model configuration on bias and precision of abundance estimators. For bearded seals (0.07 animals km⁻²) and ringed seals (1.29 animals km⁻²), we find that eight flights traversing ≈7840 km are sufficient to achieve target precision levels (coefficient of variation (CV)<20%) for a 2.94×10⁵ km² study area. For polar bears (provisionally, 0.003 animals km⁻²), 12 flights traversing ≈11 760 km resulted in CVs ranging from 28 to 35%. Estimators were relatively unbiased with similar precision over different flight track allocation strategies and estimation models, although some combinations had superior performance. These findings suggest that instrument-based aerial surveys may provide a viable means for monitoring seal and polar bear populations on the surface of the sea ice over large Arctic regions. More broadly, our simulation-based approach to evaluating survey designs can serve as a template for biologists designing their own surveys. PMID:26909183
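The core of such a design evaluation, estimating how the CV of an abundance estimator shrinks with survey effort, can be sketched as a small Monte Carlo. The among-track variability figure below is an assumption chosen for illustration, not a value from the paper:

```python
import random
import statistics

random.seed(42)

DENSITY  = 1.29   # animals per km^2 (ringed seals, from the abstract)
TRACK_CV = 0.5    # assumed spatial clumping: CV of density among tracks

def survey_cv(n_flights, reps=2000):
    """Monte Carlo CV of the design: each replicate flies n_flights tracks,
    averages the per-track density estimates, and the reported CV is the
    sd/mean of those averages across replicates."""
    estimates = []
    for _ in range(reps):
        tracks = [max(0.0, random.gauss(DENSITY, TRACK_CV * DENSITY))
                  for _ in range(n_flights)]
        estimates.append(statistics.mean(tracks))
    return statistics.stdev(estimates) / statistics.mean(estimates)

for n in (4, 8, 12):
    print(f"{n} flights: CV = {100 * survey_cv(n):.1f}%")  # ~ TRACK_CV/sqrt(n)
```

Under this assumed clumping level, eight flights bring the CV below the 20% target, echoing the pattern (though not the exact mechanics) of the paper's simulations.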

  20. Assessing the Enhancement of Awareness of Conservation Issues Using Cross-Sectional and Longitudinal Survey Designs.

    ERIC Educational Resources Information Center

    Preston, Guy; Fuggle, Richard

    1987-01-01

    Compares the results of two survey designs administered to visitors of three South African nature reserves to ascertain changes in conservation awareness. Explains the method, data collection, questionnaire design, statistical tests, pretests, pilot studies, results, and conclusions of the study. Finds that no statistical differences between…

  1. Hit by a Perfect Storm? Art & Design in the National Student Survey

    ERIC Educational Resources Information Center

    Yorke, Mantz; Orr, Susan; Blair, Bernadette

    2014-01-01

    There has long been the suspicion amongst staff in Art & Design that the ratings given to their subject disciplines in the UK's National Student Survey are adversely affected by a combination of circumstances--a "perfect storm". The "perfect storm" proposition is tested by comparing ratings for Art & Design with…

  2. Usability Evaluation Survey for Identifying Design Issues in Civil Flight Deck

    NASA Astrophysics Data System (ADS)

    Ozve Aminian, Negin; Izzuddin Romli, Fairuz; Wiriadidjaja, Surjatin

    2016-02-01

    Ergonomic assessment of the cockpit in civil aircraft is important because pilots spend most of each flight in the seating posture imposed by its design. Improper seat design can cause discomfort and pain, which disturb the pilot's concentration in flight. A survey of pilots found several issues with the current cockpit design. This study aims to highlight potential mismatches between the current cockpit design and the ergonomic design recommendations for anthropometric dimensions and seat design, which could be the roots of the problems faced by the pilots in the cockpit.

  3. Laboratory design and test procedures for quantitative evaluation of infrared sensors to assess thermal anomalies

    SciTech Connect

    Chang, Y.M.; Grot, R.A.; Wood, J.T.

    1985-06-01

    This report presents the description of the laboratory apparatus and preliminary results of the quantitative evaluation of three high-resolution and two low-resolution infrared imaging systems. These systems, which are commonly used for building diagnostics, are tested under various background temperatures (from -20°C to 25°C) for their minimum resolvable temperature differences (MRTD) at spatial frequencies from 0.03 to 0.25 cycles per milliradian. The calibration curves of absolute and differential temperature measurements are obtained for three systems. The signal transfer function and line spread function at ambient temperature of another three systems are also measured. Comparisons of the dependence of the MRTD on background temperatures from the measured data with the predicted values given in ASHRAE Standards 101-83 are also included. The dependence of background temperatures for absolute temperature measurements is presented, as well as comparison of measured data and data given by the manufacturer. Horizontal on-axis magnification factors of the geometric transfer function of two systems are also established to calibrate the horizontal axis for the measured line spread function to obtain the modulation transfer function. The variation of the uniformity of horizontal display of these two sensors is also observed. Included are detailed descriptions of laboratory design, equipment setup, and evaluation procedures of each test. 10 refs., 38 figs., 12 tabs.

  4. Quantitative insights towards the design of potent deazaxanthine antagonists of adenosine 2B receptors.

    PubMed

    Paz, Odailson Santos; Brito, Camila Carane Bitencourt; Castilho, Marcelo Santos

    2014-08-01

    Adenosine receptors have been considered as potential targets for drug development, but one of the main obstacles to this goal is to selectively inhibit one receptor subtype over the others. This subject is particularly crucial for adenosine A2b receptor antagonists (AdoRA2B). The structure–activity relationships of xanthine derivatives which are AdoRA2B have been comprehensively investigated, but the steric and electronic requirements of deazaxanthine AdoRA2B have not been described from a quantitative standpoint. Herein we report our efforts to narrow this knowledge gap through 2D-QSAR (HQSAR) and 3D-QSAR (CoMFA) approaches. The good statistical quality (HQSAR: r² = 0.85, q²(LOO) = 0.77; CoMFA: r² = 0.86, q² = 0.70) and predictive ability (r²pred1 = 0.78, r²pred2 = 0.78; and r²pred1 = 0.70, r²pred2 = 0.70, respectively) of the models, along with the information provided by contribution and contour maps, hint at their usefulness for the design of more potent 9-deazaxanthine derivatives.

  5. Feasibility of the grandprogeny design for quantitative trait loci (QTL) detection in purebred beef cattle.

    PubMed

    Moody, D E; Pomp, D; Buchanan, D S

    1997-04-01

    The grandprogeny design (GPD) was developed for dairy cattle to use existing pedigreed populations for quantitative trait locus (QTL) detection. Marker genotypes of grandsires and sons are determined, and trait phenotypic data from grandprogeny are analyzed. The objective of this study was to investigate the potential application of GPD in purebred beef cattle populations. Pedigree structures of Angus (n = 123,319), Hereford (n = 107,778), Brangus (n = 14,449), and Gelbvieh (n = 8,114) sire evaluation reports were analyzed to identify potentially useful families. Power of QTL detection was calculated for a range of QTL effects (.1 to .5 SD) and two Type I error rates (.01 and .001). Reasonable power (> .75) could be achieved using GPD in Angus and Hereford for QTL having moderate effects (.3 SD) on weaning weight and large effects (.4 to .5 SD) on birth, yearling, and maternal weaning weights by genotyping 500 animals. Existing Gelbvieh and Brangus families useful for GPD were limited, and reasonable power could be expected only for QTL having large effects on weaning or birth weights. Although family structures suitable for GPD exist in purebred beef populations, large amounts of genotyping would be required to achieve reasonable power, and only QTL having moderate to large effects could be expected to be identified. PMID:9110205
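The power figures quoted above can be approximated with a standard normal-approximation calculation for a two-class marker contrast. This sketch assumes a simple two-sided z-test on the difference between grandsire-allele classes; it is not the authors' exact method, only an illustration of how effect size, sample size, and Type I error trade off:

```python
from statistics import NormalDist

N = NormalDist()

def power(effect_sd, n_per_class, alpha):
    """Normal-approximation power for a two-class marker contrast: a QTL
    shifts the trait mean by `effect_sd` phenotypic SD between the two
    grandsire-allele classes, with n_per_class grandprogeny in each."""
    z_crit = N.inv_cdf(1 - alpha / 2)               # two-sided Type I error
    z_effect = effect_sd * (n_per_class / 2) ** 0.5  # d / sqrt(2/n)
    return N.cdf(z_effect - z_crit)

# e.g. genotyping 500 animals -> roughly 250 per marker class
for d in (0.1, 0.3, 0.5):
    print(f"effect {d} SD: power = {power(d, 250, 0.01):.2f}")
```

With 250 observations per class and alpha = 0.01, a 0.3 SD effect yields power just above 0.75, consistent with the abstract's "reasonable power" threshold for moderate effects.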

  6. Quantitative clinical nonpulsatile and localized visible light oximeter: design of the T-Stat tissue oximeter

    NASA Astrophysics Data System (ADS)

    Benaron, David A.; Parachikov, Ilian H.; Cheong, Wai-Fung; Friedland, Shai; Duckworth, Joshua L.; Otten, David M.; Rubinsky, Boris E.; Horchner, Uwe B.; Kermit, Eben L.; Liu, Frank W.; Levinson, Carl J.; Murphy, Aileen L.; Price, John W.; Talmi, Yair; Weersing, James P.

    2003-07-01

    We report the development of a general, quantitative, and localized visible light clinical tissue oximeter, sensitive to both hypoxemia and ischemia. Monitor design and operation were optimized over four instrument generations. A range of clinical probes were developed, including non-contact wands, invasive catheters, and penetrating needles with injection ports. Real-time data were collected (a) from probes, standards, and reference solutions to optimize each component, (b) from ex vivo hemoglobin solutions co-analyzed for StO2% and pO2 during deoxygenation, and (c) from normoxic human subject skin and mucosal tissue surfaces. Results show that (a) differential spectroscopy allows extraction of features with minimization of the effects of scattering, (b) in vitro oximetry produces a hemoglobin saturation binding curve of expected sigmoid shape and values, and (c) that monitoring human tissues allows real-time tissue spectroscopic features to be monitored. Unlike with near-infrared (NIRS) or pulse oximetry (SpO2%) methods, we found non-pulsatile, diffusion-based tissue oximetry (StO2%) to work most reliably for non-contact reflectance monitoring and for invasive catheter- or needle-based monitoring, using blue to orange light (475-600 nm). Measured values were insensitive to motion artifact. Down time was non-existent. We conclude that the T-Stat oximeter design is suitable for the collection of spectroscopic data from human subjects, and that the oximeter may have application in the monitoring of regional hemoglobin oxygen saturation in the capillary tissue spaces of human subjects.

  7. A quantitative analysis of clinical trial designs in spinal cord injury based on ICCP guidelines.

    PubMed

    Sorani, Marco D; Beattie, Michael S; Bresnahan, Jacqueline C

    2012-06-10

    Clinical studies of spinal cord injury (SCI) have evolved into multidisciplinary programs that investigate multiple types of neurological deficits and sequelae. In 2007, the International Campaign for Cures of SCI Paralysis (ICCP) proposed best practices for interventional trial designs, end-points, and inclusion criteria. Here we quantitatively assessed the extent to which SCI trials follow ICCP guidelines and reflect the overall patient population. We obtained data for all 288 SCI trials in ClinicalTrials.gov. We calculated summary statistics and observed trends pre-2007 versus 2007 onward. To compare the trial population to the overall SCI population, we obtained statistics from the National SCI Statistical Center. We generated tag clouds to describe heterogeneous trial outcomes. Most interventional studies were randomized (147, 73.1%), and utilized active (55, 36.7%) or placebo controls (49, 32.7%), both increasing trends (p=0.09). Most trials were open label (116, 53.5%), rather than double- (62, 28.6%) or single-blinded (39, 18.0%), but blinding has increased (p=0.01). Tag clouds of outcomes suggest an emphasis on assessment using scores and scales. Inclusion criteria related to American Spinal Injury Association (ASIA) status and neurological level allowed inclusion of most SCI patients. Age inclusion criteria were most commonly 18-65 or older. Consistent with ICCP recommendations, most trials were randomized and controlled, and blinding has increased. Age inclusion criteria skew older than the overall population. ASIA status criteria reflect the population, but neurological lesion criteria could be broadened. Investigators should make trial designs and results available in a complete manner to enable comparisons of populations and outcomes.

  8. Design and prediction of new acetylcholinesterase inhibitor via quantitative structure activity relationship of huprines derivatives.

    PubMed

    Zhang, Shuqun; Hou, Bo; Yang, Huaiyu; Zuo, Zhili

    2016-05-01

    Acetylcholinesterase (AChE) is an important enzyme in the pathogenesis of Alzheimer's disease (AD). Comparative quantitative structure-activity relationship (QSAR) analyses on some huprines inhibitors against AChE were carried out using comparative molecular field analysis (CoMFA), comparative molecular similarity indices analysis (CoMSIA), and hologram QSAR (HQSAR) methods. Three highly predictive QSAR models were constructed successfully based on the training set. The CoMFA, CoMSIA, and HQSAR models have values of r² = 0.988, q² = 0.757, ONC = 6; r² = 0.966, q² = 0.645, ONC = 5; and r² = 0.957, q² = 0.736, ONC = 6, respectively. The predictabilities were validated using external test sets, and the predictive r² values obtained by the three models were 0.984, 0.973, and 0.783, respectively. The analysis was performed by combining the CoMFA and CoMSIA field distributions with the active sites of the AChE to further understand the vital interactions between huprines and the protease. On the basis of the QSAR study, 14 new potent molecules have been designed and six of them are predicted to be more active than the best active compound 24 described in the literature. The final QSAR models could be helpful in design and development of novel active AChE inhibitors.

  9. Effects of sample survey design on the accuracy of classification tree models in species distribution models

    USGS Publications Warehouse

    Edwards, T.C.; Cutler, D.R.; Zimmermann, N.E.; Geiser, L.; Moisen, G.G.

    2006-01-01

    We evaluated the effects of probabilistic (hereafter DESIGN) and non-probabilistic (PURPOSIVE) sample surveys on resultant classification tree models for predicting the presence of four lichen species in the Pacific Northwest, USA. Models derived from both survey forms were assessed using an independent data set (EVALUATION). Measures of accuracy as gauged by resubstitution rates were similar for each lichen species irrespective of the underlying sample survey form. Cross-validation estimates of prediction accuracies were lower than resubstitution accuracies for all species and both design types, and in all cases were closer to the true prediction accuracies based on the EVALUATION data set. We argue that greater emphasis should be placed on calculating and reporting cross-validation accuracy rates rather than simple resubstitution accuracy rates. Evaluation of the DESIGN and PURPOSIVE tree models on the EVALUATION data set shows significantly lower prediction accuracy for the PURPOSIVE tree models relative to the DESIGN models, indicating that non-probabilistic sample surveys may generate models with limited predictive capability. These differences were consistent across all four lichen species, with 11 of the 12 possible species and sample survey type comparisons having significantly lower accuracy rates. Some differences in accuracy were as large as 50%. The classification tree structures also differed considerably both among and within the modelled species, depending on the sample survey form. Overlap in the predictor variables selected by the DESIGN and PURPOSIVE tree models ranged from only 20% to 38%, indicating the classification trees fit the two evaluated survey forms on different sets of predictor variables. The magnitude of these differences in predictor variables throws doubt on ecological interpretation derived from prediction models based on non-probabilistic sample surveys. © 2006 Elsevier B.V. All rights reserved.
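The gap the authors stress between resubstitution and cross-validation accuracy is easy to reproduce. This sketch uses invented presence/absence data and a 1-nearest-neighbour classifier rather than classification trees; the choice of model exaggerates the effect (resubstitution is perfect by construction) but the lesson is the same:

```python
import random

random.seed(3)

# Invented presence/absence data: one environmental predictor; the true
# rule is "present where x > 0.5", with 20% label noise.
xs = [random.random() for _ in range(60)]
ys = [int((x > 0.5) != (random.random() < 0.2)) for x in xs]
data = list(zip(xs, ys))

def knn1(train, x):
    """Predict with the label of the nearest training point."""
    return min(train, key=lambda t: abs(t[0] - x))[1]

# Resubstitution: test on the training data itself (optimistic here).
resub = sum(knn1(data, x) == y for x, y in data) / len(data)
# Leave-one-out cross-validation: hold each point out before predicting it.
loo = sum(knn1(data[:i] + data[i + 1:], x) == y
          for i, (x, y) in enumerate(data)) / len(data)

print(f"resubstitution accuracy : {resub:.2f}")
print(f"cross-validated accuracy: {loo:.2f}")
```

Resubstitution scores 100% regardless of the 20% label noise, while the leave-one-out estimate lands much closer to the accuracy achievable on new data, which is the paper's argument for reporting cross-validated rates.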

  10. Review of quantitative surveys of the length and stability of MTBE, TBA, and benzene plumes in groundwater at UST sites.

    PubMed

    Connor, John A; Kamath, Roopa; Walker, Kenneth L; McHugh, Thomas E

    2015-01-01

    Quantitative information regarding the length and stability condition of groundwater plumes of benzene, methyl tert-butyl ether (MTBE), and tert-butyl alcohol (TBA) has been compiled from thousands of underground storage tank (UST) sites in the United States where gasoline fuel releases have occurred. This paper presents a review and summary of 13 published scientific surveys, of which 10 address benzene and/or MTBE plumes only, and 3 address benzene, MTBE, and TBA plumes. These data show the observed lengths of benzene and MTBE plumes to be relatively consistent among various regions and hydrogeologic settings, with median lengths at a delineation limit of 10 µg/L falling into relatively narrow ranges from 101 to 185 feet for benzene and 110 to 178 feet for MTBE. The observed statistical distributions of MTBE and benzene plumes show the two plume types to be of comparable lengths, with 90th percentile MTBE plume lengths moderately exceeding benzene plume lengths by 16% at a 10-µg/L delineation limit (400 feet vs. 345 feet) and 25% at a 5-µg/L delineation limit (530 feet vs. 425 feet). Stability analyses for benzene and MTBE plumes found 94 and 93% of these plumes, respectively, to be in a nonexpanding condition, and over 91% of individual monitoring wells to exhibit nonincreasing concentration trends. Three published studies addressing TBA found TBA plumes to be of comparable length to MTBE and benzene plumes, with 86% of wells in one study showing nonincreasing concentration trends.
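Summaries like the medians and 90th percentiles above are straightforward to reproduce. The plume lengths below are invented for illustration and merely echo the reported ranges; the point is the percentile computation itself:

```python
import statistics

# Invented plume lengths (feet) at a 10-ug/L delineation limit, chosen
# only to echo the ranges the published surveys report.
benzene = [60, 85, 101, 120, 140, 160, 185, 210, 260, 345]
mtbe    = [70, 95, 110, 130, 150, 178, 200, 240, 300, 400]

def p90(lengths):
    # statistics.quantiles with n=10 returns the nine deciles; the last is P90
    return statistics.quantiles(lengths, n=10)[-1]

for name, lengths in (("benzene", benzene), ("MTBE", mtbe)):
    print(f"{name}: median {statistics.median(lengths)} ft, "
          f"P90 {p90(lengths):.0f} ft")
```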

  11. Study Quality in SLA: An Assessment of Designs, Analyses, and Reporting Practices in Quantitative L2 Research

    ERIC Educational Resources Information Center

    Plonsky, Luke

    2013-01-01

    This study assesses research and reporting practices in quantitative second language (L2) research. A sample of 606 primary studies, published from 1990 to 2010 in "Language Learning" and "Studies in Second Language Acquisition," was collected and coded for designs, statistical analyses, reporting practices, and outcomes (i.e., effect…

  12. Evaluating a Modular Design Approach to Collecting Survey Data Using Text Messages

    PubMed Central

    West, Brady T.; Ghimire, Dirgha; Axinn, William G.

    2015-01-01

    This article presents analyses of data from a pilot study in Nepal that was designed to provide an initial examination of the errors and costs associated with an innovative methodology for survey data collection. We embedded a randomized experiment within a long-standing panel survey, collecting data on a small number of items with varying sensitivity from a probability sample of 450 young Nepalese adults. Survey items ranged from simple demographics to indicators of substance abuse and mental health problems. Sampled adults were randomly assigned to one of three different modes of data collection: 1) a standard one-time telephone interview, 2) a “single sitting” back-and-forth interview with an interviewer using text messaging, and 3) an interview using text messages within a modular design framework (which generally involves breaking the survey response task into distinct parts over a short period of time). Respondents in the modular group were asked to respond (via text message exchanges with an interviewer) to only one question on a given day, rather than complete the entire survey. Both bivariate and multivariate analyses demonstrate that the two text messaging modes increased the probability of disclosing sensitive information relative to the telephone mode, and that respondents in the modular design group, while responding less frequently, found the survey to be significantly easier. Further, those who responded in the modular group were not unique in terms of available covariates, suggesting that the reduced item response rates only introduced limited nonresponse bias. Future research should consider enhancing this methodology, applying it with other modes of data collection (e.g., web surveys), and continuously evaluating its effectiveness from a total survey error perspective. PMID:26322137
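A mode comparison of disclosure rates, of the kind the bivariate analyses above describe, can be sketched with a pooled two-proportion z-test. The counts below are invented for illustration and are not the study's data:

```python
from math import sqrt
from statistics import NormalDist

def two_prop_z(x1, n1, x2, n2):
    """Pooled two-proportion z-test; returns (z, two-sided p-value)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return z, 2 * (1 - NormalDist().cdf(abs(z)))

# Invented counts: respondents disclosing a sensitive behaviour under
# telephone vs. text-message interviewing (150 respondents per mode).
z, p = two_prop_z(30, 150, 52, 150)
print(f"z = {z:.2f}, two-sided p = {p:.4f}")
```

With these hypothetical counts the text-message mode's higher disclosure rate is statistically significant at conventional levels; a multivariate version would add covariates in a logistic regression.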

  13. Application of a Modified Universal Design Survey for Evaluation of Ares 1 Ground Crew Worksites

    NASA Technical Reports Server (NTRS)

    Blume, Jennifer L.

    2010-01-01

    Operability is a driving requirement for NASA's Ares 1 launch vehicle. Launch site ground operations include several operator tasks to prepare the vehicle for launch or to perform maintenance. Ensuring that components requiring operator interaction at the launch site are designed for optimal human use is a high priority for operability. To support design evaluation, the Ares 1 Upper Stage (US) element Human Factors Engineering (HFE) group developed a survey based on the Universal Design approach. Universal Design is a process to create products that can be used effectively by as many people as possible. Universal Design per se is not a priority for Ares 1 because launch vehicle processing is a specialized skill and not akin to a consumer product that should be used by all people of all abilities. However, applying principles of Universal Design will increase the probability of an error free and efficient design which is a priority for Ares 1. The Design Quality Evaluation Survey centers on the following seven principles: (1) Equitable use, (2) Flexibility in use, (3) Simple and intuitive use, (4) Perceptible information, (5) Tolerance for error, (6) Low physical effort, (7) Size and space for approach and use. Each principle is associated with multiple evaluation criteria which were rated with the degree to which the statement is true. All statements are phrased in the most positive form, i.e., the design goal, so that the degree to which judgments tend toward "completely agree" directly reflects the degree to which the design is good. The Design Quality Evaluation Survey was employed for several US analyses. The tool was found to be most useful for comparative judgments as opposed to an assessment of a single design option. It provided a useful piece of additional data when assessing possible operator interfaces or worksites for operability.

  14. National health and nutrition examination survey: sample design, 2011-2014.

    PubMed

    Johnson, Clifford L; Dohrmann, Sylvia M; Burt, Vicki L; Mohadjer, Leyla K

    2014-03-01

    Background: Data collection for the National Health and Nutrition Examination Survey (NHANES) consists of a household screener, an interview, and a physical examination. The screener primarily determines whether any household members are eligible for the interview and examination. Eligibility is established using preset selection probabilities for the desired demographic subdomains. After an eligible sample person is selected, the interview collects person-level demographic, health, and nutrition information, as well as information about the household. The examination includes physical measurements, tests such as hearing and dental examinations, and the collection of blood and urine specimens for laboratory testing. Objectives: This report provides some background on the NHANES program, beginning with the first survey cycle in the 1970s and highlighting significant changes since its inception. The report then describes the broad design specifications for the 2011-2014 survey cycle, including survey objectives, domain and precision specifications, and operational requirements unique to NHANES. The report also describes details of the survey design, including the calculation of sampling rates and sample selection methods. Documentation of survey content, data collection procedures, estimation methods, and methods to assess nonsampling errors are reported elsewhere. PMID:25569458

  15. Final report on the radiological surveys of designated DX firing sites at Los Alamos National Laboratory

    SciTech Connect

    1996-09-09

    CHEMRAD was contracted by Los Alamos National Laboratory to perform USRADS{reg_sign} (UltraSonic Ranging And Data System) radiation scanning surveys at designated DX Sites at the Los Alamos National Laboratory. The primary purpose of these scanning surveys was to identify the presence of Depleted Uranium (D-38) resulting from activities at the DX Firing Sites. This effort was conducted to update the most recent surveys of these areas. This current effort was initiated with site orientation on August 12, 1996. Surveys were completed in the field on September 4, 1996. This Executive Summary briefly presents the major findings of this work. The detailed survey results are presented in the balance of this report and are organized by Technical Area and Site number in section 2. This organization is not in chronological order. USRADS and the related survey methods are described in section 3. Quality Control issues are addressed in section 4. Surveys were conducted with an array of radiation detectors either mounted on a backpack frame for man-carried use (Manual mode) or on a tricycle cart (RadCart mode). The array included radiation detectors for gamma and beta surface and near-surface contamination as well as dose rate at 1 meter above grade. The radiation detectors were interfaced directly to an USRADS 2100 Data Pack.

  16. The Laboratory Course Assessment Survey: A Tool to Measure Three Dimensions of Research-Course Design

    PubMed Central

    Corwin, Lisa A.; Runyon, Christopher; Robinson, Aspen; Dolan, Erin L.

    2015-01-01

    Course-based undergraduate research experiences (CUREs) are increasingly being offered as scalable ways to involve undergraduates in research. Yet few if any design features that make CUREs effective have been identified. We developed a 17-item survey instrument, the Laboratory Course Assessment Survey (LCAS), that measures students’ perceptions of three design features of biology lab courses: 1) collaboration, 2) discovery and relevance, and 3) iteration. We assessed the psychometric properties of the LCAS using established methods for instrument design and validation. We also assessed the ability of the LCAS to differentiate between CUREs and traditional laboratory courses, and found that the discovery and relevance and iteration scales differentiated between these groups. Our results indicate that the LCAS is suited for characterizing and comparing undergraduate biology lab courses and should be useful for determining the relative importance of the three design features for achieving student outcomes. PMID:26466990

  19. Optical Design Trade Study for the Wide Field Infrared Survey Telescope [WFIRST]

    NASA Technical Reports Server (NTRS)

    Content, David A.; Goullioud, R.; Lehan, John P.; Mentzell, John E.

    2011-01-01

    The Wide Field Infrared Survey Telescope (WFIRST) mission concept was ranked first among new space astrophysics missions by the Astro2010 Decadal Survey, incorporating the Joint Dark Energy Mission (JDEM)-Omega payload concept and multiple science white papers. This mission is based on a space telescope at L2 studying exoplanets [via gravitational microlensing], probing dark energy, and surveying the near infrared sky. Since the release of NWNH, the WFIRST project has been working with the WFIRST science definition team (SDT) to refine mission and payload concepts. We present the driving requirements. The current interim reference mission point design, based on the use of a 1.3 m unobscured-aperture three-mirror anastigmat form with focal imaging and slitless spectroscopy science channels, is consistent with the requirements, requires no technology development, and outperforms the JDEM-Omega design.

  20. USING GIS TO GENERATE SPATIALLY-BALANCED RANDOM SURVEY DESIGNS FOR NATURAL RESOURCE APPLICATIONS

    EPA Science Inventory

    Sampling of a population is frequently required to understand trends and patterns in natural resource management because financial and time constraints preclude a complete census. A rigorous probability-based survey design specifies where to sample so that inferences from the sam...

  1. ASSESSING THE ECOLOGICAL CONDITION OF A COASTAL PLAIN WATERSHED USING A PROBABILISTIC SURVEY DESIGN

    EPA Science Inventory

    Using a probabilistic survey design, we assessed the ecological condition of the Florida (USA) portion of the Escambia River watershed using selected environmental and benthic macroinvertebrate data. Macroinvertebrates were sampled at 28 sites during July-August 1996, and 3414 i...

  2. Project SAFE [Survey of Administrative Functional Efficiency]. A Feedback Project Designed to Assist Principals.

    ERIC Educational Resources Information Center

    1980

    After a brief explanation of Project SAFE (Survey of Administrative Functional Efficiency) as a system designed to provide necessary feedback to school principals, the author lists the components of the project: (1) a confidential report to the principal summarizing the results of administering the SAFE instrument in the school, (2) a profile of…

  3. CONDITION ASSESSMENT FOR THE ESCAMBIA RIVER, FL, WATERSHED: BENTHIC MACROINVERTEBRATE SURVEYS USING A PROBABILISTIC SAMPLING DESIGN

    EPA Science Inventory

    Probabilistic sampling has been used to assess the condition of estuarine ecosystems, and the use of this survey design approach was examined for a northwest Florida watershed. Twenty-eight lotic sites within the Escambia River, Florida, watershed were randomly selected and visit...

  4. Why we love or hate our cars: A qualitative approach to the development of a quantitative user experience survey.

    PubMed

    Tonetto, Leandro Miletto; Desmet, Pieter M A

    2016-09-01

    This paper presents a more ecologically valid way of developing theory-based item questionnaires for measuring user experience. In this novel approach, items were generated using the natural and domain-specific language of the research population, which seems to have made the survey much more sensitive to real experiences than purely theory-based surveys. The approach was applied in a survey that measured car experience. Ten in-depth interviews were conducted with drivers inside their cars. The resulting transcripts were analysed with the aim of capturing the natural utterances drivers use to express their car experience. This analysis resulted in 71 categories of answers. For each category, one sentence was selected to serve as a survey item. On an online platform, 538 respondents answered the survey. Data reliability, tested with Cronbach's alpha, was 0.94, suggesting a survey with highly reliable results for measuring drivers' appraisals of their cars. PMID:27184312
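    The abstract reports a Cronbach's alpha of 0.94 as its reliability evidence. As a reminder of what that index measures, here is a minimal sketch of the standard formula applied to made-up item responses (the data below are illustrative, not the study's):

    ```python
    # Cronbach's alpha for a k-item survey: ratio of shared to total
    # score variance. The ratings below are invented for illustration.
    from statistics import pvariance

    def cronbach_alpha(responses):
        """responses: list of respondents, each a list of k item scores."""
        k = len(responses[0])
        items = list(zip(*responses))            # transpose to per-item columns
        totals = [sum(r) for r in responses]     # each respondent's total score
        item_var = sum(pvariance(col) for col in items)
        total_var = pvariance(totals)
        return k / (k - 1) * (1 - item_var / total_var)

    # Hypothetical 5-point ratings from four respondents on three items:
    data = [[4, 5, 4], [2, 2, 3], [5, 4, 5], [3, 3, 3]]
    print(round(cronbach_alpha(data), 3))        # -> 0.916
    ```

    Values near 1 indicate that the items vary together, which is why the paper reads its 0.94 as high internal consistency.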

  6. Screen Design Guidelines for Motivation in Interactive Multimedia Instruction: A Survey and Framework for Designers.

    ERIC Educational Resources Information Center

    Lee, Sung Heum; Boling, Elizabeth

    1999-01-01

    Identifies guidelines from the literature relating to screen design and design of interactive instructional materials. Describes two types of guidelines--those aimed at enhancing motivation and those aimed at preventing loss of motivation--for typography, graphics, color, and animation and audio. Proposes a framework for considering motivation in…

  7. AI/OR computational model for integrating qualitative and quantitative design methods

    NASA Technical Reports Server (NTRS)

    Agogino, Alice M.; Bradley, Stephen R.; Cagan, Jonathan; Jain, Pramod; Michelena, Nestor

    1990-01-01

    A theoretical framework for integrating qualitative and numerical computational methods for optimally-directed design is described. The theory is presented as a computational model and features of implementations are summarized where appropriate. To demonstrate the versatility of the methodology we focus on four seemingly disparate aspects of the design process and their interaction: (1) conceptual design, (2) qualitative optimal design, (3) design innovation, and (4) numerical global optimization.

  8. Lessons Learned in Interdisciplinary Professional Development Designed to Promote the Teaching of Quantitative Literacy

    ERIC Educational Resources Information Center

    Lardner, Emily; Bookman, Jack

    2013-01-01

    In this paper, we will describe the challenges and insights gained from conducting professional development workshops aimed at helping faculty prepare materials to support the development of students' quantitative skills in different disciplinary contexts. We will examine some of the mistakes we made, and misconceptions we had, in conducting…

  9. Surveys and questionnaires in nursing research.

    PubMed

    Timmins, Fiona

    2015-06-17

    Surveys and questionnaires are often used in nursing research to elicit the views of large groups of people to develop the nursing knowledge base. This article provides an overview of survey and questionnaire use in nursing research, clarifies the place of the questionnaire as a data collection tool in quantitative research design and provides information and advice about best practice in the development of quantitative surveys and questionnaires.

  10. Design and synthesis of target-responsive aptamer-cross-linked hydrogel for visual quantitative detection of ochratoxin A.

    PubMed

    Liu, Rudi; Huang, Yishun; Ma, Yanli; Jia, Shasha; Gao, Mingxuan; Li, Jiuxing; Zhang, Huimin; Xu, Dunming; Wu, Min; Chen, Yan; Zhu, Zhi; Yang, Chaoyong

    2015-04-01

    A target-responsive aptamer-cross-linked hydrogel was designed and synthesized for portable and visual quantitative detection of the toxin ochratoxin A (OTA), which occurs in food and beverages. The hydrogel network forms by hybridization between one designed DNA strand containing the OTA aptamer and two complementary DNA strands grafting on linear polyacrylamide chains. Upon the introduction of OTA, the aptamer binds with OTA, leading to the dissociation of the hydrogel, followed by release of the preloaded gold nanoparticles (AuNPs), which can be observed by the naked eye. To enable sensitive visual and quantitative detection, we encapsulated Au@Pt core-shell nanoparticles (Au@PtNPs) in the hydrogel to generate quantitative readout in a volumetric bar-chart chip (V-Chip). In the V-Chip, Au@PtNPs catalyze the decomposition of H2O2 to generate O2, which induces movement of an ink bar to a concentration-dependent distance for visual quantitative readout. Furthermore, to improve the detection limit in complex real samples, we introduced an immunoaffinity column (IAC) of OTA to enrich OTA from beer. After the enrichment, as low as 1.27 nM (0.51 ppb) OTA can be detected by the V-Chip, which satisfies the test requirement (2.0 ppb) set by the European Commission. The integration of a target-responsive hydrogel with portable enrichment by IAC, as well as signal amplification and quantitative readout by a simple microfluidic device, offers a new method for portable detection of the food-safety hazard toxin OTA.
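    The two forms of the detection limit quoted above are consistent: taking ochratoxin A's molar mass as roughly 403.8 g/mol (an assumed literature value for C20H18ClNO6, not stated in the abstract), 1.27 nM converts to about 0.51 µg/L, i.e., 0.51 ppb in a water-like matrix:

    ```python
    # Unit-conversion check: 1.27 nM of OTA expressed in ppb (µg/L,
    # treating the sample density as ~1 kg/L). Molar mass of OTA
    # (C20H18ClNO6) ~403.8 g/mol -- an assumed value for illustration.
    molar_mass = 403.8                            # g/mol
    conc_nM = 1.27
    ug_per_L = conc_nM * 1e-9 * molar_mass * 1e6  # nmol/L -> mol/L -> g/L -> µg/L
    print(round(ug_per_L, 2))                     # -> 0.51, matching the abstract
    ```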

  11. Rationale, design and methodology for the Navajo Health and Nutrition Survey.

    PubMed

    White, L L; Goldberg, H I; Gilbert, T J; Ballew, C; Mendlein, J M; Peter, D G; Percy, C A; Mokdad, A H

    1997-10-01

    As recently as 1990, there was no reservation-wide, population-based health status information about Navajo Indians. To remedy this shortcoming, the Navajo Health and Nutrition Survey was conducted from 1991 to 1992 to assess the health and nutritional status of Navajo Reservation residents using a population-based sample. Using a three-stage design, a representative sample of reservation households was selected for inclusion. All members of selected households 12 y of age and older were invited to participate. A total of 985 people in 459 households participated in the study. Survey protocols were modeled on those of previous national surveys and included a standard blood chemistry profile, complete blood count, oral glucose tolerance test, blood pressure, anthropometric measurements, a single 24-h dietary recall and a questionnaire on health behaviors. The findings from this survey, reported in the accompanying papers, inform efforts to prevent and control chronic disease among the Navajo. Lessons learned from this survey may be of interest to those conducting similar surveys in other American Indian and Alaska Native populations. PMID:9339173

  12. The inclusion of open-ended questions on quantitative surveys of children: Dealing with unanticipated responses relating to child abuse and neglect.

    PubMed

    Lloyd, Katrina; Devine, Paula

    2015-10-01

    Web surveys have been shown to be a viable, and relatively inexpensive, method of data collection with children. For this reason, the Kids' Life and Times (KLT) was developed as an annual online survey of 10- and 11-year-old children. Each year, approximately 4,000 children participate in the survey. Throughout the six years that KLT has been running, a range of questions has been asked that are both policy-relevant and important to the lives of children. Given the method employed by the survey, no extremely sensitive questions that might cause the children distress are included. The majority of questions on KLT are closed, yielding quantitative data that are analysed statistically; however, one regular open-ended question is included at the end of KLT each year so that the children can suggest questions that they think should be asked on the survey the following year. While most of the responses are innocuous, each year a small minority of children suggest questions on child abuse and neglect. This paper reports the responses to this question and reflects on how researchers can, and should, deal with this issue from both a methodological and an ethical perspective.

  13. I.4 Screening Experimental Designs for Quantitative Trait Loci, Association Mapping, Genotype-by Environment Interaction, and Other Investigations

    PubMed Central

    Federer, Walter T.; Crossa, José

    2012-01-01

    Crop breeding programs using conventional approaches, as well as new biotechnological tools, rely heavily on data resulting from the evaluation of genotypes in different environmental conditions (agronomic practices, locations, and years). Statistical methods used for designing field and laboratory trials and for analyzing the data originating from those trials need to be accurate and efficient. The statistical analysis of multi-environment trials (MET) is useful for assessing genotype × environment interaction (GEI), mapping quantitative trait loci (QTLs), and studying QTL × environment interaction (QEI). Large populations are required for scientific study of QEI, and for determining the association between molecular markers and quantitative trait variability. Therefore, appropriate control of local variability through efficient experimental design is of key importance. In this chapter we present and explain several classes of augmented designs useful for achieving control of variability and assessing genotype effects in a practical and efficient manner. A popular procedure for unreplicated designs is the one known as “systematically spaced checks.” Augmented designs contain “c” check or standard treatments replicated “r” times, and “n” new treatments or genotypes included once (usually) in the experiment. PMID:22675304

  14. Median and quantile tests under complex survey design using SAS and R.

    PubMed

    Pan, Yi; Caudill, Samuel P; Li, Ruosha; Caldwell, Kathleen L

    2014-11-01

    Techniques for conducting hypothesis testing on the median and other quantiles of two or more subgroups under complex survey design are limited. In this paper, we introduce programs in both SAS and R to perform such a test. A detailed illustration of the computations, macro variable definitions, input and output for the SAS and R programs are also included in the text. Urinary iodine data from National Health and Nutrition Examination Survey (NHANES) are used as examples for comparing medians between females and males as well as comparing the 75th percentiles among three salt consumption groups.
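    The SAS and R programs themselves are not reproduced in the abstract; the sketch below only illustrates the underlying idea of a design-weighted quantile estimate, where each observation counts in proportion to its sampling weight (all values and weights are hypothetical):

    ```python
    # Design-weighted quantile: the smallest value whose cumulative share
    # of the design weights reaches the target quantile q.
    def weighted_quantile(values, weights, q):
        pairs = sorted(zip(values, weights))
        total = sum(weights)
        cum = 0.0
        for v, w in pairs:
            cum += w
            if cum / total >= q:
                return v
        return pairs[-1][0]

    # Hypothetical urinary iodine values (µg/L) with unequal design weights:
    values = [48, 60, 75, 90, 120, 150]
    weights = [1.0, 2.0, 1.0, 3.0, 1.0, 2.0]
    print(weighted_quantile(values, weights, 0.5))   # design-weighted median
    ```

    The hypothesis tests in the paper additionally require design-based variance estimation, which the SAS macros and R functions handle and this point estimate alone does not.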

  15. The Design of a Novel Survey for Small Objects in the Solar System

    SciTech Connect

    Alcock, C.; Chen, W.P.; de Pater, I.; Lee, T.; Lissauer, J.; Rice, J.; Liang, C.; Cook, K.; Marshall, S.; Akerlof, C.

    2000-08-21

    We evaluated several concepts for a new survey for small objects in the Solar System. We designed a highly novel survey for comets in the outer region of the Solar System, which exploits the occultations of relatively bright stars to infer the presence of otherwise extremely faint objects. The populations and distributions of these objects are not known; the uncertainties span orders of magnitude! These objects are important scientifically as probes of the primordial solar system, and programmatically now that major investments may be made in the possible mitigation of the hazard of asteroid or comet collisions with the Earth.

  16. Estimation of wildlife population ratios incorporating survey design and visibility bias

    USGS Publications Warehouse

    Samuel, M.D.; Steinhorst, R.K.; Garton, E.O.; Unsworth, J.W.

    1992-01-01

    Age and sex ratio statistics are often a key component of the evaluation and management of wildlife populations. These statistics are determined from counts of animals that are commonly plagued by errors associated with either survey design or visibility bias. We present age and sex ratio estimators that incorporate both these sources of error and include the typical situation that animals are sampled in groups. Aerial surveys of elk (Cervus elaphus) in northcentral Idaho illustrate that differential visibility of age or sex classes can produce biased ratio estimates. Visibility models may be used to provide corrected estimates of ratios and their variability that incorporates errors due to sampling, visibility bias, and visibility estimation.
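    The paper's estimators are more elaborate (they also account for sampling error and group-wise observation), but the core idea of correcting a ratio for class-specific visibility can be shown with a toy example; every number below is invented:

    ```python
    # Toy visibility-bias correction (not the paper's estimator): if two
    # age/sex classes are sighted with different probabilities from the
    # air, the raw ratio of counts is biased; dividing each count by its
    # class-specific sighting probability corrects the ratio.
    seen_calves, seen_cows = 30, 100    # raw aerial counts (hypothetical)
    p_calf, p_cow = 0.5, 0.8            # hypothetical sighting probabilities
    raw_ratio = seen_calves / seen_cows
    corrected_ratio = (seen_calves / p_calf) / (seen_cows / p_cow)
    print(round(raw_ratio, 2), round(corrected_ratio, 2))
    ```

    Here the raw calf:cow ratio understates the corrected one because calves are the harder class to sight, which is exactly the kind of differential visibility the elk surveys illustrated.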

  17. The Design of AN Interactive E-Learning Platform for Surveying Exercise

    NASA Astrophysics Data System (ADS)

    Cheng, S.-C.; Shih, P. T. Y.; Chang, S.-L.; Chen, G.-Y.

    2011-09-01

    Surveying exercise is a fundamental course for Civil Engineering students, distinguished by its field operations. This study explores the design of an e-learning platform for the surveying exercise course. The issues of organizing digital content, such as recorded video of standard instrument operation, editing learning materials, constructing a portfolio of the learning process, and generating learning motivation, are discussed. Annotating uploaded videos, publishing articles and commentaries, interactive examination sessions, peer assessment, and mobile device access were found to be useful elements of this platform.

  18. Design of a citizen survey of forest plant injury caused by exposure to ozone

    SciTech Connect

    Morton, B.J.

    1994-12-31

    The North Carolina Environmental Defense Fund has designed a citizen-based survey of forest plant injury caused by exposure to ozone. The first, pilot survey will run for ten weeks in July, August, and September 1994. The surveyors will be trained laypersons who are donating their time and effort. The scientific objective of the survey is to look for and collect evidence on the incidence and severity of ozone injury to the leaves of plants and trees (seedling- and sapling-sized plants) in the forests of western North Carolina. The educational objective is to discuss the facts and meaning of air pollution problems in the southern Appalachian Mountains. The third and equally important objective involves policy dialogue: specifically, the objective is to motivate the surveyors to participate in local and regional forums at which mountain air pollution is on the agenda.

  19. Quantitative Comparison of Minimum Inductance and Minimum Power Algorithms for the Design of Shim Coils for Small Animal Imaging.

    PubMed

    Hudson, Parisa; Hudson, Stephen D; Handler, William B; Scholl, Timothy J; Chronik, Blaine A

    2010-04-01

    High-performance shim coils are required for high-field magnetic resonance imaging and spectroscopy. Complete sets of high-power and high-performance shim coils were designed using two different methods: the minimum inductance and the minimum power target field methods. A quantitative comparison of shim performance in terms of merit of inductance (ML) and merit of resistance (MR) was made for shim coils designed using the minimum inductance and the minimum power design algorithms. In each design case, the difference in ML and the difference in MR given by the two design methods was <15%. Comparison of wire patterns obtained using the two design algorithms show that minimum inductance designs tend to feature oscillations within the current density; while minimum power designs tend to feature less rapidly varying current densities and lower power dissipation. Overall, the differences in coil performance obtained by the two methods are relatively small. For the specific case of shim systems customized for small animal imaging, the reduced power dissipation obtained when using the minimum power method is judged to be more significant than the improvements in switching speed obtained from the minimum inductance method.

  20. Design tool survey. IEA Solar Heating and Cooling - Task 8. Passive and hybrid solar low-energy buildings, Subtask C: design methods

    SciTech Connect

    Rittelmann, P.R.; Ahmed, S.F.

    1985-05-01

    This document presents the results of a survey of design tools conducted as part of Subtask C (Design Methods) of Task VIII of the IEA Solar Heating and Cooling Program. At the start of the task, the participants agreed that it would be useful to identify and characterize the various design tools which existed for predicting the energy performance of passive and hybrid solar low energy buildings. A standard survey form was adopted, and Subtask C representatives from the member countries collected and submitted information on the design tools in use in each country. These responses were compiled into the present survey document.

  1. A survey of scientific literacy to provide a foundation for designing science communication in Japan.

    PubMed

    Kawamoto, Shishin; Nakayama, Minoru; Saijo, Miki

    2013-08-01

    There are various definitions and survey methods for scientific literacy. Taking into consideration the contemporary significance of scientific literacy, we have defined it with an emphasis on its social aspects. To acquire the insights needed to design a form of science communication that will enhance the scientific literacy of each individual, we conducted a large-scale random survey within Japan of individuals older than 18 years, using a printed questionnaire. The data thus acquired were analyzed using factor analysis and cluster analysis to create a 3-factor/4-cluster model of people's interest and attitude toward science, technology and society and their resulting tendencies. Differences were found among the four clusters in terms of the three factors: scientific factor, social factor, and science-appreciating factor. We propose a plan for designing a form of science communication that is appropriate to this current status of scientific literacy in Japan.

  2. Complementary methods of system usability evaluation: surveys and observations during software design and development cycles.

    PubMed

    Horsky, Jan; McColgan, Kerry; Pang, Justine E; Melnikas, Andrea J; Linder, Jeffrey A; Schnipper, Jeffrey L; Middleton, Blackford

    2010-10-01

    Poor usability of clinical information systems delays their adoption by clinicians and limits potential improvements to the efficiency and safety of care. Recurring usability evaluations are therefore integral to the system design process. We compared four methods employed during the development of outpatient clinical documentation software: clinician email response, online survey, observations and interviews. Results suggest that no single method identifies all or most problems. Rather, each approach is optimal for evaluations at a different stage of design and characterizes a different usability aspect. Email responses elicited from clinicians and surveys report mostly technical, biomedical, terminology and control problems and are most effective when a working prototype has been completed. Observations of clinical work and interviews inform conceptual and workflow-related problems and are best performed early in the cycle. Appropriate use of these methods consistently during development may significantly improve system usability and contribute to higher adoption rates among clinicians and to improved quality of care. PMID:20546936

  3. Creel survey sampling designs for estimating effort in short-duration Chinook salmon fisheries

    USGS Publications Warehouse

    McCormick, Joshua L.; Quist, Michael C.; Schill, Daniel J.

    2013-01-01

    Chinook Salmon Oncorhynchus tshawytscha sport fisheries in the Columbia River basin are commonly monitored using roving creel survey designs and require precise, unbiased catch estimates. The objective of this study was to examine the relative bias and precision of total catch estimates using various sampling designs to estimate angling effort under the assumption that mean catch rate was known. We obtained information on angling populations based on direct visual observations of portions of Chinook Salmon fisheries in three Idaho river systems over a 23-d period. Based on the angling population, Monte Carlo simulations were used to evaluate the properties of effort and catch estimates for each sampling design. All sampling designs evaluated were relatively unbiased. Systematic random sampling (SYS) resulted in the most precise estimates. The SYS and simple random sampling designs had mean square error (MSE) estimates that were generally half of those observed with cluster sampling designs. The SYS design was more efficient (i.e., higher accuracy per unit cost) than a two-cluster design. Increasing the number of clusters available for sampling within a day decreased the MSE of estimates of daily angling effort, but the MSE of total catch estimates was variable depending on the fishery. The results of our simulations provide guidelines on the relative influence of sample sizes and sampling designs on parameters of interest in short-duration Chinook Salmon fisheries.
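    The study's finding that systematic random sampling (SYS) roughly halves the MSE of simple random sampling can be reproduced in spirit with a toy Monte Carlo (this is not the study's actual simulation; the daily effort curve, sample sizes, and estimator below are invented for illustration):

    ```python
    # Toy Monte Carlo: estimate total daily angling effort from n
    # instantaneous counts, comparing simple random sampling (SRS)
    # with systematic random sampling (SYS) of count times.
    import random

    random.seed(1)
    HOURS = 12
    effort = [2, 4, 7, 10, 12, 14, 15, 13, 10, 7, 4, 2]  # anglers per hour (invented)
    true_total = sum(effort)                             # 100 angler-hours
    n = 4                                                # counts per day

    def estimate(sample_idx):
        # expansion estimator: scale the sampled counts up to the full day
        return HOURS / n * sum(effort[i] for i in sample_idx)

    def srs_draw():
        return estimate(random.sample(range(HOURS), n))

    def sys_draw():
        step = HOURS // n                                # one count every 3 hours
        start = random.randrange(step)
        return estimate(range(start, HOURS, step))

    def mse(draws):
        return sum((d - true_total) ** 2 for d in draws) / len(draws)

    reps = 20000
    mse_srs = mse([srs_draw() for _ in range(reps)])
    mse_sys = mse([sys_draw() for _ in range(reps)])
    print(mse_sys < mse_srs)                             # -> True
    ```

    Because effort follows a smooth daily curve, every systematic sample spans the whole day and its estimates cluster tightly around the true total, while SRS can by chance land only on slow (or only on busy) hours.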

  4. Fitting multilevel models in complex survey data with design weights: Recommendations

    PubMed Central

    2009-01-01

    Background Multilevel models (MLM) offer complex survey data analysts a unique approach to understanding individual and contextual determinants of public health. However, little summarized guidance exists with regard to fitting MLM in complex survey data with design weights. Simulation work suggests that analysts should scale design weights using two methods and fit the MLM using unweighted and scaled-weighted data. This article examines the performance of scaled-weighted and unweighted analyses across a variety of MLM and software programs. Methods Using data from the 2005–2006 National Survey of Children with Special Health Care Needs (NS-CSHCN: n = 40,723) that collected data from children clustered within states, I examine the performance of scaling methods across outcome type (categorical vs. continuous), model type (level-1, level-2, or combined), and software (Mplus, MLwiN, and GLLAMM). Results Scaled weighted estimates and standard errors differed slightly from unweighted analyses, agreeing more with each other than with unweighted analyses. However, observed differences were minimal and did not lead to different inferential conclusions. Likewise, results demonstrated minimal differences across software programs, increasing confidence in results and inferential conclusions independent of software choice. Conclusion If including design weights in MLM, analysts should scale the weights and use software that properly includes the scaled weights in the estimation. PMID:19602263
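
    The two weight-scaling methods referred to above are commonly implemented as scaling cluster-member weights to sum to the cluster sample size, or to the cluster's effective sample size. A minimal sketch (illustrative weights, not the NS-CSHCN data):

```python
def scale_weights_size(w):
    # Method A: scaled weights sum to the cluster's sample size n_j.
    n, s = len(w), sum(w)
    return [wi * n / s for wi in w]

def scale_weights_effective(w):
    # Method B: scaled weights sum to the effective cluster size,
    # (sum w)^2 / sum w^2.
    s = sum(w)
    s2 = sum(wi ** 2 for wi in w)
    return [wi * s / s2 for wi in w]

w = [1.0, 2.0, 3.0, 4.0]   # hypothetical level-1 design weights in one cluster
a = scale_weights_size(w)       # sums to 4 (the cluster sample size)
b = scale_weights_effective(w)  # sums to 100/30, the effective size
```

Either set of scaled weights would then be passed to MLM software (e.g., Mplus, MLwiN, or GLLAMM) that supports weighted estimation.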

  5. An Eclectic Qualitative-Quantitative Research Design for the Study of Affective-Cognitive Learning.

    ERIC Educational Resources Information Center

    Beauchamp, Darrell G.; Braden, Roberts A.

    This study used an eclectic, qualitative research design to explore the effects of visual and verbal variables on affective response and cognitive learning in four different groups of students. The four design imperatives of the study were: (1) both of the primary learning senses (sight and hearing) had to be included in the study; (2) the inquiry…

  6. Quantitative evaluation of water bodies dynamic by means of thermal infrared and multispectral surveys on the Venetian lagoon

    NASA Technical Reports Server (NTRS)

    Alberotanza, L.; Lechi, G. M.

    1977-01-01

    Surveys employing a two channel Daedalus infrared scanner and multispectral photography were performed. The spring waning tide, the velocity of the water mass, and the types of suspended matter were among the topics studied. Temperature, salinity, sediment transport, and ebb stream velocity were recorded. The bottom topography was correlated with the dynamic characteristics of the sea surface.

  7. Design and Implementation Issues in Surveying the Views of Young Children in Ethnolinguistically Diverse Developing Country Contexts

    ERIC Educational Resources Information Center

    Smith, Hilary A.; Haslett, Stephen J.

    2016-01-01

    This paper discusses issues in the development of a methodology appropriate for eliciting sound quantitative data from primary school children in the complex contexts of ethnolinguistically diverse developing countries. Although these issues often occur in field-based surveys, the large extent and compound effects of their occurrence in…

  8. Quantitation of active pharmaceutical ingredients and excipients in powder blends using designed multivariate calibration models by near-infrared spectroscopy.

    PubMed

    Li, Weiyong; Worosila, Gregory D

    2005-05-13

    This research note demonstrates the simultaneous quantitation of a pharmaceutical active ingredient and three excipients in a simulated powder blend containing acetaminophen, Prosolv and Crospovidone. An experimental design approach was used in generating a 5-level (%, w/w) calibration sample set that included 125 samples. The samples were prepared by weighing suitable amounts of powder into separate 20-mL scintillation vials and were mixed manually. Partial least squares (PLS) regression was used in calibration model development. The models generated accurate results for quantitation of Crospovidone (at 5%, w/w) and magnesium stearate (at 0.5%, w/w). Further testing of the models demonstrated that the 2-level models were as effective as the 5-level ones, which reduced the calibration sample number to 50. The models had a small bias for quantitation of acetaminophen (at 30%, w/w) and Prosolv (at 64.5%, w/w) in the blend. The implication of the bias is discussed.

  9. Quantitatively mapping cellular viscosity with detailed organelle information via a designed PET fluorescent probe.

    PubMed

    Liu, Tianyu; Liu, Xiaogang; Spring, David R; Qian, Xuhong; Cui, Jingnan; Xu, Zhaochao

    2014-01-01

    Viscosity is a fundamental physical parameter that influences diffusion in biological processes. The distribution of intracellular viscosity is highly heterogeneous, and it is challenging to obtain a full map of cellular viscosity with detailed organelle information. In this work, we report 1 as the first fluorescent viscosity probe which is able to quantitatively map cellular viscosity with detailed organelle information based on the PET mechanism. This probe exhibited a significant ratiometric fluorescence intensity enhancement as solvent viscosity increased. The emission intensity increase was attributed to combined effects of the inhibition of PET due to restricted conformational access (favorable for FRET, but not for PET), and the decreased PET efficiency caused by viscosity-dependent twisted intramolecular charge transfer (TICT). A full map of subcellular viscosity was successfully constructed via fluorescent ratiometric detection and fluorescence lifetime imaging; it was found that lysosomal regions in a cell possess the highest viscosity, followed by mitochondrial regions.

  10. Real-time quantitative PCR for the design of lentiviral vector analytical assays.

    PubMed

    Delenda, C; Gaillard, C

    2005-10-01

    From the recent and emerging concerns about approving lentiviral vector-mediated gene transfer in human clinical applications, several analytical methods have been applied in preclinical models to address the lentiviral vector load in batches, cells or tissues. This review points out the oldest generation methods (blots, RT activity, standard PCR) as well as a full description of the newest real-time quantitative PCR (qPCR) applications. Combinations of primer and probe sequences that have worked in the lentiviral amplification context have been included in an effort to compile an exhaustive list. Also, because great variations have been observed among interlaboratory results, we have attempted to compare the different analytical methods that have been used to consider (i) the titration of lentiviral vector batches, (ii) the absence of susceptible emerging replicative lentiviruses or (iii) the lentiviral vector biodistribution in the organism.

  11. A quantitative method for groundwater surveillance monitoring network design at the Hanford Site

    SciTech Connect

    Meyer, P.D.

    1993-12-01

    As part of the Environmental Surveillance Program at the Hanford Site, mandated by the US Department of Energy, hundreds of groundwater wells are sampled each year, with each sample typically analyzed for a variety of constituents. The groundwater sampling program must satisfy several broad objectives. These objectives include an integrated assessment of the condition of groundwater and the identification and quantification of existing, emerging, or potential groundwater problems. Several quantitative network design objectives are proposed and a mathematical optimization model is developed from these objectives. The model attempts to find minimum cost network alternatives that maximize the amount of information generated by the network. Information is measured both by the rate of change of contaminant concentration with respect to time and by the uncertainty in contaminant concentration. In an application to tritium monitoring at the Hanford Site, both information measures were derived from historical data using time series analysis.
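
    The report develops a formal optimization model; as a loose illustration of the cost/information trade-off it balances, here is a greedy knapsack-style heuristic that picks wells by information per unit cost until a sampling budget is spent. This is not the report's model, and all well names, scores, and costs are hypothetical:

```python
def select_wells(wells, budget):
    """Greedy sketch: choose wells by information-per-cost ratio
    until the sampling budget is exhausted.
    wells: list of (name, information_score, sampling_cost)."""
    chosen, spent = [], 0.0
    for name, info, cost in sorted(wells, key=lambda w: w[1] / w[2],
                                   reverse=True):
        if spent + cost <= budget:
            chosen.append(name)
            spent += cost
    return chosen

# Hypothetical wells: (name, information score, annual sampling cost).
wells = [("W1", 9.0, 3.0), ("W2", 4.0, 1.0),
         ("W3", 6.0, 4.0), ("W4", 2.0, 2.0)]
picked = select_wells(wells, budget=5.0)
```

A real network-design model would also account for spatial redundancy between wells, which a per-well greedy score ignores.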

  12. Quantitatively Mapping Cellular Viscosity with Detailed Organelle Information via a Designed PET Fluorescent Probe

    NASA Astrophysics Data System (ADS)

    Liu, Tianyu; Liu, Xiaogang; Spring, David R.; Qian, Xuhong; Cui, Jingnan; Xu, Zhaochao

    2014-06-01

    Viscosity is a fundamental physical parameter that influences diffusion in biological processes. The distribution of intracellular viscosity is highly heterogeneous, and it is challenging to obtain a full map of cellular viscosity with detailed organelle information. In this work, we report 1 as the first fluorescent viscosity probe which is able to quantitatively map cellular viscosity with detailed organelle information based on the PET mechanism. This probe exhibited a significant ratiometric fluorescence intensity enhancement as solvent viscosity increased. The emission intensity increase was attributed to combined effects of the inhibition of PET due to restricted conformational access (favorable for FRET, but not for PET), and the decreased PET efficiency caused by viscosity-dependent twisted intramolecular charge transfer (TICT). A full map of subcellular viscosity was successfully constructed via fluorescent ratiometric detection and fluorescence lifetime imaging; it was found that lysosomal regions in a cell possess the highest viscosity, followed by mitochondrial regions.

  13. Microbial-based evaluation of foaming events in full-scale wastewater treatment plants by microscopy survey and quantitative image analysis.

    PubMed

    Leal, Cristiano; Amaral, António Luís; Costa, Maria de Lourdes

    2016-08-01

    Activated sludge systems are prone to be affected by foaming occurrences causing the sludge to rise in the reactor and affecting the wastewater treatment plant (WWTP) performance. Nonetheless, there is currently a knowledge gap hindering the development of foaming events prediction tools that may be fulfilled by the quantitative monitoring of AS systems biota and sludge characteristics. As such, the present study focuses on the assessment of foaming events in full-scale WWTPs, by quantitative protozoa, metazoa, filamentous bacteria, and sludge characteristics analysis, further used to enlighten the inner relationships between these parameters. In the current study, a conventional activated sludge system (CAS) and an oxidation ditch (OD) were surveyed throughout a period of 2 and 3 months, respectively, regarding their biota and sludge characteristics. The biota community was monitored by microscopic observation, and a new filamentous bacteria index was developed to quantify their occurrence. Sludge characteristics (aggregated and filamentous biomass contents and aggregate size) were determined by quantitative image analysis (QIA). The obtained data was then processed by principal components analysis (PCA), cross-correlation analysis, and decision trees to assess the foaming occurrences, and enlighten the inner relationships. It was found that such events were best assessed by the combined use of the relative abundance of testate amoeba and nocardioform filamentous index, presenting a 92.9 % success rate for overall foaming events, and 87.5 and 100 %, respectively, for persistent and mild events. PMID:27130343

  14. Wide-Field InfraRed Survey Telescope (WFIRST) Slitless Spectrometer: Design, Prototype, and Results

    NASA Technical Reports Server (NTRS)

    Gong, Qian; Content, David; Dominguez, Margaret; Emmett, Thomas; Griesmann, Ulf; Hagopian, John; Kruk, Jeffrey; Marx, Catherine; Pasquale, Bert; Wallace, Thomas; Whipple, Arthur

    2016-01-01

    The slitless spectrometer plays an important role in the Wide-Field InfraRed Survey Telescope (WFIRST) mission for the survey of emission-line galaxies. This will be an unprecedented very wide field, HST-quality 3D survey of emission-line galaxies. The concept of the compound grism as a slitless spectrometer has been presented previously. This paper briefly discusses the challenges and solutions of the optical design and recent specification updates, as well as a brief comparison between the prototype and the latest design. However, the emphasis of this paper is the progress of the grism prototype: the fabrication and testing of the complicated diffractive optical elements and powered prism, as well as grism assembly alignment and testing, especially how different tools and methods, such as IR phase-shift and wavelength-shift interferometry, were used to complete the element and assembly tests. The paper also presents very encouraging recent results from element and assembly tests. Finally, we briefly touch on the planned path forward for testing spectral characteristics such as spectral resolution and response.

  15. Large-visual-angle microstructure inspired from quantitative design of Morpho butterflies' lamellae deviation using the FDTD/PSO method.

    PubMed

    Wang, Wanlin; Zhang, Wang; Chen, Weixin; Gu, Jiajun; Liu, Qinglei; Deng, Tao; Zhang, Di

    2013-01-15

    The wide angular range of the treelike structure in Morpho butterfly scales was investigated by finite-difference time-domain (FDTD)/particle-swarm-optimization (PSO) analysis. Using the FDTD method, different parameters in the Morpho butterflies' treelike structure were studied and their contributions to the angular dependence were analyzed. Then a wide angular range was realized with the PSO method by quantitatively designing the lamellae deviation (Δy), a crucial parameter for the angular range. The field map of the wide-range reflection in a large area was given to confirm the wide angular range. The tristimulus values and corresponding color coordinates for various viewing directions were calculated to confirm the blue color at different observation angles. The wide angular range realized by the FDTD/PSO method will assist us in understanding the scientific principles involved and also in designing artificial optical materials.

  16. Loop Shaping Control Design for a Supersonic Propulsion System Model Using Quantitative Feedback Theory (QFT) Specifications and Bounds

    NASA Technical Reports Server (NTRS)

    Connolly, Joseph W.; Kopasakis, George

    2010-01-01

    This paper covers the propulsion system component modeling and controls development of an integrated mixed compression inlet and turbojet engine that will be used for an overall vehicle Aero-Propulso-Servo-Elastic (APSE) model. Using previously created nonlinear component-level propulsion system models, a linear integrated propulsion system model and loop shaping control design have been developed. The design includes both inlet normal shock position control and jet engine rotor speed control for a potential supersonic commercial transport. A preliminary investigation of the impacts of the aero-elastic effects on the incoming flow field to the propulsion system is discussed; however, the focus here is on developing a methodology for the propulsion controls design that prevents unstart in the inlet and minimizes the thrust oscillation experienced by the vehicle. Quantitative Feedback Theory (QFT) specifications and bounds, and aspects of classical loop shaping are used in the control design process. Model uncertainty is incorporated in the design to address possible error in the system identification mapping of the nonlinear component models into the integrated linear model.

  17. The Proteome of Human Liver Peroxisomes: Identification of Five New Peroxisomal Constituents by a Label-Free Quantitative Proteomics Survey

    PubMed Central

    Ofman, Rob; Bunse, Christian; Pawlas, Magdalena; Hayen, Heiko; Eisenacher, Martin; Stephan, Christian; Meyer, Helmut E.; Waterham, Hans R.; Erdmann, Ralf; Wanders, Ronald J.; Warscheid, Bettina

    2013-01-01

    The peroxisome is a key organelle of low abundance that fulfils various functions essential for human cell metabolism. Severe genetic diseases in humans are caused by defects in peroxisome biogenesis or deficiencies in the function of single peroxisomal proteins. To improve our knowledge of this important cellular structure, we studied for the first time human liver peroxisomes by quantitative proteomics. Peroxisomes were isolated by differential and Nycodenz density gradient centrifugation. A label-free quantitative study of 314 proteins across the density gradient was accomplished using high resolution mass spectrometry. By pairing statistical data evaluation, cDNA cloning and in vivo colocalization studies, we report the association of five new proteins with human liver peroxisomes. Among these, isochorismatase domain containing 1 protein points to the existence of a new metabolic pathway and hydroxysteroid dehydrogenase like 2 protein is likely involved in the transport or β-oxidation of fatty acids in human peroxisomes. The detection of alcohol dehydrogenase 1A suggests the presence of an alternative alcohol-oxidizing system in hepatic peroxisomes. In addition, lactate dehydrogenase A and malate dehydrogenase 1 partially associate with human liver peroxisomes and enzyme activity profiles support the idea that NAD+ becomes regenerated during fatty acid β-oxidation by alternative shuttling processes in human peroxisomes involving lactate dehydrogenase and/or malate dehydrogenase. Taken together, our data represent a valuable resource for future studies of peroxisome biochemistry that will advance research of human peroxisomes in health and disease. PMID:23460848

  18. Detection limits of quantitative and digital PCR assays and their influence in presence-absence surveys of environmental DNA

    USGS Publications Warehouse

    Hunter, Margaret; Dorazio, Robert M.; Butterfield, John S.; Meigs-Friend, Gaia; Nico, Leo; Ferrante, Jason

    2016-01-01

    A set of universal guidelines is needed to determine the limit of detection (LOD) in PCR-based analyses of low concentration DNA. In particular, environmental DNA (eDNA) studies require sensitive and reliable methods to detect rare and cryptic species through shed genetic material in environmental samples. Current strategies for assessing detection limits of eDNA are either too stringent or subjective, possibly resulting in biased estimates of species’ presence. Here, a conservative LOD analysis grounded in analytical chemistry is proposed to correct for overestimated DNA concentrations predominantly caused by the concentration plateau, a nonlinear relationship between expected and measured DNA concentrations. We have used statistical criteria to establish formal mathematical models for both quantitative and droplet digital PCR. To assess the method, a new Grass Carp (Ctenopharyngodon idella) TaqMan assay was developed and tested on both PCR platforms using eDNA in water samples. The LOD adjustment reduced Grass Carp occupancy and detection estimates while increasing uncertainty – indicating that caution needs to be applied to eDNA data without LOD correction. Compared to quantitative PCR, digital PCR had higher occurrence estimates due to increased sensitivity and dilution of inhibitors at low concentrations. Without accurate LOD correction, species occurrence and detection probabilities based on eDNA estimates are prone to a source of bias that cannot be reduced by an increase in sample size or PCR replicates. Other applications, such as GMO food analysis and forensic and clinical diagnostics, could also benefit from a standardized LOD.
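
    The authors fit formal statistical models; a much simpler operational LOD definition that is common in PCR work — the lowest standard concentration at which at least 95% of replicates amplify — can be sketched as follows. The dilution-series numbers are invented for illustration and are not the study's data or its models:

```python
def limit_of_detection(replicates, threshold=0.95):
    """Lowest concentration whose detection rate meets the threshold.

    replicates: dict mapping concentration (e.g. copies/uL) to a list of
    True/False amplification calls from a dilution series."""
    qualifying = [
        conc for conc, hits in replicates.items()
        if sum(hits) / len(hits) >= threshold
    ]
    return min(qualifying) if qualifying else None

# Hypothetical dilution-series results: 20 replicates per level.
series = {
    100.0: [True] * 20,
    10.0: [True] * 19 + [False],   # 95% detection
    1.0: [True] * 12 + [False] * 8,  # 60% detection, below threshold
}
lod = limit_of_detection(series)
```

The abstract's point is that this kind of rule, applied without correcting for the concentration plateau, can still overstate occupancy at low concentrations.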

  19. "Intelligent design" of a 3D reflection survey for the SAFOD drill-hole site

    NASA Astrophysics Data System (ADS)

    Alvarez, G.; Hole, J. A.; Klemperer, S. L.; Biondi, B.; Imhof, M.

    2003-12-01

    SAFOD seeks to better understand the earthquake process by drilling through the San Andreas fault (SAF) to sample an earthquake in situ. To capitalize fully on the opportunities presented by the 1D drill-hole into a complex fault zone we must characterize the surrounding 3D geology at a scale commensurate with the drilling observations, to provide the structural context to extrapolate 1D drilling results along the fault plane and into the surrounding 3D volume. Excellent active-2D and passive-3D seismic observations completed and underway lack the detailed 3D resolution required. Only an industry-quality 3D reflection survey can provide c. 25 m subsurface sample-spacing horizontally and vertically. A 3D reflection survey will provide subsurface structural and stratigraphic control at the 100-m level, mapping major geologic units, structural boundaries, and subsurface relationships between the many faults that make up the SAF fault system. A principal objective should be a reflection-image (horizon-slice through the 3D volume) of the near-vertical fault plane(s) to show variations in physical properties around the drill-hole. Without a 3D reflection image of the fault zone, we risk interpreting drilled anomalies as ubiquitous properties of the fault, or risk missing important anomalies altogether. Such a survey cannot be properly costed or technically designed without major planning. "Intelligent survey design" can minimize source and receiver effort without compromising data-quality at the fault target. Such optimization can in principle reduce the cost of a 3D seismic survey by a factor of two or three, utilizing the known surface logistic constraints, partially-known sub-surface velocity field, and the suite of scientific targets at SAFOD. Our methodology poses the selection of the survey parameters as an optimization process that allows the parameters to vary spatially in response to changes in the subsurface. The acquisition geometry is locally optimized for

  20. Comprehension and Recall of Internet News: A Quantitative Study of Web Page Design.

    ERIC Educational Resources Information Center

    Berry, D. Leigh

    This experimental study examined the effects of multimedia on Internet news readers, in particular focusing on Web site design and its effect on comprehension and recall of news stories. Subjects (84 undergraduate students) viewed one of two versions of the same Web site--one with multimedia and one without. The Web site consisted of six stories…

  1. Using Cramer-Rao theory as a spectrometer design tool aimed at quantitative complex-spectrum analysis

    NASA Astrophysics Data System (ADS)

    Klinkhamer, J. F. F.; van der Valk, N. C. J.; von Hellermann, M. G.; Jaspers, R.

    2008-10-01

    In this paper the use of the Cramer-Rao lower bound (CRLB) to aid in the design of diagnostic systems dealing with complex spectra is discussed. Specifically it is discussed how a priori knowledge, used to improve the estimation, can be included in the CRLB analysis. This provides improved predictions for the spectrometer characteristics that will lead to an optimal system.

  2. Simulation of complete seismic surveys for evaluation of experiment design and processing

    SciTech Connect

    Oezdenvar, T.; McMechan, G.A.; Chaney, P.

    1996-03-01

    Synthesis of complete seismic survey data sets allows analysis and optimization of all stages in an acquisition/processing sequence. The characteristics of available survey designs, parameter choices, and processing algorithms may be evaluated prior to field acquisition to produce a composite system in which all stages have compatible performance; this maximizes the cost effectiveness for a given level of accuracy, or for targets with specific characteristics. Data sets synthesized for three salt structures provide representative comparisons of time and depth migration, post-stack and prestack processing, and illustrate effects of varying recording aperture and shot spacing, iterative focusing analysis, and the interaction of migration algorithms with recording aperture. A final example demonstrates successful simulation of both 2-D acquisition and processing of a real data line over a salt pod in the Gulf of Mexico.

  3. Sampling effects on the identification of roadkill hotspots: Implications for survey design.

    PubMed

    Santos, Sara M; Marques, J Tiago; Lourenço, André; Medinas, Denis; Barbosa, A Márcia; Beja, Pedro; Mira, António

    2015-10-01

    Although locating wildlife roadkill hotspots is essential to mitigate road impacts, the influence of study design on hotspot identification remains uncertain. We evaluated how sampling frequency affects the accuracy of hotspot identification, using a dataset of vertebrate roadkills (n = 4427) recorded over a year of daily surveys along 37 km of roads. "True" hotspots were identified using this baseline dataset, as the 500-m segments where the number of road-killed vertebrates exceeded the upper 95% confidence limit of the mean, assuming a Poisson distribution of road-kills per segment. "Estimated" hotspots were identified likewise, using datasets representing progressively lower sampling frequencies, which were produced by extracting data from the baseline dataset at appropriate time intervals (1-30 days). Overall, 24.3% of segments were "true" hotspots, concentrating 40.4% of roadkills. For different groups, "true" hotspots accounted from 6.8% (bats) to 29.7% (small birds) of road segments, concentrating from <40% (frogs and toads, snakes) to >60% (lizards, lagomorphs, carnivores) of roadkills. Spatial congruence between "true" and "estimated" hotspots declined rapidly with increasing time interval between surveys, due primarily to increasing false negatives (i.e., missing "true" hotspots). There were also false positives (i.e., wrong "estimated" hotspots), particularly at low sampling frequencies. Spatial accuracy decay with increasing time interval between surveys was higher for smaller-bodied (amphibians, reptiles, small birds, small mammals) than for larger-bodied species (birds of prey, hedgehogs, lagomorphs, carnivores). Results suggest that widely used surveys at weekly or longer intervals may produce poor estimates of roadkill hotspots, particularly for small-bodied species. Surveying daily or at two-day intervals may be required to achieve high accuracy in hotspot identification for multiple species. PMID:26232568
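
    One reading of the hotspot rule described above — flag 500-m segments whose roadkill counts exceed the upper 95% limit implied by a Poisson model of road-kills per segment — can be sketched as follows. The segment counts are invented for illustration, and the 95th percentile of a Poisson with the observed mean is used as a stand-in for the paper's confidence limit:

```python
import math

def poisson_upper_limit(mean, level=0.95):
    # Smallest k such that the Poisson(mean) CDF at k reaches `level`,
    # computed by accumulating probability mass terms.
    term = math.exp(-mean)
    k, cdf = 0, term
    while cdf < level:
        k += 1
        term *= mean / k
        cdf += term
    return k

def find_hotspots(counts, level=0.95):
    # Flag segments whose count exceeds the Poisson upper limit
    # computed from the mean count across all segments.
    mean = sum(counts) / len(counts)
    limit = poisson_upper_limit(mean, level)
    return [i for i, c in enumerate(counts) if c > limit]

# Hypothetical roadkills per 500-m segment over a survey period.
counts = [2, 3, 1, 9, 2, 0, 4, 12, 3, 2]
hot = find_hotspots(counts)  # indices of flagged segments
```

The paper's "estimated" hotspots were obtained by applying the same rule to thinned datasets, which is what drives the false-negative and false-positive rates it reports.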

  5. Implementing the World Mental Health Survey Initiative in Portugal – rationale, design and fieldwork procedures

    PubMed Central

    2013-01-01

    Background The World Mental Health Survey Initiative was designed to evaluate the prevalence, the correlates, the impact and the treatment patterns of mental disorders. This paper describes the rationale and the methodological details regarding the implementation of the survey in Portugal, a country that still lacks representative epidemiological data about psychiatric disorders. Methods The World Mental Health Survey is a cross-sectional study with a representative sample of the Portuguese population, aged 18 or older, based on official census information. The WMH-Composite International Diagnostic Interview, adapted to the Portuguese language by a group of bilingual experts, was used to evaluate the mental health status, disorder severity, impairment, use of services and treatment. Interviews were administered face-to-face at respondent’s dwellings, which were selected from a nationally representative multi-stage clustered area probability sample of households. The survey was administered using computer-assisted personal interview methods by trained lay interviewers. Data quality was strictly controlled in order to ensure the reliability and validity of the collected information. Results A total of 3,849 people completed the main survey, with 2,060 completing the long interview, with a response rate of 57.3%. Data cleaning was conducted in collaboration with the WMHSI Data Analysis Coordination Centre at the Department of Health Care Policy, Harvard Medical School. Collected information will provide lifetime and 12-month mental disorders diagnoses, according to the International Classification of Diseases and to the Diagnostic and Statistical Manual of Mental Disorders. Conclusions The findings of this study could have a major influence on mental health care policy planning efforts over the next years, especially in a country that still has a significant level of unmet needs regarding mental health services organization, delivery of care and epidemiological

  6. Designing, Testing, and Validating an Attitudinal Survey on an Environmental Topic: A Groundwater Pollution Survey Instrument for Secondary School Students

    ERIC Educational Resources Information Center

    Lacosta-Gabari, Idoya; Fernandez-Manzanal, Rosario; Sanchez-Gonzalez, Dolores

    2009-01-01

    Research in environmental attitudes' assessment has significantly increased in recent years. The development of specific attitude scales for specific environmental problems has often been proposed. This paper describes the Groundwater Pollution Test (GPT), a 19-item survey instrument using a Likert-type scale. The survey has been used with…

  7. Conference Discussion: The Challenges in Multi-Object Spectroscopy Instrument and Survey Design, and in Data Processing and Analysis

    NASA Astrophysics Data System (ADS)

    Balcells, M.; Skillen, I.

    2016-10-01

    The final session of the conference Multi-Object Spectroscopy in the Next Decade: Big Questions, Large Surveys, and Wide Fields, held in La Palma 2-6 March 2015, was devoted to a discussion of the challenges in designing and operating the next-generation survey spectrographs, and planning and carrying out their massive surveys. The wide-ranging 1.5-hour debate was recorded on video tape, and in this paper we report the edited transcription of the dialog.

  8. Design and Performance Considerations for the Quantitative Measurement of HEU Residues Resulting from 99Mo Production

    SciTech Connect

    McElroy, Robert Dennis; Chapman, Jeffrey Allen; Bogard, James S; Belian, Anthony P

    2011-01-01

    Molybdenum-99 is produced by the irradiation of high-enriched uranium (HEU) resulting in the accumulation of large quantities of HEU residues. In general, these residues are not recycled but are either disposed of or stored in containers with surface exposure rates as high as 100 R/h. The 235U content of these waste containers must be quantified for both accountability and waste disposal purposes. The challenges of quantifying such difficult-to-assay materials are discussed, along with performance estimates for each of several potential assay options. In particular, the design and performance of a High Activity Active Well Coincidence Counting (HA-AWCC) system designed and built specifically for these irradiated HEU waste materials are presented.

  9. Campsite survey implications for managing designated campsites at Great Smoky Mountains National Park

    USGS Publications Warehouse

    Marion, J.L.; Leung, Y.-F.; Kulhavy, D.L.; Legg, M.H.

    1998-01-01

    Backcountry campsites and shelters in Great Smoky Mountains National Park were surveyed in 1993 as part of a new impact monitoring program. A total of 395 campsites and shelters were located and assessed, including 309 legal campsites located at 84 designated campgrounds, 68 illegal campsites, and 18 shelters. Primary campsite management problems identified by the survey include: (1) campsite proliferation, (2) campsite expansion and excessive size, (3) excessive vegetation loss and soil exposure, (4) lack of visitor solitude at campsites, (5) excessive tree damage, and (6) illegal camping. A number of potential management options are recommended to address the identified campsite management problems. Many problems are linked to the ability of visitors to determine the location and number of individual campsites within each designated campground. A principal recommendation is that managers apply site-selection criteria to existing and potential new campsite locations to identify and designate campsites that will resist and constrain the areal extent of impacts and enhance visitor solitude. Educational solutions are also offered.

  10. Steam turbine blades: considerations in design and a survey of blade failures

    SciTech Connect

    Bates, R.C.; Heymann, F.J.; Swaminathan, V.P.; Cunningham, J.W.

    1981-08-01

    Thermo-mechanical considerations and material selection criteria for the design of steam turbine blades are discussed from the mechanical engineer's point of view in the first two sections of this report. Sources of vibratory excitation, the response of blades to these excitations, the stress levels and load histories that result from this response, and various design features incorporated into steam turbine LP blading to minimize or resist these stresses are covered. Blading alloy properties of concern to the blade designer are discussed and compared, and parameters to be used in fatigue testing are recommended. The third section of the report describes several blade failure surveys. In addition to a literature survey, results of a questionnaire on LP blade failures sent to American utilities and a review of recent Westinghouse experience are presented. Correlations between the number of failures and parameters such as blade life, failure location in the turbine and on the blades, alloy, deposit chemistry, steam source, feedwater treatment, cooling water source, balance of plant problems, and temperature and pressure at the failed row are attempted.

  11. Injury survey of a non-traditional 'soft-edged' trampoline designed to lower equipment hazards.

    PubMed

    Eager, David B; Scarrott, Carl; Nixon, Jim; Alexander, Keith

    2013-01-01

    In Australia trampolines contribute one quarter of all childhood play equipment injuries. The objective of this study was to gather and evaluate injury data from a non-traditional, 'soft-edged', consumer trampoline, where the design aimed to minimise injuries from the equipment and from falling off. The manufacturer of the non-traditional trampoline provided the University of Technology Sydney with their Australian customer database. The study involved surveys in Queensland and New South Wales, between May 2007 and March 2010. Injury data were gathered initially through a phone-interview pilot study and then, in the full study, through an email survey. The 3817 respondents were the carers of child users of the 'soft-edge' trampolines. Responses were compared with Australian and US emergency department data. In both countries the proportion of injuries caused by the equipment and falling off was compared with the proportion caused by the jumpers to themselves or each other. The comparisons showed a significantly lower proportion resulted from falling off or hitting the equipment for this design when compared to traditional trampolines, both in Australia and the US. This research concludes that equipment-induced and falling-off injuries, the more severe injuries on traditional trampolines, can be significantly reduced with appropriate trampoline design. PMID:22471672
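
    The kind of proportion comparison reported above can be sketched with a standard two-proportion z-test. The injury counts below are invented for illustration and are not the study's data:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided z-test comparing two sample proportions, e.g. the
    fraction of injuries caused by the equipment or falling off in two
    trampoline designs.  Uses the pooled-proportion standard error."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    return z, p_value

# Hypothetical counts: 30 of 200 injuries equipment/fall-related on the
# soft-edge design vs 90 of 200 on a traditional design.
z, p = two_proportion_z(30, 200, 90, 200)
print(z < 0 and p < 0.05)  # True: significantly lower proportion
```

A negative z with a small p-value indicates the first design's proportion is significantly lower, which is the form of the comparison the abstract describes.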

  12. Design Effects and Generalized Variance Functions for the 1990-91 Schools and Staffing Survey (SASS). Volume II. Technical Report.

    ERIC Educational Resources Information Center

    Salvucci, Sameena; And Others

    This technical report provides the results of a study on the calculation and use of generalized variance functions (GVFs) and design effects for the 1990-91 Schools and Staffing Survey (SASS). The SASS is a periodic integrated system of sample surveys conducted by the National Center for Education Statistics (NCES) that produces sampling variances…

  13. Improving the design of acoustic and midwater trawl surveys through stratification, with an application to Lake Michigan prey fishes

    USGS Publications Warehouse

    Adams, J.V.; Argyle, R.L.; Fleischer, G.W.; Curtis, G.L.; Stickel, R.G.

    2006-01-01

    Reliable estimates of fish biomass are vital to the management of aquatic ecosystems and their associated fisheries. Acoustic and midwater trawl surveys are an efficient sampling method for estimating fish biomass in large bodies of water. To improve the precision of biomass estimates from combined acoustic and midwater trawl surveys, sampling effort should be optimally allocated within each stage of the survey design. Based on information collected during fish surveys, we developed an approach to improve the design of combined acoustic and midwater trawl surveys through stratification. Geographic strata for acoustic surveying and depth strata for midwater trawling were defined using neighbor-restricted cluster analysis, and the optimal allocation of sampling effort for each was then determined. As an example, we applied this survey stratification approach to data from lakewide acoustic and midwater trawl surveys of Lake Michigan prey fishes. Precision of biomass estimates from surveys with and without geographic stratification was compared through resampling. Use of geographic stratification with optimal sampling allocation reduced the variance of Lake Michigan acoustic biomass estimates by 77%. Stratification and optimal allocation at each stage of an acoustic and midwater trawl survey should serve to reduce the variance of the resulting biomass estimates.
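
    The optimal allocation of effort across strata can be illustrated with Neyman allocation, where sampling units are assigned in proportion to stratum size times stratum variability. The stratum sizes and standard deviations below are hypothetical, not Lake Michigan values:

```python
# Neyman (optimal) allocation: n_h = n * N_h * S_h / sum_j(N_j * S_j),
# i.e. effort goes preferentially to large, highly variable strata.

def neyman_allocation(total_n, sizes, std_devs):
    """Allocate total_n sampling units across strata in proportion to
    stratum size times stratum standard deviation."""
    weights = [N * S for N, S in zip(sizes, std_devs)]
    total = sum(weights)
    return [round(total_n * w / total) for w in weights]

# Three hypothetical geographic strata: sampleable transect units per
# stratum and acoustic-density standard deviations.
sizes = [120, 80, 50]
std_devs = [4.0, 9.0, 2.0]
print(neyman_allocation(60, sizes, std_devs))  # most effort to stratum 2
```

Simple rounding can miss the total by a unit when weights are less convenient; production code would apply a largest-remainder adjustment after rounding.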

  14. Recommended survey designs for occupancy modelling using motion-activated cameras: insights from empirical wildlife data

    PubMed Central

    Lewis, Jesse S.; Gerber, Brian D.

    2014-01-01

    Motion-activated cameras are a versatile tool that wildlife biologists can use for sampling wild animal populations to estimate species occurrence. Occupancy modelling provides a flexible framework for the analysis of these data; explicitly recognizing that given a species occupies an area the probability of detecting it is often less than one. Despite the number of studies using camera data in an occupancy framework, there is only limited guidance from the scientific literature about survey design trade-offs when using motion-activated cameras. A fuller understanding of these trade-offs will allow researchers to maximise available resources and determine whether the objectives of a monitoring program or research study are achievable. We use an empirical dataset collected from 40 cameras deployed across 160 km² of the Western Slope of Colorado, USA to explore how survey effort (number of cameras deployed and the length of sampling period) affects the accuracy and precision (i.e., error) of the occupancy estimate for ten mammal and three virtual species. We do this using a simulation approach where species occupancy and detection parameters were informed by empirical data from motion-activated cameras. A total of 54 survey designs were considered by varying combinations of sites (10–120 cameras) and occasions (20–120 survey days). Our findings demonstrate that increasing total sampling effort generally decreases error associated with the occupancy estimate, but changing the number of sites or sampling duration can have very different results, depending on whether a species is spatially common or rare (occupancy = ψ) and easy or hard to detect when available (detection probability = p). For rare species with a low probability of detection (i.e., raccoon and spotted skunk) the required survey effort includes maximizing the number of sites and the number of survey days, often to a level that may be logistically unrealistic for many studies. For common species with

  15. Recommended survey designs for occupancy modelling using motion-activated cameras: insights from empirical wildlife data.

    PubMed

    Shannon, Graeme; Lewis, Jesse S; Gerber, Brian D

    2014-01-01

    Motion-activated cameras are a versatile tool that wildlife biologists can use for sampling wild animal populations to estimate species occurrence. Occupancy modelling provides a flexible framework for the analysis of these data; explicitly recognizing that given a species occupies an area the probability of detecting it is often less than one. Despite the number of studies using camera data in an occupancy framework, there is only limited guidance from the scientific literature about survey design trade-offs when using motion-activated cameras. A fuller understanding of these trade-offs will allow researchers to maximise available resources and determine whether the objectives of a monitoring program or research study are achievable. We use an empirical dataset collected from 40 cameras deployed across 160 km² of the Western Slope of Colorado, USA to explore how survey effort (number of cameras deployed and the length of sampling period) affects the accuracy and precision (i.e., error) of the occupancy estimate for ten mammal and three virtual species. We do this using a simulation approach where species occupancy and detection parameters were informed by empirical data from motion-activated cameras. A total of 54 survey designs were considered by varying combinations of sites (10-120 cameras) and occasions (20-120 survey days). Our findings demonstrate that increasing total sampling effort generally decreases error associated with the occupancy estimate, but changing the number of sites or sampling duration can have very different results, depending on whether a species is spatially common or rare (occupancy = ψ) and easy or hard to detect when available (detection probability = p). For rare species with a low probability of detection (i.e., raccoon and spotted skunk) the required survey effort includes maximizing the number of sites and the number of survey days, often to a level that may be logistically unrealistic for many studies. For common species with
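
    The survey-effort trade-off described above can be sketched with a small Monte Carlo simulation. This is not the authors' simulation framework; for simplicity the estimator below assumes the detection probability p is known, whereas a real occupancy analysis estimates ψ and p jointly by maximum likelihood:

```python
import random

def simulate_occupancy(n_sites, n_occasions, psi, p, n_reps=2000, seed=1):
    """Monte Carlo RMSE of a simple occupancy estimator.  Each site is
    occupied with probability psi; at occupied sites each occasion
    detects the species with probability p.  The naive proportion of
    sites with >=1 detection is corrected by the probability of at
    least one detection, p* = 1 - (1 - p)**K."""
    random.seed(seed)
    p_star = 1 - (1 - p) ** n_occasions
    sq_err = 0.0
    for _ in range(n_reps):
        detected = 0
        for _ in range(n_sites):
            if random.random() < psi:         # site occupied
                if random.random() < p_star:  # detected at least once
                    detected += 1
        psi_hat = (detected / n_sites) / p_star
        sq_err += (psi_hat - psi) ** 2
    return (sq_err / n_reps) ** 0.5

# A rare, hard-to-detect species (hypothetical psi and p): adding sites
# and survey days should shrink the error of the occupancy estimate.
small = simulate_occupancy(n_sites=20, n_occasions=20, psi=0.2, p=0.05)
large = simulate_occupancy(n_sites=120, n_occasions=60, psi=0.2, p=0.05)
print(small > large)  # True: more effort, lower RMSE
```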

  16. Requirements and concept design for large earth survey telescope for SEOS

    NASA Technical Reports Server (NTRS)

    Mailhot, P.; Bisbee, J.

    1975-01-01

    The efforts of a one-year program of Requirements Analysis and Conceptual Design for the Large Earth Survey Telescope for the Synchronous Earth Observatory Satellite are summarized. A 1.4 meter aperture Cassegrain telescope with a 0.6 deg field of view is shown to satisfy the observational requirements for a wide range of earth resources and meteorological applications. The telescope provides imagery or thermal mapping in ten spectral bands at one time in a field-sharing grouping of linear detector arrays. Pushbroom scanning is accomplished by spacecraft slew.

  17. 77 FR 71600 - Medicare Program; Request for Information To Aid in the Design and Development of a Survey...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-03

    ... HUMAN SERVICES Centers for Medicare & Medicaid Services Medicare Program; Request for Information To Aid in the Design and Development of a Survey Regarding Patient Experiences With Emergency Department Care AGENCY: Centers for Medicare & Medicaid Services (CMS), HHS. ACTION: Request for...

  18. Design of the South East Asian Nutrition Survey (SEANUTS): a four-country multistage cluster design study.

    PubMed

    Schaafsma, Anne; Deurenberg, Paul; Calame, Wim; van den Heuvel, Ellen G H M; van Beusekom, Christien; Hautvast, Jo; Sandjaja; Bee Koon, Poh; Rojroongwasinkul, Nipa; Le Nguyen, Bao Khanh; Parikh, Panam; Khouw, Ilse

    2013-09-01

    Nutrition is a well-known factor in the growth, health and development of children. It is also acknowledged that worldwide many people have dietary imbalances resulting in over- or undernutrition. In 2009, the multinational food company FrieslandCampina initiated the South East Asian Nutrition Survey (SEANUTS), a combination of surveys carried out in Indonesia, Malaysia, Thailand and Vietnam, to get a better insight into these imbalances. The present study describes the general study design and methodology, as well as some problems and pitfalls encountered. In each of these countries, participants in the age range of 0·5-12 years were recruited according to a multistage cluster randomised or stratified random sampling methodology. Field teams took care of recruitment and data collection. For the health status of children, growth and body composition, physical activity, bone density, and development and cognition were measured. For nutrition, food intake and food habits were assessed by questionnaires, whereas in subpopulations blood and urine samples were collected to measure the biochemical status parameters of Fe, vitamins A and D, and DHA. In Thailand, the researchers additionally studied the lipid profile in blood, whereas in Indonesia iodine excretion in urine was analysed. Biochemical data were analysed in certified laboratories. Study protocols and methodology were aligned where practically possible. In December 2011, data collection was finalised. In total, 16,744 children participated in the present study. Information that will be very relevant for formulating nutritional health policies, as well as for designing innovative food and nutrition research and development programmes, has become available. PMID:24016763

  19. Design and quantitative analysis of parametrisable eFPGA-architectures for arithmetic

    NASA Astrophysics Data System (ADS)

    Neumann, B.; von Sydow, T.; Blume, H.; Noll, T. G.

    2006-09-01

    Future SoCs will feature embedded FPGAs (eFPGAs) to enable flexible and efficient implementations of high-throughput digital signal processing applications. Current research projects on and emerging products containing FPGAs are mainly based on "standard FPGA"-architectures that are optimised for a very wide range of applications. The implementation costs of these FPGAs are dominated by a very complex interconnect network. This paper presents a method to improve the efficiency of eFPGAs by tailoring them for a certain application domain using a parametrisable architecture template derived from the results of a systematic evaluation of the requirements of the application domain. Two different architectures are discussed, a reference architecture to illustrate the methodology and possible optimisation measures as well as a specialised arithmetic-oriented eFPGA for applications like correlators, decoders, and filters. For the arithmetic-oriented architecture, a novel logic element (LE) and a special interconnect architecture that was designed with respect to the connectivity characteristics of regular datapaths, are presented. For both architecture templates, physically optimised implementations based on an automatic design approach have been created. As a first cost comparison of these implementations with standard FPGAs, the LE-density (number of logic elements per mm²) is evaluated. For the arithmetic-oriented architecture, the LE-density could be increased by an order of magnitude compared to standard architectures.

  20. Flow bioreactor design for quantitative measurements over endothelial cells using micro-particle image velocimetry

    NASA Astrophysics Data System (ADS)

    Leong, Chia Min; Voorhees, Abram; Nackman, Gary B.; Wei, Timothy

    2013-04-01

    Mechanotransduction in endothelial cells (ECs) is a highly complex process through which cells respond to changes in hemodynamic loading by generating biochemical signals involving gene and protein expression. To study the effects of mechanical loading on ECs in a controlled fashion, different in vitro devices have been designed to simulate or replicate various aspects of these physiological phenomena. This paper describes the design, use, and validation of a flow chamber which allows for spatially and temporally resolved micro-particle image velocimetry measurements of endothelial surface topography and stresses over living ECs immersed in pulsatile flow. This flow chamber also allows the study of co-cultures (i.e., ECs and smooth muscle cells) and the effect of different substrates (i.e., coverslip and/or polyethylene terephthalate (PET) membrane) on cellular response. In this report, the results of steady and pulsatile flow on fixed endothelial cells seeded on PET membrane and coverslip, respectively, are presented. Surface topography of ECs is computed from multiple two-dimensional flow measurements. The distributions of shear stress and wall pressure on each individual cell are also determined and the importance of both types of stress in cell remodeling is highlighted.

  1. Flow bioreactor design for quantitative measurements over endothelial cells using micro-particle image velocimetry.

    PubMed

    Leong, Chia Min; Voorhees, Abram; Nackman, Gary B; Wei, Timothy

    2013-04-01

    Mechanotransduction in endothelial cells (ECs) is a highly complex process through which cells respond to changes in hemodynamic loading by generating biochemical signals involving gene and protein expression. To study the effects of mechanical loading on ECs in a controlled fashion, different in vitro devices have been designed to simulate or replicate various aspects of these physiological phenomena. This paper describes the design, use, and validation of a flow chamber which allows for spatially and temporally resolved micro-particle image velocimetry measurements of endothelial surface topography and stresses over living ECs immersed in pulsatile flow. This flow chamber also allows the study of co-cultures (i.e., ECs and smooth muscle cells) and the effect of different substrates (i.e., coverslip and/or polyethylene terephthalate (PET) membrane) on cellular response. In this report, the results of steady and pulsatile flow on fixed endothelial cells seeded on PET membrane and coverslip, respectively, are presented. Surface topography of ECs is computed from multiple two-dimensional flow measurements. The distributions of shear stress and wall pressure on each individual cell are also determined and the importance of both types of stress in cell remodeling is highlighted.
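
    The wall shear stresses mentioned above come from near-wall velocity gradients measured by PIV. A minimal sketch, assuming a linear near-wall profile and no-slip at the wall (all values illustrative, not from the study):

```python
def wall_shear_stress(y, u, mu):
    """Estimate wall shear stress tau_w = mu * du/dy at the wall from
    near-wall PIV velocity samples.  With no slip (u = 0 at y = 0), a
    least-squares line through the origin gives the velocity gradient:
    slope = sum(y*u) / sum(y*y)."""
    slope = sum(yi * ui for yi, ui in zip(y, u)) / sum(yi * yi for yi in y)
    return mu * slope

# Hypothetical near-wall samples: wall-normal positions (m), streamwise
# velocities (m/s), and the dynamic viscosity of water (Pa*s).
y = [5e-6, 10e-6, 15e-6]
u = [1.0e-3, 2.0e-3, 3.0e-3]
mu = 1.0e-3
print(wall_shear_stress(y, u, mu))  # tau_w in Pa
```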

  2. HIV testing during the Canadian immigration medical examination: a national survey of designated medical practitioners.

    PubMed

    Tran, Jennifer M; Li, Alan; Owino, Maureen; English, Ken; Mascarenhas, Lyndon; Tan, Darrell H S

    2014-01-01

    HIV testing is mandatory for individuals wishing to immigrate to Canada. Since the Designated Medical Practitioners (DMPs) who perform these tests may have varying experience in HIV and time constraints in their clinical practices, there may be variability in the quality of pre- and posttest counseling provided. We surveyed DMPs regarding HIV testing, counseling, and immigration inadmissibility. A 16-item survey was mailed to all DMPs across Canada (N = 203). The survey inquired about DMP characteristics, knowledge of HIV, attitudes and practices regarding inadmissibility and counseling, and interest in continuing medical education. There were a total of 83 respondents (41%). Participants frequently rated their knowledge of HIV diagnostics, cultural competency, and HIV/AIDS service organizations as "fair" (40%, 43%, and 44%, respectively). About 25%, 46%, and 11% of the respondents agreed/strongly agreed with the statements "HIV infected individuals pose a danger to public health and safety," "HIV-positive immigrants cause excessive demand on the healthcare system," and "HIV seropositivity is a reasonable ground for denial into Canada," respectively. Language was cited as a barrier to counseling, which focused on transmission risks (46% discussed this as "always" or "often") more than coping and social support (37%). There was a high level of interest (47%) in continuing medical education in this area. There are areas for improvement regarding DMPs' knowledge, attitudes, and practices about HIV infection, counseling, and immigration criteria. Continuing medical education and support for DMPs to facilitate practice changes could benefit newcomers who test positive through the immigration process. PMID:25029636

  3. HIV testing during the Canadian immigration medical examination: a national survey of designated medical practitioners.

    PubMed

    Tran, Jennifer M; Li, Alan; Owino, Maureen; English, Ken; Mascarenhas, Lyndon; Tan, Darrell H S

    2014-01-01

    HIV testing is mandatory for individuals wishing to immigrate to Canada. Since the Designated Medical Practitioners (DMPs) who perform these tests may have varying experience in HIV and time constraints in their clinical practices, there may be variability in the quality of pre- and posttest counseling provided. We surveyed DMPs regarding HIV testing, counseling, and immigration inadmissibility. A 16-item survey was mailed to all DMPs across Canada (N = 203). The survey inquired about DMP characteristics, knowledge of HIV, attitudes and practices regarding inadmissibility and counseling, and interest in continuing medical education. There were a total of 83 respondents (41%). Participants frequently rated their knowledge of HIV diagnostics, cultural competency, and HIV/AIDS service organizations as "fair" (40%, 43%, and 44%, respectively). About 25%, 46%, and 11% of the respondents agreed/strongly agreed with the statements "HIV infected individuals pose a danger to public health and safety," "HIV-positive immigrants cause excessive demand on the healthcare system," and "HIV seropositivity is a reasonable ground for denial into Canada," respectively. Language was cited as a barrier to counseling, which focused on transmission risks (46% discussed this as "always" or "often") more than coping and social support (37%). There was a high level of interest (47%) in continuing medical education in this area. There are areas for improvement regarding DMPs' knowledge, attitudes, and practices about HIV infection, counseling, and immigration criteria. Continuing medical education and support for DMPs to facilitate practice changes could benefit newcomers who test positive through the immigration process.

  4. Statistical methods for efficient design of community surveys of response to noise: Random coefficients regression models

    NASA Technical Reports Server (NTRS)

    Tomberlin, T. J.

    1985-01-01

    Research studies of residents' responses to noise consist of interviews with samples of individuals who are drawn from a number of different compact study areas. The statistical techniques developed here provide a basis for such sample design decisions, and they are suitable for a wide range of sample survey applications. A sample may consist of a random sample of residents selected from a sample of compact study areas, or in a more complex design, of a sample of residents selected from a sample of larger areas (e.g., cities). The techniques may be applied to estimates of the effects on annoyance of noise level, numbers of noise events, the time-of-day of the events, ambient noise levels, or other factors. Methods are provided for determining, in advance, how accurately these effects can be estimated for different sample sizes and study designs. Using a simple cost function, they also provide for optimum allocation of the sample across the stages of the design for estimating these effects. These techniques are developed via a regression model in which the regression coefficients are assumed to be random, with components of variance associated with the various stages of a multi-stage sample design.
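
    For a two-stage design of this kind, the classic budget-constrained optimum for the number of residents interviewed per study area can be sketched as follows; all variance components and costs below are hypothetical:

```python
import math

def optimal_cluster_size(var_between, var_within, cost_area, cost_resident):
    """Optimal residents per study area for a two-stage sample with
    Var(estimate) = var_between/m + var_within/(m*n) and total cost
    C = cost_area*m + cost_resident*m*n.  Minimizing variance for a
    fixed budget gives n_opt = sqrt((var_within / var_between) *
    (cost_area / cost_resident))."""
    return math.sqrt((var_within / var_between) * (cost_area / cost_resident))

# Hypothetical variance components for annoyance scores, plus the cost
# of setting up a study area and of one interview.
n_opt = optimal_cluster_size(var_between=0.5, var_within=4.0,
                             cost_area=400.0, cost_resident=25.0)
print(round(n_opt))  # interviews per study area
```

The same square-root form arises whenever a two-component variance is minimized under a linear cost constraint; the number of study areas m then follows from the budget.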

  5. THE DEEP2 GALAXY REDSHIFT SURVEY: DESIGN, OBSERVATIONS, DATA REDUCTION, AND REDSHIFTS

    SciTech Connect

    Newman, Jeffrey A.; Cooper, Michael C.; Davis, Marc; Faber, S. M.; Guhathakurta, Puragra; Koo, David C.; Phillips, Andrew C.; Conroy, Charlie; Harker, Justin J.; Lai, Kamson; Dutton, Aaron A.; Finkbeiner, Douglas P.; Gerke, Brian F.; Rosario, David J.; Weiner, Benjamin J.; Willmer, C. N. A.; Yan, Renbin; Kassin, Susan A.; Konidaris, N. P.; and others

    2013-09-15

    We describe the design and data analysis of the DEEP2 Galaxy Redshift Survey, the densest and largest high-precision redshift survey of galaxies at z ≈ 1 completed to date. The survey was designed to conduct a comprehensive census of massive galaxies, their properties, environments, and large-scale structure down to absolute magnitude M_B = -20 at z ≈ 1 via ≈90 nights of observation on the Keck telescope. The survey covers an area of 2.8 deg² divided into four separate fields observed to a limiting apparent magnitude of R_AB = 24.1. Objects with z ≲ 0.7 are readily identifiable using BRI photometry and rejected in three of the four DEEP2 fields, allowing galaxies with z > 0.7 to be targeted ≈2.5 times more efficiently than in a purely magnitude-limited sample. Approximately 60% of eligible targets are chosen for spectroscopy, yielding nearly 53,000 spectra and more than 38,000 reliable redshift measurements. Most of the targets that fail to yield secure redshifts are blue objects that lie beyond z ≈ 1.45, where the [O II] 3727 Å doublet lies in the infrared. The DEIMOS 1200 line mm⁻¹ grating used for the survey delivers high spectral resolution (R ≈ 6000), accurate and secure redshifts, and unique internal kinematic information. Extensive ancillary data are available in the DEEP2 fields, particularly in the Extended Groth Strip, which has evolved into one of the richest multiwavelength regions on the sky. This paper is intended as a handbook for users of the DEEP2 Data Release 4, which includes all DEEP2 spectra and redshifts, as well as for the DEEP2 DEIMOS data reduction pipelines. Extensive details are provided on object selection, mask design, biases in target selection and redshift measurements, the spec2d two-dimensional data-reduction pipeline, the spec1d automated redshift pipeline, and the zspec visual redshift verification process, along with examples of instrumental signatures or

  6. The DEEP2 Galaxy Redshift Survey: Design, Observations, Data Reduction, and Redshifts

    NASA Technical Reports Server (NTRS)

    Newman, Jeffrey A.; Cooper, Michael C.; Davis, Marc; Faber, S. M.; Coil, Alison L.; Guhathakurta, Puragra; Koo, David C.; Phillips, Andrew C.; Conroy, Charlie; Dutton, Aaron A.; Finkbeiner, Douglas P.; Gerke, Brian F.; Rosario, David J.; Weiner, Benjamin J.; Willmer, C. N. A.; Yan, Renbin; Harker, Justin J.; Kassin, Susan A.; Konidaris, N. P.; Lai, Kamson; Madgwick, Darren S.; Noeske, K. G.; Wirth, Gregory D.; Kirby, Evan N.; Lotz, Jennifer M.

    2013-01-01

    We describe the design and data analysis of the DEEP2 Galaxy Redshift Survey, the densest and largest high-precision redshift survey of galaxies at z approx. 1 completed to date. The survey was designed to conduct a comprehensive census of massive galaxies, their properties, environments, and large-scale structure down to absolute magnitude MB = -20 at z approx. 1 via approx. 90 nights of observation on the Keck telescope. The survey covers an area of 2.8 Sq. deg divided into four separate fields observed to a limiting apparent magnitude of R(sub AB) = 24.1. Objects with z approx. < 0.7 are readily identifiable using BRI photometry and rejected in three of the four DEEP2 fields, allowing galaxies with z > 0.7 to be targeted approx. 2.5 times more efficiently than in a purely magnitude-limited sample. Approximately 60% of eligible targets are chosen for spectroscopy, yielding nearly 53,000 spectra and more than 38,000 reliable redshift measurements. Most of the targets that fail to yield secure redshifts are blue objects that lie beyond z approx. 1.45, where the [O II] 3727 Å doublet lies in the infrared. The DEIMOS 1200 line mm(exp -1) grating used for the survey delivers high spectral resolution (R approx. 6000), accurate and secure redshifts, and unique internal kinematic information. Extensive ancillary data are available in the DEEP2 fields, particularly in the Extended Groth Strip, which has evolved into one of the richest multiwavelength regions on the sky. This paper is intended as a handbook for users of the DEEP2 Data Release 4, which includes all DEEP2 spectra and redshifts, as well as for the DEEP2 DEIMOS data reduction pipelines. Extensive details are provided on object selection, mask design, biases in target selection and redshift measurements, the spec2d two-dimensional data-reduction pipeline, the spec1d automated redshift pipeline, and the zspec visual redshift verification process, along with examples of instrumental signatures or other

  7. The DEEP2 Galaxy Redshift Survey: Design, Observations, Data Reduction, and Redshifts

    NASA Astrophysics Data System (ADS)

    Newman, Jeffrey A.; Cooper, Michael C.; Davis, Marc; Faber, S. M.; Coil, Alison L.; Guhathakurta, Puragra; Koo, David C.; Phillips, Andrew C.; Conroy, Charlie; Dutton, Aaron A.; Finkbeiner, Douglas P.; Gerke, Brian F.; Rosario, David J.; Weiner, Benjamin J.; Willmer, C. N. A.; Yan, Renbin; Harker, Justin J.; Kassin, Susan A.; Konidaris, N. P.; Lai, Kamson; Madgwick, Darren S.; Noeske, K. G.; Wirth, Gregory D.; Connolly, A. J.; Kaiser, N.; Kirby, Evan N.; Lemaux, Brian C.; Lin, Lihwai; Lotz, Jennifer M.; Luppino, G. A.; Marinoni, C.; Matthews, Daniel J.; Metevier, Anne; Schiavon, Ricardo P.

    2013-09-01

    We describe the design and data analysis of the DEEP2 Galaxy Redshift Survey, the densest and largest high-precision redshift survey of galaxies at z ~ 1 completed to date. The survey was designed to conduct a comprehensive census of massive galaxies, their properties, environments, and large-scale structure down to absolute magnitude MB = -20 at z ~ 1 via ~90 nights of observation on the Keck telescope. The survey covers an area of 2.8 deg2 divided into four separate fields observed to a limiting apparent magnitude of R AB = 24.1. Objects with z <~ 0.7 are readily identifiable using BRI photometry and rejected in three of the four DEEP2 fields, allowing galaxies with z > 0.7 to be targeted ~2.5 times more efficiently than in a purely magnitude-limited sample. Approximately 60% of eligible targets are chosen for spectroscopy, yielding nearly 53,000 spectra and more than 38,000 reliable redshift measurements. Most of the targets that fail to yield secure redshifts are blue objects that lie beyond z ~ 1.45, where the [O II] 3727 Å doublet lies in the infrared. The DEIMOS 1200 line mm-1 grating used for the survey delivers high spectral resolution (R ~ 6000), accurate and secure redshifts, and unique internal kinematic information. Extensive ancillary data are available in the DEEP2 fields, particularly in the Extended Groth Strip, which has evolved into one of the richest multiwavelength regions on the sky. This paper is intended as a handbook for users of the DEEP2 Data Release 4, which includes all DEEP2 spectra and redshifts, as well as for the DEEP2 DEIMOS data reduction pipelines. Extensive details are provided on object selection, mask design, biases in target selection and redshift measurements, the spec2d two-dimensional data-reduction pipeline, the spec1d automated redshift pipeline, and the zspec visual redshift verification process, along with examples of instrumental signatures or other artifacts that in some cases remain after data reduction. Redshift

  8. Improving radiation survey data using CADD/CAE (computer-aided design and drafting computer-aided engineering)

    SciTech Connect

    Palau, G.L.; Tarpinian, J.E.

    1987-01-01

    A new application of computer-aided design and drafting (CADD) and computer-aided engineering (CAE) at the Three Mile Island Unit 2 (TMI-2) cleanup is improving the quality of radiation survey data taken in the plant. The use of CADD/CAE-generated survey maps has increased both the accuracy of survey data and the capability to perform analyses with these data. In addition, health physics technician manhours and radiation exposure can be reduced in situations where the CADD/CAE-generated drawings are used for survey mapping.

  9. Trajectory Design Enhancements to Mitigate Risk for the Transiting Exoplanet Survey Satellite (TESS)

    NASA Technical Reports Server (NTRS)

    Dichmann, Donald; Parker, Joel; Nickel, Craig; Lutz, Stephen

    2016-01-01

The Transiting Exoplanet Survey Satellite (TESS) will employ a highly eccentric Earth orbit, in 2:1 lunar resonance, which will be reached with a lunar flyby preceded by 3.5 phasing loops. The TESS mission has limited propellant and several constraints on the science orbit and on the phasing loops. Based on analysis and simulation, we have designed the phasing loops to reduce delta-V (DV) and to mitigate risk due to maneuver execution errors. We have automated the trajectory design process and use distributed processing to generate optimal nominal trajectories, to check constraint satisfaction, and finally to model the effects of maneuver errors to identify trajectories that best meet the mission requirements.
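The final step described above, modelling the effect of maneuver execution errors on the delta-V budget, can be sketched as a simple Monte Carlo. The burn magnitudes, the 1% proportional Gaussian error model, and the assumption that each execution error is cleaned up by a corrective burn of equal size are all illustrative assumptions, not TESS design values.

```python
import random

def total_dv_with_errors(nominal_dvs, sigma_frac=0.01, n_trials=2000, seed=42):
    """Monte Carlo estimate of the total delta-V budget (m/s) when each burn
    suffers a Gaussian execution error proportional to its magnitude and a
    corrective burn equal in size to the error is assumed.  Returns the
    sample mean and the 99th-percentile total."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_trials):
        total = 0.0
        for dv in nominal_dvs:
            # magnitude of the execution error for this burn
            error = abs(rng.gauss(0.0, sigma_frac)) * dv
            total += dv + error
        totals.append(total)
    totals.sort()
    return sum(totals) / n_trials, totals[int(0.99 * n_trials)]

# three phasing-loop burns plus a final period-adjust maneuver (illustrative values, m/s)
mean_dv, p99_dv = total_dv_with_errors([25.0, 18.0, 12.0, 6.0])
```

Under this kind of model, a trajectory whose 99th-percentile total stays within the propellant budget would be preferred over one with a slightly lower nominal delta-V but a heavier error tail.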

  10. Design and Evaluation of Digital Learning Material to Support Acquisition of Quantitative Problem-Solving Skills Within Food Chemistry

    NASA Astrophysics Data System (ADS)

    Diederen, Julia; Gruppen, Harry; Hartog, Rob; Voragen, Alphons G. J.

    2005-12-01

One of the modules in the course Food Chemistry at Wageningen University (Wageningen, The Netherlands) focuses on quantitative problem-solving skills related to chemical reactions. The intended learning outcomes of this module are, first, to be able to translate practical food chemistry-related problems into mathematical equations and to solve them and, second, to have a quantitative understanding of chemical reactions in food. Until 3 years ago the learning situation for this module was inefficient for both teachers and students: a staff/student ratio of 1/25 proved insufficient, the level of student frustration was high, and many students could not finish the tasks within the scheduled time. To make this situation more efficient for both students and teachers and to lower the level of frustration, digital learning material was designed. The main characteristic of this learning material is that it provides just-in-time information, such as feedback, hints, and links to background information. The material was evaluated in three case studies in a normal educational setting (n = 22, n = 31, n = 33). The results show that student frustration is now low, time in class is used efficiently, and the staff/student ratio of 1/25 is indeed sufficient. A staff/student ratio of around 1/40 is now regarded as realistic.

  11. Designing quantitative structure activity relationships to predict specific toxic endpoints for polybrominated diphenyl ethers in mammalian cells.

    PubMed

    Rawat, S; Bruce, E D

    2014-01-01

Polybrominated diphenyl ethers (PBDEs) are known as effective flame retardants and have wide industrial application in products such as plastics, building materials, and textiles. They are structurally similar to the thyroid hormones that are responsible for regulating metabolism in the body. This structural similarity poses a threat to human health because, once in the system, PBDEs have the potential to affect thyroid hormone transport and metabolism. This study was aimed at designing quantitative structure-activity relationship (QSAR) models for predicting toxic endpoints, namely cell viability and apoptosis, elicited by PBDEs in mammalian cells. Cell viability was evaluated quantitatively using a general cytotoxicity bioassay based on Janus Green dye, and apoptosis was evaluated using a caspase assay. This study has thus modelled the overall cytotoxic influence of PBDEs at an early and a late endpoint by the Genetic Function Approximation method. The research was a twofold process: running in vitro bioassays to collect data on the toxic endpoints, and modelling the evaluated endpoints using QSARs. Cell viability and apoptosis responses for Hep G2 cells exposed to PBDEs were successfully modelled with an r(2) of 0.97 and 0.94, respectively. PMID:24738916
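The modelling half of such a study reduces, in its simplest form, to regressing a toxicity endpoint on molecular descriptors and reporting r². The sketch below uses ordinary least squares on a single hypothetical descriptor (logP) with invented viability data, as a deliberately simplified stand-in for the paper's multi-descriptor Genetic Function Approximation models.

```python
def fit_line(x, y):
    """Ordinary least-squares fit of y = a + b*x for a single descriptor --
    a simplified stand-in for a multi-descriptor QSAR model."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

def r_squared(x, y, a, b):
    """Coefficient of determination of the fitted line."""
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - sum(y) / len(y)) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

# hypothetical descriptor values (logP) vs. measured cell viability (%) -- invented data
logp = [5.9, 6.8, 7.4, 8.2, 9.0]
viability = [82.0, 74.0, 69.0, 60.0, 51.0]
a, b = fit_line(logp, viability)
r2 = r_squared(logp, viability, a, b)
```

The actual GFA method differs in that it uses a genetic algorithm to search over descriptor subsets and functional forms before fitting; only the final least-squares step resembles this sketch.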

  12. Quantitative evaluation of a thrust vector controlled transport at the conceptual design phase

    NASA Astrophysics Data System (ADS)

    Ricketts, Vincent Patrick

The impetus to innovate, to push the bounds and break the molds of evolutionary design trends, often comes from competition but sometimes requires catalytic political legislation. For this research endeavor, the 'catalyzing legislation' comes in response to the rising cost of fossil fuels and the request put forth by NASA that aircraft manufacturers demonstrate a 60% reduction in aircraft fuel consumption within 30 years. Meeting such targets requires that novel technologies be considered. One such technology is thrust vector control (TVC). The beneficial characteristic of thrust vector control technology applied to the traditional tail-aft configuration (TAC) commercial transport is its ability to retain the operational advantages of this highly evolved aircraft type, such as cabin evacuation, ground operation, safety, and certification. This study explores whether the TVC transport concept offers improved flight performance by synergistically reducing the traditional empennage size, resulting overall in reduced weight and drag and therefore reduced fuel consumption. In particular, it explores whether TVC technology combined with the reduced-empennage methodology enables the TAC aircraft to evolve synergistically while complying with current safety and certification regulations. This research utilizes the multi-disciplinary parametric sizing software, AVD Sizing, developed by the Aerospace Vehicle Design (AVD) Laboratory. The sizing software visualizes the total system solution space via parametric trades and is capable of determining whether the TVC technology can enable the TAC aircraft to evolve in this way. The study indicates that TVC plus the reduced-empennage methodology shows marked improvements in performance and cost.

  13. Wide-Field InfraRed Survey Telescope (WFIRST) slitless spectrometer: design, prototype, and results

    NASA Astrophysics Data System (ADS)

    Gong, Qian; Content, David A.; Dominguez, Margaret; Emmett, Thomas; Griesmann, Ulf; Hagopian, John; Kruk, Jeffrey; Marx, Catherine; Pasquale, Bert; Wallace, Thomas; Whipple, Arthur

    2016-07-01

The slitless spectrometer plays an important role in the WFIRST mission for the survey of emission-line galaxies. This will be an unprecedented very wide field, HST-quality 3D survey of emission-line galaxies. The concept of the compound grism as a slitless spectrometer has been presented previously. This paper briefly discusses the challenges and solutions of the optical design and recent specification updates, and gives a brief comparison between the prototype and the latest design. The emphasis, however, is the progress of the grism prototype: the fabrication and testing of the complicated diffractive optical elements and powered prism, as well as grism assembly alignment and testing, and in particular how different tools and methods, such as IR phase-shift and wavelength-shift interferometry, are used to complete the element and assembly tests. The paper also presents very encouraging results from recent element and assembly tests. Finally, we briefly touch on the plan to test spectral characteristics such as spectral resolution and response.

  14. Design and operation of the 1995 National Survey of Family Growth.

    PubMed

    Mosher, W D

    1998-01-01

In the US, the 1995 National Survey of Family Growth (NSFG) was designed to provide richer data than previous NSFG surveys from 1973, 1976, 1982, and 1988. Planning for the 1995 NSFG took place at a series of meetings beginning in 1990. Pretesting of the expanded questionnaire, the new computer-assisted personal interviewing method, and the audio computer-assisted self-interviewing method for sensitive topics occurred in 1993 and led to the decision to offer a cash incentive to respondents and to use the new interviewing methods. The revised questionnaire collected information on event histories, pregnancy history and family formation, partner history, sterilization and fecundity, contraception and birth expectations, use of family planning and other medical services, demographic characteristics, abortion history, number of sexual partners, and rape. The sample for the 1995 NSFG included 14,000 civilian, noninstitutionalized women of reproductive age, 13,795 of whom proved eligible. Of these, 79% completed interviews. Quality control measures included careful design and testing of the questionnaire, use of a Life History Calendar, intensive interviewer training, consistency checks, and use of the incentive. Sampling weights for each respondent were used to derive nationally representative statistical estimates. Sampling errors were estimated to reflect the complexity of the sample design. Research based on the results of the 1995 NSFG has only begun to take advantage of the potential offered by these data. PMID:9494815

  15. Quantitative analysis in field-flow fractionation using ultraviolet-visible detectors: an experimental design for absolute measurements

    PubMed

    Zattoni; Melucci; Torsi; Reschiglian

    2000-03-01

    In previous works, it has been shown that a standard ultraviolet-visible detection system can be used for quantitative analysis of heterogeneous systems (dispersed supermicron particles) in field-flow fractionation (FFF) by single peak area measurements. Such an analysis method was shown to require either experimental measurements (standardless analysis) or an accurate model (absolute analysis) to determine the extinction efficiency of the particulate samples. In this work, an experimental design to assess absolute analysis in FFF through prediction of particles' optical extinction is presented. Prediction derives from the semiempirical approach by van de Hulst and Walstra. Special emphasis is given to the restriction of the experimental domain of instrumental conditions within which absolute analysis is allowed. Validation by statistical analysis and a practical application to real sample recovery studies are also given.
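The van de Hulst prediction referred to here is, in its best-known form, the anomalous-diffraction approximation for the extinction efficiency of a sphere, which the sketch below implements. Whether this exact form (rather than Walstra's refinements) matches the authors' model is an assumption, and the particle parameters in the example are illustrative.

```python
import math

def extinction_efficiency(diameter_um, wavelength_um, m_rel):
    """van de Hulst anomalous-diffraction approximation to the optical
    extinction efficiency Q_ext of a sphere, valid when the relative
    refractive index m_rel is close to 1 and the size parameter is large:
    Q_ext = 2 - (4/rho) sin(rho) + (4/rho^2)(1 - cos(rho))."""
    x = math.pi * diameter_um / wavelength_um   # size parameter
    rho = 2.0 * x * (m_rel - 1.0)               # phase-shift parameter
    if rho == 0.0:
        return 0.0
    return 2.0 - (4.0 / rho) * math.sin(rho) + (4.0 / rho ** 2) * (1.0 - math.cos(rho))

# e.g. a 2 um particle in water at 550 nm with relative index ~1.19 (illustrative)
q = extinction_efficiency(2.0, 0.55, 1.19)
```

The oscillatory dependence of Q_ext on particle size is exactly why the paper restricts the experimental domain: outside a well-chosen range, small errors in size or index translate into large errors in the predicted extinction and hence in the absolute quantitation.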

  16. Dealing with trade-offs in destructive sampling designs for occupancy surveys.

    PubMed

    Canessa, Stefano; Heard, Geoffrey W; Robertson, Peter; Sluiter, Ian R K

    2015-01-01

Occupancy surveys should be designed to minimise false absences. This is commonly achieved by increasing replication or increasing the efficiency of surveys. In the case of destructive sampling designs, in which searches of individual microhabitats represent the repeat surveys, minimising false absences leads to an inherent trade-off. Surveyors can sample more low quality microhabitats, bearing the resultant financial costs and producing wider-spread impacts, or they can target high quality microhabitats where the focal species is more likely to be found and risk more severe impacts on local habitat quality. We show how this trade-off can be solved with a decision-theoretic approach, using the Millewa Skink Hemiergis millewae from southern Australia as a case study. Hemiergis millewae is an endangered reptile that is best detected using destructive sampling of grass hummocks. Within sites that were known to be occupied by H. millewae, logistic regression modelling revealed that lizards were more frequently detected in large hummocks. If this model is an accurate representation of the detection process, searching large hummocks is more efficient and requires less replication, but this strategy also entails destruction of the best microhabitats for the species. We developed an optimisation tool to calculate the minimum combination of the number and size of hummocks to search to achieve a given cumulative probability of detecting the species at a site, incorporating weights to reflect the sensitivity of the results to a surveyor's priorities. The optimisation showed that placing high weight on minimising volume necessitates impractical replication, whereas placing high weight on minimising replication requires searching very large hummocks which are less common and may be vital for H. millewae. While destructive sampling methods are sometimes necessary, surveyors must be conscious of the ecological impacts of these methods. This study provides a simple tool for
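A minimal version of the optimisation can be sketched as follows: a logistic model gives the per-hummock detection probability as a function of hummock size, and the replication needed for a target cumulative detection probability follows from 1 - (1 - p)^n. The logistic coefficients below are invented for illustration, not the fitted H. millewae values.

```python
import math

def detection_prob(volume_l, b0=-2.0, b1=0.08):
    """Per-hummock detection probability from a logistic model in hummock
    volume (litres).  The coefficients are illustrative, not fitted values."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * volume_l)))

def searches_needed(volume_l, target=0.95):
    """Minimum number of hummocks of a given size that must be destructively
    searched so that the cumulative probability of detecting the species at
    an occupied site reaches the target: 1 - (1 - p)**n >= target."""
    p = detection_prob(volume_l)
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p))

# the trade-off: many small hummocks vs. few large (and ecologically valuable) ones
n_small = searches_needed(10.0)
n_large = searches_needed(50.0)
```

The paper's tool additionally weights the two costs (total volume destroyed vs. replication effort) so a surveyor can tune the solution to their priorities; this sketch only exposes the underlying trade-off.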

  17. System design and development of a pinhole SPECT system for quantitative functional imaging of small animals.

    PubMed

    Aoi, Toshiyuki; Zeniya, Tsutomu; Watabe, Hiroshi; Deloar, Hossain M; Matsuda, Tetsuya; Iida, Hidehiro

    2006-04-01

Recently, small animal imaging by pinhole SPECT has been widely investigated by several researchers. We developed a pinhole SPECT system specially designed for small animal imaging. The system consists of a rotation unit for a small animal and a SPECT camera fitted with a pinhole collimator. In order to acquire complete projection data, the system has two orbits, at angles of 90 degrees and 45 degrees with respect to the object. In this system, the position of the SPECT camera is kept fixed and the animal is rotated, in order to avoid misalignment of the center of rotation (COR). We implemented a three-dimensional OSEM algorithm for the reconstruction of data acquired by the system from both orbits. A point-source experiment revealed no significant COR misalignment in the proposed system, and experiments with a line phantom clearly indicated that our system succeeded in minimizing COR misalignment. We performed a study with a rat and 99mTc-HMDP, a bone-scanning agent, and demonstrated a dramatic improvement in the spatial resolution and uniformity achieved by our system in comparison with the conventional Feldkamp algorithm applied to one set of orbital data.
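The reconstruction step can be illustrated with a toy maximum-likelihood EM update (OSEM with a single subset) on a small dense system matrix. The real pipeline is fully three-dimensional with a pinhole projection model, so this is only a sketch of the multiplicative update rule x <- x * A^T(y / Ax) / A^T 1, on invented data.

```python
def mlem(system, measured, n_iter=500):
    """Toy MLEM reconstruction (OSEM with a single subset) for a small dense
    system matrix A: x <- x * A^T (y / A x) / A^T 1."""
    n_pix = len(system[0])
    x = [1.0] * n_pix                                              # uniform initial estimate
    sens = [sum(row[j] for row in system) for j in range(n_pix)]   # sensitivity image A^T 1
    for _ in range(n_iter):
        proj = [sum(a * xi for a, xi in zip(row, x)) for row in system]   # forward project A x
        ratio = [y / p if p > 0 else 0.0 for y, p in zip(measured, proj)]
        back = [sum(row[i] * r for row, r in zip(system, ratio)) for i in range(n_pix)]
        x = [xi * b / s if s > 0 else 0.0 for xi, b, s in zip(x, back, sens)]
    return x

# 3-pixel phantom viewed through a simple 3-view system matrix (illustrative)
A = [[1.0, 0.5, 0.0],
     [0.0, 1.0, 0.5],
     [0.5, 0.0, 1.0]]
truth = [2.0, 1.0, 3.0]
y = [sum(a * t for a, t in zip(row, truth)) for row in A]  # noiseless projections
recon = mlem(A, y)
```

With noiseless, consistent data the iterates converge toward the true nonnegative activity distribution; in practice the update enforces nonnegativity automatically, which is one reason EM-type methods are preferred over filtered backprojection for low-count pinhole data.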

  18. Large-Scale Survey Findings Inform Patients’ Experiences in Using Secure Messaging to Engage in Patient-Provider Communication and Self-Care Management: A Quantitative Assessment

    PubMed Central

    Patel, Nitin R; Lind, Jason D; Antinori, Nicole

    2015-01-01

Background Secure email messaging is part of a national transformation initiative in the United States to promote new models of care that support enhanced patient-provider communication. To date, only a limited number of large-scale studies have evaluated users' experiences in using secure email messaging. Objective To quantitatively assess veteran patients' experiences in using secure email messaging in a large patient sample. Methods A cross-sectional, mail-delivered, paper-and-pencil survey study was conducted with a sample of respondents registered for the Veterans Health Administration's Web-based patient portal (My HealtheVet) who had opted to use secure messaging. The survey collected demographic data and assessed computer and health literacy and secure messaging use. Analyses conducted on survey data include frequencies and proportions, chi-square tests, and one-way analysis of variance. Results The majority of respondents (N=819) reported using secure messaging 6 months or longer (n=499, 60.9%). They reported secure messaging to be helpful for completing medication refills (n=546, 66.7%), managing appointments (n=343, 41.9%), looking up test results (n=350, 42.7%), and asking health-related questions (n=340, 41.5%). Notably, some respondents reported using secure messaging to address sensitive health topics (n=67, 8.2%). Survey responses indicated that younger age (P=.039) and higher levels of education (P=.025) and income (P=.003) were associated with more frequent use of secure messaging. Females were more likely to report using secure messaging more often, compared with their male counterparts (P=.098). Minorities were more likely to report using secure messaging more often, at least once a month, compared with nonminorities (P=.086). Individuals with higher levels of health literacy reported more frequent use of secure messaging (P=.007), greater satisfaction (P=.002), and indicated that secure messaging is a useful (P=.002) and easy

  19. Survey of alternative gas turbine engine and cycle design. Final report

    SciTech Connect

    Lukas, H.

    1986-02-01

In the period from the 1940s to the 1960s, much experimentation was performed in the areas of intercooling, reheat, and recuperation, as well as the use of low-grade fuels in gas turbines. The Electric Power Research Institute (EPRI), in an effort to document past experience that can serve as the basis for current design activities, commissioned a study to document alternate cycles and components used in gas turbine design. The study was performed by obtaining the important technical and operational criteria of the cycles through a literature search of published documents, articles, and papers. Where possible, the information was augmented through dialogue with persons associated with those cycles and with the manufacturers. The survey indicated that many different variations of the simple open-cycle gas turbine plant were used. Many of these changes resulted in increases in efficiency over the low simple-cycle efficiency of that period. Metallurgy, as well as compressor and turbine design, limited the simple-cycle efficiency to the upper teens; the cycle modifications increased those efficiencies to the twenties and thirties. Advances in metallurgy and in compressor and turbine design, coupled with the decrease in fuel cost, stopped the development of these complex cycles. Many of the plants operated successfully for many years, and only because newer simple-cycle gas turbine plants and large steam plants had better heat rates were these units shut down or put into stand-by service. 24 refs., 25 figs., 114 tabs.

  20. A Survey to Examine Teachers' Perceptions of Design Dispositions, Lesson Design Practices, and Their Relationships with Technological Pedagogical Content Knowledge (TPACK)

    ERIC Educational Resources Information Center

    Koh, Joyce Hwee Ling; Chai, Ching Sing; Hong, Huang-Yao; Tsai, Chin-Chung

    2015-01-01

    This study investigates 201 Singaporean teachers' perceptions of their technological pedagogical content knowledge (TPACK), lesson design practices, and design dispositions through a survey instrument. Investigation of these constructs reveal important variables influencing teachers' perceptions of TPACK which have not yet been explored. The…

  1. Design and Practice on Metadata Service System of Surveying and Mapping Results Based on Geonetwork

    NASA Astrophysics Data System (ADS)

    Zha, Z.; Zhou, X.

    2011-08-01

Based on analysis and research on current geographic information sharing and metadata services, we designed, developed, and deployed a distributed metadata service system based on GeoNetwork covering more than 30 nodes in the provincial units of China. By exploiting the advantages of GeoNetwork, we designed a distributed metadata service system for national surveying and mapping results. It consists of 31 network nodes, a central node, and a portal. The network nodes are the direct sources of system metadata and are distributed around the country; each maintains a metadata service system responsible for metadata uploading and management. The central node harvests metadata from the network nodes using the OGC CSW 2.0.2 standard interface. The portal exposes all metadata held by the central node, provides users with a variety of methods and interfaces for metadata search and querying, and provides management capabilities for connecting the central node and the network nodes. GeoNetwork has shortcomings as well; accordingly, we made improvements and optimizations for large-volume metadata uploading, synchronization, and concurrent access. For metadata uploading and synchronization, careful analysis of the database and index operation logs allowed us to avoid the performance bottlenecks, and a batch-operation and dynamic memory management solution significantly improved data throughput and system performance. For concurrent access, a request-coding and result-cache solution greatly improved query performance, and a web cluster was deployed to respond smoothly to large numbers of concurrent requests. This paper also gives an experimental analysis comparing system performance before and after the improvements and optimizations. The design and practical results have been applied in the national metadata service system of surveying and mapping results, demonstrating that the improved GeoNetwork service architecture can adapt effectively to distributed deployment.

  2. Utility FGD survey, January--December 1989. Volume 2, Design performance data for operating FGD systems: Part 2

    SciTech Connect

    Hance, S.L.; McKibben, R.S.; Jones, F.M.

    1992-03-01

This is Volume 2, Part 2, of the Utility Flue Gas Desulfurization (FGD) Survey report. The report, generated by a computerized database management system, represents a survey of operational and planned domestic utility FGD systems. It summarizes information contributed by the utility industry, system and equipment suppliers, system designers, research organizations, and regulatory agencies. The data cover system design, fuel characteristics, operating history, and actual system performance. Also included is a unit-by-unit discussion of problems and solutions associated with the boilers, scrubbers, and FGD systems. This volume in particular contains basic design and performance data.

  3. THE COS-HALOS SURVEY: RATIONALE, DESIGN, AND A CENSUS OF CIRCUMGALACTIC NEUTRAL HYDROGEN

    SciTech Connect

    Tumlinson, Jason; Thom, Christopher; Sembach, Kenneth R.; Werk, Jessica K.; Prochaska, J. Xavier; Davé, Romeel; Oppenheimer, Benjamin D.; Ford, Amanda Brady; O'Meara, John M.; Peeples, Molly S.; Weinberg, David H.

    2013-11-01

We present the design and methods of the COS-Halos survey, a systematic investigation of the gaseous halos of 44 z = 0.15-0.35 galaxies using background QSOs observed with the Cosmic Origins Spectrograph aboard the Hubble Space Telescope. This survey has yielded 39 spectra of z_em ≅ 0.5 QSOs with S/N ∼10-15 per resolution element. The QSO sightlines pass within 150 physical kpc of the galaxies, which span early and late types over stellar mass log M_*/M_☉ = 9.5-11.5. We find that the circumgalactic medium exhibits strong H I, averaging ≅ 1 Å in Lyα equivalent width out to 150 kpc, with a 100% covering fraction for star-forming galaxies and 75% covering for passive galaxies. We find good agreement in column densities between this survey and previous studies over a similar range of impact parameters. There is weak evidence for a difference between early- and late-type galaxies in the strength and distribution of H I. Kinematics indicate that the detected material is bound to the host galaxy, such that ≳90% of the detected column density is confined within ±200 km s^-1 of the galaxies. This material generally exists well below the halo virial temperatures, at T ≲ 10^5 K. We evaluate a number of possible origin scenarios for the detected material, and in the end favor a simple model in which the bulk of the detected H I arises in a bound, cool, low-density photoionized diffuse medium that is generic to all L* galaxies and may harbor a total gaseous mass comparable to galactic stellar masses.

  4. Changes in depth occupied by Great Lakes lake whitefish populations and the influence of survey design

    USGS Publications Warehouse

    Rennie, Michael D.; Weidel, Brian C.; Claramunt, Randy; Dunlob, Erin S.

    2015-01-01

    Understanding fish habitat use is important in determining conditions that ultimately affect fish energetics, growth and reproduction. Great Lakes lake whitefish (Coregonus clupeaformis) have demonstrated dramatic changes in growth and life history traits since the appearance of dreissenid mussels in the Great Lakes, but the role of habitat occupancy in driving these changes is poorly understood. To better understand temporal changes in lake whitefish depth of capture (Dw), we compiled a database of fishery-independent surveys representing multiple populations across all five Laurentian Great Lakes. By demonstrating the importance of survey design in estimating Dw, we describe a novel method for detecting survey-based bias in Dw and removing potentially biased data. Using unbiased Dw estimates, we show clear differences in the pattern and timing of changes in lake whitefish Dw between our reference sites (Lake Superior) and those that have experienced significant benthic food web changes (lakes Michigan, Huron, Erie and Ontario). Lake whitefish Dw in Lake Superior tended to gradually shift to shallower waters, but changed rapidly in other locations coincident with dreissenid establishment and declines in Diporeia densities. Almost all lake whitefish populations that were exposed to dreissenids demonstrated deeper Dw following benthic food web change, though a subset of these populations subsequently shifted to more shallow depths. In some cases in lakes Huron and Ontario, shifts towards more shallow Dw are occurring well after documented Diporeia collapse, suggesting the role of other drivers such as habitat availability or reliance on alternative prey sources.

  5. The COS-Halos Survey: Rationale, Design, and a Census of Circumgalactic Neutral Hydrogen

    NASA Astrophysics Data System (ADS)

    Tumlinson, Jason; Thom, Christopher; Werk, Jessica K.; Prochaska, J. Xavier; Tripp, Todd M.; Katz, Neal; Davé, Romeel; Oppenheimer, Benjamin D.; Meiring, Joseph D.; Ford, Amanda Brady; O'Meara, John M.; Peeples, Molly S.; Sembach, Kenneth R.; Weinberg, David H.

    2013-11-01

We present the design and methods of the COS-Halos survey, a systematic investigation of the gaseous halos of 44 z = 0.15-0.35 galaxies using background QSOs observed with the Cosmic Origins Spectrograph aboard the Hubble Space Telescope. This survey has yielded 39 spectra of z_em ≈ 0.5 QSOs with S/N ~10-15 per resolution element. The QSO sightlines pass within 150 physical kpc of the galaxies, which span early and late types over stellar mass log M_*/M_☉ = 9.5-11.5. We find that the circumgalactic medium exhibits strong H I, averaging ≈ 1 Å in Lyα equivalent width out to 150 kpc, with a 100% covering fraction for star-forming galaxies and 75% covering for passive galaxies. We find good agreement in column densities between this survey and previous studies over a similar range of impact parameters. There is weak evidence for a difference between early- and late-type galaxies in the strength and distribution of H I. Kinematics indicate that the detected material is bound to the host galaxy, such that ≳90% of the detected column density is confined within ±200 km s^-1 of the galaxies. This material generally exists well below the halo virial temperatures, at T ≲ 10^5 K. We evaluate a number of possible origin scenarios for the detected material, and in the end favor a simple model in which the bulk of the detected H I arises in a bound, cool, low-density photoionized diffuse medium that is generic to all L* galaxies and may harbor a total gaseous mass comparable to galactic stellar masses. Based on observations made with the NASA/ESA Hubble Space Telescope, obtained at the Space Telescope Science Institute, which is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS 5-26555. These observations are associated with program GO11598.

  6. A survey of pulse shape options for a revised plastic ablator ignition design

    SciTech Connect

    Clark, D. S.; Milovich, J. L.; Hinkel, D. E.; Salmonson, J. D.; Peterson, J. L.; Berzak Hopkins, L. F.; Eder, D. C.; Haan, S. W.; Jones, O. S.; Marinak, M. M.; Robey, H. F.; Smalyuk, V. A.; Weber, C. R.

    2014-11-15

    Recent experimental results using the “high foot” pulse shape for inertial confinement fusion ignition experiments on the National Ignition Facility (NIF) [Moses et al., Phys. Plasmas 16, 041006 (2009)] have shown encouraging progress compared to earlier “low foot” experiments. These results strongly suggest that controlling ablation front instability growth can significantly improve implosion performance even in the presence of persistent, large, low-mode distortions. Simultaneously, hydrodynamic growth radiography experiments have confirmed that ablation front instability growth is being modeled fairly well in NIF experiments. It is timely then to combine these two results and ask how current ignition pulse shapes could be modified to improve one-dimensional implosion performance while maintaining the stability properties demonstrated with the high foot. This paper presents such a survey of pulse shapes intermediate between the low and high foot extremes in search of an intermediate foot optimum. Of the design space surveyed, it is found that a higher picket version of the low foot pulse shape shows the most promise for improved compression without loss of stability.

  7. Developing an efficient modelling and data presentation strategy for ATDEM system comparison and survey design

    NASA Astrophysics Data System (ADS)

    Combrinck, Magdel

    2015-10-01

    Forward modelling of airborne time-domain electromagnetic (ATDEM) responses is frequently used to compare systems and design surveys for optimum detection of expected mineral exploration targets. It is a challenging exercise to display and analyse the forward modelled responses due to the large amount of data generated for three dimensional models as well as the system dependent nature of the data. I propose simplifying the display of ATDEM responses through using the dimensionless quantity of signal-to-noise ratios (signal:noise) instead of respective system units. I also introduce the concept of a three-dimensional signal:noise nomo-volume as an efficient tool to visually present and analyse large amounts of data. The signal:noise nomo-volume is a logical extension of the two-dimensional conductance nomogram. It contains the signal:noise values of all system time channels and components for various target depths and conductances integrated into a single interactive three-dimensional image. Responses are calculated over a complete survey grid and therefore include effects of system and target geometries. The user can interactively select signal:noise cut-off values on the nomo-volume and is able to perform visual comparisons between various system and target responses. The process is easy to apply and geophysicists with access to forward modelling airborne electromagnetic (AEM) and three-dimensional imaging software already possess the tools required to produce and analyse signal:noise nomo-volumes.
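At its core, the nomo-volume is a grid of forward-modelled responses converted to signal:noise and thresholded at a user-chosen cutoff. A two-dimensional slice (one system component and time channel) might be computed as below; the amplitudes and noise floor are invented for illustration, not modelled ATDEM values.

```python
def detectable_cells(responses, noise_floor, cutoff=3.0):
    """Build one boolean slice of a signal:noise 'nomo-volume': for each
    (depth, conductance) cell, test whether the modelled peak response is at
    least `cutoff` times the system noise floor.  Both responses and the
    noise floor are in the same (system-dependent) units, so the ratio is
    dimensionless and comparable across systems."""
    return [[r / noise_floor >= cutoff for r in row] for row in responses]

# illustrative modelled amplitudes for 3 target depths x 4 target conductances
resp = [[120.0, 60.0, 30.0, 12.0],   # shallow target
        [40.0, 20.0, 10.0, 4.0],     # intermediate depth
        [12.0, 6.0, 3.0, 1.2]]       # deep target
mask = detectable_cells(resp, noise_floor=5.0)
```

Stacking such slices over all time channels, components, and grid positions gives the three-dimensional volume the author describes, with the interactive cutoff corresponding to the `cutoff` argument here.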

  8. A survey of pulse shape options for a revised plastic ablator ignition design

    NASA Astrophysics Data System (ADS)

    Clark, Daniel; Eder, David; Haan, Steven; Hinkel, Denise; Jones, Ogden; Marinak, Michael; Milovich, Jose; Peterson, Jayson; Robey, Harold; Salmonson, Jay; Smalyuk, Vladimir; Weber, Christopher

    2014-10-01

Recent experimental results using the ``high foot'' pulse shape on the National Ignition Facility (NIF) have shown encouraging progress compared to earlier ``low foot'' experiments. These results strongly suggest that controlling ablation front instability growth can dramatically improve implosion performance, even in the presence of persistent, large, low-mode distortions. In parallel, Hydrodynamic Growth Radiography experiments have so far validated the techniques used for modeling ablation front growth in NIF experiments. It is timely, then, to combine these two results and ask how current ignition pulse shapes could be modified so as to improve implosion performance, namely fuel compressibility, while maintaining the stability properties demonstrated with the high foot. This talk presents a survey of pulse shapes intermediate between the low and high foot extremes in search of a more optimal design. From the database of pulse shapes surveyed, a higher picket version of the original low foot pulse shape shows the most promise for improved compression without loss of stability. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  9. SURVEY DESIGN FOR SPECTRAL ENERGY DISTRIBUTION FITTING: A FISHER MATRIX APPROACH

    SciTech Connect

    Acquaviva, Viviana; Gawiser, Eric; Bickerton, Steven J.; Grogin, Norman A.; Guo Yicheng; Lee, Seong-Kook

    2012-04-10

The spectral energy distribution (SED) of a galaxy contains information on the galaxy's physical properties, and multi-wavelength observations are needed in order to measure these properties via SED fitting. In planning these surveys, optimization of the resources is essential. The Fisher Matrix (FM) formalism can be used to quickly determine the best possible experimental setup to achieve the desired constraints on the SED-fitting parameters. However, because it relies on the assumption of a Gaussian likelihood function, it is in general less accurate than other slower techniques that reconstruct the probability distribution function (PDF) from the direct comparison between models and data. We compare the uncertainties on SED-fitting parameters predicted by the FM to the ones obtained using the more thorough PDF-fitting techniques. We use both simulated spectra and real data, and consider a large variety of target galaxies differing in redshift, mass, age, star formation history, dust content, and wavelength coverage. We find that the uncertainties reported by the two methods agree within a factor of two in the vast majority (≈90%) of cases. If the age determination is uncertain, the top-hat prior in age used in PDF fitting to prevent each galaxy from being older than the universe needs to be incorporated in the FM, at least approximately, before the two methods can be properly compared. We conclude that the FM is a useful tool for astronomical survey design.
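The FM forecast described in this abstract can be sketched for a toy model. Below, `model_flux` is a deliberately simplified stand-in for a real SED library, and the band wavelengths and 5% photometric errors are assumptions for illustration; the Fisher matrix itself is the standard Gaussian-likelihood form F_ij = Σ_b (∂f_b/∂θ_i)(∂f_b/∂θ_j)/σ_b², with forecast 1σ uncertainties read off the diagonal of its inverse:

```python
import numpy as np

# Toy SED: flux per band as a function of two parameters
# (log stellar mass, log age). Purely illustrative, not a real SED model.
def model_flux(theta, wavelengths):
    log_mass, log_age = theta
    return 10 ** (log_mass - 8.0) * (wavelengths / 1e4) ** (-0.5 * log_age)

def fisher_matrix(theta, wavelengths, sigma):
    """F_ij = sum_b (df_b/dtheta_i)(df_b/dtheta_j) / sigma_b^2."""
    eps = 1e-5
    n = len(theta)
    # Numerical partial derivatives of the band fluxes (central differences)
    grads = []
    for i in range(n):
        tp = np.array(theta, float); tm = np.array(theta, float)
        tp[i] += eps; tm[i] -= eps
        grads.append((model_flux(tp, wavelengths) - model_flux(tm, wavelengths)) / (2 * eps))
    F = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            F[i, j] = np.sum(grads[i] * grads[j] / sigma ** 2)
    return F

wavelengths = np.array([3.6e3, 4.5e3, 5.8e3, 8.0e3, 1.6e4, 2.2e4])  # Angstrom
theta_fid = (9.0, 1.0)                                  # fiducial parameters
sigma = 0.05 * model_flux(theta_fid, wavelengths)       # assumed 5% errors
F = fisher_matrix(theta_fid, wavelengths, sigma)
# Forecast 1-sigma uncertainties: sqrt of the diagonal of F^-1
errors = np.sqrt(np.diag(np.linalg.inv(F)))
print(errors)
```

Changing `wavelengths` or `sigma` and re-evaluating `errors` is the survey-design loop the FM makes cheap: each candidate filter set is assessed without any PDF reconstruction.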

  10. Survey of ethical issues reported by Indian medical students: basis for design of a new curriculum.

    PubMed

    Rose, Anuradha; George, Kuryan; T, Arul Dhas; Pulimood, Anna Benjamin

    2014-01-01

    Education in ethics is now a formal part of the undergraduate medical curriculum. However, most courses are structured around principles and case studies more appropriate to western countries. The cultures and practices of countries like India differ from those of western countries. It is, therefore, essential that our teaching should address the issues which are the most relevant to our setting. An anonymised, questionnaire-based, cross-sectional survey of medical students was carried out to get a picture of the ethical problems faced by students in India. The data were categorised into issues related to professional behaviour and ethical dilemmas. Unprofessional behaviour was among the issues reported as a matter of concern by a majority of the medical students. The survey highlights the need to design the curriculum in a way that reflects the structure of medical education in India, where patients are not always considered socio-culturally equal by students or the medical staff. This perspective must underpin any further efforts to address education in ethics in India.

  11. The Unique Optical Design of the CTI-II Survey Telescope

    NASA Astrophysics Data System (ADS)

    Ackermann, Mark R.; McGraw, J. T.; MacFarlane, M.

    2006-12-01

The CCD/Transit Instrument with Innovative Instrumentation (CTI-II) is being developed for precision ground-based astrometric and photometric astronomical observations. The 1.8m telescope will be stationary, near-zenith pointing and will feature a CCD-mosaic array operated in time-delay and integrate (TDI) mode to image a continuous strip of the sky in five bands. The heart of the telescope is a Nasmyth-like bent-Cassegrain optical system optimized to produce near diffraction-limited images with near zero distortion over a circular 1.42 deg field. The optical design includes an f/2.2 parabolic ULE primary with no central hole salvaged from the original CTI telescope and adds the requisite hyperbolic secondary, a folding flat and a highly innovative all-spherical, five lens corrector which includes three plano surfaces. The reflective and refractive portions of the design have been optimized as individual but interdependent systems so that the same reflective system can be used with slightly different refractive correctors. At present, two nearly identical corrector designs are being evaluated, one fabricated from BK-7 glass and the other of fused silica. The five lens corrector consists of an air-spaced triplet separated from a follow-on air-spaced doublet. Either design produces 0.25 arcsecond images at 83% encircled energy with a maximum of 0.0005% distortion. The innovative five lens corrector design has been applied to other current and planned Cassegrain, RC and super RC optical systems requiring correctors. The basic five lens approach always results in improved performance compared to the original designs. In some cases, the improvement in image quality is small but includes substantial reductions in distortion. In other cases, the improvement in image quality is substantial. Because the CTI-II corrector is designed for a parabolic primary, it might be especially useful for liquid mirror telescopes. We describe and discuss the CTI-II optical design with respect

  12. Spectroscopic Survey Telescope design. III - Optical support structure and overall configuration

    NASA Astrophysics Data System (ADS)

    Ray, F. B.

    1990-07-01

    The Universities of Texas and Penn State are working together on an Arecibo-type optical telescope to be utilized in a semitransit mode for spectroscopic survey work. Its optics include a spherical primary mirror, a 2-element all-reflecting Gregorian spherical aberration corrector, and a series of optical fibers that will transmit light to a family of spectrographs. An optical support structure is being developed to permit position adjustment in azimuth only. During an azimuth position change, the instrument's entire weight is borne by steel rollers bearing on a circular crane rail of standard section, with support loads transmitted to the telescope base through pneumatic springs. Extensive application of various analytical procedures and computer-aided engineering tools has effectively allowed the detailed examination of several design iterations, thereby increasing the probability of success in the realized structure.

  13. Autonomous Underwater Vehicle Survey Design for Monitoring Carbon Capture and Storage Sites

    NASA Astrophysics Data System (ADS)

    Bull, J. M.; Cevatoglu, M.; Connelly, D.; Wright, I. C.; McPhail, S.; Shitashima, K.

    2013-12-01

Long-term monitoring of sub-seabed Carbon Capture and Storage (CCS) sites will require systems that are flexible, independent, and have long endurance. In this presentation we will discuss the utility of autonomous underwater vehicles equipped with different sensor packages in monitoring storage sites. We will present data collected using the Autosub AUV, as part of the ECO2 project, from the Sleipner area of the North Sea. The Autosub AUV was equipped with sidescan sonar, an EM2000 multibeam system, a Chirp sub-bottom profiler, and a variety of chemical sensors. Our presentation will focus on survey design, and the simultaneous use of multiple sensor packages in environmental monitoring on the continental shelf.

  14. Designing HIGH-COST medicine: hospital surveys, health planning, and the paradox of progressive reform.

    PubMed

    Perkins, Barbara Bridgman

    2010-02-01

    Inspired by social medicine, some progressive US health reforms have paradoxically reinforced a business model of high-cost medical delivery that does not match social needs. In analyzing the financial status of their areas' hospitals, for example, city-wide hospital surveys of the 1910s through 1930s sought to direct capital investments and, in so doing, control competition and markets. The 2 national health planning programs that ran from the mid-1960s to the mid-1980s continued similar strategies of economic organization and management, as did the so-called market reforms that followed. Consequently, these reforms promoted large, extremely specialized, capital-intensive institutions and systems at the expense of less complex (and less costly) primary and chronic care. The current capital crisis may expose the lack of sustainability of such a model and open up new ideas and new ways to build health care designed to meet people's health needs. PMID:20019312

  16. Hot rocket plume experiment - Survey and conceptual design. [of rhenium-iridium bipropellants

    NASA Technical Reports Server (NTRS)

    Millard, Jerry M.; Luan, Taylor W.; Dowdy, Mack W.

    1992-01-01

Attention is given to a study of a space-borne engine plume experiment that will both verify and quantify the reduced contamination from advanced rhenium-iridium earth-storable bipropellant rockets (hot rockets) and provide a correlation between high-fidelity, in-space measurements and theoretical plume and surface contamination models. The experiment conceptual design is based on survey results from plume and contamination technologists throughout the U.S. With respect to shuttle use, cursory investigations validate Hitchhiker availability and adaptability, adequate remote manipulator system (RMS) articulation and dynamic capability, acceptable RMS attachment capability, adequate power and telemetry capability, and adequate flight altitude and attitude/orbital capability.

  17. Technology transfer with system analysis, design, decision making, and impact (Survey-2000) in acute care hospitals in the United States.

    PubMed

    Hatcher, M

    2001-10-01

This paper provides the results of the Survey-2000 measuring technology transfer for management information systems in health care. The relationships with systems approaches, user involvement, user satisfaction, and decision-making were measured and are presented. The survey also measured the levels of Internet and Intranet presence in acute care hospitals, which will be discussed in future articles. The depth of the survey includes e-commerce for both business-to-business and customer transactions. These results are compared, where appropriate, with results from Survey-1997, and changes are discussed. This information will provide benchmarks for hospitals to plan their network technology position and to set goals. This is the first of three articles based upon the results of the Survey-2000. Readers are referred to a prior article by the author that discusses the survey design and provides a tutorial on technology transfer in acute care hospitals.

  18. Nonexperimental Quantitative Research and Its Role in Guiding Instruction

    ERIC Educational Resources Information Center

    Cook, Bryan G.; Cook, Lysandra

    2008-01-01

    Different research designs answer different questions. Educators cannot use nonexperimental quantitative research designs, such as descriptive surveys and correlational research, to determine definitively that an intervention causes improved student outcomes and is an evidence-based practice. However, such research can (a) inform educators about a…

  19. Databases save time and improve the quality of the design, management and processing of ecopathological surveys.

    PubMed

    Sulpice, P; Bugnard, F; Calavas, D

    1994-01-01

The example of an ecopathological survey on mastitis in nursing ewes shows that databases have 4 complementary functions: assistance during survey design; follow-up of surveys; management and quality control of data; and data organization for statistical analysis. This is made possible by designing the database and the survey simultaneously, and by integrating computer science into the work of the task group that conducts the survey. This methodology helps save time and improves the quality of data in ecopathological surveys.

  20. Mechanical Design of NESSI: New Mexico Tech Extrasolar Spectroscopic Survey Instrument

    NASA Technical Reports Server (NTRS)

    Santoro, Fernando G.; Olivares, Andres M.; Salcido, Christopher D.; Jimenez, Stephen R.; Jurgenson, Colby A.; Hrynevych, Michael A.; Creech-Eakman, Michelle J.; Boston, Penny J.; Schmidt, Luke M.; Bloemhard, Heather; Rodeheffer, Dan; Vaive, Genevieve; Vasisht, Gautam; Swain, Mark R.; Deroo, Pieter

    2011-01-01

NESSI: the New Mexico Tech Extrasolar Spectroscopic Survey Instrument is a ground-based multi-object spectrograph that operates in the near-infrared. It will be installed on one of the Nasmyth ports of the Magdalena Ridge Observatory (MRO) 2.4-meter Telescope sited in the Magdalena Mountains, about 48 km west of Socorro-NM. NESSI is mounted stationary with respect to the telescope fork so as not to produce differential flexure between internal opto-mechanical components during or between observations. An appropriate mechanical design allows the instrument alignment to be highly repeatable and stable for both short and long observation timescales, over a wide range of temperature variation. NESSI is optically composed of a field lens, a field de-rotator, re-imaging optics, an auto-guider and a Dewar spectrograph that operates at LN2 temperature. In this paper we report on NESSI's detailed mechanical and opto-mechanical design, and the planning for mechanical construction, assembly, integration and verification.

  1. Measuring coverage in MNCH: design, implementation, and interpretation challenges associated with tracking vaccination coverage using household surveys.

    PubMed

    Cutts, Felicity T; Izurieta, Hector S; Rhoda, Dale A

    2013-01-01

    Vaccination coverage is an important public health indicator that is measured using administrative reports and/or surveys. The measurement of vaccination coverage in low- and middle-income countries using surveys is susceptible to numerous challenges. These challenges include selection bias and information bias, which cannot be solved by increasing the sample size, and the precision of the coverage estimate, which is determined by the survey sample size and sampling method. Selection bias can result from an inaccurate sampling frame or inappropriate field procedures and, since populations likely to be missed in a vaccination coverage survey are also likely to be missed by vaccination teams, most often inflates coverage estimates. Importantly, the large multi-purpose household surveys that are often used to measure vaccination coverage have invested substantial effort to reduce selection bias. Information bias occurs when a child's vaccination status is misclassified due to mistakes on his or her vaccination record, in data transcription, in the way survey questions are presented, or in the guardian's recall of vaccination for children without a written record. There has been substantial reliance on the guardian's recall in recent surveys, and, worryingly, information bias may become more likely in the future as immunization schedules become more complex and variable. Finally, some surveys assess immunity directly using serological assays. Sero-surveys are important for assessing public health risk, but currently are unable to validate coverage estimates directly. To improve vaccination coverage estimates based on surveys, we recommend that recording tools and practices should be improved and that surveys should incorporate best practices for design, implementation, and analysis.
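The abstract notes that the precision of a coverage estimate is determined by sample size and sampling method. This can be illustrated with the standard cluster-survey approximation, in which the simple-random-sampling variance p(1-p)/n is inflated by a design effect; the design-effect value and target precision below are assumptions for illustration, not figures from the article:

```python
import math

def coverage_ci_halfwidth(p, n, deff=2.0, z=1.96):
    """Approximate 95% CI half-width for an estimated coverage proportion p
    from a cluster survey of n children, inflating the simple-random-sampling
    variance p(1-p)/n by an assumed design effect deff."""
    return z * math.sqrt(deff * p * (1 - p) / n)

def sample_size(p, halfwidth, deff=2.0, z=1.96):
    """Children needed so the 95% CI half-width is at most `halfwidth`."""
    return math.ceil(deff * z ** 2 * p * (1 - p) / halfwidth ** 2)

# Expected ~80% coverage, target precision of +/-5 percentage points,
# assumed design effect of 2 for cluster sampling
print(sample_size(0.80, 0.05))
```

Note this only addresses precision: as the abstract stresses, selection bias and information bias are not reduced by increasing n, however large the sample.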

  2. The Hawk-I UDS and GOODS Survey (HUGS): Survey design and deep K-band number counts

    NASA Astrophysics Data System (ADS)

    Fontana, A.; Dunlop, J. S.; Paris, D.; Targett, T. A.; Boutsia, K.; Castellano, M.; Galametz, A.; Grazian, A.; McLure, R.; Merlin, E.; Pentericci, L.; Wuyts, S.; Almaini, O.; Caputi, K.; Chary, R.-R.; Cirasuolo, M.; Conselice, C. J.; Cooray, A.; Daddi, E.; Dickinson, M.; Faber, S. M.; Fazio, G.; Ferguson, H. C.; Giallongo, E.; Giavalisco, M.; Grogin, N. A.; Hathi, N.; Koekemoer, A. M.; Koo, D. C.; Lucas, R. A.; Nonino, M.; Rix, H. W.; Renzini, A.; Rosario, D.; Santini, P.; Scarlata, C.; Sommariva, V.; Stark, D. P.; van der Wel, A.; Vanzella, E.; Wild, V.; Yan, H.; Zibetti, S.

    2014-10-01

We present the results of a new, ultra-deep, near-infrared imaging survey executed with the Hawk-I imager at the ESO VLT, of which we make all the data (images and catalog) public. This survey, named HUGS (Hawk-I UDS and GOODS Survey), provides deep, high-quality imaging in the K and Y bands over the portions of the UKIDSS UDS and GOODS-South fields covered by the CANDELS HST WFC3/IR survey. In this paper we describe the survey strategy, the observational campaign, the data reduction process, and the data quality. We show that, thanks to exquisite image quality and extremely long exposure times, HUGS delivers the deepest K-band images ever collected over areas of cosmological interest, and in general ideally complements the CANDELS data set in terms of image quality and depth. In the GOODS-S field, the K-band observations cover the whole CANDELS area with a complex geometry made of 6 different, partly overlapping pointings, in order to best match the deep and wide areas of CANDELS imaging. In the deepest region (which includes most of the Hubble Ultra Deep Field) exposure times exceed 80 hours of integration, yielding a 1σ magnitude limit per square arcsec of ≃28.0 AB mag. The seeing is exceptional and homogeneous across the various pointings, confined to the range 0.38-0.43 arcsec. In the UDS field the survey is about one magnitude shallower (to match the correspondingly shallower depth of the CANDELS images) but also includes Y-band imaging (which, in the UDS, was not provided by the CANDELS WFC3/IR imaging). In the K-band, with an average exposure time of 13 hours and seeing in the range 0.37-0.43 arcsec, the 1σ limit per square arcsec in the UDS imaging is ≃27.3 AB mag. In the Y-band, with an average exposure time of ≃8 h and seeing in the range 0.45-0.5 arcsec, the imaging yields a 1σ limit per square arcsec of ≃28.3 AB mag. We show that the HUGS observations are well matched to the depth of the CANDELS WFC3/IR data, since the majority
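The roughly one-magnitude depth difference between the GOODS-S (~80 h) and UDS (~13 h) K-band limits quoted in this abstract is broadly consistent with the standard scaling of point-source depth with exposure time. A minimal sketch, under the assumption of purely background-limited imaging (noise falling as the square root of integration time):

```python
import math

def depth_gain(t_deep_hours, t_shallow_hours):
    """Magnitude gain from longer exposure, assuming background-limited
    imaging where the noise scales as 1/sqrt(t):
    delta_m = 2.5*log10(sqrt(t_deep/t_shallow)) = 1.25*log10(t_deep/t_shallow)."""
    return 1.25 * math.log10(t_deep_hours / t_shallow_hours)

# GOODS-S deepest K-band region (~80 h) vs. UDS K band (~13 h)
print(round(depth_gain(80, 13), 2))
```

This gives ~1 mag, close to the quoted 28.0 vs. 27.3 AB limits; the exact difference also depends on seeing and sky-background variations between the fields, which this idealized scaling ignores.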

  3. Quantitative Evaluation of Tissue Surface Adaption of CAD-Designed and 3D Printed Wax Pattern of Maxillary Complete Denture

    PubMed Central

    Chen, Hu; Wang, Han; Lv, Peijun; Wang, Yong; Sun, Yuchun

    2015-01-01

    Objective. To quantitatively evaluate the tissue surface adaption of a maxillary complete denture wax pattern produced by CAD and 3DP. Methods. A standard edentulous maxilla plaster cast model was used, for which a wax pattern of complete denture was designed using CAD software developed in our previous study and printed using a 3D wax printer, while another wax pattern was manufactured by the traditional manual method. The cast model and the two wax patterns were scanned in the 3D scanner as “DataModel,” “DataWaxRP,” and “DataWaxManual.” After setting each wax pattern on the plaster cast, the whole model was scanned for registration. After registration, the deviations of tissue surface between “DataModel” and “DataWaxRP” and between “DataModel” and “DataWaxManual” were measured. The data was analyzed by paired t-test. Results. For both wax patterns produced by the CAD&RP method and the manual method, scanning data of tissue surface and cast surface showed a good fit in the majority. No statistically significant (P > 0.05) difference was observed between the CAD&RP method and the manual method. Conclusions. Wax pattern of maxillary complete denture produced by the CAD&3DP method is comparable with traditional manual method in the adaption to the edentulous cast model. PMID:26583108
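The tissue-surface deviation measurement described above (comparing a scanned wax pattern against the scanned cast after registration) can be sketched as a nearest-neighbour comparison of two point clouds. A minimal illustration with synthetic points; real dental CAD software first performs rigid registration (e.g. ICP) and uses accelerated spatial indexing rather than this brute-force search:

```python
import numpy as np

def surface_deviation(scan_pts, ref_pts):
    """For each scanned point, distance to the nearest reference point.
    Brute-force nearest neighbour, fine for small illustrative clouds."""
    diffs = scan_pts[:, None, :] - ref_pts[None, :, :]   # shape (n, m, 3)
    return np.sqrt((diffs ** 2).sum(axis=2)).min(axis=1) # shape (n,)

rng = np.random.default_rng(0)
ref = rng.uniform(0, 10, size=(500, 3))            # hypothetical cast surface
scan = ref + rng.normal(0, 0.05, size=ref.shape)   # wax pattern with small misfit
d = surface_deviation(scan, ref)
print(f"mean deviation {d.mean():.3f}, max {d.max():.3f}")
```

Summary statistics of `d` (mean, max, or a colour-coded deviation map) are what deviation analyses of this kind typically report, and paired comparisons of such statistics across fabrication methods mirror the paired t-test used in the study.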

  4. High-resolution linkage and quantitative trait locus mapping aided by genome survey sequencing: building up an integrative genomic framework for a bivalve mollusc.

    PubMed

    Jiao, Wenqian; Fu, Xiaoteng; Dou, Jinzhuang; Li, Hengde; Su, Hailin; Mao, Junxia; Yu, Qian; Zhang, Lingling; Hu, Xiaoli; Huang, Xiaoting; Wang, Yangfan; Wang, Shi; Bao, Zhenmin

    2014-02-01

Genetic linkage maps are indispensable tools in genetic and genomic studies. Recent development of genotyping-by-sequencing (GBS) methods holds great promise for constructing high-resolution linkage maps in organisms lacking extensive genomic resources. In the present study, linkage mapping was conducted for a bivalve mollusc (Chlamys farreri) using a newly developed GBS method, 2b-restriction site-associated DNA (2b-RAD). Genome survey sequencing was performed to generate a preliminary reference genome that was utilized to facilitate linkage and quantitative trait locus (QTL) mapping in C. farreri. A high-resolution linkage map was constructed with a marker density (3806 markers) that has, to our knowledge, never been achieved in any other mollusc. The linkage map covered nearly the whole genome (99.5%) with a resolution of 0.41 cM. QTL mapping and association analysis congruously revealed two growth-related QTLs and one potential sex-determination region. An important candidate QTL gene named PROP1, which functions in the regulation of growth hormone production in vertebrates, was identified from the growth-related QTL region detected on the linkage group LG3. We demonstrate that this linkage map can serve as an important platform for improving genome assembly and unifying multiple genomic resources. Our study, therefore, exemplifies how to build up an integrative genomic framework in a non-model organism.

  6. Utility FGD Survey, January--December 1989. Volume 2, Design performance data for operating FGD systems, Part 1

    SciTech Connect

    Hance, S.L.; McKibben, R.S.; Jones, F.M.

    1992-03-01

The Utility flue gas desulfurization (FGD) Survey report, which is generated by a computerized database management system, represents a survey of operational and planned domestic utility FGD systems. It summarizes information contributed by the utility industry, system and equipment suppliers, system designers, research organizations, and regulatory agencies. The data cover system design, fuel characteristics, operating history, and actual system performance. Also included is a unit-by-unit discussion of problems and solutions associated with the boilers, scrubbers, and FGD systems. The development status (operational, under construction, or in the planning stages), system supplier, process, waste disposal practice, and regulatory class are tabulated alphabetically by utility company.

  7. Software Design Aspects and First Test Results of VLT Survey Telescope Control System

    NASA Astrophysics Data System (ADS)

    Brescia, M.; Schipani, P.; Marty, L.; Capaccioli, M.

    2006-08-01

The 2.6 m VLT Survey Telescope (VST) is going to be installed at Cerro Paranal (Chile) as a powerful survey instrument for the ESO VLT. Given the telescope's large 1°x1° field of view, 0.21 arcsec/pixel scale, and siting at one of the best astronomical sites worldwide, the tightest requirements are very high performance of the active optics and autoguiding systems and excellent axis control, in order to obtain the best overall image quality. The VST active optics software must provide the analysis of the image coming from the 10x10-subpupil Shack-Hartmann wavefront sensor and the calculation of primary mirror forces and secondary mirror displacements to correct the intrinsic aberrations of the optical system and those originating from thermal or gravitational effects. The algorithm to select the guide star depends on the specific geometry of the adapter system. The adapter of the VST hosts many devices handled by the overall telescope control software: a probe system to select the guide star, realized with motions in polar coordinates; a pickup mirror to fold the light to the image analysis and guiding cameras; a selectable reference light system; and a focusing device. All these devices interface deeply with the autoguiding, active optics and field rotation compensation systems. A reverse-engineering approach mixed with the integration of new specific solutions has been fundamental to meeting the ESO commitments in terms of software re-use, in order to smooth the integration of a new telescope designed and built by an external institute into the ESO environment. The control software architecture, the simulation code used to validate the results, and the status of the work are described here. This paper also includes first results of preliminary tracking tests performed at the VST integration site for the azimuth, altitude and rotator axes, which already meet system quality requirements.

  8. Optimal design of a lagrangian observing system for hydrodynamic surveys in coastal areas

    NASA Astrophysics Data System (ADS)

    Cucco, Andrea; Quattrocchi, Giovanni; Antognarelli, Fabio; Satta, Andrea; Maicu, Francesco; Ferrarin, Christian; Umgiesser, Georg

    2014-05-01

The optimization of ocean observing systems is a pressing need for scientific research. In particular, the improvement of short-term ocean observing networks is achievable by reducing the cost-benefit ratio of field campaigns and by increasing the quality of measurements. Numerical modeling is a powerful tool for determining the appropriateness of a specific observing system and for optimizing the sampling design. This is particularly true when observations are carried out in coastal areas and lagoons, where the use of satellites is prohibitive due to the shallowness of the water. For such areas, numerical models are the most efficient tool both to provide a preliminary assessment of the local physical environment and to make short-term predictions about its changes. In this context, a test-case experiment was carried out in an enclosed shallow-water area, the Cabras Lagoon (Sardinia, Italy). The aim of the experiment was to explore the optimal design for a field survey based on the use of coastal lagrangian buoys. A three-dimensional hydrodynamic model based on the finite element method (SHYFEM3D, Umgiesser et al., 2004) was implemented to simulate the lagoon water circulation. The model domain extends over the whole Cabras Lagoon and the whole Gulf of Oristano, including the surrounding coastal area. Lateral open-boundary conditions were provided by the operational ocean model system WMED, and only wind forcing, provided by the SKIRON atmospheric model (Kallos et al., 1997), was considered as the surface boundary condition. The model was applied to provide a number of ad hoc scenarios and to explore the efficiency of the short-term hydrodynamic survey. A first field campaign was carried out to investigate the lagrangian circulation inside the lagoon under the main wind forcing condition (Mistral wind from the north-west). The trajectories followed by the lagrangian buoys and the estimated lagrangian velocities were used to calibrate the model parameters and to validate the

  9. Korean Environmental Health Survey in Children and Adolescents (KorEHS-C): survey design and pilot study results on selected exposure biomarkers.

    PubMed

    Ha, Mina; Kwon, Ho-Jang; Leem, Jong-Han; Kim, Hwan-Cheol; Lee, Kee Jae; Park, Inho; Lim, Young-Wook; Lee, Jong-Hyeon; Kim, Yeni; Seo, Ju-Hee; Hong, Soo-Jong; Choi, Youn-Hee; Yu, Jeesuk; Kim, Jeongseon; Yu, Seung-Do; Lee, Bo-Eun

    2014-03-01

    For the first nationwide representative survey on the environmental health of children and adolescents in Korea, we designed the Korean Environmental Health Survey in Children and Adolescents (KorEHS-C) as a two-phase survey and planned a sampling strategy that would represent the whole population of Korean children and adolescents, based on the school unit for the 6-19 years age group and the household unit for the 5 years or less age group. A pilot study for 351 children and adolescents aged 6 to 19 years in elementary, middle, and high school of two cities was performed to validate several measurement methods and tools, as well as to test their feasibility, and to elaborate the protocols used throughout the survey process. Selected exposure biomarkers, i.e., lead, mercury, cadmium in blood, and bisphenol A, metabolites of diethylhexyl phthalate and di-n-butyl phthalate and cotinine in urine were analyzed. We found that the levels of blood mercury (Median: 1.7 ug/L) and cadmium (Median: 0.30 ug/L) were much higher than those of subjects in Germany and the US, while metabolites of phthalates and bisphenol A showed similar levels and tendencies by age; the highest levels of phthalate metabolites and bisphenol A occurred in the youngest group of children. Specific investigations to elucidate the exposure pathways of major environmental exposure need to be conducted, and the KorEHS-C should cover as many potential environmental hazards as possible.

  10. A two-phase sampling design for increasing detections of rare species in occupancy surveys

    USGS Publications Warehouse

    Pacifici, Krishna; Dorazio, Robert M.; Dorazio, Michael J.

    2012-01-01

    1. Occupancy estimation is a commonly used tool in ecological studies owing to the ease with which data can be collected and the large spatial extent that can be covered. One major obstacle to using an occupancy-based approach is the complication of designing and implementing an efficient survey. These logistical challenges are magnified when working with rare species, for which effort can be wasted in areas with few or no individuals. 2. Here, we develop a two-phase sampling approach that mitigates these problems by using a design that places more effort in areas with higher predicted probability of occurrence. We compare our new sampling design to traditional single-season occupancy estimation under a range of conditions and population characteristics. We develop an intuitive measure of predictive error to compare the two approaches and use simulations to assess the relative accuracy of each approach. 3. Our two-phase approach exhibited lower predictive error rates than the traditional single-season approach in highly spatially correlated environments. The difference was greatest when detection probability was high (0.75), regardless of the habitat or sample size. When the true occupancy rate was below 0.4 (0.05-0.4), we found that allocating 25% of the sample to the first phase resulted in the lowest error rates. 4. In the majority of scenarios, the two-phase approach showed lower error rates than the traditional single-season approach, suggesting that our new approach is fairly robust to a broad range of conditions and design factors and merits use under a wide variety of settings. 5. Synthesis and applications. Conservation and management of rare species are a challenging task facing natural resource managers. It is critical for studies involving rare species to allocate effort and resources efficiently, as these are usually finite. We believe our approach provides a framework for optimal allocation of effort while
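    The targeting idea described in this abstract can be illustrated with a toy simulation (a hedged sketch only: the covariate, occupancy model, budget, and all numbers below are invented for illustration and are not taken from the paper):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_sites, visits, p_detect = 2000, 3, 0.75

    # Hypothetical landscape: occupancy probability rises with a habitat covariate x
    x = rng.normal(size=n_sites)
    psi = 1.0 / (1.0 + np.exp(-(-2.0 + 1.5 * x)))   # fairly rare species
    z = rng.random(n_sites) < psi                   # true occupancy states

    def detections(site_idx):
        """Total detections over `visits` visits; detection only at occupied sites."""
        occ = z[site_idx]
        det = rng.random((len(site_idx), visits)) < p_detect
        return int(np.sum(occ[:, None] & det))

    budget = 800  # number of sites we can afford to survey

    # Single-phase design: spend the whole budget on randomly chosen sites
    single = detections(rng.choice(n_sites, budget, replace=False))

    # Two-phase design: 25% of the budget at random, then target the remainder
    # at the sites with the highest predicted occupancy (here, the highest x)
    phase1 = rng.choice(n_sites, budget // 4, replace=False)
    d1 = detections(phase1)
    remaining = np.setdiff1d(np.arange(n_sites), phase1)
    phase2 = remaining[np.argsort(-x[remaining])][: budget - budget // 4]
    two_phase = d1 + detections(phase2)
    ```

    Under a strong occupancy-covariate relationship like the one assumed here, the two-phase allocation yields substantially more detections for the same budget, which is the intuition behind the paper's design.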

  11. Characteristics of Designated Drivers and their Passengers from the 2007 National Roadside Survey in the United States

    PubMed Central

    Bergen, Gwen; Yao, Jie; Shults, Ruth A.; Romano, Eduardo; Lacey, John

    2015-01-01

    Objective The objectives of this study were to estimate the prevalence of designated driving in the United States, compare these results with those from the 1996 National Roadside Survey, and explore the demographic, drinking, and trip characteristics of both designated drivers and their passengers. Methods The data used were from the 2007 National Roadside Survey, which randomly stopped drivers, administered breath tests for alcohol, and administered a questionnaire to drivers and front-seat passengers. Results Almost a third (30%) of nighttime drivers reported being designated drivers, with 84% of them having a blood alcohol concentration of zero. Drivers more likely to be designated drivers were those with a blood alcohol concentration that was over zero but still legal, those under 35 years of age, those who were African-American, Hispanic, or Asian, and those whose driving trip originated at a bar, tavern, or club. Over a third of the passengers of designated drivers reported consuming an alcoholic drink the day of the survey, compared with a fifth of the passengers of non-designated drivers. One-fifth of designated-driver passengers who reported drinking consumed five or more drinks that day. Conclusions Designated driving is widely used in the United States, with the majority of designated drivers abstaining from drinking alcohol. However, because designated driving separates drinking from driving for passengers traveling together, it may encourage passengers to binge drink, which is associated with many adverse health consequences in addition to those arising from alcohol-impaired driving. Designated driving programs and campaigns, although not proven effective when used alone, can complement proven effective interventions to help reduce excessive drinking and alcohol-impaired driving. PMID:24372499

  12. SIS Mixer Design for a Broadband Millimeter Spectrometer Suitable for Rapid Line Surveys and Redshift Determinations

    NASA Technical Reports Server (NTRS)

    Rice, F.; Sumner, M.; Zmuidzinas, J.; Hu, R.; LeDuc, H.; Harris, A.; Miller, D.

    2004-01-01

    We present some detail of the waveguide probe and SIS mixer chip designs for a low-noise 180-300 GHz double-sideband receiver with an instantaneous RF bandwidth of 24 GHz. The receiver's single SIS junction is excited by a broadband, fixed-tuned waveguide probe on a silicon substrate. The IF output is coupled to a 6-18 GHz MMIC low-noise preamplifier. Following further amplification, the output is processed by an array of 4 GHz, 128-channel analog autocorrelation spectrometers (WASP II). The single-sideband receiver noise temperature goal of 70 Kelvin will provide a prototype instrument capable of rapid line surveys and of relatively efficient carbon monoxide (CO) emission line searches of distant, dusty galaxies. The latter application's goal is to determine redshifts by measuring the frequencies of CO line emissions from the star-forming regions dominating the submillimeter brightness of these galaxies. Construction of the receiver has begun; lab testing should begin in the fall. Demonstration of the receiver on the Caltech Submillimeter Observatory (CSO) telescope should begin in spring 2003.

  13. Design, Data Collection, Interview Timing, and Data Editing in the 1995 National Household Education Survey (NHES:95). Working Paper Series.

    ERIC Educational Resources Information Center

    Collins, Mary A.; Brick, J. Michael; Loomis, Laura S.; Nicchitta, Patricia G.; Fleischman, Susan

    The National Household Education Survey (NHES) is a data collection effort of the National Center for Education Statistics that collects and publishes data on the condition of education in the United States. The NHES is designed to provide information on issues that are best addressed by contacting households rather than institutions. It is a…

  14. 78 FR 5458 - Medicare Program; Request for Information To Aid in the Design and Development of a Survey...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-25

    ... HUMAN SERVICES Centers for Medicare & Medicaid Services Medicare Program; Request for Information To Aid in the Design and Development of a Survey Regarding Patient and Family Member/Friend Experiences With Hospice Care AGENCY: Centers for Medicare & Medicaid Services (CMS), HHS. ACTION: Request for...

  15. 78 FR 5459 - Medicare Program; Request for Information To Aid in the Design and Development of a Survey...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-25

    ... HUMAN SERVICES Centers for Medicare & Medicaid Services Medicare Program; Request for Information To Aid in the Design and Development of a Survey Regarding Patient Experiences With Hospital Outpatient...: Request for information. SUMMARY: This document is a request for information regarding hospital...

  16. CONDITION ASSESSMENT FOR THE ESCAMBIA RIVER, FL, WATERSHED: BENTHIC MACROINVERTEBRATE SURVEYS USING A PROBABILISTIC SAMPLING DESIGN (POSTER SESSION)

    EPA Science Inventory

    Probabilistic sampling has been used to assess the condition of estuarine ecosystems, and the use of this survey design approach was examined for a northwest Florida watershed. Twenty-eight lotic sites within the Escambia River, Florida, watershed were randomly selected and visit...

  17. Using qualitative research to facilitate the interpretation of quantitative results from a discrete choice experiment: insights from a survey in elderly ophthalmologic patients

    PubMed Central

    Vennedey, Vera; Danner, Marion; Evers, Silvia MAA; Fauser, Sascha; Stock, Stephanie; Dirksen, Carmen D; Hiligsmann, Mickaël

    2016-01-01

    Background Age-related macular degeneration (AMD) is the leading cause of visual impairment and blindness in industrialized countries. Currently, three main treatment options are available, all of which are intravitreal injections but which differ in the frequency of injections needed, their approval status, and cost. This study aims to estimate patients' preferences for characteristics of treatment options for neovascular AMD. Methods An interviewer-assisted discrete choice experiment was conducted among patients suffering from AMD treated with intravitreal injections. A Bayesian efficient design was used to develop 12 choice tasks. In each task, patients indicated their preference for one of two treatment scenarios described by the attributes side effects, approval status, effect on visual function, and injection and monitoring frequency. While answering the choice tasks, patients were asked to think aloud and explain their reasons for choosing or rejecting specific characteristics. Quantitative data were analyzed with a mixed multinomial logit model. Results Eighty-six patients completed the questionnaire. Patients significantly preferred treatments that improve visual function, are approved, are administered in a pro re nata regimen (as needed), and are accompanied by bimonthly monitoring. Patients significantly disliked less frequent monitoring visits (every 4 months) and explained that this was due to fear that a deterioration would go unnoticed. Significant preference heterogeneity was found for all levels except bimonthly monitoring visits and severe, rare eye-related side effects. Patients gave clear explanations of their individual preferences during the interviews. Conclusion Significant preference trends were discernible for the overall sample, despite the preference heterogeneity for most treatment characteristics. Patients like to be monitored and treated regularly, but not too frequently
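    As a rough illustration of how a multinomial logit model turns attribute part-worths into choice probabilities in a discrete choice experiment (the attribute names and utility values below are hypothetical, not estimates from this study):

    ```python
    import numpy as np

    # Hypothetical part-worth utilities for treatment attributes
    # (illustrative values only, not this study's mixed-logit estimates)
    beta = {
        "improves_vision": 1.2,
        "approved": 0.8,
        "pro_re_nata": 0.5,       # "as needed" injection regimen
        "monitor_4_monthly": -0.9,  # monitoring only every 4 months
    }

    def utility(attrs):
        """Scenario utility: sum of the part-worths of its attributes."""
        return sum(beta[a] for a in attrs)

    # One choice task: two treatment scenarios described by their attributes
    u = np.array([
        utility(["improves_vision", "approved"]),
        utility(["pro_re_nata", "monitor_4_monthly"]),
    ])

    # Multinomial logit: choice probabilities are the softmax of the utilities
    p = np.exp(u) / np.exp(u).sum()
    ```

    A mixed (random-parameters) logit, as used in the study, additionally lets `beta` vary across respondents, which is how the reported preference heterogeneity is captured.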

  18. Improving the design of amphibian surveys using soil data: A case study in two wilderness areas

    USGS Publications Warehouse

    Bowen, K.D.; Beever, E.A.; Gafvert, U.B.

    2009-01-01

    Amphibian populations are known, or thought to be, declining worldwide. Although protected natural areas may act as reservoirs of biological integrity and serve as benchmarks for comparison with unprotected areas, they are not immune from population declines and extinctions and should be monitored. Unfortunately, identifying survey sites and performing long-term fieldwork within such (often remote) areas involves a special set of problems. We used the USDA Natural Resource Conservation Service Soil Survey Geographic (SSURGO) Database to identify, a priori, potential habitat for aquatic-breeding amphibians on North and South Manitou Islands, Sleeping Bear Dunes National Lakeshore, Michigan, and compared the results to those obtained using National Wetland Inventory (NWI) data. The SSURGO approach identified more target sites for surveys than the NWI approach, and it identified more small and ephemeral wetlands. Field surveys used a combination of daytime call surveys, night-time call surveys, and perimeter surveys. We found that sites that would not have been identified with NWI data often contained amphibians and, in one case, contained wetland-breeding species that would not have been found using NWI data. Our technique allows for easy a priori identification of numerous survey sites that might not be identified using other sources of spatial information. We recognize, however, that the most effective site identification and survey techniques will likely use a combination of methods in addition to those described here.

  19. National Aquatic Resource Surveys: Multiple objectives and constraints lead to design complexity

    EPA Science Inventory

    The US Environmental Protection Agency began conducting the National Aquatic resource Surveys (NARS) in 2007 with a national survey of lakes (NLA 2007) followed by rivers and streams in 2008-9 (NRSA 2008), coastal waters in 2010 (NCCA 2010) and wetlands in 2011 (NWCA). The surve...

  20. Designing Anti-Influenza Aptamers: Novel Quantitative Structure Activity Relationship Approach Gives Insights into Aptamer – Virus Interaction

    PubMed Central

    Musafia, Boaz; Oren-Banaroya, Rony; Noiman, Silvia

    2014-01-01

    This study describes the development of aptamers as a therapy against influenza virus infection. Aptamers are oligonucleotides (like ssDNA or RNA) that are capable of binding to a variety of molecular targets with high affinity and specificity. We have studied the ssDNA aptamer BV02, which was designed to inhibit influenza infection by targeting the hemagglutinin viral protein, a protein that facilitates the first stage of the virus’ infection. While testing other aptamers and during lead optimization, we realized that the dominant characteristics that determine the aptamer’s binding to the influenza virus may not necessarily be sequence-specific, as with other known aptamers, but rather depend on general 2D structural motifs. We adopted QSAR (quantitative structure activity relationship) tool and developed computational algorithm that correlate six calculated structural and physicochemical properties to the aptamers’ binding affinity to the virus. The QSAR study provided us with a predictive tool of the binding potential of an aptamer to the influenza virus. The correlation between the calculated and actual binding was R2 = 0.702 for the training set, and R2 = 0.66 for the independent test set. Moreover, in the test set the model’s sensitivity was 89%, and the specificity was 87%, in selecting aptamers with enhanced viral binding. The most important properties that positively correlated with the aptamer’s binding were the aptamer length, 2D-loops and repeating sequences of C nucleotides. Based on the structure-activity study, we have managed to produce aptamers having viral affinity that was more than 20 times higher than that of the original BV02 aptamer. Further testing of influenza infection in cell culture and animal models yielded aptamers with 10 to 15 times greater anti-viral activity than the BV02 aptamer. Our insights concerning the mechanism of action and the structural and physicochemical properties that govern the interaction with the
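    The general shape of the QSAR workflow described, regressing simple structural and physicochemical descriptors against binding affinity and scoring the fit with R², can be sketched as follows (the three descriptors and all data here are synthetic stand-ins, not the study's six properties or its aptamer measurements):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 40  # number of aptamers in a hypothetical training set

    # Hypothetical standardized descriptors, e.g. length, 2D-loop count,
    # C-repeat count (synthetic values for illustration)
    X = rng.normal(size=(n, 3))
    true_w = np.array([0.9, 0.6, 0.4])
    y = X @ true_w + rng.normal(scale=0.3, size=n)  # synthetic "binding affinity"

    # Ordinary least-squares fit with an intercept column
    A = np.column_stack([X, np.ones(n)])
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    y_hat = A @ w

    # Coefficient of determination between predicted and actual binding
    r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - np.mean(y)) ** 2)
    ```

    A model like this is then validated on a held-out test set (the study reports R² = 0.702 for training and 0.66 for its independent test set) before being used to rank candidate aptamers.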

  1. Design Evolution of the Wide Field Infrared Survey Telescope Using Astrophysics Focused Telescope Assets (WFIRST-AFTA) and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Peabody, Hume L.; Peters, Carlton V.; Rodriguez-Ruiz, Juan E.; McDonald, Carson S.; Content, David A.; Jackson, Clifton E.

    2015-01-01

    The design of the Wide Field Infrared Survey Telescope using Astrophysics Focused Telescope Assets (WFIRST-AFTA) continues to evolve as each design cycle is analyzed. In 2012, two Hubble sized (2.4 m diameter) telescopes were donated to NASA from elsewhere in the Federal Government. NASA began investigating potential uses for these telescopes and identified WFIRST as a mission to benefit from these assets. With an updated, deeper, and sharper field of view than previous design iterations with a smaller telescope, the optical designs of the WFIRST instruments were updated and the mechanical and thermal designs evolved around the new optical layout. Beginning with Design Cycle 3, significant analysis efforts yielded a design and model that could be evaluated for Structural-Thermal-Optical-Performance (STOP) purposes for the Wide Field Imager (WFI) and provided the basis for evaluating the high level observatory requirements. Development of the Cycle 3 thermal model provided some valuable analysis lessons learned and established best practices for future design cycles. However, the Cycle 3 design did include some major liens and evolving requirements which were addressed in the Cycle 4 Design. Some of the design changes are driven by requirements changes, while others are optimizations or solutions to liens from previous cycles. Again in Cycle 4, STOP analysis was performed and further insights into the overall design were gained leading to the Cycle 5 design effort currently underway. This paper seeks to capture the thermal design evolution, with focus on major design drivers, key decisions and their rationale, and lessons learned as the design evolved.

  2. Design Evolution of the Wide Field Infrared Survey Telescope using Astrophysics Focused Telescope Assets (WFIRST-AFTA) and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Peabody, Hume; Peters, Carlton; Rodriguez, Juan; McDonald, Carson; Content, David A.; Jackson, Cliff

    2015-01-01

    The design of the Wide Field Infrared Survey Telescope using Astrophysics Focused Telescope Assets (WFIRST-AFTA) continues to evolve as each design cycle is analyzed. In 2012, two Hubble sized (2.4 m diameter) telescopes were donated to NASA from elsewhere in the Federal Government. NASA began investigating potential uses for these telescopes and identified WFIRST as a mission to benefit from these assets. With an updated, deeper, and sharper field of view than previous design iterations with a smaller telescope, the optical designs of the WFIRST instruments were updated and the mechanical and thermal designs evolved around the new optical layout. Beginning with Design Cycle 3, significant analysis efforts yielded a design and model that could be evaluated for Structural-Thermal-Optical-Performance (STOP) purposes for the Wide Field Imager (WFI) and provided the basis for evaluating the high level observatory requirements. Development of the Cycle 3 thermal model provided some valuable analysis lessons learned and established best practices for future design cycles. However, the Cycle 3 design did include some major liens and evolving requirements which were addressed in the Cycle 4 Design. Some of the design changes are driven by requirements changes, while others are optimizations or solutions to liens from previous cycles. Again in Cycle 4, STOP analysis was performed and further insights into the overall design were gained leading to the Cycle 5 design effort currently underway. This paper seeks to capture the thermal design evolution, with focus on major design drivers, key decisions and their rationale, and lessons learned as the design evolved.

  3. A survey of Utah's public secondary education science teachers to determine their feelings of preparedness to teach engineering design

    NASA Astrophysics Data System (ADS)

    Ames, R. Tyler

    The Next Generation Science Standards were released in 2013 and call for the inclusion of engineering design in the science classroom. This integration of science and engineering is very exciting for many people and groups in both fields, but considerable uncertainty remains about how prepared science teachers feel to teach engineering design. This study analyzes the history of science standards leading up to the Next Generation Science Standards, establishes key components of engineering design, and provides the background for the study detailed in this report. A survey was given to several hundred public secondary science teachers in the state of Utah in which respondents were asked to report their feelings of preparedness on several aspects of engineering design. The findings show that Utah teachers did not feel fully prepared to teach engineering design at the time of the study (2014).

  4. Designing a household survey to address seasonality in child care arrangements.

    PubMed

    Schmidt, Stefanie R; Wang, Kevin H; Sonenstein, Freya L

    2008-04-01

    In household telephone surveys, a long field period may be required to maximize the response rate and achieve adequate sample sizes. However, long field periods can be problematic when measures of seasonally affected behavior are sought. Surveys of child care use are one example because child care arrangements vary by season. Options include varying the questions posed about school-year and summer arrangements or posing retrospective questions about child care use for the school year only. This article evaluates the bias associated with the use of retrospective questions about school-year child care arrangements in the 1999 National Survey of America's Families. The authors find little evidence of bias and hence recommend that future surveys use the retrospective approach.

  5. Design and Evaluation of Digital Learning Material to Support Acquisition of Quantitative Problem-Solving Skills within Food Chemistry

    ERIC Educational Resources Information Center

    Diederen, Julia; Gruppen, Harry; Hartog, Rob; Voragen, Alphons G. J.

    2005-01-01

    One of the modules in the course Food Chemistry at Wageningen University (Wageningen, The Netherlands) focuses on quantitative problem-solving skills related to chemical reactions. The intended learning outcomes of this module are firstly, to be able to translate practical food chemistry related problems into mathematical equations and to solve…

  6. [New design of the Health Survey of Catalonia (Spain, 2010-2014): a step forward in health planning and evaluation].

    PubMed

    Alcañiz-Zanón, Manuela; Mompart-Penina, Anna; Guillén-Estany, Montserrat; Medina-Bustos, Antonia; Aragay-Barbany, Josep M; Brugulat-Guiteras, Pilar; Tresserras-Gaju, Ricard

    2014-01-01

    This article presents the genesis of the Health Survey of Catalonia (Spain, 2010-2014) with its semiannual subsamples and explains the basic characteristics of its multistage sampling design. In comparison with previous surveys, the organizational advantages of this new statistical operation include rapid data availability and the ability to continuously monitor the population. The main benefits are timeliness in the production of indicators and the possibility of introducing new topics through the supplemental questionnaire as a function of needs. Limitations consist of the complexity of the sample design and the lack of longitudinal follow-up of the sample. Suitable sampling weights for each specific subsample are necessary for any statistical analysis of micro-data. Accuracy in the analysis of territorial disaggregation or population subgroups increases if annual samples are accumulated.

  7. Quantitative Assessment of a Senge Learning Organization Intervention

    ERIC Educational Resources Information Center

    Kiedrowski, P. Jay

    2006-01-01

    Purpose: To quantitatively assess a Senge learning organization (LO) intervention to determine if it would result in improved employee satisfaction. Design/methodology/approach: A Senge LO intervention in Division 123 of Company ABC was undertaken in 2000. Three employee surveys using likert-scale questions over five years and correlation analysis…

  8. A survey of ground operations tools developed to simulate the pointing of space telescopes and the design for WISE

    NASA Technical Reports Server (NTRS)

    Fabinsky, Beth

    2006-01-01

    WISE, the Wide Field Infrared Survey Explorer, is scheduled for launch in June 2010. The mission operations system for WISE requires a software modeling tool to help plan, integrate and simulate all spacecraft pointing and verify that no attitude constraints are violated. In the course of developing the requirements for this tool, an investigation was conducted into the design of similar tools for other space-based telescopes. This paper summarizes the ground software and processes used to plan and validate pointing for a selection of space telescopes; with this information as background, the design for WISE is presented.

  9. Design of a Mars Airplane Propulsion System for the Aerial Regional-Scale Environmental Survey (ARES) Mission Concept

    NASA Technical Reports Server (NTRS)

    Kuhl, Christopher A.

    2008-01-01

    The Aerial Regional-Scale Environmental Survey (ARES) is a Mars exploration mission concept that utilizes a rocket-propelled airplane to take scientific measurements of atmospheric, surface, and subsurface phenomena. The liquid rocket propulsion system design has matured through several design cycles and trade studies since the inception of the ARES concept in 2002. This paper describes the process of selecting a bipropellant system over other propulsion system options, and provides details on the rocket system design, thrusters, propellant tank and propellant management device (PMD) design, propellant isolation, and flow control hardware. The paper also summarizes computer model results of thruster plume interactions and simulated flight performance. The airplane has a 6.25 m wingspan with a total wet mass of 185 kg and has the ability to fly over 600 km through the atmosphere of Mars with 45 kg of MMH/MON3 propellant.

  10. Design of Reconnaissance Helicopter Electromagnetic and Magnetic Geophysical Surveys of the North Platte River and Lodgepole Creek, Nebraska

    NASA Astrophysics Data System (ADS)

    Smith, B. D.; Cannia, J. C.; Abraham, J. D.

    2009-12-01

    An innovative flight line layout using widely separated lines was used for frequency domain helicopter electromagnetic (HEM) surveys in 2008 and 2009 in the Panhandle of western Nebraska. The HEM survey design was developed as part of a joint hydrologic study by the North Platte Natural Resource District, South Platte Natural Resource District, UNL-Conservation and Survey Division, and U.S. Geological Survey to improve the understanding of relationships between surface water and groundwater systems critical to developing groundwater flow models used in water resources management programs. Use of HEM methods for hydrologic mapping had been demonstrated by HEM surveys conducted in 2007 of sites in the glaciated Platte River Basin in eastern Nebraska. These surveys covered township-scale areas with flight lines laid out in blocks where the lines were spaced about 270m apart. The HEM successfully mapped the complex 3D geometry of shallow sand and gravel aquifers through and within conductive till to a depth of about 40m in a total area of about 680 km2 (263 mi2). Current groundwater flow models in western Nebraska include the Cooperative Hydrologic Study (COHYST), run by a consortium of state agencies, which is tasked to develop scientifically supportable hydrologic databases, analyses, and models, and the North Platte River Valley Optimization Model (NPRVOM). The COHYST study area, about 75,000 km2 (29,000 mi2), includes the Platte River Basin from the Nebraska - Wyoming border to Lincoln. Considering the large area of the groundwater models, the USGS decided in collaboration with the NRD to use a more reconnaissance-style layout for the 2008 HEM survey which encompassed about 21,000 km2 (8,000 mi2). A reconnaissance-type HEM survey is made possible due to technical capabilities of applicable HEM systems and due to the level of hydrogeologic information available in the NRD. The particular capabilities of the HEM system are careful calibration, low drift, low noise

  11. Essential Steps for Web Surveys: A Guide to Designing, Administering and Utilizing Web Surveys for University Decision-Making. Professional File. Number 102, Winter 2006

    ERIC Educational Resources Information Center

    Cheskis-Gold, Rena; Loescher, Ruth; Shepard-Rabadam, Elizabeth; Carroll, Barbara

    2006-01-01

    During the past few years, several Harvard paper surveys were converted to Web surveys. These were high-profile surveys endorsed by the Provost and the Dean of the College, and covered major portions of the university population (all undergraduates, all graduate students, tenured and non-tenured faculty). When planning for these surveys started in…

  12. Survey of waste package designs for disposal of high-level waste/spent fuel in selected foreign countries

    SciTech Connect

    Schneider, K.J.; Lakey, L.T.; Silviera, D.J.

    1989-09-01

    This report presents the results of a survey of the waste package strategies for seven western countries with active nuclear power programs that are pursuing disposal of spent nuclear fuel or high-level wastes in deep geologic rock formations. Information, current as of January 1989, is given on the leading waste package concepts for Belgium, Canada, France, the Federal Republic of Germany, Sweden, Switzerland, and the United Kingdom. All but two of the countries surveyed (France and the UK) have developed design concepts for their repositories, but none of the countries has finalized its waste repository or package concept. Waste package concepts are under study in all the countries surveyed except the UK. Most of the countries have not yet developed a reference concept and are considering several concepts. Most of the information presented in this report is for the current reference or leading concepts. All canisters for the wastes are cylindrical and are made of metal (stainless steel, mild steel, titanium, or copper). The canister concepts have relatively thin walls, except those for spent fuel in Sweden and Germany. Diagrams are presented for the reference or leading canister concepts for the countries surveyed. The expected lifetimes of the conceptual canisters in their respective disposal environments are typically 500 to 1,000 years, with Sweden's copper canister expected to last as long as one million years. Overpack containers that would hold the canisters are being considered in some of the countries. All of the countries surveyed except one (Germany) are currently planning to utilize a buffer material (typically bentonite) surrounding the disposal package in the repository. Most of the countries surveyed plan to limit the maximum temperature in the buffer material to about 100°C. 52 refs., 9 figs.

  13. Survey Says

    ERIC Educational Resources Information Center

    McCarthy, Susan K.

    2005-01-01

    Survey Says is a lesson plan designed to teach college students how to access Internet resources for valid data related to the sexual health of young people. Discussion questions based on the most recent available data from two national surveys, the Youth Risk Behavior Surveillance-United States, 2003 (CDC, 2004) and the National Survey of…

  14. Coherent Power Analysis in Multi-Level Studies Using Design Parameters from Surveys

    ERIC Educational Resources Information Center

    Rhoads, Christopher

    2016-01-01

    Current practice for conducting power analyses in hierarchical trials using survey based ICC and effect size estimates may be misestimating power because ICCs are not being adjusted to account for treatment effect heterogeneity. Results presented in Table 1 show that the necessary adjustments can be quite large or quite small. Furthermore, power…

  15. A design of strategic alliance based on value chain of surveying and mapping enterprises in China

    NASA Astrophysics Data System (ADS)

    Duan, Hong; Huang, Xianfeng

    2007-06-01

    In this paper, we use value chain and strategic alliance theories to analyze the surveying and mapping industry and its enterprises in China. The value chain of surveying and mapping enterprises is highly interconnected but fragmented by administrative interference, and the enterprises are typically small in scale. Given this, we consider that establishing a non-equity-holding strategic alliance based on the value chain is a viable approach: it lets the enterprises share superior resources across the sectors of the whole value chain while avoiding conflict with the interests of the related administrative departments, so that the surveying and mapping enterprises develop both individually and collectively. We then present a method for building the strategic alliance model by partitioning the value chain and exploiting the advantages of companies in different value chain sectors. Finally, we use game theory to analyze the internal dynamics of the strategic alliance and show that it is a suitable way to realize the development of surveying and mapping enterprises.

  16. The Outer Solar System Origins Survey. I. Design and First-quarter Discoveries

    NASA Astrophysics Data System (ADS)

    Bannister, Michele T.; Kavelaars, J. J.; Petit, Jean-Marc; Gladman, Brett J.; Gwyn, Stephen D. J.; Chen, Ying-Tung; Volk, Kathryn; Alexandersen, Mike; Benecchi, Susan D.; Delsanti, Audrey; Fraser, Wesley C.; Granvik, Mikael; Grundy, Will M.; Guilbert-Lepoutre, Aurélie; Hestroffer, Daniel; Ip, Wing-Huen; Jakubik, Marian; Jones, R. Lynne; Kaib, Nathan; Kavelaars, Catherine F.; Lacerda, Pedro; Lawler, Samantha; Lehner, Matthew J.; Lin, Hsing Wen; Lister, Tim; Lykawka, Patryk Sofia; Monty, Stephanie; Marsset, Michael; Murray-Clay, Ruth; Noll, Keith S.; Parker, Alex; Pike, Rosemary E.; Rousselot, Philippe; Rusk, David; Schwamb, Megan E.; Shankman, Cory; Sicardy, Bruno; Vernazza, Pierre; Wang, Shiang-Yu

    2016-09-01

    We report the discovery, tracking, and detection circumstances for 85 trans-Neptunian objects (TNOs) from the first 42 deg² of the Outer Solar System Origins Survey. This ongoing r-band solar system survey uses the 0.9 deg² field of view MegaPrime camera on the 3.6 m Canada-France-Hawaii Telescope. Our orbital elements for these TNOs are precise to a fractional semimajor axis uncertainty <0.1%. We achieve this precision in just two oppositions, as compared to the normal three to five oppositions, via a dense observing cadence and innovative astrometric technique. These discoveries are free of ephemeris bias, a first for large trans-Neptunian surveys. We also provide the necessary information to enable models of TNO orbital distributions to be tested against our TNO sample. We confirm the existence of a cold “kernel” of objects within the main cold classical Kuiper Belt and infer the existence of an extension of the “stirred” cold classical Kuiper Belt to at least several au beyond the 2:1 mean motion resonance with Neptune. We find that the population model of Petit et al. remains a plausible representation of the Kuiper Belt. The full survey, to be completed in 2017, will provide an exquisitely characterized sample of important resonant TNO populations, ideal for testing models of giant planet migration during the early history of the solar system.

  17. The TRacking Adolescents' Individual Lives Survey (TRAILS): Design, Current Status, and Selected Findings

    ERIC Educational Resources Information Center

    Ormel, Johan; Oldehinkel, Albertine J.; Sijtsema, Jelle; van Oort, Floor; Raven, Dennis; Veenstra, Rene; Vollebergh, Wilma A. M.; Verhulst, Frank C.

    2012-01-01

    Objectives: The objectives of this study were as follows: to present a concise overview of the sample, outcomes, determinants, non-response and attrition of the ongoing TRacking Adolescents' Individual Lives Survey (TRAILS), which started in 2001; to summarize a selection of recent findings on continuity, discontinuity, risk, and protective…

  19. Survey of perceived influence of the conceptual design model of interactive television advertising towards impulse purchase tendency

    NASA Astrophysics Data System (ADS)

    Sarif, Siti Mahfuzah; Omar, Azizah Che; Shiratuddin, Norshuhada

    2016-08-01

    With the proliferation of technology-assisted shopping, there is growing evidence that impulse buying is an emerging phenomenon, and it is the focus of this study. The literature indicates that studies of impulse purchase for interactive television (iTV) advertising are highly scarce: most existing impulse purchase elements focus on traditional retail stores, website advertising, and traditional TV advertising, but not on iTV advertising. Therefore, through a systematic process, a design model for developing iTV advertising that influences impulse purchase tendency was developed and tested in this study. The design model, named iTVAdIP, comprises three main components: technology, impulse purchase components, and development process. This paper describes the survey that measures the influence of the iTVAdIP design model on impulse purchase tendency. Thirty-seven potential advertising designers took part in the survey. The results indicate that iTVAdIP is practical and workable for developing iTV advertisements that could influence consumers to buy the advertised product.

  20. Design and Specification of Optical Bandpass Filters for Hubble Space Telescope (HST) Advanced Camera for Surveys (ACS)

    NASA Technical Reports Server (NTRS)

    Leviton, Douglas B.; Tsevetanov, Zlatan; Woodruff, Bob; Mooney, Thomas A.

    1998-01-01

    Advanced optical bandpass filters for the Hubble Space Telescope (HST) Advanced Camera for Surveys (ACS) have been developed on a filter-by-filter basis through detailed studies which take into account the instrument's science goals, available optical filter fabrication technology, and developments in ACS's charge-coupled device (CCD) detector technology. These filters include a subset of filters for the Sloan Digital Sky Survey (SDSS) which are optimized for astronomical photometry using today's CCDs. In order for ACS to be truly advanced, these filters must push the state of the art in performance in a number of key areas at the same time. Important requirements for these filters include outstanding transmitted wavefront, high transmittance, uniform transmittance across each filter, spectrally structure-free bandpasses, exceptionally high out-of-band rejection, a high degree of parfocality, and immunity to environmental degradation. These constitute a very stringent set of requirements indeed, especially for filters which are up to 90 mm in diameter. The highly successful paradigm in which final specifications for flight filters were derived through interaction among the ACS Science Team, the instrument designer, the lead optical engineer, and the filter designer and vendor is described. Examples of iterative design trade studies carried out in the context of science needs and budgetary and schedule constraints are presented. An overview of the final design specifications for the ACS bandpass and ramp filters is also presented.

  1. Site study plan for EDBH (Engineering Design Boreholes) seismic surveys, Deaf Smith County site, Texas: Revision 1

    SciTech Connect

    Hume, H.

    1987-12-01

    This site study plan describes seismic reflection surveys to be run north-south and east-west across the Deaf Smith County site, intersecting near the Engineering Design Boreholes (EDBH). Both conventional and shallow high-resolution surveys will be run. The field program has been designed to acquire subsurface geologic and stratigraphic data to address information/data needs resulting from Federal and State regulations and Repository program requirements. The data acquired by the conventional surveys will be common-depth-point seismic reflection data optimized for reflection events that indicate geologic structure near the repository horizon. The data will also resolve the basement structure and shallow reflection events up to about the top of the evaporite sequence. Field acquisition includes a testing phase to check and select parameters and a production phase. The field data will be subjected immediately to conventional data processing and interpretation to determine whether there are any anomalous structural or stratigraphic conditions that could affect the choice of the EDBH sites. After the EDBHs have been drilled and logged, including vertical seismic profiling, the data will be reprocessed and reinterpreted for detailed structural and stratigraphic information to guide shaft development. The shallow high-resolution seismic reflection lines will be run along the same alignments, but the lines will be shorter and limited to the immediate vicinity of the EDBH sites. These lines are planned to detect faults or thick channel sands that may be present at the EDBH sites. 23 refs., 7 figs., 5 tabs.

  2. Survey of editors and reviewers of high-impact psychology journals: statistical and research design problems in submitted manuscripts.

    PubMed

    Harris, Alex; Reeder, Rachelle; Hyun, Jenny

    2011-01-01

    The authors surveyed 21 editors and reviewers from major psychology journals to identify and describe the statistical and design errors they encounter most often and to get their advice regarding prevention of these problems. Content analysis of the text responses revealed themes in 3 major areas: (a) problems with research design and reporting (e.g., lack of an a priori power analysis, lack of congruence between research questions and study design/analysis, failure to adequately describe statistical procedures); (b) inappropriate data analysis (e.g., improper use of analysis of variance, too many statistical tests without adjustments, inadequate strategy for addressing missing data); and (c) misinterpretation of results. If researchers attended to these common methodological and analytic issues, the scientific quality of manuscripts submitted to high-impact psychology journals might be significantly improved.

  3. A Quantitative Research Investigation into High School Design and Art Education in a Local High School in Texas

    ERIC Educational Resources Information Center

    Lin, Yi-Hsien

    2013-01-01

    This study was designed to explore the differences between high school teachers with art and science backgrounds in terms of curriculum and student performance in art and design education, federal educational policy, and financial support. The study took place in a local independent school district in Texarkana, Texas. The independent school…

  4. Optical Design of the Camera for Transiting Exoplanet Survey Satellite (TESS)

    NASA Technical Reports Server (NTRS)

    Chrisp, Michael; Clark, Kristin; Primeau, Brian; Dalpiaz, Michael; Lennon, Joseph

    2015-01-01

    The optical design of the wide field of view refractive camera, 34 degrees diagonal field, for the TESS payload is described. This fast f/1.4 cryogenic camera, operating at -75 C, has no vignetting for maximum light gathering within the size and weight constraints. Four of these cameras capture full frames of star images for photometric searches of planet crossings. The optical design evolution, from the initial Petzval design, took advantage of Forbes aspheres to develop a hybrid design form. This maximized the correction from the two aspherics resulting in a reduction of average spot size by sixty percent in the final design. An external long wavelength pass filter was replaced by an internal filter coating on a lens to save weight, and has been fabricated to meet the specifications. The stray light requirements were met by an extended lens hood baffle design, giving the necessary off-axis attenuation.

  5. Optical design of the camera for Transiting Exoplanet Survey Satellite (TESS)

    NASA Astrophysics Data System (ADS)

    Chrisp, Michael; Clark, Kristin; Primeau, Brian; Dalpiaz, Michael; Lennon, Joseph

    2015-09-01

    The optical design of the wide field of view refractive camera with a 34 degree diagonal field for the TESS payload is described. This fast f/1.4 cryogenic camera, operating at -75°C, has no vignetting for maximum light gathering within the size and weight constraints. Four of these cameras capture full frames of star images for photometric searches of planet crossings. The optical design evolution, from the initial Petzval design, takes advantage of Forbes aspheres to develop a hybrid design form. This maximizes the correction from the two aspherics resulting in a reduction of average spot size by sixty percent in the final design. An external long wavelength pass filter has been replaced by an internal filter coating on a lens to save weight, and has been fabricated to meet the specifications. The stray light requirements are met by an extended lens hood baffle design, giving the necessary off-axis attenuation.

  6. Design and analysis of simple choice surveys for natural resource management

    USGS Publications Warehouse

    Fieberg, John; Cornicelli, Louis; Fulton, David C.; Grund, Marrett D.

    2010-01-01

    We used a simple yet powerful method for judging public support for management actions from randomized surveys. We asked respondents to rank choices (representing management regulations under consideration) according to their preference, and we then used discrete choice models to estimate the probability of choosing among options (conditional on the set of options presented to respondents). Because choices may share similar unmodeled characteristics, the multinomial logit model, commonly applied to discrete choice data, may not be appropriate. We introduced the nested logit model, which offers a simple approach for incorporating correlation among choices. This forced-choice survey approach provides a useful method of gathering public input; it is relatively easy to apply in practice, and the data are likely to be more informative than asking constituents to rate the attractiveness of each option separately.
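    The contrast the abstract draws between the multinomial and nested logit can be illustrated with a minimal sketch. Function names, utilities, and nest assignments are illustrative assumptions; utilities are taken as given rather than estimated, and with all dissimilarity parameters equal to 1 the nested model collapses to the multinomial logit.

```python
import math

def mnl_probs(utilities):
    """Multinomial logit choice probabilities (assumes IIA)."""
    e = [math.exp(v) for v in utilities]
    s = sum(e)
    return [x / s for x in e]

def nested_logit_probs(utilities, nests, lambdas):
    """Two-level nested logit.  `nests` maps nest id -> list of option
    indices; `lambdas` maps nest id -> dissimilarity parameter in
    (0, 1].  Smaller lambda means stronger within-nest correlation."""
    # Inclusive value (denominator of the within-nest logit) per nest.
    inclusive = {k: sum(math.exp(utilities[i] / lambdas[k]) for i in members)
                 for k, members in nests.items()}
    denom = sum(s ** lambdas[k] for k, s in inclusive.items())
    probs = [0.0] * len(utilities)
    for k, members in nests.items():
        for i in members:
            probs[i] = (math.exp(utilities[i] / lambdas[k])
                        * inclusive[k] ** (lambdas[k] - 1.0)) / denom
    return probs
```

    Grouping two similar regulations into one nest with lambda < 1 shifts probability between them without distorting the odds against options outside the nest, which is exactly the correlation structure the plain multinomial logit cannot represent.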

  7. Spatial scales of variation in lichens: implications for sampling design in biomonitoring surveys.

    PubMed

    Giordani, Paolo; Brunialti, Giorgio; Frati, Luisa; Incerti, Guido; Ianesch, Luca; Vallone, Emanuele; Bacaro, Giovanni; Maccherini, Simona

    2013-02-01

    The variability of biological data is a main constraint affecting the quality and reliability of lichen biomonitoring surveys for estimating the effects of atmospheric pollution. Although most epiphytic lichen bioindication surveys focus on between-site differences at the landscape level, associated with the large-scale effects of atmospheric pollution, current protocols are based on multilevel sampling, thus adding further sources of variation and inflating the error budget. We test the hypothesis that assemblages of lichen communities vary at each spatial scale examined, in order to determine which scales should be included in future monitoring studies. We compared four sites in Italy, along gradients of atmospheric pollution and climate, to partition the variance components of lichen diversity across spatial scales (from trunks to landscapes). Despite environmental heterogeneity, we observed comparable spatial variance across sites. However, residual variance often exceeded between-plot variability, leading to biased estimates of atmospheric pollution effects.

  8. Evaluating cost-efficiency and accuracy of hunter harvest survey designs

    USGS Publications Warehouse

    Lukacs, P.M.; Gude, J.A.; Russell, R.E.; Ackerman, B.B.

    2011-01-01

    Effective management of harvested wildlife often requires accurate estimates of the number of animals harvested annually by hunters. A variety of techniques exist to obtain harvest data, such as hunter surveys, check stations, mandatory reporting requirements, and voluntary reporting of harvest. Agencies responsible for managing harvested wildlife such as deer (Odocoileus spp.), elk (Cervus elaphus), and pronghorn (Antilocapra americana) are challenged with balancing the cost of data collection versus the value of the information obtained. We compared precision, bias, and relative cost of several common strategies, including hunter self-reporting and random sampling, for estimating hunter harvest using a realistic set of simulations. Self-reporting with a follow-up survey of hunters who did not report produces the best estimate of harvest in terms of precision and bias, but it is also, by far, the most expensive technique. Self-reporting with no follow-up survey risks very large bias in harvest estimates, and the cost increases with increased response rate. Probability-based sampling provides a substantial cost savings, though accuracy can be affected by nonresponse bias. We recommend stratified random sampling with a calibration estimator used to reweight the sample based on the proportions of hunters responding in each covariate category as the best option for balancing cost and accuracy. © 2011 The Wildlife Society.
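    The reweighting idea behind the recommended calibration estimator can be sketched in its simplest form, post-stratification: respondents in each stratum are reweighted so the sample reproduces known stratum sizes, correcting for differential response rates. The function name and strata are hypothetical, and the paper's full calibration estimator and simulation setup are not reproduced; this sketch also assumes every stratum has at least one respondent.

```python
def calibrated_total(sample, frame_counts):
    """Post-stratified estimate of total harvest.

    `sample`: list of (stratum, harvest) tuples for responding hunters.
    `frame_counts`: stratum -> number of licensed hunters in the frame.
    Each respondent in a stratum gets weight N_h / n_h, so the weighted
    sample reproduces the known stratum sizes even when response rates
    differ across strata."""
    by_stratum = {}
    for stratum, harvest in sample:
        by_stratum.setdefault(stratum, []).append(harvest)
    total = 0.0
    for stratum, harvests in by_stratum.items():
        mean_harvest = sum(harvests) / len(harvests)  # n_h respondents
        total += frame_counts[stratum] * mean_harvest  # N_h * mean
    return total
```

    For example, if residents respond at half the rate of nonresidents but harvest more animals, the unweighted sample mean understates the total; the stratum weights undo that distortion.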

  9. Designing Messaging to Engage Patients in an Online Suicide Prevention Intervention: Survey Results From Patients With Current Suicidal Ideation

    PubMed Central

    Lungu, Anita; Richards, Julie; Simon, Gregory E; Clingan, Sarah; Siler, Jaeden; Snyder, Lorilei; Ludman, Evette

    2014-01-01

    Background Computerized, Internet-delivered interventions can be efficacious; however, uptake and sustained client engagement remain major challenges. We see the development of effective engagement strategies as the next frontier in online health interventions, an area where much creative research has begun. We also argue that for engagement strategies to accomplish their purpose with novel targeted populations, they need to be tailored to such populations (ie, content is designed with the target population in mind). User-centered design frameworks provide a theoretical foundation for increasing user engagement and uptake by including users in development. However, deciding how to implement this approach to engage users in mental health intervention development is challenging. Objective The aim of this study was to get user input and feedback on the acceptability of messaging content intended to engage suicidal individuals. Methods In March 2013, clinic intake staff distributed flyers announcing the study, “Your Feedback Counts,” to potential participants (individuals waiting to be seen for a mental health appointment) together with the Patient Health Questionnaire. The flyer explained that a score of two or three (“more than half the days” or “nearly every day,” respectively) on the suicide ideation question made them eligible to provide feedback on components of a suicide prevention intervention under development. The patient could access an anonymous online survey by following a link. After providing consent online, participants completed the anonymous survey. Results Thirty-four individuals provided demographic information. Participants reported that they would be most drawn to an intervention where they knew that they were cared about, that was personalized, that others like them had found it helpful, and that included examples with real people. Participants preferred email invitations with subject lines expressing concern and

  10. Survey of balloon design problems and prospects for large super-pressure balloons in the next century

    NASA Astrophysics Data System (ADS)

    Yajima, Nobuyuki

    About half a century has passed since modern scientific ballooning started in the 1950s. During this time, the size and payload capabilities of zero-pressure balloons have improved rapidly. On the other hand, a super-pressure balloon that can take the place of a conventional large zero-pressure balloon has not yet become operational. To investigate this problem, previous research on balloon design is surveyed. It is concluded that quite important design problems have been left unsolved; in particular, problems occur when a load tape assembly is introduced into the natural-shape balloon system. The author proposed a new balloon design concept, named 3-D gore design, at the last COSPAR, held in Nagoya in 1998. This theory improves the conventional natural-shape design concept for a balloon reinforced by load tapes. The new design concept enables the strength of a balloon to be enhanced dramatically; in addition, the strength does not depend on balloon size. This theory will accelerate the development of large super-pressure balloons, which will play a leading role in scientific ballooning in the 21st century.

  11. GRAND DESIGN AND FLOCCULENT SPIRALS IN THE SPITZER SURVEY OF STELLAR STRUCTURE IN GALAXIES (S⁴G)

    SciTech Connect

    Elmegreen, Debra Meloy; Yau, Andrew; Elmegreen, Bruce G.; Athanassoula, E.; Bosma, Albert; Helou, George; Sheth, Kartik; Ho, Luis C.; Madore, Barry F.; Menendez-Delmestre, Karin; Gadotti, Dimitri A.; Knapen, Johan H.; Laurikainen, Eija; Salo, Heikki; Meidt, Sharon E.; Regan, Michael W.; Zaritsky, Dennis; Aravena, Manuel

    2011-08-10

    Spiral arm properties of 46 galaxies in the Spitzer Survey of Stellar Structure in Galaxies (S⁴G) were measured at 3.6 μm, where extinction is small and the old stars dominate. The sample includes flocculent, multiple arm, and grand design types with a wide range of Hubble and bar types. We find that most optically flocculent galaxies are also flocculent in the mid-IR because of star formation uncorrelated with stellar density waves, whereas multiple arm and grand design galaxies have underlying stellar waves. Arm-interarm contrasts increase from flocculent to multiple arm to grand design galaxies and with later Hubble types. Structure can be traced further out in the disk than in previous surveys. Some spirals peak at mid-radius while others continuously rise or fall, depending on Hubble and bar type. We find evidence for regular and symmetric modulations of the arm strength in NGC 4321. Bars tend to be long, high amplitude, and flat-profiled in early-type spirals, with arm contrasts that decrease with radius beyond the end of the bar, and they tend to be short, low amplitude, and exponential-profiled in late Hubble types, with arm contrasts that are constant or increase with radius. Longer bars tend to have larger amplitudes and stronger arms.

  12. THE FMOS-COSMOS SURVEY OF STAR-FORMING GALAXIES AT z ∼ 1.6. III. SURVEY DESIGN, PERFORMANCE, AND SAMPLE CHARACTERISTICS

    SciTech Connect

    Silverman, J. D.; Sugiyama, N.; Kashino, D.; Sanders, D.; Zahid, J.; Kewley, L. J.; Chu, J.; Hasinger, G.; Kartaltepe, J. S.; Arimoto, N.; Renzini, A.; Rodighiero, G.; Baronchelli, I.; Daddi, E.; Juneau, S.; Lilly, S. J.; Carollo, C. M.; Capak, P.; Ilbert, O.; and others

    2015-09-15

    We present a spectroscopic survey of galaxies in the COSMOS field using the Fiber Multi-object Spectrograph (FMOS), a near-infrared instrument on the Subaru Telescope. Our survey is specifically designed to detect the Hα emission line that falls within the H-band (1.6–1.8 μm) spectroscopic window from star-forming galaxies with 1.4 < z < 1.7 and M_stellar ≳ 10¹⁰ M_⊙. With the high multiplex capability of FMOS, it is now feasible to construct samples of over 1000 galaxies having spectroscopic redshifts at epochs that were previously challenging. The high-resolution mode (R ∼ 2600) effectively separates Hα and [N ii]λ6585, thus enabling studies of the gas-phase metallicity and photoionization state of the interstellar medium. The primary aim of our program is to establish how star formation depends on stellar mass and environment, both recognized as drivers of galaxy evolution at lower redshifts. In addition to the main galaxy sample, our target selection places priority on those detected in the far-infrared by Herschel/PACS to assess the level of obscured star formation and investigate, in detail, outliers from the star formation rate (SFR)—stellar mass relation. Galaxies with Hα detections are followed up with FMOS observations at shorter wavelengths using the J-long (1.11–1.35 μm) grating to detect Hβ and [O iii]λ5008 which provides an assessment of the extinction required to measure SFRs not hampered by dust, and an indication of embedded active galactic nuclei. With 460 redshifts measured from 1153 spectra, we assess the performance of the instrument with respect to achieving our goals, discuss inherent biases in the sample, and detail the emission-line properties. Our higher-level data products, including catalogs and spectra, are available to the community.

  13. Some New Bases and Needs for Interior Design from Environmental Research. A Preliminary Survey.

    ERIC Educational Resources Information Center

    Kleeman, Walter, Jr.

    Research which can form new bases for interior design is being greatly accelerated. Investigations in psychology, anthropology, psychiatry, and biology, as well as interdisciplinary projects, turn up literally hundreds of studies, the results of which will vitally affect interior design. This body of research falls into two parts--(1) human…

  14. Aerodynamic aircraft design methods and their notable applications: Survey of the activity in Japan

    NASA Technical Reports Server (NTRS)

    Fujii, Kozo; Takanashi, Susumu

    1991-01-01

    An overview of aerodynamic aircraft design methods and their recent applications in Japan is presented. A design code which was developed at the National Aerospace Laboratory (NAL) and is now in use is discussed; hence, most of the examples are the result of collaborative work between heavy industry and the National Aerospace Laboratory. A wide variety of applications in transonic to supersonic flow regimes are presented. Although the design of aircraft elements for external flows is the main focus, some internal flow applications are also presented. Recent applications of the design code, using the Navier-Stokes and Euler equations in the analysis mode, include the design of HOPE (a space vehicle) and Upper Surface Blowing (USB) aircraft configurations.

  15. Using SEM to Analyze Complex Survey Data: A Comparison between Design-Based Single-Level and Model-Based Multilevel Approaches

    ERIC Educational Resources Information Center

    Wu, Jiun-Yu; Kwok, Oi-man

    2012-01-01

    Both ad-hoc robust sandwich standard error estimators (design-based approach) and multilevel analysis (model-based approach) are commonly used for analyzing complex survey data with nonindependent observations. Although these 2 approaches perform equally well on analyzing complex survey data with equal between- and within-level model structures…
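    The design-based idea the abstract contrasts with multilevel modeling, correcting standard errors for clustering rather than modeling the levels, can be sketched for the simplest case, the grand mean. This is a generic linearization (sandwich-style) estimator under assumed equal-probability cluster sampling, not the SEM estimators compared in the article; the function name is illustrative.

```python
def cluster_robust_se_of_mean(clusters):
    """Cluster-robust standard error of a grand mean.

    `clusters`: list of lists of observations, one inner list per
    cluster (at least two clusters required).  Residuals are summed
    within each cluster before squaring, so within-cluster correlation
    inflates the variance estimate relative to the naive iid formula."""
    n = sum(len(c) for c in clusters)
    m = len(clusters)
    ybar = sum(sum(c) for c in clusters) / n
    resid_totals = [sum(y - ybar for y in c) for c in clusters]
    var = (m / (m - 1)) * sum(r * r for r in resid_totals) / n ** 2
    return var ** 0.5
```

    When observations within a cluster move together, the squared cluster totals grow faster than the sum of squared individual residuals, which is how the design-based approach recovers honest standard errors without fitting a multilevel model.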

  16. National Longitudinal Study of the High School Class of 1972: First Followup Survey Design, Instrument Preparation, Data Collection and File Development.

    ERIC Educational Resources Information Center

    Tabler, Kenneth

    Activities involved in the collection and assembling of data for computer processing from the first followup survey of the National Longitudinal Study of the High School Class of 1972 (NLS) are briefly described. Included are an overview of the NLS; the sample design and survey participation; the development of the first followup survey…

  17. An integrated device for magnetically-driven drug release and in situ quantitative measurements: Design, fabrication and testing

    NASA Astrophysics Data System (ADS)

    Bruvera, I. J.; Hernández, R.; Mijangos, C.; Goya, G. F.

    2015-03-01

    We have developed a device capable of remote triggering and in situ quantification of therapeutic drugs, based on magnetically responsive hydrogels of poly(N-isopropylacrylamide) (PNiPAAm) and alginate. The heating efficiency of these hydrogels, measured by their specific power absorption (SPA), showed that values between 100 and 300 W/g of material were high enough to reach the lower critical solution temperature (LCST) of the polymeric matrix within a few minutes. The drug release through application of AC magnetic fields could be controlled by time-modulated field pulses in order to deliver the desired amount of drug. Using vitamin B12 as a concept drug, the device was calibrated to measure amounts of released drug as small as 25(2)×10⁻⁹ g, demonstrating the potential of this device for very precise quantitative control of drug release.

  18. A Survey of Applications and Research in Integrated Design Systems Technology

    NASA Technical Reports Server (NTRS)

    1998-01-01

    The initial part of the study was begun with a combination of literature searches, World Wide Web searches, and contacts with individuals and companies who were known to members of our team to have an interest in topics that seemed to be related to our study. There is a long list of such topics, such as concurrent engineering, design for manufacture, life-cycle engineering, systems engineering, systems integration, systems design, design systems, integrated product and process approaches, enterprise integration, integrated product realization, and similar terms. These all capture, at least in part, the flavor of what we describe here as integrated design systems. An inhibiting factor in this inquiry was the absence of agreed terminology for the study of integrated design systems. It is common for the term to be applied to what are essentially augmented Computer-Aided Design (CAD) systems, which are integrated only to the extent that agreements have been reached to attach proprietary extensions to proprietary CAD programs. It is also common for some to use the term integrated design systems to mean a system that applies only, or mainly, to the design phase of a product life cycle. It is likewise common for many of the terms listed earlier to be used as synonyms for integrated design systems. We tried to avoid this ambiguity by adopting the definition of integrated design systems that is implied in the introductory notes that we provided to our contacts, cited earlier. We thus arrived at this definition: Integrated Design Systems refers to the integration of the different tools and processes that comprise the engineering of complex systems. It takes a broad view of the engineering of systems, to include consideration of the entire product realization process and the product life cycle. An important aspect of integrated design systems is the extent to which they integrate existing "islands of automation" into a comprehensive design and product realization

  19. Longitudinal emittance: An introduction to the concept and survey of measurement techniques including design of a wall current monitor

    SciTech Connect

    Webber, R.C.

    1990-03-01

    The properties of charged particle beams associated with the distribution of the particles in energy and in time can be grouped together under the category of longitudinal emittance. This article is intended to provide an intuitive introduction to the concepts of longitudinal emittance; to provide an incomplete survey of methods used to measure this emittance and the related properties of bunch length and momentum spread; and to describe the detailed design of a 6 GHz bandwidth resistive wall current monitor useful for measuring bunch shapes of moderate- to high-intensity beams. Overall, the article is intended to be broad in scope, in most cases deferring details to cited original papers. 37 refs., 21 figs.

  20. Design and sample characteristics of the 2005-2008 Nutrition and Health Survey in Taiwan.

    PubMed

    Tu, Su-Hao; Chen, Cheng; Hsieh, Yao-Te; Chang, Hsing-Yi; Yeh, Chih-Jung; Lin, Yi-Chin; Pan, Wen-Harn

    2011-01-01

    The Nutrition and Health Survey in Taiwan (NAHSIT) 2005-2008 was funded by the Department of Health to provide continued assessment of the health and nutrition of the people in Taiwan. This household survey collected data from children aged less than 6 years and adults aged 19 years and above, and adopted a three-stage stratified, clustered sampling scheme similar to that used in the NAHSIT 1993-1996. Four samples were produced. One sample with five geographical strata was selected for inference to the whole of Taiwan, while the other three samples, covering the Hakka, Penghu, and mountainous areas, were produced for inference to each cultural stratum. A total of 6,189 household interviews and 3,670 health examinations were completed. Interview data included household information, socio-demographics, 24-hour dietary recall, food frequency and habits, dietary and nutritional knowledge, attitudes and behaviors, physical activity, medical history and bone health. Health exam data included anthropometry, blood pressure, physical fitness, bone density, as well as blood and urine collection. The response rate for the household interview was 65%. Of the interviewed households, 59% participated in the health exam. Only in a few age subgroups were there significant differences in sex, age, education, or ethnicity distribution between respondents and non-respondents. For the health exam, the significant differences between participants and non-participants were mostly observed in those aged 19-64 years. The results of this survey will be of benefit to researchers, policy makers and the public in understanding and improving the nutrition and health status of pre-school children and adults in Taiwan.
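    In a multi-stage stratified design like the one described above, each respondent's design weight is the reciprocal of the overall inclusion probability, accumulated across sampling stages, and population quantities are estimated as weighted averages. A minimal sketch in Python (the stage probabilities and outcome values below are invented for illustration and are not NAHSIT's actual sampling fractions):

```python
import numpy as np

# Toy three-stage design: stratum -> township (PSU) -> household.
# The inclusion probability is the product of the stage-wise
# probabilities, and the design weight is its reciprocal.
p_psu = np.array([0.10, 0.10, 0.25, 0.25])   # PSU sampled within stratum
p_hh  = np.array([0.02, 0.05, 0.02, 0.05])   # household sampled within PSU
y     = np.array([1, 0, 1, 1], dtype=float)  # e.g. indicator of a condition

w = 1.0 / (p_psu * p_hh)                     # design weights
weighted_prev = np.sum(w * y) / np.sum(w)    # Hájek-style weighted estimate
print(f"weighted prevalence = {weighted_prev:.3f}")
```

With weights in hand, any population quantity (a prevalence, a mean intake) becomes a weighted average, which is what allows each of the four samples to be representative of its own stratum.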

  1. ULTRADEEP IRAC IMAGING OVER THE HUDF AND GOODS-SOUTH: SURVEY DESIGN AND IMAGING DATA RELEASE

    SciTech Connect

    Labbé, I.; Bouwens, R. J.; Franx, M.; Stefanon, M.; Oesch, P. A.; Illingworth, G. D.; Holden, B.; Magee, D.; Carollo, C. M.; Trenti, M.; Smit, R.; González, V.; Stiavelli, M.

    2015-12-15

    The IRAC ultradeep field and IRAC Legacy over GOODS programs are two ultradeep imaging surveys at 3.6 and 4.5 μm with the Spitzer Infrared Array Camera (IRAC). The primary aim is to directly detect the infrared light of reionization epoch galaxies at z > 7 and to constrain their stellar populations. The observations cover the Hubble Ultra Deep Field (HUDF), including the two HUDF parallel fields, and the CANDELS/GOODS-South, and are combined with archival data from all previous deep programs into one ultradeep data set. The resulting imaging reaches unprecedented coverage in IRAC 3.6 and 4.5 μm, ranging from >50 hr over 150 arcmin², >100 hr over 60 arcmin², to ∼200 hr over 5–10 arcmin². This paper presents the survey description, data reduction, and public release of reduced mosaics on the same astrometric system as the CANDELS/GOODS-South Wide Field Camera 3 (WFC3) data. To facilitate prior-based WFC3+IRAC photometry, we introduce a new method to create high signal-to-noise PSFs from the IRAC data and reconstruct the complex spatial variation due to survey geometry. The PSF maps are included in the release, as are registered maps of subsets of the data to enable reliability and variability studies. Simulations show that the noise in the ultradeep IRAC images decreases approximately as the square root of integration time over the range 20–200 hr, well below the classical confusion limit, reaching 1σ point-source sensitivities as faint as 15 nJy (28.5 AB) at 3.6 μm and 18 nJy (28.3 AB) at 4.5 μm. The value of such ultradeep IRAC data is illustrated by direct detections of z = 7–8 galaxies as faint as H_AB = 28.
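    The quoted square-root scaling of noise with integration time implies a simple depth estimate at any exposure level. A sketch in Python anchored to the quoted 15 nJy (≈28.5 AB, rounded) at 200 hr in the 3.6 μm band; the function names and the choice of anchor point are ours:

```python
import math

def point_source_sensitivity_njy(t_hr, s_ref_njy=15.0, t_ref_hr=200.0):
    """1-sigma depth assuming noise falls as 1/sqrt(integration time),
    anchored to the quoted 15 nJy at 200 hr for the 3.6 micron band."""
    return s_ref_njy * math.sqrt(t_ref_hr / t_hr)

def ab_mag(f_njy):
    """AB magnitude for a flux density in nJy (3631 Jy zero point)."""
    return -2.5 * math.log10(f_njy * 1e-9 / 3631.0)

for t in (20, 50, 100, 200):
    s = point_source_sensitivity_njy(t)
    print(f"{t:4d} hr -> {s:5.1f} nJy ({ab_mag(s):.2f} AB)")
```

For example, a quarter of the anchor exposure (50 hr) doubles the noise to 30 nJy, consistent with the square-root scaling the simulations report.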

  2. Design of the Nationwide Nursery School Survey on Child Health Throughout the Great East Japan Earthquake

    PubMed Central

    Matsubara, Hiroko; Ishikuro, Mami; Kikuya, Masahiro; Chida, Shoichi; Hosoya, Mitsuaki; Ono, Atsushi; Kato, Noriko; Yokoya, Susumu; Tanaka, Toshiaki; Isojima, Tsuyoshi; Yamagata, Zentaro; Tanaka, Soichiro; Kuriyama, Shinichi; Kure, Shigeo

    2016-01-01

    Background The Great East Japan Earthquake inflicted severe damage on the Pacific coastal areas of northeast Japan. Although possible health impacts on aged or handicapped populations have been highlighted, little is known about how this serious disaster affected preschool children’s health. We conducted a nationwide nursery school survey to investigate preschool children’s physical development and health status throughout the disaster. Methods The survey was conducted from September to December 2012. We mailed three kinds of questionnaires to nursery schools in all 47 prefectures in Japan. Questionnaire “A” addressed nursery school information, and questionnaires “B1” and “B2” addressed individuals’ data. Our targets were children born from April 2, 2004, to April 1, 2005 (those who did not experience the disaster during their preschool days) and children born from April 2, 2006, to April 1, 2007 (those who experienced the disaster during their preschool days). The questionnaires inquired about disaster experiences, anthropometric measurements, and presence of diseases. Results In total, 3624 nursery schools from all 47 prefectures participated in the survey. We established two nationwide retrospective cohorts of preschool children: 53 747 children born from April 2, 2004, to April 1, 2005, and 69 004 children born from April 2, 2006, to April 1, 2007. Among the latter cohort, 1003 were reported to have specific personal experiences of the disaster. Conclusions With this large dataset, we expect the survey to yield comprehensive results about preschool children’s physical development and health status throughout the disaster. PMID:26460382

  3. The Mississippi Delta Cardiovascular Health Examination Survey: Study Design and Methods

    PubMed Central

    Short, Vanessa L.; Ivory-Walls, Tameka; Smith, Larry; Loustalot, Fleetwood

    2015-01-01

    Assessment of cardiovascular disease (CVD) morbidity and mortality in subnational areas is limited. A model for regional CVD surveillance is needed, particularly among vulnerable populations underrepresented in current monitoring systems. The Mississippi Delta Cardiovascular Health Examination Survey (CHES) is a population-based, cross-sectional study of a representative sample of adults living in the 18-county Mississippi Delta region, a rural, impoverished area with high rates of poor health outcomes and marked health disparities. The primary objectives of Delta CHES are (1) to determine the prevalence and distribution of CVD and CVD risk factors using self-reported and directly measured health metrics and (2) to assess environmental perceptions and existing policies that support or deter healthy choices. An address-based sampling frame is used for household enumeration and participant recruitment, and an in-home data collection model is used to collect survey data, anthropometric measures, and blood samples from participants. Data from all sources will be merged into one analytic dataset, and sample weights will be developed to ensure the data are representative of the Mississippi Delta region adult population. Information gathered will be used to assess the burden of CVD and to guide the development, implementation, and evaluation of cardiovascular health promotion and risk factor control strategies. PMID:25844257

  4. Survey of Aerothermodynamics Facilities Useful for the Design of Hypersonic Vehicles Using Air-Breathing Propulsion

    NASA Technical Reports Server (NTRS)

    Arnold, James O.; Deiwert, George S.

    1997-01-01

    This paper surveys aerothermodynamic facilities that have been useful in the study of external flows and propulsion aspects of hypersonic, air-breathing vehicles. While the paper is not a survey of all facilities, it covers the utility of shock tunnels and conventional hypersonic blow-down facilities that have been used for hypersonic air-breather studies. The problems confronting researchers in the field of aerothermodynamics are outlined. Results from the T5 GALCIT tunnel for the shock-on-lip problem are outlined. Experiments on combustors and short expansion nozzles using the semi-free jet method have been conducted in large shock tunnels. An example that employed the NASA Ames 16-Inch shock tunnel is outlined, and the philosophy of the test technique is described. Conventional blow-down hypersonic wind tunnels are quite useful in hypersonic air-breathing studies. Results from an expansion ramp experiment in the NASA Ames 3.5-Foot Hypersonic Wind Tunnel, simulating the nozzle on a hypersonic air-breather, are summarized. Similar work on expansion nozzles conducted in the NASA Langley hypersonic wind tunnel complex is cited. Free-jet airframe-propulsion integration and configuration stability experiments conducted at Langley in the hypersonic wind tunnel complex on a small generic model are also summarized.

  5. Survey and analysis of research on supersonic drag-due-to-lift minimization with recommendations for wing design

    NASA Technical Reports Server (NTRS)

    Carlson, Harry W.; Mann, Michael J.

    1992-01-01

    A survey of research on drag-due-to-lift minimization at supersonic speeds, including a study of the effectiveness of current design and analysis methods, was conducted. The results show that a linearized theory analysis with estimated attainable thrust and vortex force effects can predict with reasonable accuracy the lifting efficiency of flat wings. Significantly better wing performance can be achieved through the use of twist and camber. Although linearized theory methods tend to overestimate the amount of twist and camber required for a given application and provide an overly optimistic performance prediction, these deficiencies can be overcome by implementation of recently developed empirical corrections. Numerous examples of the correlation of experiment and theory are presented to demonstrate the applicability and limitations of linearized theory methods with and without empirical corrections. The use of an Euler code for the estimation of aerodynamic characteristics of a twisted and cambered wing and its application to design by iteration are discussed.

  6. Fitting statistical distributions to sea duck count data: implications for survey design and abundance estimation

    USGS Publications Warehouse

    Zipkin, Elise F.; Leirness, Jeffery B.; Kinlan, Brian P.; O'Connell, Allan F.; Silverman, Emily D.

    2014-01-01

    Determining appropriate statistical distributions for modeling animal count data is important for accurate estimation of abundance, distribution, and trends. In the case of sea ducks along the U.S. Atlantic coast, managers want to estimate local and regional abundance to detect and track population declines, to define areas of high and low use, and to predict the impact of future habitat change on populations. In this paper, we used a modified marked point process to model survey data that recorded flock sizes of Common eiders, Long-tailed ducks, and Black, Surf, and White-winged scoters. The data come from an experimental aerial survey, conducted by the United States Fish & Wildlife Service (USFWS) Division of Migratory Bird Management, during which east-west transects were flown along the Atlantic Coast from Maine to Florida during the winters of 2009–2011. To model the number of flocks per transect (the points), we compared the fit of four statistical distributions (zero-inflated Poisson, zero-inflated geometric, zero-inflated negative binomial and negative binomial) to data on the number of species-specific sea duck flocks that were recorded for each transect flown. To model the flock sizes (the marks), we compared the fit of flock size data for each species to seven statistical distributions: positive Poisson, positive negative binomial, positive geometric, logarithmic, discretized lognormal, zeta and Yule–Simon. Akaike’s Information Criterion and Vuong’s closeness tests indicated that the negative binomial and discretized lognormal were the best distributions for all species for the points and marks, respectively. These findings have important implications for estimating sea duck abundances as the discretized lognormal is a more skewed distribution than the Poisson and negative binomial, which are frequently used to model avian counts; the lognormal is also less heavy-tailed than the power law distributions (e.g., zeta and Yule–Simon), which are
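    The model selection step described above, ranking candidate count distributions by information criteria, can be sketched with SciPy. The data here are synthetic overdispersed counts standing in for per-transect flock counts (not the USFWS survey data), and the comparison is a Poisson fit versus a maximum-likelihood negative binomial fit:

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)
# Hypothetical overdispersed "flock counts" (variance >> mean)
counts = rng.negative_binomial(n=2.0, p=0.3, size=500)

# Poisson: the MLE of the rate is simply the sample mean (1 parameter)
lam = counts.mean()
aic_pois = 2 * 1 - 2 * stats.poisson.logpmf(counts, lam).sum()

# Negative binomial: maximize the likelihood over (n, p) numerically
def nb_nll(params):
    n, p = params
    return -stats.nbinom.logpmf(counts, n, p).sum()

res = optimize.minimize(nb_nll, x0=[1.0, 0.5],
                        bounds=[(1e-6, None), (1e-6, 1 - 1e-6)])
aic_nb = 2 * 2 + 2 * res.fun  # 2 parameters

print(f"AIC Poisson: {aic_pois:.1f}  AIC NegBin: {aic_nb:.1f}")
```

For overdispersed data like these, the negative binomial should win (lower AIC) despite its extra parameter, mirroring the paper's finding that the Poisson underfits flock-count data.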

  7. Ultradeep IRAC Imaging Over the HUDF and GOODS-South: Survey Design and Imaging Data Release

    NASA Astrophysics Data System (ADS)

    Labbé, I.; Oesch, P. A.; Illingworth, G. D.; van Dokkum, P. G.; Bouwens, R. J.; Franx, M.; Carollo, C. M.; Trenti, M.; Holden, B.; Smit, R.; González, V.; Magee, D.; Stiavelli, M.; Stefanon, M.

    2015-12-01

    The IRAC ultradeep field and IRAC Legacy over GOODS programs are two ultradeep imaging surveys at 3.6 and 4.5 μm with the Spitzer Infrared Array Camera (IRAC). The primary aim is to directly detect the infrared light of reionization epoch galaxies at z > 7 and to constrain their stellar populations. The observations cover the Hubble Ultra Deep Field (HUDF), including the two HUDF parallel fields, and the CANDELS/GOODS-South, and are combined with archival data from all previous deep programs into one ultradeep data set. The resulting imaging reaches unprecedented coverage in IRAC 3.6 and 4.5 μm, ranging from >50 hr over 150 arcmin², >100 hr over 60 arcmin², to ˜200 hr over 5-10 arcmin². This paper presents the survey description, data reduction, and public release of reduced mosaics on the same astrometric system as the CANDELS/GOODS-South Wide Field Camera 3 (WFC3) data. To facilitate prior-based WFC3+IRAC photometry, we introduce a new method to create high signal-to-noise PSFs from the IRAC data and reconstruct the complex spatial variation due to survey geometry. The PSF maps are included in the release, as are registered maps of subsets of the data to enable reliability and variability studies. Simulations show that the noise in the ultradeep IRAC images decreases approximately as the square root of integration time over the range 20-200 hr, well below the classical confusion limit, reaching 1σ point-source sensitivities as faint as 15 nJy (28.5 AB) at 3.6 μm and 18 nJy (28.3 AB) at 4.5 μm. The value of such ultradeep IRAC data is illustrated by direct detections of z = 7-8 galaxies as faint as H_AB = 28. Based on observations made with the NASA/ESA Hubble Space Telescope, obtained from the data archive at the Space Telescope Science Institute. STScI is operated by the Association of Universities for Research in Astronomy, Inc. under NASA contract NAS 5-26555. Based on observations made with the Spitzer Space Telescope, which is operated by the Jet

  8. Advanced power generation systems for the 21st Century: Market survey and recommendations for a design philosophy

    SciTech Connect

    Andriulli, J.B.; Gates, A.E.; Haynes, H.D.; Klett, L.B.; Matthews, S.N.; Nawrocki, E.A.; Otaduy, P.J.; Scudiere, M.B.; Theiss, T.J.; Thomas, J.F.; Tolbert, L.M.; Yauss, M.L.; Voltz, C.A.

    1999-11-01

    The purpose of this report is to document the results of a study designed to enhance the performance of future military generator sets (gen-sets) in the medium power range. The study includes a market survey of the state of the art in several key component areas and recommendations comprising a design philosophy for future military gen-sets. The market survey revealed that the commercial market is in a state of flux, but it is currently or will soon be capable of providing the technologies recommended here in a cost-effective manner. The recommendations, if implemented, should result in future power generation systems that are much more functional than today's gen-sets. The number of differing units necessary (both family sizes and frequency modes) to cover the medium power range would be decreased significantly, while the weight and volume of each unit would decrease, improving the transportability of the power source. Improved fuel economy and overall performance would result from more effective utilization of the prime mover in the generator. The units would allow for more flexibility and control, improved reliability, and more effective power management in the field.

  9. Design and methods in a survey of living conditions in the Arctic – the SLiCA study

    PubMed Central

    Eliassen, Bent-Martin; Melhus, Marita; Kruse, Jack; Poppel, Birger; Broderstad, Ann Ragnhild

    2012-01-01

    Objectives The main objective of this study is to describe the methods and design of the survey of living conditions in the Arctic (SLiCA), relevant participation rates and the distribution of participants, as applicable to the survey data in Alaska, Greenland and Norway. This article briefly addresses possible selection bias in the data and also the ways to tackle it in future studies. Study design Population-based cross-sectional survey. Methods Indigenous individuals aged 16 years and older, living in Greenland, Alaska and in traditional settlement areas in Norway, were invited to participate. Random sampling methods were applied in Alaska and Greenland, while non-probability sampling methods were applied in Norway. Data were collected in 3 periods: in Alaska, from January 2002 to February 2003; in Greenland, from December 2003 to August 2006; and in Norway, in 2003 and from June 2006 to June 2008. The principal method in SLiCA was standardised face-to-face interviews using a questionnaire. Results A total of 663, 1,197 and 445 individuals were interviewed in Alaska, Greenland and Norway, respectively. Very high overall participation rates of 83% were obtained in Greenland and Alaska, while a more conventional rate of 57% was achieved in Norway. A predominance of female respondents was obtained in Alaska. Overall, the Sami cohort is older than the cohorts from Greenland and Alaska. Conclusions Preliminary assessments suggest that selection bias in the Sami sample is plausible but not a major threat. Few or no threats to validity are detected in the data from Alaska and Greenland. Despite different sampling and recruitment methods, and sociocultural differences, a unique database has been generated, which shall be used to explore relationships between health and other living conditions variables. PMID:22456042

  10. The VIRUS-P Exploration of Nearby Galaxies (VENGA): Survey Design and First Results

    NASA Astrophysics Data System (ADS)

    Blanc, G. A.; Gebhardt, K.; Heiderman, A.; Evans, N. J., II; Jogee, S.; van den Bosch, R.; Marinova, I.; Weinzirl, T.; Yoachim, P.; Drory, N.; Fabricius, M.; Fisher, D.; Hao, L.; MacQueen, P. J.; Shen, J.; Hill, G. J.; Kormendy, J.

    2010-10-01

    VENGA is a large-scale extragalactic IFU survey that maps the bulges, bars, and large parts of the outer disks of 32 nearby normal spiral galaxies. The targets are chosen to span a wide range in Hubble types, star formation activities, morphologies, and inclinations, while also having extensive multi-wavelength coverage from the far-UV to the mid-IR and available CO and 21 cm mapping. The VENGA dataset will provide 2D maps of the SFR, stellar and gas kinematics, chemical abundances, ISM density and ionization states, dust extinction, and stellar populations for these 32 galaxies. The uniquely large field of view of VIRUS-P permits these large-scale mappings to be performed. VENGA will allow us to correlate all these important quantities throughout the different environments present in galactic disks, enabling a large number of studies in star formation, structure assembly, galactic feedback, and the ISM in galaxies.

  11. Siphon penstock installations at hydroelectric projects: A survey of design, construction and operating experience

    SciTech Connect

    Burgoine, D.; Rodrigue, P.; Tarbell, J.C. (Acres International Corp., Amherst, NY)

    1989-01-01

    There can be advantages to using siphon penstocks at small hydro projects, particularly those constructed at existing dams. One problem, however, is a lack of documentation of siphon penstock installations. The design considerations, construction and operating aspects of siphon penstock installations are described here. 4 figs., 1 tab.

  12. Diagrams: A Visual Survey of Graphs, Maps, Charts and Diagrams for the Graphic Designer.

    ERIC Educational Resources Information Center

    Lockwood, Arthur

    Since the ultimate success of any diagram rests in its clarity, it is important that the designer select a method of presentation which will achieve this aim. He should be aware of the various ways in which statistics can be shown diagrammatically, how information can be incorporated in maps, and how events can be plotted in chart or graph form.…

  13. Survey of piloting factors in V/STOL aircraft with implications for flight control system design

    NASA Technical Reports Server (NTRS)

    Ringland, R. F.; Craig, S. J.

    1977-01-01

    Flight control system design factors that can relieve pilot workload are identified. Major contributors to pilot workload include configuration management and control, and aircraft stability and response qualities. A digital fly-by-wire stability augmentation, configuration management, and configuration control system is suggested to reduce pilot workload during takeoff, hovering, and approach.

  14. Computer Game Theories for Designing Motivating Educational Software: A Survey Study

    ERIC Educational Resources Information Center

    Ang, Chee Siang; Rao, G. S. V. Radha Krishna

    2008-01-01

    The purpose of this study is to evaluate computer game theories for educational software. We propose a framework for designing engaging educational games based on contemporary game studies which includes ludology and narratology. Ludology focuses on the study of computer games as play and game activities, while narratology revolves around the…

  15. A Survey of Career Guidance Needs of Industrial Design Students in Taiwanese Universities

    ERIC Educational Resources Information Center

    Yang, Ming-Ying; You, Manlai

    2010-01-01

    School pupils in Taiwan spend most of their time in studying and having examinations, and consequently many of them decide what major to study in universities rather hastily. Industrial design (ID) programs in universities nowadays recruit students from general and vocational senior high schools through a variety of channels. As a consequence, ID…

  16. DESIGN AND INDICATOR CONSIDERATIONS FOR A PROBABILISTIC SURVEY OF USA GREAT RIVERS: MISSOURI, MISSISSIPPI, OHIO

    EPA Science Inventory

    Great River Ecosystems (GRE) include the river channel and associated backwaters and floodplain habitats. The challenge in designing a GRE monitoring and assessment program is to choose a set of habitats, indicators, and sampling locations that reveal the ecological condition of ...

  17. Disposable surface plasmon resonance aptasensor with membrane-based sample handling design for quantitative interferon-gamma detection.

    PubMed

    Chuang, Tsung-Liang; Chang, Chia-Chen; Chu-Su, Yu; Wei, Shih-Chung; Zhao, Xi-hong; Hsueh, Po-Ren; Lin, Chii-Wann

    2014-08-21

    ELISA and ELISPOT methods are utilized in interferon-gamma (IFN-γ) release assays (IGRAs) to detect the IFN-γ secreted by T lymphocytes. However, the multi-step protocols of these assays are still performed with laboratory instruments and operated by trained personnel. Here, we report a membrane-based microfluidic device integrated with a surface plasmon resonance (SPR) sensor to realize an easy-to-use and cost-effective multi-step quantitative analysis. To conduct the SPR measurements, we utilized a membrane-based SPR sensing device in which a rayon membrane was located 300 μm under the absorbent pad. The basic equation covering this type of transport is based on Darcy's law. Furthermore, the concentration of streptavidin delivered from a sucrose-treated glass pad placed alongside the rayon membrane was controlled within a narrow range (0.81 μM ± 6%). Finally, the unbound molecules were removed by a washing buffer that was pre-packed in the reservoir of the chip. Using a bi-functional, hairpin-shaped aptamer as the sensing probe, we specifically detected the IFN-γ and amplified the signal by binding the streptavidin. A high correlation coefficient (R² = 0.995) was obtained over the range from 0.01 to 100 nM. A detection limit of 10 pM was achieved within 30 min. Thus, the SPR assay protocols for IFN-γ detection could be performed using this simple device without an additional pumping system. PMID:24931052
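    The membrane transport invoked above follows Darcy's law, Q = kAΔP/(μL), relating volumetric flow rate to permeability k, cross-sectional area A, pressure drop ΔP, fluid viscosity μ, and flow path length L. A minimal sketch with hypothetical numbers (none of the values below are taken from the paper):

```python
def darcy_flow_rate(k_m2, area_m2, dp_pa, mu_pa_s, length_m):
    """Volumetric flow rate Q = k * A * dP / (mu * L) from Darcy's law."""
    return k_m2 * area_m2 * dp_pa / (mu_pa_s * length_m)

# Illustrative only: a rayon strip with permeability 1e-12 m^2, a
# 2 mm x 0.3 mm cross-section, 1 kPa driving pressure over a 10 mm
# wetted length, and water-like viscosity (1 mPa*s).
q = darcy_flow_rate(1e-12, 2e-3 * 0.3e-3, 1e3, 1e-3, 10e-3)
print(f"Q = {q:.2e} m^3/s")  # ~6e-11 m^3/s, i.e. tens of nL per second
```

Such nL/s-scale capillary flow is why the chip can run a multi-step assay without an external pump: the membrane geometry itself sets the delivery rate.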

  18. Research design and statistical methods in Indian medical journals: a retrospective survey.

    PubMed

    Hassan, Shabbeer; Yellur, Rajashree; Subramani, Pooventhan; Adiga, Poornima; Gokhale, Manoj; Iyer, Manasa S; Mayya, Shreemathi S

    2015-01-01

    Good quality medical research generally requires not only expertise in the chosen medical field of interest but also a sound knowledge of statistical methodology. The number of medical research articles published in Indian medical journals has increased substantially in the past decade. The aim of this study was to collate all evidence on study design quality and statistical analyses used in selected leading Indian medical journals. Ten leading Indian medical journals were selected based on impact factors, and all original research articles published in 2003 (N = 588) and 2013 (N = 774) were categorized and reviewed. A validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation of the articles. The main outcomes considered were study design types and their frequencies, the proportion of errors/defects in study design, statistical analyses, and implementation of the CONSORT checklist in RCTs (randomized clinical trials). From 2003 to 2013, the proportion of erroneous statistical analyses did not decrease (χ2=0.592, Φ=0.027, p=0.4418): 25% (80/320) in 2003 compared to 22.6% (111/490) in 2013. Compared with 2003, significant improvement was seen in 2013; the proportion of papers using statistical tests increased significantly (χ2=26.96, Φ=0.16, p<0.0001) from 42.5% (250/588) to 56.7% (439/774). The overall proportion of errors in study design decreased significantly (χ2=16.783, Φ=0.12, p<0.0001): 41.3% (243/588) compared to 30.6% (237/774). In 2013, the proportion of randomized clinical trial designs remained very low (7.3%, 43/588), with the majority showing some errors (41 papers, 95.3%). The majority of the published studies were retrospective in nature both in 2003 [79.1% (465/588)] and in 2013 [78.2% (605/774)]. Major decreases in error proportions were observed in both results presentation (χ2=24.477, Φ=0.17, p<0.0001), 82.2% (263/320) compared to 66.3% (325/490) and
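    The χ² comparisons quoted above can be recomputed directly from the reported counts. A sketch with SciPy for the increase in papers using statistical tests (250/588 in 2003 vs. 439/774 in 2013); we assume the paper used the uncorrected test, which reproduces its χ² = 26.96:

```python
from scipy.stats import chi2_contingency

# Papers using statistical tests: 250 of 588 (2003) vs. 439 of 774 (2013)
table = [[250, 588 - 250],
         [439, 774 - 439]]

# Uncorrected chi-square test of homogeneity on the 2x2 table
# (correction=False disables the Yates continuity correction)
stat, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2 = {stat:.2f}, dof = {dof}, p = {p:.2e}")
```

The same three-line recipe checks any of the other reported comparisons by substituting the corresponding counts into `table`.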

  19. Beyond the Cost of Biologics: Employer Survey Reveals Gap in Understanding Role of Specialty Pharmacy and Benefit Design

    PubMed Central

    Vogenberg, F. Randy; Larson, Cheryl; Rehayem, Margaret; Boress, Larry

    2012-01-01

    Background Advances in biotechnology have led to the development of many new medical therapies for a variety of diseases. These agents, known as biologics or specialty drugs, represent the fastest-growing segment of pharmaceuticals. They have often proved effective in cases where conventional medications have failed; however, they can cost up to $350,000 per patient annually. Employers sponsor a significant proportion of plans that provide healthcare benefits, but surveys on benefit coverage have neglected to measure employers’ understanding of these drugs or their use. Objective To establish a baseline understanding of specialty pharmacy drug benefit coverage from the perspective of the employer (ie, commercial benefit plan sponsors). Methods The Midwest Business Group on Health (MBGH), a Chicago-based, nonprofit coalition of more than 100 large employers, in collaboration with the Institute for Integrated Healthcare, conducted a national web-based survey to determine the extent of employer understanding of specialty pharmacy drug management. MBGH, along with 15 business coalitions nationwide, distributed the survey to their employer members. A total of 120 employers, representing more than 1 million employee lives, completed the survey online. The results were then analyzed by MBGH. Results Of the 120 employers surveyed, 25% had “little to no understanding” of biologics, and only 53% claimed a “moderate understanding” of these agents. When asked to rank the effectiveness of biologics-related disease management support for their employees, 45% of the participating employers did not know whether productivity had increased, and 43% did not know whether their employees had experienced increased quality of life as a result of taking these drugs. The majority (76%) of employers continued to rely heavily on print medium to communicate with their covered population. Overall, the vast majority of employers (78%) claimed either “little to no understanding” or

  20. Thermal Design of the Instrument for the Transiting Exoplanet Survey Satellite

    NASA Technical Reports Server (NTRS)

    Allen, Gregory D.

    2016-01-01

    The TESS observatory is a two-year NASA Explorer mission that will use a set of four cameras to discover exoplanets. It will be placed in a high Earth orbit with a period of 13.7 days and will be unaffected by temperature disturbances caused by environmental heating from the Earth. The cameras' stray-light baffles passively cool the cameras, and in turn the CCDs, in order to maintain operational temperatures. The design has been carefully developed and analyzed to maximize temperature stability. The analysis shows that the design keeps the cameras and their components within their temperature ranges, which will help make the mission successful. The design will also meet its survival requirement of sustaining exposure to a five-hour eclipse. Official validation and verification planning is underway and will be performed as the system is built up. Launch is slated for 2017.

  1. A survey on the design of multiprocessing systems for artificial intelligence applications

    NASA Technical Reports Server (NTRS)

    Wah, Benjamin W.; Li, Guo Jie

    1989-01-01

    Some issues in designing computers for artificial intelligence (AI) processing are discussed. These issues are divided into three levels: the representation level, the control level, and the processor level. The representation level deals with the knowledge and methods used to solve the problem and the means to represent it. The control level is concerned with the detection of dependencies and parallelism in the algorithmic and program representations of the problem, and with the synchronization and scheduling of concurrent tasks. The processor level addresses the hardware and architectural components needed to evaluate the algorithmic and program representations. Solutions for the problems of each level are illustrated by a number of representative systems. Design decisions in existing projects on AI computers are classified into top-down, bottom-up, and middle-out approaches.

  2. Design and Calibration of a Flowfield Survey Rake for Inlet Flight Research

    NASA Technical Reports Server (NTRS)

    Flynn, Darin C.; Ratnayake, Nalin A.; Frederick, Michael

    2009-01-01

    A flowfield rake was designed to quantify the flowfield for inlet research underneath NASA DFRC's F-15B airplane. Detailed loads and stress analyses were performed using CFD and empirical methods to assure structural integrity. Calibration data were generated through wind tunnel testing of the rake. A calibration algorithm was developed to determine the local Mach number and flow angularity at each probe. RAGE was flown in November 2008. Data are currently being analyzed.

  3. An integrated payload design for the Atmospheric Remote-sensing Infrared Exoplanet Large-survey (ARIEL)

    NASA Astrophysics Data System (ADS)

    Eccleston, Paul; Tinetti, Giovanna; Beaulieu, Jean-Philippe; Güdel, Manuel; Hartogh, Paul; Micela, Giuseppina; Min, Michiel; Rataj, Miroslaw; Ray, Tom; Ribas, Ignasi; Vandenbussche, Bart; Auguères, Jean-Louis; Bishop, Georgia; Da Deppo, Vania; Focardi, Mauro; Hunt, Thomas; Malaguti, Giuseppe; Middleton, Kevin; Morgante, Gianluca; Ollivier, Marc; Pace, Emanuele; Pascale, Enzo; Taylor, William

    2016-07-01

    ARIEL (the Atmospheric Remote-sensing Infrared Exoplanet Large-survey) is one of the three candidates for the next ESA medium-class science mission (M4) expected to be launched in 2026. This mission will be devoted to observing spectroscopically in the infrared a large population of warm and hot transiting exoplanets (temperatures from ~500 K to ~3000 K) in our nearby Galactic neighborhood, opening a new discovery space in the field of extrasolar planets and enabling the understanding of the physics and chemistry of these far away worlds. The three candidate missions for M4 are now in a Phase A study which will run until mid-2017 at which point one mission will be selected for implementation. ARIEL is based on a 1-m class telescope feeding both a moderate resolution spectrometer covering the wavelengths from 1.95 to 7.8 microns, and a four channel photometer (which also acts as a Fine Guidance Sensor) with bands between 0.55 and 1.65 microns. During its 3.5 years of operation from an L2 orbit, ARIEL will continuously observe exoplanets transiting their host star.

  4. Survey of Aerothermodynamics Facilities Useful for the Design of Hypersonic Vehicles Using Air-Breathing Propulsion

    NASA Technical Reports Server (NTRS)

    Arnold, James O.; Deiwert, G. S.

    1997-01-01

    The dream of producing an air-breathing, hydrogen fueled, hypervelocity aircraft has been before the aerospace community for decades. However, such a craft has not yet been realized, even in an experimental form. Despite the simplicity and beauty of the concept, many formidable problems must be overcome to make this dream a reality. This paper summarizes the aero/aerothermodynamic issues that must be addressed to make the dream a reality and discusses how aerothermodynamics facilities and their modern companion, real-gas computational fluid dynamics (CFD), can help solve the problems blocking the way to realizing the dream. The approach of the paper is first to outline the concept of an air-breathing hypersonic vehicle and then discuss the nose-to-tail aerothermodynamics issues and special aerodynamic problems that arise with such a craft. Then the utility of aerothermodynamic facilities and companion CFD analysis is illustrated by reviewing results from recent United States publications wherein these problems have been addressed. Papers selected for the discussion have been chosen such that the review will serve to survey important U.S. aero/aerothermodynamic real gas and conventional wind tunnel facilities that are useful in the study of hypersonic, hydrogen propelled hypervelocity vehicles.

  5. Quantitative analysis

    PubMed Central

    Nevin, John A.

    1984-01-01

    Quantitative analysis permits the isolation of invariant relations in the study of behavior. The parameters of these relations can serve as higher-order dependent variables in more extensive analyses. These points are illustrated by reference to quantitative descriptions of performance maintained by concurrent schedules, multiple schedules, and signal-detection procedures. Such quantitative descriptions of empirical data may be derived from mathematical theories, which in turn can lead to novel empirical analyses so long as their terms refer to behavioral and environmental events. Thus, quantitative analysis is an integral aspect of the experimental analysis of behavior. PMID:16812400
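    A classic instance of the invariant relations mentioned above, for performance maintained by concurrent schedules, is the generalized matching law, log(B1/B2) = a·log(R1/R2) + log b, whose parameters a (sensitivity) and b (bias) serve as exactly the kind of higher-order dependent variables the abstract describes. The sketch below is illustrative only, with made-up, noiseless numbers rather than data from the record:

    ```python
    # Fitting the generalized matching law by least squares in log coordinates.
    # Hypothetical reinforcement ratios R1/R2 generating behavior ratios B1/B2
    # with sensitivity a = 0.8 and bias b = 1.2.
    import math

    reinf_ratios = [0.125, 0.25, 0.5, 1.0, 2.0, 4.0, 8.0]
    a_true, b_true = 0.8, 1.2
    behav_ratios = [b_true * r**a_true for r in reinf_ratios]

    # Ordinary least squares on log10-transformed ratios.
    xs = [math.log10(r) for r in reinf_ratios]
    ys = [math.log10(b) for b in behav_ratios]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a_hat = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx)**2 for x in xs))   # estimated sensitivity
    log_b_hat = my - a_hat * mx                # estimated log bias

    print(round(a_hat, 3), round(10**log_b_hat, 3))  # -> 0.8 1.2
    ```

    With real data the points scatter around the line, and the fitted a and b can then be compared across procedures, as the abstract suggests.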

  6. A Study of Program Management Procedures in the Campus-Based and Basic Grant Programs. Technical Report No. 1: Sample Design, Student Survey Yield and Bias.

    ERIC Educational Resources Information Center

    Puma, Michael J.; Ellis, Richard

    This report, part of a study of program management procedures in the campus-based and Basic Educational Opportunity Grant programs, describes the design of the site visit component of the study and the results of the student survey, both in terms of the yield obtained and the quality of the data. Chapter 2 describes the design of the sampling methodology employed…

  7. Seismic design and engineering research at the U.S. Geological Survey

    USGS Publications Warehouse

    1988-01-01

    The Engineering Seismology Element of the USGS Earthquake Hazards Reduction Program is responsible for the coordination and operation of the National Strong Motion Network to collect, process, and disseminate earthquake strong-motion data, and for the development of improved methodologies to estimate and predict earthquake ground motion. Instrumental observations of strong ground shaking induced by damaging earthquakes and the corresponding response of man-made structures provide the basis for estimating the severity of shaking from future earthquakes, for earthquake-resistant design, and for understanding the physics of seismologic failure in the Earth's crust.

  8. Hybrid optimization methodology of variable densities mesh model for the axial supporting design of wide-field survey telescope

    NASA Astrophysics Data System (ADS)

    Wang, Hairen; Lou, Zheng; Qian, Yuan; Zheng, Xianzhong; Zuo, Yingxi

    2016-03-01

    The optimization of a primary mirror support system is one of the most critical problems in the design of large telescopes. Here, we propose a hybrid optimization methodology of variable densities mesh model (HOMVDMM) for the axial supporting design, which has three key steps: (1) creating a variable-densities mesh model, which partitions the mirror into several sparse mesh areas and several dense mesh areas; (2) global optimization of the primary mirror support based on the zero-order optimization method with a large tolerance; (3) based on the results of the second step, further optimization with the first-order optimization method in the dense mesh areas with a small tolerance. HOMVDMM exploits the complementary merits of the zero- and first-order optimizations, applying the former on the global scale and the latter on the small scale. As an application, the axial support of the primary mirror of the 2.5-m wide-field survey telescope (WFST) was optimized by HOMVDMM. Three designs were obtained via a comparative study of different numbers of supporting points: 27, 39, and 54. Their residual half-path-length errors are 28.78, 9.32, and 5.29 nm, respectively. The latter two designs both meet the specification of WFST. In each of the three designs, a high-accuracy global optimum can be obtained in about an hour on an ordinary PC. As the results suggest, the overall performance of HOMVDMM is superior to both the first-order and the zero-order optimization methods used alone.
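    The global-then-local strategy described in the abstract can be illustrated with a minimal sketch. This is not the authors' mirror model: the objective function, step sizes, and tolerances below are hypothetical stand-ins that show only the structure of a zero-order (derivative-free) search with a loose tolerance followed by a first-order (gradient-based) refinement with a tight one.

    ```python
    # Toy sketch of a hybrid zero-order/first-order optimization.
    # The quadratic objective is a hypothetical surrogate for the
    # residual half-path-length error; its minimum is at (2.0, -1.0).
    import random

    def objective(x, y):
        return (x - 2.0)**2 + 2.0*(y + 1.0)**2

    def gradient(x, y):
        return (2.0*(x - 2.0), 4.0*(y + 1.0))

    # Stage 1: zero-order global search (random sampling, large tolerance).
    random.seed(0)
    best = min(((random.uniform(-10, 10), random.uniform(-10, 10))
                for _ in range(2000)),
               key=lambda p: objective(*p))

    # Stage 2: first-order local refinement (gradient descent, small tolerance).
    x, y = best
    for _ in range(500):
        gx, gy = gradient(x, y)
        x, y = x - 0.1*gx, y - 0.1*gy
        if gx*gx + gy*gy < 1e-16:  # tight stopping tolerance
            break

    print(round(x, 3), round(y, 3))  # converges near (2.0, -1.0)
    ```

    The design choice mirrored here is that the cheap derivative-free stage only needs to land in the right basin; the gradient stage then converges quickly because it starts close to the optimum.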

  9. Design and Calibration of a Flowfield Survey Rake for Inlet Flight Research

    NASA Technical Reports Server (NTRS)

    Flynn, Darin C.; Ratnayake, Nalin A.; Frederick, Michael

    2009-01-01

    The Propulsion Flight Test Fixture at the NASA Dryden Flight Research Center is a unique test platform available for use on NASA's F-15B aircraft, tail number 836, as a modular host for a variety of aerodynamics and propulsion research. For future flight data from this platform to be valid, more information must be gathered concerning the quality of the airflow underneath the body of the F-15B at various flight conditions, especially supersonic conditions. The flow angularity and Mach number must be known at multiple locations on any test article interface plane for measurement data at these locations to be valid. To determine this prerequisite information, flight data will be gathered in the Rake Airflow Gauge Experiment using a custom-designed flowfield rake to probe the airflow underneath the F-15B at the desired flight conditions. This paper addresses the design considerations of the rake and probe assembly, including the loads and stress analysis using analytical methods, computational fluid dynamics, and finite element analysis. It also details the flow calibration procedure, including the completed wind-tunnel test and posttest data reduction, calibration verification, and preparation for flight-testing.

  10. Quantitative microscopy of the lung: a problem-based approach. Part 2: stereological parameters and study designs in various diseases of the respiratory tract.

    PubMed

    Mühlfeld, Christian; Ochs, Matthias

    2013-08-01

    Design-based stereology provides efficient methods to obtain valuable quantitative information of the respiratory tract in various diseases. However, the choice of the most relevant parameters in a specific disease setting has to be deduced from the present pathobiological knowledge. Often it is difficult to express the pathological alterations by interpretable parameters in terms of volume, surface area, length, or number. In the second part of this companion review article, we analyze the present pathophysiological knowledge about acute lung injury, diffuse parenchymal lung diseases, emphysema, pulmonary hypertension, and asthma to come up with recommendations for the disease-specific application of stereological principles for obtaining relevant parameters. Worked examples with illustrative images are used to demonstrate the work flow, estimation procedure, and calculation and to facilitate the practical performance of equivalent analyses. PMID:23709622

  11. The Norwegian Offender Mental Health and Addiction Study – Design and Implementation of a National Survey and Prospective Cohort Study

    PubMed Central

    Bukten, Anne; Lund, Ingunn Olea; Rognli, Eline Borger; Stavseth, Marianne Riksheim; Lobmaier, Philipp; Skurtveit, Svetlana; Clausen, Thomas; Kunøe, Nikolaj

    2015-01-01

    Norwegian prison inmates are burdened by problems before they enter prison. Few studies have managed to assess this burden and relate it to what happens to inmates once they leave prison. The Norwegian Offender Mental Health and Addiction (NorMA) study is a large-scale longitudinal cohort study that combines national survey and registry data in order to understand mental health, substance use, and criminal activity before, during, and after custody among prisoners in Norway. The main goal of the study is to describe criminal and health-related trajectories based on both survey and registry linkage information. Data were collected from 1,499 inmates in Norwegian prison facilities during 2013–2014. Of these, 741 inmates provided a valid personal identification number and constitute a cohort that will be examined retrospectively and prospectively, along with data from nationwide Norwegian registries. This study describes the design, procedures, and implementation of the ongoing NorMA study and provides an outline of the initial data. PMID:26648732

  12. Design of a Mars Airplane Propulsion System for the Aerial Regional-Scale Environmental Survey (ARES) Mission Concept

    NASA Technical Reports Server (NTRS)

    Kuhl, Christopher A.

    2009-01-01

    The Aerial Regional-Scale Environmental Survey (ARES) is a Mars exploration mission concept with the goal of taking scientific measurements of the atmosphere, surface, and subsurface of Mars by using an airplane as the payload platform. The ARES team first conducted a Phase-A study for a 2007 launch opportunity, which was completed in May 2003. Following this study, significant efforts were undertaken to reduce the risk of the atmospheric flight system under the NASA Langley Planetary Airplane Risk Reduction Project. The concept was then proposed to the Mars Scout program in 2006 for a 2011 launch opportunity. This paper summarizes the design and development of the ARES airplane propulsion subsystem, beginning with the inception of the ARES project in 2002 through the submittal of the Mars Scout proposal in July 2006.

  13. Survey on effect of surface winds on aircraft design and operation and recommendations for needed wind research

    NASA Technical Reports Server (NTRS)

    Houbolt, J. C.

    1973-01-01

    A survey of the effect of environmental surface winds and gusts on aircraft design and operation is presented, along with a listing of the very large number of problems that are encountered. Although many studies have been made of surface winds and gusts, the engineering application of these results to aeronautical problems is still in an embryonic stage. Control of the aircraft is of paramount concern. Mathematical models and their application in simulation studies of airplane operation and control are discussed, and an attempt is made to identify their main gaps and deficiencies. Key reference material is cited. The need for better exchange between the meteorologist and the aeronautical engineer is discussed, and suggestions for improvements in the wind and gust models are made.

  14. SURVEY INSTRUMENT

    DOEpatents

    Borkowski, C J

    1954-01-19

    This pulse-type survey instrument is suitable for readily detecting {alpha} particles in the presence of high {beta} and {gamma} backgrounds. The instruments may also be used to survey for neutrons, {beta} particles and {gamma} rays by employing suitably designed interchangeable probes and selecting an operating potential to correspond to the particular probe.

  15. Information Presentation Features and Comprehensibility of Hospital Report Cards: Design Analysis and Online Survey Among Users

    PubMed Central

    2015-01-01

    Background Improving the transparency of information about the quality of health care providers is one way to improve health care quality. It is assumed that Internet information steers patients toward better-performing health care providers and will motivate providers to improve quality. However, the effect of public reporting on hospital quality is still small. One of the reasons is that users find it difficult to understand the formats in which information is presented. Objective We analyzed the presentation of risk-adjusted mortality rate (RAMR) for coronary angiography in the 10 most commonly used German public report cards to analyze the impact of information presentation features on their comprehensibility. We wanted to determine which information presentation features were utilized, were preferred by users, led to better comprehension, and had similar effects to those reported in evidence-based recommendations described in the literature. Methods The study consisted of 5 steps: (1) identification of best-practice evidence about the presentation of information on hospital report cards; (2) selection of a single risk-adjusted quality indicator; (3) selection of a sample of designs adopted by German public report cards; (4) identification of the information presentation elements used in public reporting initiatives in Germany; and (5) an online panel completed an online questionnaire that was conducted to determine if respondents were able to identify the hospital with the lowest RAMR and if respondents’ hospital choices were associated with particular information design elements. Results Evidence-based recommendations were made relating to the following information presentation features relevant to report cards: evaluative table with symbols, tables without symbols, bar charts, bar charts without symbols, bar charts with symbols, symbols, evaluative word labels, highlighting, order of providers, high values to indicate good performance, explicit statements

  16. Survey of Technical Preventative Measures to Reduce Whole-Body Vibration Effects when Designing Mobile Machinery

    NASA Astrophysics Data System (ADS)

    DONATI, P.

    2002-05-01

    Engineering solutions to minimize the effects of vibrating mobile machinery on operators can be conveniently grouped into three areas: (1) reduction of vibration at the source by improvement of the quality of terrain, careful selection of vehicle or machine, correct loading, proper maintenance, etc.; (2) reduction of vibration transmission by incorporating suspension systems (tyres, vehicle suspensions, suspension cab and seat) between the operator and the source of vibration; and (3) improvement of cab ergonomics and seat profiles to optimize operator posture. This paper reviews the different techniques and problems linked to categories (2) and (3). According to epidemiological studies, the main health risk of whole-body vibration exposure appears to be lower back pain. When designing new mobile machinery, all factors that may contribute to back injury should be considered in order to reduce risk. For example, an optimized seat suspension is useless if it cannot be correctly and easily adjusted to the driver's weight, or if the driver is forced to drive in a bent position to avoid his head striking the ceiling due to the spatial requirement of the suspension seat.

  17. Flow field survey near the rotational plane of an advanced design propeller on a JetStar airplane

    NASA Technical Reports Server (NTRS)

    Walsh, K. R.

    1985-01-01

    An investigation was conducted to obtain upper fuselage surface static pressures and boundary layer velocity profiles below the centerline of an advanced design propeller. This investigation documents the upper fuselage velocity flow field in support of the in-flight acoustic tests conducted on a JetStar airplane. Initial results of the boundary layer survey showed evidence of an unusual flow disturbance, which was attributed to the two windshield wiper assemblies on the aircraft. The assemblies were removed, eliminating the disturbances from the flow field. This report presents boundary layer velocity profiles at altitudes of 6096 and 9144 m (20,000 and 30,000 ft) and Mach numbers from 0.6 to 0.8, and investigates the effects of the windshield wiper assemblies on these profiles. Because of the unconventional velocity profiles that were obtained with the assemblies mounted, classical boundary layer parameters, such as momentum and displacement thicknesses, are not presented. The effects of flight test variables (Mach number and angles of attack and sideslip) and an advanced design propeller on boundary layer profiles - with the wiper assemblies mounted and removed - are presented.

  18. Surgical Simulations Based on Limited Quantitative Data: Understanding How Musculoskeletal Models Can Be Used to Predict Moment Arms and Guide Experimental Design

    PubMed Central

    Bednar, Michael S.; Murray, Wendy M.

    2016-01-01

    The utility of biomechanical models and simulations to examine clinical problems is currently limited by the need for extensive amounts of experimental data describing how a given procedure or disease affects the musculoskeletal system. Methods capable of predicting how individual biomechanical parameters are altered by surgery are necessary for the efficient development of surgical simulations. In this study, we evaluate to what extent models based on limited amounts of quantitative data can be used to predict how surgery influences muscle moment arms, a critical parameter that defines how muscle force is transformed into joint torque. We specifically examine proximal row carpectomy and scaphoid-excision four-corner fusion, two common surgeries to treat wrist osteoarthritis. Using models of these surgeries, which are based on limited data and many assumptions, we perform simulations to formulate a hypothesis regarding how these wrist surgeries influence muscle moment arms. Importantly, the hypothesis is based on analysis of only the primary wrist muscles. We then test the simulation-based hypothesis using a cadaveric experiment that measures moment arms of both the primary wrist and extrinsic thumb muscles. The measured moment arms of the primary wrist muscles are used to verify the hypothesis, while those of the extrinsic thumb muscles are used as cross-validation to test whether the hypothesis is generalizable. The moment arms estimated by the models and measured in the cadaveric experiment both indicate that a critical difference between the surgeries is how they alter radial-ulnar deviation versus flexion-extension moment arms at the wrist. Thus, our results demonstrate that models based on limited quantitative data can provide novel insights. 
This work also highlights that synergistically utilizing simulation and experimental methods can aid the design of experiments and make it possible to test the predictive limits of current computer simulation techniques. 

  19. Design and synthesis of target-responsive hydrogel for portable visual quantitative detection of uranium with a microfluidic distance-based readout device.

    PubMed

    Huang, Yishun; Fang, Luting; Zhu, Zhi; Ma, Yanli; Zhou, Leiji; Chen, Xi; Xu, Dunming; Yang, Chaoyong

    2016-11-15

    Due to uranium's increasing exploitation in nuclear energy and its toxicity to human health, it is of great significance to detect uranium contamination. In particular, development of a rapid, sensitive and portable method is important for personal health care for those who frequently come into contact with uranium ore mining or who investigate leaks at nuclear power plants. The most stable form of uranium in water is the uranyl ion (UO2(2+)). In this work, a UO2(2+)-responsive smart hydrogel was designed and synthesized for rapid, portable, sensitive detection of UO2(2+). A UO2(2+)-dependent DNAzyme complex composed of a substrate strand and an enzyme strand was utilized to crosslink DNA-grafted polyacrylamide chains to form a DNA hydrogel. Colorimetric analysis was achieved by encapsulating gold nanoparticles (AuNPs) in the DNAzyme-crosslinked hydrogel to indicate the concentration of UO2(2+). Without UO2(2+), the enzyme strand is not active. The presence of UO2(2+) in the sample activates the enzyme strand and triggers the cleavage of the substrate strand from the enzyme strand, thereby decreasing the density of crosslinkers and destabilizing the hydrogel, which then releases the encapsulated AuNPs. As low as 100 nM UO2(2+) was visually detected by the naked eye. The target-responsive hydrogel was also demonstrated to be applicable in natural water spiked with UO2(2+). Furthermore, to avoid the visual errors caused by naked-eye observation, a previously developed volumetric bar-chart chip (V-Chip) was used to quantitatively detect UO2(2+) concentrations in water by encapsulating Au-Pt nanoparticles in the hydrogel. The UO2(2+) concentrations were visually quantified from the travelling distance of the ink bar on the V-Chip. The method can be used for portable and quantitative detection of uranium in field applications without skilled operators and sophisticated instruments. PMID:27209576

  20. The VIRUS-P Exploration of Nearby Galaxies (VENGA): Survey Design, Data Processing, and Spectral Analysis Methods

    NASA Astrophysics Data System (ADS)

    Blanc, Guillermo A.; Weinzirl, Tim; Song, Mimi; Heiderman, Amanda; Gebhardt, Karl; Jogee, Shardha; Evans, Neal J., II; van den Bosch, Remco C. E.; Luo, Rongxin; Drory, Niv; Fabricius, Maximilian; Fisher, David; Hao, Lei; Kaplan, Kyle; Marinova, Irina; Vutisalchavakul, Nalin; Yoachim, Peter

    2013-05-01

    We present the survey design, data reduction, and spectral fitting pipeline for the VIRUS-P Exploration of Nearby Galaxies (VENGA). VENGA is an integral field spectroscopic survey, which maps the disks of 30 nearby spiral galaxies. Targets span a wide range in Hubble type, star formation activity, morphology, and inclination. The VENGA data cubes have 5.''6 FWHM spatial resolution, ~5 Å FWHM spectral resolution, sample the 3600 Å-6800 Å range, and cover large areas typically sampling galaxies out to ~0.7R 25. These data cubes can be used to produce two-dimensional maps of the star formation rate, dust extinction, electron density, stellar population parameters, the kinematics and chemical abundances of both stars and ionized gas, and other physical quantities derived from the fitting of the stellar spectrum and the measurement of nebular emission lines. To exemplify our methods and the quality of the data, we present the VENGA data cube on the face-on Sc galaxy NGC 628 (a.k.a. M 74). The VENGA observations of NGC 628 are described, as well as the construction of the data cube, our spectral fitting method, and the fitting of the stellar and ionized gas velocity fields. We also propose a new method to measure the inclination of nearly face-on systems based on the matching of the stellar and gas rotation curves using asymmetric drift corrections. VENGA will measure relevant physical parameters across different environments within these galaxies, allowing a series of studies on star formation, structure assembly, stellar populations, chemical evolution, galactic feedback, nuclear activity, and the properties of the interstellar medium in massive disk galaxies.

  1. Mapping indoor radon-222 in Denmark: design and test of the statistical model used in the second nationwide survey.

    PubMed

    Andersen, C E; Ulbak, K; Damkjaer, A; Kirkegaard, P; Gravesen, P

    2001-05-14

    In Denmark, a new survey of indoor radon-222 has been carried out: 1-year alpha track measurements (CR-39) were made in 3019 single-family houses. There are from 3 to 23 house measurements in each of the 275 municipalities. Within each municipality, houses were selected randomly. One important outcome of the survey is the prediction of the fraction of houses in each municipality with an annual average radon concentration above 200 Bq m(-3). To obtain the most accurate estimate and to assess the associated uncertainties, a statistical model has been developed. The purpose of this paper is to describe the design of this model and to report results of model tests. The model is based on a transformation of the data to normality and on analytical (conditionally) unbiased estimators of the quantities of interest. Bayesian statistics are used to minimize the effect of small sample size. In each municipality, the correction depends on the fraction of area where sand and gravel is the dominating surface geology. The uncertainty analysis is done with a Monte Carlo technique. It is demonstrated that the weighted sum of all municipality model estimates of fractions above 200 Bq m(-3) (3.9% with 95%-confidence interval = [3.4,4.5]) is consistent with the weighted sum of the observations for Denmark taken as a whole (4.6% with 95%-confidence interval = [3.8,5.6]). The total number of single-family houses within each municipality is used as the weight. Model estimates are also found to be consistent with observations at the level of individual counties, which typically include a few hundred house measurements. These tests indicate that the model is well suited for its purpose.
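    The transformation-to-normality and Monte Carlo steps described above can be sketched in simplified form. This is not the survey's actual model (which also uses Bayesian shrinkage and a geology-dependent correction); the measurements below are made up, and the bootstrap stands in for the paper's Monte Carlo uncertainty analysis:

    ```python
    # Estimate the fraction of houses above 200 Bq/m^3 from a small sample:
    # (1) log-transform the concentrations to approximate normality,
    # (2) read the tail probability off the fitted normal,
    # (3) bootstrap for a rough 95% uncertainty interval.
    import math
    import random
    import statistics

    random.seed(1)
    # Hypothetical radon measurements (Bq/m^3) from one municipality.
    sample = [35, 48, 60, 72, 90, 110, 150, 210, 260, 55, 80, 120]

    def frac_above(data, threshold=200.0):
        logs = [math.log(x) for x in data]
        mu, sigma = statistics.mean(logs), statistics.stdev(logs)
        z = (math.log(threshold) - mu) / sigma
        return 0.5 * math.erfc(z / math.sqrt(2))  # P(X > threshold)

    point = frac_above(sample)
    # Bootstrap resampling for the uncertainty interval.
    boots = sorted(frac_above([random.choice(sample) for _ in sample])
                   for _ in range(2000))
    lo, hi = boots[50], boots[1949]  # ~2.5% and ~97.5% quantiles
    print(f"{point:.3f} [{lo:.3f}, {hi:.3f}]")
    ```

    The lognormal assumption is what makes a usable tail estimate possible from only a handful of measurements per municipality; the price is that the interval widens sharply as the sample shrinks, which is why the paper adds Bayesian pooling across municipalities.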

  2. Design, methods and demographic findings of the DEMINVALL survey: a population-based study of Dementia in Valladolid, Northwestern Spain

    PubMed Central

    2012-01-01

    Background This article describes the rationale and design of a population-based survey of dementia in Valladolid (northwestern Spain). The main aim of the study was to assess the epidemiology of dementia and its subtypes. Prevalence of anosognosia in dementia patients, nutritional status, diet characteristics, and determinants of non-diagnosed dementia in the community were studied. The main sociodemographic, educational, and general health status characteristics of the study population are described. Methods Cross-over and cohort, population-based study. A two-phase door-to-door study was performed. Both urban and rural environments were included. In phase 1 (February 2009 – February 2010) 28 trained physicians examined a population of 2,989 subjects (age: ≥ 65 years). The seven-minute screen neurocognitive battery was used. In phase 2 (May 2009 – May 2010) 4 neurologists, 1 geriatrician, and 3 neuropsychologists confirmed the diagnosis of dementia and subtype in patients screened positive by a structured neurological evaluation. Specific instruments to assess anosognosia, the nutritional status and diet characteristics were used. Of the initial sample, 2,170 subjects were evaluated (57% female, mean age 76.5 ± 7.8, 5.2% institutionalized), whose characteristics are described. 227 persons were excluded for various reasons. Among those eligible were 592 non-responders. The attrition bias of non-responders was lower in rural areas. 241 screened positive (11.1%). Discussion The survey will explore some clinical, social and health related life-style variables of dementia. The population size and the diversification of social and educational backgrounds will contribute to a better knowledge of dementia in our environment. PMID:22935626

  3. THE VIRUS-P EXPLORATION OF NEARBY GALAXIES (VENGA): SURVEY DESIGN, DATA PROCESSING, AND SPECTRAL ANALYSIS METHODS

    SciTech Connect

    Blanc, Guillermo A.; Weinzirl, Tim; Song, Mimi; Heiderman, Amanda; Gebhardt, Karl; Jogee, Shardha; Evans, Neal J. II; Kaplan, Kyle; Marinova, Irina; Vutisalchavakul, Nalin; Van den Bosch, Remco C. E.; Luo Rongxin; Hao Lei; Drory, Niv; Fabricius, Maximilian; Fisher, David; Yoachim, Peter

    2013-05-15

    We present the survey design, data reduction, and spectral fitting pipeline for the VIRUS-P Exploration of Nearby Galaxies (VENGA). VENGA is an integral field spectroscopic survey, which maps the disks of 30 nearby spiral galaxies. Targets span a wide range in Hubble type, star formation activity, morphology, and inclination. The VENGA data cubes have 5.''6 FWHM spatial resolution, ~5 Å FWHM spectral resolution, sample the 3600 Å-6800 Å range, and cover large areas typically sampling galaxies out to ~0.7R25. These data cubes can be used to produce two-dimensional maps of the star formation rate, dust extinction, electron density, stellar population parameters, the kinematics and chemical abundances of both stars and ionized gas, and other physical quantities derived from the fitting of the stellar spectrum and the measurement of nebular emission lines. To exemplify our methods and the quality of the data, we present the VENGA data cube on the face-on Sc galaxy NGC 628 (a.k.a. M 74). The VENGA observations of NGC 628 are described, as well as the construction of the data cube, our spectral fitting method, and the fitting of the stellar and ionized gas velocity fields. We also propose a new method to measure the inclination of nearly face-on systems based on the matching of the stellar and gas rotation curves using asymmetric drift corrections. VENGA will measure relevant physical parameters across different environments within these galaxies, allowing a series of studies on star formation, structure assembly, stellar populations, chemical evolution, galactic feedback, nuclear activity, and the properties of the interstellar medium in massive disk galaxies.

  4. A mental health needs assessment of children and adolescents in post-conflict Liberia: results from a quantitative key-informant survey

    PubMed Central

    Borba, Christina P.C.; Ng, Lauren C.; Stevenson, Anne; Vesga-Lopez, Oriana; Harris, Benjamin L.; Parnarouskis, Lindsey; Gray, Deborah A.; Carney, Julia R.; Domínguez, Silvia; Wang, Edward K.S.; Boxill, Ryan; Song, Suzan J.; Henderson, David C.

    2016-01-01

    Between 1989 and 2004, Liberia experienced a devastating civil war that resulted in widespread trauma with almost no mental health infrastructure to help citizens cope. In 2009, the Liberian Ministry of Health and Social Welfare collaborated with researchers from Massachusetts General Hospital to conduct a rapid needs assessment survey in Liberia with local key informants (n = 171) to examine the impact of war and post-war events on emotional and behavioral problems of, functional limitations of, and appropriate treatment settings for Liberian youth aged 5–22. War exposure and post-conflict sexual violence, poverty, infectious disease and parental death negatively impacted youth mental health. Key informants perceived that youth displayed internalizing and externalizing symptoms and mental health-related functional impairment at home, school, work and in relationships. Medical clinics were identified as the most appropriate setting for mental health services. Youth in Liberia continue to endure the harsh social, economic and material conditions of everyday life in a protracted post-conflict state, and have significant mental health needs. Their observed functional impairment due to mental health issues further limited their access to protective factors such as education, employment and positive social relationships. Results from this study informed Liberia's first post-conflict mental health policy. PMID:26807147

  5. National Aquatic Resource Surveys: Use of Geospatial data in their design and spatial prediction at non-monitored locations

    EPA Science Inventory

    The National Aquatic Resource Surveys (NARS) are four surveys conducted by the U.S. Environmental Protection Agency working in collaboration with states, tribal nations and other federal agencies. The surveys are conducted for lakes and reservoirs, streams, estuaries and intracoa...

  6. Surveying the Commons: Current Implementation of Information Commons Web sites

    ERIC Educational Resources Information Center

    Leeder, Christopher

    2009-01-01

    This study assessed the content of 72 academic library Information Commons (IC) Web sites using content analysis, quantitative assessment and qualitative surveys of site administrators to analyze current implementation by the academic library community. Results show that IC Web sites vary widely in content, design and functionality, with few…

  7. The Cornella Health Interview Survey Follow-Up (CHIS.FU) Study: design, methods, and response rate

    PubMed Central

    Garcia, Montse; Schiaffino, Anna; Fernandez, Esteve; Marti, Merce; Salto, Esteve; Perez, Gloria; Peris, Merce; Borrell, Carme; Nieto, F Javier; Borras, Josep Maria

    2003-01-01

    Background The aim of this report is to describe the main characteristics of the design, including response rates, of the Cornella Health Interview Survey Follow-up Study. Methods The original cohort consisted of 2,500 subjects (1,263 women and 1,237 men) interviewed as part of the 1994 Cornella Health Interview Study. A record linkage to update the address and vital status of the cohort members was carried out using first a deterministic method and then a probabilistic one based on each subject's first name and surnames. Subsequently, we attempted to locate the cohort members to conduct the phone follow-up interviews. A pilot study was carried out to test the overall feasibility and to modify some procedures before the field work began. Results After record linkage, 2,468 (98.7%) subjects were successfully traced. Of these, 91 (3.6%) were deceased, 259 (10.3%) had moved to other towns, and 50 (2.0%) had neither renewed their last municipal census documents nor declared having moved. After using different strategies to track and retain cohort members, we traced 92% of the CHIS participants. Among them, 1,605 subjects answered the follow-up questionnaire. Conclusion The computerized record linkage maximized the success of the follow-up, which was carried out 7 years after the baseline interview. The pilot study was useful for increasing the efficiency of tracing and interviewing the respondents. PMID:12665430
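    The two-stage linkage strategy described above (a deterministic pass on an exact identifier, followed by a probabilistic, name-based pass) can be sketched as follows. The field names, similarity measure, and threshold are illustrative assumptions, not details from the study:

```python
# Illustrative two-stage record linkage: deterministic first, then
# probabilistic scoring on first name and surname for the remainder.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Normalized string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def link_records(cohort, registry, threshold=0.85):
    linked = {}
    # Stage 1: deterministic linkage on an exact ID.
    registry_by_id = {r["id"]: r for r in registry if r.get("id")}
    for person in cohort:
        if person.get("id") in registry_by_id:
            linked[person["id"]] = registry_by_id[person["id"]]
    # Stage 2: probabilistic linkage on names for unlinked subjects.
    for person in cohort:
        if person["id"] in linked:
            continue
        best, best_score = None, 0.0
        for r in registry:
            score = (similarity(person["first_name"], r["first_name"])
                     + similarity(person["surname"], r["surname"])) / 2
            if score > best_score:
                best, best_score = r, score
        if best is not None and best_score >= threshold:
            linked[person["id"]] = best
    return linked

# Hypothetical cohort and registry records (the second registry entry
# has no ID and a misspelled first name, so only stage 2 can link it).
cohort = [{"id": 1, "first_name": "Montse", "surname": "Garcia"},
          {"id": 2, "first_name": "Esteve", "surname": "Fernandez"}]
registry = [{"id": 1, "first_name": "Montse", "surname": "Garcia"},
            {"id": None, "first_name": "Esteva", "surname": "Fernandez"}]
matches = link_records(cohort, registry)
```

    Deterministic matching maximizes precision; the probabilistic pass trades a tunable amount of precision for the recall needed to trace subjects whose identifiers have changed.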

  8. Heterosis for Biomass-Related Traits in Arabidopsis Investigated by Quantitative Trait Loci Analysis of the Triple Testcross Design With Recombinant Inbred Lines

    PubMed Central

    Kusterer, Barbara; Piepho, Hans-Peter; Utz, H. Friedrich; Schön, Chris C.; Muminovic, Jasmina; Meyer, Rhonda C.; Altmann, Thomas; Melchinger, Albrecht E.

    2007-01-01

    Arabidopsis thaliana has emerged as a leading model species in plant genetics and functional genomics including research on the genetic causes of heterosis. We applied a triple testcross (TTC) design and a novel biometrical approach to identify and characterize quantitative trait loci (QTL) for heterosis of five biomass-related traits by (i) estimating the number, genomic positions, and genetic effects of heterotic QTL, (ii) characterizing their mode of gene action, and (iii) testing for presence of epistatic effects by a genomewide scan and marker × marker interactions. In total, 234 recombinant inbred lines (RILs) of Arabidopsis hybrid C24 × Col-0 were crossed to both parental lines and their F1 and analyzed with 110 single-nucleotide polymorphism (SNP) markers. QTL analyses were conducted using linear transformations Z1, Z2, and Z3 calculated from the adjusted entry means of TTC progenies. With Z1, we detected 12 QTL displaying augmented additive effects. With Z2, we mapped six QTL for augmented dominance effects. A one-dimensional genome scan with Z3 revealed two genomic regions with significantly negative dominance × additive epistatic effects. Two-way analyses of variance between marker pairs revealed nine digenic epistatic interactions: six reflecting dominance × dominance effects with variable sign and three reflecting additive × additive effects with positive sign. We conclude that heterosis for biomass-related traits in Arabidopsis has a polygenic basis with overdominance and/or epistasis being presumably the main types of gene action. PMID:18039885

  9. Quantitative parameters of complexes of tris(1-alkylindol-3-yl)methylium salts with serum albumin: Relevance for the design of drug candidates.

    PubMed

    Durandin, Nikita A; Tsvetkov, Vladimir B; Bykov, Evgeny E; Kaluzhny, Dmitry N; Lavrenov, Sergey N; Tevyashova, Anna N; Preobrazhenskaya, Maria N

    2016-09-01

    Triarylmethane derivatives are extensively investigated as antitumor and antibacterial drug candidates alone and as photoactivatable compounds. In the series of tris(1-alkylindol-3-yl)methylium salts (TIMs) these two activities differed depending on the length of N-alkyl chain, with C4-5 derivatives being the most potent compared to the shorter or longer chain analogs and to the natural compound turbomycin A (no N-substituent). Given that the human serum albumin (HSA) is a major transporter protein with which TIMs can form stable complexes, and that the formation of these complexes might be advantageous for phototoxicity of TIMs we determined the quantitative parameters of TIMs-HSA binding using spectroscopic methods and molecular docking. TIMs bound to HSA (1:1 stoichiometry) altered the protein's secondary structure by changing the α-helix/β-turn ratio. The IIa subdomain (Sudlow site I) is the preferred TIM binding site in HSA as determined in competition experiments with reference drugs ibuprofen and warfarin. The values of binding constants increased with the number of CH2 groups from 0 to 6 and then dropped down for C10 compound, a dependence similar to the one observed for cytocidal potency of TIMs. We tend to attribute this non-linear dependence to an interplay between hydrophobicity and steric hindrance, the two key characteristics of TIMs-HSA complexes calculated in the molecular docking procedure. These structure-activity relationships provide evidence for rational design of TIMs-based antitumor and antimicrobial drugs. PMID:27475780
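    The 1:1 stoichiometry reported above implies a standard single-site binding equilibrium; a minimal sketch of how a binding constant translates into the fraction of ligand bound (all numeric values are hypothetical, not from the paper):

```python
import math

def fraction_bound(K, ligand_total, protein_total):
    """Fraction of ligand bound in a 1:1 binding equilibrium.

    Solves K = [PL] / ([P][L]) exactly via the quadratic in [PL]:
        K*PL^2 - (K*(Lt + Pt) + 1)*PL + K*Lt*Pt = 0
    K in 1/M, concentrations in M.
    """
    b = K * (ligand_total + protein_total) + 1.0
    disc = b * b - 4.0 * K * K * ligand_total * protein_total
    pl = (b - math.sqrt(disc)) / (2.0 * K)
    return pl / ligand_total

# Hypothetical values: K = 1e5 M^-1, 10 uM ligand, 40 uM albumin.
f = fraction_bound(1e5, 10e-6, 40e-6)
```

    A larger binding constant drives the bound fraction toward 1, which is why the chain-length dependence of K tracks the cytocidal potency ranking discussed above.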

  10. Quantitative Graphics in Newspapers.

    ERIC Educational Resources Information Center

    Tankard, James W., Jr.

    The use of quantitative graphics in newspapers requires achieving a balance between being accurate and getting the attention of the reader. The statistical representations in newspapers are drawn by graphic designers whose key technique is fusion--the striking combination of two visual images. This technique often results in visual puns,…

  11. Quantitative Thinking.

    ERIC Educational Resources Information Center

    DuBridge, Lee A.

    An appeal for more research to determine how to educate children as effectively as possible is made. Mathematics teachers can readily examine the educational problems of today in their classrooms since learning progress in mathematics can easily be measured and evaluated. Since mathematics teachers have learned to think in quantitative terms and…

  12. QUANTITATIVE MORPHOLOGY

    EPA Science Inventory

    Abstract: In toxicology, the role of quantitative assessment of brain morphology can be understood in the context of two types of treatment-related alterations. One type of alteration is specifically associated with treatment and is not observed in control animals. Measurement ...

  13. A knowledge-based approach in designing combinatorial or medicinal chemistry libraries for drug discovery. 1. A qualitative and quantitative characterization of known drug databases.

    PubMed

    Ghose, A K; Viswanadhan, V N; Wendoloski, J J

    1999-01-01

    The discovery of various protein/receptor targets from genomic research is expanding rapidly. Along with the automation of organic synthesis and biochemical screening, this is bringing a major change in the whole field of drug discovery research. In the traditional drug discovery process, the industry tests compounds in the thousands. With automated synthesis, the number of compounds to be tested could be in the millions. This two-dimensional expansion will lead to a major demand for resources, unless the chemical libraries are made wisely. The objective of this work is to provide both quantitative and qualitative characterization of known drugs which will help to generate "drug-like" libraries. In this work we analyzed the Comprehensive Medicinal Chemistry (CMC) database and seven different subsets belonging to different classes of drug molecules. These include some central nervous system active drugs and cardiovascular, cancer, inflammation, and infection disease states. A quantitative characterization based on computed physicochemical property profiles such as log P, molar refractivity, molecular weight, and number of atoms as well as a qualitative characterization based on the occurrence of functional groups and important substructures are developed here. For the CMC database, the qualifying range (covering more than 80% of the compounds) of the calculated log P is between -0.4 and 5.6, with an average value of 2.52. For molecular weight, the qualifying range is between 160 and 480, with an average value of 357. For molar refractivity, the qualifying range is between 40 and 130, with an average value of 97. For the total number of atoms, the qualifying range is between 20 and 70, with an average value of 48. Benzene is by far the most abundant substructure in this drug database, slightly more abundant than all the heterocyclic rings combined. Nonaromatic heterocyclic rings are twice as abundant as the aromatic heterocycles. 
Tertiary aliphatic amines, alcoholic
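    The qualifying ranges reported above translate directly into a property filter; a minimal sketch (only the ranges come from the abstract; the filter function and candidate profiles are illustrative):

```python
# Qualifying ranges reported for the CMC database (each covering more
# than 80% of compounds): calculated log P, molecular weight, molar
# refractivity, and total atom count.
QUALIFYING_RANGES = {
    "logp": (-0.4, 5.6),
    "mol_weight": (160, 480),
    "molar_refractivity": (40, 130),
    "atom_count": (20, 70),
}

def is_druglike(compound: dict) -> bool:
    """True if every computed property falls in its qualifying range."""
    return all(lo <= compound[prop] <= hi
               for prop, (lo, hi) in QUALIFYING_RANGES.items())

# Hypothetical property profiles: one near the reported CMC averages
# (log P 2.52, MW 357, MR 97, 48 atoms), one well outside them.
candidate = {"logp": 2.5, "mol_weight": 357,
             "molar_refractivity": 97, "atom_count": 48}
outlier = {"logp": 7.2, "mol_weight": 820,
           "molar_refractivity": 150, "atom_count": 95}
```

    Applied to an enumerated virtual library, such a filter is the quantitative half of the "drug-like" library design the paper proposes; the qualitative half would screen for the favored functional groups and substructures.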

  14. Biological effect of low-head sea lamprey barriers: Designs for extensive surveys and the value of incorporating intensive process-oriented research

    USGS Publications Warehouse

    Hayes, D.B.; Baylis, J.R.; Carl, L.M.; Dodd, H.R.; Goldstein, J.D.; McLaughlin, R.L.; Noakes, D.L.G.; Porto, L.M.

    2003-01-01

    Four sampling designs for quantifying the effect of low-head sea lamprey (Petromyzon marinus) barriers on fish communities were evaluated, and the contribution of process-oriented research to the overall confidence of results obtained was discussed. The designs include: (1) sample barrier streams post-construction; (2) sample barrier and reference streams post-construction; (3) sample barrier streams pre- and post-construction; and (4) sample barrier and reference streams pre- and post-construction. In the statistical literature, the principal basis for comparison of sampling designs is generally the precision achieved by each design. In addition to precision, designs should be compared based on the interpretability of results and on the scale to which the results apply. Using data collected in a broad survey of streams with and without sea lamprey barriers, some of the tradeoffs that occur among precision, scale, and interpretability are illustrated. Although circumstances such as funding and availability of pre-construction data may limit which design can be implemented, a pre/post-construction design including barrier and reference streams provides the most meaningful information for use in barrier management decisions. Where it is not feasible to obtain pre-construction data, a design including reference streams is important to maintain the interpretability of results. Regardless of the design used, process-oriented research provides a framework for interpreting results obtained in broad surveys. As such, information from both extensive surveys and intensive process-oriented research provides the best basis for fishery management actions, and gives researchers and managers the most confidence in the conclusions reached regarding the effects of sea lamprey barriers.
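    Design 4 above, sampling barrier and reference streams both pre- and post-construction, yields a difference-in-differences effect estimate; a sketch with hypothetical stream-level data:

```python
# Before/after x barrier/reference effect estimate: the change at
# barrier streams minus the change at reference streams, which removes
# region-wide temporal trends from the barrier effect.
def mean(xs):
    return sum(xs) / len(xs)

def baci_effect(barrier_pre, barrier_post, ref_pre, ref_post):
    return ((mean(barrier_post) - mean(barrier_pre))
            - (mean(ref_post) - mean(ref_pre)))

# Hypothetical species-richness counts per stream.
effect = baci_effect(barrier_pre=[12, 14, 13],
                     barrier_post=[9, 10, 11],
                     ref_pre=[11, 13, 12],
                     ref_post=[11, 12, 13])
```

    Without the reference-stream term, any basin-wide decline would be misattributed to the barriers, which is the interpretability argument the authors make for designs 2 and 4.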

  15. Geothermal energy as a source of electricity. A worldwide survey of the design and operation of geothermal power plants

    NASA Astrophysics Data System (ADS)

    Dipippo, R.

    1980-01-01

    An overview of geothermal power generation is presented. A survey of geothermal power plants is given for the following countries: China, El Salvador, Iceland, Italy, Japan, Mexico, New Zealand, Philippines, Turkey, USSR, and USA. A survey of countries planning geothermal power plants is included.

  16. Geothermal energy as a source of electricity. A worldwide survey of the design and operation of geothermal power plants

    SciTech Connect

    DiPippo, R.

    1980-01-01

    An overview of geothermal power generation is presented. A survey of geothermal power plants is given for the following countries: China, El Salvador, Iceland, Italy, Japan, Mexico, New Zealand, Philippines, Turkey, USSR, and USA. A survey of countries planning geothermal power plants is included. (MHR)

  17. Application of Screening Experimental Designs to Assess Chromatographic Isotope Effect upon Isotope-Coded Derivatization for Quantitative Liquid Chromatography–Mass Spectrometry

    PubMed Central

    2015-01-01

    Isotope effect may cause partial chromatographic separation of labeled (heavy) and unlabeled (light) isotopologue pairs. Together with a simultaneous matrix effect, this could lead to unacceptable accuracy in quantitative liquid chromatography–mass spectrometry assays, especially when electrospray ionization is used. Four biologically relevant reactive aldehydes (acrolein, malondialdehyde, 4-hydroxy-2-nonenal, and 4-oxo-2-nonenal) were derivatized with light or heavy (d₃-, ¹³C₆-, ¹⁵N₂-, or ¹⁵N₄-labeled) 2,4-dinitrophenylhydrazine and used as model compounds to evaluate chromatographic isotope effects. For comprehensive assessment of retention time differences between light/heavy pairs under various gradient reversed-phase liquid chromatography conditions, major chromatographic parameters (stationary phase, mobile phase pH, temperature, organic solvent, and gradient slope) and different isotope labelings were addressed by multiple-factor screening using experimental designs that included both asymmetrical (Addelman) and Plackett–Burman schemes followed by statistical evaluations. Results confirmed that the most effective approach to avoid chromatographic isotope effect is the use of ¹⁵N or ¹³C labeling instead of deuterium labeling, while chromatographic parameters had no general influence. Comparison of the alternate isotope-coded derivatization assay (AIDA) using deuterium versus ¹⁵N labeling gave unacceptable differences (>15%) upon quantifying some of the model aldehydes from biological matrixes. On the basis of our results, we recommend the modification of the AIDA protocol by replacing d₃-2,4-dinitrophenylhydrazine with ¹⁵N- or ¹³C-labeled derivatizing reagent to avoid possible unfavorable consequences of chromatographic isotope effects. PMID:24922593

  18. Application of screening experimental designs to assess chromatographic isotope effect upon isotope-coded derivatization for quantitative liquid chromatography-mass spectrometry.

    PubMed

    Szarka, Szabolcs; Prokai-Tatrai, Katalin; Prokai, Laszlo

    2014-07-15

    Isotope effect may cause partial chromatographic separation of labeled (heavy) and unlabeled (light) isotopologue pairs. Together with a simultaneous matrix effect, this could lead to unacceptable accuracy in quantitative liquid chromatography-mass spectrometry assays, especially when electrospray ionization is used. Four biologically relevant reactive aldehydes (acrolein, malondialdehyde, 4-hydroxy-2-nonenal, and 4-oxo-2-nonenal) were derivatized with light or heavy (d3-, (13)C6-, (15)N2-, or (15)N4-labeled) 2,4-dinitrophenylhydrazine and used as model compounds to evaluate chromatographic isotope effects. For comprehensive assessment of retention time differences between light/heavy pairs under various gradient reversed-phase liquid chromatography conditions, major chromatographic parameters (stationary phase, mobile phase pH, temperature, organic solvent, and gradient slope) and different isotope labelings were addressed by multiple-factor screening using experimental designs that included both asymmetrical (Addelman) and Plackett-Burman schemes followed by statistical evaluations. Results confirmed that the most effective approach to avoid chromatographic isotope effect is the use of (15)N or (13)C labeling instead of deuterium labeling, while chromatographic parameters had no general influence. Comparison of the alternate isotope-coded derivatization assay (AIDA) using deuterium versus (15)N labeling gave unacceptable differences (>15%) upon quantifying some of the model aldehydes from biological matrixes. On the basis of our results, we recommend the modification of the AIDA protocol by replacing d3-2,4-dinitrophenylhydrazine with (15)N- or (13)C-labeled derivatizing reagent to avoid possible unfavorable consequences of chromatographic isotope effects. PMID:24922593
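    A Plackett-Burman screen of the kind described can be sketched as follows: an 8-run design handles up to 7 two-level factors, and each factor's effect is the mean response at its high level minus the mean at its low level. The response values here are hypothetical retention-time differences, not data from the study:

```python
# Minimal Plackett-Burman screening sketch (8 runs, 7 two-level factors).
def plackett_burman_8():
    """8-run PB design: cyclic shifts of a generator row plus an
    all-minus row; every column is balanced (four +1, four -1)."""
    gen = [1, 1, 1, -1, 1, -1, -1]
    rows = [gen[-i:] + gen[:-i] for i in range(7)]
    rows.append([-1] * 7)
    return rows

def factor_effects(design, responses):
    """Effect of each factor: mean response at +1 minus mean at -1."""
    n_factors = len(design[0])
    effects = []
    for j in range(n_factors):
        plus = [y for row, y in zip(design, responses) if row[j] == 1]
        minus = [y for row, y in zip(design, responses) if row[j] == -1]
        effects.append(sum(plus) / len(plus) - sum(minus) / len(minus))
    return effects

design = plackett_burman_8()
# Hypothetical retention-time differences (min) between heavy/light
# isotopologues for each of the 8 screening runs.
responses = [0.12, 0.10, 0.11, 0.02, 0.13, 0.03, 0.02, 0.01]
effects = factor_effects(design, responses)
```

    Large absolute effects flag the factors (here, the chromatographic parameters and labeling choices) that matter; the study's conclusion was that the labeling factor dominates while chromatographic parameters have no general influence.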

  19. Rules for the preparation of manuscript and illustrations designed for publication by the United States Geological Survey

    USGS Publications Warehouse

    Hampson, Thomas

    1888-01-01

    In the annual report of the Director of the U. S. Geological Survey for 1885-'86, pages 40 and 41, you set forth the functions of the chief of the editorial division as follows: "To secure clear and accurate statement in the material sent to press, careful proof-reading, and uniformity in the details of book-making, as well as to assist the Director in exercising a general supervision over the publications of the Survey."

  20. Risk-based design of repeated surveys for the documentation of freedom from non-highly contagious diseases.

    PubMed

    Hadorn, Daniela C; Rüfenacht, Jürg; Hauser, Ruth; Stärk, Katharina D C

    2002-12-30

    The documentation of freedom from disease requires reliable information on the actual disease status in a specific animal population. The implementation of active surveillance (surveys) is an effective method to gain this information. For economic reasons, the sample size should be as small as possible but large enough to achieve the required confidence level for a targeted threshold. When conducting surveys repeatedly, various information sources about the disease status of the population can be taken into account to adjust the required level of confidence for a follow-up survey (e.g. risk assessments regarding disease introduction and results of previous surveys). As a benefit, the sample size for national surveys can be reduced considerably. We illustrate this risk-based approach using examples of national surveys conducted in Switzerland. The sample size for the documentation of freedom from enzootic bovine leucosis (EBL) and Brucella melitensis in sheep and in goats could be reduced from 2325 to 415 cattle herds, from 2325 to 838 sheep herds and from 1975 to 761 goat herds, respectively. PMID:12441234
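    Sample sizes of this kind typically come from the standard freedom-from-disease approximation, where the required confidence (which the risk-based approach relaxes using prior evidence) drives the sample size; a sketch assuming a large population and a perfect test (the example prevalence is illustrative, not a figure from the paper):

```python
import math

def freedom_sample_size(confidence: float, design_prevalence: float) -> int:
    """Herds to sample so that, if disease were present at the design
    prevalence, at least one positive herd would be detected with the
    stated confidence: n = ln(1 - confidence) / ln(1 - prevalence)."""
    return math.ceil(math.log(1.0 - confidence)
                     / math.log(1.0 - design_prevalence))

# Hypothetical design prevalence of 0.2%: demanding 99% confidence vs
# a risk-adjusted 95% confidence for a follow-up survey.
n_full = freedom_sample_size(0.99, 0.002)
n_risk_based = freedom_sample_size(0.95, 0.002)
```

    Relaxing the confidence from 99% to 95% cuts the required sample by roughly a third, which is the mechanism behind the large reductions (e.g. 2325 to 415 cattle herds) reported above.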

  1. Doctoral Training in Statistics, Measurement, and Methodology in Psychology: Replication and Extension of Aiken, West, Sechrest, and Reno's (1990) Survey of PhD Programs in North America

    ERIC Educational Resources Information Center

    Aiken, Leona S.; West, Stephen G.; Millsap, Roger E.

    2008-01-01

    In a survey of all PhD programs in psychology in the United States and Canada, the authors documented the quantitative methodology curriculum (statistics, measurement, and research design) to examine the extent to which innovations in quantitative methodology have diffused into the training of PhDs in psychology. In all, 201 psychology PhD…

  2. EuropeaN Energy balance Research to prevent excessive weight Gain among Youth (ENERGY) project: Design and methodology of the ENERGY cross-sectional survey

    PubMed Central

    2011-01-01

    Background Obesity treatment is by and large ineffective in the long term, and more emphasis on the prevention of excessive weight gain in childhood and adolescence is warranted. To inform energy balance related behaviour (EBRB) change interventions, insight into the potential personal, family and school environmental correlates of these behaviours is needed. Studies on such multilevel correlates of EBRB among schoolchildren in Europe are lacking. The ENERGY survey aims to (1) provide up-to-date prevalence rates of measured overweight, obesity, self-reported engagement in EBRBs, objective accelerometer-based assessment of physical activity and sedentary behaviour, and blood-sample biomarkers of metabolic function in countries in different regions of Europe, and (2) identify personal, family and school environmental correlates of these EBRBs. This paper describes the design, methodology and protocol of the survey. Method/Design A school-based cross-sectional survey was carried out in 2010 in seven different European countries: Belgium, Greece, Hungary, the Netherlands, Norway, Slovenia, and Spain. The survey included measurements of anthropometrics, child, parent and school-staff questionnaires, and school observations to measure and assess outcomes (i.e. height, weight, and waist circumference), EBRBs and potential personal, family and school environmental correlates of these behaviours, including social-cultural, physical, political, and economic environmental factors. In addition, a selection of countries conducted accelerometer measurements to objectively assess physical activity and sedentary behaviour, and collected blood samples to assess several biomarkers of metabolic function. Discussion The ENERGY survey is a comprehensive cross-sectional study measuring anthropometrics and biomarkers as well as assessing a range of EBRBs and their potential correlates at the personal, family and school level, among 10- to 12-year-old children in seven European countries. 
This study

  3. Use of Public Opinion Surveys.

    ERIC Educational Resources Information Center

    Copeland, Susan

    2002-01-01

    Describes how to design and administer public-opinion surveys. Includes types of surveys, preparing survey questions, drawing and validating a sample, and processing the data. (Contains 16 references.) (PKP)

  4. A survey of surveys

    SciTech Connect

    Kent, S.M.

    1994-11-01

    A new era for the field of Galactic structure is about to be opened with the advent of wide-area digital sky surveys. In this article, the author reviews the status of and research prospects for three new ground-based surveys: the Sloan Digital Sky Survey (SDSS), the Deep Near-Infrared Survey of the Southern Sky (DENIS), and the Two Micron All Sky Survey (2MASS). These surveys will permit studies of Galactic structure and stellar populations in the Galaxy in unprecedented detail. Extracting the information, however, will be challenging.

  5. Identity and Philanthropy: Designing a Survey Instrument to Operationalize Lesbian, Gay, Bisexual, Transgender, and Queer Alumni Giving

    ERIC Educational Resources Information Center

    Garvey, Jason C.

    2013-01-01

    This study investigated philanthropic giving to higher education among lesbian, gay, bisexual, transgender, and queer (LGBTQ) alumni. The primary purpose was to create a multi-institutional survey instrument that operationalizes philanthropic involvement and motivation among LGBTQ alumni. Additional objectives included creating factors and items…

  6. A WHOLE-LAKE WATER QUALITY SURVEY OF LAKE OAHE BASED ON A SPATIALLY-BALANCED PROBABILISTIC DESIGN

    EPA Science Inventory

    Assessing conditions on large bodies of water presents multiple statistical and logistical challenges. As part of the Upper Missouri River Program of the Environmental Monitoring and Assessment Program (EMAP), we surveyed the water quality of Lake Oahe in July-August 2002 using a spat...

  7. Preventing pitfalls in patient surveys.

    PubMed

    Steiber, S R

    1989-05-01

    Properly conceived, customer satisfaction surveys can yield the quantitative data needed to gauge patient satisfaction. But, as the author notes, these surveys can be "a veritable mine field of surprises for the uninitiated." This article, the last in a three-part series on measuring patient satisfaction, describes potential pitfalls and discusses the merits of in-person, mail and telephone surveys. PMID:10293191

  9. [The first wave of the German Health Interview and Examination Survey for Adults (DEGS1): sample design, response, weighting and representativeness].

    PubMed

    Kamtsiuris, P; Lange, M; Hoffmann, R; Schaffrath Rosario, A; Dahm, S; Kuhnert, R; Kurth, B M

    2013-05-01

    The "German Health Interview and Examination Survey for Adults" (DEGS) is part of the health monitoring program of the Robert Koch Institute (RKI) and is designed as a combined cross-sectional and longitudinal survey. The first wave (DEGS1; 2008-2011) comprised interviews and physical examinations. The target population was 18- to 79-year-olds living in Germany. The mixed design consisted of a new sample randomly chosen from local population registries, supplemented by participants from the "German National Health Interview and Examination Survey 1998" (GNHIES98). In total, 8,152 persons took part, among them 4,193 newly invited (response 42%) and 3,959 who had previously taken part in GNHIES98 (response 62%). Of these, 7,238 participants visited one of the 180 local study centres and 914 took part in the interview-only programme. The comparison of the net sample with the group of non-participants and with the resident population of Germany suggests high representativeness with respect to various attributes. To account for certain aspects of the population structure, cross-sectional, trend and longitudinal analyses are corrected by weighting factors. Furthermore, the different participation probabilities of the former GNHIES98 participants are compensated for. An English full-text version of this article is available at SpringerLink as supplemental material.
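    The weighting step described above can be illustrated with a minimal post-stratification sketch; the cells and population shares below are hypothetical, and the actual DEGS1 weighting additionally corrects for the re-participation probabilities of former GNHIES98 subjects:

```python
# Post-stratification: weight each respondent so the weighted sample
# reproduces known population shares (e.g. age-sex cells from official
# statistics). Weight = population share of the cell / sample share.
def poststratification_weights(sample_cells, population_shares):
    n = len(sample_cells)
    counts = {}
    for cell in sample_cells:
        counts[cell] = counts.get(cell, 0) + 1
    return [population_shares[cell] / (counts[cell] / n)
            for cell in sample_cells]

# Hypothetical sample in which men aged 18-39 are overrepresented
# relative to a 50/50 population split.
sample = ["m18-39", "m18-39", "m18-39", "f18-39"]
shares = {"m18-39": 0.5, "f18-39": 0.5}
weights = poststratification_weights(sample, shares)
```

    The weights sum to the sample size, and the weighted cell shares match the population shares, which is exactly the correction applied to DEGS1's cross-sectional, trend and longitudinal analyses.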

  10. Participant dropout as a function of survey length in internet-mediated university studies: implications for study design and voluntary participation in psychological research.

    PubMed

    Hoerger, Michael

    2010-12-01

    Internet-mediated research has offered substantial advantages over traditional laboratory-based research in terms of efficiently and affordably allowing for the recruitment of large samples of participants for psychology studies. Core technical, ethical, and methodological issues have been addressed in recent years, but the important issue of participant dropout has received surprisingly little attention. Specifically, web-based psychology studies often involve undergraduates completing lengthy and time-consuming batteries of online personality questionnaires, but no known published studies to date have closely examined the natural course of participant dropout during attempted completion of these studies. The present investigation examined participant dropout among 1,963 undergraduates completing one of six web-based survey studies relatively representative of those conducted in university settings. Results indicated that 10% of participants could be expected to drop out of these studies nearly instantaneously, with an additional 2% dropping out per 100 survey items included in the study. For individual project investigators, these findings hold ramifications for study design considerations, such as conducting a priori power analyses. The present results also have broader ethical implications for understanding and improving voluntary participation in research involving human subjects. Nonetheless, the generalizability of these conclusions may be limited to studies involving similar design or survey content.
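    The reported figures, roughly 10% near-instantaneous dropout plus about 2% per 100 survey items, suggest a simple linear planning model for a priori power analyses. A sketch (the linearity is an approximation; real dropout curves vary):

```python
import math

# Planning helpers built on the reported dropout pattern: ~10% of
# participants drop out nearly instantaneously, plus ~2% per 100 items.
def expected_completers(n_recruited: int, n_items: int) -> float:
    """Expected number of participants completing an n_items survey."""
    dropout = 0.10 + 0.02 * (n_items / 100)
    return n_recruited * (1 - dropout)

def recruits_needed(target_completers: int, n_items: int) -> int:
    """Recruits required to end up with the target number of completers."""
    retention = 1 - (0.10 + 0.02 * (n_items / 100))
    return math.ceil(target_completers / retention)
```

    For a 200-item battery, for example, retention is about 86%, so a study needing 250 completers should plan to recruit roughly 290 participants.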

  11. Detailed flow surveys of turning vanes designed for a 0.1-scale model of NASA Lewis Research Center's proposed altitude wind tunnel

    NASA Technical Reports Server (NTRS)

    Moore, Royce D.; Shyne, Rickey J.; Boldman, Donald R.; Gelder, Thomas F.

    1987-01-01

    Detailed flow surveys downstream of the corner turning vanes and downstream of the fan inlet guide vanes have been obtained in a 0.1-scale model of the NASA Lewis Research Center's proposed Altitude Wind Tunnel. Two turning vane designs were evaluated in both corners 1 and 2 (the corners between the test section and the drive fan). Vane A was a controlled-diffusion airfoil and vane B was a circular-arc airfoil. At given flows the turning vane wakes were surveyed to determine the vane pressure losses. For both corners the vane A turning vane configuration gave lower losses than the vane B configuration in the regions where the flow regime should be representative of two-dimensional flow. For both vane sets the vane loss coefficient increased rapidly near the walls.
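    Vane pressure losses from wake surveys of this kind are typically summarized as a mass-averaged total-pressure loss coefficient, the inlet-to-wake total-pressure deficit normalized by the inlet dynamic pressure. A sketch with hypothetical survey values (the definition is the standard one, not code from the report):

```python
# Mass-averaged total-pressure loss coefficient over discrete wake
# survey points: omega = sum(m_i * (Pt_in - Pt_i)) / (sum(m_i) * q_in).
def loss_coefficient(pt_inlet, q_inlet, pt_wake, mass_flux):
    num = sum(m * (pt_inlet - pt) for pt, m in zip(pt_wake, mass_flux))
    return num / (sum(mass_flux) * q_inlet)

# Hypothetical survey across one vane pitch (pressures in kPa), with
# the total-pressure deficit concentrated in the vane wake itself.
pt_in, q_in = 101.0, 2.0
pt_wake = [101.0, 100.6, 99.8, 100.6, 101.0]
mdot = [1.0, 1.0, 0.8, 1.0, 1.0]
omega = loss_coefficient(pt_in, q_in, pt_wake, mdot)
```

    Comparing omega between vane sets at matched flows is the kind of figure of merit by which the controlled-diffusion vane A could be judged lower-loss than the circular-arc vane B.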

  12. Survey of selected design and ventilation characteristics of racehorse stables in the Pretoria, Witwatersrand, Vereeniging area of South Africa.

    PubMed

    Lund, R J; Guthrie, A J; Killeen, V M

    1993-12-01

    Stables housing more than 20 horses in training were surveyed in the Pretoria, Witwatersrand, Vereeniging area of South Africa. Most racehorses were kept in loose boxes, bedded on straw or sawdust, and remained indoors while the stables were cleaned. The average floor area was 13 m² and airspace was 55 m³ per animal. The average predicted minimum air change rate by natural convection in calm winds was 7.0 air changes per hour, which was reduced to 2.2 when the doors and shutters were closed. The survey showed that many of the stables had been built without due consideration to factors that might have adverse effects on the occupants.
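    The air-change figures above convert to a ventilation flow per animal via Q = ACH × V; a trivial sketch using the reported 55 m³ of airspace per horse (the conversion is standard arithmetic, not code from the paper):

```python
# Ventilation flow per animal from an air change rate (ACH) and the
# airspace volume per animal: Q [m^3/h] = ACH [1/h] * V [m^3].
def ventilation_flow(ach: float, volume_m3: float) -> float:
    return ach * volume_m3

open_flow = ventilation_flow(7.0, 55.0)    # doors and shutters open
closed_flow = ventilation_flow(2.2, 55.0)  # doors and shutters closed
```

    Closing doors and shutters thus cuts the predicted fresh-air supply from 385 to about 121 m³ per hour per horse.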

  13. Evaluation of Nine Consensus Indices in Delphi Foresight Research and Their Dependency on Delphi Survey Characteristics: A Simulation Study and Debate on Delphi Design and Interpretation

    PubMed Central

    Birko, Stanislav; Dove, Edward S.; Özdemir, Vural

    2015-01-01

    The extent of consensus (or the lack thereof) among experts in emerging fields of innovation can serve as antecedents of scientific, societal, investor and stakeholder synergy or conflict. Naturally, how we measure consensus is of great importance to science and technology strategic foresight. The Delphi methodology is a widely used anonymous survey technique to evaluate consensus among a panel of experts. Surprisingly, there is little guidance on how indices of consensus can be influenced by parameters of the Delphi survey itself. We simulated a classic three-round Delphi survey building on the concept of clustered consensus/dissensus. We evaluated three study characteristics that are pertinent for design of Delphi foresight research: (1) the number of survey questions, (2) the sample size, and (3) the extent to which experts conform to group opinion (the Group Conformity Index) in a Delphi study. Their impacts on the following nine Delphi consensus indices were then examined in 1000 simulations: Clustered Mode, Clustered Pairwise Agreement, Conger’s Kappa, De Moivre index, Extremities Version of the Clustered Pairwise Agreement, Fleiss’ Kappa, Mode, the Interquartile Range and Pairwise Agreement. The dependency of a consensus index on the Delphi survey characteristics was expressed from 0.000 (no dependency) to 1.000 (full dependency). The number of questions (range: 6 to 40) in a survey did not have a notable impact whereby the dependency values remained below 0.030. The variation in sample size (range: 6 to 50) displayed the top three impacts for the Interquartile Range, the Clustered Mode and the Mode (dependency = 0.396, 0.130, 0.116, respectively). The Group Conformity Index, a construct akin to measuring stubbornness/flexibility of experts’ opinions, greatly impacted all nine Delphi consensus indices (dependency = 0.200 to 0.504), except the Extremity CPWA and the Interquartile Range that were impacted only beyond the first decimal point (dependency
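
    Two of the simpler consensus indices named above can be illustrated on a toy expert panel (hypothetical data; this is not the study's simulation code, which additionally models clustered consensus and group conformity):

```python
from itertools import combinations
import statistics

def pairwise_agreement(ratings):
    """Fraction of expert pairs giving identical ratings."""
    pairs = list(combinations(ratings, 2))
    return sum(a == b for a, b in pairs) / len(pairs)

def interquartile_range(ratings):
    """Spread of the middle 50% of ratings (smaller = more consensus)."""
    q = statistics.quantiles(ratings, n=4)   # exclusive method by default
    return q[2] - q[0]

panel = [4, 4, 5, 4, 3, 4]   # Likert-style ratings from six experts
```

    On this panel, six of the fifteen expert pairs agree exactly, so pairwise agreement is 0.4; the interquartile range is narrow, indicating fairly strong consensus.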

  14. An investigation into the feasibility of designing a framework for the quantitative evaluation of the Clinical Librarian service at an NHS Trust in Brighton, UK.

    PubMed

    Deshmukh, Archana; Roper, Tom

    2014-12-01

    This feature presents research undertaken by Archana Deshmukh for her MA dissertation at the University of Brighton. She worked closely with Tom Roper, the Clinical Librarian at Brighton and Sussex University Hospitals NHS Trust, in a project to explore the feasibility of applying quantitative measures to evaluate the Clinical Librarian service. The investigation used an innovative participatory approach and the findings showed that although an exclusively quantitative approach to evaluation is not feasible, using a mixed methods approach is a way forward. Agreed outputs and outcomes could be embedded in a marketing plan, and the resulting framework could provide evidence to demonstrate overall impact. Archana graduated in July 2014, gaining a Distinction in the MA in Information Studies, and she is currently looking for work in the health information sector. PMID:25443028

  15. An investigation into the feasibility of designing a framework for the quantitative evaluation of the Clinical Librarian service at an NHS Trust in Brighton, UK.

    PubMed

    Deshmukh, Archana; Roper, Tom

    2014-12-01

    This feature presents research undertaken by Archana Deshmukh for her MA dissertation at the University of Brighton. She worked closely with Tom Roper, the Clinical Librarian at Brighton and Sussex University Hospitals NHS Trust, in a project to explore the feasibility of applying quantitative measures to evaluate the Clinical Librarian service. The investigation used an innovative participatory approach and the findings showed that although an exclusively quantitative approach to evaluation is not feasible, using a mixed methods approach is a way forward. Agreed outputs and outcomes could be embedded in a marketing plan, and the resulting framework could provide evidence to demonstrate overall impact. Archana graduated in July 2014, gaining a Distinction in the MA in Information Studies, and she is currently looking for work in the health information sector.

  16. Berkeley Quantitative Genome Browser

    2008-02-29

    The Berkeley Quantitative Genome Browser provides graphical browsing functionality for genomic data organized, at a minimum, by sequence and position. While supporting the annotation browsing features typical of many other genomic browsers, additional emphasis is placed on viewing and utilizing quantitative data. Data may be read from GFF, SGR, FASTA or any column-delimited format. Once the data has been read into the browser's buffer, it may be searched, filtered, or subjected to mathematical transformation. The browser also supplies some graphical design manipulation functionality geared towards preparing figures for presentations or publication. A plug-in mechanism enables development outside the core functionality that adds more advanced or esoteric analysis capabilities. BBrowse's development and distribution is open-source and it has been built to run on Linux, OSX and MS Windows operating systems.

  17. Berkeley Quantitative Genome Browser

    SciTech Connect

    Hechmer, Aaron

    2008-02-29

    The Berkeley Quantitative Genome Browser provides graphical browsing functionality for genomic data organized, at a minimum, by sequence and position. While supporting the annotation browsing features typical of many other genomic browsers, additional emphasis is placed on viewing and utilizing quantitative data. Data may be read from GFF, SGR, FASTA or any column-delimited format. Once the data has been read into the browser's buffer, it may be searched, filtered, or subjected to mathematical transformation. The browser also supplies some graphical design manipulation functionality geared towards preparing figures for presentations or publication. A plug-in mechanism enables development outside the core functionality that adds more advanced or esoteric analysis capabilities. BBrowse's development and distribution is open-source and it has been built to run on Linux, OSX and MS Windows operating systems.

  18. Cartography at the U.S. Geological Survey: the National Mapping Division's cartographic programs, products, design, and technology

    USGS Publications Warehouse

    Ogrosky, Charles E.; Gwynn, William; Jannace, Richard

    1989-01-01

    The U.S. Geological Survey (USGS) is the prime source of many kinds of topographic and special-purpose maps of the United States and its outlying areas. It is also a prime source of digital map data. One main goal of the USGS is to provide large-scale topographic map coverage of the entire United States. Most of the Nation is already covered. We expect that initial coverage will be completed by 1991. For many purposes, many public agencies, private organizations, and individuals need reliable cartographic and geographic knowledge about our Nation. To serve such needs, all USGS maps are compiled to exacting standards of accuracy and content.

  19. Design and Implementation of a Comprehensive Web-based Survey for Ovarian Cancer Survivorship with an Analysis of Prediagnosis Symptoms via Text Mining

    PubMed Central

    Sun, Jiayang; Bogie, Kath M; Teagno, Joe; Sun, Yu-Hsiang (Sam); Carter, Rebecca R; Cui, Licong; Zhang, Guo-Qiang

    2014-01-01

    Ovarian cancer (OvCa) is the most lethal gynecologic disease in the United States, with an overall 5-year survival rate of 44.5%, about half of the 89.2% for all breast cancer patients. To identify factors that possibly contribute to the long-term survivorship of women with OvCa, we conducted a comprehensive online Ovarian Cancer Survivorship Survey from 2009 to 2013. This paper presents the design and implementation of our survey, introduces its resulting data source, the OVA-CRADLE™ (Clinical Research Analytics and Data Lifecycle Environment), and illustrates a sample application of the survey and data by an analysis of prediagnosis symptoms, using text mining and statistics. The OVA-CRADLE™ is an application of our patented Physio-MIMI technology, facilitating Web-based access, online query and exploration of data. The prediagnostic symptoms and association of early-stage OvCa diagnosis with endometriosis provide potentially important indicators for future studies in this field. PMID:25861211

  20. Design and Implementation of a Comprehensive Web-based Survey for Ovarian Cancer Survivorship with an Analysis of Prediagnosis Symptoms via Text Mining.

    PubMed

    Sun, Jiayang; Bogie, Kath M; Teagno, Joe; Sun, Yu-Hsiang Sam; Carter, Rebecca R; Cui, Licong; Zhang, Guo-Qiang

    2014-01-01

    Ovarian cancer (OvCa) is the most lethal gynecologic disease in the United States, with an overall 5-year survival rate of 44.5%, about half of the 89.2% for all breast cancer patients. To identify factors that possibly contribute to the long-term survivorship of women with OvCa, we conducted a comprehensive online Ovarian Cancer Survivorship Survey from 2009 to 2013. This paper presents the design and implementation of our survey, introduces its resulting data source, the OVA-CRADLE™ (Clinical Research Analytics and Data Lifecycle Environment), and illustrates a sample application of the survey and data by an analysis of prediagnosis symptoms, using text mining and statistics. The OVA-CRADLE™ is an application of our patented Physio-MIMI technology, facilitating Web-based access, online query and exploration of data. The prediagnostic symptoms and association of early-stage OvCa diagnosis with endometriosis provide potentially important indicators for future studies in this field.

  1. Quantitative measurement of the chemical composition of geological standards with a miniature laser ablation/ionization mass spectrometer designed for in situ application in space research

    NASA Astrophysics Data System (ADS)

    Neuland, M. B.; Grimaudo, V.; Mezger, K.; Moreno-García, P.; Riedo, A.; Tulej, M.; Wurz, P.

    2016-03-01

    A key interest of planetary space missions is the quantitative determination of the chemical composition of the planetary surface material. The chemical composition of surface material (minerals, rocks, soils) yields fundamental information that can be used to answer key scientific questions about the formation and evolution of the planetary body in particular and the Solar System in general. We present a miniature time-of-flight type laser ablation/ionization mass spectrometer (LMS) and demonstrate its capability in measuring the elemental and mineralogical composition of planetary surface samples quantitatively by using a femtosecond laser for ablation/ionization. The small size and weight of the LMS make it a remarkable tool for in situ chemical composition measurements in space research, convenient for operation on a lander or rover exploring a planetary surface. In the laboratory, we measured the chemical composition of four geological standard reference samples, USGS AGV-2 Andesite, USGS SCo-1 Cody Shale, NIST 97b Flint Clay and USGS QLO-1 Quartz Latite, with LMS. These standard samples are used to determine the sensitivity factors of the instrument. One important result is that all sensitivity factors are close to 1. Additionally, it is observed that the sensitivity factor of an element depends on its electron configuration, hence on the electron work function and the elemental group, in agreement with existing theory. Furthermore, the conformity of the sensitivity factors is supported by mineralogical analyses of the USGS SCo-1 and the NIST 97b samples. With the four different reference samples, the consistency of the calibration factors can be demonstrated, which constitutes the fundamental basis for a standard-less measurement technique for in situ quantitative chemical composition measurements on planetary surfaces.
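
    The role of the sensitivity factors can be sketched as a simple correction step: measured intensities are divided by per-element relative sensitivity factors (RSFs) and renormalized. The numbers below are invented for illustration; real RSFs are instrument- and element-specific:

```python
def apply_sensitivity_factors(intensities, rsf):
    """Divide measured ion intensities by per-element relative sensitivity
    factors, then renormalize to atomic fractions."""
    corrected = {el: i / rsf[el] for el, i in intensities.items()}
    total = sum(corrected.values())
    return {el: v / total for el, v in corrected.items()}

signal = {"Si": 0.52, "Al": 0.18, "Fe": 0.10, "Ca": 0.20}  # invented fractions
rsf = {"Si": 1.05, "Al": 0.95, "Fe": 1.10, "Ca": 0.90}     # all close to 1
composition = apply_sensitivity_factors(signal, rsf)
```

    That the measured factors all lie close to 1 is what makes a standard-less mode plausible: even uncorrected intensities already sit near the true composition, and the RSF step is a small perturbation.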

  2. System Infrastructure Needs for Web Course Delivery: A Survey of Online Courses in Florida Community Colleges.

    ERIC Educational Resources Information Center

    Ricci, Glenn A.

    This quantitative study describes the system infrastructure needs and perceptions of the 28 Florida community colleges regarding current Web course delivery. Section 1 assesses 27 (96.4%) Florida Community College Distance Learning Consortium (FCCDLC) member representative responses to a 19-item, researcher-designed survey. The study includes…

  3. A Meta-Analysis Method to Advance Design of Technology-Based Learning Tool: Combining Qualitative and Quantitative Research to Understand Learning in Relation to Different Technology Features

    ERIC Educational Resources Information Center

    Zhang, Lin

    2014-01-01

    Educators design and create various technology tools to scaffold students' learning. As more and more technology designs are incorporated into learning, growing attention has been paid to the study of technology-based learning tool. This paper discusses the emerging issues, such as how can learning effectiveness be understood in relation to…

  4. Structure-based and multiple potential three-dimensional quantitative structure-activity relationship (SB-MP-3D-QSAR) for inhibitor design.

    PubMed

    Du, Qi-Shi; Gao, Jing; Wei, Yu-Tuo; Du, Li-Qin; Wang, Shu-Qing; Huang, Ri-Bo

    2012-04-23

    The inhibitions of enzymes (proteins) are determined by the binding interactions between ligands and targeting proteins. However, traditional QSAR (quantitative structure-activity relationship) is a one-sided technique, considering only the structures and physicochemical properties of inhibitors. In this study, the structure-based and multiple potential three-dimensional quantitative structure-activity relationship (SB-MP-3D-QSAR) is presented, in which the structural information of the host protein is involved in the QSAR calculations. The SB-MP-3D-QSAR is in effect a combination of a docking approach and a QSAR technique. Multiple docking calculations are performed first between the host protein and the ligand molecules in a training set. In the targeting protein, the functional residues are selected, which make the major contribution to the binding free energy. The binding free energy between ligand and targeting protein is the summation of multiple potential energies, including van der Waals energy, electrostatic energy, hydrophobic energy, and hydrogen-bond energy, and may include nonthermodynamic factors. In the foundational QSAR equation, two sets of weighting coefficients {aj} and {bp} are assigned to the potential energy terms and to the functional residues, respectively. The two coefficient sets are solved by using the iterative double least-squares (IDLS) technique on the training set. Then, the two sets of weighting coefficients are used to predict the bioactivities of inquired ligands. In an application example, the newly developed method obtained much better results than docking calculations alone.
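
    The alternating structure of an iterative double least-squares fit can be sketched on a toy bilinear model: fix the residue weights {bp} and fit the energy-term weights {aj} by ordinary least squares, then fix {aj} and refit {bp}. All data below are invented, and this is only an illustration of the alternating idea, not the paper's implementation:

```python
# E[i][p][j] stands for potential-energy term j of ligand i at functional
# residue p; y holds the (invented) binding activities.

def solve(A, rhs):
    """Gauss-Jordan elimination for small, well-conditioned systems."""
    n = len(A)
    M = [row[:] + [rhs[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        piv = M[c][c]
        M[c] = [x / piv for x in M[c]]
        for r in range(n):
            if r != c:
                f = M[r][c]
                M[r] = [x - f * y for x, y in zip(M[r], M[c])]
    return [M[i][n] for i in range(n)]

def lstsq(X, y):
    """Least squares via the normal equations (fine for tiny problems)."""
    k = len(X[0])
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    Xty = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    return solve(XtX, Xty)

def idls(E, y, iters=200):
    """Alternate least-squares fits of {aj} (energy terms) and {bp} (residues)."""
    n_res, n_terms = len(E[0]), len(E[0][0])
    a, b = [1.0] * n_terms, [1.0] * n_res
    for _ in range(iters):
        # Fix b, fit a:  y_i ~ sum_j a_j * (sum_p b_p * E[i][p][j])
        a = lstsq([[sum(b[p] * E[i][p][j] for p in range(n_res))
                    for j in range(n_terms)] for i in range(len(E))], y)
        # Fix a, fit b:  y_i ~ sum_p b_p * (sum_j a_j * E[i][p][j])
        b = lstsq([[sum(a[j] * E[i][p][j] for j in range(n_terms))
                    for p in range(n_res)] for i in range(len(E))], y)
    return a, b

E = [[[1.0, 0.2], [0.5, 1.0]],
     [[0.3, 1.0], [1.0, 0.1]],
     [[0.8, 0.8], [0.2, 0.4]],
     [[0.1, 0.5], [0.9, 0.7]]]
y = [2.65, 2.9, 2.9, 2.25]   # generated from a = [1, 2], b = [1, 0.5]
a, b = idls(E, y)
```

    Because the model is bilinear, only the product of the two coefficient sets is identified (a common scale can move between them); each alternating step is a convex least-squares subproblem, so the residual is non-increasing.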

  5. Obesity-related behaviours and BMI in five urban regions across Europe: sampling design and results from the SPOTLIGHT cross-sectional survey

    PubMed Central

    Lakerveld, Jeroen; Ben Rebah, Maher; Mackenbach, Joreintje D; Charreire, Hélène; Compernolle, Sofie; Glonti, Ketevan; Bardos, Helga; Rutter, Harry; De Bourdeaudhuij, Ilse; Brug, Johannes; Oppert, Jean-Michel

    2015-01-01

    Objectives To describe the design, methods and first results of a survey on obesity-related behaviours and body mass index (BMI) in adults living in neighbourhoods from five urban regions across Europe. Design A cross-sectional observational study in the framework of an European Union-funded project on obesogenic environments (SPOTLIGHT). Setting 60 urban neighbourhoods (12 per country) were randomly selected in large urban zones in Belgium, France, Hungary, the Netherlands and the UK, based on high or low values for median household income (socioeconomic status, SES) and residential area density. Participants A total of 6037 adults (mean age 52 years, 56% female) participated in the online survey. Outcome measures Self-reported physical activity, sedentary behaviours, dietary habits and BMI. Other measures included general health; barriers and motivations for a healthy lifestyle, perceived social and physical environmental characteristics; the availability of transport modes and their use to specific destinations; self-defined neighbourhood boundaries and items related to residential selection. Results Across five countries, residents from low-SES neighbourhoods ate less fruit and vegetables, drank more sugary drinks and had a consistently higher BMI. SES differences in sedentary behaviours were observed in France, with residents from higher SES neighbourhoods reporting to sit more. Residents from low-density neighbourhoods were less physically active than those from high-density neighbourhoods; during leisure time and (most pronounced) for transport (except for Belgium). BMI differences by residential density were inconsistent across all countries. Conclusions The SPOTLIGHT survey provides an original approach for investigating relations between environmental characteristics, obesity-related behaviours and obesity in Europe. First descriptive results indicate considerable differences in health behaviours and BMI between countries and neighbourhood types.
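
    The neighbourhood selection described above amounts to a stratified random draw from a 2×2 grid of SES by residential density. An illustrative sketch (the frame, cut-offs and helper names are invented; the study stratified on actual median-income and density values per urban region):

```python
import random

def sample_neighbourhoods(frame, per_cell=3, seed=42):
    """Draw per_cell neighbourhoods from each SES x density stratum."""
    rng = random.Random(seed)
    chosen = []
    for ses in ("low", "high"):
        for density in ("low", "high"):
            cell = [n for n in frame
                    if n["ses"] == ses and n["density"] == density]
            chosen.extend(rng.sample(cell, per_cell))
    return chosen

# A toy frame of 40 candidate neighbourhoods, 10 per stratum.
frame = [{"id": 10 * k + i, "ses": ses, "density": dens}
         for k, (ses, dens) in enumerate(
             (s, d) for s in ("low", "high") for d in ("low", "high"))
         for i in range(10)]
picked = sample_neighbourhoods(frame)
```

    With 3 neighbourhoods per stratum this yields the 12 neighbourhoods per country reported in the survey design.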

  6. Computational vaccinology: quantitative approaches.

    PubMed

    Flower, Darren R; McSparron, Helen; Blythe, Martin J; Zygouri, Christianna; Taylor, Debra; Guan, Pingping; Wan, Shouzhan; Coveney, Peter V; Walshe, Valerie; Borrow, Persephone; Doytchinova, Irini A

    2003-01-01

    The immune system is hierarchical and has many levels, exhibiting much emergent behaviour. However, at its heart are molecular recognition events that are indistinguishable from other types of biomacromolecular interaction. These can be addressed well by quantitative experimental and theoretical biophysical techniques, and particularly by methods from drug design. We review here our approach to computational immunovaccinology. In particular, we describe the JenPep database and two new techniques for T cell epitope prediction. One is based on quantitative structure-activity relationships (a 3D-QSAR method based on CoMSIA and another 2D method based on the Free-Wilson approach) and the other on atomistic molecular dynamic simulations using high performance computing. JenPep (http://www.jenner.ar.uk/JenPep) is a relational database system supporting quantitative data on peptide binding to major histocompatibility complexes, TAP transporters, TCR-pMHC complexes, and an annotated list of B cell and T cell epitopes. Our 2D-QSAR method factors the contribution to peptide binding from individual amino acids as well as 1-2 and 1-3 residue interactions. In the 3D-QSAR approach, the influence of five physicochemical properties (volume, electrostatic potential, hydrophobicity, hydrogen-bond donor and acceptor abilities) on peptide affinity were considered. Both methods are exemplified through their application to the well-studied problem of peptide binding to the human class I MHC molecule HLA-A*0201. PMID:14712934
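
    The Free-Wilson idea behind the 2D method, each amino acid at each position contributing additively to binding, can be sketched crudely (toy data; the paper's version also models 1-2 and 1-3 residue interactions and fits a proper regression rather than the per-position means used here):

```python
from collections import defaultdict

def free_wilson_contributions(peptides, activities):
    """Crude additive decomposition: a residue's contribution at a position
    is the mean activity of peptides carrying it, minus the overall mean."""
    overall = sum(activities) / len(activities)
    sums, counts = defaultdict(float), defaultdict(int)
    for pep, act in zip(peptides, activities):
        for pos, aa in enumerate(pep):
            sums[(pos, aa)] += act
            counts[(pos, aa)] += 1
    return overall, {k: sums[k] / counts[k] - overall for k in sums}

peps = ["AL", "AV", "GL", "GV"]   # toy dipeptides
acts = [7.2, 6.8, 6.0, 5.6]       # invented binding scores
base, contrib = free_wilson_contributions(peps, acts)
```

    Here alanine at position 1 raises predicted binding by 0.6 over the mean while valine at position 2 lowers it by 0.2, illustrating how such tables localize binding determinants.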

  7. Web Survey Design in ASP.Net 2.0: A Simple Task with One Line of Code

    ERIC Educational Resources Information Center

    Liu, Chang

    2007-01-01

    Over the past few years, more and more companies have been investing in electronic commerce (EC) by designing and implementing Web-based applications. In the world of practice, the importance of using Web technology to reach individual customers has been presented by many researchers. This paper presents an easy way of conducting marketing…

  8. Reflective Filters Design for Self-Filtering Narrowband Ultraviolet Imaging Experiment Wide-Field Surveys (NUVIEWS) Project

    NASA Technical Reports Server (NTRS)

    Park, Jung- Ho; Kim, Jongmin; Zukic, Muamer; Torr, Douglas G.

    1994-01-01

    We report the design of multilayer reflective filters for the self-filtering cameras of the NUVIEWS project. Wide angle self-filtering cameras were designed to image the C IV (154.9 nm) line emission, and H2 Lyman band fluorescence (centered at 161 nm) over a 20 deg x 30 deg field of view. A key element of the filter design includes the development of pi-multilayers optimized to provide maximum reflectance at 154.9 nm and 161 nm for the respective cameras without significant spectral sensitivity to the large cone angle of the incident radiation. We applied self-filtering concepts to design NUVIEWS telescope filters that are composed of three reflective mirrors and one folding mirror. The filters, with narrowband widths of 6 and 8 nm at 154.9 and 161 nm, respectively, have net throughputs of more than 50% with average blocking of out-of-band wavelengths better than 3 × 10⁻⁴%.
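
    As a back-of-envelope consistency check, assuming (simplistically) that in-band losses multiply across the four reflective surfaces of the three filter mirrors plus the folding mirror:

```python
# Per-surface efficiency needed for >50% net throughput over 4 reflections.
n_surfaces = 4
net_throughput = 0.50
per_surface = net_throughput ** (1.0 / n_surfaces)
```

    Each coated surface must therefore average roughly 84% in-band efficiency for the quoted net throughput.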

  9. Evaluating quantitative 3-D image analysis as a design tool for low enriched uranium fuel compacts for the transient reactor test facility: A preliminary study

    DOE PAGES

    Kane, J. J.; van Rooyen, I. J.; Craft, A. E.; Roney, T. J.; Morrell, S. R.

    2016-02-05

    In this study, 3-D image analysis, when combined with a non-destructive examination technique such as X-ray computed tomography (CT), provides a highly quantitative tool for the investigation of a material's structure. In this investigation, 3-D image analysis and X-ray CT were combined to analyze the microstructure of a preliminary subsized fuel compact for the Transient Reactor Test Facility's low enriched uranium conversion program, to assess the feasibility of the combined techniques for use in optimizing the fuel compact fabrication process. The quantitative image analysis focused on determining the size and spatial distribution of the surrogate fuel particles and the size, shape, and orientation of voids within the compact. Additionally, the maximum effect of microstructural features on heat transfer through the carbonaceous matrix of the preliminary compact was estimated. The surrogate fuel particles occupied 0.8% of the compact by volume, with a log-normal distribution of particle sizes (mean diameter 39 μm, standard deviation 16 μm). Roughly 39% of the particles had a diameter greater than the specified maximum particle size of 44 μm, suggesting that the particles agglomerate during fabrication. The local volume fraction of particles also varies significantly within the compact, although nonuniformities appear to be evenly dispersed throughout the analysed volume. The voids produced during fabrication were on average plate-like in nature, with their major axis oriented perpendicular to the compaction direction of the compact. Finally, the microstructure, mainly the large preferentially oriented voids, may cause a small degree of anisotropy in the thermal diffusivity within the compact: α∥/α⊥, the ratio of thermal diffusivities parallel and perpendicular to the compaction direction, is expected to be no less than 0.95, with an upper bound of 1.
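
    The reported particle statistics invite a quick plausibility check (not the authors' analysis): for a log-normal distribution with arithmetic mean 39 μm and standard deviation 16 μm, the fraction above the 44 μm specification follows from the log-scale moments.

```python
import math

mean, sd, cutoff = 39.0, 16.0, 44.0
sigma2 = math.log(1.0 + (sd / mean) ** 2)        # log-scale variance
mu = math.log(mean) - sigma2 / 2.0               # log-scale mean
z = (math.log(cutoff) - mu) / math.sqrt(sigma2)
frac_over = 0.5 * math.erfc(z / math.sqrt(2.0))  # P(diameter > cutoff)
```

    This predicts roughly 31% oversize, noticeably below the observed 39%, which is consistent with the agglomeration the authors suspect.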

  10. Stream chemistry in the eastern United States. 1. Synoptic survey design, acid-base status, and regional patterns

    SciTech Connect

    Kaufmann, P.R.; Herlihy, A.T.; Mitch, M.E.; Messer, J.J. ); Overton, W.S. )

    1991-04-01

    To assess the regional acid-base status of streams in the mid-Atlantic and southern US, spring base flow chemistry was surveyed in a probability sample of 500 stream reaches representing a population of 64,300 reaches (224,000 km). Approximately half of the streams had acid-neutralizing capacity (ANC) ≤ 200 μeq L⁻¹. Acidic (ANC ≤ 0) streams were located in the highlands of the Mid-Atlantic region (southern New York to southern Virginia, 2,330 km), in coastal lowlands of the Mid-Atlantic (2,600 km), and in Florida (462 km). Acidic streams were rare (less than 1%) in the highlands of the Southeast. Inorganic monomeric aluminum (Alim) concentrations were highest in acidic streams of the Mid-Atlantic Highlands, where over 70% of the acidic streams had Alim greater than 100 μg L⁻¹, a concentration above which deleterious biological effects have frequently been reported. Dissolved organic carbon concentrations were much higher in lowland coastal streams than in inland streams. The authors' data support the hypothesis that atmospheric sources and watershed retention control regional patterns in streamwater sulfate concentrations. Most stream watersheds retain the vast majority of the total nitrogen loading from wet deposition. The data suggest, however, that some deposition nitrogen may be reaching streams in the Northern Appalachians.
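
    The design-based extrapolation from 500 sampled reaches to the 64,300-reach population can be sketched with a single expansion weight (illustrative only; the actual survey used reach-specific inclusion probabilities):

```python
FRAME_SIZE, SAMPLE_SIZE = 64_300, 500
WEIGHT = FRAME_SIZE / SAMPLE_SIZE    # each sampled reach stands for ~129

def estimate_acidic_reaches(sample_anc, threshold=0.0):
    """Horvitz-Thompson-style count of reaches with ANC <= threshold."""
    return sum(anc <= threshold for anc in sample_anc) * WEIGHT

anc_ueq_per_l = [150.0, -12.0, 420.0, 3.0, -5.0, 800.0]   # invented values
est = estimate_acidic_reaches(anc_ueq_per_l)
```

    Two acidic reaches in this toy sample would expand to an estimated ~257 acidic reaches in the population, which is how sample counts become the regional extents (in km or reach counts) quoted above.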

  11. Research Design and Statistical Design.

    ERIC Educational Resources Information Center

    Szymanski, Edna Mora

    1993-01-01

    Presents fourth editorial in series, this one describing research design and explaining its relationship to statistical design. Research design, validity, and research approaches are examined, quantitative research designs and hypothesis testing are described, and control and statistical designs are discussed. Concludes with section on the art of…

  12. City Governments and Aging in Place: Community Design, Transportation and Housing Innovation Adoption

    ERIC Educational Resources Information Center

    Lehning, Amanda J.

    2012-01-01

    Purpose of the study: To examine the characteristics associated with city government adoption of community design, housing, and transportation innovations that could benefit older adults. Design and methods: A mixed-methods study with quantitative data collected via online surveys from 62 city planners combined with qualitative data collected via…

  13. The path of placement of a removable partial denture: a microscope based approach to survey and design.

    PubMed

    Mamoun, John Sami

    2015-02-01

    This article reviews how to identify and develop a removable partial denture (RPD) path of placement, and provides a literature review of the concept of the RPD path of placement, also known as the path of insertion. An optimal RPD path of placement, guided by mutually parallel guide planes, ensures that the RPD flanges fit intimately over edentulous ridge structures and that the framework fits intimately with guide plane surfaces, which prevents food-collecting empty spaces between the intaglio surface of the framework and intraoral surfaces, and ensures that RPD clasps engage adequate numbers of tooth undercuts for RPD retention. The article covers topics such as the causes of obstructions to intra-oral seating of the RPD, the causes of food-collecting empty spaces that may exist around an RPD, and how to identify whether a guide plane is parallel with the projected RPD path of placement. The article presents a method of using a surgical operating microscope, or high-magnification (6-8x or greater) binocular surgical loupe telescopes, combined with co-axial illumination, to identify a preliminary path of placement for an arch. This preliminary path of placement concept may help guide a dentist or a dental laboratory technician when surveying a master cast of the arch to develop an RPD path of placement, or in verifying that intra-oral contouring has aligned tooth surfaces optimally with the RPD path of placement. In dentistry, a well-fitting RPD reduces long-term periodontal or structural damage to abutment teeth.

  14. The path of placement of a removable partial denture: a microscope based approach to survey and design.

    PubMed

    Mamoun, John Sami

    2015-02-01

    This article reviews how to identify and develop a removable partial denture (RPD) path of placement, and provides a literature review of the concept of the RPD path of placement, also known as the path of insertion. An optimal RPD path of placement, guided by mutually parallel guide planes, ensures that the RPD flanges fit intimately over edentulous ridge structures and that the framework fits intimately with guide plane surfaces, which prevents food-collecting empty spaces between the intaglio surface of the framework and intraoral surfaces, and ensures that RPD clasps engage adequate numbers of tooth undercuts for RPD retention. The article covers topics such as the causes of obstructions to intra-oral seating of the RPD, the causes of food-collecting empty spaces that may exist around an RPD, and how to identify whether a guide plane is parallel with the projected RPD path of placement. The article presents a method of using a surgical operating microscope, or high-magnification (6-8x or greater) binocular surgical loupe telescopes, combined with co-axial illumination, to identify a preliminary path of placement for an arch. This preliminary path of placement concept may help guide a dentist or a dental laboratory technician when surveying a master cast of the arch to develop an RPD path of placement, or in verifying that intra-oral contouring has aligned tooth surfaces optimally with the RPD path of placement. In dentistry, a well-fitting RPD reduces long-term periodontal or structural damage to abutment teeth. PMID:25722842

  15. Study Quality in SLA: A Cumulative and Developmental Assessment of Designs, Analyses, Reporting Practices, and Outcomes in Quantitative L2 Research

    ERIC Educational Resources Information Center

    Plonsky, Luke

    2011-01-01

    I began this study with two assumptions. Assumption 1: Study quality matters. If the means by which researchers design, carry out, and report on their studies lack in rigor or transparency, theory and practice are likely to be misguided or at least decelerated. Assumption 2 is an implication of Assumption 1: Quality should be measured rather than…

  16. Quantitative impurity analysis of monoclonal antibody size heterogeneity by CE-LIF: example of development and validation through a quality-by-design framework.

    PubMed

    Michels, David A; Parker, Monica; Salas-Solano, Oscar

    2012-03-01

    This paper describes the framework of quality by design applied to the development, optimization and validation of a sensitive capillary electrophoresis-sodium dodecyl sulfate (CE-SDS) assay for monitoring impurities that potentially impact drug efficacy or patient safety produced in the manufacture of therapeutic MAb products. Drug substance or drug product samples are derivatized with fluorogenic 3-(2-furoyl)quinoline-2-carboxaldehyde and nucleophilic cyanide before separation by CE-SDS coupled to LIF detection. Three design-of-experiments enabled critical labeling parameters to meet method requirements for detecting minor impurities while building precision and robustness into the assay during development. The screening design predicted optimal conditions to control labeling artifacts while two full factorial designs demonstrated method robustness through control of temperature and cyanide parameters within the normal operating range. Subsequent validation according to the guidelines of the International Committee of Harmonization showed the CE-SDS/LIF assay was specific, accurate, and precise (RSD ≤ 0.8%) for relative peak distribution and linear (R > 0.997) between the range of 0.5-1.5 mg/mL with LOD and LOQ of 10 ng/mL and 35 ng/mL, respectively. Validation confirmed the system suitability criteria used as a level of control to ensure reliable method performance.
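
    A two-level full factorial design of the kind used for the robustness work can be generated generically. The factor names and levels below are invented for illustration, not the assay's validated ranges:

```python
from itertools import product

def full_factorial(levels):
    """Enumerate every combination of factor levels as a run dictionary."""
    names = list(levels)
    return [dict(zip(names, combo)) for combo in product(*levels.values())]

runs = full_factorial({"temp_C": [20, 30], "cyanide_mM": [5, 10],
                       "label_time_min": [30, 60]})
```

    Three two-level factors yield 2³ = 8 runs, enough to estimate all main effects and interactions within the normal operating range.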

  17. Measuring health literacy in populations: illuminating the design and development process of the European Health Literacy Survey Questionnaire (HLS-EU-Q)

    PubMed Central

    2013-01-01

Background Several measurement tools have been developed to measure health literacy. The tools vary in their approach and design, but few have focused on comprehensive health literacy in populations. This paper describes the design and development of the European Health Literacy Survey Questionnaire (HLS-EU-Q), an innovative, comprehensive tool to measure health literacy in populations. Methods Based on a conceptual model and definition, the process involved item development, pre-testing, field-testing, external consultation, a plain-language check, and translation from English to Bulgarian, Dutch, German, Greek, Polish, and Spanish. Results The development process resulted in the HLS-EU-Q, which comprised two sections: a core health literacy section and a section on determinants and outcomes associated with health literacy. The health literacy section included 47 items addressing self-reported difficulties in accessing, understanding, appraising, and applying information in tasks concerning decision making in healthcare, disease prevention, and health promotion. The second section included items related to health behaviour, health status, health service use, community participation, and socio-demographic and socio-economic factors. Conclusions By illuminating the detailed steps in the design and development process of the HLS-EU-Q, the aim is to provide a deeper understanding of its purpose, its capabilities, and its limitations for others using the tool. The vision is that, through wide application, the HLS-EU-Q will be validated in more countries to enhance the understanding of health literacy in different populations. PMID:24112855

  18. Land 3D-seismic data: Preprocessing quality control utilizing survey design specifications, noise properties, normal moveout, first breaks, and offset

    USGS Publications Warehouse

    Raef, A.

    2009-01-01

The recent proliferation of the 3D reflection seismic method into the near-surface area of geophysical applications, especially in response to the emergence of the need to comprehensively characterize and monitor near-surface carbon dioxide sequestration in shallow saline aquifers around the world, justifies the emphasis on cost-effective and robust quality control and assurance (QC/QA) workflow of 3D seismic data preprocessing that is suitable for near-surface applications. The main purpose of our seismic data preprocessing QC is to enable the use of appropriate header information, data that are free of noise-dominated traces, and/or flawed vertical stacking in subsequent processing steps. In this article, I provide an account of utilizing survey design specifications, noise properties, first breaks, and normal moveout for rapid and thorough graphical QC/QA diagnostics, which are easy to apply and efficient in the diagnosis of inconsistencies. A correlated vibroseis time-lapse 3D-seismic data set from a CO2-flood monitoring survey is used for demonstrating QC diagnostics. An important by-product of the QC workflow is establishing the number of layers for a refraction statics model in a data-driven graphical manner that capitalizes on the spatial coverage of the 3D seismic data. © China University of Geosciences (Wuhan) and Springer-Verlag GmbH 2009.

  19. Quantitative Research in Chemical Education.

    ERIC Educational Resources Information Center

    Nurrenbern, Susan C.; Robinson, William R.

    1994-01-01

    Provides an overview of the area of quantitative research in chemical education, which involves the same components that comprise chemical research: (1) a question or hypothesis; (2) research design; (3) data collection and analysis; and (4) interpretation of results. Includes questions of interest to chemical educators; areas of quantitative…

  20. Hydrophilic interaction liquid chromatography-tandem mass spectrometry quantitative method for the cellular analysis of varying structures of gemini surfactants designed as nanomaterial drug carriers.

    PubMed

    Donkuru, McDonald; Michel, Deborah; Awad, Hanan; Katselis, George; El-Aneed, Anas

    2016-05-13

Diquaternary gemini surfactants have successfully been used to form lipid-based nanoparticles that are able to compact, protect, and deliver genetic materials into cells. However, what happens to the gemini surfactants after they have released their therapeutic cargo is unknown. Such knowledge is critical to assess the quality, safety, and efficacy of gemini surfactant nanoparticles. We have developed a simple and rapid liquid chromatography electrospray ionization-tandem mass spectrometry (LC-ESI-MS/MS) method for the quantitative determination of various structures of gemini surfactants in cells. Hydrophilic interaction liquid chromatography (HILIC) was employed, allowing for a short, simple isocratic run of only 4 min. The lower limit of detection (LLOD) was 3 ng/mL. The method was applicable to 18 structures of gemini surfactants belonging to two different structural families. A full method validation was performed for two lead compounds according to USFDA guidelines. The HILIC-MS/MS method was compatible with the physicochemical properties of gemini surfactants, which bear a permanent positive charge with both hydrophilic and hydrophobic elements within their molecular structure. In addition, an effective liquid-liquid extraction method (98% recovery) was employed, surpassing previously used extraction methods. The analysis of nanoparticle-treated cells showed an initial rise of the analyte intracellular concentration to a maximum, followed by a more gradual decrease. The observed intracellular depletion of the gemini surfactants may be attributable to their biotransformation into metabolites and exocytosis from the host cells. The cellular data showed a pattern that warrants additional investigation, evaluating metabolite formation and assessing the subcellular distribution of the tested compounds.

  1. Quantitative Reasoning in Environmental Science: A Learning Progression

    ERIC Educational Resources Information Center

    Mayes, Robert Lee; Forrester, Jennifer Harris; Christus, Jennifer Schuttlefield; Peterson, Franziska Isabel; Bonilla, Rachel; Yestness, Nissa

    2014-01-01

    The ability of middle and high school students to reason quantitatively within the context of environmental science was investigated. A quantitative reasoning (QR) learning progression was created with three progress variables: quantification act, quantitative interpretation, and quantitative modeling. An iterative research design was used as it…

  2. Design

    ERIC Educational Resources Information Center

    Buchanan, Richard; Cross, Nigel; Durling, David; Nelson, Harold; Owen, Charles; Valtonen, Anna; Boling, Elizabeth; Gibbons, Andrew; Visscher-Voerman, Irene

    2013-01-01

    Scholars representing the field of design were asked to identify what they considered to be the most exciting and imaginative work currently being done in their field, as well as how that work might change our understanding. The scholars included Richard Buchanan, Nigel Cross, David Durling, Harold Nelson, Charles Owen, and Anna Valtonen. Scholars…

  3. DRAFT - Design of Radiological Survey and Sampling to Support Title Transfer or Lease of Property on the Department of Energy Oak Ridge Reservation

    SciTech Connect

    Cusick L.T.

    2002-09-25

After sampling and laboratory analyses are completed, the data are analyzed and included in an Environmental Baseline Summary (EBS) report for title transfer or in a Baseline Environmental Analysis Report (BEAR) for lease. The data from the BEAR are then used in a Screening-Level Human Health Risk Assessment (SHHRA) or a risk calculation (RC) to assess the potential risks to future owners/occupants. If title is to be transferred, release criteria in the form of specific activity concentrations called Derived Concentration Guideline Levels (DCGLs) will be developed for each property. The DCGLs are based on the risk model and are used with the data in the EBS to determine, with statistical confidence, that the release criteria for the property have been met. The goal of the survey and sampling efforts is to (1) document the baseline conditions of the property (real or personal) prior to title transfer or lease, (2) obtain enough information that an evaluation of radiological risks can be made, and (3) collect sufficient data so that areas that contain minimal residual levels of radioactivity can be identified and, following radiological control procedures, be released from radiological control. (It should be noted that release from radiological control does not necessarily mean free release, because DOE may maintain institutional control of the site after it is released from radiological control.) To meet the goals of this document, a Data Quality Objective (DQO) process will be used to enhance data collection efficiency and assist with decision making. The steps of the DQO process involve stating the problem, identifying the decision, identifying inputs to the decision, developing study boundaries, developing the decision rule, and optimizing the design. This document describes the DQOs chosen for the surveys and sampling efforts performed for the purposes listed above. The previous version of this document focused on the requirements for radiological survey and sampling protocols.

  4. Materials design for new superconductors

    NASA Astrophysics Data System (ADS)

    Norman, M. R.

    2016-07-01

    Since the announcement in 2011 of the Materials Genome Initiative by the Obama administration, much attention has been given to the subject of materials design to accelerate the discovery of new materials that could have technological implications. Although having its biggest impact for more applied materials like batteries, there is increasing interest in applying these ideas to predict new superconductors. This is obviously a challenge, given that superconductivity is a many body phenomenon, with whole classes of known superconductors lacking a quantitative theory. Given this caveat, various efforts to formulate materials design principles for superconductors are reviewed here, with a focus on surveying the periodic table in an attempt to identify cuprate analogues.

  5. Optimization of parameters for the quantitative surface-enhanced Raman scattering detection of mephedrone using a fractional factorial design and a portable Raman spectrometer.

    PubMed

    Mabbott, Samuel; Correa, Elon; Cowcher, David P; Allwood, J William; Goodacre, Royston

    2013-01-15

A new optimization strategy for the SERS detection of mephedrone using a portable Raman system has been developed. A fractional factorial design was employed, and the number of statistically significant experiments (288) was greatly reduced from the actual total number of experiments (1722), which minimized the workload while maintaining the statistical integrity of the results. A number of conditions were explored in relation to mephedrone SERS signal optimization including the type of nanoparticle, pH, and aggregating agents (salts). Through exercising this design, it was possible to derive the significance of each of the individual variables, and we discovered four optimized SERS protocols for which the reproducibility of the SERS signal and the limit of detection (LOD) of mephedrone were established. Using traditional nanoparticles with a combination of salts and pHs, it was shown that the relative standard deviations of mephedrone-specific Raman peaks were as low as 0.51%, and the LOD was estimated to be around 1.6 μg/mL (9.06 × 10⁻⁶ M), a detection limit well beyond the scope of conventional Raman and extremely low for an analytical method optimized for quick and uncomplicated in-field use.
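The run-reduction idea behind a fractional factorial design can be sketched in a few lines: confound one factor with a high-order interaction so that only half (or fewer) of the full factorial's runs are needed. This hypothetical 2^(4-1) example with generator D = ABC is illustrative only, not the authors' actual 288-run design:

```python
from itertools import product

def fractional_factorial():
    """Half-fraction of a 2^4 design: A, B, C vary freely; D is set by D = A*B*C."""
    runs = []
    for a, b, c in product((-1, 1), repeat=3):
        d = a * b * c  # defining relation I = ABCD
        runs.append((a, b, c, d))
    return runs

runs = fractional_factorial()
print(len(runs))  # 8 runs instead of the 16 of the full 2^4 design
```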

  6. Sensitive quantitation of polyamines in plant foods by ultrasound-assisted benzoylation and dispersive liquid-liquid microextraction with the aid of experimental designs.

    PubMed

    Pinto, Edgar; Melo, Armindo; Ferreira, Isabel M P L V O

    2014-05-14

    A new method involving ultrasound-assisted benzoylation and dispersive liquid-liquid microextraction was optimized with the aid of chemometrics for the extraction, cleanup, and determination of polyamines in plant foods. Putrescine, cadaverine, spermidine, and spermine were derivatized with 3,5-dinitrobenzoyl chloride and extracted by dispersive liquid-liquid microextraction using acetonitrile and carbon tetrachloride as dispersive and extraction solvents, respectively. Two-level full factorial design and central composite design were applied to select the most appropriate derivatization and extraction conditions. The developed method was linear in the 0.5-10.0 mg/L range, with a R(2) ≥ 0.9989. Intra- and interday precisions ranged from 0.8 to 6.9% and from 3.0 to 10.3%, respectively, and the limit of detection ranged between 0.018 and 0.042 μg/g of fresh weight. This method was applied to the analyses of six different types of plant foods, presenting recoveries between 81.7 and 114.2%. The method is inexpensive, versatile, simple, and sensitive.

  7. Sport Management Survey. Employment Perspectives.

    ERIC Educational Resources Information Center

    Quain, Richard J.; Parks, Janet B.

    1986-01-01

    A survey of sport management positions was designed to determine projected vacancy rates in six sport management career areas. Respondents to the survey were also questioned regarding their awareness of college professional preparation programs. Results are presented. (MT)

  8. Streamlining volcano-related, web-based data display and design with a new U.S. Geological Survey Volcano Science Center website

    NASA Astrophysics Data System (ADS)

    Stovall, W. K.; Randall, M. J.; Cervelli, P. F.

    2011-12-01

The goal of the newly designed U.S. Geological Survey (USGS) Volcano Science Center website is to provide a reliable, easy to understand, and accessible format to display volcano monitoring data and scientific information on US volcanoes and their hazards. There are more than 150 active or potentially active volcanoes in the United States, and the Volcano Science Center aims to advance the scientific understanding of volcanic processes at these volcanoes and to lessen the harmful impacts of potential volcanic activity. To fulfill a Congressional mandate, the USGS Volcano Hazards Program must communicate scientific findings to authorities and the public in a timely and understandable form. The easiest and most efficient way to deliver this information is via the Internet. We implemented a new database model to organize website content, ensuring consistency, accuracy, and timeliness of information display. Real-time monitoring data are available for over 50 volcanoes in the United States, and website visitors are able to interact with a dynamic, map-based display system to access and analyze these data, which are managed by scientists from the five USGS volcano observatories. Helicorders, recent hypocenters, webcams, tilt measurements, deformation, gas emissions, and changes in hydrology can be viewed for any of the real-time instruments. The newly designed Volcano Science Center web presence streamlines the display of research findings, hazard assessments, and real-time monitoring data for the U.S. volcanoes.

  9. Bright Galaxies at Hubble’s Redshift Detection Frontier: Preliminary Results and Design from the Redshift z ~ 9-10 BoRG Pure-Parallel HST Survey

    NASA Astrophysics Data System (ADS)

    Calvi, V.; Trenti, M.; Stiavelli, M.; Oesch, P.; Bradley, L. D.; Schmidt, K. B.; Coe, D.; Brammer, G.; Bernard, S.; Bouwens, R. J.; Carrasco, D.; Carollo, C. M.; Holwerda, B. W.; MacKenty, J. W.; Mason, C. A.; Shull, J. M.; Treu, T.

    2016-02-01

We present the first results and design from the redshift z ~ 9-10 Brightest of the Reionizing Galaxies Hubble Space Telescope survey BoRG[z9-10], aimed at searching for intrinsically luminous unlensed galaxies during the first 700 Myr after the Big Bang. BoRG[z9-10] is the continuation of a multi-year pure-parallel near-IR and optical imaging campaign with the Wide Field Camera 3. The ongoing survey uses five filters, optimized for detecting the most distant objects and offering continuous wavelength coverage from λ = 0.35 μm to λ = 1.7 μm. We analyze the initial ~130 arcmin² of area over 28 independent lines of sight (~25% of the total planned) to search for z > 7 galaxies using a combination of Lyman-break and photometric redshift selections. From an effective comoving volume of (5-25) × 10⁵ Mpc³ for magnitudes brighter than m_AB = 26.5-24.0 in the H160 band, respectively, we find five galaxy candidates at z ~ 8.3-10 detected at high confidence (S/N > 8), including a source at z ~ 8.4 with m_AB = 24.5 (S/N ~ 22), which, if confirmed, would be the brightest galaxy identified at such early times (z > 8). In addition, BoRG[z9-10] data yield four galaxies with 7.3 ≲ z ≲ 8. These new Lyman-break galaxies with m ≲ 26.5 are ideal targets for follow-up observations from ground- and space-based observatories to help investigate the complex interplay between dark matter growth, galaxy assembly, and reionization.

  10. Doing Quantitative Research in Education with SPSS

    ERIC Educational Resources Information Center

    Muijs, Daniel

    2004-01-01

    This book looks at quantitative research methods in education. The book is structured to start with chapters on conceptual issues and designing quantitative research studies before going on to data analysis. While each chapter can be studied separately, a better understanding will be reached by reading the book sequentially. This book is intended…

  11. Design of multiplex calibrant plasmids, their use in GMO detection and the limit of their applicability for quantitative purposes owing to competition effects.

    PubMed

    Debode, Frédéric; Marien, Aline; Janssen, Eric; Berben, Gilbert

    2010-03-01

Five double-target multiplex plasmids to be used as calibrants for GMO quantification were constructed. They were composed of two modified targets associated in tandem in the same plasmid: (1) a part of the soybean lectin gene and (2) a part of the transgenic construction of the GTS40-3-2 event. Modifications were performed in such a way that each target could be amplified with the same primers as those for the original target from which they were derived but such that each was specifically detected with an appropriate probe. Sequence modifications were done to keep the parameters of the new target as similar as possible to those of its original sequence. The plasmids were designed to be used either in separate reactions or in multiplex reactions. Evidence is given that with each of the five different plasmids used in separate wells as a calibrant for a different copy number, a calibration curve can be built. When the targets were amplified together (in multiplex) and at different concentrations inside the same well, the calibration curves showed that there was a competition effect between the targets, which limits the calibration range of copy numbers to a maximum of 2 orders of magnitude. Another possible application of multiplex plasmids is discussed.
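A calibration curve of the kind built from such plasmid calibrants is typically a linear fit of quantification cycle (Ct) against log copy number, from which the PCR amplification efficiency follows as E = 10^(-1/slope) - 1. A sketch with hypothetical Ct values (not the paper's data):

```python
import math

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

# Hypothetical Ct values measured for calibrants at known copy numbers
copies = [10, 100, 1000, 10000, 100000]
ct = [33.3, 30.0, 26.7, 23.4, 20.1]

slope, intercept = fit_line([math.log10(c) for c in copies], ct)
efficiency = 10 ** (-1 / slope) - 1  # ~1.0 means the target doubles every cycle
print(round(slope, 2), round(efficiency, 2))
```

A slope near -3.32 corresponds to ~100% efficiency; competition between multiplexed targets would show up as curves deviating from this ideal at the extremes of the copy-number range.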

  13. Quantitative and qualitative optimization of allergen extraction from peanut and selected tree nuts. Part 2. Optimization of buffer and ionic strength using a full factorial experimental design.

    PubMed

    L'Hocine, Lamia; Pitre, Mélanie

    2016-03-01

    A full factorial design was used to assess the single and interactive effects of three non-denaturing aqueous (phosphate, borate, and carbonate) buffers at various ionic strengths (I) on allergen extractability from and immunoglobulin E (IgE) immunoreactivity of peanut, almond, hazelnut, and pistachio. The results indicated that the type and ionic strength of the buffer had different effects on protein recovery from the nuts under study. Substantial differences in protein profiles, abundance, and IgE-binding intensity with different combinations of pH and ionic strength were found. A significant interaction between pH and ionic strength was observed for pistachio and almond. The optimal buffer system conditions, which maximized the IgE-binding efficiency of allergens and provided satisfactory to superior protein recovery yield and profiles, were carbonate buffer at an ionic strength of I=0.075 for peanut, carbonate buffer at I=0.15 for almond, phosphate buffer at I=0.5 for hazelnut, and borate at I=0.15 for pistachio. The buffer type and its ionic strength could be manipulated to achieve the selective solubility of desired allergens.
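Main and interaction effects in a two-level full factorial like this one are simple contrasts of cell means: a main effect is the mean response at the high level minus the mean at the low level, and the interaction measures how the effect of one factor changes with the other. A sketch with hypothetical recovery values (illustrative, not the study's measurements):

```python
# Hypothetical 2x2 full factorial on pH (factor 0) and ionic strength (factor 1);
# values are protein recoveries (%) at each coded factor combination.
recovery = {(-1, -1): 45.0, (-1, 1): 60.0, (1, -1): 55.0, (1, 1): 90.0}

def effect(factor_index):
    """Main effect: mean response at the high level minus mean at the low level."""
    high = [r for levels, r in recovery.items() if levels[factor_index] == 1]
    low = [r for levels, r in recovery.items() if levels[factor_index] == -1]
    return sum(high) / len(high) - sum(low) / len(low)

def interaction():
    """AB interaction: half the difference between the effect of A at high B and at low B."""
    return ((recovery[(1, 1)] - recovery[(-1, 1)])
            - (recovery[(1, -1)] - recovery[(-1, -1)])) / 2

print(effect(0), effect(1), interaction())  # a nonzero interaction, as found for pistachio and almond
```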

  14. Quantitative and qualitative optimization of allergen extraction from peanut and selected tree nuts. Part 1. Screening of optimal extraction conditions using a D-optimal experimental design.

    PubMed

    L'Hocine, Lamia; Pitre, Mélanie

    2016-03-01

A D-optimal design was constructed to optimize allergen extraction efficiency simultaneously from roasted, non-roasted, defatted, and non-defatted almond, hazelnut, peanut, and pistachio flours using three non-denaturing aqueous (phosphate, borate, and carbonate) buffers at various conditions of ionic strength, buffer-to-protein ratio, extraction temperature, and extraction duration. Statistical analysis showed that roasting and non-defatting significantly lowered protein recovery for all nuts. Increasing the temperature and the buffer-to-protein ratio during extraction significantly increased protein recovery, whereas increasing the extraction time had no significant impact. The impact of the three buffers on protein recovery varied significantly among the nuts. Depending on the extraction conditions, protein recovery varied from 19% to 95% for peanut, 31% to 73% for almond, 17% to 64% for pistachio, and 27% to 88% for hazelnut. A modulation of the protein and immunoglobulin E binding profiles of the extracts by the buffer type and ionic strength was evidenced, where high protein recovery levels did not always correlate with high immunoreactivity.

  17. Effectiveness of Facebook Based Learning to Enhance Creativity among Islamic Studies Students by Employing Isman Instructional Design Model

    ERIC Educational Resources Information Center

    Alias, Norlidah; Siraj, Saedah; Daud, Mohd Khairul Azman Md; Hussin, Zaharah

    2013-01-01

    The study examines the effectiveness of Facebook based learning to enhance creativity among Islamic Studies students in the secondary educational setting in Malaysia. It describes the design process by employing the Isman Instructional Design Model. A quantitative study was carried out using experimental method and background survey. The…

  18. Toward a Theoretical Model of Decision-Making and Resistance to Change among Higher Education Online Course Designers

    ERIC Educational Resources Information Center

    Dodd, Bucky J.

    2013-01-01

    Online course design is an emerging practice in higher education, yet few theoretical models currently exist to explain or predict how the diffusion of innovations occurs in this space. This study used a descriptive, quantitative survey research design to examine theoretical relationships between decision-making style and resistance to change…

  19. Identification of Polyphosphate-Accumulating Organisms and Design of 16S rRNA-Directed Probes for Their Detection and Quantitation

    PubMed Central

    Crocetti, Gregory R.; Hugenholtz, Philip; Bond, Philip L.; Schuler, Andrew; Keller, Jürg; Jenkins, David; Blackall, Linda L.

    2000-01-01

Laboratory-scale sequencing batch reactors (SBRs) as models for activated sludge processes were used to study enhanced biological phosphorus removal (EBPR) from wastewater. Enrichment for polyphosphate-accumulating organisms (PAOs) was achieved essentially by increasing the phosphorus concentration in the influent to the SBRs. Fluorescence in situ hybridization (FISH) using domain-, division-, and subdivision-level probes was used to assess the proportions of microorganisms in the sludges. The A sludge, a high-performance P-removing sludge containing 15.1% P in the biomass, comprised large clusters of polyphosphate-containing coccobacilli. By FISH, >80% of the A sludge bacteria were β-2 Proteobacteria arranged in clusters of coccobacilli, strongly suggesting that this group contains a PAO responsible for EBPR. The second dominant group in the A sludge was the Actinobacteria. Clone libraries of PCR-amplified bacterial 16S rRNA genes from three high-performance P-removing sludges were prepared, and clones belonging to the β-2 Proteobacteria were fully sequenced. A distinctive group of clones (sharing ≥98% sequence identity) related to Rhodocyclus spp. (94 to 97% identity) and Propionibacter pelophilus (95 to 96% identity) was identified as the most likely candidate PAOs. Three probes specific for the highly related candidate PAO group were designed from the sequence data. All three probes specifically bound to the morphologically distinctive clusters of PAOs in the A sludge, exactly coinciding with the β-2 Proteobacteria probe. Sequential FISH and polyphosphate staining of EBPR sludges clearly demonstrated that PAO probe-binding cells contained polyphosphate. Subsequent PAO probe analyses of a number of sludges with various P removal capacities indicated a strong positive correlation between P removal from the wastewater as determined by sludge P content and number of PAO probe-binding cells. We conclude therefore that an important group of PAOs in EBPR
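The reported positive correlation between sludge P content and the number of PAO probe-binding cells is the kind of relationship a Pearson coefficient quantifies. A sketch with hypothetical paired values (not the study's data):

```python
import math

# Hypothetical paired measurements: sludge P content (% of biomass) and the
# fraction of cells binding the PAO probes, mimicking the trend reported above.
p_content = [2.1, 4.5, 7.0, 9.8, 12.3, 15.1]
pao_fraction = [0.05, 0.18, 0.32, 0.48, 0.61, 0.80]

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print(round(pearson_r(p_content, pao_fraction), 3))  # close to 1 for a strong positive correlation
```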

  20. Using design of experiments to optimize derivatization with methyl chloroformate for quantitative analysis of the aqueous phase from hydrothermal liquefaction of biomass.

    PubMed

    Madsen, René Bjerregaard; Jensen, Mads Mørk; Mørup, Anders Juul; Houlberg, Kasper; Christensen, Per Sigaard; Klemmer, Maika; Becker, Jacob; Iversen, Bo Brummerstedt; Glasius, Marianne

    2016-03-01

Hydrothermal liquefaction is a promising technique for the production of bio-oil. The process produces an oil phase, a gas phase, a solid residue, and an aqueous phase. Gas chromatography coupled with mass spectrometry is used to analyze the complex aqueous phase. Especially small organic acids and nitrogen-containing compounds are of interest. The efficient derivatization reagent methyl chloroformate was used to make analysis of the complex aqueous phase from hydrothermal liquefaction of dried distillers grains with solubles possible. A circumscribed central composite design was used to optimize the responses of both derivatized and nonderivatized analytes, which included small organic acids, pyrazines, phenol, and cyclic ketones. Response surface methodology was used to visualize significant factors and identify optimized derivatization conditions (volumes of methyl chloroformate, NaOH solution, methanol, and pyridine). Twenty-nine analytes of small organic acids, pyrazines, phenol, and cyclic ketones were quantified. An additional three analytes were pseudoquantified with use of standards with similar mass spectra. Calibration curves with high correlation coefficients were obtained, in most cases R² > 0.991. Method validation was evaluated with repeatability, and spike recoveries of all 29 analytes were obtained. The 32 analytes were quantified in samples from the commissioning of a continuous flow reactor and in samples from recirculation experiments involving the aqueous phase. The results indicated when the steady-state condition of the flow reactor was obtained and the effects of recirculation. The validated method will be especially useful for investigations of the effect of small organic acids on the hydrothermal liquefaction process. PMID:26804738
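A circumscribed central composite design of the kind used here combines 2^k factorial points, 2k axial points at ±α outside the factorial cube, and center points, giving enough levels to fit the quadratic response-surface model. A sketch that generates the coded design matrix for four factors (the factor count matches the four derivatization volumes above, but the code itself is illustrative):

```python
from itertools import product

def ccc_design(k):
    """Circumscribed central composite design for k factors:
    2^k factorial points, 2k axial points at +/-alpha, and one center point."""
    alpha = (2 ** k) ** 0.25  # rotatable axial distance
    factorial = [list(p) for p in product((-1.0, 1.0), repeat=k)]
    axial = []
    for i in range(k):
        for sign in (-alpha, alpha):
            point = [0.0] * k
            point[i] = sign
            axial.append(point)
    return factorial + axial + [[0.0] * k]

design = ccc_design(4)
print(len(design))  # 16 factorial + 8 axial + 1 center = 25 runs
```

In practice, replicate center points would be added to estimate pure error, and the coded levels would be mapped back to actual reagent volumes.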

  1. Using design of experiments to optimize derivatization with methyl chloroformate for quantitative analysis of the aqueous phase from hydrothermal liquefaction of biomass.

    PubMed

    Madsen, René Bjerregaard; Jensen, Mads Mørk; Mørup, Anders Juul; Houlberg, Kasper; Christensen, Per Sigaard; Klemmer, Maika; Becker, Jacob; Iversen, Bo Brummerstedt; Glasius, Marianne

    2016-03-01

    Hydrothermal liquefaction is a promising technique for the production of bio-oil. The process produces an oil phase, a gas phase, a solid residue, and an aqueous phase. Gas chromatography coupled with mass spectrometry is used to analyze the complex aqueous phase; small organic acids and nitrogen-containing compounds are of particular interest. The efficient derivatization reagent methyl chloroformate was used to enable analysis of the complex aqueous phase from hydrothermal liquefaction of dried distillers grains with solubles. A circumscribed central composite design was used to optimize the responses of both derivatized and nonderivatized analytes, which included small organic acids, pyrazines, phenol, and cyclic ketones. Response surface methodology was used to visualize significant factors and identify optimized derivatization conditions (volumes of methyl chloroformate, NaOH solution, methanol, and pyridine). Twenty-nine analytes of small organic acids, pyrazines, phenol, and cyclic ketones were quantified. An additional three analytes were pseudoquantified using standards with similar mass spectra. Calibration curves with high correlation coefficients were obtained, in most cases R² > 0.991. Method validation was evaluated with repeatability, and spike recoveries were obtained for all 29 analytes. The 32 analytes were quantified in samples from the commissioning of a continuous flow reactor and in samples from recirculation experiments involving the aqueous phase. The results indicated when the flow reactor reached steady state and showed the effects of recirculation. The validated method will be especially useful for investigations of the effect of small organic acids on the hydrothermal liquefaction process.
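
The central-composite-design and response-surface steps described above can be sketched numerically. The two coded factors and the response function below are illustrative stand-ins, not the paper's actual reagent volumes or chromatographic responses:

```python
import numpy as np

# Circumscribed central composite design (CCD) for two illustrative
# factors in coded units: factorial points at +/-1, axial points at
# +/-alpha, and replicated center points.
alpha = np.sqrt(2.0)
factorial = [(-1, -1), (1, -1), (-1, 1), (1, 1)]
axial = [(-alpha, 0), (alpha, 0), (0, -alpha), (0, alpha)]
center = [(0.0, 0.0)] * 3
X = np.array(factorial + axial + center)

# Hypothetical response with a known optimum at (0.5, -0.3), so the
# fitted surface can be checked against it.
def true_response(x1, x2):
    return 10.0 - (x1 - 0.5) ** 2 - (x2 + 0.3) ** 2

y = true_response(X[:, 0], X[:, 1])

# Full quadratic response-surface model:
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
x1, x2 = X[:, 0], X[:, 1]
A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Stationary point of the fitted surface: solve grad(y) = 0.
b0, b1, b2, b11, b22, b12 = coef
H = np.array([[2 * b11, b12], [b12, 2 * b22]])
opt = np.linalg.solve(H, -np.array([b1, b2]))
print(opt)  # recovers the optimum near (0.5, -0.3)
```

In practice the coded optimum is mapped back to physical units (e.g. reagent volumes) and checked for curvature via the Hessian's eigenvalues.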

  2. Different design of enzyme-triggered CO-releasing molecules (ET-CORMs) reveals quantitative differences in biological activities in terms of toxicity and inflammation.

    PubMed

    Stamellou, E; Storz, D; Botov, S; Ntasis, E; Wedel, J; Sollazzo, S; Krämer, B K; van Son, W; Seelen, M; Schmalz, H G; Schmidt, A; Hafner, M; Yard, B A

    2014-01-01

    Acyloxydiene-Fe(CO)3 complexes can act as enzyme-triggered CO-releasing molecules (ET-CORMs). Their biological activity strongly depends on the mother compound from which they are derived, i.e. cyclohexenone or cyclohexanedione, and on the position of the ester functionality they harbour. The present study addresses whether the latter characteristic affects CO release, whether the cytotoxicity of ET-CORMs is mediated through iron release or inhibition of cell respiration, and to what extent cyclohexenone- and cyclohexanedione-derived ET-CORMs differ in their ability to counteract TNF-α-mediated inflammation. Irrespective of the formulation (DMSO or cyclodextrin), toxicity in HUVEC was significantly higher for ET-CORMs bearing the ester functionality at the outer (rac-4), as compared to the inner (rac-1), position of the cyclohexenone moiety. This was paralleled by an increased CO release from the former ET-CORM. Toxicity was not mediated via iron, as EC50 values for rac-4 were significantly lower than for FeCl2 or FeCl3 and were not influenced by iron chelation. ATP depletion preceded toxicity, suggesting impaired cell respiration as the putative cause of cell death. In long-term HUVEC cultures, inhibition of VCAM-1 expression by rac-1 waned over time, while inhibition by the cyclohexanedione-derived rac-8 seemed to increase. NFκB was inhibited by both rac-1 and rac-8 independent of IκBα degradation. Both ET-CORMs activated Nrf-2 and consequently induced the expression of HO-1. This study further provides a rational framework for designing acyloxydiene-Fe(CO)3 complexes as ET-CORMs with differential CO release and biological activities. We also provide a better understanding of how these complexes affect cell biology in mechanistic terms.

  3. Rigour in quantitative research.

    PubMed

    Claydon, Leica Sarah

    2015-07-22

    This article, which forms part of the research series, addresses scientific rigour in quantitative research. It explores the basis and use of quantitative research and the nature of scientific rigour. It examines how the reader may determine whether quantitative research results are accurate, the questions that should be asked to determine accuracy and the checklists that may be used in this process. Quantitative research has advantages in nursing, since it can provide numerical data to help answer questions encountered in everyday practice.

  4. Developing Geoscience Students' Quantitative Skills

    NASA Astrophysics Data System (ADS)

    Manduca, C. A.; Hancock, G. S.

    2005-12-01

    Sophisticated quantitative skills are an essential tool for the professional geoscientist. While students learn many of these sophisticated skills in graduate school, it is increasingly important that they have a strong grounding in quantitative geoscience as undergraduates. Faculty have developed many strong approaches to teaching these skills in a wide variety of geoscience courses. A workshop in June 2005 brought together eight faculty teaching surface processes and climate change to discuss and refine activities they use and to publish them on the Teaching Quantitative Skills in the Geosciences website (serc.Carleton.edu/quantskills) for broader use. Workshop participants, in consultation with two mathematics faculty with expertise in math education, developed six review criteria to guide discussion: (1) Are the quantitative and geologic goals central and important (e.g. problem solving, mastery of an important skill, modeling, relating theory to observation)? (2) Does the activity lead to better problem solving? (3) Are the quantitative skills integrated with geoscience concepts in a way that makes sense for the learning environment and supports learning of both quantitative skills and geoscience? (4) Does the methodology support learning (e.g. motivate and engage students; use multiple representations; incorporate reflection, discussion and synthesis)? (5) Are the materials complete and helpful to students? (6) How well has the activity worked when used? Workshop participants found that reviewing each other's activities was very productive because it prompted them to think about new ways to teach, and the experience of reviewing helped them consider their own activities from a different point of view. The review criteria focused their thinking about the activity and would be equally helpful in the design of a new activity. We invite a broad international discussion of the criteria (serc.Carleton.edu/quantskills/workshop05/review.html). The teaching activities can be found on the

  5. QUANTITATIVE DECISION TOOLS AND MANAGEMENT DEVELOPMENT PROGRAMS.

    ERIC Educational Resources Information Center

    BYARS, LLOYD L.; NUNN, GEOFFREY E.

    THIS ARTICLE OUTLINED THE CURRENT STATUS OF QUANTITATIVE METHODS AND OPERATIONS RESEARCH (OR), SKETCHED THE STRENGTHS OF TRAINING EFFORTS AND ISOLATED WEAKNESSES, AND FORMULATED WORKABLE CRITERIA FOR EVALUATING SUCCESS OF OPERATIONS RESEARCH TRAINING PROGRAMS. A SURVEY OF 105 COMPANIES REVEALED THAT PERT, INVENTORY CONTROL THEORY AND LINEAR…

  6. Changes to the Design of the National Health Interview Survey to Support Enhanced Monitoring of Health Reform Impacts at the State Level

    PubMed Central

    Blewett, Lynn A.; Dahlen, Heather M.; Spencer, Donna; Rivera Drew, Julia A.; Lukanen, Elizabeth

    2016-01-01

    Since 1957, the National Health Interview Survey (NHIS), sponsored by the Centers for Disease Control and Prevention (CDC)’s National Center for Health Statistics (NCHS), has been the primary source of information for monitoring health and health care use of the U.S. population at the national level. The passage of the Patient Protection and Affordable Care Act (ACA) in 2010 generated new needs for data to monitor its implementation and evaluate its effectiveness. In response, the NCHS has taken steps to enhance the content of the NHIS in several key areas and positioned the NHIS as a source of population health information at the national and state levels. This paper reviews recent changes to the NHIS that support enhanced health reform monitoring, including new questions and response categories, sampling design changes to improve state-level analysis, and enhanced dissemination activities. We conclude with a discussion about the importance of the NHIS, the continued need for state-level analysis, and suggestions for future consideration. PMID:27631739

  7. Spatial distribution, sampling precision and survey design optimisation with non-normal variables: The case of anchovy (Engraulis encrasicolus) recruitment in Spanish Mediterranean waters

    NASA Astrophysics Data System (ADS)

    Tugores, M. Pilar; Iglesias, Magdalena; Oñate, Dolores; Miquel, Joan

    2016-02-01

    In the Mediterranean Sea, the European anchovy (Engraulis encrasicolus) plays a key role in ecological and economic terms. Ensuring stock sustainability requires the provision of crucial information, such as species spatial distribution or unbiased abundance and precision estimates, so that management strategies can be defined (e.g. fishing quotas, temporal closure areas or marine protected areas, MPAs). Furthermore, the estimation of the precision of global abundance at different sampling intensities can be used for survey design optimisation. Geostatistics provides unbiased estimates of the spatial structure, global abundance and precision for autocorrelated data. However, its application to non-Gaussian data introduces difficulties into the analysis and can compromise robustness and unbiasedness. The present study applied intrinsic geostatistics in two dimensions in order to (i) analyse the spatial distribution of anchovy in Spanish Western Mediterranean waters during the species' recruitment season, (ii) produce distribution maps, (iii) estimate global abundance and its precision, (iv) analyse the effect of changing the sampling intensity on the precision of global abundance estimates and, (v) evaluate the effects of several methodological options on the robustness of all the analysed parameters. The results suggested that while the spatial structure was usually non-robust to the tested methodological options when working with the original dataset, it became more robust for the transformed datasets (especially for the log-backtransformed dataset). The global abundance was always highly robust and the global precision was highly or moderately robust to most of the methodological options, except for data transformation.
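
The spatial-structure step in such a geostatistical analysis starts from the empirical semivariogram. A minimal one-dimensional sketch, using synthetic autocorrelated data rather than the survey's acoustic records, is:

```python
import numpy as np

# Empirical (semi)variogram sketch along a 1-D transect.
# Positions and values are illustrative, not survey data.
rng = np.random.default_rng(0)
x = np.arange(100.0)                                  # sample positions
z = np.sin(x / 15.0) + 0.1 * rng.standard_normal(100) # autocorrelated values

def empirical_variogram(x, z, lags):
    """gamma(h) = mean of 0.5*(z_i - z_j)^2 over pairs with |x_i - x_j| in [lo, hi)."""
    d = np.abs(x[:, None] - x[None, :])
    sq = 0.5 * (z[:, None] - z[None, :]) ** 2
    gamma = []
    for lo, hi in zip(lags[:-1], lags[1:]):
        mask = (d >= lo) & (d < hi) & (d > 0)
        gamma.append(sq[mask].mean())
    return np.array(gamma)

lags = np.arange(0, 31, 5)
gamma = empirical_variogram(x, z, lags)
# For spatially structured data the semivariance rises with lag
# toward a sill; a model fitted to gamma(h) then drives kriging and
# the precision of the global abundance estimate.
print(gamma)
```

Changing the sampling intensity amounts to thinning `x` and recomputing the kriging variance of the global mean, which is how precision-versus-effort curves for survey optimisation are built.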

  8. The discovery of novel histone lysine methyltransferase G9a inhibitors (part 1): molecular design based on a series of substituted 2,4-diamino-7- aminoalkoxyquinazoline by molecular-docking-guided 3D quantitative structure-activity relationship studies.

    PubMed

    Feng, Taotao; Wang, Hai; Zhang, Xiaojin; Sun, Haopeng; You, Qidong

    2014-06-01

    Protein lysine methyltransferase G9a, which catalyzes methylation of lysine 9 of histone H3 (H3K9) and lysine 373 (K373) of p53, is overexpressed in human cancers. This suggests that small-molecule inhibitors of G9a might be attractive antitumor agents. Herein we report our efforts on the design of novel G9a inhibitors based on the 3D quantitative structure-activity relationship (3D-QSAR) analysis of a series of 2,4-diamino-7-aminoalkoxyquinazolines as G9a inhibitors. The 3D-QSAR model was generated from 47 compounds using docking-based molecular alignment. The best predictions were obtained with the CoMFA standard model (q² = 0.700, r² = 0.952) and the CoMSIA model combined with steric, electrostatic, hydrophobic, hydrogen bond donor and acceptor fields (q² = 0.724, r² = 0.960). The structural requirements of substituted 2,4-diamino-7-aminoalkoxyquinazolines for G9a inhibitory activity can be obtained by analysing the CoMSIA plots. Based on this information, six novel follow-up analogs were designed.
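
The q² statistic quoted for the CoMFA and CoMSIA models is a leave-one-out cross-validated r². A sketch of its computation follows, with an ordinary linear model standing in for the 3D-QSAR fields and simulated activities in place of the paper's data:

```python
import numpy as np

# Leave-one-out cross-validated q^2, the statistic used to judge
# whether a QSAR model is predictive (q^2 > 0.5 by convention).
rng = np.random.default_rng(1)
X = rng.standard_normal((47, 3))               # 47 compounds, 3 descriptors
beta = np.array([1.0, -0.5, 0.3])
y = X @ beta + 0.1 * rng.standard_normal(47)   # pIC50-like activities

def loo_q2(X, y):
    press = 0.0                                # predictive residual sum of squares
    n = len(y)
    for i in range(n):
        keep = np.arange(n) != i
        A = np.column_stack([np.ones(keep.sum()), X[keep]])
        coef, *_ = np.linalg.lstsq(A, y[keep], rcond=None)
        pred = coef[0] + X[i] @ coef[1:]       # predict the held-out compound
        press += (y[i] - pred) ** 2
    return 1.0 - press / np.sum((y - y.mean()) ** 2)

q2 = loo_q2(X, y)
print(round(q2, 3))
```

The non-cross-validated r² uses the same formula with in-sample residuals, which is why r² always exceeds q² for the same model.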

  9. Use of Web and In-Person Survey Modes to Gather Data From Young Adults on Sex and Drug Use: An Evaluation of Cost, Time, and Survey Error Based on a Randomized Mixed-Mode Design

    PubMed Central

    McMorris, Barbara J.; Petrie, Renee S.; Catalano, Richard F.; Fleming, Charles B.; Haggerty, Kevin P.; Abbott, Robert D.

    2008-01-01

    In a randomized test of mixed-mode data collection strategies, 386 participants in the Raising Healthy Children (RHC) Project were either (1) asked to complete a survey over the Internet and later offered the opportunity to complete the survey in person or (2) first offered an in-person survey, with Web follow-up. The web-first condition resulted in cost savings while the overall completion rates for the two conditions were similar. On average, in-person-first condition participants completed surveys earlier in the field period than web-first condition participants. Based on intent-to-treat analyses, little evidence of condition effects on response bias, with respect to rates or levels of reported behavior, was found. PMID:19029360

  10. A Very High Resolution, Deep-Towed Multichannel Seismic Survey in the Yaquina Basin off Peru ? Technical Design of the new Deep-Tow Streamer

    NASA Astrophysics Data System (ADS)

    Bialas, J.; Breitzke, M.

    2002-12-01

    Within the project INGGAS, a new deep-towed acoustic profiling instrument consisting of a side-scan sonar fish and a 26-channel seismic streamer has been developed for operation at full ocean depth. The digital channels are built from single hydrophones and three engineering nodes (EN), which are connected by cable segments either 1 m or 6.5 m long. Together with high-frequency surface sources (e.g. a GI gun), this hybrid system allows surveys to be completed with higher-frequency target resolution than entirely surface-based configurations. Consequently, special attention has been paid to positioning of the submerged towed instrument. Ultra Short Base Line (USBL) navigation of the tow fish allows precise coordinate evaluation even with more than 7 km of tow cable. Specially designed engineering nodes comprise a single hydrophone with compass, depth, pitch and roll sensors. Optional extension of the streamer up to 96 hydrophone nodes and 75 engineering nodes is possible. A telemetry device allows uplink and downlink transmission of all system parameters and all recorded data from the tow fish in real time. Signals from the streamer and the various side-scan sensors are multiplexed along the deep-sea cable. Within the telemetry system, coaxial and fiber-optic connectors are available and can be chosen according to the ship's needs. In case of small bandwidth, only selected portions of data are transmitted onboard to provide full online quality control, while a copy of the complete data set is stored within the submerged systems. Onboard, the record strings of side scan and streamer are demultiplexed and distributed to the quality control (QC) systems by Ethernet. A standard marine multichannel control system is used to display shot gathers, spectra and noise monitoring of the streamer channels, as well as for data storage in SEG format. Precise navigation post-processing includes all available positioning information from the vessel (DGPS), the USBL, the

  11. Population and Star Formation Histories from the Outer Limits Survey

    NASA Astrophysics Data System (ADS)

    Brondel, Brian Joseph; Saha, Abhijit; Olszewski, Edward

    2015-08-01

    The Outer Limits Survey (OLS) is a deep survey of selected fields in the outlying areas of the Magellanic Clouds based on the MOSAIC-II instrument on the Blanco 4-meter Telescope at CTIO. OLS is designed to probe the outer disk and halo structures of the Magellanic System. The survey comprises ~50 fields obtained in Landolt R, I and Washington C, M and DDO51 filters, extending to a depth of about 24th magnitude in I. While qualitative examination of the resulting data has yielded interesting published results, we report here on quantitative analysis through matching of Hess diagrams to theoretical isochrones. We present analysis based on techniques developed by Dolphin (e.g., 2002, MNRAS, 332, 91) for fields observed by OLS. Our results broadly match those found by qualitative examination of the CMDs, but interesting details emerge from isochrone fitting.

  12. Application of TaqMan fluorescent probe-based quantitative real-time PCR assay for the environmental survey of Legionella spp. and Legionella pneumophila in drinking water reservoirs in Taiwan.

    PubMed

    Kao, Po-Min; Hsu, Bing-Mu; Hsu, Tsui-Kang; Ji, Wen-Tsai; Huang, Po-Hsiang; Hsueh, Chih-Jen; Chiang, Chuen-Sheue; Huang, Shih-Wei; Huang, Yu-Li

    2014-08-15

    In this study, TaqMan fluorescent quantitative real-time PCR was performed to quantify Legionella species in reservoirs. Water samples were collected from 19 main reservoirs in Taiwan, and 12 (63.2%) were found to contain Legionella spp. The identified species included uncultured Legionella spp., L. pneumophila, L. jordanis, and L. drancourtii. The concentrations of Legionella spp. and L. pneumophila in the water samples were in the range of 1.8×10²–2.6×10³ and 1.6×10²–2.4×10² cells/L, respectively. Reservoirs with and without Legionella spp. differed significantly in pH. These results highlight that L. pneumophila, L. jordanis, and L. drancourtii are potential pathogens in the reservoirs. The presence of L. pneumophila in reservoirs may be a potential public health concern that must be further examined.
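
Quantification by TaqMan qPCR of this kind rests on a standard curve relating threshold cycle (Ct) to template amount. A sketch with illustrative calibration values (not the study's data) shows the two numbers routinely reported, slope and amplification efficiency:

```python
import numpy as np

# Standard-curve quantification for qPCR: Ct falls linearly with
# log10(template copies). Calibration values below are illustrative.
log10_copies = np.array([2.0, 3.0, 4.0, 5.0, 6.0])   # serial dilutions
ct = np.array([33.1, 29.8, 26.4, 23.1, 19.8])        # measured Ct values

# Linear fit: Ct = slope * log10(copies) + intercept.
slope, intercept = np.polyfit(log10_copies, ct, 1)

# Amplification efficiency: slope of -3.32 corresponds to ~100%
# (perfect doubling each cycle).
efficiency = 10 ** (-1.0 / slope) - 1.0

def copies_from_ct(ct_value):
    """Quantify an unknown sample from its Ct via the standard curve."""
    return 10 ** ((ct_value - intercept) / slope)

print(round(slope, 2), round(efficiency, 2))
```

Copy numbers per reaction are then scaled by the filtered water volume to give cells/L, as in the concentration ranges reported above.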

  13. Bayesian adaptive survey protocols for resource management

    USGS Publications Warehouse

    Halstead, Brian J.; Wylie, Glenn D.; Coates, Peter S.; Casazza, Michael L.

    2011-01-01

    Transparency in resource management decisions requires a proper accounting of uncertainty at multiple stages of the decision-making process. As information becomes available, periodic review and updating of resource management protocols reduces uncertainty and improves management decisions. One of the most basic steps to mitigating anthropogenic effects on populations is determining if a population of a species occurs in an area that will be affected by human activity. Species are rarely detected with certainty, however, and falsely declaring a species absent can cause improper conservation decisions or even extirpation of populations. We propose a method to design survey protocols for imperfectly detected species that accounts for multiple sources of uncertainty in the detection process, is capable of quantitatively incorporating expert opinion into the decision-making process, allows periodic updates to the protocol, and permits resource managers to weigh the severity of consequences if the species is falsely declared absent. We developed our method using the giant gartersnake (Thamnophis gigas), a threatened species precinctive to the Central Valley of California, as a case study. Survey date was negatively related to the probability of detecting the giant gartersnake, and water temperature was positively related to the probability of detecting the giant gartersnake at a sampled location. Reporting sampling effort, timing and duration of surveys, and water temperatures would allow resource managers to evaluate the probability that the giant gartersnake occurs at sampled sites where it is not detected. This information would also allow periodic updates and quantitative evaluation of changes to the giant gartersnake survey protocol. Because it naturally allows multiple sources of information and is predicated upon the idea of updating information, Bayesian analysis is well-suited to solving the problem of developing efficient sampling protocols for species of
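
The core calculation behind such a protocol, the posterior probability that a species is present after n surveys without a detection, is a one-line Bayes update. The prior occupancy psi and per-survey detection probability p below are illustrative values, not estimates for the giant gartersnake:

```python
# Posterior probability that a species occupies a site after n surveys
# with no detections: the site is either occupied-but-missed n times,
# or truly unoccupied.

def occupancy_given_no_detection(psi, p, n):
    """P(present | n non-detections) via Bayes' rule.

    psi: prior probability the site is occupied
    p:   per-survey detection probability given presence
    n:   number of surveys with no detection
    """
    missed = psi * (1.0 - p) ** n   # occupied, detection failed n times
    absent = 1.0 - psi              # truly absent
    return missed / (missed + absent)

psi, p = 0.5, 0.3
for n in (1, 3, 5, 10):
    print(n, round(occupancy_given_no_detection(psi, p, n), 3))
```

A protocol designer inverts this: choose the smallest n for which the posterior falls below an acceptable risk of falsely declaring absence, with p itself modeled as a function of covariates such as survey date and water temperature.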

  14. Design, Data Collection, Monitoring, Interview Administration Time, and Data Editing in the 1993 National Household Education Survey (NHES:93). Working Paper Series.

    ERIC Educational Resources Information Center

    Brick, J. Michael; Collins, Mary A.; Nolin, Mary Jo; Davies, Elizabeth; Feibus, Mary L.

    The National Household Education Survey (NHES) is a data collection system of the National Center for Education Statistics that collects and publishes data on the condition of education in the United States. It is a telephone survey of the noninstitutionalized population of the country, and it focuses on issues that are best studied through…

  15. Imaging without fluorescence: nonlinear optical microscopy for quantitative cellular imaging.

    PubMed

    Streets, Aaron M; Li, Ang; Chen, Tao; Huang, Yanyi

    2014-09-01

    Quantitative single-cell analysis enables the characterization of cellular systems with a level of detail that cannot be achieved with ensemble measurement. In this Feature we explore quantitative cellular imaging applications with nonlinear microscopy techniques. We first offer an introductory tutorial on nonlinear optical processes and then survey a range of techniques that have proven to be useful for quantitative live cell imaging without fluorescent labels.

  16. The Survey Questionnaire

    ERIC Educational Resources Information Center

    Ritter, Lois A. Ed.; Sue, Valerie M., Ed.

    2007-01-01

    Internet-based surveys are still relatively new, and researchers are just beginning to articulate best practices for questionnaire design. Online questionnaire design has generally been guided by the principles applying to other self-administered instruments, such as paper-based questionnaires. Web-based questionnaires, however, have the potential…

  17. Towards global benchmarking of food environments and policies to reduce obesity and diet-related non-communicable diseases: design and methods for nation-wide surveys

    PubMed Central

    Vandevijvere, Stefanie; Swinburn, Boyd

    2014-01-01

    Introduction: Unhealthy diets are heavily driven by unhealthy food environments. The International Network for Food and Obesity/non-communicable diseases (NCDs) Research, Monitoring and Action Support (INFORMAS) has been established to reduce obesity, NCDs and their related inequalities globally. This paper describes the design and methods of the first-ever, comprehensive national survey on the healthiness of food environments and the public and private sector policies influencing them, as a first step towards global monitoring of food environments and policies. Methods and analysis: A package of 11 substudies has been identified: (1) food composition, labelling and promotion on food packages; (2) food prices, shelf space and placement of foods in different outlets (mainly supermarkets); (3) food provision in schools/early childhood education (ECE) services and outdoor food promotion around schools/ECE services; (4) density of and proximity to food outlets in communities; food promotion to children via (5) television, (6) magazines, (7) sport club sponsorships, and (8) internet and social media; (9) analysis of the impact of trade and investment agreements on food environments; (10) government policies and actions; and (11) private sector actions and practices. For the substudies on food prices, provision, promotion and retail, ‘environmental equity’ indicators have been developed to check progress towards reducing diet-related health inequalities. Indicators for these modules will be assessed by tertiles of area deprivation index or school deciles. International ‘best practice benchmarks’ will be identified, against which to compare progress of countries on improving the healthiness of their food environments and policies. Dissemination: This research is highly original due to the very ‘upstream’ approach being taken and its direct policy relevance. The detailed protocols will be offered to and adapted for countries of varying size and income in order to

  18. A whole genome scan for quantitative trait loci affecting milk protein percentage in Israeli-Holstein cattle, by means of selective milk DNA pooling in a daughter design, using an adjusted false discovery rate criterion.

    PubMed Central

    Mosig, M O; Lipkin, E; Khutoreskaya, G; Tchourzyna, E; Soller, M; Friedmann, A

    2001-01-01

    Selective DNA pooling was employed in a daughter design to screen all bovine autosomes for quantitative trait loci (QTL) affecting estimated breeding value for milk protein percentage (EBVP%). Milk pools prepared from high and low daughters of each of seven sires were genotyped for 138 dinucleotide microsatellites. Shadow-corrected estimates of sire allele frequencies were compared between high and low pools. An adjusted false discovery rate (FDR) method was employed to calculate experimentwise significance levels and empirical power. Significant associations with milk protein percentage were found for 61 of the markers (adjusted FDR = 0.10; estimated power, 0.68). The significant markers appear to be linked to 19–28 QTL. Mean allele substitution effects of the putative QTL averaged 0.016 (range 0.009–0.028) in units of the within-sire family standard deviation of EBVP% and summed to 0.460 EBVP%. Overall QTL heterozygosity was 0.40. The identified QTL appear to account for all of the variation in EBVP% in the population. Through use of selective DNA pooling, 4400 pool data points provided the statistical power of 600,000 individual data points. PMID:11290723
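
The adjusted FDR criterion in the study builds on the standard Benjamini-Hochberg step-up procedure, which can be sketched as follows (the p-values are illustrative; the study's specific adjustment is not reproduced here):

```python
import numpy as np

# Benjamini-Hochberg step-up procedure for FDR control.
# Returns a boolean mask of rejected (declared significant) hypotheses.
def benjamini_hochberg(pvals, fdr=0.10):
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    ranked = p[order]
    # Find the largest k with p_(k) <= (k/m) * fdr, then reject
    # hypotheses with the k smallest p-values.
    below = ranked <= (np.arange(1, m + 1) / m) * fdr
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])
        reject[order[: k + 1]] = True
    return reject

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.3, 0.9]
print(benjamini_hochberg(pvals, fdr=0.10))
```

Unlike a Bonferroni correction, which controls the chance of any false positive, this controls the expected fraction of false positives among the rejected markers, which suits genome scans with many linked tests.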

  19. Guidelines for Initiating a Research Agenda: Research Design and Dissemination of Results.

    PubMed

    Delost, Maria E; Nadder, Teresa S

    2014-01-01

    Successful research outcomes require selection and implementation of the appropriate research design. A realistic sampling plan appropriate for the design is essential. Qualitative or quantitative methodology may be utilized, depending on the research question and goals. Quantitative research may be experimental where there is an intervention, or nonexperimental, if no intervention is included in the design. Causation can only be established with experimental research. Popular types of nonexperimental research include descriptive and survey research. Research findings may be disseminated via presentations, posters, and publications, such as abstracts and manuscripts.

  20. Reef Fish Survey Techniques: Assessing the Potential for Standardizing Methodologies.

    PubMed

    Caldwell, Zachary R; Zgliczynski, Brian J; Williams, Gareth J; Sandin, Stuart A

    2016-01-01

    Dramatic changes in populations of fishes living on coral reefs have been documented globally and, in response, the research community has initiated efforts to assess and monitor reef fish assemblages. A variety of visual census techniques are employed; however, results are often incomparable due to differential methodological performance. Although comparability of data may promote improved assessment of fish populations, and thus management of often critically important nearshore fisheries, to date no standardized and agreed-upon survey method has emerged. This study describes the use of methods across the research community and identifies potential drivers of method selection. An online survey was distributed to researchers from academic, governmental, and non-governmental organizations internationally. Although many methods were identified, 89% of survey-based projects employed one of three methods: belt transect, stationary point count, and some variation of the timed swim method. The selection of survey method was independent of the research design (i.e., assessment goal) and region of study, but was related to the researcher's home institution. While some researchers expressed willingness to modify their current survey protocols to more standardized protocols (76%), their willingness decreased when methodologies were tied to long-term datasets spanning five or more years. Willingness to modify current methodologies was also less common among academic researchers than resource managers. By understanding both the current application of methods and the reported motivations for method selection, we hope to focus discussions towards increasing the comparability of quantitative reef fish survey data. PMID:27111085

  1. Reef Fish Survey Techniques: Assessing the Potential for Standardizing Methodologies

    PubMed Central

    Caldwell, Zachary R.; Zgliczynski, Brian J.; Williams, Gareth J.; Sandin, Stuart A.

    2016-01-01

    Dramatic changes in populations of fishes living on coral reefs have been documented globally and, in response, the research community has initiated efforts to assess and monitor reef fish assemblages. A variety of visual census techniques are employed; however, results are often incomparable due to differential methodological performance. Although comparability of data may promote improved assessment of fish populations, and thus management of often critically important nearshore fisheries, to date no standardized and agreed-upon survey method has emerged. This study describes the use of methods across the research community and identifies potential drivers of method selection. An online survey was distributed to researchers from academic, governmental, and non-governmental organizations internationally. Although many methods were identified, 89% of survey-based projects employed one of three methods: belt transect, stationary point count, and some variation of the timed swim method. The selection of survey method was independent of the research design (i.e., assessment goal) and region of study, but was related to the researcher’s home institution. While some researchers expressed willingness to modify their current survey protocols to more standardized protocols (76%), their willingness decreased when methodologies were tied to long-term datasets spanning five or more years. Willingness to modify current methodologies was also less common among academic researchers than resource managers. By understanding both the current application of methods and the reported motivations for method selection, we hope to focus discussions towards increasing the comparability of quantitative reef fish survey data. PMID:27111085

  2. Use of Web and Phone Survey Modes to Gather Data From Adults About Their Young Adult Children: An Evaluation Based on a Randomized Design.

    PubMed

    Fleming, Charles B; Marchesini, Gina; Elgin, Jenna; Haggerty, Kevin P; Woodward, Danielle; Abbott, Robert D; Catalano, Richard F

    2013-11-01

    Mode effects on responses to survey items may introduce bias to data collected using multiple modes of administration. The present study examines data from 704 surveys conducted as part of a longitudinal study in which parents and their children had been surveyed at multiple prior time points. Parents of 22-year-old study participants were randomly assigned to one of two mixed-mode conditions: (a) Web mode first followed by the offer of an interviewer-administered telephone mode; or (b) telephone mode first followed by the offer of the Web mode. Comparison of responses by assigned condition on 12 measures showed one statistically significant difference. Analyses that modeled differences by completed mode and the interaction between assigned condition and completed mode found significant differences on six measures related to completed mode. None of the differences indicated that more socially desirable responses were given in interviewer-administered surveys.
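
A minimal version of such a mode-effect check is a two-sample comparison of mean reported behavior by completed mode. The Welch t statistic below is computed on illustrative counts (e.g. of a reported behavior), not the study's data:

```python
import math

# Compare mean reports between web-completed and phone-completed
# surveys with a Welch t statistic (unequal variances allowed).
# Data are illustrative stand-ins.
web = [2, 0, 1, 3, 2, 1, 0, 2, 1, 1]
phone = [1, 0, 1, 2, 1, 0, 0, 1, 1, 0]

def welch_t(a, b):
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)

t = welch_t(web, phone)
print(round(t, 2))
```

Note the study's key design point: comparing by *assigned* condition (intent-to-treat) avoids the self-selection bias that contaminates comparisons by *completed* mode, since which mode a respondent completes is not random.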

  3. Use of Web and Phone Survey Modes to Gather Data From Adults About Their Young Adult Children: An Evaluation Based on a Randomized Design

    PubMed Central

    Fleming, Charles B.; Marchesini, Gina; Elgin, Jenna; Haggerty, Kevin P.; Woodward, Danielle; Abbott, Robert D.; Catalano, Richard F.

    2013-01-01

    Mode effects on responses to survey items may introduce bias to data collected using multiple modes of administration. The present study examines data from 704 surveys conducted as part of a longitudinal study in which parents and their children had been surveyed at multiple prior time points. Parents of 22-year-old study participants were randomly assigned to one of two mixed-mode conditions: (a) Web mode first followed by the offer of an interviewer-administered telephone mode; or (b) telephone mode first followed by the offer of the Web mode. Comparison of responses by assigned condition on 12 measures showed one statistically significant difference. Analyses that modeled differences by completed mode and the interaction between assigned condition and completed mode found significant differences on six measures related to completed mode. None of the differences indicated that more socially desirable responses were given in interviewer-administered surveys. PMID:24733977

  4. Theory Survey or Survey Theory?

    ERIC Educational Resources Information Center

    Dean, Jodi

    2010-01-01

    Matthew Moore's survey of political theorists in American colleges and universities is an impressive contribution to political science (Moore 2010). It is the first such survey of political theory as a subfield, the response rate is very high, and the answers to the survey questions provide new information about how political theorists look…

  5. Prototype ultrasonic instrument for quantitative testing

    NASA Technical Reports Server (NTRS)

    Lynnworth, L. C.; Dubois, J. L.; Kranz, P. R.

    1972-01-01

    A prototype ultrasonic instrument has been designed and developed for quantitative testing. The complete delivered instrument consists of a pulser/receiver which plugs into a standard oscilloscope, an rf power amplifier, a standard decade oscillator, and a set of broadband transducers for typical use at 1, 2, 5 and 10 MHz. The system provides for its own calibration, and on the oscilloscope, presents a quantitative (digital) indication of time base and sensitivity scale factors and some measurement data.

  6. Quantitative optical phase microscopy.

    PubMed

    Barty, A; Nugent, K A; Paganin, D; Roberts, A

    1998-06-01

    We present a new method for the extraction of quantitative phase data from microscopic phase samples by use of partially coherent illumination and an ordinary transmission microscope. The technique produces quantitative images of the phase profile of the sample without phase unwrapping. The technique is able to recover phase even in the presence of amplitude modulation, making it significantly more powerful than existing methods of phase microscopy. We demonstrate the technique by providing quantitatively correct phase images of well-characterized test samples and show that the results obtained for more-complex samples correlate with structures observed with Nomarski differential interference contrast techniques.
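
    The abstract does not state the governing relation, but assuming the method belongs to the transport-of-intensity family of phase retrieval (the approach these authors are associated with), the underlying equation linking the measured intensity I, the phase φ, and the wavelength λ under paraxial propagation along z is:

```latex
% Transport-of-intensity equation (paraxial propagation along z):
\frac{\partial I(x,y,z)}{\partial z}
  = -\frac{\lambda}{2\pi}\,
    \nabla_{\!\perp} \cdot \bigl( I(x,y,z)\, \nabla_{\!\perp} \phi(x,y,z) \bigr)
```

    Measuring I in two closely spaced focal planes approximates the left-hand side, from which φ can be recovered deterministically, i.e. without phase unwrapping.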

  7. African primary care research: performing surveys using questionnaires.

    PubMed

    Govender, Indiran; Mabuza, Langalibalele H; Ogunbanjo, Gboyega A; Mash, Bob

    2014-04-25

    The aim of this article is to provide practical guidance on conducting surveys and using questionnaires for postgraduate students at Master's level who are undertaking primary care research. The article is intended to assist with writing the methods section of the research proposal and with thinking through the relevant issues that apply to sample size calculation, sampling strategy, and the design and administration of a questionnaire. The article is part of a larger series on primary care research, with other articles in the series focusing on the structure of the research proposal and the literature review, as well as quantitative data analysis.
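
    As a sketch of the sample size calculation such guidance typically covers: Cochran's formula for estimating a proportion, with an optional finite-population correction. The defaults below (95% confidence, 5% margin of error, p = 0.5) are conventional choices, not values prescribed by the article:

```python
# Hedged sketch of the standard survey sample size calculation for a
# proportion (Cochran's formula), with a finite-population correction.
import math

def sample_size_proportion(p=0.5, margin=0.05, z=1.96, population=None):
    """Required sample size to estimate a proportion p within +/- margin."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    if population is not None:
        n0 = n0 / (1 + (n0 - 1) / population)  # finite-population correction
    return math.ceil(n0)

print(sample_size_proportion())                # large population -> 385
print(sample_size_proportion(population=500))  # e.g. a clinic of 500 patients
```

    Using p = 0.5 is the conservative default, since it maximises p(1 - p) and hence the required sample size.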

  8. Recapturing Quantitative Biology.

    ERIC Educational Resources Information Center

    Pernezny, Ken; And Others

    1996-01-01

    Presents a classroom activity on estimating animal populations. Uses shoe boxes and candies to emphasize the importance of mathematics in biology while introducing the methods of quantitative ecology. (JRH)
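
    The abstract does not name the estimator, but the classic candies-in-a-shoe-box exercise is a mark-recapture simulation. A sketch of the Chapman-corrected Lincoln-Petersen estimate, with invented numbers:

```python
# The classic classroom population estimate: mark some individuals, draw a
# second sample, and scale up from the fraction recaptured. Chapman's
# correction reduces the bias of the plain Lincoln-Petersen estimate.

def lincoln_petersen_chapman(marked, captured, recaptured):
    """Chapman's bias-corrected estimate of total population size N."""
    return (marked + 1) * (captured + 1) / (recaptured + 1) - 1

# Mark 40 candies, draw 45 in a second sample, and find 9 of them marked.
n_hat = lincoln_petersen_chapman(40, 45, 9)
print(round(n_hat))  # estimated number of candies in the box
```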

  9. On Quantitative Rorschach Scales.

    ERIC Educational Resources Information Center

    Haggard, Ernest A.

    1978-01-01

    Two types of quantitative Rorschach scales are discussed: first, those based on the response categories of content, location, and the determinants, and second, global scales based on the subject's responses to all ten stimulus cards. (Author/JKS)

  10. Quantitative receptor autoradiography

    SciTech Connect

    Boast, C.A.; Snowhill, E.W.; Altar, C.A.

    1986-01-01

    This volume addresses technical and scientific advances in quantitative receptor autoradiography. It opens with an overview of the field from a historical and critical perspective, followed by a detailed discussion of in vitro data obtained from a variety of neurotransmitter systems. The next section explores applications of autoradiography, and the final two chapters consider experimental models. Methodological considerations are emphasized, including the use of computers for image analysis.

  11. KE Basin underwater visual fuel survey

    SciTech Connect

    Pitner, A.L.

    1995-02-01

    Results of an underwater video fuel survey in KE Basin using a high resolution camera system are presented. Quantitative and qualitative information on fuel degradation are given, and estimates of the total fraction of ruptured fuel elements are provided. Representative photographic illustrations showing the range of fuel conditions observed in the survey are included.

  12. Very large radio surveys of the sky.

    PubMed

    Condon, J J

    1999-04-27

    Recent advances in electronics and computing have made possible a new generation of large radio surveys of the sky that yield an order-of-magnitude higher sensitivity and positional accuracy. Combined with the unique properties of the radio universe, these quantitative improvements open up qualitatively different and exciting new scientific applications of radio surveys.

  13. Very large radio surveys of the sky

    PubMed Central

    Condon, J. J.

    1999-01-01

    Recent advances in electronics and computing have made possible a new generation of large radio surveys of the sky that yield an order-of-magnitude higher sensitivity and positional accuracy. Combined with the unique properties of the radio universe, these quantitative improvements open up qualitatively different and exciting new scientific applications of radio surveys. PMID:10220365

  14. Recent advances in quantitative neuroproteomics.

    PubMed

    Craft, George E; Chen, Anshu; Nairn, Angus C

    2013-06-15

    The field of proteomics is undergoing rapid development in a number of different areas, including improvements in mass spectrometric platforms, peptide identification algorithms, and bioinformatics. In particular, new and improved approaches have established robust methods that not only allow for in-depth and accurate peptide and protein identification and characterization of modifications, but also allow for sensitive measurement of relative or absolute quantitation. These methods are beginning to be applied to the area of neuroproteomics, but the central nervous system poses many specific challenges for quantitative proteomics, given the large number of different neuronal cell types that are intermixed and that exhibit distinct patterns of gene and protein expression. This review highlights the recent advances that have been made in quantitative neuroproteomics, with a focus on work published over the last five years that applies emerging methods to normal brain function as well as to various neuropsychiatric disorders, including schizophrenia and drug addiction, and neurodegenerative diseases, including Parkinson's disease and Alzheimer's disease. While older methods such as two-dimensional polyacrylamide gel electrophoresis continue to be used, a variety of more in-depth MS-based approaches, including label-based (ICAT, iTRAQ, TMT, SILAC, SILAM), label-free (MRM, SWATH), and absolute quantification methods, are rapidly being applied to neurobiological investigations of normal and diseased brain tissue as well as of cerebrospinal fluid (CSF). While the biological implications of many of these studies remain to be clearly established, there is a clear need for standardization of experimental design and data analysis, and the analysis of protein changes in specific neuronal cell types in the central nervous system remains a serious challenge. Nevertheless, the quality and depth of the more recent quantitative proteomics studies is beginning to shed
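
    As a hedged sketch of the label-based relative quantitation the review surveys (e.g. SILAC-style heavy/light ratios), with invented intensities and the protein names used purely as placeholders:

```python
# Illustrative sketch (values invented) of label-based relative quantitation
# as used in SILAC-style experiments: per-protein log2 heavy/light ratios,
# median-normalized so that the bulk of unchanged proteins centers on zero.
import math

heavy = {"SNAP25": 5200.0, "SYN1": 880.0, "GFAP": 3100.0, "DLG4": 1500.0}
light = {"SNAP25": 2500.0, "SYN1": 900.0, "GFAP": 3000.0, "DLG4": 1400.0}

log2_ratios = {p: math.log2(heavy[p] / light[p]) for p in heavy}
median = sorted(log2_ratios.values())[len(log2_ratios) // 2]
# For an even count this picks the upper-middle value; fine for a sketch.
normalized = {p: r - median for p, r in log2_ratios.items()}
print({p: round(r, 2) for p, r in normalized.items()})
```

    After normalization, proteins with ratios far from zero (here the placeholder "SNAP25") are candidate differentially expressed proteins.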

  15. Assessing State-Level Job Training Coordination: A Survey Design and Methodology Based upon the Massachusetts Experience. Research Report No. 92-01.

    ERIC Educational Resources Information Center

    Ott, Attiat F.

    A survey instrument developed in 1991 by the Massachusetts Job Council to ascertain basic information regarding funds spent on job training and to provide state policymakers with information about those being served and what services are being provided was used as a model to develop a program assessment instrument. The program assessment…

  16. Arizona Teacher Working Conditions: Designing Schools for Educator and Student Success. Results of the 2006 Phase-In Teacher Working Conditions Survey

    ERIC Educational Resources Information Center

    Hirsch, Eric; Emerick, Scott

    2006-01-01

    Many schools across the country face persistent teacher working condition challenges that are closely related to high teacher turnover rates and chronic difficulties in recruiting and retaining teachers. Center for Teaching Quality (CTQ) research examining working conditions survey results in both North Carolina and South Carolina demonstrates…

  17. The Identification and Description of Critical Thinking Behaviors in the Practice of Clinical Laboratory Science, Part 1: Design, Implementation, Evaluation, and Results of a National Survey.

    ERIC Educational Resources Information Center

    Kenimer, Elizabeth A.

    2002-01-01

    A survey of 1,562 clinical laboratory scientists ranked critical thinking behaviors used in practice. Important behaviors were cognitive, behavioral, affective, and situated/contextual. Findings support a view of critical thinking as a metaprocess that spans learning domains. (Contains 17 references.) (SK)

  18. Multicultural Survey.

    ERIC Educational Resources Information Center

    Renyi, Judith, Comp.

    In May of 1992, the Alliance for Curriculum Reform (ACR) surveyed member organizations and others who had participated in ACR activities concerning their printed policies on issues relating to multicultural education. The areas of interest for the survey were: printed policy(ies) on multicultural content/curriculum; printed policy(ies) on student…

  19. Quantitative aspects of septicemia.

    PubMed Central

    Yagupsky, P; Nolte, F S

    1990-01-01

    For years, quantitative blood cultures found only limited use as aids in the diagnosis and management of septic patients because the available methods were cumbersome, labor intensive, and practical only for relatively small volumes of blood. The development and subsequent commercial availability of lysis-centrifugation direct plating methods for blood cultures have addressed many of the shortcomings of the older methods. The lysis-centrifugation method has demonstrated good performance relative to broth-based blood culture methods. As a result, quantitative blood cultures have found widespread use in clinical microbiology laboratories. Most episodes of clinically significant bacteremia in adults are characterized by low numbers of bacteria per milliliter of blood. In children, the magnitude of bacteremia is generally much higher, with the highest numbers of bacteria found in the blood of septic neonates. The magnitude of bacteremia correlates with the severity of disease in children and with mortality rates in adults, but other factors play more important roles in determining the patient's outcome. Serial quantitative blood cultures have been used to monitor the in vivo efficacy of antibiotic therapy in patients with slowly resolving sepsis, such as disseminated Mycobacterium avium-M. intracellulare complex infections. Quantitative blood culture methods were used in early studies of bacterial endocarditis, and the results significantly contributed to our understanding of the pathophysiology of this disease. Comparison of paired quantitative blood cultures obtained from a peripheral vein and the central venous catheter has been used to help identify patients with catheter-related sepsis and is the only method that does not require removal of the catheter to establish the diagnosis. Quantitation of bacteria in the blood can also help distinguish contaminated from truly positive blood cultures; however, no quantitative criteria can invariably differentiate
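
    As an illustration of the quantitation step (numbers invented, not from the review): lysis-centrifugation plates the concentrate from a known blood volume, so colony counts convert directly to colony-forming units per millilitre of blood:

```python
# Hedged sketch (invented numbers): in a lysis-centrifugation quantitative
# blood culture, the concentrate from a known blood volume is plated and
# colonies are counted, giving colony-forming units (CFU) per mL of blood.

def cfu_per_ml(colonies, blood_volume_ml, fraction_plated=1.0):
    """CFU per mL of blood, given the fraction of the concentrate plated."""
    return colonies / (blood_volume_ml * fraction_plated)

# 23 colonies grown from half of the concentrate of a 10 mL adult draw.
print(cfu_per_ml(23, 10.0, fraction_plated=0.5))  # CFU per mL of blood
```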

  20. Colour in quantitative and qualitative display formats

    NASA Astrophysics Data System (ADS)

    Reising, J. M.; Emerson, T. J.

    1985-12-01

    Advantages of color in display formats are considered. Most people enjoy color because it is aesthetically appealing; however, whether color improves an operator's performance is less clear, and in many instances color has been found not to improve operator efficiency. The objective of the present paper is to discuss the use of color in both quantitative and qualitative display formats, to point out cases in which color can offer advantages, and to review some of the rules for color application which designers should use. Attention is given to quantitative and qualitative displays, approaches for using color, and color in quantitative and qualitative displays. Color in hybrid displays is also discussed, taking into account color as a classifier, color and information processing, color and continuous variables, and color related to hue, saturation, and brightness.