Science.gov

Sample records for quantitative survey design

  1. Quantitative evolutionary design

    PubMed Central

    Diamond, Jared

    2002-01-01

    The field of quantitative evolutionary design uses evolutionary reasoning (in terms of natural selection and ultimate causation) to understand the magnitudes of biological reserve capacities, i.e. excesses of capacities over natural loads. Ratios of capacities to loads, defined as safety factors, fall in the range 1.2-10 for most engineered and biological components, even though engineered safety factors are specified intentionally by humans while biological safety factors arise through natural selection. Familiar examples of engineered safety factors include those of buildings, bridges and elevators (lifts), while biological examples include factors of bones and other structural elements, of enzymes and transporters, and of organ metabolic performances. Safety factors serve to minimize the overlap zone (resulting in performance failure) between the low tail of capacity distributions and the high tail of load distributions. Safety factors increase with coefficients of variation of load and capacity, with capacity deterioration with time, and with cost of failure, and decrease with costs of initial construction, maintenance, operation, and opportunity. Adaptive regulation of many biological systems involves capacity increases with increasing load; several quantitative examples suggest sublinear increases, such that safety factors decrease towards 1.0. Unsolved questions include safety factors of series systems, parallel or branched pathways, elements with multiple functions, enzyme reaction chains, and equilibrium enzymes. The modest sizes of safety factors imply the existence of costs that penalize excess capacities. Those costs are likely to involve wasted energy or space for large or expensive components, but opportunity costs of wasted space at the molecular level for minor components. PMID:12122135
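    The overlap logic described above can be sketched numerically. A minimal Monte Carlo illustration, assuming normally distributed loads and capacities; the distributions, coefficients of variation, and function names are illustrative assumptions, not taken from the article:

```python
import random

def failure_probability(safety_factor, load_cv=0.1, capacity_cv=0.1,
                        n_trials=100_000, seed=42):
    """Estimate the probability that a random load exceeds a random
    capacity, given a safety factor (mean capacity / mean load) and
    coefficients of variation for both distributions (assumed normal)."""
    rng = random.Random(seed)
    mean_load = 1.0
    mean_capacity = safety_factor * mean_load
    failures = 0
    for _ in range(n_trials):
        load = rng.gauss(mean_load, load_cv * mean_load)
        capacity = rng.gauss(mean_capacity, capacity_cv * mean_capacity)
        if load > capacity:
            failures += 1
    return failures / n_trials

# A larger safety factor shrinks the overlap zone between the high
# tail of the load distribution and the low tail of the capacity
# distribution, so the failure probability drops sharply.
p_low = failure_probability(1.2)
p_high = failure_probability(2.0)
```

    This is only a sketch of the tail-overlap argument; the article's cost-benefit treatment of safety factors involves more than this single failure probability.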

  2. Telephone Survey Designs.

    ERIC Educational Resources Information Center

    Casady, Robert J.

    The concepts, definitions, and notation that have evolved with the development of telephone survey design methodology are discussed and presented as a unified structure. This structure is then applied to some of the more well-known telephone survey designs and alternative designs are developed. The relative merits of the different survey designs…

  3. Quantitative resilience analysis through control design.

    SciTech Connect

    Sunderland, Daniel; Vugrin, Eric D.; Camphouse, Russell Chris

    2009-09-01

Critical infrastructure resilience has become a national priority for the U.S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. Few quantitative resilience methods exist, and those existing approaches tend to be rather simplistic and, hence, not capable of sufficiently assessing all aspects of critical infrastructure resilience. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the development of quantitative resilience through application of control design methods. Specifically, we conducted a survey of infrastructure models to assess what types of control design might be applicable for critical infrastructure resilience assessment. As a result of this survey, we developed a decision process that directs the resilience analyst to the control method that is most likely applicable to the system under consideration. Furthermore, we developed optimal control strategies for two sets of representative infrastructure systems to demonstrate how control methods could be used to assess the resilience of the systems to catastrophic disruptions. We present recommendations for future work to continue the development of quantitative resilience analysis methods.

  4. Survey design for detecting rare freshwater mussels

    USGS Publications Warehouse

    Smith, D.R.

    2006-01-01

A common objective when surveying freshwater mussels is to detect the presence of rare populations. In certain situations, such as when endangered or threatened species are potentially in the area of a proposed impact, the survey should be designed to ensure a high probability of detecting species presence. Linking survey design to probability of detecting species presence has been done for quantitative surveys, but commonly applied designs that are based on timed searches have not made that connection. I propose a semiquantitative survey design that links search area and search efficiency to probability of detecting species presence. The survey can be designed to protect against failing to detect populations above a threshold abundance (or density). I illustrate the design for surveys to detect clubshell (Pleurobema clava) and northern riffleshell (Epioblasma torulosa rangiana) in the Allegheny River. Monte Carlo simulation indicated that the proposed survey design performs well under a range of spatial distributions and low densities (<0.05/m²) where search area is sufficient to ensure that the probability of detecting species presence is predicted to be ≥0.85. © 2006 by The North American Benthological Society.
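    The link between search area, search efficiency, and detection probability can be sketched under a simple Poisson encounter model; this is an illustrative assumption on our part, not necessarily the model used in the article:

```python
import math

def detection_probability(density, search_area, search_efficiency):
    """Probability of detecting at least one individual, assuming
    individuals are Poisson-distributed at the given density
    (individuals per m^2) and each individual within the searched
    area is found independently with probability search_efficiency."""
    expected_found = density * search_area * search_efficiency
    return 1.0 - math.exp(-expected_found)

def required_search_area(threshold_density, search_efficiency, target=0.85):
    """Smallest search area (m^2) giving at least the target detection
    probability for populations at or above the threshold density."""
    return -math.log(1.0 - target) / (threshold_density * search_efficiency)

# Design a search to hit a 0.85 detection probability for populations
# at a threshold density of 0.05/m^2 with 50% search efficiency.
area = required_search_area(threshold_density=0.05, search_efficiency=0.5)
p = detection_probability(0.05, area, 0.5)
```

    Under this model the required area scales inversely with both the threshold density and the search efficiency, which is the qualitative trade-off the abstract describes.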

  5. Applying Knowledge of Quantitative Design and Analysis

    ERIC Educational Resources Information Center

    Baskas, Richard S.

    2011-01-01

    This study compared and contrasted two quantitative scholarly articles in relation to their research designs. Their designs were analyzed by the comparison of research references and research specific vocabulary to describe how various research methods were used. When researching and analyzing quantitative scholarly articles, it is imperative to…

  6. 1997 construction & design survey.

    PubMed

    Pinto, C

    1997-03-31

    Managed care might seem to be putting a damper on healthcare construction, but in fact it's one of several industry changes creating opportunities for architectural and design firms. One example of a trend toward making surroundings as pleasant as possible is the west campus expansion at East Texas Medical Center in Tyler (left). Designed and built by Ellerbe Becket and completed in 1995, the project, including a nine-story medical office building, features artwork and rooftop gardens. PMID:10165801

  7. Report on Solar Water Heating Quantitative Survey

    SciTech Connect

    Focus Marketing Services

    1999-05-06

    This report details the results of a quantitative research study undertaken to better understand the marketplace for solar water-heating systems from the perspective of home builders, architects, and home buyers.

  8. WATERSHED BASED SURVEY DESIGNS

    EPA Science Inventory

    The development of watershed-based design and assessment tools will help to serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional condition to meet Section 305(b), identification of impaired water bodies or wate...

  9. How To Design Surveys. The Survey Kit, Volume 5.

    ERIC Educational Resources Information Center

    Fink, Arlene

    The nine-volume Survey Kit is designed to help readers prepare and conduct surveys and become better users of survey results. All the books in the series contain instructional objectives, exercises and answers, examples of surveys in use, illustrations of survey questions, guidelines for action, checklists of "dos and don'ts," and annotated…

  10. Food Consumption and Handling Survey for Quantitative Microbiological Consumer Phase Risk Assessments.

    PubMed

    Chardon, Jurgen; Swart, Arno

    2016-07-01

    In the consumer phase of a typical quantitative microbiological risk assessment (QMRA), mathematical equations identify data gaps. To acquire useful data we designed a food consumption and food handling survey (2,226 respondents) for QMRA applications that is especially aimed at obtaining quantitative data. For a broad spectrum of food products, the survey covered the following topics: processing status at retail, consumer storage, preparation, and consumption. Questions were designed to facilitate distribution fitting. In the statistical analysis, special attention was given to the selection of the most adequate distribution to describe the data. Bootstrap procedures were used to describe uncertainty. The final result was a coherent quantitative consumer phase food survey and parameter estimates for food handling and consumption practices in The Netherlands, including variation over individuals and uncertainty estimates. PMID:27357043
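    The bootstrap description of uncertainty mentioned above can be sketched as follows; the data values and function name are hypothetical, not drawn from the survey:

```python
import random
import statistics

def bootstrap_ci(data, stat=statistics.mean, n_boot=2000, alpha=0.05, seed=1):
    """Percentile bootstrap confidence interval for a statistic: resample
    the data with replacement many times, recompute the statistic, and
    take the empirical alpha/2 and 1-alpha/2 quantiles."""
    rng = random.Random(seed)
    n = len(data)
    reps = sorted(stat([rng.choice(data) for _ in range(n)])
                  for _ in range(n_boot))
    lo = reps[int((alpha / 2) * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Hypothetical consumer storage times (days) reported by respondents.
storage_days = [1, 2, 2, 3, 3, 3, 4, 5, 7, 10, 14]
lo, hi = bootstrap_ci(storage_days)
```

    The same resampling loop applies to any fitted-distribution parameter, which is how uncertainty over individuals can be propagated into a QMRA consumer-phase model.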

  11. RESOLVE and ECO: Survey Design

    NASA Astrophysics Data System (ADS)

    Kannappan, Sheila; Moffett, Amanda J.; Norris, Mark A.; Eckert, Kathleen D.; Stark, David; Berlind, Andreas A.; Snyder, Elaine M.; Norman, Dara J.; Hoversten, Erik A.; RESOLVE Team

    2016-01-01

    The REsolved Spectroscopy Of a Local VolumE (RESOLVE) survey is a volume-limited census of stellar, gas, and dynamical mass as well as star formation and galaxy interactions within >50,000 cubic Mpc of the nearby cosmic web, reaching down to dwarf galaxies of baryonic mass ~10^9 Msun and spanning multiple large-scale filaments, walls, and voids. RESOLVE is surrounded by the ~10x larger Environmental COntext (ECO) catalog, with matched custom photometry and environment metrics enabling analysis of cosmic variance with greater statistical power. For the ~1500 galaxies in its two equatorial footprints, RESOLVE goes beyond ECO in providing (i) deep 21cm data with adaptive sensitivity ensuring HI mass detections or upper limits <10% of the stellar mass and (ii) 3D optical spectroscopy including both high-resolution ionized gas or stellar kinematic data for each galaxy and broad 320-725nm spectroscopy spanning [OII] 3727, Halpha, and Hbeta. RESOLVE is designed to complement other radio and optical surveys in providing diverse, contiguous, and uniform local/global environment data as well as unusually high completeness extending into the gas-dominated dwarf galaxy regime. RESOLVE also offers superb reprocessed photometry including full, deep NUV coverage and synergy with other equatorial surveys as well as unique northern and southern facilities such as Arecibo, the GBT, and ALMA. The RESOLVE and ECO surveys have been supported by funding from NSF grants AST-0955368 and OCI-1156614.

  12. Watershed-based survey designs

    USGS Publications Warehouse

    Detenbeck, N.E.; Cincotta, D.; Denver, J.M.; Greenlee, S.K.; Olsen, A.R.; Pitchford, A.M.

    2005-01-01

    Watershed-based sampling design and assessment tools help serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional conditions to meet Section 305(b), identification of impaired water bodies or watersheds to meet Section 303(d), and development of empirical relationships between causes or sources of impairment and biological responses. Creation of GIS databases for hydrography, hydrologically corrected digital elevation models, and hydrologic derivatives such as watershed boundaries and upstream-downstream topology of subcatchments would provide a consistent seamless nationwide framework for these designs. The elements of a watershed-based sample framework can be represented either as a continuous infinite set defined by points along a linear stream network, or as a discrete set of watershed polygons. Watershed-based designs can be developed with existing probabilistic survey methods, including the use of unequal probability weighting, stratification, and two-stage frames for sampling. Case studies for monitoring of Atlantic Coastal Plain streams, West Virginia wadeable streams, and coastal Oregon streams illustrate three different approaches for selecting sites for watershed-based survey designs. ?? Springer Science + Business Media, Inc. 2005.

  13. Qualities of a Psychiatric Mentor: A Quantitative Singaporean Survey

    ERIC Educational Resources Information Center

    Tor, Phern-Chern; Goh, Lee-Gan; Ang, Yong-Guan; Lim, Leslie; Winslow, Rasaiah-Munidasa; Ng, Beng-Yeong; Wong, Sze-Tai; Ng, Tse-Pin; Kia, Ee-Heok

    2011-01-01

    Objective: Psychiatric mentors are an important part of the new, seamless training program in Singapore. There is a need to assess the qualities of a good psychiatric mentor vis-a-vis those of a good psychiatrist. Method: An anonymous survey was sent out to all psychiatry trainees and psychiatrists in Singapore to assess quantitatively the…

  14. Examining Stress in Graduate Assistants: Combining Qualitative and Quantitative Survey Methods

    ERIC Educational Resources Information Center

    Mazzola, Joseph J.; Walker, Erin J.; Shockley, Kristen M.; Spector, Paul E.

    2011-01-01

    The aim of this study was to employ qualitative and quantitative survey methods in a concurrent mixed model design to assess stressors and strains in graduate assistants. The stressors most frequently reported qualitatively were work overload, interpersonal conflict, and organizational constraints; the most frequently reported psychological…

  15. WATERSHED-BASED SURVEY DESIGNS

    EPA Science Inventory

Watershed-based sampling design and assessment tools help serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional conditions to meet Section 305(b), identification of impaired water bodies or watersheds to meet Sectio...

  16. Armchair Survey Sampling: An Aid in Teaching Survey Design.

    ERIC Educational Resources Information Center

    Thompson, M. E.

    A fictitious community of 583 households was set up to simulate a survey population, and was used in two laboratory assignments where students "interviewed" householders by a quota sampling procedure and tested the performance of several probability sampling designs. (Author/JEG)

  17. Statistical considerations in designing raptor surveys

    USGS Publications Warehouse

    Pendleton, G.W.

    1989-01-01

    Careful sampling design is required to obtain useful estimates of raptor abundance. Well-defined objectives, selection of appropriate sample units and sampling scheme, and attention to detail to reduce extraneous sources of variability and error are all important considerations in designing a raptor survey.

  18. Strategies for joint geophysical survey design

    NASA Astrophysics Data System (ADS)

    Shakas, Alexis; Maurer, Hansruedi

    2015-04-01

In recent years, the use of multiple geophysical techniques to image the subsurface has become a popular option. Joint inversions of geophysical datasets are based on the assumption that the spatial variations of the different physical subsurface parameters exhibit structural similarities. In this work, we combine the benefits of joint inversions of geophysical datasets with recent innovations in optimized experimental design. These techniques maximize the data information content while minimizing the data acquisition costs. Experimental design has been used in geophysics over the last twenty years, but no attempt has yet been made to combine various geophysical imaging methods. We combine direct current geoelectrics, magnetotellurics and seismic refraction travel time tomography data to resolve synthetic 1D layered Earth models. An initial model for the subsurface structure can be taken from a priori geological information, and an optimal joint geophysical survey can be designed around the initial model. Another typical scenario includes an existing data set from a past survey and a subsequent survey that is planned to optimally complement the existing data. Our results demonstrate that the joint design methodology provides optimized combinations of data sets that include only a few data points. Nevertheless, they constrain the subsurface models as well as data from a densely sampled survey would. Furthermore, we examine the dependency of optimized survey design on the a priori model assumptions. Finally, we apply the methodology to geoelectric and seismic field data collected along 2D profiles.

  19. Survey of adaptive control using Liapunov design

    NASA Technical Reports Server (NTRS)

    Lindorff, D. P.; Carroll, R. L.

    1972-01-01

A survey was made of the literature devoted to the synthesis of model-tracking adaptive systems based on application of Liapunov's second method. The basic synthesis procedure is introduced, and extensions made to the theory since 1966 are critically reviewed. The extensions relate to design for relative stability, reduction-of-order techniques, design with disturbances, design with time-variable parameters, multivariable systems, identification, and an adaptive observer.

  20. Survey Design for Large-Scale, Unstructured Resistivity Surveys

    NASA Astrophysics Data System (ADS)

    Labrecque, D. J.; Casale, D.

    2009-12-01

In this paper, we discuss the issues in designing data collection strategies for large-scale, poorly structured resistivity surveys. Existing or proposed applications for these types of surveys include carbon sequestration, enhanced oil recovery monitoring, monitoring of leachate from working or abandoned mines, and mineral surveys. Electrode locations are generally chosen by land access, utilities, roads, existing wells, etc. Classical arrays such as the Wenner array or dipole-dipole arrays are not applicable if the electrodes cannot be placed in quasi-regular lines or grids. A new, far more generalized strategy is needed for building data collection schemes. Following the approach of earlier two-dimensional (2-D) survey designs, the proposed method begins by defining a base array. In 2-D design, this base array is often a standard dipole-dipole array. For unstructured three-dimensional (3-D) design, determining this base array is a multi-step process. The first step is to determine a set of base dipoles with similar characteristics. For example, the base dipoles may consist of electrode pairs trending within 30 degrees of north and with a length between 100 and 250 m. These dipoles are then combined into a trial set of arrays. This trial set of arrays is reduced by applying a series of filters based on criteria such as separation between the dipoles. Using the base array set, additional arrays are added and tested to determine the overall improvement in resolution and to determine an optimal set of arrays. Examples of the design process are shown for a proposed carbon sequestration monitoring system.
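    The base-dipole selection step described above can be sketched as a simple geometric filter; the thresholds mirror the example in the abstract, while the coordinates and helper names are illustrative assumptions:

```python
import math

def dipole_ok(e1, e2, max_trend_deg=30.0, min_len=100.0, max_len=250.0):
    """Check whether an electrode pair qualifies as a base dipole:
    trending within max_trend_deg of north and with a length inside
    [min_len, max_len] metres. Coordinates are (easting, northing)."""
    dx, dy = e2[0] - e1[0], e2[1] - e1[1]
    length = math.hypot(dx, dy)
    if not (min_len <= length <= max_len):
        return False
    # Azimuth from north; fold into [0, 90] degrees so that the
    # direction of the pair (A->B vs B->A) does not matter.
    azimuth = abs(math.degrees(math.atan2(dx, dy)))
    azimuth = min(azimuth, 180.0 - azimuth)
    return azimuth <= max_trend_deg

# Four electrode positions dictated by land access (hypothetical).
electrodes = [(0, 0), (0, 150), (60, 140), (300, 0)]
base_dipoles = [(a, b)
                for i, a in enumerate(electrodes)
                for b in electrodes[i + 1:]
                if dipole_ok(a, b)]
```

    In a full design the surviving dipoles would then be combined into trial four-electrode arrays and further filtered, e.g. on dipole separation, before the resolution-based optimization step.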

  1. Ambulance Design Survey 2011: A Summary Report

    PubMed Central

    Lee, Y Tina; Kibira, Deogratias; Feeney, Allison Barnard; Marshall, Jennifer

    2013-01-01

Current ambulance designs are ergonomically inefficient and often unsafe for practical treatment response to medical emergencies. Thus, the patient compartment of a moving ambulance is a hazardous working environment. As a consequence, emergency medical services (EMS) workers suffer fatalities and injuries that far exceed those of the average workplace in the United States. To reduce injury and mortality rates in ambulances, the Department of Homeland Security Science and Technology Directorate has teamed with the National Institute of Standards and Technology, the National Institute for Occupational Safety and Health, and BMT Designers & Planners in a joint project to produce science-based ambulance patient compartment design standards. This project will develop new crash-safety design standards and improved user-design interface guidance for patient compartments that are safer for EMS personnel and patients, and facilitate improved patient care. The project team has been working with practitioners, EMS workers' organizations, and manufacturers to solicit needs and requirements to address related issues. This paper presents an analysis of practitioners' concerns, needs, and requirements for improved designs elicited through the web-based survey of ambulance design held by the National Institute of Standards and Technology. This paper also introduces the survey, analyzes the survey results, and discusses recommendations for future ambulance patient compartment design. PMID:26401439

  2. Design-based and model-based inference in surveys of freshwater mollusks

    USGS Publications Warehouse

    Dorazio, R.M.

    1999-01-01

    Well-known concepts in statistical inference and sampling theory are used to develop recommendations for planning and analyzing the results of quantitative surveys of freshwater mollusks. Two methods of inference commonly used in survey sampling (design-based and model-based) are described and illustrated using examples relevant in surveys of freshwater mollusks. The particular objectives of a survey and the type of information observed in each unit of sampling can be used to help select the sampling design and the method of inference. For example, the mean density of a sparsely distributed population of mollusks can be estimated with higher precision by using model-based inference or by using design-based inference with adaptive cluster sampling than by using design-based inference with conventional sampling. More experience with quantitative surveys of natural assemblages of freshwater mollusks is needed to determine the actual benefits of different sampling designs and inferential procedures.
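    A minimal sketch of the design-based side of this contrast, estimating mean density from quadrat counts under simple random sampling without replacement; the counts, quadrat size, and frame size are hypothetical:

```python
import math
import statistics

def srs_mean_density(counts, quadrat_area, n_population_units):
    """Design-based estimate of mean density (individuals per unit
    area) and its standard error under simple random sampling without
    replacement from a frame of n_population_units quadrat positions."""
    densities = [c / quadrat_area for c in counts]
    n = len(densities)
    mean = statistics.mean(densities)
    fpc = 1 - n / n_population_units  # finite population correction
    se = math.sqrt(fpc * statistics.variance(densities) / n)
    return mean, se

# Hypothetical mussel counts from ten 0.25 m^2 quadrats drawn at
# random from 500 possible quadrat positions.
counts = [0, 0, 1, 0, 2, 0, 0, 3, 0, 1]
mean, se = srs_mean_density(counts, quadrat_area=0.25, n_population_units=500)
```

    For a sparsely distributed population like this one, the abstract's point is that model-based inference or adaptive cluster sampling can yield a more precise estimate than this conventional design-based estimator.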

  3. Survey of quantitative antimicrobial consumption in two different pig finishing systems.

    PubMed

    Moreno, M A

    2012-09-29

The primary objectives of this study were to: (a) collect on-farm antimicrobial use (AMU) data in fattener pigs employing two questionnaire-based surveys; (b) assess different quantitative measures for quantifying AMU in fattener pigs; (c) compare AMU in fattener pigs between two different management systems producing finishers: farrow-to-finish (FtF) farms versus finisher farms. Two questionnaires were designed, both containing five groups of questions focused on the responder, the farm, and AMU (e.g., in-feed, in-drinking-water, and parenteral); both surveys were carried out by means of personal face-to-face interviews. Both surveys started with a sample size of 108 potentially eligible farms per survey; in the end, 67 finisher farms and 49 FtF farms were recruited. Overall percentages of animals exposed to antimicrobials (AMs) were high (90 per cent in finisher farms and 54 per cent in FtF farms); colistin (61 per cent and 33 per cent) and doxycycline (62 per cent and 23 per cent) were the most common AMs, followed by amoxicillin (51 per cent and 19 per cent) and lincomycin (49 per cent), respectively. Questionnaire-based surveys using face-to-face interviews are useful for capturing information regarding AMU at the farm level. Farm-level data per administration route can be used for comparative AMU analysis between farms. Nevertheless, for the analysis of the putative relationships between AMU and AM resistance, measures based on exposed animals or exposure events are needed. PMID:22915683

  4. Spatially balanced survey designs for natural resources

    EPA Science Inventory

    Ecological resource monitoring programs typically require the use of a probability survey design to select locations or entities to be physically sampled in the field. The ecological resource of interest, the target population, occurs over a spatial domain and the sample selecte...

  5. Survey: Computer Usage in Design Courses.

    ERIC Educational Resources Information Center

    Henley, Ernest J.

    1983-01-01

    Presents results of a survey of chemical engineering departments regarding computer usage in senior design courses. Results are categorized according to: computer usage (use of process simulators, student-written programs, faculty-written or "canned" programs; costs (hard and soft money); and available software. Programs offered are listed in a…

  6. The XMM-LSS survey. Survey design and first results

    NASA Astrophysics Data System (ADS)

    Pierre, Marguerite; Valtchanov, Ivan; Altieri, Bruno; Andreon, Stefano; Bolzonella, Micol; Bremer, Malcolm; Disseau, Ludovic; Dos Santos, Sergio; Gandhi, Poshak; Jean, Christophe; Pacaud, Florian; Read, Andrew; Refregier, Alexandre; Willis, Jon; Adami, Christophe; Alloin, Danielle; Birkinshaw, Mark; Chiappetti, Lucio; Cohen, Aaron; Detal, Alain; Duc, Pierre-Alain; Gosset, Eric; Hjorth, Jens; Jones, Laurence; Le Fèvre, Olivier; Lonsdale, Carol; Maccagni, Dario; Mazure, Alain; McBreen, Brian; McCracken, Henry; Mellier, Yannick; Ponman, Trevor; Quintana, Hernan; Rottgering, Huub; Smette, Alain; Surdej, Jean; Starck, Jean-Luc; Vigroux, Laurent; White, Simon

    2004-09-01

The XMM Large Scale Structure survey (XMM-LSS) is a medium-deep, large-area X-ray survey. Its goal is to extend large-scale structure investigations attempted using ROSAT cluster samples to two redshift bins between 0 < z < 1. Two main themes drive the survey design: the evolutionary study of the cluster-cluster correlation function and of the cluster number density. The adopted observing configuration consists of an equatorial mosaic of 10 ks pointings, separated by 20' and covering 8° × 8°, giving a point-source sensitivity of ~5 × 10^-15 erg cm^-2 s^-1 in the 0.5-2 keV band. This will yield more than 800 clusters of galaxies and a sample of X-ray AGN with a space density of about 300 deg^-2. We present the expected cosmological implications of the survey in the context of ΛCDM models and cluster evolution. We give an overview of the first observational results. The XMM-LSS survey is associated with several other major surveys, ranging from the UV to the radio wavebands, which will provide the necessary resources for X-ray source identification and further statistical studies. In particular, the associated CFHTLS weak lensing and AMiBA Sunyaev-Zel'dovich surveys over the entire XMM-LSS area will provide for the first time a comprehensive study of the mass distribution and of cluster physics in the universe on scales of a few hundred Mpc. We describe the main characteristics of our wavelet-based X-ray pipeline and source identification procedures, including the classification of the cluster candidates by means of a photometric redshift analysis. This permits the selection of suitable targets for spectroscopic follow-up. We present preliminary results from the first 25 XMM-LSS pointings: X-ray source properties, optical counterparts, and highlights from the first Magellan and VLT/FORS2 spectroscopic runs, as well as preliminary results from the NIR search for z>1

  7. 76 FR 9637 - Proposed Information Collection (Veteran Suicide Prevention Online Quantitative Surveys) Activity...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-18

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF VETERANS AFFAIRS Proposed Information Collection (Veteran Suicide Prevention Online Quantitative Surveys) Activity... the comment period, comments may be viewed online through FDMS. FOR FURTHER INFORMATION...

  8. The Dark Energy Survey CCD imager design

    SciTech Connect

    Cease, H.; DePoy, D.; Diehl, H.T.; Estrada, J.; Flaugher, B.; Guarino, V.; Kuk, K.; Kuhlmann, S.; Schultz, K.; Schmitt, R.L.; Stefanik, A.; /Fermilab /Ohio State U. /Argonne

    2008-06-01

The Dark Energy Survey is planning to use a 3 sq. deg. camera that houses a ~0.5 m diameter focal plane of 62 2k×4k CCDs. The camera vessel, including the optical window cell, focal plate, focal plate mounts, cooling system and thermal controls, is described. As part of the development of the mechanical and cooling design, a full-scale prototype camera vessel has been constructed and is now being used for multi-CCD readout tests. Results from this prototype camera are described.

  9. Integrated survey and design for transmission lines

    SciTech Connect

    Miller, M.A.; Simpson, K.D.

    1994-12-31

Gathering and compiling information on the features and uses of the land within a proposed corridor provides the basis for selecting a route, obtaining easements, and designing and constructing a transmission line. Traditionally, gathering this information involved searches of existing maps and records to obtain the available information, which would then be supplemented with aerial photography to record current conditions. Ground surveys were performed to collect topographic data for design purposes. This information was manually transferred to drawings and other documents to show the terrain, environmentally sensitive areas, property ownership, and existing facilities. These drawings served as the base to which the transmission line right-of-way, structures, and other design information were added. As the design was completed, these drawings became the source of information for constructing the line and, ultimately, the record of the facility. New technologies and the ever-growing need for instantly accessible information have resulted in changes in almost every step of gathering, storing, and using information. Electronic data collection, global positioning systems (GPS), digitized terrain models, computerized design techniques, development of drawings using CAD, and geographic information systems (GIS) have individually resulted in significant advancements in this process. Combining these components into an integrated system, however, is truly revolutionizing transmission line engineering. This paper gives an overview of the survey and mapping information that is required for transmission line projects, reviews the traditional techniques that have been employed to obtain and utilize this information, and discusses the recent advances in the technology. Additionally, a system is presented that integrates the components in this process to achieve efficiency, minimize chances of errors, and provide improved access to project information.

  10. Design of future surveys: chapter 13

    USGS Publications Warehouse

    Bart, Jonathan; Smith, Paul A.

    2012-01-01

    This brief chapter addresses two related issues: how effort should be allocated to different parts of the sampling plan and, given optimal allocation, how large a sample will be required to achieve the PRISM accuracy target. Simulations based on data collected to date showed that 2 plots per cluster on rapid surveys, 2 intensive camps per field crew-year, 2-4 intensive plots per intensive camp, and 2-3 rapid surveys per intensive plot is the most efficient allocation of resources. Using this design, we investigated how crew-years should be allocated to each region in order to meet the PRISM accuracy target most efficiently. The analysis indicated that 40-50 crew-years would achieve the accuracy target for 18-24 of the 26 species breeding widely in the Arctic. This analysis was based on assuming that two rounds of surveys were conducted and that a 50% decline occurred between them. We discuss the complexity of making these estimates and why they should be viewed as first approximations.

  11. Young people, alcohol, and designer drinks: quantitative and qualitative study.

    PubMed Central

    Hughes, K.; MacKintosh, A. M.; Hastings, G.; Wheeler, C.; Watson, J.; Inglis, J.

    1997-01-01

OBJECTIVE: To examine the appeal of "designer drinks" to young people. DESIGN: Qualitative and quantitative research comprising group discussions and questionnaire-led interviews with young people, accompanied by a self-completion questionnaire. SETTINGS: Argyll and Clyde Health Board area, west Scotland. SUBJECTS: Eight groups aged 12-17 years; 824 aged 12-17 recruited by multistage cluster probability sample from the community health index. RESULTS: Young people were familiar with designer drinks, especially MD 20/20 and leading brands of strong white cider. Attitudes towards these drinks varied quite distinctly with age, clearly reflecting their attitudes towards and motivations for drinking in general. The brand imagery of designer drinks, in contrast with that of more mainstream drinks, matched many 14 and 15 year olds' perceptions and expectations of drinking. Popularity of designer drinks peaked between the ages of 13 and 16, while more conventional drinks showed a consistent increase in popularity with age. Consumption of designer drinks tended to be in less controlled circumstances and was associated with heavier alcohol intake and greater drunkenness. CONCLUSIONS: Designer drinks are a cause for concern. They appeal to young people, often more so than conventional drinks, and are particularly attractive to 14-16 year olds. Consumption of designer drinks is also associated with drinking in less controlled environments, heavier drinking, and greater drunkenness. There is a need for policy debate to assess the desirability of these drinks and the extent to which further controls on their marketing are required. PMID:9040387

  12. The Design of Grids in Web Surveys

    PubMed Central

    Couper, Mick P.; Tourangeau, Roger; Conrad, Frederick G.; Zhang, Chan

    2014-01-01

    Grid or matrix questions are associated with a number of problems in Web surveys. In this paper, we present results from two experiments testing the design of grid questions to reduce breakoffs, missing data, and satisficing. The first examines dynamic elements to help guide respondents through the grid and the effect of splitting a larger grid into component pieces. The second manipulates the visual complexity of the grid and ways of simplifying it. We find that using dynamic feedback to guide respondents through a multi-question grid helps reduce missing data. Splitting the grids into component questions further reduces missing data and motivated underreporting. The visual complexity of the grid appeared to have little effect on performance. PMID:25258472

  13. Quantitative proteomic survey of endoplasmic reticulum in mouse liver.

    PubMed

    Song, Yanping; Jiang, Ying; Ying, Wantao; Gong, Yan; Yan, Yujuan; Yang, Dong; Ma, Jie; Xue, Xiaofang; Zhong, Fan; Wu, Songfeng; Hao, Yunwei; Sun, Aihua; Li, Tao; Sun, Wei; Wei, Handong; Zhu, Yunping; Qian, Xiaohong; He, Fuchu

    2010-03-01

    To gain a better understanding of the critical function of the endoplasmic reticulum (ER) in liver, we carried out a proteomic survey of mouse liver ER. The ER proteome was profiled with a new three-dimensional, gel-based strategy. From 6152 and 6935 MS spectra, 903 and 1042 proteins were identified with at least two peptide matches at 95% confidence in the rough (r) and smooth (s) ER, respectively. Comparison of the rER and sER proteomes showed that calcium-binding proteins are significantly enriched in the sER, suggesting that the ion-binding function of the ER is compartmentalized. Comparison of the rat and mouse ER proteomes showed that 662 proteins were common to both, comprising 53.5% and 49.3% of those proteomes, respectively. We propose that these proteins are stably expressed proteins essential for the maintenance of ER function; GO annotation with a hypergeometric model supported this hypothesis. Unexpectedly, 210 unknown proteins and some proteins previously reported to occur in the cytosol were highly enriched in the ER. This study provides a reference map for the ER proteome of liver. Identification of new ER proteins will enhance our current understanding of the ER and also suggest new functions for this organelle. PMID:20073521

  14. Online Survey Design and Development: A Janus-Faced Approach

    ERIC Educational Resources Information Center

    Lauer, Claire; McLeod, Michael; Blythe, Stuart

    2013-01-01

    In this article we propose a "Janus-faced" approach to survey design--an approach that encourages researchers to consider how they can design and implement surveys more effectively using the latest web and database tools. Specifically, this approach encourages researchers to look two ways at once; attending to both the survey interface…

  15. Survey of rural, private wells. Statistical design

    USGS Publications Warehouse

    Mehnert, Edward; Schock, Susan C.

    1991-01-01

    Half of Illinois' 38 million acres were planted in corn and soybeans in 1988. On the 19 million acres planted in corn and soybeans, approximately 1 million tons of nitrogen fertilizer and 50 million pounds of pesticides were applied. Because groundwater is the water supply for over 90 percent of rural Illinois, the occurrence of agricultural chemicals in groundwater in Illinois is of interest to the agricultural community, the public, and regulatory agencies. The occurrence of agricultural chemicals in groundwater is well documented. However, the extent of this contamination still needs to be defined. This can be done by randomly sampling wells across a geographic area. Key elements of a random, water-well sampling program for regional groundwater quality include the overall statistical design of the program, definition of the sample population, selection of wells to be sampled, and analysis of survey results. These elements must be consistent with the purpose for conducting the program; otherwise, the program will not provide the desired information. The need to carefully design and conduct a sampling program becomes readily apparent when one considers the high cost of collecting and analyzing a sample. For a random sampling program conducted in Illinois, the key elements, as well as the limitations imposed by available information, are described.
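    The design question raised here ultimately fixes how many wells must be sampled. A hedged illustration follows: the formula is the standard normal-approximation sample size for estimating a proportion under simple random sampling, not necessarily the one used in the Illinois program, and the planning values are hypothetical.

    ```python
    import math

    def sample_size_proportion(p, margin, z=1.96):
        """Wells needed so a 95% confidence interval for a contamination
        rate near p has half-width <= margin (simple random sampling,
        large population, normal approximation)."""
        return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

    # Worst-case planning value p = 0.5; estimate the rate to within +/-5 points.
    n_wells = sample_size_proportion(0.5, 0.05)  # -> 385
    ```

    The worst-case value p = 0.5 maximizes p(1 - p), so the resulting n is conservative; with a prior estimate of the contamination rate, the required sample shrinks, which matters when each sample is costly to collect and analyze.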

  16. Adaptive time-lapse optimized survey design for electrical resistivity tomography monitoring

    NASA Astrophysics Data System (ADS)

    Wilkinson, Paul B.; Uhlemann, Sebastian; Meldrum, Philip I.; Chambers, Jonathan E.; Carrière, Simon; Oxby, Lucy S.; Loke, M. H.

    2015-10-01

    Adaptive optimal experimental design methods use previous data and results to guide the choice and design of future experiments. This paper describes the formulation of an adaptive survey design technique to produce optimal resistivity imaging surveys for time-lapse geoelectrical monitoring experiments. These survey designs are time-dependent and, compared to dipole-dipole or static optimized surveys that do not change over time, focus a greater degree of the image resolution on regions of the subsurface that are actively changing. The adaptive optimization method is validated using a controlled laboratory monitoring experiment comprising a well-defined cylindrical target moving along a trajectory that changes its depth and lateral position. The algorithm is implemented on a standard PC in conjunction with a modified automated multichannel resistivity imaging system. Data acquisition using the adaptive survey designs requires no more time or power than with comparable standard surveys, and the algorithm processing takes place while the system batteries recharge. The results show that adaptively designed optimal surveys yield a quantitative increase in image quality over and above that produced by using standard dipole-dipole or static (time-independent) optimized surveys.

  17. Quantitative survey on the shape of the back of men's head as viewed from the side.

    PubMed

    Tamir, Abraham

    2013-05-01

    This article classifies the shape of the back part of men's heads, viewed from the side, into 4 quantitatively defined shapes, which are demonstrated in some of the figures in this article. For self-evident reasons, the shapes were blurred. The survey is based on the analysis of 2220 shapes obtained by photographing mainly bald men and by finding pictures on the Internet. To the best of the author's knowledge, this quantitative approach has never been implemented before. The results obtained are as follows: the percentage of 376 "flat heads" is 17%; the percentage of 755 "little round heads," 34%; the percentage of 1017 "round heads," 45.8%; and the percentage of 72 "very round heads," 3.2%. This quantitative survey is an additional step in the quantitative analysis of the parts of the face; in articles previously published or forthcoming in this journal, the shapes of the nose, ear conch, and human eye were analyzed quantitatively, as were the shapes of the leg toes. Finally, it should be noted that, for obvious reasons, the survey is based on men's heads, most of them bald. PMID:23714907

  18. Design Effects and the Analysis of Survey Data.

    ERIC Educational Resources Information Center

    Folsom, Ralph E.; Williams, Rick L.

    The National Assessment of Educational Progress (NAEP), like most large national surveys, employs a complex stratified multistage unequal probability sample. The design provides a rigorous justification for extending survey results to the entire U.S. target population. Developments in the analysis of data from complex surveys which provide a…

  19. Design and Validation of the Quantum Mechanics Conceptual Survey

    ERIC Educational Resources Information Center

    McKagan, S. B.; Perkins, K. K.; Wieman, C. E.

    2010-01-01

    The Quantum Mechanics Conceptual Survey (QMCS) is a 12-question survey of students' conceptual understanding of quantum mechanics. It is intended to be used to measure the relative effectiveness of different instructional methods in modern physics courses. In this paper, we describe the design and validation of the survey, a process that included…

  20. Quantitative estimation of minimum offset for multichannel surface-wave survey with actively exciting source

    USGS Publications Warehouse

    Xu, Y.; Xia, J.; Miller, R.D.

    2006-01-01

    Multichannel analysis of surface waves is a developing method widely used in shallow subsurface investigations. The field procedures and related parameters are very important for successful applications. Among these parameters, the source-receiver offset range is seldom discussed in theory and is normally determined by empirical or semi-quantitative methods in current practice. This paper discusses the problem from a theoretical perspective. A formula for quantitatively evaluating the minimum offset for a layered homogeneous elastic model was developed. The analytical results based on simple models and experimental data demonstrate that the formula is correct for surface-wave surveys in near-surface applications. © 2005 Elsevier B.V. All rights reserved.

  1. Designing community surveys to provide a basis for noise policy

    NASA Technical Reports Server (NTRS)

    Fields, J. M.

    1980-01-01

    After examining reports from a large number of social surveys, two areas were identified where methodological improvements in the surveys would be especially useful for public policy. The two study areas are: the definition of noise indexes and the assessment of noise impact. Improvements in the designs of surveys are recommended which would increase the validity and reliability of the noise indexes. Changes in interview questions and sample designs are proposed which would enable surveys to provide measures of noise impact which are directly relevant for public policy.

  2. Quantitative Trait Loci (QTL) Detection in Multicross Inbred Designs

    PubMed Central

    Crepieux, Sébastien; Lebreton, Claude; Servin, Bertrand; Charmet, Gilles

    2004-01-01

    Mapping quantitative trait loci in plants is usually conducted using a population derived from a cross between two inbred lines. The power of such QTL detection and the parameter estimates depend largely on the choice of the two parental lines. Thus, the QTL detected in such populations represent only a small part of the genetic architecture of the trait. In addition, the effects of only two alleles are characterized, which is of limited interest to the breeder, while common pedigree breeding material remains unexploited for QTL mapping. In this study, we extend QTL mapping methodology to a generalized framework, based on a two-step IBD variance component approach, applicable to any type of breeding population obtained from inbred parents. We then investigate with simulated data mimicking conventional breeding programs the influence of different estimates of the IBD values on the power of QTL detection. The proposed method would provide an alternative to the development of specifically designed recombinant populations, by utilizing the genetic variation actually managed by plant breeders. The use of these detected QTL in assisting breeding would thus be facilitated. PMID:15579720

  3. Designing occupancy studies: general advice and allocating survey effort

    USGS Publications Warehouse

    MacKenzie, D.I.; Royle, J. Andrew

    2005-01-01

    1. The fraction of sampling units in a landscape where a target species is present (occupancy) is an extensively used concept in ecology. Yet in many applications the species will not always be detected in a sampling unit even when present, resulting in biased estimates of occupancy. Given that sampling units are surveyed repeatedly within a relatively short timeframe, a number of similar methods have now been developed to provide unbiased occupancy estimates. However, practical guidance on the efficient design of occupancy studies has been lacking. 2. In this paper we comment on a number of general issues related to designing occupancy studies, including the need for clear objectives that are explicitly linked to science or management, selection of sampling units, timing of repeat surveys and allocation of survey effort. Advice on the number of repeat surveys per sampling unit is considered in terms of the variance of the occupancy estimator, for three possible study designs. 3. We recommend that sampling units should be surveyed a minimum of three times when detection probability is high (>0.5 per survey), unless a removal design is used. 4. We found that an optimal removal design will generally be the most efficient, but we suggest it may be less robust to assumption violations than a standard design. 5. Our results suggest that for a rare species it is more efficient to survey more sampling units less intensively, while for a common species fewer sampling units should be surveyed more intensively. 6. Synthesis and applications. Reliable inferences can only result from quality data. To make the best use of logistical resources, study objectives must be clearly defined; sampling units must be selected, and repeated surveys timed appropriately; and a sufficient number of repeated surveys must be conducted. Failure to do so may compromise the integrity of the study. The guidance given here on study design issues is particularly applicable to studies of species
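    The repeat-survey recommendations above rest on the cumulative detection probability p* = 1 - (1 - p)^k for k independent visits. A minimal sketch of that arithmetic (illustrative only; the variance-based optimization in the paper is more involved):

    ```python
    import math

    def p_star(p, k):
        """Probability that a species present at a site is detected at least
        once across k independent surveys with per-survey detection prob p."""
        return 1.0 - (1.0 - p) ** k

    def min_surveys(p, target=0.95):
        """Smallest number of repeat surveys giving cumulative detection
        probability of at least `target`."""
        return math.ceil(math.log(1.0 - target) / math.log(1.0 - p))

    # Three visits at p = 0.5 already give 87.5% cumulative detection;
    # reaching 95% cumulative detection requires five visits.
    ```

    This shows why the advice pivots on per-survey detectability: when p is low, k grows quickly, which is exactly the regime where surveying more units less intensively stops being efficient for a common species.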

  4. Variance estimation for systematic designs in spatial surveys.

    PubMed

    Fewster, R M

    2011-12-01

    In spatial surveys for estimating the density of objects in a survey region, systematic designs will generally yield lower variance than random designs. However, estimating the systematic variance is well known to be a difficult problem. Existing methods tend to overestimate the variance, so although the variance is genuinely reduced, it is over-reported, and the gain from the more efficient design is lost. The current approaches to estimating a systematic variance for spatial surveys are to approximate the systematic design by a random design, or approximate it by a stratified design. Previous work has shown that approximation by a random design can perform very poorly, while approximation by a stratified design is an improvement but can still be severely biased in some situations. We develop a new estimator based on modeling the encounter process over space. The new "striplet" estimator has negligible bias and excellent precision in a wide range of simulation scenarios, including strip-sampling, distance-sampling, and quadrat-sampling surveys, and including populations that are highly trended or have strong aggregation of objects. We apply the new estimator to survey data for the spotted hyena (Crocuta crocuta) in the Serengeti National Park, Tanzania, and find that the reported coefficient of variation for estimated density is 20% using approximation by a random design, 17% using approximation by a stratified design, and 11% using the new striplet estimator. This large reduction in reported variance is verified by simulation. PMID:21534940
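    The opening claim, that systematic designs generally yield lower variance than random designs, is easy to verify on a toy trended population. The sketch below is illustrative only (it is not the striplet estimator, and the transect, densities, and quadrat counts are invented):

    ```python
    import random

    random.seed(1)

    # Toy transect of 1000 cells whose density trends linearly along its length.
    N_CELLS, N_QUADRATS, REPS = 1000, 10, 2000
    density = [0.02 * i for i in range(N_CELLS)]  # strongly trended population

    def estimate_total(cells):
        """Expand the sampled-cell total to the whole transect."""
        return sum(density[c] for c in cells) * (N_CELLS / len(cells))

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # Simple random sampling of quadrats vs. systematic with a random start.
    srs = [estimate_total(random.sample(range(N_CELLS), N_QUADRATS))
           for _ in range(REPS)]
    step = N_CELLS // N_QUADRATS
    sys_ = [estimate_total(range(random.randrange(step), N_CELLS, step))
            for _ in range(REPS)]
    ```

    On this trended population the systematic estimates have roughly an order of magnitude lower variance, which is precisely why treating a systematic design as if it were random over-reports the variance, the problem the striplet estimator addresses.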

  5. Survey of adaptive control using Liapunov design

    NASA Technical Reports Server (NTRS)

    Lindorff, D. P.; Carroll, R. L.

    1973-01-01

    A survey of the literature in which Liapunov's second method is used in determining the control law is presented, with emphasis placed on the model-tracking adaptive control problem. Forty references are listed. Following a brief tutorial exposition of the adaptive control problem, the techniques for treating reduction of order, disturbance and time-varying parameters, multivariable systems, identification, and adaptive observers are discussed. The method is critically evaluated, particularly with respect to possibilities for application.

  6. Designing surveys for tests of gravity.

    PubMed

    Jain, Bhuvnesh

    2011-12-28

    Modified gravity theories may provide an alternative to dark energy to explain cosmic acceleration. We argue that the observational programme developed to test dark energy needs to be augmented to capture new tests of gravity on astrophysical scales. Several distinct signatures of gravity theories exist outside the 'linear' regime, especially owing to the screening mechanism that operates inside halos such as the Milky Way to ensure that gravity tests in the solar system are satisfied. This opens up several decades in length scale and classes of galaxies at low redshift that can be exploited by surveys. While theoretical work on models of gravity is in the early stages, we can already identify new regimes that cosmological surveys could target to test gravity. These include: (i) a small-scale component that focuses on the interior and vicinity of galaxy and cluster halos, (ii) spectroscopy of low-redshift galaxies, especially galaxies smaller than the Milky Way, in environments that range from voids to clusters, and (iii) a programme of combining lensing and dynamical information, from imaging and spectroscopic surveys, respectively, on the same (or statistically identical) sample of galaxies. PMID:22084295

  7. Survey of Fashion Design Employers. Volume IX, No. 16.

    ERIC Educational Resources Information Center

    Aurand, Cecilia; Lucas, John A.

    A survey was conducted to determine the availability of internship opportunities for fashion design students at Harper College and to measure the value of Harper design graduates to their employers. A sample of 279 manufacturers, contacts, and retail stores employing fashion designers were identified in the Chicago metropolitan area and after two…

  8. A survey of spacecraft thermal design solutions

    NASA Technical Reports Server (NTRS)

    Humphries, R.; Wegrich, R.; Pierce, E.; Patterson, W.

    1991-01-01

    A number of thermal projects are outlined giving a perspective on the scope and depth of activities in the thermal control group. A set of designs are presented in a form to illustrate some of the more innovative work. Design configurations, solution techniques, and flight anomalies are discussed. Activities include the instruments of the Hubble Space Telescope, Space Station Freedom, and Spacelab.

  9. Use of multispectral data in design of forest sample surveys

    NASA Technical Reports Server (NTRS)

    Titus, S. J.; Wensel, L. C.

    1977-01-01

    The use of multispectral data in design of forest sample surveys using a computer software package is described. The system allows evaluation of a number of alternative sampling systems and, with appropriate cost data, estimates the implementation cost for each.

  10. 7. Historic American Buildings Survey ORIGINAL DESIGN SUBMITTED BY PEABODY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. Historic American Buildings Survey ORIGINAL DESIGN SUBMITTED BY PEABODY AND STEARNS (FROM THE ORIGINAL IN THE LIBRARY OF THE VOLTA BUREAU) - Volta Bureau, 1537 Thirty-fifth Street Northwest, Washington, District of Columbia, DC

  11. The Dark Energy Survey instrument design

    SciTech Connect

    Flaugher, B.; /Fermilab

    2006-05-01

    We describe a new project, the Dark Energy Survey (DES), aimed at measuring the dark energy equation of state parameter, w, to a statistical precision of ~5%, with four complementary techniques. The survey will use a new 3 sq. deg. mosaic camera (DECam) mounted at the prime focus of the Blanco 4m telescope at the Cerro Tololo Inter-American Observatory (CTIO). DECam includes a large mosaic camera, a five-element optical corrector, four filters (g,r,i,z), and the associated infrastructure for operation in the prime focus cage. The focal plane consists of 62 2K x 4K CCD modules (0.27"/pixel) arranged in a hexagon inscribed within the 2.2 deg. diameter field of view. We plan to use the 250 micron thick fully-depleted CCDs that have been developed at the Lawrence Berkeley National Laboratory (LBNL). At Fermilab, we will establish a packaging factory to produce four-side buttable modules for the LBNL devices, as well as to test and grade the CCDs. R&D is underway and delivery of DECam to CTIO is scheduled for 2009.

  12. Sample design for the residential energy consumption survey

    SciTech Connect

    Not Available

    1994-08-01

    The purpose of this report is to provide detailed information about the multistage area-probability sample design used for the Residential Energy Consumption Survey (RECS). It is intended as a technical report, for use by statisticians, to better understand the theory and procedures followed in the creation of the RECS sample frame. For a more cursory overview of the RECS sample design, refer to the appendix entitled "How the Survey was Conducted," which is included in the statistical reports produced for each RECS survey year.

  13. Surveying clinicians by web: current issues in design and administration.

    PubMed

    Dykema, Jennifer; Jones, Nathan R; Piché, Tara; Stevenson, John

    2013-09-01

    The versatility, speed, and reduced costs with which web surveys can be conducted with clinicians are often offset by low response rates. Drawing on best practices and general recommendations in the literature, we provide an evidence-based overview of methods for conducting online surveys with providers. We highlight important advantages and disadvantages of conducting provider surveys online and include a review of differences in response rates between web and mail surveys of clinicians. When administered online, design-based features affect rates of survey participation and data quality. We examine features likely to have an impact including sample frames, incentives, contacts (type, timing, and content), mixed-mode approaches, and questionnaire length. We make several recommendations regarding optimal web-based designs, but more empirical research is needed, particularly with regard to identifying which combinations of incentive and contact approaches yield the highest response rates and are the most cost-effective. PMID:23975760

  14. Optical Design for a Survey X-Ray Telescope

    NASA Technical Reports Server (NTRS)

    Saha, Timo T.; Zhang, William W.; McClelland, Ryan S.

    2014-01-01

    Optical design trades are underway at the Goddard Space Flight Center to define a telescope for an x-ray survey mission. Top-level science objectives of the mission include the study of x-ray transients, surveying and long-term monitoring of compact objects in nearby galaxies, as well as both deep and wide-field x-ray surveys. In this paper we consider Wolter, Wolter-Schwarzschild, and modified Wolter-Schwarzschild telescope designs as basic building blocks for the tightly nested survey telescope. Design principles and dominating aberrations of individual telescopes and nested telescopes are discussed and we compare the off-axis optical performance at 1.0 keV and 4.0 keV across a 1.0-degree full field-of-view.

  15. Survey of quantitative data on the solar energy and its spectra distribution

    NASA Technical Reports Server (NTRS)

    Thekaekara, M. P.

    1976-01-01

    This paper presents a survey of available quantitative data on the total and spectral solar irradiance at ground level and outside the atmosphere. Measurements from research aircraft have resulted in the currently accepted NASA/ASTM standards of the solar constant and zero air mass solar spectral irradiance. The intrinsic variability of solar energy output and programs currently under way for more precise measurements from spacecraft are discussed. Instrumentation for solar measurements and their reference radiation scales are examined. Insolation data available from the records of weather stations are reviewed for their applicability to solar energy conversion. Two alternate methods of solarimetry are briefly discussed.

  16. Magnetic resonance elastography hardware design: a survey.

    PubMed

    Tse, Z T H; Janssen, H; Hamed, A; Ristic, M; Young, I; Lamperth, M

    2009-05-01

    Magnetic resonance elastography (MRE) is an emerging technique capable of measuring the shear modulus of tissue. A suspected tumour can be identified by comparing its properties with those of tissues surrounding it; this can be achieved even in deep-lying areas as long as mechanical excitation is possible. This would allow non-invasive methods for cancer-related diagnosis in areas not accessible with conventional palpation. An actuating mechanism is required to generate the necessary tissue displacements directly on the patient in the scanner and three different approaches, in terms of actuator action and position, exist to derive stiffness measurements. However, the magnetic resonance (MR) environment places considerable constraints on the design of such devices, such as the possibility of mutual interference between electrical components, the scanner field, and radio frequency pulses, and the physical space restrictions of the scanner bore. This paper presents a review of the current solutions that have been developed for MRE devices giving particular consideration to the design criteria including the required vibration frequency and amplitude in different applications, the issue of MR compatibility, actuation principles, design complexity, and scanner synchronization issues. The future challenges in this field are also described. PMID:19499839

  17. Hemostatic assessment, treatment strategies, and hematology consultation in massive postpartum hemorrhage: results of a quantitative survey of obstetrician-gynecologists

    PubMed Central

    James, Andra H; Cooper, David L; Paidas, Michael J

    2015-01-01

    Objective To assess potential diagnostic and practice barriers to successful management of massive postpartum hemorrhage (PPH), emphasizing recognition and management of contributing coagulation disorders. Study design A quantitative survey was conducted to assess practice patterns of US obstetrician-gynecologists in managing massive PPH, including assessment of coagulation. Results Nearly all (98%) of the 50 obstetrician-gynecologists participating in the survey reported having encountered at least one patient with “massive” PPH in the past 5 years. Approximately half (52%) reported having previously discovered an underlying bleeding disorder in a patient with PPH, with disseminated intravascular coagulation (88%, n=23/26) being identified more often than von Willebrand disease (73%, n=19/26). All reported having used methylergonovine and packed red blood cells in managing massive PPH, while 90% reported performing a hysterectomy. A drop in blood pressure and ongoing visible bleeding were the most commonly accepted indications for rechecking a “stat” complete blood count and coagulation studies, respectively, in patients with PPH; however, 4% of respondents reported that they would not routinely order coagulation studies. Forty-two percent reported having never consulted a hematologist for massive PPH. Conclusion The survey findings highlight potential areas for improved practice in managing massive PPH, including earlier and more consistent assessment, monitoring of coagulation studies, and consultation with a hematologist. PMID:26604829

  18. A survey of spacecraft thermal design solutions

    NASA Technical Reports Server (NTRS)

    Humphries, R.; Wegrich, R.; Pierce, E.; Patterson, W.

    1991-01-01

    A review of activities at the NASA/Marshall Space Flight Center in the heat transfer and thermodynamics disciplines as well as attendant fluid mechanics, transport phenomena, and computer science applications is presented. Attention is focused on recent activities including the Hubble Space Telescope, and large space instruments, particularly telescope thermal control systems such as those flown aboard Spacelab 2 and the Astro missions. Emphasis is placed on defining the thermal control features, unique design schemes, and performance of selected programs. Results obtained both by ground testing and analytical means, as well as flight and postflight data are presented.

  19. Quantitative and In-Depth Survey of the Isotopic Abundance Distribution Errors in Shotgun Proteomics.

    PubMed

    Chang, Cheng; Zhang, Jiyang; Xu, Changming; Zhao, Yan; Ma, Jie; Chen, Tao; He, Fuchu; Xie, Hongwei; Zhu, Yunping

    2016-07-01

    Accuracy is an important metric when mass spectrometry (MS) is used in large-scale quantitative proteomics research. For MS-based quantification by extracting ion chromatogram (XIC), both the mass and intensity dimensions must be accurate. Although much research has focused on mass accuracy in recent years, less attention has been paid to intensity errors. Here, we investigated signal intensity measurement errors systematically and quantitatively using the natural properties of isotopic distributions. First, we defined a normalized isotopic abundance error model and presented its merits and demerits. Second, a comprehensive survey of the isotopic abundance errors using data sets with increasing sample complexities and concentrations was performed. We examined parameters such as error distribution, relationships between signal intensities within one isotopic cluster, and correlations between different peak errors in isotopic profiles. Our data demonstrated that even high-resolution MS platforms can generate large isotopic intensity measurement errors (approximately 20%). Meanwhile, this error can be reduced to less than 5% using a novel correction algorithm based on the theoretical isotopic abundance distribution. Finally, a nonlinear relationship was observed as the abundance error decreased in isotopic profiles with higher intensity. Our findings are expected to provide insight into isotopic abundance recalibration in quantitative proteomics. PMID:27266261
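    The correction strategy described here compares measured peak intensities against a theoretical isotopic abundance distribution. A minimal sketch of computing such a distribution, under the simplifying assumption that only carbon atoms contribute (the paper's actual algorithm is not reproduced here, and the 50-carbon peptide is hypothetical):

    ```python
    from math import comb

    P_C13 = 0.0107  # natural abundance of carbon-13

    def isotope_distribution(n_carbons, n_peaks=4):
        """Relative abundances of the first few isotopic peaks of a peptide,
        treating only its carbon atoms as binomially 13C-labelled (a common
        simplification that ignores H, N, O, and S isotopes)."""
        raw = [comb(n_carbons, k) * P_C13 ** k * (1 - P_C13) ** (n_carbons - k)
               for k in range(n_peaks)]
        total = sum(raw)
        return [p / total for p in raw]

    # For a ~50-carbon peptide the monoisotopic peak still dominates.
    dist = isotope_distribution(50)
    ```

    Deviations between an observed isotopic cluster and a theoretical profile of this kind are exactly the quantity the survey measures, and rescaling observed peaks toward the theoretical ratios is the essence of the correction idea.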

  20. A SUCCESSFUL BROADBAND SURVEY FOR GIANT Lyα NEBULAE. I. SURVEY DESIGN AND CANDIDATE SELECTION

    SciTech Connect

    Prescott, Moire K. M.; Dey, Arjun; Jannuzi, Buell T.

    2012-04-01

    Giant Lyα nebulae (or Lyα 'blobs') are likely sites of ongoing massive galaxy formation, but the rarity of these powerful sources has made it difficult to form a coherent picture of their properties, ionization mechanisms, and space density. Systematic narrowband Lyα nebula surveys are ongoing, but the small redshift range covered and the observational expense limit the comoving volume that can be probed by even the largest of these surveys and pose a significant problem when searching for such rare sources. We have developed a systematic search technique designed to find large Lyα nebulae at 2 ≲ z ≲ 3 within deep broadband imaging and have carried out a survey of the 9.4 deg² NOAO Deep Wide-Field Survey Boötes field. With a total survey comoving volume of ≈10⁸ h₇₀⁻³ Mpc³, this is the largest volume survey for Lyα nebulae ever undertaken. In this first paper in the series, we present the details of the survey design and a systematically selected sample of 79 candidates, which includes one previously discovered Lyα nebula.

  1. Simulating future uncertainty to guide the selection of survey designs for long-term monitoring

    USGS Publications Warehouse

    Garman, Steven L.; Schweiger, E. William; Manier, Daniel J.

    2012-01-01

    A goal of environmental monitoring is to provide sound information on the status and trends of natural resources (Messer et al. 1991, Theobald et al. 2007, Fancy et al. 2009). When monitoring observations are acquired by measuring a subset of the population of interest, probability sampling as part of a well-constructed survey design provides the most reliable and legally defensible approach to achieve this goal (Cochran 1977, Olsen et al. 1999, Schreuder et al. 2004; see Chapters 2, 5, 6, 7). Previous works have described the fundamentals of sample surveys (e.g. Hansen et al. 1953, Kish 1965). Interest in survey designs and monitoring over the past 15 years has led to extensive evaluations and new developments of sample selection methods (Stevens and Olsen 2004), of strategies for allocating sample units in space and time (Urquhart et al. 1993, Overton and Stehman 1996, Urquhart and Kincaid 1999), and of estimation (Lesser and Overton 1994, Overton and Stehman 1995) and variance properties (Larsen et al. 1995, Stevens and Olsen 2003) of survey designs. Carefully planned, “scientific” (Chapter 5) survey designs have become a standard in contemporary monitoring of natural resources. Based on our experience with the long-term monitoring program of the US National Park Service (NPS; Fancy et al. 2009; Chapters 16, 22), operational survey designs tend to be selected using the following procedures. For a monitoring indicator (i.e. variable or response), a minimum detectable trend requirement is specified, based on the minimum level of change that would result in meaningful change (e.g. degradation). A probability of detecting this trend (statistical power) and an acceptable level of uncertainty (Type I error; see Chapter 2) within a specified time frame (e.g. 10 years) are specified to ensure timely detection. Explicit statements of the minimum detectable trend, the time frame for detecting the minimum trend, power, and acceptable probability of Type I error (
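    The trade-off described above, between minimum detectable trend, time frame, power, and Type I error, can be explored with a small simulation. This is an illustrative sketch only: the log-linear trend test, the lognormal year-to-year noise model, and all parameter values are assumptions, not taken from the NPS program.

    ```python
    import math
    import random

    random.seed(7)

    def ols_slope_t(ys):
        """t-statistic of the OLS slope of ys regressed on year index 0..n-1."""
        n = len(ys)
        xbar, ybar = (n - 1) / 2, sum(ys) / n
        sxx = sum((x - xbar) ** 2 for x in range(n))
        b = sum((x - xbar) * (y - ybar) for x, y in enumerate(ys)) / sxx
        a = ybar - b * xbar
        rss = sum((y - (a + b * x)) ** 2 for x, y in enumerate(ys))
        se = math.sqrt(rss / (n - 2) / sxx)
        return b / se

    def power(annual_change, cv, years=10, t_crit=2.306, reps=2000):
        """Estimated probability of flagging a decline: simulate a log-index
        with the given annual change and year effects of sd `cv` on the log
        scale, and count series whose slope t-stat falls below -t_crit
        (the two-sided 5% critical value for df = years - 2)."""
        hits = 0
        for _ in range(reps):
            logs = [t * math.log(1 + annual_change) + random.gauss(0, cv)
                    for t in range(years)]
            if ols_slope_t(logs) < -t_crit:
                hits += 1
        return hits / reps

    # Power to detect a 5% annual decline over 10 years of annual surveys
    # whose index varies ~10% year to year.
    p_detect = power(-0.05, 0.10)
    ```

    Rerunning `power` across candidate designs (more years, lower noise via more sample units, different trend magnitudes) is the kind of forward simulation the chapter advocates for choosing among survey designs before committing to one.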

  2. Design and Architecture of Collaborative Online Communities: A Quantitative Analysis

    ERIC Educational Resources Information Center

    Aviv, Reuven; Erlich, Zippy; Ravid, Gilad

    2004-01-01

This paper considers four aspects of online communities: design, mechanisms, architecture, and the constructed knowledge. We hypothesize that different designs of communities drive different mechanisms, which give rise to different architectures, which in turn result in different levels of collaborative knowledge construction. To test this chain…

  3. Research on seismic survey design for doubly complex areas

    NASA Astrophysics Data System (ADS)

    Zhao, Hu; Yin, Cheng; Wu, Ming-Sheng; Wu, Xiao-Hua; Pan, Shu-Lin

    2012-06-01

The complex geological conditions in doubly complex areas tend to result in difficult surface survey operations and poor target-layer imaging in the subsurface, which has a great impact on seismic data quality. In this paper, we propose an optimal crooked-line survey method for reducing the difficulty of surface survey operations and improving sub-layer event continuity. The method concentrates on the surface shooting conditions: first, proper shot positions are selected based on the specific surface topographic features to reduce shooting difficulty, and then the receiver positions are optimized to meet the prerequisite that the subsurface reflection points remain in a straight line. Using this method can not only lower the shooting difficulty in areas with rough surface conditions but also overcome the subsurface reflection-point bending problem that appears in the traditional crooked-line survey method. In addition, we use local infill shooting rather than conventional overall infill shooting to improve sub-layer event continuity and uniformity at lower survey operation cost. A model has been calculated and processed with the proposed optimal crooked-line survey and local infill shooting design workflow, and the results show that the new method works for seismic surveys in doubly complex areas.
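The geometric constraint behind keeping reflection points on a straight line can be illustrated with a toy midpoint calculation (coordinates invented; the paper's actual optimization also handles offsets and fold): for a shot at S and a desired common midpoint M, the receiver must sit at R = 2M - S.

```python
import numpy as np

def receiver_for_midpoint(shot, midpoint):
    """Receiver position that places the shot-receiver midpoint at
    `midpoint`: R = 2M - S (flat-layer approximation)."""
    shot = np.asarray(shot, dtype=float)
    midpoint = np.asarray(midpoint, dtype=float)
    return 2.0 * midpoint - shot

# Shots follow the easy surface access (a crooked line)...
shots = np.array([[0.0, 3.0], [1.0, -2.0], [2.0, 4.0]])
# ...while the desired reflection midpoints lie on the straight line y = 0.
midpoints = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])

receivers = receiver_for_midpoint(shots, midpoints)
```

With receivers placed this way, every shot-receiver pair's midpoint falls back on the straight target line, no matter how crooked the shot line is.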

  4. Multidisciplinary aerospace design optimization: Survey of recent developments

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw; Haftka, Raphael T.

    1995-01-01

The increasing complexity of engineering systems has sparked growing interest in multidisciplinary optimization (MDO). This paper presents a survey of recent publications in the field of aerospace, where interest in MDO has been particularly intense. The two main challenges of MDO are computational expense and organizational complexity; accordingly, the survey is focused on the various ways researchers deal with these challenges. The survey is organized by a breakdown of MDO into its conceptual components, with sections on Mathematical Modeling, Design-oriented Analysis, Approximation Concepts, Optimization Procedures, System Sensitivity, and Human Interface. With the authors' main expertise being in the structures area, the bulk of the references focus on the interaction of the structures discipline with other disciplines. In particular, two sections at the end focus on two such interactions that have recently been pursued with particular vigor: Simultaneous Optimization of Structures and Aerodynamics, and Simultaneous Optimization of Structures Combined With Active Control.

  5. Acoustical Surveys of Methane Plumes by Using the Quantitative Echo Sounder in Japan Sea

    NASA Astrophysics Data System (ADS)

    Aoyama, C.; Matsumoto, R.; Hiruta, A.; Machiyama, H.; Numanami, H.; Tomaru, H.; Snyder, G.; Hiromatsu, M.; Igeta, Y.; Freitas, L.

    2006-12-01

R&T/V Umitaka-maru (Tokyo Univ. of Marine Science and Technology) and R/V Natsushima (JAMSTEC) sailed to the methane seep area on a small ridge in the Naoetsu Basin, in the eastern margin of the Sea of Japan, in July 2004, July 2005, and July 2006 to survey gas hydrate in the ocean floor and the related acoustic signatures of methane plumes by using a quantitative echo sounder. Detailed bathymetric profiles have revealed a number of mounds, pockmarks, and collapsed structures within a 3 km x 4 km area on the ridge at water depths of 910 m to 980 m. We mapped the methane plumes in detail by using a quantitative echo sounder (frequency 38 kHz, beam width -19.1 dB) with positioning data from GPS. The vessels sailed at intervals of 0.05 nmi at speeds under 3 kt. We also measured the averaged echo intensity from the methane plumes and the sea bottom, both in every 100 m range and every one minute, with the echo integrator. We obtained the following results from the present echo-sounder survey. 1) We mapped the methane plumes and the seep areas in detail; they lie over the pockmark-mound zone. 2) In the 2005 survey, we detected several methane plumes on the echogram in another area that had been included in the 2004 survey. 3) The average volume backscattering strength (SV) of each methane plume tends to be related to water temperature and water pressure. The hydrate bubbles float upward until they reach warm waters at 300 m depth; the gas volume abruptly increases at this point as the hydrate coating melts. 4) We recovered several fist-sized chunks of methane hydrate by piston coring in the area where we observed the methane plumes. As a follow-up project, we are planning 1) to measure the SV of methane bubbles and methane hydrate floating in water columns by using the submarine vehicle called Hyper Dolphin, 2) to make a trial calculation of the amount of floating methane bubbles and methane hydrates, and 3) to study how to sample the acoustical data of methane plumes by using a side

  6. Lack of quantitative training among early-career ecologists: a survey of the problem and potential solutions.

    PubMed

    Barraquand, Frédéric; Ezard, Thomas H G; Jørgensen, Peter S; Zimmerman, Naupaka; Chamberlain, Scott; Salguero-Gómez, Roberto; Curran, Timothy J; Poisot, Timothée

    2014-01-01

Proficiency in mathematics and statistics is essential to modern ecological science, yet few studies have assessed the level of quantitative training received by ecologists. To do so, we conducted an online survey. The 937 respondents were mostly early-career scientists who studied biology as undergraduates. We found a clear self-perceived lack of quantitative training: 75% were not satisfied with their understanding of mathematical models; 75% felt that the level of mathematics was "too low" in their ecology classes; 90% wanted more mathematics classes for ecologists; and 95% wanted more statistics classes. Respondents thought that 30% of classes in ecology-related degrees should be focused on quantitative disciplines, which is likely higher than in most existing programs. The main suggestion to improve quantitative training was to relate theoretical and statistical modeling to applied ecological problems. Improving quantitative training will require dedicated quantitative classes for ecology-related degrees that embody good mathematical and statistical practice. PMID:24688862

  7. Survey of quantitative antimicrobial consumption per production stage in farrow-to-finish pig farms in Spain

    PubMed Central

    Moreno, Miguel A.

    2014-01-01

Objectives To characterise antimicrobial use (AMU) per production stage in terms of drugs, routes of application, indications, duration and exposed animals in farrow-to-finish pig farms in Spain. Design Survey using a questionnaire on AMU during the six months prior to the interview, administered in face-to-face interviews conducted from April to October 2010. Participants 108 potentially eligible farms covering the whole country were selected using a multistage sampling methodology; of these, 33 were excluded because they did not fulfil the participation criteria and 49 were surveyed. Results Ranked by use per farm, production stage and administration route, the most used antimicrobials were polymyxins (colistin) in feed during the growing and preweaning phases, followed by β-lactams in feed during the growing and preweaning phases and by injection during the preweaning phase. Conclusions The study demonstrates that the growing stage (from weaning to the start of finishing) has the highest AMU according to different quantitative indicators (number of records, number of antimicrobials used, percentage of farms reporting use, relative number of exposed animals per farm and duration of exposure); feed is the administration route that produces the highest antimicrobial exposure, given the larger number of exposed animals and the longer duration of treatment; and there are large differences in AMU among individual pig farms. PMID:26392868

  8. Survey design and extent estimates for the National Lakes Assessment

    EPA Science Inventory

    The US Environmental Protection Agency (EPA) conducted a National Lake Assessment (NLA) in the conterminous USA in 2007 as part of a national assessment of aquatic resources using probability based survey designs. The USEPA Office of Water led the assessment, in cooperation with...

  9. Use of multispectral data in design of forest sample surveys

    NASA Technical Reports Server (NTRS)

    Titus, S. J.; Wensel, L. C.

    1977-01-01

    The use of multispectral data in design of forest sample surveys using a computer software package, WILLIAM, is described. The system allows evaluation of a number of alternative sampling systems and, with appropriate cost data, estimates the implementation cost for each.

  10. Engaging Students in Survey Design and Data Collection

    ERIC Educational Resources Information Center

    Sole, Marla A.

    2015-01-01

    Every day, people use data to make decisions that affect their personal and professional lives, trusting that the data are correct. Many times, however, the data are inaccurate, as a result of a flaw in the design or methodology of the survey used to collect the data. Researchers agree that only questions that are clearly worded, unambiguous, free…

  11. Survey Says? A Primer on Web-based Survey Design and Distribution

    PubMed Central

    Oppenheimer, Adam J.; Pannucci, Christopher J.; Kasten, Steven J.; Haase, Steven C.

    2011-01-01

The internet has changed the way in which we gather and interpret information. While books were once the exclusive bearers of data, knowledge is now only a keystroke away. The internet has also facilitated the synthesis of new knowledge. Specifically, it has become a tool through which medical research is conducted. A review of the literature reveals that in the past year, over one hundred medical publications have been based on web-based survey data alone. Due to emerging internet technologies, web-based surveys can now be launched with little computer knowledge. They may also be self-administered, eliminating personnel requirements. Ultimately, an investigator may build, implement, and analyze survey results with speed and efficiency, obviating the need for mass mailings and data processing. All of these qualities have rendered telephone and mail-based surveys virtually obsolete. Despite these capabilities, web-based survey techniques are not without their limitations, namely recall and response biases. When used properly, however, web-based surveys can greatly simplify the research process. This article discusses the implications of web-based surveys and provides guidelines for their effective design and distribution. PMID:21701347

  12. Integrated Analysis and Tools for Land Subsidence Surveying and Monitoring: a Semi-Quantitative Approach

    NASA Astrophysics Data System (ADS)

    Mosconi, A.; Pozzoli, A.; Meroni, A.; Gagliano, S.

    2015-10-01

This paper presents an integrated approach to land subsidence monitoring using measurements from different sensors. Eni S.p.A., the main Italian oil and gas company, constantly surveys the land with state-of-the-art and innovative techniques, so a method able to integrate the results is an important and timely topic. The world today is a multi-sensor platform, and measurement integration is strictly necessary. Combining the different data sources should be done in a clever way, taking advantage of the best performance of each technique. An integrated analysis allows the interpretation of simultaneous temporal series of data coming from different sources and attempts to separate the subsidence contributions. With this purpose, Exelis VIS, in collaboration with Eni S.p.A., customized PISAV (Permanent Interferometric Scatterometer Analysis and Visualization), an ENVI extension able to capitalize on and combine all the different data collected in the surveys. This article presents some significant examples showing the potential of this tool in oil and gas activity: a hydrocarbon storage field, where the comparison between SAR and production volumes reveals a correlation between the two measures in a few steps; and a hydrocarbon production field with the Satellite Survey Unit (S.S.U.), where SAR, CGPS, piezometers, and assestimeters measure in the same area at the same time, giving the opportunity to analyse the data contextually. In the integrated analysis performed with PISAV, a mathematically rigorous study is not always possible, and a semi-quantitative approach is the only method for interpreting the results. As a result, in the first test case a strong correlation between injected hydrocarbon volume and vertical displacement was highlighted; in the second, the integrated analysis has several advantages in monitoring land subsidence: it permits a first qualitative "differentiation" of the natural and anthropic components of subsidence, and also gives more

  13. Optimum structural design with plate bending elements - A survey

    NASA Technical Reports Server (NTRS)

    Haftka, R. T.; Prasad, B.

    1981-01-01

    A survey is presented of recently published papers in the field of optimum structural design of plates, largely with respect to the minimum-weight design of plates subject to such constraints as fundamental frequency maximization. It is shown that, due to the availability of powerful computers, the trend in optimum plate design is away from methods tailored to specific geometry and loads and toward methods that can be easily programmed for any kind of plate, such as finite element methods. A corresponding shift is seen in optimization from variational techniques to numerical optimization algorithms. Among the topics covered are fully stressed design and optimality criteria, mathematical programming, smooth and ribbed designs, design against plastic collapse, buckling constraints, and vibration constraints.

  14. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
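The core idea can be shown in a few lines (the "expensive" function and design points are invented): sample a deterministic code at a small design of experiments, fit a cheap polynomial response surface, and query the surrogate instead of the code.

```python
import numpy as np

def expensive_analysis(x):
    """Stand-in for a costly deterministic analysis code."""
    return 3.0 + 2.0 * x - 0.5 * x ** 2

x_train = np.linspace(-2.0, 2.0, 9)            # design of experiments
y_train = expensive_analysis(x_train)

coeffs = np.polyfit(x_train, y_train, deg=2)   # quadratic response surface
surrogate = np.poly1d(coeffs)

# The cheap surrogate now replaces the expensive code at unsampled points.
err = abs(surrogate(0.37) - expensive_analysis(0.37))
```

Because the toy code is itself quadratic, the fit here is exact; for real codes the surrogate carries approximation error, which is precisely the danger the paper discusses.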

  15. Large Synoptic Survey Telescope: From Science Drivers to Reference Design

    SciTech Connect

    Ivezic, Z.; Axelrod, T.; Brandt, W.N.; Burke, D.L.; Claver, C.F.; Connolly, A.; Cook, K.H.; Gee, P.; Gilmore, D.K.; Jacoby, S.H.; Jones, R.L.; Kahn, S.M.; Kantor, J.P.; Krabbendam, V.; Lupton, R.H.; Monet, D.G.; Pinto, P.A.; Saha, A.; Schalk, T.L.; Schneider, D.P.; Strauss, Michael A.; /Washington U., Seattle, Astron. Dept. /LSST Corp. /Penn State U., Astron. Astrophys. /KIPAC, Menlo Park /NOAO, Tucson /LLNL, Livermore /UC, Davis /Princeton U., Astrophys. Sci. Dept. /Naval Observ., Flagstaff /Arizona U., Astron. Dept. - Steward Observ. /UC, Santa Cruz /Harvard U. /Johns Hopkins U. /Illinois U., Urbana

    2011-10-14

In the history of astronomy, major advances in our understanding of the Universe have come from dramatic improvements in our ability to accurately measure astronomical quantities. Aided by rapid progress in information technology, current sky surveys are changing the way we view and study the Universe. Next-generation surveys will maintain this revolutionary progress. We focus here on the most ambitious survey currently planned in the visible band, the Large Synoptic Survey Telescope (LSST). LSST will have unique survey capability in the faint time domain. The LSST design is driven by four main science themes: constraining dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. It will be a large, wide-field ground-based system designed to obtain multiple images covering the sky that is visible from Cerro Pachon in Northern Chile. The current baseline design, with an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg² field of view, and a 3,200 Megapixel camera, will allow about 10,000 square degrees of sky to be covered using pairs of 15-second exposures in two photometric bands every three nights on average. The system is designed to yield high image quality, as well as superb astrometric and photometric accuracy. The survey area will include 30,000 deg² with δ < +34.5°, and will be imaged multiple times in six bands, ugrizy, covering the wavelength range 320-1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will observe a 20,000 deg² region about 1000 times in the six bands during the anticipated 10 years of operation. These data will result in databases including 10 billion galaxies and a similar number of stars, and will serve the majority of science programs. The remaining 10% of the observing time will be allocated to special programs such as Very Deep and Very Fast time domain surveys. We describe how the
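The quoted cadence can be sanity-checked with back-of-envelope arithmetic, a rough sketch using only numbers from the abstract and ignoring weather, filter changes, and slew overheads:

```python
# Figures from the abstract; the derived rates are approximate.
area = 20_000            # deg^2, deep-wide-fast survey region
visits_per_field = 1_000 # visits per field over the survey lifetime
fov = 9.6                # deg^2 field of view
years = 10

total_pointings = area / fov * visits_per_field       # ~2.1 million visits
pointings_per_night = total_pointings / (years * 365) # ~570 if every night observed
```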

  16. Large Synoptic Survey Telescope: From Science Drivers To Reference Design

    NASA Astrophysics Data System (ADS)

    Ivezic, Z.; Axelrod, T.; Brandt, W. N.; Burke, D. L.; Claver, C. F.; Connolly, A.; Cook, K. H.; Gee, P.; Gilmore, D. K.; Jacoby, S. H.; Jones, R. L.; Kahn, S. M.; Kantor, J. P.; Krabbendam, V. V.; Lupton, R. H.; Monet, D. G.; Pinto, P. A.; Saha, A.; Schalk, T. L.; Schneider, D. P.; Strauss, M. A.; Stubbs, C. W.; Sweeney, D.; Szalay, A.; Thaler, J. J.; Tyson, J. A.; LSST Collaboration

    2008-06-01

    In the history of astronomy, major advances in our understanding of the Universe have come from dramatic improvements in our ability to accurately measure astronomical quantities. Aided by rapid progress in information technology, current sky surveys are changing the way we view and study the Universe. Next-generation surveys will maintain this revolutionary progress. We focus here on the most ambitious survey currently planned in the visible band, the Large Synoptic Survey Telescope (LSST). LSST will have unique survey capability in the faint time domain. The LSST design is driven by four main science themes: constraining dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. It will be a large, wide-field ground-based system designed to obtain multiple images covering the sky that is visible from Cerro Pachón in Northern Chile. The current baseline design, with an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg2 field of view, and a 3,200 Megapixel camera, will allow about 10,000 square degrees of sky to be covered using pairs of 15-second exposures in two photometric bands every three nights on average. The system is designed to yield high image quality, as well as superb astrometric and photometric accuracy. The survey area will include 30,000 deg2 with δ<+34.5°, and will be imaged multiple times in six bands, ugrizy, covering the wavelength range 320--1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will observe a 20,000 deg2 region about 1000 times in the six bands during the anticipated 10 years of operation. These data will result in databases including 10 billion galaxies and a similar number of stars, and will serve the majority of science programs. The remaining 10% of the observing time will be allocated to special programs such as Very Deep and Very Fast time domain surveys. We describe how the LSST science drivers led to

  17. Exploring the utility of quantitative network design in evaluating Arctic sea ice thickness sampling strategies

    NASA Astrophysics Data System (ADS)

    Kaminski, T.; Kauker, F.; Eicken, H.; Karcher, M.

    2015-08-01

    We present a quantitative network design (QND) study of the Arctic sea ice-ocean system using a software tool that can evaluate hypothetical observational networks in a variational data assimilation system. For a demonstration, we evaluate two idealised flight transects derived from NASA's Operation IceBridge airborne ice surveys in terms of their potential to improve 10-day to 5-month sea ice forecasts. As target regions for the forecasts we select the Chukchi Sea, an area particularly relevant for maritime traffic and offshore resource exploration, as well as two areas related to the Barnett ice severity index (BSI), a standard measure of shipping conditions along the Alaskan coast that is routinely issued by ice services. Our analysis quantifies the benefits of sampling upstream of the target area and of reducing the sampling uncertainty. We demonstrate how observations of sea ice and snow thickness can constrain ice and snow variables in a target region and quantify the complementarity of combining two flight transects. We further quantify the benefit of improved atmospheric forecasts and a well-calibrated model.
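The QND calculation can be sketched in a linear-Gaussian setting (the matrices below are invented for illustration; the paper evaluates real flight transects in a full variational assimilation system): a hypothetical observation updates the prior covariance, and the benefit is read off as the shrinkage of the forecast target's uncertainty.

```python
import numpy as np

B = np.diag([0.4, 0.3]) ** 2      # prior covariance of two state variables
H = np.array([[1.0, 0.5]])        # observation operator (one flight transect)
R = np.array([[0.05 ** 2]])       # observation-error covariance
g = np.array([0.8, 1.2])          # sensitivity of the forecast target to the state

# Posterior covariance after assimilating the hypothetical observation:
# A = (B^-1 + H^T R^-1 H)^-1
A = np.linalg.inv(np.linalg.inv(B) + H.T @ np.linalg.inv(R) @ H)

prior_sd = float(np.sqrt(g @ B @ g))  # target uncertainty without the data
post_sd = float(np.sqrt(g @ A @ g))   # ...and with it
```

Ranking candidate observation networks by the resulting `post_sd` is, in miniature, how a QND study decides which transects are worth flying.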

  18. Exploring the utility of quantitative network design in evaluating Arctic sea-ice thickness sampling strategies

    NASA Astrophysics Data System (ADS)

    Kaminski, T.; Kauker, F.; Eicken, H.; Karcher, M.

    2015-03-01

    We present a quantitative network design (QND) study of the Arctic sea ice-ocean system using a software tool that can evaluate hypothetical observational networks in a variational data assimilation system. For a demonstration, we evaluate two idealised flight transects derived from NASA's Operation IceBridge airborne ice surveys in terms of their potential to improve ten-day to five-month sea-ice forecasts. As target regions for the forecasts we select the Chukchi Sea, an area particularly relevant for maritime traffic and offshore resource exploration, as well as two areas related to the Barnett Ice Severity Index (BSI), a standard measure of shipping conditions along the Alaskan coast that is routinely issued by ice services. Our analysis quantifies the benefits of sampling upstream of the target area and of reducing the sampling uncertainty. We demonstrate how observations of sea-ice and snow thickness can constrain ice and snow variables in a target region and quantify the complementarity of combining two flight transects. We further quantify the benefit of improved atmospheric forecasts and a well-calibrated model.

  19. Improved field experimental designs and quantitative evaluation of aquatic ecosystems

    SciTech Connect

    McKenzie, D.H.; Thomas, J.M.

    1984-05-01

The paired-station concept and a log-transformed analysis of variance were used to evaluate zooplankton density data collected over five years at an electrical generation station on Lake Michigan. To discuss the example and the field design necessary for a valid statistical analysis, considerable background is provided on (1) selecting sampling station pairs, (2) experimentwise error rates for multi-species analyses, (3) levels of Type I and II error rates, (4) procedures for conducting the field monitoring program, and (5) the consequences of violating statistical assumptions. Details for estimating the sample sizes necessary to detect changes of a specified magnitude are included. Both statistical and biological problems with monitoring programs (as now conducted) are addressed; serial correlation of successive observations in the time series was identified as one principal statistical difficulty. The procedure reduces this problem to a level where statistical methods can be used confidently. 27 references, 4 figures, 2 tables.
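The sample-size estimation the abstract mentions can be sketched with the standard normal-approximation formula for comparing two group means (numbers illustrative; the paper's paired-station, multi-species analysis is more elaborate):

```python
import math

def samples_per_group(delta, sigma, z_alpha=1.96, z_beta=0.84):
    """Approximate samples per group to detect a mean difference `delta`
    with standard deviation `sigma` (two-sided alpha = 0.05, power = 0.80).
    On log-transformed densities, `delta` is a log-ratio, e.g.
    math.log(1.5) for a 50% change."""
    return math.ceil(2 * (z_alpha + z_beta) ** 2 * (sigma / delta) ** 2)

# Detecting a 50% change in density when the log-scale SD is 0.6:
n = samples_per_group(delta=math.log(1.5), sigma=0.6)
```

Larger changes, or quieter data, need fewer samples; halving `sigma` cuts the requirement by a factor of four.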

  20. A survey on methods of design features identification

    NASA Astrophysics Data System (ADS)

    Grabowik, C.; Kalinowski, K.; Paprocka, I.; Kempa, W.

    2015-11-01

currently identified feature. In the IFR method, the system designer defines a set of features and sets a collection of recognition process parameters. It allows individual features to be unambiguously identified, in an automatic or semiautomatic way, directly in the CAD system or in an external application to which the part model might be transferred. Additionally, a user is able to define non-geometrical information such as overall dimensions, surface roughness, etc. In this paper a survey of methods of feature identification and recognition is presented, especially in the context of AFR methods.

  1. Mixing Qualitative and Quantitative Methods: Insights into Design and Analysis Issues

    ERIC Educational Resources Information Center

    Lieber, Eli

    2009-01-01

    This article describes and discusses issues related to research design and data analysis in the mixing of qualitative and quantitative methods. It is increasingly desirable to use multiple methods in research, but questions arise as to how best to design and analyze the data generated by mixed methods projects. I offer a conceptualization for such…

  2. SEDS: The Spitzer Extended Deep Survey. Survey Design, Photometry, and Deep IRAC Source Counts

    NASA Technical Reports Server (NTRS)

    Ashby, M. L. N.; Willner, S. P.; Fazio, G. G.; Huang, J.-S.; Arendt, A.; Barmby, P.; Barro, G; Bell, E. F.; Bouwens, R.; Cattaneo, A.; Croton, D.; Dave, R.; Dunlop, J. S.; Egami, E.; Faber, S.; Finlator, K.; Grogin, N. A.; Guhathakurta, P.; Hernquist, L.; Hora, J. L.; Illingworth, G.; Kashlinsky, A; Koekmoer, A. M.; Koo, D. C.; Moseley, H.

    2013-01-01

The Spitzer Extended Deep Survey (SEDS) is a very deep infrared survey within five well-known extragalactic science fields: the UKIDSS Ultra-Deep Survey, the Extended Chandra Deep Field South, COSMOS, the Hubble Deep Field North, and the Extended Groth Strip. SEDS covers a total area of 1.46 deg² to a depth of 26 AB mag (3σ) in both of the warm Infrared Array Camera (IRAC) bands at 3.6 and 4.5 micron. Because of its uniform depth of coverage in so many widely-separated fields, SEDS is subject to roughly 25% smaller errors due to cosmic variance than a single-field survey of the same size. SEDS was designed to detect and characterize galaxies from intermediate to high redshifts (z = 2-7) with a built-in means of assessing the impact of cosmic variance on the individual fields. Because the full SEDS depth was accumulated in at least three separate visits to each field, typically with six-month intervals between visits, SEDS also furnishes an opportunity to assess the infrared variability of faint objects. This paper describes the SEDS survey design, processing, and publicly available data products. Deep IRAC counts for the more than 300,000 galaxies detected by SEDS are consistent with models based on known galaxy populations. Discrete IRAC sources contribute 5.6 ± 1.0 and 4.4 ± 0.8 nW m⁻² sr⁻¹ at 3.6 and 4.5 micron to the diffuse cosmic infrared background (CIB). IRAC sources cannot contribute more than half of the total CIB flux estimated from DIRBE data. Barring an unexpected error in the DIRBE flux estimates, half the CIB flux must therefore come from a diffuse component.

  3. Design Considerations: Falcon M Dwarf Habitable Exoplanet Survey

    NASA Astrophysics Data System (ADS)

    Polsgrove, Daniel; Novotny, Steven; Della-Rose, Devin J.; Chun, Francis; Tippets, Roger; O'Shea, Patrick; Miller, Matthew

    2016-01-01

The Falcon Telescope Network (FTN) is an assemblage of twelve automated 20-inch telescopes positioned around the globe, controlled from the Cadet Space Operations Center (CSOC) at the US Air Force Academy (USAFA) in Colorado Springs, Colorado. Five of the 12 sites are currently installed, with full operational capability expected by the end of 2016. Though optimized for studying near-earth objects to accomplish its primary mission of Space Situational Awareness (SSA), the Falcon telescopes are in many ways similar to those used by ongoing and planned exoplanet transit surveys targeting individual M dwarf stars (e.g., MEarth, APACHE, SPECULOOS). The network's worldwide geographic distribution provides additional potential advantages. We have performed analytical and empirical studies exploring the viability of employing the FTN for a future survey of nearby late-type M dwarfs tailored to detect transits of 1-2 R_Earth exoplanets in habitable-zone orbits. We present empirical results on photometric precision derived from data collected with multiple Falcon telescopes on a set of nearby (< 25 pc) M dwarfs using infrared filters and a range of exposure times, as well as sample light curves created from images gathered during known transits of varying transit depths. An investigation of survey design parameters is also described, including an analysis of site-specific weather data, anticipated telescope time allocation, and the percentage of nearby M dwarfs with sufficient check stars within the Falcons' 11' x 11' field of view required to perform effective differential photometry. The results of this ongoing effort will inform the likelihood of discovering one (or more) habitable-zone exoplanets, given current occurrence rate estimates, over a nominal five-year campaign, and will dictate specific survey design features in preparation for initiating project execution when the FTN begins full-scale automated operations.
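A survey-yield estimate of the kind this design study must produce can be sketched in a few lines (every rate below is a placeholder, not a value from the Falcon project or the occurrence-rate literature):

```python
# Placeholder inputs for a nominal M dwarf transit campaign.
n_targets = 100      # nearby late-type M dwarfs monitored (assumed)
occurrence = 0.3     # habitable-zone 1-2 R_Earth planets per star (assumed)
p_transit = 0.02     # geometric transit probability (assumed)
p_window = 0.8       # chance network coverage catches the transits (assumed)

p_per_star = occurrence * p_transit * p_window
expected_detections = n_targets * p_per_star
p_at_least_one = 1.0 - (1.0 - p_per_star) ** n_targets
```

Weather statistics and check-star availability enter through `p_window` and `n_targets`, which is why the abstract's site-by-site analysis matters for the final likelihood.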

  4. Quality-by-Design II: Application of Quantitative Risk Analysis to the Formulation of Ciprofloxacin Tablets.

    PubMed

    Claycamp, H Gregg; Kona, Ravikanth; Fahmy, Raafat; Hoag, Stephen W

    2016-04-01

Qualitative risk assessment methods are often used as the first step to determining design space boundaries; however, quantitative assessments of risk with respect to the design space, i.e., calculating the probability of failure for a given severity, are needed to fully characterize design space boundaries. Quantitative risk assessment methods in design and operational spaces are a significant aid to evaluating proposed design space boundaries. The goal of this paper is to demonstrate a relatively simple strategy for design space definition using a simplified Bayesian Monte Carlo simulation. This paper builds on a previous paper that used failure mode and effects analysis (FMEA) qualitative risk assessment and Plackett-Burman design of experiments to identify the critical quality attributes. The results show that the sequential use of qualitative and quantitative risk assessments can focus the design of experiments on a reduced set of critical material and process parameters that determine a robust design space under conditions of limited laboratory experimentation. This approach provides a strategy by which the degree of risk associated with each known parameter can be calculated and allocates resources in a manner that manages risk to an acceptable level. PMID:26202064
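The quantitative step, turning a process model and parameter distributions into a probability of failure, can be sketched with a simple Monte Carlo run (the toy dissolution model, the parameter distributions, and the 75% limit are invented for illustration, not taken from the ciprofloxacin study):

```python
import numpy as np

rng = np.random.default_rng(0)

def dissolution(hardness, lubricant):
    """Toy process model: % drug released at 30 min as a linear
    function of tablet hardness (kp) and lubricant level (% w/w)."""
    return 95.0 - 2.0 * hardness - 15.0 * lubricant

n = 100_000
hardness = rng.normal(5.0, 0.5, n)    # assumed manufacturing variability
lubricant = rng.normal(0.5, 0.1, n)   # assumed distribution

released = dissolution(hardness, lubricant)
p_fail = float(np.mean(released < 75.0))  # probability of missing the 75% limit
```

Repeating this over a grid of candidate operating set points maps out where the failure probability stays below an acceptable level, i.e. a quantitatively defined design space.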

  5. Design of a lightweight, low-cost geophysical survey vehicle

    SciTech Connect

    Ames, K.

    1989-03-01

    A remote-controlled vehicle has been designed at Pacific Northwest Laboratory for surveying sites that are dangerous for manned vehicles. The vehicle is required to be small, maneuverable, inexpensive, and as free of metallic parts as practicable. The prototype being fabricated will have a mostly aluminum engine, dual bicycle tire wheel assemblies, a two-clutch steering system for selective engagement of pairs of wheels on either side of the vehicle, and radio control with fiber-optic umbilical video link. Wireless control and telemetry are planned for the future. Other future possibilities include a mostly plastic engine and a global positioning system that uses satellite signals. 3 figs.

  6. Skin Microbiome Surveys Are Strongly Influenced by Experimental Design.

    PubMed

    Meisel, Jacquelyn S; Hannigan, Geoffrey D; Tyldsley, Amanda S; SanMiguel, Adam J; Hodkinson, Brendan P; Zheng, Qi; Grice, Elizabeth A

    2016-05-01

    Culture-independent studies to characterize skin microbiota are increasingly common, due in part to affordable and accessible sequencing and analysis platforms. Compared to culture-based techniques, DNA sequencing of the bacterial 16S ribosomal RNA (rRNA) gene or whole metagenome shotgun (WMS) sequencing provides more precise microbial community characterizations. Most widely used protocols were developed to characterize microbiota of other habitats (i.e., gastrointestinal) and have not been systematically compared for their utility in skin microbiome surveys. Here we establish a resource for the cutaneous research community to guide experimental design in characterizing skin microbiota. We compare two widely sequenced regions of the 16S rRNA gene to WMS sequencing for recapitulating skin microbiome community composition, diversity, and genetic functional enrichment. We show that WMS sequencing most accurately recapitulates microbial communities, but sequencing of hypervariable regions 1-3 of the 16S rRNA gene provides highly similar results. Sequencing of hypervariable region 4 poorly captures skin commensal microbiota, especially Propionibacterium. WMS sequencing, which is resource and cost intensive, provides evidence of a community's functional potential; however, metagenome predictions based on 16S rRNA sequence tags closely approximate WMS genetic functional profiles. This study highlights the importance of experimental design for downstream results in skin microbiome surveys. PMID:26829039

  7. Quantitative Survey and Structural Classification of Hydraulic Fracturing Chemicals Reported in Unconventional Gas Production.

    PubMed

    Elsner, Martin; Hoelzer, Kathrin

    2016-04-01

    Much interest is directed at the chemical structure of hydraulic fracturing (HF) additives in unconventional gas exploitation. To bridge the gap between existing alphabetical disclosures by function/CAS number and emerging scientific contributions on fate and toxicity, we review the structural properties which motivate HF applications, and which determine environmental fate and toxicity. Our quantitative overview relied on voluntary U.S. disclosures evaluated from the FracFocus registry by different sources and on a House of Representatives ("Waxman") list. Out of over 1000 reported substances, classification by chemistry yielded succinct subsets able to illustrate the rationale of their use, and physicochemical properties relevant for environmental fate, toxicity and chemical analysis. While many substances were nontoxic, frequent disclosures also included notorious groundwater contaminants like petroleum hydrocarbons (solvents), precursors of endocrine disruptors like nonylphenols (nonemulsifiers), toxic propargyl alcohol (corrosion inhibitor), tetramethylammonium (clay stabilizer), biocides or strong oxidants. Application of highly oxidizing chemicals, together with occasional disclosures of putative delayed acids and complexing agents (i.e., compounds designed to react in the subsurface) suggests that relevant transformation products may be formed. To adequately investigate such reactions, available information is not sufficient, but instead a full disclosure of HF additives is necessary. PMID:26902161

  8. Quantitative Feedback Theory (QFT) applied to the design of a rotorcraft flight control system

    NASA Technical Reports Server (NTRS)

    Hess, R. A.; Gorder, P. J.

    1992-01-01

    Quantitative Feedback Theory describes a frequency-domain technique for the design of multi-input, multi-output control systems which meet time or frequency domain performance criteria when specified uncertainty exists in the linear description of the vehicle dynamics. Quantitative Feedback Theory is applied to the design of the longitudinal flight control system for a linear uncertain model of the AH-64 rotorcraft. In this model, the uncertainty is assigned, and is assumed to be attributable to actual uncertainty in the dynamic model and to the changes in the vehicle aerodynamic characteristics which occur near hover. The model includes an approximation to the rotor and actuator dynamics. The design example indicates the manner in which handling qualities criteria may be incorporated into the design of realistic rotorcraft control systems in which significant uncertainty exists in the vehicle model.

  9. The Unique Optical Design of the NESSI Survey Telescope

    NASA Astrophysics Data System (ADS)

    Ackermann, M.; McGraw, J.; Zimmer, P.; Williams, T.

    The NESSI Survey telescope will be the second incarnation of the CCD/Transit Instrument. It is being designed to accomplish precision astronomical measurements, thus requiring excellent image quality and virtually no distortion over an inscribed 1° x 1° scientific field of view. Project constraints such as re-use of an existing unperforated parabolic f/2.2 primary mirror, and the desire to re-use much of the existing CTI structure, have forced the design in one direction. Scientific constraints such as the 1.42° field, 60 μm/arcsec plate scale, zero focus shift with wavelength, zero distortion and 80% encircled energy within 0.25 arcsec spot diameters have further limited remaining design options. After exploring nearly every optical telescope configuration known to man, and several never before imagined, the NESSI Project Team has arrived at a unique optical design that produces a field and images meeting or exceeding all these constraints. The baseline configuration is that of a "bent Cassegrain," employing a convex hyperbolic secondary, a 45° folding flat and a four lens refractive field group. One unique feature of this design is that all four lenses lie outside the primary aperture, and thus introduce no obscuration. A second unique aspect of the design is that the largest lens is only slightly larger than the focal plane array. The field corrector lenses are not large by today's standards but still large enough to make the availability of glass a serious concern. A number of high performing designs were abandoned when it was learned the glass was either not available or would require a special production. With a little luck, a little insight and a lot of work, we followed the "rugged ways to the stars," and were able to arrive at a relatively simple Cassegrain design where only one corrector lens had an aspheric surface, a simple parabola, and all four lenses were made of BK7 glass. This design appears to be manufacturable and essentially meets all of the

  10. 23 CFR 1340.10 - Submission and approval of seat belt survey design.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 23 Highways 1 2013-04-01 2013-04-01 false Submission and approval of seat belt survey design. 1340... TRANSPORTATION UNIFORM CRITERIA FOR STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE Administrative Requirements § 1340.10 Submission and approval of seat belt survey design. (a) Contents: The following...

  11. 23 CFR 1340.10 - Submission and approval of seat belt survey design.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 23 Highways 1 2014-04-01 2014-04-01 false Submission and approval of seat belt survey design. 1340... TRANSPORTATION UNIFORM CRITERIA FOR STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE Administrative Requirements § 1340.10 Submission and approval of seat belt survey design. (a) Contents: The following...

  12. Study of Nurses’ Knowledge about Palliative Care: A Quantitative Cross-sectional Survey

    PubMed Central

    Prem, Venkatesan; Karvannan, Harikesavan; Kumar, Senthil P; Karthikbabu, Surulirajan; Syed, Nafeez; Sisodia, Vaishali; Jaykumar, Saroja

    2012-01-01

    Context: Studies have documented that nurses and other health care professionals are inadequately prepared to care for patients in palliative care. Several reasons have been identified including inadequacies in nursing education, absence of curriculum content related to pain management, and knowledge related to pain and palliative care. Aims: The objective of this paper was to assess the knowledge about palliative care amongst nursing professionals using the palliative care knowledge test (PCKT). Settings and Design: Cross-sectional survey of 363 nurses in a multispecialty hospital. Materials and Methods: The study utilized a self-report questionnaire- PCKT developed by Nakazawa et al., which had 20 items (statements about palliative care) for each of which the person had to indicate ‘correct’, ‘incorrect’, or ‘unsure.’ The PCKT had 5 subscales (philosophy- 2 items, pain- 6 items, dyspnea- 4 items, psychiatric problems- 4 items, and gastro-intestinal problems- 4 items). Statistical Analysis Used: Comparison across individual and professional variables for both dimensions were done using one-way ANOVA, and correlations were done using Karl Pearson's coefficient using SPSS version 16.0 for Windows. Results: The overall total score of PCKT was 7.16 ± 2.69 (35.8%). The philosophy score was .73 ± .65 (36.5%), pain score was 2.09 ± 1.19 (34.83%), dyspnea score was 1.13 ± .95 (28.25%), psychiatric problems score was 1.83 ± 1.02 (45.75%), and gastro-intestinal problems score was 1.36 ± .97 (34%). (P = .00). The female nurses scored higher than their male counterparts, but the difference was not significant (P > .05). Conclusions: Overall level of knowledge about palliative care was poor, and nurses had a greater knowledge about psychiatric problems and philosophy than the other aspects indicated in PCKT. PMID:23093828
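    The percentages in parentheses are each subscale's mean score divided by its item count (taking the philosophy mean as 0.73, consistent with the reported 36.5%). A quick arithmetic check of the reported figures:

```python
# Verify the abstract's percentage figures: percentage = mean score
# divided by the number of items in that PCKT subscale.
subscales = {
    "philosophy":        (0.73, 2),   # reported 36.5%
    "pain":              (2.09, 6),   # reported 34.83%
    "dyspnea":           (1.13, 4),   # reported 28.25%
    "psychiatric":       (1.83, 4),   # reported 45.75%
    "gastro-intestinal": (1.36, 4),   # reported 34%
    "total":             (7.16, 20),  # reported 35.8%
}
for name, (mean, items) in subscales.items():
    print(f"{name}: {100 * mean / items:.2f}%")
```

Each computed value matches the percentage quoted in the abstract, confirming the scoring convention.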

  13. Mechanistic and quantitative insight into cell surface targeted molecular imaging agent design

    PubMed Central

    Zhang, Liang; Bhatnagar, Sumit; Deschenes, Emily; Thurber, Greg M.

    2016-01-01

    Molecular imaging agent design involves simultaneously optimizing multiple probe properties. While several desired characteristics are straightforward, including high affinity and low non-specific background signal, in practice there are quantitative trade-offs between these properties. These include plasma clearance, where fast clearance lowers background signal but can reduce target uptake, and binding, where high affinity compounds sometimes suffer from lower stability or increased non-specific interactions. Further complicating probe development, many of the optimal parameters vary depending on both target tissue and imaging agent properties, making empirical approaches or previous experience difficult to translate. Here, we focus on low molecular weight compounds targeting extracellular receptors, which have some of the highest contrast values for imaging agents. We use a mechanistic approach to provide a quantitative framework for weighing trade-offs between molecules. Our results show that specific target uptake is well-described by quantitative simulations for a variety of targeting agents, whereas non-specific background signal is more difficult to predict. Two in vitro experimental methods for estimating background signal in vivo are compared – non-specific cellular uptake and plasma protein binding. Together, these data provide a quantitative method to guide probe design and focus animal work for more cost-effective and time-efficient development of molecular imaging agents. PMID:27147293

  14. Mechanistic and quantitative insight into cell surface targeted molecular imaging agent design

    NASA Astrophysics Data System (ADS)

    Zhang, Liang; Bhatnagar, Sumit; Deschenes, Emily; Thurber, Greg M.

    2016-05-01

    Molecular imaging agent design involves simultaneously optimizing multiple probe properties. While several desired characteristics are straightforward, including high affinity and low non-specific background signal, in practice there are quantitative trade-offs between these properties. These include plasma clearance, where fast clearance lowers background signal but can reduce target uptake, and binding, where high affinity compounds sometimes suffer from lower stability or increased non-specific interactions. Further complicating probe development, many of the optimal parameters vary depending on both target tissue and imaging agent properties, making empirical approaches or previous experience difficult to translate. Here, we focus on low molecular weight compounds targeting extracellular receptors, which have some of the highest contrast values for imaging agents. We use a mechanistic approach to provide a quantitative framework for weighing trade-offs between molecules. Our results show that specific target uptake is well-described by quantitative simulations for a variety of targeting agents, whereas non-specific background signal is more difficult to predict. Two in vitro experimental methods for estimating background signal in vivo are compared – non-specific cellular uptake and plasma protein binding. Together, these data provide a quantitative method to guide probe design and focus animal work for more cost-effective and time-efficient development of molecular imaging agents.

  15. Current State of Agile User-Centered Design: A Survey

    NASA Astrophysics Data System (ADS)

    Hussain, Zahid; Slany, Wolfgang; Holzinger, Andreas

    Agile software development methods are quite popular nowadays and are being adopted at an increasing rate in the industry every year. However, these methods are still lacking usability awareness in their development lifecycle, and the integration of usability/User-Centered Design (UCD) into agile methods is not adequately addressed. This paper presents the preliminary results of a recently conducted online survey regarding the current state of the integration of agile methods and usability/UCD. A worldwide response of 92 practitioners was received. The results show that the majority of practitioners perceive that the integration of agile methods with usability/UCD has added value to their adopted processes and to their teams; has resulted in the improvement of usability and quality of the product developed; and has increased the satisfaction of the end-users of the product developed. The most widely used HCI techniques are low-fidelity prototyping, conceptual designs, observational studies of users, usability expert evaluations, field studies, personas, rapid iterative testing, and laboratory usability testing.

  16. Quantitative Survey and Structural Classification of Fracking Chemicals Reported in Unconventional Gas Exploitation

    NASA Astrophysics Data System (ADS)

    Elsner, Martin; Schreglmann, Kathrin

    2015-04-01

    Few technologies are being discussed in such controversial terms as hydraulic fracturing ("fracking") in the recovery of unconventional gas. Particular concern regards the chemicals that may return to the surface as a result of hydraulic fracturing. These are either "fracking chemicals" - chemicals that are injected together with the fracking fluid to optimize the fracturing performance or geogenic substances which may turn up during gas production, in the so-called produced water originating from the target formation. Knowledge about them is warranted for several reasons. (1) Monitoring. Air emissions are reported to arise from well drilling, the gas itself or condensate tanks. In addition, potential spills and accidents bear the danger of surface and shallow groundwater contaminations. Monitoring strategies are therefore warranted to screen for "indicator" substances of potential impacts. (2) Chemical Analysis. To meet these analytical demands, target substances must be defined so that adequate sampling approaches and analytical methods can be developed. (3) Transformation in the Subsurface. Identification and classification of fracking chemicals (aromatics vs. alcohols vs. acids, esters, etc.) is further important to assess the possibility of subsurface reactions which may potentially generate new, as yet unidentified transformation products. (4) Wastewater Treatment. For the same reason chemical knowledge is important for optimized wastewater treatment strategies. (5) Human and Ecosystem Health. Knowledge of the most frequent fracking chemicals is further essential for risk assessment (environmental behavior, toxicity). (6) Public Discussions. Finally, an overview of reported fracking chemicals can provide unbiased scientific input into current public debates and enable critical reviews of Green Chemistry approaches. Presently, however, such information is not readily available. We aim to close this knowledge gap by providing a quantitative overview of chemical

  17. Practical Tools for Designing and Weighting Survey Samples

    ERIC Educational Resources Information Center

    Valliant, Richard; Dever, Jill A.; Kreuter, Frauke

    2013-01-01

    Survey sampling is fundamentally an applied field. The goal in this book is to put an array of tools at the fingertips of practitioners by explaining approaches long used by survey statisticians, illustrating how existing software can be used to solve survey problems, and developing some specialized software where needed. This book serves at least…

  18. A robust rotorcraft flight control system design methodology utilizing quantitative feedback theory

    NASA Technical Reports Server (NTRS)

    Gorder, Peter James

    1993-01-01

    Rotorcraft flight control systems present design challenges which often exceed those associated with fixed-wing aircraft. First, large variations in the response characteristics of the rotorcraft result from the wide range of airspeeds of typical operation (hover to over 100 kts). Second, the assumption of vehicle rigidity often employed in the design of fixed-wing flight control systems is rarely justified in rotorcraft where rotor degrees of freedom can have a significant impact on the system performance and stability. This research was intended to develop a methodology for the design of robust rotorcraft flight control systems. Quantitative Feedback Theory (QFT) was chosen as the basis for the investigation. Quantitative Feedback Theory is a technique which accounts for variability in the dynamic response of the controlled element in the design of robust control systems. It was developed to address a Multiple-Input Single-Output (MISO) design problem, and utilizes two degrees of freedom to satisfy the design criteria. Two techniques were examined for extending the QFT MISO technique to the design of a Multiple-Input-Multiple-Output (MIMO) flight control system (FCS) for a UH-60 Black Hawk Helicopter. In the first, a set of MISO systems, mathematically equivalent to the MIMO system, was determined. QFT was applied to each member of the set simultaneously. In the second, the same set of equivalent MISO systems was analyzed sequentially, with closed loop response information from each loop utilized in subsequent MISO designs. The results of each technique were compared, and the advantages of the second, termed Sequential Loop Closure, were clearly evident.

  19. Survey on surgical instrument handle design: ergonomics and acceptance.

    PubMed

    Santos-Carreras, Laura; Hagen, Monika; Gassert, Roger; Bleuler, Hannes

    2012-03-01

    Minimally invasive surgical approaches have revolutionized surgical care and considerably improved surgical outcomes. The instrumentation has changed significantly from open to laparoscopic and robotic surgery with various usability and ergonomics qualities. To establish guidelines for future designing of surgical instruments, this study assesses the effects of current surgical approaches and instruments on the surgeon. Furthermore, an analysis of surgeons' preferences with respect to instrument handles was performed to identify the main acceptance criteria. In all, 49 surgeons (24 with robotic surgery experience, 25 without) completed the survey about physical discomfort and working conditions. The respondents evaluated comfort, intuitiveness, precision, and stability of 7 instrument handles. Robotic surgery procedures generally take a longer time than conventional procedures but result in less back, shoulder, and wrist pain; 28% of surgeons complained about finger and neck pain during robotic surgery. Three handles (conventional needle holder, da Vinci wrist, and joystick-like handle) received significantly higher scores for most of the proposed criteria. The handle preference is best explained by a regression model related only to comfort and precision (R(2) = 0.91) and is significantly affected by the surgeon's background (P < .001). Although robotic surgery seems to alleviate physical discomfort during and after surgery, the results of this study show that there is room for improvement in the sitting posture and in the ergonomics of the handles. Comfort and precision have been found to be the most important aspects for the surgeon's choice of an instrument handle. Furthermore, surgeons' professional background should be considered when designing novel surgical instruments. PMID:21868419

  20. Textile Materials for the Design of Wearable Antennas: A Survey

    PubMed Central

    Salvado, Rita; Loss, Caroline; Gonçalves, Ricardo; Pinho, Pedro

    2012-01-01

    In the broad context of Wireless Body Sensor Networks for healthcare and pervasive applications, the design of wearable antennas offers the possibility of ubiquitous monitoring, communication and energy harvesting and storage. Specific requirements for wearable antennas are a planar structure and flexible construction materials. Several properties of the materials influence the behaviour of the antenna. For instance, the bandwidth and the efficiency of a planar microstrip antenna are mainly determined by the permittivity and the thickness of the substrate. The use of textiles in wearable antennas requires the characterization of their properties. Specific electrical conductive textiles are available on the market and have been successfully used. Ordinary textile fabrics have been used as substrates. However, little information can be found on the electromagnetic properties of regular textiles. Therefore this paper is mainly focused on the analysis of the dielectric properties of normal fabrics. In general, textiles present a very low dielectric constant that reduces the surface wave losses and increases the impedance bandwidth of the antenna. However, textile materials are constantly exchanging water molecules with the surroundings, which affects their electromagnetic properties. In addition, textile fabrics are porous, anisotropic and compressible materials whose thickness and density might change with low pressures. Therefore it is important to know how these characteristics influence the behaviour of the antenna in order to minimize unwanted effects. This paper presents a survey of the key points for the design and development of textile antennas, from the choice of the textile materials to the framing of the antenna. An analysis of the textile materials that have been used is also presented. PMID:23202235

  1. Textile materials for the design of wearable antennas: a survey.

    PubMed

    Salvado, Rita; Loss, Caroline; Gonçalves, Ricardo; Pinho, Pedro

    2012-01-01

    In the broad context of Wireless Body Sensor Networks for healthcare and pervasive applications, the design of wearable antennas offers the possibility of ubiquitous monitoring, communication and energy harvesting and storage. Specific requirements for wearable antennas are a planar structure and flexible construction materials. Several properties of the materials influence the behaviour of the antenna. For instance, the bandwidth and the efficiency of a planar microstrip antenna are mainly determined by the permittivity and the thickness of the substrate. The use of textiles in wearable antennas requires the characterization of their properties. Specific electrical conductive textiles are available on the market and have been successfully used. Ordinary textile fabrics have been used as substrates. However, little information can be found on the electromagnetic properties of regular textiles. Therefore this paper is mainly focused on the analysis of the dielectric properties of normal fabrics. In general, textiles present a very low dielectric constant that reduces the surface wave losses and increases the impedance bandwidth of the antenna. However, textile materials are constantly exchanging water molecules with the surroundings, which affects their electromagnetic properties. In addition, textile fabrics are porous, anisotropic and compressible materials whose thickness and density might change with low pressures. Therefore it is important to know how these characteristics influence the behaviour of the antenna in order to minimize unwanted effects. This paper presents a survey of the key points for the design and development of textile antennas, from the choice of the textile materials to the framing of the antenna. An analysis of the textile materials that have been used is also presented. PMID:23202235

  2. Trajectory Design for the Transiting Exoplanet Survey Satellite

    NASA Technical Reports Server (NTRS)

    Dichmann, Donald J.; Parker, Joel J. K.; Williams, Trevor W.; Mendelsohn, Chad R.

    2014-01-01

    The Transiting Exoplanet Survey Satellite (TESS) is a National Aeronautics and Space Administration (NASA) mission, scheduled to be launched in 2017. TESS will travel in a highly eccentric orbit around Earth, with initial perigee radius near 17 Earth radii (Re) and apogee radius near 59 Re. The orbit period is near 2:1 resonance with the Moon, with apogee nearly 90 degrees out-of-phase with the Moon, in a configuration that has been shown to be operationally stable. TESS will execute phasing loops followed by a lunar flyby, with a final maneuver to achieve 2:1 resonance with the Moon. The goals of a resonant orbit with long-term stability, short eclipses and limited oscillations of perigee present significant challenges to the trajectory design. To rapidly assess launch opportunities, we adapted the Schematics Window Methodology (SWM76) launch window analysis tool to assess the TESS mission constraints. To understand the long-term dynamics of such a resonant orbit in the Earth-Moon system we employed Dynamical Systems Theory in the Circular Restricted 3-Body Problem (CR3BP). For precise trajectory analysis we use a high-fidelity model and multiple shooting in the General Mission Analysis Tool (GMAT) to optimize the maneuver delta-V and meet mission constraints. Finally we describe how the techniques we have developed can be applied to missions with similar requirements. Keywords: resonant orbit, stability, lunar flyby, phasing loops, trajectory optimization
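    The quoted orbit geometry can be checked against the 2:1 lunar resonance using Kepler's third law; the constants below are standard values, not taken from the abstract:

```python
# Check that a 2:1 lunar-resonant orbit matches the quoted TESS geometry:
# period = half the Moon's sidereal period, semi-major axis from Kepler's
# third law, compared with (17 Re + 59 Re) / 2 = 38 Re from the abstract.
import math

MU_EARTH = 398600.4418        # km^3/s^2, Earth's gravitational parameter
R_EARTH = 6378.137            # km, equatorial radius
LUNAR_SIDEREAL_DAYS = 27.3217

T = 0.5 * LUNAR_SIDEREAL_DAYS * 86400.0              # orbit period, s
a = (MU_EARTH * T**2 / (4 * math.pi**2)) ** (1 / 3)  # semi-major axis, km
a_re = a / R_EARTH

print(f"period: {T / 86400:.2f} days, semi-major axis: {a_re:.1f} Re")
print(f"abstract's (17 + 59)/2 = {(17 + 59) / 2} Re")
```

The ~38 Re semi-major axis implied by Kepler's third law agrees with the mean of the quoted perigee and apogee radii, consistent with the stated 2:1 resonance.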

  3. Trajectory Design for the Transiting Exoplanet Survey Satellite (TESS)

    NASA Technical Reports Server (NTRS)

    Dichmann, Donald J.; Parker, Joel; Williams, Trevor; Mendelsohn, Chad

    2014-01-01

    The Transiting Exoplanet Survey Satellite (TESS) is a National Aeronautics and Space Administration (NASA) mission launching in 2017. TESS will travel in a highly eccentric orbit around Earth, with initial perigee radius near 17 Earth radii (Re) and apogee radius near 59 Re. The orbit period is near 2:1 resonance with the Moon, with apogee nearly 90 degrees out-of-phase with the Moon, in a configuration that has been shown to be operationally stable. TESS will execute phasing loops followed by a lunar flyby, with a final maneuver to achieve 2:1 resonance with the Moon. The goals of a resonant orbit with long-term stability, short eclipses and limited oscillations of perigee present significant challenges to the trajectory design. To rapidly assess launch opportunities, we adapted the SWM76 launch window tool to assess the TESS mission constraints. To understand the long-term dynamics of such a resonant orbit in the Earth-Moon system we employed Dynamical Systems Theory in the Circular Restricted 3-Body Problem (CR3BP). For precise trajectory analysis we use a high-fidelity model and multiple shooting in the General Mission Analysis Tool (GMAT) to optimize the maneuver delta-V and meet mission constraints. Finally we describe how the techniques we have developed can be applied to missions with similar requirements.

  4. National Aquatic Resource Surveys: Integration of Geospatial Data in Their Survey Design and Analysis

    EPA Science Inventory

    The National Aquatic Resource Surveys (NARS) are a series of four statistical surveys conducted by the U.S. Environmental Protection Agency working in collaboration with states, tribal nations and other federal agencies. The surveys are conducted for lakes and reservoirs, streams...

  5. Ten Years of LibQual: A Study of Qualitative and Quantitative Survey Results at the University of Mississippi 2001-2010

    ERIC Educational Resources Information Center

    Greenwood, Judy T.; Watson, Alex P.; Dennis, Melissa

    2011-01-01

    This article analyzes quantitative adequacy gap scores and coded qualitative comments from LibQual surveys at the University of Mississippi from 2001 to 2010, looking for relationships between library policy changes and LibQual results and any other trends that emerged. Analysis found no relationship between changes in policy and survey results…

  6. Software for quantitative analysis of radiotherapy: overview, requirement analysis and design solutions.

    PubMed

    Zhang, Lanlan; Hub, Martina; Mang, Sarah; Thieke, Christian; Nix, Oliver; Karger, Christian P; Floca, Ralf O

    2013-06-01

    Radiotherapy is a fast-developing discipline which plays a major role in cancer care. Quantitative analysis of radiotherapy data can improve the success of the treatment and support the prediction of outcome. In this paper, we first identify functional, conceptional and general requirements on a software system for quantitative analysis of radiotherapy. Further we present an overview of existing radiotherapy analysis software tools and check them against the stated requirements. As none of them could meet all of the demands presented herein, we analyzed possible conceptional problems and present software design solutions and recommendations to meet the stated requirements (e.g. algorithmic decoupling via dose iterator pattern; analysis database design). As a proof of concept we developed a software library "RTToolbox" following the presented design principles. The RTToolbox is available as open source library and has already been tested in a larger-scale software system for different use cases. These examples demonstrate the benefit of the presented design principles. PMID:23523366
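    The "dose iterator pattern" named above decouples analysis algorithms from how dose data is stored. A hypothetical Python sketch of the idea follows; the function names and interfaces are invented for illustration and are not the RTToolbox API:

```python
# Sketch of algorithmic decoupling via a dose iterator: analysis code
# consumes a stream of (dose, volume) pairs and never touches the
# underlying storage. All interfaces here are hypothetical.
from typing import Iterator, Tuple

def grid_dose_iterator(dose_grid, voxel_volume) -> Iterator[Tuple[float, float]]:
    """Adapt a voxel dose grid (any nested iterable) to the common interface."""
    for row in dose_grid:
        for dose in row:
            yield dose, voxel_volume

def volume_at_least(dose_iter, threshold) -> float:
    """Generic DVH-style statistic: total volume receiving >= threshold Gy.
    Works with ANY iterator source: dense grid, sparse, database-backed."""
    return sum(vol for dose, vol in dose_iter if dose >= threshold)

grid = [[10.0, 25.0], [40.0, 55.0]]   # Gy, toy 2x2 dose grid
v20 = volume_at_least(grid_dose_iterator(grid, voxel_volume=0.001), 20.0)
print(f"V20Gy = {v20:.3f} cm^3")      # 3 voxels at or above 20 Gy
```

Because the statistic only sees the iterator, a new storage backend needs only a new adapter, which is the decoupling benefit the paper's design recommendation targets.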

  7. Optimal color filter array design: quantitative conditions and an efficient search procedure

    NASA Astrophysics Data System (ADS)

    Lu, Yue M.; Vetterli, Martin

    2009-01-01

    Most digital cameras employ a spatial subsampling process, implemented as a color filter array (CFA), to capture color images. The choice of CFA patterns has a great impact on the performance of subsequent reconstruction (demosaicking) algorithms. In this work, we propose a quantitative theory for optimal CFA design. We view the CFA sampling process as an encoding (low-dimensional approximation) operation and, correspondingly, demosaicking as the best decoding (reconstruction) operation. Finding the optimal CFA is thus equivalent to finding the optimal approximation scheme for the original signals with minimum information loss. We present several quantitative conditions for optimal CFA design, and propose an efficient computational procedure to search for the best CFAs that satisfy these conditions. Numerical experiments show that the optimal CFA patterns designed from the proposed procedure can effectively retain the information of the original full-color images. In particular, with the designed CFA patterns, high quality demosaicking can be achieved by using simple and efficient linear filtering operations in the polyphase domain. The visual qualities of the reconstructed images are competitive to those obtained by the state-of-the-art adaptive demosaicking algorithms based on the Bayer pattern.
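    As a concrete illustration of the "simple and efficient linear filtering" reconstruction the abstract refers to, bilinear interpolation of the green channel from a standard Bayer (RGGB) mosaic is shown below; note this uses the conventional Bayer pattern, not the paper's optimized CFA designs:

```python
# Bilinear demosaicking of the green channel from an RGGB Bayer mosaic:
# a fixed linear filter averaging the four cross neighbors at each
# non-green site. Standard Bayer pattern, not the paper's optimized CFAs.
import numpy as np

def bilinear_green(mosaic):
    """mosaic: 2-D array of raw sensor values (RGGB Bayer layout).
    Returns the interpolated green channel."""
    h, w = mosaic.shape
    rows, cols = np.indices((h, w))
    gmask = ((rows + cols) % 2 == 1).astype(float)   # green sites (RGGB)
    g = mosaic * gmask
    # Sum the 4 cross neighbors and count how many are green sites,
    # so the average is correctly normalized.
    neighbor_sum = np.zeros_like(g)
    neighbor_cnt = np.zeros_like(g)
    for dy, dx in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
        neighbor_sum += np.roll(np.roll(g, dy, axis=0), dx, axis=1)
        neighbor_cnt += np.roll(np.roll(gmask, dy, axis=0), dx, axis=1)
    # Keep measured green values; interpolate elsewhere.
    return np.where(gmask == 1, mosaic,
                    neighbor_sum / np.maximum(neighbor_cnt, 1))

# A flat green field should be reconstructed exactly.
flat = np.full((8, 8), 50.0)
out = bilinear_green(flat)
print(np.allclose(out, 50.0))
```

Each output pixel is a fixed linear combination of mosaic samples, which is the polyphase-domain filtering structure the paper generalizes when searching for optimal CFA patterns.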

  8. The Development of the Progressive in 19th Century English: A Quantitative Survey.

    ERIC Educational Resources Information Center

    Arnaud, Rene

    1998-01-01

    Expansion of the progressive (be+ing periphrastic form, where "be" is at the same time the copula and a statement of existence) was a major feature of modernization of the English verb system in the 19th century. A survey (1787-1880) of a collection of private letters, most from famous writers, reveals that linguistic factors played a small role…

  9. A Novel Simulation Technician Laboratory Design: Results of a Survey-Based Study

    PubMed Central

    Hughes, Patrick G; Friedl, Ed; Ortiz Figueroa, Fabiana; Cepeda Brito, Jose R; Frey, Jennifer; Birmingham, Lauren E; Atkinson, Steven Scott

    2016-01-01

    Objective: The purpose of this study was to elicit feedback from simulation technicians prior to developing the first simulation technician-specific simulation laboratory in Akron, OH. Background: Simulation technicians serve a vital role in simulation centers within hospitals/health centers around the world. The first simulation technician degree program in the US has been approved in Akron, OH. To satisfy the requirements of this program and to meet the needs of this special audience of learners, a customized simulation lab is essential. Method: A web-based survey was circulated to simulation technicians prior to completion of the lab for the new program. The survey consisted of questions aimed at identifying structural and functional design elements of a novel simulation center for the training of simulation technicians. Quantitative methods were utilized to analyze data. Results: Over 90% of technicians (n=65) think that a lab designed explicitly for the training of technicians is novel and beneficial. Approximately 75% of respondents think that the space provided appropriate audiovisual (AV) infrastructure and space to evaluate the ability of technicians to be independent. The respondents think that the lab needed more storage space, visualization space for a large number of students, and more space in the technical/repair area. Conclusions: A space designed for the training of simulation technicians was considered to be beneficial. This laboratory requires distinct space for technical repair, adequate bench space for the maintenance and repair of simulators, an appropriate AV infrastructure, and space to evaluate the ability of technicians to be independent. PMID:27096134

  10. Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method, 4th Edition

    ERIC Educational Resources Information Center

    Dillman, Don A.; Smyth, Jolene D.; Christian, Leah Melani

    2014-01-01

    For over two decades, Dillman's classic text on survey design has aided both students and professionals in effectively planning and conducting mail, telephone, and, more recently, Internet surveys. The new edition is thoroughly updated and revised, and covers all aspects of survey research. It features expanded coverage of mobile phones, tablets,…

  11. Inversion-free decentralised quantitative feedback design of large-scale systems

    NASA Astrophysics Data System (ADS)

    Labibi, B.; Mahdi Alavi, S. M.

    2016-06-01

    In this paper, a new method for robust decentralised control of multi-input multi-output (MIMO) systems using quantitative feedback theory (QFT) is suggested. The proposed method does not need inversion of the plant transfer function matrix in the design process. For a given system, an equivalent descriptor system representation is defined. By using this representation, sufficient conditions for closed-loop diagonal dominance over the uncertainty space are obtained. These conditions transform the original MIMO system into a set of isolated multi-input single-output (MISO) subsystems. Then, the local controllers are designed by using the typical MISO QFT technique for each isolated subsystem to satisfy the predefined desired specifications and the closed-loop diagonal dominance sufficient conditions. The proposed technique is less conservative in comparison to the approaches using the over-bounding concept in the design procedure. The effectiveness of the proposed technique is finally assessed on a MIMO SCARA robot.

  12. Rotorcraft control system design for uncertain vehicle dynamics using quantitative feedback theory

    NASA Technical Reports Server (NTRS)

    Hess, R. A.

    1994-01-01

    Quantitative Feedback Theory describes a frequency-domain technique for the design of multi-input, multi-output control systems which must meet time or frequency domain performance criteria when specified uncertainty exists in the linear description of the vehicle dynamics. This theory is applied to the design of the longitudinal flight control system for a linear model of the BO-105C rotorcraft. Uncertainty in the vehicle model is due to the variation in the vehicle dynamics over a range of airspeeds from 0-100 kts. For purposes of exposition, the vehicle description contains no rotor or actuator dynamics. The design example indicates the manner in which significant uncertainty exists in the vehicle model. The advantage of using a sequential loop closure technique to reduce the cost of feedback is demonstrated by example.

  13. Rotorcraft flight control design using quantitative feedback theory and dynamic crossfeeds

    NASA Technical Reports Server (NTRS)

    Cheng, Rendy P.

    1995-01-01

    A multi-input, multi-output control design with robust crossfeeds is presented for a rotorcraft in near-hovering flight using quantitative feedback theory (QFT). Decoupling criteria are developed for dynamic crossfeed design and implementation. Frequency dependent performance metrics focusing on piloted flight are developed and tested on 23 flight configurations. The metrics show that the resulting design is superior to alternative control system designs using conventional fixed-gain crossfeeds and to feedback-only designs which rely on high gains to suppress undesired off-axis responses. The use of dynamic, robust crossfeeds prior to the QFT design reduces the magnitude of required feedback gain and results in performance that meets current handling qualities specifications relative to the decoupling of off-axis responses. The combined effect of the QFT feedback design following the implementation of a low-order, dynamic crossfeed compensator successfully decouples ten of twelve off-axis channels. For the other two channels it was not possible to find a single, low-order crossfeed that was effective.

  14. Perspectives of Speech-Language Pathologists on the Use of Telepractice in Schools: Quantitative Survey Results

    PubMed Central

    Tucker, Janice K.

    2012-01-01

    This research surveyed 170 school-based speech-language pathologists (SLPs) in one northeastern state, with only 1.8% reporting telepractice use in school settings. These results were consistent with two ASHA surveys (2002, 2011) that reported limited use of telepractice for school-based speech-language pathology. In the present study, willingness to use telepractice was inversely related to age, perhaps because younger members of the profession are more accustomed to using technology. Overall, respondents were concerned about the validity of assessments administered via telepractice; whether clinicians can adequately establish rapport with clients via telepractice; and whether therapy conducted via telepractice can be as effective as in-person speech-language therapy. Most respondents indicated the need to establish procedures and guidelines for school-based telepractice programs. PMID:25945204

  15. SDSS-IV MaNGA: Survey Design and Progress

    NASA Astrophysics Data System (ADS)

    Yan, Renbin; MaNGA Team

    2016-01-01

    The ongoing SDSS-IV/MaNGA Survey will obtain integral field spectroscopy at a resolution of R~2000 with a wavelength coverage from 3,600A to 10,300A for 10,000 nearby galaxies. Within each 3 degree diameter pointing of the 2.5m Sloan Telescope, we deploy 17 hexagonal fiber bundles with sizes ranging from 12 to 32 arcsec in diameter. The bundles are built with 2 arcsec fibers and have a 56% fill factor. During observations, we obtain sets of exposures at 3 different dither positions to achieve near-critical sampling of the effective point spread function, which has a FWHM of about 2.5 arcsec, corresponding to 1-2 kpc for the majority of the galaxies targeted. The flux calibration is done using 12 additional mini-fiber-bundles targeting standard stars simultaneously with science targets, achieving a calibration accuracy better than 5% over 90% of the wavelength range. The target galaxies are selected to ensure uniform spatial coverage in units of effective radii for the majority of the galaxies while maximizing spatial resolution. About 2/3 of the sample is covered out to 1.5Re (primary sample) and 1/3 of the sample is covered to 2.5Re (secondary sample). The sample is designed to have approximately equal representation from high and low mass galaxies while maintaining volume-limited selection at fixed absolute magnitudes. We obtain an average S/N of 4 per Angstrom in r-band continuum at a surface brightness of 23 AB arcsec-2. With spectral stacking in an elliptical annulus covering 1-1.5Re, our primary sample galaxies have a median S/N of ~60 per Angstrom in r-band.
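
    The jump from S/N ~4 per spaxel to ~60 after annular stacking follows from inverse-variance averaging of roughly independent spectra, which can be sketched as follows. The spaxel count, flat continuum, and Gaussian noise model are illustrative assumptions, not the survey's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
n_spax, n_pix = 225, 200      # hypothetical spaxel and wavelength-pixel counts
signal, sigma = 1.0, 0.25     # flat continuum; per-pixel S/N = 4, as in the survey floor

spectra = signal + rng.normal(0.0, sigma, size=(n_spax, n_pix))
ivar = np.full((n_spax, n_pix), 1.0 / sigma ** 2)   # inverse variance per pixel

# Inverse-variance-weighted stack: S/N grows as sqrt(n_spax) for uncorrelated noise.
stack = np.sum(spectra * ivar, axis=0) / np.sum(ivar, axis=0)
stack_sigma = 1.0 / np.sqrt(np.sum(ivar, axis=0))
snr = float(np.mean(stack / stack_sigma))
print(f"per-spaxel S/N = {signal / sigma:.0f}, stacked S/N ~ {snr:.0f}")
```

    With 225 spaxels the stacked S/N is sqrt(225) = 15 times the per-spaxel value; correlated noise between dithered spaxels would lower this in practice.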

  16. Quantitative autistic traits ascertained in a national survey of 22 529 Japanese schoolchildren

    PubMed Central

    Kamio, Y; Inada, N; Moriwaki, A; Kuroda, M; Koyama, T; Tsujii, H; Kawakubo, Y; Kuwabara, H; Tsuchiya, K J; Uno, Y; Constantino, J N

    2013-01-01

    Objective Recent epidemiologic studies worldwide have documented a rise in prevalence rates for autism spectrum disorders (ASD). Broadening of diagnostic criteria for ASD may be a major contributor to the rise in prevalence, particularly if superimposed on an underlying continuous distribution of autistic traits. This study sought to determine the nature of the population distribution of autistic traits using a quantitative trait measure in a large national population sample of children. Method The Japanese version of the Social Responsiveness Scale (SRS) was completed by parents on a nationally representative sample of 22 529 children, age 6–15. Results Social Responsiveness Scale scores exhibited a skewed normal distribution in the Japanese population with a single-factor structure and no significant relation to IQ within the normal intellectual range. There was no evidence of a natural ‘cutoff’ that would differentiate populations of categorically affected children from unaffected children. Conclusion This study provides evidence of the continuous nature of autistic symptoms measured by the SRS, a validated quantitative trait measure. The findings reveal how paradigms for diagnosis that rest on arbitrarily imposed categorical cutoffs can result in substantial variation in prevalence estimation, especially when measurements used for case assignment are not standardized for a given population. PMID:23171198

  17. A Survey of Former Drafting & Engineering Design Technology Students. Summary Findings of Respondents District-Wide.

    ERIC Educational Resources Information Center

    Glyer-Culver, Betty

    In fall 2001 staff of the Los Rios Community College District Office of Institutional Research collaborated with occupational deans, academic deans, and faculty to develop and administer a survey of former Drafting and Engineering Design Technology students. The survey was designed to determine how well courses had met the needs of former drafting…

  18. Quantitative differential geomorphology of the Monterey Canyon from time-separated multibeam surveys

    NASA Astrophysics Data System (ADS)

    Taramelli, A.; Zucca, F.; Innocenti, C.; Sorichetta, A.; Seeber, L.

    2008-12-01

    Changes of bathymetry derived from multibeam sonars are useful for quantifying the effects of many sedimentary and tectonic processes. The assessment of resolution limits is an essential component of the analysis. This research examines submarine morphologies as they manifest tectonics in a rapidly deforming transform continental margin (Monterey Bay, California). We study modern submarine processes through geomorphic change using high-resolution multibeam bathymetry. We first used different techniques that quantify uncertainties and reveal the spatial variations of errors. A sub-area of immobile seafloor in the study area, mapped by the high-resolution multibeam record of the seafloor of the MBR collected by MBARI in each survey over a four-year period (spring 2003 to winter 2006), provides a common 'benchmark'. Each survey dataset over the benchmark is filtered with a simple moving-averaging window and depth differences between the two surveys are collated to derive a difference histogram. The procedure is repeated using different length-scales of filtering. By plotting the variability of the differences versus the length-scale of the filter, the different effects of spatially uncorrelated and correlated noise can be deduced. Besides that, a variography analysis is conducted on the dataset built by differencing the benchmark surveys to highlight spatial structures and anisotropies of the measurement errors. Data analysis of the Monterey Bay area indicates that the canyon floor contains an axial channel laterally bounded by elevated complex terrace surfaces. Asymmetrical megaripples dominate the active part of the canyon floor, indicating sediment transport. Terraces represent the evidence of recent degradation of the canyon floor. Slump scars and gullies of various sizes shape the canyon walls.
Significant changes over the analyzed period include: (a) complete reorganization of the megaripples on the channel floor, (b) local slump scar on the head of the canyon and on
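
    The benchmark filtering test described above can be sketched numerically: difference two noisy realizations of an immobile surface, apply moving-average windows of increasing length-scale, and watch the spread of the difference histogram shrink as 1/k when the noise is spatially uncorrelated (correlated noise would flatten this trend). Noise level and grid size are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 256
yy, xx = np.mgrid[0:n, 0:n]
seafloor = -950.0 + 5.0 * np.sin(xx / 40.0) * np.cos(yy / 55.0)  # immobile benchmark

sigma = 0.30   # per-sounding depth noise in metres (illustrative value)
survey_a = seafloor + rng.normal(0.0, sigma, (n, n))
survey_b = seafloor + rng.normal(0.0, sigma, (n, n))
diff = survey_b - survey_a        # true change is zero by construction

def boxcar2d(z, k):
    """Separable k x k moving-average (boxcar) filter."""
    kern = np.ones(k) / k
    out = np.apply_along_axis(lambda r: np.convolve(r, kern, mode="same"), 0, z)
    return np.apply_along_axis(lambda r: np.convolve(r, kern, mode="same"), 1, out)

# Spread of the difference histogram versus filter length-scale.
stds = {}
for k in (1, 3, 9):
    core = boxcar2d(diff, k)[k:n - k, k:n - k]   # trim filter edge effects
    stds[k] = float(core.std())
    print(f"window {k}: std of depth difference = {stds[k]:.3f} m")
```

    The unfiltered standard deviation is sigma * sqrt(2) from differencing two independent surveys; a k x k boxcar averages k^2 soundings, so the spread drops by a factor of k.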

  19. ESTIMATING AMPHIBIAN OCCUPANCY RATES IN PONDS UNDER COMPLEX SURVEY DESIGNS

    EPA Science Inventory

    Monitoring the occurrence of specific amphibian species in ponds is one component of the US Geological Survey's Amphibian Monitoring and Research Initiative. Two collaborative studies were conducted in Olympic National Park and southeastern region of Oregon. The number of ponds...

  20. Survey of Quantitative Research Metrics to Assess Pilot Performance in Upset Recovery

    NASA Technical Reports Server (NTRS)

    Le Vie, Lisa R.

    2016-01-01

    Accidents attributable to in-flight loss of control are the primary cause for fatal commercial jet accidents worldwide. The National Aeronautics and Space Administration (NASA) conducted a literature review to determine and identify the quantitative standards for assessing upset recovery performance. This review contains current recovery procedures for both military and commercial aviation and includes the metrics researchers use to assess aircraft recovery performance. Metrics include time to first input, recognition time, and recovery time, and whether that input was correct or incorrect. Other metrics included are: the state of the autopilot and autothrottle, control wheel/sidestick movement resulting in pitch and roll, and inputs to the throttle and rudder. In addition, airplane state measures, such as roll reversals, altitude loss/gain, maximum vertical speed, maximum/minimum air speed, maximum bank angle and maximum g loading are reviewed as well.

  1. The health effects of climate change: a survey of recent quantitative research.

    PubMed

    Grasso, Margherita; Manera, Matteo; Chiabai, Aline; Markandya, Anil

    2012-05-01

    In recent years there has been a large scientific and public debate on climate change and its direct as well as indirect effects on human health. In particular, a large amount of research on the effects of climate changes on human health has addressed two fundamental questions. First, can historical data be of some help in revealing how short-run or long-run climate variations affect the occurrence of infectious diseases? Second, is it possible to build more accurate quantitative models which are capable of predicting the future effects of different climate conditions on the transmissibility of particularly dangerous infectious diseases? The primary goal of this paper is to review the most relevant contributions which have directly tackled those questions, both with respect to the effects of climate changes on the diffusion of non-infectious and infectious diseases, with malaria as a case study. Specific attention will be drawn on the methodological aspects of each study, which will be classified according to the type of quantitative model considered, namely time series models, panel data and spatial models, and non-statistical approaches. Since many different disciplines and approaches are involved, a broader view is necessary in order to provide a better understanding of the interactions between climate and health. In this respect, our paper also presents a critical summary of the recent literature related to more general aspects of the impacts of climate changes on human health, such as: the economics of climate change; how to manage the health effects of climate change; the establishment of Early Warning Systems for infectious diseases. PMID:22754455

  2. The Health Effects of Climate Change: A Survey of Recent Quantitative Research

    PubMed Central

    Grasso, Margherita; Manera, Matteo; Chiabai, Aline; Markandya, Anil

    2012-01-01

    In recent years there has been a large scientific and public debate on climate change and its direct as well as indirect effects on human health. In particular, a large amount of research on the effects of climate changes on human health has addressed two fundamental questions. First, can historical data be of some help in revealing how short-run or long-run climate variations affect the occurrence of infectious diseases? Second, is it possible to build more accurate quantitative models which are capable of predicting the future effects of different climate conditions on the transmissibility of particularly dangerous infectious diseases? The primary goal of this paper is to review the most relevant contributions which have directly tackled those questions, both with respect to the effects of climate changes on the diffusion of non-infectious and infectious diseases, with malaria as a case study. Specific attention will be drawn on the methodological aspects of each study, which will be classified according to the type of quantitative model considered, namely time series models, panel data and spatial models, and non-statistical approaches. Since many different disciplines and approaches are involved, a broader view is necessary in order to provide a better understanding of the interactions between climate and health. In this respect, our paper also presents a critical summary of the recent literature related to more general aspects of the impacts of climate changes on human health, such as: the economics of climate change; how to manage the health effects of climate change; the establishment of Early Warning Systems for infectious diseases. PMID:22754455

  3. Phenotype selection for detecting variable genes: a survey of cardiovascular quantitative traits and TNF locus polymorphism.

    PubMed

    Hong, Mun-Gwan; Bennet, Anna M; de Faire, Ulf; Prince, Jonathan A

    2007-06-01

    The practice of using discrete clinical diagnoses in genetic association studies has seldom led to a replicable genetic model. If, as the literature suggests, weak genotype-phenotype relationships are detected when clinical diagnoses are used, power might be increased by exploring more fundamental biological traits. Emerging solutions to this include directly modeling levels of the protein product of a gene (usually in plasma) and sequence variation specifically in/around that gene, as well as exploring multiple quantitative traits related to a disease of interest. Here, we attempt a strategy based upon these premises, examining sequence variants near the TNF locus, a region widely studied in cardiovascular disease. Multilocus genotype models were used to perform a systematic screen of 18 metabolic and anthropometric traits for genetic association. While there was no evidence for an effect of TNF polymorphism on plasma TNF levels, a relatively strong effect on plasma PAI-1 levels did emerge (P=0.000019), but this was only evident in post-myocardial infarction patients. Modeled jointly with the common 4G/5G insertion/deletion polymorphism of SERPINE1 (formerly PAI), this effect appears large (10% of variance explained versus 2% for SERPINE1 4G/5G). We exhibit this finding cautiously, and use it to illustrate how transitioning the study of disease risk to quantitative traits might empower the identification of functionally variable genes. Further, a case is highlighted where association between sequence variation in a gene and its product is not readily apparent even in large samples, but where association with a down-stream pathway may be. PMID:17356550

  4. Design and performance of a thin-film calorimeter for quantitative characterization of photopolymerizable systems

    NASA Astrophysics Data System (ADS)

    Roper, Todd M.; Guymon, C. Allan; Hoyle, Charles E.

    2005-05-01

    A thin-film calorimeter (TFC) was designed for the quantitative characterization of photopolymerizable systems. A detailed description of its construction indicates the ease with which a TFC can be assembled and the flexibility inherent in its design. The mechanics of operation were optimized to yield a significantly faster instrument response time than other calorimetric methods such as photodifferential scanning calorimetry (photo-DSC). The TFC has enhanced sensitivity, with a linear response range to changes in light intensity more than an order of magnitude greater than that of photo-DSC, resulting in the ability to measure both smaller and larger signals more accurately. The photopolymerization exotherm curves are reproducible and can be collected over a broad range of film thicknesses.

  5. A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework

    NASA Astrophysics Data System (ADS)

    Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo

    An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, Data Base Management Systems (DBMS), etc. in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. Identification and hierarchy of the framework requirements and the corresponding solutions for the reference MDO frameworks, a general one and an aircraft-oriented one, were carefully investigated. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improvement of the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
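
    The AHP step, which turns qualitative pairwise judgments into quantitative weights, can be sketched as follows; the three criteria and the comparison values are hypothetical, not those of the cited study.

```python
import numpy as np

# Pairwise comparison matrix on Saaty's 1-9 scale for three hypothetical
# framework criteria (usability, integration, extensibility); a_ij ~ w_i / w_j.
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3.0, 1.0, 2.0],
              [1 / 5.0, 1 / 2.0, 1.0]])

# AHP priority vector = principal eigenvector of A, normalised to sum to 1.
vals, vecs = np.linalg.eig(A)
i = int(np.argmax(vals.real))
w = np.abs(vecs[:, i].real)
w /= w.sum()

# Consistency check: CR = CI / RI, with CR < 0.1 the usual acceptance rule.
lam = vals.real[i]
ci = (lam - 3.0) / (3.0 - 1.0)
ri = 0.58                      # Saaty's random index for a 3 x 3 matrix
cr = ci / ri
print("criterion weights:", np.round(w, 3), " consistency ratio:", round(cr, 3))
```

    The resulting weights can then feed a QFD matrix that scores each candidate framework against the weighted requirements.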

  6. Edesign: Primer and Enhanced Internal Probe Design Tool for Quantitative PCR Experiments and Genotyping Assays

    PubMed Central

    Kasahara, Naoko; Delobel, Diane; Hanami, Takeshi; Tanaka, Yuki; de Hoon, Michiel J. L.; Hayashizaki, Yoshihide; Usui, Kengo; Harbers, Matthias

    2016-01-01

    Analytical PCR experiments preferably use internal probes for monitoring the amplification reaction and specific detection of the amplicon. Such internal probes have to be designed in close context with the amplification primers, and may require additional considerations for the detection of genetic variations. Here we describe Edesign, a new online and stand-alone tool for designing sets of PCR primers together with an internal probe for conducting quantitative real-time PCR (qPCR) and genotyping experiments. Edesign can be used for selecting standard DNA oligonucleotides, such as TaqMan probes, but has been further extended with new functions and enhanced design features for Eprobes. Eprobes, with their single thiazole orange-labelled nucleotide, allow for highly sensitive genotyping assays because of their higher DNA binding affinity compared with standard DNA oligonucleotides. Using new thermodynamic parameters, Edesign considers unique features of Eprobes during primer and probe design for establishing qPCR experiments and genotyping by melting curve analysis. Additional functions in Edesign allow probe design for effective discrimination between wild-type sequences and genetic variations, using either standard DNA oligonucleotides or Eprobes. Edesign can be freely accessed online at http://www.dnaform.com/edesign2/, and the source code is available for download. PMID:26863543
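
    As a much cruder stand-in for the nearest-neighbor thermodynamics a tool like Edesign applies, the classic Wallace rule illustrates how base composition feeds into a melting-temperature estimate during probe selection. The probe sequence below is hypothetical, and the rule is only a rough heuristic for short oligos.

```python
def wallace_tm(seq):
    """Crude melting-temperature estimate (Wallace rule: 2 C per A/T, 4 C per G/C).
    Real design tools use nearest-neighbor thermodynamic parameters instead;
    this heuristic is only indicative for short (< ~14 nt) oligonucleotides."""
    seq = seq.upper()
    at = sum(seq.count(b) for b in "AT")
    gc = sum(seq.count(b) for b in "GC")
    return 2 * at + 4 * gc

def gc_fraction(seq):
    """GC content, another quick screen applied during probe selection."""
    seq = seq.upper()
    return sum(seq.count(b) for b in "GC") / len(seq)

probe = "ACGTGCTAGCTA"   # hypothetical 12-mer probe
print(f"Tm ~ {wallace_tm(probe)} C, GC fraction = {gc_fraction(probe):.2f}")
```

    Melting-curve genotyping then relies on the Tm shift that a single mismatch induces, which is exactly what the nearest-neighbor (rather than composition-only) parameters are needed to predict.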

  7. SKA Weak Lensing II: Simulated Performance and Survey Design Considerations

    NASA Astrophysics Data System (ADS)

    Bonaldi, Anna; Harrison, Ian; Camera, Stefano; Brown, Michael L.

    2016-08-01

    We construct a pipeline for simulating weak lensing cosmology surveys with the Square Kilometre Array (SKA), taking as inputs telescope sensitivity curves; correlated source flux, size and redshift distributions; a simple ionospheric model; source redshift and ellipticity measurement errors. We then use this simulation pipeline to optimise a 2-year weak lensing survey performed with the first deployment of the SKA (SKA1). Our assessments are based on the total signal-to-noise of the recovered shear power spectra, a metric that we find to correlate very well with a standard dark energy figure of merit. We first consider the choice of frequency band, trading off increases in number counts at lower frequencies against poorer resolution; our analysis strongly prefers the higher frequency Band 2 (950-1760 MHz) channel of the SKA-MID telescope to the lower frequency Band 1 (350-1050 MHz). Best results would be obtained by allowing the centre of Band 2 to shift towards lower frequency, around 1.1 GHz. We then move on to consider survey size, finding that an area of 5,000 square degrees is optimal for most SKA1 instrumental configurations. Finally, we forecast the performance of a weak lensing survey with the second deployment of the SKA. The increased survey size (3π steradians) and sensitivity improve both the signal-to-noise and the dark energy metrics by two orders of magnitude.
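
    A total power-spectrum signal-to-noise metric of this kind can be sketched under Gaussian statistics; the spectrum amplitude, slope, and source densities below are purely illustrative toy values, not the SKA simulation outputs.

```python
import numpy as np

def total_snr(area_deg2, nbar_arcmin2, cl, sigma_eps=0.3, lmin=10, lmax=3000):
    """Total S/N of a shear power spectrum under Gaussian statistics:
    (S/N)^2 = sum_l f_sky * (2l + 1)/2 * [C_l / (C_l + N_l)]^2,
    with shape noise N_l = sigma_eps^2 / nbar (nbar per steradian)."""
    fsky = area_deg2 / 41253.0                         # full sky ~ 41253 deg^2
    nbar_sr = nbar_arcmin2 * (180.0 * 60.0 / np.pi) ** 2
    ell = np.arange(lmin, lmax + 1)
    c = cl(ell)
    noise = sigma_eps ** 2 / nbar_sr
    return float(np.sqrt(np.sum(fsky * (2 * ell + 1) / 2.0 * (c / (c + noise)) ** 2)))

# Toy convergence power spectrum; amplitude and slope are illustrative only.
cl = lambda ell: 1e-9 * (ell / 100.0) ** -1.2

snrs = {}
for nbar in (1.0, 3.0):      # source densities in galaxies per arcmin^2
    snrs[nbar] = total_snr(5000.0, nbar, cl)
    print(f"nbar = {nbar:.0f} /arcmin^2 over 5000 deg^2 -> total S/N = {snrs[nbar]:.1f}")
```

    The metric captures the depth-versus-area trade-off the paper optimises: more area raises f_sky, while higher source density suppresses the shape-noise term.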

  8. Acoustical Surveys Of Methane Plumes By Using The Quantitative Echo Sounder In The Eastern Margin Of The Sea of Japan

    NASA Astrophysics Data System (ADS)

    Aoyama, C.; Matsumoto, R.; Okuda, Y.; Ishida, Y.; Hiruta, A.; Sunamura, M.; Numanami, H.; Tomaru, H.; Snyder, G.; Komatsubara, J.; Takeuchi, R.; Hiromatsu, M.; Aoyama, D.; Koike, Y.; Takeda, S.; Hayashi, T.; Hamada, H.

    2004-12-01

    The research and training vessel Umitaka-maru sailed to the methane seep area on a small ridge in the eastern margin of the Sea of Japan in July and August 2004 to survey ocean-floor gas hydrate and related acoustic signatures of methane plumes by using a quantitative echo sounder. Detailed bathymetric profiles have revealed a number of mounds, pockmarks and collapse structures within a 3 km x 4 km area on the ridge at water depths of 910 m to 980 m. We mapped the methane plumes in detail using a quantitative echo sounder with positioning data from GPS. We also measured the averaged echo intensity from the methane plumes with the echo integrator, both in every 100 m range and every minute. We obtained the following results from the present echo-sounder survey. 1) We identified 36 plumes on the echograms, ranging from 100 m to 200 m in diameter and 600 m to 700 m in height, reaching up to 200 m to 300 m below sea level. 2) We measured the averaged volume backscattering strength (SV) of each methane plume. The strongest SV of the plumes, -45 dB, was stronger than that of fish schools. 3) The averaged SV tends to show the highest values around the middle of the plumes, whereas the SVs are relatively low at the bottom and the top of the plumes. 4) Some of the plumes were observed to show daily fluctuations in height and width. 5) We recovered several fist-sized chunks of methane hydrate by piston coring in the area where we observed methane plumes. As a follow-up project, we are planning to measure the SV of methane bubbles and methane hydrate floating in water columns through experimental studies in a large water tank.
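
    One practical detail behind "averaged SV" figures such as these: backscattering strength is logarithmic, so averaging should be done on the linear backscattering coefficients and converted back to dB, not on the dB values themselves. A minimal sketch with illustrative sample values:

```python
import math

def mean_sv_db(sv_db_samples):
    """Average volume backscattering strengths correctly: convert dB to linear
    backscattering coefficients, take the mean, and convert back to dB."""
    linear = [10 ** (s / 10.0) for s in sv_db_samples]
    return 10.0 * math.log10(sum(linear) / len(linear))

# Illustrative per-ping SV values (dB) through a plume layer
samples = [-45.0, -55.0, -65.0]
naive = sum(samples) / len(samples)
print(f"naive dB mean = {naive:.1f} dB, linear-domain mean = {mean_sv_db(samples):.1f} dB")
```

    The naive dB mean underestimates the true average because it is dominated by the weak pings; the linear-domain mean stays close to the strongest scatterers, which matter for detecting plumes against fish-school backgrounds.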

  9. Applications of numerical optimization methods to helicopter design problems - A survey

    NASA Technical Reports Server (NTRS)

    Miura, H.

    1985-01-01

    A survey of applications of mathematical programming methods to improve the design of helicopters and their components is presented. Applications of multivariable search techniques in finite-dimensional space are considered. Five categories of helicopter design problems are considered: (1) conceptual and preliminary design, (2) rotor-system design, (3) airframe structures design, (4) control system design, and (5) flight trajectory planning. Key technical progress in numerical optimization methods relevant to rotorcraft applications is summarized.

  10. Applications of numerical optimization methods to helicopter design problems - A survey

    NASA Technical Reports Server (NTRS)

    Miura, H.

    1984-01-01

    A survey of applications of mathematical programming methods to improve the design of helicopters and their components is presented. Applications of multivariable search techniques in finite-dimensional space are considered. Five categories of helicopter design problems are considered: (1) conceptual and preliminary design, (2) rotor-system design, (3) airframe structures design, (4) control system design, and (5) flight trajectory planning. Key technical progress in numerical optimization methods relevant to rotorcraft applications is summarized.

  11. Applications of numerical optimization methods to helicopter design problems: A survey

    NASA Technical Reports Server (NTRS)

    Miura, H.

    1984-01-01

    A survey of applications of mathematical programming methods to improve the design of helicopters and their components is presented. Applications of multivariable search techniques in finite-dimensional space are considered. Five categories of helicopter design problems are considered: (1) conceptual and preliminary design, (2) rotor-system design, (3) airframe structures design, (4) control system design, and (5) flight trajectory planning. Key technical progress in numerical optimization methods relevant to rotorcraft applications is summarized.

  12. THE IMACS CLUSTER BUILDING SURVEY. V. FURTHER EVIDENCE FOR STARBURST RECYCLING FROM QUANTITATIVE GALAXY MORPHOLOGIES

    SciTech Connect

    Abramson, Louis E.; Gladders, Michael D.; Dressler, Alan; Oemler, Augustus Jr.; Monson, Andrew; Persson, Eric; Poggianti, Bianca M.; Vulcani, Benedetta

    2013-11-10

    Using J- and K{sub s}-band imaging obtained as part of the IMACS Cluster Building Survey (ICBS), we measure Sérsic indices for 2160 field and cluster galaxies at 0.31 < z < 0.54. Using both mass- and magnitude-limited samples, we compare the distributions for spectroscopically determined passive, continuously star-forming, starburst, and post-starburst systems and show that previously established spatial and statistical connections between these types extend to their gross morphologies. Outside of cluster cores, we find close structural ties between starburst and continuously star-forming, as well as post-starburst and passive types, but not between starbursts and post-starbursts. These results independently support two conclusions presented in Paper II of this series: (1) most starbursts are the product of a non-disruptive triggering mechanism that is insensitive to global environment, such as minor mergers; (2) starbursts and post-starbursts generally represent transient phases in the lives of 'normal' star-forming and quiescent galaxies, respectively, originating from and returning to these systems in closed 'recycling' loops. In this picture, spectroscopically identified post-starbursts constitute a minority of all recently terminated starbursts, largely ruling out the typical starburst as a quenching event in all but the densest environments.

  13. Hydrological drought types in cold climates: quantitative analysis of causing factors and qualitative survey of impacts

    NASA Astrophysics Data System (ADS)

    Van Loon, A. F.; Ploum, S. W.; Parajka, J.; Fleig, A. K.; Garnier, E.; Laaha, G.; Van Lanen, H. A. J.

    2015-04-01

    For drought management and prediction, knowledge of causing factors and socio-economic impacts of hydrological droughts is crucial. Propagation of meteorological conditions in the hydrological cycle results in different hydrological drought types that require separate analysis. In addition to the existing hydrological drought typology, we here define two new drought types related to snow and ice. A snowmelt drought is a deficiency in the snowmelt discharge peak in spring in snow-influenced basins and a glaciermelt drought is a deficiency in the glaciermelt discharge peak in summer in glacierised basins. In 21 catchments in Austria and Norway we studied the meteorological conditions in the seasons preceding and at the time of snowmelt and glaciermelt drought events. Snowmelt droughts in Norway were mainly controlled by below-average winter precipitation, while in Austria both temperature and precipitation played a role. For glaciermelt droughts, the effect of below-average summer air temperature was dominant, both in Austria and Norway. Subsequently, we investigated the impacts of temperature-related drought types (i.e. snowmelt and glaciermelt drought, but also cold and warm snow season drought and rain-to-snow-season drought). In historical archives and drought databases for the US and Europe many impacts were found that can be attributed to these temperature-related hydrological drought types, mainly in the agriculture and electricity production (hydropower) sectors. However, drawing conclusions on the frequency of occurrence of different drought types from reported impacts is difficult, mainly because of reporting biases and the inevitably limited spatial and temporal scales of the information. Finally, this study shows that complete integration of quantitative analysis of causing factors and qualitative analysis of impacts of temperature-related droughts is not yet possible. Analysis of selected events, however, points out that it can be a promising research

  14. ESTIMATING PROPORTION OF AREA OCCUPIED UNDER COMPLEX SURVEY DESIGNS

    EPA Science Inventory

    Estimating proportion of sites occupied, or proportion of area occupied (PAO) is a common problem in environmental studies. Typically, field surveys do not ensure that occupancy of a site is made with perfect detection. Maximum likelihood estimation of site occupancy rates when...
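    The truncated abstract points at the standard site-occupancy likelihood, in which a site is occupied with probability ψ and, if occupied, detected on each visit with probability p; a site with no detections may be occupied-but-missed or truly unoccupied. A hedged sketch of that maximum-likelihood fit on simulated detection histories (all sizes and parameter values invented; this ignores the complex-survey stratification and weighting the record refers to):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import binom

def neg_log_lik(theta, y, K):
    # theta holds logits of psi (occupancy) and p (per-visit detection)
    psi = 1 / (1 + np.exp(-theta[0]))
    p = 1 / (1 + np.exp(-theta[1]))
    lik = np.where(
        y > 0,
        psi * binom.pmf(y, K, p),          # detected on at least one visit
        psi * (1 - p) ** K + (1 - psi),    # never detected: missed or absent
    )
    return -np.sum(np.log(lik))

# Simulated histories: 200 sites, 4 visits each, true psi = 0.6, p = 0.4
rng = np.random.default_rng(1)
z = rng.random(200) < 0.6                  # true occupancy state
y = rng.binomial(4, 0.4, 200) * z          # detections only at occupied sites
fit = minimize(neg_log_lik, x0=[0.0, 0.0], args=(y, 4))
psi_hat, p_hat = 1 / (1 + np.exp(-fit.x))  # back-transform to probabilities
```

The key point the abstract makes is visible here: a naive estimate (fraction of sites with any detection) would confound ψ with p, while the joint likelihood separates them.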

  15. Ergonomic Based Design and Survey of Elementary School Furniture

    ERIC Educational Resources Information Center

    Maheshwar; Jawalkar, Chandrashekhar S.

    2014-01-01

    This paper presents the ergonomic aspects in designing and prototyping of desks cum chairs used in elementary schools. The procedures adopted for the assessment included: the study of existing school furniture, design analysis and development of prototypes. The design approach proposed a series of adjustable desks and chairs developed in terms of…

  16. Estimating effects of a single gene and polygenes on quantitative traits from a diallel design.

    PubMed

    Lou, Xiang-Yang; Yang, Mark C K

    2006-01-01

    A genetic model is developed with additive and dominance effects of a single gene and polygenes as well as general and specific reciprocal effects for the progeny from a diallel mating design. The methods of ANOVA, minimum norm quadratic unbiased estimation (MINQUE), restricted maximum likelihood estimation (REML), and maximum likelihood estimation (ML) are suggested for estimating variance components, and the methods of generalized least squares (GLS) and ordinary least squares (OLS) for fixed effects, while best linear unbiased prediction, linear unbiased prediction (LUP), and adjusted unbiased prediction are suggested for analyzing random effects. Monte Carlo simulations were conducted to evaluate the unbiasedness and efficiency of these statistical methods for two diallel designs with commonly used sample sizes (6 and 8 parents, without and with missing crosses, respectively). Simulation results show that GLS and OLS are almost equally efficient for estimation of fixed effects, while MINQUE(1) and REML are better estimators of the variance components and LUP is the most practical method for prediction of random effects. Data from a Drosophila melanogaster experiment (Gilbert 1985a, Theor Appl Genet 69:625-629) were used as a working example to demonstrate the statistical analysis. The new methodology is also applicable to screening candidate gene(s) and to other mating designs with multiple parents, such as nested (NC Design I) and factorial (NC Design II) designs. Moreover, this methodology can serve as a guide to develop new methods for detecting indiscernible major genes and mapping quantitative trait loci based on mixture distribution theory. The computer program for the methods suggested in this article is freely available from the authors. PMID:17028974
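    As a generic illustration of the GLS and OLS estimators contrasted in this abstract (not the full diallel mixed model), the sketch below fits both on data with a known heteroscedastic error covariance V; all sizes and values are made up. GLS weights each observation by the inverse covariance, which is what makes it efficient when errors are unequal:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 60
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + one fixed effect
beta_true = np.array([2.0, 0.5])
V = np.diag(rng.uniform(0.5, 3.0, n))                  # heteroscedastic error covariance
y = X @ beta_true + rng.normal(size=n) * np.sqrt(np.diag(V))

# OLS: (X'X)^{-1} X'y -- ignores V entirely
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# GLS: (X' V^{-1} X)^{-1} X' V^{-1} y -- downweights noisy observations
Vinv = np.linalg.inv(V)
beta_gls = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)
```

Both estimators are unbiased here, which matches the abstract's finding that they are almost equally efficient for fixed effects; GLS gains only when V departs strongly from a multiple of the identity.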

  17. Influenza knowledge, attitude, and behavior survey for grade school students: design and novel assessment methodology.

    PubMed

    Koep, Tyler H; Huskins, W Charles; Clemens, Christal; Jenkins, Sarah; Pierret, Chris; Ekker, Stephen C; Enders, Felicity T

    2014-12-01

    Despite the fact that infectious diseases can spread readily in grade schools, few studies have explored prevention in this setting. Additionally, we lack valid tools for students to self-report knowledge, attitudes, and behaviors. As part of an ongoing study of a curriculum intervention to promote healthy behaviors, we developed and evaluated age-appropriate surveys to determine students' understanding of influenza prevention. Surveys were adapted from adolescent and adult influenza surveys and administered to students in grades 2-5 (ages 7-11) at two Rochester public schools. We assessed student understanding by analyzing percent repeatability of 20 survey questions and compared percent "don't know" (DK) responses across grades, gender, and race. Questions thought to be ambiguous after early survey administration were investigated in student focus groups, modified as appropriate, and reassessed. The response rate across all surveys was >87%. Survey questions were well understood: 16 of 20 questions demonstrated strong pre/post repeatability (>70%). Only 1 question showed an increase in DK responses in higher grades (p < .0001). Statistical analysis and qualitative feedback led to modification of 3 survey questions and improved measures of understanding in the final survey administration. Grade-school students' knowledge, attitudes, and behaviors toward influenza prevention can be assessed using surveys. Quantitative and qualitative analysis may be used to assess participant understanding and refine survey development for pediatric survey instruments. These methods may be used to assess the repeatability and validity of surveys to assess the impact of health education interventions in young children. PMID:24859735

  18. The JCMT Gould Belt Survey: a quantitative comparison between SCUBA-2 data reduction methods

    NASA Astrophysics Data System (ADS)

    Mairs, S.; Johnstone, D.; Kirk, H.; Graves, S.; Buckle, J.; Beaulieu, S. F.; Berry, D. S.; Broekhoven-Fiene, H.; Currie, M. J.; Fich, M.; Hatchell, J.; Jenness, T.; Mottram, J. C.; Nutter, D.; Pattle, K.; Pineda, J. E.; Salji, C.; Francesco, J. Di; Hogerheijde, M. R.; Ward-Thompson, D.; JCMT Gould Belt survey Team

    2015-12-01

    Performing ground-based submillimetre observations is a difficult task, as the measurements are subject to absorption and emission from water vapour in the Earth's atmosphere and to time variation in weather and instrument stability. Removing these features and other artefacts from the data is a vital process which affects the characteristics of the recovered astronomical structure we seek to study. In this paper, we explore two data reduction methods for data taken with the Submillimetre Common-User Bolometer Array-2 (SCUBA-2) at the James Clerk Maxwell Telescope (JCMT). The JCMT Legacy Reduction 1 (JCMT LR1) and the Gould Belt Legacy Survey Legacy Release 1 (GBS LR1) reductions both use the same software (STARLINK) but differ in their choice of data reduction parameters. We find that the JCMT LR1 reduction is suitable for determining whether or not compact emission is present in a given region, while the GBS LR1 reduction is tuned in a robust way to uncover more extended emission, which better serves more in-depth physical analyses of star-forming regions. Using the GBS LR1 method, we find that compact sources are recovered well, even at a peak brightness of only three times the noise, whereas the reconstruction of larger objects requires much care when drawing boundaries around the expected astronomical signal in the data reduction process. Incorrect boundaries can lead to false structure identification or cause structure to be missed. In the JCMT LR1 reduction, the extent of the true structure of objects larger than a point source is never fully recovered.

  19. Quantitative imaging of the human upper airway: instrument design and clinical studies

    NASA Astrophysics Data System (ADS)

    Leigh, M. S.; Armstrong, J. J.; Paduch, A.; Sampson, D. D.; Walsh, J. H.; Hillman, D. R.; Eastwood, P. R.

    2006-08-01

    Imaging of the human upper airway is widely used in medicine, in both clinical practice and research. Common imaging modalities include video endoscopy, X-ray CT, and MRI. However, no current modality is both quantitative and safe to use for extended periods of time. Such a capability would be particularly valuable for sleep research, which is inherently reliant on long observation sessions. We have developed an instrument capable of quantitative imaging of the human upper airway, based on endoscopic optical coherence tomography. There are no dose limits for optical techniques, and the minimally invasive imaging probe is safe for use in overnight studies. We report on the design of the instrument and its use in preliminary clinical studies, and we present results from a range of initial experiments. The experiments show that the instrument is capable of imaging during sleep, and that it can record dynamic changes in airway size and shape. This information is useful for research into sleep disorders, and potentially for clinical diagnosis and therapies.

  20. A comparative survey of non-adaptive pooling designs

    SciTech Connect

    Balding, D.J.; Bruno, W.J.; Torney, D.C.

    1996-12-31

    Pooling (or "group testing") designs for screening clone libraries for rare "positives" are described and compared. We focus on non-adaptive designs in which, in order both to facilitate automation and to minimize the total number of pools required in multiple screenings, all the pools are specified in advance of the experiments. The designs considered include deterministic designs, such as set-packing designs, the widely-used "row and column" designs and the more general "transversal" designs, as well as random designs such as "random incidence" and "random k-set" designs. A range of possible performance measures is considered, including the expected numbers of unresolved positive and negative clones, and the probability of a one-pass solution. We describe a flexible strategy in which the experimenter chooses a compromise between the random k-set and the set-packing designs. In general, the latter have superior performance while the former are nearly as efficient and are easier to construct. 39 refs., 1 fig., 4 tabs.
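    The screening logic behind these designs can be sketched as a noiseless group-testing simulation: each clone is assigned to k pools in advance (a random k-set design), every pool is tested once, and a clone remains a candidate positive only if all of its pools tested positive. All sizes below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)
n_clones, n_pools, k = 200, 30, 6   # illustrative library and design sizes

# Random k-set design: each clone is placed in exactly k distinct pools,
# all chosen before any experiment is run (non-adaptive).
design = np.zeros((n_pools, n_clones), dtype=bool)
for j in range(n_clones):
    design[rng.choice(n_pools, size=k, replace=False), j] = True

# One screening pass with 3 rare positives; a pool tests positive
# iff it contains at least one positive clone (no testing errors).
truth = np.zeros(n_clones, dtype=bool)
truth[rng.choice(n_clones, size=3, replace=False)] = True
pool_positive = (design.astype(int) @ truth.astype(int)) > 0

# Decoding: a clone is resolved negative as soon as any of its pools tests
# negative; the surviving candidates always include every true positive.
candidates = ~((design & ~pool_positive[:, None]).any(axis=0))
unresolved_negatives = candidates & ~truth
```

The count of `unresolved_negatives` after one pass is exactly the "expected number of unresolved negative clones" performance measure the abstract compares across designs.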

  1. Controls design with crossfeeds for hovering rotorcraft using quantitative feedback theory

    NASA Technical Reports Server (NTRS)

    Tischler, Mark B.; Biezad, Daniel J.; Cheng, Rendy

    1996-01-01

    A multi-input, multi-output controls design with dynamic crossfeed pre-compensation is presented for rotorcraft in near-hovering flight using Quantitative Feedback Theory (QFT). The resulting closed-loop control system bandwidth allows the rotorcraft to be considered for use as an inflight simulator. The use of dynamic, robust crossfeeds prior to the QFT design reduces the magnitude of required feedback gain and results in performance that meets most handling qualities specifications relative to the decoupling of off-axis responses. Handling qualities are Level 1 for both low-gain tasks and high-gain tasks in the roll, pitch, and yaw axes except for the 10 deg/sec moderate-amplitude yaw command where the rotorcraft exhibits Level 2 handling qualities in the yaw axis caused by phase lag. The combined effect of the QFT feedback design following the implementation of low-order, dynamic crossfeed compensators successfully decouples ten of twelve off-axis channels. For the other two channels it was not possible to find a single, low-order crossfeed that was effective. This is an area to be investigated in future research.

  2. Probability of detection of nests and implications for survey design

    USGS Publications Warehouse

    Smith, P.A.; Bart, J.; Lanctot, Richard B.; McCaffery, B.J.; Brown, S.

    2009-01-01

    Surveys based on double sampling include a correction for the probability of detection by assuming complete enumeration of birds in an intensively surveyed subsample of plots. To evaluate this assumption, we calculated the probability of detecting active shorebird nests by using information from observers who searched the same plots independently. Our results demonstrate that this probability varies substantially by species and stage of the nesting cycle but less by site or density of nests. Among the species we studied, the estimated single-visit probability of nest detection during the incubation period varied from 0.21 for the White-rumped Sandpiper (Calidris fuscicollis), the most difficult species to detect, to 0.64 for the Western Sandpiper (Calidris mauri), the most easily detected species, with a mean across species of 0.46. We used these detection probabilities to predict the fraction of persistent nests found over repeated nest searches. For a species with the mean value for detectability, the detection rate exceeded 0.85 after four visits. This level of nest detection was exceeded in only three visits for the Western Sandpiper, but six to nine visits were required for the White-rumped Sandpiper, depending on the type of survey employed. Our results suggest that the double-sampling method's requirement of nearly complete counts of birds in the intensively surveyed plots is likely to be met for birds with nests that survive over several visits of nest searching. Individuals with nests that fail quickly or individuals that do not breed can be detected with high probability only if territorial behavior is used to identify likely nesting pairs. © The Cooper Ornithological Society, 2009.
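    Under the simplifying assumption of independent searches with a constant single-visit detection probability p, the repeated-visit figures quoted above follow directly from the cumulative detection probability 1 - (1 - p)^n:

```python
def cumulative_detection(p, visits):
    """Probability a persistent nest is found at least once over `visits`
    independent searches, each with single-visit detection probability p."""
    return 1 - (1 - p) ** visits

# Mean species in the study (p = 0.46): detection passes 0.85 by the 4th visit.
mean_by_4 = cumulative_detection(0.46, 4)

# Hardest species (White-rumped Sandpiper, p = 0.21): first visit count that
# pushes cumulative detection past 0.85 under this simple model.
hardest_visits = next(n for n in range(1, 30)
                      if cumulative_detection(0.21, n) > 0.85)
```

For p = 0.46 this gives about 0.91 after four visits, and nine visits for p = 0.21, consistent with the four-visit and six-to-nine-visit figures reported above (the lower end of that range reflects survey types not captured by this constant-p model).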

  3. Rapid surveys for program evaluation: design and implementation of an experiment in Ecuador.

    PubMed

    Macintyre, K; Bilsborrow, R E; Olmedo, C; Carrasco, R

    1999-09-01

    This paper presents details from the field test of two rapid surveys in Ecuador in 1995. It focuses on how the surveys were designed and implemented, including descriptions of the sampling procedures, the preparation and use of preprogrammed palmtop computers for data entry, the selection criteria for the interviewing team, and how the training was designed. Lessons are drawn that will assist health professionals plan and carry out better rapid data collection in the future. The objective of the study was to evaluate the reliability and validity of data gathered during the rapid surveys as compared with a recent "gold standard" national survey. A two-way factorial design was used to control for differences in sampling (probability versus quasi-probability) and methods of data collection (paper versus palmtop computer). Few differences were detected between the surveys done on palmtops as compared to paper ones, but urban and rural differentials in contraceptive use were less pronounced in the rapid surveys than in the earlier, national survey. This suggests that caution should be exercised in interpreting the disaggregated data in these rapid surveys. In-depth interviews revealed two features of the rapid surveys that were especially popular: the palmtops for their speed of data entry, and the short questionnaire for its "low impact" on a respondent's time. The common belief that computers would disturb respondents was not found to be the case. Even with no computer experience, the interviewers rapidly mastered the new technology. PMID:10517097

  4. Estimating occupancy rates with imperfect detection under complex survey designs

    EPA Science Inventory

    Monitoring the occurrence of specific amphibian species is of interest. Typically, the monitoring design is a complex design that involves stratification and unequal probability of selection. When conducting field visits to selected sites, a common problem is that during a singl...

  5. 23 CFR 1340.10 - Submission and approval of seat belt survey design.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 23 Highways 1 2012-04-01 2012-04-01 false Submission and approval of seat belt survey design. 1340.10 Section 1340.10 Highways NATIONAL HIGHWAY TRAFFIC SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION UNIFORM CRITERIA FOR STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE Administrative Requirements § 1340.10 Submission and approval of seat...

  6. Survey design for lakes and reservoirs in the United States to assess contaminants in fish tissue

    EPA Science Inventory

    The National Lake Fish Tissue Study (NLFTS) was the first survey of fish contamination in lakes and reservoirs in the 48 conterminous states based on probability survey design. This study included the largest set (268) of persistent, bioaccumulative, and toxic (PBT) chemicals ev...

  7. The Laboratory Course Assessment Survey: A Tool to Measure Three Dimensions of Research-Course Design

    ERIC Educational Resources Information Center

    Corwin, Lisa A.; Runyon, Christopher; Robinson, Aspen; Dolan, Erin L.

    2015-01-01

    Course-based undergraduate research experiences (CUREs) are increasingly being offered as scalable ways to involve undergraduates in research. Yet few if any design features that make CUREs effective have been identified. We developed a 17-item survey instrument, the Laboratory Course Assessment Survey (LCAS), that measures students' perceptions…

  8. National Comorbidity Survey Replication Adolescent Supplement (NCS-A): II. Overview and Design

    ERIC Educational Resources Information Center

    Kessler, Ronald C.; Avenevoli, Shelli; Costello, E. Jane; Green, Jennifer Greif; Gruber, Michael J.; Heeringa, Steven; Merikangas, Kathleen R.; Pennell, Beth-Ellen; Sampson, Nancy A.; Zaslavsky, Alan M.

    2009-01-01

    The national comorbidity survey that seeks to determine the prevalence and correlates of mental disorders among U.S. adolescents is based on a dual-frame design that includes 904 adolescents from a previous household survey and 9,244 adolescent students from a sample of 320 schools. Replacement schools for those that refuse to participate do not…

  9. A survey of aerobraking orbital transfer vehicle design concepts

    NASA Technical Reports Server (NTRS)

    Park, Chul

    1987-01-01

    The five existing design concepts of the aerobraking orbital transfer vehicle (namely, the raked sphere-cone designs, conical lifting-brake, raked elliptic-cone, lifting-body, and ballute) are reviewed and critiqued. Historical backgrounds, and the geometrical, aerothermal, and operational features of these designs are reviewed first. Then, the technological requirements for the vehicle (namely, navigation, aerodynamic stability and control, afterbody flow impingement, nonequilibrium radiation, convective heat-transfer rates, mission abort and multiple atmospheric passes, transportation and construction, and the payload-to-vehicle weight requirements) are delineated by summarizing the recent advancements made on these issues. Each of the five designs are critiqued and rated on these issues. The highest and the lowest ratings are given to the raked sphere-cone and the ballute design, respectively.

  10. 23 CFR 1340.11 - Post-approval alterations to survey design.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... UNIFORM CRITERIA FOR STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE Administrative Requirements § 1340.11..., sample design, seat belt use rate estimation method, variance estimation method and data...

  11. 23 CFR 1340.11 - Post-approval alterations to survey design.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... UNIFORM CRITERIA FOR STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE Administrative Requirements § 1340.11..., sample design, seat belt use rate estimation method, variance estimation method and data...

  12. 23 CFR 1340.11 - Post-approval alterations to survey design.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... UNIFORM CRITERIA FOR STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE Administrative Requirements § 1340.11..., sample design, seat belt use rate estimation method, variance estimation method and data...

  13. Targeting Urban Watershed Stressor Gradients: Stream Survey Design, Ecological Responses, and Implications of Land Cover Resolution

    EPA Science Inventory

    We conducted a stream survey in the Narragansett Bay Watershed designed to target a gradient of development intensity, and to examine how associated changes in nutrients, carbon, and stressors affect periphyton and macroinvertebrates. Concentrations of nutrients, cations, and ani...

  14. Effect of survey design and catch rate estimation on total catch estimates in Chinook salmon fisheries

    USGS Publications Warehouse

    McCormick, Joshua L.; Quist, Michael C.; Schill, Daniel J.

    2012-01-01

    Roving–roving and roving–access creel surveys are the primary techniques used to obtain information on harvest of Chinook salmon Oncorhynchus tshawytscha in Idaho sport fisheries. Once interviews are conducted using roving–roving or roving–access survey designs, mean catch rate can be estimated with the ratio-of-means (ROM) estimator, the mean-of-ratios (MOR) estimator, or the MOR estimator with exclusion of short-duration (≤0.5 h) trips. Our objective was to examine the relative bias and precision of total catch estimates obtained from use of the two survey designs and three catch rate estimators for Idaho Chinook salmon fisheries. Information on angling populations was obtained by direct visual observation of portions of Chinook salmon fisheries in three Idaho river systems over an 18-d period. Based on data from the angling populations, Monte Carlo simulations were performed to evaluate the properties of the catch rate estimators and survey designs. Among the three estimators, the ROM estimator provided the most accurate and precise estimates of mean catch rate and total catch for both roving–roving and roving–access surveys. On average, the root mean square error of simulated total catch estimates was 1.42 times greater and relative bias was 160.13 times greater for roving–roving surveys than for roving–access surveys. Length-of-stay bias and nonstationary catch rates in roving–roving surveys both appeared to affect catch rate and total catch estimates. Our results suggest that use of the ROM estimator in combination with an estimate of angler effort provided the least biased and most precise estimates of total catch for both survey designs. However, roving–access surveys were more accurate than roving–roving surveys for Chinook salmon fisheries in Idaho.
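    The two catch-rate estimators compared above differ only in where the averaging happens: the ratio-of-means divides total catch by total effort, while the mean-of-ratios averages per-trip rates (optionally dropping short trips). A minimal sketch on invented interview data; the trip catches, hours, and total-effort figure are all hypothetical:

```python
import numpy as np

# Hypothetical completed-trip interviews: (fish caught, hours fished)
catch = np.array([0, 2, 1, 0, 3, 1, 0, 4])
hours = np.array([1.5, 4.0, 2.5, 0.5, 6.0, 3.0, 0.5, 5.0])

rom = catch.sum() / hours.sum()        # ratio of means: total catch / total effort
mor = np.mean(catch / hours)           # mean of ratios: average of per-trip rates
mor_long = np.mean((catch / hours)[hours > 0.5])  # MOR excluding <=0.5 h trips

# A total-catch estimate multiplies the chosen rate by an independent
# estimate of total angler effort (e.g. from an aerial or access count).
total_effort_hours = 1200.0            # assumed effort estimate
total_catch_rom = rom * total_effort_hours
```

Note that ROM and MOR generally disagree on the same data; the simulation study above found ROM the least biased choice for both survey designs.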

  15. Systematic review of effects of current transtibial prosthetic socket designs--Part 2: Quantitative outcomes.

    PubMed

    Safari, Mohammad Reza; Meier, Margrit Regula

    2015-01-01

    This review is an attempt to untangle the complexity of transtibial prosthetic socket fit and perhaps find some indication of whether a particular prosthetic socket type might be best for a given situation. In addition, we identified knowledge gaps, thus providing direction for possible future research. We followed the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines, using medical subject headings and standard key words to search for articles in relevant databases. No restrictions were made on study design and type of outcome measure used. From the obtained search results (n = 1,863), 35 articles were included. The relevant data were entered into a predefined data form that included the Downs and Black risk of bias assessment checklist. This article presents the results from the systematic review of the quantitative outcomes (n = 27 articles). Trends indicate that vacuum-assisted suction sockets improve gait symmetry, volume control, and residual limb health more than other socket designs. Hydrostatic sockets seem to create less inconsistent socket fittings, reducing a problem that greatly influences outcome measures. Knowledge gaps exist in the understanding of clinically meaningful changes in socket fit and its effect on biomechanical outcomes. Further, safe and comfortable pressure thresholds under various conditions should be determined through a systematic approach. PMID:26436733

  16. Preliminary design of the Kunlun Dark Universe Survey Telescope (KDUST)

    NASA Astrophysics Data System (ADS)

    Yuan, Xiangyan; Cui, Xiangqun; Su, Ding-qiang; Zhu, Yongtian; Wang, Lifan; Gu, Bozhong; Gong, Xuefei; Li, Xinnan

    2013-01-01

    From four years of theoretical analysis and site-testing work at Dome A, Antarctica, we can reasonably predict that it is a very good astronomical site, as good as or even better than Dome C, and suitable for observations ranging from optical to infrared and sub-mm wavelengths. Following the Chinese Small Telescope ARray (CSTAR), composed of four small fixed telescopes with a diameter of 145 mm, and the three Antarctic Survey Telescopes (AST3) with a 500 mm entrance diameter, the Kunlun Dark Universe Survey Telescope (KDUST), with a diameter of 2.5 m, is proposed. KDUST will adopt an innovative optical system that can deliver very good image quality over a 2 square degree flat field of view. Other features include: a fixed focus suitable for different instruments; active optics for miscollimation correction; a lens-prism that can serve as an atmospheric dispersion corrector or as a very low-dispersion spectrometer when moved into or out of the main optical path without changing the performance of the system; and a compact structure to ease transportation to Dome A. KDUST will be mounted on a 15 m tower in order to make full use of the superb free-atmosphere seeing.

  17. First National Survey of Lead and Allergens in Housing: survey design and methods for the allergen and endotoxin components.

    PubMed Central

    Vojta, Patrick J; Friedman, Warren; Marker, David A; Clickner, Robert; Rogers, John W; Viet, Susan M; Muilenberg, Michael L; Thorne, Peter S; Arbes, Samuel J; Zeldin, Darryl C

    2002-01-01

    From July 1998 to August 1999, the U.S. Department of Housing and Urban Development and the National Institute of Environmental Health Sciences conducted the first National Survey of Lead and Allergens in Housing. The purpose of the survey was to assess children's potential household exposure to lead, allergens, and bacterial endotoxins. We surveyed a sample of 831 homes, representing 96 million permanently occupied, noninstitutional housing units that permit resident children. We administered questionnaires to household members, made home observations, and took environmental samples. This article provides general background information on the survey, an overview of the survey design, and a description of the data collection and laboratory methods pertaining to the allergen and endotoxin components. We collected dust samples from a bed, the bedroom floor, a sofa or chair, the living room floor, the kitchen floor, and a basement floor and analyzed them for cockroach allergen Bla g 1, the dust mite allergens Der f 1 and Der p 1, the cat allergen Fel d 1, the dog allergen Can f 1, the rodent allergens Rat n 1 and mouse urinary protein, allergens of the fungus Alternaria alternata, and endotoxin. This article provides the essential context for subsequent reports that will describe the prevalence of allergens and endotoxin in U.S. households, their distribution by various housing characteristics, and their associations with allergic diseases such as asthma and rhinitis. PMID:12003758

  18. THE HETDEX PILOT SURVEY. I. SURVEY DESIGN, PERFORMANCE, AND CATALOG OF EMISSION-LINE GALAXIES

    SciTech Connect

    Adams, Joshua J.; Blanc, Guillermo A.; Gebhardt, Karl; Hao, Lei; Byun, Joyce; Fry, Alex; Jeong, Donghui; Komatsu, Eiichiro; Hill, Gary J.; Cornell, Mark E.; MacQueen, Phillip J.; Drory, Niv; Bender, Ralf; Hopp, Ulrich; Kelzenberg, Ralf; Ciardullo, Robin; Gronwall, Caryl; Finkelstein, Steven L.; Gawiser, Eric; Kelz, Andreas

    2011-01-15

    We present a catalog of emission-line galaxies selected solely by their emission-line fluxes using a wide-field integral field spectrograph. This work is partially motivated as a pilot survey for the upcoming Hobby-Eberly Telescope Dark Energy Experiment. We describe the observations, reductions, detections, redshift classifications, line fluxes, and counterpart information for 397 emission-line galaxies detected over 169 arcmin² with a 3500-5800 Å bandpass under 5 Å full-width-half-maximum (FWHM) spectral resolution. The survey's best sensitivity for unresolved objects under photometric conditions is between 4 and 20 × 10⁻¹⁷ erg s⁻¹ cm⁻² depending on the wavelength, and Lyα luminosities between 3 × 10⁴² and 6 × 10⁴² erg s⁻¹ are detectable. This survey method complements narrowband and color-selection techniques in the search of high-redshift galaxies with its different selection properties and large volume probed. The four survey fields within the COSMOS, GOODS-N, MUNICS, and XMM-LSS areas are rich with existing, complementary data. We find 105 galaxies via their high-redshift Lyα emission at 1.9 < z < 3.8, and the majority of the remainder objects are low-redshift [O II]3727 emitters at z < 0.56. The classification between low- and high-redshift objects depends on rest-frame equivalent width (EW), as well as other indicators, where available. Based on matches to X-ray catalogs, the active galactic nuclei fraction among the Lyα emitters is 6%. We also analyze the survey's completeness and contamination properties through simulations. We find five high-z, highly significant, resolved objects with FWHM sizes >44 arcsec² which appear to be extended Lyα nebulae. We also find three high-z objects with rest-frame Lyα EW above the level believed to be achievable with normal star formation, EW₀ > 240 Å. Future papers will investigate the physical properties of this sample.

  19. The HETDEX Pilot Survey. I. Survey Design, Performance, and Catalog of Emission-line Galaxies

    NASA Astrophysics Data System (ADS)

    Adams, Joshua J.; Blanc, Guillermo A.; Hill, Gary J.; Gebhardt, Karl; Drory, Niv; Hao, Lei; Bender, Ralf; Byun, Joyce; Ciardullo, Robin; Cornell, Mark E.; Finkelstein, Steven L.; Fry, Alex; Gawiser, Eric; Gronwall, Caryl; Hopp, Ulrich; Jeong, Donghui; Kelz, Andreas; Kelzenberg, Ralf; Komatsu, Eiichiro; MacQueen, Phillip J.; Murphy, Jeremy; Odoms, P. Samuel; Roth, Martin; Schneider, Donald P.; Tufts, Joseph R.; Wilkinson, Christopher P.

    2011-01-01

    We present a catalog of emission-line galaxies selected solely by their emission-line fluxes using a wide-field integral field spectrograph. This work is partially motivated as a pilot survey for the upcoming Hobby-Eberly Telescope Dark Energy Experiment. We describe the observations, reductions, detections, redshift classifications, line fluxes, and counterpart information for 397 emission-line galaxies detected over 169 □′ with a 3500-5800 Å bandpass under 5 Å full-width-half-maximum (FWHM) spectral resolution. The survey's best sensitivity for unresolved objects under photometric conditions is between 4 and 20 × 10⁻¹⁷ erg s⁻¹ cm⁻² depending on the wavelength, and Lyα luminosities between 3 × 10⁴² and 6 × 10⁴² erg s⁻¹ are detectable. This survey method complements narrowband and color-selection techniques in the search for high-redshift galaxies with its different selection properties and large volume probed. The four survey fields within the COSMOS, GOODS-N, MUNICS, and XMM-LSS areas are rich with existing, complementary data. We find 105 galaxies via their high-redshift Lyα emission at 1.9 < z < 3.8; the majority of the remaining objects are low-redshift [O II] λ3727 emitters at z < 0.56. The classification between low- and high-redshift objects depends on rest-frame equivalent width (EW), as well as other indicators, where available. Based on matches to X-ray catalogs, the active galactic nuclei fraction among the Lyα emitters is 6%. We also analyze the survey's completeness and contamination properties through simulations. We find five high-z, highly significant, resolved objects with FWHM sizes >44 □′ which appear to be extended Lyα nebulae. We also find three high-z objects with rest-frame Lyα EW above the level believed to be achievable with normal star formation, EW₀ > 240 Å. Future papers will investigate the physical properties of this sample. This paper includes data taken at The McDonald Observatory of The University of Texas at Austin.

  20. Temporally Adaptive Sampling: A Case Study in Rare Species Survey Design with Marbled Salamanders (Ambystoma opacum)

    PubMed Central

    Charney, Noah D.; Kubel, Jacob E.; Eiseman, Charles S.

    2015-01-01

    Improving detection rates for elusive species with clumped distributions is often accomplished through adaptive sampling designs. This approach can be extended to include species with temporally variable detection probabilities. By concentrating survey effort in years when the focal species are most abundant or visible, overall detection rates can be improved. This requires either long-term monitoring at a few locations where the species are known to occur or models capable of predicting population trends using climatic and demographic data. For marbled salamanders (Ambystoma opacum) in Massachusetts, we demonstrate that annual variation in detection probability of larvae is regionally correlated. In our data, the difference in survey success between years was far more important than the difference among the three survey methods we employed: diurnal surveys, nocturnal surveys, and dipnet surveys. Based on these data, we simulate future surveys to locate unknown populations under a temporally adaptive sampling framework. In the simulations, when pond dynamics are correlated over the focal region, the temporally adaptive design improved mean survey success by as much as 26% over a non-adaptive sampling design. Employing a temporally adaptive strategy costs very little, is simple, and has the potential to substantially improve the efficient use of scarce conservation funds. PMID:25799224

  1. Temporally adaptive sampling: a case study in rare species survey design with marbled salamanders (Ambystoma opacum).

    PubMed

    Charney, Noah D; Kubel, Jacob E; Eiseman, Charles S

    2015-01-01

    Improving detection rates for elusive species with clumped distributions is often accomplished through adaptive sampling designs. This approach can be extended to include species with temporally variable detection probabilities. By concentrating survey effort in years when the focal species are most abundant or visible, overall detection rates can be improved. This requires either long-term monitoring at a few locations where the species are known to occur or models capable of predicting population trends using climatic and demographic data. For marbled salamanders (Ambystoma opacum) in Massachusetts, we demonstrate that annual variation in detection probability of larvae is regionally correlated. In our data, the difference in survey success between years was far more important than the difference among the three survey methods we employed: diurnal surveys, nocturnal surveys, and dipnet surveys. Based on these data, we simulate future surveys to locate unknown populations under a temporally adaptive sampling framework. In the simulations, when pond dynamics are correlated over the focal region, the temporally adaptive design improved mean survey success by as much as 26% over a non-adaptive sampling design. Employing a temporally adaptive strategy costs very little, is simple, and has the potential to substantially improve the efficient use of scarce conservation funds. PMID:25799224
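
    The payoff of concentrating effort in good years can be illustrated with a toy simulation. It is a minimal sketch with invented detection rates and year-quality probabilities, not the authors' model.

```python
import random

random.seed(42)

def survey_success(adaptive, n_years=2000, p_good_year=0.5,
                   p_detect_good=0.6, p_detect_bad=0.1):
    """Per-survey detection rate at an occupied pond.

    Each year is regionally 'good' or 'bad' for larval detection
    (the regional correlation reported in the paper).  An adaptive
    surveyor consults a long-term monitoring pond and surveys new
    ponds only in good years; a non-adaptive surveyor goes out
    every year.  All rates here are illustrative assumptions.
    """
    detections = surveys = 0
    for _ in range(n_years):
        good = random.random() < p_good_year
        if adaptive and not good:
            continue                      # save the effort this year
        surveys += 1
        p = p_detect_good if good else p_detect_bad
        detections += random.random() < p
    return detections / surveys

baseline = survey_success(adaptive=False)  # ~ (0.6 + 0.1) / 2
adaptive = survey_success(adaptive=True)   # ~ 0.6
```

    Under these assumptions the adaptive strategy's per-survey success approaches the good-year detection rate, in the same spirit as the paper's reported gains of up to 26% when pond dynamics are regionally correlated.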

  2. Survey of sodium removal methods: LMFBR conceptual design study, Phase 3

    SciTech Connect

    1981-09-01

    At the project design review of the nuclear island maintenance on May 5, 1981, DOE requested a survey of current sodium cleaning methods and facilities. Stone & Webster provided a plan and schedule for this survey, which was approved by Boeing Engineering and Construction Company. The purpose of the survey is to document sodium removal technology and experience as it relates to the CDS Large Developmental Plant, summarize the information, and provide a perspective for the CDS project. The recommendations generated are intended to provide input for a design and layout review of the Nuclear Island Maintenance Building (NIMB).

  3. Thermal design for the Advanced Camera for Surveys

    NASA Astrophysics Data System (ADS)

    Rafal, Marc D.

    1998-08-01

    The Advanced Camera for Surveys (ACS) is a third-generation science instrument scheduled for installation into the Hubble Space Telescope (HST) during the third servicing mission in 1999. ACS, along with the previously installed Space Telescope Imaging Spectrograph and Near-IR Camera/Multi-Object Spectrograph, consumes significantly more power than the first generation of instruments. Additionally, the larger apertures of these instruments make parallel operations scientifically exciting. These parallel operations demand that all of the instruments operate in their highest power states simultaneously for extended periods of time. These and other factors have resulted in much higher temperatures inside the aft shroud where the ACS will be installed. As a result, new approaches are required to transfer heat inside the instrument and reject it away from the telescope. This paper describes the unique thermal systems required by the ACS, including capillary pump loops and flexible and rigid heat pipes.

  4. Trading accuracy for speed: A quantitative comparison of search algorithms in protein sequence design.

    PubMed

    Voigt, C A; Gordon, D B; Mayo, S L

    2000-06-01

    Finding the minimum energy amino acid side-chain conformation is a fundamental problem in both homology modeling and protein design. To address this issue, numerous computational algorithms have been proposed. However, there have been few quantitative comparisons between methods and there is very little general understanding of the types of problems that are appropriate for each algorithm. Here, we study four common search techniques: Monte Carlo (MC) and Monte Carlo plus quench (MCQ); genetic algorithms (GA); self-consistent mean field (SCMF); and dead-end elimination (DEE). Both SCMF and DEE are deterministic, and if DEE converges, it is guaranteed that its solution is the global minimum energy conformation (GMEC). This provides a means to compare the accuracy of SCMF and the stochastic methods. For the side-chain placement calculations, we find that DEE rapidly converges to the GMEC in all the test cases. The other algorithms converge on significantly incorrect solutions; the average fraction of incorrect rotamers for SCMF is 0.12, GA 0.09, and MCQ 0.05. For the protein design calculations, design positions are progressively added to the side-chain placement calculation until the time required for DEE diverges sharply. As the complexity of the problem increases, the accuracy of each method is determined so that the results can be extrapolated into the region where DEE is no longer tractable. We find that both SCMF and MCQ perform reasonably well on core calculations (fraction amino acids incorrect is SCMF 0.07, MCQ 0.04), but fail considerably on the boundary (SCMF 0.28, MCQ 0.32) and surface calculations (SCMF 0.37, MCQ 0.44). PMID:10835284
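
    As a concrete illustration of one of the stochastic methods compared above, here is a minimal Monte Carlo plus quench (MCQ) search on a toy rotamer-placement problem. The energy function, problem size, and parameters are invented for the sketch; at this scale the true global minimum energy conformation (GMEC) can be found by brute force for comparison.

```python
import math
import random
from itertools import product

random.seed(0)

N_POS, N_ROT = 5, 4   # 5 design positions, 4 rotamers each (toy size)

# Invented pairwise energies between neighbouring positions.
pair = [[[random.uniform(-1.0, 1.0) for _ in range(N_ROT)]
         for _ in range(N_ROT)] for _ in range(N_POS - 1)]

def energy(conf):
    return sum(pair[i][conf[i]][conf[i + 1]] for i in range(N_POS - 1))

def mc_quench(steps=2000, temp=1.0):
    """Metropolis Monte Carlo, then a greedy single-rotamer quench."""
    conf = [random.randrange(N_ROT) for _ in range(N_POS)]
    e = energy(conf)
    for _ in range(steps):                       # stochastic phase
        trial = conf[:]
        trial[random.randrange(N_POS)] = random.randrange(N_ROT)
        e_t = energy(trial)
        if e_t < e or random.random() < math.exp(-(e_t - e) / temp):
            conf, e = trial, e_t
    improved = True                              # quench phase
    while improved:
        improved = False
        for i in range(N_POS):
            for r in range(N_ROT):
                trial = conf[:]
                trial[i] = r
                if energy(trial) < e - 1e-12:
                    conf, e, improved = trial, energy(trial), True
    return conf, e

conf, e_mcq = mc_quench()
gmec = min(product(range(N_ROT), repeat=N_POS), key=energy)
# e_mcq can never beat the enumerated global minimum
```

    Comparing `e_mcq` with `energy(gmec)` mirrors the paper's use of DEE's provable GMEC as the accuracy yardstick for the stochastic searchers.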

  5. THE BRIGHTEST OF REIONIZING GALAXIES SURVEY: DESIGN AND PRELIMINARY RESULTS

    SciTech Connect

    Trenti, M.; Bradley, L. D.; Stiavelli, M.; MacKenty, J. W.; Oesch, P.; Carollo, C. M.; Treu, T.; Bouwens, R. J.; Illingworth, G. D.; Shull, J. M.

    2011-02-01

    We present the first results on the search for very bright (M_AB ≈ -21) galaxies at redshift z ≈ 8 from the Brightest of Reionizing Galaxies (BoRG) survey. BoRG is a Hubble Space Telescope Wide Field Camera 3 (WFC3) pure-parallel survey that is obtaining images on random lines of sight at high Galactic latitudes in four filters (F606W, F098M, F125W, and F160W), with integration times optimized to identify galaxies at z ≳ 7.5 as F098M dropouts. We discuss here results from a search area of approximately 130 arcmin² over 23 BoRG fields, complemented by six other pure-parallel WFC3 fields with similar filters. This new search area is more than two times wider than previous WFC3 observations at z ≈ 8. We identify four F098M-dropout candidates with high statistical confidence (detected at greater than 8σ confidence in F125W). These sources are among the brightest candidates currently known at z ≈ 8 and approximately 10 times brighter than the z = 8.56 galaxy UDFy-38135539. They thus represent ideal targets for spectroscopic follow-up observations and could potentially lead to a redshift record, as our color selection includes objects up to z ≈ 9. However, the expected contamination rate of our sample is about 30% higher than typical searches for dropout galaxies in legacy fields, such as the GOODS and HUDF, where deeper data and additional optical filters are available to reject contaminants.

  6. Using GIS to generate spatially balanced random survey designs for natural resource applications.

    PubMed

    Theobald, David M; Stevens, Don L; White, Denis; Urquhart, N Scott; Olsen, Anthony R; Norman, John B

    2007-07-01

    Sampling of a population is frequently required to understand trends and patterns in natural resource management because financial and time constraints preclude a complete census. A rigorous probability-based survey design specifies where to sample so that inferences from the sample apply to the entire population. Probability survey designs should be used in natural resource and environmental management situations because they provide the mathematical foundation for statistical inference. The development of long-term monitoring designs demands survey designs that achieve statistical rigor and are efficient but remain flexible to inevitable logistical or practical constraints during field data collection. Here we describe an approach to probability-based survey design, called the Reversed Randomized Quadrant-Recursive Raster, based on the concept of spatially balanced sampling and implemented in a geographic information system. This provides environmental managers a practical tool to generate flexible and efficient survey designs for natural resource applications. Factors commonly used to modify sampling intensity, such as categories, gradients, or accessibility, can be readily incorporated into the spatially balanced sample design. PMID:17546523
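
    The idea behind hierarchical, randomized quadrant addressing in spatially balanced designs of this family can be sketched as follows. This is a toy illustration, not the published Reversed Randomized Quadrant-Recursive Raster algorithm: recursion depth, frame size, and sample size are arbitrary choices.

```python
import random

random.seed(1)

LEVELS = 6
# One independent random relabelling of the four quadrants per level.
perms = [random.sample(range(4), 4) for _ in range(LEVELS)]

def address(x, y):
    """Randomized hierarchical quadrant address of a point in [0,1)^2.

    Sorting points by this address and taking every k-th point gives
    an approximately spatially balanced systematic sample.
    """
    addr = []
    for lev in range(LEVELS):
        quad = (int(x * 2) << 1) | int(y * 2)    # which quadrant, 0..3
        addr.append(perms[lev][quad])            # randomized order
        x, y = (x * 2) % 1.0, (y * 2) % 1.0      # recurse into quadrant
    return tuple(addr)

frame = [(random.random(), random.random()) for _ in range(400)]
ordered = sorted(frame, key=lambda p: address(*p))
sample = ordered[:: len(ordered) // 40][:40]     # systematic sample of 40
left = sum(1 for x, _ in sample if x < 0.5)      # rough balance check
```

    Because consecutive addresses stay spatially close, the systematic sample spreads across the study area, e.g. splitting roughly evenly between the left and right halves of the unit square.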

  7. Design study of the deepsky ultraviolet survey telescope. [Spacelab payload

    NASA Technical Reports Server (NTRS)

    Page, N. A.; Callaghan, F. G.; Killen, R. H.; Willis, W.

    1977-01-01

    Preliminary mechanical design and specifications are presented for a wide field ultraviolet telescope and detector to be carried as a Spacelab payload. Topics discussed include support structure stiffness (torsional and bending), mirror assembly, thermal control, optical alignment, attachment to the instrument pointing pallet, control and display, power requirements, acceptance and qualification test plans, cost analysis and scheduling. Drawings are included.

  8. Design Trends in Editorial Presentation: A Survey of Business Communicators.

    ERIC Educational Resources Information Center

    Culpepper, Maryanne G.

    This study examines the design and editing procedures of business publications--publications for employees, stockholders, and combinations of these audiences. Following a review of the literature which turned up little information on business publications, it was decided that a mail questionnaire sent to a sample of business publication editors…

  9. Survey of electrical submersible systems design, application, and testing

    SciTech Connect

    Durham, M.O.; Lea, J.F.

    1996-05-01

    The electrical submersible pump industry has numerous recommended practices and procedures addressing various facets of the operation. Ascertaining the appropriate technique is tedious. Seldom are all the documents available at one location. This synopsis of all the industry practices provides a ready reference for testing, design, and application of electrical submersible pumping systems. An extensive bibliography identifies significant documents for further reference.

  10. Improved Optical Design for the Large Synoptic Survey Telescope (LSST)

    SciTech Connect

    Seppala, L

    2002-09-24

    This paper presents an improved optical design for the LSST, an f/1.25 three-mirror telescope covering a 3.0 degree full field angle with a 6.9 m effective aperture diameter. The telescope operates in five wavelength bands spanning 386.5 nm to 1040 nm (B, V, R, I, and Z). For all bands, 80% of the polychromatic diffracted energy is collected within 0.20 arc-seconds diameter. The reflective telescope uses an 8.4 m f/1.06 concave primary, a 3.4 m convex secondary, and a 5.2 m concave tertiary in a Paul geometry. The system length is 9.2 m. A refractive corrector near the detector uses three fused silica lenses, rather than the two lenses of previous designs. Earlier designs required that one element be a vacuum barrier, but now the detector sits in an inert gas at ambient pressure; the last lens is the gas barrier. Small adjustments lead to optimal correction at each band. The filters have different axial thicknesses. The primary and tertiary mirrors are repositioned for each wavelength band. The new optical design incorporates features to simplify manufacturing, including a flat detector, a far less aspheric convex secondary (10 µm from best-fit sphere), and reduced aspheric departures on the lenses and tertiary mirror. Five aspheric surfaces, on all three mirrors and on two lenses, are used. The primary is nearly parabolic. The telescope is fully baffled so that no specularly reflected light from any field angle, inside or outside of the full field angle of 3.0 degrees, can reach the detector.

  11. Using simulation to evaluate wildlife survey designs: polar bears and seals in the Chukchi Sea.

    PubMed

    Conn, Paul B; Moreland, Erin E; Regehr, Eric V; Richmond, Erin L; Cameron, Michael F; Boveng, Peter L

    2016-01-01

    Logistically demanding and expensive wildlife surveys should ideally yield defensible estimates. Here, we show how simulation can be used to evaluate alternative survey designs for estimating wildlife abundance. Specifically, we evaluate the potential of instrument-based aerial surveys (combining infrared imagery with high-resolution digital photography to detect and identify species) for estimating abundance of polar bears and seals in the Chukchi Sea. We investigate the consequences of different levels of survey effort, flight track allocation and model configuration on bias and precision of abundance estimators. For bearded seals (0.07 animals km⁻²) and ringed seals (1.29 animals km⁻²), we find that eight flights traversing ≈7840 km are sufficient to achieve target precision levels (coefficient of variation (CV) < 20%) for a 2.94 × 10⁵ km² study area. For polar bears (provisionally, 0.003 animals km⁻²), 12 flights traversing ≈11 760 km resulted in CVs ranging from 28 to 35%. Estimators were relatively unbiased with similar precision over different flight track allocation strategies and estimation models, although some combinations had superior performance. These findings suggest that instrument-based aerial surveys may provide a viable means for monitoring seal and polar bear populations on the surface of the sea ice over large Arctic regions. More broadly, our simulation-based approach to evaluating survey designs can serve as a template for biologists designing their own surveys. PMID:26909183

  12. Using simulation to evaluate wildlife survey designs: polar bears and seals in the Chukchi Sea

    PubMed Central

    Conn, Paul B.; Moreland, Erin E.; Regehr, Eric V.; Richmond, Erin L.; Cameron, Michael F.; Boveng, Peter L.

    2016-01-01

    Logistically demanding and expensive wildlife surveys should ideally yield defensible estimates. Here, we show how simulation can be used to evaluate alternative survey designs for estimating wildlife abundance. Specifically, we evaluate the potential of instrument-based aerial surveys (combining infrared imagery with high-resolution digital photography to detect and identify species) for estimating abundance of polar bears and seals in the Chukchi Sea. We investigate the consequences of different levels of survey effort, flight track allocation and model configuration on bias and precision of abundance estimators. For bearded seals (0.07 animals km−2) and ringed seals (1.29 animals km−2), we find that eight flights traversing ≈7840 km are sufficient to achieve target precision levels (coefficient of variation (CV)<20%) for a 2.94×105 km2 study area. For polar bears (provisionally, 0.003 animals km−2), 12 flights traversing ≈11 760 km resulted in CVs ranging from 28 to 35%. Estimators were relatively unbiased with similar precision over different flight track allocation strategies and estimation models, although some combinations had superior performance. These findings suggest that instrument-based aerial surveys may provide a viable means for monitoring seal and polar bear populations on the surface of the sea ice over large Arctic regions. More broadly, our simulation-based approach to evaluating survey designs can serve as a template for biologists designing their own surveys. PMID:26909183
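
    The core simulation logic — draw counts flight by flight, form an abundance estimator, and summarize its spread as a CV — can be sketched as below. The per-flight strip area (~300 km²) and the use of simple Poisson counts are assumptions for illustration only; the paper's simulations additionally model detection and spatial structure.

```python
import math
import random
import statistics

random.seed(7)

def rpois(lam):
    """Poisson draw: Knuth's method for small means, normal approx above."""
    if lam > 30:
        return max(0, round(random.gauss(lam, math.sqrt(lam))))
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def survey_cv(density, strip_area_km2, n_flights, n_reps=500):
    """Monte Carlo CV of the density estimator (total count / total area)."""
    total_area = n_flights * strip_area_km2
    estimates = []
    for _ in range(n_reps):
        total = sum(rpois(density * strip_area_km2)
                    for _ in range(n_flights))
        estimates.append(total / total_area)
    return statistics.stdev(estimates) / statistics.fmean(estimates)

# Polar bears at 0.003 km^-2; assume ~300 km^2 surveyed per flight.
cv_bears = survey_cv(0.003, 300.0, 12)   # ~ 1/sqrt(expected total count)
```

    Under these toy assumptions the expected total count is about 11 bears, so the CV lands near 1/√11 ≈ 0.30 — the same order as the 28-35% reported in the abstract, though that agreement depends on the assumed strip area.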

  13. Laboratory design and test procedures for quantitative evaluation of infrared sensors to assess thermal anomalies

    SciTech Connect

    Chang, Y.M.; Grot, R.A.; Wood, J.T.

    1985-06-01

    This report presents a description of the laboratory apparatus and preliminary results of the quantitative evaluation of three high-resolution and two low-resolution infrared imaging systems. These systems, which are commonly used for building diagnostics, are tested under various background temperatures (from -20 °C to 25 °C) for their minimum resolvable temperature differences (MRTD) at spatial frequencies from 0.03 to 0.25 cycles per milliradian. The calibration curves of absolute and differential temperature measurements are obtained for three systems. The signal transfer function and line spread function at ambient temperature of another three systems are also measured. Comparisons of the measured dependence of the MRTD on background temperature with the predicted values given in ASHRAE Standard 101-83 are also included. The background-temperature dependence of absolute temperature measurements is presented, along with a comparison of measured data and data given by the manufacturer. Horizontal on-axis magnification factors of the geometric transfer function of two systems are also established to calibrate the horizontal axis of the measured line spread function to obtain the modulation transfer function. The variation in display uniformity along the horizontal axis of these two sensors is also observed. Included are detailed descriptions of the laboratory design, equipment setup, and evaluation procedures of each test. 10 refs., 38 figs., 12 tabs.

  14. Feasibility of the grandprogeny design for quantitative trait loci (QTL) detection in purebred beef cattle.

    PubMed

    Moody, D E; Pomp, D; Buchanan, D S

    1997-04-01

    The grandprogeny design (GPD) was developed for dairy cattle to use existing pedigreed populations for quantitative trait locus (QTL) detection. Marker genotypes of grandsires and sons are determined, and trait phenotypic data from grandprogeny are analyzed. The objective of this study was to investigate the potential application of GPD in purebred beef cattle populations. Pedigree structures of Angus (n = 123,319), Hereford (n = 107,778), Brangus (n = 14,449), and Gelbvieh (n = 8,114) sire evaluation reports were analyzed to identify potentially useful families. Power of QTL detection was calculated for a range of QTL effects (.1 to .5 SD) and two Type I error rates (.01 and .001). Reasonable power (> .75) could be achieved using GPD in Angus and Hereford for QTL having moderate effects (.3 SD) on weaning weight and large effects (.4 to .5 SD) on birth, yearling, and maternal weaning weights by genotyping 500 animals. Existing Gelbvieh and Brangus families useful for GPD were limited, and reasonable power could be expected only for QTL having large effects on weaning or birth weights. Although family structures suitable for GPD exist in purebred beef populations, large amounts of genotyping would be required to achieve reasonable power, and only QTL having moderate to large effects could be expected to be identified. PMID:9110205
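
    Power calculations of the kind described can be approximated by simulation. The sketch below tests for a phenotypic mean shift between two marker classes with a two-sample z-test; the even marker split, error rates, and replicate count are illustrative, and the grandprogeny design's family structure is deliberately ignored.

```python
import math
import random
import statistics

random.seed(11)

Z_CRIT = {0.01: 2.576, 0.001: 3.291}      # two-sided normal criticals

def qtl_power(effect_sd, n_genotyped, alpha=0.01, n_reps=400):
    """Simulated power to detect a QTL of `effect_sd` phenotypic SDs."""
    m = n_genotyped // 2                  # assume an even marker split
    se = math.sqrt(2.0 / m)               # SE of a difference of means
    hits = 0
    for _ in range(n_reps):
        a = statistics.fmean(random.gauss(0.0, 1.0) for _ in range(m))
        b = statistics.fmean(random.gauss(effect_sd, 1.0) for _ in range(m))
        hits += abs(b - a) / se > Z_CRIT[alpha]
    return hits / n_reps

power = qtl_power(0.3, 500)   # moderate effect, 500 genotyped animals
```

    Even this simplified model puts a 0.3 SD effect with 500 genotyped animals in the vicinity of the "reasonable power" (>.75) threshold the abstract cites, though the real GPD power depends on family structure and marker informativeness.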

  15. Modified Universal Design Survey: Enhancing Operability of Launch Vehicle Ground Crew Worksites

    NASA Technical Reports Server (NTRS)

    Blume, Jennifer L.

    2010-01-01

    Operability is a driving requirement for next-generation space launch vehicles. Launch site ground operations include numerous operator tasks to prepare the vehicle for launch or to perform preflight maintenance. Ensuring that components requiring operator interaction at the launch site are designed for optimal human use is a high priority for operability. To promote operability, a Design Quality Evaluation Survey based on the Universal Design framework was developed to support Human Factors Engineering (HFE) evaluation for NASA's launch vehicles. Universal Design per se is not a priority for launch vehicle processing; however, applying principles of Universal Design will increase the probability of an error-free and efficient design, which promotes operability. The Design Quality Evaluation Survey incorporates and tailors the seven Universal Design principles and adds new measures for safety and efficiency. Adapting an approach proven to measure Universal Design performance in products, each principle is associated with multiple performance measures that are rated by the degree to which the statement is true. The Design Quality Evaluation Survey was employed for several launch vehicle ground processing worksite analyses. The tool was found to be most useful for comparative judgments as opposed to assessment of a single design option. It provided a useful piece of additional data when assessing possible operator interfaces or worksites for operability.

  16. HomoSAR: bridging comparative protein modeling with quantitative structural activity relationship to design new peptides.

    PubMed

    Borkar, Mahesh R; Pissurlenkar, Raghuvir R S; Coutinho, Evans C

    2013-11-15

    Peptides play significant roles in the biological world. To optimize activity for a specific therapeutic target, peptide library synthesis is inevitable, which is time consuming and expensive. Computational approaches provide a promising way to elucidate the structural basis for the design of new peptides. Earlier, we proposed a novel methodology termed HomoSAR to gain insight into the structure-activity relationships underlying peptides. Based on an integrated approach, HomoSAR uses the principles of homology modeling in conjunction with the quantitative structure-activity relationship formalism to predict and design new peptide sequences with optimum activity. In the present study, we establish that the HomoSAR methodology can be universally applied to all classes of peptides irrespective of sequence length by studying HomoSAR on three peptide datasets, viz., angiotensin-converting enzyme inhibitory peptides, CAMEL-s antibiotic peptides, and hAmphiphysin-1 SH3 domain binding peptides, using a set of descriptors related to the hydrophobic, steric, and electronic properties of the 20 natural amino acids. Models generated for all three datasets have statistically significant correlation coefficients (r²), predictive r² values (r²_pred), and cross-validated coefficients (q²_LOO). The elegance of this technique lies in its simplicity and its ability to extract all the information contained in the peptides to elucidate the underlying structure-activity relationships. The difficulties of correlating both sequence diversity and variation in peptide length with biological activity can be addressed. The study was able to identify the preferred or detrimental nature of amino acids at specific positions in the peptide sequences. PMID:24105965

  17. Design and prediction of new acetylcholinesterase inhibitor via quantitative structure activity relationship of huprines derivatives.

    PubMed

    Zhang, Shuqun; Hou, Bo; Yang, Huaiyu; Zuo, Zhili

    2016-05-01

    Acetylcholinesterase (AChE) is an important enzyme in the pathogenesis of Alzheimer's disease (AD). Comparative quantitative structure-activity relationship (QSAR) analyses of some huprine inhibitors of AChE were carried out using comparative molecular field analysis (CoMFA), comparative molecular similarity indices analysis (CoMSIA), and hologram QSAR (HQSAR) methods. Three highly predictive QSAR models were constructed successfully based on the training set. The CoMFA, CoMSIA, and HQSAR models have r² = 0.988, q² = 0.757, ONC = 6; r² = 0.966, q² = 0.645, ONC = 5; and r² = 0.957, q² = 0.736, ONC = 6, respectively. The predictabilities were validated using an external test set, and the predictive r² values obtained by the three models were 0.984, 0.973, and 0.783, respectively. The analysis was performed by combining the CoMFA and CoMSIA field distributions with the active site of AChE to further understand the vital interactions between the huprines and the enzyme. On the basis of the QSAR study, 14 new potent molecules have been designed, and six of them are predicted to be more active than the most active compound 24 described in the literature. The final QSAR models could be helpful in the design and development of novel active AChE inhibitors. PMID:26832327
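
    The r² and cross-validated q² statistics quoted in QSAR abstracts like this one can be illustrated on a toy one-descriptor linear model. The data are invented, and ordinary least squares stands in for the CoMFA/CoMSIA partial-least-squares machinery.

```python
import statistics

# Invented one-descriptor dataset: activity y vs. a hydrophobicity-like
# descriptor x.  Values are illustrative, not from the paper.
x = [0.5, 1.1, 1.9, 2.4, 3.2, 3.9, 4.5, 5.2]
y = [1.0, 1.4, 2.1, 2.3, 3.0, 3.6, 4.1, 4.7]

def ols(xs, ys):
    """Ordinary least-squares slope and intercept."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    slope = (sum((a - mx) * (b - my) for a, b in zip(xs, ys))
             / sum((a - mx) ** 2 for a in xs))
    return slope, my - slope * mx

def r_squared(xs, ys):
    """Fitted r2 = 1 - SS_res / SS_tot on the training data."""
    slope, icept = ols(xs, ys)
    my = statistics.fmean(ys)
    ss_res = sum((b - (slope * a + icept)) ** 2 for a, b in zip(xs, ys))
    return 1.0 - ss_res / sum((b - my) ** 2 for b in ys)

def q_squared_loo(xs, ys):
    """Leave-one-out q2 = 1 - PRESS / SS_tot."""
    my = statistics.fmean(ys)
    press = 0.0
    for i in range(len(xs)):
        slope, icept = ols(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:])
        press += (ys[i] - (slope * xs[i] + icept)) ** 2
    return 1.0 - press / sum((b - my) ** 2 for b in ys)

r2 = r_squared(x, y)
q2 = q_squared_loo(x, y)
# q2 < r2: the cross-validated statistic penalizes overfitting
```

    The gap between r² and q² is exactly what separates the fitted and cross-validated coefficients reported for the CoMFA, CoMSIA, and HQSAR models above.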

  18. A quantitative analysis of clinical trial designs in spinal cord injury based on ICCP guidelines.

    PubMed

    Sorani, Marco D; Beattie, Michael S; Bresnahan, Jacqueline C

    2012-06-10

    Clinical studies of spinal cord injury (SCI) have evolved into multidisciplinary programs that investigate multiple types of neurological deficits and sequelae. In 2007, the International Campaign for Cures of SCI Paralysis (ICCP) proposed best practices for interventional trial designs, end-points, and inclusion criteria. Here we quantitatively assessed the extent to which SCI trials follow ICCP guidelines and reflect the overall patient population. We obtained data for all 288 SCI trials in ClinicalTrials.gov. We calculated summary statistics and observed trends pre-2007 versus 2007 onward. To compare the trial population to the overall SCI population, we obtained statistics from the National SCI Statistical Center. We generated tag clouds to describe heterogeneous trial outcomes. Most interventional studies were randomized (147, 73.1%), and utilized active (55, 36.7%) or placebo controls (49, 32.7%), both increasing trends (p=0.09). Most trials were open label (116, 53.5%), rather than double- (62, 28.6%) or single-blinded (39, 18.0%), but blinding has increased (p=0.01). Tag clouds of outcomes suggest an emphasis on assessment using scores and scales. Inclusion criteria related to American Spinal Injury Association (ASIA) status and neurological level allowed inclusion of most SCI patients. Age inclusion criteria were most commonly 18-65 or older. Consistent with ICCP recommendations, most trials were randomized and controlled, and blinding has increased. Age inclusion criteria skew older than the overall population. ASIA status criteria reflect the population, but neurological lesion criteria could be broadened. Investigators should make trial designs and results available in a complete manner to enable comparisons of populations and outcomes. PMID:22369673

  19. Hit by a Perfect Storm? Art & Design in the National Student Survey

    ERIC Educational Resources Information Center

    Yorke, Mantz; Orr, Susan; Blair, Bernadette

    2014-01-01

    There has long been the suspicion amongst staff in Art & Design that the ratings given to their subject disciplines in the UK's National Student Survey are adversely affected by a combination of circumstances--a "perfect storm". The "perfect storm" proposition is tested by comparing ratings for Art & Design with…

  20. Usability Evaluation Survey for Identifying Design Issues in Civil Flight Deck

    NASA Astrophysics Data System (ADS)

    Ozve Aminian, Negin; Izzuddin Romli, Fairuz; Wiriadidjaja, Surjatin

    2016-02-01

    Ergonomics assessment of the cockpit in civil aircraft is important because pilots spend most of their time in flight in the seating posture imposed by its design. Improper seat design can cause discomfort and pain, which will disturb the pilot's concentration in flight. A survey that was conducted found several issues with the current cockpit design. This study aims to highlight potential mismatches between the current cockpit design and the ergonomic design recommendations for anthropometric dimensions and seat design, which could be the roots of the problems faced by the pilots in the cockpit.

  1. Effects of sample survey design on the accuracy of classification tree models in species distribution models

    USGS Publications Warehouse

    Edwards, T.C., Jr.; Cutler, D.R.; Zimmermann, N.E.; Geiser, L.; Moisen, G.G.

    2006-01-01

    We evaluated the effects of probabilistic (hereafter DESIGN) and non-probabilistic (PURPOSIVE) sample surveys on resultant classification tree models for predicting the presence of four lichen species in the Pacific Northwest, USA. Models derived from both survey forms were assessed using an independent data set (EVALUATION). Measures of accuracy as gauged by resubstitution rates were similar for each lichen species irrespective of the underlying sample survey form. Cross-validation estimates of prediction accuracies were lower than resubstitution accuracies for all species and both design types, and in all cases were closer to the true prediction accuracies based on the EVALUATION data set. We argue that greater emphasis should be placed on calculating and reporting cross-validation accuracy rates rather than simple resubstitution accuracy rates. Evaluation of the DESIGN and PURPOSIVE tree models on the EVALUATION data set shows significantly lower prediction accuracy for the PURPOSIVE tree models relative to the DESIGN models, indicating that non-probabilistic sample surveys may generate models with limited predictive capability. These differences were consistent across all four lichen species, with 11 of the 12 possible species and sample survey type comparisons having significantly lower accuracy rates. Some differences in accuracy were as large as 50%. The classification tree structures also differed considerably both among and within the modelled species, depending on the sample survey form. Overlap in the predictor variables selected by the DESIGN and PURPOSIVE tree models ranged from only 20% to 38%, indicating the classification trees fit the two evaluated survey forms on different sets of predictor variables. The magnitude of these differences in predictor variables throws doubt on ecological interpretation derived from prediction models based on non-probabilistic sample surveys. © 2006 Elsevier B.V. All rights reserved.
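
    The gap between resubstitution and cross-validation accuracy that the authors emphasize is easy to demonstrate. The sketch below uses a 1-nearest-neighbour classifier rather than a classification tree, because 1-NN's resubstitution accuracy is trivially perfect; the two overlapping classes are invented.

```python
import random

random.seed(3)

# Two overlapping 1-D classes with labels 0 and 1.  A 1-NN classifier
# memorizes the training set, so its resubstitution accuracy is a
# perfect 1.0, while leave-one-out accuracy reflects real skill.
data = ([(random.gauss(0.0, 1.0), 0) for _ in range(40)]
        + [(random.gauss(1.5, 1.0), 1) for _ in range(40)])

def nn_predict(train, x):
    """Label of the nearest training point to x."""
    return min(train, key=lambda t: abs(t[0] - x))[1]

resub = sum(nn_predict(data, x) == c for x, c in data) / len(data)
loo = sum(nn_predict(data[:i] + data[i + 1:], x) == c
          for i, (x, c) in enumerate(data)) / len(data)
# resub is always 1.0 here; loo is the honest estimate
```

    As in the paper, the cross-validated figure is the one closer to performance on truly independent evaluation data.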

  2. Study Quality in SLA: An Assessment of Designs, Analyses, and Reporting Practices in Quantitative L2 Research

    ERIC Educational Resources Information Center

    Plonsky, Luke

    2013-01-01

    This study assesses research and reporting practices in quantitative second language (L2) research. A sample of 606 primary studies, published from 1990 to 2010 in "Language Learning and Studies in Second Language Acquisition," was collected and coded for designs, statistical analyses, reporting practices, and outcomes (i.e., effect…

  3. Review of quantitative surveys of the length and stability of MTBE, TBA, and benzene plumes in groundwater at UST sites.

    PubMed

    Connor, John A; Kamath, Roopa; Walker, Kenneth L; McHugh, Thomas E

    2015-01-01

    Quantitative information regarding the length and stability condition of groundwater plumes of benzene, methyl tert-butyl ether (MTBE), and tert-butyl alcohol (TBA) has been compiled from thousands of underground storage tank (UST) sites in the United States where gasoline fuel releases have occurred. This paper presents a review and summary of 13 published scientific surveys, of which 10 address benzene and/or MTBE plumes only, and 3 address benzene, MTBE, and TBA plumes. These data show the observed lengths of benzene and MTBE plumes to be relatively consistent among various regions and hydrogeologic settings, with median lengths at a delineation limit of 10 µg/L falling into relatively narrow ranges from 101 to 185 feet for benzene and 110 to 178 feet for MTBE. The observed statistical distributions of MTBE and benzene plumes show the two plume types to be of comparable lengths, with 90th percentile MTBE plume lengths moderately exceeding benzene plume lengths by 16% at a 10-µg/L delineation limit (400 feet vs. 345 feet) and 25% at a 5-µg/L delineation limit (530 feet vs. 425 feet). Stability analyses for benzene and MTBE plumes found 94 and 93% of these plumes, respectively, to be in a nonexpanding condition, and over 91% of individual monitoring wells to exhibit nonincreasing concentration trends. Three published studies addressing TBA found TBA plumes to be of comparable length to MTBE and benzene plumes, with 86% of wells in one study showing nonincreasing concentration trends. PMID:25040137

  4. A survey of design methods for failure detection in dynamic systems

    NASA Technical Reports Server (NTRS)

    Willsky, A. S.

    1975-01-01

    A number of methods for the detection of abrupt changes (such as failures) in stochastic dynamical systems were surveyed. The class of linear systems were emphasized, but the basic concepts, if not the detailed analyses, carry over to other classes of systems. The methods surveyed range from the design of specific failure-sensitive filters, to the use of statistical tests on filter innovations, to the development of jump process formulations. Tradeoffs in complexity versus performance are discussed.

  6. HRMS sky survey wideband feed system design for DSS 24 beam waveguide antenna

    NASA Technical Reports Server (NTRS)

    Stanton, P. H.; Lee, P. R.; Reilly, H. F.

    1993-01-01

    The High-Resolution Microwave Survey (HRMS) Sky Survey project will be implemented on the DSS 24 beam waveguide (BWG) antenna over the frequency range of 2.86 to 10 GHz. Two wideband, ring-loaded, corrugated feed horns were designed to cover this range. The horns match the frequency-dependent gain requirements for the DSS 24 BWG system. The performance of the feed horns and the calculated system performance of DSS 24 are presented.

  7. The Visible and Infrared Survey Telescope for Astronomy (VISTA): Design, technical overview, and performance

    NASA Astrophysics Data System (ADS)

    Sutherland, Will; Emerson, Jim; Dalton, Gavin; Atad-Ettedgui, Eli; Beard, Steven; Bennett, Richard; Bezawada, Naidu; Born, Andrew; Caldwell, Martin; Clark, Paul; Craig, Simon; Henry, David; Jeffers, Paul; Little, Bryan; McPherson, Alistair; Murray, John; Stewart, Malcolm; Stobie, Brian; Terrett, David; Ward, Kim; Whalley, Martin; Woodhouse, Guy

    2015-03-01

    The Visible and Infrared Survey Telescope for Astronomy (VISTA) is the 4-m wide-field survey telescope at ESO's Paranal Observatory, equipped with the world's largest near-infrared imaging camera (VISTA IR Camera, VIRCAM), with 1.65 degree diameter field of view, and 67 Mpixels giving 0.6 deg2 active pixel area, operating at wavelengths 0.8-2.3 μm. We provide a short history of the project, and an overview of the technical details of the full system including the optical design, mirrors, telescope structure, IR camera, active optics, enclosure and software. The system includes several innovative design features such as the f/1 primary mirror, the dichroic cold-baffle camera design and the sophisticated wavefront sensing system delivering closed-loop 5-axis alignment of the secondary mirror. We conclude with a summary of the delivered performance, and a short overview of the six ESO public surveys in progress on VISTA.

  8. [Development of a simple quantitative method for the strontium-89 concentration of radioactive liquid waste using the plastic scintillation survey meter for beta rays].

    PubMed

    Narita, Hiroto; Tsuchiya, Yuusuke; Hirase, Kiyoshi; Uchiyama, Mayuki; Fukushi, Masahiro

    2012-11-01

    Strontium-89 (89Sr: pure beta, Emax 1.495 MeV (100%), half-life: 50.5 days) chloride is used for pain relief from bone metastases. Assay of 89Sr is difficult because it is a pure beta emitter. For management of 89Sr, we evaluated a simple quantitative method for the 89Sr concentration of radioactive liquid waste using a plastic scintillation survey meter for beta rays. The counting efficiency of the survey meter with this method was 35.95%. A simple 30-minute measurement of 2 ml of the sample made the quantitative measurement of 89Sr practical. Reducing self-absorption of the beta rays in the solution by counting on polyethylene paper improved the counting efficiency. Our method made it easy to manage the radioactive liquid waste under the legal restrictions. PMID:23402205
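
    The quantitation step described above amounts to converting a net count rate into an activity concentration via the counting efficiency, with a decay correction over the 50.5-day half-life when results must be referred back to a reference date. A minimal sketch, using the efficiency, sample volume, and half-life reported in the abstract but a hypothetical count total:

```python
import math

EFFICIENCY = 0.3595   # counting efficiency reported for the survey meter
SAMPLE_ML = 2.0       # sample volume, ml
HALF_LIFE_D = 50.5    # 89Sr half-life, days

def activity_bq_per_ml(net_counts, live_time_s):
    """Activity concentration: (counts/s) / efficiency gives decays/s (Bq), per ml."""
    cps = net_counts / live_time_s
    return cps / EFFICIENCY / SAMPLE_ML

def decay_corrected(measured_bq, days_elapsed):
    """Correct a measured activity back to a reference date days_elapsed earlier."""
    return measured_bq * math.exp(math.log(2) * days_elapsed / HALF_LIFE_D)

# Hypothetical 30-minute count yielding 3,000 net counts:
print(round(activity_bq_per_ml(3000, 30 * 60), 3))  # → 2.318
```

    After one half-life (50.5 days), `decay_corrected` doubles the measured value, as expected.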

  9. National health and nutrition examination survey: sample design, 2011-2014.

    PubMed

    Johnson, Clifford L; Dohrmann, Sylvia M; Burt, Vicki L; Mohadjer, Leyla K

    2014-03-01

    Background: Data collection for the National Health and Nutrition Examination Survey (NHANES) consists of a household screener, an interview, and a physical examination. The screener primarily determines whether any household members are eligible for the interview and examination. Eligibility is established using preset selection probabilities for the desired demographic subdomains. After an eligible sample person is selected, the interview collects person-level demographic, health, and nutrition information, as well as information about the household. The examination includes physical measurements, tests such as hearing and dental examinations, and the collection of blood and urine specimens for laboratory testing. Objectives: This report provides some background on the NHANES program, beginning with the first survey cycle in the 1970s and highlighting significant changes since its inception. The report then describes the broad design specifications for the 2011-2014 survey cycle, including survey objectives, domain and precision specifications, and operational requirements unique to NHANES. The report also describes details of the survey design, including the calculation of sampling rates and sample selection methods. Documentation of survey content, data collection procedures, estimation methods, and methods to assess nonsampling errors are reported elsewhere. PMID:25569458

  10. Application of a Modified Universal Design Survey for Evaluation of Ares 1 Ground Crew Worksites

    NASA Technical Reports Server (NTRS)

    Blume, Jennifer L.

    2010-01-01

    Operability is a driving requirement for NASA's Ares 1 launch vehicle. Launch site ground operations include several operator tasks to prepare the vehicle for launch or to perform maintenance. Ensuring that components requiring operator interaction at the launch site are designed for optimal human use is a high priority for operability. To support design evaluation, the Ares 1 Upper Stage (US) element Human Factors Engineering (HFE) group developed a survey based on the Universal Design approach. Universal Design is a process to create products that can be used effectively by as many people as possible. Universal Design per se is not a priority for Ares 1 because launch vehicle processing is a specialized skill and not akin to a consumer product that should be used by all people of all abilities. However, applying principles of Universal Design will increase the probability of an error-free and efficient design, which is a priority for Ares 1. The Design Quality Evaluation Survey centers on the following seven principles: (1) Equitable use, (2) Flexibility in use, (3) Simple and intuitive use, (4) Perceptible information, (5) Tolerance for error, (6) Low physical effort, (7) Size and space for approach and use. Each principle is associated with multiple evaluation criteria which were rated with the degree to which the statement is true. All statements are phrased as the design goal (the most positive case), so that the degree to which judgments tend toward "completely agree" directly reflects the degree to which the design is good. The Design Quality Evaluation Survey was employed for several US analyses. The tool was found to be most useful for comparative judgments as opposed to an assessment of a single design option. It provided a useful piece of additional data when assessing possible operator interfaces or worksites for operability.

  11. 78 FR 5458 - Medicare Program; Request for Information To Aid in the Design and Development of a Survey...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-25

    ... in the Design and Development of a Survey Regarding Patient and Family Member/Friend Experiences With... of care designed to provide comfort and support to patients and their families when a life-limiting... capturing hospice care experiences. A rigorous, well-designed Hospice Survey will allow us to understand:...

  12. Final report on the radiological surveys of designated DX firing sites at Los Alamos National Laboratory

    SciTech Connect

    1996-09-09

    CHEMRAD was contracted by Los Alamos National Laboratory to perform USRADS® (UltraSonic Ranging And Data System) radiation scanning surveys at designated DX Sites at the Los Alamos National Laboratory. The primary purpose of these scanning surveys was to identify the presence of Depleted Uranium (D-38) resulting from activities at the DX Firing Sites. This effort was conducted to update the most recent surveys of these areas. This current effort was initiated with site orientation on August 12, 1996. Surveys were completed in the field on September 4, 1996. This Executive Summary briefly presents the major findings of this work. The detailed survey results are presented in the balance of this report and are organized by Technical Area and Site number in section 2. This organization is not in chronological order. USRADS and the related survey methods are described in section 3. Quality Control issues are addressed in section 4. Surveys were conducted with an array of radiation detectors either mounted on a backpack frame for man-carried use (Manual mode) or on a tricycle cart (RadCart mode). The array included radiation detectors for gamma and beta surface and near-surface contamination as well as dose rate at 1 meter above grade. The radiation detectors were interfaced directly to an USRADS 2100 Data Pack.

  13. The Laboratory Course Assessment Survey: A Tool to Measure Three Dimensions of Research-Course Design

    PubMed Central

    Corwin, Lisa A.; Runyon, Christopher; Robinson, Aspen; Dolan, Erin L.

    2015-01-01

    Course-based undergraduate research experiences (CUREs) are increasingly being offered as scalable ways to involve undergraduates in research. Yet few if any design features that make CUREs effective have been identified. We developed a 17-item survey instrument, the Laboratory Course Assessment Survey (LCAS), that measures students’ perceptions of three design features of biology lab courses: 1) collaboration, 2) discovery and relevance, and 3) iteration. We assessed the psychometric properties of the LCAS using established methods for instrument design and validation. We also assessed the ability of the LCAS to differentiate between CUREs and traditional laboratory courses, and found that the discovery and relevance and iteration scales differentiated between these groups. Our results indicate that the LCAS is suited for characterizing and comparing undergraduate biology lab courses and should be useful for determining the relative importance of the three design features for achieving student outcomes. PMID:26466990

  15. Optical Design Trade Study for the Wide Field Infrared Survey Telescope [WFIRST]

    NASA Technical Reports Server (NTRS)

    Content, David A.; Goullioud, R.; Lehan, John P.; Mentzell, John E.

    2011-01-01

    The Wide Field Infrared Survey Telescope (WFIRST) mission concept was ranked first among new space astrophysics missions by the Astro2010 Decadal Survey, incorporating the Joint Dark Energy Mission (JDEM)-Omega payload concept and multiple science white papers. This mission is based on a space telescope at L2 studying exoplanets [via gravitational microlensing], probing dark energy, and surveying the near infrared sky. Since the release of NWNH, the WFIRST project has been working with the WFIRST science definition team (SDT) to refine mission and payload concepts. We present the driving requirements. The current interim reference mission point design, based on the use of a 1.3m unobscured aperture three mirror anastigmat form, with focal imaging and slitless spectroscopy science channels, is consistent with the requirements, requires no technology development, and outperforms the JDEM-Omega design.

  16. Wide Field Infrared Survey Telescope [WFIRST]: Telescope Design and Simulated Performance

    NASA Technical Reports Server (NTRS)

    Goullioud, R.; Content, D. A.; Kuan, G. M.; Moore, J. D.; Chang, Z.; Sunada, E. T.; Villalvazo, J.; Hawk, J. P.; Armani, N. V.; Johnson, E. L.; Powell, C. A.

    2012-01-01

    The ASTRO2010 Decadal Survey proposed multiple missions with NIR focal planes and 3-mirror wide-field telescopes in the 1.5m aperture range. None of them would have won as standalone missions; WFIRST, created by the Astro2010 committee, is a combination of these missions. The WFIRST Science Definition Team (SDT) was tasked to examine the design. The project team is a GSFC-JPL-Caltech collaboration. This interim mission design is the result of combined work by the project team and the SDT.

  17. CONDITION ASSESSMENT FOR THE ESCAMBIA RIVER, FL, WATERSHED: BENTHIC MACROINVERTEBRATE SURVEYS USING A PROBABILISTIC SAMPLING DESIGN

    EPA Science Inventory

    Probabilistic sampling has been used to assess the condition of estuarine ecosystems, and the use of this survey design approach was examined for a northwest Florida watershed. Twenty-eight lotic sites within the Escambia River, Florida, watershed were randomly selected and visit...

  18. Designing cobalt chromium removable partial dentures for patients with shortened dental arches: a pilot survey.

    PubMed

    Nassani, M Z; Devlin, H; Tarakji, B; McCord, J F

    2011-08-01

    The aim of this survey was to investigate the quality of prescription for the fabrication of cobalt chromium removable partial dentures (RPDs) that are used to extend the shortened dental arches (SDAs). A survey of four commercial dental laboratories located in northern England was conducted. The target of this survey was cobalt chromium RPDs that were requested to restore SDAs comprising the anterior teeth and 2-4 premolars. Dentists' prescriptions were scrutinised, and a special data collection form was completed accordingly. A total of 94 dentists' prescriptions and associated SDA casts were examined. Almost all the requested cobalt chromium RPDs were clasp-retained RPDs (97%). Scrutinising the 91 prescriptions for clasp-retained cobalt chromium RPDs showed that dentists' prescriptions did not have any instructions about the design of the partial denture in a considerable proportion of the cases (32%). Teeth to be clasped were identified clearly in 45% of the prescriptions. A majority of the dentists (64%) failed to provide any instructions about the design of the rests to be placed on the most posterior premolar abutment teeth. A considerable proportion of the dentists delegated the task of selecting the type of the major connector to the dental technician (41%). Only 21 (23%) of the examined casts had clearly defined rest seat preparation. The outcome of this pilot survey shows inadequate quality of prescription in designing RPDs for patients with SDAs. This finding has an ethical and clinical bearing and does not fit with current legal guidelines relevant to designing RPDs. PMID:21175736

  19. ASSESSING THE ECOLOGICAL CONDITION OF A COASTAL PLAIN WATERSHED USING A PROBABILISTIC SURVEY DESIGN

    EPA Science Inventory

    Using a probabilistic survey design, we assessed the ecological condition of the Florida (USA) portion of the Escambia River watershed using selected environmental and benthic macroinvertebrate data. Macroinvertebrates were sampled at 28 sites during July-August 1996, and 3414 i...

  20. USING GIS TO GENERATE SPATIALLY-BALANCED RANDOM SURVEY DESIGNS FOR NATURAL RESOURCE APPLICATIONS

    EPA Science Inventory

    Sampling of a population is frequently required to understand trends and patterns in natural resource management because financial and time constraints preclude a complete census. A rigorous probability-based survey design specifies where to sample so that inferences from the sam...

  1. Designer cantilevers for even more accurate quantitative measurements of biological systems with multifrequency AFM

    NASA Astrophysics Data System (ADS)

    Contera, S.

    2016-04-01

    Multifrequency excitation/monitoring of cantilevers has made it possible both to achieve fast, relatively simple, nanometre-resolution quantitative mapping of the mechanical properties of biological systems in solution using atomic force microscopy (AFM), and to achieve single-molecule-resolution detection by nanomechanical biosensors. A recent paper by Penedo et al [2015 Nanotechnology 26 485706] has made a significant contribution by developing simple methods to improve the signal-to-noise ratio in liquid environments by selectively enhancing cantilever modes, which will lead to even more accurate quantitative measurements.

  2. Lessons Learned in Interdisciplinary Professional Development Designed to Promote the Teaching of Quantitative Literacy

    ERIC Educational Resources Information Center

    Lardner, Emily; Bookman, Jack

    2013-01-01

    In this paper, we will describe the challenges and insights gained from conducting professional development workshops aimed at helping faculty prepare materials to support the development of students' quantitative skills in different disciplinary contexts. We will examine some of the mistakes we made, and misconceptions we had, in conducting…

  3. Using Focus Groups To Design a Quantitative Measure: Women's Indirect "No" to Sexual Intimacy.

    ERIC Educational Resources Information Center

    Reeder, Heidi M.

    This study combined qualitative and quantitative methods to assess the reasons many women use indirect messages to say "no" to men's attempts to escalate sexual intimacy. Subjects were six female students at a large southwestern university. At one time, one group had four women, at another time the group had two women. All were Caucasian. The room…

  4. AI/OR computational model for integrating qualitative and quantitative design methods

    NASA Technical Reports Server (NTRS)

    Agogino, Alice M.; Bradley, Stephen R.; Cagan, Jonathan; Jain, Pramod; Michelena, Nestor

    1990-01-01

    A theoretical framework for integrating qualitative and numerical computational methods for optimally-directed design is described. The theory is presented as a computational model and features of implementations are summarized where appropriate. To demonstrate the versatility of the methodology we focus on four seemingly disparate aspects of the design process and their interaction: (1) conceptual design, (2) qualitative optimal design, (3) design innovation, and (4) numerical global optimization.

  5. The Spitzer South Pole Telescope Deep Field: Survey Design and Infrared Array Camera Catalogs

    NASA Astrophysics Data System (ADS)

    Ashby, M. L. N.; Stanford, S. A.; Brodwin, M.; Gonzalez, A. H.; Martinez-Manso, J.; Bartlett, J. G.; Benson, B. A.; Bleem, L. E.; Crawford, T. M.; Dey, A.; Dressler, A.; Eisenhardt, P. R. M.; Galametz, A.; Jannuzi, B. T.; Marrone, D. P.; Mei, S.; Muzzin, A.; Pacaud, F.; Pierre, M.; Stern, D.; Vieira, J. D.

    2013-12-01

    The Spitzer South Pole Telescope Deep Field (SSDF) is a wide-area survey using Spitzer's Infrared Array Camera (IRAC) to cover 94 deg2 of extragalactic sky, making it the largest IRAC survey completed to date outside the Milky Way midplane. The SSDF is centered at (α, δ) = (23:30, -55:00), in a region that combines observations spanning a broad wavelength range from numerous facilities. These include millimeter imaging from the South Pole Telescope, far-infrared observations from Herschel/SPIRE, X-ray observations from the XMM XXL survey, near-infrared observations from the VISTA Hemisphere Survey, and radio-wavelength imaging from the Australia Telescope Compact Array, in a panchromatic project designed to address major outstanding questions surrounding galaxy clusters and the baryon budget. Here we describe the Spitzer/IRAC observations of the SSDF, including the survey design, observations, processing, source extraction, and publicly available data products. In particular, we present two band-merged catalogs, one for each of the two warm IRAC selection bands. They contain roughly 5.5 and 3.7 million distinct sources, the vast majority of which are galaxies, down to the SSDF 5σ sensitivity limits of 19.0 and 18.2 Vega mag (7.0 and 9.4 μJy) at 3.6 and 4.5 μm, respectively.

  6. Choosing a Cluster Sampling Design for Lot Quality Assurance Sampling Surveys

    PubMed Central

    Hund, Lauren; Bedrick, Edward J.; Pagano, Marcello

    2015-01-01

    Lot quality assurance sampling (LQAS) surveys are commonly used for monitoring and evaluation in resource-limited settings. Recently several methods have been proposed to combine LQAS with cluster sampling for more timely and cost-effective data collection. For some of these methods, the standard binomial model can be used for constructing decision rules as the clustering can be ignored. For other designs, considered here, clustering is accommodated in the design phase. In this paper, we compare these latter cluster LQAS methodologies and provide recommendations for choosing a cluster LQAS design. We compare technical differences in the three methods and determine situations in which the choice of method results in a substantively different design. We consider two different aspects of the methods: the distributional assumptions and the clustering parameterization. Further, we provide software tools for implementing each method and clarify misconceptions about these designs in the literature. We illustrate the differences in these methods using vaccination and nutrition cluster LQAS surveys as example designs. The cluster methods are not sensitive to the distributional assumptions but can result in substantially different designs (sample sizes) depending on the clustering parameterization. However, none of the clustering parameterizations used in the existing methods appears to be consistent with the observed data, and, consequently, choice between the cluster LQAS methods is not straightforward. Further research should attempt to characterize clustering patterns in specific applications and provide suggestions for best-practice cluster LQAS designs on a setting-specific basis. PMID:26125967
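
    The classical binomial LQAS decision rule that the cluster-based methods above extend can be sketched with the stdlib alone. The thresholds and risk levels below are illustrative textbook values, not figures taken from the paper:

```python
from math import comb

def binom_cdf(d, n, p):
    """P(X <= d) for X ~ Binomial(n, p)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(d + 1))

def lqas_rule(n, p_high, p_low, alpha=0.10, beta=0.10):
    """Smallest decision rule d (classify the lot as adequate when successes > d)
    meeting both error risks:
      - passing a truly low-coverage lot:  P(X > d  | p_low)  <= beta
      - failing a truly high-coverage lot: P(X <= d | p_high) <= alpha
    Returns None if no rule satisfies both at this sample size."""
    for d in range(n + 1):
        if 1 - binom_cdf(d, n, p_low) <= beta and binom_cdf(d, n, p_high) <= alpha:
            return d
    return None

# Illustrative vaccination-coverage design: n=19, 80% vs. 50% coverage thresholds.
print(lqas_rule(19, 0.80, 0.50))  # prints 12
```

    With 19 sampled children, the lot passes only if more than 12 are vaccinated; both misclassification risks then stay at or below 10%. The cluster LQAS designs compared in the paper adjust this calculation for within-cluster correlation.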

  7. Why we love or hate our cars: A qualitative approach to the development of a quantitative user experience survey.

    PubMed

    Tonetto, Leandro Miletto; Desmet, Pieter M A

    2016-09-01

    This paper presents a more ecologically valid way of developing theory-based item questionnaires for measuring user experience. In this novel approach, items were generated using the natural and domain-specific language of the research population, which seems to have made the survey much more sensitive to real experiences than theory-based ones. The approach was applied in a survey that measured car experience. Ten in-depth interviews were conducted with drivers inside their cars. The resulting transcripts were analysed with the aim of capturing the drivers' natural utterances for expressing their car experience. This analysis resulted in 71 categories of answers. For each category, one sentence was selected to serve as a survey item. In an online platform, 538 respondents answered the survey. Data reliability, tested with the Cronbach alpha index, was 0.94, suggesting a survey with highly reliable results for measuring drivers' appraisals of their cars. PMID:27184312
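
    The reliability figure quoted above is Cronbach's alpha, computed from the item-level score variances and the variance of each respondent's total score: alpha = k/(k-1) * (1 - sum(item variances) / variance(totals)). A minimal sketch with hypothetical ratings (not the study's data):

```python
def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one inner list of scores per survey item (columns = respondents)."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # each respondent's total
    item_var_sum = sum(variance(scores) for scores in items)
    return k / (k - 1) * (1 - item_var_sum / variance(totals))

# Hypothetical 5-point ratings: 4 items answered by 5 respondents.
items = [
    [5, 4, 3, 4, 2],
    [5, 3, 3, 4, 1],
    [4, 4, 2, 5, 2],
    [5, 4, 3, 5, 1],
]
print(round(cronbach_alpha(items), 2))  # → 0.96
```

    Highly consistent items (respondents who rate one item high rate the others high) push alpha toward 1, as in the 0.94 reported in the abstract.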

  8. Addressing statistical and operational challenges in designing large-scale stream condition surveys.

    PubMed

    Dobbie, Melissa J; Negus, Peter

    2013-09-01

    Implementing a statistically valid and practical monitoring design for large-scale stream condition monitoring and assessment programs can be difficult due to factors including the likely existence of a diversity of ecosystem types such as ephemeral streams over the sampling domain; limited resources to undertake detailed monitoring surveys and address knowledge gaps; and operational constraints on effective sampling at monitoring sites. In statistical speak, these issues translate to defining appropriate target populations and sampling units; designing appropriate spatial and temporal sample site selection methods; selection and use of appropriate indicators; and setting effect sizes with limited ecological and statistical information about the indicators of interest. We identify the statistical and operational challenges in designing large-scale stream condition surveys and discuss general approaches for addressing them. The ultimate aim in drawing attention to these challenges is to ensure operational practicality in carrying out future monitoring programs and that the resulting inferences about stream condition are statistically valid and relevant. PMID:23344628

  9. The Design of a Novel Survey for Small Objects in the Solar System

    SciTech Connect

    Alcock, C.; Chen, W.P.; de Pater, I.; Lee, T.; Lissauer, J.; Rice, J.; Liang, C.; Cook, K.; Marshall, S.; Akerlof, C.

    2000-08-21

    We evaluated several concepts for a new survey for small objects in the Solar System. We designed a highly novel survey for comets in the outer region of the Solar System, which exploits the occultations of relatively bright stars to infer the presence of otherwise extremely faint objects. The populations and distributions of these objects are not known; the uncertainties span orders of magnitude! These objects are important scientifically as probes of the primordial solar system, and programmatically now that major investments may be made in the possible mitigation of the hazard of asteroid or comet collisions with the Earth.

  10. Estimation of wildlife population ratios incorporating survey design and visibility bias

    USGS Publications Warehouse

    Samuel, M.D.; Steinhorst, R.K.; Garton, E.O.; Unsworth, J.W.

    1992-01-01

    Age and sex ratio statistics are often a key component of the evaluation and management of wildlife populations. These statistics are determined from counts of animals that are commonly plagued by errors associated with either survey design or visibility bias. We present age and sex ratio estimators that incorporate both these sources of error and include the typical situation that animals are sampled in groups. Aerial surveys of elk (Cervus elaphus) in northcentral Idaho illustrate that differential visibility of age or sex classes can produce biased ratio estimates. Visibility models may be used to provide corrected estimates of ratios and their variability that incorporates errors due to sampling, visibility bias, and visibility estimation.
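
    The core idea of correcting grouped counts for visibility bias before forming a ratio can be sketched in a simplified, Horvitz-Thompson-style form: each observed count is inflated by the inverse of its class's detection probability. This is an illustration, not the authors' exact estimator, and all numbers are hypothetical:

```python
def corrected_ratio(groups, visibility):
    """Estimate a young:adult ratio from grouped aerial-survey counts.
    groups: list of (young_count, adult_count) per observed group.
    visibility: per-class detection probabilities.
    Each count is inflated by 1/p for its class before forming the ratio."""
    young = sum(y / visibility["young"] for y, _ in groups)
    adult = sum(a / visibility["adult"] for _, a in groups)
    return young / adult

# Hypothetical groups; young animals assumed harder to see from the air.
groups = [(2, 5), (1, 4), (3, 6), (0, 3)]
naive = sum(y for y, _ in groups) / sum(a for _, a in groups)
corrected = corrected_ratio(groups, {"young": 0.6, "adult": 0.9})
print(round(naive, 3), round(corrected, 3))  # → 0.333 0.5
```

    When the two classes are detected with unequal probability, the naive ratio (0.333 here) is biased low relative to the visibility-corrected estimate (0.5), exactly the distortion the abstract describes for elk age and sex ratios.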

  11. The inclusion of open-ended questions on quantitative surveys of children: Dealing with unanticipated responses relating to child abuse and neglect.

    PubMed

    Lloyd, Katrina; Devine, Paula

    2015-10-01

    Web surveys have been shown to be a viable, and relatively inexpensive, method of data collection with children. For this reason, the Kids' Life and Times (KLT) was developed as an annual online survey of 10- and 11-year-old children. Each year, approximately 4,000 children participate in the survey. Throughout the six years that KLT has been running, a range of questions has been asked that are both policy-relevant and important to the lives of children. Given the method employed by the survey, no extremely sensitive questions that might cause the children distress are included. The majority of questions on KLT are closed, yielding quantitative data that are analysed statistically; however, one regular open-ended question is included at the end of KLT each year so that the children can suggest questions that they think should be asked on the survey the following year. While most of the responses are innocuous, each year a small minority of children suggest questions on child abuse and neglect. This paper reports the responses to this question and reflects on how researchers can, and should, deal with this issue from both a methodological and an ethical perspective. PMID:25952476

  12. Distance software: design and analysis of distance sampling surveys for estimating population size

    PubMed Central

    Thomas, Len; Buckland, Stephen T; Rexstad, Eric A; Laake, Jeff L; Strindberg, Samantha; Hedley, Sharon L; Bishop, Jon RB; Marques, Tiago A; Burnham, Kenneth P

    2010-01-01

    1. Distance sampling is a widely used technique for estimating the size or density of biological populations. Many distance sampling designs and most analyses use the software Distance. 2. We briefly review distance sampling and its assumptions, outline the history, structure and capabilities of Distance, and provide hints on its use. 3. Good survey design is a crucial prerequisite for obtaining reliable results. Distance has a survey design engine, with a built-in geographic information system, that allows properties of different proposed designs to be examined via simulation, and survey plans to be generated. 4. A first step in analysis of distance sampling data is modelling the probability of detection. Distance contains three increasingly sophisticated analysis engines for this: conventional distance sampling, which models detection probability as a function of distance from the transect and assumes all objects at zero distance are detected; multiple-covariate distance sampling, which allows covariates in addition to distance; and mark–recapture distance sampling, which relaxes the assumption of certain detection at zero distance. 5. All three engines allow estimation of density or abundance, stratified if required, with associated measures of precision calculated either analytically or via the bootstrap. 6. Advanced analysis topics covered include the use of multipliers to allow analysis of indirect surveys (such as dung or nest surveys), the density surface modelling analysis engine for spatial and habitat modelling, and information about accessing the analysis engines directly from other software. 7. Synthesis and applications. Distance sampling is a key method for producing abundance and density estimates in challenging field conditions. The theory underlying the methods continues to expand to cope with realistic estimation situations. In step with theoretical developments, state-of-the-art software that implements these methods is described.
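The conventional distance sampling engine mentioned in point 4 can be sketched in a few lines. This is a minimal illustration, not the Distance software: it assumes a half-normal detection function with no truncation, for which the maximum-likelihood fit and the effective strip half-width have closed forms.

```python
import math

def fit_half_normal_sigma(distances):
    # MLE for the half-normal detection function g(x) = exp(-x^2 / (2 sigma^2)):
    # sigma_hat = sqrt(sum(x_i^2) / n)
    n = len(distances)
    return math.sqrt(sum(x * x for x in distances) / n)

def density_estimate(distances, line_length):
    # Effective strip half-width mu = integral_0^inf g(x) dx = sigma * sqrt(pi/2)
    # for the untruncated half-normal; density = n / (2 * mu * L)
    sigma = fit_half_normal_sigma(distances)
    mu = sigma * math.sqrt(math.pi / 2)
    return len(distances) / (2 * mu * line_length)
```

The `distances` argument holds perpendicular distances of detections from the transect line; multiple-covariate and mark-recapture engines generalize the detection model but keep this same estimation structure.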

  13. Quantitative design and experimental validation for a single-molecule DNA nanodevice transformable among three structural states.

    PubMed

    Komiya, Ken; Yamamura, Masayuki; Rose, John A

    2010-07-01

    In this work, we report the development and experimental validation of a coupled statistical thermodynamic model allowing prediction of the structural transitions executed by a novel DNA nanodevice, for quantitative operational design. The efficiency of target structure formation by this nanodevice, implemented with a bistable DNA molecule designed to transform between three distinct structures, is modeled by coupling the isolated equilibrium models for the individual structures. A peculiar behavior is predicted for this nanodevice, which forms the target structure within a limited temperature range by sensing thermal variations. The predicted thermal response is then validated via fluorescence measurements to quantitatively assess whether the nanodevice performs as designed. Agreement between predictions and experiment was substantial, with a 0.95 correlation for overall curve shape over a wide temperature range, from 30 °C to 90 °C. The obtained accuracy, which is comparable to that of conventional melting behavior prediction for DNA duplexes in isolation, ensures the applicability of the coupled model for illustrating general DNA reaction systems involving competitive duplex formation. Finally, tuning of the nanodevice using the current model towards the design of a thermal band-pass filter for controlling chemical circuits is proposed as a novel function of DNA nanodevices. PMID:20385575

  14. Hepatitis C Virus RNA Real-Time Quantitative RT-PCR Method Based on a New Primer Design Strategy.

    PubMed

    Chen, Lida; Li, Wenli; Zhang, Kuo; Zhang, Rui; Lu, Tian; Hao, Mingju; Jia, Tingting; Sun, Yu; Lin, Guigao; Wang, Lunan; Li, Jinming

    2016-01-01

    Viral nucleic acids are unstable when improperly collected, handled, and stored, resulting in decreased sensitivity of currently available commercial quantitative nucleic acid testing kits. Using known unstable hepatitis C virus RNA, we developed a quantitative RT-PCR method based on a new primer design strategy to reduce the impact of nucleic acid instability on nucleic acid testing. The performance of the method was evaluated for linearity, limit of detection, precision, specificity, and agreement with commercial hepatitis C virus assays. Its clinical application was compared to that of two commercial kits, Cobas AmpliPrep/Cobas TaqMan (CAP/CTM) and Kehua. The quantitative RT-PCR method delivered a good performance, with a linearity of R² = 0.99, a total limit of detection (genotypes 1 to 6) of 42.6 IU/mL (95% CI, 32.84 to 67.76 IU/mL), a CV of 1.06% to 3.34%, a specificity of 100%, and a high concordance with the CAP/CTM assay (R² = 0.97), with a mean ± SD of -0.06 ± 1.96 log IU/mL (range, -0.38 to 0.25 log IU/mL). The method was superior to commercial assays in detecting unstable hepatitis C virus RNA (P < 0.05). This quantitative RT-PCR method can effectively eliminate the influence of RNA instability on nucleic acid testing. The principle of the primer design strategy may be applied to the detection of other RNA or DNA viruses. PMID:26612712

  15. A survey of scientific literacy to provide a foundation for designing science communication in Japan.

    PubMed

    Kawamoto, Shishin; Nakayama, Minoru; Saijo, Miki

    2013-08-01

    There are various definitions and survey methods for scientific literacy. Taking into consideration the contemporary significance of scientific literacy, we have defined it with an emphasis on its social aspects. To acquire the insights needed to design a form of science communication that will enhance the scientific literacy of each individual, we conducted a large-scale random survey within Japan of individuals older than 18 years, using a printed questionnaire. The data thus acquired were analyzed using factor analysis and cluster analysis to create a 3-factor/4-cluster model of people's interest and attitude toward science, technology and society and their resulting tendencies. Differences were found among the four clusters in terms of the three factors: scientific factor, social factor, and science-appreciating factor. We propose a plan for designing a form of science communication that is appropriate to this current status of scientific literacy in Japan. PMID:23885051

  16. Complementary methods of system usability evaluation: surveys and observations during software design and development cycles.

    PubMed

    Horsky, Jan; McColgan, Kerry; Pang, Justine E; Melnikas, Andrea J; Linder, Jeffrey A; Schnipper, Jeffrey L; Middleton, Blackford

    2010-10-01

    Poor usability of clinical information systems delays their adoption by clinicians and limits potential improvements to the efficiency and safety of care. Recurring usability evaluations are therefore integral to the system design process. We compared four methods employed during the development of outpatient clinical documentation software: clinician email response, online survey, observations and interviews. Results suggest that no single method identifies all or most problems. Rather, each approach is optimal for evaluations at a different stage of design and characterizes a different usability aspect. Email responses elicited from clinicians and surveys report mostly technical, biomedical, terminology and control problems and are most effective when a working prototype has been completed. Observations of clinical work and interviews inform conceptual and workflow-related problems and are best performed early in the cycle. Appropriate use of these methods consistently during development may significantly improve system usability and contribute to higher adoption rates among clinicians and to improved quality of care. PMID:20546936

  17. Epidemiological survey of anti-flea IgE in dogs in Japan by using an antigen-specific IgE quantitative measurement method

    PubMed Central

    Ichikawa, Y.; Beugnet, F.

    2012-01-01

    In Japan, an epidemiological survey was performed in dogs from October to December 2008 by using a quantitative measurement method for antigen-specific IgE towards specific Ctenocephalides felis antigens. 214 dogs from 22 veterinary clinics were included. These clinics were located as follows, from North to South: Hokkaido, Aomori, Fukushima, Tochigi, Saitama, Chiba, Tokyo (Tama-City and Ota-ku), Kanagawa, Gifu, Niigata, Kyoto, Nara, Osaka, Hyogo, Kagawa, Ehime, Hiroshima, Yamaguchi, Fukuoka, Kumamoto and Kagoshima. 110 dogs (51.4%) were seropositive for flea-specific IgE. No differences were associated with gender or breed. This survey confirms that flea infestation in dogs is a common problem in Japan. It especially shows that infestation also occurs in Northern Japan, where fleas are considered uncommon by veterinarians. PMID:22550629

  18. Creel survey sampling designs for estimating effort in short-duration Chinook salmon fisheries

    USGS Publications Warehouse

    McCormick, Joshua L.; Quist, Michael C.; Schill, Daniel J.

    2013-01-01

    Chinook Salmon Oncorhynchus tshawytscha sport fisheries in the Columbia River basin are commonly monitored using roving creel survey designs and require precise, unbiased catch estimates. The objective of this study was to examine the relative bias and precision of total catch estimates using various sampling designs to estimate angling effort under the assumption that mean catch rate was known. We obtained information on angling populations based on direct visual observations of portions of Chinook Salmon fisheries in three Idaho river systems over a 23-d period. Based on the angling population, Monte Carlo simulations were used to evaluate the properties of effort and catch estimates for each sampling design. All sampling designs evaluated were relatively unbiased. Systematic random sampling (SYS) resulted in the most precise estimates. The SYS and simple random sampling designs had mean square error (MSE) estimates that were generally half of those observed with cluster sampling designs. The SYS design was more efficient (i.e., higher accuracy per unit cost) than a two-cluster design. Increasing the number of clusters available for sampling within a day decreased the MSE of estimates of daily angling effort, but the MSE of total catch estimates was variable depending on the fishery. The results of our simulations provide guidelines on the relative influence of sample sizes and sampling designs on parameters of interest in short-duration Chinook Salmon fisheries.
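The Monte Carlo comparison of sampling designs described above can be sketched as follows. This is a simplified illustration, not the authors' simulation: the hourly effort series is hypothetical, and only simple random sampling (SRS) and systematic sampling (SYS) of time periods within a day are compared. With effort that trends through the day, SYS spreads the sample across the cycle and typically yields the lower mean square error, as the abstract reports.

```python
import random
import statistics

def srs_estimate(pop, n):
    # Simple random sample of n periods, expanded to a daily total
    return statistics.mean(random.sample(pop, n)) * len(pop)

def sys_estimate(pop, n):
    # Systematic sample: random start, then every k-th period
    k = len(pop) // n
    start = random.randrange(k)
    return statistics.mean(pop[start::k][:n]) * len(pop)

def mse(estimator, pop, n, reps=2000):
    # Monte Carlo mean square error of the estimated total effort
    true_total = sum(pop)
    return statistics.mean((estimator(pop, n) - true_total) ** 2
                           for _ in range(reps))
```

For example, with 24 hourly angler counts peaking at midday and 6 sampled periods, `mse(sys_estimate, pop, 6)` comes out far below `mse(srs_estimate, pop, 6)`.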

  19. Robotic influence in the conceptual design of mechanical systems in space and vice versa - A survey

    NASA Technical Reports Server (NTRS)

    Sanger, George F.

    1988-01-01

    A survey of methods using robotic devices to construct structural elements in space is presented. Two approaches to robotic construction are considered: one in which the structural elements are designed using conventional aerospace techniques, which tend to constrain the functional aspects of robotics, and one in which the structural elements are designed from the conceptual stage with built-in robotic features. Examples are presented of structural building concepts using robotics, including the construction of the SP-100 nuclear reactor power system, a multimirror large aperture IR space telescope concept, retrieval and repair in space, and the Flight Telerobotic Servicer.

  20. Radiologists' requirements for primary diagnosis workstations: preliminary results of task-based design surveys

    NASA Astrophysics Data System (ADS)

    Hohman, Suzan A.; Johnson, Sandra L.; Valentino, Daniel J.; Taira, Ricky K.; Manzo, William A.

    1994-05-01

    There has been a tremendous amount of effort put into the design of diagnostic radiology workstations; however, few workstations have been clinically accepted. Among the requirements for a clinically acceptable workstation are good image quality, a well designed user interface, and access to all relevant diagnostic information. The user-interface design should reflect radiologists' film reading habits and encourage new reading methods that take advantage of the electronic environment. As part of our effort to improve diagnostic workstation design, we surveyed radiologists in the UCLA Department of Radiological Sciences. Sixteen radiologists from the fields of pediatric, genitourinary, thoracic, and neuroradiology participated in the initial survey. We asked their opinions regarding our PACS infrastructure performance and our existing diagnostic workstations. We also asked them to identify certain pathologies that they found to be less evident on workstations as compared to film. We are using this information to determine the current limitations of diagnostic workstations and to develop a user interface design that addresses the clinical requirements of a busy tertiary care medical center and the radiologists who use it.

  1. Design and Implementation Issues in Surveying the Views of Young Children in Ethnolinguistically Diverse Developing Country Contexts

    ERIC Educational Resources Information Center

    Smith, Hilary A.; Haslett, Stephen J.

    2016-01-01

    This paper discusses issues in the development of a methodology appropriate for eliciting sound quantitative data from primary school children in the complex contexts of ethnolinguistically diverse developing countries. Although these issues often occur in field-based surveys, the large extent and compound effects of their occurrence in…

  2. Quantitation of active pharmaceutical ingredients and excipients in powder blends using designed multivariate calibration models by near-infrared spectroscopy.

    PubMed

    Li, Weiyong; Worosila, Gregory D

    2005-05-13

    This research note demonstrates the simultaneous quantitation of a pharmaceutical active ingredient and three excipients in a simulated powder blend containing acetaminophen, Prosolv and Crospovidone. An experimental design approach was used in generating a 5-level (%, w/w) calibration sample set that included 125 samples. The samples were prepared by weighing suitable amounts of powder into separate 20-mL scintillation vials and were mixed manually. Partial least squares (PLS) regression was used in calibration model development. The models generated accurate results for quantitation of Crospovidone (at 5%, w/w) and magnesium stearate (at 0.5%, w/w). Further testing of the models demonstrated that the 2-level models were as effective as the 5-level ones, which reduced the calibration sample number to 50. The models had a small bias for quantitation of acetaminophen (at 30%, w/w) and Prosolv (at 64.5%, w/w) in the blend. The implication of the bias is discussed. PMID:15848006
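A 5-level calibration set of 125 samples is consistent with a full factorial design over three independently varied components (5³ = 125), with the filler balancing each blend to 100% w/w. The sketch below generates such a design; the specific concentration levels are hypothetical and chosen only to bracket the nominal values quoted in the abstract (30% acetaminophen, 5% Crospovidone, 0.5% magnesium stearate).

```python
from itertools import product

# Hypothetical 5-level (%, w/w) grids centered on the nominal formulation;
# Prosolv acts as the filler that brings each blend to 100% w/w.
levels = {
    "acetaminophen": [24, 27, 30, 33, 36],
    "crospovidone": [3, 4, 5, 6, 7],
    "mg_stearate": [0.3, 0.4, 0.5, 0.6, 0.7],
}

design = []
for apap, cros, mgst in product(*levels.values()):
    prosolv = 100 - apap - cros - mgst  # filler balances the blend
    design.append({"acetaminophen": apap, "crospovidone": cros,
                   "mg_stearate": mgst, "prosolv": prosolv})
```

A 2-level version of the same construction (high/low per component, plus replicates or center points) is how the sample count could drop toward the 50 samples the note reports for the reduced models.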

  3. Quantitatively Mapping Cellular Viscosity with Detailed Organelle Information via a Designed PET Fluorescent Probe

    PubMed Central

    Liu, Tianyu; Liu, Xiaogang; Spring, David R.; Qian, Xuhong; Cui, Jingnan; Xu, Zhaochao

    2014-01-01

    Viscosity is a fundamental physical parameter that influences diffusion in biological processes. The distribution of intracellular viscosity is highly heterogeneous, and it is challenging to obtain a full map of cellular viscosity with detailed organelle information. In this work, we report 1 as the first fluorescent viscosity probe which is able to quantitatively map cellular viscosity with detailed organelle information based on the PET mechanism. This probe exhibited a significant ratiometric fluorescence intensity enhancement as solvent viscosity increases. The emission intensity increase was attributed to combined effects of the inhibition of PET due to restricted conformational access (favorable for FRET, but not for PET), and the decreased PET efficiency caused by viscosity-dependent twisted intramolecular charge transfer (TICT). A full map of subcellular viscosity was successfully constructed via fluorescent ratiometric detection and fluorescence lifetime imaging; it was found that lysosomal regions in a cell possess the highest viscosity, followed by mitochondrial regions. PMID:24957323

  4. A quantitative method for groundwater surveillance monitoring network design at the Hanford Site

    SciTech Connect

    Meyer, P.D.

    1993-12-01

    As part of the Environmental Surveillance Program at the Hanford Site, mandated by the US Department of Energy, hundreds of groundwater wells are sampled each year, with each sample typically analyzed for a variety of constituents. The groundwater sampling program must satisfy several broad objectives. These objectives include an integrated assessment of the condition of groundwater and the identification and quantification of existing, emerging, or potential groundwater problems. Several quantitative network design objectives are proposed and a mathematical optimization model is developed from these objectives. The model attempts to find minimum cost network alternatives that maximize the amount of information generated by the network. Information is measured both by the rate of change with respect to time of the contaminant concentration and the uncertainty in contaminant concentration. In an application to tritium monitoring at the Hanford Site, both information measures were derived from historical data using time series analysis.

  5. Quantitatively Mapping Cellular Viscosity with Detailed Organelle Information via a Designed PET Fluorescent Probe

    NASA Astrophysics Data System (ADS)

    Liu, Tianyu; Liu, Xiaogang; Spring, David R.; Qian, Xuhong; Cui, Jingnan; Xu, Zhaochao

    2014-06-01

    Viscosity is a fundamental physical parameter that influences diffusion in biological processes. The distribution of intracellular viscosity is highly heterogeneous, and it is challenging to obtain a full map of cellular viscosity with detailed organelle information. In this work, we report 1 as the first fluorescent viscosity probe which is able to quantitatively map cellular viscosity with detailed organelle information based on the PET mechanism. This probe exhibited a significant ratiometric fluorescence intensity enhancement as solvent viscosity increases. The emission intensity increase was attributed to combined effects of the inhibition of PET due to restricted conformational access (favorable for FRET, but not for PET), and the decreased PET efficiency caused by viscosity-dependent twisted intramolecular charge transfer (TICT). A full map of subcellular viscosity was successfully constructed via fluorescent ratiometric detection and fluorescence lifetime imaging; it was found that lysosomal regions in a cell possess the highest viscosity, followed by mitochondrial regions.

  6. Quantitative evaluation of water bodies dynamic by means of thermal infrared and multispectral surveys on the Venetian lagoon

    NASA Technical Reports Server (NTRS)

    Alberotanza, L.; Lechi, G. M.

    1977-01-01

    Surveys employing a two channel Daedalus infrared scanner and multispectral photography were performed. The spring waning tide, the velocity of the water mass, and the types of suspended matter were among the topics studied. Temperature, salinity, sediment transport, and ebb stream velocity were recorded. The bottom topography was correlated with the dynamic characteristics of the sea surface.

  7. Design and methods of the Adult Inuit Health Survey 2007–2008

    PubMed Central

    Saudny, Helga; Leggee, Donna; Egeland, Grace

    2012-01-01

    Background The Canadian International Polar Year (IPY) program made it possible to undertake much needed health research in 3 jurisdictions within the Canadian Inuit Nunangat (homeland) over a 2-year period: Inuvialuit Settlement Region (ISR), Nunavut Territory, and Nunatsiavut. Design The Adult Inuit Health Survey (IHS) was a cross-sectional survey and provides baseline data upon which future comparisons can be made for prospectively assessing factors leading to the progression of chronic diseases among Canadian Inuit. With the help of the Canadian Coast Guard Ship Amundsen, which was equipped with research and laboratory facilities, 33 coastal communities were visited; land survey teams visited 3 inland communities. Results The Adult IHS succeeded in obtaining important baseline information concerning the health status and living conditions of 2,595 adults living in ISR, Nunavut and Nunatsiavut. Conclusion Information from this survey will be useful for future comparisons and the opportunity to link with the International Inuit Cohort, a follow-up evaluation, and for the development of future health policies and public health interventions. PMID:23166895

  8. Microbial-based evaluation of foaming events in full-scale wastewater treatment plants by microscopy survey and quantitative image analysis.

    PubMed

    Leal, Cristiano; Amaral, António Luís; Costa, Maria de Lourdes

    2016-08-01

    Activated sludge systems are prone to be affected by foaming occurrences, causing the sludge to rise in the reactor and affecting wastewater treatment plant (WWTP) performance. Nonetheless, there is currently a knowledge gap hindering the development of foaming event prediction tools that may be fulfilled by the quantitative monitoring of AS systems biota and sludge characteristics. As such, the present study focuses on the assessment of foaming events in full-scale WWTPs, by quantitative protozoa, metazoa, filamentous bacteria, and sludge characteristics analysis, further used to enlighten the inner relationships between these parameters. In the current study, a conventional activated sludge system (CAS) and an oxidation ditch (OD) were surveyed throughout a period of 2 and 3 months, respectively, regarding their biota and sludge characteristics. The biota community was monitored by microscopic observation, and a new filamentous bacteria index was developed to quantify their occurrence. Sludge characteristics (aggregated and filamentous biomass contents and aggregate size) were determined by quantitative image analysis (QIA). The obtained data was then processed by principal components analysis (PCA), cross-correlation analysis, and decision trees to assess the foaming occurrences, and enlighten the inner relationships. It was found that such events were best assessed by the combined use of the relative abundance of testate amoeba and nocardioform filamentous index, presenting a 92.9 % success rate for overall foaming events, and 87.5 and 100 %, respectively, for persistent and mild events. PMID:27130343

  9. Wide-Field InfraRed Survey Telescope (WFIRST) Slitless Spectrometer: Design, Prototype, and Results

    NASA Technical Reports Server (NTRS)

    Gong, Qian; Content, David; Dominguez, Margaret; Emmett, Thomas; Griesmann, Ulf; Hagopian, John; Kruk, Jeffrey; Marx, Catherine; Pasquale, Bert; Wallace, Thomas; Whipple, Arthur

    2016-01-01

    The slitless spectrometer plays an important role in the Wide-Field InfraRed Survey Telescope (WFIRST) mission for the survey of emission-line galaxies. This will be an unprecedented very wide field, HST quality 3D survey of emission line galaxies. The concept of the compound grism as a slitless spectrometer has been presented previously. This paper briefly discusses the challenges and solutions of the optical design and recent specification updates, as well as a brief comparison between the prototype and the latest design. However, the emphasis of this paper is the progress of the grism prototype: the fabrication and test of the complicated diffractive optical elements and powered prism, as well as grism assembly alignment and testing. In particular, we describe how different tools and methods, such as IR phase-shift and wavelength-shift interferometry, were used to complete the element and assembly tests. The paper also presents very encouraging results from recent element and assembly tests. Finally, we briefly outline the plan to test spectral characteristics, such as spectral resolution and response.

  10. Loop Shaping Control Design for a Supersonic Propulsion System Model Using Quantitative Feedback Theory (QFT) Specifications and Bounds

    NASA Technical Reports Server (NTRS)

    Connolly, Joseph W.; Kopasakis, George

    2010-01-01

    This paper covers the propulsion system component modeling and controls development of an integrated mixed compression inlet and turbojet engine that will be used for an overall vehicle Aero-Propulso-Servo-Elastic (APSE) model. Using previously created nonlinear component-level propulsion system models, a linear integrated propulsion system model and loop shaping control design have been developed. The design includes both inlet normal shock position control and jet engine rotor speed control for a potential supersonic commercial transport. A preliminary investigation of the impacts of the aero-elastic effects on the incoming flow field to the propulsion system is discussed; however, the focus here is on developing a methodology for the propulsion controls design that prevents unstart in the inlet and minimizes the thrust oscillation experienced by the vehicle. Quantitative Feedback Theory (QFT) specifications and bounds, and aspects of classical loop shaping, are used in the control design process. Model uncertainty is incorporated in the design to address possible error in the system identification mapping of the nonlinear component models into the integrated linear model.

  11. Comprehension and Recall of Internet News: A Quantitative Study of Web Page Design.

    ERIC Educational Resources Information Center

    Berry, D. Leigh

    This experimental study examined the effects of multimedia on Internet news readers, in particular focusing on Web site design and its effect on comprehension and recall of news stories. Subjects (84 undergraduate students) viewed one of two versions of the same Web site--one with multimedia and one without. The Web site consisted of six stories…

  12. "Intelligent design" of a 3D reflection survey for the SAFOD drill-hole site

    NASA Astrophysics Data System (ADS)

    Alvarez, G.; Hole, J. A.; Klemperer, S. L.; Biondi, B.; Imhof, M.

    2003-12-01

    SAFOD seeks to better understand the earthquake process by drilling though the San Andreas fault (SAF) to sample an earthquake in situ. To capitalize fully on the opportunities presented by the 1D drill-hole into a complex fault zone we must characterize the surrounding 3D geology at a scale commensurate with the drilling observations, to provide the structural context to extrapolate 1D drilling results along the fault plane and into the surrounding 3D volume. Excellent active-2D and passive-3D seismic observations completed and underway lack the detailed 3D resolution required. Only an industry-quality 3D reflection survey can provide c. 25 m subsurface sample-spacing horizontally and vertically. A 3D reflection survey will provide subsurface structural and stratigraphic control at the 100-m level, mapping major geologic units, structural boundaries, and subsurface relationships between the many faults that make up the SAF fault system. A principal objective should be a reflection-image (horizon-slice through the 3D volume) of the near-vertical fault plane(s) to show variations in physical properties around the drill-hole. Without a 3D reflection image of the fault zone, we risk interpreting drilled anomalies as ubiquitous properties of the fault, or risk missing important anomalies altogether. Such a survey cannot be properly costed or technically designed without major planning. "Intelligent survey design" can minimize source and receiver effort without compromising data quality at the fault target. Such optimization can in principle reduce the cost of a 3D seismic survey by a factor of two or three, utilizing the known surface logistic constraints, partially-known sub-surface velocity field, and the suite of scientific targets at SAFOD. Our methodology poses the selection of the survey parameters as an optimization process that allows the parameters to vary spatially in response to changes in the subsurface. The acquisition geometry is locally optimized for

  13. Simulation of complete seismic surveys for evaluation of experiment design and processing

    SciTech Connect

    Oezdenvar, T.; McMechan, G.A.; Chaney, P.

    1996-03-01

    Synthesis of complete seismic survey data sets allows analysis and optimization of all stages in an acquisition/processing sequence. The characteristics of available survey designs, parameter choices, and processing algorithms may be evaluated prior to field acquisition to produce a composite system in which all stages have compatible performance; this maximizes the cost effectiveness for a given level of accuracy, or for targets with specific characteristics. Data sets synthesized for three salt structures provide representative comparisons of time and depth migration, post-stack and prestack processing, and illustrate effects of varying recording aperture and shot spacing, iterative focusing analysis, and the interaction of migration algorithms with recording aperture. A final example demonstrates successful simulation of both 2-D acquisition and processing of a real data line over a salt pod in the Gulf of Mexico.

  14. KUIPER BELT OBJECT OCCULTATIONS: EXPECTED RATES, FALSE POSITIVES, AND SURVEY DESIGN

    SciTech Connect

    Bickerton, S. J.; Welch, D. L.; Kavelaars, J. J. E-mail: welch@physics.mcmaster.ca

    2009-05-15

    A novel method of generating artificial scintillation noise is developed and used to evaluate occultation rates and false positive rates for surveys probing the Kuiper Belt with the method of serendipitous stellar occultations. A thorough examination of survey design shows that (1) diffraction-dominated occultations are critically (Nyquist) sampled at a rate of 2 Fsu⁻¹, corresponding to 40 s⁻¹ for objects at 40 AU, (2) occultation detection rates are maximized when targets are observed at solar opposition, (3) Main Belt asteroids will produce occultation light curves identical to those of Kuiper Belt Objects (KBOs) if target stars are observed at solar elongations of 116° ≲ ε ≲ 125° or 131° ≲ ε ≲ 141°, and (4) genuine KBO occultations are likely to be so rare that a detection threshold of ≳7-8σ should be adopted to ensure that viable candidate events can be disentangled from false positives.
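The quoted sampling rate can be reproduced from first principles in a short back-of-envelope sketch. The assumptions here are mine, not the paper's exact values: a mid-visible wavelength of 500 nm, a circular KBO orbit, and an opposition shadow speed of roughly v_earth(1 - a^(-1/2)).

```python
import math

AU = 1.496e11       # m
LAMBDA = 5.0e-7     # m; assumed mid-visible observing wavelength
V_EARTH = 29.8e3    # m/s; Earth's mean orbital speed

def fresnel_scale(distance_au, wavelength=LAMBDA):
    # One Fresnel scale unit (Fsu): F = sqrt(lambda * d / 2)
    return math.sqrt(wavelength * distance_au * AU / 2)

def nyquist_rate(distance_au):
    # At opposition, the shadow sweeps the observer at roughly
    # v = v_earth * (1 - a^-1/2) for a circular orbit of radius a (AU).
    v = V_EARTH * (1 - distance_au ** -0.5)
    crossing_time = fresnel_scale(distance_au) / v  # time to cross 1 Fsu
    return 2 / crossing_time                        # 2 samples per Fsu
```

For an object at 40 AU this gives a Fresnel scale of roughly 1.2 km and a required sampling rate of about 40 Hz, matching the abstract's figure.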

  15. Implementing the World Mental Health Survey Initiative in Portugal – rationale, design and fieldwork procedures

    PubMed Central

    2013-01-01

    Background The World Mental Health Survey Initiative was designed to evaluate the prevalence, the correlates, the impact and the treatment patterns of mental disorders. This paper describes the rationale and the methodological details regarding the implementation of the survey in Portugal, a country that still lacks representative epidemiological data about psychiatric disorders. Methods The World Mental Health Survey is a cross-sectional study with a representative sample of the Portuguese population, aged 18 or older, based on official census information. The WMH-Composite International Diagnostic Interview, adapted to the Portuguese language by a group of bilingual experts, was used to evaluate the mental health status, disorder severity, impairment, use of services and treatment. Interviews were administered face-to-face at respondents’ dwellings, which were selected from a nationally representative multi-stage clustered area probability sample of households. The survey was administered using computer-assisted personal interview methods by trained lay interviewers. Data quality was strictly controlled in order to ensure the reliability and validity of the collected information. Results A total of 3,849 people completed the main survey, with 2,060 completing the long interview, with a response rate of 57.3%. Data cleaning was conducted in collaboration with the WMHSI Data Analysis Coordination Centre at the Department of Health Care Policy, Harvard Medical School. Collected information will provide lifetime and 12-month mental disorder diagnoses, according to the International Classification of Diseases and to the Diagnostic and Statistical Manual of Mental Disorders. Conclusions The findings of this study could have a major influence on mental health care policy planning efforts over the next years, especially in a country that still has a significant level of unmet needs regarding mental health services organization, delivery of care and epidemiological

  16. Sampling effects on the identification of roadkill hotspots: Implications for survey design.

    PubMed

    Santos, Sara M; Marques, J Tiago; Lourenço, André; Medinas, Denis; Barbosa, A Márcia; Beja, Pedro; Mira, António

    2015-10-01

    Although locating wildlife roadkill hotspots is essential to mitigate road impacts, the influence of study design on hotspot identification remains uncertain. We evaluated how sampling frequency affects the accuracy of hotspot identification, using a dataset of vertebrate roadkills (n = 4427) recorded over a year of daily surveys along 37 km of roads. "True" hotspots were identified using this baseline dataset, as the 500-m segments where the number of road-killed vertebrates exceeded the upper 95% confidence limit of the mean, assuming a Poisson distribution of roadkills per segment. "Estimated" hotspots were identified likewise, using datasets representing progressively lower sampling frequencies, which were produced by extracting data from the baseline dataset at appropriate time intervals (1-30 days). Overall, 24.3% of segments were "true" hotspots, concentrating 40.4% of roadkills. For different groups, "true" hotspots accounted for between 6.8% (bats) and 29.7% (small birds) of road segments, concentrating from <40% (frogs and toads, snakes) to >60% (lizards, lagomorphs, carnivores) of roadkills. Spatial congruence between "true" and "estimated" hotspots declined rapidly with increasing time interval between surveys, due primarily to increasing false negatives (i.e., missing "true" hotspots). There were also false positives (i.e., wrong "estimated" hotspots), particularly at low sampling frequencies. The decay in spatial accuracy with increasing time interval between surveys was greater for smaller-bodied species (amphibians, reptiles, small birds, small mammals) than for larger-bodied species (birds of prey, hedgehogs, lagomorphs, carnivores). Results suggest that widely used surveys at weekly or longer intervals may produce poor estimates of roadkill hotspots, particularly for small-bodied species. Surveying daily or at two-day intervals may be required to achieve high accuracy in hotspot identification for multiple species. PMID:26232568
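
    The hotspot criterion described above (flagging segments whose counts exceed an upper 95% limit under a Poisson assumption) can be sketched as follows. One common reading takes the 95% quantile of a Poisson distribution with the observed mean count; the paper's exact computation may differ, and the segment counts below are invented for illustration:

```python
import math

def poisson_upper_95(mean: float) -> int:
    """Smallest count k such that P(X <= k) >= 0.95 for X ~ Poisson(mean)."""
    p = math.exp(-mean)  # P(X = 0)
    cdf, k = p, 0
    while cdf < 0.95:
        k += 1
        p *= mean / k    # P(X = k) from P(X = k - 1)
        cdf += p
    return k

def find_hotspots(segment_counts):
    """Indices of segments whose count exceeds the Poisson 95% upper limit."""
    mean = sum(segment_counts) / len(segment_counts)
    threshold = poisson_upper_95(mean)
    return [i for i, c in enumerate(segment_counts) if c > threshold]

# Invented roadkill counts for ten 500-m segments (not the paper's data)
counts = [2, 1, 0, 3, 12, 1, 2, 9, 0, 1]
print(find_hotspots(counts))  # -> [4, 7]
```

    With a mean of 3.1 roadkills per segment the 95% threshold is 6, so only the two segments with 12 and 9 records are flagged.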

  17. Designing, Testing, and Validating an Attitudinal Survey on an Environmental Topic: A Groundwater Pollution Survey Instrument for Secondary School Students

    ERIC Educational Resources Information Center

    Lacosta-Gabari, Idoya; Fernandez-Manzanal, Rosario; Sanchez-Gonzalez, Dolores

    2009-01-01

    Research on the assessment of environmental attitudes has increased significantly in recent years, and the development of attitude scales targeted at specific environmental problems has often been proposed. This paper describes the Groundwater Pollution Test (GPT), a 19-item survey instrument using a Likert-type scale. The survey has been used with…

  18. The Proteome of Human Liver Peroxisomes: Identification of Five New Peroxisomal Constituents by a Label-Free Quantitative Proteomics Survey

    PubMed Central

    Ofman, Rob; Bunse, Christian; Pawlas, Magdalena; Hayen, Heiko; Eisenacher, Martin; Stephan, Christian; Meyer, Helmut E.; Waterham, Hans R.; Erdmann, Ralf; Wanders, Ronald J.; Warscheid, Bettina

    2013-01-01

    The peroxisome is a key organelle of low abundance that fulfils various functions essential for human cell metabolism. Severe genetic diseases in humans are caused by defects in peroxisome biogenesis or deficiencies in the function of single peroxisomal proteins. To improve our knowledge of this important cellular structure, we studied for the first time human liver peroxisomes by quantitative proteomics. Peroxisomes were isolated by differential and Nycodenz density gradient centrifugation. A label-free quantitative study of 314 proteins across the density gradient was accomplished using high resolution mass spectrometry. By pairing statistical data evaluation, cDNA cloning and in vivo colocalization studies, we report the association of five new proteins with human liver peroxisomes. Among these, isochorismatase domain containing 1 protein points to the existence of a new metabolic pathway and hydroxysteroid dehydrogenase like 2 protein is likely involved in the transport or β-oxidation of fatty acids in human peroxisomes. The detection of alcohol dehydrogenase 1A suggests the presence of an alternative alcohol-oxidizing system in hepatic peroxisomes. In addition, lactate dehydrogenase A and malate dehydrogenase 1 partially associate with human liver peroxisomes and enzyme activity profiles support the idea that NAD+ becomes regenerated during fatty acid β-oxidation by alternative shuttling processes in human peroxisomes involving lactate dehydrogenase and/or malate dehydrogenase. Taken together, our data represent a valuable resource for future studies of peroxisome biochemistry that will advance research of human peroxisomes in health and disease. PMID:23460848

  19. Design and Performance Considerations for the Quantitative Measurement of HEU Residues Resulting from 99Mo Production

    SciTech Connect

    McElroy, Robert Dennis; Chapman, Jeffrey Allen; Bogard, James S; Belian, Anthony P

    2011-01-01

    Molybdenum-99 is produced by the irradiation of high-enriched uranium (HEU) resulting in the accumulation of large quantities of HEU residues. In general, these residues are not recycled but are either disposed of or stored in containers with surface exposure rates as high as 100 R/h. The 235U content of these waste containers must be quantified for both accountability and waste disposal purposes. The challenges of quantifying such difficult-to-assay materials are discussed, along with performance estimates for each of several potential assay options. In particular, the design and performance of a High Activity Active Well Coincidence Counting (HA-AWCC) system designed and built specifically for these irradiated HEU waste materials are presented.

  20. Campsite survey implications for managing designated campsites at Great Smoky Mountains National Park

    USGS Publications Warehouse

    Marion, J.L.; Leung, Y.-F.

    1998-01-01

    Backcountry campsites and shelters in Great Smoky Mountains National Park were surveyed in 1993 as part of a new impact monitoring program. A total of 395 campsites and shelters were located and assessed, including 309 legal campsites located at 84 designated campgrounds, 68 illegal campsites, and 18 shelters. Primary campsite management problems identified by the survey include: (1) campsite proliferation, (2) campsite expansion and excessive size, (3) excessive vegetation loss and soil exposure, (4) lack of visitor solitude at campsites, (5) excessive tree damage, and (6) illegal camping. A number of potential management options are recommended to address the identified campsite management problems. Many problems are linked to the ability of visitors to determine the location and number of individual campsites within each designated campground. A principal recommendation is that managers apply site-selection criteria to existing and potential new campsite locations to identify and designate campsites that will resist and constrain the areal extent of impacts and enhance visitor solitude. Educational solutions are also offered.

  1. Injury survey of a non-traditional 'soft-edged' trampoline designed to lower equipment hazards.

    PubMed

    Eager, David B; Scarrott, Carl; Nixon, Jim; Alexander, Keith

    2013-01-01

    In Australia trampolines contribute one quarter of all childhood play equipment injuries. The objective of this study was to gather and evaluate injury data from a non-traditional, 'soft-edged', consumer trampoline, whose design aimed to minimise injuries from the equipment and from falling off. The manufacturer of the non-traditional trampoline provided the University of Technology Sydney with their Australian customer database. The study involved surveys in Queensland and New South Wales between May 2007 and March 2010. Injury data were gathered initially through a phone-interview pilot study and then, in the full study, through an email survey. The 3817 respondents were the carers of child users of the 'soft-edged' trampolines. Responses were compared with Australian and US emergency department data. In both countries the proportion of injuries caused by the equipment and by falling off was compared with the proportion caused by the jumpers to themselves or each other. The comparisons showed that a significantly lower proportion of injuries resulted from falling off or hitting the equipment for this design than for traditional trampolines, both in Australia and the US. This research concludes that equipment-induced and falling-off injuries, the more severe injuries on traditional trampolines, can be significantly reduced with appropriate trampoline design. PMID:22471672

  2. DESIGN AND APPLICATION OF A STRATIFIED UNEQUAL-PROBABILITY STREAM SURVEY IN THE MID-ATLANTIC COASTAL PLAIN

    EPA Science Inventory

    A stratified random sample with unequal probability selection within strata was used to design a multipurpose survey of headwater watersheds in the Mid-Atlantic Coastal Plain. Objectives for data from the survey include unbiased estimates of regional headwater watershed condition...

  3. Improving the design of acoustic and midwater trawl surveys through stratification, with an application to Lake Michigan prey fishes

    USGS Publications Warehouse

    Adams, J.V.; Argyle, R.L.; Fleischer, G.W.; Curtis, G.L.; Stickel, R.G.

    2006-01-01

    Reliable estimates of fish biomass are vital to the management of aquatic ecosystems and their associated fisheries. Acoustic and midwater trawl surveys are an efficient sampling method for estimating fish biomass in large bodies of water. To improve the precision of biomass estimates from combined acoustic and midwater trawl surveys, sampling effort should be optimally allocated within each stage of the survey design. Based on information collected during fish surveys, we developed an approach to improve the design of combined acoustic and midwater trawl surveys through stratification. Geographic strata for acoustic surveying and depth strata for midwater trawling were defined using neighbor-restricted cluster analysis, and the optimal allocation of sampling effort for each was then determined. As an example, we applied this survey stratification approach to data from lakewide acoustic and midwater trawl surveys of Lake Michigan prey fishes. Precision of biomass estimates from surveys with and without geographic stratification was compared through resampling. Use of geographic stratification with optimal sampling allocation reduced the variance of Lake Michigan acoustic biomass estimates by 77%. Stratification and optimal allocation at each stage of an acoustic and midwater trawl survey should serve to reduce the variance of the resulting biomass estimates.
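
    In stratified surveys, "optimal allocation of sampling effort" commonly means Neyman allocation, with effort proportional to stratum size times stratum standard deviation; whether the authors used exactly this rule is not stated in the abstract. A minimal sketch with invented strata:

```python
def neyman_allocation(stratum_sizes, stratum_sds, total_samples):
    """Allocate sampling effort proportional to N_h * S_h (Neyman allocation),
    using largest-remainder rounding so allocations sum to total_samples."""
    weights = [n * s for n, s in zip(stratum_sizes, stratum_sds)]
    total_w = sum(weights)
    raw = [total_samples * w / total_w for w in weights]
    alloc = [int(r) for r in raw]
    shortfall = total_samples - sum(alloc)
    by_remainder = sorted(range(len(raw)), key=lambda i: raw[i] - alloc[i], reverse=True)
    for i in by_remainder[:shortfall]:
        alloc[i] += 1
    return alloc

# Hypothetical strata: sizes (e.g. transect counts) and biomass SDs
print(neyman_allocation([40, 25, 10], [3.0, 8.0, 1.5], 60))  # -> [21, 36, 3]
```

    The highly variable middle stratum receives most of the effort even though it is not the largest, which is exactly how stratification with optimal allocation reduces the variance of the overall biomass estimate.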

  4. Experimental designs and statistical methods for mapping quantitative trait loci underlying triploid endosperm traits without maternal genetic variation.

    PubMed

    Wen, Yongxian; Wu, Weiren

    2008-01-01

    Many endosperm traits are related to grain quality in cereal crops. Endosperm traits are mainly controlled by the endosperm genome but may be affected by the maternal genome. Studies have shown that maternal genotypic variation could greatly influence the estimation of the direct effects of quantitative trait loci (QTLs) underlying endosperm traits. In this paper, we propose methods of interval mapping of endosperm QTLs using seeds of F2 or BC1 (an equal mixture of F1 x P1 and F1 x P2 with F1 as the female parent) derived from a cross between 2 pure lines (P1 x P2). The most significant advantage of our experimental designs is that the maternal effects do not contribute to the genetic variation of endosperm traits and therefore the direct effects of endosperm QTLs can be estimated without the influence of maternal effects. In addition, the experimental designs can greatly reduce environmental variation because a few F1 plants grown in a small block of field will produce sufficient F2 or BC1 seeds for endosperm QTL analysis. Simulation studies show that the methods can efficiently detect endosperm QTLs and unbiasedly estimate their positions and effects. The BC1 design is better than the F2 design. PMID:18544551

  5. Cigarette pack design and adolescent smoking susceptibility: a cross-sectional survey

    PubMed Central

    Ford, Allison; MacKintosh, Anne Marie; Moodie, Crawford; Richardson, Sol; Hastings, Gerard

    2013-01-01

    Objectives To compare adolescents’ responses to three different styles of cigarette packaging: novelty (branded packs designed with a distinctive shape, opening style or bright colour), regular (branded pack with no special design features) and plain (brown pack with a standard shape and opening and all branding removed, aside from brand name). Design Cross-sectional in-home survey. Setting UK. Participants Random location quota sample of 1025 never smokers aged 11–16 years. Main outcome measures Susceptibility to smoking and composite measures of pack appraisal and pack receptivity derived from 11 survey items. Results Mean responses to the three pack types were negative for all survey items. However, ‘novelty’ packs were rated significantly less negatively than the ‘regular’ pack on most items, and the novelty and regular packs were rated less negatively than the ‘plain’ pack. For the novelty packs, logistic regressions, controlling for factors known to influence youth smoking, showed that susceptibility was associated with positive appraisal and also receptivity. For example, those receptive to the innovative Silk Cut Superslims pack were more than four times as likely to be susceptible to smoking than those not receptive to this pack (AOR=4.42, 95% CI 2.50 to 7.81, p<0.001). For the regular pack, an association was found between positive appraisal and susceptibility but not with receptivity and susceptibility. There was no association with pack appraisal or receptivity for the plain pack. Conclusions Pack structure (shape and opening style) and colour are independently associated, not just with appreciation of and receptivity to the pack, but also with susceptibility to smoke. In other words, those who think most highly of novelty cigarette packaging are also the ones who indicate that they are most likely to go on to smoke. Plain packaging, in contrast, was found to directly reduce the appeal of smoking to adolescents. PMID:24056481

  6. Decision making preferences in the medical encounter – a factorial survey design

    PubMed Central

    Müller-Engelmann, Meike; Krones, Tanja; Keller, Heidi; Donner-Banzhoff, Norbert

    2008-01-01

    Background Up to now it has not been systematically investigated in which kinds of clinical situations a consultation style based on shared decision making (SDM) is preferred by patients and physicians. We suggest the factorial survey design to address this problem. This method, which has so far hardly been used in health services research, allows relevant factors describing clinical situations to be varied systematically as variables in an experimental random design, and their importance to be investigated in large samples. Methods/Design To identify situational factors for the survey we first performed a literature search, which was followed by a qualitative interview study with patients, physicians and health care experts. As a result, 7 factors (e.g. "Reason for consultation" and "Number of therapeutic options") with 2 to 3 levels each (e.g. "One therapeutic option" and "More than one therapeutic option") will be included in the study. For the survey the factor levels will be randomly combined into short stories describing different treatment situations. A randomized sample of all possible short stories will be given to at least 300 subjects (100 GPs, 100 patients and 100 members of self-help groups), who will be asked to rate how the decision should be made. The main outcome measure is the preference for participation in the decision-making process in the given clinical situation. Data analysis will estimate the effects of the factors on the rating and also examine differences between groups. Discussion The results will reveal the effects of situational variations on participation preferences. Thus, our findings will contribute to the understanding of normative values in the medical decision-making process and will improve future implementation of SDM and decision aids. PMID:19091091

  7. Recommended survey designs for occupancy modelling using motion-activated cameras: insights from empirical wildlife data.

    PubMed

    Shannon, Graeme; Lewis, Jesse S; Gerber, Brian D

    2014-01-01

    Motion-activated cameras are a versatile tool that wildlife biologists can use for sampling wild animal populations to estimate species occurrence. Occupancy modelling provides a flexible framework for the analysis of these data; explicitly recognizing that given a species occupies an area the probability of detecting it is often less than one. Despite the number of studies using camera data in an occupancy framework, there is only limited guidance from the scientific literature about survey design trade-offs when using motion-activated cameras. A fuller understanding of these trade-offs will allow researchers to maximise available resources and determine whether the objectives of a monitoring program or research study are achievable. We use an empirical dataset collected from 40 cameras deployed across 160 km(2) of the Western Slope of Colorado, USA to explore how survey effort (number of cameras deployed and the length of sampling period) affects the accuracy and precision (i.e., error) of the occupancy estimate for ten mammal and three virtual species. We do this using a simulation approach where species occupancy and detection parameters were informed by empirical data from motion-activated cameras. A total of 54 survey designs were considered by varying combinations of sites (10-120 cameras) and occasions (20-120 survey days). Our findings demonstrate that increasing total sampling effort generally decreases error associated with the occupancy estimate, but changing the number of sites or sampling duration can have very different results, depending on whether a species is spatially common or rare (occupancy = ψ) and easy or hard to detect when available (detection probability = p). For rare species with a low probability of detection (i.e., raccoon and spotted skunk) the required survey effort includes maximizing the number of sites and the number of survey days, often to a level that may be logistically unrealistic for many studies. For common species with

  8. Recommended survey designs for occupancy modelling using motion-activated cameras: insights from empirical wildlife data

    PubMed Central

    Lewis, Jesse S.; Gerber, Brian D.

    2014-01-01

    Motion-activated cameras are a versatile tool that wildlife biologists can use for sampling wild animal populations to estimate species occurrence. Occupancy modelling provides a flexible framework for the analysis of these data; explicitly recognizing that given a species occupies an area the probability of detecting it is often less than one. Despite the number of studies using camera data in an occupancy framework, there is only limited guidance from the scientific literature about survey design trade-offs when using motion-activated cameras. A fuller understanding of these trade-offs will allow researchers to maximise available resources and determine whether the objectives of a monitoring program or research study are achievable. We use an empirical dataset collected from 40 cameras deployed across 160 km2 of the Western Slope of Colorado, USA to explore how survey effort (number of cameras deployed and the length of sampling period) affects the accuracy and precision (i.e., error) of the occupancy estimate for ten mammal and three virtual species. We do this using a simulation approach where species occupancy and detection parameters were informed by empirical data from motion-activated cameras. A total of 54 survey designs were considered by varying combinations of sites (10–120 cameras) and occasions (20–120 survey days). Our findings demonstrate that increasing total sampling effort generally decreases error associated with the occupancy estimate, but changing the number of sites or sampling duration can have very different results, depending on whether a species is spatially common or rare (occupancy = ψ) and easy or hard to detect when available (detection probability = p). For rare species with a low probability of detection (i.e., raccoon and spotted skunk) the required survey effort includes maximizing the number of sites and the number of survey days, often to a level that may be logistically unrealistic for many studies. For common species with
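
    A useful back-of-the-envelope companion to these occupancy simulations is the cumulative detection probability over K occasions, p* = 1 - (1 - p)^K, which makes concrete why rare, hard-to-detect species demand so much more survey effort. The detection probabilities below are illustrative values, not estimates from the paper:

```python
def prob_detected_at_least_once(p: float, occasions: int) -> float:
    """Probability an occupied site yields at least one detection in K occasions."""
    return 1.0 - (1.0 - p) ** occasions

def occasions_needed(p: float, target: float = 0.95) -> int:
    """Smallest number of occasions reaching the target cumulative detection probability."""
    k = 1
    while prob_detected_at_least_once(p, k) < target:
        k += 1
    return k

# Hard-to-detect (p = 0.05) vs easily detected (p = 0.4) species, illustrative only
print(occasions_needed(0.05), occasions_needed(0.4))  # -> 59 6
```

    A species detected on 40% of occasions when present needs only about 6 survey days per site for 95% cumulative detection, while one detected on 5% of occasions needs roughly 59, mirroring the paper's finding that required effort for rare, cryptic species can become logistically unrealistic.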

  9. Requirements and concept design for large earth survey telescope for SEOS

    NASA Technical Reports Server (NTRS)

    Mailhot, P.; Bisbee, J.

    1975-01-01

    The efforts of a one-year program of Requirements Analysis and Conceptual Design for the Large Earth Survey Telescope for the Synchronous Earth Observatory Satellite are summarized. A 1.4 meter aperture Cassegrain telescope with a 0.6 deg field of view is shown to satisfy the observational requirements for a wide range of earth resources and meteorological applications. The telescope provides imagery or thermal mapping in ten spectral bands at one time in a field-sharing grouping of linear detector arrays. Pushbroom scanning is accomplished by spacecraft slew.

  10. Optical and electronic design of a calibrated multichannel electronic interferometer for quantitative flow visualization

    NASA Astrophysics Data System (ADS)

    Upton, T. D.; Watt, D. W.

    1995-09-01

    Calibrated multichannel electronic interferometry is an electro-optic technique for performing phase shifting of transient phenomena. The design of an improved system for calibrated multichannel electronic interferometry is discussed. This includes a computational method for alignment of three phase-shifted interferograms and determination of the pixel correspondence. During calibration the phase, modulation, and bias of the optical system are determined. These data are stored electronically and used to compensate for errors associated with the path differences in the interferometer, the separation of the phase-shifted interferograms, and the measurement of the phase shift.
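
    Phase recovery from three simultaneously acquired interferograms is conventionally done with the three-step formula for shifts of 0, 120 and 240 degrees; that this particular system used those shift values is an assumption here, not stated in the abstract. For I_k = B + M*cos(phi + d_k), the phase, modulation and bias can all be recovered in closed form:

```python
import math

def three_step_phase(i1: float, i2: float, i3: float):
    """Recover (phase, modulation, bias) from three interferogram intensities
    phase-shifted by 0, 120 and 240 degrees: I_k = B + M*cos(phi + d_k)."""
    num = math.sqrt(3.0) * (i3 - i2)   # equals 3*M*sin(phi)
    den = 2.0 * i1 - i2 - i3           # equals 3*M*cos(phi)
    phase = math.atan2(num, den)
    modulation = math.hypot(num, den) / 3.0
    bias = (i1 + i2 + i3) / 3.0
    return phase, modulation, bias

# Synthetic check with known parameters: B = 2.0, M = 1.0, phi = 0.7 rad
B, M, phi = 2.0, 1.0, 0.7
i = [B + M * math.cos(phi + k * 2.0 * math.pi / 3.0) for k in range(3)]
rec_phi, rec_m, rec_b = three_step_phase(*i)
print(round(rec_phi, 6), round(rec_m, 6), round(rec_b, 6))  # -> 0.7 1.0 2.0
```

    The recovered modulation and bias are exactly the per-pixel calibration quantities the abstract describes storing electronically to compensate for path-difference and phase-shift errors.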