Science.gov

Sample records for quantitative survey design

  1. Quantitative resilience analysis through control design.

    SciTech Connect

    Sunderland, Daniel; Vugrin, Eric D.; Camphouse, Russell Chris

    2009-09-01

    Critical infrastructure resilience has become a national priority for the U.S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. Few quantitative resilience methods exist, and those existing approaches tend to be rather simplistic and, hence, not capable of sufficiently assessing all aspects of critical infrastructure resilience. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the development of quantitative resilience through application of control design methods. Specifically, we conducted a survey of infrastructure models to assess what types of control design might be applicable for critical infrastructure resilience assessment. As a result of this survey, we developed a decision process that directs the resilience analyst to the control method that is most likely applicable to the system under consideration. Furthermore, we developed optimal control strategies for two sets of representative infrastructure systems to demonstrate how control methods could be used to assess the resilience of the systems to catastrophic disruptions. We present recommendations for future work to continue the development of quantitative resilience analysis methods.

  2. Applying Knowledge of Quantitative Design and Analysis

    ERIC Educational Resources Information Center

    Baskas, Richard S.

    2011-01-01

    This study compared and contrasted two quantitative scholarly articles in relation to their research designs. Their designs were analyzed by the comparison of research references and research specific vocabulary to describe how various research methods were used. When researching and analyzing quantitative scholarly articles, it is imperative to…

  3. Report on Solar Water Heating Quantitative Survey

    SciTech Connect

    Focus Marketing Services

    1999-05-06

    This report details the results of a quantitative research study undertaken to better understand the marketplace for solar water-heating systems from the perspective of home builders, architects, and home buyers.

  4. WATERSHED BASED SURVEY DESIGNS

    EPA Science Inventory

    The development of watershed-based design and assessment tools will help to serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional condition to meet Section 305(b), identification of impaired water bodies or wate...

  5. Food Consumption and Handling Survey for Quantitative Microbiological Consumer Phase Risk Assessments.

    PubMed

    Chardon, Jurgen; Swart, Arno

    2016-07-01

    In the consumer phase of a typical quantitative microbiological risk assessment (QMRA), mathematical equations identify data gaps. To acquire useful data we designed a food consumption and food handling survey (2,226 respondents) for QMRA applications that is especially aimed at obtaining quantitative data. For a broad spectrum of food products, the survey covered the following topics: processing status at retail, consumer storage, preparation, and consumption. Questions were designed to facilitate distribution fitting. In the statistical analysis, special attention was given to the selection of the most adequate distribution to describe the data. Bootstrap procedures were used to describe uncertainty. The final result was a coherent quantitative consumer phase food survey and parameter estimates for food handling and consumption practices in The Netherlands, including variation over individuals and uncertainty estimates.
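
    The abstract mentions distribution fitting and bootstrap procedures for describing uncertainty. As an illustrative sketch only (the variable name and storage-time values below are invented, not taken from the survey), a nonparametric bootstrap for the uncertainty of a mean food-handling quantity might look like:

```python
import random
import statistics

def bootstrap_mean_ci(data, n_boot=1000, alpha=0.05, seed=42):
    """Nonparametric bootstrap: resample the data with replacement and take
    empirical quantiles of the resampled means as a confidence interval."""
    rng = random.Random(seed)
    boot_means = sorted(
        statistics.fmean(rng.choices(data, k=len(data)))
        for _ in range(n_boot)
    )
    lo = boot_means[int(alpha / 2 * n_boot)]
    hi = boot_means[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Invented example: reported home storage times (days) for one food product.
storage_days = [1, 2, 2, 3, 3, 3, 4, 5, 7, 10]
lo, hi = bootstrap_mean_ci(storage_days)
print(f"mean = {statistics.fmean(storage_days):.1f} days, 95% CI = ({lo:.1f}, {hi:.1f})")
```

    The same resampling loop can wrap any fitted-distribution parameter (not just the mean) to propagate sampling uncertainty into a QMRA model input.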

  6. Qualities of a Psychiatric Mentor: A Quantitative Singaporean Survey

    ERIC Educational Resources Information Center

    Tor, Phern-Chern; Goh, Lee-Gan; Ang, Yong-Guan; Lim, Leslie; Winslow, Rasaiah-Munidasa; Ng, Beng-Yeong; Wong, Sze-Tai; Ng, Tse-Pin; Kia, Ee-Heok

    2011-01-01

    Objective: Psychiatric mentors are an important part of the new, seamless training program in Singapore. There is a need to assess the qualities of a good psychiatric mentor vis-a-vis those of a good psychiatrist. Method: An anonymous survey was sent out to all psychiatry trainees and psychiatrists in Singapore to assess quantitatively the…

  7. RESOLVE and ECO: Survey Design

    NASA Astrophysics Data System (ADS)

    Kannappan, Sheila; Moffett, Amanda J.; Norris, Mark A.; Eckert, Kathleen D.; Stark, David; Berlind, Andreas A.; Snyder, Elaine M.; Norman, Dara J.; Hoversten, Erik A.; RESOLVE Team

    2016-01-01

    The REsolved Spectroscopy Of a Local VolumE (RESOLVE) survey is a volume-limited census of stellar, gas, and dynamical mass as well as star formation and galaxy interactions within >50,000 cubic Mpc of the nearby cosmic web, reaching down to dwarf galaxies of baryonic mass ~10^9 Msun and spanning multiple large-scale filaments, walls, and voids. RESOLVE is surrounded by the ~10x larger Environmental COntext (ECO) catalog, with matched custom photometry and environment metrics enabling analysis of cosmic variance with greater statistical power. For the ~1500 galaxies in its two equatorial footprints, RESOLVE goes beyond ECO in providing (i) deep 21cm data with adaptive sensitivity ensuring HI mass detections or upper limits <10% of the stellar mass and (ii) 3D optical spectroscopy including both high-resolution ionized gas or stellar kinematic data for each galaxy and broad 320-725nm spectroscopy spanning [OII] 3727, Halpha, and Hbeta. RESOLVE is designed to complement other radio and optical surveys in providing diverse, contiguous, and uniform local/global environment data as well as unusually high completeness extending into the gas-dominated dwarf galaxy regime. RESOLVE also offers superb reprocessed photometry including full, deep NUV coverage and synergy with other equatorial surveys as well as unique northern and southern facilities such as Arecibo, the GBT, and ALMA. The RESOLVE and ECO surveys have been supported by funding from NSF grants AST-0955368 and OCI-1156614.

  8. Watershed-based survey designs

    USGS Publications Warehouse

    Detenbeck, N.E.; Cincotta, D.; Denver, J.M.; Greenlee, S.K.; Olsen, A.R.; Pitchford, A.M.

    2005-01-01

    Watershed-based sampling design and assessment tools help serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional conditions to meet Section 305(b), identification of impaired water bodies or watersheds to meet Section 303(d), and development of empirical relationships between causes or sources of impairment and biological responses. Creation of GIS databases for hydrography, hydrologically corrected digital elevation models, and hydrologic derivatives such as watershed boundaries and upstream–downstream topology of subcatchments would provide a consistent seamless nationwide framework for these designs. The elements of a watershed-based sample framework can be represented either as a continuous infinite set defined by points along a linear stream network, or as a discrete set of watershed polygons. Watershed-based designs can be developed with existing probabilistic survey methods, including the use of unequal probability weighting, stratification, and two-stage frames for sampling. Case studies for monitoring of Atlantic Coastal Plain streams, West Virginia wadeable streams, and coastal Oregon streams illustrate three different approaches for selecting sites for watershed-based survey designs.
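
    The probabilistic survey methods mentioned above include unequal probability weighting. A minimal sketch of one common unequal-probability design, systematic probability-proportional-to-size (PPS) selection, is shown below; the frame of watersheds and the stream-length size measure are invented for illustration:

```python
import random

def pps_systematic(units, sizes, n, seed=7):
    """Systematic probability-proportional-to-size (PPS) selection: lay n
    equally spaced targets along the cumulative size line, then pick the
    unit whose cumulative interval each target falls in. Units with large
    size measures are selected with proportionally higher probability."""
    rng = random.Random(seed)
    step = sum(sizes) / n
    start = rng.uniform(0, step)
    targets = [start + i * step for i in range(n)]
    chosen, cum, idx = [], 0.0, 0
    for t in targets:
        while cum + sizes[idx] < t:
            cum += sizes[idx]
            idx += 1
        chosen.append(units[idx])
    return chosen

# Invented frame: 5 watersheds with unequal stream lengths (the size measure).
watersheds = ["W1", "W2", "W3", "W4", "W5"]
stream_km = [50, 10, 10, 10, 20]
print(pps_systematic(watersheds, stream_km, n=2))
```

    Stratification and two-stage selection would be layered on top of a selection routine like this, one stratum or stage at a time.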

  9. Watershed-based survey designs.

    PubMed

    Detenbeck, Naomi E; Cincotta, Dan; Denver, Judith M; Greenlee, Susan K; Olsen, Anthony R; Pitchford, Ann M

    2005-04-01

    Watershed-based sampling design and assessment tools help serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional conditions to meet Section 305(b), identification of impaired water bodies or watersheds to meet Section 303(d), and development of empirical relationships between causes or sources of impairment and biological responses. Creation of GIS databases for hydrography, hydrologically corrected digital elevation models, and hydrologic derivatives such as watershed boundaries and upstream-downstream topology of subcatchments would provide a consistent seamless nationwide framework for these designs. The elements of a watershed-based sample framework can be represented either as a continuous infinite set defined by points along a linear stream network, or as a discrete set of watershed polygons. Watershed-based designs can be developed with existing probabilistic survey methods, including the use of unequal probability weighting, stratification, and two-stage frames for sampling. Case studies for monitoring of Atlantic Coastal Plain streams, West Virginia wadeable streams, and coastal Oregon streams illustrate three different approaches for selecting sites for watershed-based survey designs.

  10. Examining Stress in Graduate Assistants: Combining Qualitative and Quantitative Survey Methods

    ERIC Educational Resources Information Center

    Mazzola, Joseph J.; Walker, Erin J.; Shockley, Kristen M.; Spector, Paul E.

    2011-01-01

    The aim of this study was to employ qualitative and quantitative survey methods in a concurrent mixed model design to assess stressors and strains in graduate assistants. The stressors most frequently reported qualitatively were work overload, interpersonal conflict, and organizational constraints; the most frequently reported psychological…

  11. Quantitative three-dimensional low-speed wake surveys

    NASA Technical Reports Server (NTRS)

    Brune, G. W.

    1992-01-01

    Theoretical and practical aspects of conducting three-dimensional wake measurements in large wind tunnels are reviewed, with emphasis on applications in low-speed aerodynamics. Such quantitative wake surveys not only furnish separate values for the components of drag, such as profile drag and induced drag, but also measure lift without the use of a balance. In addition to global data, details of the wake flowfield as well as spanwise distributions of lift and drag are obtained. The paper demonstrates the value of this measurement technique using data from wake measurements conducted by Boeing on a variety of low-speed configurations, including the complex high-lift system of a transport aircraft.
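
    For context, and not taken from the abstract itself: the standard wake-integral relations behind such surveys express profile drag as the streamwise momentum deficit across a downstream survey plane, and induced drag as the kinetic energy of the crossflow, e.g. in incompressible form

```latex
% Profile drag from the streamwise momentum deficit in the wake plane
D_p = \iint_{\text{wake}} \rho\, u \,(U_\infty - u)\, \mathrm{d}S
% Induced drag from the crossflow kinetic energy (v, w: crossflow components)
D_i = \frac{\rho}{2} \iint_{\text{wake}} \left( v^2 + w^2 \right) \mathrm{d}S
```

    where \(U_\infty\) is the freestream speed and \(u\) the local streamwise velocity. Practical formulations (Betz, Maskell) add pressure-term corrections for measurements taken in the near wake.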

  12. Quantitative optical techniques for dense sprays investigation: A survey

    NASA Astrophysics Data System (ADS)

    Coghe, A.; Cossali, G. E.

    2012-01-01

    The experimental study of dense sprays by optical techniques poses many challenges, and no method has proven completely reliable when accurate quantitative data are required, for example to validate breakup models and CFD simulations. The present survey offers a critical analysis of optical techniques capable of providing quantitative and reliable data in dense sprays, and points out the conditions necessary to obtain such measurements safely. A single parameter, the optical depth, is proposed to quantify the concept of a dense spray and to indicate when multiple scattering becomes predominant and could make experimental results questionable. The available optical techniques are divided into two categories: "classical" ones, like PDA, LDV, and PIV, that work well in dilute sprays but show many limitations in dense sprays, and "emerging" ones more suitable for dense sprays. Among the latter, those considered most promising are discussed in detail. A number of significant applications are also presented and discussed to better clarify the nature of this complex problem and the feasibility of the newly proposed approaches.

  13. Accrual Patterns for Clinical Studies Involving Quantitative Imaging: Results of an NCI Quantitative Imaging Network (QIN) Survey

    PubMed Central

    Kurland, Brenda F.; Aggarwal, Sameer; Yankeelov, Thomas E.; Gerstner, Elizabeth R.; Mountz, James M.; Linden, Hannah M.; Jones, Ella F.; Bodeker, Kellie L.; Buatti, John M.

    2017-01-01

    Patient accrual is essential for the success of oncology clinical trials. Recruitment for trials involving the development of quantitative imaging biomarkers may face different challenges than treatment trials. This study surveyed investigators and study personnel for evaluating accrual performance and perceived barriers to accrual and for soliciting solutions to these accrual challenges that are specific to quantitative imaging-based trials. Responses for 25 prospective studies were received from 12 sites. The median percent annual accrual attained was 94.5% (range, 3%–350%). The most commonly selected barrier to recruitment (n = 11/25, 44%) was that “patients decline participation,” followed by “too few eligible patients” (n = 10/25, 40%). In a forced choice for the single greatest recruitment challenge, “too few eligible patients” was the most common response (n = 8/25, 32%). Quantitative analysis and qualitative responses suggested that interactions among institutional, physician, and patient factors contributed to accrual success and challenges. Multidisciplinary collaboration in trial design and execution is essential to accrual success, with attention paid to ensuring and communicating potential trial benefits to enrolled and future patients. PMID:28127586

  14. Approximations for Quantitative Feedback Theory Designs

    NASA Technical Reports Server (NTRS)

    Henderson, D. K.; Hess, R. A.

    1997-01-01

    The computational requirements for obtaining the results summarized in the preceding section were very modest and were easily accomplished using computer-aided control system design software. Of special significance is the ability of the PDT to indicate a loop closure sequence for MIMO QFT designs that employ sequential loop closure. Although discussed as part of a 2 x 2 design, the PDT is obviously applicable to designs with a greater number of inputs and system responses.

  15. WATERSHED-BASED SURVEY DESIGNS

    EPA Science Inventory

    Watershed-based sampling design and assessment tools help serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional conditions to meet Section 305(b), identification of impaired water bodies or watersheds to meet Sectio...

  16. Large Synoptic Survey Telescope mount final design

    NASA Astrophysics Data System (ADS)

    Callahan, Shawn; Gressler, William; Thomas, Sandrine J.; Gessner, Chuck; Warner, Mike; Barr, Jeff; Lotz, Paul J.; Schumacher, German; Wiecha, Oliver; Angeli, George; Andrew, John; Claver, Chuck; Schoening, Bill; Sebag, Jacques; Krabbendam, Victor; Neill, Doug; Hileman, Ed; Muller, Gary; Araujo, Constanza; Orden Martinez, Alfredo; Perezagua Aguado, Manuel; García-Marchena, Luis; Ruiz de Argandoña, Ismael; Romero, Francisco M.; Rodríguez, Ricardo; Carlos González, José; Venturini, Marco

    2016-08-01

    This paper describes the status and details of the Large Synoptic Survey Telescope mount assembly (TMA). On June 9, 2014, the contract for the design and build of the TMA was awarded to GHESA Ingeniería y Tecnología, S.A. and Asturfeito, S.A. The design successfully passed the preliminary design review on October 2, 2015, and the final design review on January 29, 2016. This paper describes the detailed design by subsystem, analytical model results, preparations being taken to complete the fabrication, and the transportation and installation plans to install the mount on Cerro Pachón in Chile. This large project is the culmination of work by many people, and the authors would like to thank everyone who has contributed to its success.

  17. Ambulance Design Survey 2011: A Summary Report.

    PubMed

    Lee, Y Tina; Kibira, Deogratias; Feeney, Allison Barnard; Marshall, Jennifer

    2013-01-01

    Current ambulance designs are ergonomically inefficient and often unsafe for practical treatment response to medical emergencies. Thus, the patient compartment of a moving ambulance is a hazardous working environment. As a consequence, emergency medical services (EMS) workers suffer fatalities and injuries that far exceed those of the average workplace in the United States. To reduce injury and mortality rates in ambulances, the Department of Homeland Security Science and Technology Directorate has teamed with the National Institute of Standards and Technology, the National Institute for Occupational Safety and Health, and BMT Designers & Planners in a joint project to produce science-based ambulance patient compartment design standards. This project will develop new crash-safety design standards and improved user-design interface guidance for patient compartments that are safer for EMS personnel and patients, and facilitate improved patient care. The project team has been working with practitioners, EMS workers' organizations, and manufacturers to solicit needs and requirements to address related issues. This paper presents an analysis of practitioners' concerns, needs, and requirements for improved designs elicited through the web-based survey of ambulance design conducted by the National Institute of Standards and Technology. This paper also introduces the survey, analyzes the survey results, and discusses recommendations for future ambulance patient compartment designs.

  18. Ambulance Design Survey 2011: A Summary Report

    PubMed Central

    Lee, Y Tina; Kibira, Deogratias; Feeney, Allison Barnard; Marshall, Jennifer

    2013-01-01

    Current ambulance designs are ergonomically inefficient and often unsafe for practical treatment response to medical emergencies. Thus, the patient compartment of a moving ambulance is a hazardous working environment. As a consequence, emergency medical services (EMS) workers suffer fatalities and injuries that far exceed those of the average workplace in the United States. To reduce injury and mortality rates in ambulances, the Department of Homeland Security Science and Technology Directorate has teamed with the National Institute of Standards and Technology, the National Institute for Occupational Safety and Health, and BMT Designers & Planners in a joint project to produce science-based ambulance patient compartment design standards. This project will develop new crash-safety design standards and improved user-design interface guidance for patient compartments that are safer for EMS personnel and patients, and facilitate improved patient care. The project team has been working with practitioners, EMS workers’ organizations, and manufacturers to solicit needs and requirements to address related issues. This paper presents an analysis of practitioners’ concerns, needs, and requirements for improved designs elicited through the web-based survey of ambulance design conducted by the National Institute of Standards and Technology. This paper also introduces the survey, analyzes the survey results, and discusses recommendations for future ambulance patient compartment designs. PMID:26401439

  19. Design-based and model-based inference in surveys of freshwater mollusks

    USGS Publications Warehouse

    Dorazio, R.M.

    1999-01-01

    Well-known concepts in statistical inference and sampling theory are used to develop recommendations for planning and analyzing the results of quantitative surveys of freshwater mollusks. Two methods of inference commonly used in survey sampling (design-based and model-based) are described and illustrated using examples relevant in surveys of freshwater mollusks. The particular objectives of a survey and the type of information observed in each unit of sampling can be used to help select the sampling design and the method of inference. For example, the mean density of a sparsely distributed population of mollusks can be estimated with higher precision by using model-based inference or by using design-based inference with adaptive cluster sampling than by using design-based inference with conventional sampling. More experience with quantitative surveys of natural assemblages of freshwater mollusks is needed to determine the actual benefits of different sampling designs and inferential procedures.
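
    As a toy illustration of design-based inference (not code from the paper; the quadrat counts and inclusion probabilities below are invented), the classical Horvitz-Thompson estimator weights each sampled value by the inverse of its inclusion probability, so units sampled at lower rates count for more in the estimated total:

```python
def horvitz_thompson_total(values, inclusion_probs):
    """Design-based (Horvitz-Thompson) estimate of a population total:
    each observed value is divided by its probability of being sampled."""
    if len(values) != len(inclusion_probs):
        raise ValueError("one inclusion probability per sampled value")
    return sum(y / p for y, p in zip(values, inclusion_probs))

# Invented example: mussel counts in 3 sampled quadrats, where the third
# quadrat came from a stratum sampled at a lower rate (pi = 0.25).
counts = [4, 0, 9]
pi = [0.5, 0.5, 0.25]
print(horvitz_thompson_total(counts, pi))  # 4/0.5 + 0/0.5 + 9/0.25 = 44.0
```

    Dividing such a total by the population size (in quadrats) gives the mean-density estimate the abstract discusses; model-based inference would instead posit a distribution for the counts.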

  20. Survey of quantitative antimicrobial consumption in two different pig finishing systems.

    PubMed

    Moreno, M A

    2012-09-29

    The primary objectives of this study were to: (a) collect on-farm antimicrobial use (AMU) data in fattener pigs employing two questionnaire-based surveys; (b) assess different quantitative measures for quantifying AMU in fattener pigs; (c) compare AMU in fattener pigs between two different management systems producing finishers: farrow-to-finish (FtF) farms versus finisher farms. Two questionnaires were designed, both containing five groups of questions focused on the responder, the farm, and AMU (e.g., in-feed, in-drinking-water, and parenteral); both surveys were carried out by means of personal face-to-face interviews. Both surveys started with a sample size of 108 potentially eligible farms per survey; nevertheless, 67 finisher farms and 49 FtF farms were finally recruited. Overall percentages of animals exposed to antimicrobials (AMs) were high (90 per cent in finisher farms and 54 per cent in FtF farms); colistin (61 per cent and 33 per cent) and doxycycline (62 per cent and 23 per cent) were the most common AMs, followed by amoxicillin (51 per cent and 19 per cent) and lincomycin (49 per cent), respectively. Questionnaire-based surveys using face-to-face interviews are useful for capturing information regarding AMU at the farm level. Farm-level data per administration route can be used for comparative AMU analysis between farms. Nevertheless, for the analysis of putative relationships between AMU and AM resistance, measures based on exposed animals or exposure events are needed.

  1. National Lake Assessment 2012 Potential Survey Design

    EPA Science Inventory

    In 2012 the Office of Water in collaboration with states and tribal nations will conduct the second National Lake Assessment. The purpose of this presentation is to present potential survey design approaches for this national assessment. Currently discussions are underway to de...

  2. Spatially balanced survey designs for natural resources

    EPA Science Inventory

    Ecological resource monitoring programs typically require the use of a probability survey design to select locations or entities to be physically sampled in the field. The ecological resource of interest, the target population, occurs over a spatial domain and the sample selecte...

  3. Survey: Computer Usage in Design Courses.

    ERIC Educational Resources Information Center

    Henley, Ernest J.

    1983-01-01

    Presents results of a survey of chemical engineering departments regarding computer usage in senior design courses. Results are categorized according to: computer usage (use of process simulators, student-written programs, faculty-written or "canned" programs; costs (hard and soft money); and available software. Programs offered are…

  4. LIDAR Surveys for Road Design in Thailand

    DTIC Science & Technology

    2004-11-01

    25th ACRS 2004, Chiang Mai, Thailand. Session: New Generation of Sensors and Applications, A-4.6, LIDAR SURVEYS FOR... Keywords: LiDAR, DEM, road design, pilot project, Thailand, NBIA. ABSTRACT: Concerned with environmental and drainage problems associated with road...as hilly, unstable terrain. LiDAR technology is of great interest to DOH as its use can save enormous amounts of time and money by providing

  5. Statistical design of quantitative mass spectrometry-based proteomic experiments.

    PubMed

    Oberg, Ann L; Vitek, Olga

    2009-05-01

    We review the fundamental principles of statistical experimental design, and their application to quantitative mass spectrometry-based proteomics. We focus on class comparison using Analysis of Variance (ANOVA), and discuss how randomization, replication and blocking help avoid systematic biases due to the experimental procedure, and help optimize our ability to detect true quantitative changes between groups. We also discuss the issues of pooling multiple biological specimens for a single mass analysis, and calculation of the number of replicates in a future study. When applicable, we emphasize the parallels between designing quantitative proteomic experiments and experiments with gene expression microarrays, and give examples from that area of research. We illustrate the discussion using theoretical considerations, and using real-data examples of profiling of disease.
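
    The randomization and blocking principles described above can be sketched in a few lines. This is an illustrative example only (the batch and group names are invented, not from the review): a randomized complete block design assigns every treatment once within each block, in random order, so block effects such as batch or instrument drift do not confound treatment comparisons.

```python
import random

def randomized_block_assignment(samples_by_block, treatments, seed=0):
    """Randomized complete block design: each treatment appears exactly once
    per block, in a random order chosen independently for each block."""
    rng = random.Random(seed)
    plan = {}
    for block, samples in samples_by_block.items():
        if len(samples) != len(treatments):
            raise ValueError("a complete block needs one sample per treatment")
        order = list(treatments)
        rng.shuffle(order)  # randomize treatment order within the block
        plan[block] = dict(zip(samples, order))
    return plan

# Invented example: two mass-spectrometry batches, three biological groups.
blocks = {"batch1": ["s1", "s2", "s3"], "batch2": ["s4", "s5", "s6"]}
groups = ["control", "diseaseA", "diseaseB"]
print(randomized_block_assignment(blocks, groups))
```

    With this layout, a batch-to-batch shift in measured intensity affects all three groups equally and cancels out of the between-group comparison.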

  6. The Dark Energy Survey CCD imager design

    SciTech Connect

    Cease, H.; DePoy, D.; Diehl, H.T.; Estrada, J.; Flaugher, B.; Guarino, V.; Kuk, K.; Kuhlmann, S.; Schultz, K.; Schmitt, R.L.; Stefanik, A.; /Fermilab /Ohio State U. /Argonne

    2008-06-01

    The Dark Energy Survey is planning to use a 3 sq. deg. camera that houses a ~0.5 m diameter focal plane of 62 2k x 4k CCDs. The camera vessel, including the optical window cell, focal plate, focal plate mounts, cooling system, and thermal controls, is described. As part of the development of the mechanical and cooling design, a full-scale prototype camera vessel has been constructed and is now being used for multi-CCD readout tests. Results from this prototype camera are described.

  7. Implementing a Quantitative Analysis Design Tool for Future Generation Interfaces

    DTIC Science & Technology

    2012-03-01

    future MAC-enabled systems. A human-computer interaction (HCI) Index, originally applied to multi-function displays, was applied to the prototype Vigilant...Spirit interface. A modified version of the HCI Index was successfully applied to perform a quantitative analysis of the baseline VSCS interface and...two modified interface designs. The modified HCI Index incorporates the Hick-Hyman decision time, Fitts' law time, and the physical actions

  8. Young people, alcohol, and designer drinks: quantitative and qualitative study.

    PubMed Central

    Hughes, K.; MacKintosh, A. M.; Hastings, G.; Wheeler, C.; Watson, J.; Inglis, J.

    1997-01-01

    OBJECTIVE: To examine the appeal of "designer drinks" to young people. DESIGN: Qualitative and quantitative research comprising group discussions and questionnaire-led interviews with young people, accompanied by a self-completion questionnaire. SETTINGS: Argyll and Clyde Health Board area, west Scotland. SUBJECTS: Eight groups aged 12-17 years; 824 subjects aged 12-17 recruited by multistage cluster probability sample from the community health index. RESULTS: Young people were familiar with designer drinks, especially MD 20/20 and leading brands of strong white cider. Attitudes towards these drinks varied quite distinctly with age, clearly reflecting attitudes towards and motivations for drinking in general. The brand imagery of designer drinks, in contrast with that of more mainstream drinks, matched many 14 and 15 year olds' perceptions and expectations of drinking. Popularity of designer drinks peaked between the ages of 13 and 16, while more conventional drinks showed a consistent increase in popularity with age. Consumption of designer drinks tended to be in less controlled circumstances and was associated with heavier alcohol intake and greater drunkenness. CONCLUSIONS: Designer drinks are a cause for concern. They appeal to young people, often more so than conventional drinks, and are particularly attractive to 14-16 year olds. Consumption of designer drinks is also associated with drinking in less controlled environments, heavier drinking, and greater drunkenness. There is a need for policy debate to assess the desirability of these drinks and the extent to which further controls on their marketing are required. PMID:9040387

  9. Survey of rural, private wells. Statistical design

    USGS Publications Warehouse

    Mehnert, Edward; Schock, Susan C.; ,

    1991-01-01

    Half of Illinois' 38 million acres were planted in corn and soybeans in 1988. On the 19 million acres planted in corn and soybeans, approximately 1 million tons of nitrogen fertilizer and 50 million pounds of pesticides were applied. Because groundwater is the water supply for over 90 percent of rural Illinois, the occurrence of agricultural chemicals in groundwater in Illinois is of interest to the agricultural community, the public, and regulatory agencies. The occurrence of agricultural chemicals in groundwater is well documented. However, the extent of this contamination still needs to be defined. This can be done by randomly sampling wells across a geographic area. Key elements of a random, water-well sampling program for regional groundwater quality include the overall statistical design of the program, definition of the sample population, selection of wells to be sampled, and analysis of survey results. These elements must be consistent with the purpose for conducting the program; otherwise, the program will not provide the desired information. The need to carefully design and conduct a sampling program becomes readily apparent when one considers the high cost of collecting and analyzing a sample. For a random sampling program conducted in Illinois, the key elements, as well as the limitations imposed by available information, are described.

  10. 76 FR 9637 - Proposed Information Collection (Veteran Suicide Prevention Online Quantitative Surveys) Activity...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-18

    ... AFFAIRS Proposed Information Collection (Veteran Suicide Prevention Online Quantitative Surveys) Activity... outreach efforts on the prevention of suicide among Veterans and their families. DATES: Written comments... Type of Review: New collection. Abstract: VA's top priority is the prevention of Veteran suicide....

  11. Quantitative survey on the shape of the back of men's head as viewed from the side.

    PubMed

    Tamir, Abraham

    2013-05-01

    This article quantitatively classifies the shape of the back of the male head, viewed from the side, into 4 categories, which are demonstrated in some of the figures in this article. For self-evident reasons, the shapes were blurred. The survey is based on the analysis of 2220 shapes obtained by photographing mainly bald men and by finding pictures on the Internet. To the best of the author's knowledge, this quantitative approach has never been implemented before. The results obtained are as follows: the percentage of 376 "flat heads" is 17%; the percentage of 755 "little round heads," 34%; the percentage of 1017 "round heads," 45.8%; and the percentage of 72 "very round heads," 3.2%. This quantitative survey is an additional step in quantitatively analyzing the shape of the parts of the face; in articles previously published or to be published in this journal, the shapes of the nose, ear conch, and human eye were analyzed quantitatively, as were the shapes of the toes. Finally, it should be noted that, for obvious reasons, the survey is based on men's heads, most of them bald.

  12. Design Effects and the Analysis of Survey Data.

    ERIC Educational Resources Information Center

    Folsom, Ralph E.; Williams, Rick L.

    The National Assessment of Educational Progress (NAEP), like most large national surveys, employs a complex stratified multistage unequal probability sample. The design provides a rigorous justification for extending survey results to the entire U.S. target population. Developments in the analysis of data from complex surveys which provide a…

  13. Design and Validation of the Quantum Mechanics Conceptual Survey

    ERIC Educational Resources Information Center

    McKagan, S. B.; Perkins, K. K.; Wieman, C. E.

    2010-01-01

    The Quantum Mechanics Conceptual Survey (QMCS) is a 12-question survey of students' conceptual understanding of quantum mechanics. It is intended to be used to measure the relative effectiveness of different instructional methods in modern physics courses. In this paper, we describe the design and validation of the survey, a process that included…

  14. Quantitative estimation of minimum offset for multichannel surface-wave survey with actively exciting source

    USGS Publications Warehouse

    Xu, Y.; Xia, J.; Miller, R.D.

    2006-01-01

    Multichannel analysis of surface waves is a developing method widely used in shallow subsurface investigations. The field procedures and related parameters are very important for successful applications. Among these parameters, the source-receiver offset range is seldom discussed in theory and is normally determined by empirical or semi-quantitative methods in current practice. This paper discusses the problem from a theoretical perspective. A formula for quantitatively evaluating the minimum offset for a layered homogeneous elastic model was developed. The analytical results based on simple models and experimental data demonstrate that the formula is correct for surface-wave surveys in near-surface applications. © 2005 Elsevier B.V. All rights reserved.

  15. Research on Basic Design Education: An International Survey

    ERIC Educational Resources Information Center

    Boucharenc, C. G.

    2006-01-01

    This paper reports on the results of a survey and qualitative analysis on the teaching of "Basic Design" in schools of design and architecture located in 22 countries. In the context of this research work, Basic Design means the teaching and learning of design fundamentals that may also be commonly referred to as the Principles of Two- and…

  16. Social norms of "good" design: Interdisciplinary perspectives from a survey of engineers and clinicians in bioengineering.

    PubMed

    Johnson, Angela N

    2016-08-01

    In bioengineering training for new researchers and engineers, a great deal of time is spent discussing what constitutes "good" design. Conceptualizations of good design, however, vary widely across interdisciplinary team members, with the potential either to foster innovation or to lead to unproductive conflict. To explore how groups central to bioengineering teams (physicians/clinicians and engineers/physicists) conceptualize good design, we asked 176 professionals in bioengineering to complete a comprehensive online survey including items designed to assess cognitive and moral foundations (validated MFQ30 tool) and custom items assessing perceptions of good design in three areas (good design characteristics, reputation of design approvers, and perceived design patient/consumer suitability). Of those who responded, 82 completed all quantitative survey sections and were included in this preliminary analysis. Correlations between response areas were examined to explore possible links between cognitive and moral biases and perspectives on good design. The survey results indicated that both groups were more conservative than average Americans based on previous reports, and clinicians scored higher on average in all MFQ30 domains. Numerous significant correlations with good design were observed among clinicians, while engineers/physicists most closely correlated good design with prescriber approval and the scientific/technical literature. This exploratory analysis demonstrates the potential utility of sociological frameworks for exploring relationships in design thinking and for stimulating conversation on team-based design thinking in bioengineering education and practice.

  17. Designing community surveys to provide a basis for noise policy

    NASA Technical Reports Server (NTRS)

    Fields, J. M.

    1980-01-01

    After examining reports from a large number of social surveys, two areas were identified where methodological improvements in the surveys would be especially useful for public policy. The two study areas are: the definition of noise indexes and the assessment of noise impact. Improvements in the designs of surveys are recommended which would increase the validity and reliability of the noise indexes. Changes in interview questions and sample designs are proposed which would enable surveys to provide measures of noise impact which are directly relevant for public policy.

  18. Designing occupancy studies: general advice and allocating survey effort

    USGS Publications Warehouse

    MacKenzie, D.I.; Royle, J. Andrew

    2005-01-01

    1. The fraction of sampling units in a landscape where a target species is present (occupancy) is an extensively used concept in ecology. Yet in many applications the species will not always be detected in a sampling unit even when present, resulting in biased estimates of occupancy. Given that sampling units are surveyed repeatedly within a relatively short timeframe, a number of similar methods have now been developed to provide unbiased occupancy estimates. However, practical guidance on the efficient design of occupancy studies has been lacking. 2. In this paper we comment on a number of general issues related to designing occupancy studies, including the need for clear objectives that are explicitly linked to science or management, selection of sampling units, timing of repeat surveys and allocation of survey effort. Advice on the number of repeat surveys per sampling unit is considered in terms of the variance of the occupancy estimator, for three possible study designs. 3. We recommend that sampling units should be surveyed a minimum of three times when detection probability is high (>0.5 per survey), unless a removal design is used. 4. We found that an optimal removal design will generally be the most efficient, but we suggest it may be less robust to assumption violations than a standard design. 5. Our results suggest that for a rare species it is more efficient to survey more sampling units less intensively, while for a common species fewer sampling units should be surveyed more intensively. 6. Synthesis and applications. Reliable inferences can only result from quality data. To make the best use of logistical resources, study objectives must be clearly defined; sampling units must be selected, and repeated surveys timed appropriately; and a sufficient number of repeated surveys must be conducted. Failure to do so may compromise the integrity of the study. The guidance given here on study design issues is particularly applicable to studies of species
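The minimum-of-three recommendation can be motivated by the cumulative probability of detecting a present species at least once over K visits, p* = 1 - (1 - p)^K; a minimal sketch, under the simplifying assumption of constant, independent per-visit detection:

```python
# Sketch of the reasoning behind the repeat-survey recommendation: if a
# species occupies a unit and is detected with probability p on each of K
# independent visits, the chance of detecting it at least once is
#   p* = 1 - (1 - p)**K
def cumulative_detection(p: float, k: int) -> float:
    return 1.0 - (1.0 - p) ** k

for p in (0.3, 0.5, 0.7):
    probs = [round(cumulative_detection(p, k), 3) for k in (1, 2, 3, 5)]
    print(f"p={p}: K=1,2,3,5 -> {probs}")
# With p > 0.5, three visits already push p* above 0.875; with p = 0.3,
# even five visits leave p* below 0.84, so more effort per unit is needed.
```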

  19. Variance estimation for systematic designs in spatial surveys.

    PubMed

    Fewster, R M

    2011-12-01

    In spatial surveys for estimating the density of objects in a survey region, systematic designs will generally yield lower variance than random designs. However, estimating the systematic variance is well known to be a difficult problem. Existing methods tend to overestimate the variance, so although the variance is genuinely reduced, it is over-reported, and the gain from the more efficient design is lost. The current approaches to estimating a systematic variance for spatial surveys are to approximate the systematic design by a random design, or approximate it by a stratified design. Previous work has shown that approximation by a random design can perform very poorly, while approximation by a stratified design is an improvement but can still be severely biased in some situations. We develop a new estimator based on modeling the encounter process over space. The new "striplet" estimator has negligible bias and excellent precision in a wide range of simulation scenarios, including strip-sampling, distance-sampling, and quadrat-sampling surveys, and including populations that are highly trended or have strong aggregation of objects. We apply the new estimator to survey data for the spotted hyena (Crocuta crocuta) in the Serengeti National Park, Tanzania, and find that the reported coefficient of variation for estimated density is 20% using approximation by a random design, 17% using approximation by a stratified design, and 11% using the new striplet estimator. This large reduction in reported variance is verified by simulation.

  20. Survey of Fashion Design Employers. Volume IX, No. 16.

    ERIC Educational Resources Information Center

    Aurand, Cecilia; Lucas, John A.

    A survey was conducted to determine the availability of internship opportunities for fashion design students at Harper College and to measure the value of Harper design graduates to their employers. A sample of 279 manufacturers, contacts, and retail stores employing fashion designers were identified in the Chicago metropolitan area and after two…

  1. Sample design for the residential energy consumption survey

    SciTech Connect

    Not Available

    1994-08-01

    The purpose of this report is to provide detailed information about the multistage area-probability sample design used for the Residential Energy Consumption Survey (RECS). It is intended as a technical report, for use by statisticians, to better understand the theory and procedures followed in the creation of the RECS sample frame. For a more cursory overview of the RECS sample design, refer to the appendix entitled "How the Survey was Conducted," which is included in the statistical reports produced for each RECS survey year.

  2. The Dark Energy Survey instrument design

    SciTech Connect

    Flaugher, B.; /Fermilab

    2006-05-01

    We describe a new project, the Dark Energy Survey (DES), aimed at measuring the dark energy equation of state parameter, w, to a statistical precision of ≈5%, with four complementary techniques. The survey will use a new 3 sq. deg. mosaic camera (DECam) mounted at the prime focus of the Blanco 4m telescope at the Cerro Tololo Inter-American Observatory (CTIO). DECam includes a large mosaic camera, a five-element optical corrector, four filters (g,r,i,z), and the associated infrastructure for operation in the prime focus cage. The focal plane consists of 62 2K x 4K CCD modules (0.27''/pixel) arranged in a hexagon inscribed within the 2.2 deg. diameter field of view. We plan to use the 250 micron thick fully-depleted CCDs that have been developed at the Lawrence Berkeley National Laboratory (LBNL). At Fermilab, we will establish a packaging factory to produce four-side buttable modules for the LBNL devices, as well as to test and grade the CCDs. R&D is underway and delivery of DECam to CTIO is scheduled for 2009.

  3. Surveying clinicians by web: current issues in design and administration.

    PubMed

    Dykema, Jennifer; Jones, Nathan R; Piché, Tara; Stevenson, John

    2013-09-01

    The versatility, speed, and reduced costs with which web surveys can be conducted with clinicians are often offset by low response rates. Drawing on best practices and general recommendations in the literature, we provide an evidence-based overview of methods for conducting online surveys with providers. We highlight important advantages and disadvantages of conducting provider surveys online and include a review of differences in response rates between web and mail surveys of clinicians. When administered online, design-based features affect rates of survey participation and data quality. We examine features likely to have an impact including sample frames, incentives, contacts (type, timing, and content), mixed-mode approaches, and questionnaire length. We make several recommendations regarding optimal web-based designs, but more empirical research is needed, particularly with regard to identifying which combinations of incentive and contact approaches yield the highest response rates and are the most cost-effective.

  4. Survey instrument for the universal design of consumer products.

    PubMed

    Beecher, Valerie; Paquet, Victor

    2005-05-01

    Universal design is a process intended to include all user groups in product or environmental design. The objective of this study was to develop a usability testing survey instrument to inform how well consumer products complied with established principles of universal design. Thirty-six adults, aging adults and adult wheelchair users performed standardized tasks with pens, food storage containers, pliers and calculators, and for each task responded to a preliminary set of survey items and rated task difficulty. Factor analysis of the survey responses produced an eleven-factor solution that accounted for 67% of the variance in scores and corresponded fairly closely to the principles of universal design. Analysis of scale scores developed from each factor showed that some of the scales were sensitive to product feature and user group differences, and were negatively associated with perceived task difficulty. Such a tool may aid designers who intend their products for user groups of diverse abilities and preferences.
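A minimal sketch of the kind of exploratory factor extraction described above, run on synthetic ratings rather than the study's data (the item counts, loadings, and noise level are invented):

```python
# Sketch of exploratory factor extraction on synthetic survey ratings
# (not the study's data): eigenvalues of the item correlation matrix
# show how much variance each candidate factor accounts for.
import numpy as np

rng = np.random.default_rng(0)
n_respondents, n_items, n_factors = 200, 12, 3

# Synthetic survey: 12 items driven by 3 latent usability factors plus noise.
latent = rng.normal(size=(n_respondents, n_factors))
loadings = rng.normal(size=(n_factors, n_items))
ratings = latent @ loadings + 0.5 * rng.normal(size=(n_respondents, n_items))

corr = np.corrcoef(ratings, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]   # largest first
explained = eigvals / eigvals.sum()
print(np.round(explained[:3], 2), "cumulative:", round(explained[:3].sum(), 2))
```

With strong latent structure, the leading eigenvalues dominate; in the study, an eleven-factor solution accounted for 67% of the variance.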

  5. Optical design for a survey x-ray telescope

    NASA Astrophysics Data System (ADS)

    Saha, Timo T.; Zhang, William W.; McClelland, Ryan S.

    2014-07-01

    Optical design trades are underway at the Goddard Space Flight Center to define a telescope for an x-ray survey mission. Top-level science objectives of the mission include the study of x-ray transients, surveying and long-term monitoring of compact objects in nearby galaxies, as well as both deep and wide-field x-ray surveys. In this paper we consider Wolter, Wolter-Schwarzschild, and modified Wolter-Schwarzschild telescope designs as basic building blocks for the tightly nested survey telescope. Design principles and dominating aberrations of individual telescopes and nested telescopes are discussed and we compare the off-axis optical performance at 1.0 keV and 4.0 keV across a 1.0 degree full field-of-view.

  6. Optical Design for a Survey X-Ray Telescope

    NASA Technical Reports Server (NTRS)

    Saha, Timo T.; Zhang, William W.; McClelland, Ryan S.

    2014-01-01

    Optical design trades are underway at the Goddard Space Flight Center to define a telescope for an x-ray survey mission. Top-level science objectives of the mission include the study of x-ray transients, surveying and long-term monitoring of compact objects in nearby galaxies, as well as both deep and wide-field x-ray surveys. In this paper we consider Wolter, Wolter-Schwarzschild, and modified Wolter-Schwarzschild telescope designs as basic building blocks for the tightly nested survey telescope. Design principles and dominating aberrations of individual telescopes and nested telescopes are discussed and we compare the off-axis optical performance at 1.0 keV and 4.0 keV across a 1.0-degree full field-of-view.

  7. Survey of quantitative data on the solar energy and its spectra distribution

    NASA Technical Reports Server (NTRS)

    Thekaekara, M. P.

    1976-01-01

    This paper presents a survey of available quantitative data on the total and spectral solar irradiance at ground level and outside the atmosphere. Measurements from research aircraft have resulted in the currently accepted NASA/ASTM standards of the solar constant and zero air mass solar spectral irradiance. The intrinsic variability of solar energy output and programs currently under way for more precise measurements from spacecraft are discussed. Instrumentation for solar measurements and their reference radiation scales are examined. Insolation data available from the records of weather stations are reviewed for their applicability to solar energy conversion. Two alternate methods of solarimetry are briefly discussed.

  8. Joint analyses of open comments and quantitative data: Added value in a job satisfaction survey of hospital professionals

    PubMed Central

    Gilles, Ingrid; Mayer, Mauro; Courvoisier, Nelly; Peytremann-Bridevaux, Isabelle

    2017-01-01

    Objective To obtain a comprehensive understanding of the job opinions of hospital professionals by conducting qualitative analyses of the open comments included in a job satisfaction survey and combining these results with the quantitative results. Design A cross-sectional survey targeting all Lausanne University Hospital professionals was performed in the fall of 2013. Material and methods The survey considered ten job satisfaction dimensions (e.g. self-fulfilment, workload, management, work-related burnout, organisational commitment, intent to stay) and included an open comment section. Computer-assisted qualitative analyses were conducted on these comments. Satisfaction rates on the included dimensions and professional groups were entered as predictive variables in the qualitative analyses. Participants Of 10 838 hospital professionals, 4978 participated in the survey and 1067 provided open comments. Data from 1045 respondents with usable comments constituted the analytic sample (133 physicians, 393 nurses, 135 laboratory technicians, 247 administrative staff, including researchers, 67 logistic staff, 44 psycho-social workers, and 26 unspecified). Results Almost a third of the comments addressed scheduling issues, mostly related to problems and exhaustion linked to shifts, work-life balance, and difficulties with colleagues’ absences and the consequences for quality of care and patient safety. The other two-thirds related to classic themes included in job satisfaction surveys. Although some comments were provided equally by all professional groups, others were group specific: work and hierarchy pressures for physicians, healthcare quality and patient safety for nurses, skill recognition for administrative staff. Overall, respondents’ comments were consistent with their job satisfaction ratings. Conclusion Open comment analysis provides a comprehensive understanding of hospital professionals’ job experiences, allowing better consideration of quality

  9. Mobile Libraries, Design and Construction: A Survey of Current Practice.

    ERIC Educational Resources Information Center

    Eastwood, C. R.; And Others

    Forty-one county libraries in Wales, Scotland and England were surveyed in 1970 in an attempt to establish current practice in the design and construction of mobile libraries. This report is the first step of the Branch and Mobile Libraries Group of the Library Association to establish standards for mobile library design and construction. The…

  10. Magnetic resonance elastography hardware design: a survey.

    PubMed

    Tse, Z T H; Janssen, H; Hamed, A; Ristic, M; Young, I; Lamperth, M

    2009-05-01

    Magnetic resonance elastography (MRE) is an emerging technique capable of measuring the shear modulus of tissue. A suspected tumour can be identified by comparing its properties with those of tissues surrounding it; this can be achieved even in deep-lying areas as long as mechanical excitation is possible. This would allow non-invasive methods for cancer-related diagnosis in areas not accessible with conventional palpation. An actuating mechanism is required to generate the necessary tissue displacements directly on the patient in the scanner and three different approaches, in terms of actuator action and position, exist to derive stiffness measurements. However, the magnetic resonance (MR) environment places considerable constraints on the design of such devices, such as the possibility of mutual interference between electrical components, the scanner field, and radio frequency pulses, and the physical space restrictions of the scanner bore. This paper presents a review of the current solutions that have been developed for MRE devices giving particular consideration to the design criteria including the required vibration frequency and amplitude in different applications, the issue of MR compatibility, actuation principles, design complexity, and scanner synchronization issues. The future challenges in this field are also described.

  11. Quantitative research.

    PubMed

    Watson, Roger

    2015-04-01

    This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.

  12. European cardiac resynchronization therapy survey II: rationale and design.

    PubMed

    Dickstein, Kenneth; Normand, Camilla; Anker, Stefan D; Auricchio, Angelo; Blomström-Lundqvist, Carina; Bogale, Nigussie; Cleland, John; Filippatos, Gerasimos; Gasparini, Maurizio; Gitt, Anselm; Hindricks, Gerhard; Kuck, Karl-Heinz; Ponikowski, Piotr; Stellbrink, Christoph; Ruschitzka, Frank; Linde, Cecilia

    2015-01-01

    The Cardiac Resynchronization Therapy (CRT) Survey II is a 6-month snapshot survey initiated by two ESC Associations, the European Heart Rhythm Association and the Heart Failure Association, which is designed to describe clinical practice regarding implantation of CRT devices in a broad sample of hospitals in 47 ESC member countries. The large volume of clinical and demographic data collected should reflect current patient selection, implantation, and follow-up practice and provide information relevant for assessing healthcare resource utilization in connection with CRT. The findings of this survey should permit representative benchmarking both nationally and internationally across Europe.

  13. A Quantitative Approach to the Design of School Bus Routes.

    ERIC Educational Resources Information Center

    Tracz, George S.

    A number of factors--including the reorganization of school administrative structures, the availability of new technology, increased competition among groups for limited resources, and changing patterns of communication--suggest an increased need for quantitative analysis in the school district decision-making process. One area of school…

  14. Quantitative label-free phosphoproteomics strategy for multifaceted experimental designs.

    PubMed

    Soderblom, Erik J; Philipp, Melanie; Thompson, J Will; Caron, Marc G; Moseley, M Arthur

    2011-05-15

    Protein phosphorylation is a critical regulator of signaling in nearly all eukaryotic cellular pathways and dysregulated phosphorylation has been implicated in an array of diseases. The majority of MS-based quantitative phosphorylation studies are currently performed from transformed cell lines because of the ability to generate large amounts of starting material with incorporated isotopically labeled amino acids during cell culture. Here we describe a general label-free quantitative phosphoproteomic strategy capable of directly analyzing relatively small amounts of virtually any biological matrix, including human tissue and biological fluids. The strategy utilizes a TiO2 enrichment protocol in which the selectivity and recovery of phosphopeptides were optimized by assessing a twenty-point condition matrix of binding modifier concentrations and peptide-to-resin capacity ratios. The quantitative reproducibility of the TiO2 enrichment was determined to be 16% RSD through replicate enrichments of a wild-type Danio rerio (zebrafish) lysate. Measured phosphopeptide fold-changes from alpha-casein spiked into wild-type zebrafish lysate backgrounds were within 5% of the theoretical value. Application to a morpholino induced knock-down of G protein-coupled receptor kinase 5 (GRK5) in zebrafish embryos resulted in the quantitation of 719 phosphorylated peptides corresponding to 449 phosphorylated proteins from 200 μg of zebrafish embryo lysates.
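The two quality metrics quoted above (percent relative standard deviation across replicate enrichments, and fold-change error against a theoretical spike-in ratio) can be sketched as follows; all intensity values here are hypothetical, not the study's:

```python
# Illustrative computation of the reproducibility and accuracy metrics
# used above (all numbers are hypothetical, not the study's data).
import statistics

def percent_rsd(values):
    """Percent relative standard deviation: 100 * sd / mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Replicate phosphopeptide intensities from repeated TiO2 enrichments.
replicate_intensities = [1.00e6, 1.18e6, 0.95e6, 1.12e6]   # hypothetical
print(round(percent_rsd(replicate_intensities), 1))

# Fold-change accuracy: measured vs. theoretical spike-in ratio.
theoretical, measured = 2.0, 2.08                           # hypothetical
error_pct = 100.0 * abs(measured - theoretical) / theoretical
print(round(error_pct, 1))  # 4.0 -> within 5% of the theoretical value
```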

  15. A survey of tools for the analysis of quantitative PCR (qPCR) data.

    PubMed

    Pabinger, Stephan; Rödiger, Stefan; Kriegner, Albert; Vierlinger, Klemens; Weinhäusel, Andreas

    2014-09-01

    Real-time quantitative polymerase chain reaction (qPCR) is a standard technique in most laboratories used for various applications in basic research. Analysis of qPCR data is a crucial part of the entire experiment, which has led to the development of a plethora of methods. The released tools either cover specific parts of the workflow or provide complete analysis solutions. Here, we surveyed 27 open-access software packages and tools for the analysis of qPCR data. The survey includes 8 Microsoft Windows, 5 web-based, 9 R-based and 5 tools from other platforms. Reviewed packages and tools support the analysis of different qPCR applications, such as RNA quantification, DNA methylation, genotyping, identification of copy number variations, and digital PCR. We report an overview of the functionality, features and specific requirements of the individual software tools, such as data exchange formats, availability of a graphical user interface, included procedures for graphical data presentation, and offered statistical methods. In addition, we provide an overview of quantification strategies and report various applications of qPCR. Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adopt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.
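Among the quantification strategies such tools implement, the comparative Ct (delta-delta-Ct) method is one of the most common; a minimal sketch, assuming ~100% amplification efficiency (the Ct values below are hypothetical):

```python
# Sketch of one common qPCR quantification strategy the surveyed tools
# implement: the comparative Ct (delta-delta-Ct) method. Assumes ~100%
# amplification efficiency, so fold change = 2 ** (-ddCt).
def delta_delta_ct(ct_target_treated, ct_ref_treated,
                   ct_target_control, ct_ref_control):
    d_ct_treated = ct_target_treated - ct_ref_treated    # normalize to reference gene
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control                  # treated relative to control
    return 2.0 ** (-dd_ct)

# Hypothetical Ct values: target gene vs. a reference gene in each sample.
fold_change = delta_delta_ct(ct_target_treated=22.0, ct_ref_treated=18.0,
                             ct_target_control=24.0, ct_ref_control=18.0)
print(fold_change)  # 4.0 -> target is ~4-fold up-regulated in the treated sample
```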

  16. A Successful Broadband Survey for Giant Lyα Nebulae. I. Survey Design and Candidate Selection

    NASA Astrophysics Data System (ADS)

    Prescott, Moire K. M.; Dey, Arjun; Jannuzi, Buell T.

    2012-04-01

    Giant Lyα nebulae (or Lyα "blobs") are likely sites of ongoing massive galaxy formation, but the rarity of these powerful sources has made it difficult to form a coherent picture of their properties, ionization mechanisms, and space density. Systematic narrowband Lyα nebula surveys are ongoing, but the small redshift range covered and the observational expense limit the comoving volume that can be probed by even the largest of these surveys and pose a significant problem when searching for such rare sources. We have developed a systematic search technique designed to find large Lyα nebulae at 2 ≲ z ≲ 3 within deep broadband imaging and have carried out a survey of the 9.4 deg² NOAO Deep Wide-Field Survey Boötes field. With a total survey comoving volume of ≈10⁸ h₇₀⁻³ Mpc³, this is the largest volume survey for Lyα nebulae ever undertaken. In this first paper in the series, we present the details of the survey design and a systematically selected sample of 79 candidates, which includes one previously discovered Lyα nebula.

  17. A SUCCESSFUL BROADBAND SURVEY FOR GIANT Lyα NEBULAE. I. SURVEY DESIGN AND CANDIDATE SELECTION

    SciTech Connect

    Prescott, Moire K. M.; Dey, Arjun; Jannuzi, Buell T.

    2012-04-01

    Giant Lyα nebulae (or Lyα "blobs") are likely sites of ongoing massive galaxy formation, but the rarity of these powerful sources has made it difficult to form a coherent picture of their properties, ionization mechanisms, and space density. Systematic narrowband Lyα nebula surveys are ongoing, but the small redshift range covered and the observational expense limit the comoving volume that can be probed by even the largest of these surveys and pose a significant problem when searching for such rare sources. We have developed a systematic search technique designed to find large Lyα nebulae at 2 ≲ z ≲ 3 within deep broadband imaging and have carried out a survey of the 9.4 deg² NOAO Deep Wide-Field Survey Boötes field. With a total survey comoving volume of ≈10⁸ h₇₀⁻³ Mpc³, this is the largest volume survey for Lyα nebulae ever undertaken. In this first paper in the series, we present the details of the survey design and a systematically selected sample of 79 candidates, which includes one previously discovered Lyα nebula.

  18. Simulating future uncertainty to guide the selection of survey designs for long-term monitoring

    USGS Publications Warehouse

    Garman, Steven L.; Schweiger, E. William; Manier, Daniel J.; Gitzen, Robert A.; Millspaugh, Joshua J.; Cooper, Andrew B.; Licht, Daniel S.

    2012-01-01

    A goal of environmental monitoring is to provide sound information on the status and trends of natural resources (Messer et al. 1991, Theobald et al. 2007, Fancy et al. 2009). When monitoring observations are acquired by measuring a subset of the population of interest, probability sampling as part of a well-constructed survey design provides the most reliable and legally defensible approach to achieve this goal (Cochran 1977, Olsen et al. 1999, Schreuder et al. 2004; see Chapters 2, 5, 6, 7). Previous works have described the fundamentals of sample surveys (e.g. Hansen et al. 1953, Kish 1965). Interest in survey designs and monitoring over the past 15 years has led to extensive evaluations and new developments of sample selection methods (Stevens and Olsen 2004), of strategies for allocating sample units in space and time (Urquhart et al. 1993, Overton and Stehman 1996, Urquhart and Kincaid 1999), and of estimation (Lesser and Overton 1994, Overton and Stehman 1995) and variance properties (Larsen et al. 1995, Stevens and Olsen 2003) of survey designs. Carefully planned, “scientific” (Chapter 5) survey designs have become a standard in contemporary monitoring of natural resources. Based on our experience with the long-term monitoring program of the US National Park Service (NPS; Fancy et al. 2009; Chapters 16, 22), operational survey designs tend to be selected using the following procedures. For a monitoring indicator (i.e. variable or response), a minimum detectable trend requirement is specified, based on the minimum level of change that would result in meaningful change (e.g. degradation). A probability of detecting this trend (statistical power) and an acceptable level of uncertainty (Type I error; see Chapter 2) within a specified time frame (e.g. 10 years) are specified to ensure timely detection. Explicit statements of the minimum detectable trend, the time frame for detecting the minimum trend, power, and acceptable probability of Type I error (
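The selection procedure described (specify a minimum detectable trend, a required statistical power, and an acceptable Type I error over a fixed time frame) is often evaluated by Monte Carlo simulation. A simplified sketch that ignores the revisit-design and variance-component details the chapter addresses:

```python
# Simplified Monte Carlo sketch of the power analysis described above:
# simulate a monitored indicator with a fixed annual trend and estimate
# the probability of detecting it with a slope test (alpha = Type I error).
import numpy as np

def trend_power(slope, sigma, years=10, n_sims=2000, seed=1):
    rng = np.random.default_rng(seed)
    t = np.arange(years, dtype=float)
    detections = 0
    for _ in range(n_sims):
        y = 100.0 + slope * t + rng.normal(0.0, sigma, size=years)
        # Least-squares slope and its standard error.
        b, a = np.polyfit(t, y, 1)
        resid = y - (a + b * t)
        se = np.sqrt(resid.var(ddof=2) / ((t - t.mean()) ** 2).sum())
        # Approximate z-test on the slope (a t-test would be more exact).
        if abs(b / se) > 1.645:  # two-sided alpha of roughly 0.10
            detections += 1
    return detections / n_sims

# E.g. power to detect a 2-unit/year decline over 10 years of noisy data:
print(trend_power(slope=-2.0, sigma=5.0))
```

In practice the simulation would also vary the candidate survey designs (sample sizes, revisit schedules) and select the cheapest design meeting the power requirement.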

  19. The China Mental Health Survey: II. Design and field procedures.

    PubMed

    Liu, Zhaorui; Huang, Yueqin; Lv, Ping; Zhang, Tingting; Wang, Hong; Li, Qiang; Yan, Jie; Yu, Yaqin; Kou, Changgui; Xu, Xiufeng; Lu, Jin; Wang, Zhizhong; Qiu, Hongyan; Xu, Yifeng; He, Yanling; Li, Tao; Guo, Wanjun; Tian, Hongjun; Xu, Guangming; Xu, Xiangdong; Ma, Yanjuan; Wang, Linhong; Wang, Limin; Yan, Yongping; Wang, Bo; Xiao, Shuiyuan; Zhou, Liang; Li, Lingjiang; Tan, Liwen; Chen, Hongguang; Ma, Chao

    2016-11-01

    The China Mental Health Survey (CMHS), which was carried out from July 2013 to March 2015, was the first nationally representative community survey of mental disorders and mental health services in China to use computer-assisted personal interviewing (CAPI). Face-to-face interviews were conducted in the homes of respondents, who were selected through a nationally representative, multi-stage, disproportionate stratified sampling procedure. Sample selection was integrated with the National Chronic Disease and Risk Factor Surveillance Survey administered by the National Centre for Chronic and Non-communicable Disease Control and Prevention in 2013, which made it possible to obtain both physical and mental health information on the Chinese community population. A one-stage design of data collection was used in the CMHS to obtain information on mental disorders, including mood disorders, anxiety disorders, and substance use disorders, while a two-stage design was applied for schizophrenia and other psychotic disorders, and for dementia. A total of 28,140 respondents completed the survey, for an overall response rate of 72.9%. This paper describes the survey mode, fieldwork organization, procedures, and the sample design and weighting of the CMHS. Detailed information is presented on the establishment of a new payment scheme for interviewers, the results of quality control in both stages, and the evaluation of the weighting.

  20. New journal selection for quantitative survey of infectious disease research: application for Asian trend analysis

    PubMed Central

    2009-01-01

    Background Quantitative survey of research articles, as an application of bibliometrics, is an effective tool for grasping overall trends in various medical research fields. This type of survey has also been applied to infectious disease research; however, previous studies were insufficient because they underestimated articles published in non-English or regional journals. Methods Using a combination of Scopus™ and PubMed, the databases of scientific literature, and English and non-English keywords directly linked to infectious disease control, we identified international and regional infectious disease journals. In order to ascertain whether the newly selected journals were appropriate for surveying a wide range of research articles, we compared the number of original articles and reviews registered in the selected journals to those in the 'Infectious Disease Category' of the Science Citation Index Expanded™ (SCI Infectious Disease Category) during 1998-2006. Subsequently, we applied the newly selected journals to survey the number of original articles and reviews originating from 11 Asian countries during the same period. Results One hundred journals, written in English or 7 non-English languages, were newly selected as infectious disease journals. The journals published 14,156 original articles and reviews of Asian origin and 118,158 worldwide, exceeding the numbers registered in the SCI Infectious Disease Category (4,621 of Asian origin and 66,518 worldwide). In the Asian trend analysis of the 100 journals, Japan had the highest percentage of original articles and reviews in the area, and no noticeable increase in articles was observed during the study period. China, India and Taiwan had relatively large numbers and a high increase rate of original articles among Asian countries. When adjusting the publication of original articles according to the country population and the gross domestic product (GDP), Singapore and Taiwan were the most

  1. Multidisciplinary aerospace design optimization: Survey of recent developments

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw; Haftka, Raphael T.

    1995-01-01

    The increasing complexity of engineering systems has sparked growing interest in multidisciplinary optimization (MDO). This paper presents a survey of recent publications in the aerospace field, where interest in MDO has been particularly intense. The two main challenges of MDO are computational expense and organizational complexity; accordingly, the survey focuses on the various ways researchers deal with these challenges. The survey is organized by a breakdown of MDO into its conceptual components, with sections on Mathematical Modeling, Design-Oriented Analysis, Approximation Concepts, Optimization Procedures, System Sensitivity, and Human Interface. Because the authors' main expertise is in the structures area, the bulk of the references focus on the interaction of the structures discipline with other disciplines. In particular, two sections at the end focus on two such interactions that have recently been pursued with particular vigor: Simultaneous Optimization of Structures and Aerodynamics, and Simultaneous Optimization of Structures Combined With Active Control.

  2. Lack of quantitative training among early-career ecologists: a survey of the problem and potential solutions.

    PubMed

    Barraquand, Frédéric; Ezard, Thomas H G; Jørgensen, Peter S; Zimmerman, Naupaka; Chamberlain, Scott; Salguero-Gómez, Roberto; Curran, Timothy J; Poisot, Timothée

    2014-01-01

    Proficiency in mathematics and statistics is essential to modern ecological science, yet few studies have assessed the level of quantitative training received by ecologists. To do so, we conducted an online survey. The 937 respondents were mostly early-career scientists who studied biology as undergraduates. We found a clear self-perceived lack of quantitative training: 75% were not satisfied with their understanding of mathematical models; 75% felt that the level of mathematics was "too low" in their ecology classes; 90% wanted more mathematics classes for ecologists; and 95% wanted more statistics classes. Respondents thought that 30% of classes in ecology-related degrees should be focused on quantitative disciplines, which is likely higher than in most existing programs. The main suggestion for improving quantitative training was to relate theoretical and statistical modeling to applied ecological problems. Improving quantitative training will require dedicated quantitative classes for ecology-related degrees that embody good mathematical and statistical practice.

  3. Survey of quantitative antimicrobial consumption per production stage in farrow-to-finish pig farms in Spain

    PubMed Central

    Moreno, Miguel A.

    2014-01-01

    Objectives To characterise antimicrobial use (AMU) per production stage in terms of drugs, routes of application, indications, duration and exposed animals in farrow-to-finish pig farms in Spain. Design Survey using a questionnaire on AMU during the six months prior to the interview, administered in face-to-face interviews completed from April to October 2010. Participants 108 potentially eligible farms covering the whole country were selected using a multistage sampling methodology; of these, 33 were excluded because they did not fulfil the participation criteria, and 49 were surveyed. Results Ranked by use per farm, production stage and administration route, the most used antimicrobials were polymyxins (colistin) administered in feed during the growing and preweaning phases, followed by β-lactams administered in feed during the growing and preweaning phases and by injection during the preweaning phase. Conclusions The study demonstrates that the growing stage (from weaning to the start of finishing) has the highest AMU according to different quantitative indicators (number of records, number of antimicrobials used, percentage of farms reporting use, relative number of exposed animals per farm and duration of exposure); that feed is the administration route producing the highest antimicrobial exposure, based on the larger number of exposed animals and the longer duration of treatment; and that there are large differences in AMU among individual pig farms. PMID:26392868

  4. Engaging Students in Survey Design and Data Collection

    ERIC Educational Resources Information Center

    Sole, Marla A.

    2015-01-01

    Every day, people use data to make decisions that affect their personal and professional lives, trusting that the data are correct. Many times, however, the data are inaccurate, as a result of a flaw in the design or methodology of the survey used to collect the data. Researchers agree that only questions that are clearly worded, unambiguous, free…

  5. Survey design and extent estimates for the National Lakes Assessment

    EPA Science Inventory

    The US Environmental Protection Agency (EPA) conducted a National Lake Assessment (NLA) in the conterminous USA in 2007 as part of a national assessment of aquatic resources using probability based survey designs. The USEPA Office of Water led the assessment, in cooperation with...

  6. Design and Architecture of Collaborative Online Communities: A Quantitative Analysis

    ERIC Educational Resources Information Center

    Aviv, Reuven; Erlich, Zippy; Ravid, Gilad

    2004-01-01

    This paper considers four aspects of online communities: design, mechanisms, architecture, and the constructed knowledge. We hypothesize that different community designs drive different mechanisms, which give rise to different architectures, which in turn result in different levels of collaborative knowledge construction. To test this chain…

  7. Survey says? A primer on web-based survey design and distribution.

    PubMed

    Oppenheimer, Adam J; Pannucci, Christopher J; Kasten, Steven J; Haase, Steven C

    2011-07-01

    The Internet has changed the way in which we gather and interpret information. Although books were once the exclusive bearers of data, knowledge is now only a keystroke away. The Internet has also facilitated the synthesis of new knowledge. Specifically, it has become a tool through which medical research is conducted. A review of the literature reveals that in the past year, over 100 medical publications have been based on Web-based survey data alone. Because of emerging Internet technologies, Web-based surveys can now be launched with little computer knowledge. They may also be self-administered, eliminating personnel requirements. Ultimately, an investigator may build, implement, and analyze survey results with speed and efficiency, obviating the need for mass mailings and data processing. All of these qualities have rendered telephone and mail-based surveys virtually obsolete. Despite these capabilities, Web-based survey techniques are not without their limitations, namely, recall and response biases. When used properly, however, Web-based surveys can greatly simplify the research process. This article discusses the implications of Web-based surveys and provides guidelines for their effective design and distribution.

  8. Integrated Analysis and Tools for Land Subsidence Surveying and Monitoring: a Semi-Quantitative Approach

    NASA Astrophysics Data System (ADS)

    Mosconi, A.; Pozzoli, A.; Meroni, A.; Gagliano, S.

    2015-10-01

    This paper presents an integrated approach to land subsidence monitoring using measurements from different sensors. Eni S.p.A., the main Italian oil and gas company, constantly surveys the land with state-of-the-art and innovative techniques, so a method able to integrate the results is an important and timely topic. The world is now a multi-sensor platform, and measurement integration is strictly necessary; combining the different data sources should be done in a clever way, taking advantage of the best performance of each technique. An integrated analysis allows the interpretation of simultaneous time series of data from different sources and tries to separate the contributions to subsidence. With this purpose, Exelis VIS, in collaboration with Eni S.p.A., customized PISAV (Permanent Interferometric Scatterometer Analysis and Visualization), an ENVI extension able to capitalize on and combine all the different data collected in the surveys. This article presents some significant examples showing the potential of this tool in oil and gas activity: a hydrocarbon storage field where the comparison between SAR and production volumes highlights a correlation between the two measures in a few steps; and a hydrocarbon production field with the Satellite Survey Unit (S.S.U.), where SAR, CGPS, piezometers and assestimeters measure the same area at the same time, giving the opportunity to analyse the data contextually. In the integrated analysis performed with PISAV, a mathematically rigorous study is not always possible, and a semi-quantitative approach is the only method for interpreting the results. As a result, in the first test case a strong correlation between injected hydrocarbon volume and vertical displacement was highlighted; in the second, the integrated analysis has several advantages in monitoring land subsidence: it permits a first qualitative "differentiation" of the natural and anthropic components of subsidence, and also gives more

  9. A survey of bio-inspired compliant legged robot designs.

    PubMed

    Zhou, Xiaodong; Bi, Shusheng

    2012-12-01

    The roles of biological springs in vertebrate animals, and their implementation in compliant legged robots, offer significant advantages over rigid legged designs in certain types of scenarios. Many robotics institutes have worked in conjunction with biologists to incorporate these principles into the design of biologically inspired robots. The motivation of this review is to investigate the most widely published compliant legged robots and categorize them according to the types of compliant elements adopted in their mechanical structures. Based on the typical robots investigated, the trade-offs between the categories are summarized. In addition, the most significant performances of these robots are compared quantitatively, and multiple available solutions for future compliant legged robot design are suggested. Finally, the design challenges for compliant legged robots are analysed. This review will provide useful guidance for robotic designers in creating new designs by inheriting the virtues of successful robots according to the specific task.

  10. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
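As a minimal sketch of the metamodeling idea (with a toy analytic function standing in for an expensive analysis, not one of the paper's applications), an analysis code is sampled at a few design points and replaced by a cheap quadratic response surface fit by least squares:

```python
import numpy as np

def expensive_analysis(x):
    # stand-in for a costly deterministic simulation
    return (x - 1.5) ** 2 + 2.0

# sample the "expensive" code at a handful of design points
x_train = np.linspace(0.0, 3.0, 7)
y_train = expensive_analysis(x_train)

# fit the metamodel y ≈ b0 + b1*x + b2*x^2 by least squares
X = np.column_stack([np.ones_like(x_train), x_train, x_train ** 2])
coef, *_ = np.linalg.lstsq(X, y_train, rcond=None)

# predict at a new design point without re-running the analysis
x_new = 1.2
y_hat = float(coef @ np.array([1.0, x_new, x_new ** 2]))
print(round(y_hat, 3))
```

An optimizer can then query the fitted `coef` thousands of times at negligible cost; the paper's caution applies here too, since least-squares error statistics assume random error that deterministic codes do not have.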

  11. Large Synoptic Survey Telescope: From Science Drivers to Reference Design

    SciTech Connect

    Ivezic, Z.; Axelrod, T.; Brandt, W.N.; Burke, D.L.; Claver, C.F.; Connolly, A.; Cook, K.H.; Gee, P.; Gilmore, D.K.; Jacoby, S.H.; Jones, R.L.; Kahn, S.M.; Kantor, J.P.; Krabbendam, V.; Lupton, R.H.; Monet, D.G.; Pinto, P.A.; Saha, A.; Schalk, T.L.; Schneider, D.P.; Strauss, Michael A.; /Washington U., Seattle, Astron. Dept. /LSST Corp. /Penn State U., Astron. Astrophys. /KIPAC, Menlo Park /NOAO, Tucson /LLNL, Livermore /UC, Davis /Princeton U., Astrophys. Sci. Dept. /Naval Observ., Flagstaff /Arizona U., Astron. Dept. - Steward Observ. /UC, Santa Cruz /Harvard U. /Johns Hopkins U. /Illinois U., Urbana

    2011-10-14

    In the history of astronomy, major advances in our understanding of the Universe have come from dramatic improvements in our ability to accurately measure astronomical quantities. Aided by rapid progress in information technology, current sky surveys are changing the way we view and study the Universe. Next-generation surveys will maintain this revolutionary progress. We focus here on the most ambitious survey currently planned in the visible band, the Large Synoptic Survey Telescope (LSST). LSST will have unique survey capability in the faint time domain. The LSST design is driven by four main science themes: constraining dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. It will be a large, wide-field ground-based system designed to obtain multiple images covering the sky that is visible from Cerro Pachon in Northern Chile. The current baseline design, with an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg² field of view, and a 3,200 Megapixel camera, will allow about 10,000 square degrees of sky to be covered using pairs of 15-second exposures in two photometric bands every three nights on average. The system is designed to yield high image quality, as well as superb astrometric and photometric accuracy. The survey area will include 30,000 deg² with δ < +34.5°, and will be imaged multiple times in six bands, ugrizy, covering the wavelength range 320-1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will observe a 20,000 deg² region about 1000 times in the six bands during the anticipated 10 years of operation. These data will result in databases including 10 billion galaxies and a similar number of stars, and will serve the majority of science programs. The remaining 10% of the observing time will be allocated to special programs such as Very Deep and Very Fast time domain surveys. We describe how the

  12. Designing a Quantitative Structure-Activity Relationship for the ...

    EPA Pesticide Factsheets

    Toxicokinetic models serve a vital role in risk assessment by bridging the gap between chemical exposure and potentially toxic endpoints. While intrinsic metabolic clearance rates have a strong impact on toxicokinetics, limited data is available for environmentally relevant chemicals including nearly 8000 chemicals tested for in vitro bioactivity in the Tox21 program. To address this gap, a quantitative structure-activity relationship (QSAR) for intrinsic metabolic clearance rate was developed to offer reliable in silico predictions for a diverse array of chemicals. Models were constructed with curated in vitro assay data for both pharmaceutical-like chemicals (ChEMBL database) and environmentally relevant chemicals (ToxCast screening) from human liver microsomes (2176 from ChEMBL) and human hepatocytes (757 from ChEMBL and 332 from ToxCast). Due to variability in the experimental data, a binned approach was utilized to classify metabolic rates. Machine learning algorithms, such as random forest and k-nearest neighbor, were coupled with open source molecular descriptors and fingerprints to provide reasonable estimates of intrinsic metabolic clearance rates. Applicability domains defined the optimal chemical space for predictions, which covered environmental chemicals well. A reduced set of informative descriptors (including relative charge and lipophilicity) and a mixed training set of pharmaceuticals and environmentally relevant chemicals provided the best intr

  13. The ZInEP Epidemiology Survey: background, design and methods.

    PubMed

    Ajdacic-Gross, Vladeta; Müller, Mario; Rodgers, Stephanie; Warnke, Inge; Hengartner, Michael P; Landolt, Karin; Hagenmuller, Florence; Meier, Magali; Tse, Lee-Ting; Aleksandrowicz, Aleksandra; Passardi, Marco; Knöpfli, Daniel; Schönfelder, Herdis; Eisele, Jochen; Rüsch, Nicolas; Haker, Helene; Kawohl, Wolfram; Rössler, Wulf

    2014-12-01

    This article introduces the design, sampling, field procedures and instruments used in the ZInEP Epidemiology Survey. This survey is one of six ZInEP projects (Zürcher Impulsprogramm zur nachhaltigen Entwicklung der Psychiatrie, i.e. the "Zurich Program for Sustainable Development of Mental Health Services"). It parallels the longitudinal Zurich Study with a sample comparable in age and gender, and with similar methodology, including identical instruments. Thus, it is aimed at assessing changes in the prevalence rates of common mental disorders and in the use of professional help and psychiatric services. Moreover, the current survey widens the spectrum of topics by including sociopsychiatric questionnaires on stigma, stress-related biological measures such as load and cortisol levels, electroencephalographic (EEG) and near-infrared spectroscopy (NIRS) examinations with various paradigms, and sociophysiological tests. The structure of the ZInEP Epidemiology Survey entails four subprojects: a short telephone screening using the SCL-27 (n of nearly 10,000), a comprehensive face-to-face interview based on the SPIKE (Structured Psychopathological Interview and Rating of the Social Consequences for Epidemiology: the main instrument of the Zurich Study) with a stratified sample (n = 1500), tests in the Center for Neurophysiology and Sociophysiology (n = 227), and a prospective study with up to three follow-up interviews and further measures (n = 157). In sum, the four subprojects of the ZInEP Epidemiology Survey deliver a large interdisciplinary database.

  14. Exploring the utility of quantitative network design in evaluating Arctic sea ice thickness sampling strategies

    NASA Astrophysics Data System (ADS)

    Kaminski, T.; Kauker, F.; Eicken, H.; Karcher, M.

    2015-08-01

    We present a quantitative network design (QND) study of the Arctic sea ice-ocean system using a software tool that can evaluate hypothetical observational networks in a variational data assimilation system. For a demonstration, we evaluate two idealised flight transects derived from NASA's Operation IceBridge airborne ice surveys in terms of their potential to improve 10-day to 5-month sea ice forecasts. As target regions for the forecasts we select the Chukchi Sea, an area particularly relevant for maritime traffic and offshore resource exploration, as well as two areas related to the Barnett ice severity index (BSI), a standard measure of shipping conditions along the Alaskan coast that is routinely issued by ice services. Our analysis quantifies the benefits of sampling upstream of the target area and of reducing the sampling uncertainty. We demonstrate how observations of sea ice and snow thickness can constrain ice and snow variables in a target region and quantify the complementarity of combining two flight transects. We further quantify the benefit of improved atmospheric forecasts and a well-calibrated model.

  15. Exploring the utility of quantitative network design in evaluating Arctic sea-ice thickness sampling strategies

    NASA Astrophysics Data System (ADS)

    Kaminski, T.; Kauker, F.; Eicken, H.; Karcher, M.

    2015-03-01

    We present a quantitative network design (QND) study of the Arctic sea ice-ocean system using a software tool that can evaluate hypothetical observational networks in a variational data assimilation system. For a demonstration, we evaluate two idealised flight transects derived from NASA's Operation IceBridge airborne ice surveys in terms of their potential to improve ten-day to five-month sea-ice forecasts. As target regions for the forecasts we select the Chukchi Sea, an area particularly relevant for maritime traffic and offshore resource exploration, as well as two areas related to the Barnett Ice Severity Index (BSI), a standard measure of shipping conditions along the Alaskan coast that is routinely issued by ice services. Our analysis quantifies the benefits of sampling upstream of the target area and of reducing the sampling uncertainty. We demonstrate how observations of sea-ice and snow thickness can constrain ice and snow variables in a target region and quantify the complementarity of combining two flight transects. We further quantify the benefit of improved atmospheric forecasts and a well-calibrated model.

  16. A survey on methods of design features identification

    NASA Astrophysics Data System (ADS)

    Grabowik, C.; Kalinowski, K.; Paprocka, I.; Kempa, W.

    2015-11-01

    currently identified feature. In the IFR method, the system designer defines a set of features and sets a collection of recognition process parameters. This allows individual features to be unambiguously identified, in an automatic or semiautomatic way, directly in the CAD system or in an external application to which the part model might be transferred. Additionally, a user is able to define non-geometrical information such as overall dimensions, surface roughness, etc. In this paper a survey of methods of feature identification and recognition is presented, especially in the context of AFR methods.

  17. Contrast media: quantitative criteria for designing compounds with low toxicity.

    PubMed

    Levitan, H; Rapoport, S I

    1976-01-01

    Toxicity of contrast media that are ionized iodobenzoic acids or their derivatives is highly correlated with lipid solubility, as measured by the octanol/water partition coefficient. New contrast media have been designed with lower lipid solubility than media in current use, taking into account the additive-constitutive nature of the partition coefficient of an organic compound. If these contrast media are chemically stable, they should also be less toxic. It remains to be tested whether the relation between clinical toxicity and lipid solubility applies to non-ionized contrast media as well.
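The additive-constitutive property the authors exploit can be shown schematically: a molecule's log partition coefficient (log P) is approximated as a sum of fragment contributions, so substituting a hydrophilic group lowers predicted lipid solubility. The fragment values below are invented purely for illustration, not measured contributions:

```python
# Invented fragment contributions to log P (illustrative only, not real data).
base = {"aromatic core": 2.0, "-I": 1.1, "-COOH": -0.3}

# Candidate design: add a hydrophilic substituent to lower lipid solubility.
candidate = dict(base, **{"-CH2OH": -1.0})

logP_base = sum(base.values())
logP_candidate = sum(candidate.values())
print(round(logP_base, 1), round(logP_candidate, 1))
```

Because the contributions simply add, the effect of any proposed substitution on predicted lipid solubility, and hence, under the paper's correlation, on expected toxicity, can be screened before synthesis.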

  18. Improved field experimental designs and quantitative evaluation of aquatic ecosystems

    SciTech Connect

    McKenzie, D.H.; Thomas, J.M.

    1984-05-01

    The paired-station concept and a log-transformed analysis of variance were used as methods to evaluate zooplankton density data collected during five years at an electrical generation station on Lake Michigan. To discuss the example and the field design necessary for a valid statistical analysis, considerable background is provided on the questions of selecting (1) sampling station pairs, (2) experimentwise error rates for multi-species analyses, (3) levels of Type I and II error rates, and (4) procedures for conducting the field monitoring program, and (5) on the consequences of violating statistical assumptions. Details for estimating the sample sizes necessary to detect changes of a specified magnitude are included. Both statistical and biological problems with monitoring programs (as now conducted) are addressed; serial correlation of successive observations in the time series obtained was identified as one principal statistical difficulty. The procedure reduces this problem to a level where statistical methods can be used confidently. 27 references, 4 figures, 2 tables.

  19. Force Design Analysis of the Army Aeromedical Evacuation Company: A Quantitative Approach

    DTIC Science & Technology

    2012-01-01

    Army materiel solutions. Keywords: force design, capability assessment, mixed-methods, aeromedical evacuation. 1. Introduction. 1.1. Background. In this...incorporates primary data in a unique mixed-methods (quantitative and qualitative) approach to force design. Mixed methods add value in that both... mixed-methods approach to evaluating MEDEVAC DOTMLPF considerations provides a baseline for assessing future Army materiel solutions. Acknowledgment

  20. Mixing Qualitative and Quantitative Methods: Insights into Design and Analysis Issues

    ERIC Educational Resources Information Center

    Lieber, Eli

    2009-01-01

    This article describes and discusses issues related to research design and data analysis in the mixing of qualitative and quantitative methods. It is increasingly desirable to use multiple methods in research, but questions arise as to how best to design and analyze the data generated by mixed methods projects. I offer a conceptualization for such…

  1. SEDS: The Spitzer Extended Deep Survey. Survey Design, Photometry, and Deep IRAC Source Counts

    NASA Technical Reports Server (NTRS)

    Ashby, M. L. N.; Willner, S. P.; Fazio, G. G.; Huang, J.-S.; Arendt, A.; Barmby, P.; Barro, G; Bell, E. F.; Bouwens, R.; Cattaneo, A.; Croton, D.; Dave, R.; Dunlop, J. S.; Egami, E.; Faber, S.; Finlator, K.; Grogin, N. A.; Guhathakurta, P.; Hernquist, L.; Hora, J. L.; Illingworth, G.; Kashlinsky, A; Koekmoer, A. M.; Koo, D. C.; Moseley, H.

    2013-01-01

    The Spitzer Extended Deep Survey (SEDS) is a very deep infrared survey within five well-known extragalactic science fields: the UKIDSS Ultra-Deep Survey, the Extended Chandra Deep Field South, COSMOS, the Hubble Deep Field North, and the Extended Groth Strip. SEDS covers a total area of 1.46 deg² to a depth of 26 AB mag (3σ) in both of the warm Infrared Array Camera (IRAC) bands at 3.6 and 4.5 micron. Because of its uniform depth of coverage in so many widely-separated fields, SEDS is subject to roughly 25% smaller errors due to cosmic variance than a single-field survey of the same size. SEDS was designed to detect and characterize galaxies from intermediate to high redshifts (z = 2-7) with a built-in means of assessing the impact of cosmic variance on the individual fields. Because the full SEDS depth was accumulated in at least three separate visits to each field, typically with six-month intervals between visits, SEDS also furnishes an opportunity to assess the infrared variability of faint objects. This paper describes the SEDS survey design, processing, and publicly-available data products. Deep IRAC counts for the more than 300,000 galaxies detected by SEDS are consistent with models based on known galaxy populations. Discrete IRAC sources contribute 5.6 ± 1.0 and 4.4 ± 0.8 nW m⁻² sr⁻¹ at 3.6 and 4.5 micron to the diffuse cosmic infrared background (CIB). IRAC sources cannot contribute more than half of the total CIB flux estimated from DIRBE data. Barring an unexpected error in the DIRBE flux estimates, half the CIB flux must therefore come from a diffuse component.

  2. SEDS: THE SPITZER EXTENDED DEEP SURVEY. SURVEY DESIGN, PHOTOMETRY, AND DEEP IRAC SOURCE COUNTS

    SciTech Connect

    Ashby, M. L. N.; Willner, S. P.; Fazio, G. G.; Huang, J.-S.; Hernquist, L.; Hora, J. L.; Arendt, R.; Barmby, P.; Barro, G.; Faber, S.; Guhathakurta, P.; Bouwens, R.; Cattaneo, A.; Croton, D.; Dave, R.; Dunlop, J. S.; Egami, E.; Finlator, K.; Grogin, N. A.; and others

    2013-05-20

    The Spitzer Extended Deep Survey (SEDS) is a very deep infrared survey within five well-known extragalactic science fields: the UKIDSS Ultra-Deep Survey, the Extended Chandra Deep Field South, COSMOS, the Hubble Deep Field North, and the Extended Groth Strip. SEDS covers a total area of 1.46 deg² to a depth of 26 AB mag (3σ) in both of the warm Infrared Array Camera (IRAC) bands at 3.6 and 4.5 μm. Because of its uniform depth of coverage in so many widely-separated fields, SEDS is subject to roughly 25% smaller errors due to cosmic variance than a single-field survey of the same size. SEDS was designed to detect and characterize galaxies from intermediate to high redshifts (z = 2-7) with a built-in means of assessing the impact of cosmic variance on the individual fields. Because the full SEDS depth was accumulated in at least three separate visits to each field, typically with six-month intervals between visits, SEDS also furnishes an opportunity to assess the infrared variability of faint objects. This paper describes the SEDS survey design, processing, and publicly-available data products. Deep IRAC counts for the more than 300,000 galaxies detected by SEDS are consistent with models based on known galaxy populations. Discrete IRAC sources contribute 5.6 ± 1.0 and 4.4 ± 0.8 nW m⁻² sr⁻¹ at 3.6 and 4.5 μm to the diffuse cosmic infrared background (CIB). IRAC sources cannot contribute more than half of the total CIB flux estimated from DIRBE data. Barring an unexpected error in the DIRBE flux estimates, half the CIB flux must therefore come from a diffuse component.

  3. The 2-degree Field Lensing Survey: design and clustering measurements

    NASA Astrophysics Data System (ADS)

    Blake, Chris; Amon, Alexandra; Childress, Michael; Erben, Thomas; Glazebrook, Karl; Harnois-Deraps, Joachim; Heymans, Catherine; Hildebrandt, Hendrik; Hinton, Samuel R.; Janssens, Steven; Johnson, Andrew; Joudaki, Shahab; Klaes, Dominik; Kuijken, Konrad; Lidman, Chris; Marin, Felipe A.; Parkinson, David; Poole, Gregory B.; Wolf, Christian

    2016-11-01

    We present the 2-degree Field Lensing Survey (2dFLenS), a new galaxy redshift survey performed at the Anglo-Australian Telescope. 2dFLenS is the first wide-area spectroscopic survey specifically targeting the area mapped by deep-imaging gravitational lensing fields, in this case the Kilo-Degree Survey. 2dFLenS obtained 70 079 redshifts in the range z < 0.9 over an area of 731 deg², and is designed to extend the data sets available for testing gravitational physics and promote the development of relevant algorithms for joint imaging and spectroscopic analysis. The redshift sample consists first of 40 531 Luminous Red Galaxies (LRGs), which enable analyses of galaxy-galaxy lensing, redshift-space distortion, and the overlapping source redshift distribution by cross-correlation. An additional 28 269 redshifts form a magnitude-limited (r < 19.5) nearly complete subsample, allowing direct source classification and photometric-redshift calibration. In this paper, we describe the motivation, target selection, spectroscopic observations, and clustering analysis of 2dFLenS. We use power spectrum multipole measurements to fit the redshift-space distortion parameter of the LRG sample in two redshift ranges 0.15 < z < 0.43 and 0.43 < z < 0.7 as β = 0.49 ± 0.15 and β = 0.26 ± 0.09, respectively. These values are consistent with those obtained from LRGs in the Baryon Oscillation Spectroscopic Survey. 2dFLenS data products will be released via our website http://2dflens.swin.edu.au.

  4. Quality-by-Design II: Application of Quantitative Risk Analysis to the Formulation of Ciprofloxacin Tablets.

    PubMed

    Claycamp, H Gregg; Kona, Ravikanth; Fahmy, Raafat; Hoag, Stephen W

    2016-04-01

    Qualitative risk assessment methods are often used as the first step to determining design space boundaries; however, quantitative assessments of risk with respect to the design space, i.e., calculating the probability of failure for a given severity, are needed to fully characterize design space boundaries. Quantitative risk assessment methods in design and operational spaces are a significant aid to evaluating proposed design space boundaries. The goal of this paper is to demonstrate a relatively simple strategy for design space definition using a simplified Bayesian Monte Carlo simulation. This paper builds on a previous paper that used failure mode and effects analysis (FMEA) qualitative risk assessment and Plackett-Burman design of experiments to identify the critical quality attributes. The results show that the sequential use of qualitative and quantitative risk assessments can focus the design of experiments on a reduced set of critical material and process parameters that determine a robust design space under conditions of limited laboratory experimentation. This approach provides a strategy by which the degree of risk associated with each known parameter can be calculated and allocates resources in a manner that manages risk to an acceptable level.
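    The Bayesian Monte Carlo step described above can be sketched in a few lines: propagate assumed parameter uncertainty through a response model and count how often the result violates a specification. The response model, parameter distributions, and specification limit below are all hypothetical illustrations, not the paper's actual formulation.

```python
import random

def simulated_dissolution(hardness, lubricant_frac):
    # Hypothetical response model: percent drug dissolved at 30 min as a
    # function of tablet hardness (kP) and lubricant mass fraction.
    return 100.0 - 0.8 * hardness - 60.0 * lubricant_frac + random.gauss(0.0, 2.0)

def failure_probability(hardness_mean, lubricant_mean, n=50_000, spec=80.0):
    """Monte Carlo estimate of P(dissolution < spec) under parameter uncertainty."""
    failures = 0
    for _ in range(n):
        hardness = random.gauss(hardness_mean, 0.5)        # assumed uncertainty
        lubricant = random.gauss(lubricant_mean, 0.002)
        if simulated_dissolution(hardness, lubricant) < spec:
            failures += 1
    return failures / n

random.seed(1)
p_fail = failure_probability(hardness_mean=12.0, lubricant_mean=0.01)
print(f"Estimated probability of failure: {p_fail:.5f}")
```

    Sweeping the mean parameter values over a grid and shading regions where the failure probability exceeds an acceptance threshold yields a design-space map of the kind the paper describes.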

  5. Optical design study of the Wide Field Survey Telescope (WFST)

    NASA Astrophysics Data System (ADS)

    Lou, Zheng; Liang, Ming; Yao, Dazhi; Zheng, Xianzhong; Cheng, Jingquan; Wang, Hairen; Liu, Wei; Qian, Yuan; Zhao, Haibin; Yang, Ji

    2016-10-01

    WFST is a proposed 2.5 m wide-field survey telescope intended for dedicated wide-field sciences. The telescope is to operate in six wavelength bands (u, g, r, i, z, and w), spanning 320 to 1028 nm. Designed with a field-of-view diameter of 3 degrees and an effective aperture diameter of 2.29 m, WFST achieves a total optical throughput of over 29.3 m² deg². With such a large throughput, WFST will survey up to 6000 deg² of the northern sky in multiple colors each night, reaching 23rd magnitude for high-precision photometry and astrometry. The optical design is based on an advanced primary-focus system made up of a 2.5 m f/2.48 concave primary mirror and a primary-focus assembly (PFA) consisting of five corrector lenses, an atmospheric dispersion corrector (ADC), filters, and the focal-plane instrument. For zenith angles from 0 to 60 degrees, 80% of the polychromatic diffracted energy falls within a 0.35 arcsec diameter. The optical design also highlights enhanced transmission in the UV bands. The total optical transmission reaches 23.5% at 320 nm, enabling unique science goals in the u band. Other features include low distortion and ease of baffling against stray light. The focal-plane instrument is a 0.9-gigapixel mosaic CCD camera comprising nine 10K×10K CCD chips. An active optics system (AOS) is used to maintain runtime image quality. Various design aspects of the WFST, including the optical design, active optics, mirror supports, and the focal-plane instrument, are discussed in detail.
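    The quoted throughput can be sanity-checked from the stated aperture and field of view. This is a purely geometric estimate that ignores vignetting and obscuration conventions, so a small offset from the quoted 29.3 m² deg² is expected.

```python
import math

# Geometric sanity check of the quoted throughput (étendue): effective
# aperture diameter 2.29 m, field-of-view diameter 3 degrees.
aperture_area = math.pi * (2.29 / 2.0) ** 2     # m^2, ~4.12
fov_area = math.pi * (3.0 / 2.0) ** 2           # deg^2, ~7.07
etendue = aperture_area * fov_area              # m^2 deg^2
print(f"A*Omega = {etendue:.1f} m^2 deg^2")     # ~29.1, close to the quoted 29.3
```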

  6. The Large Synoptic Survey Telescope concept design overview

    NASA Astrophysics Data System (ADS)

    Krabbendam, Victor L.

    2008-07-01

    The Large Synoptic Survey Telescope Project is a public-private partnership that has successfully completed the Concept Design of its wide-field ground-based survey system and started several long-lead construction activities using private funding. The telescope has a 3-mirror wide-field optical system with an 8.4 meter primary, 3.4 meter secondary, and 5 meter tertiary mirror. The reflective optics feed three refractive elements and a 64 cm 3.2 gigapixel camera. The telescope will be located on the summit of Cerro Pachón in Chile. The LSST data management system will reduce, transport, alert on, and archive the roughly 15 terabytes of data produced nightly, and will serve the raw and catalog data, accumulating at an average of 7 petabytes per year, to the community without any proprietary period. This survey will yield contiguous overlapping imaging of 20,000 square degrees of sky in six optical filter bands covering wavelengths from 320 to 1080 nm. The project continues to attract institutional partners and has acquired non-federal funding sufficient to construct the primary mirror, already in progress at the University of Arizona, and to fund detector prototype efforts, two of the longest-lead items in the LSST. The project has submitted a proposal for construction to the National Science Foundation Major Research Equipment and Facilities Construction (MREFC) program and is preparing for a 2011 funding authorization.

  7. 76 FR 27384 - Agency Information Collection Activity (Veteran Suicide Prevention Online Quantitative Surveys...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-11

    .... Veterans Online Survey, VA Form 10-0513. b. Veterans Family Online Survey, VA Form 10-0513a. c. Veterans...). Type of Review: New collection. Abstract: VA's top priority is the prevention of Veterans suicide. It... Households. Estimated Annual Burden a. Veterans Online Survey, VA Form 10-0513--300 hours. b. Veterans...

  8. Quantitative Survey and Structural Classification of Hydraulic Fracturing Chemicals Reported in Unconventional Gas Production.

    PubMed

    Elsner, Martin; Hoelzer, Kathrin

    2016-04-05

    Much interest is directed at the chemical structure of hydraulic fracturing (HF) additives in unconventional gas exploitation. To bridge the gap between existing alphabetical disclosures by function/CAS number and emerging scientific contributions on fate and toxicity, we review the structural properties which motivate HF applications, and which determine environmental fate and toxicity. Our quantitative overview relied on voluntary U.S. disclosures evaluated from the FracFocus registry by different sources and on a House of Representatives ("Waxman") list. Out of over 1000 reported substances, classification by chemistry yielded succinct subsets able to illustrate the rationale of their use, and physicochemical properties relevant for environmental fate, toxicity and chemical analysis. While many substances were nontoxic, frequent disclosures also included notorious groundwater contaminants like petroleum hydrocarbons (solvents), precursors of endocrine disruptors like nonylphenols (nonemulsifiers), toxic propargyl alcohol (corrosion inhibitor), tetramethylammonium (clay stabilizer), biocides or strong oxidants. Application of highly oxidizing chemicals, together with occasional disclosures of putative delayed acids and complexing agents (i.e., compounds designed to react in the subsurface) suggests that relevant transformation products may be formed. To adequately investigate such reactions, available information is not sufficient, but instead a full disclosure of HF additives is necessary.

  9. The Large Synoptic Survey Telescope preliminary design overview

    NASA Astrophysics Data System (ADS)

    Krabbendam, V. L.; Sweeney, D.

    2010-07-01

    The Large Synoptic Survey Telescope (LSST) Project is a public-private partnership that is well into the design and development of the complete observatory system to conduct a wide fast deep survey and to process and serve the data. The telescope has a 3-mirror wide-field optical system with an 8.4 meter primary, 3.4 meter secondary, and 5 meter tertiary mirror. The reflective optics feed three refractive elements and a 64 cm 3.2 gigapixel camera. The LSST data management system will reduce, transport, alert on, and archive the roughly 15 terabytes of data produced nightly, and will serve the raw and catalog data, accumulating at an average of 7 petabytes per year, to the community without any proprietary period. The project has completed several data challenges designed to prototype and test the data management system to significant pre-construction levels. The project continues to attract institutional partners and has acquired non-federal funding sufficient to construct the primary mirror, already in progress at the University of Arizona, build the secondary mirror substrate, completed by Corning, and fund detector prototype efforts, several of which have been tested on the sky. The project also focuses on systems engineering, risk reduction through prototyping, and major efforts in image and operations simulation. The project has submitted a proposal for construction to the National Science Foundation Major Research Equipment and Facilities Construction (MREFC) program and has prepared project advocacy papers for the National Research Council's Astronomy 2010 Decadal Survey. The project is preparing for a 2012 construction funding authorization.

  10. Software for quantitative analysis of radiotherapy: overview, requirement analysis and design solutions.

    PubMed

    Zhang, Lanlan; Hub, Martina; Mang, Sarah; Thieke, Christian; Nix, Oliver; Karger, Christian P; Floca, Ralf O

    2013-06-01

    Radiotherapy is a fast-developing discipline which plays a major role in cancer care. Quantitative analysis of radiotherapy data can improve the success of the treatment and support the prediction of outcome. In this paper, we first identify functional, conceptional and general requirements on a software system for quantitative analysis of radiotherapy. Further, we present an overview of existing radiotherapy analysis software tools and check them against the stated requirements. As none of them could meet all of the demands presented herein, we analyzed possible conceptional problems and present software design solutions and recommendations to meet the stated requirements (e.g. algorithmic decoupling via dose iterator pattern; analysis database design). As a proof of concept we developed a software library "RTToolbox" following the presented design principles. The RTToolbox is available as an open source library and has already been tested in a larger-scale software system for different use cases. These examples demonstrate the benefit of the presented design principles.

  11. Quantitative Feedback Theory (QFT) applied to the design of a rotorcraft flight control system

    NASA Technical Reports Server (NTRS)

    Hess, R. A.; Gorder, P. J.

    1992-01-01

    Quantitative Feedback Theory describes a frequency-domain technique for the design of multi-input, multi-output control systems which meet time or frequency domain performance criteria when specified uncertainty exists in the linear description of the vehicle dynamics. Quantitative Feedback Theory is applied to the design of the longitudinal flight control system for a linear uncertain model of the AH-64 rotorcraft. In this model, the uncertainty is assigned, and is assumed to be attributable to actual uncertainty in the dynamic model and to the changes in the vehicle aerodynamic characteristics which occur near hover. The model includes an approximation to the rotor and actuator dynamics. The design example indicates the manner in which handling qualities criteria may be incorporated into the design of realistic rotorcraft control systems in which significant uncertainty exists in the vehicle model.

  12. The High Cadence Transient Survey (HITS). I. Survey Design and Supernova Shock Breakout Constraints

    NASA Astrophysics Data System (ADS)

    Förster, F.; Maureira, J. C.; San Martín, J.; Hamuy, M.; Martínez, J.; Huijse, P.; Cabrera, G.; Galbany, L.; de Jaeger, Th.; González–Gaitán, S.; Anderson, J. P.; Kunkarayakti, H.; Pignata, G.; Bufano, F.; Littín, J.; Olivares, F.; Medina, G.; Smith, R. C.; Vivas, A. K.; Estévez, P. A.; Muñoz, R.; Vera, E.

    2016-12-01

    We present the first results of the High Cadence Transient Survey (HiTS), a survey whose objective is to detect and follow up optical transients with characteristic timescales from hours to days, especially the earliest hours of supernova (SN) explosions. HiTS uses the Dark Energy Camera and a custom pipeline for image subtraction, candidate filtering, and candidate visualization, which runs in real time to enable rapid reaction to new transients. We discuss the survey design, the technical challenges associated with the real-time analysis of these large volumes of data, and our first results. In our 2013, 2014, and 2015 campaigns, we detected more than 120 young SN candidates, but we did not find a clear signature from the short-lived SN shock breakouts (SBOs) originating after the core collapse of red supergiant stars, which was the initial science aim of this survey. Using the empirical distribution of limiting magnitudes from our observational campaigns, we measured the expected recovery fraction of randomly injected SN light curves, which included SBO optical peaks produced with models from Tominaga et al. (2011) and Nakar & Sari (2010). From this analysis, we cannot rule out the models from Tominaga et al. (2011) under any reasonable distributions of progenitor masses, but we can marginally rule out the brighter and longer-lived SBO models from Nakar & Sari (2010) under our best-guess distribution of progenitor masses. Finally, we highlight the implications of this work for future massive data sets produced by astronomical observatories, such as LSST.
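    The recovery-fraction idea can be illustrated with a toy calculation: draw injected source brightnesses and per-epoch limiting magnitudes, and count detections. The magnitudes below are invented, and the real analysis injects full light curves into images rather than comparing single magnitudes.

```python
import random

def recovery_fraction(injected_mags, limiting_mags, n_trials=10_000):
    """Fraction of randomly drawn (source, epoch) pairs in which the source
    is brighter than the epoch's limiting magnitude (smaller = brighter)."""
    recovered = 0
    for _ in range(n_trials):
        m_src = random.choice(injected_mags)
        m_lim = random.choice(limiting_mags)
        if m_src < m_lim:
            recovered += 1
    return recovered / n_trials

random.seed(0)
injected = [22.5, 23.0, 23.5, 24.0]   # hypothetical SBO peak magnitudes
limits = [23.0, 23.2, 23.4, 23.6]     # hypothetical per-epoch limiting magnitudes
print(f"recovered fraction: {recovery_fraction(injected, limits):.2f}")
```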

  13. Remote sensing surveys design in regional agricultural inventories

    NASA Astrophysics Data System (ADS)

    Andreev, G. G.; Djemardian, Y. A.; Ezkov, V. V.; Sazanov, N. V.

    In this paper, we consider methodological problems in the design of remote sensing surveys for regional agricultural inventories. The sampling strategy is based on the combined use of multispectral aerospace data and ground truth data obtained at test sites in the region under supervision. It includes: selection of areas that are statistically homogeneous with respect to the agricultural parameters under research; identification of a representative grid of test sites; remote sensing from aerospace platforms; and ground truth data acquisition at the test sites. Ground measurements of the biometrical parameters of the agricultural crops under research are taken at the test sites, maps of anomalies are compiled, and spectrometrical and other optico-physical characteristics of vegetation canopies and soils are defined. The derived data are used in automatic interactive imagery processing, both at the training stages of the classification procedures and in applying the results of thematic remote sensing data processing to the entire region.

  14. Design database for quantitative trait loci (QTL) data warehouse, data mining, and meta-analysis.

    PubMed

    Hu, Zhi-Liang; Reecy, James M; Wu, Xiao-Lin

    2012-01-01

    A database can be used to warehouse quantitative trait loci (QTL) data from multiple sources for comparison, genomic data mining, and meta-analysis. A robust database design involves sound data structure logistics, meaningful data transformations, normalization, and proper user interface designs. This chapter starts with a brief review of relational database basics and concentrates on issues associated with curation of QTL data into a relational database, with emphasis on the principles of data normalization and structure optimization. In addition, some simple examples of QTL data mining and meta-analysis are included. These examples are provided to help readers better understand the potential and importance of sound database design.
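    As a sketch of the normalization principle the chapter describes, the following builds a minimal relational QTL schema in SQLite: traits and linkage maps are factored into their own tables so each fact is stored once, and QTL records reference them by key. All table and column names here are hypothetical, not those of any actual QTL database.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE trait (
    trait_id   INTEGER PRIMARY KEY,
    name       TEXT NOT NULL UNIQUE
);
CREATE TABLE linkage_map (
    map_id     INTEGER PRIMARY KEY,
    name       TEXT NOT NULL,
    species    TEXT NOT NULL
);
CREATE TABLE qtl (
    qtl_id     INTEGER PRIMARY KEY,
    trait_id   INTEGER NOT NULL REFERENCES trait(trait_id),
    map_id     INTEGER NOT NULL REFERENCES linkage_map(map_id),
    chromosome TEXT NOT NULL,
    peak_cm    REAL,            -- peak position in centimorgans
    source     TEXT             -- citation for the original study
);
""")
con.execute("INSERT INTO trait (name) VALUES ('backfat thickness')")
con.execute("INSERT INTO linkage_map (name, species) VALUES ('USDA-MARC', 'pig')")
con.execute("INSERT INTO qtl (trait_id, map_id, chromosome, peak_cm, source) "
            "VALUES (1, 1, 'SSC7', 58.0, 'example study')")

# A simple meta-analysis-style query: QTL count per trait per chromosome.
rows = con.execute("""
    SELECT t.name, q.chromosome, COUNT(*) FROM qtl q
    JOIN trait t ON t.trait_id = q.trait_id
    GROUP BY t.name, q.chromosome
""").fetchall()
print(rows)  # [('backfat thickness', 'SSC7', 1)]
```

    Factoring shared entities out of the QTL table is what makes cross-study comparison queries like the one above straightforward.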

  15. Quantitative Survey and Structural Classification of Fracking Chemicals Reported in Unconventional Gas Exploitation

    NASA Astrophysics Data System (ADS)

    Elsner, Martin; Schreglmann, Kathrin

    2015-04-01

    Few technologies are being discussed in such controversial terms as hydraulic fracturing ("fracking") in the recovery of unconventional gas. Of particular concern are the chemicals that may return to the surface as a result of hydraulic fracturing. These are either "fracking chemicals" - chemicals that are injected together with the fracking fluid to optimize the fracturing performance - or geogenic substances which may turn up during gas production in the so-called produced water originating from the target formation. Knowledge about them is warranted for several reasons. (1) Monitoring. Air emissions are reported to arise from well drilling, the gas itself or condensate tanks. In addition, potential spills and accidents bear the danger of surface and shallow groundwater contaminations. Monitoring strategies are therefore warranted to screen for "indicator" substances of potential impacts. (2) Chemical Analysis. To meet these analytical demands, target substances must be defined so that adequate sampling approaches and analytical methods can be developed. (3) Transformation in the Subsurface. Identification and classification of fracking chemicals (aromatics vs. alcohols vs. acids, esters, etc.) is further important to assess the possibility of subsurface reactions which may potentially generate new, as yet unidentified transformation products. (4) Wastewater Treatment. For the same reason chemical knowledge is important for optimized wastewater treatment strategies. (5) Human and Ecosystem Health. Knowledge of the most frequent fracking chemicals is further essential for risk assessment (environmental behavior, toxicity). (6) Public Discussions. Finally, an overview of reported fracking chemicals can provide unbiased scientific input into current public debates and enable critical reviews of Green Chemistry approaches. Presently, however, such information is not readily available. We aim to close this knowledge gap by providing a quantitative overview of chemical

  16. Current State of Agile User-Centered Design: A Survey

    NASA Astrophysics Data System (ADS)

    Hussain, Zahid; Slany, Wolfgang; Holzinger, Andreas

    Agile software development methods are quite popular nowadays and are being adopted at an increasing rate in the industry every year. However, these methods still lack usability awareness in their development lifecycle, and the integration of usability/User-Centered Design (UCD) into agile methods is not adequately addressed. This paper presents the preliminary results of a recently conducted online survey regarding the current state of the integration of agile methods and usability/UCD. Worldwide, 92 practitioners responded. The results show that the majority of practitioners perceive that the integration of agile methods with usability/UCD has added value to their adopted processes and to their teams; has resulted in the improvement of usability and quality of the product developed; and has increased the satisfaction of the end-users of the product developed. The most widely used HCI techniques are low-fidelity prototyping, conceptual designs, observational studies of users, usability expert evaluations, field studies, personas, rapid iterative testing, and laboratory usability testing.

  17. Mechanistic and quantitative insight into cell surface targeted molecular imaging agent design

    NASA Astrophysics Data System (ADS)

    Zhang, Liang; Bhatnagar, Sumit; Deschenes, Emily; Thurber, Greg M.

    2016-05-01

    Molecular imaging agent design involves simultaneously optimizing multiple probe properties. While several desired characteristics are straightforward, including high affinity and low non-specific background signal, in practice there are quantitative trade-offs between these properties. These include plasma clearance, where fast clearance lowers background signal but can reduce target uptake, and binding, where high affinity compounds sometimes suffer from lower stability or increased non-specific interactions. Further complicating probe development, many of the optimal parameters vary depending on both target tissue and imaging agent properties, making empirical approaches or previous experience difficult to translate. Here, we focus on low molecular weight compounds targeting extracellular receptors, which have some of the highest contrast values for imaging agents. We use a mechanistic approach to provide a quantitative framework for weighing trade-offs between molecules. Our results show that specific target uptake is well-described by quantitative simulations for a variety of targeting agents, whereas non-specific background signal is more difficult to predict. Two in vitro experimental methods for estimating background signal in vivo are compared – non-specific cellular uptake and plasma protein binding. Together, these data provide a quantitative method to guide probe design and focus animal work for more cost-effective and time-efficient development of molecular imaging agents.
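    The clearance trade-off described above can be illustrated with a one-compartment toy model (all rate constants and units arbitrary, not from the paper): faster plasma clearance lowers absolute target uptake but lowers background even more, raising contrast.

```python
import math

def contrast(clearance_k, t_image, k_uptake=0.1, c0=1.0):
    """One-compartment sketch (arbitrary units): plasma decays as
    c0*exp(-k*t); irreversibly bound target signal is k_uptake times the
    integrated plasma exposure up to t_image; background is the plasma
    concentration remaining at imaging time."""
    target = k_uptake * c0 * (1.0 - math.exp(-clearance_k * t_image)) / clearance_k
    background = c0 * math.exp(-clearance_k * t_image)
    return target, background

for k in (0.1, 0.5, 2.0):   # slow, medium, fast plasma clearance
    t, b = contrast(k, t_image=4.0)
    print(f"k={k}: target={t:.3f}, background={b:.4f}, target/background={t / b:.1f}")
```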

  18. Mechanistic and quantitative insight into cell surface targeted molecular imaging agent design

    PubMed Central

    Zhang, Liang; Bhatnagar, Sumit; Deschenes, Emily; Thurber, Greg M.

    2016-01-01

    Molecular imaging agent design involves simultaneously optimizing multiple probe properties. While several desired characteristics are straightforward, including high affinity and low non-specific background signal, in practice there are quantitative trade-offs between these properties. These include plasma clearance, where fast clearance lowers background signal but can reduce target uptake, and binding, where high affinity compounds sometimes suffer from lower stability or increased non-specific interactions. Further complicating probe development, many of the optimal parameters vary depending on both target tissue and imaging agent properties, making empirical approaches or previous experience difficult to translate. Here, we focus on low molecular weight compounds targeting extracellular receptors, which have some of the highest contrast values for imaging agents. We use a mechanistic approach to provide a quantitative framework for weighing trade-offs between molecules. Our results show that specific target uptake is well-described by quantitative simulations for a variety of targeting agents, whereas non-specific background signal is more difficult to predict. Two in vitro experimental methods for estimating background signal in vivo are compared – non-specific cellular uptake and plasma protein binding. Together, these data provide a quantitative method to guide probe design and focus animal work for more cost-effective and time-efficient development of molecular imaging agents. PMID:27147293

  19. Practical Tools for Designing and Weighting Survey Samples

    ERIC Educational Resources Information Center

    Valliant, Richard; Dever, Jill A.; Kreuter, Frauke

    2013-01-01

    Survey sampling is fundamentally an applied field. The goal in this book is to put an array of tools at the fingertips of practitioners by explaining approaches long used by survey statisticians, illustrating how existing software can be used to solve survey problems, and developing some specialized software where needed. This book serves at least…

  20. A custom-built PET phantom design for quantitative imaging of printed distributions

    NASA Astrophysics Data System (ADS)

    Markiewicz, P. J.; Angelis, G. I.; Kotasidis, F.; Green, M.; Lionheart, W. R.; Reader, A. J.; Matthews, J. C.

    2011-11-01

    This note presents a practical approach to a custom-made design of PET phantoms enabling the use of digital radioactive distributions with high quantitative accuracy and spatial resolution. The phantom design allows planar sources of any radioactivity distribution to be imaged in transaxial and axial (sagittal or coronal) planes. Although the design presented here is specially adapted to the high-resolution research tomograph (HRRT), the presented methods can be adapted to almost any PET scanner. Although the presented phantom design has many advantages, a number of practical issues had to be overcome, such as positioning of the printed source, calibration, and the uniformity and reproducibility of printing. A well counter (WC) was used in the calibration procedure to find the nonlinear relationship between digital voxel intensities and the actual measured radioactive concentrations. Repeated printing together with WC measurements and computed radiography (CR) using phosphor imaging plates (IP) were used to evaluate the reproducibility and uniformity of such printing. Results show satisfactory printing uniformity and reproducibility; however, the calibration depends on the printing mode and the physical state of the cartridge. As a demonstration of the utility of using printed phantoms, the image resolution and quantitative accuracy of reconstructed HRRT images are assessed. There is very good quantitative agreement in the calibration procedure between HRRT, CR and WC measurements. However, the high resolution of CR and its quantitative accuracy supported by WC measurements made it possible to show the degraded resolution of HRRT brain images caused by the partial-volume effect and the limits of iterative image reconstruction.
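    The calibration step, fitting a nonlinear curve that maps printed voxel intensity to well-counter activity concentration, might look like the following sketch; the data points and the quadratic form are illustrative only, not values from the note.

```python
import numpy as np

# Invented (intensity, activity) pairs standing in for printed-source voxel
# values and well-counter measurements; a quadratic captures the nonlinearity.
intensity = np.array([0.0, 50.0, 100.0, 150.0, 200.0, 255.0])  # voxel values
activity = np.array([0.0, 1.1, 2.6, 4.6, 7.2, 10.4])           # kBq/ml

coeffs = np.polyfit(intensity, activity, deg=2)   # least-squares quadratic fit
calibrate = np.poly1d(coeffs)
print(f"predicted activity at intensity 120: {calibrate(120):.2f} kBq/ml")
```

    Inverting such a fitted curve tells the printing pipeline which voxel intensity to emit for a desired activity concentration.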

  1. Influenza knowledge, attitude, and behavior survey for grade school students: design and novel assessment methodology.

    PubMed

    Koep, Tyler H; Huskins, W Charles; Clemens, Christal; Jenkins, Sarah; Pierret, Chris; Ekker, Stephen C; Enders, Felicity T

    2014-12-01

    Despite the fact that infectious diseases can spread readily in grade schools, few studies have explored prevention in this setting. Additionally, we lack valid tools for students to self-report knowledge, attitudes, and behaviors. As part of an ongoing study of a curriculum intervention to promote healthy behaviors, we developed and evaluated age-appropriate surveys to determine students' understanding of influenza prevention. Surveys were adapted from adolescent and adult influenza surveys and administered to students in grades 2-5 (ages 7-11) at two Rochester public schools. We assessed student understanding by analyzing percent repeatability of 20 survey questions and compared percent "don't know" (DK) responses across grades, gender, and race. Questions thought to be ambiguous after early survey administration were investigated in student focus groups, modified as appropriate, and reassessed. The response rate across all surveys was >87%. Survey questions were well understood; 16 of 20 questions demonstrated strong pre/post repeatability (>70%). Only 1 question showed an increase in DK response for higher grades (p < .0001). Statistical analysis and qualitative feedback led to the modification of 3 survey questions and improved measures of understanding in the final survey administration. Grade-school students' knowledge, attitudes and behavior toward influenza prevention can be assessed using surveys. Quantitative and qualitative analysis may be used to assess participant understanding and refine survey development for pediatric survey instruments. These methods may be used to assess the repeatability and validity of surveys to assess the impact of health education interventions in young children.
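    Percent repeatability as used above is simple percent agreement between repeated administrations of a question. A minimal sketch with invented responses:

```python
def percent_repeatability(pre, post):
    """Simple percent agreement between two administrations of a question."""
    matches = sum(1 for a, b in zip(pre, post) if a == b)
    return 100.0 * matches / len(pre)

# Invented pre/post responses for one question ("dk" = "don't know").
pre  = ["yes", "yes", "no", "dk", "yes", "no", "yes", "yes", "no", "yes"]
post = ["yes", "no",  "no", "dk", "yes", "no", "yes", "dk",  "no", "yes"]
print(f"repeatability: {percent_repeatability(pre, post):.0f}%")  # 80%
```

    Under the study's >70% threshold, this hypothetical question would count as well understood.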

  2. Design of primers and probes for quantitative real-time PCR methods.

    PubMed

    Rodríguez, Alicia; Rodríguez, Mar; Córdoba, Juan J; Andrade, María J

    2015-01-01

    Design of primers and probes is one of the most crucial factors affecting the success and quality of quantitative real-time PCR (qPCR) analyses, since an accurate and reliable quantification depends on using efficient primers and probes. Design of primers and probes should meet several criteria to find potential primers and probes for specific qPCR assays. The formation of primer-dimers and other non-specific products should be avoided or reduced. This factor is especially important when designing primers for SYBR(®) Green protocols, but also when designing probes to ensure specificity of the developed qPCR protocol. To design primers and probes for qPCR, multiple software programs and websites are available, many of them free. These tools often apply the default requirements for primers and probes, although new research advances in primer and probe design should progressively be incorporated into their algorithms. After a proper design, a precise validation of the primers and probes is necessary. Specific considerations apply when designing primers and probes for multiplex qPCR and reverse transcription qPCR (RT-qPCR). This chapter provides guidelines for the design of suitable primers and probes and their subsequent validation through the development of singlex qPCR, multiplex qPCR, and RT-qPCR protocols.
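    A few of the standard screening criteria (GC content, a Wallace-rule melting-temperature estimate, avoidance of homopolymer runs) can be sketched as follows. The thresholds and example sequence are illustrative, and real design tools apply many more checks (dimer formation, specificity against the target genome, etc.).

```python
def gc_content(seq):
    """GC content of a DNA sequence, in percent."""
    return 100.0 * sum(seq.count(b) for b in "GC") / len(seq)

def wallace_tm(seq):
    """Wallace-rule melting temperature for short oligos:
    Tm = 2*(A+T) + 4*(G+C) degrees C."""
    at = sum(seq.count(b) for b in "AT")
    gc = sum(seq.count(b) for b in "GC")
    return 2 * at + 4 * gc

def passes_basic_criteria(seq, gc_range=(40, 60), tm_range=(50, 65)):
    """Hypothetical screen: GC content and Tm inside a window, and no run
    of four or more identical bases."""
    ok_gc = gc_range[0] <= gc_content(seq) <= gc_range[1]
    ok_tm = tm_range[0] <= wallace_tm(seq) <= tm_range[1]
    ok_runs = not any(b * 4 in seq for b in "ACGT")
    return ok_gc and ok_tm and ok_runs

primer = "AGCGTAGCTAGGACTTCAGC"  # invented 20-mer
print(gc_content(primer), wallace_tm(primer), passes_basic_criteria(primer))
```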

  3. Textile materials for the design of wearable antennas: a survey.

    PubMed

    Salvado, Rita; Loss, Caroline; Gonçalves, Ricardo; Pinho, Pedro

    2012-11-15

    In the broad context of Wireless Body Sensor Networks for healthcare and pervasive applications, the design of wearable antennas offers the possibility of ubiquitous monitoring, communication and energy harvesting and storage. Specific requirements for wearable antennas are a planar structure and flexible construction materials. Several properties of the materials influence the behaviour of the antenna. For instance, the bandwidth and the efficiency of a planar microstrip antenna are mainly determined by the permittivity and the thickness of the substrate. The use of textiles in wearable antennas requires the characterization of their properties. Specific electrical conductive textiles are available on the market and have been successfully used. Ordinary textile fabrics have been used as substrates. However, little information can be found on the electromagnetic properties of regular textiles. Therefore this paper is mainly focused on the analysis of the dielectric properties of normal fabrics. In general, textiles present a very low dielectric constant that reduces the surface wave losses and increases the impedance bandwidth of the antenna. However, textile materials are constantly exchanging water molecules with the surroundings, which affects their electromagnetic properties. In addition, textile fabrics are porous, anisotropic and compressible materials whose thickness and density might change with low pressures. Therefore it is important to know how these characteristics influence the behaviour of the antenna in order to minimize unwanted effects. This paper presents a survey of the key points for the design and development of textile antennas, from the choice of the textile materials to the framing of the antenna. An analysis of the textile materials that have been used is also presented.

  4. Textile Materials for the Design of Wearable Antennas: A Survey

    PubMed Central

    Salvado, Rita; Loss, Caroline; Gonçalves, Ricardo; Pinho, Pedro

    2012-01-01

    In the broad context of Wireless Body Sensor Networks for healthcare and pervasive applications, the design of wearable antennas offers the possibility of ubiquitous monitoring, communication and energy harvesting and storage. Specific requirements for wearable antennas are a planar structure and flexible construction materials. Several properties of the materials influence the behaviour of the antenna. For instance, the bandwidth and the efficiency of a planar microstrip antenna are mainly determined by the permittivity and the thickness of the substrate. The use of textiles in wearable antennas requires the characterization of their properties. Specific electrical conductive textiles are available on the market and have been successfully used. Ordinary textile fabrics have been used as substrates. However, little information can be found on the electromagnetic properties of regular textiles. Therefore this paper is mainly focused on the analysis of the dielectric properties of normal fabrics. In general, textiles present a very low dielectric constant that reduces the surface wave losses and increases the impedance bandwidth of the antenna. However, textile materials are constantly exchanging water molecules with the surroundings, which affects their electromagnetic properties. In addition, textile fabrics are porous, anisotropic and compressible materials whose thickness and density might change with low pressures. Therefore it is important to know how these characteristics influence the behaviour of the antenna in order to minimize unwanted effects. This paper presents a survey of the key points for the design and development of textile antennas, from the choice of the textile materials to the framing of the antenna. An analysis of the textile materials that have been used is also presented. PMID:23202235

  5. Ten Years of LibQual: A Study of Qualitative and Quantitative Survey Results at the University of Mississippi 2001-2010

    ERIC Educational Resources Information Center

    Greenwood, Judy T.; Watson, Alex P.; Dennis, Melissa

    2011-01-01

    This article analyzes quantitative adequacy gap scores and coded qualitative comments from LibQual surveys at the University of Mississippi from 2001 to 2010, looking for relationships between library policy changes and LibQual results, as well as any other trends that emerged. Analysis found no relationship between changes in policy and survey results…

  6. Trajectory Design for the Transiting Exoplanet Survey Satellite

    NASA Technical Reports Server (NTRS)

    Dichmann, Donald J.; Parker, Joel J. K.; Williams, Trevor W.; Mendelsohn, Chad R.

    2014-01-01

    The Transiting Exoplanet Survey Satellite (TESS) is a National Aeronautics and Space Administration (NASA) mission, scheduled to be launched in 2017. TESS will travel in a highly eccentric orbit around Earth, with initial perigee radius near 17 Earth radii (Re) and apogee radius near 59 Re. The orbit period is near 2:1 resonance with the Moon, with apogee nearly 90 degrees out-of-phase with the Moon, in a configuration that has been shown to be operationally stable. TESS will execute phasing loops followed by a lunar flyby, with a final maneuver to achieve 2:1 resonance with the Moon. The goals of a resonant orbit with long-term stability, short eclipses and limited oscillations of perigee present significant challenges to the trajectory design. To rapidly assess launch opportunities, we adapted the Schematics Window Methodology (SWM76) launch window analysis tool to assess the TESS mission constraints. To understand the long-term dynamics of such a resonant orbit in the Earth-Moon system we employed Dynamical Systems Theory in the Circular Restricted 3-Body Problem (CR3BP). For precise trajectory analysis we use a high-fidelity model and multiple shooting in the General Mission Analysis Tool (GMAT) to optimize the maneuver delta-V and meet mission constraints. Finally we describe how the techniques we have developed can be applied to missions with similar requirements. Keywords: resonant orbit, stability, lunar flyby, phasing loops, trajectory optimization

  7. Trajectory Design for the Transiting Exoplanet Survey Satellite (TESS)

    NASA Technical Reports Server (NTRS)

    Dichmann, Donald J.; Parker, Joel; Williams, Trevor; Mendelsohn, Chad

    2014-01-01

    The Transiting Exoplanet Survey Satellite (TESS) is a National Aeronautics and Space Administration (NASA) mission launching in 2017. TESS will travel in a highly eccentric orbit around Earth, with initial perigee radius near 17 Earth radii (Re) and apogee radius near 59 Re. The orbit period is near 2:1 resonance with the Moon, with apogee nearly 90 degrees out-of-phase with the Moon, in a configuration that has been shown to be operationally stable. TESS will execute phasing loops followed by a lunar flyby, with a final maneuver to achieve 2:1 resonance with the Moon. The goals of a resonant orbit with long-term stability, short eclipses and limited oscillations of perigee present significant challenges to the trajectory design. To rapidly assess launch opportunities, we adapted the SWM76 launch window tool to assess the TESS mission constraints. To understand the long-term dynamics of such a resonant orbit in the Earth-Moon system we employed Dynamical Systems Theory in the Circular Restricted 3-Body Problem (CR3BP). For precise trajectory analysis we use a high-fidelity model and multiple shooting in the General Mission Analysis Tool (GMAT) to optimize the maneuver delta-V and meet mission constraints. Finally we describe how the techniques we have developed can be applied to missions with similar requirements.
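    The 2:1 lunar resonance described in the abstract can be checked directly with Kepler's third law. A minimal sketch using the quoted perigee and apogee radii (the gravitational parameter and sidereal month are standard constants, not values from the paper):

```python
from math import pi, sqrt

MU_EARTH = 398_600.4418            # Earth GM, km^3/s^2
R_EARTH = 6_378.137                # Earth equatorial radius, km
SIDEREAL_MONTH_S = 27.321661 * 86_400.0

# Orbit with perigee near 17 Re and apogee near 59 Re, as in the abstract
a = (17 + 59) / 2 * R_EARTH                  # semi-major axis, km
period_s = 2 * pi * sqrt(a**3 / MU_EARTH)    # Kepler's third law
resonance = SIDEREAL_MONTH_S / period_s      # ~2 for a 2:1 lunar resonance
```

    The orbit period comes out near 13.7 days, close to half the Moon's sidereal period, consistent with the 2:1 resonance the mission targets.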

  8. National Aquatic Resource Surveys: Integration of Geospatial Data in Their Survey Design and Analysis

    EPA Science Inventory

    The National Aquatic Resource Surveys (NARS) are a series of four statistical surveys conducted by the U.S. Environmental Protection Agency working in collaboration with states, tribal nations and other federal agencies. The surveys are conducted for lakes and reservoirs, streams...

  9. A robust rotorcraft flight control system design methodology utilizing quantitative feedback theory

    NASA Technical Reports Server (NTRS)

    Gorder, Peter James

    1993-01-01

    Rotorcraft flight control systems present design challenges which often exceed those associated with fixed-wing aircraft. First, large variations in the response characteristics of the rotorcraft result from the wide range of airspeeds of typical operation (hover to over 100 kts). Second, the assumption of vehicle rigidity often employed in the design of fixed-wing flight control systems is rarely justified in rotorcraft, where rotor degrees of freedom can have a significant impact on system performance and stability. This research was intended to develop a methodology for the design of robust rotorcraft flight control systems. Quantitative Feedback Theory (QFT) was chosen as the basis for the investigation. Quantitative Feedback Theory is a technique which accounts for variability in the dynamic response of the controlled element in the design of robust control systems. It was developed to address a Multiple-Input Single-Output (MISO) design problem, and utilizes two degrees of freedom to satisfy the design criteria. Two techniques were examined for extending the QFT MISO technique to the design of a Multiple-Input Multiple-Output (MIMO) flight control system (FCS) for a UH-60 Black Hawk helicopter. In the first, a set of MISO systems, mathematically equivalent to the MIMO system, was determined, and QFT was applied to each member of the set simultaneously. In the second, the same set of equivalent MISO systems was analyzed sequentially, with closed-loop response information from each loop utilized in subsequent MISO designs. The results of each technique were compared, and the advantages of the second, termed Sequential Loop Closure, were clearly evident.

  10. A Novel Simulation Technician Laboratory Design: Results of a Survey-Based Study

    PubMed Central

    Hughes, Patrick G; Friedl, Ed; Ortiz Figueroa, Fabiana; Cepeda Brito, Jose R; Frey, Jennifer; Birmingham, Lauren E; Atkinson, Steven Scott

    2016-01-01

    Objective: The purpose of this study was to elicit feedback from simulation technicians prior to developing the first simulation technician-specific simulation laboratory in Akron, OH. Background: Simulation technicians serve a vital role in simulation centers within hospitals/health centers around the world. The first simulation technician degree program in the US has been approved in Akron, OH. To satisfy the requirements of this program and to meet the needs of this special audience of learners, a customized simulation lab is essential. Method: A web-based survey was circulated to simulation technicians prior to completion of the lab for the new program. The survey consisted of questions aimed at identifying structural and functional design elements of a novel simulation center for the training of simulation technicians. Quantitative methods were utilized to analyze data. Results: Over 90% of technicians (n=65) think that a lab designed explicitly for the training of technicians is novel and beneficial. Approximately 75% of respondents think that the space provided appropriate audiovisual (AV) infrastructure and space to evaluate the ability of technicians to be independent. The respondents think that the lab needed more storage space, visualization space for a large number of students, and more space in the technical/repair area. Conclusions: A space designed for the training of simulation technicians was considered to be beneficial. This laboratory requires distinct space for technical repair, adequate bench space for the maintenance and repair of simulators, an appropriate AV infrastructure, and space to evaluate the ability of technicians to be independent. PMID:27096134

  11. Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method, 4th Edition

    ERIC Educational Resources Information Center

    Dillman, Don A.; Smyth, Jolene D.; Christian, Leah Melani

    2014-01-01

    For over two decades, Dillman's classic text on survey design has aided both students and professionals in effectively planning and conducting mail, telephone, and, more recently, Internet surveys. The new edition is thoroughly updated and revised, and covers all aspects of survey research. It features expanded coverage of mobile phones, tablets,…

  12. Quantitative Hydrogeological Framework Interpretations from Modeling Helicopter Electromagnetic Survey Data, Nebraska Panhandle

    NASA Astrophysics Data System (ADS)

    Abraham, J. D.; Ball, L. B.; Bedrosian, P. A.; Cannia, J. C.; Deszcz-Pan, M.; Minsley, B. J.; Peterson, S. M.; Smith, B. D.

    2009-12-01

    The need for allocation and management of water resources within the state of Nebraska has created a demand for innovative approaches to data collection for the development of hydrogeologic frameworks to be used in 2D and 3D groundwater models. In 2008, the USGS, in cooperation with the North Platte Natural Resources District, the South Platte Natural Resources District, and the University of Nebraska Conservation and Survey Division, began using frequency-domain helicopter electromagnetic (HEM) surveys to map selected sections of the Nebraska Panhandle. The surveys took place in selected sections of the North Platte River valley, Lodgepole Creek, and portions of the adjacent tablelands. The objective of the surveys is to map the aquifers of the area to improve understanding of groundwater-surface water relationships and develop better hydrogeologic frameworks for making more accurate 3D groundwater models of the area. For the HEM method to have an impact in a groundwater model at the basin scale, hydrostratigraphic units need to have detectable physical property (electrical resistivity) contrasts. When these contrasts exist within the study area and are detectable from an airborne platform, large areas can be surveyed to rapidly generate 2D and 3D maps and models of hydrogeologic features. To make the geophysical data useful to multidimensional groundwater models, numerical inversion is necessary to produce a depth-dependent physical property data set reflecting hydrogeologic features. These maps and depth images of electrical resistivity are not, in themselves, useful to the hydrogeologist; they need to be turned into maps and depth images of hydrostratigraphic units and hydrogeologic features. Through a process of numerical imaging, inversion, sensitivity analysis, geological ground truthing (boreholes), and geological interpretation, hydrogeologic features are characterized. Resistivity depth sections produced from this process are used to pick
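    The link between HEM operating frequency and mapping depth follows the standard plane-wave skin-depth rule. A hedged sketch (the formula is textbook electromagnetics, and the resistivity and frequencies below are illustrative, not values from this survey):

```python
from math import sqrt

def skin_depth_m(resistivity_ohm_m, freq_hz):
    """Plane-wave skin depth, delta ~ 503 * sqrt(rho / f);
    a rough guide to frequency-domain EM depth of investigation."""
    return 503.3 * sqrt(resistivity_ohm_m / freq_hz)

# e.g. 50 ohm-m alluvium at a high and a low HEM frequency
shallow = skin_depth_m(50.0, 100_000.0)  # high frequency -> shallow sensitivity
deep = skin_depth_m(50.0, 400.0)         # low frequency -> deeper sensitivity
```

    Flying several frequencies at once is what lets a single HEM survey resolve resistivity as a function of depth, which the inversion then turns into hydrostratigraphic structure.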

  13. A survey report for the design of biped locomotion robot: The WL-12 (Waseda Leg-12)

    NASA Astrophysics Data System (ADS)

    Takanishi, Atsuo; Kato, Ichiro; Kume, Etsuo

    1991-11-01

    A mechanical design study of biped locomotion robots is going on at JAERI within the scope of the Human Acts Simulation Program (HASP). The design study at JAERI is of an arbitrarily mobile robot for inspection of nuclear facilities. A survey has been performed for collecting useful information from already existing biped locomotion robots. This is a survey report of the biped locomotion robot: the WL-12 designed and developed at Waseda University. This report includes the mechanical model and control system designs.

  14. Rotorcraft flight control design using quantitative feedback theory and dynamic crossfeeds

    NASA Technical Reports Server (NTRS)

    Cheng, Rendy P.

    1995-01-01

    A multi-input, multi-output control design with robust crossfeeds is presented for a rotorcraft in near-hovering flight using Quantitative Feedback Theory (QFT). Decoupling criteria are developed for dynamic crossfeed design and implementation. Frequency-dependent performance metrics focusing on piloted flight are developed and tested on 23 flight configurations. The metrics show that the resulting design is superior to alternative control system designs using conventional fixed-gain crossfeeds and to feedback-only designs which rely on high gains to suppress undesired off-axis responses. The use of dynamic, robust crossfeeds prior to the QFT design reduces the magnitude of the required feedback gain and results in performance that meets current handling qualities specifications for the decoupling of off-axis responses. The combined effect of the QFT feedback design following the implementation of low-order dynamic crossfeed compensators successfully decouples ten of twelve off-axis channels. For the other two channels it was not possible to find a single low-order crossfeed that was effective.

  15. Rotorcraft control system design for uncertain vehicle dynamics using quantitative feedback theory

    NASA Technical Reports Server (NTRS)

    Hess, R. A.

    1994-01-01

    Quantitative Feedback Theory describes a frequency-domain technique for the design of multi-input, multi-output control systems which must meet time or frequency domain performance criteria when specified uncertainty exists in the linear description of the vehicle dynamics. This theory is applied to the design of the longitudinal flight control system for a linear model of the BO-105C rotorcraft. Uncertainty in the vehicle model is due to the variation in the vehicle dynamics over a range of airspeeds from 0-100 kts. For purposes of exposition, the vehicle description contains no rotor or actuator dynamics. The design example indicates the manner in which significant uncertainty exists in the vehicle model. The advantage of using a sequential loop closure technique to reduce the cost of feedback is demonstrated by example.
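    QFT's central step, checking a closed-loop bound against every plant in an uncertainty template rather than just the nominal case, can be sketched with a toy loop. The transfer function and numbers below are illustrative and unrelated to the BO-105C model in the report:

```python
def sensitivity(k, a, w):
    """|S(jw)| = |1 / (1 + L(jw))| for a toy loop L(s) = k / (s (s + a))."""
    s = 1j * w
    L = k / (s * (s + a))
    return abs(1 / (1 + L))

# Plant template: loop gain and pole location vary over the flight
# envelope (illustrative numbers only).
template = [(k, a) for k in (5.0, 10.0, 20.0) for a in (0.5, 1.0, 2.0)]

# QFT-style check: a low-frequency performance bound must hold for
# EVERY plant case in the template, not just the nominal one.
worst = max(sensitivity(k, a, 0.1) for k, a in template)
```

    Designing against the worst case over the template is what gives QFT its robustness guarantee; sequential loop closure then reuses closed-loop information from earlier loops to reduce the feedback gain each later loop must supply.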

  16. The Importance of Adhering to Details of the Total Design Method (TDM) for Mail Surveys.

    ERIC Educational Resources Information Center

    Dillman, Don A.; And Others

    1984-01-01

    The empirical effects of adherence to details of the Total Design Method (TDM) approach to the design of mail surveys are discussed, based on the implementation of a common survey in 11 different states. The results suggest that greater adherence results in higher response rates, especially in the later stages of the TDM. (BW)

  17. Quantitative differential geomorphology of the Monterey Canyon from time-separated multibeam surveys

    NASA Astrophysics Data System (ADS)

    Taramelli, A.; Zucca, F.; Innocenti, C.; Sorichetta, A.; Seeber, L.

    2008-12-01

    Changes of bathymetry derived from multibeam sonars are useful for quantifying the effects of many sedimentary and tectonic processes. The assessment of resolution limits is an essential component of the analysis. This research compares submarine morphologies as they manifest tectonics on a rapidly deforming transform continental margin (Monterey Bay, California). We study modern submarine processes through geomorphic change using high-resolution multibeam bathymetry. We first used different techniques that quantify uncertainties and reveal the spatial variations of errors. A sub-area of immobile seafloor in the study area, mapped by the high-resolution multibeam record of the seafloor of the MBR collected by MBARI in each survey over a four-year period (spring 2003 to winter 2006), provides a common 'benchmark'. Each survey dataset over the benchmark is filtered with a simple moving-average window, and depth differences between the two surveys are collated to derive a difference histogram. The procedure is repeated using different length-scales of filtering. By plotting the variability of the differences versus the length-scale of the filter, the different effects of spatially uncorrelated and correlated noise can be deduced. In addition, a variography analysis is conducted on the dataset built by differencing the benchmark surveys, to highlight spatial structures and anisotropies of the measurement errors. Data analysis of the Monterey Bay area indicates that the canyon floor contains an axial channel laterally bounded by elevated complex terrace surfaces. Asymmetrical megaripples dominate the active part of the canyon floor, indicating sediment transport. Terraces represent evidence of recent degradation of the canyon floor. Slump scars and gullies of various sizes shape the canyon walls. Significant changes over the analyzed period include: (a) complete reorganization of the megaripples on the channel floor, (b) a local slump scar on the head of the canyon and on
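    The benchmark procedure described in the abstract (filter both surveys, difference them, and repeat at several filter lengths) can be sketched in miniature; synthetic 1D depth profiles with independent noise stand in for the real bathymetry grids:

```python
import random
random.seed(42)

def smooth(xs, w):
    """Simple moving average with (odd) window length w."""
    h = w // 2
    return [sum(xs[max(0, i - h):i + h + 1]) / len(xs[max(0, i - h):i + h + 1])
            for i in range(len(xs))]

def spread(xs):
    """Standard deviation of the difference 'histogram'."""
    m = sum(xs) / len(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

# Two 'surveys' of the same immobile seafloor profile, each with
# independent (spatially uncorrelated) sounding noise.
truth = [100 + 0.01 * i for i in range(2000)]
s1 = [z + random.gauss(0, 0.3) for z in truth]
s2 = [z + random.gauss(0, 0.3) for z in truth]

# The spread of the difference shrinks as the filter length grows --
# the signature of spatially uncorrelated noise described above.
spreads = {w: spread([a - b for a, b in zip(smooth(s1, w), smooth(s2, w))])
           for w in (1, 5, 15)}
```

    Spatially correlated errors, by contrast, would not average away with longer filters, which is how plotting spread against filter length separates the two noise types.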

  18. SAS procedures for designing and analyzing sample surveys

    USGS Publications Warehouse

    Stafford, Joshua D.; Reinecke, Kenneth J.; Kaminski, Richard M.

    2003-01-01

    Complex surveys often are necessary to estimate occurrence (or distribution), density, and abundance of plants and animals for purposes of research and conservation. Most scientists are familiar with simple random sampling, where sample units are selected from a population of interest (sampling frame) with equal probability. However, the goal of ecological surveys often is to make inferences about populations over large or complex spatial areas where organisms are not homogeneously distributed or sampling frames are inconvenient or impossible to construct. Candidate sampling strategies for such complex surveys include stratified, multistage, and adaptive sampling (Thompson 1992, Buckland 1994).
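    The stratified designs mentioned above combine per-stratum sample means with stratum weights. A minimal sketch in Python rather than SAS (the two-stratum waterfowl-density numbers are hypothetical, not from the paper):

```python
def stratified_mean(strata):
    """Stratified estimate of the population mean:
    ybar_st = sum_h (N_h / N) * ybar_h.
    strata: list of (stratum_size N_h, list of sampled values)."""
    n_total = sum(n for n, _ in strata)
    return sum(n / n_total * (sum(ys) / len(ys)) for n, ys in strata)

# Hypothetical density survey: a small, dense wetland stratum and a
# large, sparse upland stratum (illustrative numbers only).
est = stratified_mean([
    (200, [12.0, 14.0, 10.0]),   # wetland: N_h = 200, mean 12
    (800, [1.0, 3.0, 2.0]),      # upland:  N_h = 800, mean 2
])
```

    Here the estimate is 0.2 * 12 + 0.8 * 2 = 4.0; weighting by stratum size is what keeps the heavily sampled wetland from biasing the overall density upward.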

  19. Measuring access to medicines: a review of quantitative methods used in household surveys

    PubMed Central

    2010-01-01

    Background: Medicine access is an important goal of medicine policy; however, the evaluation of medicine access is a subject under conceptual and methodological development. The aim of this study was to describe quantitative methodologies for measuring medicine access at the household level, with access expressed as paid or unpaid medicine acquisition. Methods: Searches were carried out in electronic databases and health institutional sites, within references from retrieved papers, and by contacting authors. Results: Nine papers were located. The methodologies of the studies presented differences in the recall period, recruitment of subjects and characterization of medicine access. Conclusions: The standardization of medicine access indicators and the definition of appropriate recall periods are required to evaluate different medicines and access dimensions, improving comparability between studies. In addition, specific keywords must be established to allow future literature reviews on this topic. PMID:20509960

  20. Survey of Quantitative Research Metrics to Assess Pilot Performance in Upset Recovery

    NASA Technical Reports Server (NTRS)

    Le Vie, Lisa R.

    2016-01-01

    Accidents attributable to in-flight loss of control are the primary cause of fatal commercial jet accidents worldwide. The National Aeronautics and Space Administration (NASA) conducted a literature review to determine and identify the quantitative standards for assessing upset recovery performance. This review contains current recovery procedures for both military and commercial aviation and includes the metrics researchers use to assess aircraft recovery performance. Metrics include time to first input, recognition time, and recovery time, as well as whether that input was correct or incorrect. Other metrics included are: the state of the autopilot and autothrottle, control wheel/sidestick movement resulting in pitch and roll, and inputs to the throttle and rudder. In addition, airplane state measures, such as roll reversals, altitude loss/gain, maximum vertical speed, maximum/minimum air speed, maximum bank angle and maximum g loading, are reviewed as well.
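    Several of the reviewed metrics (time to first input, recovery time, maximum bank angle) reduce to simple scans over a recorded time history. A hedged sketch with a toy trace; the thresholds and the trace itself are invented for illustration, not taken from the review:

```python
def recovery_metrics(t, bank_deg, input_mag, upset_t,
                     level_tol=5.0, input_tol=0.01):
    """Time to first pilot input after the upset, recovery time (first
    return of bank angle to within level_tol of wings-level after that
    input), and maximum bank angle over the trace."""
    first_input = next(ti for ti, u in zip(t, input_mag)
                       if ti >= upset_t and abs(u) > input_tol)
    recovery = next(ti for ti, b in zip(t, bank_deg)
                    if ti >= first_input and abs(b) <= level_tol)
    return first_input - upset_t, recovery - upset_t, max(abs(b) for b in bank_deg)

# Toy trace: upset at t=1 s, first stick input at t=2 s, wings level by t=6 s
t = [i * 0.5 for i in range(17)]  # 0 .. 8 s at 2 Hz
bank = [0, 0, 20, 40, 60, 60, 55, 45, 35, 25, 15, 8, 4, 2, 1, 0, 0]
stick = [0, 0, 0, 0, 0.5, 0.5, 0.5, 0.4, 0.3, 0.2, 0.1, 0.05, 0, 0, 0, 0, 0]
t_input, t_recover, max_bank = recovery_metrics(t, bank, stick, upset_t=1.0)
```

    Whether the first input was also the *correct* input is a separate judgment the review discusses; it cannot be read off the magnitudes alone.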

  1. The health effects of climate change: a survey of recent quantitative research.

    PubMed

    Grasso, Margherita; Manera, Matteo; Chiabai, Aline; Markandya, Anil

    2012-05-01

    In recent years there has been a large scientific and public debate on climate change and its direct as well as indirect effects on human health. In particular, a large amount of research on the effects of climate changes on human health has addressed two fundamental questions. First, can historical data be of some help in revealing how short-run or long-run climate variations affect the occurrence of infectious diseases? Second, is it possible to build more accurate quantitative models which are capable of predicting the future effects of different climate conditions on the transmissibility of particularly dangerous infectious diseases? The primary goal of this paper is to review the most relevant contributions which have directly tackled those questions, both with respect to the effects of climate changes on the diffusion of non-infectious and infectious diseases, with malaria as a case study. Specific attention will be drawn on the methodological aspects of each study, which will be classified according to the type of quantitative model considered, namely time series models, panel data and spatial models, and non-statistical approaches. Since many different disciplines and approaches are involved, a broader view is necessary in order to provide a better understanding of the interactions between climate and health. In this respect, our paper also presents a critical summary of the recent literature related to more general aspects of the impacts of climate changes on human health, such as: the economics of climate change; how to manage the health effects of climate change; the establishment of Early Warning Systems for infectious diseases.

  2. The Health Effects of Climate Change: A Survey of Recent Quantitative Research

    PubMed Central

    Grasso, Margherita; Manera, Matteo; Chiabai, Aline; Markandya, Anil

    2012-01-01

    In recent years there has been a large scientific and public debate on climate change and its direct as well as indirect effects on human health. In particular, a large amount of research on the effects of climate changes on human health has addressed two fundamental questions. First, can historical data be of some help in revealing how short-run or long-run climate variations affect the occurrence of infectious diseases? Second, is it possible to build more accurate quantitative models which are capable of predicting the future effects of different climate conditions on the transmissibility of particularly dangerous infectious diseases? The primary goal of this paper is to review the most relevant contributions which have directly tackled those questions, both with respect to the effects of climate changes on the diffusion of non-infectious and infectious diseases, with malaria as a case study. Specific attention will be drawn on the methodological aspects of each study, which will be classified according to the type of quantitative model considered, namely time series models, panel data and spatial models, and non-statistical approaches. Since many different disciplines and approaches are involved, a broader view is necessary in order to provide a better understanding of the interactions between climate and health. In this respect, our paper also presents a critical summary of the recent literature related to more general aspects of the impacts of climate changes on human health, such as: the economics of climate change; how to manage the health effects of climate change; the establishment of Early Warning Systems for infectious diseases. PMID:22754455

  3. Sample size and optimal sample design in tuberculosis surveys

    PubMed Central

    Sánchez-Crespo, J. L.

    1967-01-01

    Tuberculosis surveys sponsored by the World Health Organization have been carried out in different communities during the last few years. Apart from the main epidemiological findings, these surveys have provided basic statistical data for use in the planning of future investigations. In this paper an attempt is made to determine the sample size desirable in future surveys that include one of the following examinations: tuberculin test, direct microscopy, and X-ray examination. The optimum cluster sizes are found to be 100-150 children under 5 years of age in the tuberculin test, at least 200 eligible persons in the examination for excretors of tubercle bacilli (direct microscopy) and at least 500 eligible persons in the examination for persons with radiological evidence of pulmonary tuberculosis (X-ray). Modifications of the optimum sample size in combined surveys are discussed. PMID:5300008
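    The cluster sizes above trade precision against field logistics; the underlying prevalence sample-size arithmetic, inflated by a design effect for cluster sampling, is standard. A sketch with illustrative inputs (the prevalence, precision, and design-effect values are not figures from this paper):

```python
from math import ceil

def cluster_sample_size(p, d, deff=2.0, z=1.96):
    """Persons to examine to estimate prevalence p with absolute
    precision +/- d at ~95% confidence, inflated by a design effect
    (deff) to account for within-cluster correlation."""
    return ceil(deff * z**2 * p * (1 - p) / d**2)

# e.g. expected prevalence 5%, desired precision +/- 1 percentage point
n = cluster_sample_size(0.05, 0.01)
```

    The design effect is the key lever: the same precision under cluster sampling needs roughly deff times the simple-random-sample size, which is why optimum cluster sizes matter.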

  4. Quantitatively structural control of the karst based on speleological cave survey data: Cabeza Llerosos massif (Picos de Europa, Spain)

    NASA Astrophysics Data System (ADS)

    Ballesteros, D.; Jiménez-Sánchez, M.; García-Sansegundo, J.; Borreguero, M.; Sendra, G.

    2012-04-01

    Speleological cave surveying characterizes each cave passage by a 3D line (called a survey shot) defined by its length, direction and dip. These lines represent the three-dimensional geometry of the karst system at the cave-passage scale and can be statistically analyzed and compared with the geometry of the massif discontinuities. The aim of this work is to establish the quantitative influence of structural geology on caves, based on the comparison of cave survey data with joint and bedding measurements using stereographic projection. 15 km of cave surveys from the Cabeza Llerosos massif (Picos de Europa, Northern Spain) were chosen to illustrate the method. The length of the cavities ranges from 50 to 4,438 m and their depth is up to 738 m. The methodology of the work includes: 1) cave survey collection from caving reports; 2) geological mapping and cross-sections with projection of cavities; 3) data collection of bedding and joints in caves and nearby outcrops; 4) definition of families of joints and bedding planes by stereographic projection; 5) definition of groups of cave passages from stereographic projection (based on their directions and dips); and 6) comparison between bedding, families of joints and cave survey data by stereographic projection. Seven families of joints have been defined across the study area. The joint families are: J1) sub-vertical, J2) N63/68SE, J3) N29E/46NW, J4) N52E/72NW, J5) N129E/17NE, J6) N167E/57NE and J7) N180E/26E; the bedding is N30-55/60-80NE. Five groups of cave passages have been defined. The "A" group of cave passages is formed by sub-vertical series; it represents 61% of all the cave passages and is conditioned by joint families J1, J3, J4 and J6, as well as their intersections. The "B" group is formed by N10W-N10E/3-20N galleries; it corresponds to 13% of the series and is controlled by the intersection between families J5 and J6. The "C" group is defined by N20-70E/0-50NE passages; it represents 13% of the
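    Step 6 of the methodology, comparing survey-shot directions with joint families, amounts to testing whether each passage line lies close to a candidate joint plane, i.e. is nearly perpendicular to that plane's pole. A hedged geometric sketch (the tolerance and example orientations are illustrative, not measurements from the paper):

```python
from math import sin, cos, radians, fabs

def trend_plunge_to_vec(trend_deg, plunge_deg):
    """Unit vector (east, north, up) of a line; plunge measured downward
    from horizontal, trend clockwise from north."""
    t, p = radians(trend_deg), radians(plunge_deg)
    return (sin(t) * cos(p), cos(t) * cos(p), -sin(p))

def plane_pole(dip_dir_deg, dip_deg):
    """Pole (normal) of a plane dipping dip_deg toward dip_dir_deg:
    a line plunging (90 - dip) toward the opposite azimuth."""
    return trend_plunge_to_vec(dip_dir_deg + 180.0, 90.0 - dip_deg)

def lies_in_plane(shot_vec, pole, tol_deg=10.0):
    """A survey shot is consistent with a joint family if it is nearly
    perpendicular to the family's pole (dot product near zero)."""
    dot = sum(a * b for a, b in zip(shot_vec, pole))
    return fabs(dot) < sin(radians(tol_deg))

# A horizontal passage trending due north vs a vertical N-S joint set
shot = trend_plunge_to_vec(0.0, 0.0)   # passage along strike
pole = plane_pole(90.0, 90.0)          # vertical plane dipping east
```

    Applied over thousands of shots, a check like this yields the percentages per group that the abstract reports.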

  5. Determination of quantitative trait variants by concordance via application of the a posteriori granddaughter design to the U.S. Holstein population

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Experimental designs that exploit family information can provide substantial predictive power in quantitative trait variant discovery projects. Concordance between quantitative trait locus genotype as determined by the a posteriori granddaughter design and marker genotype was determined for 29 trai...

  6. Translating HIV sequences into quantitative fitness landscapes predicts viral vulnerabilities for rational immunogen design.

    PubMed

    Ferguson, Andrew L; Mann, Jaclyn K; Omarjee, Saleha; Ndung'u, Thumbi; Walker, Bruce D; Chakraborty, Arup K

    2013-03-21

    A prophylactic or therapeutic vaccine offers the best hope to curb the HIV-AIDS epidemic gripping sub-Saharan Africa, but it remains elusive. A major challenge is the extreme viral sequence variability among strains. Systematic means to guide immunogen design for highly variable pathogens like HIV are not available. Using computational models, we have developed an approach to translate available viral sequence data into quantitative landscapes of viral fitness as a function of the amino acid sequences of its constituent proteins. Predictions emerging from our computationally defined landscapes for the proteins of HIV-1 clade B Gag were positively tested against new in vitro fitness measurements and were consistent with previously defined in vitro measurements and clinical observations. These landscapes chart the peaks and valleys of viral fitness as protein sequences change and inform the design of immunogens and therapies that can target regions of the virus most vulnerable to selection pressure.

  7. Translating HIV sequences into quantitative fitness landscapes predicts viral vulnerabilities for rational immunogen design

    PubMed Central

    Ferguson, Andrew L.; Mann, Jaclyn K.; Omarjee, Saleha; Ndung’u, Thumbi; Walker, Bruce D.; Chakraborty, Arup K.

    2013-01-01

    Summary: A prophylactic or therapeutic vaccine offers the best hope to curb the HIV-AIDS epidemic gripping sub-Saharan Africa, but remains elusive. A major challenge is the extreme viral sequence variability among strains. Systematic means to guide immunogen design for highly variable pathogens like HIV are not available. Using computational models, we have developed an approach to translate available viral sequence data into quantitative landscapes of viral fitness as a function of the amino acid sequences of its constituent proteins. Predictions emerging from our computationally defined landscapes for the proteins of HIV-1 clade B Gag were positively tested against new in vitro fitness measurements, and were consistent with previously defined in vitro measurements and clinical observations. These landscapes chart the peaks and valleys of viral fitness as protein sequences change, and inform the design of immunogens and therapies that can target regions of the virus most vulnerable to selection pressure. PMID:23521886
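    A fitness landscape of this kind assigns each sequence an energy from inferred fields and couplings, with fitness falling as energy grows. A toy sketch of that bookkeeping (the spin-model form echoes the paper's approach, but every parameter value here is invented):

```python
import math

def landscape_fitness(seq, fields, couplings):
    """Toy landscape: fitness ~ exp(-E) with
    E = sum_i h_i(s_i) + sum_(i<j) J_ij(s_i, s_j).
    fields: per-site penalty dicts; couplings: pairwise penalty tables."""
    e = sum(fields[i].get(aa, 0.0) for i, aa in enumerate(seq))
    for (i, j), table in couplings.items():
        e += table.get((seq[i], seq[j]), 0.0)
    return math.exp(-e)

# Invented 3-site example: mutations cost fitness, but one pair of
# mutations is partially compensatory (negative coupling).
fields = [{'A': 0.0, 'V': 1.0}, {'L': 0.0, 'M': 0.5}, {'K': 0.0, 'R': 0.2}]
couplings = {(0, 1): {('V', 'M'): -0.8}}

wild_type = landscape_fitness('ALK', fields, couplings)
single_mut = landscape_fitness('VLK', fields, couplings)
double_mut = landscape_fitness('VMK', fields, couplings)
```

    The compensatory coupling makes the double mutant fitter than the single mutant while both remain below wild type, the kind of valley-and-ridge structure such landscapes expose for immunogen targeting.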

  8. Spacecraft drag-free attitude control system design with Quantitative Feedback Theory

    NASA Astrophysics Data System (ADS)

    Wu, Shu-Fan; Fertin, Denis

    2008-06-01

    One of the key technologies to be demonstrated on board the LISA Pathfinder spacecraft (S/C) is the drag-free attitude control system (DFACS), which aims to control the S/C attitude and the relative motion of the S/C test masses with a precision of the order of a nanometer. This paper explores how the controllers could be designed and tuned with Quantitative Feedback Theory (QFT). After a summary of the plant dynamics and the control strategy using input decoupling, the various performance specifications are presented and transformed into a set of design criteria expressed as constraints on the sensitivity and complementary sensitivity transfer functions of each individual control axis. The QFT technique is then used for designing and tuning the controllers, in particular to perform the trade-off between performance and stability and to use the available design margins in the drag-free controllers to meet different performance specifications. Both frequency-domain analysis and time-domain simulation results are presented to evaluate the performance of controllers designed for different purposes.

  9. Edesign: Primer and Enhanced Internal Probe Design Tool for Quantitative PCR Experiments and Genotyping Assays.

    PubMed

    Kimura, Yasumasa; Soma, Takahiro; Kasahara, Naoko; Delobel, Diane; Hanami, Takeshi; Tanaka, Yuki; de Hoon, Michiel J L; Hayashizaki, Yoshihide; Usui, Kengo; Harbers, Matthias

    2016-01-01

    Analytical PCR experiments preferably use internal probes for monitoring the amplification reaction and specific detection of the amplicon. Such internal probes have to be designed in close context with the amplification primers, and may require additional considerations for the detection of genetic variations. Here we describe Edesign, a new online and stand-alone tool for designing sets of PCR primers together with an internal probe for conducting quantitative real-time PCR (qPCR) and genotyping experiments. Edesign can be used for selecting standard DNA oligonucleotides, such as TaqMan probes, but has been further extended with new functions and enhanced design features for Eprobes. Eprobes, with their single thiazole orange-labelled nucleotide, allow for highly sensitive genotyping assays because of their higher DNA binding affinity compared to standard DNA oligonucleotides. Using new thermodynamic parameters, Edesign considers unique features of Eprobes during primer and probe design for establishing qPCR experiments and genotyping by melting curve analysis. Additional functions in Edesign allow probe design for effective discrimination between wild-type sequences and genetic variations, using either standard DNA oligonucleotides or Eprobes. Edesign can be freely accessed online at http://www.dnaform.com/edesign2/, and the source code is available for download.
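    Primer screening of the kind Edesign automates combines melting-temperature and composition filters. A deliberately crude sketch using the Wallace rule; Edesign itself uses nearest-neighbor thermodynamics with Eprobe-specific parameters, and the candidate sequences and cutoffs below are arbitrary:

```python
def wallace_tm(primer):
    """Wallace rule: Tm = 2(A+T) + 4(G+C) degrees C; a rough screen only,
    valid mainly for short oligos."""
    p = primer.upper()
    return 2 * (p.count('A') + p.count('T')) + 4 * (p.count('G') + p.count('C'))

def gc_content(primer):
    p = primer.upper()
    return (p.count('G') + p.count('C')) / len(p)

# Hypothetical candidates filtered by Tm and GC-content windows
candidates = ['ATGCATGCATGCATGCAT', 'GCGCGCATATATGCGCAT']
picked = [c for c in candidates
          if 50 <= wallace_tm(c) <= 65 and 0.4 <= gc_content(c) <= 0.6]
```

    Real primer/probe design adds salt correction, dimer and hairpin checks, and (for genotyping) the Tm contrast between matched and mismatched targets, which is where Eprobe thermodynamics matter.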

  10. Applications of numerical optimization methods to helicopter design problems: A survey

    NASA Technical Reports Server (NTRS)

    Miura, H.

    1984-01-01

    A survey is presented of applications of mathematical programming methods to improve the design of helicopters and their components. Applications of multivariable search techniques in finite-dimensional space are considered. Five categories of helicopter design problems are addressed: (1) conceptual and preliminary design, (2) rotor-system design, (3) airframe structures design, (4) control system design, and (5) flight trajectory planning. Key technical progress in numerical optimization methods relevant to rotorcraft applications is summarized.

  11. Applications of numerical optimization methods to helicopter design problems - A survey

    NASA Technical Reports Server (NTRS)

    Miura, H.

    1984-01-01

    A survey is presented of applications of mathematical programming methods to improve the design of helicopters and their components. Applications of multivariable search techniques in finite-dimensional space are considered. Five categories of helicopter design problems are addressed: (1) conceptual and preliminary design, (2) rotor-system design, (3) airframe structures design, (4) control system design, and (5) flight trajectory planning. Key technical progress in numerical optimization methods relevant to rotorcraft applications is summarized.

  12. Applications of numerical optimization methods to helicopter design problems - A survey

    NASA Technical Reports Server (NTRS)

    Miura, H.

    1985-01-01

    A survey is presented of applications of mathematical programming methods to improve the design of helicopters and their components. Applications of multivariable search techniques in finite-dimensional space are considered. Five categories of helicopter design problems are addressed: (1) conceptual and preliminary design, (2) rotor-system design, (3) airframe structures design, (4) control system design, and (5) flight trajectory planning. Key technical progress in numerical optimization methods relevant to rotorcraft applications is summarized.

  13. SKA weak lensing- II. Simulated performance and survey design considerations

    NASA Astrophysics Data System (ADS)

    Bonaldi, Anna; Harrison, Ian; Camera, Stefano; Brown, Michael L.

    2016-12-01

    We construct a pipeline for simulating weak lensing cosmology surveys with the Square Kilometre Array (SKA), taking as inputs telescope sensitivity curves; correlated source flux, size and redshift distributions; a simple ionospheric model; and source redshift and ellipticity measurement errors. We then use this simulation pipeline to optimize a 2-yr weak lensing survey performed with the first deployment of the SKA (SKA1). Our assessments are based on the total signal to noise of the recovered shear power spectra, a metric that we find to correlate very well with a standard dark energy figure of merit. We first consider the choice of frequency band, trading off increases in number counts at lower frequencies against poorer resolution; our analysis strongly prefers the higher frequency Band 2 (950-1760 MHz) channel of the SKA-MID telescope to the lower frequency Band 1 (350-1050 MHz). Best results would be obtained by allowing the centre of Band 2 to shift towards lower frequency, around 1.1 GHz. We then move on to consider survey size, finding that an area of 5000 deg2 is optimal for most SKA1 instrumental configurations. Finally, we forecast the performance of a weak lensing survey with the second deployment of the SKA. The increased survey size (3π steradians) and sensitivity improve both the signal to noise and the dark energy metrics by two orders of magnitude.

  14. Joint NRC/EPA Sewage Sludge Radiological Survey: Survey Design & Test Site Results

    EPA Pesticide Factsheets

    This report contains the results of a radiological survey of nine publicly owned treatment works (POTWs) around the country, commissioned by the Sewage Sludge Subcommittee to determine whether and to what extent radionuclides concentrate in sewage treatment wastes.

  15. DoD Survey of Officers and Enlisted Personnel: Survey Design and Administrative Procedures (1978).

    DTIC Science & Technology

    1980-04-01

    installation-specific quota samples, with definite rules for substitution of respondents. Sample selection, notification, and all record keeping is at the...consideration of respondent attention span, procedures to maximize response rates, and costs of administration. Since both of these surveys are important data...this particular survey, the following technique for administration is being offered to commanders Army-wide: a. Administer the survey through your

  16. Hydrological drought types in cold climates: quantitative analysis of causing factors and qualitative survey of impacts

    NASA Astrophysics Data System (ADS)

    Van Loon, A. F.; Ploum, S. W.; Parajka, J.; Fleig, A. K.; Garnier, E.; Laaha, G.; Van Lanen, H. A. J.

    2015-04-01

    For drought management and prediction, knowledge of causing factors and socio-economic impacts of hydrological droughts is crucial. Propagation of meteorological conditions in the hydrological cycle results in different hydrological drought types that require separate analysis. In addition to the existing hydrological drought typology, we here define two new drought types related to snow and ice. A snowmelt drought is a deficiency in the snowmelt discharge peak in spring in snow-influenced basins and a glaciermelt drought is a deficiency in the glaciermelt discharge peak in summer in glacierised basins. In 21 catchments in Austria and Norway we studied the meteorological conditions in the seasons preceding and at the time of snowmelt and glaciermelt drought events. Snowmelt droughts in Norway were mainly controlled by below-average winter precipitation, while in Austria both temperature and precipitation played a role. For glaciermelt droughts, the effect of below-average summer air temperature was dominant, both in Austria and Norway. Subsequently, we investigated the impacts of temperature-related drought types (i.e. snowmelt and glaciermelt drought, but also cold and warm snow season drought and rain-to-snow-season drought). In historical archives and drought databases for the US and Europe many impacts were found that can be attributed to these temperature-related hydrological drought types, mainly in the agriculture and electricity production (hydropower) sectors. However, drawing conclusions on the frequency of occurrence of different drought types from reported impacts is difficult, mainly because of reporting biases and the inevitably limited spatial and temporal scales of the information. Finally, this study shows that complete integration of quantitative analysis of causing factors and qualitative analysis of impacts of temperature-related droughts is not yet possible. Analysis of selected events, however, points out that it can be a promising research

  17. THE OPTICALLY UNBIASED GAMMA-RAY BURST HOST (TOUGH) SURVEY. I. SURVEY DESIGN AND CATALOGS

    SciTech Connect

    Hjorth, Jens; Malesani, Daniele; Fynbo, Johan P. U.; Kruehler, Thomas; Milvang-Jensen, Bo; Watson, Darach; Jakobsson, Pall; Schulze, Steve; Jaunsen, Andreas O.; Gorosabel, Javier; Levan, Andrew J.; Michalowski, Michal J.; Moller, Palle; Tanvir, Nial R.

    2012-09-10

    Long-duration gamma-ray bursts (GRBs) are powerful tracers of star-forming galaxies. We have defined a homogeneous subsample of 69 Swift GRB-selected galaxies spanning a very wide redshift range. Special attention has been devoted to making the sample optically unbiased through simple and well-defined selection criteria based on the high-energy properties of the bursts and their positions on the sky. Thanks to our extensive follow-up observations, this sample has now achieved a comparatively high degree of redshift completeness, and thus provides a legacy sample, useful for statistical studies of GRBs and their host galaxies. In this paper, we present the survey design and summarize the results of our observing program conducted at the ESO Very Large Telescope (VLT) aimed at obtaining the most basic properties of galaxies in this sample, including a catalog of R and K_s magnitudes and redshifts. We detect the host galaxies for 80% of the GRBs in the sample, although only 42% have K_s-band detections, which confirms that GRB-selected host galaxies are generally blue. The sample is not uniformly blue, however, with two extremely red objects detected. Moreover, galaxies hosting GRBs with no optical/NIR afterglows, whose identification therefore relies on X-ray localizations, are significantly brighter and redder than those with an optical/NIR afterglow. This supports a scenario where GRBs occurring in more massive and dusty galaxies frequently suffer high optical obscuration. Our spectroscopic campaign has resulted in 77% now having redshift measurements, with a median redshift of 2.14 ± 0.18. TOUGH alone includes 17 detected z > 2 Swift GRB host galaxies suitable for individual and statistical studies, a substantial increase over previous samples. Seven hosts have detections of the Lyα emission line and we can exclude an early indication that Lyα emission is ubiquitous among GRB hosts, but confirm that Lyα is stronger in GRB

  18. Study the multi-band co-caliber infrared system optimize design and quantitative measurement

    NASA Astrophysics Data System (ADS)

    Guo, Ju guang; Ma, Yong hui; Yang, Zhi hui

    2016-10-01

    The main optical system of the multi-band common-aperture (co-caliber) infrared system is designed as a Cassegrain telescope whose primary mirror (PM) and secondary mirror (SM) are aspheric, with an all-reflective front stage followed by a refractive lens group. Radiation from the target reaches the primary mirror and is reflected onto the secondary mirror; the spectral radiometric flux is then split, with each spectral band reflected to its own infrared focal plane array (IRFPA) detector for imaging. The signals received by the IRFPAs are photoelectrically converted, then processed and displayed via a read-out integrated circuit (ROIC). Image quality in the different bands is verified during optimization of the system model. Based on the design specifications, a quantitative measurement program was established. The experimental measurement results show that the optimized design of the optical system is valid.

  19. Ergonomic Based Design and Survey of Elementary School Furniture

    ERIC Educational Resources Information Center

    Maheshwar; Jawalkar, Chandrashekhar S.

    2014-01-01

    This paper presents the ergonomic aspects in designing and prototyping of desks cum chairs used in elementary schools. The procedures adopted for the assessment included: the study of existing school furniture, design analysis and development of prototypes. The design approach proposed a series of adjustable desks and chairs developed in terms of…

  20. Using design effects from previous cluster surveys to guide sample size calculation in emergency settings.

    PubMed

    Kaiser, Reinhard; Woodruff, Bradley A; Bilukha, Oleg; Spiegel, Paul B; Salama, Peter

    2006-06-01

    A good estimate of the design effect is critical for calculating the most efficient sample size for cluster surveys. We reviewed the design effects for seven nutrition and health outcomes from nine population-based cluster surveys conducted in emergency settings. Most of the design effects for outcomes in children, and one-half of the design effects for crude mortality, were below two. A reassessment of mortality data from Kosovo and Badghis, Afghanistan revealed that, given the same number of clusters, changing sample size had a relatively small impact on the precision of the estimate of mortality. We concluded that, in most surveys, assuming a design effect of 1.5 for acute malnutrition in children and two or less for crude mortality would produce a more efficient sample size. In addition, enhancing the sample size in cluster surveys without increasing the number of clusters may not result in substantial improvements in precision.
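
    The sample-size arithmetic behind this recommendation is straightforward: the simple-random-sample size is inflated by the assumed design effect. A sketch in Python (the prevalence, margin of error, and confidence level below are illustrative inputs, not the paper's data):

```python
import math

def required_sample_size(p, margin, deff, z=1.96):
    # Standard cluster-survey formula: the simple-random-sample size
    # for estimating a proportion p to within +/- margin at ~95%
    # confidence, inflated by the assumed design effect (DEFF).
    n_srs = (z ** 2) * p * (1 - p) / margin ** 2
    return math.ceil(n_srs * deff)

# Illustrative comparison: assuming a DEFF of 1.5 (as the authors
# suggest for acute malnutrition) versus the conventional 2.0.
print(required_sample_size(0.10, 0.03, 1.5))
print(required_sample_size(0.10, 0.03, 2.0))
```

    The gap between the two outputs is the survey effort saved by using an empirically grounded design effect instead of a conservative default.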

  1. The JCMT Gould Belt Survey: a quantitative comparison between SCUBA-2 data reduction methods

    NASA Astrophysics Data System (ADS)

    Mairs, S.; Johnstone, D.; Kirk, H.; Graves, S.; Buckle, J.; Beaulieu, S. F.; Berry, D. S.; Broekhoven-Fiene, H.; Currie, M. J.; Fich, M.; Hatchell, J.; Jenness, T.; Mottram, J. C.; Nutter, D.; Pattle, K.; Pineda, J. E.; Salji, C.; Francesco, J. Di; Hogerheijde, M. R.; Ward-Thompson, D.; JCMT Gould Belt survey Team

    2015-12-01

    Performing ground-based submillimetre observations is a difficult task as the measurements are subject to absorption and emission from water vapour in the Earth's atmosphere and time variation in weather and instrument stability. Removing these features and other artefacts from the data is a vital process which affects the characteristics of the recovered astronomical structure we seek to study. In this paper, we explore two data reduction methods for data taken with the Submillimetre Common-User Bolometer Array-2 (SCUBA-2) at the James Clerk Maxwell Telescope (JCMT). The JCMT Legacy Reduction 1 (JCMT LR1) and The Gould Belt Legacy Survey Legacy Release 1 (GBS LR1) reduction both use the same software (STARLINK) but differ in their choice of data reduction parameters. We find that the JCMT LR1 reduction is suitable for determining whether or not compact emission is present in a given region and the GBS LR1 reduction is tuned in a robust way to uncover more extended emission, which better serves more in-depth physical analyses of star-forming regions. Using the GBS LR1 method, we find that compact sources are recovered well, even at a peak brightness of only three times the noise, whereas the reconstruction of larger objects requires much care when drawing boundaries around the expected astronomical signal in the data reduction process. Incorrect boundaries can lead to false structure identification or it can cause structure to be missed. In the JCMT LR1 reduction, the extent of the true structure of objects larger than a point source is never fully recovered.

  2. Estimating effects of a single gene and polygenes on quantitative traits from a diallel design.

    PubMed

    Lou, Xiang-Yang; Yang, Mark C K

    2006-01-01

    A genetic model is developed with additive and dominance effects of a single gene and polygenes as well as general and specific reciprocal effects for the progeny from a diallel mating design. The methods of ANOVA, minimum norm quadratic unbiased estimation (MINQUE), restricted maximum likelihood estimation (REML), and maximum likelihood estimation (ML) are suggested for estimating variance components, and the methods of generalized least squares (GLS) and ordinary least squares (OLS) for fixed effects, while best linear unbiased prediction, linear unbiased prediction (LUP), and adjusted unbiased prediction are suggested for analyzing random effects. Monte Carlo simulations were conducted to evaluate the unbiasedness and efficiency of the statistical methods for two diallel designs with commonly used sample sizes, 6 and 8 parents, with no and with missing crosses, respectively. Simulation results show that GLS and OLS are almost equally efficient for estimation of fixed effects, while MINQUE (1) and REML are better estimators of the variance components and LUP is the most practical method for prediction of random effects. Data from a Drosophila melanogaster experiment (Gilbert 1985a, Theor Appl Genet 69:625-629) were used as a working example to demonstrate the statistical analysis. The new methodology is also applicable to screening candidate gene(s) and to other mating designs with multiple parents, such as nested (NC Design I) and factorial (NC Design II) designs. Moreover, this methodology can serve as a guide to develop new methods for detecting indiscernible major genes and mapping quantitative trait loci based on mixture distribution theory. The computer program for the methods suggested in this article is freely available from the authors.

  3. Quantitative imaging of the human upper airway: instrument design and clinical studies

    NASA Astrophysics Data System (ADS)

    Leigh, M. S.; Armstrong, J. J.; Paduch, A.; Sampson, D. D.; Walsh, J. H.; Hillman, D. R.; Eastwood, P. R.

    2006-08-01

    Imaging of the human upper airway is widely used in medicine, in both clinical practice and research. Common imaging modalities include video endoscopy, X-ray CT, and MRI. However, no current modality is both quantitative and safe to use for extended periods of time. Such a capability would be particularly valuable for sleep research, which is inherently reliant on long observation sessions. We have developed an instrument capable of quantitative imaging of the human upper airway, based on endoscopic optical coherence tomography. There are no dose limits for optical techniques, and the minimally invasive imaging probe is safe for use in overnight studies. We report on the design of the instrument and its use in preliminary clinical studies, and we present results from a range of initial experiments. The experiments show that the instrument is capable of imaging during sleep, and that it can record dynamic changes in airway size and shape. This information is useful for research into sleep disorders, and potentially for clinical diagnosis and therapies.

  4. How quantitative measures unravel design principles in multi-stage phosphorylation cascades.

    PubMed

    Frey, Simone; Millat, Thomas; Hohmann, Stefan; Wolkenhauer, Olaf

    2008-09-07

    We investigate design principles of linear multi-stage phosphorylation cascades by using quantitative measures for signaling time, signal duration and signal amplitude. We compare alternative pathway structures by varying the number of phosphorylations and the length of the cascade. We show that a model for a weakly activated pathway does not reflect the biological context well, unless it is restricted to certain parameter combinations. Focusing therefore on a more general model, we compare alternative structures with respect to a multivariate optimization criterion. We test the hypothesis that the structure of a linear multi-stage phosphorylation cascade is the result of an optimization process aiming for a fast response, defined by the minimum of the product of signaling time and signal duration. It is then shown that certain pathway structures minimize this criterion. Several popular models of MAPK cascades form the basis of our study. These models represent different levels of approximation, which we compare and discuss with respect to the quantitative measures.
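
    Signaling time, signal duration, and signal amplitude are commonly defined from the moments of a species' time course (following the moment-based definitions of Heinrich and co-workers, on which studies of this kind build). A rough numerical sketch, using an arbitrary example signal rather than any model from the paper:

```python
import math

def signal_measures(signal, dt):
    # Moment-based measures for a sampled time course X(t):
    # signaling time = mean time of activation, duration = spread
    # around it, amplitude = average height over the duration window.
    T0 = sum(x * dt for x in signal)                        # integral of X
    T1 = sum(i * dt * x * dt for i, x in enumerate(signal)) # integral of t*X
    T2 = sum((i * dt) ** 2 * x * dt for i, x in enumerate(signal))
    tau = T1 / T0                                # signaling time
    theta = math.sqrt(T2 / T0 - tau ** 2)        # signal duration
    amplitude = T0 / (2 * theta)                 # signal amplitude
    return tau, theta, amplitude

# Example time course X(t) = t * exp(-t), for which analytically
# tau = 2 and theta = sqrt(2).
dt = 0.001
signal = [t * math.exp(-t) for t in (i * dt for i in range(50000))]
tau, theta, amp = signal_measures(signal, dt)
print(round(tau, 3), round(theta, 3))
```

    With such measures in hand, alternative cascade structures can be ranked by criteria like the product of signaling time and signal duration used in the study.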

  5. Probability of detection of nests and implications for survey design

    USGS Publications Warehouse

    Smith, P.A.; Bart, J.; Lanctot, Richard B.; McCaffery, B.J.; Brown, S.

    2009-01-01

    Surveys based on double sampling include a correction for the probability of detection by assuming complete enumeration of birds in an intensively surveyed subsample of plots. To evaluate this assumption, we calculated the probability of detecting active shorebird nests by using information from observers who searched the same plots independently. Our results demonstrate that this probability varies substantially by species and stage of the nesting cycle but less by site or density of nests. Among the species we studied, the estimated single-visit probability of nest detection during the incubation period varied from 0.21 for the White-rumped Sandpiper (Calidris fuscicollis), the most difficult species to detect, to 0.64 for the Western Sandpiper (Calidris mauri), the most easily detected species, with a mean across species of 0.46. We used these detection probabilities to predict the fraction of persistent nests found over repeated nest searches. For a species with the mean value for detectability, the detection rate exceeded 0.85 after four visits. This level of nest detection was exceeded in only three visits for the Western Sandpiper, but six to nine visits were required for the White-rumped Sandpiper, depending on the type of survey employed. Our results suggest that the double-sampling method's requirement of nearly complete counts of birds in the intensively surveyed plots is likely to be met for birds with nests that survive over several visits of nest searching. Individuals with nests that fail quickly or individuals that do not breed can be detected with high probability only if territorial behavior is used to identify likely nesting pairs. © The Cooper Ornithological Society, 2009.
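
    Under the simplifying assumption of independent visits with a constant single-visit detection probability p, the cumulative detection rate after k visits is 1 - (1 - p)^k. A small sketch using the single-visit probabilities reported above (the paper's predictions also account for nest survival and survey type, so its visit counts can differ from this idealization):

```python
import math

def cumulative_detection(p, visits):
    # Probability a persistent nest is found at least once in `visits`
    # independent searches, each with single-visit detection probability p.
    return 1 - (1 - p) ** visits

def visits_needed(p, target=0.85):
    # Smallest number of visits for cumulative detection >= target.
    return math.ceil(math.log(1 - target) / math.log(1 - p))

# Single-visit probabilities from the abstract: species mean, Western
# Sandpiper (easiest), and White-rumped Sandpiper (hardest).
for p in (0.46, 0.64, 0.21):
    print(p, visits_needed(p), round(cumulative_detection(p, 4), 3))
```

    For the mean detectability of 0.46 this gives four visits to exceed 0.85, matching the abstract; for the hardest species the idealized count (nine) sits at the top of the six-to-nine range reported.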

  6. Survey design for lakes and reservoirs in the United States to assess contaminants in fish tissue

    EPA Science Inventory

    The National Lake Fish Tissue Study (NLFTS) was the first survey of fish contamination in lakes and reservoirs in the 48 conterminous states based on probability survey design. This study included the largest set (268) of persistent, bioaccumulative, and toxic (PBT) chemicals ev...

  7. The Laboratory Course Assessment Survey: A Tool to Measure Three Dimensions of Research-Course Design

    ERIC Educational Resources Information Center

    Corwin, Lisa A.; Runyon, Christopher; Robinson, Aspen; Dolan, Erin L.

    2015-01-01

    Course-based undergraduate research experiences (CUREs) are increasingly being offered as scalable ways to involve undergraduates in research. Yet few if any design features that make CUREs effective have been identified. We developed a 17-item survey instrument, the Laboratory Course Assessment Survey (LCAS), that measures students' perceptions…

  8. Controls design with crossfeeds for hovering rotorcraft using quantitative feedback theory

    NASA Technical Reports Server (NTRS)

    Tischler, Mark B.; Biezad, Daniel J.; Cheng, Rendy

    1996-01-01

    A multi-input, multi-output controls design with dynamic crossfeed pre-compensation is presented for rotorcraft in near-hovering flight using Quantitative Feedback Theory (QFT). The resulting closed-loop control system bandwidth allows the rotorcraft to be considered for use as an inflight simulator. The use of dynamic, robust crossfeeds prior to the QFT design reduces the magnitude of required feedback gain and results in performance that meets most handling qualities specifications relative to the decoupling of off-axis responses. Handling qualities are Level 1 for both low-gain tasks and high-gain tasks in the roll, pitch, and yaw axes except for the 10 deg/sec moderate-amplitude yaw command where the rotorcraft exhibits Level 2 handling qualities in the yaw axis caused by phase lag. The combined effect of the QFT feedback design following the implementation of low-order, dynamic crossfeed compensators successfully decouples ten of twelve off-axis channels. For the other two channels it was not possible to find a single, low-order crossfeed that was effective. This is an area to be investigated in future research.

  9. Quantitative determination of rarity of freshwater fishes and implications for imperiled-species designations.

    PubMed

    Pritt, Jeremy J; Frimpong, Emmanuel A

    2010-10-01

    Conserving rare species and protecting biodiversity and ecosystem functioning depends on sound information on the nature of rarity. Rarity is multidimensional and has a variety of definitions, which presents the need for a quantitative classification scheme with which to categorize species as rare or common. We constructed such a classification for North American freshwater fishes to better describe rarity in fishes and provide researchers and managers with a tool to streamline conservation efforts. We used data on range extents, habitat specificities, and local population sizes of North American freshwater fishes and a variety of quantitative methods and statistical decision criteria, including quantile regression and a cost-function algorithm to determine thresholds for categorizing a species as rare or common. Species fell into eight groups that conform to an established framework for rarity. Fishes listed by the American Fisheries Society (AFS) as endangered, threatened, or vulnerable were most often rare because their local population sizes were low, ranges were small, and they had specific habitat needs, in that order, whereas unlisted species were most often considered common on the basis of these three factors. Species with large ranges generally had few specific habitat needs, whereas those with small ranges tended to have narrow habitat specificities. We identified 30 species not designated as imperiled by AFS that were rare along all dimensions of rarity and may warrant further study or protection, and we found three designated species that were common along all dimensions and may require a review of their imperilment status. Our approach could be applied to other taxa to aid conservation decisions and serve as a useful tool for future revisions of listings of fish species.

  10. Targeting Urban Watershed Stressor Gradients: Stream Survey Design, Ecological Responses, and Implications of Land Cover Resolution

    EPA Science Inventory

    We conducted a stream survey in the Narragansett Bay Watershed designed to target a gradient of development intensity, and to examine how associated changes in nutrients, carbon, and stressors affect periphyton and macroinvertebrates. Concentrations of nutrients, cations, and ani...

  11. Estimating occupancy rates with imperfect detection under complex survey designs

    EPA Science Inventory

    Monitoring the occurrence of specific amphibian species is of interest. Typically, the monitoring design is a complex design that involves stratification and unequal probability of selection. When conducting field visits to selected sites, a common problem is that during a singl...

  12. Qualification of NCI-Designated Cancer Centers for Quantitative PET/CT Imaging in Clinical Trials.

    PubMed

    Scheuermann, Joshua S; Reddin, Janet S; Opanowski, Adam; Kinahan, Paul E; Siegel, Barry A; Shankar, Lalitha K; Karp, Joel S

    2017-03-02

    The National Cancer Institute (NCI) developed the Centers for Quantitative Imaging Excellence (CQIE) initiative in 2010 to pre-qualify imaging facilities at all of the NCI-designated Comprehensive and Clinical Cancer Centers for oncology trials using advanced imaging techniques, including positron emission tomography (PET). This paper reviews the CQIE PET/CT (computed tomography) scanner qualification process and results in detail. Methods: Over a period of approximately 5 years, sites were requested to submit a variety of phantom PET/CT images, including uniform and ACR (American College of Radiology) phantoms, as well as examples of clinical images. Submissions were divided into 3 distinct time points: an initial submission (T0) followed by two requalification submissions (T1 and T2). Images were analyzed using standardized procedures, and scanners received a pass or fail designation. Sites had the opportunity to submit new data for failed scanners. Quantitative results were compared across scanners within a given time point and across time points for a given scanner. Results: 65 unique PET/CT scanners across 42 sites were submitted for CQIE T0 qualification, with 64 passing qualification. 44 (68%) of the scanners from T0 had data submitted for T2. From T0 to T2 the percentage of scanners passing the CQIE qualification on the first attempt rose from 38% in T1 to 67% in T2. The most common reasons for failure were standardized uptake values (SUVs) out of specification, incomplete data submissions, and uniformity issues. Uniform phantom and ACR phantom results are similar between scanner manufacturers. Conclusion: The results of the CQIE process show that periodic requalification may decrease the frequency of deficient data submissions. The CQIE project also highlighted the concern within imaging facilities about the burden of maintaining different qualifications and accreditations. Finally, we note that for quantitative imaging-based trials the relationships between

  13. Effect of survey design and catch rate estimation on total catch estimates in Chinook salmon fisheries

    USGS Publications Warehouse

    McCormick, Joshua L.; Quist, Michael C.; Schill, Daniel J.

    2012-01-01

    Roving–roving and roving–access creel surveys are the primary techniques used to obtain information on harvest of Chinook salmon Oncorhynchus tshawytscha in Idaho sport fisheries. Once interviews are conducted using roving–roving or roving–access survey designs, mean catch rate can be estimated with the ratio-of-means (ROM) estimator, the mean-of-ratios (MOR) estimator, or the MOR estimator with exclusion of short-duration (≤0.5 h) trips. Our objective was to examine the relative bias and precision of total catch estimates obtained from use of the two survey designs and three catch rate estimators for Idaho Chinook salmon fisheries. Information on angling populations was obtained by direct visual observation of portions of Chinook salmon fisheries in three Idaho river systems over an 18-d period. Based on data from the angling populations, Monte Carlo simulations were performed to evaluate the properties of the catch rate estimators and survey designs. Among the three estimators, the ROM estimator provided the most accurate and precise estimates of mean catch rate and total catch for both roving–roving and roving–access surveys. On average, the root mean square error of simulated total catch estimates was 1.42 times greater and relative bias was 160.13 times greater for roving–roving surveys than for roving–access surveys. Length-of-stay bias and nonstationary catch rates in roving–roving surveys both appeared to affect catch rate and total catch estimates. Our results suggest that use of the ROM estimator in combination with an estimate of angler effort provided the least biased and most precise estimates of total catch for both survey designs. However, roving–access surveys were more accurate than roving–roving surveys for Chinook salmon fisheries in Idaho.
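
    The two catch-rate estimators compared above differ only in where the averaging happens: the ROM estimator divides total catch by total effort, while the MOR estimator averages each angler's individual rate. A minimal sketch with hypothetical interview data (not the Idaho survey data):

```python
def ratio_of_means(catches, hours):
    # ROM: total catch over total effort across interviewed anglers.
    return sum(catches) / sum(hours)

def mean_of_ratios(catches, hours, min_hours=0.0):
    # MOR: average of each angler's individual catch rate, optionally
    # excluding short-duration trips (e.g. those <= 0.5 h).
    rates = [c / h for c, h in zip(catches, hours) if h > min_hours]
    return sum(rates) / len(rates)

# Hypothetical interviews: (catch, hours fished) per completed trip.
catches = [0, 1, 2, 0, 3]
hours = [0.5, 2.0, 4.0, 1.0, 6.0]

rom = ratio_of_means(catches, hours)
mor = mean_of_ratios(catches, hours)
mor_excl = mean_of_ratios(catches, hours, min_hours=0.5)
print(round(rom, 3), round(mor, 3), round(mor_excl, 3))
```

    Multiplying the chosen rate estimate by an independent estimate of total angler effort then yields the total-catch estimate whose bias and precision the study evaluates.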

  14. Large Synoptic Survey Telescope: From Science Drivers to Reference Design

    DTIC Science & Technology

    2008-01-01

    such as Very Deep and Very Fast time domain surveys. We describe how the LSST science drivers led to these choices of system parameters. Key words...Sciences concluded that a dedicated ground-based wide-field imaging telescope with an effective aperture of 6–8 meters is a high priority for planetary ... parameters, as described in §2. A project status report and concluding remarks are presented in §3. For detailed and up-to-date information, please

  15. First National Survey of Lead and Allergens in Housing: survey design and methods for the allergen and endotoxin components.

    PubMed Central

    Vojta, Patrick J; Friedman, Warren; Marker, David A; Clickner, Robert; Rogers, John W; Viet, Susan M; Muilenberg, Michael L; Thorne, Peter S; Arbes, Samuel J; Zeldin, Darryl C

    2002-01-01

    From July 1998 to August 1999, the U.S. Department of Housing and Urban Development and the National Institute of Environmental Health Sciences conducted the first National Survey of Lead and Allergens in Housing. The purpose of the survey was to assess children's potential household exposure to lead, allergens, and bacterial endotoxins. We surveyed a sample of 831 homes, representing 96 million permanently occupied, noninstitutional housing units that permit resident children. We administered questionnaires to household members, made home observations, and took environmental samples. This article provides general background information on the survey, an overview of the survey design, and a description of the data collection and laboratory methods pertaining to the allergen and endotoxin components. We collected dust samples from a bed, the bedroom floor, a sofa or chair, the living room floor, the kitchen floor, and a basement floor and analyzed them for cockroach allergen Bla g 1, the dust mite allergens Der f 1 and Der p 1, the cat allergen Fel d 1, the dog allergen Can f 1, the rodent allergens Rat n 1 and mouse urinary protein, allergens of the fungus Alternaria alternata, and endotoxin. This article provides the essential context for subsequent reports that will describe the prevalence of allergens and endotoxin in U.S. households, their distribution by various housing characteristics, and their associations with allergic diseases such as asthma and rhinitis. PMID:12003758

  16. Temporally adaptive sampling: a case study in rare species survey design with marbled salamanders (Ambystoma opacum).

    PubMed

    Charney, Noah D; Kubel, Jacob E; Eiseman, Charles S

    2015-01-01

    Improving detection rates for elusive species with clumped distributions is often accomplished through adaptive sampling designs. This approach can be extended to include species with temporally variable detection probabilities. By concentrating survey effort in years when the focal species are most abundant or visible, overall detection rates can be improved. This requires either long-term monitoring at a few locations where the species are known to occur or models capable of predicting population trends using climatic and demographic data. For marbled salamanders (Ambystoma opacum) in Massachusetts, we demonstrate that annual variation in detection probability of larvae is regionally correlated. In our data, the difference in survey success between years was far more important than the difference among the three survey methods we employed: diurnal surveys, nocturnal surveys, and dipnet surveys. Based on these data, we simulate future surveys to locate unknown populations under a temporally adaptive sampling framework. In the simulations, when pond dynamics are correlated over the focal region, the temporally adaptive design improved mean survey success by as much as 26% over a non-adaptive sampling design. Employing a temporally adaptive strategy costs very little, is simple, and has the potential to substantially improve the efficient use of scarce conservation funds.
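    The trade-off described in this abstract can be sketched with a toy simulation: a fixed budget of pond visits is either spread evenly over good and bad years, or concentrated in years when detection probability is high. All numbers below (occupancy and detection probabilities, year pattern, visit budget) are illustrative assumptions, not the authors' fitted values.

```python
import random

random.seed(1)

P_OCC = 0.2                        # probability a visited pond is occupied (assumed)
P_DET = {"good": 0.8, "bad": 0.2}  # per-visit detection probability by year type (assumed)
YEARS = ["good", "bad"] * 5        # regionally correlated year quality over a decade
BUDGET = 50                        # total pond visits the budget allows

def mean_detections(adaptive, trials=5000):
    """Average number of detections under a given visit-allocation strategy."""
    good_years = [y for y in YEARS if y == "good"]
    total = 0
    for _ in range(trials):
        # Adaptive: spend all visits in good years; non-adaptive: spread evenly.
        pool = good_years if adaptive else YEARS
        schedule = [pool[i % len(pool)] for i in range(BUDGET)]
        total += sum(random.random() < P_OCC * P_DET[y] for y in schedule)
    return total / trials

base, adap = mean_detections(False), mean_detections(True)
print(f"mean detections: non-adaptive {base:.1f}, adaptive {adap:.1f}")
```

    Under these assumed numbers, concentrating effort in good years raises the expected per-visit success rate from 0.10 to 0.16, the same qualitative effect as the up-to-26% improvement reported above.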

  17. Temporally Adaptive Sampling: A Case Study in Rare Species Survey Design with Marbled Salamanders (Ambystoma opacum)

    PubMed Central

    Charney, Noah D.; Kubel, Jacob E.; Eiseman, Charles S.

    2015-01-01

    Improving detection rates for elusive species with clumped distributions is often accomplished through adaptive sampling designs. This approach can be extended to include species with temporally variable detection probabilities. By concentrating survey effort in years when the focal species are most abundant or visible, overall detection rates can be improved. This requires either long-term monitoring at a few locations where the species are known to occur or models capable of predicting population trends using climatic and demographic data. For marbled salamanders (Ambystoma opacum) in Massachusetts, we demonstrate that annual variation in detection probability of larvae is regionally correlated. In our data, the difference in survey success between years was far more important than the difference among the three survey methods we employed: diurnal surveys, nocturnal surveys, and dipnet surveys. Based on these data, we simulate future surveys to locate unknown populations under a temporally adaptive sampling framework. In the simulations, when pond dynamics are correlated over the focal region, the temporally adaptive design improved mean survey success by as much as 26% over a non-adaptive sampling design. Employing a temporally adaptive strategy costs very little, is simple, and has the potential to substantially improve the efficient use of scarce conservation funds. PMID:25799224

  18. THE HETDEX PILOT SURVEY. I. SURVEY DESIGN, PERFORMANCE, AND CATALOG OF EMISSION-LINE GALAXIES

    SciTech Connect

    Adams, Joshua J.; Blanc, Guillermo A.; Gebhardt, Karl; Hao, Lei; Byun, Joyce; Fry, Alex; Jeong, Donghui; Komatsu, Eiichiro; Hill, Gary J.; Cornell, Mark E.; MacQueen, Phillip J.; Drory, Niv; Bender, Ralf; Hopp, Ulrich; Kelzenberg, Ralf; Ciardullo, Robin; Gronwall, Caryl; Finkelstein, Steven L.; Gawiser, Eric; Kelz, Andreas

    2011-01-15

    We present a catalog of emission-line galaxies selected solely by their emission-line fluxes using a wide-field integral field spectrograph. This work is partially motivated as a pilot survey for the upcoming Hobby-Eberly Telescope Dark Energy Experiment. We describe the observations, reductions, detections, redshift classifications, line fluxes, and counterpart information for 397 emission-line galaxies detected over 169 arcmin² with a 3500-5800 Å bandpass under 5 Å full-width-half-maximum (FWHM) spectral resolution. The survey's best sensitivity for unresolved objects under photometric conditions is between 4 and 20 × 10⁻¹⁷ erg s⁻¹ cm⁻² depending on the wavelength, and Lyα luminosities between 3 × 10⁴² and 6 × 10⁴² erg s⁻¹ are detectable. This survey method complements narrowband and color-selection techniques in the search for high-redshift galaxies through its different selection properties and the large volume probed. The four survey fields within the COSMOS, GOODS-N, MUNICS, and XMM-LSS areas are rich with existing, complementary data. We find 105 galaxies via their high-redshift Lyα emission at 1.9 < z < 3.8, and most of the remaining objects are low-redshift [O II] λ3727 emitters at z < 0.56. The classification between low- and high-redshift objects depends on rest-frame equivalent width (EW), as well as other indicators, where available. Based on matches to X-ray catalogs, the active galactic nuclei fraction among the Lyα emitters is 6%. We also analyze the survey's completeness and contamination properties through simulations. We find five high-z, highly significant, resolved objects with FWHM sizes >44 arcsec² which appear to be extended Lyα nebulae. We also find three high-z objects with rest-frame Lyα EW above the level believed to be achievable with normal star formation, EW₀ > 240 Å. Future papers will investigate the physical properties of this sample.

  19. Rational design of surface/interface chemistry for quantitative in vivo monitoring of brain chemistry.

    PubMed

    Zhang, Meining; Yu, Ping; Mao, Lanqun

    2012-04-17

    To understand the molecular basis of brain functions, researchers would like to be able to quantitatively monitor the levels of neurochemicals in the extracellular fluid in vivo. However, the chemical and physiological complexity of the central nervous system (CNS) presents challenges for the development of these analytical methods. This Account describes the rational design and careful construction of electrodes and nanoparticles with specific surface/interface chemistry for quantitative in vivo monitoring of brain chemistry. We used the redox nature of neurochemicals at the electrode/electrolyte interface to establish a basis for monitoring specific neurochemicals. Carbon nanotubes provide an electrode/electrolyte interface for the selective oxidation of ascorbate, and we have developed both in vivo voltammetry and an online electrochemical detecting system for continuously monitoring this molecule in the CNS. Although Ca(2+) and Mg(2+) are involved in a number of neurochemical signaling processes, they are still difficult to detect in the CNS. These divalent cations can enhance electrocatalytic oxidation of NADH at an electrode modified with toluidine blue O. We used this property to develop online electrochemical detection systems for simultaneous measurements of Ca(2+) and Mg(2+) and for continuous selective monitoring of Mg(2+) in the CNS. We have also harnessed biological schemes for neurosensing in the brain to design other monitoring systems. By taking advantage of the distinct reaction properties of dopamine (DA), we have developed a nonoxidative mechanism for DA sensing and a system that can potentially be used for continuous sensing of DA release. Using "artificial peroxidase" (Prussian blue) to replace a natural peroxidase (horseradish peroxidase, HRP), our online system can simultaneously detect basal levels of glucose and lactate. By substituting oxidases with dehydrogenases, we have used enzyme-based biosensing schemes to develop a physiologically

  20. The National Survey of Family Growth, Cycle IV, evaluation of linked design.

    PubMed

    Waksberg, J; Sperry, S; Judkins, D; Smith, V

    1993-07-01

    Research was undertaken to quantify the effects and costs of alternative methods for selecting sample women for the National Survey of Family Growth (NSFG) from the National Health Interview Survey (NHIS). This report presents estimates of the effects of alternative design options, obtained by statistical modeling techniques, for linking the NSFG with the NHIS; the cost data and the statistical precision of estimates were based on data from the NSFG, Cycle IV. The estimated survey costs and projected response rates for alternative linked design options and for the unlinked design are compared for fixed precision. The findings confirm that substantial gains in the NSFG design efficiency were obtained by linking the NSFG sample design to that of the NHIS.

  1. The quantitation of cocaine on U.S. currency: survey and significance of the levels of contamination.

    PubMed

    Jourdan, Thomas H; Veitenheimer, Allison M; Murray, Cynthia K; Wagner, Jarrad R

    2013-05-01

    It has long been suspected that the illicit distribution of cocaine in the United States has led to a large-scale contamination of the currency supply. To investigate the extent of contamination, 418 currency samples (4174 bills) were collected from 90 locations around the United States from 1993 to 2009. The extent of their cocaine contamination was quantitated via gas chromatography/mass spectrometry or liquid chromatography/mass spectrometry. The level of cocaine contamination was determined to average 2.34 ng/bill across all denominations ($1, $5, $10, $20, $50, and $100). Levels of cocaine contamination on currency submitted to the Federal Bureau of Investigation Laboratory in criminal cases over the 1993-2001 timeframe were significantly higher than on currency in general circulation. A mathematical model was developed based on the background survey that indicates the likelihood of drawing a bill in specific concentration ranges. For example, there is a 0.8349 likelihood that a random bill will have contamination less than 20 ng.
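    The kind of per-bill probability statement quoted above can be sketched as follows. The lognormal form and every number in the synthetic sample are assumptions for illustration; this is not the FBI survey data, and the abstract does not state which distribution the authors' model used.

```python
import math
import random

random.seed(0)

# HYPOTHETICAL per-bill cocaine masses (ng), drawn so the sample mean is
# roughly the 2.34 ng/bill reported above; purely illustrative.
sample_ng = [random.lognormvariate(-0.43, 1.6) for _ in range(4174)]

# Maximum-likelihood lognormal fit: mean and sd of the log-masses.
logs = [math.log(x) for x in sample_ng]
mu = sum(logs) / len(logs)
sigma = math.sqrt(sum((v - mu) ** 2 for v in logs) / (len(logs) - 1))

def p_below(threshold_ng):
    """P(contamination < threshold) under the fitted lognormal."""
    z = (math.log(threshold_ng) - mu) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

print(f"P(bill < 20 ng) = {p_below(20):.4f}")
```

    Given a fitted distribution, any "likelihood of drawing a bill in a specific concentration range" is a difference of two such CDF evaluations.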

  2. Compatible immuno-NASBA LOC device for quantitative detection of waterborne pathogens: design and validation.

    PubMed

    Zhao, Xinyan; Dong, Tao; Yang, Zhaochu; Pires, Nuno; Høivik, Nils

    2012-02-07

    Waterborne pathogens usually pose a global threat to animals and human beings. There has been a growing demand for convenient and sensitive tools to detect the potential emerging pathogens in water. In this study, a lab-on-a-chip (LOC) device based on the real-time immuno-NASBA (immuno-nucleic acid sequence-based amplification) assay was designed, fabricated and verified. The disposable immuno-NASBA chip is modelled on a 96-well ELISA microplate, which contains 43 reaction chambers inside the bionic channel networks. All valves are designed outside the chip and are reusable. The sample and reagent solutions were pushed into each chamber in turn, which was controlled by the valve system. Notably, the immuno-NASBA chip is completely compatible with common microplate readers in a biological laboratory, and can distinguish multiple waterborne pathogens in water samples quantitatively and simultaneously. The performance of the LOC device was demonstrated by detecting the presence of a synthetic peptide, ACTH (adrenocorticotropic hormone) and two common waterborne pathogens, Escherichia coli (E. coli) and rotavirus, in artificial samples. The results indicated that the LOC device has the potential to quantify traces of waterborne pathogens at femtomolar levels with high specificity, although the detection process was still subject to some factors, such as ribonuclease (RNase) contamination and non-specific adsorption. As an ultra-sensitive tool to quantify waterborne pathogens, the LOC device can be used to monitor water quality in the drinking water system. Furthermore, a series of compatible high-throughput LOC devices for monitoring waterborne pathogens could be derived from this prototype with the same design idea, which may render the complicated immuno-NASBA assays convenient to common users without special training.

  3. Design study of the deepsky ultraviolet survey telescope. [Spacelab payload

    NASA Technical Reports Server (NTRS)

    Page, N. A.; Callaghan, F. G.; Killen, R. H.; Willis, W.

    1977-01-01

    Preliminary mechanical design and specifications are presented for a wide field ultraviolet telescope and detector to be carried as a Spacelab payload. Topics discussed include support structure stiffness (torsional and bending), mirror assembly, thermal control, optical alignment, attachment to the instrument pointing pallet, control and display, power requirements, acceptance and qualification test plans, cost analysis and scheduling. Drawings are included.

  4. Survey of electrical submersible systems design, application, and testing

    SciTech Connect

    Durham, M.O.; Lea, J.F.

    1996-05-01

    The electrical submersible pump industry has numerous recommended practices and procedures addressing various facets of the operation. Ascertaining the appropriate technique is tedious. Seldom are all the documents available at one location. This synopsis of all the industry practices provides a ready reference for testing, design, and application of electrical submersible pumping systems. An extensive bibliography identifies significant documents for further reference.

  5. The Large Area Radio Galaxy Evolution Spectroscopic Survey (LARGESS): survey design, data catalogue and GAMA/WiggleZ spectroscopy

    NASA Astrophysics Data System (ADS)

    Ching, John H. Y.; Sadler, Elaine M.; Croom, Scott M.; Johnston, Helen M.; Pracy, Michael B.; Couch, Warrick J.; Hopkins, A. M.; Jurek, Russell J.; Pimbblet, K. A.

    2017-01-01

    We present the Large Area Radio Galaxy Evolution Spectroscopic Survey (LARGESS), a spectroscopic catalogue of radio sources designed to include the full range of radio AGN populations out to redshift z ˜ 0.8. The catalogue covers ˜800 deg² of sky, and provides optical identifications for 19 179 radio sources from the 1.4 GHz Faint Images of the Radio Sky at Twenty-cm (FIRST) survey down to an optical magnitude limit of i_mod < 20.5 in Sloan Digital Sky Survey (SDSS) images. Both galaxies and point-like objects are included, and no colour cuts are applied. In collaboration with the WiggleZ and Galaxy And Mass Assembly (GAMA) spectroscopic survey teams, we have obtained new spectra for over 5000 objects in the LARGESS sample. Combining these new spectra with data from earlier surveys provides spectroscopic data for 12 329 radio sources in the survey area, of which 10 856 have reliable redshifts. 85 per cent of the LARGESS spectroscopic sample are radio AGN (median redshift z = 0.44), and 15 per cent are nearby star-forming galaxies (median z = 0.08). Low-excitation radio galaxies (LERGs) comprise the majority (83 per cent) of LARGESS radio AGN at z < 0.8, with 12 per cent being high-excitation radio galaxies (HERGs) and 5 per cent radio-loud QSOs. Unlike the more homogeneous LERG and QSO sub-populations, HERGs are a heterogeneous class of objects with relatively blue optical colours and a wide dispersion in mid-infrared colours. This is consistent with a picture in which most HERGs are hosted by galaxies with recent or ongoing star formation as well as a classical accretion disc.

  6. The influence of the design matrix on treatment effect estimates in the quantitative analyses of single-subject experimental design research.

    PubMed

    Moeyaert, Mariola; Ugille, Maaike; Ferron, John M; Beretvas, S Natasha; Van den Noortgate, Wim

    2014-09-01

    The quantitative methods for analyzing single-subject experimental data have expanded during the last decade, including the use of regression models to statistically analyze the data, but still a lot of questions remain. One question is how to specify predictors in a regression model to account for the specifics of the design and estimate the effect size of interest. These quantitative effect sizes are used in retrospective analyses and allow synthesis of single-subject experimental study results which is informative for evidence-based decision making, research and theory building, and policy discussions. We discuss different design matrices that can be used for the most common single-subject experimental designs (SSEDs), namely, the multiple-baseline designs, reversal designs, and alternating treatment designs, and provide empirical illustrations. The purpose of this article is to guide single-subject experimental data analysts interested in analyzing and meta-analyzing SSED data.
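    One common design-matrix parameterization for the simplest case, a two-phase (baseline/treatment) single-case design, codes an intercept, a baseline time trend, a level change at intervention, and a slope change. The data below are made up, and the article itself covers richer matrices (multiple-baseline, reversal, alternating-treatment); this is only a sketch of the general idea.

```python
# Hypothetical single-case data: 5 baseline (A) then 5 treatment (B) sessions.
y = [2, 3, 2, 4, 3, 8, 9, 10, 9, 11]
n_base = 5  # intervention starts at session index 5

# Design matrix rows: [1, time, phase, time_since_intervention]
X = []
for t in range(len(y)):
    phase = 1 if t >= n_base else 0
    X.append([1.0, float(t), float(phase), float(max(0, t - n_base))])

def ols(X, y):
    """Solve the normal equations X'X b = X'y by Gaussian elimination."""
    k = len(X[0])
    A = [[sum(X[i][r] * X[i][c] for i in range(len(X))) for c in range(k)]
         for r in range(k)]
    b = [sum(X[i][r] * y[i] for i in range(len(X))) for r in range(k)]
    for col in range(k):                      # forward elimination, partial pivot
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):            # back substitution
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

beta = ols(X, y)
print(f"baseline level {beta[0]:.2f}, level change at intervention {beta[2]:.2f}")
```

    The choice of predictors is exactly the issue the article raises: with this coding, beta[2] estimates the immediate level change at the first treatment session and beta[3] the change in slope, and a different coding of the same data yields a different "effect size".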

  7. Modified Universal Design Survey: Enhancing Operability of Launch Vehicle Ground Crew Worksites

    NASA Technical Reports Server (NTRS)

    Blume, Jennifer L.

    2010-01-01

    Operability is a driving requirement for next generation space launch vehicles. Launch site ground operations include numerous operator tasks to prepare the vehicle for launch or to perform preflight maintenance. Ensuring that components requiring operator interaction at the launch site are designed for optimal human use is a high priority for operability. To promote operability, a Design Quality Evaluation Survey based on the Universal Design framework was developed to support Human Factors Engineering (HFE) evaluation for NASA's launch vehicles. Universal Design per se is not a priority for launch vehicle processing; however, applying principles of Universal Design will increase the probability of an error-free and efficient design, which promotes operability. The Design Quality Evaluation Survey incorporates and tailors the seven Universal Design Principles and adds new measures for Safety and Efficiency. Adapting an approach proven to measure Universal Design performance in products, each principle is associated with multiple performance measures, each rated by the degree to which its statement is true. The Design Quality Evaluation Survey was employed for several launch vehicle ground processing worksite analyses. The tool was found to be most useful for comparative judgments as opposed to an assessment of a single design option. It provided a useful piece of additional data when assessing possible operator interfaces or worksites for operability.

  8. Sample and design considerations in post-disaster mental health needs assessment tracking surveys.

    PubMed

    Kessler, Ronald C; Keane, Terence M; Ursano, Robert J; Mokdad, Ali; Zaslavsky, Alan M

    2008-12-01

    Although needs assessment surveys are carried out after many large natural and man-made disasters, synthesis of findings across these surveys and disaster situations about patterns and correlates of need is hampered by inconsistencies in study designs and measures. Recognizing this problem, the US Substance Abuse and Mental Health Services Administration (SAMHSA) assembled a task force in 2004 to develop a model study design and interview schedule for use in post-disaster needs assessment surveys. The US National Institute of Mental Health subsequently approved a plan to establish a center to implement post-disaster mental health needs assessment surveys in the future using an integrated series of measures and designs of the sort proposed by the SAMHSA task force. A wide range of measurement, design, and analysis issues will arise in developing this center. Given that the least widely discussed of these issues concerns study design, the current report focuses on the most important sampling and design issues proposed for this center based on our experiences with the SAMHSA task force, subsequent Katrina surveys, and earlier work in other disaster situations.

  9. Using simulation to evaluate wildlife survey designs: polar bears and seals in the Chukchi Sea

    PubMed Central

    Conn, Paul B.; Moreland, Erin E.; Regehr, Eric V.; Richmond, Erin L.; Cameron, Michael F.; Boveng, Peter L.

    2016-01-01

    Logistically demanding and expensive wildlife surveys should ideally yield defensible estimates. Here, we show how simulation can be used to evaluate alternative survey designs for estimating wildlife abundance. Specifically, we evaluate the potential of instrument-based aerial surveys (combining infrared imagery with high-resolution digital photography to detect and identify species) for estimating abundance of polar bears and seals in the Chukchi Sea. We investigate the consequences of different levels of survey effort, flight track allocation and model configuration on bias and precision of abundance estimators. For bearded seals (0.07 animals km−2) and ringed seals (1.29 animals km−2), we find that eight flights traversing ≈7840 km are sufficient to achieve target precision levels (coefficient of variation (CV) < 20%) for a 2.94 × 10⁵ km² study area. For polar bears (provisionally, 0.003 animals km−2), 12 flights traversing ≈11 760 km resulted in CVs ranging from 28 to 35%. Estimators were relatively unbiased with similar precision over different flight track allocation strategies and estimation models, although some combinations had superior performance. These findings suggest that instrument-based aerial surveys may provide a viable means for monitoring seal and polar bear populations on the surface of the sea ice over large Arctic regions. More broadly, our simulation-based approach to evaluating survey designs can serve as a template for biologists designing their own surveys. PMID:26909183
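    The simulation-based evaluation described above can be sketched in miniature: simulate many realizations of a survey, form an abundance estimate from each, and read off the bias and CV of the estimator. The strip width and the pure counting-noise model below are assumptions; only the density, study area, and track length come from the abstract.

```python
import math
import random

random.seed(7)

DENSITY = 1.29        # ringed seals per km^2, from the abstract
STUDY_AREA = 2.94e5   # km^2, from the abstract
TRACK_KM = 7840.0     # km flown over eight flights, from the abstract
STRIP_KM = 0.3        # assumed effective strip width either side of the track

covered = TRACK_KM * STRIP_KM     # km^2 actually surveyed
coverage = covered / STUDY_AREA   # sampled fraction of the study area
true_N = DENSITY * STUDY_AREA

def estimate_once():
    lam = DENSITY * covered
    # Poisson counting noise, via a normal approximation for large lambda.
    count = max(0.0, random.gauss(lam, math.sqrt(lam)))
    return count / coverage       # expand the count by the sampled fraction

ests = [estimate_once() for _ in range(2000)]
mean = sum(ests) / len(ests)
sd = math.sqrt(sum((e - mean) ** 2 for e in ests) / (len(ests) - 1))
cv = sd / mean
print(f"relative bias {abs(mean - true_N) / true_N:.4f}, CV {cv:.3f}")
```

    The CV here reflects counting noise alone and so comes out far below the 20% target; the authors' simulations additionally model detection error, spatial clumping, and flight-track allocation, which is what makes such a simulation informative for design choices.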

  10. Hit by a Perfect Storm? Art & Design in the National Student Survey

    ERIC Educational Resources Information Center

    Yorke, Mantz; Orr, Susan; Blair, Bernadette

    2014-01-01

    There has long been the suspicion amongst staff in Art & Design that the ratings given to their subject disciplines in the UK's National Student Survey are adversely affected by a combination of circumstances--a "perfect storm". The "perfect storm" proposition is tested by comparing ratings for Art & Design with those…

  11. Usability Evaluation Survey for Identifying Design Issues in Civil Flight Deck

    NASA Astrophysics Data System (ADS)

    Ozve Aminian, Negin; Izzuddin Romli, Fairuz; Wiriadidjaja, Surjatin

    2016-02-01

    Ergonomics assessment of the cockpit in civil aircraft is important because pilots spend most of their flight time in the seating posture imposed by its design. An improper seat design can cause discomfort and pain, which disturb the pilot's concentration in flight. A survey conducted for this study identified several issues with the current cockpit design. This study aims to highlight potential mismatches between the current cockpit design and ergonomic design recommendations for anthropometric dimensions and seat design, which could be the root of the problems pilots face in the cockpit.

  12. Improved Optical Design for the Large Synoptic Survey Telescope (LSST)

    SciTech Connect

    Seppala, L

    2002-09-24

    This paper presents an improved optical design for the LSST, an f/1.25 three-mirror telescope covering 3.0 degrees full field angle, with 6.9 m effective aperture diameter. The telescope operates at five wavelength bands spanning 386.5 nm to 1040 nm (B, V, R, I and Z). For all bands, 80% of the polychromatic diffracted energy is collected within 0.20 arc-seconds diameter. The reflective telescope uses an 8.4 m f/1.06 concave primary, a 3.4 m convex secondary and a 5.2 m concave tertiary in a Paul geometry. The system length is 9.2 m. A refractive corrector near the detector uses three fused silica lenses, rather than the two lenses of previous designs. Earlier designs required that one element be a vacuum barrier, but now the detector sits in an inert gas at ambient pressure. The last lens is the gas barrier. Small adjustments lead to optimal correction at each band. The filters have different axial thicknesses. The primary and tertiary mirrors are repositioned for each wavelength band. The new optical design incorporates features to simplify manufacturing. They include a flat detector, a far less aspheric convex secondary (10 μm from best fit sphere) and reduced aspheric departures on the lenses and tertiary mirror. Five aspheric surfaces, on all three mirrors and on two lenses, are used. The primary is nearly parabolic. The telescope is fully baffled so that no specularly reflected light from any field angle, inside or outside of the full field angle of 3.0 degrees, can reach the detector.

  13. Effects of sample survey design on the accuracy of classification tree models in species distribution models

    USGS Publications Warehouse

    Edwards, T.C.; Cutler, D.R.; Zimmermann, N.E.; Geiser, L.; Moisen, G.G.

    2006-01-01

    We evaluated the effects of probabilistic (hereafter DESIGN) and non-probabilistic (PURPOSIVE) sample surveys on resultant classification tree models for predicting the presence of four lichen species in the Pacific Northwest, USA. Models derived from both survey forms were assessed using an independent data set (EVALUATION). Measures of accuracy as gauged by resubstitution rates were similar for each lichen species irrespective of the underlying sample survey form. Cross-validation estimates of prediction accuracies were lower than resubstitution accuracies for all species and both design types, and in all cases were closer to the true prediction accuracies based on the EVALUATION data set. We argue that greater emphasis should be placed on calculating and reporting cross-validation accuracy rates rather than simple resubstitution accuracy rates. Evaluation of the DESIGN and PURPOSIVE tree models on the EVALUATION data set shows significantly lower prediction accuracy for the PURPOSIVE tree models relative to the DESIGN models, indicating that non-probabilistic sample surveys may generate models with limited predictive capability. These differences were consistent across all four lichen species, with 11 of the 12 possible species and sample survey type comparisons having significantly lower accuracy rates. Some differences in accuracy were as large as 50%. The classification tree structures also differed considerably both among and within the modelled species, depending on the sample survey form. Overlap in the predictor variables selected by the DESIGN and PURPOSIVE tree models ranged from only 20% to 38%, indicating the classification trees fit the two evaluated survey forms on different sets of predictor variables. The magnitude of these differences in predictor variables throws doubt on ecological interpretation derived from prediction models based on non-probabilistic sample surveys. © 2006 Elsevier B.V. All rights reserved.
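    The resubstitution-versus-cross-validation optimism described above can be shown with a deliberately extreme toy case (synthetic 1-D data and a 1-nearest-neighbour classifier standing in for the authors' classification trees): resubstitution scores 1-NN perfectly, because every training point is its own nearest neighbour, while leave-one-out cross-validation exposes the real error.

```python
import random

random.seed(3)

# Two overlapping classes in 1-D: a hypothetical stand-in for presence/absence data.
data = [(random.gauss(0.0, 1.0), 0) for _ in range(60)] + \
       [(random.gauss(1.0, 1.0), 1) for _ in range(60)]

def nn_predict(train, x):
    """Label of the nearest training point to x."""
    return min(train, key=lambda t: abs(t[0] - x))[1]

def accuracy(hits):
    return sum(hits) / len(hits)

# Resubstitution: each point is classified with the full data set (itself included).
resub = accuracy([nn_predict(data, x) == y for x, y in data])

# Leave-one-out cross-validation: the point being classified is held out.
loo = accuracy([nn_predict(data[:i] + data[i + 1:], x) == y
                for i, (x, y) in enumerate(data)])

print(f"resubstitution accuracy {resub:.2f}, LOO-CV accuracy {loo:.2f}")
```

    Decision trees are less pathological than 1-NN, but the direction is the same: resubstitution rates flatter the model, which is why the abstract recommends reporting cross-validation rates instead.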

  14. Improved optical design for the Large Synoptic Survey Telescope (LSST)

    NASA Astrophysics Data System (ADS)

    Seppala, Lynn G.

    2002-12-01

    This paper presents an improved optical design for the LSST, an f/1.25 three-mirror telescope covering 3.0 degrees full field angle, with 6.9 m effective aperture diameter. The telescope operates at five wavelength bands spanning 386.5 nm to 1040 nm (B, V, R, I and Z). For all bands, 80% of the polychromatic diffracted energy is collected within 0.20 arc-seconds diameter. The reflective telescope uses an 8.4 m f/1.06 concave primary, a 3.4 m convex secondary and a 5.2 m concave tertiary in a Paul geometry. The system length is 9.2 m. A refractive corrector near the detector uses three fused silica lenses, rather than the two lenses of previous designs. Earlier designs required that one element be a vacuum barrier, but now the detector sits in an inert gas at ambient pressure, with the last lens serving as the gas barrier. Small adjustments lead to optimal correction at each band. Each filter has a different axial thickness, and the primary and tertiary mirrors are repositioned for each wavelength band. Features that simplify manufacturing include a flat detector, a far less aspheric convex secondary (10 μm from best fit sphere) and reduced aspheric departures on the lenses and tertiary mirror. Five aspheric surfaces, on all three mirrors and on two lenses, are used. The primary is nearly parabolic. The telescope is fully baffled so that no specularly reflected light from any field angle, inside or outside of the full field angle of 3.0 degrees, can reach the detector.

  15. Survey of Ada (Trademark)-Based PDLs (Program Design Language).

    DTIC Science & Technology

    1985-01-01

    Each of these may comprise a view of auxiliary or tangible referents. This leads to an abstract syntax closely related to simple English grammar... 1) Mechanisms for indicating postponed decisions, such as TBD constructs and English narrative. 2) Mechanisms for expressing supplemental design information.

  16. Review of quantitative surveys of the length and stability of MTBE, TBA, and benzene plumes in groundwater at UST sites.

    PubMed

    Connor, John A; Kamath, Roopa; Walker, Kenneth L; McHugh, Thomas E

    2015-01-01

    Quantitative information regarding the length and stability condition of groundwater plumes of benzene, methyl tert-butyl ether (MTBE), and tert-butyl alcohol (TBA) has been compiled from thousands of underground storage tank (UST) sites in the United States where gasoline fuel releases have occurred. This paper presents a review and summary of 13 published scientific surveys, of which 10 address benzene and/or MTBE plumes only, and 3 address benzene, MTBE, and TBA plumes. These data show the observed lengths of benzene and MTBE plumes to be relatively consistent among various regions and hydrogeologic settings, with median lengths at a delineation limit of 10 µg/L falling into relatively narrow ranges from 101 to 185 feet for benzene and 110 to 178 feet for MTBE. The observed statistical distributions of MTBE and benzene plumes show the two plume types to be of comparable lengths, with 90th percentile MTBE plume lengths moderately exceeding benzene plume lengths by 16% at a 10-µg/L delineation limit (400 feet vs. 345 feet) and 25% at a 5-µg/L delineation limit (530 feet vs. 425 feet). Stability analyses for benzene and MTBE plumes found 94 and 93% of these plumes, respectively, to be in a nonexpanding condition, and over 91% of individual monitoring wells to exhibit nonincreasing concentration trends. Three published studies addressing TBA found TBA plumes to be of comparable length to MTBE and benzene plumes, with 86% of wells in one study showing nonincreasing concentration trends.

  17. A survey of design methods for failure detection in dynamic systems

    NASA Technical Reports Server (NTRS)

    Willsky, A. S.

    1975-01-01

    A number of methods for the detection of abrupt changes (such as failures) in stochastic dynamical systems were surveyed. The class of linear systems was emphasized, but the basic concepts, if not the detailed analyses, carry over to other classes of systems. The methods surveyed range from the design of specific failure-sensitive filters, to the use of statistical tests on filter innovations, to the development of jump process formulations. Tradeoffs in complexity versus performance are discussed.

  18. Study Quality in SLA: An Assessment of Designs, Analyses, and Reporting Practices in Quantitative L2 Research

    ERIC Educational Resources Information Center

    Plonsky, Luke

    2013-01-01

    This study assesses research and reporting practices in quantitative second language (L2) research. A sample of 606 primary studies, published from 1990 to 2010 in "Language Learning" and "Studies in Second Language Acquisition," was collected and coded for designs, statistical analyses, reporting practices, and outcomes (i.e., effect…

  19. [Development of a simple quantitative method for the strontium-89 concentration of radioactive liquid waste using the plastic scintillation survey meter for beta rays].

    PubMed

    Narita, Hiroto; Tsuchiya, Yuusuke; Hirase, Kiyoshi; Uchiyama, Mayuki; Fukushi, Masahiro

    2012-11-01

    Strontium-89 (89Sr: pure beta, E = 1.495 MeV (100%), half-life: 50.5 days) chloride is used for pain relief from bone metastases. An assay of 89Sr is difficult because it is a pure beta emitter. For management of 89Sr, we evaluated a simple quantitative method for the 89Sr concentration of radioactive liquid waste using a plastic scintillation survey meter for beta rays. The counting efficiency of the survey meter with this method was 35.95%. A simple 30-minute measurement of 2 ml of the sample made quantitative measurement of 89Sr practical. Reducing self-absorption of the beta rays in the solution by counting on polyethylene paper improved the counting efficiency. Our method made it easy to manage the radioactive liquid waste under the legal restrictions.

  20. Evaluating a Modular Design Approach to Collecting Survey Data Using Text Messages

    PubMed Central

    West, Brady T.; Ghimire, Dirgha; Axinn, William G.

    2015-01-01

    This article presents analyses of data from a pilot study in Nepal that was designed to provide an initial examination of the errors and costs associated with an innovative methodology for survey data collection. We embedded a randomized experiment within a long-standing panel survey, collecting data on a small number of items with varying sensitivity from a probability sample of 450 young Nepalese adults. Survey items ranged from simple demographics to indicators of substance abuse and mental health problems. Sampled adults were randomly assigned to one of three different modes of data collection: 1) a standard one-time telephone interview, 2) a “single sitting” back-and-forth interview with an interviewer using text messaging, and 3) an interview using text messages within a modular design framework (which generally involves breaking the survey response task into distinct parts over a short period of time). Respondents in the modular group were asked to respond (via text message exchanges with an interviewer) to only one question on a given day, rather than complete the entire survey. Both bivariate and multivariate analyses demonstrate that the two text messaging modes increased the probability of disclosing sensitive information relative to the telephone mode, and that respondents in the modular design group, while responding less frequently, found the survey to be significantly easier. Further, those who responded in the modular group were not unique in terms of available covariates, suggesting that the reduced item response rates introduced only limited nonresponse bias. Future research should consider enhancing this methodology, applying it with other modes of data collection (e.g., web surveys), and continuously evaluating its effectiveness from a total survey error perspective. PMID:26322137

  1. Application of a Modified Universal Design Survey for Evaluation of Ares 1 Ground Crew Worksites

    NASA Technical Reports Server (NTRS)

    Blume, Jennifer L.

    2010-01-01

    Operability is a driving requirement for NASA's Ares 1 launch vehicle. Launch site ground operations include several operator tasks to prepare the vehicle for launch or to perform maintenance. Ensuring that components requiring operator interaction at the launch site are designed for optimal human use is a high priority for operability. To support design evaluation, the Ares 1 Upper Stage (US) element Human Factors Engineering (HFE) group developed a survey based on the Universal Design approach. Universal Design is a process to create products that can be used effectively by as many people as possible. Universal Design per se is not a priority for Ares 1 because launch vehicle processing is a specialized skill and not akin to a consumer product that should be usable by all people of all abilities. However, applying principles of Universal Design will increase the probability of an error-free and efficient design, which is a priority for Ares 1. The Design Quality Evaluation Survey centers on the following seven principles: (1) Equitable use, (2) Flexibility in use, (3) Simple and intuitive use, (4) Perceptible information, (5) Tolerance for error, (6) Low physical effort, (7) Size and space for approach and use. Each principle is associated with multiple evaluation criteria, which were rated with the degree to which the statement is true. All statements are phrased in the most positive form, that is, the design goal, so that the degree to which judgments tend toward "completely agree" directly reflects the degree to which the design is good. The Design Quality Evaluation Survey was employed for several US analyses. The tool was found to be most useful for comparative judgments as opposed to an assessment of a single design option. It provided a useful piece of additional data when assessing possible operator interfaces or worksites for operability.

  2. Final report on the radiological surveys of designated DX firing sites at Los Alamos National Laboratory

    SciTech Connect

    1996-09-09

    CHEMRAD was contracted by Los Alamos National Laboratory to perform USRADS® (UltraSonic Ranging And Data System) radiation scanning surveys at designated DX Sites at the Los Alamos National Laboratory. The primary purpose of these scanning surveys was to identify the presence of Depleted Uranium (D-38) resulting from activities at the DX Firing Sites. This effort was conducted to update the most recent surveys of these areas. This current effort was initiated with site orientation on August 12, 1996. Surveys were completed in the field on September 4, 1996. This Executive Summary briefly presents the major findings of this work. The detailed survey results are presented in the balance of this report and are organized by Technical Area and Site number in section 2. This organization is not in chronological order. USRADS and the related survey methods are described in section 3. Quality Control issues are addressed in section 4. Surveys were conducted with an array of radiation detectors either mounted on a backpack frame for man-carried use (Manual mode) or on a tricycle cart (RadCart mode). The array included radiation detectors for gamma and beta surface and near-surface contamination, as well as dose rate at 1 meter above grade. The radiation detectors were interfaced directly to a USRADS 2100 Data Pack.

  3. The Laboratory Course Assessment Survey: A Tool to Measure Three Dimensions of Research-Course Design

    PubMed Central

    Corwin, Lisa A.; Runyon, Christopher; Robinson, Aspen; Dolan, Erin L.

    2015-01-01

    Course-based undergraduate research experiences (CUREs) are increasingly being offered as scalable ways to involve undergraduates in research. Yet few if any design features that make CUREs effective have been identified. We developed a 17-item survey instrument, the Laboratory Course Assessment Survey (LCAS), that measures students’ perceptions of three design features of biology lab courses: 1) collaboration, 2) discovery and relevance, and 3) iteration. We assessed the psychometric properties of the LCAS using established methods for instrument design and validation. We also assessed the ability of the LCAS to differentiate between CUREs and traditional laboratory courses, and found that the discovery and relevance and iteration scales differentiated between these groups. Our results indicate that the LCAS is suited for characterizing and comparing undergraduate biology lab courses and should be useful for determining the relative importance of the three design features for achieving student outcomes. PMID:26466990

  4. The Laboratory Course Assessment Survey: A Tool to Measure Three Dimensions of Research-Course Design.

    PubMed

    Corwin, Lisa A; Runyon, Christopher; Robinson, Aspen; Dolan, Erin L

    2015-01-01

    Course-based undergraduate research experiences (CUREs) are increasingly being offered as scalable ways to involve undergraduates in research. Yet few if any design features that make CUREs effective have been identified. We developed a 17-item survey instrument, the Laboratory Course Assessment Survey (LCAS), that measures students' perceptions of three design features of biology lab courses: 1) collaboration, 2) discovery and relevance, and 3) iteration. We assessed the psychometric properties of the LCAS using established methods for instrument design and validation. We also assessed the ability of the LCAS to differentiate between CUREs and traditional laboratory courses, and found that the discovery and relevance and iteration scales differentiated between these groups. Our results indicate that the LCAS is suited for characterizing and comparing undergraduate biology lab courses and should be useful for determining the relative importance of the three design features for achieving student outcomes.

  5. Optical Design Trade Study for the Wide Field Infrared Survey Telescope [WFIRST]

    NASA Technical Reports Server (NTRS)

    Content, David A.; Goullioud, R.; Lehan, John P.; Mentzell, John E.

    2011-01-01

    The Wide Field Infrared Survey Telescope (WFIRST) mission concept was ranked first among new space astrophysics missions by the Astro2010 Decadal Survey, incorporating the Joint Dark Energy Mission (JDEM)-Omega payload concept and multiple science white papers. This mission is based on a space telescope at L2 studying exoplanets [via gravitational microlensing], probing dark energy, and surveying the near infrared sky. Since the release of NWNH, the WFIRST project has been working with the WFIRST science definition team (SDT) to refine mission and payload concepts. We present the driving requirements. The current interim reference mission point design, based on a 1.3 m unobscured-aperture three-mirror anastigmat form with focal imaging and slitless spectroscopy science channels, is consistent with the requirements, requires no technology development, and outperforms the JDEM-Omega design.

  6. Wide Field Infrared Survey Telescope [WFIRST]: Telescope Design and Simulated Performance

    NASA Technical Reports Server (NTRS)

    Goullioud, R.; Content, D. A.; Kuan, G. M.; Moore, J. D.; Chang, Z.; Sunada, E. T.; Villalvazo, J.; Hawk, J. P.; Armani, N. V.; Johnson, E. L.; Powell, C. A.

    2012-01-01

    The Astro2010 Decadal Survey proposed multiple missions with NIR focal planes and three-mirror wide-field telescopes in the 1.5 m aperture range. None of them would have won as standalone missions; WFIRST is a combination of these missions, created by the Astro2010 committee. The WFIRST Science Definition Team (SDT) was tasked to examine the design. The project team is a GSFC-JPL-Caltech collaboration. This interim mission design is the result of combined work by the project team with the SDT.

  7. Epidemiological survey of the feasibility of broadband ultrasound attenuation measured using calcaneal quantitative ultrasound to predict the incidence of falls in the middle aged and elderly

    PubMed Central

    Ou, Ling-Chun; Chang, Yin-Fan; Chang, Chin-Sung; Chiu, Ching-Ju; Chao, Ting-Hsing; Sun, Zih-Jie; Lin, Ruey-Mo; Wu, Chih-Hsing

    2017-01-01

    Objectives We investigated whether calcaneal quantitative ultrasound (QUS-C) is a feasible tool for predicting the incidence of falls. Design Prospective epidemiological cohort study. Setting Community-dwelling people sampled in central western Taiwan. Participants A cohort of community-dwelling people who were ≥40 years old (men: 524; women: 676) in 2009–2010. Follow-up questionnaires were completed by 186 men and 257 women in 2012. Methods Structured questionnaires and broadband ultrasound attenuation (BUA) data were obtained in 2009–2010 using QUS-C, and follow-up surveys were done in a telephone interview in 2012. Using a binary logistic regression model, the risk factors associated with a new fall during follow-up were analysed with all significant variables from the bivariate comparisons and theoretically important variables. Primary outcome measures The incidence of falls was determined when the first new fall occurred during the follow-up period. The mean follow-up time was 2.83 years. Results The total incidence of falls was 28.0 per 1000 person-years for the ≥40 year old group (all participants), 23.3 per 1000 person-years for the 40–70 year old group, and 45.6 per 1000 person-years for the ≥70 year old group. Using multiple logistic regression models, the independent factors were current smoking, living alone, psychiatric drug usage and lower BUA (OR 0.93; 95% CI 0.88 to 0.99, p<0.05) in the ≥70 year old group. Conclusions The incidence of falls was highest in the ≥70 year old group. Using QUS-C-derived BUA is feasible for predicting the incidence of falls in community-dwelling elderly people aged ≥70 years. PMID:28069623
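The reported effect for BUA (OR 0.93; 95% CI 0.88 to 0.99) is the usual exponentiated logistic-regression coefficient with a Wald interval. A minimal sketch of that back-transformation (the coefficient and standard error below are invented to roughly mimic the reported interval, not taken from the study):

```python
import math

def odds_ratio_with_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a Wald 95% confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# hypothetical values chosen to mimic OR 0.93 (95% CI ~0.88 to 0.99)
or_, lo, hi = odds_ratio_with_ci(beta=math.log(0.93), se=0.03)
print(round(or_, 2), round(lo, 2), round(hi, 2))   # → 0.93 0.88 0.99
```

Because the interval is computed on the log-odds scale and then exponentiated, it is asymmetric around the odds ratio, as in the paper.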

  8. ASSESSING THE ECOLOGICAL CONDITION OF A COASTAL PLAIN WATERSHED USING A PROBABILISTIC SURVEY DESIGN

    EPA Science Inventory

    Using a probabilistic survey design, we assessed the ecological condition of the Florida (USA) portion of the Escambia River watershed using selected environmental and benthic macroinvertebrate data. Macroinvertebrates were sampled at 28 sites during July-August 1996, and 3414 i...

  9. USING GIS TO GENERATE SPATIALLY-BALANCED RANDOM SURVEY DESIGNS FOR NATURAL RESOURCE APPLICATIONS

    EPA Science Inventory

    Sampling of a population is frequently required to understand trends and patterns in natural resource management because financial and time constraints preclude a complete census. A rigorous probability-based survey design specifies where to sample so that inferences from the sam...

  10. "Is This Ethical?" A Survey of Opinion on Principles and Practices of Document Design.

    ERIC Educational Resources Information Center

    Dragga, Sam

    1996-01-01

    Reprints a corrected version of an article originally published in the volume 43, number 1 issue of this journal. Presents results of a national survey of technical communicators and technical communication teachers assessing the ethics of seven document design cases involving manipulation of typography, illustrations, and photographs. Offers…

  11. CONDITION ASSESSMENT FOR THE ESCAMBIA RIVER, FL, WATERSHED: BENTHIC MACROINVERTEBRATE SURVEYS USING A PROBABILISTIC SAMPLING DESIGN

    EPA Science Inventory

    Probabilistic sampling has been used to assess the condition of estuarine ecosystems, and the use of this survey design approach was examined for a northwest Florida watershed. Twenty-eight lotic sites within the Escambia River, Florida, watershed were randomly selected and visit...

  12. Measuring the prevalence and impact of poor menstrual hygiene management: a quantitative survey of schoolgirls in rural Uganda

    PubMed Central

    Hennegan, Julie; Wu, Maryalice; Scott, Linda; Montgomery, Paul

    2016-01-01

    Objectives The primary objective was to describe Ugandan schoolgirls’ menstrual hygiene management (MHM) practices and estimate the prevalence of inadequate MHM. Second, to assess the relative contribution of aspects of MHM to health, education and psychosocial outcomes. Design Secondary analysis of survey data collected as part of the final follow-up from a controlled trial of reusable sanitary pad and puberty education provision was used to provide a cross-sectional description of girls’ MHM practices and assess relationships with outcomes. Setting Rural primary schools in the Kamuli district, Uganda. Participants Participants were 205 menstruating schoolgirls (10–19 years) from the eight study sites. Primary and secondary outcome measures The prevalence of adequate MHM, consistent with the concept definition, was estimated using dimensions of absorbent used, frequency of absorbent change, washing and drying procedures and privacy. Self-reported health, education (school attendance and engagement) and psychosocial (shame, insecurity, embarrassment) outcomes hypothesised to result from poor MHM were assessed as primary outcomes. Outcomes were measured through English surveys loaded on iPads and administered verbally in the local language. Results 90.5% (95% CI 85.6% to 93.9%) of girls failed to meet available criteria for adequate MHM, with no significant difference between those using reusable sanitary pads (88.9%, 95% CI 79.0% to 94.4%) and those using existing methods, predominantly cloth (91.5%, 95% CI 85.1% to 95.3%; χ2 (1)=0.12, p=0.729). Aspects of MHM predicted some consequences including shame, not standing in class to answer questions and concerns about odour. Conclusions This study was the first to assess the prevalence of MHM consistent with the concept definition. Results suggest that when all aspects of menstrual hygiene are considered together, the prevalence is much higher than has previously been reported based on absorbents alone. The
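A prevalence figure like the one above (90.5%, 95% CI 85.6% to 93.9%) is consistent with a standard binomial score interval. A sketch of the Wilson interval for roughly those numbers (the count 186/205 is inferred from the reported percentage and sample size, not taken from the paper, so the bounds agree only approximately):

```python
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = k / n
    denom = 1.0 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return p, center - half, center + half

p, lo, hi = wilson_ci(186, 205)
print(round(100 * p, 1), round(100 * lo, 1), round(100 * hi, 1))
```

The Wilson interval is preferred over the simple Wald interval here because the proportion is close to 1, where the Wald interval can overshoot 100%.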

  13. Surveys and questionnaires in nursing research.

    PubMed

    Timmins, Fiona

    2015-06-17

    Surveys and questionnaires are often used in nursing research to elicit the views of large groups of people to develop the nursing knowledge base. This article provides an overview of survey and questionnaire use in nursing research, clarifies the place of the questionnaire as a data collection tool in quantitative research design and provides information and advice about best practice in the development of quantitative surveys and questionnaires.

  14. Why we love or hate our cars: A qualitative approach to the development of a quantitative user experience survey.

    PubMed

    Tonetto, Leandro Miletto; Desmet, Pieter M A

    2016-09-01

    This paper presents a more ecologically valid way of developing theory-based item questionnaires for measuring user experience. In this novel approach, items were generated using the natural and domain-specific language of the research population, which seems to have made the survey much more sensitive to real experiences than purely theory-based items. The approach was applied in a survey that measured car experience. Ten in-depth interviews were conducted with drivers inside their cars. The resulting transcripts were analysed with the aim of capturing the drivers' natural utterances for expressing their car experience. This analysis resulted in 71 categories of answers. For each category, one sentence was selected to serve as a survey item. On an online platform, 538 respondents answered the survey. Data reliability, tested with the Cronbach alpha index, was 0.94, suggesting that the survey yields highly reliable measurements of drivers' appraisals of their cars.
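The reliability figure quoted above (Cronbach's alpha = 0.94) comes from the standard variance-ratio formula over the item scores. A minimal sketch of that computation (the data below are synthetic, generated only to exercise the formula, not the study's):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances / total_variance)

# synthetic check: six items measuring one latent trait with modest noise
rng = np.random.default_rng(42)
trait = rng.normal(size=(500, 1))
items = trait + rng.normal(scale=0.5, size=(500, 6))
alpha = cronbach_alpha(items)
print(round(alpha, 2))
```

Items that all track one underlying trait push the total-score variance well above the sum of item variances, which is exactly what drives alpha toward 1.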

  15. Screen Design Guidelines for Motivation in Interactive Multimedia Instruction: A Survey and Framework for Designers.

    ERIC Educational Resources Information Center

    Lee, Sung Heum; Boling, Elizabeth

    1999-01-01

    Identifies guidelines from the literature relating to screen design and design of interactive instructional materials. Describes two types of guidelines--those aimed at enhancing motivation and those aimed at preventing loss of motivation--for typography, graphics, color, and animation and audio. Proposes a framework for considering motivation in…

  16. Median and quantile tests under complex survey design using SAS and R.

    PubMed

    Pan, Yi; Caudill, Samuel P; Li, Ruosha; Caldwell, Kathleen L

    2014-11-01

    Techniques for conducting hypothesis testing on the median and other quantiles of two or more subgroups under complex survey design are limited. In this paper, we introduce programs in both SAS and R to perform such a test. A detailed illustration of the computations, macro variable definitions, input and output for the SAS and R programs are also included in the text. Urinary iodine data from National Health and Nutrition Examination Survey (NHANES) are used as examples for comparing medians between females and males as well as comparing the 75th percentiles among three salt consumption groups.
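The paper's programs are written in SAS and R; as a rough Python illustration of the core step only, a weighted sample quantile can be computed by inverting the weighted empirical CDF (this sketch ignores the design-based variance estimation that the complex-survey machinery actually provides, and all names are invented):

```python
import numpy as np

def weighted_quantile(x, w, q):
    """q-th quantile of x under sampling weights w
    (inverse of the weighted empirical CDF; no design-based variance)."""
    x, w = np.asarray(x, float), np.asarray(w, float)
    order = np.argsort(x)
    x, w = x[order], w[order]
    cdf = np.cumsum(w) / w.sum()
    return x[np.searchsorted(cdf, q)]

# equal weights reduce to the ordinary sample median
med = weighted_quantile([1, 2, 3, 4, 5], [1, 1, 1, 1, 1], 0.5)
# heavily upweighting one unit pulls the median toward it
med_w = weighted_quantile([1, 2, 3], [1, 1, 8], 0.5)
print(med, med_w)   # → 3.0 3.0
```

Comparing such quantiles across subgroups with valid standard errors is the hard part the paper addresses; survey weights alone, as here, give only the point estimate.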

  17. The Design of a Novel Survey for Small Objects in the Solar System

    SciTech Connect

    Alcock, C.; Chen, W.P.; de Pater, I.; Lee, T.; Lissauer, J.; Rice, J.; Liang, C.; Cook, K.; Marshall, S.; Akerlof, C.

    2000-08-21

    We evaluated several concepts for a new survey for small objects in the Solar System. We designed a highly novel survey for comets in the outer region of the Solar System, which exploits the occultations of relatively bright stars to infer the presence of otherwise extremely faint objects. The populations and distributions of these objects are not known; the uncertainties span orders of magnitude! These objects are important scientifically as probes of the primordial solar system, and programmatically now that major investments may be made in the possible mitigation of the hazard of asteroid or comet collisions with the Earth.

  18. AI/OR computational model for integrating qualitative and quantitative design methods

    NASA Technical Reports Server (NTRS)

    Agogino, Alice M.; Bradley, Stephen R.; Cagan, Jonathan; Jain, Pramod; Michelena, Nestor

    1990-01-01

    A theoretical framework for integrating qualitative and numerical computational methods for optimally-directed design is described. The theory is presented as a computational model and features of implementations are summarized where appropriate. To demonstrate the versatility of the methodology we focus on four seemingly disparate aspects of the design process and their interaction: (1) conceptual design, (2) qualitative optimal design, (3) design innovation, and (4) numerical global optimization.

  19. Distance software: design and analysis of distance sampling surveys for estimating population size.

    PubMed

    Thomas, Len; Buckland, Stephen T; Rexstad, Eric A; Laake, Jeff L; Strindberg, Samantha; Hedley, Sharon L; Bishop, Jon Rb; Marques, Tiago A; Burnham, Kenneth P

    2010-02-01

    1. Distance sampling is a widely used technique for estimating the size or density of biological populations. Many distance sampling designs and most analyses use the software Distance. 2. We briefly review distance sampling and its assumptions, outline the history, structure and capabilities of Distance, and provide hints on its use. 3. Good survey design is a crucial prerequisite for obtaining reliable results. Distance has a survey design engine, with a built-in geographic information system, that allows properties of different proposed designs to be examined via simulation, and survey plans to be generated. 4. A first step in analysis of distance sampling data is modelling the probability of detection. Distance contains three increasingly sophisticated analysis engines for this: conventional distance sampling, which models detection probability as a function of distance from the transect and assumes all objects at zero distance are detected; multiple-covariate distance sampling, which allows covariates in addition to distance; and mark-recapture distance sampling, which relaxes the assumption of certain detection at zero distance. 5. All three engines allow estimation of density or abundance, stratified if required, with associated measures of precision calculated either analytically or via the bootstrap. 6. Advanced analysis topics covered include the use of multipliers to allow analysis of indirect surveys (such as dung or nest surveys), the density surface modelling analysis engine for spatial and habitat modelling, and information about accessing the analysis engines directly from other software. 7. Synthesis and applications. Distance sampling is a key method for producing abundance and density estimates in challenging field conditions. The theory underlying the methods continues to expand to cope with realistic estimation situations. In step with theoretical developments, state-of-the-art software that implements these methods is described that makes the methods
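As a toy illustration of the "conventional distance sampling" engine described above (this is not the Distance software itself, and all parameter values are invented): the half-normal detection function has a closed-form maximum-likelihood fit, after which density follows from the effective strip half-width.

```python
import numpy as np

def halfnormal_line_transect(distances, total_line_length):
    """Conventional distance sampling with a half-normal detection
    function g(x) = exp(-x^2 / (2 sigma^2)) and no truncation.
    Returns (sigma_hat, effective strip half-width mu, density)."""
    x = np.asarray(distances, dtype=float)
    sigma2 = np.mean(x ** 2)                # closed-form MLE for sigma^2
    mu = np.sqrt(np.pi * sigma2 / 2.0)      # effective strip half-width
    density = len(x) / (2.0 * total_line_length * mu)
    return np.sqrt(sigma2), mu, density

# simulated check: perpendicular distances observed under a half-normal
# detection function with sigma = 2 are themselves half-normal(sigma = 2)
rng = np.random.default_rng(7)
obs = np.abs(rng.normal(0.0, 2.0, size=20000))
sigma_hat, mu, density = halfnormal_line_transect(obs, total_line_length=1000.0)
print(round(sigma_hat, 2), round(mu, 2))
```

The effective half-width mu plays the role of a "fully surveyed" strip: dividing the count by 2 × line length × mu corrects the density for animals missed far from the transect.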

  20. Design and synthesis of target-responsive aptamer-cross-linked hydrogel for visual quantitative detection of ochratoxin A.

    PubMed

    Liu, Rudi; Huang, Yishun; Ma, Yanli; Jia, Shasha; Gao, Mingxuan; Li, Jiuxing; Zhang, Huimin; Xu, Dunming; Wu, Min; Chen, Yan; Zhu, Zhi; Yang, Chaoyong

    2015-04-01

    A target-responsive aptamer-cross-linked hydrogel was designed and synthesized for portable and visual quantitative detection of the toxin Ochratoxin A (OTA), which occurs in food and beverages. The hydrogel network forms by hybridization between one designed DNA strand containing the OTA aptamer and two complementary DNA strands grafting on linear polyacrylamide chains. Upon the introduction of OTA, the aptamer binds with OTA, leading to the dissociation of the hydrogel, followed by release of the preloaded gold nanoparticles (AuNPs), which can be observed by the naked eye. To enable sensitive visual and quantitative detection, we encapsulated Au@Pt core-shell nanoparticles (Au@PtNPs) in the hydrogel to generate quantitative readout in a volumetric bar-chart chip (V-Chip). In the V-Chip, Au@PtNPs catalyzes the oxidation of H2O2 to generate O2, which induces movement of an ink bar to a concentration-dependent distance for visual quantitative readout. Furthermore, to improve the detection limit in complex real samples, we introduced an immunoaffinity column (IAC) of OTA to enrich OTA from beer. After the enrichment, as low as 1.27 nM (0.51 ppb) OTA can be detected by the V-Chip, which satisfies the test requirement (2.0 ppb) by the European Commission. The integration of a target-responsive hydrogel with portable enrichment by IAC, as well as signal amplification and quantitative readout by a simple microfluidic device, offers a new method for portable detection of food safety hazard toxin OTA.

  1. Single-Camera Trap Survey Designs Miss Detections: Impacts on Estimates of Occupancy and Community Metrics

    PubMed Central

    Nielsen, Clayton K.; Holzmueller, Eric J.

    2016-01-01

    The use of camera traps as a tool for studying wildlife populations is commonplace. However, few have considered how the number of detections of wildlife differs depending upon the number of camera traps placed at camera-sites, and how this impacts estimates of occupancy and community composition. During December 2015–February 2016, we deployed four camera traps per camera-site, separated into treatment groups of one, two, and four camera traps, in southern Illinois to compare whether estimates of wildlife community metrics and occupancy probabilities differed among survey methods. The overall number of species detected per camera-site was greatest with the four-camera survey method (P<0.0184). The four-camera survey method detected 1.25 more species per camera-site than the one-camera survey method, and was the only survey method to completely detect the ground-dwelling silvicolous community. The four-camera survey method recorded individual species at 3.57 additional camera-sites (P = 0.003) and nearly doubled the number of camera-sites where white-tailed deer (Odocoileus virginianus) were detected compared to one- and two-camera survey methods. We also compared occupancy rates estimated by survey methods; as the number of cameras deployed per camera-site increased, occupancy estimates were closer to naïve estimates, detection probabilities increased, and standard errors of detection probabilities decreased. Additionally, each survey method resulted in differing top-ranked, species-specific occupancy models when habitat covariates were included. Underestimates of occurrence and misrepresented community metrics can have significant impacts on species of conservation concern, particularly in areas where habitat manipulation is likely. Having multiple camera traps per site revealed significant shortcomings with the common one-camera trap survey method. While we realize survey design is often constrained logistically, we suggest increasing effort to at least

  2. Single-Camera Trap Survey Designs Miss Detections: Impacts on Estimates of Occupancy and Community Metrics.

    PubMed

    Pease, Brent S; Nielsen, Clayton K; Holzmueller, Eric J

    2016-01-01

    The use of camera traps as a tool for studying wildlife populations is commonplace. However, few have considered how the number of detections of wildlife differs depending upon the number of camera traps placed at camera-sites, and how this impacts estimates of occupancy and community composition. During December 2015-February 2016, we deployed four camera traps per camera-site, separated into treatment groups of one, two, and four camera traps, in southern Illinois to compare whether estimates of wildlife community metrics and occupancy probabilities differed among survey methods. The overall number of species detected per camera-site was greatest with the four-camera survey method (P<0.0184). The four-camera survey method detected 1.25 more species per camera-site than the one-camera survey method, and was the only survey method to completely detect the ground-dwelling silvicolous community. The four-camera survey method recorded individual species at 3.57 additional camera-sites (P = 0.003) and nearly doubled the number of camera-sites where white-tailed deer (Odocoileus virginianus) were detected compared to one- and two-camera survey methods. We also compared occupancy rates estimated by survey methods; as the number of cameras deployed per camera-site increased, occupancy estimates were closer to naïve estimates, detection probabilities increased, and standard errors of detection probabilities decreased. Additionally, each survey method resulted in differing top-ranked, species-specific occupancy models when habitat covariates were included. Underestimates of occurrence and misrepresented community metrics can have significant impacts on species of conservation concern, particularly in areas where habitat manipulation is likely. Having multiple camera traps per site revealed significant shortcomings with the common one-camera trap survey method. While we realize survey design is often constrained logistically, we suggest increasing effort to at least
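The detection shortfall driving these results can be seen in a small simulation (the occupancy ψ and per-visit detection probability p below are invented, not the study's estimates): with imperfect detection, the naive estimate, i.e. the fraction of sites with at least one detection, underestimates true occupancy, and the shortfall shrinks as visits or cameras are added.

```python
import numpy as np

rng = np.random.default_rng(1)
psi, p = 0.6, 0.3                  # true occupancy and per-visit detection
n_sites, n_visits = 20000, 3

occupied = rng.random(n_sites) < psi
detected = occupied[:, None] & (rng.random((n_sites, n_visits)) < p)
naive = detected.any(axis=1).mean()

# expected naive estimate under this model: psi * (1 - (1 - p)^K)
expected = psi * (1.0 - (1.0 - p) ** n_visits)
print(round(naive, 3), round(expected, 3))
```

With p = 0.3 and three visits, roughly a third of truly occupied sites yield no detection at all, which is why occupancy models that estimate p separately (as in the study) are needed rather than the naive proportion.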

  3. What Value Can Qualitative Research Add to Quantitative Research Design? An Example From an Adolescent Idiopathic Scoliosis Trial Feasibility Study.

    PubMed

    Toye, Francine; Williamson, Esther; Williams, Mark A; Fairbank, Jeremy; Lamb, Sarah E

    2016-08-09

    Using an example of qualitative research embedded in a non-surgical feasibility trial, we explore the benefits of including qualitative research in trial design and reflect on epistemological challenges. We interviewed 18 trial participants and used methods of Interpretive Phenomenological Analysis. Our findings demonstrate that qualitative research can make a valuable contribution by allowing trial stakeholders to see things from alternative perspectives. Specifically, it can help to make specific recommendations for improved trial design, generate questions which contextualize findings, and also explore disease experience beyond the trial. To make the most out of qualitative research embedded in quantitative design it would be useful to (a) agree specific qualitative study aims that underpin research design, (b) understand the impact of differences in epistemological truth claims, (c) provide clear thematic interpretations for trial researchers to utilize, and (d) include qualitative findings that explore experience beyond the trial setting within the impact plan.

  4. Electron-density descriptors as predictors in quantitative structure-activity/property relationships and drug design.

    PubMed

    Matta, Chérif F; Arabi, Alya A

    2011-06-01

    The use of electron density-based molecular descriptors in drug research, particularly in quantitative structure-activity relationship/quantitative structure-property relationship studies, is reviewed. The exposition starts with a discussion of molecular similarity and transferability in terms of the underlying electron density, which leads to a qualitative introduction to the quantum theory of atoms in molecules (QTAIM). The starting point of QTAIM is the topological analysis of the molecular electron-density distribution to extract atomic and bond properties that characterize every atom and bond in the molecule. These atomic and bond properties have considerable potential as bases for the construction of robust quantitative structure-activity/property relationship models, as shown by selected examples in this review. QTAIM is applicable to the electron density calculated from quantum-chemical calculations and/or that obtained from ultra-high-resolution X-ray diffraction experiments followed by nonspherical refinement. Atomic and bond properties are introduced, followed by examples of application of each of these two families of descriptors. The review ends with a study whereby the molecular electrostatic potential, uniquely determined by the density, is used in conjunction with atomic properties to elucidate the reasons for the biological similarity of bioisosteres.

  5. A survey of scientific literacy to provide a foundation for designing science communication in Japan.

    PubMed

    Kawamoto, Shishin; Nakayama, Minoru; Saijo, Miki

    2013-08-01

    There are various definitions and survey methods for scientific literacy. Taking into consideration the contemporary significance of scientific literacy, we have defined it with an emphasis on its social aspects. To acquire the insights needed to design a form of science communication that will enhance the scientific literacy of each individual, we conducted a large-scale random survey within Japan of individuals older than 18 years, using a printed questionnaire. The data thus acquired were analyzed using factor analysis and cluster analysis to create a 3-factor/4-cluster model of people's interest and attitude toward science, technology and society and their resulting tendencies. Differences were found among the four clusters in terms of the three factors: scientific factor, social factor, and science-appreciating factor. We propose a plan for designing a form of science communication that is appropriate to this current status of scientific literacy in Japan.
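The factor-then-cluster workflow described above can be sketched with NumPy alone, using PCA as a stand-in for factor analysis plus a small k-means loop. All data and dimensions below are synthetic placeholders; the study analyzed real questionnaire responses with proper factor analysis:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical stand-in for questionnaire data: 1000 respondents x 12 Likert items (1-5).
X = rng.integers(1, 6, size=(1000, 12)).astype(float)

# --- Component extraction via PCA (a common approximation to factor analysis) ---
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:3].T            # respondent scores on 3 components

# --- Simple k-means (k=4) on the component scores ---
def kmeans(data, k, iters=50, seed=0):
    r = np.random.default_rng(seed)
    centers = data[r.choice(len(data), k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest center, then recompute centers
        d = np.linalg.norm(data[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = data[labels == j].mean(axis=0)
    return labels

labels = kmeans(scores, 4)
print(scores.shape, np.bincount(labels, minlength=4))
```

On real survey data one would rotate the loadings and inspect item content to name the factors (e.g. scientific, social, science-appreciating) before clustering.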

  6. Composite Interval Mapping Based on Lattice Design for Error Control May Increase Power of Quantitative Trait Locus Detection

    PubMed Central

    Huang, Zhongwen; Zhao, Tuanjie; Xing, Guangnan; Gai, Junyi; Guan, Rongzhan

    2015-01-01

    Experimental error control is very important in quantitative trait locus (QTL) mapping. Although numerous statistical methods have been developed for QTL mapping, a QTL detection model based on an appropriate experimental design that emphasizes error control has not been developed. Lattice design is very suitable for experiments with large sample sizes, which are usually required for accurate mapping of quantitative traits. However, the lack of a QTL mapping method based on lattice design has meant that the arithmetic mean or adjusted mean of each line's observations has had to be used as the response variable, resulting in low QTL detection power. As an improvement, we developed a QTL mapping method termed composite interval mapping based on lattice design (CIMLD). In the lattice design, experimental errors are decomposed into random errors and block-within-replication errors. Four levels of block-within-replication error were simulated to show the power of QTL detection under different error controls. The simulation results showed that the arithmetic mean method, which is equivalent to a method under randomized complete block design (RCBD), was very sensitive to the size of the block variance: as block variance increased, the power of QTL detection decreased from 51.3% to 9.4%. In contrast to the RCBD method, the power of CIMLD and the adjusted mean method did not change with block variance. The CIMLD method showed 1.2- to 7.6-fold higher power of QTL detection than the arithmetic or adjusted mean methods. Our proposed method was applied to real soybean (Glycine max) data as an example, and 10 QTLs for biomass were identified that explained 65.87% of the phenotypic variation, while only three and two QTLs were identified by the arithmetic and adjusted mean methods, respectively. PMID:26076140
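The effect the abstract describes, arithmetic line means losing power as block-within-replication variance grows while block-adjusted means do not, can be reproduced with a toy simulation. This is a single-marker t-test sketch, not the CIMLD model, and every parameter value below is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

def detection_rates(block_sd, n_lines=100, n_reps=3, n_blocks=20,
                    qtl_effect=0.5, n_sim=200):
    """Fraction of simulations in which a marker effect is detected (|t| > 1.96)
    using arithmetic vs. block-adjusted line means."""
    hits_arith = hits_adj = 0
    for _ in range(n_sim):
        geno = rng.integers(0, 2, n_lines)                  # marker genotype per line
        assign = rng.integers(0, n_blocks, (n_lines, n_reps))
        beff = rng.normal(0, block_sd, (n_blocks, n_reps))  # block-within-rep effects
        obs = (qtl_effect * geno[:, None]
               + beff[assign, np.arange(n_reps)]
               + rng.normal(0, 1.0, (n_lines, n_reps)))
        # block-adjusted observations: subtract each block's mean within a rep
        adj = obs.copy()
        for r in range(n_reps):
            for blk in range(n_blocks):
                m = assign[:, r] == blk
                if m.any():
                    adj[m, r] -= adj[m, r].mean()

        def tstat(y):
            a, c = y[geno == 1], y[geno == 0]
            return (a.mean() - c.mean()) / np.sqrt(
                a.var(ddof=1) / len(a) + c.var(ddof=1) / len(c))

        hits_arith += abs(tstat(obs.mean(axis=1))) > 1.96
        hits_adj += abs(tstat(adj.mean(axis=1))) > 1.96
    return hits_arith / n_sim, hits_adj / n_sim

results = {sd: detection_rates(sd) for sd in (0.0, 3.0)}
for sd, (arith, adjusted) in results.items():
    print(f"block sd {sd}: arithmetic power {arith:.2f}, adjusted power {adjusted:.2f}")
```

With no block variance the two estimators perform similarly; as block variance grows, unmodeled block effects act as extra residual noise for the arithmetic means, while the adjusted means remain nearly unaffected.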

  7. Creel survey sampling designs for estimating effort in short-duration Chinook salmon fisheries

    USGS Publications Warehouse

    McCormick, Joshua L.; Quist, Michael C.; Schill, Daniel J.

    2013-01-01

    Chinook Salmon Oncorhynchus tshawytscha sport fisheries in the Columbia River basin are commonly monitored using roving creel survey designs and require precise, unbiased catch estimates. The objective of this study was to examine the relative bias and precision of total catch estimates using various sampling designs to estimate angling effort under the assumption that mean catch rate was known. We obtained information on angling populations based on direct visual observations of portions of Chinook Salmon fisheries in three Idaho river systems over a 23-d period. Based on the angling population, Monte Carlo simulations were used to evaluate the properties of effort and catch estimates for each sampling design. All sampling designs evaluated were relatively unbiased. Systematic random sampling (SYS) resulted in the most precise estimates. The SYS and simple random sampling designs had mean square error (MSE) estimates that were generally half of those observed with cluster sampling designs. The SYS design was more efficient (i.e., higher accuracy per unit cost) than a two-cluster design. Increasing the number of clusters available for sampling within a day decreased the MSE of estimates of daily angling effort, but the MSE of total catch estimates was variable depending on the fishery. The results of our simulations provide guidelines on the relative influence of sample sizes and sampling designs on parameters of interest in short-duration Chinook Salmon fisheries.
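The design comparison can be sketched with a toy Monte Carlo in the spirit of the study. The diurnal effort curve and all parameters below are invented, not taken from the Idaho fisheries:

```python
import numpy as np

rng = np.random.default_rng(2)
HOURS = 16        # fishable hours in a day
N_SAMPLE = 4      # hours actually surveyed
PERIOD = HOURS // N_SAMPLE

srs_err, sys_err = [], []
for day in range(2000):
    # hypothetical diurnal effort curve (angler-hours per hour):
    # smooth midday peak plus count noise
    t = np.arange(HOURS)
    effort = 50 * np.exp(-0.5 * ((t - 8) / 4) ** 2) + rng.poisson(5, HOURS)
    truth = effort.sum()
    # simple random sample of hours
    idx = rng.choice(HOURS, N_SAMPLE, replace=False)
    srs_err.append(effort[idx].mean() * HOURS - truth)
    # systematic sample: random start, then every PERIOD-th hour
    start = rng.integers(0, PERIOD)
    sys_idx = np.arange(start, HOURS, PERIOD)
    sys_err.append(effort[sys_idx].mean() * HOURS - truth)

srs_mse = float(np.mean(np.square(srs_err)))
sys_mse = float(np.mean(np.square(sys_err)))
print(f"SRS MSE: {srs_mse:.0f}   systematic MSE: {sys_mse:.0f}")
```

Because the systematic sample spreads observations evenly across the diurnal trend, its estimates of daily effort have a much smaller mean square error than simple random sampling, consistent with the pattern the study reports.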

  8. Trajectory Design to Mitigate Risk on the Transiting Exoplanet Survey Satellite (TESS) Mission

    NASA Technical Reports Server (NTRS)

    Dichmann, Donald

    2016-01-01

    The Transiting Exoplanet Survey Satellite (TESS) will employ a highly eccentric Earth orbit, in 2:1 lunar resonance, reached with a lunar flyby preceded by 3.5 phasing loops. The TESS mission has limited propellant and several orbit constraints. Based on analysis and simulation, we have designed the phasing loops to reduce delta-V and to mitigate risk due to maneuver execution errors. We have automated the trajectory design process and use distributed processing to generate and to optimize nominal trajectories, check constraint satisfaction, and finally model the effects of maneuver errors to identify trajectories that best meet the mission requirements.

  9. The GABRIEL Advanced Surveys: study design, participation and evaluation of bias.

    PubMed

    Genuneit, Jon; Büchele, Gisela; Waser, Marco; Kovacs, Katalin; Debinska, Anna; Boznanski, Andrzej; Strunz-Lehner, Christine; Horak, Elisabeth; Cullinan, Paul; Heederik, Dick; Braun-Fahrländer, Charlotte; von Mutius, Erika

    2011-09-01

    Exposure to farming environments has been shown to protect substantially against asthma and atopic disease across Europe and in other parts of the world. The GABRIEL Advanced Surveys (GABRIELA) were conducted to determine factors in farming environments which are fundamental to protecting against asthma and atopic disease. The GABRIEL Advanced Surveys have a multi-phase stratified design. In a first screening phase, a comprehensive population-based survey was conducted to assess the prevalence of exposure to farming environments and of asthma and atopic diseases (n = 103,219). The second phase was designed to ascertain detailed exposure to farming environments and to collect biomaterial and environmental samples in a stratified random sample of phase 1 participants (n = 15,255). A third phase was carried out in a further stratified sample only in Bavaria, southern Germany, aiming at in-depth respiratory disease and exposure assessment including extensive environmental sampling (n = 895). Participation rates in phase 1 were around 60%, but only about half of the participating study population consented to further study modules in phase 2. We found that consenting behaviour was related to familial allergies, high parental education, wheeze, doctor-diagnosed asthma and rhinoconjunctivitis, and to a lesser extent to exposure to farming environments. The association of exposure to farm environments with asthma or rhinoconjunctivitis was not biased by participation or consenting behaviour. The GABRIEL Advanced Surveys are one of the largest studies to shed light on the protective 'farm effect' on asthma and atopic disease. Bias with regard to the main study question could be ruled out given the representativeness and high participation rates in phases 2 and 3. The GABRIEL Advanced Surveys have created extensive collections of questionnaire data, biomaterial and environmental samples, promising new insights into this area of research.

  10. Design and Implementation Issues in Surveying the Views of Young Children in Ethnolinguistically Diverse Developing Country Contexts

    ERIC Educational Resources Information Center

    Smith, Hilary A.; Haslett, Stephen J.

    2016-01-01

    This paper discusses issues in the development of a methodology appropriate for eliciting sound quantitative data from primary school children in the complex contexts of ethnolinguistically diverse developing countries. Although these issues often occur in field-based surveys, the large extent and compound effects of their occurrence in…

  11. Quantitative evaluation of water bodies dynamic by means of thermal infrared and multispectral surveys on the Venetian lagoon

    NASA Technical Reports Server (NTRS)

    Alberotanza, L.; Lechi, G. M.

    1977-01-01

    Surveys employing a two channel Daedalus infrared scanner and multispectral photography were performed. The spring waning tide, the velocity of the water mass, and the types of suspended matter were among the topics studied. Temperature, salinity, sediment transport, and ebb stream velocity were recorded. The bottom topography was correlated with the dynamic characteristics of the sea surface.

  12. [Perception, translation and current use of English terms on the part of Italian non-physician healthcare personnel: a qualitative and quantitative survey].

    PubMed

    Conti, Andrea A

    2008-11-01

    The study of the use of English for medicine has become a continual source of enquiry. The aim of this survey was the systematic evaluation of the qualitative and quantitative perception, translation and current use of English terms on the part of Italian health operators. Eight English terms directly connected with the health scenario or related to it ("compliance", "imaging", "likelihood", "odds ratio", "outcome", "stent", "test", "trial") were selected and, by means of a paper registration form, administered to forty Italian health professionals (non-physicians), already active in the health sector and attending specialised health degree courses. The participants were asked to furnish up to two translational proposals for every single English term, and, after the written registration, there followed a structured oral discussion of the translation, perception and everyday use of the English terms in the working reality of the participants. This survey provides a scientific "real world" experience, and its qualitative and quantitative findings are of use in evaluating the correctness of the adoption of English terms by health operators.

  13. Importance of Survey Design for Studying the Epidemiology of Emerging Tobacco Product Use Among Youth.

    PubMed

    Delnevo, Cristine D; Gundersen, Daniel A; Manderski, Michelle T B; Giovenco, Daniel P; Giovino, Gary A

    2017-03-22

    Accurate surveillance is critical for monitoring the epidemiology of emerging tobacco products in the United States, and survey science suggests that survey response format can impact prevalence estimates. We utilized data from the 2014 New Jersey Youth Tobacco Survey (n = 3,909) to compare estimates of the prevalence of 4 behaviors (ever hookah use, current hookah use, ever e-cigarette use, and current e-cigarette use) among New Jersey high school students, as assessed using "check-all-that-apply" questions, with estimates measured by means of "forced-choice" questions. Measurement discrepancies were apparent for all 4 outcomes, with the forced-choice questions yielding prevalence estimates approximately twice those of the check-all-that-apply questions, and agreement was fair to moderate. The sensitivity of the check-all-that-apply questions, treating the forced-choice format as the "gold standard," ranged from 38.1% (current hookah use) to 58.3% (ever e-cigarette use), indicating substantial false-negative rates. These findings highlight the impact of question response format on prevalence estimates of emerging tobacco products among youth and suggest that estimates generated by means of check-all-that-apply questions may be biased downward. Alternative survey designs should be considered to avoid check-all-that-apply response formats, and researchers should use caution when interpreting tobacco use data obtained from check-all-that-apply formats.
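The relationship between the two response formats can be made concrete with a 2x2 cross-classification. The counts below are hypothetical, chosen only to reproduce the qualitative pattern reported (roughly doubled prevalence under forced choice, fair-to-moderate agreement); they are not the New Jersey survey data:

```python
# Hypothetical 2x2 cross-classification for one behavior:
# rows = forced-choice response, cols = check-all-that-apply response.
a, b = 80, 120   # forced-choice "yes": flagged / missed by check-all-that-apply
c, d = 10, 3700  # forced-choice "no":  flagged / not flagged

n = a + b + c + d
sensitivity = a / (a + b)            # check-all, treating forced choice as gold standard
prev_forced = (a + b) / n
prev_check = (a + c) / n

# Cohen's kappa for chance-corrected agreement
po = (a + d) / n
pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2
kappa = (po - pe) / (1 - pe)
print(f"sensitivity={sensitivity:.2f}  "
      f"prevalence ratio={prev_forced / prev_check:.1f}  kappa={kappa:.2f}")
```

With these illustrative counts the forced-choice prevalence is about twice the check-all-that-apply prevalence, sensitivity is 40%, and kappa falls in the "moderate" band, mirroring the pattern described in the abstract.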

  14. Microbial-based evaluation of foaming events in full-scale wastewater treatment plants by microscopy survey and quantitative image analysis.

    PubMed

    Leal, Cristiano; Amaral, António Luís; Costa, Maria de Lourdes

    2016-08-01

    Activated sludge (AS) systems are prone to foaming occurrences that cause the sludge to rise in the reactor and affect wastewater treatment plant (WWTP) performance. Nonetheless, there is currently a knowledge gap hindering the development of foaming-event prediction tools, one that may be filled by the quantitative monitoring of AS system biota and sludge characteristics. As such, the present study focuses on the assessment of foaming events in full-scale WWTPs by quantitative protozoa, metazoa, filamentous bacteria, and sludge characteristics analysis, further used to clarify the inner relationships between these parameters. In the current study, a conventional activated sludge (CAS) system and an oxidation ditch (OD) were surveyed over periods of 2 and 3 months, respectively, regarding their biota and sludge characteristics. The biota community was monitored by microscopic observation, and a new filamentous bacteria index was developed to quantify their occurrence. Sludge characteristics (aggregated and filamentous biomass contents and aggregate size) were determined by quantitative image analysis (QIA). The obtained data were then processed by principal components analysis (PCA), cross-correlation analysis, and decision trees to assess the foaming occurrences and elucidate the inner relationships. It was found that such events were best assessed by the combined use of the relative abundance of testate amoebae and the nocardioform filamentous index, presenting a 92.9 % success rate for overall foaming events, and 87.5 and 100 %, respectively, for persistent and mild events.

  15. Wide-Field InfraRed Survey Telescope (WFIRST) Slitless Spectrometer: Design, Prototype, and Results

    NASA Technical Reports Server (NTRS)

    Gong, Qian; Content, David; Dominguez, Margaret; Emmett, Thomas; Griesmann, Ulf; Hagopian, John; Kruk, Jeffrey; Marx, Catherine; Pasquale, Bert; Wallace, Thomas; Whipple, Arthur

    2016-01-01

    The slitless spectrometer plays an important role in the Wide-Field InfraRed Survey Telescope (WFIRST) mission for the survey of emission-line galaxies. This will be an unprecedented, very-wide-field, HST-quality 3D survey of emission-line galaxies. The concept of the compound grism as a slitless spectrometer has been presented previously. This paper briefly discusses the challenges and solutions of the optical design, recent specification updates, and a brief comparison between the prototype and the latest design. However, the emphasis of this paper is the progress of the grism prototype: the fabrication and testing of the complicated diffractive optical elements and powered prism, as well as grism assembly alignment and testing, in particular how different tools and methods, such as IR phase-shift and wavelength-shift interferometry, were used to complete the element and assembly tests. The paper also presents very encouraging recent results from element and assembly tests. Finally, we briefly outline the plan to test spectral characteristics such as spectral resolution and response.

  16. Quantitatively Mapping Cellular Viscosity with Detailed Organelle Information via a Designed PET Fluorescent Probe

    PubMed Central

    Liu, Tianyu; Liu, Xiaogang; Spring, David R.; Qian, Xuhong; Cui, Jingnan; Xu, Zhaochao

    2014-01-01

    Viscosity is a fundamental physical parameter that influences diffusion in biological processes. The distribution of intracellular viscosity is highly heterogeneous, and it is challenging to obtain a full map of cellular viscosity with detailed organelle information. In this work, we report 1 as the first fluorescent viscosity probe which is able to quantitatively map cellular viscosity with detailed organelle information based on the PET mechanism. This probe exhibited a significant ratiometric fluorescence intensity enhancement as solvent viscosity increases. The emission intensity increase was attributed to combined effects of the inhibition of PET due to restricted conformational access (favorable for FRET, but not for PET), and the decreased PET efficiency caused by viscosity-dependent twisted intramolecular charge transfer (TICT). A full map of subcellular viscosity was successfully constructed via fluorescent ratiometric detection and fluorescence lifetime imaging; it was found that lysosomal regions in a cell possess the highest viscosity, followed by mitochondrial regions. PMID:24957323

  17. Quantitatively Mapping Cellular Viscosity with Detailed Organelle Information via a Designed PET Fluorescent Probe

    NASA Astrophysics Data System (ADS)

    Liu, Tianyu; Liu, Xiaogang; Spring, David R.; Qian, Xuhong; Cui, Jingnan; Xu, Zhaochao

    2014-06-01

    Viscosity is a fundamental physical parameter that influences diffusion in biological processes. The distribution of intracellular viscosity is highly heterogeneous, and it is challenging to obtain a full map of cellular viscosity with detailed organelle information. In this work, we report 1 as the first fluorescent viscosity probe which is able to quantitatively map cellular viscosity with detailed organelle information based on the PET mechanism. This probe exhibited a significant ratiometric fluorescence intensity enhancement as solvent viscosity increases. The emission intensity increase was attributed to combined effects of the inhibition of PET due to restricted conformational access (favorable for FRET, but not for PET), and the decreased PET efficiency caused by viscosity-dependent twisted intramolecular charge transfer (TICT). A full map of subcellular viscosity was successfully constructed via fluorescent ratiometric detection and fluorescence lifetime imaging; it was found that lysosomal regions in a cell possess the highest viscosity, followed by mitochondrial regions.

  18. Quantitatively mapping cellular viscosity with detailed organelle information via a designed PET fluorescent probe.

    PubMed

    Liu, Tianyu; Liu, Xiaogang; Spring, David R; Qian, Xuhong; Cui, Jingnan; Xu, Zhaochao

    2014-06-24

    Viscosity is a fundamental physical parameter that influences diffusion in biological processes. The distribution of intracellular viscosity is highly heterogeneous, and it is challenging to obtain a full map of cellular viscosity with detailed organelle information. In this work, we report 1 as the first fluorescent viscosity probe which is able to quantitatively map cellular viscosity with detailed organelle information based on the PET mechanism. This probe exhibited a significant ratiometric fluorescence intensity enhancement as solvent viscosity increases. The emission intensity increase was attributed to combined effects of the inhibition of PET due to restricted conformational access (favorable for FRET, but not for PET), and the decreased PET efficiency caused by viscosity-dependent twisted intramolecular charge transfer (TICT). A full map of subcellular viscosity was successfully constructed via fluorescent ratiometric detection and fluorescence lifetime imaging; it was found that lysosomal regions in a cell possess the highest viscosity, followed by mitochondrial regions.

  19. Survey of the quality of experimental design, statistical analysis and reporting of research using animals.

    PubMed

    Kilkenny, Carol; Parsons, Nick; Kadyszewski, Ed; Festing, Michael F W; Cuthill, Innes C; Fry, Derek; Hutton, Jane; Altman, Douglas G

    2009-11-30

    For scientific, ethical and economic reasons, experiments involving animals should be appropriately designed, correctly analysed and transparently reported. This increases the scientific validity of the results, and maximises the knowledge gained from each experiment. A minimum amount of relevant information must be included in scientific publications to ensure that the methods and results of a study can be reviewed, analysed and repeated. Omitting essential information can raise scientific and ethical concerns. We report the findings of a systematic survey of reporting, experimental design and statistical analysis in published biomedical research using laboratory animals. Medline and EMBASE were searched for studies reporting research on live rats, mice and non-human primates carried out in UK and US publicly funded research establishments. Detailed information was collected from 271 publications, about the objective or hypothesis of the study, the number, sex, age and/or weight of animals used, and experimental and statistical methods. Only 59% of the studies stated the hypothesis or objective of the study and the number and characteristics of the animals used. Appropriate and efficient experimental design is a critical component of high-quality science. Most of the papers surveyed did not use randomisation (87%) or blinding (86%) to reduce bias in animal selection and outcome assessment. Only 70% of the publications that used statistical methods described their methods and presented the results with a measure of error or variability. This survey has identified a number of issues that need to be addressed in order to improve experimental design and reporting in publications describing research using animals. Scientific publication is a powerful and important source of information; the authors of scientific publications therefore have a responsibility to describe their methods and results comprehensively, accurately and transparently, and peer reviewers and…

  20. "Intelligent design" of a 3D reflection survey for the SAFOD drill-hole site

    NASA Astrophysics Data System (ADS)

    Alvarez, G.; Hole, J. A.; Klemperer, S. L.; Biondi, B.; Imhof, M.

    2003-12-01

    SAFOD seeks to better understand the earthquake process by drilling through the San Andreas fault (SAF) to sample an earthquake in situ. To capitalize fully on the opportunities presented by the 1D drill-hole into a complex fault zone we must characterize the surrounding 3D geology at a scale commensurate with the drilling observations, to provide the structural context to extrapolate 1D drilling results along the fault plane and into the surrounding 3D volume. Excellent active 2D and passive 3D seismic observations, completed and underway, lack the detailed 3D resolution required. Only an industry-quality 3D reflection survey can provide c. 25 m subsurface sample-spacing horizontally and vertically. A 3D reflection survey will provide subsurface structural and stratigraphic control at the 100-m level, mapping major geologic units, structural boundaries, and subsurface relationships between the many faults that make up the SAF fault system. A principal objective should be a reflection-image (horizon-slice through the 3D volume) of the near-vertical fault plane(s) to show variations in physical properties around the drill-hole. Without a 3D reflection image of the fault zone, we risk interpreting drilled anomalies as ubiquitous properties of the fault, or risk missing important anomalies altogether. Such a survey cannot be properly costed or technically designed without major planning. "Intelligent survey design" can minimize source and receiver effort without compromising data quality at the fault target. Such optimization can in principle reduce the cost of a 3D seismic survey by a factor of two or three, utilizing the known surface logistic constraints, partially-known subsurface velocity field, and the suite of scientific targets at SAFOD. Our methodology poses the selection of the survey parameters as an optimization process that allows the parameters to vary spatially in response to changes in the subsurface. The acquisition geometry is locally optimized for…

  1. Simulation of complete seismic surveys for evaluation of experiment design and processing

    SciTech Connect

    Oezdenvar, T.; McMechan, G.A.; Chaney, P.

    1996-03-01

    Synthesis of complete seismic survey data sets allows analysis and optimization of all stages in an acquisition/processing sequence. The characteristics of available survey designs, parameter choices, and processing algorithms may be evaluated prior to field acquisition to produce a composite system in which all stages have compatible performance; this maximizes the cost effectiveness for a given level of accuracy, or for targets with specific characteristics. Data sets synthesized for three salt structures provide representative comparisons of time and depth migration, post-stack and prestack processing, and illustrate effects of varying recording aperture and shot spacing, iterative focusing analysis, and the interaction of migration algorithms with recording aperture. A final example demonstrates successful simulation of both 2-D acquisition and processing of a real data line over a salt pod in the Gulf of Mexico.

  2. KUIPER BELT OBJECT OCCULTATIONS: EXPECTED RATES, FALSE POSITIVES, AND SURVEY DESIGN

    SciTech Connect

    Bickerton, S. J.; Welch, D. L.; Kavelaars, J. J.

    2009-05-15

    A novel method of generating artificial scintillation noise is developed and used to evaluate occultation rates and false-positive rates for surveys probing the Kuiper Belt with the method of serendipitous stellar occultations. A thorough examination of survey design shows that (1) diffraction-dominated occultations are critically (Nyquist) sampled at a rate of 2 Fsu⁻¹ (two samples per Fresnel scale unit), corresponding to 40 s⁻¹ for objects at 40 AU, (2) occultation detection rates are maximized when targets are observed at solar opposition, (3) Main Belt asteroids will produce occultation light curves identical to those of Kuiper Belt Objects (KBOs) if target stars are observed at solar elongations of 116° ≲ ε ≲ 125° or 131° ≲ ε ≲ 141°, and (4) genuine KBO occultations are likely to be so rare that a detection threshold of ≳7-8σ should be adopted to ensure that viable candidate events can be disentangled from false positives.
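The quoted Nyquist rate can be sanity-checked from the Fresnel scale. A short back-of-envelope sketch under assumed values (500 nm light, observation at opposition; neither value is taken from the abstract):

```python
import math

AU = 1.495978707e11          # m
lam = 500e-9                 # m, assumed observing wavelength
d = 40 * AU                  # distance to a 40 AU object

fsu = math.sqrt(lam * d / 2)              # Fresnel scale unit, ~1.2 km
v_earth = 29.8e3                          # Earth's orbital speed, m/s
v_kbo = v_earth / math.sqrt(40)           # circular speed at 40 AU (vis-viva)
v_rel = v_earth - v_kbo                   # retrograde shadow speed at opposition

rate = 2 * v_rel / fsu                    # samples/s for 2 samples per Fsu
print(f"Fresnel scale ~ {fsu:.0f} m, shadow speed ~ {v_rel/1e3:.1f} km/s, "
      f"Nyquist rate ~ {rate:.0f} per second")
```

This reproduces the abstract's figure of roughly 40 samples per second for objects at 40 AU.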

  3. Large-visual-angle microstructure inspired from quantitative design of Morpho butterflies' lamellae deviation using the FDTD/PSO method.

    PubMed

    Wang, Wanlin; Zhang, Wang; Chen, Weixin; Gu, Jiajun; Liu, Qinglei; Deng, Tao; Zhang, Di

    2013-01-15

    The wide angular range of the treelike structure in Morpho butterfly scales was investigated by finite-difference time-domain (FDTD)/particle-swarm-optimization (PSO) analysis. Using the FDTD method, different parameters in the Morpho butterflies' treelike structure were studied and their contributions to the angular dependence were analyzed. Then a wide angular range was realized by the PSO method from quantitatively designing the lamellae deviation (Δy), which was a crucial parameter with angular range. The field map of the wide-range reflection in a large area was given to confirm the wide angular range. The tristimulus values and corresponding color coordinates for various viewing directions were calculated to confirm the blue color in different observation angles. The wide angular range realized by the FDTD/PSO method will assist us in understanding the scientific principles involved and also in designing artificial optical materials.
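A minimal particle-swarm optimizer illustrates the PSO half of the FDTD/PSO loop. This is a generic sketch on a toy 2-D objective, not the paper's reflectance-coupled code, and all hyperparameter values are conventional defaults rather than the authors' settings:

```python
import numpy as np

def pso(f, lo, hi, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle-swarm minimization of f over the box [lo, hi]."""
    rng = np.random.default_rng(seed)
    dim = len(lo)
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pval = x.copy(), np.array([f(p) for p in x])
    g = pbest[pval.argmin()].copy()          # global best position
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # inertia + pull toward personal best + pull toward global best
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        val = np.array([f(p) for p in x])
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
        g = pbest[pval.argmin()].copy()
    return g, pval.min()

# toy stand-in for the angular-range objective: minimize a 2-D quadratic
best, val = pso(lambda p: (p[0] - 1) ** 2 + (p[1] + 2) ** 2,
                np.array([-5.0, -5.0]), np.array([5.0, 5.0]))
print(best, val)
```

In the paper's setting, `f` would be an FDTD evaluation of the structure's angular reflectance for a candidate lamellae deviation, which is far more expensive than this toy quadratic.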

  4. Conference Discussion: The Challenges in Multi-Object Spectroscopy Instrument and Survey Design, and in Data Processing and Analysis

    NASA Astrophysics Data System (ADS)

    Balcells, M.; Skillen, I.

    2016-10-01

    The final session of the conference Multi-Object Spectroscopy in the Next Decade: Big Questions, Large Surveys, and Wide Fields, held in La Palma 2-6 March 2015, was devoted to a discussion of the challenges in designing and operating the next-generation survey spectrographs, and planning and carrying out their massive surveys. The wide-ranging 1.5-hour debate was recorded on video tape, and in this paper we report the edited transcription of the dialog.

  5. Loop Shaping Control Design for a Supersonic Propulsion System Model Using Quantitative Feedback Theory (QFT) Specifications and Bounds

    NASA Technical Reports Server (NTRS)

    Connolly, Joseph W.; Kopasakis, George

    2010-01-01

    This paper covers the propulsion system component modeling and controls development of an integrated mixed-compression inlet and turbojet engine that will be used for an overall vehicle Aero-Propulso-Servo-Elastic (APSE) model. Using previously created nonlinear component-level propulsion system models, a linear integrated propulsion system model and loop shaping control design have been developed. The design includes both inlet normal shock position control and jet engine rotor speed control for a potential supersonic commercial transport. A preliminary investigation of the impacts of aero-elastic effects on the incoming flow field to the propulsion system is discussed; however, the focus here is on developing a methodology for the propulsion controls design that prevents unstart in the inlet and minimizes the thrust oscillation experienced by the vehicle. Quantitative Feedback Theory (QFT) specifications and bounds, and aspects of classical loop shaping, are used in the control design process. Model uncertainty is incorporated in the design to address possible error in the system identification mapping of the nonlinear component models into the integrated linear model.
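Checking a frequency-domain bound of the kind used in QFT loop shaping can be illustrated on a toy loop (a hypothetical first-order plant with a PI controller, not the propulsion model from the paper):

```python
import numpy as np

# toy plant G(s) = 1/(s+1) and PI controller C(s) = kp + ki/s
w = np.logspace(-2, 2, 500)     # frequency grid, rad/s
s = 1j * w
G = 1 / (s + 1)
kp, ki = 5.0, 2.0
C = kp + ki / s

L = G * C                       # open-loop transfer function
S = 1 / (1 + L)                 # sensitivity function
peak = np.max(np.abs(S))
# QFT-style check: require |S(jw)| below a magnitude bound at all frequencies
print(f"peak |S| = {peak:.3f}  (bound of 2.0 satisfied: {peak < 2.0})")
```

In an actual QFT design, the bound would vary with frequency and be derived from templates that capture the plant's parametric uncertainty; this sketch only shows the mechanics of evaluating a loop against a sensitivity specification.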

  6. Implementing the World Mental Health Survey Initiative in Portugal – rationale, design and fieldwork procedures

    PubMed Central

    2013-01-01

    Background The World Mental Health Survey Initiative was designed to evaluate the prevalence, the correlates, the impact and the treatment patterns of mental disorders. This paper describes the rationale and the methodological details regarding the implementation of the survey in Portugal, a country that still lacks representative epidemiological data about psychiatric disorders. Methods The World Mental Health Survey is a cross-sectional study with a representative sample of the Portuguese population, aged 18 or older, based on official census information. The WMH-Composite International Diagnostic Interview, adapted to the Portuguese language by a group of bilingual experts, was used to evaluate the mental health status, disorder severity, impairment, use of services and treatment. Interviews were administered face-to-face at respondents' dwellings, which were selected from a nationally representative multi-stage clustered area probability sample of households. The survey was administered using computer-assisted personal interview methods by trained lay interviewers. Data quality was strictly controlled in order to ensure the reliability and validity of the collected information. Results A total of 3,849 people completed the main survey, with 2,060 completing the long interview, with a response rate of 57.3%. Data cleaning was conducted in collaboration with the WMHSI Data Analysis Coordination Centre at the Department of Health Care Policy, Harvard Medical School. Collected information will provide lifetime and 12-month mental disorder diagnoses, according to the International Classification of Diseases and to the Diagnostic and Statistical Manual of Mental Disorders. Conclusions The findings of this study could have a major influence on mental health care policy planning efforts over the next years, especially in a country that still has a significant level of unmet needs regarding mental health services organization, delivery of care and epidemiological…

  7. Detection limits of quantitative and digital PCR assays and their influence in presence-absence surveys of environmental DNA

    USGS Publications Warehouse

    Hunter, Margaret; Dorazio, Robert M.; Butterfield, John S.; Meigs-Friend, Gaia; Nico, Leo; Ferrante, Jason

    2017-01-01

    A set of universal guidelines is needed to determine the limit of detection (LOD) in PCR-based analyses of low concentration DNA. In particular, environmental DNA (eDNA) studies require sensitive and reliable methods to detect rare and cryptic species through shed genetic material in environmental samples. Current strategies for assessing detection limits of eDNA are either too stringent or subjective, possibly resulting in biased estimates of species’ presence. Here, a conservative LOD analysis grounded in analytical chemistry is proposed to correct for overestimated DNA concentrations predominantly caused by the concentration plateau, a nonlinear relationship between expected and measured DNA concentrations. We have used statistical criteria to establish formal mathematical models for both quantitative and droplet digital PCR. To assess the method, a new Grass Carp (Ctenopharyngodon idella) TaqMan assay was developed and tested on both PCR platforms using eDNA in water samples. The LOD adjustment reduced Grass Carp occupancy and detection estimates while increasing uncertainty – indicating that caution needs to be applied to eDNA data without LOD correction. Compared to quantitative PCR, digital PCR had higher occurrence estimates due to increased sensitivity and dilution of inhibitors at low concentrations. Without accurate LOD correction, species occurrence and detection probabilities based on eDNA estimates are prone to a source of bias that cannot be reduced by an increase in sample size or PCR replicates. Other applications also could benefit from a standardized LOD such as GMO food analysis, and forensic and clinical diagnostics.

  8. Detection limits of quantitative and digital PCR assays and their influence in presence-absence surveys of environmental DNA.

    PubMed

    Hunter, Margaret E; Dorazio, Robert M; Butterfield, John S S; Meigs-Friend, Gaia; Nico, Leo G; Ferrante, Jason A

    2017-03-01

    A set of universal guidelines is needed to determine the limit of detection (LOD) in PCR-based analyses of low-concentration DNA. In particular, environmental DNA (eDNA) studies require sensitive and reliable methods to detect rare and cryptic species through shed genetic material in environmental samples. Current strategies for assessing detection limits of eDNA are either too stringent or subjective, possibly resulting in biased estimates of species' presence. Here, a conservative LOD analysis grounded in analytical chemistry is proposed to correct for overestimated DNA concentrations predominantly caused by the concentration plateau, a nonlinear relationship between expected and measured DNA concentrations. We have used statistical criteria to establish formal mathematical models for both quantitative and droplet digital PCR. To assess the method, a new Grass Carp (Ctenopharyngodon idella) TaqMan assay was developed and tested on both PCR platforms using eDNA in water samples. The LOD adjustment reduced Grass Carp occupancy and detection estimates while increasing uncertainty-indicating that caution needs to be applied to eDNA data without LOD correction. Compared to quantitative PCR, digital PCR had higher occurrence estimates due to increased sensitivity and dilution of inhibitors at low concentrations. Without accurate LOD correction, species occurrence and detection probabilities based on eDNA estimates are prone to a source of bias that cannot be reduced by an increase in sample size or PCR replicates. Other applications also could benefit from a standardized LOD such as GMO food analysis and forensic and clinical diagnostics.
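
Both records above concern LOD estimation for low-copy-number PCR. As a minimal sketch of the statistical floor involved (not the authors' model, which additionally corrects for the concentration plateau): under Poisson partitioning of template molecules into reactions, the best-case limit of detection follows directly from the probability that a reaction receives no template at all.

```python
import math

# Sketch, assuming detection is limited only by Poisson sampling of template
# molecules (an idealization; real assays also lose sensitivity to inhibitors
# and amplification failure).
def detection_probability(lam: float) -> float:
    """P(a reaction contains at least one template copy), mean copies = lam."""
    return 1.0 - math.exp(-lam)

def lod(confidence: float = 0.95) -> float:
    """Smallest mean copy number per reaction detected with the given
    probability under the pure-Poisson assumption."""
    return -math.log(1.0 - confidence)

print(round(lod(0.95), 2))  # → 3.0, i.e. ~3 copies per reaction for a 95% LOD
```

This is why replicate reactions matter: below ~3 copies per reaction, some replicates will be template-free no matter how sensitive the chemistry is.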

  9. Designing, Testing, and Validating an Attitudinal Survey on an Environmental Topic: A Groundwater Pollution Survey Instrument for Secondary School Students

    ERIC Educational Resources Information Center

    Lacosta-Gabari, Idoya; Fernandez-Manzanal, Rosario; Sanchez-Gonzalez, Dolores

    2009-01-01

    Research on the assessment of environmental attitudes has increased significantly in recent years. The development of attitude scales targeting specific environmental problems has often been proposed. This paper describes the Groundwater Pollution Test (GPT), a 19-item survey instrument using a Likert-type scale. The survey has been used with…

  10. DESIGN AND APPLICATION OF A STRATIFIED UNEQUAL-PROBABILITY STREAM SURVEY IN THE MID-ATLANTIC COASTAL PLAIN

    EPA Science Inventory

    A stratified random sample with unequal probability selection within strata was used to design a multipurpose survey of headwater watersheds in the Mid-Atlantic Coastal Plain. Objectives for data from the survey include unbiased estimates of regional headwater watershed condition...
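
The unequal-probability design described above yields unbiased regional estimates by weighting each sampled unit by the inverse of its inclusion probability. A minimal sketch of the resulting Horvitz-Thompson estimate of a total, with made-up numbers rather than EPA data:

```python
# Horvitz-Thompson estimator: each sampled watershed's value y_i is weighted
# by 1/pi_i, its probability of being included in the sample. Summing the
# weighted values gives an unbiased estimate of the population total.
def horvitz_thompson_total(values, inclusion_probs):
    return sum(y / p for y, p in zip(values, inclusion_probs))

# Two strata, each with its own (illustrative) inclusion probabilities:
est = horvitz_thompson_total([4.0, 7.0], [0.1, 0.2]) \
    + horvitz_thompson_total([2.0], [0.5])
print(est)  # 4/0.1 + 7/0.2 + 2/0.5 = 79.0
```

Units selected with low probability stand in for many unsampled units, hence their larger weights.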

  11. Integrated siRNA design based on surveying of features associated with high RNAi effectiveness

    PubMed Central

    Gong, Wuming; Ren, Yongliang; Xu, Qiqi; Wang, Yejun; Lin, Dong; Zhou, Haiyan; Li, Tongbin

    2006-01-01

    Background Short interfering RNAs have allowed the development of clean and easily regulated methods for disruption of gene expression. However, while these methods continue to grow in popularity, designing effective siRNA experiments can be challenging. The various existing siRNA design guidelines suffer from two problems: they differ considerably from each other, and they produce high levels of false-positive predictions when tested on data of independent origins. Results Using a distinctly large set of siRNA efficacy data assembled from a vast diversity of origins (the siRecords data, containing records of 3,277 siRNA experiments targeting 1,518 genes, derived from 1,417 independent studies), we conducted extensive analyses of all known features that have been implicated in increasing RNAi effectiveness. A number of features having positive impacts on siRNA efficacy were identified. By performing quantitative analyses on cooperative effects among these features, then applying a disjunctive rule merging (DRM) algorithm, we developed a bundle of siRNA design rule sets with the false positive problem well curbed. A comparison with 15 online siRNA design tools indicated that some of the rule sets we developed surpassed all of these design tools commonly used in siRNA design practice in positive predictive values (PPVs). Conclusion The availability of the large and diverse siRNA dataset from siRecords and the approach we describe in this report have allowed the development of highly effective and generally applicable siRNA design rule sets. Together with ever improving RNAi lab techniques, these design rule sets are expected to make siRNAs a more useful tool for molecular genetics, functional genomics, and drug discovery studies. PMID:17129386
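
The rule sets above are compared by positive predictive value (PPV), the fraction of predicted-effective siRNAs that are actually effective. A minimal sketch of that scoring step, with a hypothetical GC-content rule and toy records standing in for siRecords data:

```python
# PPV = TP / (TP + FP), computed over the records a candidate rule accepts.
# The rule and records below are invented for illustration only.
def ppv(rule, records):
    predicted = [r for r in records if rule(r)]
    if not predicted:
        return 0.0
    true_pos = sum(1 for r in predicted if r["effective"])
    return true_pos / len(predicted)

# Toy rule: moderate GC content is often associated with higher efficacy.
rule = lambda r: 0.30 <= r["gc"] <= 0.55
records = [
    {"gc": 0.40, "effective": True},
    {"gc": 0.50, "effective": True},
    {"gc": 0.45, "effective": False},
    {"gc": 0.70, "effective": True},   # rejected by the rule
]
print(ppv(rule, records))  # 2 true positives out of 3 accepted records
```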

  12. Campsite survey implications for managing designated campsites at Great Smoky Mountains National Park

    USGS Publications Warehouse

    Marion, J.L.; Leung, Y.-F.; Kulhavy, D.L.; Legg, M.H.

    1998-01-01

    Backcountry campsites and shelters in Great Smoky Mountains National Park were surveyed in 1993 as part of a new impact monitoring program. A total of 395 campsites and shelters were located and assessed, including 309 legal campsites located at 84 designated campgrounds, 68 illegal campsites, and 18 shelters. Primary campsite management problems identified by the survey include: (1) campsite proliferation, (2) campsite expansion and excessive size, (3) excessive vegetation loss and soil exposure, (4) lack of visitor solitude at campsites, (5) excessive tree damage, and (6) illegal camping. A number of potential management options are recommended to address the identified campsite management problems. Many problems are linked to the ability of visitors to determine the location and number of individual campsites within each designated campground. A principal recommendation is that managers apply site-selection criteria to existing and potential new campsite locations to identify and designate campsites that will resist and constrain the areal extent of impacts and enhance visitor solitude. Educational solutions are also offered.

  13. Improving the design of acoustic and midwater trawl surveys through stratification, with an application to Lake Michigan prey fishes

    USGS Publications Warehouse

    Adams, J.V.; Argyle, R.L.; Fleischer, G.W.; Curtis, G.L.; Stickel, R.G.

    2006-01-01

    Reliable estimates of fish biomass are vital to the management of aquatic ecosystems and their associated fisheries. Acoustic and midwater trawl surveys are an efficient sampling method for estimating fish biomass in large bodies of water. To improve the precision of biomass estimates from combined acoustic and midwater trawl surveys, sampling effort should be optimally allocated within each stage of the survey design. Based on information collected during fish surveys, we developed an approach to improve the design of combined acoustic and midwater trawl surveys through stratification. Geographic strata for acoustic surveying and depth strata for midwater trawling were defined using neighbor-restricted cluster analysis, and the optimal allocation of sampling effort for each was then determined. As an example, we applied this survey stratification approach to data from lakewide acoustic and midwater trawl surveys of Lake Michigan prey fishes. Precision of biomass estimates from surveys with and without geographic stratification was compared through resampling. Use of geographic stratification with optimal sampling allocation reduced the variance of Lake Michigan acoustic biomass estimates by 77%. Stratification and optimal allocation at each stage of an acoustic and midwater trawl survey should serve to reduce the variance of the resulting biomass estimates.
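
Optimal allocation of sampling effort across strata, as used above, is classically computed with Neyman allocation: effort proportional to stratum size times stratum standard deviation. A sketch with illustrative strata (not the Lake Michigan data):

```python
# Neyman allocation: n_h = n * N_h * S_h / sum_j(N_j * S_j), so large or
# highly variable strata receive proportionally more sampling effort.
def neyman_allocation(n, sizes, sds):
    weights = [N * S for N, S in zip(sizes, sds)]
    total = sum(weights)
    return [n * w / total for w in weights]

# 100 units of effort over three illustrative strata:
alloc = neyman_allocation(100, sizes=[500, 300, 200], sds=[2.0, 1.0, 4.0])
print([round(a) for a in alloc])  # [48, 14, 38]: effort follows size and variance
```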

  14. Cigarette pack design and adolescent smoking susceptibility: a cross-sectional survey

    PubMed Central

    Ford, Allison; MacKintosh, Anne Marie; Moodie, Crawford; Richardson, Sol; Hastings, Gerard

    2013-01-01

    Objectives To compare adolescents’ responses to three different styles of cigarette packaging: novelty (branded packs designed with a distinctive shape, opening style or bright colour), regular (branded pack with no special design features) and plain (brown pack with a standard shape and opening and all branding removed, aside from brand name). Design Cross-sectional in-home survey. Setting UK. Participants Random location quota sample of 1025 never smokers aged 11–16 years. Main outcome measures Susceptibility to smoking and composite measures of pack appraisal and pack receptivity derived from 11 survey items. Results Mean responses to the three pack types were negative for all survey items. However, ‘novelty’ packs were rated significantly less negatively than the ‘regular’ pack on most items, and the novelty and regular packs were rated less negatively than the ‘plain’ pack. For the novelty packs, logistic regressions, controlling for factors known to influence youth smoking, showed that susceptibility was associated with positive appraisal and also receptivity. For example, those receptive to the innovative Silk Cut Superslims pack were more than four times as likely to be susceptible to smoking than those not receptive to this pack (AOR=4.42, 95% CI 2.50 to 7.81, p<0.001). For the regular pack, an association was found between positive appraisal and susceptibility but not with receptivity and susceptibility. There was no association with pack appraisal or receptivity for the plain pack. Conclusions Pack structure (shape and opening style) and colour are independently associated, not just with appreciation of and receptivity to the pack, but also with susceptibility to smoke. In other words, those who think most highly of novelty cigarette packaging are also the ones who indicate that they are most likely to go on to smoke. Plain packaging, in contrast, was found to directly reduce the appeal of smoking to adolescents. PMID:24056481
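
The susceptibility association above is an adjusted odds ratio from a logistic regression. As a simpler illustration of the underlying quantity, here is an unadjusted odds ratio from a 2x2 table of receptivity by susceptibility, with made-up counts (this does not reproduce the paper's AOR = 4.42, which controlled for covariates):

```python
# Odds ratio from a 2x2 table: (a/b) / (c/d) = a*d / (b*c).
# Counts are hypothetical.
def odds_ratio(a, b, c, d):
    """a, b = susceptible / not susceptible among receptive respondents;
    c, d = susceptible / not susceptible among non-receptive respondents."""
    return (a * d) / (b * c)

print(round(odds_ratio(30, 70, 10, 90), 2))  # → 3.86
```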

  15. Decision making preferences in the medical encounter – a factorial survey design

    PubMed Central

    Müller-Engelmann, Meike; Krones, Tanja; Keller, Heidi; Donner-Banzhoff, Norbert

    2008-01-01

    Background It has not yet been systematically investigated in which kinds of clinical situations patients and physicians prefer a consultation style based on shared decision making (SDM). We suggest the factorial survey design to address this problem. This method, so far rarely used in health services research, allows relevant factors describing clinical situations to be varied systematically in an experimental random design and their importance to be investigated in large samples. Methods/Design To identify situational factors for the survey, we first performed a literature search, followed by a qualitative interview study with patients, physicians and health care experts. As a result, 7 factors (e.g. "Reason for consultation" and "Number of therapeutic options") with 2 to 3 levels each (e.g. "One therapeutic option" and "More than one therapeutic option") will be included in the study. For the survey, the factor levels will be randomly combined into short stories describing different treatment situations. A randomized sample of all possible short stories will be given to at least 300 subjects (100 GPs, 100 patients and 100 members of self-help groups), who will be asked to rate how the decision should be made. The main outcome measure is the preference for participation in the decision-making process in the given clinical situation. Data analysis will estimate the effects of the factors on the ratings and also examine differences between groups. Discussion The results will reveal the effects of situational variations on participation preferences. Thus, our findings will contribute to the understanding of normative values in the medical decision-making process and will improve future implementation of SDM and decision aids. PMID:19091091
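
The vignette-construction step described above (randomly combining factor levels into short stories) can be sketched as follows; the factors shown are illustrative stand-ins for the study's 7:

```python
import itertools
import random

# Factorial survey sketch: cross all factor levels into a full factorial
# design, then draw a random subset of vignettes for one respondent.
# Factor names and levels below are invented examples.
factors = {
    "reason_for_consultation": ["acute", "chronic"],
    "therapeutic_options": ["one", "more than one"],
    "severity": ["mild", "moderate", "severe"],
}
full_design = list(itertools.product(*factors.values()))
print(len(full_design))  # 2 * 2 * 3 = 12 possible vignettes

random.seed(1)  # fixed seed so the draw is reproducible
questionnaire = random.sample(full_design, 4)  # vignettes shown to one respondent
```

With the study's 7 factors at 2-3 levels each, the full design is far larger, which is why each respondent rates only a random sample of vignettes.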

  16. Recommended survey designs for occupancy modelling using motion-activated cameras: insights from empirical wildlife data.

    PubMed

    Shannon, Graeme; Lewis, Jesse S; Gerber, Brian D

    2014-01-01

    Motion-activated cameras are a versatile tool that wildlife biologists can use for sampling wild animal populations to estimate species occurrence. Occupancy modelling provides a flexible framework for the analysis of these data, explicitly recognizing that, even when a species occupies an area, the probability of detecting it is often less than one. Despite the number of studies using camera data in an occupancy framework, there is only limited guidance in the scientific literature about survey design trade-offs when using motion-activated cameras. A fuller understanding of these trade-offs will allow researchers to maximise available resources and determine whether the objectives of a monitoring program or research study are achievable. We use an empirical dataset collected from 40 cameras deployed across 160 km2 of the Western Slope of Colorado, USA to explore how survey effort (number of cameras deployed and the length of sampling period) affects the accuracy and precision (i.e., error) of the occupancy estimate for ten mammal and three virtual species. We do this using a simulation approach where species occupancy and detection parameters were informed by empirical data from motion-activated cameras. A total of 54 survey designs were considered by varying combinations of sites (10-120 cameras) and occasions (20-120 survey days). Our findings demonstrate that increasing total sampling effort generally decreases the error associated with the occupancy estimate, but changing the number of sites or the sampling duration can have very different results, depending on whether a species is spatially common or rare (occupancy = ψ) and easy or hard to detect when available (detection probability = p). For rare species with a low probability of detection (i.e., raccoon and spotted skunk) the required survey effort includes maximizing the number of sites and the number of survey days, often to a level that may be logistically unrealistic for many studies. For common species with
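
The trade-off above between sampling duration and detectability has a simple closed form for a single occupied site: with per-occasion detection probability p, the chance of at least one detection in K occasions is p* = 1 - (1 - p)^K. A sketch (the p values are illustrative, not estimates from the Colorado data):

```python
import math

# Cumulative detection probability at an occupied site over K occasions,
# and the number of occasions needed to reach a target p*.
def cumulative_detection(p, k):
    return 1.0 - (1.0 - p) ** k

def occasions_needed(p, target=0.95):
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p))

print(occasions_needed(0.30))  # → 9: easy-to-detect species
print(occasions_needed(0.05))  # → 59: hard-to-detect species needs far longer surveys
```

This is why rare, hard-to-detect species push required survey effort toward logistically unrealistic levels.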

  17. Assessment of Pictographs Developed Through a Participatory Design Process Using an Online Survey Tool

    PubMed Central

    Kim, Hyeoneui; Nakamura, Carlos

    2009-01-01

    Background Inpatient discharge instructions are a mandatory requirement of the Centers for Medicare and Medicaid Services and Joint Commission on Accreditation of Healthcare Organizations. The instructions include all the information relevant to post-discharge patient care. Prior studies show that patients often cannot fully understand or remember all the instructions. To address this issue, we have previously conducted a pilot study in which pictographs were created through a participatory design process to facilitate the comprehension and recall of discharge instructions. Objective The main objective of this study was to verify the individual effectiveness of pictographs created through a participatory design process. Methods In this study, we included 20 pictographs developed by our group and 20 pictographs developed by the Robert Wood Johnson Foundation as a reference baseline for pictographic recognition. To assess whether the participants could recognize the meaning of the pictographs, we designed an asymmetrical pictograph–text label-linking test. Data collection lasted for 7 days after the email invitation. A total of 44 people accessed the survey site. We excluded 7 participants who completed less than 50% of the survey. A total of 719 answers from 37 participants were analyzed. Results The analysis showed that the participants recognized the pictographs developed in-house significantly better than those included in the study as a baseline (P< .001). This trend was true regardless of the participant’s gender, age, and education level. The results also revealed that there is a large variance in the quality of the pictographs developed using the same design process—the recognition rate ranged from below 50% to above 90%. Conclusions This study confirmed that the majority of the pictographs developed in a participatory design process involving a small number of nurses and consumers were recognizable by a larger number of consumers. The variance in

  18. Requirements and concept design for large earth survey telescope for SEOS

    NASA Technical Reports Server (NTRS)

    Mailhot, P.; Bisbee, J.

    1975-01-01

    The efforts of a one-year program of Requirements Analysis and Conceptual Design for the Large Earth Survey Telescope for the Synchronous Earth Observatory Satellite are summarized. A 1.4 meter aperture Cassegrain telescope with a 0.6 deg field of view is shown to do an excellent job of satisfying the observational requirements for a wide range of earth resources and meteorological applications. The telescope provides imagery or thermal mapping in ten spectral bands at one time in a field-sharing grouping of linear detector arrays. Pushbroom scanning is accomplished by spacecraft slew.

  19. Questionnaire survey of customer satisfaction for product categories towards certification of ergonomic quality in design.

    PubMed

    Mochimaru, Masaaki; Takahashi, Miwako; Hatakenaka, Nobuko; Horiuchi, Hitoshi

    2012-01-01

    Customer satisfaction was surveyed for 6 product categories (consumer electronics, daily commodities, home equipment, information systems, cars, and health appliances) by questionnaires based on the Analytic Hierarchy Process. By analyzing the weights of the evaluation factors, the 6 product categories were reorganized into 4 categories, corresponding to 4 aspects of daily living formed by two axes: home living vs. mobility life, and healthy life vs. active communication. It was found that consumers were attracted by actual user testing by public institutes for all product categories. Certification based on a design-process standard established by authorities, such as EQUID, was the second-best attractor for consumers.
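
In the Analytic Hierarchy Process used above, factor weights are obtained as the principal eigenvector of a pairwise-comparison matrix. A minimal power-iteration sketch with an invented comparison matrix (not the study's data):

```python
# AHP weight extraction: a[i][j] records how much more important factor i is
# judged than factor j (with a[j][i] = 1/a[i][j]). The normalized principal
# eigenvector of this matrix gives the factor weights; power iteration
# converges to it for such positive matrices.
def ahp_weights(matrix, iterations=100):
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iterations):
        w = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]  # renormalize so the weights sum to 1
    return w

# Hypothetical judgments: "usability" is 3x as important as "price"
# and 5x as important as "looks".
pairwise = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = ahp_weights(pairwise)
print([round(w, 2) for w in weights])  # usability gets the largest weight
```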

  20. Quantitative Analysis, Design, and Fabrication of Biosensing and Bioprocessing Devices in Living Cells

    DTIC Science & Technology

    2015-03-10

    Del Vecchio (MIT). Annual accomplishments, summary of the project: this project aims at designing sensing systems in the bacterium E. coli by employing and re...minimal delay with respect to when the environmental molecule appeared. Sensing through phosphorylation: isolation amplifier circuit in E. coli. The...Fig. 2: implementation of a semi-synthetic transmission system based on phosphorylation in E. coli. The phosphorylation cycle is given by the NRI

  1. Design of the South East Asian Nutrition Survey (SEANUTS): a four-country multistage cluster design study.

    PubMed

    Schaafsma, Anne; Deurenberg, Paul; Calame, Wim; van den Heuvel, Ellen G H M; van Beusekom, Christien; Hautvast, Jo; Sandjaja; Bee Koon, Poh; Rojroongwasinkul, Nipa; Le Nguyen, Bao Khanh; Parikh, Panam; Khouw, Ilse

    2013-09-01

    Nutrition is a well-known factor in the growth, health and development of children. It is also acknowledged that worldwide many people have dietary imbalances resulting in over- or undernutrition. In 2009, the multinational food company FrieslandCampina initiated the South East Asian Nutrition Survey (SEANUTS), a combination of surveys carried out in Indonesia, Malaysia, Thailand and Vietnam, to get a better insight into these imbalances. The present study describes the general study design and methodology, as well as some problems and pitfalls encountered. In each of these countries, participants in the age range of 0·5-12 years were recruited according to a multistage cluster randomised or stratified random sampling methodology. Field teams took care of recruitment and data collection. For the health status of children, growth and body composition, physical activity, bone density, and development and cognition were measured. For nutrition, food intake and food habits were assessed by questionnaires, whereas in subpopulations blood and urine samples were collected to measure the biochemical status parameters of Fe, vitamins A and D, and DHA. In Thailand, the researchers additionally studied the lipid profile in blood, whereas in Indonesia iodine excretion in urine was analysed. Biochemical data were analysed in certified laboratories. Study protocols and methodology were aligned where practically possible. In December 2011, data collection was finalised. In total, 16,744 children participated in the present study. Information that will be very relevant for formulating nutritional health policies, as well as for designing innovative food and nutrition research and development programmes, has become available.

  2. HIV testing during the Canadian immigration medical examination: a national survey of designated medical practitioners.

    PubMed

    Tran, Jennifer M; Li, Alan; Owino, Maureen; English, Ken; Mascarenhas, Lyndon; Tan, Darrell H S

    2014-01-01

    HIV testing is mandatory for individuals wishing to immigrate to Canada. Since the Designated Medical Practitioners (DMPs) who perform these tests may have varying experience in HIV and time constraints in their clinical practices, there may be variability in the quality of pre- and posttest counseling provided. We surveyed DMPs regarding HIV testing, counseling, and immigration inadmissibility. A 16-item survey was mailed to all DMPs across Canada (N = 203). The survey inquired about DMP characteristics, knowledge of HIV, attitudes and practices regarding inadmissibility and counseling, and interest in continuing medical education. There were a total of 83 respondents (41%). Participants frequently rated their knowledge of HIV diagnostics, cultural competency, and HIV/AIDS service organizations as "fair" (40%, 43%, and 44%, respectively). About 25%, 46%, and 11% of the respondents agreed/strongly agreed with the statements "HIV infected individuals pose a danger to public health and safety," "HIV-positive immigrants cause excessive demand on the healthcare system," and "HIV seropositivity is a reasonable ground for denial into Canada," respectively. Language was cited as a barrier to counseling, which focused on transmission risks (46% discussed this as "always" or "often") more than coping and social support (37%). There was a high level of interest (47%) in continuing medical education in this area. There are areas for improvement regarding DMPs' knowledge, attitudes, and practices about HIV infection, counseling, and immigration criteria. Continuing medical education and support for DMPs to facilitate practice changes could benefit newcomers who test positive through the immigration process.

  3. A successful 3D seismic survey in the "no-data zone," offshore Mississippi delta: Survey design and refraction static correction processing

    SciTech Connect

    Carvill, C.; Faris, N.; Chambers, R.

    1996-12-31

    This is a success story of survey design and refraction static correction processing for a large 3D seismic survey in the South Pass area of the Mississippi delta. In this transition zone, subaqueous mudflow gullies and lobes of the delta, in various states of consolidation and gas saturation, are strong absorbers of seismic energy. Seismic waves penetrating the mud are severely restricted in bandwidth and variously delayed by changes in mud velocity and thickness. Using a delay-time refraction static correction method, the authors find that the compensations for the various delays, i.e., the static corrections, commonly vary by 150 ms over a short distance. Application of the static corrections markedly improves the seismic stack volume. This paper shows that intelligent survey design and delay-time refraction static correction processing economically eliminate this area's historic "no data" status.
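
Once per-trace delay times have been estimated, applying the static correction amounts to shifting each trace by its static before stacking. A minimal sketch (the sample interval and delays are illustrative, not the survey's actual processing parameters):

```python
# Apply a static time shift to a trace sampled at a fixed interval: the
# static (in ms) is converted to a whole-sample shift, and the trace is
# advanced or delayed with zero padding so its length is preserved.
def apply_static(trace, static_ms, sample_interval_ms=4.0):
    shift = round(static_ms / sample_interval_ms)
    if shift >= 0:  # advance the trace: drop early samples, pad the end
        return trace[shift:] + [0.0] * shift
    return [0.0] * (-shift) + trace[:shift]  # delay the trace

trace = [0.0, 0.0, 1.0, 0.0, 0.0]          # spike at sample 2
print(apply_static(trace, static_ms=8.0))  # spike moved 2 samples earlier
```

After correction, reflections from the same interface line up across traces, which is why the stack volume improves so markedly.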

  4. Quantitative Analysis, Design, and Fabrication of Biosensing and Bioprocessing Devices in Living Cells

    DTIC Science & Technology

    2012-08-29

    DESIGN, AND FABRICATION OF BIOSENSING AND BIOPROCESSING DEVICES IN LIVING CELLS. Grant/Contract Number: FA9550-10-1-0242. PI: Domitilla Del Vecchio...43% change). (C) The slower response in the early stages of induction can be quantified by comparing the delay at different elimination levels. The...400 min. The time to reach 50% of the steady-state level post-wash went down from 403±9 min in the isolated system to 355±15 min in the loaded system. (C

  5. THE DEEP2 GALAXY REDSHIFT SURVEY: DESIGN, OBSERVATIONS, DATA REDUCTION, AND REDSHIFTS

    SciTech Connect

    Newman, Jeffrey A.; Cooper, Michael C.; Davis, Marc; Faber, S. M.; Guhathakurta, Puragra; Koo, David C.; Phillips, Andrew C.; Conroy, Charlie; Harker, Justin J.; Lai, Kamson; Dutton, Aaron A.; Finkbeiner, Douglas P.; Gerke, Brian F.; Rosario, David J.; Weiner, Benjamin J.; Willmer, C. N. A.; Yan, Renbin; Kassin, Susan A.; Konidaris, N. P.; and others

    2013-09-15

    We describe the design and data analysis of the DEEP2 Galaxy Redshift Survey, the densest and largest high-precision redshift survey of galaxies at z ~ 1 completed to date. The survey was designed to conduct a comprehensive census of massive galaxies, their properties, environments, and large-scale structure down to absolute magnitude MB = -20 at z ~ 1 via ~90 nights of observation on the Keck telescope. The survey covers an area of 2.8 deg2 divided into four separate fields observed to a limiting apparent magnitude of RAB = 24.1. Objects with z <~ 0.7 are readily identifiable using BRI photometry and rejected in three of the four DEEP2 fields, allowing galaxies with z > 0.7 to be targeted ~2.5 times more efficiently than in a purely magnitude-limited sample. Approximately 60% of eligible targets are chosen for spectroscopy, yielding nearly 53,000 spectra and more than 38,000 reliable redshift measurements. Most of the targets that fail to yield secure redshifts are blue objects that lie beyond z ~ 1.45, where the [O II] 3727 Å doublet lies in the infrared. The DEIMOS 1200 line mm-1 grating used for the survey delivers high spectral resolution (R ~ 6000), accurate and secure redshifts, and unique internal kinematic information. Extensive ancillary data are available in the DEEP2 fields, particularly in the Extended Groth Strip, which has evolved into one of the richest multiwavelength regions on the sky. This paper is intended as a handbook for users of the DEEP2 Data Release 4, which includes all DEEP2 spectra and redshifts, as well as for the DEEP2 DEIMOS data reduction pipelines. Extensive details are provided on object selection, mask design, biases in target selection and redshift measurements, the spec2d two-dimensional data-reduction pipeline, the spec1d automated redshift pipeline, and the zspec visual redshift verification process, along with examples of instrumental signatures or

  6. The DEEP2 Galaxy Redshift Survey: Design, Observations, Data Reduction, and Redshifts

    NASA Technical Reports Server (NTRS)

    Newman, Jeffrey A.; Cooper, Michael C.; Davis, Marc; Faber, S. M.; Coil, Alison L.; Guhathakurta, Puragra; Koo, David C.; Phillips, Andrew C.; Conroy, Charlie; Dutton, Aaron A.; Finkbeiner, Douglas P.; Gerke, Brian F.; Rosario, David J.; Weiner, Benjamin J.; Willmer, C. N. A.; Yan, Renbin; Harker, Justin J.; Kassin, Susan A.; Konidaris, N. P.; Lai, Kamson; Madgwick, Darren S.; Noeske, K. G.; Wirth, Gregory D.; Kirby, Evan N.; Lotz, Jennifer M.

    2013-01-01

    We describe the design and data analysis of the DEEP2 Galaxy Redshift Survey, the densest and largest high-precision redshift survey of galaxies at z approx. 1 completed to date. The survey was designed to conduct a comprehensive census of massive galaxies, their properties, environments, and large-scale structure down to absolute magnitude MB = -20 at z approx. 1 via approx.90 nights of observation on the Keck telescope. The survey covers an area of 2.8 Sq. deg divided into four separate fields observed to a limiting apparent magnitude of R(sub AB) = 24.1. Objects with z approx. < 0.7 are readily identifiable using BRI photometry and rejected in three of the four DEEP2 fields, allowing galaxies with z > 0.7 to be targeted approx. 2.5 times more efficiently than in a purely magnitude-limited sample. Approximately 60% of eligible targets are chosen for spectroscopy, yielding nearly 53,000 spectra and more than 38,000 reliable redshift measurements. Most of the targets that fail to yield secure redshifts are blue objects that lie beyond z approx. 1.45, where the [O ii] 3727 Ang. doublet lies in the infrared. The DEIMOS 1200 line mm(exp -1) grating used for the survey delivers high spectral resolution (R approx. 6000), accurate and secure redshifts, and unique internal kinematic information. Extensive ancillary data are available in the DEEP2 fields, particularly in the Extended Groth Strip, which has evolved into one of the richest multiwavelength regions on the sky. This paper is intended as a handbook for users of the DEEP2 Data Release 4, which includes all DEEP2 spectra and redshifts, as well as for the DEEP2 DEIMOS data reduction pipelines. Extensive details are provided on object selection, mask design, biases in target selection and redshift measurements, the spec2d two-dimensional data-reduction pipeline, the spec1d automated redshift pipeline, and the zspec visual redshift verification process, along with examples of instrumental signatures or other

  7. The DEEP2 Galaxy Redshift Survey: Design, Observations, Data Reduction, and Redshifts

    NASA Astrophysics Data System (ADS)

    Newman, Jeffrey A.; Cooper, Michael C.; Davis, Marc; Faber, S. M.; Coil, Alison L.; Guhathakurta, Puragra; Koo, David C.; Phillips, Andrew C.; Conroy, Charlie; Dutton, Aaron A.; Finkbeiner, Douglas P.; Gerke, Brian F.; Rosario, David J.; Weiner, Benjamin J.; Willmer, C. N. A.; Yan, Renbin; Harker, Justin J.; Kassin, Susan A.; Konidaris, N. P.; Lai, Kamson; Madgwick, Darren S.; Noeske, K. G.; Wirth, Gregory D.; Connolly, A. J.; Kaiser, N.; Kirby, Evan N.; Lemaux, Brian C.; Lin, Lihwai; Lotz, Jennifer M.; Luppino, G. A.; Marinoni, C.; Matthews, Daniel J.; Metevier, Anne; Schiavon, Ricardo P.

    2013-09-01

    We describe the design and data analysis of the DEEP2 Galaxy Redshift Survey, the densest and largest high-precision redshift survey of galaxies at z ~ 1 completed to date. The survey was designed to conduct a comprehensive census of massive galaxies, their properties, environments, and large-scale structure down to absolute magnitude MB = -20 at z ~ 1 via ~90 nights of observation on the Keck telescope. The survey covers an area of 2.8 deg² divided into four separate fields observed to a limiting apparent magnitude of R_AB = 24.1. Objects with z ≲ 0.7 are readily identifiable using BRI photometry and rejected in three of the four DEEP2 fields, allowing galaxies with z > 0.7 to be targeted ~2.5 times more efficiently than in a purely magnitude-limited sample. Approximately 60% of eligible targets are chosen for spectroscopy, yielding nearly 53,000 spectra and more than 38,000 reliable redshift measurements. Most of the targets that fail to yield secure redshifts are blue objects that lie beyond z ~ 1.45, where the [O II] 3727 Å doublet lies in the infrared. The DEIMOS 1200 line mm⁻¹ grating used for the survey delivers high spectral resolution (R ~ 6000), accurate and secure redshifts, and unique internal kinematic information. Extensive ancillary data are available in the DEEP2 fields, particularly in the Extended Groth Strip, which has evolved into one of the richest multiwavelength regions on the sky. This paper is intended as a handbook for users of the DEEP2 Data Release 4, which includes all DEEP2 spectra and redshifts, as well as for the DEEP2 DEIMOS data reduction pipelines. Extensive details are provided on object selection, mask design, biases in target selection and redshift measurements, the spec2d two-dimensional data-reduction pipeline, the spec1d automated redshift pipeline, and the zspec visual redshift verification process, along with examples of instrumental signatures or other artifacts that in some cases remain after data reduction.
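
    The BRI pre-selection described above can be sketched as a simple color-color cut applied down to the survey magnitude limit. The cut boundaries and toy magnitudes below are illustrative placeholders, not the published DEEP2 selection criteria:

```python
def preselect_high_z(b, r, i, faint_limit=24.1):
    """Flag objects likely to lie at z > 0.7 from BRI colors.

    The color-cut coefficients are hypothetical stand-ins for the
    actual DEEP2 selection polynomial."""
    keep = []
    for B, R, I in zip(b, r, i):
        bright_enough = R <= faint_limit  # survey limit R_AB = 24.1
        # Low-z galaxies occupy a red B-R, moderately blue R-I locus,
        # which this (illustrative) cut excludes.
        low_z = (B - R > 2.35 * (R - I) - 0.45) and (R - I < 1.15)
        keep.append(bright_enough and not low_z)
    return keep

# Toy catalogue: a blue z > 0.7 candidate and a red low-z interloper.
keep = preselect_high_z(b=[23.0, 26.0], r=[22.5, 23.0], i=[22.0, 22.6])
```

    Rejecting the low-z locus before spectroscopy is what buys the quoted factor of ~2.5 in targeting efficiency over a purely magnitude-limited sample.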

  8. A Survey of Mathematical Optimization Models and Algorithms for Designing and Extending Irrigation and Wastewater Networks

    NASA Astrophysics Data System (ADS)

    Mandl, Christoph E.

    1981-08-01

    This paper presents a state of the art survey of network models and algorithms that can be used as planning tools in irrigation and wastewater systems. It is shown that the problem of designing or extending such systems basically leads to the same type of mathematical optimization model. The difficulty in solving this model lies mainly in the properties of the objective function. Trying to minimize construction and/or operating costs of a system typically results in a concave cost (objective) function due to economies of scale. A number of ways to attack such models are discussed and compared, including linear programing, integer programing, and specially designed exact and heuristic algorithms. The usefulness of each approach is evaluated in terms of the validity of the model, the computational complexity of the algorithm, the properties of the solution, the availability of software, and the capability for sensitivity analysis.
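
    The economies-of-scale effect described above can be shown on a toy network: with a concave pipe cost, routing both demands through a shared trunk beats building two direct pipes. The network, demands, and cost exponent below are invented for illustration, not taken from the paper:

```python
from itertools import combinations

# Candidate pipes as (node_u, node_v, length); node 0 is the source.
edges = [(0, 1, 4.0), (0, 2, 4.0), (1, 2, 1.0)]
demand = {1: 5.0, 2: 3.0}

def pipe_cost(flow, length):
    # Concave cost (exponent < 1) models economies of scale: one shared
    # trunk is cheaper than parallel smaller pipes with the same total flow.
    return length * flow ** 0.6

def network_cost(tree):
    adj = {}
    for u, v, length in tree:
        adj.setdefault(u, []).append((v, length))
        adj.setdefault(v, []).append((u, length))
    if 0 not in adj or not set(demand) <= set(adj):
        return None  # a demand node is not connected
    total = [0.0]
    def downstream(node, parent):
        # Flow on the pipe into `node` = total demand in its subtree.
        flow = demand.get(node, 0.0)
        for nxt, length in adj[node]:
            if nxt != parent:
                q = downstream(nxt, node)
                total[0] += pipe_cost(q, length)
                flow += q
        return flow
    downstream(0, None)
    return total[0]

# Brute-force over 2-edge subsets (the spanning trees of 3 nodes).
candidates = [(network_cost(t), t) for t in combinations(edges, 2)]
best_cost, best_tree = min(c for c in candidates if c[0] is not None)
```

    The optimum uses the shared trunk 0-1 plus the short link 1-2, exactly the concentration of flow that makes concave-cost network design hard for exact methods at realistic scale.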

  9. Trajectory Design Enhancements to Mitigate Risk for the Transiting Exoplanet Survey Satellite (TESS)

    NASA Technical Reports Server (NTRS)

    Dichmann, Donald; Parker, Joel; Nickel, Craig; Lutz, Stephen

    2016-01-01

    The Transiting Exoplanet Survey Satellite (TESS) will employ a highly eccentric Earth orbit, in 2:1 lunar resonance, which will be reached with a lunar flyby preceded by 3.5 phasing loops. The TESS mission has limited propellant and several constraints on the science orbit and on the phasing loops. Based on analysis and simulation, we have designed the phasing loops to reduce delta-V (DV) and to mitigate risk due to maneuver execution errors. We have automated the trajectory design process and use distributed processing to generate optimal nominal trajectories, to check constraint satisfaction, and finally to model the effects of maneuver errors to identify the trajectories that best meet the mission requirements.
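
    Modeling maneuver execution errors as described above can be sketched as a Monte Carlo over the burn sequence; the burn magnitudes, error sigmas, and error model here are hypothetical, not the TESS error budget:

```python
import math
import random

def simulate_total_dv(nominal_burns, mag_sigma=0.02, point_sigma_deg=1.0,
                      n_trials=2000, seed=1):
    """99th-percentile total DV under maneuver execution errors.

    Assumes Gaussian magnitude noise (fractional) and Gaussian pointing
    noise (degrees); both values are illustrative assumptions."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_trials):
        total = 0.0
        for dv in nominal_burns:
            executed = dv * (1.0 + rng.gauss(0.0, mag_sigma))
            # A pointing error theta leaves a residual of roughly
            # 2 * dv * sin(theta / 2) to clean up at a later maneuver.
            theta = math.radians(rng.gauss(0.0, point_sigma_deg))
            total += abs(executed) + 2.0 * abs(executed * math.sin(theta / 2.0))
        totals.append(total)
    totals.sort()
    return totals[int(0.99 * n_trials)]

dv99 = simulate_total_dv([5.0, 12.0, 3.0])  # three phasing-loop burns, m/s
```

    Ranking candidate trajectories by a high percentile of this statistic, rather than by nominal DV alone, is one way to fold execution risk into the selection.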

  10. Flow bioreactor design for quantitative measurements over endothelial cells using micro-particle image velocimetry

    NASA Astrophysics Data System (ADS)

    Leong, Chia Min; Voorhees, Abram; Nackman, Gary B.; Wei, Timothy

    2013-04-01

    Mechanotransduction in endothelial cells (ECs) is a highly complex process through which cells respond to changes in hemodynamic loading by generating biochemical signals involving gene and protein expression. To study the effects of mechanical loading on ECs in a controlled fashion, different in vitro devices have been designed to simulate or replicate various aspects of these physiological phenomena. This paper describes the design, use, and validation of a flow chamber which allows for spatially and temporally resolved micro-particle image velocimetry measurements of endothelial surface topography and stresses over living ECs immersed in pulsatile flow. This flow chamber also allows the study of co-cultures (i.e., ECs and smooth muscle cells) and the effect of different substrates (i.e., coverslip and/or polyethylene terephthalate (PET) membrane) on cellular response. In this report, the results of steady and pulsatile flow on fixed endothelial cells seeded on PET membrane and coverslip, respectively, are presented. Surface topography of ECs is computed from multiple two-dimensional flow measurements. The distributions of shear stress and wall pressure on each individual cell are also determined and the importance of both types of stress in cell remodeling is highlighted.
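
    The wall shear stress quoted in such measurements is typically estimated from the near-wall velocity profile as τ_w = μ du/dy evaluated at the wall. A minimal sketch with a synthetic linear profile (values illustrative; water's dynamic viscosity assumed):

```python
def wall_shear_stress(y, u, mu=1.0e-3):
    """Estimate tau_w = mu * du/dy at y = 0 from near-wall PIV samples
    via a least-squares linear fit. mu defaults to water (Pa*s)."""
    n = len(y)
    sy, su = sum(y), sum(u)
    syy = sum(v * v for v in y)
    syu = sum(a * b for a, b in zip(y, u))
    slope = (n * syu - sy * su) / (n * syy - sy * sy)  # du/dy (1/s)
    return mu * slope

# Synthetic near-wall profile with shear rate 1000 1/s: u = 1000 * y.
y = [10e-6, 20e-6, 30e-6, 40e-6]   # wall-normal positions (m)
u = [0.01, 0.02, 0.03, 0.04]       # streamwise velocities (m/s)
tau = wall_shear_stress(y, u)      # Pa
```

    The fitted profile gives τ_w = 1 Pa (10 dyn/cm²), within the range commonly cited for arterial endothelium.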

  11. Wide-Field InfraRed Survey Telescope (WFIRST) slitless spectrometer: design, prototype, and results

    NASA Astrophysics Data System (ADS)

    Gong, Qian; Content, David A.; Dominguez, Margaret; Emmett, Thomas; Griesmann, Ulf; Hagopian, John; Kruk, Jeffrey; Marx, Catherine; Pasquale, Bert; Wallace, Thomas; Whipple, Arthur

    2016-07-01

    The slitless spectrometer plays an important role in the WFIRST mission for the survey of emission-line galaxies. This will be an unprecedented very wide field, HST-quality 3D survey of emission-line galaxies. The concept of the compound grism as a slitless spectrometer has been presented previously. This paper briefly discusses the challenges and solutions of the optical design and recent specification updates, as well as a brief comparison between the prototype and the latest design. The emphasis, however, is the progress of the grism prototype: the fabrication and testing of the complicated diffractive optical elements and powered prism, as well as grism assembly alignment and testing. In particular, we describe how different tools and methods, such as IR phase-shifting and wavelength-shifting interferometry, were used to complete the element and assembly tests. The paper also presents very encouraging results from recent element and assembly tests. Finally, we briefly outline the plan for testing spectral characteristics such as spectral resolution and response.

  12. Survey of light-water-reactor designs to be offered in the United States

    SciTech Connect

    Spiewak, I.

    1986-03-01

    ORNL has conducted a Nuclear Power Options Viability Study for the Department of Energy. That study is primarily concerned with new technology which could be developed for initial operation in the 2000 to 2010 time frame. Such technology would have to compete not only with coal options but with incrementally improved commercial light-water reactors. The survey reported here was undertaken to gain an understanding of the nuclear commercial technology likely to be offered in the late 1980s and perhaps beyond. The three US vendors actively marketing NSSSs are each developing a product for the future which they expect to be more reliable, more maintainable, more economical, and safer than the present plants. These are all essentially 3800-MW(t) designs, although all are studying smaller plants. They apparently will be offered as standard prelicensed designs with much larger scope than earlier NSSS offerings, with the possibility of firm prices. Westinghouse with Mitsubishi Heavy Industries is developing a completely new design (APWR) to be built initially in Japan, hopefully for operation by the mid-1990s. Westinghouse is making a strong effort to have the APWR licensed in the US as a standard plant. Combustion Engineering (C-E) is evaluating potential improvements to the System-80 standard design (CESSAR) that has already received final design approval by the NRC. General Electric (GE), with Hitachi and Toshiba, is developing a new design (ABWR) that incorporates advanced features which have been proven by the worldwide BWR suppliers. The ABWR is to be built initially in Japan, but the design could be adapted to the United States. Westinghouse, C-E, and GE have done some conceptual evaluation of reactors in the 600-MW(e) class. The Westinghouse concept is a two-loop plant intended for factory assembly in a shipyard and delivery to a site by barge. The GE concept is a modification of the ABWR with some additional passive safety features. 16 figs.

  13. Electrochemical detection of magnetically-entrapped DNA sequences from complex samples by multiplexed enzymatic labelling: Application to a transgenic food/feed quantitative survey.

    PubMed

    Manzanares-Palenzuela, C L; Martín-Clemente, J P; Lobo-Castañón, M J; López-Ruiz, B

    2017-03-01

    Monitoring of genetically modified organisms in food and feed demands molecular techniques that deliver accurate quantitative results. Electrochemical DNA detection has been widely described in this field, yet most reports convey qualitative data and application in processed food and feed samples is limited. Herein, the applicability of an electrochemical multiplex assay for DNA quantification in complex samples is assessed. The method consists of the simultaneous magnetic entrapment via sandwich hybridisation of two DNA sequences (event-specific and taxon-specific) onto the surface of magnetic microparticles, followed by bienzymatic labelling. As proof-of-concept, we report its application in a transgenic food/feed survey where relative quantification (two-target approach) of Roundup Ready Soybean® (RRS) was performed in food and feed. Quantitative coupling to end-point PCR was performed and calibration was achieved from 22 and 243 DNA copies spanning two orders of magnitude for the event and taxon-specific sequences, respectively. We collected a total of 33 soybean-containing samples acquired in local supermarkets, four out of which were found to contain undeclared presence of genetically modified soybean. A real-time PCR method was used to verify these findings. High correlation was found between results, indicating the suitability of the proposed multiplex method for food and feed monitoring.
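
    The two-target relative quantification described above can be sketched as two log-linear calibration curves whose interpolated copy numbers are ratioed. All signals and copy numbers below are synthetic, chosen only to mimic a linear calibration, not the assay's measured values:

```python
import math

def fit_log_calibration(copies, signals):
    """Least-squares fit of signal vs. log10(copy number)."""
    x = [math.log10(c) for c in copies]
    n, sx, sy = len(x), sum(x), sum(signals)
    sxx = sum(v * v for v in x)
    sxy = sum(a * b for a, b in zip(x, signals))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return slope, (sy - slope * sx) / n

def copies_from_signal(signal, slope, intercept):
    # Invert the calibration curve back to a copy number.
    return 10 ** ((signal - intercept) / slope)

# One calibration curve per target (event-specific and taxon-specific);
# here a single synthetic curve stands in for both.
slope, intercept = fit_log_calibration([22, 220, 2200],
                                       [1.574, 2.374, 3.174])

# Relative (two-target) quantification: GM % = event copies / taxon copies.
event_copies = copies_from_signal(2.0, slope, intercept)
taxon_copies = copies_from_signal(3.0, slope, intercept)
gm_percent = 100.0 * event_copies / taxon_copies
```

    Ratioing the event-specific to the taxon-specific copy number normalizes out total DNA content, which is what makes the result comparable across processed food and feed matrices.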

  14. Quantitative evaluation of a thrust vector controlled transport at the conceptual design phase

    NASA Astrophysics Data System (ADS)

    Ricketts, Vincent Patrick

    The impetus to innovate, to push the bounds and break the molds of evolutionary design trends, often comes from competition but sometimes requires catalytic political legislature. For this research endeavor, the 'catalyzing legislation' comes in response to the rise in cost of fossil fuels and the request put forth by NASA to aircraft manufacturers to demonstrate a 60% reduction in aircraft fuel consumption within 30 years. This necessitates that novel technologies be considered to achieve these values of improved performance. One such technology is thrust vector control (TVC). The beneficial characteristic of thrust vector control technology applied to the traditional tail-aft configuration (TAC) commercial transport is its ability to retain the operational advantages of this highly evolved aircraft type, such as cabin evacuation, ground operation, safety, and certification. This study explores whether the TVC transport concept offers improved flight performance by synergistically reducing the traditional empennage size, overall resulting in reduced weight and drag, and therefore reduced aircraft fuel consumption. In particular, this study explores whether the TVC technology in combination with the reduced-empennage methodology enables the TAC aircraft to synergistically evolve while complying with current safety and certification regulation. This research utilizes the multi-disciplinary parametric sizing software, AVD Sizing, developed by the Aerospace Vehicle Design (AVD) Laboratory. The sizing software is responsible for visualizing the total system solution space via parametric trades and is capable of determining if the TVC technology can enable the TAC aircraft to synergistically evolve. This study indicates that the TVC plus reduced-empennage methodology shows marked improvements in performance and cost.

  15. Design of eye models used in quantitative analysis of interaction between chromatic and higher-order aberrations of eye

    NASA Astrophysics Data System (ADS)

    Zhai, Yi; Wang, Yan; Wang, Zhaoqi; Liu, Yongji; Zhang, Lin; He, Yuanqing; Chang, Shengjiang

    2014-12-01

    Special kinds of eye models are constructed by means of optical system design to quantitatively investigate impacts of longitudinal chromatic aberration (LCA), transverse chromatic aberration (TCA) and LCA+TCA on retina image quality and on depth of focus (DOF), as well as interaction between chromatic and higher-order aberrations. Results show that LCA plays a main role in enhancement of DOF and higher-order aberrations further increase DOF. For most of the eyes the impact of higher-order aberrations on vision is much smaller than that of LCA+TCA and the presence of LCA+TCA further reduces the impact of higher-order aberrations. The impact of LCA approximates to that of LCA+TCA, and the impact of TCA approximates to that of normal level of higher-order aberrations and is negligible.

  16. A Survey to Examine Teachers' Perceptions of Design Dispositions, Lesson Design Practices, and Their Relationships with Technological Pedagogical Content Knowledge (TPACK)

    ERIC Educational Resources Information Center

    Koh, Joyce Hwee Ling; Chai, Ching Sing; Hong, Huang-Yao; Tsai, Chin-Chung

    2015-01-01

    This study investigates 201 Singaporean teachers' perceptions of their technological pedagogical content knowledge (TPACK), lesson design practices, and design dispositions through a survey instrument. Investigation of these constructs reveal important variables influencing teachers' perceptions of TPACK which have not yet been explored. The…

  17. Use of physiological constraints to identify quantitative design principles for gene expression in yeast adaptation to heat shock

    PubMed Central

    Vilaprinyo, Ester; Alves, Rui; Sorribas, Albert

    2006-01-01

    Background Understanding the relationship between gene expression changes, enzyme activity shifts, and the corresponding physiological adaptive response of organisms to environmental cues is crucial in explaining how cells cope with stress. For example, adaptation of yeast to heat shock involves a characteristic profile of changes to the expression levels of genes coding for enzymes of the glycolytic pathway and some of its branches. The experimental determination of changes in gene expression profiles provides a descriptive picture of the adaptive response to stress. However, it does not explain why a particular profile is selected for any given response. Results We used mathematical models and analysis of in silico gene expression profiles (GEPs) to understand how changes in gene expression correlate to an efficient response of yeast cells to heat shock. An exhaustive set of GEPs, matched with the corresponding set of enzyme activities, was simulated and analyzed. The effectiveness of each profile in the response to heat shock was evaluated according to relevant physiological and functional criteria. The small subset of GEPs that lead to effective physiological responses after heat shock was identified as the result of the tuning of several evolutionary criteria. The experimentally observed transcriptional changes in response to heat shock belong to this set and can be explained by quantitative design principles at the physiological level that ultimately constrain changes in gene expression. Conclusion Our theoretical approach suggests a method for understanding the combined effect of changes in the expression of multiple genes on the activity of metabolic pathways, and consequently on the adaptation of cellular metabolism to heat shock. This method identifies quantitative design principles that facilitate understanding of the response of the cell to stress. PMID:16584550

  18. On the design of a cold neutron irradiator (CNI) for quantitative materials characterization

    NASA Astrophysics Data System (ADS)

    Atwood, Alexander Grover

    1997-11-01

    A design study of a cold neutron irradiator (CNI) for materials characterization using prompt gamma-ray neutron activation analysis (PGNAA) is presented. Using ²⁵²Cf neutron sources in a block of moderator, a portion of which is maintained at a cryogenic temperature, the CNI employs cold neutrons instead of thermal neutrons to enhance the neutron capture reaction rate in a sample. Capture gamma rays are detected in an HPGe photon detector. Optimization of the CNI with respect to elemental sensitivity (counts per mg) is the primary goal of this design study. Monte Carlo simulation of radiation transport, by means of the MCNP code and the ENDF/B cross-section libraries, is used to model the CNI. A combination of solid methane at 22 K, room-temperature polyethylene, and room-temperature beryllium has been chosen for the neutron delivery subsystem of the CNI. Using four 250-microgram ²⁵²Cf neutron sources, with a total neutron emission rate of 2.3 × 10⁹ neutrons/s, a thermal-equivalent neutron flux of 1.7 × 10⁷ neutrons/cm²·s in an internally located cylindrical sample space of diameter 6.5 cm and height 6.0 cm is predicted by MCNP calculations. A cylindrical port with an integral annular collimator composed of bismuth, lead, polyethylene, and lithium carbonate, is located between the sample and the detector. Calculations have been performed of gamma-ray and neutron transport in the port and integral collimator with the objective of optimizing the statistical precision with which one can measure elemental masses in the sample while also limiting the fast neutron flux incident upon the HPGe detector to a reasonable level. The statistical precision with which one can measure elemental masses can be enhanced by a factor of between 2.3 and 5.3 (depending on the origin of the background gamma rays) compared with a neutron irradiator identical to the CNI except for the replacement of the cryogenic solid methane by room-temperature polyethylene. The projected performance of

  19. Body image in adolescence: cross-cultural research--results of the preliminary phase of a quantitative survey.

    PubMed

    Ferron, C

    1997-01-01

    This preliminary phase of a quantitative survey had two main objectives: to identify the emotional and relational components of body image in adolescents, and to determine whether the experience of body changes is dependent upon individuals' context. Two samples of adolescents, both 13 to 17 years of age, who were healthy, middle- or upper middle-class, and randomly chosen, participated in the study. Subjects were 80 French adolescents (40 boys and 40 girls) from a center for preventive medicine, and 60 American adolescents (30 boys and 30 girls) from a suburban high school. Thorough individual interviews were conducted with these adolescents on the basis of a precise interview guide in order to determine their perceptions, attitudes, and beliefs about body image. A thematic analysis of the content of these recorded interviews revealed the differences between adolescents from the two countries. It was found that the main cultural differences were based on the belief that the real body and the ideal body coincide, and on the way physical appearance is included in the diversity of relational experiences. Gender differences were shown to be centered more on the level of control of body changes and on self-assessment modes; the signs of a failing or troubled body image may find their origin on an individual level, in the particularities of the family and parental language about the body, and on a collective level in the social representation of the body. The consequences of these symbolic representations on the adolescents' body image and attitudes toward their own health are presented and discussed.

  20. Quantitative transportation assessment in curved canals prepared with an off-centered rectangular design system.

    PubMed

    Silva, Emmanuel João Nogueira Leal; Vieira, Vania Cristina Gomes; Tameirão, Michele Dias Nunes; Belladonna, Felipe Gonçalves; Neves, Aline de Almeida; Souza, Erick Miranda; DE-Deus, Gustavo

    2016-01-01

    The purpose of this study was to assess the ability of an off-centered rectangular design system [ProTaper Next (PTN)] to maintain the original profile of the root canal anatomy. To this end, ProTaper Universal (PTU), Reciproc (R) and WaveOne (WO) systems were used as reference techniques for comparison. Forty clear resin blocks with simulated curved root canals were randomly assigned to 4 groups (n = 10) according to the instrumentation system used: PTN, PTU, R and WO. Color stereomicroscopic images of each block were taken before and after instrumentation. All image processing and data analysis were performed with an open source program (Fiji v.1.47n). Evaluation of canal transportation was obtained for two independent regions: straight and curved portions. Univariate analysis of variance and Tukey's Honestly Significant Difference test were performed, and a cut-off for significance was set at α = 5%. Instrumentation systems significantly influenced canal transportation (p = 0.000). Overall, R induced significantly lower canal transportation compared with WO, PTN and PTU (p = 0.000). The curved portion displayed superior canal transportation compared to the straight one (p = 0.000). The significance of the difference among instrumentation systems varied according to the canal level evaluated (p = 0.000). In its straight portion, R and WO exhibited significantly lower transportation than PTN; whereas in the curved portion, R produced the lowest deviation. PTU exhibited the highest canal transportation at both levels. It can be concluded that PTN produced less canal transportation than PTU and WO; however, R exhibited better centering ability than PTN.
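
    The univariate analysis of variance used to compare the instrumentation systems reduces to a one-way F statistic. A minimal sketch on synthetic transportation data (the values are invented, not the study's measurements):

```python
from statistics import mean

def one_way_anova(groups):
    """F statistic and degrees of freedom for a one-way ANOVA."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = mean(x for g in groups for x in g)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    df_b, df_w = k - 1, n - k
    f = (ss_between / df_b) / (ss_within / df_w)
    return f, df_b, df_w

# Synthetic canal-transportation values (mm); NOT the study's data.
ptn = [0.10, 0.12, 0.11, 0.13]
ptu = [0.20, 0.22, 0.19, 0.21]
r   = [0.05, 0.06, 0.04, 0.05]
wo  = [0.15, 0.14, 0.16, 0.15]
f_stat, df_b, df_w = one_way_anova([ptn, ptu, r, wo])
```

    A large F would then be followed, as in the study, by a post hoc comparison such as Tukey's HSD to locate which system pairs differ.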

  1. Utility FGD survey, January--December 1989. Volume 2, Design performance data for operating FGD systems: Part 2

    SciTech Connect

    Hance, S.L.; McKibben, R.S.; Jones, F.M.

    1992-03-01

    This is Volume 2, Part 2, of the Utility Flue Gas Desulfurization (FGD) Survey report. The report, generated by a computerized database management system, represents a survey of operational and planned domestic utility FGD systems. It summarizes information contributed by the utility industry, system and equipment suppliers, system designers, research organizations, and regulatory agencies. The data cover system design, fuel characteristics, operating history, and actual system performance. Also included is a unit-by-unit discussion of problems and solutions associated with the boilers, scrubbers, and FGD systems. This volume in particular contains basic design and performance data.

  2. Changes in depth occupied by Great Lakes lake whitefish populations and the influence of survey design

    USGS Publications Warehouse

    Rennie, Michael D.; Weidel, Brian C.; Claramunt, Randall M.; Dunlob, Erin S.

    2015-01-01

    Understanding fish habitat use is important in determining conditions that ultimately affect fish energetics, growth and reproduction. Great Lakes lake whitefish (Coregonus clupeaformis) have demonstrated dramatic changes in growth and life history traits since the appearance of dreissenid mussels in the Great Lakes, but the role of habitat occupancy in driving these changes is poorly understood. To better understand temporal changes in lake whitefish depth of capture (Dw), we compiled a database of fishery-independent surveys representing multiple populations across all five Laurentian Great Lakes. By demonstrating the importance of survey design in estimating Dw, we describe a novel method for detecting survey-based bias in Dw and removing potentially biased data. Using unbiased Dw estimates, we show clear differences in the pattern and timing of changes in lake whitefish Dw between our reference sites (Lake Superior) and those that have experienced significant benthic food web changes (lakes Michigan, Huron, Erie and Ontario). Lake whitefish Dw in Lake Superior tended to gradually shift to shallower waters, but changed rapidly in other locations coincident with dreissenid establishment and declines in Diporeia densities. Almost all lake whitefish populations that were exposed to dreissenids demonstrated deeper Dw following benthic food web change, though a subset of these populations subsequently shifted to more shallow depths. In some cases in lakes Huron and Ontario, shifts towards more shallow Dw are occurring well after documented Diporeia collapse, suggesting the role of other drivers such as habitat availability or reliance on alternative prey sources.

  3. THE COS-HALOS SURVEY: RATIONALE, DESIGN, AND A CENSUS OF CIRCUMGALACTIC NEUTRAL HYDROGEN

    SciTech Connect

    Tumlinson, Jason; Thom, Christopher; Sembach, Kenneth R.; Werk, Jessica K.; Prochaska, J. Xavier; Davé, Romeel; Oppenheimer, Benjamin D.; Ford, Amanda Brady; O'Meara, John M.; Peeples, Molly S.; Weinberg, David H.

    2013-11-01

    We present the design and methods of the COS-Halos survey, a systematic investigation of the gaseous halos of 44 z = 0.15-0.35 galaxies using background QSOs observed with the Cosmic Origins Spectrograph aboard the Hubble Space Telescope. This survey has yielded 39 spectra of z_em ≅ 0.5 QSOs with S/N ∼ 10-15 per resolution element. The QSO sightlines pass within 150 physical kpc of the galaxies, which span early and late types over stellar mass log M*/M☉ = 9.5-11.5. We find that the circumgalactic medium exhibits strong H I, averaging ≅ 1 Å in Lyα equivalent width out to 150 kpc, with 100% covering fraction for star-forming galaxies and 75% covering for passive galaxies. We find good agreement in column densities between this survey and previous studies over a similar range of impact parameter. There is weak evidence for a difference between early- and late-type galaxies in the strength and distribution of H I. Kinematics indicate that the detected material is bound to the host galaxy, such that ≳90% of the detected column density is confined within ±200 km s⁻¹ of the galaxies. This material generally exists well below the halo virial temperatures, at T ≲ 10⁵ K. We evaluate a number of possible origin scenarios for the detected material, and in the end favor a simple model in which the bulk of the detected H I arises in a bound, cool, low-density photoionized diffuse medium that is generic to all L* galaxies and may harbor a total gaseous mass comparable to galactic stellar masses.
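
    A covering fraction like those quoted above is a detection fraction over sightlines; attaching a Wilson score interval gives a feel for the counting uncertainty of such a fraction. The detection counts below are hypothetical, not the survey's:

```python
import math

def covering_fraction(n_det, n_total, z=1.0):
    """Detection (covering) fraction with a Wilson score interval
    (z = 1 gives a roughly 68% confidence interval)."""
    p = n_det / n_total
    denom = 1.0 + z ** 2 / n_total
    centre = (p + z ** 2 / (2 * n_total)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n_total
                                   + z ** 2 / (4 * n_total ** 2))
    return p, centre - half, centre + half

# Hypothetical counts: H I detected along 21 of 28 sightlines.
p, lower, upper = covering_fraction(21, 28)
```

    With samples of a few tens of sightlines, the interval spans roughly ±8%, which is why covering-fraction differences between galaxy subsamples need to be sizeable before they are significant.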

  4. Mapping epistatic quantitative trait loci underlying endosperm traits using all markers on the entire genome in a random hybridization design.

    PubMed

    He, X-H; Zhang, Y-M

    2008-07-01

    Triploid endosperm is of great economic importance owing to its nutritious quality. Mapping endosperm trait loci (ETL) can provide an efficient way to genetically improve grain quality. However, most triploid ETL mapping methods do not produce unbiased estimates of the two dominant effects of ETL. A random hybridization design is an alternative method that may be used to overcome this problem. However, epistasis has an important role in the dissection of genetic architecture for complex traits. In this study, therefore, an attempt was made to map epistatic ETL (eETL) under a triploid genetic model of endosperm traits in a random hybridization design. The endosperm trait means of random hybrid lines, together with known marker genotype information from their corresponding parental F(2) plants, were used to estimate, efficiently and without bias, the positions and all of the effects of eETL using a penalized maximum likelihood method. The method proposed in this article was verified by a series of Monte Carlo simulation experiments. Results from the simulated studies show that the proposed method provides accurate estimates of eETL parameters with a low false-positive rate and a relatively short running time. This new method enables us to map triploid eETL in the same way as diploid quantitative traits.

  5. Sampling design for an integrated socioeconomic and ecological survey by using satellite remote sensing and ordination.

    PubMed

    Binford, Michael W; Lee, Tae Jeong; Townsend, Robert M

    2004-08-03

    Environmental variability is an important risk factor in rural agricultural communities. Testing models requires empirical sampling that generates data that are representative in both economic and ecological domains. Detrended correspondence analysis of satellite remote sensing data was used to design an effective low-cost sampling protocol for a field study to create an integrated socioeconomic and ecological database when no prior information on the ecology of the survey area existed. We stratified the sample for the selection of tambons from various preselected provinces in Thailand based on factor analysis of spectral land-cover classes derived from satellite data. We conducted the survey for the sampled villages in the chosen tambons. The resulting data capture interesting variations in soil productivity and in the timing of good and bad years, which a purely random sample would likely have missed. Thus, this database will allow tests of hypotheses concerning the effect of credit on productivity, the sharing of idiosyncratic risks, and the economic influence of environmental variability.
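
    The stratified selection described above amounts to grouping units by their land-cover signature and sampling within each stratum. A minimal sketch with invented cover fractions standing in for the satellite-derived spectral classes (a real implementation would stratify on factor scores rather than the dominant class):

```python
import random

# Hypothetical fraction of each land-cover class (forest, paddy, urban)
# per administrative unit; values are illustrative only.
tambons = {
    "T01": (0.8, 0.1, 0.1), "T02": (0.7, 0.2, 0.1), "T03": (0.1, 0.8, 0.1),
    "T04": (0.2, 0.7, 0.1), "T05": (0.1, 0.2, 0.7), "T06": (0.2, 0.1, 0.7),
}

def stratum(cover):
    # Assign each unit to the stratum of its dominant land-cover class.
    return max(range(len(cover)), key=lambda i: cover[i])

def stratified_sample(units, per_stratum=1, seed=7):
    strata = {}
    for name, cover in sorted(units.items()):
        strata.setdefault(stratum(cover), []).append(name)
    rng = random.Random(seed)
    return {s: rng.sample(members, min(per_stratum, len(members)))
            for s, members in sorted(strata.items())}

sample = stratified_sample(tambons)
```

    Sampling within ecological strata is what guarantees coverage of the environmental gradient that a purely random draw of villages could miss.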

  6. Survey of ethical issues reported by Indian medical students: basis for design of a new curriculum.

    PubMed

    Rose, Anuradha; George, Kuryan; T, Arul Dhas; Pulimood, Anna Benjamin

    2014-01-01

    Education in ethics is now a formal part of the undergraduate medical curriculum. However, most courses are structured around principles and case studies more appropriate to western countries. The cultures and practices of countries like India differ from those of western countries. It is, therefore, essential that our teaching should address the issues which are the most relevant to our setting. An anonymised, questionnaire-based, cross-sectional survey of medical students was carried out to get a picture of the ethical problems faced by students in India. The data were categorised into issues related to professional behaviour and ethical dilemmas. Unprofessional behaviour was among the issues reported as a matter of concern by a majority of the medical students. The survey highlights the need to design the curriculum in a way that reflects the structure of medical education in India, where patients are not always considered socio-culturally equal by students or the medical staff. This perspective must underpin any further efforts to address education in ethics in India.

  7. SURVEY DESIGN FOR SPECTRAL ENERGY DISTRIBUTION FITTING: A FISHER MATRIX APPROACH

    SciTech Connect

    Acquaviva, Viviana; Gawiser, Eric; Bickerton, Steven J.; Grogin, Norman A.; Guo Yicheng; Lee, Seong-Kook

    2012-04-10

    The spectral energy distribution (SED) of a galaxy contains information on the galaxy's physical properties, and multi-wavelength observations are needed in order to measure these properties via SED fitting. In planning these surveys, optimization of the resources is essential. The Fisher Matrix (FM) formalism can be used to quickly determine the best possible experimental setup to achieve the desired constraints on the SED-fitting parameters. However, because it relies on the assumption of a Gaussian likelihood function, it is in general less accurate than other slower techniques that reconstruct the probability distribution function (PDF) from the direct comparison between models and data. We compare the uncertainties on SED-fitting parameters predicted by the FM to the ones obtained using the more thorough PDF-fitting techniques. We use both simulated spectra and real data, and consider a large variety of target galaxies differing in redshift, mass, age, star formation history, dust content, and wavelength coverage. We find that the uncertainties reported by the two methods agree within a factor of two in the vast majority (≈90%) of cases. If the age determination is uncertain, the top-hat prior in age used in PDF fitting to prevent each galaxy from being older than the universe needs to be incorporated in the FM, at least approximately, before the two methods can be properly compared. We conclude that the FM is a useful tool for astronomical survey design.
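The FM forecast described above reduces to a few lines of linear algebra: under a Gaussian likelihood, the Fisher matrix is assembled from the derivatives of the model fluxes with respect to the fitting parameters, weighted by the photometric errors, and its inverse bounds the parameter covariance. The sketch below assumes this standard construction; the toy derivative values and error bars are invented for illustration and are not taken from the paper.

```python
import numpy as np

def fisher_matrix(model_flux_grad, sigma):
    """Fisher matrix for SED-fitting parameters under a Gaussian likelihood.

    model_flux_grad: (n_bands, n_params) array of dF_b/dtheta_i evaluated at
                     the fiducial parameters (hypothetical inputs).
    sigma: (n_bands,) photometric 1-sigma uncertainties per band.
    """
    n_params = model_flux_grad.shape[1]
    F = np.zeros((n_params, n_params))
    for b in range(model_flux_grad.shape[0]):
        g = model_flux_grad[b]
        # Each band contributes an outer product of flux derivatives,
        # weighted by its inverse variance.
        F += np.outer(g, g) / sigma[b] ** 2
    return F

# Toy example: 3 bands, 2 parameters (e.g. age and dust content).
grad = np.array([[1.0, 0.5],
                 [0.2, 1.5],
                 [0.8, 0.3]])
sigma = np.array([0.1, 0.1, 0.2])

# Forecast 1-sigma uncertainties: sqrt of the diagonal of the inverse FM.
cov = np.linalg.inv(fisher_matrix(grad, sigma))
errors = np.sqrt(np.diag(cov))
```

Because only derivatives and error bars enter, candidate filter sets and depths can be compared quickly, without running a full PDF reconstruction for each survey configuration.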

  8. Developing an efficient modelling and data presentation strategy for ATDEM system comparison and survey design

    NASA Astrophysics Data System (ADS)

    Combrinck, Magdel

    2015-10-01

    Forward modelling of airborne time-domain electromagnetic (ATDEM) responses is frequently used to compare systems and design surveys for optimum detection of expected mineral exploration targets. It is a challenging exercise to display and analyse the forward modelled responses due to the large amount of data generated for three dimensional models as well as the system dependent nature of the data. I propose simplifying the display of ATDEM responses through using the dimensionless quantity of signal-to-noise ratios (signal:noise) instead of respective system units. I also introduce the concept of a three-dimensional signal:noise nomo-volume as an efficient tool to visually present and analyse large amounts of data. The signal:noise nomo-volume is a logical extension of the two-dimensional conductance nomogram. It contains the signal:noise values of all system time channels and components for various target depths and conductances integrated into a single interactive three-dimensional image. Responses are calculated over a complete survey grid and therefore include effects of system and target geometries. The user can interactively select signal:noise cut-off values on the nomo-volume and is able to perform visual comparisons between various system and target responses. The process is easy to apply and geophysicists with access to forward modelling airborne electromagnetic (AEM) and three-dimensional imaging software already possess the tools required to produce and analyse signal:noise nomo-volumes.
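The nomo-volume idea amounts to evaluating a forward model over a grid of target depths, conductances, and time channels, dividing by the system noise floor to get dimensionless signal:noise values, and letting the user threshold the resulting 3-D volume. The decay law below is a deliberately crude stand-in for a real ATDEM forward model (an assumption, not a published response curve); only the grid bookkeeping and cut-off step are the point.

```python
import numpy as np

# Axes of the hypothetical nomo-volume.
depths = np.array([100.0, 200.0, 300.0])       # target depth, m
conductances = np.array([10.0, 30.0, 100.0])   # target conductance, S
channels = np.arange(1, 11)                    # time-channel index

# Toy forward model (assumption): amplitude falls off steeply with depth
# and decays across time channels at a conductance-dependent rate.
d, c, t = np.meshgrid(depths, conductances, channels, indexing="ij")
signal = 1e6 * c / d**3 * np.exp(-t / (0.5 * c))

noise_floor = 2.0            # per-channel system noise, same units as signal
snr = signal / noise_floor   # the dimensionless nomo-volume values

# Interactive cut-off selection corresponds to thresholding the volume.
detectable = snr >= 3.0
```

Because the values are dimensionless, volumes computed for different ATDEM systems (each with its own units and noise floor) can be overlaid and compared directly.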

  9. A survey of pulse shape options for a revised plastic ablator ignition design

    SciTech Connect

    Clark, D. S.; Milovich, J. L.; Hinkel, D. E.; Salmonson, J. D.; Peterson, J. L.; Berzak Hopkins, L. F.; Eder, D. C.; Haan, S. W.; Jones, O. S.; Marinak, M. M.; Robey, H. F.; Smalyuk, V. A.; Weber, C. R.

    2014-11-15

    Recent experimental results using the “high foot” pulse shape for inertial confinement fusion ignition experiments on the National Ignition Facility (NIF) [Moses et al., Phys. Plasmas 16, 041006 (2009)] have shown encouraging progress compared to earlier “low foot” experiments. These results strongly suggest that controlling ablation front instability growth can significantly improve implosion performance even in the presence of persistent, large, low-mode distortions. Simultaneously, hydrodynamic growth radiography experiments have confirmed that ablation front instability growth is being modeled fairly well in NIF experiments. It is timely then to combine these two results and ask how current ignition pulse shapes could be modified to improve one-dimensional implosion performance while maintaining the stability properties demonstrated with the high foot. This paper presents such a survey of pulse shapes intermediate between the low and high foot extremes in search of an intermediate foot optimum. Of the design space surveyed, it is found that a higher picket version of the low foot pulse shape shows the most promise for improved compression without loss of stability.

  10. The Unique Optical Design of the CTI-II Survey Telescope

    NASA Astrophysics Data System (ADS)

    Ackermann, Mark R.; McGraw, J. T.; MacFarlane, M.

    2006-12-01

    The CCD/Transit Instrument with Innovative Instrumentation (CTI-II) is being developed for precision ground-based astrometric and photometric astronomical observations. The 1.8m telescope will be stationary, near-zenith pointing and will feature a CCD-mosaic array operated in time-delay and integrate (TDI) mode to image a continuous strip of the sky in five bands. The heart of the telescope is a Nasmyth-like bent-Cassegrain optical system optimized to produce near diffraction-limited images with near zero distortion over a circular 1.42 deg field. The optical design includes an f/2.2 parabolic ULE primary with no central hole salvaged from the original CTI telescope and adds the requisite hyperbolic secondary, a folding flat and a highly innovative all-spherical, five-lens corrector which includes three plano surfaces. The reflective and refractive portions of the design have been optimized as individual but interdependent systems so that the same reflective system can be used with slightly different refractive correctors. At present, two nearly identical corrector designs are being evaluated, one fabricated from BK-7 glass and the other of fused silica. The five-lens corrector consists of an air-spaced triplet separated from a follow-on air-spaced doublet. Either design produces 0.25 arcsecond images at 83% encircled energy with a maximum of 0.0005% distortion. The innovative five-lens corrector design has been applied to other current and planned Cassegrain, RC and super RC optical systems requiring correctors. The basic five-lens approach always results in improved performance compared to the original designs. In some cases, the improvement in image quality is small but includes substantial reductions in distortion. In other cases, the improvement in image quality is substantial. Because the CTI-II corrector is designed for a parabolic primary, it might be especially useful for liquid mirror telescopes.
We describe and discuss the CTI-II optical design with respect

  11. Hot rocket plume experiment - Survey and conceptual design. [of rhenium-iridium bipropellants

    NASA Technical Reports Server (NTRS)

    Millard, Jerry M.; Luan, Taylor W.; Dowdy, Mack W.

    1992-01-01

    Attention is given to a study of a space-borne engine plume experiment that will both verify and quantify the reduced contamination from advanced rhenium-iridium earth-storable bipropellant rockets (hot rockets) and provide a correlation between high-fidelity, in-space measurements and theoretical plume and surface contamination models. The experiment conceptual design is based on survey results from plume and contamination technologists throughout the U.S. With respect to shuttle use, cursory investigations validate Hitchhiker availability and adaptability, adequate remote manipulator system (RMS) articulation and dynamic capability, acceptable RMS attachment capability, adequate power and telemetry capability, and adequate flight altitude and attitude/orbital capability.

  12. Autonomous Underwater Vehicle Survey Design for Monitoring Carbon Capture and Storage Sites

    NASA Astrophysics Data System (ADS)

    Bull, J. M.; Cevatoglu, M.; Connelly, D.; Wright, I. C.; McPhail, S.; Shitashima, K.

    2013-12-01

    Long-term monitoring of sub-seabed Carbon Capture and Storage (CCS) sites will require systems that are flexible, independent, and have long endurance. In this presentation we will discuss the utility of autonomous underwater vehicles equipped with different sensor packages in monitoring storage sites. We will present data collected using the Autosub AUV, as part of the ECO2 project, from the Sleipner area of the North Sea. The Autosub AUV was equipped with sidescan sonar, an EM2000 multibeam system, a Chirp sub-bottom profiler, and a variety of chemical sensors. Our presentation will focus on survey design, and the simultaneous use of multiple sensor packages in environmental monitoring on the continental shelf.

  13. Monte Carlo Analysis as a Trajectory Design Driver for the Transiting Exoplanet Survey Satellite (TESS) Mission

    NASA Technical Reports Server (NTRS)

    Nickel, Craig; Parker, Joel; Dichmann, Don; Lebois, Ryan; Lutz, Stephen

    2016-01-01

    The Transiting Exoplanet Survey Satellite (TESS) will be injected into a highly eccentric Earth orbit and fly 3.5 phasing loops followed by a lunar flyby to enter a mission orbit with lunar 2:1 resonance. Through the phasing loops and mission orbit, the trajectory is significantly affected by lunar and solar gravity. We have developed a trajectory design to achieve the mission orbit and meet mission constraints, including eclipse avoidance and a 30-year geostationary orbit avoidance requirement. A parallelized Monte Carlo simulation was performed to validate the trajectory after injecting common perturbations, including launch dispersions, orbit determination errors, and maneuver execution errors. The Monte Carlo analysis helped identify mission risks and is used in the trajectory selection process.
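The validation step described above amounts to drawing each perturbation source from its assumed dispersion, propagating the perturbed trajectory, and counting how often a mission constraint is violated. In the minimal sketch below, the one-line "propagation" is a hypothetical stand-in for a full trajectory integrator, and the dispersion magnitudes and the 200 km perigee constraint are illustrative assumptions, not TESS requirements.

```python
import numpy as np

rng = np.random.default_rng(0)

def propagate(dv_error, injection_error):
    """Stand-in for a full trajectory propagation (assumption): returns a
    final perigee altitude in km given the injected perturbations."""
    nominal_perigee = 350.0
    return nominal_perigee + 40.0 * injection_error + 25.0 * dv_error

n_trials = 10_000
# Common perturbation sources, each drawn from an assumed Gaussian dispersion
# (launch injection dispersion, maneuver execution error, etc.).
inj_err = rng.normal(0.0, 1.0, n_trials)
dv_err = rng.normal(0.0, 1.0, n_trials)

perigee = propagate(dv_err, inj_err)
# Estimate the probability of violating a hypothetical 200 km perigee floor.
p_violation = np.mean(perigee < 200.0)
```

In practice each trial is an independent propagation, so the loop parallelizes trivially across cores or nodes, which is why the paper's simulation could be run in parallel.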

  14. Designing HIGH-COST Medicine: Hospital Surveys, Health Planning, and the Paradox of Progressive Reform

    PubMed Central

    2010-01-01

    Inspired by social medicine, some progressive US health reforms have paradoxically reinforced a business model of high-cost medical delivery that does not match social needs. In analyzing the financial status of their areas’ hospitals, for example, city-wide hospital surveys of the 1910s through 1930s sought to direct capital investments and, in so doing, control competition and markets. The 2 national health planning programs that ran from the mid-1960s to the mid-1980s continued similar strategies of economic organization and management, as did the so-called market reforms that followed. Consequently, these reforms promoted large, extremely specialized, capital-intensive institutions and systems at the expense of less complex (and less costly) primary and chronic care. The current capital crisis may expose the lack of sustainability of such a model and open up new ideas and new ways to build health care designed to meet people's health needs. PMID:20019312

  15. Designing HIGH-COST medicine: hospital surveys, health planning, and the paradox of progressive reform.

    PubMed

    Perkins, Barbara Bridgman

    2010-02-01

    Inspired by social medicine, some progressive US health reforms have paradoxically reinforced a business model of high-cost medical delivery that does not match social needs. In analyzing the financial status of their areas' hospitals, for example, city-wide hospital surveys of the 1910s through 1930s sought to direct capital investments and, in so doing, control competition and markets. The 2 national health planning programs that ran from the mid-1960s to the mid-1980s continued similar strategies of economic organization and management, as did the so-called market reforms that followed. Consequently, these reforms promoted large, extremely specialized, capital-intensive institutions and systems at the expense of less complex (and less costly) primary and chronic care. The current capital crisis may expose the lack of sustainability of such a model and open up new ideas and new ways to build health care designed to meet people's health needs.

  16. Nonexperimental Quantitative Research and Its Role in Guiding Instruction

    ERIC Educational Resources Information Center

    Cook, Bryan G.; Cook, Lysandra

    2008-01-01

    Different research designs answer different questions. Educators cannot use nonexperimental quantitative research designs, such as descriptive surveys and correlational research, to determine definitively that an intervention causes improved student outcomes and is an evidence-based practice. However, such research can (a) inform educators about a…

  17. S-CANDELS: The Spitzer-Cosmic Assembly Near-Infrared Deep Extragalactic Survey. Survey Design, Photometry, and Deep IRAC Source Counts

    NASA Astrophysics Data System (ADS)

    Ashby, M. L. N.; Willner, S. P.; Fazio, G. G.; Dunlop, J. S.; Egami, E.; Faber, S. M.; Ferguson, H. C.; Grogin, N. A.; Hora, J. L.; Huang, J.-S.; Koekemoer, A. M.; Labbé, I.; Wang, Z.

    2015-06-01

    The Spitzer-Cosmic Assembly Deep Near-infrared Extragalactic Legacy Survey (S-CANDELS; PI G. Fazio) is a Cycle 8 Exploration Program designed to detect galaxies at very high redshifts (z > 5). To mitigate the effects of cosmic variance and also to take advantage of deep coextensive coverage in multiple bands by the Hubble Space Telescope (HST) Multi-cycle Treasury Program CANDELS, S-CANDELS was carried out within five widely separated extragalactic fields: the UKIDSS Ultra-deep Survey, the Extended Chandra Deep Field South, COSMOS, the HST Deep Field North, and the Extended Groth Strip. S-CANDELS builds upon the existing coverage of these fields from the Spitzer Extended Deep Survey (SEDS), a Cycle 6 Exploration Program, by increasing the integration time from SEDS' 12 hr to a total of 50 hr but within a smaller area, 0.16 deg². The additional depth significantly increases the survey completeness at faint magnitudes. This paper describes the S-CANDELS survey design, processing, and publicly available data products. We present Infrared Array Camera (IRAC) dual-band 3.6 and 4.5 μm catalogs reaching to a depth of 26.5 AB mag. Deep IRAC counts for the roughly 135,000 galaxies detected by S-CANDELS are consistent with models based on known galaxy populations. The increase in depth beyond earlier Spitzer/IRAC surveys does not reveal a significant additional contribution from discrete sources to the diffuse Cosmic Infrared Background (CIB). Thus it remains true that only roughly half of the estimated CIB flux from COBE/DIRBE is resolved.

  18. Technology transfer with system analysis, design, decision making, and impact (Survey-2000) in acute care hospitals in the United States.

    PubMed

    Hatcher, M

    2001-10-01

    This paper provides the results of the Survey-2000 measuring technology transfer for management information systems in health care. The relationships with systems approaches, user involvement, user satisfaction, and decision-making were measured and are presented. The survey also measured the levels of Internet and intranet presence in acute care hospitals, which will be discussed in future articles. The depth of the survey includes e-commerce for both business-to-business and customer transactions. These results are compared, where appropriate, with results from the 1997 survey, and changes are discussed. This information will provide benchmarks for hospitals to plan their network technology position and to set goals. This is the first of three articles based upon the results of the Survey-2000. Readers are referred to a prior article by the author that discusses the survey design and provides a tutorial on technology transfer in acute care hospitals.

  19. Thermal Design of the Instrument for the Transiting Exoplanet Survey Satellite

    NASA Astrophysics Data System (ADS)

    Allen, Gregory D.

    The thermal design and analysis of space systems is an important application for the field of mechanical engineering. Space systems encounter harsh environments and often have exacting temperature and performance requirements. In this thesis, the thermal design and analysis process undertaken for the Instrument of the Transiting Exoplanet Survey Satellite (TESS) is detailed. The TESS program is a two-year NASA Explorer mission which uses four cameras to discover exoplanets via the transit photometry method. It will be placed in a high Earth orbit with a period of 13.7 days and will be unaffected by temperature disturbances caused by environmental heating from the Earth. The cameras use their stray-light baffles to passively cool the cameras and in turn the CCDs in order to maintain operational temperatures. The payload encompasses four cameras with unique thermal requirements which the system was designed to accommodate. These requirements include large power-level uncertainty, highly stable temperatures, low-temperature CCDs, and a compact mechanical design. The design was matured through analysis using a thermal modeling tool known as Thermal Desktop™, which uses the finite difference method. A system-level model was built with this tool using inputs such as thermal and thermo-optical properties, the 3D CAD model, and thermal contact resistances. It was then used to analyze the system against component temperature limits including NASA-specified design margins. Bounding cases have been developed which envelope hot and cold operational cases as well as cold survival during eclipse. Results are presented which show that margins are positive. These design margins provide for contingency in the case of modeling inaccuracies. Later in the program the Instrument will undergo thermal vacuum testing in order to verify the model. Official validation and verification planning is underway and will be performed as the system is built up.
It is slated for launch

  20. Mechanical Design of NESSI: New Mexico Tech Extrasolar Spectroscopic Survey Instrument

    NASA Technical Reports Server (NTRS)

    Santoro, Fernando G.; Olivares, Andres M.; Salcido, Christopher D.; Jimenez, Stephen R.; Jurgenson, Colby A.; Hrynevych, Michael A.; Creech-Eakman, Michelle J.; Boston, Penny J.; Schmidt, Luke M.; Bloemhard, Heather; Rodeheffer, Dan; Vaive, Genevieve; Vasisht, Gautam; Swain, Mark R.; Deroo, Pieter

    2011-01-01

    NESSI, the New Mexico Tech Extrasolar Spectroscopic Survey Instrument, is a ground-based multi-object spectrograph that operates in the near-infrared. It will be installed on one of the Nasmyth ports of the Magdalena Ridge Observatory (MRO) 2.4-meter Telescope sited in the Magdalena Mountains, about 48 km west of Socorro, NM. NESSI is mounted stationary with respect to the telescope fork so as not to produce differential flexure between internal opto-mechanical components during or between observations. An appropriate mechanical design allows the instrument alignment to be highly repeatable and stable over both short and long observation timescales, within a wide range of temperature variation. NESSI is optically composed of a field lens, a field de-rotator, re-imaging optics, an auto-guider, and a Dewar spectrograph that operates at LN2 temperature. In this paper we report on NESSI's detailed mechanical and opto-mechanical design, and the planning for mechanical construction, assembly, integration, and verification.

  1. The Hawk-I UDS and GOODS Survey (HUGS): Survey design and deep K-band number counts

    NASA Astrophysics Data System (ADS)

    Fontana, A.; Dunlop, J. S.; Paris, D.; Targett, T. A.; Boutsia, K.; Castellano, M.; Galametz, A.; Grazian, A.; McLure, R.; Merlin, E.; Pentericci, L.; Wuyts, S.; Almaini, O.; Caputi, K.; Chary, R.-R.; Cirasuolo, M.; Conselice, C. J.; Cooray, A.; Daddi, E.; Dickinson, M.; Faber, S. M.; Fazio, G.; Ferguson, H. C.; Giallongo, E.; Giavalisco, M.; Grogin, N. A.; Hathi, N.; Koekemoer, A. M.; Koo, D. C.; Lucas, R. A.; Nonino, M.; Rix, H. W.; Renzini, A.; Rosario, D.; Santini, P.; Scarlata, C.; Sommariva, V.; Stark, D. P.; van der Wel, A.; Vanzella, E.; Wild, V.; Yan, H.; Zibetti, S.

    2014-10-01

    We present the results of a new, ultra-deep, near-infrared imaging survey executed with the Hawk-I imager at the ESO VLT, of which we make all the data (images and catalog) public. This survey, named HUGS (Hawk-I UDS and GOODS Survey), provides deep, high-quality imaging in the K and Y bands over the portions of the UKIDSS UDS and GOODS-South fields covered by the CANDELS HST WFC3/IR survey. In this paper we describe the survey strategy, the observational campaign, the data reduction process, and the data quality. We show that, thanks to exquisite image quality and extremely long exposure times, HUGS delivers the deepest K-band images ever collected over areas of cosmological interest, and in general ideally complements the CANDELS data set in terms of image quality and depth. In the GOODS-S field, the K-band observations cover the whole CANDELS area with a complex geometry made of 6 different, partly overlapping pointings, in order to best match the deep and wide areas of CANDELS imaging. In the deepest region (which includes most of the Hubble Ultra Deep Field) exposure times exceed 80 hours of integration, yielding a 1-σ magnitude limit per square arcsec of ≃28.0 AB mag. The seeing is exceptional and homogeneous across the various pointings, confined to the range 0.38-0.43 arcsec. In the UDS field the survey is about one magnitude shallower (to match the correspondingly shallower depth of the CANDELS images) but also includes Y-band imaging (which, in the UDS, was not provided by the CANDELS WFC3/IR imaging). In the K band, with an average exposure time of 13 hours, and seeing in the range 0.37-0.43 arcsec, the 1-σ limit per square arcsec in the UDS imaging is ≃27.3 AB mag. In the Y band, with an average exposure time of ≃8 h, and seeing in the range 0.45-0.5 arcsec, the imaging yields a 1-σ limit per square arcsec of ≃28.3 AB mag. We show that the HUGS observations are well matched to the depth of the CANDELS WFC3/IR data, since the majority

  2. Utility FGD Survey, January--December 1989. Volume 2, Design performance data for operating FGD systems, Part 1

    SciTech Connect

    Hance, S.L.; McKibben, R.S.; Jones, F.M.

    1992-03-01

    The Utility Flue Gas Desulfurization (FGD) Survey report, which is generated by a computerized database management system, covers operational and planned domestic utility FGD systems. It summarizes information contributed by the utility industry, system and equipment suppliers, system designers, research organizations, and regulatory agencies. The data cover system design, fuel characteristics, operating history, and actual system performance. Also included is a unit-by-unit discussion of problems and solutions associated with the boilers, scrubbers, and FGD systems. The development status (operational, under construction, or in the planning stages), system supplier, process, waste disposal practice, and regulatory class are tabulated alphabetically by utility company.

  3. Quantitative Evaluation of Tissue Surface Adaption of CAD-Designed and 3D Printed Wax Pattern of Maxillary Complete Denture

    PubMed Central

    Chen, Hu; Wang, Han; Lv, Peijun; Wang, Yong; Sun, Yuchun

    2015-01-01

    Objective. To quantitatively evaluate the tissue surface adaption of a maxillary complete denture wax pattern produced by CAD and 3DP. Methods. A standard edentulous maxilla plaster cast model was used, for which a wax pattern of a complete denture was designed using CAD software developed in our previous study and printed using a 3D wax printer, while another wax pattern was manufactured by the traditional manual method. The cast model and the two wax patterns were scanned in the 3D scanner as “DataModel,” “DataWaxRP,” and “DataWaxManual.” After setting each wax pattern on the plaster cast, the whole model was scanned for registration. After registration, the deviations of the tissue surface between “DataModel” and “DataWaxRP” and between “DataModel” and “DataWaxManual” were measured. The data were analyzed by paired t-test. Results. For both wax patterns, produced by the CAD&RP method and the manual method, the scanning data of the tissue surface and cast surface showed a good fit in the majority of areas. No statistically significant (P > 0.05) difference was observed between the CAD&RP method and the manual method. Conclusions. A wax pattern of a maxillary complete denture produced by the CAD&3DP method is comparable with the traditional manual method in its adaptation to the edentulous cast model. PMID:26583108

  4. A quantitative mosquito survey of 7 villages in Punjab Province, Pakistan with notes on bionomics, sampling methodology and the effects of insecticides.

    PubMed

    Reisen, W K

    1978-12-01

    A total of 451,337 female and male mosquitoes comprising 43 species in 9 genera were collected during a quantitative survey of 7 suburban and rural villages in the Lahore area during 1976 and 1977 using larval, indoor resting, outdoor resting, biting and light trap collections at weekly intervals. Culex tritaeniorhynchus was the most abundant species collected comprising 51.8% of the total specimens, followed by Cx. quinquefasciatus (16.4%), Cx. pseudovishnui (6.8%), An. subpictus (4.8%) and An. culicifacies (4.7%). Bovid bait collections provided the greatest diversity and highest numbers of mosquitoes per unit of collection effort, while light traps provided the poorest diversity and lowest numbers of specimens. Most species exhibited a bimodal seasonal abundance pattern, with peaks occurring in late spring and after the cessation of the heavy monsoon rains. The spraying of houses and cattle sheds with organophosphorous insecticides was effective in controlling the endophilic resting vectors of human Plasmodia, An. culicifacies and An. stephensi, but had little effect on the partially or completely exophilic resting species.

  5. What Are Probability Surveys?

    EPA Pesticide Factsheets

    The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample-surveys or statistical surveys), sampling sites are selected randomly.
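A minimal sketch of such a probability draw: every site in the sampling frame has a known, nonzero inclusion probability, so each sampled site carries a design weight equal to the inverse of that probability and results generalize to the whole resource. The site names and sample sizes below are invented for illustration (NARS uses more elaborate spatially balanced designs, not a plain simple random sample).

```python
import random

# Hypothetical sampling frame of candidate lake/stream sites.
frame = [f"site-{i:03d}" for i in range(500)]

rng = random.Random(42)
# Simple random sample without replacement: every site has the same
# known inclusion probability.
sample = rng.sample(frame, 50)

inclusion_prob = len(sample) / len(frame)   # 0.1 for every site in the frame
design_weight = 1.0 / inclusion_prob        # each sampled site represents 10 sites
```

The design weight is what lets condition estimates from the sampled sites be scaled up to a statement about all waters in the frame, with quantifiable uncertainty.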

  6. Optimal design of a lagrangian observing system for hydrodynamic surveys in coastal areas

    NASA Astrophysics Data System (ADS)

    Cucco, Andrea; Quattrocchi, Giovanni; Antognarelli, Fabio; Satta, Andrea; Maicu, Francesco; Ferrarin, Christian; Umgiesser, Georg

    2014-05-01

    The optimization of ocean observing systems is a pressing need for scientific research. In particular, the improvement of ocean short-term observing networks is achievable by reducing the cost-benefit ratio of the field campaigns and by increasing the quality of measurements. Numerical modeling is a powerful tool for determining the appropriateness of a specific observing system and for optimizing the sampling design. This is particularly true when observations are carried out in coastal areas and lagoons, where the use of satellites is impractical due to the shallowness of the water. For such areas, numerical models are the most efficient tool both to provide a preliminary assessment of the local physical environment and to make short-term predictions of its changes. In this context, a test case experiment was carried out within an enclosed shallow-water area, the Cabras Lagoon (Sardinia, Italy). The aim of the experiment was to explore the optimal design for a field survey based on the use of coastal lagrangian buoys. A three-dimensional hydrodynamic model based on the finite element method (SHYFEM3D, Umgiesser et al., 2004) was implemented to simulate the lagoon water circulation. The model domain extends over the whole Cabras Lagoon and the whole Oristano Gulf, including the surrounding coastal area. Lateral open boundary conditions were provided by the operational ocean model system WMED, and only wind forcing, provided by the SKIRON atmospheric model (Kallos et al., 1997), was considered as the surface boundary condition. The model was applied to provide a number of ad hoc scenarios and to explore the efficiency of the short-term hydrodynamic survey. A first field campaign was carried out to investigate the lagrangian circulation inside the lagoon under the main wind forcing condition (Mistral wind from the North-West). The trajectories followed by the lagrangian buoys and the estimated lagrangian velocities were used to calibrate the model parameters and to validate the

  7. Characteristics of Designated Drivers and their Passengers from the 2007 National Roadside Survey in the United States

    PubMed Central

    Bergen, Gwen; Yao, Jie; Shults, Ruth A.; Romano, Eduardo; Lacey, John

    2015-01-01

    Objective The objectives of this study were to estimate the prevalence of designated driving in the United States, compare these results with those from the 1996 National Roadside Survey, and explore the demographic, drinking, and trip characteristics of both designated drivers and their passengers. Methods The data used were from the 2007 National Roadside Survey which randomly stopped drivers, administered breath tests for alcohol, and administered a questionnaire to drivers and front seat passengers. Results Almost a third (30%) of nighttime drivers reported being designated drivers, with 84% of them having a blood alcohol concentration of zero. Drivers who were more likely to be designated drivers were those with a blood alcohol concentration that was over zero but still legal, who were under 35 years of age, who were African-American, Hispanic or Asian, and whose driving trip originated at a bar, tavern, or club. Over a third of passengers of designated drivers reported consuming an alcoholic drink the day of the survey compared with a fifth of passengers of non-designated drivers. One-fifth of designated driver passengers who reported drinking consumed five or more drinks that day. Conclusions Designated driving is widely used in the United States, with the majority of designated drivers abstaining from drinking alcohol. However as designated driving separates drinking from driving for passengers in a group travelling together, this may encourage passengers to binge drink, which is associated with many adverse health consequences in addition to those arising from alcohol-impaired driving. Designated driving programs and campaigns, although not proven to be effective when used alone, can complement proven effective interventions to help reduce excessive drinking and alcohol-impaired driving. PMID:24372499

  8. A two-phase sampling design for increasing detections of rare species in occupancy surveys

    USGS Publications Warehouse

    Pacifici, Krishna; Dorazio, Robert M.; Dorazio, Michael J.

    2012-01-01

    1. Occupancy estimation is a commonly used tool in ecological studies owing to the ease with which data can be collected and the large spatial extent that can be covered. One major obstacle to using an occupancy-based approach is the complications associated with designing and implementing an efficient survey. These logistical challenges become magnified when working with rare species, when effort can be wasted in areas with no or very few individuals. 2. Here, we develop a two-phase sampling approach that mitigates these problems by using a design that places more effort in areas with higher predicted probability of occurrence. We compare our new sampling design to traditional single-season occupancy estimation under a range of conditions and population characteristics. We develop an intuitive measure of predictive error to compare the two approaches and use simulations to assess the relative accuracy of each approach. 3. Our two-phase approach exhibited lower predictive error rates compared to the traditional single-season approach in highly spatially correlated environments. The difference was greatest when detection probability was high (0.75) regardless of the habitat or sample size. When the true occupancy rate was below 0.4 (0.05-0.4), we found that allocating 25% of the sample to the first phase resulted in the lowest error rates. 4. In the majority of scenarios, the two-phase approach showed lower error rates compared to the traditional single-season approach, suggesting our new approach is fairly robust to a broad range of conditions and design factors and merits use under a wide variety of settings. 5. Synthesis and applications. Conservation and management of rare species are a challenging task facing natural resource managers. It is critical for studies involving rare species to efficiently allocate effort and resources as they are usually of a finite nature. We believe our approach provides a framework for optimal allocation of effort while
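The allocation rule in the abstract — roughly a quarter of the effort spent on a broad first phase, with the remainder concentrated where predicted occurrence is highest — can be sketched as follows. The predicted occupancy values here are simulated and the budget numbers are invented; this is an illustration of the allocation logic, not the authors' estimation code.

```python
import numpy as np

rng = np.random.default_rng(1)

n_sites = 200
budget = 80  # total site visits available

# Hypothetical predicted occupancy probability per site (e.g. from a
# habitat model); skewed low, as expected for a rare species.
psi_hat = rng.beta(1, 6, n_sites)

# Phase 1: spend ~25% of the budget on a simple random sample of sites.
phase1_n = budget // 4
phase1 = rng.choice(n_sites, phase1_n, replace=False)

# Phase 2: put the remaining effort on the sites with the highest predicted
# probability of occurrence, excluding those already visited in phase 1.
remaining = np.setdiff1d(np.arange(n_sites), phase1)
order = remaining[np.argsort(psi_hat[remaining])[::-1]]
phase2 = order[: budget - phase1_n]

visited = np.concatenate([phase1, phase2])
```

The random first phase preserves some coverage of low-probability habitat (so the occupancy model can be checked), while the second phase concentrates the scarce visits where detections are most likely.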

  9. 50 CFR 600.1417 - Requirements for exempted state designation based on submission of recreational survey data.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 50 Wildlife and Fisheries 8 2010-10-01 2010-10-01 false Requirements for exempted state designation based on submission of recreational survey data. 600.1417 Section 600.1417 Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT...

  10. 50 CFR 600.1417 - Requirements for exempted state designation based on submission of recreational survey data.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 50 Wildlife and Fisheries 12 2012-10-01 2012-10-01 false Requirements for exempted state designation based on submission of recreational survey data. 600.1417 Section 600.1417 Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT...

  11. 50 CFR 600.1417 - Requirements for exempted state designation based on submission of recreational survey data.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 50 Wildlife and Fisheries 12 2013-10-01 2013-10-01 false Requirements for exempted state designation based on submission of recreational survey data. 600.1417 Section 600.1417 Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT...

  12. CONDITION ASSESSMENT FOR THE ESCAMBIA RIVER, FL, WATERSHED: BENTHIC MACROINVERTEBRATE SURVEYS USING A PROBABILISTIC SAMPLING DESIGN (POSTER SESSION)

    EPA Science Inventory

    Probabilistic sampling has been used to assess the condition of estuarine ecosystems, and the use of this survey design approach was examined for a northwest Florida watershed. Twenty-eight lotic sites within the Escambia River, Florida, watershed were randomly selected and visit...

  13. SIS Mixer Design for a Broadband Millimeter Spectrometer Suitable for Rapid Line Surveys and Redshift Determinations

    NASA Technical Reports Server (NTRS)

    Rice, F.; Sumner, M.; Zmuidzinas, J.; Hu, R.; LeDuc, H.; Harris, A.; Miller, D.

    2004-01-01

    We present some detail of the waveguide probe and SIS mixer chip designs for a low-noise 180-300 GHz double-sideband receiver with an instantaneous RF bandwidth of 24 GHz. The receiver's single SIS junction is excited by a broadband, fixed-tuned waveguide probe on a silicon substrate. The IF output is coupled to a 6-18 GHz MMIC low-noise preamplifier. Following further amplification, the output is processed by an array of 4 GHz, 128-channel analog autocorrelation spectrometers (WASP 11). The single-sideband receiver noise temperature goal of 70 Kelvin will provide a prototype instrument capable of rapid line surveys and of relatively efficient carbon monoxide (CO) emission line searches of distant, dusty galaxies. The latter application's goal is to determine redshifts by measuring the frequencies of CO line emissions from the star-forming regions dominating the submillimeter brightness of these galaxies. Construction of the receiver has begun; lab testing should begin in the fall. Demonstration of the receiver on the Caltech Submillimeter Observatory (CSO) telescope should begin in spring 2003.

  14. Using qualitative research to facilitate the interpretation of quantitative results from a discrete choice experiment: insights from a survey in elderly ophthalmologic patients

    PubMed Central

    Vennedey, Vera; Danner, Marion; Evers, Silvia MAA; Fauser, Sascha; Stock, Stephanie; Dirksen, Carmen D; Hiligsmann, Mickaël

    2016-01-01

    Background Age-related macular degeneration (AMD) is the leading cause of visual impairment and blindness in industrialized countries. Currently, mainly three treatment options are available, which are all intravitreal injections, but differ with regard to the frequency of injections needed, their approval status, and cost. This study aims to estimate patients’ preferences for characteristics of treatment options for neovascular AMD. Methods An interviewer-assisted discrete choice experiment was conducted among patients suffering from AMD treated with intravitreal injections. A Bayesian efficient design was used for the development of 12 choice tasks. In each task patients indicated their preference for one out of two treatment scenarios described by the attributes: side effects, approval status, effect on visual function, injection and monitoring frequency. While answering the choice tasks, patients were asked to think aloud and explain the reasons for choosing or rejecting specific characteristics. Quantitative data were analyzed with a mixed multinomial logit model. Results Eighty-six patients completed the questionnaire. Patients significantly preferred treatments that improve visual function, are approved, are administered in a pro re nata regimen (as needed), and are accompanied by bimonthly monitoring. Patients significantly disliked less frequent monitoring visits (every 4 months) and explained that this was due to fear that a deterioration would go unnoticed and the disease would in turn worsen. Significant preference heterogeneity was found for all levels except for bimonthly monitoring visits and severe, rare eye-related side effects. Patients gave clear explanations of their individual preferences during the interviews. Conclusion Significant preference trends were discernible for the overall sample, despite the preference heterogeneity for most treatment characteristics. Patients like to be monitored and treated regularly, but not too frequently
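The mixed multinomial logit model used in such analyses builds on the basic logit choice kernel, which is compact enough to sketch. The part-worth utilities below are invented for illustration and are not the study's estimates; a full mixed logit would additionally draw the coefficients from random distributions to capture preference heterogeneity:

```python
import math

def choice_probabilities(utilities):
    """Multinomial-logit choice probabilities: P_j = exp(V_j) / sum_k exp(V_k)."""
    exps = [math.exp(v) for v in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Invented part-worth utilities: +1.2 for improved visual function,
# +0.8 for an approved drug, -0.5 for monitoring only every 4 months.
scenario_a = 1.2 + 0.8   # improved vision, approved, bimonthly monitoring
scenario_b = 1.2 - 0.5   # improved vision, monitoring every 4 months
print(choice_probabilities([scenario_a, scenario_b]))
```

With these invented utilities the first scenario is chosen about 79% of the time, mirroring the qualitative finding that patients favor approved treatments with more frequent monitoring.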

  15. Performance Observations of Scanner Qualification of NCI-Designated Cancer Centers: Results From the Centers of Quantitative Imaging Excellence (CQIE) Program

    PubMed Central

    Rosen, Mark; Kinahan, Paul E.; Gimpel, James F.; Opanowski, Adam; Siegel, Barry A.; Hill, G. Craig; Weiss, Linda; Shankar, Lalitha

    2016-01-01

    We present an overview of the Centers for Quantitative Imaging Excellence (CQIE) program, which was initiated in 2010 to establish a resource of clinical trial-ready sites within the National Cancer Institute (NCI)-designated Cancer Centers (NCI-CCs) network. The intent was to enable imaging centers in the NCI-CCs network capable of conducting treatment trials with advanced quantitative imaging end points. We describe the motivations for establishing the CQIE, the process used to initiate the network, the methods of site qualification for positron emission tomography, computed tomography, and magnetic resonance imaging, and the results of the evaluations over the subsequent 3 years. PMID:28395794

  16. Improving the design of amphibian surveys using soil data: A case study in two wilderness areas

    USGS Publications Warehouse

    Bowen, K.D.; Beever, E.A.; Gafvert, U.B.

    2009-01-01

    Amphibian populations are known, or thought, to be declining worldwide. Although protected natural areas may act as reservoirs of biological integrity and serve as benchmarks for comparison with unprotected areas, they are not immune from population declines and extinctions and should be monitored. Unfortunately, identifying survey sites and performing long-term fieldwork within such (often remote) areas involves a special set of problems. We used the USDA Natural Resources Conservation Service Soil Survey Geographic (SSURGO) Database to identify, a priori, potential habitat for aquatic-breeding amphibians on North and South Manitou Islands, Sleeping Bear Dunes National Lakeshore, Michigan, and compared the results to those obtained using National Wetland Inventory (NWI) data. The SSURGO approach identified more target sites for surveys than the NWI approach, and it identified more small and ephemeral wetlands. Field surveys used a combination of daytime call surveys, night-time call surveys, and perimeter surveys. We found that sites that would not have been identified with NWI data often contained amphibians and, in one case, contained wetland-breeding species that would not have been found using NWI data. Our technique allows for easy a priori identification of numerous survey sites that might not be identified using other sources of spatial information. We recognize, however, that the most effective site identification and survey techniques will likely use a combination of methods in addition to those described here.

  17. Enhancing sampling design in mist-net bat surveys by accounting for sample size optimization.

    PubMed

    Trevelin, Leonardo Carreira; Novaes, Roberto Leonan Morim; Colas-Rosas, Paul François; Benathar, Thayse Cristhina Melo; Peres, Carlos A

    2017-01-01

    The advantages of mist-netting, the main technique used in Neotropical bat community studies to date, include logistical implementation, standardization and sampling representativeness. Nonetheless, study designs still have to deal with issues of detectability related to how different species behave and use the environment. Yet there is considerable sampling heterogeneity across available studies in the literature. Here, we approach the problem of sample size optimization. We evaluated the common sense hypothesis that the first six hours comprise the period of peak night activity for several species, thereby resulting in a representative sample for the whole night. To this end, we combined re-sampling techniques, species accumulation curves, threshold analysis, and community concordance of species compositional data, and applied them to datasets of three different Neotropical biomes (Amazonia, Atlantic Forest and Cerrado). We show that the strategy of restricting sampling to only six hours of the night frequently results in incomplete sampling representation of the entire bat community investigated. From a quantitative standpoint, results corroborated the existence of a major Sample Area effect in all datasets, although for the Amazonia dataset the six-hour strategy was significantly less species-rich after extrapolation, and for the Cerrado dataset it was more efficient. From the qualitative standpoint, however, results demonstrated that, for all three datasets, the identity of species that are effectively sampled will be inherently impacted by choices of sub-sampling schedule. We also propose an alternative six-hour sampling strategy (at the beginning and the end of a sample night) which performed better when resampling Amazonian and Atlantic Forest datasets on bat assemblages. Given the observed magnitude of our results, we propose that sample representativeness has to be carefully weighed against study objectives, and recommend that the trade-off between

  18. Enhancing sampling design in mist-net bat surveys by accounting for sample size optimization

    PubMed Central

    Trevelin, Leonardo Carreira; Novaes, Roberto Leonan Morim; Colas-Rosas, Paul François; Benathar, Thayse Cristhina Melo; Peres, Carlos A.

    2017-01-01

    The advantages of mist-netting, the main technique used in Neotropical bat community studies to date, include logistical implementation, standardization and sampling representativeness. Nonetheless, study designs still have to deal with issues of detectability related to how different species behave and use the environment. Yet there is considerable sampling heterogeneity across available studies in the literature. Here, we approach the problem of sample size optimization. We evaluated the common sense hypothesis that the first six hours comprise the period of peak night activity for several species, thereby resulting in a representative sample for the whole night. To this end, we combined re-sampling techniques, species accumulation curves, threshold analysis, and community concordance of species compositional data, and applied them to datasets of three different Neotropical biomes (Amazonia, Atlantic Forest and Cerrado). We show that the strategy of restricting sampling to only six hours of the night frequently results in incomplete sampling representation of the entire bat community investigated. From a quantitative standpoint, results corroborated the existence of a major Sample Area effect in all datasets, although for the Amazonia dataset the six-hour strategy was significantly less species-rich after extrapolation, and for the Cerrado dataset it was more efficient. From the qualitative standpoint, however, results demonstrated that, for all three datasets, the identity of species that are effectively sampled will be inherently impacted by choices of sub-sampling schedule. We also propose an alternative six-hour sampling strategy (at the beginning and the end of a sample night) which performed better when resampling Amazonian and Atlantic Forest datasets on bat assemblages. Given the observed magnitude of our results, we propose that sample representativeness has to be carefully weighed against study objectives, and recommend that the trade-off between
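The effect of sub-sampling schedule on which species are recorded can be illustrated with a toy resampling exercise. Everything below (species pools, activity windows, record counts) is invented; it only mimics the qualitative pattern described above, in which restricting effort to the first six hours misses late-active species while a split schedule recovers them:

```python
import random

random.seed(7)

def draw_record():
    """One synthetic capture: (hour after sunset, species). Species pools
    and activity windows are invented for illustration."""
    hour = random.randint(0, 11)
    if hour < 6:
        return hour, "early_sp%d" % random.randint(1, 20)
    return hour, "late_sp%d" % random.randint(1, 10)

records = [draw_record() for _ in range(600)]

def richness(recs):
    """Number of distinct species in a set of capture records."""
    return len({sp for _, sp in recs})

full = richness(records)                                        # whole night
first_six = richness([r for r in records if r[0] < 6])          # hours 0-5
split_six = richness([r for r in records if r[0] < 3 or r[0] >= 9])
print(full, first_six, split_six)
```

Because the invented late-active species never appear before hour 6, the first-six-hours schedule necessarily undercounts richness, whereas the split schedule samples both activity windows.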

  19. National Aquatic Resource Surveys: Multiple objectives and constraints lead to design complexity

    EPA Science Inventory

    The US Environmental Protection Agency began conducting the National Aquatic Resource Surveys (NARS) in 2007 with a national survey of lakes (NLA 2007) followed by rivers and streams in 2008-9 (NRSA 2008), coastal waters in 2010 (NCCA 2010) and wetlands in 2011 (NWCA). The surve...

  20. Experimental design and data-analysis in label-free quantitative LC/MS proteomics: A tutorial with MSqRob.

    PubMed

    Goeminne, Ludger J E; Gevaert, Kris; Clement, Lieven

    2017-04-05

    Label-free shotgun proteomics is routinely used to assess proteomes. However, extracting relevant information from the massive amounts of generated data remains difficult. This tutorial provides a strong foundation on analysis of quantitative proteomics data. We provide key statistical concepts that help researchers to design proteomics experiments and we showcase how to analyze quantitative proteomics data using our recent free and open-source R package MSqRob, which was developed to implement the peptide-level robust ridge regression method for relative protein quantification described by Goeminne et al. MSqRob can handle virtually any experimental proteomics design and outputs proteins ordered by statistical significance. Moreover, its graphical user interface and interactive diagnostic plots provide for easy inspection and detection of anomalies in the data and flaws in the data analysis, allowing deeper assessment of the validity of results and a critical review of the experimental design. Our tutorial discusses interactive preprocessing, data analysis and visualization of label-free MS-based quantitative proteomics experiments with simple and more complex designs. We provide well-documented scripts to run analyses in bash mode on GitHub, enabling the integration of MSqRob in automated pipelines on cluster environments (https://github.com/statOmics/MSqRob).
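MSqRob's peptide-level robust ridge regression is too involved for a snippet, but the basic quantity it estimates, a relative protein abundance between conditions, can be sketched with a crude median-summarization stand-in (all intensities and protein names below are invented):

```python
import math
from statistics import median

# Invented peptide-level intensities per protein for two conditions.
peptide_intensities = {
    "protA": {"ctrl": [2.1e6, 1.9e6, 2.3e6], "treat": [4.0e6, 4.4e6, 3.8e6]},
    "protB": {"ctrl": [5.0e5, 5.5e5, 4.8e5], "treat": [5.1e5, 4.9e5, 5.2e5]},
}

def log2_fold_change(protein):
    """Median-summarized log2 ratio (treat vs ctrl); a crude stand-in for
    MSqRob's peptide-level robust ridge regression."""
    ctrl = median(math.log2(x) for x in protein["ctrl"])
    treat = median(math.log2(x) for x in protein["treat"])
    return treat - ctrl

for name, prot in peptide_intensities.items():
    print(name, round(log2_fold_change(prot), 2))
```

Here protA roughly doubles in abundance (log2 fold change near 1) while protB is unchanged; MSqRob additionally moderates such estimates and attaches significance rankings.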

  1. Design Evolution of the Wide Field Infrared Survey Telescope Using Astrophysics Focused Telescope Assets (WFIRST-AFTA) and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Peabody, Hume L.; Peters, Carlton V.; Rodriguez-Ruiz, Juan E.; McDonald, Carson S.; Content, David A.; Jackson, Clifton E.

    2015-01-01

    The design of the Wide Field Infrared Survey Telescope using Astrophysics Focused Telescope Assets (WFIRST-AFTA) continues to evolve as each design cycle is analyzed. In 2012, two Hubble sized (2.4 m diameter) telescopes were donated to NASA from elsewhere in the Federal Government. NASA began investigating potential uses for these telescopes and identified WFIRST as a mission to benefit from these assets. With an updated, deeper, and sharper field of view than previous design iterations with a smaller telescope, the optical designs of the WFIRST instruments were updated and the mechanical and thermal designs evolved around the new optical layout. Beginning with Design Cycle 3, significant analysis efforts yielded a design and model that could be evaluated for Structural-Thermal-Optical-Performance (STOP) purposes for the Wide Field Imager (WFI) and provided the basis for evaluating the high level observatory requirements. Development of the Cycle 3 thermal model provided some valuable analysis lessons learned and established best practices for future design cycles. However, the Cycle 3 design did include some major liens and evolving requirements which were addressed in the Cycle 4 Design. Some of the design changes are driven by requirements changes, while others are optimizations or solutions to liens from previous cycles. Again in Cycle 4, STOP analysis was performed and further insights into the overall design were gained leading to the Cycle 5 design effort currently underway. This paper seeks to capture the thermal design evolution, with focus on major design drivers, key decisions and their rationale, and lessons learned as the design evolved.

  2. Design Evolution of the Wide Field Infrared Survey Telescope using Astrophysics Focused Telescope Assets (WFIRST-AFTA) and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Peabody, Hume; Peters, Carlton; Rodriguez, Juan; McDonald, Carson; Content, David A.; Jackson, Cliff

    2015-01-01

    The design of the Wide Field Infrared Survey Telescope using Astrophysics Focused Telescope Assets (WFIRST-AFTA) continues to evolve as each design cycle is analyzed. In 2012, two Hubble sized (2.4 m diameter) telescopes were donated to NASA from elsewhere in the Federal Government. NASA began investigating potential uses for these telescopes and identified WFIRST as a mission to benefit from these assets. With an updated, deeper, and sharper field of view than previous design iterations with a smaller telescope, the optical designs of the WFIRST instruments were updated and the mechanical and thermal designs evolved around the new optical layout. Beginning with Design Cycle 3, significant analysis efforts yielded a design and model that could be evaluated for Structural-Thermal-Optical-Performance (STOP) purposes for the Wide Field Imager (WFI) and provided the basis for evaluating the high level observatory requirements. Development of the Cycle 3 thermal model provided some valuable analysis lessons learned and established best practices for future design cycles. However, the Cycle 3 design did include some major liens and evolving requirements which were addressed in the Cycle 4 Design. Some of the design changes are driven by requirements changes, while others are optimizations or solutions to liens from previous cycles. Again in Cycle 4, STOP analysis was performed and further insights into the overall design were gained leading to the Cycle 5 design effort currently underway. This paper seeks to capture the thermal design evolution, with focus on major design drivers, key decisions and their rationale, and lessons learned as the design evolved.

  3. The Design of a Survey Instrument to Aid in Identifying Sex Related Barriers to Employment and the Administration of That Instrument to Rural and Urban Employers.

    ERIC Educational Resources Information Center

    Jones, B. Dolores; Mook, Corena

    A project was conducted to design a survey instrument that would help in identifying sex-related barriers to employment and to administer that instrument to employers in both rural and urban counties of Kansas. It was projected that the data derived from the survey could be used to aid in designing vocational education methods and techniques.…

  4. Design and methodology of a mixed methods follow-up study to the 2014 Ghana Demographic and Health Survey

    PubMed Central

    Staveteig, Sarah; Aryeetey, Richmond; Anie-Ansah, Michael; Ahiadeke, Clement; Ortiz, Ladys

    2017-01-01

    ABSTRACT Background: The intended meaning behind responses to standard questions posed in large-scale health surveys is not always well understood. Systematic follow-up studies, particularly those which pose a few repeated questions followed by open-ended discussions, are well positioned to gauge stability and consistency of data and to shed light on the intended meaning behind survey responses. Such follow-up studies require extensive coordination and face challenges in protecting respondent confidentiality during the process of recontacting and reinterviewing participants. Objectives: We describe practical field strategies for undertaking a mixed methods follow-up study during a large-scale health survey. Methods: The study was designed as a mixed methods follow-up study embedded within the 2014 Ghana Demographic and Health Survey (GDHS). The study was implemented in 13 clusters. Android tablets were used to import reference data from the parent survey and to administer the questionnaire, which asked a mixture of closed- and open-ended questions on reproductive intentions, decision-making, and family planning. Results: Despite a number of obstacles related to recontacting respondents and concern about respondent fatigue, over 92 percent of the selected sub-sample were successfully recontacted and reinterviewed; all consented to audio recording. A confidential linkage between GDHS data, follow-up tablet data, and audio transcripts was successfully created for the purpose of analysis. Conclusions: We summarize the challenges in follow-up study design, including ethical considerations, sample size, auditing, filtering, successful use of tablets, and share lessons learned for future such follow-up surveys. PMID:28145817

  5. Genetic expectations of quantitative trait loci main and interaction effects obtained with the triple testcross design and their relevance for the analysis of heterosis.

    PubMed

    Melchinger, A E; Utz, H F; Schön, C C

    2008-04-01

    Interpretation of experimental results from quantitative trait loci (QTL) mapping studies on the predominant type of gene action can be severely affected by the choice of statistical model, experimental design, and provision of epistasis. In this study, we derive quantitative genetic expectations of (i) QTL effects obtained from one-dimensional genome scans with the triple testcross (TTC) design and (ii) pairwise interactions between marker loci using two-way analyses of variance (ANOVA) under the F(2)- and the F(infinity)-metric model. The theoretical results show that genetic expectations of QTL effects estimated with the TTC design are complex, comprising both main and epistatic effects, and that genetic expectations of two-way marker interactions are not straightforward extensions of effects estimated in one-dimensional scans. We also demonstrate that the TTC design can partially overcome the limitations of the design III in separating QTL main effects and their epistatic interactions in the analysis of heterosis and that dominance x additive epistatic interactions of individual QTL with the genetic background can be estimated with a one-dimensional genome scan. Furthermore, we present genetic expectations of variance components for the analysis of TTC progeny tested in a split-plot design, assuming digenic epistasis and arbitrary linkage.

  6. A nationwide population-based cross-sectional survey of health-related quality of life in patients with myeloproliferative neoplasms in Denmark (MPNhealthSurvey): survey design and characteristics of respondents and nonrespondents

    PubMed Central

    Brochmann, Nana; Flachs, Esben Meulengracht; Christensen, Anne Illemann; Andersen, Christen Lykkegaard; Juel, Knud; Hasselbalch, Hans Carl; Zwisler, Ann-Dorthe

    2017-01-01

    Objective The Department of Hematology, Zealand University Hospital, Denmark, and the National Institute of Public Health, University of Southern Denmark, created the first nationwide, population-based, and most comprehensive cross-sectional health-related quality of life (HRQoL) survey of patients with myeloproliferative neoplasms (MPNs). In Denmark, all MPN patients are treated in public hospitals, and the treatments they receive are free of charge. MPN patients therefore receive the best available treatment, provided it is suitable for them and they wish to receive it. The aims of this article are to describe the survey design and the characteristics of respondents and nonrespondents. Material and methods Individuals with MPN diagnoses registered in the Danish National Patient Register (NPR) were invited to participate. The registers of the Danish Civil Registration System and Statistics Denmark provided information regarding demographics. The survey contained 120 questions: validated patient-reported outcome (PRO) questionnaires and additional questions addressing lifestyle. Results A total of 4,704 individuals were registered with MPN diagnoses in the NPR, of whom 4,236 were eligible for participation and 2,613 (62%) responded. Overall, the respondents covered the broad spectrum of MPN patients, but patients 70–79 years old, living with someone, of a Danish/Western ethnicity, and with a higher level of education exhibited the highest response rate. Conclusion A nationwide, population-based, and comprehensive HRQoL survey of MPN patients in Denmark was undertaken (MPNhealthSurvey). We believe that the respondents broadly represent the MPN population in Denmark. However, the differences between respondents and nonrespondents have to be taken into consideration when examining PROs from the respondents. The results of the investigation of the respondents’ HRQoL in this survey will follow in future articles. PMID:28280390

  7. An Overview of Wide-Field-Of-View Optical Designs for Survey Telescopes

    DTIC Science & Technology

    2010-09-01

    exist with each having its own strengths and limitations. 1. Introduction Wide-field astronomical sky survey work dates back to the mid 1840s... shortly after the invention of photography [1]. Early survey instruments were nothing more than cameras and the optics were early camera lenses... Some of the better known optics were made by Voigtlander and Petzval [2]. As the technical characteristics of photography improved, larger

  8. [New design of the Health Survey of Catalonia (Spain, 2010-2014): a step forward in health planning and evaluation].

    PubMed

    Alcañiz-Zanón, Manuela; Mompart-Penina, Anna; Guillén-Estany, Montserrat; Medina-Bustos, Antonia; Aragay-Barbany, Josep M; Brugulat-Guiteras, Pilar; Tresserras-Gaju, Ricard

    2014-01-01

    This article presents the genesis of the Health Survey of Catalonia (Spain, 2010-2014) with its semiannual subsamples and explains the basic characteristics of its multistage sampling design. In comparison with previous surveys, the organizational advantages of this new statistical operation include rapid data availability and the ability to continuously monitor the population. The main benefits are timeliness in the production of indicators and the possibility of introducing new topics through the supplemental questionnaire as a function of needs. Limitations consist of the complexity of the sample design and the lack of longitudinal follow-up of the sample. Suitable sampling weights for each specific subsample are necessary for any statistical analysis of micro-data. Accuracy in the analysis of territorial disaggregation or population subgroups increases if annual samples are accumulated.
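The need for subsample-specific sampling weights mentioned above can be shown with a toy two-stratum example (all shares and outcome values below are invented):

```python
def weighted_mean(values, weights):
    """Design-weighted mean: sum(w_i * y_i) / sum(w_i)."""
    return sum(w * v for v, w in zip(values, weights)) / sum(weights)

# Toy design: stratum A is 90% of the population (outcome 0.2) but only
# half the sample; stratum B is 10% of the population (outcome 0.8) and
# was deliberately oversampled to half the sample.
sample = [("A", 0.2)] * 50 + [("B", 0.8)] * 50
# Weight = stratum population share / stratum sample share.
weight = {"A": 0.9 / 0.5, "B": 0.1 / 0.5}

unweighted = sum(v for _, v in sample) / len(sample)      # 0.5, biased
weighted = weighted_mean([v for _, v in sample],
                         [weight[s] for s, _ in sample])  # 0.26, unbiased
print(unweighted, weighted)
```

The unweighted mean (0.5) is pulled toward the oversampled stratum, while the design-weighted mean recovers the true population value of 0.9 × 0.2 + 0.1 × 0.8 = 0.26; hence analyses of each subsample need that subsample's own weights.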

  9. Use of mechanistic simulations as a quantitative risk-ranking tool within the quality by design framework.

    PubMed

    Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G

    2014-11-20

    The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on computer simulation data the process failure mode and effects analysis of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation in the risk management workflow leading to an objective and quantitative risk assessment.
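The coefficient of variation used above as the coating-mass uniformity metric is straightforward to compute; the per-tablet masses below are invented for illustration:

```python
from statistics import mean, stdev

def coefficient_of_variation(masses):
    """CV = sample standard deviation / mean; lower values indicate a more
    uniform coating mass across tablets."""
    return stdev(masses) / mean(masses)

# Invented per-tablet coating masses (mg) from two hypothetical settings.
well_mixed = [10.1, 9.9, 10.0, 10.2, 9.8]
poorly_mixed = [6.0, 14.0, 9.0, 12.0, 9.0]
print(coefficient_of_variation(well_mixed))    # ~0.016
print(coefficient_of_variation(poorly_mixed))  # ~0.31
```

Both settings have the same mean coating mass (10.0 mg), so the CV, not the mean, is what discriminates a uniform process from a risky one in the reassessment step.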

  10. Design and Evaluation of Digital Learning Material to Support Acquisition of Quantitative Problem-Solving Skills within Food Chemistry

    ERIC Educational Resources Information Center

    Diederen, Julia; Gruppen, Harry; Hartog, Rob; Voragen, Alphons G. J.

    2005-01-01

    One of the modules in the course Food Chemistry at Wageningen University (Wageningen, The Netherlands) focuses on quantitative problem-solving skills related to chemical reactions. The intended learning outcomes of this module are firstly, to be able to translate practical food chemistry related problems into mathematical equations and to solve…

  11. A survey of ground operations tools developed to simulate the pointing of space telescopes and the design for WISE

    NASA Technical Reports Server (NTRS)

    Fabinsky, Beth

    2006-01-01

    WISE, the Wide Field Infrared Survey Explorer, is scheduled for launch in June 2010. The mission operations system for WISE requires a software modeling tool to help plan, integrate and simulate all spacecraft pointing and verify that no attitude constraints are violated. In the course of developing the requirements for this tool, an investigation was conducted into the design of similar tools for other space-based telescopes. This paper summarizes the ground software and processes used to plan and validate pointing for a selection of space telescopes; with this information as background, the design for WISE is presented.

  12. Design of a Mars Airplane Propulsion System for the Aerial Regional-Scale Environmental Survey (ARES) Mission Concept

    NASA Technical Reports Server (NTRS)

    Kuhl, Christopher A.

    2008-01-01

    The Aerial Regional-Scale Environmental Survey (ARES) is a Mars exploration mission concept that utilizes a rocket propelled airplane to take scientific measurements of atmospheric, surface, and subsurface phenomena. The liquid rocket propulsion system design has matured through several design cycles and trade studies since the inception of the ARES concept in 2002. This paper describes the process of selecting a bipropellant system over other propulsion system options, and provides details on the rocket system design, thrusters, propellant tank and PMD design, propellant isolation, and flow control hardware. The paper also summarizes computer model results of thruster plume interactions and simulated flight performance. The airplane has a 6.25 m wingspan with a total wet mass of 185 kg and has the ability to fly over 600 km through the atmosphere of Mars with 45 kg of MMH / MON3 propellant.

  13. Design of Reconnaissance Helicopter Electromagnetic and Magnetic Geophysical Surveys of the North Platte River and Lodgepole Creek, Nebraska

    NASA Astrophysics Data System (ADS)

    Smith, B. D.; Cannia, J. C.; Abraham, J. D.

    2009-12-01

    An innovative flight line layout using widely separated lines was used for frequency domain helicopter electromagnetic (HEM) surveys in 2008 and 2009 in the Panhandle of western Nebraska. The HEM survey design was developed as part of a joint hydrologic study by the North Platte Natural Resource District, South Platte Natural Resource District, UNL-Conservation and Survey Division, and U.S. Geological Survey to improve the understanding of relationships between surface water and groundwater systems critical to developing groundwater flow models used in water resources management programs. Use of HEM methods for hydrologic mapping had been demonstrated by HEM surveys conducted in 2007 of sites in the glaciated Platte River Basin in eastern Nebraska. These surveys covered township-scale areas with flight lines laid out in blocks where the lines were spaced about 270m apart. The HEM successfully mapped the complex 3D geometry of shallow sand and gravel aquifers through and within conductive till to a depth of about 40m in a total area of about 680 km2 (263 mi2). Current groundwater flow models in western Nebraska include the Cooperative Hydrologic Study (COHYST), run by a consortium of state agencies, which is tasked to develop scientifically supportable hydrologic databases, analyses, and models, and the North Platte River Valley Optimization Model (NPRVOM). The COHYST study area, about 75,000 km2 (29,000 mi2), includes the Platte River Basin from the Nebraska - Wyoming border to Lincoln. Considering the large area of the groundwater models, the USGS decided in collaboration with the NRD to use a more reconnaissance-style layout for the 2008 HEM survey which encompassed about 21,000 km2 (8,000 mi2). A reconnaissance-type HEM survey is made possible due to technical capabilities of applicable HEM systems and due to the level of hydrogeologic information available in the NRD. The particular capabilities of the HEM system are careful calibration, low drift, low noise

  14. SDSS-IV MaNGA IFS Galaxy Survey—Survey Design, Execution, and Initial Data Quality

    NASA Astrophysics Data System (ADS)

    Yan, Renbin; Bundy, Kevin; Law, David R.; Bershady, Matthew A.; Andrews, Brett; Cherinka, Brian; Diamond-Stanic, Aleksandar M.; Drory, Niv; MacDonald, Nicholas; Sánchez-Gallego, José R.; Thomas, Daniel; Wake, David A.; Weijmans, Anne-Marie; Westfall, Kyle B.; Zhang, Kai; Aragón-Salamanca, Alfonso; Belfiore, Francesco; Bizyaev, Dmitry; Blanc, Guillermo A.; Blanton, Michael R.; Brownstein, Joel; Cappellari, Michele; D'Souza, Richard; Emsellem, Eric; Fu, Hai; Gaulme, Patrick; Graham, Mark T.; Goddard, Daniel; Gunn, James E.; Harding, Paul; Jones, Amy; Kinemuchi, Karen; Li, Cheng; Li, Hongyu; Maiolino, Roberto; Mao, Shude; Maraston, Claudia; Masters, Karen; Merrifield, Michael R.; Oravetz, Daniel; Pan, Kaike; Parejko, John K.; Sanchez, Sebastian F.; Schlegel, David; Simmons, Audrey; Thanjavur, Karun; Tinker, Jeremy; Tremonti, Christy; van den Bosch, Remco; Zheng, Zheng

    2016-12-01

    The MaNGA Survey (Mapping Nearby Galaxies at Apache Point Observatory) is one of three core programs in the Sloan Digital Sky Survey IV. It is obtaining integral field spectroscopy for 10,000 nearby galaxies at a spectral resolution of R ˜ 2000 from 3622 to 10354 Å. The design of the survey is driven by a set of science requirements on the precision of estimates of the following properties: star formation rate surface density, gas metallicity, stellar population age, metallicity, and abundance ratio, and their gradients; stellar and gas kinematics; and enclosed gravitational mass as a function of radius. We describe how these science requirements set the depth of the observations and dictate sample selection. The majority of targeted galaxies are selected to ensure uniform spatial coverage in units of effective radius (Re) while maximizing spatial resolution. About two-thirds of the sample is covered out to 1.5 Re (Primary sample), and one-third of the sample is covered to 2.5 Re (Secondary sample). We describe the survey execution with details that would be useful in the design of similar future surveys. We also present statistics on the achieved data quality, specifically the point-spread function, sampling uniformity, spectral resolution, sky subtraction, and flux calibration. For our Primary sample, the median r-band signal-to-noise ratio is ˜70 per 1.4 Å pixel for spectra stacked between 1 Re and 1.5 Re. Measurements of various galaxy properties from the first-year data show that we are meeting or exceeding the defined requirements for the majority of our science goals.

  15. Essential Steps for Web Surveys: A Guide to Designing, Administering and Utilizing Web Surveys for University Decision-Making. Professional File. Number 102, Winter 2006

    ERIC Educational Resources Information Center

    Cheskis-Gold, Rena; Loescher, Ruth; Shepard-Rabadam, Elizabeth; Carroll, Barbara

    2006-01-01

    During the past few years, several Harvard paper surveys were converted to Web surveys. These were high-profile surveys endorsed by the Provost and the Dean of the College, and covered major portions of the university population (all undergraduates, all graduate students, tenured and non-tenured faculty). When planning for these surveys started in…

  16. Geological and seismological survey for new design-basis earthquake ground motion of Kashiwazaki-Kariwa NPS

    NASA Astrophysics Data System (ADS)

    Takao, M.; Mizutani, H.

    2009-05-01

    At about 10:13 on July 16, 2007, a strong earthquake named the 'Niigata-ken Chuetsu-oki Earthquake', of Mj 6.8 on the Japan Meteorological Agency's scale, occurred offshore Niigata prefecture in Japan. However, all of the nuclear reactors at Kashiwazaki-Kariwa Nuclear Power Station (KKNPS) in Niigata prefecture, operated by Tokyo Electric Power Company, shut down safely. In other words, the automatic safety functions of shutdown, cooling and containment worked as designed immediately after the earthquake. During the earthquake, the peak acceleration of the ground motion exceeded the design-basis ground motion (DBGM), but the force the earthquake applied to safety-significant facilities was about the same as or less than the design basis taken into account as static seismic force. In order to reassess the safety of the nuclear power plant, we evaluated a new DBGM after conducting geomorphological, geological, geophysical and seismological surveys and analyses. [Geomorphological, Geological and Geophysical survey] In the land area, aerial photograph interpretation was performed at least within a 30 km radius to extract landforms that could possibly be tectonic reliefs as a geomorphological survey. After that, geological reconnaissance was conducted to confirm whether the extracted landforms are tectonic reliefs or not. In particular, we carefully investigated the Nagaoka Plain Western Boundary Fault Zone (NPWBFZ), which consists of the Kakuda-Yahiko, Kihinomiya and Katakai faults, because NPWBFZ is one of the active faults in Japan with the potential for an Mj 8 class event. In addition to the geological survey, seismic reflection prospecting of approximately 120 km in total length was completed to evaluate the geological structure of the faults and to assess the continuity of the component faults of NPWBFZ. As a result of the geomorphological, geological and geophysical surveys, we evaluated that the three component faults of NPWBFZ are independent of each other from the

  17. Is the linear modeling technique good enough for optimal form design? A comparison of quantitative analysis models.

    PubMed

    Lin, Yang-Cheng; Yeh, Chung-Hsing; Wang, Chen-Cheng; Wei, Chun-Chun

    2012-01-01

    How to design highly reputable and hot-selling products is an essential issue in product design. Whether consumers choose a product depends largely on their perception of the product image. A consumer-oriented design approach presented in this paper helps product designers incorporate consumers' perceptions of product forms in the design process. The consumer-oriented design approach uses quantification theory type I (QTTI), grey prediction (the linear modeling technique), and neural networks (the nonlinear modeling technique) to determine the optimal form combination of product design for matching a given product image. An experimental study based on the concept of Kansei Engineering is conducted to collect numerical data for examining the relationship between consumers' perception of product image and product form elements of personal digital assistants (PDAs). The results of the performance comparison show that the QTTI model is good enough to help product designers determine the optimal form combination of product design. Although the PDA form design is used as a case study, the approach is applicable to other consumer products with various design elements and product images. The approach provides an effective mechanism for facilitating the consumer-oriented product design process.
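The core idea can be illustrated with a minimal pure-Python sketch in the spirit of quantification theory type I: score each category of each form element against consumer image ratings, then pick the best category per element. The form elements, categories, and ratings below are invented for illustration; real QTTI fits a dummy-coded linear regression rather than simple category means.

```python
from statistics import mean

# Each sample: (form-element categories of one PDA design, consumer
# rating for a target image word such as "stylish"). All values invented.
samples = [
    ({"shape": "round",  "size": "small"}, 3.1),
    ({"shape": "square", "size": "large"}, 4.2),
    ({"shape": "oval",   "size": "small"}, 2.5),
    ({"shape": "round",  "size": "large"}, 3.8),
    ({"shape": "square", "size": "small"}, 3.6),
    ({"shape": "oval",   "size": "large"}, 3.0),
]

def category_scores(samples, element):
    """Mean image rating for each category of one form element."""
    by_cat = {}
    for attrs, rating in samples:
        by_cat.setdefault(attrs[element], []).append(rating)
    return {cat: mean(ratings) for cat, ratings in by_cat.items()}

# Optimal form combination: the highest-scoring category of each element
best = {}
for element in ("shape", "size"):
    scores = category_scores(samples, element)
    best[element] = max(scores, key=scores.get)
```

With these invented ratings the sketch selects the square, large form as the combination best matching the image word.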

  18. Is the Linear Modeling Technique Good Enough for Optimal Form Design? A Comparison of Quantitative Analysis Models

    PubMed Central

    Lin, Yang-Cheng; Yeh, Chung-Hsing; Wang, Chen-Cheng; Wei, Chun-Chun

    2012-01-01

    How to design highly reputable and hot-selling products is an essential issue in product design. Whether consumers choose a product depends largely on their perception of the product image. A consumer-oriented design approach presented in this paper helps product designers incorporate consumers' perceptions of product forms in the design process. The consumer-oriented design approach uses quantification theory type I (QTTI), grey prediction (the linear modeling technique), and neural networks (the nonlinear modeling technique) to determine the optimal form combination of product design for matching a given product image. An experimental study based on the concept of Kansei Engineering is conducted to collect numerical data for examining the relationship between consumers' perception of product image and product form elements of personal digital assistants (PDAs). The results of the performance comparison show that the QTTI model is good enough to help product designers determine the optimal form combination of product design. Although the PDA form design is used as a case study, the approach is applicable to other consumer products with various design elements and product images. The approach provides an effective mechanism for facilitating the consumer-oriented product design process. PMID:23258961

  19. German health interview and examination survey for adults (DEGS) - design, objectives and implementation of the first data collection wave

    PubMed Central

    2012-01-01

    Background The German Health Interview and Examination Survey for Adults (DEGS) is part of the recently established national health monitoring conducted by the Robert Koch Institute. DEGS combines a nationally representative periodic health survey and a longitudinal study based on follow-up of survey participants. Funding is provided by the German Ministry of Health and supplemented for specific research topics from other sources. Methods/design The first DEGS wave of data collection (DEGS1) extended from November 2008 to December 2011. Overall, 8152 men and women participated. Of these, 3959 persons had already participated in the German National Health Interview and Examination Survey 1998 (GNHIES98), at which time they were 18–79 years of age. Another 4193 persons 18–79 years of age were recruited for DEGS1 in 2008–2011 based on two-stage stratified random sampling from local population registries. Health data and context variables were collected using standardized computer-assisted personal interviews, self-administered questionnaires, and standardized measurements and tests. In order to keep survey results representative of the population aged 18–79 years, results will be weighted by survey-specific weighting factors considering sampling and drop-out probabilities as well as deviations between the design-weighted net sample and German population statistics 2010. Discussion DEGS aims to establish a nationally representative database on the health of adults in Germany. This health data platform will be used for continuous health reporting and health care research. The results will help to support health policy planning and evaluation. Repeated cross-sectional surveys will permit analyses of time trends in morbidity, functional capacity levels, disability, and health risks and resources. Follow-up of study participants will provide the opportunity to study trajectories of health and disability. A special focus lies on chronic diseases including asthma
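The weighting scheme the abstract describes (design weights from sampling probabilities, adjusted for drop-out, then aligned with external population statistics) can be sketched as follows. All group labels, probabilities, and population counts are invented for illustration and are not taken from DEGS1.

```python
# Hypothetical survey-weighting sketch: inverse-probability design
# weights, a nonresponse (drop-out) adjustment, then post-stratification
# so weighted totals match external population margins.

population = {"18-39": 500_000, "40-79": 700_000}  # assumed census counts

# respondents: (age group, inclusion probability, response probability)
respondents = [
    ("18-39", 0.001, 0.8),
    ("18-39", 0.002, 0.5),
    ("40-79", 0.001, 0.7),
    ("40-79", 0.001, 0.9),
]

# base design weight times drop-out (nonresponse) adjustment
weights = [1.0 / (p_incl * p_resp) for _, p_incl, p_resp in respondents]

# rescale within each group so the weighted sample reproduces the
# population statistics
for group, target in population.items():
    idx = [i for i, r in enumerate(respondents) if r[0] == group]
    group_sum = sum(weights[i] for i in idx)
    for i in idx:
        weights[i] *= target / group_sum
```

After the final step the weighted counts in each age group equal the assumed population margins exactly.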

  20. Design and Specification of Optical Bandpass Filters for Hubble Space Telescope (HST) Advanced Camera for Surveys (ACS)

    NASA Technical Reports Server (NTRS)

    Leviton, Douglas B.; Tsevetanov, Zlatan; Woodruff, Bob; Mooney, Thomas A.

    1998-01-01

    Advanced optical bandpass filters for the Hubble Space Telescope (HST) Advanced Camera for Surveys (ACS) have been developed on a filter-by-filter basis through detailed studies which take into account the instrument's science goals, available optical filter fabrication technology, and developments in ACS's charge-coupled-device (CCD) detector technology. These filters include a subset of filters for the Sloan Digital Sky Survey (SDSS) which are optimized for astronomical photometry using today's CCDs. In order for ACS to be truly advanced, these filters must push the state of the art in performance in a number of key areas at the same time. Important requirements for these filters include outstanding transmitted wavefront, high transmittance, uniform transmittance across each filter, spectrally structure-free bandpasses, exceptionally high out-of-band rejection, a high degree of parfocality, and immunity to environmental degradation. These constitute a very stringent set of requirements indeed, especially for filters which are up to 90 mm in diameter. The highly successful paradigm in which final specifications for flight filters were derived through interaction among the ACS Science Team, the instrument designer, the lead optical engineer, and the filter designer and vendor is described. Examples of iterative design trade studies carried out in the context of science needs and budgetary and schedule constraints are presented. An overview of the final design specifications for the ACS bandpass and ramp filters is also presented.

  1. Survey of perceived influence of the conceptual design model of interactive television advertising towards impulse purchase tendency

    NASA Astrophysics Data System (ADS)

    Sarif, Siti Mahfuzah; Omar, Azizah Che; Shiratuddin, Norshuhada

    2016-08-01

    With the proliferation of technology-assisted shopping, there is growing evidence that impulse buying is an emerging phenomenon, which has been the focus of this study. The literature indicates that studies of impulse purchase in interactive television (iTV) advertising are scarce. Most existing impulse purchase elements focus on traditional retail stores, website advertising, and traditional TV advertising, but not on iTV advertising. Therefore, through a systematic process, a design model for developing iTV advertising with influence towards impulse purchase tendency was developed and tested in this study. The design model, named iTVAdIP, comprises three main components: technology, impulse purchase components, and development process. This paper describes the survey, which measures the influence of the iTVAdIP design model towards impulse purchase tendency. Thirty-seven potential advertising designers were involved in the survey. The results indicate that iTVAdIP is practical and workable for developing iTV advertisements that could influence consumers to buy the advertised product.

  2. Coherent Power Analysis in Multi-Level Studies Using Design Parameters from Surveys

    ERIC Educational Resources Information Center

    Rhoads, Christopher

    2016-01-01

    Current practice for conducting power analyses in hierarchical trials using survey-based ICC and effect size estimates may misestimate power because ICCs are not adjusted to account for treatment effect heterogeneity. Results presented in Table 1 show that the necessary adjustments can be quite large or quite small. Furthermore, power…
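The role the ICC plays in such power analyses can be made concrete with the standard design-effect formula, Deff = 1 + (m − 1)ρ, which converts a total sample size into an effective one. The sketch below uses invented numbers and is not drawn from the article's Table 1.

```python
# Design-effect adjustment used in power analysis for multi-level
# (cluster-randomized) designs. All numbers are illustrative only.

def design_effect(m, rho):
    """Deff = 1 + (m - 1) * rho for m units per cluster and ICC rho."""
    return 1.0 + (m - 1) * rho

def effective_n(n_total, m, rho):
    """Effective sample size once within-cluster correlation is accounted for."""
    return n_total / design_effect(m, rho)

n_total = 2000  # individuals in the trial
m = 25          # individuals per cluster
rho = 0.10      # survey-based ICC estimate

deff = design_effect(m, rho)          # 3.4
n_eff = effective_n(n_total, m, rho)  # well under a third of n_total
```

Even a modest ICC of 0.10 with 25 members per cluster shrinks the effective sample substantially, which is why mis-stated ICCs propagate directly into mis-stated power.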

  3. The Outer Solar System Origins Survey. I. Design and First-Quarter Discoveries

    NASA Technical Reports Server (NTRS)

    Bannister, Michele T.; Kavelaars, J. J.; Petit, Jean-Marc; Gladman, Brett J.; Gwyn, Stephen D. J.; Chen, Ying-Tung; Volk, Kathryn; Alexandersen, Mike; Benecchi, Susan D.; Delsanti, Audrey; Fraser, Wesley C.; Granvik, Mikael; Grundy, Will M.; Guilbert-Lepoutre, Aurelie; Hestroffer, Daniel; Ip, Wing-Huen; Jakubik, Marian; Jones, R. Lynne; Kaib, Nathan; Kavelaars, Catherine F.; Lacerda, Pedro; Lawler, Samantha; Lehner, Matthew; Lin, Hsing Wen; Lister, Tim; Lykawka, Patryk Sofia; Monty, Stephanie; Marsset, Michael; Murray-Clay, Ruth; Noll, Keith

    2016-01-01

    We report the discovery, tracking, and detection circumstances for 85 trans-Neptunian objects (TNOs) from the first 42 square degrees of the Outer Solar System Origins Survey. This ongoing r-band solar system survey uses the 0.9 square degree field of view MegaPrime camera on the 3.6 meter Canada-France-Hawaii Telescope. Our orbital elements for these TNOs are precise to a fractional semimajor axis uncertainty of less than 0.1 percent. We achieve this precision in just two oppositions, as compared to the normal three to five oppositions, via a dense observing cadence and innovative astrometric technique. These discoveries are free of ephemeris bias, a first for large trans-Neptunian surveys. We also provide the necessary information to enable models of TNO orbital distributions to be tested against our TNO sample. We confirm the existence of a cold "kernel" of objects within the main cold classical Kuiper Belt and infer the existence of an extension of the "stirred" cold classical Kuiper Belt to at least several au beyond the 2:1 mean motion resonance with Neptune. We find that the population model of Petit et al. remains a plausible representation of the Kuiper Belt. The full survey, to be completed in 2017, will provide an exquisitely characterized sample of important resonant TNO populations, ideal for testing models of giant planet migration during the early history of the solar system.

  4. The Outer Solar System Origins Survey. I. Design and First-quarter Discoveries

    NASA Astrophysics Data System (ADS)

    Bannister, Michele T.; Kavelaars, J. J.; Petit, Jean-Marc; Gladman, Brett J.; Gwyn, Stephen D. J.; Chen, Ying-Tung; Volk, Kathryn; Alexandersen, Mike; Benecchi, Susan D.; Delsanti, Audrey; Fraser, Wesley C.; Granvik, Mikael; Grundy, Will M.; Guilbert-Lepoutre, Aurélie; Hestroffer, Daniel; Ip, Wing-Huen; Jakubik, Marian; Jones, R. Lynne; Kaib, Nathan; Kavelaars, Catherine F.; Lacerda, Pedro; Lawler, Samantha; Lehner, Matthew J.; Lin, Hsing Wen; Lister, Tim; Lykawka, Patryk Sofia; Monty, Stephanie; Marsset, Michael; Murray-Clay, Ruth; Noll, Keith S.; Parker, Alex; Pike, Rosemary E.; Rousselot, Philippe; Rusk, David; Schwamb, Megan E.; Shankman, Cory; Sicardy, Bruno; Vernazza, Pierre; Wang, Shiang-Yu

    2016-09-01

    We report the discovery, tracking, and detection circumstances for 85 trans-Neptunian objects (TNOs) from the first 42 deg2 of the Outer Solar System Origins Survey. This ongoing r-band solar system survey uses the 0.9 deg2 field of view MegaPrime camera on the 3.6 m Canada-France-Hawaii Telescope. Our orbital elements for these TNOs are precise to a fractional semimajor axis uncertainty <0.1%. We achieve this precision in just two oppositions, as compared to the normal three to five oppositions, via a dense observing cadence and innovative astrometric technique. These discoveries are free of ephemeris bias, a first for large trans-Neptunian surveys. We also provide the necessary information to enable models of TNO orbital distributions to be tested against our TNO sample. We confirm the existence of a cold “kernel” of objects within the main cold classical Kuiper Belt and infer the existence of an extension of the “stirred” cold classical Kuiper Belt to at least several au beyond the 2:1 mean motion resonance with Neptune. We find that the population model of Petit et al. remains a plausible representation of the Kuiper Belt. The full survey, to be completed in 2017, will provide an exquisitely characterized sample of important resonant TNO populations, ideal for testing models of giant planet migration during the early history of the solar system.

  5. The Results of the National Heritage Language Survey: Implications for Teaching, Curriculum Design, and Professional Development

    ERIC Educational Resources Information Center

    Carreira, Maria; Kagan, Olga

    2011-01-01

    This article reports on a survey of heritage language learners (HLLs) across different heritage languages (HLs) and geographic regions in the United States. A general profile of HLLs emerges as a student who (1) acquired English in early childhood, after acquiring the HL; (2) has limited exposure to the HL outside the home; (3) has relatively…

  6. Special Alumni Survey; Design, Coding and Data on Earnings. First Report 1970.

    ERIC Educational Resources Information Center

    Witmer, David R.

    In the belief that college and university programs cannot be meaningfully improved in the absence of information about their outcomes and effects, the Wisconsin State Universities System conducted a special alumni survey to obtain 1968 data on the occupation, salary, income and continuing education of a random sample of persons who had attended…

  7. Site study plan for EDBH (Engineering Design Boreholes) seismic surveys, Deaf Smith County site, Texas: Revision 1

    SciTech Connect

    Hume, H.

    1987-12-01

    This site study plan describes seismic reflection surveys to run north-south and east-west across the Deaf Smith County site, intersecting near the Engineering Design Boreholes (EDBH). Both conventional and shallow high-resolution surveys will be run. The field program has been designed to acquire subsurface geologic and stratigraphic data to address information/data needs resulting from Federal and State regulations and Repository program requirements. The data acquired by the conventional surveys will be common-depth-point seismic reflection data optimized for reflection events that indicate geologic structure near the repository horizon. The data will also resolve the basement structure and shallow reflection events up to about the top of the evaporite sequence. Field acquisition includes a testing phase to check and select parameters and a production phase. The field data will be subjected immediately to conventional data processing and interpretation to determine whether there are any anomalous structural or stratigraphic conditions that could affect the choice of the EDBH sites. After the EDBHs have been drilled and logged, including vertical seismic profiling, the data will be reprocessed and reinterpreted for detailed structural and stratigraphic information to guide shaft development. The shallow high-resolution seismic reflection lines will be run along the same alignments, but the lines will be shorter and limited to the immediate vicinity of the EDBH sites. These lines are planned to detect faults or thick channel sands that may be present at the EDBH sites. 23 refs., 7 figs., 5 tabs.

  8. Survey of editors and reviewers of high-impact psychology journals: statistical and research design problems in submitted manuscripts.

    PubMed

    Harris, Alex; Reeder, Rachelle; Hyun, Jenny

    2011-01-01

    The authors surveyed 21 editors and reviewers from major psychology journals to identify and describe the statistical and design errors they encounter most often and to get their advice regarding prevention of these problems. Content analysis of the text responses revealed themes in 3 major areas: (a) problems with research design and reporting (e.g., lack of an a priori power analysis, lack of congruence between research questions and study design/analysis, failure to adequately describe statistical procedures); (b) inappropriate data analysis (e.g., improper use of analysis of variance, too many statistical tests without adjustments, inadequate strategy for addressing missing data); and (c) misinterpretation of results. If researchers attended to these common methodological and analytic issues, the scientific quality of manuscripts submitted to high-impact psychology journals might be significantly improved.

  9. Design of a detection survey for Ostreid herpesvirus-1 using hydrodynamic dispersion models to determine epidemiological units.

    PubMed

    Pande, Anjali; Acosta, Hernando; Brangenberg, Naya Alexis; Keeling, Suzanne Elizabeth

    2015-04-01

    Using Ostreid herpesvirus-1 (OsHV-1) as a case study, this paper considers a survey design methodology for an aquatic animal pathogen that incorporates the concept of biologically independent epidemiological units. Hydrodynamically-modelled epidemiological units are used to divide marine areas into sensible sampling units for detection surveys of waterborne diseases. In the aquatic environment it is difficult to manage disease at the level of the individual animal, so management practices are often aimed at a group of animals sharing a similar risk. Epidemiological units define such groups by a similar probability of exposure, based on the modelled potential spread of a viral particle via coastal currents, and can thereby help inform management decisions.
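One way to operationalize this idea is to treat sites as connected whenever the modelled particle-transfer probability between them exceeds a threshold, and to take connected components of the resulting graph as epidemiological units. The sketch below is a hypothetical illustration; the site names, dispersion probabilities, and threshold are all invented, not taken from the paper's hydrodynamic model.

```python
# Hypothetical grouping of shellfish sites into epidemiological units
# as connected components of a hydrodynamic-connectivity graph.

# modelled probability that a viral particle drifts between site pairs
dispersion = {
    ("A", "B"): 0.40, ("B", "C"): 0.25,
    ("C", "D"): 0.01, ("D", "E"): 0.30,
}
sites = {"A", "B", "C", "D", "E"}
THRESHOLD = 0.05  # below this, sites are treated as hydrodynamically isolated

# adjacency under the threshold (exposure treated as symmetric here)
adj = {s: set() for s in sites}
for (a, b), p in dispersion.items():
    if p >= THRESHOLD:
        adj[a].add(b)
        adj[b].add(a)

def epidemiological_units(sites, adj):
    """Connected components of the site graph, via depth-first search."""
    units, seen = [], set()
    for start in sorted(sites):
        if start in seen:
            continue
        stack, unit = [start], set()
        while stack:
            s = stack.pop()
            if s in unit:
                continue
            unit.add(s)
            stack.extend(adj[s] - unit)
        seen |= unit
        units.append(unit)
    return units

units = epidemiological_units(sites, adj)
```

With these invented numbers, sites A–C form one unit and D–E another, so a detection survey would sample each unit separately rather than treating the whole coastline as one population.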

  10. Optical Design of the Camera for Transiting Exoplanet Survey Satellite (TESS)

    NASA Technical Reports Server (NTRS)

    Chrisp, Michael; Clark, Kristin; Primeau, Brian; Dalpiaz, Michael; Lennon, Joseph

    2015-01-01

    The optical design of the wide field of view refractive camera, 34 degrees diagonal field, for the TESS payload is described. This fast f/1.4 cryogenic camera, operating at -75 C, has no vignetting for maximum light gathering within the size and weight constraints. Four of these cameras capture full frames of star images for photometric searches of planet crossings. The optical design evolution, from the initial Petzval design, took advantage of Forbes aspheres to develop a hybrid design form. This maximized the correction from the two aspherics resulting in a reduction of average spot size by sixty percent in the final design. An external long wavelength pass filter was replaced by an internal filter coating on a lens to save weight, and has been fabricated to meet the specifications. The stray light requirements were met by an extended lens hood baffle design, giving the necessary off-axis attenuation.

  11. Spatial scales of variation in lichens: implications for sampling design in biomonitoring surveys.

    PubMed

    Giordani, Paolo; Brunialti, Giorgio; Frati, Luisa; Incerti, Guido; Ianesch, Luca; Vallone, Emanuele; Bacaro, Giovanni; Maccherini, Simona

    2013-02-01

    The variability of biological data is a main constraint affecting the quality and reliability of lichen biomonitoring surveys for estimating the effects of atmospheric pollution. Although most epiphytic lichen bioindication surveys focus on between-site differences at the landscape level, associated with the large-scale effects of atmospheric pollution, current protocols are based on multilevel sampling, thus adding further sources of variation and affecting the error budget. We test the hypothesis that assemblages of lichen communities vary at each spatial scale examined, in order to determine which scales should be included in future monitoring studies. We compared four sites in Italy, along gradients of atmospheric pollution and climate, to test the partitioning of the variance components of lichen diversity across spatial scales (from trunks to landscapes). Despite environmental heterogeneity, we observed comparable spatial variance. However, residual variation often exceeded between-plot variability, leading to biased estimation of atmospheric pollution effects.
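The plot-versus-residual comparison at the heart of the abstract is a variance-component partition; the one-way balanced case can be sketched with standard ANOVA estimators. The lichen diversity values below are invented for illustration and the real study uses a deeper nesting (trunks within plots within sites within landscapes).

```python
import statistics

# Illustrative one-way partition of variance: between plots vs the
# residual (within-plot) component. All diversity values are invented.
plots = {
    "plot_A": [12, 14, 13, 15],
    "plot_B": [22, 20, 23, 21],
    "plot_C": [17, 16, 18, 17],
}
n = 4  # trunks sampled per plot (balanced design)

plot_means = [statistics.mean(v) for v in plots.values()]
ms_within = statistics.mean(statistics.variance(v) for v in plots.values())
ms_between = n * statistics.variance(plot_means)

# ANOVA (method-of-moments) estimators of the variance components
sigma2_within = ms_within                      # residual, within plots
sigma2_between = (ms_between - ms_within) / n  # between plots
```

When `sigma2_within` rivals or exceeds `sigma2_between`, as the survey found for its residual term, between-plot contrasts (and hence pollution-effect estimates) carry substantial noise.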

  12. Evaluating cost-efficiency and accuracy of hunter harvest survey designs

    USGS Publications Warehouse

    Lukacs, P.M.; Gude, J.A.; Russell, R.E.; Ackerman, B.B.

    2011-01-01

    Effective management of harvested wildlife often requires accurate estimates of the number of animals harvested annually by hunters. A variety of techniques exist to obtain harvest data, such as hunter surveys, check stations, mandatory reporting requirements, and voluntary reporting of harvest. Agencies responsible for managing harvested wildlife such as deer (Odocoileus spp.), elk (Cervus elaphus), and pronghorn (Antilocapra americana) are challenged with balancing the cost of data collection versus the value of the information obtained. We compared precision, bias, and relative cost of several common strategies, including hunter self-reporting and random sampling, for estimating hunter harvest using a realistic set of simulations. Self-reporting with a follow-up survey of hunters who did not report produces the best estimate of harvest in terms of precision and bias, but it is also, by far, the most expensive technique. Self-reporting with no follow-up survey risks very large bias in harvest estimates, and the cost increases with increased response rate. Probability-based sampling provides a substantial cost savings, though accuracy can be affected by nonresponse bias. We recommend stratified random sampling with a calibration estimator used to reweight the sample based on the proportions of hunters responding in each covariate category as the best option for balancing cost and accuracy. © 2011 The Wildlife Society.
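The recommended estimator can be sketched in miniature: within each stratum, responders are reweighted so they represent all licensed hunters in that stratum, and the weighted harvests are summed. The strata, hunter counts, and reported harvests below are invented for illustration, not taken from the simulations in the paper.

```python
# Hypothetical stratified harvest estimate with a calibration
# (post-stratification) reweighting step.

# stratum -> licensed hunters (N), survey responders, harvest they reported
strata = {
    "resident":    {"N": 10_000, "responded": 400, "harvest": 120},
    "nonresident": {"N": 2_000,  "responded": 50,  "harvest": 30},
}

def calibrated_total(strata):
    """Sum of reported harvests, weighted up to each stratum's size."""
    total = 0.0
    for s in strata.values():
        weight = s["N"] / s["responded"]  # calibration weight per responder
        total += weight * s["harvest"]
    return total

estimate = calibrated_total(strata)  # 4200.0
```

Stratifying by covariates such as residency corrects for the differing response rates that would otherwise bias a simple weighted-up total.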

  13. The DESI Experiment Part I: Science, Targeting, and Survey Design

    SciTech Connect

    Aghamousa, Amir; et al.

    2016-10-31

    DESI (Dark Energy Spectroscopic Instrument) is a Stage IV ground-based dark energy experiment that will study baryon acoustic oscillations (BAO) and the growth of structure through redshift-space distortions with a wide-area galaxy and quasar redshift survey. To trace the underlying dark matter distribution, spectroscopic targets will be selected in four classes from imaging data. We will measure luminous red galaxies up to $z=1.0$. To probe the Universe out to even higher redshift, DESI will target bright [O II] emission line galaxies up to $z=1.7$. Quasars will be targeted both as direct tracers of the underlying dark matter distribution and, at higher redshifts ($2.1 < z < 3.5$), for the Ly-$\alpha$ forest absorption features in their spectra, which will be used to trace the distribution of neutral hydrogen. When moonlight prevents efficient observations of the faint targets of the baseline survey, DESI will conduct a magnitude-limited Bright Galaxy Survey comprising approximately 10 million galaxies with a median $z \approx 0.2$. In total, more than 30 million galaxy and quasar redshifts will be obtained to measure the BAO feature and determine the matter power spectrum, including redshift space distortions.

  14. THE FMOS-COSMOS SURVEY OF STAR-FORMING GALAXIES AT z ∼ 1.6. III. SURVEY DESIGN, PERFORMANCE, AND SAMPLE CHARACTERISTICS

    SciTech Connect

    Silverman, J. D.; Sugiyama, N.; Kashino, D.; Sanders, D.; Zahid, J.; Kewley, L. J.; Chu, J.; Hasinger, G.; Kartaltepe, J. S.; Arimoto, N.; Renzini, A.; Rodighiero, G.; Baronchelli, I.; Daddi, E.; Juneau, S.; Lilly, S. J.; Carollo, C. M.; Capak, P.; Ilbert, O.; and others

    2015-09-15

    We present a spectroscopic survey of galaxies in the COSMOS field using the Fiber Multi-object Spectrograph (FMOS), a near-infrared instrument on the Subaru Telescope. Our survey is specifically designed to detect the Hα emission line that falls within the H-band (1.6–1.8 μm) spectroscopic window from star-forming galaxies with 1.4 < z < 1.7 and M_stellar ≳ 10^10 M_⊙. With the high multiplex capability of FMOS, it is now feasible to construct samples of over 1000 galaxies having spectroscopic redshifts at epochs that were previously challenging. The high-resolution mode (R ∼ 2600) effectively separates Hα and [N II] λ6585, thus enabling studies of the gas-phase metallicity and photoionization state of the interstellar medium. The primary aim of our program is to establish how star formation depends on stellar mass and environment, both recognized as drivers of galaxy evolution at lower redshifts. In addition to the main galaxy sample, our target selection places priority on those detected in the far-infrared by Herschel/PACS to assess the level of obscured star formation and investigate, in detail, outliers from the star formation rate (SFR)–stellar mass relation. Galaxies with Hα detections are followed up with FMOS observations at shorter wavelengths using the J-long (1.11–1.35 μm) grating to detect Hβ and [O III] λ5008, which provides an assessment of the extinction required to measure SFRs not hampered by dust, and an indication of embedded active galactic nuclei. With 460 redshifts measured from 1153 spectra, we assess the performance of the instrument with respect to achieving our goals, discuss inherent biases in the sample, and detail the emission-line properties. Our higher-level data products, including catalogs and spectra, are available to the community.

  15. Some New Bases and Needs for Interior Design from Environmental Research. A Preliminary Survey.

    ERIC Educational Resources Information Center

    Kleeman, Walter, Jr.

    Research which can form new bases for interior design is being greatly accelerated. Investigations in psychology, anthropology, psychiatry, and biology, as well as interdisciplinary projects, turn up literally hundreds of studies, the results of which will vitally affect interior design. This body of research falls into two parts--(1) human…

  16. Aerodynamic aircraft design methods and their notable applications: Survey of the activity in Japan

    NASA Technical Reports Server (NTRS)

    Fujii, Kozo; Takanashi, Susumu

    1991-01-01

    An overview of aerodynamic aircraft design methods and their recent applications in Japan is presented. The discussion centers on a design code developed at the National Aerospace Laboratory (NAL) and now in regular use; most of the examples are therefore the result of collaborative work between heavy industry and NAL. A wide variety of applications in transonic to supersonic flow regimes are presented. Although the design of aircraft elements for external flows is the main focus, some internal flow applications are also presented. Recent applications of the design code, using the Navier-Stokes and Euler equations in the analysis mode, include the design of HOPE (a space vehicle) and Upper Surface Blowing (USB) aircraft configurations.

  17. A Survey of Applications and Research in Integrated Design Systems Technology

    NASA Technical Reports Server (NTRS)

    1998-01-01

    The initial part of the study began with a combination of literature searches, World Wide Web searches, and contacts with individuals and companies who were known to members of our team to have an interest in topics that seemed to be related to our study. There is a long list of such topics, such as concurrent engineering, design for manufacture, life-cycle engineering, systems engineering, systems integration, systems design, design systems, integrated product and process approaches, enterprise integration, integrated product realization, and similar terms. These all capture, at least in part, the flavor of what we describe here as integrated design systems. An inhibiting factor in this inquiry was the absence of agreed terminology for the study of integrated design systems. It is common for the term to be applied to what are essentially augmented Computer-Aided Design (CAD) systems, which are integrated only to the extent that agreements have been reached to attach proprietary extensions to proprietary CAD programs. It is also common for some to use the term integrated design systems to mean a system that applies only, or mainly, to the design phase of a product life cycle. It is likewise common for many of the terms listed earlier to be used as synonyms for integrated design systems. We tried to avoid this ambiguity by adopting the definition of integrated design systems that is implied in the introductory notes that we provided to our contacts, cited earlier. We thus arrived at this definition: Integrated Design Systems refers to the integration of the different tools and processes that comprise the engineering of complex systems. It takes a broad view of the engineering of systems, to include consideration of the entire product realization process and the product life cycle. An important aspect of integrated design systems is the extent to which they integrate existing "islands of automation" into a comprehensive design and product realization

  18. ULTRADEEP IRAC IMAGING OVER THE HUDF AND GOODS-SOUTH: SURVEY DESIGN AND IMAGING DATA RELEASE

    SciTech Connect

    Labbé, I.; Bouwens, R. J.; Franx, M.; Stefanon, M.; Oesch, P. A.; Illingworth, G. D.; Holden, B.; Magee, D.; Carollo, C. M.; Trenti, M.; Smit, R.; González, V.; Stiavelli, M.

    2015-12-15

    The IRAC ultradeep field and IRAC Legacy over GOODS programs are two ultradeep imaging surveys at 3.6 and 4.5 μm with the Spitzer Infrared Array Camera (IRAC). The primary aim is to directly detect the infrared light of reionization epoch galaxies at z > 7 and to constrain their stellar populations. The observations cover the Hubble Ultra Deep Field (HUDF), including the two HUDF parallel fields, and the CANDELS/GOODS-South, and are combined with archival data from all previous deep programs into one ultradeep data set. The resulting imaging reaches unprecedented coverage in IRAC 3.6 and 4.5 μm ranging from >50 hr over 150 arcmin{sup 2}, >100 hr over 60 arcmin{sup 2}, to ∼200 hr over 5–10 arcmin{sup 2}. This paper presents the survey description, data reduction, and public release of reduced mosaics on the same astrometric system as the CANDELS/GOODS-South Wide Field Camera 3 (WFC3) data. To facilitate prior-based WFC3+IRAC photometry, we introduce a new method to create high signal-to-noise PSFs from the IRAC data and reconstruct the complex spatial variation due to survey geometry. The PSF maps are included in the release, as are registered maps of subsets of the data to enable reliability and variability studies. Simulations show that the noise in the ultradeep IRAC images decreases approximately as the square root of integration time over the range 20–200 hr, well below the classical confusion limit, reaching 1σ point-source sensitivities as faint as 15 nJy (28.5 AB) at 3.6 μm and 18 nJy (28.3 AB) at 4.5 μm. The value of such ultradeep IRAC data is illustrated by direct detections of z = 7–8 galaxies as faint as H{sub AB} = 28.
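The quoted flux and magnitude depths can be cross-checked against the standard AB magnitude zero point (3631 Jy); a minimal sketch using only the sensitivities from the abstract:

```python
import math

def njy_to_ab(flux_njy: float) -> float:
    """AB magnitude for a flux density given in nanojanskys (zero point 3631 Jy)."""
    return -2.5 * math.log10(flux_njy * 1e-9 / 3631.0)

# The 1-sigma point-source depths quoted in the abstract
print(round(njy_to_ab(15.0), 1))  # 3.6 um depth -> 28.5 AB
print(round(njy_to_ab(18.0), 1))  # 4.5 um depth -> 28.3 AB
```

Note also that the stated square-root scaling means doubling the integration time buys only a factor of √2 (about 0.38 mag) in depth.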

  19. Design and sample characteristics of the 2005-2008 Nutrition and Health Survey in Taiwan.

    PubMed

    Tu, Su-Hao; Chen, Cheng; Hsieh, Yao-Te; Chang, Hsing-Yi; Yeh, Chih-Jung; Lin, Yi-Chin; Pan, Wen-Harn

    2011-01-01

    The Nutrition and Health Survey in Taiwan (NAHSIT) 2005-2008 was funded by the Department of Health to provide continued assessment of health and nutrition of the people in Taiwan. This household survey collected data from children aged less than 6 years and adults aged 19 years and above, and adopted a three-stage stratified, clustered sampling scheme similar to that used in the NAHSIT 1993-1996. Four samples were produced. One sample with five geographical strata was selected for inference to the whole of Taiwan, while the other three samples, including Hakka, Penghu and mountainous areas were produced for inference to each cultural stratum. A total of 6,189 household interviews and 3,670 health examinations were completed. Interview data included household information, socio-demographics, 24-hour dietary recall, food frequency and habits, dietary and nutritional knowledge, attitudes and behaviors, physical activity, medical history and bone health. Health exam data included anthropometry, blood pressure, physical fitness, bone density, as well as blood and urine collection. Response rate for the household interview was 65%. Of these household interviews, 59% participated in the health exam. Only in a few age subgroups were there significant differences in sex, age, education, or ethnicity distribution between respondents and non-respondents. For the health exam, certain significant differences between participants and non-participants were mostly observed in those aged 19-64 years. The results of this survey will be of benefit to researchers, policy makers and the public to understand and improve the nutrition and health status of pre-school children and adults in Taiwan.

  20. An integrated device for magnetically-driven drug release and in situ quantitative measurements: Design, fabrication and testing

    NASA Astrophysics Data System (ADS)

    Bruvera, I. J.; Hernández, R.; Mijangos, C.; Goya, G. F.

    2015-03-01

    We have developed a device capable of remote triggering and in situ quantification of therapeutic drugs, based on magnetically-responsive hydrogels of poly(N-isopropylacrylamide) (PNiPAAm) and alginate. The heating efficiency of these hydrogels, measured by their specific power absorption (SPA), gave values between 100 and 300 W/g of material, high enough to reach the lower critical solution temperature (LCST) of the polymeric matrix within a few minutes. The drug release through application of AC magnetic fields could be controlled by time-modulated field pulses in order to deliver the desired amount of drug. Using vitamin B12 as a concept drug, the device was calibrated to measure amounts of drug released as small as 25(2)×10^-9 g, demonstrating the potential of this device for very precise quantitative control of drug release.

  1. Ethical considerations in the design and execution of the National and Hispanic Health and Nutrition Examination Survey (HANES).

    PubMed

    Wagener, D K

    1995-04-01

    The purpose of this article is to describe some ethical considerations that have arisen during the design and implementation of the health examination surveys conducted by the National Center for Health Statistics of the Centers for Disease Control and Prevention. Three major areas of concern are discussed: sharing information from the study, banking and using banked tissue samples, and obligations for future testing of subjects. Specific concerns of sharing information include: when to inform, whom to inform, maintaining confidentiality, and how to inform individuals. Specific concerns of determining when sera will be banked and using banked samples include: depletion of samples for quality control, obtaining informed consent for unanticipated uses, access by others, and requests for batches of samples. Finally, specific concerns regarding future testing of subjects include: retesting for verification, retesting for interpretation, testing for different risk factors, and follow-up. Although existing surveys can provide experience or even suggest guidelines, the uniqueness of any new survey will generate unique ethical problems, requiring the careful formulation of unique solutions.

  2. Design of removable partial dentures: a survey of dental laboratories in Greece.

    PubMed

    Avrampou, Marianna; Kamposiora, Phophi; Papavasiliou, Georgios; Pissiotis, Argirios; Katsoulis, Joannis; Doukoudakis, Asterios

    2012-01-01

    The aim of this study was to compare data on design and fabrication methods of removable partial dentures (RPDs) in two major cities in Greece. A questionnaire was sent to 150 randomly selected dental technicians. The participation rate was 79.3%. The anterior palatal strap, the lingual bar, and the Roach-type clasp arm designs were preferred. Half of the RPDs fabricated were retained using precision attachments. Differences between the two cities were observed in types of major maxillary connectors used, types of attachments and impression materials used, as well as the design of distal-extension RPDs. Postdoctoral education was found to have an impact on RPD fabrication. Despite the differences observed, design and fabrication of RPDs followed commonly used principles.

  3. Survey and analysis of research on supersonic drag-due-to-lift minimization with recommendations for wing design

    NASA Technical Reports Server (NTRS)

    Carlson, Harry W.; Mann, Michael J.

    1992-01-01

    A survey of research on drag-due-to-lift minimization at supersonic speeds, including a study of the effectiveness of current design and analysis methods, was conducted. The results show that a linearized theory analysis with estimated attainable thrust and vortex force effects can predict with reasonable accuracy the lifting efficiency of flat wings. Significantly better wing performance can be achieved through the use of twist and camber. Although linearized theory methods tend to overestimate the amount of twist and camber required for a given application and provide an overly optimistic performance prediction, these deficiencies can be overcome by implementation of recently developed empirical corrections. Numerous examples of the correlation of experiment and theory are presented to demonstrate the applicability and limitations of linearized theory methods with and without empirical corrections. The use of an Euler code for the estimation of aerodynamic characteristics of a twisted and cambered wing and its application to design by iteration are discussed.

  4. Survey of Aerothermodynamics Facilities Useful for the Design of Hypersonic Vehicles Using Air-Breathing Propulsion

    NASA Technical Reports Server (NTRS)

    Arnold, James O.; Deiwert, George S.

    1997-01-01

    This paper surveys aerothermodynamic facilities that have been useful in the study of external flows and propulsion aspects of hypersonic air-breathing vehicles. While the paper is not a survey of all facilities, it covers the utility of shock tunnels and conventional hypersonic blow-down facilities which have been used for hypersonic air-breather studies. The problems confronting researchers in the field of aerothermodynamics are outlined. Results from the T5 GALCIT tunnel for the shock-on-lip problem are outlined. Experiments on combustors and short expansion nozzles using the semi-free jet method have been conducted in large shock tunnels. An example which employed the NASA Ames 16-Inch shock tunnel is outlined, and the philosophy of the test technique is described. Conventional blow-down hypersonic wind tunnels are quite useful in hypersonic air-breathing studies. Results from an expansion ramp experiment in the NASA Ames 3.5-Foot Hypersonic Wind Tunnel, simulating the nozzle of a hypersonic air-breather, are summarized. Similar work on expansion nozzles conducted in the NASA Langley hypersonic wind tunnel complex is cited. Free-jet air-frame propulsion integration and configuration stability experiments conducted at Langley in the hypersonic wind tunnel complex on a small generic model are also summarized.

  5. The Mississippi Delta Cardiovascular Health Examination Survey: Study Design and Methods.

    PubMed

    Short, Vanessa L; Ivory-Walls, Tameka; Smith, Larry; Loustalot, Fleetwood

    2014-01-01

    Assessment of cardiovascular disease (CVD) morbidity and mortality in subnational areas is limited. A model for regional CVD surveillance is needed, particularly among vulnerable populations underrepresented in current monitoring systems. The Mississippi Delta Cardiovascular Health Examination Survey (CHES) is a population-based, cross-sectional study on a representative sample of adults living in the 18-county Mississippi Delta region, a rural, impoverished area with high rates of poor health outcomes and marked health disparities. The primary objectives of Delta CHES are to (1) determine the prevalence and distribution of CVD and CVD risk factors using self-reported and directly measured health metrics and (2) assess environmental perceptions and existing policies that support or deter healthy choices. An address-based sampling frame is used for household enumeration and participant recruitment, and an in-home data collection model is used to collect survey data, anthropometric measures, and blood samples from participants. Data from all sources will be merged into one analytic dataset, and sample weights will be developed to ensure data are representative of the Mississippi Delta region adult population. Information gathered will be used to assess the burden of CVD and guide the development, implementation, and evaluation of cardiovascular health promotion and risk factor control strategies.

  6. Advanced power generation systems for the 21st Century: Market survey and recommendations for a design philosophy

    SciTech Connect

    Andriulli, J.B.; Gates, A.E.; Haynes, H.D.; Klett, L.B.; Matthews, S.N.; Nawrocki, E.A.; Otaduy, P.J.; Scudiere, M.B.; Theiss, T.J.; Thomas, J.F.; Tolbert, L.M.; Yauss, M.L.; Voltz, C.A.

    1999-11-01

    The purpose of this report is to document the results of a study designed to enhance the performance of future military generator sets (gen-sets) in the medium power range. The study includes a market survey of the state of the art in several key component areas and recommendations comprising a design philosophy for future military gen-sets. The market survey revealed that the commercial market is in a state of flux, but it is currently or will soon be capable of providing the technologies recommended here in a cost-effective manner. The recommendations, if implemented, should result in future power generation systems that are much more functional than today's gen-sets. The number of differing units necessary (both family sizes and frequency modes) to cover the medium power range would be decreased significantly, while the weight and volume of each unit would decrease, improving the transportability of the power source. Improved fuel economy and overall performance would result from more effective utilization of the prime mover in the generator. The units would allow for more flexibility and control, improved reliability, and more effective power management in the field.

  7. SABE Colombia: Survey on Health, Well-Being, and Aging in Colombia—Study Design and Protocol

    PubMed Central

    Corchuelo, Jairo; Curcio, Carmen-Lucia; Calzada, Maria-Teresa; Mendez, Fabian

    2016-01-01

    Objective. To describe the design of the SABE Colombia study. The major health study of older people in Latin America and the Caribbean (LAC) is the Survey on Health, Well-Being, and Aging in LAC, SABE (from the Spanish initials: SAlud, Bienestar & Envejecimiento). Methods. SABE Colombia is a population-based, cross-sectional study on the health, aging, and well-being of individuals aged 60 years and older, focusing on the social determinants of health inequities. Methods and design were similar to those of the original LAC SABE. The total sample at the urban and rural research sites (244 municipalities) was 23,694 elderly Colombians, representative of the total population. The study had three components: (1) a questionnaire covering active aging determinants, including anthropometry, blood pressure measurement, physical function, and biochemical and hematological measures; (2) a subsample survey among family caregivers; and (3) a qualitative study of quality of life from gender and cultural perspectives, to understand the different dimensions of meaning people assign to it. Conclusions. SABE Colombia is a comprehensive, multidisciplinary study of the elderly with respect to active aging determinants. Its results are intended to inform public policies aimed at tackling health inequalities for Colombia's aging society. PMID:27956896

  8. SABE Colombia: Survey on Health, Well-Being, and Aging in Colombia-Study Design and Protocol.

    PubMed

    Gomez, Fernando; Corchuelo, Jairo; Curcio, Carmen-Lucia; Calzada, Maria-Teresa; Mendez, Fabian

    2016-01-01

    Objective. To describe the design of the SABE Colombia study. The major health study of older people in Latin America and the Caribbean (LAC) is the Survey on Health, Well-Being, and Aging in LAC, SABE (from the Spanish initials: SAlud, Bienestar & Envejecimiento). Methods. SABE Colombia is a population-based, cross-sectional study on the health, aging, and well-being of individuals aged 60 years and older, focusing on the social determinants of health inequities. Methods and design were similar to those of the original LAC SABE. The total sample at the urban and rural research sites (244 municipalities) was 23,694 elderly Colombians, representative of the total population. The study had three components: (1) a questionnaire covering active aging determinants, including anthropometry, blood pressure measurement, physical function, and biochemical and hematological measures; (2) a subsample survey among family caregivers; and (3) a qualitative study of quality of life from gender and cultural perspectives, to understand the different dimensions of meaning people assign to it. Conclusions. SABE Colombia is a comprehensive, multidisciplinary study of the elderly with respect to active aging determinants. Its results are intended to inform public policies aimed at tackling health inequalities for Colombia's aging society.

  9. Quantitating the effect of prosthesis design on femoral remodeling using high-resolution region-free densitometric analysis (DXA-RFA).

    PubMed

    Farzi, Mohsen; Morris, Richard M; Penny, Jeannette; Yang, Lang; Pozo, Jose M; Overgaard, Søren; Frangi, Alejandro F; Wilkinson, J Mark

    2017-02-07

    Dual energy X-ray absorptiometry (DXA) is the reference standard method used to study bone mineral density (BMD) after total hip arthroplasty (THA). However, the subtle, spatially-complex changes in bone mass due to strain-adaptive bone remodeling relevant to different prosthesis designs are not readily resolved using conventional DXA analysis. DXA region free analysis (DXA RFA) is a novel computational image analysis technique that provides a high-resolution quantitation of periprosthetic BMD. Here we applied the technique to quantitate the magnitude and areal size of periprosthetic BMD changes using scans acquired during two previous randomized clinical trials (2004 to 2009); one comparing three cemented prosthesis design geometries, and the other comparing a hip resurfacing versus a conventional cementless prosthesis. DXA RFA resolved subtle differences in magnitude and area of bone remodeling between prosthesis designs not previously identified in conventional DXA analyses. A mean bone loss of 10.3%, 12.1%, and 11.1% occurred for the three cemented prostheses within a bone area fraction of 14.8%, 14.4%, and 6.2%, mostly within the lesser trochanter (P < 0.001). For the cementless prosthesis, a diffuse pattern of bone loss (-14.3%) was observed at the shaft of femur in a small area fraction of 0.6% versus no significant bone loss for the hip resurfacing prosthesis (P < 0.001). BMD increases were observed consistently at the greater trochanter for all prostheses except the hip-resurfacing prosthesis, where BMD increase was widespread across the metaphysis (P < 0.001). DXA RFA provides high-resolution insights into the effect of prosthesis design on the local strain environment in bone.

  10. Mail and Web Surveys: A Comparison of Demographic Characteristics and Response Quality When Respondents Self-Select the Survey Administration Mode

    ERIC Educational Resources Information Center

    Mackety, Dawn M.

    2007-01-01

    The purpose of this study was to use a nonexperimental, quantitative design to compare mail and web surveys with survey mode self-selection at two data collection waves. Research questions examined differences and predictabilities among demographics (gender, ethnicity, age, and professional employment) and response quality (pronoun use, item…

  11. Fitting statistical distributions to sea duck count data: implications for survey design and abundance estimation

    USGS Publications Warehouse

    Zipkin, Elise F.; Leirness, Jeffery B.; Kinlan, Brian P.; O'Connell, Allan F.; Silverman, Emily D.

    2014-01-01

    Determining appropriate statistical distributions for modeling animal count data is important for accurate estimation of abundance, distribution, and trends. In the case of sea ducks along the U.S. Atlantic coast, managers want to estimate local and regional abundance to detect and track population declines, to define areas of high and low use, and to predict the impact of future habitat change on populations. In this paper, we used a modified marked point process to model survey data that recorded flock sizes of Common eiders, Long-tailed ducks, and Black, Surf, and White-winged scoters. The data come from an experimental aerial survey, conducted by the United States Fish & Wildlife Service (USFWS) Division of Migratory Bird Management, during which east-west transects were flown along the Atlantic Coast from Maine to Florida during the winters of 2009–2011. To model the number of flocks per transect (the points), we compared the fit of four statistical distributions (zero-inflated Poisson, zero-inflated geometric, zero-inflated negative binomial and negative binomial) to data on the number of species-specific sea duck flocks that were recorded for each transect flown. To model the flock sizes (the marks), we compared the fit of flock size data for each species to seven statistical distributions: positive Poisson, positive negative binomial, positive geometric, logarithmic, discretized lognormal, zeta and Yule–Simon. Akaike’s Information Criterion and Vuong’s closeness tests indicated that the negative binomial and discretized lognormal were the best distributions for all species for the points and marks, respectively. These findings have important implications for estimating sea duck abundances as the discretized lognormal is a more skewed distribution than the Poisson and negative binomial, which are frequently used to model avian counts; the lognormal is also less heavy-tailed than the power law distributions (e.g., zeta and Yule–Simon), which are
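The model-selection step described above can be illustrated with a small AIC comparison; a hedged sketch with hypothetical, invented flock counts (not the USFWS survey data), contrasting the equidispersed Poisson with a heavier-tailed one-parameter alternative:

```python
import numpy as np
from scipy import stats

# Hypothetical flock counts per transect: many zeros plus a few large
# flocks, so the variance far exceeds the mean (typical of sea duck data).
counts = np.array([0, 0, 1, 0, 2, 0, 0, 15, 1, 0, 3, 0, 40, 2, 0, 1, 0, 7, 0, 0])

def aic(loglik: float, n_params: int) -> float:
    return 2 * n_params - 2 * loglik

# Poisson: the MLE of the rate is the sample mean (1 parameter).
lam = counts.mean()
aic_pois = aic(stats.poisson.logpmf(counts, lam).sum(), 1)

# Geometric on {0, 1, 2, ...}: MLE p = 1/(1 + mean); scipy's geom has
# support starting at k = 1, so shift the counts by one.
p = 1.0 / (1.0 + lam)
aic_geom = aic(stats.geom.logpmf(counts + 1, p).sum(), 1)

print(f"AIC Poisson:   {aic_pois:.1f}")
print(f"AIC geometric: {aic_geom:.1f}")  # lower AIC: heavy-tailed model wins
```

For strongly overdispersed counts like these, the geometric (and, in the paper's analysis, the negative binomial and discretized lognormal) gives a much lower AIC than the Poisson.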

  12. Ultradeep IRAC Imaging Over the HUDF and GOODS-South: Survey Design and Imaging Data Release

    NASA Astrophysics Data System (ADS)

    Labbé, I.; Oesch, P. A.; Illingworth, G. D.; van Dokkum, P. G.; Bouwens, R. J.; Franx, M.; Carollo, C. M.; Trenti, M.; Holden, B.; Smit, R.; González, V.; Magee, D.; Stiavelli, M.; Stefanon, M.

    2015-12-01

    The IRAC ultradeep field and IRAC Legacy over GOODS programs are two ultradeep imaging surveys at 3.6 and 4.5 μm with the Spitzer Infrared Array Camera (IRAC). The primary aim is to directly detect the infrared light of reionization epoch galaxies at z > 7 and to constrain their stellar populations. The observations cover the Hubble Ultra Deep Field (HUDF), including the two HUDF parallel fields, and the CANDELS/GOODS-South, and are combined with archival data from all previous deep programs into one ultradeep data set. The resulting imaging reaches unprecedented coverage in IRAC 3.6 and 4.5 μm ranging from >50 hr over 150 arcmin2, >100 hr over 60 arcmin2, to ˜200 hr over 5-10 arcmin2. This paper presents the survey description, data reduction, and public release of reduced mosaics on the same astrometric system as the CANDELS/GOODS-South Wide Field Camera 3 (WFC3) data. To facilitate prior-based WFC3+IRAC photometry, we introduce a new method to create high signal-to-noise PSFs from the IRAC data and reconstruct the complex spatial variation due to survey geometry. The PSF maps are included in the release, as are registered maps of subsets of the data to enable reliability and variability studies. Simulations show that the noise in the ultradeep IRAC images decreases approximately as the square root of integration time over the range 20-200 hr, well below the classical confusion limit, reaching 1σ point-source sensitivities as faint as 15 nJy (28.5 AB) at 3.6 μm and 18 nJy (28.3 AB) at 4.5 μm. The value of such ultradeep IRAC data is illustrated by direct detections of z = 7-8 galaxies as faint as HAB = 28. Based on observations made with the NASA/ESA Hubble Space Telescope, obtained from the data archive at the Space Telescope Science Institute. STScI is operated by the Association of Universities for Research in Astronomy, Inc. under NASA contract NAS 5-26555. Based on observations made with the Spitzer Space Telescope, which is operated by the Jet

  13. Spectacle Design Preferences among Chinese Primary and Secondary Students and Their Parents: A Qualitative and Quantitative Study

    PubMed Central

    Zhou, Zhongqiang; Kecman, Maja; Chen, Tingting; Liu, Tianyu; Jin, Ling; Chen, Shangji; Chen, Qianyun; He, Mingguang; Silver, Josh; Moore, Bruce; Congdon, Nathan

    2014-01-01

    Purpose To identify the specific characteristics making glasses designs, particularly those compatible with adjustable glasses, more or less appealing to Chinese children and their parents. Patients and Methods Primary and secondary school children from urban and rural China with ≤−1.00 diopters of bilateral myopia and their parents ranked four conventional-style frames identified by local optical shops as popular versus four child-specific frames compatible with adjustable spectacles. Scores based on the proportion of maximum possible ranking were computed for each style. Selected children and their parents also participated in Focus Groups (FGs) discussing spectacle design preference. Recordings were transcribed and coded by two independent reviewers using NVivo software. Results Among 136 urban primary school children (age range 9–11 years), 290 rural secondary school children (11–17 years) and 16 parents, all adjustable-style frames (scores on 0–100 scale 25.7–62.4) were ranked behind all conventional frames (63.0–87.5). For eight FGs including 12 primary children, 26 secondary children and 16 parents, average kappa values for NVivo coding were 0.81 (students) and 0.70 (parents). All groups agreed that the key changes to make adjustable designs more attractive were altering the round lenses to rectangular or oval shapes and adding curved earpieces for more stable wear. The thick frames of the adjustable designs were considered stylish, and children indicated they would wear them if the lens shape were modified. Conclusions Current adjustable lens designs are unattractive to Chinese children and their parents, though this study identified specific modifications which would make them more appealing. PMID:24594799
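The 0–100 preference scores can be sketched as rank sums normalized by the maximum achievable points; a hypothetical illustration (the abstract does not spell out the exact formula, so the scheme below is an assumption):

```python
# Hypothetical ranking data: each participant ranks 8 frame styles,
# 1 = most preferred ... 8 = least preferred. The scoring scheme here
# (rank converted to points, normalized by the maximum) is assumed.
rankings = [
    [1, 3, 2, 4, 8, 6, 5, 7],
    [2, 1, 3, 5, 7, 8, 6, 4],
    [1, 2, 4, 3, 6, 7, 8, 5],
]
n_styles = len(rankings[0])

def style_scores(rankings):
    """Score each style as the proportion (0-100) of the maximum
    possible preference points it could have received."""
    n = len(rankings)
    scores = []
    for style in range(n_styles):
        # Rank r earns (n_styles - r) points, so the best rank earns the most.
        points = sum(n_styles - r[style] for r in rankings)
        scores.append(100.0 * points / (n * (n_styles - 1)))
    return scores

print([round(s, 1) for s in style_scores(rankings)])
```

Under this scheme a style ranked first by everyone scores 100 and one ranked last by everyone scores near 0, matching the 0–100 scale reported.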

  14. Approach on quantitative structure-activity relationship for design of a pH neutral carrier containing tertiary amino group.

    PubMed

    Cao, Zhong; Gong, Fu-Chun; Li, He-Ping; Xiao, Zhong-Liang; Long, Shu; Zhang, Ling; Peng, San-Jun

    2007-01-02

    The quantitative structure-activity relationship (QSAR) for neutral carriers used to prepare hydrogen ion sensors has been studied. A series of synthesized carrier compounds were taken as the training set. Five molecular structure parameters of the compounds were calculated using the CNDO/2 algorithm and used as feature variables in constructing the QSAR model. The lower and upper limits of the linear pH response range were taken as the activity measure. The corresponding model equations were derived by a stepwise regression procedure. With the established QSAR model, a new pH carrier, (4-hydroxybenzyl)didodecylamine (XIII), was proposed and synthesized. A PVC membrane pH electrode based on carrier XIII, with a wide linear pH response range of 2.0-12.5, was prepared. With a near-theoretical Nernstian response slope of 57.2 ± 0.3 mV/pH (n = 5 at 25 °C) and no super-Nernstian behavior, the sensor had low resistance, short response time, high selectivity, and good reproducibility. Moreover, the sensor was successfully applied to measuring the pH of serum samples.
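The modelling step above amounts to regressing an activity measure on computed molecular descriptors; a minimal least-squares sketch with invented descriptor values (the actual CNDO/2 parameters, compounds, and coefficients are not given in the abstract):

```python
import numpy as np

# Hypothetical descriptor matrix: 5 molecular structure parameters for
# 8 training compounds (all values invented for illustration only).
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 5))
# Hypothetical activity measure: upper limit of the linear pH response range.
y = rng.normal(loc=12.0, scale=0.5, size=8)

# Ordinary least squares with an intercept column; a stepwise procedure
# would add/drop descriptor columns by significance, one at a time.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
y_hat = A @ coef
print("fitted upper pH limits:", np.round(y_hat, 2))
```

The fitted equation can then be evaluated for a candidate carrier's descriptors to predict its pH response limits before synthesis, which is how the new carrier XIII was proposed.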

  15. Quantitative PCR measurements of the effects of introducing inosines into primers provides guidelines for improved degenerate primer design.

    PubMed

    Zheng, Linda; Gibbs, Mark J; Rodoni, Brendan C

    2008-11-01

    Polymerase chain reaction (PCR) is used to detect groups of viruses with the use of group-specific degenerate primers. Inosine residues are sometimes used in the primers to match variable positions within the complementary target sequences, but there is little data on their effects on cDNA synthesis and amplification. A quantitative reverse-transcription PCR was used to measure the rate of amplification with primers containing inosine residues substituted at different positions and in increasing numbers. Experiments were conducted using standard quantities of cloned DNA copied from Potato virus Y genomic RNA and RNA (cRNA) transcribed from the cloned DNA. Single inosine residues had no effect on the amplification rate in the forward primer, except at one position close to the 3' terminus. Conversely, single inosine residues significantly reduced the amplification rate when placed at three out of four positions in the reverse primer. Four or five inosine substitutions could be tolerated with some decline in rates, but amplification often failed from cRNA templates with primers containing larger numbers of inosines. Greater declines in the rate of amplification were observed with RNA templates, suggesting that reverse transcription suffers more than PCR amplification when inosine is included in the reverse primer.
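Rate differences in quantitative PCR are conventionally expressed through the shift in quantification cycle (Cq) between a test primer and its unmodified control; a hedged sketch assuming idealized 100% amplification efficiency:

```python
def relative_amplification(delta_cq: float, efficiency: float = 2.0) -> float:
    """Fold-change in measured template for a primer whose quantification
    cycle is delayed by delta_cq relative to the control primer.
    efficiency = 2.0 assumes perfect doubling per cycle (an idealization)."""
    return efficiency ** (-delta_cq)

# e.g. an inosine-containing primer reaching threshold 2 cycles later
print(relative_amplification(2.0))  # -> 0.25, i.e. a 4-fold reduction
```

Real assays substitute the measured per-primer efficiency (often 1.8-2.0) for the ideal value of 2.0, which changes the fold-change estimate accordingly.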

  16. Diagrams: A Visual Survey of Graphs, Maps, Charts and Diagrams for the Graphic Designer.

    ERIC Educational Resources Information Center

    Lockwood, Arthur

    Since the ultimate success of any diagram rests in its clarity, it is important that the designer select a method of presentation which will achieve this aim. He should be aware of the various ways in which statistics can be shown diagrammatically, how information can be incorporated in maps, and how events can be plotted in chart or graph form.…

  17. Computer Game Theories for Designing Motivating Educational Software: A Survey Study

    ERIC Educational Resources Information Center

    Ang, Chee Siang; Rao, G. S. V. Radha Krishna

    2008-01-01

    The purpose of this study is to evaluate computer game theories for educational software. We propose a framework for designing engaging educational games based on contemporary game studies which includes ludology and narratology. Ludology focuses on the study of computer games as play and game activities, while narratology revolves around the…

  18. A Study of Disruptive Behavior Disorders in Puerto Rican Youth: I. Background, Design, and Survey Methods

    ERIC Educational Resources Information Center

    Bird, Hector R.; Canino, Glorisa J.; Davies, Mark; Duarte, Cristiane S.; Febo, Vivian; Ramirez, Rafael; Hoven, Christina; Wicks, Judith; Musa, George; Loeber, Rolf

    2006-01-01

    Objective: This is the first of two related articles on a study carried out between 2000 and 2003 designed to assess the prevalence, associated comorbidities, and correlates of disruptive behavior disorders in two populations of Puerto Rican children: one in the Standard Metropolitan Areas of San Juan and Caguas in Puerto Rico, and the other in…

  19. A Survey of Career Guidance Needs of Industrial Design Students in Taiwanese Universities

    ERIC Educational Resources Information Center

    Yang, Ming-Ying; You, Manlai

    2010-01-01

    School pupils in Taiwan spend most of their time in studying and having examinations, and consequently many of them decide what major to study in universities rather hastily. Industrial design (ID) programs in universities nowadays recruit students from general and vocational senior high schools through a variety of channels. As a consequence, ID…

  20. Research Design and Statistical Methods in Indian Medical Journals: A Retrospective Survey

    PubMed Central

    Hassan, Shabbeer; Yellur, Rajashree; Subramani, Pooventhan; Adiga, Poornima; Gokhale, Manoj; Iyer, Manasa S.; Mayya, Shreemathi S.

    2015-01-01

    Good quality medical research generally requires not only an expertise in the chosen medical field of interest but also a sound knowledge of statistical methodology. The number of medical research articles which have been published in Indian medical journals has increased quite substantially in the past decade. The aim of this study was to collate all evidence on study design quality and statistical analyses used in selected leading Indian medical journals. Ten (10) leading Indian medical journals were selected based on impact factors and all original research articles published in 2003 (N = 588) and 2013 (N = 774) were categorized and reviewed. A validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation of the articles. Main outcomes considered in the present study were – study design types and their frequencies, error/defects proportion in study design, statistical analyses, and implementation of CONSORT checklist in RCT (randomized clinical trials). From 2003 to 2013: The proportion of erroneous statistical analyses did not decrease (χ2=0.592, Φ=0.027, p=0.4418), 25% (80/320) in 2003 compared to 22.6% (111/490) in 2013. Compared with 2003, significant improvement was seen in 2013; the proportion of papers using statistical tests increased significantly (χ2=26.96, Φ=0.16, p<0.0001) from 42.5% (250/588) to 56.7% (439/774). The overall proportion of errors in study design decreased significantly (χ2=16.783, Φ=0.12, p<0.0001), 41.3% (243/588) compared to 30.6% (237/774). In 2013, the proportion of randomized clinical trial designs remained very low (7.3%, 43/588), with the majority showing some errors (41 papers, 95.3%). The majority of published studies were retrospective in nature both in 2003 [79.1% (465/588)] and in 2013 [78.2% (605/774)]. Major decreases in error proportions were observed in both results presentation (χ2=24.477, Φ=0.17, p<0.0001), 82.2% (263/320) compared to 66.3% (325
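The chi-square statistics quoted in this abstract can be re-derived from the reported counts. A minimal pure-Python sketch (not code from the paper) of Pearson's χ² on a 2×2 table, checked against the papers-using-statistical-tests comparison, 250/588 in 2003 vs. 439/774 in 2013:

```python
# Pearson's chi-square for a 2x2 contingency table [[a, b], [c, d]],
# without Yates' continuity correction.
def chi2_2x2(a, b, c, d):
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    expected = [row1 * col1 / n, row1 * col2 / n,
                row2 * col1 / n, row2 * col2 / n]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# 2003: 250 of 588 papers used statistical tests; 2013: 439 of 774.
chi2 = chi2_2x2(250, 588 - 250, 439, 774 - 439)
print(f"chi2 = {chi2:.2f}")  # matches the reported 26.96
```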

  1. Research design and statistical methods in Indian medical journals: a retrospective survey.

    PubMed

    Hassan, Shabbeer; Yellur, Rajashree; Subramani, Pooventhan; Adiga, Poornima; Gokhale, Manoj; Iyer, Manasa S; Mayya, Shreemathi S

    2015-01-01

    Good quality medical research generally requires not only an expertise in the chosen medical field of interest but also a sound knowledge of statistical methodology. The number of medical research articles which have been published in Indian medical journals has increased quite substantially in the past decade. The aim of this study was to collate all evidence on study design quality and statistical analyses used in selected leading Indian medical journals. Ten (10) leading Indian medical journals were selected based on impact factors and all original research articles published in 2003 (N = 588) and 2013 (N = 774) were categorized and reviewed. A validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation of the articles. Main outcomes considered in the present study were - study design types and their frequencies, error/defects proportion in study design, statistical analyses, and implementation of CONSORT checklist in RCT (randomized clinical trials). From 2003 to 2013: The proportion of erroneous statistical analyses did not decrease (χ2=0.592, Φ=0.027, p=0.4418), 25% (80/320) in 2003 compared to 22.6% (111/490) in 2013. Compared with 2003, significant improvement was seen in 2013; the proportion of papers using statistical tests increased significantly (χ2=26.96, Φ=0.16, p<0.0001) from 42.5% (250/588) to 56.7% (439/774). The overall proportion of errors in study design decreased significantly (χ2=16.783, Φ=0.12, p<0.0001), 41.3% (243/588) compared to 30.6% (237/774). In 2013, the proportion of randomized clinical trial designs remained very low (7.3%, 43/588), with the majority showing some errors (41 papers, 95.3%). The majority of published studies were retrospective in nature both in 2003 [79.1% (465/588)] and in 2013 [78.2% (605/774)]. Major decreases in error proportions were observed in both results presentation (χ2=24.477, Φ=0.17, p<0.0001), 82.2% (263/320) compared to 66.3% (325/490) and

  2. Building confidence in quantitative systems pharmacology models: An engineer's guide to exploring the rationale in model design and development

    PubMed Central

    Alden, K; Andrews, P; Clark, E; Nellis, A; Naylor, B; Coles, M; Kaye, P

    2017-01-01

    This tutorial promotes good practice for exploring the rationale of systems pharmacology models. A safety systems engineering inspired notation approach provides much needed rigor and transparency in development and application of models for therapeutic discovery and design of intervention strategies. Structured arguments over a model's development, underpinning biological knowledge, and analyses of model behaviors are constructed to determine the confidence that a model is fit for the purpose for which it will be applied. PMID:27863172

  3. Building confidence in quantitative systems pharmacology models: An engineer's guide to exploring the rationale in model design and development.

    PubMed

    Timmis, J; Alden, K; Andrews, P; Clark, E; Nellis, A; Naylor, B; Coles, M; Kaye, P

    2017-03-01

    This tutorial promotes good practice for exploring the rationale of systems pharmacology models. A safety systems engineering inspired notation approach provides much needed rigor and transparency in development and application of models for therapeutic discovery and design of intervention strategies. Structured arguments over a model's development, underpinning biological knowledge, and analyses of model behaviors are constructed to determine the confidence that a model is fit for the purpose for which it will be applied.

  4. Unmanned Research Vehicle (URV): Development, Implementation, & Flight Test of a MIMO Digital Flight Control System Designed Using Quantitative Feedback Theory

    DTIC Science & Technology

    2000-04-01

    [Systems for vehicles and integration.] To order the complete compilation report, use: ADA381871. This component part describes a MIMO digital flight control system for the Lambda URV, designed using Quantitative Feedback Theory, that would interface with an autonomous waypoint-directed autopilot; during the project a successful control design was achieved.

  5. Beyond the Cost of Biologics: Employer Survey Reveals Gap in Understanding Role of Specialty Pharmacy and Benefit Design

    PubMed Central

    Vogenberg, F. Randy; Larson, Cheryl; Rehayem, Margaret; Boress, Larry

    2012-01-01

    Background Advances in biotechnology have led to the development of many new medical therapies for a variety of diseases. These agents, known as biologics or specialty drugs, represent the fastest-growing segment of pharmaceuticals. They have often proved effective in cases where conventional medications have failed; however, they can cost up to $350,000 per patient annually. Employers sponsor a significant proportion of plans that provide healthcare benefits, but surveys on benefit coverage have neglected to measure employers’ understanding of these drugs or their use. Objective To establish a baseline understanding of specialty pharmacy drug benefit coverage from the perspective of the employer (ie, commercial benefit plan sponsors). Methods The Midwest Business Group on Health (MBGH), a Chicago-based, nonprofit coalition of more than 100 large employers, in collaboration with the Institute for Integrated Healthcare, conducted a national web-based survey to determine the extent of employer understanding of specialty pharmacy drug management. MBGH, along with 15 business coalitions nationwide, distributed the survey to their employer members. A total of 120 employers, representing more than 1 million employee lives, completed the survey online. The results were then analyzed by MBGH. Results Of the 120 employers surveyed, 25% had “little to no understanding” of biologics, and only 53% claimed a “moderate understanding” of these agents. When asked to rank the effectiveness of biologics-related disease management support for their employees, 45% of the participating employers did not know whether productivity had increased, and 43% did not know whether their employees had experienced increased quality of life as a result of taking these drugs. The majority (76%) of employers continued to rely heavily on print medium to communicate with their covered population. Overall, the vast majority of employers (78%) claimed either “little to no understanding” or

  6. A Study of Program Management Procedures in the Campus-Based and Basic Grant Programs. Technical Report No. 1: Sample Design, Student Survey Yield and Bias.

    ERIC Educational Resources Information Center

    Puma, Michael J.; Ellis, Richard

    Part of a study of program management procedures in the campus-based and Basic Educational Opportunity Grant programs reports on the design of the site visit component of the study and the results of the student survey, both in terms of the yield obtained and the quality of the data. Chapter 2 describes the design of sampling methodology employed…

  7. A survey on the design of multiprocessing systems for artificial intelligence applications

    NASA Technical Reports Server (NTRS)

    Wah, Benjamin W.; Li, Guo Jie

    1989-01-01

    Some issues in designing computers for artificial intelligence (AI) processing are discussed. These issues are divided into three levels: the representation level, the control level, and the processor level. The representation level deals with the knowledge and methods used to solve the problem and the means to represent it. The control level is concerned with the detection of dependencies and parallelism in the algorithmic and program representations of the problem, and with the synchronization and scheduling of concurrent tasks. The processor level addresses the hardware and architectural components needed to evaluate the algorithmic and program representations. Solutions for the problems of each level are illustrated by a number of representative systems. Design decisions in existing projects on AI computers are classed into top-down, bottom-up, and middle-out approaches.

  8. A Survey of the Role of Noncovalent Sulfur Interactions in Drug Design.

    PubMed

    Beno, Brett R; Yeung, Kap-Sun; Bartberger, Michael D; Pennington, Lewis D; Meanwell, Nicholas A

    2015-06-11

    Electron deficient, bivalent sulfur atoms have two areas of positive electrostatic potential, a consequence of the low-lying σ* orbitals of the C-S bond that are available for interaction with electron donors including oxygen and nitrogen atoms and, possibly, π-systems. Intramolecular interactions are by far the most common manifestation of this effect, which offers a means of modulating the conformational preferences of a molecule. Although a well-documented phenomenon, a priori applications in drug design are relatively sparse and this interaction, which is often isosteric with an intramolecular hydrogen-bonding interaction, appears to be underappreciated by the medicinal chemistry community. In this Perspective, we discuss the theoretical basis for sulfur σ* orbital interactions and illustrate their importance in the context of drug design and organic synthesis. The role of sulfur interactions in protein structure and function is discussed and although relatively rare, intermolecular interactions between ligand C-S σ* orbitals and proteins are illustrated.

  9. Thermal Design of the Instrument for the Transiting Exoplanet Survey Satellite

    NASA Technical Reports Server (NTRS)

    Allen, Gregory D.

    2016-01-01

    The TESS observatory is a two-year NASA Explorer mission which will use a set of four cameras to discover exoplanets. It will be placed in a high-Earth orbit with a period of 13.7 days and will be unaffected by temperature disturbances caused by environmental heating from the Earth. The cameras use their stray-light baffles to passively cool the cameras, and in turn the CCDs, in order to maintain operational temperatures. The design has been well thought out and analyzed to maximize temperature stability. The analysis shows that the design keeps the cameras and their components within their temperature ranges, which will help make it a successful mission. It will also meet its survival requirement of sustaining exposure to a five-hour eclipse. Official validation and verification planning is underway and will be performed as the system is built up. It is slated for launch in 2017.

  10. Design and Calibration of a Flowfield Survey Rake for Inlet Flight Research

    NASA Technical Reports Server (NTRS)

    Flynn, Darin C.; Ratnayake, Nalin A.; Frederick, Michael

    2009-01-01

    A flowfield rake was designed to quantify the flowfield for inlet research underneath NASA DFRC's F-15B airplane. Detailed loads and stress analyses were performed using CFD and empirical methods to assure structural integrity. Calibration data were generated through wind tunnel testing of the rake, and a calibration algorithm was developed to determine the local Mach number and flow angularity at each probe. RAGE was flown in November 2008; the data are currently being analyzed.
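The abstract does not give the calibration algorithm itself, but for subsonic probes local Mach number is classically recovered from the pitot-to-static pressure ratio via the isentropic flow relation for air (γ = 1.4). A hedged sketch of that textbook relation, purely for illustration:

```python
# Isentropic relation for air (gamma = 1.4):
#   p0/p = (1 + 0.2*M**2)**3.5   =>   M = sqrt(5*((p0/p)**(2/7) - 1))
# Valid for subsonic flow (no shock ahead of the pitot probe).
import math

def mach_from_pressure_ratio(p0_over_p):
    return math.sqrt(5.0 * (p0_over_p ** (2.0 / 7.0) - 1.0))

# Round-trip check: generate the ratio for M = 0.8, then invert it.
ratio = (1.0 + 0.2 * 0.8 ** 2) ** 3.5
print(round(mach_from_pressure_ratio(ratio), 3))  # recovers 0.8
```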

  11. Survey of Aerothermodynamics Facilities Useful for the Design of Hypersonic Vehicles Using Air-Breathing Propulsion

    NASA Technical Reports Server (NTRS)

    Arnold, James O.; Deiwert, G. S.

    1997-01-01

    The dream of producing an air-breathing, hydrogen fueled, hypervelocity aircraft has been before the aerospace community for decades. However, such a craft has not yet been realized, even in an experimental form. Despite the simplicity and beauty of the concept, many formidable problems must be overcome to make this dream a reality. This paper summarizes the aero/aerothermodynamic issues that must be addressed to make the dream a reality and discusses how aerothermodynamics facilities and their modern companion, real-gas computational fluid dynamics (CFD), can help solve the problems blocking the way to realizing the dream. The approach of the paper is first to outline the concept of an air-breathing hypersonic vehicle and then discuss the nose-to-tail aerothermodynamics issues and special aerodynamic problems that arise with such a craft. Then the utility of aerothermodynamic facilities and companion CFD analysis is illustrated by reviewing results from recent United States publications wherein these problems have been addressed. Papers selected for the discussion have been chosen such that the review will serve to survey important U.S. aero/aerothermodynamic real gas and conventional wind tunnel facilities that are useful in the study of hypersonic, hydrogen propelled hypervelocity vehicles.

  12. An integrated payload design for the Atmospheric Remote-sensing Infrared Exoplanet Large-survey (ARIEL)

    NASA Astrophysics Data System (ADS)

    Eccleston, Paul; Tinetti, Giovanna; Beaulieu, Jean-Philippe; Güdel, Manuel; Hartogh, Paul; Micela, Giuseppina; Min, Michiel; Rataj, Miroslaw; Ray, Tom; Ribas, Ignasi; Vandenbussche, Bart; Auguères, Jean-Louis; Bishop, Georgia; Da Deppo, Vania; Focardi, Mauro; Hunt, Thomas; Malaguti, Giuseppe; Middleton, Kevin; Morgante, Gianluca; Ollivier, Marc; Pace, Emanuele; Pascale, Enzo; Taylor, William

    2016-07-01

    ARIEL (the Atmospheric Remote-sensing Infrared Exoplanet Large-survey) is one of the three candidates for the next ESA medium-class science mission (M4) expected to be launched in 2026. This mission will be devoted to observing spectroscopically in the infrared a large population of warm and hot transiting exoplanets (temperatures from ~500 K to ~3000 K) in our nearby Galactic neighborhood, opening a new discovery space in the field of extrasolar planets and enabling the understanding of the physics and chemistry of these far away worlds. The three candidate missions for M4 are now in a Phase A study which will run until mid-2017 at which point one mission will be selected for implementation. ARIEL is based on a 1-m class telescope feeding both a moderate resolution spectrometer covering the wavelengths from 1.95 to 7.8 microns, and a four channel photometer (which also acts as a Fine Guidance Sensor) with bands between 0.55 and 1.65 microns. During its 3.5 years of operation from an L2 orbit, ARIEL will continuously observe exoplanets transiting their host star.

  13. SURVEY INSTRUMENT

    DOEpatents

    Borkowski, C J

    1954-01-19

    This pulse-type survey instrument is suitable for readily detecting {alpha} particles in the presence of high {beta} and {gamma} backgrounds. The instruments may also be used to survey for neutrons, {beta} particles and {gamma} rays by employing suitably designed interchangeable probes and selecting an operating potential to correspond to the particular probe.

  14. Hybrid optimization methodology of variable densities mesh model for the axial supporting design of wide-field survey telescope

    NASA Astrophysics Data System (ADS)

    Wang, Hairen; Lou, Zheng; Qian, Yuan; Zheng, Xianzhong; Zuo, Yingxi

    2016-03-01

    The optimization of a primary mirror support system is one of the most critical problems in the design of large telescopes. Here, we propose a hybrid optimization methodology of variable densities mesh model (HOMVDMM) for the axial supporting design, which has three key steps: (1) creating a variable densities mesh model, which partitions the mirror into several sparse mesh areas and several dense mesh areas; (2) global optimization of the primary mirror support based on the zero-order optimization method with a large tolerance; (3) based on the results of the second step, further optimization with the first-order optimization method in the dense mesh areas with a small tolerance. HOMVDMM exploits the complementary merits of the zero- and first-order optimizations, applying the former at the global scale and the latter at the local scale. As an application, the axial support of the primary mirror of the 2.5-m wide-field survey telescope (WFST) is optimized by HOMVDMM. Three designs were obtained via a comparative study of different numbers of supporting points: 27, 39, and 54. Their residual half-path length errors are 28.78, 9.32, and 5.29 nm; the latter two designs both meet the specification of WFST. In each of the three designs, a global optimization value with high accuracy is obtained in an hour on an ordinary PC. As the results suggest, the overall performance of HOMVDMM is superior to the first-order optimization method as well as the zero-order optimization method.
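The two-stage idea described in this abstract (a derivative-free global search with a large tolerance, then gradient-based local refinement with a small tolerance) can be illustrated on a toy problem. This is an assumed simplification for intuition only, not the HOMVDMM implementation:

```python
# Toy two-stage optimization: zero-order coarse search, then first-order
# refinement from the best coarse candidate.
import random

def f(x):                       # stand-in objective (e.g. mirror surface error)
    return (x - 3.0) ** 2 + 1.0

def grad_f(x):                  # analytic gradient of the toy objective
    return 2.0 * (x - 3.0)

random.seed(0)

# Stage 1: zero-order global search over sparse random candidates.
candidates = [random.uniform(-10.0, 10.0) for _ in range(50)]
x = min(candidates, key=f)

# Stage 2: first-order local refinement (gradient descent, small tolerance).
for _ in range(200):
    x -= 0.1 * grad_f(x)

print(round(x, 4), round(f(x), 4))
```

The design choice mirrors the abstract: the coarse stage avoids getting trapped far from the basin of the optimum, while the gradient stage converges quickly and accurately once inside it.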

  15. Seismic design and engineering research at the U.S. Geological Survey

    USGS Publications Warehouse

    1988-01-01

    The Engineering Seismology Element of the USGS Earthquake Hazards Reduction Program is responsible for the coordination and operation of the National Strong Motion Network to collect, process, and disseminate earthquake strong-motion data; and, the development of improved methodologies to estimate and predict earthquake ground motion.  Instrumental observations of strong ground shaking induced by damaging earthquakes and the corresponding response of man-made structures provide the basis for estimating the severity of shaking from future earthquakes, for earthquake-resistant design, and for understanding the physics of seismologic failure in the Earth's crust.

  16. The Norwegian Offender Mental Health and Addiction Study - Design and Implementation of a National Survey and Prospective Cohort Study.

    PubMed

    Bukten, Anne; Lund, Ingunn Olea; Rognli, Eline Borger; Stavseth, Marianne Riksheim; Lobmaier, Philipp; Skurtveit, Svetlana; Clausen, Thomas; Kunøe, Nikolaj

    2015-01-01

    The Norwegian prison inmates are burdened by problems before they enter prison. Few studies have managed to assess this burden and relate it to what occurs for the inmates once they leave the prison. The Norwegian Offender Mental Health and Addiction (NorMA) study is a large-scale longitudinal cohort study that combines national survey and registry data in order to understand mental health, substance use, and criminal activity before, during, and after custody among prisoners in Norway. The main goal of the study is to describe the criminal and health-related trajectories based on both survey and registry linkage information. Data were collected from 1,499 inmates in Norwegian prison facilities during 2013-2014. Of these, 741 inmates provided a valid personal identification number and constitute a cohort that will be examined retrospectively and prospectively, along with data from nationwide Norwegian registries. This study describes the design, procedures, and implementation of the ongoing NorMA study and provides an outline of the initial data.

  17. The Norwegian Offender Mental Health and Addiction Study – Design and Implementation of a National Survey and Prospective Cohort Study

    PubMed Central

    Bukten, Anne; Lund, Ingunn Olea; Rognli, Eline Borger; Stavseth, Marianne Riksheim; Lobmaier, Philipp; Skurtveit, Svetlana; Clausen, Thomas; Kunøe, Nikolaj

    2015-01-01

    The Norwegian prison inmates are burdened by problems before they enter prison. Few studies have managed to assess this burden and relate it to what occurs for the inmates once they leave the prison. The Norwegian Offender Mental Health and Addiction (NorMA) study is a large-scale longitudinal cohort study that combines national survey and registry data in order to understand mental health, substance use, and criminal activity before, during, and after custody among prisoners in Norway. The main goal of the study is to describe the criminal and health-related trajectories based on both survey and registry linkage information. Data were collected from 1,499 inmates in Norwegian prison facilities during 2013–2014. Of these, 741 inmates provided a valid personal identification number and constitute a cohort that will be examined retrospectively and prospectively, along with data from nationwide Norwegian registries. This study describes the design, procedures, and implementation of the ongoing NorMA study and provides an outline of the initial data. PMID:26648732

  18. Availability and Structure of Ambulatory Rehabilitation Services: A Survey of Hospitals with Designated Rehabilitation Beds in Ontario, Canada

    PubMed Central

    Passalent, Laura A.; Cott, Cheryl A.

    2008-01-01

    Purpose: To determine the degree to which ambulatory physical therapy (PT), occupational therapy (OT), and speech language pathology (SLP) services are available in hospitals with designated rehabilitation beds (DRBs) in Ontario, and to explore the structure of delivery and funding among services that exist. Methods: Questions regarding ambulatory services were included in the System Integration and Change (SIC) survey sent to all hospitals participating in the Hospital Report 2005: Rehabilitation initiative. Results: The response rate was 75.9% (41 of 54 hospitals). All hospitals surveyed provide some degree of ambulatory rehabilitation services, but the nature of these services varies according to rehabilitation client groups (RCGs). The majority of hospitals continue to deliver services through their employees rather than by contracting out or by creating for-profit subsidiary clinics, but an increasing proportion is accessing private sources to finance ambulatory services. Conclusions: Most hospitals with DRBs provide some degree of ambulatory rehabilitation services. Privatization of delivery is not widespread in these facilities. PMID:20145757

  19. The effects of dominance, regular inbreeding and sampling design on Q(ST), an estimator of population differentiation for quantitative traits.

    PubMed

    Goudet, Jérôme; Büchi, Lucie

    2006-02-01

    To test whether quantitative traits are under directional or homogenizing selection, it is common practice to compare population differentiation estimates at molecular markers (F(ST)) and quantitative traits (Q(ST)). If the trait is neutral and its determinism is additive, then theory predicts that Q(ST) = F(ST), while Q(ST) > F(ST) is predicted under directional selection for different local optima, and Q(ST) < F(ST) is predicted under homogenizing selection. However, nonadditive effects can alter these predictions. Here, we investigate the influence of dominance on the relation between Q(ST) and F(ST) for neutral traits. Using analytical results and computer simulations, we show that dominance generally deflates Q(ST) relative to F(ST). Under inbreeding, the effect of dominance vanishes, and we show that for selfing species, a better estimate of Q(ST) is obtained from selfed families than from half-sib families. We also compare several sampling designs and find that it is always best to sample many populations (>20) with few families (five) rather than few populations with many families. Provided that estimates of Q(ST) are derived from individuals originating from many populations, we conclude that the pattern Q(ST) > F(ST), and hence the inference of directional selection for different local optima, is robust to the effect of nonadditive gene actions.
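The neutral-additive expectation underlying the Q(ST)-F(ST) comparison in this abstract is Q_ST = V_b / (V_b + 2·V_w), with V_b the between-population and V_w the within-population additive genetic variance. A minimal sketch of that comparison (illustrative values, not data from the paper):

```python
# Q_ST from additive variance components: Q_ST = Vb / (Vb + 2*Vw).
def qst(v_between, v_within):
    return v_between / (v_between + 2.0 * v_within)

fst = 0.2                 # hypothetical marker-based F_ST for comparison
q = qst(1.0, 1.0)         # equal variance components -> Q_ST = 1/3
print(round(q, 4))
# Q_ST > F_ST suggests directional selection for different local optima;
# Q_ST < F_ST suggests homogenizing selection; equality is the neutral case.
print("directional" if q > fst else "homogenizing/neutral")
```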

  20. Combining the least correlation design, wavelet packet transform and correlation coefficient test to reduce the size of calibration set for NIR quantitative analysis in multi-component systems.

    PubMed

    Cai, Chen-Bo; Xu, Lu; Han, Qing-Juan; Wu, Hai-Long; Nie, Jin-Fang; Fu, Hai-Yan; Yu, Ru-Qin

    2010-05-15

    The paper focuses on solving a common and important problem of NIR quantitative analysis in multi-component systems: how to significantly reduce the size of the calibration set while not impairing the predictive precision. To cope with the problem orthogonal discrete wavelet packet transform (WPT), the least correlation design and correlation coefficient test (r-test) have been combined together. As three examples, a two-component carbon tetrachloride system with 21 calibration samples, a two-component aqueous system with 21 calibration samples, and a two-component aqueous system with 41 calibration samples have been treated with the proposed strategy, respectively. In comparison with some previous methods based on much more calibration samples, the results out of the strategy showed that the predictive ability was not obviously decreased for the first system while being clearly strengthened for the second one, and the predictive precision out of the third one was even satisfactory enough for most cases of quantitative analysis. In addition, all important factors and parameters related to our strategy are discussed in detail.
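The correlation coefficient test (r-test) mentioned in this abstract amounts to screening candidate variables by their correlation with the analyte concentration. A hedged pure-Python sketch of that screening step (the variable names and threshold are illustrative assumptions, not the paper's values):

```python
# Screen candidate variables (e.g. wavelet-packet coefficients) by the
# absolute Pearson correlation with the analyte concentration, keeping
# only those with |r| >= r_min.
def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

conc = [1.0, 2.0, 3.0, 4.0, 5.0]              # calibration concentrations
variables = {
    "coef_a": [2.1, 3.9, 6.2, 8.0, 10.1],     # tracks concentration
    "coef_b": [1.0, -1.0, 1.0, -1.0, 1.0],    # uninformative
}
selected = [name for name, v in variables.items()
            if abs(pearson_r(conc, v)) >= 0.95]
print(selected)
```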

  1. Experimental design approach for the optimisation of a HPLC-fluorimetric method for the quantitation of the angiotensin II receptor antagonist telmisartan in urine.

    PubMed

    Torrealday, N; González, L; Alonso, R M; Jiménez, R M; Ortiz Lastra, E

    2003-08-08

    A high performance liquid chromatographic method with fluorimetric detection has been developed for the quantitation of the angiotensin II receptor antagonist (ARA II) 4-((2-n-propyl-4-methyl-6-(1-methylbenzimidazol-2-yl)-benzimidazol-1-yl)methyl)biphenyl-2-carboxylic acid (telmisartan) in urine, using a Novapak C18 column (3.9 x 150 mm, 4 microm). The mobile phase consisted of a mixture of acetonitrile and phosphate buffer (pH 6.0, 5 mM) (45:55, v/v) pumped at a flow rate of 0.5 ml min(-1). Effluent was monitored at excitation and emission wavelengths of 305 and 365 nm, respectively. Separation was carried out at room temperature. Chromatographic variables were optimised by means of experimental design. A clean-up step was used for urine samples consisting of a solid-phase extraction procedure with C8 cartridges and methanol as eluent. This method proved to be accurate (RE from -12 to 6%), precise (intra- and inter-day coefficients of variation (CV) were lower than 8%) and sensitive enough (limit of quantitation (LOQ), ca. 1 microg l(-1)) to be applied to the determination of the active drug in urine samples obtained from hypertensive patients. Concentration levels of telmisartan at different time intervals (from 0 up to 36 h after oral intake) were monitored.
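The two validation figures quoted in this abstract, relative error (RE%, accuracy) and coefficient of variation (CV%, precision), are standard calculations. A sketch with hypothetical replicate values (the numbers are illustrative, not from the paper):

```python
# RE% measures accuracy against the nominal (spiked) concentration;
# CV% measures precision of replicate measurements (sample std / mean).
def re_percent(measured_mean, nominal):
    return 100.0 * (measured_mean - nominal) / nominal

def cv_percent(values):
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5
    return 100.0 * sd / mean

replicates = [9.0, 10.0, 11.0]   # hypothetical telmisartan results, ug/l
print(round(re_percent(sum(replicates) / 3, 10.0), 2))   # RE = 0.0 %
print(round(cv_percent(replicates), 2))                  # CV = 10.0 %
```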

  2. Design of a Mars Airplane Propulsion System for the Aerial Regional-Scale Environmental Survey (ARES) Mission Concept

    NASA Technical Reports Server (NTRS)

    Kuhl, Christopher A.

    2009-01-01

    The Aerial Regional-Scale Environmental Survey (ARES) is a Mars exploration mission concept with the goal of taking scientific measurements of the atmosphere, surface, and subsurface of Mars by using an airplane as the payload platform. The ARES team first conducted a Phase-A study for a 2007 launch opportunity, which was completed in May 2003. Following this study, significant efforts were undertaken to reduce the risk of the atmospheric flight system under the NASA Langley Planetary Airplane Risk Reduction Project. The concept was then proposed to the Mars Scout program in 2006 for a 2011 launch opportunity. This paper summarizes the design and development of the ARES airplane propulsion subsystem, beginning with the inception of the ARES project in 2002 through the submittal of the Mars Scout proposal in July 2006.

  3. Designing for Dissemination Among Public Health Researchers: Findings From a National Survey in the United States

    PubMed Central

    Jacobs, Julie A.; Tabak, Rachel G.; Hoehner, Christine M.; Stamatakis, Katherine A.

    2013-01-01

    Objectives. We have described the practice of designing for dissemination among researchers in the United States with the intent of identifying gaps and areas for improvement. Methods. In 2012, we conducted a cross-sectional study of 266 researchers using a search of the top 12 public health journals in PubMed and lists available from government-sponsored research. The sample involved scientists at universities, the National Institutes of Health, and the Centers for Disease Control and Prevention in the United States. Results. In the pooled sample, 73% of respondents estimated they spent less than 10% of their time on dissemination. About half of respondents (53%) had a person or team in their unit dedicated to dissemination. Seventeen percent of all respondents used a framework or theory to plan their dissemination activities. One third of respondents (34%) always or usually involved stakeholders in the research process. Conclusions. The current data and the existing literature suggest considerable room for improvement in designing for dissemination. PMID:23865659

  4. Design and upgrades for the Sloan Digital Sky Survey telescope's roll-off enclosure

    NASA Astrophysics Data System (ADS)

    Leger, R. French; Long, Dan; Klaene, Mark A.

    2003-02-01

    The SDSS telescope is housed, when not in use, in a roll-off enclosure. This enclosure rolls away from the telescope a distance of 60 feet, leaving the telescope fully exposed for operations. Design considerations for wind and solar loading, thermal venting, conditioning, and stability are reviewed. Originally, the enclosure had been constructed to minimize its surface area obstruction to the telescope's field of view. This design feature, however, offered little room to perform engineering tasks during non-operational time. An upgrade to the structure, in the form of raising the roof, was instituted. This improvement greatly enhanced the engineering and testing functions performed on the telescope, thereby increasing operational efficiency and the time allotted to engineering tasks. Problems associated with maintaining weather sealing, lightning protection, truck wheel alignment, altitude effects on truck controllers, and thermal conditioning are examined. Communication and electrical connections between stationary and moving elements of the enclosure are described. Two types of systems have been used to date: one a reel system and the other a slider system. Advantages and disadvantages of both are examined from the perspective of four years' experience.

  5. Design and Calibration of a Flowfield Survey Rake for Inlet Flight Research

    NASA Technical Reports Server (NTRS)

    Flynn, Darin C.; Ratnayake, Nalin A.; Frederick, Michael

    2009-01-01

    The Propulsion Flight Test Fixture at the NASA Dryden Flight Research Center is a unique test platform available for use on NASA's F-15B aircraft, tail number 836, as a modular host for a variety of aerodynamics and propulsion research. For future flight data from this platform to be valid, more information must be gathered concerning the quality of the airflow underneath the body of the F-15B at various flight conditions, especially supersonic conditions. The flow angularity and Mach number must be known at multiple locations on any test article interface plane for measurement data at these locations to be valid. To determine this prerequisite information, flight data will be gathered in the Rake Airflow Gauge Experiment using a custom-designed flowfield rake to probe the airflow underneath the F-15B at the desired flight conditions. This paper addresses the design considerations of the rake and probe assembly, including the loads and stress analysis using analytical methods, computational fluid dynamics, and finite element analysis. It also details the flow calibration procedure, including the completed wind-tunnel test and posttest data reduction, calibration verification, and preparation for flight-testing.

  6. A rank-based nonparametric method for mapping quantitative trait loci in outbred half-sib pedigrees: application to milk production in a granddaughter design.

    PubMed Central

    Coppieters, W; Kvasz, A; Farnir, F; Arranz, J J; Grisart, B; Mackinnon, M; Georges, M

    1998-01-01

    We describe the development of a multipoint nonparametric quantitative trait loci mapping method based on the Wilcoxon rank-sum test applicable to outbred half-sib pedigrees. The method has been evaluated on a simulated dataset and its efficiency compared with interval mapping by regression. The rank-based approach is slightly inferior to regression when the residual variance is homoscedastic normal; however, in three of the four other scenarios envisaged, i.e., residual variance heteroscedastic normal, homoscedastic skewed, and homoscedastic positively kurtosed, the rank-based method outperforms regression. Both methods were applied to a real data set analyzing the effect of bovine chromosome 6 on milk yield and composition by using a 125-cM map comprising 15 microsatellites and a granddaughter design of 1158 Holstein-Friesian sires. PMID:9649541
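    The core of this approach, the Wilcoxon rank-sum statistic, can be sketched in a few lines of Python. The generic version below (with midrank tie handling) is an illustration only, not the authors' multipoint half-sib implementation; the function name is our own:

    ```python
    def rank_sum_statistic(group_a, group_b):
        """Wilcoxon rank-sum statistic: sum of the (1-based) pooled ranks of
        group_a, with tied values assigned the average rank of their block."""
        pooled = sorted((value, idx) for idx, value in enumerate(group_a + group_b))
        ranks = [0.0] * len(pooled)
        i = 0
        while i < len(pooled):
            # find the block of tied values starting at i
            j = i
            while j + 1 < len(pooled) and pooled[j + 1][0] == pooled[i][0]:
                j += 1
            avg_rank = (i + j) / 2 + 1  # midrank of the tied block
            for k in range(i, j + 1):
                ranks[pooled[k][1]] = avg_rank
            i = j + 1
        # original indices 0..len(group_a)-1 belong to group_a
        return sum(ranks[:len(group_a)])
    ```

    In a QTL-mapping context, the two groups would be the phenotypes of offspring inheriting alternative paternal marker alleles; a large deviation of the statistic from its null expectation suggests a linked locus.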

  7. Quantitative microscopy of the lung: a problem-based approach. Part 2: stereological parameters and study designs in various diseases of the respiratory tract.

    PubMed

    Mühlfeld, Christian; Ochs, Matthias

    2013-08-01

    Design-based stereology provides efficient methods to obtain valuable quantitative information of the respiratory tract in various diseases. However, the choice of the most relevant parameters in a specific disease setting has to be deduced from the present pathobiological knowledge. Often it is difficult to express the pathological alterations by interpretable parameters in terms of volume, surface area, length, or number. In the second part of this companion review article, we analyze the present pathophysiological knowledge about acute lung injury, diffuse parenchymal lung diseases, emphysema, pulmonary hypertension, and asthma to come up with recommendations for the disease-specific application of stereological principles for obtaining relevant parameters. Worked examples with illustrative images are used to demonstrate the work flow, estimation procedure, and calculation and to facilitate the practical performance of equivalent analyses.
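    As a concrete example of a design-based estimator of the kind this review applies, the Cavalieri principle estimates the volume of a structure from point counts on systematically spaced sections. The sketch below uses illustrative function and variable names and made-up numbers:

    ```python
    def cavalieri_volume(points_per_section, section_spacing_um, area_per_point_um2):
        """Cavalieri volume estimator: V = T * a(p) * sum(P), where T is the
        distance between systematic sections, a(p) the tissue area associated
        with each test-grid point, and P the points hitting the structure."""
        return section_spacing_um * area_per_point_um2 * sum(points_per_section)

    # 3 sections, 100 um apart, 50 um^2 per grid point
    print(cavalieri_volume([10, 12, 8], 100.0, 50.0))  # estimated volume in um^3
    ```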

  8. Diamagnetic Imaging Agents with a Modular Chemical Design for Quantitative Detection of β-Galactosidase and β-Glucuronidase Activities with CatalyCEST MRI.

    PubMed

    Fernández-Cuervo, Gabriela; Tucker, Kirsten A; Malm, Scott W; Jones, Kyle M; Pagel, Mark D

    2016-10-06

    Imaging agents for the noninvasive in vivo detection of enzyme activity in preclinical and clinical settings could have fundamental implications in the field of drug discovery. Furthermore, a new class of targeted prodrug treatments takes advantage of high enzyme activity to tailor therapy and improve treatment outcomes. Herein, we report the design and synthesis of new magnetic resonance imaging (MRI) agents that quantitatively detect β-galactosidase and β-glucuronidase activities by measuring changes in chemical exchange saturation transfer (CEST). Following a modular approach, we attached the enzymes' respective substrates to a salicylate moiety with a chromogenic spacer via a carbamate linkage. This furnished highly selective diamagnetic CEST agents that detected and quantified the activities of glycoside hydrolase enzymes. Michaelis-Menten enzyme kinetics studies were performed by monitoring catalyCEST MRI signals, which were validated with UV-vis assays.

  9. TOPoS. I. Survey design and analysis of the first sample

    NASA Astrophysics Data System (ADS)

    Caffau, E.; Bonifacio, P.; Sbordone, L.; François, P.; Monaco, L.; Spite, M.; Plez, B.; Cayrel, R.; Christlieb, N.; Clark, P.; Glover, S.; Klessen, R.; Koch, A.; Ludwig, H.-G.; Spite, F.; Steffen, M.; Zaggia, S.

    2013-12-01

    Context. The metal-weak tail of the metallicity distribution function (MDF) of the Galactic Halo stars contains crucial information on the formation mode of the first generation of stars. To determine this observationally, it is necessary to observe large numbers of extremely metal-poor stars. Aims: We present here the Turn-Off Primordial Stars survey (TOPoS) that is conducted as an ESO Large Programme at the VLT. This project has four main goals: (i) to understand the formation of low-mass stars in a low-metallicity gas: determine the metal-weak tail of the halo MDF below [M/H] = -3.5; in particular, we aim at determining the critical metallicity, that is the lowest metallicity sufficient for the formation of low-mass stars; (ii) to determine in extremely metal-poor stars the relative abundances of the elements that are the signature of the massive first stars; (iii) to determine the trend of the lithium abundance at the time when the Galaxy formed; and (iv) to derive the fraction of C-enhanced extremely metal-poor stars with respect to normal extremely metal-poor stars. The large number of stars observed in the SDSS provides a good sample of candidate stars at extremely low metallicity. Methods: Candidates with turn-off colours down to magnitude g = 20 were selected from the low-resolution spectra of SDSS by means of an automated procedure. X-Shooter has the potential of performing the necessary follow-up spectroscopy, providing accurate metallicities and abundance ratios for several key elements for these stars. Results: We present here the stellar parameters of the first set of stars. The nineteen stars range in iron abundance between -4.1 and -2.9 dex relative to the Sun. Two stars have a high radial velocity and, according to our estimate of their kinematics, appear to be marginally bound to the Galaxy and are possibly accreted from another galaxy. Based on observations obtained at ESO Paranal Observatory, GTO programme 189.D-0165(A).

  10. Quantitative analysis of human ankle characteristics at different gait phases and speeds for utilizing in ankle-foot prosthetic design

    PubMed Central

    2014-01-01

    Background Ankle characteristics vary in terms of gait phase and speed change. This study aimed to quantify the components of ankle characteristics, including quasi-stiffness and work in different gait phases and at various speeds. Methods The kinetic and kinematic data of 20 healthy participants were collected during normal gait at four speeds. Stance moment-angle curves were divided into three sub-phases including controlled plantarflexion, controlled dorsiflexion and powered plantarflexion. The slope of the moment-angle curves was quantified as quasi-stiffness. The area under the curves was defined as work. Results The lowest quasi-stiffness was observed in the controlled plantarflexion. The fitted line to moment-angle curves showed R2 > 0.8 at controlled dorsiflexion and powered plantarflexion. Quasi-stiffness was significantly different at different speeds (P = 0.00). In the controlled dorsiflexion, the ankle absorbed energy; by comparison, energy was generated in the powered plantarflexion. A negative work value was recorded at slower speeds and a positive value was observed at faster speeds. Ankle peak powers were increased with walking speed (P = 0.00). Conclusions Our findings suggested that the quasi-stiffness and work of the ankle joint can be regulated at different phases and speeds. These findings may be clinically applicable in the design and development of ankle prosthetic devices that can naturally replicate human walking at various gait speeds. PMID:24568175
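    The two quantities defined in this abstract can be computed directly from a moment-angle curve: quasi-stiffness as the least-squares slope, and work as the area under the curve. The sketch below is a generic illustration with made-up numbers, not the authors' processing pipeline:

    ```python
    def quasi_stiffness(angles_rad, moments_nm):
        """Quasi-stiffness: least-squares slope of the moment-angle curve (Nm/rad)."""
        n = len(angles_rad)
        mean_a = sum(angles_rad) / n
        mean_m = sum(moments_nm) / n
        num = sum((a - mean_a) * (m - mean_m) for a, m in zip(angles_rad, moments_nm))
        den = sum((a - mean_a) ** 2 for a in angles_rad)
        return num / den

    def joint_work(angles_rad, moments_nm):
        """Work: trapezoidal area under the moment-angle curve (J);
        positive values indicate generation, negative values absorption."""
        return sum((moments_nm[i] + moments_nm[i + 1]) / 2 * (angles_rad[i + 1] - angles_rad[i])
                   for i in range(len(angles_rad) - 1))
    ```

    In practice the stance-phase curve would first be split into the three sub-phases (controlled plantarflexion, controlled dorsiflexion, powered plantarflexion) and each quantity computed per sub-phase.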

  11. Survey of Technical Preventative Measures to Reduce Whole-Body Vibration Effects when Designing Mobile Machinery

    NASA Astrophysics Data System (ADS)

    DONATI, P.

    2002-05-01

    Engineering solutions to minimize the effects on operators of vibrating mobile machinery can be conveniently grouped into three areas: (1) reduction of vibration at the source by improvement of the quality of terrain, careful selection of vehicle or machine, correct loading, proper maintenance, etc.; (2) reduction of vibration transmission by incorporating suspension systems (tyres, vehicle suspensions, suspension cab and seat) between the operator and the source of vibration; and (3) improvement of cab ergonomics and seat profiles to optimize operator posture. This paper reviews the different techniques and problems linked to categories (2) and (3). According to epidemiological studies, the main health risk with whole-body vibration exposure would appear to be lower back pain. When designing new mobile machinery, all factors which may contribute to back injury should be considered in order to reduce risk. For example, optimized seat suspension is useless if the suspension seat cannot be correctly and easily adjusted to the driver's weight, or if the driver is forced to drive in a bent position to avoid his head striking the ceiling because of the spatial requirement of the suspension seat.

  12. Quantitative plant proteomics.

    PubMed

    Bindschedler, Laurence V; Cramer, Rainer

    2011-02-01

    Quantitation is an inherent requirement in comparative proteomics and there is no exception to this for plant proteomics. Quantitative proteomics has high demands on the experimental workflow, requiring a thorough design and often a complex multi-step structure. It has to include sufficient numbers of biological and technical replicates and methods that are able to facilitate a quantitative signal read-out. Quantitative plant proteomics in particular poses many additional challenges but because of the nature of plants it also offers some potential advantages. In general, analysis of plants has been less prominent in proteomics. Low protein concentration, difficulties in protein extraction, genome multiploidy, high Rubisco abundance in green tissue, and an absence of well-annotated and completed genome sequences are some of the main challenges in plant proteomics. However, the latter is now changing with several genomes emerging for model plants and crops such as potato, tomato, soybean, rice, maize and barley. This review discusses the current status in quantitative plant proteomics (MS-based and non-MS-based) and its challenges and potentials. Both relative and absolute quantitation methods in plant proteomics from DIGE to MS-based analysis after isotope labeling and label-free quantitation are described and illustrated by published studies. In particular, we describe plant-specific quantitative methods such as metabolic labeling methods that can take full advantage of plant metabolism and culture practices, and discuss other potential advantages and challenges that may arise from the unique properties of plants.

  13. Rapid Method Development in Hydrophilic Interaction Liquid Chromatography for Pharmaceutical Analysis Using a Combination of Quantitative Structure-Retention Relationships and Design of Experiments.

    PubMed

    Taraji, Maryam; Haddad, Paul R; Amos, Ruth I J; Talebi, Mohammad; Szucs, Roman; Dolan, John W; Pohl, Chris A

    2017-02-07

    A design-of-experiment (DoE) model was developed, able to describe the retention times of a mixture of pharmaceutical compounds in hydrophilic interaction liquid chromatography (HILIC) under all possible combinations of acetonitrile content, salt concentration, and mobile-phase pH with R(2) > 0.95. Further, a quantitative structure-retention relationship (QSRR) model was developed to predict retention times for new analytes, based only on their chemical structures, with a root-mean-square error of prediction (RMSEP) as low as 0.81%. A compound classification based on the concept of similarity was applied prior to QSRR modeling. Finally, we utilized a combined QSRR-DoE approach to propose an optimal design space in a quality-by-design (QbD) workflow to facilitate the HILIC method development. The mathematical QSRR-DoE model was shown to be highly predictive when applied to an independent test set of unseen compounds in unseen conditions with a RMSEP value of 5.83%. The QSRR-DoE computed retention time of pharmaceutical test analytes and subsequently calculated separation selectivity was used to optimize the chromatographic conditions for efficient separation of targets. A Monte Carlo simulation was performed to evaluate the risk of uncertainty in the model's prediction, and to define the design space where the desired quality criterion was met. Experimental realization of peak selectivity between targets under the selected optimal working conditions confirmed the theoretical predictions. These results demonstrate how discovery of optimal conditions for the separation of new analytes can be accelerated by the use of appropriate theoretical tools.
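    The risk-evaluation step can be illustrated with a minimal Monte Carlo sketch: perturb two model-predicted retention times with an error on the order of the RMSEP and count how often the pair falls closer than a required separation. All names and numbers below are illustrative assumptions, not the authors' QSRR-DoE model:

    ```python
    import random

    def failure_risk(pred_rt_a, pred_rt_b, rmsep, min_separation, n_sim=20000, seed=1):
        """Monte Carlo estimate of the probability that two predicted retention
        times, each perturbed with N(0, rmsep) prediction error, end up closer
        than the required minimum separation."""
        rng = random.Random(seed)
        failures = 0
        for _ in range(n_sim):
            rt_a = pred_rt_a + rng.gauss(0.0, rmsep)
            rt_b = pred_rt_b + rng.gauss(0.0, rmsep)
            if abs(rt_a - rt_b) < min_separation:
                failures += 1
        return failures / n_sim
    ```

    Conditions whose estimated risk stays below a chosen threshold would define the design space in a quality-by-design workflow.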

  14. Willingness to pay for treated mosquito nets in Surat, India: the design and descriptive analysis of a household survey.

    PubMed

    Bhatia, M R; Fox-Rushby, J A

    2002-12-01

    For willingness to pay (WTP) studies to have an appropriate impact on policy making, it is essential that the design and analysis are undertaken carefully. This paper aims to describe and justify the design of the survey tool used to assess hypothetical WTP for treated mosquito nets (TMN) in rural Surat, India and report its findings. Results from qualitative work were used as an input for developing the WTP questionnaire. A total of 1200 households belonging to 80 villages in rural Surat were selected for the study. A bidding format was used to elicit WTP values, using three different starting bids. The scenario was constructed in a way to reduce the possibility of respondents acting strategically. The response rate was 100%. About 79% of the respondents were willing to buy TMNs and the mean WTP was Rs57. Descriptive results of economic and other taste and preference variables are also presented, which include preventive measures used by households and treatment seeking behaviour for malaria. It is observed that WTP as well as demographic variables and prevention methods differ significantly across arms of the trial. This paper suggests that policy-makers could use the evidence following further analysis, along with information on costs of implementation, to ascertain the levels of subsidy that may be needed at different levels of coverage.
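    The bidding format described above can be sketched as a simple loop. The function below is a generic illustration: the `accepts` callback stands in for the interviewer's question, and the fixed-step stopping rule is our assumption, not the study's exact protocol:

    ```python
    def elicit_wtp(accepts, start_bid, step):
        """Bidding-game sketch: if the start bid is accepted, raise the bid in
        fixed steps until refused; otherwise lower it until accepted (or zero).
        Returns the highest accepted bid, or None if every bid is refused."""
        bid = start_bid
        if accepts(bid):
            while accepts(bid + step):
                bid += step
            return bid
        while bid > 0:
            bid -= step
            if accepts(bid):
                return bid
        return None
    ```

    Note that the elicited value depends on the starting bid and step size, which is why such surveys (including this one) test several starting bids to check for anchoring effects.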

  15. Flow field survey near the rotational plane of an advanced design propeller on a JetStar airplane

    NASA Technical Reports Server (NTRS)

    Walsh, K. R.

    1985-01-01

    An investigation was conducted to obtain upper fuselage surface static pressures and boundary layer velocity profiles below the centerline of an advanced design propeller. This investigation documents the upper fuselage velocity flow field in support of the in-flight acoustic tests conducted on a JetStar airplane. Initial results of the boundary layer survey show evidence of an unusual flow disturbance, which is attributed to the two windshield wiper assemblies on the aircraft. The assemblies were removed, eliminating the disturbances from the flow field. This report presents boundary layer velocity profiles at altitudes of 6096 and 9144 m (20,000 and 30,000 ft) and Mach numbers from 0.6 to 0.8, and investigates the effects of the windshield wiper assemblies on these profiles. Because of the unconventional velocity profiles that were obtained with the assemblies mounted, classical boundary layer parameters, such as momentum and displacement thicknesses, are not presented. The effects of flight test variables (Mach number and angles of attack and sideslip) and an advanced design propeller on boundary layer profiles - with the wiper assemblies mounted and removed - are presented.

  16. A New UPLC Method with Chemometric Design-Optimization Approach for the Simultaneous Quantitation of Brimonidine Tartrate and Timolol Maleate in an Eye Drop Preparation.

    PubMed

    Büker, Eda; Dinç, Erdal

    2017-02-01

    A new ultra-performance liquid chromatography (UPLC) method with photodiode array detection was proposed for the quantitation of Brimonidine Tartrate (BRI) and Timolol Maleate (TIM) in eye drops using experimental design and optimization methodology. A 3(3) full factorial design was applied to uncover the effects of the selected factors and their interactions on the chromatographic response function for the optimization of experimental conditions in the development of a new UPLC method. As a result, the optimal chromatographic conditions giving better separation and a short analysis time were found to be 49.2°C for column temperature, 0.38 mL/min for flow rate, and 56.7% (v/v) for 0.1 M CH3COOH used in the mobile phase. The elution times of BRI and TIM were 0.508 and 0.652 min, respectively, within a short runtime of 1.5 min. Calibration graphs for BRI and TIM were obtained by the regression of the concentration on the peak area, detected at 246 and 298 nm, respectively. The method validation was performed by the analysis of synthetic mixtures, intra-day and inter-day samples, and standard addition samples. This study shows that the optimized and validated UPLC method is very promising and suitable for the quantification of BRI and TIM in an eye drop formulation.
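    A 3(3) full factorial design simply enumerates every combination of three levels for each of three factors. The sketch below shows the enumeration in Python; the factor names and level values are hypothetical choices loosely bracketing the reported optima, not the study's actual design points:

    ```python
    from itertools import product

    # Hypothetical levels for the three chromatographic factors
    factors = {
        "column_temp_C": [30.0, 40.0, 50.0],
        "flow_rate_mL_min": [0.30, 0.35, 0.40],
        "acetic_acid_pct": [50.0, 55.0, 60.0],
    }

    # The full factorial design is every combination of one level per factor
    design = [dict(zip(factors, combo)) for combo in product(*factors.values())]
    print(len(design))  # 3^3 = 27 runs
    ```

    Running all 27 combinations (plus replicates) lets the response function be modeled with main effects and interactions, from which the optimum is interpolated.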

  17. Autonomous application of quantitative PCR in the deep sea: in situ surveys of aerobic methanotrophs using the deep-sea environmental sample processor.

    PubMed

    Ussler, William; Preston, Christina; Tavormina, Patricia; Pargett, Doug; Jensen, Scott; Roman, Brent; Marin, Roman; Shah, Sunita R; Girguis, Peter R; Birch, James M; Orphan, Victoria; Scholin, Christopher

    2013-08-20

    Recent advances in ocean observing systems and genomic technologies have led to the development of the deep-sea environmental sample processor (D-ESP). The D-ESP filters particulates from seawater at depths up to 4000 m and applies a variety of molecular assays to the particulates, including quantitative PCR (qPCR), to identify particular organisms and genes in situ. Preserved samples enable laboratory-based validation of in situ results and expanded studies of genomic diversity and gene expression. Tests of the D-ESP at a methane-rich mound in the Santa Monica Basin centered on detection of 16S rRNA and particulate methane monooxygenase (pmoA) genes for two putative aerobic methanotrophs. Comparison of in situ qPCR results with laboratory-based assays of preserved samples demonstrates the D-ESP generated high-quality qPCR data while operating autonomously on the seafloor. Levels of 16S rRNA and pmoA cDNA detected in preserved samples are consistent with an active community of aerobic methanotrophs near the methane-rich mound. These findings are substantiated at low methane sites off Point Conception and in Monterey Bay where target genes are at or below detection limits. Successful deployment of the D-ESP is a major step toward developing autonomous systems to facilitate a wide range of marine microbiological investigations.

  18. THE VIRUS-P EXPLORATION OF NEARBY GALAXIES (VENGA): SURVEY DESIGN, DATA PROCESSING, AND SPECTRAL ANALYSIS METHODS

    SciTech Connect

    Blanc, Guillermo A.; Weinzirl, Tim; Song, Mimi; Heiderman, Amanda; Gebhardt, Karl; Jogee, Shardha; Evans, Neal J. II; Kaplan, Kyle; Marinova, Irina; Vutisalchavakul, Nalin; Van den Bosch, Remco C. E.; Luo Rongxin; Hao Lei; Drory, Niv; Fabricius, Maximilian; Fisher, David; Yoachim, Peter

    2013-05-15

    We present the survey design, data reduction, and spectral fitting pipeline for the VIRUS-P Exploration of Nearby Galaxies (VENGA). VENGA is an integral field spectroscopic survey, which maps the disks of 30 nearby spiral galaxies. Targets span a wide range in Hubble type, star formation activity, morphology, and inclination. The VENGA data cubes have 5.''6 FWHM spatial resolution, {approx}5 A FWHM spectral resolution, sample the 3600 A-6800 A range, and cover large areas typically sampling galaxies out to {approx}0.7R{sub 25}. These data cubes can be used to produce two-dimensional maps of the star formation rate, dust extinction, electron density, stellar population parameters, the kinematics and chemical abundances of both stars and ionized gas, and other physical quantities derived from the fitting of the stellar spectrum and the measurement of nebular emission lines. To exemplify our methods and the quality of the data, we present the VENGA data cube on the face-on Sc galaxy NGC 628 (a.k.a. M 74). The VENGA observations of NGC 628 are described, as well as the construction of the data cube, our spectral fitting method, and the fitting of the stellar and ionized gas velocity fields. We also propose a new method to measure the inclination of nearly face-on systems based on the matching of the stellar and gas rotation curves using asymmetric drift corrections. VENGA will measure relevant physical parameters across different environments within these galaxies, allowing a series of studies on star formation, structure assembly, stellar populations, chemical evolution, galactic feedback, nuclear activity, and the properties of the interstellar medium in massive disk galaxies.

  19. Design, methods and demographic findings of the DEMINVALL survey: a population-based study of Dementia in Valladolid, Northwestern Spain

    PubMed Central

    2012-01-01

    Background This article describes the rationale and design of a population-based survey of dementia in Valladolid (northwestern Spain). The main aim of the study was to assess the epidemiology of dementia and its subtypes. Prevalence of anosognosia in dementia patients, nutritional status, diet characteristics, and determinants of non-diagnosed dementia in the community were studied. The main sociodemographic, educational, and general health status characteristics of the study population are described. Methods Cross-over and cohort, population-based study. A two-phase door-to-door study was performed. Both urban and rural environments were included. In phase 1 (February 2009 – February 2010) 28 trained physicians examined a population of 2,989 subjects (age: ≥ 65 years). The seven-minute screen neurocognitive battery was used. In phase 2 (May 2009 – May 2010) 4 neurologists, 1 geriatrician, and 3 neuropsychologists confirmed the diagnosis of dementia and subtype in patients screened positive by a structured neurological evaluation. Specific instruments to assess anosognosia, the nutritional status and diet characteristics were used. Of the initial sample, 2,170 subjects were evaluated (57% female, mean age 76.5 ± 7.8, 5.2% institutionalized), whose characteristics are described. 227 persons were excluded for various reasons. Among those eligible were 592 non-responders. The attrition bias of non-responders was lower in rural areas. 241 screened positive (11.1%). Discussion The survey will explore some clinical, social and health related life-style variables of dementia. The population size and the diversification of social and educational backgrounds will contribute to a better knowledge of dementia in our environment. PMID:22935626

  20. The VIRUS-P Exploration of Nearby Galaxies (VENGA): Survey Design, Data Processing, and Spectral Analysis Methods

    NASA Astrophysics Data System (ADS)

    Blanc, Guillermo A.; Weinzirl, Tim; Song, Mimi; Heiderman, Amanda; Gebhardt, Karl; Jogee, Shardha; Evans, Neal J., II; van den Bosch, Remco C. E.; Luo, Rongxin; Drory, Niv; Fabricius, Maximilian; Fisher, David; Hao, Lei; Kaplan, Kyle; Marinova, Irina; Vutisalchavakul, Nalin; Yoachim, Peter

    2013-05-01

    We present the survey design, data reduction, and spectral fitting pipeline for the VIRUS-P Exploration of Nearby Galaxies (VENGA). VENGA is an integral field spectroscopic survey, which maps the disks of 30 nearby spiral galaxies. Targets span a wide range in Hubble type, star formation activity, morphology, and inclination. The VENGA data cubes have 5.''6 FWHM spatial resolution, ~5 Å FWHM spectral resolution, sample the 3600 Å-6800 Å range, and cover large areas typically sampling galaxies out to ~0.7R 25. These data cubes can be used to produce two-dimensional maps of the star formation rate, dust extinction, electron density, stellar population parameters, the kinematics and chemical abundances of both stars and ionized gas, and other physical quantities derived from the fitting of the stellar spectrum and the measurement of nebular emission lines. To exemplify our methods and the quality of the data, we present the VENGA data cube on the face-on Sc galaxy NGC 628 (a.k.a. M 74). The VENGA observations of NGC 628 are described, as well as the construction of the data cube, our spectral fitting method, and the fitting of the stellar and ionized gas velocity fields. We also propose a new method to measure the inclination of nearly face-on systems based on the matching of the stellar and gas rotation curves using asymmetric drift corrections. VENGA will measure relevant physical parameters across different environments within these galaxies, allowing a series of studies on star formation, structure assembly, stellar populations, chemical evolution, galactic feedback, nuclear activity, and the properties of the interstellar medium in massive disk galaxies.

  1. National Aquatic Resource Surveys: Use of Geospatial data in their design and spatial prediction at non-monitored locations

    EPA Science Inventory

    The National Aquatic Resource Surveys (NARS) are four surveys conducted by the U.S. Environmental Protection Agency working in collaboration with states, tribal nations and other federal agencies. The surveys are conducted for lakes and reservoirs, streams, estuaries and intracoa...

  2. Design and synthesis of target-responsive hydrogel for portable visual quantitative detection of uranium with a microfluidic distance-based readout device.

    PubMed

    Huang, Yishun; Fang, Luting; Zhu, Zhi; Ma, Yanli; Zhou, Leiji; Chen, Xi; Xu, Dunming; Yang, Chaoyong

    2016-11-15

    Due to uranium's increasing exploitation in nuclear energy and its toxicity to human health, it is of great significance to detect uranium contamination. In particular, development of a rapid, sensitive and portable method is important for personal health care for those who frequently come into contact with uranium ore mining or who investigate leaks at nuclear power plants. The most stable form of uranium in water is the uranyl ion (UO2(2+)). In this work, a UO2(2+)-responsive smart hydrogel was designed and synthesized for rapid, portable, sensitive detection of UO2(2+). A UO2(2+)-dependent DNAzyme complex composed of a substrate strand and an enzyme strand was utilized to crosslink DNA-grafted polyacrylamide chains to form a DNA hydrogel. Colorimetric analysis was achieved by encapsulating gold nanoparticles (AuNPs) in the DNAzyme-crosslinked hydrogel to indicate the concentration of UO2(2+). Without UO2(2+), the enzyme strand is not active. The presence of UO2(2+) in the sample activates the enzyme strand and triggers the cleavage of the substrate strand from the enzyme strand, thereby decreasing the density of crosslinkers and destabilizing the hydrogel, which then releases the encapsulated AuNPs. UO2(2+) concentrations as low as 100 nM were detected by the naked eye. The target-responsive hydrogel was also demonstrated to be applicable in natural water spiked with UO2(2+). Furthermore, to avoid the visual errors caused by naked-eye observation, a previously developed volumetric bar-chart chip (V-Chip) was used to quantitatively detect UO2(2+) concentrations in water by encapsulating Au-Pt nanoparticles in the hydrogel. The UO2(2+) concentrations were visually quantified from the travelling distance of the ink bar on the V-Chip. The method can be used for portable and quantitative detection of uranium in field applications without skilled operators and sophisticated instruments.

  3. Surgical Simulations Based on Limited Quantitative Data: Understanding How Musculoskeletal Models Can Be Used to Predict Moment Arms and Guide Experimental Design

    PubMed Central

    Bednar, Michael S.; Murray, Wendy M.

    2016-01-01

    The utility of biomechanical models and simulations to examine clinical problems is currently limited by the need for extensive amounts of experimental data describing how a given procedure or disease affects the musculoskeletal system. Methods capable of predicting how individual biomechanical parameters are altered by surgery are necessary for the efficient development of surgical simulations. In this study, we evaluate to what extent models based on limited amounts of quantitative data can be used to predict how surgery influences muscle moment arms, a critical parameter that defines how muscle force is transformed into joint torque. We specifically examine proximal row carpectomy and scaphoid-excision four-corner fusion, two common surgeries to treat wrist osteoarthritis. Using models of these surgeries, which are based on limited data and many assumptions, we perform simulations to formulate a hypothesis regarding how these wrist surgeries influence muscle moment arms. Importantly, the hypothesis is based on analysis of only the primary wrist muscles. We then test the simulation-based hypothesis using a cadaveric experiment that measures moment arms of both the primary wrist and extrinsic thumb muscles. The measured moment arms of the primary wrist muscles are used to verify the hypothesis, while those of the extrinsic thumb muscles are used as cross-validation to test whether the hypothesis is generalizable. The moment arms estimated by the models and measured in the cadaveric experiment both indicate that a critical difference between the surgeries is how they alter radial-ulnar deviation versus flexion-extension moment arms at the wrist. Thus, our results demonstrate that models based on limited quantitative data can provide novel insights. 
This work also highlights that synergistically utilizing simulation and experimental methods can aid the design of experiments and make it possible to test the predictive limits of current computer simulation techniques.

  4. A mental health needs assessment of children and adolescents in post-conflict Liberia: results from a quantitative key-informant survey

    PubMed Central

    Borba, Christina P.C.; Ng, Lauren C.; Stevenson, Anne; Vesga-Lopez, Oriana; Harris, Benjamin L.; Parnarouskis, Lindsey; Gray, Deborah A.; Carney, Julia R.; Domínguez, Silvia; Wang, Edward K.S.; Boxill, Ryan; Song, Suzan J.; Henderson, David C.

    2016-01-01

    Between 1989 and 2004, Liberia experienced a devastating civil war that resulted in widespread trauma with almost no mental health infrastructure to help citizens cope. In 2009, the Liberian Ministry of Health and Social Welfare collaborated with researchers from Massachusetts General Hospital to conduct a rapid needs assessment survey in Liberia with local key informants (n = 171) to examine the impact of war and post-war events on emotional and behavioral problems of, functional limitations of, and appropriate treatment settings for Liberian youth aged 5–22. War exposure and post-conflict sexual violence, poverty, infectious disease and parental death negatively impacted youth mental health. Key informants perceived that youth displayed internalizing and externalizing symptoms and mental health-related functional impairment at home, school, work and in relationships. Medical clinics were identified as the most appropriate setting for mental health services. Youth in Liberia continue to endure the harsh social, economic and material conditions of everyday life in a protracted post-conflict state, and have significant mental health needs. Their observed functional impairment due to mental health issues further limited their access to protective factors such as education, employment and positive social relationships. Results from this study informed Liberia's first post-conflict mental health policy. PMID:26807147

  5. A mental health needs assessment of children and adolescents in post-conflict Liberia: results from a quantitative key-informant survey.

    PubMed

    Borba, Christina P C; Ng, Lauren C; Stevenson, Anne; Vesga-Lopez, Oriana; Harris, Benjamin L; Parnarouskis, Lindsey; Gray, Deborah A; Carney, Julia R; Domínguez, Silvia; Wang, Edward K S; Boxill, Ryan; Song, Suzan J; Henderson, David C

    2016-01-02

    Between 1989 and 2004, Liberia experienced a devastating civil war that resulted in widespread trauma with almost no mental health infrastructure to help citizens cope. In 2009, the Liberian Ministry of Health and Social Welfare collaborated with researchers from Massachusetts General Hospital to conduct a rapid needs assessment survey in Liberia with local key informants (n = 171) to examine the impact of war and post-war events on emotional and behavioral problems of, functional limitations of, and appropriate treatment settings for Liberian youth aged 5-22. War exposure and post-conflict sexual violence, poverty, infectious disease and parental death negatively impacted youth mental health. Key informants perceived that youth displayed internalizing and externalizing symptoms and mental health-related functional impairment at home, school, work and in relationships. Medical clinics were identified as the most appropriate setting for mental health services. Youth in Liberia continue to endure the harsh social, economic and material conditions of everyday life in a protracted post-conflict state, and have significant mental health needs. Their observed functional impairment due to mental health issues further limited their access to protective factors such as education, employment and positive social relationships. Results from this study informed Liberia's first post-conflict mental health policy.

  6. A Study Investigating Indian Middle School Students' Ideas of Design and Designers

    ERIC Educational Resources Information Center

    Ara, Farhat; Chunawala, Sugra; Natarajan, Chitra

    2011-01-01

    This paper reports on an investigation into middle school students' naive ideas about, and attitudes towards design and designers. The sample for the survey consisted of students from Classes 7 to 9 from a school located in Mumbai. The data were analysed qualitatively and quantitatively to look for trends in students' responses. Results show that…

  7. How does the size and shape of local populations in China compare to general anthropometric surveys currently used for product design?

    PubMed

    Daniell, Nathan; Fraysse, François; Paul, Gunther

    2012-01-01

    Anthropometry has long been used for a range of ergonomic applications and product design. Although products are often designed for specific cohorts, anthropometric data are typically sourced from large-scale surveys representative of the general population. Additionally, few data are available for emerging markets like China and India. This study measured 80 Chinese males who were representative of a specific cohort targeted for the design of a new product. Thirteen anthropometric measurements were recorded and compared to two large databases that represented a general population, a Chinese database and a Western database. Substantial differences were identified between the Chinese males measured in this study and both databases. The subjects were substantially taller, heavier and broader than subjects in the older Chinese database. However, they were still substantially smaller, lighter and thinner than Western males. Data from current Western anthropometric surveys are unlikely to accurately represent the target population for product designers and manufacturers in emerging markets like China.

  8. Surveying the Commons: Current Implementation of Information Commons Web sites

    ERIC Educational Resources Information Center

    Leeder, Christopher

    2009-01-01

    This study assessed the content of 72 academic library Information Commons (IC) Web sites using content analysis, quantitative assessment and qualitative surveys of site administrators to analyze current implementation by the academic library community. Results show that IC Web sites vary widely in content, design and functionality, with few…

  9. Mental health clinicians' attitudes about consumer and consumer consultant participation in Australia: A cross-sectional survey design.

    PubMed

    McCann, Terence V; Clark, Eileen; Baird, John; Lu, Sai

    2008-06-01

    The purpose of this study was to assess mental health clinicians' attitudes about mental health consumer participation in inpatient psychiatric units. A cross-sectional survey design was used with a non-probability sample of 47 clinicians in the psychiatric units of a large Australian hospital. The results showed that gender, length of time as a clinician, and how long the staff worked in the units influenced their attitudes about consumer involvement. Females were more likely than males to support consumer participation in management and consumer consultants. Less experienced staff showed greater support than more experienced staff for mental health consumer involvement in treatment-related matters and consumer consultants in units. New staff members were more likely to register agreement-to-uncertainty regarding consumer involvement in treatment-related issues, whereas established staff members were more likely to record uncertainty about this issue. The findings showed that although reports and policies promoted participation, some clinicians were reluctant to accept consumer and consultant involvement.

  10. The Cornella Health Interview Survey Follow-Up (CHIS.FU) Study: design, methods, and response rate

    PubMed Central

    Garcia, Montse; Schiaffino, Anna; Fernandez, Esteve; Marti, Merce; Salto, Esteve; Perez, Gloria; Peris, Merce; Borrell, Carme; Nieto, F Javier; Borras, Josep Maria

    2003-01-01

    Background The aim of this report is to describe the main characteristics of the design, including response rates, of the Cornella Health Interview Survey Follow-up Study. Methods The original cohort consisted of 2,500 subjects (1,263 women and 1,237 men) interviewed as part of the 1994 Cornella Health Interview Study. A record linkage to update the address and vital status of the cohort members was carried out using first a deterministic method and then a probabilistic one based on each subject's first name and surnames. Subsequently, we attempted to locate the cohort members to conduct the phone follow-up interviews. A pilot study was carried out to test the overall feasibility and to modify some procedures before the field work began. Results After record linkage, 2,468 (98.7%) subjects were successfully traced. Of these, 91 (3.6%) were deceased, 259 (10.3%) had moved to other towns, and 50 (2.0%) had neither renewed their last municipal census documents nor declared having moved. After using different strategies to track and to retain cohort members, we traced 92% of the CHIS participants. Of those traced, 1,605 subjects answered the follow-up questionnaire. Conclusion The computerized record linkage maximized the success of the follow-up, which was carried out 7 years after the baseline interview. The pilot study was useful to increase the efficiency of tracing and interviewing the respondents. PMID:12665430
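    The two-stage linkage described here (a deterministic pass, then a probabilistic one based on first name and surnames) can be sketched as follows. The registry, threshold, and similarity scoring below are illustrative assumptions, not the study's actual algorithm:

```python
from difflib import SequenceMatcher

def link_record(target, registry, threshold=0.85):
    """Deterministic pass (exact name match) first, then a simple
    probabilistic fallback scored by average string similarity."""
    key = (target["first"].lower(), target["surname"].lower())
    for rec in registry:  # deterministic pass
        if (rec["first"].lower(), rec["surname"].lower()) == key:
            return rec, 1.0
    def score(rec):  # probabilistic fallback
        s1 = SequenceMatcher(None, target["first"].lower(), rec["first"].lower()).ratio()
        s2 = SequenceMatcher(None, target["surname"].lower(), rec["surname"].lower()).ratio()
        return (s1 + s2) / 2
    best = max(registry, key=score)
    return (best, score(best)) if score(best) >= threshold else (None, 0.0)

# Hypothetical registry; a slightly different first-name spelling still links.
registry = [{"first": "Montse", "surname": "Garcia"},
            {"first": "Esteve", "surname": "Fernandez"}]
match, confidence = link_record({"first": "Montserrat", "surname": "Garcia"}, registry)
```

    Production probabilistic linkage would weight each field by its discriminating power (Fellegi-Sunter style) rather than averaging raw similarities, but the deterministic-then-probabilistic cascade is the same.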

  11. Designing an Effective Survey

    DTIC Science & Technology

    2005-09-01

    To use this analogy again, the sip (sample) determines the taste whether it comes from a large pot or a small one; the size of the soup pot doesn't really matter. The sample (sip) represents the population (the pot of soup) whether the population is big or small. When your population size is small, however, the size of the pot does begin to matter, and you can employ the finite population correction.
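    The finite population correction mentioned in this excerpt can be made concrete with the standard sample-size formula for a proportion; the function and defaults below are generic illustrations, not taken from the handbook:

```python
import math

def sample_size(e=0.05, z=1.96, p=0.5, N=None):
    """Sample size for estimating a proportion at margin of error e
    (95% confidence by default). When the population size N is given,
    the finite population correction shrinks the requirement."""
    n0 = (z ** 2) * p * (1 - p) / e ** 2        # infinite-population formula
    if N is None:
        return math.ceil(n0)
    return math.ceil(n0 / (1 + (n0 - 1) / N))   # finite population correction

big_pot = sample_size(N=1_000_000)  # large population: correction is negligible
small_pot = sample_size(N=500)      # small population: correction matters
```

    With a very large "pot" the corrected answer is essentially the uncorrected 385; with a population of only 500 it drops to about 218, which is why the pot's size only begins to matter when it is small.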

  12. Heterosis for biomass-related traits in Arabidopsis investigated by quantitative trait loci analysis of the triple testcross design with recombinant inbred lines.

    PubMed

    Kusterer, Barbara; Piepho, Hans-Peter; Utz, H Friedrich; Schön, Chris C; Muminovic, Jasmina; Meyer, Rhonda C; Altmann, Thomas; Melchinger, Albrecht E

    2007-11-01

    Arabidopsis thaliana has emerged as a leading model species in plant genetics and functional genomics including research on the genetic causes of heterosis. We applied a triple testcross (TTC) design and a novel biometrical approach to identify and characterize quantitative trait loci (QTL) for heterosis of five biomass-related traits by (i) estimating the number, genomic positions, and genetic effects of heterotic QTL, (ii) characterizing their mode of gene action, and (iii) testing for presence of epistatic effects by a genomewide scan and marker x marker interactions. In total, 234 recombinant inbred lines (RILs) of Arabidopsis hybrid C24 x Col-0 were crossed to both parental lines and their F1 and analyzed with 110 single-nucleotide polymorphism (SNP) markers. QTL analyses were conducted using linear transformations Z1, Z2, and Z3 calculated from the adjusted entry means of TTC progenies. With Z1, we detected 12 QTL displaying augmented additive effects. With Z2, we mapped six QTL for augmented dominance effects. A one-dimensional genome scan with Z3 revealed two genomic regions with significantly negative dominance x additive epistatic effects. Two-way analyses of variance between marker pairs revealed nine digenic epistatic interactions: six reflecting dominance x dominance effects with variable sign and three reflecting additive x additive effects with positive sign. We conclude that heterosis for biomass-related traits in Arabidopsis has a polygenic basis with overdominance and/or epistasis being presumably the main types of gene action.
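    The abstract does not spell out the linear transformations Z1, Z2, and Z3. As a plausible illustration only (the paper's exact definitions may differ), the sketch below computes classical Kearsey-Jinks-style testcross contrasts from made-up means of RILs crossed to P1, P2, and the F1:

```python
# Hypothetical adjusted entry means for four RILs in each testcross family.
L1 = [10.2, 11.5, 9.8, 10.9]   # RIL x P1 progenies
L2 = [9.7, 10.1, 10.4, 11.2]   # RIL x P2 progenies
L3 = [10.5, 11.0, 10.0, 11.3]  # RIL x F1 progenies

Z1 = [a + b for a, b in zip(L1, L2)]                 # additive-type contrast
Z2 = [a - b for a, b in zip(L1, L2)]                 # dominance-type contrast
Z3 = [a + b - 2 * c for a, b, c in zip(L1, L2, L3)]  # epistasis-type contrast
```

    Each Z series would then serve as the response variable in its own genome-wide QTL scan, which is how a single TTC experiment can separate additive, dominance, and epistatic signals.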

  13. Biological effect of low-head sea lamprey barriers: Designs for extensive surveys and the value of incorporating intensive process-oriented research

    USGS Publications Warehouse

    Hayes, D.B.; Baylis, J.R.; Carl, L.M.; Dodd, H.R.; Goldstein, J.D.; McLaughlin, R.L.; Noakes, D.L.G.; Porto, L.M.

    2003-01-01

    Four sampling designs for quantifying the effect of low-head sea lamprey (Petromyzon marinus) barriers on fish communities were evaluated, and the contribution of process-oriented research to the overall confidence of results obtained was discussed. The designs include: (1) sample barrier streams post-construction; (2) sample barrier and reference streams post-construction; (3) sample barrier streams pre- and post-construction; and (4) sample barrier and reference streams pre- and post-construction. In the statistical literature, the principal basis for comparison of sampling designs is generally the precision achieved by each design. In addition to precision, designs should be compared based on the interpretability of results and on the scale to which the results apply. Using data collected in a broad survey of streams with and without sea lamprey barriers, some of the tradeoffs that occur among precision, scale, and interpretability are illustrated. Although circumstances such as funding and availability of pre-construction data may limit which design can be implemented, a pre/post-construction design including barrier and reference streams provides the most meaningful information for use in barrier management decisions. Where it is not feasible to obtain pre-construction data, a design including reference streams is important to maintain the interpretability of results. Regardless of the design used, process-oriented research provides a framework for interpreting results obtained in broad surveys. As such, information from both extensive surveys and intensive process-oriented research provides the best basis for fishery management actions, and gives researchers and managers the most confidence in the conclusions reached regarding the effects of sea lamprey barriers.
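    Design (4), sampling barrier and reference streams both pre- and post-construction, is a before-after control-impact (BACI) layout whose effect estimate is a difference-in-differences. A minimal sketch with invented abundance means (not data from the study):

```python
def baci_effect(impact_before, impact_after, control_before, control_after):
    """Change at barrier streams minus the change at reference streams
    over the same period (difference-in-differences)."""
    return (impact_after - impact_before) - (control_after - control_before)

# Hypothetical mean fish abundances (fish per 100 m of stream).
effect = baci_effect(impact_before=42.0, impact_after=30.0,
                     control_before=40.0, control_after=38.0)
```

    Here barrier streams declined by 12 while reference streams declined by 2, so the barrier-attributable effect is -10; subtracting the background trend in this way is exactly what the weaker designs (1)-(3) cannot do.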

  14. QUANTITATIVE MORPHOLOGY

    EPA Science Inventory

    Abstract: In toxicology, the role of quantitative assessment of brain morphology can be understood in the context of two types of treatment-related alterations. One type of alteration is specifically associated with treatment and is not observed in control animals. Measurement ...

  15. Geothermal energy as a source of electricity. A worldwide survey of the design and operation of geothermal power plants

    SciTech Connect

    DiPippo, R.

    1980-01-01

    An overview of geothermal power generation is presented. A survey of geothermal power plants is given for the following countries: China, El Salvador, Iceland, Italy, Japan, Mexico, New Zealand, Philippines, Turkey, USSR, and USA. A survey of countries planning geothermal power plants is included. (MHR)

  16. A Mixed Model Design Study of RN to BS Distance Learning: Survey of Graduates' Perceptions of Strengths and Challenges

    ERIC Educational Resources Information Center

    Lock, Leonard K.; Schnell, Zoanne; Pratt-Mullen, Jerrilynn

    2011-01-01

    This article reports on findings from a survey administered to graduates of a distance learning RN-to-BS completion program. A questionnaire was constructed to examine graduate experiences and perceptions regarding distance learning formats, course content, time management, student empowerment, and program support. A total of 251 surveys were…

  17. Bone mass status of school-aged children in Taiwan assessed by quantitative ultrasound: the Nutrition and Health Survey in Taiwan Elementary School Children (NAHSIT Children 2001-2002).

    PubMed

    Lin, Yi-Chin; Tu, Su-Hao; Pan, Wen-Harn L

    2007-01-01

    Bone health status in childhood and adolescence may be an important factor influencing the attainment of peak bone mass. The Nutrition and Health Survey in Taiwan Elementary School Children 2000-2001 was carried out to evaluate the overall nutrition and health status of school children aged between 6 and 13 years. The survey was conducted using a multi-stage complex sampling scheme. Townships and city districts in Taiwan were classified into 13 strata. Bone mass, measured as broadband ultrasound attenuation (BUA), was taken at the heel by quantitative ultrasound bone densitometry. A total of 1164 boys and 1016 girls who had complete physical examination data with ultrasound bone scan were included in the current analysis. There were no apparent differences in BUA across all strata for either gender. In both boys and girls, age, height, body weight, BMI, and intake frequencies of vegetables and fruits/juices were significantly related to BUA. Results of multivariate regression showed that age (beta=1.36, p=0.0002) and body weight (beta=0.40, p<0.0001) were significant predictors of BUA in boys, whereas in girls body weight (beta=0.47, p<0.0001), height (beta=0.20, p=0.01), dietary phosphorus intake (beta=-0.002, p=0.038), and frequency of fruit/juice intake (beta=0.15, p=0.029) remained statistically significant. The differential effects of dietary intake variables on BUA in boys and girls may be due in part to the development of puberty. It would be necessary to include levels of physical activity in future analyses to better understand the factors influencing the development of peak bone mass in Taiwanese children.
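    A multivariate regression of the kind reported (BUA on age and body weight in boys) can be sketched with ordinary least squares. The data below are simulated, with coefficients chosen only to echo the reported betas; the seed, sample size, and noise level are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
age = rng.uniform(6, 13, n)       # years, matching the surveyed age range
weight = rng.uniform(18, 55, n)   # kg
# Simulated BUA with effects resembling the boys' model plus noise.
bua = 40 + 1.36 * age + 0.40 * weight + rng.normal(0, 2, n)

X = np.column_stack([np.ones(n), age, weight])  # intercept, age, weight
coef, *_ = np.linalg.lstsq(X, bua, rcond=None)  # [intercept, b_age, b_weight]
```

    The fitted coefficients recover the simulated effects, illustrating how a beta of 1.36 for age is read: each additional year predicts about 1.36 units more BUA, holding weight fixed.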

  18. Second-generation Cobas AmpliPrep/Cobas TaqMan HCV quantitative test for viral load monitoring: a novel dual-probe assay design.

    PubMed

    Zitzer, Heike; Heilek, Gabrielle; Truchon, Karine; Susser, Simone; Vermehren, Johannes; Sizmann, Dorothea; Cobb, Bryan; Sarrazin, Christoph

    2013-02-01

    Hepatitis C virus (HCV) RNA viral load (VL) monitoring is a well-established diagnostic tool for the management of chronic hepatitis C patients. HCV RNA VL results are used to make treatment decisions with the goal of therapy to achieve an undetectable VL result. Therefore, a sensitive assay with high specificity in detecting and accurately quantifying HCV RNA across genotypes is critical. Additionally, a lower sample volume requirement is desirable for the laboratory and the patient. This study evaluated the performance characteristics of a second-generation real-time PCR assay, the Cobas AmpliPrep/Cobas TaqMan HCV quantitative test, version 2.0 (CAP/CTM HCV test, v2.0), designed with a novel dual-probe approach and an optimized automated extraction and amplification procedure. The new assay demonstrated a limit of detection and lower limit of quantification of 15 IU/ml across all HCV genotypes and was linear from 15 to 100,000,000 IU/ml with high accuracy (<0.2-log(10) difference) and precision (standard deviation of 0.04 to 0.22 log(10)). A specificity of 100% was demonstrated with 600 HCV-seronegative specimens without cross-reactivity or interference. Correlation to the Cobas AmpliPrep/Cobas TaqMan HCV test (version 1) was good (n = 412 genotype 1 to 6 samples, R(2) = 0.88; R(2) = 0.94 without 105 genotype 4 samples). Paired plasma and serum samples showed similar performance (n = 25, R(2) = 0.99). The sample input volume was reduced from 1 to 0.65 ml in the second version. The CAP/CTM HCV test, v2.0, demonstrated excellent performance and sensitivity across all HCV genotypes with a smaller sample volume. The new HCV RNA VL assay has performance characteristics that make it suitable for use with currently available direct-acting antiviral agents.
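    Viral-load accuracy claims such as "<0.2-log(10) difference" compare measured and nominal titres on the log10 scale. A minimal sketch with an invented measurement (the panel values are not from the study):

```python
import math

def log10_diff(measured_iu_ml, nominal_iu_ml):
    """Quantification bias on the log10 scale, the conventional unit
    for viral load comparisons."""
    return math.log10(measured_iu_ml) - math.log10(nominal_iu_ml)

# A measurement of 82,000 IU/ml against a nominal 100,000 IU/ml panel member:
bias = log10_diff(82_000, 100_000)
within_spec = abs(bias) < 0.2   # meets an accuracy claim of <0.2 log10
```

    A bias of about -0.09 log10 means the assay read roughly 18% low in linear terms, comfortably inside a 0.2-log10 accuracy window.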

  19. Application of Screening Experimental Designs to Assess Chromatographic Isotope Effect upon Isotope-Coded Derivatization for Quantitative Liquid Chromatography–Mass Spectrometry

    PubMed Central

    2015-01-01

    Isotope effect may cause partial chromatographic separation of labeled (heavy) and unlabeled (light) isotopologue pairs. Together with a simultaneous matrix effect, this could lead to unacceptable accuracy in quantitative liquid chromatography–mass spectrometry assays, especially when electrospray ionization is used. Four biologically relevant reactive aldehydes (acrolein, malondialdehyde, 4-hydroxy-2-nonenal, and 4-oxo-2-nonenal) were derivatized with light or heavy (d3-, 13C6-, 15N2-, or 15N4-labeled) 2,4-dinitrophenylhydrazine and used as model compounds to evaluate chromatographic isotope effects. For comprehensive assessment of retention time differences between light/heavy pairs under various gradient reversed-phase liquid chromatography conditions, major chromatographic parameters (stationary phase, mobile phase pH, temperature, organic solvent, and gradient slope) and different isotope labelings were addressed by multiple-factor screening using experimental designs that included both asymmetrical (Addelman) and Plackett–Burman schemes followed by statistical evaluations. Results confirmed that the most effective approach to avoid chromatographic isotope effect is the use of 15N or 13C labeling instead of deuterium labeling, while chromatographic parameters had no general influence. Comparison of the alternate isotope-coded derivatization assay (AIDA) using deuterium versus 15N labeling gave unacceptable differences (>15%) upon quantifying some of the model aldehydes from biological matrixes. On the basis of our results, we recommend the modification of the AIDA protocol by replacing d3-2,4-dinitrophenylhydrazine with 15N- or 13C-labeled derivatizing reagent to avoid possible unfavorable consequences of chromatographic isotope effects. PMID:24922593
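    The Plackett-Burman scheme named above can be illustrated with the classic 12-run construction for up to 11 two-level factors. This is a generic sketch using the standard generator row from Plackett and Burman (1946), not the authors' actual design matrix:

```python
import numpy as np

def plackett_burman_12():
    """12-run Plackett-Burman design: cycle the generator row to get
    11 runs, then append a row of all -1s."""
    gen = np.array([+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1])
    rows = [np.roll(gen, i) for i in range(11)]
    rows.append(-np.ones(11, dtype=int))
    return np.array(rows)

D = plackett_burman_12()
# Every factor appears at +1 and -1 equally often, and all pairs of
# columns are orthogonal, so 11 main effects can be screened in 12 runs.
```

    That run economy is why such designs suit multiple-factor screening: here, five chromatographic parameters plus label type could share one small experiment instead of a full factorial.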

  20. Rational design of novel anti-microtubule agent (9-azido-noscapine) from quantitative structure activity relationship (QSAR) evaluation of noscapinoids.

    PubMed

    Santoshi, Seneha; Naik, Pradeep K; Joshi, Harish C

    2011-10-01

    An anticough medicine, noscapine [(S)-3-((R)4-methoxy-6-methyl-5,6,7,8-tetrahydro-[1,3]dioxolo[4,5-g]isoquinolin-5-yl)-6,7-dimethoxyiso-benzofuran-1(3H)-one], was discovered in the authors' laboratory as a novel type of tubulin-binding agent that mitigates polymerization dynamics of microtubule polymers without changing overall subunit-polymer equilibrium. To obtain systematic insight into the relationship between the structural framework of noscapine scaffold and its antitumor activity, the authors synthesized strategic derivatives (including two new ones in this article). The IC(50) values of these analogs vary from 1.2 to 56.0 µM in human acute lymphoblastic leukemia cells (CEM). Geometrical optimization was performed using semiempirical quantum chemical calculations at the 3-21G* level. Structures were in agreement with nuclear magnetic resonance analysis of molecular flexibility in solution and crystal structures. A genetic function approximation algorithm of variable selection was used to generate the quantitative structure activity relationship (QSAR) model. The robustness of the QSAR model (R(2) = 0.942) was analyzed by values of the internal cross-validated regression coefficient (R(2) (LOO) = 0.815) for the training set and determination coefficient (R(2) (test) = 0.817) for the test set. Validation was achieved by rational design of further novel and potent antitumor noscapinoid, 9-azido-noscapine, and reduced 9-azido-noscapine. The experimentally determined value of pIC(50) for both the compounds (5.585 M) turned out to be very close to predicted pIC(50) (5.731 and 5.710 M).
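    The pIC50 values quoted above follow from the standard conversion pIC50 = -log10(IC50 in mol/L). A small sketch applying it to the reported IC50 range (function and variable names are ours):

```python
import math

def pic50(ic50_micromolar):
    """Convert an IC50 given in micromolar to pIC50; higher pIC50
    means greater potency."""
    return -math.log10(ic50_micromolar * 1e-6)

# Reported IC50 range of the noscapine analogs: 1.2 to 56.0 micromolar.
most_potent = pic50(1.2)    # about 5.92
least_potent = pic50(56.0)  # about 4.25
```

    On this scale the experimentally determined pIC50 of 5.585 corresponds to an IC50 of roughly 2.6 micromolar, which is how the predicted and measured potencies of 9-azido-noscapine can be compared directly.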

  1. Quantitative glycomics.

    PubMed

    Orlando, Ron

    2010-01-01

    The ability to quantitatively determine changes is an essential component of comparative glycomics. Multiple strategies are available by which this can be accomplished. These include label-free approaches and strategies where an isotopic label is incorporated into the glycans prior to analysis. The focus of this chapter is to describe each of these approaches while providing insight into their strengths and weaknesses, so that glycomic investigators can make an educated choice of the strategy that is best suited for their particular application.

  2. Bovine tuberculosis infection in wild mammals in the South-West region of England: a survey of prevalence and a semi-quantitative assessment of the relative risks to cattle.

    PubMed

    Delahay, R J; Smith, G C; Barlow, A M; Walker, N; Harris, A; Clifton-Hadley, R S; Cheeseman, C L

    2007-03-01

    In the United Kingdom, badgers are implicated in the transmission of Mycobacterium bovis to cattle, but little information is available on the potential role of other wild mammals. This paper presents the results of the largest systematic UK survey of M. bovis infection in other wild mammals. Mammal carcasses (4715) from throughout the South-West region of England were subjected to a systematic post mortem examination, microbiological culture of tissues and spoligotyping of isolates. Infection was confirmed in fox, stoat, polecat, common shrew, yellow-necked mouse, wood mouse, field vole, grey squirrel, roe deer, red deer, fallow deer and muntjac. Prevalence in deer may have been underestimated because the majority were incomplete carcasses, which reduced the likelihood of detecting infection. Infected cases were found in Wiltshire, Somerset, Devon and Cornwall, Gloucestershire and Herefordshire. Lesions were found in a high proportion of spoligotype-positive fallow, red and roe deer, and in a single fox, stoat and muntjac. M. bovis spoligotypes occurred at a frequency similar to that in cattle and badgers. Data on the prevalence, pathology, abundance and ecology of wild mammals were integrated in a semi-quantitative risk assessment of the likelihood of transmission to cattle relative to badgers. Although most species presented a relatively low risk, the higher values and uncertainty associated with muntjac, roe, red and in particular fallow deer suggest that they require further investigation. The results suggest that deer should be considered as potential, although probably localised, sources of infection for cattle.

  3. Doctoral Training in Statistics, Measurement, and Methodology in Psychology: Replication and Extension of Aiken, West, Sechrest, and Reno's (1990) Survey of PhD Programs in North America

    ERIC Educational Resources Information Center

    Aiken, Leona S.; West, Stephen G.; Millsap, Roger E.

    2008-01-01

    In a survey of all PhD programs in psychology in the United States and Canada, the authors documented the quantitative methodology curriculum (statistics, measurement, and research design) to examine the extent to which innovations in quantitative methodology have diffused into the training of PhDs in psychology. In all, 201 psychology PhD…

  4. Points of View: A Survey of Survey Courses--Are They Effective? Running out of Hands: Designing a Modern Biology Curriculum

    ERIC Educational Resources Information Center

    Eisen, Arri

    2005-01-01

    What makes a good teacher? What makes a good curriculum? While these two questions are intimately related, they are different. This author states that when he reflects on his best teachers, he cannot separate the person from what the person taught. On the other hand, when designing a curriculum, it is important to figure out what to teach and how,…

  5. Rules for the preparation of manuscript and illustrations designed for publication by the United States Geological Survey

    USGS Publications Warehouse

    Hampson, Thomas

    1888-01-01

    In the annual report of the Director of the U. S. Geological Survey for 1885-'86, pages 40 and 41, you set forth the functions of the chief of the editorial division as follows: "To secure clear and accurate statement in the material sent to press, careful proof-reading, and uniformity in the details of book-making, as well as to assist the Director in exercising a general supervision over the publications of the Survey."

  6. Yeasts in floral nectar: a quantitative survey

    PubMed Central

    Herrera, Carlos M.; de Vega, Clara; Canto, Azucena; Pozo, María I.

    2009-01-01

    Background and Aims One peculiarity of floral nectar that remains relatively unexplored from an ecological perspective is its role as a natural habitat for micro-organisms. This study assesses the frequency of occurrence and abundance of yeast cells in floral nectar of insect-pollinated plants from three contrasting plant communities on two continents. Possible correlations between interspecific differences in yeast incidence and pollinator composition are also explored. Methods The study was conducted at three widely separated areas, two in the Iberian Peninsula (Spain) and one in the Yucatán Peninsula (Mexico). Floral nectar samples from 130 species (37–63 species per region) in 44 families were examined microscopically for the presence of yeast cells. For one of the Spanish sites, the relationship across species between incidence of yeasts in nectar and the proportion of flowers visited by each of five major pollinator categories was also investigated. Key Results Yeasts occurred regularly in the floral nectar of many species, where they sometimes reached extraordinary densities (up to 4 × 10(5) cells mm(-3)). Depending on the region, between 32 and 44 % of all nectar samples contained yeasts. Yeast cell densities in the order of 10(4) cells mm(-3) were commonplace, and densities >10(5) cells mm(-3) were not rare. About one-fifth of species at each site had mean yeast cell densities >10(4) cells mm(-3). Across species, yeast frequency and abundance were directly correlated with the proportion of floral visits by bumble-bees, and inversely with the proportion of visits by solitary bees. Conclusions Incorporating nectar yeasts into the scenario of plant–pollinator interactions opens up a number of intriguing avenues for research. 
In addition, with yeasts being as ubiquitous and abundant in floral nectars as revealed by this study, and given their astounding metabolic versatility, studies focusing on nectar chemical features should carefully control for the presence of yeasts in nectar samples. PMID:19208669

  7. Survey Sense.

    ERIC Educational Resources Information Center

    Pollick, Anne M.

    1995-01-01

    This article provides advice on how to plan and conduct an alumni census through the mail, drawing on the experiences of Stonehill College in North Easton, Massachusetts, which undertook such a survey in 1992. It focuses on costs, information needs, questionnaire design, mailing considerations, reporting the results, and expected response rates.…

  8. Surveys: an introduction.

    PubMed

    Rubenfeld, Gordon D

    2004-10-01

    Surveys are a valuable research tool for studying the knowledge, attitudes, and behavior of a study population. This article explores quantitative analyses of written questionnaires as instruments for survey research. Obtaining accurate and precise information from a survey requires minimizing the possibility of bias from inappropriate sampling or a flawed survey instrument, and this article describes strategies to minimize sampling bias by increasing response rates, comparing responders to nonresponders, and identifying the appropriate sampling population. It is crucial that the survey instrument be valid, meaning that it actually measures what the investigator intends it to measure. In developing a valid survey instrument, it can be useful to adapt survey instruments that were developed by other researchers and to conduct extensive pilot-testing of your survey instrument.

  9. EuropeaN Energy balance Research to prevent excessive weight Gain among Youth (ENERGY) project: Design and methodology of the ENERGY cross-sectional survey

    PubMed Central

    2011-01-01

    Background Obesity treatment is by and large ineffective in the long term, and more emphasis on the prevention of excessive weight gain in childhood and adolescence is warranted. To inform energy balance related behaviour (EBRB) change interventions, insight into the potential personal, family and school environmental correlates of these behaviours is needed. Studies on such multilevel correlates of EBRB among schoolchildren in Europe are lacking. The ENERGY survey aims to (1) provide up-to-date prevalence rates of measured overweight, obesity, self-reported engagement in EBRBs, and objective accelerometer-based assessment of physical activity and sedentary behaviour and blood-sample biomarkers of metabolic function in countries in different regions of Europe, and (2) identify personal, family and school environmental correlates of these EBRBs. This paper describes the design, methodology and protocol of the survey. Method/Design A school-based cross-sectional survey was carried out in 2010 in seven different European countries: Belgium, Greece, Hungary, the Netherlands, Norway, Slovenia, and Spain. The survey included measurements of anthropometrics, child, parent and school-staff questionnaires, and school observations to measure and assess outcomes (i.e. height, weight, and waist circumference), EBRBs and potential personal, family and school environmental correlates of these behaviours, including social-cultural, physical, political, and economic environmental factors. In addition, a selection of countries conducted accelerometer measurements to objectively assess physical activity and sedentary behaviour, and collected blood samples to assess several biomarkers of metabolic function. Discussion The ENERGY survey is a comprehensive cross-sectional study measuring anthropometrics and biomarkers as well as assessing a range of EBRBs and their potential correlates at the personal, family and school level, among 10-12 year old children in seven European countries.
This study…

  10. A survey of surveys

    SciTech Connect

    Kent, S.M.

    1994-11-01

    A new era for the field of Galactic structure is about to be opened with the advent of wide-area digital sky surveys. In this article, the author reviews the status and prospects for research for 3 new ground-based surveys: the Sloan Digital Sky Survey (SDSS), the Deep Near-Infrared Survey of the Southern Sky (DENIS) and the Two Micron All Sky Survey (2MASS). These surveys will permit detailed studies of Galactic structure and stellar populations in the Galaxy with unprecedented detail. Extracting the information, however, will be challenging.

  11. How To Sample in Surveys. The Survey Kit, Volume 6.

    ERIC Educational Resources Information Center

    Fink, Arlene

    The nine-volume Survey Kit is designed to help readers prepare and conduct surveys and become better users of survey results. All the books in the series contain instructional objectives, exercises and answers, examples of surveys in use, illustrations of survey questions, guidelines for action, checklists of "dos and don'ts," and…

  12. Development and use of a multiplex real-time quantitative polymerase chain reaction assay for detection and differentiation of Porcine circovirus-2 genotypes 2a and 2b in an epidemiological survey.

    PubMed

    Gagnon, Carl A; del Castillo, Jérome R E; Music, Nedzad; Fontaine, Guy; Harel, Josée; Tremblay, Donald

    2008-09-01

    By the end of 2004, the Canadian swine population had experienced a severe increase in the incidence of Porcine circovirus-associated disease (PCVAD), a problem that was associated with the emergence of a new Porcine circovirus-2 genotype (PCV-2b), previously unrecovered in North America. Thus, it became important to develop a diagnostic tool that could differentiate between the old and new circulating genotypes (PCV-2a and PCV-2b, respectively). Consequently, a multiplex real-time quantitative polymerase chain reaction (mrtqPCR) assay that could sensitively and specifically identify and differentiate PCV-2 genotypes was developed. A retrospective epidemiologic survey that used the mrtqPCR assay was performed to determine if cofactors could affect the risk of PCVAD. From 121 PCV-2-positive cases gathered for this study, 4.13%, 92.56%, and 3.31% were positive for PCV-2a, PCV-2b, and both genotypes, respectively. In a data analysis using univariate logistic regressions, the PCVAD-compatible (PCVAD/c) score was significantly associated with the presence of Porcine reproductive and respiratory syndrome virus (PRRSV), PRRSV viral load, PCV-2 viral load, and PCV-2 immunohistochemistry (IHC) results. Polytomous logistic regression analysis revealed that PCVAD/c score was affected by PCV-2 viral load (P = 0.0161) and IHC (P = 0.0128), but not by the PRRSV variables (P > 0.9), which suggests that mrtqPCR in tissue is a reliable alternative to IHC. Logistic regression analyses revealed that PCV-2 increased the odds ratio of isolating 2 major swine pathogens of the respiratory tract, Actinobacillus pleuropneumoniae and Streptococcus suis serotypes 1/2, 1, 2, 3, 4, and 7, which are serotypes commonly associated with clinical diseases.
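
The odds ratios reported above come from logistic regression; for a single binary exposure, an unadjusted odds ratio reduces to the cross-product of a 2 × 2 table. A minimal sketch of that calculation (the counts below are invented for illustration, not taken from the study):

```python
def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table:
                 outcome+   outcome-
    exposed+        a          b
    exposed-        c          d
    """
    return (a * d) / (b * c)

# Hypothetical counts: pathogen isolated vs. not, by PCV-2 status.
print(odds_ratio(30, 20, 10, 40))  # -> 6.0
```

An odds ratio above 1 here would indicate, as in the study, increased odds of co-isolating the pathogen when PCV-2 is present.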

  13. 78 FR 5458 - Medicare Program; Request for Information To Aid in the Design and Development of a Survey...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-25

    ... new health care delivery models.'' The survey, now under development, hereinafter referred to as the... Hospice Care AGENCY: Centers for Medicare & Medicaid Services (CMS), HHS. ACTION: Request for information... experiences with hospice care. DATES: The information solicited in this notice must be received at the...

  14. A WHOLE-LAKE WATER QUALITY SURVEY OF LAKE OAHE BASED ON A SPATIALLY-BALANCED PROBABILISTIC DESIGN

    EPA Science Inventory

    Assessing conditions on large bodies of water presents multiple statistical and logistical challenges. As part of the Upper Missouri River Program of the Environmental Monitoring and Assessment Project (EMAP) we surveyed water quality of Lake Oahe in July-August, 2002 using a spat...

  15. [The first wave of the German Health Interview and Examination Survey for Adults (DEGS1): sample design, response, weighting and representativeness].

    PubMed

    Kamtsiuris, P; Lange, M; Hoffmann, R; Schaffrath Rosario, A; Dahm, S; Kuhnert, R; Kurth, B M

    2013-05-01

    The "German Health Interview and Examination Survey for Adults" (DEGS) is part of the health monitoring program of the Robert Koch Institute (RKI) and is designed as a combined cross-sectional and longitudinal survey. The first wave (DEGS1; 2008-2011) comprised interviews and physical examinations. The target population was 18- to 79-year-olds living in Germany. The mixed design consisted of a new sample randomly chosen from local population registries, which was supplemented by participants from the "German National Health Interview and Examination Survey 1998" (GNHIES98). In total, 8,152 persons took part, among them 4,193 newly invited (response 42%) and 3,959 who had previously taken part in GNHIES98 (response 62%). 7,238 participants visited one of the 180 local study centres, while 914 took part in the interview-only programme. The comparison of the net sample with the group of non-participants and with the resident population of Germany suggests a high representativeness regarding various attributes. To account for certain aspects of the population structure, cross-sectional, trend and longitudinal analyses are corrected by weighting factors. Furthermore, different participation probabilities of the former participants of GNHIES98 are compensated for. An English full-text version of this article is available at SpringerLink as supplemental material.
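
The weighting described above can be illustrated with a simple post-stratification scheme, in which each stratum's weight is its population share divided by its sample share. This is a generic sketch of the idea, not the RKI's actual weighting procedure, and the strata and counts are invented:

```python
def poststrat_weights(pop_counts, sample_counts):
    """Post-stratification: weight = population share / sample share."""
    pop_total = sum(pop_counts.values())
    samp_total = sum(sample_counts.values())
    return {s: (pop_counts[s] / pop_total) / (sample_counts[s] / samp_total)
            for s in pop_counts}

# Hypothetical age strata: the sample over-represents 60-79-year-olds,
# so that stratum is weighted down and the younger strata weighted up.
pop = {"18-39": 30_000, "40-59": 40_000, "60-79": 30_000}
samp = {"18-39": 200, "40-59": 300, "60-79": 500}
w = poststrat_weights(pop, samp)
```

Applying these weights makes the weighted sample reproduce the population's stratum proportions while leaving the weighted total sample size unchanged.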

  16. Participant dropout as a function of survey length in internet-mediated university studies: implications for study design and voluntary participation in psychological research.

    PubMed

    Hoerger, Michael

    2010-12-01

    Internet-mediated research has offered substantial advantages over traditional laboratory-based research in terms of efficiently and affordably allowing for the recruitment of large samples of participants for psychology studies. Core technical, ethical, and methodological issues have been addressed in recent years, but the important issue of participant dropout has received surprisingly little attention. Specifically, web-based psychology studies often involve undergraduates completing lengthy and time-consuming batteries of online personality questionnaires, but no known published studies to date have closely examined the natural course of participant dropout during attempted completion of these studies. The present investigation examined participant dropout among 1,963 undergraduates completing one of six web-based survey studies relatively representative of those conducted in university settings. Results indicated that 10% of participants could be expected to drop out of these studies nearly instantaneously, with an additional 2% dropping out per 100 survey items included in the study. For individual project investigators, these findings hold ramifications for study design considerations, such as conducting a priori power analyses. The present results also have broader ethical implications for understanding and improving voluntary participation in research involving human subjects. Nonetheless, the generalizability of these conclusions may be limited to studies involving similar design or survey content.
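
The dropout figures above amount to a linear rule of thumb for a priori power analysis: roughly 10% instantaneous dropout plus 2 percentage points per 100 survey items. A sketch of how one might fold that rule into a recruitment estimate (the function names are my own, and the linear approximation is taken at face value from the abstract):

```python
import math

def expected_completion_rate(n_items):
    """Linear approximation: 10% instant dropout plus 2 percentage
    points of additional dropout per 100 survey items."""
    dropout = 0.10 + 0.02 * (n_items / 100)
    return max(0.0, 1.0 - dropout)

def required_recruits(target_n, n_items):
    """Participants to recruit so that ~target_n complete the survey."""
    return math.ceil(target_n / expected_completion_rate(n_items))

# A 400-item battery implies ~18% dropout, so recruit extra participants.
print(required_recruits(300, 400))
```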

  17. Surveying Future Surveys

    NASA Astrophysics Data System (ADS)

    Carlstrom, John E.

    2016-06-01

    The now standard model of cosmology has been tested and refined by the analysis of increasingly sensitive, large astronomical surveys, especially with statistically significant millimeter-wave surveys of the cosmic microwave background and optical surveys of the distribution of galaxies. This talk will offer a glimpse of the future, which promises an acceleration of this trend with cosmological information coming from new surveys across the electromagnetic spectrum as well as particles and even gravitational waves.

  18. A Quantitative Multimodal Discourse Analysis of Teaching and Learning in a Web-Conferencing Environment--The Efficacy of Student-Centred Learning Designs

    ERIC Educational Resources Information Center

    Bower, Matt; Hedberg, John G.

    2010-01-01

    This paper presents a quantitative approach to multimodal discourse analysis for analyzing online collaborative learning. The coding framework draws together the fields of systemic functional linguistics and Activity Theory to analyze interactions between collaborative-, content- and technology-related discourse. The approach is used to examine…

  19. Psychometric and Cognitive Analysis as a Basis for the Design and Revision of Quantitative Item Models. Research Report. ETS RR-05-25

    ERIC Educational Resources Information Center

    Graf, Edith Aurora; Peterson, Stephen; Steffen, Manfred; Lawless, René

    2005-01-01

    We describe the item modeling development and evaluation process as applied to a quantitative assessment with high-stakes outcomes. In addition to expediting the item-creation process, a model-based approach may reduce pretesting costs, if the difficulty and discrimination of model-generated items may be predicted to a predefined level of…

  20. 78 FR 64202 - Quantitative Messaging Research

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-28

    ... combines key findings from both the survey as well as other qualitative research. Findings from the summary... COMMISSION Quantitative Messaging Research AGENCY: Commodity Futures Trading Commission. ACTION: Notice... financial fraud as part of a consumer-facing anti-fraud campaign. This survey will follow...

  1. Evaluation of Nine Consensus Indices in Delphi Foresight Research and Their Dependency on Delphi Survey Characteristics: A Simulation Study and Debate on Delphi Design and Interpretation

    PubMed Central

    Birko, Stanislav; Dove, Edward S.; Özdemir, Vural

    2015-01-01

    The extent of consensus (or the lack thereof) among experts in emerging fields of innovation can serve as antecedents of scientific, societal, investor and stakeholder synergy or conflict. Naturally, how we measure consensus is of great importance to science and technology strategic foresight. The Delphi methodology is a widely used anonymous survey technique to evaluate consensus among a panel of experts. Surprisingly, there is little guidance on how indices of consensus can be influenced by parameters of the Delphi survey itself. We simulated a classic three-round Delphi survey building on the concept of clustered consensus/dissensus. We evaluated three study characteristics that are pertinent for design of Delphi foresight research: (1) the number of survey questions, (2) the sample size, and (3) the extent to which experts conform to group opinion (the Group Conformity Index) in a Delphi study. Their impacts on the following nine Delphi consensus indices were then examined in 1000 simulations: Clustered Mode, Clustered Pairwise Agreement, Conger’s Kappa, De Moivre index, Extremities Version of the Clustered Pairwise Agreement, Fleiss’ Kappa, Mode, the Interquartile Range and Pairwise Agreement. The dependency of a consensus index on the Delphi survey characteristics was expressed from 0.000 (no dependency) to 1.000 (full dependency). The number of questions (range: 6 to 40) in a survey did not have a notable impact whereby the dependency values remained below 0.030. The variation in sample size (range: 6 to 50) displayed the top three impacts for the Interquartile Range, the Clustered Mode and the Mode (dependency = 0.396, 0.130, 0.116, respectively). The Group Conformity Index, a construct akin to measuring stubbornness/flexibility of experts’ opinions, greatly impacted all nine Delphi consensus indices (dependency = 0.200 to 0.504), except the Extremity CPWA and the Interquartile Range that were impacted only beyond the first decimal point (dependency
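
Among the nine indices examined, plain Pairwise Agreement is the simplest to state: the fraction of all expert pairs that gave the same answer to a question. A minimal sketch of that one index (my own implementation for illustration, not the simulation code used in the study):

```python
from itertools import combinations

def pairwise_agreement(responses):
    """Fraction of expert pairs giving identical answers to one question."""
    pairs = list(combinations(responses, 2))
    agree = sum(1 for a, b in pairs if a == b)
    return agree / len(pairs)

# Four experts, three of whom agree: 3 agreeing pairs out of 6.
print(pairwise_agreement([1, 1, 1, 2]))  # -> 0.5
```

The index runs from 0 (no two experts agree) to 1 (full consensus), which is the same 0-to-1 scale the study uses when expressing how strongly each index depends on the survey's characteristics.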

  3. Survey of selected design and ventilation characteristics of racehorse stables in the Pretoria, Witwatersrand, Vereeniging area of South Africa.

    PubMed

    Lund, R J; Guthrie, A J; Killeen, V M

    1993-12-01

    Stables housing more than 20 horses in training were surveyed in the Pretoria, Witwatersrand, Vereeniging area of South Africa. Most racehorses were kept in loose boxes, bedded on straw or sawdust and remained indoors while the stables were cleaned. The average floor area was 13 m² and airspace was 55 m³ per animal. The average predicted minimum air change rate by natural convection in calm winds was 7.0 air changes per hour, which was reduced to 2.2 when the doors and shutters were closed. The survey showed that many of the stables had been built without due consideration to factors that might have adverse effects on the occupants.

  4. Quantitative Techniques in Volumetric Analysis

    NASA Astrophysics Data System (ADS)

    Zimmerman, John; Jacobsen, Jerrold J.

    1996-12-01

    Quantitative Techniques in Volumetric Analysis is a visual library of techniques used in making volumetric measurements. This 40-minute VHS videotape is designed as a resource for introducing students to proper volumetric methods and procedures. The entire tape, or relevant segments of the tape, can also be used to review procedures used in subsequent experiments that rely on the traditional art of quantitative analysis laboratory practice. The techniques included are: quantitative transfer of a solid with a weighing spoon, with a finger-held weighing bottle, with a paper-strap-held bottle, and with a spatula; examples of common quantitative weighing errors; quantitative transfer of a solid from dish to beaker to volumetric flask, and from dish to volumetric flask; use of a volumetric transfer pipet; a complete acid-base titration; and hand-technique variations. The conventional view of contemporary quantitative chemical measurement tends to focus on instrumental systems, computers, and robotics. In this view, the analyst is relegated to placing standards and samples on a tray. A robotic arm delivers a sample to the analysis center, while a computer controls the analysis conditions and records the results. In spite of this, it is rare to find an analysis process that does not rely on some aspect of more traditional quantitative analysis techniques, such as careful dilution to the mark of a volumetric flask. Clearly, errors in a classical step will affect the quality of the final analysis. Because of this, it is still important for students to master the key elements of the traditional art of quantitative chemical analysis laboratory practice. Some aspects of chemical analysis, like careful rinsing to ensure quantitative transfer, are often an automated part of an instrumental process that must be understood by the

  5. An investigation into the feasibility of designing a framework for the quantitative evaluation of the Clinical Librarian service at an NHS Trust in Brighton, UK.

    PubMed

    Deshmukh, Archana; Roper, Tom

    2014-12-01

    This feature presents research undertaken by Archana Deshmukh for her MA dissertation at the University of Brighton. She worked closely with Tom Roper, the Clinical Librarian at Brighton and Sussex University Hospitals NHS Trust, in a project to explore the feasibility of applying quantitative measures to evaluate the Clinical Librarian service. The investigation used an innovative participatory approach and the findings showed that although an exclusively quantitative approach to evaluation is not feasible, using a mixed methods approach is a way forward. Agreed outputs and outcomes could be embedded in a marketing plan, and the resulting framework could provide evidence to demonstrate overall impact. Archana graduated in July 2014, gaining a Distinction in the MA in Information Studies, and she is currently looking for work in the health information sector.

  6. Design and Implementation of a Comprehensive Web-based Survey for Ovarian Cancer Survivorship with an Analysis of Prediagnosis Symptoms via Text Mining.

    PubMed

    Sun, Jiayang; Bogie, Kath M; Teagno, Joe; Sun, Yu-Hsiang Sam; Carter, Rebecca R; Cui, Licong; Zhang, Guo-Qiang

    2014-01-01

    Ovarian cancer (OvCa) is the most lethal gynecologic disease in the United States, with an overall 5-year survival rate of 44.5%, about half of the 89.2% for all breast cancer patients. To identify factors that possibly contribute to the long-term survivorship of women with OvCa, we conducted a comprehensive online Ovarian Cancer Survivorship Survey from 2009 to 2013. This paper presents the design and implementation of our survey, introduces its resulting data source, the OVA-CRADLE™ (Clinical Research Analytics and Data Lifecycle Environment), and illustrates a sample application of the survey and data by an analysis of prediagnosis symptoms, using text mining and statistics. The OVA-CRADLE™ is an application of our patented Physio-MIMI technology, facilitating Web-based access, online query and exploration of data. The prediagnostic symptoms and association of early-stage OvCa diagnosis with endometriosis provide potentially important indicators for future studies in this field.

  7. State-of-the-art and dissemination of computational tools for drug-design purposes: a survey among Italian academics and industrial institutions.

    PubMed

    Artese, Anna; Alcaro, Stefano; Moraca, Federica; Reina, Rocco; Ventura, Marzia; Costantino, Gabriele; Beccari, Andrea R; Ortuso, Francesco

    2013-05-01

    During the first edition of the Computationally Driven Drug Discovery meeting, held in November 2011 at Dompé Pharma (L'Aquila, Italy), a questionnaire regarding the diffusion and the use of computational tools for drug-design purposes in both academia and industry was distributed among all participants. This is a follow-up of a previously reported investigation carried out among a few companies in 2007. The new questionnaire implemented five sections dedicated to: research group identification and classification; 18 different computational techniques; software information; hardware data; and economical business considerations. In this article, together with a detailed history of the different computational methods, a statistical analysis of the survey results that enabled the identification of the prevalent computational techniques adopted in drug-design projects is reported and a profile of the computational medicinal chemist currently working in academia and pharmaceutical companies in Italy is highlighted.

  8. Obesity-related behaviours and BMI in five urban regions across Europe: sampling design and results from the SPOTLIGHT cross-sectional survey

    PubMed Central

    Lakerveld, Jeroen; Ben Rebah, Maher; Mackenbach, Joreintje D; Charreire, Hélène; Compernolle, Sofie; Glonti, Ketevan; Bardos, Helga; Rutter, Harry; De Bourdeaudhuij, Ilse; Brug, Johannes; Oppert, Jean-Michel

    2015-01-01

    Objectives To describe the design, methods and first results of a survey on obesity-related behaviours and body mass index (BMI) in adults living in neighbourhoods from five urban regions across Europe. Design A cross-sectional observational study in the framework of an European Union-funded project on obesogenic environments (SPOTLIGHT). Setting 60 urban neighbourhoods (12 per country) were randomly selected in large urban zones in Belgium, France, Hungary, the Netherlands and the UK, based on high or low values for median household income (socioeconomic status, SES) and residential area density. Participants A total of 6037 adults (mean age 52 years, 56% female) participated in the online survey. Outcome measures Self-reported physical activity, sedentary behaviours, dietary habits and BMI. Other measures included general health; barriers and motivations for a healthy lifestyle, perceived social and physical environmental characteristics; the availability of transport modes and their use to specific destinations; self-defined neighbourhood boundaries and items related to residential selection. Results Across five countries, residents from low-SES neighbourhoods ate less fruit and vegetables, drank more sugary drinks and had a consistently higher BMI. SES differences in sedentary behaviours were observed in France, with residents from higher SES neighbourhoods reporting to sit more. Residents from low-density neighbourhoods were less physically active than those from high-density neighbourhoods; during leisure time and (most pronounced) for transport (except for Belgium). BMI differences by residential density were inconsistent across all countries. Conclusions The SPOTLIGHT survey provides an original approach for investigating relations between environmental characteristics, obesity-related behaviours and obesity in Europe. First descriptive results indicate considerable differences in health behaviours and BMI between countries and neighbourhood types. PMID
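
The sampling frame described above crosses two binary stratification variables (high/low neighbourhood SES and high/low residential density) and draws neighbourhoods within each stratum to reach 12 per country. That design can be sketched as a stratified random draw; the frame contents and the per-stratum count of 3 below are invented for illustration:

```python
import random

def sample_neighbourhoods(frame, per_stratum, seed=0):
    """frame maps (ses, density) strata to lists of neighbourhood ids;
    draw per_stratum ids without replacement from each stratum."""
    rng = random.Random(seed)
    return {stratum: rng.sample(ids, per_stratum)
            for stratum, ids in frame.items()}

# Hypothetical frame: 4 strata x 3 draws = 12 neighbourhoods per country.
frame = {(ses, dens): [f"{ses}-{dens}-{i}" for i in range(20)]
         for ses in ("low", "high") for dens in ("low", "high")}
chosen = sample_neighbourhoods(frame, per_stratum=3)
```

Stratifying before drawing guarantees that every SES-by-density combination is represented, which is what lets the study compare behaviours and BMI across neighbourhood types.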

  9. Design, objectives, and lessons from a pilot 25 year follow-up re-survey of survivors in the Whitehall study of London Civil Servants

    PubMed Central

    Clarke, R.; Breeze, E.; Sherliker, P.; Shipley, M.; Youngman, L.; Fletcher, A.; Fuhrer, R.; Leon, D.; Parish, S.; Collins, R.; Marmot, M.

    1998-01-01

    DESIGN: To assess the feasibility of conducting a re-survey of men who are resident in the United Kingdom 25 years after enrollment in the Whitehall study of London Civil Servants. METHODS: A random sample of 401 study survivors resident in three health authority areas was selected for this pilot study. They were mailed a request to complete a self administered questionnaire, and then asked to attend their general practice to have their blood pressure, weight, and height measured and a blood sample collected into a supplied vacutainer, and mailed to a central laboratory. Using a 2 x 2 factorial design, the impact of including additional questions on income and of an informant questionnaire on cognitive function was assessed. RESULTS: Accurate addresses were obtained from the health authorities for 96% of the sample. Questionnaires were received from 73% and blood samples from 61% of the sample. Questions on income had no adverse effect on the response rate, but inclusion of the informant questionnaire did. Between 1970 and 1995 there were substantial changes within men in the mean blood pressure and blood total cholesterol recorded, as reflected by correlation coefficients between 1970 and 1995 values of 0.26, and 0.30 for systolic and diastolic blood pressure and 0.38 for total cholesterol. CONCLUSION: This pilot study demonstrated the feasibility of conducting a re-survey using postal questionnaires and mailed whole blood samples. The magnitude of change in blood pressure and blood total cholesterol concentrations within individuals was greater than anticipated, suggesting that such remeasurements may be required at different intervals in prospective studies to help interpret risk associations properly. These issues will be considered in a re-survey of the remaining survivors of the Whitehall study.   PMID:9764257

  10. Quantitative measurement of the chemical composition of geological standards with a miniature laser ablation/ionization mass spectrometer designed for in situ application in space research

    NASA Astrophysics Data System (ADS)

    Neuland, M. B.; Grimaudo, V.; Mezger, K.; Moreno-García, P.; Riedo, A.; Tulej, M.; Wurz, P.

    2016-03-01

    A key interest of planetary space missions is the quantitative determination of the chemical composition of the planetary surface material. The chemical composition of surface material (minerals, rocks, soils) yields fundamental information that can be used to answer key scientific questions about the formation and evolution of the planetary body in particular and the Solar System in general. We present a miniature time-of-flight type laser ablation/ionization mass spectrometer (LMS) and demonstrate its capability in measuring the elemental and mineralogical composition of planetary surface samples quantitatively by using a femtosecond laser for ablation/ionization. The small size and weight of the LMS make it a remarkable tool for in situ chemical composition measurements in space research, convenient for operation on a lander or rover exploring a planetary surface. In the laboratory, we measured the chemical composition of four geological standard reference samples USGS AGV-2 Andesite, USGS SCo-1 Cody Shale, NIST 97b Flint Clay and USGS QLO-1 Quartz Latite with LMS. These standard samples are used to determine the sensitivity factors of the instrument. One important result is that all sensitivity factors are close to 1. Additionally, it is observed that the sensitivity factor of an element depends on its electron configuration, hence on the electron work function and the elemental group, in agreement with existing theory. Furthermore, the conformity of the sensitivity factors is supported by mineralogical analyses of the USGS SCo-1 and the NIST 97b samples. With the four different reference samples, the consistency of the calibration factors can be demonstrated, which constitutes the fundamental basis for a standard-less measurement technique for in situ quantitative chemical composition measurements on planetary surfaces.
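
Quantification with sensitivity factors typically follows a relative-sensitivity-factor normalization: each element's measured intensity is divided by its sensitivity factor, and the corrected values are then normalized to fractional abundances. This is a generic sketch of that step under those assumptions, not the authors' calibration code, and the intensities and factors below are invented:

```python
def abundances(intensities, sensitivity):
    """Convert measured ion intensities to atomic fractions using
    relative sensitivity factors (near 1 for the LMS, per the abstract)."""
    corrected = {el: intensities[el] / sensitivity[el] for el in intensities}
    total = sum(corrected.values())
    return {el: v / total for el, v in corrected.items()}

# Hypothetical intensities and near-unity sensitivity factors.
I = {"Si": 500.0, "Al": 200.0, "Fe": 100.0}
S = {"Si": 1.0, "Al": 0.8, "Fe": 1.25}
frac = abundances(I, S)
```

Because the fractions are normalized to sum to 1, sensitivity factors close to 1 mean the raw intensity ratios are already close to the true composition, which is what makes a standard-less technique plausible.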

  11. Towards a capabilities database to inform inclusive design: experimental investigation of effective survey-based predictors of human-product interaction.

    PubMed

    Tenneti, Raji; Johnson, Daniel; Goldenberg, Liz; Parker, Richard A; Huppert, Felicia A

    2012-07-01

    A key issue in the field of inclusive design is the ability to provide designers with an understanding of people's range of capabilities. Since it is not feasible to assess product interactions with a large sample, this paper assesses a range of proxy measures of design-relevant capabilities. It describes a study that was conducted to identify which measures provide the best prediction of people's abilities to use a range of products. A detailed investigation with 100 respondents aged 50-80 years was undertaken to examine how they manage typical household products. Predictor variables included self-report and performance measures across a variety of capabilities (vision, hearing, dexterity and cognitive function), component activities used in product interactions (e.g. using a remote control, touch screen) and psychological characteristics (e.g. self-efficacy, confidence with using electronic devices). Results showed, as expected, a higher prevalence of visual, hearing, dexterity, cognitive and product interaction difficulties in the 65-80 age group. Regression analyses showed that, in addition to age, performance measures of vision (acuity, contrast sensitivity) and hearing (hearing threshold) and self-report and performance measures of component activities are strong predictors of successful product interactions. These findings will guide the choice of measures to be used in a subsequent national survey of design-relevant capabilities, which will lead to the creation of a capability database. This will be converted into a tool for designers to understand the implications of their design decisions, so that they can design products in a more inclusive way.

  12. A Meta-Analysis Method to Advance Design of Technology-Based Learning Tool: Combining Qualitative and Quantitative Research to Understand Learning in Relation to Different Technology Features

    ERIC Educational Resources Information Center

    Zhang, Lin

    2014-01-01

    Educators design and create various technology tools to scaffold students' learning. As more and more technology designs are incorporated into learning, growing attention has been paid to the study of technology-based learning tool. This paper discusses the emerging issues, such as how can learning effectiveness be understood in relation to…

  13. Web Survey Design in ASP.Net 2.0: A Simple Task with One Line of Code

    ERIC Educational Resources Information Center

    Liu, Chang

    2007-01-01

    Over the past few years, more and more companies have been investing in electronic commerce (EC) by designing and implementing Web-based applications. In the world of practice, the importance of using Web technology to reach individual customers has been presented by many researchers. This paper presents an easy way of conducting marketing…

  14. Reflective Filters Design for Self-Filtering Narrowband Ultraviolet Imaging Experiment Wide-Field Surveys (NUVIEWS) Project

    NASA Technical Reports Server (NTRS)

    Park, Jung-Ho; Kim, Jongmin; Zukic, Muamer; Torr, Douglas G.

    1994-01-01

    We report the design of multilayer reflective filters for the self-filtering cameras of the NUVIEWS project. Wide angle self-filtering cameras were designed to image the C IV (154.9 nm) line emission, and H2 Lyman band fluorescence (centered at 161 nm) over a 20 deg x 30 deg field of view. A key element of the filter design includes the development of pi-multilayers optimized to provide maximum reflectance at 154.9 nm and 161 nm for the respective cameras without significant spectral sensitivity to the large cone angle of the incident radiation. We applied self-filtering concepts to design NUVIEWS telescope filters that are composed of three reflective mirrors and one folding mirror. The filters, with narrowband widths of 6 and 8 nm at 154.9 and 161 nm, respectively, have net throughputs of more than 50% with average blocking of out-of-band wavelengths better than 3 x 10^-4 %.

  15. City Governments and Aging in Place: Community Design, Transportation and Housing Innovation Adoption

    ERIC Educational Resources Information Center

    Lehning, Amanda J.

    2012-01-01

    Purpose of the study: To examine the characteristics associated with city government adoption of community design, housing, and transportation innovations that could benefit older adults. Design and methods: A mixed-methods study with quantitative data collected via online surveys from 62 city planners combined with qualitative data collected via…

  16. Overview of the study design, participation and field work of the German Environmental Survey on Children 2003-2006 (GerES IV).

    PubMed

    Schulz, Christine; Seiwert, Margarete; Babisch, Wolfgang; Becker, Kerstin; Conrad, André; Szewzyk, Regine; Kolossa-Gehring, Marike

    2012-07-01

    The German Federal Environment Agency carried out its fourth German Environmental Survey (GerES IV), which is the first survey on children only and the environment-related module of the German Health Interview and Examination Survey for Children and Adolescents (German acronym: KiGGS), conducted by the Robert Koch Institute (RKI). The German Environmental Surveys are nationwide population studies conducted to determine the exposure to environmental pollutants, to explore exposure pathways and to identify sub-groups with higher exposure. GerES IV was conducted on 1790 randomly selected children aged 3-14 years from the cross-sectional sample of KiGGS. The participants of GerES IV lived in 150 sampling locations all over Germany. Field work was carried out from May 2003 to May 2006. The response rate in GerES IV was 77.3%. Because participation in GerES IV was limited to children who had previously participated in the KiGGS study, the total response rate for GerES IV was 52.6%. Response rates did not differ significantly between West and East Germany, nor across community sizes, age groups, or gender. The basic study programme included blood samples, morning urine, tap water and house dust as well as comprehensive questionnaire-based interviews. In addition, subgroups were studied with regard to "noise, hearing capacity and stress hormones", "chemical contamination of indoor air" and "biogenic indoor contamination". A key element of the field work in GerES IV was a home visit to carry out interviews, conduct measurements and collect samples. An exception was blood sampling, which was carried out within KiGGS. The quality of field work, data collection, evaluation, and chemical, biological and physical analyses was successfully evaluated by internal and external quality assurance. This comprehensive overview aims at giving other research groups the opportunity to compare different study designs or to adapt their own design to get

  17. Evaluating quantitative 3-D image analysis as a design tool for low enriched uranium fuel compacts for the transient reactor test facility: A preliminary study

    SciTech Connect

    Kane, J. J.; van Rooyen, I. J.; Craft, A. E.; Roney, T. J.; Morrell, S. R.

    2016-02-05

    In this study, 3-D image analysis combined with a non-destructive examination technique such as X-ray computed tomography (CT) provides a highly quantitative tool for the investigation of a material’s structure. In this investigation 3-D image analysis and X-ray CT were combined to analyze the microstructure of a preliminary subsized fuel compact for the Transient Reactor Test Facility’s low enriched uranium conversion program, to assess the feasibility of the combined techniques for use in the optimization of the fuel compact fabrication process. The quantitative image analysis focused on determining the size and spatial distribution of the surrogate fuel particles and the size, shape, and orientation of voids within the compact. Additionally, the maximum effect of microstructural features on heat transfer through the carbonaceous matrix of the preliminary compact was estimated. The surrogate fuel particles occupied 0.8% of the compact by volume with a log-normal distribution of particle sizes with a mean diameter of 39 μm and a standard deviation of 16 μm. Roughly 39% of the particles had a diameter greater than the specified maximum particle size of 44 μm, suggesting that the particles agglomerate during fabrication. The local volume fraction of particles also varies significantly within the compact, although non-uniformities appear to be evenly dispersed throughout the analysed volume. The voids produced during fabrication were on average plate-like in nature with their major axis oriented perpendicular to the compaction direction of the compact. Finally, the microstructure, mainly the large preferentially oriented voids, may cause a small degree of anisotropy in the thermal diffusivity within the compact. α∥/α⊥, the ratio of thermal diffusivities parallel and perpendicular to the compaction direction, is expected to be no less than 0.95 with an upper bound of 1.
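    The tail fraction above the 44 μm specification can be estimated from the reported log-normal moments. A minimal sketch, assuming standard moment matching; note the paper's 39% figure was measured directly from the tomographic data, so a simple fitted estimate need not reproduce it exactly:

```python
# Sketch: estimating the fraction of particles above a size cutoff from a
# log-normal fit, given the reported arithmetic mean and standard deviation
# of particle diameters. The 44 um cutoff is the specification cited above.
import math

mean_d, sd_d, cutoff = 39.0, 16.0, 44.0   # micrometres

# Match arithmetic moments to log-normal parameters (mu, sigma):
sigma2 = math.log(1.0 + (sd_d / mean_d) ** 2)
mu = math.log(mean_d) - sigma2 / 2.0

# Tail probability P(D > cutoff) via the normal CDF of ln(D):
z = (math.log(cutoff) - mu) / math.sqrt(sigma2)
frac_above = 0.5 * math.erfc(z / math.sqrt(2.0))
print(f"Estimated fraction above {cutoff} um: {frac_above:.1%}")
```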

  18. Evaluating quantitative 3-D image analysis as a design tool for low enriched uranium fuel compacts for the transient reactor test facility: A preliminary study

    DOE PAGES

    Kane, J. J.; van Rooyen, I. J.; Craft, A. E.; ...

    2016-02-05

    In this study, 3-D image analysis combined with a non-destructive examination technique such as X-ray computed tomography (CT) provides a highly quantitative tool for the investigation of a material’s structure. In this investigation 3-D image analysis and X-ray CT were combined to analyze the microstructure of a preliminary subsized fuel compact for the Transient Reactor Test Facility’s low enriched uranium conversion program, to assess the feasibility of the combined techniques for use in the optimization of the fuel compact fabrication process. The quantitative image analysis focused on determining the size and spatial distribution of the surrogate fuel particles and the size, shape, and orientation of voids within the compact. Additionally, the maximum effect of microstructural features on heat transfer through the carbonaceous matrix of the preliminary compact was estimated. The surrogate fuel particles occupied 0.8% of the compact by volume with a log-normal distribution of particle sizes with a mean diameter of 39 μm and a standard deviation of 16 μm. Roughly 39% of the particles had a diameter greater than the specified maximum particle size of 44 μm, suggesting that the particles agglomerate during fabrication. The local volume fraction of particles also varies significantly within the compact, although non-uniformities appear to be evenly dispersed throughout the analysed volume. The voids produced during fabrication were on average plate-like in nature with their major axis oriented perpendicular to the compaction direction of the compact. Finally, the microstructure, mainly the large preferentially oriented voids, may cause a small degree of anisotropy in the thermal diffusivity within the compact. α∥/α⊥, the ratio of thermal diffusivities parallel and perpendicular to the compaction direction, is expected to be no less than 0.95 with an upper bound of 1.

  19. Instrument development, study design implementation, and survey conduct for the national social life, health, and aging project.

    PubMed

    Smith, Stephen; Jaszczak, Angela; Graber, Jessica; Lundeen, Katie; Leitsch, Sara; Wargo, Erin; O'Muircheartaigh, Colm

    2009-11-01

    The National Opinion Research Center, led by a team of investigators at the University of Chicago, conducted more than 3,000 in-person interviews with a nationally representative sample of adults aged 57-85 years. Data collection included in-person questionnaire items, an extensive array of biomeasures, and a postinterview self-administered questionnaire. The National Social Life, Health, and Aging Project (NSHAP) interview included the collection of 13 biomeasures: weight, waist circumference, height, blood pressure, smell, saliva collection, taste, a self-administered vaginal swab for female respondents, "Get Up and Go," distance vision, touch, oral mucosal transudate (OraSure) human immunodeficiency virus test, and blood spots. This article discusses the development of NSHAP's instruments and implementation of the study design. Measures, such as response and cooperation rates, are also provided to evaluate the effectiveness of the design and implementation.

  20. The U. S. Geological Survey's Albemarle-Pamlico National Water-Quality Assessment Study; background and design

    USGS Publications Warehouse

    Spruill, T.B.; Harned, Douglas A.; McMahon, Gerard

    1995-01-01

    The Albemarle-Pamlico Study Unit is one of 20 National Water-Quality Assessment (NAWQA) studies begun in 1991 by the U.S. Geological Survey (USGS) to assess the Nation's water quality. One of the missions of the USGS is to assess the quantity and quality of the Nation's water resources. The NAWQA program was established to help accomplish this mission. The Albemarle-Pamlico Study Unit, located in Virginia and North Carolina, drains an area of about 28,000 square miles. Four major rivers, the Chowan, the Roanoke, the Tar-Pamlico and the Neuse, all drain into the Albemarle-Pamlico Sound in North Carolina. Four physiographic regions (areas of homogeneous climatic, geologic, and biological characteristics), the Valley and Ridge, Blue Ridge, Piedmont and Coastal Plain Physiographic Provinces are included within the Albemarle-Pamlico Study Unit. Until 1991, there was no single program that could answer the question, 'Are the Nation's ground and surface waters getting better, worse, or are they staying the same?' A program was needed to evaluate water quality by using standard techniques to allow assessment of water quality at local, regional, and national scales. The NAWQA Program was implemented to answer questions about the Nation's water quality using consistent and comparable methods. A total of 60 basins, or study units, will be in place by 1997 to assess the Nation's water quality.

  1. Surveying drainage culvert use by carnivores: sampling design and cost-benefit analyses of track-pads vs. video-surveillance methods.

    PubMed

    Mateus, Ana Rita A; Grilo, Clara; Santos-Reis, Margarida

    2011-10-01

    Environmental assessment studies often evaluate the effectiveness of drainage culverts as habitat linkages for species; however, the efficiency of the sampling designs and the survey methods is not known. Our main goal was to identify the most cost-effective method for monitoring carnivore use of culverts, comparing track-pads and video-surveillance. We estimated the most efficient (lower costs and high detection success) interval between visits (days) when using track-pads and also determined the advantages of using each method. In 2006, we selected two highways in southern Portugal and sampled 15 culverts over two 10-day sampling periods (spring and summer). Using the track-pad method, 90% of the animal tracks were detected using a 2-day interval between visits. We recorded a higher number of crossings for most species using video-surveillance (n = 129) when compared with the track-pad technique (n = 102); however, the detection ability of the video-surveillance method varied with type of structure and species. More crossings were detected in circular culverts (1 m and 1.5 m diameter) than in box culverts (2 m to 4 m width), likely because video cameras had a reduced coverage area. On the other hand, carnivore species with small feet, such as the common genet Genetta genetta, were detected less often using the track-pad method. The cost-benefit analysis shows that the track-pad technique is the most appropriate, but video-surveillance allows year-round surveys as well as analysis of the behavioral responses of species using crossing structures.

  2. Land 3D-seismic data: Preprocessing quality control utilizing survey design specifications, noise properties, normal moveout, first breaks, and offset

    USGS Publications Warehouse

    Raef, A.

    2009-01-01

    The recent proliferation of the 3D reflection seismic method into the near-surface area of geophysical applications, especially in response to the emergence of the need to comprehensively characterize and monitor near-surface carbon dioxide sequestration in shallow saline aquifers around the world, justifies the emphasis on cost-effective and robust quality control and assurance (QC/QA) workflow of 3D seismic data preprocessing that is suitable for near-surface applications. The main purpose of our seismic data preprocessing QC is to enable the use of appropriate header information, data that are free of noise-dominated traces, and/or flawed vertical stacking in subsequent processing steps. In this article, I provide an account of utilizing survey design specifications, noise properties, first breaks, and normal moveout for rapid and thorough graphical QC/QA diagnostics, which are easy to apply and efficient in the diagnosis of inconsistencies. A correlated vibroseis time-lapse 3D-seismic data set from a CO2-flood monitoring survey is used for demonstrating QC diagnostics. An important by-product of the QC workflow is establishing the number of layers for a refraction statics model in a data-driven graphical manner that capitalizes on the spatial coverage of the 3D seismic data. © China University of Geosciences (Wuhan) and Springer-Verlag GmbH 2009.

  3. The Impact of Trap Type and Design Features on Survey and Detection of Bark and Woodboring Beetles and Their Associates: A Review and Meta-Analysis.

    PubMed

    Allison, Jeremy D; Redak, Richard A

    2017-01-31

    A large literature on the survey and detection of forest Coleoptera and their associates exists. Identification of patterns in the effect of trap types and design features among guilds and families of forest insects would facilitate the optimization and development of intercept traps for use in management programs. We reviewed the literature on trapping bark and woodboring beetles and their associates and conducted meta-analyses to examine patterns in effects across guilds and families; we observed the following general patterns: (a) Panel traps were superior to multiple-funnel traps, (b) bark beetles and woodborers were captured in higher numbers in traps treated with a surface treatment to make them slippery than untreated traps,
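    The meta-analytic pooling behind such trap comparisons can be sketched as a fixed-effect, inverse-variance average of per-study effect sizes. The log response ratios and variances below are hypothetical placeholders, not values from the review:

```python
# Sketch: fixed-effect meta-analysis pooling log response ratios
# (e.g., panel-trap vs. multiple-funnel-trap catches) weighted by
# inverse variance. Study values are illustrative, not from the paper.
import math

# (log response ratio, variance) per hypothetical study
studies = [(0.35, 0.04), (0.21, 0.02), (0.48, 0.09), (0.10, 0.03)]

weights = [1.0 / v for _, v in studies]
pooled = sum(w * es for (es, _), w in zip(studies, weights)) / sum(weights)
se = math.sqrt(1.0 / sum(weights))
ci = (pooled - 1.96 * se, pooled + 1.96 * se)
```

    A pooled log response ratio whose confidence interval excludes zero corresponds to a general pattern such as conclusion (a) above.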

  4. Study Quality in SLA: A Cumulative and Developmental Assessment of Designs, Analyses, Reporting Practices, and Outcomes in Quantitative L2 Research

    ERIC Educational Resources Information Center

    Plonsky, Luke

    2011-01-01

    I began this study with two assumptions. Assumption 1: Study quality matters. If the means by which researchers design, carry out, and report on their studies lack in rigor or transparency, theory and practice are likely to be misguided or at least decelerated. Assumption 2 is an implication of Assumption 1: Quality should be measured rather than…

  5. Quantitative impurity analysis of monoclonal antibody size heterogeneity by CE-LIF: example of development and validation through a quality-by-design framework.

    PubMed

    Michels, David A; Parker, Monica; Salas-Solano, Oscar

    2012-03-01

    This paper describes the framework of quality by design applied to the development, optimization and validation of a sensitive capillary electrophoresis-sodium dodecyl sulfate (CE-SDS) assay for monitoring impurities, produced in the manufacture of therapeutic MAb products, that potentially impact drug efficacy or patient safety. Drug substance or drug product samples are derivatized with fluorogenic 3-(2-furoyl)quinoline-2-carboxaldehyde and nucleophilic cyanide before separation by CE-SDS coupled to LIF detection. Three design-of-experiments studies enabled critical labeling parameters to meet method requirements for detecting minor impurities while building precision and robustness into the assay during development. The screening design predicted optimal conditions to control labeling artifacts, while two full factorial designs demonstrated method robustness through control of temperature and cyanide parameters within the normal operating range. Subsequent validation according to the guidelines of the International Conference on Harmonisation showed the CE-SDS/LIF assay was specific, accurate, and precise (RSD ≤ 0.8%) for relative peak distribution and linear (R > 0.997) over the range of 0.5-1.5 mg/mL, with LOD and LOQ of 10 ng/mL and 35 ng/mL, respectively. Validation confirmed the system suitability criteria used as a level of control to ensure reliable method performance.
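    LOD and LOQ figures like those above are commonly derived from a linear calibration via the ICH formulas LOD = 3.3σ/S and LOQ = 10σ/S, where S is the slope and σ the residual standard deviation. A minimal sketch with illustrative calibration points (not the authors' data, and not necessarily their exact estimation route):

```python
# Sketch: ICH-style LOD/LOQ estimates from a linear calibration.
# Calibration points are illustrative placeholders.
import statistics

conc = [0.5, 0.75, 1.0, 1.25, 1.5]        # mg/mL
resp = [51.0, 76.2, 100.5, 124.8, 150.1]  # detector response (arbitrary units)

n = len(conc)
xbar, ybar = statistics.mean(conc), statistics.mean(resp)
sxx = sum((x - xbar) ** 2 for x in conc)
slope = sum((x - xbar) * (y - ybar) for x, y in zip(conc, resp)) / sxx
intercept = ybar - slope * xbar

# Residual standard deviation about the regression line:
residuals = [y - (intercept + slope * x) for x, y in zip(conc, resp)]
sigma = (sum(r * r for r in residuals) / (n - 2)) ** 0.5

lod = 3.3 * sigma / slope   # limit of detection
loq = 10.0 * sigma / slope  # limit of quantitation
```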

  6. Three-dimensional quantitative structure-activity relationships and docking studies of some structurally diverse flavonoids and design of new aldose reductase inhibitors

    PubMed Central

    Chandra De, Utpal; Debnath, Tanusree; Sen, Debanjan; Debnath, Sudhan

    2015-01-01

    Aldose reductase (AR) plays an important role in the development of several long-term diabetic complications. Inhibition of AR activity is a strategy for controlling complications arising from chronic diabetes. Several AR inhibitors have been reported in the literature. Flavonoid-type compounds are shown to have significant AR inhibition. The objective of this study was to perform a computational study to gain structural insight into flavonoid-type compounds, both for developing and for searching for new flavonoid-based AR inhibitors. The data-set comprising 68 flavones along with their pIC50 values ranging from 0.44 to 4.59 was collected from the literature. Structures of all the flavonoids were drawn in ChemBioDraw Ultra 11.0, converted into the corresponding three-dimensional structures, saved as mol files, and then imported into the Maestro project table. Imported ligands were prepared using the LigPrep option of Maestro version 9.6. Three-dimensional quantitative structure-activity relationship and docking studies were performed with the appropriate options of Maestro 9.6 installed on an HP Z820 workstation running CentOS 6.3 (Linux). A model with five partial least squares factors, standard deviation 0.2482, R² = 0.9502 and regression variance ratio of 122 was found to be the best statistical model. PMID:25709964

  7. Materials design for new superconductors.

    PubMed

    Norman, M R

    2016-07-01

    Since the announcement in 2011 of the Materials Genome Initiative by the Obama administration, much attention has been given to the subject of materials design to accelerate the discovery of new materials that could have technological implications. Although having its biggest impact for more applied materials like batteries, there is increasing interest in applying these ideas to predict new superconductors. This is obviously a challenge, given that superconductivity is a many body phenomenon, with whole classes of known superconductors lacking a quantitative theory. Given this caveat, various efforts to formulate materials design principles for superconductors are reviewed here, with a focus on surveying the periodic table in an attempt to identify cuprate analogues.

  8. Materials design for new superconductors

    NASA Astrophysics Data System (ADS)

    Norman, M. R.

    2016-07-01

    Since the announcement in 2011 of the Materials Genome Initiative by the Obama administration, much attention has been given to the subject of materials design to accelerate the discovery of new materials that could have technological implications. Although having its biggest impact for more applied materials like batteries, there is increasing interest in applying these ideas to predict new superconductors. This is obviously a challenge, given that superconductivity is a many body phenomenon, with whole classes of known superconductors lacking a quantitative theory. Given this caveat, various efforts to formulate materials design principles for superconductors are reviewed here, with a focus on surveying the periodic table in an attempt to identify cuprate analogues.

  9. Quantitative Reasoning in Environmental Science: A Learning Progression

    ERIC Educational Resources Information Center

    Mayes, Robert Lee; Forrester, Jennifer Harris; Christus, Jennifer Schuttlefield; Peterson, Franziska Isabel; Bonilla, Rachel; Yestness, Nissa

    2014-01-01

    The ability of middle and high school students to reason quantitatively within the context of environmental science was investigated. A quantitative reasoning (QR) learning progression was created with three progress variables: quantification act, quantitative interpretation, and quantitative modeling. An iterative research design was used as it…

  10. DRAFT - Design of Radiological Survey and Sampling to Support Title Transfer or Lease of Property on the Department of Energy Oak Ridge Reservation

    SciTech Connect

    Cusick L.T.

    2002-09-25

    sampling and laboratory analyses are completed, the data are analyzed and included in an Environmental Baseline Summary (EBS) report for title transfer or in a Baseline Environmental Analysis Report (BEAR) for lease. The data from the BEAR are then used in a Screening-Level Human Health Risk Assessment (SHHRA) or a risk calculation (RC) to assess the potential risks to future owners/occupants. If title is to be transferred, release criteria in the form of specific activity concentrations called Derived Concentration Guideline Levels (DCGLs) will be developed for each property. The DCGLs are based on the risk model and are used with the data in the EBS to determine, with statistical confidence, that the release criteria for the property have been met. The goal of the survey and sampling efforts is to (1) document the baseline conditions of the property (real or personal) prior to title transfer or lease, (2) obtain enough information that an evaluation of radiological risks can be made, and (3) collect sufficient data so that areas that contain minimal residual levels of radioactivity can be identified and, following radiological control procedures, be released from radiological control. (It should be noted that release from radiological control does not necessarily mean free release, because DOE may maintain institutional control of the site after it is released from radiological control.) To meet the goals of this document, a Data Quality Objective (DQO) process will be used to enhance data collection efficiency and assist with decision-making. The steps of the DQO process involve stating the problem, identifying the decision, identifying inputs to the decision, developing study boundaries, developing the decision rule, and optimizing the design. This document describes the DQOs chosen for surveys and sampling efforts performed for the purposes listed above.
The previous version of this document focused on the requirements for radiological survey and sampling protocols.
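    The statistical comparison of survey data against a DCGL can be sketched as a one-sample test. The document does not name the specific test, so the MARSSIM-style Sign test below is an illustrative assumption, with made-up measurements and a hypothetical release limit:

```python
# Sketch (assumption, not the document's stated method): a MARSSIM-style
# Sign test for deciding whether a surveyed area meets a release
# criterion (DCGL). Measurements and the DCGL value are hypothetical.
from math import comb

dcgl = 5.0                                  # pCi/g, hypothetical release limit
measurements = [3.1, 4.2, 2.8, 4.9, 3.7, 4.4, 2.5, 3.9, 4.1, 3.3]

# Count measurements below the DCGL; under H0 (median equals the DCGL)
# each measurement falls below it with probability 1/2.
k = sum(1 for m in measurements if m < dcgl)
n = len(measurements)
p_value = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n

# Reject H0 (i.e., conclude the area meets the release criterion) if p < 0.05:
meets_criterion = p_value < 0.05
```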

  11. Sport Management Survey. Employment Perspectives.

    ERIC Educational Resources Information Center

    Quain, Richard J.; Parks, Janet B.

    1986-01-01

    A survey of sport management positions was designed to determine projected vacancy rates in six sport management career areas. Respondents to the survey were also questioned regarding their awareness of college professional preparation programs. Results are presented. (MT)

  12. Questionnaire Surveys in Educational Planning.

    ERIC Educational Resources Information Center

    Psacharopoulos, George

    1980-01-01

    This paper reviews and discusses some critical issues related to the use of questionnaire surveys in educational planning. Ten brief sections discuss survey objectives, coverage, questionnaire design, administration, validity, nonresponse, cost considerations, coding, statistical analysis, and interpretation. Five illustrative questionnaire…

  13. Hydrophilic interaction liquid chromatography-tandem mass spectrometry quantitative method for the cellular analysis of varying structures of gemini surfactants designed as nanomaterial drug carriers.

    PubMed

    Donkuru, McDonald; Michel, Deborah; Awad, Hanan; Katselis, George; El-Aneed, Anas

    2016-05-13

    Diquaternary gemini surfactants have successfully been used to form lipid-based nanoparticles that are able to compact, protect, and deliver genetic materials into cells. However, what happens to the gemini surfactants after they have released their therapeutic cargo is unknown. Such knowledge is critical to assess the quality, safety, and efficacy of gemini surfactant nanoparticles. We have developed a simple and rapid liquid chromatography electrospray ionization-tandem mass spectrometry (LC-ESI-MS/MS) method for the quantitative determination of various structures of gemini surfactants in cells. Hydrophilic interaction liquid chromatography (HILIC) was employed, allowing for a short, simple isocratic run of only 4 min. The lower limit of detection (LLOD) was 3 ng/mL. The method was valid for 18 structures of gemini surfactants belonging to two different structural families. A full method validation was performed for two lead compounds according to USFDA guidelines. The HILIC-MS/MS method was compatible with the physicochemical properties of gemini surfactants, which bear a permanent positive charge with both hydrophilic and hydrophobic elements within their molecular structure. In addition, an effective liquid-liquid extraction method (98% recovery) was employed, surpassing previously used extraction methods. The analysis of nanoparticle-treated cells showed an initial rise in the analyte intracellular concentration followed by a maximum and a somewhat more gradual decrease of the intracellular concentration. The observed intracellular depletion of the gemini surfactants may be attributable to their bio-transformation into metabolites and exocytosis from the host cells. The obtained cellular data showed a pattern that warrants additional investigation, evaluating metabolite formation and assessing the subcellular distribution of the tested compounds.

  14. Survey design and observations relating to cancer education funding. Cancer Education Survey II: cancer education in United States medical schools (conducted by The American Association for Cancer Education with the support of the American Cancer Society).

    PubMed

    Bakemeier, R F; Kupchella, C E; Chamberlain, R M; Gallagher, R E; O'Donnell, J F; Parker, J A; Hill, G J; Brooks, C M

    1992-01-01

    A survey has been conducted of cancer education programs for medical students in United States medical schools by the American Association for Cancer Education with grant support from the Department of Detection and Treatment of the American Cancer Society (formerly the Professional Education Department). Two questionnaires were used, an Educational Resources Questionnaire (ERQ), which 126 of the 128 medical schools completed and returned, and a Faculty and Curriculum Questionnaire (FCQ), which was completed and returned by 1,035 faculty members who had been named as active in undergraduate medical student cancer education by respondents in each school who had been designated by the Dean's Office to complete the ERQ. Overall conclusions included: (1) increased coordination of cancer education activities is a major need in many schools; (2) there is widespread interest in the further development of cancer education objectives; (3) development of a national cancer education curriculum is needed; (4) there is interest in the development of improved instructional materials and methods; (5) development of evaluation methods is needed for cancer education programs; and (6) an ongoing funding process is needed to provide support for interdepartmental coordination of cancer education activities. Cancer prevention and detection topics were ranked above cancer treatment in plans for future curriculum emphasis. More detailed conclusions and recommendations are provided in this publication and three subsequent articles in this issue of the Journal of Cancer Education.

  15. Design.

    ERIC Educational Resources Information Center

    Online-Offline, 1998

    1998-01-01

    Provides an annotated bibliography of resources on this month's theme "Design" for K-8 language arts, art and architecture, music and dance, science, math, social studies, health, and physical education. Includes Web sites, CD-ROMs and software, videos, books, audiotapes, magazines, professional resources and classroom activities.…

  16. Design

    ERIC Educational Resources Information Center

    Buchanan, Richard; Cross, Nigel; Durling, David; Nelson, Harold; Owen, Charles; Valtonen, Anna; Boling, Elizabeth; Gibbons, Andrew; Visscher-Voerman, Irene

    2013-01-01

    Scholars representing the field of design were asked to identify what they considered to be the most exciting and imaginative work currently being done in their field, as well as how that work might change our understanding. The scholars included Richard Buchanan, Nigel Cross, David Durling, Harold Nelson, Charles Owen, and Anna Valtonen. Scholars…

  17. The PdBI arcsecond whirlpool survey (PAWS). I. A cloud-scale/multi-wavelength view of the interstellar medium in a grand-design spiral galaxy

    SciTech Connect

    Schinnerer, Eva; Meidt, Sharon E.; Hughes, Annie; Colombo, Dario; Pety, Jérôme; Schuster, Karl F.; Dumas, Gaëlle; García-Burillo, Santiago; Dobbs, Clare L.; Leroy, Adam K.; Kramer, Carsten; Thompson, Todd A.; Regan, Michael W.

    2013-12-10

    The Plateau de Bure Interferometer Arcsecond Whirlpool Survey has mapped the molecular gas in the central ∼9 kpc of M51 in its ^12CO(1-0) line emission at a cloud-scale resolution of ∼40 pc using both IRAM telescopes. We utilize this data set to quantitatively characterize the relation of molecular gas (or CO emission) to other tracers of the interstellar medium, star formation, and stellar populations of varying ages. Using two-dimensional maps, a polar cross-correlation technique and pixel-by-pixel diagrams, we find: (1) that (as expected) the distribution of the molecular gas can be linked to different components of the gravitational potential; (2) evidence for a physical link between CO line emission and radio continuum that seems not to be caused by massive stars, but rather depends on the gas density; (3) a close spatial relation between polycyclic aromatic hydrocarbon (PAH) and molecular gas emission, but no predictive power of PAH emission for the molecular gas mass; (4) that the I – H color map is an excellent predictor of the distribution (and to a lesser degree, the brightness) of CO emission; and (5) that the impact of massive (UV-intense) young star-forming regions on the bulk of the molecular gas in the central ∼9 kpc cannot be significant due to a complex spatial relation between molecular gas and star-forming regions that ranges from cospatial to spatially offset to absent. The last point, in particular, highlights the importance of galactic environment—and thus the underlying gravitational potential—for the distribution of molecular gas and star formation.
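    The pixel-by-pixel diagnostics mentioned above amount to correlating two maps element-wise. A minimal sketch with tiny illustrative arrays (not survey data), computing a Pearson correlation coefficient between a CO map and a second tracer map:

```python
# Sketch: pixel-by-pixel (Pearson) correlation between two tracer maps,
# the kind of diagnostic used to relate CO emission to other ISM tracers.
# The two 3x3 "maps" are illustrative placeholders, not survey data.
import math

co_map  = [[1.0, 2.0, 3.0],
           [2.0, 4.0, 6.0],
           [1.5, 3.0, 4.5]]
pah_map = [[0.9, 2.1, 2.8],
           [2.2, 3.9, 6.1],
           [1.4, 3.2, 4.4]]

# Flatten both maps so each pixel becomes one (x, y) sample:
x = [v for row in co_map for v in row]
y = [v for row in pah_map for v in row]
n = len(x)
mx, my = sum(x) / n, sum(y) / n
cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
r = cov / math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
```

    In practice such correlations are computed at matched resolution over many pixels, and a value of r near 1 indicates a close spatial relation between the two tracers, as in finding (3).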

  18. Case study: survey of patient satisfaction with prosthesis quality and design among below-knee prosthetic leg socket users.

    PubMed

    Mohd Hawari, Nurhanisah; Jawaid, Mohammad; Md Tahir, Paridah; Azmeer, Raja Ahmad

    2017-01-10

The aim of this case study was to explore patient satisfaction with the quality of prosthetic leg sockets intended for persons with lower limb amputations. A qualitative study based on in-depth interviews, preceded by a questionnaire session, was carried out with patients from the Rehabilitation Center and Hospital in Malaysia. Twelve out-patient and in-patient amputees with lower limb amputations, specifically below-knee amputations, were chosen randomly. The analysis of patients' narratives aimed to identify the functional and esthetic characteristics of currently used prosthetic leg sockets and any problems related to them. The results indicated that, of the 12 participants, 41.7% were satisfied and 25% were somewhat satisfied with their current prosthetic sockets. Durability and comfort were rated by 83.3% of the participants as the most important characteristics of prosthetic sockets. As regards the esthetic appearance of the socket, 66.7% of the respondents considered the material from which the socket was fabricated to be its most important feature. We therefore conclude that current satisfaction with the quality of prosthetic sockets among amputees in Malaysia is acceptable, with the prosthesis preferred by many amputees. The results can be used to direct future research on the cosmesis and functionality of prosthetic socket design. Implications for Rehabilitation: the case study will help participants obtain cost-effective prosthetic leg sockets; will inform the development of prosthetic leg sockets more comfortable than existing ones; and will help the Malaysian government shape policy for developing local prosthetic leg sockets at an affordable price.

  19. Sensitive quantitation of polyamines in plant foods by ultrasound-assisted benzoylation and dispersive liquid-liquid microextraction with the aid of experimental designs.

    PubMed

    Pinto, Edgar; Melo, Armindo; Ferreira, Isabel M P L V O

    2014-05-14

A new method involving ultrasound-assisted benzoylation and dispersive liquid-liquid microextraction was optimized with the aid of chemometrics for the extraction, cleanup, and determination of polyamines in plant foods. Putrescine, cadaverine, spermidine, and spermine were derivatized with 3,5-dinitrobenzoyl chloride and extracted by dispersive liquid-liquid microextraction using acetonitrile and carbon tetrachloride as dispersive and extraction solvents, respectively. Two-level full factorial design and central composite design were applied to select the most appropriate derivatization and extraction conditions. The developed method was linear in the 0.5-10.0 mg/L range, with R² ≥ 0.9989. Intra- and interday precisions ranged from 0.8 to 6.9% and from 3.0 to 10.3%, respectively, and the limit of detection ranged between 0.018 and 0.042 μg/g of fresh weight. This method was applied to the analyses of six different types of plant foods, presenting recoveries between 81.7 and 114.2%. The method is inexpensive, versatile, simple, and sensitive.
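The two experimental designs named in this abstract are easy to generate programmatically. A minimal stdlib-Python sketch (the factor count and the rotatable axial distance are illustrative assumptions, not values taken from the paper):

```python
from itertools import product

def full_factorial(n_factors):
    """Two-level full factorial design: every +/-1 combination of coded factors."""
    return [list(p) for p in product([-1.0, 1.0], repeat=n_factors)]

def central_composite(n_factors, alpha=None):
    """Central composite design: factorial 'cube' points, axial 'star'
    points at +/-alpha on each axis, and a single centre point."""
    if alpha is None:
        alpha = (2 ** n_factors) ** 0.25  # rotatable-design choice of alpha
    star = []
    for i in range(n_factors):
        for s in (-alpha, alpha):
            point = [0.0] * n_factors
            point[i] = s
            star.append(point)
    return full_factorial(n_factors) + star + [[0.0] * n_factors]

# Two coded factors (e.g. derivatization time and reagent volume, purely
# hypothetical): 4 cube + 4 star + 1 centre = 9 runs.
runs = central_composite(2)
```

Replicated centre points would normally be added to estimate pure error; a single centre point keeps the sketch short.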

  20. Toward a Theoretical Model of Decision-Making and Resistance to Change among Higher Education Online Course Designers

    ERIC Educational Resources Information Center

    Dodd, Bucky J.

    2013-01-01

    Online course design is an emerging practice in higher education, yet few theoretical models currently exist to explain or predict how the diffusion of innovations occurs in this space. This study used a descriptive, quantitative survey research design to examine theoretical relationships between decision-making style and resistance to change…

  1. Coverage Evaluation of Academic Libraries Survey (ALS).

    ERIC Educational Resources Information Center

    Marston, Christopher C.

    1999-01-01

    Evaluates universe coverage, data coverage, and response rates of the Academic Libraries Survey. Includes examination of survey design and data collection, perceptions of regional survey coordinators, and reporting by public versus private institutions. (Author)

  2. Ye Olde Maile Surveye.

    ERIC Educational Resources Information Center

    Berty, Ernest

    This publication is primarily designed for educational practitioners who possess little or no training in conducting mail surveys or have not kept current on the present state of the art of survey methods and techniques. It is also intended to be a checking and comparing aid to ensure that important research considerations are taken into account.…

  3. The Personal Health Survey

    ERIC Educational Resources Information Center

    Thorne, Frederick C.

    1978-01-01

    The Personal Health Survey (PHS) is a 200-item inventory designed to sample symptomatology as subjective experiences from the 12 principal domains of organ system and psychophysiological functioning. This study investigates the factorial validity of the empirically constructed scales. (Author)

  4. Doing Quantitative Research in Education with SPSS

    ERIC Educational Resources Information Center

    Muijs, Daniel

    2004-01-01

    This book looks at quantitative research methods in education. The book is structured to start with chapters on conceptual issues and designing quantitative research studies before going on to data analysis. While each chapter can be studied separately, a better understanding will be reached by reading the book sequentially. This book is intended…

  5. Design of multiplex calibrant plasmids, their use in GMO detection and the limit of their applicability for quantitative purposes owing to competition effects.

    PubMed

    Debode, Frédéric; Marien, Aline; Janssen, Eric; Berben, Gilbert

    2010-03-01

    Five double-target multiplex plasmids to be used as calibrants for GMO quantification were constructed. They were composed of two modified targets associated in tandem in the same plasmid: (1) a part of the soybean lectin gene and (2) a part of the transgenic construction of the GTS40-3-2 event. Modifications were performed in such a way that each target could be amplified with the same primers as those for the original target from which they were derived but such that each was specifically detected with an appropriate probe. Sequence modifications were done to keep the parameters of the new target as similar as possible to those of its original sequence. The plasmids were designed to be used either in separate reactions or in multiplex reactions. Evidence is given that with each of the five different plasmids used in separate wells as a calibrant for a different copy number, a calibration curve can be built. When the targets were amplified together (in multiplex) and at different concentrations inside the same well, the calibration curves showed that there was a competition effect between the targets and this limits the range of copy numbers for calibration over a maximum of 2 orders of magnitude. Another possible application of multiplex plasmids is discussed.
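Calibration curves built from such plasmid dilution series are ordinarily straight-line fits of quantification cycle (Cq) against log₁₀ copy number, from which PCR amplification efficiency follows as E = 10^(-1/slope) - 1. A stdlib-Python sketch (the Cq values below are invented for illustration, not data from the paper):

```python
import math

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical Cq values for a 10-fold dilution series of a calibrant plasmid.
copies = [1e2, 1e3, 1e4, 1e5, 1e6]
cq     = [33.2, 29.9, 26.6, 23.3, 20.0]

slope, intercept = fit_line([math.log10(c) for c in copies], cq)
efficiency = 10 ** (-1.0 / slope) - 1.0  # ~1.0 corresponds to 100% efficiency
```

The competition effect described in the abstract would show up as systematic departures from this line when targets are co-amplified at very different concentrations.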

  6. Quantitative and qualitative optimization of allergen extraction from peanut and selected tree nuts. Part 1. Screening of optimal extraction conditions using a D-optimal experimental design.

    PubMed

    L'Hocine, Lamia; Pitre, Mélanie

    2016-03-01

A D-optimal design was constructed to optimize allergen extraction efficiency simultaneously from roasted, non-roasted, defatted, and non-defatted almond, hazelnut, peanut, and pistachio flours using three non-denaturing aqueous (phosphate, borate, and carbonate) buffers at various conditions of ionic strength, buffer-to-protein ratio, extraction temperature, and extraction duration. Statistical analysis showed that roasting and the omission of defatting significantly lowered protein recovery for all nuts. Increasing the temperature and the buffer-to-protein ratio during extraction significantly increased protein recovery, whereas increasing the extraction time had no significant impact. The impact of the three buffers on protein recovery varied significantly among the nuts. Depending on the extraction conditions, protein recovery varied from 19% to 95% for peanut, 31% to 73% for almond, 17% to 64% for pistachio, and 27% to 88% for hazelnut. Buffer type and ionic strength also modulated the protein and immunoglobulin E binding profiles of the extracts; high protein recovery did not always correlate with high immunoreactivity.

  7. Different design of enzyme-triggered CO-releasing molecules (ET-CORMs) reveals quantitative differences in biological activities in terms of toxicity and inflammation.

    PubMed

    Stamellou, E; Storz, D; Botov, S; Ntasis, E; Wedel, J; Sollazzo, S; Krämer, B K; van Son, W; Seelen, M; Schmalz, H G; Schmidt, A; Hafner, M; Yard, B A

    2014-01-01

Acyloxydiene-Fe(CO)3 complexes can act as enzyme-triggered CO-releasing molecules (ET-CORMs). Their biological activity strongly depends on the mother compound from which they are derived, i.e. cyclohexenone or cyclohexanedione, and on the position of the ester functionality they harbour. The present study addresses whether the latter characteristic affects CO release, whether cytotoxicity of ET-CORMs is mediated through iron release or inhibition of cell respiration, and to what extent cyclohexenone- and cyclohexanedione-derived ET-CORMs differ in their ability to counteract TNF-α mediated inflammation. Irrespective of the formulation (DMSO or cyclodextrin), toxicity in HUVEC was significantly higher for ET-CORMs bearing the ester functionality at the outer (rac-4), as compared to the inner (rac-1), position of the cyclohexenone moiety. This was paralleled by increased CO release from the former ET-CORM. Toxicity was not mediated via iron, as EC50 values for rac-4 were significantly lower than for FeCl2 or FeCl3 and were not influenced by iron chelation. ATP depletion preceded toxicity, suggesting impaired cell respiration as the putative cause of cell death. In long-term HUVEC cultures, inhibition of VCAM-1 expression by rac-1 waned over time, while for the cyclohexanedione-derived rac-8 inhibition seemed to increase. NFκB was inhibited by both rac-1 and rac-8 independent of IκBα degradation. Both ET-CORMs activated Nrf-2 and consequently induced the expression of HO-1. This study provides a rational framework for designing acyloxydiene-Fe(CO)3 complexes as ET-CORMs with differential CO release and biological activities. We also provide a better understanding of how these complexes affect cell biology in mechanistic terms.

  8. Using design of experiments to optimize derivatization with methyl chloroformate for quantitative analysis of the aqueous phase from hydrothermal liquefaction of biomass.

    PubMed

    Madsen, René Bjerregaard; Jensen, Mads Mørk; Mørup, Anders Juul; Houlberg, Kasper; Christensen, Per Sigaard; Klemmer, Maika; Becker, Jacob; Iversen, Bo Brummerstedt; Glasius, Marianne

    2016-03-01

Hydrothermal liquefaction is a promising technique for the production of bio-oil. The process produces an oil phase, a gas phase, a solid residue, and an aqueous phase. Gas chromatography coupled with mass spectrometry is used to analyze the complex aqueous phase. Small organic acids and nitrogen-containing compounds are of particular interest. The efficient derivatization reagent methyl chloroformate was used to make possible the analysis of the complex aqueous phase from hydrothermal liquefaction of dried distillers grains with solubles. A circumscribed central composite design was used to optimize the responses of both derivatized and nonderivatized analytes, which included small organic acids, pyrazines, phenol, and cyclic ketones. Response surface methodology was used to visualize significant factors and identify optimized derivatization conditions (volumes of methyl chloroformate, NaOH solution, methanol, and pyridine). Twenty-nine analytes among the small organic acids, pyrazines, phenol, and cyclic ketones were quantified. An additional three analytes were pseudoquantified with use of standards with similar mass spectra. Calibration curves with high correlation coefficients were obtained, in most cases R² > 0.991. The method was validated for repeatability, and spike recoveries were obtained for all 29 analytes. The 32 analytes were quantified in samples from the commissioning of a continuous flow reactor and in samples from recirculation experiments involving the aqueous phase. The results indicated when the steady-state condition of the flow reactor was reached and showed the effects of recirculation. The validated method will be especially useful for investigations of the effect of small organic acids on the hydrothermal liquefaction process.
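Response surface methodology, as used in this abstract, fits a second-order polynomial to the responses measured at the design points and reads the optimum off the fitted surface. A hedged numpy sketch (the design points are a generic two-factor central composite layout, and the coefficients are invented for illustration):

```python
import numpy as np

def fit_quadratic_surface(x1, x2, y):
    """Least-squares fit of the standard second-order RSM model for two
    coded factors: y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2."""
    X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

# Coded central composite design points for two factors (cube, star, centre).
a = 2 ** 0.5
x1 = np.array([-1, -1, 1, 1, -a, a, 0, 0, 0], dtype=float)
x2 = np.array([-1, 1, -1, 1, 0, 0, -a, a, 0], dtype=float)

# Responses generated from hypothetical coefficients, so the fit is exact.
true = np.array([5.0, 1.0, 0.5, -0.8, -0.6, 0.3])
y = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2]) @ true
coef = fit_quadratic_surface(x1, x2, y)
```

A central composite design is used here precisely because its cube, star, and centre points make all six quadratic coefficients estimable.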

  9. 78 FR 52166 - Quantitative Messaging Research

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-22

    ... qualitative message testing research (for which CFTC received fast-track OMB approval) and is necessary to... the survey as well as other qualitative research. Findings from the summary report will be used to... COMMISSION Quantitative Messaging Research AGENCY: Commodity Futures Trading Commission. ACTION:...

  10. Developing Geoscience Students' Quantitative Skills

    NASA Astrophysics Data System (ADS)

    Manduca, C. A.; Hancock, G. S.

    2005-12-01

Sophisticated quantitative skills are an essential tool for the professional geoscientist. While students learn many of these sophisticated skills in graduate school, it is increasingly important that they have a strong grounding in quantitative geoscience as undergraduates. Faculty have developed many strong approaches to teaching these skills in a wide variety of geoscience courses. A workshop in June 2005 brought together eight faculty teaching surface processes and climate change to discuss and refine activities they use and to publish them on the Teaching Quantitative Skills in the Geosciences website (serc.Carleton.edu/quantskills) for broader use. Workshop participants, in consultation with two mathematics faculty with expertise in math education, developed six review criteria to guide discussion: 1) Are the quantitative and geologic goals central and important? (e.g. problem solving, mastery of important skill, modeling, relating theory to observation); 2) Does the activity lead to better problem solving? 3) Are the quantitative skills integrated with geoscience concepts in a way that makes sense for the learning environment and supports learning both quantitative skills and geoscience? 4) Does the methodology support learning? (e.g. motivate and engage students; use multiple representations, incorporate reflection, discussion and synthesis) 5) Are the materials complete and helpful to students? 6) How well has the activity worked when used? Workshop participants found that reviewing each other's activities was very productive because they thought about new ways to teach, and the experience of reviewing helped them think about their own activity from a different point of view. The review criteria focused their thinking about the activity and would be equally helpful in the design of a new activity. We invite a broad international discussion of the criteria (serc.Carleton.edu/quantskills/workshop05/review.html). The Teaching activities can be found on the

  11. Rigour in quantitative research.

    PubMed

    Claydon, Leica Sarah

    2015-07-22

This article, which forms part of the research series, addresses scientific rigour in quantitative research. It explores the basis and use of quantitative research and the nature of scientific rigour. It examines how the reader may determine whether quantitative research results are accurate, the questions that should be asked to determine accuracy and the checklists that may be used in this process. Quantitative research has advantages in nursing, since it can provide numerical data to help answer questions encountered in everyday practice.

  12. Doctoral training in statistics, measurement, and methodology in psychology: replication and extension of Aiken, West, Sechrest, and Reno's (1990) survey of PhD programs in North America.

    PubMed

    Aiken, Leona S; West, Stephen G; Millsap, Roger E

    2008-01-01

    In a survey of all PhD programs in psychology in the United States and Canada, the authors documented the quantitative methodology curriculum (statistics, measurement, and research design) to examine the extent to which innovations in quantitative methodology have diffused into the training of PhDs in psychology. In all, 201 psychology PhD programs (86%) participated. This survey replicated and extended a previous survey (L. S. Aiken, S. G. West, L. B. Sechrest, & R. R. Reno, 1990), permitting examination of curriculum development. Most training supported laboratory and not field research. The median of 1.6 years of training in statistics and measurement was mainly devoted to the modally 1-year introductory statistics course, leaving little room for advanced study. Curricular enhancements were noted in statistics and to a minor degree in measurement. Additional coverage of both fundamental and innovative quantitative methodology is needed. The research design curriculum has largely stagnated, a cause for great concern. Elite programs showed no overall advantage in quantitative training. Forces that support curricular innovation are characterized. Human capital challenges to quantitative training, including recruiting and supporting young quantitative faculty, are discussed. Steps must be taken to bring innovations in quantitative methodology into the curriculum of PhD programs in psychology.

  13. To Survey or Not to Survey: What Is the Question?

    ERIC Educational Resources Information Center

    Hobbs, Walter C.

    1979-01-01

    In designing an institutional advancement survey, it is thought that the researcher must have a clear conception of the survey's purpose. The relationship of research to theory, the nature of the survey, data collection, and analysis and interpretation are discussed. (Author/MLW)

  14. Spatial distribution, sampling precision and survey design optimisation with non-normal variables: The case of anchovy (Engraulis encrasicolus) recruitment in Spanish Mediterranean waters

    NASA Astrophysics Data System (ADS)

    Tugores, M. Pilar; Iglesias, Magdalena; Oñate, Dolores; Miquel, Joan

    2016-02-01

In the Mediterranean Sea, the European anchovy (Engraulis encrasicolus) plays a key role in both ecological and economic terms. Ensuring stock sustainability requires crucial information, such as the species' spatial distribution and unbiased abundance and precision estimates, so that management strategies can be defined (e.g. fishing quotas, temporal closure areas or marine protected areas, MPAs). Furthermore, the estimation of the precision of global abundance at different sampling intensities can be used for survey design optimisation. Geostatistics provides a priori unbiased estimates of the spatial structure, global abundance and precision for autocorrelated data. However, its application to non-Gaussian data introduces difficulties into the analysis, along with reduced robustness or unbiasedness. The present study applied intrinsic geostatistics in two dimensions in order to (i) analyse the spatial distribution of anchovy in Spanish Western Mediterranean waters during the species' recruitment season, (ii) produce distribution maps, (iii) estimate global abundance and its precision, (iv) analyse the effect of changing the sampling intensity on the precision of global abundance estimates and (v) evaluate the effects of several methodological options on the robustness of all the analysed parameters. The results suggested that while the spatial structure was usually non-robust to the tested methodological options when working with the original dataset, it became more robust for the transformed datasets (especially the log-backtransformed dataset). The global abundance was always highly robust, and the global precision was highly or moderately robust to most of the methodological options, except for data transformation.
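The intrinsic geostatistics referred to here rest on the empirical semivariogram, which underlies the spatial-structure and precision estimates. A minimal stdlib-Python sketch of the classical (Matheron) estimator (the lag binning scheme is an illustrative choice):

```python
import math

def empirical_semivariogram(points, values, lag, n_lags):
    """Classical (Matheron) estimator binned by separation distance:
    gamma(h) = sum over pairs at distance ~h of (z_i - z_j)^2 / (2 N(h))."""
    sums, counts = [0.0] * n_lags, [0] * n_lags
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            k = int(math.dist(points[i], points[j]) // lag)  # lag bin index
            if k < n_lags:
                sums[k] += (values[i] - values[j]) ** 2
                counts[k] += 1
    return [s / (2 * c) if c else None for s, c in zip(sums, counts)]
```

A variogram model fitted to these binned values is what kriging and the global-precision calculations would then use; skewed acoustic data are often transformed first, as the abstract describes.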

  15. Changes to the Design of the National Health Interview Survey to Support Enhanced Monitoring of Health Reform Impacts at the State Level

    PubMed Central

    Blewett, Lynn A.; Dahlen, Heather M.; Spencer, Donna; Rivera Drew, Julia A.; Lukanen, Elizabeth

    2016-01-01

    Since 1957, the National Health Interview Survey (NHIS), sponsored by the Centers for Disease Control and Prevention (CDC)’s National Center for Health Statistics (NCHS), has been the primary source of information for monitoring health and health care use of the U.S. population at the national level. The passage of the Patient Protection and Affordable Care Act (ACA) in 2010 generated new needs for data to monitor its implementation and evaluate its effectiveness. In response, the NCHS has taken steps to enhance the content of the NHIS in several key areas and positioned the NHIS as a source of population health information at the national and state levels. This paper reviews recent changes to the NHIS that support enhanced health reform monitoring, including new questions and response categories, sampling design changes to improve state-level analysis, and enhanced dissemination activities. We conclude with a discussion about the importance of the NHIS, the continued need for state-level analysis, and suggestions for future consideration. PMID:27631739

  16. Population and Star Formation Histories from the Outer Limits Survey

    NASA Astrophysics Data System (ADS)

    Brondel, Brian Joseph; Saha, Abhijit; Olszewski, Edward

    2015-08-01

The Outer Limits Survey (OLS) is a deep survey of selected fields in the outlying areas of the Magellanic Clouds based on the MOSAIC-II instrument on the Blanco 4-meter Telescope at CTIO. OLS is designed to probe the outer disk and halo structures of the Magellanic System. The survey comprises ~50 fields obtained in Landolt R, I and Washington C, M and DDO51 filters, extending to a depth of about 24th magnitude in I. While qualitative examination of the resulting data has yielded interesting published results, we report here on quantitative analysis through matching of Hess diagrams to theoretical isochrones. We present analysis based on techniques developed by Dolphin (e.g., 2002, MNRAS, 332, 91) for fields observed by OLS. Our results broadly match those found by qualitative examination of the CMDs, but interesting details emerge from isochrone fitting.
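A Hess diagram is simply a binned density map of the color-magnitude diagram, and Dolphin-style fitting compares the observed and model diagrams with a Poisson likelihood. A stdlib-Python sketch (bin ranges and the likelihood's constant term are simplifying assumptions, not details from this work):

```python
import math

def hess_diagram(color, mag, c_range, m_range, n_c, n_m):
    """Bin stars in colour-magnitude space into a 2-D count array
    (rows = magnitude bins, columns = colour bins)."""
    grid = [[0] * n_c for _ in range(n_m)]
    dc = (c_range[1] - c_range[0]) / n_c
    dm = (m_range[1] - m_range[0]) / n_m
    for c, m in zip(color, mag):
        i = int((m - m_range[0]) / dm)
        j = int((c - c_range[0]) / dc)
        if 0 <= i < n_m and 0 <= j < n_c:
            grid[i][j] += 1
    return grid

def poisson_log_likelihood(observed, model):
    """Poisson fit statistic (up to a data-only constant) comparing an
    observed Hess diagram with a model prediction, in the spirit of
    Dolphin's CMD-fitting method."""
    ll = 0.0
    for obs_row, mod_row in zip(observed, model):
        for o, m in zip(obs_row, mod_row):
            if m > 0:
                ll += o * math.log(m) - m
    return ll
```

In a full star-formation-history fit, the model diagram is a linear combination of isochrone-based partial Hess diagrams whose weights are optimized against this statistic.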

  17. Cumberlandian Mollusk Conservation Program. Activity 1: mussel distribution surveys

    SciTech Connect

    Ahlstedt, S.A.

    1986-01-01

Surveying the distribution of Cumberlandian mollusks in the Tennessee Valley is one of nine research activities developed as part of TVA's Cumberlandian Mollusk Conservation Program (CMCP). The name Cumberlandian refers to an endemic faunal assemblage that encompasses portions of 7 states bordering the southern Appalachian Mountains and the Cumberland Plateau Region. This geographic region is known as one of the major centers of mussel speciation and is considered among the most prolific areas of the world for this particular group of organisms. Nine Tennessee Valley streams were selected for intensive qualitative and quantitative mussel surveys under Activity 1 of the CMCP. The surveys were designed to gather information on the present distribution of Cumberlandian mollusks. The streams chosen for surveys were selected based on the documented presence of diverse mussel fauna, endangered mussels, and/or sufficient indicators (diverse fish fauna, good water quality, etc.) to suggest the potential occurrence of diverse mussel fauna or endangered species.

  18. Tutorial on technology transfer and survey design and data collection for measuring Internet and Intranet existence, usage, and impact (survey-2000) in acute care hospitals in the United States.

    PubMed

    Hatcher, M

    2001-02-01

This paper provides a tutorial on technology transfer for management information systems in health care. Additionally, it describes the process for a national survey of acute care hospitals using a random sample of 813 hospitals. The purpose of the survey was to measure the levels of Internet and Intranet existence and usage in acute care hospitals. The depth of the survey includes e-commerce, both business-to-business and with customers. The relationships with systems approaches, user involvement, user satisfaction and decision-making will be studied. Changes relative to the results of a prior survey conducted in 1997 can be studied, and enabling and inhibiting factors identified. This information will provide benchmarks for hospitals to plan their network technology position and to set goals.

  19. Quantitative immunoglobulins in adulthood.

    PubMed

    Crisp, Howard C; Quinn, James M

    2009-01-01

Although age-related changes in serum immunoglobulins are well described in childhood, alterations in immunoglobulins in the elderly are less well described and published. This study was designed to better define expected immunoglobulin ranges and differences in adults across decades of life. Sera from 404 patients, aged 20-89 years, were analyzed for quantitative immunoglobulin G (IgG), immunoglobulin M (IgM), and immunoglobulin A (IgA). Patients with diagnoses or medications known to affect immunoglobulin levels were identified while the reviewers remained blinded to the immunoglobulin levels. A two-factor ANOVA was performed using decade of life and gender on both the entire sample population and the subset without any disease or medication expected to alter immunoglobulin levels. A literature review was also performed on all English-language articles evaluating quantitative immunoglobulin levels in adults >60 years old. For the entire population, IgM was found to be higher in women than in men (p < 0.001) and lower in the oldest sample population than in the youngest (p < 0.001). For the population without diseases known to affect immunoglobulin levels, the differences in IgM with gender and age were maintained (p ≤ 0.001), and IgA levels were generally higher in the older population (p = 0.009). Elderly patients without diseases known to affect immunoglobulin levels have higher serum IgA levels and lower serum IgM levels. Women have higher IgM levels than men throughout life. IgG levels are not significantly altered in an older population.
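For a balanced layout, the two-factor ANOVA used in this study decomposes the total sum of squares into main-effect, interaction, and error terms. A stdlib-Python sketch of that decomposition (the toy data below are invented; the study's actual decade-by-gender design was almost certainly unbalanced and would need a regression-based ANOVA):

```python
def two_way_anova(data):
    """Balanced two-factor ANOVA with replication. data[i][j] holds the
    replicates for level i of factor A and level j of factor B; returns
    the sums of squares for A, B, the A*B interaction, and error."""
    a, b, n = len(data), len(data[0]), len(data[0][0])
    everything = [x for row in data for cell in row for x in cell]
    gmean = sum(everything) / (a * b * n)
    cell_means = [[sum(cell) / n for cell in row] for row in data]
    a_means = [sum(cell_means[i]) / b for i in range(a)]
    b_means = [sum(cell_means[i][j] for i in range(a)) / a for j in range(b)]
    ss_a = b * n * sum((m - gmean) ** 2 for m in a_means)
    ss_b = a * n * sum((m - gmean) ** 2 for m in b_means)
    ss_ab = n * sum((cell_means[i][j] - a_means[i] - b_means[j] + gmean) ** 2
                    for i in range(a) for j in range(b))
    ss_e = sum((x - cell_means[i][j]) ** 2
               for i in range(a) for j in range(b) for x in data[i][j])
    return {"A": ss_a, "B": ss_b, "AB": ss_ab, "error": ss_e}
```

Dividing each sum of squares by its degrees of freedom and forming F ratios against the error mean square yields the p-values reported in the abstract.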

  20. Bayesian adaptive survey protocols for resource management

    USGS Publications Warehouse

    Halstead, Brian J.; Wylie, Glenn D.; Coates, Peter S.; Casazza, Michael L.

    2011-01-01

    Transparency in resource management decisions requires a proper accounting of uncertainty at multiple stages of the decision-making process. As information becomes available, periodic review and updating of resource management protocols reduces uncertainty and improves management decisions. One of the most basic steps to mitigating anthropogenic effects on populations is determining if a population of a species occurs in an area that will be affected by human activity. Species are rarely detected with certainty, however, and falsely declaring a species absent can cause improper conservation decisions or even extirpation of populations. We propose a method to design survey protocols for imperfectly detected species that accounts for multiple sources of uncertainty in the detection process, is capable of quantitatively incorporating expert opinion into the decision-making process, allows periodic updates to the protocol, and permits resource managers to weigh the severity of consequences if the species is falsely declared absent. We developed our method using the giant gartersnake (Thamnophis gigas), a threatened species precinctive to the Central Valley of California, as a case study. Survey date was negatively related to the probability of detecting the giant gartersnake, and water temperature was positively related to the probability of detecting the giant gartersnake at a sampled location. Reporting sampling effort, timing and duration of surveys, and water temperatures would allow resource managers to evaluate the probability that the giant gartersnake occurs at sampled sites where it is not detected. This information would also allow periodic updates and quantitative evaluation of changes to the giant gartersnake survey protocol. Because it naturally allows multiple sources of information and is predicated upon the idea of updating information, Bayesian analysis is well-suited to solving the problem of developing efficient sampling protocols for species of
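The core Bayesian update behind such a protocol is simple: after repeated surveys with no detection, the posterior probability of presence shrinks according to the per-survey detection probability. A stdlib-Python sketch (constant detection probability across surveys is a simplifying assumption; the study models detection as varying with date and water temperature):

```python
def posterior_presence(prior, p_detect, n_surveys):
    """Posterior probability that the species occupies a site after
    n_surveys surveys with no detection, via Bayes' rule with a
    binomial miss likelihood (1 - p_detect)^n_surveys."""
    miss = (1.0 - p_detect) ** n_surveys
    return prior * miss / (prior * miss + (1.0 - prior))

def surveys_needed(prior, p_detect, threshold):
    """Smallest number of negative surveys before the posterior
    probability of presence falls to `threshold` or below."""
    n = 0
    while posterior_presence(prior, p_detect, n) > threshold:
        n += 1
    return n
```

The consequence-weighting step the authors describe would replace the fixed `threshold` with one chosen from the cost of falsely declaring the species absent.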