Sample records for on-line statistics program

  1. Development of new on-line statistical program for the Korean Society for Radiation Oncology

    PubMed Central

    Song, Si Yeol; Ahn, Seung Do; Chung, Weon Kuu; Choi, Eun Kyung; Cho, Kwan Ho

    2015-01-01

    Purpose: To develop a new on-line statistical program for the Korean Society for Radiation Oncology (KOSRO) to collect and extract medical data in radiation oncology more efficiently. Materials and Methods: The statistical program is a web-based program. The directory was placed in a sub-folder of the KOSRO homepage, and its web address is http://www.kosro.or.kr/asda. The server operating system is Linux and the web server is the Apache HTTP server. MySQL is used as the database (DB) server and PHP as the scripting language. Each ID and password is controlled independently, and all screen pages for data input or analysis are designed to be user-friendly. Scroll-down menus are used throughout for the convenience of users and the consistency of data analysis. Results: Year of data is one of the top categories, and the main topics include human resources, equipment, clinical statistics, specialized treatment and research achievement. Each topic or category has several subcategorized topics. A real-time on-line report of the analysis is produced immediately after each data entry, and the administrator is able to monitor the status of data input at each hospital. Backups of the data as spreadsheets can be accessed by the administrator and used for academic work by any member of the KOSRO. Conclusion: The new on-line statistical program was developed to collect data from departments of radiation oncology nationwide. The intuitive screens and consistent input structure are expected to promote data entry by member hospitals, and the annual statistics should become a cornerstone of advances in radiation oncology. PMID:26157684

  2. Development of new on-line statistical program for the Korean Society for Radiation Oncology.

    PubMed

    Song, Si Yeol; Ahn, Seung Do; Chung, Weon Kuu; Shin, Kyung Hwan; Choi, Eun Kyung; Cho, Kwan Ho

    2015-06-01

    To develop a new on-line statistical program for the Korean Society for Radiation Oncology (KOSRO) to collect and extract medical data in radiation oncology more efficiently. The statistical program is a web-based program. The directory was placed in a sub-folder of the KOSRO homepage, and its web address is http://www.kosro.or.kr/asda. The server operating system is Linux and the web server is the Apache HTTP server. MySQL is used as the database (DB) server and PHP as the scripting language. Each ID and password is controlled independently, and all screen pages for data input or analysis are designed to be user-friendly. Scroll-down menus are used throughout for the convenience of users and the consistency of data analysis. Year of data is one of the top categories, and the main topics include human resources, equipment, clinical statistics, specialized treatment and research achievement. Each topic or category has several subcategorized topics. A real-time on-line report of the analysis is produced immediately after each data entry, and the administrator is able to monitor the status of data input at each hospital. Backups of the data as spreadsheets can be accessed by the administrator and used for academic work by any member of the KOSRO. The new on-line statistical program was developed to collect data from departments of radiation oncology nationwide. The intuitive screens and consistent input structure are expected to promote data entry by member hospitals, and the annual statistics should become a cornerstone of advances in radiation oncology.

  3. Difference to Inference: teaching logical and statistical reasoning through on-line interactivity.

    PubMed

    Malloy, T E

    2001-05-01

    Difference to Inference is an on-line JAVA program that simulates theory testing and falsification through research design and data collection in a game format. The program, based on cognitive and epistemological principles, is designed to support learning of the thinking skills underlying deductive and inductive logic and statistical reasoning. Difference to Inference has database connectivity so that game scores can be counted as part of course grades.

  4. Utah Virtual Lab: JAVA interactivity for teaching science and statistics on line.

    PubMed

    Malloy, T E; Jensen, G C

    2001-05-01

    The Utah on-line Virtual Lab is a JAVA program run dynamically off a database. It is embedded in StatCenter (www.psych.utah.edu/learn/statsampler.html), an on-line collection of tools and text for teaching and learning statistics. Instructors author a statistical virtual reality that simulates theories and data in a specific research focus area by defining independent, predictor, and dependent variables and the relations among them. Students work in an on-line virtual environment to discover the principles of this simulated reality: They go to a library, read theoretical overviews and scientific puzzles, and then go to a lab, design a study, collect and analyze data, and write a report. Each student's design and data analysis decisions are computer-graded and recorded in a database; the written research report can be read by the instructor or by other students in peer groups simulating scientific conventions.

  5. On-Line Analysis of Southern FIA Data

    Treesearch

    Michael P. Spinney; Paul C. Van Deusen; Francis A. Roesch

    2006-01-01

    The Southern On-Line Estimator (SOLE) is a web-based FIA database analysis tool designed with an emphasis on modularity. The Java-based user interface is simple and intuitive to use and the R-based analysis engine is fast and stable. Each component of the program (data retrieval, statistical analysis and output) can be individually modified to accommodate major...

  6. Parallel line analysis: multifunctional software for the biomedical sciences

    NASA Technical Reports Server (NTRS)

    Swank, P. R.; Lewis, M. L.; Damron, K. L.; Morrison, D. R.

    1990-01-01

    An easy to use, interactive FORTRAN program for analyzing the results of parallel line assays is described. The program is menu driven and consists of five major components: data entry, data editing, manual analysis, manual plotting, and automatic analysis and plotting. Data can be entered from the terminal or from previously created data files. The data editing portion of the program is used to inspect and modify data and to statistically identify outliers. The manual analysis component is used to test the assumptions necessary for parallel line assays using analysis of covariance techniques and to determine potency ratios with confidence limits. The manual plotting component provides a graphic display of the data on the terminal screen or on a standard line printer. The automatic portion runs through multiple analyses without operator input. Data may be saved in a special file to expedite input at a future time.
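
    The potency-ratio step described above has a compact numerical core. The following Python sketch (not the FORTRAN program itself; the data and variable names are hypothetical) fits the standard and test dose-response lines with a common slope by least squares and takes the log potency ratio as the difference of intercepts divided by that slope, which is the usual parallel-line-assay calculation.

    ```python
    import numpy as np

    # Hypothetical parallel-line assay data: log-dose x and response y
    # for a standard (s) and a test (t) preparation.
    x_s = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
    y_s = np.array([10.1, 14.8, 20.2, 24.9, 30.3])
    x_t = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
    y_t = np.array([13.0, 18.1, 22.8, 28.2, 32.9])

    # Design matrix for separate intercepts and one common slope:
    # y = a_s*[is standard] + a_t*[is test] + b*x
    n_s, n_t = len(x_s), len(x_t)
    X = np.column_stack([
        np.r_[np.ones(n_s), np.zeros(n_t)],   # intercept of the standard
        np.r_[np.zeros(n_s), np.ones(n_t)],   # intercept of the test
        np.r_[x_s, x_t],                      # common slope
    ])
    y = np.r_[y_s, y_t]

    (a_s, a_t, b), *_ = np.linalg.lstsq(X, y, rcond=None)

    # Horizontal offset between the parallel lines = log potency ratio.
    log_ratio = (a_t - a_s) / b
    print(f"common slope: {b:.3f}")
    print(f"log potency ratio: {log_ratio:.3f}, potency ratio: {10**log_ratio:.3f}")
    ```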

  7. Kendall-Theil Robust Line (KTRLine--version 1.0)-A Visual Basic Program for Calculating and Graphing Robust Nonparametric Estimates of Linear-Regression Coefficients Between Two Continuous Variables

    USGS Publications Warehouse

    Granato, Gregory E.

    2006-01-01

    The Kendall-Theil Robust Line software (KTRLine-version 1.0) is a Visual Basic program that may be used with the Microsoft Windows operating system to calculate parameters for robust, nonparametric estimates of linear-regression coefficients between two continuous variables. The KTRLine software was developed by the U.S. Geological Survey, in cooperation with the Federal Highway Administration, for use in stochastic data modeling with local, regional, and national hydrologic data sets to develop planning-level estimates of potential effects of highway runoff on the quality of receiving waters. The Kendall-Theil robust line was selected because this robust nonparametric method is resistant to the effects of outliers and nonnormality in residuals that commonly characterize hydrologic data sets. The slope of the line is calculated as the median of all possible pairwise slopes between points. The intercept is calculated so that the line will run through the median of input data. A single-line model or a multisegment model may be specified. The program was developed to provide regression equations with an error component for stochastic data generation because nonparametric multisegment regression tools are not available with the software that is commonly used to develop regression models. The Kendall-Theil robust line is a median line and, therefore, may underestimate total mass, volume, or loads unless the error component or a bias correction factor is incorporated into the estimate. Regression statistics such as the median error, the median absolute deviation, the prediction error sum of squares, the root mean square error, the confidence interval for the slope, and the bias correction factor for median estimates are calculated by use of nonparametric methods. These statistics, however, may be used to formulate estimates of mass, volume, or total loads. The program is used to read a two- or three-column tab-delimited input file with variable names in the first row and data in subsequent rows. The user may choose the columns that contain the independent (X) and dependent (Y) variable. A third column, if present, may contain metadata such as the sample-collection location and date. The program screens the input files and plots the data. The KTRLine software is a graphical tool that facilitates development of regression models by use of graphs of the regression line with data, the regression residuals (with X or Y), and percentile plots of the cumulative frequency of the X variable, Y variable, and the regression residuals. The user may individually transform the independent and dependent variables to reduce heteroscedasticity and to linearize data. The program plots the data and the regression line. The program also prints model specifications and regression statistics to the screen. The user may save and print the regression results. The program can accept data sets that contain up to about 15,000 XY data points, but because the program must sort the array of all pairwise slopes, the program may be perceptibly slow with data sets that contain more than about 1,000 points.
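
    Because the report spells out the estimator (the slope is the median of all pairwise slopes and the intercept is chosen so the line passes through the medians of X and Y), a minimal sketch is easy to give. The Python code below illustrates that single-segment estimator; it is not the Visual Basic program, and the data are made up.

    ```python
    import numpy as np

    def kendall_theil_line(x, y):
        """Return (slope, intercept) of the Kendall-Theil robust line:
        slope = median of all pairwise slopes, intercept chosen so the
        line passes through (median(x), median(y))."""
        x = np.asarray(x, dtype=float)
        y = np.asarray(y, dtype=float)
        slopes = []
        n = len(x)
        for i in range(n - 1):
            for j in range(i + 1, n):
                if x[j] != x[i]:                      # skip ties in x
                    slopes.append((y[j] - y[i]) / (x[j] - x[i]))
        slope = np.median(slopes)
        intercept = np.median(y) - slope * np.median(x)
        return slope, intercept

    # Example with a small synthetic data set containing one outlier.
    x = np.array([1, 2, 3, 4, 5, 6, 7], dtype=float)
    y = np.array([2.1, 4.0, 6.2, 7.9, 10.1, 30.0, 14.2])
    m, b = kendall_theil_line(x, y)
    print(f"robust line: y = {m:.2f} * x + {b:.2f}")
    ```

    The double loop over all point pairs is also why the report warns that runs slow down beyond roughly 1,000 points: the number of pairwise slopes grows as n(n-1)/2.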

  8. [Effect of an on-line health promotion program connected with a hospital health examination center on health promotion behavior and health status].

    PubMed

    Park, Jeong Sook; Kwon, Sang Min

    2008-06-01

    The purpose of this study was to evaluate the effect of an on-line health promotion program connected with a hospital health examination center. Based on the contents developed, the www.kmwellbeing.com homepage was developed. The research design was a one-group pretest-posttest design. Seventy-three clients participated in this study. The data were collected from January 3 to June 30, 2005. As a way of utilizing the homepage, this paper attempted to measure the change in health promotion behavior and health status before and after the program (perceived health status and objective health indices: blood pressure, pulse, total cholesterol, blood sugar, waist flexibility, grip strength and lower extremity strength). Data were analyzed by descriptive statistics and paired t-test with the SPSS/Win 12.0 program. There were significant differences in perceived health status, systolic BP, waist flexibility and grip strength. However, there were no significant differences in health promotion behavior, diastolic BP, pulse, lower extremity strength, blood sugar or total cholesterol between the pre-program and post-program measurements. It is expected that an on-line health promotion program connected with a hospital health examination center will provide an effective learning medium for health education and partially contribute to clients' health promotion. A strategy, however, is needed to facilitate the continuous use of the on-line health promotion program by adult clients.

  9. Fieldcrest Cannon, Inc. Advanced Technical Preparation. Statistical Process Control (SPC). PRE-SPC 11: SPC & Graphs. Instructor Book.

    ERIC Educational Resources Information Center

    Averitt, Sallie D.

    This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills in working with line graphs and teaching…

  10. A Statistically Based Training Diagnostic Tool for Marine Aviation

    DTIC Science & Technology

    2014-06-01

    Excerpt (abbreviations and text fragments from the report): ... mission essential task list; MDG, maneuver description guide; MOS, military occupational specialty; MSHARP, Marine Sierra Hotel Aviation Reporting Program ... include the Defense Readiness Reporting System (DRRS) Marine Corps, the Current Readiness Program (CRP), and the Marine Sierra Hotel Aviation ... (Beuschel, 2008). Many of these systems focus on business decisions regarding how companies can increase their bottom line by appealing to customers more ...

  11. A statistical data analysis and plotting program for cloud microphysics experiments

    NASA Technical Reports Server (NTRS)

    Jordan, A. J.

    1981-01-01

    The analysis software developed for atmospheric cloud microphysics experiments conducted in the laboratory as well as aboard a KC-135 aircraft is described. A group of four programs was developed and implemented on a Hewlett Packard 1000 series F minicomputer running under HP's RTE-IVB operating system. The programs control and read data from a MEMODYNE Model 3765-8BV cassette recorder, format the data on the Hewlett Packard disk subsystem, and generate statistical data (mean, variance, standard deviation) and voltage and engineering unit plots on a user selected plotting device. The programs are written in HP FORTRAN IV and HP ASSEMBLY Language with the graphics software using the HP 1000 Graphics. The supported plotting devices are the HP 2647A graphics terminal, the HP 9872B four color pen plotter, and the HP 2608A matrix line printer.

  12. From micro to mainframe. A practical approach to perinatal data processing.

    PubMed

    Yeh, S Y; Lincoln, T

    1985-04-01

    A new, practical approach to perinatal data processing for a large obstetric population is described. This was done with a microcomputer for data entry and a mainframe computer for data reduction. The Screen Oriented Data Access (SODA) program was used to generate the data entry form and to input data into the Apple II Plus computer. Data were stored on diskettes and transmitted through a modem and telephone line to the IBM 370/168 computer. The Statistical Analysis System (SAS) program was used for statistical analyses and report generation. This approach was found to be most practical, flexible, and economical.

  13. A Categorization of Dynamic Analyzers

    NASA Technical Reports Server (NTRS)

    Lujan, Michelle R.

    1997-01-01

    Program analysis techniques and tools are essential to the development process because of the support they provide in detecting errors and deficiencies at different phases of development. The types of information rendered through analysis include the following: statistical measurements of code, type checks, dataflow analysis, consistency checks, test data, verification of code, and debugging information. Analyzers can be broken into two major categories: dynamic and static. Static analyzers examine programs with respect to syntax errors and structural properties. This includes gathering statistical information on program content, such as the number of lines of executable code, source lines, and cyclomatic complexity. In addition, static analyzers provide the ability to check for the consistency of programs with respect to variables. Dynamic analyzers, in contrast, are dependent on input and the execution of a program, providing the ability to find errors that cannot be detected through the use of static analysis alone. Dynamic analysis provides information on the behavior of a program rather than on the syntax. Both types of analysis detect errors in a program, but dynamic analyzers accomplish this through run-time behavior. This paper focuses on the following broad classification of dynamic analyzers: 1) metrics; 2) models; and 3) monitors. Metrics are those analyzers that provide measurement. The next category, models, captures those analyzers that present the state of the program to the user at specified points in time. The last category, monitors, checks specified code based on some criteria. The paper discusses each classification and the techniques that are included under them. In addition, the role of each technique in the software life cycle is discussed. Familiarization with the tools that measure, model and monitor programs provides a framework for understanding the program's dynamic behavior from different perspectives through analysis of the input/output data.

  14. Gnuastro: GNU Astronomy Utilities

    NASA Astrophysics Data System (ADS)

    Akhlaghi, Mohammad

    2018-01-01

    Gnuastro (GNU Astronomy Utilities) manipulates and analyzes astronomical data. It is an official GNU package of a large collection of programs and C/C++ library functions. Command-line programs perform arithmetic operations on images, convert FITS images to common types like JPG or PDF, convolve an image with a given kernel or matching of kernels, perform cosmological calculations, crop parts of large images (possibly in multiple files), manipulate FITS extensions and keywords, and perform statistical operations. In addition, it contains programs to make catalogs from detection maps, add noise, make mock profiles with a variety of radial functions using monte-carlo integration for their centers, match catalogs, and detect objects in an image among many other operations. The command-line programs share the same basic command-line user interface for the comfort of both the users and developers. Gnuastro is written to comply fully with the GNU coding standards and integrates well with all Unix-like operating systems. This enables astronomers to expect a fully familiar experience in the source code, building, installing and command-line user interaction that they have seen in all the other GNU software that they use. Gnuastro's extensive library is included for users who want to build their own unique programs.

  15. Radar error statistics for the space shuttle

    NASA Technical Reports Server (NTRS)

    Lear, W. M.

    1979-01-01

    Radar error statistics of C-band and S-band that are recommended for use with the groundtracking programs to process space shuttle tracking data are presented. The statistics are divided into two parts: bias error statistics, using the subscript B, and high frequency error statistics, using the subscript q. Bias errors may be slowly varying to constant. High frequency random errors (noise) are rapidly varying and may or may not be correlated from sample to sample. Bias errors were mainly due to hardware defects and to errors in correction for atmospheric refraction effects. High frequency noise was mainly due to hardware and due to atmospheric scintillation. Three types of atmospheric scintillation were identified: horizontal, vertical, and line of sight. This was the first time that horizontal and line of sight scintillations were identified.

  16. Arlequin suite ver 3.5: a new series of programs to perform population genetics analyses under Linux and Windows.

    PubMed

    Excoffier, Laurent; Lischer, Heidi E L

    2010-05-01

    We present here a new version of the Arlequin program available under three different forms: a Windows graphical version (Winarl35), a console version of Arlequin (arlecore), and a specific console version to compute summary statistics (arlsumstat). The command-line versions run under both Linux and Windows. The main innovations of the new version include enhanced outputs in XML format, the possibility to embed graphics displaying computation results directly into output files, and the implementation of a new method to detect loci under selection from genome scans. Command-line versions are designed to handle large series of files, and arlsumstat can be used to generate summary statistics from simulated data sets within an Approximate Bayesian Computation framework. © 2010 Blackwell Publishing Ltd.

  17. Sole: Online Analysis of Southern FIA Data

    Treesearch

    Michael P. Spinney; Paul C. Van Deusen; Francis A. Roesch

    2006-01-01

    The Southern On Line Estimator (SOLE) is a flexible modular software program for analyzing U.S. Department of Agriculture Forest Service Forest Inventory and Analysis data. SOLE produces statistical tables, figures, maps, and portable document format reports based on user selected area and variables. SOLE's Java-based graphical user interface is easy to use, and its R-...

  18. A Low-Maintenance Approach to Improving Retention: Short On-Line Tutorials in Elementary Statistics

    ERIC Educational Resources Information Center

    Sargent, Carol Springer; Borthick, A. Faye; Lederberg, Amy R.; Haardorfer, Regine

    2013-01-01

    The struggle to get weak students to use learning support services plagues virtually all retention programs (Friedlander, 1980; Hodges, 2001; Karabenick & Knapp, 1988; Moore & LeDee, 2006; Simpson, Hynd, Nist, & Burrell, 1997; Webster & Dee, 1998). This study presents a cost-effective form of supplemental instruction (SI), in the form of on-line…

  19. A Multi-Wavelength Study of the Hot Component of the Interstellar Medium

    NASA Technical Reports Server (NTRS)

    Oliversen, Ronald J. (Technical Monitor)

    2004-01-01

    This research focuses on the kinematics and evolution of the hot phase of the interstellar medium in the Galaxy. The plan was to measure the UV spectra of all hot stars observed with IUE, in order to identify and measure the main component and any high velocity components to the interstellar lines. Collection of data from higher resolution instruments on HST has been proposed for some of the interesting lines of sight. IUE spectra of 240 stars up to 8 kpc in 2 quadrants of the galactic plane have been examined to (1) estimate the total column density per kpc as a function of direction and distance, and (2) to obtain a lower limit to the number of high velocity components to the interstellar lines, thus giving an approximation of the number of conductive interfaces encountered per line of sight. By determining an approximation to the number of components per unit distance we aim to derive statistics on interfaces between hot and cold gas in the Galaxy. We find that 20% of the stars in this sample show at least one high velocity component in the C IV interstellar line. Two successful FUSE programs address this research and collected data for several of the lines of sight identified as locations of hot, expanding gas with the IUE data. One FUSE program is complete for the Vela SNR region. Data from another FUSE program to investigate the Cygnus superbubble region are being analyzed.

  20. Resident accuracy of joint line palpation using ultrasound verification.

    PubMed

    Rho, Monica E; Chu, Samuel K; Yang, Aaron; Hameed, Farah; Lin, Cindy Yuchin; Hurh, Peter J

    2014-10-01

    To determine the accuracy of knee and acromioclavicular (AC) joint line palpation in Physical Medicine and Rehabilitation (PM&R) residents using ultrasound (US) verification. Cohort study. PM&R residency program at an academic institution. Twenty-four PM&R residents participating in a musculoskeletal US course (7 PGY-2, 8 PGY-3, and 9 PGY4 residents). Twenty-four PM&R residents participating in an US course were asked to palpate the AC joint and lateral joint line of the knee in a female and male model before the start of the course. Once the presumed joint line was localized, the residents were asked to tape an 18-gauge, 1.5-inch, blunt-tip needle parallel to the joint line on the overlying skin. The accuracy of needle placement over the joint line was verified using US. US verification of correct needle placement over the joint line. Overall AC joint palpation accuracy was 16.7%, and knee lateral joint line palpation accuracy was 58.3%. Based on the resident level of education, using a value of P < .05, there were no statistically significant differences in the accuracy of joint line palpation. Residents in this study demonstrate poor accuracy of AC joint and lateral knee joint line identification by palpation, using US as the criterion standard for verification. There were no statistically significant differences in the accuracy rates of joint line palpation based on resident level of education. US may be a useful tool to use to advance the current methods of teaching the physical examination in medical education. Copyright © 2014 American Academy of Physical Medicine and Rehabilitation. Published by Elsevier Inc. All rights reserved.

  1. Stan: A Probabilistic Programming Language for Bayesian Inference and Optimization

    ERIC Educational Resources Information Center

    Gelman, Andrew; Lee, Daniel; Guo, Jiqiang

    2015-01-01

    Stan is a free and open-source C++ program that performs Bayesian inference or optimization for arbitrary user-specified models and can be called from the command line, R, Python, Matlab, or Julia. It has great promise for fitting large and complex statistical models in many areas of application. We discuss Stan from users' and developers'…

  2. KERNELHR: A program for estimating animal home ranges

    USGS Publications Warehouse

    Seaman, D.E.; Griffith, B.; Powell, R.A.

    1998-01-01

    Kernel methods are state of the art for estimating animal home-range area and utilization distribution (UD). The KERNELHR program was developed to provide researchers and managers a tool to implement this extremely flexible set of methods with many variants. KERNELHR runs interactively or from the command line on any personal computer (PC) running DOS. KERNELHR provides output of fixed and adaptive kernel home-range estimates, as well as density values in a format suitable for in-depth statistical and spatial analyses. An additional package of programs creates contour files for plotting in geographic information systems (GIS) and estimates core areas of ranges.
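
    The fixed-kernel utilization distribution that KERNELHR estimates can be approximated in a few lines of Python with a Gaussian kernel density estimate. The sketch below uses hypothetical relocations and the default bandwidth, with no adaptive kernel or cross-validation, so it is only a rough stand-in for the program's options; it finds the density level enclosing 95% of the UD mass and reports the corresponding home-range area.

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    # Hypothetical animal relocation coordinates (x, y), e.g. in metres.
    rng = np.random.default_rng(1)
    points = rng.normal(loc=[500.0, 300.0], scale=[120.0, 80.0], size=(200, 2))

    # Fixed Gaussian kernel density estimate of the utilization distribution.
    kde = gaussian_kde(points.T)

    # Evaluate the UD on a regular grid covering the relocations.
    xg = np.linspace(points[:, 0].min() - 200, points[:, 0].max() + 200, 200)
    yg = np.linspace(points[:, 1].min() - 200, points[:, 1].max() + 200, 200)
    X, Y = np.meshgrid(xg, yg)
    density = kde(np.vstack([X.ravel(), Y.ravel()])).reshape(X.shape)

    # Density level that encloses 95% of the estimated UD mass.
    cell_area = (xg[1] - xg[0]) * (yg[1] - yg[0])
    d = np.sort(density.ravel())[::-1]
    cum_mass = np.cumsum(d) * cell_area
    idx = min(np.searchsorted(cum_mass, 0.95), d.size - 1)
    level_95 = d[idx]

    home_range_area = np.sum(density >= level_95) * cell_area
    print(f"95% fixed-kernel home-range area: {home_range_area:.0f} square units")
    ```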

  3. Financial Bottom Line: Estimating the Cost of Faculty/Adjunct Turnover and Attrition for Online Programs

    ERIC Educational Resources Information Center

    Betts, Kristen S.; Sikorski, Bernadine

    2008-01-01

    Turnover and attrition of online faculty and adjunct faculty is a reality. While there are no reported national statistics or data on annual turnover/attrition for online faculty/adjunct, the overall costs of recruiting, training, and replacing faculty/adjunct can be staggering. Moreover, the short and long term effects of online faculty/adjunct…

  4. GenomeGraphs: integrated genomic data visualization with R.

    PubMed

    Durinck, Steffen; Bullard, James; Spellman, Paul T; Dudoit, Sandrine

    2009-01-06

    Biological studies involve a growing number of distinct high-throughput experiments to characterize samples of interest. There is a lack of methods to visualize these different genomic datasets in a versatile manner. In addition, genomic data analysis requires integrated visualization of experimental data along with constantly changing genomic annotation and statistical analyses. We developed GenomeGraphs, as an add-on software package for the statistical programming environment R, to facilitate integrated visualization of genomic datasets. GenomeGraphs uses the biomaRt package to perform on-line annotation queries to Ensembl and translates these to gene/transcript structures in viewports of the grid graphics package. This allows genomic annotation to be plotted together with experimental data. GenomeGraphs can also be used to plot custom annotation tracks in combination with different experimental data types together in one plot using the same genomic coordinate system. GenomeGraphs is a flexible and extensible software package which can be used to visualize a multitude of genomic datasets within the statistical programming environment R.

  5. Dental hygiene students' perceptions of distance learning: do they change over time?

    PubMed

    Sledge, Rhonda; Vuk, Jasna; Long, Susan

    2014-02-01

    The University of Arkansas for Medical Sciences dental hygiene program established a distant site where the didactic curriculum was broadcast via interactive video from the main campus to the distant site, supplemented with on-line learning via Blackboard. This study compared the perceptions of students towards distance learning as they progressed through the 21 month curriculum. Specifically, the study sought to answer the following questions: Is there a difference in the initial perceptions of students on the main campus and at the distant site toward distance learning? Do students' perceptions change over time with exposure to synchronous distance learning over the course of the curriculum? All 39 subjects were women between the ages of 20 and 35 years. Of the 39 subjects, 37 were Caucasian and 2 were African-American. A 15-question Likert scale survey was administered at 4 different periods during the 21 month program to compare changes in perceptions toward distance learning as students progressed through the program. An independent sample t-test and ANOVA were utilized for statistical analysis. At the beginning of the program, independent samples t-test revealed that students at the main campus (n=34) perceived statistically significantly higher effectiveness of distance learning than students at the distant site (n=5). Repeated measures of ANOVA revealed that perceptions of students at the main campus on effectiveness and advantages of distance learning statistically significantly decreased whereas perceptions of students at distant site statistically significantly increased over time. Distance learning in the dental hygiene program was discussed, and replication of the study with larger samples of students was recommended.

  6. The Keys to Success in Doctoral Studies: A Preimmersion Course.

    PubMed

    Salani, Deborah; Albuja, Laura Dean; Azaiza, Khitam

    2016-01-01

    This article will review an innovative on-line preimmersion course for a hybrid doctor of nursing practice (DNP) program and a traditional face-to-face doctor of philosophy nursing program. The doctoral candidates include both postbaccalaureate and postmaster's students. The authors of the preimmersion course developed and initiated the course in order to address various issues that have surfaced in discussions between students and faculty. Examples of common themes identified include writing skills, statistics, life-work-school balance, and navigating instructional technology. Doctoral studies may pose challenges to students studying nursing, in regard to academic rigor and experiencing on-line education for the first time, especially for students who have been out of school for an extended amount of time or are not accustomed to a nontraditional classroom; thus, having a preimmersion course established may facilitate a smooth transition to rigorous academic studies in a hybrid program. The course, which was developed and delivered through Blackboard, a learning management system, includes the following 9 preimmersion modules: academic strategies (learning styles, creating an effective PowerPoint presentation), library support (introduction to the university library, literature review tutorial, and citation styles), mindfulness, wellness, statistics essentials, writing express, DNP capstone, netiquette, and DNP/doctor of philosophy mentorship. Each module consists of various tools that may promote student success in specific courses and the programs in general. The purpose of designing the preimmersion course is to decrease attrition rates and increase success of the students. While the majority of students have succeeded in their coursework and been graduated from the program, the authors of this article found that many students struggled with the work, life, and school balance. Future work will include the evaluation of results from graduate students enrolled in the program. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. Genetic Programming as Alternative for Predicting Development Effort of Individual Software Projects

    PubMed Central

    Chavoya, Arturo; Lopez-Martin, Cuauhtemoc; Andalon-Garcia, Irma R.; Meda-Campaña, M. E.

    2012-01-01

    Statistical and genetic programming techniques have been used to predict the software development effort of large software projects. In this paper, a genetic programming model was used for predicting the effort required in individually developed projects. Accuracy obtained from a genetic programming model was compared against one generated from the application of a statistical regression model. A sample of 219 projects developed by 71 practitioners was used for generating the two models, whereas another sample of 130 projects developed by 38 practitioners was used for validating them. The models used two kinds of lines of code as well as programming language experience as independent variables. Accuracy results from the model obtained with genetic programming suggest that it could be used to predict the software development effort of individual projects when these projects have been developed in a disciplined manner within a development-controlled environment. PMID:23226305

  8. Single Parents and Displaced Homemakers: A National Problem and Concern. Ohio's Response. A Report on the Carl D. Perkins Vocational Education Act, 1986-1991.

    ERIC Educational Resources Information Center

    Ohio State Dept. of Education, Columbus. Div. of Vocational and Career Education.

    The Carl D. Perkins Vocational Education Act of 1984 included a component focusing on programs for displaced homemakers and single parents. This part of the legislation was precipitated by the following statistics: almost half of all families maintained by single women had incomes below the poverty line; the number of displaced homemakers and…

  9. Object-oriented productivity metrics

    NASA Technical Reports Server (NTRS)

    Connell, John L.; Eller, Nancy

    1992-01-01

    Software productivity metrics are useful for sizing and costing proposed software and for measuring development productivity. Estimating and measuring source lines of code (SLOC) has proven to be a bad idea because it encourages writing more lines of code and using lower level languages. Function Point Analysis is an improved software metric system, but it is not compatible with newer rapid prototyping and object-oriented approaches to software development. A process is presented here for counting object-oriented effort points, based on a preliminary object-oriented analysis. It is proposed that this approach is compatible with object-oriented analysis, design, programming, and rapid prototyping. Statistics gathered on actual projects are presented to validate the approach.

  10. Binary Programming Models of Spatial Pattern Recognition: Applications in Remote Sensing Image Analysis

    DTIC Science & Technology

    1991-12-01

    Excerpt (table of contents and text fragments): 2.6.1 Multi-Shape Detection; 2.6.2 Line Segment Extraction and Re-Combination; 2.6.3 Planimetric Feature Extraction; 2.6.4 Line Segment Extraction From Statistical Texture Analysis; 2.6.5 Edge Following as Graph ... image after image, could benefit due to the fact that major spatial characteristics of subregions could be extracted, and minor spatial changes could be ...

  11. Online Statistical Modeling (Regression Analysis) for Independent Responses

    NASA Astrophysics Data System (ADS)

    Made Tirta, I.; Anggraeni, Dian; Pandutama, Martinus

    2017-06-01

    Regression analysis (statistical modelling) is among the statistical methods most frequently needed in analyzing quantitative data, especially to model the relationship between response and explanatory variables. Nowadays, statistical models have been developed in various directions to model various types of data and complex relationships among variables. A rich variety of advanced and recent statistical modelling methods is available mostly in open source software (one of them being R). However, these advanced statistical modelling methods are not very friendly to novice R users, since they are based on programming scripts or a command line interface. Our research aims to develop a web interface (based on R and Shiny) so that the most recent and advanced statistical modelling methods are readily available, accessible and applicable on the web. We have previously made an interface in the form of an e-tutorial for several modern and advanced statistical modelling methods in R, especially for independent responses (including linear models/LM, generalized linear models/GLM, generalized additive models/GAM and generalized additive models for location scale and shape/GAMLSS). In this research we unified them in the form of data analysis, including models using computer-intensive statistics (bootstrap and Markov chain Monte Carlo/MCMC). All are readily accessible on our online Virtual Statistics Laboratory. The web interface makes statistical modelling easier to apply and easier to compare in order to find the most appropriate model for the data.
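
    One of the computer-intensive methods mentioned, the bootstrap, is straightforward to illustrate outside the web front end (which is built on R and Shiny). The Python sketch below, with hypothetical data, resamples (x, y) cases with replacement to obtain a percentile confidence interval for a regression slope.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical data: a linear relationship with noise.
    x = np.linspace(0, 10, 40)
    y = 2.0 + 0.8 * x + rng.normal(scale=1.5, size=x.size)

    def slope(x, y):
        """Ordinary least-squares slope of y on x."""
        return np.polyfit(x, y, 1)[0]

    # Case-resampling bootstrap of the slope.
    n_boot = 2000
    boot_slopes = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, x.size, size=x.size)   # resample with replacement
        boot_slopes[b] = slope(x[idx], y[idx])

    lo, hi = np.percentile(boot_slopes, [2.5, 97.5])
    print(f"slope = {slope(x, y):.3f}, 95% bootstrap CI = ({lo:.3f}, {hi:.3f})")
    ```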

  12. The Number Density Evolution of Extreme Emission Line Galaxies in 3D-HST: Results from a Novel Automated Line Search Technique for Slitless Spectroscopy

    NASA Astrophysics Data System (ADS)

    Maseda, Michael V.; van der Wel, Arjen; Rix, Hans-Walter; Momcheva, Ivelina; Brammer, Gabriel B.; Franx, Marijn; Lundgren, Britt F.; Skelton, Rosalind E.; Whitaker, Katherine E.

    2018-02-01

    The multiplexing capability of slitless spectroscopy is a powerful asset in creating large spectroscopic data sets, but issues such as spectral confusion make the interpretation of the data challenging. Here we present a new method to search for emission lines in the slitless spectroscopic data from the 3D-HST survey utilizing the Wide-Field Camera 3 on board the Hubble Space Telescope. Using a novel statistical technique, we can detect compact (extended) emission lines at 90% completeness down to fluxes of 1.5 (3.0) × 10⁻¹⁷ erg s⁻¹ cm⁻², close to the noise level of the grism exposures, for objects detected in the deep ancillary photometric data. Unlike previous methods, the Bayesian nature allows for probabilistic line identifications, namely redshift estimates, based on secondary emission line detections and/or photometric redshift priors. As a first application, we measure the comoving number density of Extreme Emission Line Galaxies (restframe [O III] λ5007 equivalent widths in excess of 500 Å). We find that these galaxies are nearly 10× more common above z ∼ 1.5 than at z ≲ 0.5. With upcoming large grism surveys such as Euclid and WFIRST, as well as grisms featured prominently on the NIRISS and NIRCam instruments on the James Webb Space Telescope, methods like the one presented here will be crucial for constructing emission line redshift catalogs in an automated and well-understood manner. This work is based on observations taken by the 3D-HST Treasury Program and the CANDELS Multi-Cycle Treasury Program with the NASA/ESA HST, which is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS5-26555.

  13. Scheduler software for tracking and data relay satellite system loading analysis: User manual and programmer guide

    NASA Technical Reports Server (NTRS)

    Craft, R.; Dunn, C.; Mccord, J.; Simeone, L.

    1980-01-01

    A user guide and programmer documentation is provided for a system of PRIME 400 minicomputer programs. The system was designed to support loading analyses on the Tracking Data Relay Satellite System (TDRSS). The system is a scheduler for various types of data relays (including tape recorder dumps and real time relays) from orbiting payloads to the TDRSS. Several model options are available to statistically generate data relay requirements. TDRSS time lines (representing resources available for scheduling) and payload/TDRSS acquisition and loss of sight time lines are input to the scheduler from disk. Tabulated output from the interactive system includes a summary of the scheduler activities over time intervals specified by the user and overall summary of scheduler input and output information. A history file, which records every event generated by the scheduler, is written to disk to allow further scheduling on remaining resources and to provide data for graphic displays or additional statistical analysis.

  14. "Asymptotic Parabola" Fits for Smoothing Generally Asymmetric Light Curves

    NASA Astrophysics Data System (ADS)

    Andrych, K. D.; Andronov, I. L.; Chinarova, L. L.; Marsakova, V. I.

    A computer program is introduced which allows one to determine a statistically optimal approximation using the "Asymptotic Parabola" fit, or, in other words, a spline consisting of polynomials of order 1, 2, 1: two lines ("asymptotes") connected by a parabola. The function and its derivative are continuous. There are 5 parameters: the two points where a line switches to the parabola and vice versa, the slopes of the lines and the curvature of the parabola. Extreme cases are either a parabola without lines (i.e. a parabola spanning the whole interval), lines without a parabola (zero width of the parabola), or "line + parabola" without a second line. Such an approximation is especially effective for pulsating variables, for which the slopes of the ascending and descending branches are generally different, so the maxima and minima have asymmetric shapes. The method was initially introduced by Marsakova and Andronov (1996OAP.....9...127M) and realized as a computer program written in QBasic under DOS. It was used for dozens of variable stars, particularly for the catalogs of the individual characteristics of pulsations of the Mira (1998OAP....11...79M) and semi-regular (200OAP....13..116C) pulsating variables. For eclipsing variables with nearly symmetric shapes of the minima, we use a "symmetric" version of the "Asymptotic Parabola". Here we introduce a Windows-based program which does not have the DOS limitations on memory (number of observations) and screen resolution. The program has a user-friendly interface and is illustrated by an application to a test signal and to the pulsating variable AC Her.
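
    The fit itself is easy to state: on [x1, x2] the curve is a parabola, and outside that interval it continues along the tangent lines at x1 and x2, so both the function and its derivative are continuous. The Python sketch below evaluates such a curve and fits it by least squares for fixed break points found by a simple grid scan; it illustrates the idea only and is not the authors' QBasic or Windows program.

    ```python
    import numpy as np

    def asymptotic_parabola(x, x1, x2, a, b, c):
        """Parabola a*x^2 + b*x + c on [x1, x2], continued outside the
        interval by its tangent lines at x1 and x2 (C1-continuous)."""
        x = np.asarray(x, dtype=float)
        p  = lambda t: a * t**2 + b * t + c
        dp = lambda t: 2 * a * t + b
        y = p(x)
        y = np.where(x < x1, p(x1) + dp(x1) * (x - x1), y)
        y = np.where(x > x2, p(x2) + dp(x2) * (x - x2), y)
        return y

    def fit_ap(x, y, x1, x2):
        """Least-squares coefficients (a, b, c) for fixed break points:
        the piecewise model is linear in (a, b, c) once x1, x2 are fixed."""
        cols = [asymptotic_parabola(x, x1, x2, *unit)
                for unit in [(1, 0, 0), (0, 1, 0), (0, 0, 1)]]
        A = np.column_stack(cols)
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        return coef, np.sum((y - A @ coef) ** 2)

    # Example: fit a hypothetical asymmetric light-curve extremum.
    rng = np.random.default_rng(2)
    phase = np.linspace(-1.0, 1.0, 60)
    mag = asymptotic_parabola(phase, -0.3, 0.2, 2.0, -0.4, 11.0)
    mag = mag + rng.normal(scale=0.02, size=phase.size)

    best = None
    for x1 in np.linspace(-0.6, 0.0, 13):          # grid scan of break points
        for x2 in np.linspace(0.0, 0.6, 13):
            if x2 <= x1:
                continue
            coef, sse = fit_ap(phase, mag, x1, x2)
            if best is None or sse < best[0]:
                best = (sse, x1, x2, coef)
    print("best break points:", best[1], best[2])
    ```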

  15. Automating approximate Bayesian computation by local linear regression.

    PubMed

    Thornton, Kevin R

    2009-07-07

    In several biological contexts, parameter inference often relies on computationally-intensive techniques. "Approximate Bayesian Computation", or ABC, methods based on summary statistics have become increasingly popular. A particular flavor of ABC based on using a linear regression to approximate the posterior distribution of the parameters, conditional on the summary statistics, is computationally appealing, yet no standalone tool exists to automate the procedure. Here, I describe a program to implement the method. The software package ABCreg implements the local linear-regression approach to ABC. The advantages are: 1. The code is standalone, and fully-documented. 2. The program will automatically process multiple data sets, and create unique output files for each (which may be processed immediately in R), facilitating the testing of inference procedures on simulated data, or the analysis of multiple data sets. 3. The program implements two different transformation methods for the regression step. 4. Analysis options are controlled on the command line by the user, and the program is designed to output warnings for cases where the regression fails. 5. The program does not depend on any particular simulation machinery (coalescent, forward-time, etc.), and therefore is a general tool for processing the results from any simulation. 6. The code is open-source, and modular. Examples of applying the software to empirical data from Drosophila melanogaster, and testing the procedure on simulated data, are shown. In practice, ABCreg simplifies implementing ABC based on local-linear regression.
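
    The local linear-regression adjustment that ABCreg automates can be sketched in a few lines: accept the simulations whose summary statistics lie closest to the observed ones, regress the accepted parameter values on their summary statistics, and shift each accepted value along that regression to the observed summaries. The Python code below is a schematic one-parameter, one-statistic, unweighted version of this idea with a toy model; ABCreg itself adds transformations, weighting and file handling.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Toy problem: estimate the mean theta of a normal with known sd = 1,
    # using the sample mean of n = 20 observations as the summary statistic.
    n = 20
    theta_true = 1.3
    s_obs = rng.normal(theta_true, 1.0, size=n).mean()

    # 1. Simulate from the prior and compute summary statistics.
    n_sims = 50_000
    theta = rng.uniform(-5.0, 5.0, size=n_sims)        # prior draws
    s_sim = rng.normal(theta, 1.0 / np.sqrt(n))        # sampling dist. of the mean

    # 2. Rejection step: keep the simulations closest to the observed summary.
    dist = np.abs(s_sim - s_obs)
    accept = np.argsort(dist)[: int(0.01 * n_sims)]    # closest 1%
    theta_acc, s_acc = theta[accept], s_sim[accept]

    # 3. Linear-regression adjustment (unweighted variant):
    #    theta_adj = theta_acc - b * (s_acc - s_obs), with b from least squares.
    b, a = np.polyfit(s_acc, theta_acc, 1)
    theta_adj = theta_acc - b * (s_acc - s_obs)

    print(f"posterior mean (rejection only):       {theta_acc.mean():.3f}")
    print(f"posterior mean (regression-adjusted):  {theta_adj.mean():.3f}")
    ```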

  16. Preliminary Survey of Icing Conditions Measured During Routine Transcontinental Airline Operation

    NASA Technical Reports Server (NTRS)

    Perkins, Porter J.

    1952-01-01

    Icing data collected on routine operations by four DC-4-type aircraft equipped with NACA pressure-type icing-rate meters are presented as preliminary information obtained from a statistical icing data program sponsored by the NACA with the cooperation of many airline companies and the United States Air Force. The program is continuing on a much greater scale to provide large quantities of data from many air routes in the United States and overseas. Areas not covered by established air routes are also being included in the survey. The four aircraft which collected the data presented in this report were operated by United Air Lines over a transcontinental route from January through May, 1951. The pressure-type icing-rate meter was found to be satisfactory for collecting statistical data during routine operations. Data obtained on routine flight icing encounters from these four instrumented aircraft, although insufficient for a conclusive statistical analysis, provide a greater quantity and considerably more realistic information than that obtained from random research flights. A summary of statistical data will be published when the information obtained during the 1951-52 icing season and that to be obtained during the 1952-53 season can be analyzed and assembled. The 1951-52 data already analyzed indicate that the quantity, quality, and range of icing information being provided by this expanded program should afford a sound basis for ice-protection-system design by defining the important meteorological parameters of the icing cloud.

  17. RAD-ADAPT: Software for modelling clonogenic assay data in radiation biology.

    PubMed

    Zhang, Yaping; Hu, Kaiqiang; Beumer, Jan H; Bakkenist, Christopher J; D'Argenio, David Z

    2017-04-01

    We present a comprehensive software program, RAD-ADAPT, for the quantitative analysis of clonogenic assays in radiation biology. Two commonly used models for clonogenic assay analysis, the linear-quadratic model and single-hit multi-target model, are included in the software. RAD-ADAPT uses maximum likelihood estimation method to obtain parameter estimates with the assumption that cell colony count data follow a Poisson distribution. The program has an intuitive interface, generates model prediction plots, tabulates model parameter estimates, and allows automatic statistical comparison of parameters between different groups. The RAD-ADAPT interface is written using the statistical software R and the underlying computations are accomplished by the ADAPT software system for pharmacokinetic/pharmacodynamic systems analysis. The use of RAD-ADAPT is demonstrated using an example that examines the impact of pharmacologic ATM and ATR kinase inhibition on human lung cancer cell line A549 after ionizing radiation. Copyright © 2017 Elsevier B.V. All rights reserved.
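
    The likelihood described (colony counts treated as Poisson with a linear-quadratic survival model) can be written out directly. The Python sketch below, with hypothetical assay data, maximizes that Poisson likelihood for the plating efficiency and the alpha and beta coefficients; it is a minimal stand-in, not RAD-ADAPT, which wraps the ADAPT system and also fits the single-hit multi-target model and compares groups.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import gammaln

    # Hypothetical clonogenic assay data: dose (Gy), cells plated, colonies counted.
    dose     = np.array([0.0, 0.0, 2.0, 2.0, 4.0, 4.0, 6.0, 6.0, 8.0, 8.0])
    plated   = np.array([200, 200, 400, 400, 1000, 1000, 4000, 4000, 20000, 20000])
    colonies = np.array([102, 96, 118, 110, 105, 99, 121, 116, 104, 98])

    def neg_log_lik(params):
        """Poisson negative log-likelihood for the linear-quadratic model:
        expected colonies = plated * PE * exp(-(alpha*D + beta*D^2))."""
        log_pe, alpha, beta = params
        mu = plated * np.exp(log_pe) * np.exp(-(alpha * dose + beta * dose**2))
        return -np.sum(colonies * np.log(mu) - mu - gammaln(colonies + 1))

    fit = minimize(neg_log_lik, x0=[np.log(0.5), 0.3, 0.03], method="Nelder-Mead")
    log_pe, alpha, beta = fit.x
    print(f"plating efficiency = {np.exp(log_pe):.3f}, "
          f"alpha = {alpha:.3f} /Gy, beta = {beta:.4f} /Gy^2")
    ```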

  18. GAPD and tubulin are suitable internal controls for qPCR analysis of oral squamous cell carcinoma cell lines.

    PubMed

    Campos, M S; Rodini, C O; Pinto-Júnior, D S; Nunes, F D

    2009-02-01

    The selection of housekeeping genes is critical for gene expression studies. To address this issue, four candidate housekeeping genes, including several commonly used ones, were investigated in oral squamous cell carcinoma cell lines. A simple quantitative RT-PCR approach was employed by comparing the relative expression of the four candidate genes within two cancerous cell lines (HN6 and HN31) and one noncancerous cell line (HaCaT), treated or not with EGF and TGF-beta1. Data were analyzed using ANOVA followed by the NormFinder software program. On this basis, the stability of the candidate housekeeping genes was ranked; no statistically significant differences were found using the ANOVA test. On the other hand, NormFinder was able to show that GAPD and TUBB presented the least variable results, representing appropriate housekeeping genes for the samples and conditions analyzed. In conclusion, this study suggests that GAPD and TUBB represent adequate normalizers for gene profiling studies in OSCC cell lines, covering high and low expression level genes, respectively.

  19. Analysis of solar ultraviolet lines

    NASA Technical Reports Server (NTRS)

    Chipman, E.

    1971-01-01

    The formation of the strongest ultra-violet emission lines of Mg II, O I, C II, and C III in the solar atmosphere is studied in detail. The equations of statistical equilibrium and radiative transfer for each ion are solved using a general computer program that is capable of solving non-LTE line-formation problems for arbitrary atmospheric and atomic models. Interpreting the results in terms of the structure of the solar atmosphere, it is concluded that the HSRA atmosphere has a temperature too low by about 500 K near h = 1100 km and that a temperature plateau with T sub e approximately = 18,000 K and width close to 60 km exists in the upper chromosphere. The structure of the solar atmosphere in the range 20,000 to 100,000 K and the effects of microturbulence on the formation of lines are also investigated. Approximate analytic line-formation problems are solved, and more exact solutions are derived later. An attempt is made to make the best possible fit to the Ca II K line center-to-limb profiles with a one-component atmosphere, with an assumed source function and microturbulent velocity.

  20. Computing Models of M-type Host Stars and their Panchromatic Spectral Output

    NASA Astrophysics Data System (ADS)

    Linsky, Jeffrey; Tilipman, Dennis; France, Kevin

    2018-06-01

    We have begun a program of computing state-of-the-art model atmospheres, from the photospheres to the coronae, of M stars that are the host stars of known exoplanets. For each model we are computing the emergent radiation at all wavelengths that are critical for assessing photochemistry and mass loss from exoplanet atmospheres. In particular, we are computing the stellar extreme ultraviolet radiation that drives hydrodynamic mass loss from exoplanet atmospheres and is essential for determining whether an exoplanet is habitable. The model atmospheres are computed with the SSRPM radiative transfer/statistical equilibrium code developed by Dr. Juan Fontenla. The code solves for the non-LTE statistical equilibrium populations of 18,538 levels of 52 atomic and ion species and computes the radiation from all species (435,986 spectral lines) and about 20,000,000 spectral lines of 20 diatomic species. The first model computed in this program was for the modestly active M1.5 V star GJ 832 by Fontenla et al. (ApJ 830, 152 (2016)). We will report on a preliminary model for the more active M5 V star GJ 876 and compare this model and its emergent spectrum with GJ 832. In the future, we will compute and intercompare semi-empirical models and spectra for all of the stars observed with the HST MUSCLES Treasury Survey, the Mega-MUSCLES Treasury Survey, and additional stars including Proxima Cen and Trappist-1. This multiyear theory program is supported by a grant from the Space Telescope Science Institute.

  1. 7 CFR 52.38b - Statistical sampling procedures for on-line inspection by attributes of processed fruits and...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    Title 7, Agriculture; Regulations Governing Inspection and Certification; Sampling; § 52.38b Statistical sampling procedures for on-line inspection by attributes of processed fruits and vegetables.

  2. 7 CFR 52.38b - Statistical sampling procedures for on-line inspection by attributes of processed fruits and...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    Title 7, Agriculture; Regulations Governing Inspection and Certification; Sampling; § 52.38b Statistical sampling procedures for on-line inspection by attributes of processed fruits and vegetables.

  3. A Distributed System for Learning Programming On-Line

    ERIC Educational Resources Information Center

    Verdu, Elena; Regueras, Luisa M.; Verdu, Maria J.; Leal, Jose P.; de Castro, Juan P.; Queiros, Ricardo

    2012-01-01

    Several Web-based on-line judges or on-line programming trainers have been developed in order to allow students to train their programming skills. However, their pedagogical functionalities in the learning of programming have not been clearly defined. EduJudge is a project which aims to integrate the "UVA On-line Judge", an existing…

  4. Computer aided statistical process control for on-line instrumentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meils, D.E.

    1995-01-01

    On-line chemical process instrumentation historically has been used for trending. Recent technological advances in on-line instrumentation have improved the accuracy and reliability of on-line instrumentation. However, little attention has been given to validating and verifying on-line instrumentation. This paper presents two practical approaches for validating instrument performance by comparison of on-line instrument response to either another portable instrument or another bench instrument. Because the comparison of two instruments' performance to each other requires somewhat complex statistical calculations, a computer code (Lab Stats Pack®) is used to simplify the calculations. Lab Stats Pack® also develops control charts that may be used for continuous verification of on-line instrument performance.
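
    A simplified version of the kind of calculation such a code performs, putting control limits on the paired differences between an on-line instrument and a bench instrument measured on the same samples, is sketched below in Python. It is a generic individuals control chart with hypothetical numbers, not the Lab Stats Pack® procedure.

    ```python
    import numpy as np

    # Hypothetical paired measurements of the same samples (e.g. ppm).
    online = np.array([5.02, 4.98, 5.10, 5.05, 4.95, 5.00, 5.08, 4.97, 5.03, 5.01])
    bench  = np.array([5.00, 5.01, 5.06, 5.02, 4.97, 4.99, 5.05, 5.00, 5.01, 5.02])

    diff = online - bench                      # paired differences
    center = diff.mean()

    # Short-term sigma from the average moving range (individuals chart).
    moving_range = np.abs(np.diff(diff))
    sigma = moving_range.mean() / 1.128        # d2 constant for subgroups of size 2

    ucl = center + 3 * sigma                   # upper control limit
    lcl = center - 3 * sigma                   # lower control limit
    print(f"center = {center:+.4f}, UCL = {ucl:+.4f}, LCL = {lcl:+.4f}")

    # Flag any future paired difference outside the control limits.
    new_diff = 0.09
    print("in control" if lcl <= new_diff <= ucl else "out of control")
    ```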

  5. Simulation and study of small numbers of random events

    NASA Technical Reports Server (NTRS)

    Shelton, R. D.

    1986-01-01

    Random events were simulated by computer and subjected to various statistical methods to extract important parameters. Various forms of curve fitting were explored, such as least squares, least distance from a line, and maximum likelihood. Problems considered were dead time, exponential decay, and spectrum extraction from cosmic ray data, using binned data and data from individual events. Computer programs, mostly of an iterative nature, were developed to do these simulations and extractions and are partially listed as appendices. The mathematical basis for the computer programs is given.
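
    One of the problems listed, extracting an exponential decay constant from a small number of events, has a compact illustration. The Python sketch below (a re-creation of the idea, not the original iterative programs in the appendices) compares the maximum-likelihood estimate with a least-squares fit to a binned histogram for a small simulated sample, the kind of comparison the study describes.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    tau_true = 2.0                            # true mean life
    t = rng.exponential(tau_true, size=50)    # small sample of decay times

    # Maximum-likelihood estimate: for an exponential, tau_hat = sample mean.
    tau_mle = t.mean()

    # Least-squares alternative: fit log(counts) vs bin centre on a histogram.
    counts, edges = np.histogram(t, bins=8)
    centres = 0.5 * (edges[:-1] + edges[1:])
    mask = counts > 0                         # avoid log(0) in empty bins
    slope, intercept = np.polyfit(centres[mask], np.log(counts[mask]), 1)
    tau_lsq = -1.0 / slope

    print(f"true tau = {tau_true}, MLE = {tau_mle:.2f}, "
          f"binned least squares = {tau_lsq:.2f}")
    ```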

  6. The exclusion from welfare benefits: Resentment and survey attrition in a randomized controlled trial in Mexico.

    PubMed

    Stecklov, Guy; Weinreb, Alexander; Winters, Paul

    2016-11-01

    Public policy programs must often impose limits on who may be eligible for benefits. Despite research on the impact of exclusion in developed countries, there is little evidence on how people react to being excluded from benefits in developing societies. Utilizing repeated waves of data from an experimental evaluation of Mexico's foundational PROGRESA antipoverty program, we examine the impact of exclusion and distinguish two separate forms. "Statistical exclusion" occurs where determination of benefits is based on randomized assignment to a treatment and control group. "Needs-based exclusion" occurs when benefits programs are designed to be selective rather than universal, basing eligibility on characteristics, like relative poverty, that are difficult to measure simply and accurately. Focusing on temporal variation in survey non-response as our behavioral outcome, we show that needs-based exclusion has much greater negative effects on continued participation than statistical exclusion. We also show that these effects are concentrated among the wealthy, that is, those furthest from the eligibility cut-off line. These findings reinforce general concerns about the validity of evaluation studies when incentives are at work. We discuss both the behavioral explanations that might underlie these findings as well as some potential approaches to reduce threats to evaluation validity. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. Index of flood maps prepared by the U.S. Geological Survey through 1973

    USGS Publications Warehouse

    Carrigan, Philip Hadley

    1974-01-01

    A listing is presented of flood maps prepared by the U.S. Geological Survey through 1973. Maps are listed by State and county and the list provides information on the type of flooding depicted and the reliability of the delineation.The list was prepared from a computer file, and an available program allows retrieval of data by land-line location, State and county, and Standard Metropolitan Statistical Area (SMSA). The file will be continuously updated.

  8. Condylar guidance: correlation between protrusive interocclusal record and panoramic radiographic image: a pilot study.

    PubMed

    Tannamala, Pavan Kumar; Pulagam, Mahesh; Pottem, Srinivas R; Swapna, B

    2012-04-01

    The purpose of this study was to compare the sagittal condylar guidance angles set on a Hanau articulator using an intraoral protrusive record with those found using a panoramic radiographic image. Ten patients, free of signs and symptoms of temporomandibular disorder and with intact dentition, were selected. The dental stone casts of the subjects were mounted on a Hanau articulator with a springbow and poly(vinyl siloxane) interocclusal records. For all patients, the protrusive records were obtained when the mandible moved forward by approximately 6 mm. All procedures for recording, mounting, and setting were done in the same session. The condylar guidance angles obtained were tabulated. A panoramic radiographic image of each patient was made with the Frankfurt horizontal plane parallel to the floor of the mouth. Tracings of the radiographic images were made. The horizontal reference line was marked by joining the orbitale and porion. The most superior and most inferior points of the curvatures were identified. These two points were connected by a straight line representing the mean curvature line. Angles made by the intersection of the mean curvature line and the horizontal reference line were measured. The results were subjected to statistical analysis with a significance level of p < 0.05. The radiographic values were on average 4° greater than the values obtained by the protrusive interocclusal record method. The difference in mean condylar guidance angle between the right and left sides was not statistically significant for either method. Comparisons of the mean condylar guidance angles obtained by the two methods on the right side and on the left side (p = 0.071 and p = 0.057, respectively) showed no statistically significant differences. Within the limitations of this study, it was concluded that the protrusive condylar guidance angles obtained by panoramic radiograph may be used in programming semi-adjustable articulators. © 2012 by the American College of Prosthodontists.

  9. On-line capacity-building program on "analysis of data" for medical educators in the South Asia region: a qualitative exploration of our experience.

    PubMed

    Dongre, A R; Chacko, T V; Banu, S; Bhandary, S; Sahasrabudhe, R A; Philip, S; Deshmukh, P R

    2010-11-01

    In medical education, using the World Wide Web is a new approach for building the capacity of faculty. However, there is little information available on medical education researchers' needs and their collective learning outcomes in such on-line environments. Hence, the present study attempted: 1) to identify needs for capacity-building of fellows in a faculty development program on the topic of data analysis; and 2) to describe, analyze and understand the collective learning outcomes of the fellows during this need-based on-line session. The present research is based on quantitative (on-line survey for needs assessment) and qualitative (contents of e-mails exchanged in listserv discussion) data which were generated during the October 2009 Mentoring and Learning (M-L) Web discussion on the topic of data analysis. The data sources were shared e-mail responses during the process of planning and executing the M-L Web discussion. Content analysis was undertaken and the categories of discussion were presented as a simple non-hierarchical typology which represents the collective learning of the project fellows. We identified the types of learning needs on the topic 'Analysis of Data' to be addressed for faculty development in the field of education research. This need-based M-L Web discussion could then facilitate collective learning on such topics as basic concepts in statistics, tests of significance, Likert scale analysis, bivariate correlation, simple regression analysis, and content analysis of qualitative data. Steps like identifying the learning needs for an on-line M-L Web discussion, addressing the immediate needs of learners and creating a flexible reflective learning environment on the M-L Web facilitated the collective learning of the fellows on the topic of data analysis. Our outcomes can be useful in the design of on-line pedagogical strategies for supporting research in medical education.

  10. Methods for computational disease surveillance in infection prevention and control: Statistical process control versus Twitter's anomaly and breakout detection algorithms.

    PubMed

    Wiemken, Timothy L; Furmanek, Stephen P; Mattingly, William A; Wright, Marc-Oliver; Persaud, Annuradha K; Guinn, Brian E; Carrico, Ruth M; Arnold, Forest W; Ramirez, Julio A

    2018-02-01

    Although not all health care-associated infections (HAIs) are preventable, reducing HAIs through targeted intervention is key to a successful infection prevention program. To identify areas in need of targeted intervention, robust statistical methods must be used when analyzing surveillance data. The objective of this study was to compare and contrast statistical process control (SPC) charts with Twitter's anomaly and breakout detection algorithms. SPC and anomaly/breakout detection (ABD) charts were created for vancomycin-resistant Enterococcus, Acinetobacter baumannii, catheter-associated urinary tract infection, and central line-associated bloodstream infection data. Both SPC and ABD charts detected similar data points as anomalous/out of control on most charts. The vancomycin-resistant Enterococcus ABD chart detected an extra anomalous point that appeared to be higher than the same time period in prior years. Using a small subset of the central line-associated bloodstream infection data, the ABD chart was able to detect anomalies where the SPC chart was not. SPC charts and ABD charts both performed well, although ABD charts appeared to work better in the context of seasonal variation and autocorrelation. Because they account for common statistical issues in HAI data, ABD charts may be useful for practitioners for analysis of HAI surveillance data. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
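
    A minimal sketch of a Shewhart-style control chart (a c-chart with 3-sigma limits) of the kind compared in the study; the counts are synthetic, and the study itself used dedicated SPC software and Twitter's anomaly/breakout detection packages:

    ```python
    import numpy as np

    counts = np.array([4, 6, 3, 5, 7, 4, 2, 5, 6, 12, 4, 3])   # synthetic monthly HAI counts

    c_bar = counts.mean()
    ucl = c_bar + 3 * np.sqrt(c_bar)              # Poisson-based 3-sigma limits
    lcl = max(c_bar - 3 * np.sqrt(c_bar), 0.0)

    for month, c in enumerate(counts, start=1):
        flag = "  <- out of control" if (c > ucl or c < lcl) else ""
        print(f"month {month:2d}: count = {c:2d}{flag}")
    print(f"center line = {c_bar:.2f}, LCL = {lcl:.2f}, UCL = {ucl:.2f}")
    ```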

  11. TOAD Editor

    NASA Technical Reports Server (NTRS)

    Bingle, Bradford D.; Shea, Anne L.; Hofler, Alicia S.

    1993-01-01

    Transferable Output ASCII Data (TOAD) computer program (LAR-13755) implements format designed to facilitate transfer of data across communication networks and dissimilar host computer systems. Any data file conforming to TOAD format standard called TOAD file. TOAD Editor is interactive software tool for manipulating contents of TOAD files. Commonly used to extract filtered subsets of data for visualization of results of computation. Also offers such user-oriented features as on-line help, clear English error messages, startup file, macroinstructions defined by user, command history, user variables, UNDO features, and full complement of mathematical, statistical, and conversion functions. Companion program, TOAD Gateway (LAR-14484), converts data files from variety of other file formats to that of TOAD. TOAD Editor written in FORTRAN 77.

  12. Growth curves of carcass traits obtained by ultrasonography in three lines of Nellore cattle selected for body weight.

    PubMed

    Coutinho, C C; Mercadante, M E Z; Jorge, A M; Paz, C C P; El Faro, L; Monteiro, F M

    2015-10-30

    The effect of selection for postweaning weight was evaluated through growth curve parameters for both growth and carcass traits. Records of 2404 Nellore animals from three selection lines were analyzed: two selection lines for high postweaning weight, selection (NeS) and traditional (NeT); and a control line (NeC) in which animals were selected for postweaning weight close to the average. Body weight (BW), hip height (HH), rib eye area (REA), back fat thickness (BFT), and rump fat thickness (RFT) were measured and records collected from animals 8 to 20 (males) and 11 to 26 (females) months of age. The parameters A (asymptotic value) and k (growth rate) were estimated using the nonlinear model procedure of the Statistical Analysis System program, which included the fixed effect of line (NeS, NeT, and NeC) in the model, with the objective of evaluating differences in the estimated parameters between lines. Selected animals (NeS and NeT) showed higher growth rates than control line animals (NeC) for all traits. The line effect on curve parameters was significant (P < 0.001) for BW, HH, and REA in males, and for BFT and RFT in females. Selection for postweaning weight was effective in altering growth curves, resulting in animals with higher growth potential.
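
    A sketch of fitting an asymptotic growth curve to weight-age data; the Brody form W(t) = A(1 - b e^(-kt)) and the data below are assumptions for illustration, since the study used the SAS nonlinear procedure and its own model specification:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    age = np.array([8, 10, 12, 14, 16, 18, 20], dtype=float)              # months
    weight = np.array([180, 230, 275, 310, 340, 365, 385], dtype=float)   # kg, synthetic

    def brody(t, A, b, k):
        # A = asymptotic weight, k = growth rate, b = integration constant
        return A * (1.0 - b * np.exp(-k * t))

    (A, b, k), _ = curve_fit(brody, age, weight, p0=(500.0, 0.9, 0.05))
    print(f"asymptotic weight A = {A:.1f} kg, growth rate k = {k:.3f} per month")
    ```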

  13. The effect of CNC and manual laser machining on electrical resistance of HDPE/MWCNT composite

    NASA Astrophysics Data System (ADS)

    Mohammadi, Fatemeh; Farshbaf Zinati, Reza; Fattahi, A. M.

    2018-05-01

    In this study, the electrical conductivity of a high-density polyethylene (HDPE)/multi-walled carbon nanotube (MWCNT) composite was investigated after laser machining. To this end, nano-composite samples produced by the plastic injection process were laser machined with various combinations of input parameters such as feed rate (35, 45, and 55 mm/min), feed angle with the injection flow direction (0°, 45°, and 90°), and MWCNT content (0.5, 1, and 1.5 wt%). The angle between the laser feed and the injected flow direction was set via either of two methods: CNC programming or manual setting. The results showed that both the angle between the laser line and the melt flow direction and the feed rate had statistically significant and physically meaningful impacts on the electrical resistance of the samples in the manual setting. Also, maximum conductivity was seen when the angle between the laser line and the melt flow direction was set to 90° in the manual setting, and at a feed rate of 55 mm/min in both CNC programming and manual setting.

  14. Effective Vaccine Communication during the Disneyland Measles Outbreak

    PubMed Central

    Broniatowski, David Andre; Hilyard, Karen M.; Dredze, Mark

    2016-01-01

    Vaccine refusal rates have increased in recent years, highlighting the need for effective risk communication, especially over social media. Fuzzy-trace theory predicts that individuals encode bottom-line meaning ("gist") and statistical information ("verbatim") in parallel and that articles expressing a clear gist will be most compelling. We coded news articles (n=4,686) collected during the 2014–2015 Disneyland measles outbreak for content including statistics, stories, or opinions containing bottom-line gists regarding vaccines and vaccine-preventable illnesses. We measured the extent to which articles were compelling by how frequently they were shared on Facebook. The most widely shared articles expressed bottom-line opinions, although articles containing statistics were also more likely to be shared than articles lacking statistics. Stories had limited impact on Facebook shares. Results support Fuzzy Trace Theory's predictions regarding the distinct yet parallel impact of categorical gist and statistical verbatim information on public health communication. PMID:27179915

  15. Effective vaccine communication during the disneyland measles outbreak.

    PubMed

    Broniatowski, David A; Hilyard, Karen M; Dredze, Mark

    2016-06-14

    Vaccine refusal rates have increased in recent years, highlighting the need for effective risk communication, especially over social media. Fuzzy-trace theory predicts that individuals encode bottom-line meaning ("gist") and statistical information ("verbatim") in parallel and that articles expressing a clear gist will be most compelling. We coded news articles (n=4581) collected during the 2014-2015 Disneyland measles outbreak for content including statistics, stories, or bottom-line gists regarding vaccines and vaccine-preventable illnesses. We measured the extent to which articles were compelling by how frequently they were shared on Facebook. The most widely shared articles expressed bottom-line gists, although articles containing statistics were also more likely to be shared than articles lacking statistics. Stories had limited impact on Facebook shares. Results support Fuzzy Trace Theory's predictions regarding the distinct yet parallel impact of categorical gist and statistical verbatim information on public health communication. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Bayesian Latent Class Analysis Tutorial.

    PubMed

    Li, Yuelin; Lord-Bessen, Jennifer; Shiyko, Mariya; Loeb, Rebecca

    2018-01-01

    This article is a how-to guide on Bayesian computation using Gibbs sampling, demonstrated in the context of Latent Class Analysis (LCA). It is written for students in quantitative psychology or related fields who have a working knowledge of Bayes Theorem and conditional probability and have experience in writing computer programs in the statistical language R . The overall goals are to provide an accessible and self-contained tutorial, along with a practical computation tool. We begin with how Bayesian computation is typically described in academic articles. Technical difficulties are addressed by a hypothetical, worked-out example. We show how Bayesian computation can be broken down into a series of simpler calculations, which can then be assembled together to complete a computationally more complex model. The details are described much more explicitly than what is typically available in elementary introductions to Bayesian modeling so that readers are not overwhelmed by the mathematics. Moreover, the provided computer program shows how Bayesian LCA can be implemented with relative ease. The computer program is then applied in a large, real-world data set and explained line-by-line. We outline the general steps in how to extend these considerations to other methodological applications. We conclude with suggestions for further readings.
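
    A compact sketch of the conjugate Gibbs updates for a two-class latent class model with binary items, mirroring the computation the tutorial walks through; the tutorial's code is in R, whereas this illustrative version is Python with synthetic data, flat Beta priors, and no treatment of label switching:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Synthetic data: N respondents, K binary items, two latent classes
    N, K = 500, 5
    pi_true = 0.4                                        # P(class = 1)
    p_true = np.array([[0.2, 0.3, 0.15, 0.25, 0.1],      # item probabilities, class 0
                       [0.9, 0.8, 0.85, 0.70, 0.9]])     # item probabilities, class 1
    z_true = rng.binomial(1, pi_true, N)
    Y = rng.binomial(1, p_true[z_true])                  # N x K observed responses

    n_iter, burn = 2000, 500
    pi, p = 0.5, np.full((2, K), 0.5)                    # initial values
    pi_draws, p_draws = [], []

    for it in range(n_iter):
        # 1. Sample class memberships given the current parameters
        like1 = pi * np.prod(p[1] ** Y * (1 - p[1]) ** (1 - Y), axis=1)
        like0 = (1 - pi) * np.prod(p[0] ** Y * (1 - p[0]) ** (1 - Y), axis=1)
        z = rng.binomial(1, like1 / (like0 + like1))

        # 2. Sample prevalence and item probabilities from conjugate Beta posteriors
        n1 = z.sum()
        pi = rng.beta(1 + n1, 1 + N - n1)
        for c in (0, 1):
            Yc = Y[z == c]
            p[c] = rng.beta(1 + Yc.sum(axis=0), 1 + len(Yc) - Yc.sum(axis=0))

        if it >= burn:
            pi_draws.append(pi)
            p_draws.append(p.copy())

    print("posterior mean prevalence of class 1:", round(float(np.mean(pi_draws)), 3))
    print("posterior mean item probabilities, class 1:", np.round(np.mean(p_draws, axis=0)[1], 2))
    ```

    The two steps alternate exactly as in any data-augmentation Gibbs sampler: simpler conditional draws assembled into the more complex joint model, which is the structure the tutorial emphasizes.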

  17. Atomic Data and Spectral Line Intensities for Ne III

    NASA Technical Reports Server (NTRS)

    Bhatia, A. K.; Thomas, R. J.; Landi, E.; Fisher, Richard R. (Technical Monitor)

    2002-01-01

    A number of satellites and rockets have been launched to observe radiation from the Sun and other astrophysical objects. Line radiation is emitted when the electron impact excited levels decay to the lower levels by photon emission. From this radiation, the physical parameters such as electron temperature and density of the astrophysical plasma, elemental abundance, and opacity can be inferred. Ne III lines have been observed in H II regions, Ne-rich filaments in supernovae, and planetary nebulae. The allowed line at 489.50 Angstroms due to the transition 2s(sup 2) 2p(sup 5) (sup 3) P2 (goes to) 2s(sup 2)2p(sup 4)(sup 3)P2 has been identified in the solar spectrum by Vernazza and Reeves using Skylab observations. Other Ne III lines in the solar EUV spectrum have been reported by Thomas and Neupert based on observations from the Solar EUV Rocket Telescope and Spectrograph (SERTS) instrument. Atomic data for Ne III have been calculated by using a set of programs developed at, University College, London. The Superstructure and Distorted Wave (DW) programs have been updated over the years. In the Superstructure program, configuration interaction can be taken into account and radial functions are calculated in a modified Thomas-Fermi-Amaldi potential. This is a statistical potential and depends on parameters lambda 1 which are determined by optimizing the weighted sum of term energies. They are found to be lambda(sub 0)=1.2467, lambda(sub 1)=1.1617, and lambda(sub 2)=1.0663. The relativistic corrections are included by using the Breit-Pauli Hamiltonian as a perturbation to the nonrelativistic Hamiltonian. The same potential is used to calculate reactance matrices in the DW approximation in LS coupling. Collision strengths in intermediate coupling are obtained by using term coupling coefficients obtained from the Superstructure program. In this calculation, the configurations used are 2s(sup 2)2p(sup 4), 2s2p(sup 5), 2s(sup 2)2p(sup 3)3s, 2s(sup 2)p(sup 3)3d giving rise to 57 fine-structure levels in intermediate coupling.

  18. On Designing Multicore-Aware Simulators for Systems Biology Endowed with OnLine Statistics

    PubMed Central

    Calcagno, Cristina; Coppo, Mario

    2014-01-01

    This paper discusses enabling methodologies for the design of a fully parallel, online, interactive tool aimed at supporting bioinformatics scientists. In particular, the features of these methodologies, supported by the FastFlow parallel programming framework, are shown on a simulation tool for the modeling, tuning, and sensitivity analysis of stochastic biological models. A stochastic simulation needs thousands of independent simulation trajectories, which turn into big data that should be analysed by statistical and data-mining tools. In the considered approach the two stages are pipelined in such a way that the simulation stage streams the partial results of all simulation trajectories to the analysis stage, which immediately produces a partial result. The simulation-analysis workflow is validated for performance and for effectiveness of the online analysis in capturing biological systems behavior, on a multicore platform and representative proof-of-concept biological systems. The exploited methodologies include pattern-based parallel programming and data streaming, which provide key features to software designers such as performance portability and efficient in-memory (big) data management and movement. Two paradigmatic classes of biological systems exhibiting multistable and oscillatory behavior are used as a testbed. PMID:25050327

  19. On designing multicore-aware simulators for systems biology endowed with OnLine statistics.

    PubMed

    Aldinucci, Marco; Calcagno, Cristina; Coppo, Mario; Damiani, Ferruccio; Drocco, Maurizio; Sciacca, Eva; Spinella, Salvatore; Torquati, Massimo; Troina, Angelo

    2014-01-01

    This paper discusses enabling methodologies for the design of a fully parallel, online, interactive tool aimed at supporting bioinformatics scientists. In particular, the features of these methodologies, supported by the FastFlow parallel programming framework, are shown on a simulation tool for the modeling, tuning, and sensitivity analysis of stochastic biological models. A stochastic simulation needs thousands of independent simulation trajectories, which turn into big data that should be analysed by statistical and data-mining tools. In the considered approach the two stages are pipelined in such a way that the simulation stage streams the partial results of all simulation trajectories to the analysis stage, which immediately produces a partial result. The simulation-analysis workflow is validated for performance and for effectiveness of the online analysis in capturing biological systems behavior, on a multicore platform and representative proof-of-concept biological systems. The exploited methodologies include pattern-based parallel programming and data streaming, which provide key features to software designers such as performance portability and efficient in-memory (big) data management and movement. Two paradigmatic classes of biological systems exhibiting multistable and oscillatory behavior are used as a testbed.
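
    A minimal sketch of the "online statistics" idea behind the pipelined workflow: a single-pass accumulator (Welford's algorithm) that updates summary statistics as trajectory values stream in, so the analysis stage never stores all raw samples; this is illustrative Python, not the FastFlow implementation:

    ```python
    import random

    class OnlineStats:
        """Single-pass (streaming) mean and variance via Welford's algorithm."""
        def __init__(self):
            self.n, self.mean, self.m2 = 0, 0.0, 0.0

        def update(self, x: float) -> None:
            self.n += 1
            delta = x - self.mean
            self.mean += delta / self.n
            self.m2 += delta * (x - self.mean)

        @property
        def variance(self) -> float:
            return self.m2 / (self.n - 1) if self.n > 1 else 0.0

    acc = OnlineStats()
    for _ in range(10_000):                      # stand-in for values streamed from trajectories
        acc.update(random.gauss(5.0, 2.0))
    print(f"streamed n = {acc.n}, mean = {acc.mean:.3f}, variance = {acc.variance:.3f}")
    ```

    Combining such per-trajectory accumulators gives the partial results the analysis stage can emit while simulations are still running.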

  20. Ultra Barrier Topsheet Film for Flexible Photovoltaics with 3M Company

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Funkenbusch, Arnie; Ruth, Charles

    2014-12-30

    In this DOE sponsored program, 3M achieved the critical UBT features to enable durable flexible high efficiency modules to be produced by a range of customers who have now certified the 3M UBT and are actively developing said flexible modules. The specific objectives and accomplishments of the work under this program were: Scale-up the current Generation-1 UBT from 12” width, as made on 3M’s pilot line, to 1+meter width full-scale manufacturing, while maintaining baseline performance metrics (see table below); This objective was fully met; Validate service life of Generation-1 UBT for the 25+ year lifetime demanded by the photovoltaic market; Aggressive testing revealed potential failure modes in the Gen 1 UBT. Deficiencies were identified and corrective action taken in the Gen 2 UBT; Develop a Generation-2 UBT on the pilot line, targeting improved performance relative to baseline, including higher %T (percent transmission), lower water vapor transmission rate (WVTR) with targets based on what the technology needs for 25 year lifetime, proven lifetime of 25 years in solar module construction in the field, and lower cost; Testing of UBT Gen 2 under a wide range of conditions presented in this report failed to reveal any failure mode. Therefore UBT Gen 2 is known to be highly durable. 3M will continue to test towards statistically validating a 25 year lifetime under 3M funding; Transfer Generation-2 UBT from the pilot line to the full-scale manufacturing line within three years; This objective was fully met.

  1. Exploring Foundation Concepts in Introductory Statistics Using Dynamic Data Points

    ERIC Educational Resources Information Center

    Ekol, George

    2015-01-01

    This paper analyses introductory statistics students' verbal and gestural expressions as they interacted with a dynamic sketch (DS) designed using "Sketchpad" software. The DS involved numeric data points built on the number line whose values changed as the points were dragged along the number line. The study is framed on aggregate…

  2. Visual Survey of Infantry Troops. Part 1. Visual Acuity, Refractive Status, Interpupillary Distance and Visual Skills

    DTIC Science & Technology

    1989-06-01

    letters on one line and several letters on the next line, there is no accurate way to credit these extra letters for statistical analysis. The decimal and...contains the descriptive statistics of the objective refractive error components of infantrymen. Figures 8-11 show the frequency distributions for sphere...equivalents. Nonspectacle wearers: Table 12 contains the descriptive statistics for non-spectacle wearers. Based on these refractive error data, about 30

  3. TcCYS4, a cystatin from cocoa, reduces necrosis triggered by MpNEP2 in tobacco plants.

    PubMed

    Santana, L S; Costa, M G C; Pirovani, N M; Almeida, A F; Alvim, F C; Pirovani, C P

    2014-09-26

    In Brazil, most cocoa bean production occurs in Southern Bahia. Witches' broom disease arrived in this area in 1989 and has since caused heavy losses in production. The disease is caused by the basidiomycete fungus Moniliophthora perniciosa, a hemibiotrophic fungus that produces the necrosis- and ethylene-inducing protein (MpNEP2) during infection; this protein can activate cysteine proteases and induce programmed cell death. Cysteine proteases can be modulated by cystatin. In this study, we overexpressed TcCYS4, a cocoa cystatin, in tobacco plants and evaluated the effect on MpNEP2 in model plants. Tccys4 cDNA was cloned into the pCAMBIA 1390 vector and inserted into the tobacco plants via Agrobacterium tumefaciens. Transgene expression was analyzed by reverse transcription-quantitative PCR and Western blot analysis. Transcript and protein levels in Tccys4:tobacco lines were 8.9- and 1.5-fold higher than in wild-type plants (wt). Tccys4:tobacco lines showed no change in growth compared to wt plants. CO2 net assimilation (A) increased in Tccys4:tobacco lines compared to wt plants. Only one line showed statistically significant stomatal conductance (gs) and transpiration rate (E) changes. MpNEP2 was infiltrated into the foliar mesophyll of Tccys4:tobacco lines and wt plants, and necrotic lesions were attenuated in lines highly expressing Tccys4. Our results suggest that cocoa cystatin TcCYS4 affects MpNEP2 activity related to the progression of programmed cell death in tobacco plants. This may occur through the action of cystatin to inhibit cysteine proteases activated by MpNEP2 in plant tissues. Further studies are necessary to examine cystatin in the Theobroma cacao-M. perniciosa pathosystem.

  4. A sampling plan for conduit-flow karst springs: Minimizing sampling cost and maximizing statistical utility

    USGS Publications Warehouse

    Currens, J.C.

    1999-01-01

    Analytical data for nitrate and triazines from 566 samples collected over a 3-year period at Pleasant Grove Spring, Logan County, KY, were statistically analyzed to determine the minimum data set needed to calculate meaningful yearly averages for a conduit-flow karst spring. Results indicate that a biweekly sampling schedule augmented with bihourly samples from high-flow events will provide meaningful suspended-constituent and dissolved-constituent statistics. Unless collected over an extensive period of time, daily samples may not be representative and may also be autocorrelated. All high-flow events resulting in a significant deflection of a constituent from base-line concentrations should be sampled. Either the geometric mean or the flow-weighted average of the suspended constituents should be used. If automatic samplers are used, then they may be programmed to collect storm samples as frequently as every few minutes to provide details on the arrival time of constituents of interest. However, only samples collected bihourly should be used to calculate averages. By adopting a biweekly sampling schedule augmented with high-flow samples, the need to continuously monitor discharge, or to search for and analyze existing data to develop a statistically valid monitoring plan, is lessened.
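
    A minimal sketch of the two averaging rules recommended above, the flow-weighted mean and the geometric mean of constituent concentrations; the sample values are synthetic and only illustrate the arithmetic:

    ```python
    import numpy as np

    discharge = np.array([0.5, 0.6, 4.2, 8.0, 2.1, 0.7])   # spring discharge at sampling, m^3/s
    conc = np.array([3.0, 2.8, 11.5, 18.0, 6.2, 3.1])      # constituent concentration, mg/L

    flow_weighted_mean = np.sum(conc * discharge) / np.sum(discharge)
    geometric_mean = np.exp(np.mean(np.log(conc)))

    print(f"arithmetic mean    = {conc.mean():.2f} mg/L")
    print(f"flow-weighted mean = {flow_weighted_mean:.2f} mg/L")
    print(f"geometric mean     = {geometric_mean:.2f} mg/L")
    ```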

  5. MHSS: a material handling system simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pomernacki, L.; Hollstien, R.B.

    1976-04-07

    A Material Handling System Simulator (MHSS) program is described that provides specialized functional blocks for modeling and simulation of nuclear material handling systems. Models of nuclear fuel fabrication plants may be built using functional blocks that simulate material receiving, storage, transport, inventory, processing, and shipping operations as well as the control and reporting tasks of operators or on-line computers. Blocks are also provided that allow the user to observe and gather statistical information on the dynamic behavior of simulated plants over single or replicated runs. Although it is currently being developed for the nuclear materials handling application, MHSS can be adapted to other industries in which material accountability is important. In this paper, emphasis is on the simulation methodology of the MHSS program with application to the nuclear material safeguards problem. (auth)

  6. Cruise Report for G1-03-GM, USGS Gas Hydrates Cruise, R/V Gyre, 1-14 May 2003, Northern Gulf of Mexico

    USGS Publications Warehouse

    Hutchinson, Deborah R.; Hart, Patrick E.

    2004-01-01

    This report gives a summary of the field program and instrumentation used on the R/V Gyre in the Gulf of Mexico in May, 2003, to collect multichannel seismic data in support of USGS and Department of Energy gas hydrate studies. Tabulated statistics, metadata, figures and maps are included to show the breadth of data collected and preliminary interpretations made during the field program. Geophysical data collected during this cruise will be released in a separate report. At the start of the cruise, three test lines were run to compare different source configurations in order to optimize data quality for the objectives of the cruise. The source chosen was the 13/13 in3 Generator-Injector (GI) Gun. Following these tests, a total of 101 lines (approximately 1033 km) of 24-channel high-resolution seismic reflection data were collected in the northern Gulf of Mexico. 59 lines (about 600 km) were collected in and around lease block Keathley Canyon 195. An additional 4 lines (85 km) provided a seismic tie between the Keathley Canyon data and USGS multichannel data collected in 1999. About 253 km of data were collected along 35 short lines in and around lease block Atwater Valley 14 on the floor of the Mississippi Canyon. Three lines (53 km) completed the cruise and provided a seismic tie to USGS multichannel data collected in 1998. Two on-board trained marine-mammal observers fulfilled the requirements determined by NOAA/National Marine Fisheries Service to avoid incidental harassment of marine mammals as established in the Marine Mammal Protection Act (MMPA). A total of three species of dolphins were observed during the cruise and one basking shark. No sperm whales were sighted. During the cruise, seismic operations were not delayed or terminated because of marine mammal activity.

  7. A computer program for thermal radiation from gaseous rocket exhaust plumes (GASRAD)

    NASA Technical Reports Server (NTRS)

    Reardon, J. E.; Lee, Y. C.

    1979-01-01

    A computer code is presented for predicting incident thermal radiation from defined plume gas properties in either axisymmetric or cylindrical coordinate systems. The radiation model is a statistical band model for exponential line strength distribution with Lorentz/Doppler line shapes for 5 gaseous species (H2O, CO2, CO, HCl and HF) and an approximate (non-scattering) treatment of carbon particles. The Curtis-Godson approximation is used for inhomogeneous gases, but a subroutine is available for using Young's intuitive derivative method for H2O with Lorentz line shape and exponentially-tailed-inverse line strength distribution. The geometry model provides integration over a hemisphere with up to 6 individually oriented identical axisymmetric plumes or a single 3-D plume. Shading surfaces may be used in any of 7 shapes, and a conical limit may be defined for the plume to set individual line-of-sight limits. Intermediate coordinate systems may be specified to simplify input of plumes and shading surfaces.

  8. Development and Validation of Methods for Applying Pharmacokinetic Data in Risk Assessment. Volume 7. PBPK SIM

    DTIC Science & Technology

    1990-12-01

    [Fragment of the PBPKSIM documentation; the original table of contents and window-layout figure did not survive extraction. The contents list sections on executing PBPKSIM, the Main Menu, File Selection, Data, Simulation, All, Statistics, Change directory, DOS Shell, and Exit.] The windows of the PBPKSIM program are based upon a common design consisting of a Title, a Menu Bar, an Information Line, a Main Display Area, and a Status Area. Title shows the location of the program by supplying the name of the window being executed. Menu Bar displays the other windows or other

  9. Operating a Geiger Müller tube using a PC sound card

    NASA Astrophysics Data System (ADS)

    Azooz, A. A.

    2009-01-01

    In this paper, a simple MATLAB-based PC program that enables the computer to function as a replacement for the electronic scaler-counter system associated with a Geiger-Müller (GM) tube is described. The program utilizes the ability of MATLAB to acquire data directly from the computer sound card. The signal from the GM tube is applied to the computer sound card via the line-in port. All standard GM experiments, including pulse shape and statistical analysis experiments, can be carried out using this system. A new visual demonstration of dead time effects is also presented.
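
    A minimal sketch of the counting idea: treat digitized sound-card samples as the GM signal and count pulses as upward threshold crossings; the waveform below is synthetic and the sketch is Python, whereas the described program is MATLAB-based and reads the sound card directly:

    ```python
    import numpy as np

    fs = 44_100                                          # sound-card sampling rate, Hz
    t = np.arange(0, 1.0, 1 / fs)
    signal = 0.01 * np.random.default_rng(3).normal(size=t.size)      # baseline noise
    for t0 in np.random.default_rng(4).uniform(0, 1.0, 120):          # ~120 synthetic GM pulses
        mask = (t >= t0) & (t < t0 + 2e-4)
        signal[mask] += 0.8 * np.exp(-(t[mask] - t0) / 1e-4)          # ~100 microsecond decay

    threshold = 0.3
    above = signal > threshold
    pulses = np.count_nonzero(above[1:] & ~above[:-1])                # rising edges only
    print(f"counted {pulses} pulses in 1 s of audio")
    ```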

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jarocki, John Charles; Zage, David John; Fisher, Andrew N.

    LinkShop is a software tool for applying the method of Linkography to the analysis of time-sequence data. LinkShop provides command line, web, and application programming interfaces (API) for input and processing of time-sequence data, abstraction models, and ontologies. The software creates graph representations of the abstraction model, ontology, and derived linkograph. Finally, the tool allows the user to perform statistical measurements of the linkograph and refine the ontology through direct manipulation of the linkograph.

  11. msap: a tool for the statistical analysis of methylation-sensitive amplified polymorphism data.

    PubMed

    Pérez-Figueroa, A

    2013-05-01

    In this study, msap, an R package that analyses methylation-sensitive amplified polymorphism (MSAP or MS-AFLP) data, is presented. The program provides a deep analysis of epigenetic variation starting from a binary data matrix indicating the banding pattern between the isoschizomeric endonucleases HpaII and MspI, with differential sensitivity to cytosine methylation. After comparing the restriction fragments, the program determines if each fragment is susceptible to methylation (representative of epigenetic variation) or if there is no evidence of methylation (representative of genetic variation). The package provides, in a user-friendly command line interface, a pipeline of different analyses of the variation (genetic and epigenetic) among user-defined groups of samples, as well as the classification of the methylation occurrences in those groups. Statistical testing provides support to the analyses. A comprehensive report of the analyses and several useful plots could help researchers to assess the epigenetic and genetic variation in their MSAP experiments. msap is downloadable from CRAN (http://cran.r-project.org/) and its own webpage (http://msap.r-forge.R-project.org/). The package is intended to be easy to use even for those people unfamiliar with the R command line environment. Advanced users may take advantage of the available source code to adapt msap to more complex analyses. © 2013 Blackwell Publishing Ltd.

  12. Tool for Statistical Analysis and Display of Landing Sites

    NASA Technical Reports Server (NTRS)

    Wawrzyniak, Geoffrey; Kennedy, Brian; Knocke, Philip; Michel, John

    2006-01-01

    MarsLS is a software tool for analyzing statistical dispersion of spacecraft-landing sites and displaying the results of its analyses. Originally intended for the Mars Exploration Rover (MER) mission, MarsLS is also applicable to landing sites on Earth and non-MER sites on Mars. MarsLS is a collection of interdependent MATLAB scripts that utilize the MATLAB graphical-user-interface software environment to display landing-site data on calibrated image-maps of the Martian or other terrain. The landing-site data comprise latitude/longitude pairs generated by Monte Carlo runs of other computer programs that simulate entry, descent, and landing. Using these data, MarsLS can compute a landing-site ellipse, a standard means of depicting the area within which the spacecraft can be expected to land with a given probability. MarsLS incorporates several features for the user's convenience, including capabilities for drawing lines and ellipses, overlaying kilometer or latitude/longitude grids, drawing and/or specifying lines and/or points, entering notes, defining and/or displaying polygons to indicate hazards or areas of interest, and evaluating hazardous and/or scientifically interesting areas. As part of such an evaluation, MarsLS can compute the probability of landing in a specified polygonal area.
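
    A sketch of the statistical core of a landing-site ellipse: fit a bivariate Gaussian to Monte Carlo landing points and scale its principal axes to a chosen containment probability; this is illustrative Python with synthetic points, not the MATLAB MarsLS code:

    ```python
    import numpy as np
    from scipy.stats import chi2

    rng = np.random.default_rng(7)
    # Synthetic dispersed landing points (east/north offsets from the target, km)
    pts = rng.multivariate_normal([0.0, 0.0], [[40.0, 18.0], [18.0, 12.0]], size=5000)

    cov = np.cov(pts, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)             # ascending eigenvalues

    prob = 0.99                                      # containment probability
    semi_axes = np.sqrt(chi2.ppf(prob, df=2) * eigval)
    tilt = np.degrees(np.arctan2(eigvec[1, -1], eigvec[0, -1]))

    print(f"99% ellipse: {2 * semi_axes[-1]:.1f} km x {2 * semi_axes[0]:.1f} km, "
          f"major axis rotated {tilt:.1f} deg from east")
    ```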

  13. AESS: Accelerated Exact Stochastic Simulation

    NASA Astrophysics Data System (ADS)

    Jenkins, David D.; Peterson, Gregory D.

    2011-12-01

    The Stochastic Simulation Algorithm (SSA) developed by Gillespie provides a powerful mechanism for exploring the behavior of chemical systems with small species populations or with important noise contributions. Gene circuit simulations for systems biology commonly employ the SSA method, as do ecological applications. This algorithm tends to be computationally expensive, so researchers seek an efficient implementation of SSA. In this program package, the Accelerated Exact Stochastic Simulation Algorithm (AESS) contains optimized implementations of Gillespie's SSA that improve the performance of individual simulation runs or ensembles of simulations used for sweeping parameters or to provide statistically significant results. Program summary: Program title: AESS Catalogue identifier: AEJW_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJW_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: University of Tennessee copyright agreement No. of lines in distributed program, including test data, etc.: 10 861 No. of bytes in distributed program, including test data, etc.: 394 631 Distribution format: tar.gz Programming language: C for processors, CUDA for NVIDIA GPUs Computer: Developed and tested on various x86 computers and NVIDIA C1060 Tesla and GTX 480 Fermi GPUs. The system targets x86 workstations, optionally with multicore processors or NVIDIA GPUs as accelerators. Operating system: Tested under Ubuntu Linux OS and CentOS 5.5 Linux OS Classification: 3, 16.12 Nature of problem: Simulation of chemical systems, particularly with low species populations, can be accurately performed using Gillespie's method of stochastic simulation. Numerous variations on the original stochastic simulation algorithm have been developed, including approaches that produce results with statistics that exactly match the chemical master equation (CME) as well as other approaches that approximate the CME. Solution method: The Accelerated Exact Stochastic Simulation (AESS) tool provides implementations of a wide variety of popular variations on the Gillespie method. Users can select the specific algorithm considered most appropriate. Comparisons between the methods and with other available implementations indicate that AESS provides the fastest known implementation of Gillespie's method for a variety of test models. Users may wish to execute ensembles of simulations to sweep parameters or to obtain better statistical results, so AESS supports acceleration of ensembles of simulation using parallel processing with MPI, SSE vector units on x86 processors, and/or using NVIDIA GPUs with CUDA.
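
    A minimal sketch of Gillespie's direct method, the algorithm AESS accelerates, applied to a simple birth-death system (production and degradation of one species); plain Python for clarity, not the optimized C/CUDA implementations in AESS:

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    k_prod, k_deg = 5.0, 0.1          # production (molecules/s) and degradation (1/s) rates
    x, t, t_end = 0, 0.0, 200.0       # copy number, time, stop time
    events = 0

    while t < t_end:
        a = np.array([k_prod, k_deg * x])        # reaction propensities
        a0 = a.sum()
        if a0 == 0.0:
            break
        t += rng.exponential(1.0 / a0)           # time to next reaction
        if rng.random() * a0 < a[0]:             # pick which reaction fires
            x += 1                               # production
        else:
            x -= 1                               # degradation
        events += 1

    print(f"{events} reaction events; final copy number {x} "
          f"(theoretical steady-state mean {k_prod / k_deg:.0f})")
    ```

    In practice one runs thousands of such trajectories, which is exactly the ensemble workload AESS parallelizes with MPI, SSE, or CUDA.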

  14. FDA Approval Summary: Pembrolizumab for Treatment of Metastatic Non‐Small Cell Lung Cancer: First‐Line Therapy and Beyond

    PubMed Central

    Blumenthal, Gideon M.; Li, Hongshan; Subramaniam, Sriram; Mishra‐Kalyani, Pallavi S.; He, Kun; Zhao, Hong; Yu, Jingyu; Paciga, Mark; Goldberg, Kirsten B.; McKee, Amy E.; Keegan, Patricia; Pazdur, Richard

    2017-01-01

    Abstract On October 24, 2016, the U.S. Food and Drug Administration (FDA) approved pembrolizumab (Keytruda; Merck & Co., Inc., https://www.merck.com) for treatment of patients with metastatic non‐small cell lung cancer (mNSCLC) whose tumors express programmed death‐ligand 1 (PD‐L1) as determined by an FDA‐approved test, as follows: (a) first‐line treatment of patients with mNSCLC whose tumors have high PD‐L1 expression (tumor proportion score [TPS] ≥50%), with no epidermal growth factor receptor (EGFR) or anaplastic lymphoma kinase (ALK) genomic tumor aberrations, and (b) treatment of patients with mNSCLC whose tumors express PD‐L1 (TPS ≥1%), with disease progression on or after platinum‐containing chemotherapy. Patients with EGFR or ALK genomic tumor aberrations should have disease progression on FDA‐approved therapy for these aberrations prior to receiving pembrolizumab. Approval was based on two randomized, open‐label, active‐controlled trials demonstrating statistically significant improvements in progression‐free survival (PFS) and overall survival (OS) for patients randomized to pembrolizumab compared with chemotherapy. In KEYNOTE−024, patients with previously untreated mNSCLC who received pembrolizumab (200 mg intravenously [IV] every 3 weeks) had a statistically significant improvement in OS (hazard ratio [HR] 0.60; 95% confidence interval [CI]: 0.41–0.89; p = .005), and significant improvement in PFS (HR 0.50; 95% CI: 0.37–0.68; p < .001). In KEYNOTE‐010, patients with disease progression on or after platinum‐containing chemotherapy received pembrolizumab IV 2 mg/kg, 10 mg/kg, or docetaxel 75 mg/m2 every 3 weeks. The HR and p value for OS was 0.71 (95% CI: 0.58–0.88), p < .001 comparing pembrolizumab 2 mg/kg with chemotherapy and the HR and p value for OS was 0.61 (95% CI: 0.49–0.75), p < .001 comparing pembrolizumab 10 mg/kg with chemotherapy. Implications for Practice. This is the first U.S. Food and Drug Administration approval of a checkpoint inhibitor for first‐line treatment of lung cancer. This approval expands the pembrolizumab indication in second‐line treatment of lung cancer to include all patients with programmed death‐ligand 1‐expressing non‐small cell lung cancer. PMID:28835513

  15. Adaptability and stability of soybean genotypes in off-season cultivation.

    PubMed

    Batista, R O; Hamawaki, R L; Sousa, L B; Nogueira, A P O; Hamawaki, O T

    2015-08-14

    The oil and protein contents of soybean grains are important quantitative traits for use in breeding. However, few breeding programs perform selection based on these traits in different environments. This study assessed the adaptability and stability of 14 elite early soybean breeding lines in off-season cultivation with respect to yield, and oil and protein contents. A range of statistical methods was applied and these analyses indicated that for off-season cultivation, the lines UFUS 5 and UFUS 10 could be recommended due to their superior performance in grain yield, oil content, and specific adaptability to unfavorable environments along with high stability in these characteristics. Also recommended were UFUS 06, which demonstrated superior performance in all three tested characteristics and showed adaptation to favorable environments, and UFUS 13, which showed high adaptability and stability and a superior performance for protein content.

  16. GOT C+: A Herschel Space Observatory Key Program to Study the Diffuse ISM

    NASA Astrophysics Data System (ADS)

    Langer, William; Goldsmith, P. F.; Li, D.; Velusamy, T.; Yorke, H. W.

    2009-01-01

    Galactic Observations of the Terahertz C+ Line (GOT C+) is a Herschel Space Observatory (HSO) Key Program to study the diffuse interstellar medium by sampling the C+ fine structure line emission at 1.9 THz (158 microns) in the Galactic disk. Star formation activity is regulated by pressures in the interstellar medium, which in turn depend on heating and cooling rates, modulated by the gravitational potential, and shock and turbulent pressures. To understand these processes we need information about properties of the diffuse atomic and diffuse molecular gas clouds. The 158-micron CII line is an important tracer of diffuse regions, and C+ is a major ISM coolant, the Galaxy's strongest emission line virtually unobscured by dust, with a total luminosity about a 1000 times that of CO J=1-0. The GOT C+ program will obtain high spectral resolution CII spectra using the Heterodyne Instrument for the Far Infrared (HIFI) receiver. It will employ deep integrations, wide velocity coverage (350 km/s) with 0.22 km/s resolution, and systematic sparse sampling of the Galactic disk together with observations of selected targets, of over 900 lines of sight. It will be a resource to determine the properties of the atomic gas, in the (a) overall Galactic disk, (b) central 300pc of the Galactic center, (c) Galactic warp, (d) high latitude HI clouds, and (e) Photon Dominated Regions (PDRs). These spectra will provide the astronomical community with a rich statistical database of diffuse cloud properties, especially those of the atomic gas, sampled throughout the Galaxy for understanding the role of barometric pressure and turbulence in cloud evolution in the Galactic ISM and, by extension, other galaxies. The GOT C+ project will provide a template for future even larger-scale Galactic C+ surveys. This research was conducted at the Jet Propulsion Laboratory and is supported by a NASA grant.

  17. Genomic selection and association mapping in rice (Oryza sativa): effect of trait genetic architecture, training population composition, marker number and statistical model on accuracy of rice genomic selection in elite, tropical rice breeding lines.

    PubMed

    Spindel, Jennifer; Begum, Hasina; Akdemir, Deniz; Virk, Parminder; Collard, Bertrand; Redoña, Edilberto; Atlin, Gary; Jannink, Jean-Luc; McCouch, Susan R

    2015-02-01

    Genomic Selection (GS) is a new breeding method in which genome-wide markers are used to predict the breeding value of individuals in a breeding population. GS has been shown to improve breeding efficiency in dairy cattle and several crop plant species, and here we evaluate for the first time its efficacy for breeding inbred lines of rice. We performed a genome-wide association study (GWAS) in conjunction with five-fold GS cross-validation on a population of 363 elite breeding lines from the International Rice Research Institute's (IRRI) irrigated rice breeding program and herein report the GS results. The population was genotyped with 73,147 markers using genotyping-by-sequencing. The training population, statistical method used to build the GS model, number of markers, and trait were varied to determine their effect on prediction accuracy. For all three traits, genomic prediction models outperformed prediction based on pedigree records alone. Prediction accuracies ranged from 0.31 and 0.34 for grain yield and plant height to 0.63 for flowering time. Analyses using subsets of the full marker set suggest that using one marker every 0.2 cM is sufficient for genomic selection in this collection of rice breeding materials. RR-BLUP was the best performing statistical method for grain yield where no large effect QTL were detected by GWAS, while for flowering time, where a single very large effect QTL was detected, the non-GS multiple linear regression method outperformed GS models. For plant height, in which four mid-sized QTL were identified by GWAS, random forest produced the most consistently accurate GS models. Our results suggest that GS, informed by GWAS interpretations of genetic architecture and population structure, could become an effective tool for increasing the efficiency of rice breeding as the costs of genotyping continue to decline.

  18. Genomic Selection and Association Mapping in Rice (Oryza sativa): Effect of Trait Genetic Architecture, Training Population Composition, Marker Number and Statistical Model on Accuracy of Rice Genomic Selection in Elite, Tropical Rice Breeding Lines

    PubMed Central

    Spindel, Jennifer; Begum, Hasina; Akdemir, Deniz; Virk, Parminder; Collard, Bertrand; Redoña, Edilberto; Atlin, Gary; Jannink, Jean-Luc; McCouch, Susan R.

    2015-01-01

    Genomic Selection (GS) is a new breeding method in which genome-wide markers are used to predict the breeding value of individuals in a breeding population. GS has been shown to improve breeding efficiency in dairy cattle and several crop plant species, and here we evaluate for the first time its efficacy for breeding inbred lines of rice. We performed a genome-wide association study (GWAS) in conjunction with five-fold GS cross-validation on a population of 363 elite breeding lines from the International Rice Research Institute's (IRRI) irrigated rice breeding program and herein report the GS results. The population was genotyped with 73,147 markers using genotyping-by-sequencing. The training population, statistical method used to build the GS model, number of markers, and trait were varied to determine their effect on prediction accuracy. For all three traits, genomic prediction models outperformed prediction based on pedigree records alone. Prediction accuracies ranged from 0.31 and 0.34 for grain yield and plant height to 0.63 for flowering time. Analyses using subsets of the full marker set suggest that using one marker every 0.2 cM is sufficient for genomic selection in this collection of rice breeding materials. RR-BLUP was the best performing statistical method for grain yield where no large effect QTL were detected by GWAS, while for flowering time, where a single very large effect QTL was detected, the non-GS multiple linear regression method outperformed GS models. For plant height, in which four mid-sized QTL were identified by GWAS, random forest produced the most consistently accurate GS models. Our results suggest that GS, informed by GWAS interpretations of genetic architecture and population structure, could become an effective tool for increasing the efficiency of rice breeding as the costs of genotyping continue to decline. PMID:25689273
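
    A minimal sketch of the RR-BLUP idea used for genomic prediction: ridge regression of phenotypes on genome-wide marker genotypes with equal shrinkage on every marker; the data and the penalty value are synthetic assumptions, and the study itself used dedicated GS software with cross-validation:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n_lines, n_markers = 300, 2000
    Z = rng.integers(0, 3, size=(n_lines, n_markers)).astype(float)   # 0/1/2 genotype codes
    Z -= Z.mean(axis=0)                                               # center each marker

    u_true = rng.normal(0.0, 0.05, n_markers)                         # true marker effects
    y = Z @ u_true + rng.normal(0.0, 1.0, n_lines)                    # phenotypes

    lam = 100.0                                                       # ridge penalty (assumed)
    u_hat = np.linalg.solve(Z.T @ Z + lam * np.eye(n_markers), Z.T @ y)

    gebv = Z @ u_hat                                                  # genomic estimated breeding values
    acc = np.corrcoef(gebv, Z @ u_true)[0, 1]
    print(f"prediction accuracy (correlation with true genetic value): {acc:.3f}")
    ```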

  19. CRM Assessment: Determining the Generalization of Rater Calibration Training. Summary of Research Report: Gold Standards Training

    NASA Technical Reports Server (NTRS)

    Baker, David P.

    2002-01-01

    The extent to which pilot instructors are trained to assess crew resource management (CRM) skills accurately during Line-Oriented Flight Training (LOFT) and Line Operational Evaluation (LOE) scenarios is critical. Pilot instructors must make accurate performance ratings to ensure that proper feedback is provided to flight crews and appropriate decisions are made regarding certification to fly the line. Furthermore, the Federal Aviation Administration's (FAA) Advanced Qualification Program (AQP) requires that instructors be trained explicitly to evaluate both technical and CRM performance (i.e., rater training) and also requires that proficiency and standardization of instructors be verified periodically. To address the critical need for effective pilot instructor training, the American Institutes for Research (AIR) reviewed the relevant research on rater training and, based on "best practices" from this research, developed a new strategy for training pilot instructors to assess crew performance. In addition, we explored new statistical techniques for assessing the effectiveness of pilot instructor training. The results of our research are briefly summarized below. This summary is followed by abstracts of articles and book chapters published under this grant.

  20. Statistical significance of task related deep brain EEG dynamic changes in the time-frequency domain.

    PubMed

    Chládek, J; Brázdil, M; Halámek, J; Plešinger, F; Jurák, P

    2013-01-01

    We present an off-line analysis procedure for exploring brain activity recorded from intra-cerebral electroencephalographic data (SEEG). The objective is to determine the statistical differences between different types of stimulations in the time-frequency domain. The procedure is based on computing relative signal power change and subsequent statistical analysis. An example of characteristic statistically significant event-related de/synchronization (ERD/ERS) detected across different frequency bands following different oddball stimuli is presented. The method is used for off-line functional classification of different brain areas.
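
    A minimal sketch of the core computation: event-related relative power change (ERD/ERS) in one frequency band with a nonparametric test across trials; the trials are synthetic, whereas the published procedure operates on intracerebral SEEG recordings:

    ```python
    import numpy as np
    from scipy.signal import welch
    from scipy.stats import wilcoxon

    rng = np.random.default_rng(9)
    fs, n_trials = 512, 40
    band = (8.0, 12.0)                                     # frequency band of interest, Hz

    def band_power(x):
        f, pxx = welch(x, fs=fs, nperseg=256)
        sel = (f >= band[0]) & (f <= band[1])
        return pxx[sel].mean()

    baseline_pow, active_pow = [], []
    for _ in range(n_trials):
        baseline_pow.append(band_power(rng.normal(0, 1.0, fs)))   # 1 s pre-stimulus
        active_pow.append(band_power(rng.normal(0, 0.7, fs)))     # 1 s post-stimulus (suppressed)

    baseline_pow, active_pow = np.array(baseline_pow), np.array(active_pow)
    erd = 100.0 * (active_pow - baseline_pow) / baseline_pow      # relative power change per trial, %
    stat, p = wilcoxon(active_pow, baseline_pow)
    print(f"mean relative power change = {erd.mean():.1f}% (Wilcoxon p = {p:.4f})")
    ```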

  1. Bayesian Atmospheric Radiative Transfer (BART) Code and Application to WASP-43b

    NASA Astrophysics Data System (ADS)

    Blecic, Jasmina; Harrington, Joseph; Cubillos, Patricio; Bowman, Oliver; Rojo, Patricio; Stemm, Madison; Lust, Nathaniel B.; Challener, Ryan; Foster, Austin James; Foster, Andrew S.; Blumenthal, Sarah D.; Bruce, Dylan

    2016-01-01

    We present a new open-source Bayesian radiative-transfer framework, Bayesian Atmospheric Radiative Transfer (BART, https://github.com/exosports/BART), and its application to WASP-43b. BART initializes a model for the atmospheric retrieval calculation, generates thousands of theoretical model spectra using parametrized pressure and temperature profiles and line-by-line radiative-transfer calculation, and employs a statistical package to compare the models with the observations. It consists of three self-sufficient modules available to the community under the reproducible-research license: the Thermochemical Equilibrium Abundances module (TEA, https://github.com/dzesmin/TEA, Blecic et al. 2015), the radiative-transfer module (Transit, https://github.com/exosports/transit), and the Multi-core Markov-chain Monte Carlo statistical module (MCcubed, https://github.com/pcubillos/MCcubed, Cubillos et al. 2015). We applied BART to all available WASP-43b secondary-eclipse data from space- and ground-based observations, constraining the temperature-pressure profile and molecular abundances of the dayside atmosphere of WASP-43b. This work was supported by NASA Planetary Atmospheres grant NNX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G. JB holds a NASA Earth and Space Science Fellowship.

  2. A comprehensive study on pavement edge line implementation.

    DOT National Transportation Integrated Search

    2014-04-01

    The previous 2011 study Safety Improvement from Edge Lines on Rural Two-Lane Highways analyzed the crash data of three years before and one year after edge line implementation by using the latest safety analysis statistical method. It concl...

  3. Interactive graphics for the Macintosh: software review of FlexiGraphs.

    PubMed

    Antonak, R F

    1990-01-01

    While this product is clearly unique, its usefulness to individuals outside small business environments is somewhat limited. FlexiGraphs is, however, a reasonable first attempt to design a microcomputer software package that controls data through interactive editing within a graph. Although the graphics capabilities of mainframe programs such as MINITAB (Ryan, Joiner, & Ryan, 1981) and the graphic manipulations available through exploratory data analysis (e.g., Velleman & Hoaglin, 1981) will not be surpassed anytime soon by this program, a researcher may want to add this program to a software library containing other Macintosh statistics, drawing, and graphics programs if only to obtain the easy-to-use curve fitting and line smoothing options. I welcome the opportunity to review the enhanced "scientific" version of FlexiGraphs that the author of the program indicates is currently under development. An MS-DOS version of the program should be available within the year.

  4. Implementation of an experimental program to investigate the performance characteristics of OMEGA navigation

    NASA Technical Reports Server (NTRS)

    Baxa, E. G., Jr.

    1974-01-01

    A theoretical formulation of differential and composite OMEGA error is presented to establish hypotheses about the functional relationships between various parameters and OMEGA navigational errors. Computer software developed to provide for extensive statistical analysis of the phase data is described. Results from the regression analysis used to conduct parameter sensitivity studies on differential OMEGA error tend to validate the theoretically based hypothesis concerning the relationship between uncorrected differential OMEGA error and receiver separation range and azimuth. Limited results of measurement of receiver repeatability error and line of position measurement error are also presented.
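
    A sketch of the kind of regression-based sensitivity analysis described above: ordinary least squares of differential OMEGA error on receiver separation range and azimuth; the linear-in-parameters form and the data are assumptions for illustration, and the report's actual model may differ:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 200
    sep_range = rng.uniform(10, 300, n)                 # receiver separation range, nautical miles
    azimuth = rng.uniform(0, 360, n)                    # relative azimuth, degrees
    error = (0.02 * sep_range + 0.5 * np.cos(np.radians(azimuth))
             + rng.normal(0, 0.3, n))                   # synthetic differential OMEGA error

    # Design matrix: intercept, range, and azimuth expressed as cosine/sine components
    X = np.column_stack([np.ones(n), sep_range,
                         np.cos(np.radians(azimuth)), np.sin(np.radians(azimuth))])
    coef, _, _, _ = np.linalg.lstsq(X, error, rcond=None)
    print("fitted coefficients (intercept, range, cos az, sin az):", np.round(coef, 3))
    ```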

  5. Employing online quantum random number generators for generating truly random quantum states in Mathematica

    NASA Astrophysics Data System (ADS)

    Miszczak, Jarosław Adam

    2013-01-01

    The presented package for the Mathematica computing system allows the harnessing of quantum random number generators (QRNG) for investigating the statistical properties of quantum states. The described package implements a number of functions for generating random states. The new version of the package adds the ability to use the on-line quantum random number generator service and implements new functions for retrieving lists of random numbers. Thanks to the introduced improvements, the new version provides faster access to high-quality sources of random numbers and can be used in simulations requiring large amounts of random data. New version program summary Program title: TRQS Catalogue identifier: AEKA_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKA_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 18 134 No. of bytes in distributed program, including test data, etc.: 2 520 49 Distribution format: tar.gz Programming language: Mathematica, C. Computer: Any supporting Mathematica in version 7 or higher. Operating system: Any platform supporting Mathematica; tested with GNU/Linux (32 and 64 bit). RAM: Case-dependent Supplementary material: Fig. 1 mentioned below can be downloaded. Classification: 4.15. External routines: Quantis software library (http://www.idquantique.com/support/quantis-trng.html) Catalogue identifier of previous version: AEKA_v1_0 Journal reference of previous version: Comput. Phys. Comm. 183(2012)118 Does the new version supersede the previous version?: Yes Nature of problem: Generation of random density matrices and utilization of high-quality random numbers for the purpose of computer simulation. Solution method: Use of a physical quantum random number generator and an on-line service providing access to a source of true random numbers generated by a quantum random number generator. Reasons for new version: Added support for the high-speed on-line quantum random number generator and improved methods for retrieving lists of random numbers. Summary of revisions: The presented version provides two significant improvements. The first one is the ability to use the on-line Quantum Random Number Generation service developed by PicoQuant GmbH and the Nano-Optics groups at the Department of Physics of Humboldt University. The on-line service supported in version 2.0 of the TRQS package provides faster access to true randomness sources constructed using the laws of quantum physics. The service is freely available at https://qrng.physik.hu-berlin.de/. The use of this service allows using the presented package without the need for a physical quantum random number generator. The second improvement introduced in this version is the ability to retrieve arrays of random data directly from the source used. This increases the speed of the random number generation, especially in the case of an on-line service, where it reduces the time necessary to establish the connection. Thanks to the speed improvement of the presented version, the package can now be used in simulations requiring larger amounts of random data. Moreover, the functions for generating random numbers provided by the current version of the package more closely follow the pattern of functions for generating pseudo-random numbers provided in Mathematica.
Additional comments: Speed comparison: The implementation of the support for the QRNG on-line service provides a noticeable improvement in the speed of random number generation. For samples of real numbers of size 10^1, 10^2, …, 10^7, the times required to generate these samples using the Quantis USB device and the QRNG service are compared in Fig. 1. The presented results show that the use of the on-line service provides faster access to random numbers. One should note, however, that the speed gain can increase or decrease depending on the connection speed between the computer and the server providing random numbers. Running time: Depends on the used source of randomness and the amount of random data used in the experiment. References: [1] M. Wahl, M. Leifgen, M. Berlin, T. Röhlicke, H.-J. Rahn, O. Benson, An ultrafast quantum random number generator with provably bounded output bias based on photon arrival time measurements, Applied Physics Letters, Vol. 98, 171105 (2011). http://dx.doi.org/10.1063/1.3578456.
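
    A minimal Python sketch of the package's core task (generating Haar-distributed random states from a stream of random numbers) is given below for orientation; it is not part of TRQS, and numpy's pseudo-random generator stands in for the Quantis device or the on-line QRNG service, whose access protocol is not reproduced here.

        import numpy as np

        def random_pure_state(dim, rng=np.random.default_rng()):
            """Haar-distributed pure state: i.i.d. complex normal entries, normalised.
            With the TRQS sources, the normal deviates would be derived from the
            Quantis hardware or the on-line QRNG service instead of numpy."""
            z = rng.normal(size=dim) + 1j * rng.normal(size=dim)
            return z / np.linalg.norm(z)

        def random_density_matrix(dim, rank, rng=np.random.default_rng()):
            """Random density matrix from a Ginibre matrix G: rho = G G† / Tr(G G†)."""
            g = rng.normal(size=(dim, rank)) + 1j * rng.normal(size=(dim, rank))
            rho = g @ g.conj().T
            return rho / np.trace(rho)

        print(np.linalg.norm(random_pure_state(4)),          # ~1.0
              np.trace(random_density_matrix(4, 4)).real)    # ~1.0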

  6. The Diagnostic Potential of Fe Lines Applied to Protostellar Jets

    NASA Astrophysics Data System (ADS)

    Giannini, T.; Nisini, B.; Antoniucci, S.; Alcalá, J. M.; Bacciotti, F.; Bonito, R.; Podio, L.; Stelzer, B.; Whelan, E. T.

    2013-11-01

    We investigate the diagnostic capabilities of iron lines for tracing the physical conditions of shock-excited gas in jets driven by pre-main sequence stars. We have analyzed the 3000-25000 Å X-shooter spectra of two jets driven by the pre-main sequence stars ESO-Hα 574 and Par-Lup 3-4. Both spectra are very rich in [Fe II] lines over the whole spectral range; in addition, lines from [Fe III] are detected in the ESO-Hα 574 spectrum. Non-local thermal equilibrium codes solving the equations of statistical equilibrium, along with codes for the ionization equilibrium, are used to derive the gas excitation conditions of electron temperature and density and fractional ionization. An estimate of the iron gas-phase abundance is provided by comparing the iron line emissivity with that of neutral oxygen at 6300 Å. The [Fe II] line analysis indicates that the jet driven by ESO-Hα 574 is, on average, colder (T_e ~ 9000 K), less dense (n_e ~ 2 × 10^4 cm^-3), and more ionized (x_e ~ 0.7) than the Par-Lup 3-4 jet (T_e ~ 13,000 K, n_e ~ 6 × 10^4 cm^-3, x_e < 0.4), even if the existence of a higher density component (n_e ~ 2 × 10^5 cm^-3) is probed by the [Fe III] and [Fe II] ultraviolet lines. The physical conditions derived from the iron lines are compared with shock models, suggesting that the shock at work in ESO-Hα 574 is faster and likely more energetic than the Par-Lup 3-4 shock. This latter feature is confirmed by the high percentage of gas-phase iron measured in ESO-Hα 574 (50%-60% of its solar abundance, in comparison with less than 30% in Par-Lup 3-4), which testifies that the ESO-Hα 574 shock is powerful enough to partially destroy the dust present inside the jet. This work demonstrates that a multiline Fe analysis can be effectively used to probe the excitation and ionization conditions of the gas in a jet without any assumption on ionic abundances. The main limitation on the diagnostics resides in the large uncertainties of the atomic data, which, however, can be overcome through a statistical approach involving many lines. Based on observations collected with X-shooter at the Very Large Telescope on Cerro Paranal (Chile), operated by the European Southern Observatory (ESO). Program ID: 085.C-0238(A).

  7. A computer program for converting rectangular coordinates to latitude-longitude coordinates

    USGS Publications Warehouse

    Rutledge, A.T.

    1989-01-01

    A computer program was developed for converting the coordinates of any rectangular grid on a map to coordinates on a grid that is parallel to lines of equal latitude and longitude. Using this program in conjunction with groundwater flow models, the user can extract data and results from models with varying grid orientations and place these data into a grid structure that is oriented parallel to lines of equal latitude and longitude. All cells in the rectangular grid must have equal dimensions, and all cells in the latitude-longitude grid measure one minute by one minute. This program is applicable if the map used shows lines of equal latitude as arcs and lines of equal longitude as straight lines, and it assumes that the Earth's surface can be approximated as a sphere. The program user enters the row number, column number, and latitude and longitude of the midpoint of the cell for three test cells on the rectangular grid. The latitude and longitude of the boundaries of the rectangular grid also are entered. By solving sets of simultaneous linear equations, the program calculates coefficients that are used for making the conversion. As an option in the program, the user may build a groundwater model file based on a grid that is parallel to lines of equal latitude and longitude. The program reads a data file based on the rectangular coordinates and automatically forms the new data file. (USGS)
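
    The simultaneous-equation step described above can be sketched as follows, assuming a locally affine relation between (row, column) and (latitude, longitude); the three test cells and their coordinates are invented for illustration, and the sketch omits the arc-shaped latitude lines handled by the actual program.

        import numpy as np

        # Hypothetical test cells: (row, column, latitude, longitude) of cell midpoints.
        cells = [(10, 12, 38.250, -81.500),
                 (40, 12, 38.100, -81.480),
                 (25, 55, 38.180, -81.200)]

        rows = np.array([c[0] for c in cells], dtype=float)
        cols = np.array([c[1] for c in cells], dtype=float)
        lats = np.array([c[2] for c in cells])
        lons = np.array([c[3] for c in cells])

        # Solve  lat = a0 + a1*row + a2*col  and  lon = b0 + b1*row + b2*col
        A = np.column_stack([np.ones(3), rows, cols])
        lat_coef = np.linalg.solve(A, lats)
        lon_coef = np.linalg.solve(A, lons)

        def cell_to_latlon(row, col):
            basis = np.array([1.0, row, col])
            return float(basis @ lat_coef), float(basis @ lon_coef)

        print(cell_to_latlon(10, 12))   # reproduces the first test cell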

  8. Imfit: A Fast, Flexible Program for Astronomical Image Fitting

    NASA Astrophysics Data System (ADS)

    Erwin, Peter

    2014-08-01

    Imfit is an open-source astronomical image-fitting program specialized for galaxies but potentially useful for other sources, which is fast, flexible, and highly extensible. Its object-oriented design allows new types of image components (2D surface-brightness functions) to be easily written and added to the program. Image functions provided with Imfit include Sersic, exponential, and Gaussian galaxy decompositions along with Core-Sersic and broken-exponential profiles, elliptical rings, and three components that perform line-of-sight integration through 3D luminosity-density models of disks and rings seen at arbitrary inclinations. Available minimization algorithms include Levenberg-Marquardt, Nelder-Mead simplex, and Differential Evolution, allowing trade-offs between speed and decreased sensitivity to local minima in the fit landscape. Minimization can be done using the standard chi^2 statistic (using either data or model values to estimate per-pixel Gaussian errors, or else user-supplied error images) or the Cash statistic; the latter is particularly appropriate for cases of Poisson data in the low-count regime. The C++ source code for Imfit is available under the GNU Public License.

  9. Fuzzy Adaptive Control for Intelligent Autonomous Space Exploration Problems

    NASA Technical Reports Server (NTRS)

    Esogbue, Augustine O.

    1998-01-01

    The principal objective of the research reported here is the re-design, analysis, and optimization of our newly developed neural network fuzzy adaptive controller model for complex processes capable of learning fuzzy control rules using process data and improving its control through on-line adaptation. The learned improvement is according to a performance objective function that provides evaluative feedback; this performance objective is broadly defined to meet long-range goals over time. Although fuzzy control had proven effective for complex, nonlinear, imprecisely-defined processes for which standard models and controls are either inefficient, impractical, or cannot be derived, the state of the art prior to our work showed that procedures for deriving fuzzy control were mostly ad hoc heuristics. The learning ability of neural networks was exploited to systematically derive fuzzy control, permit on-line adaptation, and in the process optimize control. The operation of neural networks integrates very naturally with fuzzy logic. The neural networks, which were designed and tested using simulation software and simulated data followed by realistic industrial data, were reconfigured for application on several platforms as well as for the employment of improved algorithms. The statistical procedures of the learning process were investigated and evaluated with standard statistical procedures (such as ANOVA, graphical analysis of residuals, etc.). The computational advantage of dynamic programming-like methods of optimal control was used to permit on-line fuzzy adaptive control. Tests for the consistency, completeness, and interaction of the control rules were applied. Comparisons to other methods and controllers were made so as to identify the major advantages of the resulting controller model. Several specific modifications and extensions were made to the original controller. Additional modifications and explorations have been proposed for further study. Some of these are in progress in our laboratory while others await additional support. All of these enhancements will improve the attractiveness of the controller as an effective tool for the on-line control of an array of complex process environments.

  10. How many spectral lines are statistically significant?

    NASA Astrophysics Data System (ADS)

    Freund, J.

    When experimental line spectra are fitted with least squares techniques one frequently does not know whether n or n + 1 lines may be fitted safely. This paper shows how an F-test can be applied in order to determine the statistical significance of including an extra line into the fitting routine.
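
    A minimal sketch of the test, assuming nested least-squares fits whose residual sums of squares and parameter counts are known (the numbers below are illustrative only):

        from scipy import stats

        def extra_line_significant(rss_n, rss_n1, n_points, p_n, p_n1, alpha=0.05):
            """F-test for nested least-squares fits with p_n and p_n1 (> p_n) parameters:
            small p-value -> the extra line significantly improves the fit."""
            df1 = p_n1 - p_n
            df2 = n_points - p_n1
            F = ((rss_n - rss_n1) / df1) / (rss_n1 / df2)
            p_value = stats.f.sf(F, df1, df2)
            return F, p_value, p_value < alpha

        # e.g. 200 data points, each line contributing 3 parameters (position, width, amplitude)
        print(extra_line_significant(rss_n=4.10, rss_n1=3.55, n_points=200, p_n=9, p_n1=12))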

  11. One Rural Hospital's Experience Implementing the Society for Healthcare Epidemiology of America Guidelines to Decrease Central Line Infections.

    PubMed

    Curlej, Maria H; Katrancha, Elizabeth

    2016-01-01

    In an effort to take advantage of the Highmark Quality Blue Initiative, which requires information from hospitals detailing their central line-associated blood stream infection (CLABSI) surveillance system, quality improvement program, and statistics regarding CLABSI events, this institution investigated the latest evidence-based recommendations to reduce CLABSIs. Recognizing the baseline rate of 2.4 CLABSIs per 1,000 central line days and its effect on patient outcomes and medical costs, this hospital made a commitment to improve its CLABSI outcomes. As a result, the facility adopted the Society for Healthcare Epidemiology of America (SHEA) guidelines. The purpose of this article is to review the CLABSI rates and examine the prevention strategies following implementation of the SHEA guidelines. A quantitative, descriptive, retrospective program evaluation examined the hospital's pre- and post-SHEA implementation methods of decreasing CLABSIs and the subsequent CLABSI rates over 3 time periods. Any patient with a CLABSI admitted to this hospital from July 2007 to June 2010 was included (N = 78). CLABSI rates decreased from 1.9 to 1.3 over the study period. Compliance with specific SHEA guidelines was evaluated, and measures were put into place to increase compliance where necessary. CLABSI rates at this facility remain below the baseline of 2.4 for calendar years 2013 (0.79), 2014 (0.07), and 2015 (0.33).

  12. Vega roll and attitude control system algorithms trade-off study

    NASA Astrophysics Data System (ADS)

    Paulino, N.; Cuciniello, G.; Cruciani, I.; Corraro, F.; Spallotta, D.; Nebula, F.

    2013-12-01

    This paper describes the trade-off study for the selection of the most suitable algorithms for the Roll and Attitude Control System (RACS) within the FPS-A program, aimed at developing the new Flight Program Software of the VEGA launcher. Two algorithms were analyzed: Switching Lines (SL) and Quaternion Feedback Regulation. Using a development simulation tool that models two critical flight phases, the Long Coasting Phase (LCP) and the Payload Release (PLR) phase, both algorithms were assessed with Monte Carlo batch simulations for each phase. The statistical outcomes demonstrate a 100 percent success rate for Quaternion Feedback Regulation and support the choice of this method.
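
    The paper does not reproduce the control laws; as a rough orientation, a quaternion feedback regulator of the kind commonly used for rigid-body attitude control computes the commanded torque from the vector part of the attitude-error quaternion and the measured body rate, as in the sketch below (gains are illustrative, not VEGA values).

        import numpy as np

        def quat_mult(q, p):
            """Hamilton product; quaternions stored as [w, x, y, z]."""
            w1, x1, y1, z1 = q
            w2, x2, y2, z2 = p
            return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                             w1*x2 + x1*w2 + y1*z2 - z1*y2,
                             w1*y2 - x1*z2 + y1*w2 + z1*x2,
                             w1*z2 + x1*y2 - y1*x2 + z1*w2])

        def quat_conj(q):
            return q * np.array([1.0, -1.0, -1.0, -1.0])

        def quaternion_feedback_torque(q_body, q_target, omega, Kp=5.0, Kd=12.0):
            """PD-style quaternion feedback: drive the attitude error to identity."""
            q_err = quat_mult(quat_conj(q_target), q_body)   # error of body w.r.t. target
            if q_err[0] < 0.0:                               # resolve the sign ambiguity
                q_err = -q_err
            return -Kp * q_err[1:] - Kd * np.asarray(omega)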

  13. Statistical Methods in Assembly Quality Management of Multi-Element Products on Automatic Rotor Lines

    NASA Astrophysics Data System (ADS)

    Pries, V. V.; Proskuriakov, N. E.

    2018-04-01

    To control the assembly quality of multi-element mass-produced products on automatic rotor lines, control methods with operational feedback are required. However, due to possible failures in the operation of the devices and systems of an automatic rotor line, there is always a real probability that defective (incomplete) products enter the output process stream. Therefore, continuous sampling control of product completeness, based on the use of statistical methods, remains an important element in managing the assembly quality of multi-element mass products on automatic rotor lines. A feature of continuous sampling control of multi-element product completeness during assembly is that the control is destructive, excluding the possibility of returning component parts to the process stream after sampling control, which reduces the actual productivity of the assembly equipment. Therefore, the use of statistical procedures for continuous sampling control of multi-element product completeness during assembly on automatic rotor lines requires sampling plans that ensure a minimum control sample size. Comparison of the limit values of the average output defect level for the continuous sampling plan (CSP) and for the automated continuous sampling plan (ACSP) shows that the ACSP-1 provides lower limit values for the average output defect level. The average sample size when using the ACSP-1 plan is also smaller than when using the CSP-1 plan. Thus, applying statistical methods to assembly quality management of multi-element products on automatic rotor lines, using the proposed plans and methods for continuous sampling control, makes it possible to automate sampling control procedures and to ensure the required level of quality of the assembled products while minimizing the sample size.
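
    For orientation, Dodge's classical CSP-1 plan referenced above alternates between 100% inspection and fractional sampling; the following simulation sketch (clearance number i and sampling fraction f are illustrative, and the ACSP-1 modification is not modelled) estimates the fraction of units inspected for a given defect rate.

        import random

        def csp1_inspected_fraction(defect_prob, i=50, f=0.1, n_units=100_000, seed=1):
            """Simulate Dodge's CSP-1: 100% inspection until i consecutive good units,
            then inspect a random fraction f of units; any defect found restarts
            100% inspection. Returns the fraction of units actually inspected."""
            rng = random.Random(seed)
            consecutive_good, screening, inspected = 0, True, 0
            for _ in range(n_units):
                defective = rng.random() < defect_prob
                if screening:
                    inspected += 1
                    consecutive_good = 0 if defective else consecutive_good + 1
                    if consecutive_good >= i:
                        screening = False
                elif rng.random() < f:
                    inspected += 1
                    if defective:
                        screening, consecutive_good = True, 0
            return inspected / n_units

        print(csp1_inspected_fraction(0.01))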

  14. Modulation Doped GaAs/Al sub xGA sub (1-x)As Layered Structures with Applications to Field Effect Transistors.

    DTIC Science & Technology

    1982-02-15

    function of the doping density at 300 and 77 K for the classical Boltzmann statistics or depletion approximation (solid line) and for the approximate ... Fermi-Dirac statistics (equation (19), dotted line). This comparison demonstrates that the deviation from Boltzmann statistics is quite noticeable ... tunneling Schottky barriers cannot be obtained at these doping levels. The dotted lines are obtained when Boltzmann statistics are used in the Al Ga

  15. A six-week neuromuscular training program for competitive junior tennis players.

    PubMed

    Barber-Westin, Sue D; Hermeto, Alex A; Noyes, Frank R

    2010-09-01

    This study evaluated the effectiveness of a tennis-specific training program on improving neuromuscular indices in competitive junior players. Tennis is a demanding sport because it requires speed, agility, explosive power, and aerobic conditioning along with the ability to react and anticipate quickly, and there are limited studies that evaluate these indices in young players after a multiweek training program. The program designed for this study implemented the essential components of a previously published neuromuscular training program and also included exercises designed to improve dynamic balance, agility, speed, and strength. Fifteen junior tennis players (10 girls, 5 boys; mean age, 13.0 +/- 1.5 years) who routinely participated in local tournaments and high-school teams participated in the 6-week supervised program. Training was conducted 3 times a week, with sessions lasting 1.5 hours that included a dynamic warm-up, plyometric and jump training, strength training (lower extremity, upper extremity, core), tennis-specific drills, and flexibility. After training, statistically significant improvements and large-to-moderate effect sizes were found in the single-leg triple crossover hop for both legs (p < 0.05), the baseline forehand (p = 0.006) and backhand (p = 0.0008) tests, the service line (p = 0.0009) test, the 1-court suicide (p < 0.0001), the 2-court suicide (p = 0.02), and the abdominal endurance test (p = 0.01). Mean improvements between pretrain and posttrain test sessions were 15% for the single-leg triple crossover hop, 10-11% for the baseline tests, 18% for the service line test, 21% for the 1-court suicide, 10% for the 2-court suicide, and 76% for the abdominal endurance test. No athlete sustained an injury or developed an overuse syndrome as a result of the training program. The results demonstrate that this program is feasible, low in cost, and appears to be effective in improving the majority of neuromuscular indices tested. We accomplished our goal of developing training and testing procedures that could all be performed on the tennis court.

  16. Statistical evaluation of stability data: criteria for change-over-time and data variability.

    PubMed

    Bar, Raphael

    2003-01-01

    In a recently issued ICH Q1E guidance on evaluation of stability data of drug substances and products, the need to perform a statistical extrapolation of a shelf-life of a drug product or a retest period for a drug substance is based heavily on whether data exhibit a change-over-time and/or variability. However, this document suggests neither measures nor acceptance criteria for these two parameters. This paper demonstrates a useful application of simple statistical parameters for determining whether sets of stability data from either accelerated or long-term storage programs exhibit a change-over-time and/or variability. These parameters are all derived from a simple linear regression analysis first performed on the stability data. The p-value of the slope of the regression line is taken as a measure of change-over-time, and a value of 0.25 is suggested as a limit for insignificant change of the quantitative stability attributes monitored. The minimal process capability index, Cpk, calculated from the standard deviation of the regression line, is suggested as a measure of variability, with a value of 2.5 as a limit for insignificant variability. The usefulness of the above two parameters, p-value and Cpk, was demonstrated on stability data of a refrigerated drug product and on pooled data of three batches of a drug substance. In both cases, the determined parameters allowed characterization of the data in terms of change-over-time and variability. Consequently, complete evaluation of the stability data could be pursued according to the ICH guidance. It is believed that the application of the above two parameters with their acceptance criteria will allow a more unified evaluation of stability data.
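
    A minimal sketch of the two parameters on a hypothetical data set: the slope p-value comes from a simple linear regression of assay versus time, and Cpk is formed from the residual standard deviation about the regression line together with assumed specification limits (the data and limits below are invented; the acceptance criteria discussed in the paper are p > 0.25 and Cpk > 2.5).

        import numpy as np
        from scipy import stats

        # Hypothetical assay results (% label claim) at 0-24 months for one batch.
        months = np.array([0, 3, 6, 9, 12, 18, 24], dtype=float)
        assay = np.array([100.1, 99.8, 99.9, 99.5, 99.7, 99.4, 99.2])

        fit = stats.linregress(months, assay)
        residuals = assay - (fit.intercept + fit.slope * months)
        residual_sd = np.sqrt(np.sum(residuals**2) / (len(months) - 2))

        # Illustrative specification limits for the attribute being monitored.
        lsl, usl = 95.0, 105.0
        mean = assay.mean()
        cpk = min(usl - mean, mean - lsl) / (3 * residual_sd)

        print(f"slope p-value = {fit.pvalue:.3f}, Cpk = {cpk:.1f}")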

  17. IMFIT: A FAST, FLEXIBLE NEW PROGRAM FOR ASTRONOMICAL IMAGE FITTING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erwin, Peter; Universitäts-Sternwarte München, Scheinerstrasse 1, D-81679 München

    2015-02-01

    I describe a new, open-source astronomical image-fitting program called IMFIT, specialized for galaxies but potentially useful for other sources, which is fast, flexible, and highly extensible. A key characteristic of the program is an object-oriented design that allows new types of image components (two-dimensional surface-brightness functions) to be easily written and added to the program. Image functions provided with IMFIT include the usual suspects for galaxy decompositions (Sérsic, exponential, Gaussian), along with Core-Sérsic and broken-exponential profiles, elliptical rings, and three components that perform line-of-sight integration through three-dimensional luminosity-density models of disks and rings seen at arbitrary inclinations. Available minimization algorithms include Levenberg-Marquardt, Nelder-Mead simplex, and Differential Evolution, allowing trade-offs between speed and decreased sensitivity to local minima in the fit landscape. Minimization can be done using the standard χ² statistic (using either data or model values to estimate per-pixel Gaussian errors, or else user-supplied error images) or Poisson-based maximum-likelihood statistics; the latter approach is particularly appropriate for cases of Poisson data in the low-count regime. I show that fitting low-signal-to-noise ratio galaxy images using χ² minimization and individual-pixel Gaussian uncertainties can lead to significant biases in fitted parameter values, which are avoided if a Poisson-based statistic is used; this is true even when Gaussian read noise is present.
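
    The two fit statistics compared in the paper can be written compactly; the sketch below is a generic implementation (not IMFIT code) showing the Gaussian χ² with per-pixel uncertainties and the Poisson-based C statistic in its commonly used form, evaluated on simulated low-count data.

        import numpy as np

        def chi2_stat(data, model, sigma):
            """Gaussian chi-square with per-pixel uncertainties sigma."""
            return np.sum(((data - model) / sigma) ** 2)

        def cash_stat(data, model):
            """Poisson maximum-likelihood ('C') statistic, in the common form
            C = 2 * sum(model - data + data*ln(data/model)), with the data term
            dropped where data == 0."""
            term = np.where(data > 0, data * np.log(data / model), 0.0)
            return 2.0 * np.sum(model - data + term)

        rng = np.random.default_rng(0)
        model = np.full(1000, 4.0)               # low-count regime: 4 counts/pixel expected
        data = rng.poisson(model).astype(float)
        sigma = np.sqrt(np.clip(data, 1, None))  # naive per-pixel Gaussian errors
        print(chi2_stat(data, model, sigma), cash_stat(data, model))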

  18. Engagement matters: lessons from assessing classroom implementation of steps to respect: a bullying prevention program over a one-year period.

    PubMed

    Low, Sabina; Van Ryzin, Mark J; Brown, Eric C; Smith, Brian H; Haggerty, Kevin P

    2014-04-01

    Steps to Respect: A Bullying Prevention Program (STR) relies on a social-ecological model of prevention to increase school staff awareness and responsiveness, foster socially responsible beliefs among students, and teach social-emotional skills to students to reduce bullying behavior. As part of a school-randomized controlled trial of STR, we examined predictors and outcomes associated with classroom curriculum implementation in intervention schools. Data on classroom implementation (adherence and engagement) were collected from a sample of teachers using a weekly on-line Teacher Implementation Checklist system. Pre-post data related to school bullying-related outcomes were collected from 1,424 students and archival school demographic data were obtained from the National Center for Education Statistics. Results of multilevel analyses indicated that higher levels of program engagement were influenced by school-level percentage of students receiving free/reduced lunch, as well as classroom-level climate indicators. Results also suggest that higher levels of program engagement were related to lower levels of school bullying problems, enhanced school climate and attitudes less supportive of bullying. Predictors and outcomes related to program fidelity (i.e., adherence) were largely nonsignificant. Results suggest that student engagement is a key element of program impact, though implementation is influenced by both school-level demographics and classroom contexts.

  19. A Longitudinal Analysis of the Influence of a Peer Run Warm Line Phone Service on Psychiatric Recovery.

    PubMed

    Dalgin, Rebecca Spirito; Dalgin, M Halim; Metzger, Scott J

    2018-05-01

    This article focuses on the impact of a peer run warm line as part of the psychiatric recovery process. It utilized data including the Recovery Assessment Scale, community integration measures and crisis service usage. Longitudinal statistical analysis was completed on 48 sets of data from 2011, 2012, and 2013. Although no statistically significant differences were observed for the RAS score, community integration data showed increases in visits to primary care doctors, leisure/recreation activities and socialization with others. This study highlights the complexity of psychiatric recovery and that nonclinical peer services like peer run warm lines may be critical to the process.

  20. [Health for All-Italia: an indicator system on health].

    PubMed

    Burgio, Alessandra; Crialesi, Roberta; Loghi, Marzia

    2003-01-01

    The Health for All - Italia information system collects health data from several sources. It is intended to be a cornerstone for the achievement of an overview of health in Italy. Health is analyzed at different levels, ranging from health services and health needs to lifestyles and the demographic, social, economic and environmental contexts. The software associated with the database allows users to turn statistical data into graphs and tables and to carry out simple statistical analyses. It is therefore possible to view the indicators' time series, make simple projections and compare the various indicators over the years for each territorial unit. This is possible by means of tables, graphs (histograms, line graphs, frequencies, linear regression with calculation of correlation coefficients, etc.) and maps. These charts can be exported to other programs (e.g., Word, Excel, PowerPoint), or they can be printed directly in color or black and white.

  1. CAPSAS: Computer Assisted Program for the Selection of Appropriate Statistics.

    ERIC Educational Resources Information Center

    Shermis, Mark D.; Albert, Susan L.

    A computer-assisted program has been developed for the selection of statistics or statistical techniques by both students and researchers. Based on Andrews, Klem, Davidson, O'Malley and Rodgers "A Guide for Selecting Statistical Techniques for Analyzing Social Science Data," this FORTRAN-compiled interactive computer program was…

  2. On representing the prognostic value of continuous gene expression biomarkers with the restricted mean survival curve.

    PubMed

    Eng, Kevin H; Schiller, Emily; Morrell, Kayla

    2015-11-03

    Researchers developing biomarkers for cancer prognosis from quantitative gene expression data are often faced with an odd methodological discrepancy: while Cox's proportional hazards model, the appropriate and popular technique, produces a continuous and relative risk score, it is hard to cast the estimate in clear clinical terms like median months of survival and percent of patients affected. To produce a familiar Kaplan-Meier plot, researchers commonly make the decision to dichotomize a continuous (often unimodal and symmetric) score. It is well known in the statistical literature that this procedure induces significant bias. We illustrate the liabilities of common techniques for categorizing a risk score and discuss alternative approaches. We promote the use of the restricted mean survival (RMS) and the corresponding RMS curve that may be thought of as an analog to the best fit line from simple linear regression. Continuous biomarker workflows should be modified to include the more rigorous statistical techniques and descriptive plots described in this article. All statistics discussed can be computed via standard functions in the Survival package of the R statistical programming language. Example R language code for the RMS curve is presented in the appendix.
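
    The authors point to R's survival package; an equivalent sketch in Python, using the lifelines package (an assumption here, not part of the paper), estimates the restricted mean survival as the area under the Kaplan-Meier curve up to a chosen horizon tau.

        import numpy as np
        from lifelines import KaplanMeierFitter

        def restricted_mean_survival(durations, events, tau):
            """Area under the Kaplan-Meier survival curve up to time tau."""
            kmf = KaplanMeierFitter().fit(durations, event_observed=events)
            sf = kmf.survival_function_.reset_index()
            times = sf.iloc[:, 0].to_numpy()
            surv = sf.iloc[:, 1].to_numpy()
            grid = np.append(times[times < tau], tau)   # step-function integration
            steps = np.diff(grid)
            return float(np.sum(surv[: len(steps)] * steps))

        durations = [5, 8, 12, 12, 16, 20, 24, 30, 30, 34]   # months (invented)
        events    = [1, 1, 1, 0, 1, 0, 1, 1, 0, 0]           # 1 = event observed
        print(restricted_mean_survival(durations, events, tau=24.0))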

  3. Genetic diversity and population structure of the Guinea pig (Cavia porcellus, Rodentia, Caviidae) in Colombia.

    PubMed

    Burgos-Paz, William; Cerón-Muñoz, Mario; Solarte-Portilla, Carlos

    2011-10-01

    The aim was to establish the genetic diversity and population structure of three guinea pig lines, from seven production zones located in Nariño, southwest Colombia. A total of 384 individuals were genotyped with six microsatellite markers. The measurement of intrapopulation diversity revealed allelic richness ranging from 3.0 to 6.56, and observed heterozygosity (Ho) from 0.33 to 0.60, with a deficit in heterozygous individuals. Although statistically significant (p < 0.05), genetic differentiation between population pairs was found to be low. Genetic distance, as well as clustering of guinea-pig lines and populations, coincided with the historical and geographical distribution of the populations. Likewise, high genetic identity between improved and native lines was established. An analysis of group probabilistic assignment revealed that each line should not be considered as a genetically homogeneous group. The findings corroborate the absorption of native genetic material into the improved line introduced into Colombia from Peru. It is necessary to establish conservation programs for native-line individuals in Nariño, and control genealogical and production records in order to reduce the inbreeding values in the populations.

  4. Genetic diversity and population structure of the Guinea pig (Cavia porcellus, Rodentia, Caviidae) in Colombia

    PubMed Central

    Burgos-Paz, William; Cerón-Muñoz, Mario; Solarte-Portilla, Carlos

    2011-01-01

    The aim was to establish the genetic diversity and population structure of three guinea pig lines, from seven production zones located in Nariño, southwest Colombia. A total of 384 individuals were genotyped with six microsatellite markers. The measurement of intrapopulation diversity revealed allelic richness ranging from 3.0 to 6.56, and observed heterozygosity (Ho) from 0.33 to 0.60, with a deficit in heterozygous individuals. Although statistically significant (p < 0.05), genetic differentiation between population pairs was found to be low. Genetic distance, as well as clustering of guinea-pig lines and populations, coincided with the historical and geographical distribution of the populations. Likewise, high genetic identity between improved and native lines was established. An analysis of group probabilistic assignment revealed that each line should not be considered as a genetically homogeneous group. The findings corroborate the absorption of native genetic material into the improved line introduced into Colombia from Peru. It is necessary to establish conservation programs for native-line individuals in Nariño, and control genealogical and production records in order to reduce the inbreeding values in the populations. PMID:22215979

  5. 75 FR 24718 - Guidance for Industry on Documenting Statistical Analysis Programs and Data Files; Availability

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-05

    ...] Guidance for Industry on Documenting Statistical Analysis Programs and Data Files; Availability AGENCY... Programs and Data Files.'' This guidance is provided to inform study statisticians of recommendations for documenting statistical analyses and data files submitted to the Center for Veterinary Medicine (CVM) for the...

  6. Computer programs for computing particle-size statistics of fluvial sediments

    USGS Publications Warehouse

    Stevens, H.H.; Hubbell, D.W.

    1986-01-01

    Two versions of computer programs for inputting data and computing particle-size statistics of fluvial sediments are presented. The FORTRAN 77 language versions are for use on the Prime computer, and the BASIC language versions are for use on microcomputers. The size-statistics programs compute Inman, Trask, and Folk statistical parameters from phi values and sizes determined for 10 specified percent-finer values from the input size and percent-finer data. The programs also determine the percentages of gravel, sand, silt, and clay, and the Meyer-Peter effective diameter. Documentation and listings for both versions of the programs are included. (Author's abstract)
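
    For orientation, the Folk (and Ward) graphic statistics mentioned above are simple combinations of percentile phi values read from the cumulative grain-size curve; the sketch below uses invented percentiles and shows only the graphic mean, inclusive standard deviation, and skewness (the programs also compute Inman and Trask parameters).

        # Percentile phi values (phi = -log2 of grain size in mm), e.g. read from a
        # cumulative grain-size curve; the values below are made up for illustration.
        phi = {5: -0.8, 16: 0.1, 25: 0.4, 50: 1.2, 75: 1.9, 84: 2.3, 95: 3.1}

        graphic_mean = (phi[16] + phi[50] + phi[84]) / 3
        inclusive_sd = (phi[84] - phi[16]) / 4 + (phi[95] - phi[5]) / 6.6
        skewness = ((phi[16] + phi[84] - 2 * phi[50]) / (2 * (phi[84] - phi[16]))
                    + (phi[5] + phi[95] - 2 * phi[50]) / (2 * (phi[95] - phi[5])))

        print(graphic_mean, inclusive_sd, skewness)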

  7. Preparing and Presenting Effective Research Posters

    PubMed Central

    Miller, Jane E

    2007-01-01

    Objectives Posters are a common way to present results of a statistical analysis, program evaluation, or other project at professional conferences. Often, researchers fail to recognize the unique nature of the format, which is a hybrid of a published paper and an oral presentation. This methods note demonstrates how to design research posters to convey study objectives, methods, findings, and implications effectively to varied professional audiences. Methods A review of existing literature on research communication and poster design is used to identify and demonstrate important considerations for poster content and layout. Guidelines on how to write about statistical methods, results, and statistical significance are illustrated with samples of ineffective writing annotated to point out weaknesses, accompanied by concrete examples and explanations of improved presentation. A comparison of the content and format of papers, speeches, and posters is also provided. Findings Each component of a research poster about a quantitative analysis should be adapted to the audience and format, with complex statistical results translated into simplified charts, tables, and bulleted text to convey findings as part of a clear, focused story line. Conclusions Effective research posters should be designed around two or three key findings with accompanying handouts and narrative description to supply additional technical detail and encourage dialog with poster viewers. PMID:17355594

  8. Expanding Capacity With an Accelerated On-Line BSN Program.

    PubMed

    Lindley, Marie Kelly; Ashwill, Regina; Cipher, Daisha J; Mancini, Mary E

    Colleges of nursing are challenged to identify innovative, efficient, and effective mechanisms to expand enrollment in prelicensure programs. The objective of this project was to identify whether a prelicensure nursing program that is both accelerated and on-line is as effective as a traditional face-to-face program in terms of graduation rates and National Council Licensure Exam pass rates. This analysis of 1,064 students compared demographic and outcomes data between students in a state university's college of nursing who were enrolled in an accelerated, fully on-line Bachelor of Science in Nursing (BSN) program and the traditional on-campus BSN program. Students significantly differed in their ethnicity, level of prior education, and graduation rates (95% vs. 89.3%). First-time National Council Licensure Exam pass rates for both groups did not significantly differ (92.5% vs. 94.5%). Results indicate that an accelerated on-line BSN program can overcome factors known to limit capacity expansion in schools of nursing and produce high-quality student outcomes. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Opening Pandora's Box: Texas Elementary Campus Administrators use of Educational Policy And Highly Qualified Classroom Teachers Professional Development through Data-informed Decisions for Science Education

    NASA Astrophysics Data System (ADS)

    Brown, Linda Lou

    Federal educational policy, the No Child Left Behind Act of 2001, focused attention on America's education with conspicuous results. One aspect, the highly qualified (HQ) classroom teacher and principal requirement, was taxing since states established individual accountability structures. The impact of HQ policy and the use of data-informed decision-making (DIDM) by campus administrators, Campus Instruction Leaders (CILs), in monitoring Texas elementary science education bear crucial relationships to 5th grade students' learning and achievement. Forty years of research has shown improved student results when sustained, supported, and focused professional development (PD) for teachers is available. Using mixed methods research, this study applied quantitative and qualitative analysis to two electronic, on-line surveys: the Texas Elementary, Intermediate or Middle School Teacher Survey(c) and the Texas Elementary Campus Administrator Survey(c), with results from 22.3% of Texas school districts, representing 487 elementary campuses surveyed. Participants were selected by random, stratified sampling of 5th grade teachers who attended local Texas Regional Collaboratives science professional development (PD) programs between 2003-2008. Survey information was compared statistically to campus-level average passing rate scores on the 5th grade science TAKS using Statistical Process Software (SPSS). Written comments from both surveys were analyzed with Qualitative Survey Research (NVivo) software. Due to the level of uncertainty of variables within a large statewide study, Mauchly's test of sphericity was used to validate repeated-measures factor ANOVAs. Although few individual results were statistically significant, when jointly analyzed, striking constructs were revealed regarding the impact of HQ policy applications and elementary CILs' use of data-informed decisions on improving 5th grade students' achievement and teachers' PD learning of science content. Some constructs included the use of data-warehouse programs, teachers' applications of DIDM to modify lessons for differentiated science instruction, the number of years teachers attended science PD, and teachers' influence on CILs' staffing decisions. Yet CILs reported that 14% of Texas elementary campuses had limited or no science education programs due to federal policy requirements for reading and mathematics. Three hypothesis components were supported and accepted from the research data, resulting in two models addressing elementary science, science education PD, and CILs' impact for federal policy applications.

  10. Line-by-line spectroscopic simulations on graphics processing units

    NASA Astrophysics Data System (ADS)

    Collange, Sylvain; Daumas, Marc; Defour, David

    2008-01-01

    We report here on software that performs line-by-line spectroscopic simulations on gases. Elaborate models (such as narrow band and correlated-K) are accurate and efficient for bands where various components are not simultaneously and significantly active. Line-by-line is probably the most accurate model in the infrared for blends of gases that contain high proportions of H2O and CO2, as was the case for our prototype simulation. Our implementation on graphics processing units sustains a speedup close to 330 on computation-intensive tasks and 12 on memory-intensive tasks compared to implementations on one core of high-end processors. This speedup is due to data parallelism, efficient memory access for specific patterns and some dedicated hardware operators only available in graphics processing units. It is obtained while leaving most processor resources available, and it would scale linearly with the number of graphics processing units in parallel machines. Line-by-line simulation coupled with simulation of fluid dynamics was long believed to be economically intractable, but our work shows that it could be done with some affordable additional resources compared to what is necessary to perform simulations of fluid dynamics alone. Program summary Program title: GPU4RE Catalogue identifier: ADZY_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADZY_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 62 776 No. of bytes in distributed program, including test data, etc.: 1 513 247 Distribution format: tar.gz Programming language: C++ Computer: x86 PC Operating system: Linux, Microsoft Windows. Compilation requires either gcc/g++ under Linux or Visual C++ 2003/2005 and Cygwin under Windows. It has been tested using gcc 4.1.2 under Ubuntu Linux 7.04 and using Visual C++ 2005 with Cygwin 1.5.24 under Windows XP. RAM: 1 gigabyte Classification: 21.2 External routines: OpenGL (http://www.opengl.org) Nature of problem: Simulating radiative transfer on high-temperature high-pressure gases. Solution method: Line-by-line Monte-Carlo ray-tracing. Unusual features: Parallel computations are moved to the GPU. Additional comments: An nVidia GeForce 7000 or ATI Radeon X1000 series graphics processing unit is required. Running time: A few minutes.
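
    The data parallelism exploited on the GPU (every wavenumber point accumulates contributions from every spectral line independently) can be illustrated with a short vectorized sketch; the Lorentzian profile and the made-up line list below are illustrative only, and the actual code also handles other broadening regimes and real line databases.

        import numpy as np

        # Invented line list: positions (cm^-1), strengths, and half-widths (cm^-1).
        nu0 = np.array([2000.1, 2000.7, 2001.3, 2002.0])
        S   = np.array([1.0e-20, 3.0e-21, 5.0e-21, 2.0e-20])
        gam = np.array([0.07, 0.05, 0.06, 0.08])

        nu = np.linspace(1999.0, 2003.0, 4001)             # wavenumber grid

        # Broadcast a (lines x grid) matrix of Lorentzian profiles and sum over lines;
        # each (line, wavenumber) pair is independent, which is what maps onto the GPU.
        profiles = (gam[:, None] / np.pi) / ((nu[None, :] - nu0[:, None])**2 + gam[:, None]**2)
        absorption = (S[:, None] * profiles).sum(axis=0)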

  11. Bayesian Atmospheric Radiative Transfer (BART): Model, Statistics Driver, and Application to HD 209458b

    NASA Astrophysics Data System (ADS)

    Cubillos, Patricio; Harrington, Joseph; Blecic, Jasmina; Stemm, Madison M.; Lust, Nate B.; Foster, Andrew S.; Rojo, Patricio M.; Loredo, Thomas J.

    2014-11-01

    Multi-wavelength secondary-eclipse and transit depths probe the thermo-chemical properties of exoplanets. In recent years, several research groups have developed retrieval codes to analyze the existing data and study the prospects of future facilities. However, the scientific community has limited access to these packages. Here we premiere the open-source Bayesian Atmospheric Radiative Transfer (BART) code. We discuss the key aspects of the radiative-transfer algorithm and the statistical package. The radiation code includes line databases for all HITRAN molecules, high-temperature H2O, TiO, and VO, and includes a preprocessor for adding additional line databases without recompiling the radiation code. Collision-induced absorption lines are available for H2-H2 and H2-He. The parameterized thermal and molecular abundance profiles can be modified arbitrarily without recompilation. The generated spectra are integrated over arbitrary bandpasses for comparison to data. BART's statistical package, Multi-core Markov-chain Monte Carlo (MC3), is a general-purpose MCMC module. MC3 implements the Differential-evolution Markov-chain Monte Carlo algorithm (ter Braak 2006, 2009). MC3 converges 20-400 times faster than the usual Metropolis-Hastings MCMC algorithm, and in addition uses the Message Passing Interface (MPI) to parallelize the MCMC chains. We apply the BART retrieval code to the HD 209458b data set to estimate the planet's temperature profile and molecular abundances. This work was supported by NASA Planetary Atmospheres grant NNX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G. JB holds a NASA Earth and Space Science Fellowship.
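
    A sketch of the differential-evolution proposal used in this family of samplers (ter Braak 2006): a chain's new position is proposed by adding a scaled difference of two other chains plus a small jitter, after which a standard Metropolis acceptance step is applied; the scaling default below follows the commonly quoted 2.38/sqrt(2d) rule, and the sketch is illustrative rather than MC3 source code.

        import numpy as np

        def demc_proposal(chains, i, gamma=None, eps=1e-6, rng=np.random.default_rng()):
            """Differential-evolution proposal for chain i:
            x' = x_i + gamma * (x_r1 - x_r2) + e, with r1, r2 two other chains
            chosen at random; accept or reject with the usual Metropolis rule."""
            n_chains, n_params = chains.shape
            if gamma is None:
                gamma = 2.38 / np.sqrt(2.0 * n_params)       # common default scaling
            others = [j for j in range(n_chains) if j != i]
            r1, r2 = rng.choice(others, size=2, replace=False)
            return chains[i] + gamma * (chains[r1] - chains[r2]) \
                   + rng.normal(scale=eps, size=n_params)

        # e.g. ten chains exploring a three-parameter space
        chains = np.random.default_rng(0).normal(size=(10, 3))
        print(demc_proposal(chains, i=0))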

  12. GASPS—A Herschel Survey of Gas and Dust in Protoplanetary Disks: Summary and Initial Statistics

    NASA Astrophysics Data System (ADS)

    Dent, W. R. F.; Thi, W. F.; Kamp, I.; Williams, J. P.; Menard, F.; Andrews, S.; Ardila, D.; Aresu, G.; Augereau, J.-C.; Barrado y Navascues, D.; Brittain, S.; Carmona, A.; Ciardi, D.; Danchi, W.; Donaldson, J.; Duchene, G.; Eiroa, C.; Fedele, D.; Grady, C.; de Gregorio-Molsalvo, I.; Howard, C.; Huélamo, N.; Krivov, A.; Lebreton, J.; Liseau, R.; Martin-Zaidi, C.; Mathews, G.; Meeus, G.; Mendigutía, I.; Montesinos, B.; Morales-Calderon, M.; Mora, A.; Nomura, H.; Pantin, E.; Pascucci, I.; Phillips, N.; Pinte, C.; Podio, L.; Ramsay, S. K.; Riaz, B.; Riviere-Marichalar, P.; Roberge, A.; Sandell, G.; Solano, E.; Tilling, I.; Torrelles, J. M.; Vandenbusche, B.; Vicente, S.; White, G. J.; Woitke, P.

    2013-05-01

    We describe a large-scale far-infrared line and continuum survey of protoplanetary disk through to young debris disk systems carried out using the PACS instrument on the Herschel Space Observatory. This Open Time Key program, known as GASPS (Gas Survey of Protoplanetary Systems), targeted ~250 young stars in narrow wavelength regions covering the [OI] fine structure line at 63 μm, the brightest far-infrared line in such objects. A subset of the brightest targets were also surveyed in [OI] at 145 μm and [CII] at 157 μm, as well as several transitions of H2O and high-excitation CO lines at selected wavelengths between 78 and 180 μm. Additionally, GASPS included continuum photometry at 70, 100 and 160 μm, around the peak of the dust emission. The targets were SED Class II-III T Tauri stars and debris disks from seven nearby young associations, along with a comparable sample of isolated Herbig AeBe stars. The aim was to study the global gas and dust content in a wide sample of circumstellar disks, combining the results with models in a systematic way. In this overview paper we review the scientific aims, target selection and observing strategy of the program. We summarise some of the initial results, showing line identifications, listing the detections, and giving a first statistical study of line detectability. The [OI] line at 63 μm was the brightest line seen in almost all objects, by a factor of ~10. Overall [OI] 63 μm detection rates were 49%, with 100% of HAeBe stars and 43% of T Tauri stars detected. A comparison with published disk dust masses (derived mainly from sub-mm continuum, assuming standard values of the mm mass opacity) shows a dust mass threshold for [OI] 63 μm detection of ~10^-5 M_solar. Normalising to a distance of 140 pc, 84% of objects with dust masses >= 10^-5 M_solar can be detected in this line in the present survey; 32% of those of mass 10^-6-10^-5 M_solar, and only a very small number of unusual objects with lower masses, can be detected. This is consistent with models with a moderate UV excess and disk flaring. For a given disk mass, [OI] detectability is lower for M stars compared with earlier spectral types. Both the continuum and line emission were, in most systems, spatially and spectrally unresolved and centred on the star, suggesting that the emission in most cases was from the disk. Approximately 10 objects showed resolved emission, most likely from outflows. In the GASPS sample, [OI] detection rates in T Tauri associations in the 0.3-4 Myr age range were ~50%. For each association in the 5-20 Myr age range, ~2 stars remain detectable in [OI] 63 μm, and no systems were detected in associations with age >20 Myr. Comparing with the total number of young stars in each association, and assuming an ISM-like gas/dust ratio, this indicates that ~18% of stars retain a gas-rich disk of total mass ~1 M_Jupiter for 1-4 Myr, 1-7% keep such disks for 5-10 Myr, but none are detected beyond 10-20 Myr. The brightest [OI] objects from GASPS were also observed in [OI] 145 μm, [CII] 157 μm and CO J = 18-17, with detection rates of 20-40%. Detection of the [CII] line was not correlated with disk mass, suggesting it arises more commonly from a compact remnant envelope.

  13. Modular reweighting software for statistical mechanical analysis of biased equilibrium data

    NASA Astrophysics Data System (ADS)

    Sindhikara, Daniel J.

    2012-07-01

    Here a simple, useful, modular approach and software suite designed for statistical reweighting and analysis of equilibrium ensembles is presented. Statistical reweighting is useful and sometimes necessary for analysis of equilibrium enhanced sampling methods, such as umbrella sampling or replica exchange, and also in experimental cases where biasing factors are explicitly known. Essentially, statistical reweighting allows extrapolation of data from one or more equilibrium ensembles to another. Here, the fundamental separable steps of statistical reweighting are broken up into modules - allowing for application to the general case and avoiding the black-box nature of some "all-inclusive" reweighting programs. Additionally, the programs included are, by design, written with few dependencies. The compilers required are either pre-installed on most systems or freely available for download with minimal trouble. Examples of the use of this suite applied to umbrella sampling and replica exchange molecular dynamics simulations will be shown along with advice on how to apply it in the general case. New version program summary Program title: Modular reweighting version 2 Catalogue identifier: AEJH_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJH_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License, version 3 No. of lines in distributed program, including test data, etc.: 179 118 No. of bytes in distributed program, including test data, etc.: 8 518 178 Distribution format: tar.gz Programming language: C++, Python 2.6+, Perl 5+ Computer: Any Operating system: Any RAM: 50-500 MB Supplementary material: An updated version of the original manuscript (Comput. Phys. Commun. 182 (2011) 2227) is available Classification: 4.13 Catalogue identifier of previous version: AEJH_v1_0 Journal reference of previous version: Comput. Phys. Commun. 182 (2011) 2227 Does the new version supersede the previous version?: Yes Nature of problem: While equilibrium reweighting is ubiquitous, there are no public programs available to perform the reweighting in the general case. Further, specific programs often suffer from many library dependencies and numerical instability. Solution method: This package is written in a modular format that allows for easy applicability of reweighting in the general case. Modules are small, numerically stable, and require minimal libraries. Reasons for new version: Some minor bugs, some upgrades needed, error analysis added. analyzeweight.py/analyzeweight.py2 has been replaced by "multihist.py". This new program performs all the functions of its predecessor while being versatile enough to handle other types of histograms and probability analysis. "bootstrap.py" was added. This script performs basic bootstrap resampling, allowing for error analysis of data. "avg_dev_distribution.py" was added. This program computes the averages and standard deviations of multiple distributions, making error analysis (e.g. from bootstrap resampling) easier to visualize. WRE.cpp was slightly modified purely for cosmetic reasons. The manual was updated for clarity and to reflect version updates. Examples were removed from the manual in favor of online tutorials (packaged examples remain). Examples were updated to reflect the new format. An additional example is included to demonstrate error analysis. Running time: Preprocessing scripts 1-5 minutes, WHAM engine <1 minute, postprocess script ~1-5 minutes.
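
    The simplest case of the reweighting described above, a single harmonically biased (umbrella) ensemble, can be written in a few lines; the sketch below is illustrative only, and the package's modular workflow additionally handles combining multiple ensembles and bootstrap error analysis.

        import numpy as np

        def unbiased_average(obs, coords, k, x0, kT):
            """Average of `obs` over samples drawn with a harmonic umbrella bias
            U_b(x) = 0.5*k*(x - x0)**2, reweighted back to the unbiased ensemble:
            <A> = sum_i A_i exp(+U_b(x_i)/kT) / sum_i exp(+U_b(x_i)/kT)."""
            u_bias = 0.5 * k * (np.asarray(coords) - x0) ** 2
            log_w = u_bias / kT
            log_w -= log_w.max()                 # guard against overflow
            w = np.exp(log_w)
            return float(np.sum(w * np.asarray(obs)) / np.sum(w))

        # e.g. samples of a reaction coordinate from one umbrella window (invented)
        x = np.random.default_rng(1).normal(1.2, 0.1, 5000)
        print(unbiased_average(obs=x, coords=x, k=50.0, x0=1.2, kT=2.5))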

  14. An Evaluation of On-Line, Interactive Tutorials Designed to Teach Practice Concepts

    ERIC Educational Resources Information Center

    Seabury, Brett A.

    2005-01-01

    This paper presents an evaluation of two on-line-based programs designed to teach practice skills. One program teaches crisis intervention and the other teaches suicide assessment. The evaluation of the use of these programs compares outcomes for two groups of students, one using the interactive program outside a class context and the other using…

  15. Designing health care risk management on-line: meeting regulators' concerns for fixed-hour curriculum.

    PubMed

    Hyer, Kathryn; Taylor, Heidi H; Nanni, Kennith

    2004-01-01

    This paper describes the experience of creating a continuing professional education on-line risk management program that is designed to meet Florida's educational requirements for licensure as a risk manager in health-care settings and details the challenges faced when the in-class didactic program of 15 eight-hour sessions is reformatted as an on-line program. Structuring instructor/learner interactivity remains a challenge, especially if the program allows learner control and is a key feature in marketing the program. The article presents the dilemmas for state regulators as they work to determine if the on-line program meets legislative intent and statutory requirements because the learning platform does not have a clock function that accumulates time for each learner. While some details reflect the uniqueness of the 120-hour educational requirements for risk managers in Florida, the experience of the authors provides insight into the development of continuing professional education distance learning programs that are multidisciplinary and move primarily from a time-based format into a curriculum that uses time as only one dimension of the evaluation of learning.

  16. Exposure time independent summary statistics for assessment of drug dependent cell line growth inhibition.

    PubMed

    Falgreen, Steffen; Laursen, Maria Bach; Bødker, Julie Støve; Kjeldsen, Malene Krag; Schmitz, Alexander; Nyegaard, Mette; Johnsen, Hans Erik; Dybkær, Karen; Bøgsted, Martin

    2014-06-05

    In vitro generated dose-response curves of human cancer cell lines are widely used to develop new therapeutics. The curves are summarised by simplified statistics that ignore the conventionally used dose-response curves' dependency on drug exposure time and growth kinetics. This may lead to suboptimal exploitation of data and biased conclusions on the potential of the drug in question. Therefore we set out to improve the dose-response assessments by eliminating the impact of time dependency. First, a mathematical model for drug induced cell growth inhibition was formulated and used to derive novel dose-response curves and improved summary statistics that are independent of time under the proposed model. Next, a statistical analysis workflow for estimating the improved statistics was suggested consisting of 1) nonlinear regression models for estimation of cell counts and doubling times, 2) isotonic regression for modelling the suggested dose-response curves, and 3) resampling based method for assessing variation of the novel summary statistics. We document that conventionally used summary statistics for dose-response experiments depend on time so that fast growing cell lines compared to slowly growing ones are considered overly sensitive. The adequacy of the mathematical model is tested for doxorubicin and found to fit real data to an acceptable degree. Dose-response data from the NCI60 drug screen were used to illustrate the time dependency and demonstrate an adjustment correcting for it. The applicability of the workflow was illustrated by simulation and application on a doxorubicin growth inhibition screen. The simulations show that under the proposed mathematical model the suggested statistical workflow results in unbiased estimates of the time independent summary statistics. Variance estimates of the novel summary statistics are used to conclude that the doxorubicin screen covers a significant diverse range of responses ensuring it is useful for biological interpretations. Time independent summary statistics may aid the understanding of drugs' action mechanism on tumour cells and potentially renew previous drug sensitivity evaluation studies.
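
    Steps 2 and 3 of the workflow (isotonic regression for the dose-response curve and resampling for its uncertainty) can be sketched as follows; the dose-response data are simulated, and scikit-learn's IsotonicRegression is used as a stand-in for the authors' implementation.

        import numpy as np
        from sklearn.isotonic import IsotonicRegression

        rng = np.random.default_rng(3)
        dose = np.repeat(np.logspace(-3, 1, 9), 4)                     # invented doses
        response = np.clip(1.0 / (1.0 + dose / 0.1)                    # decreasing curve
                           + rng.normal(0.0, 0.05, dose.size), 0.0, None)

        def fit_curve(d, r):
            """Monotone (non-increasing) dose-response curve via isotonic regression."""
            iso = IsotonicRegression(increasing=False, out_of_bounds="clip")
            return iso.fit(np.log10(d), r)

        curve = fit_curve(dose, response)

        # Bootstrap resampling of (dose, response) pairs to attach an uncertainty
        # to the fitted response at one dose level (here 0.1, on the log10 scale).
        boot = []
        for _ in range(200):
            idx = rng.integers(0, dose.size, dose.size)
            boot.append(fit_curve(dose[idx], response[idx]).predict([np.log10(0.1)])[0])

        print(curve.predict([np.log10(0.1)])[0], np.std(boot))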

  17. Exposure time independent summary statistics for assessment of drug dependent cell line growth inhibition

    PubMed Central

    2014-01-01

    Background In vitro generated dose-response curves of human cancer cell lines are widely used to develop new therapeutics. The curves are summarised by simplified statistics that ignore the conventionally used dose-response curves’ dependency on drug exposure time and growth kinetics. This may lead to suboptimal exploitation of data and biased conclusions on the potential of the drug in question. Therefore, we set out to improve the dose-response assessments by eliminating the impact of time dependency. Results First, a mathematical model for drug induced cell growth inhibition was formulated and used to derive novel dose-response curves and improved summary statistics that are independent of time under the proposed model. Next, a statistical analysis workflow for estimating the improved statistics was suggested consisting of 1) nonlinear regression models for estimation of cell counts and doubling times, 2) isotonic regression for modelling the suggested dose-response curves, and 3) a resampling-based method for assessing variation of the novel summary statistics. We document that conventionally used summary statistics for dose-response experiments depend on time so that fast growing cell lines compared to slowly growing ones are considered overly sensitive. The adequacy of the mathematical model is tested for doxorubicin and found to fit real data to an acceptable degree. Dose-response data from the NCI60 drug screen were used to illustrate the time dependency and demonstrate an adjustment correcting for it. The applicability of the workflow was illustrated by simulation and application on a doxorubicin growth inhibition screen. The simulations show that under the proposed mathematical model the suggested statistical workflow results in unbiased estimates of the time independent summary statistics. Variance estimates of the novel summary statistics are used to conclude that the doxorubicin screen covers a significantly diverse range of responses, ensuring it is useful for biological interpretations. Conclusion Time independent summary statistics may aid the understanding of drugs’ action mechanism on tumour cells and potentially renew previous drug sensitivity evaluation studies. PMID:24902483

  18. Calibration transfer of a Raman spectroscopic quantification method for the assessment of liquid detergent compositions from at-line laboratory to in-line industrial scale.

    PubMed

    Brouckaert, D; Uyttersprot, J-S; Broeckx, W; De Beer, T

    2018-03-01

    Calibration transfer or standardisation aims at creating a uniform spectral response on different spectroscopic instruments or under varying conditions, without requiring a full recalibration for each situation. In the current study, this strategy is applied to construct at-line multivariate calibration models and consequently employ them in-line in a continuous industrial production line, using the same spectrometer. Firstly, quantitative multivariate models are constructed at-line at laboratory scale for predicting the concentration of two main ingredients in hard surface cleaners. By regressing the Raman spectra of a set of small-scale calibration samples against their reference concentration values, partial least squares (PLS) models are developed to quantify the surfactant levels in the liquid detergent compositions under investigation. After evaluating the models' performance with a set of independent validation samples, a univariate slope/bias correction is applied in order to transfer these at-line calibration models to an in-line manufacturing set-up. This standardisation technique allows a fast and easy transfer of the PLS regression models, by simply correcting the model predictions on the in-line set-up, without adjusting anything in the original multivariate calibration models. An extensive statistical analysis is performed in order to assess the predictive quality of the transferred regression models. Before and after transfer, the R2 and RMSEP of both models are compared to evaluate whether their magnitudes are similar. T-tests are then performed to investigate whether the slope and intercept of the transferred regression line are not statistically different from 1 and 0, respectively. Furthermore, it is verified that no significant bias is present. F-tests are also executed for assessing the linearity of the transfer regression line and for investigating the statistical coincidence of the transfer and validation regression lines. Finally, a paired t-test is performed to compare the original at-line model to the slope/bias corrected in-line model, using interval hypotheses. It is shown that the calibration models of Surfactant 1 and Surfactant 2 yield satisfactory in-line predictions after slope/bias correction. While Surfactant 1 passes seven out of eight statistical tests, the recommended validation parameters are 100% successful for Surfactant 2. It is hence concluded that the proposed strategy for transferring at-line calibration models to an in-line industrial environment via a univariate slope/bias correction of the predicted values offers a successful standardisation approach. Copyright © 2017 Elsevier B.V. All rights reserved.
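    A univariate slope/bias correction of this kind can be sketched in a few lines. The snippet below is illustrative only, with random stand-in spectra and concentrations: an at-line PLS model is built, its predictions on a few in-line transfer samples are regressed against the reference values, and the resulting slope and bias are applied to future in-line predictions without touching the PLS model itself.

# Illustrative sketch of a univariate slope/bias correction of PLS predictions
# (not the authors' implementation).  X_at, y_at are at-line calibration spectra
# and reference concentrations; X_in, y_in are a few in-line transfer samples.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X_at = rng.normal(size=(30, 200))
y_at = rng.normal(10.0, 2.0, size=30)
X_in = X_at[:8] * 1.05 + 0.02        # pretend in-line spectra are slightly scaled/offset
y_in = y_at[:8]

# 1) Build the at-line calibration model.
pls = PLSRegression(n_components=3).fit(X_at, y_at)

# 2) Predict the in-line transfer samples with the unchanged at-line model.
y_pred_in = pls.predict(X_in).ravel()

# 3) Regress reference values on predictions to obtain slope and bias ...
slope, bias = np.polyfit(y_pred_in, y_in, deg=1)

# 4) ... and correct future in-line predictions without altering the PLS model.
def corrected_prediction(X_new):
    return slope * pls.predict(X_new).ravel() + bias

print("slope = %.3f, bias = %.3f" % (slope, bias))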

  19. On the radiated EMI current extraction of dc transmission line based on corona current statistical measurements

    NASA Astrophysics Data System (ADS)

    Yi, Yong; Chen, Zhengying; Wang, Liming

    2018-05-01

    Corona-originated discharge on DC transmission lines is the main cause of the radiated electromagnetic interference (EMI) field in the vicinity of the lines. A joint time-frequency analysis technique was proposed to extract the radiated EMI current (excitation current) of DC corona based on corona current statistical measurements. A reduced-scale experimental platform was set up to measure the statistical distributions of the current waveform parameters of an aluminum conductor steel-reinforced (ACSR) conductor. Based on the measured results, the peak, root-mean-square and average values of the 0.5 MHz radiated EMI current at 9 kHz and 200 Hz bandwidths were calculated with the proposed technique and validated against the conventional excitation function method. Radio interference (RI) was calculated from the radiated EMI current, and a wire-to-plate platform was built to check the validity of the RI computation results. The reasons for the remaining deviation between computations and measurements were analyzed in detail.
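    The detector-style statistics quoted above (peak, RMS and average values of the band-limited radiated EMI current) can be illustrated generically as follows; this is not the authors' joint time-frequency technique, and the sampling rate and synthetic waveform are assumptions made only for the example.

# Illustrative sketch (not the authors' method): band-limit a corona current
# record around 0.5 MHz and report peak, RMS and average values, the three
# detector statistics quoted above.  The waveform here is synthetic noise.
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 10e6                                     # sampling rate, Hz (assumed)
t = np.arange(0, 0.01, 1 / fs)
rng = np.random.default_rng(1)
current = rng.standard_normal(t.size) * 1e-3  # placeholder for a measured corona current

def band_stats(x, fs, f0, bandwidth):
    """Peak, RMS and average magnitude of x after a band-pass of width
    `bandwidth` centred on f0 (a rough stand-in for an EMI receiver detector)."""
    low = (f0 - bandwidth / 2) / (fs / 2)
    high = (f0 + bandwidth / 2) / (fs / 2)
    sos = butter(4, [low, high], btype="band", output="sos")
    y = sosfiltfilt(sos, x)
    return np.max(np.abs(y)), np.sqrt(np.mean(y ** 2)), np.mean(np.abs(y))

# Statistics of the 0.5 MHz component with a 9 kHz measurement bandwidth;
# a narrower receiver bandwidth (e.g. 200 Hz) would be handled the same way.
peak, rms, avg = band_stats(current, fs, 0.5e6, 9e3)
print("peak %.3e A, rms %.3e A, avg %.3e A" % (peak, rms, avg))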

  20. Slope and Line of Best Fit: A Transfer of Knowledge Case Study

    ERIC Educational Resources Information Center

    Nagle, Courtney; Casey, Stephanie; Moore-Russo, Deborah

    2017-01-01

    This paper brings together research on slope from mathematics education and research on line of best fit from statistics education by considering what knowledge of slope students transfer to a novel task involving determining the placement of an informal line of best fit. This study focuses on two students who transitioned from placing inaccurate…

  1. Chromospheric and Transition Region Emission Properties of G, K, and M dwarf Exoplanet Host Stars

    NASA Astrophysics Data System (ADS)

    France, Kevin; Arulanantham, Nicole; Fossati, Luca; Lanza, A. F.; Linsky, Jeffrey L.; Redfield, Seth; Loyd, Robert; Schneider, Christian

    2018-01-01

    Exoplanet magnetic fields have proven notoriously hard to detect, despite theoretical predictions of substantial magnetic field strengths on close-in extrasolar giant planets. It has been suggested that stellar and planetary magnetic field interactions can manifest as enhanced stellar activity relative to nominal age-rotation-activity relationships for main sequence stars or enhanced activity on stars hosting short-period massive planets. In a recent study of M and K dwarf exoplanet host stars, we demonstrated a significant correlation between the relative luminosity in high-temperature stellar emission lines (L(ion)/L_Bol) and the “star-planet interaction strength”, M_plan/a_plan. Here, we expand on that work with a survey of G, K, and M dwarf exoplanet host stars obtained in two recent far-ultraviolet spectroscopic programs with the Hubble Space Telescope. We have measured the relative luminosities of stellar lines C II, Si III, Si IV, and N V (formation temperatures from 30,000 – 150,000 K) in a sample of ~60 exoplanet host stars and an additional ~40 dwarf stars without known planets. We present results on star-planet interaction signals as a function of spectral type and line formation temperature, as well as a statistical comparison of stars with and without planets.

  2. Joint Data Assimilation and Parameter Calibration in on-line groundwater modelling using Sequential Monte Carlo techniques

    NASA Astrophysics Data System (ADS)

    Ramgraber, M.; Schirmer, M.

    2017-12-01

    As computational power grows and wireless sensor networks find their way into common practice, it becomes increasingly feasible to pursue on-line numerical groundwater modelling. The reconciliation of model predictions with sensor measurements often necessitates the application of Sequential Monte Carlo (SMC) techniques, most prominently represented by the Ensemble Kalman Filter. In the pursuit of on-line predictions it seems advantageous to transcend the scope of pure data assimilation and incorporate on-line parameter calibration as well. Unfortunately, the interplay between shifting model parameters and transient states is non-trivial. Several recent publications (e.g. Chopin et al., 2013, Kantas et al., 2015) in the field of statistics discuss potential algorithms addressing this issue. However, most of these are computationally intractable for on-line application. In this study, we investigate to what extent compromises between mathematical rigour and computational restrictions can be made within the framework of on-line numerical modelling of groundwater. Preliminary studies are conducted in a synthetic setting, with the goal of transferring the conclusions drawn into application in a real-world setting. To this end, a wireless sensor network has been established in the valley aquifer around Fehraltorf, characterized by a highly dynamic groundwater system and located about 20 km to the East of Zürich, Switzerland. By providing continuous probabilistic estimates of the state and parameter distribution, a steady base for branched-off predictive scenario modelling could be established, providing water authorities with advanced tools for assessing the impact of groundwater management practices. Chopin, N., Jacob, P.E. and Papaspiliopoulos, O. (2013): SMC2: an efficient algorithm for sequential analysis of state space models. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 75 (3), p. 397-426. Kantas, N., Doucet, A., Singh, S.S., Maciejowski, J., and Chopin, N. (2015): On Particle Methods for Parameter Estimation in State-Space Models. Statistical Science, 30 (3), p. 328-351.
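    A minimal sketch of joint state-parameter estimation with an Ensemble Kalman Filter is shown below, using a toy recession model in place of a groundwater model; it only illustrates the augmented-state idea discussed above and is not the authors' actual set-up.

# Minimal sketch (illustrative only) of joint state-parameter estimation with an
# Ensemble Kalman Filter: each ensemble member carries the model state (a head h)
# plus an uncertain parameter (a recession coefficient k), and both are updated
# whenever a head observation arrives.  The "model" here is a toy recession law.
import numpy as np

rng = np.random.default_rng(42)
n_ens, n_steps = 100, 50
obs_noise_std = 0.02

# Ensemble: column 0 = head h, column 1 = parameter k (both initially uncertain).
ens = np.column_stack([rng.normal(10.0, 0.5, n_ens),     # initial heads
                       rng.normal(0.10, 0.03, n_ens)])   # initial parameter guesses

true_h, true_k = 10.0, 0.05

def forecast(ens):
    """Propagate each member one step: exponential head recession h' = h*(1-k)."""
    h, k = ens[:, 0], ens[:, 1]
    return np.column_stack([h * (1.0 - k), k])            # the parameter is persistent

for _ in range(n_steps):
    true_h *= (1.0 - true_k)
    obs = true_h + rng.normal(0.0, obs_noise_std)

    ens = forecast(ens)
    predicted_obs = ens[:, 0]                              # the head is observed directly

    # Standard stochastic EnKF update of the augmented state [h, k].
    cov_xy = np.cov(ens.T, predicted_obs)[:2, 2]           # cross-covariance with the obs
    var_y = np.var(predicted_obs, ddof=1) + obs_noise_std ** 2
    gain = cov_xy / var_y
    perturbed_obs = obs + rng.normal(0.0, obs_noise_std, n_ens)
    ens += np.outer(perturbed_obs - predicted_obs, gain)

print("true k = %.3f, estimated k = %.3f" % (true_k, ens[:, 1].mean()))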

  3. Program Enhances Drawings Of Three-Dimensional Objects

    NASA Technical Reports Server (NTRS)

    Hedgley, David R., Jr.

    1992-01-01

    SILHOUETTE is a program for line drawings that renders any subset of polygons as a silhouette. The program is an improvement on, and replacement for, HIDDEN LINE COMPUTER CODE (ARC-11446). It offers combinations of silhouette and nonsilhouette specifications for an arbitrary solid. Written in FORTRAN 77.

  4. Use of anti-tuberculosis drugs among newly diagnosed pulmonary tuberculosis inpatients in China: a retrospective study.

    PubMed

    Huang, Fei; Zhang, Hui; Lv, Qing; Sato, Kaori D; Qu, Yan; Huan, Shitong; Cheng, Jun; Zhao, Fei; Wang, Lixia

    2016-01-21

    China's national tuberculosis control program (NTP) provides free, first-line anti-tuberculosis (TB) drugs to pulmonary TB patients. This treatment regimen follows the World Health Organization's (WHO) guideline. The objective of this paper is to evaluate the current status of anti-TB drug use for newly diagnosed pulmonary TB inpatients treated in prefecture- and county-level designated hospitals. Three prefecture-level hospitals and nine county-level hospitals were selected for the study. All newly diagnosed pulmonary TB inpatient medical records from 2012 were reviewed and doubly examined by two national senior physicians. The rational use of anti-TB drugs was evaluated based on criteria in line with WHO's guideline. Of the 2,060 total treatment regimens for TB, 53.1 % were found to be rational (1093/2060). The percentages in prefecture-level and county-level hospitals were 50.3 % (761/1513) and 60.7 % (332/547), respectively. The difference between the two levels of hospitals was statistically significant (Chi-square value = 17.44, P < 0.01). The percentages of rational treatment regimens for first-time hospitalizations and for two or more hospitalizations were 59.5 % (983/1653) and 27.0 % (110/407), respectively, with a statistically significant difference (Chi-square value = 138.00, P < 0.01). The overall use of second-line drugs (SLD) was 54.9 % (1131/2060). The percentages for prefecture-level and county-level hospitals were 50.6 % (766/1513) and 66.7 % (365/547), respectively. A statistically significant difference was found (Chi-square value = 42.06, P < 0.01). The use of SLD for inpatients hospitalized once and inpatients hospitalized twice or more was 58.4 % (966/1653) and 40.5 % (165/407), respectively, with a statistically significant difference (Chi-square value = 42.26, P < 0.01). About half of the inpatients may have been treated with irrational regimens, and SLD were dispensed more appropriately in city-level hospitals than in county-level hospitals. Training and guidelines for health personnel, supervision led by health authorities, and increased investment in designated hospitals may help to improve the rational use of anti-TB drugs.

  5. 7 CFR 295.5 - Program statistical reports.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 4 2011-01-01 2011-01-01 false Program statistical reports. 295.5 Section 295.5 Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF... statistical reports. Current and historical information on FNS food assistance program size, monetary outlays...

  6. 7 CFR 295.5 - Program statistical reports.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 4 2010-01-01 2010-01-01 false Program statistical reports. 295.5 Section 295.5 Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF... statistical reports. Current and historical information on FNS food assistance program size, monetary outlays...

  7. Why Wait? The Influence of Academic Self-Regulation, Intrinsic Motivation, and Statistics Anxiety on Procrastination in Online Statistics

    ERIC Educational Resources Information Center

    Dunn, Karee

    2014-01-01

    Online graduate education programs are expanding rapidly. Many of these programs require a statistics course, resulting in an increasing need for online statistics courses. The study reported here grew from experiences teaching online, graduate statistics courses. In seeking answers on how to improve this class, I discovered that research has yet…

  8. On the fractal characterization of Paretian Poisson processes

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo I.; Sokolov, Igor M.

    2012-06-01

    Paretian Poisson processes are Poisson processes which are defined on the positive half-line, have maximal points, and are quantified by power-law intensities. Paretian Poisson processes are elemental in statistical physics, and are the bedrock of a host of power-law statistics ranging from Pareto's law to anomalous diffusion. In this paper we establish evenness-based fractal characterizations of Paretian Poisson processes. Considering an array of socioeconomic evenness-based measures of statistical heterogeneity, we show that: amongst the realm of Poisson processes which are defined on the positive half-line, and have maximal points, Paretian Poisson processes are the unique class of 'fractal processes' exhibiting scale-invariance. The results established in this paper are diametric to previous results asserting that the scale-invariance of Poisson processes, with respect to physical randomness-based measures of statistical heterogeneity, is characterized by exponential Poissonian intensities.
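    For concreteness, the scale-invariance referred to above can be written out as follows; the parameterization of the power-law intensity is assumed here for illustration and may differ from the paper's own notation.

% Illustrative notation (assumed, not quoted from the paper): a Paretian Poisson
% process on the positive half-line has the power-law intensity
\[
  \lambda(x) \;=\; \frac{c}{x^{1+\alpha}}, \qquad x > 0,\; c > 0,\; \alpha > 0 .
\]
% Its scale-invariance follows from
\[
  \int_{s x_1}^{s x_2} \lambda(u)\,\mathrm{d}u \;=\; s^{-\alpha} \int_{x_1}^{x_2} \lambda(x)\,\mathrm{d}x ,
\]
% so rescaling the half-line by a factor s maps the process onto another Paretian
% Poisson process with amplitude s^{-\alpha} c; exponential intensities do not
% reproduce themselves under such rescaling.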

  9. Syringe Pump Performance Maintained with IV Filter Use During Low Flow Rate Delivery for Pediatric Patients.

    PubMed

    Chau, Destiny F; Vasilopoulos, Terrie; Schoepf, Miriam; Zhang, Christina; Fahy, Brenda G

    2016-09-01

    Complex surgical and critically ill pediatric patients rely on syringe infusion pumps for precise delivery of IV medications. Low flow rates and in-line IV filter use may affect drug delivery. To determine the effects of an in-line filter to remove air and/or contaminants on syringe pump performance at low flow rates, we compared the measured rates with the programmed flow rates with and without in-line IV filters. Standardized IV infusion assemblies with and without IV filters (filter and control groups) attached to a 10-mL syringe were primed and then loaded onto a syringe pump and connected to a 16-gauge, 16-cm single-lumen catheter. The catheter was suspended in a normal saline fluid column to simulate the back pressure from central venous circulation. The delivered infusate was measured by gravimetric methods at predetermined time intervals, and flow rate was calculated. Experimental trials for initial programmed rates of 1.0, 0.8, 0.6, and 0.4 mL/h were performed in control and filter groups. For each trial, the flow rate was changed to double the initial flow rate and was then returned to the initial flow rate to analyze pump performance for titration of rates often required during medication administration. These conditions (initial rate, doubling of initial rate, and return to initial rate) were analyzed separately for steady-state flow rate and time to steady state, whereas their average was used for percent deviation analysis. Differences between control and filter groups were assessed using Student t tests with adjustment for multiplicity (using n = 3 replications per group). Mean time from 0 to initial flow (startup delay) was <1 minute in both groups with no statistical difference between groups (P = 1.0). The average time to reach steady-state flow after infusion startup or rate changes was not statistically different between the groups (range, 0.8-5.5 minutes), for any flow rate or part of the trial (initial rate, doubling of initial rate, and return to initial rate), although the study was underpowered to detect small time differences. Overall, the mean steady-state flow rate for each trial was below the programmed flow rate with negative mean percent deviations for each trial. In the 1.0-mL/h initial rate trial, the steady-state flow rate attained was lower in the filter than the control group for the initial rate (P = 0.04) and doubling of initial rate (P = 0.04) with a trend during the return to initial rate (P = 0.06), although this same effect was not observed when doubling the initial rate trials of 0.8 or 0.6 mL/h or any other rate trials compared with the control group. With low flow rates used in complex surgical and pediatric critically ill patients, the addition of IV filters did not confer statistically significant changes in startup delay, flow variability, or time to reach steady-state flow of medications administered by syringe infusion pumps. The overall flow rate was lower than programmed flow rate with or without a filter.

  10. PCSYS: The optimal design integration system picture drawing system with hidden line algorithm capability for aerospace vehicle configurations

    NASA Technical Reports Server (NTRS)

    Hague, D. S.; Vanderburg, J. D.

    1977-01-01

    A vehicle geometric definition based upon quadrilateral surface elements is used to produce realistic pictures of an aerospace vehicle. The PCSYS programs can be used to visually check geometric data input, monitor geometric perturbations, and to visualize the complex spatial inter-relationships between the internal and external vehicle components. PCSYS has two major component programs. The first program, IMAGE, draws a complex aerospace vehicle pictorial representation based on either an approximate but rapid hidden line algorithm or without any hidden line algorithm. The second program, HIDDEN, draws a vehicle representation using an accurate but time consuming hidden line algorithm.

  11. The professional profile of UFBA nursing management graduate students.

    PubMed

    Paiva, Mirian Santos; Coelho, Edméia de Almeida Cardoso; Nascimento, Enilda Rosendo do; Melo, Cristina Maria Meira de; Fernandes, Josicelia Dumêt; Santos, Ninalva de Andrade

    2011-12-01

    The objective of the present study was to analyze the professional profile of the nursing graduate students of Federal University of Bahia, more specifically of the nursing management area. This descriptive, exploratory study was performed using documental research. The data were collected from the graduates' curricula on the Lattes Platform and from the graduate program documents, using a form. The study population consisted of graduates enrolled under the line of research The Organization and Evaluation of Health Care Systems, who developed dissertations/theses addressing Nursing/Health Management. The data were stored using Microsoft Excel, and then transferred to the STATA 9.0 statistical software. Results showed that most graduates are women, originally from the State of Bahia, who completed the course between 2000 and 2011 and are faculty members at public institutions who remained involved in academic work after completing the course. These results point to the program as an academic environment committed to preparing researchers.

  12. SAP- FORTRAN STATIC SOURCE CODE ANALYZER PROGRAM (IBM VERSION)

    NASA Technical Reports Server (NTRS)

    Manteufel, R.

    1994-01-01

    The FORTRAN Static Source Code Analyzer program, SAP, was developed to automatically gather statistics on the occurrences of statements and structures within a FORTRAN program and to provide for the reporting of those statistics. Provisions have been made for weighting each statistic and to provide an overall figure of complexity. Statistics, as well as figures of complexity, are gathered on a module by module basis. Overall summed statistics are also accumulated for the complete input source file. SAP accepts as input syntactically correct FORTRAN source code written in the FORTRAN 77 standard language. In addition, code written using features in the following languages is also accepted: VAX-11 FORTRAN, IBM S/360 FORTRAN IV Level H Extended; and Structured FORTRAN. The SAP program utilizes two external files in its analysis procedure. A keyword file allows flexibility in classifying statements and in marking a statement as either executable or non-executable. A statistical weight file allows the user to assign weights to all output statistics, thus allowing the user flexibility in defining the figure of complexity. The SAP program is written in FORTRAN IV for batch execution and has been implemented on a DEC VAX series computer under VMS and on an IBM 370 series computer under MVS. The SAP program was developed in 1978 and last updated in 1985.
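    The statistics-gathering idea can be illustrated with a small sketch: classify each statement with a keyword table, count occurrences per module, and fold the counts into a single weighted figure of complexity. The classifier, keyword table and weights below are hypothetical stand-ins, not the SAP keyword or weight files.

# Illustrative sketch (not the SAP source) of the idea described above: classify
# FORTRAN statements with a keyword table, count occurrences per module, and
# combine the counts into a single weighted "figure of complexity".
from collections import Counter

# Hypothetical keyword table: statement class per keyword (names illustrative).
KEYWORDS = {"IF": "branch", "DO": "loop", "GOTO": "jump",
            "CALL": "call", "ASSIGN": "assignment", "COMMENT": "comment"}

# Hypothetical weight file: weight applied to each per-module statistic.
WEIGHTS = {"branch": 4.0, "loop": 3.0, "jump": 5.0,
           "call": 2.0, "assignment": 1.0, "comment": 0.0}

def classify(line):
    """Very rough fixed-form FORTRAN statement classifier (illustrative only)."""
    if not line.strip():
        return "COMMENT"
    if line[:1].upper() in ("C", "*", "!"):      # comment marker in column 1
        return "COMMENT"
    stripped = line.strip().upper()
    for kw in ("IF", "DO", "GOTO", "CALL"):
        if stripped.startswith(kw):
            return kw
    return "ASSIGN" if "=" in stripped else "COMMENT"

def module_statistics(source_lines):
    """Count statement classes in one module and fold them into one weighted
    figure of complexity, in the spirit of the description above."""
    counts = Counter(KEYWORDS[classify(l)] for l in source_lines)
    complexity = sum(WEIGHTS[cls] * n for cls, n in counts.items())
    return dict(counts), complexity

demo = ["C     accumulate the positive entries of X",
        "      TOTAL = 0.0",
        "      DO 10 I = 1, N",
        "      IF (X(I) .GT. 0.0) TOTAL = TOTAL + X(I)",
        "      CALL REPORT(TOTAL)"]
print(module_statistics(demo))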

  13. SAP- FORTRAN STATIC SOURCE CODE ANALYZER PROGRAM (DEC VAX VERSION)

    NASA Technical Reports Server (NTRS)

    Merwarth, P. D.

    1994-01-01

    The FORTRAN Static Source Code Analyzer program, SAP, was developed to automatically gather statistics on the occurrences of statements and structures within a FORTRAN program and to provide for the reporting of those statistics. Provisions have been made for weighting each statistic and to provide an overall figure of complexity. Statistics, as well as figures of complexity, are gathered on a module by module basis. Overall summed statistics are also accumulated for the complete input source file. SAP accepts as input syntactically correct FORTRAN source code written in the FORTRAN 77 standard language. In addition, code written using features in the following languages is also accepted: VAX-11 FORTRAN, IBM S/360 FORTRAN IV Level H Extended; and Structured FORTRAN. The SAP program utilizes two external files in its analysis procedure. A keyword file allows flexibility in classifying statements and in marking a statement as either executable or non-executable. A statistical weight file allows the user to assign weights to all output statistics, thus allowing the user flexibility in defining the figure of complexity. The SAP program is written in FORTRAN IV for batch execution and has been implemented on a DEC VAX series computer under VMS and on an IBM 370 series computer under MVS. The SAP program was developed in 1978 and last updated in 1985.

  14. Monitoring the metering performance of an electronic voltage transformer on-line based on cyber-physics correlation analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Zhu; Li, Hongbin; Tang, Dengping; Hu, Chen; Jiao, Yang

    2017-10-01

    Metering performance is the key parameter of an electronic voltage transformer (EVT), and it requires high accuracy. The conventional off-line calibration method using a standard voltage transformer is not suitable for the key equipment in a smart substation, which needs on-line monitoring. In this article, we propose a method for monitoring the metering performance of an EVT on-line based on cyber-physics correlation analysis. Exploiting the electrical and physical properties of a substation operating in three-phase symmetry, the principal component analysis method is used to separate the metering deviation caused by primary-side fluctuations from that caused by an EVT anomaly. The characteristic statistics of the measured data during operation are extracted, and the metering performance of the EVT is evaluated by analyzing the change in these statistics. The experimental results show that the method accurately monitors the metering deviation of a Class 0.2 EVT. The method demonstrates accurate on-line evaluation of the metering performance of an EVT without a standard voltage transformer.
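    The separation step described above can be illustrated with a generic principal component analysis of the three secondary-voltage channels: a genuine primary-side fluctuation moves all three phases together and is captured by the leading component, while a drift in a single EVT shows up in the residual components. The sketch below uses synthetic data and is not the authors' algorithm.

# Illustrative sketch (not the authors' algorithm): with three-phase symmetry,
# a primary-voltage fluctuation moves all three EVT channels together, while a
# metering drift in one EVT breaks that correlation.  PCA separates the common
# component from channel-specific deviations, whose growth flags an anomaly.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n = 5000
primary = 1.0 + 0.01 * rng.standard_normal(n)          # common primary fluctuation (p.u.)
channels = np.column_stack([primary] * 3) + 0.001 * rng.standard_normal((n, 3))

# Inject a slow gain drift into phase B during the second half of the record.
drift = np.linspace(0.0, 0.005, n // 2)
channels[n // 2:, 1] *= (1.0 + drift)

pca = PCA(n_components=3).fit(channels)
scores = pca.transform(channels)

# The first component tracks the common (primary) fluctuation; the residual
# variance in the remaining components is the statistic monitored over time.
residual = scores[:, 1:]
window = 500
residual_rms = np.sqrt(np.convolve((residual ** 2).sum(axis=1),
                                   np.ones(window) / window, mode="valid"))
print("residual RMS early vs late: %.2e vs %.2e"
      % (residual_rms[:window].mean(), residual_rms[-window:].mean()))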

  15. An On-Line Program for Intermediate Level Latin Readings.

    ERIC Educational Resources Information Center

    Raia, Ann

    2001-01-01

    Introduces an on-line intermediate Latin program to potential users by describing the goals and elements of the site (www.iona.edu/latin). Operation of the program is described, as well as the benefits for language education, its current uses, and suggestions for more creative uses in the classroom. (Author/VWL)

  16. In Search of the Best On-Line Degree Programs in Human Resources.

    ERIC Educational Resources Information Center

    Kirk, James J.; Waltemyer, Holly

    The advent of the Internet, the World Wide Web, and global infrastructures for e-learning are revolutionizing how colleges and universities deliver degree programs to adult students. Benefits of on-line degree programs to working adults include flexibility, convenience, and time and cost savings. New technologies that greatly affect how online…

  17. Nongrayness Effects in Wolf-Rayet Wind Momentum Deposition

    NASA Astrophysics Data System (ADS)

    Onifer, A. J.; Gayley, K. G.

    2004-05-01

    Wolf-Rayet winds are characterized by their large momentum fluxes and optically thick winds. A simple analytic approach that helps to understand the most critical processes is the effectively gray approximation, but this has not been generalized to more realistic nongray opacities. We have developed a simplified theory for describing the interaction of the stellar flux with nongray wind opacity. We replace the detailed line list with a set of statistical parameters that are sensitive to the line strengths as well as the wavelength distribution of lines. We determine these statistical parameters for several real line lists, exploring the effects of temperature and density changes on the efficiency of momentum driving relative to gray opacity. We wish to acknowledge NSF grant AST-0098155.

  18. REAL-TIME MONITORING OF DIOXINS AND OTHER ...

    EPA Pesticide Factsheets

    This project is part of EPA's EMPACT program which was begun in 1998 and is jointly administered by EPA's Office of Research and Development, the National Center for Environmental Research and Quality Assurance (NCERQA), and the National Center for Environmental Assessment. The program was developed to provide understandable environmental information on various research initiatives to the public in a timely manner on various issues of importance. This particular project involves development of the application of an on-line, real time, trace organic air toxic monitor, with special emphasis on dioxin-related compounds. Research efforts demonstrate the utility and usefulness of the Resonance Enhanced Multi-Photon Ionization (REMPI) analytical method for trace organics control, monitoring, and compliance assurance. Project objectives will be to develop the REMPI instrumental method into a tool that will be used for assessment of potential dioxin sources, control and prevention of dioxin formation in known sources, and communication of facility performance. This will be accomplished through instrument development, laboratory verification, thermokinetic modelling, equilibrium modelling, statistical determinations, field validation, program publication and presentation, regulatory office support, and development of data communication/presentation procedures. For additional information on this EMPACT project, visit the website at http://www.epa.gov/appcdwww/crb/empa

  19. Type II Supernova Spectral Diversity. I. Observations, Sample Characterization, and Spectral Line Evolution

    NASA Astrophysics Data System (ADS)

    Gutiérrez, Claudia P.; Anderson, Joseph P.; Hamuy, Mario; Morrell, Nidia; González-Gaitan, Santiago; Stritzinger, Maximilian D.; Phillips, Mark M.; Galbany, Lluis; Folatelli, Gastón; Dessart, Luc; Contreras, Carlos; Della Valle, Massimo; Freedman, Wendy L.; Hsiao, Eric Y.; Krisciunas, Kevin; Madore, Barry F.; Maza, José; Suntzeff, Nicholas B.; Prieto, Jose Luis; González, Luis; Cappellaro, Enrico; Navarrete, Mauricio; Pizzella, Alessandro; Ruiz, Maria T.; Smith, R. Chris; Turatto, Massimo

    2017-11-01

    We present 888 visual-wavelength spectra of 122 nearby type II supernovae (SNe II) obtained between 1986 and 2009, and ranging between 3 and 363 days post-explosion. In this first paper, we outline our observations and data reduction techniques, together with a characterization based on the spectral diversity of SNe II. A statistical analysis of the spectral matching technique is discussed as an alternative to nondetection constraints for estimating SN explosion epochs. The time evolution of spectral lines is presented and analyzed in terms of how this differs for SNe of different photometric, spectral, and environmental properties: velocities, pseudo-equivalent widths, decline rates, magnitudes, time durations, and environment metallicity. Our sample displays a large range in ejecta expansion velocities, from ˜9600 to ˜1500 km s-1 at 50 days post-explosion with a median Hα value of 7300 km s-1. This is most likely explained through differing explosion energies. Significant diversity is also observed in the absolute strength of spectral lines, characterized through their pseudo-equivalent widths. This implies significant diversity in both temperature evolution (linked to progenitor radius) and progenitor metallicity between different SNe II. Around 60% of our sample shows an extra absorption component on the blue side of the Hα P-Cygni profile (“Cachito” feature) between 7 and 120 days since explosion. Studying the nature of Cachito, we conclude that these features at early times (before ˜35 days) are associated with Si II λ 6355, while past the middle of the plateau phase they are related to high velocity (HV) features of hydrogen lines. This paper includes data gathered with the 6.5 m Magellan Telescopes located at Las Campanas Observatory, Chile; and the Gemini Observatory, Cerro Pachon, Chile (Gemini Program GS-2008B-Q-56). Based on observations collected at the European Organisation for Astronomical Research in the Southern Hemisphere, Chile (ESO Programs 076.A-0156, 078.D-0048, 080.A-0516, and 082.A-0526).

  20. The effect of a combined high-intensity plyometric and speed training program on the running and jumping ability of male handball players.

    PubMed

    Cherif, Monsef; Said, Mohamed; Chaatani, Sana; Nejlaoui, Olfa; Gomri, Daghbaji; Abdallah, Aouidet

    2012-03-01

    The aim of this study was to investigate the effect of a combined program including sprint repetitions and drop jump training in the same session on male handball players. Twenty-two male handball players aged more than 20 years were assigned into 2 groups: experimental group (n=11) and control group (n=11). Selection was based on the variables "axis" and "lines"; goalkeepers were not included. The experimental group was subjected to 2 testing periods (test and retest) separated by 12 weeks of an additional combined plyometric and running speed training program. The control group performed the usual handball training. The testing period comprised, on the first day, a medical check, anthropometric measurements and an incremental exercise test (the yo-yo intermittent recovery test). Two days later, participants performed the Repeated Sprint Ability test (RSA), and jumping performance was assessed using 3 different events: squat jump (SJ), countermovement jump without (CMJ) and with arms (CMJA), and drop jump (DJ). At the end of the training period, participants again performed the repeated sprint ability test and the jumping performance tests. The combined program improved the explosive force ability of handball players in CMJ (P=0.01), CMJA (P=0.01) and DJR (P=0.03). The change was 2.78, 2.42 and 2.62%, respectively. No significant changes were noted in the performances of the experimental group at the squat jump test and the drop jump with the left leg test. The training intervention also improved the running speed ability of the experimental group (P=0.003). No statistical differences were observed between lines or axes. An additional combined training program of sprint repetitions and vertical jumps in the same training session positively influences the jumping ability and the sprint ability of handball players.

  1. Herschel Studies of the Evolution and Environs of Young Stars in the DIGIT, WISH, and FOOSH Programs

    NASA Astrophysics Data System (ADS)

    Green, Joel D.; DIGIT OT Key Project Team; WISH GT Key Project Team; FOOSH OT1 Team

    2012-01-01

    The Herschel Space Observatory has enabled us to probe the physical conditions of outer disks, envelopes, and outflows of young stellar objects, including embedded objects, Herbig Ae/Be disks, and T Tauri disks. We will report on results from three projects, DIGIT, WISH, and FOOSH. The DIGIT (Dust, Ice, and Gas in Time) program (PI: Neal Evans) utilizes the full spectral range of the PACS instrument to explore simultaneously the solid and gas-phase chemistry around sources in all of these stages. WISH (Water in Star Forming Regions with Herschel, PI Ewine van Dishoeck) focuses on observations of key lines with HIFI and line scans of selected spectral regions with PACS. FOOSH (FU Orionis Objects Surveyed with Herschel, PI Joel Green) studies FU Orionis objects with full range PACS and SPIRE scans. DIGIT includes examples of low luminosity protostars, while FOOSH studies the high luminosity objects during outburst states. Rotational ladders of highly excited CO and OH emission are detected in both disks and protostars. The highly excited lines are more commonly seen in the embedded phases, where there appear to be two temperature components. Intriguingly, water is frequently detected in spectra of embedded sources, but not in the disk spectra. In addition to gas features, we explore the extent of the newly detected 69 um forsterite dust feature in both T Tauri and Herbig Ae/Be stars. When analyzed along with the Spitzer-detected dust features, these provide constraints on a population of colder crystalline material. We will present some models of individual sources, as well as some broad statistics of the emission from these stages of star and planet formation.

  2. The FTS atomic spectrum tool (FAST) for rapid analysis of line spectra

    NASA Astrophysics Data System (ADS)

    Ruffoni, M. P.

    2013-07-01

    The FTS Atomic Spectrum Tool (FAST) is an interactive graphical program designed to simplify the analysis of atomic emission line spectra obtained from Fourier transform spectrometers. Calculated, predicted and/or known experimental line parameters are loaded alongside experimentally observed spectral line profiles for easy comparison between new experimental data and existing results. Many such line profiles, which could span numerous spectra, may be viewed simultaneously to help the user detect problems from line blending or self-absorption. Once the user has determined that their experimental line profile fits are good, a key feature of FAST is the ability to calculate atomic branching fractions, transition probabilities, and oscillator strengths, and their uncertainties, which is not provided by existing analysis packages. Program Summary: Program title: FAST: The FTS Atomic Spectrum Tool. Catalogue identifier: AEOW_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEOW_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public License version 3. No. of lines in distributed program, including test data, etc.: 293058. No. of bytes in distributed program, including test data, etc.: 13809509. Distribution format: tar.gz. Programming language: C++. Computer: Intel x86-based systems. Operating system: Linux/Unix/Windows. RAM: 8 MB minimum. About 50-200 MB for a typical analysis. Classification: 2.2, 2.3, 21.2. Nature of problem: Visualisation of atomic line spectra including the comparison of theoretical line parameters with experimental atomic line profiles. Accurate intensity calibration of experimental spectra, and the determination of observed relative line intensities that are needed for calculating atomic branching fractions and oscillator strengths. Solution method: FAST is centred around a graphical interface, where a user may view sets of experimental line profiles and compare them to calculated data (such as from the Kurucz database [1]), predicted line parameters, and/or previously known experimental results. With additional information on the spectral response of the spectrometer, obtained from a calibrated standard light source, FT spectra may be intensity calibrated. In turn, this permits the user to calculate atomic branching fractions and oscillator strengths, and their respective uncertainties. Running time: Open ended. Defined by the user. References: [1] R.L. Kurucz (2007). URL http://kurucz.harvard.edu/atoms/.
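    The branching-fraction and transition-probability relations that such an analysis evaluates can be written compactly as below; this is standard atomic-spectroscopy notation rather than a quotation from the FAST documentation.

% Standard relations (notation ours, not taken from the program documentation):
\[
  \mathrm{BF}_{ul} \;=\; \frac{I_{ul}}{\sum_{l'} I_{ul'}}, \qquad
  A_{ul} \;=\; \frac{\mathrm{BF}_{ul}}{\tau_u},
\]
% where I_{ul} is the calibrated relative intensity of the line from upper level u
% to lower level l, the sum runs over all decay channels of u, and \tau_u is the
% upper-level radiative lifetime taken from an independent measurement; the
% oscillator strength then follows from A_{ul}, the transition wavelength and the
% statistical weights of the two levels.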

  3. Eliminating Barriers: A Training Intervention in the Use of Medical Information Resources Within an Information-rich Ambulatory Care Environment

    PubMed Central

    Cuddy, Colleen; Brewer, Karen; Fitzpatrick, Bronson; Faraino, Richard; Trainor, Angela; Ciotoli, Carlo

    2001-01-01

    The NYU Ehrman Medical Library worked with the NYU Health Center to establish a baseline analysis of the Center staff's knowledge and skills about medical information resources and how they apply them to clinical problem solving in their practice. Based on the results of this survey, the library conducted a targeted 12-month training program in how to select and use electronic resources for clinical problem solving. The survey was repeated and analyzed for significant self-reported change in information-seeking behavior and information skills. The poster presents the statistically significant changes and a set of the resultant research hypotheses.

  4. GOT C+: A Herschel Space Observatory Key Program to Study the Diffuse ISM

    NASA Astrophysics Data System (ADS)

    Langer, William; Velusamy, T.; Goldsmith, P. F.; Li, D.; Pineda, J.; Yorke, H.

    2010-01-01

    Star formation activity is regulated by pressures in the interstellar medium, which in turn depend on heating and cooling rates, modulated by the gravitational potential, and shock and turbulent pressures. To understand these processes we need information about the diffuse atomic and diffuse molecular gas cloud properties. The ionized carbon CII fine structure line at 1.9 THz is an important tracer of the atomic gas in the diffuse regions and the atomic to molecular cloud transformation. Furthermore, C+ is a major ISM coolant, the Galaxy's strongest emission line, with a total luminosity about 1000 times that of CO J=1-0. Galactic Observations of the Terahertz C+ Line (GOT C+) is a Herschel Space Observatory Open Time Key Program to study the diffuse interstellar medium by sampling CII line emission throughout the Galactic disk. GOT C+ will obtain high spectral resolution CII spectra using the Heterodyne Instrument for the Far Infrared (HIFI). It employs deep integrations, wide velocity coverage (350 km s-1) with 0.22 km s-1 resolution, and systematic sparse sampling of the Galactic disk, together with observations of selected targets, covering over 900 lines of sight. It will be a resource for the atomic gas properties in the (a) Galactic disk, (b) Galaxy's central 300pc, (c) Galactic warp, (d) high latitude HI clouds, and (e) Photon Dominated Regions (PDRs). Along with HI, CO isotopes, and CI spectra, our C+ data will provide the astronomical community with a rich statistical database of diffuse cloud properties, for understanding the role of barometric pressure and turbulence in cloud evolution in the Galactic ISM and, by extension, other galaxies. The GOT C+ project will provide a template for future even larger-scale CII surveys. This research was conducted at the Jet Propulsion Laboratory, California Institute of Technology and is supported by a NASA grant.

  5. Off-line robot programming and graphical verification of path planning

    NASA Technical Reports Server (NTRS)

    Tonkay, Gregory L.

    1989-01-01

    The objective of this project was to develop or specify an integrated environment for off-line programming, graphical path verification, and debugging for robotic systems. Two alternatives were compared. The first was the integration of the ASEA Off-line Programming package with ROBSIM, a robotic simulation program. The second alternative was the purchase of the commercial product IGRIP. The needs of the RADL (Robotics Applications Development Laboratory) were explored and the alternatives were evaluated based on these needs. As a result, IGRIP was proposed as the best solution to the problem.

  6. A seismic survey of the Manson disturbed area

    NASA Technical Reports Server (NTRS)

    Sendlein, L. V. A.; Smith, T. A.

    1971-01-01

    The region in north-central Iowa referred to as the Manson disturbed area was investigated with the seismic refraction method and the bedrock configuration mapped. The area is approximately 30 km in diameter and is not detectable from the surface topography; however, water wells that penetrate the bedrock indicate that the bedrock is composed of disturbed Cretaceous sediments with a central region approximately 6 km in diameter composed of Precambrian crystalline rock. Seismic velocity differences between the overlying glacial till and the Cretaceous sediments were so small that a statistical program was developed to analyze the data. The program developed utilizes existing 2 segment regression analyses and extends the method to fit 3 or more regression lines to seismic data.
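    The multi-segment fitting idea can be sketched as a small brute-force search: fit a straight line to each candidate segment of the travel-time curve and keep the breakpoints that give the smallest total squared residual. The snippet below uses synthetic refraction data and is not the original 1971 program.

# Illustrative sketch (not the original program) of fitting a multi-segment
# travel-time curve to refraction data: try every pair of breakpoints, fit a
# straight line to each segment, and keep the combination with the smallest
# total squared residual.  Distances and times below are synthetic.
import numpy as np
from itertools import combinations

def segment_fit(x, y):
    """Least-squares line through one segment; returns (RSS, coefficients)."""
    coeffs = np.polyfit(x, y, 1)
    return np.sum((y - np.polyval(coeffs, x)) ** 2), coeffs

def best_three_segment_fit(x, y, min_pts=3):
    best = (np.inf, None)
    for i, j in combinations(range(min_pts, len(x) - min_pts + 1), 2):
        if j - i < min_pts:
            continue
        rss, lines = 0.0, []
        for seg in (slice(0, i), slice(i, j), slice(j, len(x))):
            r, c = segment_fit(x[seg], y[seg])
            rss += r
            lines.append(c)
        if rss < best[0]:
            best = (rss, (i, j, lines))
    return best

# Synthetic refraction data: three apparent velocities (slopes) plus noise.
x = np.linspace(0, 300, 31)                       # geophone offsets, m
y = np.piecewise(x, [x < 80, (x >= 80) & (x < 180), x >= 180],
                 [lambda d: d / 600.0,
                  lambda d: 80 / 600.0 + (d - 80) / 1800.0,
                  lambda d: 80 / 600.0 + 100 / 1800.0 + (d - 180) / 3000.0])
y += np.random.default_rng(3).normal(0, 0.002, x.size)

rss, (i, j, lines) = best_three_segment_fit(x, y)
print("breakpoints near", x[i], "m and", x[j], "m; slopes:",
      [round(c[0], 5) for c in lines])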

  7. Research on the correlation between corona current spectrum and audible noise spectrum of HVDC transmission line

    NASA Astrophysics Data System (ADS)

    Liu, Yingyi; Zhou, Lijuan; Liu, Yuanqing; Yuan, Haiwen; Ji, Liang

    2017-11-01

    Audible noise is closely related to corona current on a high voltage direct current (HVDC) transmission line. In this paper, we measured a large number of audible noise and corona current waveforms simultaneously on the largest outdoor HVDC corona cage in the world. By analyzing the experimental data, the related statistical regularities between the corona current spectrum and the audible noise spectrum were obtained. Furthermore, the generation mechanism of audible noise was analyzed theoretically, and the related mathematical expression between the audible noise spectrum and the corona current spectrum, which is valid for all of the measurement points in space, was established based on electro-acoustic conversion theory. Finally, combined with the obtained mathematical relation, the internal reasons for the statistical regularities appearing in the measured corona current and audible noise data were explained. The results of this paper not only present the statistical association regularities between the corona current spectrum and the audible noise spectrum on an HVDC transmission line, but also reveal the underlying reasons for these association regularities.
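    The paper's own electro-acoustic relation is not reproduced here; the snippet below only illustrates a generic way to quantify the statistical association between the two measured quantities, via their Welch spectra and magnitude-squared coherence, using synthetic stand-in signals.

# Generic illustration (not the authors' derived relation): estimate the spectra
# of simultaneously recorded corona current and audible noise and quantify their
# association through the magnitude-squared coherence.  Signals are synthetic.
import numpy as np
from scipy.signal import welch, coherence

fs = 48_000                                     # common sampling rate, Hz (assumed)
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(7)

corona_current = rng.standard_normal(t.size)                    # stand-in pulse train
audible_noise = 0.6 * np.roll(corona_current, 40) + 0.4 * rng.standard_normal(t.size)

f, Pcc = welch(corona_current, fs=fs, nperseg=4096)              # current spectrum
_, Paa = welch(audible_noise, fs=fs, nperseg=4096)               # noise spectrum
_, Cxy = coherence(corona_current, audible_noise, fs=fs, nperseg=4096)

band = (f > 100) & (f < 10_000)
print("mean coherence in the 0.1-10 kHz band: %.2f" % Cxy[band].mean())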

  8. Captures of Boll Weevils (Coleoptera: Curculionidae) in Relation to Trap Orientation and Distance From Brush Lines.

    PubMed

    Spurgeon, Dale W

    2016-04-01

    Eradication programs for the boll weevil (Anthonomus grandis grandis Boheman) rely on pheromone-baited traps to trigger insecticide treatments and monitor program progress. A key objective of monitoring in these programs is the timely detection of incipient weevil populations to limit or prevent re-infestation. Therefore, improvements in the effectiveness of trapping would enhance efforts to achieve and maintain eradication. Association of pheromone traps with woodlots and other prominent vegetation are reported to increase captures of weevils, but the spatial scale over which this effect occurs is unknown. The influences of trap distance (0, 10, and 20 m) and orientation (leeward or windward) to brush lines on boll weevil captures were examined during three noncropping seasons (October to February) in the Rio Grande Valley of Texas. Differences in numbers of captured weevils and in the probability of capture between traps at 10 or 20 m from brush, although often statistically significant, were generally small and variable. Variations in boll weevil population levels, wind directions, and wind speeds apparently contributed to this variability. In contrast, traps closely associated with brush (0 m) generally captured larger numbers of weevils, and offered a higher probability of weevil capture compared with traps away from brush. These increases in the probability of weevil capture were as high as 30%. Such increases in the ability of traps to detect low-level boll weevil populations indicate trap placement with respect to prominent vegetation is an important consideration in maximizing the effectiveness of trap-based monitoring for the boll weevil.

  9. On-line data analysis and monitoring for H1 drift chambers

    NASA Astrophysics Data System (ADS)

    Düllmann, Dirk

    1992-05-01

    The on-line monitoring, slow control and calibration of the H1 central jet chamber uses a VME multiprocessor system to perform the analysis and a connected Macintosh computer as a graphical interface for the operator on shift. Tasks of this system are: analysis of event data including on-line track search; on-line calibration from normal events and test-pulse events; control of the high voltage and monitoring of settings and currents; and monitoring of temperature, pressure and mixture of the chamber gas. A program package is described which controls the data flow between data acquisition, the different VME CPUs and the Macintosh. It allows off-line-style programs to be run for the different tasks.

  10. TreSpEx—Detection of Misleading Signal in Phylogenetic Reconstructions Based on Tree Information

    PubMed Central

    Struck, Torsten H

    2014-01-01

    Phylogenies of species or genes are commonplace nowadays in many areas of comparative biological studies. However, phylogenetic reconstructions can be confounded by artificial signals such as paralogy, long-branch attraction, saturation, or conflict between different datasets. These signals might mislead the reconstruction even in phylogenomic studies employing hundreds of genes. Unfortunately, until now no program has allowed the detection of such effects in combination with an implementation into automatic process pipelines. TreSpEx (Tree Space Explorer) now combines different approaches (including statistical tests), which utilize tree-based information like nodal support or patristic distances (PDs) to identify misleading signals. The program enables the parallel analysis of hundreds of trees and/or predefined gene partitions, and being command-line driven, it can be integrated into automatic process pipelines. TreSpEx is implemented in Perl and supported on Linux, Mac OS X, and MS Windows. Source code, binaries, and additional material are freely available at http://www.annelida.de/research/bioinformatics/software.html. PMID:24701118
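    One of the tree-based quantities mentioned above, the patristic distance (the sum of branch lengths along the path connecting two taxa), can be computed in a few lines. The sketch below is independent of the TreSpEx implementation and uses a tiny hypothetical tree.

# Minimal illustration (independent of the TreSpEx code) of one tree-based
# quantity the program exploits: the patristic distance between two taxa, i.e.
# the sum of branch lengths along the path connecting them.  The tree is given
# as a simple parent map with branch lengths (hypothetical example).
parent = {          # child: (parent, branch length)
    "A": ("n1", 0.10), "B": ("n1", 0.12),
    "n1": ("root", 0.05),
    "C": ("n2", 0.40), "D": ("n2", 0.38),        # long-branched clade
    "n2": ("root", 0.07),
    "root": (None, 0.0),
}

def path_to_root(node):
    """Distances from `node` to each of its ancestors (including itself)."""
    path, dist = {}, 0.0
    while node is not None:
        path[node] = dist
        par, length = parent[node]
        dist += length
        node = par
    return path

def patristic_distance(a, b):
    pa, pb = path_to_root(a), path_to_root(b)
    shared = [n for n in pa if n in pb]
    mrca = min(shared, key=lambda n: pa[n] + pb[n])   # most recent common ancestor
    return pa[mrca] + pb[mrca]

print(patristic_distance("A", "B"))   # 0.22, within the short-branched clade
print(patristic_distance("A", "C"))   # 0.62, across the root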

  11. Entomological efficacy of durable wall lining with reduced wall surface coverage for strengthening visceral leishmaniasis vector control in Bangladesh, India and Nepal.

    PubMed

    Huda, M Mamun; Kumar, Vijay; Das, Murari Lal; Ghosh, Debashis; Priyanka, Jyoti; Das, Pradeep; Alim, Abdul; Matlashewski, Greg; Kroeger, Axel; Alfonso-Sierra, Eduardo; Mondal, Dinesh

    2016-10-06

    New methods for controlling sand flies are highly desired by the Visceral Leishmaniasis (VL) elimination program of Bangladesh, India and Nepal for its consolidation and maintenance phases. To support the program, we investigated the safety, efficacy and cost of Durable Wall Lining (DWL) for sand fly control. This multicentre randomized controlled study in Bangladesh, India and Nepal included two randomized intervention clusters and one control cluster. Each cluster had 50 households, except the full wall surface coverage (DWL-FWSC) cluster in Nepal, which had 46 households. Ten of the 50 households were randomly selected for entomological activities, except in India where 6 households were selected. The interventions were DWL-FWSC and reduced wall surface coverage (DWL-RWSC) with DWL, covering wall heights of 1.8 m and 1.5 m from the floor, respectively. Efficacy was measured by the reduction in sand fly density under each intervention and by sand fly mortality assessed with the WHO cone bioassay test at 1 month after intervention. Trained field research assistants interviewed household heads for socio-demographic information, knowledge and practice about VL and vector control, and for their experience following the intervention. Cost data were collected using a cost data collection tool designed for this study. Statistical analysis included difference-in-differences estimation, bivariate analysis, Poisson regression modelling and incremental cost-efficacy ratio calculation. Mean sand fly density reduction by DWL-FWSC and DWL-RWSC was respectively -4.96 (95 % CI, -4.54, -5.38) and -5.38 (95 % CI, -4.89, -5.88). The sand fly density reductions attributed to both interventions were statistically significant after adjusting for covariates (IRR = 0.277, p < 0.001 for DWL-RWSC and IRR = 0.371, p < 0.001 for DWL-FWSC). The efficacy of DWL-RWSC and DWL-FWSC on sand fly density reduction was statistically comparable (p = 0.214). The acceptability of both interventions was high. Transient burning sensations, flushing of the face and itching were the most common adverse events and were observed mostly at the Indian site. There was no serious adverse event. DWL-RWSC is cost-saving compared to DWL-FWSC. The incremental cost-efficacy ratio was -6.36, where DWL-RWSC dominates DWL-FWSC. The DWL-RWSC intervention is safe, efficacious, cost-saving and cost-effective in reducing indoor sand fly density. The VL elimination program in the Indian sub-continent may consider DWL-RWSC for sand fly control for its consolidation and maintenance phases.

  12. Service life evaluation of rigid explosive transfer lines

    NASA Technical Reports Server (NTRS)

    Bement, L. J.; Kayser, E. G.; Schimmel, M. L.

    1983-01-01

    This paper describes a joint Army/NASA-sponsored research program on the service life evaluation of rigid explosive transfer lines. These transfer lines are used to initiate emergency crew escape functions on a wide variety of military and NASA aircraft. The purpose of this program was to determine quantitatively the effects of service, age, and degradation on rigid explosive transfer lines to allow responsible, conservative, service life determination. More than 800 transfer lines were removed from the U.S. Army AH-1G and AH-1S, the U.S. Air Force B-1 and F-111, and the U.S. Navy F-14 aircraft for testing. The results indicated that the lines were not adversely affected by age, service, or a repeat of the thermal qualification tests on full-service lines. Extension of the service life of rigid explosive transfer lines should be considered, since considerable cost savings could be realized with no measurable decrease in system reliability.

  13. R2 & NE Tract - 2010 Census; Housing and Population Summary

    EPA Pesticide Factsheets

    The TIGER/Line Files are shapefiles and related database files (.dbf) that are an extract of selected geographic and cartographic information from the U.S. Census Bureau's Master Address File / Topologically Integrated Geographic Encoding and Referencing (MAF/TIGER) Database (MTDB). The MTDB represents a seamless national file with no overlaps or gaps between parts, however, each TIGER/Line File is designed to stand alone as an independent data set, or they can be combined to cover the entire nation. Census tracts are small, relatively permanent statistical subdivisions of a county or equivalent entity, and were defined by local participants as part of the 2010 Census Participant Statistical Areas Program. The Census Bureau delineated the census tracts in situations where no local participant existed or where all the potential participants declined to participate. The primary purpose of census tracts is to provide a stable set of geographic units for the presentation of census data and comparison back to previous decennial censuses. Census tracts generally have a population size between 1,200 and 8,000 people, with an optimum size of 4,000 people. When first delineated, census tracts were designed to be homogeneous with respect to population characteristics, economic status, and living conditions. The spatial size of census tracts varies widely depending on the density of settlement. Physical changes in street patterns caused by highway construction, new

  14. BaCoCa--a heuristic software tool for the parallel assessment of sequence biases in hundreds of gene and taxon partitions.

    PubMed

    Kück, Patrick; Struck, Torsten H

    2014-01-01

    BaCoCa (BAse COmposition CAlculator) is a user-friendly software tool that combines multiple statistical approaches (like RCFV and C value calculations) to identify biases in aligned sequence data which potentially mislead phylogenetic reconstructions. As a result of its speed and flexibility, the program makes it possible to analyze hundreds of pre-defined gene partitions and taxon subsets in a single process run. BaCoCa is command-line driven and can be easily integrated into automatic process pipelines of phylogenomic studies. Moreover, given the tab-delimited output style, the results can easily be used for further analyses in programs like Excel or statistical packages like R. A built-in option of BaCoCa is the generation of heat maps with hierarchical clustering of certain results using R. As input files BaCoCa can handle FASTA and relaxed PHYLIP, which are commonly used in phylogenomic pipelines. BaCoCa is implemented in Perl and works on Windows PCs, Macs and Linux operating systems. The executable source code as well as example test files and a detailed documentation of BaCoCa are freely available at http://software.zfmk.de. Copyright © 2013 Elsevier Inc. All rights reserved.
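    As a rough sketch of one of the statistics mentioned above, the snippet below computes an RCFV-like value, assuming RCFV is the sum, over character states and taxa, of the absolute deviation of each taxon's state frequencies from the across-taxon mean, divided by the number of taxa; the exact definition implemented by BaCoCa should be taken from its documentation. The toy alignment is invented.

# Illustrative sketch, NOT BaCoCa itself.  It assumes RCFV is computed as the sum,
# over character states and taxa, of the absolute deviation of each taxon's state
# frequency from the mean frequency, divided by the number of taxa; check the
# BaCoCa paper/manual for the exact definition used by the program.
import numpy as np

alignment = {                       # toy nucleotide alignment (hypothetical)
    "taxon1": "ACGTACGTACGT",
    "taxon2": "ACGTACGTACGA",
    "taxon3": "AAAAACGTACGT",
    "taxon4": "ACGTACGGGGGT",
}
states = "ACGT"

freqs = np.array([[seq.count(s) / len(seq) for s in states]
                  for seq in alignment.values()])
mean_freq = freqs.mean(axis=0)
rcfv = np.abs(freqs - mean_freq).sum() / len(alignment)
print("RCFV-like value = %.4f" % rcfv)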

  15. Effectiveness of an Integrated Community- and Clinic-Based Intervention on HIV Testing, HIV Knowledge, and Sexual Risk Behavior of Young Men Who Have Sex With Men in Myanmar.

    PubMed

    Aung, Poe Poe; Ryan, Claire; Bajracharya, Ashish; Pasricha, Naanki; Thein, Zaw Win; Agius, Paul A; Sein, Than Tun; Willenberg, Lisa; Soe, Ei Mon; Zaw, Ne Tun; Tun, Waimar; Yam, Eileen; Luchters, Stanley

    2017-02-01

    Young men who have sex with men (YMSM) in Myanmar are disproportionately affected by HIV, with prevalence five times that of the general population. The Link Up project implemented an intervention using peer education and outreach providing education and counseling on health seeking around sexually transmitted infections and reproductive health, combined with focused clinic capacity building to improve the sexual and reproductive health of YMSM. This study aimed to evaluate the effectiveness and acceptability of the intervention. Using a mixed-methods approach, and employing a quasi-experimental design, we conducted two quantitative repeat cross-sectional surveys in purposively selected control (no intervention) and intervention townships, before and after implementation of the Link Up intervention. Respondent-driven sampling was used to recruit YMSM aged 15-24 years, and study participants were administered a structured questionnaire assessing intervention exposure, health service access, knowledge of HIV, and sexual risk behavior. Focus group discussions were held to elicit perspectives on the use and acceptability of the health services and peer outreach. At baseline, 314 YMSM were recruited in the intervention townships and 309 YMSM in the control townships. At end line, 267 (intervention) and 318 (control) YMSM were recruited. Coverage of the program was relatively low, with one-third of participants in the intervention townships having heard of the Link Up program by the end line. Comparing changes between baseline and end line, a greater proportion of HIV-negative or unknown status YMSM accessed HIV testing in the past 3 months in intervention townships (from 45.0% to 57.1%) compared with those in control townships (which remained at 29.0%); however, this difference in the effect over time was not statistically significant in multivariate modeling (adjusted odds ratio: 1.45; 95% confidence interval: 0.66-3.17). Qualitative findings showed that the intervention was acceptable to YMSM. Overall, the intervention was perceived as acceptable. Although not statistically significant, results showed some trends toward improvements among YMSM in accessing HIV testing services and HIV-related knowledge. The modest coverage and short time frame of the evaluation likely limit the ability to detect any significant behavioral improvements. Copyright © 2016 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  16. Investigation of LANDSAT D Thematic Mapper geometric performance: Line to line and band to band registration. [Toulouse, France and Mississippi, U.S.A.

    NASA Technical Reports Server (NTRS)

    Begni, G.; BOISSIN; Desachy, M. J.; PERBOS

    1984-01-01

    The geometric accuracy of LANDSAT TM raw data of Toulouse (France), raw data of Mississippi, and preprocessed data of Mississippi was examined using a CDC computer. Analog images were restituted on the VIZIR SEP device. The methods used for line to line and band to band registration are based on automatic correlation techniques and are widely used in automated image to image registration at CNES. Causes of intraband and interband misregistration are identified and statistics are given for both line to line and band to band misregistration.
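
    As a rough illustration of the correlation techniques referred to above, the sketch below estimates a one-dimensional mis-registration between two scan lines by locating the peak of their cross-correlation. It is a generic Python example, not the CNES implementation; the simulated data and the three-pixel shift are invented for the demonstration.

      # Estimate the integer pixel shift between two scan lines from the peak
      # of their normalized cross-correlation (illustrative only).
      import numpy as np

      def estimate_shift(line_a, line_b):
          """Return k such that line_a[n] is approximately line_b[n - k]."""
          a = (line_a - line_a.mean()) / line_a.std()
          b = (line_b - line_b.mean()) / line_b.std()
          corr = np.correlate(a, b, mode="full")
          return int(np.argmax(corr)) - (len(b) - 1)

      rng = np.random.default_rng(0)
      reference = rng.normal(size=512)
      shifted = np.roll(reference, 3)            # simulate a 3-pixel misregistration
      print(estimate_shift(shifted, reference))  # expected to print 3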

  17. Volcano plots in analyzing differential expressions with mRNA microarrays.

    PubMed

    Li, Wentian

    2012-12-01

    A volcano plot displays unstandardized signal (e.g. log-fold-change) against noise-adjusted/standardized signal (e.g. t-statistic or -log(10)(p-value) from the t-test). We review the basic and interactive use of the volcano plot and its crucial role in understanding the regularized t-statistic. The joint filtering gene selection criterion based on regularized statistics has a curved discriminant line in the volcano plot, as compared to the two perpendicular lines for the "double filtering" criterion. This review attempts to provide a unifying framework for discussions on alternative measures of differential expression, improved methods for estimating variance, and visual display of a microarray analysis result. We also discuss the possibility of applying volcano plots to other fields beyond microarray.
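
    To make the geometry concrete, the short sketch below builds a volcano plot from simulated expression data; the dashed vertical and horizontal lines reproduce the perpendicular "double filtering" thresholds discussed above. The data, thresholds, and output file name are illustrative assumptions.

      # Volcano plot from simulated expression data: unstandardized signal
      # (log fold change) on x, standardized signal (-log10 p-value) on y.
      import numpy as np
      import matplotlib.pyplot as plt
      from scipy import stats

      rng = np.random.default_rng(1)
      n_genes, n_per_group = 2000, 5
      group_a = rng.normal(0.0, 1.0, size=(n_genes, n_per_group))
      group_b = rng.normal(0.0, 1.0, size=(n_genes, n_per_group))
      group_b[:100] += 1.5                       # spike in 100 "regulated" genes

      log_fc = group_b.mean(axis=1) - group_a.mean(axis=1)   # data assumed on log scale
      t_stat, p_val = stats.ttest_ind(group_b, group_a, axis=1)

      plt.scatter(log_fc, -np.log10(p_val), s=5, alpha=0.5)
      plt.axhline(-np.log10(0.05), linestyle="--")            # p-value cutoff
      plt.axvline(1.0, linestyle="--")                        # fold-change cutoffs
      plt.axvline(-1.0, linestyle="--")
      plt.xlabel("log2 fold change")
      plt.ylabel("-log10(p-value)")
      plt.savefig("volcano.png", dpi=150)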

  18. Characterization of Capsicum annuum Genetic Diversity and Population Structure Based on Parallel Polymorphism Discovery with a 30K Unigene Pepper GeneChip

    PubMed Central

    Hill, Theresa A.; Ashrafi, Hamid; Reyes-Chin-Wo, Sebastian; Yao, JiQiang; Stoffel, Kevin; Truco, Maria-Jose; Kozik, Alexander; Michelmore, Richard W.; Van Deynze, Allen

    2013-01-01

    The widely cultivated pepper, Capsicum spp., important as a vegetable and spice crop world-wide, is one of the most diverse crops. To enhance breeding programs, a detailed characterization of Capsicum diversity including morphological, geographical and molecular data is required. Currently, molecular data characterizing Capsicum genetic diversity is limited. The development and application of high-throughput genome-wide markers in Capsicum will facilitate more detailed molecular characterization of germplasm collections, genetic relationships, and the generation of ultra-high density maps. We have developed the Pepper GeneChip® array from Affymetrix for polymorphism detection and expression analysis in Capsicum. Probes on the array were designed from 30,815 unigenes assembled from expressed sequence tags (ESTs). Our array design provides a maximum redundancy of 13 probes per base pair position allowing integration of multiple hybridization values per position to detect single position polymorphism (SPP). Hybridization of genomic DNA from 40 diverse C. annuum lines, used in breeding and research programs, and a representative from three additional cultivated species (C. frutescens, C. chinense and C. pubescens) detected 33,401 SPP markers within 13,323 unigenes. Among the C. annuum lines, 6,426 SPPs covering 3,818 unigenes were identified. An estimated three-fold reduction in diversity was detected in non-pungent compared with pungent lines; however, we were able to detect 251 highly informative markers across these C. annuum lines. In addition, an 8.7 cM region without polymorphism was detected around Pun1 in non-pungent C. annuum. An analysis of genetic relatedness and diversity using the software Structure revealed clustering of the germplasm, which was confirmed with statistical support by principal components analysis (PCA) and phylogenetic analysis. This research demonstrates the effectiveness of parallel high-throughput discovery and application of genome-wide transcript-based markers to assess genetic and genomic features among Capsicum annuum. PMID:23409153

  19. Characterization of Capsicum annuum genetic diversity and population structure based on parallel polymorphism discovery with a 30K unigene Pepper GeneChip.

    PubMed

    Hill, Theresa A; Ashrafi, Hamid; Reyes-Chin-Wo, Sebastian; Yao, JiQiang; Stoffel, Kevin; Truco, Maria-Jose; Kozik, Alexander; Michelmore, Richard W; Van Deynze, Allen

    2013-01-01

    The widely cultivated pepper, Capsicum spp., important as a vegetable and spice crop world-wide, is one of the most diverse crops. To enhance breeding programs, a detailed characterization of Capsicum diversity including morphological, geographical and molecular data is required. Currently, molecular data characterizing Capsicum genetic diversity is limited. The development and application of high-throughput genome-wide markers in Capsicum will facilitate more detailed molecular characterization of germplasm collections, genetic relationships, and the generation of ultra-high density maps. We have developed the Pepper GeneChip® array from Affymetrix for polymorphism detection and expression analysis in Capsicum. Probes on the array were designed from 30,815 unigenes assembled from expressed sequence tags (ESTs). Our array design provides a maximum redundancy of 13 probes per base pair position allowing integration of multiple hybridization values per position to detect single position polymorphism (SPP). Hybridization of genomic DNA from 40 diverse C. annuum lines, used in breeding and research programs, and a representative from three additional cultivated species (C. frutescens, C. chinense and C. pubescens) detected 33,401 SPP markers within 13,323 unigenes. Among the C. annuum lines, 6,426 SPPs covering 3,818 unigenes were identified. An estimated three-fold reduction in diversity was detected in non-pungent compared with pungent lines; however, we were able to detect 251 highly informative markers across these C. annuum lines. In addition, an 8.7 cM region without polymorphism was detected around Pun1 in non-pungent C. annuum. An analysis of genetic relatedness and diversity using the software Structure revealed clustering of the germplasm, which was confirmed with statistical support by principal components analysis (PCA) and phylogenetic analysis. This research demonstrates the effectiveness of parallel high-throughput discovery and application of genome-wide transcript-based markers to assess genetic and genomic features among Capsicum annuum.

  20. Development of GUI Type On-Line Condition Monitoring Program for a Turboprop Engine Using Labview

    NASA Astrophysics Data System (ADS)

    Kong, Changduk; Kim, Keonwoo

    2011-12-01

    Recently, an aero gas turbine health monitoring system has been developed for precaution and maintenance action against faults or performance degradations of the advanced propulsion system which occur in severe environments such as high altitude, foreign object damage particles, hot and heavy rain, and snowy atmospheric conditions. However, to establish this health monitoring system, an on-line condition monitoring program is required first, and the program must monitor the engine performance trend through comparison between measured engine performance data and base performance results calculated by a base engine performance model. This work aims to develop a GUI type on-line condition monitoring program for the PT6A-67 turboprop engine of a high-altitude, long-endurance UAV using LabVIEW. The base engine performance of the on-line condition monitoring program is simulated using component maps inversely generated from the limited performance deck data provided by the engine manufacturer. The base engine performance simulation program was validated by showing that its analysis results agree well with the performance deck data. The proposed on-line condition monitoring program can monitor the real engine performance as well as the trend through precise comparison between clean engine performance results calculated by the base performance simulation program and measured engine performance signals. In the development phase of this monitoring system, a signal generation module is proposed to evaluate the proposed on-line monitoring system. For user friendliness, all monitoring programs are coded in LabVIEW, and monitoring examples are demonstrated using the proposed GUI type on-line condition monitoring program.

  1. Spectral line and continuum studies using Haystack antenna

    NASA Technical Reports Server (NTRS)

    1973-01-01

    During the last half of 1972, the Haystack antenna was utilized 88% of the time. Of this useful time, 81% was devoted to radio astronomy investigations, 8% was spent on radar-related research and 11% was scheduled for maintenance and system improvements. Thirteen programs were completed of which 10 were spectral-line studies involving primarily recombination lines and H2O vapor investigations. The others involved 2 cm and 1.3 cm continuum observations. Fifteen new programs were accepted and the currently active radio observing programs totalled 24 as of 31 December 1973. The last radar measurements in the lunar topography program have now been completed. Radar activity, including measurements on Mercury, Venus and synchronous satellites has continued.

  2. Notes on numerical reliability of several statistical analysis programs

    USGS Publications Warehouse

    Landwehr, J.M.; Tasker, Gary D.

    1999-01-01

    This report presents a benchmark analysis of several statistical analysis programs currently in use in the USGS. The benchmark consists of a comparison between the values provided by a statistical analysis program for variables in the reference data set ANASTY and their known or calculated theoretical values. The ANASTY data set is an amendment of the Wilkinson NASTY data set that has been used in the statistical literature to assess the reliability (computational correctness) of calculated analytical results.

  3. Detection of nonlinear transfer functions by the use of Gaussian statistics

    NASA Technical Reports Server (NTRS)

    Sheppard, J. G.

    1972-01-01

    The possibility of using on-line signal statistics to detect electronic equipment nonlinearities is discussed. The results of an investigation using Gaussian statistics are presented, and a nonlinearity test that uses ratios of the moments of a Gaussian random variable is developed and discussed. An outline for further investigation is presented.
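
    The idea of a moment-ratio test can be sketched briefly. For a zero-mean Gaussian signal the fourth-to-second moment ratio E[y^4]/E[y^2]^2 equals 3 and is preserved by any linear transfer function, so departures from 3 in the output of equipment driven by Gaussian noise hint at nonlinearity. The example below is a generic illustration of that principle, not the specific test developed in the cited report.

      # Moment-ratio check for nonlinearity: for a Gaussian input through a
      # linear system the ratio stays near 3; a saturating nonlinearity lowers it.
      import numpy as np

      def moment_ratio(y):
          y = y - y.mean()
          return np.mean(y**4) / np.mean(y**2) ** 2

      rng = np.random.default_rng(2)
      x = rng.normal(size=200_000)

      linear_out = 2.0 * x                         # linear gain: ratio stays near 3
      clipped_out = np.clip(2.0 * x, -1.5, 1.5)    # saturating nonlinearity

      print(moment_ratio(linear_out))    # ~ 3.0
      print(moment_ratio(clipped_out))   # noticeably below 3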

  4. Environmental flow allocation and statistics calculator

    USGS Publications Warehouse

    Konrad, Christopher P.

    2011-01-01

    The Environmental Flow Allocation and Statistics Calculator (EFASC) is a computer program that calculates hydrologic statistics based on a time series of daily streamflow values. EFASC will calculate statistics for daily streamflow in an input file or will generate synthetic daily flow series from an input file based on rules for allocating and protecting streamflow and then calculate statistics for the synthetic time series. The program reads dates and daily streamflow values from input files. The program writes statistics out to a series of worksheets and text files. Multiple sites can be processed in series as one run. EFASC is written in Microsoft Visual Basic for Applications and implemented as a macro in Microsoft Office Excel 2007. EFASC is intended as a research tool for users familiar with computer programming. The code for EFASC is provided so that it can be modified for specific applications. All users should review how output statistics are calculated and recognize that the algorithms may not comply with conventions used to calculate streamflow statistics published by the U.S. Geological Survey.
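
    EFASC itself is a Visual Basic for Applications macro, but the kind of statistics it produces from a daily streamflow series is easy to illustrate. The sketch below, in Python, computes an annual mean flow, an annual 7-day minimum flow, and a few flow-duration percentiles; the input file name and column names are assumptions for the example and do not reflect EFASC's own input format or algorithms.

      # Illustrative daily-streamflow statistics (not EFASC's algorithms).
      import pandas as pd

      flows = pd.read_csv("daily_flow.csv", parse_dates=["date"], index_col="date")
      q = flows["discharge_cfs"]

      mean_annual = q.resample("YS").mean()              # calendar-year mean flow
      q7_min = q.rolling(7).mean().resample("YS").min()  # annual 7-day minimum flow
      duration = q.quantile([0.1, 0.5, 0.9])             # flow-duration percentiles

      print(mean_annual.head())
      print(q7_min.head())
      print(duration)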

  5. Calculation of Weibull strength parameters, Batdorf flaw density constants and related statistical quantities using PC-CARES

    NASA Technical Reports Server (NTRS)

    Szatmary, Steven A.; Gyekenyesi, John P.; Nemeth, Noel N.

    1990-01-01

    This manual describes the operation and theory of the PC-CARES (Personal Computer-Ceramic Analysis and Reliability Evaluation of Structures) computer program for the IBM PC and compatibles running the PC-DOS/MS-DOS or IBM/MS OS/2 (version 1.1 or higher) operating systems. The primary purpose of this code is to estimate Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities. Included in the manual is the description of the calculation of shape and scale parameters of the two-parameter Weibull distribution using the least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. The methods for detecting outliers and for calculating the Kolmogorov-Smirnov and the Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull line, as well as the techniques for calculating the Batdorf flaw-density constants, are also described.
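
    As a small illustration of the parameter-estimation step described above, the sketch below fits the two-parameter Weibull distribution to simulated fracture strengths by maximum likelihood. It is not PC-CARES and omits censoring, outlier detection, and the Batdorf analysis; the simulated strengths and their units are assumptions for the example.

      # Two-parameter Weibull strength-parameter estimation by maximum likelihood.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      true_shape, true_scale = 10.0, 400.0       # Weibull modulus, characteristic strength (MPa)
      strengths = true_scale * rng.weibull(true_shape, size=30)

      # floc=0 fixes the location at zero, giving the two-parameter form.
      shape, loc, scale = stats.weibull_min.fit(strengths, floc=0)
      print(f"estimated Weibull modulus m = {shape:.2f}")
      print(f"estimated characteristic strength = {scale:.1f} MPa")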

  6. A Fe K Line in GRB 970508

    NASA Astrophysics Data System (ADS)

    Protassov, R.; van Dyk, D.; Connors, A.; Kashyap, V.; Siemiginowska, A.

    2000-12-01

    We examine the x-ray spectrum of the afterglow of GRB 970508, analyzed for Fe line emission by Piro et al. (1999, ApJL, 514, L73). This is a difficult and extremely important measurement: the detection of x-ray afterglows from γ-ray bursts is at best a tricky business, relying on near-real-time satellite response to unpredictable events and a great deal of luck in catching a burst bright enough for a useful spectral analysis. Detecting a clear atomic (or cyclotron) line in the generally smooth and featureless afterglow (or burst) emission not only gives one of the few very specific keys to the physics local to the emission region, but also provides clues or confirmation of its distance (via redshift). Unfortunately, neither the likelihood ratio test nor the related F-statistic commonly used to detect spectral lines adheres to its nominal chi-square or F-distribution. Thus we begin by calibrating the F-statistic used in Piro et al. (1999, ApJL, 514, L73) via a simulation study. The simulation study relies on a completely specified source model, i.e. we do Monte Carlo simulations with all model parameters fixed (so-called "parametric bootstrapping"). Second, we employ the method of posterior predictive p-values to calibrate a LRT statistic while accounting for the uncertainty in the parameters of the source model. Our analysis reveals evidence for the Fe K line.
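
    The calibration idea can be sketched in a few lines: simulate many datasets from a fully specified null (no-line) model, recompute the likelihood-ratio statistic on each, and read the significance of the observed statistic from that simulated distribution rather than from a nominal chi-square law. The toy spectral model, counts, and line location below are invented for illustration and are not the GRB 970508 analysis.

      # Parametric-bootstrap calibration of a simple line-detection statistic.
      import numpy as np

      rng = np.random.default_rng(4)
      mu = np.full(20, 30.0)          # null model: flat continuum, known expectation
      line_bin = 7                    # candidate line location (fixed, for simplicity)

      def lrt(counts):
          """2 * log-likelihood ratio for adding a non-negative line at line_bin."""
          s = max(counts[line_bin] - mu[line_bin], 0.0)    # MLE of extra line counts
          if s == 0.0:
              return 0.0
          n, m = counts[line_bin], mu[line_bin]
          return 2.0 * (n * np.log((m + s) / m) - s)

      observed = rng.poisson(mu)
      observed[line_bin] += 12                             # pretend data with a bump
      t_obs = lrt(observed)

      sims = np.array([lrt(rng.poisson(mu)) for _ in range(5000)])   # null calibration
      p_value = np.mean(sims >= t_obs)
      print(f"LRT = {t_obs:.2f}, bootstrap p-value = {p_value:.4f}")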

  7. Tracking the Genetic Stability of a Honey Bee (Hymenoptera: Apidae) Breeding Program With Genetic Markers.

    PubMed

    Bourgeois, Lelania; Beaman, Lorraine

    2017-08-01

    A genetic stock identification (GSI) assay was developed in 2008 to distinguish Russian honey bees from other honey bee stocks that are commercially produced in the United States. Probability of assignment (POA) values have been collected and maintained since the stock release in 2008 to the Russian Honey Bee Breeders Association. These data were used to assess stability of the breeding program and the diversity levels of the contemporary breeding stock through comparison of POA values and genetic diversity parameters from the initial release to current values. POA values fluctuated throughout 2010-2016, but have recovered to statistically similar levels in 2016 (POA(2010) = 0.82, POA(2016) = 0.74; P = 0.33). Genetic diversity parameters (i.e., allelic richness and gene diversity) in 2016 also remained at similar levels when compared to those in 2010. Estimates of genetic structure revealed stability (FST(2009/2016) = 0.0058) with a small increase in the estimate of the inbreeding coefficient (FIS(2010) = 0.078, FIS(2016) = 0.149). The relationship among breeding lines, based on genetic distance measurement, was similar in 2008 and 2016 populations, but with increased homogeneity among lines (i.e., decreased genetic distance). This was expected based on the closed breeding system used for Russian honey bees. The successful application of the GSI assay in a commercial breeding program demonstrates the utility and stability of such technology to contribute to and monitor the genetic integrity of a breeding stock of an insect species. Published by Oxford University Press on behalf of Entomological Society of America 2017. This work is written by US Government employees and is in the public domain in the US.

  8. Effectiveness of a Computer-Based Training Program of Attention and Memory in Patients with Acquired Brain Damage

    PubMed Central

    Fernandez, Elizabeth; Bergado Rosado, Jorge A.; Rodriguez Perez, Daymi; Salazar Santana, Sonia; Torres Aguilar, Maydane; Bringas, Maria Luisa

    2017-01-01

    Many training programs have been designed using modern software to restore the impaired cognitive functions in patients with acquired brain damage (ABD). The objective of this study was to evaluate the effectiveness of a computer-based training program of attention and memory in patients with ABD, using a two-armed parallel group design, where the experimental group (n = 50) received cognitive stimulation using RehaCom software, and the control group (n = 30) received the standard (non-computerized) cognitive stimulation for eight weeks. In order to assess the possible cognitive changes after the treatment, a pre-post experimental design was employed using the following neuropsychological tests: Wechsler Memory Scale (WMS) and Trail Making Test A and B. The effectiveness of the training procedure was statistically significant (p < 0.05) when performance on these scales was compared before and after the training period, both within each patient and between the two groups. The training group had statistically significant (p < 0.001) changes in focused attention (Trail A), two subtests (digit span and logical memory), and the overall score of the WMS. Finally, we discuss the advantages of computerized training rehabilitation and further directions of this line of work. PMID:29301194

  9. Flight flutter testing technology at Grumman. [automated telemetry station for on line data reduction

    NASA Technical Reports Server (NTRS)

    Perangelo, H. J.; Milordi, F. W.

    1976-01-01

    Analysis techniques used in the automated telemetry station (ATS) for on line data reduction are encompassed in a broad range of software programs. Concepts that form the basis for the algorithms used are mathematically described. The control the user has in interfacing with various on line programs is discussed. The various programs are applied to an analysis of flight data which includes unimodal and bimodal response signals excited via a swept frequency shaker and/or random aerodynamic forces. A nonlinear response error modeling analysis approach is described. Preliminary results in the analysis of a hard spring nonlinear resonant system are also included.

  10. Computer program for calculating supersonic flow on the windward side conical delta wings by the method of lines

    NASA Technical Reports Server (NTRS)

    Klunker, E. B.; South, J. C., Jr.; Davis, R. M.

    1972-01-01

    A user's manual is presented for a program that calculates the supersonic flow on the windward side of conical delta wings with shock attached at the sharp leading edge by the method of lines. The program also has a limited capability for computing the flow about circular and elliptic cones at incidence. It provides information including the shock shape, flow field, isentropic surface-flow properties, and force coefficients. A description of the program operation, a sample computation, and a FORTRAN 4 program listing are included.

  11. In Vitro Comparison of Cytotoxicity of Four Root Canal Sealers on Human Gingival Fibroblasts

    PubMed Central

    Konjhodzic-Prcic, Alma; Gorduysus, Omer; Kucukkaya, Selen; Atila, Burcu; Muftuoglu, Sevda; Zeybek, Dilara

    2015-01-01

    The goal of this in vitro study was to evaluate the relative biocompatibility of four endodontic sealers on a culture of human fibroblasts through cytotoxicity testing. Materials and Methods: Four endodontic sealers were used: GuttaFlow (Roeko), a silicone-based sealer; AH Plus (De Trey-DENTSPLY), an epoxy resin-based sealer; Apexit (Vivadent), a calcium hydroxide-based sealer; and Endorez (Ultradent), a methacrylate-based sealer. Sealers were tested on primary cell lines of human gingival fibroblasts. Experiments were performed in laboratories of Hacettepe University in Ankara, Turkey and the Faculty of Dentistry, University of Sarajevo, Bosnia and Herzegovina. Cytotoxicity was determined using the WST-1 assay. Results: Results were analyzed with the SPSS 19 program. The Kolmogorov-Smirnov and Shapiro-Wilk tests and descriptive statistics were used, as well as the Kruskal-Wallis test, ANOVA, and t-test. According to our results, all four sealers showed different cytotoxic effects on the human gingival fibroblast cell culture, but all of them were only slightly cytotoxic. Conclusions: According to the results of this study, it can be concluded that all four sealers showed different cytotoxic effects on primary cell lines of human gingival fibroblasts, but all of them were only slightly cytotoxic. PMID:25870472

  12. AOIPS 3 user's guide. Volume 2: Program descriptions

    NASA Technical Reports Server (NTRS)

    Schotz, Steve S.; Piper, Thomas S.; Negri, Andrew J.

    1990-01-01

    The Atmospheric and Oceanographic Information Processing System (AOIPS) 3 is the version of the AOIPS software as of April 1989. The AOIPS software was developed jointly by the Goddard Space Flight Center and General Sciences Corporation. A detailed description of every AOIPS program is presented. It is intended to serve as a reference for such items as program functionality, program operational instructions, and input/output variable descriptions. Program descriptions are derived from the on-line help information. Each program description is divided into two sections. The functional description section describes the purpose of the program and contains any pertinent operational information. The program description section lists the program variables as they appear on-line, and describes them in detail.

  13. Space Discovery: Teaching with Space. Evaluation: Summer, Fall 1998 Programs

    NASA Technical Reports Server (NTRS)

    Ewell, Bob

    1998-01-01

    This is the final report of the 1998 NASA-sponsored evaluation of the effectiveness of the United States Space Foundation's five-day Space Discovery Standard Graduate Course (Living and Working in Space), the five-day Space Discovery Advanced Graduate Course (Advanced Technology and Biomedical Research), and the five-day introductory course Aviation and Space Basics, all conducted during the summer of 1998, and the Teaching with Space two-day Inservice program. The purpose of the program is to motivate and equip K-12 teachers to use proven student-attracting space and technology concepts to support standard curriculum. These programs support the America 2000 National Educational Goals, encouraging more students to stay in school, increase their competence, and have a better opportunity to be attracted to math and science. The 1998 research program continues the comprehensive evaluation begun in 1992, this year studying five summer five-day sessions and five Inservice programs offered during the fall of 1998 in California, Colorado, New York, and Virginia. A comprehensive research design by Dr. Robert Ewell of Creative Solutions and Dr. Darwyn Linder of Arizona State University evaluated the effectiveness of various areas of the program and its applicability to diverse groups. The preliminary research methodology was a set of survey instruments administered after the courses, with another to be sent in April, 4-5 months following the last Inservice involved in this study. This year, we departed from this evaluation design in two ways. First, the five-day programs used NASA's new EDCATS on-line system and associated survey rather than the Linder/Ewell instruments. The Inservice programs were evaluated using the previously developed survey adapted for Inservice programs. Second, we did not do a follow-on survey of the teachers after they had been in the field as we have done in the past. Therefore, this evaluation captures only the reactions of the teachers to the programs immediately after the instruction. Although EDCATS is designed for teachers to enter their data logged onto the appropriate internet web site, most surveys were completed on a printed copy and entered into the EDCATS system by USSF personnel. The Aviation and Space Basics class was taken to a computer lab where the participants responded to the survey on-line. Data from the Inservice surveys were manually entered into a computer spreadsheet program by US Space Foundation personnel and processed using the statistical program SPSS for Windows. The raw data and a copy of the EDCATS survey for the five-day programs are in Appendix 1. The raw data and a copy of the Inservice survey are in Appendix 2. Comments from both programs are in Appendices 3 and 4, respectively.

  14. ZED- A LINE EDITOR FOR THE DEC VAX

    NASA Technical Reports Server (NTRS)

    Scott, P. J.

    1994-01-01

    The ZED editor for the DEC VAX is a simple, yet powerful line editor for text, program source code, and non-binary data. Line editors can be superior to screen editors in some cases, such as executing complex multiple or conditional commands, or editing via slow modem lines. ZED excels in the area of text processing by using procedure files. For example, such procedures can reformat a file of addresses or remove all comment lines from a FORTRAN program. In addition to command files, ZED also features versatile search qualifiers, global changes, conditionals, on-line help, hexadecimal mode, space compression, looping, logical combinations of search strings, journaling, visible control characters, and automatic detabbing. The ZED editor was originally developed at Cambridge University in London and has been continuously enhanced since 1976. Users of the Cambridge implementation have devised such elaborate ZED procedures as chess games, calculators, and programs for evaluating Pi. This implementation of ZED strives to maintain the characteristics of the Cambridge editor. A complete ZED manual is included on the tape. ZED is written entirely in C for either batch or interactive execution on the DEC VAX under VMS 4.X and requires 80,896 bytes of memory. This program was released in 1988 and updated in 1989.

  15. The Role of Managers in Employee Wellness Programs: A Mixed-Methods Study.

    PubMed

    Passey, Deborah G; Hammerback, Kristen; Huff, Aaron; Harris, Jeffrey R; Hannon, Peggy A

    2018-01-01

    The purpose of this study is to evaluate managers' barriers and facilitators to supporting employee participation in the Washington State Wellness program. Exploratory sequential mixed methods. Four Washington State agencies located in Olympia and Tumwater, Washington. State employees in management positions (executive, middle, and line), whose job includes supervision of subordinates and responsibility for the performance and conduct of a subunit or group. We interviewed 23 managers and then used the results to create a survey that was fielded to all managers at the 4 agencies. The survey response rate was 65% (n = 607/935). We used qualitative coding techniques to analyze interview transcripts and descriptive statistics to summarize survey data. We used the Total Worker Health framework to organize our findings and conclusions. Managers support the wellness program, but they also face challenges with accommodating employees' participation due to workload, scheduling inflexibility, and self-efficacy to discuss wellness with direct reports. About half the managers receive support from the manager above them, and most have not received training on the wellness program. Our findings point to several strategies that can strengthen managers' role in supporting the wellness program: the provision of training, targeted messages, formal expectations, and encouragement (from the manager above) to support employees' participation.

  16. AI-BL1.0: a program for automatic on-line beamline optimization using the evolutionary algorithm.

    PubMed

    Xi, Shibo; Borgna, Lucas Santiago; Zheng, Lirong; Du, Yonghua; Hu, Tiandou

    2017-01-01

    In this report, AI-BL1.0, an open-source LabVIEW-based program for automatic on-line beamline optimization, is presented. The optimization algorithms used in the program are the Genetic Algorithm and Differential Evolution. Efficiency was improved by use of a strategy known as Observer Mode for Evolutionary Algorithm. The program was constructed and validated at the XAFCA beamline of the Singapore Synchrotron Light Source and the 1W1B beamline of the Beijing Synchrotron Radiation Facility.
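
    AI-BL1.0 itself is a LabVIEW program, but the underlying idea, treating detector flux as a black-box function of motor positions and letting an evolutionary algorithm search for the optimum, can be sketched as follows. The toy "beamline" response, motor names, and bounds are assumptions for illustration only, not the program's own interface.

      # Differential-evolution search for the motor settings that maximize flux.
      import numpy as np
      from scipy.optimize import differential_evolution

      def measured_flux(motors):
          """Toy stand-in for a flux readout: peaked response plus a little noise."""
          pitch, height = motors
          signal = np.exp(-((pitch - 0.12) ** 2 / 0.01 + (height - 2.3) ** 2 / 0.25))
          return signal + np.random.normal(scale=0.01)

      # Minimize the negative flux over the allowed motor ranges.
      result = differential_evolution(
          lambda m: -measured_flux(m),
          bounds=[(-1.0, 1.0), (0.0, 5.0)],
          maxiter=40, popsize=15, tol=1e-3, seed=5,
      )
      print("best motor settings:", result.x, "flux:", -result.fun)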

  17. Visual Data Analysis for Satellites

    NASA Technical Reports Server (NTRS)

    Lau, Yee; Bhate, Sachin; Fitzpatrick, Patrick

    2008-01-01

    The Visual Data Analysis Package is a collection of programs and scripts that facilitate visual analysis of data available from NASA and NOAA satellites, as well as dropsonde, buoy, and conventional in-situ observations. The package features utilities for data extraction, data quality control, statistical analysis, and data visualization. The Hierarchical Data Format (HDF) satellite data extraction routines from NASA's Jet Propulsion Laboratory were customized for specific spatial coverage and file input/output. Statistical analysis includes the calculation of the relative error, the absolute error, and the root mean square error. Other capabilities include curve fitting through the data points to fill in missing data points between satellite passes or where clouds obscure satellite data. For data visualization, the software provides customizable Generic Mapping Tool (GMT) scripts to generate difference maps, scatter plots, line plots, vector plots, histograms, timeseries, and color fill images.
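
    The error statistics mentioned above are straightforward to reproduce for matched satellite and in-situ pairs. The numbers in the sketch below are placeholders, not real satellite or buoy data, and the package itself uses its own extraction and GMT plotting scripts rather than this code.

      # Absolute error, relative error, and RMSE for matched observation pairs.
      import numpy as np

      satellite = np.array([28.1, 27.6, 29.0, 26.4])   # e.g. SST in deg C
      in_situ = np.array([27.8, 27.9, 28.5, 26.9])

      abs_err = np.abs(satellite - in_situ)
      rel_err = abs_err / np.abs(in_situ)
      rmse = np.sqrt(np.mean((satellite - in_situ) ** 2))

      print("mean absolute error:", abs_err.mean())
      print("mean relative error:", rel_err.mean())
      print("RMSE:", rmse)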

  18. A computer program designed to produce tables from alphanumeric data

    USGS Publications Warehouse

    Ridgley, Jennie L.; Schnabel, Robert Wayne

    1978-01-01

    This program is designed to produce tables from alphanumeric data. Each line of data that appears in the table is entered into a data file as a single line of data. Where necessary, a predetermined delimiter is added to break up the data into column data. The program can process the following types of data: (1) title, (2) headnote, (3) footnote, (4) two levels of column headers, (5) solid lines, (6) blank lines, (7) most types of numeric data, and (8) all types of alphanumeric data. In addition, the program can produce a series of continuation tables from large data sets. Fitting of all data to the final table format is performed by the program, although provisions have been made for user-modification of the final format. The width of the table is adjustable, but may not exceed 158 characters per line. The program is useful in that it permits alteration of original data or table format without having to physically retype all or portions of the table. The final results may be obtained quickly using interactive terminals, and execution of the program requires only minimal knowledge of computer usage. Tables produced may be of publishable quality, especially when reduced. Complete user documentation and program listing are included. NOTE: Although this program has been subjected to many tests a warranty on accuracy or proper functioning is neither implied nor expressed.

  19. A Prototype System for Retrieval of Gene Functional Information

    PubMed Central

    Folk, Lillian C.; Patrick, Timothy B.; Pattison, James S.; Wolfinger, Russell D.; Mitchell, Joyce A.

    2003-01-01

    Microarrays allow researchers to gather data about the expression patterns of thousands of genes simultaneously. Statistical analysis can reveal which genes show statistically significant results. Making biological sense of those results requires the retrieval of functional information about the genes thus identified, typically a manual gene-by-gene retrieval of information from various on-line databases. For experiments generating thousands of genes of interest, retrieval of functional information can become a significant bottleneck. To address this issue, we are currently developing a prototype system to automate the process of retrieval of functional information from multiple on-line sources. PMID:14728346

  20. English Collocation Learning through Corpus Data: On-Line Concordance and Statistical Information

    ERIC Educational Resources Information Center

    Ohtake, Hiroshi; Fujita, Nobuyuki; Kawamoto, Takeshi; Morren, Brian; Ugawa, Yoshihiro; Kaneko, Shuji

    2012-01-01

    We developed an English Collocations On Demand system offering on-line corpus and concordance information to help Japanese researchers acquire a better command of English collocation patterns. The Life Science Dictionary Corpus consists of approximately 90,000,000 words collected from life science related research papers published in academic…

  1. National Survey of US academic anesthesiology chairs on clinician wellness.

    PubMed

    Vinson, Amy E; Zurakowski, David; Randel, Gail I; Schlecht, Kathy D

    2016-11-01

    The prevalence of anesthesiology department wellness programs is unknown. A database of wellness programs is needed as a resource for departments attempting to respond to the Accreditation Council for Graduate Medical Education Anesthesiology Milestones Project. The purpose of this study was to survey academic anesthesiology chairs on wellness issues, characterize initiatives, and establish wellness contacts for a Wellness Initiative Database (WID). An Internet-based survey instrument was distributed to academic anesthesiology department chairs in the United States. On-line. None. None. Analysis for continuous variables used standard means, modes, and averages for individual responses; 95% confidence intervals for proportions were calculated by Wilson's method. Seventy-five (56.4%) responses (of a potential 133 programs) were obtained. Forty-one (of 71 responders; 57.8%) expressed interest in participating in a WID, and 33 (44%) provided contact information. Most (74.7%) had recently referred staff for counseling or wellness resources, yet many (79.5% and 67.1%, respectively) had never surveyed their department's interest in wellness resources. Thirty-four percent had a wellness resources repository. Of 22 wellness topics, 8 garnered >60% strong interest from respondents: Addiction Counseling, Sleep Hygiene, Peer Support Program, Stress Management, Conflict Management, Burnout Counseling, Time Management, and Dealing with Adverse Events Training. There was a statistically significant difference in interest between those willing to participate or not in the WID across most topics but no significant difference based on need for recent staff referral. The majority of chairs needed to recently refer a department member to wellness resources or counseling. Most were interested in participating in a WID, whereas a minority had gauged staff interest in wellness topics or had a wellness resource repository. Highest interest was in topics most related to function as an anesthesiologist. Those willing to participate in the database had statistically significant differences in interest across most wellness topics. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Dynamic Tunnel Usability Study: Format Recommendations for Synthetic Vision System Primary Flight Displays

    NASA Technical Reports Server (NTRS)

    Arthur, Jarvis J., III; Prinzel, Lawrence J., III; Kramer, Lynda J.; Bailey, Randall E.

    2006-01-01

    A usability study evaluating dynamic tunnel concepts has been completed under the Aviation Safety and Security Program, Synthetic Vision Systems Project. The usability study was conducted in the Visual Imaging Simulator for Transport Aircraft Systems (VISTAS) III simulator in the form of questionnaires and pilot-in-the-loop simulation sessions. Twelve commercial pilots participated in the study to determine their preferences via paired comparisons and subjective rankings regarding the color, line thickness and sensitivity of the dynamic tunnel. The results of the study showed that color was not significant in pilot preference paired comparisons or in pilot rankings. Line thickness was significant for both pilot preference paired comparisons and in pilot rankings. The preferred line/halo thickness combination was a line width of 3 pixels and a halo of 4 pixels. Finally, pilots were asked their preference for the current dynamic tunnel compared to a less sensitive dynamic tunnel. The current dynamic tunnel constantly gives feedback to the pilot with regard to path error while the less sensitive tunnel only changes as the path error approaches the edges of the tunnel. The tunnel sensitivity comparison results were not statistically significant.

  3. Ordering statistics of four random walkers on a line

    NASA Astrophysics Data System (ADS)

    Helenbrook, Brian; ben-Avraham, Daniel

    2018-05-01

    We study the ordering statistics of four random walkers on the line, obtaining a much improved estimate for the long-time decay exponent of the probability that a particle leads to time t, P_lead(t) ~ t^(-0.91287850), and that a particle lags to time t (never assumes the lead), P_lag(t) ~ t^(-0.30763604). Exponents of several other ordering statistics for N = 4 walkers are obtained to eight-digit accuracy as well. The subtle correlations between n walkers that lag jointly, out of a field of N, are discussed: for N = 3 there are no correlations and P_lead(t) ~ P_lag(t)^2. In contrast, our results rule out the possibility that P_lead(t) ~ P_lag(t)^3 for N = 4, although the correlations in this borderline case are tiny.
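
    A brute-force Monte Carlo sketch of the leading-walker survival probability for N = 4 is shown below. The initial offset and lattice steps are simplifying assumptions, and the finite-time fit only illustrates the power-law decay; it will not reproduce the eight-digit exponents quoted above.

      # Monte Carlo estimate of the probability that walker 0 leads up to time t.
      import numpy as np

      rng = np.random.default_rng(6)
      n_runs, t_max = 10_000, 300

      steps = rng.choice(np.array([-1, 1], dtype=np.int8), size=(n_runs, t_max, 4))
      paths = steps.cumsum(axis=1, dtype=np.int32)
      paths[:, :, 0] += 1                      # walker 0 starts one lattice site ahead

      still_leading = (paths[:, :, :1] > paths[:, :, 1:]).all(axis=2)
      alive = np.logical_and.accumulate(still_leading, axis=1)
      p_lead = alive.mean(axis=0)              # survival probability versus time

      t = np.arange(30, t_max)                 # fit the tail on a log-log scale
      slope = np.polyfit(np.log(t), np.log(p_lead[t]), 1)[0]
      print(f"estimated decay exponent ~ {slope:.2f}  (literature value: -0.9129)")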

  4. OSO 8 observational limits to the acoustic coronal heating mechanism

    NASA Technical Reports Server (NTRS)

    Bruner, E. C., Jr.

    1981-01-01

    An improved analysis of time-resolved line profiles of the C IV resonance line at 1548 A has been used to test the acoustic wave hypothesis of solar coronal heating. It is shown that the observed motions and brightness fluctuations are consistent with the existence of acoustic waves. Specific account is taken of the effect of photon statistics on the observed velocities, and a test is devised to determine whether the motions represent propagating or evanescent waves. It is found that, on average, about as much energy is carried upward as downward, such that the net acoustic flux density is statistically consistent with zero. The statistical uncertainty in this null result is three orders of magnitude lower than the flux level needed to heat the corona.

  5. TRAIL-induced programmed necrosis as a novel approach to eliminate tumor cells

    PubMed Central

    2014-01-01

    Background The cytokine TRAIL represents one of the most promising candidates for the apoptotic elimination of tumor cells, either alone or in combination therapies. However, its efficacy is often limited by intrinsic or acquired resistance of tumor cells to apoptosis. Programmed necrosis is an alternative, molecularly distinct mode of programmed cell death that is elicited by TRAIL under conditions when the classical apoptosis machinery fails or is actively inhibited. The potential of TRAIL-induced programmed necrosis in tumor therapy is, however, almost completely uncharacterized. We therefore investigated its impact on a panel of tumor cell lines of wide-ranging origin. Methods Cell death/viability was measured by flow cytometry/determination of intracellular ATP levels/crystal violet staining. Cell surface expression of TRAIL receptors was detected by flow cytometry, expression of proteins by Western blot. Ceramide levels were quantified by high-performance thin layer chromatography and densitometric analysis, clonogenic survival of cells was determined by crystal violet staining or by soft agarose cloning. Results TRAIL-induced programmed necrosis killed eight out of 14 tumor cell lines. Clonogenic survival was reduced in all sensitive and even one resistant cell lines tested. TRAIL synergized with chemotherapeutics in killing tumor cell lines by programmed necrosis, enhancing their effect in eight out of 10 tested tumor cell lines and in 41 out of 80 chemotherapeutic/TRAIL combinations. Susceptibility/resistance of the investigated tumor cell lines to programmed necrosis seems to primarily depend on expression of the pro-necrotic kinase RIPK3 rather than the related kinase RIPK1 or cell surface expression of TRAIL receptors. Furthermore, interference with production of the lipid ceramide protected all tested tumor cell lines. Conclusions Our study provides evidence that TRAIL-induced programmed necrosis represents a feasible approach for the elimination of tumor cells, and that this treatment may represent a promising new option for the future development of combination therapies. Our data also suggest that RIPK3 expression may serve as a potential predictive marker for the sensitivity of tumor cells to programmed necrosis and extend the previously established role of ceramide as a key mediator of death receptor-induced programmed necrosis (and thus as a potential target for future therapies) also to the tumor cell lines examined here. PMID:24507727

  6. STATISTICS AND INTELLIGENCE IN DEVELOPING COUNTRIES: A NOTE.

    PubMed

    Kodila-Tedika, Oasis; Asongu, Simplice A; Azia-Dimbu, Florentin

    2017-05-01

    The purpose of this study is to assess the relationship between intelligence (or human capital) and the statistical capacity of developing countries. The line of inquiry is motivated essentially by the scarce literature on poor statistics in developing countries and an evolving stream of literature on the knowledge economy. A positive association is established between intelligence quotient (IQ) and statistical capacity. The relationship is robust to alternative specifications with varying conditioning information sets and control for outliers. Policy implications are discussed.

  7. Report for Florida Community Colleges, 1983-1984. Part I: Statistical Tables.

    ERIC Educational Resources Information Center

    Florida State Dept. of Education, Tallahassee. Div. of Community Colleges.

    Statistical data are presented on student enrollments, academic programs, personnel and salaries, and finances for the Florida community colleges for 1983-84. A series of tables provide data on: (1) opening fall enrollment by class, program and student status; (2) fall enrollment headcount by age groups; (3) annual program headcount enrollment;…

  8. Statistical properties of links of network: A survey on the shipping lines of Worldwide Marine Transport Network

    NASA Astrophysics Data System (ADS)

    Zhang, Wenjun; Deng, Weibing; Li, Wei

    2018-07-01

    Node properties and node importance identification of networks have been vastly studied in the last decades. While in this work, we analyze the links' properties of networks by taking the Worldwide Marine Transport Network (WMTN) as an example, i.e., statistical properties of the shipping lines of WMTN have been investigated in various aspects: Firstly, we study the feature of loops in the shipping lines by defining the line saturability. It is found that the line saturability decays exponentially with the increase of line length. Secondly, to detect the geographical community structure of shipping lines, the Label Propagation Algorithm with compression of Flow (LPAF) and Multi-Dimensional Scaling (MDS) method are employed, which show rather consistent communities. Lastly, to analyze the redundancy property of shipping lines of different marine companies, the multilayer networks are constructed by aggregating the shipping lines of different marine companies. It is observed that the topological quantities, such as average degree, average clustering coefficient, etc., increase smoothly when marine companies are randomly merged (randomly choose two marine companies, then merge the shipping lines of them together), while the relative entropy decreases when the merging sequence is determined by the Jensen-Shannon distance (choose two marine companies when the Jensen-Shannon distance between them is the lowest). This indicates the low redundancy of shipping lines among different marine companies.
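
    The layer-merging step described above can be illustrated with a toy example: build the route graphs of two hypothetical companies, aggregate them, and track how the average degree and average clustering coefficient change. The port names and routes are invented and do not come from the WMTN data.

      # Merge two company "layers" and compare simple topological quantities.
      import networkx as nx

      company_a = nx.Graph([("Shanghai", "Singapore"), ("Singapore", "Rotterdam"),
                            ("Rotterdam", "Hamburg")])
      company_b = nx.Graph([("Shanghai", "Busan"), ("Busan", "Singapore"),
                            ("Singapore", "Rotterdam")])

      merged = nx.compose(company_a, company_b)   # aggregate the two route layers

      for name, g in [("A", company_a), ("B", company_b), ("A+B", merged)]:
          avg_deg = sum(d for _, d in g.degree()) / g.number_of_nodes()
          print(name, "average degree:", round(avg_deg, 2),
                "average clustering:", round(nx.average_clustering(g), 2))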

  9. GASPS--A Herschel Survey of Gas and Dust in Protoplanetary Disks: Summary and Initial Statistics

    NASA Technical Reports Server (NTRS)

    Dent, W.R.F.; Thi, W. F.; Kamp, I.; Williams, J. P.; Menard, F.; Andrews, S.; Ardila, D.; Aresu, G.; Augereau, J.-C.; Barrado y Navascues, D.; et al.

    2013-01-01

    We describe a large-scale far-infrared line and continuum survey of protoplanetary disks through to young debris disk systems carried out using the PACS instrument on the Herschel Space Observatory. This Open Time Key program, known as GASPS (Gas Survey of Protoplanetary Systems), targeted approximately 250 young stars in narrow wavelength regions covering the [OI] fine structure line at 63 micron, the brightest far-infrared line in such objects. A subset of the brightest targets was also surveyed in [OI] 145 micron and [CII] at 157 micron, as well as several transitions of H2O and high-excitation CO lines at selected wavelengths between 78 and 180 micron. Additionally, GASPS included continuum photometry at 70, 100 and 160 micron, around the peak of the dust emission. The targets were SED Class II-III T Tauri stars and debris disks from seven nearby young associations, along with a comparable sample of isolated Herbig AeBe stars. The aim was to study the global gas and dust content in a wide sample of circumstellar disks, combining the results with models in a systematic way. In this overview paper we review the scientific aims, target selection and observing strategy of the program. We summarize some of the initial results, showing line identifications, listing the detections, and giving a first statistical study of line detectability. The [OI] line at 63 micron was the brightest line seen in almost all objects, by a factor of 10. Overall [OI] 63 micron detection rates were 49%, with 100% of HAeBe stars and 43% of T Tauri stars detected. A comparison with published disk dust masses (derived mainly from sub-mm continuum, assuming standard values of the mm mass opacity) shows a dust mass threshold for [OI] 63 micron detection of approximately 10^-5 solar masses. Normalizing to a distance of 140 pc, 84% of objects with dust masses ≥10^-5 solar masses can be detected in this line in the present survey; 32% of those of mass 10^-6 to 10^-5 solar masses, and only a very small number of unusual objects with lower masses, can be detected. This is consistent with models with a moderate UV excess and disk flaring. For a given disk mass, [OI] detectability is lower for M stars compared with earlier spectral types. Both the continuum and line emission were, in most systems, spatially and spectrally unresolved and centered on the star, suggesting that the emission in most cases was from the disk. Approximately 10 objects showed resolved emission, most likely from outflows. In the GASPS sample, [OI] detection rates in T Tauri associations in the 0.3-4 Myr age range were approximately 50%. For each association in the 5-20 Myr age range, approximately 2 stars remain detectable in [OI] 63 micron, and no systems were detected in associations with age >20 Myr. Comparing with the total number of young stars in each association, and assuming an ISM-like gas/dust ratio, this indicates that approximately 18% of stars retain a gas-rich disk of total mass of approximately one Jupiter mass for 1-4 Myr, 1-7% keep such disks for 5-10 Myr, but none are detected beyond 10-20 Myr. The brightest [OI] objects from GASPS were also observed in [OI] 145 micron, [CII] 157 micron and CO J = 18-17, with detection rates of 20-40%. Detection of the [CII] line was not correlated with disk mass, suggesting it arises more commonly from a compact remnant envelope.

  10. ARS-Media for Excel: A Spreadsheet Tool for Calculating Media Recipes Based on Ion-Specific Constraints

    PubMed Central

    Niedz, Randall P.

    2016-01-01

    ARS-Media for Excel is an ion solution calculator that uses “Microsoft Excel” to generate recipes of salts for complex ion mixtures specified by the user. Generating salt combinations (recipes) that result in pre-specified target ion values is a linear programming problem. Excel’s Solver add-on solves the linear programming equation to generate a recipe. Calculating a mixture of salts to generate exact solutions of complex ionic mixtures is required for at least 2 types of problems– 1) formulating relevant ecological/biological ionic solutions such as those from a specific lake, soil, cell, tissue, or organ and, 2) designing ion confounding-free experiments to determine ion-specific effects where ions are treated as statistical factors. Using ARS-Media for Excel to solve these two problems is illustrated by 1) exactly reconstructing a soil solution representative of a loamy agricultural soil and, 2) constructing an ion-based experiment to determine the effects of substituting Na+ for K+ on the growth of a Valencia sweet orange nonembryogenic cell line. PMID:27812202
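
    The linear-programming formulation can also be sketched outside Excel. The example below, using SciPy rather than Excel's Solver, finds the cheapest (by total mass) combination of four salts that meets four ion targets exactly; the salts, stoichiometry, and target values form a small invented system for illustration, not an ARS-Media recipe.

      # Salt-recipe calculation posed as a linear program (illustrative values).
      import numpy as np
      from scipy.optimize import linprog

      salts = ["KNO3", "Ca(NO3)2", "KCl", "CaCl2"]
      molar_mass = np.array([101.10, 164.09, 74.55, 110.98])   # g/mol ~ mg/mmol

      # Rows: ions K, Ca, NO3, Cl; columns: mmol of ion per mmol of each salt.
      stoich = np.array([[1, 0, 1, 0],
                         [0, 1, 0, 1],
                         [1, 2, 0, 0],
                         [0, 0, 1, 2]], dtype=float)
      target_mmol = np.array([6.0, 2.0, 4.0, 6.0])             # K, Ca, NO3, Cl targets

      # Minimize total salt mass subject to meeting every ion target exactly.
      res = linprog(c=molar_mass, A_eq=stoich, b_eq=target_mmol,
                    bounds=[(0, None)] * 4, method="highs")

      for name, mmol in zip(salts, res.x):
          print(f"{name}: {mmol:.2f} mmol")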

  11. ARS-Media for Excel: A Spreadsheet Tool for Calculating Media Recipes Based on Ion-Specific Constraints.

    PubMed

    Niedz, Randall P

    2016-01-01

    ARS-Media for Excel is an ion solution calculator that uses "Microsoft Excel" to generate recipes of salts for complex ion mixtures specified by the user. Generating salt combinations (recipes) that result in pre-specified target ion values is a linear programming problem. Excel's Solver add-on solves the linear programming equation to generate a recipe. Calculating a mixture of salts to generate exact solutions of complex ionic mixtures is required for at least 2 types of problems- 1) formulating relevant ecological/biological ionic solutions such as those from a specific lake, soil, cell, tissue, or organ and, 2) designing ion confounding-free experiments to determine ion-specific effects where ions are treated as statistical factors. Using ARS-Media for Excel to solve these two problems is illustrated by 1) exactly reconstructing a soil solution representative of a loamy agricultural soil and, 2) constructing an ion-based experiment to determine the effects of substituting Na+ for K+ on the growth of a Valencia sweet orange nonembryogenic cell line.

  12. On-line failure detection and damping measurement of aerospace structures by random decrement signatures

    NASA Technical Reports Server (NTRS)

    Cole, H. A., Jr.

    1973-01-01

    Random decrement signatures of structures vibrating in a random environment are studied through use of computer-generated and experimental data. Statistical properties obtained indicate that these signatures are stable in form and scale and hence should have wide application in on-line failure detection and damping measurement. On-line procedures are described and equations for estimating record-length requirements to obtain signatures of a prescribed precision are given.
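
    The random decrement idea can be sketched as follows: every time the randomly excited response up-crosses a trigger level, a segment of fixed length is captured, and the ensemble average of those segments approximates the structure's free-decay signature. The simulated oscillator and trigger settings below are assumptions for illustration, not the procedures of the cited report.

      # Random decrement sketch: average response segments following trigger crossings.
      import numpy as np

      def randomdec(y, trigger, seg_len):
          """Average all segments of y that start at an up-crossing of trigger."""
          starts = np.where((y[:-1] < trigger) & (y[1:] >= trigger))[0] + 1
          starts = starts[starts + seg_len <= len(y)]
          if len(starts) == 0:
              raise ValueError("no trigger crossings found")
          return np.mean([y[s:s + seg_len] for s in starts], axis=0)

      # Simulate a lightly damped oscillator driven by white noise (Euler stepping).
      rng = np.random.default_rng(7)
      dt, wn, zeta = 0.01, 2 * np.pi * 2.0, 0.02
      y, v = np.zeros(100_000), 0.0
      for i in range(1, len(y)):
          a = -2 * zeta * wn * v - wn**2 * y[i - 1] + 50.0 * rng.normal()
          v += a * dt
          y[i] = y[i - 1] + v * dt

      signature = randomdec(y, trigger=y.std(), seg_len=500)
      print("signature length:", len(signature),
            "starting value near trigger level:", round(signature[0], 3))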

  13. The National Streamflow Statistics Program: A Computer Program for Estimating Streamflow Statistics for Ungaged Sites

    USGS Publications Warehouse

    Ries, Kernell G. (compiler); with sections by Atkins, J. B.; Hummel, P.R.; Gray, Matthew J.; Dusenbury, R.; Jennings, M.E.; Kirby, W.H.; Riggs, H.C.; Sauer, V.B.; Thomas, W.O.

    2007-01-01

    The National Streamflow Statistics (NSS) Program is a computer program that should be useful to engineers, hydrologists, and others for planning, management, and design applications. NSS compiles all current U.S. Geological Survey (USGS) regional regression equations for estimating streamflow statistics at ungaged sites in an easy-to-use interface that operates on computers with Microsoft Windows operating systems. NSS expands on the functionality of the USGS National Flood Frequency Program, and replaces it. The regression equations included in NSS are used to transfer streamflow statistics from gaged to ungaged sites through the use of watershed and climatic characteristics as explanatory or predictor variables. Generally, the equations were developed on a statewide or metropolitan-area basis as part of cooperative study programs. Equations are available for estimating rural and urban flood-frequency statistics, such as the 100-year flood, for every state, for Puerto Rico, and for the island of Tutuila, American Samoa. Equations are available for estimating other statistics, such as the mean annual flow, monthly mean flows, flow-duration percentiles, and low-flow frequencies (such as the 7-day, 10-year low flow) for less than half of the states. All equations available for estimating streamflow statistics other than flood-frequency statistics assume rural (non-regulated, non-urbanized) conditions. The NSS output provides indicators of the accuracy of the estimated streamflow statistics. The indicators may include any combination of the standard error of estimate, the standard error of prediction, the equivalent years of record, or 90 percent prediction intervals, depending on what was provided by the authors of the equations. The program includes several other features that can be used only for flood-frequency estimation. These include the ability to generate flood-frequency plots, and plots of typical flood hydrographs for selected recurrence intervals, estimates of the probable maximum flood, extrapolation of the 500-year flood when an equation for estimating it is not available, and weighting techniques to improve flood-frequency estimates for gaging stations and ungaged sites on gaged streams. This report describes the regionalization techniques used to develop the equations in NSS and provides guidance on the applicability and limitations of the techniques. The report also includes a users manual and a summary of equations available for estimating basin lagtime, which is needed by the program to generate flood hydrographs. The NSS software and accompanying database, and the documentation for the regression equations included in NSS, are available on the Web at http://water.usgs.gov/software/.
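
    The form of a regional regression equation is simple to illustrate: a streamflow statistic is expressed as a power function of basin characteristics fitted in log space. The coefficients and basin characteristics below are invented for the example; actual equations, and their standard errors, come from the USGS reports compiled in NSS.

      # Hypothetical regional regression equation for a 100-year peak flow.
      def q100_estimate(drainage_area_sqmi, mean_annual_precip_in):
          """Illustrative equation of the form Q100 = 10^a * A^b * P^c."""
          a, b, c = 1.2, 0.75, 0.9      # invented regression coefficients
          return 10**a * drainage_area_sqmi**b * mean_annual_precip_in**c

      q100 = q100_estimate(drainage_area_sqmi=52.0, mean_annual_precip_in=38.0)
      print(f"estimated 100-year peak flow: {q100:,.0f} cfs")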

  14. HOPI: on-line injection optimization program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    LeMaire, J L

    1977-10-26

    A method of matching the beam from the 200 MeV linac to the AGS without the necessity of making emittance measurements is presented. An on-line computer program written for the PDP-10 computer performs the matching by independently modifying the horizontal and vertical emittance. Experimental results show success with this method, which can be applied to any matching section.

  15. 49 CFR 1248.4 - Originating and connecting line traffic.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... TRANSPORTATION BOARD, DEPARTMENT OF TRANSPORTATION (CONTINUED) ACCOUNTS, RECORDS AND REPORTS FREIGHT COMMODITY STATISTICS § 1248.4 Originating and connecting line traffic. (a) Revenue freight reported as received from... or indirectly, so far as apparent from information on the waybills or abstracts. (b) Revenue freight...

  16. 49 CFR 1248.4 - Originating and connecting line traffic.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... TRANSPORTATION BOARD, DEPARTMENT OF TRANSPORTATION (CONTINUED) ACCOUNTS, RECORDS AND REPORTS FREIGHT COMMODITY STATISTICS § 1248.4 Originating and connecting line traffic. (a) Revenue freight reported as received from... or indirectly, so far as apparent from information on the waybills or abstracts. (b) Revenue freight...

  17. 49 CFR 1248.4 - Originating and connecting line traffic.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... TRANSPORTATION BOARD, DEPARTMENT OF TRANSPORTATION (CONTINUED) ACCOUNTS, RECORDS AND REPORTS FREIGHT COMMODITY STATISTICS § 1248.4 Originating and connecting line traffic. (a) Revenue freight reported as received from... or indirectly, so far as apparent from information on the waybills or abstracts. (b) Revenue freight...

  18. 49 CFR 1248.4 - Originating and connecting line traffic.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... TRANSPORTATION BOARD, DEPARTMENT OF TRANSPORTATION (CONTINUED) ACCOUNTS, RECORDS AND REPORTS FREIGHT COMMODITY STATISTICS § 1248.4 Originating and connecting line traffic. (a) Revenue freight reported as received from... or indirectly, so far as apparent from information on the waybills or abstracts. (b) Revenue freight...

  19. 49 CFR 1248.4 - Originating and connecting line traffic.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... TRANSPORTATION BOARD, DEPARTMENT OF TRANSPORTATION (CONTINUED) ACCOUNTS, RECORDS AND REPORTS FREIGHT COMMODITY STATISTICS § 1248.4 Originating and connecting line traffic. (a) Revenue freight reported as received from... or indirectly, so far as apparent from information on the waybills or abstracts. (b) Revenue freight...

  20. A Performance-Based Comparison of Object-Oriented Simulation Tools

    DTIC Science & Technology

    1992-04-01

    simulation" [Belanger 90a, 90b]. CACI Products Company markets MODSIM II as the commercial version of ModSim, which was created on a US Army contract...aim fprintf (report_file, "Line Statistics\\ nLine teller repoirt.cust interrupts; Lengt~is\

  1. Realistic training for effective crew performance

    NASA Technical Reports Server (NTRS)

    Foushee, H. C.

    1985-01-01

    Evaluation of incident and accident statistics reveals that most problems occur not because of a lack of proficiency in pilot training, but because of the inability to coordinate skills into effective courses of action. Line-Oriented Flight Training (LOFT) and Cockpit Resource Management (CRM) programs provide training that develops individual crew member skills as well as those associated with effective group function. A study conducted by NASA at the request of the U.S. Congress supports the argument for training that enhances crew performance in addition to providing individual technical skills, and is described in detail.

  2. The Search for Solar Gravity-Mode Oscillations: an Analysis Using ULYSSES Magnetic Field Data

    NASA Astrophysics Data System (ADS)

    Denison, David G. T.; Walden, Andrew T.

    1999-04-01

    In 1995 Thomson, Maclennan, and Lanzerotti (TML) reported on work where they carried out a time-series analysis of energetic particle fluxes measured by Ulysses and Voyager 2 and concluded that solar g-mode oscillations had been detected. The approach is based on finding significant peaks in spectra using a statistical F-test. Using three sets of 2048 hourly averages of Ulysses magnetic field magnitude data, and the same multitaper spectral estimation techniques, we obtain, on average, nine coincidences with the lines listed in the TML paper. We could not reject the hypothesis that the F-test peaks we obtained are uniformly distributed, and further statistical computations show that a sequence of uniformly distributed lines generated on the frequency grid would have, on average, nine coincidences with the lines of TML. Further, we find that a time series generated from a model with a smooth spectrum of the same form as derived from the Ulysses magnetic field magnitude data and having no true spectral lines above 2 μHz, when subjected to the multitaper F-tests, gives rise to essentially the same number of "identified" lines and coincident frequencies as found with our Ulysses data. We conclude that our average nine coincidences with the lines found by TML can arise by mechanisms wholly unconnected with the existence of real physical spectral lines and hence find no firm evidence that g-modes can be detected in our sample of magnetic field data.

  3. External Reporting Lines of Academic Special Libraries: A Health Sciences Case Study

    ERIC Educational Resources Information Center

    Buhler, Amy G.; Ferree, Nita; Cataldo, Tara T.; Tennant, Michele R.

    2010-01-01

    Very little literature exists on the nature of external reporting lines and funding structures of academic special libraries. This study focuses on academic health sciences libraries. The authors analyze information gathered from statistics published by the Association of Academic Health Sciences Libraries (AAHSL) from 1977 through 2007; an…

  4. How Many Is a Zillion? Sources of Number Distortion

    ERIC Educational Resources Information Center

    Rips, Lance J.

    2013-01-01

    When young children attempt to locate the positions of numerals on a number line, the positions are often logarithmically rather than linearly distributed. This finding has been taken as evidence that the children represent numbers on a mental number line that is logarithmically calibrated. This article reports a statistical simulation showing…

  5. Education Statistics on Disk. [CD-ROM].

    ERIC Educational Resources Information Center

    National Center for Education Statistics (ED), Washington, DC.

    This CD-ROM disk contains a computer program developed by the Office of Educational Research and Improvement to provide convenient access to the wealth of education statistics published by the National Center for Education Statistics (NCES). The program contains over 1,800 tables, charts, and text files from the following NCES publications,…

  6. Limits, discovery and cut optimization for a Poisson process with uncertainty in background and signal efficiency: TRolke 2.0

    NASA Astrophysics Data System (ADS)

    Lundberg, J.; Conrad, J.; Rolke, W.; Lopez, A.

    2010-03-01

    A C++ class was written for the calculation of frequentist confidence intervals using the profile likelihood method. Seven combinations of Binomial, Gaussian, Poissonian and Binomial uncertainties are implemented. The package provides routines for the calculation of upper and lower limits, sensitivity and related properties. It also supports hypothesis tests which take uncertainties into account. It can be used in compiled C++ code, in Python or interactively via the ROOT analysis framework. Program summary: Program title: TRolke version 2.0; Catalogue identifier: AEFT_v1_0; Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFT_v1_0.html; Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland; Licensing provisions: MIT license; No. of lines in distributed program, including test data, etc.: 3431; No. of bytes in distributed program, including test data, etc.: 21 789; Distribution format: tar.gz; Programming language: ISO C++; Computer: Unix, GNU/Linux, Mac; Operating system: Linux 2.6 (Scientific Linux 4 and 5, Ubuntu 8.10), Darwin 9.0 (Mac OS X 10.5.8); RAM: ~20 MB; Classification: 14.13; External routines: ROOT (http://root.cern.ch/drupal/); Nature of problem: to calculate a frequentist confidence interval on the parameter of a Poisson process with statistical or systematic uncertainties in signal efficiency or background; Solution method: profile likelihood method, analytical; Running time: <10 seconds per extracted limit.
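    The following is a minimal, self-contained sketch of the profile-likelihood construction that TRolke implements, for the simplest case of a Poisson count with a precisely known background. It does not use the TRolke API itself, and the observed count and background value are made up for illustration.

```python
import numpy as np
from scipy.stats import poisson, chi2

def profile_likelihood_interval(n_obs, b, cl=0.90):
    """Profile-likelihood confidence interval for a Poisson signal s with
    known background b, the simplest of the cases handled by TRolke.
    A minimal sketch: scan -2*ln(likelihood ratio) over a grid of s and
    keep the region below the chi-square quantile for 1 degree of freedom."""
    s_grid = np.linspace(0.0, n_obs + 10.0 * np.sqrt(n_obs + b) + 10.0, 5000)
    loglik = poisson.logpmf(n_obs, s_grid + b)
    s_hat = max(n_obs - b, 0.0)                   # MLE, truncated at s >= 0
    loglik_max = poisson.logpmf(n_obs, s_hat + b)
    q = -2.0 * (loglik - loglik_max)
    inside = s_grid[q <= chi2.ppf(cl, df=1)]
    return inside.min(), inside.max()

# Illustrative numbers only: 7 observed events on an expected background of 3.2.
low, high = profile_likelihood_interval(n_obs=7, b=3.2, cl=0.90)
print(f"90% CL interval for the signal: [{low:.2f}, {high:.2f}]")
```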

  7. GLISSANDO: GLauber Initial-State Simulation AND mOre…

    NASA Astrophysics Data System (ADS)

    Broniowski, Wojciech; Rybczyński, Maciej; Bożek, Piotr

    2009-01-01

    We present a Monte Carlo generator for a variety of Glauber-like models (the wounded-nucleon model, binary collisions model, mixed model, model with hot spots). These models describe the early stages of relativistic heavy-ion collisions, in particular the spatial distribution of the transverse energy deposition which ultimately leads to production of particles from the interaction region. The original geometric distribution of sources in the transverse plane can be superimposed with a statistical distribution simulating the dispersion in the generated transverse energy in each individual collision. The program generates inter alia the fixed-axes (standard) and variable-axes (participant) two-dimensional profiles of the density of sources in the transverse plane and their azimuthal Fourier components. These profiles can be used in further analysis of physical phenomena, such as jet quenching, event-by-event hydrodynamics, or analysis of the elliptic flow and its fluctuations. Characteristics of the event (multiplicities, eccentricities, Fourier coefficients, etc.) are stored in a ROOT file and can be analyzed off-line. In particular, event-by-event studies can be carried out in a simple way. A number of ROOT scripts are provided for that purpose. Supplied variants of the code can also be used for proton-nucleus and deuteron-nucleus collisions. Program summary: Program title: GLISSANDO; Catalogue identifier: AEBS_v1_0; Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEBS_v1_0.html; Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland; Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html; No. of lines in distributed program, including test data, etc.: 4452; No. of bytes in distributed program, including test data, etc.: 34 766; Distribution format: tar.gz; Programming language: C++; Computer: any computer with a C++ compiler and the ROOT environment [R. Brun, et al., Root Users Guide 5.16, CERN, 2007, http://root.cern.ch[1
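    As a rough illustration of the wounded-nucleon picture that GLISSANDO simulates, the sketch below samples two hard-sphere nuclei, counts the nucleons that suffer at least one collision, and computes the participant eccentricity of the resulting source distribution. It is deliberately simplified (uniform hard-sphere nuclei rather than Woods-Saxon profiles, no hot spots or statistical overlays), and all numerical values are generic assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
SIGMA_NN_FM2 = 4.2             # inelastic NN cross section, ~42 mb = 4.2 fm^2
D2_MAX = SIGMA_NN_FM2 / np.pi  # nucleons collide if transverse distance^2 < D2_MAX

def sample_nucleus(A, R):
    """Crude hard-sphere nucleus: A nucleons uniform in a sphere of radius R (fm).
    GLISSANDO itself uses more realistic Woods-Saxon profiles."""
    pts = []
    while len(pts) < A:
        p = rng.uniform(-R, R, size=3)
        if p @ p <= R * R:
            pts.append(p)
    return np.array(pts)[:, :2]   # keep transverse (x, y) coordinates

def wounded_nucleons(A=208, b=6.0):
    """Wounded nucleons of an A+A collision at impact parameter b (fm)."""
    R = 1.2 * A ** (1.0 / 3.0)
    nucl_a = sample_nucleus(A, R) + [b / 2.0, 0.0]
    nucl_b = sample_nucleus(A, R) - [b / 2.0, 0.0]
    d2 = ((nucl_a[:, None, :] - nucl_b[None, :, :]) ** 2).sum(axis=2)
    hit_a = (d2 < D2_MAX).any(axis=1)
    hit_b = (d2 < D2_MAX).any(axis=0)
    return np.vstack([nucl_a[hit_a], nucl_b[hit_b]])

sources = wounded_nucleons()
x = sources[:, 0] - sources[:, 0].mean()
y = sources[:, 1] - sources[:, 1].mean()
# Participant (variable-axes) eccentricity of the source distribution.
eps = np.hypot((y**2 - x**2).mean(), 2 * (x * y).mean()) / (x**2 + y**2).mean()
print(f"{len(sources)} wounded nucleons, participant eccentricity ~ {eps:.3f}")
```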

  8. Decreasing dialysis catheter rates by creating a multidisciplinary dialysis access program.

    PubMed

    Rosenberry, Patricia M; Niederhaus, Silke V; Schweitzer, Eugene J; Leeser, David B

    2018-03-01

    Centers for Medicare and Medicaid Services have determined that chronic dialysis units should have <12% of their patients utilizing central venous catheters for hemodialysis treatments. On the Eastern Shore of Maryland, the central venous catheter rates in the dialysis units averaged >45%. A multidisciplinary program was established with goals of decreasing catheter rates in order to decrease central line-associated bloodstream infections, decrease mortality associated with central line-associated bloodstream infection, decrease hospital days, and provide savings to the healthcare system. We collected the catheter rates within three dialysis centers served over a 5-year period. Using published data surrounding the incidence and related costs of central line-associated bloodstream infection and mortality per catheter day, the number of central line-associated bloodstream infection events, the costs, and the related mortality could be determined prior to and after the initiation of the dialysis access program. An organized dialysis access program resulted in an 82% decrease in the number of central venous catheter days, which led to a concurrent reduction in central line-associated bloodstream infections and deaths. As a result of creating an access program, central venous catheter rates decreased from an average rate of 45% to 8%. The cost savings related to the program were calculated to be over US$5 million. The number of deaths averted is estimated to be between 13 and 27. We conclude that a formalized access program decreases catheter rates, central line-associated bloodstream infection, and the resultant hospitalizations, mortality, and costs. Areas with high hemodialysis catheter rates should develop access programs to better serve their patient population.

  9. Continuing Studies in Support of Ultraviolet Observations of Planetary Atmospheres

    NASA Technical Reports Server (NTRS)

    Clark, John

    1997-01-01

    This program was a one-year extension of an earlier Planetary Atmospheres program grant, covering the period 1 August 1996 through 30 September 1997. The grant was for supporting work to complement an active program observing planetary atmospheres with Earth-orbital telescopes, principally the Hubble Space Telescope (HST). The recent concentration of this work has been on HST observations of Jupiter's upper atmosphere and aurora, but it has also included observations of Io, serendipitous observations of asteroids, and observations of the velocity structure in the interplanetary medium. The observations of Jupiter have been at vacuum ultraviolet wavelengths, including imaging and spectroscopy of the auroral and airglow emissions. The most recent HST observations have been at the same time as in situ measurements made by the Galileo orbiter instruments, as reflected in the meeting presentations listed below. Concentrated efforts have been applied in this year to the following projects: The analysis of HST WFPC 2 images of Jupiter's aurora, including the Io footprint emissions. We have performed a comparative analysis of the Io footprint locations with two magnetic field models, studied the statistical properties of the apparent dawn auroral storms on Jupiter, and found various other repeated patterns in Jupiter's aurora. Analysis and modeling of airglow and auroral Ly alpha emission line profiles from Jupiter. This has included modeling the auroral line profiles, including the energy degradation of precipitating charged particles and radiative transfer of the emerging emissions. Jupiter's auroral emission line profile is self-absorbed, since it is produced by an internal source, and the resulting emission with a deep central absorption from the overlying atmosphere permits modeling of the depth of the emissions, plus the motion of the emitting layer with respect to the overlying atmospheric column from the observed Doppler shift of the central absorption. By contrast the airglow emission line, which is dominated by resonant scattering of solar emission, has no central absorption, but displays rapid time variations and broad wings, indicative of a superthermal component (or corona) in Jupiter's upper atmosphere. Modeling of the observed motions of the plumes produced after the impacts of the fragments of Comet S/L-9 with Jupiter in July 1994, from the HST WFPC 2 imaging series.

  10. Accountability Indicators from the Viewpoint of Statistical Method.

    ERIC Educational Resources Information Center

    Jordan, Larry

    Few people seriously regard students as "products" coming off an educational assembly line, but notions about accountability and quality improvement in higher education are pervaded by manufacturing ideas and metaphors. Because numerical indicators of quality are inevitably expressed by trend lines or statistical control charts of some kind, they…

  11. Nonlinear estimation of parameters in biphasic Arrhenius plots.

    PubMed

    Puterman, M L; Hrboticky, N; Innis, S M

    1988-05-01

    This paper presents a formal procedure for the statistical analysis of data on the thermotropic behavior of membrane-bound enzymes generated using the Arrhenius equation and compares the analysis to several alternatives. The data are modeled by a bent hyperbola. Nonlinear regression is used to obtain estimates and standard errors of the intersection of the line segments, defined as the transition temperature, and of the slopes, defined as the energies of activation of the enzyme reaction. The methodology allows formal tests of the adequacy of a biphasic model rather than either a single straight line or a curvilinear model. Examples are given using data on the thermotropic behavior of pig brain synaptosomal acetylcholinesterase. The data support the biphasic temperature dependence of this enzyme. The methodology represents a formal procedure for statistical validation of any biphasic data and allows for calculation of all line parameters with estimates of precision.
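    A minimal sketch of a bent-hyperbola fit of this kind is given below, using synthetic Arrhenius data and nonlinear least squares. The parameterization (asymptotic slopes b1 - b2 and b1 + b2, break point x0, smoothing parameter gamma) is one common choice and is not taken from the paper, and the data are simulated.

```python
import numpy as np
from scipy.optimize import curve_fit

def bent_hyperbola(x, b0, b1, b2, x0, gamma):
    """Smoothed intersection of two straight lines (a 'bent hyperbola').
    The asymptotic slopes are b1 - b2 (left limb) and b1 + b2 (right limb);
    x0 is the break point, which in an Arrhenius plot (ln k versus 1/T)
    marks the transition temperature; gamma controls the bend sharpness."""
    return b0 + b1 * (x - x0) + b2 * np.sqrt((x - x0) ** 2 + gamma ** 2)

# Synthetic Arrhenius data: x = 1000/T, y = ln(rate), break placed at x0 = 3.4.
rng = np.random.default_rng(1)
x = np.linspace(3.0, 3.8, 40)
y = bent_hyperbola(x, 2.0, -4.0, -2.5, 3.4, 0.01) + rng.normal(scale=0.05, size=x.size)

popt, pcov = curve_fit(bent_hyperbola, x, y, p0=[2.0, -3.0, -1.0, 3.4, 0.05])
perr = np.sqrt(np.diag(pcov))
# With x = 1000/T, each limb's activation energy is -slope * R * 1000.
print(f"break point x0 = {popt[3]:.3f} +/- {perr[3]:.3f} (units of 1000/T)")
print(f"limb slopes    = {popt[1] - popt[2]:.2f} and {popt[1] + popt[2]:.2f}")
```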

  12. Mg II Spectral Atlas and Flux Catalog for Late-Type Stars in the Hyades Cluster

    NASA Technical Reports Server (NTRS)

    Simon, Theodore

    2001-01-01

    In the course of a long-running IUE Guest Observer program, UV spectral images were obtained for more than 60 late-type members of the Hyades Cluster in order to investigate their chromospheric emissions. The emission line fluxes extracted from those observations were used to study the dependence of stellar dynamo activity upon age and rotation (IUE Observations of Rapidly Rotating Low-Mass Stars in Young Clusters: The Relation between Chromospheric Activity and Rotation). However, the details of those measurements, including a tabulation of the line fluxes, were never published. The purpose of the investigation summarized here was to extract all of the existing Hyades long-wavelength Mg II spectra in the IUE public archives in order to survey UV chromospheric emission in the cluster, thereby providing a consistent dataset for statistical and correlative studies of the relationship between stellar dynamo activity, rotation, and age over a broad range in mass.

  13. EVOLUTION OF THE MAGNETIC FIELD LINE DIFFUSION COEFFICIENT AND NON-GAUSSIAN STATISTICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snodin, A. P.; Ruffolo, D.; Matthaeus, W. H.

    The magnetic field line random walk (FLRW) plays an important role in the transport of energy and particles in turbulent plasmas. For magnetic fluctuations that are transverse or almost transverse to a large-scale mean magnetic field, theories describing the FLRW usually predict asymptotic diffusion of magnetic field lines perpendicular to the mean field. Such theories often depend on the assumption that one can relate the Lagrangian and Eulerian statistics of the magnetic field via Corrsin’s hypothesis, and additionally take the distribution of magnetic field line displacements to be Gaussian. Here we take an ordinary differential equation (ODE) model with these underlying assumptions and test how well it describes the evolution of the magnetic field line diffusion coefficient in 2D+slab magnetic turbulence, by comparisons to computer simulations that do not involve such assumptions. In addition, we directly test the accuracy of the Corrsin approximation to the Lagrangian correlation. Over much of the studied parameter space we find that the ODE model is in fairly good agreement with computer simulations, in terms of both the evolution and asymptotic values of the diffusion coefficient. When there is poor agreement, we show that this can be largely attributed to the failure of Corrsin’s hypothesis rather than the assumption of Gaussian statistics of field line displacements. The degree of non-Gaussianity, which we measure in terms of the kurtosis, appears to be an indicator of how well Corrsin’s approximation works.
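    The running diffusion coefficient and the kurtosis diagnostic discussed above can be illustrated with a toy ensemble of field-line trajectories. The sketch below uses a simple correlated random walk in place of field lines traced through 2D+slab turbulence, so the numbers are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(2)
n_lines, n_steps, dz = 2000, 400, 0.1

# Toy ensemble of field-line random walks: AR(1)-correlated step directions
# stand in for tracing field lines through 2D+slab turbulence.
steps = np.empty((n_lines, n_steps))
steps[:, 0] = rng.normal(size=n_lines)
for k in range(1, n_steps):
    steps[:, k] = 0.8 * steps[:, k - 1] + rng.normal(size=n_lines)
x = np.cumsum(steps * dz, axis=1)          # perpendicular displacement Dx(z)
z = dz * np.arange(1, n_steps + 1)

running_D = (x ** 2).mean(axis=0) / (2.0 * z)                  # D(z) = <Dx^2>/(2 z)
kurtosis = (x ** 4).mean(axis=0) / (x ** 2).mean(axis=0) ** 2  # equals 3 for Gaussian

print(f"asymptotic diffusion coefficient ~ {running_D[-1]:.3f}")
print(f"kurtosis of Dx at the last step  ~ {kurtosis[-1]:.2f}")
```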

  14. ROOT — A C++ framework for petabyte data storage, statistical analysis and visualization

    NASA Astrophysics Data System (ADS)

    Antcheva, I.; Ballintijn, M.; Bellenot, B.; Biskup, M.; Brun, R.; Buncic, N.; Canal, Ph.; Casadei, D.; Couet, O.; Fine, V.; Franco, L.; Ganis, G.; Gheata, A.; Maline, D. Gonzalez; Goto, M.; Iwaszkiewicz, J.; Kreshuk, A.; Segura, D. Marcos; Maunder, R.; Moneta, L.; Naumann, A.; Offermann, E.; Onuchin, V.; Panacek, S.; Rademakers, F.; Russo, P.; Tadel, M.

    2009-12-01

    ROOT is an object-oriented C++ framework conceived in the high-energy physics (HEP) community, designed for storing and analyzing petabytes of data in an efficient way. Any instance of a C++ class can be stored into a ROOT file in a machine-independent compressed binary format. In ROOT the TTree object container is optimized for statistical data analysis over very large data sets by using vertical data storage techniques. These containers can span a large number of files on local disks, the web, or a number of different shared file systems. In order to analyze this data, the user can choose from a wide set of mathematical and statistical functions, including linear algebra classes, numerical algorithms such as integration and minimization, and various methods for performing regression analysis (fitting). In particular, the RooFit package allows the user to perform complex data modeling and fitting, while the RooStats library provides abstractions and implementations for advanced statistical tools. Multivariate classification methods based on machine learning techniques are available via the TMVA package. A central piece of these analysis tools is the set of histogram classes, which provide binning of one- and multi-dimensional data. Results can be saved in high-quality graphical formats like PostScript and PDF or in bitmap formats like JPG or GIF. The result can also be stored into ROOT macros that allow a full recreation and rework of the graphics. Users typically create their analysis macros step by step, making use of the interactive C++ interpreter CINT, while running over small data samples. Once the development is finished, they can run these macros at full compiled speed over large data sets, using on-the-fly compilation, or by creating a stand-alone batch program. Finally, if processing farms are available, the user can reduce the execution time of intrinsically parallel tasks (e.g., data mining in HEP) by using PROOF, which will take care of optimally distributing the work over the available resources in a transparent way. Program summary: Program title: ROOT; Catalogue identifier: AEFA_v1_0; Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFA_v1_0.html; Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland; Licensing provisions: LGPL; No. of lines in distributed program, including test data, etc.: 3 044 581; No. of bytes in distributed program, including test data, etc.: 36 325 133; Distribution format: tar.gz; Programming language: C++; Computer: Intel i386, Intel x86-64, Motorola PPC, Sun Sparc, HP PA-RISC; Operating system: GNU/Linux, Windows XP/Vista, Mac OS X, FreeBSD, OpenBSD, Solaris, HP-UX, AIX; Has the code been vectorized or parallelized?: Yes; RAM: >55 Mbytes; Classification: 4, 9, 11.9, 14; Nature of problem: storage, analysis and visualization of scientific data; Solution method: object store, wide range of analysis algorithms and visualization methods; Additional comments: for an up-to-date author list see http://root.cern.ch/drupal/content/root-development-team and http://root.cern.ch/drupal/content/former-root-developers; Running time: depending on the data size and complexity of analysis algorithms; References: http://root.cern.ch.
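    A minimal PyROOT sketch of the histogram-and-fit workflow described above is shown below. It assumes ROOT is installed with its Python bindings enabled; the object and output file names are arbitrary choices for illustration.

```python
import ROOT

# Book a one-dimensional histogram, fill it with 10k samples of a unit
# Gaussian, and fit a Gaussian to the filled distribution (quiet mode).
h = ROOT.TH1F("h_toy", "toy spectrum;x;entries", 100, -5.0, 5.0)
h.FillRandom("gaus", 10000)
h.Fit("gaus", "Q")

fit = h.GetFunction("gaus")
print("fitted mean  :", fit.GetParameter(1))
print("fitted sigma :", fit.GetParameter(2))

# Persist the histogram (with its fit) to a ROOT file.
out = ROOT.TFile("example.root", "RECREATE")
h.Write()
out.Close()
```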

  15. The One Micron Fe II Lines in Active Galaxies and Emission Line Stars

    NASA Astrophysics Data System (ADS)

    Rudy, R. J.; Mazuk, S.; Puetter, R. C.; Hamann, F. W.

    1999-05-01

    The lines of the infrared Fe II multiplet at 0.9997, 1.0501, 1.0863, and 1.1126 microns are particularly strong relative to other red and infrared Fe II features. They reach their greatest strength, relative to the hydrogen lines, in the Seyfert 1 galaxy I Zw 1, and are a common, although not ubiquitous, feature in the broad line regions of active galaxies. In addition, they are seen in a diverse assortment of Galactic sources including young stars, Herbig Ae and Be stars, luminous blue variables, proto-planetary nebulae, and symbiotic novae. They are probably excited by Lyman alpha fluorescence, but the exact path of the cascade to their upper levels is uncertain. They arise in dense, sheltered regions of low ionization and are frequently observed together with the infrared Ca II triplet and the Lyman beta-excited O I lines at 8446 and 11287 Å. The strengths of the four Fe II features, relative to each other, are nearly constant from object to object, suggesting a statistical population of their common upper multiplet. Their intensities, in comparison to the Paschen lines, indicate that they can be important coolants for regions with high optical depths in the hydrogen lines. In addition to I Zw 1 and other active galaxies, we present spectra for the Galactic sources MWC 17, MWC 84, MWC 340, MWC 922, PU Vul, and M 1-92. We review the status of the Fe II observations and discuss the excitation process and possible implications. This work was supported by the IR&D program of the Aerospace Corporation. RCP and FWH acknowledge support from NASA.

  16. Overview of the SAMSI year-long program on Statistical, Mathematical and Computational Methods for Astronomy

    NASA Astrophysics Data System (ADS)

    Jogesh Babu, G.

    2017-01-01

    A year-long research program (Aug 2016 - May 2017) on 'Statistical, Mathematical and Computational Methods for Astronomy (ASTRO)' is well under way at the Statistical and Applied Mathematical Sciences Institute (SAMSI), a National Science Foundation research institute in Research Triangle Park, NC. This program has brought together astronomers, computer scientists, applied mathematicians, and statisticians. The main aims of this program are: to foster cross-disciplinary activities; to accelerate the adoption of modern statistical and mathematical tools into modern astronomy; and to develop new tools needed for important astronomical research problems. The program provides multiple avenues for cross-disciplinary interactions, including several workshops, long-term visitors, and regular teleconferences, so participants can continue collaborations even if they can only spend limited time in residence at SAMSI. The main program is organized around five working groups: i) Uncertainty Quantification and Astrophysical Emulation; ii) Synoptic Time Domain Surveys; iii) Multivariate and Irregularly Sampled Time Series; iv) Astrophysical Populations; v) Statistics, Computation, and Modeling in Cosmology. A brief description of the work under way in each of these groups will be given. Overlaps among the working groups will also be highlighted. How the wider astronomy community can both participate in and benefit from the activities will be briefly mentioned.

  17. Limitations of Using Microsoft Excel Version 2016 (MS Excel 2016) for Statistical Analysis for Medical Research.

    PubMed

    Tanavalee, Chotetawan; Luksanapruksa, Panya; Singhatanadgige, Weerasak

    2016-06-01

    Microsoft Excel (MS Excel) is a commonly used program for data collection and statistical analysis in biomedical research. However, the program has many limitations compared with dedicated statistical programs, including a smaller set of analysis functions and a limited total number of cells. MS Excel cannot complete analyses with blank cells, and cells must be selected manually for analysis. In addition, it requires multiple steps of data transformation and formulas to plot survival analysis graphs, among other tasks. The Megastat add-on program, which will soon be supported by MS Excel 2016, would eliminate some limitations of using statistical formulas within MS Excel.

  18. A program for the Bayesian Neural Network in the ROOT framework

    NASA Astrophysics Data System (ADS)

    Zhong, Jiahang; Huang, Run-Sheng; Lee, Shih-Chang

    2011-12-01

    We present a Bayesian Neural Network algorithm implemented in the TMVA package (Hoecker et al., 2007 [1]), within the ROOT framework (Brun and Rademakers, 1997 [2]). Compared with the conventional use of a neural network as a discriminator, this implementation offers advantages as a non-parametric regression tool, particularly for fitting probabilities. It provides functionality including cost-function selection, complexity control, and uncertainty estimation. An example of such an application in high-energy physics is shown. The algorithm is available with ROOT releases later than 5.29. Program summary: Program title: TMVA-BNN; Catalogue identifier: AEJX_v1_0; Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJX_v1_0.html; Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland; Licensing provisions: BSD license; No. of lines in distributed program, including test data, etc.: 5094; No. of bytes in distributed program, including test data, etc.: 1,320,987; Distribution format: tar.gz; Programming language: C++; Computer: any computer system or cluster with a C++ compiler and a UNIX-like operating system; Operating system: most UNIX/Linux systems (the application programs were thoroughly tested under Fedora and Scientific Linux CERN); Classification: 11.9; External routines: ROOT package version 5.29 or higher (http://root.cern.ch); Nature of problem: non-parametric fitting of multivariate distributions; Solution method: an implementation of a neural network following the Bayesian statistical interpretation, using the Laplace approximation for the Bayesian marginalizations and providing automatic complexity control and uncertainty estimation; Running time: time consumption for the training depends substantially on the size of the input sample, the NN topology, the number of training iterations, etc. For the example in this manuscript, about 7 min was used on a PC/Linux with 2.0 GHz processors.

  19. Infant Directed Speech Enhances Statistical Learning in Newborn Infants: An ERP Study

    PubMed Central

    Teinonen, Tuomas; Tervaniemi, Mari; Huotilainen, Minna

    2016-01-01

    Statistical learning and the social contexts of language addressed to infants are hypothesized to play important roles in early language development. Previous behavioral work has found that the exaggerated prosodic contours of infant-directed speech (IDS) facilitate statistical learning in 8-month-old infants. Here we examined the neural processes involved in on-line statistical learning and investigated whether the use of IDS facilitates statistical learning in sleeping newborns. Event-related potentials (ERPs) were recorded while newborns were exposed to 12 pseudo-words, six spoken with exaggerated pitch contours of IDS and six spoken without exaggerated pitch contours (ADS) in ten alternating blocks. We examined whether ERP amplitudes for syllable position within a pseudo-word (word-initial vs. word-medial vs. word-final, indicating statistical word learning) and speech register (ADS vs. IDS) would interact. The ADS and IDS registers elicited similar ERP patterns for syllable position in an early 0–100 ms component but elicited different ERP effects in both the polarity and topographical distribution at 200–400 ms and 450–650 ms. These results provide the first evidence that the exaggerated pitch contours of IDS result in differences in brain activity linked to on-line statistical learning in sleeping newborns. PMID:27617967

  20. Implementing a Daily Maintenance Care Bundle to Prevent Central Line-Associated Bloodstream Infections in Pediatric Oncology Patients.

    PubMed

    Duffy, Elizabeth A; Rodgers, Cheryl C; Shever, Leah L; Hockenberry, Marilyn J

    2015-01-01

    Eliminating central line-associated bloodstream infection (CLABSI) is a national priority. Central venous catheter (CVC) care bundles are composed of a series of interventions that, when used together, are effective in preventing CLABSI. A CVC daily maintenance care bundle includes procedural guidelines for hygiene, dressing changes, and access, as well as specific timeframes. Failure to complete one of the components of the care bundle predisposes the patient to a bloodstream infection. A nurse-led multidisciplinary team implemented and, for six months, sustained a daily maintenance care bundle for pediatric oncology patients. This quality improvement project focused on nursing staff's implementation of the daily maintenance care bundle and the sustainment of the intervention. The project used a pre-post program design to evaluate outcomes of CVC daily maintenance care bundle compliance and CLABSI. A statistically significant increase in compliance with the CVC daily maintenance care bundle was noted between the pre- and post-assessments. CLABSI rates decreased during the intervention. Strategies to implement practice change and promote sustainability are discussed. © 2015 by Association of Pediatric Hematology/Oncology Nurses.

  1. [How to reduce health inequities by targeting social determinants: the role of the health sector in Mexico].

    PubMed

    Martínez Valle, Adolfo; Terrazas, Paulina; Alvarez, Fernando

    2014-04-01

    To study lines of action implemented in Mexico by the health sector from 2007 to 2012 in order to combat health inequities by targeting social determinants. To contribute to better understanding and knowledge of how health system inequalities in the Region of the Americas can be reduced. To formulate recommendations for designing a future public policy agenda to address the social determinants associated with health inequities in Mexico. The policies and programs established in the National Health Program (PRONASA) 2007 - 2012 were reviewed, and those that met four criteria were selected: i) they affected the social determinants of health (SDH); ii) they developed specific lines of action aimed at reducing health inequities; iii) they set concrete goals; and iv) they had been evaluated to determine whether those goals had been met. Three programs were selected: Seguro Popular, Programa de Desarrollo Humano Oportunidades (PDHO), and Caravanas de la Salud. Once each program's specific lines of action targeting SDH had been identified, the monitoring and evaluation indicators established in PRONASA 2007 - 2012, along with other available evaluations and empirical evidence, were used to measure the extent to which the goals were met. The findings showed that Seguro Popular had had a positive impact in terms of the financial protection of lower-income households. Moreover, the reduction in the gap between workers covered by the social security system and those who were not was more evident. By reducing poverty among its beneficiaries, the PDHO also managed to reduce health inequities. The indicators for Caravanas de la Salud, on the other hand, did not show statistically significant differences between the control localities and the localities covered by the program, except in the case of Pap tests. These findings have important public policy implications for designing an agenda that promotes continued targeting of SDH and heightens its impact in terms of reducing inequities. Guaranteeing the effective exercise of social rights, without socioeconomic, employment, ethnic, or gender-based exclusion, will be key. Actions to provide comprehensive, inclusive, equitable, effective, and quality coverage, supported by a preventive and remedial model of primary health care, are recommended. Strategies should be centered on primary health services, because at that level, more comprehensive care focusing on the person rather than the disease can be provided. It will also be necessary to include periodic monitoring and evaluation phases to offer the comprehensive social protection system scientific armor and guarantee its effectiveness.

  2. Determination of the vinyl fluoride line intensities by TDL spectroscopy: the object oriented approach of Visual Line Shape Fitting Program to line profile analysis

    NASA Astrophysics Data System (ADS)

    Tasinato, Nicola; Pietropolli Charmet, Andrea; Stoppa, Paolo; Giorgianni, Santi

    2010-03-01

    In this work the self-broadening coefficients and the integrated line intensities for a number of ro-vibrational transitions of vinyl fluoride have been determined for the first time by means of TDL spectroscopy. The spectra recorded in the atmospheric window around 8.7 µm appear very crowded, with a density of about 90 lines per cm⁻¹. In order to fit these spectral features a new fitting software has been implemented. The program, which is designed for laser spectroscopy, can fit many lines simultaneously on the basis of different theoretical profiles (Doppler, Lorentz, Voigt, Galatry and Nelkin-Ghatak). Details of the object oriented implementation of the application are given. The reliability of the program is demonstrated by determining the line parameters of some ro-vibrational lines of sulphur dioxide in the ν1 band region around 9 µm. Then the software is used for the line profile analysis of vinyl fluoride. The experimental line shapes show deviations from the Voigt profile, which can be well modelled by using a Dicke narrowed line shape function. This leads to the determination of the self-narrowing coefficient within the framework of the strong collision model.

  3. Beam transport program for FEL project

    NASA Astrophysics Data System (ADS)

    Sugimoto, Masayoshi; Takao, Masaru

    1992-07-01

    A beam transport program has been developed to design the beam transport line of the free electron laser system at JAERI and to assist in beam diagnosis. The program traces a beam matrix through the elements in the beam transport line and the accelerators. A graphical user interface is employed to access the parameters and to present the results. The basic computational method is based on the LANL-TRACE program, rewritten in Pascal for personal computers.

  4. Interpretation of Statistical Data: The Importance of Affective Expressions

    ERIC Educational Resources Information Center

    Queiroz, Tamires; Monteiro, Carlos; Carvalho, Liliane; François, Karen

    2017-01-01

    In recent years, research on teaching and learning of statistics emphasized that the interpretation of data is a complex process that involves cognitive and technical aspects. However, it is a human activity that involves also contextual and affective aspects. This view is in line with research on affectivity and cognition. While the affective…

  5. Telesoftware. CET Information Sheet No. 3.

    ERIC Educational Resources Information Center

    Council for Educational Technology, London (England).

    Telesoftware provides the transmission of computer programs from one computer to another by either broadcast radio or television via telephone lines and offers a national electronic system for the distribution of computer programs. Telephone based telesoftware can be based on any viewdata system or locally established telephone lines between…

  6. Report to the Legislature on: School Breakfast and Summer Food Service Programs. MGL Chapter 15 Section 1G(f) and Chapter 61 of the Acts of 2007 Line Item 7053-1925

    ERIC Educational Resources Information Center

    Massachusetts Department of Education, 2008

    2008-01-01

    The paper presents the report on "School Breakfast and Summer Food Service Program." Pursuant to Chapter 61 of the Acts of 2007 line item 7053-1925 and Massachusetts General Laws (MGL) chapter 15 section 1G(f), this report is submitted to the Legislature. An Act establishing school-based Nutrition and Child Hunger Relief Programs was…

  7. Coal-seismic, desktop computer programs in BASIC; Part 5, Perform X-square T-square analysis and plot normal moveout lines on seismogram overlay

    USGS Publications Warehouse

    Hasbrouck, W.P.

    1983-01-01

    Processing of data taken with the U.S. Geological Survey's coal-seismic system is done with a desktop, stand-alone computer. Programs for this computer are written in the extended BASIC language used by the Tektronix 4051 Graphic System. This report presents computer programs to perform X-square/T-square analyses and to plot normal moveout lines on a seismogram overlay.

  8. Treatment effects model for assessing disease management: measuring outcomes and strengthening program management.

    PubMed

    Wendel, Jeanne; Dumitras, Diana

    2005-06-01

    This paper describes an analytical methodology for obtaining statistically unbiased outcomes estimates for programs in which participation decisions may be correlated with variables that impact outcomes. This methodology is particularly useful for intraorganizational program evaluations conducted for business purposes. In this situation, data is likely to be available for a population of managed care members who are eligible to participate in a disease management (DM) program, with some electing to participate while others eschew the opportunity. The most pragmatic analytical strategy for in-house evaluation of such programs is likely to be the pre-intervention/post-intervention design in which the control group consists of people who were invited to participate in the DM program, but declined the invitation. Regression estimates of program impacts may be statistically biased if factors that impact participation decisions are correlated with outcomes measures. This paper describes an econometric procedure, the Treatment Effects model, developed to produce statistically unbiased estimates of program impacts in this type of situation. Two equations are estimated to (a) estimate the impacts of patient characteristics on decisions to participate in the program, and then (b) use this information to produce a statistically unbiased estimate of the impact of program participation on outcomes. This methodology is well-established in economics and econometrics, but has not been widely applied in the DM outcomes measurement literature; hence, this paper focuses on one illustrative application.
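    A hedged sketch of a two-step version of this kind of treatment-effects estimator is given below, on simulated data: a probit participation equation followed by an outcome regression augmented with the selection-correction (inverse Mills ratio) term. The variable names and coefficients are invented for illustration, and a production analysis would typically use full maximum likelihood and corrected standard errors rather than this simplified two-step calculation.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(3)
n = 5000

# Simulated population: 'risk' affects both program enrollment and the cost
# outcome, and the errors of the two equations are correlated (the bias source).
risk = rng.normal(size=n)
u = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.5], [0.5, 1.0]], size=n)
enroll = ((0.8 * risk + u[:, 0]) > 0).astype(float)
cost = 10.0 + 2.0 * risk - 1.5 * enroll + u[:, 1]   # true program effect: -1.5

# Step 1: probit participation equation.
X1 = sm.add_constant(risk)
probit = sm.Probit(enroll, X1).fit(disp=False)
xb = X1 @ probit.params
# Inverse Mills ratio (selection-correction term) for each observation.
imr = np.where(enroll == 1, norm.pdf(xb) / norm.cdf(xb),
               -norm.pdf(xb) / (1.0 - norm.cdf(xb)))

# Step 2: outcome equation augmented with the correction term.
X2 = sm.add_constant(np.column_stack([risk, enroll, imr]))
corrected = sm.OLS(cost, X2).fit()
naive = sm.OLS(cost, sm.add_constant(np.column_stack([risk, enroll]))).fit()

print("naive program effect     :", round(naive.params[2], 2))
print("corrected program effect :", round(corrected.params[2], 2))
print("true program effect      : -1.5")
```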

  9. Region 9 2010 Census Web Service

    EPA Pesticide Factsheets

    This web service displays data collected during the 2010 U.S. Census. The data are organized into layers representing Tract, Block, and Block Group visualizations. Geography: The TIGER/Line files are feature classes and related database files that are an extract of selected geographic and cartographic information from the U.S. Census Bureau's Master Address File / Topologically Integrated Geographic Encoding and Referencing (MAF/TIGER) Database (MTDB). The MTDB represents a seamless national file with no overlaps or gaps between parts; however, each TIGER/Line file is designed to stand alone as an independent data set, or they can be combined to cover the entire nation. Census tracts are small, relatively permanent statistical subdivisions of a county or equivalent entity, and were defined by local participants as part of the 2010 Census Participant Statistical Areas Program. The Census Bureau delineated the census tracts in situations where no local participant existed or where all the potential participants declined to participate. The primary purpose of census tracts is to provide a stable set of geographic units for the presentation of census data and comparison back to previous decennial censuses. Census tracts generally have a population size between 1,200 and 8,000 people, with an optimum size of 4,000 people. When first delineated, census tracts were designed to be homogeneous with respect to population characteristics, economic status

  10. Statistical and methodological issues in the evaluation of case management studies.

    PubMed

    Lesser, M L; Robertson, S; Kohn, N; Cooper, D J; Dlugacz, Y D

    1996-01-01

    For the past 3 years, the nursing case management team at North Shore University Hospital in Manhasset, NY, has been involved in a project to implement more than 50 clinical pathways, which provide a written "time line" for clinical events that should occur during a patient's hospital stay. A major objective of this project was to evaluate the efficacy of these pathways with respect to a number of important outcomes, such as length of stay, hospital costs, quality of patient care, and nursing and patient satisfaction. This article discusses several statistics-related issues in the design and evaluation of such case management studies. In particular, the role of a research approach in implementing and evaluating hospital programs, the choice of a comparison (control) group, the exclusion of selected patients from analysis, and the problems of equating pathways with diagnosis-related groups are addressed.

  11. Line identification studies using traditional techniques and wavelength coincidence statistics

    NASA Technical Reports Server (NTRS)

    Cowley, Charles R.; Adelman, Saul J.

    1990-01-01

    Traditional line identification techniques result in the assignment of individual lines to an atomic or ionic species. These methods may be supplemented by wavelength coincidence statistics (WCS). The strengths and weaknesses of these methods are discussed using spectra of a number of normal and peculiar B and A stars that have been studied independently by both methods. The present results support the overall findings of some earlier studies. WCS would be most useful in a first survey, before traditional methods have been applied. WCS can quickly make a global search for all species and in this way may enable identification of an unexpected species that could easily be omitted entirely from a traditional study. This is illustrated by O I. WCS is subject to the well-known weaknesses of any statistical technique; for example, a predictable number of spurious results is to be expected. The dangers of small-number statistics are illustrated. WCS is at its best relative to traditional methods in finding a line-rich atomic species that is only weakly present in a complicated stellar spectrum.
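    The core of WCS can be illustrated with a short Monte Carlo sketch: count how many observed wavelengths fall within a small tolerance of a species' laboratory line list, then ask how often randomly placed wavelengths match at least as well. The wavelengths, tolerance, and line list below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

def count_coincidences(observed, lab_list, tol=0.04):
    """Number of observed wavelengths lying within +/- tol Angstroms of any
    laboratory wavelength of the candidate species."""
    lab = np.sort(lab_list)
    idx = np.clip(np.searchsorted(lab, observed), 1, len(lab) - 1)
    nearest = np.minimum(np.abs(observed - lab[idx - 1]),
                         np.abs(observed - lab[idx]))
    return int((nearest <= tol).sum())

def wcs_significance(observed, lab_list, lo, hi, tol=0.04, n_trials=2000):
    """Monte Carlo estimate of how often random wavelengths in [lo, hi]
    match the laboratory list at least as well as the observed lines do."""
    n_obs = count_coincidences(observed, lab_list, tol)
    trials = [count_coincidences(rng.uniform(lo, hi, size=len(observed)),
                                 lab_list, tol) for _ in range(n_trials)]
    p_value = float(np.mean([t >= n_obs for t in trials]))
    return n_obs, p_value

# Toy example with made-up wavelengths (Angstroms): 20 real matches buried
# among 80 unrelated lines.
lab_species = rng.uniform(4000.0, 4500.0, size=300)
obs = np.concatenate([rng.choice(lab_species, 20) + rng.normal(0, 0.01, 20),
                      rng.uniform(4000.0, 4500.0, size=80)])
print(wcs_significance(obs, lab_species, 4000.0, 4500.0))
```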

  12. Selected Streamflow Statistics for Streamgaging Stations in Delaware, 2003

    USGS Publications Warehouse

    Ries, Kernell G.

    2004-01-01

    Flow-duration and low-flow frequency statistics were calculated for 15 streamgaging stations in Delaware, in cooperation with the Delaware Geological Survey. The flow-duration statistics include the 1-, 2-, 5-, 10-, 20-, 30-, 40-, 50-, 60-, 70-, 80-, 90-, 95-, 98-, and 99-percent duration discharges. The low-flow frequency statistics include the average discharges for 1, 7, 14, 30, 60, 90, and 120 days that recur, on average, once in 1.01, 2, 5, 10, 20, 50, and 100 years. The statistics were computed using U.S. Geological Survey computer programs that can be downloaded from the World Wide Web at no cost. The computer programs automate standard U.S. Geological Survey methods for computing the statistics. Documentation is provided at the Web sites for the individual programs. The computed statistics are presented in tabular format on a separate page for each station, along with the station name, station number, the location, the period of record, and remarks.
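    A minimal sketch of the two kinds of statistics named above is given below, computed from a synthetic daily-flow series: flow-duration percentiles, and the annual minimum 7-day mean flows that feed a low-flow frequency analysis. The USGS programs add steps not reproduced here, such as water-year handling, log-Pearson Type III frequency fitting, and record screening.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)
# Synthetic 30-year daily streamflow record (ft^3/s); stands in for gage data.
dates = pd.date_range("1990-10-01", "2020-09-30", freq="D")
flow = pd.Series(np.exp(rng.normal(3.0, 0.8, len(dates))), index=dates)

# Flow-duration statistics: the discharge exceeded p percent of the time.
exceedance = [1, 5, 10, 25, 50, 75, 90, 95, 99]
duration = {p: np.percentile(flow, 100 - p) for p in exceedance}
print({p: round(q, 1) for p, q in duration.items()})

# Annual minimum 7-day mean flows, the series used for 7-day low-flow
# frequency statistics such as the 7-day, 10-year low flow (calendar-year
# grouping here; the USGS convention uses climatic or water years).
seven_day = flow.rolling(7).mean()
annual_min7 = seven_day.groupby(seven_day.index.year).min().dropna()
print(annual_min7.head())
```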

  13. 76 FR 41756 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-15

    ... materials and supplies used in production. The economic census will produce basic statistics by kind of business on number of establishments, sales, payroll, employment, inventories, and operating expenses. It also will yield a variety of subject statistics, including sales by product line; sales by class of...

  14. Women Entrepreneurship Across Racial Lines: Current Status, Critical Issues, and Future Implications

    ERIC Educational Resources Information Center

    Smith-Hunter, Andrea

    2004-01-01

    This article begins with a look at women employment over the years and the historical place of women entrepreneurship in today's economy. It continues by analyzing data statistically on women entrepreneurs in the United States across racial lines, with a particular focus on Hispanic women entrepreneurs. The article ends by examining the critical…

  15. REST: a computer system for estimating logging residue by using the line-intersect method

    Treesearch

    A. Jeff Martin

    1975-01-01

    A computer program was designed to accept logging-residue measurements obtained by line-intersect sampling and transform them into summaries useful for the land manager. The features of the program, along with inputs and outputs, are briefly described, with a note on machine compatibility.

  16. Outcomes of home-based employment service programs for people with disabilities and their related factors--a preliminary study in Taiwan.

    PubMed

    Lin, Yi-Jiun; Huang, I-Chun; Wang, Yun-Tung

    2014-01-01

    The aim of this exploratory study is to gain an understanding of the outcomes of home-based employment service programs for people with disabilities and their related factors in Taiwan. This study used a survey method to collect 132 questionnaires. Descriptive and two-variable statistics, including chi-square (χ²), independent-sample t-tests, and analysis of variance, were employed. The results found that 36.5% of the subjects improved their employment status and 75.8% of them improved in employability. Educational level and vocational categories including "web page production", "e-commerce", "internet marketing", "on-line store" and "website set-up and management" were significantly positively associated with either of the two outcome indicators - change of employment status and employability. This study is the first evidence-based study about the outcomes of home-based employment service programs and their related factors for people with disabilities in Taiwan. The outcomes of the home-based employment service programs for people with disabilities were presented. Implications for Rehabilitation: Home-based rehabilitation for people with disabilities can be effective. A programme of this kind supports participants in improving or gaining employment status as well as developing employability skills. Further consideration should be given to developing cost-effective home-based programmes and evaluating their effectiveness.

  17. FY2017 Report on NISC Measurements and Detector Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrews, Madison Theresa; Meierbachtol, Krista Cruse; Jordan, Tyler Alexander

    FY17 work focused on automation of both the measurement analysis and the comparison with simulations. The experimental apparatus was relocated and weeks of continuous measurements of the spontaneous fission source 252Cf were performed. Programs were developed to automate the conversion of measurements into ROOT data framework files with a simple terminal input. The complete analysis of the measurement (which includes energy calibration and the identification of correlated counts) can now be completed with a documented process that likewise involves a single execution line. Finally, the hurdles of slow MCNP simulations resulting in low simulation statistics have been overcome with the generation of multi-run suites which make use of the high-performance computing resources at LANL. Preliminary comparisons of measurements and simulations have been performed and will be the focus of FY18 work.

  18. Genomic Selection in Plant Breeding: Methods, Models, and Perspectives.

    PubMed

    Crossa, José; Pérez-Rodríguez, Paulino; Cuevas, Jaime; Montesinos-López, Osval; Jarquín, Diego; de Los Campos, Gustavo; Burgueño, Juan; González-Camacho, Juan M; Pérez-Elizalde, Sergio; Beyene, Yoseph; Dreisigacker, Susanne; Singh, Ravi; Zhang, Xuecai; Gowda, Manje; Roorkiwal, Manish; Rutkoski, Jessica; Varshney, Rajeev K

    2017-11-01

    Genomic selection (GS) facilitates the rapid selection of superior genotypes and accelerates the breeding cycle. In this review, we discuss the history, principles, and basis of GS and genomic-enabled prediction (GP) as well as the genetics and statistical complexities of GP models, including genomic genotype×environment (G×E) interactions. We also examine the accuracy of GP models and methods for two cereal crops and two legume crops based on random cross-validation. GS applied to maize breeding has shown tangible genetic gains. Based on GP results, we speculate how GS in germplasm enhancement (i.e., prebreeding) programs could accelerate the flow of genes from gene bank accessions to elite lines. Recent advances in hyperspectral image technology could be combined with GS and pedigree-assisted breeding. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Acquisition-Management Program

    NASA Technical Reports Server (NTRS)

    Avery, Don E.; Vann, A. Vernon; Jones, Richard H.; Rew, William E.

    1987-01-01

    The NASA Acquisition Management Subsystem (AMS) is an integrated, NASA-wide, standard automated procurement system developed in 1985. It is designed to provide each NASA installation with a procurement data base and on-line terminals for managing, tracking, reporting, and controlling contractual actions and associated procurement data. The subsystem provides control, status, and reporting for various procurement areas. The purpose of standardization is to decrease the costs of procurement and of automatic data processing operations, increase procurement productivity, furnish accurate on-line management information, and improve customer support. Written in ADABAS NATURAL.

  20. Breeding potential of elite Pee Dee germplasm in Upland cotton breeding programs

    USDA-ARS?s Scientific Manuscript database

    Successful plant breeding programs begin with parental line selection. Effective parental line selection is facilitated when the breeding potential of candidate parental lines is known. Using topcross families involving germplasm representing eight US public cotton breeding programs, we evaluated th...

  1. Voice Response System Statistics Program : Operational Handbook.

    DOT National Transportation Integrated Search

    1980-06-01

    This report documents the Voice Response System (VRS) Statistics Program developed for the preflight weather briefing VRS. It describes the VRS statistical report format and contents, the software program structure, and the program operation.

  2. Stan: Statistical inference

    NASA Astrophysics Data System (ADS)

    Stan Development Team

    2018-01-01

    Stan facilitates statistical inference at the frontiers of applied statistics and provides both a modeling language for specifying complex statistical models and a library of statistical algorithms for computing inferences with those models. These components are exposed through interfaces in environments such as R, Python, and the command line.

  3. Stan : A Probabilistic Programming Language

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carpenter, Bob; Gelman, Andrew; Hoffman, Matthew D.

    Stan is a probabilistic programming language for specifying statistical models. A Stan program imperatively defines a log probability function over parameters conditioned on specified data and constants. As of version 2.14.0, Stan provides full Bayesian inference for continuous-variable models through Markov chain Monte Carlo methods such as the No-U-Turn sampler, an adaptive form of Hamiltonian Monte Carlo sampling. Penalized maximum likelihood estimates are calculated using optimization methods such as the limited memory Broyden-Fletcher-Goldfarb-Shanno algorithm. Stan is also a platform for computing log densities and their gradients and Hessians, which can be used in alternative algorithms such as variational Bayes, expectation propagation, and marginal inference using approximate integration. To this end, Stan is set up so that the densities, gradients, and Hessians, along with intermediate quantities of the algorithm such as acceptance probabilities, are easily accessible. Stan can also be called from the command line using the cmdstan package, through R using the rstan package, and through Python using the pystan package. All three interfaces support sampling and optimization-based inference with diagnostics and posterior analysis. rstan and pystan also provide access to log probabilities, gradients, Hessians, parameter transforms, and specialized plotting.
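    A minimal sketch of calling Stan from Python is shown below, using the pystan 2-style interface mentioned in the text (pystan 3 and cmdstanpy expose different calls); the model, data, and sampler settings are illustrative only.

```python
# Estimate the mean and standard deviation of a small sample with Stan,
# called through the pystan 2-style interface described above.
import pystan

model_code = """
data {
  int<lower=0> N;
  vector[N] y;
}
parameters {
  real mu;
  real<lower=0> sigma;
}
model {
  mu ~ normal(0, 10);        // weakly informative priors
  sigma ~ cauchy(0, 2.5);
  y ~ normal(mu, sigma);     // likelihood
}
"""

data = {"N": 5, "y": [1.2, 0.8, 1.9, 1.4, 1.1]}
model = pystan.StanModel(model_code=model_code)       # compiles the C++ sampler
fit = model.sampling(data=data, iter=2000, chains=4)  # NUTS by default
print(fit)                                            # posterior summaries
```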

  4. Stan : A Probabilistic Programming Language

    DOE PAGES

    Carpenter, Bob; Gelman, Andrew; Hoffman, Matthew D.; ...

    2017-01-01

    Stan is a probabilistic programming language for specifying statistical models. A Stan program imperatively defines a log probability function over parameters conditioned on specified data and constants. As of version 2.14.0, Stan provides full Bayesian inference for continuous-variable models through Markov chain Monte Carlo methods such as the No-U-Turn sampler, an adaptive form of Hamiltonian Monte Carlo sampling. Penalized maximum likelihood estimates are calculated using optimization methods such as the limited memory Broyden-Fletcher-Goldfarb-Shanno algorithm. Stan is also a platform for computing log densities and their gradients and Hessians, which can be used in alternative algorithms such as variational Bayes, expectation propagation, and marginal inference using approximate integration. To this end, Stan is set up so that the densities, gradients, and Hessians, along with intermediate quantities of the algorithm such as acceptance probabilities, are easily accessible. Stan can also be called from the command line using the cmdstan package, through R using the rstan package, and through Python using the pystan package. All three interfaces support sampling and optimization-based inference with diagnostics and posterior analysis. rstan and pystan also provide access to log probabilities, gradients, Hessians, parameter transforms, and specialized plotting.

  5. Multiple-Line Inference of Selection on Quantitative Traits

    PubMed Central

    Riedel, Nico; Khatri, Bhavin S.; Lässig, Michael; Berg, Johannes

    2015-01-01

    Trait differences between species may be attributable to natural selection. However, quantifying the strength of evidence for selection acting on a particular trait is a difficult task. Here we develop a population genetics test for selection acting on a quantitative trait that is based on multiple-line crosses. We show that using multiple lines increases both the power and the scope of selection inferences. First, a test based on three or more lines detects selection with strongly increased statistical significance, and we show explicitly how the sensitivity of the test depends on the number of lines. Second, a multiple-line test can distinguish between different lineage-specific selection scenarios. Our analytical results are complemented by extensive numerical simulations. We then apply the multiple-line test to QTL data on floral character traits in plant species of the Mimulus genus and on photoperiodic traits in different maize strains, where we find a signature of lineage-specific selection not seen in two-line tests. PMID:26139839

  6. Investigation of the effects of storage time on the dimensional accuracy of impression materials using cone beam computed tomography

    PubMed Central

    2016-01-01

    PURPOSE The storage conditions of impressions affect the dimensional accuracy of the impression materials. The aim of the study was to assess the effects of storage time on dimensional accuracy of five different impression materials by cone beam computed tomography (CBCT). MATERIALS AND METHODS Polyether (Impregum), hydrocolloid (Hydrogum and Alginoplast), and silicone (Zetaflow and Honigum) impression materials were used for impressions taken from an acrylic master model. The impressions were poured and subjected to four different storage times: immediate use, and 1, 3, and 5 days of storage. Line 1 (between right and left first molar mesiobuccal cusp tips) and Line 2 (between right and left canine tips) were measured on a CBCT scanned model, and time dependent mean differences were analyzed by two-way univariate and Duncan's test (α=.05). RESULTS For Line 1, the total mean difference of Impregum and Hydrogum were statistically different from Alginoplast (P<.05), while Zetaflow and Honigum had smaller discrepancies. Alginoplast resulted in more difference than the other impressions (P<.05). For Line 2, the total mean difference of Impregum was statistically different from the other impressions. Significant differences were observed in Line 1 and Line 2 for the different storage periods (P<.05). CONCLUSION The dimensional accuracy of impression material is clinically acceptable if the impression material is stored in suitable conditions. PMID:27826388

  7. Investigation of the effects of storage time on the dimensional accuracy of impression materials using cone beam computed tomography.

    PubMed

    Alkurt, Murat; Yeşıl Duymus, Zeynep; Dedeoglu, Numan

    2016-10-01

    The storage conditions of impressions affect the dimensional accuracy of the impression materials. The aim of the study was to assess the effects of storage time on dimensional accuracy of five different impression materials by cone beam computed tomography (CBCT). Polyether (Impregum), hydrocolloid (Hydrogum and Alginoplast), and silicone (Zetaflow and Honigum) impression materials were used for impressions taken from an acrylic master model. The impressions were poured and subjected to four different storage times: immediate use, and 1, 3, and 5 days of storage. Line 1 (between right and left first molar mesiobuccal cusp tips) and Line 2 (between right and left canine tips) were measured on a CBCT scanned model, and time dependent mean differences were analyzed by two-way univariate and Duncan's test (α=.05). For Line 1, the total mean difference of Impregum and Hydrogum were statistically different from Alginoplast (P<.05), while Zetaflow and Honigum had smaller discrepancies. Alginoplast resulted in more difference than the other impressions (P<.05). For Line 2, the total mean difference of Impregum was statistically different from the other impressions. Significant differences were observed in Line 1 and Line 2 for the different storage periods (P<.05). The dimensional accuracy of impression material is clinically acceptable if the impression material is stored in suitable conditions.

  8. Surface inspection of flat products by means of texture analysis: on-line implementation using neural networks

    NASA Astrophysics Data System (ADS)

    Fernandez, Carlos; Platero, Carlos; Campoy, Pascual; Aracil, Rafael

    1994-11-01

    This paper describes some texture-based techniques that can be applied to quality assessment of continuously produced flat products (metal strips, wooden surfaces, cork, textile products, ...). Since the most difficult task is that of inspecting for product appearance, human-like inspection ability is required. A feature common to all these products is the presence of non-deterministic texture on their surfaces. Two main subjects are discussed: statistical techniques for both surface finishing determination and surface defect analysis, as well as real-time implementation for on-line inspection in high-speed applications. For surface finishing determination, a Gray Level Difference technique is presented that operates on low-resolution, that is, non-zoomed, images. Defect analysis is performed by means of statistical texture analysis over defective portions of the surface. On-line implementation is accomplished by means of neural networks. When a defect arises, textural analysis is applied, which results in a data vector acting as the input of a neural net previously trained in a supervised way. This approach aims to reach on-line performance in automated visual inspection applications when texture is present on flat product surfaces.
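
    The sketch below illustrates gray-level-difference statistics of the kind this record describes, computed with NumPy for a single pixel offset; the chosen features (mean, contrast, entropy) and the idea of feeding them to a small supervised classifier are illustrative assumptions, not the authors' exact pipeline.

```python
import numpy as np

def gld_features(img, dx=1, dy=0, levels=256):
    """Gray-level-difference statistics for one pixel offset (dx, dy)."""
    h, w = img.shape
    a = img[0:h - dy, 0:w - dx].astype(int)
    b = img[dy:h, dx:w].astype(int)
    diff = np.abs(a - b)
    hist = np.bincount(diff.ravel(), minlength=levels).astype(float)
    p = hist / hist.sum()                       # probability of each difference value
    d = np.arange(levels)
    mean = (p * d).sum()                        # average gray-level difference
    contrast = (p * d**2).sum()                 # second moment of the differences
    nz = p[p > 0]
    entropy = -(nz * np.log2(nz)).sum()         # randomness of the difference histogram
    return np.array([mean, contrast, entropy])

# Feature vectors like these, computed over defective image patches, would act
# as the input vector of a small supervised neural-network classifier.
```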

  9. Increasing consumer demand among Medicaid enrollees for tobacco dependence treatment: The Wisconsin Medicaid Covers It campaign

    PubMed Central

    Keller, Paula A.; Christiansen, Bruce; Kim, Su-Young; Piper, Megan E.; Redmond, Lezli; Adsit, Robert; Fiore, Michael C.

    2010-01-01

    Purpose Smoking prevalence among Medicaid enrollees is higher than in the general population, but use of evidence-based cessation treatment is low. We evaluated whether a communications campaign improved cessation treatment utilization. Design Quasi-experimental. Setting Wisconsin. Subjects Enrollees in the Wisconsin Family Medicaid program. The average monthly enrollment during the study period was approximately 170,000 individuals. Intervention Print materials for clinicians and consumers distributed to 13 health maintenance organizations (HMO) serving Wisconsin Medicaid HMO enrollees. Measures Wisconsin Medicaid pharmacy claims data for smoking cessation medications were analyzed before and after a targeted communications campaign. HMO enrollees were the intervention group. Fee-for-service enrollees were a quasi-experimental comparison group. Quit Line utilization data were also analyzed. Analysis Pharmacotherapy claims and number of registered quitline callers were compared pre- and post-campaign. Results Pre-campaign, cessation pharmacotherapy claims declined for the intervention group and increased slightly for the comparison group (t = 2.29, p = 0.03). Post-campaign, claims increased in both groups. However, the rate of increase in the intervention group was significantly greater than in the comparison group (t = −2.2, p = 0.04). A statistically significant increase was also seen in the average monthly number of Medicaid enrollees that registered for Quit Line services post-campaign compared to pre-campaign (F (1,22) = 7.19, p = 0.01). Conclusion This natural experiment demonstrated statistically significant improvements in both pharmacotherapy claims and Quit Line registrations among Medicaid enrollees. These findings may help inform other states’ efforts to improve cessation treatment utilization. PMID:21721965

  10. Corona performance of a compact 230-kV line

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chartier, V.L.; Blair, D.E.; Easley, M.D.

    Permitting requirements and the acquisition of new rights-of-way for transmission facilities have in recent years become increasingly difficult for most utilities, including Puget Sound Power and Light Company. In order to maintain a high degree of reliability of service while being responsive to public concerns regarding the siting of high voltage (HV) transmission facilities, Puget Power has found it necessary to more heavily rely upon the use of compact lines in franchise corridors. Compaction does, however, precipitate increased levels of audible noise (AN) and radio and TV interference (RI and TVI) due to corona on the conductors and insulator assemblies. Puget Power relies upon the Bonneville Power Administration (BPA) Corona and Field Effects computer program to calculate AN and RI for new lines. Since there was some question of the program's ability to accurately represent quiet 230-kV compact designs, a joint project was undertaken with BPA to verify the program's algorithms. Long-term measurements made on an operating Puget Power 230-kV compact line confirmed the accuracy of BPA's AN model; however, the RI measurements were much lower than predicted by the BPA and other programs. This paper also describes how the BPA computer program can be used to calculate the voltage needed to expose insulator assemblies to the correct electric field in single test setups in HV laboratories.

  11. Corona performance of a compact 230-kV line

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chartier, V.L.; Blair, D.E.; Easley, M.D.

    Permitting requirements and the acquisition of new rights-of-way for transmission facilities have in recent years become increasingly difficult for most utilities, including Puget Sound Power and Light Company. In order to maintain a high degree of reliability of service while being responsive to public concerns regarding the siting of high voltage (HV) transmission facilities, Puget Power has found it necessary to more heavily rely upon the use of compact lines in franchise corridors. Compaction does, however, precipitate increased levels of audible noise (AN) and radio and TV interference (RI and TVI) due to corona on the conductors and insulator assemblies. Puget Power relies upon the Bonneville Power Administration (BPA) Corona and Field Effects computer program to calculate AN and RI for new lines. Since there was some question of the program's ability to accurately represent quiet 230-kV compact designs, a joint project was undertaken with BPA to verify the program's algorithms. Long-term measurements made on an operating Puget Power 230-kV compact line confirmed the accuracy of BPA's AN model; however, the RI measurements were much lower than predicted by the BPA computer and other programs. This paper also describes how the BPA computer program can be used to calculate the voltage needed to expose insulator assemblies to the correct electric field in single test setups in HV laboratories.

  12. Preliminary observations of the SELENE Gamma Ray Spectrometer

    NASA Astrophysics Data System (ADS)

    Forni, O.; Diez, B.; Gasnault, O.; Munoz, B.; D'Uston, C.; Reedy, R. C.; Hasebe, N.

    2008-09-01

    Introduction We analyze the spectra measured by the Gamma Ray Spectrometer (GRS) on board the SELENE satellite [1]. SELENE was inserted into lunar orbit on 4 Oct. 2007. After passing through a health check and a function check, the GRS was shifted to nominal observation on 21 Dec. 2007. The spectra consist of various lines of interest (O, Mg, Al, Si, Ti, Ca, Fe, K, Th, U, and possibly H) superposed on a continuum. The energies of the gamma rays identify the nuclides responsible for the gamma ray emission and their intensities relate to their abundance. Data collected through 17 Feb. 2008 are studied here, corresponding to an accumulation time (Fig. 1) sufficiently long to allow preliminary mapping. Analysis of the global gamma ray spectrum In order to obtain spectra with counting statistics sufficient for peak analysis, we accumulate all observations. The identification of lines is performed on this global lunar spectrum (Fig. 2). Fit of individual lines The gamma ray lines that arise from decay of long-lived radioactive species are among the easiest to analyze. So far the abundance of two species is studied thanks to such lines: potassium (1461 keV) and thorium (2614 keV). Secondary neutrons from cosmic ray interactions also produce gamma rays when reacting with the planetary material, through scattering or absorption reactions. However these lines need substantial corrections before an interpretation in terms of abundance can be performed. Lines have been examined with different techniques. The simplest method consists of summing the spectra in a window containing the line of interest. The continuum is adjusted with a polynomial and removed. Such a method was used for the gamma ray spectra collected by Lunar Prospector [2]. This method is especially robust for isolated lines, such as those of K and Th mentioned above, or with very low statistics. The second method consists of fitting the lines by summing a quadratic continuum with Gaussian lines and exponential tails. We presently fit the spectra using a program developed at CESR: Aquarius. Afterwards the areas associated with the parameters of these ideal lines are calculated. This method is well-adapted for interfering lines, such as U, Al, and H around 2210 keV, but it requires good statistics. These two methods were used to analyze the Mars Odyssey gamma-ray spectra [3]. Prettyman et al. [4] applied a third method where theoretical spectra are simulated and matched against the observations. Below we propose a fourth approach based on statistical analyses. Mapping of elemental abundances Data returned by the spacecraft are time-tagged records acquired with a resolution of 17 seconds. The angular distance covered by the spacecraft during this interval corresponds to about 1° at the surface. However the true resolution of the instrument is lower because gamma rays come from all directions onto the spacecraft. The resolution is therefore set by the field of view of the instrument, which depends on the spacecraft altitude and the geometry of the instrument. The full width at half maximum of the instrumental response has been estimated to be 130 km at 1 MeV by the SELENE GRS team. We have tiled the data in agreement with the best resolution we could obtain depending on the intensity of a given line. The thorium line at 2614 keV was thus mapped at a resolution of 3° with the first method described above (sum over 2550-2640 keV). Then this map was smoothed with a 5° filter (152 km radius) to approximate the response function of the instrument. Finally the counting rate was converted into abundance (Fig. 3), using the compositions at landing sites and in the highlands as did Gillis et al. [5]. Statistical analysis We have also analysed the data with various multivariate techniques, one of them being Independent Component Analysis (ICA) [6, 7]. ICA defines a generative model for the observed multivariate data, which is typically given as a large database of samples. In the model, the data variables are assumed to be linear mixtures of some unknown latent variables, and the mixing system is also unknown. The latent variables are assumed non-Gaussian and mutually independent and they are called the independent components of the observed data. These independent components, also called sources or factors, can be found by ICA. This is done by maximising a non-Gaussianity criterion of the sources. As in [8], we have used the JADE algorithm developed and described in [9] for our analysis, which we focused on the energy range from 750 to 3000 keV. We identify at least three meaningful components. The first one is correlated with the thorium map (Fig. 4). The corresponding correlation coefficient spectrum exhibits the lines of thorium, potassium and uranium (Fig. 5). The second component (Fig. 6) is clearly correlated with iron, as shown by its corresponding spectrum (Fig. 5). A third component, identified at lower resolution, seems to be partly correlated with the altitude of the spacecraft (not shown). Further improvements in the data reduction, such as corrections for altitude, cosmic ray, and neutron current variations, should allow a better interpretation of the data. Acknowledgement. The SELENE GRS team members are: N. Hasebe, O. Okudaira, N. Yamashita, S. Kobayashi, Y. Karouji, M. Hareyama, S. Kodaira, S. Komatsu, K. Hayatsu, K. Iwabuchi, S. Nemoto, E. Shibamura, M.-N. Kobayashi, R.C. Reedy, K.J. Kim, C. d'Uston, S. Maurice, O. Gasnault, O. Forni, B. Diez. References. [1] Hasebe, N. et al. (2008) Earth, Planets and Space, 60, 299-312. [2] Lawrence, D.J. et al. (1999) Geophys. Res. Lett., 26 (17), 2681-2684. [3] Evans, L.E. et al. (2006) J. Geophys. Res., 111, E03S04. [4] Prettyman, T.H. et al. (2006) J. Geophys. Res., 111, E12007. [5] Gillis, J.J. et al. (2004) Geo. et Cosmo. Acta, 68 (18), 3791-3805. [6] Comon P. (1994) Signal Processing, 36, 287-314. [7] Hyvärinen, A. and E. Oja (2000) Neural Networks, 13(4-5), 411-430. [8] Forni O. et al. (2005) LPSC, 36, 1623. [9] Cardoso, J.-F. (1997) IEEE Letters on Signal Processing, 4, 112-114.
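
    As an illustration of the second fitting method mentioned in this record (a quadratic continuum plus a Gaussian line), the sketch below fits a synthetic spectrum with scipy.optimize.curve_fit; the energy window, counts, and parameter values are invented for illustration, and the exponential tails used in the actual analysis are omitted.

```python
import numpy as np
from scipy.optimize import curve_fit

def line_model(E, a0, a1, a2, amp, mu, sigma):
    """Quadratic continuum plus a single Gaussian line (tails omitted)."""
    continuum = a0 + a1 * E + a2 * E**2
    gaussian = amp * np.exp(-0.5 * ((E - mu) / sigma)**2)
    return continuum + gaussian

# Synthetic spectrum around the 1461 keV potassium line (illustrative only).
rng = np.random.default_rng(0)
E = np.linspace(1400.0, 1520.0, 120)
true = line_model(E, 50.0, -0.02, 0.0, 30.0, 1461.0, 4.0)
counts = rng.poisson(true).astype(float)

p0 = [50.0, 0.0, 0.0, 20.0, 1461.0, 5.0]         # rough initial guesses
popt, pcov = curve_fit(line_model, E, counts, p0=p0)
area = popt[3] * popt[5] * np.sqrt(2 * np.pi)    # Gaussian line area from amp and sigma
print("fitted line centre:", popt[4], "keV; line area:", area)
```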

  13. Designing Health Care Risk Management On-Line: Meeting Regulators' Concerns for Fixed-Hour Curriculum

    ERIC Educational Resources Information Center

    Hyer, Kathryn; Taylor, Heidi H.; Nanni, Kenneth

    2004-01-01

    This paper describes the experience of creating a continuing professional education on-line risk management program that is designed to meet Florida's educational requirements for licensure as a risk manager in health-care settings and details the challenges faced when the in-class didactic program of 15 eight-hour sessions is reformatted as an…

  14. Bridging child welfare and juvenile justice: preventing unnecessary detention of foster children.

    PubMed

    Ross, Timothy; Conger, Dylan; Armstrong, Molly

    2002-01-01

    Gaps in service coordination between child welfare and other child-serving agencies are well documented. This article examines the gap between the child welfare and juvenile justice systems and discusses a program, Project Confirm, designed to reduce the problems associated with this gap. Project Confirm aims to improve cooperation between juvenile justice and child welfare agencies to prevent the unnecessary detention of arrested foster children in secure facilities. The program's design is outlined, and implementation statistics and government partner perceptions of the program in its first year of operations are provided. The article also identifies future challenges to implementation and discusses the broader implications of the program. In making this appeal for coordination and collaboration among public agencies, I'm not talking about pro forma integration efforts-I'm not talking about memoranda of understanding or top-level reorganizations that have autonomous agencies vaguely reporting to a single box labeled "human services." Rather, I'm talking about the much more challenging objective of achieving real working partnerships at the front line. (Nelson, 1998, p. 6)

  15. Text line extraction in free style document

    NASA Astrophysics Data System (ADS)

    Shen, Xiaolu; Liu, Changsong; Ding, Xiaoqing; Zou, Yanming

    2009-01-01

    This paper addresses text line extraction in free style documents, such as business cards, envelopes, and posters. In a free style document, global properties such as character size and line direction can hardly be inferred, which reveals a grave limitation of traditional layout analysis. 'Line' is the most prominent and highest-level structure in our bottom-up method. First, we apply a novel intensity function based on gradient information to locate text areas, in which gradients within a window have large magnitudes and various directions, and split such areas into text pieces. We build a probability model of lines consisting of text pieces via statistics on training data. For an input image, we group text pieces into lines using a simulated annealing algorithm with a cost function based on the probability model.
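
    The grouping step in this record relies on simulated annealing over a cost function. The sketch below is a generic annealing skeleton of the kind such a step could use, not the authors' implementation; the cost and neighbour functions (for example, the negative log-probability of a grouping and a move of one text piece between lines) are supplied by the caller and are assumptions here.

```python
import math
import random

def simulated_annealing(initial_state, cost, neighbour,
                        t0=1.0, cooling=0.995, steps=10000):
    """Generic simulated annealing: worse states are accepted with probability
    exp(-delta / T), so early on the search can escape local minima."""
    state, c = initial_state, cost(initial_state)
    best, best_c = state, c
    t = t0
    for _ in range(steps):
        cand = neighbour(state)              # e.g. move one text piece to another line
        cand_c = cost(cand)                  # e.g. negative log-probability of the grouping
        delta = cand_c - c
        if delta < 0 or random.random() < math.exp(-delta / t):
            state, c = cand, cand_c
            if c < best_c:
                best, best_c = state, c
        t *= cooling                          # geometric cooling schedule
    return best, best_c
```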

  16. Interstellar lines in high resolution IUE spectra. Part 1: Groningen data reduction package and technical results

    NASA Astrophysics Data System (ADS)

    Gilra, D. P.; Pwa, T. H.; Arnal, E. M.; de Vries, J.

    1982-06-01

    In order to process and analyze high resolution IUE data on a large number of interstellar lines in a large number of images for a large number of stars, computer programs were developed for 115 lines in the short wavelength range and 40 in the long wavelength range. Programs include extraction, processing, plotting, averaging, and profile fitting. Wavelength calibration in high resolution spectra, fixed pattern noise, instrument profile and resolution, and the background problem in the region where orders are crowding are discussed. All the expected lines are detected in at least one spectrum.

  17. Knowledge of severe acute respiratory syndrome among community physicians, nurses, and emergency medical responders.

    PubMed

    Tice, Alan Douglas; Kishimoto, Mitsumasa; Dinh, Chuong Hoang; Lam, Geoffrey Tak-Kin; Marineau, Michelle

    2006-01-01

    The preparedness levels of front-line clinicians including physicians, nurses, emergency medical responders (EMRs), and other medical staff working in clinics, offices and ambulatory care centers must be assessed, so these personnel are able to deal with communicable and potentially lethal diseases, such as severe acute respiratory syndrome (SARS). In order to determine the knowledge of these clinicians, a survey of their understanding of SARS and their use of educational resources was administered. A questionnaire was distributed to physicians, nurses, and EMRs attending conferences on SARS in the summer of 2003. Questions related to information sources, knowledge of SARS, and plans implemented in their workplace to deal with it. Statistical analysis was performed using the Statistical Package for the Social Sciences (10.1 Program, SPSS Inc., Chicago, Illinois). A total of 201 community healthcare providers (HCPs) participated in the study. A total of 51% of the participants correctly identified the incubation period of SARS; 48% correctly identified the symptoms of SARS; and 60% knew the recommended infection control precautions to take for families. There was little difference in knowledge among the physicians, nurses, and EMRs evaluated. Media outlets such as newspapers, journals, television, and radio were reported as the main sources of information on SARS. However, there appears to be a growing use of the Internet, which correlated best with the correct answers on symptoms of SARS. Fewer than one-third of respondents were aware of a protocol for SARS in their workplace. A total of 60% reported that N-95 masks were available in their workplace. These findings suggest the need for more effective means of education and training for front-line clinicians, as well as the institution of policies and procedures in medical offices, clinics, and emergency services in the community.

  18. Problem-Solving Management Training Effects on Sales Productivity and Job Satisfaction.

    ERIC Educational Resources Information Center

    Ross, Paul C.; And Others

    Research suggests that effective organizational change must be led by line personnel rather than by outside consultants. The Performance Management Program (PMP) implemented in two Bell Telephone companies is a line-led, self-help program in which managers participate in problem-solving activities within their own jobs. Marketing and sales…

  19. Contemporary issues in HIM. The application layer--III.

    PubMed

    Wear, L L; Pinkert, J R

    1993-07-01

    We have seen document preparation systems evolve from basic line editors through powerful, sophisticated desktop publishing programs. This component of the application layer is probably one of the most used, and most readily identifiable. Ask grade school children nowadays, and many will tell you that they have written a paper on a computer. Next month will be a "fun" tour through a number of other application programs we find useful. They will range from a simple notebook reminder to a sophisticated photograph processor. Application layer: Software targeted for the end user, focusing on a specific application area, and typically residing in the computer system as distinct components on top of the OS. Desktop publishing: A document preparation program that begins with the text features of a word processor, then adds the ability for a user to incorporate outputs from a variety of graphic programs, spreadsheets, and other applications. Line editor: A document preparation program that manipulates text in a file on the basis of numbered lines. Word processor: A document preparation program that can, among other things, reformat sections of documents, move and replace blocks of text, use multiple character fonts, automatically create a table of contents and index, create complex tables, and combine text and graphics.

  20. 45 CFR 309.170 - What statistical and narrative reporting requirements apply to Tribal IV-D programs?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 2 2011-10-01 2011-10-01 false What statistical and narrative reporting... (IV-D) PROGRAM Statistical and Narrative Reporting Requirements § 309.170 What statistical and... organizations must submit the following information and statistics for Tribal IV-D program activity and caseload...

  1. 45 CFR 309.170 - What statistical and narrative reporting requirements apply to Tribal IV-D programs?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 2 2010-10-01 2010-10-01 false What statistical and narrative reporting... (IV-D) PROGRAM Statistical and Narrative Reporting Requirements § 309.170 What statistical and... organizations must submit the following information and statistics for Tribal IV-D program activity and caseload...

  2. A Selective Statistical Study of Transaction Activity in a Large On-Line Automated Circulation System. Final Report.

    ERIC Educational Resources Information Center

    Guthrie, Gerry D.

    The objective of this study was to provide the library community with basic statistical data from on-line activity in the Ohio State University Libraries' Circulation System. Over 1.6 million archive records in the circulation system for 1972 were investigated to produce subject reports of circulation activity, activity reports by collection…

  3. Probabilistic analysis and fatigue damage assessment of offshore mooring system due to non-Gaussian bimodal tension processes

    NASA Astrophysics Data System (ADS)

    Chang, Anteng; Li, Huajun; Wang, Shuqing; Du, Junfeng

    2017-08-01

    Both wave-frequency (WF) and low-frequency (LF) components of mooring tension are in principle non-Gaussian due to nonlinearities in the dynamic system. This paper conducts a comprehensive investigation of applicable probability density functions (PDFs) of mooring tension amplitudes used to assess mooring-line fatigue damage via the spectral method. Short-term statistical characteristics of mooring-line tension responses are firstly investigated, in which the discrepancy arising from Gaussian approximation is revealed by comparing kurtosis and skewness coefficients. Several distribution functions based on present analytical spectral methods are selected to express the statistical distribution of the mooring-line tension amplitudes. Results indicate that the Gamma-type distribution and a linear combination of Dirlik and Tovo-Benasciutti formulas are suitable for separate WF and LF mooring tension components. A novel parametric method based on nonlinear transformations and stochastic optimization is then proposed to increase the effectiveness of mooring-line fatigue assessment due to non-Gaussian bimodal tension responses. Using time domain simulation as a benchmark, its accuracy is further validated using a numerical case study of a moored semi-submersible platform.
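
    To illustrate the Gaussian-versus-non-Gaussian check this record describes (comparing skewness and kurtosis of the tension process), the sketch below builds a synthetic bimodal tension record with a weak nonlinearity and reports both coefficients with SciPy; the signal model and all numbers are invented for illustration only.

```python
import numpy as np
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(0)
t = np.arange(0.0, 3600.0, 0.5)                    # one hour sampled at 2 Hz

# Synthetic tension record: a slowly varying low-frequency (LF) drift component,
# a wave-frequency (WF) component, and a weak quadratic nonlinearity.
t_lf = t[::200]
lf = np.interp(t, t_lf, rng.normal(0.0, 1.0, t_lf.size))
wf = rng.normal(0.0, 1.0, t.size)
tension = 1000.0 + 80.0 * lf + 40.0 * wf + 5.0 * wf**2   # kN, illustrative only

# A Gaussian process has skewness 0 and excess kurtosis 0; clear departures
# suggest that Gaussian amplitude models may misestimate fatigue damage.
print("skewness:       ", skew(tension))
print("excess kurtosis:", kurtosis(tension))
```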

  4. THE CAUSAL ANALYSIS / DIAGNOSIS DECISION ...

    EPA Pesticide Factsheets

    CADDIS is an on-line decision support system that helps investigators in the regions, states and tribes find, access, organize, use and share information to produce causal evaluations in aquatic systems. It is based on the US EPA's Stressor Identification process, which is a formal method for identifying causes of impairments in aquatic systems. CADDIS 2007 increases access to relevant information useful for causal analysis and provides methods and tools that practitioners can use to analyze their own data. The new Candidate Cause section provides overviews of commonly encountered causes of impairments to aquatic systems: metals, sediments, nutrients, flow alteration, temperature, ionic strength, and low dissolved oxygen. CADDIS includes new Conceptual Models that illustrate the relationships from sources to stressors to biological effects. An Interactive Conceptual Model for phosphorus links the diagram with supporting literature citations. The new Analyzing Data section helps practitioners analyze their data sets and interpret and use those results as evidence within the USEPA causal assessment process. Downloadable tools include a graphical user interface statistical package (CADStat), programs for use with the freeware R statistical package, and a Microsoft Excel template. These tools can be used to quantify associations between causes and biological impairments using innovative methods such as species-sensitivity distributions, biological inferenc

  5. Activity Demands During Multi-Directional Team Sports: A Systematic Review.

    PubMed

    Taylor, Jeffrey B; Wright, Alexis A; Dischiavi, Steven L; Townsend, M Allison; Marmon, Adam R

    2017-12-01

    Late-stage rehabilitation programs often incorporate 'sport-specific' demands, but may not optimally simulate the in-game volume or intensity of such activities as sprinting, cutting, jumping, and lateral movement. The aim of this review was to characterize, quantify, and compare straight-line running and multi-directional demands during sport competition. A systematic review of PubMed, CINAHL, SPORTDiscus, and Cochrane Central Register of Controlled Trials databases was conducted. Studies that reported time-motion analysis data on straight-line running, accelerations/decelerations, activity changes, jumping, cutting, or lateral movement over the course of an entire competition in a multi-directional sport (soccer, basketball, lacrosse, handball, field hockey, futsal, volleyball) were included. Data was organized based on sport, age level, and sex and descriptive statistics of the frequency, intensity, time, and volume of the characteristics of running and multi-directional demands were extracted from each study. Eighty-one studies were included in the review (n = 47 soccer, n = 11 basketball, n = 9 handball, n = 7 field hockey, n = 3 futsal, n = 4 volleyball). Variability of sport demand data was found across sports, sexes, and age levels. Specifically, soccer and field hockey demanded the most volume of running, while basketball required the highest ratio of high-intensity running to sprinting. Athletes change activity between 500 and 3000 times over the course of a competition, or once every 2-4 s. Studies of soccer reported the most frequent cutting (up to 800 per game), while studies of basketball reported the highest frequency of lateral movement (up to 450 per game). Basketball (42-56 per game), handball (up to 90 per game), and volleyball (up to 35 per game) were found to require the most jumping. These data may provide an incomplete view of an athlete's straight-line running load, considering that only competition and not practice data was provided. Considerable variability exists in the demands of straight-line running and multi-directional demands across sports, competition levels, and sexes, indicating the need for sports medicine clinicians to design future rehabilitation programs with improved specificity (including the type of activity and dosage) to these demands.

  6. Laboratory data manipulation tools basic data handling programs. Volume 2: Detailed software/hardware documentation

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The set of computer programs described allows for data definition, data input, and data transfer between the LSI-11 microcomputers and the VAX-11/780 minicomputer. Program VAXCOM allows for a simple method of textual file transfer from the LSI to the VAX. Program LSICOM allows for easy file transfer from the VAX to the LSI. Program TTY changes the LSI-11 operator's console to the LSI's printing device. Program DICTIN provides a means for defining a data set for input to either computer. Program DATAIN is a simple-to-operate data entry program which is capable of building data files on either machine. Program LEDITV is an extremely powerful, easy-to-use, line-oriented text editor. Program COPYSBF is designed to print out textual files on the line printer without character loss from FORTRAN carriage control or wide record transfer.

  7. NPLOT: an Interactive Plotting Program for NASTRAN Finite Element Models

    NASA Technical Reports Server (NTRS)

    Jones, G. K.; Mcentire, K. J.

    1985-01-01

    NPLOT (NASTRAN Plot) is an interactive computer graphics program for plotting undeformed and deformed NASTRAN finite element models. Developed at NASA's Goddard Space Flight Center, the program provides flexible element selection and grid point, ASET, and SPC degree-of-freedom labelling. It is easy to use and provides a combination menu- and command-driven user interface. NPLOT also provides very fast hidden-line and haloed-line algorithms. The hidden-line algorithm in NPLOT proved to be both very accurate and several times faster than other existing hidden-line algorithms. A fast spatial bucket sort and horizon edge computation are used to achieve this high level of performance. The hidden-line and haloed-line algorithms are the primary features that make NPLOT different from other plotting programs.

  8. Microjets in the penumbra of a sunspot

    NASA Astrophysics Data System (ADS)

    Drews, Ainar; Rouppe van der Voort, Luc

    2017-06-01

    Context. Penumbral microjets (PMJs) are short-lived jets found in the penumbra of sunspots, first observed in wide-band Ca II H line observations as localized brightenings, and are thought to be caused by magnetic reconnection. Earlier work on PMJs has focused on smaller samples of by-eye selected events and case studies. Aims: It is our goal to present an automated study of a large sample of PMJs to place the basic statistics of PMJs on a sure footing and to study the PMJ Ca II 8542 Å spectral profile in detail. Methods: High spatial resolution and spectrally well-sampled observations in the Ca II 8542 Å line obtained from the Swedish 1-m Solar Telescope (SST) were reduced by a principal component analysis and subsequently used in the automated detection of PMJs using the simple machine learning algorithm k-nearest neighbour. PMJ detections were verified with co-temporal Ca II H line observations. Results: We find a total of 453 tracked PMJ events, 4253 PMJ detections tallied over all timeframes, and a detection rate of 21 events per timestep. From these, an average length, width and lifetime of 640 km, 210 km and 90 s are obtained. The average PMJ Ca II 8542 Å line profile is characterized by enhanced inner wings, often in the form of one or two distinct peaks, and a brighter line core as compared to the quiet-Sun average. Average blue and red peak positions are determined at -10.4 km s-1 and +10.2 km s-1 offsets from the Ca II 8542 Å line core. We find several clusters of PMJ hot-spots within the sunspot penumbra, in which PMJ events occur in the same general area repeatedly over time. Conclusions: Our results indicate smaller average PMJ sizes and longer lifetimes compared to previously published values, but with statistics still in the same orders of magnitude. The investigation and analysis of the PMJ line profiles strengthen the proposed heating of PMJs to transition region temperatures. The presented statistics on PMJs form a solid basis for future investigations and numerical modelling of PMJs.
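
    The detection step in this record (dimensionality reduction of the spectral profiles followed by k-nearest-neighbour classification against verified events) can be sketched with scikit-learn as below; the file names, number of components, and neighbour count are placeholders, not values from the study.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical training set: each row is a Ca II 8542 Å spectral profile for
# one pixel, labelled 1 for a verified PMJ and 0 otherwise (assumed files).
X_train = np.load("profiles_train.npy")     # shape (n_pixels, n_wavelengths)
y_train = np.load("labels_train.npy")
X_new = np.load("profiles_new.npy")         # profiles from a new timeframe

pca = PCA(n_components=10)                  # reduce each profile to a few components
Z_train = pca.fit_transform(X_train)
Z_new = pca.transform(X_new)

knn = KNeighborsClassifier(n_neighbors=5)   # simple k-nearest-neighbour classifier
knn.fit(Z_train, y_train)
pmj_mask = knn.predict(Z_new)               # 1 where a profile resembles known PMJs
print("candidate PMJ pixels:", int(pmj_mask.sum()))
```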

  9. Cognitive and attitudinal predictors related to graphing achievement among pre-service elementary teachers

    NASA Astrophysics Data System (ADS)

    Szyjka, Sebastian P.

    The purpose of this study was to determine the extent to which six cognitive and attitudinal variables predicted pre-service elementary teachers' performance on line graphing. Predictors included Illinois teacher education basic skills sub-component scores in reading comprehension and mathematics, logical thinking performance scores, as well as measures of attitudes toward science, mathematics and graphing. This study also determined the strength of the relationship between each prospective predictor variable and the line graphing performance variable, as well as the extent to which measures of attitude towards science, mathematics and graphing mediated relationships between scores on mathematics, reading, logical thinking and line graphing. Ninety-four pre-service elementary education teachers enrolled in two different elementary science methods courses during the spring 2009 semester at Southern Illinois University Carbondale participated in this study. Each subject completed five different instruments designed to assess science, mathematics and graphing attitudes as well as logical thinking and graphing ability. Sixty subjects provided copies of primary basic skills score reports that listed subset scores for both reading comprehension and mathematics. The remaining scores were supplied by a faculty member who had access to a database from which the scores were drawn. Seven subjects, whose scores could not be found, were eliminated from final data analysis. Confirmatory factor analysis (CFA) was conducted in order to establish validity and reliability of the Questionnaire of Attitude Toward Line Graphs in Science (QALGS) instrument. CFA tested the statistical hypothesis that the five main factor structures within the Questionnaire of Attitude Toward Statistical Graphs (QASG) would be maintained in the revised QALGS. Stepwise Regression Analysis with backward elimination was conducted in order to generate a parsimonious and precise predictive model. This procedure allowed the researcher to explore the relationships among the affective and cognitive variables that were included in the regression analysis. The results for CFA indicated that the revised QALGS measure was sound in its psychometric properties when tested against the QASG. Reliability statistics indicated that the overall reliability for the 32 items in the QALGS was .90. The learning preferences construct had the lowest reliability (.67), while enjoyment (.89), confidence (.86) and usefulness (.77) constructs had moderate to high reliabilities. The first four measurement models fit the data well as indicated by the appropriate descriptive and statistical indices. However, the fifth measurement model did not fit the data well statistically, and only fit well with two descriptive indices. The results addressing the research question indicated that mathematical and logical thinking ability were significant predictors of line graph performance among the remaining group of variables. These predictors accounted for 41% of the total variability on the line graph performance variable. Partial correlation coefficients indicated that mathematics ability accounted for 20.5% of the variance on the line graphing performance variable when removing the effect of logical thinking. The logical thinking variable accounted for 4.7% of the variance on the line graphing performance variable when removing the effect of mathematics ability.

  10. Lip line preference for variant face types.

    PubMed

    Anwar, Nabila; Fida, Mubassar

    2012-06-01

    To determine the effect of an altered lip line on attractiveness and to find the preferred lip line for vertical face types in both genders. Cross-sectional analytical study. The Aga Khan University Hospital, Karachi, from May to July 2009. Photographs of two selected subjects were altered to produce three face types for the same individual with the aim of keeping the frame of the smile constant. The lip line was then altered for both subjects as follows: both dentitions visible, upper incisors visible, upper incisors with 2 mm of gum visible, and upper incisors with 4 mm of gum visible. The pictures were rated by different professionals for attractiveness. Descriptive statistics for the raters and a multiple-factor ANOVA were used to find the most attractive lip line. The total number of raters was 100, with a mean age of 30.3 ± 8 years. The alterations in the smile parameters produced a statistically significant difference in the attractiveness of faces, whereas the perception difference was found to be insignificant amongst raters of different professions. The preferred lip line was the one showing only the upper incisors in dolicho- and mesofacial subjects of both genders, whereas 2 mm of gum show was preferred in brachyfacial subjects. The variability in lip line showed a significant difference in perceived attractiveness. The preferred lip line was the one showing only the upper incisors in dolicho- and mesofacial subjects of both genders, whereas 2 mm of gum show was preferred in brachyfacial subjects.

  11. Determining the Nature of [CII] 158 Micron Emission: an Improved Star Formation Rate Indicator

    NASA Astrophysics Data System (ADS)

    Sutter, Jessica; Dale, Daniel A.; KINGFISH Team

    2018-06-01

    The brightest observed emission line from most normal star-forming galaxies is the 158 micron line arising from singly-ionized carbon (also known as C+ or CII). In fact, astronomers have recently begun using the bright emission line to detect and characterize galaxies in the furthermost reaches of the universe. It is thus imperative that we have the tools to fully understand how this emission line could be utilized as an indicator of star formation rate, a primary parameter by which galaxies and their constituent star-forming regions are characterized. There are two main challenges to utilizing the [CII] 158 micron line as a star formation rate indicator. First, advances in long-wavelength astronomical instrumentation have only recently enabled its detection in statistically-significant samples of galaxies. Second, it is both a blessing and a curse that singly-ionized carbon can be created in both star-forming regions (ionized HII regions) and in non-star forming regions (neutral photo-dissociation regions). In order to better understand and quantify the [CII] emission as an indicator of star-formation rate, the relationship between the [NII] 205 micron emission, which can only arise from the ionized interstellar medium (ISM), and the [CII] 158 micron emission has been employed to determine the fraction of [CII] emission that originates from each phase of the ISM. Sub-kiloparsec measurements of the [NII] 205 micron line in nearby galaxies have recently become available as part of the KINGFISH program. We use these two far-infrared lines along with the full suite of KINGFISH panchromatic data to present an improved calibration of the [CII] emission line as a star formation rate indicator.

  12. GARLIC, A SHIELDING PROGRAM FOR GAMMA RADIATION FROM LINE- AND CYLINDER- SOURCES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roos, M.

    1959-06-01

    GARLIC is a program for computing the gamma ray flux or dose rate at a shielded isotropic point detector, due to a line source or the line equivalent of a cylindrical source. The source strength distribution along the line must be either uniform or an arbitrary part of the positive half-cycle of a cosine function. The line source can be oriented arbitrarily with respect to the main shield and the detector, except that the detector must not be located on the line source or on its extension. The main shield is a homogeneous plane slab in which scattered radiation is accounted for by multiplying each point element of the line source by a point source buildup factor inside the integral over the point elements. Between the main shield and the line source additional shields can be introduced, which are either plane slabs, parallel to the main shield, or cylindrical rings, coaxial with the line source. Scattered radiation in the additional shields can only be accounted for by constant buildup factors outside the integral. GARLIC-xyz is an extended version particularly suited for the frequently met problem of shielding a room containing a large number of line sources in different positions. The program computes the angles and linear dimensions of a problem for GARLIC when the positions of the detector point and the end points of the line source are given as points in an arbitrary rectangular coordinate system. As an example the isodose curves in water are presented for a monoenergetic cosine-distributed line source at several source energies and for an operating fuel element of the Swedish reactor R3. (auth)
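
    The point-kernel integral described here (summing point elements of the line source, each attenuated through the slab and weighted by a buildup factor inside the integral) can be sketched numerically as below. The geometry, the linear buildup factor, and all numbers are illustrative assumptions, not GARLIC's actual model or units.

```python
import numpy as np

def line_source_flux(src_a, src_b, det, mu, slab_thickness, n_elems=200,
                     buildup=lambda mux: 1.0 + mux):
    """Flux at a point detector from a uniform line source behind a plane slab.

    The line from src_a to src_b is split into point elements; each contributes
    B(mu*x) * exp(-mu*x) / (4*pi*r**2) per unit source strength, where x is the
    slant path through the slab (normal along the x axis) and r the distance to
    the detector.  The linear buildup factor B = 1 + mu*x is only a placeholder.
    """
    src_a, src_b, det = (np.asarray(v, dtype=float) for v in (src_a, src_b, det))
    ts = (np.arange(n_elems) + 0.5) / n_elems
    points = src_a + ts[:, None] * (src_b - src_a)      # centres of the point elements
    dl = np.linalg.norm(src_b - src_a) / n_elems        # element length (uniform source)
    vecs = det - points
    r = np.linalg.norm(vecs, axis=1)
    cos_theta = np.abs(vecs[:, 0]) / r                  # angle to the slab normal (x axis)
    mux = mu * slab_thickness / np.clip(cos_theta, 1e-6, None)   # optical slant path
    contrib = buildup(mux) * np.exp(-mux) / (4.0 * np.pi * r**2)
    return float(contrib.sum() * dl)

# Example (cm units): a 2 m line source along y, a 10 cm slab (mu = 0.2 /cm),
# detector 1 m beyond the slab on the x axis.
print(line_source_flux([0, -100, 0], [0, 100, 0], [100, 0, 0],
                       mu=0.2, slab_thickness=10.0))
```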

  13. Multiple linear regression analysis

    NASA Technical Reports Server (NTRS)

    Edwards, T. R.

    1980-01-01

    Program rapidly selects best-suited set of coefficients. User supplies only vectors of independent and dependent data and specifies confidence level required. Program uses stepwise statistical procedure for relating minimal set of variables to set of observations; final regression contains only most statistically significant coefficients. Program is written in FORTRAN IV for batch execution and has been implemented on NOVA 1200.
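
    The record describes a stepwise procedure that retains only the most statistically significant coefficients. A rough Python analogue (the original program is FORTRAN IV) using backward elimination on p-values with statsmodels is sketched below; the data, variable names, and the 0.05 cut-off are illustrative assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def backward_elimination(X, y, alpha=0.05):
    """Backward stepwise regression: repeatedly drop the least significant
    predictor until every remaining p-value is below alpha."""
    keep = list(X.columns)
    while keep:
        model = sm.OLS(y, sm.add_constant(X[keep])).fit()
        pvals = model.pvalues.drop("const")
        worst = pvals.idxmax()
        if pvals[worst] < alpha:
            return model, keep
        keep.remove(worst)                 # discard the least significant predictor
    return None, []

# Illustrative data: three candidate predictors, only two of which matter.
rng = np.random.default_rng(1)
X = pd.DataFrame(rng.normal(size=(200, 3)), columns=["x1", "x2", "x3"])
y = 2.0 * X["x1"] - 1.5 * X["x2"] + rng.normal(scale=0.5, size=200)

model, kept = backward_elimination(X, y)
print("retained predictors:", kept)
```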

  14. Adult Basic and Secondary Education Program Statistics. Fiscal Year 1976.

    ERIC Educational Resources Information Center

    Cain, Sylvester H.; Whalen, Barbara A.

    Reports submitted to the National Center for Education Statistics provided data for this compilation and tabulation of data on adult participants in U.S. educational programs in fiscal year 1976. In the summary section introducing the charts, it is noted that adult education programs funded under P.L. 91-230 served over 1.6 million persons--an…

  15. LINE-1 couples EMT programming with acquisition of oncogenic phenotypes in human bronchial epithelial cells.

    PubMed

    Reyes-Reyes, Elsa M; Aispuro, Ivan; Tavera-Garcia, Marco A; Field, Matthew; Moore, Sara; Ramos, Irma; Ramos, Kenneth S

    2017-11-28

    Although several lines of evidence have established the central role of epithelial-to-mesenchymal-transition (EMT) in malignant progression of non-small cell lung cancers (NSCLCs), the molecular events connecting EMT to malignancy remain poorly understood. This study presents evidence that Long Interspersed Nuclear Element-1 (LINE-1) retrotransposon couples EMT programming with malignancy in human bronchial epithelial cells (BEAS-2B). This conclusion is supported by studies showing that: 1) activation of EMT programming by TGF-β1 increases LINE-1 mRNAs and protein; 2) the lung carcinogen benzo(a)pyrene coregulates TGF-β1 and LINE-1 mRNAs, with LINE-1 positioned downstream of TGF-β1 signaling; and 3) forced expression of LINE-1 in BEAS-2B cells recapitulates EMT programming and induces malignant phenotypes and tumorigenesis in vivo. These findings identify a TGFβ1-LINE-1 axis as a critical effector pathway that can be targeted for the development of precision therapies during malignant progression of intractable NSCLCs.

  16. TIERRAS: A package to simulate high energy cosmic ray showers underground, underwater and under-ice

    NASA Astrophysics Data System (ADS)

    Tueros, Matías; Sciutto, Sergio

    2010-02-01

    In this paper we present TIERRAS, a Monte Carlo simulation program based on the well-known AIRES air shower simulation system that enables the propagation of particle cascades underground, providing a tool to study particles arriving underground from a primary cosmic ray in the atmosphere or to initiate cascades directly underground and propagate them, exiting into the atmosphere if necessary. We show several cross-checks of its results against CORSIKA, FLUKA, GEANT and ZHS simulations and we make some considerations regarding its possible use and limitations. The first results of full underground shower simulations are presented, as an example of the package's capabilities. Program summary: Program title: TIERRAS for AIRES Catalogue identifier: AEFO_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFO_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 36 489 No. of bytes in distributed program, including test data, etc.: 3 261 669 Distribution format: tar.gz Programming language: Fortran 77 and C Computer: PC, Alpha, IBM, HP, Silicon Graphics and Sun workstations Operating system: Linux, DEC Unix, AIX, SunOS, Unix System V RAM: 22 Mbytes Classification: 1.1 External routines: TIERRAS requires AIRES 2.8.4 to be installed on the system. AIRES 2.8.4 can be downloaded from http://www.fisica.unlp.edu.ar/auger/aires/eg_AiresDownload.html. Nature of problem: Simulation of high and ultra high energy underground particle showers. Solution method: Modification of the AIRES 2.8.4 code to accommodate underground conditions. Restrictions: In AIRES some processes that are not statistically significant in the atmosphere are not simulated. In particular, it does not include muon photonuclear processes. This imposes a limitation on the application of this package to a depth of 1 km of standard rock (or 2.5 km of water equivalent). Neutrinos are not tracked in the simulation, but their energy is taken into account in decays. Running time: A TIERRAS for AIRES run of a 10 eV shower with statistical sampling (thinning) below 10 eV and 0.2 weight factor (see [1]) uses approximately 1 h of CPU time on an Intel Core 2 Quad Q6600 at 2.4 GHz. It uses only one core, so 4 simultaneous simulations can be run on this computer. AIRES includes a spooling system to run several simultaneous jobs of any type. References: S. Sciutto, AIRES 2.6 User Manual, http://www.fisica.unlp.edu.ar/auger/aires/.

  17. CUGatesDensity—Quantum circuit analyser extended to density matrices

    NASA Astrophysics Data System (ADS)

    Loke, T.; Wang, J. B.

    2013-12-01

    CUGatesDensity is an extension of the original quantum circuit analyser CUGates (Loke and Wang, 2011) [7] to provide explicit support for the use of density matrices. The new package enables simulation of quantum circuits involving statistical ensemble of mixed quantum states. Such analysis is of vital importance in dealing with quantum decoherence, measurements, noise and error correction, and fault tolerant computation. Several examples involving mixed state quantum computation are presented to illustrate the use of this package. Catalogue identifier: AEPY_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEPY_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 5368 No. of bytes in distributed program, including test data, etc.: 143994 Distribution format: tar.gz Programming language: Mathematica. Computer: Any computer installed with a copy of Mathematica 6.0 or higher. Operating system: Any system with a copy of Mathematica 6.0 or higher installed. Classification: 4.15. Nature of problem: To simulate arbitrarily complex quantum circuits comprised of single/multiple qubit and qudit quantum gates with mixed state registers. Solution method: A density matrix representation for mixed states and a state vector representation for pure states are used. The construct is based on an irreducible form of matrix decomposition, which allows a highly efficient implementation of general controlled gates with multiple conditionals. Running time: The examples provided in the notebook CUGatesDensity.nb take approximately 30 s to run on a laptop PC.
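
    The mixed-state bookkeeping that such a density-matrix extension has to perform can be illustrated in a few lines of NumPy (the package itself is written in Mathematica): a statistical mixture is stored as rho, a gate acts as rho -> U rho U†, and measurement probabilities come from Tr(P rho). The particular mixture and gate below are arbitrary.

```python
import numpy as np

# Density matrix of a statistical mixture: 50% |0>, 50% |+>.
ket0 = np.array([1.0, 0.0])
plus = np.array([1.0, 1.0]) / np.sqrt(2.0)
rho = 0.5 * np.outer(ket0, ket0) + 0.5 * np.outer(plus, plus)

# A unitary gate acts on a mixed state as rho -> U rho U^dagger.
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)   # Hadamard gate
rho_out = H @ rho @ H.conj().T

# Measurement probability of outcome |0> is Tr(P0 rho); a purity Tr(rho^2) < 1
# signals a genuinely mixed state.
P0 = np.outer(ket0, ket0)
print("P(0) after H:", np.trace(P0 @ rho_out).real)
print("purity:      ", np.trace(rho_out @ rho_out).real)
```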

  18. Collecting maple sap with unvented spouts, using aerial and ground lines

    Treesearch

    H. Clay Smith; Carter B. Gibbs

    1971-01-01

    Two methods of using plastic tubing to collect sugar maple sap were tried: aerial lines and ground lines. Unvented spouts were used in both. We found that the sap yields collected from the aerial and ground lines were not statistically different from each other.

  19. Diversity of Poissonian populations.

    PubMed

    Eliazar, Iddo I; Sokolov, Igor M

    2010-01-01

    Populations represented by collections of points scattered randomly on the real line are ubiquitous in science and engineering. The statistical modeling of such populations leads naturally to Poissonian populations: Poisson processes on the real line with a distinguished maximal point. Poissonian populations are infinite objects underlying key issues in statistical physics, probability theory, and random fractals. Due to their infiniteness, measuring the diversity of Poissonian populations depends on the lower-bound cut-off applied. This research characterizes the classes of Poissonian populations whose diversities are invariant with respect to the cut-off level applied and establishes an elemental connection between these classes and extreme-value theory. The measures of diversity considered are variance and dispersion, Simpson's index and inverse participation ratio, Shannon's entropy and Rényi's entropy, and Gini's index.
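
    For a finite abundance vector, the diversity measures named in this record can be computed directly; the NumPy sketch below does so, without attempting the cut-off-dependent analysis for infinite Poissonian populations that the paper actually studies. The example abundances are arbitrary.

```python
import numpy as np

def diversity_measures(abundances, renyi_q=2.0):
    """Common diversity measures for a finite population with given abundances."""
    x = np.asarray(abundances, dtype=float)
    p = x / x.sum()                                         # relative abundances
    simpson = (p**2).sum()                                  # Simpson's index
    ipr = 1.0 / simpson                                     # inverse participation ratio
    shannon = -(p[p > 0] * np.log(p[p > 0])).sum()          # Shannon entropy
    renyi = np.log((p**renyi_q).sum()) / (1.0 - renyi_q)    # Renyi entropy of order q
    xs = np.sort(x)                                         # ascending for the Gini formula
    n = xs.size
    gini = (2.0 * np.arange(1, n + 1) - n - 1).dot(xs) / (n * xs.sum())
    return dict(simpson=simpson, ipr=ipr, shannon=shannon, renyi=renyi, gini=gini)

print(diversity_measures([5, 3, 1, 1]))
```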

  20. Development and Evaluation of an On-Line Educational Module for Volunteer Leaders on Bio-Security in Washington State 4-H Livestock Projects

    ERIC Educational Resources Information Center

    Stevenson, Jill L.; Moore, Dale A.; Newman, Jerry; Schmidt, Janet L.; Smith, Sarah M.; Smith, Jean; Kerr, Susan; Wallace, Michael; BoyEs, Pat

    2011-01-01

    An on-line module on disease prevention was created for 4-H volunteer leaders who work with livestock projects in Washington to better prepare them to teach youth about bio-security and its importance in 4-H livestock projects. Evaluation of the module and usage statistics since the module's debut were collected and evaluated. The module increases…

  1. On the apparent velocity of integrated sunlight. 2: 1983-1992 and comparisons with magnetograms

    NASA Technical Reports Server (NTRS)

    Deming, Drake; Plymate, Claude

    1994-01-01

    We report additional results in our program to monitor the wavelength stability of lines in the 2.3 micrometer spectrum of integrated sunlight. We use the McMath Fourier transform spectrometer (FTS) of the National Solar Observatory to monitor 16 Δv = 2 lines of ¹²C¹⁶O, as well as five atomic lines. Wavenumber calibration is achieved using a low-pressure N2O absorption cell and checked against terrestrial atmospheric lines. Imperfect optical integration of the solar disk remains the principal source of error, but this error has been reduced by improved FTS/telescope collimation and observing procedures. The present results include data from an additional 13 quarterly observing runs since 1985. We continue to find that the apparent velocity of integrated sunlight is variable, in the sense of having a greater redshift at solar maximum. This is supported by the temporal dependence of the integrated light velocity, and by the presence of a correlation between velocity and the disk-averaged magnetic flux derived from Kitt Peak magnetograms. The indicated peak-to-peak apparent velocity amplitude over a solar cycle is approximately the same as the velocity amplitude of the Sun's motion about the solar system barycenter. This represents about half the amplitude which we inferred in Paper I (Deming et al. 1987), but the present result has a much greater statistical significance. Our results have implications for those investigations which search for the Doppler signatures of planetary-mass companions to solar-type stars. We contrast our results to the recent finding by McMillan et al. 1993 that solar absorption lines in the violet spectral region are wavelength-stable over the solar cycle.

  2. On the apparent velocity of integrated sunlight. 2: 1983-1992 and comparisons with magnetograms

    NASA Astrophysics Data System (ADS)

    Deming, Drake; Plymate, Claude

    1994-05-01

    We report additional results in our program to monitor the wavelength stability of lines in the 2.3 micrometer spectrum of integrated sunlight. We use the McMath Fourier transform spectrometer (FTS) of the National Solar Observatory to monitor 16 Δv = 2 lines of ¹²C¹⁶O, as well as five atomic lines. Wavenumber calibration is achieved using a low-pressure N2O absorption cell and checked against terrestrial atmospheric lines. Imperfect optical integration of the solar disk remains the principal source of error, but this error has been reduced by improved FTS/telescope collimation and observing procedures. The present results include data from an additional 13 quarterly observing runs since 1985. We continue to find that the apparent velocity of integrated sunlight is variable, in the sense of having a greater redshift at solar maximum. This is supported by the temporal dependence of the integrated light velocity, and by the presence of a correlation between velocity and the disk-averaged magnetic flux derived from Kitt Peak magnetograms. The indicated peak-to-peak apparent velocity amplitude over a solar cycle is approximately the same as the velocity amplitude of the Sun's motion about the solar system barycenter. This represents about half the amplitude which we inferred in Paper I (Deming et al. 1987), but the present result has a much greater statistical significance. Our results have implications for those investigations which search for the Doppler signatures of planetary-mass companions to solar-type stars. We contrast our results to the recent finding by McMillan et al. 1993 that solar absorption lines in the violet spectral region are wavelength-stable over the solar cycle.

  3. Case Study: Collaborative Creation of an On-Line Degree Program

    ERIC Educational Resources Information Center

    Stewart, Barbara L.; Norwood, Marcella; Ezell, Shirley; Waight, Consuelo

    2006-01-01

    Faculty collaboratively developed an on-line Bachelor of Science degree in Consumer Science and Merchandising (CSM). Part-time faculty and technical support services supported the four-member team. Small grants assisted in the creation and redesign of all CSM major courses for on-line delivery. Issues of appropriate learning strategies, student…

  4. The implementation of an integrated on-line health education system at RMIT.

    PubMed

    Zylinski, J; Allan, G L; Jamieson, P; Maher, K P; Green, R; Hislop, J

    1998-06-01

    The Faculty of Biomedical and Health Sciences at RMIT has been developing an on-line health education system using a systems thinking approach, to create a learning environment whose basis is supported by Information Technology (IT). The centre-piece of this system is the Faculty Learning Centre, which has been created, both in space and layout, to promote collaborative learning between the students, so that the educator is physically assimilated with the student body. This facility is supplemented by the Faculty WWW server, which has been the main vehicle for course material dissemination to students. To ensure an effective on-line teaching environment, the position of an on-line facilitator has been created, whose responsibilities include both the continual evaluation of the system and the implementation of appropriate system changes. Aspects have included the production of a staff development training program and extensive user documentation. This paper discusses the systems thinking approach used to implement this integrated on-line system, and the establishment of explicit educational rationales in the use of IT to support learning strategies. Some examples of the on-line educational programs are also presented.

  5. An On-Line Virtual Environment for Teaching Statistical Sampling and Analysis

    ERIC Educational Resources Information Center

    Marsh, Michael T.

    2009-01-01

    Regardless of the related discipline, students in statistics courses invariably have difficulty understanding the connection between the numerical values calculated for end-of-the-chapter exercises and their usefulness in decision making. This disconnect is, in part, due to the lack of time and opportunity to actually design the experiments and…

  6. Seismicity map tools for earthquake studies

    NASA Astrophysics Data System (ADS)

    Boucouvalas, Anthony; Kaskebes, Athanasios; Tselikas, Nikos

    2014-05-01

    We report on the development of a new online set of tools for use within Google Maps for earthquake research. We demonstrate this server-based online platform (developed with PHP, JavaScript, and MySQL) and the new tools using a database system with earthquake data. The platform allows us to carry out statistical and deterministic analysis of earthquake data within Google Maps and to plot various seismicity graphs. The toolbox has been extended to draw line segments on the map, multiple horizontal and vertical straight lines, and multiple circles, including geodesic lines. The application is demonstrated using localized seismic data from the geographic region of Greece as well as other global earthquake data. The application also offers regional segmentation (NxN), which allows the study of earthquake clustering and of earthquake cluster shifts between segments in space. The platform offers many filters, such as for plotting selected magnitude ranges or time periods. The plotting facility supports statistically based plots such as cumulative earthquake magnitude plots and earthquake magnitude histograms, calculation of the 'b' value, etc. What is novel about the platform is the additional deterministic tools. Using the newly developed horizontal and vertical line and circle tools, we have studied the spatial distribution trends of many earthquakes, and we show here for the first time a link between Fibonacci numbers and the spatiotemporal location of some earthquakes. The new tools are valuable for examining and visualizing trends in earthquake research, as they allow calculation of statistics as well as deterministic precursors. We plan to show many new results based on our newly developed platform.
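
    As an illustration of two of the statistics mentioned (not code from the authors' PHP/JavaScript platform), the Python sketch below computes cumulative magnitude counts and a Gutenberg-Richter b-value using the standard Aki maximum-likelihood estimator; the catalogue is a synthetic toy.

        # Sketch of two statistics of the kind mentioned above: cumulative magnitude
        # counts and the b-value via the Aki (1965) maximum-likelihood estimator.
        import numpy as np

        def b_value(magnitudes, completeness_mag):
            """Aki maximum-likelihood b-value for events at or above the completeness magnitude.
            (For binned catalogues a half-bin correction to completeness_mag is often applied.)"""
            m = np.asarray(magnitudes, dtype=float)
            m = m[m >= completeness_mag]
            return np.log10(np.e) / (m.mean() - completeness_mag)

        def cumulative_counts(magnitudes, step=0.1):
            """N(>=M) over a range of thresholds M, suitable for a log-linear frequency plot."""
            m = np.asarray(magnitudes, dtype=float)
            thresholds = np.arange(m.min(), m.max() + step, step)
            counts = np.array([(m >= t).sum() for t in thresholds])
            return thresholds, counts

        # Toy catalogue drawn from an exponential magnitude distribution (b close to 1).
        rng = np.random.default_rng(1)
        catalogue = 2.0 + rng.exponential(scale=1.0 / np.log(10), size=5000)
        print("estimated b:", round(b_value(catalogue, completeness_mag=2.0), 2))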

  7. Pinning time statistics for vortex lines in disordered environments.

    PubMed

    Dobramysl, Ulrich; Pleimling, Michel; Täuber, Uwe C

    2014-12-01

    We study the pinning dynamics of magnetic flux (vortex) lines in a disordered type-II superconductor. Using numerical simulations of a directed elastic line model, we extract the pinning time distributions of vortex line segments. We compare different model implementations for the disorder in the surrounding medium: discrete, localized pinning potential wells that are either attractive and repulsive or purely attractive, and whose strengths are drawn from a Gaussian distribution; as well as continuous Gaussian random potential landscapes. We find that both schemes yield power-law distributions in the pinned phase as predicted by extreme-event statistics, yet they differ significantly in their effective scaling exponents and their short-time behavior.

  8. A search for spectral lines in gamma-ray bursts using TGRS

    NASA Astrophysics Data System (ADS)

    Kurczynski, P.; Palmer, D.; Seifert, H.; Teegarden, B. J.; Gehrels, N.; Cline, T. L.; Ramaty, R.; Hurley, K.; Madden, N. W.; Pehl, R. H.

    1998-05-01

    We present the results of an ongoing search for narrow spectral lines in gamma-ray burst data. TGRS, the Transient Gamma-Ray Spectrometer aboard the Wind satellite, is a high-energy-resolution Ge device. Thus it is uniquely situated among the array of space-based, burst-sensitive instruments to look for line features in gamma-ray burst spectra. Our search strategy adopts a two-tiered approach. An automated `quick look' scan searches spectra for statistically significant deviations from the continuum. We analyzed all possible time accumulations of spectra as well as individual spectra for each burst. Follow-up analysis of potential line candidates uses model fitting with F-tests and χ2 tests for statistical significance.
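
    A minimal sketch of such a 'quick look' scan is given below; it is not the TGRS pipeline, only a Python illustration that fits a low-order polynomial continuum to a binned count spectrum and flags bins whose Poisson-scaled residuals exceed a chosen significance. Candidate features found this way would then go to the model-fitting stage with F-test and χ2 statistics, as described above.

        # Hedged sketch of a 'quick look' line scan (not the actual TGRS pipeline).
        import numpy as np

        def quick_look_scan(energy, counts, poly_order=3, n_sigma=4.0):
            """Flag bins deviating from a smooth continuum fit, assuming Poisson errors."""
            coeffs = np.polyfit(energy, counts, poly_order)        # crude continuum model
            continuum = np.polyval(coeffs, energy)
            sigma = np.sqrt(np.clip(continuum, 1.0, None))         # Poisson error estimate
            significance = (counts - continuum) / sigma
            return np.where(np.abs(significance) > n_sigma)[0], significance

        # Toy spectrum: smooth continuum plus one narrow emission-like feature near 511 keV.
        rng = np.random.default_rng(2)
        energy = np.linspace(100.0, 1000.0, 450)                   # keV, arbitrary binning
        continuum_true = 300.0 - 0.2 * energy
        line = 80.0 * np.exp(-0.5 * ((energy - 511.0) / 3.0) ** 2)
        counts = rng.poisson(continuum_true + line).astype(float)

        candidates, _ = quick_look_scan(energy, counts)
        print("candidate line bins near (keV):", np.round(energy[candidates], 1))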

  9. [Factors associated with adherence to school meals by adolescents in State public schools in Colombo, Paraná State, Brazil].

    PubMed

    Valentim, Emanuele de Araujo; Almeida, Claudia Choma Bettega de; Taconeli, César Augusto; Osório, Mônica Maria; Schmidt, Suely Teresinha

    2017-10-26

    This study aimed to estimate the prevalence of adherence to school meals and associated factors among adolescent schoolchildren (N = 1,569). The adolescents completed an on-line questionnaire on adherence to school meals, and their parents answered another questionnaire on socioeconomic data. The chi-square test was used to assess the association between adherence to school meals and gender, nutritional status, per capita family income, maternal schooling, adolescents' opinions on the dining hall layout, whether they considered school meals healthy, and consumption of other foods. Variables with statistical significance for adherence to school meals were included in the multilevel proportional odds logistic regression model. The covariates comprising the final model were defined by backward selection. The results of the adjusted model were presented as odds ratios with respective 95% confidence intervals (95%CI). Prevalence of adherence to school meals was low, especially effective adherence (19.8%). Adherence was associated with per capita family income less than one minimum wage, lower consumption of foods outside of school meals, the fact that adolescents considered the dining hall space adequate, and believing that school meals are healthy. Adherence to school meals in this study falls short of universal coverage for the program. Different factors contribute to incomplete program implementation, which may hinder achieving the food and nutritional security policy under the Brazilian National School Feeding Program (PNAE).

  10. A Model of Human Cognitive Behavior in Writing Code for Computer Programs. Volume 1

    DTIC Science & Technology

    1975-05-01

    nearly all programming languages, each line of code actually involves a great many decisions - basic statement types, variable and expression choices...labels, etc. - and any heuristic which evaluates code on the basis of a single decision is not likely to have sufficient power. Only the use of plans...recalculated in the following line because It was needed again. The second reason is that there are some decisions about the structure of a program

  11. Statistics in Japanese universities.

    PubMed Central

    Ito, P K

    1979-01-01

    The teaching of statistics in the U.S. and Japanese universities is briefly reviewed. It is found that H. Hotelling's articles and subsequent relevant publications on the teaching of statistics have contributed to a considerable extent to the establishment of excellent departments of statistics in U.S. universities and colleges. Today the U.S. may be proud of many well-staffed and well-organized departments of theoretical and applied statistics with excellent undergraduate and graduate programs. On the contrary, no Japanese universities have an independent department of statistics at present, and the teaching of statistics has been spread among a heterogeneous group of departments of application. This was mainly due to the Japanese government regulation concerning the establishment of a university. However, it has recently been revised so that an independent department of statistics may be started in a Japanese university with undergraduate and graduate programs. It is hoped that discussions will be started among those concerned on the question of organization of the teaching of statistics in Japanese universities as soon as possible. PMID:396154

  12. Electronic transport of bilayer graphene with asymmetry line defects

    NASA Astrophysics Data System (ADS)

    Zhao, Xiao-Ming; Wu, Ya-Jie; Chen, Chan; Liang, Ying; Kou, Su-Peng

    2016-11-01

    In this paper, we study the quantum properties of a bilayer graphene with (asymmetry) line defects. The localized states are found around the line defects. Thus, the line defects on one certain layer of the bilayer graphene can lead to an electric transport channel. By adding a bias potential along the direction of the line defects, we calculate the electric conductivity of bilayer graphene with line defects using the Landauer-Büttiker theory, and show that the channel affects the electric conductivity remarkably by comparing the results with those in a perfect bilayer graphene. This one-dimensional line electric channel has the potential to be applied in nanotechnology engineering. Project supported by the National Basic Research Program of China (Grant Nos. 2011CB921803 and 2012CB921704), the National Natural Science Foundation of China (Grant Nos. 11174035, 11474025, 11504285, and 11404090), the Specialized Research Fund for the Doctoral Program of Higher Education, China, the Fundamental Research Funds for the Central Universities, China, the Scientific Research Program Fund of the Shaanxi Provincial Education Department, China (Grant No. 15JK1363), and the Young Talent Fund of University Association for Science and Technology in Shaanxi Province, China.

  13. Statistical Fine Structure in the Inhomogeneously Broadened Electronic Origin of Pentacene in p-Terphenyl.

    DTIC Science & Technology

    1988-01-29

    Statistical Fine Structure in the Inhomogeneously Broadened Electronic Origin of Pentacene in p-Terphenyl, by T. P. Carter, M. Manavi, and W. E. Moerner; prepared for publication in the Journal of Chemical Physics. ...of pentacene in p-terphenyl using laser FM spectroscopy. Statistical fine structure is time-independent structure on the inhomogeneous line caused by...

  14. The Anatomy of Program Design for an On-Line Business Management Course

    ERIC Educational Resources Information Center

    Barger, Bonita

    2008-01-01

    How does one design an on-line course to bridge theory and practice? How can on-going stakeholder feedback (from students and administration) be incorporated into the design process to enhance quality? This paper presents the theoretical underpinnings of designing an on-line management course recognized as best practice for a "well organized…

  15. Modelling the line-of-sight contribution in substructure lensing

    NASA Astrophysics Data System (ADS)

    Despali, Giulia; Vegetti, Simona; White, Simon D. M.; Giocoli, Carlo; van den Bosch, Frank C.

    2018-04-01

    We investigate how Einstein rings and magnified arcs are affected by small-mass dark-matter haloes placed along the line of sight to gravitational lens systems. By comparing the gravitational signature of line-of-sight haloes with that of substructures within the lensing galaxy, we derive a mass-redshift relation that allows us to rescale the detection threshold (i.e. lowest detectable mass) for substructures to a detection threshold for line-of-sight haloes at any redshift. We then quantify the line-of-sight contribution to the total number density of low-mass objects that can be detected through strong gravitational lensing. Finally, we assess the degeneracy between substructures and line-of-sight haloes of different mass and redshift to provide a statistical interpretation of current and future detections, with the aim of distinguishing between cold dark matter and warm dark matter. We find that line-of-sight haloes statistically dominate with respect to substructures, by an amount that strongly depends on the source and lens redshifts, and on the chosen dark-matter model. Substructures represent about 30 percent of the total number of perturbers for low lens and source redshifts (as for the SLACS lenses), but less than 10 per cent for high-redshift systems. We also find that for data with high enough signal-to-noise ratio and angular resolution, the non-linear effects arising from a double-lens-plane configuration are such that one is able to observationally recover the line-of-sight halo redshift with an absolute error precision of 0.15 at the 68 per cent confidence level.

  16. Statistical Investigation of Supersonic Downflows in the Transition Region above Sunspots

    NASA Astrophysics Data System (ADS)

    Samanta, Tanmoy; Tian, Hui; Prasad Choudhary, Debi

    2018-06-01

    Downflows at supersonic speeds have been observed in the transition region (TR) above sunspots for more than three decades. These downflows are often seen in different TR spectral lines above sunspots. We have performed a statistical investigation of these downflows using a large sample that was missing previously. The Interface Region Imaging Spectrograph (IRIS) has provided a wealth of observational data of sunspots at high spatial and spectral resolutions in the past few years. We have identified 60 data sets obtained with IRIS raster scans. Using an automated code, we identified the locations of strong downflows within these sunspots. We found that around 80% of our sample shows supersonic downflows in the Si IV 1403 Å line. These downflows mostly appear in the penumbral regions, though some of them are found in the umbrae. We also found that almost half of these downflows show signatures in chromospheric lines. Furthermore, a detailed spectral analysis was performed by selecting a small spectral window containing the O IV 1400/1401 Å and Si IV 1403 Å lines. Six Gaussian functions were simultaneously fitted to these three spectral lines and their satellite lines associated with the supersonic downflows. We calculated the intensity, Doppler velocity, and line width for these lines. Using the O IV 1400/1401 Å line ratio, we find that the downflow components are around one order of magnitude less dense than the regular components. Results from our statistical analysis suggest that these downflows may originate from the corona and that they are independent of the background TR plasma.

  17. 76 FR 82322 - Agency Information Collection Activities; Submission for OMB Review; Comment Request; Mass Layoff...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-30

    ... for OMB Review; Comment Request; Mass Layoff Statistics Program ACTION: Notice. SUMMARY: The... request (ICR) titled, ``Mass Layoff Statistics Program,'' to the Office of Management and Budget (OMB) for... Statistics (BLS). Title of Collection: Mass Layoff Statistics Program. OMB Control Number: 1220-0090...

  18. USSR Report: Machine Tools and Metalworking Equipment.

    DTIC Science & Technology

    1986-01-23

    between satellite stop and the camshaft of the programmer unit. The line has 23 positions, including 12 automatic ones. Specification of line Number...technological processes, automated research, etc.) are as follows: a monochannel based on a shared trunk line, ring, star and tree (polychannel...line or ring networks based on decentralized control of data exchange between subscribers are very robust. A tree-form network has star structure

  19. Toward computer simulation of high-LET in vitro survival curves.

    PubMed

    Heuskin, A-C; Michiels, C; Lucas, S

    2013-09-21

    We developed a Monte Carlo based computer program called MCSC (Monte Carlo Survival Curve) able to predict the survival fraction of cells irradiated in vitro with a broad beam of high linear energy transfer particles. Three types of cell responses are studied: the usual high dose response, the bystander effect and the low-dose hypersensitivity (HRS). The program models the broad beam irradiation and double strand break distribution following Poisson statistics. The progression of cells through the cell cycle is taken into account while the repair takes place. Input parameters are experimentally determined for A549 lung carcinoma cells irradiated with 10 and 20 keV µm⁻¹ protons, 115 keV µm⁻¹ alpha particles and for EAhy926 endothelial cells exposed to 115 keV µm⁻¹ alpha particles. Results of simulations are presented and compared with experimental survival curves obtained for A549 and EAhy926 cells. Results are in good agreement with experimental data for both cell lines and all irradiation protocols. The benefits of MCSC are several: the gain of time that would have been spent performing time-consuming clonogenic assays, the capacity to estimate the survival fraction of cell lines not forming colonies and, possibly, the evaluation of radiosensitivity parameters of given individuals.
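
    A toy Monte Carlo illustrating the Poisson statistics of double-strand breaks mentioned above is sketched below in Python; it is only a schematic analogue of this approach, not the MCSC program, and its parameter values are illustrative assumptions.

        # Toy Monte Carlo of Poisson-distributed double-strand breaks (DSBs) per cell;
        # a schematic analogue only, not the MCSC program. Parameter values are
        # illustrative assumptions, not experimentally determined inputs.
        import numpy as np

        def survival_fraction(dose_gy, dsb_per_gy=35.0, lethal_fraction=0.03,
                              n_cells=200_000, seed=0):
            """Fraction of cells with zero lethal (misrepaired) lesions after irradiation."""
            rng = np.random.default_rng(seed)
            dsbs = rng.poisson(dsb_per_gy * dose_gy, size=n_cells)   # DSBs per cell
            lethal = rng.binomial(dsbs, lethal_fraction)             # lethal subset of DSBs
            return np.mean(lethal == 0)

        for dose in [0.5, 1, 2, 4, 6]:
            print(f"{dose} Gy: S = {survival_fraction(dose):.3f}")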

  20. Analysis of reference transactions using packaged computer programs.

    PubMed

    Calabretta, N; Ross, R

    1984-01-01

    Motivated by a continuing education class attended by the authors on the measurement of reference desk activities, the reference department at Scott Memorial Library initiated a project to gather data on reference desk transactions and to analyze the data by using packaged computer programs. The programs utilized for the project were SPSS (Statistical Package for the Social Sciences) and SAS (Statistical Analysis System). The planning, implementation and development of the project are described.

  1. Real-Time Noise Removal for Line-Scanning Hyperspectral Devices Using a Minimum Noise Fraction-Based Approach

    PubMed Central

    Bjorgan, Asgeir; Randeberg, Lise Lyngsnes

    2015-01-01

    Processing line-by-line and in real time can be convenient for some applications of line-scanning hyperspectral imaging technology. Some types of processing, like inverse modeling and spectral analysis, can be sensitive to noise. The MNF (minimum noise fraction) transform provides suitable denoising performance, but requires full image availability for the estimation of image and noise statistics. In this work, a modified algorithm is proposed. Incrementally updated statistics enable the algorithm to denoise the image line-by-line. The denoising performance has been compared to conventional MNF and found to be equal. With a satisfactory denoising performance and a real-time implementation, the developed algorithm can denoise line-scanned hyperspectral images in real time. The elimination of waiting time before denoised data are available is an important step towards real-time visualization of processed hyperspectral data. The source code can be found at http://www.github.com/ntnu-bioopt/mnf. This includes an implementation of conventional MNF denoising. PMID:25654717
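
    The released code is at the URL above; the Python sketch below is only an illustration (an assumption of this rewrite, not the authors' implementation) of how image and noise statistics can be accumulated line by line and then turned into an MNF-style transform via a generalized eigenproblem.

        # Sketch of line-by-line accumulation of image and noise statistics for an
        # MNF-style transform; not the authors' released code.
        import numpy as np
        from scipy.linalg import eigh

        class IncrementalMNF:
            def __init__(self, n_bands):
                self.n = 0
                self.sum = np.zeros(n_bands)
                self.outer = np.zeros((n_bands, n_bands))        # sum of x x^T (signal)
                self.noise_outer = np.zeros((n_bands, n_bands))  # sum of d d^T (noise proxy)
                self.n_noise = 0

            def add_line(self, line):
                """line: (n_pixels, n_bands) spectra from one scan line."""
                self.n += line.shape[0]
                self.sum += line.sum(axis=0)
                self.outer += line.T @ line
                d = np.diff(line, axis=0)            # neighbour differences as noise estimate
                self.noise_outer += d.T @ d
                self.n_noise += d.shape[0]

            def transform(self):
                mean = self.sum / self.n
                cov = self.outer / self.n - np.outer(mean, mean)
                noise_cov = self.noise_outer / (2 * self.n_noise)
                # generalized eigenproblem: maximize signal-to-noise ratio per component
                _, vecs = eigh(cov, noise_cov)
                return mean, vecs[:, ::-1]           # components ordered by decreasing SNR

        # Toy usage: accumulate a few random lines, then build the transform. Denoising a
        # new line would project it onto the leading components and back (omitted here).
        rng = np.random.default_rng(5)
        mnf = IncrementalMNF(n_bands=20)
        for _ in range(50):
            mnf.add_line(rng.normal(size=(100, 20)))
        mean, components = mnf.transform()
        print(components.shape)                      # (20, 20): one component per band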

  2. Non-Operational Property Evaluation for the Hanford Site River Corridor - 12409

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lowe, John; Aly, Alaa

    2012-07-01

    The Hanford Site River Corridor consists of the former reactor areas of the 100 Areas and the former industrial (fuel processing) area in the 300 Area. Most of the waste sites are located close to the decommissioned reactors or former industrial facilities along the Columbia River. Most of the surface area of the River Corridor consists of land with little or no subsurface infrastructure or indication of past or present releases of hazardous constituents, and is referred to as non-operational property or non-operational area. Multiple lines of evidence have been developed to assess identified fate and transport mechanisms and to evaluate the potential magnitude and significance of waste site-related contaminants in the non-operational area. Predictive modeling was used for determining the likelihood of locating waste sites and evaluating the distribution of radionuclides in soil based on available soil concentration data and aerial radiological surveys. The results of this evaluation indicated: (1) with the exception of stack emissions, transport pathways associated with waste site contaminants are unlikely to result in dispersion of contaminants in soil away from operational areas; (2) stack emissions that may have been associated with Hanford Site operations generally emitted short-lived and/or gaseous radionuclides; and (3) the likelihood of detecting elevated radionuclide concentrations or other waste sites in non-operational area soils is very small. The overall conclusions from the NPE evaluation of the River Corridor are:
    - With the exception of stack emissions to the air, transport pathways associated with waste site contaminants are unlikely to result in dispersion of contaminants in soil away from operational areas. While pathways such as windblown dust, overland transport and biointrusion have the potential for dispersing waste site contaminants, the resulting transport is unlikely to result in substantial contamination in non-operational areas.
    - Stack emissions that may have been associated with Hanford Site operations generally emitted short-lived and/or gaseous radionuclides; these radionuclides either would have decayed and would be undetectable in soil, or likely would not have deposited onto Hanford Site soils. A small fraction of the total historical emissions consisted of long-lived particulate radionuclides, which could have deposited onto the soil. Soil monitoring studies conducted as part of surveillance and monitoring programs do not indicate a build-up of radionuclide concentrations in soil, which might indicate potential deposition impacts from stack emissions. Aerial radiological surveys of the Hanford Site, while effective in detecting gamma-emitting nuclides, also do not indicate deposition patterns in soil from stack emissions.
    - The surveillance and monitoring programs also have verified that the limited occurrence of biointrusion observed in the River Corridor has not resulted in a spread of contamination into the non-operational areas.
    - Monitoring of radionuclides in ambient air conducted as part of the surveillance and monitoring programs generally shows a low and declining trend of detected concentrations in air. Monitoring of radionuclides in soil and vegetation correspondingly shows declining trends in concentrations, particularly for nuclides with short half-lives (Cs-137, Co-60 and Sr-90).
    - Statistical analysis of the geographical distribution of waste sites based on man-made features and topography describes the likely locations of waste sites in the River Corridor. The results from this analysis reinforce the findings from the Orphan Site Evaluation program, which has systematically identified any remaining waste sites within the River Corridor.
    - Statistical analysis of the distribution of radionuclide concentrations observable from aerial surveys has confirmed that the likelihood of detecting elevated radionuclide concentrations in non-operational area soils is very small; the occurrences and locations where potentially elevated concentrations may be found are discussed below. In addition, statistical analysis showed that there is a relatively high probability (>50%) that concentrations of Cs-137 higher than background (3.9 Bq/kg or 1.05 pCi/g) are located outside of the operational portion of the 100-BC, 100-K, and 100-N Areas. This observation is based on modeled concentrations in soil derived from aerial radiography data. However, the extent is limited to a few meters from the respective facilities' fence lines or known operational activities. Evaluation of the extent of contamination is being conducted as part of the RI process for each decision area. No unanticipated waste sites were identified either from the OSE program or from statistical analysis of waste site proximity to known features. Based on the evaluation of these multiple lines of evidence, the likelihood of identifying waste sites or contaminant dispersal from Hanford Site operations into non-operational areas can be considered very small. (authors)

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rasca, Anthony P.; Chen, James; Pevtsov, Alexei A., E-mail: anthony.rasca.ctr@nrl.navy.mil

    Recent observations of the photosphere using high spatial and temporal resolution show small dynamic features at or below the current resolving limits. A new pixel dynamics method has been developed to analyze spectral profiles and quantify changes in line displacement, width, asymmetry, and peakedness of photospheric absorption lines. The algorithm evaluates variations of line profile properties in each pixel and determines the statistics of such fluctuations averaged over all pixels in a given region. The method has been used to derive statistical characteristics of pixel fluctuations in observed quiet-Sun regions, an active region with no eruption, and an active region with an ongoing eruption. Using Stokes I images from the Vector Spectromagnetograph (VSM) of the Synoptic Optical Long-term Investigations of the Sun (SOLIS) telescope on 2012 March 13, variations in line width and peakedness of Fe i 6301.5 Å are shown to have a distinct spatial and temporal relationship with an M7.9 X-ray flare in NOAA 11429. This relationship is observed as stationary and contiguous patches of pixels adjacent to a sunspot exhibiting intense flattening in the line profile and line-center displacement as the X-ray flare approaches peak intensity, which is not present in area scans of the non-eruptive active region. The analysis of pixel dynamics allows one to extract quantitative information on differences in plasma dynamics on sub-pixel scales in these photospheric regions. The analysis can be extended to include the Stokes parameters and study signatures of vector components of magnetic fields and coupled plasma properties.

  4. [Comparison of preventive effects of two kinds of dental handpieces on viral contamination at different rotating times].

    PubMed

    Hu, Tao; Zuo, Yu-ling; Zhou, Xue-dong

    2004-08-01

    It has been demonstrated that when a high-speed handpiece stops rotating, negative pressure will form. Thus, contaminating fluid carrying many kinds of bacteria and viruses from the external environment will retract into various compartments of the handpiece and the dental unit. The purpose of the study was to compare the preventive effect of antisuction-designed handpieces and conventional handpieces on viral contamination at different numbers of rotations. Twenty handpieces with or without an antisuction device (10 of each) were used in the study. Each handpiece was submerged into a 10⁻⁶ µg/µl HBV particle solution and rotated 5 or 10 times, respectively (each rotation lasting 10 seconds). Samples were obtained from the water line and chip air line of the handpieces and examined by RT-PCR. At the same number of rotations, there was a statistically significant difference in viral concentration between the two kinds of handpieces (P < 0.05). However, there was no statistically significant difference in viral concentration between different numbers of rotations within each group (P > 0.05). Contamination of both the water and air lines of the dental handpieces was not enhanced by increasing the number of rotations of the handpiece. The antisuction devices installed in the water line and chip air line were demonstrated to prevent viral contamination effectively.

  5. Statistics on Social Work Education in the United States: 1978.

    ERIC Educational Resources Information Center

    Rubin, Allen, Comp.; Whitcomb, G. Robert, Comp.

    The document consists of statistical tables which characterize social work education in the United States in 1978. Data were supplied by all Council on Social Work Education accredited graduate programs and all but three undergraduate programs. Six sections comprise the document. Section I analyzes the regional distribution of 264 institutions,…

  6. Methods for estimating selected low-flow frequency statistics and mean annual flow for ungaged locations on streams in North Georgia

    USGS Publications Warehouse

    Gotvald, Anthony J.

    2017-01-13

    The U.S. Geological Survey, in cooperation with the Georgia Department of Natural Resources, Environmental Protection Division, developed regional regression equations for estimating selected low-flow frequency and mean annual flow statistics for ungaged streams in north Georgia that are not substantially affected by regulation, diversions, or urbanization. Selected low-flow frequency statistics and basin characteristics for 56 streamgage locations within north Georgia and 75 miles beyond the State’s borders in Alabama, Tennessee, North Carolina, and South Carolina were combined to form the final dataset used in the regional regression analysis. Because some of the streamgages in the study recorded zero flow, the final regression equations were developed using weighted left-censored regression analysis to analyze the flow data in an unbiased manner, with weights based on the number of years of record. The set of equations includes the annual minimum 1- and 7-day average streamflow with the 10-year recurrence interval (referred to as 1Q10 and 7Q10), monthly 7Q10, and mean annual flow. The final regional regression equations are functions of drainage area, mean annual precipitation, and relief ratio for the selected low-flow frequency statistics and drainage area and mean annual precipitation for mean annual flow. The average standard error of estimate was 13.7 percent for the mean annual flow regression equation and ranged from 26.1 to 91.6 percent for the selected low-flow frequency equations. The equations, which are based on data from streams with little to no flow alterations, can be used to provide estimates of the natural flows for selected ungaged stream locations in the area of Georgia north of the Fall Line. The regression equations are not to be used to estimate flows for streams that have been altered by the effects of major dams, surface-water withdrawals, groundwater withdrawals (pumping wells), diversions, or wastewater discharges. The regression equations should be used only for ungaged sites with drainage areas between 1.67 and 576 square miles, mean annual precipitation between 47.6 and 81.6 inches, and relief ratios between 0.146 and 0.607; these are the ranges of the explanatory variables used to develop the equations. An attempt was made to develop regional regression equations for the area of Georgia south of the Fall Line by using the same approach used during this study for north Georgia; however, the resulting equations had high average standard errors of estimate and poorly predicted flows below 0.5 cubic foot per second, which may be attributed to the karst topography common in that area. The final regression equations developed from this study are planned to be incorporated into the U.S. Geological Survey StreamStats program. StreamStats is a Web-based geographic information system that provides users with access to an assortment of analytical tools useful for water-resources planning and management, and for engineering design applications, such as the design of bridges. The StreamStats program provides streamflow statistics and basin characteristics for U.S. Geological Survey streamgage locations and ungaged sites of interest. StreamStats also can compute basin characteristics and provide estimates of streamflow statistics for ungaged sites when users select the location of a site along any stream in Georgia.

  7. Operational effectiveness of blended e-learning program for nursing research ethics.

    PubMed

    Cho, Kap-Chul; Shin, Gisoo

    2014-06-01

    Since 2006, the Korean Ministry of Education, Science and Technology, and the National Research Foundation of Korea have taken the lead in developing an institutional guideline for research ethics. The purpose was to identify the effectiveness of the Good Research Practice program, developed on a fund granted by the National Research Foundation of Korea, for nurses and nursing students, whose knowledge and perception of research ethics were compared before and after the program was implemented. This study was conducted to compare the levels of knowledge and perception of research ethics in the participants before and after the program was implemented. The participants included 45 nurses and 69 nursing students from hospitals, colleges of nursing, and the Korean Nurses Association, located in Seoul, Korea. This study was approved by the Institutional Review Board in Korea. Based on the Analysis, Design, Development, Implementation, and Evaluation model, the Good Research Practice program consisted of a total of 30 h of blended learning, both online and off-line. The results of this study showed that there were statistically significant differences in both knowledge and perception of research ethics in nursing students and nurses before and after the program had been implemented. The concepts of professional nursing ethics, moral issues, and bioethics were often confused with one another and not clearly defined. Therefore, the concept and scope of bioethics, moral judgment, and overall nursing ethics should be well defined and conceptualized in the future. This study suggested integrating research ethics education in the nursing curriculum as a required course of study for nursing students and as part of the in-service training program for nurses in order to improve research ethics in nursing research in Korea. © The Author(s) 2013.

  8. An Optimization Model for the Selection of Bus-Only Lanes in a City.

    PubMed

    Chen, Qun

    2015-01-01

    The planning of urban bus-only lane networks is an important measure to improve bus service and bus priority. To determine an effective arrangement of bus-only lanes, a bi-level programming model for urban bus lane layout is developed in this study that considers accessibility and budget constraints. The goal of the upper-level model is to minimize the total travel time, and the lower-level model is a capacity-constrained traffic assignment model that describes the passenger flow assignment on bus lines, in which the priority sequence of the transfer times is reflected in the passengers' route-choice behaviors. Using the proposed bi-level programming model, optimal bus lines are selected from a set of candidate bus lines; thus, the corresponding bus lane network on which the selected bus lines run is determined. A solution method for the bi-level programming model, based on a genetic algorithm, is developed, and two numerical examples are investigated to demonstrate the efficacy of the proposed model.
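
    A compact sketch of the upper-level selection step is given below, assuming a genetic algorithm as in the paper; the lower-level capacity-constrained assignment is replaced by a stub travel-time function, and the costs, budget, and algorithm settings are illustrative assumptions, not values from the study.

        # Genetic-algorithm sketch for the upper-level problem only: choose a subset of
        # candidate bus lines under a budget. The lower-level assignment model of the
        # paper is replaced by a stub; all numbers are illustrative assumptions.
        import numpy as np

        rng = np.random.default_rng(3)
        n_candidates = 12
        lane_cost = rng.uniform(1.0, 4.0, n_candidates)    # cost per candidate line
        line_saving = rng.uniform(2.0, 6.0, n_candidates)  # stub travel-time saving per line
        budget = 15.0

        def total_travel_time(selection):
            """Stub objective: fixed base time minus savings, with a budget penalty."""
            over_budget = max(0.0, float(np.dot(selection, lane_cost)) - budget)
            return 100.0 - float(np.dot(selection, line_saving)) + 1e3 * over_budget

        def evolve(pop_size=40, generations=60, mutation_rate=0.05):
            pop = rng.integers(0, 2, size=(pop_size, n_candidates))
            for _ in range(generations):
                fitness = np.array([-total_travel_time(ind) for ind in pop])
                probs = np.exp(fitness - fitness.max())
                probs /= probs.sum()
                parents = pop[rng.choice(pop_size, size=pop_size, p=probs)]  # selection
                cuts = rng.integers(1, n_candidates, size=pop_size)          # one-point crossover
                children = np.array([np.concatenate([parents[i, :cuts[i]],
                                                     parents[(i + 1) % pop_size, cuts[i]:]])
                                     for i in range(pop_size)])
                flips = rng.random(children.shape) < mutation_rate           # bit-flip mutation
                pop = np.where(flips, 1 - children, children)
            return min(pop, key=total_travel_time)

        best = evolve()
        print("selected candidate lines:", np.flatnonzero(best))
        print("total cost:", round(float(np.dot(best, lane_cost)), 2), "of budget", budget)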

  9. Teaching Computational Geophysics Classes using Active Learning Techniques

    NASA Astrophysics Data System (ADS)

    Keers, H.; Rondenay, S.; Harlap, Y.; Nordmo, I.

    2016-12-01

    We give an overview of our experience in teaching two computational geophysics classes at the undergraduate level, and describe the content of each class in turn. The first class is, for most students, their first programming class and assumes that the students have had an introductory course in geophysics. In this class the students are introduced to basic Matlab skills: use of variables, basic array and matrix definition and manipulation, basic statistics, 1D integration, plotting of lines and surfaces, making of .m files and basic debugging techniques. All of these concepts are applied to elementary but important concepts in earthquake and exploration geophysics (including epicentre location, computation of travel time curves for simple layered media, plotting of 1D and 2D velocity models, etc.). It is important to integrate the geophysics with the programming concepts: we found that this enhances students' understanding. Moreover, as this is a 3-year Bachelor program, and this class is taught in the 2nd semester, there is little time for a class that focusses only on programming. In the second class, which is optional and can be taken in the 4th or 6th semester, but is often also taken by Master's students, we extend the Matlab programming to include signal processing and ordinary and partial differential equations, again with emphasis on geophysics (such as ray tracing and solving the acoustic wave equation). This class also contains a project in which the students have to write a brief paper on a topic in computational geophysics, preferably with programming examples. When teaching these classes it was found that active learning techniques, in which the students actively participate in the class, either individually, in pairs or in groups, are indispensable. We give a brief overview of the various activities that we have developed when teaching these classes.
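
    As an example of the kind of exercise described (written here in Python rather than Matlab), the sketch below computes travel-time curves for a single layer over a half-space; the layer thickness and velocities are illustrative.

        # Travel-time curves for a single layer of thickness h and velocity v1 over a
        # half-space of velocity v2; an example of the classroom exercise described above.
        import numpy as np

        h, v1, v2 = 2.0, 3.0, 5.0        # km, km/s, km/s (illustrative values)
        x = np.linspace(0.1, 30.0, 300)  # source-receiver offsets in km

        t_direct = x / v1
        t_reflected = np.sqrt(x**2 + 4.0 * h**2) / v1
        # head wave (refraction) exists only beyond the critical distance x_crit
        t_head = x / v2 + 2.0 * h * np.sqrt(v2**2 - v1**2) / (v1 * v2)
        x_crit = 2.0 * h * v1 / np.sqrt(v2**2 - v1**2)
        t_head[x < x_crit] = np.nan

        crossover = x[np.nanargmin(np.abs(t_direct - t_head))]
        print(f"critical distance ~ {x_crit:.1f} km, direct/head-wave crossover ~ {crossover:.1f} km")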

  10. NASA's online machine aided indexing system

    NASA Technical Reports Server (NTRS)

    Silvester, June P.; Genuardi, Michael T.; Klingbiel, Paul H.

    1993-01-01

    This report describes the NASA Lexical Dictionary, a machine aided indexing system used online at the National Aeronautics and Space Administration's Center for Aerospace Information (CASI). This system is comprised of a text processor that is based on the computational, non-syntactic analysis of input text, and an extensive 'knowledge base' that serves to recognize and translate text-extracted concepts. The structure and function of the various NLD system components are described in detail. Methods used for the development of the knowledge base are discussed. Particular attention is given to a statistically-based text analysis program that provides the knowledge base developer with a list of concept-specific phrases extracted from large textual corpora. Production and quality benefits resulting from the integration of machine aided indexing at CASI are discussed along with a number of secondary applications of NLD-derived systems including on-line spell checking and machine aided lexicography.

  11. Acute Restraint Stress Alters Wheel-Running Behavior Immediately Following Stress and up to 20 Hours Later in House Mice.

    PubMed

    Malisch, Jessica L; deWolski, Karen; Meek, Thomas H; Acosta, Wendy; Middleton, Kevin M; Crino, Ondi L; Garland, Theodore

    In vertebrates, acute stressors-although short in duration-can influence physiology and behavior over a longer time course, which might have important ramifications under natural conditions. In laboratory rats, for example, acute stress has been shown to increase anxiogenic behaviors for days after a stressor. In this study, we quantified voluntary wheel-running behavior for 22 h following a restraint stress and glucocorticoid levels 24 h postrestraint. We utilized mice from four replicate lines that have been selectively bred for high voluntary wheel-running activity (HR mice) for 60 generations and their nonselected control (C) lines to examine potential interactions between exercise propensity and sensitivity to stress. Following 6 d of wheel access on a 12L∶12D photo cycle (0700-1900 hours, as during the routine selective breeding protocol), 80 mice were physically restrained for 40 min, beginning at 1400 hours, while another 80 were left undisturbed. Relative to unrestrained mice, wheel running increased for both HR and C mice during the first hour postrestraint (P < 0.0001) but did not differ 2 or 3 h postrestraint. Wheel running was also examined at four distinct phases of the photoperiod. Running in the period of 1600-1840 hours was unaffected by restraint stress and did not differ statistically between HR and C mice. During the period of peak wheel running (1920-0140 hours), restrained mice tended to run fewer revolutions (-11%; two-tailed P = 0.0733), while HR mice ran 473% more than C (P = 0.0008), with no restraint × line type interaction. Wheel running declined for all mice in the latter part of the scotophase (0140-0600 hours), restraint had no statistical effect on wheel running, but HR again ran more than C (+467%; P = 0.0122). Finally, during the start of the photophase (0720-1200 hours), restraint increased running by an average of 53% (P = 0.0443) in both line types, but HR and C mice did not differ statistically. Mice from HR lines had statistically higher plasma corticosterone concentrations than C mice, with no statistical effect of restraint and no interaction between line type and restraint. Overall, these results indicate that acute stress can affect locomotor activity (or activity patterns) for many hours, with the most prominent effect being an increase in activity during a period of typical inactivity at the start of the photophase, 15-20 h poststressor.

  12. A Proteomics Analysis to Evaluate Cytotoxicity in NRK-52E Cells Caused by Unmodified Nano-Fe3O4

    PubMed Central

    Lin, Yi-Reng; Kuo, Chao-Jen; Wu, Chin-Jen

    2014-01-01

    We synthesized unmodified Fe3O4 nanoparticles (NPs) with particle sizes from 10 nm to 100 nm. We cultured NRK-52E cell lines (rat kidney) and treated them with Fe3O4 NPs to investigate and evaluate the cytotoxicity of the NPs for NRK-52E cells. Through global proteomics analysis using dimethyl labeling techniques and liquid phase chromatography coupled with a tandem mass spectrometer (LC-MS/MS), we characterized 435 proteins, including programmed cell death-related proteins, ras-related proteins, glutathione-related proteins, and chaperone proteins such as heat shock proteins, serpin H1, protein disulfide-isomerase A4, endoplasmin, and endoplasmic reticulum resident proteins. From the statistical data on the identified proteins, we believe that NP treatment causes cell death and promotes expression of ras-related proteins. To avoid apoptosis, NRK-52E cells mount a series of protective responses, such as glutathione-related proteins to reduce reactive oxygen species (ROS) and chaperone proteins to recycle damaged proteins. We suggest that, in the native cellular environment, Fe3O4 NP treatment induces an antagonistic effect that allows the cells to avoid apoptosis. PMID:25197711

  13. 40 CFR Table 10 to Subpart Wwww of... - Data Requirements for New and Existing Continuous Lamination Lines and Continuous Casting Lines...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 13 2014-07-01 2014-07-01 false Data Requirements for New and Existing Continuous Lamination Lines and Continuous Casting Lines Complying With a Percent Reduction Limit on a Per Line Basis 10 Table 10 to Subpart WWWW of Part 63 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS ...

  14. 40 CFR Table 10 to Subpart Wwww of... - Data Requirements for New and Existing Continuous Lamination Lines and Continuous Casting Lines...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 13 2012-07-01 2012-07-01 false Data Requirements for New and Existing Continuous Lamination Lines and Continuous Casting Lines Complying With a Percent Reduction Limit on a Per Line Basis 10 Table 10 to Subpart WWWW of Part 63 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS ...

  15. Optimizing the use of intravenous therapy in internal medicine.

    PubMed

    Champion, Karine; Mouly, Stéphane; Lloret-Linares, Celia; Lopes, Amanda; Vicaut, Eric; Bergmann, Jean-François

    2013-10-01

    We aimed to evaluate the impact of physicians' educational programs in the reduction of inappropriate intravenous lines in internal medicine. Fifty-six French internal medicine units were enrolled in a nationwide, prospective, blinded, randomized controlled trial. Forms describing the patients with an intravenous line and internal medicine department characteristics were filled out on 2 separate days in January and April 2007. Following the first visit, all units were randomly assigned to either a specific education program on the appropriate indications of an intravenous line, during February and March 2007, or no training (control group). The Investigators' Committee then blindly evaluated the clinical relevance of the intravenous line according to pre-established criteria. The primary outcome was the percentage of inappropriate intravenous lines. During January 2007, intravenous lines were used in 475 (24.9%) of the 1910 hospitalized patients. Of these, 80 (16.8%) were considered inappropriate. In April 2007, 416 (22.8%) of the 1823 hospitalized patients received an intravenous line, which was considered inappropriate in 10.2% (21/205) of patients managed by trained physicians, versus 16.6% (35/211) of patients in the control group (relative difference 39%; 95% confidence interval, -0.6-13.3; P = .05). Reduced intravenous administration of fluids, antibiotics, and analgesics accounted for the observed decrease. The use of a simple education program reduced the rate of inappropriate intravenous lines by almost 40% in an internal medicine setting (NCT01633307). Copyright © 2013 Elsevier Inc. All rights reserved.

  16. Multipactor threshold calculation of coaxial transmission lines in microwave applications with nonstationary statistical theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, S.; Li, Y.; Liu, C.

    2015-08-15

    This paper presents a statistical theory for the initial onset of multipactor breakdown in coaxial transmission lines, taking both the nonuniform electric field and random electron emission velocity into account. A general numerical method is first developed to construct the joint probability density function based on the approximate equation of the electron trajectory. The nonstationary dynamics of the multipactor process on both surfaces of coaxial lines are modelled based on the probability of various impacts and their corresponding secondary emission. The resonant assumption of the classical theory on the independent double-sided and single-sided impacts is replaced by the consideration of their interaction. As a result, the time evolutions of the electron population for exponential growth and absorption on both inner and outer conductor, in response to the applied voltage above and below the multipactor breakdown level, are obtained to investigate the exact mechanism of multipactor discharge in coaxial lines. Furthermore, the multipactor threshold predictions of the presented model are compared with experimental results using measured secondary emission yield of the tested samples, which shows reasonable agreement. Finally, the detailed impact scenario reveals that single-surface multipactor is more likely to occur with a higher outer to inner conductor radius ratio.

  17. IDEN2-A program for visual identification of spectral lines and energy levels in optical spectra of atoms and simple molecules

    NASA Astrophysics Data System (ADS)

    Azarov, V. I.; Kramida, A.; Vokhmentsev, M. Ya.

    2018-04-01

    The article describes a Java program that can be used in a user-friendly way to visually identify spectral lines observed in complex spectra with theoretically predicted transitions between atomic or molecular energy levels. The program arranges various information about spectral lines and energy levels in such a way that line identification and determination of positions of experimentally observed energy levels become much easier tasks that can be solved fast and efficiently.

  18. Topology in Synthetic Column Density Maps for Interstellar Turbulence

    NASA Astrophysics Data System (ADS)

    Putko, Joseph; Burkhart, B. K.; Lazarian, A.

    2013-01-01

    We show how the topology tool known as the genus statistic can be utilized to characterize magnetohydrodynamic (MHD) turbulence in the ISM. The genus is measured with respect to a given density threshold, and varying the threshold produces a genus curve, which can suggest an overall ‘‘meatball,’’ neutral, or ‘‘Swiss cheese’’ topology through its integral. We use synthetic column density maps made from three-dimensional 512³ compressible MHD isothermal simulations performed for different sonic and Alfvénic Mach numbers (Ms and MA, respectively). We study eight different Ms values, each with one sub- and one super-Alfvénic counterpart. We consider sight-lines both parallel (x) and perpendicular (y and z) to the mean magnetic field. We find that the genus integral shows a dependence on both Mach numbers, and this is still the case even after adding beam smoothing and Gaussian noise to the maps to mimic observational data. The genus integral increases with higher Ms values (but saturates after about Ms = 4) for all lines of sight. This is consistent with greater values of Ms resulting in stronger shocks, which results in a clumpier topology. We observe a larger genus integral for the sub-Alfvénic cases along the perpendicular lines of sight due to increased compression from the field lines and enhanced anisotropy. Application of the genus integral to column density maps should allow astronomers to infer the Mach numbers and thus learn about the environments of interstellar turbulence. This work was supported by the National Science Foundation’s REU program through NSF Award AST-1004881.
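
    A minimal sketch of the 2D genus estimator described above (number of isolated regions above a threshold minus the number of isolated regions below it) is given below in Python; it is applied to a toy random map rather than the simulation data used in the study.

        # Sketch of the 2D genus statistic: isolated high-density regions minus isolated
        # low-density regions, evaluated over a range of thresholds on a toy map.
        import numpy as np
        from scipy import ndimage

        def genus(map2d, threshold):
            _, n_above = ndimage.label(map2d > threshold)
            _, n_below = ndimage.label(map2d < threshold)
            return n_above - n_below

        def genus_curve(map2d, thresholds):
            return np.array([genus(map2d, t) for t in thresholds])

        # Toy 'column density' map: smoothed lognormal noise standing in for turbulence.
        rng = np.random.default_rng(4)
        field = ndimage.gaussian_filter(rng.normal(size=(256, 256)), sigma=4)
        col_density = np.exp(field / field.std())

        nu = np.linspace(col_density.mean() - 2 * col_density.std(),
                         col_density.mean() + 2 * col_density.std(), 41)
        curve = genus_curve(col_density, nu)
        genus_integral = np.trapz(curve, nu)   # summary used to compare topologies
        print(genus_integral)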

  19. Searching for the 3.5 keV Line in the Stacked Suzaku Observations of Galaxy Clusters

    NASA Technical Reports Server (NTRS)

    Bulbul, Esra; Markevitch, Maxim; Foster, Adam; Miller, Eric; Bautz, Mark; Lowenstein, Mike; Randall, Scott W.; Smith, Randall K.

    2016-01-01

    We perform a detailed study of the stacked Suzaku observations of 47 galaxy clusters, spanning a redshift range of 0.01-0.45, to search for the unidentified 3.5 keV line. This sample provides an independent test for the previously detected line. We detect a 2σ-significant spectral feature at 3.5 keV in the spectrum of the full sample. When the sample is divided into two subsamples (cool-core and non-cool core clusters), the cool-core subsample shows no statistically significant positive residuals at the line energy. A very weak (approx. 2σ confidence) spectral feature at 3.5 keV is permitted by the data from the non-cool-core clusters sample. The upper limit on a neutrino decay mixing angle of sin²(2θ) = 6.1 × 10⁻¹¹ from the full Suzaku sample is consistent with the previous detections in the stacked XMM-Newton sample of galaxy clusters (which had a higher statistical sensitivity to faint lines), M31, and Galactic center, at a 90% confidence level. However, the constraint from the present sample, which does not include the Perseus cluster, is in tension with the previously reported line flux observed in the core of the Perseus cluster with XMM-Newton and Suzaku.

  20. Classifying Bugs is a Tricky Business.

    DTIC Science & Technology

    1983-08-01

    programming tutor to help students learn to program in Pascal; we wanted the system to identify the non-syntactic bugs in a student’s program and tutor the... student with respect to the misconceptions that might have given rise to the bugs. The emphasis was on the system understanding what the student did...and did not understand; we felt that simply telling the student that there was a bug in line 14 was not sufficient -- since oftentimes the bug in line

  1. Bangladesh.

    PubMed

    Ahmed, K S

    1979-01-01

    In Bangladesh the Population Control and Family Planning Division of the Ministry of Health and Population Control has decided to delegate increased financial and administrative powers to the officers of the family planning program at the district level and below. Currently, about 20,000 family planning workers and officials are at work in rural areas. The government believes that the success of the entire family planning program depends on the performance of workers in rural areas, because that is where about 90% of the population lives. Awareness of the need to improve statistical data in Bangladesh has been increasing, particularly in regard to the development of rural areas. An accurate statistical profile of rural Bangladesh is crucial to the formation, implementation and evaluation of rural development programs. A Seminar on Statistics for Rural Development will be held from June 18-20, 1980. The primary objectives of the Seminar are to make an exhaustive analysis of the current availability of statistics required for rural development programs and to consider methodological and operational improvements toward building up an adequate data base.

  2. Description and texts for the auxiliary programs for processing video information on the YeS computer. Part 3: Test program 2

    NASA Technical Reports Server (NTRS)

    Borisenko, V. I., G.g.; Stetsenko, Z. A.

    1980-01-01

    The functions are described and the operating instructions, the block diagram, and the proposed versions are given for modifying the program in order to obtain the statistical characteristics of multi-channel video information. The program implements certain man-machine methods for investigating video information. It permits representation of the material and its statistical characteristics in a form which is convenient for the user.

  3. Visualization of the significance of Receiver Operating Characteristics based on confidence ellipses

    NASA Astrophysics Data System (ADS)

    Sarlis, Nicholas V.; Christopoulos, Stavros-Richard G.

    2014-03-01

    The Receiver Operating Characteristics (ROC) is used for the evaluation of prediction methods in various disciplines like meteorology, geophysics, complex system physics, medicine etc. The estimation of the significance of a binary prediction method, however, remains a cumbersome task and is usually done by repeating the calculations by Monte Carlo. The FORTRAN code provided here simplifies this problem by evaluating the significance of binary predictions for a family of ellipses which are based on confidence ellipses and cover the whole ROC space. Catalogue identifier: AERY_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AERY_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 11511 No. of bytes in distributed program, including test data, etc.: 72906 Distribution format: tar.gz Programming language: FORTRAN. Computer: Any computer supporting a GNU FORTRAN compiler. Operating system: Linux, MacOS, Windows. RAM: 1Mbyte Classification: 4.13, 9, 14. Nature of problem: The Receiver Operating Characteristics (ROC) is used for the evaluation of prediction methods in various disciplines like meteorology, geophysics, complex system physics, medicine etc. The estimation of the significance of a binary prediction method, however, remains a cumbersome task and is usually done by repeating the calculations by Monte Carlo. The FORTRAN code provided here simplifies this problem by evaluating the significance of binary predictions for a family of ellipses which are based on confidence ellipses and cover the whole ROC space. Solution method: Using the statistics of random binary predictions for a given value of the predictor threshold ɛt, one can construct the corresponding confidence ellipses. The envelope of these corresponding confidence ellipses is estimated when ɛt varies from 0 to 1. This way a new family of ellipses is obtained, named k-ellipses, which covers the whole ROC plane and leads to a well defined Area Under the Curve (AUC). For the latter quantity, Mason and Graham [1] have shown that it follows the Mann-Whitney U-statistics [2] which can be applied [3] for the estimation of the statistical significance of each k-ellipse. As the transformation is invertible, any point on the ROC plane corresponds to a unique value of k, thus to a unique p-value to obtain this point by chance. The present FORTRAN code provides this p-value field on the ROC plane as well as the k-ellipses corresponding to the (p=)10%, 5% and 1% significance levels using as input the number of the positive (P) and negative (Q) cases to be predicted. Unusual features: In some machines, the compiler directive -O2 or -O3 should be used to avoid NaN’s in some points of the p-field along the diagonal. Running time: Depending on the application, e.g., 4s for an Intel(R) Core(TM)2 CPU E7600 at 3.06 GHz with 2 GB RAM for the examples presented here References: [1] S.J. Mason, N.E. Graham, Quart. J. Roy. Meteor. Soc. 128 (2002) 2145. [2] H.B. Mann, D.R. Whitney, Ann. Math. Statist. 18 (1947) 50. [3] L.C. Dinneen, B.C. Blakesley, J. Roy. Stat. Soc. Ser. C Appl. Stat. 22 (1973) 269.
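    For readers who want the gist of the underlying statistic without the FORTRAN package, the sketch below (an illustration under stated assumptions, not the distributed code) converts an observed area under the ROC curve into a one-sided p-value via the normal approximation to the Mann-Whitney U statistic, given P positive and Q negative cases.

      import math

      def auc_p_value(auc, n_pos, n_neg):
          # Under random prediction the Mann-Whitney U = AUC * P * Q has mean
          # P*Q/2 and variance P*Q*(P+Q+1)/12 (normal approximation, no ties).
          mean = n_pos * n_neg / 2.0
          var = n_pos * n_neg * (n_pos + n_neg + 1) / 12.0
          z = (auc * n_pos * n_neg - mean) / math.sqrt(var)
          # One-sided probability of an AUC at least this large by chance.
          return 0.5 * math.erfc(z / math.sqrt(2.0))

      print(auc_p_value(auc=0.75, n_pos=30, n_neg=70))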

  4. Relationship of the functional movement screen in-line lunge to power, speed, and balance measures.

    PubMed

    Hartigan, Erin H; Lawrence, Michael; Bisson, Brian M; Torgerson, Erik; Knight, Ryan C

    2014-05-01

    The in-line lunge of the Functional Movement Screen (FMS) evaluates lateral stability, balance, and movement asymmetries. Athletes who score poorly on the in-line lunge should avoid activities requiring power or speed until scores are improved, yet relationships between the in-line lunge scores and other measures of balance, power, and speed are unknown. (1) Lunge scores will correlate with center of pressure (COP), maximum jump height (MJH), and 36.6-meter sprint time and (2) there will be no differences between limbs on lunge scores, MJH, or COP. Descriptive laboratory study. Level 3. Thirty-seven healthy, active participants completed the first 3 tasks of the FMS (eg, deep squat, hurdle step, in-line lunge), unilateral drop jumps, and 36.6-meter sprints. A 3-dimensional motion analysis system captured MJH. Force platforms measured COP excursion. A laser timing system measured 36.6-m sprint time. Statistical analyses were used to determine whether a relationship existed between lunge scores and COP, MJH, and 36.6-m speed (Spearman rho tests) and whether differences existed between limbs in lunge scores (Wilcoxon signed-rank test), MJH, and COP (paired t tests). Lunge scores were not significantly correlated with COP, MJH, or 36.6-m sprint time. Lunge scores, COP excursion, and MJH were not statistically different between limbs. Performance on the FMS in-line lunge was not related to balance, power, or speed. Healthy participants were symmetrical in lunging measures and MJH. Scores on the FMS in-line lunge should not be attributed to power, speed, or balance performance without further examination. However, assessing limb symmetry appears to be clinically relevant.
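    The statistical tests named above are standard; a small sketch with synthetic data (hypothetical values, not the study's) using scipy.stats would look like the following.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      n = 37
      lunge_left = rng.integers(1, 4, n)        # FMS in-line lunge scores, 1-3
      lunge_right = rng.integers(1, 4, n)
      jump_left = rng.normal(0.30, 0.05, n)     # maximum jump height (m)
      jump_right = rng.normal(0.30, 0.05, n)
      sprint_time = rng.normal(5.5, 0.4, n)     # 36.6-m sprint time (s)

      rho, p_rho = stats.spearmanr(lunge_left, sprint_time)   # score vs. speed
      w, p_w = stats.wilcoxon(lunge_left, lunge_right)        # limb symmetry, ordinal scores
      t, p_t = stats.ttest_rel(jump_left, jump_right)         # limb symmetry, continuous measure
      print(p_rho, p_w, p_t)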

  5. Effects of the Use of Millimeter Waves on the Statistics of Writer-to-Reader Delays in Military Communications Systems,

    DTIC Science & Technology

    1980-12-01

    distributions of Figs. 3 and 4 may be fitted quite accurately by broken straight lines. If we had plotted the differential distributions directly...collection process. These fluctuations are smoothed by replacing the actual differential distribution by the derivative of the fitted broken-line lognormal...for each interval T. The constants in the distribution for each broken section of the lognormal approximations are found by fitting lines to the curve

  6. Assembly-line Simulation Program

    NASA Technical Reports Server (NTRS)

    Chamberlain, Robert G.; Zendejas, Silvino; Malhotra, Shan

    1987-01-01

    Costs and profits estimated for models based on user inputs. Standard Assembly-line Manufacturing Industry Simulation (SAMIS) program generalized to be useful for production-line manufacturing companies. Provides accurate and reliable means of comparing alternative manufacturing processes. Used to assess impact of changes in financial parameters such as cost of resources and services, inflation rates, interest rates, tax policies, and required rate of return of equity. Most important capability is ability to estimate prices manufacturer would have to receive for its products to recover all costs of production and make specified profit. Written in TURBO PASCAL.

  7. Zero Autocorrelation Waveforms: A Doppler Statistic and Multifunction Problems

    DTIC Science & Technology

    2006-01-01

    It is natural to refer to A as the ambiguity function of u, since in the usual setting on the real line R, the analogue ambiguity...Doppler statistic |Cu,uek(j)| is excellent and provable for detecting Doppler frequency shift [11] (see Fig. 2). Also, if one graphs only

  8. Consistent Tolerance Bounds for Statistical Distributions

    NASA Technical Reports Server (NTRS)

    Mezzacappa, M. A.

    1983-01-01

    Assumption that sample comes from population with particular distribution is made with confidence C if data lie between certain bounds. These "confidence bounds" depend on C and assumption about distribution of sampling errors around regression line. Graphical test criteria using tolerance bounds are applied in industry where statistical analysis influences product development and use. Applied to evaluate equipment life.

  9. Acoustic Prediction State of the Art Assessment

    NASA Technical Reports Server (NTRS)

    Dahl, Milo D.

    2007-01-01

    The acoustic assessment task for both the Subsonic Fixed Wing and the Supersonic projects under NASA's Fundamental Aeronautics Program was designed to assess the current state-of-the-art in noise prediction capability and to establish baselines for gauging future progress. The documentation of our current capabilities included quantifying the differences between predictions of noise from computer codes and measurements of noise from experimental tests. Quantifying the accuracy of both the computed and experimental results further enhanced the credibility of the assessment. This presentation gives sample results from codes representative of NASA's capabilities in aircraft noise prediction both for systems and components. These include semi-empirical, statistical, analytical, and numerical codes. System level results are shown for both aircraft and engines. Component level results are shown for a landing gear prototype, for fan broadband noise, for jet noise from a subsonic round nozzle, and for propulsion airframe aeroacoustic interactions. Additional results are shown for modeling of the acoustic behavior of duct acoustic lining and the attenuation of sound in lined ducts with flow.

  10. chemalot and chemalot_knime: Command line programs as workflow tools for drug discovery.

    PubMed

    Lee, Man-Ling; Aliagas, Ignacio; Feng, Jianwen A; Gabriel, Thomas; O'Donnell, T J; Sellers, Benjamin D; Wiswedel, Bernd; Gobbi, Alberto

    2017-06-12

    Analyzing files containing chemical information is at the core of cheminformatics. Each analysis may require a unique workflow. This paper describes the chemalot and chemalot_knime open source packages. Chemalot is a set of command line programs with a wide range of functionalities for cheminformatics. The chemalot_knime package allows command line programs that read and write SD files from stdin and to stdout to be wrapped into KNIME nodes. The combination of chemalot and chemalot_knime not only facilitates the compilation and maintenance of sequences of command line programs but also allows KNIME workflows to take advantage of the compute power of a LINUX cluster. Use of the command line programs is demonstrated in three different workflow examples: (1) A workflow to create a data file with project-relevant data for structure-activity or property analysis and other types of investigations, (2) The creation of a quantitative structure-property-relationship model using the command line programs via KNIME nodes, and (3) The analysis of strain energy in small molecule ligand conformations from the Protein Data Bank database. The chemalot and chemalot_knime packages provide lightweight and powerful tools for many tasks in cheminformatics. They are easily integrated with other open source and commercial command line tools and can be combined to build new and even more powerful tools. The chemalot_knime package facilitates the generation and maintenance of user-defined command line workflows, taking advantage of the graphical design capabilities in KNIME. Graphical abstract: Example KNIME workflow with chemalot nodes and the corresponding command line pipe.

  11. Understanding software faults and their role in software reliability modeling

    NASA Technical Reports Server (NTRS)

    Munson, John C.

    1994-01-01

    This study is a direct result of an on-going project to model the reliability of a large real-time control avionics system. In previous modeling efforts with this system, hardware reliability models were applied in modeling the reliability behavior of this system. In an attempt to enhance the performance of the adapted reliability models, certain software attributes were introduced in these models to control for differences between programs and also sequential executions of the same program. As the basic nature of the software attributes that affect software reliability become better understood in the modeling process, this information begins to have important implications on the software development process. A significant problem arises when raw attribute measures are to be used in statistical models as predictors, for example, of measures of software quality. This is because many of the metrics are highly correlated. Consider the two attributes: lines of code, LOC, and number of program statements, Stmts. In this case, it is quite obvious that a program with a high value of LOC probably will also have a relatively high value of Stmts. In the case of low level languages, such as assembly language programs, there might be a one-to-one relationship between the statement count and the lines of code. When there is a complete absence of linear relationship among the metrics, they are said to be orthogonal or uncorrelated. Usually the lack of orthogonality is not serious enough to affect a statistical analysis. However, for the purposes of some statistical analysis such as multiple regression, the software metrics are so strongly interrelated that the regression results may be ambiguous and possibly even misleading. Typically, it is difficult to estimate the unique effects of individual software metrics in the regression equation. The estimated values of the coefficients are very sensitive to slight changes in the data and to the addition or deletion of variables in the regression equation. Since most of the existing metrics have common elements and are linear combinations of these common elements, it seems reasonable to investigate the structure of the underlying common factors or components that make up the raw metrics. The technique we have chosen to use to explore this structure is a procedure called principal components analysis. Principal components analysis is a decomposition technique that may be used to detect and analyze collinearity in software metrics. When confronted with a large number of metrics measuring a single construct, it may be desirable to represent the set by some smaller number of variables that convey all, or most, of the information in the original set. Principal components are linear transformations of a set of random variables that summarize the information contained in the variables. The transformations are chosen so that the first component accounts for the maximal amount of variation of the measures of any possible linear transform; the second component accounts for the maximal amount of residual variation; and so on. The principal components are constructed so that they represent transformed scores on dimensions that are orthogonal. Through the use of principal components analysis, it is possible to have a set of highly related software attributes mapped into a small number of uncorrelated attribute domains. This definitively solves the problem of multi-collinearity in subsequent regression analysis. 
There are many software metrics in the literature, but principal component analysis reveals that there are few distinct sources of variation, i.e. dimensions, in this set of metrics. It would appear perfectly reasonable to characterize the measurable attributes of a program with a simple function of a small number of orthogonal metrics each of which represents a distinct software attribute domain.
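    A compact sketch of the principal-components idea described above, using synthetic metric data (not the avionics data set) and plain numpy: two highly correlated raw metrics are transformed into orthogonal components, which can then serve as uncorrelated regressors.

      import numpy as np

      rng = np.random.default_rng(2)
      loc = rng.normal(500, 150, 200)              # lines of code
      stmts = 0.8 * loc + rng.normal(0, 20, 200)   # statement count, highly correlated

      # Standardize, then diagonalize the correlation matrix; the resulting
      # component scores are orthogonal and can replace the raw metrics.
      X = np.column_stack([loc, stmts])
      Z = (X - X.mean(axis=0)) / X.std(axis=0)
      eigvals, eigvecs = np.linalg.eigh(np.corrcoef(Z, rowvar=False))
      scores = Z @ eigvecs[:, ::-1]                # components, largest variance first

      print("raw-metric correlation:", np.corrcoef(loc, stmts)[0, 1])
      print("component correlation:", np.corrcoef(scores[:, 0], scores[:, 1])[0, 1])
      print("variance explained:", eigvals[::-1] / eigvals.sum())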

  12. Nodal portraits of quantum billiards: Domains, lines, and statistics

    NASA Astrophysics Data System (ADS)

    Jain, Sudhir Ranjan; Samajdar, Rhine

    2017-10-01

    This is a comprehensive review of the nodal domains and lines of quantum billiards, emphasizing a quantitative comparison of theoretical findings to experiments. The nodal statistics are shown to distinguish not only between regular and chaotic classical dynamics but also between different geometric shapes of the billiard system itself. How a random superposition of plane waves can model chaotic eigenfunctions is discussed and the connections of the complex morphology of the nodal lines thereof to percolation theory and Schramm-Loewner evolution are highlighted. Various approaches to counting the nodal domains—using trace formulas, graph theory, and difference equations—are also illustrated with examples. The nodal patterns addressed pertain to waves on vibrating plates and membranes, acoustic and electromagnetic modes, wave functions of a "particle in a box" as well as to percolating clusters, and domains in ferromagnets, thus underlining the diversity and far-reaching implications of the problem.

  13. A statistical package for computing time and frequency domain analysis

    NASA Technical Reports Server (NTRS)

    Brownlow, J.

    1978-01-01

    The spectrum analysis (SPA) program is a general purpose digital computer program designed to aid in data analysis. The program does time and frequency domain statistical analyses as well as some preanalysis data preparation. The capabilities of the SPA program include linear trend removal and/or digital filtering of data, plotting and/or listing of both filtered and unfiltered data, time domain statistical characterization of data, and frequency domain statistical characterization of data.
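    As a rough, generic illustration of the kind of processing described (not the SPA program itself), linear trend removal, a time-domain summary, and a frequency-domain characterization can be sketched with scipy as follows.

      import numpy as np
      from scipy import signal

      fs = 100.0                                   # sampling rate (Hz)
      t = np.arange(0, 20, 1 / fs)
      rng = np.random.default_rng(3)
      x = 0.05 * t + np.sin(2 * np.pi * 5 * t) + 0.3 * rng.normal(size=t.size)

      x_detr = signal.detrend(x, type="linear")    # linear trend removal
      mean, std = x_detr.mean(), x_detr.std()      # time-domain statistics
      freqs, psd = signal.welch(x_detr, fs=fs)     # frequency-domain characterization
      print(mean, std, freqs[np.argmax(psd)])      # dominant frequency, ~5 Hz here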

  14. Ursgal, Universal Python Module Combining Common Bottom-Up Proteomics Tools for Large-Scale Analysis.

    PubMed

    Kremer, Lukas P M; Leufken, Johannes; Oyunchimeg, Purevdulam; Schulze, Stefan; Fufezan, Christian

    2016-03-04

    Proteomics data integration has become a broad field with a variety of programs offering innovative algorithms to analyze increasing amounts of data. Unfortunately, this software diversity leads to many problems as soon as the data is analyzed using more than one algorithm for the same task. Although it was shown that the combination of multiple peptide identification algorithms yields more robust results, it is only recently that unified approaches are emerging; however, workflows that, for example, aim to optimize search parameters or that employ cascaded style searches can only be made accessible if data analysis becomes not only unified but also and most importantly scriptable. Here we introduce Ursgal, a Python interface to many commonly used bottom-up proteomics tools and to additional auxiliary programs. Complex workflows can thus be composed using the Python scripting language using a few lines of code. Ursgal is easily extensible, and we have made several database search engines (X!Tandem, OMSSA, MS-GF+, Myrimatch, MS Amanda), statistical postprocessing algorithms (qvality, Percolator), and one algorithm that combines statistically postprocessed outputs from multiple search engines ("combined FDR") accessible as an interface in Python. Furthermore, we have implemented a new algorithm ("combined PEP") that combines multiple search engines employing elements of "combined FDR", PeptideShaker, and Bayes' theorem.
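    As a toy illustration only of how posterior error probabilities (PEPs) from several engines might be merged under a naive independence assumption; Ursgal's actual "combined PEP" algorithm, which also draws on "combined FDR" and PeptideShaker ideas, is more elaborate than this sketch.

      import numpy as np

      def naive_combined_pep(peps):
          # Posterior that the match is wrong vs. right, assuming the engines
          # give independent evidence (a strong simplification).
          peps = np.asarray(peps, dtype=float)
          wrong = np.prod(peps)
          right = np.prod(1.0 - peps)
          return wrong / (wrong + right)

      print(naive_combined_pep([0.05, 0.10, 0.20]))  # lower than any single-engine PEP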

  15. Hazardous and Toxic Waste (HTW) Contracting Problems: A Study of the Contracting Problems Related to Surety Bonding in the HTW Cleanup Program

    DTIC Science & Technology

    1990-07-01

    a. Fair value of solely owned real estate*
    b. All mortgages or other encumbrances on the real estate included in Line a
    c. Real estate equity (subtract Line b from Line a)
    d. Fair value of all solely owned property other than real estate*
    e. Total of the amounts on Lines c and d
    f. All other ...

  16. CIDR

    Science.gov Websites

    Quality Control Statistics: CIDR is dedicated to producing the highest quality data for our investigators. These cumulative quality control statistics are based on data from 419 released CIDR Program …

  17. 49 CFR Schedule G to Subpart B of... - Selected Statistical Data

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 8 2010-10-01 2010-10-01 false Selected Statistical Data G Schedule G to Subpart... Statistical Data [Dollars in thousands] () Greyhound Lines, Inc. () Trailways combined () All study carriers.... 9002, L. 9, col. (b) Other Statistics: 25. Number of regular route intercity passenger miles Sch. 9002...

  18. Statistical equilibrium calculations for silicon in early-type model stellar atmospheres

    NASA Technical Reports Server (NTRS)

    Kamp, L. W.

    1976-01-01

    Line profiles of 36 multiplets of silicon (Si) II, III, and IV were computed for a grid of model atmospheres covering the range from 15,000 to 35,000 K in effective temperature and 2.5 to 4.5 in log (gravity). The computations involved simultaneous solution of the steady-state statistical equilibrium equations for the populations and of the equation of radiative transfer in the lines. The variables were linearized, and successive corrections were computed until a minimal accuracy of 1/1000 in the line intensities was reached. The common assumption of local thermodynamic equilibrium (LTE) was dropped. The model atmospheres used also were computed by non-LTE methods. Some effects that were incorporated into the calculations were the depression of the continuum by free electrons, hydrogen and ionized helium line blocking, and auto-ionization and dielectronic recombination, which later were found to be insignificant. Use of radiation damping and detailed electron (quadratic Stark) damping constants had small but significant effects on the strong resonance lines of Si III and IV. For weak and intermediate-strength lines, large differences with respect to LTE computations, the results of which are also presented, were found in line shapes and strengths. For the strong lines the differences are generally small, except for the models at the hot, low-gravity extreme of our range. These computations should be useful in the interpretation of the spectra of stars in the spectral range B0-B5, luminosity classes III, IV, and V.

  19. Constraining the variation of the fine-structure constant with observations of narrow quasar absorption lines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Songaila, A.; Cowie, L. L., E-mail: acowie@ifa.hawaii.edu

    2014-10-01

    The unequivocal demonstration of temporal or spatial variability in a fundamental constant of nature would be of enormous significance. Recent attempts to measure the variability of the fine-structure constant α over cosmological time, using high-resolution spectra of high-redshift quasars observed with 10 m class telescopes, have produced conflicting results. We use the many multiplet (MM) method with Mg II and Fe II lines on very high signal-to-noise, high-resolution (R = 72,000) Keck HIRES spectra of eight narrow quasar absorption systems. We consider both systematic uncertainties in spectrograph wavelength calibration and also velocity offsets introduced by complex velocity structure in even apparently simple and weak narrow lines and analyze their effect on claimed variations in α. We find no significant change in α, Δα/α = (0.43 ± 0.34) × 10^-5, in the redshift range z = 0.7-1.5, where this includes both statistical and systematic errors. We also show that the scatter in measurements of Δα/α arising from absorption line structure can be considerably larger than assigned statistical errors even for apparently simple and narrow absorption systems. We find a null result of Δα/α = (-0.59 ± 0.55) × 10^-5 in a system at z = 1.7382 using lines of Cr II, Zn II, and Mn II, whereas using Cr II and Zn II lines in a system at z = 1.6614 we find a systematic velocity trend that, if interpreted as a shift in α, would correspond to Δα/α = (1.88 ± 0.47) × 10^-5, where both results include both statistical and systematic errors. This latter result is almost certainly caused by varying ionic abundances in subcomponents of the line: using Mn II, Ni II, and Cr II in the analysis changes the result to Δα/α = (-0.47 ± 0.53) × 10^-5. Combining the Mg II and Fe II results with estimates based on Mn II, Ni II, and Cr II gives Δα/α = (-0.01 ± 0.26) × 10^-5. We conclude that spectroscopic measurements of quasar absorption lines are not yet capable of unambiguously detecting variation in α using the MM method.

  20. Controlling mechanisms over the internet

    NASA Astrophysics Data System (ADS)

    Lumia, Ronald

    1997-01-01

    The internet, widely available throughout the world, can be used to control robots, machine tools, and other mechanisms. This paper will describe a low-cost virtual collaborative environment (VCE) which will connect users with distant equipment. The system is based on PC technology, and incorporates off-line programming with on-line execution. A remote user programs the systems graphically and simulates the motions and actions of the mechanism until satisfied with the functionality of the program. The program is then transferred from the remote site to the local site where the real equipment exists. At the local site, the simulation is run again to check the program from a safety standpoint. Then, the local user runs the program on the real equipment. During execution, a camera in the real workspace provides an image back to the remote user through a teleconferencing system. The system costs approximately 12,500 dollars and represents a low-cost alternative to the Sandia National Laboratories VCE.

  1. Are some BL Lac objects artefacts of gravitational lensing?

    NASA Technical Reports Server (NTRS)

    Ostriker, J. P.; Vietri, M.

    1985-01-01

    It is proposed here that a significant fraction of BL Lac objects are optically violently variable quasars whose continuum emission has been greatly amplified, relative to the line emission, by pointlike gravitational lenses in intervening galaxies. Several anomalous physical and statistical properties of BL Lacs can be understood on the basis of this model, which is immediately testable on the basis of absorption line studies and by direct imaging.

  2. The social construction of "evidence-based" drug prevention programs: a reanalysis of data from the Drug Abuse Resistance Education (DARE) program.

    PubMed

    Gorman, Dennis M; Huber, J Charles

    2009-08-01

    This study explores the possibility that any drug prevention program might be considered "evidence-based" given the use of data analysis procedures that optimize the chance of producing statistically significant results by reanalyzing data from a Drug Abuse Resistance Education (DARE) program evaluation. The analysis produced a number of statistically significant differences between the DARE and control conditions on alcohol and marijuana use measures. Many of these differences occurred at cutoff points on the assessment scales for which post hoc meaningful labels were created. Our results are compared to those from evaluations of programs that appear on evidence-based drug prevention lists.

  3. Analysis of several Boolean operation based trajectory generation strategies for automotive spray applications

    NASA Astrophysics Data System (ADS)

    Gao, Guoyou; Jiang, Chunsheng; Chen, Tao; Hui, Chun

    2018-05-01

    Industrial robots are widely used in various processes of surface manufacturing, such as thermal spraying. The established robot programming methods are highly time-consuming and not accurate enough to fulfil the demands of the current market. Many off-line programming methods have been developed to reduce the robot programming effort. This work introduces the principles of several Boolean operation based trajectory generation strategies for planar and curved surfaces. Since off-line programming software is widely used, facilitates robot programming, and improves the accuracy of the robot trajectory, the analysis in this work is based on secondary development (customization) of the off-line programming software RobotStudio™. To meet the requirements of the automotive paint industry, this software extension provides special functions according to user-defined operation parameters. The presented planning strategy generates the robot trajectory by moving an orthogonal surface according to the geometry of the coating surface; a series of intersection curves is then employed to generate the trajectory points. The simulation results show that the path curve created with this method is continuous and smooth, which meets the requirements of automotive spray industrial applications.

  4. Experimental study of spectral and spatial distribution of solar X-rays

    NASA Technical Reports Server (NTRS)

    Acton, L. W.; Catura, R. C.; Culhane, J. L.

    1972-01-01

    The study of the physical conditions within the solar corona and the development of instrumentation and technical expertise necessary for advanced studies of solar X-ray emission are reported. Details are given on the Aerobee-borne X-ray spectrometer/monochromator and also on the observing program. Preliminary discussions of some results are presented and include studies of helium-like line emission, mapping O(VII) and Ne(IX) lines, survey of O(VII) and Ne(IX) lines, study of plage regions and small flares, and analysis of line emission from individual active regions. It is concluded that the use of large-area collimated Bragg spectrometers to scan narrow wavelength intervals and the capability of the SPARCS pointing control to execute a complex observing program are established.

  5. 40 CFR Table 10 to Subpart Wwww of... - Data Requirements for New and Existing Continuous Lamination Lines and Continuous Casting Lines...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 12 2010-07-01 2010-07-01 true Data Requirements for New and Existing Continuous Lamination Lines and Continuous Casting Lines Complying With a Percent Reduction Limit on a Per Line Basis 10 Table 10 to Subpart WWWW of Part 63 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUE...

  6. 40 CFR Table 10 to Subpart Wwww of... - Data Requirements for New and Existing Continuous Lamination Lines and Continuous Casting Lines...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 12 2011-07-01 2009-07-01 true Data Requirements for New and Existing Continuous Lamination Lines and Continuous Casting Lines Complying With a Percent Reduction Limit on a Per Line Basis 10 Table 10 to Subpart WWWW of Part 63 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUE...

  7. 40 CFR Table 10 to Subpart Wwww of... - Data Requirements for New and Existing Continuous Lamination Lines and Continuous Casting Lines...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 13 2013-07-01 2012-07-01 true Data Requirements for New and Existing Continuous Lamination Lines and Continuous Casting Lines Complying With a Percent Reduction Limit on a Per Line Basis 10 Table 10 to Subpart WWWW of Part 63 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUE...

  8. VHSIC Electronics and the Cost of Air Force Avionics in the 1990s

    DTIC Science & Technology

    1990-11-01

    circuit. LRM Line replaceable module. LRU Line replaceable unit. LSI Large-scale integration. LSTTL Low-power Schottky Transistor-to-Transistor Logic...displays, communications/navigation/identification, electronic combat equipment, dispensers, and computers. These CERs, which statistically relate the...some of the reliability numbers, and adding the F-15 and F-16 to obtain the data sample shown in Table 6. Both suite costs and reliability statistics

  9. Statistics of Magnetic Reconnection X-Lines in Kinetic Turbulence

    NASA Astrophysics Data System (ADS)

    Haggerty, C. C.; Parashar, T.; Matthaeus, W. H.; Shay, M. A.; Wan, M.; Servidio, S.; Wu, P.

    2016-12-01

    In this work we examine the statistics of magnetic reconnection (x-lines) and their associated reconnection rates in intermittent current sheets generated in turbulent plasmas. Although such statistics have been studied previously for fluid simulations (e.g. [1]), they have not yet been generalized to fully kinetic particle-in-cell (PIC) simulations. A significant problem with PIC simulations, however, is electrostatic fluctuations generated due to numerical particle counting statistics. We find that analyzing gradients of the magnetic vector potential from the raw PIC field data identifies numerous artificial (or non-physical) x-points. Using small Orszag-Tang vortex PIC simulations, we analyze x-line identification and show that these artificial x-lines can be removed using sub-Debye length filtering of the data. We examine how turbulent properties such as the magnetic spectrum and scale dependent kurtosis are affected by particle noise and sub-Debye length filtering. We subsequently apply these analysis methods to a large scale kinetic PIC turbulent simulation. Consistent with previous fluid models, we find a range of normalized reconnection rates as large as ½ but with the bulk of the rates being less than approximately 0.1. [1] Servidio, S., W. H. Matthaeus, M. A. Shay, P. A. Cassak, and P. Dmitruk (2009), Magnetic reconnection and two-dimensional magnetohydrodynamic turbulence, Phys. Rev. Lett., 102, 115003.
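    A schematic two-dimensional sketch of x-point identification from a vector potential A(x, y), with Gaussian smoothing standing in for the sub-Debye-length filtering discussed above; this is illustrative only and is not the authors' analysis pipeline.

      import numpy as np
      from scipy import ndimage

      rng = np.random.default_rng(4)
      # Smoothed random field standing in for the out-of-plane vector potential A(x, y).
      A = ndimage.gaussian_filter(rng.normal(size=(128, 128)), 6, mode="wrap")

      Ay, Ax = np.gradient(A)                    # dA/dy, dA/dx
      Ayy, Ayx = np.gradient(Ay)
      Axy, Axx = np.gradient(Ax)
      hess_det = Axx * Ayy - Axy * Ayx

      # Candidate critical points: local minima of |grad A|; saddles (x-points)
      # additionally have a negative Hessian determinant.
      grad_mag = np.hypot(Ax, Ay)
      is_min = grad_mag == ndimage.minimum_filter(grad_mag, size=5, mode="wrap")
      x_points = np.argwhere(is_min & (hess_det < 0))
      print(len(x_points), "candidate x-points")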

  10. Effects of Nongray Opacity on Radiatively Driven Wolf-Rayet Winds

    NASA Astrophysics Data System (ADS)

    Onifer, A. J.; Gayley, K. G.

    2002-05-01

    Wolf-Rayet winds are characterized by their large momentum fluxes, and simulations of radiation driving have been increasingly successful in modeling these winds. Simple analytic approaches that help understand the most critical processes for copious momentum deposition already exist in the effectively gray approximation, but these have not been extended to more realistic nongray opacities. With this in mind, we have developed a simplified theory for describing the interaction of the stellar flux with nongray wind opacity. We replace the detailed line list with a set of statistical parameters that are sensitive not only to the strength but also the wavelength distribution of lines, incorporating as a free parameter the rate of photon frequency redistribution. We label the resulting flux-weighted opacity the statistical Sobolev-Rosseland (SSR) mean, and explore how changing these various statistical parameters affects the flux/opacity interaction. We wish to acknowledge NSF grant AST-0098155.

  11. Marital satisfaction and break-ups differ across on-line and off-line meeting venues

    PubMed Central

    Cacioppo, John T.; Cacioppo, Stephanie; Gonzaga, Gian C.; Ogburn, Elizabeth L.; VanderWeele, Tyler J.

    2013-01-01

    Marital discord is costly to children, families, and communities. The advent of the Internet, social networking, and on-line dating has affected how people meet future spouses, but little is known about the prevalence or outcomes of these marriages or the demographics of those involved. We addressed these questions in a nationally representative sample of 19,131 respondents who married between 2005 and 2012. Results indicate that more than one-third of marriages in America now begin on-line. In addition, marriages that began on-line, when compared with those that began through traditional off-line venues, were slightly less likely to result in a marital break-up (separation or divorce) and were associated with slightly higher marital satisfaction among those respondents who remained married. Demographic differences were identified between respondents who met their spouse through on-line vs. traditional off-line venues, but the findings for marital break-up and marital satisfaction remained significant after statistically controlling for these differences. These data suggest that the Internet may be altering the dynamics and outcomes of marriage itself. PMID:23733955

  12. SHAREv2: fluctuations and a comprehensive treatment of decay feed-down

    NASA Astrophysics Data System (ADS)

    Torrieri, G.; Jeon, S.; Letessier, J.; Rafelski, J.

    2006-11-01

    This is the user's manual for SHARE version 2. SHARE [G. Torrieri, S. Steinke, W. Broniowski, W. Florkowski, J. Letessier, J. Rafelski, Comput. Phys. Comm. 167 (2005) 229] (Statistical Hadronization with Resonances) is a collection of programs designed for the statistical analysis of particle production in relativistic heavy-ion collisions. While the structure of the program remains similar to v1.x, v2 provides several new features such as evaluation of statistical fluctuations of particle yields, and a greater versatility, in particular regarding decay feed-down and input/output structure. This article describes all the new features, with emphasis on statistical fluctuations. Program summary Title of program:SHAREv2 Catalogue identifier:ADVD_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADVD_v2_0 Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland Computer:PC, Pentium III, 512 MB RAM not hardware dependent Operating system:Linux: RedHat 6.1, 7.2, FEDORA, etc. not system dependent Programming language:FORTRAN77 Size of the package:167 KB directory, without libraries (see http://wwwasdoc.web.cern.ch/wwwasdoc/minuit/minmain.html, http://wwwasd.web.cern.ch/wwwasd/cernlib.html for details on library requirements) Number of lines in distributed program, including test data, etc.:26 101 Number of bytes in distributed program, including test data, etc.:170 346 Distribution format:tar.gzip file Computer:Any computer with an f77 compiler Nature of the physical problem:Event-by-event fluctuations have been recognized to be the physical observable capable of constraining particle production models. Therefore, consideration of event-by-event fluctuations is required for a decisive falsification or constraining of (variants of) particle production models based on (grand-, micro-) canonical statistical mechanics phase space, the so called statistical hadronization models (SHM). As in the case of particle yields, to properly compare model calculations to data it is necessary to consistently take into account resonance decays. However, event-by-event fluctuations are more sensitive than particle yields to experimental acceptance issues, and a range of techniques needs to be implemented to extract 'physical' fluctuations from an experimental event-by-event measurement. Method of solving the problem:The techniques used within the SHARE suite of programs [G. Torrieri, S. Steinke, W. Broniowski, W. Florkowski, J. Letessier, J. Rafelski, Comput. Phys. Comm. 167 (2005) 229; SHAREv1] are updated and extended to fluctuations. A full particle data-table, decay tree, and set of experimental feed-down coefficients are provided. Unlike SHAREv1.x, experimental acceptance feed-down coefficients can be entered for any resonance decay. SHAREv2 can calculate yields, fluctuations, and bulk properties of the fireball from provided thermal parameters; alternatively, parameters can be obtained from fits to experimental data, via the MINUIT fitting algorithm [F. James, M. Roos, Comput. Phys. Comm. 10 (1975) 343]. Fits can also be analyzed for significance, parameter and data point sensitivity. Averages and fluctuations at freeze-out of both the stable particles and the hadronic resonances are set according to a statistical prescription, calculated via a series of Bessel functions, using CERN library programs. We also have the option of including finite particle widths of the resonances.
A χ² minimization algorithm, also from the CERN library programs, is used to perform and analyze the fit. Please see SHAREv1 for more details on these. Purpose:The vast amount of high quality soft hadron production data, from experiments running at the SPS, RHIC, in the past at the AGS, and in the near future at the LHC, offers the opportunity for statistical particle production model falsification. This task has turned out to be difficult when considering solely particle yields addressed in the context of SHAREv1.x. For this reason physical conditions at freeze-out remain contested. Inclusion in the analysis of event-by-event fluctuations appears to resolve this issue. Similarly, a thorough analysis including both fluctuations and average multiplicities gives a way to explore the presence and strength of interactions following hadronization (when hadrons form), ending with thermal freeze-out (when all interactions cease). SHAREv2 with fluctuations will also help determine which statistical ensemble (if any), e.g., canonical or grand-canonical, is more physically appropriate for analyzing a given system. Together with resonances, fluctuations can also be used for a direct estimate of the extent to which the system re-interacts between chemical and thermal freeze-out. We hope and expect that SHAREv2 will contribute to deciding if any of the statistical hadronization model variants has a genuine physical connection to hadron particle production. Computation time survey:We encounter, in the FORTRAN version computation, times up to seconds for evaluation of particle yields. These rise by up to a factor of 300 in the process of minimization and a further factor of a few when χ²/N profiles and contours with chemical non-equilibrium are requested. Summary of new features (w.r.t. SHAREv1.x) Fluctuations:In addition to particle yields, ratios and bulk quantities SHAREv2 can calculate, fit and analyze statistical fluctuations of particles and particle ratios Decays:SHAREv2 has the flexibility to account for any experimental method of allowing for decay feed-downs to the particle yields Charm flavor:Charmed particles have been added to the decay tree, allowing as an option study of statistical hadronization of J/ψ, χ, D, etc. Quark chemistry:Chemical non-equilibrium yields for both u and d flavors, as opposed to generically light quarks q, are considered; η-η′ mixing, etc., are properly dealt with, and chemical non-equilibrium can be studied for each flavor separately Misc:Many new commands and features have been introduced and added to the basic user interface. For example, it is possible to study combinations of particles and their ratios. It is also possible to combine all the input files into one file. SHARE compatibility and manual:This write-up is an update and extension of SHAREv1. The user should consult SHAREv1 regarding the principles of user interface and for all particle yield related physics and program instructions, other than the parameter additions and minor changes described here. SHAREv2 is downward compatible for the changes of the user interface, offering the user of SHAREv1 computer-generated revised input files compatible with SHAREv2.
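    One building block mentioned above, the Bessel-function form of a thermal particle density, can be illustrated in a few lines. This is the leading Boltzmann term only, in natural units, with a single fugacity factor; SHARE evaluates the full quantum-statistics series and the associated fluctuations, so the snippet below is a hedged sketch rather than a reimplementation.

      import numpy as np
      from scipy.special import kn

      def boltzmann_density(m, T, g=1.0, upsilon=1.0):
          # n = g/(2 pi^2) * m^2 * T * K_2(m/T) * Upsilon, in GeV^3 (natural units).
          return g / (2 * np.pi**2) * m**2 * T * kn(2, m / T) * upsilon

      # Example: pions (m ~ 0.140 GeV, g = 3) at T = 0.160 GeV, converted to fm^-3.
      n_gev3 = boltzmann_density(0.140, 0.160, g=3.0)
      print(n_gev3 / 0.19733**3, "per fm^3")   # divide by (hbar*c)^3 with hbar*c = 0.19733 GeV fm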

  13. Are On-Line Data Bases in Your Library's Future?

    ERIC Educational Resources Information Center

    Deacon, Jim

    1983-01-01

    Today there are over 900 on-line data banks available for public access. Most microcomputers can use them through the aid of a modem and communication program. Major public information utilities that offer access to these on-line data bases are growing and expanding. The Source, a data base utility…

  14. Residents' perspectives of the value of a simulation curriculum in a general surgery residency program: a multimethod study of stakeholder feedback.

    PubMed

    Wehbe-Janek, Hania; Colbert, Colleen Y; Govednik-Horny, Cara; White, Bobbie Ann A; Thomas, Scott; Shabahang, Mohsen

    2012-06-01

    Simulation has altered surgical curricula throughout residency programs. The purpose of this multimethod study was to explore residents' perceptions of simulation within surgical residency as relevant stakeholder feedback and program evaluation of the surgery simulation curriculum. Focus groups were held with a sample of surgery residents (n = 25) at a university-affiliated program. Residents participated in focus groups based on level of training and completed questionnaires regarding simulation curricula. Groups were facilitated by nonsurgeon faculty. Residents were asked: "What is the role of simulation in surgical education?" An interdisciplinary team recorded narrative data and performed content analyses. Quantitative data from questionnaires were summarized using descriptive statistics and frequencies. Major themes from the qualitative data included: concerns regarding simulation in surgical education (28%), exposure to situations and technical skills in a low-stress learning environment (24%), pressure by external agencies (19%), an educational tool (17%), and quality assurance for patient care (12%). Laparoscopy and cadaver lab were the most prevalent simulation training during residency, in addition to trauma simulations, central lines/chest tubes/IV access, and stapling lab. In response to the statement: "ACGME should require a simulation curriculum in surgery residency," 52.1% responded favorably and 47.8% responded nonfavorably. Residents acknowledge the value of simulation in patient safety, quality, and exposure to procedures before clinical experience, but remain divided on efficacy and requirement of simulation within curricula. The greater challenge to residency programs may be strategic implementation of simulation curricula within the right training context. Copyright © 2012 Mosby, Inc. All rights reserved.

  15. Lithium-doped solar cell pilot line fabrication and test programs

    NASA Technical Reports Server (NTRS)

    Berman, P. A.; Yasui, R. K.

    1974-01-01

    An investigation was conducted to determine the technology readiness of lithium-doped silicon solar cells with respect to use in space programs. A pilot line fabrication program was established, in which the pilot line cells were evaluated after being exposed to environments ordinarily imposed on nonlithium-doped silicon solar cells. Results indicate that further process improvements are required, particularly with respect to the P/N junction diffusion and the electrical contacting technique (including solder coating). It is concluded that lithium-doped cells can be fabricated to exhibit (1) high efficiencies, (2) uniform cell-to-cell recovery characteristics after exposure to 1-MeV electrons; and (3) good stability in most environments investigated (the only exception being the thermal shock environment).

  16. Momentum deposition on Wolf-Rayet winds: Nonisotropic diffusion with effective gray opacity

    NASA Technical Reports Server (NTRS)

    Gayley, Kenneth G.; Owocki, Stanley P.; Cranmer, Steven R.

    1995-01-01

    We derive the velocity and mass-loss rate of a steady state Wolf-Rayet (WR) wind, using a nonisotropic diffusion approximation applied to the transfer between strongly overlapping spectral lines. Following the approach of Friend & Castor (1983), the line list is assumed to approximate a statistically parameterized Poisson distribution in frequency, so that photon transport is controlled by an angle-dependent, effectively gray opacity. We show the nonisotropic diffusion approximation yields good agreement with more accurate numerical treatments of the radiative transfer, while providing analytic insight into wind driving by multiple scattering. We illustrate, in particular, that multiple radiative momentum deposition does not require that photons be repeatedly reflected across substantial distances within the spherical envelope, but indeed is greatest when photons undergo a nearly local diffusion, e.g., through scattering by many lines closely spaced in frequency. Our results reiterate the view that the so-called 'momentum problem' of Wolf-Rayet winds is better characterized as an 'opacity problem' of simply identifying enough lines. One way of increasing the number of thick lines in Wolf-Rayet winds is to transfer opacity from saturated to unsaturated lines, yielding a steeper opacity distribution than that found in OB winds. We discuss the implications of this perspective for extending our approach to W-R wind models that incorporate a more fundamental treatment of the ionization and excitation processes that determine the line opacity. In particular, we argue that developing statistical descriptions of the lines to allow an improved effective opacity for the line ensemble would offer several advantages for deriving such more fundamental W-R wind models.

  17. Momentum deposition on Wolf-Rayet winds: Nonisotropic diffusion with effective gray opacity

    NASA Astrophysics Data System (ADS)

    Gayley, Kenneth G.; Owocki, Stanley P.; Cranmer, Steven R.

    1995-03-01

    We derive the velocity and mass-loss rate of a steady state Wolf-Rayet (WR) wind, using a nonisotropic diffusion approximation applied to the transfer between strongly overlapping spectral lines. Following the approach of Friend & Castor (1983), the line list is assumed to approximate a statistically parameterized Poisson distribution in frequency, so that photon transport is controlled by an angle-dependent, effectively gray opacity. We show the nonisotropic diffusion approximation yields good agreement with more accurate numerical treatments of the radiative transfer, while providing analytic insight into wind driving by multiple scattering. We illustrate, in particular, that multiple radiative momentum deposition does not require that photons be repeatedly reflected across substantial distances within the spherical envelope, but indeed is greatest when photons undergo a nearly local diffusion, e.g., through scattering by many lines closely spaced in frequency. Our results reiterate the view that the so-called 'momentum problem' of Wolf-Rayet winds is better characterized as an 'opacity problem' of simply identifying enough lines. One way of increasing the number of thick lines in Wolf-Rayet winds is to transfer opacity from saturated to unsaturated lines, yielding a steeper opacity distribution than that found in OB winds. We discuss the implications of this perspective for extending our approach to W-R wind models that incorporate a more fundamental treatment of the ionization and excitation processes that determine the line opacity. In particular, we argue that developing statistical descriptions of the lines to allow an improved effective opacity for the line ensemble would offer several advantages for deriving such more fundamental W-R wind models.

  18. Comparison of Absolute Apparent Diffusion Coefficient (ADC) Values in ADC Maps Generated Across Different Postprocessing Software: Reproducibility in Endometrial Carcinoma.

    PubMed

    Ghosh, Adarsh; Singh, Tulika; Singla, Veenu; Bagga, Rashmi; Khandelwal, Niranjan

    2017-12-01

    Apparent diffusion coefficient (ADC) maps are usually generated by built-in software provided by the MRI scanner vendors; however, various open-source postprocessing software packages are available for image manipulation and parametric map generation. The purpose of this study is to establish the reproducibility of absolute ADC values obtained using different postprocessing software programs. DW images with three b values were acquired with a 1.5-T MRI scanner, and the trace images were obtained. ADC maps were automatically generated by the in-line software provided by the vendor during image generation and were also separately generated on postprocessing software. These ADC maps were compared on the basis of ROIs using paired t test, Bland-Altman plot, mountain plot, and Passing-Bablok regression plot. There was a statistically significant difference in the mean ADC values obtained from the different postprocessing software programs when the same baseline trace DW images were used for the ADC map generation. To use ADC values as a quantitative cutoff for histologic characterization of tissues, standardization of the postprocessing algorithm is essential across processing software packages, especially in view of the implementation of vendor-neutral archiving.
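    With synthetic numbers (not the study's data), the paired t-test and a Bland-Altman style bias / limits-of-agreement summary for ROI-wise ADC values from two postprocessing packages can be sketched as follows.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      adc_vendor = rng.normal(1.05e-3, 0.15e-3, 40)               # mm^2/s, in-line software
      adc_other = adc_vendor + rng.normal(0.02e-3, 0.03e-3, 40)   # second package, same ROIs

      t_stat, p = stats.ttest_rel(adc_vendor, adc_other)          # paired t-test
      diff = adc_other - adc_vendor
      bias = diff.mean()                                          # Bland-Altman bias
      loa = 1.96 * diff.std(ddof=1)                               # 95% limits of agreement half-width
      print(p, bias, bias - loa, bias + loa)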

  19. A Statistical Analysis of the Effect of the Navy’s Tuition Assistance Program: Do Distance Learning Classes Make a Difference?

    DTIC Science & Technology

    2010-03-01

    A Statistical Analysis of the Effect of the Navy's Tuition Assistance Program: Do Distance Learning Classes Make a Difference? by Jeremy P. McLaughlin, March 2010 (Master's Thesis). This thesis analyzes the impact of participation in the Navy's Tuition Assistance (TA) program on the retention of first-term Navy ...

  20. Nonlinear Curve-Fitting Program

    NASA Technical Reports Server (NTRS)

    Everhart, Joel L.; Badavi, Forooz F.

    1989-01-01

    Nonlinear optimization algorithm helps in finding best-fit curve. Nonlinear Curve Fitting Program, NLINEAR, interactive curve-fitting routine based on description of quadratic expansion of chi-square (χ²) statistic. Utilizes nonlinear optimization algorithm calculating best statistically weighted values of parameters of fitting function and χ² minimized. Provides user with such statistical information as goodness of fit and estimated values of parameters producing highest degree of correlation between experimental data and mathematical model. Written in FORTRAN 77.
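    NLINEAR itself is FORTRAN 77, but the same idea, chi-square minimization of a nonlinear fitting function with statistically weighted data points, can be illustrated with scipy (a generic sketch, not the NLINEAR algorithm).

      import numpy as np
      from scipy.optimize import curve_fit

      def model(x, a, b, c):
          return a * np.exp(-b * x) + c

      rng = np.random.default_rng(6)
      x = np.linspace(0, 5, 60)
      sigma = np.full_like(x, 0.05)                       # statistical weights (measurement errors)
      y = model(x, 2.0, 1.3, 0.4) + rng.normal(0, sigma)

      popt, pcov = curve_fit(model, x, y, p0=[1, 1, 0], sigma=sigma, absolute_sigma=True)
      perr = np.sqrt(np.diag(pcov))                       # estimated parameter uncertainties
      chi2 = np.sum(((y - model(x, *popt)) / sigma) ** 2)
      print(popt, perr, chi2 / (x.size - 3))              # chi-square per degree of freedom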

  1. Children in the States, 1997.

    ERIC Educational Resources Information Center

    Children's Defense Fund, Washington, DC.

    This report from the Children's Defense Fund lists statistics on child and youth well-being for each of the states and the United States as a whole. Statistics are provided in the following categories: (1) children participating in federally subsidized programs (including Title 1 Education for the Disadvantaged, bilingual education programs,…

  2. Anatomy of the Orphan Stream using RR Lyrae Stars

    NASA Astrophysics Data System (ADS)

    Hendel, David; Johnston, Kathryn; Scowcroft, Victoria; SMHASH

    2018-01-01

    Stellar tidal streams provide an opportunity to study the motion and structure of the disrupting galaxy as well as the gravitational potential of its host. Streams around the Milky Way are especially promising as new datasets make additional phase space dimensions available as constraints. We present observations of 32 stars thought to be RR Lyrae in the Orphan tidal stream as part of the Spitzer Merger History and Shape of the Galactic Halo (SMHASH) program. The extremely tight correlation between the periods, luminosities, and metallicities of RR Lyrae variable stars in the Spitzer IRAC 3.6 μm band allows the determination of precise distances to individual stars; the median statistical distance uncertainty in this sample is 2.5%. By fitting orbits in an example potential we obtain an upper limit on the mass of the Milky Way interior to 60 kpc of 3.9 (+1.2/-0.8) × 10^11 M_sun, bringing estimates based on the Orphan stream in line with those using other tracers. The SMHASH data also resolve the stream in line-of-sight depth, allowing unprecedented access to its internal structure. Comparing this structure with N-body models we find that Orphan had an initial dark halo mass of ~3 × 10^9 M_sun, placing the progenitor amongst the classical dwarf spheroidals.
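    The distance step described above reduces to the distance modulus once a period-luminosity relation supplies the absolute magnitude; the zero point and slope below are placeholders, not the SMHASH 3.6 μm calibration.

      import numpy as np

      def rr_lyrae_distance_kpc(m_apparent, log10_period, zero_point=-1.0, slope=-2.3):
          # Hypothetical period-luminosity relation: M = zero_point + slope * (log P + 0.55).
          M_abs = zero_point + slope * (log10_period + 0.55)
          mu = m_apparent - M_abs                      # distance modulus
          return 10 ** ((mu + 5.0) / 5.0) / 1000.0     # parsecs -> kiloparsecs

      print(rr_lyrae_distance_kpc(m_apparent=17.2, log10_period=np.log10(0.56)))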

  3. Mapping of epistatic quantitative trait loci in four-way crosses.

    PubMed

    He, Xiao-Hong; Qin, Hongde; Hu, Zhongli; Zhang, Tianzhen; Zhang, Yuan-Ming

    2011-01-01

    Four-way crosses (4WC) involving four different inbred lines often appear in plant and animal commercial breeding programs. Direct mapping of quantitative trait loci (QTL) in these commercial populations is both economical and practical. However, the existing statistical methods for mapping QTL in a 4WC population are built on the single-QTL genetic model. This simple genetic model fails to take into account QTL interactions, which play an important role in the genetic architecture of complex traits. In this paper, therefore, we attempted to develop a statistical method to detect epistatic QTL in a 4WC population. Conditional probabilities of QTL genotypes, computed by the multi-point single locus method, were used to sample the genotypes of all putative QTL in the entire genome. The sampled genotypes were used to construct the design matrix for QTL effects. All QTL effects, including main and epistatic effects, were simultaneously estimated by the penalized maximum likelihood method. The proposed method was confirmed by a series of Monte Carlo simulation studies and real data analysis of cotton. The new method will provide novel tools for the genetic dissection of complex traits, construction of QTL networks, and analysis of heterosis.
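
    The following Python sketch is not the authors' penalized maximum likelihood estimator; it uses ridge regression as a stand-in shrinkage method to show how main and pairwise epistatic effects can be estimated simultaneously from a genotype design matrix. Genotype coding and trait values are simulated for illustration only.

      import numpy as np
      from itertools import combinations
      from sklearn.linear_model import Ridge

      rng = np.random.default_rng(1)
      n_individuals, n_loci = 200, 10
      geno = rng.integers(0, 2, size=(n_individuals, n_loci)).astype(float)  # simplified 0/1 coding

      # Design matrix: main effects plus all pairwise (epistatic) interaction terms
      pairs = list(combinations(range(n_loci), 2))
      epistasis = np.column_stack([geno[:, i] * geno[:, j] for i, j in pairs])
      X = np.hstack([geno, epistasis])

      # Simulated trait: one main effect and one epistatic effect plus noise
      y = 1.5 * geno[:, 2] + 2.0 * geno[:, 4] * geno[:, 7] + rng.normal(0, 1, n_individuals)

      fit = Ridge(alpha=10.0).fit(X, y)   # shrinkage plays the role of the penalty
      names = [f"Q{i}" for i in range(n_loci)] + [f"Q{i}xQ{j}" for i, j in pairs]
      top = sorted(zip(names, fit.coef_), key=lambda kv: abs(kv[1]), reverse=True)[:5]
      print("largest estimated effects:", top)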

  4. Statistical Techniques to Analyze Pesticide Data Program Food Residue Observations.

    PubMed

    Szarka, Arpad Z; Hayworth, Carol G; Ramanarayanan, Tharacad S; Joseph, Robert S I

    2018-06-26

    The U.S. EPA conducts dietary-risk assessments to ensure that levels of pesticides on food in the U.S. food supply are safe. Often these assessments utilize conservative residue estimates, maximum residue levels (MRLs), and a high-end estimate derived from registrant-generated field-trial data sets. A more realistic estimate of consumers' pesticide exposure from food may be obtained by utilizing residues from food-monitoring programs, such as the Pesticide Data Program (PDP) of the U.S. Department of Agriculture. A substantial portion of food-residue concentrations in PDP monitoring programs are below the limits of detection (left-censored), which makes the comparison of regulatory-field-trial and PDP residue levels difficult. In this paper, we present a novel adaption of established statistical techniques, the Kaplan-Meier estimator (K-M), the robust regression on ordered statistic (ROS), and the maximum-likelihood estimator (MLE), to quantify the pesticide-residue concentrations in the presence of heavily censored data sets. The examined statistical approaches include the most commonly used parametric and nonparametric methods for handling left-censored data that have been used in the fields of medical and environmental sciences. This work presents a case study in which data of thiamethoxam residue on bell pepper generated from registrant field trials were compared with PDP-monitoring residue values. The results from the statistical techniques were evaluated and compared with commonly used simple substitution methods for the determination of summary statistics. It was found that the maximum-likelihood estimator (MLE) is the most appropriate statistical method to analyze this residue data set. Using the MLE technique, the data analyses showed that the median and mean PDP bell pepper residue levels were approximately 19 and 7 times lower, respectively, than the corresponding statistics of the field-trial residues.
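
    As a minimal sketch of the censored-data idea (with made-up residue values, not PDP data), the Python code below fits a lognormal distribution by maximum likelihood when most observations are non-detects below a limit of detection: detected values contribute a density term and censored values contribute the cumulative probability below the LOD.

      import numpy as np
      from scipy import stats, optimize

      detected = np.array([0.012, 0.034, 0.008, 0.051, 0.022])   # measured residues (ppm)
      n_censored, lod = 40, 0.005                                # non-detects below the LOD

      def neg_log_likelihood(params):
          mu, log_sigma = params
          sigma = np.exp(log_sigma)
          # Jacobian of the log transform is constant in the parameters and omitted
          ll_detected = stats.norm.logpdf(np.log(detected), mu, sigma).sum()
          ll_censored = n_censored * stats.norm.logcdf((np.log(lod) - mu) / sigma)
          return -(ll_detected + ll_censored)

      res = optimize.minimize(neg_log_likelihood, x0=[np.log(0.01), 0.0], method="Nelder-Mead")
      mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
      print(f"MLE lognormal median = {np.exp(mu_hat):.4f} ppm, "
            f"mean = {np.exp(mu_hat + sigma_hat**2 / 2):.4f} ppm")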

  5. MBTA Green Line Tests - Riverside Line, December 1972 : Volume 1. Description.

    DOT National Transportation Integrated Search

    1973-09-01

    The Urban Rail Supporting Technology Program emphasizes three major task areas; facilities development, technology development, and test program development. The test program development is composed of three sub-areas; vehicle testing, ways and struc...

  6. Statistics-based optimization of the polarimetric radar hydrometeor classification algorithm and its application for a squall line in South China

    NASA Astrophysics Data System (ADS)

    Wu, Chong; Liu, Liping; Wei, Ming; Xi, Baozhu; Yu, Minghui

    2018-03-01

    A modified hydrometeor classification algorithm (HCA) is developed in this study for Chinese polarimetric radars. This algorithm is based on the U.S. operational HCA. A methodology of statistics-based optimization is also proposed, including calibration checking, dataset selection, membership-function modification, computation-threshold modification, and effect verification. These procedures are applied to the Zhuhai radar, the first operational polarimetric radar in South China. The systematic calibration bias is corrected, the reliability of radar measurements deteriorates when the signal-to-noise ratio is low, and the correlation coefficient within the melting layer is usually lower than that of the U.S. WSR-88D radar. Through modification based on statistical analysis of polarimetric variables, an HCA localized for Zhuhai is obtained, and it performs well over a one-month test through comparison with sounding and surface observations. The algorithm is then utilized for analysis of a squall line process on 11 May 2014 and is found to provide reasonable details with respect to horizontal and vertical structures, and the HCA results—especially in the mixed rain-hail region—can reflect the life cycle of the squall line. In addition, the kinematic and microphysical processes of cloud evolution and the differences between radar-detected hail and surface observations are also analyzed. The results of this study provide evidence for the improvement of this HCA developed specifically for China.
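
    The membership-function step mentioned above can be pictured with the short Python sketch below; the trapezoid bounds are illustrative numbers only, not the thresholds used by the U.S. operational HCA or the Zhuhai localization.

      import numpy as np

      def trapezoid(x, a, b, c, d):
          # Membership rises from a to b, equals 1 between b and c, and falls from c to d.
          return float(np.clip(min((x - a) / (b - a), (d - x) / (d - c)), 0.0, 1.0))

      # Hypothetical membership bounds for reflectivity Z (dBZ) and differential reflectivity ZDR (dB)
      classes = {
          "rain": {"Z": (20, 30, 50, 55), "ZDR": (0.5, 1.0, 3.0, 4.0)},
          "hail": {"Z": (45, 55, 70, 75), "ZDR": (-1.0, -0.5, 0.5, 1.0)},
      }

      def classify(obs):
          scores = {name: np.mean([trapezoid(obs[v], *bounds[v]) for v in bounds])
                    for name, bounds in classes.items()}
          return max(scores, key=scores.get), scores

      print(classify({"Z": 58.0, "ZDR": 0.2}))   # favours "hail" with these illustrative bounds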

  7. Characterizing L1-norm best-fit subspaces

    NASA Astrophysics Data System (ADS)

    Brooks, J. Paul; Dulá, José H.

    2017-05-01

    Fitting affine objects to data is the basis of many tools and methodologies in statistics, machine learning, and signal processing. The L1 norm is often employed to produce subspaces exhibiting a robustness to outliers and faulty observations. The L1-norm best-fit subspace problem is directly formulated as a nonlinear, nonconvex, and nondifferentiable optimization problem. The case when the subspace is a hyperplane can be solved to global optimality efficiently by solving a series of linear programs. The problem of finding the best-fit line has recently been shown to be NP-hard. We present necessary conditions for optimality for the best-fit subspace problem, and use them to characterize properties of optimal solutions.
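
    A related, much simpler instance of the linear-programming machinery mentioned above (not the paper's subspace formulation) is least-absolute-deviations line fitting, where the sum of absolute residuals is minimized by introducing one auxiliary variable per point. The data below are synthetic.

      import numpy as np
      from scipy.optimize import linprog

      rng = np.random.default_rng(2)
      x = np.linspace(0, 10, 30)
      y = 1.8 * x + 3.0 + rng.normal(0, 0.5, x.size)
      y[5] += 15.0                    # an outlier that an L1 criterion tolerates well

      n = x.size
      # Variables z = [slope, intercept, r_1..r_n] with r_i >= |y_i - slope*x_i - intercept|
      c = np.concatenate([[0.0, 0.0], np.ones(n)])
      A = np.column_stack([x, np.ones(n)])
      A_ub = np.block([[-A, -np.eye(n)], [A, -np.eye(n)]])
      b_ub = np.concatenate([-y, y])
      bounds = [(None, None), (None, None)] + [(0, None)] * n

      res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
      slope, intercept = res.x[:2]
      print(f"L1 (least absolute deviations) fit: y = {slope:.3f} x + {intercept:.3f}")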

  8. AN INDEPENDENT MEASUREMENT OF THE INCIDENCE OF Mg II ABSORBERS ALONG GAMMA-RAY BURST SIGHT LINES: THE END OF THE MYSTERY?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cucchiara, A.; Prochaska, J. X.; Zhu, G.

    2013-08-20

    In 2006, Prochter et al. reported a statistically significant enhancement of very strong Mg II absorption systems intervening the sight lines to gamma-ray bursts (GRBs) relative to the incidence of such absorption along quasar sight lines. This counterintuitive result has inspired a diverse set of astrophysical explanations (e.g., dust, gravitational lensing) but none of these has obviously resolved the puzzle. Using the largest set of GRB afterglow spectra available, we reexamine the purported enhancement. In an independent sample of GRB spectra with a survey path three times larger than Prochter et al., we measure the incidence per unit redshift of ≥1 Å rest-frame equivalent width Mg II absorbers at z ≈ 1 to be l(z) = 0.18 ± 0.06. This is fully consistent with current estimates for the incidence of such absorbers along quasar sight lines. Therefore, we do not confirm the original enhancement and suggest those results suffered from a statistical fluke. Signatures of the original result do remain in our full sample (l(z) shows an ≈1.5 enhancement over l(z)_QSO), but the statistical significance now lies at ≈90% c.l. Restricting our analysis to the subset of high-resolution spectra of GRB afterglows (which overlaps substantially with Prochter et al.), we still reproduce a statistically significant enhancement of Mg II absorption. The reason for this excess, if real, is still unclear since there is no connection between the rapid afterglow follow-up process with echelle (or echellette) spectrographs and the detectability of strong Mg II doublets. Only a larger sample of such high-resolution data will shed some light on this matter.

  9. An Open Label Clinical Trial of a Peptide Treatment Serum and Supporting Regimen Designed to Improve the Appearance of Aging Facial Skin.

    PubMed

    Draelos, Zoe Diana; Kononov, Tatiana; Fox, Theresa

    2016-09-01

    A 14-week single-center clinical usage study was conducted to test the efficacy of a peptide treatment serum and supporting skincare regimen in 29 women with mild to moderately photodamaged facial skin. The peptide treatment serum contained gamma-aminobutyric acid (GABA) and various peptides with neurotransmitter inhibiting and cell signaling properties. It was hypothesized that the peptide treatment serum would ameliorate eye and facial expression lines including crow's feet and forehead lines. The efficacy of the supporting skincare regimen was also evaluated. An expert investigator examined the subjects at rest and at maximum smile. Additionally, the subjects completed self-assessment questionnaires. At week 14, the expert investigator found a statistically significant improvement in facial lines, facial wrinkles, eye lines, and eye wrinkles at rest when compared to baseline results. The expert investigator also found statistically significant improvement at week 14 in facial lines, eye lines, and eye wrinkles when compared to baseline results at maximum smile. In addition, there was continued highly statistically significant improvement in smoothness, softness, firmness, radiance, luminosity, and overall appearance at rest when compared to baseline results at the 14-week time point. The test regimen was well perceived by the subjects for efficacy and product attributes. The products were well tolerated with no adverse events.

    J Drugs Dermatol. 2016;15(9):1100-1106.

  10. 31 CFR 50.14 - Separate line item.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Title 31, Money and Finance: Treasury; Office of the Secretary of the Treasury; Terrorism Risk Insurance Program; Disclosures as Conditions for Federal Payment; § 50.14 Separate line item. An insurer is deemed to be in compliance with the requirement of providing disclosure on a "separate line item in the policy...

  11. Satellite Data Processing System (SDPS) users manual V1.0

    NASA Technical Reports Server (NTRS)

    Caruso, Michael; Dunn, Chris

    1989-01-01

    SDPS is a menu driven interactive program designed to facilitate the display and output of image and line-based data sets common to telemetry, modeling and remote sensing. This program can be used to display up to four separate raster images and overlay line-based data such as coastlines, ship tracks and velocity vectors. The program uses multiple windows to communicate information with the user. At any given time, the program may have up to four image display windows as well as auxiliary windows containing information about each image displayed. SDPS is not a commercial program. It does not contain complete type checking or error diagnostics which may allow the program to crash. Known anomalies will be mentioned in the appropriate section as notes or cautions. SDPS was designed to be used on Sun Microsystems Workstations running SunView1 (Sun Visual/Integrated Environment for Workstations). It was primarily designed to be used on workstations equipped with color monitors, but most of the line-based functions and several of the raster-based functions can be used with monochrome monitors. The program currently runs on Sun 3 series workstations running Sun OS 4.0 and should port easily to Sun 4 and Sun 386 series workstations with SunView1. Users should also be familiar with UNIX, Sun workstations and the SunView window system.

  12. Effects of drain bias on the statistical variation of double-gate tunnel field-effect transistors

    NASA Astrophysics Data System (ADS)

    Choi, Woo Young

    2017-04-01

    The effects of drain bias on the statistical variation of double-gate (DG) tunnel field-effect transistors (TFETs) are discussed in comparison with DG metal-oxide-semiconductor FETs (MOSFETs). Statistical variation corresponds to the variation of threshold voltage (Vth), subthreshold swing (SS), and drain-induced barrier thinning (DIBT). The unique statistical variation characteristics of DG TFETs and DG MOSFETs with the variation of drain bias are analyzed by using full three-dimensional technology computer-aided design (TCAD) simulation in terms of the three dominant variation sources: line-edge roughness (LER), random dopant fluctuation (RDF) and workfunction variation (WFV). It is observed that, unlike DG MOSFETs, DG TFETs suffer from less severe statistical variation as drain voltage increases.

  13. Genome-wide transcriptomic analysis of response to low temperature reveals candidate genes determining divergent cold-sensitivity of maize inbred lines.

    PubMed

    Sobkowiak, Alicja; Jończyk, Maciej; Jarochowska, Emilia; Biecek, Przemysław; Trzcinska-Danielewicz, Joanna; Leipner, Jörg; Fronk, Jan; Sowiński, Paweł

    2014-06-01

    Maize, despite being thermophilic due to its tropical origin, demonstrates high intraspecific diversity in cold-tolerance. To search for molecular mechanisms of this diversity, transcriptomic response to cold was studied in two inbred lines of contrasting cold-tolerance. Microarray analysis was followed by extensive statistical elaboration of data, literature data mining, and gene ontology-based classification. The lines used had been bred earlier specifically for determination of QTLs for cold-performance of photosynthesis. This allowed direct comparison of present transcriptomic data with the earlier QTL mapping results. Cold-treated (14 h at 8/6 °C) maize seedlings of cold-tolerant ETH-DH7 and cold-sensitive ETH-DL3 lines at the V3 stage showed a strong, consistent response of the third leaf transcriptome: several thousand probes showed similar, statistically significant change in both lines, while only tens responded differently in the two lines. The most striking difference between the responses of the two lines to cold was the induction of expression of ca. twenty genes encoding membrane/cell wall proteins exclusively in the cold-tolerant ETH-DH7 line. The common response comprised mainly repression of numerous genes related to photosynthesis and induction of genes related to basic biological activity: transcription, regulation of gene expression, protein phosphorylation, and cell wall organization. Among the genes showing differential response, several were close to the QTL regions identified in earlier studies with the same inbred lines and associated with biometrical, physiological or biochemical parameters. These transcripts, including two apparently non-protein-coding ones, are particularly attractive candidates for future studies on mechanisms determining divergent cold-tolerance of inbred maize lines.

  14. Adaptive filtering in biological signal processing.

    PubMed

    Iyer, V K; Ploysongsang, Y; Ramamoorthy, P A

    1990-01-01

    The high dependence of conventional optimal filtering methods on a priori knowledge of the signal and noise statistics renders them ineffective in dealing with signals whose statistics cannot be predetermined accurately. Adaptive filtering methods offer a better alternative, since the a priori knowledge of statistics is less critical, real-time processing is possible, and the computations are less expensive for this approach. Adaptive filtering methods compute the filter coefficients "on-line", converging to the optimal values in the least-mean-square (LMS) error sense. Adaptive filtering is therefore apt for dealing with the "unknown" statistics situation and has been applied extensively in areas like communication, speech, radar, sonar, seismology, and biological signal processing and analysis for channel equalization, interference and echo canceling, line enhancement, signal detection, system identification, spectral analysis, beamforming, modeling, control, etc. In this review article, adaptive filtering in the context of biological signals is reviewed. An intuitive approach to the underlying theory of adaptive filters and its applicability are presented. Applications of the principles in biological signal processing are discussed in a manner that brings out the key ideas involved. Current and potential future directions in adaptive biological signal processing are also discussed.
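
    A bare-bones LMS adaptive filter of the kind described can be written in a few lines; the sketch below (all signals synthetic) cancels a power-line-like interference from a slowly varying signal using a reference input correlated with the interference.

      import numpy as np

      rng = np.random.default_rng(3)
      n, fs = 2000, 500.0
      t = np.arange(n) / fs
      signal = np.sin(2 * np.pi * 1.2 * t)                 # slow "physiological" component
      interference = 0.8 * np.sin(2 * np.pi * 50.0 * t)    # power-line interference
      primary = signal + interference + 0.05 * rng.normal(size=n)
      reference = np.sin(2 * np.pi * 50.0 * t + 0.3)       # reference correlated with the interference

      order, mu = 8, 0.01                                  # filter length and step size
      w = np.zeros(order)
      output = np.zeros(n)
      for i in range(order, n):
          x = reference[i - order:i][::-1]                 # most recent reference samples
          e = primary[i] - w @ x                           # error = cleaned signal sample
          w += 2 * mu * e * x                              # LMS coefficient update
          output[i] = e

      print("interference power before:", np.var(primary - signal))
      print("interference power after :", np.var(output[order:] - signal[order:]))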

  15. Design Graphics

    NASA Technical Reports Server (NTRS)

    1990-01-01

    A mathematician, David R. Hedgley, Jr., developed a computer program that considers whether a line in a graphic model of a three-dimensional object should or should not be visible. Known as the Hidden Line Computer Code, the program automatically removes superfluous lines and displays an object from a specific viewpoint, just as the human eye would see it. An example of how one company uses the program is the experience of Birdair, which specializes in production of fabric skylights and stadium covers. The fabric, called SHEERFILL, is a Teflon-coated fiberglass material developed in cooperation with DuPont Company. SHEERFILL glazed structures are either tension structures or air-supported tension structures. Both are formed by patterned fabric sheets supported by a steel or aluminum frame or cable network. Birdair uses the Hidden Line Computer Code to illustrate a prospective structure to an architect or owner. The program generates a three-dimensional perspective with the hidden lines removed. This program is still used by Birdair and continues to be commercially available to the public.

  16. Genetic characterization of Russian honey bee stock selected for improved resistance to Varroa destructor.

    PubMed

    Bourgeois, A Lelania; Rinderer, Thomas E

    2009-06-01

    Maintenance of genetic diversity among breeding lines is important in selective breeding and stock management. The Russian Honey Bee Breeding Program has strived to maintain high levels of heterozygosity among its breeding lines since its inception in 1997. After numerous rounds of selection for resistance to tracheal and varroa mites and improved honey production, 18 lines were selected as the core of the program. These lines were grouped into three breeding blocks that were crossbred to improve overall heterozygosity levels of the population. Microsatellite DNA data demonstrated that the program has been successful. Heterozygosity and allelic richness values are high and there are no indications of inbreeding among the three blocks. There were significant levels of genetic structure measured among the three blocks. Block C was genetically distinct from both blocks A and B (F(ST) = 0.0238), whereas blocks A and B did not differ from each other (F(ST) = 0.0074). The same pattern was seen for genic (based on numbers of alleles) differentiation. Genetic distance, as measured by chord distance, indicates that all of the 18 lines are equally distant, with minimal clustering. The data indicate that the overall design of the breeding program has been successful in maintaining high levels of diversity and avoiding problems associated with inbreeding.

  17. Summary Report on NRL Participation in the Microwave Landing System Program.

    DTIC Science & Technology

    1980-08-19

    ...shifters were measured and statistically analyzed. Several research contracts for promising phased array techniques were awarded to industrial contractors... A program was written for compiling statistical data on the measurements, which reads out insertion phase characteristics and standard deviation...

  18. STATISTICAL PROGRAMS OF THE UNITED STATES GOVERNMENT: FISCAL YEAR 2018

    DOT National Transportation Integrated Search

    2018-01-01

    Statistical Programs of the United States Government: Fiscal Year 2018 outlines the funding proposed for Federal statistical activities in the President's Budget. This report, along with the chapter "Strengthening Federal Statistics" in the Analytica...

  19. MBTA Green Line Tests - Riverside Line, December 1972 : Volume 5. Gage Computer Printout.

    DOT National Transportation Integrated Search

    1973-01-01

    The Urban Rail Supporting Technology Program emphasizes three major task areas; facilities development, technology development, and test program development. The test program development is composed of three sub-areas; vehicle testing, ways and struc...

  20. MBTA Green Line Tests - Riverside Line, December 1972 : Volume 4. Westbound Track Profile.

    DOT National Transportation Integrated Search

    1973-01-01

    The Urban Rail Supporting Technology Program emphasizes three major task areas; facilities development, technology development, and test program development. The test program development is composed of three sub-areas; vehicle testing, ways and struc...

  1. MBTA Green Line Tests - Riverside Line, December 1972 : Volume 3. Eastbound Track Profile.

    DOT National Transportation Integrated Search

    1973-01-01

    The Urban Rail Supporting Technology Program emphasizes three major task areas; facilities development, technology development, and test program development. The test program development is composed of three sub-areas; vehicle testing, ways and struc...

  2. Persistant Spectral Hole-Burning: Photon-Gating and Fundamental Statistical Limits

    DTIC Science & Technology

    1989-11-03

    ...pentacene inhomogeneous line that results from the statistics of independent, additive random variables. For this data, Nil - 10'. The rms amplitude...features in inhomogeneous lines. To illustrate this, Figure 5 shows a portion of the optical spectrum of pentacene in p-terphenyl before and after a...contained in each irradiated spot of recording medium. The stress-induced variations in the local environment of the storage centers are random in nature

  3. Laboratory Earth Under the Lens: Diachronic Evaluation of an Integrated Graduate-Level On-Line Earth System Science Course Series for K-12 Educators

    NASA Astrophysics Data System (ADS)

    Low, R.; Gosselin, D. C.; Haney, C.; Larson-Miller, C.; Bonnstetter, R.; Mandryk, C.

    2012-12-01

    Educational research strives to identify the pedagogies that promote student learning. However, the body of research identifying the characteristics of effective teacher preparation is "least strong for science," and is largely based on studies of the effectiveness of individual courses or workshops (NRC 2010). The National Research Council's "Preparing Teachers: Building Evidence for Strong Policy" (2010) provides a mandate for teacher education providers to conduct research on program-scale effectiveness. The high-priority research agenda identified by the NRC is expected to elicit understanding of the aspects of teacher preparation that critically impact classroom student learning outcomes. The Laboratory Lens project is designed to identify effective practices in a teacher education program, with specific reference to the content domain of Earth science. Now in its fifth year, the Masters of Applied Science (MAS) program at UNL offers a variety of science courses, ranging from entomology to food science. The six-course Lab Earth series serves as the backbone of the Specialization for Science Educators within the MAS program, and provides comprehensive content coverage of all Earth science topics identified in the AAAS Benchmarks. "How People Learn" (NRC 2009) emphasizes that expert knowledge includes not only factual knowledge, but also the well-developed conceptual framework critical to the ability to "remember, reason, and solve problems." A focus of our research is to document the process by which the transition from novice to expert takes place in Lab Earth's on-line teacher participants. A feature of our research design is the standardization of evaluation instruments across the six courses. We have used data derived from implementation of the Community of Inquiry Survey (COI) in pilot offerings to ensure that the course sequence is effective in developing a community of learners, while developing their content knowledge. A pre- and post-course Wilcoxon Signed Ranks Test is included in the battery of assessments to ensure that the courses achieve a statistically significant increase in participants' beliefs about their personal science teaching efficacy. The research design also includes the analysis of concept maps and content mastery assignments to assist in documentation of a teacher's transition from mastery of novice to expert knowledge. Content-based, course-specific pre- and post-knowledge surveys are included in the battery of assessments. In the analysis of on-line discussions, the project employs a textual analysis technique outlined in "The Rhetoric of Social Intervention" (RSI) (Opt and Gring 2009). RSI provides a promising analytical framework, especially when examining the development of understanding of scientific topics with societal implications, such as sustainability and climate change. The session provides a description of the integrated research design and data collection and analysis in the first year of this project.
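
    For reference, the pre/post comparison named above can be run in a few lines; the scores here are invented placeholders, not project data.

      from scipy import stats

      pre  = [3.1, 2.8, 3.5, 3.0, 2.6, 3.2, 2.9, 3.4, 3.1, 2.7]   # hypothetical efficacy scores
      post = [3.6, 3.0, 3.9, 3.4, 3.1, 3.3, 3.5, 3.8, 3.2, 3.0]

      stat, p_value = stats.wilcoxon(pre, post)   # paired, non-parametric signed-ranks test
      print(f"Wilcoxon statistic = {stat}, p = {p_value:.4f}")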

  4. Strategies for Selecting Crosses Using Genomic Prediction in Two Wheat Breeding Programs.

    PubMed

    Lado, Bettina; Battenfield, Sarah; Guzmán, Carlos; Quincke, Martín; Singh, Ravi P; Dreisigacker, Susanne; Peña, R Javier; Fritz, Allan; Silva, Paula; Poland, Jesse; Gutiérrez, Lucía

    2017-07-01

    The single most important decision in plant breeding programs is the selection of appropriate crosses. The ideal cross would provide superior predicted progeny performance and enough diversity to maintain genetic gain. The aim of this study was to compare the best crosses predicted using combinations of mid-parent value and progeny-variance prediction, either accounting for linkage disequilibrium or assuming linkage equilibrium. After predicting the mean and the variance of each cross, we selected crosses based on mid-parent value, the top 10% of the progeny, and weighted mean and variance within progenies for grain yield, grain protein content, mixing time, and loaf volume in two applied wheat (Triticum aestivum L.) breeding programs: Instituto Nacional de Investigación Agropecuaria (INIA) Uruguay and CIMMYT Mexico. Although the variance of the progeny is important to increase the chances of finding superior individuals from transgressive segregation, we observed that the mid-parent values of the crosses drove the genetic gain, but the variance of the progeny had a small impact on genetic gain for grain yield. However, the relative importance of the variance of the progeny was larger for quality traits. Overall, the genomic resources and the statistical models are now available to plant breeders to predict both the performance of breeding lines per se and the value of progeny from any potential crosses. Copyright © 2017 Crop Science Society of America.

  5. Anatolian honey is not only sweet but can also protect from breast cancer: Elixir for women from artemis to present.

    PubMed

    Seyhan, Mehmet Fatih; Yılmaz, Eren; Timirci-Kahraman, Özlem; Saygılı, Neslihan; Kısakesen, Halil İbrahim; Eronat, Allison Pınar; Ceviz, Ayşe Begüm; Bilgiç Gazioğlu, Sema; Yılmaz-Aydoğan, Hülya; Öztürk, Oğuz

    2017-09-01

    Recently, natural products with bioactive components have been widely studied on various cancer cell lines for their possible cytotoxic effects. Among these products, honey stands out as a valuable bee product containing many active phenolic compounds and flavonoids. Numerous types of multifloral honey and honeydew honey are produced in Turkey owing to its abundant vegetation. Therefore, in this study, we investigated the cytotoxic effects of particular tree-originated honeys from chestnut, cedar, and pine, as well as multifloral honey, on cell lines representing different types of the most common cancer of women, breast cancer (MCF7, SKBR3, and MDAMB-231), and on the fibrocystic breast epithelial cell line MCF10A as a control. All honey samples were analyzed biochemically. The dose- (1, 2.5, 5, 7.5, and 10 µg/mL) and time- (24th, 48th, and 72nd hours) dependent effects of ethanol/water solutions of the honey samples were scrutinized. Cell viability/cytotoxicity was evaluated by the water-soluble tetrazolium salt-1 (WST-1) method. Apoptotic status was detected by Annexin V-PI assay using FACSCalibur. The statistical analysis was performed using GraphPad Prism 6 and the clustering data analysis with the R programming language. The biochemical analyses of the honey samples showed that the tree-originated honey samples contained more total phenolic compounds than the multifloral honey. The phenolic content of the honey types increases in order of multifloral, pine, cedar, and chestnut, respectively, which is compatible with their cytotoxic effectiveness and dark color. In addition, the antioxidant capacity of the studied honey types was observed to increase in order of multifloral < pine < cedar ≅ chestnut. According to the WST-1 data, chestnut honey induced cytotoxicity over 50% on all the cell lines, including the control MCF10A cells, even with low doses (honey concentrations starting from 1 µg/mL) (P < 0.0001). Similarly, cedar honey was observed to be the second most effective honey in this study. Cedar honey, at a dose of 1 µg/mL, showed highly statistically significant effects on MCF10A, MCF7, and SKBR3. In contrast, pine honey showed dramatically significant cytotoxicity only on the MDAMB-231 cells with a 1 µg/mL dose at the same time point (P = 0.018). While pine honey caused an anticancer effect on the MCF-7 and SKBR3 cancer cell lines with a 2.5-5 µg/mL dose (P < 0.0001), like cedar and chestnut honeys, it increased the viability of the MCF10A control cells at doses of 2.5-5 µg/mL. It only showed cytotoxicity with higher doses (10 µg/mL) on the MCF10A cell line (P < 0.0001). Moreover, we observed that the multifloral and artificial honey samples were mostly ineffective or increased cell viability at doses of 1-5 µg/mL. Apoptotic effects of the other honey samples on the MCF-7 cell line were found to rank as chestnut > pine > cedar > multifloral in the Annexin V-propidium iodide (PI) analysis. Chestnut, cedar, and pine honey displayed a remarkably cytotoxic effect on the breast cancer cell lines MCF7 and SKBR3, and even on the most aggressive MDAMB-231 line, representing triple-negative breast cancer, which lacks targeted anticancer therapy. The chestnut and cedar honeys stand out as the most cytotoxic on all cell lines, while pine honey was found to be the least toxic on control cells with appropriate toxicity on the cancer cells. © 2017 IUBMB Life, 69(9):677-688, 2017. © 2017 International Union of Biochemistry and Molecular Biology.

  6. Taking charge: front-line nurse leadership development.

    PubMed

    Schwarzkopf, Ruth; Sherman, Rose O; Kiger, Anna J

    2012-04-01

    The recent Institute of Medicine (2010) report, The Future of Nursing: Leading Change, Advancing Health, included a recommendation that nurses at all levels should be prepared and enabled to lead change to advance health care in the United States. Historically, in most organizations, nursing leadership development programs have focused on nurses in management or executive roles rather than those working in front-line leadership roles. This article describes a front-line leadership development initiative developed by Tenet Healthcare Corporation and attended by 400 charge nurses. Program development, evaluation, and lessons learned that can be applied in other organizations are discussed. Copyright 2012, SLACK Incorporated.

  7. Detection of Interstellar Urea with Carma

    NASA Astrophysics Data System (ADS)

    Kuo, H.-L.; Snyder, L. E.; Friedel, D. N.; Looney, L. W.; McCall, B. J.; Remijan, A. J.; Lovas, F. J.; Hollis, J. M.

    2010-06-01

    Urea, a molecule discovered in human urine by H. M. Rouelle in 1773, has a significant role in prebiotic chemistry. Previous BIMA observations have suggested that interstellar urea [(NH_2)_2CO] is a compact hot core molecule, like other large molecules, e.g., methyl formate and acetic acid (2009, 64th OSU Symposium On Molecular Spectroscopy, WI05). We have conducted an extensive search for urea toward the high-mass hot molecular core Sgr B2(N-LMH) using CARMA and the IRAM 30 m. Because the spectral lines of heavy molecules like urea tend to be weak and hot cores display lines from a wide range of molecules, a major problem in identifying urea lines is confusion with lines of other molecules. Therefore, it is necessary to detect a number of urea lines and apply sophisticated statistical tests before having confidence in an identification. The 1 mm resolution of CARMA enables favorable coupling of the source size and synthesized beam size, which was found to be essential for the detection of weak signals. The 2.5″ × 2″ synthesized beam of CARMA significantly resolves out the contamination by extended emission and reveals the eight weak urea lines that were previously blended with nearby transitions. Our analysis indicates that these lines are likely to be urea, since the resulting observed line frequencies are coincident with a set of overlapping connecting urea lines, and the observed line intensities are consistent with the expected line strengths of urea. In addition, we have developed a new statistical approach to examine the spatial correlation between the observed lines by applying the Student's t-test to the high-resolution channel maps obtained from CARMA. The t-test shows similar spatial distributions for all eight candidate lines, suggesting a common molecular origin, urea. Our t-test method could have a broad impact on the next generation of arrays, such as ALMA, because the new arrays will require a method to systematically determine the credibility of detections of weaker signals from new and larger interstellar molecules.

  8. Application of visual basic in high-throughput mass spectrometry-directed purification of combinatorial libraries.

    PubMed

    Li, B; Chan, E C Y

    2003-01-01

    We present an approach to customize the sample submission process for high-throughput purification (HTP) of combinatorial parallel libraries using preparative liquid chromatography electrospray ionization mass spectrometry. In this study, Visual Basic and Visual Basic for Applications programs were developed using Microsoft Visual Basic 6 and Microsoft Excel 2000, respectively. These programs are subsequently applied for the seamless electronic submission and handling of data for HTP. Functions were incorporated into these programs where medicinal chemists can perform on-line verification of the purification status and on-line retrieval of postpurification data. The application of these user friendly and cost effective programs in our HTP technology has greatly increased our work efficiency by reducing paper work and manual manipulation of data.

  9. Effects of the Dietary Detoxification Program on Serum γ-glutamyltransferase, Anthropometric Data and Metabolic Biomarkers in Adults.

    PubMed

    Kim, Ju Ah; Kim, Jin Young; Kang, Seung Wan

    2016-09-01

    Persistent Organic Pollutants (POPs) are well-known environmental contaminants which are associated with chronic diseases. As foods are the major sources of human exposure to toxic pollutants, we developed an integrated dietary and education program to help eliminate chemical toxins from the body. The present study evaluated the effects of the dietary detoxification program on serum γ-glutamyltransferase (GGT), anthropometric data and metabolic biomarkers in adults. A single-armed, pre-post study was conducted from June 2013 to June 2015 at a health examination center and a public health center in Seoul, Korea. Sixty-eight subjects (mean age of 52.4 years) were recruited. Subjects participated in 20 hours of dietary education sessions. On-line coaching with SNS was performed to enhance participants' protocol compliance. Physical and laboratory examinations were assessed at weeks 0 and 3. Changes of the serum GGT were correlated with reductions of the body fat percentage (r = .379, p = .001), body fat mass (r = .435, p = .000) and fasting blood glucose (r = .423, p = .000). Serum GGT, weight, body fat percentage, body fat mass, waist circumference, LDL-cholesterol, HDL-cholesterol, triglyceride, total cholesterol, and blood pressure of all participants were reduced with statistical significance in 3 weeks. In the metabolic syndrome group, total cholesterol (p = .049), fasting blood glucose (p = .002), and systolic blood pressure (p = .001) were significantly reduced in comparison with the non-metabolic syndrome group. This dietary detoxification program might decrease serum GGT, which indicated the overall toxic burden in the body. Anthropometric data and metabolic biomarkers were improved. The integrated dietary and education detoxification program seemed to be a protective intervention for elimination of toxicants from the body.

  10. Desktop computer graphics for RMS/payload handling flight design

    NASA Technical Reports Server (NTRS)

    Homan, D. J.

    1984-01-01

    A computer program, the Multi-Adaptive Drawings, Renderings and Similitudes (MADRAS) program, is discussed. The modeling program, written for a desktop computer system (the Hewlett-Packard 9845/C), is written in BASIC and uses modular construction of objects while generating both wire-frame and hidden-line drawings from any viewpoint. The dimensions and placement of objects are user definable. Once the hidden-line calculations are made for a particular viewpoint, the viewpoint may be rotated in pan, tilt, and roll without further hidden-line calculations. The use and results of this program are discussed.

  11. 77 FR 39496 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-03

    ... Awareness Program, Adoption Opportunities Program, Child Abuse and Neglect Program and the Child Welfare... Estimate 90. Program Estimate 45. Performance Measurement On-Line Tool Child Abuse and Neglect 2 per fiscal... DEPARTMENT OF HEALTH AND HUMAN SERVICES Administration for Children and Families Submission for...

  12. Computer Assistance for Writing Interactive Programs: TICS.

    ERIC Educational Resources Information Center

    Kaplow, Roy; And Others

    1973-01-01

    Investigators developed an on-line, interactive programming system--the Teacher-Interactive Computer System (TICS)--to provide assistance to those who were not programmers, but nevertheless wished to write interactive instructional programs. TICS had two components: an author system and a delivery system. Underlying assumptions were that…

  13. Statistical correlation of structural mode shapes from test measurements and NASTRAN analytical values

    NASA Technical Reports Server (NTRS)

    Purves, L.; Strang, R. F.; Dube, M. P.; Alea, P.; Ferragut, N.; Hershfeld, D.

    1983-01-01

    The software and procedures of a system of programs used to generate a report of the statistical correlation between NASTRAN modal analysis results and physical tests results from modal surveys are described. Topics discussed include: a mathematical description of statistical correlation, a user's guide for generating a statistical correlation report, a programmer's guide describing the organization and functions of individual programs leading to a statistical correlation report, and a set of examples including complete listings of programs, and input and output data.
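
    The report's exact correlation statistic is not reproduced here, but one widely used measure of agreement between a test mode shape and a NASTRAN mode shape is the modal assurance criterion (MAC); the short sketch below, with synthetic vectors, illustrates that comparison step only.

      import numpy as np

      def mac(phi_test, phi_fem):
          # MAC between two mode-shape vectors: 1 = perfectly correlated, 0 = orthogonal.
          return np.abs(phi_test @ phi_fem) ** 2 / ((phi_test @ phi_test) * (phi_fem @ phi_fem))

      rng = np.random.default_rng(4)
      phi_fem = rng.normal(size=12)                           # analytical mode shape at 12 DOFs
      phi_test = 0.95 * phi_fem + 0.05 * rng.normal(size=12)  # noisy measured shape

      print(f"MAC = {mac(phi_test, phi_fem):.3f}")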

  14. SIPT: a seismic refraction inverse modeling program for timeshare terminal computer systems

    USGS Publications Warehouse

    Scott, James Henry

    1977-01-01

    SIPT is an interactive Fortran computer program that was developed for use with a timeshare computer system, with program control information submitted from a remote terminal and output data displayed on the terminal or printed on a line printer. The program is an upgraded version of FSIPI (Scott, Tibbetts, and Burdick, 1972) with several major improvements in addition to its adaptation to timeshare operation. The most significant improvement was made in the procedure for handling data from in-line offset shotpoints beyond the end shotpoints of the geophone spread. The changes and improvements are described, user's instructions are outlined, examples of input and output data for a test problem are presented, and the Fortran program is listed in this report. An upgraded batch-mode program, SIPB, is available for users who do not have a timeshare computer system available (Scott, 1977).

  15. SIPB: a seismic refraction inverse modeling program for batch computer systems

    USGS Publications Warehouse

    Scott, James Henry

    1977-01-01

    SIPB is an interactive Fortran computer program that was developed for use with a timeshare computer system, with program control information submitted from a remote terminal and output data displayed on the terminal or printed on a line printer. The program is an upgraded version of FSIPI (Scott, Tibbetts, and Burdick, 1972) with several major improvements in addition to its adaptation to timeshare operation. The most significant improvement was made in the procedure for handling data from in-line offset shotpoints beyond the end shotpoints of the geophone spread. The changes and improvements are described, user's instructions are outlined, examples of input and output data for a test problem are presented, and the Fortran program is listed in this report. An upgraded batch-mode program, SIPB, is available for users who do not have a timeshare computer system available (Scott, 1977).

  16. Taxonomy and clustering in collaborative systems: The case of the on-line encyclopedia Wikipedia

    NASA Astrophysics Data System (ADS)

    Capocci, A.; Rao, F.; Caldarelli, G.

    2008-01-01

    In this paper we investigate the nature and structure of the relation between imposed classifications and real clustering in a particular case of a scale-free network given by the on-line encyclopedia Wikipedia. We find a statistical similarity in the distributions of community sizes both by using the top-down approach of the categories division present in the archive and in the bottom-up procedure of community detection given by an algorithm based on the spectral properties of the graph. Regardless of the statistically similar behaviour, the two methods provide a rather different division of the articles, thereby signaling that the nature and presence of power laws is a general feature for these systems and cannot be used as a benchmark to evaluate the suitability of a clustering method.
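
    A minimal way to test the "statistical similarity" of two community-size distributions (with synthetic heavy-tailed sizes standing in for the Wikipedia data) is a two-sample Kolmogorov-Smirnov test, as sketched below.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      sizes_categories = rng.pareto(a=1.5, size=500) + 1   # top-down category sizes (synthetic)
      sizes_algorithm  = rng.pareto(a=1.5, size=500) + 1   # bottom-up community sizes (synthetic)

      stat, p_value = stats.ks_2samp(sizes_categories, sizes_algorithm)
      print(f"KS statistic = {stat:.3f}, p = {p_value:.3f}")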

  17. The Japanese and the American First-Line Supervisor.

    ERIC Educational Resources Information Center

    Bryan, Leslie A., Jr.

    1982-01-01

    Compares the American and Japanese first-line supervisor: production statistics, supervisory style, company loyalty, management style, and communication. Also suggests what Americans might learn from the Japanese methods. (CT)

  18. Status Report on Female Completers in New Jersey Vocational Education 1990.

    ERIC Educational Resources Information Center

    Montclair State Coll., Upper Montclair, NJ. Life Skills Center.

    The New Jersey Occupational Information Coordinating Committee's statistics for average annual predicted job openings for program year 1989 are given in this report, along with the New Jersey Division of Vocational Education completers' statistics for the 1988-89 school year. The numbers of male and female completers of secondary programs for each…

  19. MBTA Green Line Tests - Riverside Line, December 1972 : Volume 2. Track Geometry Data Plots.

    DOT National Transportation Integrated Search

    1973-09-01

    The Urban Rail Supporting Technology Program emphasizes three major task areas; facilities development, technology development, and test program development. The test program development is composed of three sub-areas; vehicle testing, ways and struc...

  20. Using the Properties of Broad Absorption Line Quasars to Illuminate Quasar Structure

    NASA Astrophysics Data System (ADS)

    Yong, Suk Yee; King, Anthea L.; Webster, Rachel L.; Bate, Nicholas F.; O'Dowd, Matthew J.; Labrie, Kathleen

    2018-06-01

    A key to understanding quasar unification paradigms is the emission properties of broad absorption line quasars (BALQs). The fact that only a small fraction of quasar spectra exhibit deep absorption troughs blueward of the broad permitted emission lines provides a crucial clue to the structure of quasar emitting regions. To learn whether it is possible to discriminate between the BALQ and non-BALQ populations given the observed spectral properties of a quasar, we employ two approaches, one based on statistical methods and the other on supervised machine learning classification, applied to quasar samples from the Sloan Digital Sky Survey. The features explored include continuum and emission line properties, in particular the absolute magnitude, redshift, spectral index, line width, asymmetry, strength, and relative velocity offsets of the high-ionisation C IV λ1549 and low-ionisation Mg II λ2798 lines. We consider a complete population of quasars, and assume that the statistical distributions of properties represent all angles where the quasar is viewed without obscuration. The distributions of the BALQ and non-BALQ sample properties show few significant differences. None of the observed continuum and emission line features are capable of differentiating between the two samples. Most published narrow disk-wind models are inconsistent with these observations, and an alternative disk-wind model is proposed. The key feature of the proposed model is a disk-wind filling a wide opening angle with multiple radial streams of dense clumps.
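
    As an illustration of the supervised-classification approach (a random forest is one possible choice; the feature values below are simulated, not SDSS measurements), the sketch mirrors the paper's null result: with uninformative labels the cross-validated accuracy hovers near chance.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(6)
      n = 1000
      X = np.column_stack([
          rng.normal(-25, 1, n),       # absolute magnitude
          rng.uniform(1.5, 4.0, n),    # redshift
          rng.normal(0, 1, n),         # spectral index
          rng.normal(5000, 1500, n),   # C IV line width (km/s)
          rng.normal(0, 0.3, n),       # line asymmetry
      ])
      y = rng.integers(0, 2, n)        # BALQ (1) vs non-BALQ (0); random labels, so no real signal

      clf = RandomForestClassifier(n_estimators=200, random_state=0)
      print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())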

  1. THREE-POINT PHASE CORRELATIONS: A NEW MEASURE OF NONLINEAR LARGE-SCALE STRUCTURE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolstenhulme, Richard; Bonvin, Camille; Obreschkow, Danail

    2015-05-10

    We derive an analytical expression for a novel large-scale structure observable: the line correlation function. The line correlation function, which is constructed from the three-point correlation function of the phase of the density field, is a robust statistical measure allowing the extraction of information in the nonlinear and non-Gaussian regime. We show that, in perturbation theory, the line correlation is sensitive to the coupling kernel F_2, which governs the nonlinear gravitational evolution of the density field. We compare our analytical expression with results from numerical simulations and find a 1σ agreement for separations r ≳ 30 h⁻¹ Mpc. Fitting formulae for the power spectrum and the nonlinear coupling kernel at small scales allow us to extend our prediction into the strongly nonlinear regime, where we find a 1σ agreement with the simulations for r ≳ 2 h⁻¹ Mpc. We discuss the advantages of the line correlation relative to standard statistical measures like the bispectrum. Unlike the latter, the line correlation is independent of the bias, in the regime where the bias is local and linear. Furthermore, the variance of the line correlation is independent of the Gaussian variance on the modulus of the density field. This suggests that the line correlation can probe more precisely the nonlinear regime of gravity, with less contamination from the power spectrum variance.
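
    A toy one-dimensional version of the underlying idea (not the authors' estimator) is sketched below: the density field is whitened by keeping only its Fourier phases, and the three-point product of the whitened field at symmetric separations is averaged along the line.

      import numpy as np

      rng = np.random.default_rng(7)
      n = 1024
      delta = rng.normal(size=n)                      # toy density contrast on a 1D periodic grid
      delta_k = np.fft.rfft(delta)
      phases = delta_k / np.where(np.abs(delta_k) > 0, np.abs(delta_k), 1.0)
      epsilon = np.fft.irfft(phases, n)               # phase-only ("whitened") field

      def line_correlation(eps, r):
          # Average of eps(x - r) * eps(x) * eps(x + r) over all x on the periodic grid.
          return np.mean(np.roll(eps, r) * eps * np.roll(eps, -r))

      for r in (2, 8, 32):
          print(f"r = {r:3d}   l(r) ~ {line_correlation(epsilon, r):.2e}")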

  2. 76 FR 81984 - Agency Information Collection Activities; Submission for OMB Review; Comment Request; Local Area...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-29

    ... for OMB Review; Comment Request; Local Area Unemployment Statistics Program ACTION: Notice. SUMMARY... collection request (ICR) titled, ``Local Area Unemployment Statistics Program,'' to the Office of Management... of Collection: Local Area Unemployment Statistics Program. OMB Control Number: 1220-0017. Affected...

  3. Baldwin Effect and Additional BLR Component in AGN with Superluminal Jets

    NASA Astrophysics Data System (ADS)

    Patiño Álvarez, Víctor; Torrealba, Janet; Chavushyan, Vahram; Cruz González, Irene; Arshakian, Tigran; León Tavares, Jonathan; Popovic, Luka

    2016-06-01

    We study the Baldwin Effect (BE) in 96 core-jet blazars with optical and ultraviolet spectroscopic data from a radio-loud AGN sample obtained from the MOJAVE 2 cm survey. A statistical analysis is presented of the equivalent widths W_lambda of the emission lines H beta 4861, Mg II 2798, and C IV 1549, and of the continuum luminosities at 5100, 3000, and 1350 angstroms. The BE is found statistically significant (with confidence level c.l. > 95%) in the H beta and C IV emission lines, while for Mg II the trend is slightly less significant (c.l. = 94.5%). The slopes of the BE for H beta and Mg II in the studied samples are found to be steeper than those of a comparison radio-quiet sample, with a statistically significant difference. We present simulations of the expected BE slopes produced by the contribution to the total continuum of the non-thermal boosted emission from the relativistic jet, and by variability of the continuum components. We find that the slopes of the BE between radio-quiet and radio-loud AGN should not be different, under the assumption that the broad line is only being emitted by the canonical broad line region around the black hole. We argue that the BE slope steepening in radio AGN is due to a jet-associated broad-line region.

  4. Multidirectional Scanning Model, MUSCLE, to Vectorize Raster Images with Straight Lines

    PubMed Central

    Karas, Ismail Rakip; Bayram, Bulent; Batuk, Fatmagul; Akay, Abdullah Emin; Baz, Ibrahim

    2008-01-01

    This paper presents a new model, MUSCLE (Multidirectional Scanning for Line Extraction), for automatic vectorization of raster images with straight lines. The algorithm of the model implements the line thinning and the simple neighborhood methods to perform vectorization. The model allows users to define specified criteria which are crucial for acquiring the vectorization process. In this model, various raster images can be vectorized such as township plans, maps, architectural drawings, and machine plans. The algorithm of the model was developed by implementing an appropriate computer programming and tested on a basic application. Results, verified by using two well known vectorization programs (WinTopo and Scan2CAD), indicated that the model can successfully vectorize the specified raster data quickly and accurately. PMID:27879843

  5. Multidirectional Scanning Model, MUSCLE, to Vectorize Raster Images with Straight Lines.

    PubMed

    Karas, Ismail Rakip; Bayram, Bulent; Batuk, Fatmagul; Akay, Abdullah Emin; Baz, Ibrahim

    2008-04-15

    This paper presents a new model, MUSCLE (Multidirectional Scanning for Line Extraction), for automatic vectorization of raster images with straight lines. The algorithm of the model implements the line thinning and the simple neighborhood methods to perform vectorization. The model allows users to define specified criteria which are crucial for acquiring the vectorization process. In this model, various raster images can be vectorized such as township plans, maps, architectural drawings, and machine plans. The algorithm of the model was developed by implementing an appropriate computer programming and tested on a basic application. Results, verified by using two well known vectorization programs (WinTopo and Scan2CAD), indicated that the model can successfully vectorize the specified raster data quickly and accurately.

  6. A Management Information System Model for Program Management. Ph.D. Thesis - Oklahoma State Univ.; [Computerized Systems Analysis

    NASA Technical Reports Server (NTRS)

    Shipman, D. L.

    1972-01-01

    The development of a model to simulate the information system of a program management type of organization is reported. The model statistically determines the following parameters: type of messages, destinations, delivery durations, type processing, processing durations, communication channels, outgoing messages, and priorites. The total management information system of the program management organization is considered, including formal and informal information flows and both facilities and equipment. The model is written in General Purpose System Simulation 2 computer programming language for use on the Univac 1108, Executive 8 computer. The model is simulated on a daily basis and collects queue and resource utilization statistics for each decision point. The statistics are then used by management to evaluate proposed resource allocations, to evaluate proposed changes to the system, and to identify potential problem areas. The model employs both empirical and theoretical distributions which are adjusted to simulate the information flow being studied.

  7. The National Shipbuilding Research Program. Proceedings of the IREAPS Technical Symposium. Paper No. 24: MAPLIS: An On-Line Materials Resource Planning System Tailored to the Shipbuilding and Offshore Industry

    DTIC Science & Technology

    1982-09-01

    Report documentation fragment: The National Shipbuilding Research Program, September 1982, NSRP 0009; U.S. Department of the Navy, Carderock Division, Naval Surface Warfare Center. Proceedings of the IREAPS Technical Symposium, Paper No. 24: MAPLIS: An On-Line Materials Resource Planning System Tailored to the Shipbuilding and Offshore Industry.

  8. Can You Explain that in Plain English? Making Statistics Group Projects Work in a Multicultural Setting

    ERIC Educational Resources Information Center

    Sisto, Michelle

    2009-01-01

    Students increasingly need to learn to communicate statistical results clearly and effectively, as well as to become competent consumers of statistical information. These two learning goals are particularly important for business students. In line with reform movements in Statistics Education and the GAISE guidelines, we are working to implement…

  9. Genetic variability among elite popcorn lines based on molecular and morphoagronomic characteristics.

    PubMed

    Dos Santos, J F; Mangolin, C A; Machado, M F P S; Scapim, C A; Giordani, W; Gonçalves, L S A

    2017-06-29

    Knowledge of genetic diversity among genotypes and relationships among elite lines is of great importance for the development of breeding programs. Therefore, the objective of this study was to evaluate genetic variability based on the morphoagronomic and molecular characterization of 18 elite popcorn (Zea mays var. everta) lines to be used by Universidade Estadual de Maringá breeding programs. We used 31 microsatellite primers (widely distributed in the genome) and 16 morphological descriptors (including the resistance to maize white spot, common rust, polysora rust of maize, cercospora and leaf blights). The molecular data revealed variability among the lines, which were divided into four groups that were partially concordant with unweighted pair group method with arithmetic mean (UPGMA) and Bayesian clusters. The lines G3, G4, G11, and G13 exhibited favorable morphological characters and low disease incidence rates. The four groups were confirmed using the Gower distance in the UPGMA cluster; however, there was no association with the dissimilarity patterns obtained using the molecular data. The absence of a correlation suggests that both characterizations (morphoagronomic and molecular) are important for discriminating among elite popcorn lines.
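
    The UPGMA grouping step can be reproduced generically with average-linkage hierarchical clustering; the sketch below uses random marker scores and a simple Hamming distance as a stand-in for the genetic distances reported in the study.

      import numpy as np
      from scipy.spatial.distance import pdist
      from scipy.cluster.hierarchy import linkage, fcluster

      rng = np.random.default_rng(8)
      n_lines, n_markers = 18, 31
      markers = rng.integers(0, 4, size=(n_lines, n_markers))   # hypothetical allele scores

      dist = pdist(markers, metric="hamming")   # stand-in for a genetic distance matrix
      tree = linkage(dist, method="average")    # "average" linkage corresponds to UPGMA
      groups = fcluster(tree, t=4, criterion="maxclust")
      print("UPGMA group assignment per line:", groups)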

  10. 2009 GED Testing Program Statistical Report

    ERIC Educational Resources Information Center

    GED Testing Service, 2010

    2010-01-01

    The "2009 GED[R] Testing Program Statistical Report" is the 52nd annual report in the program's 68-year history of providing a second opportunity for adults without a high school credential to earn their jurisdiction's GED credential. The report provides candidate demographic and GED Test performance statistics as well as historical…

  11. The Time-Limited Hot Line.

    ERIC Educational Resources Information Center

    Loring, Marti Tamm; Wimberley, Edward T.

    1993-01-01

    Notes that media have become involved in creating programs and addressing issues that have been historically exclusive purview of mental health and human services agencies. Explains how time-limited hot line has been used to address specific issues raised by these programs. Provides overview of this type of hot line, offering triangular model of…

  12. Two-Bin Kanban: Ordering Impact at Navy Medical Center San Diego

    DTIC Science & Technology

    2016-06-17

    pretest (2013 data set) and posttest (2015 data set) analysis to avoid having the findings influenced by price changes. DMLSS does not track shipping...statistics based on those observations (Kabacoff, 2011, p. 112). Replacing the groups of observations with summary statistics allows the analyst...listed on the Acquisition Research Program website (www.acquisitionresearch.net). Acquisition Research Program Graduate School of Business & Public

  13. 34 CFR 642.12 - What activities may a project conduct?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... POSTSECONDARY EDUCATION, DEPARTMENT OF EDUCATION TRAINING PROGRAM FOR FEDERAL TRIO PROGRAMS What Types of... project conduct? A Training program project may include on-site training, on-line training, conferences, internships, seminars, workshops, and the publication of manuals designed to improve the operations of Federal...

  14. 34 CFR 642.12 - What activities may a project conduct?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... POSTSECONDARY EDUCATION, DEPARTMENT OF EDUCATION TRAINING PROGRAM FOR FEDERAL TRIO PROGRAMS What Types of... project conduct? A Training program project may include on-site training, on-line training, conferences, internships, seminars, workshops, and the publication of manuals designed to improve the operations of Federal...

  15. 34 CFR 642.12 - What activities may a project conduct?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... POSTSECONDARY EDUCATION, DEPARTMENT OF EDUCATION TRAINING PROGRAM FOR FEDERAL TRIO PROGRAMS What Types of... project conduct? A Training program project may include on-site training, on-line training, conferences, internships, seminars, workshops, and the publication of manuals designed to improve the operations of Federal...

  16. 34 CFR 642.12 - What activities may a project conduct?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... POSTSECONDARY EDUCATION, DEPARTMENT OF EDUCATION TRAINING PROGRAM FOR FEDERAL TRIO PROGRAMS What Types of... project conduct? A Training program project may include on-site training, on-line training, conferences, internships, seminars, workshops, and the publication of manuals designed to improve the operations of Federal...

  17. Disability management: organizational diversity and Dutch employment policy.

    PubMed

    Kopnina, Helen; Haafkens, Joke A

    2010-06-01

    While Human Resource Managers (HRM) and line managers could play a significant role in the prevention of job-related problems and in the promotion of early job continuation, it is not clear whether chronically ill workers are recognized as a group. Unlike some other groups, distinguished by gender, age or ethnicity, those with chronic illness are less distinct and may not be included in diversity management programs. The aim of this research is to address theory and evidence in the literature about the topic, as well as to inquire whether chronic illness of employees is 'visible' in practice. For desk research, we used a systematic search strategy involving medical, statistical, management, and social science databases (Web of Science, MedLine, PubMed, PsycINFO, etc.). Research results are based on case studies conducted with the managers and HRM of government and commercial organizations between March 2007 and October 2008 and between October 2008 and April 2009. These case studies were based on open interviews and focus group sessions (for human resource departments), which were subsequently analyzed using thematic analysis. For the group sessions, we used concept mapping to collect information from two groups of HRM professionals and managers. Secondary analysis included thematic and content analysis of 'best practice' organizations carried out by the Dutch organization Gatekeeper. We found that chronically ill employees are largely invisible to HRM practitioners and line managers, who do not always have the right instruments for implementation of the European or national frameworks. Most practitioners are unaware of the impact of chronic illness in their organizations and in their employees' work lives.

  18. An Assessment Blueprint for EncStat: A Statistics Anxiety Intervention Program.

    ERIC Educational Resources Information Center

    Watson, Freda S.; Lang, Thomas R.; Kromrey, Jeffrey D.; Ferron, John M.; Hess, Melinda R.; Hogarty, Kristine Y.

    EncStat (Encouraged about Statistics) is a multimedia program being developed to identify and assist students with statistics anxiety or negative attitudes about statistics. This study explored the validity of the assessment instruments included in EncStat with respect to their diagnostic value for statistics anxiety and negative attitudes about…

  19. Comparison and statistical analysis of four write stability metrics in bulk CMOS static random access memory cells

    NASA Astrophysics Data System (ADS)

    Qiu, Hao; Mizutani, Tomoko; Saraya, Takuya; Hiramoto, Toshiro

    2015-04-01

    The four commonly used metrics for write stability were measured and compared on the same set of 2048 (2k) six-transistor (6T) static random access memory (SRAM) cells fabricated in a 65 nm bulk technology. A preferred metric should be effective for yield estimation and help predict the edge of stability. The results demonstrate that all metrics identify the same worst SRAM cell. On the other hand, compared to the butterfly curve, which shows non-normality, and the write N-curve, where no cell state flip occurs, the bit-line and word-line margins show good normality as well as almost perfect correlation. As a result, both the bit-line and word-line methods prove to be the preferred write stability metrics.

  20. On-line estimation of error covariance parameters for atmospheric data assimilation

    NASA Technical Reports Server (NTRS)

    Dee, Dick P.

    1995-01-01

    A simple scheme is presented for on-line estimation of covariance parameters in statistical data assimilation systems. The scheme is based on a maximum-likelihood approach in which estimates are produced on the basis of a single batch of simultaneous observations. Single-sample covariance estimation is reasonable as long as the number of available observations exceeds the number of tunable parameters by two or three orders of magnitude. Not much is known at present about model error associated with actual forecast systems. Our scheme can be used to estimate some important statistical model error parameters such as regionally averaged variances or characteristic correlation length scales. The advantage of the single-sample approach is that it does not rely on any assumptions about the temporal behavior of the covariance parameters: time-dependent parameter estimates can be continuously adjusted on the basis of current observations. This is of practical importance since it is likely to be the case that both model error and observation error strongly depend on the actual state of the atmosphere. The single-sample estimation scheme can be incorporated into any four-dimensional statistical data assimilation system that involves explicit calculation of forecast error covariances, including optimal interpolation (OI) and the simplified Kalman filter (SKF). The computational cost of the scheme is high but not prohibitive; on-line estimation of one or two covariance parameters in each analysis box of an operational boxed-OI system is currently feasible. A number of numerical experiments, performed with an adaptive SKF and an adaptive version of OI using a linear two-dimensional shallow-water model and artificially generated model error, are described. The performance of the nonadaptive versions of these methods turns out to depend rather strongly on correct specification of model error parameters. These parameters are estimated under a variety of conditions, including uniformly distributed model error and time-dependent model error statistics.
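
    As a toy illustration of the single-batch maximum-likelihood idea, the sketch below treats a batch of innovations as draws from N(0, sigma_b^2 + sigma_o^2) and tunes the observation-error variance with the background variance held fixed. All quantities are synthetic, and the one-parameter setup is an assumption made for brevity.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    # One batch of innovations d = y - H x_f, assumed ~ N(0, sigma_b^2 + sigma_o^2).
    rng = np.random.default_rng(42)
    sigma_b2, true_sigma_o2, n_obs = 1.0, 0.5, 2000
    innovations = rng.normal(0.0, np.sqrt(sigma_b2 + true_sigma_o2), n_obs)

    def neg_log_likelihood(sigma_o2):
        var = sigma_b2 + sigma_o2
        return 0.5 * np.sum(np.log(2 * np.pi * var) + innovations**2 / var)

    fit = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 10.0), method="bounded")
    print(f"estimated observation-error variance: {fit.x:.3f}")  # ~0.5
    ```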

  1. Which level of evidence does the US National Toxicology Program provide? Statistical considerations using the Technical Report 578 on Ginkgo biloba as an example.

    PubMed

    Gaus, Wilhelm

    2014-09-02

    The US National Toxicology Program (NTP) is assessed by a statistician. In the NTP program, groups of rodents are fed for a certain period of time with different doses of the substance being investigated. Then the animals are sacrificed and all organs are examined pathologically. Such an investigation facilitates many statistical tests. Technical Report TR 578 on Ginkgo biloba is used as an example. More than 4800 statistical tests are possible with the investigations performed. Based on a thought experiment, we expect >240 falsely significant tests. In actuality, 209 significant pathological findings were reported. The readers of Toxicology Letters should carefully distinguish between confirmative and explorative statistics. A confirmative interpretation of a significant test rejects the null hypothesis and delivers "statistical proof". It is only allowed if (i) a precise hypothesis was established independently of the data used for the test and (ii) the computed p-values are adjusted for multiple testing if more than one test was performed. Otherwise, an explorative interpretation generates a hypothesis. We conclude that NTP reports - including TR 578 on Ginkgo biloba - deliver explorative statistics, i.e., they generate hypotheses, but do not prove them. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
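
    The report's counting argument can be reproduced in a few lines: with roughly 4800 tests at alpha = 0.05, about 240 false rejections are expected even when every null hypothesis is true, and a family-wise adjustment such as Holm's step-down removes most of them. The simulation below is illustrative only; it does not use the TR 578 data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_tests, alpha = 4800, 0.05
    p_values = rng.uniform(size=n_tests)            # all null hypotheses true

    print("expected false positives:", n_tests * alpha)             # 240
    print("unadjusted 'significant':", int(np.sum(p_values < alpha)))

    # Holm step-down: compare the k-th smallest p-value with alpha / (m - k)
    order = np.argsort(p_values)
    holm_significant = 0
    for k, idx in enumerate(order):
        if p_values[idx] < alpha / (n_tests - k):
            holm_significant += 1
        else:
            break
    print("Holm-adjusted significant:", holm_significant)           # almost always 0
    ```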

  2. Lung Ultrasonography in the Evaluation of Interstitial Lung Disease in Systemic Connective Tissue Diseases: Criteria and Severity of Pulmonary Fibrosis - Analysis of 52 Patients.

    PubMed

    Buda, N; Piskunowicz, M; Porzezińska, M; Kosiak, W; Zdrojewski, Z

    2016-08-01

    Patients with a diagnosed systemic connective tissue disease require regular monitoring from the point of view of interstitial lung disease. The main aim of this work is a description of the criteria for pulmonary fibrosis and of the degree of severity of the fibrosis during the course of interstitial lung disease, as assessed by transthoracic lung ultrasound (TLU). 52 patients with diagnosed diffuse interstitial lung disease were qualified for this research, together with 50 volunteers in the control group. The patients in both groups were over 18 years of age and were of both sexes. The TLU results of the patients underwent statistical analysis and were compared to High-Resolution Computed Tomography (HRCT) results. As a consequence of the statistical analysis, we defined our own criteria for pulmonary fibrosis in TLU: irregularity of the pleural line, tightening of the pleural line, fragmentation of the pleural line, blurring of the pleural line, thickening of the pleural line, B-line artifacts ≤ 3 and ≥ 4, Am-line artifacts, and subpleural consolidations < 5 mm. As a result of the conducted research, a scale of severity of pulmonary fibrosis in TLU was devised (UFI - Ultrasound Fibrosis Index), enabling a division to be made into mild, moderate and severe cases. Transthoracic lung ultrasonography (TLU) offers a new, non-invasive diagnostic option for pulmonary fibrosis that is devoid of ionising radiation. This research has allowed us to identify two new ultrasound signs of pulmonary fibrosis (blurred pleural line and Am lines). © Georg Thieme Verlag KG Stuttgart · New York.

  3. Statistics: Can We Get beyond Terminal?

    ERIC Educational Resources Information Center

    Green, Suzy; Carney, JoLynn V.

    Recent articles in behavioral sciences statistics literature address the need for modernizing graduate statistics programs and courses. This paper describes the development of one such course and evaluates student background for a class designed to provide a more consumer-oriented type of statistics instruction by focusing on the needs of students…

  4. Random sampling technique for ultra-fast computations of molecular opacities for exoplanet atmospheres

    NASA Astrophysics Data System (ADS)

    Min, M.

    2017-10-01

    Context. Opacities of molecules in exoplanet atmospheres rely on increasingly detailed line lists for these molecules. The line lists available today contain, for many species, up to several billion lines. Computation of the spectral line profile created by pressure and temperature broadening, the Voigt profile, for all of these lines is becoming a computational challenge. Aims: We aim to create a method to compute the Voigt profile in a way that automatically focuses the computation time on the strongest lines, while still maintaining the continuum contribution of the large number of weaker lines. Methods: Here, we outline a statistical line sampling technique that samples the Voigt profile quickly and with high accuracy. The number of samples is adjusted to the strength of the line and the local spectral line density. This automatically provides high-accuracy line shapes for strong lines or lines that are spectrally isolated. The line sampling technique automatically preserves the integrated line opacity for all lines, thereby also providing the continuum opacity created by the large number of weak lines at very low computational cost. Results: The line sampling technique is tested for accuracy when computing line spectra and correlated-k tables. Extremely fast computations (~3.5 × 10^5 lines per second per core on a standard current-day desktop computer) with high accuracy (≤1% almost everywhere) are obtained. A detailed recipe on how to perform the computations is given.
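
    A minimal sketch of the strength-weighted sampling idea follows: each line receives a number of Monte Carlo frequency samples proportional to its strength (a Voigt deviate is a Gaussian plus a Cauchy deviate), and equal-opacity packets are deposited on a wavenumber grid so the integrated opacity of every line is preserved. The line parameters and the sample-count rule are invented for illustration and are not the paper's recipe.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    grid = np.linspace(0.0, 100.0, 2000)
    dnu = grid[1] - grid[0]
    opacity = np.zeros_like(grid)

    lines = [  # (center, strength, sigma, gamma) -- hypothetical values
        (30.0, 1.0e-20, 0.05, 0.02),
        (60.0, 1.0e-23, 0.05, 0.02),
    ]

    for nu0, strength, sigma, gamma in lines:
        n_samples = max(10, int(1e3 * strength / 1e-21))   # more samples if stronger
        # Gaussian + Lorentzian deviates give a Voigt-distributed frequency sample
        nu = nu0 + rng.normal(0, sigma, n_samples) + gamma * rng.standard_cauchy(n_samples)
        idx = np.clip(np.round((nu - grid[0]) / dnu).astype(int), 0, grid.size - 1)
        np.add.at(opacity, idx, strength / (n_samples * dnu))  # equal-weight packets

    print("integrated opacity:", opacity.sum() * dnu, "vs", sum(l[1] for l in lines))
    ```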

  5. Structure of small-scale magnetic fields in the kinematic dynamo theory.

    PubMed

    Schekochihin, Alexander; Cowley, Steven; Maron, Jason; Malyshkin, Leonid

    2002-01-01

    A weak fluctuating magnetic field embedded into a turbulent conducting medium grows exponentially while its characteristic scale decays. In the interstellar medium and protogalactic plasmas, the magnetic Prandtl number is very large, so a broad spectrum of growing magnetic fluctuations is excited at small (subviscous) scales. The condition for the onset of nonlinear back reaction depends on the structure of the field lines. We study the statistical correlations that are set up in the field pattern and show that the magnetic-field lines possess a folding structure, where most of the scale decrease is due to the field variation across itself (rapid transverse direction reversals), while the scale of the field variation along itself stays approximately constant. Specifically, we find that, though both the magnetic energy and the mean-square curvature of the field lines grow exponentially, the field strength and the field-line curvature are anticorrelated, i.e., the curved field is relatively weak, while the growing field is relatively flat. The detailed analysis of the statistics of the curvature shows that it possesses a stationary limiting distribution with the bulk located at the values of curvature comparable to the characteristic wave number of the velocity field and a power tail extending to large values of curvature where it is eventually cut off by the resistive regularization. The regions of large curvature, therefore, occupy only a small fraction of the total volume of the system. Our theoretical results are corroborated by direct numerical simulations. The implication of the folding effect is that the advent of the Lorentz back reaction occurs when the magnetic energy approaches that of the smallest turbulent eddies. Our results also directly apply to the problem of statistical geometry of the material lines in a random flow.

  6. Conference: Statistical Physics and Biological Information

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gross, David J.; Hwa, Terence

    OAK B188 In the spring of 2001, the Institute for Theoretical Physics ran a 6 month scientific program on Statistical Physics and Biological Information. This program was organized by Walter Fitch (UC Irvine), Terence Hwa (UC San Diego), Luca Peliti (University Federico II, Naples), Gary Stormo (Washington University School of Medicine) and Chao Tang (NEC). Overall scientific supervision was provided by David Gross, Director, ITP. The ITP has an online conference/program proceeding which consists of audio and transparencies of almost all of the talks held during this program. Over 100 talks are available on the site at http://online.kitp.ucsb.edu/online/infobio01/.

  7. Improving the performance of a filling line based on simulation

    NASA Astrophysics Data System (ADS)

    Jasiulewicz-Kaczmarek, M.; Bartkowiak, T.

    2016-08-01

    The paper describes a method of improving the performance of a filling line based on simulation. The study concerns a production line located in a manufacturing centre of an FMCG company. A discrete event simulation model was built using data provided by a maintenance data acquisition system. Two types of failures were identified in the system and were approximated using continuous statistical distributions. The model was validated taking into consideration line performance measures. A brief Pareto analysis of line failures was conducted to identify potential areas of improvement. Two improvement scenarios were proposed and tested via simulation. The outcomes of the simulations were the basis of a financial analysis. NPV and ROI values were calculated taking into account depreciation, profits, losses, the current CIT rate and inflation. A validated simulation model can be a useful tool in the maintenance decision-making process.
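
    The financial step can be illustrated with a short net-present-value calculation; the cash flows, discount rate, CIT rate, inflation, and depreciation schedule below are placeholder figures, not the study's data.

    ```python
    # Compare an improvement scenario against doing nothing via NPV and ROI.
    investment = 120_000.0
    annual_gain = 45_000.0          # extra profit from higher line performance
    depreciation = investment / 5   # straight-line over 5 years
    cit_rate, discount, inflation, years = 0.19, 0.10, 0.03, 5

    npv = -investment
    for t in range(1, years + 1):
        gain_nominal = annual_gain * (1 + inflation) ** t
        taxable = gain_nominal - depreciation
        after_tax = gain_nominal - max(taxable, 0.0) * cit_rate
        npv += after_tax / (1 + discount) ** t

    roi = (npv + investment) / investment - 1
    print(f"NPV = {npv:,.0f}   ROI = {roi:.1%}")
    ```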

  8. A voice-actuated wind tunnel model leak checking system

    NASA Technical Reports Server (NTRS)

    Larson, William E.

    1989-01-01

    A computer program has been developed that improves the efficiency of wind tunnel model leak checking. The program uses a voice recognition unit to relay a technician's commands to the computer. The computer, after receiving a command, can respond to the technician via a voice response unit. Information about the model pressure orifice being checked is displayed on a gas-plasma terminal. On command, the program records up to 30 seconds of pressure data. After the recording is complete, the raw data and a straight line fit of the data are plotted on the terminal. This allows the technician to make a decision on the integrity of the orifice being checked. All results of the leak check program are stored in a database file that can be listed on the line printer for record keeping purposes or displayed on the terminal to help the technician find unchecked orifices. This program allows one technician to check a model for leaks instead of the two or three previously required.
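
    The decision step, judging orifice integrity from about 30 seconds of recorded pressure, amounts to a straight-line fit; a slope clearly different from zero suggests a leak. The sketch below uses simulated pressure samples and an arbitrary slope threshold.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    t = np.linspace(0.0, 30.0, 300)                                # seconds
    pressure = 14.7 - 0.002 * t + rng.normal(0, 0.005, t.size)     # psi, slow leak

    slope, intercept = np.polyfit(t, pressure, 1)                  # straight-line fit
    print(f"fitted slope: {slope:+.4f} psi/s")
    print("orifice leaks" if abs(slope) > 0.001 else "orifice OK") # threshold is arbitrary
    ```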

  9. Clinical Track Program Expansion Increases Rotation Capacity for Experiential Program.

    PubMed

    Tofade, Toyin S; Brueckl, Mark; Ross, Patricia A

    2017-10-01

    Objective. To evaluate the rotation capacity at the University of Maryland School of Pharmacy and to determine whether the implementation of clinical track programs across the state correlates with an increase in rotation capacity for the school. Methods. The following information was collected: the number of preceptors over the years in the school's experiential learning program, the number of clinical track programs from 2012 to 2015, rotation type, availability submissions per rotation type per year, and availability submissions per hospital participant in the clinical track program per year. The rotation capacity and rotation types from the 2012 to 2015 academic years were assessed and compared to determine whether the clinical track programs had any impact. Results. There was no statistically significant difference in the frequency distribution of rotation types among all sites from the 2012 through 2015 academic years. However, there was a statistically significant difference in the total number/capacity of rotations from the 2012 to 2015 academic years. There were also statistically significant differences in rotation capacity at all sites except for three. Conclusion. Adding clinical track programs can help increase the capacity of a school's clinical rotations.

  10. The Student-to-Student Chemistry Initiative: Training High School Students To Perform Chemistry Demonstration Programs for Elementary School Students

    NASA Astrophysics Data System (ADS)

    Voegel, Phillip D.; Quashnock, Kathryn A.; Heil, Katrina M.

    2004-05-01

    The Student-to-Student Chemistry Initiative is an outreach program started in the fall of 2001 at Midwestern State University (MSU). The on-campus program trains high school science students to perform a series of chemistry demonstrations and subsequently provides kits containing the necessary supplies and reagents for the high school students to perform demonstration programs at elementary schools. The program focuses on improving student perception of science. The program's impact on high school student perception is evaluated through statistical analysis of paired preparticipation and postparticipation surveys. The surveys focus on four areas of student perception: general attitude toward science, interest in careers in science, science awareness, and interest in attending MSU for postsecondary education. Increased scores were observed in all evaluation areas, including a statistically significant increase in science awareness following participation.

  11. Explorations in Statistics: Correlation

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2010-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This sixth installment of "Explorations in Statistics" explores correlation, a familiar technique that estimates the magnitude of a straight-line relationship between two variables. Correlation is meaningful only when the…
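
    As a small illustration of the quantity the installment discusses, the Pearson r below estimates the strength of a straight-line relationship between two made-up variables.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    x = rng.normal(size=50)
    y = 2.0 * x + rng.normal(scale=1.0, size=50)   # noisy linear relation

    r, p = stats.pearsonr(x, y)
    print(f"r = {r:.2f}, p = {p:.3g}")
    ```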

  12. Effect of open rhinoplasty on the smile line.

    PubMed

    Tabrizi, Reza; Mirmohamadsadeghi, Hoori; Daneshjoo, Danadokht; Zare, Samira

    2012-05-01

    Open rhinoplasty is an esthetic surgical technique that is becoming increasingly popular, and can affect the nose and upper lip compartments. The aim of this study was to evaluate the effect of open rhinoplasty on tooth show and the smile line. The study participants were 61 patients with a mean age of 24.3 years (range, 17.2 to 39.6 years). The surgical procedure consisted of an esthetic open rhinoplasty without alar resection. Analysis of tooth show was limited to pre- and postoperative (at 12 months) tooth show measurements at rest and the maximum smile with a ruler (when participants held their heads naturally). Statistical analyses were performed with SPSS 13.0, and paired-sample t tests were used to compare tooth show means before and after the operation. Analysis of the rest position showed no statistically significant change in tooth show (P = .15), but analysis of participants' maximum smile data showed a statistically significant increase in tooth show after surgery (P < .05). In contrast, Pearson correlation analysis showed a positive relation between rhinoplasty and tooth show increases in maximum smile, especially in subjects with high smile lines. This study shows that the nasolabial compartment is a single unit and any change in 1 part may influence the other parts. Further studies should be conducted to investigate these interactions. Copyright © 2012 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  13. Statistical procedures for determination and verification of minimum reporting levels for drinking water methods.

    PubMed

    Winslow, Stephen D; Pepich, Barry V; Martin, John J; Hallberg, George R; Munch, David J; Frebis, Christopher P; Hedrick, Elizabeth J; Krop, Richard A

    2006-01-01

    The United States Environmental Protection Agency's Office of Ground Water and Drinking Water has developed a single-laboratory quantitation procedure: the lowest concentration minimum reporting level (LCMRL). The LCMRL is the lowest true concentration for which future recovery is predicted to fall, with high confidence (99%), between 50% and 150%. The procedure takes into account precision and accuracy. Multiple concentration replicates are processed through the entire analytical method and the data are plotted as measured sample concentration (y-axis) versus true concentration (x-axis). If the data support an assumption of constant variance over the concentration range, an ordinary least-squares regression line is drawn; otherwise, a variance-weighted least-squares regression is used. Prediction interval lines of 99% confidence are drawn about the regression. At the points where the prediction interval lines intersect with data quality objective lines of 50% and 150% recovery, lines are dropped to the x-axis. The higher of the two values is the LCMRL. The LCMRL procedure is flexible because the data quality objectives (50-150%) and the prediction interval confidence (99%) can be varied to suit program needs. The LCMRL determination is performed during method development only. A simpler procedure for verification of data quality objectives at a given minimum reporting level (MRL) is also presented. The verification procedure requires a single set of seven samples taken through the entire method procedure. If the calculated prediction interval is contained within data quality recovery limits (50-150%), the laboratory performance at the MRL is verified.
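
    A simplified version of the LCMRL calculation is sketched below for the constant-variance (ordinary least squares) case: fit measured versus true concentration, build a 99% prediction interval, and report the lowest spiked level whose interval lies entirely within the 50-150% recovery limits. The replicate data are synthetic, and the variance-weighted branch of the procedure is omitted.

    ```python
    import numpy as np
    from scipy import stats

    true = np.repeat([0.5, 1.0, 2.0, 5.0, 10.0], 4)            # spiked concentrations
    rng = np.random.default_rng(11)
    measured = true * rng.normal(1.0, 0.08, true.size) + rng.normal(0, 0.05, true.size)

    n = true.size
    slope, intercept, *_ = stats.linregress(true, measured)
    resid = measured - (slope * true + intercept)
    s = np.sqrt(np.sum(resid**2) / (n - 2))
    t99 = stats.t.ppf(0.995, n - 2)
    xbar, sxx = true.mean(), np.sum((true - true.mean())**2)

    lcmrl = None
    for x0 in sorted(set(true)):
        pred = slope * x0 + intercept
        half = t99 * s * np.sqrt(1 + 1/n + (x0 - xbar)**2 / sxx)   # 99% prediction interval
        if 0.5 * x0 <= pred - half and pred + half <= 1.5 * x0:
            lcmrl = x0
            break
    print("approximate LCMRL:", lcmrl)
    ```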

  14. Understanding medical group financial and operational performance: the synergistic effect of linking statistical process control and profit and loss.

    PubMed

    Smolko, J R; Greisler, D S

    2001-01-01

    There is ongoing pressure for medical groups owned by not-for-profit health care systems or for-profit entrepreneurs to generate profit. The fading promise of superior strategy through health care integration has boards of directors clamoring for bottom-line performance. While prudent, sole focus on the bottom line through the lens of the profit-and-loss (P&L) statement provides incomplete information upon which to base executive decisions. The purpose of this paper is to suggest that placing statistical process control (SPC) charts in tandem with the P&L statement provides a more complete picture of medical group performance thereby optimizing decision making as executives deal with the whitewater issues surrounding physician practice ownership.
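
    A minimal example of the SPC side of the argument: monthly net income placed on an individuals (XmR) control chart, so that only points outside the computed limits are flagged as signals rather than every month-to-month swing in the P&L. The monthly figures are invented.

    ```python
    import numpy as np

    net_income = np.array([82, 75, 91, 88, 70, 95, 84, 79, 90, 73, 86, 81], float)

    center = net_income.mean()
    mr_bar = np.abs(np.diff(net_income)).mean()    # average moving range
    ucl = center + 2.66 * mr_bar                   # standard XmR chart constant
    lcl = center - 2.66 * mr_bar

    for month, value in enumerate(net_income, start=1):
        flag = "  <-- investigate" if not (lcl <= value <= ucl) else ""
        print(f"month {month:2d}: {value:6.1f}{flag}")
    print(f"center {center:.1f}, limits [{lcl:.1f}, {ucl:.1f}]")
    ```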

  15. THE YOUNG SOLAR ANALOGS PROJECT. I. SPECTROSCOPIC AND PHOTOMETRIC METHODS AND MULTI-YEAR TIMESCALE SPECTROSCOPIC RESULTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gray, R. O.; Briley, M. M.; Lambert, R. A.

    2015-12-15

    This is the first in a series of papers presenting methods and results from the Young Solar Analogs Project, which began in 2007. This project monitors both spectroscopically and photometrically a set of 31 young (300–1500 Myr) solar-type stars with the goal of gaining insight into the space environment of the Earth during the period when life first appeared. From our spectroscopic observations we derive the Mount Wilson S chromospheric activity index (S_MW), and describe the method we use to transform our instrumental indices to S_MW without the need for a color term. We introduce three photospheric indices based on strong absorption features in the blue-violet spectrum—the G-band, the Ca I resonance line, and the Hydrogen-γ line—with the expectation that these indices might prove to be useful in detecting variations in the surface temperatures of active solar-type stars. We also describe our photometric program, and in particular our “Superstar technique” for differential photometry which, instead of relying on a handful of comparison stars, uses the photon flux in the entire star field in the CCD image to derive the program star magnitude. This enables photometric errors on the order of 0.005–0.007 magnitude. We present time series plots of our spectroscopic data for all four indices, and carry out extensive statistical tests on those time series demonstrating the reality of variations on timescales of years in all four indices. We also statistically test for and discover correlations and anti-correlations between the four indices. We discuss the physical basis of those correlations. As it turns out, the “photospheric” indices appear to be most strongly affected by emission in the Paschen continuum. We thus anticipate that these indices may prove to be useful proxies for monitoring emission in the ultraviolet Balmer continuum. Future papers in this series will discuss variability of the program stars on medium (days–months) and short (minutes to hours) timescales.

  16. Measurement of attachment-line location in a wind-tunnel and in supersonic flight

    NASA Technical Reports Server (NTRS)

    Agarwal, Naval K.; Miley, Stan J.; Fisher, Michael C.; Anderson, Bianca T.; Geenen, Robert J.

    1992-01-01

    For the supersonic laminar flow control research program, tests are being conducted to measure the attachment-line flow characteristics and its location on a highly swept aircraft wing. Subsonic wind tunnel experiments were conducted on 2D models to develop sensors and techniques for the flight application. Representative attachment-line data are discussed and results from the wind tunnel investigation are presented.

  17. The Status of Child Nutrition Programs in Colorado.

    ERIC Educational Resources Information Center

    McMillan, Daniel C.; Vigil, Herminia J.

    This report provides descriptive and statistical data on the status of child nutrition programs in Colorado. The report contains descriptions of the National School Lunch Program, school breakfast programs, the Special Milk Program, the Summer Food Service Program, the Nutrition Education and Training Program, state dietary guidelines, Colorado…

  18. STANDBY TOP AND BOTTOM ROTARY MILLING CUTTERS FOR TORIN LINE. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    STANDBY TOP AND BOTTOM ROTARY MILLING CUTTERS FOR TORIN LINE. SOME PRODUCT FROM THE #43 HOT ROLL IS PROCESSED ON THE TORIN LINE TO REMOVE OXIDIZED SURFACE MATERIAL. IN PRACTICE 15-20/1000 IS CUT FROM THE UPPER AND LOWER SURFACES OF THE STRIP AND RECYCLED TO THE CASTING SHOP. TORIN LINE ADDED AS PART OF 1981 EXPANSION PROGRAM. - American Brass Foundry, 70 Sayre Street, Buffalo, Erie County, NY

  19. Statistical Supplement to the Annual Report of the Coordinating Board, Texas College and University System for the Fiscal Year 1980.

    ERIC Educational Resources Information Center

    Texas Coll. and Univ. System, Austin. Coordinating Board.

    Comprehensive statistical data on Texas higher education is presented. Data and formulas relating to student enrollments and faculty headcounts, program development and productivity, faculty salaries and teaching loads, campus development, funding, and the state student load program are included. Student headcount enrollment data are presented by…

  20. Digest of Adult Education Statistics--1998.

    ERIC Educational Resources Information Center

    Elliott, Barbara G.

    Information on literacy programs for adults in the United States was compiled from the annual statistical performance reports states submit to the U.S. Department of Education at the end of each program year (PY). Nearly 27 percent of adults had not completed a high school diploma or equivalent. In PY 1991, the nation's adult education (AE)…

  1. ACHP GUIDANCE ON PROGRAM COMMENTS AS A PROGRAM ALTERNATIVE

    Science.gov Websites

    usual case-by-case basis. Format of this Guidance: the following table identifies the steps in the adequate monitoring of issued program comments. Following the table, the ACHP has developed a series of ... the following subject line: "RE: ACHP Program Comment Guidance."

  2. Enhancing patient understanding of medical procedures: evaluation of an interactive multimedia program with in-line exercises.

    PubMed

    Tait, Alan R; Voepel-Lewis, Terri; Chetcuti, Stanley J; Brennan-Martinez, Colleen; Levine, Robert

    2014-05-01

    Standard print and verbal information provided to patients undergoing treatments is often difficult to understand and may impair their ability to be truly informed. This study examined the effect of an interactive multimedia informational program with in-line exercises and corrected feedback on patients' real-time understanding of their cardiac catheterization procedure. 151 adult patients scheduled for diagnostic cardiac catheterization were randomized to receive information about their procedure using either the standard institutional verbal and written information (SI) or an interactive iPad-based informational program (IPI). Subject understanding was evaluated using semi-structured interviews at baseline, immediately following catheterization, and 2 weeks after the procedure. In addition, for those randomized to the IPI, the ability to respond correctly to several in-line exercises was recorded. Subjects' perceptions of, and preferences for, the information delivery were also elicited. Subjects randomized to the IPI program had significantly better understanding following the intervention compared with those randomized to the SI group (8.3±2.4 vs 7.4±2.5, respectively, on a 0-12 scale where 12=complete understanding, P<0.05). First-time correct responses to the in-line exercises ranged from 24.3% to 100%. Subjects reported that the in-line exercises were very helpful (9.1±1.7 on a 0-10 scale, where 10=extremely helpful) and the iPad program very easy to use (9.0±1.6 on a 0-10 scale, where 10=extremely easy), suggesting good clinical utility. Results demonstrated the ability of an interactive multimedia program to enhance patients' understanding of their medical procedure. Importantly, the incorporation of in-line exercises permitted identification of knowledge deficits, provided corrected feedback, and confirmed the patients' understanding of treatment information in real time when consent was sought. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  3. CIRCAL-2 - General-purpose on-line circuit design.

    NASA Technical Reports Server (NTRS)

    Dertouzos, M. L.; Jessel, G. P.; Stinger, J. R.

    1972-01-01

    CIRCAL-2 is a second-generation general-purpose on-line circuit-design program with the following main features: (1) multiple-analysis capability; (2) uniform and general data structures for handling text editing, network representations, and output results, regardless of analysis; (3) special techniques and structures for minimizing and controlling user-program interaction; (4) use of functionals for the description of hysteresis and heat effects; and (5) ability to define optimization procedures that 'replace' the user. The paper discusses the organization of CIRCAL-2, the aforementioned main features, and their consequences, such as a set of network elements and models general enough for most analyses and a set of functions tailored to circuit-design requirements. The presentation is descriptive, concentrating on conceptual rather than on program implementation details.

  4. Mithras Studies of the Boundary Between Open and Closed Field Lines.

    DTIC Science & Technology

    1994-01-31

    Final Report, March 1995: MITHRAS Studies of the Boundary Between Open and Closed Field Lines. John D. Kelly, Program Manager; Richard A. Doe, Research Physicist, Geoscience and Engineering Center; SRI Project 3245. Prepared for the Department of the Air Force. ... characteristic energy, energy flux, and an estimate for upward field-aligned current. On the basis of coordinated radar/optical experiments, Vallance Jones et al ...

  5. A system overview of the Aerospace Safety Research and Data Institute data management programs

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The NASA Aerospace Safety Information System (NASIS) is an interactive, generalized data base management system. The on-line retrieval aspects provide for operating from a variety of terminals (or in batch mode). NASIS retrieval enables the user to expand and display (review) the terms of index (cross-reference) files, select desired index terms, combine sets of documents corresponding to selected terms, and display the resulting records. It also allows the user to print (record) this information on a high-speed printer if desired. NASIS also provides the ability to store the strategy of any given session the user has executed. It has searching and publication abilities through generalized linear search and report generating modules, which may be performed interactively or in batch mode. The user may specify formats for the terminal from which he is operating. The system features an interactive user's guide which explains the various commands available and how to use them, as well as explanations for all system messages. This explain capability may be extended, without program changes, to include descriptions of the various files in use. Coupled with the ability of NASIS to run in an MTT (multi-terminal task) mode is its automatic accumulation of statistics on each user of the system as well as each file.

  6. Anatomical evaluation and stress distribution of intact canine femur.

    PubMed

    Verim, Ozgur; Tasgetiren, Suleyman; Er, Mehmet S; Ozdemir, Vural; Yuran, Ahmet F

    2013-03-01

    In the biomedical field, three-dimensional (3D) modeling and analysis of bones and tissues has steadily gained in importance. The aim of this study was to produce more accurate 3D models of the canine femur derived from computed tomography (CT) data by using several modeling software programs and two different methods. The accuracy of the analysis depends on the modeling process and the right boundary conditions. The Solidworks, Rapidform, Inventor, and 3DsMax software programs were used to create 3D models. Data derived from CT were converted into 3D models using two different methods: in the first, 3D models were generated using boundary lines, while in the second, 3D models were generated using point clouds. Stress analyses of the models were performed with ANSYS v12, also considering the muscle forces acting on the canine femur. When stress values and statistical values were taken into consideration, more accurate models were obtained with the point cloud method. It was found that the maximum von Mises stress on the canine femur shaft was 34.8 MPa. Stress and accuracy values were obtained from the model formed using the Rapidform software. The values obtained were similar to those in other studies in the literature. Copyright © 2012 John Wiley & Sons, Ltd.

  7. Teaching Statistics in APA-Accredited Doctoral Programs in Clinical and Counseling Psychology: A Syllabi Review

    ERIC Educational Resources Information Center

    Ord, Anna S.; Ripley, Jennifer S.; Hook, Joshua; Erspamer, Tiffany

    2016-01-01

    Although statistical methods and research design are crucial areas of competency for psychologists, few studies explore how statistics are taught across doctoral programs in psychology in the United States. The present study examined 153 American Psychological Association-accredited doctoral programs in clinical and counseling psychology and aimed…

  8. SPA- STATISTICAL PACKAGE FOR TIME AND FREQUENCY DOMAIN ANALYSIS

    NASA Technical Reports Server (NTRS)

    Brownlow, J. D.

    1994-01-01

    The need for statistical analysis often arises when data is in the form of a time series. This type of data is usually a collection of numerical observations made at specified time intervals. Two kinds of analysis may be performed on the data. First, the time series may be treated as a set of independent observations using a time domain analysis to derive the usual statistical properties including the mean, variance, and distribution form. Secondly, the order and time intervals of the observations may be used in a frequency domain analysis to examine the time series for periodicities. In almost all practical applications, the collected data is actually a mixture of the desired signal and a noise signal which is collected over a finite time period with a finite precision. Therefore, any statistical calculations and analyses are actually estimates. The Spectrum Analysis (SPA) program was developed to perform a wide range of statistical estimation functions. SPA can provide the data analyst with a rigorous tool for performing time and frequency domain studies. In a time domain statistical analysis the SPA program will compute the mean, variance, standard deviation, mean square, and root mean square. It also lists the data maximum, data minimum, and the number of observations included in the sample. In addition, a histogram of the time domain data is generated, a normal curve is fit to the histogram, and a goodness-of-fit test is performed. These time domain calculations may be performed on both raw and filtered data. For a frequency domain statistical analysis the SPA program computes the power spectrum, cross spectrum, coherence, phase angle, amplitude ratio, and transfer function. The estimates of the frequency domain parameters may be smoothed with the use of Hann-Tukey, Hamming, Bartlett, or moving average windows. Various digital filters are available to isolate data frequency components. Frequency components with periods longer than the data collection interval are removed by least-squares detrending. As many as ten channels of data may be analyzed at one time. Both tabular and plotted output may be generated by the SPA program. This program is written in FORTRAN IV and has been implemented on a CDC 6000 series computer with a central memory requirement of approximately 142K (octal) of 60 bit words. This core requirement can be reduced by segmentation of the program. The SPA program was developed in 1978.
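
    A small Python analogue of the package's two modes is sketched below: time-domain summary statistics, least-squares detrending, and a windowed power-spectrum estimate. The test signal (a 5 Hz tone plus drift and noise) is synthetic, and the Welch/Hamming choices simply stand in for the windows SPA offers.

    ```python
    import numpy as np
    from scipy import signal

    fs = 200.0
    t = np.arange(0, 10, 1 / fs)
    x = (np.sin(2 * np.pi * 5 * t) + 0.05 * t
         + np.random.default_rng(0).normal(0, 0.3, t.size))

    # time-domain statistics
    print(f"mean={x.mean():.3f} var={x.var():.3f} rms={np.sqrt((x**2).mean()):.3f}")

    # frequency-domain statistics after least-squares detrending
    x_detrended = signal.detrend(x)
    freqs, psd = signal.welch(x_detrended, fs=fs, window="hamming", nperseg=512)
    print("dominant frequency: %.2f Hz" % freqs[np.argmax(psd)])
    ```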

  9. Immunohistochemical evaluation of inducible nitric oxide synthase in the epithelial lining of odontogenic cysts: A qualitative and quantitative analysis

    PubMed Central

    Akshatha, B K; Karuppiah, Karpagaselvi; Manjunath, G S; Kumarswamy, Jayalakshmi; Papaiah, Lokesh; Rao, Jyothi

    2017-01-01

    Introduction: The three common odontogenic cysts are radicular cysts (RCs), dentigerous cysts (DCs), and odontogenic keratocysts (OKCs). Among these three cysts, the OKC has recently been classified as a benign keratocystic odontogenic tumor owing to its aggressive behavior, recurrence rate, and malignant potential. The present study involved qualitative and quantitative analysis of inducible nitric oxide synthase (iNOS) expression in the epithelial lining of RCs, DCs, and OKCs, compared iNOS expression in the epithelial linings of all three cysts, and determined whether overexpression of iNOS in OKCs might contribute to their aggressive behavior and malignant potential. Aims: The aim of the present study is to investigate the role of iNOS in the pathogenesis of OKCs, DCs, and RCs by evaluating iNOS expression in the epithelial lining of these cysts. Subjects and Methods: iNOS expression in the epithelial lining cells of 20 RCs, 20 DCs, and 20 OKCs was analyzed using immunohistochemistry. Statistical Analysis Used: The percentage of positive cells and the intensity of staining were assessed and compared among all three cysts using the contingency coefficient. Kappa statistics for the two observers were computed to assess interobserver agreement. Results: The percentage of iNOS-positive cells was found to be remarkably high in OKCs (12/20; 57.1%) as compared to RCs (6/20; 28.6%) and DCs (3/20; 14.3%). The interobserver agreement (kappa) for the percentage of iNOS-positive cells was statistically significant for OKCs (P > 0.000) and RCs (P > 0.001), with no significant agreement for DCs. No statistical difference exists among the three study samples with regard to the intensity of staining with iNOS. Conclusions: Increased iNOS expression in OKCs may contribute to bone resorption and accumulation of wild-type p53, hence making OKCs more aggressive. PMID:29391711

  10. Assessment of Cell Line Models of Primary Human Cells by Raman Spectral Phenotyping

    PubMed Central

    Swain, Robin J.; Kemp, Sarah J.; Goldstraw, Peter; Tetley, Teresa D.; Stevens, Molly M.

    2010-01-01

    Abstract Researchers have previously questioned the suitability of cell lines as models for primary cells. In this study, we used Raman microspectroscopy to characterize live A549 cells from a unique molecular biochemical perspective to shed light on their suitability as a model for primary human pulmonary alveolar type II (ATII) cells. We also investigated a recently developed transduced type I (TT1) cell line as a model for alveolar type I (ATI) cells. Single-cell Raman spectra provide unique biomolecular fingerprints that can be used to characterize cellular phenotypes. A multivariate statistical analysis of Raman spectra indicated that the spectra of A549 and TT1 cells are characterized by significantly lower phospholipid content compared to ATII and ATI spectra because their cytoplasm contains fewer surfactant lamellar bodies. Furthermore, we found that A549 spectra are statistically more similar to ATI spectra than to ATII spectra. The spectral variation permitted phenotypic classification of cells based on Raman spectral signatures with >99% accuracy. These results suggest that A549 cells are not a good model for ATII cells, but TT1 cells do provide a reasonable model for ATI cells. The findings have far-reaching implications for the assessment of cell lines as suitable primary cellular models in live cultures. PMID:20409492

  11. 34 CFR Appendix A to Subpart N of... - Sample Default Prevention Plan

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... relevant default prevention statistics, including a statistical analysis of the borrowers who default on...'s delinquency status by obtaining reports from data managers and FFEL Program lenders. 5. Enhance... academic study. III. Statistics for Measuring Progress 1. The number of students enrolled at your...

  12. Solid State Lighting Program (Falcon)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meeks, Steven

    2012-06-30

    Over the past two years, KLA-Tencor and partners successfully developed and deployed software and hardware tools that increase product yield for High Brightness LED (HBLED) manufacturing and reduce product development and factory ramp times. This report summarizes our development effort and details how the results of the Solid State Lighting Program (Falcon) have started to help HBLED manufacturers optimize process control by enabling them to flag and correct identified killer defect conditions at any point of origin in the process manufacturing flow. This constitutes a quantum leap in yield management over current practice. Current practice consists of die dispositioning, which is just rejection of bad die at the end of the process based upon probe tests, loosely assisted by optical in-line monitoring for gross process deficiencies. For the first time, and as a result of our Solid State Lighting Program, our LED manufacturing partners have obtained the software and hardware tools that optimize individual process steps to control killer defects at the point in the process where they originate. Products developed during our two-year program enable optimized inspection strategies for many product lines to minimize cost and maximize yield. The Solid State Lighting Program was structured in three phases: i) the development of advanced imaging modes that achieve clear separation between LED defect types, improve signal to noise and scan rates, and minimize nuisance defects for both front-end and back-end inspection tools, ii) the creation of defect source analysis (DSA) software that connects the defect maps from back-end and front-end HBLED manufacturing tools to permit the automatic overlay and traceability of defects between tools and process steps, suppresses nuisance defects, and identifies the origin of killer defects with process step and conditions, and iii) working with partners (Philips Lumileds) on product wafers to obtain a detailed statistical correlation of automated defect and DSA map overlay to failed die identified using end-product probe test results. Results from our two-year effort have led to "automated end-to-end defect detection" with full defect traceability and the ability to unambiguously correlate device killer defects to optically detected features and their point of origin within the process. Success of the program can be measured by yield improvements at our partners' facilities and new product orders.

  13. HR 7578 - A K dwarf double-lined spectroscopic binary with peculiar abundances

    NASA Technical Reports Server (NTRS)

    Fekel, F. C., Jr.; Beavers, W. I.

    1983-01-01

    The number of double-lined K and M dwarf binaries that are currently known is quite small, only a dozen or fewer of each type. The HR 7578 system was classified as dK5 on the Mount Wilson system and as K2 V on the MK system. A summary of radial-velocity measurements, including the observatory and weight of each observation, is given in a table. The star with the stronger lines has been called component A. The final orbital element solution, with all observations appropriately weighted, was computed with a differential corrections computer program described by Barker et al. (1967). The program had been modified for the double-lined case. Of particular interest are the very large eccentricity of the system and the large minimum masses for each component. These large minimum masses suggest that eclipses may be detectable despite the relatively long period and small radii of the stars.

  14. Effect of Flexible Duty Hour Policies on Length of Stay for Complex Intra-Abdominal Operations: A Flexibility in Duty Hour Requirements for Surgical Trainees (FIRST) Trial Analysis.

    PubMed

    Stulberg, Jonah J; Pavey, Emily S; Cohen, Mark E; Ko, Clifford Y; Hoyt, David B; Bilimoria, Karl Y

    2017-02-01

    Changes to resident duty hour policies in the Flexibility in Duty Hour Requirements for Surgical Trainees (FIRST) trial could impact hospitalized patients' length of stay (LOS) by altering care coordination. Length of stay can also serve as a reflection of all complications, particularly those not captured in the FIRST trial (eg pneumothorax from central line). Programs were randomized to either maintaining current ACGME duty hour policies (Standard arm) or more flexible policies waiving rules on maximum shift lengths and time off between shifts (Flexible arm). Our objective was to determine whether flexibility in resident duty hours affected LOS in patients undergoing high-risk surgical operations. Patients were identified who underwent hepatectomy, pancreatectomy, laparoscopic colectomy, open colectomy, or ventral hernia repair (2014-2015 academic year) at 154 hospitals participating in the FIRST trial. Two procedure-stratified evaluations of LOS were undertaken: multivariable negative binomial regression analysis on LOS and a multivariable logistic regression analysis on the likelihood of a prolonged LOS (>75th percentile). Before any adjustments, there was no statistically significant difference in overall mean LOS between study arms (Flexible Policy: mean [SD] LOS 6.03 [5.78] days vs Standard Policy: mean LOS 6.21 [5.82] days; p = 0.74). In adjusted analyses, there was no statistically significant difference in LOS between study arms overall (incidence rate ratio for Flexible vs Standard: 0.982; 95% CI, 0.939-1.026; p = 0.41) or for any individual procedures. In addition, there was no statistically significant difference in the proportion of patients with prolonged LOS between study arms overall (Flexible vs Standard: odds ratio = 1.028; 95% CI, 0.871-1.212) or for any individual procedures. Duty hour flexibility had no statistically significant effect on LOS in patients undergoing complex intra-abdominal operations. Copyright © 2016 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
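
    The LOS model described above is, in essence, a count regression on a study-arm indicator; a hedged sketch with simulated data is shown below. The dispersion value and effect size are invented, and the real analysis was procedure-stratified and adjusted, which is omitted here.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 500
    flexible = rng.integers(0, 2, n)                  # 1 = Flexible Policy arm
    mu = np.exp(1.8 - 0.02 * flexible)                # essentially no arm effect
    los = rng.negative_binomial(3, 3 / (3 + mu))      # overdispersed LOS counts

    X = sm.add_constant(flexible)
    fit = sm.GLM(los, X, family=sm.families.NegativeBinomial(alpha=1 / 3)).fit()
    print(fit.summary().tables[1])                    # exp(coef) = incidence rate ratio
    ```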

  15. Moving line model and avalanche statistics of Bingham fluid flow in porous media.

    PubMed

    Chevalier, Thibaud; Talon, Laurent

    2015-07-01

    In this article, we propose a simple model to understand the critical behavior of path opening during flow of a yield stress fluid in porous media, as numerically observed by Chevalier and Talon (2015). This model can be mapped to the problem of a contact line moving in a heterogeneous field. Close to the critical point, this line exhibits avalanche dynamics in which the front advances through a succession of waiting times and large burst events. These burst events are then related to the non-flowing (i.e., unyielded) areas. Remarkably, the statistics of these areas reproduce the same properties as in the direct numerical simulations. Furthermore, even if our exponents seem to be close to the mean-field universal exponents, we report an unusual bump in the distribution which depends on the disorder. Finally, we identify a scaling invariance of the cluster spatial shape that is well fit, to first order, by a self-affine parabola.

  16. Adaptive variable-length coding for efficient compression of spacecraft television data.

    NASA Technical Reports Server (NTRS)

    Rice, R. F.; Plaunt, J. R.

    1971-01-01

    An adaptive variable length coding system is presented. Although developed primarily for the proposed Grand Tour missions, many features of this system clearly indicate a much wider applicability. Using sample to sample prediction, the coding system produces output rates within 0.25 bit/picture element (pixel) of the one-dimensional difference entropy for entropy values ranging from 0 to 8 bit/pixel. This is accomplished without the necessity of storing any code words. Performance improvements of 0.5 bit/pixel can be simply achieved by utilizing previous line correlation. A Basic Compressor, using concatenated codes, adapts to rapid changes in source statistics by automatically selecting one of three codes to use for each block of 21 pixels. The system adapts to less frequent, but more dramatic, changes in source statistics by adjusting the mode in which the Basic Compressor operates on a line-to-line basis. Furthermore, the compression system is independent of the quantization requirements of the pulse-code modulation system.
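
    A toy version of the per-block adaptation is given below: for each block of 21 mapped pixel differences, the cost under three candidate variable-length codes (Rice/Golomb codes with k = 0, 1, 2 are used here as stand-ins) is estimated and the cheapest code is selected, at the price of a few bits to signal the choice. The specific codes and data are illustrative, not the flight implementation.

    ```python
    import numpy as np

    def rice_bits(value, k):
        # unary quotient + k-bit remainder for a non-negative mapped value
        return (value >> k) + 1 + k

    def map_signed(d):
        # fold signed differences to non-negative integers: 0,-1,1,-2,2 -> 0,1,2,3,4
        return 2 * abs(d) - (d < 0)

    rng = np.random.default_rng(5)
    diffs = rng.laplace(0, 1.5, size=210).round().astype(int)   # sample-to-sample differences

    total_bits = 0
    for block in diffs.reshape(-1, 21):
        mapped = [map_signed(int(d)) for d in block]
        costs = {k: sum(rice_bits(v, k) for v in mapped) for k in (0, 1, 2)}
        total_bits += min(costs.values()) + 2        # +2 bits to signal the chosen code
    print(f"{total_bits / diffs.size:.2f} bits/pixel")
    ```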

  17. New insights from a statistical analysis of IUE spectra of dwarf novae and nova-like stars. I - Inclination effects in lines and continua

    NASA Technical Reports Server (NTRS)

    La Dous, Constanze

    1991-01-01

    IUE observations of dwarf novae at maximum and at quiescence and of nova-like objects at the high brightness state are analyzed for effects of the inclination angle on the emitted continuum and line radiation. A clear pattern in the continuum flux distribution is exhibited only by dwarf novae at maximum, where some 80 percent of the non-double-eclipsing systems show essentially identical distributions. This result is not in disagreement with theoretical expectations. All classes of objects exhibit a clear, but in each case different, dependence of the line radiation on the inclination angle.

  18. Manufacturing of Protected Lithium Electrodes for Advanced Lithium-Air, Lithium-Water & Lithium-Sulfur Batteries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Visco, Steven J

    The global demand for rechargeable batteries is large and growing rapidly. Assuming the adoption of electric vehicles continues to increase, the need for smaller, lighter, and less expensive batteries will become even more pressing. In this vein, PolyPlus Battery Company has developed ultra-light high performance batteries based on its proprietary protected lithium electrode (PLE) technology. The Company’s Lithium-Air and Lithium-Seawater batteries have already demonstrated world record performance (verified by third party testing), and we are developing advanced lithium-sulfur batteries which have the potential to deliver high performance at low cost. In this program PolyPlus Battery Company teamed with Corning Incorporated to transition the PLE technology from bench-top fabrication using manual tooling to a pre-commercial semi-automated pilot line. At the inception of this program PolyPlus worked with a Tier 1 battery manufacturing engineering firm to design and build the first-of-its-kind pilot line for PLE production. The pilot line was shipped and installed in Berkeley, California several months after the start of the program. PolyPlus spent the next two years working with and optimizing the pilot line and now produces all of its PLEs on this line. The optimization process successfully increased the yield, throughput, and quality of PLEs produced on the pilot line. The Corning team focused on fabrication and scale-up of the ceramic membranes that are key to the PLE technology. PolyPlus next demonstrated that it could take Corning membranes through the pilot line process to produce state-of-the-art protected lithium electrodes. In the latter part of the program the Corning team developed alternative membranes targeted for the large rechargeable battery market. PolyPlus is now in discussions with several potential customers for its advanced PLE-enabled batteries, and is building relationships and infrastructure for the transition into manufacturing. It is likely that the next step will be accomplished through a combination of joint venture partnering and licensing of the technology.

  19. Statistical characteristic in time-domain of direct current corona-generated audible noise from conductor in corona cage

    NASA Astrophysics Data System (ADS)

    Li, Xuebao; Cui, Xiang; Lu, Tiebing; Ma, Wenzuo; Bian, Xingming; Wang, Donglai; Hiziroglu, Huseyin

    2016-03-01

    Corona-generated audible noise (AN) has become one of the decisive factors in the design of high voltage direct current (HVDC) transmission lines. The AN from transmission lines can be attributed to sound pressure pulses generated by the multiple corona sources formed on the conductors. In this paper, the detailed time-domain characteristics of the sound pressure pulses generated by DC corona discharges formed over the surfaces of stranded conductors are investigated systematically in a laboratory setting using a corona cage structure. The amplitudes of the sound pressure pulses and their time intervals are extracted by observing a direct correlation between corona current pulses and corona-generated sound pressure pulses. Based on the statistical characteristics, a stochastic model is presented for simulating the sound pressure pulses due to DC corona discharges occurring on conductors. The proposed stochastic model is validated by comparing the calculated and measured A-weighted sound pressure level (SPL). The proposed model is then used to analyze the influence of the pulse amplitudes and pulse rate on the SPL. Furthermore, a mathematical relationship is found between the SPL and conductor diameter, electric field, and radial distance.
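
    As a rough illustration of the kind of pulse-train model described (not the authors' calibrated stochastic model), the sketch below superposes sound-pressure pulses with random amplitudes at Poisson-distributed arrival times and converts the waveform to an unweighted sound pressure level. The pulse shape, rate, and amplitude law are placeholders, and A-weighting is omitted.

      import numpy as np

      def simulate_pulse_train(rate_hz=2000, duration_s=0.5, fs=200_000,
                               mean_amp_pa=5e-3, tau_s=50e-6, seed=0):
          """Superpose exponentially decaying pressure pulses at Poisson arrival times."""
          rng = np.random.default_rng(seed)
          t = np.arange(int(duration_s * fs)) / fs
          p = np.zeros_like(t)
          n_pulses = rng.poisson(rate_hz * duration_s)
          arrivals = rng.uniform(0, duration_s, n_pulses)
          amps = rng.exponential(mean_amp_pa, n_pulses)       # placeholder amplitude law
          for t0, a in zip(arrivals, amps):
              mask = t >= t0
              p[mask] += a * np.exp(-(t[mask] - t0) / tau_s)  # placeholder pulse shape
          return t, p

      def spl_db(p, p_ref=20e-6):
          """Unweighted sound pressure level in dB re 20 uPa (A-weighting omitted)."""
          return 20 * np.log10(np.sqrt(np.mean(p**2)) / p_ref)

      t, p = simulate_pulse_train()
      print(f"simulated SPL ~ {spl_db(p):.1f} dB")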

  20. 49 CFR Schedule G to Subpart B of... - Selected Statistical Data

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 8 2011-10-01 2011-10-01 false Selected Statistical Data G Schedule G to Subpart... Statistical Data [Dollars in thousands] () Greyhound Lines, Inc. () Trailways combined () All study carriers... purpose of Schedule G is to develop selected property, labor and operational data for use in evaluating...

  1. Effect of a 16-week Pilates exercise program on the ego resiliency and depression in elderly women

    PubMed Central

    Roh, Su Yeon

    2016-01-01

    This study aims to examine the effect of a 16-week Pilates exercise program on ego resiliency and depression in elderly women. Before the program began, the researcher explained the purpose and intention of the research to elderly women who were willing to participate. A total of 148 elderly women agreed to participate and filled in ego resiliency and depression questionnaires. The participants then completed the 16-week Pilates exercise program and answered the same questionnaires afterwards. Collected data were analyzed with SPSS ver. 20.0 using paired t-tests, with the following results: there were statistically significant differences in all subvariables of ego resiliency, namely self-confidence (t=7.770, P<0.001), communication efficiency (t=2.690, P<0.01), optimistic trait (t=1.996, P<0.05), and anger management (t=4.525, P<0.001), after the women participated in the 16-week Pilates exercise program. There was also a statistically significant difference in depression (t=−6.506, P<0.001), which was lower than before participation in the program. Consequently, participating in the Pilates exercise program can help improve ego resiliency and alleviate depression in elderly women. PMID:27807531

  2. Effect of a 16-week Pilates exercise program on the ego resiliency and depression in elderly women.

    PubMed

    Roh, Su Yeon

    2016-10-01

    This study aims to examine the effect of a 16-week Pilates exercise program on ego resiliency and depression in elderly women. Before the program began, the researcher explained the purpose and intention of the research to elderly women who were willing to participate. A total of 148 elderly women agreed to participate and filled in ego resiliency and depression questionnaires. The participants then completed the 16-week Pilates exercise program and answered the same questionnaires afterwards. Collected data were analyzed with SPSS ver. 20.0 using paired t-tests, with the following results: there were statistically significant differences in all subvariables of ego resiliency, namely self-confidence (t=7.770, P<0.001), communication efficiency (t=2.690, P<0.01), optimistic trait (t=1.996, P<0.05), and anger management (t=4.525, P<0.001), after the women participated in the 16-week Pilates exercise program. There was also a statistically significant difference in depression (t=-6.506, P<0.001), which was lower than before participation in the program. Consequently, participating in the Pilates exercise program can help improve ego resiliency and alleviate depression in elderly women.
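
    For readers who want to reproduce this style of pre/post comparison outside SPSS, a paired t-test can be run in a few lines; the sketch below uses made-up placeholder scores, not the study's data.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      pre = rng.normal(3.2, 0.6, 148)           # hypothetical pre-program scores
      post = pre + rng.normal(0.25, 0.5, 148)   # hypothetical post-program scores

      t_stat, p_value = stats.ttest_rel(post, pre)   # paired (dependent-samples) t-test
      print(f"t = {t_stat:.3f}, P = {p_value:.4f}")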

  3. Line Intercept (LI)

    Treesearch

    John F. Caratti

    2006-01-01

    The FIREMON Line Intercept (LI) method is used to assess changes in plant species cover for a macroplot. This method uses multiple line transects to sample within plot variation and quantify statistically valid changes in plant species cover and height over time. This method is suited for most forest and rangeland communities, but is especially useful for sampling...
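
    In its simplest form, a line-intercept cover estimate is the summed intercepted length divided by the transect length, averaged over transects. The sketch below illustrates that arithmetic with made-up intercept data; it is not the FIREMON software.

      def percent_cover(intercepts, transect_length_m):
          """Percent cover: summed intercepted length divided by transect length."""
          return 100.0 * sum(end - start for start, end in intercepts) / transect_length_m

      # Hypothetical intercepts (start, end) in metres for one species along 50 m transects.
      transects = [
          [(2.0, 3.5), (10.0, 12.2), (30.4, 31.0)],
          [(5.5, 6.0), (22.1, 25.3)],
      ]
      covers = [percent_cover(t, 50.0) for t in transects]
      print(covers, "mean cover (%):", sum(covers) / len(covers))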

  4. A new statistical framework to assess structural alignment quality using information compression

    PubMed Central

    Collier, James H.; Allison, Lloyd; Lesk, Arthur M.; Garcia de la Banda, Maria; Konagurthu, Arun S.

    2014-01-01

    Motivation: Progress in protein biology depends on the reliability of results from a handful of computational techniques, structural alignments being one. Recent reviews have highlighted substantial inconsistencies and differences between alignment results generated by the ever-growing stock of structural alignment programs. The lack of consensus on how the quality of structural alignments must be assessed has been identified as the main cause for the observed differences. Current methods assess structural alignment quality by constructing a scoring function that attempts to balance conflicting criteria, mainly alignment coverage and fidelity of structures under superposition. This traditional approach to measuring alignment quality, the subject of considerable literature, has failed to solve the problem. Further development along the same lines is unlikely to rectify the current deficiencies in the field. Results: This paper proposes a new statistical framework to assess structural alignment quality and significance based on lossless information compression. This is a radical departure from the traditional approach of formulating scoring functions. It links the structural alignment problem to the general class of statistical inductive inference problems, solved using the information-theoretic criterion of minimum message length. Based on this, we developed an efficient and reliable measure of structural alignment quality, I-value. The performance of I-value is demonstrated in comparison with a number of popular scoring functions, on a large collection of competing alignments. Our analysis shows that I-value provides a rigorous and reliable quantification of structural alignment quality, addressing a major gap in the field. Availability: http://lcb.infotech.monash.edu.au/I-value Contact: arun.konagurthu@monash.edu Supplementary information: Online supplementary data are available at http://lcb.infotech.monash.edu.au/I-value/suppl.html PMID:25161241

  5. Fourier Transform Spectroscopy of Carbonyl Sulfide from 3700 to 4800 cm-1 and Selection of a Line-Pointing Program

    NASA Astrophysics Data System (ADS)

    Naïm, S.; Fayt, A.; Bredohl, H.; Blavier, J.-F.; Dubois, I.

    1998-11-01

    We have measured the Fourier transform spectrum of natural OCS from 3700 to 4800 cm-1 with near Doppler resolution and a line-position accuracy between 4 and 8 × 10-5 cm-1. For the normal isotopic species, 37 vibrational transitions have been analyzed for both frequencies and intensities. We also report 15 bands of OC34S, eight bands of O13CS, nine bands of OC33S, and two bands of 18OCS. Important effective Herman-Wallis terms are explained on the basis of eigenvectors. A comparison of different line-pointing programs is also presented.

  6. Possible 6-qubit NMR quantum computer device material; simulator of the NMR line width

    NASA Astrophysics Data System (ADS)

    Hashi, K.; Kitazawa, H.; Shimizu, T.; Goto, A.; Eguchi, S.; Ohki, S.

    2002-12-01

    For an NMR quantum computer, the splitting of an NMR spectrum must be larger than the line width. In order to find the best device material for a solid-state NMR quantum computer, we have written a simulation program that calculates the NMR line width due to the nuclear dipole field by the second-moment method. The program uses lattice information prepared by commercial crystal-structure drawing software. By applying this program, we can estimate the NMR line width due to the nuclear dipole field without measurements and identify candidate materials for a 6-qubit solid-state NMR quantum computer device.
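
    A hedged sketch of the geometric part of such a second-moment calculation is shown below: for a chosen nuclear site, the code sums (1 - 3 cos^2 theta)^2 / r^6 over neighbouring spins taken from the crystal structure. The full Van Vleck second moment multiplies this lattice sum by a spin- and isotope-dependent prefactor, written here only as a placeholder constant.

      import numpy as np

      def dipolar_lattice_sum(site, neighbours, b_field_dir=(0.0, 0.0, 1.0)):
          """Geometric lattice sum sum_k (1 - 3 cos^2 theta_k)^2 / r_k^6 (positions in metres)."""
          b = np.asarray(b_field_dir, float)
          b /= np.linalg.norm(b)
          total = 0.0
          for pos in neighbours:
              r_vec = np.asarray(pos, float) - np.asarray(site, float)
              r = np.linalg.norm(r_vec)
              cos_theta = np.dot(r_vec, b) / r
              total += (1.0 - 3.0 * cos_theta**2) ** 2 / r**6
          return total

      # PREFACTOR is a placeholder, not the full Van Vleck constant, which depends on the
      # spin quantum number, gyromagnetic ratio(s), and whether the spins are like or unlike.
      PREFACTOR = 1.0e-93
      neighbours = [(2.5e-10, 0, 0), (0, 2.5e-10, 0), (0, 0, 2.5e-10)]   # toy lattice sites
      m2 = PREFACTOR * dipolar_lattice_sum((0, 0, 0), neighbours)
      print("Gaussian FWHM estimate ~", 2.355 * np.sqrt(m2), "(angular frequency units)")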

  7. Building a base map with AutoCAD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flarity, S.J.

    1989-12-01

    The fundamental step in the exploration process is building a base map. Consequently, any serious computer exploration program should be capable of producing base maps. Data used in constructing base maps are available from commercial sources such as Tobin and Petroleum Information. These data sets include line and well data, the line data being latitude-longitude vectors and the well data being identifying text information for wells and their locations. AutoCAD is a commercial program useful in building base maps. Its features include infinite zoom and pan capability, layering, block definition, text dialog boxes, and a command language, AutoLisp. AutoLisp provides more power by allowing the geologist to modify the way the program works. Three AutoLisp routines presented here allow geologists to construct a geologic base map from raw Tobin data. The first program, WELLS.LSP, sets up the map environment for the subsequent programs, WELLADD.LSP and LINEADD.LSP. WELLADD.LSP reads the Tobin data and spots the well symbols and the identifying information. LINEADD.LSP performs the same task on line and textual information contained within the data set.

  8. Selecting the optimum plot size for a California design-based stream and wetland mapping program.

    PubMed

    Lackey, Leila G; Stein, Eric D

    2014-04-01

    Accurate estimates of the extent and distribution of wetlands and streams are the foundation of wetland monitoring, management, restoration, and regulatory programs. Traditionally, these estimates have relied on comprehensive mapping. However, this approach is prohibitively resource-intensive over large areas, making it both impractical and statistically unreliable. Probabilistic (design-based) approaches to evaluating status and trends provide a more cost-effective alternative because, compared with comprehensive mapping, overall extent is inferred from mapping a statistically representative, randomly selected subset of the target area. In this type of design, the size of sample plots has a significant impact on program costs and on statistical precision and accuracy; however, no consensus exists on the appropriate plot size for remote monitoring of stream and wetland extent. This study utilized simulated sampling to assess the performance of four plot sizes (1, 4, 9, and 16 km²) for three geographic regions of California. Simulation results showed smaller plot sizes (1 and 4 km²) were most efficient for achieving desired levels of statistical accuracy and precision. However, larger plot sizes were more likely to contain rare and spatially limited wetland subtypes. Balancing these considerations led to selection of 4 km² for the California status and trends program.
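
    The simulated-sampling idea is easy to prototype: treat a mapped "truth" raster as the population, draw random square plots of a given size, and scale the sampled density up to the whole region. The sketch below uses a synthetic raster and arbitrary plot sizes purely to show how plot size trades precision against effort; it does not reproduce the study's design.

      import numpy as np

      def estimate_extent(truth, plot_px, n_plots, rng):
          """Estimate total wetland area (in pixels) from n_plots random square plots."""
          h, w = truth.shape
          densities = []
          for _ in range(n_plots):
              r = rng.integers(0, h - plot_px)
              c = rng.integers(0, w - plot_px)
              densities.append(truth[r:r + plot_px, c:c + plot_px].mean())
          mean_d = np.mean(densities)
          se_d = np.std(densities, ddof=1) / np.sqrt(n_plots)
          return mean_d * truth.size, se_d * truth.size   # estimate and its standard error

      rng = np.random.default_rng(2)
      truth = (rng.random((1000, 1000)) < 0.02).astype(float)   # synthetic 2% wetland cover
      for plot_px in (10, 20, 30, 40):                          # stand-ins for the 1-16 km2 plots
          est, se = estimate_extent(truth, plot_px, n_plots=50, rng=rng)
          print(f"plot {plot_px} px: extent ~ {est:.0f} +/- {se:.0f} px (true {truth.sum():.0f})")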

  9. 34 CFR 230.2 - What definitions apply to the Troops-to-Teacher program?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... incomes below the poverty line means the updated data on the number of children ages 5 through 17 from families with incomes below the poverty line provided by the Department of Commerce that the Secretary uses... agency— (1) That serves not fewer than 10,000 children from families with incomes below the poverty line...

  10. LATTICE … a beam transport program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Staples, J.

    1987-06-01

    LATTICE is a computer program that calculates the first order characteristics of synchrotrons and beam transport systems. The program uses matrix algebra to calculate the propagation of the betatron (Twiss) parameters along a beam line. The program draws on ideas from several older programs, notably Transport and Synch, adds many new ones and incorporates them into an interactive, user-friendly program. LATTICE will calculate the matched functions of a synchrotron lattice and display them in a number of ways, including a high resolution Tektronix graphics display. An optimizer is included to adjust selected element parameters so the beam meets a set of constraints. LATTICE is a first order program, but the effect of sextupoles on the chromaticity of a synchrotron lattice is included, and the optimizer will set the sextupole strengths for zero chromaticity. The program will also calculate the characteristics of beam transport systems. In this mode, the beam parameters, defined at the start of the transport line, are propagated through to the end. LATTICE has two distinct modes: the lattice mode which finds the matched functions of a synchrotron, and the transport mode which propagates a predefined beam through a beam line. However, each mode can be used for either type of problem: the transport mode may be used to calculate an insertion for a synchrotron lattice, and the lattice mode may be used to calculate the characteristics of a long periodic beam transport system.
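
    The first-order bookkeeping described above amounts to propagating the Twiss parameters through each element's 2x2 transfer matrix. A minimal sketch for a drift and a thin quadrupole follows; it shows the standard beta/alpha update rule, not LATTICE's actual code.

      import numpy as np

      def drift(L):
          return np.array([[1.0, L], [0.0, 1.0]])

      def thin_quad(f):
          return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

      def propagate_twiss(beta, alpha, M):
          """Propagate (beta, alpha) through a transfer matrix M = [[C, S], [Cp, Sp]]."""
          gamma = (1.0 + alpha**2) / beta
          C, S = M[0, 0], M[0, 1]
          Cp, Sp = M[1, 0], M[1, 1]
          beta2 = C**2 * beta - 2.0 * C * S * alpha + S**2 * gamma
          alpha2 = -C * Cp * beta + (C * Sp + S * Cp) * alpha - S * Sp * gamma
          return beta2, alpha2

      # Toy beam line: 2 m drift, thin focusing quad (f = 1.5 m), 2 m drift.
      beta, alpha = 10.0, 0.0
      for M in (drift(2.0), thin_quad(1.5), drift(2.0)):
          beta, alpha = propagate_twiss(beta, alpha, M)
      print(f"beta = {beta:.3f} m, alpha = {alpha:.3f}")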

  11. Lambert W function for applications in physics

    NASA Astrophysics Data System (ADS)

    Veberič, Darko

    2012-12-01

    The Lambert W(x) function and its possible applications in physics are presented. The actual numerical implementation in C++ consists of Halley's and Fritsch's iterations with initial approximations based on branch-point expansion, asymptotic series, rational fits, and continued-logarithm recursion.
    Program summary
    Program title: LambertW
    Catalogue identifier: AENC_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AENC_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU General Public License version 3
    No. of lines in distributed program, including test data, etc.: 1335
    No. of bytes in distributed program, including test data, etc.: 25 283
    Distribution format: tar.gz
    Programming language: C++ (with suitable wrappers it can be called from C, Fortran, etc.); the supplied command-line utility is suitable for other scripting languages like sh, csh, awk, perl, etc.
    Computer: All systems with a C++ compiler.
    Operating system: All Unix flavors, Windows. It might work with others.
    RAM: Small memory footprint, less than 1 MB
    Classification: 1.1, 4.7, 11.3, 11.9.
    Nature of problem: Find a fast and accurate numerical implementation for the Lambert W function.
    Solution method: Halley's and Fritsch's iterations with initial approximations based on branch-point expansion, asymptotic series, rational fits, and continued-logarithm recursion.
    Additional comments: The distribution file contains the command-line utility lambert-w, Doxygen comments included in the source files, and a Makefile.
    Running time: The tests provided take only a few seconds to run.
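
    To make the Halley-iteration idea concrete, a minimal sketch for the principal branch at non-negative arguments is shown below. It is a toy re-derivation, not the library's C++ implementation, and the crude starting guess stands in for the branch-point and asymptotic approximations mentioned above.

      import math

      def lambert_w0(x, tol=1e-12, max_iter=50):
          """Principal branch W0(x) for x >= 0 via Halley's iteration on w*exp(w) = x."""
          if x < 0:
              raise ValueError("this sketch only handles x >= 0")
          if x == 0.0:
              return 0.0
          w = math.log1p(x)                      # crude starting guess, adequate for x >= 0
          for _ in range(max_iter):
              ew = math.exp(w)
              f = w * ew - x
              if abs(f) <= tol * (abs(x) + tol):
                  break
              # Halley step for f(w) = w*exp(w) - x
              w -= f / (ew * (w + 1.0) - (w + 2.0) * f / (2.0 * w + 2.0))
          return w

      print(lambert_w0(1.0))         # ~0.567143..., the omega constant
      print(lambert_w0(math.e))      # 1.0 up to rounding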

  12. Senior Computational Scientist | Center for Cancer Research

    Cancer.gov

    The Basic Science Program (BSP) pursues independent, multidisciplinary research in basic and applied molecular biology, immunology, retrovirology, cancer biology, and human genetics. Research efforts and support are an integral part of the Center for Cancer Research (CCR) at the Frederick National Laboratory for Cancer Research (FNLCR). The Cancer & Inflammation Program (CIP), Basic Science Program, HLA Immunogenetics Section, under the leadership of Dr. Mary Carrington, studies the influence of human leukocyte antigens (HLA) and specific KIR/HLA genotypes on risk of and outcomes to infection, cancer, autoimmune disease, and maternal-fetal disease. Recent studies have focused on the impact of HLA gene expression in disease, the molecular mechanism regulating expression levels, and the functional basis for the effect of differential expression on disease outcome. The lab’s further focus is on the genetic basis for resistance/susceptibility to disease conferred by immunogenetic variation. KEY ROLES/RESPONSIBILITIES: The Senior Computational Scientist will provide research support to the CIP-BSP-HLA Immunogenetics Section, performing bio-statistical design, analysis and reporting of research projects conducted in the lab. This individual will be involved in the implementation of statistical models and data preparation. Qualifications and responsibilities include:
    - 5 or more years of competent, innovative biostatistics/bioinformatics research experience beyond doctoral training
    - Considerable experience with statistical software, such as SAS, R and S-Plus
    - Sound knowledge and demonstrated experience of theoretical and applied statistics
    - Write program code to analyze data using statistical analysis software
    - Contribute to the interpretation and publication of research results

  13. Fully digital programmable optical frequency comb generation and application.

    PubMed

    Yan, Xianglei; Zou, Xihua; Pan, Wei; Yan, Lianshan; Azaña, José

    2018-01-15

    We propose a fully digital programmable optical frequency comb (OFC) generation scheme based on binary phase-sampling modulation, wherein an optimized bit sequence is applied to phase modulate a narrow-linewidth light wave. Programming the bit sequence enables us to tune both the comb spacing and comb-line number (i.e., number of comb lines). The programmable OFCs are also characterized by ultra-flat spectral envelope, uniform temporal envelope, and stable bias-free setup. Target OFCs are digitally programmed to have 19, 39, 61, 81, 101, or 201 comb lines and to have a 100, 50, 20, 10, 5, or 1 MHz comb spacing. As a demonstration, a scanning-free temperature sensing system using a proposed OFC with 1001 comb lines was also implemented with a sensitivity of 0.89°C/MHz.
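
    As a toy numerical illustration of why periodic binary phase modulation produces a comb (not the authors' optimized sequence design), the sketch below phase-modulates a carrier with a repeating bit pattern and looks for discrete spectral lines spaced at the sequence repetition rate; all numbers are placeholders.

      import numpy as np

      fs = 1_000_000          # sample rate (Hz); all numbers are placeholders
      bit_rate = 100_000      # bits per second
      pattern = np.array([0, 1, 1, 0, 1, 0, 0, 1])    # arbitrary repeating bit pattern
      repetitions = 250

      samples_per_bit = fs // bit_rate
      bits = np.tile(pattern, repetitions)
      phase = np.pi * np.repeat(bits, samples_per_bit)    # binary (0, pi) phase modulation
      field = np.exp(1j * phase)                          # baseband optical field

      spectrum = np.abs(np.fft.fftshift(np.fft.fft(field)))**2
      freqs = np.fft.fftshift(np.fft.fftfreq(field.size, d=1.0 / fs))

      rep_rate = bit_rate / len(pattern)                  # comb spacing: 12.5 kHz here
      peaks = freqs[spectrum > 0.01 * spectrum.max()]
      print(f"repetition rate {rep_rate:.0f} Hz; strong lines near:", np.round(peaks[:8], 1))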

  14. An efficient interior-point algorithm with new non-monotone line search filter method for nonlinear constrained programming

    NASA Astrophysics Data System (ADS)

    Wang, Liwei; Liu, Xinggao; Zhang, Zeyin

    2017-02-01

    An efficient primal-dual interior-point algorithm using a new non-monotone line search filter method is presented for nonlinear constrained programming, which is widely applied in engineering optimization. The new non-monotone line search technique is introduced to lead to relaxed step acceptance conditions and improved convergence performance. It can also avoid the choice of the upper bound on the memory, which brings obvious disadvantages to traditional techniques. Under mild assumptions, the global convergence of the new non-monotone line search filter method is analysed, and fast local convergence is ensured by second order corrections. The proposed algorithm is applied to the classical alkylation process optimization problem and the results illustrate its effectiveness. Some comprehensive comparisons to existing methods are also presented.
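
    For context, the classic memory-based non-monotone Armijo rule (Grippo-type) that such filter methods relax can be sketched as below; M is the memory bound whose choice the new method avoids, the objective is a toy placeholder, and this is not the algorithm proposed in the paper.

      import numpy as np

      def nonmonotone_armijo(f, grad, x, d, history, M=10, sigma=1e-4, beta=0.5, alpha0=1.0):
          """Backtracking step accepted against the max of the last M objective values."""
          ref = max(history[-M:])                  # non-monotone reference value
          slope = float(np.dot(grad(x), d))        # directional derivative (must be negative)
          alpha = alpha0
          while f(x + alpha * d) > ref + sigma * alpha * slope:
              alpha *= beta
          return alpha

      # Toy usage on f(x) = ||x||^2 with steepest-descent directions.
      f = lambda x: float(np.dot(x, x))
      grad = lambda x: 2.0 * x
      x = np.array([3.0, -2.0])
      history = [f(x)]
      for _ in range(20):
          d = -grad(x)
          alpha = nonmonotone_armijo(f, grad, x, d, history)
          x = x + alpha * d
          history.append(f(x))
      print(x, f(x))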

  15. Visual Performance on the Small Letter Contrast Test: Effects of Aging, Low Luminance and Refractive Error

    DTIC Science & Technology

    2000-08-01

    luminance performance and aviation, many aviators develop ametropias refractive error having comparable effects on during their careers. We were... statistically (0.04 logMAR, the non-aviator group. Separate investigators at p=0.01), but not clinically significant (<1/2 line different research facilities... statistically significant (0.11 ± 0.1 logCS, t=4.0, sensitivity on the SLCT decreased for the aviator p<0.001), yet there is significant overlap group at a

  16. Research progress of on-line automatic monitoring of chemical oxygen demand (COD) of water

    NASA Astrophysics Data System (ADS)

    Cai, Youfa; Fu, Xing; Gao, Xiaolu; Li, Lianyin

    2018-02-01

    With increasingly strict control of pollutant emissions in China, on-line automatic monitoring of water quality is particularly urgent. The chemical oxygen demand (COD) is a comprehensive index of the contamination caused by organic matter, and it is therefore taken as an important index of energy saving and emission reduction in China’s “Twelve-Five” program. So far, COD on-line automatic monitoring instruments have played an important role in the field of sewage monitoring. This paper reviews the existing methods for on-line automatic monitoring of COD and, on that basis, points out the future trends of COD on-line automatic monitoring instruments.

  17. Use of the Global Test Statistic as a Performance Measurement in a Reanalysis of Environmental Health Data

    PubMed Central

    Dymova, Natalya; Hanumara, R. Choudary; Gagnon, Ronald N.

    2009-01-01

    Performance measurement is increasingly viewed as an essential component of environmental and public health protection programs. In characterizing program performance over time, investigators often observe multiple changes resulting from a single intervention across a range of categories. Although a variety of statistical tools allow evaluation of data one variable at a time, the global test statistic is uniquely suited for analyses of categories or groups of interrelated variables. Here we demonstrate how the global test statistic can be applied to environmental and occupational health data for the purpose of making overall statements on the success of targeted intervention strategies. PMID:19696393

  18. Use of the global test statistic as a performance measurement in a reanalysis of environmental health data.

    PubMed

    Dymova, Natalya; Hanumara, R Choudary; Enander, Richard T; Gagnon, Ronald N

    2009-10-01

    Performance measurement is increasingly viewed as an essential component of environmental and public health protection programs. In characterizing program performance over time, investigators often observe multiple changes resulting from a single intervention across a range of categories. Although a variety of statistical tools allow evaluation of data one variable at a time, the global test statistic is uniquely suited for analyses of categories or groups of interrelated variables. Here we demonstrate how the global test statistic can be applied to environmental and occupational health data for the purpose of making overall statements on the success of targeted intervention strategies.

  19. Center for Prostate Disease Research

    MedlinePlus


  20. Strengthening Statistics Graduate Programs with Statistical Collaboration--The Case of Hawassa University, Ethiopia

    ERIC Educational Resources Information Center

    Goshu, Ayele Taye

    2016-01-01

    This paper describes the experiences gained from the statistical collaboration center established at Hawassa University in May 2015 as part of the LISA 2020 [Laboratory for Interdisciplinary Statistical Analysis] network. The center has a setup similar to that of LISA at Virginia Tech. Statisticians are trained on how to become more effective scientific…

  1. Technological Tools in the Introductory Statistics Classroom: Effects on Student Understanding of Inferential Statistics

    ERIC Educational Resources Information Center

    Meletiou-Mavrotheris, Maria

    2004-01-01

    While technology has become an integral part of introductory statistics courses, the programs typically employed are professional packages designed primarily for data analysis rather than for learning. Findings from several studies suggest that use of such software in the introductory statistics classroom may not be very effective in helping…

  2. Coronal emission-line polarization from the statistical equilibrium of magnetic sublevels. II. Fe XIV 5303 A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    House, L.L.; Querfeld, C.W.; Rees, D.E.

    1982-04-15

    Coronal magnetic fields influence the intensity and linear polarization of light scattered by coronal Fe XIV ions. To interpret polarization measurements of Fe XIV 5303 A coronal emission requires a detailed understanding of the dependence of the emitted Stokes vector on coronal magnetic field direction, electron density, and temperature and on height of origin. The required dependence is included in the solutions of statistical equilibrium for the ion, which are solved explicitly for 34 magnetic sublevels in both the ground and four excited terms. The full solutions are reduced to equivalent simple analytic forms which clearly show the required dependence on coronal conditions. The analytic forms of the reduced solutions are suitable for routine analysis of 5303 green line polarimetric data obtained at Pic du Midi and from the Solar Maximum Mission Coronagraph/Polarimeter.

  3. Human resources for health strategies adopted by providers in resource-limited settings to sustain long-term delivery of ART: a mixed-methods study from Uganda.

    PubMed

    Zakumumpa, Henry; Taiwo, Modupe Oladunni; Muganzi, Alex; Ssengooba, Freddie

    2016-10-19

    Human resources for health (HRH) constraints are a major barrier to the sustainability of antiretroviral therapy (ART) scale-up programs in Sub-Saharan Africa. Many prior approaches to HRH constraints have taken a top-down trend of generalized global strategies and policy guidelines. The objective of the study was to examine the human resources for health strategies adopted by front-line providers in Uganda to sustain ART delivery beyond the initial ART scale-up phase between 2004 and 2009. A two-phase mixed-methods approach was adopted. In the first phase, a survey of a nationally representative sample of health facilities (n = 195) across Uganda was conducted. The second phase involved in-depth interviews (n = 36) with ART clinic managers and staff of 6 of the 195 health facilities purposively selected from the first study phase. Quantitative data was analysed based on descriptive statistics, and qualitative data was analysed by coding and thematic analysis. The identified strategies were categorized into five themes: (1) providing monetary and non-monetary incentives to health workers on busy ART clinic days; (2) workload reduction through spacing ART clinic appointments; (3) adopting training workshops in ART management as a motivation strategy for health workers; (4) adopting non-physician-centred staffing models; and (5) devising ART program leadership styles that enhanced health worker commitment. Facility-level strategies for responding to HRH constraints are feasible and can contribute to efforts to increase country ownership of HIV programs in resource-limited settings. Consideration of the human resources for health strategies identified in the study by ART program planners and managers could enhance the long-term sustainment of ART programs by providers in resource-limited settings.

  4. The effects of organization on medical utilization: an analysis of service line organization.

    PubMed

    Byrne, Margaret M; Charns, Martin P; Parker, Victoria A; Meterko, Mark M; Wray, Nelda P

    2004-01-01

    To determine whether clinical service lines in primary care and mental health reduce inpatient and urgent care utilization. All VHA medical centers were surveyed to determine whether service lines had been established in primary care or mental health care prior to the beginning of fiscal year 1997 (FY97). Facility-level data on medical utilization from Veterans Health Administration (VHA) administrative databases were used for descriptive and multivariate regression analyses of utilization and of changes in measures between FY97 and FY98. Nine primary care-related and 5 mental health-related variables were analyzed. Primary care and mental health service lines had been established in approximately half of all facilities. Service lines varied in duration and extent of restructuring. Mere presence of a service line had no positive and several negative effects on measured outcome variables. More detailed analyses showed that some types of service lines have statistically significant and mostly negative effects on both mental health and primary care-related measures. Newly implemented service lines had significantly less improvement in measures over time than facilities with no service line. Health care organizations are implementing innovative organizational structures in hopes of improving quality of care and reducing resource utilization. We found that service lines in primary care and mental health may lead to an initial period of disruption, with little evidence of a beneficial effect on performance for longer duration service lines.

  5. Linear ground-water flow, flood-wave response program for programmable calculators

    USGS Publications Warehouse

    Kernodle, John Michael

    1978-01-01

    Two programs are documented which solve a discretized analytical equation derived to determine head changes at a point in a one-dimensional ground-water flow system. The programs, written for programmable calculators, are in widely divergent but commonly encountered languages and serve to illustrate the adaptability of the linear model to use in situations where access to true computers is not possible or economical. The analytical method assumes a semi-infinite aquifer which is uniform in thickness and hydrologic characteristics, bounded on one side by an impermeable barrier and on the other parallel side by a fully penetrating stream in complete hydraulic connection with the aquifer. Ground-water heads may be calculated for points along a line which is perpendicular to the impermeable barrier and the fully penetrating stream. Head changes at the observation point are dependent on (1) the distance between that point and the impermeable barrier, (2) the distance between the line of stress (the stream) and the impermeable barrier, (3) aquifer diffusivity, (4) time, and (5) head changes along the line of stress. The primary application of the programs is to determine aquifer diffusivity by the flood-wave response technique. (Woodard-USGS)

  6. Stellar parameters of Be stars observed with X-shooter

    NASA Astrophysics Data System (ADS)

    Shokry, A.; Rivinius, Th.; Mehner, A.; Martayan, C.; Hummel, W.; Townsend, R. H. D.; Mérand, A.; Mota, B.; Faes, D. M.; Hamdy, M. A.; Beheary, M. M.; Gadallah, K. A. K.; Abo-Elazm, M. S.

    2018-01-01

    Aims: The X-shooter archive of several thousand telluric standard star spectra was skimmed for Be and Be shell stars to derive the stellar fundamental parameters and statistical properties, in particular for the less investigated late-type Be stars and the extension of the Be phenomenon into early A stars. Methods: An adapted version of the BCD method is used, using the Balmer discontinuity parameters to determine effective temperature and surface gravity. This method is optimally suited for late B stars. The projected rotational velocity was obtained by profile fitting to the Mg ii lines of the targets, and the spectra were inspected visually for the presence of peculiar features such as the infrared Ca ii triplet or the presence of a double Balmer discontinuity. The Balmer line equivalent widths were measured, but they are only useful for determining the pure emission contribution in a subsample of Be stars owing to uncertainties in determining the photospheric contribution. Results: A total of 78, mostly late-type, Be stars, were identified in the X-shooter telluric standard star archive, out of which 48 had not been reported before. We confirm the general trend that late-type Be stars have more tenuous disks and are less variable than early-type Be stars. The relatively large number (48) of relatively bright (V> 8.5) additional Be stars casts some doubt on the statistics of late-type Be stars; they are more common than currently thought. The Be/B star fraction may not strongly depend on spectral subtype. Based on observations made with ESO Telescopes at the La Silla Paranal Observatory under program IDs 60.A-9022, 60.A-9024, 077.D-0085, 085.A-0962, 185.D-0056, 091.B-0900, and 093.D-0415.Table 6 is only available at the CDS via anonymous ftp to cdsarc.u-strasbg.fr (130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/609/A108

  7. Characterization of palmprints by wavelet signatures via directional context modeling.

    PubMed

    Zhang, Lei; Zhang, David

    2004-06-01

    The palmprint is one of the most reliable physiological characteristics that can be used to distinguish between individuals. Current palmprint-based systems are more user friendly, more cost effective, and require fewer data signatures than traditional fingerprint-based identification systems. The principal lines and wrinkles captured in a low-resolution palmprint image provide more than enough information to uniquely identify an individual. This paper presents a palmprint identification scheme that characterizes a palmprint using a set of statistical signatures. The palmprint is first transformed into the wavelet domain, and the directional context of each wavelet subband is defined and computed in order to collect the predominant coefficients of its principal lines and wrinkles. A set of statistical signatures, which includes gravity center, density, spatial dispersivity and energy, is then defined to characterize the palmprint with the selected directional context values. A classification and identification scheme based on these signatures is subsequently developed. This scheme exploits the features of principal lines and prominent wrinkles sufficiently and achieves satisfactory results. Compared with the line-segments-matching or interesting-points-matching based palmprint verification schemes, the proposed scheme uses a much smaller amount of data signatures. It also provides a convenient classification strategy and more accurate identification.
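
    A minimal sketch of computing simple per-subband statistical signatures (energy and gravity center of coefficient magnitudes) after a 2-D wavelet decomposition is given below. It uses the PyWavelets package and generic statistics rather than the paper's specific directional-context selection, and the synthetic image is only a stand-in for a palmprint.

      import numpy as np
      import pywt

      def wavelet_signatures(image, wavelet="db2", level=3):
          """Per-subband energy and gravity center of wavelet-coefficient magnitudes."""
          coeffs = pywt.wavedec2(np.asarray(image, float), wavelet, level=level)
          signatures = []
          for detail in coeffs[1:]:                        # (cH, cV, cD) at each level
              for name, band in zip(("H", "V", "D"), detail):
                  mag = np.abs(band)
                  energy = float((mag**2).sum())
                  rows, cols = np.indices(mag.shape)
                  total = float(mag.sum()) or 1.0
                  gc = (float((rows * mag).sum()) / total, float((cols * mag).sum()) / total)
                  signatures.append({"band": name, "energy": energy, "gravity_center": gc})
          return signatures

      # Toy usage on a synthetic image containing a couple of dark "principal lines".
      img = np.full((128, 128), 200.0)
      img[40:43, :] = 50.0        # horizontal line
      img[:, 80:82] = 60.0        # vertical line
      for s in wavelet_signatures(img)[:3]:
          print(s)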

  8. Effect of thalidomide and arsenic trioxide on the release of tumor necrosis factor-α and vascular endothelial growth factor from the KG-1a human acute myelogenous leukemia cell line.

    PubMed

    Girgis, Erian H; Mahoney, John P; Khalil, Rafaat H; Soliman, Magdi R

    2010-07-01

    Studies conducted in our lab have indicated that thalidomide cytotoxicity in the KG-1a human acute myelogenous leukemia (AML) cell line was enhanced by combining it with arsenic trioxide. The current investigation was conducted in order to evaluate the effect of thalidomide either alone or in combination with arsenic trioxide on the release of tumor necrosis factor-α (TNF-α) and vascular endothelial growth factor (VEGF) from this cell line in an attempt to clarify its possible cytotoxic mechanism(s). Human AML cell line KG-1a was used in this study. The cells were cultured for 48 h in the presence or absence of thalidomide (5 mg/l), and or arsenic trioxide (4 μM). The levels of TNF-α and VEGF in the supernatant were determined by ELISA. Results obtained indicate that the levels of TNF-α in the supernatant of KG-1a cell cultures incubated with thalidomide, arsenic trioxide, or combination were statistically lower than those observed in the supernatant of control cells (2.89, 5.07, 4.15 and 16.88 pg/ml, respectively). However, the levels of VEGF in the supernatant of thalidomide-treated cells were statistically higher than those in the supernatant of control cells (69.61 vs. 11.48 pg/l). Arsenic trioxide, whether alone or in combination with thalidomide, did not produce any statistically significant difference in the levels of VEGF as compared to the control or thalidomide-treated cell supernatant. These findings indicate that thalidomide and the arsenic trioxide inhibition of TNF-α production by KG-1a cells may play an important role in their cytotoxic effect.

  9. [Applications of the hospital statistics management system].

    PubMed

    Zhai, Hong; Ren, Yong; Liu, Jing; Li, You-Zhang; Ma, Xiao-Long; Jiao, Tao-Tao

    2008-01-01

    The Hospital Statistics Management System is built on an Office Automation Platform of the Shandong provincial hospital system. Its workflow, role, and permission-management technologies are used to standardize and optimize the statistics management program in the total quality control of hospital statistics. The system's applications have combined the office automation platform with statistics management in a hospital, and this provides a practical example of a modern hospital statistics management model.

  10. Program Predicts Nonlinear Inverter Performance

    NASA Technical Reports Server (NTRS)

    Al-Ayoubi, R. R.; Oepomo, T. S.

    1985-01-01

    Program developed for ac power distribution system on Shuttle orbiter predicts total load on inverters and node voltages at each of line replaceable units (LRU's). Mathematical model simulates inverter performance at each change of state in power distribution system.

  11. Preparing Teachers to Use GIS: The Impact of a Hybrid Professional Development Program on Teachers' Use of GIS

    NASA Astrophysics Data System (ADS)

    Moore, Steven; Haviland, Don; Moore, William; Tran, Michael

    2016-12-01

    This article reports the findings of a 3-year study of a hybrid professional development program designed to prepare science and mathematics teachers to implement GIS in their classrooms. The study was conducted as part of the CoastLines Innovative Technology Experiences for Students and Teachers project funded by the National Science Foundation. Three cohorts of teachers participated in the program, with each participant receiving 40 h of synchronous online instruction and 80 h of in-person instruction and support over an 8-month period. Data from surveys of participants both before and after the program were analyzed using correlation, ordinary least squares, and ordered logit regression analyses. The analyses revealed increases in the self-reported frequency of GIS use and enhanced feelings of preparation, competence, community, and comfort with respect to using GIS for instruction. A composite index of all impact variables was positively influenced as well. The statistical analyses found a strong relationship between self-reported feelings of preparation and use of GIS. Some support was found for the idea that feelings of competence, community, and comfort were related to the teachers' sense of preparation. The findings suggest that a robust hybrid model of teacher professional development can prepare teachers to use GIS in their classrooms. More research is needed to understand how hybrid models influence the sociopsychological and other dimensions that support teachers' feelings of preparation to implement GIS.

  12. Near-term hybrid vehicle program, phase 1. Appendix D: Sensitivity analysis report

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Parametric analyses, using a hybrid vehicle synthesis and economics program (HYVELD), are described that investigate the sensitivity of hybrid vehicle cost, fuel usage, utility, and marketability to changes in travel statistics, energy costs, vehicle lifetime and maintenance, owner use patterns, internal combustion engine (ICE) reference vehicle fuel economy, and drive-line component costs and type. The lowest initial cost of the hybrid vehicle would be $1200 to $1500 higher than that of the conventional vehicle. For nominal energy costs ($1.00/gal for gasoline and 4.2 cents/kWh for electricity), the ownership cost of the hybrid vehicle is projected to be 0.5 to 1.0 cents/mi less than that of the conventional ICE vehicle. To attain this ownership cost differential, the lifetime of the hybrid vehicle must be extended to 12 years and its maintenance cost reduced by 25 percent compared with the conventional vehicle. The ownership cost advantage of the hybrid vehicle increases rapidly as the price of fuel increases from $1 to $2/gal.

  13. Effects of size, sex, and voluntary running speeds on costs of locomotion in lines of laboratory mice selectively bred for high wheel-running activity.

    PubMed

    Rezende, Enrico L; Kelly, Scott A; Gomes, Fernando R; Chappell, Mark A; Garland, Theodore

    2006-01-01

    Selective breeding for over 35 generations has led to four replicate (S) lines of laboratory house mice (Mus domesticus) that run voluntarily on wheels about 170% more than four random-bred control (C) lines. We tested whether S lines have evolved higher running performance by increasing running economy (i.e., decreasing energy spent per unit of distance) as a correlated response to selection, using a recently developed method that allows for nearly continuous measurements of oxygen consumption (VO2) and running speed in freely behaving animals. We estimated slope (incremental cost of transport [COT]) and intercept for regressions of power (the dependent variable, VO2/min) on speed for 49 males and 47 females, as well as their maximum VO2 and speeds during wheel running, under conditions mimicking those that these lines face during the selection protocol. For comparison, we also measured COT and maximum aerobic capacity (VO2max) during forced exercise on a motorized treadmill. As in previous studies, the increased wheel running of S lines was mainly attributable to increased average speed, with males also showing a tendency for increased time spent running. On a whole-animal basis, combined analysis of males and females indicated that COT during voluntary wheel running was significantly lower in the S lines (one-tailed P=0.015). However, mice from S lines are significantly smaller and attain higher maximum speeds on the wheels; with either body mass or maximum speed (or both) entered as a covariate, the statistical significance of the difference in COT is lost (one-tailed P> or =0.2). Thus, both body size and behavior are key components of the reduction in COT. Several statistically significant sex differences were observed, including lower COT and higher resting metabolic rate in females. In addition, maximum voluntary running speeds were negatively correlated with COT in females but not in males. Moreover, males (but not females) from the S lines exhibited significantly higher treadmill VO2max as compared to those from C lines. The sex-specific responses to selection may in part be consequences of sex differences in body mass and running style. Our results highlight how differences in size and running speed can account for lower COT in S lines and suggest that lower COT may have coadapted in response to selection for higher running distances in these lines.
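
    The incremental cost of transport (COT) is simply the slope of the power-versus-speed regression, so the fit can be sketched in a few lines; the speed and VO2 readings below are made-up placeholders, not data from the selection experiment.

      import numpy as np

      # Hypothetical near-continuous readings for one mouse during wheel running.
      speed_m_per_min = np.array([5.0, 8.0, 12.0, 15.0, 20.0, 25.0, 30.0])
      vo2_ml_per_min = np.array([2.1, 2.4, 2.9, 3.2, 3.8, 4.3, 4.9])

      # Linear fit of power on speed: slope = incremental COT, intercept = zero-speed cost.
      cot, intercept = np.polyfit(speed_m_per_min, vo2_ml_per_min, 1)
      print(f"COT = {cot:.4f} ml O2 per metre, intercept = {intercept:.3f} ml O2/min")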

  14. Around the Sun in a Graphing Calculator.

    ERIC Educational Resources Information Center

    Demana, Franklin; Waits, Bert K.

    1989-01-01

    Discusses the use of graphing calculators for polar and parametric equations. Presents eight lines of the program for the graph of a parametric equation and 11 lines of the program for a graph of a polar equation. Illustrates the application of the programs for planetary motion and free-fall motion. (YP)

  15. Assessment of Coronal Spinal Alignment for Adult Spine Deformity Cases After Intraoperative T Square Shaped Use.

    PubMed

    Kurra, Swamy; Metkar, Umesh; Yirenkyi, Henaku; Tallarico, Richard A; Lavelle, William F

    We retrospectively reviewed surgeries performed between 2011 and 2015 on patients who underwent posterior spinal deformity instrumentation with constructs involving fusion to the pelvis and encompassing at least five levels. The aim was to measure the radiographic outcomes of coronal malalignment (CM) after use of an intraoperative T square shaped instrument in posterior spinal deformity surgeries with at least five levels of fusion and extension to the pelvis. Children with neuromuscular deformity have been found to benefit from the intraoperative T square technique, which helps achieve proper coronal spinal balance with extensive fusions; this technique was used in our posterior spine deformity instrumentation surgeries meeting the aforementioned parameters. There were 50 patients: n = 16 with the intraoperative T square shaped device and n = 34 without it. Subgroups were divided based on greater than 20 mm and greater than 40 mm displacement of the C7 plumb line from the central sacral vertical line on either side in preoperative radiographs. We analyzed the demographics and the pre- and postoperative radiographic parameters of standing films, namely standing CM (displacement of the C7 plumb line from the central sacral vertical line) and major coronal Cobb angles, in the total sample and subgroups, and compared T square shaped device use with no-device use by analysis of variance. A p value ≤.05 was considered statistically significant. In the total sample, although the postoperative CM means were not statistically different, we observed greater CM corrections in patients where the T square shaped device was used (70%) versus not used (18%). In the >20 mm and >40 mm subgroups, the postoperative mean CM values were statistically lower for the patients where the T square shaped device was used (p = .016 and p = .003, respectively). Cobb corrections were statistically higher with T square shaped device use in both the >20 mm and >40 mm subgroups (68% in each). The intraoperative T square shaped device technique had a positive effect on the amount of coronal malalignment correction and on lumbar and thoracic coronal Cobb angles. Level III. Copyright © 2017 Scoliosis Research Society. Published by Elsevier Inc. All rights reserved.

  16. Guide Lines for Evaluation of Continuing Education Programs in Mental Health.

    ERIC Educational Resources Information Center

    Miller, Norma; And Others

    Suggestions for program administrators and training program directors to develop comprehensive plans based on principles of community involvement, education, administration and finance, and the disciplines being taught are broadly outlined. Three accompanying charts illustrate approach to evaluation planning. (NF)

  17. The study of microstrip antenna arrays and related problems

    NASA Technical Reports Server (NTRS)

    Lo, Y. T.

    1986-01-01

    In February, an initial computer program to be used in analyzing the four-element array module was completed. This program performs the analysis of modules composed of four rectangular patches which are corporately fed by a microstrip line network terminated in four identical load impedances. Currently, a rigorous full-wave analysis of various types of microstrip line feed structures and patches is being performed. These tests include the microstrip line feed between layers of different electrical parameters. A method of moments was implemented for the case of a single dielectric layer and microstrip line fed rectangular patches in which the primary source is assumed to be a magnetic current ribbon across the line some distance from the patch. Measured values are compared with those computed by the program.

  18. Statistical Software and Artificial Intelligence: A Watershed in Applications Programming.

    ERIC Educational Resources Information Center

    Pickett, John C.

    1984-01-01

    AUTOBJ and AUTOBOX are revolutionary software programs which contain the first application of artificial intelligence to statistical procedures used in analysis of time series data. The artificial intelligence included in the programs and program features are discussed. (JN)

  19. Sets, Probability and Statistics: The Mathematics of Life Insurance. [Computer Program.] Second Edition.

    ERIC Educational Resources Information Center

    King, James M.; And Others

    The materials described here represent the conversion of a highly popular student workbook "Sets, Probability and Statistics: The Mathematics of Life Insurance" into a computer program. The program is designed to familiarize students with the concepts of sets, probability, and statistics, and to provide practice using real life examples. It also…

  20. Bayesian data analysis tools for atomic physics

    NASA Astrophysics Data System (ADS)

    Trassinelli, Martino

    2017-10-01

    We present an introduction to some concepts of Bayesian data analysis in the context of atomic physics. Starting from basic rules of probability, we present Bayes' theorem and its applications. In particular, we discuss how to calculate simple and joint probability distributions and the Bayesian evidence, a model-dependent quantity that allows one to assign probabilities to different hypotheses from the analysis of the same data set. To give some practical examples, these methods are applied to two concrete cases. In the first example, the presence or not of a satellite line in an atomic spectrum is investigated. In the second example, we determine the most probable model among a set of possible profiles from the analysis of a statistically poor spectrum. We also show how to calculate the probability distribution of the main spectral component without having to determine uniquely the spectrum modeling. For these two studies, we implement the program Nested_fit to calculate the different probability distributions and other related quantities. Nested_fit is a Fortran90/Python code developed during the last years for the analysis of atomic spectra. As indicated by the name, it is based on the nested sampling algorithm, which is presented in detail together with the program itself.
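
    The model-comparison step described here reduces to normalizing each model's evidence, weighted by its prior probability, into a posterior model probability. A minimal sketch of that bookkeeping follows, with made-up log-evidence values standing in for Nested_fit output.

      import numpy as np

      def posterior_model_probs(log_evidences, prior_probs=None):
          """P(M_i | data) from log-evidences and prior model probabilities."""
          logz = np.asarray(log_evidences, float)
          priors = (np.full(logz.size, 1.0 / logz.size)
                    if prior_probs is None else np.asarray(prior_probs, float))
          logw = logz + np.log(priors)
          logw -= logw.max()                  # subtract the max for numerical stability
          w = np.exp(logw)
          return w / w.sum()

      # Hypothetical log-evidences for "one line" vs "two lines" spectral models.
      probs = posterior_model_probs([-1520.3, -1517.8])
      print(dict(zip(["one line", "two lines"], np.round(probs, 3))))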
