Sample records for statistics program system

  1. Voice Response System Statistics Program: Operational Handbook.

    DOT National Transportation Integrated Search

    1980-06-01

    This report documents the Voice Response System (VRS) Statistics Program developed for the preflight weather briefing VRS. It describes the VRS statistical report format and contents, the software program structure, and the program operation.

  2. The U.S. geological survey rass-statpac system for management and statistical reduction of geochemical data

    USGS Publications Warehouse

    VanTrump, G.; Miesch, A.T.

    1977-01-01

    RASS is an acronym for Rock Analysis Storage System and STATPAC, for Statistical Package. The RASS and STATPAC computer programs are integrated into the RASS-STATPAC system for the management and statistical reduction of geochemical data. The system, in its present form, has been in use for more than 9 yr by scores of U.S. Geological Survey geologists, geochemists, and other scientists engaged in a broad range of geologic and geochemical investigations. The principal advantage of the system is the flexibility afforded the user both in data searches and retrievals and in the manner of statistical treatment of data. The statistical programs provide for most types of statistical reduction normally used in geochemistry and petrology, but also contain bridges to other program systems for statistical processing and automatic plotting. © 1977.

  3. The Fact Book: Report for the Florida College System, 2014

    ERIC Educational Resources Information Center

    Florida Department of Education, 2014

    2014-01-01

    This 2014 fact book for the Florida College System is divided into the following categories: (1) Student Information, which includes fall, annual, FTE, and program enrollment statistics, as well as credit program completion statistics; (2) Employee Information, which includes statistics regarding employee headcount by occupational activity, and…

  4. The Fact Book: Report for the Florida College System, 2015

    ERIC Educational Resources Information Center

    Florida Department of Education, 2015

    2015-01-01

    This 2015 fact book for the Florida College System is divided into the following categories: (1) Student Information, which includes fall, annual, FTE, and program enrollment statistics, as well as credit program completion statistics; (2) Employee Information, which includes statistics regarding employee headcount by occupational activity, and…

  5. The Fact Book: Report for the Florida College System, 2016

    ERIC Educational Resources Information Center

    Florida Department of Education, 2016

    2016-01-01

    This 2016 fact book for the Florida College System is divided into the following categories: (1) Student Information, which includes fall, annual, FTE, and program enrollment statistics, as well as credit program completion statistics; (2) Employee Information, which includes statistics regarding employee headcount by occupational activity and…

  6. A Management Information System Model for Program Management. Ph.D. Thesis - Oklahoma State Univ.; [Computerized Systems Analysis]

    NASA Technical Reports Server (NTRS)

    Shipman, D. L.

    1972-01-01

    The development of a model to simulate the information system of a program management type of organization is reported. The model statistically determines the following parameters: type of messages, destinations, delivery durations, type processing, processing durations, communication channels, outgoing messages, and priorities. The total management information system of the program management organization is considered, including formal and informal information flows and both facilities and equipment. The model is written in General Purpose System Simulation 2 computer programming language for use on the Univac 1108, Executive 8 computer. The model is simulated on a daily basis and collects queue and resource utilization statistics for each decision point. The statistics are then used by management to evaluate proposed resource allocations, to evaluate proposed changes to the system, and to identify potential problem areas. The model employs both empirical and theoretical distributions which are adjusted to simulate the information flow being studied.
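
    The GPSS model itself is not reproduced in this record, but the statistics it collects are easy to picture. The Python sketch below, with invented arrival and processing rates, treats a single decision point as a first-in-first-out queue with one processing resource and reports the queue-wait and resource-utilization statistics the abstract describes; it is an illustration of the idea, not Shipman's model.

      import random

      random.seed(1)

      ARRIVAL_RATE = 8.0    # messages per hour (assumed)
      SERVICE_RATE = 10.0   # messages processed per hour (assumed)
      SIM_HOURS = 8.0       # one simulated working day

      # Poisson arrivals over the day, served first-in-first-out by one resource.
      t, arrivals = 0.0, []
      while True:
          t += random.expovariate(ARRIVAL_RATE)
          if t >= SIM_HOURS:
              break
          arrivals.append(t)

      busy_until, busy_time, waits = 0.0, 0.0, []
      for arr in arrivals:
          start = max(arr, busy_until)              # message waits if the resource is busy
          service = random.expovariate(SERVICE_RATE)
          busy_until = start + service
          busy_time += service
          waits.append(start - arr)

      print(f"messages handled       : {len(arrivals)}")
      print(f"mean wait in queue (h) : {sum(waits) / len(waits):.3f}")
      print(f"resource utilization   : {busy_time / max(busy_until, SIM_HOURS):.2%}")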

  7. [Applications of the hospital statistics management system].

    PubMed

    Zhai, Hong; Ren, Yong; Liu, Jing; Li, You-Zhang; Ma, Xiao-Long; Jiao, Tao-Tao

    2008-01-01

    The Hospital Statistics Management System is built on an Office Automation Platform of the Shandong provincial hospital system. Its workflow, role, and permission (access-control) technologies are used to standardize and optimize the statistics management procedures within the total quality control of hospital statistics. The system's applications have combined the office automation platform with statistics management in a hospital, providing a practical example of a modern hospital statistics management model.

  8. Development of computer-assisted instruction application for statistical data analysis android platform as learning resource

    NASA Astrophysics Data System (ADS)

    Hendikawati, P.; Arifudin, R.; Zahid, M. Z.

    2018-03-01

    This study aims to design an Android Statistics Data Analysis application that can be accessed through mobile devices, making it easier for users to reach statistical tools. The Statistics Data Analysis application covers various basic statistics topics along with parametric statistical data analysis. The output of the application is parametric statistical data analysis that can be used by students, lecturers, and other users who need the results of statistical calculations quickly and in an easily understood form. The Android application is developed in the Java programming language; the server side uses PHP with the CodeIgniter framework, and the database used is MySQL. The system development methodology is the Waterfall methodology, with stages of analysis, design, coding, testing, implementation, and system maintenance. This statistical data analysis application is expected to support statistics lecturing activities and make it easier for students to carry out statistical analysis on mobile devices.

  9. Statistical correlation of structural mode shapes from test measurements and NASTRAN analytical values

    NASA Technical Reports Server (NTRS)

    Purves, L.; Strang, R. F.; Dube, M. P.; Alea, P.; Ferragut, N.; Hershfeld, D.

    1983-01-01

    The software and procedures of a system of programs used to generate a report of the statistical correlation between NASTRAN modal analysis results and physical tests results from modal surveys are described. Topics discussed include: a mathematical description of statistical correlation, a user's guide for generating a statistical correlation report, a programmer's guide describing the organization and functions of individual programs leading to a statistical correlation report, and a set of examples including complete listings of programs, and input and output data.
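
    One standard measure for correlating a measured mode shape with a NASTRAN analytical one is the modal assurance criterion (MAC); the cited report defines its own statistical correlation, so the short Python sketch below, with made-up mode-shape vectors, is only an analogous illustration.

      import numpy as np

      def mac(phi_test: np.ndarray, phi_fem: np.ndarray) -> float:
          """Modal assurance criterion between two mode shapes sampled at the same DOFs."""
          return abs(phi_test @ phi_fem) ** 2 / ((phi_test @ phi_test) * (phi_fem @ phi_fem))

      # Hypothetical first bending mode at five measurement degrees of freedom.
      measured   = np.array([0.12, 0.48, 0.95, 0.51, 0.11])
      analytical = np.array([0.10, 0.50, 1.00, 0.50, 0.10])
      print(f"MAC = {mac(measured, analytical):.3f}")   # 1.0 would mean identical shapes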

  10. Computer program uses Monte Carlo techniques for statistical system performance analysis

    NASA Technical Reports Server (NTRS)

    Wohl, D. P.

    1967-01-01

    Computer program with Monte Carlo sampling techniques determines the effect of a component part of a unit upon the overall system performance. It utilizes the full statistics of the disturbances and misalignments of each component to provide unbiased results through simulated random sampling.
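
    A minimal Python sketch of the sampling pattern the abstract describes, with an invented system-performance function and invented component distributions: draw each component's misalignment from its own distribution and propagate the samples through the system model to estimate output statistics.

      import random
      import statistics

      random.seed(0)

      def system_performance(misalign_a, misalign_b, gain_error):
          # Placeholder system model; the real program would encode the actual unit.
          return 1.0 - 0.8 * abs(misalign_a) - 0.5 * abs(misalign_b) - 0.3 * abs(gain_error)

      samples = []
      for _ in range(20_000):
          a = random.gauss(0.0, 0.05)       # component A misalignment (assumed sigma)
          b = random.gauss(0.0, 0.08)       # component B misalignment (assumed sigma)
          g = random.uniform(-0.02, 0.02)   # gain tolerance (assumed)
          samples.append(system_performance(a, b, g))

      print(f"mean performance   : {statistics.fmean(samples):.4f}")
      print(f"std deviation      : {statistics.stdev(samples):.4f}")
      print(f"P(performance<0.9) : {sum(s < 0.9 for s in samples) / len(samples):.3f}")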

  11. Application of Statistics in Engineering Technology Programs

    ERIC Educational Resources Information Center

    Zhan, Wei; Fink, Rainer; Fang, Alex

    2010-01-01

    Statistics is a critical tool for robustness analysis, measurement system error analysis, test data analysis, probabilistic risk assessment, and many other fields in the engineering world. Traditionally, however, statistics is not extensively used in undergraduate engineering technology (ET) programs, resulting in a major disconnect from industry…

  12. Statistical principle and methodology in the NISAN system.

    PubMed Central

    Asano, C

    1979-01-01

    The NISAN system is a new interactive statistical analysis program package constructed by an organization of Japanese statisticians. The package is widely applicable to both confirmatory and exploratory analysis, and is intended to embody statistical wisdom and to help senior statisticians choose an optimal process of statistical analysis. PMID:540594

  13. Using an Instructional LAN to Teach a Statistics Course.

    ERIC Educational Resources Information Center

    Barnes, J. Wesley; And Others

    1988-01-01

    Discusses a computer assisted learning system for engineering statistics based on personalized system of instruction methods. Describes the system's network, development, course structure, programing, and security. Lists the benefits of the system. (MVL)

  14. System analysis for the Huntsville Operational Support Center distributed computer system

    NASA Technical Reports Server (NTRS)

    Ingels, E. M.

    1983-01-01

    A simulation model was developed and programmed in three languages: BASIC, PASCAL, and SLAM. Two of the programs, the BASIC and PASCAL versions, are included in this report; SLAM is not supported by NASA/MSFC facilities and hence was not included. The statistical comparisons of simulations of the same HOSC system configurations are in good agreement with each other and with the operational statistics obtained for HOSC. Three variations of the most recent HOSC configuration were run, and some conclusions were drawn as to system performance under these variations.

  15. 76 FR 74839 - Generalized System of Preferences (GSP): Import Statistics Relating to Competitive Need Limitations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-01

    ... Statistics Relating to Competitive Need Limitations AGENCY: Office of the United States Trade Representative. ACTION: Notice. SUMMARY: This notice is to inform the public of the availability of import statistics for... System of Preferences (GSP) program. These import statistics identify some articles for which the 2011...

  16. 20 CFR 634.4 - Statistical standards.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 20 Employees' Benefits 3 2011-04-01 2011-04-01 false Statistical standards. 634.4 Section 634.4... System § 634.4 Statistical standards. Recipients shall agree to provide required data following the statistical standards prescribed by the Bureau of Labor Statistics for cooperative statistical programs. ...

  17. 20 CFR 634.4 - Statistical standards.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Statistical standards. 634.4 Section 634.4... System § 634.4 Statistical standards. Recipients shall agree to provide required data following the statistical standards prescribed by the Bureau of Labor Statistics for cooperative statistical programs. ...

  18. Designing a Course in Statistics for a Learning Health Systems Training Program

    ERIC Educational Resources Information Center

    Samsa, Gregory P.; LeBlanc, Thomas W.; Zaas, Aimee; Howie, Lynn; Abernethy, Amy P.

    2014-01-01

    The core pedagogic problem considered here is how to effectively teach statistics to physicians who are engaged in a "learning health system" (LHS). This is a special case of a broader issue--namely, how to effectively teach statistics to academic physicians for whom research--and thus statistics--is a requirement for professional…

  19. 77 FR 62602 - Privacy Act of 1974, as Amended; System of Records Notice

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-15

    ... Record; (12) Statistical Reports--retrievable by names: (a) Personnel Transcript Report, (b) Class... training processes, such as the collection of statistical information on training programs, development of... systems, creating and reviewing statistics to improve the quality of services provided, or conducting debt...

  20. 77 FR 18689 - Changes to Standard Numbering System, Vessel Identification System, and Boating Accident Report...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-28

    ... requires States to compile and send us reports, information, and statistics on casualties reported to them... data and statistical information received from the current collection to establish National... accident prevention programs; and publish accident statistics in accordance with Title 46 U.S.C. 6102...

  1. Derivative Free Optimization of Complex Systems with the Use of Statistical Machine Learning Models

    DTIC Science & Technology

    2015-09-12

    Report AFRL-AFOSR-VA-TR-2015-0278: Derivative Free Optimization of Complex Systems with the Use of Statistical Machine Learning Models, Katya Scheinberg; grant number FA9550-11-1-0239. Subject terms: optimization, derivative-free optimization, statistical machine learning.

  2. Quality Space and Launch Requirements, Addendum to AS9100C

    DTIC Science & Technology

    2015-05-08

    Excerpt: section 8.9.1, Statistical Process Control (SPC); acronym list (SMC, Space and Missile Systems Center; SME, Subject Matter Expert; SOW, Statement of Work; SPC, Statistical Process Control; SPO, System Program Office; SRP ...); and a note that out-of-control conditions can occur without any individual data exceeding the control limits, with control limits developed using standard statistical methods or other approved ...

  3. Annual Statistical Report, 1988. Client Assistance Program, Protection & Advocacy System for Persons with Mental Illness, Protection & Advocacy System for Persons with Developmental Disabilities.

    ERIC Educational Resources Information Center

    National Association of Protection and Advocacy Systems, Washington, DC.

    The report summarizes: (1) 1988 program data for state Protection and Advocacy Systems for persons with developmental disabilities and persons with mental illness, and (2) 1988 program data for Client Assistance Programs. The data are derived from reports from 56 states and territories. In addition to nationwide data totals, each state's…

  4. Alterations/corrections to the BRASS Program

    NASA Technical Reports Server (NTRS)

    Brand, S. N.

    1985-01-01

    Corrections applied to statistical programs contained in two subroutines of the Bed Rest Analysis Software System (BRASS) are summarized. Two subroutines independently calculate significant values within the BRASS program.

  5. Statistical Analysis Tools for Learning in Engineering Laboratories.

    ERIC Educational Resources Information Center

    Maher, Carolyn A.

    1990-01-01

    Described are engineering programs that have used automated data acquisition systems to implement data collection and analyze experiments. Applications include a biochemical engineering laboratory, heat transfer performance, engineering materials testing, mechanical system reliability, statistical control laboratory, thermo-fluid laboratory, and a…

  6. Sun Series program for the REEDA System. [predicting orbital lifetime using sunspot values]

    NASA Technical Reports Server (NTRS)

    Shankle, R. W.

    1980-01-01

    Modifications made to data bases and to four programs in a series of computer programs (Sun Series) which run on the REEDA HP minicomputer system to aid NASA's solar activity predictions used in orbital lifetime predictions are described. These programs utilize various mathematical smoothing techniques and perform statistical and graphical analyses of various solar activity data bases residing on the REEDA System.

  7. Quality Space and Launch Requirements Addendum to AS9100C

    DTIC Science & Technology

    2015-03-05

    Excerpt: section 8.9.1, Statistical Process Control (SPC); 8.9.1.1, Out of Control ...; acronym list (... Systems Center; SME, Subject Matter Expert; SOW, Statement of Work; SPC, Statistical Process Control; SPO, System Program Office; SRP, Standard Repair ...); and a note that out-of-control conditions can occur without individual data exceeding the control limits, with control limits developed using standard statistical methods or other approved techniques and based on ...

  8. Additional Support for the Information Systems Analyst Exam as a Valid Program Assessment Tool

    ERIC Educational Resources Information Center

    Carpenter, Donald A.; Snyder, Johnny; Slauson, Gayla Jo; Bridge, Morgan K.

    2011-01-01

    This paper presents a statistical analysis to support the notion that the Information Systems Analyst (ISA) exam can be used as a program assessment tool in addition to measuring student performance. It compares ISA exam scores earned by students in one particular Computer Information Systems program with scores earned by the same students on the…

  9. 77 FR 61791 - System of Records; Presidential Management Fellows Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-11

    ... program personnel for the following reasons: a. To determine basic program eligibility and to evaluate... descriptive statistics and analytical studies in support of the function for which the records are collected...

  10. Analysis of reference transactions using packaged computer programs.

    PubMed

    Calabretta, N; Ross, R

    1984-01-01

    Motivated by a continuing education class attended by the authors on the measurement of reference desk activities, the reference department at Scott Memorial Library initiated a project to gather data on reference desk transactions and to analyze the data by using packaged computer programs. The programs utilized for the project were SPSS (Statistical Package for the Social Sciences) and SAS (Statistical Analysis System). The planning, implementation and development of the project are described.

  11. Statistical Supplement to the Annual Report of the Coordinating Board, Texas College and University System for the Fiscal Year 1980.

    ERIC Educational Resources Information Center

    Texas Coll. and Univ. System, Austin. Coordinating Board.

    Comprehensive statistical data on Texas higher education is presented. Data and formulas relating to student enrollments and faculty headcounts, program development and productivity, faculty salaries and teaching loads, campus development, funding, and the state student load program are included. Student headcount enrollment data are presented by…

  12. Public Schools and the Juvenile Justice System: Facilitating Relationships

    ERIC Educational Resources Information Center

    Mazzotti, Valerie L.; Higgins, Kyle

    2006-01-01

    This article describes the importance of facilitating relationships between schools and the Juvenile Justice System. Emphasis is placed on statistics concerning children/youth involved in the Juvenile Justice System and the current state of school programs. Strategies for developing integrated programs between schools and the Juvenile Justice…

  13. Mathematical and Statistical Software Index. Final Report.

    ERIC Educational Resources Information Center

    Black, Doris E., Comp.

    Brief descriptions are provided of general-purpose mathematical and statistical software, including 27 "stand-alone" programs, three subroutine systems, and two nationally recognized statistical packages, which are available in the Air Force Human Resources Laboratory (AFHRL) software library. This index was created to enable researchers…

  14. Computer program for prediction of fuel consumption statistical data for an upper stage three-axes stabilized on-off control system

    NASA Technical Reports Server (NTRS)

    1982-01-01

    A FORTRAN coded computer program and method to predict the reaction control fuel consumption statistics for a three axis stabilized rocket vehicle upper stage is described. A Monte Carlo approach is used which is more efficient by using closed form estimates of impulses. The effects of rocket motor thrust misalignment, static unbalance, aerodynamic disturbances, and deviations in trajectory, mass properties and control system characteristics are included. This routine can be applied to many types of on-off reaction controlled vehicles. The pseudorandom number generation and statistical analyses subroutines including the output histograms can be used for other Monte Carlo analyses problems.
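
    The efficiency idea described above, replacing inner-loop trajectory integration with a closed-form impulse estimate per sampled disturbance set, can be illustrated with the hedged Python sketch below; the impulse formula, constants, and distributions are invented stand-ins, not values from the report.

      import random

      random.seed(42)
      ISP = 220.0      # s, assumed thruster specific impulse
      G0 = 9.80665     # m/s^2

      def control_impulse(misalign_deg, unbalance, aero_torque):
          # Closed-form stand-in for the impulse needed to hold attitude (illustrative).
          return 400.0 * abs(misalign_deg) + 900.0 * abs(unbalance) + 50.0 * abs(aero_torque)

      fuel = []
      for _ in range(10_000):
          impulse = control_impulse(random.gauss(0.0, 0.25),     # thrust misalignment, deg
                                    random.gauss(0.0, 0.01),     # static unbalance
                                    random.uniform(0.0, 2.0))    # aerodynamic disturbance
          fuel.append(impulse / (ISP * G0))                      # kg of propellant

      fuel.sort()
      print(f"median fuel     : {fuel[len(fuel) // 2]:.2f} kg")
      print(f"99th percentile : {fuel[int(0.99 * len(fuel))]:.2f} kg")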

  15. Statistics from the Operation of the Low-Level Wind Shear Alert System (LLWAS) during the Joint Airport Weather Studies (JAWS) Project.

    DTIC Science & Technology

    1984-12-01

    AD-A159 367, DOT/FAA/PM-84/32: Statistics from the Operation of the Low-Level Wind Shear Alert System (LLWAS) During the JAWS Project: An Interim Report; National Center for Atmospheric Research, Boulder, CO; December 1984.

  16. Summary Report on NRL Participation in the Microwave Landing System Program.

    DTIC Science & Technology

    1980-08-19

    shifters were measured and statistically analyzed. Several research contracts for promising phased array techniques were awarded to industrial contractors...program was written for compiling statistical data on the measurements, which reads out insertion phase characteristics and standard deviation...GLOSSARY OF TERMS: ALPA, Airline Pilots' Association; ATA, Air Transport Association; AWA, Australasian Wireless Amalgamated; AWOP, All-Weather Operations

  17. Remote Sensing/gis Integration for Site Planning and Resource Management

    NASA Technical Reports Server (NTRS)

    Fellows, J. D.

    1982-01-01

    The development of an interactive/batch gridded information system (an array of cells georeferenced to USGS quad sheets) and interfacing application programs (e.g., hydrologic models) is discussed. This system allows non-programmer users to request any data set(s) stored in the data base by inputting any random polygon's (watershed, political zone) boundary points. The data base information contained within this polygon can be used to produce maps and statistics and to define model parameters for the area. Present and proposed conditions for the area may be compared by inputting future usage (land cover, soils, slope, etc.). This system, known as the Hydrologic Analysis Program (HAP), is especially effective in the real-time analysis of the effects of proposed land cover changes on runoff hydrographs and in graphics/statistics resource inventories of random study areas/watersheds.

  18. Computer Assisted Instruction. Papers Presented at the Association for Educational Data Systems Annual Convention (Phoenix, Arizona, May 3-7, 1976).

    ERIC Educational Resources Information Center

    Association for Educational Data Systems, Washington, DC.

    Two abstracts and seventeen articles on computer assisted instruction (CAI) presented at the 1976 Association for Educational Data Systems (AEDS) convention are included here. Four new computer programs are described: Author System for Education and Training (ASET); GNOSIS, a Swedish/English CAI package; Statistical Interactive Programming System…

  19. Management system of occupational diseases in Korea: statistics, report and monitoring system.

    PubMed

    Rhee, Kyung Yong; Choe, Seong Weon

    2010-12-01

    The management system of occupational diseases in Korea can be assessed from the perspective of a surveillance system. Workers' compensation insurance reports are used to produce official statistics on occupational diseases in Korea. National working conditions surveys are used to monitor the magnitude of work-related symptoms and signs in the labor force. A health examination program was introduced to detect occupational diseases through both selective and mass screening programs. The Working Environment Measurement Institution assesses workers' exposure to hazards in the workplace. Government regulates that the employer should do health examinations and working conditions measurement through contracted private agencies and following the Occupational Safety and Health Act. It is hoped that these institutions may be able to effectively detect and monitor occupational diseases and hazards in the workplace. In view of this, the occupational management system in Korea is well designed, except for the national survey system. In the future, national surveys for detection of hazards and ill-health outcomes in workers should be developed. The existing surveillance system for occupational disease can be improved by providing more refined information through statistical analysis of surveillance data.

  20. Oregon University System Fact Book 2006

    ERIC Educational Resources Information Center

    Mayfield, Vern; North, Tom; Kieran, Bob

    2007-01-01

    This compendium of narrative and statistical information is an overview of the Oregon University System (OUS) and is produced every two years. The introduction includes a mission and vision statement, a listing of OUS campuses and centers, a history of the institutions, OUS degree partnership programs, and distance education degree programs, OUS…

  1. Schoolhouse Systems Project: SSP. 3rd Report.

    ERIC Educational Resources Information Center

    Florida State Dept. of Education, Tallahassee.

    This brochure provides statistical bid breakdown for Programs 1A and 2 of the Florida Schoolhouse Systems Project. Tabular information is provided on bidders, compatible building subsystems, bid tabulation by compatibility, "per school" building subsystems, nominated bidders and lump sums, and a comparison of programs 1A and 2 bids. Data…

  2. The open-source movement: an introduction for forestry professionals

    Treesearch

    Patrick Proctor; Paul C. Van Deusen; Linda S. Heath; Jeffrey H. Gove

    2005-01-01

    In recent years, the open-source movement has yielded a generous and powerful suite of software and utilities that rivals those developed by many commercial software companies. Open-source programs are available for many scientific needs: operating systems, databases, statistical analysis, Geographic Information System applications, and object-oriented programming....

  3. "Hyperstat": an educational and working tool in epidemiology.

    PubMed

    Nicolosi, A

    1995-01-01

    The work of a researcher in epidemiology is based on studying literature, planning studies, gathering data, analyzing data and writing results. The researcher therefore needs to perform more or less simple calculations, to consult or quote literature, to consult textbooks about certain issues or procedures, and to look up specific formulas. There are no programs conceived as a workstation to assist the different aspects of researcher work in an integrated fashion. A hypertextual system was developed which supports different stages of the epidemiologist's work. It combines database management, statistical analysis or planning, and literature searches. The software was developed on Apple Macintosh by using Hypercard 2.1 as a database and HyperTalk as a programming language. The program is structured in 7 "stacks" or files: Procedures; Statistical Tables; Graphs; References; Text; Formulas; Help. Each stack has its own management system with an automated Table of Contents. Stacks contain "cards" which make up the databases and carry executable programs. The programs are of four kinds: association; statistical procedure; formatting (input/output); database management. The system performs general statistical procedures, procedures applicable to epidemiological studies only (follow-up and case-control), and procedures for clinical trials. All commands are given by clicking the mouse on self-explanatory "buttons". In order to perform calculations, the user only needs to enter the data into the appropriate cells and then click on the selected procedure's button. The system has a hypertextual structure. The user can go from a procedure to other cards following the preferred order of succession and according to built-in associations. The user can access different levels of knowledge or information from any stack he is consulting or operating. From every card, the user can go to a selected procedure to perform statistical calculations, to the reference database management system, to the textbook in which all procedures and issues are discussed in detail, to the database of statistical formulas with automated table of contents, to statistical tables with automated table of contents, or to the help module. The program has a very user-friendly interface and leaves the user free to use the same format he would use on paper. The interface does not require special skills. It reflects the Macintosh philosophy of using windows, buttons and mouse. This allows the user to perform complicated calculations without losing the "feel" of the data, to weigh alternatives, and to run simulations. This program shares many features in common with hypertexts. It has an underlying network database where the nodes consist of text, graphics, executable procedures, and combinations of these; the nodes in the database correspond to windows on the screen; the links between the nodes in the database are visible as "active" text or icons in the windows; the text is read by following links and opening new windows. The program is especially useful as an educational tool, directed to medical and epidemiology students. The combination of computing capabilities with a textbook and databases of formulas and literature references makes the program versatile and attractive as a learning tool.
The program is also helpful in the work done at the desk, where the researcher examines results, consults literature, explores different analytic approaches, plans new studies, or writes grant proposals or scientific articles.

  4. Statistical Supplement to the Annual Report of the Coordinating Board, Texas College and University System for Fiscal Year 1978.

    ERIC Educational Resources Information Center

    Ashworth, Kenneth H.

    This supplement to the 1978 Annual Report of the Coordinating Board, Texas College and University System, contains comprehensive statistical data on higher education in Texas. The supplement provides facts, figures, and formulas relating to student enrollments and faculty headcounts, program development and productivity, faculty salaries and…

  5. Human Systems Engineering and Program Success - A Retrospective Content Analysis

    DTIC Science & Technology

    2016-01-01

    collected from the 546 documents and entered into SPSS Statistics Version 22.0 for Windows. HSI words within the sampled documents ranged from zero to...engineers. The approach used a retrospective content analysis of documents from weapon systems acquisition programs, namely Major Defense Acquisition...January 2016, Vol. 23 No. 1: 78–101. The interaction between humans and the systems they use affects program success, as well as life-cycle

  6. How Miniature/Microminiature (2M) Repair Capabilities Can Reduce the Impact of No Evidence of Failure (NEOF) Among Repairables on the Navy’s Operations and Maintenance Account

    DTIC Science & Technology

    1988-06-01

    and PCBs. The pilot program involved screening, testing, and repairing of EMs/PCBs for both COMNAVSEASYSCOM and Commander, Naval Electronic Systems...were chosen from the Support and Test Equipment Engineering Program (STEEP) tests performed by SIMA San Diego during 1987. A statistical analysis and a Level...

  7. Aviation system capacity : annual report

    DOT National Transportation Integrated Search

    1993-10-01

    The Aviation System Capacity Plan is published annually and, in addition to providing airport delay statistics, serves to identify programs that have potential for increasing capacity and reducing delay.

  8. The Nonprofit Program Classification System: Increasing Understanding of the Nonprofit Sector.

    ERIC Educational Resources Information Center

    Romeo, Sheryl; Lampkin, Linda; Twombly, Eric

    2001-01-01

    The Nonprofit Program Classification System being developed by the National Center for Charitable Statistics (NCCS) provides a way to enrich the information available on nonprofits and utilize the newly available NCCS/PRI National Nonprofit Organization database from the IRS Forms 990 filed annually by charities. It provides a method to organize…

  9. Software Reviews.

    ERIC Educational Resources Information Center

    Science Software Quarterly, 1984

    1984-01-01

    Provides extensive reviews of computer software, examining documentation, ease of use, performance, error handling, special features, and system requirements. Includes statistics, problem-solving (TK Solver), label printing, database management, experimental psychology, Encyclopedia Britannica biology, and DNA-sequencing programs. A program for…

  10. 77 FR 73694 - Privacy Act of 1974: Update Existing System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-11

    ... survey response, and in the production of summary descriptive statistics and analytical studies in... participation in an agency's Upward Mobility Program or other personnel program designed to broaden an employee...

  11. AutoBayes Program Synthesis System Users Manual

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Jafari, Hamed; Pressburger, Tom; Denney, Ewen; Buntine, Wray; Fischer, Bernd

    2008-01-01

    Program synthesis is the systematic, automatic construction of efficient executable code from high-level declarative specifications. AutoBayes is a fully automatic program synthesis system for the statistical data analysis domain; in particular, it solves parameter estimation problems. It has seen many successful applications at NASA and is currently being used, for example, to analyze simulation results for Orion. The input to AutoBayes is a concise description of a data analysis problem composed of a parameterized statistical model and a goal that is a probability term involving parameters and input data. The output is optimized and fully documented C/C++ code computing the values for those parameters that maximize the probability term. AutoBayes can solve many subproblems symbolically rather than having to rely on numeric approximation algorithms, thus yielding effective, efficient, and compact code. Statistical analysis is faster and more reliable, because effort can be focused on model development and validation rather than manual development of solution algorithms and code.
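
    AutoBayes itself is a synthesis system, but the flavor of its output can be suggested with a hand-written example: for data modeled as independent draws from a normal distribution, maximizing the likelihood yields closed-form estimates (the sample mean and the mean squared deviation), which is the kind of symbolic solution AutoBayes emits instead of a numeric search. The snippet below is an illustration of that idea, not AutoBayes-generated code.

      import math

      def gaussian_mle(data):
          # Closed-form maximizers of the Gaussian likelihood: no numeric optimizer needed.
          n = len(data)
          mu = sum(data) / n
          sigma2 = sum((x - mu) ** 2 for x in data) / n
          return mu, math.sqrt(sigma2)

      mu, sigma = gaussian_mle([4.8, 5.1, 5.0, 4.7, 5.3, 5.2])   # invented measurements
      print(f"mu = {mu:.3f}, sigma = {sigma:.3f}")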

  12. Statistics and Epidemiology of Lead Poisoning (FY 72-L1).

    ERIC Educational Resources Information Center

    Morrison, John H., Jr.; And Others

    This report is the first in a quarterly series which will contain statistics and epidemiologic notes on lead poisoning at both the national and local levels. This report contains (a) statistics on childhood lead poisoning; (b) a status report on the Community Lead Poisoning Data System, which was designed to assist local lead control programs and…

  13. 1976-77 California Public Schools Selected Statistics: Operating Units, Revenues, Teachers and Pupils, Expenditures.

    ERIC Educational Resources Information Center

    California State Dept. of Education, Sacramento. Bureau of School Apportionments and Reports.

    Statistics on the California school system presented in this publication are prerequisites to the determination of policy at the state and local levels. This publication lists the number of pupils, teachers, and schools in the California school system and associated programs in the 1976-77 school year. It also records the expenditures of public…

  14. The FORTRAN static source code analyzer program (SAP) system description

    NASA Technical Reports Server (NTRS)

    Decker, W.; Taylor, W.; Merwarth, P.; Oneill, M.; Goorevich, C.; Waligora, S.

    1982-01-01

    A source code analyzer program (SAP) designed to assist personnel in conducting studies of FORTRAN programs is described. SAP scans FORTRAN source code and produces reports that present statistics and measures of the statements and structures that make up a module. The processing performed by SAP and the routines, COMMON blocks, and files used by SAP are described. The system generation procedure for SAP is also presented.
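
    A greatly simplified stand-in for the kind of per-module statistics SAP reports is sketched below in Python; the real analyzer computes many more measures, so the keyword list and fixed-form column handling here are illustrative assumptions.

      import re
      from collections import Counter

      STATEMENT_KEYWORDS = ("IF", "DO", "GOTO", "CALL", "READ", "WRITE", "RETURN")

      def module_statistics(fortran_source: str) -> Counter:
          stats = Counter()
          for line in fortran_source.splitlines():
              if line[:1].upper() == "C" or line[:1] == "*":   # fixed-form comment line
                  stats["comment_lines"] += 1
                  continue
              stats["source_lines"] += 1
              body = line[6:].strip().upper()                  # skip label/continuation columns
              for kw in STATEMENT_KEYWORDS:
                  if re.match(rf"{kw}\b", body):
                      stats[kw] += 1
          return stats

      sample = """      SUBROUTINE DEMO(N, X)
      C     TRIVIAL EXAMPLE MODULE
            DO 10 I = 1, N
            IF (X(I) .LT. 0.0) X(I) = 0.0
         10 CONTINUE
            RETURN
            END"""
      print(module_statistics(sample))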

  15. How Do Microfinance Programs Contribute to Poverty Reduction

    DTIC Science & Technology

    2016-09-01

    areas have experienced statistically higher incidents of crime tied to class conflict. Land tax systems under the British were also responsible for...countries. This low delinquency rate is credited to the lack of alternative opportunities that are available to the poor. According to Muhammad... [tabulated totals and Figure 2, "Program Duration and Objective Poverty," omitted] The statistical analysis conducted by Chowdhury, Gosh and Wright finds

  16. Data Processing System (DPS) software with experimental design, statistical analysis and data mining developed for use in entomological research.

    PubMed

    Tang, Qi-Yi; Zhang, Chuan-Xi

    2013-04-01

    A comprehensive but simple-to-use software package called DPS (Data Processing System) has been developed to execute a range of standard numerical analyses and operations used in experimental design, statistics and data mining. This program runs on standard Windows computers. Many of the functions are specific to entomological and other biological research and are not found in standard statistical software. This paper presents applications of DPS to experimental design, statistical analysis and data mining in entomology. © 2012 The Authors Insect Science © 2012 Institute of Zoology, Chinese Academy of Sciences.

  17. Information retrieval system

    NASA Technical Reports Server (NTRS)

    Berg, R. F.; Holcomb, J. E.; Kelroy, E. A.; Levine, D. A.; Mee, C., III

    1970-01-01

    Generalized information storage and retrieval system capable of generating and maintaining a file, gathering statistics, sorting output, and generating final reports for output is reviewed. File generation and file maintenance programs written for the system are general purpose routines.

  18. A Computer Program for the Generation of ARIMA Data

    ERIC Educational Resources Information Center

    Green, Samuel B.; Noles, Keith O.

    1977-01-01

    The autoregressive integrated moving averages model (ARIMA) has been applied to time series data in psychological and educational research. A program is described that generates ARIMA data of a known order. The program enables researchers to explore statistical properties of ARIMA data and simulate systems producing time dependent observations.…
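
    A minimal Python sketch of generating ARIMA data of a known order, in the spirit of the program described (the coefficients and burn-in length are illustrative choices, not taken from the article): build an ARMA(p, q) series from white noise, then cumulatively sum it d times for the integrated part.

      import random

      def simulate_arima(ar, ma, d, n, sigma=1.0, seed=0):
          rng = random.Random(seed)
          p, q = len(ar), len(ma)
          eps = [rng.gauss(0.0, sigma) for _ in range(n + 200)]   # 200-point burn-in
          x = [0.0] * len(eps)
          for t in range(len(eps)):
              ar_part = sum(ar[i] * x[t - 1 - i] for i in range(p) if t - 1 - i >= 0)
              ma_part = sum(ma[j] * eps[t - 1 - j] for j in range(q) if t - 1 - j >= 0)
              x[t] = ar_part + ma_part + eps[t]
          series = x[200:]                                        # drop burn-in
          for _ in range(d):                                      # integrate d times
              total, integrated = 0.0, []
              for v in series:
                  total += v
                  integrated.append(total)
              series = integrated
          return series

      data = simulate_arima(ar=[0.6], ma=[0.3], d=1, n=100)       # ARIMA(1,1,1) of length 100
      print(len(data), round(data[-1], 3))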

  19. Management System of Occupational Diseases in Korea: Statistics, Report and Monitoring System

    PubMed Central

    Choe, Seong Weon

    2010-01-01

    The management system of occupational diseases in Korea can be assessed from the perspective of a surveillance system. Workers' compensation insurance reports are used to produce official statistics on occupational diseases in Korea. National working conditions surveys are used to monitor the magnitude of work-related symptoms and signs in the labor force. A health examination program was introduced to detect occupational diseases through both selective and mass screening programs. The Working Environment Measurement Institution assesses workers' exposure to hazards in the workplace. Government regulates that the employer should do health examinations and working conditions measurement through contracted private agencies and following the Occupational Safety and Health Act. It is hoped that these institutions may be able to effectively detect and monitor occupational diseases and hazards in the workplace. In view of this, the occupational management system in Korea is well designed, except for the national survey system. In the future, national surveys for detection of hazards and ill-health outcomes in workers should be developed. The existing surveillance system for occupational disease can be improved by providing more refined information through statistical analysis of surveillance data. PMID:21258584

  20. Statistical process control in Deep Space Network operation

    NASA Technical Reports Server (NTRS)

    Hodder, J. A.

    2002-01-01

    This report describes how the Deep Space Mission System (DSMS) Operations Program Office at the Jet Propulsion Laboratory (JPL) uses Statistical Process Control (SPC) to monitor performance and evaluate initiatives for improving processes on the National Aeronautics and Space Administration's (NASA) Deep Space Network (DSN).

  1. 43 CFR 2.46 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... group of any records under the control of the Department or a bureau thereof from which information is... management programs or processes such as staffing, employee development, retirement, and grievances and appeals. (i) Statistical records. As used in this subpart, “statistical records” means records in a system...

  2. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI®) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques and process performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The research study identifies and provides a detailed discussion of the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis that identifies the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, and quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.

  3. Mental Health Systems in Scandinavia.

    ERIC Educational Resources Information Center

    Vail, David J.

    The guidebook is introduced by general observations on the Scandinavian countries concerning history, social policy, medicine, mental health, and psychiatric diagnosis. Discussed individually for Norway, Sweden, and Denmark are the following areas: mental health programs and statistics; mental illness programs, regional, hospital, aftercare,…

  4. A Statistically Based Training Diagnostic Tool for Marine Aviation

    DTIC Science & Technology

    2014-06-01

    mission essential task list MDG maneuver description guide MOS military occupational specialty MSHARP Marine Sierra Hotel Aviation Reporting Program...include the Defense Readiness Reporting System (DRRS) Marine Corps, the Current Readiness Program (CRP), and the Marine Sierra Hotel Aviation...Beuschel, 2008). Many of these systems focus on business decisions regarding how companies can increase their bottom line, by appealing to customers more

  5. An analysis of student performance benchmarks in dental hygiene via distance education.

    PubMed

    Olmsted, Jodi L

    2010-01-01

    Three graduate programs, 35 undergraduate programs and 12 dental hygiene degree completion programs in the United States use varying forms of Distance Learning (DL). Relying heavily on DL leaves an unanswered question: Is learner performance on standard benchmark assessments impacted when using technology as a delivery system? A 10 year, longitudinal examination looked for student performance differences in a Distance Education (DE) dental hygiene program. The purpose of this research was to determine if there was a difference in performance between learners taught in a traditional classroom as compared to their counterparts taking classes through an alternative delivery system. A longitudinal, ex post facto design was used. Two hundred and sixty-six subject records were examined. Seventy-seven individuals (29%) were lost through attrition over 10 years. One hundred and eighty-nine records were used as the study sample, 117 individuals were located face-to-face and 72 were at a distance. Independent variables included time and location, while the dependent variables included course grades, grade point average (GPA) and the National Board of Dental Hygiene Examination (NBDHE). Three research questions were asked: Were there statistically significant differences in learner performance on the National Board of Dental Hygiene Examination (NBDHE)? Were there statistically significant differences in learner performance when considering GPAs? Did statistically significant differences in performance exist relating to individual course grades? T-tests were used for data analysis in answering the research questions. From a cumulative perspective, no statistically significant differences were apparent for the NBDHE and GPAs or for individual courses. Interactive Television (ITV), the synchronous DL system examined, was considered effective for delivering education to learners if similar performance outcomes were the evaluation criteria.
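
    The comparison the study describes is a two-sample t-test between the face-to-face and distance cohorts. The sketch below shows that pattern with scipy; the scores are invented, so the printed p-value illustrates the form of the result, not the study's findings.

      from scipy import stats

      # Hypothetical exam scores for the two cohorts.
      face_to_face = [84.1, 78.9, 90.2, 85.5, 79.8, 88.0, 82.3]
      distance     = [83.5, 80.2, 88.7, 84.9, 81.1, 86.4]

      t_stat, p_value = stats.ttest_ind(face_to_face, distance, equal_var=False)
      print(f"t = {t_stat:.3f}, p = {p_value:.3f}")   # p > 0.05 -> no significant difference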

  6. Automated Reporting of DXA Studies Using a Custom-Built Computer Program.

    PubMed

    England, Joseph R; Colletti, Patrick M

    2018-06-01

    Dual-energy x-ray absorptiometry (DXA) scans are a critical population health tool and relatively simple to interpret but can be time consuming to report, often requiring manual transfer of bone mineral density and associated statistics into commercially available dictation systems. We describe here a custom-built computer program for automated reporting of DXA scans using Pydicom, an open-source package built in the Python computer language, and regular expressions to mine DICOM tags for patient information and bone mineral density statistics. This program, easy to emulate by any novice computer programmer, has doubled our efficiency at reporting DXA scans and has eliminated dictation errors.
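
    A hedged sketch of the approach described, reading a DXA DICOM file with Pydicom and regex-mining a text field for bone mineral density values, is shown below. Where vendors store BMD results varies, so the choice of the StudyDescription field, the regular expressions, and the file name are assumptions rather than the authors' implementation.

      import re
      import pydicom

      def summarize_dxa(path: str) -> str:
          ds = pydicom.dcmread(path)
          name = str(ds.get("PatientName", "unknown"))
          study_date = ds.get("StudyDate", "unknown")
          # Look for text like "BMD 0.912 g/cm2  T-score -1.8" in a free-text tag (assumed location).
          text = str(ds.get("StudyDescription", ""))
          bmd = re.search(r"BMD\s*([\d.]+)", text)
          tscore = re.search(r"T-?score\s*(-?[\d.]+)", text)
          return (f"DXA report for {name} ({study_date}): "
                  f"BMD {bmd.group(1) if bmd else 'n/a'} g/cm2, "
                  f"T-score {tscore.group(1) if tscore else 'n/a'}")

      # Usage with a hypothetical file:
      # print(summarize_dxa("dxa_series_001.dcm"))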

  7. Computer Programs for Obtaining and Analyzing Daily Mean Streamflow Data from the U.S. Geological Survey National Water Information System Web Site

    USGS Publications Warehouse

    Granato, Gregory E.

    2009-01-01

    Streamflow information is important for many planning and design activities including water-supply analysis, habitat protection, bridge and culvert design, calibration of surface and ground-water models, and water-quality assessments. Streamflow information is especially critical for water-quality assessments (Warn and Brew, 1980; Di Toro, 1984; Driscoll and others, 1989; Driscoll and others, 1990, a,b). Calculation of streamflow statistics for receiving waters is necessary to estimate the potential effects of point sources such as wastewater-treatment plants and nonpoint sources such as highway and urban-runoff discharges on receiving water. Streamflow statistics indicate the amount of flow that may be available for dilution and transport of contaminants (U.S. Environmental Protection Agency, 1986; Driscoll and others, 1990, a,b). Streamflow statistics also may be used to indicate receiving-water quality because concentrations of water-quality constituents commonly vary naturally with streamflow. For example, concentrations of suspended sediment and sediment-associated constituents (such as nutrients, trace elements, and many organic compounds) commonly increase with increasing flows, and concentrations of many dissolved constituents commonly decrease with increasing flows in streams and rivers (O'Connor, 1976; Glysson, 1987; Vogel and others, 2003, 2005). Reliable, efficient and repeatable methods are needed to access and process streamflow information and data. For example, the Nation's highway infrastructure includes an innumerable number of stream crossings and stormwater-outfall points for which estimates of stream-discharge statistics may be needed. The U.S. Geological Survey (USGS) streamflow data-collection program is designed to provide streamflow data at gaged sites and to provide information that can be used to estimate streamflows at almost any point along any stream in the United States (Benson and Carter, 1973; Wahl and others, 1995; National Research Council, 2004). The USGS maintains the National Water Information System (NWIS), a distributed network of computers and file servers used to store and retrieve hydrologic data (Mathey, 1998; U.S. Geological Survey, 2008). NWISWeb is an online version of this database that includes water data from more than 24,000 streamflow-gaging stations throughout the United States (U.S. Geological Survey, 2002, 2008). Information from NWISWeb is commonly used to characterize streamflows at gaged sites and to help predict streamflows at ungaged sites. Five computer programs were developed for obtaining and analyzing streamflow from the National Water Information System (NWISWeb). The programs were developed as part of a study by the U.S. Geological Survey, in cooperation with the Federal Highway Administration, to develop a stochastic empirical loading and dilution model. The programs were developed because reliable, efficient, and repeatable methods are needed to access and process streamflow information and data. The first program is designed to facilitate the downloading and reformatting of NWISWeb streamflow data. The second program is designed to facilitate graphical analysis of streamflow data. The third program is designed to facilitate streamflow-record extension and augmentation to help develop long-term statistical estimates for sites with limited data. The fourth program is designed to facilitate statistical analysis of streamflow data. The fifth program is a preprocessor to create batch input files for the U.S. 
Environmental Protection Agency DFLOW3 program for calculating low-flow statistics. These computer programs were developed to facilitate the analysis of daily mean streamflow data for planning-level water-quality analyses but also are useful for many other applications pertaining to streamflow data and statistics. These programs and the associated documentation are included on the CD-ROM accompanying this report. This report and the appendixes on the
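
    As one concrete example of the statistical analysis these programs support, the Python sketch below computes a flow-duration table (the flow exceeded a given percent of the time) from daily mean discharge values; the CSV layout and column name are assumptions, not the NWISWeb download format.

      import csv

      def flow_duration(csv_path: str, percents=(10, 25, 50, 75, 90, 99)):
          flows = []
          with open(csv_path, newline="") as f:
              for row in csv.DictReader(f):
                  flows.append(float(row["discharge_cfs"]))   # assumed column name
          flows.sort(reverse=True)                            # highest flow first
          n = len(flows)
          # Flow exceeded p percent of the time.
          return {p: flows[min(n - 1, int(p / 100.0 * n))] for p in percents}

      # Usage with a hypothetical file of daily mean values:
      # print(flow_duration("daily_mean_discharge.csv"))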

  8. Application of the Linux cluster for exhaustive window haplotype analysis using the FBAT and Unphased programs.

    PubMed

    Mishima, Hiroyuki; Lidral, Andrew C; Ni, Jun

    2008-05-28

    Genetic association studies have been used to map disease-causing genes. A newly introduced statistical method, called exhaustive haplotype association study, analyzes genetic information consisting of different numbers and combinations of DNA sequence variations along a chromosome. Such studies involve a large number of statistical calculations and subsequently high computing power. It is possible to develop parallel algorithms and codes to perform the calculations on a high performance computing (HPC) system. However, most existing commonly-used statistic packages for genetic studies are non-parallel versions. Alternatively, one may use the cutting-edge technology of grid computing and its packages to conduct non-parallel genetic statistical packages on a centralized HPC system or distributed computing systems. In this paper, we report the utilization of a queuing scheduler built on the Grid Engine and run on a Rocks Linux cluster for our genetic statistical studies. Analysis of both consecutive and combinational window haplotypes was conducted by the FBAT (Laird et al., 2000) and Unphased (Dudbridge, 2003) programs. The dataset consisted of 26 loci from 277 extended families (1484 persons). Using the Rocks Linux cluster with 22 compute-nodes, FBAT jobs performed about 14.4-15.9 times faster, while Unphased jobs performed 1.1-18.6 times faster compared to the accumulated computation duration. Execution of exhaustive haplotype analysis using non-parallel software packages on a Linux-based system is an effective and efficient approach in terms of cost and performance.
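
    The job fan-out pattern the paper describes can be sketched as follows: enumerate consecutive marker windows over the 26 loci and submit one Grid Engine job per window with qsub. The window-size limit, file names, and script contents are placeholders; the actual FBAT and Unphased command lines are not reproduced here.

      import subprocess

      N_LOCI = 26
      MAX_WINDOW = 5                  # assumed upper bound on window size

      def consecutive_windows(n_loci, max_window):
          for size in range(1, max_window + 1):
              for start in range(n_loci - size + 1):
                  yield tuple(range(start, start + size))

      for window in consecutive_windows(N_LOCI, MAX_WINDOW):
          job_name = "win_" + "_".join(str(m + 1) for m in window)
          script = (
              "#!/bin/sh\n"
              f"# placeholder: run FBAT/Unphased on markers {window}\n"
              f"echo analyzing markers {' '.join(str(m + 1) for m in window)}\n"
          )
          with open(f"{job_name}.sh", "w") as f:
              f.write(script)
          # Hand the job to the Grid Engine scheduler.
          subprocess.run(["qsub", "-N", job_name, f"{job_name}.sh"], check=True)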

  9. Application of the Linux cluster for exhaustive window haplotype analysis using the FBAT and Unphased programs

    PubMed Central

    Mishima, Hiroyuki; Lidral, Andrew C; Ni, Jun

    2008-01-01

    Background Genetic association studies have been used to map disease-causing genes. A newly introduced statistical method, called exhaustive haplotype association study, analyzes genetic information consisting of different numbers and combinations of DNA sequence variations along a chromosome. Such studies involve a large number of statistical calculations and subsequently high computing power. It is possible to develop parallel algorithms and codes to perform the calculations on a high performance computing (HPC) system. However, most existing commonly-used statistic packages for genetic studies are non-parallel versions. Alternatively, one may use the cutting-edge technology of grid computing and its packages to conduct non-parallel genetic statistical packages on a centralized HPC system or distributed computing systems. In this paper, we report the utilization of a queuing scheduler built on the Grid Engine and run on a Rocks Linux cluster for our genetic statistical studies. Results Analysis of both consecutive and combinational window haplotypes was conducted by the FBAT (Laird et al., 2000) and Unphased (Dudbridge, 2003) programs. The dataset consisted of 26 loci from 277 extended families (1484 persons). Using the Rocks Linux cluster with 22 compute-nodes, FBAT jobs performed about 14.4–15.9 times faster, while Unphased jobs performed 1.1–18.6 times faster compared to the accumulated computation duration. Conclusion Execution of exhaustive haplotype analysis using non-parallel software packages on a Linux-based system is an effective and efficient approach in terms of cost and performance. PMID:18541045

  10. The Technologies of EXPER SIM.

    ERIC Educational Resources Information Center

    Hedberg, John G.

    EXPER SIM has been translated into two basic software systems: the Michigan Experimental Simulation Supervisor (MESS) and Louisville Experiment Simulation Supervisor (LESS). MESS and LESS have been programed to facilitate student interaction with the computer for research purposes. The programs contain models for several statistical analyses, and…

  11. Reducing child mortality in Nigeria: a case study of immunization and systemic factors.

    PubMed

    Nwogu, Rufus; Ngowu, Rufus; Larson, James S; Kim, Min Su

    2008-07-01

    The purpose of the study is to assess the outcome of the Expanded Program on Immunization (EPI) in Nigeria, as well as to examine systemic factors influencing its high under-five mortality rate (UFMR). The principal objective of the EPI program when it was implemented in 1978 was to reduce mortality, morbidity, and disability associated with six vaccine-preventable diseases, namely tuberculosis, tetanus, diphtheria, measles, pertussis, and poliomyelitis. The methodological approach to this study is quantitative, using secondary time series data from 1970 to 2003. The study tested three hypotheses using time series multiple regression analysis with autocorrelation adjustment as a statistical model. The results showed that the EPI program had little effect on the UFMR in Nigeria. Only the literacy rate and domestic spending on healthcare had statistically significant effects on the UFMR. The military government was not a significant factor in reducing or increasing the UFMR. It appears that Nigeria needs a unified approach to healthcare delivery, rather than fragmented programs, to overcome cultural and political divisions in society.

  12. AESS: Accelerated Exact Stochastic Simulation

    NASA Astrophysics Data System (ADS)

    Jenkins, David D.; Peterson, Gregory D.

    2011-12-01

    The Stochastic Simulation Algorithm (SSA) developed by Gillespie provides a powerful mechanism for exploring the behavior of chemical systems with small species populations or with important noise contributions. Gene circuit simulations for systems biology commonly employ the SSA method, as do ecological applications. This algorithm tends to be computationally expensive, so researchers seek an efficient implementation of SSA. In this program package, the Accelerated Exact Stochastic Simulation Algorithm (AESS) contains optimized implementations of Gillespie's SSA that improve the performance of individual simulation runs or ensembles of simulations used for sweeping parameters or to provide statistically significant results.
    Program summary
    Program title: AESS
    Catalogue identifier: AEJW_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJW_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: University of Tennessee copyright agreement
    No. of lines in distributed program, including test data, etc.: 10 861
    No. of bytes in distributed program, including test data, etc.: 394 631
    Distribution format: tar.gz
    Programming language: C for processors, CUDA for NVIDIA GPUs
    Computer: Developed and tested on various x86 computers and NVIDIA C1060 Tesla and GTX 480 Fermi GPUs. The system targets x86 workstations, optionally with multicore processors or NVIDIA GPUs as accelerators.
    Operating system: Tested under Ubuntu Linux OS and CentOS 5.5 Linux OS
    Classification: 3, 16.12
    Nature of problem: Simulation of chemical systems, particularly with low species populations, can be accurately performed using Gillespie's method of stochastic simulation. Numerous variations on the original stochastic simulation algorithm have been developed, including approaches that produce results with statistics that exactly match the chemical master equation (CME) as well as other approaches that approximate the CME.
    Solution method: The Accelerated Exact Stochastic Simulation (AESS) tool provides implementations of a wide variety of popular variations on the Gillespie method. Users can select the specific algorithm considered most appropriate. Comparisons between the methods and with other available implementations indicate that AESS provides the fastest known implementation of Gillespie's method for a variety of test models. Users may wish to execute ensembles of simulations to sweep parameters or to obtain better statistical results, so AESS supports acceleration of ensembles of simulations using parallel processing with MPI, SSE vector units on x86 processors, and/or NVIDIA GPUs with CUDA.
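    For readers unfamiliar with the underlying method, the following is a minimal pure-Python sketch of Gillespie's direct method for a toy birth-death system; it illustrates the algorithm that AESS accelerates but is not the AESS code, and the rate constants are arbitrary.

        import math
        import random

        def gillespie_birth_death(k_birth=1.0, k_death=0.1, x0=10, t_end=50.0, seed=0):
            """Gillespie direct method for the reactions 0 -> X (rate k_birth) and X -> 0 (rate k_death * X)."""
            rng = random.Random(seed)
            t, x = 0.0, x0
            times, counts = [t], [x]
            while t < t_end:
                a1 = k_birth                    # propensity of the birth reaction
                a2 = k_death * x                # propensity of the death reaction
                a0 = a1 + a2
                if a0 == 0.0:
                    break
                t += -math.log(1.0 - rng.random()) / a0         # exponential waiting time
                x += 1 if rng.random() * a0 < a1 else -1        # choose which reaction fires
                times.append(t)
                counts.append(x)
            return times, counts

        times, counts = gillespie_birth_death()
        print(f"final time {times[-1]:.2f}, final molecule count {counts[-1]}")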

  13. A statistical approach to deriving subsystem specifications. [for spacecraft shock and vibrational environment tests

    NASA Technical Reports Server (NTRS)

    Keegan, W. B.

    1974-01-01

    In order to produce cost-effective environmental test programs, the test specifications must be realistic and, to be useful, they must be available early in the life of a program. This paper describes a method for achieving such specifications for subsystems by utilizing the results of a statistical analysis of data acquired at subsystem mounting locations during system-level environmental tests. The paper describes the details of this statistical analysis. The resultant recommended levels are a function of the subsystems' mounting location in the spacecraft. Methods of determining this mounting 'zone' are described. Recommendations are then made as to which of the various problem areas encountered should be pursued further.

  14. Development Of Educational Programs In Renewable And Alternative Energy Processing: The Case Of Russia

    NASA Astrophysics Data System (ADS)

    Svirina, Anna; Shindor, Olga; Tatmyshevsky, Konstantin

    2014-12-01

    The paper deals with the main problems of Russian energy system development, which make it necessary to provide educational programs in the field of renewable and alternative energy. The paper describes the process of developing curricula and defining teaching techniques on the basis of expert opinion evaluation, and suggests a competence model for master's students in renewable and alternative energy processing. Data for statistical analysis were obtained from a distributed questionnaire and in-depth interviews. On the basis of these data, the curricula structure was optimized and three models for optimizing the structure of teaching techniques were developed. The suggested educational program structure, which was adopted by employers, is presented in the paper. The findings include the quantitatively estimated importance of systemic thinking and of professional skills and knowledge as basic competences of a master's program graduate; the statistically estimated necessity of a practice-based learning approach; and optimization models for structuring curricula in renewable and alternative energy processing. These findings provide a platform for the development of educational programs.

  15. Development of new on-line statistical program for the Korean Society for Radiation Oncology

    PubMed Central

    Song, Si Yeol; Ahn, Seung Do; Chung, Weon Kuu; Choi, Eun Kyung; Cho, Kwan Ho

    2015-01-01

    Purpose To develop a new on-line statistical program for the Korean Society for Radiation Oncology (KOSRO) to collect and extract medical data in radiation oncology more efficiently. Materials and Methods The statistical program is a web-based program. The directory was placed in a sub-folder of the KOSRO homepage and its web address is http://www.kosro.or.kr/asda. The server operating system is Linux and the web server is the Apache HTTP server. MySQL is adopted as the database (DB) server and PHP as the scripting language. Each ID and password is controlled independently, and all screen pages for data input or analysis are designed to be user-friendly. Scroll-down menus are used extensively for user convenience and for consistency of data analysis. Results Year of data is one of the top categories, and the main topics include human resources, equipment, clinical statistics, specialized treatment, and research achievement. Each topic or category has several subcategorized topics. A real-time on-line report of the analysis is produced immediately after each data entry, and the administrator is able to monitor the status of data input from each hospital. Backups of the data as spreadsheets can be accessed by the administrator and used for academic work by any member of the KOSRO. Conclusion The new on-line statistical program was developed to collect data from departments of radiation oncology nationwide. Its intuitive screens and consistent input structure are expected to promote data entry by member hospitals, and the annual statistics should be a cornerstone of advances in radiation oncology. PMID:26157684

  16. Development of new on-line statistical program for the Korean Society for Radiation Oncology.

    PubMed

    Song, Si Yeol; Ahn, Seung Do; Chung, Weon Kuu; Shin, Kyung Hwan; Choi, Eun Kyung; Cho, Kwan Ho

    2015-06-01

    To develop a new on-line statistical program for the Korean Society for Radiation Oncology (KOSRO) to collect and extract medical data in radiation oncology more efficiently. The statistical program is a web-based program. The directory was placed in a sub-folder of the KOSRO homepage and its web address is http://www.kosro.or.kr/asda. The server operating system is Linux and the web server is the Apache HTTP server. MySQL is adopted as the database (DB) server and PHP as the scripting language. Each ID and password is controlled independently, and all screen pages for data input or analysis are designed to be user-friendly. Scroll-down menus are used extensively for user convenience and for consistency of data analysis. Year of data is one of the top categories, and the main topics include human resources, equipment, clinical statistics, specialized treatment, and research achievement. Each topic or category has several subcategorized topics. A real-time on-line report of the analysis is produced immediately after each data entry, and the administrator is able to monitor the status of data input from each hospital. Backups of the data as spreadsheets can be accessed by the administrator and used for academic work by any member of the KOSRO. The new on-line statistical program was developed to collect data from departments of radiation oncology nationwide. Its intuitive screens and consistent input structure are expected to promote data entry by member hospitals, and the annual statistics should be a cornerstone of advances in radiation oncology.

  17. College Freshman with Disabilities, 1999: A Biennial Statistical Profile. Statistical Year 1998.

    ERIC Educational Resources Information Center

    Henderson, Cathy

    This monograph presents information on college freshmen with disabilities based on data collected by the Cooperative Institutional Research Program, a longitudinal study of the American higher education system that includes 469 institutions and 275,811 students. Section 1 presents highlights of the 1998 freshman survey and includes personal and…

  18. Running R Statistical Computing Environment Software on the Peregrine

    Science.gov Websites

    R is a collaborative project for the development of new statistical methodologies and enjoys a large user base. It supports multiple programming paradigms that can better leverage modern HPC systems; consult the CRAN task view for High Performance Computing for distribution details.

  19. NASA university program management information system, FY 1985

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The University Program Report provides current information and related statistics for approximately 4200 grants/contracts/cooperative agreements active during the reporting period. NASA Field Centers and certain Headquarters Program Offices provide funds for those research and development activities in universities which contribute to the mission needs of that particular NASA element. This annual report is one means of documenting the NASA-University relationship, frequently denoted, collectively, as NASA's University Program.

  20. NASA university program management information system, FY 1986

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The University Program Report provides current information and related statistics for approximately 4300 grants/contracts/cooperative agreements active during the report period. NASA Field centers and certain Headquarters Program Offices provide funds for those R&D activities in universities which contribute to the mission needs of that particular NASA element. This annual report is one means of documenting the NASA-university relationship, frequently denoted, collectively, as NASA's University Program.

  1. NASA University Program Management Information System: FY 1995

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The University Program Report, Fiscal Year 1995, provides current information and related statistics for grants/contracts/cooperative agreements active during the report period. NASA field centers and certain Headquarters program offices provide funds for those R&D activities in universities which contribute to the mission needs of that particular NASA element. This annual report is one means of documenting the NASA-university relationship, frequently denoted, collectively, as NASA's University Program.

  2. NASA University program management information system, FY 1993

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The University Program Report, Fiscal Year 1993, provides current information and related statistics for 7682 grants/contracts/cooperative agreements active during the report period. NASA field centers and certain Headquarters program offices provide funds for those R&D activities in universities which contribute to the mission needs of that particular NASA element. This annual report is one means of documenting the NASA-university relationship, frequently denoted, collectively, as NASA's University Program.

  3. NASA university program management information system, FY 1994

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The University Program report, Fiscal Year 1994, provides current information and related statistics for 7841 grants/contracts/cooperative agreements active during the reporting period. NASA field centers and certain Headquarters program offices provide funds for those activities in universities which contribute to the mission needs of that particular NASA element. This annual report is one means of documenting the NASA-university relationship, frequently denoted, collectively, as NASA's University Program.

  4. Evaluation of English Language Development Programs in the Santa Ana Unified School District. A Report on Data System Reliability and Statistical Modeling of Program Impacts.

    ERIC Educational Resources Information Center

    Mitchell, Douglas E.; Destino, Tom; Karam, Rita

    In response to concern about the effectiveness of programs for English-as-a-Second-Language students in California's schools, the Santa Ana Unified School District, in which over 80 percent of students are limited-English-proficient (LEP), conducted a study of both the operations and effectiveness of the district's language development program,…

  5. The Global Education Network for Retinopathy of Prematurity (Gen-Rop): Development, Implementation, and Evaluation of A Novel Tele-Education System (An American Ophthalmological Society Thesis).

    PubMed

    Chan, R V Paul; Patel, Samir N; Ryan, Michael C; Jonas, Karyn E; Ostmo, Susan; Port, Alexander D; Sun, Grace I; Lauer, Andreas K; Chiang, Michael F

    2015-01-01

    To describe the design, implementation, and evaluation of a tele-education system developed to improve diagnostic competency in retinopathy of prematurity (ROP) by ophthalmology residents. A secure Web-based tele-education system was developed utilizing a repository of over 2,500 unique image sets of ROP. For each image set used in the system, a reference standard ROP diagnosis was established. Performance by ophthalmology residents (postgraduate years 2 to 4) from the United States and Canada in taking the ROP tele-education program was prospectively evaluated. Residents were presented with image-based clinical cases of ROP during a pretest, posttest, and training chapters. Accuracy and reliability of ROP diagnosis (eg, plus disease, zone, stage, category) were determined using sensitivity, specificity, and the kappa statistic calculations of the results from the pretest and posttest. Fifty-five ophthalmology residents were provided access to the ROP tele-education program. Thirty-one ophthalmology residents completed the program. When all training levels were analyzed together, a statistically significant increase was observed in sensitivity for the diagnosis of plus disease, zone, stage, category, and aggressive posterior ROP (P<.05). Statistically significant changes in specificity for identification of stage 2 or worse (P=.027) and pre-plus (P=.028) were observed. A tele-education system for ROP education is effective in improving diagnostic accuracy of ROP by ophthalmology residents. This system may have utility in the setting of both healthcare and medical education reform by creating a validated method to certify telemedicine providers and educate the next generation of ophthalmologists.
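    The evaluation metrics named in this record (sensitivity, specificity, and the kappa statistic) can be computed from binary gradings as in the hedged sketch below; the reference-standard and resident gradings shown are invented illustration data, not results from the study.

        def sensitivity_specificity(truth, graded):
            """Sensitivity and specificity of binary gradings against a reference standard."""
            tp = sum(1 for t, g in zip(truth, graded) if t and g)
            tn = sum(1 for t, g in zip(truth, graded) if not t and not g)
            fp = sum(1 for t, g in zip(truth, graded) if not t and g)
            fn = sum(1 for t, g in zip(truth, graded) if t and not g)
            return tp / (tp + fn), tn / (tn + fp)

        def cohens_kappa(a, b):
            """Cohen's kappa for two binary raters: (observed agreement - chance agreement) / (1 - chance)."""
            n = len(a)
            po = sum(1 for x, y in zip(a, b) if x == y) / n
            pa1, pb1 = sum(a) / n, sum(b) / n
            pe = pa1 * pb1 + (1 - pa1) * (1 - pb1)
            return (po - pe) / (1 - pe)

        # Invented illustration data: 1 = plus disease present, 0 = absent.
        reference = [1, 0, 1, 1, 0, 0, 1, 0]
        resident = [1, 0, 0, 1, 0, 1, 1, 0]
        sens, spec = sensitivity_specificity(reference, resident)
        print(f"sensitivity={sens:.2f} specificity={spec:.2f} kappa={cohens_kappa(reference, resident):.2f}")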

  6. Physician-directed software design: the role of utilization statistics and user input in enhancing HELP results review capabilities.

    PubMed

    Michael, P A

    1993-01-01

    The M.D. Rounds Report program was developed and implemented in June of 1992 as an adjunct to the HELP System at Rex Hospital. The program facilitates rapid access to information on allergies and current medications, laboratory results, radiology reports and therapist notes for a list of patients without physicians having to make additional menu or submenu selections. In planning for an upgrade of the program, utilization statistics and user feedback provided valuable information in terms of frequency of access, features used and unused, and the value of the program as a reporting tool in comparison to other online results reporting applications. A brief description of the functionality of the M.D. Rounds Report, evaluation of the program audit trail and user feedback, planned enhancements to the program, and a discussion of the prototyping and monitoring experience and the impact on future physician subsystem development will be presented.

  7. Adult and Community Education: An Overview. Australian Vocational Education and Training Statistics, 2001.

    ERIC Educational Resources Information Center

    Australian National Training Authority, Melbourne.

    In 2001, around 1,200 of 6,700 providers in the Australian public vocational education and training (VET) system delivered adult and community education (ACE) programs, which attracted 497,500 people. ACE programs accounted for 20.7 million hours of training and 972,500 subject enrolments. Vocational ACE programs accounted for 238,700 students,…

  8. A PROPOSED CHEMICAL INFORMATION AND DATA SYSTEM. VOLUME I.

    DTIC Science & Technology

    CHEMICAL COMPOUNDS, *DATA PROCESSING, *INFORMATION RETRIEVAL, *CHEMICAL ANALYSIS, INPUT OUTPUT DEVICES, COMPUTER PROGRAMMING, CLASSIFICATION...CONFIGURATIONS, DATA STORAGE SYSTEMS, ATOMS, MOLECULES, PERFORMANCE (ENGINEERING), MAINTENANCE, SUBJECT INDEXING, MAGNETIC TAPE, AUTOMATIC, MILITARY REQUIREMENTS, TYPEWRITERS, OPTICS, TOPOLOGY, STATISTICAL ANALYSIS, FLOW CHARTING.

  9. The National Streamflow Statistics Program: A Computer Program for Estimating Streamflow Statistics for Ungaged Sites

    USGS Publications Warehouse

    Ries(compiler), Kernell G.; With sections by Atkins, J. B.; Hummel, P.R.; Gray, Matthew J.; Dusenbury, R.; Jennings, M.E.; Kirby, W.H.; Riggs, H.C.; Sauer, V.B.; Thomas, W.O.

    2007-01-01

    The National Streamflow Statistics (NSS) Program is a computer program that should be useful to engineers, hydrologists, and others for planning, management, and design applications. NSS compiles all current U.S. Geological Survey (USGS) regional regression equations for estimating streamflow statistics at ungaged sites in an easy-to-use interface that operates on computers with Microsoft Windows operating systems. NSS expands on the functionality of the USGS National Flood Frequency Program, and replaces it. The regression equations included in NSS are used to transfer streamflow statistics from gaged to ungaged sites through the use of watershed and climatic characteristics as explanatory or predictor variables. Generally, the equations were developed on a statewide or metropolitan-area basis as part of cooperative study programs. Equations are available for estimating rural and urban flood-frequency statistics, such as the 100-year flood, for every state, for Puerto Rico, and for the island of Tutuila, American Samoa. Equations are available for estimating other statistics, such as the mean annual flow, monthly mean flows, flow-duration percentiles, and low-flow frequencies (such as the 7-day, 10-year low flow) for less than half of the states. All equations available for estimating streamflow statistics other than flood-frequency statistics assume rural (non-regulated, non-urbanized) conditions. The NSS output provides indicators of the accuracy of the estimated streamflow statistics. The indicators may include any combination of the standard error of estimate, the standard error of prediction, the equivalent years of record, or 90 percent prediction intervals, depending on what was provided by the authors of the equations. The program includes several other features that can be used only for flood-frequency estimation. These include the ability to generate flood-frequency plots, and plots of typical flood hydrographs for selected recurrence intervals, estimates of the probable maximum flood, extrapolation of the 500-year flood when an equation for estimating it is not available, and weighting techniques to improve flood-frequency estimates for gaging stations and ungaged sites on gaged streams. This report describes the regionalization techniques used to develop the equations in NSS and provides guidance on the applicability and limitations of the techniques. The report also includes a user's manual and a summary of equations available for estimating basin lagtime, which is needed by the program to generate flood hydrographs. The NSS software and accompanying database, and the documentation for the regression equations included in NSS, are available on the Web at http://water.usgs.gov/software/.
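    A hedged sketch of how an NSS-style regional regression equation is typically applied is shown below; the functional form Q = a * A^b * P^c, the coefficient values, and the log-space standard error are placeholders for illustration and do not correspond to any published USGS equation.

        def flood_quantile(drainage_area_mi2, mean_annual_precip_in, a=100.0, b=0.75, c=0.50):
            """Generic regional regression of the form Q = a * A^b * P^c (all coefficients are placeholders)."""
            return a * drainage_area_mi2 ** b * mean_annual_precip_in ** c

        def prediction_interval(q_estimate, std_error_log10=0.20, z=1.64):
            """Approximate 90 percent prediction interval when the standard error is in log10 units."""
            half_width = z * std_error_log10
            return q_estimate / 10 ** half_width, q_estimate * 10 ** half_width

        q100 = flood_quantile(drainage_area_mi2=52.0, mean_annual_precip_in=38.0)
        low, high = prediction_interval(q100)
        print(f"Q100 estimate {q100:.0f} cfs, 90% prediction interval ({low:.0f}, {high:.0f}) cfs")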

  10. 75 FR 10339 - Generalized System of Preferences (GSP): Announcing the Availability of Import Statistics...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-05

    ... OFFICE OF THE UNITED STATES TRADE REPRESENTATIVE Generalized System of Preferences (GSP... need limitations (CNLs) under the Generalized System of Preferences (GSP) program. The Office of the...; and (3) possible redesignations of articles currently not eligible for GSP benefits because they...

  11. Systems Analysis of NASA Aviation Safety Program: Final Report

    NASA Technical Reports Server (NTRS)

    Jones, Sharon M.; Reveley, Mary S.; Withrow, Colleen A.; Evans, Joni K.; Barr, Lawrence; Leone, Karen

    2013-01-01

    A three-month study (February to April 2010) of the NASA Aviation Safety (AvSafe) program was conducted. This study comprised three components: (1) a statistical analysis of currently available civilian subsonic aircraft data from the National Transportation Safety Board (NTSB), the Federal Aviation Administration (FAA), and the Aviation Safety Information Analysis and Sharing (ASIAS) system to identify any significant or overlooked aviation safety issues; (2) a high-level qualitative identification of future safety risks, with an assessment of the potential impact of the NASA AvSafe research on the National Airspace System (NAS) based on these risks; and (3) a detailed, top-down analysis of the NASA AvSafe program using an established and peer-reviewed systems analysis methodology. The statistical analysis identified the top aviation "tall poles" based on NTSB accident and FAA incident data from 1997 to 2006. A separate examination of medical helicopter accidents in the United States was also conducted. Multiple external sources were used to develop a compilation of ten "tall poles" in future safety issues/risks. The top-down analysis of the AvSafe was conducted by using a modification of the Gibson methodology. Of the 17 challenging safety issues that were identified, 11 were directly addressed by the AvSafe program research portfolio.

  12. Iowa's Community College Adult Literacy Annual Report. Program Year 2007, July 1, 2006-June 30, 2007

    ERIC Educational Resources Information Center

    Division of Community Colleges and Workforce Preparation, Iowa Department of Education, 2007

    2007-01-01

    This comprehensive document replaces the previously published Benchmark Report, Benchmark Report Executive Summary, Iowa's Community College Basic Literacy Skills Credential Report, Iowa GED Statistical Report, GED Annual Performance Report and Iowa's Adult Literacy Program National Reporting System Annual Performance Report (Graphic…

  13. Government-Funded Program Completions 2014. Preliminary. Australian Vocational Education and Training Statistics

    ERIC Educational Resources Information Center

    National Centre for Vocational Education Research (NCVER), 2015

    2015-01-01

    This publication provides data on Australian Qualifications Framework (AQF) programs completed from 2010 to 2014 in Australia's government-funded vocational education and training (VET) system (broadly defined as all activity delivered by government providers and government-funded activity delivered by community education and other registered…

  14. Australian Vocational Education and Training Statistics 1997 in Detail.

    ERIC Educational Resources Information Center

    National Centre for Vocational Education Research, Leabrook (Australia).

    This publication reflects the diversity of the Australian vocational education and training system at client, provider, and program levels. Following an introduction, section 2 presents a summary of data on: Australia by provider type; Australia, 1997 and 1996; states and territories, 1997; and activity in vocational programs. Section 3, arranged…

  15. Introduction to the JAWS Program

    NASA Technical Reports Server (NTRS)

    Mccarthy, John

    1987-01-01

    The JAWS Project is the Joint Airport Weather Studies project conceived in 1980 jointly by the National Center for Atmospheric Research and the Univ. of Chicago. The objectives of the program are threefold: (1) Basic scientific characterization of microbursts and the statistics of microburst occurrence; (2) Detection and warning, using the Low Level Wind Shear Alert System (LLWSAS) operation and performance; and (3) Doppler radar and airborne systems. These goals and the operation of the JAWS system in general are discussed in detail.

  16. Space shuttle solid rocket booster recovery system definition, volume 1

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The performance requirements, preliminary designs, and development program plans for an airborne recovery system for the space shuttle solid rocket booster are discussed. The analyses performed during the study phase of the program are presented. The basic considerations which established the system configuration are defined. A Monte Carlo statistical technique using random sampling of the probability distribution for the critical water impact parameters was used to determine the failure probability of each solid rocket booster component as a function of impact velocity and component strength capability.
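    The Monte Carlo idea described above can be sketched as follows; the distributions, the notional load model, and all parameter values are assumptions made purely for illustration, not values from the study.

        import random

        def failure_probability(n_trials=100_000, seed=1):
            """Estimate P(impact load exceeds component strength) by random sampling of assumed distributions."""
            rng = random.Random(seed)
            failures = 0
            for _ in range(n_trials):
                impact_velocity = rng.gauss(75.0, 10.0)        # ft/s, assumed distribution
                load = 0.5 * 0.01 * impact_velocity ** 2       # notional load model (assumed)
                strength = rng.gauss(40.0, 5.0)                # component capability, assumed
                if load > strength:
                    failures += 1
            return failures / n_trials

        print(f"estimated failure probability: {failure_probability():.4f}")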

  17. HyperCard Monitor System.

    ERIC Educational Resources Information Center

    Harris, Julian; Maurer, Hermann

    An investigation into high-level event monitoring within the scope of a well-known multimedia application, HyperCard (a program on the Macintosh computer), is carried out. A monitoring system is defined as a system which automatically monitors usage of some activity and gathers statistics based on what it has observed. Monitor systems can give the…

  18. An automated library financial management system

    NASA Technical Reports Server (NTRS)

    Dueker, S.; Gustafson, L.

    1977-01-01

    A computerized library acquisition system developed for control of informational materials acquired at NASA Ames Research Center is described. The system monitors the acquisition of both library and individual researchers' orders and supplies detailed financial, statistical, and bibliographical information. Applicability for other libraries and the future availability of the program is discussed.

  19. Analysis of USAREUR Family Housing.

    DTIC Science & Technology

    1985-04-01

    Standard Installation/Division Personnel System; SJA, Staff Judge Advocate; SPSS, Statistical Package for the... for Projecting Family Housing Requirements. Attempts to define USAREUR's programmable family housing deficit based on the FHS have caused anguish... responses were analyzed using the Statistical Package for the Social Sciences (SPSS) computer program. Annex E: Response to ESC Housing Questionnaire.

  20. Factorial analysis of trihalomethanes formation in drinking water.

    PubMed

    Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James

    2010-06-01

    Disinfection of drinking water reduces pathogenic infection, but may pose risks to human health through the formation of disinfection byproducts. The effects of different factors on the formation of trihalomethanes were investigated using a statistically designed experimental program, and a predictive model for trihalomethanes formation was developed. Synthetic water samples with different factor levels were produced, and trihalomethanes concentrations were measured. A replicated fractional factorial design with center points was performed, and significant factors were identified through statistical analysis. A second-order trihalomethanes formation model was developed from 92 experiments, and the statistical adequacy was assessed through appropriate diagnostics. This model was validated using additional data from the Drinking Water Surveillance Program database and was applied to the Smiths Falls water supply system in Ontario, Canada. The model predictions were correlated strongly to the measured trihalomethanes, with correlations of 0.95 and 0.91, respectively. The resulting model can assist in analyzing risk-cost tradeoffs in the design and operation of water supply systems.
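    As an illustration of fitting a second-order response-surface model to factorial-design data (not the study's data or software), the sketch below uses ordinary least squares on a small synthetic central-composite design with two coded factors.

        import numpy as np

        # Coded factor levels for a small two-factor central-composite design (illustrative only).
        x1 = np.array([-1, -1, 1, 1, -1.414, 1.414, 0, 0, 0, 0])       # e.g. chlorine dose (coded)
        x2 = np.array([-1, 1, -1, 1, 0, 0, -1.414, 1.414, 0, 0])       # e.g. reaction time (coded)
        y = np.array([30.0, 45.0, 50.0, 80.0, 35.0, 70.0, 38.0, 66.0, 52.0, 50.0])  # synthetic THM response

        # Second-order model: y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
        X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        print("fitted coefficients:", np.round(coef, 2))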

  1. 36 CFR 1008.2 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... photograph. Related definitions include: (1) System of records means a group of any records under the control... means records used for personnel management programs or processes such as staffing, employee development, retirement, and grievances and appeals. (4) Statistical records means records in a system of records...

  2. Experiences with hypercube operating system instrumentation

    NASA Technical Reports Server (NTRS)

    Reed, Daniel A.; Rudolph, David C.

    1989-01-01

    The difficulties in conceptualizing the interactions among a large number of processors make it difficult both to identify the sources of inefficiencies and to determine how a parallel program could be made more efficient. This paper describes an instrumentation system that can trace the execution of distributed memory parallel programs by recording the occurrence of parallel program events. The resulting event traces can be used to compile summary statistics that provide a global view of program performance. In addition, visualization tools permit the graphic display of event traces. Visual presentation of performance data is particularly useful, indeed, necessary for large-scale parallel computers; the enormous volume of performance data mandates visual display.
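    The event-trace-to-summary-statistics idea can be sketched as follows; the tracer class, event names, and timing calls are assumptions for illustration and do not reflect the instrumentation system's actual trace format.

        import time
        from collections import defaultdict
        from statistics import mean

        class EventTracer:
            """Record begin/end event pairs per processor and reduce them to summary statistics."""
            def __init__(self):
                self.open_events = {}               # (processor, event) -> start time
                self.durations = defaultdict(list)  # event -> list of observed durations

            def begin(self, proc, event):
                self.open_events[(proc, event)] = time.perf_counter()

            def end(self, proc, event):
                start = self.open_events.pop((proc, event))
                self.durations[event].append(time.perf_counter() - start)

            def summary(self):
                return {ev: (len(d), mean(d)) for ev, d in self.durations.items()}

        tracer = EventTracer()
        tracer.begin(0, "message_send")
        time.sleep(0.01)                            # stand-in for the traced activity
        tracer.end(0, "message_send")
        print(tracer.summary())                     # {'message_send': (count, mean duration in seconds)}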

  3. The Global Education Network for Retinopathy of Prematurity (Gen-Rop): Development, Implementation, and Evaluation of A Novel Tele-Education System (An American Ophthalmological Society Thesis)

    PubMed Central

    Chan, R.V. Paul; Patel, Samir N.; Ryan, Michael C.; Jonas, Karyn E.; Ostmo, Susan; Port, Alexander D.; Sun, Grace I.; Lauer, Andreas K.; Chiang, Michael F.

    2015-01-01

    Purpose: To describe the design, implementation, and evaluation of a tele-education system developed to improve diagnostic competency in retinopathy of prematurity (ROP) by ophthalmology residents. Methods: A secure Web-based tele-education system was developed utilizing a repository of over 2,500 unique image sets of ROP. For each image set used in the system, a reference standard ROP diagnosis was established. Performance by ophthalmology residents (postgraduate years 2 to 4) from the United States and Canada in taking the ROP tele-education program was prospectively evaluated. Residents were presented with image-based clinical cases of ROP during a pretest, posttest, and training chapters. Accuracy and reliability of ROP diagnosis (eg, plus disease, zone, stage, category) were determined using sensitivity, specificity, and the kappa statistic calculations of the results from the pretest and posttest. Results: Fifty-five ophthalmology residents were provided access to the ROP tele-education program. Thirty-one ophthalmology residents completed the program. When all training levels were analyzed together, a statistically significant increase was observed in sensitivity for the diagnosis of plus disease, zone, stage, category, and aggressive posterior ROP (P<.05). Statistically significant changes in specificity for identification of stage 2 or worse (P=.027) and pre-plus (P=.028) were observed. Conclusions: A tele-education system for ROP education is effective in improving diagnostic accuracy of ROP by ophthalmology residents. This system may have utility in the setting of both healthcare and medical education reform by creating a validated method to certify telemedicine providers and educate the next generation of ophthalmologists. PMID:26538772

  4. The Politics and Statistics of Value-Added Modeling for Accountability of Teacher Preparation Programs

    ERIC Educational Resources Information Center

    Lincove, Jane Arnold; Osborne, Cynthia; Dillon, Amanda; Mills, Nicholas

    2014-01-01

    Despite questions about validity and reliability, the use of value-added estimation methods has moved beyond academic research into state accountability systems for teachers, schools, and teacher preparation programs (TPPs). Prior studies of value-added measurement for TPPs test the validity of researcher-designed models and find that measuring…

  5. Graduate Level Research Methods and Statistics Courses: The Perspective of an Instructor

    ERIC Educational Resources Information Center

    Mulvenon, Sean W.; Wang, Victor C. X.

    2015-01-01

    The goal of an educational system or degree program is to "educate" students. This immediately raises the question of what it means to "educate" students. All academic institutions, degree programs and content areas are typically expected to answer this question and establish appropriate academic expectations both within…

  6. National visitor use monitoring implementation in Alaska.

    Treesearch

    Eric M. White; Joshua B. Wilson

    2008-01-01

    The USDA Forest Service implemented the National Visitor Use Monitoring (NVUM) program across the entire National Forest System (NFS) in calendar year 2000. The primary objective of the NVUM program is to develop reliable estimates of recreation use on NFS lands via a nationally consistent, statistically valid sampling approach. Secondary objectives of NVUM are to...

  7. From micro to mainframe. A practical approach to perinatal data processing.

    PubMed

    Yeh, S Y; Lincoln, T

    1985-04-01

    A new, practical approach to perinatal data processing for a large obstetric population is described. This was done with a microcomputer for data entry and a mainframe computer for data reduction. The Screen Oriented Data Access (SODA) program was used to generate the data entry form and to input data into the Apple II Plus computer. Data were stored on diskettes and transmitted through a modem and telephone line to the IBM 370/168 computer. The Statistical Analysis System (SAS) program was used for statistical analyses and report generation. This approach was found to be most practical, flexible, and economical.

  8. Evaluating bacterial gene-finding HMM structures as probabilistic logic programs.

    PubMed

    Mørk, Søren; Holmes, Ian

    2012-03-01

    Probabilistic logic programming offers a powerful way to describe and evaluate structured statistical models. To investigate the practicality of probabilistic logic programming for structure learning in bioinformatics, we undertook a simplified bacterial gene-finding benchmark in PRISM, a probabilistic dialect of Prolog. We evaluate Hidden Markov Model structures for bacterial protein-coding gene potential, including a simple null model structure, three structures based on existing bacterial gene finders, and two novel model structures. We test standard versions as well as ADPH length modeling and three-state versions of the five model structures. The models are all represented as probabilistic logic programs and evaluated using the PRISM machine learning system in terms of statistical information criteria and gene-finding prediction accuracy, in two bacterial genomes. Neither of our implementations of the two most commonly used model structures performs best in terms of statistical information criteria or prediction performance, suggesting that better-fitting models might be achievable. The source code of all PRISM models, data and additional scripts are freely available for download at: http://github.com/somork/codonhmm. Supplementary data are available at Bioinformatics online.
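    As a hedged illustration of scoring a candidate model structure by likelihood and an information criterion (written here in Python rather than PRISM, with a toy two-state HMM unrelated to the gene-finding models in the paper), consider:

        import math

        def forward_loglik(obs, init, trans, emit):
            """Log-likelihood of a discrete observation sequence under a small HMM (scaled forward algorithm)."""
            n_states = len(init)
            alpha = [init[s] * emit[s][obs[0]] for s in range(n_states)]
            scale = sum(alpha)
            loglik = math.log(scale)
            alpha = [a / scale for a in alpha]
            for o in obs[1:]:
                alpha = [sum(alpha[p] * trans[p][s] for p in range(n_states)) * emit[s][o]
                         for s in range(n_states)]
                scale = sum(alpha)
                loglik += math.log(scale)
                alpha = [a / scale for a in alpha]
            return loglik

        def bic(loglik, n_params, n_obs):
            """Bayesian information criterion: lower values indicate a better-fitting, more parsimonious model."""
            return -2.0 * loglik + n_params * math.log(n_obs)

        # Toy two-state model over a binary alphabet; all numbers are illustrative only.
        obs = [0, 1, 1, 0, 1, 1, 1, 0, 0, 1]
        init = [0.5, 0.5]
        trans = [[0.9, 0.1], [0.2, 0.8]]
        emit = [[0.7, 0.3], [0.2, 0.8]]          # emit[state][symbol]
        ll = forward_loglik(obs, init, trans, emit)
        print(f"log-likelihood = {ll:.3f}, BIC = {bic(ll, n_params=6, n_obs=len(obs)):.3f}")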

  9. A statistical data analysis and plotting program for cloud microphysics experiments

    NASA Technical Reports Server (NTRS)

    Jordan, A. J.

    1981-01-01

    The analysis software developed for atmospheric cloud microphysics experiments conducted in the laboratory as well as aboard a KC-135 aircraft is described. A group of four programs was developed and implemented on a Hewlett Packard 1000 series F minicomputer running under HP's RTE-IVB operating system. The programs control and read data from a MEMODYNE Model 3765-8BV cassette recorder, format the data on the Hewlett Packard disk subsystem, and generate statistical data (mean, variance, standard deviation) and voltage and engineering unit plots on a user selected plotting device. The programs are written in HP FORTRAN IV and HP ASSEMBLY Language with the graphics software using the HP 1000 Graphics. The supported plotting devices are the HP 2647A graphics terminal, the HP 9872B four color pen plotter, and the HP 2608A matrix line printer.
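    A modern minimal analogue of the reduction step described above might look like the sketch below; the input file name, calibration constants, and use of matplotlib are assumptions, since the original ran on HP 1000 hardware with HP graphics software.

        import statistics
        import matplotlib.pyplot as plt    # assumes matplotlib is installed

        CAL_SLOPE, CAL_OFFSET = 2.5, -0.1  # hypothetical volts-to-engineering-units calibration

        # "channel1.txt" is a placeholder: one voltage sample per line.
        with open("channel1.txt") as f:
            volts = [float(line) for line in f if line.strip()]
        eng = [CAL_SLOPE * v + CAL_OFFSET for v in volts]

        print("mean =", statistics.mean(eng))
        print("variance =", statistics.variance(eng))
        print("std dev =", statistics.stdev(eng))

        plt.plot(eng)                       # engineering-unit plot versus sample number
        plt.xlabel("sample number")
        plt.ylabel("engineering units")
        plt.savefig("channel1.png")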

  10. Air Combat Training: Good Stick Index Validation. Final Report for Period 3 April 1978-1 April 1979.

    ERIC Educational Resources Information Center

    Moore, Samuel B.; And Others

    A study was conducted to investigate and statistically validate a performance measuring system (the Good Stick Index) in the Tactical Air Command Combat Engagement Simulator I (TAC ACES I) Air Combat Maneuvering (ACM) training program. The study utilized a twelve-week sample of eighty-nine student pilots to statistically validate the Good Stick…

  11. Missile Systems Maintenance, AFSC 411XOB/C.

    DTIC Science & Technology

    1988-04-01

    ...technician's rating. A statistical measurement of their agreement, known as the interrater reliability (as assessed through components of variance of... senior technician's ratings... Descriptors: FABRICATION, TRANSISTORS, *INPUT/OUTPUT (PERIPHERAL) DEVICES, SOLID-STATE SPECIAL PURPOSE DEVICES, COMPUTER MICRO PROCESSORS AND PROGRAMS, POWER SUPPLIES.

  12. A data storage, retrieval and analysis system for endocrine research. [for Skylab

    NASA Technical Reports Server (NTRS)

    Newton, L. E.; Johnston, D. A.

    1975-01-01

    This retrieval system builds, updates, retrieves, and performs basic statistical analyses on blood, urine, and diet parameters for the M071 and M073 Skylab and Apollo experiments. This system permits data entry from cards to build an indexed sequential file. Programs are easily modified for specialized analyses.

  13. Proceedings of the NASTRAN (Tradename) Users’ Colloquium (15th) Held in Kansas City, Missouri on 4-8 May 1987

    DTIC Science & Technology

    1987-08-01

    HVAC duct hanger system over an extensive frequency range. The finite element, component mode synthesis, and statistical energy analysis methods are...800-5,000 Hz) analysis was conducted with Statistical Energy Analysis (SEA) coupled with a closed-form harmonic beam analysis program. These...resonances may be obtained by using a finer frequency increment. Statistical Energy Analysis The basic assumption used in SEA analysis is that within each band

  14. Compliance program data management system for The Idaho National Engineering Laboratory/Environmental Protection Agency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hertzler, C.L.; Poloski, J.P.; Bates, R.A.

    1988-01-01

    The Compliance Program Data Management System (DMS) developed at the Idaho National Engineering Laboratory (INEL) validates and maintains the integrity of data collected to support the Consent Order and Compliance Agreement (COCA) between the INEL and the Environmental Protection Agency (EPA). The system uses dBase III Plus programs and dBase III Plus in an interactive mode to enter, store, validate, manage, and retrieve analytical information provided on EPA Contract Laboratory Program (CLP) forms and CLP forms modified to accommodate 40 CFR 264 Appendix IX constituent analyses. Data analysis and presentation is performed utilizing SAS, a statistical analysis software program. Archiving of data and results is performed at appropriate stages of data management. The DMS is useful for sampling and analysis programs where adherence to EPA CLP protocol, along with maintenance and retrieval of waste site investigation sampling results, is desired or requested. 3 refs.

  15. [Development and application of emergency medical information management system].

    PubMed

    Wang, Fang; Zhu, Baofeng; Chen, Jianrong; Wang, Jian; Gu, Chaoli; Liu, Buyun

    2011-03-01

    To meet the needs of clinical practice in rescuing critically ill patients, an information management system for emergency medicine was developed. Microsoft Visual FoxPro, one of Microsoft's visual programming tools, was used to develop the computer-aided system, which includes the information management system of emergency medicine. The system mainly consists of a statistical analysis module, a quality control module for emergency rescue, an emergency rescue workflow module, an emergency rescue nursing care module, and a rescue training module. It can realize the system management of emergency medicine and can process and analyze emergency statistical data. This system is practical: it can optimize the emergency clinical pathway and meet the needs of clinical rescue.

  16. Radial velocity detection of extra-solar planetary systems

    NASA Technical Reports Server (NTRS)

    Cochran, William D.

    1991-01-01

    The goal of this program was to detect planetary systems in orbit around other stars through the ultra high precision measurement of the orbital motion of the star around the star-planet barycenter. The survey of 33 nearby solar-type stars is the essential first step in understanding the overall problem of planet formation. The program will accumulate the necessary statistics to determine the frequency of planet formation as a function of stellar mass, age, and composition.

  17. Physician-directed software design: the role of utilization statistics and user input in enhancing HELP results review capabilities.

    PubMed Central

    Michael, P. A.

    1993-01-01

    The M.D. Rounds Report program was developed and implemented in June of 1992 as an adjunct to the HELP System at Rex Hospital. The program facilitates rapid access to information on allergies and current medications, laboratory results, radiology reports and therapist notes for a list of patients without physicians having to make additional menu or submenu selections. In planning for an upgrade of the program, utilization statistics and user feedback provided valuable information in terms of frequency of access, features used and unused, and the value of the program as a reporting tool in comparison to other online results reporting applications. A brief description of the functionality of the M.D. Rounds Report, evaluation of the program audit trail and user feedback, planned enhancements to the program, and a discussion of the prototyping and monitoring experience and the impact on future physician subsystem development will be presented. PMID:8130443

  18. VAPEPS user's reference manual, version 5.0

    NASA Technical Reports Server (NTRS)

    Park, D. M.

    1988-01-01

    This is the reference manual for the VibroAcoustic Payload Environment Prediction System (VAPEPS). The system consists of a computer program and a vibroacoustic database. The purpose of the system is to collect measurements of vibroacoustic data taken from flight events and ground tests, and to retrieve this data and provide a means of using the data to predict future payload environments. This manual describes the operating language of the program. Topics covered include database commands, Statistical Energy Analysis (SEA) prediction commands, stress prediction command, and general computational commands.

  19. A PERFORMANCE EVALUATION OF THE ETA- CMAQ AIR QUALITY FORECAST SYSTEM FOR THE SUMMER OF 2005

    EPA Science Inventory

    This poster presents an evaluation of the Eta-CMAQ Air Quality Forecast System's experimental domain using O3 observations obtained from EPA's AIRNOW program and a suite of statistical metrics examining both discrete and categorical forecasts.

  20. Software Analytical Instrument for Assessment of the Process of Casting Slabs

    NASA Astrophysics Data System (ADS)

    Franěk, Zdeněk; Kavička, František; Štětina, Josef; Masarik, Miloš

    2010-06-01

    The paper describes the original design and functionality of the software for assessing the process of casting slabs. The program system LITIOS was developed and implemented at EVRAZ Vitkovice Steel Ostrava on the continuous casting of steel equipment (hereafter ECC). This program system works on a data warehouse of technological casting parameters and slab quality parameters. It enables an ECC technologist to analyze the course of casting a melt and, using statistical methods, to determine the influence of individual technological parameters on the quality of the final slabs. The system also enables long-term monitoring and optimization of production.

  1. Building a Performance-Based Assessment System To Diagnose Strengths and Weaknesses in Reading Achievement.

    ERIC Educational Resources Information Center

    Hennings, Sara S.; Hughes, Kay E.

    This paper provides a brief description of the development of the Diagnostic Assessments of Reading with Trial Teaching Strategies (DARTTS) program by F. G. Roswell and J. S. Chall. It also describes the editorial and statistical procedures that were used to validate the program for determining students' strengths and weaknesses in important areas…

  2. An exploratory investigation of weight estimation techniques for hypersonic flight vehicles

    NASA Technical Reports Server (NTRS)

    Cook, E. L.

    1981-01-01

    The three basic methods of weight prediction (fixed-fraction, statistical correlation, and point stress analysis) and some of the computer programs that have been developed to implement them are discussed. A modified version of the WAATS (Weights Analysis of Advanced Transportation Systems) program is presented, along with input data forms and an example problem.

  3. Improving the Effectiveness of Program Managers

    DTIC Science & Technology

    2006-05-03

    Improving the Effectiveness of Program Managers. Systems and Software Technology Conference, Salt Lake City, Utah, May 3, 2006. Presented by GAO's... Companies' best practices: Motorola, Caterpillar, Toyota, FedEx, NCR Teradata, Boeing, Hughes Space and Communications. Disciplined software and management... and total ownership costs; collection of metrics data to improve software reliability; technology readiness levels and design maturity; statistical...

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goltz, G.; Kaiser, L.M.; Weiner, H.

    A major mission of the U.S. Coast Guard is the task of providing and maintaining Maritime Aids to Navigation. These aids are located on and near the coastline and inland waters of the United States and its possessions. A computer program, Design Synthesis and Performance Analysis (DSPA), has been developed by the Jet Propulsion Laboratory to demonstrate the feasibility of low-cost solar array/battery power systems for use on flashing lamp buoys. To provide detailed, realistic temperature, wind, and solar insolation data for analysis of the flashing lamp buoy power systems, the two DSPA support computer program sets, MERGE and STAT, were developed. A general description of these two packages is presented in this program summary report. The MERGE program set will enable the Coast Guard to combine temperature and wind velocity data (NOAA TDF-14 tapes) with solar insolation data (NOAA DECK-280 tapes) onto a single sequential MERGE file containing up to 12 years of hourly observations. This MERGE file can then be used as direct input to the DSPA program. The STAT program set will enable a statistical analysis to be performed of the MERGE data and produce high or low or mean profiles of the data and/or do a worst case analysis. The STAT output file consists of a one-year set of hourly statistical weather data which can be used as input to the DSPA program.
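    A hedged sketch of the STAT-style statistical reduction follows; it assumes the merged hourly observations have already been exported to a simple CSV file (the placeholder merged_weather.csv), whereas the actual programs read NOAA TDF-14 and DECK-280 tape formats.

        import csv
        from collections import defaultdict

        def hourly_profiles(path):
            """Reduce merged hourly weather records to low/mean/high profiles per (month, hour).
            Assumes a CSV with columns year, month, day, hour, temperature; the real inputs were NOAA tapes."""
            buckets = defaultdict(list)
            with open(path, newline="") as f:
                for row in csv.DictReader(f):
                    buckets[(int(row["month"]), int(row["hour"]))].append(float(row["temperature"]))
            return {key: (min(v), sum(v) / len(v), max(v)) for key, v in buckets.items()}

        profiles = hourly_profiles("merged_weather.csv")       # placeholder file name
        low, mean, high = profiles[(7, 12)]                    # July, noon
        print(f"July, hour 12: low={low:.1f} mean={mean:.1f} high={high:.1f}")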

  5. Image Analysis Program for Measuring Particles with the Zeiss CSM 950 Scanning Electron Microscope (SEM)

    DTIC Science & Technology

    1990-01-01

    TECHNICAL REPORT NATICK/TR-90/014: Image Analysis Program for Measuring Particles with the Zeiss CSM 950 Scanning... The image analysis program for measuring particles using the Zeiss CSM 950/Kontron system is as follows: A>CSM calls the image analysis program. Press D to... List of tables: 1. Image Analysis Program for Measuring Spherical Particles; 2. Printout of Statistical Data from Table 1...

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    This document comprises Pacific Northwest National Laboratory's report for Fiscal Year 1996 on research and development programs. The document contains 161 project summaries in 16 areas of research and development. The 16 areas of research and development reported on are: atmospheric sciences, biotechnology, chemical instrumentation and analysis, computer and information science, ecological science, electronics and sensors, health protection and dosimetry, hydrological and geologic sciences, marine sciences, materials science and engineering, molecular science, process science and engineering, risk and safety analysis, socio-technical systems analysis, statistics and applied mathematics, and thermal and energy systems. In addition, this report provides an overview of the research and development program, program management, program funding, and Fiscal Year 1997 projects.

  7. AutoBayes Program Synthesis System System Internals

    NASA Technical Reports Server (NTRS)

    Schumann, Johann Martin

    2011-01-01

    This lecture combines the theoretical background of schema-based program synthesis with the hands-on study of a powerful, open-source program synthesis system (AutoBayes). Schema-based program synthesis is a popular approach toward program synthesis. The lecture will provide an introduction to this topic and discuss how this technology can be used to generate customized algorithms. The synthesis of advanced numerical algorithms requires the availability of a powerful symbolic (algebra) system. Its task is to symbolically solve equations, simplify expressions, or symbolically calculate derivatives (among others) such that the synthesized algorithms become as efficient as possible. We will discuss the use and importance of the symbolic system for synthesis. Any synthesis system is a large and complex piece of code. In this lecture, we will study AutoBayes in detail. AutoBayes has been developed at NASA Ames and has been made open source. It takes a compact statistical specification and generates a customized data analysis algorithm (in C/C++) from it. AutoBayes is written in SWI Prolog and uses many concepts from rewriting, logic, functional, and symbolic programming. We will discuss the system architecture, the schema library, and the extensive support infrastructure. Practical hands-on experiments and exercises will enable the student to get insight into a realistic program synthesis system and will provide the knowledge to use, modify, and extend AutoBayes.

  8. Methods for estimating magnitude and frequency of floods in Arizona, developed with unregulated and rural peak-flow data through water year 2010

    USGS Publications Warehouse

    Paretti, Nicholas V.; Kennedy, Jeffrey R.; Turney, Lovina A.; Veilleux, Andrea G.

    2014-01-01

    The regional regression equations were integrated into the U.S. Geological Survey’s StreamStats program. The StreamStats program is a national map-based web application that allows the public to easily access published flood frequency and basin characteristic statistics. The interactive web application allows a user to select a point within a watershed (gaged or ungaged) and retrieve flood-frequency estimates derived from the current regional regression equations and geographic information system data within the selected basin. StreamStats provides users with an efficient and accurate means for retrieving the most up to date flood frequency and basin characteristic data. StreamStats is intended to provide consistent statistics, minimize user error, and reduce the need for large datasets and costly geographic information system software.

  9. A Study of Persistence in the Northeast State Community College Health-Related Programs of Study

    NASA Astrophysics Data System (ADS)

    Hamilton, Allana R.

    2011-12-01

    The purpose of the study was to identify factors that were positively associated with persistence to graduation by students who were admitted to Health-Related Programs leading to the associate of applied science degree at Northeast State Community College. The criterion variable in this study was persistence, which was categorized into two groups: the persister group (program completers) and the nonpersister group (program noncompleters). The predictor variables included gender, ethnic origin, first- (or nonfirst-) generation-student status, age, specific major program of study, number of remedial and/or developmental courses taken, grades in selected courses (human anatomy and physiology I and II, microbiology, probability and statistics, composition I, clinical I, clinical II), and number of mathematics and science credit hours earned prior to program admission. The data for this ex post facto nonexperimental design were located in Northeast State's student records database, Banner Information System. The subjects of the study were students who had been admitted into Health-Related Programs of study at a 2-year public community college between the years of 1999 and 2008. The population size was 761. Health-Related Programs of study included Dental Assisting, Cardiovascular Technology, Emergency Medical Technology -- Paramedic, Medical Laboratory Technology, Nursing, and Surgical Technology. A combination of descriptive and inferential statistics was used in the analysis of the data. Descriptive statistics included measures of central tendency, standard deviations, and percentages, as appropriate. Independent samples t-tests were used to determine if the mean of a variable on one group of subjects was different from the mean of the same variable with a different group of subjects. It was found that gender, ethnic origin, first-generation status, and age were not significantly associated with persistence to graduation. However, findings did reveal a statistically significant difference in persistence rates among the specific Health-Related Programs of study. Academic data, including grades in human anatomy and physiology I, probability and statistics, and composition I, suggested a relationship between the course grade and persistence to graduation. Findings also revealed a relationship between the number of math and science courses completed and students' persistence to graduation.
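    The independent-samples t-test used in the study can be illustrated with SciPy as below; the grade values are fabricated example data, not records from the Banner Information System.

        from scipy import stats   # assumes SciPy is installed

        # Illustrative course grades (4.0 scale) for the two groups; not the study's data.
        persisters = [3.2, 3.5, 2.9, 3.8, 3.1, 3.6, 3.4]
        nonpersisters = [2.4, 2.8, 3.0, 2.2, 2.9, 2.5]

        # Welch's t-test (does not assume equal variances).
        t_stat, p_value = stats.ttest_ind(persisters, nonpersisters, equal_var=False)
        print(f"t = {t_stat:.2f}, p = {p_value:.4f}")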

  10. Evaluating and implementing temporal, spatial, and spatio-temporal methods for outbreak detection in a local syndromic surveillance system.

    PubMed

    Mathes, Robert W; Lall, Ramona; Levin-Rector, Alison; Sell, Jessica; Paladini, Marc; Konty, Kevin J; Olson, Don; Weiss, Don

    2017-01-01

    The New York City Department of Health and Mental Hygiene has operated an emergency department syndromic surveillance system since 2001, using temporal and spatial scan statistics run on a daily basis for cluster detection. Since the system was originally implemented, a number of new methods have been proposed for use in cluster detection. We evaluated six temporal and four spatial/spatio-temporal detection methods using syndromic surveillance data spiked with simulated injections. The algorithms were compared on several metrics, including sensitivity, specificity, positive predictive value, coherence, and timeliness. We also evaluated each method's implementation, programming time, run time, and the ease of use. Among the temporal methods, at a set specificity of 95%, a Holt-Winters exponential smoother performed the best, detecting 19% of the simulated injects across all shapes and sizes, followed by an autoregressive moving average model (16%), a generalized linear model (15%), a modified version of the Early Aberration Reporting System's C2 algorithm (13%), a temporal scan statistic (11%), and a cumulative sum control chart (<2%). Of the spatial/spatio-temporal methods we tested, a spatial scan statistic detected 3% of all injects, a Bayes regression found 2%, and a generalized linear mixed model and a space-time permutation scan statistic detected none at a specificity of 95%. Positive predictive value was low (<7%) for all methods. Overall, the detection methods we tested did not perform well in identifying the temporal and spatial clusters of cases in the inject dataset. The spatial scan statistic, our current method for spatial cluster detection, performed slightly better than the other tested methods across different inject magnitudes and types. Furthermore, we found the scan statistics, as applied in the SaTScan software package, to be the easiest to program and implement for daily data analysis.
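
    Of the temporal methods compared above, the best performer was a Holt-Winters exponential smoother. The sketch below shows one plausible way to use such a smoother for daily alerting: fit it to a training window, forecast ahead, and flag counts that exceed the forecast by a residual-based threshold. The synthetic counts, the 3-standard-deviation rule, and the 7-day seasonality are assumptions for illustration, not the evaluation's actual configuration.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Synthetic daily syndrome counts with a weekly cycle (illustrative only).
rng = np.random.default_rng(0)
days = pd.date_range("2016-01-01", periods=180, freq="D")
baseline = 50 + 10 * np.sin(2 * np.pi * days.dayofweek / 7)
counts = pd.Series(rng.poisson(baseline), index=days, dtype=float)

train, test = counts[:150], counts[150:]
model = ExponentialSmoothing(train, trend="add", seasonal="add",
                             seasonal_periods=7).fit()
forecast = model.forecast(len(test))

# Flag days whose observed count exceeds the forecast by more than
# 3 standard deviations of the training residuals (assumed threshold).
resid_sd = (train - model.fittedvalues).std()
alarms = test[test > forecast + 3 * resid_sd]
print(alarms)
```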

  11. General purpose simulation system of the data management system for Space Shuttle mission 18

    NASA Technical Reports Server (NTRS)

    Bengtson, N. M.; Mellichamp, J. M.; Smith, O. C.

    1976-01-01

    A simulation program for the flow of data through the Data Management System of Spacelab and Space Shuttle was presented. The science, engineering, command and guidance, navigation and control data were included. The programming language used was General Purpose Simulation System V (OS). The science and engineering data flow was modeled from its origin at the experiments and subsystems to transmission from Space Shuttle. Command data flow was modeled from the point of reception onboard and from the CDMS Control Panel to the experiments and subsystems. The GN&C data flow model handled data between the General Purpose Computer and the experiments and subsystems. Mission 18 was the particular flight chosen for simulation. The general structure of the program is presented, followed by a user's manual. Input data required to make runs are discussed followed by identification of the output statistics. The appendices contain a detailed model configuration, program listing and results.

  12. Control of optical systems

    NASA Technical Reports Server (NTRS)

    Founds, D.

    1988-01-01

    Some of the current and planned activities at the Air Force Systems Command in structures and controls for optical-type systems are summarized. Many of the activities are contracted to industry; one task is an in-house program which includes a hardware test program. The objective of the in-house program, referred to as the Aluminum Beam Expander Structure (ABES), is to address issues involved in on-orbit system identification. The structure, which appears similar to the LDR backup structure, is about 35 feet tall. The activity to date has been limited to acquisition of about 250 hours of test data. About 30 hours of data per excitation force is gathered in order to obtain sufficient data for a good statistical estimate of the structural parameters. The development of an Integrated Structural Modeling (ISM) computer program is being done by Boeing Aerospace Company. The objective of the contracted effort is to develop a combined optics, structures, thermal, controls, and multibody dynamics simulation code.

  13. 45 CFR 309.170 - What statistical and narrative reporting requirements apply to Tribal IV-D programs?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 2 2011-10-01 2011-10-01 false What statistical and narrative reporting... (IV-D) PROGRAM Statistical and Narrative Reporting Requirements § 309.170 What statistical and... organizations must submit the following information and statistics for Tribal IV-D program activity and caseload...

  14. 45 CFR 309.170 - What statistical and narrative reporting requirements apply to Tribal IV-D programs?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 2 2010-10-01 2010-10-01 false What statistical and narrative reporting... (IV-D) PROGRAM Statistical and Narrative Reporting Requirements § 309.170 What statistical and... organizations must submit the following information and statistics for Tribal IV-D program activity and caseload...

  15. Evaluating a Health Educational First aid Program with the Implementation of Synchronous Distance Learning.

    PubMed

    Ponirou, Paraskevi; Diomidous, Marianna; Mantas, John; Kalokairinou, Athena; Kalouri, Ourania; Kapadochos, Theodoros; Tzavara, Chara

    2014-01-01

    First Aid education delivered through health education programs can help promote the health of the population. Meanwhile, the development of alternative forms of education with an emphasis on distance learning, implemented through e-learning, creates an innovative system for building knowledge and skills in different population groups. The main purpose of this study was to investigate the effectiveness of the educational program for candidate educators with respect to knowledge and emergency preparedness at school. The study used the Solomon four-group design (2 intervention groups and 2 control groups). Statistical analysis showed a significant difference among the four groups. The intervention groups improved their knowledge significantly, showing that the program was effective and that participants would be able to handle a threatening situation appropriately. There were no statistically significant findings regarding the other independent variables (p>0.05). The health education program implemented with synchronous distance learning succeeded in enhancing the knowledge of candidate educators.

  16. Mentor and protege attitudes towards the science mentoring program

    NASA Astrophysics Data System (ADS)

    Rios Jimenez, Noemaris

    The purpose of this study was to examine mentor and protege attitudes towards the science mentoring program. This study focused on the attitudes that proteges and mentors participating in the Puerto Rico Statewide Systemic Initiative (PRSSI) have towards the PRSSI mentoring program and the mentoring relationship. The data was gathered from a questionnaire for mentors and beginning teachers designed by Reiman and Edelfelt in 1990. It was used to measure the mentor and protege attitudes towards the science mentoring program by three variables: mentor-protege relationship, professional development, and supportive school climate. Data were collected from 56 science teachers (proteges) and 21 mentors from fourteen (14) junior high schools. Descriptive statistics were used to indicate both proteges and mentor attitudes towards the science mentoring program. T-tests were conducted to establish if there was a statistically significant difference between protege and mentor attitudes. In conclusion, the attitudes of mentors and proteges in regard to mentor-protege relationship, professional development, and supportive school climate were similar.

  17. Who Does Not Benefit from Federal and State Financial Aid Programs? Information Brief. Volume 7, Issue 3

    ERIC Educational Resources Information Center

    Florida Board of Governors, State University System, 2009

    2009-01-01

    This brief presents statistics showing that many students from middle-income and lower-income Florida families do not qualify for federal or state grants and scholarships, and that nearly half of state university system middle- and lower-income families do not receive benefits from federal or state financial aid programs. (Contains technical…

  18. 45 CFR 310.10 - What are the functional requirements for the Model Tribal IV-D System?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Number; and (E) Participant Identification Number; (ii) Delinquency and enforcement activities; (iii... operations and to assess program performance through the audit of financial and statistical data maintained...

  19. SWToolbox: A surface-water tool-box for statistical analysis of streamflow time series

    USGS Publications Warehouse

    Kiang, Julie E.; Flynn, Kate; Zhai, Tong; Hummel, Paul; Granato, Gregory

    2018-03-07

    This report is a user guide for the low-flow analysis methods provided with version 1.0 of the Surface Water Toolbox (SWToolbox) computer program. The software combines functionality from two software programs—U.S. Geological Survey (USGS) SWSTAT and U.S. Environmental Protection Agency (EPA) DFLOW. Both of these programs have been used primarily for computation of critical low-flow statistics. The main analysis methods are the computation of hydrologic frequency statistics such as the 7-day minimum flow that occurs on average only once every 10 years (7Q10), computation of design flows including biologically based flows, and computation of flow-duration curves and duration hydrographs. Other annual, monthly, and seasonal statistics can also be computed. The interface facilitates retrieval of streamflow discharge data from the USGS National Water Information System and outputs text reports for a record of the analysis. Tools for graphing data and screening tests are available to assist the analyst in conducting the analysis.
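
    The 7Q10 mentioned above is the annual minimum 7-day mean flow with a 10-year recurrence interval. The sketch below illustrates the basic idea with a rolling 7-day mean, annual minima, and a simple log-normal frequency fit; SWToolbox itself uses more rigorous low-flow frequency procedures and retrieves its data from NWIS, so treat the distribution choice and the synthetic data as assumptions.

```python
import numpy as np
import pandas as pd
from scipy import stats

# Synthetic daily streamflow (cfs); real analyses would pull data from NWIS.
rng = np.random.default_rng(1)
dates = pd.date_range("1990-01-01", "2009-12-31", freq="D")
flow = pd.Series(np.exp(rng.normal(3.0, 0.6, len(dates))), index=dates)

# Annual minima of the 7-day moving-average flow.
q7 = flow.rolling(window=7).mean()
annual_min_q7 = q7.groupby(q7.index.year).min().dropna()

# Illustrative frequency analysis: fit a log-normal distribution and take the
# 0.1 non-exceedance quantile (10-year recurrence of the annual minimum).
shape, loc, scale = stats.lognorm.fit(annual_min_q7, floc=0)
q7_10 = stats.lognorm.ppf(0.1, shape, loc=loc, scale=scale)
print(f"Approximate 7Q10: {q7_10:.1f} cfs")
```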

  20. Residency Program Directors' View on the Value of Teaching.

    PubMed

    Korte, Catherine; Smith, Andrew; Pace, Heather

    2016-08-01

    There is no standardization for teaching activities or a requirement for residency programs to offer specific teaching programs to pharmacy residents. This study will determine the perceived value of providing teaching opportunities to postgraduate year 1 (PGY-1) pharmacy residents in the perspective of the residency program director. The study will also identify the features, depth, and breadth of the teaching experiences afforded to PGY-1 pharmacy residents. A 20-question survey was distributed electronically to 868 American Society of Health-System Pharmacists-accredited PGY-1 residency program directors. The survey was completed by 322 program directors. Developing pharmacy educators was found to be highly valued by 57% of the program directors. Advertisement of teaching opportunities was found to be statistically significant when comparing program directors with a high perceived value for providing teaching opportunities to program demographics. Statistically significant differences were identified associating development of a teaching portfolio, evaluation of Advanced Pharmacy Practice Experiences students, and delivery of didactic lectures with program directors who highly value developing pharmacy educators. Future residency candidates interested in teaching or a career in academia may utilize these findings to identify programs that are more likely to value developing pharmacy educators. The implementation of a standardized teaching experience among all programs may be difficult. © The Author(s) 2015.

  1. Underestimates of unintentional firearm fatalities: comparing Supplementary Homicide Report data with the National Vital Statistics System

    PubMed Central

    Barber, C; Hemenway, D; Hochstadt, J; Azrael, D

    2002-01-01

    Objective: A growing body of evidence suggests that the nation's vital statistics system undercounts unintentional firearm deaths that are not self inflicted. This issue was examined by comparing how unintentional firearm injuries identified in police Supplementary Homicide Report (SHR) data were coded in the National Vital Statistics System. Methods: National Vital Statistics System data are based on death certificates and divide firearm fatalities into six subcategories: homicide, suicide, accident, legal intervention, war operations, and undetermined. SHRs are completed by local police departments as part of the FBI's Uniform Crime Reports program. The SHR divides homicides into two categories: "murder and non-negligent manslaughter" (type A) and "negligent manslaughter" (type B). Type B shooting deaths are those that are inflicted by another person and that a police investigation determined were inflicted unintentionally, as in a child killing a playmate after mistaking a gun for a toy. In 1997, the SHR classified 168 shooting victims this way. Using probabilistic matching, 140 of these victims were linked to their death certificate records. Results: Among the 140 linked cases, 75% were recorded on the death certificate as homicides and only 23% as accidents. Conclusion: Official data from the National Vital Statistics System almost certainly undercount firearm accidents when the victim is shot by another person. PMID:12226128

  2. A new 2D segmentation method based on dynamic programming applied to computer aided detection in mammography.

    PubMed

    Timp, Sheila; Karssemeijer, Nico

    2004-05-01

    Mass segmentation plays a crucial role in computer-aided diagnosis (CAD) systems for classification of suspicious regions as normal, benign, or malignant. In this article we present a robust and automated segmentation technique--based on dynamic programming--to segment mass lesions from surrounding tissue. In addition, we propose an efficient algorithm to guarantee resulting contours to be closed. The segmentation method based on dynamic programming was quantitatively compared with two other automated segmentation methods (region growing and the discrete contour model) on a dataset of 1210 masses. For each mass an overlap criterion was calculated to determine the similarity with manual segmentation. The mean overlap percentage for dynamic programming was 0.69, for the other two methods 0.60 and 0.59, respectively. The difference in overlap percentage was statistically significant. To study the influence of the segmentation method on the performance of a CAD system two additional experiments were carried out. The first experiment studied the detection performance of the CAD system for the different segmentation methods. Free-response receiver operating characteristics analysis showed that the detection performance was nearly identical for the three segmentation methods. In the second experiment the ability of the classifier to discriminate between malignant and benign lesions was studied. For region based evaluation the area Az under the receiver operating characteristics curve was 0.74 for dynamic programming, 0.72 for the discrete contour model, and 0.67 for region growing. The difference in Az values obtained by the dynamic programming method and region growing was statistically significant. The differences between other methods were not significant.
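
    The segmentation approach described above is built on a dynamic-programming recurrence. The sketch below shows the generic core of such methods, a minimum-cost path traced column by column through a cost image with backtracking; it is an illustration of the technique, not the authors' algorithm.

```python
import numpy as np

def min_cost_path(cost):
    """Minimum-cost left-to-right path through a 2-D cost array.

    Each step moves one column right and at most one row up or down,
    the kind of recurrence used in dynamic-programming contour
    segmentation (illustrative sketch, not the paper's code).
    """
    rows, cols = cost.shape
    acc = cost.astype(float).copy()
    back = np.zeros_like(acc, dtype=int)
    for c in range(1, cols):
        for r in range(rows):
            candidates = [(acc[pr, c - 1], pr)
                          for pr in (r - 1, r, r + 1) if 0 <= pr < rows]
            best_val, best_row = min(candidates)
            acc[r, c] = cost[r, c] + best_val
            back[r, c] = best_row
    # Backtrack from the cheapest cell in the last column.
    path = [int(np.argmin(acc[:, -1]))]
    for c in range(cols - 1, 0, -1):
        path.append(back[path[-1], c])
    return list(reversed(path))  # row index of the contour in each column

if __name__ == "__main__":
    demo = np.random.default_rng(2).random((20, 30))
    print(min_cost_path(demo))
```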

  3. Support Provided to the External Tank (ET) Project on the Use of Statistical Analysis for ET Certification Consultation Position Paper

    NASA Technical Reports Server (NTRS)

    Null, Cynthia H.

    2009-01-01

    In June 2004, the Space Flight Leadership Council (SFLC) assigned an action to the NASA Engineering and Safety Center (NESC) and External Tank (ET) project jointly to characterize the available dataset [of defect sizes from dissections of foam], identify resultant limitations to statistical treatment of ET as-built foam as part of the overall thermal protection system (TPS) certification, and report to the Program Requirements Change Board (PRCB) and SFLC in September 2004. The NESC statistics team was formed to assist the ET statistics group in August 2004. The NESC's conclusions are presented in this report.

  4. New Phone System Coming to NCI Campus at Frederick | Poster

    Cancer.gov

    By Travis Fouche and Trent McKee, Guest Writers Beginning in September, phones at the NCI Campus at Frederick will begin to be replaced, as the project to upgrade the current phone system ramps up. Over the next 16 months, the Information Systems Program (ISP) will be working with Facilities Maintenance and Engineering and Computer & Statistical Services to replace the current Avaya phone system with a Cisco Unified Communications phone system.

  5. Australian Vocational Education and Training Statistics: VET Program Completion Rates, 2011-15

    ERIC Educational Resources Information Center

    National Centre for Vocational Education Research (NCVER), 2017

    2017-01-01

    The Australian vocational education and training (VET) system provides training across a wide range of subject areas and is delivered through a variety of training institutions and enterprises (including to apprentices and trainees). The system provides training for students of all ages and backgrounds. Students may study individual subjects or…

  6. 77 FR 1728 - Privacy Act of 1974; Publication of Five New Systems of Records; Amendments to Five Existing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-11

    ... assistance to correspondents; to use Web site based programs; to provide usage statistics associated with the... of individuals for surveys. Among other things, maintaining the names, addresses, etc. of individuals... information in the system. Safeguards: Access by authorized personnel only. Computer security safeguards are...

  7. Evaluation of a training program for persons with SCI paraplegia using the Parastep 1 ambulation system: part 4. Effect on physical self-concept and depression.

    PubMed

    Guest, R S; Klose, K J; Needham-Shropshire, B M; Jacobs, P L

    1997-08-01

    To determine whether persons with spinal cord injury (SCI) paraplegia who participated in an electrical stimulation walking program experienced changes in measures of physical self-concept and depression. Before-after trial. Human SCI applied research laboratory. Volunteer sample of 12 men and 3 women with SCI paraplegia, mean age 28.75 +/- 6.6 yrs and mean duration of injury 3.8 +/- 3.2 yrs. Thirty-two FNS ambulation training sessions using a commercially available system (Parastep 1). The hybrid system consists of a microprocessor-controlled stimulator and a modified walking frame with finger-operated switches that permit the user to control the stimulation parameters and activate the stepping. The Tennessee Self-Concept Scale (TSCS) and the Beck Depression Inventory (BDI) were administered before and after training. Only the Physical Self subscale of the TSCS was analyzed. After training, individual interviews were performed to assess participants' subjective reactions to the training program. A repeated measures analysis of variance indicated that statistically significant changes in the desired direction occurred on the Physical Self subscale of the TSCS (F(1,14) = 8.54, p < .011) and on the BDI (F(1,14) = 5.42, p < .035). Subsequent to the ambulation training program there were statistically significant increases in physical self-concept scores and decreases in depression scores.

  8. 45 CFR 310.10 - What are the functional requirements for the Model Tribal IV-D System?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... plan, including: (1) Identifying information such as Social Security numbers, names, dates of birth... operations and to assess program performance through the audit of financial and statistical data maintained...

  9. DECIDE: a software for computer-assisted evaluation of diagnostic test performance.

    PubMed

    Chiecchio, A; Bo, A; Manzone, P; Giglioli, F

    1993-05-01

    The evaluation of the performance of clinical tests is a complex problem involving different steps and many statistical tools, not always structured in an organic and rational system. This paper presents software that provides a coherent system of statistical tools to help in the evaluation of clinical test performance. The program allows (a) the building and organization of a working database, (b) the selection of the minimal set of tests with the maximum information content, (c) the search for the model best fitting the distribution of the test values, (d) the selection of the optimal diagnostic cut-off value of the test for every positive/negative situation, and (e) the evaluation of the performance of combinations of correlated and uncorrelated tests. The uncertainty associated with all the variables involved is evaluated. The program works in an MS-DOS environment with an EGA or higher-performing graphics card.
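
    One of the steps listed above, selecting an optimal diagnostic cut-off, is commonly done by maximizing the Youden index along an ROC curve. The sketch below demonstrates that idea on simulated test values; DECIDE's own cut-off criterion may differ, and the data are invented.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Simulated test values for diseased (positive) and healthy (negative) subjects.
rng = np.random.default_rng(3)
values = np.concatenate([rng.normal(6.0, 1.5, 200),   # diseased
                         rng.normal(4.0, 1.2, 300)])  # healthy
labels = np.concatenate([np.ones(200), np.zeros(300)])

fpr, tpr, thresholds = roc_curve(labels, values)
youden = tpr - fpr                      # sensitivity + specificity - 1
best = np.argmax(youden)
print(f"AUC = {roc_auc_score(labels, values):.3f}")
print(f"Suggested cut-off = {thresholds[best]:.2f} "
      f"(sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f})")
```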

  10. RAD-ADAPT: Software for modelling clonogenic assay data in radiation biology.

    PubMed

    Zhang, Yaping; Hu, Kaiqiang; Beumer, Jan H; Bakkenist, Christopher J; D'Argenio, David Z

    2017-04-01

    We present a comprehensive software program, RAD-ADAPT, for the quantitative analysis of clonogenic assays in radiation biology. Two commonly used models for clonogenic assay analysis, the linear-quadratic model and the single-hit multi-target model, are included in the software. RAD-ADAPT uses maximum likelihood estimation to obtain parameter estimates under the assumption that cell colony count data follow a Poisson distribution. The program has an intuitive interface, generates model prediction plots, tabulates model parameter estimates, and allows automatic statistical comparison of parameters between different groups. The RAD-ADAPT interface is written using the statistical software R and the underlying computations are accomplished by the ADAPT software system for pharmacokinetic/pharmacodynamic systems analysis. The use of RAD-ADAPT is demonstrated using an example that examines the impact of pharmacologic ATM and ATR kinase inhibition on human lung cancer cell line A549 after ionizing radiation. Copyright © 2017 Elsevier B.V. All rights reserved.
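
    The estimation strategy described above, a linear-quadratic survival model with Poisson-distributed colony counts fit by maximum likelihood, can be sketched directly. The assay data, plating numbers, starting values, and optimizer below are invented for illustration; RAD-ADAPT's own R/ADAPT implementation is not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

# Invented clonogenic-assay data: dose (Gy), cells plated, colonies counted.
dose   = np.array([0.0, 1.0, 2.0, 4.0, 6.0, 8.0])
plated = np.array([100, 200, 400, 1000, 4000, 10000], dtype=float)
counts = np.array([62, 95, 130, 160, 210, 150], dtype=float)

def neg_log_lik(params):
    """Poisson negative log-likelihood for the linear-quadratic model."""
    log_pe, alpha, beta = params           # log plating efficiency, LQ parameters
    surv = np.exp(-(alpha * dose + beta * dose**2))
    mu = plated * np.exp(log_pe) * surv    # expected colony count
    return -np.sum(counts * np.log(mu) - mu - gammaln(counts + 1))

fit = minimize(neg_log_lik, x0=[np.log(0.6), 0.2, 0.02], method="L-BFGS-B",
               bounds=[(-5.0, 0.0), (0.0, 2.0), (0.0, 0.5)])
log_pe, alpha, beta = fit.x
print(f"plating efficiency = {np.exp(log_pe):.2f}, "
      f"alpha = {alpha:.3f} /Gy, beta = {beta:.4f} /Gy^2")
```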

  11. 76 FR 82322 - Agency Information Collection Activities; Submission for OMB Review; Comment Request; Mass Layoff...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-30

    ... for OMB Review; Comment Request; Mass Layoff Statistics Program ACTION: Notice. SUMMARY: The... request (ICR) titled, ``Mass Layoff Statistics Program,'' to the Office of Management and Budget (OMB) for... Statistics (BLS). Title of Collection: Mass Layoff Statistics Program. OMB Control Number: 1220-0090...

  12. CAPSAS: Computer Assisted Program for the Selection of Appropriate Statistics.

    ERIC Educational Resources Information Center

    Shermis, Mark D.; Albert, Susan L.

    A computer-assisted program has been developed for the selection of statistics or statistical techniques by both students and researchers. Based on Andrews, Klem, Davidson, O'Malley and Rodgers "A Guide for Selecting Statistical Techniques for Analyzing Social Science Data," this FORTRAN-compiled interactive computer program was…

  13. Efforts to improve international migration statistics: a historical perspective.

    PubMed

    Kraly, E P; Gnanasekaran, K S

    1987-01-01

    During the past decade, the international statistical community has made several efforts to develop standards for the definition, collection and publication of statistics on international migration. This article surveys the history of official initiatives to standardize international migration statistics by reviewing the recommendations of the International Statistical Institute, International Labor Organization, and the UN, and reports a recently proposed agenda for moving toward comparability among national statistical systems. Heightening awareness of the benefits of exchange and creating motivation to implement international standards requires a 3-pronged effort from the international statistical community. 1st, it is essential to continue discussion about the significance of improvement, specifically standardization, of international migration statistics. The move from theory to practice in this area requires ongoing focus by migration statisticians so that conformity to international standards itself becomes a criterion by which national statistical practices are examined and assessed. 2nd, the countries should be provided with technical documentation to support and facilitate the implementation of the recommended statistical systems. Documentation should be developed with an understanding that conformity to international standards for migration and travel statistics must be achieved within existing national statistical programs. 3rd, the call for statistical research in this area requires more efforts by the community of migration statisticians, beginning with the mobilization of bilateral and multilateral resources to undertake the preceding list of activities.

  14. 45 CFR 287.165 - What are the data collection and reporting requirements for Public Law 102-477 Tribes that...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Public Law 102-477. This system includes a program report, consisting of a narrative report, a statistical form, and a financial report. (1) The program report is required annually and submitted to BIA, as the lead Federal agency and shared with DHHS and DOL. (2) The financial report is submitted on a SF...

  15. A Low-Cost Method for Multiple Disease Prediction.

    PubMed

    Bayati, Mohsen; Bhaskar, Sonia; Montanari, Andrea

    Recently, in response to the rising costs of healthcare services, employers that are financially responsible for the healthcare costs of their workforce have been investing in health improvement programs for their employees. A main objective of these so-called "wellness programs" is to reduce the incidence of chronic illnesses such as cardiovascular disease, cancer, diabetes, and obesity, with the goal of reducing future medical costs. The majority of these wellness programs include an annual screening to detect individuals with the highest risk of developing chronic disease. Once these individuals are identified, the company can invest in interventions to reduce the risk of those individuals. However, capturing many biomarkers per employee creates a costly screening procedure. We propose a statistical data-driven method to address this challenge by minimizing the number of biomarkers in the screening procedure while maximizing the predictive power over a broad spectrum of diseases. Our solution uses multi-task learning and group dimensionality reduction from machine learning and statistics. We provide empirical validation of the proposed solution using data from two different electronic medical records systems, with comparisons to a statistical benchmark.
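
    A compact way to illustrate the multi-task, group-sparse idea described above is scikit-learn's MultiTaskLasso, whose L2,1 penalty either keeps a biomarker for every outcome or drops it for all of them. The synthetic data and penalty value below are assumptions, and this is only a proxy for the authors' combination of multi-task learning with group dimensionality reduction.

```python
import numpy as np
from sklearn.linear_model import MultiTaskLasso
from sklearn.preprocessing import StandardScaler

# Synthetic screening data: 500 employees x 30 biomarkers, 3 disease outcomes.
rng = np.random.default_rng(4)
X = rng.normal(size=(500, 30))
true_coef = np.zeros((3, 30))
true_coef[:, :5] = rng.normal(size=(3, 5))       # only 5 biomarkers matter
Y = X @ true_coef.T + 0.5 * rng.normal(size=(500, 3))

# MultiTaskLasso applies a group (L2,1) penalty, so a biomarker is either
# kept for all outcomes or dropped for all of them - a simple proxy for
# "minimize the number of biomarkers while predicting many diseases".
model = MultiTaskLasso(alpha=0.1).fit(StandardScaler().fit_transform(X), Y)
selected = np.where(np.abs(model.coef_).sum(axis=0) > 1e-8)[0]
print("Biomarkers retained across all outcomes:", selected)
```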

  16. Regression modeling of ground-water flow

    USGS Publications Warehouse

    Cooley, R.L.; Naff, R.L.

    1985-01-01

    Nonlinear multiple regression methods are developed to model and analyze groundwater flow systems. Complete descriptions of regression methodology as applied to groundwater flow models allow scientists and engineers engaged in flow modeling to apply the methods to a wide range of problems. Organization of the text proceeds from an introduction that discusses the general topic of groundwater flow modeling, to a review of basic statistics necessary to properly apply regression techniques, and then to the main topic: exposition and use of linear and nonlinear regression to model groundwater flow. Statistical procedures are given to analyze and use the regression models. A number of exercises and answers are included to exercise the student on nearly all the methods that are presented for modeling and statistical analysis. Three computer programs implement the more complex methods. These three are a general two-dimensional, steady-state regression model for flow in an anisotropic, heterogeneous porous medium, a program to calculate a measure of model nonlinearity with respect to the regression parameters, and a program to analyze model errors in computed dependent variables such as hydraulic head. (USGS)
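
    As a small illustration of nonlinear regression applied to a groundwater-flow problem, the sketch below fits the Theis drawdown solution for a pumped confined aquifer to synthetic observations with scipy's curve_fit. This is a textbook example chosen for brevity; it is not one of the report's three programs, and the pumping rate, radius, and data are invented.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import exp1   # exponential integral E1 (Theis well function)

Q = 500.0     # pumping rate, m^3/day (assumed)
r = 50.0      # distance from the pumping well, m (assumed)

def theis_drawdown(t, T, S):
    """Drawdown s(t) at radius r for transmissivity T and storativity S."""
    u = r**2 * S / (4.0 * T * t)
    return Q / (4.0 * np.pi * T) * exp1(u)

# Synthetic drawdown observations (days, metres) with a little noise.
t_obs = np.array([0.05, 0.1, 0.2, 0.5, 1.0, 2.0, 5.0])
rng = np.random.default_rng(5)
s_obs = theis_drawdown(t_obs, T=250.0, S=2e-4) + rng.normal(0, 0.02, t_obs.size)

(T_fit, S_fit), _ = curve_fit(theis_drawdown, t_obs, s_obs, p0=[100.0, 1e-3],
                              bounds=([1.0, 1e-6], [1e4, 1e-1]))
print(f"Estimated T = {T_fit:.0f} m^2/day, S = {S_fit:.1e}")
```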

  17. User's Guide to Galoper: A Program for Simulating the Shapes of Crystal Size Distributions from Growth Mechanisms - and Associated Programs

    USGS Publications Warehouse

    Eberl, Dennis D.; Drits, V.A.; Srodon, J.

    2000-01-01

    GALOPER is a computer program that simulates the shapes of crystal size distributions (CSDs) from crystal growth mechanisms. This manual describes how to use the program. The theory for the program's operation has been described previously (Eberl, Drits, and Srodon, 1998). CSDs that can be simulated using GALOPER include those that result from growth mechanisms operating in the open system, such as constant-rate nucleation and growth, nucleation with a decaying nucleation rate and growth, surface-controlled growth, supply-controlled growth, and constant-rate and random growth; and those that result from mechanisms operating in the closed system, such as Ostwald ripening, random ripening, and crystal coalescence. In addition, CSDs for two types of weathering reactions can be simulated. The operation of associated programs also is described, including two statistical programs used for comparing calculated with measured CSDs, a program used for calculating lognormal CSDs, and a program for arranging measured crystal sizes into size groupings (bins).

  18. Joint Services Electronics Program Annual Progress Report.

    DTIC Science & Technology

    1985-11-01

    Only fragments of this abstract survive OCR. Recoverable content: experiments with adaptive Huffman codes (one-symbol memory) were performed, and the compression achieved was compared with that of Ziv-Lempel coding; the report's contents include sections on Materials and on Information Systems, the latter covering Real Time Statistical Data Processing (T. Kailath) and Data Compression for Computer Data Structures (J. Gill).

  19. An object programming based environment for protein secondary structure prediction.

    PubMed

    Giacomini, M; Ruggiero, C; Sacile, R

    1996-01-01

    The most frequently used methods for protein secondary structure prediction are empirical statistical methods and rule based methods. A consensus system based on object-oriented programming is presented, which integrates the two approaches with the aim of improving the prediction quality. This system uses an object-oriented knowledge representation based on the concepts of conformation, residue and protein, where the conformation class is the basis, the residue class derives from it and the protein class derives from the residue class. The system has been tested with satisfactory results on several proteins of the Brookhaven Protein Data Bank. Its results have been compared with the results of the most widely used prediction methods, and they show a higher prediction capability and greater stability. Moreover, the system itself provides an index of the reliability of its current prediction. This system can also be regarded as a basis structure for programs of this kind.
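
    The abstract describes an object-oriented knowledge representation in which a conformation class is the base, a residue class derives from it, and a protein class derives from the residue class. The sketch below mirrors that hierarchy and a simple majority-vote consensus in Python; the attribute names, scoring rule, and example predictions are invented, and the original system was not written in Python.

```python
class Conformation:
    """Base class: a secondary-structure state with a reliability score."""
    def __init__(self, state="coil", score=0.0):
        self.state = state      # e.g. 'H' (helix), 'E' (sheet), 'C' (coil)
        self.score = score      # reliability index of the prediction

class Residue(Conformation):
    """A residue carries an amino-acid code plus its predicted conformation."""
    def __init__(self, aa, state="coil", score=0.0):
        super().__init__(state, score)
        self.aa = aa

class Protein(Residue):
    """A protein aggregates residues and forms a consensus prediction
    (hierarchy mirrors the paper's description: protein derives from residue)."""
    def __init__(self, sequence):
        super().__init__(aa=None)
        self.residues = [Residue(aa) for aa in sequence]

    def consensus(self, predictions):
        # predictions: per-residue state strings from different methods;
        # keep the majority state and use the vote share as reliability index.
        for res, votes in zip(self.residues, zip(*predictions)):
            res.state = max(set(votes), key=votes.count)
            res.score = votes.count(res.state) / len(votes)
        return "".join(r.state for r in self.residues)

p = Protein("MKTAYIAK")
print(p.consensus(["HHHHCCCC", "HHHCCCCC", "HHHHCCEE"]))
```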

  20. 75 FR 56059 - Patent Examiner Technical Training Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-15

    ...); statistical methods in validation of microarray data; personalized medicine, manufacture of carbon nanospheres... processing, growing monocrystals, hydrogen production, liquid and gas purification and separation, making... Systems and Components: Mixed signal design and architecture, flexible displays, OLED display technology...

  1. Port performance freight statistics program annual report to Congress, 2016.

    DOT National Transportation Integrated Search

    2017-01-01

    Maritime ports are a major component of the Nation's freight transportation system. Collectively they handle 75 percent of America's international trade by volume. Port throughput (the typical amount of cargo a port handles annually) and ...

  2. 25 CFR 32.4 - Policies.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... control in planning, priority-setting, development, management, operation, staffing and evaluation in all... exemplary programs reflecting Tribal or Alaska Native village specific learning styles, including but not... management information system which will provide statistical information such as, but not limited to, student...

  3. An Environmental Decision Support System for Spatial Assessment and Selective Remediation

    EPA Science Inventory

    Spatial Analysis and Decision Assistance (SADA) is a Windows freeware program that incorporates environmental assessment tools for effective problem-solving. The software integrates modules for GIS, visualization, geospatial analysis, statistical analysis, human health and ecolog...

  4. Fault-Tolerant Multiprocessor and VLSI-Based Systems.

    DTIC Science & Technology

    1987-03-15

    Only fragments of this abstract survive OCR. Recoverable content: Table 1 gives statistics for the benchmark programs; pages are distributed amongst the groups of the reconfigured memory in proportion to ...; distances are proportional to only the logarithm of ..., which is relevant to a system consisting of a large number of homogeneous elements; communication overhead resulting from faults communicating with all of the other elements in the system causes the network to degrade proportionately.

  5. A comparison of two surveillance systems for deaths related to violent injury

    PubMed Central

    Comstock, R; Mallonee, S; Jordan, F

    2005-01-01

    Objective: To compare violent injury death reporting by the statewide Medical Examiner and Vital Statistics Office surveillance systems in Oklahoma. Methods: Using a standard study definition for violent injury death, the sensitivity and predictive value positive (PVP) of the Medical Examiner and Vital Statistics violent injury death reporting systems in Oklahoma in 2001 were evaluated. Results: Altogether 776 violent injury deaths were identified (violent injury death rate: 22.4 per 100 000 population) including 519 (66.9%) suicides, 248 (32.0%) homicides, and nine (1.2%) unintentional firearm deaths. The Medical Examiner system over-reported homicides and the Vital Statistics system under-reported homicides and suicides and over-reported unintentional firearm injury deaths. When compared with the standard, the Medical Examiner and Vital Statistics systems had sensitivities of 99.2% and 90.7% (respectively) and PVPs of 95.0% and 99.1% for homicide, sensitivities of 99.2% and 93.1% and PVPs of 100% and 99.0% for suicide, and sensitivities of 100% and 100% and PVPs of 100% and 31.0% for unintentional firearm deaths. Conclusions: Both the Vital Statistics and Medical Examiner systems contain valuable data and when combined can work synergistically to provide violent injury death information while also serving as quality control checks for each other. Preventable errors within both systems can be reduced by increasing training, addressing sources of human error, and expanding computer quality assurance programming. A standardized nationwide Medical Examiners' coding system and a national violent death reporting system that merges multiple public health and criminal justice datasets would enhance violent injury surveillance and prevention efforts. PMID:15691992

  6. A statistical package for computing time and frequency domain analysis

    NASA Technical Reports Server (NTRS)

    Brownlow, J.

    1978-01-01

    The spectrum analysis (SPA) program is a general purpose digital computer program designed to aid in data analysis. The program does time and frequency domain statistical analyses as well as some preanalysis data preparation. The capabilities of the SPA program include linear trend removal and/or digital filtering of data, plotting and/or listing of both filtered and unfiltered data, time domain statistical characterization of data, and frequency domain statistical characterization of data.
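
    The two SPA capabilities named above, linear trend removal and frequency-domain characterization, map directly onto standard routines. The sketch below uses scipy's detrend and Welch estimator on a synthetic record; it illustrates the operations generically and is not the SPA program.

```python
import numpy as np
from scipy import signal

# Synthetic record: a 5 Hz sinusoid plus a linear drift and noise.
fs = 100.0                                  # sampling rate, Hz
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(6)
x = np.sin(2 * np.pi * 5 * t) + 0.05 * t + 0.3 * rng.normal(size=t.size)

x_detrended = signal.detrend(x, type="linear")        # linear trend removal

# Time-domain characterization.
print(f"mean {x_detrended.mean():.3f}, std {x_detrended.std():.3f}")

# Frequency-domain characterization: Welch power spectral density estimate.
freqs, psd = signal.welch(x_detrended, fs=fs, nperseg=512)
print(f"dominant frequency ~ {freqs[np.argmax(psd)]:.1f} Hz")
```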

  7. Computer Applications in Professional Writing: Systems that Analyze and Describe Natural Language.

    ERIC Educational Resources Information Center

    O'Brien, Frank

    Two varieties of user-friendly computer systems that deal with natural language are now available, providing either at-the-monitor stylistic and grammatic correction of keyed-in writing or a sorting, selecting, and generating of statistical data for any written or spoken document. The editor programs, such as "The Writer's Workbench"…

  8. The Likelihood of Completing a Government-Funded VET Program, 2010-14. Australian Vocational Education and Training Statistics

    ERIC Educational Resources Information Center

    National Centre for Vocational Education Research (NCVER), 2016

    2016-01-01

    The Australian vocational education and training (VET) system provides training across a wide range of subject areas and is delivered through a variety of training institutions and enterprises (including to apprentices and trainees). The system provides training for students of all ages and backgrounds. Students may study individual subjects or…

  9. The Department of Defense Very High Speed Integrated Circuit (VHSIC) Technology Availability Program Plan for the Committees on Armed Services United States Congress.

    DTIC Science & Technology

    1986-06-30

    Only fragments of this abstract survive OCR. Recoverable content: features of computer-aided design systems and statistical quality control procedures that are generic to chip sets and processes; radiation hardness ... Glossary fragments: PSP, Programmable Signal Processor; SSI, Small Scale Integration; TOW, Tube Launched, Optically Tracked, Wire Guided; TTL, Transistor-Transistor Logic.

  10. Coastal and Marine Bird Data Base

    USGS Publications Warehouse

    Anderson, S.H.; Geissler, P.H.; Dawson, D.K.

    1980-01-01

    Summary: This report discusses the development of a coastal and marine bird data base at the Migratory Bird and Habitat Research Laboratory. The system is compared with other data bases, and suggestions for future development, such as possible adaptations for other taxonomic groups, are included. The data base is based on the Statistical Analysis System but includes extensions programmed in PL/I. The Appendix shows how the system evolved. Output examples are given for heron data and pelagic bird data which indicate the types of analyses that can be conducted and output figures. The Appendixes include a retrieval language user's guide and description of the retrieval process and listing of translator program.

  11. Developing points-based risk-scoring systems in the presence of competing risks.

    PubMed

    Austin, Peter C; Lee, Douglas S; D'Agostino, Ralph B; Fine, Jason P

    2016-09-30

    Predicting the occurrence of an adverse event over time is an important issue in clinical medicine. Clinical prediction models and associated points-based risk-scoring systems are popular statistical methods for summarizing the relationship between a multivariable set of patient risk factors and the risk of the occurrence of an adverse event. Points-based risk-scoring systems are popular amongst physicians as they permit a rapid assessment of patient risk without the use of computers or other electronic devices. The use of such points-based risk-scoring systems facilitates evidence-based clinical decision making. There is a growing interest in cause-specific mortality and in non-fatal outcomes. However, when considering these types of outcomes, one must account for competing risks whose occurrence precludes the occurrence of the event of interest. We describe how points-based risk-scoring systems can be developed in the presence of competing events. We illustrate the application of these methods by developing risk-scoring systems for predicting cardiovascular mortality in patients hospitalized with acute myocardial infarction. Code in the R statistical programming language is provided for the implementation of the described methods. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.

  12. The Impact of Sexual Assault Nurse Examiner Programs on Criminal Justice Case Outcomes: A Multisite Replication Study.

    PubMed

    Campbell, Rebecca; Bybee, Deborah; Townsend, Stephanie M; Shaw, Jessica; Karim, Nidal; Markowitz, Jenifer

    2014-05-01

    To address the underreporting and underprosecution of adult sexual assaults, communities throughout the United States have implemented multidisciplinary interventions to improve postassault care for victims and the criminal justice system response. One such model is the Sexual Assault Nurse Examiner (SANE) Program, whereby specially trained nurses provide comprehensive psychological, medical, and forensic services for sexual assault. In this study, we conducted a multisite evaluation of six SANE programs (two rural programs, two serving midsized communities, two urban) to assess how implementation of SANE programs affects adult sexual assault prosecution rates. At each site, most sexual assaults reported to law enforcement were never referred by police to prosecutors or were not charged by the prosecutor's office (80%-89%). Individually, none of the sites had a statistically significant increase in prosecution rates pre-SANE to post-SANE. However, when the data were aggregated across sites, thereby increasing statistical power, there was a significant effect such that cases were more likely to be prosecuted post-SANE as compared with pre-SANE. These findings suggest that the SANE intervention model does have a positive impact on sexual assault case progression in the criminal justice system. Nevertheless, there is still a pressing need for improvement, as the vast majority of both pre-SANE and post-SANE cases resulted in nonreferral/no charges filed. © The Author(s) 2014.

  13. Evaluating and implementing temporal, spatial, and spatio-temporal methods for outbreak detection in a local syndromic surveillance system

    PubMed Central

    Lall, Ramona; Levin-Rector, Alison; Sell, Jessica; Paladini, Marc; Konty, Kevin J.; Olson, Don; Weiss, Don

    2017-01-01

    The New York City Department of Health and Mental Hygiene has operated an emergency department syndromic surveillance system since 2001, using temporal and spatial scan statistics run on a daily basis for cluster detection. Since the system was originally implemented, a number of new methods have been proposed for use in cluster detection. We evaluated six temporal and four spatial/spatio-temporal detection methods using syndromic surveillance data spiked with simulated injections. The algorithms were compared on several metrics, including sensitivity, specificity, positive predictive value, coherence, and timeliness. We also evaluated each method’s implementation, programming time, run time, and the ease of use. Among the temporal methods, at a set specificity of 95%, a Holt-Winters exponential smoother performed the best, detecting 19% of the simulated injects across all shapes and sizes, followed by an autoregressive moving average model (16%), a generalized linear model (15%), a modified version of the Early Aberration Reporting System’s C2 algorithm (13%), a temporal scan statistic (11%), and a cumulative sum control chart (<2%). Of the spatial/spatio-temporal methods we tested, a spatial scan statistic detected 3% of all injects, a Bayes regression found 2%, and a generalized linear mixed model and a space-time permutation scan statistic detected none at a specificity of 95%. Positive predictive value was low (<7%) for all methods. Overall, the detection methods we tested did not perform well in identifying the temporal and spatial clusters of cases in the inject dataset. The spatial scan statistic, our current method for spatial cluster detection, performed slightly better than the other tested methods across different inject magnitudes and types. Furthermore, we found the scan statistics, as applied in the SaTScan software package, to be the easiest to program and implement for daily data analysis. PMID:28886112

  14. OPR-PPR, a Computer Program for Assessing Data Importance to Model Predictions Using Linear Statistics

    USGS Publications Warehouse

    Tonkin, Matthew J.; Tiedeman, Claire; Ely, D. Matthew; Hill, Mary C.

    2007-01-01

    The OPR-PPR program calculates the Observation-Prediction (OPR) and Parameter-Prediction (PPR) statistics that can be used to evaluate the relative importance of various kinds of data to simulated predictions. The data considered fall into three categories: (1) existing observations, (2) potential observations, and (3) potential information about parameters. The first two are addressed by the OPR statistic; the third is addressed by the PPR statistic. The statistics are based on linear theory and measure the leverage of the data, which depends on the location, the type, and possibly the time of the data being considered. For example, in a ground-water system the type of data might be a head measurement at a particular location and time. As a measure of leverage, the statistics do not take into account the value of the measurement. As linear measures, the OPR and PPR statistics require minimal computational effort once sensitivities have been calculated. Sensitivities need to be calculated for only one set of parameter values; commonly these are the values estimated through model calibration. OPR-PPR can calculate the OPR and PPR statistics for any mathematical model that produces the necessary OPR-PPR input files. In this report, OPR-PPR capabilities are presented in the context of using the ground-water model MODFLOW-2000 and the universal inverse program UCODE_2005. The method used to calculate the OPR and PPR statistics is based on the linear equation for prediction standard deviation. Using sensitivities and other information, OPR-PPR calculates (a) the percent increase in the prediction standard deviation that results when one or more existing observations are omitted from the calibration data set; (b) the percent decrease in the prediction standard deviation that results when one or more potential observations are added to the calibration data set; or (c) the percent decrease in the prediction standard deviation that results when potential information on one or more parameters is added.
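
    The OPR idea above can be illustrated with a few lines of linear algebra: under linear theory the prediction standard deviation follows from the weighted sensitivity matrix, and the statistic is the percent change when an observation is omitted. The sketch below is a generic illustration with random sensitivities, not the OPR-PPR code or its MODFLOW-2000/UCODE_2005 file formats.

```python
import numpy as np

def prediction_std(X, w, z, sigma2=1.0):
    """Std. dev. of a prediction with sensitivity vector z,
    given observation sensitivities X and weights w (linear theory)."""
    XtWX = X.T @ np.diag(w) @ X
    return float(np.sqrt(sigma2 * z @ np.linalg.inv(XtWX) @ z))

rng = np.random.default_rng(7)
X = rng.normal(size=(12, 3))     # 12 observations, 3 parameters (illustrative)
w = np.ones(12)                  # observation weights
z = np.array([1.0, 0.5, -0.2])   # prediction sensitivities to the parameters

base = prediction_std(X, w, z)
for i in range(X.shape[0]):
    keep = np.delete(np.arange(X.shape[0]), i)
    omitted = prediction_std(X[keep], w[keep], z)
    print(f"omit obs {i:2d}: prediction std increases by "
          f"{100 * (omitted - base) / base:5.1f} %  (OPR-like statistic)")
```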

  15. An investigation of the feasibility of improving oculometer data analysis through application of advanced statistical techniques

    NASA Technical Reports Server (NTRS)

    Rana, D. S.

    1980-01-01

    The data reduction capabilities of the current data reduction programs were assessed and a search for a more comprehensive system with higher data analytic capabilities was made. Results of the investigation are presented.

  16. Statistical loads data for Boeing 737-400 aircraft in commercial operations

    DOT National Transportation Integrated Search

    1998-08-01

    The primary objective of this research is to support the FAA Airborne Data Monitoring Systems Research Program by developing new and improved methods and criteria for processing and presenting large commercial transport airplane flight and ground loa...

  17. Statistical loads data for BE-1900D aircraft in commuter operations

    DOT National Transportation Integrated Search

    2000-04-01

    The primary objective of this research is to support the FAA Airborne Data Monitoring Systems Research Program by developing new and improved methods and criteria for processing and presenting commuter airplane flight and ground loads usage data. The...

  18. Modular reweighting software for statistical mechanical analysis of biased equilibrium data

    NASA Astrophysics Data System (ADS)

    Sindhikara, Daniel J.

    2012-07-01

    Here a simple, useful, modular approach and software suite designed for statistical reweighting and analysis of equilibrium ensembles is presented. Statistical reweighting is useful and sometimes necessary for analysis of equilibrium enhanced sampling methods, such as umbrella sampling or replica exchange, and also in experimental cases where biasing factors are explicitly known. Essentially, statistical reweighting allows extrapolation of data from one or more equilibrium ensembles to another. Here, the fundamental separable steps of statistical reweighting are broken up into modules - allowing for application to the general case and avoiding the black-box nature of some “all-inclusive” reweighting programs. Additionally, the programs included are, by design, written with few dependencies. The compilers required are either pre-installed on most systems, or freely available for download with minimal trouble. Examples of the use of this suite applied to umbrella sampling and replica exchange molecular dynamics simulations will be shown along with advice on how to apply it in the general case.
    New version program summary
    Program title: Modular reweighting version 2
    Catalogue identifier: AEJH_v2_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJH_v2_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU General Public License, version 3
    No. of lines in distributed program, including test data, etc.: 179 118
    No. of bytes in distributed program, including test data, etc.: 8 518 178
    Distribution format: tar.gz
    Programming language: C++, Python 2.6+, Perl 5+
    Computer: Any
    Operating system: Any
    RAM: 50-500 MB
    Supplementary material: An updated version of the original manuscript (Comput. Phys. Commun. 182 (2011) 2227) is available
    Classification: 4.13
    Catalogue identifier of previous version: AEJH_v1_0
    Journal reference of previous version: Comput. Phys. Commun. 182 (2011) 2227
    Does the new version supersede the previous version?: Yes
    Nature of problem: While equilibrium reweighting is ubiquitous, there are no public programs available to perform the reweighting in the general case. Further, specific programs often suffer from many library dependencies and numerical instability.
    Solution method: This package is written in a modular format that allows for easy applicability of reweighting in the general case. Modules are small, numerically stable, and require minimal libraries.
    Reasons for new version: Some minor bugs, some upgrades needed, error analysis added. analyzeweight.py/analyzeweight.py2 has been replaced by “multihist.py”. This new program performs all the functions of its predecessor while being versatile enough to handle other types of histograms and probability analysis. “bootstrap.py” was added. This script performs basic bootstrap resampling allowing for error analysis of data. “avg_dev_distribution.py” was added. This program computes the averages and standard deviations of multiple distributions, making error analysis (e.g. from bootstrap resampling) easier to visualize. WRE.cpp was slightly modified purely for cosmetic reasons. The manual was updated for clarity and to reflect version updates. Examples were removed from the manual in favor of online tutorials (packaged examples remain). Examples were updated to reflect the new format. An additional example is included to demonstrate error analysis.
    Running time: Preprocessing scripts 1-5 minutes, WHAM engine <1 minute, postprocess script ∼1-5 minutes.
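
    The central operation such a suite modularizes is the reweighting step itself: extrapolating an average from a biased equilibrium ensemble back to the unbiased one. The sketch below shows the single-ensemble textbook formula with a synthetic harmonic umbrella bias; the units, force constant, and data are assumptions, and the distributed package's multi-ensemble (WHAM-style) pipeline is more general than this.

```python
import numpy as np

kT = 0.593  # kcal/mol at ~298 K (assumed units)

def reweighted_average(observable, bias_energy, kT=kT):
    """Unbiased ensemble average of an observable sampled under a known bias.

    Each frame is weighted by exp(+U_bias/kT), which removes the bias that
    was applied during sampling (single-ensemble reweighting sketch).
    """
    w = np.exp((bias_energy - bias_energy.max()) / kT)   # shift for stability
    return np.sum(w * observable) / np.sum(w)

# Illustrative data: an observable and the umbrella bias energy per frame.
rng = np.random.default_rng(8)
obs = rng.normal(3.0, 0.5, 10000)        # synthetic biased samples
bias = 0.5 * 2.0 * (obs - 3.5) ** 2      # harmonic bias, k = 2.0 (assumed)
print(f"biased mean  = {obs.mean():.3f}")
print(f"reweighted   = {reweighted_average(obs, bias):.3f}")
```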

  19. New Phone System Coming to NCI Campus at Frederick | Poster

    Cancer.gov

    By Travis Fouche and Trent McKee, Guest Writers Beginning in September, phones at the NCI Campus at Frederick will begin to be replaced, as the project to upgrade the current phone system ramps up. Over the next 16 months, the Information Systems Program (ISP) will be working with Facilities Maintenance and Engineering and Computer & Statistical Services to replace the current Avaya phone system with a Cisco Unified Communications phone system. The Cisco system is already in use at the Advanced Technology Research Facility (ATRF).

  20. [Development and application of information management system for advanced schistosomiasis chemotherapy and assistance in Jiangxi Province].

    PubMed

    Mao, Yuan-Hua; Li, Dong; Ning, An; Qiu, Ling; Xiong, Ji-Jie

    2011-04-01

    To develop the information management system for advanced schistosomiasis chemotherapy and assistance in Jiangxi Province. Based on Access 2003, the system was programmed in Visual Basic 6.0 and packaged with Setup Factory 8.0. In the system, advanced schistosomiasis data could be entered, printed, indexed, and statistically analyzed. The system could be operated and maintained easily and in a timely manner. The information management system for advanced schistosomiasis chemotherapy and assistance in Jiangxi Province was successfully developed.

  1. SEDPAK—A comprehensive operational system and data-processing package in APPLESOFT BASIC for a settling tube, sediment analyzer

    NASA Astrophysics Data System (ADS)

    Goldbery, R.; Tehori, O.

    SEDPAK provides a comprehensive software package for operation of a settling tube and sand analyzer (2-0.063 mm) and includes data-processing programs for statistical and graphic output of results. The programs are menu-driven and written in APPLESOFT BASIC, conforming with APPLE 3.3 DOS. Data storage and retrieval from disc is an important feature of SEDPAK. Additional features of SEDPAK include condensation of raw settling data via standard size-calibration curves to yield statistical grain-size parameters, plots of grain-size frequency distributions and cumulative log/probability curves. The program also has a module for processing of grain-size frequency data from sieved samples. An additional feature of SEDPAK is the option for automatic data processing and graphic output of a sequential or nonsequential array of samples on one side of a disc.

  2. Computer programs for computing particle-size statistics of fluvial sediments

    USGS Publications Warehouse

    Stevens, H.H.; Hubbell, D.W.

    1986-01-01

    Two versions of computer programs for inputting data and computing particle-size statistics of fluvial sediments are presented. The FORTRAN 77 language versions are for use on the Prime computer, and the BASIC language versions are for use on microcomputers. The size-statistics program computes Inman, Trask, and Folk statistical parameters from phi values and sizes determined for 10 specified percent-finer values from the input size and percent-finer data. The program also determines the percentage gravel, sand, silt, and clay, and the Meyer-Peter effective diameter. Documentation and listings for both versions of the programs are included. (Author's abstract)
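
    The graphic statistics such programs report are computed from phi values read off the cumulative curve at fixed percentiles. The sketch below interpolates those percentiles from an invented percent-finer curve and applies the Folk and Ward graphic formulas for mean, sorting, and skewness; it reproduces the general approach, not the published FORTRAN or BASIC code.

```python
import numpy as np

# Invented cumulative curve: phi sizes and cumulative percent finer.
phi = np.array([-1.0, 0.0, 1.0, 2.0, 3.0, 4.0])
pct_finer = np.array([2.0, 10.0, 35.0, 70.0, 92.0, 99.0])

def phi_at(p):
    """Interpolate the phi value at cumulative percent p."""
    return float(np.interp(p, pct_finer, phi))

p5, p16, p50, p84, p95 = (phi_at(p) for p in (5, 16, 50, 84, 95))

# Folk and Ward graphic statistics (one parameter set such programs report).
mean_phi = (p16 + p50 + p84) / 3
sorting  = (p84 - p16) / 4 + (p95 - p5) / 6.6
skewness = ((p16 + p84 - 2 * p50) / (2 * (p84 - p16))
            + (p5 + p95 - 2 * p50) / (2 * (p95 - p5)))
print(f"graphic mean = {mean_phi:.2f} phi, sorting = {sorting:.2f} phi, "
      f"skewness = {skewness:.2f}")
```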

  3. Processing of on-board recorded data for quick analysis of aircraft performance. [rotor systems research aircraft

    NASA Technical Reports Server (NTRS)

    Michaud, N. H.

    1979-01-01

    A system of independent computer programs for the processing of digitized pulse code modulated (PCM) and frequency modulated (FM) data is described. Information is stored in a set of random files and accessed to produce both statistical and graphical output. The software system is designed primarily to present these reports within a twenty-four hour period for quick analysis of the helicopter's performance.

  4. Family Day Care in the United States: Family Day Care Systems. Final Report of the National Day Care Home Study. Volume 5.

    ERIC Educational Resources Information Center

    Grasso, Janet; Fosburg, Steven

    Fifth in a series of seven volumes reporting the design, methodology, and findings of the 4-year National Day Care Home Study (NDCHS), this volume presents a descriptive and statistical analysis of the day care institutions that administer day care systems. These systems, such as Learning Unlimited in Los Angeles and the family day care program of…

  5. STATISTICAL PROGRAMS OF THE UNITED STATES GOVERNMENT: FISCAL YEAR 2018

    DOT National Transportation Integrated Search

    2018-01-01

    Statistical Programs of the United States Government: Fiscal Year 2018 outlines the funding proposed for Federal statistical activities in the President's Budget. This report, along with the chapter "Strengthening Federal Statistics" in the Analytica...

  6. Hand-held computer operating system program for collection of resident experience data.

    PubMed

    Malan, T K; Haffner, W H; Armstrong, A Y; Satin, A J

    2000-11-01

    To describe a system for recording resident experience involving hand-held computers with the Palm Operating System (3Com, Inc., Santa Clara, CA). Hand-held personal computers (PCs) are popular, easy to use, inexpensive, portable, and can share data with other operating systems. Residents in our program carry individual hand-held database computers to record Residency Review Committee (RRC) reportable patient encounters. Each resident's data are transferred to a single central relational database compatible with Microsoft Access (Microsoft Corporation, Redmond, WA). Patient data entry and subsequent transfer to a central database are accomplished with commercially available software that requires minimal computer expertise to implement and maintain. The central database can then be used for statistical analysis or to create required RRC resident experience reports. As a result, the data collection and transfer process takes less time for residents and program director alike than paper-based or central computer-based systems. The system of collecting resident encounter data using hand-held computers with the Palm Operating System is easy to use, relatively inexpensive, accurate, and secure. The user-friendly system provides prompt, complete, and accurate data, enhancing the education of residents while facilitating the job of the program director.

  7. FabricS: A user-friendly, complete and robust software for particle shape-fabric analysis

    NASA Astrophysics Data System (ADS)

    Moreno Chávez, G.; Castillo Rivera, F.; Sarocchi, D.; Borselli, L.; Rodríguez-Sedano, L. A.

    2018-06-01

    Shape-fabric is a textural parameter related to the spatial arrangement of elongated particles in geological samples. Its usefulness spans a range from sedimentary petrology to igneous and metamorphic petrology. Independently of the process being studied, when a material flows, the elongated particles are oriented with the major axis in the direction of flow. In sedimentary petrology this information has been used for studies of paleo-flow direction of turbidites, the origin of quartz sediments, and locating ignimbrite vents, among others. In addition to flow direction and its polarity, the method enables flow rheology to be inferred. The use of shape-fabric has been limited by the difficulty of automatically measuring particles and analyzing them with reliable circular statistics programs, which dampened interest in the method for a long time. Shape-fabric measurement has increased in popularity since the 1980s thanks to the development of new image analysis techniques and circular statistics software. However, the programs currently available are unreliable, outdated, incompatible with newer operating systems, or require programming skills. The goal of our work is to develop a user-friendly MATLAB program with a graphical user interface that can process images, includes editing functions and thresholds (elongation and size) for selecting a particle population, and analyzes the selected population with reliable circular statistics algorithms. The program also produces rose diagrams, orientation vectors, and a complete series of statistical parameters; all these requirements are met by the new software. In this paper, we briefly explain the methodology from collection of oriented samples in the field to the minimum number of particles needed to obtain reliable fabric data. We obtained the data using specific statistical tests, taking into account the degree of iso-orientation of the samples and the required degree of reliability. The program has been verified by means of several simulations performed using appropriately designed features and by analyzing real samples.
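
    The core computation in shape-fabric work is circular statistics on particle long-axis orientations, which are axial data (an axis at θ is the same fabric direction as θ + 180°). FabricS itself is a MATLAB program; purely as a sketch of the underlying calculation, the Python snippet below doubles the angles, averages the unit vectors, and reports the mean orientation together with the resultant length R that measures the degree of iso-orientation.

      import numpy as np

      def axial_mean_orientation(angles_deg):
          """Mean orientation and resultant length R for axial data (0-180 degrees).

          Angles are doubled so theta and theta + 180 map to the same axis, the
          unit vectors are averaged, and the mean angle is halved back.
          """
          doubled = np.deg2rad(2.0 * np.asarray(angles_deg, dtype=float))
          C, S = np.cos(doubled).mean(), np.sin(doubled).mean()
          R = np.hypot(C, S)                        # 1 = perfectly aligned, 0 = uniform
          mean_axis = (np.rad2deg(np.arctan2(S, C)) / 2.0) % 180.0
          return mean_axis, R

      # Particle long axes roughly aligned around 40 degrees
      print(axial_mean_orientation([35, 42, 38, 47, 33, 41, 39, 44]))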

  8. 7 CFR 295.5 - Program statistical reports.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 4 2011-01-01 2011-01-01 false Program statistical reports. 295.5 Section 295.5 Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF... statistical reports. Current and historical information on FNS food assistance program size, monetary outlays...

  9. 7 CFR 295.5 - Program statistical reports.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 4 2010-01-01 2010-01-01 false Program statistical reports. 295.5 Section 295.5 Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF... statistical reports. Current and historical information on FNS food assistance program size, monetary outlays...

  10. Statistical loads data for Cessna 172 aircraft using the Aircraft Cumulative Fatigue System (ACFS)

    DOT National Transportation Integrated Search

    2001-08-01

    The purpose of this research and development program was to manufacture a small, lightweight, low-cost recorder for loads usage monitoring of general aviation and commuter type aircraft to support the Federal Aviation Administration (FAA) Operation L...

  11. Proceedings of the second annual Forest Inventory and Analysis symposium; Salt Lake City, UT. October 17-18, 2000

    Treesearch

    Gregory A. Reams; Ronald E. McRoberts; Paul C. van Deusen; [Editors]

    2001-01-01

    Documents progress in developing techniques in remote sensing, statistics, information management, and analysis required for full implementation of the national Forest Inventory and Analysis program’s annual forest inventory system.

  12. The Application of Computer Technology to the Development of a Native American Planning and Information System.

    ERIC Educational Resources Information Center

    McKinley, Kenneth H.; Self, Burl E., Jr.

    A study was conducted to determine the feasibility of using the computer-based Synagraphic Mapping Program (SYMAP) and the Statistical Package for the Social Sciences (SPSS) in formulating an efficient and accurate information system which Creek Nation tribal staff could implement and use in planning for more effective and precise delivery of…

  13. Information-Decay Pursuit of Dynamic Parameters in Student Models

    DTIC Science & Technology

    1994-04-01

    simple worked-through example). Commercially available computer programs for structuring and using Bayesian inference include ERGO (Noetic Systems...Tukey, J.W. (1977). Data Analysis and Regression: A Second Course in Statistics. Reading, MA: Addison-Wesley. Noetic Systems, Inc. (1991). ERGO...

  14. On Designing Multicore-Aware Simulators for Systems Biology Endowed with OnLine Statistics

    PubMed Central

    Calcagno, Cristina; Coppo, Mario

    2014-01-01

    This paper discusses enabling methodologies for the design of a fully parallel, online, interactive tool aiming to support bioinformatics scientists. In particular, the features of these methodologies, supported by the FastFlow parallel programming framework, are shown on a simulation tool that performs the modeling, tuning, and sensitivity analysis of stochastic biological models. A stochastic simulation needs thousands of independent simulation trajectories, turning into big data that should be analysed by statistical and data-mining tools. In the considered approach the two stages are pipelined in such a way that the simulation stage streams out the partial results of all simulation trajectories to the analysis stage, which immediately produces a partial result. The simulation-analysis workflow is validated for performance and for the effectiveness of the online analysis in capturing biological systems behavior on a multicore platform and representative proof-of-concept biological systems. The exploited methodologies include pattern-based parallel programming and data streaming, which provide key features to software designers such as performance portability and efficient in-memory (big) data management and movement. Two paradigmatic classes of biological systems exhibiting multistable and oscillatory behavior are used as a testbed. PMID:25050327
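
    The key idea is that each trajectory streams its values into an analysis stage that keeps running statistics instead of storing the raw trajectories. The sketch below is not the FastFlow code, only a minimal Python illustration of that online stage, using Welford's algorithm for streaming mean and variance; the species-count values are synthetic stand-ins.

      import random

      class OnlineStats:
          """Welford's online algorithm: update mean and variance one sample at a time."""
          def __init__(self):
              self.n, self.mean, self.m2 = 0, 0.0, 0.0

          def push(self, x):
              self.n += 1
              delta = x - self.mean
              self.mean += delta / self.n
              self.m2 += delta * (x - self.mean)

          @property
          def variance(self):
              return self.m2 / (self.n - 1) if self.n > 1 else 0.0

      # One accumulator per observed time point; each simulation trajectory streams
      # its value for that time point to the analysis stage as soon as it is produced.
      stats_at_t = OnlineStats()
      for trajectory in range(1000):
          stats_at_t.push(random.gauss(50.0, 4.0))   # stand-in for a species count at time t
      print(stats_at_t.mean, stats_at_t.variance)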

  15. On designing multicore-aware simulators for systems biology endowed with OnLine statistics.

    PubMed

    Aldinucci, Marco; Calcagno, Cristina; Coppo, Mario; Damiani, Ferruccio; Drocco, Maurizio; Sciacca, Eva; Spinella, Salvatore; Torquati, Massimo; Troina, Angelo

    2014-01-01

    The paper arguments are on enabling methodologies for the design of a fully parallel, online, interactive tool aiming to support the bioinformatics scientists .In particular, the features of these methodologies, supported by the FastFlow parallel programming framework, are shown on a simulation tool to perform the modeling, the tuning, and the sensitivity analysis of stochastic biological models. A stochastic simulation needs thousands of independent simulation trajectories turning into big data that should be analysed by statistic and data mining tools. In the considered approach the two stages are pipelined in such a way that the simulation stage streams out the partial results of all simulation trajectories to the analysis stage that immediately produces a partial result. The simulation-analysis workflow is validated for performance and effectiveness of the online analysis in capturing biological systems behavior on a multicore platform and representative proof-of-concept biological systems. The exploited methodologies include pattern-based parallel programming and data streaming that provide key features to the software designers such as performance portability and efficient in-memory (big) data management and movement. Two paradigmatic classes of biological systems exhibiting multistable and oscillatory behavior are used as a testbed.

  16. 76 FR 81984 - Agency Information Collection Activities; Submission for OMB Review; Comment Request; Local Area...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-29

    ... for OMB Review; Comment Request; Local Area Unemployment Statistics Program ACTION: Notice. SUMMARY... collection request (ICR) titled, ``Local Area Unemployment Statistics Program,'' to the Office of Management... of Collection: Local Area Unemployment Statistics Program. OMB Control Number: 1220-0017. Affected...

  17. Application of capital social of Bali cattle farmers that participate in the partnership system in Barru Regency, South Sulawesi Province

    NASA Astrophysics Data System (ADS)

    Sirajuddin, S. N.; Siregar, A. R.; Mappigau, P.

    2018-05-01

    There are four models of partnership: centralized, multipartite, intermediary, and informal models, applied across all livestock commodities, including beef cattle. Partnership in the beef cattle business has been implemented in Barru through the cattle showroom program (SRS). This study aimed to examine the application of social capital by beef cattle breeders who joined the partnership system (cattle showroom program) in Barru. The research was conducted in April 2017 in Tanete Riaja district. The population comprised all farmers in Barru Regency who joined the partnership system (showroom program), and the sample was the beef cattle breeders who followed the partnership system in Tanete Riaja district, Barru Regency. The research is quantitative descriptive, using quantitative and qualitative data from primary and secondary sources. Data were analyzed with descriptive statistical analysis using a Likert scale. The results show that the social capital (trust, linkage, norms) of beef cattle breeders who joined the partnership system (cattle showroom program) is at a high level.

  18. LACIE performance predictor final operational capability program description, volume 3

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The requirements and processing logic for the LACIE Error Model program (LEM) are described. This program is an integral part of the Large Area Crop Inventory Experiment (LACIE) system. LEM is that portion of the LPP (LACIE Performance Predictor) which simulates the sample segment classification, strata yield estimation, and production aggregation. LEM controls repetitive Monte Carlo trials based on input error distributions to obtain statistical estimates of the wheat area, yield, and production at different levels of aggregation. LEM interfaces with the rest of the LPP through a set of data files.
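
    LEM's repetitive Monte Carlo trials propagate segment-level classification and yield errors up to aggregated production estimates. The sketch below is a simplified stand-in rather than the LACIE code: it draws area and yield errors for each sample segment from assumed normal error distributions (the coefficients of variation are illustrative) and accumulates the distribution of aggregated wheat production.

      import random

      def monte_carlo_production(segments, n_trials=5000, area_cv=0.08, yield_cv=0.10):
          """Mean and spread of aggregated production under segment-level errors.

          `segments` is a list of (wheat_area_ha, yield_t_per_ha) tuples; the
          coefficients of variation are illustrative assumptions, not LACIE values.
          """
          totals = []
          for _ in range(n_trials):
              total = 0.0
              for area, yld in segments:
                  est_area = random.gauss(area, area_cv * area)    # classification error
                  est_yield = random.gauss(yld, yield_cv * yld)    # yield-model error
                  total += est_area * est_yield
              totals.append(total)
          mean = sum(totals) / n_trials
          sd = (sum((t - mean) ** 2 for t in totals) / (n_trials - 1)) ** 0.5
          return mean, sd

      print(monte_carlo_production([(12000, 2.1), (9000, 1.8), (15000, 2.4)]))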

  19. Operating a Geiger Müller tube using a PC sound card

    NASA Astrophysics Data System (ADS)

    Azooz, A. A.

    2009-01-01

    In this paper, a simple MATLAB-based PC program that enables the computer to function as a replacement for the electronic scaler-counter system associated with a Geiger-Müller (GM) tube is described. The program utilizes the ability of MATLAB to acquire data directly from the computer sound card. The signal from the GM tube is applied to the computer sound card via the line-in port. All standard GM experiments, including pulse-shape and statistical analysis experiments, can be carried out using this system. A new visual demonstration of dead-time effects is also presented.
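
    Once the GM pulses reach the sound card's line-in, counting reduces to thresholding the sampled waveform and timing the gaps between pulses. The original program is MATLAB; the Python sketch below shows the same idea on an already-captured sample buffer (acquisition itself could be done with any audio-input library), with a simple hold-off interval so one pulse is not counted twice. The threshold and hold-off values are illustrative.

      import numpy as np

      def count_pulses(samples, rate_hz, threshold=0.3, holdoff_s=2e-4):
          """Count GM pulses in an audio buffer and return (count, mean count rate in Hz).

          `samples` is a 1-D array of normalized sound-card samples; `holdoff_s`
          ignores re-crossings within one pulse width.
          """
          above = samples > threshold
          rising = np.flatnonzero(above[1:] & ~above[:-1]) + 1     # rising edges
          holdoff = int(holdoff_s * rate_hz)
          pulses, last = [], -holdoff
          for i in rising:
              if i - last >= holdoff:
                  pulses.append(i)
                  last = i
          duration = len(samples) / rate_hz
          return len(pulses), len(pulses) / duration

      # Synthetic test: about 100 random pulses in 10 s of 44.1 kHz "audio"
      rate = 44100
      buf = np.zeros(rate * 10)
      buf[np.random.randint(0, buf.size - 5, 100)] = 1.0
      print(count_pulses(buf, rate))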

  20. Sensor data validation and reconstruction. Phase 1: System architecture study

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The sensor validation and data reconstruction task reviewed relevant literature and selected applicable validation and reconstruction techniques for further study; analyzed the selected techniques and emphasized those which could be used for both validation and reconstruction; analyzed Space Shuttle Main Engine (SSME) hot fire test data to determine statistical and physical relationships between various parameters; developed statistical and empirical correlations between parameters to perform validation and reconstruction tasks, using a computer aided engineering (CAE) package; and conceptually designed an expert system based knowledge fusion tool, which allows the user to relate diverse types of information when validating sensor data. The host hardware for the system is intended to be a Sun SPARCstation, but could be any RISC workstation with a UNIX operating system and a windowing/graphics system such as Motif or Dataviews. The information fusion tool is intended to be developed using the NEXPERT Object expert system shell, and the C programming language.

  1. Navy Littoral Combat Ship (LCS)/Frigate Program: Background and Issues for Congress

    DTIC Science & Technology

    2015-09-23

    Defense Daily, June 2, 2014: 4-5; Michael Fabey, “Robust Air Defense Not Needed In New Frigates, Studies Show,” Aerospace Daily & Defense Report...the frigate, according to an Aug. 7 notice posted to the Federal Business Opportunities website.48 Technical Risk and Issues Relating to Program...the Pentagon’s top test and evaluation officer. “Recent developmental testing provides no statistical evidence that the system is demonstrating

  2. 75 FR 24718 - Guidance for Industry on Documenting Statistical Analysis Programs and Data Files; Availability

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-05

    ...] Guidance for Industry on Documenting Statistical Analysis Programs and Data Files; Availability AGENCY... Programs and Data Files.'' This guidance is provided to inform study statisticians of recommendations for documenting statistical analyses and data files submitted to the Center for Veterinary Medicine (CVM) for the...

  3. 2009 GED Testing Program Statistical Report

    ERIC Educational Resources Information Center

    GED Testing Service, 2010

    2010-01-01

    The "2009 GED[R] Testing Program Statistical Report" is the 52nd annual report in the program's 68-year history of providing a second opportunity for adults without a high school credential to earn their jurisdiction's GED credential. The report provides candidate demographic and GED Test performance statistics as well as historical…

  4. Implementation of a data management software system for SSME test history data

    NASA Technical Reports Server (NTRS)

    Abernethy, Kenneth

    1986-01-01

    The implementation of a software system for managing Space Shuttle Main Engine (SSME) test/flight historical data is presented. The software system uses the database management system RIM7 for primary data storage and routine data management, but includes several FORTRAN programs, described here, which provide customized access to the RIM7 database. The consolidation, modification, and transfer of data from the database THIST, to the RIM7 database THISRM is discussed. The RIM7 utility modules for generating some standard reports from THISRM and performing some routine updating and maintenance are briefly described. The FORTRAN accessing programs described include programs for initial loading of large data sets into the database, capturing data from files for database inclusion, and producing specialized statistical reports which cannot be provided by the RIM7 report generator utility. An expert system tutorial, constructed using the expert system shell product INSIGHT2, is described. Finally, a potential expert system, which would analyze data in the database, is outlined. This system could use INSIGHT2 as well and would take advantage of RIM7's compatibility with the microcomputer database system RBase 5000.

  5. An outlook for cargo aircraft of the future. [assessment of the future of air cargo by analyzing statistics and trends

    NASA Technical Reports Server (NTRS)

    Nicks, O. W.; Whitehead, A. H., Jr.; Alford, W. J., Jr.

    1975-01-01

    An assessment is provided of the future of air cargo by analyzing air cargo statistics and trends, by noting air cargo system problems and inefficiencies, by analyzing characteristics of air-eligible commodities, and by showing the promise of new technology for future cargo aircraft with significant improvements in costs and efficiency. NASA's proposed program is reviewed which would sponsor the research needed to provide for development of advanced designs by 1985.

  6. An Assessment Blueprint for EncStat: A Statistics Anxiety Intervention Program.

    ERIC Educational Resources Information Center

    Watson, Freda S.; Lang, Thomas R.; Kromrey, Jeffrey D.; Ferron, John M.; Hess, Melinda R.; Hogarty, Kristine Y.

    EncStat (Encouraged about Statistics) is a multimedia program being developed to identify and assist students with statistics anxiety or negative attitudes about statistics. This study explored the validity of the assessment instruments included in EncStat with respect to their diagnostic value for statistics anxiety and negative attitudes about…

  7. The Impact of New Technology on Accounting Education.

    ERIC Educational Resources Information Center

    Shaoul, Jean

    The introduction of computers in the Department of Accounting and Finance at Manchester University is described. General background outlining the increasing need for microcomputers in the accounting curriculum (including financial modelling tools and decision support systems such as linear programming, statistical packages, and simulation) is…

  8. University Safety Culture: A Work-in-Progress?

    ERIC Educational Resources Information Center

    Lyons, Michael

    2016-01-01

    Safety management systems in Australian higher education organisations are under-researched. Limited workplace safety information can be found in the various reports on university human resources benchmarking programs, and typically they show only descriptive statistics. With the commencement of new consultation-focused regulations applying to…

  9. An argument for mechanism-based statistical inference in cancer

    PubMed Central

    Ochs, Michael; Price, Nathan D.; Tomasetti, Cristian; Younes, Laurent

    2015-01-01

    Cancer is perhaps the prototypical systems disease, and as such has been the focus of extensive study in quantitative systems biology. However, translating these programs into personalized clinical care remains elusive and incomplete. In this perspective, we argue that realizing this agenda—in particular, predicting disease phenotypes, progression and treatment response for individuals—requires going well beyond standard computational and bioinformatics tools and algorithms. It entails designing global mathematical models over network-scale configurations of genomic states and molecular concentrations, and learning the model parameters from limited available samples of high-dimensional and integrative omics data. As such, any plausible design should accommodate: biological mechanism, necessary for both feasible learning and interpretable decision making; stochasticity, to deal with uncertainty and observed variation at many scales; and a capacity for statistical inference at the patient level. This program, which requires a close, sustained collaboration between mathematicians and biologists, is illustrated in several contexts, including learning bio-markers, metabolism, cell signaling, network inference and tumorigenesis. PMID:25381197

  10. Anima: Modular Workflow System for Comprehensive Image Data Analysis

    PubMed Central

    Rantanen, Ville; Valori, Miko; Hautaniemi, Sampsa

    2014-01-01

    Modern microscopes produce vast amounts of image data, and computational methods are needed to analyze and interpret these data. Furthermore, a single image analysis project may require tens or hundreds of analysis steps, starting from data import and pre-processing to segmentation and statistical analysis, and ending with visualization and reporting. To manage such large-scale image data analysis projects, we present here a modular workflow system called Anima. Anima is designed for comprehensive and efficient image data analysis development, and it contains several features that are crucial in high-throughput image data analysis: programming language independence, batch processing, easily customized data processing, interoperability with other software via application programming interfaces, and advanced multivariate statistical analysis. The utility of Anima is shown with two case studies focusing on testing different algorithms developed in different imaging platforms and an automated prediction of alive/dead C. elegans worms by integrating several analysis environments. Anima is fully open source and available with documentation at www.anduril.org/anima. PMID:25126541

  11. SHAREv2: fluctuations and a comprehensive treatment of decay feed-down

    NASA Astrophysics Data System (ADS)

    Torrieri, G.; Jeon, S.; Letessier, J.; Rafelski, J.

    2006-11-01

    This is the user's manual for SHARE version 2. SHARE [G. Torrieri, S. Steinke, W. Broniowski, W. Florkowski, J. Letessier, J. Rafelski, Comput. Phys. Comm. 167 (2005) 229] (Statistical Hadronization with Resonances) is a collection of programs designed for the statistical analysis of particle production in relativistic heavy-ion collisions. While the structure of the program remains similar to v1.x, v2 provides several new features such as evaluation of statistical fluctuations of particle yields, and a greater versatility, in particular regarding decay feed-down and input/output structure. This article describes all the new features, with emphasis on statistical fluctuations. Program summary: Title of program: SHAREv2. Catalogue identifier: ADVD_v2_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVD_v2_0. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Computer: PC, Pentium III, 512 MB RAM (not hardware dependent); any computer with an f77 compiler. Operating system: Linux (RedHat 6.1, 7.2, FEDORA, etc.); not system dependent. Programming language: FORTRAN77. Size of the package: 167 KB directory, without libraries (see http://wwwasdoc.web.cern.ch/wwwasdoc/minuit/minmain.html, http://wwwasd.web.cern.ch/wwwasd/cernlib.html for details on library requirements). Number of lines in distributed program, including test data, etc.: 26 101. Number of bytes in distributed program, including test data, etc.: 170 346. Distribution format: tar.gzip file. Nature of the physical problem: Event-by-event fluctuations have been recognized to be the physical observable capable of constraining particle production models. Therefore, consideration of event-by-event fluctuations is required for a decisive falsification or constraining of (variants of) particle production models based on (grand-, micro-) canonical statistical mechanics phase space, the so-called statistical hadronization models (SHM). As in the case of particle yields, to properly compare model calculations to data it is necessary to consistently take into account resonance decays. However, event-by-event fluctuations are more sensitive than particle yields to experimental acceptance issues, and a range of techniques needs to be implemented to extract 'physical' fluctuations from an experimental event-by-event measurement. Method of solving the problem: The techniques used within the SHARE suite of programs [G. Torrieri, S. Steinke, W. Broniowski, W. Florkowski, J. Letessier, J. Rafelski, Comput. Phys. Comm. 167 (2005) 229; SHAREv1] are updated and extended to fluctuations. A full particle data table, decay tree, and set of experimental feed-down coefficients are provided. Unlike SHAREv1.x, experimental acceptance feed-down coefficients can be entered for any resonance decay. SHAREv2 can calculate yields, fluctuations, and bulk properties of the fireball from provided thermal parameters; alternatively, parameters can be obtained from fits to experimental data via the MINUIT fitting algorithm [F. James, M. Roos, Comput. Phys. Comm. 10 (1975) 343]. Fits can also be analyzed for significance, parameter and data point sensitivity. Averages and fluctuations at freeze-out of both the stable particles and the hadronic resonances are set according to a statistical prescription, calculated via a series of Bessel functions, using CERN library programs. We also have the option of including finite particle widths of the resonances.
A χ² minimization algorithm, also from the CERN library programs, is used to perform and analyze the fit. Please see SHAREv1 for more details on these. Purpose: The vast amount of high quality soft hadron production data, from experiments running at the SPS, RHIC, in the past at the AGS, and in the near future at the LHC, offers the opportunity for statistical particle production model falsification. This task has turned out to be difficult when considering solely particle yields, addressed in the context of SHAREv1.x. For this reason physical conditions at freeze-out remain contested. Inclusion in the analysis of event-by-event fluctuations appears to resolve this issue. Similarly, a thorough analysis including both fluctuations and average multiplicities gives a way to explore the presence and strength of interactions following hadronization (when hadrons form), ending with thermal freeze-out (when all interactions cease). SHAREv2 with fluctuations will also help determine which statistical ensemble (if any), e.g., canonical or grand-canonical, is more physically appropriate for analyzing a given system. Together with resonances, fluctuations can also be used for a direct estimate of the extent to which the system re-interacts between chemical and thermal freeze-out. We hope and expect that SHAREv2 will contribute to deciding if any of the statistical hadronization model variants has a genuine physical connection to hadron particle production. Computation time survey: In the FORTRAN version, computation times are up to seconds for evaluation of particle yields. These rise by up to a factor of 300 in the process of minimization and a further factor of a few when χ²/N profiles and contours with chemical non-equilibrium are requested. Summary of new features (w.r.t. SHAREv1.x): Fluctuations: In addition to particle yields, ratios and bulk quantities, SHAREv2 can calculate, fit and analyze statistical fluctuations of particles and particle ratios. Decays: SHAREv2 has the flexibility to account for any experimental method of allowing for decay feed-downs to the particle yields. Charm flavor: Charmed particles have been added to the decay tree, allowing as an option the study of statistical hadronization of J/ψ, χ, D, etc. Quark chemistry: Chemical non-equilibrium yields for both u and d flavors, as opposed to generically light quarks q, are considered; η-η′ mixing, etc., are properly dealt with, and chemical non-equilibrium can be studied for each flavor separately. Misc: Many new commands and features have been introduced and added to the basic user interface. For example, it is possible to study combinations of particles and their ratios. It is also possible to combine all the input files into one file. SHARE compatibility and manual: This write-up is an update and extension of SHAREv1. The user should consult SHAREv1 regarding the principles of the user interface and for all particle yield related physics and program instructions, other than the parameter additions and minor changes described here. SHAREv2 is downward compatible with respect to the changes of the user interface, offering the user of SHAREv1 computer-generated revised input files compatible with SHAREv2.

  12. [The main directions of reforming the service of medical statistics in Ukraine].

    PubMed

    Golubchykov, Mykhailo V; Orlova, Nataliia M; Bielikova, Inna V

    2018-01-01

    Introduction: The implementation of new methods of information support for managerial decision-making should ensure effective health system reform and create conditions for improving the quality of operational management, sound planning of medical care, and more efficient use of system resources. Reform of the Medical Statistics Service of Ukraine should be considered only in the context of the reform of the entire health system. The aim: This work analyzes the current situation and justifies the main directions for reforming the Medical Statistics Service of Ukraine. Material and methods: The work uses a range of methods: content analysis, bibliosemantic analysis, and a systematic approach. The information base of the research comprised WHO strategic and program documents and data of the Medical Statistics Center of the Ministry of Health of Ukraine. Review: The Medical Statistics Service of Ukraine has a complete and effective structure, headed by the State Institution "Medical Statistics Center of the Ministry of Health of Ukraine." This institution reports on behalf of the Ministry of Health of Ukraine to the State Statistical Service of Ukraine, the WHO European Office, and other international organizations. An analysis of the current situation showed that, to achieve this goal, it is necessary to: improve the system of statistical indicators for an adequate assessment of the performance of health institutions, including in the economic aspect; create a well-developed medical and statistical base for the administrative territories; change the existing technologies for forming information resources; strengthen the material and technical base of the structural units of the Medical Statistics Service; improve the system of training and retraining of personnel for the medical statistics service; develop international cooperation in the methodology and practice of medical statistics and implement internationally accepted methods for collecting, processing, analyzing, and disseminating medical and statistical information; and create a medical and statistical service that is adapted to the specifics of market relations in health care and is flexible and responsive to changes in international methodologies and standards. Conclusions: Medical statistics data are the basis for managerial decisions by managers at all levels of health care. Reform of the Medical Statistics Service of Ukraine should be considered only in the context of the reform of the entire health system. The main directions of the reform of the medical statistics service in Ukraine are: the introduction of information technologies, improved training of personnel for the service, improved material and technical equipment, and the maximum reuse of the data obtained, which requires the unification of primary data and of the system of indicators. The most difficult area is the formation of information funds and the introduction of modern information technologies.

  14. Teaching Statistics in APA-Accredited Doctoral Programs in Clinical and Counseling Psychology: A Syllabi Review

    ERIC Educational Resources Information Center

    Ord, Anna S.; Ripley, Jennifer S.; Hook, Joshua; Erspamer, Tiffany

    2016-01-01

    Although statistical methods and research design are crucial areas of competency for psychologists, few studies explore how statistics are taught across doctoral programs in psychology in the United States. The present study examined 153 American Psychological Association-accredited doctoral programs in clinical and counseling psychology and aimed…

  15. GenomeGraphs: integrated genomic data visualization with R.

    PubMed

    Durinck, Steffen; Bullard, James; Spellman, Paul T; Dudoit, Sandrine

    2009-01-06

    Biological studies involve a growing number of distinct high-throughput experiments to characterize samples of interest. There is a lack of methods to visualize these different genomic datasets in a versatile manner. In addition, genomic data analysis requires integrated visualization of experimental data along with constantly changing genomic annotation and statistical analyses. We developed GenomeGraphs, as an add-on software package for the statistical programming environment R, to facilitate integrated visualization of genomic datasets. GenomeGraphs uses the biomaRt package to perform on-line annotation queries to Ensembl and translates these to gene/transcript structures in viewports of the grid graphics package. This allows genomic annotation to be plotted together with experimental data. GenomeGraphs can also be used to plot custom annotation tracks in combination with different experimental data types together in one plot using the same genomic coordinate system. GenomeGraphs is a flexible and extensible software package which can be used to visualize a multitude of genomic datasets within the statistical programming environment R.

  16. ASTEP user's guide and software documentation

    NASA Technical Reports Server (NTRS)

    Gliniewicz, A. S.; Lachowski, H. M.; Pace, W. H., Jr.; Salvato, P., Jr.

    1974-01-01

    The Algorithm Simulation Test and Evaluation Program (ASTEP) is a modular computer program developed for the purpose of testing and evaluating methods of processing remotely sensed multispectral scanner earth resources data. ASTEP is written in FORTRAN V on the UNIVAC 1110 under the EXEC 8 operating system and may be operated in either a batch or interactive mode. The program currently contains over one hundred subroutines consisting of data classification and display algorithms, statistical analysis algorithms, utility support routines, and feature selection capability. The current program can accept data in LARSC1, LARSC2, ERTS, and Universal formats, and can output processed image or data tapes in Universal format.

  17. Communications oriented programming of parallel iterative solutions of sparse linear systems

    NASA Technical Reports Server (NTRS)

    Patrick, M. L.; Pratt, T. W.

    1986-01-01

    Parallel algorithms are developed for a class of scientific computational problems by partitioning the problems into smaller problems which may be solved concurrently. The effectiveness of the resulting parallel solutions is determined by the amount and frequency of communication and synchronization and the extent to which communication can be overlapped with computation. Three different parallel algorithms for solving the same class of problems are presented, and their effectiveness is analyzed from this point of view. The algorithms are programmed using a new programming environment. Run-time statistics and experience obtained from the execution of these programs assist in measuring the effectiveness of these algorithms.
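
    For a concrete picture of the partitioning idea, the sketch below (illustrative only, and not the programming environment described in the record) runs a block-Jacobi iteration on a 1-D Laplacian: each block could update its interior concurrently, and the only values that need to be communicated each sweep are the edge entries shared with neighbouring blocks.

      import numpy as np

      def block_jacobi_1d(f, n_blocks=4, sweeps=200, h=1.0):
          """Jacobi sweeps for -u'' = f on a uniform grid, organized in blocks.

          Each block could run on its own processor; per sweep a block only needs
          the neighbouring blocks' edge values (the "communication" step).
          """
          n = f.size
          u = np.zeros(n)
          edges = np.linspace(0, n, n_blocks + 1).astype(int)
          for _ in range(sweeps):
              u_old = u.copy()                  # stands in for the exchanged halo values
              for b in range(n_blocks):
                  lo, hi = edges[b], edges[b + 1]
                  for i in range(lo, hi):
                      left = u_old[i - 1] if i > 0 else 0.0       # boundary condition u = 0
                      right = u_old[i + 1] if i < n - 1 else 0.0
                      u[i] = 0.5 * (left + right + h * h * f[i])
          return u

      print(block_jacobi_1d(np.ones(32))[:5])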

  18. Research on Secure Systems and Automatic Programming. Volume I

    DTIC Science & Technology

    1977-10-14

    for the enforcement of adherence to authorization; they include physical limitations, legal codes, social pressures, and the psychological makeup of...systems job statistics and possibly indications of an abnormal termination...support instructions. The criteria for their inclusion were high execution...interrupt processes, however, use the standard SWITCH PROCESS instruction...for the output data page. Jobs may also terminate abnormally by executing an

  19. Using "WeBWorK," a Web-Based Homework Delivery and Grading System, to Help Prepare Students for Active Learning

    ERIC Educational Resources Information Center

    Lucas, Adam R.

    2012-01-01

    "WeBWorK," an online homework system, can be be used to deliver daily reading questions to students. The author studied its use for this purpose with a lower division Introduction to Programming course and an upper division Probability and Statistics course. In the lower division course, "WeBWorK" significantly improved peer…

  20. Structural health monitoring feature design by genetic programming

    NASA Astrophysics Data System (ADS)

    Harvey, Dustin Y.; Todd, Michael D.

    2014-09-01

    Structural health monitoring (SHM) systems provide real-time damage and performance information for civil, aerospace, and other high-capital or life-safety critical structures. Conventional data processing involves pre-processing and extraction of low-dimensional features from in situ time series measurements. The features are then input to a statistical pattern recognition algorithm to perform the relevant classification or regression task necessary to facilitate decisions by the SHM system. Traditional design of signal processing and feature extraction algorithms can be an expensive and time-consuming process requiring extensive system knowledge and domain expertise. Genetic programming, a heuristic program search method from evolutionary computation, was recently adapted by the authors to perform automated, data-driven design of signal processing and feature extraction algorithms for statistical pattern recognition applications. The proposed method, called Autofead, is particularly suitable to handle the challenges inherent in algorithm design for SHM problems where the manifestation of damage in structural response measurements is often unclear or unknown. Autofead mines a training database of response measurements to discover information-rich features specific to the problem at hand. This study provides experimental validation on three SHM applications including ultrasonic damage detection, bearing damage classification for rotating machinery, and vibration-based structural health monitoring. Performance comparisons with common feature choices for each problem area are provided demonstrating the versatility of Autofead to produce significant algorithm improvements on a wide range of problems.
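
    As a toy illustration of the data-driven feature-search idea only (not the Autofead implementation), the sketch below randomly composes short chains of standard signal operations, scores each chain by how well the resulting scalar feature separates labelled "healthy" and "damaged" response records, and keeps the best chain. Genetic programming would add crossover and mutation on top of this random search; the operators, reducers, and synthetic data here are all assumptions made for the example.

      import numpy as np

      rng = np.random.default_rng(0)
      OPS = {"abs": np.abs, "diff": np.diff, "square": np.square}
      REDUCERS = {
          "rms": lambda x: np.sqrt(np.mean(x ** 2)),
          "peak_factor": lambda x: np.max(np.abs(x)) / (np.sqrt(np.mean(x ** 2)) + 1e-12),
      }

      def random_pipeline():
          """A candidate feature: a short chain of operators followed by one reducer."""
          chain = rng.choice(list(OPS), size=rng.integers(1, 4)).tolist()
          return chain, str(rng.choice(list(REDUCERS)))

      def fitness(pipeline, signals, labels):
          """Separation of the two class means relative to the pooled spread."""
          chain, reducer = pipeline
          feats = []
          for s in signals:
              x = s
              for op in chain:
                  x = OPS[op](x)
              feats.append(REDUCERS[reducer](x))
          feats = np.asarray(feats)
          a, b = feats[labels == 0], feats[labels == 1]
          return abs(a.mean() - b.mean()) / (a.std() + b.std() + 1e-12)

      # Synthetic "healthy" vs "damaged" records; damage adds sparse impulsive spikes.
      healthy = [rng.normal(0, 1, 1024) for _ in range(20)]
      damaged = [rng.normal(0, 1, 1024) + (rng.random(1024) < 0.01) * 5.0 for _ in range(20)]
      signals, labels = healthy + damaged, np.array([0] * 20 + [1] * 20)

      best = max((random_pipeline() for _ in range(200)),
                 key=lambda p: fitness(p, signals, labels))
      print(best, fitness(best, signals, labels))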

  1. Improved Anti-Submarine Warfare (ASW) Effectiveness MSSE Capstone Project

    DTIC Science & Technology

    2008-06-01

    9 - Barrier System RMA Data Component MTBF MTTR Source Buoy 9,600 Hours Not Repairable During Mission [Ref 70, Lumpkin and Pazos, 2004...and Mayra Pazos, “Lifetime Statistics of Most Recent Drifter Deployments (2002-2003),” Global Drifter Program/ Drifter Data Assembly Center, NOAA

  2. University of Alaska 1984 Statistical Summary.

    ERIC Educational Resources Information Center

    Spargo, Frank R.; Gaylord, Thomas A.

    Designed to inform decisions about the University of Alaska's (UA's) budget, direction, scope, and academic thrusts, this report provides statewide, unit, and campus data for the two- and four-year colleges in the university system. First, a systemwide summary offers information on finances, enrollments, student loan program participation,…

  3. Alternatives for the Disruptive and Delinquent: New Systems or New Teachers?

    ERIC Educational Resources Information Center

    Bell, Raymond

    1975-01-01

    No one would disagree that delinquency and more violent crimes are increasing in the nation's schools. To combat the grim statistics, this author has some concrete suggestions. If your school is considering alternative programs for the alienated, here are some pitfalls to avoid. (Editor)

  4. Center for Prostate Disease Research

    MedlinePlus

  5. Notes on numerical reliability of several statistical analysis programs

    USGS Publications Warehouse

    Landwehr, J.M.; Tasker, Gary D.

    1999-01-01

    This report presents a benchmark analysis of several statistical analysis programs currently in use in the USGS. The benchmark consists of a comparison between the values provided by a statistical analysis program for variables in the reference data set ANASTY and their known or calculated theoretical values. The ANASTY data set is an amendment of the Wilkinson NASTY data set that has been used in the statistical literature to assess the reliability (computational correctness) of calculated analytical results.

  6. Statistical Software and Artificial Intelligence: A Watershed in Applications Programming.

    ERIC Educational Resources Information Center

    Pickett, John C.

    1984-01-01

    AUTOBJ and AUTOBOX are revolutionary software programs which contain the first application of artificial intelligence to statistical procedures used in analysis of time series data. The artificial intelligence included in the programs and program features are discussed. (JN)

  7. Evaluating observations in the context of predictions for the death valley regional groundwater system

    USGS Publications Warehouse

    Ely, D.M.; Hill, M.C.; Tiedeman, C.R.; O'Brien, G. M.

    2004-01-01

    When a model is calibrated by nonlinear regression, calculated diagnostic and inferential statistics provide a wealth of information about many aspects of the system. This work uses linear inferential statistics that are measures of prediction uncertainty to investigate the likely importance of continued monitoring of hydraulic head to the accuracy of model predictions. The measurements evaluated are hydraulic heads; the predictions of interest are subsurface transport from 15 locations. The advective component of transport is considered because it is the component most affected by the system dynamics represented by the regional-scale model being used. The problem is addressed using the capabilities of the U.S. Geological Survey computer program MODFLOW-2000, with its Advective Travel Observation (ADV) Package. Copyright ASCE 2004.

  8. Sets, Probability and Statistics: The Mathematics of Life Insurance. [Computer Program.] Second Edition.

    ERIC Educational Resources Information Center

    King, James M.; And Others

    The materials described here represent the conversion of a highly popular student workbook "Sets, Probability and Statistics: The Mathematics of Life Insurance" into a computer program. The program is designed to familiarize students with the concepts of sets, probability, and statistics, and to provide practice using real life examples. It also…

  9. Why Wait? The Influence of Academic Self-Regulation, Intrinsic Motivation, and Statistics Anxiety on Procrastination in Online Statistics

    ERIC Educational Resources Information Center

    Dunn, Karee

    2014-01-01

    Online graduate education programs are expanding rapidly. Many of these programs require a statistics course, resulting in an increasing need for online statistics courses. The study reported here grew from experiences teaching online, graduate statistics courses. In seeking answers on how to improve this class, I discovered that research has yet…

  10. SAP- FORTRAN STATIC SOURCE CODE ANALYZER PROGRAM (IBM VERSION)

    NASA Technical Reports Server (NTRS)

    Manteufel, R.

    1994-01-01

    The FORTRAN Static Source Code Analyzer program, SAP, was developed to automatically gather statistics on the occurrences of statements and structures within a FORTRAN program and to provide for the reporting of those statistics. Provisions have been made for weighting each statistic and to provide an overall figure of complexity. Statistics, as well as figures of complexity, are gathered on a module by module basis. Overall summed statistics are also accumulated for the complete input source file. SAP accepts as input syntactically correct FORTRAN source code written in the FORTRAN 77 standard language. In addition, code written using features in the following languages is also accepted: VAX-11 FORTRAN, IBM S/360 FORTRAN IV Level H Extended; and Structured FORTRAN. The SAP program utilizes two external files in its analysis procedure. A keyword file allows flexibility in classifying statements and in marking a statement as either executable or non-executable. A statistical weight file allows the user to assign weights to all output statistics, thus allowing the user flexibility in defining the figure of complexity. The SAP program is written in FORTRAN IV for batch execution and has been implemented on a DEC VAX series computer under VMS and on an IBM 370 series computer under MVS. The SAP program was developed in 1978 and last updated in 1985.
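
    The record describes a classic keyword-counting analyzer: tally statement types per module, weight them, and sum the weighted counts into a figure of complexity. Purely as an illustration of that scheme (not the NASA SAP code, and with made-up weights standing in for SAP's external weight file), a small Python sketch:

      import re
      from collections import Counter

      # Hypothetical statistic weights, analogous to SAP's external weight file.
      WEIGHTS = {"IF": 2.0, "GOTO": 3.0, "DO": 1.5, "CALL": 1.0, "ASSIGNMENT": 0.5}
      KEYWORDS = ("IF", "GOTO", "DO", "CALL")

      def analyze(fortran_source):
          """Count statement types in fixed-form FORTRAN and return (counts, complexity)."""
          counts = Counter()
          for line in fortran_source.upper().splitlines():
              if not line.strip() or line[:1] in ("C", "*", "!"):
                  continue                              # skip blank and comment lines
              stmt = line[6:].strip()                   # drop the fixed-form label field
              for kw in KEYWORDS:
                  if stmt.startswith(kw):
                      counts[kw] += 1
                      break
              else:
                  if "=" in stmt:
                      counts["ASSIGNMENT"] += 1
          complexity = sum(WEIGHTS.get(k, 1.0) * n for k, n in counts.items())
          return counts, complexity

      src = """      PROGRAM DEMO
            X = 1.0
            DO 10 I = 1, 5
            X = X + I
         10 CONTINUE
            IF (X .GT. 10.0) CALL REPORT(X)
            END"""
      print(analyze(src))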

  11. SAP- FORTRAN STATIC SOURCE CODE ANALYZER PROGRAM (DEC VAX VERSION)

    NASA Technical Reports Server (NTRS)

    Merwarth, P. D.

    1994-01-01

    The FORTRAN Static Source Code Analyzer program, SAP, was developed to automatically gather statistics on the occurrences of statements and structures within a FORTRAN program and to provide for the reporting of those statistics. Provisions have been made for weighting each statistic and to provide an overall figure of complexity. Statistics, as well as figures of complexity, are gathered on a module by module basis. Overall summed statistics are also accumulated for the complete input source file. SAP accepts as input syntactically correct FORTRAN source code written in the FORTRAN 77 standard language. In addition, code written using features in the following languages is also accepted: VAX-11 FORTRAN, IBM S/360 FORTRAN IV Level H Extended; and Structured FORTRAN. The SAP program utilizes two external files in its analysis procedure. A keyword file allows flexibility in classifying statements and in marking a statement as either executable or non-executable. A statistical weight file allows the user to assign weights to all output statistics, thus allowing the user flexibility in defining the figure of complexity. The SAP program is written in FORTRAN IV for batch execution and has been implemented on a DEC VAX series computer under VMS and on an IBM 370 series computer under MVS. The SAP program was developed in 1978 and last updated in 1985.

  12. Method and system for knowledge discovery using non-linear statistical analysis and a 1st and 2nd tier computer program

    DOEpatents

    Hively, Lee M [Philadelphia, TN]

    2011-07-12

    The invention relates to a method and apparatus for simultaneously processing different sources of test data into informational data and then processing different categories of informational data into knowledge-based data. The knowledge-based data can then be communicated between nodes in a system of multiple computers according to rules for a type of complex, hierarchical computer system modeled on a human brain.

  13. A Robot-Based Tool for Physical and Cognitive Rehabilitation of Elderly People Using Biofeedback

    PubMed Central

    Lopez-Samaniego, Leire; Garcia-Zapirain, Begonya

    2016-01-01

    This publication presents a complete description of a technological solution for the physical and cognitive rehabilitation of elderly people through a biofeedback system combined with a Lego robot. The technology used was iOS's (iPhone Operating System) Objective-C programming language and its Xcode programming environment, with SQLite used to create the database. The biofeedback system is implemented with two biosensors: a Microsoft Band 2 to register the user's heart rate and a MYO sensor to detect the user's arm movement. Finally, the system was tested with seven elderly people from the La Santa y Real Casa de la Misericordia nursing home in Bilbao. The statistical assessment showed that the users are satisfied with the usability of the system, with a mean score of 79.29 on the System Usability Scale (SUS) questionnaire. PMID:27886146

  14. Statistical Teleodynamics: Toward a Theory of Emergence.

    PubMed

    Venkatasubramanian, Venkat

    2017-10-24

    The central scientific challenge of the 21st century is developing a mathematical theory of emergence that can explain and predict phenomena such as consciousness and self-awareness. The most successful research program of the 20th century, reductionism, which goes from the whole to parts, seems unable to address this challenge. This is because addressing this challenge inherently requires an opposite approach, going from parts to the whole. In addition, reductionism, by the very nature of its inquiry, typically does not concern itself with teleology or purposeful behavior. Modeling emergence, in contrast, requires the addressing of teleology. Together, these two requirements present a formidable challenge in developing a successful mathematical theory of emergence. In this article, I describe a new theory of emergence, called statistical teleodynamics, that addresses certain aspects of the general problem. Statistical teleodynamics is a mathematical framework that unifies three seemingly disparate domains (purpose-free entities in statistical mechanics, human-engineered teleological systems in systems engineering, and nature-evolved teleological systems in biology and sociology) within the same conceptual formalism. This theory rests on several key conceptual insights, the most important one being the recognition that entropy mathematically models the concept of fairness in economics and philosophy and, equivalently, the concept of robustness in systems engineering. These insights help prove that the fairest inequality of income is a log-normal distribution, which will emerge naturally at equilibrium in an ideal free market society. Similarly, the theory predicts the emergence of the three classes of network organization (exponential, scale-free, and Poisson) seen widely in a variety of domains. Statistical teleodynamics is the natural generalization of statistical thermodynamics, the most successful parts-to-whole systems theory to date, but this generalization is only a modest step toward a more comprehensive mathematical theory of emergence.

  15. Vega roll and attitude control system algorithms trade-off study

    NASA Astrophysics Data System (ADS)

    Paulino, N.; Cuciniello, G.; Cruciani, I.; Corraro, F.; Spallotta, D.; Nebula, F.

    2013-12-01

    This paper describes the trade-off study for the selection of the most suitable algorithms for the Roll and Attitude Control System (RACS) within the FPS-A program, aimed at developing the new Flight Program Software of VEGA Launcher. Two algorithms were analyzed: Switching Lines (SL) and Quaternion Feedback Regulation. Using a development simulation tool that models two critical flight phases (Long Coasting Phase (LCP) and Payload Release (PLR) Phase), both algorithms were assessed with Monte Carlo batch simulations for both of the phases. The statistical outcomes of the results demonstrate a 100 percent success rate for Quaternion Feedback Regulation, and support the choice of this method.

  16. KERNELHR: A program for estimating animal home ranges

    USGS Publications Warehouse

    Seaman, D.E.; Griffith, B.; Powell, R.A.

    1998-01-01

    Kernel methods are state of the art for estimating animal home-range area and utilization distribution (UD). The KERNELHR program was developed to provide researchers and managers a tool to implement this extremely flexible set of methods with many variants. KERNELHR runs interactively or from the command line on any personal computer (PC) running DOS. KERNELHR provides output of fixed and adaptive kernel home-range estimates, as well as density values in a format suitable for in-depth statistical and spatial analyses. An additional package of programs creates contour files for plotting in geographic information systems (GIS) and estimates core areas of ranges.
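
    The underlying computation is a two-dimensional kernel density estimate over relocation points, with home-range area taken as the region enclosing a chosen fraction of the utilization distribution. The sketch below covers only the fixed-kernel case (KERNELHR itself adds adaptive kernels, GIS contour output, and more) and uses scipy's default plug-in bandwidth; the simulated relocations are made up for the example.

      import numpy as np
      from scipy.stats import gaussian_kde

      def fixed_kernel_homerange(x, y, iso=0.95, grid_n=200):
          """Approximate area (in squared coordinate units) of the `iso` UD contour."""
          kde = gaussian_kde(np.vstack([x, y]))            # fixed-kernel density estimate
          pad = 0.2 * (x.max() - x.min() + y.max() - y.min())
          gx = np.linspace(x.min() - pad, x.max() + pad, grid_n)
          gy = np.linspace(y.min() - pad, y.max() + pad, grid_n)
          X, Y = np.meshgrid(gx, gy)
          dens = kde(np.vstack([X.ravel(), Y.ravel()]))
          cell = (gx[1] - gx[0]) * (gy[1] - gy[0])
          order = np.argsort(dens)[::-1]                   # densest cells first
          cum = np.cumsum(dens[order]) * cell              # cumulative probability mass
          return np.searchsorted(cum, iso) * cell          # area needed to reach `iso`

      rng = np.random.default_rng(1)
      x, y = rng.normal(0, 100, 300), rng.normal(0, 150, 300)   # simulated relocations (m)
      print(fixed_kernel_homerange(x, y))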

  17. College Freshmen with Disabilities: A Triennial Statistical Profile.

    ERIC Educational Resources Information Center

    Henderson, Cathy

    This monograph uses narrative, tables, and figures to present information on college freshmen with disabilities, based on data collected by the Cooperative Institutional Research Program, a longitudinal study of the American higher education system involving data on some 1,300 institutions, over 7 million students, and about 100,000 faculty.…

  18. DIFAS: Differential Item Functioning Analysis System. Computer Program Exchange

    ERIC Educational Resources Information Center

    Penfield, Randall D.

    2005-01-01

    Differential item functioning (DIF) is an important consideration in assessing the validity of test scores (Camilli & Shepard, 1994). A variety of statistical procedures have been developed to assess DIF in tests of dichotomous (Hills, 1989; Millsap & Everson, 1993) and polytomous (Penfield & Lam, 2000; Potenza & Dorans, 1995) items. Some of these…
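
    For dichotomous items, one classical DIF procedure of the kind such programs implement is the Mantel-Haenszel statistic; the sketch below computes it on simulated data and is purely illustrative (it is not the DIFAS implementation).

        # Mantel-Haenszel DIF sketch for one dichotomous item, stratified on total score.
        # All data are simulated; group 1 is the focal group with a 0.5-logit DIF effect.
        import numpy as np

        rng = np.random.default_rng(7)
        n = 2000
        group = rng.integers(0, 2, n)                 # 0 = reference, 1 = focal
        ability = rng.normal(0, 1, n)
        total = np.clip(np.round(20 + 5 * ability + rng.normal(0, 2, n)), 0, 40)
        p_item = 1 / (1 + np.exp(-(ability - 0.5 * group)))
        item = rng.binomial(1, p_item)

        num = den = sum_E = sum_V = A_tot = 0.0
        for t in np.unique(total):                    # accumulate over score strata
            m = total == t
            A = np.sum((group[m] == 0) & (item[m] == 1))   # reference correct
            B = np.sum((group[m] == 0) & (item[m] == 0))
            C = np.sum((group[m] == 1) & (item[m] == 1))   # focal correct
            D = np.sum((group[m] == 1) & (item[m] == 0))
            N = A + B + C + D
            if N < 2 or (A + B) == 0 or (C + D) == 0:
                continue
            num += A * D / N
            den += B * C / N
            A_tot += A
            sum_E += (A + B) * (A + C) / N
            sum_V += (A + B) * (C + D) * (A + C) * (B + D) / (N**2 * (N - 1))

        alpha_MH = num / den                                 # common odds ratio
        chi2_MH = (abs(A_tot - sum_E) - 0.5) ** 2 / sum_V    # continuity-corrected chi-square
        print(f"MH common odds ratio = {alpha_MH:.2f}, MH chi-square = {chi2_MH:.2f}")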

  19. Monte Carlo Approach for Reliability Estimations in Generalizability Studies.

    ERIC Educational Resources Information Center

    Dimitrov, Dimiter M.

    A Monte Carlo approach is proposed, using the Statistical Analysis System (SAS) programming language, for estimating reliability coefficients in generalizability theory studies. Test scores are generated by a probabilistic model that considers the probability for a person with a given ability score to answer an item with a given difficulty…
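
    The program described in the paper is written in SAS; the sketch below re-expresses the general idea in Python purely for illustration: scores are generated from a simple probabilistic (Rasch-like) model and the generalizability coefficient of a persons-by-items design is estimated on each Monte Carlo replication.

        # Monte Carlo sketch of a p x i generalizability study (illustrative only).
        import numpy as np

        rng = np.random.default_rng(11)
        n_persons, n_items, n_reps = 200, 25, 500
        g_coefficients = []

        for _ in range(n_reps):
            theta = rng.normal(0.0, 1.0, n_persons)        # person abilities
            b = rng.normal(0.0, 1.0, n_items)              # item difficulties
            p_correct = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))
            scores = rng.binomial(1, p_correct).astype(float)  # persons x items

            # Two-way ANOVA (one observation per cell) for the crossed p x i design.
            grand = scores.mean()
            ss_p = n_items * np.sum((scores.mean(axis=1) - grand) ** 2)
            ss_i = n_persons * np.sum((scores.mean(axis=0) - grand) ** 2)
            ss_tot = np.sum((scores - grand) ** 2)
            ms_p = ss_p / (n_persons - 1)
            ms_res = (ss_tot - ss_p - ss_i) / ((n_persons - 1) * (n_items - 1))

            var_p = max((ms_p - ms_res) / n_items, 0.0)    # person variance component
            var_rel_error = ms_res / n_items               # relative error variance
            g_coefficients.append(var_p / (var_p + var_rel_error))

        print(f"mean generalizability coefficient over {n_reps} replications: "
              f"{np.mean(g_coefficients):.3f}")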

  20. Government-Funded Students and Courses, 2016. Australian Vocational Education and Training Statistics

    ERIC Educational Resources Information Center

    National Centre for Vocational Education Research (NCVER), 2017

    2017-01-01

    This publication provides a summary of data relating to students, programs, subjects and training providers in Australia's government-funded vocational education and training (VET) system (defined as all Commonwealth and state/territory government-funded training delivered by technical and further education [TAFE] institutes, other government…

  1. Government-Funded Students and Courses, 2015. Australian Vocational Education and Training Statistics

    ERIC Educational Resources Information Center

    National Centre for Vocational Education Research (NCVER), 2016

    2016-01-01

    This publication provides a summary of 2015 and time-series data relating to students, programs, subjects, training providers and funding in Australia's government-funded vocational education and training (VET) system (broadly defined as all activity delivered by government providers and government-funded activity delivered by community education…

  2. Australian Vocational Education and Training--Statistics 1999: An Overview.

    ERIC Educational Resources Information Center

    National Centre for Vocational Education Research, Leabrook (Australia).

    Data pertaining to Australia's publicly funded vocational education and training (VET) sector in 1999 were reviewed. Both national-level and state/territory-level data on the following topics were reviewed: VET providers and delivery systems; student characteristics; enrollment trends; program costs and financing mechanisms; and apprentices and…

  3. Use of microcomputers for planning and managing silviculture habitat relationships.

    Treesearch

    B.G. Marcot; R.S. McNay; R.E. Page

    1988-01-01

    Microcomputers aid in monitoring, modeling, and decision support for integrating objectives of silviculture and wildlife habitat management. Spreadsheets, data bases, statistics, and graphics programs are described for use in monitoring. Stand growth models, modeling languages, area and geobased information systems, and optimization models are discussed for use in...

  4. Advanced support systems development and supporting technologies for Controlled Ecological Life Support Systems (CELSS)

    NASA Technical Reports Server (NTRS)

    Simon, William E.; Li, Ku-Yen; Yaws, Carl L.; Mei, Harry T.; Nguyen, Vinh D.; Chu, Hsing-Wei

    1994-01-01

    A methyl acetate reactor was developed to perform a subscale kinetic investigation in the design and optimization of a full-scale metabolic simulator for long term testing of life support systems. Other tasks in support of the closed ecological life support system test program included: (1) heating, ventilation and air conditioning analysis of a variable pressure growth chamber, (2) experimental design for statistical analysis of plant crops, (3) resource recovery for closed life support systems, and (4) development of data acquisition software for automating an environmental growth chamber.

  5. Intelligent guidance and control for wind shear encounter

    NASA Technical Reports Server (NTRS)

    Stengel, Robert F.

    1988-01-01

    The principal objective is to develop methods for assessing the likelihood of wind shear encounter, for deciding what flight path to pursue, and for using the aircraft's full potential for combating wind shear. This study requires the definition of both deterministic and statistical techniques for fusing internal and external information, for making go/no-go decisions, and for generating commands to the aircraft's cockpit displays and autopilot for both manually controlled and automatic flight. The program has begun with the development of a real-time expert system for pilot aiding that is based on the results of the FAA Windshear Training Aids Program. A two-volume manual that presents an overview, pilot guide, training program, and substantiating data provides guidelines for this initial development. The Expert System to Avoid Wind Shear (ESAWS) currently contains over 140 rules and is coded in the LISP programming language for implementation on a Symbolics 3670 LISP machine.

  6. [Development of a multimedia learning DM diet education program using standardized patients and analysis of its effects on clinical competency and learning satisfaction for nursing students].

    PubMed

    Hyun, Kyung Sun; Kang, Hyun Sook; Kim, Won Ock; Park, Sunhee; Lee, Jia; Sok, Sohyune

    2009-04-01

    The purpose of this study was to develop a multimedia learning program for patients with diabetes mellitus (DM) diet education using standardized patients and to examine the effects of the program on educational skills, communication skills, DM diet knowledge and learning satisfaction. The study employed a randomized control posttest non-synchronized design. The participants were 108 third year nursing students (52 experimental group, 56 control group) at K university in Seoul, Korea. The experimental group had regular lectures and the multimedia learning program for DM diet education using standardized patients while the control group had regular lectures only. The DM educational skills were measured by trained research assistants. The students who received the multimedia learning program scored higher for DM diet educational skills, communication skills and DM diet knowledge compared to the control group. Learning satisfaction of the experimental group was higher than the control group, but statistically insignificant. Clinical competency was improved for students receiving the multimedia learning program for DM diet education using standardized patients, but there was no statistically significant effect on learning satisfaction. In the nursing education system there is a need to develop and apply more multimedia materials for education and to use standardized patients effectively.

  7. Research Education in Undergraduate Occupational Therapy Programs.

    ERIC Educational Resources Information Center

    Petersen, Paul; And Others

    1992-01-01

    Of 63 undergraduate occupational therapy programs surveyed, the 38 responses revealed some common areas covered: elementary descriptive statistics, validity, reliability, and measurement. Areas underrepresented include statistical analysis with or without computers, research design, and advanced statistics. (SK)

  8. 2012 statistical summaries : FTA grant assistance programs.

    DOT National Transportation Integrated Search

    2013-12-01

    The 2012 Statistical Summaries provides information about the Federal Transit Administration's (FTA) major financial aid programs for : Federal Fiscal Year (FY) 2012. The report covers the following programs: Urbanized Area Formula, Non-urbanized A...

  9. 2011 statistical summaries : FTA grant assistance programs.

    DOT National Transportation Integrated Search

    2013-05-01

    The 2011 Statistical Summaries provides information about the Federal Transit Administration's (FTA) major financial aid programs for Federal Fiscal Year (FY) 2011. The report covers the following programs: Urbanized Area Formula, Non-urbanized Are...

  10. 2010 statistical summaries : FTA grant assistance programs.

    DOT National Transportation Integrated Search

    2013-07-01

    The 2010 Statistical Summaries provides information about the Federal Transit Administration's (FTA) major financial aid programs for Federal Fiscal Year (FY) 2010. The report covers the following programs: Urbanized Area Formula, Non-urbanized Are...

  11. A Census of Statistics Requirements at U.S. Journalism Programs and a Model for a "Statistics for Journalism" Course

    ERIC Educational Resources Information Center

    Martin, Justin D.

    2017-01-01

    This essay presents data from a census of statistics requirements and offerings at all 4-year journalism programs in the United States (N = 369) and proposes a model of a potential course in statistics for journalism majors. The author proposes that three philosophies underlie a statistics course for journalism students. Such a course should (a)…

  12. Climate Considerations Of The Electricity Supply Systems In Industries

    NASA Astrophysics Data System (ADS)

    Asset, Khabdullin; Zauresh, Khabdullina

    2014-12-01

    The study is focused on analysis of climate considerations of electricity supply systems in a pellet industry. The developed analysis model consists of two modules: statistical data of active power losses evaluation module and climate aspects evaluation module. The statistical data module is presented as a universal mathematical model of electrical systems and components of industrial load. It forms a basis for detailed accounting of power loss from the voltage levels. On the basis of the universal model, a set of programs is designed to perform the calculation and experimental research. It helps to obtain the statistical characteristics of the power losses and loads of the electricity supply systems and to define the nature of changes in these characteristics. Within the module, several methods and algorithms for calculating parameters of equivalent circuits of low- and high-voltage ADC and SD with a massive smooth rotor with laminated poles are developed. The climate aspects module includes an analysis of the experimental data of power supply system in pellet production. It allows identification of GHG emission reduction parameters: operation hours, type of electrical motors, values of load factor and deviation of standard value of voltage.

  13. [Changing of the patient safety culture in the pilot institutes of the Hungarian accreditation program].

    PubMed

    Lám, Judit; Merész, Gergő; Bakacsi, Gyula; Belicza, Éva; Surján, Cecília; Takács, Erika

    2016-10-01

    The accreditation system for health care providers was developed in Hungary with the aim of increasing the safety, efficiency, and efficacy of care and optimising organisational operation. The aim of this study was to assess changes of organisational culture in pilot institutes of the accreditation program. Seven volunteer pilot institutes were included, using an internationally validated questionnaire. The impact study was performed in two rounds: the first before the introduction of the accreditation program, and the second a year later, when the standards were already known. Data were analysed using descriptive statistics and logistic regression models. Statistically significant (p<0.05) positive changes were detected in hospitals in three dimensions (organisational learning/continuous improvement, communication openness, and teamwork within the unit) and in outpatient clinics in two dimensions (overall perceptions of patient safety, and patient safety within the unit). Organisational culture in the observed institutes needs improvement, but the positive changes already point to safer care. Orv. Hetil., 2016, 157(42), 1667-1673.

  14. Environmental flow allocation and statistics calculator

    USGS Publications Warehouse

    Konrad, Christopher P.

    2011-01-01

    The Environmental Flow Allocation and Statistics Calculator (EFASC) is a computer program that calculates hydrologic statistics based on a time series of daily streamflow values. EFASC will calculate statistics for daily streamflow in an input file or will generate synthetic daily flow series from an input file based on rules for allocating and protecting streamflow and then calculate statistics for the synthetic time series. The program reads dates and daily streamflow values from input files. The program writes statistics out to a series of worksheets and text files. Multiple sites can be processed in series as one run. EFASC is written in Microsoft Visual Basic for Applications and implemented as a macro in Microsoft Office Excel 2007. EFASC is intended as a research tool for users familiar with computer programming. The code for EFASC is provided so that it can be modified for specific applications. All users should review how output statistics are calculated and recognize that the algorithms may not comply with conventions used to calculate streamflow statistics published by the U.S. Geological Survey.
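
    A minimal Python sketch of the kind of daily-streamflow statistics such a calculator reports (mean flow, exceedance percentiles, and annual 7-day minima); it is not the EFASC Visual Basic code, and the flow series below is synthetic.

        # Daily-streamflow statistics sketch on a synthetic flow record.
        import numpy as np
        import pandas as pd

        dates = pd.date_range("1990-10-01", "2000-09-30", freq="D")
        rng = np.random.default_rng(3)
        flow = pd.Series(
            np.exp(rng.normal(3.0, 0.8, len(dates))) + 5.0, index=dates, name="Q_cfs"
        )

        stats = {
            "mean_daily_flow": flow.mean(),
            "Q10_exceedance": flow.quantile(0.90),   # flow exceeded 10% of the time
            "Q90_exceedance": flow.quantile(0.10),   # flow exceeded 90% of the time
        }

        # Annual minimum of the 7-day moving average (basis of low-flow statistics).
        seven_day = flow.rolling(7).mean()
        annual_min_7day = seven_day.groupby(seven_day.index.year).min()
        stats["mean_annual_7day_min"] = annual_min_7day.mean()

        for name, value in stats.items():
            print(f"{name}: {value:.1f} cfs")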

  15. Milestones on a Shoestring: A Cost-Effective, Semi-automated Implementation of the New ACGME Requirements for Radiology.

    PubMed

    Schmitt, J Eric; Scanlon, Mary H; Servaes, Sabah; Levin, Dayna; Cook, Tessa S

    2015-10-01

    The advent of the ACGME's Next Accreditation System represents a significant new challenge for residencies and fellowships, owing to its requirements for more complex and detailed information. We developed a system of online assessment tools to provide comprehensive coverage of the twelve ACGME Milestones and digitized them using freely available cloud-based productivity tools. These tools include a combination of point-of-care procedural assessments, electronic quizzes, online modules, and other data entry forms. Using free statistical analytic tools, we also developed an automated system for management, processing, and data reporting. After one year of use, our Milestones project has resulted in the submission of over 20,000 individual data points. The use of automated statistical methods to generate resident-specific profiles has allowed for dynamic reports of individual residents' progress. These profiles both summarize data and also allow program directors access to more granular information as needed. Informatics-driven strategies for data assessment and processing represent feasible solutions to Milestones assessment and analysis, reducing the potential administrative burden for program directors, residents, and staff. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.

  16. Traveling through Time: The Forum Guide to Longitudinal Data Systems. Book Four of Four: Advanced LDS Usage. NFES 2011-802

    ERIC Educational Resources Information Center

    National Forum on Education Statistics, 2011

    2011-01-01

    This document is the fourth and final installment of this Forum series of guides on longitudinal data systems (LDS). One goal of the National Forum on Education Statistics (the Forum) is to improve the quality of education data gathered for use by policymakers and program decisionmakers. An approach to furthering this goal has been to pool the…

  17. Memetic computing through bio-inspired heuristics integration with sequential quadratic programming for nonlinear systems arising in different physical models.

    PubMed

    Raja, Muhammad Asif Zahoor; Kiani, Adiqa Kausar; Shehzad, Azam; Zameer, Aneela

    2016-01-01

    In this study, bio-inspired computing is exploited for solving systems of nonlinear equations using variants of genetic algorithms (GAs) as the global search method, hybridized with sequential quadratic programming (SQP) for efficient local search. The fitness function is constructed by defining the error function for the system of nonlinear equations in the mean-square sense. The design parameters of the mathematical models are trained by exploiting the competency of GAs, and refinement is carried out by the SQP algorithm. Twelve versions of the memetic approach GA-SQP are designed by taking different sets of reproduction routines in the optimization process. Performance of the proposed variants is evaluated on six numerical problems comprising systems of nonlinear equations arising in the interval arithmetic benchmark model, kinematics, neurophysiology, combustion, and chemical equilibrium. Comparative studies of the proposed results in terms of accuracy, convergence, and complexity are performed with the help of statistical performance indices to establish the worth of the schemes. Accuracy and convergence of the memetic computing GA-SQP are found to be better in each case of the simulation study, and the effectiveness of the scheme is further established through statistical results based on different performance indices for accuracy and complexity.
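
    A compact sketch of the memetic idea, assuming a toy two-equation system and a simple real-coded GA; the authors' twelve GA-SQP variants and benchmark problems are not reproduced here.

        # Minimal memetic GA + SQP sketch for a small nonlinear system (illustrative only).
        import numpy as np
        from scipy.optimize import minimize

        def residuals(x):
            # Hypothetical 2-equation system: a circle intersected with an exponential curve.
            return np.array([x[0]**2 + x[1]**2 - 4.0,
                             np.exp(x[0]) + x[1] - 1.0])

        def fitness(x):
            # Mean-squared error of the system residuals.
            return np.mean(residuals(x) ** 2)

        rng = np.random.default_rng(0)
        pop = rng.uniform(-3, 3, size=(60, 2))               # initial population
        for _ in range(100):                                 # simple real-coded GA
            f = np.array([fitness(p) for p in pop])
            parents = pop[np.argsort(f)[:30]]                # truncation selection
            mates = parents[rng.permutation(30)]
            children = 0.5 * (parents + mates)               # arithmetic crossover
            children += rng.normal(0, 0.1, children.shape)   # Gaussian mutation
            pop = np.vstack([parents, children])

        best = pop[np.argmin([fitness(p) for p in pop])]
        refined = minimize(fitness, best, method="SLSQP")    # SQP local refinement
        print("GA solution:", best, " fitness:", fitness(best))
        print("GA-SQP     :", refined.x, " fitness:", refined.fun)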

  18. Probabilistic/Fracture-Mechanics Model For Service Life

    NASA Technical Reports Server (NTRS)

    Watkins, T., Jr.; Annis, C. G., Jr.

    1991-01-01

    Computer program makes probabilistic estimates of lifetime of engine and components thereof. Developed to fill need for more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of levels of risk created by engineering decisions in designing system. Implements mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.
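
    A hedged illustration of the probabilistic life-estimation idea, using a closed-form Paris-law crack-growth model with made-up parameter distributions (the program described above combines many more elements, including nondestructive-evaluation and cost models).

        # Monte Carlo fatigue-life sketch (Paris law, closed-form integration).
        # All parameter values are illustrative, not those of the engine-life program.
        import numpy as np

        rng = np.random.default_rng(42)
        n_sim = 100_000

        a0 = rng.lognormal(np.log(0.5e-3), 0.3, n_sim)   # initial crack size, m
        C = rng.lognormal(np.log(3e-12), 0.2, n_sim)     # Paris coefficient, (MPa sqrt(m))^-m
        m = 3.0                                          # Paris exponent
        delta_sigma = 300.0                              # stress range, MPa
        Y = 1.12                                         # geometry factor
        a_crit = 5e-3                                    # critical crack size, m

        # Cycles to failure for da/dN = C*(Y*dS*sqrt(pi*a))^m, with m != 2:
        exponent = 1.0 - m / 2.0
        N_f = (a_crit**exponent - a0**exponent) / (
            C * (Y * delta_sigma * np.sqrt(np.pi))**m * exponent)

        return_to_service = 50_000.0                     # cycles (illustrative interval)
        prob_fail = np.mean(N_f < return_to_service)
        print(f"median life: {np.median(N_f):,.0f} cycles")
        print(f"P(failure before {return_to_service:,.0f} cycles) = {prob_fail:.4f}")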

  19. Compilation of 1986 Annual Reports of the Navy ELF (Extremely Low Frequency) Communications System Ecological Monitoring Program. Volume 3. TABS H-J.

    DTIC Science & Technology

    1987-07-01

    yearly trends, and the effect of population size and abiotic factors on growth will be completed when the 1984-1986 scales are completed. Fish condition...settling of suspended particles on substrates in its absence. The pumps were powered by a heavy duty marine battery which had to be exchanged and...computer using procedures available in SPSS (Hull and Nie 1981) and programs available in the BIOM statistical package (Rohlf). Sokal and Rohlf (1981

  20. S.P.S.S. User's Manual #1-#4. Basic Program Construction in S.P.S.S.; S.P.S.S. Non-Procedural Statements and Procedural Commands; System Control Language and S.P.S.S.; Quick File Equate Statement Reference.

    ERIC Educational Resources Information Center

    Earl, Lorna L.

    This series of manuals describing and illustrating the Statistical Package for the Social Sciences (SPSS) was planned as a self-teaching instrument, beginning with the basics and progressing to an advanced level. Information on what the searcher must know to define the data and write a program for preliminary analysis is contained in manual 1,…

  1. Quantitative Measurements of the Effects of Variations in Panel Density and Distributions for Panel Method Computer Programs

    DTIC Science & Technology

    1980-01-01

    ...trapezoidal panels, and the formula for PAR can be derived for the case where equal spanwise and chordwise divisions are used.

  2. 1998 statistical summaries : Federal Transit Administration : grant assistance programs

    DOT National Transportation Integrated Search

    1999-03-01

    The 1998 Statistical Summaries provides information about the Federal Transit Administration's (FTA) major financial aid programs for Federal Fiscal Year (FY) 1998. The report covers the following programs: Urbanized Area Formula, Non-urbanized Area ...

  3. Statistical energy analysis computer program, user's guide

    NASA Technical Reports Server (NTRS)

    Trudell, R. W.; Yano, L. I.

    1981-01-01

    A high-frequency random vibration analysis method, the statistical energy analysis (SEA) method, is examined. The SEA method accomplishes high-frequency response prediction for arbitrary structural configurations. A general SEA computer program is described. A summary of SEA theory, example problems of SEA program application, and a complete program listing are presented.

  4. Preliminary Survey of Icing Conditions Measured During Routine Transcontinental Airline Operation

    NASA Technical Reports Server (NTRS)

    Perkins, Porter J.

    1952-01-01

    Icing data collected on routine operations by four DC-4-type aircraft equipped with NACA pressure-type icing-rate meters are presented as preliminary information obtained from a statistical icing data program sponsored by the NACA with the cooperation of many airline companies and the United States Air Force. The program is continuing on a much greater scale to provide large quantities of data from many air routes in the United States and overseas. Areas not covered by established air routes are also being included in the survey. The four aircraft which collected the data presented in this report were operated by United Air Lines over a transcontinental route from January through May, 1951. Analysis indicated that the pressure-type icing-rate meter was satisfactory for collecting statistical data during routine operations. Data obtained on routine flight icing encounters from these four instrumented aircraft, although insufficient for a conclusive statistical analysis, provide a greater quantity and considerably more realistic information than that obtained from random research flights. A summary of statistical data will be published when the information obtained during the 1951-52 icing season and that to be obtained during the 1952-53 season can be analyzed and assembled. The 1951-52 data already analyzed indicate that the quantity, quality, and range of icing information being provided by this expanded program should afford a sound basis for ice-protection-system design by defining the important meteorological parameters of the icing cloud.

  5. Impact of a visual programming experience on the attitude toward programming of introductory undergraduate students

    NASA Astrophysics Data System (ADS)

    Godbole, Saurabh

    Traditionally, textual tools have been utilized to teach basic programming languages and paradigms. Research has shown that students tend to be visual learners. Using flowcharts, students can quickly understand the logic of their programs and visualize the flow of commands in the algorithm. Moreover, applying programming to physical systems through the use of a microcontroller to facilitate this type of learning can spark an interest in students to advance their programming knowledge and create novel applications. This study examined whether freshman college students' attitudes toward programming changed after completing a graphical programming lesson. Various attributes of students' attitudes were examined, including confidence, interest, stereotypes, and their belief in the usefulness of acquiring programming skills. The study found no statistically significant differences in attitudes either immediately following the session or after a period of four weeks.

  6. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    NASA Astrophysics Data System (ADS)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.

    2014-10-01

    Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for BSA technique is proposed. Three popular BSA techniques such as frequency ratio, weights-of-evidence, and evidential belief function models are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and is created by a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.

  7. An integrated user-friendly ArcMAP tool for bivariate statistical modelling in geoscience applications

    NASA Astrophysics Data System (ADS)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.

    2015-03-01

    Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, bivariate statistical modeler (BSM), for BSA technique is proposed. Three popular BSA techniques, such as frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and created by a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
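
    As a small illustration of the simplest of the three techniques, the sketch below computes frequency ratios for one hypothetical conditioning factor; it is not part of the BSM ArcMAP tool, and the class boundaries and pixel counts are made up.

        # Frequency-ratio (FR) computation sketch for one conditioning factor.
        import pandas as pd

        data = pd.DataFrame({
            "slope_class":      ["0-5", "5-15", "15-25", "25-35", ">35"],
            "class_pixels":     [50000, 80000, 60000, 30000, 10000],
            "landslide_pixels": [   50,   400,   900,   700,   150],
        })

        data["pct_area"] = data["class_pixels"] / data["class_pixels"].sum()
        data["pct_landslides"] = data["landslide_pixels"] / data["landslide_pixels"].sum()
        data["frequency_ratio"] = data["pct_landslides"] / data["pct_area"]
        # FR > 1 marks a class more prone to the hazard than the area average;
        # summing FR values over all factors per pixel yields a susceptibility index.
        print(data[["slope_class", "frequency_ratio"]])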

  8. Assessing the impact of a remote digital coaching engagement program on patient-reported outcomes in asthma.

    PubMed

    Rasulnia, Mazi; Burton, Billy Stephen; Ginter, Robert P; Wang, Tracy Y; Pleasants, Roy Alton; Green, Cynthia L; Lugogo, Njira

    2017-08-11

    Low adherence and poor outcomes provide opportunity for digital coaching to engage patients with uncontrolled asthma in their care to improve outcomes. To examine the impact of a remote digital coaching program on asthma control and patient experience. We recruited 51 adults with uncontrolled asthma, denoted by albuterol use of >2 times per week and/or exacerbations requiring corticosteroids, and applied a 12-week patient-centered remote digital coaching program using a combination of educational pamphlets, symptom trackers, best peak flow establishment, physical activity, and dietary counseling, as well as coaches who implemented emotional enforcement to motivate disease self-management through telephone, text, and email. Baseline and post-intervention measures were quality of life (QOL), spirometry, Asthma Control Test (ACT), Asthma Symptom Utility Index (ASUI), rescue albuterol use, and exacerbation history. Among 51 patients recruited, 40 completed the study. Eight subjects required assistance reading medical materials. Significant improvements from baseline were observed for Patient-Reported Outcomes Measurement Information System mental status (p = 0.010), body weight, and outpatient exacerbation frequency (p = 0.028). The changes from baseline in ACT (p = 0.005) were statistically significant but did not achieve the pre-specified minimum clinically important difference (MCID), whereas for ASUI, the MCID and statistical significance were achieved. Spirometry and rescue albuterol use were no different. A patient-oriented, remote digital coaching program that utilized trained health coaches and digital materials led to statistically significant improvement in mental status, outpatient exacerbations, body weight, and ASUI. Digital coaching programs may improve some outcomes in adults with uncontrolled asthma.

  9. An Innovative Program in the Science of Health Care Delivery: Workforce Diversity in the Business of Health.

    PubMed

    Essary, Alison C; Wade, Nathaniel L

    2016-01-01

    According to the most recent statistics from the National Center for Education Statistics, disparities in enrollment in undergraduate and graduate education are significant and not improving commensurate with the national population. Similarly, only 12% of graduating medical students and 13% of graduating physician assistant students are from underrepresented racial and ethnic groups. Established in 2012 to promote health care transformation at the organization and system levels, the School for the Science of Health Care Delivery is aligned with the university and college missions to create innovative, interdisciplinary curricula that meet the needs of our diverse patient and community populations. Three-year enrollment trends in the program exceed most national benchmarks, particularly among students who identify as Hispanic and American Indian/Alaska Native. The Science of Health Care Delivery program provides students a seamless learning experience that prepares them to be solutions-oriented leaders proficient in the business of health care, change management, innovation, and data-driven decision making. Defined as the study and design of systems, processes, leadership and management used to optimize health care delivery and health for all, the Science of Health Care Delivery will prepare the next generation of creative, diverse, pioneering leaders in health care.

  10. Integrating Formal Methods and Testing 2002

    NASA Technical Reports Server (NTRS)

    Cukic, Bojan

    2002-01-01

    Traditionally, qualitative program verification methodologies and program testing are studied in separate research communities. Neither alone is powerful and practical enough to provide sufficient confidence in ultra-high reliability assessment when used exclusively. Significant advances can be made by accounting for not only formal verification and program testing, but also the impact of many other standard V&V techniques, in a unified software reliability assessment framework. The first year of this research resulted in a statistical framework that, given assumptions on the success of the qualitative V&V and QA procedures, significantly reduces the amount of testing needed to confidently assess reliability at so-called high and ultra-high levels (10^-4 or better). The coming years will address methodologies to realistically estimate the impacts of various V&V techniques on system reliability and include the impact of operational risk in reliability assessment. The objectives are to: A) combine formal correctness verification, process and product metrics, and other standard qualitative software assurance methods with statistical testing, with the aim of gaining higher confidence in software reliability assessment for high-assurance applications; B) quantify the impact of these methods on software reliability; C) demonstrate that accounting for the effectiveness of these methods reduces the number of tests needed to attain a given confidence level; and D) quantify and justify the reliability estimate for systems developed using various methods.
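
    One simple way to see how credit from qualitative V&V can reduce required testing is a Bayesian calculation of the number of failure-free tests needed to claim a failure-probability bound at a given confidence; the prior parameters below are hypothetical, and this is only an illustration, not the framework developed in the project.

        # Failure-free test counts needed to claim P(failure probability < p0) >= conf,
        # with and without prior credit expressed as a Beta prior (illustrative only).
        import numpy as np
        from scipy.stats import beta

        p0 = 1e-4      # required bound on the failure probability per demand
        conf = 0.99    # required confidence in that bound

        def tests_needed(a, b, p0, conf, n_max=200_000):
            # Smallest n such that the Beta(a, b + n) posterior puts mass >= conf below p0.
            ns = np.arange(n_max + 1)
            posterior = beta.cdf(p0, a, b + ns)        # vectorised over the test count
            idx = int(np.argmax(posterior >= conf))
            return int(ns[idx]) if posterior[idx] >= conf else None

        print("uninformative prior, Beta(1, 1)    :", tests_needed(1.0, 1.0, p0, conf))
        print("prior credit from V&V, Beta(1, 2e4):", tests_needed(1.0, 20_000.0, p0, conf))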

  11. Use of Spatial Epidemiology and Hot Spot Analysis to Target Women Eligible for Prenatal Women, Infants, and Children Services

    PubMed Central

    Krawczyk, Christopher; Gradziel, Pat; Geraghty, Estella M.

    2014-01-01

    Objectives. We used a geographic information system and cluster analyses to determine locations in need of enhanced Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) services. Methods. We linked documented births in the 2010 California Birth Statistical Master File with the 2010 data from the WIC Integrated Statewide Information System. Analyses focused on the density of pregnant women who were eligible for but not receiving WIC services in California's 7049 census tracts. We used incremental spatial autocorrelation and hot spot analyses to identify clusters of WIC-eligible nonparticipants. Results. We detected clusters of census tracts with higher-than-expected densities, compared with the state mean density of WIC-eligible nonparticipants, in 21 of 58 (36.2%) California counties (P < .05). In subsequent county-level analyses, we located neighborhood-level clusters of higher-than-expected densities of eligible nonparticipants in Sacramento, San Francisco, Fresno, and Los Angeles Counties (P < .05). Conclusions. Hot spot analyses provided a rigorous and objective approach to determining the locations of statistically significant clusters of WIC-eligible nonparticipants. Results helped inform WIC program and funding decisions, including the opening of new WIC centers, and offered a novel approach for targeting public health services. PMID:24354821

  12. Randomization in cancer clinical trials: permutation test and development of a computer program.

    PubMed Central

    Ohashi, Y

    1990-01-01

    When analyzing cancer clinical trial data where the treatment allocation is done using dynamic balancing methods, such as the minimization method for balancing the distribution of important prognostic factors in each arm, conservativeness occurs if such a randomization scheme is ignored and a simple unstratified analysis is carried out. In this paper, the above conservativeness is demonstrated by computer simulation, and the development of a computer program that carries out permutation tests of the log-rank statistic for clinical trial data, where the allocation is done by the minimization method or a stratified permuted block design, is introduced. We are planning to use this program in practice to supplement a usual stratified analysis and model-based methods such as the Cox regression. The most serious problem in cancer clinical trials in Japan is how to carry out quality control and data management in trials that are initiated and conducted by researchers without support from pharmaceutical companies. In the final section of this paper, one international collaborative effort to develop international guidelines on data management in clinical trials of bladder cancer is briefly introduced, and the differences between the system adopted in US/European statistical centers and the Japanese system are described. PMID:2269216
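
    The sketch below illustrates a permutation (re-randomization) test of the two-sample log-rank statistic on simulated data. For brevity it permutes group labels directly, whereas the program described above re-allocates treatments with the actual minimization scheme or permuted block design.

        # Permutation test of the two-sample log-rank statistic (illustrative only).
        import numpy as np

        def logrank_stat(time, event, group):
            # Two-sample log-rank chi-square statistic.
            obs_minus_exp = 0.0
            var = 0.0
            for t in np.unique(time[event == 1]):
                at_risk = time >= t
                n = at_risk.sum()
                n1 = (at_risk & (group == 1)).sum()
                d = ((time == t) & (event == 1)).sum()
                d1 = ((time == t) & (event == 1) & (group == 1)).sum()
                if n > 1:
                    obs_minus_exp += d1 - d * n1 / n
                    var += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
            return obs_minus_exp ** 2 / var

        rng = np.random.default_rng(5)
        n = 120
        group = rng.integers(0, 2, n)
        time = rng.exponential(scale=np.where(group == 1, 14.0, 10.0))   # months
        event = rng.random(n) < 0.8                                      # ~20% censored

        observed = logrank_stat(time, event, group)
        perm = np.array([
            logrank_stat(time, event, rng.permutation(group)) for _ in range(2000)
        ])
        p_value = np.mean(perm >= observed)
        print(f"log-rank chi-square = {observed:.2f}, permutation p = {p_value:.3f}")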

  13. [Use of the Elektronika-T3-16M special-purpose computer for the automatic processing of cytophotometric and cytofluorimetric data].

    PubMed

    Loktionov, A S; Prianishnikov, V A

    1981-05-01

    A system has been proposed to provide automatic analysis of data from: a) point cytophotometry, b) two-wave cytophotometry, and c) cytofluorimetry. The system provides input of the data from a photomultiplier to a specialized computer, the Electronica-T3-16M, together with simultaneous statistical analysis of these data. Information on the programs used is presented. The advantages of the system, compared with some commercially available cytophotometers, are indicated.

  14. Building flexible real-time systems using the Flex language

    NASA Technical Reports Server (NTRS)

    Kenny, Kevin B.; Lin, Kwei-Jay

    1991-01-01

    The design and implementation of a real-time programming language called Flex, which is a derivative of C++, are presented. It is shown how different types of timing requirements might be expressed and enforced in Flex, how they might be fulfilled in a flexible way using different program models, and how the programming environment can help in making binding and scheduling decisions. The timing constraint primitives in Flex are easy to use yet powerful enough to define both independent and relative timing constraints. Program models such as imprecise computation and performance polymorphism support flexible real-time programs. In addition, programmers can use a performance measurement tool that produces statistically correct timing models to predict the expected execution time of a program and to help make binding decisions. A real-time programming environment is also presented.

  15. Scheduler software for tracking and data relay satellite system loading analysis: User manual and programmer guide

    NASA Technical Reports Server (NTRS)

    Craft, R.; Dunn, C.; Mccord, J.; Simeone, L.

    1980-01-01

    A user guide and programmer documentation are provided for a system of PRIME 400 minicomputer programs. The system was designed to support loading analyses on the Tracking and Data Relay Satellite System (TDRSS). The system is a scheduler for various types of data relays (including tape recorder dumps and real-time relays) from orbiting payloads to the TDRSS. Several model options are available to statistically generate data relay requirements. TDRSS time lines (representing resources available for scheduling) and payload/TDRSS acquisition and loss-of-sight time lines are input to the scheduler from disk. Tabulated output from the interactive system includes a summary of the scheduler activities over time intervals specified by the user and an overall summary of scheduler input and output information. A history file, which records every event generated by the scheduler, is written to disk to allow further scheduling on remaining resources and to provide data for graphic displays or additional statistical analysis.

  16. Program evaluation of remote heart failure monitoring: healthcare utilization analysis in a rural regional medical center.

    PubMed

    Riley, William T; Keberlein, Pamela; Sorenson, Gigi; Mohler, Sailor; Tye, Blake; Ramirez, A Susana; Carroll, Mark

    2015-03-01

    Remote monitoring for heart failure (HF) has had mixed and heterogeneous effects across studies, necessitating further evaluation of remote monitoring systems within specific healthcare systems and their patient populations. "Care Beyond Walls and Wires," a wireless remote monitoring program to facilitate patient and care team co-management of HF patients served by a rural regional medical center, provided the opportunity to evaluate the effects of this program on healthcare utilization. Fifty HF patients admitted to Flagstaff Medical Center (Flagstaff, AZ) participated in the project. Many of these patients lived in underserved and rural communities, including Native American reservations. Enrolled patients received mobile, broadband-enabled remote monitoring devices. A matched cohort was identified for comparison. HF patients enrolled in this program showed substantial and statistically significant reductions in healthcare utilization during the 6 months following enrollment, and these reductions were significantly greater compared with those of patients who declined to participate, but not when compared with the matched cohort. The findings from this project indicate that a remote HF monitoring program can be successfully implemented in a rural, underserved area. Reductions in healthcare utilization were observed among program participants, but reductions were also observed among the matched cohort, illustrating the need for rigorous assessment of the effects of HF remote monitoring programs in healthcare systems.

  17. HYPERSAMP - HYPERGEOMETRIC ATTRIBUTE SAMPLING SYSTEM BASED ON RISK AND FRACTION DEFECTIVE

    NASA Technical Reports Server (NTRS)

    De, Salvo L. J.

    1994-01-01

    HYPERSAMP is a demonstration of an attribute sampling system developed to determine the minimum sample size required for any preselected value for consumer's risk and fraction of nonconforming. This statistical method can be used in place of MIL-STD-105E sampling plans when a minimum sample size is desirable, such as when tests are destructive or expensive. HYPERSAMP utilizes the Hypergeometric Distribution and can be used for any fraction nonconforming. The program employs an iterative technique that circumvents the obstacle presented by the factorial of a non-whole number. HYPERSAMP provides the required Hypergeometric sample size for any equivalent real number of nonconformances in the lot or batch under evaluation. Many currently used sampling systems, such as the MIL-STD-105E, utilize the Binomial or the Poisson equations as an estimate of the Hypergeometric when performing inspection by attributes. However, this is primarily because of the difficulty in calculation of the factorials required by the Hypergeometric. Sampling plans based on the Binomial or Poisson equations will result in the maximum sample size possible with the Hypergeometric. The difference in the sample sizes between the Poisson or Binomial and the Hypergeometric can be significant. For example, a lot size of 400 devices with an error rate of 1.0% and a confidence of 99% would require a sample size of 400 (all units would need to be inspected) for the Binomial sampling plan and only 273 for a Hypergeometric sampling plan. The Hypergeometric results in a savings of 127 units, a significant reduction in the required sample size. HYPERSAMP is a demonstration program and is limited to sampling plans with zero defectives in the sample (acceptance number of zero). Since it is only a demonstration program, the sample size determination is limited to sample sizes of 1500 or less. The Hypergeometric Attribute Sampling System demonstration code is a spreadsheet program written for IBM PC compatible computers running DOS and Lotus 1-2-3 or Quattro Pro. This program is distributed on a 5.25 inch 360K MS-DOS format diskette, and the program price includes documentation. This statistical method was developed in 1992.
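
    The zero-acceptance-number sample-size search that HYPERSAMP demonstrates can be sketched in a few lines; the function below reproduces the quoted example (lot of 400, 1.0% nonconforming, 99% confidence, i.e. a consumer's risk of 0.01) but is not the HYPERSAMP spreadsheet itself.

        # Minimum hypergeometric sample size with acceptance number zero:
        # smallest n such that the probability of drawing no nonconforming
        # units from the lot does not exceed the consumer's risk.
        from math import comb

        def min_sample_size(lot_size, fraction_nonconforming, consumer_risk):
            defectives = round(lot_size * fraction_nonconforming)
            for n in range(1, lot_size + 1):
                p_accept = comb(lot_size - defectives, n) / comb(lot_size, n)
                if p_accept <= consumer_risk:
                    return n
            return lot_size

        # Example quoted above: lot of 400, 1.0% nonconforming, 99% confidence.
        print(min_sample_size(400, 0.01, 0.01))   # prints 273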

  18. Limitations of Using Microsoft Excel Version 2016 (MS Excel 2016) for Statistical Analysis for Medical Research.

    PubMed

    Tanavalee, Chotetawan; Luksanapruksa, Panya; Singhatanadgige, Weerasak

    2016-06-01

    Microsoft Excel (MS Excel) is a commonly used program for data collection and statistical analysis in biomedical research. However, this program has many limitations, including fewer functions that can be used for analysis and a limited number of total cells compared with dedicated statistical programs. MS Excel cannot complete analyses with blank cells, and cells must be selected manually for analysis. In addition, it requires multiple steps of data transformation and formulas to plot survival analysis graphs, among others. The Megastat add-on program, which will be supported by MS Excel 2016 soon, would eliminate some limitations of using statistic formulas within MS Excel.

  19. Seminar presentation on the economic evaluation of the space shuttle system

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The proceedings of a seminar on the economic aspects of the space shuttle system are presented. Emphasis was placed on the problems of economic analysis of large-scale public investments, the state of the art of cost estimation, the statistical data base for estimating costs of new technological systems, and the role of the main economic parameters affecting the results of the analyses. The system components of a space program and the present choice of launch vehicles, spacecraft, and instruments were also explained.

  20. Technology, Data Bases and System Analysis for Space-to-Ground Optical Communications

    NASA Technical Reports Server (NTRS)

    Lesh, James

    1995-01-01

    Optical communications is becoming an increasingly important option for designers of space-to-ground communications links, whether for government or commercial applications. In this paper the technology being developed by NASA for use in space-to-ground optical communications is presented. Next, a program which is collecting a long-term data base of atmospheric visibility statistics for optical propagation through the atmosphere is described. Finally, a methodology for utilizing the statistics of the atmospheric data base in the analysis of space-to-ground links is presented. This methodology takes into account the effects of station availability, is useful when comparing optical communications with microwave systems, and provides a rationale for establishing the recommended link margin.

  1. Two Computer Programs for the Statistical Evaluation of a Weighted Linear Composite.

    ERIC Educational Resources Information Center

    Sands, William A.

    1978-01-01

    Two computer programs (one batch, one interactive) are designed to provide statistics for a weighted linear combination of several component variables. Both programs provide mean, variance, standard deviation, and a validity coefficient. (Author/JKS)
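
    A minimal sketch of the statistics such programs report for a weighted linear composite, computed here with NumPy on simulated component and criterion scores (the weights and data are hypothetical):

        # Mean, variance, standard deviation, and validity coefficient of a
        # weighted linear composite (illustrative data only).
        import numpy as np

        rng = np.random.default_rng(2)
        n = 500
        X = rng.normal(size=(n, 3))                            # component variables
        criterion = X @ np.array([0.5, 0.3, 0.2]) + rng.normal(scale=0.8, size=n)

        weights = np.array([2.0, 1.0, 1.0])                    # analyst-chosen weights
        composite = X @ weights

        print("mean      :", composite.mean())
        print("variance  :", composite.var(ddof=1))
        print("std. dev. :", composite.std(ddof=1))
        print("validity  :", np.corrcoef(composite, criterion)[0, 1])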

  2. Financial Accounting for Local and State School Systems, 1990.

    ERIC Educational Resources Information Center

    Fowler, William J., Jr.

    The purpose of this guidebook is to reflect the changes that have occurred since 1973 in governmental accounting and education finance. This document serves as a vehicle for program cost accounting at the local and intermediate levels. Although not required by federal law, the National Center for Education Statistics (NCES) encourages state and…

  3. Automatic Rock Detection and Mapping from HiRISE Imagery

    NASA Technical Reports Server (NTRS)

    Huertas, Andres; Adams, Douglas S.; Cheng, Yang

    2008-01-01

    This system includes a C-code software program and a set of MATLAB software tools for statistical analysis and rock distribution mapping. The major functions include rock detection and rock detection validation. The rock detection code has been evolved into a production tool that can be used by engineers and geologists with minor training.

  4. Certificates Awarded by Oregon's Degree Granting Colleges and Universities, 1993-94.

    ERIC Educational Resources Information Center

    Oregon State Dept. of Education, Salem. Office of Educational Policy and Planning.

    This document presents statistical data in summary form on the certificates awarded by institutions of higher education in Oregon. These data were obtained from a completions survey, part of the national Integrated Postsecondary Education Data System (IPEDS). Summary tables are arranged by institution and by program area, followed by tables…

  5. 76 FR 41263 - Notice of Intent To Award Affordable Care Act (ACA) Funding, EH10-1004

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-13

    ... Intent To Award Affordable Care Act (ACA) Funding, EH10-1004 Notice of Intent to award Affordable Care Act (ACA) funding to National Association for Public Health Statistics and Information Systems... under funding opportunity EH10-1004, ``National Environmental Public Health Tracking Program.'' AGENCY...

  6. Review of "Cross-Country Evidence on Teacher Performance Pay"

    ERIC Educational Resources Information Center

    von Davier, Matthias

    2011-01-01

    The primary claim of this Harvard Program on Education Policy and Governance report and the abridged Education Next version is that nations "that pay teachers on their performance score higher on PISA tests." After statistically controlling for several variables, the author concludes that nations with some form of merit pay system have,…

  7. The Recruiting Game: Toward a New System of Intercollegiate Sport. Second Edition, Revised.

    ERIC Educational Resources Information Center

    Rooney, John F., Jr.

    Problems in recruitment for big-time collegiate sports are updated, and an eleven-point improvement program is proposed. Statistics on football and basketball recruitment are updated, many through the 1985 season. New focus is placed on "blue chip" recruiting, and maps of recruiting by selected institutions, conferences, and states are…

  8. Use of Failure in IS Development Statistics: Lessons for IS Curriculum Design

    ERIC Educational Resources Information Center

    Longenecker, Herbert H., Jr.; Babb, Jeffry; Waguespack, Leslie; Tastle, William; Landry, Jeff

    2016-01-01

    The evolution of computing education reflects the history of the professional practice of computing. Keeping computing education current has been a major challenge due to the explosive advances in technologies. Academic programs in Information Systems, a long-standing computing discipline, develop and refine the theory and practice of computing…

  9. A chance constraint estimation approach to optimizing resource management under uncertainty

    Treesearch

    Michael Bevers

    2007-01-01

    Chance-constrained optimization is an important method for managing risk arising from random variations in natural resource systems, but the probabilistic formulations often pose mathematical programming problems that cannot be solved with exact methods. A heuristic estimation method for these problems is presented that combines a formulation for order statistic...

  10. Government-Funded Students and Courses--January to June 2016. Australian Vocational Education and Training Statistics

    ERIC Educational Resources Information Center

    National Centre for Vocational Education Research (NCVER), 2016

    2016-01-01

    This publication provides a summary of data relating to students, programs, subjects, and training providers in Australia's government-funded vocational education and training (VET) system. This is broadly defined as all activity delivered by government providers and government-funded activity delivered by community education and other registered…

  11. Government-Funded Students and Courses: January to September 2017. Australian Vocational Education and Training Statistics

    ERIC Educational Resources Information Center

    National Centre for Vocational Education Research (NCVER), 2018

    2018-01-01

    This publication provides a summary of data relating to students, programs, subjects, and training providers in Australia's government-funded vocational education and training (VET) system (defined as Commonwealth and state/territory government-funded training). Data for the Government-funded students and courses series are received by the…

  12. Government-Funded Students and Courses: January to March 2015. Australian Vocational Education and Training Statistics

    ERIC Educational Resources Information Center

    National Centre for Vocational Education Research (NCVER), 2015

    2015-01-01

    This publication provides a summary of data relating to students, programs, training providers, and funding in Australia's government-funded vocational education and training (VET) system (broadly defined as all activity delivered by government providers and government-funded activity delivered by community education and other registered…

  13. Government-Funded Students and Courses, January to March 2017. Australian Vocational Education and Training Statistics

    ERIC Educational Resources Information Center

    National Centre for Vocational Education Research (NCVER), 2017

    2017-01-01

    This publication provides a summary of data relating to students, programs, subjects and training providers in Australia's government-funded vocational education and training (VET) system (defined as all Commonwealth and state/territory government-funded training delivered by technical and further education [TAFE] institutes, other government…

  14. Tracking Juvenile Recidivists: Three Options for Creating Statewide, Longitudinal Records of Juvenile Offenders.

    ERIC Educational Resources Information Center

    Rooney, Teresa L.

    This document describes three options for a statewide statistical system for tracking recidivism of juvenile delinquents placed outside their homes in treatment programs. The information is intended for use by the state in allocating resources. The options described involve potential use of juvenile court records, placement data, and/or…

  15. Teachers' Intentions to Use National Literacy and Numeracy Assessment Data: A Pilot Study

    ERIC Educational Resources Information Center

    Pierce, Robyn; Chick, Helen

    2011-01-01

    In recent years the educational policy environment has emphasised data-driven change. This has increased the expectation for school personnel to use statistical information to inform their programs and to improve teaching practices. Such data include system reports of student achievement tests and socio-economic profiles provided to schools by…

  16. Computer Assisted Assembly of Tests at Educational Testing Service.

    ERIC Educational Resources Information Center

    Educational Testing Service, Princeton, NJ.

    Two basic requirements for the successful initiation of a program for test assembly are the development of detailed item content classification systems and the delineation of the professional judgements made in building a test from a pool of items to detailed content, ability, and statistical specifications in terms precise enough to be translated…

  17. Government-Funded Students and Courses: January to September 2015. Australian Vocational Education and Training Statistics

    ERIC Educational Resources Information Center

    National Centre for Vocational Education Research (NCVER), 2015

    2015-01-01

    This publication provides a summary of data relating to students, programs, training providers and funding in Australia's government-funded vocational education and training (VET) system (broadly defined as all activity delivered by government providers and government-funded activity delivered by community education and private training…

  18. Poster - Thur Eve - 54: A software solution for ongoing DVH quality assurance in radiation therapy.

    PubMed

    Annis, S-L; Zeng, G; Wu, X; Macpherson, M

    2012-07-01

    A program has been developed in MATLAB for use in quality assurance of treatment planning in radiation therapy. It analyzes patient DVH files and compiles dose-volume data for review, trending, comparison, and analysis. Patient DVH files are exported from the Eclipse treatment planning system and saved according to treatment site and date. Analysis is currently available for four treatment sites (prostate, prostate bed, lung, and upper GI), with two functions for data reporting and analysis: patient-specific and organ-specific. The patient-specific function loads one patient DVH file and reports the user-specified dose-volume data of organs and targets. These data can be compiled to an external file for third-party analysis. The organ-specific function extracts a requested dose volume of an organ from the DVH files of a patient group and reports the statistics over this population. A graphical user interface is used to select the clinical site, function, and structures, and to input the user's requests. We have implemented this program in planning quality assurance at our center. The program has tracked the dosimetric improvement in GU sites after VMAT was implemented clinically. It has generated dose-volume statistics for different groups of patients associated with technique or time range. This program allows reporting and statistical analysis of DVH files. It is an efficient tool for planning quality control in radiation therapy. © 2012 American Association of Physicists in Medicine.
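
    The program described above is written in MATLAB and parses Eclipse DVH exports; the fragment below only illustrates, on a synthetic cumulative DVH, the dose-volume interpolation (for example D95 and V20) that underlies such organ-specific statistics.

        # Dose-volume extraction from a cumulative DVH curve (synthetic data).
        import numpy as np

        def dose_at_volume(dose_gy, volume_pct, v):
            # Dose received by at least v% of the structure (e.g. D95); the
            # cumulative volume curve must be decreasing with dose.
            return np.interp(v, volume_pct[::-1], dose_gy[::-1])

        def volume_at_dose(dose_gy, volume_pct, d):
            # Percentage of the structure receiving at least d Gy (e.g. V20).
            return np.interp(d, dose_gy, volume_pct)

        # Hypothetical cumulative DVH for one organ (dose in Gy, volume in %).
        dose = np.linspace(0, 70, 141)
        volume = 100.0 / (1.0 + np.exp((dose - 45.0) / 4.0))   # synthetic sigmoid DVH

        print(f"D95 = {dose_at_volume(dose, volume, 95):.1f} Gy")
        print(f"V20 = {volume_at_dose(dose, volume, 20):.1f} %")

        # Organ-specific statistics over a patient group would simply aggregate
        # these per-patient values, e.g. np.mean / np.std over a list of D95 results.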

  19. A Critical Analysis of U.S. Army Accessions through Socioeconomic Consideration between 1970 and 1984.

    DTIC Science & Technology

    1985-06-01

    Naval Postgraduate School, Monterey, California 93943. …determine the socioeconomic representativeness of the Army's enlistees in that particular year. In addition, the socioeconomic overview of the Republic of… …accomplished with the use of the Statistical Analysis System (SAS), an integrated computer system for data analysis. (Table 2: The States in Each District.)

  20. A Database of Computer Attacks for the Evaluation of Intrusion Detection Systems

    DTIC Science & Technology

    1999-06-01

    administrator whenever a system binary file (such as the ps, login, or ls program) is modified. Normal users have no legitimate reason to alter these files...development of EMERALD [46], which combines statistical anomaly detection from NIDES with signature verification. Specification-based intrusion detection...the creation of a single host that can act as many hosts. Daemons that provide network services—including telnetd, ftpd, and login—display banners

  1. Metrics of Software Quality.

    DTIC Science & Technology

    1980-11-01

    Systems: A Raytheon Project History", RADC-TR-77-188, Final Technical Report, June 1977. 4. IBM Federal Systems Division, "Statistical Prediction of...147, June 1979. 4. W. D. Brooks, R. W. Motley, "Analysis of Discrete Software Reliability Models", IBM Corp., RADC-TR-80-84, RADC, New York, April 1980...J. C. King of IBM (Reference 9) and Lori A. Clark (Reference 10) of the University of Massachusetts. Programs, so exercised must be augmented so they

  2. Updated System-Availability and Resource-Allocation Program

    NASA Technical Reports Server (NTRS)

    Viterna, Larry

    2004-01-01

    A second version of the Availability, Cost and Resource Allocation (ACARA) computer program has become available. The first version was reported in an earlier tech brief. To recapitulate: ACARA analyzes the availability, component mean times between failures, life-cycle costs, and resource scheduling of a complex system of equipment. ACARA uses a statistical Monte Carlo method to simulate the failure and repair of components while complying with user-specified constraints on spare parts and resources. ACARA evaluates the performance of the system on the basis of a mathematical model developed from a block-diagram representation. The previous version ran under the MS-DOS operating system and could not be run under recent versions of the Windows operating system. The current version incorporates the algorithms of the previous version but is compatible with Windows and utilizes menus and a file-management approach typical of Windows-based software.
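
    To illustrate the Monte Carlo failure-and-repair idea that ACARA is built around, the following sketch simulates a single repairable component with exponential failure and repair times and a limited stock of spares, and estimates availability as the fraction of mission time the component is up. It is a toy model under assumed parameter values, not the ACARA algorithm or its block-diagram system model.

      # Simplified Monte Carlo availability sketch (not the ACARA code): one
      # repairable component with exponential failure/repair times and a
      # limited number of spare parts, simulated over many mission histories.
      import random

      def simulate_mission(mtbf, mttr, mission_time, spares, rng):
          t, uptime = 0.0, 0.0
          while t < mission_time:
              ttf = rng.expovariate(1.0 / mtbf)            # time to next failure
              up_end = min(t + ttf, mission_time)
              uptime += up_end - t
              t = up_end
              if t >= mission_time:
                  break
              if spares == 0:                              # no spare left: down for good
                  break
              spares -= 1
              t = min(t + rng.expovariate(1.0 / mttr), mission_time)  # repair outage
          return uptime / mission_time

      def availability(mtbf, mttr, mission_time, spares, n_trials=10000, seed=1):
          rng = random.Random(seed)
          runs = [simulate_mission(mtbf, mttr, mission_time, spares, rng)
                  for _ in range(n_trials)]
          return sum(runs) / n_trials

      # Example: MTBF 1000 h, MTTR 20 h, 5000 h mission, 3 spares (all assumed values).
      print(f"Estimated availability: {availability(1000.0, 20.0, 5000.0, 3):.3f}")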

  3. Comparison of traditional six-year and new four-year dental curricula in South Korea.

    PubMed

    Komabayashi, Takashi; Ahn, Chul; Kim, Kang-Ju; Oh, Hyo-Won

    2012-01-01

    This study aimed to compare the dental curriculum of the traditional six-year system with that of the new four-year (graduate-entry) system in South Korea. There are 11 dental schools in South Korea: six are public and five are private. Eight offer the new four-year program and the other three offer the traditional six-year program. Descriptive analyses were conducted using bibliographic data and local information along with statistical analyses such as chi-square tests. In the six-year programs, clinical dentistry subjects were taught almost equally in practical and didactic courses, while the basic science courses were taught more often as practical courses (P < 0.0001). In the four-year programs, both basic science and clinical dentistry subjects were taught didactically more often, and more dentistry subjects were taught than basic sciences (P = 0.004). The four-year program model in South Korea is more focused on dentistry than on basic science, while basic science and clinical dentistry subjects were taught equally in the six-year program.

  4. Is There a Right Ear Advantage in Congenital Aural Atresia?

    PubMed

    Reed, Robert; Hubbard, Matthew; Kesser, Bradley W

    2016-12-01

    The objective was to compare speech/language development and academic progress between children with right versus left congenital aural atresia (CAA), using a case-control survey and review of audiometric data in a tertiary care academic practice of children with unilateral CAA. Outcome measures were demographic and audiometric data and rates of grade retention, use of any hearing or learning resource, and behavioral problems. No significant differences in grade retention rate, utilization of amplification, speech-language therapy, use of an individualized education program, or frequency-modulated system were found between children with right versus left CAA. Children with left CAA were significantly more likely to be enrolled in special education programs (p = 0.026). Differences in reported communication problems approached significance, with more difficulty noted in the right-ear group (p = 0.059). Left CAA patients were also more likely to have reported behavioral problems (p = 0.0039). Contrary to the hypothesis that a normal-hearing right ear confers a language advantage in patients with unilateral hearing loss, children with left CAA (normal right ear) were statistically more likely to be enrolled in a special education program and to have behavioral problems. Reported communication problems were more common in right CAA patients, but this did not reach statistical significance. No differences were found in use of amplification, frequency-modulated system, individualized education program, or grade retention. Further investigation of both the clinical implications and underlying psychoacoustics of unilateral hearing loss and the identification and habilitation of "at risk" unilateral hearing loss children is warranted.

  5. Space shuttle solid rocket booster recovery system definition. Volume 2: SRB water impact Monte Carlo computer program, user's manual

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The HD 220 program was created as part of the space shuttle solid rocket booster recovery system definition. The model was generated to investigate the damage to SRB components under water impact loads. The random nature of environmental parameters, such as ocean waves and wind conditions, necessitates estimation of the relative frequency of occurrence for these parameters. The nondeterministic nature of component strengths also lends itself to probabilistic simulation. The Monte Carlo technique allows the simultaneous perturbation of multiple independent parameters and provides outputs describing the probability distribution functions of the dependent parameters. This allows the user to determine the required statistics for each output parameter.
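
    The general technique the abstract describes, perturbing several independent random inputs at once and reading off the distribution of a dependent output, can be sketched as follows. The wave-height, wind, and strength distributions and the impact-load formula below are invented placeholders, not the HD 220 model.

      # Illustrative Monte Carlo sketch of the technique described above: sample
      # several independent environmental and strength parameters, push them
      # through a (placeholder) load model, and tabulate the probability of
      # component damage.  Numbers and the load formula are invented for the demo.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000

      wave_height_m = rng.rayleigh(scale=1.5, size=n)                   # sea-state proxy
      wind_speed_ms = rng.normal(loc=8.0, scale=3.0, size=n).clip(min=0.0)
      strength_kN   = rng.normal(loc=250.0, scale=25.0, size=n)         # component strength

      # Placeholder impact-load model: load grows with wave height and wind speed.
      impact_load_kN = 40.0 * wave_height_m + 5.0 * wind_speed_ms + rng.normal(0.0, 10.0, n)

      damage = impact_load_kN > strength_kN
      print(f"Estimated damage probability: {damage.mean():.4f}")
      print("Load percentiles (kN):", np.percentile(impact_load_kN, [50, 90, 99]).round(1))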

  6. Problem solving for breast health care delivery in low and middle resource countries (LMCs): consensus statement from the Breast Health Global Initiative.

    PubMed

    Harford, Joe B; Otero, Isabel V; Anderson, Benjamin O; Cazap, Eduardo; Gradishar, William J; Gralow, Julie R; Kane, Gabrielle M; Niëns, Laurens M; Porter, Peggy L; Reeler, Anne V; Rieger, Paula T; Shockney, Lillie D; Shulman, Lawrence N; Soldak, Tanya; Thomas, David B; Thompson, Beti; Winchester, David P; Zelle, Sten G; Badwe, Rajendra A

    2011-04-01

    International collaborations like the Breast Health Global Initiative (BHGI) can help low and middle income countries (LMCs) to establish or improve breast cancer control programs by providing evidence-based, resource-stratified guidelines for the management and control of breast cancer. The Problem Solving Working Group of the BHGI 2010 Global Summit met to develop a consensus statement on problem-solving strategies addressing breast cancer in LMCs. To better assess breast cancer burden in poorly studied populations, countries require accurate statistics regarding breast cancer incidence and mortality. To better identify health care system strengths and weaknesses, countries require reasonable indicators of true health system quality and capacity. Using qualitative and quantitative research methods, countries should formulate cancer control strategies to identify both system inefficiencies and patient barriers. Patient navigation programs linked to public advocacy efforts feed and strengthen functional early detection and treatment programs. Cost-effectiveness research and implementation science are tools that can guide and expand successful pilot programs. Copyright © 2011 Elsevier Ltd. All rights reserved.

  7. Cost estimating methods for advanced space systems

    NASA Technical Reports Server (NTRS)

    Cyr, Kelley

    1994-01-01

    NASA is responsible for developing much of the nation's future space technology. Cost estimates for new programs are required early in the planning process so that decisions can be made accurately. Because of the long lead times required to develop space hardware, the cost estimates are frequently required 10 to 15 years before the program delivers hardware. The system design in conceptual phases of a program is usually only vaguely defined and the technology used is so often state-of-the-art or beyond. These factors combine to make cost estimating for conceptual programs very challenging. This paper describes an effort to develop parametric cost estimating methods for space systems in the conceptual design phase. The approach is to identify variables that drive cost such as weight, quantity, development culture, design inheritance and time. The nature of the relationships between the driver variables and cost will be discussed. In particular, the relationship between weight and cost will be examined in detail. A theoretical model of cost will be developed and tested statistically against a historical database of major research and development projects.

  8. A pre-admission program for underrepresented minority and disadvantaged students: application, acceptance, graduation rates and timeliness of graduating from medical school.

    PubMed

    Strayhorn, G

    2000-04-01

    To determine whether students' performances in a pre-admission program predicted whether participants would (1) apply to medical school, (2) get accepted, and (3) graduate. Using prospectively collected data from participants in the University of North Carolina at Chapel Hill's Medical Education Development Program (MEDP) and data from the Association of American Medical Colleges Student and Applicant Information Management System, the author identified 371 underrepresented minority (URM) students who were full-time participants and completed the program between 1984 and 1989, prior to their acceptance into medical school. Logistic regression analysis was used to determine whether MEDP performance significantly predicted (after statistically controlling for traditional predictors of these outcomes) the proportions of URM participants who applied to medical school and were accepted, the timeliness of graduating, and the proportion graduating. Odds ratios with 95% confidence intervals were calculated to determine the associations between the independent and outcome variables. In separate logistic regression models, MEDP performance predicted each of the study's outcomes after statistical control for the traditional predictors. Pre-admission programs with similar outcomes can improve the diversity of the physician workforce and the access to health care for underrepresented minority and economically disadvantaged populations.
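
    A logistic-regression analysis of the kind described, with a program-performance predictor, a traditional predictor as a covariate, and results expressed as odds ratios with 95% confidence intervals, can be sketched as follows on simulated data. The variable names, sample, and effect sizes below are illustrative only and are not the study's data or model.

      # Sketch of the kind of logistic-regression model described above (not the
      # study's actual data or variables): outcome = accepted to medical school,
      # predictor of interest = pre-admission program performance, controlling
      # for a "traditional" predictor such as GPA.  Data below are simulated.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(42)
      n = 371
      program_score = rng.normal(0.0, 1.0, n)          # standardized program-performance score
      gpa = rng.normal(3.2, 0.4, n)                    # traditional predictor
      logit_p = -6.0 + 1.0 * program_score + 1.8 * gpa
      accepted = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

      X = sm.add_constant(np.column_stack([program_score, gpa]))
      fit = sm.Logit(accepted, X).fit(disp=False)

      odds_ratios = np.exp(fit.params)
      ci_low, ci_high = np.exp(fit.conf_int()).T       # 95% CIs on the odds-ratio scale
      for name, or_, lo, hi in zip(["const", "program_score", "gpa"],
                                   odds_ratios, ci_low, ci_high):
          print(f"{name:14s} OR={or_:6.2f}  95% CI [{lo:.2f}, {hi:.2f}]")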

  9. Comparison of Danish dichotomous and BI-RADS classifications of mammographic density.

    PubMed

    Hodge, Rebecca; Hellmann, Sophie Sell; von Euler-Chelpin, My; Vejborg, Ilse; Andersen, Zorana Jovanovic

    2014-06-01

    In the Copenhagen mammography screening program from 1991 to 2001, mammographic density was classified either as fatty or mixed/dense. This dichotomous mammographic density classification system is unique internationally, and has not been validated before. To compare the Danish dichotomous mammographic density classification system from 1991 to 2001 with the density BI-RADS classifications, in an attempt to validate the Danish classification system. The study sample consisted of 120 mammograms taken in Copenhagen in 1991-2001, which tested false positive, and which were in 2012 re-assessed and classified according to the BI-RADS classification system. We calculated inter-rater agreement between the Danish dichotomous mammographic classification as fatty or mixed/dense and the four-level BI-RADS classification by the linear weighted Kappa statistic. Of the 120 women, 32 (26.7%) were classified as having fatty and 88 (73.3%) as mixed/dense mammographic density, according to Danish dichotomous classification. According to BI-RADS density classification, 12 (10.0%) women were classified as having predominantly fatty (BI-RADS code 1), 46 (38.3%) as having scattered fibroglandular (BI-RADS code 2), 57 (47.5%) as having heterogeneously dense (BI-RADS 3), and five (4.2%) as having extremely dense (BI-RADS code 4) mammographic density. The inter-rater variability assessed by weighted kappa statistic showed a substantial agreement (0.75). The dichotomous mammographic density classification system utilized in early years of Copenhagen's mammographic screening program (1991-2001) agreed well with the BI-RADS density classification system.
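
    For reference, a linear weighted kappa of the kind used above can be computed as follows. The function is generic for two raters on a common K-level ordinal scale; the toy data do not reproduce the study's mapping of the dichotomous Danish categories onto the BI-RADS scale, which is therefore only assumed here.

      # Generic linear weighted kappa for two raters on a common K-level ordinal
      # scale (a sketch of the agreement statistic used above; the study's exact
      # category mapping is not reproduced).
      import numpy as np

      def linear_weighted_kappa(r1, r2, k):
          r1, r2 = np.asarray(r1), np.asarray(r2)
          n = r1.size
          observed = np.zeros((k, k))
          for a, b in zip(r1, r2):
              observed[a, b] += 1
          row = observed.sum(axis=1, keepdims=True)
          col = observed.sum(axis=0, keepdims=True)
          expected = row @ col / n                      # chance-agreement table
          i, j = np.indices((k, k))
          weights = 1.0 - np.abs(i - j) / (k - 1)       # linear agreement weights
          po = (weights * observed).sum() / n
          pe = (weights * expected).sum() / n
          return (po - pe) / (1.0 - pe)

      # Toy example with 4 ordinal levels (0..3), e.g. BI-RADS density codes 1-4.
      rater_a = [0, 1, 2, 3, 2, 1, 0, 3, 2, 2]
      rater_b = [0, 1, 2, 2, 3, 1, 1, 3, 2, 2]
      print(f"linear weighted kappa = {linear_weighted_kappa(rater_a, rater_b, 4):.2f}")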

  10. Data-base development for water-quality modeling of the Patuxent River basin, Maryland

    USGS Publications Warehouse

    Fisher, G.T.; Summers, R.M.

    1987-01-01

    Procedures and rationale used to develop a data base and data management system for the Patuxent Watershed Nonpoint Source Water Quality Monitoring and Modeling Program of the Maryland Department of the Environment and the U.S. Geological Survey are described. A detailed data base and data management system has been developed to facilitate modeling of the watershed for water quality planning purposes; statistical analysis; plotting of meteorologic, hydrologic and water quality data; and geographic data analysis. The system is Maryland's prototype for development of a basinwide water quality management program. A key step in the program is to build a calibrated and verified water quality model of the basin using the Hydrological Simulation Program--FORTRAN (HSPF) hydrologic model, which has been used extensively in large-scale basin modeling. The compilation of the substantial existing data base for preliminary calibration of the basin model, including meteorologic, hydrologic, and water quality data from federal and state data bases and a geographic information system containing digital land use and soils data is described. The data base development is significant in its application of an integrated, uniform approach to data base management and modeling.

  11. Robust statistical reconstruction for charged particle tomography

    DOEpatents

    Schultz, Larry Joe; Klimenko, Alexei Vasilievich; Fraser, Andrew Mcleod; Morris, Christopher; Orum, John Christopher; Borozdin, Konstantin N; Sossong, Michael James; Hengartner, Nicolas W

    2013-10-08

    Systems and methods for charged particle detection including statistical reconstruction of object volume scattering density profiles from charged particle tomographic data to determine the probability distribution of charged particle scattering using a statistical multiple scattering model and determine a substantially maximum likelihood estimate of object volume scattering density using expectation maximization (ML/EM) algorithm to reconstruct the object volume scattering density. The presence of and/or type of object occupying the volume of interest can be identified from the reconstructed volume scattering density profile. The charged particle tomographic data can be cosmic ray muon tomographic data from a muon tracker for scanning packages, containers, vehicles or cargo. The method can be implemented using a computer program which is executable on a computer.
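
    The iterative ML/EM idea referenced in the patent can be illustrated with the generic maximum-likelihood expectation-maximization (MLEM) update for a linear Poisson tomography model shown below. This is only a didactic stand-in: the patented method estimates volume scattering densities from a multiple-scattering likelihood for muon tracks, not from this simple projection operator.

      # Generic MLEM iteration for a linear Poisson tomography model, included
      # only to illustrate the iterative ML/EM idea; it is not the patented
      # muon-tomography reconstruction, which uses a multiple-scattering model.
      import numpy as np

      def mlem(A, y, n_iter=50):
          """A: (n_measurements, n_voxels) system matrix, y: measured counts."""
          x = np.ones(A.shape[1])                 # flat initial estimate
          sens = A.sum(axis=0)                    # sensitivity (back-projection of ones)
          for _ in range(n_iter):
              forward = A @ x
              forward[forward == 0] = 1e-12       # guard against division by zero
              x *= (A.T @ (y / forward)) / sens   # multiplicative ML/EM update
          return x

      # Tiny toy problem: 3 voxels, 4 measurement rays.
      A = np.array([[1.0, 1.0, 0.0],
                    [0.0, 1.0, 1.0],
                    [1.0, 0.0, 1.0],
                    [1.0, 1.0, 1.0]])
      true_x = np.array([2.0, 0.5, 1.5])
      y = np.random.default_rng(0).poisson(A @ true_x)
      print(mlem(A, y).round(2))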

  12. Johnson Space Center's Risk and Reliability Analysis Group 2008 Annual Report

    NASA Technical Reports Server (NTRS)

    Valentine, Mark; Boyer, Roger; Cross, Bob; Hamlin, Teri; Roelant, Henk; Stewart, Mike; Bigler, Mark; Winter, Scott; Reistle, Bruce; Heydorn, Dick

    2009-01-01

    The Johnson Space Center (JSC) Safety & Mission Assurance (S&MA) Directorate's Risk and Reliability Analysis Group provides both mathematical and engineering analysis expertise in the areas of Probabilistic Risk Assessment (PRA), Reliability and Maintainability (R&M) analysis, and data collection and analysis. The fundamental goal of this group is to provide National Aeronautics and Space Administration (NASA) decisionmakers with the necessary information to make informed decisions when evaluating personnel, flight hardware, and public safety concerns associated with current operating systems as well as with any future systems. The Analysis Group includes a staff of statistical and reliability experts with valuable backgrounds in the statistical, reliability, and engineering fields. This group includes JSC S&MA Analysis Branch personnel as well as S&MA support services contractors, such as Science Applications International Corporation (SAIC) and SoHaR. The Analysis Group's experience base includes nuclear power (both commercial and navy), manufacturing, Department of Defense, chemical, and shipping industries, as well as significant aerospace experience specifically in the Shuttle, International Space Station (ISS), and Constellation Programs. The Analysis Group partners with project and program offices, other NASA centers, NASA contractors, and universities to provide additional resources or information to the group when performing various analysis tasks. The JSC S&MA Analysis Group is recognized as a leader in risk and reliability analysis within the NASA community. Therefore, the Analysis Group is in high demand to help the Space Shuttle Program (SSP) continue to fly safely, assist in designing the next generation spacecraft for the Constellation Program (CxP), and promote advanced analytical techniques. The Analysis Section's tasks include teaching classes and instituting personnel qualification processes to enhance the professional abilities of our analysts as well as performing major probabilistic assessments used to support flight rationale and help establish program requirements. During 2008, the Analysis Group performed more than 70 assessments. Although all these assessments were important, some were instrumental in the decisionmaking processes for the Shuttle and Constellation Programs. Two of the more significant tasks were the Space Transportation System (STS)-122 Low Level Cutoff PRA for the SSP and the Orion Pad Abort One (PA-1) PRA for the CxP. These two activities, along with the numerous other tasks the Analysis Group performed in 2008, are summarized in this report. This report also highlights several ongoing and upcoming efforts to provide crucial statistical and probabilistic assessments, such as the Extravehicular Activity (EVA) PRA for the Hubble Space Telescope service mission and the first fully integrated PRAs for the CxP's Lunar Sortie and ISS missions.

  13. Linear combination reading program for capture gamma rays

    USGS Publications Warehouse

    Tanner, Allan B.

    1971-01-01

    This program computes a weighting function, Qj, which gives a scalar output value of unity when applied to the spectrum of a desired element and a minimum value (considering statistics) when applied to spectra of materials not containing the desired element. Intermediate values are obtained for materials containing the desired element, in proportion to the amount of the element they contain. The program is written in the BASIC language in a format specific to the Hewlett-Packard 2000A Time-Sharing System, and is an adaptation of an earlier program for linear combination reading for X-ray fluorescence analysis (Tanner and Brinkerhoff, 1971). Following the program is a sample run from a study of the application of the linear combination technique to capture-gamma-ray analysis for calcium (report in preparation).
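
    One textbook way to obtain a weighting function with the stated properties, unit response to the target element's spectrum and minimal (here zero) response to specified interfering spectra while minimizing statistical variance, is a constrained minimum-variance filter. The sketch below uses that formulation with synthetic spectra; it is not a transcription of the original BASIC program, and the spectra and variances are invented.

      # Constrained minimum-variance weighting function Q: unit response to the
      # first (target) spectrum, zero response to the others, minimum variance
      # otherwise.  Spectra and variances are synthetic placeholders.
      import numpy as np

      def weighting_function(spectra, channel_variance):
          """spectra: (n_channels, n_materials) matrix, first column = target."""
          c_inv = np.diag(1.0 / channel_variance)
          m = spectra.T @ c_inv @ spectra
          target_response = np.zeros(spectra.shape[1])
          target_response[0] = 1.0
          # Q = C^-1 S (S^T C^-1 S)^-1 e1  (Lagrange-multiplier solution)
          return c_inv @ spectra @ np.linalg.solve(m, target_response)

      ch = np.arange(64)
      calcium   = np.exp(-0.5 * ((ch - 40) / 2.5) ** 2)      # toy capture line of interest
      hydrogen  = np.exp(-0.5 * ((ch - 20) / 3.0) ** 2)      # toy interfering line
      continuum = np.full(64, 1.0)                           # toy background shape
      S = np.column_stack([calcium, hydrogen, continuum])
      q = weighting_function(S, channel_variance=np.full(64, 5.0))

      print("response to calcium  :", round(float(q @ calcium), 3))    # = 1.0 by construction
      print("response to hydrogen :", round(float(q @ hydrogen), 3))   # = 0.0 by construction
      print("response to continuum:", round(float(q @ continuum), 3))  # = 0.0 by construction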

  14. A plan for the North American Bat Monitoring Program (NABat)

    USGS Publications Warehouse

    Loeb, Susan C.; Rodhouse, Thomas J.; Ellison, Laura E.; Lausen, Cori L.; Reichard, Jonathan D.; Irvine, Kathryn M.; Ingersoll, Thomas E.; Coleman, Jeremy; Thogmartin, Wayne E.; Sauer, John R.; Francis, Charles M.; Bayless, Mylea L.; Stanley, Thomas R.; Johnson, Douglas H.

    2015-01-01

    The purpose of the North American Bat Monitoring Program (NABat) is to create a continent-wide program to monitor bats at local to rangewide scales that will provide reliable data to promote effective conservation decisionmaking and the long-term viability of bat populations across the continent. This is an international, multiagency program. Four approaches will be used to gather monitoring data to assess changes in bat distributions and abundances: winter hibernaculum counts, maternity colony counts, mobile acoustic surveys along road transects, and acoustic surveys at stationary points. These monitoring approaches are described along with methods for identifying species recorded by acoustic detectors. Other chapters describe the sampling design, the database management system (Bat Population Database), and statistical approaches that can be used to analyze data collected through this program.

  15. The Functional Measurement Experiment Builder suite: two Java-based programs to generate and run functional measurement experiments.

    PubMed

    Mairesse, Olivier; Hofmans, Joeri; Theuns, Peter

    2008-05-01

    We propose a free, easy-to-use computer program that does not require prior knowledge of computer programming to generate and run experiments using textual or pictorial stimuli. Although the FM Experiment Builder suite was initially programmed for building and conducting FM experiments, it can also be applied to non-FM experiments that necessitate randomized, single, or multifactorial designs. The program is highly configurable, allowing multilingual use and a wide range of different response formats. The outputs of the experiments are Microsoft Excel-compatible .xls files that allow easy copy-paste of the results into Weiss's FM CalSTAT program (2006) or any other statistical package. Its Java-based structure is compatible with both Windows and Macintosh operating systems, and its compactness (< 1 MB) makes it easily distributable over the Internet.

  16. NCI: DCTD: Biometric Research Program

    Cancer.gov

    The Biometric Research Program (BRP) is the statistical and biomathematical component of the Division of Cancer Treatment, Diagnosis and Centers (DCTDC). Its members provide statistical leadership for the national and international research programs of the division in developmental therapeutics, developmental diagnostics, diagnostic imaging and clinical trials.

  18. Effects of health intervention programs and arsenic exposure on child mortality from acute lower respiratory infections in rural Bangladesh.

    PubMed

    Jochem, Warren C; Razzaque, Abdur; Root, Elisabeth Dowling

    2016-09-01

    Respiratory infections continue to be a public health threat, particularly to young children in developing countries. Understanding the geographic patterns of diseases and the role of potential risk factors can help improve future mitigation efforts. Toward this goal, this paper applies a spatial scan statistic combined with a zero-inflated negative-binomial regression to re-examine the impacts of a community-based treatment program on the geographic patterns of acute lower respiratory infection (ALRI) mortality in an area of rural Bangladesh. Exposure to arsenic-contaminated drinking water is also a serious threat to the health of children in this area, and the variation in exposure to arsenic must be considered when evaluating the health interventions. ALRI mortality data were obtained for children under 2 years old from 1989 to 1996 in the Matlab Health and Demographic Surveillance System. This study period covers the years immediately following the implementation of an ALRI control program. A zero-inflated negative binomial (ZINB) regression model was first used to simultaneously estimate mortality rates and the likelihood of no deaths in groups of related households while controlling for socioeconomic status, potential arsenic exposure, and access to care. Next a spatial scan statistic was used to assess the location and magnitude of clusters of ALRI mortality. The ZINB model was used to adjust the scan statistic for multiple social and environmental risk factors. The results of the ZINB models and spatial scan statistic suggest that the ALRI control program was successful in reducing child mortality in the study area. Exposure to arsenic-contaminated drinking water was not associated with increased mortality. Higher socioeconomic status also significantly reduced mortality rates, even among households who were in the treatment program area. Community-based ALRI interventions can be effective at reducing child mortality, though socioeconomic factors may continue to influence mortality patterns. The combination of spatial and non-spatial methods used in this paper has not been applied previously in the literature, and this study demonstrates the importance of such approaches for evaluating and improving public health intervention programs.
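
    A zero-inflated negative binomial model of the kind described can be fitted with standard libraries; the sketch below uses the ZeroInflatedNegativeBinomialP class from statsmodels on simulated household-level counts, with placeholder covariates standing in for program area, socioeconomic status, and arsenic exposure. It illustrates only the regression component, not the spatial scan statistic, and none of the data or coefficients come from the study.

      # Sketch of a zero-inflated negative binomial (ZINB) model of the kind
      # described above, fitted on simulated data.  Covariate names are
      # placeholders; this is not the study's dataset or final specification.
      import numpy as np
      import statsmodels.api as sm
      from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

      rng = np.random.default_rng(0)
      n = 2000
      ses        = rng.normal(size=n)                  # socioeconomic status (standardized)
      arsenic    = rng.binomial(1, 0.3, size=n)        # high-arsenic well (0/1)
      in_program = rng.binomial(1, 0.5, size=n)        # treatment-program area (0/1)

      # Simulate counts from a structural-zero process plus a negative binomial process.
      p_struct_zero = 1.0 / (1.0 + np.exp(-(0.5 - 0.3 * ses)))
      mu = np.exp(0.2 - 0.4 * in_program - 0.3 * ses + 0.1 * arsenic)
      nb_counts = rng.negative_binomial(2, 2.0 / (2.0 + mu))        # mean mu, shape 2
      counts = np.where(rng.random(n) < p_struct_zero, 0, nb_counts)

      X = sm.add_constant(np.column_stack([in_program, ses, arsenic]))
      zinb = ZeroInflatedNegativeBinomialP(counts, X, exog_infl=sm.add_constant(ses), p=2)
      fit = zinb.fit(method="bfgs", maxiter=500, disp=False)
      print(fit.summary())    # inflation (logit) part first, then the count-model part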

  19. [Data collection in anesthesia. Experiences with the inauguration of a new information system].

    PubMed

    Zbinden, A M; Rothenbühler, H; Häberli, B

    1997-06-01

    In many institutions information systems are used to process off-line anaesthesia data for invoices, statistical purposes, and quality assurance. Information systems are also increasingly being used to improve process control in order to reduce costs. Most of today's systems were created when information technology and working processes in anaesthesia were very different from those in use today. Thus, many institutions must now replace their computer systems but are probably not aware of how complex this change will be. Modern information systems mostly use client-server architecture and relational data bases. Replacing an old system with a new one is frequently a greater task than designing a system from scratch. This article gives the conclusions drawn from the experience obtained when a large departmental computer system was redesigned in a university hospital. The new system was based on a client-server architecture and was developed by an external company without preceding conceptual analysis. Modules for patient, anaesthesia, surgical, and pain-service data were included. Data were analysed using a separate statistical package (RS/1 from Bolt Beranek), taking advantage of its powerful precompiled procedures. Development and introduction of the new system took much more time and effort than expected despite the use of modern software tools. Introduction of the new program required intensive user training despite the choice of modern graphic screen layouts. Automatic data-reading systems could not be used, as too many faults occurred and the effort for the user was too high. However, after the initial problems were solved the system turned out to be a powerful tool for quality control (both process and outcome quality), billing, and scheduling. The statistical analysis of the data resulted in meaningful and relevant conclusions. Before creating a new information system, the working processes have to be analysed and, if possible, made more efficient; a detailed programme specification must then be made. A servicing and maintenance contract should be drawn up before the order is given to a company. Time periods of equal duration have to be scheduled for defining, writing, testing and introducing the program. Modern client-server systems with relational data bases are by no means simpler to establish and maintain than previous mainframe systems with hierarchical data bases, and thus, experienced computer specialists need to be close at hand. We recommend collecting data only once for both statistics and quality control. To verify data quality, a system of random spot-sampling has to be established. Despite the large investments needed to build up such a system, we consider it a powerful tool for helping to solve the difficult daily problems of managing a surgical and anaesthesia unit.

  20. [Gait, balance and independence rehabilitation program in elderly adults in a primary care unit].

    PubMed

    Espinosa-Cuervo, Gisela; López-Roldán, Verónica Miriam; Escobar-Rodríguez, David Alvaro; Conde-Embarcadero, Margarita; Trejo-León, Gerardo; González-Carmona, Beatriz

    2013-01-01

    The aim was to evaluate the effect of a supervised rehabilitation program to improve gait, balance, and independence in elderly patients attending a family medicine unit. We conducted a quasi-experimental study over a period of four weeks in a group of 72 patients older than 65 years. The supervised program addressed risk factors for falling and trained balance, gait, coordination, and the oculovestibular system, with sessions performed two or three times a week in the primary care unit or at home. Patients were assessed with the "up and go" test, the Tinetti scale, and the Katz index, and analyses were performed both by "intention to treat" and "by protocol." Mean age was 72 ± 5 years, 67.8% were female, and 81.9% of the patients completed the program. Statistically significant clinical improvement was evident for gait and balance (p = 0.001); independence showed only clinical improvement (p = 0.083). Efficacy was similar regardless of periodicity (two or three times/week) and performance place, with the same clinical improvement and statistical significance for gait and balance (p = 0.001 to 0.003), while independence again showed only clinical improvement (p = 0.317 to 0.991). An integral rehabilitation program significantly improved gait and balance and clinically improved independence. The supervised program is applicable and can be reproduced at the primary care unit or at home for geriatric care and preventive actions.

  1. Duane Webster, Assessment Pioneer

    ERIC Educational Resources Information Center

    Franklin, Brinley

    2009-01-01

    Duane Webster oversaw the Association of Research Libraries' (ARL) Statistics and Measurement Program as it evolved into the Statistics and Assessment Program. During his 20-year tenure as ARL's executive director, Duane was instrumental in the creation of ARL's Web-based Interactive Statistics and played a leadership role in the development of a…

  2. Education Statistics on Disk. [CD-ROM.

    ERIC Educational Resources Information Center

    National Center for Education Statistics (ED), Washington, DC.

    This CD-ROM disk contains a computer program developed by the Office of Educational Research and Improvement to provide convenient access to the wealth of education statistics published by the National Center for Education Statistics (NCES). The program contains over 1,800 tables, charts, and text files from the following NCES publications,…

  3. Overview of the SAMSI year-long program on Statistical, Mathematical and Computational Methods for Astronomy

    NASA Astrophysics Data System (ADS)

    Jogesh Babu, G.

    2017-01-01

    A year-long research program (Aug 2016 - May 2017) on 'Statistical, Mathematical and Computational Methods for Astronomy (ASTRO)' is well under way at the Statistical and Applied Mathematical Sciences Institute (SAMSI), a National Science Foundation research institute in Research Triangle Park, NC. This program has brought together astronomers, computer scientists, applied mathematicians and statisticians. The main aims of this program are: to foster cross-disciplinary activities; to accelerate the adoption of modern statistical and mathematical tools into modern astronomy; and to develop new tools needed for important astronomical research problems. The program provides multiple avenues for cross-disciplinary interactions, including several workshops, long-term visitors, and regular teleconferences, so participants can continue collaborations even if they can only spend limited time in residence at SAMSI. The main program is organized around five working groups: i) Uncertainty Quantification and Astrophysical Emulation; ii) Synoptic Time Domain Surveys; iii) Multivariate and Irregularly Sampled Time Series; iv) Astrophysical Populations; v) Statistics, Computation, and Modeling in Cosmology. A brief description of the work under way by each of these groups will be given. Overlaps among the various working groups will also be highlighted. How the wider astronomy community can both participate in and benefit from the activities will be briefly mentioned.

  4. MODFLOW-2000, the U.S. Geological Survey modular ground-water model; user guide to the observation, sensitivity, and parameter-estimation processes and three post-processing programs

    USGS Publications Warehouse

    Hill, Mary C.; Banta, E.R.; Harbaugh, A.W.; Anderman, E.R.

    2000-01-01

    This report documents the Observation, Sensitivity, and Parameter-Estimation Processes of the ground-water modeling computer program MODFLOW-2000. The Observation Process generates model-calculated values for comparison with measured, or observed, quantities. A variety of statistics is calculated to quantify this comparison, including a weighted least-squares objective function. In addition, a number of files are produced that can be used to compare the values graphically. The Sensitivity Process calculates the sensitivity of hydraulic heads throughout the model with respect to specified parameters using the accurate sensitivity-equation method. These are called grid sensitivities. If the Observation Process is active, it uses the grid sensitivities to calculate sensitivities for the simulated values associated with the observations. These are called observation sensitivities. Observation sensitivities are used to calculate a number of statistics that can be used (1) to diagnose inadequate data, (2) to identify parameters that probably cannot be estimated by regression using the available observations, and (3) to evaluate the utility of proposed new data. The Parameter-Estimation Process uses a modified Gauss-Newton method to adjust values of user-selected input parameters in an iterative procedure to minimize the value of the weighted least-squares objective function. Statistics produced by the Parameter-Estimation Process can be used to evaluate estimated parameter values; statistics produced by the Observation Process and post-processing program RESAN-2000 can be used to evaluate how accurately the model represents the actual processes; statistics produced by post-processing program YCINT-2000 can be used to quantify the uncertainty of model simulated values. Parameters are defined in the Ground-Water Flow Process input files and can be used to calculate most model inputs, such as: for explicitly defined model layers, horizontal hydraulic conductivity, horizontal anisotropy, vertical hydraulic conductivity or vertical anisotropy, specific storage, and specific yield; and, for implicitly represented layers, vertical hydraulic conductivity. In addition, parameters can be defined to calculate the hydraulic conductance of the River, General-Head Boundary, and Drain Packages; areal recharge rates of the Recharge Package; maximum evapotranspiration of the Evapotranspiration Package; pumpage or the rate of flow at defined-flux boundaries of the Well Package; and the hydraulic head at constant-head boundaries. The spatial variation of model inputs produced using defined parameters is very flexible, including interpolated distributions that require the summation of contributions from different parameters. Observations can include measured hydraulic heads or temporal changes in hydraulic heads, measured gains and losses along head-dependent boundaries (such as streams), flows through constant-head boundaries, and advective transport through the system, which generally would be inferred from measured concentrations. MODFLOW-2000 is intended for use on any computer operating system. The program consists of algorithms programmed in Fortran 90, which efficiently performs numerical calculations and is fully compatible with the newer Fortran 95. The code is easily modified to be compatible with FORTRAN 77. Coordination for multiple processors is accommodated using Message Passing Interface (MPI) commands. 
The program is designed in a modular fashion that is intended to support inclusion of new capabilities.
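
    The parameter-estimation idea described above, iteratively adjusting parameters with a modified Gauss-Newton step to minimize a weighted least-squares objective, can be shown in miniature. The forward model, observations, and weights below are synthetic toys, not MODFLOW quantities or the program's actual algorithmic details.

      # Miniature illustration of the Gauss-Newton / weighted least-squares idea:
      # the "model" here is a toy nonlinear function, not MODFLOW.
      import numpy as np

      def model(params, x):
          a, b = params
          return a * np.exp(-b * x)              # toy stand-in for simulated heads/flows

      def gauss_newton(x, obs, weights, params, n_iter=20):
          W = np.diag(weights)
          for _ in range(n_iter):
              resid = obs - model(params, x)
              # Finite-difference Jacobian of the simulated values w.r.t. parameters.
              J = np.empty((x.size, params.size))
              for j in range(params.size):
                  dp = np.zeros_like(params)
                  dp[j] = 1e-6 * max(abs(params[j]), 1.0)
                  J[:, j] = (model(params + dp, x) - model(params, x)) / dp[j]
              # Normal equations of the weighted least-squares objective.
              step = np.linalg.solve(J.T @ W @ J, J.T @ W @ resid)
              params = params + step
          resid = obs - model(params, x)
          return params, float(resid @ W @ resid)

      x = np.linspace(0.0, 5.0, 30)
      rng = np.random.default_rng(1)
      obs = model(np.array([10.0, 0.7]), x) + rng.normal(0.0, 0.2, x.size)
      weights = np.full(x.size, 1.0 / 0.2**2)    # weight = 1 / observation variance

      est, ssq = gauss_newton(x, obs, weights, params=np.array([5.0, 0.3]))
      print("estimated parameters:", est.round(3), " weighted SSQ:", round(ssq, 2))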

  5. Web-GIS platform for forest fire danger prediction in Ukraine: prospects of RS technologies

    NASA Astrophysics Data System (ADS)

    Baranovskiy, N. V.; Zharikova, M. V.

    2016-10-01

    Many different statistical and empirical methods for forest fire danger estimation are in use at present, but most of these systems lack a physical basis. Over the last decade a deterministic-probabilistic method has been developed rapidly at Tomsk Polytechnic University. Classification of forest sites is one way to estimate forest fire danger, and we use this method in the present work. The forest fire danger estimate depends on forest vegetation condition, forest fire history, precipitation, and air temperature; in effect, we use a modified Nesterov criterion. Lightning activity is also considered as a high-temperature ignition source. We use a Web-GIS platform for the program realization of this method: the fire danger assessment system is implemented as a Web-oriented geoinformation system developed on the Django platform in the Python programming language, with the GeoDjango framework used to realize the cartographic functions. We suggest using Terra/Aqua MODIS products for hot spot monitoring. A typical territory for forest fire danger estimation is the Proletarskoe forestry of the Kherson region (Ukraine).
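
    For orientation, one common formulation of the Nesterov-type criterion mentioned above is a running sum of T(T - T_dew) over days, reset when daily precipitation exceeds a threshold. The sketch below uses that formulation; the 3 mm reset value and the danger-class boundaries are placeholders, since the modified criterion and its implementations differ in detail.

      # One common formulation of a Nesterov-type fire-danger index: a running
      # sum of T * (T - T_dew) over days, reset when daily precipitation exceeds
      # a threshold.  Thresholds and class boundaries below are placeholders.
      def nesterov_index(daily_temp_c, daily_dewpoint_c, daily_precip_mm, reset_precip_mm=3.0):
          index, series = 0.0, []
          for t, td, p in zip(daily_temp_c, daily_dewpoint_c, daily_precip_mm):
              if p >= reset_precip_mm:
                  index = 0.0                     # rain resets the accumulated dryness
              else:
                  index += max(t, 0.0) * max(t - td, 0.0)
              series.append(index)
          return series

      def danger_class(index):
          # Placeholder class boundaries for illustration only.
          for bound, label in [(300, "minimal"), (1000, "moderate"),
                               (4000, "high"), (10000, "very high")]:
              if index < bound:
                  return label
          return "extreme"

      temps   = [22, 24, 26, 27, 25, 23, 28]
      dews    = [12, 11, 10, 9, 14, 20, 8]
      precips = [0, 0, 0, 0, 0, 5, 0]
      ni = nesterov_index(temps, dews, precips)
      print([round(v) for v in ni], [danger_class(v) for v in ni])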

  6. Computer program for prediction of capture maneuver probability for an on-off reaction controlled upper stage

    NASA Technical Reports Server (NTRS)

    Knauber, R. N.

    1982-01-01

    A FORTRAN-coded computer program which computes the capture transient of a launch vehicle upper stage at the ignition and/or separation event is presented. It is for a single degree-of-freedom on-off reaction jet attitude control system. The Monte Carlo method is used to determine the statistical value of key parameters at the outcome of the event. Aerodynamic and booster-induced disturbances, vehicle and control system characteristics, and initial conditions are treated as random variables. By appropriate selection of input data, the pitch, yaw, and roll axes can be analyzed. The transient response of a single deterministic case can also be computed. The program is currently set up on a CDC CYBER 175 computer system but is compatible with the ANSI FORTRAN computer language. This routine has been used over the past fifteen (15) years for the SCOUT Launch Vehicle and has been run on RECOMP III, IBM 7090, IBM 360/370, CDC6600 and CDC CYBER 175 computers with little modification.

  7. An expert system for wind shear avoidance

    NASA Technical Reports Server (NTRS)

    Stengel, Robert F.; Stratton, D. Alexander

    1990-01-01

    The principal objectives are to develop methods for assessing the likelihood of wind shear encounter (based on real-time information in the cockpit), for deciding what flight path to pursue (e.g., takeoff abort, landing go-around, or normal climbout or glide slope), and for using the aircraft's full potential for combating wind shear. This study requires the definition of both deterministic and statistical techniques for fusing internal and external information, for making go/no-go decisions, and for generating commands to the aircraft's autopilot and flight directors for both automatic and manually controlled flight. The expert system for pilot aiding is based on the results of the FAA Windshear Training Aids Program, a two-volume manual that presents an overview, pilot guide, training program, and substantiating data that provides guidelines for this initial development. The Windshear Safety Advisor expert system currently contains over 140 rules and is coded in the LISP programming language for implementation on a Symbolics 3670 LISP Machine.

  8. An introduction to data reduction: space-group determination, scaling and intensity statistics.

    PubMed

    Evans, Philip R

    2011-04-01

    This paper presents an overview of how to run the CCP4 programs for data reduction (SCALA, POINTLESS and CTRUNCATE) through the CCP4 graphical interface ccp4i and points out some issues that need to be considered, together with a few examples. It covers determination of the point-group symmetry of the diffraction data (the Laue group), which is required for the subsequent scaling step, examination of systematic absences, which in many cases will allow inference of the space group, putting multiple data sets on a common indexing system when there are alternatives, the scaling step itself, which produces a large set of data-quality indicators, estimation of |F| from intensity and finally examination of intensity statistics to detect crystal pathologies such as twinning. An appendix outlines the scoring schemes used by the program POINTLESS to assign probabilities to possible Laue and space groups.

  9. 18th Annual NDIA SOLIC Symposium and Exhibition - Warfare in the Seams: Defense and Industry Partnering to Win the Long War. Volume 1. Presentations

    DTIC Science & Technology

    2007-02-28

    [Fragmented presentation excerpts] The slides cover, among other topics, defense HUMINT activities (DIA ran the attaché system; over time the Secretary's authorities were deferred; post-1995, Perry and White…) and the ORNL (Chameleon) Cognitive Radio Program, a sensors-and-communications concept that adapts its internal states in real time to meet user requirements and goals and that learns using statistical signal processing and machine learning to reflect…

  10. Photographic and photometric enhancement of Lunar Orbiter products, projects A, B and C

    NASA Technical Reports Server (NTRS)

    1972-01-01

    A detailed discussion is presented of the framelet joining, photometric data improvement, and statistical error analysis. The Lunar Orbiter film handling system, readout system, and the digitization are described, along with the technique of joining adjacent framelets by using a digital computer. Time and cost estimates are given. The problems and techniques involved in improving the digitized data are discussed. It was found that spectacular improvements are possible. Program documentation is included.

  11. Forecasting Ocean Waves: Comparing a Physics-Based Model with Statistical Models

    DTIC Science & Technology

    2011-01-01

    Buoy stations listed include 46029 (135 m), 46211 (38 m; CDIP-036), 42039 (307 m), 42040 (165 m), and 42007 (14 m), with boundary forcing from NCEP WW3 ENP at 15′×15′ resolution for SWAN CNW-G1 … wave energy. Acronyms and abbreviations: CenGOOS, Central Gulf Ocean Observing System; CDIP, Coastal Data Information Program; CNW, Coastal Northwest; SWAN …

  12. On the Training of Radio and Communications Engineers in the Decades of the Immediate Future.

    ERIC Educational Resources Information Center

    Klyatskin, I.G.

    A list of 11 statements relating to the change in training programs for radio and communications engineers is presented in this article, in preparation for future developments in the field. Semiconductors, decimeter and centimeter radio frequency ranges, and a statistical approach to communications systems are analyzed as the three important…

  13. OCLC Annual Report. 1990/91.

    ERIC Educational Resources Information Center

    OCLC Online Computer Library Center, Inc., Dublin, OH.

    Beginning this annual report is a letter to the OCLC membership from OCLC President and Chief Executive Officer, K. Wayne Smith. Statistical data are then presented in tables and/or graphs for OCLC programs and the system's financial status for fiscal years 1990/91 and 1989/90; the growth of the OCLC Online Union Catalog from 1971-1991 in terms of…

  14. Government-Funded Students and Courses--January to March 2016. Australian Vocational Education and Training Statistics

    ERIC Educational Resources Information Center

    National Centre for Vocational Education Research (NCVER), 2016

    2016-01-01

    This publication provides a summary of data relating to students, programs, subjects, and training providers in Australia's government-funded vocational education and training (VET) system (defined as Commonwealth and state/territory government funded training). This is the first time that government-funded data from one quarter is compared with…

  15. Australian Vocational Education and Training Statistics: Government-Funded Students and Courses-- January to September 2016

    ERIC Educational Resources Information Center

    National Centre for Vocational Education Research (NCVER), 2016

    2016-01-01

    This publication provides a summary of data relating to students, programs, subjects and training providers in Australia's government-funded vocational education and training (VET) system (defined as Commonwealth and state/territory government-funded training). The data in this publication cover the period of 1 January to 30 September 2016. For…

  16. A Model for Post Hoc Evaluation.

    ERIC Educational Resources Information Center

    Theimer, William C., Jr.

    Often a research department in a school system is called on to make an after-the-fact evaluation of a program or project. Although the department is operating under a handicap, it can still provide some data useful for evaluative purposes. It is suggested that all the classical methods of descriptive statistics be brought into play. The use of…

  17. A generic minimization random allocation and blinding system on web.

    PubMed

    Cai, Hongwei; Xia, Jielai; Xu, Dezhong; Gao, Donghuai; Yan, Yongping

    2006-12-01

    Minimization is a dynamic randomization method for clinical trials. Although recommended by many researchers, the use of minimization has seldom been reported in randomized trials, mainly because of controversy surrounding the validity of conventional analyses and its complexity in implementation. However, both the statistical and clinical validity of minimization have been demonstrated in recent studies. A minimization random allocation system integrated with a blinding function, which could facilitate the implementation of this method in general clinical trials, has not previously been reported. SYSTEM OVERVIEW: The system is a web-based random allocation system using the Pocock and Simon minimization method. It also supports multiple treatment arms within a trial, multiple simultaneous trials, and blinding without further programming. The system was constructed with a generic database-schema design method, the Pocock and Simon minimization method, and a blinding method. It was coded in the Microsoft Visual Basic and Active Server Pages (ASP) programming languages, and all datasets were managed with a Microsoft SQL Server database. Some critical programming code is also provided. SIMULATIONS AND RESULTS: Two clinical trials were simulated simultaneously to test the system's applicability. Not only balanced groups but also blinded allocation results were achieved in both trials. Practical considerations for the minimization method and the benefits, general applicability, and drawbacks of the technique implemented in this system are discussed. Promising features of the proposed system are also summarized.
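
    The Pocock and Simon method that the system implements assigns each new patient, with a biased-coin probability, to the treatment arm that minimizes an imbalance score over the chosen prognostic factors. The sketch below is a simplified version using the range of marginal counts as the imbalance measure; it is not the web system's code and omits the web, database, and blinding layers.

      # Simplified sketch of Pocock-Simon minimization: each new patient goes,
      # with high probability, to the arm that minimizes total marginal
      # imbalance across prognostic factors.
      import random
      from collections import defaultdict

      class Minimizer:
          def __init__(self, arms, factors, p_best=0.8, seed=0):
              self.arms, self.factors = arms, factors
              self.p_best = p_best
              self.rng = random.Random(seed)
              # counts[(factor, level)][arm] = number of patients already assigned
              self.counts = defaultdict(lambda: {a: 0 for a in arms})

          def _imbalance_if(self, arm, patient):
              score = 0
              for f in self.factors:
                  cell = dict(self.counts[(f, patient[f])])
                  cell[arm] += 1                                     # hypothetical assignment
                  score += max(cell.values()) - min(cell.values())   # range over arms
              return score

          def assign(self, patient):
              scores = {a: self._imbalance_if(a, patient) for a in self.arms}
              best = min(scores, key=scores.get)
              others = [a for a in self.arms if a != best]
              arm = best if self.rng.random() < self.p_best else self.rng.choice(others)
              for f in self.factors:
                  self.counts[(f, patient[f])][arm] += 1
              return arm

      trial = Minimizer(arms=["A", "B"], factors=["sex", "age_group", "site"])
      print(trial.assign({"sex": "F", "age_group": "<50", "site": "centre1"}))
      print(trial.assign({"sex": "M", "age_group": "<50", "site": "centre1"}))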

  18. TIERRAS: A package to simulate high energy cosmic ray showers underground, underwater and under-ice

    NASA Astrophysics Data System (ADS)

    Tueros, Matías; Sciutto, Sergio

    2010-02-01

    In this paper we present TIERRAS, a Monte Carlo simulation program based on the well-known AIRES air shower simulations system that enables the propagation of particle cascades underground, providing a tool to study particles arriving underground from a primary cosmic ray on the atmosphere or to initiate cascades directly underground and propagate them, exiting into the atmosphere if necessary. We show several cross-checks of its results against CORSIKA, FLUKA, GEANT and ZHS simulations and we make some considerations regarding its possible use and limitations. The first results of full underground shower simulations are presented, as an example of the package capabilities. Program summaryProgram title: TIERRAS for AIRES Catalogue identifier: AEFO_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEFO_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 36 489 No. of bytes in distributed program, including test data, etc.: 3 261 669 Distribution format: tar.gz Programming language: Fortran 77 and C Computer: PC, Alpha, IBM, HP, Silicon Graphics and Sun workstations Operating system: Linux, DEC Unix, AIX, SunOS, Unix System V RAM: 22 Mb bytes Classification: 1.1 External routines: TIERRAS requires AIRES 2.8.4 to be installed on the system. AIRES 2.8.4 can be downloaded from http://www.fisica.unlp.edu.ar/auger/aires/eg_AiresDownload.html. Nature of problem: Simulation of high and ultra high energy underground particle showers. Solution method: Modification of the AIRES 2.8.4 code to accommodate underground conditions. Restrictions: In AIRES some processes that are not statistically significant on the atmosphere are not simulated. In particular, it does not include muon photonuclear processes. This imposes a limitation on the application of this package to a depth of 1 km of standard rock (or 2.5 km of water equivalent). Neutrinos are not tracked on the simulation, but their energy is taken into account in decays. Running time: A TIERRAS for AIRES run of a 10 eV shower with statistical sampling (thinning) below 10 eV and 0.2 weight factor (see [1]) uses approximately 1 h of CPU time on an Intel Core 2 Quad Q6600 at 2.4 GHz. It uses only one core, so 4 simultaneous simulations can be run on this computer. Aires includes a spooling system to run several simultaneous jobs of any type. References:S. Sciutto, AIRES 2.6 User Manual, http://www.fisica.unlp.edu.ar/auger/aires/.

  19. A program for the Bayesian Neural Network in the ROOT framework

    NASA Astrophysics Data System (ADS)

    Zhong, Jiahang; Huang, Run-Sheng; Lee, Shih-Chang

    2011-12-01

    We present a Bayesian Neural Network algorithm implemented in the TMVA package (Hoecker et al., 2007 [1]), within the ROOT framework (Brun and Rademakers, 1997 [2]). Compared with the conventional use of a Neural Network as a discriminator, this new implementation has advantages as a non-parametric regression tool, particularly for fitting probabilities. It provides functionalities including cost function selection, complexity control and uncertainty estimation. An example of such an application in High Energy Physics is shown. The algorithm is available with ROOT releases later than 5.29. Program summary: Program title: TMVA-BNN Catalogue identifier: AEJX_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJX_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: BSD license No. of lines in distributed program, including test data, etc.: 5094 No. of bytes in distributed program, including test data, etc.: 1,320,987 Distribution format: tar.gz Programming language: C++ Computer: Any computer system or cluster with C++ compiler and UNIX-like operating system Operating system: Most UNIX/Linux systems. The application programs were thoroughly tested under Fedora and Scientific Linux CERN. Classification: 11.9 External routines: ROOT package version 5.29 or higher (http://root.cern.ch) Nature of problem: Non-parametric fitting of multivariate distributions Solution method: An implementation of Neural Network following the Bayesian statistical interpretation. Uses Laplace approximation for the Bayesian marginalizations. Provides the functionalities of automatic complexity control and uncertainty estimation. Running time: Time consumption for the training depends substantially on the size of input sample, the NN topology, the number of training iterations, etc. For the example in this manuscript, about 7 min was used on a PC/Linux with 2.0 GHz processors.

  20. Short-term effects of an intensive lifestyle modification program on lipid peroxidation and antioxidant systems in patients with coronary artery disease.

    PubMed

    Jatuporn, Srisakul; Sangwatanaroj, Somkiat; Saengsiri, Aem-Orn; Rattanapruks, Sopida; Srimahachota, Suphot; Uthayachalerm, Wasan; Kuanoon, Wanpen; Panpakdee, Orasa; Tangkijvanich, Pisit; Tosukhowong, Piyaratana

    2003-01-01

    The purpose of this study was to compare the short-term effects of an intensive lifestyle modification (ILM) program on lipid peroxidation and antioxidant systems in patients with coronary artery disease (CAD). Twenty-two patients in the control group continued to receive their conventional treatment with lipid-lowering drugs, whereas 22 patients in the experimental group were assigned to intensive lifestyle modification (ILM) without taking any lipid-lowering agent. The ILM program comprised dietary advice on low-fat diets, high antioxidants and high fiber intakes, yoga exercise, stress management and smoking cessation. After 4 months of intervention, patients in the experimental group revealed a statistically significant increase in plasma total antioxidants, plasma vitamin E and erythrocyte glutathione (GSH) compared to patients in the control group. There was no significant change in plasma malondialdehyde (MDA), a circulating product of lipid peroxidation, in either group. We concluded that the ILM program increased circulating antioxidants and reduced oxidative stress in patients with CAD.

  1. The Electronic Nose Training Automation Development

    NASA Technical Reports Server (NTRS)

    Schattke, Nathan

    2002-01-01

    The electronic nose is a method of using several sensors in conjunction to identify an unknown gas. Statistical analysis has shown that a large number of training exposures need to be performed in order to get a model that can be depended on. The number of training exposures needed is on the order of 1000. Data acquisition from the noses is generally automatic and built in. The gas generation equipment consists of a Miller-Nelson (MN) flow/temperature/humidity controller and a Kin-Tek (KT) trace gas generator. This equipment has been controlled in the past by an old data acquisition and control system. The new system will use new control boards and an easy graphical user interface. The programming for this is in the LabVIEW G programming language, a language that is easy for the user to modify. This paper details some of the issues in selecting the components and programming the connections. It is not a primer on LabVIEW programming; a separate CD with website files is being delivered to teach that.

  2. Microcomputer package for statistical analysis of microbial populations.

    PubMed

    Lacroix, J M; Lavoie, M C

    1987-11-01

    We have developed a Pascal system to compare microbial populations from different ecological sites using microcomputers. The values calculated are: the coverage value and its standard error, the minimum similarity and the geometric similarity between two biological samples, and the Lambda test, which calculates the ratio of the mean similarity between two subsets to the mean similarity within subsets. This system is written for Apple II, IBM or compatible computers, but it can work for any computer which can use CP/M, if the programs are recompiled for such a system.

  3. Building Analytic Capacity and Statistical Literacy Among Title IV-E MSW Students

    PubMed Central

    LERY, BRIDGETTE; PUTNAM-HORNSTEIN, EMILY; WIEGMANN, WENDY; KING, BRYN

    2016-01-01

    Building and sustaining effective child welfare practice requires an infrastructure of social work professionals trained to use data to identify target populations, connect interventions to outcomes, adapt practice to varying contexts and dynamic populations, and assess their own effectiveness. Increasingly, public agencies are implementing models of self-assessment in which administrative data are used to guide and continuously evaluate the implementation of programs and policies. The research curriculum described in the article was developed to provide Title IV-E and other students interested in public child welfare systems with hands-on opportunities to become experienced and “statistically literate” users of aggregated public child welfare data from California’s administrative child welfare system, attending to the often missing link between data/research and practice improvement. PMID:27429600

  4. Assessment of Geographic Information Systems and Data Confidentiality Guidelines in STD Programs.

    PubMed

    Bissette, Jennifer M; Stover, Jeffrey A; Newman, Lori M; Delcher, Philip Christopher; Bernstein, Kyle T; Matthews, Lindsey

    2009-01-01

    Advancements in technology, such as geographic information systems (GIS), expand sexually transmitted disease (STD) program capacity for data analysis and visualization, and introduce additional confidentiality considerations. We developed a survey to examine GIS use among STD programs and to better understand existing data confidentiality practices. A Web-based survey of eight to 22 questions, depending on program-specific GIS capacity, was e-mailed to all STD program directors through the National Coalition of STD Directors in November 2004. Survey responses were accepted until April 15, 2005. Eighty-five percent of the 65 currently funded STD programs responded to the survey. Of those, 58% used GIS and 54% used geocoding. STD programs that did not use GIS (42%) identified lack of training and insufficient staff as primary barriers. Mapping, spatial analyses, and targeting program interventions were the main reasons for geocoding data. Nineteen of the 25 programs that responded to questions related to statistical disclosure rules employed a numerator rule, and 56% of those used a variation of the "Rule of 5." Of the 28 programs that responded to questions pertaining to confidentiality guidelines, 82% addressed confidentiality of GIS data informally. Survey findings showed the increasing use of GIS and highlighted the struggles STD programs face in employing GIS and protecting confidentiality. Guidance related to data confidentiality and additional access to GIS software and training could assist programs in optimizing use of spatial data.

  5. Strategies for Energy Efficient Resource Management of Hybrid Programming Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Dong; Supinski, Bronis de; Schulz, Martin

    2013-01-01

    Many scientific applications are programmed using hybrid programming models that use both message-passing and shared-memory, due to the increasing prevalence of large-scale systems with multicore, multisocket nodes. Previous work has shown that energy efficiency can be improved using software-controlled execution schemes that consider both the programming model and the power-aware execution capabilities of the system. However, such approaches have focused on identifying optimal resource utilization for one programming model, either shared-memory or message-passing, in isolation. The potential solution space, thus the challenge, increases substantially when optimizing hybrid models since the possible resource configurations increase exponentially. Nonetheless, with the accelerating adoption of hybrid programming models, we increasingly need improved energy efficiency in hybrid parallel applications on large-scale systems. In this work, we present new software-controlled execution schemes that consider the effects of dynamic concurrency throttling (DCT) and dynamic voltage and frequency scaling (DVFS) in the context of hybrid programming models. Specifically, we present predictive models and novel algorithms based on statistical analysis that anticipate application power and time requirements under different concurrency and frequency configurations. We apply our models and methods to the NPB MZ benchmarks and selected applications from the ASC Sequoia codes. Overall, we achieve substantial energy savings (8.74% on average and up to 13.8%) with some performance gain (up to 7.5%) or negligible performance loss.

  6. An expert system for wind shear avoidance

    NASA Technical Reports Server (NTRS)

    Stengel, Robert F.; Stratton, D. Alexander

    1990-01-01

    A study of intelligent guidance and control concepts for protecting against the adverse effects of wind shear during aircraft takeoffs and landings is being conducted, with current emphasis on developing an expert system for wind shear avoidance. Principal objectives are to develop methods for assessing the likelihood of wind shear encounter (based on real-time information in the cockpit), for deciding what flight path to pursue (e.g., takeoff abort, landing go-around, or normal climbout or glide slope), and for using the aircraft's full potential for combating wind shear. This study requires the definition of both deterministic and statistical techniques for fusing internal and external information, for making go/no-go decisions, and for generating commands for the manually controlled flight. The program began with the development of the WindShear Safety Advisor, an expert system for pilot aiding that is based on the FAA Windshear Training Aid, a two-volume manual presenting an overview, pilot guide, training program, and substantiating data, which provides guidelines for this initial development. The WindShear Safety Advisor expert system currently contains over 200 rules and is coded in the LISP programming language.

  7. Diabetes mellitus disease management in a safety net hospital system: translating evidence into practice.

    PubMed

    Butler, Michael K; Kaiser, Michael; Johnson, Jolene; Besse, Jay; Horswell, Ronald

    2010-12-01

    The Louisiana State University Health Care Services Division system assessed the effectiveness of implementing a multisite disease management program targeting diabetes mellitus in an indigent patient population. A population-based disease management program centered on evidence-based clinical care guidelines was applied from the system level. Specific clinic modifications and models were used, as well as ancillary services such as medication assistance and equipment subsidies. Marked improvement in process goals led to improved clinical outcomes. From 2001 to 2008, the percentage of patients with a hemoglobin A1c < 7.0 increased from 45% to 55% on the system level, with some sites experiencing a more dramatic shift. Results were similar across sites, which included both small provider groups and academic health centers. In order to achieve these results, the clinical environment changed to promote those evidence-based interventions. Even in complex environments such as academic health centers with several provider levels, or those environments with limited care resources, disease management programs can be successfully implemented and achieve statistically significant results.

  8. MQSA National Statistics

    MedlinePlus

  9. Mobile health is worth it! Economic benefit and impact on health of a population-based mobile screening program in new Mexico.

    PubMed

    Brown-Connolly, Nancy E; Concha, Jeannie B; English, Jennifer

    2014-01-01

    HABITS for Life was a 3-year initiative to broadly deliver a statewide biometric and retinal screening program via a mobile unit throughout New Mexico at no charge to participants. The program goal, to identify health risk and improve population health status, was tested over a 3-year period. Value to participants and impact on the healthcare system were measured to quantify the impact and value of investing in prevention at the community level. We used the Mobile Health Map Return-on-Investment Calculator, a mobile screening unit, biometric screening, retinography, and community coordination. Our systems included satellite, DSL, and 3G connectivity, a Tanita® (Arlington Heights, IL) automated body mass index-measuring scale, the Cholestec® (Alere™, Waltham, MA) system for biomarkers and glycosylated hemoglobin, a Canon (Melville, NY) CR-1 Mark II camera, and the Picture Archiving Communication System. In this report for the fiscal year 2011 time frame, 6,426 individuals received biometric screening, and 5,219 received retinal screening. A 15:1 return on investment was calculated; this excluded retinal screening for those under 65 years old, estimated at $10 million in quality-adjusted life years saved. Statistically significant improvement in health status evidenced by sequential screening included a decrease in total cholesterol level (p=0.002) (n=308) and an increase in high-density lipoprotein level after the first and second screening (p=0.02 and p=0.01, respectively), but a decrease in mean random glucose level was not statistically significant (p=0.62). Retinal results indicate 28.4% (n=1,482) with a positive/abnormal finding, of which 1.79% (n=93) required immediate referral for sight-threatening retinopathy and 27% (n=1,389) required follow-up within 3 months to 1 year. Screening programs are cost-effective and provide value in preventive health efforts. Broad use of screening programs should be considered in healthcare redesign efforts. Community-based screening is an effective strategy to identify health risk, improve access, provide motivation to change health habits, and improve physical status while returning significant value.

  10. Use of a Relational Database to Support Clinical Research: Application in a Diabetes Program

    PubMed Central

    Lomatch, Diane; Truax, Terry; Savage, Peter

    1981-01-01

    A database has been established to support conduct of clinical research and monitor delivery of medical care for 1200 diabetic patients as part of the Michigan Diabetes Research and Training Center (MDRTC). Use of an intelligent microcomputer to enter and retrieve the data and use of a relational database management system (DBMS) to store and manage data have provided a flexible, efficient method of achieving both support of small projects and monitoring overall activity of the Diabetes Center Unit (DCU). Simplicity of access to data, efficiency in providing data for unanticipated requests, ease of manipulation of relations, security and “logical data independence” were important factors in choosing a relational DBMS. The ability to interface with an interactive statistical program and a graphics program is a major advantage of this system. Our database currently provides support for the operation and analysis of several ongoing research projects.

  11. C-statistic fitting routines: User's manual and reference guide

    NASA Technical Reports Server (NTRS)

    Nousek, John A.; Farwana, Vida

    1991-01-01

    The computer program discussed can read several input files and provide a best set of values for the functions provided by the user, using either the C-statistic or the chi-squared statistic method. The program consists of one main routine and several functions and subroutines. Detailed descriptions of each function and subroutine are presented. A brief description of the C-statistic and the reason for its application is also presented.
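
    For orientation, the C-statistic (Cash statistic) referenced above is a Poisson-based fit statistic used in place of chi-squared when counts per bin are low. The sketch below shows one commonly used form; the exact convention adopted by the routines described in the record is not reproduced here, so treat the formula and data as illustrative.

    ```python
    import numpy as np

    def cash_statistic(observed, model):
        """One common form of the Cash (C) statistic for Poisson-distributed counts.

        C = 2 * sum( m_i - d_i + d_i * ln(d_i / m_i) ), with the log term taken as 0
        when d_i = 0. The convention used by the routines above may differ.
        """
        d = np.asarray(observed, dtype=float)
        m = np.asarray(model, dtype=float)
        log_term = np.zeros_like(d)
        mask = d > 0
        log_term[mask] = d[mask] * np.log(d[mask] / m[mask])
        return 2.0 * np.sum(m - d + log_term)

    def chi2_statistic(observed, model):
        """Pearson chi-squared statistic for comparison (variance approximated by the model)."""
        d = np.asarray(observed, dtype=float)
        m = np.asarray(model, dtype=float)
        return np.sum((d - m) ** 2 / m)

    # Toy comparison on sparse counts, where the C-statistic is usually preferred:
    counts = np.array([0, 1, 3, 0, 2, 5, 1])
    model = np.array([0.5, 1.2, 2.8, 0.7, 1.9, 4.6, 1.1])
    print(cash_statistic(counts, model), chi2_statistic(counts, model))
    ```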

  12. Selected Streamflow Statistics for Streamgaging Stations in Delaware, 2003

    USGS Publications Warehouse

    Ries, Kernell G.

    2004-01-01

    Flow-duration and low-flow frequency statistics were calculated for 15 streamgaging stations in Delaware, in cooperation with the Delaware Geological Survey. The flow-duration statistics include the 1-, 2-, 5-, 10-, 20-, 30-, 40-, 50-, 60-, 70-, 80-, 90-, 95-, 98-, and 99-percent duration discharges. The low-flow frequency statistics include the average discharges for 1, 7, 14, 30, 60, 90, and 120 days that recur, on average, once in 1.01, 2, 5, 10, 20, 50, and 100 years. The statistics were computed using U.S. Geological Survey computer programs that can be downloaded from the World Wide Web at no cost. The computer programs automate standard U.S. Geological Survey methods for computing the statistics. Documentation is provided at the Web sites for the individual programs. The computed statistics are presented in tabular format on a separate page for each station, along with the station name, station number, the location, the period of record, and remarks.
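
    As a rough illustration of the flow-duration computation mentioned above, the D-percent duration discharge is the flow equaled or exceeded D percent of the time. The sketch below uses a simple percentile of the daily series; the example data are hypothetical, and the USGS programs cited in the record implement the standard agency methods rather than this simplification.

    ```python
    import numpy as np

    def flow_duration(daily_discharge, percents=(1, 2, 5, 10, 20, 30, 40, 50,
                                                  60, 70, 80, 90, 95, 98, 99)):
        """Discharge equaled or exceeded the given percent of the time.

        The D-percent duration discharge is the (100 - D)th percentile of the daily
        series, so high flows correspond to small exceedance percentages.
        """
        q = np.asarray(daily_discharge, dtype=float)
        return {p: np.percentile(q, 100 - p) for p in percents}

    # Hypothetical daily mean discharges (cubic feet per second) for one station:
    rng = np.random.default_rng(0)
    daily_q = rng.lognormal(mean=3.0, sigma=0.8, size=365 * 10)
    stats = flow_duration(daily_q)
    print(stats[90])   # low-flow end: discharge exceeded 90 percent of the time
    ```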

  13. AEOSS design guide for system analysis on Advanced Earth-Orbital Spacecraft Systems

    NASA Technical Reports Server (NTRS)

    Lee, Hwa-Ping

    1990-01-01

    The Advanced Earth Orbital Spacecraft System (AEOSS) enables users to project the required power, weight, and cost for a generic earth-orbital spacecraft system. These variables are calculated at the component and subsystem levels, and then at the system level. The six included subsystems are electric power, thermal control, structure, auxiliary propulsion, attitude control, and communication, command, and data handling. The costs are computed using statistically determined models that were derived from previously flown spacecraft and categorized into classes according to their functions and structural complexity. Selected design and performance analyses for essential components and subsystems are also provided. AEOSS permits a user to enter known values of these parameters, in whole or in part, at all levels. All of this information is of vital importance to project managers of subsystems or of a spacecraft system. AEOSS is specially tailored software built on ACIUS's 4th Dimension relational database program for the Macintosh. Because of the licensing agreement, two versions of the AEOSS documents were prepared. This version, the AEOSS Design Guide, is for users who want to exploit the full capacity of 4th Dimension by altering or expanding the program structures, statements, and procedures. The user must first possess 4th Dimension.

  14. Performance evaluation of three computed radiography systems using methods recommended in American Association of Physicists in Medicine Report 93

    PubMed Central

    Muhogora, Wilbroad; Padovani, Renato; Bonutti, Faustino; Msaki, Peter; Kazema, R.

    2011-01-01

    The performances of three clinical computed radiography (CR) systems (Agfa CR 75 with CRMD 4.0 image plates, Kodak CR 850 with Kodak GP plates, and Kodak CR 850A with Kodak GP plates) were evaluated using six tests recommended in American Association of Physicists in Medicine Report 93. The results indicated variable performances, with the majority being within acceptable limits. The variations were mainly attributed to differences in detector formulations, plate readers’ characteristics, and aging effects. The differences in the mean low-contrast scores between the imaging systems for three observers were statistically significant for Agfa and Kodak CR 850A (P=0.009) and for the Kodak CR systems (P=0.006), probably because of the differences in ages. However, the differences were not statistically significant between Agfa and Kodak CR 850 (P=0.284), suggesting similar perceived image quality. The study demonstrates the need to implement a quality control program regularly. PMID:21897559

  15. Performance evaluation of three computed radiography systems using methods recommended in American Association of Physicists in Medicine Report 93.

    PubMed

    Muhogora, Wilbroad; Padovani, Renato; Bonutti, Faustino; Msaki, Peter; Kazema, R

    2011-07-01

    The performances of three clinical computed radiography (CR) systems (Agfa CR 75 with CRMD 4.0 image plates, Kodak CR 850 with Kodak GP plates, and Kodak CR 850A with Kodak GP plates) were evaluated using six tests recommended in American Association of Physicists in Medicine Report 93. The results indicated variable performances, with the majority being within acceptable limits. The variations were mainly attributed to differences in detector formulations, plate readers' characteristics, and aging effects. The differences in the mean low-contrast scores between the imaging systems for three observers were statistically significant for Agfa and Kodak CR 850A (P=0.009) and for the Kodak CR systems (P=0.006), probably because of the differences in ages. However, the differences were not statistically significant between Agfa and Kodak CR 850 (P=0.284), suggesting similar perceived image quality. The study demonstrates the need to implement a quality control program regularly.

  16. Higher order statistical moment application for solar PV potential analysis

    NASA Astrophysics Data System (ADS)

    Basri, Mohd Juhari Mat; Abdullah, Samizee; Azrulhisham, Engku Ahmad; Harun, Khairulezuan

    2016-10-01

    Solar photovoltaic energy could serve as an alternative to fossil fuels, which are being depleted and pose a global warming problem. However, this renewable energy is too variable and intermittent to be relied on directly. Therefore, knowledge of the energy potential of a site is very important before building a solar photovoltaic power generation system. Here, the application of a higher-order statistical moment model is analyzed using data collected from a 5 MW grid-connected photovoltaic system. Because the skewness and kurtosis of the solar farm's AC power and solar irradiance distributions change dynamically, the Pearson system, in which the probability distribution is selected by matching theoretical moments to the empirical moments of the data, could be suitable for this purpose. Taking advantage of the Pearson system implementation in MATLAB, a software program has been developed to assist in data processing, distribution fitting, and potential analysis for future projection of AC power and solar irradiance availability.
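
    As a minimal sketch of the moment-matching idea described above (the variable names and sample data are hypothetical, and the study itself uses MATLAB's Pearson-system routines), the four empirical moments that drive the family selection can be computed as follows:

    ```python
    import numpy as np
    from scipy import stats

    def pearson_moments(samples):
        """Empirical moments used to select and fit a Pearson-family distribution.

        Returns mean, standard deviation, skewness, and kurtosis; a Pearson-system
        fit (e.g. MATLAB's pearsrnd/pearspdf) chooses the family whose theoretical
        moments match these values.
        """
        x = np.asarray(samples, dtype=float)
        return (x.mean(),
                x.std(ddof=1),
                stats.skew(x, bias=False),
                stats.kurtosis(x, fisher=False, bias=False))  # "plain" kurtosis: 3 for a normal

    # Hypothetical AC-power samples (kW) from a grid-connected PV plant:
    rng = np.random.default_rng(1)
    ac_power = np.clip(rng.normal(2500, 900, size=5000), 0, 5000)
    print(pearson_moments(ac_power))
    ```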

  17. Comparison of HSPF and PRMS model simulated flows using different temporal and spatial scales in the Black Hills, South Dakota

    USGS Publications Warehouse

    Chalise, D. R.; Haj, Adel E.; Fontaine, T.A.

    2018-01-01

    The hydrological simulation program Fortran (HSPF) [Hydrological Simulation Program Fortran version 12.2 (Computer software). USEPA, Washington, DC] and the precipitation runoff modeling system (PRMS) [Precipitation Runoff Modeling System version 4.0 (Computer software). USGS, Reston, VA] models are semidistributed, deterministic hydrological tools for simulating the impacts of precipitation, land use, and climate on basin hydrology and streamflow. Both models have been applied independently to many watersheds across the United States. This paper reports the statistical results assessing various temporal (daily, monthly, and annual) and spatial (small versus large watershed) scale biases in HSPF and PRMS simulations using two watersheds in the Black Hills, South Dakota. The Nash-Sutcliffe efficiency (NSE), Pearson correlation coefficient (r), and coefficient of determination (R2) statistics for the daily, monthly, and annual flows were used to evaluate the models’ performance. Results from the HSPF models showed that the HSPF consistently simulated the annual flows for both large and small basins better than the monthly and daily flows, and the simulated flows for the small watershed better than flows for the large watershed. In comparison, the PRMS model results show that the PRMS simulated the monthly flows for both the large and small watersheds better than the daily and annual flows, and the range of statistical error in the PRMS models was greater than that in the HSPF models. Moreover, it can be concluded that the statistical error in the HSPF and the PRMS daily, monthly, and annual flow estimates for watersheds in the Black Hills was influenced by both temporal and spatial scale variability.
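
    As a quick reference for the goodness-of-fit statistics named above, a minimal sketch follows. The observed and simulated series are hypothetical, and R2 is taken here simply as the square of the Pearson correlation, one common convention in hydrologic model evaluation; the paper's own computations may differ in detail.

    ```python
    import numpy as np

    def fit_statistics(observed, simulated):
        """Nash-Sutcliffe efficiency, Pearson correlation, and coefficient of determination."""
        o = np.asarray(observed, dtype=float)
        s = np.asarray(simulated, dtype=float)
        nse = 1.0 - np.sum((o - s) ** 2) / np.sum((o - o.mean()) ** 2)
        r = np.corrcoef(o, s)[0, 1]
        return nse, r, r ** 2

    # Hypothetical daily flows (observed vs. simulated) for one watershed:
    obs = np.array([12.0, 15.5, 30.2, 22.1, 18.4, 9.7, 8.3])
    sim = np.array([11.2, 16.8, 27.5, 24.0, 17.1, 10.5, 9.0])
    print(fit_statistics(obs, sim))
    ```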

  18. LACIE performance predictor final operational capability program description, volume 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The program EPHEMS computes the orbital parameters for up to two vehicles orbiting the earth for up to 549 days. The data represents a continuous swath about the earth, producing tables which can be used to determine when and if certain land segments will be covered. The program GRID processes NASA's climatology tape to obtain the weather indices along with associated latitudes and longitudes. The program LUMP takes substrata historical data and sample segment ID, crop window, crop window error and statistical data, checks for valid input parameters and generates the segment ID file, crop window file and the substrata historical file. Finally, the System Error Executive (SEE) Program checks YES error and truth data, CAMS error data, and signature extension data for validity and missing elements. A message is printed for each error found.

  19. Direct conversion of solar energy to thermal energy

    NASA Astrophysics Data System (ADS)

    Sizmann, Rudolf

    1986-12-01

    Selective coatings (cermets) were produced by simultaneous evaporation of copper and silicon dioxide, and analyzed by computer-assisted spectral photometers and ellipsometers; hemispherical emittance was measured. Steady state test procedures for covered and uncovered collectors were investigated. A method for evaluating the transient behavior of collectors was developed. The derived transfer functions describe their transient behavior. A stochastic approach was used for reducing the meteorological data volume. Data sets which are statistically equivalent to the original data can be synthesized. A simulation program for solar systems using analytical solutions of differential equations was developed. A large solar DHW system was optimized by a detailed modular simulation program. A microprocessor-assisted data acquisition system records the four characteristics of solar cells and solar cell systems in less than 10 msec. Measurements of a large photovoltaic installation (50 sqm) are reported.

  20. MHSS: a material handling system simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pomernacki, L.; Hollstien, R.B.

    1976-04-07

    A Material Handling System Simulator (MHSS) program is described that provides specialized functional blocks for modeling and simulation of nuclear material handling systems. Models of nuclear fuel fabrication plants may be built using functional blocks that simulate material receiving, storage, transport, inventory, processing, and shipping operations as well as the control and reporting tasks of operators or on-line computers. Blocks are also provided that allow the user to observe and gather statistical information on the dynamic behavior of simulated plants over single or replicated runs. Although it is currently being developed for the nuclear materials handling application, MHSS can be adapted to other industries in which material accountability is important. In this paper, emphasis is on the simulation methodology of the MHSS program with application to the nuclear material safeguards problem. (auth)

  1. Improving Ecological Response Monitoring of Environmental Flows

    NASA Astrophysics Data System (ADS)

    King, Alison J.; Gawne, Ben; Beesley, Leah; Koehn, John D.; Nielsen, Daryl L.; Price, Amina

    2015-05-01

    Environmental flows are now an important restoration technique in flow-degraded rivers, and with the increasing public scrutiny of their effectiveness and value, the importance of undertaking scientifically robust monitoring is now even more critical. Many existing environmental flow monitoring programs have poorly defined objectives, nonjustified indicator choices, weak experimental designs, poor statistical strength, and often focus on outcomes from a single event. These negative attributes make them difficult to learn from. We provide practical recommendations that aim to improve the performance, scientific robustness, and defensibility of environmental flow monitoring programs. We draw on the literature and knowledge gained from working with stakeholders and managers to design, implement, and monitor a range of environmental flow types. We recommend that (1) environmental flow monitoring programs should be implemented within an adaptive management framework; (2) objectives of environmental flow programs should be well defined, attainable, and based on an agreed conceptual understanding of the system; (3) program and intervention targets should be attainable, measurable, and inform program objectives; (4) intervention monitoring programs should improve our understanding of flow-ecological responses and related conceptual models; (5) indicator selection should be based on conceptual models, objectives, and prioritization approaches; (6) appropriate monitoring designs and statistical tools should be used to measure and determine ecological response; (7) responses should be measured within timeframes that are relevant to the indicator(s); (8) watering events should be treated as replicates of a larger experiment; (9) environmental flow outcomes should be reported using a standard suite of metadata. Incorporating these attributes into future monitoring programs should ensure their outcomes are transferable and measured with high scientific credibility.

  2. Integrative Biological Chemistry Program Includes The Use Of Informatics Tools, GIS And SAS Software Applications

    PubMed Central

    D’Souza, Malcolm J.; Kashmar, Richard J.; Hurst, Kent; Fiedler, Frank; Gross, Catherine E.; Deol, Jasbir K.; Wilson, Alora

    2015-01-01

    Wesley College is a private, primarily undergraduate minority-serving institution located in the historic district of Dover, Delaware (DE). The College recently revised its baccalaureate biological chemistry program requirements to include a one-semester Physical Chemistry for the Life Sciences course and project-based experiential learning courses using instrumentation, data-collection, data-storage, statistical-modeling analysis, visualization, and computational techniques. In this revised curriculum, students begin with a traditional set of biology, chemistry, physics, and mathematics major core-requirements, a geographic information systems (GIS) course, a choice of an instrumental analysis course or a statistical analysis systems (SAS) programming course, and then, students can add major-electives that further add depth and value to their future post-graduate specialty areas. Open-sourced georeferenced census, health and health disparity data were coupled with GIS and SAS tools, in a public health surveillance system project, based on US county zip-codes, to develop use-cases for chronic adult obesity where income, poverty status, health insurance coverage, education, and age were categorical variables. Across the 48 contiguous states, obesity rates are found to be directly proportional to high poverty and inversely proportional to median income and educational achievement. For the State of Delaware, age and educational attainment were found to be limiting obesity risk-factors in its adult population. Furthermore, the 2004–2010 obesity trends showed that for two of the less densely populated Delaware counties; Sussex and Kent, the rates of adult obesity were found to be progressing at much higher proportions when compared to the national average. PMID:26191337

  3. Integrative Biological Chemistry Program Includes The Use Of Informatics Tools, GIS And SAS Software Applications.

    PubMed

    D'Souza, Malcolm J; Kashmar, Richard J; Hurst, Kent; Fiedler, Frank; Gross, Catherine E; Deol, Jasbir K; Wilson, Alora

    Wesley College is a private, primarily undergraduate minority-serving institution located in the historic district of Dover, Delaware (DE). The College recently revised its baccalaureate biological chemistry program requirements to include a one-semester Physical Chemistry for the Life Sciences course and project-based experiential learning courses using instrumentation, data-collection, data-storage, statistical-modeling analysis, visualization, and computational techniques. In this revised curriculum, students begin with a traditional set of biology, chemistry, physics, and mathematics major core-requirements, a geographic information systems (GIS) course, a choice of an instrumental analysis course or a statistical analysis systems (SAS) programming course, and then, students can add major-electives that further add depth and value to their future post-graduate specialty areas. Open-sourced georeferenced census, health and health disparity data were coupled with GIS and SAS tools, in a public health surveillance system project, based on US county zip-codes, to develop use-cases for chronic adult obesity where income, poverty status, health insurance coverage, education, and age were categorical variables. Across the 48 contiguous states, obesity rates are found to be directly proportional to high poverty and inversely proportional to median income and educational achievement. For the State of Delaware, age and educational attainment were found to be limiting obesity risk-factors in its adult population. Furthermore, the 2004-2010 obesity trends showed that for two of the less densely populated Delaware counties; Sussex and Kent, the rates of adult obesity were found to be progressing at much higher proportions when compared to the national average.

  4. A Monte Carlo Analysis of the Thrust Imbalance for the Space Launch System Booster During Both the Ignition Transient and Steady State Operation

    NASA Technical Reports Server (NTRS)

    Foster, Winfred A., Jr.; Crowder, Winston; Steadman, Todd E.

    2014-01-01

    This paper presents the results of statistical analyses performed to predict the thrust imbalance between two solid rocket motor boosters to be used on the Space Launch System (SLS) vehicle. Two legacy internal ballistics codes developed for the Space Shuttle program were coupled with a Monte Carlo analysis code to determine a thrust imbalance envelope for the SLS vehicle based on the performance of 1000 motor pairs. Thirty-three variables which could impact the performance of the motors during the ignition transient and thirty-eight variables which could impact the performance of the motors during steady state operation were identified and treated as statistical variables for the analyses. The effects of motor-to-motor variation as well as variations between motors of a single pair were included in the analyses. The statistical variations of the variables were defined based on data provided by NASA's Marshall Space Flight Center for the upgraded five-segment booster and from the Space Shuttle booster when appropriate. The results obtained for the statistical envelope are compared with the design specification thrust imbalance limits for the SLS launch vehicle.
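
    As a schematic illustration of the Monte Carlo approach described above, each trial draws one pair of motors and records their thrust difference; an envelope is then read off the resulting distribution. All distributions, variable names, and the thrust expression below are hypothetical placeholders, not the actual internal-ballistics model used in the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    N_PAIRS = 1000

    def sample_motor(common_lot_shift):
        """Hypothetical stand-in for an internal-ballistics evaluation of one motor.

        `common_lot_shift` models motor-to-motor (lot-level) variation shared by a pair,
        while the draws below model variation between the two motors of a single pair.
        """
        burn_rate = 1.0 + common_lot_shift + rng.normal(0.0, 0.005)
        throat_area = 1.0 + rng.normal(0.0, 0.002)
        nominal_thrust = 3.6e6  # lbf, placeholder value
        return nominal_thrust * burn_rate / throat_area

    imbalance = np.empty(N_PAIRS)
    for i in range(N_PAIRS):
        lot = rng.normal(0.0, 0.01)          # shared, motor-to-motor variation
        imbalance[i] = sample_motor(lot) - sample_motor(lot)

    # Envelope as, e.g., a high percentile of absolute imbalance across the 1000 pairs:
    print(np.percentile(np.abs(imbalance), 99.7))
    ```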

  5. Health Resources Statistics; Health Manpower and Health Facilities, 1968. Public Health Service Publication No. 1509.

    ERIC Educational Resources Information Center

    National Center for Health Statistics (DHEW/PHS), Hyattsville, MD.

    This report is a part of the program of the National Center for Health Statistics to provide current statistics as baseline data for the evaluation, planning, and administration of health programs. Part I presents data concerning the occupational fields: (1) administration, (2) anthropology and sociology, (3) data processing, (4) basic sciences,…

  6. Attitudes toward Advanced and Multivariate Statistics When Using Computers.

    ERIC Educational Resources Information Center

    Kennedy, Robert L.; McCallister, Corliss Jean

    This study investigated the attitudes toward statistics of graduate students who studied advanced statistics in a course in which the focus of instruction was the use of a computer program in class. The use of the program made it possible to provide an individualized, self-paced, student-centered, and activity-based course. The three sections…

  7. Cardiac surgery report cards: comprehensive review and statistical critique.

    PubMed

    Shahian, D M; Normand, S L; Torchiana, D F; Lewis, S M; Pastore, J O; Kuntz, R E; Dreyer, P I

    2001-12-01

    Public report cards and confidential, collaborative peer education represent distinctly different approaches to cardiac surgery quality assessment and improvement. This review discusses the controversies regarding their methodology and relative effectiveness. Report cards have been the more commonly used approach, typically as a result of state legislation. They are based on the presumption that publication of outcomes effectively motivates providers, and that market forces will reward higher quality. Numerous studies have challenged the validity of these hypotheses. Furthermore, although states with report cards have reported significant decreases in risk-adjusted mortality, it is unclear whether this improvement resulted from public disclosure or, rather, from the development of internal quality programs by hospitals. An additional confounding factor is the nationwide decline in heart surgery mortality, including states without quality monitoring. Finally, report cards may engender negative behaviors such as high-risk case avoidance and "gaming" of the reporting system, especially if individual surgeon results are published. The alternative approach, continuous quality improvement, may provide an opportunity to enhance performance and reduce interprovider variability while avoiding the unintended negative consequences of report cards. This collaborative method, which uses exchange visits between programs and determination of best practice, has been highly effective in northern New England and in the Veterans Affairs Administration. However, despite their potential advantages, quality programs based solely on confidential continuous quality improvement do not address the issue of public accountability. For this reason, some states may continue to mandate report cards. In such instances, it is imperative that appropriate statistical techniques and report formats are used, and that professional organizations simultaneously implement continuous quality improvement programs. The statistical methodology underlying current report cards is flawed, and does not justify the degree of accuracy presented to the public. All existing risk-adjustment methods have substantial inherent imprecision, and this is compounded when the results of such patient-level models are aggregated and used inappropriately to assess provider performance. Specific problems include sample size differences, clustering of observations, multiple comparisons, and failure to account for the random component of interprovider variability. We advocate the use of hierarchical or multilevel statistical models to address these concerns, as well as report formats that emphasize the statistical uncertainty of the results.

  8. American Recovery and Reinvestment Act (ARRA) statistical summaries.

    DOT National Transportation Integrated Search

    2012-05-01

    The American Recovery and Reinvestment Act (ARRA) Statistical Summaries provide information about the Federal Transit Administration's (FTA) financial investment programs funded through ARRA. This report covers the Urbanized Area Formula Program and...

  9. The benefits of a comprehensive rehabilitation program in patients diagnosed with spastic quadriplegia

    PubMed Central

    Rogoveanu, OC; Tuțescu, NC; Kamal, D; Alexandru, DO; Kamal, C; Streba, CT; Trăistaru, MR

    2016-01-01

    Spastic quadriplegia has, as an etiopathogenic substrate, a non-progressive brain lesion; however, the clinical manifestations of the disease evolve over time. Children diagnosed with spastic quadriplegia show a variety of symptoms in different areas: sensorimotor, emotional, cognitive, and social. The purpose of this study was to assess the functional status of patients diagnosed with spastic quadriplegia who followed a complex medical rehabilitation program during one year, and to highlight the importance of using physical and kinetic techniques in improving their status. A total of 10 children diagnosed with spastic quadriplegia were included in the study, and the Gross Motor Function Classification System (GMFCS) and Manual Ability Classification System (MACS) were used to evaluate the functional status of each patient. Every patient was evaluated initially (T1), after six months in the program (T2), and after completing the study. All the children were initially monitored daily, 5 days per week, for a period of one month, and then twice a week for a year. A statistically significant change in GMFCS and MACS stage was found between the first and the third evaluation. A statistically significant inverse correlation between patient age and the decrease in GMFCS or MACS stage was highlighted: the younger the patient, the greater the decrease. A direct link between gross motor function and manual ability was noticed. Applying a complex rehabilitation program proved effective in improving both gross motor function and manual ability. PMID:27974931

  10. The benefits of a comprehensive rehabilitation program in patients diagnosed with spastic quadriplegia.

    PubMed

    Rogoveanu, O C; Tuțescu, N C; Kamal, D; Alexandru, D O; Kamal, C; Streba, C T; Trăistaru, M R

    2016-01-01

    Spastic quadriplegia has, as an etiopathogenic substrate, a non-progressive brain lesion; however, the clinical manifestations of the disease evolve over time. Children diagnosed with spastic quadriplegia show a variety of symptoms in different areas: sensorimotor, emotional, cognitive, and social. The purpose of this study was to assess the functional status of patients diagnosed with spastic quadriplegia who followed a complex medical rehabilitation program during one year, and to highlight the importance of using physical and kinetic techniques in improving their status. A total of 10 children diagnosed with spastic quadriplegia were included in the study, and the Gross Motor Function Classification System (GMFCS) and Manual Ability Classification System (MACS) were used to evaluate the functional status of each patient. Every patient was evaluated initially (T1), after six months in the program (T2), and after completing the study. All the children were initially monitored daily, 5 days per week, for a period of one month, and then twice a week for a year. A statistically significant change in GMFCS and MACS stage was found between the first and the third evaluation. A statistically significant inverse correlation between patient age and the decrease in GMFCS or MACS stage was highlighted: the younger the patient, the greater the decrease. A direct link between gross motor function and manual ability was noticed. Applying a complex rehabilitation program proved effective in improving both gross motor function and manual ability.

  11. Year-End Clinic Handoffs: A National Survey of Academic Internal Medicine Programs.

    PubMed

    Phillips, Erica; Harris, Christina; Lee, Wei Wei; Pincavage, Amber T; Ouchida, Karin; Miller, Rachel K; Chaudhry, Saima; Arora, Vineet M

    2017-06-01

    While there has been increasing emphasis and innovation nationwide in training residents in inpatient handoffs, very little is known about the practice and preparation for year-end clinic handoffs of residency outpatient continuity practices. Thus, the latter remains an identified, yet nationally unaddressed, patient safety concern. The 2014 annual Association of Program Directors in Internal Medicine (APDIM) survey included seven items for assessing the current year-end clinic handoff practices of internal medicine residency programs throughout the country. The study was a nationwide survey of all internal medicine program directors registered with APDIM. Measures included descriptive statistics of programs and of the tools used to formulate a year-end handoff in the ambulatory setting, methods for evaluating the process, patient safety and quality measures incorporated within the process, and barriers to conducting year-end handoffs. Of the 361 APDIM member programs, 214 (59%) completed the Transitions of Care Year-End Clinic Handoffs section of the survey. Only 34% of respondent programs reported having a year-end ambulatory handoff system, and 4% reported assessing residents for competency in this area. The top three barriers to developing a year-end handoff system were insufficient overlap between graduating and incoming residents, inability to schedule patients with new residents in advance, and time constraints for residents, attendings, and support staff. Most internal medicine programs do not have a year-end clinic handoff system in place. Greater attention to clinic handoffs and resident assessment of this care transition is needed.

  12. Treatment effects model for assessing disease management: measuring outcomes and strengthening program management.

    PubMed

    Wendel, Jeanne; Dumitras, Diana

    2005-06-01

    This paper describes an analytical methodology for obtaining statistically unbiased outcomes estimates for programs in which participation decisions may be correlated with variables that impact outcomes. This methodology is particularly useful for intraorganizational program evaluations conducted for business purposes. In this situation, data is likely to be available for a population of managed care members who are eligible to participate in a disease management (DM) program, with some electing to participate while others eschew the opportunity. The most pragmatic analytical strategy for in-house evaluation of such programs is likely to be the pre-intervention/post-intervention design in which the control group consists of people who were invited to participate in the DM program, but declined the invitation. Regression estimates of program impacts may be statistically biased if factors that impact participation decisions are correlated with outcomes measures. This paper describes an econometric procedure, the Treatment Effects model, developed to produce statistically unbiased estimates of program impacts in this type of situation. Two equations are estimated to (a) estimate the impacts of patient characteristics on decisions to participate in the program, and then (b) use this information to produce a statistically unbiased estimate of the impact of program participation on outcomes. This methodology is well-established in economics and econometrics, but has not been widely applied in the DM outcomes measurement literature; hence, this paper focuses on one illustrative application.
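
    A minimal two-step sketch of a treatment effects estimator of the kind described above follows. It is illustrative only: the variable names and simulated data are hypothetical, and the paper's own specification may differ. A probit models the participation decision, and its generalized residual (hazard) is then added to the outcome regression to correct for selection on unobservables.

    ```python
    import numpy as np
    import statsmodels.api as sm
    from scipy.stats import norm

    def treatment_effects_two_step(y, d, X, Z):
        """Two-step treatment effects estimator (Heckman-style correction).

        y : outcome (e.g., post-program cost or utilization), shape (n,)
        d : 1 if the member participated in the DM program, else 0, shape (n,)
        X : outcome-equation covariates (without constant), shape (n, k)
        Z : participation-equation covariates (without constant), shape (n, m)
        """
        Zc = sm.add_constant(Z)
        probit = sm.Probit(d, Zc).fit(disp=0)           # step 1: participation equation
        xb = Zc @ probit.params
        hazard = np.where(d == 1,
                          norm.pdf(xb) / norm.cdf(xb),            # participants
                          -norm.pdf(xb) / (1.0 - norm.cdf(xb)))   # non-participants
        Xc = sm.add_constant(np.column_stack([X, d, hazard]))
        outcome = sm.OLS(y, Xc).fit()                   # step 2: outcome equation
        return probit, outcome                          # coefficient on d estimates the program effect

    # Hypothetical simulated data with participation correlated with the outcome error:
    rng = np.random.default_rng(7)
    n = 2000
    Z = rng.normal(size=(n, 2))
    X = rng.normal(size=(n, 2))
    u = rng.normal(size=n)
    d = (Z @ [0.8, -0.5] + u > 0).astype(int)
    y = X @ [1.0, 0.5] - 2.0 * d + 0.7 * u + rng.normal(size=n)
    probit_fit, ols_fit = treatment_effects_two_step(y, d, X, Z)
    print(ols_fit.params)
    ```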

  13. Cost estimating methods for advanced space systems

    NASA Technical Reports Server (NTRS)

    Cyr, Kelley

    1988-01-01

    Parametric cost estimating methods for space systems in the conceptual design phase are developed. The approach is to identify variables that drive cost such as weight, quantity, development culture, design inheritance, and time. The relationship between weight and cost is examined in detail. A theoretical model of cost is developed and tested statistically against a historical data base of major research and development programs. It is concluded that the technique presented is sound, but that it must be refined in order to produce acceptable cost estimates.

  14. Discovery: Faculty Publications and Presentations, Fiscal Year 1980. Volume 1. Books, Texts, Manuals, Chapters, Papers, Reports, and Presentations

    DTIC Science & Technology

    1980-09-01

    organizationally dysfunctional behavior. Of course, the latter includes racism, sexism, and brutality. c. "Transcultural Health Care." Menlo Park...area rule. The paper discusses the theory as it pertains to the simplification wind tunnel experiments used to verify the theory, a program developed..."Computerized Test Generation System" (Research) The purpose of the system is to allow a course director to build a data base of questions, statistics

  15. Science and Engineering of an Operational Tsunami Forecasting System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gonzalez, Frank

    2009-04-06

    After a review of tsunami statistics and the destruction caused by tsunamis, a means of forecasting tsunamis is discussed as part of an overall program of reducing fatalities through hazard assessment, education, training, mitigation, and a tsunami warning system. The forecast is accomplished via a concept called Deep Ocean Assessment and Reporting of Tsunamis (DART). Small changes of pressure at the sea floor are measured and relayed to warning centers. Under development is an international modeling network to transfer, maintain, and improve tsunami forecast models.

  16. History of the Redstone Missile System

    DTIC Science & Technology

    1965-10-15

    knowledge in areas such as propulsion systems, rocket fuels, aerodynamics, guidance equipment, and testing equipment. It compiled basic statistics on...ing of the 46th and 43 Tech Rept, ABMA, 30 Jun 57, sub: Ordnance Guided Missile and Rocket Programs, Redstone, Vol. IV, Supp. 2, p. 67...embarked in June for Europe. The main body boarded ship on 18 June 1958 for Saint-Nazaire, France, and moved in convoy across France and

  17. Science and Engineering of an Operational Tsunami Forecasting System

    ScienceCinema

    Gonzalez, Frank

    2017-12-09

    After a review of tsunami statistics and the destruction caused by tsunamis, a means of forecasting tsunamis is discussed as part of an overall program of reducing fatalities through hazard assessment, education, training, mitigation, and a tsunami warning system. The forecast is accomplished via a concept called Deep Ocean Assessment and Reporting of Tsunamis (DART). Small changes of pressure at the sea floor are measured and relayed to warning centers. Under development is an international modeling network to transfer, maintain, and improve tsunami forecast models.

  18. Report for Florida Community Colleges, 1983-1984. Part I: Statistical Tables.

    ERIC Educational Resources Information Center

    Florida State Dept. of Education, Tallahassee. Div. of Community Colleges.

    Statistical data are presented on student enrollments, academic programs, personnel and salaries, and finances for the Florida community colleges for 1983-84. A series of tables provide data on: (1) opening fall enrollment by class, program and student status; (2) fall enrollment headcount by age groups; (3) annual program headcount enrollment;…

  19. Reverse-engineering the genetic circuitry of a cancer cell with predicted intervention in chronic lymphocytic leukemia.

    PubMed

    Vallat, Laurent; Kemper, Corey A; Jung, Nicolas; Maumy-Bertrand, Myriam; Bertrand, Frédéric; Meyer, Nicolas; Pocheville, Arnaud; Fisher, John W; Gribben, John G; Bahram, Seiamak

    2013-01-08

    Cellular behavior is sustained by genetic programs that are progressively disrupted in pathological conditions--notably, cancer. High-throughput gene expression profiling has been used to infer statistical models describing these cellular programs, and development is now needed to guide orientated modulation of these systems. Here we develop a regression-based model to reverse-engineer a temporal genetic program, based on relevant patterns of gene expression after cell stimulation. This method integrates the temporal dimension of biological rewiring of genetic programs and enables the prediction of the effect of targeted gene disruption at the system level. We tested the performance accuracy of this model on synthetic data before reverse-engineering the response of primary cancer cells to a proliferative (protumorigenic) stimulation in a multistate leukemia biological model (i.e., chronic lymphocytic leukemia). To validate the ability of our method to predict the effects of gene modulation on the global program, we performed an intervention experiment on a targeted gene. Comparison of the predicted and observed gene expression changes demonstrates the possibility of predicting the effects of a perturbation in a gene regulatory network, a first step toward an orientated intervention in a cancer cell genetic program.

  20. LabVIEW: a software system for data acquisition, data analysis, and instrument control.

    PubMed

    Kalkman, C J

    1995-01-01

    Computer-based data acquisition systems play an important role in clinical monitoring and in the development of new monitoring tools. LabVIEW (National Instruments, Austin, TX) is a data acquisition and programming environment that allows flexible acquisition and processing of analog and digital data. The main feature that distinguishes LabVIEW from other data acquisition programs is its highly modular graphical programming language, "G," and a large library of mathematical and statistical functions. The advantage of graphical programming is that the code is flexible, reusable, and self-documenting. Subroutines can be saved in a library and reused without modification in other programs. This dramatically reduces development time and enables researchers to develop or modify their own programs. LabVIEW uses a large amount of processing power and computer memory, thus requiring a powerful computer. A large-screen monitor is desirable when developing larger applications. LabVIEW is excellently suited for testing new monitoring paradigms, analysis algorithms, or user interfaces. The typical LabVIEW user is the researcher who wants to develop a new monitoring technique, a set of new (derived) variables by integrating signals from several existing patient monitors, closed-loop control of a physiological variable, or a physiological simulator.

  1. Toward quantifying the effectiveness of water trading under uncertainty.

    PubMed

    Luo, B; Huang, G H; Zou, Y; Yin, Y Y

    2007-04-01

    This paper presents a methodology for quantifying the effectiveness of water trading under uncertainty by developing an optimization model based on the interval-parameter two-stage stochastic program (TSP) technique. In the study, the effectiveness of a water-trading program is measured by the water volume that can be released through trading from a statistical point of view. The methodology can also deal with recourse water allocation problems generated by randomness in water availability and, at the same time, tackle uncertainties expressed as intervals in the trading system. The developed methodology was tested with a hypothetical water-trading program in an agricultural system in the Swift Current Creek watershed, Canada. Study results indicate that the methodology can measure the effectiveness of a trading program by estimating the water volume released through trading over the long term. A sensitivity analysis was also conducted to analyze the effects of different trading costs on the trading program. It shows that trading would become ineffective when the trading costs are too high. The case study also demonstrates that the trading program is more effective in a dry season when total water availability is in shortage.

  2. [A development and evaluation of nursing KMS using QFD in outpatient departments].

    PubMed

    Lee, Han Na; Yun, Eun Kyoung

    2014-02-01

    This study was done to develop and implement a Nursing KMS (knowledge management system) in order to improve knowledge sharing and creation among clinical nurses in outpatient departments. This was methodological research using the System Development Life Cycle, consisting of planning, analysis, design, implementation, and evaluation. Quality Function Deployment (QFD) was applied to establish nurse requirements and to identify important design requirements. Participants were 32 nurses, and evaluation data were collected pre- and post-intervention at K Hospital in Seoul, a tertiary hospital with over 1,000 beds. The Nursing KMS was built using a Linux-based operating system, Oracle DBMS, and Java 1.6 web programming tools. The system was implemented as a sub-system of the hospital information system. There was a statistically significant difference in knowledge sharing, but no statistically significant difference was observed in knowledge creation. In terms of satisfaction with the system, system efficiency ranked first, followed by system convenience, information suitability, and information usefulness. The results indicate that the use of the Nursing KMS increases nurses' knowledge sharing and can contribute to increased quality of nursing knowledge and provide more opportunities for nurses to gain expertise from knowledge shared among nurses.

  3. Novel Kalman Filter Algorithm for Statistical Monitoring of Extensive Landscapes with Synoptic Sensor Data

    PubMed Central

    Czaplewski, Raymond L.

    2015-01-01

    Wall-to-wall remotely sensed data are increasingly available to monitor landscape dynamics over large geographic areas. However, statistical monitoring programs that use post-stratification cannot fully utilize those sensor data. The Kalman filter (KF) is an alternative statistical estimator. I develop a new KF algorithm that is numerically robust with large numbers of study variables and auxiliary sensor variables. A National Forest Inventory (NFI) illustrates application within an official statistics program. Practical recommendations regarding remote sensing and statistical issues are offered. This algorithm has the potential to increase the value of synoptic sensor data for statistical monitoring of large geographic areas. PMID:26393588
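
    For readers unfamiliar with the estimator named above, the sketch below shows one generic predict/update cycle of a linear Kalman filter, where the state is a small vector of monitored study variables and the observation is an auxiliary sensor summary. The matrices and dimensions are illustrative; the record describes a numerically robust variant for large numbers of variables, which is not reproduced here.

    ```python
    import numpy as np

    def kalman_step(x, P, z, F, Q, H, R):
        """One predict/update cycle of a linear Kalman filter.

        x, P : prior state estimate and covariance
        z    : new observation vector (e.g., a remotely sensed auxiliary variable)
        F, Q : state transition matrix and process-noise covariance
        H, R : observation matrix and measurement-noise covariance
        """
        # Predict
        x_pred = F @ x
        P_pred = F @ P @ F.T + Q
        # Update
        S = H @ P_pred @ H.T + R                      # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)           # Kalman gain
        x_new = x_pred + K @ (z - H @ x_pred)
        P_new = (np.eye(len(x)) - K @ H) @ P_pred
        return x_new, P_new

    # Illustrative 2-variable state observed through a single sensor channel:
    x = np.array([10.0, 4.0]); P = np.eye(2)
    F = np.eye(2);             Q = 0.01 * np.eye(2)
    H = np.array([[1.0, 0.5]]); R = np.array([[0.25]])
    x, P = kalman_step(x, P, np.array([11.8]), F, Q, H, R)
    print(x)
    ```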

  4. Comparative Study of School Leaders Use of Learning Consultants to Support At-Risk Students

    ERIC Educational Resources Information Center

    Forester, GinaMarie

    2013-01-01

    The education system continues to change and the needs of the 21st century learner are increasing. More students are being identified as at-risk. Statistics in New Jersey demonstrate trends that indicate the number of students being classified for special education is rising. Supervisors in schools need to develop and support programs to provide…

  5. Host-Based Multivariate Statistical Computer Operating Process Anomaly Intrusion Detection System (PAIDS)

    DTIC Science & Technology

    2009-03-01

    Sub7: Also known as SubSeven, this is one of the best known, most widely distributed backdoor programs on the...engineering the spread of viruses, worms, backdoors and other malware. The Sub7 Trojan establishes a server on the victim computer that

  6. A Summary of the Naval Postgraduate School Research Program.

    DTIC Science & Technology

    1985-09-30

    new model will now be used in a variety of oceanic investigations including the response of the ocean to tropical and extratropical storms (R. L...Numerical Study of Maritime Extratropical Cyclones Using FGGE Data...Oceanic Current System Response to Atmospheric...In addition, Professor Jayachandran has performed statistical analyses of the storm tracking methodology used by the Naval Environmental Prediction

  7. The NASA digital VGH program: Exploration of methods and final results. Volume 2: L 1011 data 1978-1979: 1619 hours

    NASA Technical Reports Server (NTRS)

    Crabill, Norman L.

    1989-01-01

    Data obtained from the digital flight data recorder system of a L 1011 aircraft in 914 flights and 1619 hours of airline revenue operations are presented. Data on conditions with flap deployment and autopilot use are given. In addition, acceleration statistics are presented from 23 hours on nonrevenue flights.

  8. Value of Forecaster in the Loop

    DTIC Science & Technology

    2014-09-01

    ... forecast system; IFR, instrument flight rules; IMC, instrument meteorological conditions; LAMP, Localized Aviation Model Output Statistics Program; METOC ... obtaining valuable experience. Additional factors have impacted the Navy weather forecast process, including a realignment of the meteorology ... Of the forecasts that are assessed, it may be a relatively small number that have direct impact on the decision-making process. Whether the value is minimal or ...

  9. Compilation of annual reports of the Navy ELF (Extremely Low Frequency) communications system ecological monitoring program (1983). Volume 2: Tabs F-J

    NASA Astrophysics Data System (ADS)

    Fischer, R. L.; Beaver, D. L.; Asher, J. H.; Hill, R. W.; Burton, T. M.

    1984-07-01

    The climatic factors which are known to impinge upon the biological behavior patterns of the megachilid bees were measured. Statistical testing of man-made influences, such as the introduction of ELF into the environment, was pursued, with emphasis on nest architecture, pollen collection, and ability to orient to nest sites.

  10. Evaluation program for secondary spacecraft cells: Cycle life test

    NASA Technical Reports Server (NTRS)

    Harkness, J. D.

    1979-01-01

    The service life and storage stability for several storage batteries were determined. The batteries included silver-zinc batteries, nickel-cadmium batteries, and silver-cadmium batteries. The cell performance characteristics and limitations are to be used by spacecraft power systems planners and designers. A statistical analysis of the life cycle prediction and cause of failure versus test conditions is presented.

  11. Macro-Econophysics

    NASA Astrophysics Data System (ADS)

    Aoyama, Hideaki; Fujiwara, Yoshi; Ikeda, Yuichi; Iyetomi, Hiroshi; Souma, Wataru; Yoshikawa, Hiroshi

    2017-07-01

    Preface; Foreword; Acknowledgements; List of tables; List of figures; Prologue; 1. Introduction: reconstructing macroeconomics; 2. Basic concepts in statistical physics and stochastic models; 3. Income and firm-size distributions; 4. Productivity distribution and related topics; 5. Multivariate time-series analysis; 6. Business cycles; 7. Price dynamics and inflation/deflation; 8. Complex network, community analysis, visualization; 9. Systemic risks; Appendix A: computer program for beginners; Epilogue; Bibliography; Index.

  12. ENT care of children and adolescents in the Brazilian public health system in three different municipalities.

    PubMed

    T-Ping, Cheng; Weckx, Luc Louis Maurice

    2008-01-01

    A database of ENT care in the Brazilian public health system (Sistema Unico de Saude - SUS) will help organize public health programs. The following items were investigated in patients aged up to 17 years attended in public health system outpatient units in the city of Mariana, in the ENT screening unit, UNIFESP-EPM, and in CISMISEL: 1) the main otorhinolaryngological diagnoses; 2) the most frequently required exams, drugs, and surgical procedures and their indications; 3) the jobs of parents and the number of siblings; and 4) a statistical analysis and comparison of data in each location. We undertook a prospective study and a statistical analysis of variables that were gathered during the first visit. The age, the parents' salary, the number of siblings aged below 18 years, the presence of rhinitis, ear diseases, and the exams, drugs, and otological surgeries that were indicated were all statistically significant. The most common diagnosis was mouth breathing. The most common surgery was adenotonsillectomy. The most frequently requested exam was a lateral cranial radiograph. The number of unemployed parents, their poor salaries, and the number of siblings make it difficult for these patients to be treated in any facility other than the public health system.

  13. Cloud-free resolution element statistics program

    NASA Technical Reports Server (NTRS)

    Liley, B.; Martin, C. D.

    1971-01-01

    Computer program computes the number of cloud-free elements in the field-of-view and the percentage of the total field-of-view occupied by clouds. Computing cloud statistics this way, rather than by visual estimation from aerial photographs, eliminates the associated human error.
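    A toy version of the two statistics the program reports is sketched below; the synthetic image and the brightness threshold are assumptions for illustration, not the original implementation:

        import numpy as np

        # Count cloud-free resolution elements and the percentage of the field
        # of view occupied by clouds. The image and threshold are illustrative.
        rng = np.random.default_rng(0)
        image = rng.random((64, 64))      # stand-in for a gridded field of view
        CLOUD_THRESHOLD = 0.7             # elements brighter than this count as cloud

        cloudy = image > CLOUD_THRESHOLD
        cloud_free_elements = int((~cloudy).sum())
        percent_cloud_cover = 100.0 * cloudy.mean()
        print(cloud_free_elements, round(percent_cloud_cover, 1))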

  14. UNIX-BASED DATA MANAGEMENT SYSTEM FOR PROPAGATION EXPERIMENTS

    NASA Technical Reports Server (NTRS)

    Kantak, A. V.

    1994-01-01

    This collection of programs comprises the UNIX Based Data Management System for the Pilot Field Experiment (PiFEx), which is an attempt to mimic the Mobile Satellite (MSAT) scenario. The major purposes of PiFEx are to define the mobile communications channels and test the workability of new concepts used to design various components of the receiver system. The results of the PiFEx experiment are large amounts of raw data which must be accessed according to a researcher's needs. This package provides a system to manage the PiFEx data in an interactive way. The system not only provides the file handling necessary to retrieve the desired data, but also several FORTRAN programs to generate some standard results pertaining to propagation data. This package assumes that the data file initially generated from the experiment has already been converted from binary to ASCII format. The Data Management system described here consists of programs divided into two categories: those programs that handle the PiFEx-generated files and those that are used for number-crunching of these files. Five FORTRAN programs and one UNIX shell script file are used for file manipulation purposes. These activities include calibration of the acquired data and parsing of the large data file into datasets concerned with different aspects of the experiment, such as the specific calibrated propagation data, dynamic and static loop error data, statistical data, and temperature and spatial data on the hardware used in the experiment. The five remaining FORTRAN programs are used to generate usable information about the data. Signal level probability, probability density of the signal fitting the Rician density function, frequency of the data's fade duration, and the Fourier transform of the data can all be generated from these data manipulation programs. In addition, a program is provided which generates a downloadable file from the signal levels and signal phases files for use with the plotting routine AKPLOT (NPO-16931). All programs in this package are written in either FORTRAN-77 or UNIX shell-scripts. The package does not include test data. The programs were developed in 1987 for use with a UNIX operating system on a DEC MicroVAX computer.
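    As an illustration of one of the "standard results" mentioned above, the sketch below computes an empirical signal-level probability from a synthetic fading envelope; the data and thresholds are assumed, and the real package reads calibrated PiFEx files instead:

        import numpy as np

        # Empirical probability that the received signal level falls below a
        # set of thresholds. The sample data are synthetic stand-ins for the
        # calibrated PiFEx propagation files.
        rng = np.random.default_rng(1)
        signal_db = 20 * np.log10(rng.rayleigh(scale=1.0, size=10000))  # fading envelope in dB

        thresholds_db = np.arange(-30, 11, 5)
        for t in thresholds_db:
            prob_below = np.mean(signal_db <= t)
            print(f"P(signal <= {t:+3d} dB) = {prob_below:.4f}")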

  15. Retrofitting the AutoBayes Program Synthesis System with Concrete Syntax

    NASA Technical Reports Server (NTRS)

    Fischer, Bernd; Visser, Eelco

    2004-01-01

    AutoBayes is a fully automatic, schema-based program synthesis system for statistical data analysis applications. Its core component is a schema library, i.e., a collection of generic code templates with associated applicability constraints which are instantiated in a problem-specific way during synthesis. Currently, AutoBayes is implemented in Prolog; the schemas thus use abstract syntax (i.e., Prolog terms) to formulate the templates. However, the conceptual distance between this abstract representation and the concrete syntax of the generated programs makes the schemas hard to create and maintain. In this paper we describe how AutoBayes is retrofitted with concrete syntax. We show how it is integrated into Prolog and describe how the seamless interaction of concrete syntax fragments with AutoBayes's remaining legacy meta-programming kernel based on abstract syntax is achieved. We apply the approach to gradually migrate individual schemas without forcing a disruptive migration of the entire system to a different representation. First experiences show that a smooth migration can be achieved. Moreover, it can result in a considerable reduction of the code size and improved readability of the code. In particular, abstracting out fresh-variable generation and second-order term construction allows the formulation of larger continuous fragments.

  16. Direct assessment as a measure of institutional effectiveness in a dental hygiene distance education program.

    PubMed

    Olmsted, Jodi L

    2014-10-01

    This ten-year, longitudinal examination of a dental hygiene distance education (DE) program considered student performance on standard benchmark assessments as direct measures of institutional effectiveness. The aim of the study was to determine if students face-to-face in a classroom with an instructor performed differently from their counterparts in a DE program, taking courses through the alternative delivery system of synchronous interactive television (ITV). This study used students' grade point averages and National Board Dental Hygiene Examination scores to assess the impact of ITV on student learning, filling a crucial gap in current evidence. The study's research population consisted of 189 students who graduated from one dental hygiene program between 1997 and 2006. One hundred percent of the institution's data files for these students were used: 117 students were face-to-face with the instructor, and 72 received instruction through the ITV system. The results showed that, from a year-by-year perspective, no statistically significant performance differences were apparent between the two student groups when t-tests were used for data analysis. The DE system examined was considered effective for delivering education if similar performance outcomes were the evaluation criteria used for assessment.

  17. Moments of inclination error distribution computer program

    NASA Technical Reports Server (NTRS)

    Myler, T. R.

    1981-01-01

    A FORTRAN coded computer program is described which calculates orbital inclination error statistics using a closed-form solution. This solution uses a data base of trajectory errors from actual flights to predict the orbital inclination error statistics. The Scott flight history data base consists of orbit insertion errors in the trajectory parameters - altitude, velocity, flight path angle, flight azimuth, latitude and longitude. The methods used to generate the error statistics are of general interest since they have other applications. Program theory, user instructions, output definitions, subroutine descriptions and detailed FORTRAN coding information are included.
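    The report uses a closed-form solution driven by the Scott flight history data base; as an alternative illustration only, the sketch below propagates assumed orbit-insertion errors to inclination-error statistics by Monte Carlo sampling (the nominal state and error sigmas are invented, not flight data):

        import numpy as np

        # Monte Carlo illustration (not the closed-form method of the report):
        # propagate assumed insertion errors in position and velocity to
        # orbital inclination and summarize the resulting error statistics.
        rng = np.random.default_rng(2)
        r_nom = np.array([6678.0, 0.0, 0.0])      # km, nominal insertion position
        v_nom = np.array([0.0, 6.789, 3.920])     # km/s, roughly a 30 deg inclination orbit

        def inclination_deg(r, v):
            h = np.cross(r, v)                    # specific angular momentum vector
            return np.degrees(np.arccos(h[2] / np.linalg.norm(h)))

        i_nom = inclination_deg(r_nom, v_nom)
        samples = []
        for _ in range(20000):
            r = r_nom + rng.normal(0.0, 1.0, 3)   # 1 km position error (1-sigma)
            v = v_nom + rng.normal(0.0, 0.005, 3) # 5 m/s velocity error (1-sigma)
            samples.append(inclination_deg(r, v) - i_nom)

        samples = np.array(samples)
        print(f"mean error {samples.mean():.4f} deg, sigma {samples.std():.4f} deg")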

  18. Statistical Package User’s Guide.

    DTIC Science & Technology

    1980-08-01

    C. STACH, Nonparametric Descriptive Statistics ... D. CHIRA, Coefficient of Concordance ... Test Data: This program was tested using data from John Neter and William Wasserman, Applied Linear Statistical Models: Regression ... length of data file; e. new file name (not same as raw data file); 5. Printout as optioned for only. Comments: Ranked data are used for program CHIRA.

  19. For the Love of Statistics: Appreciating and Learning to Apply Experimental Analysis and Statistics through Computer Programming Activities

    ERIC Educational Resources Information Center

    Mascaró, Maite; Sacristán, Ana Isabel; Rufino, Marta M.

    2016-01-01

    For the past 4 years, we have been involved in a project that aims to enhance the teaching and learning of experimental analysis and statistics, of environmental and biological sciences students, through computational programming activities (using R code). In this project, through an iterative design, we have developed sequences of R-code-based…

  20. Big Data: Are Biomedical and Health Informatics Training Programs Ready?

    PubMed Central

    Hersh, W.; Ganesh, A. U. Jai

    2014-01-01

    Objectives: The growing volume and diversity of health and biomedical data indicate that the era of Big Data has arrived for healthcare. This has many implications for informatics, not only in terms of implementing and evaluating information systems, but also for the work and training of informatics researchers and professionals. This article addresses the question: What do biomedical and health informaticians working in analytics and Big Data need to know? Methods: We hypothesize a set of skills that we hope will be discussed among academic and other informaticians. Results: The set of skills includes: Programming - especially with data-oriented tools, such as SQL and statistical programming languages; Statistics - working knowledge to apply tools and techniques; Domain knowledge - depending on one’s area of work, bioscience or health care; and Communication - being able to understand needs of people and organizations, and articulate results back to them. Conclusions: Biomedical and health informatics educational programs must introduce concepts of analytics, Big Data, and the underlying skills to use and apply them into their curricula. The development of new coursework should focus on those who will become experts, with training aiming to provide skills in “deep analytical talent” as well as those who need knowledge to support such individuals. PMID:25123740

  1. Big Data: Are Biomedical and Health Informatics Training Programs Ready? Contribution of the IMIA Working Group for Health and Medical Informatics Education.

    PubMed

    Otero, P; Hersh, W; Jai Ganesh, A U

    2014-08-15

    The growing volume and diversity of health and biomedical data indicate that the era of Big Data has arrived for healthcare. This has many implications for informatics, not only in terms of implementing and evaluating information systems, but also for the work and training of informatics researchers and professionals. This article addresses the question: What do biomedical and health informaticians working in analytics and Big Data need to know? We hypothesize a set of skills that we hope will be discussed among academic and other informaticians. The set of skills includes: Programming - especially with data-oriented tools, such as SQL and statistical programming languages; Statistics - working knowledge to apply tools and techniques; Domain knowledge - depending on one's area of work, bioscience or health care; and Communication - being able to understand needs of people and organizations, and articulate results back to them. Biomedical and health informatics educational programs must introduce concepts of analytics, Big Data, and the underlying skills to use and apply them into their curricula. The development of new coursework should focus on those who will become experts, with training aiming to provide skills in "deep analytical talent" as well as those who need knowledge to support such individuals.

  2. Contribution of artificial intelligence to the knowledge of prognostic factors in laryngeal carcinoma.

    PubMed

    Zapater, E; Moreno, S; Fortea, M A; Campos, A; Armengot, M; Basterra, J

    2000-11-01

    Many studies have investigated prognostic factors in laryngeal carcinoma, with sometimes conflicting results. Apart from the importance of environmental factors, the different statistical methods employed may have influenced such discrepancies. A program based on artificial intelligence techniques is designed to determine the prognostic factors in a series of 122 laryngeal carcinomas. The results obtained are compared with those derived from two classical statistical methods (Cox regression and mortality tables). Tumor location was found to be the most important prognostic factor by all methods. The proposed intelligent system is found to be a sound method capable of detecting exceptional cases.

  3. Statistics Anxiety Update: Refining the Construct and Recommendations for a New Research Agenda.

    PubMed

    Chew, Peter K H; Dillon, Denise B

    2014-03-01

    Appreciation of the importance of statistics literacy for citizens of a democracy has resulted in an increasing number of degree programs making statistics courses mandatory for university students. Unfortunately, empirical evidence suggests that students in nonmathematical disciplines (e.g., social sciences) regard statistics courses as the most anxiety-inducing course in their degree programs. Although a literature review exists for statistics anxiety, it was done more than a decade ago, and newer studies have since added findings for consideration. In this article, we provide a current review of the statistics anxiety literature. Specifically, related variables, definitions, and measures of statistics anxiety are reviewed with the goal of refining the statistics anxiety construct. Antecedents, effects, and interventions of statistics anxiety are also reviewed to provide recommendations for statistics instructors and for a new research agenda. © The Author(s) 2014.

  4. Multiple linear regression analysis

    NASA Technical Reports Server (NTRS)

    Edwards, T. R.

    1980-01-01

    Program rapidly selects best-suited set of coefficients. User supplies only vectors of independent and dependent data and specifies confidence level required. Program uses stepwise statistical procedure for relating minimal set of variables to set of observations; final regression contains only most statistically significant coefficients. Program is written in FORTRAN IV for batch execution and has been implemented on NOVA 1200.
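    A sketch of forward stepwise selection in the same spirit, keeping only coefficients that meet the chosen confidence level, is shown below; the data, the 95% level, and the Python/SciPy implementation are illustrative assumptions rather than the FORTRAN IV program itself:

        import numpy as np
        from scipy import stats

        # Forward stepwise selection: at each step add the candidate variable
        # whose coefficient is most significant; stop when no candidate meets
        # the chosen confidence level.
        def forward_stepwise(X, y, alpha=0.05):
            n, m = X.shape
            selected = []
            while True:
                best = None
                for j in [c for c in range(m) if c not in selected]:
                    cols = selected + [j]
                    A = np.column_stack([np.ones(n), X[:, cols]])
                    beta, res, *_ = np.linalg.lstsq(A, y, rcond=None)
                    dof = n - A.shape[1]
                    sigma2 = np.sum((y - A @ beta) ** 2) / dof
                    cov = sigma2 * np.linalg.inv(A.T @ A)
                    t_stat = beta[-1] / np.sqrt(cov[-1, -1])   # newest coefficient
                    p = 2 * stats.t.sf(abs(t_stat), dof)
                    if best is None or p < best[1]:
                        best = (j, p)
                if best is None or best[1] > alpha:
                    return selected
                selected.append(best[0])

        rng = np.random.default_rng(3)
        X = rng.normal(size=(100, 5))
        y = 2.0 * X[:, 1] - 1.5 * X[:, 3] + rng.normal(scale=0.5, size=100)
        print(forward_stepwise(X, y))   # expected to pick columns 1 and 3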

  5. Adult Basic and Secondary Education Program Statistics. Fiscal Year 1976.

    ERIC Educational Resources Information Center

    Cain, Sylvester H.; Whalen, Barbara A.

    Reports submitted to the National Center for Education Statistics provided data for this compilation and tabulation of data on adult participants in U.S. educational programs in fiscal year 1976. In the summary section introducing the charts, it is noted that adult education programs funded under P.L. 91-230 served over 1.6 million persons--an…

  6. Using Microsoft Excel to teach statistics in a graduate advanced practice nursing program.

    PubMed

    DiMaria-Ghalili, Rose Ann; Ostrow, C Lynne

    2009-02-01

    This article describes the authors' experiences during 3 years of using Microsoft Excel to teach graduate-level statistics, as part of the research core required by the American Association of Colleges of Nursing for all professional graduate nursing programs. The advantages to using this program instead of specialized statistical programs are ease of accessibility, increased transferability of skills, and reduced cost for students. The authors share their insight about realistic goals for teaching statistics to master's-level students and the resources that are available to faculty to help them to learn and use Excel in their courses. Several online sites that are excellent resources for both faculty and students are discussed. Detailed attention is given to an online course (Carnegie-Mellon University Open Learning Initiative, n.d.), which the authors have incorporated into their graduate-level research methods course.

  7. Compilation of streamflow statistics calculated from daily mean streamflow data collected during water years 1901–2015 for selected U.S. Geological Survey streamgages

    USGS Publications Warehouse

    Granato, Gregory E.; Ries, Kernell G.; Steeves, Peter A.

    2017-10-16

    Streamflow statistics are needed by decision makers for many planning, management, and design activities. The U.S. Geological Survey (USGS) StreamStats Web application provides convenient access to streamflow statistics for many streamgages by accessing the underlying StreamStatsDB database. In 2016, non-interpretive streamflow statistics were compiled for streamgages located throughout the Nation and stored in StreamStatsDB for use with StreamStats and other applications. Two previously published USGS computer programs that were designed to help calculate streamflow statistics were updated to better support StreamStats as part of this effort. These programs are named “GNWISQ” (Get National Water Information System Streamflow (Q) files), updated to version 1.1.1, and “QSTATS” (Streamflow (Q) Statistics), updated to version 1.1.2. Statistics for 20,438 streamgages that had 1 or more complete years of record during water years 1901 through 2015 were calculated from daily mean streamflow data; 19,415 of these streamgages were within the conterminous United States. About 89 percent of the 20,438 streamgages had 3 or more years of record, and about 65 percent had 10 or more years of record. Drainage areas of the 20,438 streamgages ranged from 0.01 to 1,144,500 square miles. The magnitude of annual average streamflow yields (streamflow per square mile) for these streamgages varied by almost six orders of magnitude, from 0.000029 to 34 cubic feet per second per square mile. About 64 percent of these streamgages did not have any zero-flow days during their available period of record. The 18,122 streamgages with 3 or more years of record were included in the StreamStatsDB compilation so they would be available via the StreamStats interface for user-selected streamgages. All the statistics are available in a USGS ScienceBase data release.

  8. Transportation statistics annual report 1995

    DOT National Transportation Integrated Search

    1995-01-01

    The summary of transportation statistics programs and many of the tables and graphs pioneered in last year's Transportation Statistics Annual Report have been incorporated into the companion volume, National Transportation Statistics. The...

  9. Performance of experienced dentists in Switzerland after an e-learning program on ICDAS occlusal caries detection.

    PubMed

    Rodrigues, Jonas Almeida; de Oliveira, Renata Schlesner; Hug, Isabel; Neuhaus, Klaus; Lussi, Adrian

    2013-08-01

    This study aimed to evaluate the effect of an e-learning program on the validity and reproducibility of the International Caries Detection and Assessment System (ICDAS) in detecting occlusal caries. For the study, 170 permanent molars were selected. Four dentists in Switzerland who had no previous contact with ICDAS examined the teeth before and after the e-learning program and scored the sites according to ICDAS. Teeth were histologically prepared and assessed for caries extension. The significance level was set at 0.05. Sensitivity before and after the e-learning program was 0.80 and 0.77 (D1), 0.72 and 0.63 (D2), and 0.74 and 0.67 (D3,4), respectively. Specificity was 0.64 and 0.69 (D1), 0.70 and 0.81 (D2), and 0.81 and 0.87 (D3,4). A McNemar test did not show any difference between the values of sensitivity, specificity, accuracy, and area under the ROC curve (AUC) before and after the e-learning program. The averages of wK values for interexaminer reproducibility were 0.61 (before) and 0.66 (after). Correlation with histology presented wK values of 0.62 (before) and 0.63 (after). A Wilcoxon test showed a statistically significant difference between before and after the e-learning program. In conclusion, even though ICDAS performed well in detecting occlusal caries, the e-learning program did not have any statistically significant effect on its performance by these experienced dentists.

  10. Addressing hospital-acquired pressure ulcers: patient care managers enhancing outcomes at the point of service.

    PubMed

    Frumenti, Jeanine M; Kurtz, Abby

    2014-01-01

    An innovative leadership training program for patient care managers (PCMs) aimed at improving the management of operational failures was conducted at a large metropolitan hospital center. The program focused on developing and enhancing the transformational leadership skills of PCMs by improving their ability to manage operational failures in general and, in this case, hospital-acquired pressure ulcers. The PCMs received 8 weeks of intense training using the Toyota Production System process improvement approach, along with executive coaching. Compared with the control group, the gains made by the intervention group were statistically significant.

  11. Digital image profilers for detecting faint sources which have bright companions, phase 2

    NASA Technical Reports Server (NTRS)

    Morris, Elena; Flint, Graham

    1991-01-01

    A breadboard image profiling system developed for the first phase of this project demonstrated the potential for detecting extremely faint optical sources in the presence of bright companions. Experimental data derived from laboratory testing of the device support the theory that image profilers of this type may approach the theoretical limit imposed by photon statistics. The objective of Phase 2 of this program is the development of a ground-based multichannel image profiling system capable of detecting faint stellar objects slightly displaced from brighter stars. We have finalized the multichannel image profiling system and attempted three field tests.

  12. A Simplified Algorithm for Statistical Investigation of Damage Spreading

    NASA Astrophysics Data System (ADS)

    Gecow, Andrzej

    2009-04-01

    To simulate the adaptive evolution of a complex system describing a living object or a human-developed project, a fitness must be defined on node states or network external outputs. Feedbacks lead to circular attractors of these states or outputs, which makes it difficult to define a fitness. The main statistical effects of the adaptive condition result from the tendency toward small change; to appear, they only require a statistically correct size of the damage initiated by an evolutionary change of the system. This observation allows loops of feedback to be cut and, in effect, a particular statistically correct state to be obtained instead of the long circular attractor which, in the quenched model, is expected for a chaotic network with feedback. Defining fitness on such states is simple. Only damaged nodes are calculated, and only once. Such an algorithm is well suited for investigating damage spreading, i.e., the statistical connections between structural parameters of the initial change and the size of the resulting damage. It is a reversed-annealed method: functions and states (signals) may be randomly substituted, but connections are important and are preserved. The small damages important for adaptive evolution are depicted correctly in comparison to the Derrida annealed approximation, which expects equilibrium levels for large networks; the algorithm indicates these levels correctly. The relevant program in Pascal, which executes the algorithm for a wide range of parameters, can be obtained from the author.
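    A greatly simplified sketch of the damage-spreading idea, in which each node reached by the damage is evaluated at most once so that no feedback loop is followed twice, is given below; the network size, connectivity, and random Boolean functions are arbitrary choices for illustration, not the author's Pascal program:

        import random

        # Toy damage spreading on a random K=2 Boolean network: flip one node,
        # then re-evaluate each downstream node at most once, in the order the
        # damage reaches it, and count how many nodes end up changed.
        random.seed(4)
        N, K = 200, 2
        inputs = [[random.randrange(N) for _ in range(K)] for _ in range(N)]
        outputs = [[] for _ in range(N)]
        for node, ins in enumerate(inputs):
            for src in ins:
                outputs[src].append(node)
        funcs = [{(a, b): random.randint(0, 1) for a in (0, 1) for b in (0, 1)}
                 for _ in range(N)]
        state = [random.randint(0, 1) for _ in range(N)]

        def spread_damage(start):
            damaged, evaluated = {start}, {start}
            frontier = [start]
            new_state = state[:]
            new_state[start] ^= 1                      # initiating change
            while frontier:
                nxt = []
                for src in frontier:
                    for node in outputs[src]:
                        if node in evaluated:
                            continue                   # each node evaluated only once
                        evaluated.add(node)
                        a, b = (new_state[i] for i in inputs[node])
                        out = funcs[node][(a, b)]
                        if out != state[node]:
                            new_state[node] = out
                            damaged.add(node)
                            nxt.append(node)
                frontier = nxt
            return len(damaged)

        sizes = [spread_damage(random.randrange(N)) for _ in range(50)]
        print("mean damage size:", sum(sizes) / len(sizes))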

  13. Evaluation of the effects of one year's operation of the dynamic preferential runway system. [human reactions to overflight air traffic pattern

    NASA Technical Reports Server (NTRS)

    Borsky, P. N.

    1974-01-01

    The FAA introduced an experimental aircraft operations program at JFK Airport called the Dynamic Preferential Runway System (DPRS) in the summer of 1971. The program is designed to distribute air traffic as equally as possible over the surrounding communities, to limit periods of continuous overflight and to vary the same hours of overflight from day to day. After a full year's operation, an evaluation was made of the system's effectiveness. All of the operation's goals were moderately achieved with the greatest relief in reduced overflight afforded the most heavily impacted areas. Few residents, however, were aware of DPRS or felt that it had greatly reduced annoyance or represented a major effort by the aircraft authorities. Statistical analyses of reported annoyance obtained from two independent surveys in 1969 and 1972 reveal limited reductions in annoyance in 1972, with shifts from reported high annoyance to moderate annoyance.

  14. Probabilistic Survivability Versus Time Modeling

    NASA Technical Reports Server (NTRS)

    Joyner, James J., Sr.

    2016-01-01

    This presentation documents Kennedy Space Center's Independent Assessment work completed on three assessments for the Ground Systems Development and Operations (GSDO) Program to assist the Chief Safety and Mission Assurance Officer during key programmatic reviews and provided the GSDO Program with analyses of how egress time affects the likelihood of astronaut and ground worker survival during an emergency. For each assessment, a team developed probability distributions for hazard scenarios to address statistical uncertainty, resulting in survivability plots over time. The first assessment developed a mathematical model of probabilistic survivability versus time to reach a safe location using an ideal Emergency Egress System at Launch Complex 39B (LC-39B); the second used the first model to evaluate and compare various egress systems under consideration at LC-39B. The third used a modified LC-39B model to determine if a specific hazard decreased survivability more rapidly than other events during flight hardware processing in Kennedy's Vehicle Assembly Building.

  15. Results of the Association of Directors of Radiation Oncology Programs (ADROP) Survey of Radiation Oncology Residency Program Directors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, Eleanor; Abdel-Wahab, May; Spangler, Ann E.

    2009-06-01

    Purpose: To survey the radiation oncology residency program directors on the topics of departmental and institutional support systems, residency program structure, Accreditation Council for Graduate Medical Education (ACGME) requirements, and challenges as program director. Methods: A survey was developed and distributed by the leadership of the Association of Directors of Radiation Oncology Programs to all radiation oncology program directors. Summary statistics, medians, and ranges were collated from responses. Results: Radiation oncology program directors had implemented all current required aspects of the ACGME Outcome Project into their training curriculum. Didactic curricula were similar across programs nationally, but research requirements and resources varied widely. Program directors responded that implementation of the ACGME Outcome Project and the external review process were among their greatest challenges. Protected time was the top priority for program directors. Conclusions: The Association of Directors of Radiation Oncology Programs recommends that all radiation oncology program directors have protected time and an administrative stipend to support their important administrative and educational role. Departments and institutions should provide adequate and equitable resources to the program directors and residents to meet increasingly demanding training program requirements.

  16. THERMUS—A thermal model package for ROOT

    NASA Astrophysics Data System (ADS)

    Wheaton, S.; Cleymans, J.; Hauer, M.

    2009-01-01

    THERMUS is a package of C++ classes and functions allowing statistical-thermal model analyses of particle production in relativistic heavy-ion collisions to be performed within the ROOT framework of analysis. Calculations are possible within three statistical ensembles: a grand-canonical treatment of the conserved charges B, S and Q, a fully canonical treatment of the conserved charges, and a mixed-canonical ensemble combining a canonical treatment of strangeness with a grand-canonical treatment of baryon number and electric charge. THERMUS allows for the assignment of decay chains and detector efficiencies specific to each particle yield, which enables sensible fitting of model parameters to experimental data.
    Program summary. Program title: THERMUS, version 2.1. Catalogue identifier: AEBW_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEBW_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 17 152. No. of bytes in distributed program, including test data, etc.: 93 581. Distribution format: tar.gz. Programming language: C++. Computer: PC, Pentium 4, 1 GB RAM (not hardware dependent). Operating system: Linux: FEDORA, RedHat, etc. Classification: 17.7. External routines: Numerical Recipes in C [1], ROOT [2].
    Nature of problem: Statistical-thermal model analyses of heavy-ion collision data require the calculation of both primordial particle densities and contributions from resonance decay. A set of thermal parameters (the number depending on the particular model imposed) and a set of thermalized particles, with their decays specified, is required as input to these models. The output is then a complete set of primordial thermal quantities for each particle, together with the contributions to the final particle yields from resonance decay. In many applications of statistical-thermal models it is required to fit experimental particle multiplicities or particle ratios. In such analyses, the input is a set of experimental yields and ratios, a set of particles comprising the assumed hadron resonance gas formed in the collision and the constraints to be placed on the system. The thermal model parameters consistent with the specified constraints leading to the best-fit to the experimental data are then output.
    Solution method: THERMUS is a package designed for incorporation into the ROOT [2] framework, used extensively by the heavy-ion community. As such, it utilizes a great deal of ROOT's functionality in its operation. ROOT features used in THERMUS include its containers, the wrapper TMinuit implementing the MINUIT fitting package, and the TMath class of mathematical functions and routines. Arguably the most useful feature is the utilization of CINT as the control language, which allows interactive access to the THERMUS objects. Three distinct statistical ensembles are included in THERMUS, while additional options to include quantum statistics, resonance width and excluded volume corrections are also available. THERMUS provides a default particle list including all mesons (up to the K4∗ (2045)) and baryons (up to the Ω) listed in the July 2002 Particle Physics Booklet [3]. For each typically unstable particle in this list, THERMUS includes a text-file listing its decays.
    With thermal parameters specified, THERMUS calculates primordial thermal densities either by performing numerical integrations or else, in the case of the Boltzmann approximation without resonance width in the grand-canonical ensemble, by evaluating Bessel functions. Particle decay chains are then used to evaluate experimental observables (i.e. particle yields following resonance decay). Additional detector efficiency factors allow fine-tuning of the model predictions to a specific detector arrangement. When parameters are required to be constrained, use is made of the 'Numerical Recipes in C' [1] function which applies the Broyden globally convergent secant method of solving nonlinear systems of equations. Since the NRC software is not freely available, it has to be purchased by the user. THERMUS provides the means of imposing a large number of constraints on the chosen model (amongst others, THERMUS can fix the baryon-to-charge ratio of the system, the strangeness density of the system and the primordial energy per hadron). Fits to experimental data are accomplished in THERMUS by using the ROOT TMinuit class. In its default operation, the standard χ² function is minimized, yielding the set of best-fit thermal parameters. THERMUS allows the assignment of separate decay chains to each experimental input. In this way, the model is able to match the specific feed-down corrections of a particular data set. Running time: Depending on the analysis required, run-times vary from seconds (for the evaluation of particle multiplicities given a set of parameters) to several minutes (for fits to experimental data subject to constraints). References: [1] W.H. Press, S.A. Teukolsky, W.T. Vetterling, B.P. Flannery, Numerical Recipes in C: The Art of Scientific Computing, Cambridge University Press, Cambridge, 2002. [2] R. Brun, F. Rademakers, Nucl. Inst. Meth. Phys. Res. A 389 (1997) 81. See also http://root.cern.ch/. [3] K. Hagiwara et al., Phys. Rev. D 66 (2002) 010001.
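    For orientation, the grand-canonical Boltzmann-approximation density that the summary says can be evaluated with Bessel functions is n_i = g_i m_i^2 T K_2(m_i/T) exp(mu_i/T) / (2 pi^2) in natural units; the sketch below evaluates it in Python with SciPy as a stand-in for the C++/ROOT implementation, with illustrative particle properties and thermal parameters:

        import numpy as np
        from scipy.special import kn

        # Grand-canonical Boltzmann-approximation density (no width or
        # excluded-volume corrections); units converted from GeV^3 to fm^-3.
        HBARC = 0.19733  # GeV*fm

        def boltzmann_density(mass, degeneracy, T, mu):
            """Primordial density in fm^-3: n = g/(2 pi^2) m^2 T K2(m/T) exp(mu/T)."""
            n_gev3 = degeneracy / (2 * np.pi**2) * mass**2 * T * kn(2, mass / T) * np.exp(mu / T)
            return n_gev3 / HBARC**3

        # Illustrative: pi+ (g=1, m=0.1396 GeV) at T=160 MeV, mu=0;
        # proton (g=2, m=0.9383 GeV) with mu_B=50 MeV.
        print(boltzmann_density(0.1396, 1, 0.160, 0.0))
        print(boltzmann_density(0.9383, 2, 0.160, 0.050))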

  17. UKIRT fast guide system improvements

    NASA Astrophysics Data System (ADS)

    Balius, Al; Rees, Nicholas P.

    1997-09-01

    The United Kingdom Infra-Red Telescope (UKIRT) has recently undergone the first major upgrade program since its construction. One part of the upgrade program was an adaptive tip-tilt secondary mirror whose control loop is closed with a CCD system, collectively called the fast guide system. The installation of the new secondary and associated systems was carried out in the first half of 1996. Initial testing of the fast guide system has shown great improvement in guide accuracy. The initial installation included a fixed integration time CCD. In the first part of 1997 an integration time controller based on computed guide star luminosity was implemented in the fast guide system. Also, a Kalman-type estimator was installed in the image tracking loop, based on a dynamic model and knowledge of the statistical properties of the guide star position error measurement as a function of computed guide star magnitude and CCD integration time. The new configuration was tested in terms of improved guide performance and graceful degradation when tracking faint guide stars. This paper describes the modified fast guide system configuration and reports the results of performance tests.

  18. Do-it-yourself statistics: A computer-assisted likelihood approach to analysis of data from genetic crosses.

    PubMed Central

    Robbins, L G

    2000-01-01

    Graduate school programs in genetics have become so full that courses in statistics have often been eliminated. In addition, typical introductory statistics courses for the "statistics user" rather than the nascent statistician are laden with methods for analysis of measured variables, while genetic data are most often discrete numbers. These courses are often seen by students and genetics professors alike as largely irrelevant cookbook courses. The powerful methods of likelihood analysis, although commonly employed in human genetics, are much less often used in other areas of genetics, even though current computational tools make this approach readily accessible. This article introduces the MLIKELY.PAS computer program and the logic of do-it-yourself maximum-likelihood statistics. The program itself, course materials, and expanded discussions of some examples that are only summarized here are available at http://www.unisi.it/ricerca/dip/bio_evol/sitomlikely/mlikely.html. PMID:10628965
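    A toy example of the likelihood approach (not the MLIKELY.PAS program itself): a likelihood-ratio G test of hypothetical F2 counts against an expected 3:1 Mendelian ratio:

        import numpy as np
        from scipy import stats

        # Likelihood-ratio (G) test of discrete genetic counts against a
        # 3:1 expectation. The observed counts are hypothetical.
        observed = np.array([290, 110])                   # dominant : recessive
        expected = observed.sum() * np.array([0.75, 0.25])

        G = 2 * np.sum(observed * np.log(observed / expected))
        p_value = stats.chi2.sf(G, df=1)
        print(f"G = {G:.3f}, p = {p_value:.3f}")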

  19. Revised Perturbation Statistics for the Global Scale Atmospheric Model

    NASA Technical Reports Server (NTRS)

    Justus, C. G.; Woodrum, A.

    1975-01-01

    Magnitudes and scales of atmospheric perturbations about the monthly mean for the thermodynamic variables and wind components are presented by month at various latitudes. These perturbation statistics are a revision of the random perturbation data required for the global scale atmospheric model program and are from meteorological rocket network statistical summaries in the 22 to 65 km height range and NASA grenade and pitot tube data summaries in the region up to 90 km. The observed perturbations in the thermodynamic variables were adjusted to make them consistent with constraints required by the perfect gas law and the hydrostatic equation. Vertical scales were evaluated by Buell's depth of pressure system equation and from vertical structure function analysis. Tables of magnitudes and vertical scales are presented for each month at latitude 10, 30, 50, 70, and 90 degrees.

  20. Fade durations in satellite-path mobile radio propagation

    NASA Technical Reports Server (NTRS)

    Schmier, Robert G.; Bostian, Charles W.

    1986-01-01

    Fades on satellite to land mobile radio links are caused by several factors, the most important of which are multipath propagation and vegetative shadowing. Designers of vehicular satellite communications systems require information about the statistics of fade durations in order to overcome or compensate for the fades. Except for a few limiting cases, only the mean fade duration can be determined analytically, and all other statistics must be obtained experimentally or via simulation. This report describes and presents results from a computer program developed at Virginia Tech to simulate satellite path propagation of a mobile station in a rural area. It generates rapidly-fading and slowly-fading signals by separate processes that yield correct cumulative signal distributions and then combines these to simulate the overall signal. This is then analyzed to yield the statistics of fade duration.
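    The fade-duration statistic itself can be sketched as follows: simulate a fading envelope, mark samples below a threshold, and collect the lengths of the below-threshold runs. The uncorrelated Rayleigh samples and the threshold below are assumptions; the Virginia Tech program instead combines separate rapidly-fading and slowly-fading processes with the correct distributions:

        import numpy as np

        # Fade-duration statistics from a simulated fading envelope. A real
        # simulation would use a time-correlated (Doppler-filtered) process;
        # i.i.d. Rayleigh samples are used here only to show the bookkeeping.
        rng = np.random.default_rng(5)
        fs = 1000.0                                       # samples per second
        envelope_db = 20 * np.log10(rng.rayleigh(1.0, size=60_000))
        threshold_db = -10.0

        below = envelope_db < threshold_db
        fades, run = [], 0
        for b in below:
            if b:
                run += 1
            elif run:
                fades.append(run / fs)                    # fade duration in seconds
                run = 0
        if run:
            fades.append(run / fs)

        fades = np.array(fades)
        print(f"{fades.size} fades, mean duration {fades.mean()*1e3:.2f} ms, max {fades.max()*1e3:.2f} ms")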

  1. 2016 Annual Disability Statistics Compendium

    ERIC Educational Resources Information Center

    Lauer, E. A.; Houtenville, A. J.

    2017-01-01

    The "Annual Disability Statistics Compendium" is a publication of statistics about people with disabilities and about the government programs which serve them. The "Compendium" is designed to serve as a summary of government statistics. The 2016 "Annual Disability Statistics Compendium" was substantially revised and…

  2. Efficacy of the Social Skills Improvement System Classwide Intervention Program (SSIS-CIP) primary version.

    PubMed

    DiPerna, James Clyde; Lei, Puiwa; Bellinger, Jillian; Cheng, Weiyi

    2015-03-01

    A multisite cluster randomized trial was conducted to examine the effects of the Social Skills Improvement System Classwide Intervention Program (SSIS-CIP; Elliott & Gresham, 2007) on students' classroom social behavior. The final sample included 432 students across 38 second grade classrooms. Social skills and problem behaviors were measured via the SSIS rating scale for all participants, and direct observations were completed for a subsample of participants within each classroom. Results indicated that the SSIS-CIP demonstrated positive effects on teacher ratings of participants' social skills and internalizing behaviors, with the greatest changes occurring in classrooms with students who exhibited lower skill proficiency prior to implementation. Statistically significant differences were not observed between treatment and control participants on teacher ratings of externalizing problem behaviors or direct observation.

  3. Mathematical Sciences Division 1992 Programs

    DTIC Science & Technology

    1992-10-01

    statistical theory that underlies modern signal analysis. There is a strong emphasis on stochastic processes and time series, particularly those which ... include optimal resource planning and real-time scheduling of stochastic shop-floor processes. Scheduling systems will be developed that can adapt to ... make forecasts for the length-of-service time series. Protocol analysis of these sessions will be used to identify relevant contextual features and to ...

  4. A Comparative Study of Family Planning Service Statistics Systems in the ESCAP Region. Asian Population Studies Series No. 15.

    ERIC Educational Resources Information Center

    United Nations Economic and Social Commission for Asia and the Pacific, Bangkok (Thailand).

    This monograph contains a study conducted by the Population Division of the United Nations Economic and Social Committee for Asia and the Pacific (ESCAP). The document is designed to aid policy-makers, administrators and evaluation personnel in family planning programs in the ESCAP region, primarily; and researchers working in the field of family…

  5. Watershed Statistics | ECHO | US EPA

    EPA Pesticide Factsheets

    ECHO, Enforcement and Compliance History Online, provides compliance and enforcement information for approximately 800,000 EPA-regulated facilities nationwide. ECHO includes permit, inspection, violation, enforcement action, and penalty information about facilities regulated under the Clean Air Act (CAA) Stationary Source Program, Clean Water Act (CWA) National Pollutant Discharge Elimination System (NPDES), and/or Resource Conservation and Recovery Act (RCRA). Information also is provided on surrounding demographics when available.

  6. Watershed Statistics Help | ECHO | US EPA

    EPA Pesticide Factsheets

    ECHO, Enforcement and Compliance History Online, provides compliance and enforcement information for approximately 800,000 EPA-regulated facilities nationwide. ECHO includes permit, inspection, violation, enforcement action, and penalty information about facilities regulated under the Clean Air Act (CAA) Stationary Source Program, Clean Water Act (CWA) National Pollutant Discharge Elimination System (NPDES), and/or Resource Conservation and Recovery Act (RCRA). Information also is provided on surrounding demographics when available.

  7. DMR Search Statistics Help | ECHO | US EPA

    EPA Pesticide Factsheets

    ECHO, Enforcement and Compliance History Online, provides compliance and enforcement information for approximately 800,000 EPA-regulated facilities nationwide. ECHO includes permit, inspection, violation, enforcement action, and penalty information about facilities regulated under the Clean Air Act (CAA) Stationary Source Program, Clean Water Act (CWA) National Pollutant Discharge Elimination System (NPDES), and/or Resource Conservation and Recovery Act (RCRA). Information also is provided on surrounding demographics when available.

  8. How economic development and family planning programs combined to reduce Indonesian fertility.

    PubMed

    Gertler, P J; Molyneaux, J W

    1994-02-01

    This paper examines the contributions of family planning programs, economic development, and women's status to the Indonesian fertility decline from 1982 to 1987. Methodologically, we unify seemingly conflicting demographic and economic frameworks into a single "structural" proximate-cause model and control statistically for the targeted (nonrandom) placement of family planning program inputs. The results are consistent with both frameworks: 75% of the fertility decline resulted from increased contraceptive use, but this increase was induced primarily through economic development and improved education and economic opportunities for females. Even so, the dramatic impact of the changes in demand-side factors (education and economic development) on contraceptive use was possible only because there already existed a highly responsive contraceptive supply delivery system.

  9. [An analysis of residents' self-evaluation and faculty-evaluation in internal medicine standardized residency training program using Milestones evaluation system].

    PubMed

    Zhang, Y; Chu, X T; Zeng, X J; Li, H; Zhang, F C; Zhang, S Y; Shen, T

    2018-06-01

    Objective: To assess the value of the internal medicine residency training program at Peking Union Medical College Hospital (PUMCH) and the feasibility of applying a revised Milestones evaluation system. Methods: Postgraduate-year-one to -four (PGY-1 to PGY-4) residents at PUMCH completed the revised Milestones evaluation scales in September 2017. Residents' self-evaluation and faculty-evaluation scores were calculated, and statistical analysis was conducted on the data. Results: A total of 207 residents were enrolled in this cross-sectional study. Both self and faculty scores showed an increasing trend in senior residents. PGY-1 residents, assessed during their first month of residency, had scores of 4 points or higher, suggesting that residents start at a high level. More strikingly, the mean score in PGY-4 was 7 points or higher, demonstrating career development over the residency training program. There was no statistically significant difference between total self- and faculty-evaluation scores, but evaluation scores for learning ability and communication ability were lower in the faculty group (t = -2.627, -4.279, all P < 0.05). The scores of graduate students were lower than those of standardized training residents. Conclusions: The goal of national standardized residency training is to improve the quality of healthcare and residents' career development. The evaluation results can guide curriculum design and emphasize the importance and necessity of multi-level teaching. Self-evaluation contributes to the understanding of training objectives and personal cognition.

  10. Analysis of trends in water-quality data for water conservation area 3A, the Everglades, Florida

    USGS Publications Warehouse

    Mattraw, H.C.; Scheidt, D.J.; Federico, A.C.

    1987-01-01

    Rainfall and water quality data bases from the South Florida Water Management District were used to evaluate water quality trends at 10 locations near or in Water Conservation Area 3A in The Everglades. The Seasonal Kendall test was applied to specific conductance, orthophosphate-phosphorus, nitrate-nitrogen, total Kjeldahl nitrogen, and total nitrogen regression residuals for the period 1978-82. Residuals of orthophosphate and nitrate quadratic models, based on antecedent 7-day rainfall at inflow gate S-11B, were the only two constituent-structure pairs that showed apparent significant (p < 0.05) increases in constituent concentrations. Elimination of regression models with distinct residual patterns and data outliers resulted in 17 statistically significant station-water quality combinations for trend analysis. No water quality trends were observed. The 1979 Memorandum of Agreement outlining the water quality monitoring program between the Everglades National Park and the U.S. Army Corps of Engineers stressed collection four times a year at three stations, and extensive coverage of water quality properties. Trend analysis and other rigorous statistical evaluation programs are better suited to data monitoring programs that include more frequent sampling and that are organized in a water quality data management system. Pronounced areal differences in water quality suggest that a water quality monitoring system for Shark River Slough in Everglades National Park should include collection locations near the source of inflow to Water Conservation Area 3A. (Author's abstract)

  11. Description and texts for the auxiliary programs for processing video information on the YeS computer. Part 3: Test program 2

    NASA Technical Reports Server (NTRS)

    Borisenko, V. I., G.g.; Stetsenko, Z. A.

    1980-01-01

    The functions are described, and the operating instructions, the block diagram, and the proposed versions for modifying the program are given, in order to obtain the statistical characteristics of multi-channel video information. The program implements certain man-machine methods for investigating video information. It permits representation of the material and its statistical characteristics in a form which is convenient for the user.

  12. ENRE 655 Class Project. Development of the Initial Main Parachute Failure Probability for the Constellation Program (CxP) Orion Crew Exploration Vehicle (CEV) Parachute Assembly System (CPAS)

    NASA Technical Reports Server (NTRS)

    Fuqua, Bryan C.

    2010-01-01

    Loss of Crew (LOC) and Loss of Mission (LOM) are two key requirements the Constellation Program (CxP) measures against. To date, one of the top risk drivers for both LOC and LOM has been Orion's Crew Exploration Vehicle (CEV) Parachute Assembly System (CPAS). Even though the Orion CPAS is one of the top risk drivers of CxP, it has been very difficult to obtain any relevant data to accurately quantify the risk. At first glance, it would seem that a parachute system would be very reliable given the track record of Apollo and Soyuz. Despite the success of those two programs, however, the amount of available data is too small to be statistically meaningful. Because CxP has LOC/LOM as key design requirements, it was necessary for Orion to generate a valid prior distribution before beginning the Risk Informed Design process. To do so, the Safety & Mission Assurance (S&MA) Space Shuttle & Exploration Analysis Section generated an initial failure probability for Orion to use in preparation for the Orion Systems Requirements Review (SRR).

  13. Statistical Trajectory Estimation Program (STEP) implementation for BLDT post flight trajectory simulation

    NASA Technical Reports Server (NTRS)

    Shields, W. E.

    1973-01-01

    Tests were conducted to provide flight conditions for qualifying the Viking Decelerator System in a simulated Mars environment. A balloon launched decelerator test (BLDT) vehicle which has an external shape similar to the actual Mars Viking Lander Capsule was used so that the decelerator would be deployed in the wake of a blunt body. An effort was made to simulate the BLDT vehicle flights from the time they were dropped from the balloon, through decelerator deployment, until stable decelerator conditions were reached. The procedure used to simulate these flights using the Statistical Trajectory Estimation Program (STEP) is discussed. Using primarily ground-based position radar and vehicle onboard rate gyro and accelerometer data, the STEP produces a minimum variance solution of the vehicle trajectory and calculates vehicle attitude histories. Using film from cameras in the vehicle along with a computer program, attitude histories for portions of the flight before and after decelerator deployment were calculated independent of the STEP simulation. With the assumption that the vehicle motions derived from camera data are accurate, a comparison reveals that STEP was able to simulate vehicle motions for all flights both before and after decelerator deployment.

  14. Optimization strategies based on sequential quadratic programming applied for a fermentation process for butanol production.

    PubMed

    Pinto Mariano, Adriano; Bastos Borba Costa, Caliane; de Franceschi de Angelis, Dejanira; Maugeri Filho, Francisco; Pires Atala, Daniel Ibraim; Wolf Maciel, Maria Regina; Maciel Filho, Rubens

    2009-11-01

    In this work, the mathematical optimization of a continuous flash fermentation process for the production of biobutanol was studied. The process consists of three interconnected units, as follows: fermentor, cell-retention system (tangential microfiltration), and vacuum flash vessel (responsible for the continuous recovery of butanol from the broth). The objective of the optimization was to maximize butanol productivity for a desired substrate conversion. Two strategies were compared for the optimization of the process. In one of them, the process was represented by a deterministic model with kinetic parameters determined experimentally and, in the other, by a statistical model obtained using the factorial design technique combined with simulation. For both strategies, the problem was written as a nonlinear programming problem and was solved with the sequential quadratic programming technique. The results showed that, although both strategies produced very similar solutions, the deterministic-model strategy suffered from lack of convergence and high computational time; the strategy based on the statistical model proved robust and fast, making it more suitable for the flash fermentation process and recommended for real-time applications coupling optimization and control.
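    A toy nonlinear program with the same structure (maximize productivity subject to a required conversion), solved with an SQP-type routine, is sketched below; the response-surface coefficients, bounds, and the SciPy SLSQP solver are assumptions for illustration, not the models of the paper:

        import numpy as np
        from scipy.optimize import minimize

        # Toy constrained optimization solved with SciPy's SLSQP (an SQP method):
        # maximize a productivity surrogate subject to a required conversion.
        # The quadratic coefficients are invented for the sketch.
        def productivity(x):
            d, s = x                                   # dilution rate (1/h), feed substrate (g/L)
            return 3.0 * d + 0.05 * s - 2.0 * d**2 - 0.0004 * s**2 + 0.01 * d * s

        def conversion(x):
            d, s = x
            return 0.98 - 0.6 * d + 0.001 * s          # fraction of substrate converted (toy model)

        target_conversion = 0.90
        res = minimize(lambda x: -productivity(x),     # SLSQP minimizes, so negate
                       x0=[0.3, 100.0],
                       method="SLSQP",
                       bounds=[(0.05, 0.6), (50.0, 200.0)],
                       constraints=[{"type": "ineq",
                                     "fun": lambda x: conversion(x) - target_conversion}])
        print(res.x, -res.fun, conversion(res.x))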

  15. Cost estimation and analysis using the Sherpa Automated Mine Cost Engineering System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stebbins, P.E.

    1993-09-01

    The Sherpa Automated Mine Cost Engineering System is a menu-driven software package designed to estimate capital and operating costs for proposed surface mining operations. The program is engineering based (as opposed to statistically based), meaning that all equipment, manpower, and supply requirements are determined from deposit geology, project design, and mine production information using standard engineering techniques. These requirements are used in conjunction with equipment, supply, and labor cost databases internal to the program to estimate all associated costs. Because virtually all on-site cost parameters are interrelated within the program, Sherpa provides an efficient means of examining the impact of changes in the equipment mix on total capital and operating costs. If any aspect of the operation is changed, Sherpa immediately adjusts all related aspects as necessary. For instance, if the user wishes to examine the cost ramifications of selecting larger trucks, the program not only considers truck purchase and operation costs, it also automatically and immediately adjusts excavator requirements, operator and mechanic needs, repair facility size, haul road construction and maintenance costs, and ancillary equipment specifications.

  16. Development and status of data quality assurance program at NASA Langley research center: Toward national standards

    NASA Technical Reports Server (NTRS)

    Hemsch, Michael J.

    1996-01-01

    As part of a continuing effort to re-engineer the wind tunnel testing process, a comprehensive data quality assurance program is being established at NASA Langley Research Center (LaRC). The ultimate goal of the program is routine provision of tunnel-to-tunnel reproducibility with total uncertainty levels acceptable for test and evaluation of civilian transports. The operational elements for reaching such levels of reproducibility are: (1) statistical control, which provides long-term measurement uncertainty predictability and a base for continuous improvement, (2) measurement uncertainty prediction, which provides test designs that can meet data quality expectations within the system's predictable variation, and (3) national standards, which provide a means for resolving tunnel-to-tunnel differences. The paper presents the LaRC design for the program and discusses the process of implementation.

  17. A better way to evaluate remote monitoring programs in chronic disease care: receiver operating characteristic analysis.

    PubMed

    Brown Connolly, Nancy E

    2014-12-01

    This foundational study applies receiver operating characteristic (ROC) analysis to evaluate the utility and predictive value of a disease management (DM) model that uses remote monitoring (RM) devices for chronic obstructive pulmonary disease (COPD). The literature identifies a need for a more rigorous method to validate and quantify evidence-based value for RM systems used to monitor persons with a chronic disease. ROC analysis is an engineering approach widely applied in medical testing, but one that has not been evaluated for its utility in RM. Classifiers (saturated peripheral oxygen [SPO2], blood pressure [BP], and pulse), optimum threshold, and predictive accuracy are evaluated based on patient outcomes. Parametric and nonparametric methods were used. Event-based patient outcomes included inpatient hospitalization, accident and emergency, and home health visits. Statistical analysis tools included Microsoft Excel® (Microsoft, Redmond, WA) and MedCalc® version 12 (MedCalc Software, Ostend, Belgium) to generate ROC curves and statistics. Persons with COPD were monitored for a minimum of 183 days, with at least one inpatient hospitalization within 12 months prior to monitoring. Retrospective, de-identified patient data from a United Kingdom National Health Service COPD program were used. Datasets included biometric readings, alerts, and resource utilization. SPO2 was identified as a predictive classifier, with an optimal average threshold setting of 85-86%. BP and pulse were failed classifiers, and areas of design were identified that may improve utility and predictive capacity. A cost avoidance methodology was developed. Results can be applied to health services planning decisions, and the methods can be applied to system design and evaluation based on patient outcomes. This study validated the use of ROC analysis in RM program evaluation.
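
    A minimal sketch of the core computation, assuming synthetic SpO2 readings and binary hospitalization outcomes in place of the study's de-identified COPD data, using scikit-learn to build the ROC curve and pick a threshold with Youden's J statistic:

      # Treat a biometric reading (simulated SpO2) as a classifier score for an
      # event-based outcome (hospitalization) and locate an optimal threshold.
      # Data are synthetic stand-ins, not the study's dataset.
      import numpy as np
      from sklearn.metrics import roc_curve, roc_auc_score

      rng = np.random.default_rng(0)
      spo2_event    = rng.normal(loc=84, scale=3, size=60)    # event cases: lower SpO2
      spo2_no_event = rng.normal(loc=92, scale=3, size=140)
      scores = np.concatenate([spo2_event, spo2_no_event])
      labels = np.concatenate([np.ones(60), np.zeros(140)])   # 1 = hospitalization

      # Lower SpO2 should predict the event, so use the negated reading as the score
      fpr, tpr, thresholds = roc_curve(labels, -scores)
      auc = roc_auc_score(labels, -scores)

      # Youden's J picks the threshold maximizing sensitivity + specificity - 1
      j = tpr - fpr
      best = np.argmax(j)
      print(f"AUC = {auc:.3f}, optimal SpO2 cutoff = {-thresholds[best]:.1f}%")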

  18. Image compression system and method having optimized quantization tables

    NASA Technical Reports Server (NTRS)

    Ratnakar, Viresh (Inventor); Livny, Miron (Inventor)

    1998-01-01

    A digital image compression preprocessor for use in a discrete cosine transform-based digital image compression device is provided. The preprocessor includes a gathering mechanism for determining discrete cosine transform statistics from input digital image data. A computing mechanism is operatively coupled to the gathering mechanism to calculate an image distortion array and a rate of image compression array based upon the discrete cosine transform statistics for each possible quantization value. A dynamic programming mechanism is operatively coupled to the computing mechanism to optimize the rate of image compression array against the image distortion array such that a rate-distortion-optimal quantization table is derived. In addition, a discrete cosine transform-based digital image compression device and a discrete cosine transform-based digital image compression and decompression system are provided. Also provided are methods for generating a rate-distortion-optimal quantization table, for using discrete cosine transform-based digital image compression, and for operating a discrete cosine transform-based digital image compression and decompression system.
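
    The rate-distortion idea can be illustrated with a much-simplified sketch: for each DCT coefficient position, choose the quantizer step that minimizes distortion plus a rate penalty estimated from coefficient statistics. This is a Lagrangian sweep over hypothetical data, not the patented dynamic-programming procedure:

      # Simplified illustration of rate-distortion-guided quantization table
      # selection: per DCT position, pick the step minimizing D + lambda * R,
      # with D and R estimated from (synthetic) coefficient statistics.
      import numpy as np

      rng = np.random.default_rng(1)
      # Hypothetical per-position DCT coefficient samples (64 positions, 1000 blocks)
      coeffs = rng.laplace(scale=np.linspace(20, 1, 64), size=(1000, 64))

      candidate_steps = np.arange(1, 64)
      lam = 5.0                      # Lagrange multiplier: trades rate for distortion
      qtable = np.zeros(64, dtype=int)

      for k in range(64):
          best_cost, best_q = np.inf, 1
          for q in candidate_steps:
              levels = np.round(coeffs[:, k] / q)
              distortion = np.mean((coeffs[:, k] - levels * q) ** 2)
              # Crude rate estimate: entropy of the quantized levels (bits/coefficient)
              _, counts = np.unique(levels, return_counts=True)
              p = counts / counts.sum()
              rate = -(p * np.log2(p)).sum()
              cost = distortion + lam * rate
              if cost < best_cost:
                  best_cost, best_q = cost, q
          qtable[k] = best_q

      print(qtable.reshape(8, 8))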

  19. Recognition of speaker-dependent continuous speech with KEAL

    NASA Astrophysics Data System (ADS)

    Mercier, G.; Bigorgne, D.; Miclet, L.; Le Guennec, L.; Querre, M.

    1989-04-01

    A description of the speaker-dependent continuous speech recognition system KEAL is given. An unknown utterance is recognized by means of the following procedures: acoustic analysis, phonetic segmentation and identification, and word and sentence analysis. The combination of feature-based, speaker-independent coarse phonetic segmentation with speaker-dependent statistical classification techniques is one of the main design features of the acoustic-phonetic decoder. The lexical access component is essentially based on a statistical dynamic programming technique which aims at matching a phonemic lexical entry, containing various phonological forms, against a phonetic lattice. Sentence recognition is achieved by use of a context-free grammar and a parsing algorithm derived from Earley's parser. A speaker adaptation module allows some of the system parameters to be adjusted by matching known utterances with their acoustical representation. The task to be performed, described by its vocabulary and its grammar, is given as a parameter of the system. Continuously spoken sentences extracted from a 'pseudo-Logo' language are analyzed and results are presented.
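
    A minimal sketch of the kind of dynamic-programming alignment used in lexical access, reduced to a string edit distance between a phonemic dictionary form and a decoded phone sequence (symbols and costs are invented; KEAL's statistical matcher operates on a full phonetic lattice):

      # Align a phonemic dictionary form against a decoded phone sequence with
      # substitution, insertion, and deletion costs. Costs and symbols are
      # illustrative only.
      def dp_match(lexical, decoded, sub_cost=1.0, ins_cost=0.7, del_cost=0.7):
          m, n = len(lexical), len(decoded)
          d = [[0.0] * (n + 1) for _ in range(m + 1)]
          for i in range(1, m + 1):
              d[i][0] = i * del_cost
          for j in range(1, n + 1):
              d[0][j] = j * ins_cost
          for i in range(1, m + 1):
              for j in range(1, n + 1):
                  match = 0.0 if lexical[i - 1] == decoded[j - 1] else sub_cost
                  d[i][j] = min(d[i - 1][j - 1] + match,   # substitution / match
                                d[i - 1][j] + del_cost,    # phone deleted by decoder
                                d[i][j - 1] + ins_cost)    # spurious phone inserted
          return d[m][n]

      # Score two candidate words against one decoded phone string
      decoded = ["t", "o", "r", "n"]
      print(dp_match(["t", "u", "r", "n"], decoded))   # 1.0 (one substitution)
      print(dp_match(["b", "a", "r", "n"], decoded))   # 2.0 (two substitutions)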

  20. Effectiveness of teaching International Caries Detection and Assessment System II and its e-learning program to freshman dental students on occlusal caries detection

    PubMed Central

    El-Damanhoury, Hatem M.; Fakhruddin, Kausar Sadia; Awad, Manal A.

    2014-01-01

    Objective: To assess the feasibility of teaching the International Caries Detection and Assessment System (ICDAS) II and its e-learning program as tools for occlusal caries detection to freshman dental students, in comparison to dental graduates with 2 years of experience. Materials and Methods: Eighty-four freshmen and 32 dental graduates examined the occlusal surfaces of molars/premolars (n = 72) after a lecture and a hands-on workshop. The same procedure was repeated 1 month after training with the ICDAS II e-learning program. Validation of ICDAS II codes was done histologically. Intra- and inter-examiner reproducibility of ICDAS II severity scores were assessed before and after e-learning using Fleiss's kappa. Results: The kappa values showed intra-examiner reproducibility ranging from 0.53 (ICDAS II code cut off ≥ 1) to 0.70 (ICDAS II code cut off ≥ 3) for undergraduates and from 0.69 (ICDAS II code cut off ≥ 1) to 0.95 (ICDAS II code cut off ≥ 3) for graduates. The inter-examiner reproducibility ranged from 0.64 (ICDAS II code cut off ≥ 1) to 0.89 (ICDAS II code cut off ≥ 3). No statistically significant difference was found between the two groups in intra-examiner agreement for assessing ICDAS II codes. A highly statistically significant difference (P ≤ 0.01) in correct identification of codes 1, 2, and 4 from before to after e-learning was observed in both groups. The bias indices for the undergraduate group were higher than those of the graduate group. Conclusions: Early exposure of students to ICDAS II is a valuable method of teaching caries detection, and its e-learning program significantly improves their caries diagnostic skills. PMID:25512730
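
    Fleiss's kappa, the agreement statistic reported here, can be computed from a subjects-by-raters rating matrix; the sketch below uses synthetic, dichotomized scores and statsmodels purely to illustrate the calculation (the dichotomization rule is a hypothetical choice, not the study's):

      # Fleiss's kappa from a teeth-by-examiners rating matrix. Each row is a
      # tooth, each column an examiner; values are dichotomized ICDAS II scores
      # (e.g., code >= 3 -> 1, else 0). All ratings below are synthetic.
      import numpy as np
      from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

      rng = np.random.default_rng(2)
      truth = rng.integers(0, 2, size=72)                 # 72 occlusal surfaces
      # Three examiners who mostly agree with the "truth" but flip ~15% of calls
      ratings = np.column_stack([
          np.where(rng.random(72) < 0.85, truth, 1 - truth) for _ in range(3)
      ])

      table, _ = aggregate_raters(ratings)                # subjects x categories counts
      print("Fleiss's kappa:", round(fleiss_kappa(table), 2))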

  1. Seven Q-Tracks monitors of laboratory quality drive general performance improvement: experience from the College of American Pathologists Q-Tracks program 1999-2011.

    PubMed

    Meier, Frederick A; Souers, Rhona J; Howanitz, Peter J; Tworek, Joseph A; Perrotta, Peter L; Nakhleh, Raouf E; Karcher, Donald S; Bashleben, Christine; Darcy, Teresa P; Schifman, Ron B; Jones, Bruce A

    2015-06-01

    Many production systems employ standardized statistical monitors that measure defect rates and cycle times as indices of performance quality. Clinical laboratory testing, a system that produces test results, is amenable to such monitoring. The objective was to demonstrate patterns in clinical laboratory testing defect rates and cycle time using 7 College of American Pathologists Q-Tracks program monitors. Subscribers measured monthly rates of outpatient order-entry errors, identification band defects, and specimen rejections; median troponin order-to-report cycle times and rates of STAT test receipt-to-report turnaround time outliers; and rates of critical values reporting event defects and corrected reports. From these submissions, Q-Tracks program staff produced quarterly and annual reports. These charted each subscriber's performance relative to other participating laboratories, as well as aggregate and subgroup performance over time, dividing participants into best performers, median performers, and performers with the most room to improve. Each monitor's pattern of change presents percentile distributions of subscribers' performance in relation to monitoring duration and the number of participating subscribers. Changes over time in defect frequencies and cycle duration quantify the effect of monitor participation on performance. All monitors showed significant decreases in defect rates as the 7 monitors ran variously for 6, 6, 7, 11, 12, 13, and 13 years. The most striking decreases occurred among performers who initially had the most room to improve and among subscribers who participated the longest. All 7 monitors registered significant improvement. Participation effects improved performance between 0.85% and 5.1% per quarter of participation. Using statistical quality measures, collecting data monthly, and receiving reports quarterly and yearly, subscribers to a comparative monitoring program documented significant decreases in defect rates and shortening of a cycle time over 6 to 13 years in all 7 ongoing clinical laboratory quality monitors.

  2. Clinical Track Program Expansion Increases Rotation Capacity for Experiential Program.

    PubMed

    Tofade, Toyin S; Brueckl, Mark; Ross, Patricia A

    2017-10-01

    Objective. To evaluate the rotation capacity at the University of Maryland School of Pharmacy and determine whether the implementation of clinical track programs across the state correlates with an increase in rotation capacity for the school. Methods. The following information was collected: number of preceptors over the years in the school's experiential learning program, number of clinical track programs from 2012 to 2015, rotation type, availability submissions per rotation type per year, and availability submissions per hospital participant in the clinical track program per year. The rotation capacity and rotation types for the 2012 through 2015 academic years were assessed and compared to determine whether the clinical track programs implemented had any impact. Results. There was no statistically significant difference in the frequency distribution of rotation types among all sites from the 2012 through 2015 academic years. However, there was a statistically significant difference in the total number/capacity of rotations from the 2012 to 2015 academic years. There were also statistically significant differences in rotation capacity at all but three sites. Conclusion. Adding clinical track programs can help increase the capacity of a school's clinical rotations.

  3. ROOT — A C++ framework for petabyte data storage, statistical analysis and visualization

    NASA Astrophysics Data System (ADS)

    Antcheva, I.; Ballintijn, M.; Bellenot, B.; Biskup, M.; Brun, R.; Buncic, N.; Canal, Ph.; Casadei, D.; Couet, O.; Fine, V.; Franco, L.; Ganis, G.; Gheata, A.; Maline, D. Gonzalez; Goto, M.; Iwaszkiewicz, J.; Kreshuk, A.; Segura, D. Marcos; Maunder, R.; Moneta, L.; Naumann, A.; Offermann, E.; Onuchin, V.; Panacek, S.; Rademakers, F.; Russo, P.; Tadel, M.

    2009-12-01

    ROOT is an object-oriented C++ framework conceived in the high-energy physics (HEP) community, designed for storing and analyzing petabytes of data in an efficient way. Any instance of a C++ class can be stored into a ROOT file in a machine-independent compressed binary format. In ROOT the TTree object container is optimized for statistical data analysis over very large data sets by using vertical data storage techniques. These containers can span a large number of files on local disks, the web, or a number of different shared file systems. In order to analyze these data, the user can choose from a wide set of mathematical and statistical functions, including linear algebra classes, numerical algorithms such as integration and minimization, and various methods for performing regression analysis (fitting). In particular, the RooFit package allows the user to perform complex data modeling and fitting, while the RooStats library provides abstractions and implementations for advanced statistical tools. Multivariate classification methods based on machine learning techniques are available via the TMVA package. A central piece in these analysis tools is the set of histogram classes, which provide binning of one- and multi-dimensional data. Results can be saved in high-quality graphical formats like Postscript and PDF or in bitmap formats like JPG or GIF. The result can also be stored into ROOT macros that allow a full recreation and rework of the graphics. Users typically create their analysis macros step by step, making use of the interactive C++ interpreter CINT, while running over small data samples. Once the development is finished, they can run these macros at full compiled speed over large data sets, using on-the-fly compilation, or by creating a stand-alone batch program. Finally, if processing farms are available, the user can reduce the execution time of intrinsically parallel tasks — e.g. data mining in HEP — by using PROOF, which will take care of optimally distributing the work over the available resources in a transparent way. Program summary. Program title: ROOT. Catalogue identifier: AEFA_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFA_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: LGPL. No. of lines in distributed program, including test data, etc.: 3 044 581. No. of bytes in distributed program, including test data, etc.: 36 325 133. Distribution format: tar.gz. Programming language: C++. Computer: Intel i386, Intel x86-64, Motorola PPC, Sun Sparc, HP PA-RISC. Operating system: GNU/Linux, Windows XP/Vista, Mac OS X, FreeBSD, OpenBSD, Solaris, HP-UX, AIX. Has the code been vectorized or parallelized?: Yes. RAM: >55 Mbytes. Classification: 4, 9, 11.9, 14. Nature of problem: storage, analysis, and visualization of scientific data. Solution method: object store, wide range of analysis algorithms, and visualization methods. Additional comments: for an up-to-date author list see http://root.cern.ch/drupal/content/root-development-team and http://root.cern.ch/drupal/content/former-root-developers. Running time: depends on the data size and complexity of the analysis algorithms. References: http://root.cern.ch.

  4. THERMINATOR: THERMal heavy-IoN generATOR

    NASA Astrophysics Data System (ADS)

    Kisiel, Adam; Tałuć, Tomasz; Broniowski, Wojciech; Florkowski, Wojciech

    2006-04-01

    THERMINATOR is a Monte Carlo event generator designed for studying particle production in relativistic heavy-ion collisions performed at such experimental facilities as the SPS, RHIC, or LHC. The program implements thermal models of particle production with single freeze-out. It performs the following tasks: (1) generation of stable particles and unstable resonances at the chosen freeze-out hypersurface with the local phase-space density of particles given by the statistical distribution factors, (2) subsequent space-time evolution and decays of hadronic resonances in cascades, (3) calculation of the transverse-momentum spectra and numerous other observables related to the space-time evolution. The geometry of the freeze-out hypersurface and the collective velocity of expansion may be chosen from two successful models, the Cracow single-freeze-out model and the Blast-Wave model. All particles from the Particle Data Tables are used. The code is written in the object-oriented C++ language and complies with the standards of the ROOT environment. Program summary. Program title: THERMINATOR. Catalogue identifier: ADXL_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXL_v1_0. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. RAM required to execute with typical data: 50 Mbytes. Number of processors used: 1. Computer(s) for which the program has been designed: PC, Pentium III, IV, or Athlon, 512 MB RAM; not hardware dependent (any computer with a C++ compiler and the ROOT environment [R. Brun, F. Rademakers, Nucl. Instrum. Methods A 389 (1997) 81, http://root.cern.ch]). Operating system(s) for which the program has been designed: Linux (Mandrake 9.0, Debian 3.0, SuSE 9.0, Red Hat FEDORA 3, etc.), Windows XP with Cygwin ver. 1.5.13-1 and gcc ver. 3.3.3 (cygwin special); not system dependent. External routines/libraries used: ROOT ver. 4.02.00. Programming language: C++. Size of the package: 324 KB directory, 40 KB compressed distribution archive, without the ROOT libraries (see http://root.cern.ch for details on the ROOT requirements). The output files created by the code need 1.1 GB for each 500 events. Distribution format: tar gzip file. Number of lines in distributed program, including test data, etc.: 6534. Number of bytes in distributed program, including test data, etc.: 41 828. Nature of the physical problem: statistical models have proved to be very useful in the description of soft physics in relativistic heavy-ion collisions [P. Braun-Munzinger, K. Redlich, J. Stachel, 2003, nucl-th/0304013].

  5. The New Drug Conditional Approval Process in China: Challenges and Opportunities.

    PubMed

    Yao, Xuefang; Ding, Jinxi; Liu, Yingfang; Li, Penghui

    2017-05-01

    Our aim was to characterize the newly established new drug conditional approval process in China and discuss the challenges and opportunities with respect to new drug research and development and registration. We examined the new approval program through literature review, law analysis, and data analysis. Data were derived from published materials, such as journal articles, government publications, press releases, and news articles, along with statistical data from the INSIGHT-China Pharma Databases, the China Food and Drug Administration website, the Center for Drug Evaluation website, the US Food and Drug Administration website, and search results published by Google. Currently, there is a large backlog of New Drug Applications in China, mainly because of the prolonged review time at the China Food and Drug Administration, resulting in a lag in drug approvals. In 2015, the Chinese government implemented the drug review and registration system reform and tackled this issue through various approaches, such as setting up a drug review fee system, adjusting the drug registration classification, and establishing innovative review pathways, including the conditional approval process. In Europe and the United States, programs comparable to the conditional approval program in China are well developed. The conditional approval program recently established in China is an expedited new drug approval process that is expected to affect new drug development at home and abroad and to profoundly influence public health and the pharmaceutical industry in China. Like any program in its initial stage, the conditional approval program faces several challenges, including setting up a robust system, formulating new drug clinical research requirements, and improving the regulatory agency's function in drug review and approval. The program is expected to evolve and improve as part of the government mandate of the drug registration system reform. Copyright © 2017 Elsevier HS Journals, Inc. All rights reserved.

  6. DoD Educational Intervention Programs for Scientists and Engineers.

    DTIC Science & Technology

    1995-10-01

    Only fragments of the abstract were recovered from the source. The recoverable citation is: Nabeel, ed., The Condition of Education: 1993. Washington, D.C.: U.S. Department of Education, National Center for Education Statistics (NCES 93-290). The recoverable program listings include a Navy Undergraduate Academic Program and a Navy Cooperative Education Program (COOP).

  7. Residents in difficulty: a mixed methods study on the prevalence, characteristics, and sociocultural challenges from the perspective of residency program directors.

    PubMed

    Christensen, Mette K; O'Neill, Lotte; Hansen, Dorthe H; Norberg, Karen; Mortensen, Lene S; Charles, Peder

    2016-02-22

    The majority of studies on the prevalence and characteristics of residents in difficulty have been conducted in English-speaking countries, and the existing literature may not reflect the prevalence and characteristics of residents in difficulty in other parts of the world, such as the Scandinavian countries, where healthcare systems are slightly different. The aim of this study was to examine the prevalence and characteristics of residents in difficulty in one of three postgraduate medical training regions in Denmark, and to produce both a quantifiable overview and an in-depth understanding of the topic. We performed a mixed methods study. All regional residency program directors (N = 157) were invited to participate in an e-survey about residents in difficulty. Survey data were combined with database data on demographic characteristics of the background population (N = 2399) of residents and analyzed statistically (chi-squared test (χ²) or Fisher's exact test). Secondly, we performed a qualitative interview study involving three focus group interviews with residency program directors. The analysis of the interview data employed qualitative content analysis. 73.2% of the residency program directors completed the e-survey and 22 participated in the focus group interviews. The prevalence of residents in difficulty was 6.8%. We found no statistically significant differences in the prevalence of residents in difficulty by gender or type of specialty. The results also showed two important themes related to the workplace culture of the resident in difficulty: 1) belated and inconsistent feedback on the resident's inadequate performance, and 2) the perceived, culturally rooted priority of efficient patient care over education in the workplace. These two themes were emphasized by the program directors as the primary underlying causes of the residents' difficulty. More work is needed in order to clarify the link between, on the one hand, observable markers of residents in difficulty and, on the other hand, immanent processes and logics of practice in a healthcare system. From our perspective, further sociological and pedagogical investigations of educational cultures across settings and specialties could inform our understanding of and knowledge about pitfalls in residents' and doctors' socialization into the healthcare system.

  8. Comparative Optical Measurements of Airspeed and Aerosols on a DC-8 Aircraft

    NASA Technical Reports Server (NTRS)

    Bogue, Rodney; McGann, Rick; Wagener, Thomas; Abbiss, John; Smart, Anthony

    1997-01-01

    NASA Dryden supported a cooperative flight test program on the NASA DC-8 aircraft in November 1993. This program evaluated optical airspeed and aerosol measurement techniques. Three brassboard optical systems were tested. Two were laser Doppler systems designed to measure free-stream-referenced airspeed. The third system was designed to characterize the natural aerosol statistics and airspeed. These systems relied on optical backscatter from natural aerosols for operation. The DC-8 aircraft carried instrumentation that provided real-time flight situation information and reference data on the aerosol environment. This test is believed to be the first to include multiple optical airspeed systems on the same carrier aircraft, so performance could be directly compared. During 23 hr of flight, a broad range of atmospheric conditions was encountered, including aerosol-rich layers, visible clouds, and unusually clean (aerosol-poor) regions. Substantial amounts of data were obtained. Important insights regarding the use of laser-based systems of this type in an aircraft environment were gained. This paper describes the sensors used and flight operations conducted to support the experiments. The paper also briefly describes the general results of the experiments.

  9. Relationship between Graduate Students' Statistics Self-Efficacy, Statistics Anxiety, Attitude toward Statistics, and Social Support

    ERIC Educational Resources Information Center

    Perepiczka, Michelle; Chandler, Nichelle; Becerra, Michael

    2011-01-01

    Statistics plays an integral role in graduate programs. However, numerous intra- and interpersonal factors may lead to successful completion of needed coursework in this area. The authors examined the extent of the relationship between self-efficacy to learn statistics and statistics anxiety, attitude towards statistics, and social support of 166…

  10. Care 3, Phase 1, volume 1

    NASA Technical Reports Server (NTRS)

    Stiffler, J. J.; Bryant, L. A.; Guccione, L.

    1979-01-01

    A computer program to aid in assessing the reliability of fault-tolerant avionics systems was developed. A simple mathematical expression was used to evaluate the reliability of any redundant configuration over any interval during which the failure rates and coverage parameters remained unaffected by configuration changes. Provision was made for convolving such expressions in order to evaluate the reliability of a dual mode system. A coverage model was also developed to determine the various relevant coverage coefficients as a function of the available hardware and software fault detector characteristics and of the subsequent isolation and recovery delay statistics.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nielsen, Erik; Blume-Kohout, Robin; Rudinger, Kenneth

    PyGSTi is an implementation of Gate Set Tomography in the python programming language. Gate Set Tomography (GST) is a theory and protocol for simultaneously estimating the state preparation, gate operations, and measurement effects of a physical system of one or many quantum bits (qubits). These estimates are based entirely on the statistics of experimental measurements, and their interpretation and analysis can provide a detailed understanding of the types of errors/imperfections in the physical system. In this way, GST provides not only a means of certifying the "goodness" of qubits but also a means of debugging (i.e. improving) them.

  12. Statistical Reference Datasets

    National Institute of Standards and Technology Data Gateway

    Statistical Reference Datasets (Web, free access)   The Statistical Reference Datasets is also supported by the Standard Reference Data Program. The purpose of this project is to improve the accuracy of statistical software by providing reference datasets with certified computational results that enable the objective evaluation of statistical software.

  13. MEASURE: An integrated data-analysis and model identification facility

    NASA Technical Reports Server (NTRS)

    Singh, Jaidip; Iyer, Ravi K.

    1990-01-01

    The first phase of the development of MEASURE, an integrated data-analysis and model identification facility, is described. The facility takes system activity data as input and produces as output representative behavioral models of the system in near real time. In addition, a wide range of statistical characteristics of the measured system is available. Use of the system is illustrated with data collected via software instrumentation of a network of SUN workstations at the University of Illinois. Initially, statistical clustering is used to identify high-density regions of resource usage in a given environment. The identified regions form the states for building a state-transition model to evaluate system and program performance in real time. The model is then solved to obtain useful parameters such as the response-time distribution and the mean waiting time in each state. A graphical interface that displays the identified models and their characteristics (with real-time updates) was also developed. The results provide an understanding of resource usage in the system under various workload conditions. This work is targeted for a testbed of UNIX workstations, with the initial phase ported to SUN workstations on the NASA Ames Research Center Advanced Automation Testbed.
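
    A small sketch in the spirit of this approach, assuming synthetic resource-usage samples: cluster the samples into states with k-means, then estimate an empirical state-transition matrix from the time-ordered labels.

      # Cluster resource-usage samples into "states" and build a first-order
      # state-transition model from the label sequence. Workload data are synthetic.
      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(3)
      # Synthetic time series of (cpu_fraction, disk_ops_per_s) samples:
      # alternating idle-ish and busy segments
      segments = [rng.normal([0.1, 20], [0.05, 5], size=(50, 2)) if i % 2 == 0
                  else rng.normal([0.7, 200], [0.1, 30], size=(50, 2))
                  for i in range(8)]
      usage = np.vstack(segments)

      k = 3
      states = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(usage)

      # Empirical transition matrix P[i, j] = Pr(next state = j | current state = i)
      P = np.zeros((k, k))
      for a, b in zip(states[:-1], states[1:]):
          P[a, b] += 1
      P = P / np.maximum(P.sum(axis=1, keepdims=True), 1)
      print(np.round(P, 2))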

  14. A Statistical Analysis of the Effect of the Navy’s Tuition Assistance Program: Do Distance Learning Classes Make a Difference?

    DTIC Science & Technology

    2010-03-01

    Master's thesis by Jeremy P. McLaughlin, March 2010. The abstract fragment recoverable from the report documentation page states that the thesis analyzes the impact of participation in the Navy's Tuition Assistance (TA) program on the retention of first-term Navy...

  15. Statistical Policy Working Paper 24. Electronic Dissemination of Statistical Data

    DOT National Transportation Integrated Search

    1995-11-01

    The report, Statistical Policy Working Paper 24, Electronic Dissemination of Statistical Data, includes several topics, such as Options and Best Uses for Different Media Operation of Electronic Dissemination Service, Customer Service Programs, Cost a...

  16. ALCF Data Science Program: Productive Data-centric Supercomputing

    NASA Astrophysics Data System (ADS)

    Romero, Nichols; Vishwanath, Venkatram

    The ALCF Data Science Program (ADSP) is targeted at big data science problems that require leadership computing resources. The goal of the program is to explore and improve a variety of computational methods that will enable data-driven discoveries across all scientific disciplines. The projects will focus on data science techniques covering a wide area of discovery, including but not limited to uncertainty quantification, statistics, machine learning, deep learning, databases, pattern recognition, image processing, graph analytics, data mining, real-time data analysis, and complex and interactive workflows. Project teams will be among the first to access Theta, ALCF's forthcoming 8.5 petaflops Intel/Cray system. The program will transition to the 200 petaflop/s Aurora supercomputing system when it becomes available. In 2016, four projects were selected to kick off the ADSP. The selected projects span experimental and computational sciences and range from modeling the brain to discovering new materials for solar-powered windows to simulating collision events at the Large Hadron Collider (LHC). The program will have a regular call for proposals, with the next call expected in Spring 2017 (http://www.alcf.anl.gov/alcf-data-science-program). This research used resources of the ALCF, which is a DOE Office of Science User Facility supported under Contract DE-AC02-06CH11357.

  17. Analysis of differences in exercise recognition by constraints on physical activity of hospitalized cancer patients based on their medical history.

    PubMed

    Choi, Mi-Ri; Jeon, Sang-Wan; Yi, Eun-Surk

    2018-04-01

    The purpose of this study is to analyze differences among hospitalized cancer patients in their perception of exercise and physical activity constraints based on their medical history. The study used a questionnaire survey as the measurement tool for 194 cancer patients (male or female, aged 20 or older) living in the Seoul metropolitan area (Seoul, Gyeonggi, Incheon). The collected data were analyzed using frequency analysis, exploratory factor analysis, reliability analysis, the t-test, and one-way analysis of variance with the statistical program SPSS 18.0. The following results were obtained. First, there was no statistically significant difference between cancer stage and exercise recognition/physical activity constraint. Second, there was a significant difference between cancer stage and sociocultural constraint/facility constraint/program constraint. Third, there was a significant difference between cancer operation history and physical/socio-cultural/facility/program constraint. Fourth, there was a significant difference between cancer operation history and negative perception/facility/program constraint. Fifth, there was a significant difference between ancillary cancer treatment method and negative perception/facility/program constraint. Sixth, there was a significant difference between hospitalization period and positive perception/negative perception/physical constraint/cognitive constraint. In conclusion, this study will provide information necessary to create a patient-centered healthcare service system by analyzing the exercise recognition of hospitalized cancer patients based on their medical history and by investigating the constraint factors that prevent patients from actually making efforts to exercise.

  18. Automated segmentation of foveal avascular zone in fundus fluorescein angiography.

    PubMed

    Zheng, Yalin; Gandhi, Jagdeep Singh; Stangos, Alexandros N; Campa, Claudio; Broadbent, Deborah M; Harding, Simon P

    2010-07-01

    PURPOSE. To describe and evaluate the performance of a computerized automated segmentation technique for use in quantification of the foveal avascular zone (FAZ). METHODS. A computerized technique for automated segmentation of the FAZ using images from fundus fluorescein angiography (FFA) was applied to 26 transit-phase images obtained from patients with various grades of diabetic retinopathy. The area containing the FAZ was first extracted from the original image and smoothed by a Gaussian kernel (sigma = 1.5). An initializing contour was manually placed inside the FAZ of the smoothed image and iteratively moved by the segmentation program toward the FAZ boundary. Five tests with different initializing curves were run on each of the 26 images to assess reproducibility. The accuracy of the program was also validated by comparing results obtained by the program with the FAZ boundaries manually delineated by medical retina specialists. Interobserver performance was then evaluated by comparing delineations from two of the experts. RESULTS. One-way analysis of variance indicated that the disparities between different tests were not statistically significant, signifying excellent reproducibility for the computer program. There was a statistically significant linear correlation between the results obtained by automation and the manual delineations by experts. CONCLUSIONS. This automated segmentation program can produce highly reproducible results that are comparable to those made by clinical experts. It has the potential to assist in the detection and management of foveal ischemia and to be integrated into automated grading systems.
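
    A minimal sketch of such a pipeline with scikit-image, assuming a synthetic image in place of the FFA frame and illustrative parameters: Gaussian smoothing (sigma = 1.5), a small initial contour seeded inside the dark zone, and a geodesic active contour that grows outward to the boundary. This is not the paper's implementation, only the same general idea.

      # Smooth, seed inside a dark avascular-zone-like region, and let a geodesic
      # active contour expand until it locks onto the zone boundary.
      import numpy as np
      from skimage.filters import gaussian
      from skimage.segmentation import (inverse_gaussian_gradient,
                                        morphological_geodesic_active_contour)

      # Synthetic "angiogram": bright perfused background, dark circular zone
      yy, xx = np.mgrid[0:200, 0:200]
      image = 0.8 * np.ones((200, 200))
      image[(yy - 100) ** 2 + (xx - 100) ** 2 < 35 ** 2] = 0.2
      image += np.random.default_rng(4).normal(0.0, 0.02, image.shape)

      smoothed = gaussian(image, sigma=1.5)
      edge_map = inverse_gaussian_gradient(smoothed)      # low values at strong edges

      # Initial level set: a small disk well inside the dark zone
      init = ((yy - 100) ** 2 + (xx - 100) ** 2 < 10 ** 2).astype(np.int8)

      segmentation = morphological_geodesic_active_contour(
          edge_map, 200, init_level_set=init, smoothing=2, balloon=1, threshold=0.8)
      print("segmented area (pixels):", int(segmentation.sum()))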

  19. Federal Communications Commission (FCC) Transponder Loading Data Conversion Software. User's guide and software maintenance manual, version 1.2

    NASA Technical Reports Server (NTRS)

    Mallasch, Paul G.

    1993-01-01

    This volume contains the complete software system documentation for the Federal Communications Commission (FCC) Transponder Loading Data Conversion Software (FIX-FCC). This software was written to facilitate the formatting and conversion of FCC Transponder Occupancy (Loading) Data before it is loaded into the NASA Geosynchronous Satellite Orbital Statistics Database System (GSOSTATS). The information that the FCC supplies to NASA is in report form and must be converted into a form readable by the database management software used in the GSOSTATS application. Both the User's Guide and the Software Maintenance Manual are contained in this document. This volume of documentation passed an independent quality assurance review and certification by the Product Assurance and Security Office of the Planning Research Corporation (PRC). The manuals were reviewed for format, content, and readability. The Software Management and Assurance Program (SMAP) life cycle and documentation standards were used in the development of this document and, accordingly, in the review. Refer to the System/Software Test/Product Assurance Report for the Geosynchronous Satellite Orbital Statistics Database System (GSOSTATS) for additional information.

  20. Energy Efficiency Optimization in Relay-Assisted MIMO Systems With Perfect and Statistical CSI

    NASA Astrophysics Data System (ADS)

    Zappone, Alessio; Cao, Pan; Jorswieck, Eduard A.

    2014-01-01

    A framework for energy-efficient resource allocation in a single-user, amplify-and-forward relay-assisted MIMO system is devised in this paper. Previous results in this area have focused on rate maximization or sum power minimization problems, whereas fewer results are available when bits/Joule energy efficiency (EE) optimization is the goal. The performance metric to optimize is the ratio between the system's achievable rate and the total consumed power. The optimization is carried out with respect to the source and relay precoding matrices, subject to QoS and power constraints. Such a challenging non-convex problem is tackled by means of fractional programming and alternating maximization algorithms, for various CSI assumptions at the source and relay. In particular, the scenarios of perfect CSI and of statistical CSI for either the source-relay or the relay-destination channel are addressed. Moreover, sufficient conditions for beamforming optimality are derived, which is useful in simplifying the system design. Numerical results are provided to corroborate the validity of the theoretical findings.
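
    The fractional-programming step can be illustrated with Dinkelbach's algorithm on a scalar toy problem, assuming a single-link rate and power model that stands in for the MIMO relay setting (where the inner problem would be solved over precoding matrices instead of a scalar power):

      # Dinkelbach-style fractional programming: maximize EE(p) = rate(p) / power(p)
      # over transmit power p, by repeatedly solving max rate(p) - lam * power(p).
      import numpy as np
      from scipy.optimize import minimize_scalar

      g, noise, p_circuit, p_max = 4.0, 1.0, 0.5, 10.0
      rate = lambda p: np.log2(1.0 + g * p / noise)       # bits/s/Hz
      power = lambda p: p + p_circuit                     # total consumed power

      lam = 0.0                                           # current EE guess
      for _ in range(30):
          res = minimize_scalar(lambda p: -(rate(p) - lam * power(p)),
                                bounds=(0.0, p_max), method="bounded")
          p_opt = res.x
          f = rate(p_opt) - lam * power(p_opt)
          lam = rate(p_opt) / power(p_opt)                # Dinkelbach update
          if abs(f) < 1e-9:                               # converged to optimal EE
              break

      print(f"optimal power = {p_opt:.3f}, energy efficiency = {lam:.3f}")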

  1. Probabilistic structural analysis methods for improving Space Shuttle engine reliability

    NASA Technical Reports Server (NTRS)

    Boyce, L.

    1989-01-01

    Probabilistic structural analysis methods are particularly useful in the design and analysis of critical structural components and systems that operate in very severe and uncertain environments. These methods have recently found application in space propulsion systems to improve the structural reliability of Space Shuttle Main Engine (SSME) components. A computer program, NESSUS, based on a deterministic finite-element program and a method of probabilistic analysis (fast probability integration), provides probabilistic structural analysis for selected SSME components. While computationally efficient, it considers both correlated and nonnormal random variables as well as an implicit functional relationship between independent and dependent variables. The program is used to determine the response of a nickel-based superalloy SSME turbopump blade. Results include blade tip displacement statistics due to the variability in blade thickness, modulus of elasticity, Poisson's ratio, or density. Modulus of elasticity contributed significantly to blade tip variability, while Poisson's ratio did not. Thus, a rational method for choosing the parameters to be modeled as random is provided.

  2. Effectiveness of a program using a vehicle tracking system, incentives, and disincentives to reduce the speeding behavior of drivers with ADHD.

    PubMed

    Markham, Paula T; Porter, Bryan E; Ball, J D

    2013-04-01

    In this article, the authors investigated the effectiveness of a behavior modification program using global positioning system (GPS) vehicle tracking devices with contingency incentives and disincentives to reduce the speeding behavior of drivers with ADHD. Using an AB multiple-baseline design, six participants drove a 5-mile stable driving route weekly while GPS devices recorded speeds. The dependent variable was the percentage of feet speeding. Following an initial baseline period, five participants received treatment; one participant remained at baseline. Visual inspection of individual participant graphs, reductions in mean percentages of speeding from baseline to treatment across participants (M = 82%), C-statistic analyses, and visual graphs with the applied binomial formula supported a treatment effect. The between-participant analysis using the Rn Test of Ranks was significant, Rn = 6, p < .01, and complemented a clean multiple-baseline result. Results indicated that this treatment program was effective in reducing speeding by drivers with ADHD and warrants replication.

  3. Application of a computerized vibroacoustic data bank for random vibration criteria development

    NASA Technical Reports Server (NTRS)

    Ferebee, R. C.

    1982-01-01

    A computerized data bank system was developed for using large amounts of vibration and acoustic data to formulate component random vibration design and test criteria. This system consists of a computer, graphics tablets, and a dry silver hard copier, all of which are desktop hardware and occupy minimal space. Currently, the data bank contains data from the Saturn 5 and Titan 3 flight and static test programs. The vibration and acoustic data are stored in the form of power spectral density and one-third-octave-band plots over the frequency range from 20 to 2000 Hz. The data were stored by digitizing each spectral plot by tracing with the graphics tablet. The digitized data were statistically analyzed, and the resulting 97.5 percent confidence levels were stored on tape along with the appropriate structural parameters. Standard extrapolation procedures were programmed for prediction of component random vibration test criteria for new launch vehicle and payload configurations. A user's manual is included to guide potential users through the programs.

  4. The influence of the low-frequency magnetic fields of different parameters on the secretion of cortisol in men.

    PubMed

    Woldańska-Okońska, Marta; Czernicki, Jan; Karasek, Michał

    2013-03-01

    The aim of this paper is to test the influence of long-term application of low-frequency magnetic fields in magnetotherapy and magnetostimulation on cortisol secretion in men. Patients were divided into three groups: 16 men underwent magnetotherapy and 20 men (divided into two groups) underwent magnetostimulation. Magnetotherapy (2 mT induction, 40 Hz, bipolar square wave) was applied for 20 min to the lumbar area. Magnetostimulation (Viofor Jaroszyk, Paluszak, Sieroń (JPS) system, M2P2 program) was applied to 10 patients for 12 min each day. The third group (10 patients) underwent magnetostimulation (Viofor JPS system, M3P3) for 12 min each day using a different machine. All groups had 15 rounds of applications at approximately 10:00 a.m., with intermissions on the weekends. Blood serum was taken four times in a 24-hour period, before applications, the day after applications, and a month later. A chemiluminescence micromethod was used to indicate hormone concentrations. Data were statistically analyzed with the analysis of variance (ANOVA) method. Statistically significant changes in the circadian cortisol profile at 4:00 p.m., before versus after application, were observed as a decrease in concentration during magnetotherapy. In magnetostimulation with the M2P2 program, a significant increase in cortisol concentration was observed in the circadian profile at 12:00 p.m. one month after the last application. After magnetostimulation with the M3P3 program, a significant increase in concentration at 6:00 a.m. and a decrease in concentration at 12:00 p.m. were observed one month later. A statistically significant difference was demonstrated in the participants after the application of magnetotherapy and magnetostimulation with the M3P3 program compared to the men submitted to magnetostimulation with the M2P2 program, at 4:00 p.m. after 15 applications. Biological hysteresis one month after magnetostimulation suggests a long-term influence on the hypothalamo-hypophysial axis. The circadian curves of cortisol secretion a day after magnetotherapy and magnetostimulation with the M3P3 program differ by nearly 100% from those after magnetostimulation with the M2P2 program, which shows that these modalities exert varied influences on cortisol secretion in men. None of the changes in hormone concentration exceeded the physiological standards of cortisol secretion, which suggests a regulating influence of magnetic fields on cortisol concentration rather than a strong stressogenic impact of magnetostimulation.

  5. Navigation analysis for Viking 1979, option B

    NASA Technical Reports Server (NTRS)

    Mitchell, P. H.

    1971-01-01

    A parametric study performed for 48 trans-Mars reference missions in support of the Viking program is reported. The launch dates cover several months in the year 1979, and each launch date has multiple arrival dates in 1980. A plot of launch versus arrival dates with case numbers designated for reference purposes is included. The analysis consists of the computation of statistical covariance matrices based on certain assumptions about the ground-based tracking systems. The error model statistics are listed in tables. Tracking systems were assumed at three sites: Goldstone, California; Canberra, Australia; and Madrid, Spain. The tracking data consisted of range and Doppler measurements taken during the tracking intervals starting at E-30(d) and ending at E-10(d) for the control data and ending at E-18(h) for the knowledge data. The control and knowledge covariance matrices were delivered to the Planetary Mission Analysis Branch for inputs into a delta V dispersion analysis.

  6. [Health for All-Italia: an indicator system on health].

    PubMed

    Burgio, Alessandra; Crialesi, Roberta; Loghi, Marzia

    2003-01-01

    The Health for All - Italia information system collects health data from several sources. It is intended to be a cornerstone for building an overview of health in Italy. Health is analyzed at different levels, ranging from health services and health needs to lifestyles and the demographic, social, economic, and environmental context. The software associated with the database allows users to display statistical data in graphs and tables and to carry out simple statistical analyses. It is therefore possible to view the indicators' time series, make simple projections, and compare the various indicators over the years for each territorial unit. This is possible by means of tables, graphs (histograms, line graphs, frequencies, linear regression with calculation of correlation coefficients, etc.), and maps. These charts can be exported to other programs (i.e., Word, Excel, PowerPoint), or they can be printed directly in color or black and white.

  7. Simulation of interference between Earth stations and Earth-orbiting satellites

    NASA Technical Reports Server (NTRS)

    Bishop, D. F.

    1994-01-01

    It is often desirable to determine the potential for radio frequency interference between earth stations and orbiting spacecraft. This information can be used to select frequencies for radio systems so as to avoid interference, or to determine whether coordination between radio systems is necessary. A model is developed that determines the statistics of interference between earth stations and spacecraft in elliptical orbits. The model uses orbital dynamics, detailed antenna patterns, and spectral characteristics to obtain accurate levels of interference at the victim receiver. The model is programmed into a computer simulation to obtain long-term statistics of interference. Two specific examples are shown to demonstrate the model. The first example is a simulation of interference from a fixed-satellite earth station to an orbiting scatterometer receiver. The second example is a simulation of interference from earth-exploration satellites to a deep-space earth station.

  8. Bangladesh.

    PubMed

    Ahmed, K S

    1979-01-01

    In Bangladesh the Population Control and Family Planning Division of the Ministry of Health and Population Control has decided to delegate increased financial and administrative powers to the officers of the family planning program at the district level and below. Currently, about 20,000 family planning workers and officials are at work in rural areas. The government believes that the success of the entire family planning program depends on the performance of workers in rural areas, because that is where about 90% of the population lives. Awareness of the need to improve statistical data in Bangladesh has been increasing, particularly in regard to the development of rural areas. An accurate statistical profile of rural Bangladesh is crucial to the formation, implementation and evaluation of rural development programs. A Seminar on Statistics for Rural Development will be held from June 18-20, 1980. The primary objectives of the Seminar are to make an exhaustive analysis of the current availability of statistics required for rural development programs and to consider methodological and operational improvements toward building up an adequate data base.

  9. Data Mining and Complex Problems: Case Study in Composite Materials

    NASA Technical Reports Server (NTRS)

    Rabelo, Luis; Marin, Mario

    2009-01-01

    Data mining is defined as the discovery of useful, possibly unexpected, patterns and relationships in data using statistical and non-statistical techniques in order to develop schemes for decision and policy making. Data mining can be used to discover the sources and causes of problems in complex systems. In addition, data mining can support simulation strategies by finding the different constants and parameters to be used in the development of simulation models. This paper introduces a framework for data mining and its application to complex problems. To further explain some of the concepts outlined in this paper, the potential application to the NASA Shuttle Reinforced Carbon-Carbon structures and genetic programming is used as an illustration.

  10. Statistics for demodulation RFI in inverting operational amplifier circuits

    NASA Astrophysics Data System (ADS)

    Sutu, Y.-H.; Whalen, J. J.

    An investigation was conducted to determine statistical variations in RFI demodulation responses in operational amplifier (op amp) circuits. Attention is given to the experimental procedures employed, a three-stage op amp LED experiment, NCAP (Nonlinear Circuit Analysis Program) simulations of demodulation RFI in 741 op amps, and a comparison of RFI in four op amp types. Three major recommendations for future investigations are presented on the basis of the obtained results. One concerns additional measurements of demodulation RFI in inverting amplifiers; another suggests the use of an automatic measurement system. It is also proposed to conduct additional NCAP simulations in which parasitic effects are accounted for more thoroughly.

  11. Sampling Errors in Monthly Rainfall Totals for TRMM and SSM/I, Based on Statistics of Retrieved Rain Rates and Simple Models

    NASA Technical Reports Server (NTRS)

    Bell, Thomas L.; Kundu, Prasun K.; Einaudi, Franco (Technical Monitor)

    2000-01-01

    Estimates from TRMM satellite data of monthly total rainfall over an area are subject to substantial sampling errors due to the limited number of visits to the area by the satellite during the month. Quantitative comparisons of TRMM averages with data collected by other satellites and by ground-based systems require some estimate of the size of this sampling error. A method of estimating this sampling error based on the actual statistics of the TRMM observations and on some modeling work has been developed. "Sampling error" in TRMM monthly averages is defined here relative to the monthly total that a hypothetical satellite permanently stationed above the area would have reported. "Sampling error" therefore includes contributions from the random and systematic errors introduced by the satellite remote sensing system. As part of our long-term goal of providing error estimates for each grid point accessible to the TRMM instruments, sampling error estimates for TRMM based on rain retrievals from TRMM Microwave Imager (TMI) data are compared for different times of the year and different oceanic areas (to minimize changes in the statistics due to algorithmic differences over land and ocean). Changes in sampling error estimates due to changes in rain statistics arising 1) from evolution of the official algorithms used to process the data and 2) from differences relative to other remote sensing systems, such as the Defense Meteorological Satellite Program (DMSP) Special Sensor Microwave/Imager (SSM/I), are analyzed.

  12. Nonlinear Curve-Fitting Program

    NASA Technical Reports Server (NTRS)

    Everhart, Joel L.; Badavi, Forooz F.

    1989-01-01

    A nonlinear optimization algorithm helps in finding the best-fit curve. The Nonlinear Curve Fitting Program, NLINEAR, is an interactive curve-fitting routine based on a quadratic expansion of the χ² statistic. It utilizes a nonlinear optimization algorithm to calculate the best statistically weighted values of the parameters of the fitting function such that χ² is minimized. It provides the user with such statistical information as goodness of fit and estimated values of the parameters producing the highest degree of correlation between the experimental data and the mathematical model. Written in FORTRAN 77.
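
    NLINEAR itself is FORTRAN 77; a comparable weighted nonlinear fit, minimizing χ² and reporting goodness of fit, can be sketched in Python with SciPy (the exponential model and synthetic data are illustrative only, not part of the original program):

      # Weighted nonlinear least squares: minimize chi-square for a user-supplied
      # model and report parameter uncertainties and goodness of fit.
      import numpy as np
      from scipy.optimize import curve_fit
      from scipy.stats import chi2

      def model(x, a, b, c):
          return a * np.exp(-b * x) + c

      rng = np.random.default_rng(5)
      x = np.linspace(0, 4, 40)
      sigma = 0.05 * np.ones_like(x)                       # measurement uncertainties
      y = model(x, 2.5, 1.3, 0.5) + rng.normal(0, sigma)

      # absolute_sigma=True makes the fit a true chi-square minimization
      popt, pcov = curve_fit(model, x, y, p0=(1, 1, 0), sigma=sigma, absolute_sigma=True)

      resid = (y - model(x, *popt)) / sigma
      chisq = np.sum(resid**2)
      dof = len(x) - len(popt)
      print("parameters:", popt)
      print("parameter std errors:", np.sqrt(np.diag(pcov)))
      print(f"chi-square/dof = {chisq:.1f}/{dof}, p = {chi2.sf(chisq, dof):.2f}")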

  13. Gender-responsive programs in U.S. prisons: implications for change.

    PubMed

    White, Gale D

    2012-01-01

    This research examines the need for programs that focus on mental health issues, parenting issues, and other unique needs of female offenders incarcerated throughout the United States. The Bureau of Justice Statistics showed that 84% of female offenders were living with their children prior to their arrest. This constitutes a crisis in our society today, which is manifest in overcrowded state and federal prisons and in increased caseloads for the Department of Children and Family Services, the foster care system, and the families of offenders. The goal of this research is to determine what types of gender-responsive programs are effective in reducing recidivism. The method used was qualitative data analysis, comparing which programs are offered either within the prison or as postrelease reentry programs. A survey was used, and interview data were analyzed by identifying and comparing common themes and patterns. The findings reveal that the most effective gender-responsive programs are those that incorporate substance abuse treatment, education and job preparedness, parenting programs where contact with children is allowed and/or encouraged, and family reunification programs.

  14. 2017 Annual Disability Statistics Compendium

    ERIC Educational Resources Information Center

    Lauer, E. A.; Houtenville, A. J.

    2018-01-01

    The "Annual Disability Statistics Compendium" and its complement, the "Annual Disability Statistics Supplement," are publications of statistics about people with disabilities and about the government programs which serve them. The "Compendium" and "Supplement" are designed to serve as a summary of government…

  15. Reduction of lithologic-log data to numbers for use in the digital computer

    USGS Publications Warehouse

    Morgan, C.O.; McNellis, J.M.

    1971-01-01

    The development of a standardized system for conveniently coding lithologic-log data for use in the digital computer has long been needed. The technique suggested involves a reduction of the original written alphanumeric log to a numeric log by use of computer programs. This numeric log can then be retrieved as a written log, interrogated for pertinent information, or analyzed statistically. ?? 1971 Plenum Publishing Corporation.
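
    A toy illustration of the general idea, with invented keyword-to-code assignments rather than the scheme described in the report: reduce written lithologic descriptions to numeric codes that can then be retrieved, interrogated, or analyzed statistically.

      # Reduce written lithologic-log entries to numeric codes. The code values
      # and keyword scheme are invented for the example; they are not the USGS
      # system described in the report.
      import re

      ROCK_CODES = {"shale": 10, "sandstone": 20, "limestone": 30,
                    "dolomite": 40, "clay": 50, "gravel": 60}
      MODIFIER_CODES = {"sandy": 2, "silty": 3, "calcareous": 4}

      def encode_interval(depth_top, depth_base, description):
          words = re.findall(r"[a-z]+", description.lower())
          rock = next((ROCK_CODES[w] for w in words if w in ROCK_CODES), 0)
          modifier = next((MODIFIER_CODES[w] for w in words if w in MODIFIER_CODES), 0)
          return (depth_top, depth_base, rock + modifier)   # e.g. 23 = silty sandstone

      log = [(0, 15, "Clay, sandy, brown"),
             (15, 42, "Sandstone, silty, fine-grained"),
             (42, 90, "Limestone, gray, hard")]
      print([encode_interval(*entry) for entry in log])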

  16. Public School Library Media Centers in South Carolina: A Survey of Service Levels Offered. Conducted during School Year 1988/1989.

    ERIC Educational Resources Information Center

    Townsend, Catherine M.

    A state-wide survey was undertaken in 1988-1989 to determine the status of the library media programs in South Carolina's public schools. The first of two phases of the study involved the compilation of statistical data reported to the State Department of Education by building level administrators on the Basic Educational Data System (BEDS) for…

  17. How to Create Automatically Graded Spreadsheets for Statistics Courses

    ERIC Educational Resources Information Center

    LoSchiavo, Frank M.

    2016-01-01

    Instructors often use spreadsheet software (e.g., Microsoft Excel) in their statistics courses so that students can gain experience conducting computerized analyses. Unfortunately, students tend to make several predictable errors when programming spreadsheets. Without immediate feedback, programming errors are likely to go undetected, and as a…

  18. 78 FR 73927 - Single Family Housing Guaranteed Loan Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-09

    .... The term ``MSA (Metropolitan Statistical Area)'' was added as it is a term the Office of Management... statistics. The term ``new dwelling'' was amended to achieve consistency with other Agency program... Government. One respondent recommended establishing a delinquency goal to improve and monitor a lender's...

  19. Performance Data Gathering and Representation from Fixed-Size Statistical Data

    NASA Technical Reports Server (NTRS)

    Yan, Jerry C.; Jin, Haoqiang H.; Schmidt, Melisa A.; Kutler, Paul (Technical Monitor)

    1997-01-01

    The two commonly used performance data types in the supercomputing community, statistics and event traces, are discussed and compared. Statistical data are much more compact but lack the probative power event traces offer. Event traces, on the other hand, are unbounded and can easily fill up the entire file system during program execution. In this paper, we propose an innovative methodology for performance data gathering and representation that offers a middle ground. Two basic ideas are employed: the use of averages to replace recording data for each instance, and 'formulae' to represent sequences associated with communication and control flow. The user can trade off tracing overhead and trace data size against data quality incrementally. In other words, the user will be able to limit the amount of trace data collected and, at the same time, carry out some of the analysis event traces offer using space-time views. With the help of a few simple examples, we illustrate the use of these techniques in performance tuning and compare the quality of the traces we collected with event traces. We found that the trace files thus obtained are, indeed, small, bounded, and predictable before program execution, and that the quality of the space-time views generated from these statistical data is excellent. Furthermore, experimental results showed that the formulae proposed were able to capture all the sequences associated with 11 of the 15 applications tested. The performance of the formulae can be incrementally improved by allocating more memory at runtime to learn longer sequences.
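
    The paper's instrumentation is not reproduced here; as a minimal sketch of the first idea, replacing per-instance event records with fixed-size statistics, the following Python fragment keeps only running counts, totals, and maxima per event type. The class and event names are illustrative assumptions, and the 'formulae' for sequence compression are not shown.

      from collections import defaultdict

      class StatTrace:
          """Record per-event-type running statistics instead of one trace
          record per instance (a rough sketch of the 'averages' idea)."""

          def __init__(self):
              self.count = defaultdict(int)
              self.total = defaultdict(float)
              self.maximum = defaultdict(float)

          def record(self, event, duration):
              self.count[event] += 1
              self.total[event] += duration
              self.maximum[event] = max(self.maximum[event], duration)

          def summary(self):
              return {e: {"calls": self.count[e],
                          "mean": self.total[e] / self.count[e],
                          "max": self.maximum[e]}
                      for e in self.count}

      trace = StatTrace()
      for d in (1.2, 0.9, 1.4):
          trace.record("MPI_Send", d)     # fixed-size statistics, not per-instance events
      trace.record("compute_loop", 10.7)
      print(trace.summary())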

  20. The Exoplanet Microlensing Survey by the Proposed WFIRST Observatory

    NASA Technical Reports Server (NTRS)

    Barry, Richard; Kruk, Jeffrey; Anderson, Jay; Beaulieu, Jean-Philippe; Bennett, David P.; Catanzarite, Joseph; Cheng, Ed; Gaudi, Scott; Gehrels, Neil; Kane, Stephen; hide

    2012-01-01

    The New Worlds, New Horizons report released by the Astronomy and Astrophysics Decadal Survey Board in 2010 listed the Wide Field Infrared Survey Telescope (WFIRST) as the highest-priority large space mission for the coming decade. This observatory will provide wide-field imaging and slitless spectroscopy at near-infrared wavelengths. The scientific goals are to obtain a statistical census of exoplanets using gravitational microlensing, measure the expansion history of and the growth of structure in the Universe by multiple methods, and perform other astronomical surveys to be selected through a guest observer program. A Science Definition Team has been established to assist NASA in the development of a Design Reference Mission that accomplishes this diverse array of science programs with a single observatory. In this paper we present the current WFIRST payload concept and the expected capabilities for planet detection. The observatory, with science goals that are complementary to the Kepler exoplanet transit mission, is designed to complete the statistical census of planetary systems in the Galaxy, from habitable Earth-mass planets to free-floating planets, including analogs to all of the planets in our Solar System except Mercury. The exoplanet microlensing survey will observe for 500 days spanning 5 years. This long temporal baseline will enable the determination of the masses for most detected exoplanets down to 0.1 Earth masses.

  1. Identifying weaknesses in undergraduate programs within the context input process product model framework in view of faculty and library staff in 2014.

    PubMed

    Neyazi, Narges; Arab, Mohammad; Farzianpour, Freshteh; Mahmoudi, Mahmood

    2016-06-01

    The objective of this research is to identify weaknesses of undergraduate programs in terms of personnel, finances, organizational management, and facilities, in the view of faculty and library staff, and to determine factors that may facilitate program quality improvement. This is a descriptive, analytical survey and, in terms of purpose, an applied evaluation study in which undergraduate programs of selected faculties (Public Health, Nursing and Midwifery, Allied Medical Sciences, and Rehabilitation) at Tehran University of Medical Sciences (TUMS) were surveyed using the context input process product model in 2014. The statistical population consisted of three subgroups, department heads (n=10), faculty members (n=61), and library staff (n=10), for a total of 81 people. Data were collected through three researcher-made questionnaires based on a Likert scale. The data were then analyzed using descriptive and inferential statistics. Results showed a desirable or relatively desirable situation for factors in the context, input, process, and product fields, except for administration and finance and for research and educational spaces and equipment, which were in an undesirable situation. Based on the results, the researchers highlighted weaknesses in the undergraduate programs of TUMS in terms of research and educational spaces and facilities, educational curriculum, and administration and finance, and recommended steps in terms of finances, organizational management, and communication with graduates in order to improve the quality of this system.

  2. Differences between Lab Completion and Non-Completion on Student Performance in an Online Undergraduate Environmental Science Program

    NASA Astrophysics Data System (ADS)

    Corsi, Gianluca

    2011-12-01

    Web-based technology has revolutionized the way education is delivered. Although the advantages of online learning appeal to large numbers of students, some concerns arise. One major concern in online science education is the value that participation in labs has on student performance. The purpose of this study was to assess the relationships between lab completion and student academic success as measured by test grades, scientific self-confidence, scientific skills, and concept mastery. A random sample of 114 volunteer undergraduate students, from an online Environmental Science program at the American Public University System, was tested. The study followed a quantitative, non-experimental research design. Paired sample t-tests were used for statistical comparison between pre-lab and post-lab test grades, two scientific skills quizzes, and two scientific self-confidence surveys administered at the beginning and at the end of the course. The results of the paired sample t-tests revealed statistically significant improvements on all post-lab test scores: Air Pollution lab, t(112) = 6.759, p < .001; Home Chemicals lab t(114) = 8.585, p < .001; Water Use lab, t(116) = 6.657, p < .001; Trees and Carbon lab, t(113) = 9.921, p < .001; Stratospheric Ozone lab, t(112) =12.974, p < .001; Renewable Energy lab, t(115) = 7.369, p < .001. The end of the course Scientific Skills quiz revealed statistically significant improvements, t(112) = 8.221, p < .001. The results of the two surveys showed a statistically significant improvement on student Scientific Self-Confidence because of lab completion, t(114) = 3.015, p < .05. Because age and gender were available, regression models were developed. The results indicated weak multiple correlation coefficients and were not statistically significant at alpha = .05. Evidence suggests that labs play a positive role in a student's academic success. It is recommended that lab experiences be included in all online Environmental Science programs, with emphasis on open-ended inquiries, and adoption of online tools to enhance hands-on experiences, such as virtual reality platforms and digital animations. Future research is encouraged to investigate possible correlations between socio-demographic attributes and academic success of students enrolled in online science programs in reference to lab completion.
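
    For readers unfamiliar with the paired-sample t-test used throughout the study, the short sketch below runs one such pre-lab versus post-lab comparison in Python; the scores are hypothetical and are not the study's data.

      import numpy as np
      from scipy import stats

      # Hypothetical pre-lab and post-lab test scores for the same students
      pre  = np.array([62, 71, 58, 80, 67, 74, 69, 55, 77, 63], dtype=float)
      post = np.array([70, 75, 66, 84, 72, 80, 71, 61, 83, 70], dtype=float)

      # Paired-sample t-test, as used for each pre-lab vs. post-lab comparison
      t_stat, p_value = stats.ttest_rel(post, pre)
      print(f"t({pre.size - 1}) = {t_stat:.3f}, p = {p_value:.4f}")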

  3. Public health workforce employment in US public and private sectors.

    PubMed

    Kennedy, Virginia C

    2009-01-01

    The purpose of this study was to describe the number and distribution of 26 administrative, professional, and technical public health occupations across the array of US governmental and nongovernmental industries. This study used data from the Occupational Employment Statistics program of the US Bureau of Labor Statistics. For each occupation of interest, the investigator determined the number of persons employed in 2006 in five industries and industry groups: government, nonprofit agencies, education, healthcare, and all other industries. Industry-specific employment profiles varied from one occupation to another. However, about three-fourths of all those engaged in these occupations worked in the private healthcare industry. Relatively few worked in nonprofit or educational settings, and less than 10 percent were employed in government agencies. The industry-specific distribution of public health personnel, particularly the proportion employed in the public sector, merits close monitoring. This study also highlights the need for a better understanding of the work performed by public health occupations in nongovernmental work settings. Finally, the Occupational Employment Statistics program has the potential to serve as an ongoing, national data collection system for public health workforce information. If this potential was realized, future workforce enumerations would not require primary data collection but rather could be accomplished using secondary data.

  4. Statistical benchmarking for orthogonal electrostatic quantum dot qubit devices

    NASA Astrophysics Data System (ADS)

    Gamble, John; Frees, Adam; Friesen, Mark; Coppersmith, S. N.

    2014-03-01

    Quantum dots in semiconductor systems have emerged as attractive candidates for the implementation of quantum information processors because of the promise of scalability, manipulability, and integration with existing classical electronics. A limitation in current devices is that the electrostatic gates used for qubit manipulation exhibit strong cross-capacitance, presenting a barrier for practical scale-up. Here, we introduce a statistical framework for making precise the notion of orthogonality. We apply our method to analyze recently implemented designs at the University of Wisconsin-Madison that exhibit much greater orthogonal control than was previously possible. We then apply our statistical modeling to future device designs, providing practical guidelines for devices to have robust control properties. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000. The views and conclusions contained in this document are those of the authors and should not be interpreted as representing the official policies, either expressed or implied, of the US Government. This work was supported in part by the Laboratory Directed Research and Development program at Sandia National Laboratories, by ARO (W911NF-12-0607), and by the United States Department of Defense.
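
    The abstract does not spell out the statistical framework; as one simple, hedged illustration of quantifying orthogonality, the sketch below scores each gate by the fraction of its total coupling that acts on its intended dot. The metric and the coupling matrix are illustrative assumptions, not the authors' method.

      import numpy as np

      def orthogonality_scores(coupling):
          """Per-gate orthogonality: fraction of each gate's total influence
          (row of the gate-to-dot coupling matrix) that acts on its intended
          dot. 1.0 = perfectly orthogonal control, lower = more cross-talk.
          This metric is an illustrative assumption, not the paper's framework."""
          coupling = np.abs(np.asarray(coupling, dtype=float))
          return np.diag(coupling) / coupling.sum(axis=1)

      # Hypothetical gate-to-dot coupling matrix (rows: gates, columns: dots)
      C = [[1.00, 0.18, 0.05],
           [0.22, 1.00, 0.20],
           [0.06, 0.15, 1.00]]
      print(orthogonality_scores(C))   # roughly [0.81, 0.70, 0.83]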

  5. Doctoral training in statistics, measurement, and methodology in psychology: replication and extension of Aiken, West, Sechrest, and Reno's (1990) survey of PhD programs in North America.

    PubMed

    Aiken, Leona S; West, Stephen G; Millsap, Roger E

    2008-01-01

    In a survey of all PhD programs in psychology in the United States and Canada, the authors documented the quantitative methodology curriculum (statistics, measurement, and research design) to examine the extent to which innovations in quantitative methodology have diffused into the training of PhDs in psychology. In all, 201 psychology PhD programs (86%) participated. This survey replicated and extended a previous survey (L. S. Aiken, S. G. West, L. B. Sechrest, & R. R. Reno, 1990), permitting examination of curriculum development. Most training supported laboratory and not field research. The median of 1.6 years of training in statistics and measurement was mainly devoted to the modally 1-year introductory statistics course, leaving little room for advanced study. Curricular enhancements were noted in statistics and to a minor degree in measurement. Additional coverage of both fundamental and innovative quantitative methodology is needed. The research design curriculum has largely stagnated, a cause for great concern. Elite programs showed no overall advantage in quantitative training. Forces that support curricular innovation are characterized. Human capital challenges to quantitative training, including recruiting and supporting young quantitative faculty, are discussed. Steps must be taken to bring innovations in quantitative methodology into the curriculum of PhD programs in psychology. PsycINFO Database Record (c) 2008 APA, all rights reserved.

  6. Visualization of the significance of Receiver Operating Characteristics based on confidence ellipses

    NASA Astrophysics Data System (ADS)

    Sarlis, Nicholas V.; Christopoulos, Stavros-Richard G.

    2014-03-01

    The Receiver Operating Characteristics (ROC) is used for the evaluation of prediction methods in various disciplines like meteorology, geophysics, complex system physics, medicine etc. The estimation of the significance of a binary prediction method, however, remains a cumbersome task and is usually done by repeating the calculations by Monte Carlo. The FORTRAN code provided here simplifies this problem by evaluating the significance of binary predictions for a family of ellipses which are based on confidence ellipses and cover the whole ROC space. Catalogue identifier: AERY_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AERY_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 11511 No. of bytes in distributed program, including test data, etc.: 72906 Distribution format: tar.gz Programming language: FORTRAN. Computer: Any computer supporting a GNU FORTRAN compiler. Operating system: Linux, MacOS, Windows. RAM: 1Mbyte Classification: 4.13, 9, 14. Nature of problem: The Receiver Operating Characteristics (ROC) is used for the evaluation of prediction methods in various disciplines like meteorology, geophysics, complex system physics, medicine etc. The estimation of the significance of a binary prediction method, however, remains a cumbersome task and is usually done by repeating the calculations by Monte Carlo. The FORTRAN code provided here simplifies this problem by evaluating the significance of binary predictions for a family of ellipses which are based on confidence ellipses and cover the whole ROC space. Solution method: Using the statistics of random binary predictions for a given value of the predictor threshold ɛt, one can construct the corresponding confidence ellipses. The envelope of these corresponding confidence ellipses is estimated when ɛt varies from 0 to 1. This way a new family of ellipses is obtained, named k-ellipses, which covers the whole ROC plane and leads to a well defined Area Under the Curve (AUC). For the latter quantity, Mason and Graham [1] have shown that it follows the Mann-Whitney U-statistics [2] which can be applied [3] for the estimation of the statistical significance of each k-ellipse. As the transformation is invertible, any point on the ROC plane corresponds to a unique value of k, thus to a unique p-value to obtain this point by chance. The present FORTRAN code provides this p-value field on the ROC plane as well as the k-ellipses corresponding to the (p=)10%, 5% and 1% significance levels using as input the number of the positive (P) and negative (Q) cases to be predicted. Unusual features: In some machines, the compiler directive -O2 or -O3 should be used to avoid NaN’s in some points of the p-field along the diagonal. Running time: Depending on the application, e.g., 4s for an Intel(R) Core(TM)2 CPU E7600 at 3.06 GHz with 2 GB RAM for the examples presented here References: [1] S.J. Mason, N.E. Graham, Quart. J. Roy. Meteor. Soc. 128 (2002) 2145. [2] H.B. Mann, D.R. Whitney, Ann. Math. Statist. 18 (1947) 50. [3] L.C. Dinneen, B.C. Blakesley, J. Roy. Stat. Soc. Ser. C Appl. Stat. 22 (1973) 269.
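
    As a quick complement to the FORTRAN code (which handles the exact k-ellipse construction), the hedged sketch below estimates the p-value of an observed AUC from the numbers of positive (P) and negative (Q) cases using the normal approximation to the Mann-Whitney U statistic referenced above; the exact treatment in the distributed program may differ.

      import math

      def auc_p_value(auc, P, Q):
          """One-sided p-value for obtaining an ROC area >= auc by chance,
          using the normal approximation to the Mann-Whitney U statistic
          (U has mean P*Q/2 and variance P*Q*(P+Q+1)/12 under the null).
          A rough stand-in for the FORTRAN code's exact treatment."""
          mean_u = P * Q / 2.0
          var_u = P * Q * (P + Q + 1) / 12.0
          z = (auc * P * Q - mean_u) / math.sqrt(var_u)
          return 0.5 * math.erfc(z / math.sqrt(2.0))   # upper-tail normal probability

      # Example: P = 30 positive cases, Q = 70 negative cases, observed AUC = 0.65
      print(f"p = {auc_p_value(0.65, 30, 70):.4f}")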

  7. Emergent irreversibility and entanglement spectrum statistics

    NASA Astrophysics Data System (ADS)

    Mucciolo, Eduardo; Chamon, Claudio; Hamma, Alioscia

    2014-03-01

    We study the problem of irreversibility when the dynamical evolution of a many-body system is described by a stochastic quantum circuit. Such evolution is more general than Hamiltonian dynamics, and since energy levels are not well defined, the well-established connection between the statistical fluctuations of the energy spectrum and irreversibility cannot be made. We show that the entanglement spectrum provides a more general connection. Irreversibility is marked by a failure of a disentangling algorithm and is preceded by the appearance of Wigner-Dyson statistical fluctuations in the entanglement spectrum. This analysis can be done at the wavefunction level and offers a new route to study quantum chaos and quantum integrability. We acknowledge financial support from the U.S. National Science Foundation through grants CCF 1116590 and CCF 1117241, from the National Basic Research Program of China through grants 2011CBA00300 and 2011CBA00301, and from the National Natural Science Foundation of China.
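
    A standard way to test for Wigner-Dyson versus Poisson fluctuations without unfolding a spectrum is the mean consecutive-spacing ratio; the hedged sketch below computes it for two illustrative spectra. This is a generic diagnostic, not the authors' disentangling algorithm, and the stand-in spectra are assumptions.

      import numpy as np

      def mean_gap_ratio(levels):
          """Mean consecutive-spacing ratio <r> of a spectrum; roughly 0.39 for
          Poisson (integrable-like) and roughly 0.53 for GOE Wigner-Dyson statistics."""
          s = np.diff(np.sort(levels))
          s = s[s > 0]
          r = np.minimum(s[1:], s[:-1]) / np.maximum(s[1:], s[:-1])
          return r.mean()

      rng = np.random.default_rng(2)
      # Illustrative stand-ins: a Poisson-like spectrum vs. eigenvalues of a GOE matrix
      poisson_like = np.cumsum(rng.exponential(size=2000))
      H = rng.normal(size=(1000, 1000))
      H = (H + H.T) / np.sqrt(2)
      goe_like = np.linalg.eigvalsh(H)
      print("Poisson-like <r>:", round(mean_gap_ratio(poisson_like), 3))
      print("GOE-like     <r>:", round(mean_gap_ratio(goe_like), 3))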

  8. Data management system for USGS/USEPA urban hydrology studies program

    USGS Publications Warehouse

    Doyle, W.H.; Lorens, J.A.

    1982-01-01

    A data management system was developed to store, update, and retrieve data collected in urban stormwater studies jointly conducted by the U.S. Geological Survey and U.S. Environmental Protection Agency in 11 cities in the United States. The data management system is used to retrieve and combine data from USGS data files for use in rainfall, runoff, and water-quality models and for data computations such as storm loads. The system is based on the data management aspect of the Statistical Analysis System (SAS), which was used to create all the data files in the data base. SAS is used for storage and retrieval of basin physiography, land-use, and environmental-practices inventory data. Storm-event water-quality characteristics are also stored in the data base. The advantages of using SAS to create and manage a data base are many; among them, it is simple and easy to use, contains a comprehensive statistical package, and allows files to be modified very easily. Data base system development has progressed rapidly during the last two decades, and the data management system concepts used in this study reflect the advances made in computer technology during this era. Urban stormwater data is, however, just one application for which the system can be used. (USGS)

  9. Enigma Version 12

    NASA Technical Reports Server (NTRS)

    Shores, David; Goza, Sharon P.; McKeegan, Cheyenne; Easley, Rick; Way, Janet; Everett, Shonn; Guerra, Mark; Kraesig, Ray; Leu, William

    2013-01-01

    Enigma Version 12 software combines model building, animation, and engineering visualization into one concise software package. Enigma employs a versatile user interface to allow average users access to even the most complex pieces of the application. Using Enigma eliminates the need to buy and learn several software packages to create an engineering visualization. Models can be created and/or modified within Enigma down to the polygon level. Textures and materials can be applied for additional realism. Within Enigma, these models can be combined to create systems of models that have a hierarchical relationship to one another, such as a robotic arm. Then these systems can be animated within the program or controlled by an external application programming interface (API). In addition, Enigma provides the ability to use plug-ins. Plug-ins allow the user to create custom code for a specific application and access the Enigma model and system data, but still use the Enigma drawing functionality. CAD files can be imported into Enigma and combined to create systems of computer graphics models that can be manipulated with constraints. An API is available so that an engineer can write a simulation and drive the computer graphics models with no knowledge of computer graphics. An animation editor allows an engineer to set up sequences of animations generated by simulations or by conceptual trajectories in order to record these to high-quality media for presentation. Commercially, because it is so generic, Enigma can be used for almost any project that requires engineering visualization, model building, or animation. Models in Enigma can be exported to many other formats for use in other applications as well. Educationally, Enigma is being used to allow university students to visualize robotic algorithms in a simulation mode before using them with actual hardware. (Lyndon B. Johnson Space Center, Houston, Texas)

    Planetary Protection Bioburden Analysis Program (NASA's Jet Propulsion Laboratory, Pasadena, California): This program is a Microsoft Access program that performed statistical analysis of the colony counts from assays performed on the Mars Science Laboratory (MSL) spacecraft to determine the bioburden density, 3-sigma biodensity, and the total bioburdens required for the MSL prelaunch reports. It also contains numerous tools that report the data in various ways to simplify the reports required. The program performs all the calculations directly in the MS Access program. Prior to this development, the data were exported to large Excel files that had to be cut and pasted to provide the desired results. The program contains a main menu and a number of submenus. Analyses can be performed by using either all the assays or only the accountable assays that will be used in the final analysis. There are three options on the first menu: calculate using (1) the old MER (Mars Exploration Rover) statistics, (2) the MSL statistics for all the assays, or …

    Micrometeoroid and Orbital Debris (MMOD) Shield Ballistic Limit Analysis Program (Lyndon B. Johnson Space Center, Houston, Texas): This software implements penetration limit equations for common micrometeoroid and orbital debris (MMOD) shield configurations, windows, and thermal protection systems. Allowable MMOD risk is formulated in terms of the probability of no penetration (PNP) of the spacecraft pressure hull. For calculating the risk, spacecraft geometry models, mission profiles, debris environment models, and penetration limit equations for installed shielding configurations are required. Risk assessment software such as NASA's BUMPERII is used to calculate mission PNP; however, such software is unsuitable for use in shield design and preliminary analysis studies. The software defines a single equation for the design and performance evaluation of common MMOD shielding configurations, windows, and thermal protection systems, along with a description of their validity range and guidelines for their application. Recommendations are based on preliminary reviews of fundamental assumptions and accuracy in predicting experimental impact test results. The software is programmed in Visual Basic for Applications for installation as a simple add-in for Microsoft Excel. The user is directed to a graphical user interface (GUI) that requires user inputs and provides solutions directly in Microsoft Excel workbooks. This work was done by Shannon Ryan of the USRA Lunar and Planetary Institute for Johnson Space Center (MSC-24582-1).

  10. Statistics in Japanese universities.

    PubMed Central

    Ito, P K

    1979-01-01

    The teaching of statistics in U.S. and Japanese universities is briefly reviewed. It is found that H. Hotelling's articles and subsequent relevant publications on the teaching of statistics have contributed to a considerable extent to the establishment of excellent departments of statistics in U.S. universities and colleges. Today the U.S. may be proud of many well-staffed and well-organized departments of theoretical and applied statistics with excellent undergraduate and graduate programs. In contrast, no Japanese university has an independent department of statistics at present, and the teaching of statistics has been spread among a heterogeneous group of departments of application. This was mainly due to the Japanese government regulation concerning the establishment of a university. However, the regulation has recently been revised so that an independent department of statistics may be started in a Japanese university with undergraduate and graduate programs. It is hoped that discussions will be started among those concerned on the question of the organization of the teaching of statistics in Japanese universities as soon as possible. PMID:396154

  11. Tools for Basic Statistical Analysis

    NASA Technical Reports Server (NTRS)

    Luz, Paul L.

    2005-01-01

    Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3 . . . ) entered by the user. Normal Distribution Estimates will calculate the statistical value that corresponds to cumulative probability values, given a sample mean and standard deviation of the normal distribution. Normal Distribution from two Data Points will extend and generate a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA) with no replication or generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA will curvefit data to the linear equation y=f(x) and will do an ANOVA to check its significance.
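
    The toolset itself consists of Microsoft Excel spreadsheets; the sketch below reproduces two of the listed calculations, descriptive statistics and a "Normal Distribution Estimates"-style quantile, in Python for illustration. The sample data and probability value are assumptions.

      import numpy as np
      from scipy import stats

      data = np.array([4.1, 5.0, 4.7, 5.3, 4.9, 5.6, 4.4, 5.1])   # illustrative sample

      # Descriptive statistics (mirrors the toolset's first spreadsheet)
      print("mean:", data.mean(), "std dev:", data.std(ddof=1), "n:", data.size)

      # "Normal Distribution Estimates": the value whose cumulative probability is p,
      # for a normal distribution with the sample's mean and standard deviation
      p = 0.95
      print("95th-percentile estimate:",
            stats.norm.ppf(p, loc=data.mean(), scale=data.std(ddof=1)))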

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tandon, Lav; Colletti, Lisa M.; Drake, Lawrence R.

    This report discusses the process used to prove in the SRNL-Rev.2 coulometer for isotopic data analysis used in the special plutonium material project. In May of 2012, the PAR 173 coulometer system that had been the workhorse of the Plutonium Assay team since the early 1970s became inoperable. A new coulometer system had been purchased from Savannah River National Laboratory (SRNL) and installed in August of 2011. Due to funding issues the new system was not qualified at that time. Following the failure of the PAR 173, it became necessary to qualify the new system for use in Process 3401a, Plutonium Assay by Controlled Coulometry. A qualification plan similar to what is described in PQR-141a was followed. Experiments were performed to establish a statistical summary of the performance of the new system by monitoring the repetitive analysis of the quality control sample, PEOL, and the assay of plutonium metals obtained from the Plutonium Exchange Program. The data for the experiments were acquired using work instructions ANC125 and ANC195. Figure 1 shows approximately 2 years of data for the PEOL material obtained using the PAR 173. The required acceptance criterion for the sample is that it return the correct value for the quality control material, 88.00%, within 2 sigma (95% confidence interval). It also must meet daily precision standards that are set from the historical data analysis of decades of data. The 2 sigma value that is currently used is 0.146%, as evaluated by the Statistical Science Group, CCS-6. The average value of the PEOL quality control material run on 10 separate days on the SRNL-03 coulometer is 87.98% with a relative standard deviation of 0.04 at the 95% confidence interval. The data were acquired between 5/23/2012 and 8/1/2012. Control samples are run on every day that experiments using the coulometer are carried out. They are also used to prove that the instrument is in statistical control before any experiments are undertaken. The total number of replicate controls run with the new coulometer to date is n = 18. This value is identical to that calculated by the LANL statistical group for this material from data produced by the PAR 173 system over the period of October 2007 to May 2011. The final validation/verification test was to run a blind sample over multiple days. AAC participates in a plutonium exchange program which supplies blind Pu metal samples to the group on a regular basis. The Pu material supplied for this study was run using the PAR 173 in the past and more recently with the new system. Table 1a contains the values determined through the use of the PAR 173 and Table 1b contains the values obtained with the new system. The Pu assay value obtained on the SRNL system is for paired analysis and had a value of 98.88 +/- 0.07% RSD at 95% CI. The Pu assay value (decay corrected to July 2012) of the material determined in prior measurements using the PAR 173 is 99.05 +/- 0.06% RSD at 95% CI. We believe that the instrument is adequate to meet the needs of the program.
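
    As a minimal illustration of the acceptance test described in the report (a control result must fall within the 2-sigma window around the 88.00% certified value), the following sketch applies that check to a few hypothetical daily control results. Only the 88.00% value and the 0.146% 2-sigma limit come from the report; the function name and example values are assumptions.

      def control_check(measured, expected=88.00, two_sigma=0.146):
          """Accept a PEOL control measurement if it falls within the 2-sigma
          (95% CI) window around the certified value, as described in the report.
          Values are percent plutonium assay."""
          deviation = measured - expected
          return abs(deviation) <= two_sigma, deviation

      for value in (87.98, 88.10, 88.20):        # illustrative daily control results
          ok, dev = control_check(value)
          print(f"{value:.2f}%  deviation {dev:+.3f}  {'in control' if ok else 'OUT OF CONTROL'}")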

  13. Toward smartphone applications for geoparks information and interpretation systems in China

    NASA Astrophysics Data System (ADS)

    Li, Qian; Tian, Mingzhong; Li, Xingle; Shi, Yihua; Zhou, Xu

    2015-11-01

    Geopark information and interpretation systems are both necessary infrastructure in geopark planning and construction programs, and they are also essential for geoeducation and geoconservation in geopark tourism. The current state and development of information and interpretation systems in China's geoparks are presented and analyzed in this paper. Statistics showed that fewer than half of the geoparks run websites, fewer still maintain a database, and less than one percent of all Internet/smartphone applications are used for geopark tourism. The results of our analysis indicate that smartphone applications in geopark information and interpretation systems would provide benefits such as accelerated geopark science popularization and education and facilitated interactive communication between geoparks and tourists.

  14. Development and Characterization of a Low-Pressure Calibration System for Hypersonic Wind Tunnels

    NASA Technical Reports Server (NTRS)

    Green, Del L.; Everhart, Joel L.; Rhode, Matthew N.

    2004-01-01

    Minimization of uncertainty is essential for accurate ESP measurements at very low free-stream static pressures found in hypersonic wind tunnels. Statistical characterization of environmental error sources requires a well defined and controlled calibration method. A calibration system has been constructed and environmental control software developed to control experimentation to eliminate human induced error sources. The initial stability study of the calibration system shows a high degree of measurement accuracy and precision in temperature and pressure control. Control manometer drift and reference pressure instabilities induce uncertainty into the repeatability of voltage responses measured from the PSI System 8400 between calibrations. Methods of improving repeatability are possible through software programming and further experimentation.

  15. Program Development and Effectiveness of Workplace Health Promotion Program for Preventing Metabolic Syndrome among Office Workers

    PubMed Central

    Ryu, Hosihn; Jung, Jiyeon; Cho, Jeonghyun; Chin, Dal Lae

    2017-01-01

    This paper aims to develop and analyze the effects of a socio-ecological model-based intervention program for preventing metabolic syndrome (MetS) among office workers. The intervention program was developed using regular health examinations, a “health behavior and need” assessment survey among workers, and a focus group study. According to the type of intervention, subjects took part in one of three groups: health education via an intranet-based web magazine (Group 1), self-monitoring with the U-health system (Group 2), and the target population who received intensive intervention (Group 3). The intervention programs of Group 1 and Group 2, which relied on voluntary participation, did not show significant effects. Group 3, which relied on targeted and proactive programs, showed a decrease in waist circumference and in fasting glucose (p < 0.001). The MetS score in both males (−0.61 ± 3.35 versus −2.32 ± 2.55, p = 0.001) and females (−3.99 ± 2.05 versus −5.50 ± 2.19, p = 0.028) also showed a statistically significant decrease. In light of the effectiveness of the intensive intervention strategy for metabolic syndrome prevention among workers used in this study, companies should establish targeted and proactive health care programs rather than providing a healthcare system that is dependent on an individual’s voluntary participation. PMID:28777320

  16. Program Development and Effectiveness of Workplace Health Promotion Program for Preventing Metabolic Syndrome among Office Workers.

    PubMed

    Ryu, Hosihn; Jung, Jiyeon; Cho, Jeonghyun; Chin, Dal Lae

    2017-08-04

    This paper aims to develop and analyze the effects of a socio-ecological model-based intervention program for preventing metabolic syndrome (MetS) among office workers. The intervention program was developed using regular health examinations, a "health behavior and need" assessment survey among workers, and a focus group study. According to the type of intervention, subjects took part in one of three groups: health education via an intranet-based web magazine (Group 1), self-monitoring with the U-health system (Group 2), and the target population who received intensive intervention (Group 3). The intervention programs of Group 1 and Group 2, which relied on voluntary participation, did not show significant effects. Group 3, which relied on targeted and proactive programs, showed a decrease in waist circumference and in fasting glucose (p < 0.001). The MetS score in both males (-0.61 ± 3.35 versus -2.32 ± 2.55, p = 0.001) and females (-3.99 ± 2.05 versus -5.50 ± 2.19, p = 0.028) also showed a statistically significant decrease. In light of the effectiveness of the intensive intervention strategy for metabolic syndrome prevention among workers used in this study, companies should establish targeted and proactive health care programs rather than providing a healthcare system that is dependent on an individual's voluntary participation.

  17. Priority of a Hesitant Fuzzy Linguistic Preference Relation with a Normal Distribution in Meteorological Disaster Risk Assessment.

    PubMed

    Wang, Lihong; Gong, Zaiwu

    2017-10-10

    As meteorological disaster systems are large complex systems, disaster reduction programs must be based on risk analysis. Consequently, judgment by an expert based on his or her experience (also known as qualitative evaluation) is an important link in meteorological disaster risk assessment. In some complex and non-procedural meteorological disaster risk assessments, a hesitant fuzzy linguistic preference relation (HFLPR) is often used to deal with a situation in which experts may be hesitant while providing preference information for a pairwise comparison of alternatives, that is, the degree of preference of one alternative over another. This study explores hesitation from the perspective of statistical distributions and obtains an optimal ranking of an HFLPR based on chance-constrained programming, which provides a new approach for hesitant fuzzy optimisation of decision-making in meteorological disaster risk assessments.

  18. CIDR

    Science.gov Websites

    CIDR is dedicated to producing the highest quality data for our investigators. These cumulative quality control statistics are based on data from 419 released CIDR Program…

  19. ENVIRONMENTAL MONITORING AND ASSESSMENT PROGRAM (EMAP): WESTERN STREAMS AND RIVERS STATISTICAL SUMMARY

    EPA Science Inventory

    This statistical summary reports data from the Environmental Monitoring and Assessment Program (EMAP) Western Pilot (EMAP-W). EMAP-W was a sample survey (or probability survey, often simply called 'random') of streams and rivers in 12 states of the western U.S. (Arizona, Californ...

  20. Statistical Process Control in the Practice of Program Evaluation.

    ERIC Educational Resources Information Center

    Posavac, Emil J.

    1995-01-01

    A technique developed to monitor the quality of manufactured products, statistical process control (SPC), incorporates several features that may prove attractive to evaluators. This paper reviews the history of SPC, suggests how the approach can enrich program evaluation, and illustrates its use in a hospital-based example. (SLD)
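
    As a minimal sketch of the SPC tool the article discusses, the fragment below computes X-bar chart control limits and flags out-of-control points for hypothetical monthly program-outcome averages; all names and numbers are illustrative assumptions, not the article's example.

      import numpy as np

      def shewhart_limits(subgroup_means, overall_sigma, n):
          """Center line and 3-sigma control limits for an X-bar chart,
          the basic SPC chart, given a known process sigma and subgroup size n."""
          center = np.mean(subgroup_means)
          margin = 3.0 * overall_sigma / np.sqrt(n)
          return center - margin, center, center + margin

      # Hypothetical monthly program-outcome averages (e.g., satisfaction scores)
      monthly = [7.8, 8.1, 7.9, 8.3, 8.0, 7.7, 8.2, 6.9]
      lcl, cl, ucl = shewhart_limits(monthly, overall_sigma=0.6, n=25)
      flags = [m for m in monthly if not lcl <= m <= ucl]
      print(f"LCL={lcl:.2f}  CL={cl:.2f}  UCL={ucl:.2f}  out-of-control points: {flags}")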
